├── .gitignore
├── LICENSE
├── README.md
├── data
│   ├── Schaefer2018_200Parcels_7Networks_count.csv
│   ├── Schaefer2018_200Parcels_7Networks_distance.csv
│   ├── label_ts_corrected
│   ├── leadfield
│   ├── network_colour.xlsx
│   ├── only_high_trial.mat
│   └── real_EEG
├── img
│   ├── Figure_1.PNG.png
│   ├── Figure_2.PNG
│   ├── Figure_3.PNG
│   ├── Figure_4.PNG
│   ├── Figure_6.PNG
│   └── Figure_7.PNG
├── nb
│   ├── Goodness_of_fit_LM.ipynb
│   ├── Params_exploration_LM.ipynb
│   └── model_fit_LM.ipynb
├── requirements.txt
└── tepfit
    ├── __init__.py
    ├── __pycache__
    │   ├── __init__.cpython-37.pyc
    │   └── viz.cpython-37.pyc
    ├── data.py
    ├── fit.py
    ├── sim.py
    └── viz.py
/.gitignore:
--------------------------------------------------------------------------------
1 | .*
2 | !/.gitignore
3 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) [2023] [Davide Momi]
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in
13 | all copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
21 | THE SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Modelling large-scale brain network dynamics underlying the TMS-EEG evoked response
2 |
3 | This repository includes the code required to reproduce the results in: "TMS-EEG evoked responses are driven by recurrent large-scale network dynamics" by Davide Momi, Zheng Wang, and John D. Griffiths.
4 |
5 | #### Please read our paper, and if you use this code, please cite:
6 | Momi D, Wang Z, Griffiths JD. 2023. TMS-EEG evoked responses are driven by recurrent large-scale network dynamics. eLife 2023;12:e83232. DOI: https://doi.org/10.7554/eLife.83232
7 |
8 |
9 |
10 | ## Study Overview:
11 | A grand-mean structural connectome was calculated using neuroimaging data from 400 healthy young individuals (170 males; age range 21-35 years) from the Human Connectome Project (HCP) dataset (https://ida.loni.usc.edu). The Schaefer atlas of 200 parcels was used, which groups brain regions into 7 canonical functional networks (Visual: VISN; Somato-motor: SMN; Dorsal Attention: DAN; Anterior Salience: ASN; Limbic: LIMN; Fronto-parietal: FPN; Default Mode: DMN). These parcels were mapped to each individual’s FreeSurfer parcellation using spherical registration. Once this brain parcellation covering cortical structures was extracted, it was used to derive individual structural connectomes. The Jansen-Rit model, a physiological connectome-based neural mass model, was embedded in every parcel to simulate time series. The TMS-induced polarization of the resting membrane potential was modeled by applying a voltage offset uTMS to the mean membrane potential of the excitatory interneurons. A lead-field matrix was then used to project the parcels’ time series into channel space and generate simulated TMS-EEG activity. The goodness of fit (loss) was calculated as the spatial distance between simulated and empirical TMS-EEG time series from an open dataset. The corresponding gradient was computed using automatic differentiation, and model parameters were then updated with the ADAM optimization algorithm. The model was re-run with the updated parameters, and the process was repeated until the simulated and empirical time series converged. At the end of the iterations, the optimal model parameters were used to generate the fitted simulated TMS-EEG activity, which was compared with the empirical data at both the channel and source level.
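The fitting loop described above (simulate, project through the lead field, compare with the empirical TEP, backpropagate via automatic differentiation, update with ADAM) can be sketched generically as follows. `simulate_jr` and the parameter layout here are illustrative placeholders, not this repository's actual `tepfit` API:

```python
import torch

def fit_parameters(simulate_jr, leadfield, empirical_tep, n_iter=200, lr=0.05):
    """Gradient-based fit of free model parameters to an empirical TEP.

    simulate_jr(params) -> source time series (n_regions, n_times), built from
    differentiable torch ops so autograd can reach `params` (hypothetical stand-in
    for the Jansen-Rit network forward model).
    leadfield: (n_channels, n_regions) gain matrix.
    empirical_tep: (n_channels, n_times) empirical TMS-EEG time series.
    """
    params = torch.zeros(4, requires_grad=True)  # illustrative free-parameter vector
    opt = torch.optim.Adam([params], lr=lr)
    for _ in range(n_iter):
        opt.zero_grad()
        sim_sources = simulate_jr(params)        # regions x time
        sim_eeg = leadfield @ sim_sources        # project to channel space
        loss = torch.mean((sim_eeg - empirical_tep) ** 2)
        loss.backward()                          # automatic differentiation
        opt.step()                               # ADAM parameter update
    return params.detach(), loss.item()
```

The actual loss used in the paper is a spatial-distance measure rather than the plain MSE shown here; the structure of the loop is the same.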
12 |
13 | 
14 |
15 |
16 | ## Conceptual Framework:
17 | Studying the role of recurrent connectivity in stimulation-evoked neural responses with computational models. Shown here is a schematic overview of the hypotheses, methodology, and general conceptual framework of the present work. A) A single TMS pulse (i [diagram], iv [real data]) applied to a target region (in this case left M1) generates an early response (TEP waveform component) at EEG channels sensitive to that region and its immediate neighbours (ii). This also appears in more distal connected regions such as the frontal lobe (iii) after a short delay due to axonal conduction and polysynaptic transmission. Subsequently, second and sometimes third late TEP components are frequently observed at the target site (i, iv), but not in non-connected regions (v). Our question is whether these late responses represent evoked oscillatory ‘echoes’ of the initial stimulation that are entirely ‘local’ and independent of the rest of the network, or whether they instead reflect a chain of recurrent activations dispersing from and then propagating back to the initial target site via the connectome. B) To investigate this, precisely timed communication interruptions or ‘virtual lesions’ (vii) are introduced into an otherwise accurate and well-fitting computational model of TMS-EEG stimulation responses (vi), and the resulting changes in the propagation pattern (viii) are evaluated. No change in the TEP component of interest would support the ‘local echo’ scenario, whereas suppressed TEPs would support the ‘recurrent activation’ scenario.
18 |
19 |
20 | 
21 |
22 | ## Large-scale neurophysiological brain network model:
23 | A brain network model comprising 200 cortical areas was used to model TMS-evoked activity patterns, where each network node represents the population-averaged activity of a single brain region, following the rationale of mean-field theory. We used the Jansen-Rit (JR) equations to describe activity at each node; JR is one of the most widely used neurophysiological models for both stimulus-evoked and resting-state EEG activity. It is a relatively coarse-grained neural mass model of the cortical microcircuit, composed of three interconnected neural populations: pyramidal projection neurons, excitatory interneurons, and inhibitory interneurons. The excitatory and inhibitory populations both receive input from and feed back to the pyramidal population but not to each other, so the overall circuit motif contains one positive and one negative feedback loop. For each of the three neural populations, the post-synaptic somatic and dendritic membrane response to an incoming pulse of action potentials is described by the second-order differential equation:
24 |
25 |
26 | $$\ddot{v}(t) = \frac{H_{e,i}}{\tau_{e,i}}\, m(t) - \frac{2}{\tau_{e,i}}\, \dot{v}(t) - \frac{1}{\tau_{e,i}^{2}}\, v(t) \quad (1)$$
27 |
28 |
29 |
30 | which is equivalent to a convolution of incoming activity with a synaptic impulse response function
31 |
32 |
33 | $$v(t) = h_{e,i}(t) * m(t) = \int_{0}^{t} h_{e,i}(t - t')\, m(t')\, dt'$$
34 |
35 |
36 | whose kernel $h_{e,i}(t)$ is given by
37 |
38 |
39 | $$h_{e,i}(t) = \begin{cases} \dfrac{H_{e,i}}{\tau_{e,i}}\, t\, e^{-t/\tau_{e,i}} & t \geq 0 \\ 0 & t < 0 \end{cases}$$
40 |
41 |
42 | where $m(t)$ is the (population-average) presynaptic input, $v(t)$ is the postsynaptic membrane potential, $H_{e,i}$ is the maximum postsynaptic potential and $\tau_{e,i}$ a lumped representation of delays occurring during the synaptic transmission.
43 |
44 | This synaptic response function, also known as a pulse-to-wave operator [Freeman et al., 1975](https://www.sciencedirect.com/book/9780122671500/mass-action-in-the-nervous-system), determines the excitability of the population, as parameterized by the rate constants $a = 1/\tau_e$ and $b = 1/\tau_i$, which are of particular interest in the present study. Complementing the pulse-to-wave operator for the synaptic response, each neural population also has a wave-to-pulse operator [Freeman et al., 1975](https://www.sciencedirect.com/book/9780122671500/mass-action-in-the-nervous-system) that determines its output, the (population-average) firing rate, which is an instantaneous function of the somatic membrane potential taking the sigmoidal form
45 |
46 |
47 | $$S(v) = \frac{e_0}{1 + e^{\,r\,(v_0 - v)}}$$
48 |
49 |
50 | where $e_{0}$ is the maximum pulse rate, $r$ is the steepness of the sigmoid function, and $v_0$ is the postsynaptic potential at which half of the maximum pulse rate is achieved.
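As a concrete illustration, the two operators above can be written as short functions. The default parameter values below are the classic Jansen-Rit values, used here only as an example, not the per-subject values fitted in this work:

```python
import numpy as np

def synaptic_kernel(t, H, tau):
    """Pulse-to-wave kernel h(t) = (H / tau) * t * exp(-t / tau) for t >= 0.

    H is the maximum postsynaptic potential and tau the lumped synaptic
    time constant; the kernel peaks at t = tau with value H / e.
    """
    t = np.asarray(t, dtype=float)
    return np.where(t >= 0, (H / tau) * t * np.exp(-t / tau), 0.0)

def sigmoid(v, e0=5.0, r=0.56, v0=6.0):
    """Wave-to-pulse operator: membrane potential -> population firing rate.

    Classic JR values as illustrative defaults: e0 is the maximum rate,
    r the steepness, and v0 the half-activation potential.
    """
    return e0 / (1.0 + np.exp(r * (v0 - v)))
```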
51 |
52 | In practice, we re-write the three sets of second-order differential equations that follow the form in (1) as pairs of coupled first-order differential equations, and so the full JR system for each individual cortical area $j \in 1:N$ in our network of $N = 200$ regions is given by the following six equations:
53 |
54 | $$\dot{v}_{1j} = x_{1j}$$
55 |
56 | $$\dot{x}_{1j} = \frac{H_e}{\tau_e}\left( p_j(t) + C_2\, S(C_1 v_{3j}) + g \sum_{k=1}^{N} w_{jk}\, S(v_{1k} - v_{2k}) \right) - \frac{2}{\tau_e}\, x_{1j} - \frac{1}{\tau_e^{2}}\, v_{1j}$$
57 |
58 | $$\dot{v}_{2j} = x_{2j}$$
59 |
60 | $$\dot{x}_{2j} = \frac{H_i}{\tau_i}\, C_4\, S(C_3 v_{3j}) - \frac{2}{\tau_i}\, x_{2j} - \frac{1}{\tau_i^{2}}\, v_{2j}$$
61 |
62 | $$\dot{v}_{3j} = x_{3j}$$
63 |
64 | $$\dot{x}_{3j} = \frac{H_e}{\tau_e}\, S(v_{1j} - v_{2j}) - \frac{2}{\tau_e}\, x_{3j} - \frac{1}{\tau_e^{2}}\, v_{3j}$$
65 |
66 | where $p_j(t)$ is the external input to region $j$ (including the TMS perturbation $u_{TMS}$), $C_{1,\dots,4}$ are the intra-regional connectivity constants, $g$ is the global coupling gain, and $w_{jk}$ is the structural connectome weight between regions $j$ and $k$.
67 |
68 |
69 |
70 |
71 |
72 |
73 | where $v_{1,2,3}$ are the average postsynaptic membrane potentials of the excitatory stellate cell, inhibitory interneuron, and excitatory pyramidal cell populations, respectively.
74 | The output $y(t) = v_1(t) - v_2(t)$ is the EEG signal.
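A minimal sketch of how such a system can be integrated numerically, for a single uncoupled JR node driven by stochastic input (network coupling, conduction delays, and the TMS perturbation are omitted; parameter values are the classic Jansen & Rit defaults, not the fitted values used in this work):

```python
import numpy as np

# Hedged illustration: Euler integration of one uncoupled Jansen-Rit node.
He, Hi = 3.25, 22.0          # max excitatory / inhibitory PSPs (mV)
tau_e, tau_i = 0.010, 0.020  # synaptic time constants (s)
C = 135.0                    # intra-columnar connectivity scale
C1, C2, C3, C4 = C, 0.8 * C, 0.25 * C, 0.25 * C
e0, r, v0 = 5.0, 0.56, 6.0   # sigmoid: max rate, steepness, half-activation

def S(v):
    """Wave-to-pulse operator (membrane potential -> firing rate)."""
    return e0 / (1.0 + np.exp(r * (v0 - v)))

dt, T = 1e-4, 2.0            # 0.1 ms step, 2 s of activity
n = int(T / dt)
v1 = v2 = v3 = x1 = x2 = x3 = 0.0
rng = np.random.default_rng(0)
eeg = np.empty(n)
for i in range(n):
    p = rng.uniform(120.0, 320.0)  # stochastic background drive (pulses/s)
    dx1 = (He / tau_e) * (p + C2 * S(C1 * v3)) - (2.0 / tau_e) * x1 - v1 / tau_e**2
    dx2 = (Hi / tau_i) * C4 * S(C3 * v3) - (2.0 / tau_i) * x2 - v2 / tau_i**2
    dx3 = (He / tau_e) * S(v1 - v2) - (2.0 / tau_e) * x3 - v3 / tau_e**2
    v1, v2, v3 = v1 + dt * x1, v2 + dt * x2, v3 + dt * x3
    x1, x2, x3 = x1 + dt * dx1, x2 + dt * dx2, x3 + dt * dx3
    eeg[i] = v1 - v2             # model EEG signal y(t) = v1 - v2
```

A stochastic Heun or Euler-Maruyama scheme and explicit noise terms would be used in a full simulation; the simple Euler step above is just to show the state layout.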
75 |
76 |
77 |
78 |
79 | **Dependencies**
80 |
81 | * [`pytorch`](https://pytorch.org/)
82 | * [`mne`](https://mne.tools/stable/index.html)
83 | * [`numpy`](https://numpy.org/)
84 | * [`scipy`](https://scipy.org/scipylib/index.html)
85 | * [`scikit-learn`](https://scikit-learn.org/stable/)
86 | * [`matplotlib`](https://matplotlib.org/)
87 | * [`nibabel`](https://nipy.org/nibabel/index.html)
88 | * [`nilearn`](https://nilearn.github.io/)
89 |
90 | ## Data
91 |
92 | For the purposes of this study, preprocessed TMS-EEG data following stimulation of the primary motor cortex (M1) of twenty healthy young individuals (24.50 ± 4.86 years; 14 females) were taken from an open dataset (https://figshare.com/articles/dataset/TEPs-_SEPs/7440713). For details on the data acquisition and preprocessing steps, please refer to the original papers:
93 |
94 | * Biabani M, Fornito A, Coxon JP, Fulcher BD & Rogasch NC (2021). The correspondence between EMG and EEG measures of changes in cortical excitability following transcranial magnetic stimulation. J Physiol 599, 2907–2932.
95 | * Biabani M, Fornito A, Mutanen TP, Morrow J & Rogasch NC (2019). Characterizing and minimizing the contribution of sensory inputs to TMS-evoked potentials. Brain Stimulat 12, 1537–1552.
96 |
97 |
98 | ## Running the code on your local machine
99 |
100 | If you want to run the code locally, please follow these steps:
101 | 1) Install the required dependencies using:
102 | ```
103 | pip install -r requirements.txt
104 | ```
105 |
106 | 2) Download PyTepFit using:
107 | ```
108 | git clone https://github.com/GriffithsLab/PyTepFit
109 | ```
110 |
111 | 3) Create a folder on your local machine named `reproduce_Momi_et_al_2022`
112 | 4) Download the required data folder at: https://drive.google.com/drive/folders/1iwsxrmu_rnDCvKNYDwTskkCNt709MPuF
113 | 5) Download the fsaverage folder at: https://drive.google.com/drive/folders/1YPyf3h9YKnZi0zRwBwolROQuqDtxEfzF?usp=sharing
114 | 6) Move the data and fsaverage folders inside `reproduce_Momi_et_al_2022` (see the example data structure below)
115 |
116 | 
117 |
118 | 7) You are now ready to run the notebooks on your local machine. Please be sure to change the paths to where your data are stored.
119 |
120 |
121 | ## Main Results
122 |
123 | Comparison between simulated and empirical TMS-EEG data in channel space. A) Empirical and simulated TMS-EEG time series for 3 representative subjects, showing a robust recovery of individual subjects’ empirical TEP propagation patterns in model-generated EEG time series. B) Pearson correlation coefficients over subjects between empirical and simulated TMS-EEG time series. C) Time-wise permutation test results showing the significant Pearson correlation coefficients (top) and the corresponding reversed p-values (bottom) for every electrode. D) PCI values extracted from the empirical (orange) and simulated (blue) TMS-EEG time series (left). A significant positive correlation (R2 = 80%, p < 0.0001) was found between the simulated and the empirical PCI (right).
124 |
125 | 
126 |
127 |
128 | Comparison between simulated and empirical TMS-EEG data in source space. A) TMS-EEG time series showing a robust recovery of grand-mean empirical TEP patterns in model-generated EEG time series. B) Source-reconstructed TMS-evoked propagation pattern dynamics for empirical (top) and simulated (bottom) data. C) Time-wise permutation test results showing the significant Pearson correlation coefficients (top) and the corresponding reversed p-values (bottom) for every vertex. D) Network-based dSPM values extracted for the grand-mean empirical (left) and simulated (right) source-reconstructed time series.
129 |
130 | 
131 |
132 | Synaptic time constant of the inhibitory population affects early and late TEP amplitudes. A) Singular value decomposition (SVD) topoplots for simulated (top) and empirical (bottom) TMS-EEG data. For both the simulated (top) and empirical (bottom) grand-mean time series, the first (orange outline) and second (blue outline) components were located ∼65 ms and ∼110 ms after the TMS pulse. B) First (orange outline) and second (blue outline) component latency and amplitude were extracted for every subject; the distribution plots (top row) show the time points where the highest cosine similarity with the SVD components was found. Scatter plots (bottom row) show a significant negative (left) and positive (right) correlation between the synaptic time constant of the inhibitory population and the amplitude of the first and second components. C) Model-generated first and second SVD components for 2 representative subjects with a high (top) and low (bottom) value of the synaptic time constant of the inhibitory population. Topoplots show that a higher synaptic time constant affects the amplitude of the individual first and second SVD components. D) Model-generated TMS-EEG data were run using the optimal value (top right) or a value decreased by 85% (centre left) for the synaptic time constant of the inhibitory population. Absolute values for different values of the synaptic time constant of the inhibitory population (bottom right). Results show an increase in the amplitude of the early, local first component and a decrease of the later, global second component.
133 |
134 | 
135 |
--------------------------------------------------------------------------------
/data/Schaefer2018_200Parcels_7Networks_count.csv:
--------------------------------------------------------------------------------
1 | 0 7323 160 2609 521 465 195 66 30 646 330 86 353 139 47 32 0 0 0 0 1 0 0 0 0 1 0 0 0 1 577 1 29 0 0 0 33 2 5 0 1 1 0 14 0 0 25 5 0 1 21 3 0 1 5 20 446 257 25 13 2 1 0 33 4 19 4 21 4 1 73 2 6 86 22 81 8 8 6 80 0 8 31 0 6 83 21 3 20 3 11 9 9 4 7 198 22 0 1 1060 5 3 18 2 10 19 4 10 9 16 3 35 12 24 0 1 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 3 0 0 0 3 0 5 9 3 0 0 0 0 0 2 0 0 3 0 0 0 0 1 0 0 1 4 3 4 0 1 0 1 0 0 0 0 0 0 0 0 5 0 1 0 0 0 0 0 3 1 2 0 0 0 1 0 0 0 1 0 0 14 12 9
2 | 7323 0 1637 5412 8224 550 7688 135 552 1618 1039 75 1492 1158 283 77 4 11 1 1 2 0 0 0 0 4 4 3 5 1 4406 11 80 0 0 0 65 14 5 5 0 13 6 60 1 13 373 36 3 14 113 5 0 7 11 161 1327 768 130 235 9 6 0 322 63 165 75 143 29 4 79 6 2 675 175 1110 56 57 26 168 9 37 270 7 53 544 120 7 132 9 100 34 24 35 27 161 15 0 2 1701 6 15 56 14 26 71 29 38 38 88 13 117 39 74 20 4 8 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 24 10 0 0 0 46 0 33 18 2 1 0 0 1 0 1 1 1 29 0 0 0 1 3 1 2 3 19 25 21 1 0 1 4 3 3 1 0 0 0 1 0 30 7 1 0 0 7 0 0 15 7 7 0 1 2 4 3 0 3 1 0 0 44 38 33
3 | 160 1637 0 41 1871 27 18 1451 41 26 1913 1 177 198 157 135 3 9 1 9 59 0 5 7 0 12 0 1 0 1 4566 585 83 12 1 12 80 38 4 9 8 12 9 24 5 6 82 2 0 4 85 2 0 4 2 11 228 227 13 27 44 58 5 253 7 59 45 99 63 58 48 9 0 63 23 166 29 45 666 1277 2 901 26 0 7 54 33 0 11 1 10 4 425 40 2 27 2 0 4 180 9 4 7 1 25 31 4 24 10 19 8 62 17 41 8 9 8 0 0 1 0 0 0 0 0 1 0 0 0 0 2 1 0 0 21 14 0 0 0 43 0 41 26 4 1 2 2 2 0 1 3 2 26 1 1 1 0 15 0 5 4 3 13 2 0 0 7 4 3 2 0 0 0 0 2 3 26 7 0 2 9 4 0 0 14 16 4 0 0 1 0 1 0 3 5 1 2 40 74 63
4 | 2609 5412 41 0 278 2310 2171 16 18 6243 66 541 1678 124 43 7 0 2 2 0 0 0 1 0 0 0 1 0 1 0 115 2 16 0 0 0 34 2 1 0 0 2 2 7 0 3 42 5 0 1 26 1 0 1 4 39 355 99 32 40 2 5 0 44 10 39 11 33 6 1 62 3 1 135 31 175 9 9 6 23 0 7 62 2 8 120 25 4 34 1 19 6 4 6 3 161 19 0 1 418 4 12 24 4 6 30 19 16 14 54 3 56 17 55 5 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 4 3 0 0 0 2 0 2 3 0 0 0 0 0 0 0 0 0 4 0 0 0 0 1 0 1 0 6 6 9 1 0 0 3 0 0 0 0 0 0 0 0 6 1 0 0 1 0 0 0 6 0 0 0 0 0 0 0 0 2 0 0 0 34 12 6
5 | 521 8224 1871 278 0 89 8058 180 8575 506 3449 27 2450 3887 247 50 3 8 1 2 0 0 0 0 6 4 3 1 1 0 886 6 57 0 1 1 41 4 4 4 0 7 1 31 1 4 251 17 1 5 55 2 0 7 2 45 363 200 60 168 6 4 0 178 25 70 37 65 13 3 20 4 0 276 69 726 57 59 18 81 10 32 97 1 21 202 78 2 44 2 47 18 14 17 17 39 3 0 0 180 12 8 30 7 43 112 6 40 20 47 16 171 39 118 22 19 7 0 0 0 1 0 0 0 0 0 0 1 0 1 0 1 0 0 39 22 0 1 0 49 1 60 35 3 0 0 0 2 0 0 0 5 35 1 1 0 1 2 0 3 0 13 36 10 0 0 7 2 3 0 0 0 0 0 0 1 42 3 1 0 1 9 1 1 20 14 13 1 0 2 0 1 1 1 0 1 0 75 60 46
6 | 465 550 27 2310 89 0 288 22 13 1730 49 292 953 74 35 25 1 0 0 0 0 0 0 0 0 0 0 0 0 0 77 1 24 0 0 0 32 3 8 4 0 0 0 5 0 0 28 1 0 0 5 69 5 3 3 32 233 33 30 3 1 0 0 6 1 6 2 2 2 0 175 108 46 9 4 17 3 5 2 27 0 0 3 15 1 65 2 79 106 140 33 114 3 5 36 927 207 4 13 434 0 0 5 1 4 20 3 6 10 18 2 25 7 25 3 2 1 0 0 0 0 0 0 0 0 0 0 0 0 2 0 4 2 5 2 0 0 1 0 0 0 2 16 1 0 1 0 0 0 0 0 0 2 0 0 0 6 1 0 0 6 1 7 1 0 0 0 1 0 0 0 0 0 0 1 1 4 8 1 0 4 0 0 0 0 0 0 1 0 1 5 1 3 10 2 0 0 17 6 5
7 | 195 7688 18 2171 8058 288 0 43 7452 3757 407 279 10200 3650 172 28 2 2 1 0 0 0 0 0 0 1 4 1 1 0 130 4 9 0 0 0 15 3 4 1 0 2 4 17 3 2 163 14 1 4 34 3 0 1 4 34 197 78 47 107 3 3 0 100 10 61 21 47 9 1 18 6 1 205 43 416 30 32 5 23 2 9 72 2 19 132 48 0 37 3 27 6 8 16 15 59 2 0 1 113 24 23 55 19 72 193 27 120 67 90 38 377 85 290 38 17 8 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 67 15 0 0 0 75 1 64 48 2 0 0 0 0 0 0 0 5 38 1 0 0 0 0 0 8 1 27 45 17 1 0 9 5 2 0 0 1 0 0 0 0 50 4 0 0 3 15 0 0 29 20 16 0 4 0 0 1 0 1 0 0 2 157 94 64
8 | 66 135 1451 16 180 22 43 0 151 15 4285 4 400 702 296 107 1 10 1 6 28 0 4 3 1 8 1 6 0 1 1038 907 91 3 1 8 75 25 8 13 0 12 5 42 2 8 196 7 1 17 95 2 1 3 1 14 174 174 32 100 47 38 2 227 3 53 42 78 58 47 21 2 1 156 60 697 78 137 1900 2015 4 871 38 0 9 82 47 0 21 1 26 11 297 32 11 15 3 0 3 91 6 2 1 1 13 32 1 6 3 11 5 46 12 46 8 11 6 0 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 1 23 6 0 0 0 30 0 23 24 3 3 2 0 0 1 0 1 4 19 0 1 0 0 11 2 5 0 5 15 4 0 0 0 2 0 1 0 0 0 1 1 3 14 3 0 2 1 3 0 0 14 6 5 0 0 2 2 0 0 1 4 2 3 39 30 27
9 | 30 552 41 18 8575 13 7452 151 0 222 3020 4 1975 6256 97 35 0 0 1 1 0 0 0 0 0 0 0 0 1 1 68 4 16 0 0 0 10 2 2 1 0 2 0 8 0 1 79 2 1 2 16 0 0 0 0 14 55 23 18 46 1 1 0 28 1 13 13 10 1 1 5 2 0 64 18 211 17 16 3 8 3 5 21 0 1 39 14 1 10 0 8 1 0 5 0 9 0 0 1 19 10 8 17 9 27 87 3 48 18 49 19 139 26 117 16 12 11 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 38 16 0 0 0 51 0 36 19 2 0 0 0 0 0 0 0 2 32 0 0 0 0 1 0 6 0 7 25 6 1 0 8 1 3 1 0 0 0 0 0 0 29 3 0 0 1 6 0 0 19 7 6 0 0 1 0 0 0 0 0 0 0 58 49 37
10 | 646 1618 26 6243 506 1730 3757 15 222 0 103 6032 16055 612 64 40 0 0 0 0 0 0 1 1 1 0 1 1 0 0 68 1 34 0 0 1 148 3 2 1 0 1 0 5 0 0 48 3 0 0 7 3 0 1 3 16 417 68 59 23 1 1 0 9 7 10 1 18 1 2 233 1 3 71 16 99 5 9 4 27 0 7 34 0 5 51 8 3 18 2 7 5 8 5 5 787 20 0 1 457 11 18 109 28 44 218 72 83 131 240 27 382 157 315 27 9 5 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 35 11 0 0 0 27 0 5 9 0 1 0 0 1 0 0 0 5 17 0 2 0 0 0 1 0 0 7 21 27 0 0 2 2 3 1 0 0 0 0 0 0 44 4 0 0 0 12 0 1 6 6 4 0 0 0 0 0 0 0 0 0 0 240 37 16
11 | 330 1039 1913 66 3449 49 407 4285 3020 103 0 13 2908 6397 392 238 3 7 3 1 9 0 0 1 2 1 2 0 2 1 1206 80 735 1 0 1 344 16 11 4 2 12 4 39 1 5 281 6 0 10 46 1 1 6 0 23 274 225 35 105 8 14 1 157 7 55 29 44 11 4 79 8 2 162 73 529 59 74 198 1748 3 76 51 1 16 92 28 0 32 3 28 11 95 30 16 55 3 0 3 211 12 4 21 6 39 87 10 68 26 50 19 142 38 132 19 12 6 0 0 0 0 1 0 0 1 0 0 2 0 1 2 2 1 1 33 9 0 0 0 27 1 27 27 4 2 0 0 1 1 0 0 3 23 1 0 0 1 3 1 4 2 10 29 5 2 0 4 5 1 1 0 0 0 0 1 0 48 3 0 1 3 8 0 0 19 7 10 0 1 1 1 0 0 0 2 0 1 88 74 41
12 | 86 75 1 541 27 292 279 4 4 6032 13 0 4766 75 11 11 0 0 0 0 0 0 0 0 1 0 0 0 0 0 19 1 9 0 0 0 32 0 1 0 0 0 0 4 0 0 6 0 0 0 2 0 0 0 0 2 58 6 8 3 0 1 0 2 0 3 2 3 0 0 28 0 0 15 2 8 0 2 1 7 0 0 4 0 0 4 1 0 0 0 0 1 1 1 0 128 2 0 0 66 1 2 16 5 9 49 12 19 14 51 8 94 27 71 3 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 0 0 0 0 5 0 3 1 0 0 1 0 0 0 0 0 0 3 0 0 1 0 0 2 0 0 1 3 7 0 0 0 1 1 0 0 0 0 0 0 0 12 1 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 61 17 4
13 | 353 1492 177 1678 2450 953 10200 400 1975 16055 2908 4766 0 15066 756 530 10 4 1 0 2 0 1 2 3 4 3 10 3 1 247 13 838 0 1 0 2591 35 87 16 3 15 2 69 9 4 413 8 3 6 40 16 4 26 3 29 423 94 88 96 3 61 0 72 3 36 22 32 12 5 723 23 1 188 36 515 51 103 17 895 2 111 39 1 6 96 33 5 28 8 25 12 80 25 13 777 46 8 43 420 60 53 181 83 243 747 142 392 220 670 133 1478 323 1166 157 37 30 0 0 0 2 1 0 0 1 2 0 1 0 6 1 10 2 3 214 79 0 0 0 157 1 99 108 11 3 5 1 2 2 5 2 16 86 1 1 0 7 12 0 18 2 44 106 43 2 0 21 25 10 12 0 0 1 0 0 5 247 31 5 3 8 51 0 3 70 50 33 6 3 2 0 1 1 2 1 1 3 838 332 141
14 | 139 1158 198 124 3887 74 3650 702 6256 612 6397 75 15066 0 381 231 8 2 1 1 2 0 0 0 1 1 2 3 1 0 218 6 2031 0 0 1 1122 14 15 0 0 10 1 33 4 3 276 5 2 2 38 2 1 4 2 11 146 75 32 84 2 10 0 78 2 27 13 31 7 1 81 3 0 130 29 379 51 91 12 1620 1 20 31 1 6 65 19 0 14 0 19 3 53 24 15 89 2 0 3 124 23 14 47 18 71 220 31 129 48 142 40 437 88 324 58 13 8 0 0 0 0 0 0 0 2 2 0 1 0 0 0 3 2 0 82 28 0 0 0 61 0 46 39 5 1 1 0 1 0 0 1 7 30 0 1 0 1 8 1 9 2 14 49 8 0 0 3 12 4 5 0 0 0 0 0 1 81 5 1 0 3 25 3 1 32 21 15 4 1 1 0 0 0 0 1 1 1 218 145 70
15 | 47 283 157 43 247 35 172 296 97 64 392 11 756 381 0 5543 614 81 239 11 8 2 6 6 199 19 50 11 4 3 122 47 222 6 15 98 368 589 134 241 4 25 3 437 162 55 592 15 10 56 40 2 0 18 0 2 41 35 6 441 14 403 7 25 1 30 11 74 50 9 299 18 0 115 8 780 123 693 212 710 12 539 20 0 11 10 77 0 14 2 15 6 80 25 13 61 29 0 27 42 0 2 4 1 13 15 1 10 6 12 3 32 15 32 10 9 13 0 3 0 1 1 0 1 0 1 0 5 0 14 0 32 2 7 6 4 5 3 0 26 1 34 115 27 0 0 0 0 3 7 0 0 20 0 0 1 15 4 0 0 0 0 6 1 3 1 4 5 0 0 0 0 0 0 0 0 10 16 1 0 2 7 0 0 1 3 1 1 0 0 1 0 0 1 1 0 0 28 49 53
16 | 32 77 135 7 50 25 28 107 35 40 238 11 530 231 5543 0 909 268 882 66 10 0 5 14 251 25 57 11 4 4 141 31 225 5 16 143 550 921 217 396 5 71 21 2160 506 167 506 62 58 214 64 1 1 30 0 1 50 31 5 105 42 535 16 41 1 126 35 316 203 9 340 29 1 112 23 1455 227 988 96 592 33 483 30 0 43 21 347 0 20 3 67 17 76 40 38 89 34 1 32 50 2 3 4 0 6 11 1 6 6 10 3 33 8 31 6 18 28 1 4 0 0 2 1 1 0 1 1 15 1 61 0 74 5 17 11 10 2 5 0 40 3 79 260 77 0 0 0 2 8 12 0 3 36 0 3 1 39 6 1 1 0 1 1 1 8 0 2 5 1 1 0 0 0 2 0 3 26 51 1 1 8 7 2 0 4 4 14 4 0 0 0 1 0 2 8 2 2 28 57 64
17 | 0 4 3 0 3 1 2 1 0 0 3 0 10 8 614 909 0 1337 844 80 16 0 64 115 242 85 230 19 20 14 9 5 8 10 45 34 29 129 39 106 3 4 3 135 230 159 898 10 165 88 7 1 1 9 0 0 2 8 0 2 18 32 23 33 0 15 2 39 13 0 7 3 0 10 11 26 20 39 1 12 1 16 6 0 8 1 41 0 5 1 5 2 11 6 7 2 2 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 3 1 0 0 0 0 0 0 0 1 1 0 1 0 1 1 8 1 3 0 0 0 0 0 1 0 5 5 4 1 0 0 0 0 4 0 0 1 0 0 0 1 2 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 3 0 0 0 0 0 0 0 1 1 0 2 0 0 0 0 0 1 0 0 0 4 5
18 | 0 11 9 2 8 0 2 10 0 0 7 0 4 2 81 268 1337 0 2167 1468 451 3 131 89 200 447 104 1239 240 32 42 39 5 336 116 385 11 146 25 59 54 505 168 269 2066 1245 1738 101 847 786 25 3 0 689 0 0 9 22 1 5 272 127 391 121 0 36 9 80 80 35 6 0 1 22 21 84 78 69 16 20 21 55 34 0 25 11 189 0 15 0 29 13 64 19 135 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 2 1 0 1 4 0 7 0 2 11 23 1 0 3 8 41 12 39 12 0 1 0 0 0 0 0 2 8 2 19 23 2 1 0 3 18 0 4 0 1 1 3 197 0 0 0 0 1 0 1 0 5 0 0 0 0 0 1 1 1 11 0 6 12 0 29 0 1 1 1 0 0 0 0 0 0 0 0 2 5 3 6 0 0 3
19 | 0 1 1 2 1 0 1 1 1 0 3 0 1 1 239 882 844 2167 0 181 7 0 5 8 52 23 37 16 22 8 1 0 1 27 8 24 8 25 19 56 0 7 5 195 1990 109 384 8 44 64 7 0 0 4 0 0 3 5 1 2 3 7 33 20 0 18 3 38 21 1 3 3 0 8 4 30 16 34 0 3 3 8 9 0 6 4 47 0 3 1 3 1 4 2 9 1 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 1 3 1 1 2 1 1 4 0 1 0 0 0 1 0 13 0 29 6 14 0 0 5 1 0 0 0 3 10 3 0 0 0 2 2 10 0 0 2 0 0 0 12 1 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 6 3
20 | 0 1 9 0 2 0 0 6 1 0 1 0 0 1 11 66 80 1468 181 0 2676 2 1167 742 163 930 243 696 486 167 64 61 1 651 159 193 0 74 6 46 5 31 243 34 1301 140 210 4 13 75 5 0 0 63 0 0 6 7 1 1 205 115 119 89 0 10 3 19 22 158 0 2 1 4 8 26 17 40 25 5 4 44 1 0 3 2 20 0 1 0 4 0 188 2 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 1 3 5 16 5 14 38 34 4 1 46 17 67 83 475 82 0 0 6 0 0 0 0 7 16 4 28 4 1 1 0 0 15 0 1 0 1 0 22 145 0 0 0 1 0 0 1 0 1 0 0 0 0 0 0 0 1 6 0 72 6 0 2 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 0 1 2
21 | 1 2 59 0 0 0 0 28 0 0 9 0 2 2 8 10 16 451 7 2676 0 1 1044 1094 266 1648 153 475 49 37 157 171 2 182 135 242 2 105 2 14 924 160 488 22 63 133 159 4 4 161 45 1 0 140 0 0 19 10 0 3 556 523 165 135 0 15 23 78 155 522 2 0 4 5 6 17 35 80 112 52 9 288 4 0 4 6 30 0 3 0 4 1 694 39 17 1 0 0 0 2 1 0 0 0 2 0 0 0 0 0 1 0 1 4 8 1 0 0 0 40 2 59 15 2 215 254 6 1 79 23 422 27 559 9 1 6 0 0 1 4 1 11 22 9 279 119 39 1 0 2 206 0 6 0 8 15 15 1645 0 0 0 0 1 0 1 5 10 2 0 0 0 1 3 1 5 71 0 12 70 0 37 11 9 1 0 0 0 0 0 2 0 0 0 3 4 3 13 1 1 0
22 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 0 3 0 2 1 0 0 1 1 7 3 301 5165 1 1 0 0 0 0 0 8 2 326 253 0 1 1 0 1 0 19 0 0 3 3 475 304 954 0 0 2 0 0 0 0 0 0 0 0 1 1 0 0 0 17 477 717 0 0 1 0 0 0 2 0 1 0 1 0 9 2 10 17 49 13 43 1 0 63 12 46 35 60 1 0 0 0 0 1 0 0 0 0 0 2 0 0 3 3 0 0 0 0 26 2 42 30 3 108 403 6 2 76 15 285 27 796 8 1 2 1 1 0 2 0 6 17 2 100 40 20 0 0 1 91 0 1 0 2 13 24 1117 0 0 3 0 1 0 1 4 8 0 0 0 0 3 1 3 3 28 1 125 474 7 58 5 4 2 0 0 0 0 0 1 2 0 0 17 8 4 5 0 0 1
23 | 0 0 5 1 0 0 0 4 0 1 0 0 1 0 6 5 64 131 5 1167 1044 0 0 2244 266 848 570 135 47 33 11 10 1 1399 1953 127 3 52 2 14 29 22 51 3 174 9 541 3 5 25 47 0 0 11 0 1 2 1 0 0 77 122 30 3 0 17 11 36 46 237 1 0 1 3 0 3 0 4 11 10 0 57 2 0 1 5 28 0 5 0 15 5 605 17 10 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 3 6 8 1 1 13 18 5 30 151 39 0 0 2 1 0 0 0 1 9 4 8 3 1 0 0 3 1 0 3 0 1 2 7 25 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 4 1 25 1 1 9 0 0 0 1 0 0 0 0 0 0 1 0 0 0 1 1 0 0 2
24 | 0 0 7 0 0 0 0 3 0 1 1 0 2 0 6 14 115 89 8 742 1094 1 2244 0 128 4068 444 653 188 357 8 12 0 739 437 56 5 34 6 12 101 55 40 4 134 15 882 10 15 24 69 1 0 14 0 0 2 1 1 2 38 64 14 3 0 29 19 98 64 277 1 0 0 4 1 5 5 3 3 10 0 47 7 0 4 5 70 0 9 0 16 5 910 65 15 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 4 0 2 6 4 3 0 5 11 12 1 0 9 12 16 24 72 19 0 0 2 0 0 1 0 0 12 8 6 3 2 0 1 5 3 0 0 0 2 0 9 70 0 0 1 0 0 0 2 0 0 0 0 0 0 0 0 0 1 6 0 5 3 1 11 0 1 1 0 1 0 0 0 0 0 1 0 1 1 2 2 0 2 1
25 | 0 0 0 0 6 0 0 1 0 1 2 1 3 1 199 251 242 200 52 163 266 1 266 128 0 143 830 97 96 47 11 1 8 898 1466 191 48 534 33 605 25 87 40 24 621 272 723 4 7 48 20 2 1 90 0 0 4 2 8 1 59 165 393 6 0 12 11 12 39 87 3 2 0 2 0 6 2 16 3 17 2 51 3 0 4 1 23 0 2 1 9 3 235 15 8 1 0 0 1 1 0 4 1 0 2 1 2 0 0 0 1 1 1 4 7 30 38 2 7 1 8 12 0 13 2 5 2 7 1 54 3 128 24 45 4 1 11 8 0 21 2 42 158 50 1 1 2 2 15 64 2 1 20 0 4 4 19 25 0 0 0 1 2 0 14 0 3 2 0 1 0 0 1 0 0 3 18 9 0 0 3 3 5 2 4 18 17 9 3 1 0 1 0 2 0 1 1 4 24 37
26 | 1 4 12 0 4 0 1 8 0 0 1 0 4 1 19 25 85 447 23 930 1648 7 848 4068 143 0 1016 3434 637 455 16 6 5 32 77 31 15 47 23 82 1269 705 72 5 51 19 1946 25 58 120 256 3 1 85 0 0 2 1 1 2 41 148 6 11 0 38 79 92 33 281 5 1 3 14 4 22 5 7 10 27 2 64 3 0 8 22 78 0 25 3 57 11 1151 453 60 1 0 0 3 1 0 0 0 0 0 0 0 0 0 0 0 0 1 3 6 0 1 0 1 31 5 82 13 13 178 200 13 2 82 8 183 21 726 14 0 1 3 1 0 1 0 9 23 6 159 32 30 1 0 0 97 0 5 0 5 5 16 542 0 0 2 0 0 0 0 2 6 0 0 0 3 0 3 0 0 19 2 21 38 3 43 7 2 2 0 0 0 0 0 2 0 1 0 2 10 2 12 0 0 0
27 | 0 4 0 1 3 0 4 1 0 1 2 0 3 2 50 57 230 104 37 243 153 3 570 444 830 1016 0 380 919 389 2 5 2 340 497 13 3 45 32 451 38 229 16 9 763 37 1082 13 16 60 43 1 1 147 0 0 2 0 3 2 11 14 5 6 0 42 12 91 47 42 2 3 0 5 2 3 3 1 3 6 1 13 13 0 10 6 103 1 12 2 40 12 238 54 35 0 3 0 1 3 0 3 0 0 2 0 6 2 0 0 1 2 5 5 5 53 68 2 56 19 52 54 0 51 18 16 24 14 9 115 12 322 141 141 5 3 50 21 4 20 4 29 216 63 6 2 5 14 28 203 6 1 42 3 17 3 89 45 0 0 0 2 1 1 47 2 1 10 1 0 2 3 4 3 0 6 16 25 3 1 6 5 0 0 8 14 17 6 3 7 0 1 0 2 1 1 2 11 55 70
28 | 0 3 1 0 1 0 1 6 0 1 0 0 10 3 11 11 19 1239 16 696 475 301 135 653 97 3434 380 0 850 173 18 10 19 16 16 13 87 33 237 382 871 1761 358 9 30 26 1438 22 66 504 253 3 12 1490 0 1 4 1 1 1 19 43 8 25 1 32 82 61 22 97 19 12 15 8 4 20 6 9 2 27 1 40 7 0 8 25 62 0 14 0 48 14 561 471 108 0 2 0 5 0 0 1 1 0 2 0 0 1 0 0 2 1 1 13 18 1 2 2 0 91 7 196 30 10 378 579 29 9 105 22 340 57 1063 20 6 17 11 0 0 17 1 18 60 17 267 80 86 1 1 2 309 0 12 1 19 23 57 1314 0 0 1 0 0 0 0 3 27 2 1 0 0 2 4 2 4 37 0 35 110 6 77 25 8 10 0 1 0 2 0 5 0 0 0 5 10 7 7 0 0 2
29 | 0 5 0 1 1 0 1 0 1 0 2 0 3 1 4 4 20 240 22 486 49 5165 47 188 96 637 919 850 0 1240 6 4 1 17 6 5 19 11 160 1389 2 67 27 2 72 8 416 15 5 93 57 237 167 2727 0 2 2 0 0 2 5 4 6 5 0 22 7 66 24 6 5 771 182 4 1 4 1 4 0 8 3 5 17 4 9 19 94 4 19 23 35 65 20 27 276 5 11 6 6 2 0 1 2 0 0 0 2 1 0 0 8 4 2 10 20 20 22 4 45 91 68 331 76 148 458 703 52 4 417 170 455 646 4342 344 1 6 58 14 8 21 7 30 213 66 243 22 38 9 7 145 190 1 31 1 41 11 229 826 0 0 2 0 3 2 19 2 18 6 0 1 2 3 6 5 4 32 14 811 286 5 41 19 5 8 1 8 9 6 3 8 0 1 0 12 22 3 7 3 48 71
30 | 1 1 1 0 0 0 0 1 1 0 1 0 1 0 3 4 14 32 8 167 37 1 33 357 47 455 389 173 1240 0 1 1 0 30 20 3 2 4 2 62 14 90 3 2 79 3 221 13 9 26 14 1 0 71 0 0 0 0 0 0 5 5 1 1 0 16 7 29 21 13 0 1 1 0 0 5 0 2 2 0 0 2 4 0 8 3 49 0 6 1 15 8 49 17 11 0 1 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 1 4 3 0 5 7 7 13 2 20 4 6 7 3 4 17 1 81 119 41 1 0 19 1 1 2 1 2 35 17 0 1 0 4 2 35 2 1 8 0 6 1 24 9 0 0 0 0 0 1 3 0 1 1 0 0 0 0 0 0 0 2 1 16 3 1 3 0 1 0 1 1 1 2 1 1 0 0 1 2 2 0 1 4 10 15
31 | 577 4406 4566 115 886 77 130 1038 68 68 1206 19 247 218 122 141 9 42 1 64 157 1 11 8 11 16 2 18 6 1 0 1824 142 19 4 17 213 76 34 22 8 62 55 46 21 77 107 14 11 74 166 4 1 17 4 65 1761 3222 65 22 380 162 10 906 21 173 120 409 524 172 181 7 0 135 74 179 31 215 443 998 129 1162 69 2 27 147 181 3 33 1 49 14 804 77 44 90 9 0 18 1228 11 10 37 7 47 86 8 73 30 25 25 143 36 124 52 23 22 0 0 0 0 0 0 0 2 0 0 2 0 3 3 0 1 1 76 43 0 0 0 203 2 135 132 8 1 5 3 4 2 3 4 3 48 2 2 4 0 26 2 6 2 22 41 7 0 2 36 24 2 10 2 2 0 1 1 2 128 18 2 2 17 30 3 1 41 24 12 4 3 2 2 0 0 2 6 2 4 88 249 191
32 | 1 11 585 2 6 1 4 907 4 1 80 1 13 6 47 31 5 39 0 61 171 0 10 12 1 6 5 10 4 1 1824 0 12 16 4 23 16 61 9 8 3 41 82 22 20 47 51 5 2 65 121 1 0 21 0 5 92 97 0 9 601 166 10 2553 0 88 73 232 380 186 13 1 0 23 26 92 151 464 914 538 155 2222 13 0 12 28 104 0 4 0 16 5 808 36 12 5 1 0 4 7 0 1 1 0 4 1 0 2 0 0 1 7 2 6 2 2 1 0 0 1 0 0 0 0 1 2 0 0 0 0 3 0 2 0 5 1 0 0 0 11 0 9 16 0 2 7 2 0 0 0 3 1 7 0 1 0 0 17 0 0 1 1 1 0 0 0 0 0 0 1 0 1 1 1 0 2 7 1 2 1 9 1 0 0 2 2 2 0 0 2 1 0 0 2 4 0 1 4 12 9
33 | 29 80 83 16 57 24 9 91 16 34 735 9 838 2031 222 225 8 5 1 1 2 0 1 0 8 5 2 19 1 0 142 12 0 1 0 6 3497 654 139 63 4 31 1 21 11 18 184 3 1 6 73 7 0 53 0 5 48 27 14 14 5 2620 3 28 0 13 14 18 7 5 205 7 1 36 9 85 11 53 17 3495 4 1180 7 0 2 29 10 0 6 0 12 5 138 56 40 28 5 1 24 37 3 2 4 2 13 19 2 10 8 26 3 37 9 42 9 4 0 0 1 1 1 0 0 1 3 9 0 0 0 1 11 10 8 0 9 3 0 1 0 8 0 16 36 8 5 4 2 0 0 0 1 0 3 0 0 0 1 31 1 0 0 1 7 1 0 0 2 2 0 0 0 0 0 2 1 4 14 7 3 0 7 2 0 0 4 2 0 1 0 1 0 0 0 1 4 0 3 30 32 20
34 | 0 0 12 0 0 0 0 3 0 0 1 0 0 0 6 5 10 336 27 651 182 0 1399 739 898 32 340 16 17 30 19 16 1 0 2593 1706 5 304 1 55 1 2 25 3 1367 415 45 1 2 57 4 0 0 6 0 0 0 3 0 0 166 268 602 4 0 3 1 8 6 28 0 0 0 1 2 3 1 6 15 6 0 70 1 0 0 0 11 0 0 0 2 0 60 0 2 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 1 0 0 2 1 5 1 4 2 6 6 0 6 13 0 46 126 37 0 0 3 1 0 1 0 2 11 4 0 1 1 0 0 5 1 0 0 0 4 0 8 5 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 25 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 0 0 0 0 0 1
35 | 0 0 1 0 1 0 0 1 0 0 0 0 1 0 15 16 45 116 8 159 135 0 1953 437 1466 77 497 16 6 20 4 4 0 2593 0 1220 4 92 3 20 1 4 21 1 334 48 136 1 4 17 7 0 0 1 0 0 0 0 0 0 18 37 548 2 0 1 2 11 7 44 0 0 0 0 2 1 0 5 0 0 0 16 0 0 0 2 6 0 1 0 2 3 95 4 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 0 1 0 0 1 0 1 0 1 2 4 3 5 13 6 0 0 1 1 0 1 0 2 5 3 1 0 0 0 0 0 3 0 1 0 0 0 2 2 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 0 1 2 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0
36 | 0 0 12 0 1 0 0 8 0 1 1 0 0 1 98 143 34 385 24 193 242 0 127 56 191 31 13 13 5 3 17 23 6 1706 1220 0 19 1220 8 60 6 9 77 11 549 744 124 5 1 100 10 0 0 10 0 0 3 2 3 2 512 1820 2700 13 0 4 3 23 35 97 1 0 1 5 2 11 6 24 12 23 5 242 2 0 1 4 22 0 1 0 1 0 251 15 5 0 1 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 2 0 0 0 0 0 0 0 3 0 0 0 0 4 2 5 4 7 0 0 0 1 0 1 1 7 16 6 2 6 0 0 0 2 2 1 2 0 0 1 3 9 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 1 1 1 1 0 2 0 0 0 0 1 1 0 0 1 0 0 0 0 1 0 0 0 1 4
37 | 33 65 80 34 41 32 15 75 10 148 344 32 2591 1122 368 550 29 11 8 0 2 8 3 5 48 15 3 87 19 2 213 16 3497 5 4 19 0 1040 857 1096 6 120 2 103 42 108 353 2 10 27 77 63 32 229 1 16 84 46 18 28 23 745 13 55 3 23 19 26 4 3 1200 143 12 62 14 139 36 58 15 1964 10 1027 21 2 5 61 17 7 13 19 23 36 91 56 127 60 80 9 214 42 4 6 5 3 16 30 10 13 10 32 8 66 22 93 14 48 54 1 4 2 6 6 0 4 2 4 1 16 1 29 7 92 13 31 27 14 3 3 1 118 6 210 530 114 3 7 0 5 9 30 1 9 87 0 5 5 54 50 1 4 0 7 20 3 17 0 13 17 0 6 1 2 1 1 0 7 59 161 1 4 30 16 2 3 17 21 16 5 3 2 2 1 0 4 10 1 5 53 130 198
38 | 2 14 38 2 4 3 3 25 2 3 16 0 35 14 589 921 129 146 25 74 105 2 52 34 534 47 45 33 11 4 76 61 654 304 92 1220 1040 0 108 732 10 46 28 145 233 498 686 6 6 51 25 1 0 32 1 5 17 24 10 17 164 1637 508 54 1 11 6 25 33 52 53 8 1 30 13 81 24 83 51 433 15 962 9 0 1 18 26 0 7 1 7 1 164 15 6 5 7 1 10 8 0 1 1 0 0 0 1 1 0 0 1 0 6 7 3 22 27 0 4 1 0 1 0 0 1 5 0 6 1 9 3 25 5 5 5 4 1 1 0 23 3 57 162 34 3 5 1 0 7 20 1 1 19 0 1 0 11 10 0 0 0 0 6 0 7 0 7 3 0 1 0 0 1 0 1 2 16 14 2 0 8 5 1 1 5 7 8 3 3 1 0 0 0 0 3 0 3 6 32 49
39 | 5 5 4 1 4 8 4 8 2 2 11 1 87 15 134 217 39 25 19 6 2 326 2 6 33 23 32 237 160 2 34 9 139 1 3 8 857 108 0 949 3 203 5 59 158 139 262 6 6 89 49 545 870 1562 0 6 16 8 7 5 12 48 7 23 0 5 9 9 11 1 341 448 187 20 11 46 13 33 2 178 2 124 4 5 3 33 36 28 49 111 101 158 26 58 692 36 156 63 1272 8 1 13 2 1 5 2 18 2 2 3 8 6 27 70 32 123 154 3 26 8 25 20 3 25 13 39 11 36 14 196 7 529 191 220 12 11 22 18 1 179 11 397 1232 285 7 30 10 26 53 178 12 3 159 3 17 24 254 216 0 5 6 9 16 4 81 3 21 26 3 9 3 4 5 1 8 39 118 532 78 13 145 20 9 3 42 42 57 36 6 8 3 1 1 23 44 14 17 25 221 421
40 | 0 5 9 0 4 4 1 13 1 1 4 0 16 0 241 396 106 59 56 46 14 253 14 12 605 82 451 382 1389 62 22 8 63 55 20 60 1096 732 949 0 9 287 7 74 594 309 507 4 12 54 32 165 250 1190 0 2 6 3 11 7 25 153 91 17 0 9 3 14 12 8 51 186 50 15 3 23 11 22 9 75 2 116 6 1 4 15 41 3 12 21 36 58 49 46 266 10 6 2 17 4 3 7 0 0 4 1 13 3 0 1 4 6 23 32 21 102 172 4 63 15 63 55 9 67 16 27 20 27 30 251 12 723 419 282 4 13 54 25 2 95 14 165 770 204 5 11 4 28 65 348 4 1 121 0 26 9 244 96 0 2 3 5 6 4 91 6 11 16 3 1 1 2 4 2 2 23 58 257 22 3 55 22 14 4 19 42 54 31 10 7 0 1 1 5 16 1 7 41 129 218
41 | 1 0 8 0 0 0 0 0 0 0 2 0 3 0 4 5 3 54 0 5 924 0 29 101 25 1269 38 871 2 14 8 3 4 1 1 6 6 10 3 9 0 1751 16 2 0 10 109 2 20 68 68 1 1 68 0 0 0 1 0 0 9 61 2 5 0 6 12 12 3 466 2 0 1 0 3 5 0 1 5 22 1 48 0 0 1 0 6 0 2 1 10 2 1000 180 31 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 5 0 8 2 0 22 62 1 0 6 1 31 10 29 2 1 4 0 0 1 1 1 2 9 7 31 120 26 8 0 0 66 0 3 0 5 19 13 521 0 0 0 0 1 0 0 3 4 3 0 0 1 1 1 4 0 71 0 0 22 0 93 4 11 5 0 4 2 0 0 1 0 0 0 5 9 10 12 0 0 0
42 | 1 13 12 2 7 0 2 12 2 1 12 0 15 10 25 71 4 505 7 31 160 1 22 55 87 705 229 1761 67 90 62 41 31 2 4 9 120 46 203 287 1751 0 251 92 19 64 1027 43 201 2985 504 16 17 578 0 3 6 18 2 3 42 63 8 115 0 44 80 89 63 124 39 5 10 21 21 75 51 47 12 96 8 106 15 0 17 30 341 0 45 6 268 82 2897 3690 1403 3 4 1 17 1 0 0 0 0 1 0 0 0 0 0 0 1 1 0 7 0 1 1 1 21 0 18 3 2 9 59 2 0 5 4 32 24 37 3 7 13 0 0 0 4 2 6 35 5 33 194 60 11 1 1 67 0 4 0 37 156 34 979 1 0 10 1 2 0 1 6 18 19 0 3 10 19 7 28 34 266 1 5 56 73 820 9 26 23 0 5 2 6 0 23 0 1 2 84 145 68 129 1 12 10
43 | 0 6 9 2 1 0 4 5 0 0 4 0 2 1 3 21 3 168 5 243 488 1 51 40 40 72 16 358 27 3 55 82 1 25 21 77 2 28 5 7 16 251 0 53 21 115 42 11 1 1372 53 1 0 322 0 0 10 13 0 0 138 61 57 166 0 93 30 283 1880 1490 2 0 0 6 13 28 44 64 25 8 7 28 4 0 12 7 488 0 2 1 15 6 1265 25 52 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 2 1 1 0 0 0 0 0 4 1 1 11 26 0 0 4 2 41 3 25 3 0 0 0 0 1 0 0 1 4 3 31 28 1 0 1 0 23 0 1 0 1 7 2 255 0 0 0 0 1 0 0 0 0 1 0 1 0 0 0 2 0 19 1 1 11 1 32 1 0 1 0 0 0 0 0 0 0 0 0 3 4 0 2 0 0 0
44 | 14 60 24 7 31 5 17 42 8 5 39 4 69 33 437 2160 135 269 195 34 22 0 3 4 24 5 9 9 2 2 46 22 21 3 1 11 103 145 59 74 2 92 53 0 603 891 210 38 48 255 76 0 1 43 0 5 51 107 1 45 70 57 8 484 3 146 22 383 374 11 48 14 0 132 242 1152 930 1654 26 92 265 113 37 0 61 26 416 0 25 1 53 13 97 29 62 8 6 0 11 5 0 0 0 0 1 1 0 0 0 1 1 2 1 4 0 4 1 0 1 0 0 0 0 0 0 0 0 2 1 12 0 18 0 6 0 0 0 1 0 8 0 21 77 22 1 0 0 0 0 2 0 0 11 0 1 1 5 10 0 1 1 0 0 1 1 0 0 1 0 1 0 1 0 1 0 1 3 14 1 0 8 2 1 0 0 0 3 1 0 1 0 0 0 1 4 0 0 3 10 11
45 | 0 1 5 0 1 0 3 2 0 0 1 0 9 4 162 506 230 2066 1990 1301 63 1 174 134 621 51 763 30 72 79 21 20 11 1367 334 549 42 233 158 594 0 19 21 603 0 2893 203 8 39 171 3 0 0 7 0 0 11 16 1 4 117 118 265 115 1 16 3 37 28 7 7 3 0 19 25 74 69 83 13 8 25 42 10 0 7 2 68 0 3 0 4 2 13 8 9 0 0 0 4 0 0 0 0 0 0 0 1 0 1 0 0 0 0 2 1 11 10 1 7 6 4 12 1 13 3 2 8 3 7 52 1 109 124 61 0 0 9 3 1 3 0 8 55 10 1 1 1 1 3 36 1 0 15 0 2 0 33 6 0 0 0 0 1 0 7 0 0 0 0 0 0 0 0 0 0 0 2 34 0 1 0 1 1 0 0 2 0 1 0 1 0 0 0 1 2 0 0 3 4 8
46 | 0 13 6 3 4 0 2 8 1 0 5 0 4 3 55 167 159 1245 109 140 133 0 9 15 272 19 37 26 8 3 77 47 18 415 48 744 108 498 139 309 10 64 115 891 2893 0 187 21 73 445 58 0 0 26 0 4 25 50 1 7 1157 251 650 393 1 110 26 209 255 64 15 2 0 34 69 191 326 219 14 13 238 81 20 0 32 15 309 0 9 0 17 5 286 22 45 3 1 0 10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 1 1 0 1 0 0 0 0 0 1 1 0 0 0 7 0 22 2 3 0 1 3 0 0 1 0 7 14 6 1 3 1 0 0 4 0 0 3 0 0 0 5 13 0 0 0 1 0 0 2 0 0 0 0 0 1 2 0 1 1 4 1 5 1 1 6 1 0 0 0 0 1 0 0 1 0 0 0 5 8 1 3 0 5 2
47 | 25 373 82 42 251 28 163 196 79 48 281 6 413 276 592 506 898 1738 384 210 159 19 541 882 723 1946 1082 1438 416 221 107 51 184 45 136 124 353 686 262 507 109 1027 42 210 203 187 0 385 604 489 354 29 10 717 25 10 33 32 890 575 85 418 42 115 11 252 111 470 289 41 235 26 4 672 38 184 66 83 138 436 11 395 340 0 120 128 827 1 192 13 436 211 245 549 949 80 41 1 37 22 2 0 5 0 5 10 4 11 5 6 3 30 17 15 16 9 14 0 0 1 5 7 0 1 12 18 0 5 8 23 22 39 43 7 5 8 2 2 0 44 1 40 89 15 16 17 2 0 5 17 17 2 15 0 5 4 9 112 0 0 9 5 7 1 5 2 6 4 0 1 2 4 2 7 9 18 30 12 11 8 85 6 1 2 2 5 6 5 1 2 2 2 2 32 31 8 15 25 64 63
48 | 5 36 2 5 17 1 14 7 2 3 6 0 8 5 15 62 10 101 8 4 4 0 3 10 4 25 13 22 15 13 14 5 3 1 1 5 2 6 6 4 2 43 11 38 8 21 385 0 62 383 92 2 1 18 0 1 1 13 5 14 4 7 5 27 8 122 73 202 86 2 0 0 2 56 9 101 24 15 3 6 2 7 807 0 222 118 1458 0 96 6 147 74 24 81 106 3 2 0 3 2 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 4 0 0 0 2 0 0 0 1 0 0 0 0 1 0 2 0 0 6 0 0 0 0 0 1 0 0 0 2 2 0 2 2 1 0 1 1 3 11 1 0 0 0 0 0 0 0 1 2 5 2 10 10 0 4 0 2 1
49 | 0 3 0 0 1 0 1 1 1 0 0 0 3 2 10 58 165 847 44 13 4 0 5 15 7 58 16 66 5 9 11 2 1 2 4 1 10 6 6 12 20 201 1 48 39 73 604 62 0 587 6 0 0 114 0 0 1 3 2 1 14 8 14 18 2 9 6 16 7 4 2 0 0 6 2 19 8 10 1 4 2 9 18 0 12 1 63 0 6 0 7 5 13 17 88 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 1 0 0 0 0 0 0 2 0 0 13 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 3 0 0 1 0 6 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 3 0 0 0
50 | 1 14 4 1 5 0 4 17 2 0 10 0 6 2 56 214 88 786 64 75 161 3 25 24 48 120 60 504 93 26 74 65 6 57 17 100 27 51 89 54 68 2985 1372 255 171 445 489 383 587 0 97 30 11 1258 0 0 20 21 0 10 159 58 115 216 0 147 38 367 1203 313 14 5 6 36 40 130 99 126 6 26 20 34 82 0 65 24 1062 0 29 7 117 147 694 397 2129 0 0 1 6 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 2 0 1 2 0 4 0 1 6 21 0 0 0 1 14 7 15 3 1 2 0 1 1 2 1 2 2 2 7 40 5 1 0 4 15 0 2 0 3 6 1 207 0 0 7 0 0 0 0 0 2 1 0 1 0 2 5 2 6 37 0 0 20 8 147 0 0 1 0 0 0 1 0 2 0 0 1 14 33 10 17 0 2 0
51 | 21 113 85 26 55 5 34 95 16 7 46 2 40 38 40 64 7 25 7 5 45 3 47 69 20 256 43 253 57 14 166 121 73 4 7 10 77 25 49 32 68 504 53 76 3 58 354 92 6 97 0 42 15 54 0 0 29 30 4 19 167 235 16 138 0 495 3140 2594 221 68 36 1 10 86 31 234 41 63 90 328 15 369 43 2 45 458 486 2 420 33 1723 401 2344 4992 1093 17 19 3 24 9 1 0 0 0 1 0 4 0 0 0 0 4 0 2 1 0 1 0 2 1 0 1 1 1 0 4 0 0 4 3 3 2 1 0 4 7 0 2 0 1 1 6 9 1 1 11 28 4 3 2 3 0 4 2 17 10 0 35 0 0 116 0 0 0 4 2 8 7 0 40 96 186 83 228 137 84 1 1 19 101 350 3 7 2 0 1 0 1 0 91 4 28 13 949 626 31 29 1 8 8
52 | 3 5 2 1 2 69 3 2 0 3 1 0 16 2 2 1 1 3 0 0 1 475 0 1 2 3 1 3 237 1 4 1 7 0 0 0 63 1 545 165 1 16 1 0 0 0 29 2 0 30 42 0 1002 1199 0 14 32 1 3 0 0 0 0 2 0 2 22 5 4 0 449 34 446 1 0 2 0 0 1 11 0 2 5 19 1 114 25 120 173 992 224 1337 16 14 1501 348 1136 408 783 45 0 1 0 0 1 0 0 0 0 2 1 0 0 0 0 1 1 0 1 0 2 2 2 1 0 11 0 0 0 3 5 10 4 3 3 2 0 1 0 5 0 3 18 3 2 102 28 2 1 0 10 0 6 1 18 124 10 471 1 0 37 0 3 0 0 2 7 5 0 4 12 21 18 22 46 128 1 12 275 119 984 4 8 4 0 0 1 2 1 19 7 9 44 188 293 40 110 1 6 6
53 | 0 0 0 0 0 5 0 1 0 0 1 0 4 1 0 1 1 0 0 0 0 304 0 0 1 1 1 12 167 0 1 0 0 0 0 0 32 0 870 250 1 17 0 1 0 0 10 1 0 11 15 1002 0 293 0 11 1 1 0 0 1 1 0 0 0 0 4 1 2 1 88 267 115 2 0 1 0 0 0 1 0 1 0 11 1 46 14 90 106 308 104 263 1 7 483 16 170 433 700 1 0 2 0 0 2 0 4 1 0 1 1 1 2 6 0 11 12 2 6 2 8 5 6 7 5 18 4 3 14 33 7 108 174 60 2 1 7 1 0 11 0 15 61 14 4 11 5 4 5 24 6 0 11 0 4 3 45 110 0 0 10 0 1 0 8 1 2 1 0 1 2 2 2 10 5 13 6 166 43 12 53 1 0 0 2 3 1 1 0 2 4 4 1 36 31 4 6 4 15 26
54 | 1 7 4 1 7 3 1 3 0 1 6 0 26 4 18 30 9 689 4 63 140 954 11 14 90 85 147 1490 2727 71 17 21 53 6 1 10 229 32 1562 1190 68 578 322 43 7 26 717 18 114 1258 54 1199 293 0 0 1 2 4 1 1 14 30 9 48 1 16 17 27 20 54 92 23 660 10 10 20 18 13 8 63 2 54 2 0 6 8 61 1 29 30 75 144 108 93 1544 2 36 5 108 2 0 1 0 0 4 0 1 0 0 1 3 4 2 5 18 9 5 0 3 120 6 114 20 11 125 811 11 5 40 18 363 134 329 21 10 29 1 3 0 10 3 37 163 33 262 617 168 10 0 1 469 0 22 1 75 308 150 5147 0 1 5 0 8 1 2 11 37 26 0 2 2 7 7 18 19 404 4 19 793 39 974 33 45 29 1 12 3 13 1 12 0 1 5 46 97 69 120 0 5 8
55 | 5 11 2 4 2 3 4 1 0 3 0 0 3 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 25 0 0 0 0 0 0 0 0 249 9 0 520 11 0 0 0 0 176 15 0 2 0 0 2 0 0 92 0 2 0 0 0 0 0 2 57 3 0 177 0 0 3 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 2 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 1 1 0 0 0 0 2 1
56 | 20 161 11 39 45 32 34 14 14 16 23 2 29 11 2 1 0 0 0 0 0 0 1 0 0 0 0 1 2 0 65 5 5 0 0 0 16 5 6 2 0 3 0 5 0 4 10 1 0 0 0 14 11 1 249 0 49 11 361 16 4 10 0 12 177 1 0 0 2 1 28 16 48 217 5 30 6 6 8 12 1 5 365 127 3 6 2 139 7 166 2 1 3 0 0 72 121 25 30 23 0 0 1 0 0 2 0 0 0 1 0 0 1 3 2 0 4 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 5 1 7 13 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 4 0 11 0 1 0 0 0 4 1 0 0 0 0 0 0 0 0 3 1 0 0 0 0 0 0 1 0 0 1 0 0 18 0 0 1 0 0 0 5 8 3
57 | 446 1327 228 355 363 233 197 174 55 417 274 58 423 146 41 50 2 9 3 6 19 2 2 2 4 2 2 4 2 0 1761 92 48 0 0 3 84 17 16 6 0 6 10 51 11 25 33 1 1 20 29 32 1 2 9 49 0 628 152 8 51 32 0 188 8 23 17 45 72 19 202 43 11 34 20 65 16 85 61 192 32 131 21 4 2 71 39 27 35 51 18 36 75 10 27 554 116 1 14 1595 7 3 33 6 14 20 6 24 14 37 5 44 25 36 10 5 8 0 0 0 0 0 0 0 1 1 1 0 0 0 1 4 3 3 17 10 0 0 0 63 1 58 70 5 0 0 0 0 1 1 0 0 12 0 3 0 3 2 1 1 6 8 2 13 1 1 7 4 1 0 0 0 0 0 0 1 26 13 0 1 3 5 0 2 10 7 5 2 1 1 3 1 0 2 0 1 0 33 60 50
58 | 257 768 227 99 200 33 78 174 23 68 225 6 94 75 35 31 8 22 5 7 10 0 1 1 2 1 0 1 0 0 3222 97 27 3 0 2 46 24 8 3 1 18 13 107 16 50 32 13 3 21 30 1 1 4 0 11 628 0 49 5 96 21 1 1092 7 31 14 111 120 32 54 9 0 55 325 46 58 164 64 180 82 136 14 1 10 22 77 0 5 0 16 2 102 14 15 53 7 0 3 259 1 1 8 4 8 23 2 10 7 7 6 24 12 22 8 10 2 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 5 6 0 0 0 31 0 21 26 4 1 1 0 1 0 1 0 1 13 0 0 0 3 2 0 1 0 0 4 5 0 1 3 3 0 0 0 0 0 0 0 1 20 3 0 0 2 3 0 0 7 5 1 2 1 0 1 0 0 3 0 1 0 22 44 36
59 | 25 130 13 32 60 30 47 32 18 59 35 8 88 32 6 5 0 1 1 1 0 0 0 1 8 1 3 1 0 0 65 0 14 0 0 3 18 10 7 11 0 2 0 1 1 1 890 5 2 0 4 3 0 1 520 361 152 49 0 54 1 11 0 2 44 73 4 9 2 1 36 8 4 27 1 4 1 1 4 29 0 13 266 9 9 465 16 10 21 17 16 7 6 2 5 88 24 1 4 370 3 0 7 1 3 4 2 5 4 4 0 8 3 5 1 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 3 1 0 0 0 5 1 9 16 0 0 0 1 0 0 0 0 3 1 0 1 0 0 1 0 0 51 7 2 1 0 1 0 2 0 1 2 0 0 1 0 0 4 1 0 0 1 0 0 1 3 2 0 0 0 0 7 7 1 14 2 0 0 4 9 11
60 | 13 235 27 40 168 3 107 100 46 23 105 3 96 84 441 105 2 5 2 1 3 0 0 2 1 2 2 1 2 0 22 9 14 0 0 2 28 17 5 7 0 3 0 45 4 7 575 14 1 10 19 0 0 1 11 16 8 5 54 0 1 11 1 6 13 34 7 22 15 0 12 3 0 312 2 422 17 69 46 77 1 44 104 0 16 50 49 0 25 2 20 4 18 8 9 7 5 0 1 8 0 0 0 0 2 0 0 1 0 1 1 1 1 4 0 3 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 2 2 0 0 0 8 0 8 13 2 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 3 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 3 2 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 1 1 0 0 4 4 8
61 | 2 9 44 2 6 1 3 47 1 1 8 0 3 2 14 42 18 272 3 205 556 0 77 38 59 41 11 19 5 5 380 601 5 166 18 512 23 164 12 25 9 42 138 70 117 1157 85 4 14 159 167 0 1 14 0 4 51 96 1 1 0 678 2031 1177 1 105 53 239 341 384 7 1 0 38 140 106 762 1359 352 17 1547 1216 14 0 13 22 144 0 2 0 11 2 1375 51 13 1 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 1 2 0 0 1 2 3 1 1 0 0 0 0 0 0 3 0 0 3 0 1 2 0 0 0 1 1 0 0 0 0 3 1 34 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 2 2 9 0 0 0 0 11 0 0 0 0 0 0 0 0 0 0 0 0 6 13 3 2 0 0 1
62 | 1 6 58 5 4 0 3 38 1 1 14 1 61 10 403 535 32 127 7 115 523 0 122 64 165 148 14 43 4 5 162 166 2620 268 37 1820 745 1637 48 153 61 63 61 57 118 251 418 7 8 58 235 0 1 30 0 10 32 21 11 11 678 0 923 100 4 48 71 110 122 385 80 2 3 38 12 74 31 164 172 1433 55 4445 11 0 5 57 47 0 7 1 10 4 1675 138 29 2 0 0 8 8 0 0 0 0 1 0 0 1 0 1 0 0 2 1 0 1 1 0 1 1 0 2 0 0 6 12 0 2 0 4 21 5 19 3 0 0 0 0 0 2 0 9 10 5 10 9 4 0 0 2 8 0 1 0 1 0 3 87 0 0 0 2 0 0 0 0 1 0 0 0 0 2 0 3 4 7 1 3 5 1 22 1 0 1 1 0 0 0 0 1 0 0 0 5 13 2 6 0 2 3
63 | 0 0 5 0 0 0 0 2 0 0 1 0 0 0 7 16 23 391 33 119 165 0 30 14 393 6 5 8 6 1 10 10 3 602 548 2700 13 508 7 91 2 8 57 8 265 650 42 5 14 115 16 0 0 9 0 0 0 1 0 1 2031 923 0 41 0 13 9 46 110 90 0 0 0 1 3 6 14 10 8 4 5 103 2 0 11 2 56 0 2 0 1 1 250 4 11 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 0 0 1 1 0 1 0 3 1 3 6 0 0 0 0 0 0 0 0 1 3 3 0 1 0 0 0 2 1 0 2 0 0 0 3 6 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 5 0 1 0 0 0 1 0 0 1 0 0 0 2 1 0 1 1 0 1
64 | 33 322 253 44 178 6 100 227 28 9 157 2 72 78 25 41 33 121 20 89 135 0 3 3 6 11 6 25 5 1 906 2553 28 4 2 13 55 54 23 17 5 115 166 484 115 393 115 27 18 216 138 2 0 48 0 12 188 1092 2 6 1177 100 41 0 3 210 65 563 823 184 12 10 1 324 1943 287 1318 1642 262 156 946 762 51 0 48 47 456 0 30 6 86 25 562 87 81 3 4 0 1 16 0 1 0 0 1 7 0 1 0 1 1 6 1 8 0 2 1 0 0 1 0 0 0 0 0 0 0 3 1 1 4 2 2 0 4 0 0 1 0 9 1 17 26 4 2 9 1 0 0 0 1 1 7 0 0 2 2 42 0 0 1 2 5 0 2 0 0 0 0 0 2 0 0 3 4 1 6 7 4 1 12 1 0 0 2 3 4 0 0 0 0 3 0 3 7 2 2 7 15 25
65 | 4 63 7 10 25 1 10 3 1 7 7 0 3 2 1 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 21 0 0 0 0 0 3 1 0 0 0 0 0 3 1 1 11 8 2 0 0 0 0 1 176 177 8 7 44 13 1 4 0 3 0 655 3 6 4 0 4 1 0 98 7 33 2 3 1 4 0 6 977 0 198 644 19 0 3 0 0 0 2 1 1 3 0 0 0 5 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 1 6 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 5 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 2 0 0 0 0 0 0 0 0 2 1 0 4 0 0 0 0 0 9
66 | 19 165 59 39 70 6 61 53 13 10 55 3 36 27 30 126 15 36 18 10 15 1 17 29 12 38 42 32 22 16 173 88 13 3 1 4 23 11 5 9 6 44 93 146 16 110 252 122 9 147 495 2 0 16 15 1 23 31 73 34 105 48 13 210 655 0 476 3963 738 38 13 1 0 212 51 246 77 96 43 89 15 106 1501 1 956 413 879 0 129 4 48 14 205 66 58 1 1 0 4 8 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 1 1 0 0 0 0 0 1 0 1 1 0 1 0 0 2 0 0 0 0 2 0 0 0 5 0 7 10 2 0 3 2 0 0 1 0 0 2 1 3 1 1 9 0 0 107 0 0 0 0 1 1 0 0 7 18 13 4 23 9 8 5 3 0 15 39 0 2 1 1 0 0 0 0 9 6 32 1 156 63 1 1 1 6 8
67 | 4 75 45 11 37 2 21 42 13 1 29 2 22 13 11 35 2 9 3 3 23 1 11 19 11 79 12 82 7 7 120 73 14 1 2 3 19 6 9 3 12 80 30 22 3 26 111 73 6 38 3140 22 4 17 0 0 17 14 4 7 53 71 9 65 3 476 0 855 141 21 9 0 3 56 19 103 13 26 44 105 5 142 297 4 157 699 136 2 673 15 202 100 432 351 60 2 6 1 8 6 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 1 0 0 0 0 1 0 0 0 0 1 2 3 1 0 0 0 0 0 0 0 2 2 1 0 1 7 0 1 0 1 0 3 0 5 7 2 2 0 0 214 2 0 0 0 0 2 0 0 16 56 47 24 55 49 21 1 0 2 44 85 0 5 0 1 0 0 0 0 25 9 52 13 444 159 8 14 0 4 5
68 | 21 143 99 33 65 2 47 78 10 18 44 3 32 31 74 316 39 80 38 19 78 0 36 98 12 92 91 61 66 29 409 232 18 8 11 23 26 25 9 14 12 89 283 383 37 209 470 202 16 367 2594 5 1 27 2 0 45 111 9 22 239 110 46 563 6 3963 855 0 4126 111 6 1 1 128 116 449 212 233 111 108 53 185 115 2 393 120 2736 0 49 2 90 39 723 192 136 4 1 0 5 4 0 0 0 0 0 0 0 0 0 1 0 0 0 2 1 1 0 0 0 0 0 0 0 0 1 0 1 0 1 0 2 2 2 0 2 2 0 0 0 1 2 11 6 2 0 5 5 0 0 1 2 0 0 0 1 1 0 14 0 0 35 0 0 0 2 2 1 1 0 9 26 40 20 35 44 23 0 1 2 17 85 0 2 0 0 1 0 0 0 15 5 7 5 207 131 5 17 2 5 3
69 | 4 29 63 6 13 2 9 58 1 1 11 0 12 7 50 203 13 80 21 22 155 0 46 64 39 33 47 22 24 21 524 380 7 6 7 35 4 33 11 12 3 63 1880 374 28 255 289 86 7 1203 221 4 2 20 0 2 72 120 2 15 341 122 110 823 4 738 141 4126 0 557 3 2 0 60 121 297 210 246 129 57 57 164 28 0 86 46 2996 0 14 3 107 56 1422 154 138 3 0 1 1 5 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 2 0 0 2 0 3 2 1 0 1 0 0 0 0 1 0 0 4 0 1 1 9 0 0 0 1 0 2 0 6 2 1 13 0 0 3 0 0 0 0 0 3 1 0 2 7 7 8 18 26 26 0 2 4 11 92 1 3 2 0 0 0 0 0 7 0 1 2 32 90 9 15 1 0 4
70 | 1 4 58 1 3 0 1 47 1 2 4 0 5 1 9 9 0 35 1 158 522 0 237 277 87 281 42 97 6 13 172 186 5 28 44 97 3 52 1 8 466 124 1490 11 7 64 41 2 4 313 68 0 1 54 0 1 19 32 1 0 384 385 90 184 0 38 21 111 557 0 1 0 3 7 14 20 33 71 98 62 10 211 5 0 7 12 61 0 1 0 7 3 2849 9 22 0 0 0 0 4 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 5 0 2 1 0 14 17 0 0 6 0 42 7 12 0 2 4 0 0 0 0 1 0 5 1 18 184 10 2 0 0 40 0 0 0 7 18 3 594 0 0 0 0 0 0 0 2 1 0 0 0 0 1 2 2 6 79 0 1 24 2 103 2 5 2 0 1 0 1 0 6 0 0 0 1 9 13 17 0 1 0
71 | 73 79 48 62 20 175 18 21 5 233 79 28 723 81 299 340 7 6 3 0 2 17 1 1 3 5 2 19 5 0 181 13 205 0 0 1 1200 53 341 51 2 39 2 48 7 15 235 0 2 14 36 449 88 92 2 28 202 54 36 12 7 80 0 12 4 13 9 6 3 1 0 538 65 34 6 59 7 48 7 395 4 173 6 9 4 69 9 92 110 257 76 193 50 20 189 933 1414 240 950 215 22 20 32 16 62 91 22 47 28 90 27 213 73 276 45 109 61 0 2 0 9 5 1 4 3 0 2 6 0 65 2 152 29 46 56 23 3 3 1 188 5 275 624 80 7 5 3 8 5 17 4 16 189 1 5 2 66 23 7 13 18 24 71 4 7 2 12 14 9 14 3 1 1 7 6 6 150 341 6 5 21 20 2 2 67 35 54 5 6 11 5 3 6 27 21 0 3 231 337 325
72 | 2 6 9 3 4 108 6 2 2 1 8 0 23 3 18 29 3 0 3 2 0 477 0 0 2 1 3 12 771 1 7 1 7 0 0 0 143 8 448 186 0 5 0 14 3 2 26 0 0 5 1 34 267 23 0 16 43 9 8 3 1 2 0 10 1 1 0 1 2 0 538 0 29 6 3 20 6 12 1 15 1 4 3 11 2 22 0 39 34 51 5 29 1 2 19 535 891 111 479 47 0 0 1 0 3 1 2 2 1 0 1 4 1 6 0 2 3 0 0 1 1 2 2 2 2 6 0 1 5 7 1 25 76 7 3 0 0 0 0 10 0 5 32 5 1 1 0 0 1 2 3 0 3 0 1 2 10 6 0 0 4 1 0 0 2 0 2 0 0 1 0 0 1 0 0 1 7 75 3 0 8 1 0 1 1 1 0 1 0 1 0 0 0 5 2 0 0 6 12 17
73 | 6 2 0 1 0 46 1 1 0 3 2 0 1 0 0 1 0 1 0 1 4 717 1 0 0 3 0 15 182 1 0 0 1 0 0 1 12 1 187 50 1 10 0 0 0 0 4 2 0 6 10 446 115 660 0 48 11 0 4 0 0 3 0 1 0 0 3 1 0 3 65 29 0 2 0 2 0 0 0 4 0 0 0 31 0 57 3 150 88 161 44 103 1 1 297 273 660 288 128 15 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 1 0 0 3 0 3 0 0 0 14 0 0 1 1 3 4 4 1 0 0 0 0 0 1 0 0 7 1 0 7 2 0 0 1 3 0 1 0 3 17 6 78 0 0 11 0 0 0 0 1 0 0 0 0 2 2 1 3 1 2 0 3 53 7 34 0 3 0 0 0 0 0 0 2 4 5 2 21 13 3 7 0 0 1
74 | 86 675 63 135 276 9 205 156 64 71 162 15 188 130 115 112 10 22 8 4 5 0 3 4 2 14 5 8 4 0 135 23 36 1 0 5 62 30 20 15 0 21 6 132 19 34 672 56 6 36 86 1 2 10 92 217 34 55 27 312 38 38 1 324 98 212 56 128 60 7 34 6 2 0 234 1056 108 181 77 118 34 82 442 2 65 510 207 2 78 11 79 17 37 27 24 8 12 1 8 92 0 2 2 0 2 4 0 3 1 3 3 10 2 11 2 7 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 5 3 0 0 0 31 0 32 50 7 0 0 0 0 0 2 0 0 9 0 0 0 1 0 1 0 30 10 4 0 4 1 9 1 0 0 0 2 0 0 1 0 20 9 2 1 1 1 1 1 6 1 0 1 0 0 11 3 0 16 2 0 0 11 30 41
75 | 22 175 23 31 69 4 43 60 18 16 73 2 36 29 8 23 11 21 4 8 6 0 0 1 0 4 2 4 1 0 74 26 9 2 2 2 14 13 11 3 3 21 13 242 25 69 38 9 2 40 31 0 0 10 0 5 20 325 1 2 140 12 3 1943 7 51 19 116 121 14 6 3 0 234 0 508 600 371 55 76 140 82 44 0 19 33 103 0 18 1 17 9 66 14 16 6 2 0 2 10 1 0 0 0 0 2 0 1 0 1 0 1 1 2 2 1 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 1 1 1 3 0 0 0 0 2 0 11 13 0 1 0 0 0 0 0 1 1 3 0 0 0 1 6 0 0 2 0 0 0 0 0 2 0 0 0 0 0 0 1 0 1 0 2 0 0 4 0 0 0 0 0 0 0 0 0 1 0 0 1 2 0 0 2 0 1
76 | 81 1110 166 175 726 17 416 697 211 99 529 8 515 379 780 1455 26 84 30 26 17 1 3 5 6 22 3 20 4 5 179 92 85 3 1 11 139 81 46 23 5 75 28 1152 74 191 184 101 19 130 234 2 1 20 2 30 65 46 4 422 106 74 6 287 33 246 103 449 297 20 59 20 2 1056 508 0 1869 1838 472 491 140 424 272 0 83 223 518 1 169 9 189 62 200 93 67 26 11 1 14 49 1 3 4 1 8 7 1 5 5 4 1 15 1 19 5 13 7 0 0 0 1 0 0 0 0 2 0 4 0 7 0 17 1 2 4 2 0 2 0 28 0 50 114 31 0 3 0 0 1 2 0 0 7 0 0 0 9 7 0 1 13 5 2 0 2 1 4 1 0 1 1 0 1 1 1 2 14 27 0 3 7 4 0 2 6 2 2 1 1 2 0 3 2 19 7 2 1 19 34 47
77 | 8 56 29 9 57 3 30 78 17 5 59 0 51 51 123 227 20 78 16 17 35 0 0 5 2 5 3 6 1 0 31 151 11 1 0 6 36 24 13 11 0 51 44 930 69 326 66 24 8 99 41 0 0 18 0 6 16 58 1 17 762 31 14 1318 2 77 13 212 210 33 7 6 0 108 600 1869 0 3832 128 83 652 417 23 0 20 22 188 0 15 0 38 6 127 26 29 3 6 1 4 5 0 0 0 0 1 0 0 0 0 0 1 2 1 3 2 2 6 0 0 0 0 0 0 0 0 0 0 1 0 4 0 6 0 0 0 0 0 0 0 5 1 12 27 13 1 0 1 0 0 1 0 0 3 0 0 0 5 14 0 0 1 0 0 0 1 0 0 1 0 0 1 0 0 0 1 1 4 8 0 0 5 0 0 0 0 0 1 0 0 0 0 0 0 1 3 1 2 2 5 6
78 | 8 57 45 9 59 5 32 137 16 9 74 2 103 91 693 988 39 69 34 40 80 0 4 3 16 7 1 9 4 2 215 464 53 6 5 24 58 83 33 22 1 47 64 1654 83 219 83 15 10 126 63 0 0 13 0 6 85 164 1 69 1359 164 10 1642 3 96 26 233 246 71 48 12 0 181 371 1838 3832 0 2027 157 1486 3459 26 0 27 27 188 0 22 1 34 8 248 21 31 11 2 0 4 11 0 1 1 0 2 2 0 5 2 2 1 3 0 7 1 5 5 0 0 0 0 0 0 0 1 1 0 5 0 6 0 16 1 7 0 0 0 1 0 8 1 17 60 15 0 1 1 0 0 2 1 0 3 0 0 1 6 5 0 0 1 0 1 0 2 0 0 0 0 0 0 0 0 0 2 1 7 12 0 0 6 3 1 0 1 1 3 0 0 0 0 0 0 2 6 0 2 3 11 17
79 | 6 26 666 6 18 2 5 1900 3 4 198 1 17 12 212 96 1 16 0 25 112 0 11 3 3 10 3 2 0 2 443 914 17 15 0 12 15 51 2 9 5 12 25 26 13 14 138 3 1 6 90 1 0 8 0 8 61 64 4 46 352 172 8 262 1 43 44 111 129 98 7 1 0 77 55 472 128 2027 0 831 201 4942 16 0 6 30 34 0 6 1 18 2 488 40 8 3 0 0 0 14 1 0 1 0 1 1 0 2 0 0 0 1 0 3 0 2 0 0 0 0 0 0 1 0 3 2 1 0 0 0 3 0 1 0 2 0 0 0 0 1 0 0 7 0 2 1 0 0 0 1 0 1 2 0 0 0 0 22 0 1 0 0 3 0 0 0 1 0 0 0 0 0 0 0 0 2 2 1 0 4 4 0 0 0 1 3 0 0 0 0 0 0 0 1 2 0 1 4 4 2
80 | 80 168 1277 23 81 27 23 2015 8 27 1748 7 895 1620 710 592 12 20 3 5 52 2 10 10 17 27 6 27 8 0 998 538 3495 6 0 23 1964 433 178 75 22 96 8 92 8 13 436 6 4 26 328 11 1 63 0 12 192 180 29 77 17 1433 4 156 4 89 105 108 57 62 395 15 4 118 76 491 83 157 831 0 5 1822 26 0 18 81 42 1 27 3 37 14 911 223 92 45 7 0 32 124 3 1 6 4 21 46 4 19 10 26 5 66 25 67 12 7 2 0 1 1 0 2 0 0 12 13 1 2 1 4 13 14 13 2 15 8 0 0 0 18 0 25 67 13 22 6 1 0 1 2 12 2 13 0 1 1 6 137 0 3 2 4 14 3 0 0 0 4 0 1 1 1 1 1 6 12 19 20 5 2 24 1 0 0 6 2 5 1 0 0 1 1 0 4 18 4 7 64 45 28
81 | 0 9 2 0 10 0 2 4 3 0 3 0 2 1 12 33 1 21 3 4 9 0 0 0 2 2 1 1 3 0 129 155 4 0 0 5 10 15 2 2 1 8 7 265 25 238 11 2 2 20 15 0 0 2 0 1 32 82 0 1 1547 55 5 946 0 15 5 53 57 10 4 1 0 34 140 140 652 1486 201 5 0 367 0 0 4 10 33 0 1 0 5 0 53 4 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 2 1 0 2 0 0 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 2 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0
82 | 8 37 901 7 32 0 9 871 5 7 76 0 111 20 539 483 16 55 8 44 288 1 57 47 51 64 13 40 5 2 1162 2222 1180 70 16 242 1027 962 124 116 48 106 28 113 42 81 395 7 9 34 369 2 1 54 2 5 131 136 13 44 1216 4445 103 762 6 106 142 185 164 211 173 4 0 82 82 424 417 3459 4942 1822 367 0 14 0 13 71 82 1 10 0 32 6 1715 235 42 8 2 0 21 25 0 0 0 0 0 0 0 1 0 6 0 2 0 7 2 1 3 0 0 0 0 1 0 0 7 15 1 0 2 6 23 4 13 2 1 0 0 0 0 9 0 7 20 9 17 11 1 0 0 1 14 0 4 0 3 0 2 106 0 0 0 1 1 0 0 0 2 0 0 0 0 1 1 3 3 14 1 3 6 7 40 1 0 1 1 2 0 0 0 0 0 0 0 6 15 2 6 6 6 7
83 | 31 270 26 62 97 3 72 38 21 34 51 4 39 31 20 30 6 34 9 1 4 0 2 7 3 3 13 7 17 4 69 13 7 1 0 2 21 9 4 6 0 15 4 37 10 20 340 807 18 82 43 5 0 2 57 365 21 14 266 104 14 11 2 51 977 1501 297 115 28 5 6 3 0 442 44 272 23 26 16 26 0 14 0 2 915 2055 2068 1 172 5 44 34 14 13 25 8 1 0 4 10 0 0 0 0 1 1 2 1 0 4 0 3 1 2 0 2 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 10 0 16 10 2 0 0 0 0 0 1 0 2 1 1 0 0 0 2 1 0 21 0 0 0 0 0 2 0 0 0 1 0 0 1 2 2 5 5 0 4 5 0 0 1 0 2 2 0 0 3 2 6 0 24 10 0 0 4 5 8
84 | 0 7 0 2 1 15 2 0 0 0 1 0 1 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 4 0 2 0 0 0 0 0 2 0 5 1 0 0 0 0 0 0 0 0 0 0 2 19 11 0 3 127 4 1 9 0 0 0 0 0 0 1 4 2 0 0 9 11 31 2 0 0 0 0 0 0 0 0 2 0 0 274 2 341 206 151 16 32 1 0 1 53 108 22 29 12 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 3 0 0 0 0 1 0 0 0 0 1 0 0 1 1 0 133 0 465 4 0 0 0 0 0 0 0 3 5 2 0 2 1 0 0 1 3 1 13 0 0 0 1 0 0 0 0 4 1318 48 5 94 7 0 0 2 1 0
85 | 6 53 7 8 21 1 19 9 1 5 16 0 6 6 11 43 8 25 6 3 4 0 1 4 4 8 10 8 9 8 27 12 2 0 0 1 5 1 3 4 1 17 12 61 7 32 120 222 12 65 45 1 1 6 0 3 2 10 9 16 13 5 11 48 198 956 157 393 86 7 4 2 0 65 19 83 20 27 6 18 4 13 915 0 0 472 1524 1 99 2 31 15 23 18 17 2 1 0 4 3 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 2 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 20 0 0 0 0 0 0 0 0 3 5 3 0 4 4 0 1 0 0 0 3 0 0 0 0 1 0 0 0 2 0 2 2 29 13 1 0 1 0 3
86 | 83 544 54 120 202 65 132 82 39 51 92 4 96 65 10 21 1 11 4 2 6 9 5 5 1 22 6 25 19 3 147 28 29 0 2 4 61 18 33 15 0 30 7 26 2 15 128 118 1 24 458 114 46 8 177 6 71 22 465 50 22 57 2 47 644 413 699 120 46 12 69 22 57 510 33 223 22 27 30 81 10 71 2055 274 472 0 145 498 3098 325 570 570 101 113 90 204 317 40 79 56 0 1 0 0 1 0 5 0 0 2 0 1 1 3 1 4 2 0 1 0 0 0 0 0 0 1 1 1 0 0 1 4 0 1 1 3 0 0 0 15 1 25 36 3 1 0 7 0 0 2 1 0 6 2 3 7 6 4 86 2 2508 3 0 0 1 1 4 0 0 94 152 51 10 51 20 13 10 14 16 102 118 1 0 1 4 1 0 0 0 50 800 956 46 1820 165 6 5 13 39 33
87 | 21 120 33 25 78 2 48 47 14 8 28 1 33 19 77 347 41 189 47 20 30 2 28 70 23 78 103 62 94 49 181 104 10 11 6 22 17 26 36 41 6 341 488 416 68 309 827 1458 63 1062 486 25 14 61 0 2 39 77 16 49 144 47 56 456 19 879 136 2736 2996 61 9 0 3 207 103 518 188 188 34 42 33 82 2068 2 1524 145 0 1 232 32 821 556 489 907 1010 5 15 0 8 5 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 1 1 0 0 1 0 0 0 1 3 2 0 2 0 1 1 0 0 0 1 0 0 13 1 0 5 4 1 0 0 1 0 0 1 4 4 3 16 0 0 18 0 0 0 1 1 0 1 0 0 11 28 10 31 31 14 1 2 7 16 89 0 0 0 0 0 0 0 0 11 4 1 8 117 130 8 8 0 2 4
88 | 3 7 0 4 2 79 0 0 1 3 0 0 5 0 0 0 0 0 0 0 0 10 0 0 0 0 1 0 4 0 3 0 0 0 0 0 7 0 28 3 0 0 0 0 0 0 1 0 0 0 2 120 90 1 0 139 27 0 10 0 0 0 0 0 0 0 2 0 0 0 92 39 150 2 0 1 0 0 0 1 0 1 1 341 1 498 1 0 1136 714 40 165 0 1 4 265 655 161 146 25 0 0 0 0 0 0 1 0 1 0 0 2 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 2 1 1 11 0 362 0 0 0 0 0 0 0 0 4 17 5 2 11 2 0 0 4 1 10 28 0 0 0 0 0 0 0 0 3 235 133 17 321 37 1 0 1 2 1
89 | 20 132 11 34 44 106 37 21 10 18 32 0 28 14 14 20 5 15 3 1 3 17 5 9 2 25 12 14 19 6 33 4 6 0 1 1 13 7 49 12 2 45 2 25 3 9 192 96 6 29 420 173 106 29 3 7 35 5 21 25 2 7 2 30 3 129 673 49 14 1 110 34 88 78 18 169 15 22 6 27 1 10 172 206 99 3098 232 1136 0 622 5341 2814 41 360 327 302 572 96 134 45 0 0 0 0 0 0 2 0 2 1 0 2 1 0 1 0 3 1 0 0 0 0 2 0 0 3 0 1 2 2 0 0 1 0 0 1 1 0 0 2 0 3 7 3 0 4 15 0 0 2 0 0 3 1 6 14 8 15 23 0 1308 0 1 0 2 1 1 0 0 107 219 190 33 145 50 12 4 6 27 188 293 0 1 0 0 0 0 0 0 106 317 404 108 3498 467 7 13 9 30 13
90 | 3 9 1 1 2 140 3 1 0 2 3 0 8 0 2 3 1 0 1 0 0 49 0 0 1 3 2 0 23 1 1 0 0 0 0 0 19 1 111 21 1 6 1 1 0 0 13 6 0 7 33 992 308 30 0 166 51 0 17 2 0 1 0 6 0 4 15 2 3 0 257 51 161 11 1 9 0 1 1 3 0 0 5 151 2 325 32 714 622 0 386 1441 16 26 337 477 1358 413 452 52 0 1 0 0 0 2 3 0 0 2 0 0 1 1 0 0 1 0 1 0 0 0 1 0 0 3 0 0 1 3 0 1 2 0 2 0 0 1 0 0 0 3 8 1 0 1 16 0 0 1 0 0 5 0 10 6 1 13 2 0 115 0 0 0 0 1 1 0 0 5 27 27 15 37 41 19 1 6 18 103 245 1 0 0 0 0 0 0 0 10 57 60 68 457 309 4 15 3 7 5
91 | 11 100 10 19 47 33 27 26 8 7 28 0 25 19 15 67 5 29 3 4 4 13 15 16 9 57 40 48 35 15 49 16 12 2 2 1 23 7 101 36 10 268 15 53 4 17 436 147 7 117 1723 224 104 75 0 2 18 16 16 20 11 10 1 86 0 48 202 90 107 7 76 5 44 79 17 189 38 34 18 37 5 32 44 16 31 570 821 40 5341 386 0 4646 137 3390 2966 137 235 35 97 32 0 1 0 0 0 0 1 0 0 0 0 0 0 2 2 1 1 0 2 2 1 0 3 1 0 2 3 3 1 5 4 2 5 1 4 5 1 3 0 1 2 8 12 1 0 6 46 4 0 8 2 1 8 2 38 20 3 34 1 1 262 3 2 0 1 8 6 10 0 51 145 254 114 378 235 114 4 12 32 290 595 4 11 8 0 1 1 2 0 157 40 76 84 2177 1254 30 48 9 16 17
92 | 9 34 4 6 18 114 6 11 1 5 11 1 12 3 6 17 2 13 1 0 1 43 5 5 3 11 12 14 65 8 14 5 5 0 3 0 36 1 158 58 2 82 6 13 2 5 211 74 5 147 401 1337 263 144 1 1 36 2 7 4 2 4 1 25 0 14 100 39 56 3 193 29 103 17 9 62 6 8 2 14 0 6 34 32 15 570 556 165 2814 1441 4646 0 51 311 3433 344 696 95 281 36 0 1 0 0 3 0 2 0 1 1 0 0 1 3 2 1 7 2 2 0 0 3 2 0 0 7 0 2 1 4 7 6 3 3 9 10 1 2 0 3 2 3 11 1 2 23 81 4 1 7 3 0 8 1 62 47 6 86 1 0 218 1 2 0 2 4 9 14 0 14 105 143 99 246 231 174 1 6 67 359 1164 3 11 8 0 1 0 3 1 109 47 72 186 1372 1376 59 105 8 19 15
93 | 9 24 425 4 14 3 8 297 0 8 95 1 80 53 80 76 11 64 4 188 694 1 605 910 235 1151 238 561 20 49 804 808 138 60 95 251 91 164 26 49 1000 2897 1265 97 13 286 245 24 13 694 2344 16 1 108 0 3 75 102 6 18 1375 1675 250 562 2 205 432 723 1422 2849 50 1 1 37 66 200 127 248 488 911 53 1715 14 1 23 101 489 0 41 16 137 51 0 3210 321 3 1 0 13 19 1 0 0 0 0 0 0 0 0 2 0 2 0 0 5 1 0 0 2 1 0 3 2 0 5 21 3 1 7 2 21 7 21 4 14 14 0 3 0 4 3 4 12 1 18 183 75 5 1 0 36 0 4 2 44 105 10 672 0 0 16 1 7 0 2 7 16 14 0 0 23 42 40 61 101 383 3 1 30 112 1306 8 16 21 2 4 0 3 1 44 0 0 3 106 392 103 235 4 7 8
94 | 4 35 40 6 17 5 16 32 5 5 30 1 25 24 25 40 6 19 2 2 39 0 17 65 15 453 54 471 27 17 77 36 56 0 4 15 56 15 58 46 180 3690 25 29 8 22 549 81 17 397 4992 14 7 93 0 0 10 14 2 8 51 138 4 87 1 66 351 192 154 9 20 2 1 27 14 93 26 21 40 223 4 235 13 0 18 113 907 1 360 26 3390 311 3210 0 2784 7 9 0 15 6 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 1 2 0 2 3 0 0 4 3 8 0 3 1 8 12 0 2 0 2 0 3 9 0 3 20 50 2 2 3 9 0 5 1 38 22 8 67 1 0 39 0 3 1 0 6 4 7 1 10 44 79 69 108 179 171 0 4 21 96 602 6 7 12 1 3 0 3 1 57 1 10 18 286 593 54 96 2 10 6
95 | 7 27 2 3 17 36 15 11 0 5 16 0 13 15 13 38 7 135 9 3 17 63 10 15 8 60 35 108 276 11 44 12 40 2 2 5 127 6 692 266 31 1403 52 62 9 45 949 106 88 2129 1093 1501 483 1544 0 0 27 15 5 9 13 29 11 81 1 58 60 136 138 22 189 19 297 24 16 67 29 31 8 92 3 42 25 1 17 90 1010 4 327 337 2966 3433 321 2784 0 112 260 51 325 11 4 0 0 0 2 0 2 0 1 2 1 0 2 5 8 1 6 1 1 11 2 4 6 0 4 54 1 2 5 5 17 12 16 6 19 28 1 1 1 8 1 11 46 12 12 289 165 12 3 13 19 0 11 8 116 419 30 1476 0 0 66 0 7 0 3 12 39 43 0 9 62 83 81 105 203 653 5 6 294 408 3308 22 29 23 0 6 1 1 0 113 6 20 51 417 1018 187 465 7 34 23
96 | 198 161 27 161 39 927 59 15 9 787 55 128 777 89 61 89 2 0 1 0 1 12 0 0 1 1 0 0 5 0 90 5 28 0 0 0 60 5 36 10 1 3 1 8 0 3 80 3 0 0 17 348 16 2 0 72 554 53 88 7 1 2 0 3 3 1 2 4 3 0 933 535 273 8 6 26 3 11 3 45 0 8 8 53 2 204 5 265 302 477 137 344 3 7 112 0 1870 64 116 713 9 8 14 7 20 55 10 31 14 61 10 114 32 115 20 21 8 0 0 2 2 0 0 0 0 0 1 0 0 10 0 29 8 20 17 6 2 0 0 36 0 25 64 9 0 2 0 0 0 2 0 2 35 0 0 2 14 4 0 5 23 3 13 5 0 0 2 3 2 2 3 2 2 5 3 2 42 68 4 2 9 5 0 0 5 6 8 1 0 3 11 8 3 40 11 2 2 139 77 46
97 | 22 15 2 19 3 207 2 3 0 20 3 2 46 2 29 34 2 0 0 0 0 46 1 0 0 0 3 2 11 1 9 1 5 0 0 1 80 7 156 6 1 4 0 6 0 1 41 2 0 0 19 1136 170 36 1 121 116 7 24 5 1 0 0 4 0 1 6 1 0 0 1414 891 660 12 2 11 6 2 0 7 0 2 1 108 1 317 15 655 572 1358 235 696 1 9 260 1870 0 1176 2098 129 7 2 1 2 11 7 3 3 1 12 2 24 4 46 9 35 19 0 5 0 2 1 0 2 0 1 1 1 0 34 0 121 27 57 24 4 0 1 0 58 0 47 190 27 0 1 1 2 2 11 0 4 52 0 1 2 57 8 1 3 42 11 20 0 1 0 2 3 2 5 6 11 8 9 9 2 36 248 5 11 31 5 1 1 12 9 19 1 2 8 19 22 2 84 46 2 2 52 109 102
98 | 0 0 0 0 0 4 0 0 0 0 0 0 8 0 0 1 0 0 0 0 0 35 0 0 0 0 0 0 6 0 0 0 1 0 0 0 9 1 63 2 0 1 0 0 0 0 1 0 0 1 3 408 433 5 0 25 1 0 1 0 0 0 0 0 0 0 1 0 1 0 240 111 288 1 0 1 1 0 0 0 0 0 0 22 0 40 0 161 96 413 35 95 0 0 51 64 1176 0 599 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 4 0 5 3 6 0 0 0 0 0 0 0 1 7 2 0 1 0 0 0 0 0 0 1 0 0 0 2 0 0 0 19 0 0 0 0 0 0 0 0 0 1 0 0 0 2 1 0 9 1 1 4 0 0 0 0 0 1 0 0 0 2 1 3 13 5 1 0 0 2 1
99 | 1 2 4 1 0 13 1 3 1 1 3 0 43 3 27 32 0 1 2 0 0 60 0 0 1 3 1 5 6 1 18 4 24 0 0 0 214 10 1272 17 1 17 0 11 4 10 37 3 0 6 24 783 700 108 0 30 14 3 4 1 0 8 0 1 0 4 8 5 1 0 950 479 128 8 2 14 4 4 0 32 0 21 4 29 4 79 8 146 134 452 97 281 13 15 325 116 2098 599 0 9 7 10 2 0 10 9 7 8 2 2 3 17 14 31 20 82 50 1 8 2 6 6 1 7 0 3 2 9 1 60 1 176 41 86 29 10 7 1 1 134 1 161 516 82 0 3 2 4 6 36 7 7 95 0 5 9 84 18 3 6 20 14 28 2 15 1 8 6 6 8 5 4 0 3 6 5 53 326 5 5 23 17 2 4 33 20 19 7 1 3 3 5 4 34 19 0 3 15 173 243
100 | 1060 1701 180 418 180 434 113 91 19 457 211 66 420 124 42 50 0 0 0 0 2 1 0 0 1 1 3 0 2 0 1228 7 37 1 0 1 42 8 8 4 0 1 0 5 0 0 22 2 0 0 9 45 1 2 0 23 1595 259 370 8 1 8 0 16 5 8 6 4 5 4 215 47 15 92 10 49 5 11 14 124 0 25 10 12 3 56 5 25 45 52 32 36 19 6 11 713 129 0 9 0 5 6 22 5 14 21 11 16 12 34 5 35 14 33 5 3 1 0 0 0 1 0 0 0 0 0 0 0 0 3 0 9 1 5 14 4 0 0 0 18 0 14 16 2 0 0 0 0 1 0 0 1 7 0 0 0 0 1 0 0 1 6 3 10 0 0 3 2 2 2 1 0 0 0 0 1 22 10 0 3 5 2 1 0 8 3 1 1 0 0 1 2 0 5 3 0 0 20 33 16
101 | 5 6 9 4 12 0 24 6 10 11 12 1 60 23 0 2 0 0 0 0 1 0 0 0 0 0 0 0 0 0 11 0 3 0 0 0 4 0 1 3 0 0 0 0 0 0 2 0 0 0 1 0 0 0 0 0 7 1 3 0 0 0 0 0 0 0 0 0 0 0 22 0 0 0 1 1 0 0 1 3 0 0 0 0 0 0 0 0 0 0 0 0 1 0 4 9 7 0 7 5 0 534 4313 78 713 142 32 463 28 26 283 560 19 270 230 14 17 1 0 2 0 8 0 3 2 0 0 0 0 0 1 2 1 1 4155 213 0 1 0 92 0 8 8 0 4 5 11 5 1 0 5 9 34 1 5 0 1 1 15 19 27 164 1140 478 2 0 19 54 8 24 4 5 7 5 9 52 52 4 0 2 4 260 43 10 140 122 25 10 3 13 2 0 0 6 8 6 7 19 12 13
102 | 3 15 4 12 8 0 23 2 8 18 4 2 53 14 2 3 0 0 0 0 0 0 0 0 4 0 3 1 1 0 10 1 2 0 0 0 6 1 13 7 0 0 0 0 0 0 0 0 0 0 0 1 2 1 0 0 3 1 0 0 1 0 0 1 0 0 1 0 0 0 20 0 0 2 0 3 0 1 0 1 0 0 0 0 1 1 0 0 0 1 1 1 0 0 0 8 2 0 10 6 534 0 3540 967 330 460 334 306 572 443 128 429 443 508 153 16 19 0 0 0 0 2 0 1 1 0 0 0 0 6 0 2 2 0 426 73 0 0 0 58 0 4 11 3 0 2 0 4 4 0 1 7 37 1 2 18 0 1 16 33 21 222 239 590 2 0 10 12 12 31 3 2 0 1 5 8 102 48 35 36 21 86 7 4 154 84 28 7 3 7 7 15 8 23 7 3 0 705 181 53
103 | 18 56 7 24 30 5 55 1 17 109 21 16 181 47 4 4 0 0 0 0 0 0 0 0 1 0 0 1 2 0 37 1 4 0 0 0 5 1 2 0 0 0 0 0 0 0 5 1 0 0 0 0 0 0 0 1 33 8 7 0 0 0 0 0 0 0 0 0 0 0 32 1 0 2 0 4 0 1 1 6 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 14 1 0 2 22 4313 3540 0 5654 1283 9847 930 7645 4298 323 827 3283 534 947 284 17 28 0 1 0 0 1 0 0 0 0 0 1 0 0 0 1 1 0 3071 95 0 0 0 41 0 5 3 1 0 5 3 22 1 2 0 184 248 16 14 0 0 0 87 312 284 366 1069 679 5 0 6 54 172 383 25 3 6 11 2 5 28 7 0 0 1 63 12 5 1243 762 215 47 7 149 9 4 0 61 12 3 1 50 13 5
104 | 2 14 1 4 7 1 19 1 9 28 6 5 83 18 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7 0 2 0 0 0 3 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 4 1 0 0 0 0 0 0 0 0 0 0 0 16 0 0 0 0 1 0 0 0 4 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 7 2 0 0 5 78 967 5654 0 51 1401 3423 203 6465 474 27 492 2089 1572 41 4 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 57 9 0 0 0 23 0 2 1 0 0 1 0 1 0 0 0 17 28 2 1 0 0 0 11 29 25 27 73 34 0 0 0 6 6 32 3 1 0 0 0 3 36 1 0 0 1 3 1 1 100 74 20 2 3 14 2 0 0 5 2 1 0 59 4 0
105 | 10 26 25 6 43 4 72 13 27 44 39 9 243 71 13 6 0 0 0 0 2 1 0 0 2 0 2 2 0 0 47 4 13 0 0 0 16 0 5 4 0 1 0 1 0 0 5 0 0 0 1 1 2 4 0 0 14 8 3 2 0 1 0 1 0 0 0 0 0 1 62 3 0 2 0 8 1 2 1 21 0 0 1 0 0 1 0 0 0 0 0 3 0 0 2 20 11 0 10 14 713 330 1283 51 0 335 80 5061 75 62 3236 3077 78 1546 1787 77 101 0 2 4 1 6 1 3 10 0 2 2 8 5 1 0 0 0 4910 8189 0 4 0 221 2 30 30 1 14 49 18 26 7 3 9 90 320 5 12 0 0 3 29 97 75 178 752 294 5 0 43 132 63 103 14 12 6 20 30 97 120 6 0 1 17 1256 73 28 526 475 293 69 21 51 4 1 0 27 27 21 25 26 20 16
106 | 19 71 31 30 112 20 193 32 87 218 87 49 747 220 15 11 0 0 0 0 0 0 0 0 1 0 0 0 0 0 86 1 19 0 0 0 30 0 2 1 0 0 0 1 0 0 10 0 0 0 0 0 0 0 0 2 20 23 4 0 0 0 0 7 0 0 0 0 0 0 91 1 0 4 2 7 0 2 1 46 0 0 1 0 0 0 0 0 0 2 0 0 0 0 0 55 7 0 9 21 142 460 9847 1401 335 0 847 19586 6999 766 607 20316 884 1492 116 38 50 0 1 2 1 1 0 0 0 0 0 0 0 0 1 1 0 0 285 75 0 1 0 41 1 6 5 1 0 2 3 27 5 6 1 168 452 14 21 1 1 0 63 218 153 219 395 97 5 0 8 56 111 253 25 2 1 3 1 2 59 2 0 0 3 48 10 13 960 577 380 83 22 104 3 3 0 48 11 0 1 70 22 8
107 | 4 29 4 19 6 3 27 1 3 72 10 12 142 31 1 1 1 0 0 0 0 0 0 0 2 0 6 0 2 1 8 0 2 0 0 0 10 1 18 13 0 0 0 0 1 0 4 0 0 0 4 0 4 1 0 0 6 2 2 0 0 0 0 0 0 0 0 0 0 0 22 2 1 0 0 1 0 0 0 4 0 0 2 1 0 5 0 1 2 3 1 2 0 0 2 10 3 0 7 11 32 334 930 3423 80 847 0 175 1605 1295 30 473 1308 1327 73 45 33 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 55 15 0 0 0 60 0 7 7 0 0 3 2 3 0 1 0 6 55 2 1 69 0 2 4 9 6 43 50 107 2 1 3 8 2 7 1 0 0 0 1 0 170 232 103 109 52 14 2 0 56 33 23 4 1 6 9 50 30 69 11 0 1 1184 284 78
108 | 10 38 24 16 40 6 120 6 48 83 68 19 392 129 10 6 0 0 0 0 0 0 0 0 0 0 2 1 1 0 73 2 10 0 0 1 13 1 2 3 0 0 0 0 0 0 11 0 0 0 0 0 1 0 0 0 24 10 5 1 0 1 0 1 0 0 0 0 0 0 47 2 0 3 1 5 0 5 2 19 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 31 3 0 8 16 463 306 7645 203 5061 19586 175 0 251 264 6951 13713 91 1060 477 31 39 0 0 1 0 1 0 0 0 0 0 0 2 1 0 0 0 0 2070 368 1 0 0 56 0 12 7 3 1 7 3 15 3 4 1 167 377 9 16 0 1 2 33 159 121 168 437 176 7 1 6 51 82 153 14 2 1 3 1 8 60 3 0 0 4 72 8 2 689 491 356 74 23 71 5 0 1 43 10 4 2 26 14 2
109 | 9 38 10 14 20 10 67 3 18 131 26 14 220 48 6 6 0 0 0 0 0 0 0 0 0 0 0 0 0 0 30 0 8 0 0 0 10 0 2 0 0 0 0 0 1 0 5 0 0 0 0 0 0 0 0 0 14 7 4 0 0 0 0 0 0 0 0 0 0 0 28 1 0 1 0 5 0 2 0 10 0 0 0 0 0 0 0 1 2 0 0 1 0 0 1 14 1 0 2 12 28 572 4298 6465 75 6999 1605 251 0 1578 82 6838 8631 4470 75 13 17 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 48 20 0 0 0 35 0 4 5 0 0 3 0 6 1 1 0 27 60 2 1 0 0 0 16 37 28 42 92 57 1 0 1 7 32 54 5 0 0 0 0 1 40 2 0 0 1 8 1 0 176 89 39 5 2 19 1 0 0 12 1 0 1 97 2 1
110 | 16 88 19 54 47 18 90 11 49 240 50 51 670 142 12 10 0 0 0 0 0 0 0 0 0 0 0 0 0 0 25 0 26 0 0 0 32 0 3 1 0 0 0 1 0 0 6 0 0 0 0 2 1 1 0 1 37 7 4 1 0 1 0 1 1 1 0 1 0 0 90 0 0 3 1 4 0 2 0 26 0 6 4 0 0 2 0 0 1 2 0 1 2 0 2 61 12 0 2 34 26 443 323 474 62 766 1295 264 1578 0 53 859 3707 2261 60 16 18 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 44 15 0 0 0 44 0 3 3 1 0 1 0 1 0 0 0 5 30 0 0 0 0 0 2 6 9 55 33 166 0 0 2 2 0 4 0 1 0 0 0 0 102 1 0 0 2 6 1 0 39 23 21 4 2 1 0 0 0 3 1 0 0 412 10 5
111 | 3 13 8 3 16 2 38 5 19 27 19 8 133 40 3 3 0 0 0 0 1 2 0 0 1 0 1 2 8 0 25 1 3 0 0 0 8 1 8 4 1 0 1 1 0 0 3 0 0 1 0 1 1 3 0 0 5 6 0 1 0 0 0 1 0 0 0 0 0 0 27 1 0 3 0 1 1 1 0 5 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 10 2 0 3 5 283 128 827 27 3236 607 30 6951 82 53 0 7587 49 1433 3695 101 162 1 0 1 1 1 0 1 7 0 0 1 8 3 2 0 1 0 626 2649 0 1 0 296 0 65 41 6 9 58 8 37 11 5 7 73 324 7 15 1 1 2 20 54 30 64 276 92 6 2 18 68 25 51 14 8 4 9 6 51 48 5 0 0 14 1028 44 9 337 284 310 64 30 27 2 0 0 16 24 9 34 12 6 21
112 | 35 117 62 56 171 25 377 46 139 382 142 94 1478 437 32 33 1 0 0 0 0 0 0 0 1 0 2 1 4 0 143 7 37 0 0 0 66 0 6 6 0 1 0 2 0 0 30 1 0 0 4 0 1 4 1 0 44 24 8 1 0 0 0 6 0 0 0 0 0 0 213 4 0 10 1 15 2 3 1 66 1 2 3 0 0 1 1 2 2 0 0 0 2 0 0 114 24 0 17 35 560 429 3283 492 3077 20316 473 13713 6838 859 7587 0 2219 10654 4434 184 218 0 1 0 0 1 0 1 4 0 1 1 1 4 2 0 0 0 838 696 0 0 1 417 1 42 40 3 4 22 5 34 11 6 1 130 630 8 15 1 0 5 23 102 70 148 396 211 5 1 19 78 45 123 9 4 1 3 5 18 191 7 0 3 13 214 28 8 654 430 556 112 47 51 2 1 0 20 9 7 18 231 25 18
113 | 12 39 17 17 39 7 85 12 26 157 38 27 323 88 15 8 0 0 0 1 1 0 0 0 1 1 5 1 2 0 36 2 9 0 0 0 22 6 27 23 0 1 0 1 0 0 17 0 0 0 0 0 2 2 0 1 25 12 3 1 0 2 0 1 1 0 0 0 0 0 73 1 0 2 1 1 1 0 0 25 0 0 1 0 1 1 0 1 1 1 0 1 0 0 2 32 4 0 14 14 19 443 534 2089 78 884 1308 91 8631 3707 49 2219 0 6909 76 46 58 1 3 1 0 1 2 0 0 3 0 0 0 0 1 0 0 0 68 26 0 0 0 93 0 4 24 1 1 9 2 6 2 0 0 12 72 2 5 21 0 2 6 18 16 40 63 49 1 1 4 14 10 23 0 1 1 1 1 4 505 37 6 8 16 11 2 0 91 48 46 10 9 10 1 7 3 12 5 1 6 384 336 67
114 | 24 74 41 55 118 25 290 46 117 315 132 71 1166 324 32 31 1 0 2 0 4 3 0 1 4 3 5 13 10 1 124 6 42 0 1 1 93 7 70 32 0 0 2 4 2 3 15 0 0 0 2 0 6 5 0 3 36 22 5 4 1 1 0 8 1 0 1 2 0 0 276 6 0 11 2 19 3 7 3 67 0 7 2 0 0 3 0 0 0 1 2 3 0 1 5 115 46 0 31 33 270 508 947 1572 1546 1492 1327 1060 4470 2261 1433 10654 6909 0 3020 455 526 4 5 2 2 4 5 1 5 3 0 3 3 1 4 2 1 1 487 501 0 0 0 1404 0 174 164 15 19 88 6 37 44 19 6 108 745 6 27 33 1 28 17 59 41 120 320 143 14 1 76 112 43 62 6 4 7 9 12 42 1307 55 8 26 79 295 25 13 466 393 551 127 56 42 3 8 3 24 36 13 65 693 423 137
115 | 0 20 8 5 22 3 38 8 16 27 19 3 157 58 10 6 0 1 1 1 8 3 0 0 7 6 5 18 20 1 52 2 9 0 0 0 14 3 32 21 1 7 1 0 1 0 16 0 0 1 1 0 0 18 0 2 10 8 1 0 0 0 0 0 0 1 1 1 1 0 45 0 0 2 2 5 2 1 0 12 0 2 0 1 0 1 0 0 1 0 2 2 5 0 8 20 9 0 20 5 230 153 284 41 1787 116 73 477 75 60 3695 4434 76 3020 0 339 398 2 8 7 2 6 2 1 28 1 5 6 18 6 10 2 3 0 727 1208 1 9 3 5054 6 1563 543 27 62 299 31 46 23 12 15 71 549 13 36 1 2 24 8 40 32 68 335 106 17 4 1548 87 22 43 6 8 8 21 51 157 367 29 4 1 136 3942 75 32 319 293 477 121 48 40 3 0 0 22 67 48 187 44 9 181
116 | 1 4 9 0 19 2 17 11 12 9 12 1 37 13 9 18 3 2 3 1 1 0 1 0 30 0 53 1 20 4 23 2 4 2 1 0 48 22 123 102 0 0 1 4 11 1 9 1 0 1 0 1 11 9 0 0 5 10 2 3 0 1 1 2 0 1 0 1 0 0 109 2 1 7 1 13 2 5 2 7 0 1 2 0 0 4 1 0 0 0 1 1 1 0 1 21 35 0 82 3 14 16 17 4 77 38 45 31 13 16 101 184 46 455 339 0 2595 162 653 50 131 20 1 9 7 1 11 167 9 133 1 68 7 18 46 155 16 14 11 349 108 1251 440 262 7 20 31 104 123 374 7 21 2137 20 246 1 18 3 0 7 6 4 18 12 77 12 342 27 5 9 5 35 22 8 10 26 101 72 0 2 17 427 138 146 38 40 932 55 140 60 2 0 1 41 54 3 13 55 50 37
117 | 2 8 8 1 7 1 8 6 11 5 6 0 30 8 13 28 1 1 1 0 0 0 0 4 38 1 68 2 22 3 22 1 0 1 1 2 54 27 154 172 1 1 0 1 10 1 14 0 0 2 1 1 12 5 0 4 8 2 0 0 0 1 1 1 1 1 1 0 1 0 61 3 1 4 0 7 6 5 0 2 0 3 3 0 0 2 0 0 3 1 1 7 0 0 6 8 19 0 50 1 17 19 28 9 101 50 33 39 17 18 162 218 58 526 398 2595 0 148 622 104 179 24 1 4 5 1 10 62 10 71 6 71 12 15 68 302 10 4 4 288 30 617 316 157 3 28 52 1606 709 741 12 17 890 52 506 1 26 15 1 20 14 11 68 12 344 26 288 151 4 16 11 105 72 30 23 71 75 100 0 3 71 651 712 281 90 243 2529 350 1819 160 0 0 2 83 189 7 18 31 23 23
118 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 2 0 2 2 4 0 0 0 0 0 0 0 1 0 3 4 0 1 0 0 1 0 0 0 0 0 0 0 2 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 2 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 1 0 1 0 1 4 2 162 148 0 596 153 441 8 0 6 1 0 20 35 24 111 5 8 38 32 3 3 26 38 2 1 16 45 23 29 1 3 15 13 32 159 1 8 649 12 1062 0 1 6 0 2 2 0 4 0 63 5 14 5 0 2 2 10 7 5 4 7 0 1 0 0 16 6 18 2 1 7 11 9 2 35 0 1 1 15 26 1 5 0 1 0
119 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 4 0 1 2 1 0 0 0 2 7 1 56 0 45 5 0 0 1 0 0 0 4 4 26 63 0 1 0 1 7 1 0 0 0 1 2 1 6 3 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 2 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 2 2 2 0 1 0 5 0 8 0 0 0 1 0 2 1 0 0 0 0 0 1 3 5 8 653 622 596 0 206 580 23 0 8 0 1 6 12 7 49 4 17 8 35 1 3 120 37 0 11 9 60 28 15 1 2 32 26 161 2078 1 4 717 25 491 0 1 6 0 5 2 0 3 0 231 6 4 7 1 6 6 34 33 10 1 7 1 11 2 0 20 5 18 2 4 10 32 10 7 103 0 0 0 25 66 1 5 2 0 2
120 | 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 1 3 40 26 0 6 1 31 19 91 91 7 0 1 1 2 0 0 2 1 8 15 5 21 0 0 6 0 1 0 0 2 1 0 2 120 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 5 0 1 3 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 2 0 1 0 11 2 0 0 2 0 2 0 0 0 4 2 0 1 0 0 1 0 1 2 7 50 104 153 206 0 1939 2135 0 408 138 12 125 133 208 223 411 15 432 73 11 19 829 355 71 2 165 107 24 38 25 73 335 24 36 524 142 7 311 20 1224 0 3 973 0 6 5 0 10 0 247 252 157 18 3 7 3 30 48 6 4 71 0 3 32 1 38 29 170 97 6 12 11 13 6 90 0 0 0 35 59 4 24 0 0 1
121 | 0 0 0 0 1 0 0 0 0 0 0 0 2 0 1 0 0 0 1 5 2 2 0 4 8 5 52 7 68 7 0 0 1 1 1 0 6 0 25 63 0 0 0 0 4 0 5 0 0 0 0 2 8 6 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 9 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 2 2 2 0 6 1 0 0 0 0 1 1 0 0 0 0 1 0 0 2 2 131 179 441 580 1939 0 375 0 103 15 1 27 37 73 165 20 43 229 119 1 3 1327 283 26 1 33 42 26 28 3 1 16 22 68 2869 19 7 442 15 801 0 5 18 0 0 4 1 7 0 307 61 41 3 1 9 0 24 12 7 4 10 0 9 1 0 12 6 37 17 4 13 11 11 2 58 0 0 0 16 31 1 2 0 0 2
122 | 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 2 0 7 4 16 59 42 1 3 12 82 54 196 331 13 0 0 0 5 0 0 6 1 20 55 8 18 4 0 12 0 7 1 1 4 1 2 5 114 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 2 5 2 3 0 0 0 0 0 0 2 0 1 0 0 0 0 1 0 0 0 0 3 3 1 4 0 1 0 6 0 8 2 1 0 6 1 0 1 0 0 1 1 1 4 6 20 24 8 23 2135 375 0 0 2830 2363 5 1022 446 372 906 258 19 779 75 66 115 1679 406 192 2 406 321 7 60 78 20 235 52 9 190 1277 0 85 1 145 0 2 303 0 1 1 4 26 1 95 475 722 57 2 3 2 12 61 5 14 227 2 5 20 0 8 103 446 390 6 39 13 23 5 16 0 0 0 5 15 2 3 0 0 0
123 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 5 15 30 1 0 0 13 0 30 76 2 0 0 0 1 0 0 0 0 3 9 2 3 1 0 1 0 0 0 0 0 1 2 6 20 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 2 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 2 1 3 2 2 2 6 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 2 5 2 1 1 0 0 0 0 0 0 0 0 704 0 0 0 0 0 124 262 12 1 0 0 0 0 12 0 6 177 8 0 0 4 0 0 0 0 0 1 0 0 342 528 67 0 1 2 0 0 0 1 0 2 1 0 0 0 0 0 2 0 0 6 261 130 56 113 6 1 0 0 0 0 0 0 2 1 7 1 18 8 0 1 6 131 183
124 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 2 1 14 2 3 3 5 13 13 51 10 148 20 0 0 1 4 1 0 4 0 25 67 0 2 1 0 13 0 1 0 0 1 1 1 7 11 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 4 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 2 1 7 0 3 1 0 0 3 0 0 0 0 0 1 1 0 1 1 9 4 6 8 408 103 2830 0 0 1471 2 1574 619 410 1637 25 5 74 17 16 20 1172 120 952 1 244 217 6 32 67 12 95 6 0 21 369 0 87 1 110 0 0 12 0 1 0 1 8 0 10 176 276 12 0 1 3 6 13 1 7 282 0 5 3 0 2 36 83 118 0 7 3 3 0 8 0 0 0 4 13 5 3 0 1 1
125 | 0 0 0 0 0 0 0 1 0 0 1 0 1 2 0 0 1 11 0 38 215 108 6 11 2 178 18 378 458 4 2 1 3 2 0 3 2 1 13 16 22 9 11 0 3 1 12 0 1 6 0 0 5 125 0 0 1 0 0 0 1 6 1 0 0 1 0 1 0 14 3 2 0 0 0 0 0 1 3 12 0 7 0 0 0 0 1 0 0 0 0 0 5 2 4 0 0 0 0 0 2 1 0 0 10 0 0 0 0 0 7 4 0 5 28 7 5 1 0 138 15 2363 0 1471 0 3 1780 228 1512 910 527 4 211 35 48 52 58 44 49 9 152 204 6 57 1144 292 55 9 3 2 982 0 41 2 25 1 0 99 0 2 1 1 14 0 9 116 475 17 0 1 0 4 11 6 9 192 0 1 0 0 21 132 71 158 5 4 8 3 2 8 0 0 0 7 22 11 26 0 0 1
126 | 0 1 1 0 0 0 0 1 0 0 0 0 2 2 1 1 1 23 0 34 254 403 8 12 5 200 16 579 703 6 0 2 9 6 1 0 4 5 39 27 62 59 26 0 2 1 18 0 1 21 4 11 18 811 0 0 1 0 0 0 2 12 1 0 0 1 1 0 2 17 0 6 14 0 1 2 0 1 2 13 0 15 0 0 0 1 0 0 3 3 2 7 21 3 54 0 1 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 3 3 1 1 1 0 1 12 1 5 704 2 3 0 0 0 2 0 12 425 5855 52 0 1 0 1 0 3 0 3 160 18 1 2 5 0 0 0 8 0 4 0 11 1092 480 1911 0 0 1 0 0 0 0 0 0 1 0 0 0 1 1 1 1 3 2 457 868 59 188 1 2 0 0 0 0 0 0 1 0 1 1 20 25 0 0 1 17 32
127 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 4 6 6 1 1 2 13 24 29 52 7 0 0 0 6 0 0 1 0 11 20 1 2 0 0 8 0 0 0 0 0 0 0 4 11 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 2 0 0 0 0 0 0 0 1 1 0 1 0 0 0 1 0 0 0 0 3 0 3 0 1 1 1 0 2 0 0 0 0 0 2 0 0 0 0 0 0 1 0 0 5 11 10 20 6 125 27 1022 0 1574 1780 0 0 127 4244 1032 131 2 346 248 12 8 150 13 269 1 53 48 4 18 278 93 49 6 0 12 372 3 302 6 187 0 0 14 0 5 8 0 5 0 4 31 91 2 1 3 7 15 17 14 14 457 0 5 1 1 15 24 14 29 2 1 1 3 0 39 0 0 0 29 41 17 18 0 0 0
128 | 0 1 0 1 1 0 0 0 0 0 2 0 1 1 5 15 1 0 1 1 1 2 1 0 7 2 14 9 4 3 2 0 0 0 1 0 16 6 36 27 0 0 0 2 3 0 5 0 0 0 0 0 3 5 0 1 0 0 0 0 0 2 1 3 0 1 0 0 0 0 6 1 0 0 0 4 1 5 0 2 0 0 0 0 0 1 0 0 1 0 3 2 1 0 2 0 1 0 9 0 0 0 1 0 2 0 0 0 0 0 1 1 0 3 6 167 62 35 12 133 37 446 0 619 228 0 127 0 75 1092 6 25 19 13 2 5 704 1077 2422 1 416 366 10 621 64 54 88 6 5 101 175 0 442 2 66 0 0 9 0 1 2 2 3 0 114 119 70 4 1 1 1 12 28 3 14 305 0 2 0 0 8 10 45 20 0 2 14 3 1 12 0 0 0 4 19 13 7 0 1 0
129 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 3 0 46 79 76 13 9 1 82 9 105 417 4 0 0 0 6 2 0 1 1 14 30 6 5 4 1 7 0 8 0 0 0 4 0 14 40 0 0 0 0 0 1 1 0 0 1 0 0 0 1 2 6 0 5 1 0 0 0 0 0 0 1 0 2 0 0 0 0 1 0 2 1 1 1 7 4 5 0 0 0 1 0 0 0 0 0 8 0 0 2 0 0 8 1 0 3 18 9 10 24 7 208 73 372 0 410 1512 2 4244 75 0 1945 1135 56 931 801 15 14 30 7 8 7 29 121 52 103 601 387 44 3 5 12 206 9 432 14 294 2 2 103 0 8 13 1 5 0 6 15 116 5 3 5 9 22 24 35 20 182 0 8 2 0 108 70 10 42 0 4 2 5 1 74 0 0 0 57 136 17 79 1 0 1
130 | 0 1 0 0 1 2 0 1 0 1 1 0 6 0 14 61 1 8 13 17 23 15 18 12 54 8 115 22 170 17 3 0 1 13 4 4 29 9 196 251 1 4 2 12 52 7 23 0 0 1 3 3 33 18 0 1 0 0 0 0 2 4 3 1 0 0 0 0 0 0 65 7 1 0 1 7 4 6 0 4 0 6 0 0 0 0 3 0 2 3 5 4 2 3 5 10 34 4 60 3 0 6 0 0 5 0 0 1 1 1 3 4 0 1 6 133 71 111 49 223 165 906 0 1637 910 0 1032 1092 1945 0 109 250 541 438 7 11 389 159 1075 8 83 203 80 794 232 231 94 12 21 158 374 3 1075 21 339 0 7 55 0 13 8 1 5 0 51 50 114 11 2 9 5 27 30 21 29 519 0 12 0 1 38 19 26 33 2 14 24 15 2 69 0 1 0 34 80 33 42 0 0 1
131 | 0 0 2 1 0 0 0 0 0 0 2 0 1 0 0 0 1 41 0 67 422 285 5 16 3 183 12 340 455 1 3 3 11 0 3 2 7 3 7 12 31 32 41 0 1 0 22 0 1 14 3 5 7 363 0 0 1 0 0 0 3 21 1 4 0 2 1 2 3 42 2 1 3 0 1 0 0 0 3 13 0 23 0 0 0 1 2 0 0 0 4 7 21 8 17 0 0 0 1 0 1 0 0 0 1 1 1 0 0 0 2 2 1 4 10 1 6 5 4 411 20 258 0 25 527 12 131 6 1135 109 0 93 1052 93 6 4 6 4 3 4 4 35 135 83 2181 1646 262 2 2 7 264 1 120 4 287 0 29 479 0 7 7 0 1 0 5 8 37 3 1 7 4 20 31 24 34 134 1 3 16 0 401 20 6 9 2 1 0 5 0 59 0 0 1 42 122 33 302 0 0 1
132 | 0 0 1 0 1 4 0 0 0 0 2 0 10 3 32 74 8 12 29 83 27 27 30 24 128 21 322 57 646 81 0 0 10 46 5 5 92 25 529 723 10 24 3 18 109 22 39 0 0 7 2 10 108 134 0 0 4 1 0 1 1 5 3 2 0 0 2 2 2 7 152 25 4 1 1 17 6 16 0 14 0 4 1 0 0 4 0 0 0 1 2 6 7 0 12 29 121 5 176 9 2 2 1 0 0 1 0 0 0 1 0 0 0 2 2 68 71 8 17 15 43 19 124 5 4 425 2 25 56 250 93 0 2010 217 0 0 26 12 0 3 3 51 545 622 11 23 19 14 21 132 2 1 131 0 37 128 1683 610 0 0 1 0 2 0 40 3 5 12 0 1 1 4 7 4 3 6 3 421 77 24 196 7 10 1 11 18 16 14 2 12 0 3 1 22 30 2 22 1 33 3
133 | 0 0 0 0 0 2 0 0 0 0 1 0 2 2 2 5 1 39 6 475 559 796 151 72 24 726 141 1063 4342 119 1 2 8 126 13 4 13 5 191 419 29 37 25 0 124 2 43 0 0 15 1 4 174 329 0 0 3 0 0 0 1 19 6 2 0 0 3 2 1 12 29 76 4 0 1 1 0 1 1 13 0 13 0 1 0 0 2 1 1 2 5 3 21 3 16 8 27 3 41 1 1 2 1 0 0 0 0 0 0 0 1 0 0 1 3 7 12 38 8 432 229 779 262 74 211 5855 346 19 931 541 1052 2010 0 1716 2 6 87 7 0 0 3 51 133 126 239 243 97 3 0 14 80 3 235 2 256 141 448 2088 0 4 13 0 1 0 6 10 18 1 1 1 6 14 10 11 8 49 0 1183 296 11 200 10 7 11 0 1 1 1 0 46 0 1 1 36 80 7 80 0 22 3
134 | 0 0 0 0 0 5 0 1 0 0 1 0 3 0 7 17 3 12 14 82 9 8 39 19 45 14 141 20 344 41 1 0 0 37 6 7 31 5 220 282 2 3 3 6 61 3 7 0 0 3 0 3 60 21 0 0 3 0 1 0 0 3 0 0 0 0 1 0 0 0 46 7 1 0 1 2 0 7 0 2 0 2 0 1 0 1 0 0 0 0 1 3 4 1 6 20 57 6 86 5 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 18 15 32 35 73 119 75 12 17 35 52 248 13 801 438 93 217 1716 0 1 1 38 3 1 0 2 35 45 148 29 48 8 3 0 44 13 1 144 2 91 10 37 178 0 1 1 0 0 0 12 0 2 0 0 2 3 8 2 5 5 28 0 51 2 1 31 4 5 0 0 7 2 0 0 14 1 1 1 7 21 6 16 0 5 1
135 | 5 24 21 4 39 2 67 23 38 35 33 8 214 82 6 11 0 0 0 0 1 1 0 0 4 0 5 6 1 1 76 5 9 0 0 0 27 5 12 4 1 7 0 0 0 0 5 0 0 1 4 3 2 10 0 0 17 5 3 2 0 0 0 4 0 0 0 2 1 2 56 3 0 5 3 4 0 0 2 15 0 1 1 0 0 1 1 0 0 2 4 9 14 8 19 17 24 0 29 14 4155 426 3071 57 4910 285 55 2070 48 44 626 838 68 487 727 46 68 3 1 11 1 66 1 16 48 0 12 2 15 7 6 0 2 1 0 2093 9 18 0 205 4 53 24 9 35 102 126 83 13 6 60 21 117 3 35 0 0 16 31 64 91 498 4205 763 30 20 228 2189 35 90 44 87 106 98 235 710 170 9 2 0 37 2863 461 222 402 795 124 80 14 72 5 1 0 30 103 88 69 30 33 44
136 | 3 10 14 3 22 0 15 6 16 11 9 0 79 28 4 10 0 1 0 0 6 2 0 0 1 1 3 17 6 0 43 1 3 0 0 0 14 4 11 13 4 13 0 0 0 1 8 1 0 2 7 2 1 29 0 0 10 6 1 2 0 0 0 0 0 2 0 2 0 4 23 0 0 3 0 2 0 0 0 8 0 0 0 0 0 3 1 0 1 0 5 10 14 12 28 6 4 0 10 4 213 73 95 9 8189 75 15 368 20 15 2649 696 26 501 1208 155 302 3 3 19 3 115 0 20 52 1 8 5 14 11 4 0 6 1 2093 0 19 18 5 190 10 86 46 9 52 170 268 1923 78 52 111 94 489 7 70 1 1 40 6 57 60 88 644 64 223 67 445 1760 33 63 62 132 245 92 298 1006 47 11 1 0 83 5822 3114 647 422 779 559 825 132 94 1 0 1 44 146 91 96 7 16 21
137 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 2 0 0 5 6 0 1 2 2 11 3 50 11 58 19 0 0 0 3 1 0 3 1 22 54 0 0 0 0 9 3 2 0 0 0 0 0 7 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 1 2 0 0 7 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 16 10 26 120 829 1327 1679 0 1172 58 0 150 704 30 389 6 26 87 38 9 19 0 625 313 0 468 240 11 79 6 5 85 8 4 1584 44 1 20 2 304 0 2 4 0 0 0 0 2 0 65 228 301 8 0 2 0 7 13 2 5 33 0 3 0 1 3 19 93 105 1 4 1 8 1 22 0 0 0 6 9 2 1 0 0 0
138 | 0 0 0 0 1 1 0 0 0 0 0 0 0 0 3 5 0 0 1 0 0 1 1 0 8 1 21 0 14 1 0 0 1 1 1 1 3 1 18 25 0 0 0 1 3 0 2 0 0 1 2 1 1 3 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 3 0 0 0 0 2 0 1 0 0 0 0 0 0 0 0 0 0 0 1 3 2 3 2 1 0 1 0 1 0 1 0 0 0 4 1 0 0 0 0 1 0 0 0 9 14 4 38 37 355 283 406 0 120 44 1 13 1077 7 159 4 12 7 3 18 18 625 0 553 9 2644 835 103 198 19 18 197 11 4 494 59 0 45 6 280 0 5 5 0 2 3 2 8 0 340 1806 1169 11 0 3 4 33 48 5 6 151 0 3 3 0 12 33 298 251 4 5 5 2 0 49 0 0 0 4 18 3 8 0 0 2
139 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 4 0 8 1 0 0 0 0 0 0 1 0 1 2 1 0 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 3 11 4 2 0 71 26 192 0 952 49 0 269 2422 8 1075 3 0 0 1 0 5 313 553 0 5 786 226 3 57 6 5 29 1 0 11 35 0 26 0 23 0 0 1 0 0 0 0 0 0 10 75 120 4 0 0 0 3 4 1 0 41 0 0 0 0 0 9 25 28 2 1 0 0 0 7 0 0 0 0 3 1 1 0 0 0
140 | 3 46 43 2 49 0 75 30 51 27 27 5 157 61 26 40 1 0 0 0 4 2 0 1 21 1 20 17 21 2 203 11 8 1 1 1 118 23 179 95 1 4 0 8 3 1 44 0 0 2 1 5 11 10 1 5 63 31 5 8 3 2 0 9 3 5 0 1 1 0 188 10 1 31 2 28 5 8 1 18 1 9 10 0 1 15 1 0 2 0 1 3 4 2 8 36 58 0 134 18 92 58 41 23 221 41 60 56 35 44 296 417 93 1404 5054 349 288 1 11 2 1 2 12 1 9 3 1 1 7 8 4 3 0 0 205 190 0 9 5 0 4 3273 2430 66 24 136 8 13 20 22 9 37 460 2 24 41 9 20 8 24 41 63 248 25 14 5 2253 41 20 42 4 5 6 7 18 44 3895 145 11 38 90 1603 17 55 174 113 201 26 18 15 1 6 3 18 27 15 56 93 268 800
141 | 0 0 0 0 1 0 1 0 0 0 1 0 1 0 1 3 0 0 0 0 1 0 0 0 2 0 4 1 7 1 2 0 0 0 0 1 6 3 11 14 1 2 0 0 0 0 1 0 0 1 1 0 0 3 0 1 1 0 1 0 0 0 0 1 0 0 0 2 0 1 5 0 0 0 0 0 1 1 0 0 0 0 0 0 0 1 0 0 0 0 2 2 3 0 1 0 0 0 1 0 0 0 0 0 2 1 0 0 0 0 0 1 0 0 6 108 30 16 9 165 33 406 0 244 152 0 53 416 29 83 4 3 3 2 4 10 468 2644 786 4 0 1246 7 16 43 24 162 1 5 50 184 0 154 0 106 0 0 7 0 1 0 1 6 0 90 829 1024 7 0 0 3 19 45 14 36 395 1 0 0 0 12 13 70 59 1 0 11 1 0 21 0 0 0 4 24 5 8 0 0 0
142 | 5 33 41 2 60 2 64 23 36 5 27 3 99 46 34 79 5 2 3 7 11 6 1 0 42 9 29 18 30 2 135 9 16 2 2 7 210 57 397 165 2 6 1 21 8 7 40 0 0 2 6 3 15 37 1 7 58 21 9 8 0 9 1 17 1 7 2 11 0 0 275 5 0 32 11 50 12 17 0 25 2 7 16 0 2 25 0 0 3 3 8 3 4 3 11 25 47 1 161 14 8 4 5 2 30 6 7 12 4 3 65 42 4 174 1563 1251 617 45 60 107 42 321 6 217 204 3 48 366 121 203 35 51 51 35 53 86 240 835 226 3273 1246 0 888 438 107 148 117 32 86 401 192 23 1235 6 97 5 5 50 4 23 16 33 50 1 635 617 3955 34 14 25 13 28 40 19 35 544 141 41 2 2 64 1213 198 231 84 67 189 21 25 33 2 0 0 18 42 22 43 5 17 102
143 | 9 18 26 3 35 16 48 24 19 9 27 1 108 39 115 260 5 8 10 16 22 17 9 12 158 23 216 60 213 35 132 16 36 11 5 16 530 162 1232 770 9 35 4 77 55 14 89 4 0 2 9 18 61 163 2 13 70 26 16 13 3 10 3 26 6 10 2 6 4 5 624 32 7 50 13 114 27 60 7 67 1 20 10 3 1 36 13 0 7 8 12 11 12 9 46 64 190 7 516 16 8 11 3 1 30 5 7 7 5 3 41 40 24 164 543 440 316 23 28 24 26 7 177 6 6 160 4 10 52 80 135 545 133 45 24 46 11 103 3 2430 7 888 0 442 25 79 36 26 88 340 23 9 442 3 63 326 1815 642 3 10 12 14 27 5 421 31 330 24 5 10 3 12 16 12 5 29 205 500 49 154 597 476 63 27 41 72 99 32 13 21 3 24 8 82 75 8 58 23 404 1260
144 | 3 2 4 0 3 1 2 3 2 0 4 0 11 5 27 77 4 2 3 4 9 2 4 8 50 6 63 17 66 17 8 0 8 4 3 6 114 34 285 204 7 5 3 22 10 6 15 0 0 2 1 3 14 33 0 1 5 4 0 2 0 5 3 4 0 2 1 2 0 1 80 5 1 7 0 31 13 15 0 13 0 9 2 0 1 3 1 1 3 1 1 1 1 0 12 9 27 2 82 2 0 3 1 0 1 1 0 3 0 1 6 3 1 15 27 262 157 29 15 38 28 60 8 32 57 18 18 621 103 794 83 622 126 148 9 9 79 198 57 66 16 438 442 0 52 58 36 12 36 214 38 6 415 2 47 6 47 187 0 3 3 1 4 0 295 74 104 12 0 0 4 4 10 2 6 82 7 26 10 2 69 52 35 12 18 15 29 17 7 14 0 0 1 8 22 2 22 1 7 6
145 | 0 1 1 0 0 0 0 3 0 1 2 0 3 1 0 0 1 19 0 28 279 100 8 6 1 159 6 267 243 0 1 2 5 0 1 2 3 3 7 5 31 33 31 1 1 1 16 0 0 7 1 2 4 262 0 0 0 1 0 0 1 10 0 2 0 0 0 0 1 18 7 1 0 0 1 0 1 0 2 22 2 17 0 0 0 1 0 0 0 0 0 2 18 3 12 0 0 0 0 0 4 0 0 0 14 0 0 1 0 0 9 4 1 19 62 7 3 1 1 25 3 78 0 67 1144 1 278 64 601 232 2181 11 239 29 35 52 6 19 6 24 43 107 25 52 0 2593 33 0 0 2 525 0 18 0 31 1 1 570 0 2 2 1 8 0 4 25 247 12 0 0 2 2 14 3 15 855 8 0 2 0 72 131 33 55 1 7 4 1 2 7 0 0 1 3 19 24 124 0 0 3
146 | 0 0 2 0 0 1 0 2 0 0 0 1 5 1 0 0 0 23 0 4 119 40 3 3 1 32 2 80 22 1 5 7 4 1 0 6 7 5 30 11 120 194 28 0 1 3 17 0 3 40 11 102 11 617 0 0 0 1 0 0 2 9 1 9 0 3 1 5 1 184 5 1 7 0 0 3 0 1 1 6 0 11 0 0 0 0 5 0 4 1 6 23 183 20 289 2 1 1 3 0 5 2 5 1 49 2 3 7 3 1 58 22 9 88 299 20 28 3 2 73 1 20 0 12 292 2 93 54 387 231 1646 23 243 48 102 170 5 18 5 136 24 148 79 58 2593 0 547 31 4 10 280 4 44 13 581 1 8 991 0 3 6 3 27 5 20 40 551 48 0 6 12 36 79 41 69 572 38 2 14 2 983 497 99 173 14 26 19 17 1 102 0 0 0 21 194 277 866 1 2 3
147 | 0 0 2 0 0 0 0 0 0 0 0 0 1 0 0 0 0 2 0 1 39 20 1 2 2 30 5 86 38 0 3 2 2 1 0 0 0 1 10 4 26 60 1 0 1 1 2 2 0 5 28 28 5 168 0 0 0 0 1 1 0 4 0 1 0 2 7 5 9 10 3 0 2 0 0 0 1 1 0 1 0 1 0 0 0 7 4 0 15 16 46 81 75 50 165 0 1 0 2 0 11 0 3 0 18 3 2 3 0 0 8 5 2 6 31 31 52 15 32 335 16 235 4 95 55 5 49 88 44 94 262 19 97 8 126 268 85 197 29 8 162 117 36 36 33 547 0 240 32 202 1831 4 79 15 806 7 22 1569 1 10 19 6 78 1 350 563 381 301 5 27 57 348 3172 99 328 6721 6 4 45 10 1481 104 1320 423 28 124 34 128 11 1540 1 0 6 122 1062 176 763 2 12 12
148 | 0 1 2 0 2 0 0 0 0 1 1 0 2 1 0 2 0 1 2 1 1 0 0 0 2 1 14 1 9 4 4 0 0 0 0 0 5 0 26 28 8 11 0 0 1 0 0 0 1 1 4 2 4 10 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 2 8 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 4 4 5 2 12 0 2 0 4 0 5 4 22 1 26 27 3 15 6 1 37 34 6 37 46 104 1606 13 26 24 22 52 0 6 9 0 6 6 3 12 2 14 3 3 83 1923 8 11 1 13 1 32 26 12 0 31 240 0 1791 508 37 12 171 20 142 0 4 39 1 14 19 13 221 2 1512 70 96 1079 9 20 42 197 308 48 129 398 4 11 2 1 66 272 2504 311 157 841 572 2936 1447 150 0 0 0 68 222 18 35 0 3 3
149 | 0 0 0 0 0 0 0 1 0 0 1 0 2 0 3 8 0 0 2 0 0 0 0 1 15 0 28 1 7 2 2 0 0 0 0 0 9 7 53 65 0 1 1 0 3 0 5 0 0 0 3 1 5 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 5 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 2 3 0 2 0 6 1 1 4 1 0 7 5 0 3 1 0 11 11 2 44 23 123 709 32 161 36 68 9 0 0 3 0 0 5 5 21 2 21 0 0 13 78 4 4 0 20 5 86 88 36 0 4 32 1791 0 686 1 4 162 14 174 0 9 9 0 9 4 5 53 2 603 9 18 122 1 5 5 47 43 18 11 23 4 33 1 0 26 36 587 38 27 174 139 215 160 74 0 0 0 38 81 2 7 0 5 5
150 | 2 1 1 0 0 0 0 0 0 0 0 0 5 0 7 12 4 3 10 0 2 1 3 5 64 0 203 2 145 35 3 0 0 5 0 2 30 20 178 348 0 1 0 2 36 4 17 0 0 4 2 0 24 1 0 0 1 1 0 0 1 2 2 0 0 1 0 1 0 0 17 2 1 2 0 2 1 2 1 2 0 1 1 0 0 2 0 0 2 1 8 7 0 3 13 2 11 0 36 0 0 0 2 0 3 6 1 4 1 0 5 6 0 19 12 374 741 159 2078 524 2869 190 0 21 2 0 12 101 12 158 7 132 14 44 6 52 1584 494 11 22 50 401 340 214 2 10 202 508 686 0 5 6 364 45 882 0 31 14 0 18 19 5 33 1 1322 77 56 83 5 17 13 146 147 28 25 51 7 74 1 3 49 21 280 36 34 127 99 149 64 226 0 0 1 69 180 10 16 1 5 10
151 | 0 1 3 0 0 0 0 1 0 0 0 0 2 1 0 0 0 18 0 15 206 91 1 3 2 97 6 309 190 2 4 3 1 1 3 2 1 1 12 4 66 67 23 0 1 0 17 1 0 15 3 10 6 469 0 0 0 0 0 0 1 8 1 1 0 0 1 2 1 40 4 3 3 0 1 0 0 1 0 12 0 14 0 0 0 1 1 0 0 0 2 3 36 9 19 0 0 0 7 0 5 1 0 0 9 1 0 1 0 0 7 1 0 6 15 7 12 1 1 142 19 1277 0 369 982 8 372 175 206 374 264 2 80 13 60 111 44 59 35 9 184 192 23 38 525 280 1831 37 1 5 0 0 21 1 23 1 4 287 0 0 5 3 26 3 24 190 430 52 0 2 17 29 127 19 55 1490 3 3 15 1 21 106 266 250 2 12 4 18 1 25 0 0 0 2 10 6 14 0 0 0
152 | 0 1 2 0 5 0 5 4 2 5 3 0 16 7 0 3 0 0 0 0 0 0 0 0 1 0 1 0 1 1 3 1 0 0 0 1 9 1 3 1 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 1 3 0 0 0 0 1 0 0 0 0 0 0 16 0 0 0 1 0 0 0 1 2 0 0 2 0 0 0 0 0 0 0 1 0 0 0 0 2 4 0 7 1 9 7 184 17 90 168 6 167 27 5 73 130 12 108 71 21 17 8 4 7 7 0 0 0 0 0 3 0 9 3 1 1 3 1 21 94 1 0 0 37 0 23 9 6 0 4 4 12 4 6 0 0 401 8 69 0 0 1 25 69 17 412 14 7 6 1 33 11 61 31 4 2 4 0 5 2 10 9 0 0 2 96 26 13 701 31 21 14 7 12 0 0 0 3 7 0 4 3 9 10
153 | 3 29 26 4 35 2 38 19 32 17 23 3 86 30 20 36 1 4 2 1 6 1 3 0 20 5 42 12 31 8 48 7 3 0 1 2 87 19 159 121 3 4 1 11 15 3 15 0 0 2 4 6 11 22 1 0 12 13 1 0 0 1 2 7 1 2 3 0 2 0 189 3 1 9 3 7 3 3 2 13 0 4 1 1 0 6 0 0 3 5 8 8 4 5 11 35 52 1 95 7 34 37 248 28 320 452 55 377 60 30 324 630 72 745 549 2137 890 649 717 311 442 85 1 87 41 4 302 442 432 1075 120 131 235 144 117 489 20 45 26 460 154 1235 442 415 18 44 79 171 162 364 21 401 0 277 1349 1 26 104 5 517 114 294 56 22 128 42 445 97 215 176 102 66 53 79 30 88 133 59 7 3 82 804 284 206 748 165 528 102 142 782 2 1 3 287 276 19 33 58 94 53
154 | 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 3 1 1 0 2 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 2 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 2 1 0 1 0 2 1 2 1 8 0 0 0 0 0 1 1 16 2 5 14 2 9 2 0 7 8 2 6 13 20 52 12 25 20 15 1 0 1 2 0 6 2 14 21 4 0 2 2 3 7 2 6 0 2 0 6 3 2 0 13 15 20 14 45 1 8 277 0 687 0 0 13 0 308 24 0 2 1 32 8 4 6 185 15 13 27 16 29 5 8 1 2 1 0 33 4 20 1 4 10 7 10 3 811 0 0 0 87 101 1 8 0 2 2
155 | 0 0 1 0 1 0 0 1 0 2 0 0 1 1 0 3 0 1 0 1 8 2 1 2 4 5 17 19 41 6 2 1 0 4 0 0 5 1 17 26 5 37 1 1 2 0 5 0 2 3 17 18 4 75 0 0 3 0 1 0 0 1 0 0 0 3 5 1 6 7 5 1 3 0 0 0 0 0 0 1 0 3 0 0 0 3 4 0 6 10 38 62 44 38 116 0 1 0 5 0 5 2 14 1 12 21 1 16 1 0 15 15 5 27 36 246 506 1062 491 1224 801 145 0 110 25 11 187 66 294 339 287 37 256 91 35 70 304 280 23 24 106 97 63 47 31 581 806 142 174 882 23 69 1349 687 0 8 21 1462 0 105 56 2 47 0 488 180 167 103 36 28 52 176 113 132 120 536 3 11 45 11 1601 90 309 87 30 120 85 118 38 1249 1 1 7 305 958 160 451 2 14 6
156 | 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 1 0 1 0 0 15 13 2 0 4 5 3 23 11 1 4 0 0 0 0 1 5 0 24 9 19 156 7 1 0 0 4 1 0 6 10 124 3 308 0 0 0 0 0 0 3 0 0 2 0 1 7 1 2 18 2 2 17 0 0 0 0 1 0 1 0 0 0 1 1 7 4 2 14 6 20 47 105 22 419 2 2 0 9 0 0 18 0 0 0 1 69 0 0 0 1 1 21 33 1 1 1 0 0 0 0 0 342 0 1 1092 0 0 2 0 0 128 141 10 0 1 0 0 0 41 0 5 326 6 1 1 7 0 0 0 1 0 1 0 8 0 625 1073 1 0 12 6 2 21 0 0 0 1 0 1 0 2 4 2 0 2 105 184 909 808 545 1 2 0 1 0 1 0 0 9 9 74 29 142 49 1 3 260 1369 812
157 | 0 1 0 0 1 6 0 0 0 0 1 0 7 1 15 39 1 3 12 22 15 24 7 9 19 16 89 57 229 24 0 0 1 8 2 3 54 11 254 244 13 34 2 5 33 5 9 0 0 1 0 10 45 150 1 0 3 3 0 0 1 3 3 2 0 1 2 0 1 3 66 10 6 1 1 9 5 6 0 6 0 2 0 1 1 6 3 1 8 1 3 6 10 8 30 14 57 2 84 0 1 0 0 0 0 1 0 1 0 0 1 0 0 1 2 18 26 1 1 3 5 2 528 0 0 480 0 0 2 7 29 1683 448 37 0 1 2 5 0 9 0 5 1815 47 1 8 22 4 9 31 4 0 26 0 21 625 0 484 0 0 7 0 3 0 25 4 1 3 0 1 5 0 4 5 1 7 0 562 121 211 569 1 6 2 3 9 9 3 2 20 2 21 6 66 60 4 23 0 366 500
158 | 1 3 15 1 2 1 0 11 1 0 3 0 12 8 4 6 2 197 1 145 1645 1117 25 70 25 542 45 1314 826 9 26 17 31 5 2 9 50 10 216 96 521 979 255 10 6 13 112 2 13 207 35 471 110 5147 0 0 2 2 1 1 34 87 6 42 0 9 2 14 13 594 23 6 78 0 6 7 14 5 22 137 4 106 2 0 1 4 16 1 15 13 34 86 672 67 1476 4 8 0 18 1 1 1 0 0 3 0 2 2 0 0 2 5 2 28 24 3 15 6 6 973 18 303 67 12 99 1911 14 9 103 55 479 610 2088 178 16 40 4 5 1 20 7 50 642 187 570 991 1569 39 9 14 287 1 104 13 1462 1073 484 0 0 12 16 1 6 0 25 34 44 33 2 16 8 105 143 31 34 174 19 26 1328 35 2472 43 106 41 3 18 3 17 4 240 0 2 3 117 275 23 131 10 66 95
159 | 0 1 0 0 0 0 0 2 0 1 1 2 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 0 1 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 4 1 0 0 0 0 0 0 0 0 0 0 0 0 0 7 0 0 1 0 0 0 0 0 0 0 0 1 133 0 86 0 11 23 2 1 1 0 1 0 0 1 0 3 0 15 16 87 11 29 63 4 33 16 2 20 23 6 17 8 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 31 6 0 0 0 8 0 4 3 0 0 0 1 1 0 0 0 25 5 0 0 1 0 0 0 149 110 947 27 1 4 1 5 4 0 116 7 0 2 2 0 0 5 1 3 8 0 16 6 1 447 64 3 1 2 11 120 58 14 12 0 0 0 5 9 6
160 | 0 2 5 1 3 0 8 5 6 0 4 0 18 9 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0 0 0 4 0 5 2 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 13 0 0 0 0 1 0 0 1 3 0 0 0 0 0 2 0 0 0 0 1 0 0 0 0 5 3 0 6 0 19 33 312 29 97 218 9 159 37 6 54 102 18 59 40 7 20 2 5 6 0 1 1 1 2 0 5 1 8 13 7 0 4 1 64 57 0 2 0 24 1 23 10 3 2 3 10 14 9 18 0 69 517 308 105 0 0 12 149 0 198 473 25 9 5 1 26 7 152 2077 25 3 2 1 7 14 11 7 0 0 12 52 17 9 453 93 11 7 4 1510 31 0 0 8 7 2 1 1 12 14
161 | 1 3 4 0 0 6 1 0 0 0 2 0 2 2 0 0 0 0 1 0 0 3 0 1 0 2 0 1 2 0 2 1 0 0 0 0 0 0 6 3 0 10 0 1 0 0 9 6 0 7 116 37 10 5 4 11 6 0 51 3 0 0 0 1 5 107 214 35 3 0 18 4 11 30 2 13 1 1 0 2 0 0 21 465 20 2508 18 362 1308 115 262 218 16 39 66 23 42 19 20 1 27 21 284 25 75 153 6 121 28 9 30 70 16 41 32 6 14 2 2 5 4 1 2 0 1 1 8 2 13 8 7 1 13 1 91 60 0 3 0 41 0 16 12 3 2 6 19 19 4 19 5 17 114 24 56 12 7 16 110 198 0 274 42 5 10 2 29 15 14 592 646 198 37 144 116 47 19 2 7 61 179 39 22 11 421 110 7 10 1 365 624 250 23 1686 315 17 19 9 27 28
162 | 4 19 3 6 13 1 27 5 7 7 10 1 44 14 0 1 0 0 0 1 0 0 0 0 1 0 2 0 0 0 22 1 1 0 0 0 7 0 9 5 0 1 0 0 0 1 5 0 0 0 0 0 0 0 0 0 8 0 7 1 0 2 0 2 0 0 2 0 0 0 24 1 0 10 0 5 0 0 0 4 0 1 0 4 0 3 0 0 0 0 3 1 1 0 0 3 11 0 14 6 164 222 366 27 178 219 43 168 42 55 64 148 40 120 68 4 11 0 0 0 1 4 0 1 1 0 0 2 1 1 0 0 0 0 498 88 0 2 0 63 1 33 14 1 1 3 6 13 5 5 3 412 294 0 2 6 0 1 947 473 274 0 267 240 5 0 33 43 105 129 2 1 3 4 11 26 31 14 5 7 7 110 26 16 181 69 8 8 5 13 493 30 7 18 6 4 4 82 39 30
163 | 3 25 13 6 36 7 45 15 25 21 29 3 106 49 6 1 0 1 0 0 1 1 0 0 2 0 1 0 3 0 41 1 7 0 1 0 20 6 16 6 1 2 1 0 1 0 7 0 0 0 0 3 1 8 0 1 2 4 2 0 0 0 0 5 0 0 0 0 0 0 71 0 0 4 0 2 0 1 3 14 0 1 0 0 0 0 0 0 1 0 2 2 7 3 7 13 20 0 28 3 1140 239 1069 73 752 395 50 437 92 33 276 396 63 320 335 18 68 4 3 10 7 26 0 8 14 0 5 3 5 5 1 2 1 0 4205 644 2 8 0 248 6 50 27 4 8 27 78 221 53 33 26 14 56 2 47 2 3 6 27 25 42 267 0 291 97 14 102 1280 14 46 21 42 74 33 96 254 138 11 0 1 23 651 353 130 1186 2734 93 217 41 37 9 1 1 23 55 23 17 30 64 48
164 | 4 21 2 9 10 1 17 4 6 27 5 7 43 8 1 1 0 0 0 0 0 0 0 0 0 0 1 0 2 1 7 0 1 0 0 0 3 0 4 4 0 0 0 1 0 0 1 0 0 0 0 0 0 1 0 0 13 5 1 1 0 0 0 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 5 0 0 2 10 478 590 679 34 294 97 107 176 57 166 92 211 49 143 106 12 12 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 763 64 0 0 0 25 0 1 5 0 0 5 1 2 2 1 3 7 22 1 0 21 0 0 1 9 5 240 291 0 0 0 9 9 1 8 0 0 2 2 2 3 57 27 24 27 13 77 8 5 40 33 14 5 2 1 2 11 5 9 2 2 1 337 123 19
165 | 0 1 0 1 0 0 1 0 1 0 2 0 2 0 3 8 1 1 0 1 1 1 0 2 14 0 47 0 19 3 0 0 0 1 0 0 17 7 81 91 0 1 0 1 7 2 5 0 0 0 4 0 8 2 0 0 1 0 0 0 0 0 0 2 0 0 0 2 0 0 7 2 0 4 0 2 1 2 0 0 0 0 0 0 0 1 1 0 2 0 1 2 2 0 3 0 1 0 15 0 2 2 5 0 5 5 2 7 1 0 6 5 1 14 17 77 344 63 231 247 307 95 1 10 9 0 4 114 6 51 5 40 6 12 30 223 65 340 10 14 90 635 421 295 4 20 350 1512 603 1322 24 6 128 32 488 0 25 25 4 5 10 5 97 0 0 645 170 305 3 17 8 133 212 12 24 125 4 58 1 0 36 28 3143 350 52 284 151 351 104 186 0 0 1 58 142 4 16 0 7 12
166 | 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 5 4 0 0 0 2 2 3 2 0 2 0 0 0 0 0 0 0 3 6 3 6 0 0 0 0 2 0 0 0 2 2 1 11 0 0 1 1 1 0 0 0 0 0 1 1 0 2 0 2 2 0 1 1 0 1 0 0 0 0 0 0 0 0 0 1 1 0 1 1 8 4 7 6 12 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 2 1 1 1 4 12 26 5 6 252 61 475 0 176 116 0 31 119 15 50 8 3 10 0 20 67 228 1806 75 5 829 617 31 74 25 40 563 70 9 77 190 1 42 8 180 0 4 34 1 1 2 0 14 0 645 0 1157 52 0 13 11 91 271 33 90 680 0 0 0 1 33 31 2278 802 11 30 29 49 11 101 0 0 1 12 69 44 31 0 1 2
167 | 0 1 7 0 7 0 9 0 8 2 4 0 21 3 4 2 1 5 0 1 10 8 0 0 3 6 1 27 18 1 36 0 2 0 1 0 13 7 21 11 4 18 0 0 0 0 6 1 0 2 8 7 2 37 0 4 7 3 0 0 0 1 0 0 1 1 2 1 3 1 12 2 0 9 2 4 0 0 1 0 0 2 2 0 0 4 0 0 1 1 6 9 16 4 39 2 2 0 8 3 19 10 6 0 43 8 3 6 1 2 18 19 4 76 1548 342 288 14 4 157 41 722 2 276 475 0 91 70 116 114 37 5 18 2 228 445 301 1169 120 2253 1024 3955 330 104 247 551 381 96 18 56 430 33 445 4 167 0 1 44 5 26 29 33 102 9 170 1157 0 186 12 27 43 82 135 108 293 1983 77 17 1 0 99 2652 1186 2959 92 153 192 76 19 75 2 0 0 24 180 160 259 2 3 57
168 | 1 4 4 3 2 1 5 2 1 2 5 1 25 12 5 5 0 0 0 0 2 0 0 0 2 0 10 2 6 1 24 0 2 0 0 1 17 3 26 16 3 19 1 1 0 0 4 0 0 1 7 5 1 26 0 1 4 3 2 0 0 0 0 0 0 0 0 1 1 0 14 0 0 1 0 1 1 0 0 4 0 0 0 0 0 0 1 0 0 0 10 14 14 7 43 3 3 0 6 2 54 12 54 6 132 56 8 51 7 2 68 78 14 112 87 27 151 5 7 18 3 57 1 12 17 1 2 4 5 11 3 12 1 0 2189 1760 8 11 4 41 7 34 24 12 12 48 301 1079 122 83 52 11 97 6 103 1 3 33 4 7 15 43 1280 9 305 52 186 0 6 24 44 169 307 73 311 790 23 12 0 1 57 643 947 311 124 4597 132 1780 94 113 3 0 1 40 157 57 61 1 15 12
169 | 0 3 3 0 3 0 2 0 3 3 1 1 10 4 0 1 0 0 0 0 0 0 0 0 0 0 1 1 0 0 2 0 0 0 0 0 0 0 3 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 2 2 0 6 2 8 12 172 6 63 111 2 82 32 0 25 45 10 43 22 5 4 0 1 3 1 2 0 0 0 0 1 1 3 2 1 0 1 0 35 33 0 0 0 20 0 14 5 0 0 0 5 9 1 5 0 61 215 185 36 0 0 2 0 152 14 105 14 1 3 0 12 6 0 11 6 0 3 2 4 6 4 1 0 0 2 32 9 1 306 71 5 6 4 193 0 1 0 18 8 0 1 2 7 7
170 | 0 3 2 0 0 0 0 1 1 1 1 0 12 5 0 1 0 0 0 0 0 0 0 0 1 0 0 0 1 0 10 1 0 0 0 0 6 1 9 1 0 3 1 1 0 0 1 0 0 1 40 4 1 2 0 0 0 0 1 0 0 0 0 0 0 7 16 9 2 0 14 1 0 0 0 1 0 0 0 1 0 0 0 3 3 94 0 4 107 5 51 14 0 10 9 2 5 0 8 2 24 31 383 32 103 253 7 153 54 4 51 123 23 62 43 9 16 2 6 7 9 3 0 1 1 0 3 1 5 9 7 1 1 2 90 63 2 3 0 42 0 25 10 0 0 6 27 20 5 17 2 31 176 15 28 1 1 16 116 2077 592 129 46 8 17 13 27 24 11 0 743 738 64 120 82 55 5 3 1 1 14 46 34 11 393 154 9 18 4 399 3 5 1 95 24 7 5 5 8 13
171 | 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 2 0 2 0 2 0 0 0 0 0 1 0 3 1 1 10 0 0 0 1 2 2 1 0 96 12 2 2 0 0 0 0 2 0 0 0 0 2 0 18 56 26 7 0 3 0 2 0 0 1 1 0 0 1 0 0 1 5 5 152 11 17 219 27 145 105 23 44 62 3 6 1 5 1 4 3 25 3 14 25 1 14 5 0 14 9 0 6 6 5 11 2 6 3 0 2 0 3 0 0 7 1 9 5 4 1 6 3 44 62 0 4 0 4 3 13 3 4 2 12 57 42 5 13 17 4 102 13 52 0 5 8 7 25 646 2 21 0 8 11 43 44 6 743 0 1007 145 3037 995 250 0 1 1 3 35 35 50 38 16 22 11 14 0 419 4 5 3 396 158 100 39 0 0 0
172 | 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 1 3 0 0 0 0 3 2 3 0 2 1 0 0 0 0 2 0 4 2 1 19 0 1 0 2 4 2 0 2 186 21 2 7 0 0 0 0 0 0 1 2 1 0 0 13 47 40 7 1 1 0 2 2 0 0 0 0 0 1 1 1 0 2 3 51 28 5 190 27 254 143 42 79 83 2 11 0 4 0 5 2 3 1 12 2 0 2 0 1 8 4 1 4 8 35 105 10 34 30 24 12 0 6 4 1 15 12 22 27 20 4 14 8 87 132 7 33 3 5 19 28 12 4 2 36 348 197 47 146 29 2 66 27 176 2 0 105 0 3 198 1 42 0 133 91 82 169 0 738 1007 0 1601 1502 831 706 1 1 4 2 135 34 458 124 14 81 32 93 23 1570 0 8 6 499 462 54 78 0 2 1
173 | 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 3 1 0 0 1 3 4 4 6 0 0 1 0 0 0 0 1 1 5 4 1 7 0 0 0 0 2 0 0 5 83 18 2 7 0 0 0 0 0 0 0 0 0 0 0 4 24 20 8 2 1 1 1 0 0 1 0 0 0 1 0 1 0 0 0 10 10 2 33 15 114 99 40 69 81 2 8 0 0 0 7 0 6 0 6 1 0 1 0 0 4 1 1 7 8 22 72 7 33 48 12 61 0 13 11 1 17 28 24 30 31 7 10 2 106 245 13 48 4 6 45 40 16 10 14 79 3172 308 43 147 127 4 53 16 113 4 4 143 2 2 37 3 74 2 212 271 135 307 3 64 145 1601 0 238 686 2650 1 3 4 3 164 36 969 217 31 165 40 147 30 1386 0 2 1 164 540 60 107 1 3 4
174 | 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 2 0 1 0 0 1 3 0 0 0 0 3 2 5 0 1 1 2 0 0 0 1 0 1 2 4 28 2 1 0 1 7 2 0 2 228 22 10 18 0 0 0 0 1 0 2 3 1 3 0 23 55 35 18 2 7 0 3 0 1 1 0 0 0 1 0 3 1 2 4 51 31 11 145 37 378 246 61 108 105 5 9 0 3 0 5 1 11 0 20 3 0 3 0 0 9 3 1 9 21 8 30 5 10 6 7 5 2 1 6 1 14 3 35 21 24 4 11 5 98 92 2 5 1 7 14 19 12 2 3 41 99 48 18 28 19 0 79 29 132 2 5 31 2 1 144 4 33 2 12 33 108 73 2 120 3037 1502 238 0 2323 377 4 1 5 3 159 86 101 110 14 43 14 31 4 407 4 0 5 457 2198 583 242 0 2 3
175 | 0 1 2 0 0 1 0 1 0 0 1 0 0 0 0 0 0 1 0 1 5 3 1 1 0 0 0 4 4 0 1 0 1 0 1 1 0 1 8 2 0 34 0 0 0 1 9 2 0 6 137 46 5 19 0 0 0 0 0 0 2 4 0 4 0 9 49 44 26 6 6 0 1 1 0 1 1 2 0 6 0 3 2 1 4 20 31 2 50 41 235 231 101 179 203 3 9 2 6 0 9 5 2 0 30 1 1 1 0 0 6 5 1 12 51 10 23 4 1 4 4 14 0 7 9 1 14 14 20 29 34 3 8 5 235 298 5 6 0 18 36 35 5 6 15 69 328 129 11 25 55 5 30 5 120 0 1 34 0 7 116 11 96 2 24 90 293 311 4 82 995 831 686 2323 0 2197 2 0 3 2 99 146 332 241 28 123 19 73 5 182 0 0 4 166 1735 1722 600 0 3 3
176 | 0 0 3 0 1 1 0 3 0 0 0 0 5 1 0 3 0 11 1 6 71 28 4 6 3 19 6 37 32 2 2 2 4 1 0 1 7 2 39 23 71 266 19 1 0 4 18 1 3 37 84 128 13 404 0 0 1 1 0 0 9 7 0 1 0 8 21 23 26 79 6 1 2 0 1 2 1 1 2 12 0 14 2 0 0 13 14 0 12 19 114 174 383 171 653 2 2 1 5 1 52 8 5 3 97 2 0 8 1 0 51 18 4 42 157 26 71 7 7 71 10 227 0 282 192 3 457 305 182 519 134 6 49 28 710 1006 33 151 41 44 395 544 29 82 855 572 6721 398 23 51 1490 2 88 8 536 2 7 174 0 14 47 26 254 3 125 680 1983 790 6 55 250 706 2650 377 2197 0 12 3 12 4 171 699 1910 1442 63 304 69 243 30 450 0 4 2 63 586 1109 1242 3 4 1
177 | 5 30 26 6 42 4 50 14 29 44 48 12 247 81 10 26 1 0 1 0 0 1 1 0 18 2 16 0 14 1 128 7 14 0 1 1 59 16 118 58 0 1 1 3 2 1 30 0 0 0 1 1 6 4 1 3 26 20 4 3 0 1 0 6 1 5 1 0 0 0 150 7 0 20 0 14 4 7 2 19 1 1 5 0 1 10 1 0 4 1 4 1 3 0 5 42 36 0 53 22 52 102 28 36 120 59 170 60 40 102 48 191 505 1307 367 101 75 0 1 0 0 2 6 0 0 2 0 0 0 0 1 3 0 0 170 47 0 0 0 3895 1 141 205 7 8 38 6 4 4 7 3 10 133 1 3 105 0 19 5 11 19 31 138 57 4 0 77 23 4 5 0 1 1 4 2 12 0 157 23 105 90 142 4 4 74 49 60 18 2 10 7 27 5 52 25 5 14 1045 2568 974
178 | 0 7 7 1 3 8 4 3 3 4 3 1 31 5 16 51 3 6 0 72 12 125 25 5 9 21 25 35 811 16 18 1 7 25 2 1 161 14 532 257 0 5 1 14 34 5 12 1 0 0 1 12 166 19 0 1 13 3 1 2 0 3 0 7 0 3 0 1 2 1 341 75 3 9 2 27 8 12 1 20 2 3 5 1 0 14 2 4 6 6 12 6 1 4 6 68 248 9 326 10 4 48 7 1 6 2 232 3 2 1 5 7 37 55 29 72 100 1 11 3 9 5 261 5 1 457 5 2 8 12 3 421 1183 51 9 11 3 3 0 145 0 41 500 26 0 2 4 11 33 74 3 9 59 2 11 184 562 26 1 7 2 14 11 27 58 0 17 12 1 3 1 1 3 1 0 3 157 0 82 166 43 19 13 2 25 24 32 17 4 6 16 87 25 60 11 0 2 640 1644 749
179 | 1 1 0 0 1 1 0 0 0 0 0 0 5 1 1 1 0 12 0 6 70 474 1 3 0 38 3 110 286 3 2 2 3 0 0 1 1 2 78 22 22 56 11 1 0 1 11 1 1 20 19 275 43 793 0 0 0 0 0 0 0 5 0 4 0 0 2 2 4 24 6 3 53 2 0 0 0 0 0 5 0 6 0 3 0 16 7 1 27 18 32 67 30 21 294 4 5 1 5 0 0 35 0 0 0 0 103 0 0 0 0 0 6 8 4 0 0 0 2 32 1 20 130 3 0 868 1 0 2 0 16 77 296 2 2 1 0 3 0 11 0 2 49 10 2 14 45 2 1 1 15 0 7 1 45 909 121 1328 3 0 7 5 0 24 1 0 1 0 0 1 1 4 4 5 3 12 23 82 0 469 430 2 0 1 1 2 0 3 0 15 46 260 114 271 54 2 7 196 403 65
180 | 0 0 2 0 0 0 0 2 0 0 1 0 3 0 0 1 0 0 0 0 0 7 1 1 0 3 1 6 5 1 2 1 0 0 0 0 4 0 13 3 0 73 1 0 1 1 8 3 0 8 101 119 12 39 0 0 1 0 0 0 0 1 0 1 0 15 44 17 11 2 5 0 7 1 0 3 0 0 4 2 0 7 4 1 0 102 16 10 188 103 290 359 112 96 408 2 11 1 5 3 2 36 0 0 1 0 109 0 0 0 0 3 8 26 1 2 3 0 0 1 0 0 56 0 0 59 1 0 0 1 0 24 11 1 0 0 1 0 0 38 0 2 154 2 0 2 10 1 0 3 1 0 3 0 11 808 211 35 8 0 61 7 1 27 0 1 0 1 0 1 3 2 3 3 2 4 105 166 469 0 1253 2 2 0 2 2 0 1 0 26 58 637 370 1473 136 0 1 394 1358 456
181 | 0 0 9 1 1 4 3 1 1 0 3 0 8 3 2 8 0 29 0 2 37 58 9 11 3 43 6 77 41 3 17 9 7 0 0 2 30 8 145 55 93 820 32 8 0 6 85 11 6 147 350 984 53 974 0 0 3 2 1 0 11 22 5 12 0 39 85 85 92 103 21 8 34 1 4 7 5 6 4 24 1 40 5 13 3 118 89 28 293 245 595 1164 1306 602 3308 9 31 4 23 5 4 21 1 1 17 3 52 4 1 2 14 13 16 79 136 17 71 16 20 38 12 8 113 2 21 188 15 8 108 38 401 196 200 31 37 83 3 12 0 90 12 64 597 69 72 983 1481 66 26 49 21 2 82 33 1601 545 569 2472 0 12 179 7 23 13 36 33 99 57 2 14 35 135 164 159 99 171 90 43 430 1253 0 166 136 62 11 42 22 45 9 811 21 99 158 2893 3730 235 1246 171 675 448
182 | 0 7 4 0 9 0 15 3 6 12 8 0 51 25 7 7 0 0 0 0 11 5 0 0 3 7 5 25 19 0 30 1 2 0 0 0 16 5 20 22 4 9 1 2 1 1 6 1 0 0 3 4 1 33 1 0 5 3 0 1 0 1 0 1 2 0 0 0 1 2 20 1 0 1 0 4 0 3 0 1 0 1 0 0 0 1 0 0 0 1 4 3 8 6 22 5 5 0 17 2 260 86 63 3 1256 48 14 72 8 6 1028 214 11 295 3942 427 651 6 5 29 6 103 6 36 132 1 24 10 70 19 20 7 10 4 2863 5822 19 33 9 1603 13 1213 476 52 131 497 104 272 36 21 106 96 804 4 90 1 1 43 16 52 39 110 651 77 28 31 2652 643 32 46 35 34 36 86 146 699 142 19 2 2 166 0 2079 2409 443 576 721 323 108 58 1 0 0 26 142 104 270 8 11 122
183 | 0 0 0 0 1 0 0 0 0 0 0 0 0 3 0 2 0 1 0 0 9 4 0 1 5 2 0 8 5 1 3 0 0 0 0 0 2 1 9 14 11 26 0 1 1 0 1 0 0 0 7 8 0 45 0 0 0 0 0 0 0 0 1 0 0 2 5 2 3 5 2 0 3 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 11 11 16 7 29 0 1 0 2 1 43 7 12 1 73 10 2 8 1 1 44 28 2 25 75 138 712 18 18 170 37 446 1 83 71 2 14 45 10 26 6 10 7 5 461 3114 93 298 25 17 70 198 63 35 33 99 1320 2504 587 280 266 26 284 20 309 2 6 106 6 17 22 26 353 8 3143 2278 1186 947 9 34 50 458 969 101 332 1910 4 13 0 2 136 2079 0 6067 174 673 486 830 221 303 1 0 1 67 364 63 85 1 2 2
184 | 0 0 0 0 1 0 0 0 0 1 0 0 3 1 0 0 0 1 0 0 1 2 0 1 2 2 0 10 8 0 1 0 0 0 0 0 3 1 3 4 5 23 1 0 0 0 2 0 0 1 2 4 0 29 0 0 2 0 1 0 0 1 0 0 0 1 0 0 2 2 2 1 0 1 0 2 0 0 0 0 0 1 1 0 0 1 0 0 0 0 8 8 21 12 23 0 1 0 4 0 10 4 5 1 28 13 0 2 0 0 9 8 0 13 32 146 281 2 2 97 17 390 0 118 158 0 29 20 42 33 9 1 11 0 222 647 105 251 28 55 59 231 27 12 55 173 423 311 38 36 250 13 206 1 87 0 2 41 1 9 11 16 130 5 350 802 2959 311 1 11 38 124 217 110 241 1442 4 2 1 0 62 2409 6067 0 93 245 220 221 43 66 0 1 0 21 184 99 114 2 2 7
185 | 3 15 14 6 20 0 29 14 19 6 19 2 70 32 1 4 0 1 0 0 0 0 1 0 4 0 8 0 1 1 41 2 4 0 0 0 17 5 42 19 0 0 0 0 0 0 2 0 0 0 0 0 2 1 0 1 10 7 3 0 0 1 0 2 0 1 1 0 0 0 67 1 0 6 0 6 0 1 1 6 0 1 0 1 0 4 0 0 0 0 0 0 2 1 0 5 12 0 33 8 140 154 1243 100 526 960 56 689 176 39 337 654 91 466 319 38 90 1 4 6 4 6 0 0 5 0 2 0 0 2 2 11 0 0 402 422 1 4 2 174 1 84 41 18 1 14 28 157 27 34 2 701 748 4 30 1 3 3 447 453 421 181 1186 40 52 11 92 124 306 393 16 14 31 14 28 63 74 25 1 2 11 443 174 93 0 2733 464 150 51 44 36 1 0 26 35 3 8 16 34 38
186 | 1 7 16 0 14 0 20 6 7 6 7 0 50 21 3 4 1 0 0 0 0 0 0 1 18 0 14 1 8 1 24 2 2 1 1 1 21 7 42 42 4 5 0 0 2 0 5 0 0 0 1 0 3 12 1 0 7 5 2 0 0 0 0 3 0 0 0 1 0 1 35 1 0 1 0 2 0 1 3 2 0 2 2 0 1 1 0 0 0 0 1 1 4 3 6 6 9 0 20 3 122 84 762 74 475 577 33 491 89 23 284 430 48 393 293 40 243 7 10 12 13 39 0 7 4 0 1 2 4 14 1 18 1 7 795 779 4 5 1 113 0 67 72 15 7 26 124 841 174 127 12 31 165 10 120 0 9 18 64 93 110 69 2734 33 284 30 153 4597 71 154 22 81 165 43 123 304 49 24 2 2 42 576 673 245 2733 0 609 2358 203 98 6 0 0 40 118 14 27 14 25 25
187 | 2 7 4 0 13 0 16 5 6 4 10 0 33 15 1 14 1 0 0 0 0 0 0 0 17 0 17 0 9 1 12 2 0 1 0 1 16 8 57 54 2 2 0 3 0 1 6 0 0 0 0 1 1 3 0 0 5 1 0 1 0 0 1 4 0 0 0 0 0 0 54 0 0 0 0 2 1 3 0 5 0 0 2 0 0 0 0 0 0 0 1 0 0 0 1 8 19 1 19 1 25 28 215 20 293 380 23 356 39 21 310 556 46 551 477 932 2529 11 32 11 11 13 0 3 8 0 1 14 2 24 0 16 1 2 124 559 1 5 0 201 11 189 99 29 4 19 34 572 139 99 4 21 528 7 85 1 9 3 3 11 7 8 93 14 151 29 192 132 5 9 11 32 40 14 19 69 60 32 0 0 22 721 486 220 464 609 0 408 1078 55 0 1 0 20 49 7 18 16 21 10
188 | 0 0 0 0 1 1 0 0 0 0 0 0 6 4 1 4 0 0 0 0 0 0 0 0 9 0 6 2 6 2 4 0 1 1 0 0 5 3 36 31 0 6 0 1 1 0 5 0 0 1 1 2 1 13 0 1 2 2 0 0 0 0 0 0 0 0 0 0 0 1 5 1 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 2 3 3 3 1 1 1 0 7 1 10 7 47 2 69 83 4 74 5 4 64 112 10 127 121 55 350 9 10 13 11 23 0 3 3 0 3 3 5 15 5 14 1 0 80 825 8 2 0 26 1 21 32 17 1 17 128 2936 215 149 18 14 102 10 118 0 3 17 1 7 10 8 217 5 351 49 76 1780 6 18 14 93 147 31 73 243 18 17 3 1 45 323 830 221 150 2358 408 0 525 92 0 0 1 41 127 7 22 4 6 4
189 | 0 1 0 0 0 0 4 0 0 0 1 0 3 1 0 0 2 0 0 0 0 0 0 0 3 0 3 0 3 1 3 0 0 0 0 0 3 3 6 10 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 6 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 2 0 1 0 3 3 7 3 21 22 1 23 2 2 30 47 9 56 48 140 1819 2 7 6 2 5 0 0 2 0 0 1 1 2 0 2 0 0 14 132 1 0 0 18 0 25 13 7 2 1 11 1447 160 64 1 7 142 3 38 0 2 4 2 4 1 5 41 2 104 11 19 94 4 4 0 23 30 4 5 30 2 4 0 0 9 108 221 43 51 203 1078 525 0 24 0 0 0 9 25 2 5 0 3 1
190 | 0 2 1 0 2 1 0 2 1 0 1 0 2 1 0 0 0 0 0 1 2 1 0 0 1 2 7 5 8 1 2 2 1 0 0 1 2 1 8 7 1 23 0 1 1 1 2 1 0 2 91 19 2 12 0 0 1 0 0 0 0 1 1 0 0 9 25 15 7 6 11 1 2 0 0 2 0 0 0 0 0 0 3 4 2 50 11 3 106 10 157 109 44 57 113 3 8 0 3 0 13 7 149 14 51 104 6 71 19 1 27 51 10 42 40 60 160 35 103 90 58 16 2 8 8 1 39 12 74 69 59 12 46 14 72 94 22 49 7 15 21 33 21 14 7 102 1540 150 74 226 25 12 782 811 1249 9 20 240 11 1510 365 13 37 1 186 101 75 113 193 399 419 1570 1386 407 182 450 10 6 15 26 811 58 303 66 44 98 55 92 24 0 2 12 11 859 1708 83 216 3 23 26
191 | 1 4 0 0 0 5 0 2 0 0 1 0 0 0 1 0 0 0 0 0 0 2 0 0 0 0 0 0 0 0 2 1 0 0 0 0 2 0 3 0 0 0 0 0 0 0 2 2 0 0 4 7 4 0 0 18 3 1 7 0 0 0 0 0 2 6 9 5 0 0 5 0 4 11 1 0 0 0 0 1 0 0 2 1318 0 800 4 235 317 57 40 47 0 1 6 11 19 2 3 1 2 7 9 2 4 3 9 5 1 0 2 2 1 3 3 2 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 5 1 0 0 0 1 0 2 3 0 0 0 1 0 0 0 0 0 2 0 1 9 2 0 120 31 624 493 9 2 0 0 2 3 0 3 4 0 0 4 0 0 7 16 46 58 21 1 1 0 36 6 0 0 0 2 0 870 173 592 13 0 0 27 58 14
192 | 0 3 1 0 1 1 1 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 1 1 1 1 0 1 0 0 0 0 0 0 0 1 0 1 1 0 1 0 0 0 0 2 5 0 0 28 9 4 1 0 0 1 0 7 0 0 0 0 3 1 32 52 7 1 0 3 0 5 3 0 3 0 0 0 1 0 0 6 48 2 956 1 133 404 60 76 72 0 10 20 8 22 1 5 2 0 15 4 0 1 3 50 0 0 0 0 1 7 8 0 0 0 1 0 0 0 0 7 0 0 1 0 0 0 1 0 3 1 1 1 0 0 0 0 6 0 0 24 0 0 0 0 0 0 0 0 0 1 0 1 74 21 2 58 0 250 30 1 11 0 0 0 0 1 5 5 8 2 0 0 4 27 87 260 637 99 0 0 1 1 0 1 0 0 12 870 0 525 1959 17 1 0 143 356 66
193 | 0 0 0 0 1 3 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 1 0 2 0 0 0 0 2 2 0 1 13 44 1 5 1 0 0 0 1 0 0 0 0 0 0 1 13 5 2 0 6 0 2 0 0 2 0 0 0 0 0 0 0 5 2 46 8 17 108 68 84 186 3 18 51 3 2 3 4 0 0 8 0 0 0 0 30 1 0 0 0 0 3 3 0 1 2 1 0 0 0 0 1 0 0 1 0 0 0 0 1 1 1 1 0 1 0 0 0 3 0 0 8 1 1 0 6 0 0 1 0 0 3 0 7 29 6 3 14 0 23 7 1 5 1 1 0 1 0 1 3 6 1 5 4 2 5 25 114 370 158 0 1 0 0 0 0 1 0 11 173 525 0 769 93 0 1 59 132 20
194 | 0 3 3 2 1 10 1 1 0 0 0 0 2 0 1 2 0 2 0 0 3 17 0 1 2 2 2 5 12 2 2 2 1 1 0 0 4 0 23 5 5 84 3 1 1 5 32 10 0 14 949 188 36 46 1 1 2 3 14 1 6 5 2 3 4 156 444 207 32 1 27 5 21 16 1 19 1 2 1 4 0 6 24 94 29 1820 117 321 3498 457 2177 1372 106 286 417 40 84 13 34 5 6 23 61 5 27 48 69 43 12 3 16 20 12 24 22 41 83 15 25 35 16 5 18 4 7 20 29 4 57 34 42 22 36 7 30 44 6 4 0 18 4 18 82 8 3 21 122 68 38 69 2 3 287 87 305 142 66 117 12 8 1686 18 23 9 58 12 24 40 18 95 396 499 164 457 166 63 52 60 271 1473 2893 26 67 21 26 40 20 41 9 859 592 1959 769 0 6612 100 132 164 445 145
195 | 1 1 5 0 0 2 0 4 0 0 2 0 1 1 1 8 1 5 0 0 4 8 0 1 0 10 1 10 22 2 6 4 4 0 0 1 10 3 44 16 9 145 4 4 2 8 31 10 1 33 626 293 31 97 0 0 0 0 2 1 13 13 1 7 0 63 159 131 90 9 21 2 13 2 2 7 3 6 2 18 0 15 10 7 13 165 130 37 467 309 1254 1376 392 593 1018 11 46 5 19 3 8 7 12 2 27 11 11 10 1 1 24 9 5 36 67 54 189 26 66 59 31 15 8 13 22 25 41 19 136 80 122 30 80 21 103 146 9 18 3 27 24 42 75 22 19 194 1062 222 81 180 10 7 276 101 958 49 60 275 0 7 315 6 55 2 142 69 180 157 8 24 158 462 540 2198 1735 586 25 11 54 136 3730 142 364 184 35 118 49 127 25 1708 13 17 93 6612 0 3757 2670 39 151 113
196 | 0 0 1 0 1 0 0 2 0 0 0 0 1 1 0 2 0 3 0 1 3 4 1 2 1 2 1 7 3 0 2 0 0 0 0 0 1 0 14 1 10 68 0 0 0 1 8 0 0 10 31 40 4 69 0 0 1 1 0 0 3 2 0 2 0 1 8 5 9 13 0 0 3 0 0 2 1 0 0 4 1 2 0 0 1 6 8 1 7 4 30 59 103 54 187 2 2 1 0 0 6 3 3 1 21 0 0 4 0 0 9 7 1 13 48 3 7 1 1 4 1 2 0 5 11 0 17 13 17 33 33 2 7 6 88 91 2 3 1 15 5 22 8 2 24 277 176 18 2 10 6 0 19 1 160 1 4 23 0 2 17 4 23 2 4 44 160 57 0 7 100 54 60 583 1722 1109 5 0 2 0 235 104 63 99 3 14 7 7 2 83 0 1 0 100 3757 0 4606 0 0 2
197 | 0 0 2 0 0 0 2 3 0 0 1 0 3 1 0 2 0 6 0 1 13 5 1 2 1 12 2 7 7 1 4 1 3 0 0 0 5 3 17 7 12 129 2 0 0 3 15 4 3 17 29 110 6 120 0 0 0 0 0 0 2 6 1 2 0 1 14 17 15 17 3 0 7 0 0 1 2 2 1 7 1 6 0 0 0 5 8 0 13 15 48 105 235 96 465 2 2 0 3 0 7 0 1 0 25 1 1 2 1 0 34 18 6 65 187 13 18 5 5 24 2 3 1 3 26 0 18 7 79 42 302 22 80 16 69 96 1 8 1 56 8 43 58 22 124 866 763 35 7 16 14 4 33 8 451 3 23 131 0 1 19 4 17 1 16 31 259 61 1 5 39 78 107 242 600 1242 14 2 7 1 1246 270 85 114 8 27 18 22 5 216 0 0 1 132 2670 4606 0 2 2 8
198 | 14 44 40 34 75 17 157 39 58 240 88 61 838 218 28 28 0 0 1 0 1 0 0 0 4 0 11 0 3 4 88 4 30 0 0 0 53 6 25 41 0 1 0 3 3 0 25 0 0 0 1 1 4 0 0 5 33 22 4 4 0 0 1 7 0 1 0 2 1 0 231 6 0 11 2 19 2 3 4 64 0 6 4 2 1 13 0 1 9 3 9 8 4 2 7 139 52 0 15 20 19 705 50 59 26 70 1184 26 97 412 12 231 384 693 44 55 31 0 2 0 0 0 6 0 0 1 0 0 1 0 0 1 0 0 30 7 0 0 0 93 0 5 23 1 0 1 2 0 0 1 0 3 58 0 2 260 0 10 5 1 9 82 30 337 0 0 2 1 2 5 0 0 1 0 0 3 1045 640 196 394 171 8 1 2 16 14 16 4 0 3 27 143 59 164 39 0 2 0 1492 169
199 | 12 38 74 12 60 6 94 30 49 37 74 17 332 145 49 57 4 0 6 1 1 0 0 2 24 0 55 0 48 10 249 12 32 0 1 1 130 32 221 129 0 12 0 10 4 5 64 2 0 2 8 6 15 5 2 8 60 44 9 4 0 2 0 15 0 6 4 5 0 1 337 12 0 30 0 34 5 11 4 45 0 6 5 1 0 39 2 2 30 7 16 19 7 10 34 77 109 2 173 33 12 181 13 4 20 22 284 14 2 10 6 25 336 423 9 50 23 1 0 0 0 0 131 1 0 17 0 1 0 0 0 33 22 5 33 16 0 0 0 268 0 17 404 7 0 2 12 3 5 5 0 9 94 2 14 1369 366 66 9 12 27 39 64 123 7 1 3 15 7 8 0 2 3 2 3 4 2568 1644 403 1358 675 11 2 2 34 25 21 6 3 23 58 356 132 445 151 0 2 1492 0 3040
200 | 9 33 63 6 46 5 64 27 37 16 41 4 141 70 53 64 5 3 3 2 0 1 2 1 37 0 70 2 71 15 191 9 20 1 0 4 198 49 421 218 0 10 0 11 8 2 63 1 0 0 8 6 26 8 1 3 50 36 11 8 1 3 1 25 9 8 5 3 4 0 325 17 1 41 1 47 6 17 2 28 0 7 8 0 3 33 4 1 13 5 17 15 8 6 23 46 102 1 243 16 13 53 5 0 16 8 78 2 1 5 21 18 67 137 181 37 23 0 2 1 2 0 183 1 1 32 0 0 1 1 1 3 3 1 44 21 0 2 0 800 0 102 1260 6 3 3 12 3 5 10 0 10 53 2 6 812 500 95 6 14 28 30 48 19 12 2 57 12 7 13 0 1 4 3 3 1 974 749 65 456 448 122 2 7 38 25 10 4 1 26 14 66 20 145 113 2 8 169 3040 0
201 |
--------------------------------------------------------------------------------
/data/label_ts_corrected:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/data/label_ts_corrected
--------------------------------------------------------------------------------
/data/leadfield:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/data/leadfield
--------------------------------------------------------------------------------
/data/network_colour.xlsx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/data/network_colour.xlsx
--------------------------------------------------------------------------------
/data/only_high_trial.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/data/only_high_trial.mat
--------------------------------------------------------------------------------
/data/real_EEG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/data/real_EEG
--------------------------------------------------------------------------------
/img/Figure_1.PNG.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/img/Figure_1.PNG.png
--------------------------------------------------------------------------------
/img/Figure_2.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/img/Figure_2.PNG
--------------------------------------------------------------------------------
/img/Figure_3.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/img/Figure_3.PNG
--------------------------------------------------------------------------------
/img/Figure_4.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/img/Figure_4.PNG
--------------------------------------------------------------------------------
/img/Figure_6.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/img/Figure_6.PNG
--------------------------------------------------------------------------------
/img/Figure_7.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/img/Figure_7.PNG
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | matplotlib
2 | mne
3 | nibabel
4 | nilearn
5 | numpy
6 | pandas
7 | scipy
8 | seaborn
9 | torch
10 | scikit-learn
11 |
--------------------------------------------------------------------------------
/tepfit/__init__.py:
--------------------------------------------------------------------------------
1 | # init for tepfit
2 |
--------------------------------------------------------------------------------
/tepfit/__pycache__/__init__.cpython-37.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/tepfit/__pycache__/__init__.cpython-37.pyc
--------------------------------------------------------------------------------
/tepfit/__pycache__/viz.cpython-37.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/GriffithsLab/PyTepFit/66b94e488c82478e6f302015b6cd8e8d9d33792a/tepfit/__pycache__/viz.cpython-37.pyc
--------------------------------------------------------------------------------
/tepfit/data.py:
--------------------------------------------------------------------------------
1 | # Data functions
2 |
--------------------------------------------------------------------------------
/tepfit/fit.py:
--------------------------------------------------------------------------------
1 | # Pytorch stuff
2 |
3 |
4 | """
5 | Importage
6 | """
7 |
8 | #from Model_pytorch import wwd_model_pytorch_new
9 | import matplotlib.pyplot as plt # for plotting
10 | import numpy as np # for numerical operations
11 | import pandas as pd # for data manipulation
12 | import seaborn as sns # for plotting
13 | import time # for timer
14 | import torch
15 | import torch.optim as optim
16 |
17 | from torch.nn.parameter import Parameter
18 |
19 | from sklearn.metrics.pairwise import cosine_similarity
20 | import pickle
21 |
31 | class OutputNM():
32 | mode_all = ['train', 'test']
33 | stat_vars_all = ['m', 'v']
34 |
35 | def __init__(self, model_name, node_size, param, fit_weights=False, fit_lfm=False):
36 | self.loss = np.array([])
37 | if model_name == 'WWD':
38 | state_names = ['E', 'I', 'x', 'f', 'v', 'q']
39 | self.output_name = "bold"
40 | elif model_name == "JR":
41 | state_names = ['E', 'Ev', 'I', 'Iv', 'P', 'Pv']
42 | self.output_name = "eeg"
43 | for name in state_names + [self.output_name]:
44 | for m in self.mode_all:
45 | setattr(self, name + '_' + m, [])
46 |
47 | vars = [a for a in dir(param) if not a.startswith('__') and not callable(getattr(param, a))]
48 | for var in vars:
49 | if np.any(getattr(param, var)[1] > 0):
50 | if var != 'std_in':
51 | setattr(self, var, np.array([]))
52 | for stat_var in self.stat_vars_all:
53 | setattr(self, var + '_' + stat_var, [])
54 | else:
55 | setattr(self, var, [])
56 |
57 | self.weights = []
58 |         if model_name == 'JR' and fit_lfm:
59 | self.leadfield = []
60 |
61 | def save(self, filename):
62 | with open(filename, 'wb') as f:
63 | pickle.dump(self, f)
64 |
65 |
66 | class ParamsJR():
67 |
68 | def __init__(self, model_name, **kwargs):
69 | if model_name == 'WWD':
70 | param = {
71 |
72 | "std_in": [0.02, 0], # standard deviation of the Gaussian noise
73 | "std_out": [0.02, 0], # standard deviation of the Gaussian noise
74 | # Parameters for the ODEs
75 | # Excitatory population
76 | "W_E": [1., 0], # scale of the external input
77 | "tau_E": [100., 0], # decay time
78 | "gamma_E": [0.641 / 1000., 0], # other dynamic parameter (?)
79 |
80 | # Inhibitory population
81 | "W_I": [0.7, 0], # scale of the external input
82 | "tau_I": [10., 0], # decay time
83 | "gamma_I": [1. / 1000., 0], # other dynamic parameter (?)
84 |
85 | # External input
86 | "I_0": [0.32, 0], # external input
87 | "I_external": [0., 0], # external stimulation
88 |
89 | # Coupling parameters
90 | "g": [20., 0], # global coupling (from all nodes E_j to single node E_i)
91 | "g_EE": [.1, 0], # local self excitatory feedback (from E_i to E_i)
92 | "g_IE": [.1, 0], # local inhibitory coupling (from I_i to E_i)
93 | "g_EI": [0.1, 0], # local excitatory coupling (from E_i to I_i)
94 |
95 | "aE": [310, 0],
96 | "bE": [125, 0],
97 | "dE": [0.16, 0],
98 | "aI": [615, 0],
99 | "bI": [177, 0],
100 | "dI": [0.087, 0],
101 |
102 | # Output (BOLD signal)
103 |
104 | "alpha": [0.32, 0],
105 | "rho": [0.34, 0],
106 | "k1": [2.38, 0],
107 | "k2": [2.0, 0],
108 |                 "k3": [0.48, 0],  # adjust this value (default 0.48) so the BOLD signal fluctuates around zero
109 | "V": [.02, 0],
110 | "E0": [0.34, 0],
111 | "tau_s": [0.65, 0],
112 | "tau_f": [0.41, 0],
113 | "tau_0": [0.98, 0],
114 | "mu": [0.5, 0]
115 |
116 | }
117 | elif model_name == "JR":
118 | param = {
119 |                 "A": [3.25, 0], "a": [100, 0.], "B": [22, 0], "b": [50, 0], "g": [1000, 0], \
120 |                 "c1": [135, 0.], "c2": [135 * 0.8, 0.], "c3": [135 * 0.25, 0.], "c4": [135 * 0.25, 0.], \
121 | "std_in": [100, 0], "vmax": [5, 0], "v0": [6, 0], "r": [0.56, 0], "y0": [2, 0], \
122 | "mu": [.5, 0], "k": [5, 0], "cy0": [5, 0], "ki": [1, 0]
123 | }
124 | for var in param:
125 | setattr(self, var, param[var])
126 |
127 | for var in kwargs:
128 | setattr(self, var, kwargs[var])
129 | """self.A = A # magnitude of second order system for populations E and P
130 | self.a = a # decay rate of the 2nd order system for population E and P
131 | self.B = B # magnitude of second order system for population I
132 | self.b = b # decay rate of the 2nd order system for population I
133 | self.g= g # global gain
134 | self.c1= c1# local gain from P to E (pre)
135 | self.c2= c2 # local gain from P to E (post)
136 | self.c3= c3 # local gain from P to I
137 | self.c4= c4 # local gain from P to I
138 | self.mu = mu
139 | self.y0 = y0
140 |         self.std_in = std_in  # standard deviation of the input noise
141 | self.cy0 = cy0
142 | self.vmax = vmax
143 | self.v0 = v0
144 | self.r = r
145 | self.k = k"""
146 |
147 |
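Each value in the `ParamsJR` dictionaries above is a two-element list: index 0 is the prior mean and a positive index 1 marks the parameter as fittable (this is the `getattr(param, var)[1] > 0` test used by `OutputNM` and `RNNJANSEN`). A minimal standalone sketch of that convention (`ParamsSketch` is a hypothetical stand-in, not the class above):

```python
# Hypothetical stand-in for ParamsJR, illustrating the [mean, fit-flag] convention.
class ParamsSketch:
    def __init__(self, **kwargs):
        defaults = {"g": [1000, 0], "c1": [135, 0.0]}  # mirrors two JR defaults
        for name, pair in {**defaults, **kwargs}.items():
            setattr(self, name, pair)  # each attribute is [prior mean, variance flag]

p = ParamsSketch(g=[500, 1.0])  # override g and mark it as a fitted parameter
fittable = [k for k in ("g", "c1") if getattr(p, k)[1] > 0]
print(p.g, fittable)  # [500, 1.0] ['g']
```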
148 | def sys2nd(A, a, u, x, v):
149 |     return A * a * u - 2 * a * v - a ** 2 * x
150 |
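`sys2nd` above is the right-hand side of the damped second-order system x'' = A·a·u − 2a·x' − a²·x that drives each Jansen-Rit population; with constant input u the state relaxes to the steady value A·u/a. A standalone scalar Euler sketch (the step size and drive here are illustrative, not the model's settings):

```python
# Scalar restatement of sys2nd above, Euler-integrated with constant drive.
def sys2nd(A, a, u, x, v):
    return A * a * u - 2 * a * v - a ** 2 * x

A, a, u = 3.25, 100.0, 1.0   # JR excitatory magnitude/decay, constant input
x, v, dt = 0.0, 0.0, 1e-4    # state, first derivative, 0.1 ms step
for _ in range(2000):        # 200 ms of simulated time
    dv = sys2nd(A, a, u, x, v)
    x, v = x + dt * v, v + dt * dv
print(round(x, 4))  # converges to the steady state A*u/a = 0.0325
```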
151 | def sigmoid(x, vmax, v0, r):
152 | return vmax/(1+torch.exp(r*(v0-x)))
153 |
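The sigmoid above is the standard Jansen-Rit potential-to-rate function: at x = v0 it returns exactly vmax/2, and it saturates toward 0 and vmax. A quick standalone check using `math.exp` in place of `torch.exp`, with the default JR parameter values from `ParamsJR`:

```python
import math

# Scalar version of the torch sigmoid defined above
def sigmoid(x, vmax, v0, r):
    return vmax / (1 + math.exp(r * (v0 - x)))

print(sigmoid(6.0, vmax=5.0, v0=6.0, r=0.56))   # 2.5 (= vmax/2 at x = v0)
print(sigmoid(20.0, vmax=5.0, v0=6.0, r=0.56))  # near the vmax = 5 ceiling
```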
154 |
155 | class RNNJANSEN(torch.nn.Module):
156 | """
157 | A module for forward model (JansenRit) to simulate a batch of EEG signals
158 |     Attributes
159 | ---------
160 | state_size : int
161 | the number of states in the JansenRit model
162 | input_size : int
163 | the number of states with noise as input
164 | tr : float
165 | tr of image
166 | step_size: float
167 | Integration step for forward model
168 | hidden_size: int
169 | the number of step_size in a tr
170 | batch_size: int
171 | the number of EEG signals to simulate
172 | node_size: int
173 | the number of ROIs
174 | sc: float node_size x node_size array
175 | structural connectivity
176 | fit_gains: bool
177 | flag for fitting gains 1: fit 0: not fit
178 | g, c1, c2, c3,c4: tensor with gradient on
179 | model parameters to be fit
180 | w_bb: tensor with node_size x node_size (grad on depends on fit_gains)
181 | connection gains
182 | std_in std_out: tensor with gradient on
183 | std for state noise and output noise
184 | hyper parameters for prior distribution of model parameters
185 | Methods
186 | -------
187 | forward(input, noise_out, hx)
188 | forward model (JansenRit) for generating a number of EEG signals with current model parameters
189 | """
190 | state_names = ['E', 'Ev', 'I', 'Iv', 'P', 'Pv']
191 | model_name = "JR"
192 |
193 | def __init__(self, input_size: int, node_size: int,
194 | batch_size: int, step_size: float, output_size: int, tr: float, sc: float, lm: float, dist: float,
195 | fit_gains_flat: bool, \
196 | fit_lfm_flat: bool, param: ParamsJR) -> None:
197 | """
198 | Parameters
199 | ----------
200 | state_size : int
201 | the number of states in the JansenRit model
202 | input_size : int
203 | the number of states with noise as input
204 | tr : float
205 | tr of image
206 | step_size: float
207 | Integration step for forward model
208 | hidden_size: int
209 | the number of step_size in a tr
210 | batch_size: int
211 | the number of EEG signals to simulate
212 | node_size: int
213 | the number of ROIs
214 | output_size: int
215 | the number of channels EEG
216 | sc: float node_size x node_size array
217 | structural connectivity
218 | fit_gains: bool
219 | flag for fitting gains 1: fit 0: not fit
220 | param from ParamJR
221 | """
222 | super(RNNJANSEN, self).__init__()
223 |         self.state_size = 6  # 6 states in the Jansen-Rit model
224 | self.input_size = input_size # 1 or 2 or 3
225 | self.tr = tr # tr ms (integration step 0.1 ms)
226 | self.step_size = torch.tensor(step_size, dtype=torch.float32) # integration step 0.1 ms
227 |         self.hidden_size = int(tr / step_size)  # np.int is removed in recent NumPy
228 | self.batch_size = batch_size # size of the batch used at each step
229 | self.node_size = node_size # num of ROI
230 | self.output_size = output_size # num of EEG channels
231 | self.sc = sc # matrix node_size x node_size structure connectivity
232 | self.dist = torch.tensor(dist, dtype=torch.float32)
233 | self.fit_gains_flat = fit_gains_flat # flag for fitting gains
234 | self.fit_lfm_flat = fit_lfm_flat
235 | self.param = param
236 |
237 | self.output_size = lm.shape[0] # number of EEG channels
238 |
239 | # set model parameters (variables: need to calculate gradient) as Parameter others : tensor
240 | # set w_bb as Parameter if fit_gain is True
241 | if self.fit_gains_flat == True:
242 | self.w_bb = Parameter(torch.tensor(np.zeros((node_size, node_size)) + 0.05,
243 |                                dtype=torch.float32)) # connection gains to modify the empirical sc
244 | else:
245 | self.w_bb = torch.tensor(np.zeros((node_size, node_size)), dtype=torch.float32)
246 |
247 | if self.fit_lfm_flat == True:
248 | self.lm = Parameter(torch.tensor(lm, dtype=torch.float32)) # leadfield matrix from sourced data to eeg
249 | else:
250 | self.lm = torch.tensor(lm, dtype=torch.float32) # leadfield matrix from sourced data to eeg
251 |
252 | vars = [a for a in dir(param) if not a.startswith('__') and not callable(getattr(param, a))]
253 | for var in vars:
254 | if np.any(getattr(param, var)[1] > 0):
255 | setattr(self, var, Parameter(
256 | torch.tensor(getattr(param, var)[0] + 1 / getattr(param, var)[1] * np.random.randn(1, )[0],
257 | dtype=torch.float32)))
258 | if var != 'std_in':
259 | dict_nv = {}
260 | dict_nv['m'] = getattr(param, var)[0]
261 | dict_nv['v'] = getattr(param, var)[1]
262 |
263 | dict_np = {}
264 | dict_np['m'] = var + '_m'
265 | dict_np['v'] = var + '_v'
266 |
267 | for key in dict_nv:
268 | setattr(self, dict_np[key], Parameter(torch.tensor(dict_nv[key], dtype=torch.float32)))
269 | else:
270 | setattr(self, var, torch.tensor(getattr(param, var)[0], dtype=torch.float32))
271 |
272 | """def check_input(self, input: Tensor) -> None:
273 | expected_input_dim = 2
274 | if input.dim() != expected_input_dim:
275 | raise RuntimeError(
276 | 'input must have {} dimensions, got {}'.format(
277 | expected_input_dim, input.dim()))
278 | if self.input_size != input.size(-1):
279 | raise RuntimeError(
280 | 'input.size(-1) must be equal to input_size. Expected {}, got {}'.format(
281 | self.input_size, input.size(-1)))
282 | if self.batch_size != input.size(0):
283 | raise RuntimeError(
284 | 'input.size(0) must be equal to batch_size. Expected {}, got {}'.format(
285 | self.batch_size, input.size(0)))"""
286 |
287 | def forward(self, input, noise_in, noise_out, hx, hE):
288 | """
289 | Forward step in simulating the EEG signal.
290 | Parameters
291 | ----------
292 | input: tensor with node_size x hidden_size x batch_size x input_size; external input to the states
293 | noise_in: tensor with node_size x hidden_size x batch_size x input_size; noise for the states
294 | noise_out: tensor with node_size x batch_size; noise for the EEG
295 | hx: tensor with node_size x state_size; current states of the JansenRit model
296 | hE: tensor with node_size x delays_max; history of E for the delayed connections
297 | Outputs
298 | -------
299 | next_state: dictionary with keys
300 | 'current_state', 'eeg_batch', 'E_batch', 'I_batch', 'P_batch', 'Ev_batch', 'Iv_batch', 'Pv_batch'
301 | recording the new states and the EEG
302 | hE: updated history buffer for E
303 | """
304 |
305 | # define some constants
306 | conduct_lb = 1.5 # lower bound for conduction velocity
307 | u_2ndsys_ub = 500 # the bound of the input for second order system
308 | noise_std_lb = 150 # lower bound of std of noise
309 | lb = 0.01 # lower bound of local gains
310 | s2o_coef = 0.0001 # coefficient from states (source EEG) to EEG
311 | k_lb = 0.5 # lower bound of coefficient of external inputs
312 |
313 | next_state = {}
314 |
315 | M = hx[:, 0:1] # current of main population
316 | E = hx[:, 1:2] # current of excitatory population
317 | I = hx[:, 2:3] # current of inhibitory population
318 |
319 | Mv = hx[:, 3:4] # voltage of main population
320 | Ev = hx[:, 4:5] # voltage of excitatory population
321 | Iv = hx[:, 5:6] # voltage of inhibitory population
322 |
323 | dt = self.step_size
324 | # Generate the ReLU module used to keep fitted model parameters non-negative
325 |
326 | m = torch.nn.ReLU()
327 |
328 | # define constant 1 tensor
329 | con_1 = torch.tensor(1.0, dtype=torch.float32)
330 | if self.sc.shape[0] > 1:
331 |
332 | # Update the Laplacian based on the updated connection gains w_bb.
333 | w = torch.exp(self.w_bb) * torch.tensor(self.sc, dtype=torch.float32) #5*(con_1+torch.tanh(self.w_bb)) * torch.tensor(self.sc, dtype=torch.float32)
334 | #w = m(self.w_bb)
335 | #w_n = 0.5 * (w + torch.transpose(w, 0, 1)) / torch.linalg.norm(0.5 * (w + torch.transpose(w, 0, 1)))
336 | w_n = torch.log1p(0.5 * (w + torch.transpose(w, 0, 1))) / torch.linalg.norm(
337 | torch.log1p(0.5 * (w + torch.transpose(w, 0, 1))))
338 |
339 | self.sc_m = w_n
340 | dg = -torch.diag(torch.sum(w_n, axis=1))
341 | else:
342 | w_n = dg = torch.tensor(np.zeros((1, 1)), dtype=torch.float32) # single-node case: no network coupling
343 |
344 | self.delays = (self.dist / (conduct_lb * con_1 + m(self.mu))).type(torch.int64)
345 | # print(torch.max(self.delays), self.delays.shape)
346 |
347 | # placeholder for the updated current state
348 | current_state = torch.zeros_like(hx)
349 |
350 | # placeholders for the output EEG and the history of the states M, E, I and their derivatives
351 | eeg_batch = []
352 | E_batch = []
353 | I_batch = []
354 | M_batch = []
355 | Ev_batch = []
356 | Iv_batch = []
357 | Mv_batch = []
358 |
359 | # Use the forward model to get the EEG signal at the ith element in the batch.
360 | for i_batch in range(self.batch_size):
361 | # Get the noise for EEG output.
362 | noiseEEG = noise_out[:, i_batch:i_batch + 1]
363 |
364 | for i_hidden in range(self.hidden_size):
365 | Ed = torch.tensor(np.zeros((self.node_size, self.node_size)), dtype=torch.float32) # delayed E
366 |
367 | """for ind in range(self.node_size):
368 | #print(ind, hE[ind,:].shape, self.delays[ind,:].shape)
369 | Ed[ind] = torch.index_select(hE[ind,:], 0, self.delays[ind,:])"""
370 | hE_new = hE.clone()
371 | Ed = hE_new.gather(1, self.delays) # delayed E
372 |
373 | LEd = torch.reshape(torch.sum(w_n * torch.transpose(Ed, 0, 1), 1),
374 | (self.node_size, 1)) # weights on delayed E
375 |
376 | # Input noise for M.
377 | noiseE = noise_in[:, i_hidden, i_batch, 0:1]
378 | noiseI = noise_in[:, i_hidden, i_batch, 1:2]
379 | noiseM = noise_in[:, i_hidden, i_batch, 2:3]
380 | u = input[:, i_hidden:i_hidden + 1, i_batch]
381 |
382 | # LEd+torch.matmul(dg,E): Laplacian on delayed E
383 |
384 | rM = sigmoid(E - I, self.vmax, self.v0, self.r) # firing rate for Main population
385 | rE = (noise_std_lb * con_1 + m(self.std_in)) * noiseE + (lb * con_1 + m(self.g)) * (
386 | LEd + 1 * torch.matmul(dg, E)) \
387 | + (lb * con_1 + m(self.c2)) * sigmoid((lb * con_1 + m(self.c1)) * M, self.vmax, self.v0,
388 |                                                              self.r) # firing rate for Excitatory population
389 | rI = (lb * con_1 + m(self.c4)) * sigmoid((lb * con_1 + m(self.c3)) * M, self.vmax, self.v0,
390 | self.r) # firing rate for Inhibitory population
391 |
392 | # Update the states by step-size.
393 | ddM = M + dt * Mv
394 | ddE = E + dt * Ev
395 | ddI = I + dt * Iv
396 | ddMv = Mv + dt * sys2nd(0 * con_1 + m(self.A), 1 * con_1 + m(self.a),
397 | + u_2ndsys_ub * torch.tanh(rM / u_2ndsys_ub), M, Mv) \
398 | + 0 * torch.sqrt(dt) * (1.0 * con_1 + m(self.std_in)) * noiseM
399 |
400 | ddEv = Ev + dt * sys2nd(0 * con_1 + m(self.A), 1 * con_1 + m(self.a), \
401 | (k_lb * con_1 + m(self.k)) *self.ki* u \
402 | + u_2ndsys_ub * torch.tanh(rE / u_2ndsys_ub), E,
403 | Ev) # (0.001*con_1+m_kw(self.kw))/torch.sum(0.001*con_1+m_kw(self.kw))*
404 |
405 | ddIv = Iv + dt * sys2nd(0 * con_1 + m(self.B), 1 * con_1 + m(self.b), \
406 | +u_2ndsys_ub * torch.tanh(rI / u_2ndsys_ub), I, Iv) + 0 * torch.sqrt(dt) * (
407 | 1.0 * con_1 + m(self.std_in)) * noiseI
408 |
409 | # Calculate the saturation for model states (for stability and gradient calculation).
410 | E = ddE # 1000*torch.tanh(ddE/1000)#torch.tanh(0.00001+torch.nn.functional.relu(ddE))
411 | I = ddI # 1000*torch.tanh(ddI/1000)#torch.tanh(0.00001+torch.nn.functional.relu(ddI))
412 | M = ddM # 1000*torch.tanh(ddM/1000)
413 | Ev = ddEv # 1000*torch.tanh(ddEv/1000)#(con_1 + torch.tanh(df - con_1))
414 | Iv = ddIv # 1000*torch.tanh(ddIv/1000)#(con_1 + torch.tanh(dv - con_1))
415 | Mv = ddMv # 1000*torch.tanh(ddMv/1000)#(con_1 + torch.tanh(dq - con_1))
416 |
417 | # update placeholders for E buffer
418 | hE[:, 0] = E[:, 0]
419 |
420 | # Put M E I Mv Ev and Iv at every tr to the placeholders for checking them visually.
421 | M_batch.append(M)
422 | I_batch.append(I)
423 | E_batch.append(E)
424 | Mv_batch.append(Mv)
425 | Iv_batch.append(Iv)
426 | Ev_batch.append(Ev)
427 | hE = torch.cat([E, hE[:, :-1]], axis=1) # update placeholders for E buffer
428 |
429 | # Put the EEG signal each tr to the placeholder being used in the cost calculation.
430 | lm_t = (self.lm - 1 / self.output_size * torch.matmul(torch.ones((1, self.output_size)), self.lm))
431 | self.lm_t = lm_t
432 | temp = s2o_coef * self.cy0 * torch.matmul(lm_t, E - I) - 1 * self.y0
433 | eeg_batch.append(temp) # torch.abs(E) - torch.abs(I) + 0.0*noiseEEG)
434 |
435 | # Update the current state.
436 | current_state = torch.cat([M, E, I, Mv, Ev, Iv], axis=1)
437 | next_state['current_state'] = current_state
438 | next_state['eeg_batch'] = torch.cat(eeg_batch, axis=1)
439 | next_state['E_batch'] = torch.cat(E_batch, axis=1)
440 | next_state['I_batch'] = torch.cat(I_batch, axis=1)
441 | next_state['P_batch'] = torch.cat(M_batch, axis=1)
442 | next_state['Ev_batch'] = torch.cat(Ev_batch, axis=1)
443 | next_state['Iv_batch'] = torch.cat(Iv_batch, axis=1)
444 | next_state['Pv_batch'] = torch.cat(Mv_batch, axis=1)
445 |
446 | return next_state, hE
447 |
448 |
449 |
450 |
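The forward pass above calls two helpers, `sigmoid` and `sys2nd`, that are defined elsewhere in `fit.py`. A minimal NumPy sketch of what they are assumed to compute (the standard Jansen-Rit potential-to-rate sigmoid, and the acceleration term of the second-order synaptic response):

```python
import numpy as np

def sigmoid(x, vmax, v0, r):
    # Jansen-Rit sigmoid: maps average membrane potential x (mV)
    # to an average firing rate, saturating at vmax; v0 is the midpoint.
    return vmax / (1.0 + np.exp(r * (v0 - x)))

def sys2nd(A, a, u, x, v):
    # Second-order synaptic dynamics: x'' = A*a*u - 2*a*x' - a**2*x,
    # with x' = v; this is the term multiplied by dt in the Euler updates above.
    return A * a * u - 2.0 * a * v - a ** 2 * x
```

At the midpoint x = v0 the sigmoid returns vmax/2, so with vmax = 5 and v0 = 6 a membrane potential of 6 mV yields a rate of 2.5.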
451 | class Costs:
452 | def __init__(self, method):
453 | self.method = method
454 |
455 | def cost_dist(self, sim, emp):
456 | """
457 | Calculate the root-mean-squared error between the simulated
458 | and the empirical EEG time-series.
459 | Parameters
460 | ----------
461 | sim: tensor with node_size X datapoint
462 | simulated EEG
463 | emp: tensor with node_size X datapoint
464 | empirical EEG
465 | """
466 |
467 | losses = torch.sqrt(torch.mean((sim - emp) ** 2)) #
468 | return losses
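`cost_dist` is a plain root-mean-squared error between the two time-series. A self-contained illustration of the same formula (in NumPy rather than torch, with toy two-sample signals):

```python
import numpy as np

sim = np.array([0.0, 0.0])  # toy "simulated" signal
emp = np.array([3.0, 4.0])  # toy "empirical" signal

# same formula as cost_dist: sqrt(mean((sim - emp)**2))
rmse = np.sqrt(np.mean((sim - emp) ** 2))  # sqrt((9 + 16)/2)
```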
469 |
470 | def cost_beamform(self, model, emp):
471 | corr = torch.matmul(emp, emp.T)
472 | corr_inv = torch.inverse(corr)
473 | corr_inv_s = torch.inverse(torch.matmul(model.lm.T, torch.matmul(corr_inv, model.lm)))
474 | W = torch.matmul(corr_inv_s, torch.matmul(model.lm.T, corr_inv))
475 | return torch.trace(torch.matmul(W, torch.matmul(corr, W.T)))
476 |
477 | def cost_r(self, logits_series_tf, labels_series_tf):
478 | """
479 | Calculate the Pearson Correlation between the simFC and empFC.
480 | From there, the probability and negative log-likelihood.
481 | Parameters
482 | ----------
483 | logits_series_tf: tensor with node_size X datapoint
484 | simulated EEG
485 | labels_series_tf: tensor with node_size X datapoint
486 | empirical EEG
487 | """
488 | # get node_size and the number of time points (truncated_backprop_length)
489 | node_size = logits_series_tf.shape[0]
490 | truncated_backprop_length = logits_series_tf.shape[1]
491 |
492 | # remove mean across time
493 | labels_series_tf_n = labels_series_tf - torch.reshape(torch.mean(labels_series_tf, 1),
494 | [node_size, 1]) # - torch.matmul(
495 |
496 | logits_series_tf_n = logits_series_tf - torch.reshape(torch.mean(logits_series_tf, 1),
497 | [node_size, 1]) # - torch.matmul(
498 |
499 | # correlation
500 | cov_sim = torch.matmul(logits_series_tf_n, torch.transpose(logits_series_tf_n, 0, 1))
501 | cov_def = torch.matmul(labels_series_tf_n, torch.transpose(labels_series_tf_n, 0, 1))
502 |
503 | # fc for sim and empirical BOLDs
504 | FC_sim_T = torch.matmul(torch.matmul(torch.diag(torch.reciprocal(torch.sqrt( \
505 | torch.diag(cov_sim)))), cov_sim),
506 | torch.diag(torch.reciprocal(torch.sqrt(torch.diag(cov_sim)))))
507 | FC_T = torch.matmul(torch.matmul(torch.diag(torch.reciprocal(torch.sqrt( \
508 | torch.diag(cov_def)))), cov_def),
509 | torch.diag(torch.reciprocal(torch.sqrt(torch.diag(cov_def)))))
510 |
511 | # mask for lower triangle without diagonal
512 | ones_tri = torch.tril(torch.ones_like(FC_T), -1)
513 | zeros = torch.zeros_like(FC_T) # tensor of all zeros
514 | mask = torch.greater(ones_tri, zeros) # boolean mask, True on the lower triangle below the diagonal
515 |
516 | # mask out fc to vector with elements of the lower triangle
517 | FC_tri_v = torch.masked_select(FC_T, mask)
518 | FC_sim_tri_v = torch.masked_select(FC_sim_T, mask)
519 |
520 | # remove the mean across the elements
521 | FC_v = FC_tri_v - torch.mean(FC_tri_v)
522 | FC_sim_v = FC_sim_tri_v - torch.mean(FC_sim_tri_v)
523 |
524 | # corr_coef
525 | corr_FC = torch.sum(torch.multiply(FC_v, FC_sim_v)) \
526 | * torch.reciprocal(torch.sqrt(torch.sum(torch.multiply(FC_v, FC_v)))) \
527 | * torch.reciprocal(torch.sqrt(torch.sum(torch.multiply(FC_sim_v, FC_sim_v))))
528 |
529 | # use surprise: corr to calculate probability and -log
530 | losses_corr = -torch.log(0.5000 + 0.5 * corr_FC) # torch.mean((FC_v -FC_sim_v)**2)#
531 | return losses_corr
532 |
533 | def cost_eff(self, model, sim, emp):
534 | if self.method == 0:
535 | return self.cost_dist(sim, emp)
536 | elif self.method == 1:
537 | return self.cost_beamform(model, emp) + self.cost_dist(sim, emp)
538 | else:
539 | return self.cost_r(sim, emp)
540 |
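The correlation-based loss in `cost_r` maps a Pearson correlation r in [-1, 1] to -log(0.5 + 0.5*r): a perfect correlation gives zero loss, r = 0 gives log(2), and the loss grows without bound as r approaches -1. A small self-contained check of that mapping:

```python
import numpy as np

def corr_loss(r):
    # surprise-style loss used by cost_r: -log of the "probability" (1 + r)/2
    return -np.log(0.5 + 0.5 * r)

loss_perfect = corr_loss(1.0)  # perfect correlation -> zero loss
loss_zero = corr_loss(0.0)     # no correlation -> log(2)
```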
541 |
542 |
543 | class Model_fitting:
544 | """
545 | Using ADAM and AutoGrad to fit the JansenRit model to empirical EEG
546 | Attributes
547 | ----------
548 | model: instance of class RNNJANSEN
549 | forward model JansenRit
550 | ts: array with num_tr x node_size
551 | empirical EEG time-series
552 | num_epoches: int
553 | the number of training epochs
554 | Methods:
555 | train()
556 | train the model
557 | test()
558 | use the optimal model parameters to simulate the EEG
559 | """
560 |
561 | # from sklearn.metrics.pairwise import cosine_similarity
562 | def __init__(self, model, ts, num_epoches, cost):
563 | """
564 | Parameters
565 | ----------
566 | model: instance of class RNNJANSEN
567 | forward model JansenRit
568 | ts: array with num_tr x node_size
569 | empirical EEG time-series
570 | num_epoches: int
571 | the number of training epochs
572 | """
573 | self.model = model
574 | self.num_epoches = num_epoches
575 | # self.u = u
576 | """if ts.shape[1] != model.node_size:
577 | print('ts is a matrix with the number of datapoint X the number of node')
578 | else:
579 | self.ts = ts"""
580 | self.ts = ts
581 |
582 | self.cost = Costs(cost)
583 |
584 | def save(self, filename):
585 | with open(filename, 'wb') as f:
586 | pickle.dump(self, f)
587 |
588 | def train(self, u=0):
589 | """
590 | Parameters
591 | ----------
592 | None
593 | Outputs: OutputRJ
594 | """
595 |
596 | # define some constants
597 | lb = 0.001
598 | delays_max = 500
599 | state_ub = 2
600 | state_lb = 0.5
601 | w_cost = 10
602 |
603 | epoch_min = 110 # run minimum epoch # part of stop criteria
604 | r_lb = 0.85 # lowest pearson correlation # part of stop criteria
605 |
606 | self.u = u
607 |
608 | # placeholder for the output (EEG, history of model parameters, and loss)
609 | self.output_sim = OutputNM(self.model.model_name, self.model.node_size, self.model.param,
610 | self.model.fit_gains_flat, self.model.fit_lfm_flat)
611 | # define an optimizer (ADAM)
612 | optimizer = optim.Adam(self.model.parameters(), lr=0.05, eps=1e-7)
613 |
614 | # initial state
615 | if self.model.model_name == 'WWD':
616 | # initial state
617 | X = torch.tensor(0.45 * np.random.uniform(0, 1, (self.model.node_size, self.model.state_size)) + np.array(
618 | [0, 0, 0, 1.0, 1.0, 1.0]), dtype=torch.float32)
619 | elif self.model.model_name == 'JR':
620 | X = torch.tensor(np.random.uniform(state_lb, state_ub, (self.model.node_size, self.model.state_size)),
621 | dtype=torch.float32)
622 | hE = torch.tensor(np.random.uniform(state_lb, state_ub, (self.model.node_size, delays_max)),
623 | dtype=torch.float32)
624 |
625 | # define masks for getting the lower triangle of a matrix
626 | mask = np.tril_indices(self.model.node_size, -1)
627 | mask_e = np.tril_indices(self.model.output_size, -1)
628 |
629 | # placeholders for the history of model parameters
630 | fit_param = {}
631 | exclude_param = []
632 | fit_sc = [self.model.sc[mask].copy()] # sc weights history
633 | if self.model.fit_gains_flat == True:
634 | exclude_param.append('w_bb')
635 |
636 | if self.model.model_name == "JR" and self.model.fit_lfm_flat == True:
637 | exclude_param.append('lm')
638 | fit_lm = [self.model.lm.detach().numpy().ravel().copy()] # leadfield matrix history
639 |
640 | for key, value in self.model.state_dict().items():
641 | if key not in exclude_param:
642 | fit_param[key] = [value.detach().numpy().ravel().copy()]
643 |
644 | loss_his = []
645 |
646 | # define constant 1 tensor
647 |
648 | con_1 = torch.tensor(1.0, dtype=torch.float32)
649 |
650 | # define num_batches
651 | num_batches = int(self.ts.shape[2] / self.model.batch_size)
652 | for i_epoch in range(self.num_epoches):
653 |
654 | # X = torch.tensor(np.random.uniform(0, 5, (self.model.node_size, self.model.state_size)) , dtype=torch.float32)
655 | # hE = torch.tensor(np.random.uniform(0, 5, (self.model.node_size,83)), dtype=torch.float32)
656 | eeg = self.ts[i_epoch % self.ts.shape[0]]
657 | # Create placeholders for the simulated EEG E I M Ev Iv and Mv of entire time series.
658 | for name in self.model.state_names + [self.output_sim.output_name]:
659 | setattr(self.output_sim, name + '_train', [])
660 |
661 | external = torch.tensor(np.zeros([self.model.node_size, self.model.hidden_size, self.model.batch_size]),
662 | dtype=torch.float32)
663 |
664 | # Perform the training in batches.
665 |
666 | for i_batch in range(num_batches):
667 |
668 | # Reset the gradient to zeros after update model parameters.
669 | optimizer.zero_grad()
670 |
671 | # Initialize the placeholder for the next state.
672 | X_next = torch.zeros_like(X)
673 |
674 | # Get the input and output noises for the module.
675 | noise_in = torch.tensor(np.random.randn(self.model.node_size, self.model.hidden_size, \
676 | self.model.batch_size, self.model.input_size),
677 | dtype=torch.float32)
678 | noise_out = torch.tensor(np.random.randn(self.model.node_size, self.model.batch_size),
679 | dtype=torch.float32)
680 | if not isinstance(self.u, int):
681 | external = torch.tensor(
682 | (self.u[:, :, i_batch * self.model.batch_size:(i_batch + 1) * self.model.batch_size]),
683 | dtype=torch.float32)
684 |
685 | # Use the model.forward() function to update next state and get simulated EEG in this batch.
686 | next_batch, hE_new = self.model(external, noise_in, noise_out, X, hE)
687 |
688 | # Get the batch of the empirical EEG signal.
689 | ts_batch = torch.tensor(
690 | (eeg.T[i_batch * self.model.batch_size:(i_batch + 1) * self.model.batch_size, :]).T,
691 | dtype=torch.float32)
692 |
693 | if self.model.model_name == 'WWD':
694 | E_batch = next_batch['E_batch']
695 | I_batch = next_batch['I_batch']
696 | loss_EI = 0.1 * torch.mean(
697 | torch.mean(E_batch * torch.log(E_batch) + (con_1 - E_batch) * torch.log(con_1 - E_batch) \
698 | + 0.5 * I_batch * torch.log(I_batch) + 0.5 * (con_1 - I_batch) * torch.log(
699 | con_1 - I_batch), axis=1))
700 | else:
701 | loss_EI = 0
702 | loss_prior = []
703 | # define the relu function
704 | m = torch.nn.ReLU()
705 | variables_p = [a for a in dir(self.model.param) if
706 | not a.startswith('__') and not callable(getattr(self.model.param, a))]
707 | # get the penalty on each model parameter from its prior distribution
708 | for var in variables_p:
709 | # print(var)
710 | if np.any(getattr(self.model.param, var)[1] > 0) and var != 'std_in' and var not in exclude_param:
711 | # print(var)
712 | dict_np = {}
713 | dict_np['m'] = var + '_m'
714 | dict_np['v'] = var + '_v'
715 | loss_prior.append(torch.sum((lb + m(self.model.get_parameter(dict_np['v']))) * \
716 | (m(self.model.get_parameter(var)) - m(
717 | self.model.get_parameter(dict_np['m']))) ** 2) \
718 | + torch.sum(-torch.log(lb + m(self.model.get_parameter(dict_np['v'])))))
719 | # total loss
720 | if self.model.model_name == 'WWD':
721 | loss = 0.1 * w_cost * self.cost.cost_eff(self.model, next_batch['bold_batch'], ts_batch) + sum(
722 | loss_prior) + 0.5 * loss_EI
723 | elif self.model.model_name == 'JR':
724 | loss = w_cost * self.cost.cost_eff(self.model, next_batch['eeg_batch'], ts_batch) + sum(loss_prior)
725 |
726 |
727 | # Put the batch of the simulated EEG, E I M Ev Iv Mv in to placeholders for entire time-series.
728 | for name in self.model.state_names + [self.output_sim.output_name]:
729 | name_next = name + '_batch'
730 | tmp_ls = getattr(self.output_sim, name + '_train')
731 | tmp_ls.append(next_batch[name_next].detach().numpy())
732 | # print(name+'_train', name+'_batch', tmp_ls)
733 | setattr(self.output_sim, name + '_train', tmp_ls)
734 | """eeg_sim_train.append(next_batch['eeg_batch'].detach().numpy())
735 | E_sim_train.append(next_batch['E_batch'].detach().numpy())
736 | I_sim_train.append(next_batch['I_batch'].detach().numpy())
737 | M_sim_train.append(next_batch['M_batch'].detach().numpy())
738 | Ev_sim_train.append(next_batch['Ev_batch'].detach().numpy())
739 | Iv_sim_train.append(next_batch['Iv_batch'].detach().numpy())
740 | Mv_sim_train.append(next_batch['Mv_batch'].detach().numpy())"""
741 |
742 | loss_his.append(loss.detach().numpy())
743 | # print('epoch: ', i_epoch, 'batch: ', i_batch, loss.detach().numpy())
744 |
745 | # Calculate gradient using backward (backpropagation) method of the loss function.
746 | loss.backward(retain_graph=True)
747 |
748 | # Optimize the model based on the gradient method in updating the model parameters.
749 | optimizer.step()
750 |
751 | # Put the updated model parameters into the history placeholders.
752 | # sc_par.append(self.model.sc[mask].copy())
753 | for key, value in self.model.state_dict().items():
754 | if key not in exclude_param:
755 | fit_param[key].append(value.detach().numpy().ravel().copy())
756 |
757 | fit_sc.append(self.model.sc_m.detach().numpy()[mask].copy())
758 | if self.model.model_name == "JR" and self.model.fit_lfm_flat == True:
759 | fit_lm.append(self.model.lm.detach().numpy().ravel().copy())
760 |
761 | # last update current state using next state... (no direct use X = X_next, since gradient calculation only depends on one batch no history)
762 | X = torch.tensor(next_batch['current_state'].detach().numpy(), dtype=torch.float32)
763 | hE = torch.tensor(hE_new.detach().numpy(), dtype=torch.float32)
764 | # print(hE_new.detach().numpy()[20:25,0:20])
765 | # print(hE.shape)
766 | fc = np.corrcoef(self.ts.mean(0))
767 | """ts_sim = np.concatenate(eeg_sim_train, axis=1)
768 | E_sim = np.concatenate(E_sim_train, axis=1)
769 | I_sim = np.concatenate(I_sim_train, axis=1)
770 | M_sim = np.concatenate(M_sim_train, axis=1)
771 | Ev_sim = np.concatenate(Ev_sim_train, axis=1)
772 | Iv_sim = np.concatenate(Iv_sim_train, axis=1)
773 | Mv_sim = np.concatenate(Mv_sim_train, axis=1)"""
774 | tmp_ls = getattr(self.output_sim, self.output_sim.output_name + '_train')
775 | ts_sim = np.concatenate(tmp_ls, axis=1)
776 | fc_sim = np.corrcoef(ts_sim[:, 10:])
777 |
778 | print('epoch: ', i_epoch, loss.detach().numpy())
779 |
780 | print('epoch: ', i_epoch, np.corrcoef(fc_sim[mask_e], fc[mask_e])[0, 1], 'cos_sim: ',
781 | np.diag(cosine_similarity(ts_sim, self.ts.mean(0))).mean())
782 |
783 | for name in self.model.state_names + [self.output_sim.output_name]:
784 | tmp_ls = getattr(self.output_sim, name + '_train')
785 | setattr(self.output_sim, name + '_train', np.concatenate(tmp_ls, axis=1))
786 | """self.output_sim.EEG_train = ts_sim
787 | self.output_sim.E_train = E_sim
788 | self.output_sim.I_train= I_sim
789 | self.output_sim.P_train = M_sim
790 | self.output_sim.Ev_train = Ev_sim
791 | self.output_sim.Iv_train= Iv_sim
792 | self.output_sim.Pv_train = Mv_sim"""
793 | self.output_sim.loss = np.array(loss_his)
794 |
795 | if i_epoch > epoch_min and np.corrcoef(fc_sim[mask_e], fc[mask_e])[0, 1] > r_lb:
796 | break
797 | # print('epoch: ', i_epoch, np.corrcoef(fc_sim[mask_e], fc[mask_e])[0, 1])
798 | self.output_sim.weights = np.array(fit_sc)
799 | if self.model.model_name == 'JR' and self.model.fit_lfm_flat == True:
800 | self.output_sim.leadfield = np.array(fit_lm)
801 | for key, value in fit_param.items():
802 | setattr(self.output_sim, key, np.array(value))
803 |
804 | def test(self, x0, he0, base_batch_num, u=0):
805 | """
806 | Parameters
807 | ----------
808 | num_batches: int
809 | length of simEEG = batch_size x num_batches
810 | values of model parameters from model.state_dict
811 | Outputs:
812 | output_test: OutputJR
813 | """
814 |
815 | # define some constants
816 | state_lb = 0
817 | state_ub = 5
818 | delays_max = 500
819 | # base_batch_num = 20
820 | transient_num = 10
821 |
822 | self.u = u
823 |
824 | # initial state
825 | X = torch.tensor(x0, dtype=torch.float32)
826 | hE = torch.tensor(he0, dtype=torch.float32)
827 |
828 | # X = torch.tensor(np.random.uniform(state_lb, state_ub, (self.model.node_size, self.model.state_size)) , dtype=torch.float32)
829 | # hE = torch.tensor(np.random.uniform(state_lb, state_ub, (self.model.node_size,500)), dtype=torch.float32)
830 |
831 | # placeholders for model parameters
832 |
833 | # define mask for getting the lower triangle of a matrix
834 | mask = np.tril_indices(self.model.node_size, -1)
835 | mask_e = np.tril_indices(self.model.output_size, -1)
836 |
837 | # define num_batches
838 | num_batches = int(self.ts.shape[2] / self.model.batch_size) + base_batch_num
839 | # Create placeholders for the simulated EEG and the states of the entire time series.
840 | for name in self.model.state_names + [self.output_sim.output_name]:
841 | setattr(self.output_sim, name + '_test', [])
842 |
843 | u_hat = np.zeros(
844 | (self.model.node_size, self.model.hidden_size, base_batch_num * self.model.batch_size + self.ts.shape[2]))
845 | u_hat[:, :, base_batch_num * self.model.batch_size:] = self.u
846 |
847 | # Perform the simulation in batches.
848 |
849 | for i_batch in range(num_batches):
850 |
851 | # Initialize the placeholder for the next state.
852 | X_next = torch.zeros_like(X)
853 |
854 | # Get the input and output noises for the module.
855 | noise_in = torch.tensor(np.random.randn(self.model.node_size, self.model.hidden_size, \
856 | self.model.batch_size, self.model.input_size), dtype=torch.float32)
857 | noise_out = torch.tensor(np.random.randn(self.model.node_size, self.model.batch_size), dtype=torch.float32)
858 | external = torch.tensor(
859 | (u_hat[:, :, i_batch * self.model.batch_size:(i_batch + 1) * self.model.batch_size]),
860 | dtype=torch.float32)
861 |
862 | # Use the model.forward() function to update next state and get simulated EEG in this batch.
863 | next_batch, hE_new = self.model(external, noise_in, noise_out, X, hE)
864 |
865 | if i_batch > base_batch_num - 1:
866 | for name in self.model.state_names + [self.output_sim.output_name]:
867 | name_next = name + '_batch'
868 | tmp_ls = getattr(self.output_sim, name + '_test')
869 | tmp_ls.append(next_batch[name_next].detach().numpy())
870 | # print(name+'_train', name+'_batch', tmp_ls)
871 | setattr(self.output_sim, name + '_test', tmp_ls)
872 |
873 | # last update current state using next state... (no direct use X = X_next, since gradient calculation only depends on one batch no history)
874 | X = torch.tensor(next_batch['current_state'].detach().numpy(), dtype=torch.float32)
875 | hE = torch.tensor(hE_new.detach().numpy(), dtype=torch.float32)
876 | # print(hE_new.detach().numpy()[20:25,0:20])
877 | # print(hE.shape)
878 | fc = np.corrcoef(self.ts.mean(0))
879 | """ts_sim = np.concatenate(eeg_sim_test, axis=1)
880 | E_sim = np.concatenate(E_sim_test, axis=1)
881 | I_sim = np.concatenate(I_sim_test, axis=1)
882 | M_sim = np.concatenate(M_sim_test, axis=1)
883 | Ev_sim = np.concatenate(Ev_sim_test, axis=1)
884 | Iv_sim = np.concatenate(Iv_sim_test, axis=1)
885 | Mv_sim = np.concatenate(Mv_sim_test, axis=1)"""
886 | tmp_ls = getattr(self.output_sim, self.output_sim.output_name + '_test')
887 | ts_sim = np.concatenate(tmp_ls, axis=1)
888 |
889 | fc_sim = np.corrcoef(ts_sim[:, transient_num:])
890 | # print('r: ', np.corrcoef(fc_sim[mask_e], fc[mask_e])[0, 1], 'cos_sim: ', np.diag(cosine_similarity(ts_sim, self.ts.mean(0))).mean())
891 |
892 | for name in self.model.state_names + [self.output_sim.output_name]:
893 | tmp_ls = getattr(self.output_sim, name + '_test')
894 | setattr(self.output_sim, name + '_test', np.concatenate(tmp_ls, axis=1))
895 |
--------------------------------------------------------------------------------
/tepfit/sim.py:
--------------------------------------------------------------------------------
1 | # Numpy sim implementation
2 |
3 | """
4 | Numpy JR model implementation
5 | ==============================
6 | """
7 |
8 | """
9 | Importage
10 | """
11 |
12 | from mpl_toolkits import mplot3d
13 | # %matplotlib inline  (IPython magic; only valid in a notebook, not in a .py module)
14 | import numpy as np
15 | import matplotlib.pyplot as plt
16 |
17 |
18 |
19 |
20 | # %load jr_model.py
21 |
22 | """
23 | Importages
24 | """
25 |
26 |
27 | # Generic stuff
28 |
29 | from copy import copy,deepcopy
30 | import os,sys,glob
31 | from itertools import product
32 | from datetime import datetime
33 | from numpy import exp,sin,cos,sqrt,pi, r_,floor,zeros
34 | import numpy as np,pandas as pd
35 | from joblib import Parallel,delayed
36 | from numpy.random import rand,normal
37 | import scipy
38 |
39 |
40 | # Neuroimaging & signal processing stuff
41 |
42 | from scipy.signal import welch,periodogram
43 | from scipy.spatial import cKDTree
44 |
45 |
46 | # Vizualization stuff
47 |
48 | from matplotlib.tri import Triangulation
49 | from matplotlib import pyplot as plt
50 | from matplotlib import cm
51 | import matplotlib as mpl
52 | import seaborn as sns
53 | import moviepy
54 | from moviepy.editor import ImageSequenceClip
55 | from matplotlib.pyplot import subplot
56 |
57 |
58 |
59 | """
60 | Run sim function
61 | """
62 |
63 | # Notes:
64 | # - time series not returned by default (to avoid memory errors). Will be returned
65 | # if return_ts=True
66 | # - freq vs. amp param sweep has default range within which max freqs are calculated
67 | # of 5-95 Hz
68 |
69 |
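`run_net_sim` builds its stimulus time-course according to `stim_type`; for `stim_type='square'`, for example, the stimulus equals `stim_amp` between `stim_on` and `stim_off` (both in time-step units) and zero elsewhere. A self-contained sketch of that branch, with toy values for the sizes:

```python
import numpy as np

T, T_transient = 1000, 100          # toy values; the function defaults are larger
Ts = np.arange(0, T + T_transient)  # time step indices

stim_amp, stim_on, stim_off = 0.5, 200, 400
stim = np.zeros_like(Ts, dtype=float)
stim[(Ts >= stim_on) & (Ts <= stim_off)] = stim_amp
```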
70 | def run_net_sim(D_pyc = .0001,D_ein= .0001, D_iin = 0.0001,T = 40000,P=1, Q = 1,K = 1,Dt = 0.0001,
71 | dt = 0.1,gain = 20.,threshold = 0.,Pi = 3.14159,g = 0.9, noise_std=0.005,
72 | T_transient=10000, stim_freq=35.,stim_amp=0.5,return_ts=False,compute_connectivity=False,
73 | weights=None,delays=None,
74 | fmin = 1.,fmax = 100.,fskip = 1,
75 | stim_nodes=None,stim_pops=['pyc_v'],mne_welch=False,#filt_freqs = [0,np.inf],
76 | stim_type='sinewave',stim_on=None,stim_off=None,
77 | welch_nperseg=None,welch_noverlap=None,
78 | nu_max = 2.5, # STANDARD JR MODEL PARAMS (JANSEN AND RITT 1995)
79 | v0 = 6.,
80 | r = 0.56,
81 | a_1 = 1,
82 | a_2 = 0.8,
83 | a_3 = 0.25,
84 | a_4 = 0.25*1,
85 | a = 100,
86 | A = 3.25,
87 | b = 50,
88 | B = 22.0,
89 | J = 135,
90 | mu = 0):
91 |
92 |
93 | """a_1 = a_1*J
94 | a_2 = a_2*J
95 | a_3 = a_3*J
96 | a_4 = a_4*J"""
97 |
98 | """
99 | Time units explained:
100 |
101 |
102 | Elements t of Ts are integers and represent 'time units'
103 |
104 | Physical duration of a time unit is as follows:
105 |
106 | Time (real, in ms) = t*dt*scaling = t*Dt
107 |
108 | scaling is implicitly defined as Dt/dt
109 |
110 | Example: with the defaults Dt = 0.0001 and dt = 0.1, scaling = Dt/dt = 0.001, so time step t = 10000 corresponds to t*dt*scaling = t*Dt = 1.0 ms
111 |
112 |
113 | NOTE: need to be careful with delays units etc.
114 |
115 | """
116 |
117 |
118 |
119 |
120 |
121 | n_nodes = K
122 |
123 | x1 = n_nodes/2.
124 | x2 = x1+20.
125 |
126 |
127 | # Neuronal response function
128 | def f(u,gain=gain,threshold=threshold):
129 |         output = 1./(1. + exp(-gain*(u - threshold)))
130 | return output
131 |
132 |
133 | def SigmoidalJansenRit(x):
134 | cmax = 5
135 | cmin = 0
136 | midpoint = 6
137 | r= 0.56
138 | pre = cmax / (1.0 + np.exp(r * (midpoint - x)))
139 | return pre
140 |
141 |
142 | # Initialize variables
143 | Xi_pyc_c = np.random.uniform(low=-1.,high=1.,size=[K,T+T_transient])
144 | Xi_pyc_v = np.random.uniform(low=-1.,high=1.,size=[K,T+T_transient])
145 | Xi_ein_c = np.random.uniform(low=-1.,high=1.,size=[K,T+T_transient])
146 | Xi_ein_v = np.random.uniform(low=-1.,high=1.,size=[K,T+T_transient])
147 | Xi_iin_c = np.random.uniform(low=-1.,high=1.,size=[K,T+T_transient])
148 | Xi_iin_v = np.random.uniform(low=-1.,high=1.,size=[K,T+T_transient])
149 |
150 |
151 | pyc_c_all = np.zeros_like(Xi_pyc_c)
152 | pyc_v_all = np.zeros_like(Xi_pyc_v)
153 | ein_c_all = np.zeros_like(Xi_ein_c)
154 | ein_v_all = np.zeros_like(Xi_ein_v)
155 | iin_c_all = np.zeros_like(Xi_iin_c)
156 | iin_v_all = np.zeros_like(Xi_iin_v)
157 |
158 |
159 | Ts = np.arange(0,T+T_transient) # time step numbers list
160 |
161 |
162 | if stim_type == 'sinewave':
163 |         #state_input = ((Ts>x1+1)&(Ts<x2))
164 |         stim = stim_amp*np.sin(2.*np.pi*stim_freq*(Ts*Dt)/1000.)  # Ts*Dt = time in ms (assumed)
165 |
166 |     # NOTE: this branch was reconstructed from the surrounding code; the
167 |     # branch name and the pulse-period formula are assumptions
168 |     elif stim_type == 'spiketrain':
169 |         spiketrain = np.zeros_like(Ts).astype(float)
170 |         stim_period = int(1000./(stim_freq*Dt))  # steps between pulse onsets
171 |         t_cnt = 0
172 |         for t in Ts[:-3]:
173 |             if t >= t_cnt:
174 | spiketrain[t] = stim_amp
175 | spiketrain[t+1] = stim_amp
176 | spiketrain[t+2] = 0
177 | spiketrain[t+3] = 0
178 | t_cnt = t_cnt+stim_period
179 | stim = spiketrain
180 | elif stim_type== 'square':
181 |         stim = np.zeros_like(Ts, dtype=float)  # float dtype so stim_amp is not truncated
182 | stim[(Ts>=stim_on) & (Ts<=stim_off)] = stim_amp
183 | elif stim_type== 'rand':
184 | stim = np.random.randn(Ts.shape[0])*stim_amp
185 |
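The `'square'` stimulus branch reduces to a boolean-mask assignment into a (float) array; a self-contained version with toy values:

```python
import numpy as np

Ts = np.arange(0, 1000)                  # integer time-step indices
stim_amp, stim_on, stim_off = 0.5, 200, 300

stim = np.zeros_like(Ts, dtype=float)    # float, so fractional amplitudes survive
stim[(Ts >= stim_on) & (Ts <= stim_off)] = stim_amp
print(stim[250], stim[500])  # 0.5 0.0
```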
186 |
187 |
188 | # convert stim to array the size of e
189 | #stim = np.tile(stim,[n_nodes,1])
190 |
191 | stim_temp = np.zeros((n_nodes, stim.shape[0]))
192 |
193 |
194 | # arrange stimulus for the nodes
195 |     if stim_nodes is None:
196 |         stim = stim_temp
197 |
198 | else:
199 | stim_temp[stim_nodes] = stim
200 | stim = stim_temp
201 |
202 | stim_nodes = np.array(stim_nodes)
203 | nostim_nodes = np.array([k for k in range(n_nodes) if k not in stim_nodes])
204 | if nostim_nodes.shape[0] != 0: stim[nostim_nodes,:] = 0.
205 |
206 |
207 |
208 |     # decide which population(s) receive the stimulus (multiple entries in stim_pops are allowed)
209 | stim_pyc_c = stim_pyc_v = stim_ein_c = stim_ein_v = stim_iin_c = stim_iin_v = np.zeros_like(stim)
210 | if 'pyc_v' in stim_pops: stim_pyc_v = stim
211 | if 'pyc_c' in stim_pops: stim_pyc_c = stim
212 | if 'ein_c' in stim_pops: stim_ein_c = stim
213 | if 'ein_v' in stim_pops: stim_ein_v = stim
214 | if 'iin_c' in stim_pops: stim_iin_c = stim
215 | if 'iin_v' in stim_pops: stim_iin_v = stim
216 |
217 |
218 | phi_n_scaling = (0.1 * 325 * (0.32-0.12) * 0.5 )**2 / 2.
219 | #+ phi_n_scaling *0.0000000001
220 | #Initialize just first state with random variables instead...
221 | """pyc_c = np.random.randn(n_nodes)
222 | pyc_v = np.random.randn(n_nodes)
223 | ein_c = np.random.randn(n_nodes)
224 | ein_v = np.random.randn(n_nodes)
225 | iin_c = np.random.randn(n_nodes)
226 | iin_v = np.random.randn(n_nodes)"""
227 |
228 |
229 | pyc_c = np.zeros(n_nodes)
230 | pyc_v = np.zeros(n_nodes)
231 | ein_c = np.zeros(n_nodes)
232 | ein_v = np.zeros(n_nodes) + phi_n_scaling *0.00000
233 | iin_c = np.zeros(n_nodes)
234 | iin_v = np.zeros(n_nodes)
235 |
236 | # Define time
237 | #ts = r_[0:sim_len:dt] # integration step time points (ms)
238 | #n_steps = len(ts) #n_steps = int(round(sim_len/dt)) # total number of integration steps
239 | #ts_idx = arange(0,n_steps) # time step indices
240 |
241 | # Sampling frequency for fourier analysis
242 | fs = (1000.*Dt)/dt
243 | #fs = 1./dt
244 |
245 |
246 | # Make a past value index lookup table from
247 | # the conduction delays matrix and the
248 | # integration step size
249 | #delays_lookup = floor(delays/dt).astype(int)
250 | delays_lookup = floor(delays).astype(int)
251 |
252 | maxdelay = int(np.max(delays_lookup))+1
253 |
254 | # Integration loop
255 | for t in Ts[maxdelay:-1]:
256 | #print(t)
257 |
258 |
259 | # For each node, find and summate the time-delayed, coupling
260 | # function-modified inputs from all other nodes
261 | #ccin = zeros(n_nodes)
262 | noise_e = np.random.rand(n_nodes)
263 | noise_i = np.random.rand(n_nodes)
264 | noise_p = np.random.rand(n_nodes)
265 |
266 | diff = ein_c_all[:,:t] #- iin_c_all[:,:t]
267 |
268 | Ed =diff[:,::-1][np.arange(n_nodes), delays.T].T
269 | L = weights - 1*np.diag(weights.sum(1))
270 | ccen=np.sum(L*Ed.T, axis=1)
271 | """diff = iin_c_all[:,:t] #- iin_c_all[:,:t]
272 | Ed =diff[:,::-1][np.arange(n_nodes), delays.T].T
273 | L = weights - np.diag(weights.sum(1))
274 | ccin=np.sum(L*Ed.T, axis=1)
275 | diff = pyc_c_all[:,:t] #- iin_c_all[:,:t]
276 | Ed =diff[:,::-1][np.arange(n_nodes), delays.T].T
277 | L = weights - np.diag(weights.sum(1))
278 | cpin=np.sum(L*Ed.T, axis=1)"""
279 |
280 | """for j in range(0,n_nodes):
281 |
282 | # The .sum() here sums inputs for the current node here
283 | # over both nodes and history (i.e. over space-time)
284 | #insum[i] = f((1./n_nodes) * history_idx_weighted[:,i,:] * history).sum()
285 |
286 | for k in range(0,n_nodes):
287 |
288 | # Get time-delayed, sigmoid-transformed inputs
289 | delay_idx = delays_lookup[j,k]
290 | if delay_idx < t:
291 | #delayed_input = f(pyc_v_all[k,t-delay_idx])
292 | delayed_input = SigmoidalJansenRit(ein_c_all[k,t-delay_idx] - iin_c_all[k,t-delay_idx])
293 | else:
294 | delayed_input = 0
295 |
296 | # Weight inputs by anatomical connectivity, and normalize by number of nodes
297 | ccin[j]+=(1./n_nodes)*weights[j,k]*delayed_input"""
298 |
299 |
300 | # Cortico-cortical connections
301 |
302 | """gccen = g*(ccen-ccin)
303 | gccin = g*(ccin-ccen)/4
304 |
305 | gcpin = g*cpin"""
306 | gccen = g*(ccen)
307 | # Noise inputs
308 | """pyc_c_noise = sqrt(2.*D_pyc*dt)*Xi_pyc_c[:,t]
309 | pyc_v_noise = sqrt(2.*D_pyc*dt)*Xi_pyc_v[:,t]
310 | ein_c_noise = sqrt(2.*D_ein*dt)*Xi_ein_c[:,t]
311 | ein_v_noise = sqrt(2.*D_ein*dt)*Xi_ein_v[:,t]
312 | iin_c_noise = sqrt(2.*D_iin*dt)*Xi_iin_c[:,t]
313 | iin_v_noise = sqrt(2.*D_iin*dt)*Xi_iin_v[:,t]"""
314 |
315 |
316 |
317 |
318 | """
319 |
320 | # REFERENCE: TVB JR Model Implementation
321 | # https://github.com/the-virtual-brain/tvb-root/blob/017cd94f09a91b3a498efcb0213c06e79ed87ab5/scientific_library/tvb/simulator/models/jansen_rit.py#L206
322 |
323 | def _numpy_dfun(self, state_variables, coupling, local_coupling=0.0):
324 | y0, y1, y2, y3, y4, y5 = state_variables
325 |
326 | # NOTE: This is assumed to be \sum_j u_kj * S[y_{1_j} - y_{2_j}]
327 | lrc = coupling[0, :]
328 | short_range_coupling = local_coupling*(y1 - y2)
329 |
330 | # NOTE: for local couplings
331 | # 0: pyramidal cells
332 | # 1: excitatory interneurons
333 | # 2: inhibitory interneurons
334 | # 0 -> 1,
335 | # 0 -> 2,
336 | # 1 -> 0,
337 | # 2 -> 0,
338 |
339 | exp = numpy.exp
340 | sigm_y1_y2 = 2.0 * self.nu_max / (1.0 + exp(self.r * (self.v0 - (y1 - y2))))
341 | sigm_y0_1 = 2.0 * self.nu_max / (1.0 + exp(self.r * (self.v0 - (self.a_1 * self.J * y0))))
342 | sigm_y0_3 = 2.0 * self.nu_max / (1.0 + exp(self.r * (self.v0 - (self.a_3 * self.J * y0))))
343 |
344 | return numpy.array([
345 | y3,
346 | y4,
347 | y5,
348 | self.A * self.a * sigm_y1_y2 - 2.0 * self.a * y3 - self.a ** 2 * y0,
349 | self.A * self.a * (self.mu + self.a_2 * self.J * sigm_y0_1 + lrc + short_range_coupling)
350 | - 2.0 * self.a * y4 - self.a ** 2 * y1,
351 | self.B * self.b * (self.a_4 * self.J * sigm_y0_3) - 2.0 * self.b * y5 - self.b ** 2 * y2,
352 | ])
353 | """
354 |
355 |         # Sigmoid transfer functions - 'wave to pulse' (potential -> firing rate)
356 | sigm_ein_iin = 2.0 * nu_max / ( 1. + exp(r * (v0 - (ein_c - iin_c ))) )
357 | sigm_pyc_1 = 2.0 * nu_max / ( 1. + exp(r * (v0 - (a_1 * J * pyc_c))) )
358 | sigm_pyc_3 = 2.0 * nu_max / ( 1. + exp(r * (v0 - (a_3 * J * pyc_c))) )
359 |
360 | pyc_u = (sigm_ein_iin )
361 | ein_u = a_2*J*sigm_pyc_1+ gccen + stim_ein_c[:,t] +noise_std*noise_e #1000*np.tanh(0.0001*(g*ccen+mu))
362 | iin_u = a_4*J*sigm_pyc_3 #0*800*np.tanh(0.001*(gccin) ) +
363 |
364 |         # Synaptic response - 'pulse to wave' (firing rate -> potential)
365 |
366 | pyc_in = A*a*(pyc_u) - (2.0*a*pyc_v) - ((a**2)*pyc_c)
367 |
368 | ein_in = A*a*(ein_u) - (2.0*a*ein_v) - ((a**2)*ein_c)
369 |
370 | iin_in = B*b*(iin_u) - (2.0*b*iin_v) - ((b**2)*iin_c)
371 |
372 |
373 |
374 |
375 | # d/dt
376 |
377 | pyc_c = pyc_c + dt * (pyc_v) + 0*np.sqrt(dt)*noise_std*noise_p
378 | ein_c = ein_c + dt * (ein_v) + 0*np.sqrt(dt)*noise_std*noise_e
379 | iin_c = iin_c + dt * (iin_v) + 0*np.sqrt(dt)*noise_std*noise_i
380 |
381 | pyc_v = pyc_v + dt * (pyc_in)
382 | ein_v = ein_v + dt * (ein_in)
383 | iin_v = iin_v + dt * (iin_in)
384 |
385 | #print(ein_c[0:5])
386 |
387 | pyc_v_all[:,t+1] = pyc_v
388 | pyc_c_all[:,t+1] = pyc_c
389 | ein_v_all[:,t+1] = ein_v
390 | ein_c_all[:,t+1] = ein_c
391 | iin_v_all[:,t+1] = iin_v
392 | iin_c_all[:,t+1] = iin_c
393 |
394 |
395 | # Concatenate sim time series
396 | df_all = pd.concat({'stim': pd.DataFrame(stim.T[T_transient:]),
397 | 'pyc_c': pd.DataFrame(pyc_c_all.T[T_transient:]),
398 | 'pyc_v': pd.DataFrame(pyc_v_all.T[T_transient:]),
399 | 'ein_c':pd.DataFrame(ein_c_all.T[T_transient:]),
400 | 'ein_v': pd.DataFrame(ein_v_all.T[T_transient:]),
401 | 'iin_c': pd.DataFrame(iin_c_all.T[T_transient:]),
402 | 'iin_v': pd.DataFrame(iin_v_all.T[T_transient:])}, axis=1)
403 | df_all.index = (Ts[T_transient:]-T_transient)*(Dt*10000.)
404 |
405 |
406 |
407 | return df_all
408 |
409 |
410 |
411 |
412 |
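The delayed-coupling term `ccen` inside `run_net_sim`'s integration loop combines a zero-row-sum coupling matrix with per-connection delayed history via fancy indexing. A self-contained sketch of that step, on toy sizes with random data:

```python
import numpy as np

n_nodes, t = 3, 50
rng = np.random.default_rng(0)
history = rng.standard_normal((n_nodes, t))            # node activity up to time t
weights = rng.random((n_nodes, n_nodes))               # structural connectivity
delays = rng.integers(1, 10, size=(n_nodes, n_nodes))  # delays in time steps

# Reversing time makes column d the state d steps in the past, so indexing
# with delays.T picks each connection's delayed sample in one shot
Ed = history[:, ::-1][np.arange(n_nodes), delays.T].T
L = weights - np.diag(weights.sum(1))                  # zero-row-sum coupling matrix
ccen = np.sum(L * Ed.T, axis=1)                        # delayed input summed per node
print(ccen.shape)  # (3,)
```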
--------------------------------------------------------------------------------
/tepfit/viz.py:
--------------------------------------------------------------------------------
1 | # Viz functions
2 | """
3 | miscellaneous functions and classes to visualize simulation data
4 | """
5 |
6 |
7 | """
8 | Importage
9 | """
10 |
11 | from scipy.io import loadmat
12 | # Suppress warnings; keeps things cleaner
13 | import warnings
14 | warnings.filterwarnings('ignore')
15 |
16 |
17 | # Standard scientific python import commands
18 | import os,sys,glob,numpy as np,pandas as pd,seaborn as sns
19 | sns.set_style('white')
20 |
21 | #%matplotlib inline
22 | from matplotlib import pyplot as plt
23 | import pickle
24 | from surfer import Brain
25 | from mayavi import mlab
26 | import pylab
27 | import glob
28 | import mne
29 | import os.path as op
30 | from scipy.stats import norm
31 | from scipy import stats
32 | import nibabel as nib
33 | import scipy
34 | from sklearn.metrics.pairwise import cosine_similarity
35 | import itertools
36 | import os
37 | from PIL import Image
38 | from IPython import display
39 |
40 |
41 |
42 | def plot_source_activity(data, fwd, noise_cov, inv, real_src_path, annot2use, subjects_dir, subject,
43 | eeg_dir, method='dSPM', hemi='both', clim= 'auto',
44 | time_viewer=True, views=['dorsal'],
45 | surface='inflated',
46 | alpha=0.5, size=(800, 400)):
47 |
48 | '''
49 |     Compute and plot simulated source time series using MNE.
50 |
51 |
52 | Parameters
53 | ----------
54 | data = np.array |
55 | simulated timeseries
56 | array (dim: number of regions X time)
57 | fwd = str |
58 | path to Forward solution
59 | noise_cov = str |
60 | path to noise_cov
61 | inv = str |
62 | path to Inverse solution
63 | real_src_path = str |
64 | Path to a source-estimate file
65 | annot2use = str |
66 |         name of the parcellation to use (e.g. Schaefer2018_200Parcels_7Networks_order.annot)
67 | subjects_dir = str |
68 | path to the subjects' folders
69 | subject = str |
70 | subject folder name that you want to plot
71 | eeg_dir = str |
72 | path to eegfile
73 | method (optional) = str |
74 | Use minimum norm (e.g. dSPM, sLORETA, or eLORETA)
75 | Default is dSPM
76 | hemi (optional)= str |
77 |         (i.e. 'lh', 'rh', 'both', or 'split'). In the case
78 | of 'both', both hemispheres are shown in the same window.
79 |         In the case of 'split', hemispheres are displayed side-by-side
80 | in different viewing panes
81 | Default is both
82 | clim (optional)= str | dict
83 |         Colorbar properties specification. If 'auto', set clim automatically
84 |         based on data percentiles. If dict, should contain:
85 |         - kind : 'value' | 'percent'
86 |             Flag to specify type of limits.
87 |         - lims : list | np.ndarray | tuple of float, 3 elements
88 |             Lower, middle, and upper bounds for colormap.
89 |         - pos_lims : list | np.ndarray | tuple of float, 3 elements
90 |             Lower, middle, and upper bound for colormap. Positive values will be
91 |             mirrored directly across zero during colormap construction to obtain
92 |             negative control points.
93 | time_viewer (optional) = bool | str
94 |         Display time viewer GUI. Can be 'auto', which is True for
95 | the PyVista backend and False otherwise.
96 | Default is True
97 | views (optional) = str | list
98 | View to use. Can be any of:
99 | ['lateral', 'medial', 'rostral', 'caudal', 'dorsal', 'ventral',
100 | 'frontal', 'parietal', 'axial', 'sagittal', 'coronal']
101 | Three letter abbreviations (e.g., 'lat') are also supported.
102 | Using multiple views (list) is not supported for mpl backend.
103 | Default is 'dorsal'
104 | surface= str |
105 |         freesurfer surface mesh name (i.e. 'white', 'inflated', etc.).
106 | Default is inflated
107 | alpha (optional) = float |
108 |         Alpha value to apply globally to the surface meshes.
109 | Default is 0.5
110 | size (optional) = float or tuple of float |
111 | The size of the window, in pixels. can be one number to specify a
112 | square window, or the (width, height) of a rectangular window.
113 | Default is (800, 400)
114 |
115 | '''
116 | #Importing empirical source estimate
117 | stcs = mne.read_source_estimate(real_src_path)
118 |
119 |
120 | #Computing regmap for a specific parcellation
121 |     lh_regmap = nib.freesurfer.io.read_annot(op.join(subjects_dir,
122 |                                              subject, 'label', 'lh.' + annot2use))[0]
123 |     rh_regmap = nib.freesurfer.io.read_annot(op.join(subjects_dir,
124 |                                              subject, 'label', 'rh.' + annot2use))[0]
125 | rh_regmap_fixed = rh_regmap+np.max(lh_regmap)
126 | rh_regmap_fixed[rh_regmap_fixed == np.max(lh_regmap)] = 0
127 | lh_vtx2use= lh_regmap[stcs.vertices[0]]
128 | rh_vtx2use= rh_regmap_fixed[stcs.vertices[1]]
129 | regions_idx = np.concatenate([lh_vtx2use, rh_vtx2use])
130 |
131 |
132 | #Creating the leadfield matrix to use down the line
133 | leadfield = fwd['sol']['data']
134 | fwd_fixed = mne.convert_forward_solution(fwd, surf_ori=True, force_fixed=True,
135 | use_cps=True)
136 | leadfield = fwd_fixed['sol']['data']
137 |
138 | new_leadfield = []
139 | for xx in range(1,np.unique(regions_idx).shape[0]):
140 | new_leadfield.append(np.mean(np.squeeze(leadfield[:,np.where(regions_idx==xx)]),axis=1))
141 |
142 | new_leadfield = np.array(new_leadfield).T
143 |
144 |
145 | #Moving timeseries to EEG space using the leadfield matrix
146 | EEG = new_leadfield.dot(data).T # EEG shape [n_samples x n_eeg_channels]
147 |     # reference is mean signal; transposing because trailing dimensions of the arrays must agree
148 | EEG = (EEG.T - EEG.mean(axis=1).T).T
149 | EEG = EEG - EEG.mean(axis=0) # center EEG
150 | simulated_EEG = EEG.T
151 |
152 |
153 | #Importing empirical EEG and creating an mne EEG object for simulated timeseries
154 | epochs = mne.read_epochs(eeg_dir)
155 | evoked = epochs.average()
156 | sim_EEG = evoked.copy()
157 | sim_EEG.data = simulated_EEG*1e-5
158 | sim_EEG.times = sim_EEG.times[:simulated_EEG.data.shape[1]]
159 |
160 |
161 | #Computing source for simulated timeseries
162 | inv_sim = mne.minimum_norm.make_inverse_operator(sim_EEG.info, forward=fwd, noise_cov=noise_cov)
163 | stc_sim = mne.minimum_norm.apply_inverse(sim_EEG, inv, method=method)
164 |
165 |
166 | #Plotting simulated source timeseries
167 | mne.viz.set_3d_backend("notebook")
168 | subjects_dir = subjects_dir
169 | subject = subject
170 | os.environ['SUBJECTS_DIR'] = subjects_dir
171 | brain = stc_sim.plot(subjects_dir=subjects_dir, subject=subject, clim=clim,
172 | hemi=hemi, time_viewer=time_viewer, views=views,
173 | surface=surface, alpha=alpha, size=size)
174 |
175 | return stc_sim, sim_EEG
176 |
177 |
178 |
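The forward-projection pipeline inside `plot_source_activity` (region-averaging the vertex-level leadfield, projecting region time series to sensors, then average-referencing) can be exercised on toy arrays; the region labels here are hypothetical, with label 0 playing the role of the skipped 'unknown' region:

```python
import numpy as np

n_channels, n_vertices, n_regions, n_times = 4, 12, 3, 10
rng = np.random.default_rng(1)
leadfield = rng.standard_normal((n_channels, n_vertices))
# hypothetical region label per vertex; label 0 ('unknown') is skipped
regions_idx = np.repeat(np.arange(n_regions + 1), 3)

# Average leadfield columns within each region, as in the function above
new_leadfield = []
for xx in range(1, np.unique(regions_idx).shape[0]):
    new_leadfield.append(leadfield[:, regions_idx == xx].mean(axis=1))
new_leadfield = np.array(new_leadfield).T            # channels x regions

# Project simulated region time series to sensors and re-reference
data = rng.standard_normal((n_regions, n_times))
EEG = new_leadfield.dot(data).T                      # samples x channels
EEG = (EEG.T - EEG.mean(axis=1).T).T                 # average reference per sample
EEG = EEG - EEG.mean(axis=0)                         # demean each channel
simulated_EEG = EEG.T                                # channels x samples
print(simulated_EEG.shape)  # (4, 10)
```

After both referencing steps, each sample sums to zero across channels and each channel has zero mean over time.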
179 |
180 | def comparison_empirical_and_simulation(data, fwd, noise_cov, inv,
181 | real_src_path, annot2use,
182 | subjects_dir, subject,
183 | eeg_dir, time_range, space='source',
184 | metric2use= 'correlation',
185 | movie= False, windows=None,
186 | method='dSPM', opacity=0.5,
187 | colormap='coolwarm', views=['dorsal'],
188 | clim=(0, 0.02), distance=600.0,
189 | cortex='ivory'):
190 |
191 |
192 | '''
193 |     Compute and plot a comparison between empirical and simulated data,
194 |     either in source space or in EEG (sensor) space
195 |
196 |
197 | Parameters
198 | ----------
199 | data = np.array |
200 | simulated timeseries
201 | array (dim: number of regions X time)
202 | fwd = str |
203 | path to Forward solution
204 | noise_cov = str |
205 | path to noise_cov
206 | inv = str |
207 | path to Inverse solution
208 | real_src_path = str |
209 | Path to a source-estimate file
210 | annot2use = str |
211 |         name of the parcellation to use (e.g. Schaefer2018_200Parcels_7Networks_order.annot)
212 | subjects_dir = str |
213 | path to the subjects' folders
214 | subject = str |
215 | subject folder name that you want to plot
216 | eeg_dir = str |
217 | path to eegfile
218 |     time_range = tuple of float |
219 |         time interval for the timeseries comparison in seconds
220 |         (e.g. (-0.5, -0.1))
221 | space (optional) = str |
222 | space where the comparison analysis will be performed. Options are
223 | 'source' or 'eeg'
224 | Default is 'source'
225 | metric2use (optional) = str |
226 |         use any of the following:
227 |         - correlation: standard Pearson correlation coefficient
228 |         - cosine: cosine similarity
229 |         - cross_correlation: cross-correlation
230 |         Default is 'correlation'
231 | movie (optional) = bool |
232 |         if True, a dynamic comparison over time will be computed and
233 |         exported as a gif movie
234 | Default is False
235 |     windows (optional) = int |
236 | only if movie is True. Specify the time resolution of the dynamic
237 | comparison
238 | Default is None
239 | method (optional) = str |
240 | Use minimum norm (e.g. dSPM, sLORETA, or eLORETA)
241 | Default is dSPM
242 | opacity (optional) = float in [0, 1] |
243 | Value to control opacity of the cortical surface
244 | Default is 0.5
245 | colormap (optional) = str |
246 | matplotlib colormap to be used in the visualization. Options can be
247 |         found here: https://matplotlib.org/stable/tutorials/colors/colormaps.html
248 | Default is 'coolwarm'
249 | views (optional) = str | list
250 | View to use. Can be any of:
251 | ['lateral', 'medial', 'rostral', 'caudal', 'dorsal', 'ventral',
252 | 'frontal', 'parietal', 'axial', 'sagittal', 'coronal']
253 | Three letter abbreviations (e.g., 'lat') are also supported.
254 | Using multiple views (list) is not supported for mpl backend.
255 | Default is 'dorsal'
256 |     clim (optional) = tuple of float |
257 |         (vmin, vmax) values for the colorbar.
258 |         Default is (0, 0.02)
259 |     distance (optional) = float |
260 |         camera distance for the 3D view (passed to mlab.view). Default is 600.0
261 | cortex (optional) = str, tuple, dict, or None |
262 | Specifies how the cortical surface is rendered. Options:
263 |         1) The name of one of the preset cortex styles: 'classic',
264 | 'high_contrast', 'low_contrast', or 'bone'.
265 | 2) A color-like argument to render the cortex as a single color,
266 | e.g. 'red' or (0.1, 0.4, 1.). Setting this to None is equivalent
267 | to (0.5, 0.5, 0.5).
268 | 3) The name of a colormap used to render binarized curvature values, e.g., Grays.
269 | 4) A list of colors used to render binarized curvature values.
270 |         Only the first and last colors are used. E.g., ['red', 'blue'] or
271 |         [(1, 0, 0), (0, 0, 1)].
272 |         5) A container with four entries for colormap (string specifying
273 |         the name of a colormap), vmin (float specifying the minimum value
274 |         for the colormap), vmax (float specifying the maximum value for
275 |         the colormap), and reverse (bool specifying whether the colormap
276 |         should be reversed). E.g., ('Greys', -1, 2, False).
277 | 6) A dict of keyword arguments that is passed on to the call to surface.
278 | Default is 'ivory'
279 |
280 |
281 | '''
282 |
283 |
284 | def NormalizeData(data):
285 | return (data - np.min(data)) / (np.max(data) - np.min(data))
286 | #Importing empirical source estimate
287 |
288 | stcs = mne.read_source_estimate(real_src_path)
289 |
290 |
291 | #Computing regmap for a specific parcellation
292 |     lh_regmap = nib.freesurfer.io.read_annot(op.join(subjects_dir,
293 |                                              subject, 'label', 'lh.' + annot2use))[0]
294 |     rh_regmap = nib.freesurfer.io.read_annot(op.join(subjects_dir,
295 |                                              subject, 'label', 'rh.' + annot2use))[0]
296 | rh_regmap_fixed = rh_regmap+np.max(lh_regmap)
297 | rh_regmap_fixed[rh_regmap_fixed == np.max(lh_regmap)] = 0
298 | lh_vtx2use= lh_regmap[stcs.vertices[0]]
299 | rh_vtx2use= rh_regmap_fixed[stcs.vertices[1]]
300 | regions_idx = np.concatenate([lh_vtx2use, rh_vtx2use])
301 |
302 |
303 | #Creating the leadfield matrix to use down the line
304 | leadfield = fwd['sol']['data']
305 | fwd_fixed = mne.convert_forward_solution(fwd, surf_ori=True, force_fixed=True,
306 | use_cps=True)
307 | leadfield = fwd_fixed['sol']['data']
308 |
309 | new_leadfield = []
310 | for xx in range(1,np.unique(regions_idx).shape[0]):
311 | new_leadfield.append(np.mean(np.squeeze(leadfield[:,np.where(regions_idx==xx)]),axis=1))
312 |
313 | new_leadfield = np.array(new_leadfield).T
314 |
315 |
316 | #Moving timeseries to EEG space using the leadfield matrix
317 | EEG = new_leadfield.dot(data).T # EEG shape [n_samples x n_eeg_channels]
318 |     # reference is mean signal; transposing because trailing dimensions of the arrays must agree
319 | EEG = (EEG.T - EEG.mean(axis=1).T).T
320 | EEG = EEG - EEG.mean(axis=0) # center EEG
321 | simulated_EEG = EEG.T
322 |
323 |
324 | #Importing empirical EEG and creating an mne EEG object for simulated timeseries
325 | epochs = mne.read_epochs(eeg_dir)
326 | evoked = epochs.average()
327 | sim_EEG = evoked.copy()
328 | sim_EEG.data = simulated_EEG*1e-5
329 | sim_EEG.times = sim_EEG.times[:simulated_EEG.data.shape[1]]
330 |
331 |
332 | #Computing source for simulated timeseries
333 | inv_sim = mne.minimum_norm.make_inverse_operator(sim_EEG.info, forward=fwd, noise_cov=noise_cov)
334 | stc_sim = mne.minimum_norm.apply_inverse(sim_EEG, inv, method=method)
335 |
336 | if space=='source':
337 | #Comparing empirical data and simulated data in source space
338 |
339 | if movie:
340 | starting_pnt = (np.abs(epochs.times - (time_range[0]))).argmin()
341 | ending_pnt = (np.abs(epochs.times - (time_range[1]))).argmin()
342 | window_size = int((ending_pnt - starting_pnt) / windows)
343 | source_corr=np.zeros((stc_sim.data.shape[0], windows))
344 | source_similarity=np.zeros((stc_sim.data.shape[0], windows))
345 | source_cross=np.zeros((stc_sim.data.shape[0], windows))
346 | for tt in range(windows):
347 | X_simulated = np.abs(scipy.stats.zscore(stc_sim.data[:,starting_pnt:starting_pnt+window_size]))
348 | X_empirical = np.abs(scipy.stats.zscore(stcs.data[:,starting_pnt:starting_pnt+window_size]))
349 |
350 | for bb in range(X_simulated.shape[0]):
351 | source_corr[bb,tt]= (np.corrcoef(X_simulated[bb], X_empirical[bb])[1][0])
352 | source_similarity[bb,tt]=(cosine_similarity(X_simulated[bb].reshape(1, -1), X_empirical[bb].reshape(1, -1))[0][0])
353 |                     source_cross[bb,tt]=np.mean(np.correlate(X_simulated[bb], X_empirical[bb], mode='same'))
354 |
355 | starting_pnt=starting_pnt+window_size
356 |
357 | source_cross = NormalizeData(source_cross)
358 |
359 |
360 | starting_pnt = (np.abs(epochs.times - (time_range[0]))).argmin()
361 | ending_pnt = (np.abs(epochs.times - (time_range[1]))).argmin()
362 | for cc in range(source_similarity.shape[1]):
363 | if metric2use=='correlation':
364 | data2plot = source_corr[:,cc]
365 | elif metric2use=='cosine':
366 | data2plot = source_similarity[:,cc]
367 | elif metric2use=='cross_correlation':
368 | data2plot = source_cross[:,cc]
369 |
370 |             #Left Hemisphere
371 | lh_correlation = np.zeros((lh_regmap.shape[0]))
372 |
373 | for xx in range(1,np.unique(lh_regmap).shape[0]):
374 | lh_correlation[np.where(lh_regmap==xx)[0]] = data2plot[xx-1]
375 |
376 |             #Right Hemisphere
377 | rh_correlation = np.zeros((rh_regmap.shape[0]))
378 |
379 | for xx in range(1,np.unique(rh_regmap).shape[0]):
380 | rh_correlation[np.where(rh_regmap==xx)[0]] = data2plot[xx+100-1]
381 |
382 | brain = Brain(subject, subjects_dir=subjects_dir, hemi='both', surf='pial', cortex=cortex, alpha=opacity,
383 | views=views)
384 | brain.add_data(lh_correlation, min=clim[0], max=clim[1], hemi='lh', colormap=colormap)
385 | brain.add_data(rh_correlation, min=clim[0], max=clim[1], hemi='rh', colormap=colormap)
386 | brain.add_text(x=0.1, y=0.9, text=str(epochs.times[starting_pnt]) + 'ms/' + str(epochs.times[starting_pnt+window_size]) + 'ms',
387 | name=str(epochs.times[starting_pnt]) + 'ms/' + str(epochs.times[starting_pnt+window_size]) + 'ms',
388 | font_size=20)
389 |
390 | starting_pnt=starting_pnt+window_size
391 | file = 'simulation_surface%03.0f.png' %cc
392 | mlab.view(distance=distance)
393 | mlab.savefig(file)
394 | mlab.close(all=True)
395 |
396 | #Making a gif
397 | fp_in = op.join(os.getcwd() + "/simulation_surface*")
398 | fp_out = op.join('source_' + metric2use + '.gif')
399 | img, *imgs = [Image.open(f) for f in sorted(glob.glob(fp_in))]
400 | img.save(fp=fp_out, format='GIF', append_images=imgs,
401 | save_all=True, duration=200, loop=0)
402 |
403 | for filename in glob.glob("simulation_surface*"):
404 | os.remove(filename)
405 |
406 | source_comparison = {"cosine": source_similarity,
407 | "correlation": source_corr,
408 | "cross_correlation": source_cross}
409 |
410 |
411 | return source_comparison, display.Image(filename=fp_out,embed=True)
412 |
413 | else:
414 | starting_pnt = (np.abs(epochs.times - (time_range[0]))).argmin()
415 | ending_pnt = (np.abs(epochs.times - (time_range[1]))).argmin()
416 | X_simulated = np.abs(scipy.stats.zscore(stc_sim.data[:,starting_pnt:ending_pnt]))
417 | X_empirical = np.abs(scipy.stats.zscore(stcs.data[:,starting_pnt:ending_pnt]))
418 |
419 | source_corr=[]
420 | source_similarity=[]
421 | source_cross=[]
422 |
423 | for bb in range(X_simulated.shape[0]):
424 | source_corr.append(np.corrcoef(X_simulated[bb], X_empirical[bb])[1][0])
425 | source_similarity.append(cosine_similarity(X_simulated[bb].reshape(1, -1), X_empirical[bb].reshape(1, -1))[0][0])
426 |                 source_cross.append(np.correlate(X_simulated[bb], X_empirical[bb], mode='same'))
427 |
428 | source_corr = np.array(source_corr)
429 | source_similarity = np.array(source_similarity)
430 | source_cross = NormalizeData(np.array(np.mean(source_cross, axis=1)))
431 |
432 | if metric2use=='correlation':
433 | data2plot = source_corr
434 | elif metric2use=='cosine':
435 | data2plot = source_similarity
436 | elif metric2use=='cross_correlation':
437 | data2plot = source_cross
438 |
439 |             #Left Hemisphere
440 | lh_correlation = np.zeros((lh_regmap.shape[0]))
441 |
442 | for xx in range(1,np.unique(lh_regmap).shape[0]):
443 | lh_correlation[np.where(lh_regmap==xx)[0]] = data2plot[xx-1]
444 |
445 |             #Right Hemisphere
446 | rh_correlation = np.zeros((rh_regmap.shape[0]))
447 |
448 | for xx in range(1,np.unique(rh_regmap).shape[0]):
449 | rh_correlation[np.where(rh_regmap==xx)[0]] = data2plot[xx+100-1]
450 |
451 | brain = Brain(subject, subjects_dir=subjects_dir, hemi='both', surf='pial', cortex=cortex, alpha=opacity,
452 | views=views)
453 | brain.add_data(lh_correlation, min=clim[0], max=clim[1], hemi='lh', colormap=colormap)
454 | brain.add_data(rh_correlation, min=clim[0], max=clim[1], hemi='rh', colormap=colormap)
455 | brain.add_text(x=0.1, y=0.9, text='average ' + str(time_range[0]) + 'ms/' + str(time_range[1]) + 'ms',
456 | name='average ' + str(time_range[0]) + 'ms/' + str(time_range[1]) + 'ms',
457 | font_size=20)
458 |
459 | mlab.view(distance=distance)
460 | mlab.show()
461 |
462 | return data2plot
463 |
464 | elif space=='eeg':
465 |
466 | if movie:
467 | starting_pnt = (np.abs(epochs.times - (time_range[0]))).argmin()
468 | ending_pnt = (np.abs(epochs.times - (time_range[1]))).argmin()
469 | window_size = int((ending_pnt - starting_pnt) / windows)
470 | source_corr=np.zeros((sim_EEG.data.shape[0], windows))
471 | eeg_similarity=np.zeros((sim_EEG.data.shape[0], windows))
472 | eeg_corr=np.zeros((sim_EEG.data.shape[0], windows))
473 | eeg_cross=np.zeros((sim_EEG.data.shape[0], windows))
474 | #Loop over windows number
475 | for tt in range(windows):
476 | X_simulated = np.abs(scipy.stats.zscore(sim_EEG.data[:,starting_pnt:starting_pnt+window_size]))
477 | X_empirical = np.abs(scipy.stats.zscore(evoked.data[:,starting_pnt:starting_pnt+window_size]))
478 | #Loop over electrodes
479 | for bb in range(X_simulated.shape[0]):
480 | eeg_corr[bb,tt]= (np.corrcoef(X_simulated[bb], X_empirical[bb])[1][0])
481 | eeg_similarity[bb,tt]=(cosine_similarity(X_simulated[bb].reshape(1, -1), X_empirical[bb].reshape(1, -1))[0][0])
482 |                     eeg_cross[bb,tt]=np.mean(np.correlate(X_simulated[bb], X_empirical[bb], mode='same'))
483 |
484 | starting_pnt=starting_pnt+window_size
485 |
486 |             eeg_cross = NormalizeData(eeg_cross)
487 |
488 | standard_1020_montage = mne.channels.make_standard_montage('standard_1020')
489 |
490 | ch_names=evoked.info['ch_names']
491 | ch_names_lower = np.array([x.lower() if isinstance(x, str) else x for x in ch_names])
492 |
493 | standard_1020_ch_names_lower = np.array([x.lower() if isinstance(x, str)\
494 | else x for x in standard_1020_montage.ch_names]).tolist()
495 |
496 | ch_idx = []
497 | for xx in range(len(ch_names_lower)):
498 | ch_idx.append(np.where(np.array(ch_names_lower[xx])==np.array(standard_1020_ch_names_lower))[0][0])
499 |
500 | ch_idx = np.array(ch_idx)
501 |
502 |
503 | standard_1020_montage_ch_pos = []
504 | for xx in standard_1020_montage.get_positions()['ch_pos'].keys():
505 | idx_tmp = np.where(np.array(list(standard_1020_montage.get_positions()['ch_pos'].keys())) == xx)[0][0]
506 | standard_1020_montage_ch_pos.append(standard_1020_montage.get_positions()['ch_pos'][xx])
507 |
508 | standard_1020_montage_ch_pos = np.array(standard_1020_montage_ch_pos)
509 |
510 | standard_1020_montage_ch_pos_filtered = standard_1020_montage_ch_pos[ch_idx]*1000
511 |
512 | colormap=colormap
513 | color_map=plt.get_cmap(colormap)
514 |
515 |
516 | opacity=opacity
517 |
518 |
519 | surf='outer_skin_surface'
520 | hemi='rh'
521 |
522 | starting_pnt = (np.abs(epochs.times - (time_range[0]))).argmin()
523 | ending_pnt = (np.abs(epochs.times - (time_range[1]))).argmin()
524 | for cc in range(eeg_similarity.shape[1]):
525 | if metric2use=='correlation':
526 | data2plot = eeg_corr[:,cc]
527 | elif metric2use=='cosine':
528 | data2plot = eeg_similarity[:,cc]
529 | elif metric2use=='cross_correlation':
530 | data2plot = eeg_cross[:,cc]
531 |
532 | coords = standard_1020_montage_ch_pos_filtered
533 | new_coords = coords.copy()
534 | new_scale_factor = data2plot.copy()
535 |
536 | brain = Brain(subject, subjects_dir=subjects_dir, hemi=hemi, surf=surf, views=views,
537 | cortex=cortex, alpha=opacity)
538 |
539 | hemi = 'rh'
540 | for roi in range(new_coords.shape[0]):
541 | rgba = color_map(data2plot[roi])
542 | brain.add_foci(new_coords[roi], color=(rgba[0], rgba[1], rgba[2]), scale_factor=new_scale_factor[roi],hemi=hemi)
543 |
544 | brain.add_text(x=0.1, y=0.9, text=str(epochs.times[starting_pnt]) + 'ms/' + str(epochs.times[starting_pnt+window_size]) + 'ms',
545 | name=str(epochs.times[starting_pnt]) + 'ms/' + str(epochs.times[starting_pnt+window_size]) + 'ms',
546 | font_size=20)
547 |
548 | starting_pnt=starting_pnt+window_size
549 | file = 'simulation_surface%03.0f.png' %cc
550 | mlab.view(distance=distance)
551 | mlab.savefig(file)
552 | mlab.close(all=True)
553 |
554 | #Making a gif
555 | fp_in = op.join(os.getcwd() + "/simulation_surface*")
556 | fp_out = op.join('eeg_' + metric2use + '.gif')
557 | img, *imgs = [Image.open(f) for f in sorted(glob.glob(fp_in))]
558 | img.save(fp=fp_out, format='GIF', append_images=imgs,
559 | save_all=True, duration=200, loop=0)
560 |
561 | for filename in glob.glob("simulation_surface*"):
562 | os.remove(filename)
563 |
564 | eeg_comparison = {"cosine": eeg_similarity,
565 | "correlation": eeg_corr,
566 | "cross_correlation": eeg_cross}
567 |
568 |
569 | return eeg_comparison, display.Image(filename=fp_out,embed=True)
570 |
571 |
572 |
573 | else:
574 |
575 | #Comparing empirical data and simulated data in eeg space
576 | starting_pnt = (np.abs(epochs.times - (time_range[0]))).argmin()
577 | ending_pnt = (np.abs(epochs.times - (time_range[1]))).argmin()
578 |
579 | X_simulated = sim_EEG.data[:,starting_pnt:ending_pnt]
580 | Y_empirical = evoked.data[:,starting_pnt:ending_pnt]
581 |
582 | eeg_corr=[]
583 | eeg_similarity=[]
584 | eeg_cross=[]
585 |
586 | for bb in range(X_simulated.shape[0]):
587 | eeg_corr.append(np.corrcoef(X_simulated[bb], Y_empirical[bb])[1][0])
588 | eeg_similarity.append(cosine_similarity(X_simulated[bb].reshape(1, -1), Y_empirical[bb].reshape(1, -1))[0][0])
589 | eeg_cross.append(scipy.signal.correlate(X_simulated[bb], Y_empirical[bb], mode='same'))  # scipy.correlate was removed from the top-level namespace
590 |
591 | eeg_corr = np.array(eeg_corr)
592 | eeg_similarity = np.array(eeg_similarity)
593 | eeg_cross = np.array(eeg_cross)
594 | # Look up electrode coordinates from the standard 10-20 montage
595 | standard_1020_montage = mne.channels.make_standard_montage('standard_1020')
596 |
597 | ch_names=evoked.info['ch_names']
598 | ch_names_lower = np.array([x.lower() if isinstance(x, str) else x for x in ch_names])
599 |
600 | standard_1020_ch_names_lower = np.array([x.lower() if isinstance(x, str)\
601 | else x for x in standard_1020_montage.ch_names]).tolist()
602 |
603 | ch_idx = []
604 | for xx in range(len(ch_names_lower)):
605 | ch_idx.append(standard_1020_ch_names_lower.index(ch_names_lower[xx]))
606 |
607 | ch_idx = np.array(ch_idx)
608 |
609 | # Collect electrode positions in montage order
610 | standard_1020_montage_ch_pos = []
611 | for ch_pos in standard_1020_montage.get_positions()['ch_pos'].values():
612 | standard_1020_montage_ch_pos.append(ch_pos)
613 |
614 |
615 | standard_1020_montage_ch_pos = np.array(standard_1020_montage_ch_pos)
616 |
617 | standard_1020_montage_ch_pos_filtered = standard_1020_montage_ch_pos[ch_idx]*1000  # montage positions are in metres; scale to mm for plotting
618 |
619 |
620 |
621 | # Resolve the colormap name to a matplotlib colormap object
622 | color_map = plt.get_cmap(colormap)
623 |
624 |
625 |
626 |
627 |
628 | surf='outer_skin_surface'
629 | hemi='rh'
630 |
631 |
632 | if metric2use=='correlation':
633 | data2plot = NormalizeData(eeg_corr)
634 | elif metric2use=='cosine':
635 | data2plot = NormalizeData(eeg_similarity)
636 | elif metric2use=='cross_correlation':
637 | # eeg_cross is (channels x lags); reduce to one value per channel before plotting
638 | data2plot = NormalizeData(eeg_cross.max(axis=1))
639 | coords = standard_1020_montage_ch_pos_filtered
640 | new_coords = coords.copy()
641 | new_scale_factor = data2plot.copy()
642 |
643 | brain = Brain(subject, subjects_dir=subjects_dir, hemi=hemi, surf=surf, views=views,
644 | cortex=cortex, alpha=opacity)
645 |
646 |
647 | for roi in range(new_coords.shape[0]):
648 | rgba = color_map(data2plot[roi])
649 | brain.add_foci(new_coords[roi], color=(rgba[0], rgba[1], rgba[2]), scale_factor=new_scale_factor[roi], hemi=hemi)
650 |
651 | brain.add_text(x=0.1, y=0.9, text='average ' + str(time_range[0]) + 'ms/' + str(time_range[1]) + 'ms',
652 | name='average ' + str(time_range[0]) + 'ms/' + str(time_range[1]) + 'ms',
653 | font_size=20)
654 | mlab.view(distance=distance)
655 | mlab.show()
656 |
657 | return data2plot
658 |
--------------------------------------------------------------------------------