├── .gitignore
├── custom_style.css
├── datasets
│   ├── gforces.txt
│   ├── gforces_description.txt
│   ├── neuron2.txt
│   ├── planes.txt
│   ├── planes_description.txt
│   ├── psychophysics.txt
│   ├── psychophysics_description.txt
│   ├── sleep.txt
│   └── sleep_description.txt
├── environment.yml
├── index.ipynb
├── lectures
│   ├── ANOVA.ipynb
│   ├── Interactions.ipynb
│   ├── blah.pdf
│   ├── dataframes_plots.ipynb
│   ├── design_matrices.ipynb
│   ├── homework_solution_1.ipynb
│   ├── link_functions.ipynb
│   └── pics_for_lectures
│       ├── hindenburg.jpg
│       ├── hot_air_balloon.jpg
│       ├── interactions_article.png
│       ├── paperplane.jpg
│       ├── pics_for_lectures.001.png
│       ├── pics_for_lectures.002.png
│       ├── resid-plots.gif
│       ├── stimulus.png
│       ├── tidy_data.png
│       └── topgun.jpg
├── readme.md
└── setup_python.ipynb
/.gitignore:
--------------------------------------------------------------------------------
1 | *-checkpoint.ipynb
--------------------------------------------------------------------------------
/custom_style.css:
--------------------------------------------------------------------------------
1 |
69 |
--------------------------------------------------------------------------------
/datasets/gforces.txt:
--------------------------------------------------------------------------------
1 | Subject Age Signs
2 | JW 39 0
3 | JM 42 1
4 | DT 20 0
5 | LK 37 1
6 | JK 20 1
7 | MK 21 0
8 | FP 41 1
9 | DG 52 1
10 |
--------------------------------------------------------------------------------
/datasets/gforces_description.txt:
--------------------------------------------------------------------------------
1 | # G-Induced Loss of Consciousness
2 |
3 | Source: http://www.statsci.org/data/general/gforces.html
4 |
5 | Keywords: binary data, binomial regression
6 |
7 | ## Description
8 |
9 | Military pilots sometimes black out when their brains are deprived of oxygen due to G-forces during violent maneuvers. Glaister and Miller (1990) produced similar symptoms by exposing volunteers’ lower bodies to negative air pressure, likewise decreasing oxygen to the brain. The data lists the subjects' ages and whether they showed signs related to syncopal blackout (pallor, sweating, slow heartbeat, unconsciousness) during an 18-minute period.
10 |
11 | Variable Description
12 | Subject Initials of the subject's name
13 | Age Subject's age in years
14 | Signs Whether subject showed blackout-related signs (0=No, 1=Yes)
15 |
16 | ## Source
17 |
18 | Glaister and Miller (1990). Aviation, Space and Environmental Medicine.
19 |
20 | Hamilton, L. C. (1992). Regression with Graphics. Duxbury, page 243.
21 |
22 | ## Analysis
23 |
24 | Regress the binary response on the single continuous covariate (age). The age effect is not statistically significant, but the example is simple and easy to understand.
25 |
26 |
--------------------------------------------------------------------------------
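A minimal sketch of the analysis suggested in `gforces_description.txt` above, using the course's Python stack (statsmodels' formula API). The tab separator and the repository-relative path are assumptions:

```python
# Hypothetical sketch: binomial GLM (logistic regression) of Signs on Age,
# as suggested in the Analysis section of the description file.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# assumes a tab-separated file, as the lectures assume for planes.txt
gforces = pd.read_csv('datasets/gforces.txt', sep='\t')

# model P(Signs = 1) as a function of Age with a logit link
fit = smf.glm('Signs ~ Age', data=gforces,
              family=sm.families.Binomial()).fit()
print(fit.summary())  # the Age effect is not expected to reach significance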
/datasets/neuron2.txt:
--------------------------------------------------------------------------------
1 | spikes contrast orientation
2 | 3.0000000e+00 1.0000000e+00 0.0000000e+00
3 | 6.0000000e+00 1.0000000e+00 0.0000000e+00
4 | 3.0000000e+00 1.0000000e+00 0.0000000e+00
5 | 3.0000000e+00 1.0000000e+00 0.0000000e+00
6 | 6.0000000e+00 1.0000000e+00 0.0000000e+00
7 | 2.0000000e+00 1.0000000e+00 0.0000000e+00
8 | 4.0000000e+00 1.0000000e+00 0.0000000e+00
9 | 2.0000000e+00 1.0000000e+00 0.0000000e+00
10 | 4.0000000e+00 1.0000000e+00 0.0000000e+00
11 | 4.0000000e+00 1.0000000e+00 0.0000000e+00
12 | 4.0000000e+00 1.0000000e+00 0.0000000e+00
13 | 5.0000000e+00 1.0000000e+00 0.0000000e+00
14 | 1.1000000e+01 1.0000000e+00 0.0000000e+00
15 | 6.0000000e+00 1.0000000e+00 0.0000000e+00
16 | 7.0000000e+00 1.0000000e+00 0.0000000e+00
17 | 0.0000000e+00 1.0000000e+00 0.0000000e+00
18 | 6.0000000e+00 1.0000000e+00 0.0000000e+00
19 | 3.0000000e+00 1.0000000e+00 0.0000000e+00
20 | 2.0000000e+00 1.0000000e+00 0.0000000e+00
21 | 1.0000000e+00 1.0000000e+00 0.0000000e+00
22 | 4.0000000e+00 1.0000000e+00 0.0000000e+00
23 | 1.0000000e+00 1.0000000e+00 0.0000000e+00
24 | 6.0000000e+00 1.0000000e+00 0.0000000e+00
25 | 6.0000000e+00 1.0000000e+00 0.0000000e+00
26 | 7.0000000e+00 1.0000000e+00 0.0000000e+00
27 | 6.0000000e+00 1.0000000e+00 0.0000000e+00
28 | 7.0000000e+00 1.0000000e+00 0.0000000e+00
29 | 5.0000000e+00 1.0000000e+00 0.0000000e+00
30 | 1.1000000e+01 1.0000000e+00 0.0000000e+00
31 | 6.0000000e+00 1.0000000e+00 0.0000000e+00
32 | 3.0000000e+00 1.0000000e+00 0.0000000e+00
33 | 7.0000000e+00 1.0000000e+00 0.0000000e+00
34 | 1.0000000e+00 1.0000000e+00 0.0000000e+00
35 | 6.0000000e+00 1.0000000e+00 0.0000000e+00
36 | 3.0000000e+00 1.0000000e+00 0.0000000e+00
37 | 4.0000000e+00 1.0000000e+00 0.0000000e+00
38 | 9.0000000e+00 1.0000000e+00 0.0000000e+00
39 | 2.0000000e+00 1.0000000e+00 0.0000000e+00
40 | 5.0000000e+00 1.0000000e+00 0.0000000e+00
41 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
42 | 1.0000000e+00 1.0000000e+01 0.0000000e+00
43 | 4.0000000e+00 1.0000000e+01 0.0000000e+00
44 | 5.0000000e+00 1.0000000e+01 0.0000000e+00
45 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
46 | 4.0000000e+00 1.0000000e+01 0.0000000e+00
47 | 2.0000000e+00 1.0000000e+01 0.0000000e+00
48 | 1.0000000e+00 1.0000000e+01 0.0000000e+00
49 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
50 | 2.0000000e+00 1.0000000e+01 0.0000000e+00
51 | 6.0000000e+00 1.0000000e+01 0.0000000e+00
52 | 8.0000000e+00 1.0000000e+01 0.0000000e+00
53 | 8.0000000e+00 1.0000000e+01 0.0000000e+00
54 | 6.0000000e+00 1.0000000e+01 0.0000000e+00
55 | 8.0000000e+00 1.0000000e+01 0.0000000e+00
56 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
57 | 5.0000000e+00 1.0000000e+01 0.0000000e+00
58 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
59 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
60 | 4.0000000e+00 1.0000000e+01 0.0000000e+00
61 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
62 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
63 | 5.0000000e+00 1.0000000e+01 0.0000000e+00
64 | 4.0000000e+00 1.0000000e+01 0.0000000e+00
65 | 5.0000000e+00 1.0000000e+01 0.0000000e+00
66 | 1.0000000e+00 1.0000000e+01 0.0000000e+00
67 | 1.0000000e+00 1.0000000e+01 0.0000000e+00
68 | 4.0000000e+00 1.0000000e+01 0.0000000e+00
69 | 9.0000000e+00 1.0000000e+01 0.0000000e+00
70 | 4.0000000e+00 1.0000000e+01 0.0000000e+00
71 | 5.0000000e+00 1.0000000e+01 0.0000000e+00
72 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
73 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
74 | 6.0000000e+00 1.0000000e+01 0.0000000e+00
75 | 6.0000000e+00 1.0000000e+01 0.0000000e+00
76 | 5.0000000e+00 1.0000000e+01 0.0000000e+00
77 | 3.0000000e+00 1.0000000e+01 0.0000000e+00
78 | 2.0000000e+00 1.0000000e+01 0.0000000e+00
79 | 2.0000000e+00 1.0000000e+01 0.0000000e+00
80 | 6.0000000e+00 1.0000000e+00 9.0000000e+01
81 | 3.0000000e+00 1.0000000e+00 9.0000000e+01
82 | 5.0000000e+00 1.0000000e+00 9.0000000e+01
83 | 1.0000000e+00 1.0000000e+00 9.0000000e+01
84 | 7.0000000e+00 1.0000000e+00 9.0000000e+01
85 | 2.0000000e+00 1.0000000e+00 9.0000000e+01
86 | 2.0000000e+00 1.0000000e+00 9.0000000e+01
87 | 0.0000000e+00 1.0000000e+00 9.0000000e+01
88 | 1.0000000e+00 1.0000000e+00 9.0000000e+01
89 | 9.0000000e+00 1.0000000e+00 9.0000000e+01
90 | 0.0000000e+00 1.0000000e+00 9.0000000e+01
91 | 2.0000000e+00 1.0000000e+00 9.0000000e+01
92 | 6.0000000e+00 1.0000000e+00 9.0000000e+01
93 | 1.2000000e+01 1.0000000e+00 9.0000000e+01
94 | 2.0000000e+00 1.0000000e+00 9.0000000e+01
95 | 5.0000000e+00 1.0000000e+00 9.0000000e+01
96 | 1.0000000e+00 1.0000000e+00 9.0000000e+01
97 | 6.0000000e+00 1.0000000e+00 9.0000000e+01
98 | 3.0000000e+00 1.0000000e+00 9.0000000e+01
99 | 4.0000000e+00 1.0000000e+00 9.0000000e+01
100 | 3.0000000e+00 1.0000000e+00 9.0000000e+01
101 | 8.0000000e+00 1.0000000e+00 9.0000000e+01
102 | 5.0000000e+00 1.0000000e+00 9.0000000e+01
103 | 8.0000000e+00 1.0000000e+00 9.0000000e+01
104 | 4.0000000e+00 1.0000000e+00 9.0000000e+01
105 | 4.0000000e+00 1.0000000e+00 9.0000000e+01
106 | 3.0000000e+00 1.0000000e+00 9.0000000e+01
107 | 6.0000000e+00 1.0000000e+00 9.0000000e+01
108 | 5.0000000e+00 1.0000000e+00 9.0000000e+01
109 | 2.0000000e+00 1.0000000e+00 9.0000000e+01
110 | 3.0000000e+00 1.0000000e+00 9.0000000e+01
111 | 6.0000000e+00 1.0000000e+00 9.0000000e+01
112 | 7.0000000e+00 1.0000000e+00 9.0000000e+01
113 | 6.0000000e+00 1.0000000e+00 9.0000000e+01
114 | 6.0000000e+00 1.0000000e+00 9.0000000e+01
115 | 1.2000000e+01 1.0000000e+00 9.0000000e+01
116 | 4.0000000e+00 1.0000000e+00 9.0000000e+01
117 | 5.0000000e+00 1.0000000e+00 9.0000000e+01
118 | 9.0000000e+00 1.0000000e+00 9.0000000e+01
119 | 1.7000000e+01 1.0000000e+01 9.0000000e+01
120 | 2.2000000e+01 1.0000000e+01 9.0000000e+01
121 | 1.8000000e+01 1.0000000e+01 9.0000000e+01
122 | 1.7000000e+01 1.0000000e+01 9.0000000e+01
123 | 1.4000000e+01 1.0000000e+01 9.0000000e+01
124 | 1.9000000e+01 1.0000000e+01 9.0000000e+01
125 | 2.2000000e+01 1.0000000e+01 9.0000000e+01
126 | 2.2000000e+01 1.0000000e+01 9.0000000e+01
127 | 1.6000000e+01 1.0000000e+01 9.0000000e+01
128 | 1.5000000e+01 1.0000000e+01 9.0000000e+01
129 | 1.5000000e+01 1.0000000e+01 9.0000000e+01
130 | 1.9000000e+01 1.0000000e+01 9.0000000e+01
131 | 1.9000000e+01 1.0000000e+01 9.0000000e+01
132 | 1.7000000e+01 1.0000000e+01 9.0000000e+01
133 | 1.2000000e+01 1.0000000e+01 9.0000000e+01
134 | 2.1000000e+01 1.0000000e+01 9.0000000e+01
135 | 1.8000000e+01 1.0000000e+01 9.0000000e+01
136 | 2.0000000e+01 1.0000000e+01 9.0000000e+01
137 | 1.7000000e+01 1.0000000e+01 9.0000000e+01
138 | 1.5000000e+01 1.0000000e+01 9.0000000e+01
139 | 1.2000000e+01 1.0000000e+01 9.0000000e+01
140 | 1.7000000e+01 1.0000000e+01 9.0000000e+01
141 | 2.0000000e+01 1.0000000e+01 9.0000000e+01
142 | 1.8000000e+01 1.0000000e+01 9.0000000e+01
143 | 1.9000000e+01 1.0000000e+01 9.0000000e+01
144 | 1.9000000e+01 1.0000000e+01 9.0000000e+01
145 | 1.7000000e+01 1.0000000e+01 9.0000000e+01
146 | 2.1000000e+01 1.0000000e+01 9.0000000e+01
147 | 2.0000000e+01 1.0000000e+01 9.0000000e+01
148 | 2.2000000e+01 1.0000000e+01 9.0000000e+01
149 | 2.3000000e+01 1.0000000e+01 9.0000000e+01
150 | 2.3000000e+01 1.0000000e+01 9.0000000e+01
151 | 1.3000000e+01 1.0000000e+01 9.0000000e+01
152 | 1.6000000e+01 1.0000000e+01 9.0000000e+01
153 | 2.3000000e+01 1.0000000e+01 9.0000000e+01
154 | 1.6000000e+01 1.0000000e+01 9.0000000e+01
155 | 2.2000000e+01 1.0000000e+01 9.0000000e+01
156 | 2.6000000e+01 1.0000000e+01 9.0000000e+01
157 | 1.8000000e+01 1.0000000e+01 9.0000000e+01
158 |
--------------------------------------------------------------------------------
/datasets/planes.txt:
--------------------------------------------------------------------------------
1 | Distance Paper Angle Plane Order
2 | 2160 1 1 1 12
3 | 1511 1 1 1 11
4 | 4596 1 1 2 8
5 | 3706 1 1 2 6
6 | 3854 1 2 1 4
7 | 1690 1 2 1 2
8 | 5088 1 2 2 1
9 | 4255 1 2 2 7
10 | 6520 2 1 1 3
11 | 4091 2 1 1 9
12 | 2130 2 1 2 14
13 | 3150 2 1 2 5
14 | 6348 2 2 1 16
15 | 4550 2 2 1 15
16 | 2730 2 2 2 13
17 | 2585 2 2 2 10
18 |
--------------------------------------------------------------------------------
/datasets/planes_description.txt:
--------------------------------------------------------------------------------
1 | # Paper Plane Experiment
2 |
3 | Keywords: Two-Way Analysis of Variance, Three-Way Analysis of Variance, Interaction, Experimental Design.
4 |
5 | Source: http://www.statsci.org/data/oz/planes.html
6 |
7 | ## Description
8 |
9 | The data was collected by Stewart Fischer and David Tippetts, statistics students at the Queensland University of Technology in a subject taught by Dr Margaret Mackisack. Here is their description of the data and its collection:
10 |
11 | The experiment decided upon was to see if by using two different designs of paper aeroplane, how far the plane would travel. In considering this, the question arose, whether different types of paper and different angles of release would have any effect on the distance travelled. Knowing that paper aeroplanes are greatly influenced by wind, we had to find a way to eliminate this factor. We decided to perform the experiment in a hallway of the University, where the effects of wind can be controlled to some extent by closing doors.
12 |
13 | In order to make the experimental units as homogeneous as possible we allocated one person to a task, so person 1 folded and threw all planes, person 2 calculated the random order assignment, measured all the distances, checked that the angles of flight were right, and checked that the plane release was the same each time.
14 |
15 | The factors that we considered each had two levels as follows:
16 |
17 | Paper: A4 size, 80gms and 50gms
18 | Design: High Performance Dual Glider, and Incredibly Simple Glider (patterns attached to original report)
19 | Angle of release: Horizontal, or 45 degrees upward.
20 |
21 | The random order assignment was calculated using the random number function of a calculator. Each combination of factors was assigned a number from one to eight, the random numbers were generated and accordingly the order of the experiment was found.
22 |
23 | Variable Description
24 | Distance Distance travelled in mm
25 | Paper 80gms = 1, 50gms = 2
26 | Angle Horizontal = 1, 45 degrees = 2
27 | Design High-performance = 1, Incredibly simple = 2
28 | Order Order in which the runs were conducted
29 |
30 | ## Source
31 |
32 | Mackisack, M. S. (1994). What is the use of experiments conducted by statistics students? Journal of Statistics Education, 2(1).
33 |
34 | ## Analysis
35 |
36 | Two- or three-way ANOVA.
37 |
38 |
--------------------------------------------------------------------------------
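A minimal two-way ANOVA sketch for `planes.txt` under the same assumptions (tab-separated file, statsmodels available); `lectures/ANOVA.ipynb` below works through this dataset in much more detail:

```python
# Hypothetical sketch: two-way ANOVA of Distance on Paper and the "Plane"
# (design) factor; add C(Angle, Sum) for the three-way version.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

planes = pd.read_csv('datasets/planes.txt', sep='\t')

# sum-coded (orthogonal) contrasts so the sums of squares partition cleanly
fit = smf.ols('Distance ~ C(Paper, Sum) * C(Plane, Sum)', data=planes).fit()
print(anova_lm(fit))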
/datasets/psychophysics_description.txt:
--------------------------------------------------------------------------------
1 | # Image correlates of crowding in natural scenes
2 |
3 | Source: http://www.journalofvision.org/content/12/7/6
4 |
5 | ## Description
6 |
7 | This dataset consists of a number of trials (rows) for three observers in a psychophysical experiment. Observers detected the location of a patch of texture embedded in a natural scene; the patch could be presented above, to the right of, below, or to the left of fixation. The primary dependent variable of interest is whether the observer got the trial correct or incorrect.
8 |
9 | Variable Description
10 |
11 | observer TW = 1, PB = 2, N1 = 3
12 | trialNum In chronological order, for each observer
13 | blockNum block of trials
14 | eccent 2, 4 or 8 degrees
15 | patchSize log10(degrees)
16 | correct trial correct (1) or incorrect (0)
17 | targetLoc location of the target relative to fixation (1-4, corresponding to up, right, down, left)
18 | responseLoc location of the observer's response (1-4, corresponding to up, right, down, left)
19 | imageOri orientation of the image segment presented (1-4, corresponding to upright, 90 degrees clockwise, upside-down and 90 degrees counterclockwise)
20 |
21 |
22 | ## Analysis
23 |
24 | The dataset can be analysed using logistic regression, and includes predictors that can be treated as continuous (e.g. patch size, eccentricity, trial number) and categorical predictors (e.g. observer, target location, image orientation).
25 |
26 | ## Source
27 |
28 | This is part of the dataset from:
29 |
30 | Wallis, T. S. A., & Bex, P. J. (2012). Image correlates of crowding in natural scenes. Journal of Vision, 12(7):6, 1–19.
31 |
32 | The full dataset can be found [here](http://www.journalofvision.org/content/12/7/6/suppl/DC1).
33 |
34 |
--------------------------------------------------------------------------------
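A minimal sketch of the suggested logistic regression. The column names come from the description above; since `psychophysics.txt` itself is not reproduced in this listing, the tab separator is an assumption:

```python
# Hypothetical sketch: binomial GLM of trial correctness on continuous
# (patchSize, eccent) and categorical (observer) predictors.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

dat = pd.read_csv('datasets/psychophysics.txt', sep='\t')

fit = smf.glm('correct ~ patchSize + eccent + C(observer)',
              data=dat, family=sm.families.Binomial()).fit()
print(fit.summary())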
/datasets/sleep.txt:
--------------------------------------------------------------------------------
1 | Species BodyWt BrainWt NonDreaming Dreaming TotalSleep LifeSpan Gestation Predation Exposure Danger
2 | Africanelephant 6654 5712 NA NA 3.3 38.6 645 3 5 3
3 | Africangiantpouchedrat 1 6.6 6.3 2 8.3 4.5 42 3 1 3
4 | ArcticFox 3.385 44.5 NA NA 12.5 14 60 1 1 1
5 | Arcticgroundsquirrel 0.92 5.7 NA NA 16.5 NA 25 5 2 3
6 | Asianelephant 2547 4603 2.1 1.8 3.9 69 624 3 5 4
7 | Baboon 10.55 179.5 9.1 0.7 9.8 27 180 4 4 4
8 | Bigbrownbat 0.023 0.3 15.8 3.9 19.7 19 35 1 1 1
9 | Braziliantapir 160 169 5.2 1 6.2 30.4 392 4 5 4
10 | Cat 3.3 25.6 10.9 3.6 14.5 28 63 1 2 1
11 | Chimpanzee 52.16 440 8.3 1.4 9.7 50 230 1 1 1
12 | Chinchilla 0.425 6.4 11 1.5 12.5 7 112 5 4 4
13 | Cow 465 423 3.2 0.7 3.9 30 281 5 5 5
14 | Deserthedgehog 0.55 2.4 7.6 2.7 10.3 NA NA 2 1 2
15 | Donkey 187.1 419 NA NA 3.1 40 365 5 5 5
16 | EasternAmericanmole 0.075 1.2 6.3 2.1 8.4 3.5 42 1 1 1
17 | Echidna 3 25 8.6 0 8.6 50 28 2 2 2
18 | Europeanhedgehog 0.785 3.5 6.6 4.1 10.7 6 42 2 2 2
19 | Galago 0.2 5 9.5 1.2 10.7 10.4 120 2 2 2
20 | Genet 1.41 17.5 4.8 1.3 6.1 34 NA 1 2 1
21 | Giantarmadillo 60 81 12 6.1 18.1 7 NA 1 1 1
22 | Giraffe 529 680 NA 0.3 NA 28 400 5 5 5
23 | Goat 27.66 115 3.3 0.5 3.8 20 148 5 5 5
24 | Goldenhamster 0.12 1 11 3.4 14.4 3.9 16 3 1 2
25 | Gorilla 207 406 NA NA 12 39.3 252 1 4 1
26 | Grayseal 85 325 4.7 1.5 6.2 41 310 1 3 1
27 | Graywolf 36.33 119.5 NA NA 13 16.2 63 1 1 1
28 | Groundsquirrel 0.101 4 10.4 3.4 13.8 9 28 5 1 3
29 | Guineapig 1.04 5.5 7.4 0.8 8.2 7.6 68 5 3 4
30 | Horse 521 655 2.1 0.8 2.9 46 336 5 5 5
31 | Jaguar 100 157 NA NA 10.8 22.4 100 1 1 1
32 | Kangaroo 35 56 NA NA NA 16.3 33 3 5 4
33 | Lessershort-tailedshrew 0.005 0.14 7.7 1.4 9.1 2.6 21.5 5 2 4
34 | Littlebrownbat 0.01 0.25 17.9 2 19.9 24 50 1 1 1
35 | Man 62 1320 6.1 1.9 8 100 267 1 1 1
36 | Molerat 0.122 3 8.2 2.4 10.6 NA 30 2 1 1
37 | Mountainbeaver 1.35 8.1 8.4 2.8 11.2 NA 45 3 1 3
38 | Mouse 0.023 0.4 11.9 1.3 13.2 3.2 19 4 1 3
39 | Muskshrew 0.048 0.33 10.8 2 12.8 2 30 4 1 3
40 | NAmericanopossum 1.7 6.3 13.8 5.6 19.4 5 12 2 1 1
41 | Nine-bandedarmadillo 3.5 10.8 14.3 3.1 17.4 6.5 120 2 1 1
42 | Okapi 250 490 NA 1 NA 23.6 440 5 5 5
43 | Owlmonkey 0.48 15.5 15.2 1.8 17 12 140 2 2 2
44 | Patasmonkey 10 115 10 0.9 10.9 20.2 170 4 4 4
45 | Phanlanger 1.62 11.4 11.9 1.8 13.7 13 17 2 1 2
46 | Pig 192 180 6.5 1.9 8.4 27 115 4 4 4
47 | Rabbit 2.5 12.1 7.5 0.9 8.4 18 31 5 5 5
48 | Raccoon 4.288 39.2 NA NA 12.5 13.7 63 2 2 2
49 | Rat 0.28 1.9 10.6 2.6 13.2 4.7 21 3 1 3
50 | Redfox 4.235 50.4 7.4 2.4 9.8 9.8 52 1 1 1
51 | Rhesusmonkey 6.8 179 8.4 1.2 9.6 29 164 2 3 2
52 | Rockhyrax(Heterob) 0.75 12.3 5.7 0.9 6.6 7 225 2 2 2
53 | Rockhyrax(Procaviahab) 3.6 21 4.9 0.5 5.4 6 225 3 2 3
54 | Roedeer 14.83 98.2 NA NA 2.6 17 150 5 5 5
55 | Sheep 55.5 175 3.2 0.6 3.8 20 151 5 5 5
56 | Slowloris 1.4 12.5 NA NA 11 12.7 90 2 2 2
57 | Starnosedmole 0.06 1 8.1 2.2 10.3 3.5 NA 3 1 2
58 | Tenrec 0.9 2.6 11 2.3 13.3 4.5 60 2 1 2
59 | Treehyrax 2 12.3 4.9 0.5 5.4 7.5 200 3 1 3
60 | Treeshrew 0.104 2.5 13.2 2.6 15.8 2.3 46 3 2 2
61 | Vervet 4.19 58 9.7 0.6 10.3 24 210 4 3 4
62 | Wateropossum 3.5 3.9 12.8 6.6 19.4 3 14 2 1 1
63 | Yellow-belliedmarmot 4.05 17 NA NA NA 13 38 3 1 1
64 |
--------------------------------------------------------------------------------
/datasets/sleep_description.txt:
--------------------------------------------------------------------------------
1 | # Sleep in Mammals
2 |
3 | Source: http://www.statsci.org/data/general/sleep.html
4 |
5 | ## Description
6 |
7 | Includes brain and body weight, life span, gestation time, time sleeping, and predation and danger indices for 62 species of mammals. The aim is to predict the time spent sleeping and the proportion of sleep time spent in dream sleep.
8 |
9 | Variable Description
10 | BodyWt body weight (kg)
11 | BrainWt brain weight (g)
12 | NonDreaming slow wave ("nondreaming") sleep (hrs/day)
13 | Dreaming paradoxical ("dreaming") sleep (hrs/day)
14 | TotalSleep total sleep, sum of slow wave and paradoxical sleep (hrs/day)
15 | LifeSpan maximum life span (years)
16 | Gestation gestation time (days)
17 | Predation predation index (1-5)
18 | 1 = minimum (least likely to be preyed upon); 5 = maximum (most likely to be preyed upon)
19 | Exposure sleep exposure index (1-5)
20 | 1 = least exposed (e.g. animal sleeps in a well-protected den); 5 = most exposed
21 | Danger overall danger index (1-5) (based on the above two indices and other information)
22 | 1 = least danger (from other animals); 5 = most danger (from other animals)
24 |
25 | ## Source
26 |
27 | Allison, T., and Cicchetti, D. V. (1976). Sleep in mammals: ecological and constitutional correlates. Science 194 (November 12), 732-734.
28 |
29 | The electronic data file was obtained from the Statlib database.
30 |
31 | ## Analysis
32 |
33 | One can use multiple regression to predict the three sleep variables after suitably transforming the predictors and response. The response variable could be hours of sleep or the proportion of sleep spent dreaming. There are quite a few missing values. I have also used this data to illustrate regression trees for predicting duration of dream-sleep.
34 |
35 |
--------------------------------------------------------------------------------
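A minimal sketch of one such regression, assuming a tab-separated file and the column names shown above; the log transform and the choice of predictors are illustrative:

```python
# Hypothetical sketch: multiple regression predicting total sleep from
# log body weight and the overall danger index.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

sleep = pd.read_csv('datasets/sleep.txt', sep='\t')
# drop species with missing values in the variables used below
sleep = sleep.dropna(subset=['TotalSleep', 'BodyWt', 'Danger'])

# np.log is evaluated inside the Patsy formula
fit = smf.ols('TotalSleep ~ np.log(BodyWt) + Danger', data=sleep).fit()
print(fit.summary())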
/environment.yml:
--------------------------------------------------------------------------------
1 | name: stats_course
2 | dependencies:
3 | - abstract-rendering=0.5.1=np19py34_0
4 | - anaconda=2.1.0=np19py34_0
5 | - argcomplete=0.8.1=py34_0
6 | - astropy=0.4.2=np19py34_0
7 | - beautiful-soup=4.3.2=py34_0
8 | - binstar=0.7.1=py34_0
9 | - bitarray=0.8.1=py34_0
10 | - blaze=0.6.3=np19py34_0
11 | - blz=0.6.2=np19py34_0
12 | - bokeh=0.6.1=np19py34_0
13 | - boto=2.32.1=py34_0
14 | - cffi=0.8.6=py34_0
15 | - colorama=0.3.1=py34_0
16 | - configobj=5.0.6=py34_0
17 | - cryptography=0.5.4=py34_0
18 | - curl=7.38.0=0
19 | - cython=0.21=py34_0
20 | - cytoolz=0.7.0=py34_0
21 | - datashape=0.3.0=np19py34_1
22 | - dateutil=2.1=py34_2
23 | - decorator=3.4.0=py34_0
24 | - docutils=0.12=py34_0
25 | - dynd-python=0.6.5=np19py34_0
26 | - flask=0.10.1=py34_1
27 | - freetype=2.4.10=1
28 | - future=0.13.1=py34_0
29 | - greenlet=0.4.4=py34_0
30 | - h5py=2.3.1=np19py34_0
31 | - hdf5=1.8.13=0
32 | - ipython=2.3.1=py34_0
33 | - ipython-notebook=2.3.1=py34_0
34 | - ipython-qtconsole=2.2.0=py34_0
35 | - itsdangerous=0.24=py34_0
36 | - jdcal=1.0=py34_0
37 | - jinja2=2.7.3=py34_1
38 | - jpeg=8d=1
39 | - launcher=1.0.0=1
40 | - libdynd=0.6.5=0
41 | - libpng=1.5.13=1
42 | - libsodium=0.4.5=0
43 | - libtiff=4.0.2=1
44 | - libxml2=2.9.0=1
45 | - libxslt=1.1.28=2
46 | - llvm=3.3=0
47 | - llvmpy=0.12.7=py34_0
48 | - lxml=3.4.0=py34_0
49 | - markupsafe=0.23=py34_0
50 | - matplotlib=1.4.2=np19py34_0
51 | - mock=1.0.1=py34_0
52 | - multipledispatch=0.4.7=py34_0
53 | - networkx=1.9.1=py34_0
54 | - nltk=3.0.0=np19py34_0
55 | - node-webkit=0.10.1=0
56 | - nose=1.3.4=py34_0
57 | - numba=0.14.0=np19py34_0
58 | - numexpr=2.3.1=np19py34_0
59 | - numpy=1.9.1=py34_0
60 | - openpyxl=1.8.5=py34_0
61 | - openssl=1.0.1k=1
62 | - pandas=0.15.2=np19py34_0
63 | - patsy=0.3.0=np19py34_0
64 | - pillow=2.5.1=py34_0
65 | - pip=7.1.0=py34_1
66 | - ply=3.4=py34_0
67 | - psutil=2.1.1=py34_0
68 | - py=1.4.25=py34_0
69 | - pycosat=0.6.1=py34_0
70 | - pycparser=2.10=py34_0
71 | - pycrypto=2.6.1=py34_0
72 | - pycurl=7.19.5=py34_1
73 | - pyflakes=0.8.1=py34_0
74 | - pygments=1.6=py34_0
75 | - pyopenssl=0.14=py34_0
76 | - pyparsing=2.0.1=py34_0
77 | - pyqt=4.10.4=py34_0
78 | - pytables=3.1.1=np19py34_0
79 | - pytest=2.6.3=py34_0
80 | - python=3.4.3=0
81 | - python.app=1.2=py34_3
82 | - pytz=2014.9=py34_0
83 | - pyyaml=3.11=py34_0
84 | - pyzmq=14.4.1=py34_0
85 | - qt=4.8.5=3
86 | - readline=6.2=2
87 | - redis=2.6.9=0
88 | - redis-py=2.9.1=py34_0
89 | - requests=2.4.1=py34_0
90 | - rope=0.9.4=py34_1
91 | - runipy=0.1.1=py34_0
92 | - scikit-image=0.10.1=np19py34_0
93 | - scikit-learn=0.15.2=np19py34_0
94 | - scipy=0.15.1=np19py34_0
95 | - seaborn=0.5.1=np19py34_0
96 | - setuptools=18.1=py34_0
97 | - sip=4.15.5=py34_0
98 | - six=1.9.0=py34_0
99 | - sockjs-tornado=1.0.1=py34_0
100 | - sphinx=1.2.3=py34_0
101 | - spyder=2.3.1=py34_1
102 | - spyder-app=2.3.1=py34_0
103 | - sqlalchemy=0.9.7=py34_0
104 | - sqlite=3.8.4.1=1
105 | - statsmodels=0.6.1=np19py34_0
106 | - sympy=0.7.5=py34_0
107 | - tk=8.5.18=0
108 | - toolz=0.7.0=py34_0
109 | - tornado=4.0.2=py34_0
110 | - ujson=1.33=py34_0
111 | - werkzeug=0.9.6=py34_1
112 | - wheel=0.24.0=py34_0
113 | - xlrd=0.9.3=py34_0
114 | - xlsxwriter=0.5.7=py34_0
115 | - xz=5.0.5=0
116 | - yaml=0.1.4=1
117 | - zeromq=4.0.4=0
118 | - zlib=1.2.8=0
119 | - pip:
120 | - beautifulsoup4==4.3.2
121 | - brewer2mpl==1.4.1
122 | - ggplot==0.6.5
123 | - python-dateutil==2.1
124 | - rope-py3k==0.9.4-1
125 | - svgutils==0.1.0
126 | - tables==3.1.1
127 |
128 |
--------------------------------------------------------------------------------
/index.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:b0230e2c391d363a04ed8d177db84780f6641a74753aba55aa1cbdbb0c8497a5"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "# The Generalised Linear Model\n",
16 | "\n",
17 | "## A practical introduction using the Python environment\n",
18 | "\n",
19 | "[Tom Wallis](http://www.tomwallis.info) and [Philipp Berens](http://philippberens.wordpress.com/)\n"
20 | ]
21 | },
22 | {
23 | "cell_type": "markdown",
24 | "metadata": {},
25 | "source": [
26 | "This course provides an overview of an extremely flexible\n",
27 | "statistical framework for describing and performing inference with a wide\n",
28 | "variety of data types: the Generalised Linear Model (GLM). Many common statistical procedures are special cases of the GLM. In the course, we focus on the construction and understanding of design matrices and the interpretation of regression weights. We mostly concentrate on the linear Gaussian model, before discussing more general cases. We also touch on how this framework relates to ANOVA-style model comparison.\n",
29 | "\n",
30 | "The course was designed and presented as a six week elective statistics course for graduate students in the neuroscience program at the University of T\u00fcbingen, in January 2015. Lectures were presented as a collection of IPython Notebooks. While the notebooks are (we hope) well documented, they *are* lecture materials rather than a textbook. As such, some content might not be self-explanatory. \n",
31 | "\n",
32 | "We chose to do the course in Python because \n",
33 | "\n",
34 | "1. It is a general purpose programming language and thus more versatile than e.g. [R](http://cran.r-project.org/). Neuroscientists can use Python to not only analyse data but also to e.g. interface with hardware, [conduct behavioural experiments](http://www.psychopy.org/), etc. \n",
35 | "1. Its [popularity as a scientific computing environment is rapidly growing](http://www.nature.com/news/programming-pick-up-python-1.16833).\n",
36 | "1. The scientific computing environment of Python has many similarities to MATLAB ™, which is the historically dominant environment in our field.\n",
37 | "1. It is free and open source, and thus we feel will continue to benefit students who move out of a university environment. \n",
38 | "\n",
39 | "Nevertheless, the main statistical module we use here (`Statsmodels`) is well behind R in its maturity (no wonder, since R is a *lot* older). Thankfully, learning to create and interpret design matrices using `Patsy` formula notation is a skill that transfers easily to R's `glm` routines. \n",
40 | "\n",
41 | "Note two things:\n",
42 | "\n",
43 | "1. *This is not a programming course*. If you do not have enough experience with programming (or Python) to follow the materials here, seek out an introduction to programming in Python. There are many available for free on the internet.\n",
44 | "1. *This is not a basic statistics course*. You should be reasonably familiar with things like t-tests and ANOVA before proceeding.\n",
45 | "\n",
46 | "Where content is erroneous, unclear or buggy, please tell us at our [GitHub repository](https://github.com/tomwallis/glm-intro)."
47 | ]
48 | },
49 | {
50 | "cell_type": "markdown",
51 | "metadata": {},
52 | "source": [
53 | "## Lectures"
54 | ]
55 | },
56 | {
57 | "cell_type": "markdown",
58 | "metadata": {},
59 | "source": [
60 | "1. [Dataframes and Exploratory Plotting](lectures/dataframes_plots.ipynb)\n",
61 | "1. [Introduction to design matrices](lectures/design_matrices.ipynb)\n",
62 | " 1. [Homework solution](lectures/homework_solution_1.ipynb)\n",
63 | "1. [Interactions](lectures/Interactions.ipynb)\n",
64 | "1. [ANOVA](lectures/ANOVA.ipynb)\n",
65 | "1. [Link functions](lectures/link_functions.ipynb)"
66 | ]
67 | },
68 | {
69 | "cell_type": "markdown",
70 | "metadata": {},
71 | "source": [
72 | "## Datasets\n",
73 | "\n",
74 | "To demonstrate the ideas in the course we used several datasets obtained from the [OzDASL database](http://www.statsci.org/data/index.html) as well as from our own research. They are provided in the git repository to facilitate self learning."
75 | ]
76 | },
77 | {
78 | "cell_type": "markdown",
79 | "metadata": {},
80 | "source": [
81 | "## License"
82 | ]
83 | },
84 | {
85 | "cell_type": "markdown",
86 | "metadata": {},
87 | "source": [
88 | "**Authors:** [Tom Wallis](http://www.tomwallis.info) and [Philipp Berens](http://philippberens.wordpress.com/)\n",
89 | "\n",
90 | "**Year:** 2015 \n",
91 | "\n",
92 | "**Copyright:** This work is licensed under a [CC-by-4.0](https://creativecommons.org/licenses/by/4.0/) license. You may reuse, modify and redistribute these materials provided you give appropriate credit to the authors. All images embedded in the lecture materials were obtained from the internet and are used under \"fair use\" for educational purposes. The copyright for all images remain with their respective holders."
93 | ]
94 | },
95 | {
96 | "cell_type": "markdown",
97 | "metadata": {},
98 | "source": [
99 | "## Further reading\n",
100 | "\n",
101 | "Here we provide some references for further reading. These reflect our own backgrounds in neuroscience and psychology.\n",
102 | "\n",
103 | "Burnham, K. P., & Anderson, D. R. (2002). Model selection and multimodel inference a practical information-theoretic approach. New York: Springer. \n",
104 | " - *A thorough overview of model comparison, which is how we like to think of ANOVA*\n",
105 | "\n",
106 | "Gelman, A., & Hill, J. (2007). Data Analysis using regression and multilevel/hierarchical models. New York, NY: Cambridge Univ Press.\n",
107 | "- *Introduction to regression using multilevel models. Random effects models, pooling and shrinkage...*\n",
108 | "\n",
109 | "Knoblauch, K., & Maloney, L. T. (2012). Modeling Psychophysical Data in R. New York: Springer. \n",
110 | "- *This book presents some clear examples of applying GLMs to modelling psychophysical data, using the R environment*\n",
111 | "\n",
112 | "Kruschke, J. K. (2011). Doing Bayesian Data Analysis. Academic Press / Elsevier. \n",
113 | "- *The last half of this book is dedicated to a clear and thorough practical introduction to GLMs. We recommend the Bayesian inference stuff too, but it's not part of our course*\n",
114 | "\n",
115 | "Wickham, H. (2014). Tidy Data. Journal of Statistical Software, 59(10), ??\u2013?? \n",
116 | "- *An article about how to arrange data so that analysis environments like Pandas can work with it.*\n"
117 | ]
118 | },
119 | {
120 | "cell_type": "markdown",
121 | "metadata": {},
122 | "source": [
123 | "--------------------------------\n",
124 | "\n",
125 | "[Here are some notes](setup_python.ipynb) on how we set up a Python environment for the course (packages, versions) etc."
126 | ]
127 | },
128 | {
129 | "cell_type": "code",
130 | "collapsed": false,
131 | "input": [
132 | "from IPython.core.display import HTML\n",
133 | "\n",
134 | "\n",
135 | "def css_styling():\n",
136 | " styles = open(\"custom_style.css\", \"r\").read()\n",
137 | " return HTML(styles)\n",
138 | "css_styling()"
139 | ],
140 | "language": "python",
141 | "metadata": {},
142 | "outputs": [
143 | {
144 | "html": [
145 | "\n",
213 | ""
228 | ],
229 | "metadata": {},
230 | "output_type": "pyout",
231 | "prompt_number": 1,
232 | "text": [
233 | ""
234 | ]
235 | }
236 | ],
237 | "prompt_number": 1
238 | }
239 | ],
240 | "metadata": {}
241 | }
242 | ]
243 | }
--------------------------------------------------------------------------------
/lectures/ANOVA.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:06aaaf7085ddd47dd0b96d93bf3c322d5569eeaac8e2cb804e1a926e01aa02e8"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "# The Generalised Linear Model and ANOVA\n",
16 | "\n",
17 | "[Tom Wallis](http://www.tomwallis.info) and [Philipp Berens](http://philippberens.wordpress.com/)\n",
18 | "\n",
19 | "The University of T\u00fcbingen"
20 | ]
21 | },
22 | {
23 | "cell_type": "code",
24 | "collapsed": false,
25 | "input": [
26 | "import numpy as np\n",
27 | "import pandas as pd\n",
28 | "import patsy\n",
29 | "import seaborn as sns\n",
30 | "import statsmodels.formula.api as smf\n",
31 | "import statsmodels.stats as sms\n",
32 | "import matplotlib.pyplot as plt\n",
33 | "\n",
34 | "%matplotlib inline\n",
35 | "\n",
36 | "sns.set_style(\"white\")"
37 | ],
38 | "language": "python",
39 | "metadata": {},
40 | "outputs": [],
41 | "prompt_number": 21
42 | },
43 | {
44 | "cell_type": "markdown",
45 | "metadata": {},
46 | "source": [
47 | "# Introduction\n",
48 | "\n",
49 | "The analysis of variance (ANOVA) is typically used to \"*learn the relative importance of different sources of variation in a dataset*\" (see Gelman & Hill, Chapter 22). Recall that when we think about regression models, we ask questions like: \n",
50 | "\n",
51 | "**What is our prediction for distance flown given paper and design?**\n",
52 | "\n",
53 | "The ANOVA for the same model fit focuses on a slightly different question:\n",
54 | "\n",
55 | "**How much of the variance in distance flown is associated with changes in paper and design?**\n",
56 | "\n",
57 | "ANOVA can be thought of as *breaking up* the total variance in the response into (a) how much variance is associated with each independent variable in our design and (b) the remaining *error* or *residual* variance. "
58 | ]
59 | },
60 | {
61 | "cell_type": "code",
62 | "collapsed": false,
63 | "input": [
64 | "# for this demonstration we will use the *planes* example because it's balanced.\n",
65 | "\n",
66 | "# read in the data, rename some values:\n",
67 | "planes = pd.read_csv('../Datasets/planes.txt', sep='\\t')\n",
68 | "\n",
69 | "# column names to lowercase:\n",
70 | "planes.columns = [c.lower() for c in planes.columns]\n",
71 | "\n",
72 | "# turn \"paper\" variable into an object, not int64:\n",
73 | "planes['paper'] = planes['paper'].astype(object)\n",
74 | "\n",
75 | "planes.replace(to_replace={'paper': {1: '80',\n",
76 | " 2: '50'},\n",
77 | " 'angle': {1: 'horz',\n",
78 | " 2: '45deg '},\n",
79 | " 'plane': {1: 'sleek',\n",
80 | " 2: 'simple'}}, \n",
81 | " inplace=True)\n",
82 | "\n",
83 | "# convert distance into meters for ease of SSR:\n",
84 | "planes['distance'] = planes['distance'] / 1000\n",
85 | "\n",
86 | "# rename \"plane\" to \"design\":\n",
87 | "planes = planes.rename(columns={'plane': 'design'})\n",
88 | "\n",
89 | "# planes is a balanced dataset (same number of cases in each cell):\n",
90 | "planes.groupby(['paper', 'design']).size()"
91 | ],
92 | "language": "python",
93 | "metadata": {},
94 | "outputs": [
95 | {
96 | "metadata": {},
97 | "output_type": "pyout",
98 | "prompt_number": 23,
99 | "text": [
100 | "paper design\n",
101 | "50 simple 4\n",
102 | " sleek 4\n",
103 | "80 simple 4\n",
104 | " sleek 4\n",
105 | "dtype: int64"
106 | ]
107 | }
108 | ],
109 | "prompt_number": 23
110 | },
111 | {
112 | "cell_type": "markdown",
113 | "metadata": {},
114 | "source": [
115 | "# ANOVA as model comparison between nested models\n",
116 | "\n",
117 | "Recall for the planes data we are interested in the effects of `design` and `paper` on distance. This is a $2 \\times 2$ factorial design. We can think about all the possible models here as a series of nested models. That is, each simpler model is a subset of the most complex model (the model with all main effects and interactions). We can visualise this in a diagram (adapted from [1]):\n",
118 | "\n",
119 | "
\n",
120 | "\n",
121 | "In the diagram, the arrows indicate a differencing of the sum of squared residuals (SSR). The SSR for the upper model (with more parameters, and thus a lower SSR) is subtracted from the SSR of the lower model. This difference score gives the change in SS associated with the new predictor.\n",
122 | "\n",
123 | "Notice that there are actually two comparisons here that speak to the \"main effects\" of `paper` (in red) and `design` (in blue). These correspond to different hypotheses. Let's consider the test for `paper`. The first comparison (red text, right side of figure) tests whether paper has an effect *ignoring* design. This is usually called the **main effect** of paper. The second comparison (red text, left side of figure) tests whether adding paper changes the residuals *after discounting the variance due to design*. This corresponds to the SS for the **simple effects** of paper (in an interaction model, we also add the SS from the interaction term -- see [2], p.433).\n",
124 | "\n",
125 | "The `planes` dataset we use here is an example of a *balanced design*. Balanced designs have an equal number of observations in each cell (four planes were thrown in each combination of `paper` and `design`). It turns out that in balanced designs, the main effects and the sum of simple effects give the same answer. That's because they're just different ways to divide up the total sum of squares. When we have unbalanced data however, it matters which comparison you do (see Types of Sums of Squares, below). "
126 | ]
127 | },
128 | {
129 | "cell_type": "markdown",
130 | "metadata": {},
131 | "source": [
132 | "# Orthogonal contrasts\n",
133 | "\n",
134 | "When we do ANOVA, we are breaking up the total variance from the grand mean into subgroups of variance. To do this in a way that makes sense, we need to code categorical variables in a way that is independent of the other columns. These are called *orthogonal contrasts*. \n",
135 | "\n",
136 | "What is an orthogonal contrast? Intuitively, in a design matrix of orthogonal contrasts, the overall effect of one column is independent of the other columns. Mathematically, two columns of a design matrix are orthogonal if their product sums to zero.\n",
137 | "\n",
138 | "Throughout this course so far, recall that we have coded categorical variables using *treatment* or *dummy* coding. In this case, the Intercept term is the mean of one of the subgroups in the design (the *reference* level) and all other coefficients give deflections from this reference level. Are treatment contrasts orthogonal?\n"
139 | ]
140 | },
141 | {
142 | "cell_type": "code",
143 | "collapsed": false,
144 | "input": [
145 | "X = patsy.dmatrix('~ design * paper', planes)\n",
146 | "X"
147 | ],
148 | "language": "python",
149 | "metadata": {},
150 | "outputs": [
151 | {
152 | "metadata": {},
153 | "output_type": "pyout",
154 | "prompt_number": 24,
155 | "text": [
156 | "DesignMatrix with shape (16, 4)\n",
157 | " Intercept design[T.sleek] paper[T.80] design[T.sleek]:paper[T.80]\n",
158 | " 1 1 1 1\n",
159 | " 1 1 1 1\n",
160 | " 1 0 1 0\n",
161 | " 1 0 1 0\n",
162 | " 1 1 1 1\n",
163 | " 1 1 1 1\n",
164 | " 1 0 1 0\n",
165 | " 1 0 1 0\n",
166 | " 1 1 0 0\n",
167 | " 1 1 0 0\n",
168 | " 1 0 0 0\n",
169 | " 1 0 0 0\n",
170 | " 1 1 0 0\n",
171 | " 1 1 0 0\n",
172 | " 1 0 0 0\n",
173 | " 1 0 0 0\n",
174 | " Terms:\n",
175 | " 'Intercept' (column 0)\n",
176 | " 'design' (column 1)\n",
177 | " 'paper' (column 2)\n",
178 | " 'design:paper' (column 3)"
179 | ]
180 | }
181 | ],
182 | "prompt_number": 24
183 | },
184 | {
185 | "cell_type": "code",
186 | "collapsed": false,
187 | "input": [
188 | "# example for coefficients of design and paper:\n",
189 | "np.sum(X[:, 1] * X[:, 2])"
190 | ],
191 | "language": "python",
192 | "metadata": {},
193 | "outputs": [
194 | {
195 | "metadata": {},
196 | "output_type": "pyout",
197 | "prompt_number": 25,
198 | "text": [
199 | "array(4.0)"
200 | ]
201 | }
202 | ],
203 | "prompt_number": 25
204 | },
205 | {
206 | "cell_type": "markdown",
207 | "metadata": {},
208 | "source": [
209 | "So in the *treatment* contrasts regime, we're not using orthogonal contrasts (the sum of the product of the columns is not zero). Therefore ANOVAs done with these design matrices become hard or impossible to interpret.\n",
210 | "\n",
211 | "The fix for this is easy: we can tell Patsy to code our categorical variables using orthogonal contrasts. The most common of these is Sum coding, in which the values for each category sum to one. This means that the coefficients in the regression represent *deviations* from the overall mean across the category levels. There are a number of other contrasts available in Patsy, which can test different hypotheses about the variables. See [here](http://statsmodels.sourceforge.net/devel/contrasts.html), [here](http://www.ats.ucla.edu/stat/r/library/contrast_coding.htm) and [here](http://talklab.psy.gla.ac.uk/tvw/catpred/) for further reading.\n",
212 | "\n",
213 | "What does the design matrix look like for sum coding?"
214 | ]
215 | },
216 | {
217 | "cell_type": "code",
218 | "collapsed": false,
219 | "input": [
220 | "X = patsy.dmatrix('~ C(design, Sum) * C(paper, Sum)', planes)\n",
221 | "print(X)"
222 | ],
223 | "language": "python",
224 | "metadata": {},
225 | "outputs": [
226 | {
227 | "output_type": "stream",
228 | "stream": "stdout",
229 | "text": [
230 | "[[ 1. -1. -1. 1.]\n",
231 | " [ 1. -1. -1. 1.]\n",
232 | " [ 1. 1. -1. -1.]\n",
233 | " [ 1. 1. -1. -1.]\n",
234 | " [ 1. -1. -1. 1.]\n",
235 | " [ 1. -1. -1. 1.]\n",
236 | " [ 1. 1. -1. -1.]\n",
237 | " [ 1. 1. -1. -1.]\n",
238 | " [ 1. -1. 1. -1.]\n",
239 | " [ 1. -1. 1. -1.]\n",
240 | " [ 1. 1. 1. 1.]\n",
241 | " [ 1. 1. 1. 1.]\n",
242 | " [ 1. -1. 1. -1.]\n",
243 | " [ 1. -1. 1. -1.]\n",
244 | " [ 1. 1. 1. 1.]\n",
245 | " [ 1. 1. 1. 1.]]\n"
246 | ]
247 | }
248 | ],
249 | "prompt_number": 26
250 | },
251 | {
252 | "cell_type": "markdown",
253 | "metadata": {},
254 | "source": [
255 | "Instead of our levels being either 0 or 1, they're now either -1 or 1. Now, the sum over our contrast columns is zero (note the intercept is not a contrast, so does not sum to zero), and their cross products are all zero:"
256 | ]
257 | },
258 | {
259 | "cell_type": "code",
260 | "collapsed": false,
261 | "input": [
262 | "print('sum of columns = ' + str(X.sum(axis=0)))\n",
263 | "print('sum of products of design and paper = ' + str((X[:, 1] * X[:, 2]).sum()))\n",
264 | "print('sum of products of design and interaction = ' + str((X[:, 1] * X[:, 3]).sum()))"
265 | ],
266 | "language": "python",
267 | "metadata": {},
268 | "outputs": [
269 | {
270 | "output_type": "stream",
271 | "stream": "stdout",
272 | "text": [
273 | "sum of columns = [ 16. 0. 0. 0.]\n",
274 | "sum of products of design and paper = 0.0\n",
275 | "sum of products of design and interaction = 0.0\n"
276 | ]
277 | }
278 | ],
279 | "prompt_number": 27
280 | },
281 | {
282 | "cell_type": "markdown",
283 | "metadata": {},
284 | "source": [
285 | "Using these orthogonal contrasts, let's fit the model and look at the ANOVA table:"
286 | ]
287 | },
288 | {
289 | "cell_type": "code",
290 | "collapsed": false,
291 | "input": [
292 | "fit_full = smf.ols('distance ~ C(design, Sum) * C(paper, Sum)', planes).fit()\n",
293 | "print(fit_full.summary())"
294 | ],
295 | "language": "python",
296 | "metadata": {},
297 | "outputs": [
298 | {
299 | "output_type": "stream",
300 | "stream": "stdout",
301 | "text": [
302 | " OLS Regression Results \n",
303 | "==============================================================================\n",
304 | "Dep. Variable: distance R-squared: 0.727\n",
305 | "Model: OLS Adj. R-squared: 0.659\n",
306 | "Method: Least Squares F-statistic: 10.66\n",
307 | "Date: Tue, 24 Feb 2015 Prob (F-statistic): 0.00106\n",
308 | "Time: 20:51:51 Log-Likelihood: -18.584\n",
309 | "No. Observations: 16 AIC: 45.17\n",
310 | "Df Residuals: 12 BIC: 48.26\n",
311 | "Df Model: 3 \n",
312 | "Covariance Type: nonrobust \n",
313 | "================================================================================================================\n",
314 | " coef std err t P>|t| [95.0% Conf. Int.]\n",
315 | "----------------------------------------------------------------------------------------------------------------\n",
316 | "Intercept 3.6853 0.223 16.514 0.000 3.199 4.171\n",
317 | "C(design, Sum)[S.simple] -0.1552 0.223 -0.696 0.500 -0.641 0.331\n",
318 | "C(paper, Sum)[S.50] 0.3277 0.223 1.469 0.168 -0.158 0.814\n",
319 | "C(design, Sum)[S.simple]:C(paper, Sum)[S.50] -1.2090 0.223 -5.418 0.000 -1.695 -0.723\n",
320 | "==============================================================================\n",
321 | "Omnibus: 0.751 Durbin-Watson: 2.909\n",
322 | "Prob(Omnibus): 0.687 Jarque-Bera (JB): 0.745\n",
323 | "Skew: 0.359 Prob(JB): 0.689\n",
324 | "Kurtosis: 2.224 Cond. No. 1.00\n",
325 | "==============================================================================\n",
326 | "\n",
327 | "Warnings:\n",
328 | "[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.\n"
329 | ]
330 | }
331 | ],
332 | "prompt_number": 28
333 | },
334 | {
335 | "cell_type": "markdown",
336 | "metadata": {},
337 | "source": [
338 | "What do the coefficients mean? The intercept coefficient is now the grand mean of distance (recall that in the Treatment contrast regime it was the mean distance when the design was \"simple\" and the paper weight was \"50\"):"
339 | ]
340 | },
341 | {
342 | "cell_type": "code",
343 | "collapsed": false,
344 | "input": [
345 | "np.round(planes['distance'].mean(), decimals=4)"
346 | ],
347 | "language": "python",
348 | "metadata": {},
349 | "outputs": [
350 | {
351 | "metadata": {},
352 | "output_type": "pyout",
353 | "prompt_number": 29,
354 | "text": [
355 | "3.6852"
356 | ]
357 | }
358 | ],
359 | "prompt_number": 29
360 | },
361 | {
362 | "cell_type": "markdown",
363 | "metadata": {},
364 | "source": [
365 | "The coefficient values for the contrasts tell us the average deviations from this grand mean for each category:"
366 | ]
367 | },
368 | {
369 | "cell_type": "code",
370 | "collapsed": false,
371 | "input": [
372 | "# example for design:\n",
373 | "print(fit_full.params[0] + fit_full.params[1]) # model prediction for simple\n",
374 | "print(fit_full.params[0] + -1*fit_full.params[1]) # model prediction for sleek\n",
375 | "print('\\n\\n Actual Means:')\n",
376 | "print(planes.groupby('design').distance.mean()) # actual means"
377 | ],
378 | "language": "python",
379 | "metadata": {},
380 | "outputs": [
381 | {
382 | "output_type": "stream",
383 | "stream": "stdout",
384 | "text": [
385 | "3.53\n",
386 | "3.8405\n",
387 | "\n",
388 | "\n",
389 | " Actual Means:\n",
390 | "design\n",
391 | "simple 3.5300\n",
392 | "sleek 3.8405\n",
393 | "Name: distance, dtype: float64\n"
394 | ]
395 | }
396 | ],
397 | "prompt_number": 30
398 | },
399 | {
400 | "cell_type": "markdown",
401 | "metadata": {},
402 | "source": [
403 | "## ANOVA table for this model"
404 | ]
405 | },
406 | {
407 | "cell_type": "code",
408 | "collapsed": false,
409 | "input": [
410 | "anova_table = sms.anova.anova_lm(fit_full) # type I SS by default; same answer in this case\n",
411 | "print(anova_table) "
412 | ],
413 | "language": "python",
414 | "metadata": {},
415 | "outputs": [
416 | {
417 | "output_type": "stream",
418 | "stream": "stdout",
419 | "text": [
420 | " df sum_sq mean_sq F PR(>F)\n",
421 | "C(design, Sum) 1 0.385641 0.385641 0.484016 0.499861\n",
422 | "C(paper, Sum) 1 1.718721 1.718721 2.157158 0.167628\n",
423 | "C(design, Sum):C(paper, Sum) 1 23.386896 23.386896 29.352777 0.000156\n",
424 | "Residual 12 9.561029 0.796752 NaN NaN\n"
425 | ]
426 | }
427 | ],
428 | "prompt_number": 31
429 | },
430 | {
431 | "cell_type": "markdown",
432 | "metadata": {},
433 | "source": [
434 | "## Relating the ANOVA table to nested model comparisons\n",
435 | "\n",
436 | "Recall that above we talked about ANOVA as a comparison of nested models. I'll reproduce the diagram here:\n",
437 | "\n",
438 | "
\n",
439 | "\n",
440 | "We can produce all the sums of squares from the ANOVA table by computing the relevant change in residual (error) sum of squares for each model.\n",
441 | "\n",
442 | "### Fit all models in the diagram"
443 | ]
444 | },
445 | {
446 | "cell_type": "code",
447 | "collapsed": false,
448 | "input": [
449 | "# fit all the models in the tree\n",
450 | "null = smf.ols('distance ~ 1', planes).fit()\n",
451 | "design = smf.ols('distance ~ C(design, Sum)', planes).fit()\n",
452 | "paper = smf.ols('distance ~ C(paper, Sum)', planes).fit()\n",
453 | "additive = smf.ols('distance ~ C(design, Sum) + C(paper, Sum)', planes).fit()\n",
454 | "full = smf.ols('distance ~ C(design, Sum) * C(paper, Sum)', planes).fit()"
455 | ],
456 | "language": "python",
457 | "metadata": {},
458 | "outputs": [],
459 | "prompt_number": 32
460 | },
461 | {
462 | "cell_type": "markdown",
463 | "metadata": {},
464 | "source": [
465 | "### The null model represents the total sum of squared residuals"
466 | ]
467 | },
468 | {
469 | "cell_type": "code",
470 | "collapsed": false,
471 | "input": [
472 | "print(null.summary())"
473 | ],
474 | "language": "python",
475 | "metadata": {},
476 | "outputs": [
477 | {
478 | "output_type": "stream",
479 | "stream": "stdout",
480 | "text": [
481 | " OLS Regression Results \n",
482 | "==============================================================================\n",
483 | "Dep. Variable: distance R-squared: 0.000\n",
484 | "Model: OLS Adj. R-squared: 0.000\n",
485 | "Method: Least Squares F-statistic: nan\n",
486 | "Date: Tue, 24 Feb 2015 Prob (F-statistic): nan\n",
487 | "Time: 20:51:51 Log-Likelihood: -28.977\n",
488 | "No. Observations: 16 AIC: 59.95\n",
489 | "Df Residuals: 15 BIC: 60.73\n",
490 | "Df Model: 0 \n",
491 | "Covariance Type: nonrobust \n",
492 | "==============================================================================\n",
493 | " coef std err t P>|t| [95.0% Conf. Int.]\n",
494 | "------------------------------------------------------------------------------\n",
495 | "Intercept 3.6853 0.382 9.643 0.000 2.871 4.500\n",
496 | "==============================================================================\n",
497 | "Omnibus: 0.701 Durbin-Watson: 1.723\n",
498 | "Prob(Omnibus): 0.704 Jarque-Bera (JB): 0.714\n",
499 | "Skew: 0.351 Prob(JB): 0.700\n",
500 | "Kurtosis: 2.239 Cond. No. 1.00\n",
501 | "==============================================================================\n",
502 | "\n",
503 | "Warnings:\n",
504 | "[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.\n"
505 | ]
506 | }
507 | ],
508 | "prompt_number": 33
509 | },
510 | {
511 | "cell_type": "markdown",
512 | "metadata": {},
513 | "source": [
514 | "This model explains 0% of the variance in the data. Because it's predicting every data point to be the mean, the residual variance is equal to the total variance:"
515 | ]
516 | },
517 | {
518 | "cell_type": "code",
519 | "collapsed": false,
520 | "input": [
521 | "null.mse_total # mean squared error total"
522 | ],
523 | "language": "python",
524 | "metadata": {},
525 | "outputs": [
526 | {
527 | "metadata": {},
528 | "output_type": "pyout",
529 | "prompt_number": 34,
530 | "text": [
531 | "2.336819133333333"
532 | ]
533 | }
534 | ],
535 | "prompt_number": 34
536 | },
537 | {
538 | "cell_type": "code",
539 | "collapsed": false,
540 | "input": [
541 | "null.mse_resid"
542 | ],
543 | "language": "python",
544 | "metadata": {},
545 | "outputs": [
546 | {
547 | "metadata": {},
548 | "output_type": "pyout",
549 | "prompt_number": 35,
550 | "text": [
551 | "2.336819133333333"
552 | ]
553 | }
554 | ],
555 | "prompt_number": 35
556 | },
557 | {
558 | "cell_type": "code",
559 | "collapsed": false,
560 | "input": [
561 | "# this is the same as the variance in distance:\n",
562 | "planes['distance'].var()"
563 | ],
564 | "language": "python",
565 | "metadata": {},
566 | "outputs": [
567 | {
568 | "metadata": {},
569 | "output_type": "pyout",
570 | "prompt_number": 36,
571 | "text": [
572 | "2.3368191333333317"
573 | ]
574 | }
575 | ],
576 | "prompt_number": 36
577 | },
578 | {
579 | "cell_type": "code",
580 | "collapsed": false,
581 | "input": [
582 | "# let's calculate the sum of squared residuals manually:\n",
583 | "res = planes['distance'] - null.predict() # absolute residuals (difference between data and prediction)\n",
584 | "res = res ** 2 # squared residuals\n",
585 | "res.sum()"
586 | ],
587 | "language": "python",
588 | "metadata": {},
589 | "outputs": [
590 | {
591 | "metadata": {},
592 | "output_type": "pyout",
593 | "prompt_number": 37,
594 | "text": [
595 | "35.052286999999993"
596 | ]
597 | }
598 | ],
599 | "prompt_number": 37
600 | },
601 | {
602 | "cell_type": "code",
603 | "collapsed": false,
604 | "input": [
605 | "# the ssr is also contained in the model object:\n",
606 | "null.ssr"
607 | ],
608 | "language": "python",
609 | "metadata": {},
610 | "outputs": [
611 | {
612 | "metadata": {},
613 | "output_type": "pyout",
614 | "prompt_number": 38,
615 | "text": [
616 | "35.052286999999993"
617 | ]
618 | }
619 | ],
620 | "prompt_number": 38
621 | },
622 | {
623 | "cell_type": "markdown",
624 | "metadata": {},
625 | "source": [
626 | "### Main effect of design\n",
627 | "\n",
628 | "Recall that the SS for each model term can be calculated by comparing the change in the residual sum of squares for nested models. We do that here:"
629 | ]
630 | },
631 | {
632 | "cell_type": "code",
633 | "collapsed": false,
634 | "input": [
635 | "## Main effect of design, ignoring paper:\n",
636 | "print(null.ssr - design.ssr)\n",
637 | "\n",
638 | "## sum of simple effects of design, eliminating paper:\n",
639 | "print(paper.ssr - additive.ssr)\n",
640 | "\n",
641 | "## main effect from ANOVA table:\n",
642 | "print(anova_table['sum_sq'][0])"
643 | ],
644 | "language": "python",
645 | "metadata": {},
646 | "outputs": [
647 | {
648 | "output_type": "stream",
649 | "stream": "stdout",
650 | "text": [
651 | "0.385641\n",
652 | "0.385641\n",
653 | "0.385641\n"
654 | ]
655 | }
656 | ],
657 | "prompt_number": 39
658 | },
659 | {
660 | "cell_type": "markdown",
661 | "metadata": {},
662 | "source": [
663 | "### Main effect of paper"
664 | ]
665 | },
666 | {
667 | "cell_type": "code",
668 | "collapsed": false,
669 | "input": [
670 | "## Main effect of paper, ignoring design:\n",
671 | "print(null.ssr - paper.ssr)\n",
672 | "\n",
673 | "## sum of simple effects of paper, eliminating design:\n",
674 | "print(design.ssr - additive.ssr)\n",
675 | "\n",
676 | "## main effect from ANOVA table:\n",
677 | "print(anova_table['sum_sq'][1])"
678 | ],
679 | "language": "python",
680 | "metadata": {},
681 | "outputs": [
682 | {
683 | "output_type": "stream",
684 | "stream": "stdout",
685 | "text": [
686 | "1.718721\n",
687 | "1.718721\n",
688 | "1.718721\n"
689 | ]
690 | }
691 | ],
692 | "prompt_number": 40
693 | },
694 | {
695 | "cell_type": "markdown",
696 | "metadata": {},
697 | "source": [
698 | "### Interaction effect"
699 | ]
700 | },
701 | {
702 | "cell_type": "code",
703 | "collapsed": false,
704 | "input": [
705 | "# interaction term:\n",
706 | "print(additive.ssr - full.ssr)\n",
707 | "\n",
708 | "## interaction effect from ANOVA table:\n",
709 | "print(anova_table['sum_sq'][2])"
710 | ],
711 | "language": "python",
712 | "metadata": {},
713 | "outputs": [
714 | {
715 | "output_type": "stream",
716 | "stream": "stdout",
717 | "text": [
718 | "23.386896\n",
719 | "23.386896\n"
720 | ]
721 | }
722 | ],
723 | "prompt_number": 41
724 | },
725 | {
726 | "cell_type": "markdown",
727 | "metadata": {},
728 | "source": [
729 | "# Partitioning of residuals, variance explained\n",
730 | "\n",
731 | "As we add terms to the model, the residual sum of squares gets lower. It turns out that for any model, its $R^2$ (the coefficient of determination) is given by the proportion of the variance explained, relative to the null model:"
732 | ]
733 | },
734 | {
735 | "cell_type": "code",
736 | "collapsed": false,
737 | "input": [
738 | "# e.g. for full model:\n",
739 | "print(full.summary())"
740 | ],
741 | "language": "python",
742 | "metadata": {},
743 | "outputs": [
744 | {
745 | "output_type": "stream",
746 | "stream": "stdout",
747 | "text": [
748 | " OLS Regression Results \n",
749 | "==============================================================================\n",
750 | "Dep. Variable: distance R-squared: 0.727\n",
751 | "Model: OLS Adj. R-squared: 0.659\n",
752 | "Method: Least Squares F-statistic: 10.66\n",
753 | "Date: Tue, 24 Feb 2015 Prob (F-statistic): 0.00106\n",
754 | "Time: 20:51:51 Log-Likelihood: -18.584\n",
755 | "No. Observations: 16 AIC: 45.17\n",
756 | "Df Residuals: 12 BIC: 48.26\n",
757 | "Df Model: 3 \n",
758 | "Covariance Type: nonrobust \n",
759 | "================================================================================================================\n",
760 | " coef std err t P>|t| [95.0% Conf. Int.]\n",
761 | "----------------------------------------------------------------------------------------------------------------\n",
762 | "Intercept 3.6853 0.223 16.514 0.000 3.199 4.171\n",
763 | "C(design, Sum)[S.simple] -0.1552 0.223 -0.696 0.500 -0.641 0.331\n",
764 | "C(paper, Sum)[S.50] 0.3277 0.223 1.469 0.168 -0.158 0.814\n",
765 | "C(design, Sum)[S.simple]:C(paper, Sum)[S.50] -1.2090 0.223 -5.418 0.000 -1.695 -0.723\n",
766 | "==============================================================================\n",
767 | "Omnibus: 0.751 Durbin-Watson: 2.909\n",
768 | "Prob(Omnibus): 0.687 Jarque-Bera (JB): 0.745\n",
769 | "Skew: 0.359 Prob(JB): 0.689\n",
770 | "Kurtosis: 2.224 Cond. No. 1.00\n",
771 | "==============================================================================\n",
772 | "\n",
773 | "Warnings:\n",
774 | "[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.\n"
775 | ]
776 | }
777 | ],
778 | "prompt_number": 42
779 | },
780 | {
781 | "cell_type": "code",
782 | "collapsed": false,
783 | "input": [
784 | "diff = null.ssr - full.ssr\n",
785 | "# express as a proportion of the total (null model) residual:\n",
786 | "prop = diff / null.ssr\n",
787 | "print(prop)"
788 | ],
789 | "language": "python",
790 | "metadata": {},
791 | "outputs": [
792 | {
793 | "output_type": "stream",
794 | "stream": "stdout",
795 | "text": [
796 | "0.727235230044\n"
797 | ]
798 | }
799 | ],
800 | "prompt_number": 43
801 | },
802 | {
803 | "cell_type": "code",
804 | "collapsed": false,
805 | "input": [
806 | "# same as R-squared:\n",
807 | "print(full.rsquared)"
808 | ],
809 | "language": "python",
810 | "metadata": {},
811 | "outputs": [
812 | {
813 | "output_type": "stream",
814 | "stream": "stdout",
815 | "text": [
816 | "0.727235230044\n"
817 | ]
818 | }
819 | ],
820 | "prompt_number": 44
821 | },
822 | {
823 | "cell_type": "markdown",
824 | "metadata": {},
825 | "source": [
826 | "# F-tests and t-tests\n",
827 | "\n",
828 | "As you may know from your basic statistics courses, the sum of squares for each term is divided by how many parameters were estimated for that term (the degrees of freedom added) to give the mean squared column, and then the ratio between the change and the amount of remaining error gives the F-statistic. This can then be tested against the F-distribution (with `model, error` degrees of freedom) for significance."
829 | ]
830 | },
831 | {
832 | "cell_type": "code",
833 | "collapsed": false,
834 | "input": [
835 | "print(anova_table)"
836 | ],
837 | "language": "python",
838 | "metadata": {},
839 | "outputs": [
840 | {
841 | "output_type": "stream",
842 | "stream": "stdout",
843 | "text": [
844 | " df sum_sq mean_sq F PR(>F)\n",
845 | "C(design, Sum) 1 0.385641 0.385641 0.484016 0.499861\n",
846 | "C(paper, Sum) 1 1.718721 1.718721 2.157158 0.167628\n",
847 | "C(design, Sum):C(paper, Sum) 1 23.386896 23.386896 29.352777 0.000156\n",
848 | "Residual 12 9.561029 0.796752 NaN NaN\n"
849 | ]
850 | }
851 | ],
852 | "prompt_number": 45
853 | },
854 | {
855 | "cell_type": "markdown",
856 | "metadata": {},
857 | "source": [
858 | "Consider the regression table for the full model above. How are the t-statistics related to the F-values for the model terms? \n",
859 | "\n",
860 | "In the $2 \\times 2$ design here, the t-tests are equivalent to the F-tests for the factors... their values are the square roots of the F-values (i.e. $F = t^2$)"
861 | ]
862 | },
863 | {
864 | "cell_type": "code",
865 | "collapsed": false,
866 | "input": [
867 | "full.tvalues[1:] **2"
868 | ],
869 | "language": "python",
870 | "metadata": {},
871 | "outputs": [
872 | {
873 | "metadata": {},
874 | "output_type": "pyout",
875 | "prompt_number": 46,
876 | "text": [
877 | "C(design, Sum)[S.simple] 0.484016\n",
878 | "C(paper, Sum)[S.50] 2.157158\n",
879 | "C(design, Sum)[S.simple]:C(paper, Sum)[S.50] 29.352777\n",
880 | "dtype: float64"
881 | ]
882 | }
883 | ],
884 | "prompt_number": 46
885 | },
886 | {
887 | "cell_type": "code",
888 | "collapsed": false,
889 | "input": [
890 | "print(anova_table['F'])"
891 | ],
892 | "language": "python",
893 | "metadata": {},
894 | "outputs": [
895 | {
896 | "output_type": "stream",
897 | "stream": "stdout",
898 | "text": [
899 | "C(design, Sum) 0.484016\n",
900 | "C(paper, Sum) 2.157158\n",
901 | "C(design, Sum):C(paper, Sum) 29.352777\n",
902 | "Residual NaN\n",
903 | "Name: F, dtype: float64\n"
904 | ]
905 | }
906 | ],
907 | "prompt_number": 47
908 | },
909 | {
910 | "cell_type": "markdown",
911 | "metadata": {},
912 | "source": [
913 | "# One way to visualise variance partitioning\n",
914 | "\n",
915 | "Recall that when we do ANOVA, we're partitioning the total variance into the variance associated with each factor, and that this is equivalent to nested model comparison. We can visualise this as a pie chart:"
916 | ]
917 | },
918 | {
919 | "cell_type": "code",
920 | "collapsed": false,
921 | "input": [
922 | "# plot the explained variances as a pie chart:\n",
923 | "labels = 'design', 'paper', 'interaction', 'residual'\n",
924 | "sizes = anova_table['sum_sq']\n",
925 | "colors = ['yellowgreen', 'gold', 'lightskyblue', 'lightcoral']\n",
926 | "plt.pie(sizes, labels=labels,\n",
927 | " autopct='%1.0f%%',\n",
928 | " colors=colors,\n",
929 | " startangle=90)\n",
930 | "# Set aspect ratio to be equal so that pie is drawn as a circle.\n",
931 | "plt.axis('equal')\n",
932 | "plt.show() "
933 | ],
934 | "language": "python",
935 | "metadata": {},
936 | "outputs": [
937 | {
938 | "metadata": {
939 | "png": {
940 | "height": 253,
941 | "width": 349
942 | }
943 | },
944 | "output_type": "display_data",
945 | "png": "iVBORw0KGgoAAAANSUhEUgAAAroAAAH6CAYAAADoTe/oAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAWJQAAFiUBSVIk8AAAIABJREFUeJzs3Xd4VFX+BvD3TJ+U6ZNM2qRzE0gghF4CoQVQRKrSm4od\nELuufS1rw7I/+6KuvXfXspZVxN4LXhUBaVKSkADpyf39cScYIKEmuVPez/PkGTJz595vYpy8OXPO\n9whFUUBEREREFG50WhdARERERNQRGHSJiIiIKCwx6BIRERFRWGLQJSIiIqKwxKBLRERERGGJQZeI\niIiIwhKDLhERERGFJQZdIiIiIgpLDLpEREREFJYYdImIiIgoLDHoEhEREVFYYtAlIiIiorDEoEtE\nREREYYlBl4iCgiRJWZIkNUmS9GAHXmNe4BqLOuoaREQUPBh0iSjYKB147q8BXAngkw68BhERBQmD\n1gUQEXUWWZa/BfCt1nUQEVHn4IguEREREYUljugSUaeTJCkPwDUAhkD9g/sVAHe3cpwJwLkAZgNI\nB1AJ4G0Al8myvGafY6cDOAtAV6ivbasALJdl+Z4Wx8wDsBzAObIs397i/uEALgfQE0AtgOcB/BPA\ndwCukmX5qsBx7wNIBTAYwE0ARgOwAPgCwOWyLP/vyL8rRETU3jiiS0SdSpKkAgAfATgGwBsAHgVQ\nDOC5fY4zAvgPgGsBVAC4M3D8ZACfS5LUrcWx0wA8BsANNcjeA8AJ4C5Jkv7WShlKi+dOAvAWgHwA\nTwN4CsAUAC/te2zg3zEAPgwc/yCAFwEMAvCmJEldD+d7QUREHYsjukTU2W4HEAVgjCzL7wCAJElX\nAHgPgK/FcUsADAPwD1mWL26+U5KkOwCshBpo+wXuPg/ALgC9ZFneHTjuagA/Qx3l/XtrhUiSFA3g\nLgA7APSTZXl14P4bAXzVylME1DD9IYCpsiw3Bo7/AWognw3g4laeR0REGuCILhF1GkmSkgAUAXiz\nOeQCgCzLZQCu2OfwkwCUA7i05Z2yLH8J4BkAfSRJyg3crYManvNbHLcTQB+oUx7aMhpAHIB/Nofc\nwHPXA7j1AM+7pTnkBvwncJt6gOcQEVEn44guEXWm7oHbz1p5bGXzPwIjrV0A/AngckmS9j22eeS3\nJ9S5uPcEPlZKkvQ9gNcDHytkWT5Qu7I+h1LPPhQAv+xzX0Xg1nyAaxERUSdj0CWizuQM3O5s5bGy\nFv+2B259UBeJtUZpPp8sy/dJkrQVwCKoI8b5AC4EsFGSpKWyLD/Txjk8gds/W3lsUxvPAdQFa/vW\nAqhTG4iIKEgw6BKFIUmSHgIwB0A81LfgjwPQAHVu6aWyLP/U4thUABcBKAGQGDhOBnC/LMv3tjhu\nHtR5sWMA9AdwKgAbgO8BXCvL8mut1DEVwDlQR3IbAfweeMje4phiAO8CWAo1KM4BMDbw8AeyLBcf\nytcsy/KLAF6UJMkOdW7veAAzADwhSdJPsiz/2MrTKgO3tlYea+0+IiIKIQy6ROHtDahzUB8AkAJg\nEoBiSZKGyrL8nSRJaQA+B2CF2lJrPYBkqF0H7pYkySDL8v/tc87roLbwehRqeJ0K4GVJkk6SZfmh\n5oMCi8H+BmAN1IAMACdCDbNToO5Q1tJlgdufAKyDGo7zJEmyyLJc0/LAQCuxbAAPA9gGdTHaTlmW\nl8myXAG1E8KLkiT9DuBqAAMAtBZ0vwjc9gPw/j6P9QMREYU0LkYjCm8OAD1kWT5PluUToQZNO9TO\nB4A6kusCMF6W5TmyLF8qy/JcqO2+AGB6K+fsDqBYluWFsiyfDqAv1DmqywKjqZAkqS/UkPsegG6y\nLC+SZXkRgFwAuwF0C4wQ71urAuALWZaPg9q6ywXgBkmS9kwJCLTwugfqCHCpLMtVABYAuEqSpH0X\nnjV/vq6N789LUKdMLAqE/uZrJAO4oI3nEBFRiGDQJQpv1wQ6GgAAZFl+HmoP26GSJCUCeATAAlmW\n3235JFmWPwdQA3U0eF+Py7L8WYtjf4e6uYIdwLGBuxcEbs9vORobqOWawKfLJUl6Dmr7LxG4Xks3\nAPgU6rzbLyRJulWSpOWB+6IBnCrL8q7AsRdB7W/7lSRJ90mS9A9Jkt4BMB/Ae7Isv93aNycQks+E\nOhf4S0mSHpAk6V4AX0Pt4gCoo9YtcR4uEVGI4NQFovD2Xiv3fQZ1g4Pusiy/AeAjSZJcAAoAZAGQ\noM7BNQPQH+I5Pw/cdgfwOIBegc+nSJI0fp9jk6GO3P4GdeFYdOD+F9FiBFmW5RpJkoYBOB/ANACn\nQ+13+yGA62VZ/rDFsU9KklQJdQrDeKihew3UlmU3tri2gr03gIAsy09JkrQbahuz6VBHnJ8IXOcp\nAFUHej4REQUvBl2i8LaxlfuaOwzYJUlyAlgGddGWAWqIWwN1cVhPtD56ecBzBm4dgduL2qhLAfCt\nLMtTWyxG2yLL8l7vMgVGg6/BX6PAbZJlubml2IGOeRjqvF4AgCRJsQBssiy/CuDVlsdKkjQ/8M/1\nLZ4/rI3zrgXfISMiCjoMukThzYr9W3k1h9BSqAvKxgK4G+o0hu9b7Cw2+wDn3FfzObcHbndB7d5g\n3WdjhWAjAfhMkqSHZFlesOdOSbJCndJQD2CFVsUREdHRYdAlCm99Abyzz30DoAa4L6GG3M9lWT6z\n5QGBhVlmtD6i2xfAK62cE1DnzwLAtwB6ACjEX9Mams/dB8BEAK/Lsqx1iPwS6lSOeS06UEQBGAfA\nD7UVW2s9domIKATwrTai8Ha1JEkxzZ9IkjQFakeFl6Au/lIAOCVJMrY4xgp1cRkAGLG/0yVJym5x\nvAR1wdgGAG8F7n4ocLssMD2g+dgYAPdCndKg+etPYNe0EqgtyOKhjuLOALAWwFRZlm/QrjoiIjpa\nQlG4roIo3LTYMKIC6hSF16AuApsAdY7tQFmWN0iS9DTUnrbfAngbaueC46CG4EaovXdjZFlWWmwY\n0bzd7bNQw+pkqKO/E2RZfrNFDbfhrwD8OtTdxCYG6ri7eRS5xRzd22RZXtr+3w0iIopUmo+oEFGH\nOgHAN1DbfQ2E2pu2nyzLGwKPnwTgNqhzbBcBGAl1sVYh1M0mLPirp26zqwHcD7W7wSQAKwEMbRly\nAUCW5SUAZkNdzDULavDeBLXl11nt+DUSERG1iiO6RGGoxYhuXsvtfo/ynPOgjuieJcvyXe1xTiIi\noo7EEV0iIiIiCksMukREREQUlhh0icJTR+zgxV3BiIgopHCOLhERERGFJY7oEhEREVFYYtAlIiIi\norDEoEtEREREYYlBl4iIiIjCkkHrAoiIwpEQQg91Zzkz1NdaAXVwwQZ1G+WmwEdji39XKYrSqEnB\nRERhiF0XiIhaIYQwAHAD8ALw2mKQ6HEizWyCv7JG10NEm3cZjMJsMAqT3iCMOr0w6vXCoNPDIIQw\n6I1CZzLrhNEsdHqDEDq9gBCA/Hate0y
6VC6EaGpSFKWhqQlNTU1KQ1MTdtXVNdU3NtY3NDXV1Tc2\n1tc33zY21tc1NtY1KcpuAOVNilJe19i49c+dO+WahoY/AGwEsFVRlCYtv2dERMGGQZeIIpIQQgDw\nAOiS4EUPpw19TUak2WJgi45CVJQFlgQvjEnxMCfFIdrngcXrArwuYNI5ourkB3pGHcl1b5/6R/X7\nM0+3GvX6w35ubUMDdtbWYmdtLSpqavDnzp1N63bs2Pl7WVnVhoqK2qq6ut3V9fVVu+vrd9c3Nm5r\naGpavb2q6qcd1dVrAKwDsEHhiz4RRRBOXSCisCaEsAHIdjuQH+dCX4MBXewxcBwzBDZ/IqJ75sLW\nLQsx2alqiBXi4Oc0Gzu87NavazDAbDDAEx3dfJcOgD3wsZedtbXYXFmJTZWV+KOiYvcv27bt/LW0\ndPfA1NQdFTU15fVNTd9vrKhYUVVf/z2ANYqiNHTil0JE1CkYdIkoLAgh3GYT+qcmothkRLfYaLhs\nMbDPnYCYghzE5neBvUsakBQP6I5yGa7JgEOIw9qKNZsR6/Wii9cLANGBDwBAY1MT1pWXj/x527ZF\nX2/aVPbT1q27BqelVVTU1JTXNTbKW3btWlFRU/M5gF85AkxEoYxBl4hCjhDCDKAgMwVjoqwY7HHC\nd9JkOEcOhKcwF+b0ZMDYgaOuRoOCpqYm6I42MWtEr9Mhw+1GhtutPyYnxwt1HjIURcHGysphP2/b\ntvDTP/7Y/vWmTRUFiYnbquvrv1tXXv5ybWPjJ4qilGtcPhHRIWPQJaKgJ4TwuewYkeDFREcs0iaX\nwF3cB87BvWDPywYMnfxK5ogFdpU3wOY2de6FO5gQAsl2O5Ltdt3IrKw4AHGKomSvr6gY+Nn69fPf\nW716W1F6ellZVdXmnbW1766vqHgLwA+c9kBEwYpBl4iCSmCRWFqCF2PcDoxzO5By+jS4jx2KuEGF\nMDhsWlcIxLuhbN9YF3ZBtzVCCPgdDvgdDvOU/PxkAMl1DQ3df9iyZfSKtWsv+viPP8p7Jydv31lb\n+/kv27c/AuALtkgjomDBoEtEmhNCOHweTI5z4cQR/ZFc2A2usUWI698DwmrRurr9+TwQv2+pQ4bW\nhWjEZDCgMCkJhUlJzkWDBjkBZKwpK+v7xi+/zHrnt9+2dU9IWF9eXf3KhoqK5xRF+UPreokocjHo\nEpEmhBD+tCTMddlx7MxxSJg1HonD+sFgDoFB0gQvDF9trde6jKCS7nLh9P797af372+vb2zM+nLj\nxuKXf/rpgkFpadvKqqp+2VBR8ciuurp3FEXZpXWtRBQ5GHSJqFMEpiT0kNKx0GXHgEWzED9rPBJ6\ndTv6LgidLd4NY+VPdY0ADr8ZbgQw6vXo7/eL/n6/D4CvvKoq/73ffx/3yqpVfxYmJW3dUV399pry\n8vsURVmnda1EFN4YdImowwghjBYzhmYkY+HQPsgbNRDx046BK9OvdWVHx2UHqsvr6sGge0icUVGY\nlJdnnpSXl6ooSuoPW7b0efTrrxf0Sk7eXF5V9caa8vJ7GXqJqCMw6BJRuxJCxHqcOD7Bi9ljipAx\naRQSJo5EtMepdWXtx2UHairruN3uERBCIN/nwz/GjvUpiuL7YcuWno9+/fX83snJm8uqq99cU1Z2\nr6Ioa7Wuk4jCA4MuER01IUR0UjzmxLkwc+oYJM84Fgklg2CKsmpdWcdw2oD63WwscLT2Db0/qqF3\nXq/k5M3lDL1E1A4YdInoiAghhMmIAV3ScOG4YuSdPh0powfBqI+AN/MdNqC+pjHEZhYHNyEE8nw+\n3NAi9D72zTfzCpOSNm3bvfuRDRUVD3AhGxEdLgZdIjosQghfph9L+vfAsZNLkDx/IhzuMJqWcCj0\nekCv49a4HaU59F4/ZoyvqanJ9+Hatd3v/fTTRV3j479bXVp6XV1j4+fcmpiIDgWDLhEdlBBCb4mx\nz/e5as6aPxHus2cjqWcuhNZ1ackYASPXwUCn02FoRoZhaEZGellVVfrDX3455NWff16f5nQ+s27H\njv/jlsREdCAMukTUJiGEz+3PujitsKik9/i5/i+fvlG3/LpfgnALh85nMnBEsbO5oqJwTlGRc8ng\nwc7PNmzods8nn5zcLT5+1bry8ht319e/z1FeItoXgy4R7UUIIcxRsSOdiakXFY6b3aVozjnJvqw8\nAMD67z6q+vCLX1DUW+Mig4CJr56aEUKgX0qKvl9KSmplTU3qE99+O+C5H37YkOFyvbimvPwmRVEq\ntK6RiIIDX6qJCAAghDA7EvxnpuT1WdDz2FkphcfNtpmjYvY6ZvCc86POu+71qk8f3hKlUZlBw2RU\nInrqRrCwWSw4tV8/+6n9+tm/3Lgx99YPP5wpeb0rf9m+/W+KoqzRuj4i0haDLlGEE0LEuJIzLkgv\nLDpx6Pzz07oMHG1SNzHbn8efhdKGNF1NzRZYInwCg9UM1FY3wWxl84Vg0SspSffYtGlp63fsSLt1\nxYribvHxP68pL7+8qq7uI61rIyJtMOgSRSghhMvtz7oyu//IY4Yv/FtaWsHAQ1peVTjhVN3ld39d\nf+M5dcaOrjGYeV1QSjfXIjEjTJsFh7AUhwPLxo1L3Flbm/jAZ5/17J6Q8PvmnTuv27579wucx0sU\nWQT/nyeKLEKIBG+adIM3TRoy8rTLUxO6dD+st+Ab6mrx4IKeNauf+y2ix3QvvxPVO6RMa/5g+2E9\n7/apf1S/P/N0a0Q0HA4S9Y2NePLbb3cu/+KLdVt27frnpsrK5Yqi1GtdFxF1PI7oEkUIIURmXHru\nTQXHTO89YuFlKR5/1hGdx2Ayw5XVv+nTb39Dvx7tXGQI8bmhW7O1Tusy6BAY9XrMLiyMnVlQkPef\nX3657Y6PPjonzel8eN2OHbcoisL/iERhjEGXKMzpDcbu3jTppn5TFuYPP+WSBJs38ajPWTT3gqil\nN79Z9dHybRG7KM3ngXHHj3UKENn9hEOJTqfDsTk5lmNzcqR3V6++8ob335+b4nDcsaGi4l5FUbin\nM1EYYtAlClNGs7XA48+6ffCsJTlD550XF2V3tdu549Jz8GdNqq6ubhtMpnY7bUjxOKHbXVpfByBC\nvwOhbXhmpmlYRob06qpVN9+6YsUZCTbb9X/u3Pk45/AShRcuFyYKM0KIxPiMri/1n3rqG6c9+L8h\nYxdf164ht1nBcSeLv99nbGj3E4cIlx2orajjKGAIE0LguK5drf89+eRul48YcU93n+9rb3T0ONFW\n2xEiCjkMukRhQggR40nN/mf+yMkfL7jrtfHHnntTvDk6tsOuVzBmuvmx9/wRu6DHaQdqdzVw9C8M\n6HU6zCgoiHnnlFN6nDd06CN58fGfOqzWIq3rIqKjx6kLRCFOCKF3JPgXp/cqOv3YpTelJ+UWdspy\nfqPFCkdab3yzajUKcjvjisHFZQfqqho48hdGjHo9TuvXzzG3sLDPnStXvtAtPn7V6rKyRTX19V9r
\nXRsRHRmO6BKFsBhX3LjEnJ7fjDvvlr+fct9/szor5DYrmnehdfGtnurOvGawsFoANHJANxxZjUZc\nMHSo+40FCwaf0qfPG5LX+4IQwqt1XUR0+Bh0iUKQ3mDM82XnrRg67/xHznh4RV63YcdbtZhW6MvK\nw/rKFNEQoTN1TQYuXApnNosFfx89Ou7pmTMnlGRnf5bucv1NCMF3QolCCIMuUQgRQvjiMnKf6zdl\n4dun/ev9QUWzlzj0Rm03KOt+7Hz8Y7k+IqOuUc+cGwnSnE48PXNm2o1jx/6te0LC157o6OFa10RE\nh4ZBlygECCFMHn/WLd2GHf/J/DtfnjT+wtt8ltjD25GroxSOm2156K3UiFyUZuTYXkQp6dLF/PZJ\nJ+Wd3KfP0zlxca8LIY6+KTURdSgGXaIgF2V3DUjMKfhq8hX3nTXrlmdSnYlpWpe0F5M1GjHJBcqP\nv2ldSeczGbhZRKQxGwy4eNgw9/OzZo09Nifn4wyX6zohBHspEwUpBl2iICWEiPamdnm413FzXjjt\nwQ+6pRcWBe0v08FzLrQuvsUVcYvSTEYFTU1NWpdBGkiy2/HYtGn+28ePP7dnYuI3cTExx2hdExHt\nj0GXKAjFuuPHpuT3/XL6Px6feey5N8UbzRatSzqgpNye4vfSFBFpmc9pAyq2R+T0ZAoYkp5ueuuk\nk3LP6N//UcnrfUUI0f67sxDREWPQJQoiQghnXHrOC/1POP2RU//1vpQo9ejUdmFHo1vJLCz7ty6i\ndgrzuaGUbqrVugzSmFGvxzlFRc6nZ8wYNyg19bNku/0ErWsiIhWDLlGQsMcnz0ovLPp8zm0vTBix\n8FK33hBaK516T1hguefV1Dqt6+hMPi9E2Z8RuQ6PWpHmcuGVuXMzT+vf/54u6uiuU+uaiCIdgy6R\nxoQQvviMrm8Xzz//zpPvezvT48/SuqQjYo6OhdWXh1/Xal1J50nwwLhjK4Mu/UWn0+HsgQObR3c/\nT7TZpmpdE1EkY9Al0ogQQriS0hdLg8Z8vODu10cOOPEMh04X2v9LDp5zgXnxLc4arevoLHFuGCq3\n13GSLu0nvXl0t1+/eyWv92WO7hJpI7R/qxKFKCFEYnxm1w9Lzrzq73PveCnNHpekdUntIiWvr27V\nn8mIlEVpLjtQXV7LoEut0ul0WDx4sPPJ6dOPG+D3f55ks03RuiaiSMOgS9TJ7PFJEzL7Dltx0j1v\nDioYOz1Gi617O4oQArkjpuHuJ0VELEpz2YHayvoIifV0pDLcbrw2b17mQnV090UhhE3rmogiBYMu\nUScRQhg9qdn39R4/7/4Fd/0nPdYdr3VJHaLPpFMsd7zgj4hFaU4bUFcVEZmejpJOp8OSwYNdj0+b\nNr5XUtJndoult9Y1EUUCBl2iTiCE8Mdn5X086bJ75o4640pPqM/FPRBrrAMmTzes3aB1JR3PHgs0\n1DaGz5A8dbgsj0e8Om+edGxOzkvpLtfFIpze0iEKQuH725YoSDgS/NO6DCj54JR73+qV0WtI0O5u\n1p4Gzj7PtPhme9jvlKbTAcaQ6XRMwcJqNOLuiRMTLy4uvriLx/MfTmUg6jgMukQdRAhh8qZ2ebjv\npJP/OffOl1OjnR6tS+o0aQWD9N9uiIyd0gw6RdG6BgpNJ/boEfvE9OklhYmJnMpA1EEYdIk6gBAi\nw5ed/+nUq5dPH37yxe5wnqrQGiEEugydjAdfRNhHXZOBOZeOXKbbLV6fP186Nifn5QyX6xJOZSBq\nX5H125eoEzgT/HNzhhz73sL7/1vg797PqHU9Wuk35TTLzU/6w35/XGNobWBHQciiTmVIuHjYsIuy\nPZ43hBB2rWsiChcMukTtRAhh8aZJTwyYduayOcue91ttkd0fPtrhhs6Rg41btK6kY5mNCkfgqF2c\n0L177JPTp48KdGXoo3U9ROGAQZeoHQghEnxZeSunXffvKUPmLHXy3UfVwFnnmZbcHBvWO6VFWYDq\n3dwzgtpHptstXps3r8vI7OwX/Q7HfK3rIQp1DLpER8kcHdsztceADxbc/Z+eSbmFfCO7hYzeQ/Vf\nrEnRuowOFeeGUropItoGUyexGI341+TJiTN79rwx0+1exnm7REeOQZfoKDgS/FOy+498+aS738gK\n1w0gjoYQAhkDxiuPvoKwXbHl8wClmxl0qX0JIXBRcbHnypEjT8lyu18VQli1rokoFDHoEh0hd0rW\n5d1Lpv7fjBufTDZa+DuoLQOmnWW9/tGUsJ2+4HNDX76lXusyKEyN79o1+sGpU0d3jYv7SAiRoHU9\nRKGGQZfoMAkhDN7ULo8VL7hg6TFLboiLtNZhhyvGFYem2C7YWqp1JR3D54WxYltd2I5Yk/byfT79\nC7Nn9+yXkvIB++0SHR7+hiY6DEIImzc9592Jl909pc+E+WwBdIgGTD/HuPSWmLAc1fU4oNtdVssh\nXepQ8bGxeGnOnKzhmZkvpjqds7SuJ1hIkrRWkqTyQzx2gyRJazqhpjRJkpokSXqho69FB8egS3SI\nhBCpiVLByjnLnh8cKVv5tpes/iMNH8nhuSjNZQdqK+rZdoE6nMVoxINTpyad0L37sky3+2YuUgMA\nLANw/WEc35nvvvCdniDAoEt0CKKdnkEZfYrfO/neN7t5/Fn85XKYdDodUvuMVZ57S+tK2p/TBtTu\nYs6lziGEwN+GD/dcNnz4qdlu9wtCiIjdlAYAZFm+XZblG7Wug4IXgy7RQbiS0hZIA0c/veCfr6VH\n+iYQR2PgjMXWqx5Mrta6jvbmtAP11Y3844c61cS8vJg7xo8fm+12vyGEsGhdD1GwYs9PogPwpnW5\nvtf4uScPP+VSD98lPDo2bwLqLJmibMcGuBxaV9N+LGZAKHyHkjrfgNRU0/KpU4ee/Oyz7wkhRiuK\nUql1TQAgSdL7AFIBnAngLgBeAK/KsnyiJEmFAC4HUATACkAGcI8sy/fuc454ANcBGAogCUAZgHcA\nXCXL8uoWx60FYJdl2dniPheAqwBMBOAE8CmAc1upcx6A5QDOkWX59la+hiEAHLIsVwbuMwI4A8B0\nADmB+jcD+A+Ay2RZ3n443yfqHBzRJWqFEEJ4/Nl3Fs1eevqIhX9jyG0n/U5coj9/WVSt1nW0N6Oe\nSZe0ke/z6R+fPr1f17i4/wkhPFrXE6AAcAN4EsAHAB4E8IEkSWMBrARQDOAlAHdAzSF3S5K0J+hK\nkmSBGh5nA/gcwK0AVkANmCslSdr3rTWlxXNjAHwINWT/DOAeADYA7wFoawFxW///7nv/E1DnBNcC\nuDdw7hoApwJ4vY1zkMY4oku0DyGEcPuz7h1+8sUnFo6bbdO6nnAiDR5jvP9+f436+yd8GA3MuaSd\nTLdbPDtrVsH0J574QAhRoijKBo1LEgBiANwiy/L5ACBJUhSAtQDKAfSTZfmPwP0XA3gKwCmSJL0o\ny/J/AIwEUAB19Paq5pNKknQugBsBTANwdxvXPh9ALoB
rZFm+IvA8HYCHAMwCsO1IviBJkvoDmATg\nUVmW57S4Xw/gKwC9JUnKlmX51yM5P3UcjugStRAIuQ+VnHHVdIbc9qfT65Hcc6Ty2vtaV9K+TBwy\nII0l2mx4ftas3D7Jye+a9PpsresJeK7Fv8cD8AC4qTnkAoAsywqASwKfzg/cNmeTHpIkmVuc4y4A\nflmW2wq5gDrquwPANS2u0QRgKYCmI/kiAtYDmAt12sUesiw3Avgo8Kn3KM5PHYQvz0QBQgid25/1\n+DFLbhjXtXh8tNb1hKvBs5ZaL730+epjizeFzXZyJoPCuS2kOXd0NJ6fPTt7xhNPvOmwWqfuqK7+\nUsNyFAAte9b2Ctz2liTpylaOb4I6igsAbwP4HcAEAFskSfov1KkMr8qyvLGtC0qSZAWQBeB9WZb3\naoUiy/J2SZJ+AXBEC/cC131EkiRDYJ6xBCATQE8AIwKH6Y/k3NSxGHSJAAgh9B5/9jPjzr91jDRo\ndNgEsGBkj09GlT5dVO7aBFuM1tW0D5MBaGhogsHAN8lIW7FmM56ZOTN97tNPvxAfEzNvy65d72pY\nTssuK81LUKe1cawCdeEYZFmuDkwVuBTAVKhTBiYBaJIk6XkAp8qy3NomEc1zd3e2cY0yAImHXv7e\nJEk6FepSZNAZAAAgAElEQVSIbvNWzOUAPgGwCkA/qFM2KMjwVZkinhDC4EnNfmnCJf8cy5DbOfqe\nsEh/4W3WsFmU5nZA2bGVvXQpOFiMRjw2bVrK4PT0R5Lt9uO0ridgV+B2uCzLulY+9LIs73nrX5bl\n7bIsnyPLcjLUkd4LoQbKKWh7fm5z+G1r0dm+79Q1T65vLQtFtfxEkqSpgetuBXA8gBRZlt2yLB8L\n4Js2rkdBgEGXIpoQwuRJ7fLa5MvvHZXZdxh7UXaS3KHHGd/82h82K7ji3UDZ5rDJ7RQGDHo9Hpg0\nKbFvSso9vtjYUVrXA+DbwG2ffR+QJMkuSdItkiTNDHxeIknSnZIkZQCALMvfybJ8E4C+AHYDGNza\nBWRZroYahgsDnRtaXiMWakuwluoCt9H7HCsAZOxz7IzmW1mWX9lnCkVu4JYjukGIQZcilhDC4k3r\n8sbUq5cPS+s5mFv6diK9wYCEvGLlnY+1rqR9JHihK9tSr3UZRHvR6XS4f9KkxN7Jycu9MTFDNC7n\nBQCVAC6UJGnfxXI3AzgH6pxXQO2beyb2733rg9q7dt0BrvMQ1I4PNzTfEQiu1wIw73PsqsDtMYHO\nDM1OB+Da59iaFjXsIUnSHKj9dhUAEb1LXbDiHF2KSEKIKG+a9MaJ1/57QFJuT/5/oIHBc861XnDl\ny1VfDtgcdfCjg5vPA+OKdXUHP5Cokxn0ejw0dWryzCeffNQVFTWlrKrqs0669F6jm7IsV0iSdDKA\nxwF8LUnSC1A3WxgKdZT3M6iBF1D71Z4N4HRJkvKhzoO1QZ220Ih9Oh/s4zaoHR4WSZLUN/Dc/gC6\nB663py5Zlr+RJOlLAAMArJAk6YPAccOgbjLRr8V5HwFwIoAXJEl6Auo84L4ABgL4L9SWaMHSx5ha\n4IguRRwhhDUuPee/M258YiBDrnaciWmoVNJ0u3ZrXcnRi3NBv6u0tlHrOohaY9Tr8ciJJ6YUJCQ8\nE2s29+iESypoZRMGWZafhTr6+Q6AsQDOgjr6ejWAkbIsVwWOqwEwCsAtAOKgju5OgbrZxBBZlt/Z\n51otr1EPoATqiG4CgNOgZp3RAP5opa5xAB4GkB2oxwo16H7S8lhZll+HupBuNdR+vPMBbIcadi8I\nHDb24N8a6mxC4YY+FEGEEAZvmvTm9H88PjQhO5+tYDT27RtP1cX/cgpuu7AmpKeOfPUjsPiB2Oo5\n12YfdDHj7VP/qH5/5ulWo54/ftS5quvrMfWxx9Z+uXHjMVV1dasO/gyi0McRXYoYgW19n5hw6f8N\nYsgNDt1GTDS9/FnK0TRxDwouB1C7s4GjBhTUrEYjnpw+Pa1HQsKrJoMh8+DPIAp9DLoUMdz+rDtL\nzrp6TEavIfsuSCCNGIwmxOUMblqhZVv7duC0AXVVDVxxTUEvxmzGUzNmZBQkJLwhhPBrXQ9RR2PQ\npYjg8WddOnDamTPzR04Oky0KwsfgOedHnXtHfJXWdRwNWwzQWBfyA9MUIewWC56eMSOrZ2LiW0KI\nI95AgSgUMOhS2HMlpc/PGzl58cBpZzoOfjR1No8/C6UNabqamoMfG6yEAIw6zlyg0OGMisIzM2dK\n+T7fm0IIvjZS2GLQpbBmi0sck9F76LWjz7rGe/CjSSuFExbqrrjbFNKNaI2G/VeZEwUzT3Q0Hjnh\nhDzJ631dCBHSC0KJ2sKgS2HLanP0Su7a675Jl92TIASnTwaz7iUnmJ5d4Q/pPXSNBuZcCj1pLhfu\nnjChT5bb/bwQgpmAwg5/qCksCSEyErr0eGb6DY+n6NjGKegZTGa4svrj8++1ruTImdiRmUJUYVKS\n4ZqSkmGZLtfdWtdC1N4YdCnsCCHikrv1fn3Orc+lG82Wgz+BgkLRnPOtS5Z5Q3ZRmtmo8G0DCllj\nJSnqtP79p2a4XJdoXQtRe2LQpbAihIj1Zee/OfvW5yRLrF3rcugwxGXk4s9qv64uRHfSjbYCuytD\nevYFRbhT+vZ1jpGkxYk22wStayFqLwy6FDaEEAZves5rM258oofNm6B1OXQEehx3srj2fmNILkqL\nd0PZvilEUzpRwLUlJXE9EhLuiDWbu2tdC1F7YNClsOFJzb73+Ivu6OtN7cK3kENUz7EzzI+9G5qL\n0nweiLLNtVqXQXRUdDodlk+ZkpIbF/esECJe63qIjhaDLoUFd3LG6X0nnTwxs08xdz0LYUaLFfa0\n3vhmldaVHNyWUj16Tc/Ew6+oLUh9Huh3bFUHo797px73nrkbd560Cy/eUo1dZftvJrGmogL9n3gC\nKzZu7NS6iQ4mymTCIyeemJ0bF/e6EMKqdT1ER4NBl0JelN3dL7XHgL8NnrXEqXUtdPSK5l5gXXyr\np1rrOg5kd7XA2TckYne1Ds1vH/g8MO7YVtf05++NeOtftYh1C+QVG7Hh50a8tGz/3TDu/e47dHO7\nMTgpqXOLJzoEvthY3DdpUo8st/spwf6MFMIYdCmkCSHi49KlRyddfm8iX4vDgy87Hxt2+kVDkE5g\n2LjVgNmXpuC7X/fu6OF2QFSX1dX/8H4DrNHACZdZMWy2GSMXmLF5dRO2rGncc6xcVob3N2zAqd05\nDZKCV77Pp18yePDQNKfzQq1rITpSDLoUsoQQpvjMrq/MuPHJLIOJMxbCSf7YubjxQf0RRd3bHnMj\nZ0J2qx9Lb/btOe7pt+wYuiAdvWdk4ozrErGldP9+y6vXm5A7MRvvfxENAHjoZQeOW5SKX9aZ0T9/\n705oLjtQU1nfVLGtCc4EHQxG9Q8vr189b8W2vzaUuO+HH9AzLg59fT4QBbNZPXvaBqWlLXJarQO0\nroXoSLDFOY
UsT2qXBydddm8BOyyEn8Lj5lgeXHh79SWn/H7Yr1E/rzHDZFSwcHLZfo91SVW7Ivzw\nmxmX3xWHgi41KMipxgvv2nH2DYl4+qb1ex1/+2Nu9MiuQXHv3QCAf7/iRHJ8A64+YwvWbDThk++j\n9hzrtAG1uxoUS5rAztK/Qm1dtfpvc+DQJhh0n27ejPtHjTrcL41IE7cee2zCr9u2/VsIMUBRlO1a\n10N0OBh0KSS5ktNPHTT97LH+7v2MWtdC7c9kjUZ0UgF+Wv07umYe3nN/WWdGVkodzpq2f9Bt9tx/\n7bDHNOHf126Ayaige3YNlt6SgB9Xm9EtU+2c8ONqM97+NAbLr9yw53nXnLkFA3tUQQjg9w2mvc7p\ntAH11Y0iIVuHVSsb8OvnDfB30+PL1+thNANxqerIbpPeauyXkIDuXu/hfWFEGjEbDFg+dWrW5Ecf\nfVEIMVRRlMaDP4soOHDqAoUcoyUqP7lr78sGTj+Li8/CWNHcCy2Lb3Yd1qK0XVU6bNpmgJR24DZf\nG7YakZ5YB5NRHW2V0tXjN2796++m2x7zoE+3agzo8VcJgwrUkNsakwnQQUH34UbEp+vw0rIa3Hny\nbvzyWQOGTDfDGiuw9rsGKDDoFubl7Xlek6K0fkKiIJJst+O60aN7Zbhcd2hdC9Hh4IguhRQhhC2p\na6+np1z5QBIXn4W3pNye4rVSv2hqKoPuEP8k/3mtOsp6sKBri27En9v/evnbXaWOtsZGqQNVX66y\n4MOvovD49etbfX5bTAZFMRgFZlxlxa+fN2D3DgUpuXrEpannX/F0HYRS3yi5XPoft2/HNZ9+irWV\nlUiNjcWFffqgMJ5tSyl4jcjKskzOyzsh2W5/b0NFxbNa10N0KDiiSyFDCCG8adKz0677t2SyRh38\nCRTyupXMxG2P6PZvQtsGea26KLGsQo/5lyehz4xM9J2ZiUX/SMCajX+N1vbMqcGvf5jw30+isatK\nh4dedsBqVpATGNm97VEPigqrUJi7f1uwAzHq1dFZvUEgZ4ARvcaa9oTc375owJbfm6Bvqq5vaGrC\nhStWwG2x4PbiYmQ6HLjgww+xM1T3P6aIcXFxsadrXNzNQogMrWshOhQMuhQyXMmZV486/YoBHn82\nh3IjRO8JCyz3vOI/5O3GmoPu8hddiI1uwomjK9C9Sw3e+jgGJ5zvx89r1BHfE0oq0C2zFmfdkIje\nMzLx5spYnDd3G5y2Jnz0TRQ+/9GKJTP/WnPTdIhR29jGe2SKomDF03XIGWiAQKPy8ebN2FZVhaW9\neqFfQgIu7tsXu+vr8cbatYf6pRJpQqfT4f7Jk1Pz4uNf4GYSFAo4dYFCgjkqtrDH6BNOyh81JUbr\nWqjzmKNjYfHl47d1a5GVevDjDXogKa4eNyzegj7d/ppb+8r/YnH+Mh8uudOH52/9Ayajgif/8Qfe\n/iQG28oN6NutGrkZzaO5bozqvwvdMmvx3S8WXHJnPFZvMCE9sQ5Xnr4VffPanjZsMiit/hH288oG\nlG1qwvFLLfjtf8CGnTsBAP7YWABArMkEh9mMjbt2HfL3hkgrDqsVd0+c2G3Bs8/+G8BUreshOhCO\n6FLQE0JYPP6sf487/1b2EYtAg+dcYF50s/OQFqVdfupWvHPf2r1CLgAcN3Qn+nSrxk+/m/dMYTAa\ngGMG78Lc43bsCbn//TQaP6y2YNGMUtQ3AGffkACPowEPXLERXVLrcNb1iajc1fbLptkINNTvPfzb\n1Khg5bN16DbEAKdPfW5jYAFaY4uFaHWNjeBbFRQq8n0+/ayCghFJdvuJWtdCdCAMuhT0PP7suyZc\n+n+clxuhUvL66lb9mSwOdfpAW3Jb6azQkqIAtz/mwTGDdyLbX4cPvozGljIDLj1lGwYVVOHqM7Zg\nd7UOr/wvts1reJ1QyrbsPc/2xw8aUFmqYMCkv9qRpdpsAIDvtqvTI9ZWVGBXfT38gfuJQsHZAwc6\nM12u64UQcVrXQtQWBl0KajGuuFFdhx1/bEpeH06ziVBCCOSOmIZ7nhIH7N3Z2Kj2vv3u19Z3yaup\nU8dLzabW23m99mEsft9gwqIZpQCAdZvVYJqaoAZXW0wTnLZGrN/SduvmeA9QuumvoNvYoGDl83XI\nH2aEzfPXy21fnw9xUVG48uOPsezLL7H0f/+Dw2zG6NRDmJ9BFCR0Oh3umTgxXfJ4nhJsg0NBikGX\ngpYQwuZKzvhnyRlXcbQgwvWZdIrljhdSD9iSoKFR4ITz/TjlquT9Fo8pCvD1z1YY9Mqekd29nwvc\n+YQbE4ZVIjWhfs99ANDY9Nfv79o6saePrhDYr6dugge68hYjut/+tx7VOxUMmLh3ODbr9bituBgJ\n0dF47tdfEWM04pahQxFl5P4nFFqS7HacU1TUK9XhWKx1LUSt4SgZBS1vmvTQlCvvz9bzl3/Es8Y6\nYHR3xbqNa5Ga1PoxZpOC4t678c5n0bjvORdOm/rXzmjLX3Ti1z9MmDisEjFR+8+BeOFdGzZtM+DM\naaV77stIVgPrNz9bMKBHNVZvMGJnlQ5piWoQnji8EhOHV+51Hp8Xxh2/1e/5vHCMCYVj9t5BrVmm\nw4F/lZQc2jeAKIhN69Ej9tVVq5YKIV5WFOV3reshaolBl4KSPT75xIHTziyOS8/h22EEABg4+zzT\n4ps/qn5xWUWbLY0uWrANX/9swW2PufHZD1ZIabX44TcLPv/Rimx/HS46adt+z6mrB+56yo2poyqR\n6G3Yc39Rzyr43A04f1kCji3aiXc/i4bT1ohxQyr3O0czrxP6naV19QD41xlFlDvHj0/5vazsKSFE\nf24RTMGEUxco6AghvN7U7OuK5izlFr+0R1rBIP2361MOuCgtxVeP5275AxOHVeKXdWY88qoDm7cb\nsGBCOZ644Q/YY/Z/8lNvOlBeqcfpJ5Tudb/ZpOC+yzciOa4ej//HjtjoJtxz6SZEW9vestdlB2p2\n1DW0eQBRmHJGReGakpK8dJfrOq1rIWqJI7oUVIQQwpue8+SUq/6VoTvUfV8pIgghkD10Mh568Yem\nBZPa/iM9wduA6xdvOeTzzh63A7PH7Wj1sS6pdXjyxkPfBthlB2p31redhInC2IisLEtRWtrsGLP5\nqV21tV9pXQ8RwBFdCjLOpPSzhsxZ2tvhS9G6FApC/aacZrnpyUPfKa2zOe1AXVUjp9tQxLphzJiE\nLLf7YSGERetaiAAGXQoiQohUX1beeb3Gz2UzUWpVtMMNnSMHm7ZqXUnrYqKApvqjbPhLFMKiTCYs\nGzcuN9Ptvl3rWogABl0KEkIIXXxm16cmX36vn+0Y6UAGzjzXvOQm2yHtlNbZhACMes5coMhWmJSk\nH5yaepzVaMzTuhYiBl0KCs7EtLOHn3JpfrTTo3UpFOQy+hTrPv89OWj/GjLqwaRLEe+akpKETLd7\nOTeSIK0x6JLmhBBOV3L6ovxRU7jHLx2UEAIZA8crj70anIHSZAjKsog
6lc1iwdKioq5+h+MsrWuh\nyMagS5rzpnW56/iL7kjnH/50qAZMO8t6/aMpQTl9wWQAf5CJAEzq1i060+U6Rwjh1roWilwMuqQp\no9lakN1/VLE3TWI4oEMW44pDY3S22Fp68GM7m8mooOlAzX6JIoQQAsvGjUvv4vE8oHUtFLkYdEkz\nQgjhSc2+v+Ssa3xa10KhZ8CMc4xLb42p0bqOfdmigd0VDLpEAJDmcmF8166DPdHRw7SuhSITgy5p\nxuHznzZ03nm55qgYrUuhEJTVf5Tho5+Dr99yvBvK9k1B2+qXqNNdMGSIJ9luv1MIwa2xqdMx6JIm\nhBB2V3L6uT3GTIvWuhYKTTqdDql9xiovvK11JXvzeSDKNtdpXQZR0DAZDPh7SUl2usv1d61rocjD\noEua8KZJdx5/0R0ZXIBGR2PgjMXWK5YnB9WitAQv9OVbGXSJWipKTzcVJCTMEEKkaV0LRRYGXep0\nRou1e2afYaPiMnKZcumo2LwJqDVnoGyH1pX8Jd4NU+V2bo9GtK+bjjkmOTcu7l9a10GRhUGXOpUQ\nQnhSsh4Ys+haLkCjdtFv2hLj+bdFBc2kWJcdqC6vrde6DqJg44mOxoSuXXu4oqKGal0LRQ4GXepU\njgT/yUWzz8k1R8dqXQqFiZzBYw3vfu8Pml0aXHagpqK+Ues6iILR4kGD3Ml2+63cMY06C4MudRoh\nRKwzIfWCnuNmsc0CtRudXo/kgpHK6//TuhKV0w7U7W7QugyioGQxGrGwb99sX2zsDK1rocjAoEud\nxpOafcv4C2/jAjRqd4NmnWO99L7EoFiU5rQB9TWN/CEnasOMgoLYRJvtUrYbo87AoEudQggR58vK\nK/Fl5/Nnjtqdw5eC3YYMVO7SuhLAYAD0CJqZFERBR6/T4ZLi4oxUh+NcrWuh8MfQQZ3CmybdPPrs\na1O1roPCV98pZxkuus0aFH29TAYmXaIDGZmdbU51Ok8WQnDBBnUoBl3qcEKIxASpR7HHn6V1KRTG\ncovHG9/4OiUo2noZDMy5RAdzTUlJWqbbfYPWdVB4Y9ClDudNz7llzNl/D769Wims6A0G+PKKlXc+\n1roSwKTXugKi4NcjIUGfFx8/TgiRoHUtFL4YdKlDCSH8yV17DXYmpmldCkWAwbPPtV54V0KV1nWY\njAoXoxEdgmtKSvxdPJ47ta6DwheDLnUob3rOstFnXZOsdR0UGVxJ6ahQ0nRVGkddiwmivjYoZlEQ\nBTW/w4Gi9PSBQogcrWuh8MSgSx1GCJHh796/vz2eOZc6T+/JZ+gu/T+LpovSvE40lf4ZFOviiILe\nJcXFCble761a10HhiUGXOkxcRu5tJWdclah1HRRZuo2YaHrpE20Xpfk8QOmmoNmVmCiouaOj0dfv\n7yGEyNC6Fgo/DLrUIYQQXdILi3rbvFxjQJ3LYDTBKw1SVnypXQ0JXujLt9RrVwBRiLlw6NBEyeu9\nUes6KPww6FKHiMvoevvI069gyiVNFM0933renfGa7ZTm88BYvo1TF4gOVaLNhu4+X192YKD2xqBL\n7U5vMHbL6je8Z4zTq3UpFKE8/mxsr08VNTUaXd8JXVVpHYd0iQ7DJcOGpWR7POyrS+3KoHUBFH68\nadLtIxb+LV7rOiiyFU44VXflPd/U37CkztjZ13bZgZoddQ0AOv3akWZ7dTXu//57fLRpE8pramAz\nmdDX58PC7t2RFBMDADj+pZfw5+7dBzzPZf37Y1yGOkX0gw0bcPtXX6G0pgZd3W5c0Ls30uz2vY7f\nUVODiS+/jDMKCjC1S5eO+eIiTLrLhS4ezxAhhFtRlFKt66HwwBFdaldCiJysfiPyouwurUuhCNe9\n5ATTMx/6G7S4tssO1O5q4PZoHWx7dTXmv/kmXvztN2TY7ZgmSejmduPNdesw7803sX7nTgDAdEnC\nKfn5+33M7toVOiFgNRjQze0GAGyrqsIlK1bApNdjUnY21u/cicXvv4/axsa9rv3wTz/BZjZjYhZ3\nfGxPlw4blprpcl2jdR0UPjiiS+0qLj3370PmnsvRXNKcwWSGK6uf8vn3v6FPfude22kD6qo0ydgR\n5f7vv8fWqiosKSzE9Jy/2rC+sWYNrvj4Y9z+1Ve4eehQTMtpvUXrzV98gSZFwbm9eiE9MGL7xtq1\naFQU3Dl8ODxWK4anpOCkt97Cyk2bMCxF3eBxW1UVnvv1V5zbuzcMOo4Xtaeu8fEi3eUaLYSIVRRl\np9b1UOjj/6HUboQQDm9al17stEDBomjOBVHnLPN2+vYRUVZAaeDuaB3tfxs2wGmx7BVyAWBMejqS\nYmLw6Z9/tvncb7ZuxTO//IJ+CQk4LjNzz/2bdu+Gw2yGx2oFAGQ7ner9u3btOWb5jz/CGxWF4zLY\nDasjXFRcnJrudF6qdR0UHhh0qd24kjPOHzr/Ar/WdRA1i8vIxeZqv66ukxsgCAEYDZy50JGaFAXz\nu3XDKfmtD9cb9XrUNzWhoan1lsq3f/01DDodzuvVa6/7Y00m1DT8NRq/u15dUxhjMgEANu7ahZdX\nr8Yp+fnQCf4t0xF6JyfrUxyOSUIIq9a1UOhj0KV2IYTQx7p9k1Ly+vBnioJKj+NOFtc9YOj0eQRG\nPZh0O5BOCJwoSZicnb3fY2srKrCushJJMTGtTi14b/16/FRainEZGfDbbHs9lu/xoKqhAU/JMqrq\n6/HoqlUQQiDf4wEAPPD99/DHxmJMWlqHfF2kOreoKC3Fbj9b6zoo9DGUULuIdnqn9Ju6kKO5FHR6\njp1hfvSd1E5v9WXiiK4mmhQFN33xBRRFaXOh2OM//wydEJidm7vfY0VJSRiclIRbv/wSw555Bo+t\nWoVZubnIsNuxtqICb6xdi4Xdu+91PWp/Q9LTjd6YmFlCcNicjg4Xo1G7sMUlLu0+amqU1nUQ7cto\nscKW1gvfrFqNgv1zTcdd1wD+gu5kiqLg+s8+wxdbtqCr241pkrTfMXJZGb7btg3DUlKQHBvb6nlu\nGToUH27YgPW7dqGb240eXrUn+L3ff49spxPDUlKwfudOXLFyJX4qK0Oc1YrFhYUY4eff+u1FCIHJ\neXkp8rZtQwD8T+t6KHRxRJeOmt5g7C4NGp2uN7JlKAWnIXMvtC5Z5u7UndLMRgVNbcwPpfbX0NSE\naz79FC+vXo3kmBjcNGRIq9MWXluzBgAO2hasKDkZM3Jy9oTcX8rL8d769TgtMJp7+cqVqG1sxLKh\nQ1GUnIzLVq7EH5WV7fxVRbbZPXs6Uh0OLkqjo8KgS0fNm9blmkEzF3MbNApavux8rK9MFQ2dOFPX\nHgPsLGOLsc5Q09CA8z/4AK/9/jv8NhvuGjlyT9eEfa3YuBF2sxl9fL7Dusbd336LfI8HAxMT8Wt5\nOX4qLcXpPXpgQGIilhYWwmYy4cXVq9vjy6EAm8WCfJ8vRwjBlpV0xBh06agIITxxGV0LuN0vBbu8\nMXNx00P6Tkue8W4opZs6ud
1DBKqsq8MZ77yDlZs2QXK5cP/IkYiPan0W1brKSmzctQtDkpMPq2PC\nd9u2YeWmTTg1MJrbvBFFSmDqg16nQ2J0NDa2aEFG7WPRoEEpGS7XRVrXQaGLQZeOijsl66LiBRek\naF0H0cH0Gj/HsvyNzluU5vNAlP7JoNuRahsbsfT99/FjaSkK4+Nxz4gRcFgsbR7//fbtAIAC7+H9\nYX73d9+hd3w8eserA4uNgQVojS0WotU2NnJSdgfoFh+P+JiYsUIIrimiI8KgS0dMCGGM9frGJUoF\nfH2noGeyRiM6qQCrOund5cQ4GMq3dnqzh4hy1zff4Pvt29Hd68XtxcWIOsg6gV/KywEAUmATiEPx\n2Z9/4qstW3Bajx577ksNtCRrDs6VdXX4Y+dO+NtY3EZHZ17v3inuqKiJWtdBoYl/IdERi/X4pg84\n4YxUresgOlSD51xgWXTz+9Vv313W4Y3o41ww7vyxrhGAvqOvFYm2V1fj2V9/BaAGz4d+/HG/Y4QQ\nmNu1K0x69T/BhsCUA28b83dbc8+332JgYuKeProA0MXpRI7Lhdu++gq/lZfj623bAAATW+npS0dv\nYteuUXd+9NFSAM9oXQuFHgZdOmKxnoSzug2f0Pb7hERBJrlroXi9NEU0NZWhlQX57cplB6rL6+rB\noNshfti+HQ1NTRBC4JU2FoEJITA9J2dP0K2sq4MQYs8uZwfz4YYN+KmsDP8eM2a/x24cMgTXfvop\nXvjtNyTExOD6wYOREB195F8QtclkMGBwWlqaECJbUZRfta6HQguDLh0RIUTagGlnpugN/BGi0NK1\nZBZuf+T7xnPmNnVoAHXZgdqKOvYX6yDFKSn4dMaMw3rOAyUlh3V8UXIyPpk+vdXH4qOicMewYYd1\nPjpyZw4Y4Ht39eorAMzSuhYKLZyjS0fE489e3G/KKYfXn4coCPQ+fr7l7ldTO3yVmNMG1O5u7OjL\nEEWEFIcDyXZ7fyGEWetaKLQw6NJhE0IIq901Mj6jq9alEB02S4wNlvg8rF7Xsddx2ICG2ka+xhK1\nk1k9eyY5rdbxWtdBoYUvwnQkCnKKxiZqXQTRkRo0+wLzopudHbpTml4PGHQt+k8R0VEZK0mWRJvt\nNK3roNDCoEuHLS49d2mv8XNdWtdBdKT8+X11P/2ZLDp6h16jAQy6RO3EajQi0+XKFELYtK6FQgeD\nLqXLEoEAACAASURBVB0WIYQ+xh3Xxx6XpHUpREdMCIGcYSfi3qdFh06iNeqZc4na07xevZISYmNn\nal0HhQ4GXTos5qjY4d1LTmDKpZDXd8pCy+3P+Tt0UZqJTUmI2tWQ9HSDNzp6ttZ1UOhg0KXD4kjw\nL+kx+oQYresgOlrWWAcM7q5Yt6njrmEyKNw1kKgdGfR65Pt8KUKIeK1rodDAoEuHTAhhcSSk5Fhi\n7VqXQtQuBs0+z7TkZltNR50/ygLUVrOVLlF7WtC7d5Lf4ViodR0UGhh06ZBFu7xTeh8/n9MWKGyk\n9Rys/+aPFHTUojSPC0rp5tqOOTlRhCpMShLuqKgJWtdBoYFBlw6ZzZNwck7RMWzWTWFDCIHsoknK\nwy+hQ6JuggeidFOH701BFFGEEBjg9ycJITK1roWCH4MuHRIhhCsuPTfdYGLOpfDS74QzrDc94e+Q\nYVefB7qyLQy6RO1tXq9e8Zlu9yKt66Dgx6BLh8ThS5nfd/LJnLZAYSfa4YZw5GDT1vY/t88NY8W2\nOvYYI2pnXbxeuKOihmldBwU/Bl06JFEOz5S0wiK91nUQdYSBM881nXNLbLsvSvM4odtdVl/f3ucl\nImCg3x8vhOiidR0U3Bh06aCEENGu5PQEnY4/LhSeMvoU6z9bndLu53XZgZoddR26KQVRpJqcnx+X\n7nTO1boOCm5MLnRQlhj7qLwRk9izkMKWEALp/Y9THn+tfbfsdTmAul0NnLpA1AHy4uNht1pHaV0H\nBTcGXTooW1zSnC4DSyxa10HUkQZOP9t63SMp1e15TqcNqKtq4KYRRB1ACIEcr9cnhHBpXQsFLwZd\nOiAhhIh2uCVrrEPrUog6VIwrDo3R2bqtpe13TqsFQBMHdIk6ytT8/ERfbOxkreug4MWgSweTndpj\ngFfrIog6w4AZ5xjOvTWmXRelmQwMukQdZXBamt4dFTVN6zooeDHo0gE5k9Kn542czKBLESGr/yjD\nip/bd1GaUa8w6RJ1ELPBAF9MjF8IYdC6FgpODLp0QNZYx+gEqYfWZRB1Cp1Oh9Q+Y5QX/9t+5zTy\n1y9RhxqVnR2nF6KP1nVQcGLQpTYJIazOxFS2FaOIMnDGEusVy5Oq2ut8JgO4GI2oAx2Tk2PLcLvn\na10HBScmGGqTyRo9Im/EJJ/WdRB1Jps3ATXmTFFe0T7nMxsVNDU1tc/JiGg/focDDoull9Z1UHBi\n0KU2ORL8c9lWjCJRvxOXGM9fFlXbHudy2ICKbQ3tcSoiaoPk9cYLIbiehPbDoEttinJ4cqLsbE9I\nkSdn8FjDO9+ntMsiMp8bSunmdsnMRNSG8bm5PldU1Bit66Dgw6BLrRJCZPnz+/KvY4pIOr0eyQUj\nlf98cPTnSvBCV7q5/uhPRERt+v/27juw6TJxA/jzzepIZ7qBsgoECgg4QMWFinfnOM95jlPPcXfq\n3amnnvOc5/x57q3nZqiIoKKAC9kUaAtlpoO26W7SNk2a0az390cLV/Zq+2Y8n39ik3yTJ5WmT9+8\n3/ednJurztTrfyc7B4UeFl3ap9QBQ68YP/1SbvtLUWvqH+6Me+CdAUe9U1pOBjQ2i7c3IhHRfqTE\nxSElLm6Y7BwUelh0aZ9iE5PPHjB6kuwYRNKkZOfCqR4Oe8fRPU6GARqH1cdJukR9bHBKSrqiKEmy\nc1BoYdGlfdKnpGdyWTGKdpMv/Zvm/pfjjmo41pAMuNs6WXSJ+tj0ESMy4rTaqbJzUGhhk6G9KIoy\nKHP4mBTZOYhkG3PGb7ULi3KPam0wQzLgsfu4vhhRH5s6dGhsbnLypbJzUGhh0aW9xCYknzJiylk8\nEY2inlqjQfbYM8TPa478MVKTAJ8r0HuhiGifBiQlITEmZpzsHBRaWHRpL0kZA87LHT+ZG5cSATjl\n2rvi7n0j54hPSktOBPydAe6ORtQPshMTMxRFiZGdg0IHiy7tJUafODIhlQO6RABgGDgMtuBQxXWE\nmwKrVIBG1StL8hLRQZw+fHg6gGNl56DQwaJLu1EURaM3ZKTLzkEUSo675BbVv16PPeKT0nT8fISo\nX5w+bFhinsFwkewcFDpYdGlP+UMmnJwqOwRRKBl31sW6+WuO/KQ0rVpwSJeoH4xKT4depztJdg4K\nHSy6tJvk7Nyzhx17Kvf9JepBo9UhwzhVrCg8suO1HNEl6heKoiBdr89SFIXz4gkAiy7tITYh6awB\noyfKjkEUck697p9xd7+adUQnpem0gr90ifpJfmZmIoBc2TkoNLDo0m7ikwwDtTGxsmM
QhZz0wSNh\n9Q5WPJ7DP1YfC7id3DOCqD9Mzs1Ni9VoOGJDAFh0qQdFURKTswZx2gLRfkz63V9Uj72tO+zGmpkG\n0VJ/VBusEdEhGp+drc1NTj5Tdg4KDSy6tIuiUh2fN+XMNNk5iELVMedcrvt82WDf4R6XnQ60NLDo\nEvWHISkpiNFojpGdg0IDiy7tkpY74vwhE06Kl52DKFRpY2JhyJsi1m06vONyMqBpazrsfkxER0Cl\nUiElLo6fThIAFl3qQRsbd2z64JGyYxCFtFOvuyf+Hy9mHNb2EVlp0LQ3dx7x8mREdHjS9foURVF4\nwgmx6NL/xCWmpHJFFqIDyxw+Bg3uISrvYcxESE+Bytnm5ZAuUT85YdCgZAD5snOQfCy6BKBrR7S4\nxJRk2TmIwsGE829Qnvqv5pBPSjMkA502X6AvMxHR/xw7YEBKTmIiN44gFl3aZXj2yPF62SGIwsGk\nc6+OmfHToZ+UZkgGOrm8GFG/yc/KQnJs7Omyc5B8LLoEANDGxo8dMHoiJ+8THQJtbByShh6Pku2H\ndv/UZMDnDnBeEFE/SY6NhV6nGyQ7B8nHoksAgJTsQVMzh49Ry85BFC5Ou+7euNtfTD+kndJidIAi\nRF9HIqIekmNj07gVMLHoEgBArdGNSx0wTHYMorCRPXI8zLZcxX+IMxK0ajZdov40NDU1AUC67Bwk\nF4suAQBi9IkGtUYjOwZRWBn3mz/iuQ/Vh1R1dRr2XKL+NDojIwHAUNk5SC4WXQIAxOiTkmRnIAo3\nx/322tj3Fx3aSWla/h1J1K9GpqcnJcfGjpKdg+Ri0SUoipKUkJbJFReIDpMuTg/9wInYVnEI99WA\ncwWJ+tHglBRk6vUTZecguVh0CQCGZ+WNZdElOgKnXHtv7O3PGw56UlqMVih+PzdHI+ovg5KToVap\nRsvOQXKx6BJ0cXpj1vB8bhZBdAQG5R+rlFtzleBBOqwhBUFbMzdHI+ovMRoN4rXaVNk5SC4WXUJS\n5oDjDbnD+W+B6Ajln/MHvDpTdcCdz7INQGvDYewbTERHTa/TJcjOQHKx3BBUau2Y1JwhsmMQha3j\nL7w+9vWvBx+wxWZnQNXSyKJL1J/idbpErqUb3Vh0Cbo4vUGji5EdgyhsxSYkITZrPCqq93+fnAxo\nOHWBqH/lJifHAsiQnYPkYdEl6OLi42RnIAp3U6+5J+a251P3e1JaRio0DmvnIW4vQUS9wci1dKMe\niy5Bo4uJlZ2BKNwNHj9ZtbVh/yelGZIBj83LIV2ifjQiLS0pOTZ2pOwcJA+LbpRTFEXFokt09BRF\nwehpl+OdOco+T0ozpACdDh+3RyPqR0NSUpCZkDBJdg6Sh0WXkvWpGdyziagXTL7kT7EvzR2yzzPO\nUpMAryvAk2KI+lF2YiLUijJcdg6Sh0WXMpIyB/JMNKJeEJeUCo1hDMwNe9+WlAAEvNwwgqg/xet0\nUKtUibJzkDwsupSelJ7Dk9GIesnUa+7W3fFckmfP6xUF0Ko5c4Gov8Wo1ZyeF8VYdKOcWqvLSszI\niZedgyhSDJ10irrYnIt9nZSmVYNNl6ifxWg0/NQyirHoRrkEQ+ZgfUo6/x0Q9RJFUTDi1IvFx1/v\nXWp1GvZcov6m44huVGPBiXK6OP0QfWq67BhEEeXEy26J+7/Zg/eavqDlaZ9E/U6rVnNEN4qx6EY5\nRaUaoE9h0SXqTfrUdCjJRjQ07369TiO46gJRP4vVaGIURWHfiVL8Hx/lFJU6LSYhSXYMoohz0lV3\n6f7xfOJuo7qJeiAYYNkl6k+G+Hg1AP6ii1IsulFOq4uJVRT+3iXqbXmTp6kLynN3uy7TABHwC77v\nEvWjzIQEHYA02TlIDr7hRjlNTCwn6RP1AUVRMOykC8Ts7/53Ulp2OpRAgEWXqD9l6vUxAAyyc5Ac\nfMONchodiy5RXznpir/FPflxrnvn1zkZUAf8nLpA1J8yExL0YNGNWiy6UU6t0fJsVKI+kpiWhYB+\nhGJp7fo6Kw3aYJBFl6g/GeLjNWnx8QNk5yA5WHSjnKJS898AUR868cp/aO96IcEDAGkpUBSw6BL1\np3itFnFabYrsHCQHS06UU3gmGlGfGnnSOZrl27pOSjMkAyoVf+aI+pNGpYJGpeKnl1GKRTfasegS\n9SmVSoXBx/9KfPVTd9HlTxxRv9KoVFAAnewcJAeLLvHfAFEfm3r1HXEPvzfQnZoE8E9Lov6lVauh\n5ohu1GLJiWKKoqhUajV/7RL1saSMAfDo8uB0AypFiIMfQUS9hSO60Y1FN7qpVRqt7AxEUWHK72/X\n/vPF+E61Giy6RP1Io1JBURQW3SjFohvdNBzRJeofo089V/NjSa5Q8V2XqF9xRDe68S03umnUGi2L\nLlE/UKnVGDTxbOH1yk5CFF20ajVHdKMYi250U6s5dYGo30y9+h9xPhHL912ifqRRqSA4ohu1+IYb\n3TQqtYYjukT9JCVnMBATF/zP8uXu77ZvD5ZZrfD6/bJjEUU0Tl2IbhrZAUgqtYpTF4j61TUz1qmq\nipbFFWxYFbCvL3Z5mmuVmGBQ6NUqpMXGKOOyMnFMVlbMiPR01eCUFGjVatmRicJa988QP76MUiy6\n0Y1zdIn6WXxSCvLP+C3yz/itGkB8z9s62qyoKlqGFRtX++0FGzweS70Sh6DQq1VKWmwsxmdl4Zjs\nrNgRaWlKbnIyNCzBRAfVPXWBRTdKsehGN05dIAohCanpGHfWxRh31sUa7PH+3NHajPLCpfhlw2q/\nY1Wx19vSoMRCCL1KpWTq4zA+Kwvjs7Ji87pLsJrLOxABALyBABSgU3YOkoNFN7qpWXSJwkOCIRPH\nTL8Mx0y/bK8SbLfUY3vhMvy0cbXPvny9z9faqMQpEHq1SsmMj8eE7CxlfFZW7PC0NAxKSoKKJZii\niMfngwCcsnOQHCy60a3T3+kOyg5BREcnKWMAJvz6Ckz49RVa7PERra2hBpuKlmNxyWqfY+kan6/N\nosQrEPFqlZKj1+OY7CxlXNdIMAYkJrIEU8Rx+/0QQrDoRikW3ejmcDvaArJDEFHfScnJxaTzrsKk\n867aqwS31lWiqGg5vtu42uvYuMrnb7eq4rtHggcmJopjsjLVY7OyYvLS0pCTmAhF4QdAFH48Ph+E\nEA7ZOUgOFt3o1uFx2DmiSxSlDAOHwTBwGHDBtTrssfySpboMBUXL8fWmgs6O4uWBgL1110hwblKS\nmJCTrc7PyIjJS0tDVkICSzCFLLffD18waJedg+Rg0Y1iQojgmNPO98nOQUShJ2PISGQMGQlcdENM\nz+uDwSAs1aVYWbgcX24p8HYU/uIP2tsUvVoRerVaGZycJCZmZ2tGZ2To8tLSkKHXswSTVB6fD95A\ngCO6UYpFN8oFAz5uSEpEh0ylUiFr2GhkDRsNXPqn3UaCg8EgmnZsxZKiFfhs05pO59qfAsGOdkWv\nVokEtUoZkpIsJmZna8ZkZuqGGwxIi49nCa
Y+5/L54Pb5OKIbpVh0o1zA7+eILhH1CpVKhZwR45Az\nYhxw+c17jQQ3lpbgx+KVmLV5Tadz1fqAcDmUBLVK6NUqZXhqqpiQk60dk5GhzTMYkBofv7+nITos\nHr8fjs7Odtk5SA4W3SgX9PtYdImoz6lUKgwYPREDRk8E8Ne9SnD99mIsKl6JGZvWeDpWrA0qHqei\nVykiQaNW8lJTxcScbO3ojAztcIMBKXFxkl4FhSOXz+f3B4Mu2TlIDhbdKBdg0SUiyVQqFQblH4dB\n+ccBuC22521+vx/12wqxoHgVPty8xuNctlqoPC7Eq1UiUaNWRqYZxMTsbJ0xI0Mz3GBAUmzsfp6F\nopXT6/UCYNGNUiy6UY5Fl4hCmUajweDxUzB4/BQA2L0Ee72o3boeXxavFK1b1rqdtSuh8XoQr1JE\nolajjEoziEk5OTEj09PVww0GJMbE7PtJKKK5vF4fALfsHCQHi26U49QFIgpXGp0OQyeejKETT1YA\n7Dafwe/1wrxpDT7fsEq0bVnrdtYtFxpfJ/QqBYlajTI6PU1MyM7WjeoeCdbrdPt5Fgp3jq6iyxHd\nKMWiG+UCfr8n4PdDreE/BSKKHBqdDsOPOw3DjzttrxLs9XhQU7Ian25YFWzbus7tql8Krd8LvVoR\nSVqtMjo9XUzMyd41Ehyn1e7nWSgcNNjtnQCssnOQHGw3ZPe6OxCXmCI7BxFRv9DFxiJv8jTkTZ6m\nwp4l2O1CVfEKzChZHWzbtN7lblii6AI+oVcrSNbplPyMDHFMdlbMyPR09bDUVMSyBIe8po4OH4AW\n2TlIDhbdaCdEW6fTwaJLRARAFxePUSefg1Enn6MCsNsaZ54OO6qKV+KjkjVB28b1LleDWYkJ+oVe\nrVJSY3QYm5mBY7KzY0akpamGpqYihp+UhQS3z+cRQnC7+yjFn8IoFwz62zpd3DCGiOhgYhOSMPrU\n32D0qb/ZqwS77DZUFS9HwcbVAXtRsdvdVINYERB6lUoxxMZgXGYmxmdnxYxIS1MNSUmBjiW433j8\nfp6IFsX4kxblvC5nvbu9TXYMIqKwFp+UgvzTL0D+6Reoscd0CFd7K3YULsPKjWsC9nVFLo+lTokV\nQZGg7irB4zMzd40ED05JgUatlvQqIpPH52PRjWIsulHObqnfZms0dwJTue4OEVEfiE82YNyZv8O4\nM3+nxh4jwR2tzdhRuBzLS1b729ds8HRa65U4ERR6jVrJiIvD+KxMjM/Kih2RlqYMSk5mCT5MQgi4\n/X6uuBDFWHTJbK0uswPIkB2EiCjaJBgyMX76JRg//RIN9vidbLc2wlS4DD9vXO13rCzydrY0KnEQ\nIkGjUjLi43BMVpYyPisrJq+7BKtVKkmvInTZOzsRCAZ5IloUY9GlWqu5zCM7BBER7S4pPRsTfnU5\nJvzq8r1KcHtTHbYULsMPJat9juXrfL7WJiVOgdCrVUqWPh4TsrKV8dlZscMNBgxMSoIqSkuw1emE\nPxg0y85B8rDoRjkhhMd4yq/5sQ4RURhJzhqISedeiUnnXqkFsNsaZ2311dhYtBwLN672dZSs8fls\nll0leEBCgpiQnaUam5UVm2cwICcxMaJLcHNHBxydnTtk5yB5WHQJXrezQ3YGIiLqHakDhiB1wBAc\ne/4f9irBLTUVWF+0AgtKVnsdxSv9AXvLrhI8MDFRTMjOVo/NyozJMxiQnZgIRVEkvYre0dTR4Wvq\n6KiUnYPkYdEleF0sukRE0SAtNw9puXnAhdfpAOza9zgYDKLFXIY1RSsxf/PqTmfhskDA3qrEq1VC\nr1YpuUldJTg/s6sEZyYkhEUJrrPbnQCaZOcgeVh0CX5vp7XT6UCMPlF2FCIikkClUiFjqBEZQ43A\nxTfstgpPMBiEpXI7lhevwBeb1nQ61/0SCDraFH13CR6SnCwmZmdpRmdm6vIMBqTr9SFTgs1tbS4A\njbJzkDwsuoRgwGeyNdUga3i+7ChERBRiVCoVsvLykZWXD1z6571KcFP5ZvxUtAKfbi7wdBT8FISz\nXdGrukrw0NQUMTE7WzM6I0OXZzDAEB/fryW42mZzgyO6UY1Fl+CwNpbYGmpE1vD80PgTnIiIwoJK\npULOqGOQM+oYALfG9rwtGAyioXQjvi9agZmbCzqdq9YH4HLsGgnOM6SKCdnZ2jEZGdrhBgNS4+P3\n8yxHzun1OoQQotcfmMIGiy6h0+nY0VJb4QCQJDsLERFFBpVKhYGjJ2Hg6EkA/r7bSLDf70fD9mJ8\nt2ElPt5U4HGuKAgqbqeSoFaJBI1aGWFIFROyc7SjM9K1w9PSkBwbu59nOTBHZyf3uI9yLLoEAGZr\nVWkHWHSJiKgfaDQa5I47AbnjTgCA3Vqs3+9H/Zb1+HrDSry/ucDjWrpKqDrd0KtVIlGrUUYaDGJi\nTrZuVHq6Ji8tDYkx+97Y09HZCY/fz2kLUY5FlwDA0rUNMBERkVwajQaDJ5yIwRNOBPYswV4vares\nxdzilaJ1yzq3c8kKqL0e6NUqkaTVKMa0NDExJ1s3Mj1d4/b50On3l8h5FRQqWHQJQojgiClnOWXn\nICIiOhCNToehk07B0EmnKADiet7m9XhQu7kAnxavEm1b17ptFZtVZputVVJUChEsugQA8LocnMdE\nRERhSxcbi+HHn47hx5+uAIhb9OqDTeUfPPed7FwkV+Tu+0eHpdPlbOh0susSEVFkaN6xzQXALDsH\nycWiSwAAT0f7ssbyzbJjEBER9QpPh71dCBGQnYPkYtElAIC9uW5lzeZ17bJzEBERHS0hBDwOm012\nDpKPRZd22lKzqYBvCkREFPYc1gYEfL5y2TlIPhZdAgAIIdwdrc0sukREFPYsVaVw2qyrZecg+Vh0\naRdXe2tLMBiUHYOIiOio1G/fYHO2WTbIzkHysejSLn5v54bW2grZMYiIiI5K9cZVbQB4hjWx6NL/\ntNVX/1y/fYNXdg4iIqKj4WhpahFC8PcZsejS/wR8ncXVG1a1yM5BRER0pDyOdnR2tFfLzkGhgUWX\nemqwVJfZZYcgIiI6UrXbioSzzbpIdg4KDSy6tIsQQrgdbdwXnIiIwtaO9b9YOlqbl8rOQaGBRZd2\n43V1mN32NtkxiIiIjkjd1qJ2AFxDlwCw6NIe3Pa2XxrKNsmOQUREdEScNqtFCCFk56DQwKJLu3FY\nG1ebSwo4fYGIiMJOe3MdvG7nVtk5KHSw6NKetlRvWMm5C0REFHZqNhV4bY01C2TnoNDBoku7EUL4\n7ZaGJn7qQ0RE4aa84Gerz+3k1r+0C4su7aXT5VjRXLlNdgwiIqLDYjWXtwkhmmXnoNDBokt7aTGX\nf1G26geb7BxERESHKhgIwGWzsuTSblh0aV+Ky9f+bJUdgoiI6FBZqk3welxrZOeg0MKiS3sRQvgd\n1oamYDAoOwoREdEhqS5e1dFiLl8oOweFFhZd2qdOp2NZ8w6u0EJEROFh+4qFzQDWys5BoYVFl/ap\np
abii9KVi7nMGBERhbxgIABbo7lBCNEpOwuFFhZd2p+NFeuXtsgOQUREdDB124vhcbR/JzsHhR4W\nXdonIUTAYW1o5DxdIiIKdZt/nNvUVl/1mewcFHpYdGm/Op0dvzSVb5Idg4iI6IBqNq21CiEqZOeg\n0MOiS/vVWlsxt3Tl962ycxAREe2P294Gp63FJDsHhSYWXTqQkh2FnKdLREShq3T1D572ppqPZeeg\n0MSiS/slhAg6WpoagoGA7ChERET7tOWneY2dTsePsnNQaGLRpQNyt7d+bS5ZzTPSiIgo5Agh0FpX\n2SiEcMrOQqGJRZcOyNZYM6Pw64/rZecgIiLaU2PZJng62n+SnYNCF4suHZAQoqmhtKRBCCE7ChER\n0W62LPmqpaWmYpbsHBS6WHTpoFztrQvqthXJjkFERLSbysLlFgDbZOeg0MWiSwfVVl/1fuHXHzXI\nzkFERLRTp9MBp81aLviRIx0Aiy4dlBCitnZLIacvEBFRyCgr+NHrsNRz2gIdEIsuHRK3vfXHpvLN\nsmMQEREBAIq++aTO1d76tewcFNpYdOmQtNRUvFP4zSfNsnMQERF1ujrQVl9VxmXF6GBYdOmQCCEq\nzCWrOU+XiIik2/zTPJetsfZ12Tko9LHo0iFztbctt5rLZMcgIqIot+G7WXUeh22h7BwU+lh06ZBZ\nq0vfLF4w0yo7BxERRa+ONgvsloYSIYRPdhYKfSy6dMiEEFsr1v3SKDsHERFFrw3fzWpvqSl/QXYO\nCg8sunRYXPbWtW311bJjEBFRlNry8/z6gM+7WnYOCg8sunRYLJXbX9+wcHab7BxERBR9Wusq4Wyz\nruQmEXSoWHTpcBVvW/ZtHd9jiIiov62b977FUmV6UXYOCh8sunRYhBDC2Wb9un57MZsuERH1GyEE\nKtb+XC+E2Co7C4UPFl06bK21FS+vnP1arewcREQUPRrLSuC0tSySnYPCC4suHTYhRHNDackOn8ct\nOwoREUWJgi/erW+t3fGa7BwUXlh06YjYm+te3vTjXJfsHEREFPmCwSBqNq+rE0Lw00Q6LCy6dERc\ntpZvCr/+iG84RETU5yrWLvG5bNZZsnNQ+GHRpSMihPA7rI3LW2t3yI5CREQRbsWMF822xpp3ZOeg\n8MOiS0fMUmV6cvmMlxpk5yAiosjVWleJtvrqFUIITpejw8aiS0dMCFFZvWHVDr+3U3YUIiKKUEs/\n/E+9pcr0iOwcFJ5YdOmo2C31L5Z8P8cpOwcREUUer9sJc8maUiEE956nI8KiS0fF2Wadv27eezWy\ncxARUeRZP/+D9rb6qidk56DwxaJLR0UIEXC0NC1sLNskOwoREUUQIQSKv5tl7nQ6fpadhcIXiy4d\ntRZz+bPLPnmRo7pERNRrygt+9nW0NH8ohOCW83TEWHTpqAkhmhq2b9jqtrfJjkJERBGia0kx81uy\nc1B4Y9GlXtFcuf3epR/+xyI7BxERhT8uKUa9hUWXekXA79toWrnI5HVzAQYiIjo6XFKMeguLLvWa\n1rrKB1Z9+jrnLxAR0RHjkmLUm1h0qdd0Oh3LS76fU+73eWVHISKiMMUlxag3sehSr2pvrHl8/fwP\n7LJzEBFR+AkGgyj+diaXFKNew6JLvcrV3vrtuvnvVwYDAdlRiIgozJQs/txptza+wCXFqLdoZAeg\nyCKEEEmZA/5Tsvjztyaee6Vedh76n4oVC7B14QzYaiugjU9A5siJOO6K25A8YBgAYM7fzkFHb8P2\nIgAAIABJREFUS8MBH+OUm5/AyNMvBACY1y/B2hnPwW2zIiNvPE684UGkDBy+2/099jZ8cfuvcdwV\nd2DMr67smxdGRBEhGAhgxcyXK+3NdR/LzkKRg0WXep3D0jBr5ezX7p/wmyvyFUWRHYcAFH72Ckrm\nv4vknKEYfc4VcLU0obLgezRsKcBvn/4ciZmDkH/uNfC5O/Y61t/pweYFH0Kti0HGiHEAAFdrM355\n+S4kZg/B6LMvR+XqRfjh6Ztx0QvfQKOL2XVsyVf/RYw+GcazLu2310pE4an4u5kdDkv9M0KIoOws\nFDlYdKnXCSGCKTmD39i+7Nvnx5x+fszBj6C+ZCnfhJL57yI7/wScc99bUGt1AIAhBdOx5KU7sWHu\nWzj1licw9txr9nn8mg+fghBBnPjH+5EyMA8AULFyAYKBAH714LuIT0nH0CnnYMHDV6N2wzIMnTwd\nQFcZNv34OaZcdx9UGm3/vFgiCksBvx+rZr9eabc0zJadhSIL5+hSn2hvrHln6Uf/qeI0K/m2fT8b\niqLC1D89uqvkAsDQKdNhPOtSJOcM2e+xTduLsG3xbAw85mSMPOOiXdd3NNchNikV8SnpAIDUIcau\n6y31u+6zcd7biE/N3O04IqJ9KfrmY4fD0vAER3Opt7HoUp8QQvjszfUzKouW+2RniXZ1G1YgdfBI\nJGUP3uu2k296BMf87k/7PXbtJ89BpdZgyh8f2O16nT4J/k73rq93TnnQxSUAABzNtShd8iUmXnor\nFBXfZoho/wI+H9bMeXuHo6VxjuwsFHn4G4j6TFt91fM/v/tUlewc0czd3gKPow0pg/Jgq9uBn56/\nHTNvOAkzbzgJS166C47muv0eW7X2B1h3bMbIM36316hv5sgJ8Hlc2LpoJnweFzZ/8wEUKMgcNREA\nsOGLN5GUMxR5p5zXp6+PiMLfuvnv2+3NdY9ypQXqCyy61GeEEG5bQ/U8c8karjUmiavN0nXZ0oQF\n/7oKzpYGjJx2MTKNE1FV8D2+fegqdFj3vdLClm8/hqJSY9z51+91W+5xZyD32NNR8NEzmHH9FGz+\n9iOMu+D6XYW6YuW3mHTZrbvuL4L8NJKI9ub3ebH2y/9WdLQ2fyU7C0UmnoxGfaqlpuKx716676K/\nvLdkJFdg6H/+ThcAoHF7IUacdiFOufnf2Pn/YeuiWSj46GkUfPQMzrrr5d2Oa6nchubSDRg6efo+\npzwAwNn/fA3mwl/gaDQjfcQ4ZBmPBQAUz3kdhiGjMHTydNgbzVj2+v2wlm9CfFoWJv/hnxh64jl9\n+IqJKJwUzHm7vb2p9l8czaW+whFd6lNCCFd7Y+2LJd/PccnOEo0UpetHXKVSY/K196DnHxtjfnUl\nEjMGom7Dcvi9nbsdV77sawDAqIMsCzb4uDMw9rxrd5XclqrtqC74Acde9ncAwNJX70XA68H0+97E\n4OOmYemr96C9gdvXExHg6/Sg8OuPyl22loWys1DkYtGlPmdrNL+99MPnyvYsU9T3dPGJAICEjIGI\n0SftdpuiKEgdMgoBvw/OPaYv1BT9gpiEFAwYd+JhPV/RZ68gY9QEDJp0KlqrTbDu2Ixjf38bBk6Y\nisnX3QtdQhJKf+L5JkQErPn8TVt7U+0DHM2lvsSiS31OCBFsb6y9e/nHL7TKzhJtEjIHQVFUCPr3\nvfhF0O8HAGhiYndd115fCUdzLQYfP+2wVkxoLt2A2g3Lce
zlXaO59kYzACApu+tENpVKjcTMXDia\na4/otRBR5PB53Cj6dmaZ09byvewsFNlYdKlfOG3WHzcu/nyj09YiO0pU0ehikJ43Dh0tDbuK507B\ngB+t1SbEJqYi3pC16/rmshIA2DUd4VAVfvYKcsZOQc7YybseHwBE8H/nIga8HgCcq00U7ZZ/8kJr\ne2PNvbJzUORj0aV+01Sx5ebFrz5Yf/B7Um/auf1uwUdP7yqfALB5wUdwtTUj79QLdpu721q1DQCQ\nNmzMIT9H/aY1aNy6Dsf+/u+7rkseMAxA10gvAHR2tKO9ofqAG1QQUeSzWxqwcfGcIld76xLZWSjy\ncdUF6jdCiNLMYcZfLNWlV2UMGSU7TtQYecZFMBf+AvP6n/HVvZdi4ISpaK/fgdoNK5CcMxSTLr11\nt/s7mmoAAPGpmYf8HEWfvYJBE09F5sgJu65LGzoa6cPysfaT59BqLkXT9kJACBjPvrx3XhgRhaVv\nnruzpnnH1htk56DowBFd6leWqtLbv/3P3ZWyc0Sbaf94AZOvuQcAsP37T9FaXYox51yJ8x6fAW2c\nfrf7dna0Q4ECXXzCIT22ufAXWCu6Tjrb05l3vYzMURNQ+tMcBLydmPaPF5CQMeDoXxARhaXKohW+\nBtPGz4UQNbKzUHRQeLIj9be03LwnL374rbvzjj9DJzsLERH1j2AggDf/eJqpdsu6CUIILsND/YIj\nutTvWmt3PL741Yd2BLlbFhFR1Fg1+zWbvbnufpZc6k8sutTvhBCd9ubaZ4oWfNIhOwsREfW9jjYL\n1n/1YUl7c9082VkourDokhTtTXUfr5z1qsnnccuOQkREfey7F+6pa6rYcr3sHBR9WHRJCiGEaKur\n+uvi1x5qkp2FiIj6Tu2W9YGazWu/EkLskJ2Fog+LLknj6WgvMK1YuLB++4bAwe9NREThJhgM4pv/\n3FVhrS77p+wsFJ1YdEkqq7nsr/Oe+mt5wO8/+J2JiCisrJv3nr29seZxIYRLdhaKTiy6JJUQwmVr\nMN++5L2nuTcwEVEEcTtsWP3Zm1vam2pnyc5C0YtFl6RzWBsXlyz+fKmlupSLOhMRRYiFL91Xb6ky\nXS+4YD9JxKJLIcFSZbrxy8dv5tq6REQRoKp4hbeycPlXAZ/XJDsLRTcWXQoJQghba13lg6tmv2aT\nnYWIiI5cp6sDXz/7j+1Wc9kdsrMQsehSyGhvqv1s7Zf/LbA1mGVHISKiI/T1s3fUW6tLrxZCeGVn\nIWLRpZBiqdx+7Zf/vrmKU7qIiMKPacUid1Xxyg+9Htdm2VmIABZdCjFCiGaruez/ir75xCE7CxER\nHTpXeysWvvLA5tbaiodlZyHaiUWXQk5bffVbyz95YWNHa7PsKEREdIi+/PctNU3lm68QQnATIAoZ\nLLoUcoQQoqli69XznvwrJ+sSEYWBDQtndzSYNrzCbX4p1LDoUkgSQpgbS0veKP5uFqcwEBGFMLul\nAUvee7aota7yedlZiPbEokshq7Wu8v+W/Pfp1VZzOc9MIyIKQUIIfPHoTVXNO7ZewY0hKBSx6FLI\nEkIIS5Xp8s/+dV2Z39spOw4REe1hzZy32i1VpseFEA2ysxDtC4suhTQhRHuLufzar5+5nW+iREQh\npLWuEqs/fWN1W331B7KzEO0Piy6FPFd7a0HF+qXvlnw/xyk7CxERAcFAAF88clOFpcp0tewsRAfC\nokthobW24tEf3/p3QWtdpewoRERRb/Fr/7K01FTcIYRolZ2F6EBYdCksdM3X3X7pZw9cW+b3cVdJ\nIiJZti9f6N6y5KsZ7c11C2RnIToYFl0KG0KINqu5/IYFz/2jUXYWIqJo1FZfjYUv3buuxVx+t+ws\nRIeCRZfCitNmXVG25uf3N/88n/N1iYj6kd/biU8fuKasuXL7RUKIoOw8RIeCRZfCTmttxUM/vP7w\nelsDN04jIuov85/6W72lynQN5+VSOGHRpbAjhAg2V26/5NMHry0P+Hyy4xARRbzCrz+yVxYue9XV\n3logOwvR4WDRpbAkhGixVpX++dsX/tkkOwsRUSSr21bs/+WD535qrat8VnYWosPFokthq6PNssS0\ncvEb6+a93y47CxFRJOpos2DOIzduslaXXsUtfikcaWQHIDoaLTXlj2cMGWVMy827ZPjxp8fIzkNE\nFCkCPh9m3XPljqbyzRcIITyy8xAdCY7oUtizmsuum//U39ZYzWUcbSAi6iVfPXNbQ/OObTcIIepk\nZyE6Uiy6FPaEEH5LlemCWfdetcVpa5Edh4go7BV88W57xbolL3a0Ni+VnYXoaLDoUkQQQjgaTBvP\nm3n35Tu4cxoR0ZGrLFruXTHz5YUtNRXPyc5CdLRYdCliCCHMjeWbr57z8A31PGeCiOjwNZVvCcx7\n4taV1urSa2RnIeoNLLoUUVztrWvMJWvu++GNRy2ysxARhRNbYw1m33/1BkuV6XwhhF92HqLewKJL\nEaetvvqTjYs/e3/DwtkO2VmIiMKBq70VM+66bGtTxdZfCSFcsvMQ9RYWXYpIrbU77v/p7ScWV29c\nw63TiIgOwOdx45M7L6mo21b0GyEEz+iliMKiSxFJCCGs5rKrvnz8z+tb6yplxyEiCkkBvx8z77nC\n3GAquUQIYZadh6i3sehSxBJC+Jort583859XbHPb22THISIKKUIIzH38z/X124tv9DjtG2XnIeoL\nLLoU0YQQbfXbi8/98LYLyz0ddtlxiIhCxqJXHmyuLFx2n93S8KPsLER9hUWXIp4Qoqpua+FvPrr9\nwopOV4fsOERE0q2a/Xrb5p/mvdxWX/2J7CxEfYlFl6KC3+ctr9tWfMHHd1xU6XXzhGIiil6bfvzS\nueqz12e11JQ/JTsLUV9j0aWo4XU7t9VtK/rdJ3deUu3r9MiOQ0TU7yqLlnu/f/3hRS3m8r/LzkLU\nH1h0Kap4OuwltVvXXzLjrkvN3CqYiKKJuaTAN++JW5dZq0uvENw+kqIEiy5FHbfdVlized3vZ959\neU3Ax2V2iSjyVRWv8M555MafLVWmc7nrGUUTFl2KSq721jU1m9ddM/u+q2sDfr7nE1Hkqli7xDP3\n8b/8YK0uPV8Iwb/uKaqw6FLU6mhtXmretOb6z/51XV0wEJAdh4io15Wu/sE9/6m/LbJWl13IkVyK\nRiy6FNXsloYfq4tX3jznkRsbgsGg7DhERL1m29IFrm+evWOB1Vx2iRCCf81TVGLRpajX3ly3oLJw\n2d/m/fvmRp6fQUSRYPNP85zfvnjPfKu57AohBP+Kp6jFoksEwNZY82X52iV3znn4hgbO2SWicLZh\n0acdi159cE6LufwPLLkU7RSOYBH9T1JGzvSBY45976pnZ+dqY+NkxyEiOiyF33xsX/LeM5+2mMtv\n5hJiRCy6RHuJ0SdOHGCcOPfaF+cOj0tKlR2HiOiQrP3yvfZlHz3/SUtN+W0suURdWHSJ9kFRlCED\nxkxaeO0Lc8ckZ
w2SHYeI6IBWffqGbdXs196zmsvulp2FKJSw6BLth6Io6Vl5Yxdf9eysiZnDx3A+\nOxGFHCEEfn7nSWvxd7PesZrLHpSdhyjUsOgSHYCiKPqMocbvLn74rROHTpyqk52HiGingM+HLx77\nU0N18cqHWuur3pOdhygUsegSHYSiKNr0wSPn/Pr2p84eO+1Cvew8RERuhw0z7rq8qrGs5FqnrWW5\n7DxEoYpFl+gQKIqipA0e8dYpV9/2+xMvuzlZdh4iil6ttTsw854rttZv33C+EKJSdh6iUKaRHYAo\nHHSfwfyX9MEjGuyWhr9Ov+XRdEVRZMcioihTvWGVd+6/b15nqdx+vhDCJjsPUajjiC7RYTIMGnZT\n3gnT/n3Rg29kq9Rq2XGIKEoUfzvL8fO7Ty62msuuEkL4ZOchCgcsukRHIClzwK+zhue/ddWzs4Zw\nrV0i6ktCCPzwxqOWku8/f6+lpuIBrpFLdOhYdImOkKIow3JGTfjq909+ODYrbyyXHyOiXuf3eTHn\n4RvqzSVr7murr/5Edh6icMOiS3QUupcfmzv91sdOHX/2xfGy8xBR5HC1t2LG3ZdXNpZtusrV3rpG\ndh6icMSiS3SUuldkeHbcmb/74zl/eyJDpeLgLhEdHUuVScy+/w9bGkwbzxNCmGXnIQpXLLpEvSQ1\nZ/AlWSPGvXjFkx/nxiZyBTIiOjLrv/rQvvTD/6y0Vpf+XgjhkJ2HKJyx6BL1Io1WNyJzeP6Xlz3+\nXn7OqGO4JAMRHTKv24V5T9xSZy5Z81JrXeXzPOmM6Oix6BL1MkVR4tOHjJp12rV3TjvhohuSZOch\notDXvGNb8POHrjdZqkuv6nQ6NsjOQxQpWHSJ+oCiKIph0PA7h0w8+c6LHnxjgDYmVnYkIgpRa798\nr33FjBeXWapKrxJCdMjOQxRJWHSJ+lBcYsoJGcOMM6546pNRhoHDZMchohDidTsx97G/1NVsWfuf\ntrqqlzlVgaj3segS9TFFUVIzho3+7JSrbptywsU3JnHrYCJqqtgS/PyhG7ZbKrf/3utxbZadhyhS\nsegS9QNFUZTUnCF/yswb88Clj/53SIIhU3YkIpJACIG1c99tXzHzlV+s1aVXCyGcsjMRRTIWXaJ+\npCjKoKy8/M/O+svDE7nBBFF06XR1YO5jf66t3Vr4f621O16VnYcoGrDoEvUzRVFUhkHD7x+Uf9zN\nFz34xiCuuUsU+aqKV/q+/r87TNaq0iu8HtcW2XmIogWLbhgyGo0fArgWwESTyVRymMeqAdwC4H2T\nyeTqg3i9zmg0ngAgxWQy/dD99VAAOwB8ZTKZLpKZ7WiotbpRmcNGf3reXc/lj5h8ZozsPETU+zpd\nHVjwn7saKouWf9FiLr9bCOGVnYkomnCv0vA0D8CjAJqO4NhZAF4BoOnNQH3FaDSeB2ANgDE9rm4D\n8BiA2VJC9ZKAz1vaWLZp8vwn//rGvCdubfR53LIjEVEvMq1c3PnW9adv2LDw019Zq8tuY8kl6n9h\nUXZodyaT6SsAXx3h4VkAwmkYPwPAbssUmEymdnQV3bAnhPADuDMuMWV27dbCDy568PXRg8Yezx3V\niMKY296G+U//vbZ2y/qPWmt3PCKECMjORBStWHSjV7itcRVueQ+L22FbpyjK8Z89eN2bY8+88Lzp\ntzyWodZqZcciosNU8sMXrp/feXJTU8WWPwghymXnIYp2nKMbhnrO0QVgR9d81ccAFAP4F4BxABzo\nGvW932QytXQfF9zjoZaaTKZp3bfpANwF4BoAw7of9wcAD5lMpsoez/0ogIcBnA3g6e4MlQCONZlM\nLqPROLX7cU4CYADgBLAewFMmk+mXPV5HOoAHAfwOXSPNNQDmAHjaZDI5e7zOnoaia8rNXnN0jUZj\nDoBHAJwHIBNdUzu+BfCYyWRq3MdrGAPgOgB/6L5/OYBXTSbT25AowZA5PXXAkFfOu/O5vKGTprLt\nEoUBh7UR8568tbqxfMvrbXWVzwsh9ny/JSIJOEc3clwA4EsAdQBe7r68CbtPcXgMQHX3fz8D4AMA\nMBqNWgALATwJoB3AqwAWAbgEwDqj0Th2H883E10l9mUAS7pL7oUAlgKYDGAugBcArAJwFoDvjUbj\nhJ0HG43GbHQV4NsBVAB4DV1F9wEA87tPmpvXI/8idM1Lbu+RQfR4vDx0Ff0/A9iKrnnI2wH8BUCh\n0Wjc17ZkM7q/RwsAvAtgIIA3jUbjTfu4b7/paG3+oXbL+olfPPqnV2bec2WNw9p48IOISAohBNbN\ne9/+35t/vXTb0gWntNbueI4llyh0cOpC5DgWwGUmk2kuABiNxofQVfxONhqNRlOXx4xG4zQAgwE8\nYzKZ7N3H3gFgGoBnTSbT/Tsf0Gg0voKuovo+gCl7PJ/ZZDKducd1z6LrRLFJJpPJ0uNx/tl92+UA\nNnZf/X/dOe4wmUyv9LjvW+gqqxeYTKb5RqMxFcCFABbtvF/3dXt6B12jsjeZTKb3ezzezQDeQFeR\nPXuPYwwAxvQY8Z4FYCWAGwH8dx/P0W+EEJ0A7lYU5ZXmHVvfm3juVceeds2dBk5nIAodtgYz5j7+\nl0qruexpW4P5v9zClyj0cEQ3clTsLLkAYDKZ/AB+6v5yyEGOvRFdBfXBnleaTKZCdE0lOMFoNI7Z\n45h5Pb8wGo0KgPsAXNOz5HZb2n2Z0X3fGAAXAyjtWXK7PYWukeWGg2Tu+dy56Crqy3qW3O7X8BaA\ndQDONBqNg/c49P2dJbf7vqvRNWJ8sO9XvxFCmJsqtk5fNevVP7z5x1O3VqxbwrO2iSTzedz4/vVH\nLB/8/YJvywt+mtJWX/0uSy5RaOKIbuQo3cd1Oz/m3+8arUajMQHAKACNAB42Go173iW7+3IigG09\nrq/seSeTySQAzO9+zCHomiecByAfXSUUAHauJpAHIB7A6j2fzGQymQE8tL+8+zGx+3LZfm5fBeAE\nABMAmHtcv6/vmR1AwmE+f59ztDQtVBTlpy//fcsj2SPGXf3be18akpw1SHYsoqgSDAZRtOCTjpUz\nXzG11Vfd5umwr5KdiYgOjEU3cnTu47qdIwwHWrFg57Zc2eg6QWtfBLo+5u9pr0VfjUbjeHTNjT29\n+yofgC3oGlEd2SPHzqkHdvSOpO7L9v3cXt99ueeWu/v7noXkCg/da3A+qCjK61Zz2bvjp1865Ywb\n7k3TaHWyoxFFvMrC5b5Frzywo7257qn2ptpPOIJLFB5YdKmj+3KZyWQ640gfxGg0JqJrlYZEdK26\n8AOA7SaTyW80GicDuGofz5m4n8fSm0wm52E8vaP7cuB+bt9ZrFv2c3tYEULUAzhPn5px5ralC146\n56+PjzJO/RV3ViPqA1ZzOb59/u6q5srts1trdzzWPX+eiMIEi2702W0UwmQytRuNRjOAcUajMdZk\nMnl63m40Gq9E12jsRyaTqRr7dya6TgZ7zmQyvbjHbfndlztHSk0AvOhanW
E3RqNxIIAao9H4jslk\nunnPvPuxofvylP3cfhqAILpWY4gYzjbLz4qiHPvVM7fdnzls9PXn/uPZYZnDRsuORRQR3PY2LHr1\nX/WVhcuWWapMfxdCWGVnIqLDx5PRoo8PXYWz5wjgh+iamvBM90llAACj0ZgP4C0Ad+Lgo6E7pzJk\n97yy+wSwR7q/1AJAd5meCyB/H0t5PdB9+WOPvMAB5hl3z+tdAuD47lUWej7/TQBORtcSaPX7Oj6c\nCSH8rbU7/m1asfCET+689P2Z/7yi2mrmGvVER8rv82LpB8+1vn3jmcvWf/XhWc2V269kySUKXxzR\njT613ZfvG43G700m06voWlP3VwBuA3Cq0WhcCiAFwGUA4gBcbTKZOvb5aP+zAkAVgGu6N4IoAZAL\n4LfoWrJrEID0Hve/G10jsO8YjcaL0TWXdzKAUwHMM5lMX+yR9xaj0ZgG4KX9PP9fACwH8Eb3420C\nMB5dS4rVoWvJsoglhGgBcKOiKJlNFVueyzFOmPbrvz+RmzpgqOxoRGFBCIEtP81z/fze0xXtTXX3\nONssi2RnIqKjxxHd8CRwaB/p7+t+TwIoADAdwK3ArhHWaegaeY0FcAuA36CrOE4zmUyfHey5TSaT\nq/sxvwRwHLo2gjCia77uuegqsqcYjcb47vs3oKvYvg3gmO775wL4N4ArejzuMgCvo2vE+VZ07Wa2\nF5PJVA7geHStl5sP4K/oWt3hZXSt69tzlYgDff/C+gQTIURzc+X260oWf37iB3//7aefP3R9ra2x\nRnYsopAlhIBp5aLOt2+cVvrti/fc32DaOIkllyhycAtgogimKEpu5rAxL+aOn3ziOX99fGBSRo7s\nSEQhIRgMYvOPc10rZrxc7bA2vGNrrHmTJ5oRRR4WXaIooCjK0Mzh+S8NnTT1hOm3PDIgwZApOxKR\nFAGfD0ULPnGs+eLt6g5r04t2S/3HQgi/7FxE1DdYdImiiKIoI7Ly8l8efvwZx571l4ey9SlpsiMR\n9Qufx42CL96xFX87c4fd2vBUR0vTl1wLlyjysegSRSFFUUZnjRj7wsDRk8afccO9gzKG7rUjHlFE\n6HQ6sHzGSy1bfp5fZmusedjjsP3IgksUPVh0iaKYoigD0gaPeCglO/esU66+ffCoqb+OUal4jiqF\nP6etBb+8/2xz+Zoft7XU7rjP63aukZ2JiPofiy4RQVGUmKTMgTckpmX9+ZhzLhsy+ZI/pcYmJB38\nQKIQYzWXY9lHz9eZN60psVSa/hnw+7bIzkRE8rDoEtEuiqIo2pi4qWm5eY8MzD9uzBk33DMwffBI\n2bGIDijg92PbsgWe1Z+9UWNvrvvFWl32hBDCLDsXEcnHoktE+6QoyqD0ISMfTsnKnXbKNXcMHnnS\nOTpOa6BQYrc0YNXs1yxla36oslsa3+hoaZwlhPDKzkVEoYNFl4gOSFGUuOSsQTclGDJvmvCbKwYf\nf+EfU+ISU2THoigVDARQtvoH3+rP36xpq6sqaq7c9qgQgtMTiGifWHSJ6JAoiqLo4vSnpw4YemdK\ndm7+cb+9buDo086L1cbEyo5GUaC1rhKrZr/WWFm0wtzR0vS+3VL/iRDCJTsXEYU2Fl0iOmyKosTq\nU9MvSkzP+XPGsNHDplx806Bhx5+u5tQG6k0+jxubfpzrLPz6o1q7pWGFtbr0GSFEuexcRBQ+WHSJ\n6KgoipKSnJ17rT4l/apB+cflTrn0TwNyjBOhKIrsaBSGOp0ObF36jbtk8ef1tsaaCrul/i2XreUb\n7l5GREeCRZeIeo2iKAMNg/Ju1aeknZd3whkDTrj4xgzDwGGyY1GIc9vbsOXn+c6SH+bWOyz1pbbG\nmnc8He2LhRCdsrMRUXhj0SWiPqEoypiMocZ/6FPSp+ZP++2AsWf+LoWll3bqaLNg0w9fOLYu+are\nYW3a1lpX+YbP41rCkVsi6k0sukTUp5SuOQwnZAw1/jE2MXlK5rDRmeOnX5Y9/LjTNNrYONnxqB/Z\nLfXYuPhz27alCxqdbdaNLTXlbwR83pVCiIDsbEQUmVh0iahfKYqSlpCWdWGCIfPyBEPG0BFTzs4c\ne+aFqWm5IzivN8L4fV7UbForti//ttm8qcDqamspslSXviaCgXWCv3yIqB+w6BKRNIqiqABMTB8y\n6prYhKSp6YNHZh9zzmXZeZOnaXVxetnx6DAFfD7Ubi3E9uXfNtdsWtvqtFlrXO2t39qb674BUMly\nS0T9jUWXiEKGoigp8cmG3ySm51ytN2TkDTv21LS8E6ZlDBxzLHRx8bLj0R4Cfj/qTRt9F+QIAAAD\niUlEQVRgWv6dpWrDqlZnm6XO7bAtbG+s+RpAGYstEcnGoktEIal7bu/oxIycs+KTDNNjE1MGJWUO\nSMs7/oy0IRNPSsgcNgYqtVp2zKgS8PvRWLYJphULWyqLlu8stj/YGszzAWxjsSWiUMOiS0RhQ1GU\neACT0nLzfqON05+sT07LTBs8ImXkiWdl5I6fokvOHCg7YsTwONrRULYJNZsK2syb19qdrZY2t8Nm\n9bqdK9rqKr8EsEUIEZSdk4joQFh0iSisKYqSpY3Tn5iSNeh8XXxCvj4lLWNg/nHJuWNPSMsYNlqd\nOmAo1BqN7JghSwiBtvoqNJg2+quKV1qbdmxzuB1tLV5XR43b3vaLw9q4GsBWrmlLROGIRZeIIkr3\nCW4jtLHx45OzBp2s1urGxSYkGWL0SSlpg4bHD8w/NjVj6Og4w6Bh0KekR81KD54OO2wN1bA11gSt\n1WX2um1FDoe1weayt7X4Oz2bbY01P/k8rkIAtZyCQESRgkWXiKJC95zfLABjkrNzJ8bEJ05Ua7S5\nMfrEpJj4hKTEjJy4rLyxCWm5eUn6lHRVfEoa9KnpiE1IDvkyHAwE4LA2wNZYg9a6yk5rdWm7tbrc\n43bYXD6Ps6PT1dER8HmbAz7vNrulYZPP46pE18liDtnZiYj6EosuERG6VnwAMEyt1Q1JSMsarIuN\nH6yo1AMUlZKmjYmL1ehi4jS62Fi1Vheri42PSUjP1iRl5MQkZQzQJxgyY+NT0qDWaKHSaKBSqaGo\n1FBrNFBUaqjUGqjU3depNVDUaqi6r1fUagQDfnQ6HfC6HPB02NHp6kCn0w6Po93rdtg8bnur1223\n+VztrQFPR3sg4PP5gwGf1+/z+nweV4fX5XQEg4HKTqdji725bisAM7pGZj2Sv61ERFKx6BIRHSZF\nUbQADAAyAKTHxCdmx6cYchWVJk6lUmmhQKsoigZQtIqi0gBCi66vNYDQANjzv32AsIlg0Bbw+6w+\nj9vqtrc1+72edgAOAPYel07uJEZEdGhYdImIiIgoIqlkByAiIiIi6gssukREREQUkVh0iYiIiCgi\nsegSERERUURi0SUiIiKiiMSiS0REREQRiUWXiIiIiCISiy4RERERRSQWXSIiIiKKSCy6RERERBSR\nWHSJiIiIKCKx6BIRERFRRGLRJ
SIiIqKIxKJLRERERBGJRZeIiIiIIhKLLhERERFFpP8HP6nrdsRE\njLMAAAAASUVORK5CYII=\n",
946 | "text": [
947 | ""
948 | ]
949 | }
950 | ],
951 | "prompt_number": 48
952 | },
953 | {
954 | "cell_type": "markdown",
955 | "metadata": {},
956 | "source": [
957 | "Notice how the residual error variance is 27%. This is $100 - R^2$. "
958 | ]
959 | },
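  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We can check this directly from the fits above:"
   ]
  },
  {
   "cell_type": "code",
   "collapsed": false,
   "input": [
    "# residual SS as a proportion of the total (null model) SS is 1 - R^2:\n",
    "print(full.ssr / null.ssr)\n",
    "print(1 - full.rsquared)"
   ],
   "language": "python",
   "metadata": {},
   "outputs": []
  },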
960 | {
961 | "cell_type": "markdown",
962 | "metadata": {},
963 | "source": [
964 | "## Aside: Types of Sums of Squares\n",
965 | "\n",
966 | "In unbalanced designs, the \"Types of Sums of Squares\" become relevant. The different Types of Sums of Squares (Type I, II, III...) basically correspond to different ways to compare models up and down the graph diagram shown above. That is, they correspond to different nested model comparisons -- and with unbalanced data, it matters which comparisons you do.\n",
967 | "\n",
968 | "A complete discussion of the different sums of squares is beyond what we want to cover in this course. For some starter reading, see [here](http://goanna.cs.rmit.edu.au/~fscholer/anova.php), [here](http://www.uni-kiel.de/psychologie/dwoll/r/ssTypes.php). \n",
969 | "\n",
970 | "The main thing we want to emphasise is: **ANOVA is about model comparison**. Think about what hypotheses make sense to test for your dataset."
971 | ]
972 | },
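  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "If you want to explore this, here is a minimal sketch: Statsmodels' `anova_lm` accepts a `typ` argument selecting the type of sums of squares. With the balanced design used here, all types give the same answer:"
   ]
  },
  {
   "cell_type": "code",
   "collapsed": false,
   "input": [
    "# Sketch: requesting different types of sums of squares from anova_lm.\n",
    "# With balanced data (as here) the types coincide; with unbalanced data\n",
    "# they can differ substantially.\n",
    "import statsmodels.api as sm\n",
    "\n",
    "print(sm.stats.anova_lm(full, typ=2))\n",
    "print(sm.stats.anova_lm(full, typ=3))"
   ],
   "language": "python",
   "metadata": {},
   "outputs": []
  },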
973 | {
974 | "cell_type": "markdown",
975 | "metadata": {},
976 | "source": [
977 | "# ANOVA as model comparison: summary\n",
978 | "\n",
979 | "We've provided here a brief overview of one way to think about ANOVA: it's comparing the incremental change in the sum of squared residuals as you add terms to the model. In this sense, ANOVA is about **model comparison**, using the change in residuals to decide whether a term is important. \n",
980 | "\n",
981 | "F-ratios are not the only way we could compare the models. ANOVAs are just a historically popular way to do model comparison -- mainly because they are computable by hand. One advantage of thinking about ANOVAs as model comparisons is that once you understand the idea, it's easier to think how to use other types of comparisons.\n",
982 | "\n",
983 | "For example, if we wanted to emphasise parsimonious models (i.e. keeping the number of parameters low), we could consider using an information criterion like the [AIC](https://en.wikipedia.org/wiki/Akaike_information_criterion) or [BIC](https://en.wikipedia.org/wiki/Bayesian_information_criterion) to compare the models. The structure of the comparisons and their hypotheses are the same, however. For a comprehensive treatment of this type of model comparison, see [3].\n",
984 | "\n",
985 | "Another possibility is using *regularisation* -- we fit the model but penalise the fit for having many coefficients far from zero. For example, [*Lasso regression*](http://statweb.stanford.edu/~tibs/lasso.html) (also called L1 norm regression) penalises the sum of the coefficients from growing too large. This has the effect of setting many coefficients to zero -- i.e. removing those terms from the model. This is equivalent to using a model lower down the nesting hierarchy drawn above.\n"
986 | ]
987 | },
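  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of the information-criterion approach, using the nested fits from above (lower values indicate a better trade-off between fit and number of parameters):"
   ]
  },
  {
   "cell_type": "code",
   "collapsed": false,
   "input": [
    "# Sketch: comparing the same nested models with information criteria\n",
    "# (assumes the null, design, paper, additive and full fits from above).\n",
    "for name, fit in [('null', null), ('design', design), ('paper', paper),\n",
    "                  ('additive', additive), ('full', full)]:\n",
    "    print('{0}: AIC = {1:.2f}, BIC = {2:.2f}'.format(name, fit.aic, fit.bic))"
   ],
   "language": "python",
   "metadata": {},
   "outputs": []
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "And a rough sketch of the regularisation idea. This uses scikit-learn's `Lasso`, which is an assumption here -- it is not otherwise part of this course. patsy builds the same design matrix that the formula interface uses; note that this quick version penalises the intercept along with the other coefficients:"
   ]
  },
  {
   "cell_type": "code",
   "collapsed": false,
   "input": [
    "# Sketch only: an L1-penalised (Lasso) fit of the full design matrix.\n",
    "# scikit-learn is an assumption here (not otherwise used in the course).\n",
    "import numpy as np\n",
    "from patsy import dmatrices\n",
    "from sklearn.linear_model import Lasso\n",
    "\n",
    "y, X = dmatrices('distance ~ C(design, Sum) * C(paper, Sum)', planes)\n",
    "lasso = Lasso(alpha=0.1, fit_intercept=False)  # intercept is already a column of X\n",
    "lasso.fit(np.asarray(X), np.ravel(y))\n",
    "print(lasso.coef_)  # coefficients are shrunk; some may be exactly zero"
   ],
   "language": "python",
   "metadata": {},
   "outputs": []
  },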
988 | {
989 | "cell_type": "markdown",
990 | "metadata": {},
991 | "source": [
992 | "# ANOVA in GLMs\n",
993 | "\n",
994 | "We've demonstrated ANOVA for the linear model. The concepts inherent in ANOVA can be extended to Generalised Linear Models with non-Gaussian errors and non-identity link functions. To do this, we need to consider a generalisation of the squared residuals, called the *deviance*. \n",
995 | "\n",
996 | "The *deviance* is the difference in log likelihood (i.e. the likelihood of the data under the model) between a *saturated* model (that contains a parameter for every data point, and thus predicts the data perfectly) and the model in question. Intuitively, think of *deviance* as the \"amount by which our model predictions *deviate* from a perfect prediction\". In the Gaussian + identity model, the deviance reduces to the sum of squared residuals:"
997 | ]
998 | },
999 | {
1000 | "cell_type": "code",
1001 | "collapsed": false,
1002 | "input": [
1003 | "# anova table from interaction model:\n",
1004 | "print(anova_table)"
1005 | ],
1006 | "language": "python",
1007 | "metadata": {},
1008 | "outputs": [
1009 | {
1010 | "output_type": "stream",
1011 | "stream": "stdout",
1012 | "text": [
1013 | " df sum_sq mean_sq F PR(>F)\n",
1014 | "C(design, Sum) 1 0.385641 0.385641 0.484016 0.499861\n",
1015 | "C(paper, Sum) 1 1.718721 1.718721 2.157158 0.167628\n",
1016 | "C(design, Sum):C(paper, Sum) 1 23.386896 23.386896 29.352777 0.000156\n",
1017 | "Residual 12 9.561029 0.796752 NaN NaN\n"
1018 | ]
1019 | }
1020 | ],
1021 | "prompt_number": 49
1022 | },
1023 | {
1024 | "cell_type": "code",
1025 | "collapsed": false,
1026 | "input": [
1027 | "# summary table from GLM fit:\n",
1028 | "fit_glm = smf.glm('distance ~ C(design, Sum) * C(paper, Sum)', planes).fit()\n",
1029 | "print(fit_glm.deviance)"
1030 | ],
1031 | "language": "python",
1032 | "metadata": {},
1033 | "outputs": [
1034 | {
1035 | "output_type": "stream",
1036 | "stream": "stdout",
1037 | "text": [
1038 | "9.561029\n"
1039 | ]
1040 | }
1041 | ],
1042 | "prompt_number": 50
1043 | },
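{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick check (a sketch assuming the `fit_glm` object from the cell above), we can sum the squared response residuals by hand and confirm that the result matches both the GLM deviance and the `Residual` row of the ANOVA table:"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"# sum of squared residuals computed directly; should equal the deviance\n",
"print((fit_glm.resid_response ** 2).sum())"
],
"language": "python",
"metadata": {},
"outputs": []
},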
1044 | {
1045 | "cell_type": "markdown",
1046 | "metadata": {},
1047 | "source": [
1048 | "For GLMs with non-Gaussian errors and non-identity link functions, ANOVA becomes \"Analysis of Deviance\", but the conceptual principles remain the same. We perform nested model comparison, comparing how the deviance is reduced by adding terms to the model.\n",
1049 | "\n",
1050 | "Unfortunately Statsmodels has not implemented analysis of deviance as a function yet (as of version `0.6.1`), so it's not as straightforward to do as in the linear model cases outlined above. However, analysis of deviance can be easily performed in [R](https://stat.ethz.ch/R-manual/R-patched/library/stats/html/anova.glm.html) or [MATLAB](http://de.mathworks.com/help/stats/generalizedlinearmodel.deviancetest.html?refresh=true), or by hand in Statsmodels using model comparisons as before. Remember to take degrees of freedom into account."
1051 | ]
1052 | },
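{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here is a minimal sketch of how such a \"by hand\" analysis of deviance might look. Note that `df`, `y`, `a` and `b` are hypothetical names (not defined in this notebook), and we assume a family with fixed dispersion (such as Poisson), for which the change in deviance between nested models is asymptotically chi-squared distributed:"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"import scipy.stats as stats\n",
"import statsmodels.api as sm\n",
"import statsmodels.formula.api as smf\n",
"\n",
"# fit two nested GLMs (hypothetical DataFrame `df` with columns y, a, b):\n",
"fit_reduced = smf.glm('y ~ a', df, family=sm.families.Poisson()).fit()\n",
"fit_full = smf.glm('y ~ a + b', df, family=sm.families.Poisson()).fit()\n",
"\n",
"# improvement in fit from adding b, and its degrees of freedom:\n",
"dev_change = fit_reduced.deviance - fit_full.deviance\n",
"df_change = fit_reduced.df_resid - fit_full.df_resid\n",
"\n",
"# under the null hypothesis that b contributes nothing, dev_change is\n",
"# asymptotically chi-squared with df_change degrees of freedom:\n",
"p_value = stats.chi2.sf(dev_change, df_change)\n",
"print(dev_change, df_change, p_value)"
],
"language": "python",
"metadata": {},
"outputs": []
},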
1053 | {
1054 | "cell_type": "markdown",
1055 | "metadata": {},
1056 | "source": [
1057 | "# References\n",
1058 | "\n",
1059 | "[1] Venables, W. N. (1998). Exegeses on linear models. In S-Plus User\u2019s Conference, Washington DC. Retrieved from [here](http://www.sefsc.noaa.gov/sedar/download/S10RD01.pdf?id=DOCUMENT)\n",
1060 | "\n",
1061 | "[2] Howell, D. C. (2002). Statistical Methods for Psychology (5th Edition).\n",
1062 | "\n",
1063 | "[3] Burnham, K. P., & Anderson, D. R. (2002). Model selection and multimodel inference: a practical information-theoretic approach. New York: Springer."
1064 | ]
1065 | },
1066 | {
1067 | "cell_type": "code",
1068 | "collapsed": false,
1069 | "input": [
1070 | "from IPython.core.display import HTML\n",
1071 | "\n",
1072 | "\n",
1073 | "def css_styling():\n",
1074 | " styles = open(\"../custom_style.css\", \"r\").read()\n",
1075 | " return HTML(styles)\n",
1076 | "css_styling()"
1077 | ],
1078 | "language": "python",
1079 | "metadata": {},
1080 | "outputs": [
1081 | {
1082 | "html": [
1083 | "\n",
1151 | ""
1166 | ],
1167 | "metadata": {},
1168 | "output_type": "pyout",
1169 | "prompt_number": 1,
1170 | "text": [
1171 | ""
1172 | ]
1173 | }
1174 | ],
1175 | "prompt_number": 1
1176 | }
1177 | ],
1178 | "metadata": {}
1179 | }
1180 | ]
1181 | }
--------------------------------------------------------------------------------
/lectures/blah.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/blah.pdf
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/hindenburg.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/hindenburg.jpg
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/hot_air_balloon.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/hot_air_balloon.jpg
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/interactions_article.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/interactions_article.png
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/paperplane.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/paperplane.jpg
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/pics_for_lectures.001.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/pics_for_lectures.001.png
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/pics_for_lectures.002.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/pics_for_lectures.002.png
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/resid-plots.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/resid-plots.gif
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/stimulus.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/stimulus.png
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/tidy_data.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/tidy_data.png
--------------------------------------------------------------------------------
/lectures/pics_for_lectures/topgun.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/tomwallis/glm-intro/b867c727852822d4b0d88869d83beedfbe1ed08e/lectures/pics_for_lectures/topgun.jpg
--------------------------------------------------------------------------------
/readme.md:
--------------------------------------------------------------------------------
1 | # The Generalised Linear Model
2 |
3 | ## A practical introduction using the Python environment
4 |
5 | [Tom Wallis](http://www.tomwallis.info) and [Philipp Berens](http://philippberens.wordpress.com/)
6 |
7 |
8 | This course provides an overview of an extremely flexible
9 | statistical framework for describing and performing inference with a wide
10 | variety of data types: the Generalised Linear Model (GLM). Many common statistical procedures are special cases of the GLM. In the course, we focus on the construction and understanding of design matrices (using S-style formula notation) and the interpretation of regression weights. We mostly concentrate on the linear Gaussian model, before discussing more general cases. We also touch on how this framework relates to ANOVA-style model comparison.
11 |
12 | The course was designed and presented as a six week elective statistics course for graduate students in the neuroscience program at the University of Tübingen, in January 2015. Lectures were presented as a collection of IPython Notebooks. While the notebooks are (we hope) well documented, they *are* lecture materials rather than a textbook. As such, some content might not be self-explanatory.
13 |
14 | [You can find the notebooks on the nbviewer platform here](http://nbviewer.ipython.org/github/tomwallis/glm-intro/blob/master/index.ipynb)
15 |
16 | We chose to do the course in Python because
17 |
18 | 1. It is a general-purpose programming language and thus more versatile than e.g. [R](http://cran.r-project.org/). Neuroscientists can use Python not only to analyse data but also to, for example, interface with hardware, [conduct behavioural experiments](http://www.psychopy.org/), etc.
19 | 1. Its [popularity as a scientific computing environment is rapidly growing](http://www.nature.com/news/programming-pick-up-python-1.16833).
20 | 1. The scientific computing environment of Python has many similarities to MATLAB ™, which is the historically dominant environment in our field.
21 | 1. It is free and open source, and thus we feel will continue to benefit students who move out of a university environment.
22 |
23 | Nevertheless, the main statistical module we use here (`Statsmodels`) is well behind R in its maturity (no wonder, since R is a *lot* older). Thankfully, learning to create and interpret design matrices using `Patsy` formula notation is a skill that transfers easily to R's `glm` routines.
24 |
25 | Note two things:
26 |
27 | 1. *This is not a programming course*. If you do not have enough experience with programming (or Python) to follow the materials here, seek out an introduction to programming in Python. There are many available for free on the internet.
28 | 1. *This is not a basic statistics course*. You should be reasonably familiar with things like t-tests and ANOVA before proceeding.
29 |
30 | Where content is erroneous, unclear or buggy, please tell us at our [GitHub repository](https://github.com/tomwallis/glm-intro)
31 |
32 | ### Setting up the environment we used
33 |
34 | I've exported an `environment.yml` file into this repo. According to [the conda docs](http://conda.pydata.org/docs/using/envs.html), you should be able to recreate our environment for the course using this file.
35 |
36 | > Make a new directory, change to the directory, and copy the environment.yml file into it.
37 |
38 | mkdir stats_course
39 | cd stats_course
40 | cp path/to/environment.yml .   # replace path/to with wherever you saved the file
41 |
42 | > In the same directory as the environment.yml file, create the new environment:
43 |
44 | conda env create
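
To start using the new environment, activate it. The environment's name is set inside the `environment.yml` file; substitute it for the `<env-name>` placeholder below:

source activate <env-name>

(On Windows, use `activate <env-name>` instead.)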
45 |
46 | Too easy!
47 |
--------------------------------------------------------------------------------
/setup_python.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:3851c943c6508d4ce8237f71f8cabee83b3a407b48e0c40c4f42d7e5cc17db2e"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "# Setting up Python\n",
16 | "\n",
17 | "## Short version for those familiar with Python\n",
18 | "\n",
19 | "The course materials use Python 3.4 along with the following packages and versions:\n",
20 | "\n",
21 | " pandas 0.15.2\n",
22 | " patsy 0.3.0\n",
23 | " scipy 0.15.1\n",
24 | " seaborn 0.5.1\n",
25 | " statsmodels 0.6.1\n",
26 | "\n",
27 | "\n",
28 | "## Longer version\n",
29 | "\n",
30 | "We used Python 3.4 along with a number of additional packages in the course. This document specifies the steps needed to set up the environment on your computer. It assumes you have already installed the Anaconda Python distribution (Python 3.4) from [here](http://store.continuum.io/cshop/anaconda/).\n",
31 | "\n",
32 | "The Anaconda distribution is a set of software libraries, including the core Python language, something called the `conda` utility, and a number of additional libraries.\n",
33 | "\n",
34 | "The conda utility can manage the *libraries* (also called *packages* or *modules*, which are extensions to the Python language that allow for things like statistics) we want to use.\n",
35 | "\n",
36 | "### Checking that conda is working\n",
37 | "\n",
38 | "Linux / Mac OSX: Open a terminal (command prompt). \n",
39 | "\n",
40 | "Windows: type \"anaconda\" into the search field of the start menu and start the \"anaconda command prompt\"\n",
41 | "\n",
42 | "At the command line, type `conda list`, which should return a list of all the python packages in your default python environment. It might look something like this (longer list if you have used the default anaconda distribution):\n",
43 | "\n",
44 | "```\n",
45 | "conda 3.7.4 py34_0 \n",
46 | "openssl 1.0.1j 4 \n",
47 | "pip 1.5.6 py34_0 \n",
48 | "pycosat 0.6.1 py34_0 \n",
49 | "python 3.4.2 0 \n",
50 | "pyyaml 3.11 py34_0 \n",
51 | "readline 6.2 2 \n",
52 | "requests 2.5.1 py34_0 \n",
53 | "setuptools 3.6 py34_0 \n",
54 | "sqlite 3.8.4.1 0 \n",
55 | "tk 8.5.15 0 \n",
56 | "xz 5.0.5 0 \n",
57 | "yaml 0.1.4 1 \n",
58 | "zlib 1.2.7 1 \n",
59 | "```\n",
60 | "\n",
61 | "The left column is the package name and the middle column is the package version. \n",
62 | "\n",
63 | "If this doesn't work for you (e.g. if the prompt reports that it doesn't know what `conda` is) then the anaconda distribution has not been correctly configured on your computer.\n",
64 | "\n",
65 | "You can find more information about the conda package manager [here](http://conda.pydata.org/index.html).\n",
66 | "\n",
67 | "\n",
68 | "### Upgrading packages\n",
69 | "\n",
70 | "Some of the packages installed by Anaconda aren't the most recent version available. We want to use the most recent versions where possible, so we're going to upgrade those packages manually using the `conda` tool (or `pip` if necessary, which is another tool that does essentially the same).\n",
71 | "\n",
72 | "For example, if you type `conda list` and look down the resulting list of packages, you'll see that the package `pandas` is version 0.14.1. We want to use the more recent 0.15.\n",
73 | "\n",
74 | "Now, type\n",
75 | "\n",
76 | " conda install pandas\n",
77 | "\n",
78 | "this will report that we're upgrading `pandas` to version 0.15.2, and that to do this, a bunch of other packages will need to also be upgraded. One nice thing about using `conda` is that it will look after all these dependencies for us. \n",
79 | "\n",
80 | "Press `y` to proceed. Once these packages have downloaded and installed, we'll do the next one. I'm just going to write the install commands below. The procedure is the same as above.\n",
81 | "\n",
82 | " conda install statsmodels\n",
83 | " pip install ggplot\n",
84 | " conda install seaborn\n",
85 | " conda install ipython-notebook\n",
86 | "\n",
87 | "That should be it! If all the steps above proceeded successfully, everyone in the course should now be using the same package versions. That means that when we find bugs (we probably will), at least they're the same for everyone.\n",
88 | "\n",
89 | "To check, type `conda list` again. Look down the list. You should be using the following versions: \n",
90 | "\n",
91 | " pandas 0.15.2\n",
92 | " patsy 0.3.0\n",
93 | " scipy 0.15.1\n",
94 | " seaborn 0.5.1\n",
95 | " statsmodels 0.6.1\n"
96 | ]
97 | },
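{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can also check the versions from within Python itself. A small sketch (assuming the packages above installed successfully):"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"# print the installed version of each package used in the course:\n",
"import pandas, patsy, scipy, seaborn, statsmodels\n",
"\n",
"for module in (pandas, patsy, scipy, seaborn, statsmodels):\n",
"    print(module.__name__, module.__version__)"
],
"language": "python",
"metadata": {},
"outputs": []
},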
98 | {
99 | "cell_type": "markdown",
100 | "metadata": {},
101 | "source": [
102 | "## The IPython notebook\n",
103 | "\n",
104 | "There are a number of different ways to use Python for data analysis. In this course we will encourage you to use the IPython Notebook. \n",
105 | "\n",
106 | "### Starting the notebook server\n",
107 | "\n",
108 | "Linux / Mac OSX: open the Anaconda Launcher, and click \"IPython Notebook\".\n",
109 | "\n",
110 | "Windows: type ipython notebook into the start menu search field, and click \"IPython Notebook\".\n",
111 | "\n",
112 | "The IPython Notebook is a file format (extension .ipynb) that allows you to integrate text, code and figures into a single document. You view the document using a web browser, and it can be exported into formats like html and pdf, if you want to give people a static (i.e. not interactive) document.\n",
113 | "\n",
114 | "Think of the Notebook like a lab notebook: you can write down some ideas about what you want to do and how you want to do it. The advantage of using this electronic format, though, is that you can now include the code that run your analysis and the outputs that result. It becomes a living, interactive document. \n",
115 | "\n",
116 | "Take some time to familiarise yourself with the notebook. You can find many tutorials online. For example, try [this](http://nbviewer.ipython.org/github/ipython/ipython/blob/master/examples/Notebook/Running%20Code.ipynb), [this](https://opentechschool.github.io/python-data-intro/core/notebook.html) or [this](https://www.youtube.com/watch?v=DUCQ_HZamhs). Note that you can skip all the \"installing python\" steps, since we've done that already. You can also see [this page](https://github.com/ipython/ipython/wiki/A-gallery-of-interesting-IPython-Notebooks#introductory-tutorials) for a list of interesting ipython notebooks available on the web.\n",
117 | "\n",
118 | "We will be providing you with some example notebook files that you will be able to run. During the course you will work on your own notebooks for data analysis. Feel free to copy code from our notebooks into yours."
119 | ]
120 | },
121 | {
122 | "cell_type": "markdown",
123 | "metadata": {},
124 | "source": [
125 | "## Packages and documentation\n",
126 | "\n",
127 | "During the course we will use a number of packages to analyse data in Python. These are all open source packages, maintained and developed by teams of volunteers around the world. You can find all the documentation for any package we're using by doing a web search for the package names. \n",
128 | "\n",
129 | "The main packages we will use are [Pandas](http://pandas.pydata.org/pandas-docs/stable/index.html), [statsmodels](http://statsmodels.sourceforge.net/stable/index.html), and for plotting, [Seaborn](http://stanford.edu/~mwaskom/software/seaborn/). If you are having trouble with commands from the packages we're using, refer there for help first."
130 | ]
131 | },
132 | {
133 | "cell_type": "code",
134 | "collapsed": false,
135 | "input": [
136 | "from IPython.core.display import HTML\n",
137 | "\n",
138 | "\n",
139 | "def css_styling():\n",
140 | " styles = open(\"custom_style.css\", \"r\").read()\n",
141 | " return HTML(styles)\n",
142 | "css_styling()"
143 | ],
144 | "language": "python",
145 | "metadata": {},
146 | "outputs": [
147 | {
148 | "html": [
149 | "\n",
217 | ""
232 | ],
233 | "metadata": {},
234 | "output_type": "pyout",
235 | "prompt_number": 1,
236 | "text": [
237 | ""
238 | ]
239 | }
240 | ],
241 | "prompt_number": 1
242 | }
243 | ],
244 | "metadata": {}
245 | }
246 | ]
247 | }
--------------------------------------------------------------------------------