├── .gitignore
├── README.md
├── code
│   ├── __init__.py
│   ├── models.py
│   └── utils.py
└── notebooks
    ├── 00_preface.ipynb
    ├── 01_golem_of_prague.ipynb
    ├── 02_small_large_worlds.ipynb
    ├── 03_sampling_the_imaginary.ipynb
    ├── 04_geocentric_models.ipynb
    ├── 05_many_vars_and_spurious_waffles.ipynb
    ├── 06_haunted_dag_and_causal_terror.ipynb
    ├── 07_ulysses_compass.ipynb
    ├── 08_conditional_manatees.ipynb
    ├── 09_mcmc.ipynb
    ├── 10_entropy_and_glm.ipynb
    ├── 11_god_spiked_ints.ipynb
    ├── 12_monsters_and_mixtures.ipynb
    ├── 13_models_with_memory.ipynb
    ├── 14_adventures_in_covariance.ipynb
    ├── 15_missing_data.ipynb
    ├── 16_genaralized_linear_madness.ipynb
    └── README.md
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | data
2 | .ipynb_checkpoints
3 | __pycache__
4 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Statistical Rethinking
2 | 
3 | I'm working through the book _Statistical Rethinking_ (2nd edition) by Richard McElreath in an attempt to learn Bayesian modeling starting from zero. I'm a `python` kind of guy, so I'm going to try to redo all the code examples using one of the various PPLs (probabilistic programming languages) that exist in the `python` universe. I have been getting more into `pytorch` lately as a framework for autodifferentiation and neural networks, and there is a nice-looking package called `pyro` for Bayesian inference built on top of it, so I will try to use that.
4 | 
5 | I think this is a much better idea than learning R just so I can copy McElreath's code, because I learn much more by implementing things from scratch than by relying on his custom-built `quap`, `precis`, and other functions as black boxes that simply give you the answer and hide away a lot of the implementation details.
6 | 
7 | Du Phan, one of the maintainers of `pyro`, is [doing something similar](https://fehiepsi.github.io/rethinking-pyro/), so their repo can serve as a comparison.
8 | 
9 | I will also use a mixture of `numpy`, `sklearn`, `pandas`, `matplotlib`, etc. for various other things as the need arises rather than going straight to `torch`/`pyro` (especially for simpler problems).
10 | 
11 | The data used can be found in the [official repository](https://github.com/rmcelreath/rethinking/tree/master/data) for the book. I noticed some files were missing (like `cars.csv`), but they can be found [here](https://github.com/fehiepsi/rethinking-numpyro/tree/master/data).
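12 | 
13 | To give a flavor of the workflow I'm aiming for, here is a minimal hand-rolled `pyro` regression sketch (my own illustration, not code from this repo; the model and parameter names `alpha`, `beta`, `sigma` are made up for the example):
14 | 
15 | ```python
16 | import torch
17 | import pyro
18 | import pyro.distributions as dist
19 | from pyro.infer import SVI, Trace_ELBO
20 | from pyro.infer.autoguide import AutoMultivariateNormal
21 | from pyro.optim import Adam
22 | 
23 | def model(x, y=None):
24 |     alpha = pyro.sample("alpha", dist.Normal(0.0, 1.0))  # intercept prior
25 |     beta = pyro.sample("beta", dist.Normal(0.0, 1.0))    # slope prior
26 |     sigma = pyro.sample("sigma", dist.Exponential(1.0))  # noise scale prior
27 |     with pyro.plate("data", x.shape[0]):
28 |         pyro.sample("y", dist.Normal(alpha + beta * x, sigma), obs=y)
29 | 
30 | x = torch.linspace(0.0, 1.0, 50)
31 | y = 1.0 + 2.0 * x + 0.1 * torch.randn(50)  # fake data for illustration
32 | 
33 | pyro.clear_param_store()
34 | guide = AutoMultivariateNormal(model)
35 | svi = SVI(model, guide, Adam({"lr": 1e-2}), loss=Trace_ELBO())
36 | losses = [svi.step(x, y) for _ in range(1000)]  # ELBO loss per step
37 | ```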
--------------------------------------------------------------------------------
/code/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ecotner/statistical-rethinking/ffb5b62f06cc6a856fc45655ebd15d32a3dbbf4e/code/__init__.py
--------------------------------------------------------------------------------
/code/models.py:
--------------------------------------------------------------------------------
1 | import tqdm.notebook  # imported as a submodule so tqdm.notebook is usable below
2 | from torch import tensor as tt
3 | import pyro
4 | from pyro.infer import SVI, Trace_ELBO
5 | import pyro.infer.autoguide
6 | from pyro.infer.autoguide import AutoMultivariateNormal, AutoDiagonalNormal, init_to_mean, AutoLaplaceApproximation
7 | from pyro.optim import Adam
8 | 
9 | class RegressionBase:
10 |     def __init__(self, df, categoricals=None):
11 |         if categoricals is None:
12 |             categoricals = []
13 |         for col in set(df.columns) - set(categoricals):
14 |             setattr(self, col, tt(df[col].values).double())
15 |         for col in categoricals:
16 |             setattr(self, col, tt(df[col].values).long())
17 | 
18 |     def __call__(self):
19 |         raise NotImplementedError
20 | 
21 |     def train(self, num_steps, lr=1e-2, restart=True, autoguide=None, use_tqdm=True):
22 |         if restart:
23 |             pyro.clear_param_store()
24 |         if autoguide is None:
25 |             autoguide = AutoMultivariateNormal
26 |         else:
27 |             autoguide = getattr(pyro.infer.autoguide, autoguide)
28 |         self.guide = autoguide(self, init_loc_fn=init_to_mean)
29 |         svi = SVI(self, guide=self.guide, optim=Adam({"lr": lr}), loss=Trace_ELBO())
30 |         loss = []  # ELBO loss recorded at each step
31 |         if use_tqdm:
32 |             iterator = tqdm.notebook.tnrange(num_steps)
33 |         else:
34 |             iterator = range(num_steps)
35 |         for _ in iterator:
36 |             loss.append(svi.step())
37 |         return loss
38 | 
39 | 
40 | 
--------------------------------------------------------------------------------
/code/utils.py:
--------------------------------------------------------------------------------
1 | import pandas as pd
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 | from networkx.algorithms.moral import moral_graph
5 | from networkx.algorithms.dag import ancestors
6 | from networkx.algorithms.shortest_paths import has_path
7 | from pyro.infer import Predictive
8 | from pyro.infer.mcmc import NUTS, MCMC
9 | from pyro import poutine
10 | import torch
11 | from torch import tensor as tt
12 | 
13 | ### Sample summarization and interval calculation
14 | 
15 | def HPDI(samples, prob):
16 |     """Calculates the Highest Posterior Density Interval (HPDI)
17 | 
18 |     Sorts all the samples, then slides a fixed-width window (in index space)
19 |     through them all, calculating the interval width at each step and keeping
20 |     the narrowest one as it moves along. Probably only useful/correct for continuous
21 |     distributions or discrete distributions with a notion of ordering and a large
22 |     number of possible values.
23 |     Arguments:
24 |         samples (np.array): array of samples from a 1-dim posterior distribution
25 |         prob (float): the probability mass of the desired interval
26 |     Returns:
27 |         Tuple[float, float]: the lower/upper bounds of the interval
28 |     """
29 |     samples = sorted(samples)
30 |     N = len(samples)
31 |     W = int(round(N*prob))
32 |     min_interval = float('inf')
33 |     bounds = [0, W]
34 |     for i in range(N-W):
35 |         interval = samples[i+W] - samples[i]
36 |         if interval < min_interval:
37 |             min_interval = interval
38 |             bounds = [i, i+W]
39 |     return samples[bounds[0]], samples[bounds[1]]
40 | 
41 | 
42 | def precis(samples: dict, prob=0.89):
43 |     """Computes some summary statistics of the given samples.
44 | 
45 |     Arguments:
46 |         samples (Dict[str, np.array]): dictionary of samples, where the key
47 |             is the name of the sample site, and the value is the collection
48 |             of sample values
49 |         prob (float): the probability mass of the symmetric credible interval
50 |     Returns:
51 |         pd.DataFrame: summary dataframe
52 |     """
53 |     p1, p2 = (1-prob)/2, 1-(1-prob)/2
54 |     cols = ["mean","stddev",f"{100*p1:.1f}%",f"{100*p2:.1f}%"]
55 |     if isinstance(samples, pd.DataFrame):
56 |         samples = {k: np.array(samples[k]) for k in samples.columns}
57 |     elif not isinstance(samples, dict):
58 |         raise TypeError("samples must be either dict or DataFrame")
59 |     df = pd.DataFrame(columns=cols, index=samples.keys())
60 |     for k, v in samples.items():
61 |         df.loc[k, "mean"] = v.mean()
62 |         df.loc[k, "stddev"] = v.std()
63 |         q1, q2 = np.quantile(v, [p1, p2])
64 |         df.loc[k, f"{100*p1:.1f}%"] = q1
65 |         df.loc[k, f"{100*p2:.1f}%"] = q2
66 |     return df
67 | 
68 | ### Causal inference tools
69 | 
70 | def independent(G, n1, n2, n3=None):
71 |     """Computes whether n1 and n2 are independent given n3 on the DAG G
72 | 
73 |     Can find a decent exposition of the algorithm at http://web.mit.edu/jmn/www/6.034/d-separation.pdf
74 |     """
75 |     if n3 is None:
76 |         n3 = set()
77 |     elif isinstance(n3, (int, str)):
78 |         n3 = set([n3])
79 |     elif not isinstance(n3, set):
80 |         n3 = set(n3)
81 |     # Construct the ancestral graph of n1, n2, and n3
82 |     a = ancestors(G, n1) | ancestors(G, n2) | {n1, n2} | n3
83 |     G = G.subgraph(a)
84 |     # Moralize the graph
85 |     M = moral_graph(G)
86 |     # Remove n3 (if applicable)
87 |     M.remove_nodes_from(n3)
88 |     # n1 and n2 are d-separated (independent) iff no path connects them
89 |     return not has_path(M, n1, n2)
90 | 
91 | def conditional_independencies(G):
92 |     """Finds all conditional independencies in the DAG G
93 | 
94 |     Only works when conditioning on a single node at a time
95 |     """
96 |     tuples = []
97 |     for i1, n1 in enumerate(G.nodes):
98 |         for i2, n2 in enumerate(G.nodes):
99 |             if i1 >= i2:
100 |                 continue
101 |             for n3 in G.nodes:
102 |                 try:
103 |                     if independent(G, n1, n2, n3):
104 |                         tuples.append((n1, n2, n3))
105 |                 except Exception:  # e.g. when n3 coincides with n1 or n2
106 |                     pass
107 |     return tuples
108 | 
109 | def marginal_independencies(G):
110 |     """Finds all marginal independencies in the DAG G
111 |     """
112 |     tuples = []
113 |     for i1, n1 in enumerate(G.nodes):
114 |         for i2, n2 in enumerate(G.nodes):
115 |             if i1 >= i2:
116 |                 continue
117 |             try:
118 |                 if independent(G, n1, n2, {}):
119 |                     tuples.append((n1, n2, {}))
120 |             except Exception:
121 |                 pass
122 |     return tuples
123 | 
124 | def sample_posterior(model, num_samples, sites=None, data=None):
125 |     p = Predictive(
126 |         model,
127 |         guide=model.guide,
128 |         num_samples=num_samples,
129 |         return_sites=sites,
130 |     )
131 |     if data is None:
132 |         p = p()
133 |     else:
134 |         p = p(data)
135 |     return {k: v.detach().numpy() for k, v in p.items()}
136 | 
137 | def sample_prior(model, num_samples, sites=None):
138 |     return {
139 |         k: v.detach().numpy()
140 |         for k, v in Predictive(
141 |             model,
142 |             {},
143 |             return_sites=sites,
144 |             num_samples=num_samples
145 |         )().items()
146 |     }
147 | 
148 | def plot_intervals(samples, p):
149 |     for i, (k, s) in enumerate(samples.items()):
150 |         mean = s.mean()
151 |         hpdi = HPDI(s, p)
152 |         plt.scatter([mean], [i], facecolor="none", edgecolor="black")
153 |         plt.plot(hpdi, [i, i], color="C0")
154 |         plt.axhline(i, color="grey", alpha=0.5, linestyle="--")
155 |     plt.yticks(range(len(samples)), samples.keys(), fontsize=15)
156 |     plt.axvline(0, color="black", alpha=0.5, linestyle="--")
157 | 
158 | 
159 | def WAIC(model, x, y, out_var_nm, num_samples=100):
160 |     p = torch.zeros((num_samples, len(y)))
161 |     # Get log probability samples from the guide's posterior draws
162 |     for i in range(num_samples):
163 |         tr = poutine.trace(poutine.condition(model, data=model.guide())).get_trace(x)
164 |         dist = tr.nodes[out_var_nm]["fn"]
165 |         p[i] = dist.log_prob(y).detach()
166 |     pmax = p.max(axis=0).values
167 |     lppd = pmax + (p - pmax).exp().mean(axis=0).log()  # numerically stable log-mean-exp
168 |     penalty = p.var(axis=0)
169 |     return -2*(lppd - penalty)
170 | 
171 | 
172 | def format_data(df, categoricals=None):
173 |     data = dict()
174 |     if categoricals is None:
175 |         categoricals = []
176 |     for col in set(df.columns) - set(categoricals):
177 |         data[col] = tt(df[col].values).double()
178 |     for col in categoricals:
179 |         data[col] = tt(df[col].values).long()
180 |     return data
181 | 
182 | 
183 | def train_nuts(model, data, num_warmup, num_samples, num_chains=1, **kwargs):
184 |     _kwargs = dict(adapt_step_size=True, adapt_mass_matrix=True, jit_compile=True)
185 |     _kwargs.update(kwargs)
186 |     print(_kwargs)
187 |     kernel = NUTS(model, **_kwargs)
188 |     engine = MCMC(kernel, num_samples, num_warmup, num_chains=num_chains)
189 |     engine.run(data, training=True)
190 |     return engine
191 | 
192 | 
193 | def traceplot(s, num_chains=1):
194 |     fig, axes = plt.subplots(nrows=len(s), figsize=(12, len(s)*5))
195 |     for (k, v), ax in zip(s.items(), axes):
196 |         plt.sca(ax)
197 |         if num_chains > 1:
198 |             for c in range(num_chains):
199 |                 plt.plot(v[c], linewidth=0.5)
200 |         else:
201 |             plt.plot(v, linewidth=0.5)
202 |         plt.ylabel(k)
203 |     plt.xlabel("Sample index")
204 |     return fig
205 | 
206 | def trankplot(s, num_chains):
207 |     fig, axes = plt.subplots(nrows=len(s), figsize=(12, len(s)*num_chains))
208 |     ranks = {k: np.argsort(np.argsort(v, axis=None)).reshape(v.shape) for k, v in s.items()}  # double argsort converts values to ranks
209 |     num_samples = 1
210 |     for p in list(s.values())[0].shape:
211 |         num_samples *= p
212 |     bins = np.linspace(0, num_samples, 30)
213 |     for i, (ax, (k, v)) in enumerate(zip(axes, ranks.items())):
214 |         for c in range(num_chains):
215 |             ax.hist(v[c], bins=bins, histtype="step", linewidth=2, alpha=0.5)
216 |         ax.set_xlim(left=0, right=num_samples)
217 |         ax.set_yticks([])
218 |         ax.set_ylabel(k)
219 |     plt.xlabel("sample rank")
220 |     return fig
221 | 
222 | 
223 | def unnest_samples(s, max_depth=1):
224 |     """Unnests samples from multivariate distributions
225 | 
226 |     The general index structure of a sample tensor is
227 |     [[chains,] samples [,idx1, idx2, ...]]. Sometimes the distribution is univariate
228 |     and there are no additional indices. So we will always unnest from the right, but
229 |     only if the tensor has rank of 3 or more (2 in the case of no grouping by chains).
230 | """ 231 | def _unnest_samples(s): 232 | _s = dict() 233 | for k in s: 234 | assert s[k].dim() > 0 235 | if s[k].dim() == 1: 236 | _s[k] = s[k] 237 | elif s[k].dim() == 2: 238 | for i in range(s[k].shape[1]): 239 | _s[f"{k}[{i}]"] = s[k][:,i] 240 | else: 241 | for i in range(s[k].shape[1]): 242 | _s[f"{k}[{i}]"] = s[k][:,i,...] 243 | return _s 244 | 245 | for _ in range(max_depth): 246 | s = _unnest_samples(s) 247 | if all([v.dim() == 1 for v in s.values()]): 248 | break 249 | return s 250 | 251 | 252 | def get_log_prob(mcmc, data, site_names): 253 | """Gets the pointwise log probability of the posterior density conditioned on the data 254 | 255 | Arguments: 256 | mcmc (pyro.infer.mcmc.MCMC): the fitted MC model 257 | data (dict): dictionary containing all the input data (including return sites) 258 | site_names (str or List[str]): names of return sites to measure log likelihood at 259 | Returns: 260 | Tensor: pointwise log-likelihood of shape (num posterior samples, num data points) 261 | """ 262 | samples = mcmc.get_samples() 263 | model = mcmc.kernel.model 264 | # get number of samples 265 | N = [v.shape[0] for v in samples.values()] 266 | assert [n == N[0] for n in N] 267 | N = N[0] 268 | if isinstance(site_names, str): 269 | site_names = [site_names] 270 | # iterate over samples 271 | log_prob = torch.zeros(N, len(data[site_names[0]])) 272 | for i in range(N): 273 | # condition on samples and get trace 274 | s = {k: v[i] for k, v in samples.items()} 275 | for nm in site_names: 276 | s[nm] = data[nm] 277 | tr = poutine.trace(poutine.condition(model, data=s)).get_trace(data) 278 | # get pointwise log probability 279 | for nm in site_names: 280 | node = tr.nodes[nm] 281 | log_prob[i] += node["fn"].log_prob(node["value"]) 282 | return log_prob -------------------------------------------------------------------------------- /notebooks/00_preface.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Preface" 8 | ] 9 | }, 10 | { 11 | "cell_type": "code", 12 | "execution_count": 1, 13 | "metadata": {}, 14 | "outputs": [ 15 | { 16 | "name": "stdout", 17 | "output_type": "stream", 18 | "text": [ 19 | "/home/ecotner/statistical-rethinking\n" 20 | ] 21 | } 22 | ], 23 | "source": [ 24 | "%cd ~/statistical-rethinking" 25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": 2, 30 | "metadata": {}, 31 | "outputs": [], 32 | "source": [ 33 | "import numpy as np\n", 34 | "import pandas as pd\n", 35 | "from sklearn.linear_model import LinearRegression\n", 36 | "import matplotlib.pyplot as plt" 37 | ] 38 | }, 39 | { 40 | "cell_type": "markdown", 41 | "metadata": {}, 42 | "source": [ 43 | "### Code 0.1" 44 | ] 45 | }, 46 | { 47 | "cell_type": "markdown", 48 | "metadata": {}, 49 | "source": [ 50 | "Illustration of what `code` looks like" 51 | ] 52 | }, 53 | { 54 | "cell_type": "code", 55 | "execution_count": 3, 56 | "metadata": {}, 57 | "outputs": [ 58 | { 59 | "name": "stdout", 60 | "output_type": "stream", 61 | "text": [ 62 | "All models are wrong, but some are useful\n" 63 | ] 64 | } 65 | ], 66 | "source": [ 67 | "print(\"All models are wrong, but some are useful\")" 68 | ] 69 | }, 70 | { 71 | "cell_type": "markdown", 72 | "metadata": {}, 73 | "source": [ 74 | "### Code 0.2" 75 | ] 76 | }, 77 | { 78 | "cell_type": "markdown", 79 | "metadata": {}, 80 | "source": [ 81 | "A complicated way to compute `10*20=200`" 82 | ] 83 | }, 84 | { 85 | "cell_type": 
"code", 86 | "execution_count": 4, 87 | "metadata": {}, 88 | "outputs": [ 89 | { 90 | "data": { 91 | "text/plain": [ 92 | "200.0000000000001" 93 | ] 94 | }, 95 | "execution_count": 4, 96 | "metadata": {}, 97 | "output_type": "execute_result" 98 | } 99 | ], 100 | "source": [ 101 | "x = np.array([1, 2])\n", 102 | "x = x*10\n", 103 | "x = np.log(x)\n", 104 | "x = np.sum(x)\n", 105 | "x = np.exp(x)\n", 106 | "x" 107 | ] 108 | }, 109 | { 110 | "cell_type": "markdown", 111 | "metadata": {}, 112 | "source": [ 113 | "### Code 0.3" 114 | ] 115 | }, 116 | { 117 | "cell_type": "markdown", 118 | "metadata": {}, 119 | "source": [ 120 | "Mathematically, the expressions\n", 121 | "$$\n", 122 | "p_1 = \\log(0.01^{200}) \\\\\n", 123 | "p_2 = 200 \\times \\log(0.01)\n", 124 | "$$\n", 125 | "are equivalent. However, if you compute them numerically, you will see that one is much more stable:" 126 | ] 127 | }, 128 | { 129 | "cell_type": "code", 130 | "execution_count": 5, 131 | "metadata": {}, 132 | "outputs": [ 133 | { 134 | "name": "stdout", 135 | "output_type": "stream", 136 | "text": [ 137 | "-inf\n", 138 | "-921.0340371976182\n" 139 | ] 140 | }, 141 | { 142 | "name": "stderr", 143 | "output_type": "stream", 144 | "text": [ 145 | "/home/ecotner/.local/lib/python3.7/site-packages/ipykernel_launcher.py:1: RuntimeWarning: divide by zero encountered in log\n", 146 | " \"\"\"Entry point for launching an IPython kernel.\n" 147 | ] 148 | } 149 | ], 150 | "source": [ 151 | "print(np.log(0.01**200))\n", 152 | "print(200*np.log(0.01))" 153 | ] 154 | }, 155 | { 156 | "cell_type": "markdown", 157 | "metadata": {}, 158 | "source": [ 159 | "### Code 0.4" 160 | ] 161 | }, 162 | { 163 | "cell_type": "markdown", 164 | "metadata": {}, 165 | "source": [ 166 | "Running linear regression on a sample dataset." 
167 | ]
168 | },
169 | {
170 | "cell_type": "code",
171 | "execution_count": 6,
172 | "metadata": {},
173 | "outputs": [
174 | {
175 | "name": "stdout",
176 | "output_type": "stream",
177 | "text": [
178 | "coefficients: [3.93240876]\n",
179 | "intercept: -17.579094890510973\n"
180 | ]
181 | },
182 | {
183 | "data": {
184 | "image/png": "<base64 PNG omitted: scatter of residuals vs. speed with a dashed zero line, titled 'Linear regression residuals'>",
185 | "text/plain": [
186 | "<Figure: linear regression residuals plot (PNG omitted)>"
" 187 | ] 188 | }, 189 | "metadata": { 190 | "needs_background": "light" 191 | }, 192 | "output_type": "display_data" 193 | } 194 | ], 195 | "source": [ 196 | "# Import the data\n", 197 | "cars = pd.read_csv(\"data/cars.csv\")\n", 198 | "\n", 199 | "# Fit a linear regression of distance on speed\n", 200 | "model = LinearRegression()\n", 201 | "X = cars[\"speed\"].values.reshape(-1, 1)\n", 202 | "y = cars[\"dist\"].values\n", 203 | "model.fit(X, y)\n", 204 | "\n", 205 | "# Estimated coefficients from the model\n", 206 | "print(\"coefficients:\", model.coef_)\n", 207 | "print(\"intercept:\", model.intercept_)\n", 208 | "\n", 209 | "# Plot residuals against speed\n", 210 | "res = y - model.predict(X)\n", 211 | "plt.scatter(cars[\"speed\"], res)\n", 212 | "plt.axhline(0, color='grey', linestyle='--', alpha=0.5)\n", 213 | "plt.title(\"Linear regression residuals\")\n", 214 | "plt.ylabel(\"residuals\")\n", 215 | "plt.xlabel(\"speed\")\n", 216 | "plt.show()" 217 | ] 218 | }, 219 | { 220 | "cell_type": "markdown", 221 | "metadata": {}, 222 | "source": [ 223 | "### Code 0.5\n", 224 | "\n", 225 | "The author installs his `rethinking` package for `R`, but since we're doing this in `python`, there is no equivalent step." 226 | ] 227 | }, 228 | { 229 | "cell_type": "code", 230 | "execution_count": null, 231 | "metadata": {}, 232 | "outputs": [], 233 | "source": [] 234 | } 235 | ], 236 | "metadata": { 237 | "kernelspec": { 238 | "display_name": "Python 3", 239 | "language": "python", 240 | "name": "python3" 241 | }, 242 | "language_info": { 243 | "codemirror_mode": { 244 | "name": "ipython", 245 | "version": 3 246 | }, 247 | "file_extension": ".py", 248 | "mimetype": "text/x-python", 249 | "name": "python", 250 | "nbconvert_exporter": "python", 251 | "pygments_lexer": "ipython3", 252 | "version": "3.7.5" 253 | } 254 | }, 255 | "nbformat": 4, 256 | "nbformat_minor": 4 257 | } 258 | -------------------------------------------------------------------------------- /notebooks/01_golem_of_prague.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# 1: The Golem of Prague" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "This chapter is entirely expository; there are no code samples. There are many interesting philosophical ideas touched upon that I will try to briefly summarize:\n", 15 | "\n", 16 | "* Models are not hypotheses (nor are hypotheses models)\n", 17 | " * In many cases, one hypothesis could be described by multiple sufficiently flexible models\n", 18 | " * Multiple hypotheses may lead to the same statistical model\n", 19 | " * The underlying scientific/causal model is the real source of truth, but can be difficult or impossible to ascertain.\n", 20 | "* There are multiple types of errors and uncertainties, and we should be careful to distinguish them\n", 21 | " * measurement/systematic error\n", 22 | " * level of belief\n", 23 | " * statistical variations\n", 24 | "* Just because something cannot be proven wrong does not mean it is right\n", 25 | "* The book will focus mainly on four main tools/techniques:\n", 26 | "\n", 27 | " 1. Bayesian data analysis\n", 28 | "\n", 29 | " 1. Reasons about outcomes in the form of probability distributions\n", 30 | " 2. Many natural intuitions about probability and statistics align with the Bayesian interpretation as opposed to others\n", 31 | " 2. Model comparison\n", 32 | " 1. 
32 | "        1. Can use techniques like cross-validation and information criteria to measure generalization accuracy and compare models\n",
33 | "        2. Will be useful for determining if complex models are overfitting\n",
34 | "    3. Multilevel models\n",
35 | "        1. Complex models can be built from hierarchies of simpler ones\n",
36 | "        2. Makes use of the _partial pooling_ trick to share information across units in a model to produce better estimates for all units\n",
37 | "            1. helps reduce the effects of common problems like repeat sampling, dataset imbalance, population variation, and misuse of data averaging\n",
38 | "        3. Author argues that multilevel regression should be the default!\n",
39 | "    4. Graphical causal models\n",
40 | "        1. Statistics cannot tell you anything about causation\n",
41 | "        2. If two events are correlated, which one caused the other?\n",
42 | "            1. Maybe both could be traced back to a common \"confounding factor\", so there is actually no causal relationship!\n",
43 | "        3. A DAG (Directed Acyclic Graph) is a common tool for specifying causal models that can represent chains of causal relationships\n",
44 | "* The layout of the chapters will be\n",
45 | "\n",
46 | "    * Chapters 2/3 introduce Bayesian foundations\n",
47 | "    * Chapters 4-9 explore multiple linear regression from a Bayesian perspective, and touch on the problem of overfitting\n",
48 | "    * Chapters 9-12 explore \"generalized\" linear models, MCMC (Markov Chain Monte Carlo), and the use of \"maximum entropy\"\n",
49 | "    * Chapters 13-16 discuss multilevel models as well as some other specialized models\n",
50 | "    * Chapter 17 kind of returns to the beginning and wraps everything up"
51 | ]
52 | },
53 | {
54 | "cell_type": "code",
55 | "execution_count": null,
56 | "metadata": {},
57 | "outputs": [],
58 | "source": []
59 | }
60 | ],
61 | "metadata": {
62 | "kernelspec": {
63 | "display_name": "Python 3",
64 | "language": "python",
65 | "name": "python3"
66 | },
67 | "language_info": {
68 | "codemirror_mode": {
69 | "name": "ipython",
70 | "version": 3
71 | },
72 | "file_extension": ".py",
73 | "mimetype": "text/x-python",
74 | "name": "python",
75 | "nbconvert_exporter": "python",
76 | "pygments_lexer": "ipython3",
77 | "version": "3.7.5"
78 | }
79 | },
80 | "nbformat": 4,
81 | "nbformat_minor": 4
82 | }
83 | 
--------------------------------------------------------------------------------
/notebooks/10_entropy_and_glm.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# 10: Big Entropy and the Generalized Linear Model\n",
8 | "We are going to examine the role of entropy in our choice of distributions to represent our priors/posteriors. The guiding principle here is that we want to choose the distribution that maximizes entropy (uncertainty) given some constraints. Before, we only really used the Gaussian distribution, which it turns out is the maximum entropy distribution over the real numbers given a fixed mean and variance; its differential entropy works out to $H = \\frac{1}{2}\\log(2\\pi e \\sigma^2)$, the largest achievable for that variance."
9 | ]
10 | },
11 | {
12 | "cell_type": "code",
13 | "execution_count": 1,
14 | "metadata": {},
15 | "outputs": [],
16 | "source": [
17 | "import numpy as np\n",
18 | "import pandas as pd\n",
19 | "import matplotlib.pyplot as plt\n",
20 | "from scipy.special import binom"
21 | ]
22 | },
23 | {
24 | "cell_type": "markdown",
25 | "metadata": {},
26 | "source": [
27 | "### Code 10.1 - 10.4\n",
28 | "Now we will analyze the role of entropy in choosing a maximum entropy discrete distribution for count data. If we assume we have 10 pebbles that can be split between 5 buckets, how many ways are there to arrange them? How many of these ways end up with the same number of pebbles in each bucket?"
29 | ]
30 | },
31 | {
32 | "cell_type": "code",
33 | "execution_count": 2,
34 | "metadata": {},
35 | "outputs": [],
36 | "source": [
37 | "# consider 5 different potential allocations of the pebbles across buckets\n",
38 | "p = {\n",
39 | "    \"A\": [0, 0, 10, 0, 0],\n",
40 | "    \"B\": [0, 1, 8, 1, 0],\n",
41 | "    \"C\": [0, 2, 6, 2, 0],\n",
42 | "    \"D\": [1, 2, 4, 2, 1],\n",
43 | "    \"E\": [2, 2, 2, 2, 2],\n",
44 | "}\n",
45 | "# normalize\n",
46 | "p_norm = {k: np.array(v)/np.sum(v) for k, v in p.items()}"
47 | ]
48 | },
49 | {
50 | "cell_type": "code",
51 | "execution_count": 3,
52 | "metadata": {},
53 | "outputs": [
54 | {
55 | "name": "stderr",
56 | "output_type": "stream",
57 | "text": [
58 | "/home/ecotner/.local/lib/python3.7/site-packages/ipykernel_launcher.py:2: RuntimeWarning: divide by zero encountered in log\n",
59 | "  \n"
60 | ]
61 | },
62 | {
63 | "data": {
64 | "text/plain": [
65 | "{'A': -0.0,\n",
66 | " 'B': 0.639031859650177,\n",
67 | " 'C': 0.9502705392332347,\n",
68 | " 'D': 1.4708084763221112,\n",
69 | " 'E': 1.6094379124341005}"
70 | ]
71 | },
72 | "execution_count": 3,
73 | "metadata": {},
74 | "output_type": "execute_result"
75 | }
76 | ],
77 | "source": [
78 | "# calculate entropy\n",
79 | "H = {k: -(q * np.where(q==0, 0, np.log(q))).sum() for k, q in p_norm.items()}\n",
80 | "H"
81 | ]
82 | },
83 | {
84 | "cell_type": "markdown",
85 | "metadata": {},
86 | "source": [
87 | "It turns out that the entropy is highest for the most uniform distribution. How many ways can these counts of pebbles be realized? Well, there are $\\binom{N}{k}$ ways to sample $k$ identical objects from a pool of $N$. So if we envision looking into each bucket in sequence, we know that for the first bucket, there are $\\binom{N}{k_1}$ ways of arranging the pebbles in that bucket, $\\binom{N-k_1}{k_2}$ ways of arranging the pebbles in the second bucket (because $k_1$ of the pebbles are already in the first bucket, so there are only $N - k_1$ \"free\" pebbles left), $\\binom{N-(k_1+k_2)}{k_3}$ in the third, and so on. The product telescopes into the multinomial coefficient $N!/(k_1! k_2! \\cdots k_5!)$; for distribution E that is $10!/(2!)^5 = 113400$, which matches the count computed below."
88 | ]
89 | },
90 | {
91 | "cell_type": "code",
92 | "execution_count": 4,
93 | "metadata": {},
94 | "outputs": [
95 | {
96 | "data": {
97 | "text/plain": [
98 | "{'A': 1, 'B': 90, 'C': 1260, 'D': 37800, 'E': 113400}"
99 | ]
100 | },
101 | "execution_count": 4,
102 | "metadata": {},
103 | "output_type": "execute_result"
104 | }
105 | ],
106 | "source": [
107 | "ways = dict()\n",
108 | "for k in p:\n",
109 | "    n_left = 10\n",
110 | "    w = []\n",
111 | "    for n in p[k]:\n",
112 | "        w.append(binom(n_left, n))\n",
113 | "        n_left -= n\n",
114 | "    ways[k] = int(np.prod(w))\n",
115 | "ways"
116 | ]
117 | },
118 | {
119 | "cell_type": "code",
120 | "execution_count": 5,
121 | "metadata": {},
122 | "outputs": [
123 | {
124 | "data": {
125 | "text/plain": [
126 | "{'A': 0.0,\n",
127 | " 'B': 0.6491853096329675,\n",
128 | " 'C': 1.029920801838728,\n",
129 | " 'D': 1.5206098613995798,\n",
130 | " 'E': 1.6791061114716954}"
131 | ]
132 | },
133 | "execution_count": 5,
134 | "metadata": {},
135 | "output_type": "execute_result"
136 | }
137 | ],
138 | "source": [
139 | "logwayspp = {k: np.log2(ways[k])/10 for k in ways}\n",
140 | "logwayspp"
141 | ]
142 | },
143 | {
144 | "cell_type": "code",
145 | "execution_count": 6,
146 | "metadata": {},
147 | "outputs": [
148 | {
149 | "data": {
150 | "image/png": "<base64 PNG omitted: scatter of entropy vs. log(ways) per pebble for distributions A-E, with a dashed reference line>",
151 | "text/plain": [
152 | "<Figure: entropy vs. log(ways) plot (PNG omitted)>"
" 153 | ] 154 | }, 155 | "metadata": { 156 | "needs_background": "light" 157 | }, 158 | "output_type": "display_data" 159 | } 160 | ], 161 | "source": [ 162 | "x = np.linspace(-0.05, 1.80, 3)\n", 163 | "plt.plot(x, x, color=\"black\", linestyle=\"--\", zorder=-1)\n", 164 | "x = list(logwayspp.values())\n", 165 | "y = list(H.values())\n", 166 | "plt.scatter(x, y)\n", 167 | "plt.xlabel(\"log(ways)\")\n", 168 | "plt.ylabel(\"entropy\")\n", 169 | "labels = list(H.keys())\n", 170 | "for i in range(len(labels)):\n", 171 | " plt.text(x[i], y[i]+0.15, labels[i], horizontalalignment=\"center\", fontsize=15)\n", 172 | "plt.show()" 173 | ] 174 | }, 175 | { 176 | "cell_type": "markdown", 177 | "metadata": {}, 178 | "source": [ 179 | "### Code 10.5 - 10.6\n", 180 | "Now we want to compare the entropies of several potential probability distributions for sampling blue and white marbles from a bag, where we _know_ that the expected number of blue marbles over two draws is exactly 1. We consider the following proposal distributions" 181 | ] 182 | }, 183 | { 184 | "cell_type": "code", 185 | "execution_count": 7, 186 | "metadata": {}, 187 | "outputs": [ 188 | { 189 | "data": { 190 | "text/html": [ 191 | "
\n", 192 | "\n", 205 | "\n", 206 | " \n", 207 | " \n", 208 | " \n", 209 | " \n", 210 | " \n", 211 | " \n", 212 | " \n", 213 | " \n", 214 | " \n", 215 | " \n", 216 | " \n", 217 | " \n", 218 | " \n", 219 | " \n", 220 | " \n", 221 | " \n", 222 | " \n", 223 | " \n", 224 | " \n", 225 | " \n", 226 | " \n", 227 | " \n", 228 | " \n", 229 | " \n", 230 | " \n", 231 | " \n", 232 | " \n", 233 | " \n", 234 | " \n", 235 | " \n", 236 | " \n", 237 | " \n", 238 | " \n", 239 | " \n", 240 | " \n", 241 | " \n", 242 | " \n", 243 | " \n", 244 | " \n", 245 | "
wwbwwbbb
A0.2500000.2500000.2500000.250000
B0.3333330.1666670.1666670.333333
C0.1666670.3333330.3333330.166667
D0.1250000.5000000.2500000.125000
\n", 246 | "
" 247 | ], 248 | "text/plain": [ 249 | " ww bw wb bb\n", 250 | "A 0.250000 0.250000 0.250000 0.250000\n", 251 | "B 0.333333 0.166667 0.166667 0.333333\n", 252 | "C 0.166667 0.333333 0.333333 0.166667\n", 253 | "D 0.125000 0.500000 0.250000 0.125000" 254 | ] 255 | }, 256 | "execution_count": 7, 257 | "metadata": {}, 258 | "output_type": "execute_result" 259 | } 260 | ], 261 | "source": [ 262 | "p = pd.DataFrame([\n", 263 | " [1/4, 1/4, 1/4, 1/4],\n", 264 | " [2/6, 1/6, 1/6, 2/6],\n", 265 | " [1/6, 2/6, 2/6, 1/6],\n", 266 | " [1/8, 4/8, 2/8, 1/8],\n", 267 | "], columns=[\"ww\",\"bw\",\"wb\",\"bb\"], index=list(\"ABCD\"))\n", 268 | "p" 269 | ] 270 | }, 271 | { 272 | "cell_type": "code", 273 | "execution_count": 8, 274 | "metadata": {}, 275 | "outputs": [ 276 | { 277 | "data": { 278 | "text/plain": [ 279 | "A 1.0\n", 280 | "B 1.0\n", 281 | "C 1.0\n", 282 | "D 1.0\n", 283 | "dtype: float64" 284 | ] 285 | }, 286 | "execution_count": 8, 287 | "metadata": {}, 288 | "output_type": "execute_result" 289 | } 290 | ], 291 | "source": [ 292 | "# Compute expected value of # of blue marbles\n", 293 | "(p*np.array([[0, 1, 1, 2]])).sum(axis=1)" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": 9, 299 | "metadata": {}, 300 | "outputs": [ 301 | { 302 | "data": { 303 | "text/plain": [ 304 | "A 1.386294\n", 305 | "B 1.329661\n", 306 | "C 1.329661\n", 307 | "D 1.213008\n", 308 | "dtype: float64" 309 | ] 310 | }, 311 | "execution_count": 9, 312 | "metadata": {}, 313 | "output_type": "execute_result" 314 | } 315 | ], 316 | "source": [ 317 | "# compute entropy of each distribution\n", 318 | "-(p*np.log(p)).sum(axis=1)" 319 | ] 320 | }, 321 | { 322 | "cell_type": "markdown", 323 | "metadata": {}, 324 | "source": [ 325 | "We see that distribution A has the largest entropy. It just happens to be the same as the binomal distribution for $b$ successes out of $n=2$ trials." 326 | ] 327 | }, 328 | { 329 | "cell_type": "markdown", 330 | "metadata": {}, 331 | "source": [ 332 | "### Code 10.7 - 10.13\n", 333 | "The above example was kind of special because the distribution over outcomes can remain flat and still be consistent with the constraint. What if the expected value was 1.4 marbles in two draws ($p = 0.7$). The binomial distribution with this expected value is" 334 | ] 335 | }, 336 | { 337 | "cell_type": "code", 338 | "execution_count": 10, 339 | "metadata": {}, 340 | "outputs": [ 341 | { 342 | "data": { 343 | "text/plain": [ 344 | "array([0.09, 0.21, 0.21, 0.49])" 345 | ] 346 | }, 347 | "execution_count": 10, 348 | "metadata": {}, 349 | "output_type": "execute_result" 350 | } 351 | ], 352 | "source": [ 353 | "p = 0.7\n", 354 | "A = np.array([(1-p)**2, p*(1-p), (1-p)*p, p**2])\n", 355 | "A" 356 | ] 357 | }, 358 | { 359 | "cell_type": "code", 360 | "execution_count": 11, 361 | "metadata": {}, 362 | "outputs": [ 363 | { 364 | "data": { 365 | "text/plain": [ 366 | "1.221728604109787" 367 | ] 368 | }, 369 | "execution_count": 11, 370 | "metadata": {}, 371 | "output_type": "execute_result" 372 | } 373 | ], 374 | "source": [ 375 | "# entropy of distribution\n", 376 | "-(A*np.log(A)).sum()" 377 | ] 378 | }, 379 | { 380 | "cell_type": "markdown", 381 | "metadata": {}, 382 | "source": [ 383 | "If we randomly generate a bunch of distributions with the same expected value of 1.4, then we expect that none of them will have a larger entropy than 1.22." 
384 | ] 385 | }, 386 | { 387 | "cell_type": "code", 388 | "execution_count": 12, 389 | "metadata": {}, 390 | "outputs": [], 391 | "source": [ 392 | "def sim_dists(G=1.4):\n", 393 | "    x123 = np.random.rand(3)\n", 394 | "    x4 = (G * x123.sum() - x123[1] - x123[2])/(2-G)\n", 395 | "    z = x123.sum() + x4\n", 396 | "    p = np.array([*x123, x4])/z\n", 397 | "    return dict(H=-(p*np.log(p)).sum(), p=p)\n", 398 | "\n", 399 | "H = [sim_dists() for _ in range(10_000)]\n", 400 | "entropies = np.array([d[\"H\"] for d in H])\n", 401 | "distributions = np.array([d[\"p\"] for d in H])" 402 | ] 403 | }, 404 | { 405 | "cell_type": "code", 406 | "execution_count": 13, 407 | "metadata": {}, 408 | "outputs": [ 409 | { 410 | "data": { 411 | "image/png": "[base64-encoded PNG data omitted: histogram of the sampled entropies with the binomial entropy marked, drawn by the code below]", 412 | "text/plain": [ 413 | "
" 414 | ] 415 | }, 416 | "metadata": { 417 | "needs_background": "light" 418 | }, 419 | "output_type": "display_data" 420 | } 421 | ], 422 | "source": [ 423 | "plt.hist(entropies, bins=100, density=True, histtype=\"step\", linewidth=1.15, label=\"random\")\n", 424 | "plt.xlabel(\"entropy\")\n", 425 | "plt.ylabel(\"density\")\n", 426 | "plt.axvline(-(A*np.log(A)).sum(), color=\"black\", linestyle=\"--\", label=\"binomial\")\n", 427 | "plt.legend(title=\"distributions\")\n", 428 | "plt.show()" 429 | ] 430 | }, 431 | { 432 | "cell_type": "markdown", 433 | "metadata": {}, 434 | "source": [ 435 | "We can see that the largest entropy sample has a distribution that is almost identical to the binomial distribution." 436 | ] 437 | }, 438 | { 439 | "cell_type": "code", 440 | "execution_count": 14, 441 | "metadata": {}, 442 | "outputs": [ 443 | { 444 | "name": "stdout", 445 | "output_type": "stream", 446 | "text": [ 447 | "max entropy: 1.221705804760178\n", 448 | "sample distribution: [0.08953159 0.21253592 0.20840089 0.48953159]\n", 449 | "binomial distribution: [0.09 0.21 0.21 0.49]\n" 450 | ] 451 | } 452 | ], 453 | "source": [ 454 | "idx = np.argmax(entropies)\n", 455 | "print(\"max entropy:\", entropies[idx])\n", 456 | "print(\"sample distribution:\", distributions[idx])\n", 457 | "print(\"binomial distribution:\", A)" 458 | ] 459 | }, 460 | { 461 | "cell_type": "markdown", 462 | "metadata": {}, 463 | "source": [ 464 | "## Generalized linear models" 465 | ] 466 | }, 467 | { 468 | "cell_type": "markdown", 469 | "metadata": {}, 470 | "source": [ 471 | "The rest of the chapter is about generalized linear models (abbreviated GLM). There are no more code snippets, just some general theoretical motivation for their use. The idea is that instead of using the normal distribution and having $\\mu$ be a linear function of the predictors, perhaps we could (and in fact _should_) use other distributions in different scenarios, and instead try and predict the parameters that define them as functions of the predictors." 472 | ] 473 | }, 474 | { 475 | "cell_type": "markdown", 476 | "metadata": {}, 477 | "source": [ 478 | "For example, if we have some count data where we need to infer the probability of an even occurring, we could use the following GLM:\n", 479 | "\n", 480 | "$$\n", 481 | "\\begin{align}\n", 482 | "y_i &\\sim \\text{Binomial}(n, p_i) \\\\\n", 483 | "f(p_i) &= \\alpha + \\beta x_i\n", 484 | "\\end{align}\n", 485 | "$$\n", 486 | "\n", 487 | "where $f(p)$ is known as the \"link function\", and its purpose is to transform the range of the linear function $\\alpha + \\beta x$ into the domain of the parameter $p$. The problem is that $\\alpha + \\beta x$ can in principle be any real number, but $p \\in [0, 1]$. So we need some function to \"squash\" the range to fit in there. A useful transformation in this case is the logit function\n", 488 | "$$\\text{logit}(p) = \\log \\frac{p}{1-p},$$\n", 489 | "which amounts to a transformation of\n", 490 | "$$p = f^{-1}(\\alpha + \\beta x) = \\frac{\\exp(\\alpha + \\beta x)}{1 + \\exp(\\alpha + \\beta x)}$$\n", 491 | "Because this inverse transformation is so common, you'll usually see $f^{-1}$ just as often as you'll see $f$." 
492 | ] 493 | }, 494 | { 495 | "cell_type": "markdown", 496 | "metadata": {}, 497 | "source": [ 498 | "The use of a nonlinear link function implicitly leads to interaction effects between all of the predictors, because now the derivative of the parameter with respect to one predictor is a function of all the predictors, not just the one:\n", 499 | "$$\n", 500 | "\\frac{\\partial p}{\\partial x_i} = (f^{-1})^\\prime(\\alpha + \\beta \\cdot x) \\beta_i\n", 501 | "$$\n", 502 | "(I'm assuming $x$ and $\\beta$ are both vectors here)" 503 | ] 504 | }, 505 | { 506 | "cell_type": "code", 507 | "execution_count": null, 508 | "metadata": {}, 509 | "outputs": [], 510 | "source": [] 511 | } 512 | ], 513 | "metadata": { 514 | "kernelspec": { 515 | "display_name": "Python 3", 516 | "language": "python", 517 | "name": "python3" 518 | }, 519 | "language_info": { 520 | "codemirror_mode": { 521 | "name": "ipython", 522 | "version": 3 523 | }, 524 | "file_extension": ".py", 525 | "mimetype": "text/x-python", 526 | "name": "python", 527 | "nbconvert_exporter": "python", 528 | "pygments_lexer": "ipython3", 529 | "version": "3.7.5" 530 | } 531 | }, 532 | "nbformat": 4, 533 | "nbformat_minor": 4 534 | } 535 | -------------------------------------------------------------------------------- /notebooks/README.md: -------------------------------------------------------------------------------- 1 | # Statistical Rethinking chapters 2 | 3 | Each of the jupyter notebooks in this directory corresponds to a chapter of _Statistical Rethinking_. Yes, those are the actual names of the chapters; I did not come up with them! 4 | 5 | ## Chapter 0: [Preface](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/00_preface.ipynb) 6 | Introduces the content of the book, how to use it effectively, some advice on coding, etc. 7 | 8 | ## Chapter 1: [The Golem of Prague](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/01_golem_of_prague.ipynb) 9 | Discusses the zoo of common statistical tests, differences between hypotheses and models, some philosophy about truth/falsification of models. Introduces some of the fundamental differences between Frequentist and Bayesian statistics, then goes on to highlight specific future chapters: chapter 7 on model comparison, chapter 13 on multilevel models, chapters 5/6 on graphical causal models. 10 | 11 | ## Chapter 2: [Small Worlds and Large Worlds](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/02_small_large_worlds.ipynb) 12 | Gives motivation for the Bayesian way of computing probabilities - it's just counting the ways observations could have occurred. Explains the various pieces of Bayes' rule: likelihood, prior, evidence, posterior. Illustrates how successive observations allow one to update their prior beliefs. Introduces the _grid approximation_ and _quadratic approximation_ (also known as _Laplace approximation_) for computing posteriors of simple models with low dimensionality and nearly Gaussian posteriors. 13 | 14 | ## Chapter 3: [Sampling the Imaginary](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/03_sampling_the_imaginary.ipynb) 15 | Goes into detail about the grid approximation and how to compute posterior distributions from it. Discusses confidence and credible/compatibility intervals, HDI/HPDI/PI. Explains the benefits of having the entire posterior distribution over only having a point estimate. 
Shows how to sample from posteriors and how to use the samples to calculate any quantity of interest. 16 | 17 | ## Chapter 4: [Geocentric Models](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/04_geocentric_models.ipynb) 18 | Explains why "common" models/distributions like linear regression and the Gaussian are the default in most scenarios and illustrates the Central Limit Theorem. Uses the grid/quadratic approximation to estimate the posterior of a simple linear regression model, and shows the importance of doing prior predictive sampling/simulation to determine logical/informative priors. Shows how to generalize to polynomial and spline regression models. 19 | 20 | ## Chapter 5: [The Many Variables & The Spurious Waffles](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/05_many_vars_and_spurious_waffles.ipynb) 21 | This chapter starts looking at the correlation vs. causation problem, introduces some techniques for causal inference (DAG's), and shows how including certain predictors in your model could either increase or decrease bias if you're not careful. Shows how to use models to do counterfactual deduction. Explains how to incorporate categorical predictors into your linear models. 22 | 23 | ## Chapter 6: [The Haunted DAG & The Causal Terror](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/06_haunted_dag_and_causal_terror.ipynb) 24 | Sources of bias and causality are discussed in more depth. Explains how multicollinearity can disguise causal relationships and introduce non-identifiability of parameters. Explains the d-separation criterion and how to use it to make causal inferences when designing models. Illustrates Simpson's paradox, how to (try to) eliminate confounding, and how to create tests of causality. 25 | 26 | ## Chapter 7: [Ulysses' Compass](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/07_ulysses_compass.ipynb) 27 | Introduces underfitting/overfitting problems, how to identify them through information criteria (AIC, WAIC, PSIS-LOO) and cross-validation techniques. Discusses various model fit metrics and when to use them (absolutely wrecks $R^2$ haha). Gives some more detail on the math of information theory underlying probability theory (entropy, KL divergence). Explains common pitfalls when comparing metrics of model fit. Shows how regularizing priors can be used to improve inference in the presence of domain knowledge or to reduce overfitting. Author heavily prefers using WAIC and PSIS-LOO over out-of-sample CV... not sure if I completely agree. Illustrates all this with a problem comparing models of primate brain mass. 28 | 29 | ## Chapter 8: [Conditional Manatees](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/08_conditional_manatees.ipynb) 30 | Introduces nonlinear interactions between predictor variables and how to include them in "linear" models. Shows how rewriting more complicated models can eliminate identifiability issues. Shows how normalizing variables can make choosing priors simpler and more logical. 31 | 32 | ## Chapter 9: [Markov Chain Monte Carlo](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/09_mcmc.ipynb) 33 | This chapter title is actually pretty apt. 
The concept of MCMC is introduced, various flavors (Metropolis, Gibbs sampling, Hamiltonian MC) are explained, and then we finally settle on HMC/NUTS, talk about some of the pros/cons, how to use it, and how to diagnose problems. 34 | 35 | ## Chapter 10: [Big Entropy and the Generalized Linear Model](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/10_entropy_and_glm.ipynb) 36 | The guiding principle of maximum entropy (to choose priors) is introduced. Generalized linear models (GLM's), link functions, and techniques for interpreting parameters are introduced as well. 37 | 38 | ## Chapter 11: [God Spiked the Integers](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/11_god_spiked_ints.ipynb) 39 | Integer-valued distributions like Poisson/Binomial are introduced, and GLM's are built using them. Brief discussion on how to account for censored data, which comes up often in count models or survival analysis where you measure things like durations to an event, or events that cause "subjects" to "be removed from" the study. 40 | 41 | ## Chapter 12: [Monsters and Mixtures](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/12_monsters_and_mixtures.ipynb) 42 | Mixture distributions are introduced, such as the zero-inflated Poisson, the beta-binomial, and the gamma-Poisson. The utility of these distributions and their higher entropy to help cover unexplained variance is discussed. 43 | 44 | ## Chapter 13: [Models with Memory](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/13_models_with_memory.ipynb) 45 | Hierarchical/multilevel models are introduced. Advantages/disadvantages of pooling are discussed, and reparametrization (centered vs. non-centered) and its effects on HMC sampling efficiency are shown. 46 | 47 | ## Chapter 14: [Adventures in Covariance](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/14_adventures_in_covariance.ipynb) 48 | Takes multilevel models further by introducing adaptive priors that can take covariance between groups of data into account (multidimensional Gaussians). Introduces Gaussian processes for groups linked by continuous values, which is illustrated in the context of geospatial and phylogenetic similarities. 49 | 50 | ## Chapter 15: [Missing Data and Other Opportunities](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/15_missing_data.ipynb) 51 | Teaches how to deal with a variety of problems by modeling the data itself as distributions to be learned. It is shown how to treat measurement error and missing data as generative processes that allow the recovery of the "true" data. Lots of pitfalls related to the causal implications of using such techniques are discussed. The treatment of latent discrete variables in HMC is also shown. 52 | 53 | ## Chapter 16: [Generalized Linear Madness](http://nbviewer.jupyter.org/github/ecotner/statistical-rethinking/blob/master/notebooks/16_generalized_linear_madness.ipynb) 54 | Explains that while GLM's are a powerful tool, they are sometimes so general as to be uninterpretable. Oftentimes, it is better to formulate a model using scientific theory inspired by the domain, trying to keep the model as close as possible to a plausible generative story. 
We go through several examples, including biological growth inspired by basic geometric principles, state space models for inferring strategies in children and forecasting population dynamics, and highly nonlinear situations where we infer the parameters of a system of differential equations. 55 | 56 | ## Chapter 17: Horoscopes 57 | This chapter doesn't have any code, so I did not make a notebook for it. It _very_ briefly discusses how statistical models should be used in scientific studies, and offers some guidelines on how scientific studies should be judged in order to improve the quality of scientific literature. --------------------------------------------------------------------------------