├── images ├── XOR2.png └── Feed_forward_neural_net.gif ├── .gitignore ├── data ├── SAheart.info └── SAheart.data ├── README.md ├── REF_linear_algebra.ipynb ├── pysrc ├── neural_networks.py ├── decision_trees.py └── ipynbs_src │ └── 02_Linear_Regression.py ├── Performance_Prediction.ipynb ├── 06_Graphical_Models.ipynb ├── 01_Introduction.ipynb ├── 04_Neural_Networks.ipynb └── 05_Decision_Trees.ipynb /images/XOR2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/masinoa/machine_learning/master/images/XOR2.png -------------------------------------------------------------------------------- /images/Feed_forward_neural_net.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/masinoa/machine_learning/master/images/Feed_forward_neural_net.gif -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | *.py[co] 2 | 3 | data/ 4 | # Packages 5 | *.egg 6 | *.egg-info 7 | dist 8 | build 9 | eggs 10 | parts 11 | bin 12 | var 13 | sdist 14 | develop-eggs 15 | .installed.cfg 16 | output 17 | 18 | # Installer logs 19 | pip-log.txt 20 | 21 | # Unit test / coverage reports 22 | .coverage 23 | .tox 24 | 25 | #Translations 26 | *.mo 27 | 28 | #Mr Developer 29 | .mr.developer.cfg 30 | -------------------------------------------------------------------------------- /data/SAheart.info: -------------------------------------------------------------------------------- 1 | A retrospective sample of males in a heart-disease high-risk region 2 | of the Western Cape, South Africa. There are roughly two controls per 3 | case of CHD. Many of the CHD positive men have undergone blood 4 | pressure reduction treatment and other programs to reduce their risk 5 | factors after their CHD event. In some cases the measurements were 6 | made after these treatments. These data are taken from a larger 7 | dataset, described in Rousseauw et al, 1983, South African Medical 8 | Journal. 9 | 10 | sbp systolic blood pressure 11 | tobacco cumulative tobacco (kg) 12 | ldl low densiity lipoprotein cholesterol 13 | adiposity 14 | famhist family history of heart disease (Present, Absent) 15 | typea type-A behavior 16 | obesity 17 | alcohol current alcohol consumption 18 | age age at onset 19 | chd response, coronary heart disease 20 | 21 | To read into R: 22 | read.table("http://www-stat.stanford.edu/~tibs/ElemStatLearn/datasets/SAheart.data", 23 | sep=",",head=T,row.names=1) 24 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | machine_learning 2 | ================ 3 | 4 | This repo contains a collection of IPython notebooks detailing various machine learning algorithims. In general, the mathematics follows that presented by Dr. Andrew Ng's Machine Learning course taught at Stanford University (materials available from [ITunes U] (http://www.apple.com/education/itunes-u/), Stanford Machine Learning), Dr. Tom Mitchell's course at Carnegie Mellon (materials avialable [here](http://www.cs.cmu.edu/~tom/10701_sp11/)), and Christopher M. Bishop's "Pattern Recognition And Machine Learning". Unless otherwise noted, the Python code is orginal and any errors or ommissions should be attribued to me and not the aforemention authors. 
5 | 6 | Each ipynb provides a list of the pertinent reading material. It is suggested that the material be read in the order provided. 7 | 8 | If you do not have IPython installed or Notebook configured (why not?) the src directory has .py versions of the notebook files and some of the PDF output files are in this repository's Downloads section. However, they are not always as updated as the ipynb files. 9 | 10 | Python Version 2.7.2 11 | IPython Version 0.13 12 | 13 | 14 | -------------------------------------------------------------------------------- /REF_linear_algebra.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "metadata": { 3 | "name": "REF_linear_algebra" 4 | }, 5 | "nbformat": 3, 6 | "nbformat_minor": 0, 7 | "worksheets": [ 8 | { 9 | "cells": [ 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "

Trace Properties

\n", 15 | "

For the square matrix A, define tr A = trace(A) = $\sum_{i}A_{i,i}$. Let $\bigtriangledown f(A)$ be the matrix of partial\n", 16 | " derivatives of f with respect to the elements of A. The following hold:\n", 17 | "

    \n", 18 | "
  1. tr AB = tr BA
  2. $\bigtriangledown tr AB = B^T$
  3. $tr A = tr A^T$
  4. $\bigtriangledown tr ABA^TC = CAB + C^TAB^T$
\n", 24 | "

" 25 | ] 26 | }, 27 | { 28 | "cell_type": "markdown", 29 | "metadata": {}, 30 | "source": [ 31 | "

Transpose Properties

\n", 32 | "
    \n", 33 | "
  1. $(A+B)^T= A^T+ B^T$
  2. $(AB)^T = B^TA^T$
  3. $det(A^T) = det(A)$
  4. $(A^T)^{-1}=(A^{-1})^T$
" 38 | ] 39 | }, 40 | { 41 | "cell_type": "markdown", 42 | "metadata": {}, 43 | "source": [ 44 | "

Inverse Properties

\n", 45 | "\n", 46 | "1. $(AB)^{-1} = B^{-1}A^{-1}$" 47 | ] 48 | }, 49 | { 50 | "cell_type": "markdown", 51 | "metadata": {}, 52 | "source": [ 53 | "

Matrix Derivatives

\n", 54 | "\n", 55 | "1. $\\frac{\\partial}{\\partial \\mathbf{x}} (\\mathbf{x}^T \\mathbf{y}) = \\frac{\\partial}{\\partial \\mathbf{x}} (\\mathbf{y}^T \\mathbf{x}) = \\mathbf{y}$" 56 | ] 57 | }, 58 | { 59 | "cell_type": "code", 60 | "collapsed": false, 61 | "input": [], 62 | "language": "python", 63 | "metadata": {}, 64 | "outputs": [] 65 | } 66 | ], 67 | "metadata": {} 68 | } 69 | ] 70 | } -------------------------------------------------------------------------------- /pysrc/neural_networks.py: -------------------------------------------------------------------------------- 1 | import networkx as nx 2 | 3 | def forward_prop(DG, clear_current = True): 4 | '''Computes the feed-forward network output for every node in the given a directed graph DG. 5 | INPUT: 6 | DG - a directed graph 7 | The nodes of DG must contain the following attributes: 8 | af : the node's activation function 9 | if the node is a bias node or an input node, this attribute should just be a constant value 10 | o : the output attribute of the node, should be set to None. if clear_current is set to True this 11 | attribute will be added automatically 12 | The edges should be weighted edges, with the following attributes: 13 | w : weight 14 | clear_current - if True, the algorithm will first traverse DG to clear each node's 'o' 15 | attribute. This is useful for algorithms requiring multiple propagations through 16 | DG 17 | OUTPUT: 18 | Every node in DG will have the attribute 'o' added representing that node's output value 19 | ''' 20 | if clear_current: clear_output(DG) 21 | for n in DG.nodes(): 22 | if len(DG.successors(n)) == 0: #this is an output node 23 | computeNodeOutput(DG,n) 24 | 25 | def error_back_prop(DG, clear_current = True): 26 | '''Computes the error term for each node in a feed-forward network, using back 27 | propagation, associated with some input vector x and the associated target value t. 28 | It is assumed that the output values for each node, using the input x, have already 29 | been computed and are stored in an attribute labeled 'o' for each node. 30 | INPUT: 31 | DG - a directed graph 32 | The nodes of DG must contain the following attributes: 33 | daf : the derivative of the node's activation function, which is itself a function 34 | of a single input variable, namely a_j=sum_i[w_ji z_i] where i indexes over 35 | nodes that send input to node j. For output nodes: 36 | 1. daf should account for the target value t. 37 | 2. will typically have the canonical link function as an activation function so that 38 | daf will by a_j - t_j. TyNone, it is assumed (see Bishop page 243) 39 | o : the output attribute of the node, should be set based on applying forward_prop for 40 | the input x associated with t. 41 | The edges should be weighted edges, with the following attributes: 42 | w : weight 43 | clear_current - if True, the algorithm will first traverse DG to clear each node's 'e' 44 | attribute. This is useful for algorithms requiring multiple propagations through 45 | DG 46 | OUTPUT: 47 | Every node in DG will have the attribute 'e' added representing the error term at that node 48 | computed as daf(a_j)*sum_k(w_kj error_k) where k indexes over nodes that receive input from 49 | node j. 
For output nodes this sum is taken as 1, so that the error term is just daf(a_j) 50 | ''' 51 | if clear_current: clear_errors(DG) 52 | for n in DG.nodes(): 53 | if len(DG.predecessors(n)) == 0: #this is an input or bias node 54 | computeNodeError(DG,n) 55 | 56 | def computeNodeError(DG, node): 57 | if DG.node[node]['e']: #this node's error has already been computed 58 | return DG.node[node]['e'] 59 | else: #need to compute error 60 | sucs = DG.successors(node) 61 | preds = DG.predecessors(node) 62 | ua = reduce(lambda wsum, pred: wsum + DG.edge[pred][node]['weight']*DG.node[pred]['o'], 63 | preds, 64 | 0) 65 | if len(sucs) > 0: #this is not an output node 66 | sk = reduce(lambda wsum, suc: wsum + DG.edge[node][suc]['weight']*computeNodeError(DG,suc), 67 | sucs, 68 | 0) 69 | else: #this is an output node 70 | sk = 1.0 71 | e = DG.node[node]['daf'](ua) * sk 72 | DG.node[node]['e'] = e 73 | return e 74 | 75 | 76 | def clear_errors(DG): 77 | for key in DG.nodes(): 78 | DG.node[key]['e'] = None 79 | 80 | def clear_output(DG): 81 | for key in DG.nodes(): 82 | DG.node[key]['o'] = None 83 | 84 | def computeNodeOutput(DG, node): 85 | if DG.node[node]['o']: #this node's output has already been computed 86 | return DG.node[node]['o'] 87 | else: #need to compute output 88 | #compute unit activation 89 | preds = DG.predecessors(node) 90 | if len(preds) > 0: 91 | ua = reduce(lambda wsum, pred: wsum + DG.edge[pred][node]['weight']*computeNodeOutput(DG,pred), 92 | preds, 93 | 0) 94 | o = DG.node[node]['af'](ua) 95 | else: #this node is a bias term and has no input 96 | o = DG.node[node]['af'] 97 | DG.node[node]['o'] = o 98 | return o 99 | 100 | -------------------------------------------------------------------------------- /Performance_Prediction.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "metadata": { 3 | "name": "Performance_Prediction" 4 | }, 5 | "nbformat": 3, 6 | "nbformat_minor": 0, 7 | "worksheets": [ 8 | { 9 | "cells": [ 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "

Performance Prediction: How Well Does The Model Work?

\n", 15 | "This chapter describes some of the many performance prediction methods that can be applied to machine learning and other data minining models. " 16 | ] 17 | }, 18 | { 19 | "cell_type": "markdown", 20 | "metadata": {}, 21 | "source": [ 22 | "

Selecting Test Data

\n", 23 | "Let's assume we have chosen a performance metric that we will used to determine how well a given model performs. It is necessary to select a set of *test* data used as model input for measuring the selected metric. The *test* data should **not** contain any of the data used in training or optimizing the model.\n", 24 | "

\"Big Data\"

\n", 25 | "If the available data set is large enough, it can be divided into three sets:\n", 26 | "\n", 27 | "* training data - used by one or more learning schemes\n", 28 | "\n", 29 | "* validation data - used to optimize parameters\n", 30 | "\n", 31 | "* test data - used to calculate the performance metric\n", 32 | "\n", 33 | "How large is \"large enough\"? Ideally, the training and test data sets should be *representative* of the data that will be observed in practice. This means that the data should span the possible cases and distribution of cases should be the same as that observed in practice. In general, it is impossible to know that the data is representative. If one assumes the data set is representative, the requirement for selecting training and test data is that it is *stratified*, that is that the distribution of cases is the same between the test and training data." 34 | ] 35 | }, 36 | { 37 | "cell_type": "markdown", 38 | "metadata": {}, 39 | "source": [ 40 | "

Cross Validation - \"Normal Data\"

\n", 41 | "Most problems of interest have limited data available for training and testing. In such cases, it is not desirable or feasible to divide the data into training and testing sets. The *standard* procedure in these cases is to use *stratified tenfold cross-validation performed ten times*. A single stratified *N* fold cross-validation involves first splitting the data into N approximately equal stratified sets. Then N-1 sets of the data are combined and used for training and the Nth data set is used for performance prediction. This is done in turn, so that an error estimate is yielded for each of the N data sets. The final error estimate is the average of the N estimates. In tenfold cross-validation, N is equal to 10. The standard method is to repeat the tenfold cross-validation ten times, that is divide the data into 10 sets 10 times, running the cross-validation each time. This helps minimize error prediction variation resulting from the random selection of the N=10 data folds. \n", 42 | "\n", 43 | "

Other Methods

\n", 44 | "Other testing methods include *Leave-One-Out Cross Validation*, essentially this n-fold cross-validation where n is the size of the complete data set, and *0.632 Bootstrap* validation, whihc uses sampling with replacement to form the training data set." 45 | ] 46 | }, 47 | { 48 | "cell_type": "markdown", 49 | "metadata": {}, 50 | "source": [ 51 | "

Performance Metrics

\n", 52 | "There are a variety of metrics that can be used to predict how well a given learning scheme will perform. The appropriate measure, naturally, depends on the problem.\n", 53 | "

Classification Success

\n", 54 | "If the problem is a classification problem, then one possible metric is the percentage of inputs correctly classified. See page 151 of [Data Mining](http://www.amazon.com/Data-Mining-Practical-Techniques-Management/dp/0123748569/ref=sr_1_1?s=books&ie=UTF8&qid=1356718659&sr=1-1&keywords=data+mining) for confidence interval discussion." 55 | ] 56 | }, 57 | { 58 | "cell_type": "markdown", 59 | "metadata": {}, 60 | "source": [ 61 | "

Predicting Probabilities

\n", 62 | "Often the outcome of a learning scheme is a probability measure. Here we consider the case where the learning scheme assigns a probability to *K* possible outcomes for a given instance. Assume these are output as a probability vector $\\mathbf{p} = \\left(p_1, p_2, \\ldots, p_k \\right)\\hspace{2pt}$. Express the actual class for the instance as a vector $\\mathbf{a} = \\left(a_1, a_2, \\ldots, a_k \\right) \\hspace{3pt}$ where $a_i$ equals 1 if *i* is the class the instance actually belongs to and 0 otherwise. A performance metric that may apply in such situations is the use of a loss function that is calculated for a given instance. If the test set contains several instances the loss function is summed over all of them. Two are described here.\n", 63 | "\n", 64 | "

Quadratic Loss Function

\n", 65 | "$$\\sum_{j=1}^K \\left(p_j-a_j\\right)^2$$\n", 66 | "\n", 67 | "\n", 68 | "

Informational Loss Function

\n", 69 | "$$-\\log_2 p_i$$\n", 70 | "\n", 71 | "where *i* is the actual class for the instance. This function represents the information (in bits) required to express the actual class *i* with respect to the probability distribution $\\mathbf{p}$, i.e. if one has the knowledge of the distribution, this is the number of bits required to communicate a specific class if done optimally. " 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "collapsed": false, 77 | "input": [], 78 | "language": "python", 79 | "metadata": {}, 80 | "outputs": [] 81 | } 82 | ], 83 | "metadata": {} 84 | } 85 | ] 86 | } -------------------------------------------------------------------------------- /pysrc/decision_trees.py: -------------------------------------------------------------------------------- 1 | import networkx as nx 2 | import math 3 | import matplotlib.pyplot as plt 4 | 5 | def pest(outputs, k): 6 | '''Calculates the probability estimate, p, that a given data point in the region defined 7 | by the values in outputs will be in class k 8 | INPUT: 9 | outputs - 1D collection of integer values, presumably the outcome class labels for some region 10 | of input space 11 | k - an integer corresponding to a class label 12 | OUTPUT: 13 | p - a float estimate of the probability for class k 14 | ''' 15 | nm = len(outputs) #number of data values in this region 16 | p = 0.0 17 | for o in outputs: 18 | if o == k: p += 1 19 | return p/nm 20 | 21 | def compute_class(data_row_indicies, outputs, clazz): 22 | k = None 23 | max_p = -1 24 | out = [outputs[index] for index in data_row_indicies] 25 | for c in clazz: 26 | temp = pest(out, c) 27 | if temp > max_p: 28 | max_p = temp 29 | k = c 30 | return k 31 | 32 | 33 | def __assign_class(tree, node, clazz, outputs): 34 | data_row_indicies = tree.node[node]['data'] 35 | return compute_class(data_row_indicies, outputs, clazz) 36 | 37 | def cross_entropy(outputs, clazz): 38 | '''Calculates the cross entropy for the region defined by the values in outputs 39 | INPUT: 40 | outputs - 1D collection of integer values, presumably the outcome class labels for some region 41 | of input space 42 | clazz - 1D collection of integers corresponding to the class labels 43 | OUTPUT: 44 | ce - a float estimate of the cross entropy 45 | ''' 46 | def ent(k): 47 | p = pest(outputs,k) 48 | if p==0: return 0 49 | else: return p * math.log(p) 50 | return -1.0 * reduce(lambda accum, k: accum+ ent(k), clazz, 0) 51 | 52 | def gini_index(outputs, clazz): 53 | def gini(k): 54 | p = pest(outputs,k) 55 | return p - p*p 56 | return reduce(lambda accum, k: accum + gini(k), clazz, 0) 57 | 58 | def __initiate_node(tree, node): 59 | tree.add_node(node) 60 | keys = ['j','i','s','data'] 61 | for k in keys: tree.node[node][k] = None 62 | 63 | def __grow_path(tree, coords, input, output, clazz, split_func, max_rm, meta, data_row_indicies, parent_node = None, weight = 0): 64 | '''INPUT: 65 | tree - current digraph 66 | node - parent node 67 | input - input data 68 | output - output data set 69 | clazz - integer class labels 70 | split_func - splitting function Qm 71 | max_rm - stopping criteria 72 | meta - input space labels 73 | ''' 74 | 75 | if len(data_row_indicies)<=max_rm: #add a terminal node 76 | k = compute_class(data_row_indicies, output, clazz) 77 | label = '{0}, C={1}'.format(coords,k) 78 | __initiate_node(tree, label) 79 | tree.node[label]['c'] = coords 80 | tree.node[label]['o'] = k 81 | if parent_node: tree.add_weighted_edges_from([(parent_node, label, weight)]) 82 | return #this path is complete 83 
| temp1 = 0 84 | temp2 = 0 85 | min_balance = float("inf") 86 | min_split = float("inf") 87 | split_j = None 88 | split_i = None 89 | for idx in data_row_indicies: #loop over data 90 | out1 = [] 91 | out2 = [] 92 | for col in range(input.shape[1]): #loop over feature space 93 | split_value = input[idx][col] 94 | out1 = [output[index] for index in data_row_indicies if input[index][col] <= split_value] 95 | out2 = [output[index] for index in data_row_indicies if input[index][col] > split_value] 96 | if len(out1)>0 and len(out2)>0: 97 | temp1 = split_func(out1, clazz) + split_func(out2, clazz) 98 | temp2 = math.fabs(len(out1)-len(out2)) 99 | if temp1 <= min_split and temp2 <= min_balance: 100 | split_j = col 101 | split_i = idx 102 | min_split = temp1 103 | min_balance = temp2 104 | 105 | #create the node for this split 106 | split_value = input[split_i][split_j] 107 | label = '{0}<={1},{2}'.format(meta[split_j],split_value, coords) 108 | tree.add_node(label) 109 | tree.node[label]['j'] = split_j 110 | tree.node[label]['i'] = split_i 111 | tree.node[label]['s'] = split_value 112 | tree.node[label]['c'] = coords 113 | tree.node[label]['data'] = data_row_indicies 114 | tree.node[label]['o'] = __assign_class(tree, label, clazz, output) 115 | if parent_node: tree.add_weighted_edges_from([(parent_node, label, weight)]) 116 | 117 | #grow paths for split 118 | left_data = [index for index in data_row_indicies if input[index][split_j] <= split_value] 119 | right_data = [index for index in data_row_indicies if input[index][split_j] > split_value] 120 | l,k = [int(x) for x in coords.split(',')] #level and column for this node 121 | left_coords = '{0},{1}'.format(str(l+1),str(2*k)) 122 | right_coords = '{0},{1}'.format(str(l+1),str(2*k+1)) 123 | __grow_path(tree, left_coords, input, output, clazz, split_func, max_rm, meta, left_data, label, 0) 124 | __grow_path(tree, right_coords, input, output, clazz, split_func, max_rm, meta, right_data, label, 1) 125 | 126 | def build_tree(input, output, clazz, meta, split_func=cross_entropy, max_rm=5): 127 | '''Computes a classification decision tree given the training data and splitting function 128 | INPUT: 129 | input - a numpy array of the input data, each row should correspond to a single input 130 | vector, i.e. columns represent a feature or dimension of the input space. 131 | output - the 1D output values corresponding to the input data 132 | clazz - 1D collection of integers corresponding to the class labels 133 | meta - 1D collection of labels for feature space 134 | split_func - function used as splitting criteria 135 | max_rm - the maximum number of input data points allowed to be associated with a 136 | terminal node. This is used as the stopping criteria for growing the tree 137 | OUTPUT: 138 | tree - A networkx DiGraph (binary, exactly two edges from each node except for terminal 139 | nodes, no loops). 140 | Nodes are named with the convention N,M where N is the row level and M is the node count 141 | Each node is assigned: 142 | j - the column number of the feature to be split on (0 based). 143 | i - the index of output to split on 144 | s - the value to split on (output[i], provided for convenience) 145 | o - the class assigned to inputs in this region 146 | data - a list of the row indicies from the training input that are in the region defined 147 | by the node 148 | c - the coordinates for this node 149 | Each edge from a given node is assigned a weight of 0 or 1. 
Edges with 0 weight 150 | define the path to the next node for which values the j feature value in "data" 151 | is less or equal to s and those with weight 1 are for those greater than s. 152 | ''' 153 | tree = nx.DiGraph() 154 | __grow_path(tree, '0,0', input, output, clazz, split_func, max_rm, meta, range(len(output))) 155 | return tree 156 | 157 | def draw_tree(tree, min_delta=100, node_size=500): 158 | pos = {} 159 | max_level = 0 160 | for n in tree.nodes(): 161 | temp = int(tree.node[n]['c'].split(',')[0]) 162 | if temp > max_level: max_level = temp 163 | 164 | for node in tree.nodes(): 165 | l,k = [int(x) for x in tree.node[node]['c'].split(',')] 166 | ck = (2**l - 1) / 2.0 167 | delta = 2**(max_level - l) * min_delta 168 | pos[node]=((k-ck)*delta, (max_level-l)*min_delta) 169 | 170 | nx.draw_networkx_nodes(tree,pos,node_size=node_size) 171 | nx.draw_networkx_edges(tree,pos,alpha=0.5,width=2) 172 | nx.draw_networkx_labels(tree,pos) 173 | plt.axis('off') 174 | return plt 175 | 176 | def find_node_by_coords(tree, coords): 177 | for n in tree.nodes(): 178 | if tree.node[n]['c']==coords: return n 179 | 180 | def decide(tree, input, coords='0,0'): 181 | node = find_node_by_coords(tree, coords) 182 | if tree.node[node]['s']: 183 | col = tree.node[node]['j'] 184 | l,k = [int(x) for x in coords.split(',')] 185 | split_val = tree.node[node]['s'] 186 | v = input[col] 187 | if v <= split_val: return decide(tree, input, '{0},{1}'.format(str(l+1),str(2*k))) 188 | else: return decide(tree, input, '{0},{1}'.format(str(l+1),str(2*k + 1))) 189 | else: return tree.node[node]['o'] -------------------------------------------------------------------------------- /06_Graphical_Models.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "metadata": { 3 | "name": "06_Graphical_Models" 4 | }, 5 | "nbformat": 3, 6 | "nbformat_minor": 0, 7 | "worksheets": [ 8 | { 9 | "cells": [ 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "

Graphical Models

\n", 15 | "Probabilistic graph models provide an intuitive way to express joint probability relationships. The nodes in a graph represent random variables and the edges between nodes represent a probabilistic relationship between variables. \n", 16 | "\n", 17 | "

Bayesian Networks

\n", 18 | "Here we consider **acyclic directed** graphical models that express the joint probability relationship between some set of variables, known as *Bayesian Networks* because of the prominant use of Bayes' Theorem. To begin, consider the joint probability distribution $p\\left( x_1, \\ldots, x_K \\right)$ over $K$ variables. Using repeated application of the product rule, this can be expressed as \n", 19 | "\n", 20 | "$$p \\left(x_1, \\ldots, x_K \\right) = p \\left(x_K \\mid x_1, \\ldots, x_{K-1} \\right) \\ldots p\\left(x_2 \\mid x_1 \\right) p\\left(x_1\\right) $$\n", 21 | "\n", 22 | "This can be represented graphically as a **fully connected** directed graph having $K$ nodes, with each node having **incoming** links from all **lower** numbered nodes. \n", 23 | "\n", 24 | "Now consider the general case where the graph is **not necessarily** fully connected. Let the a given node $x_k$ have **incoming** links from the set of nodes $pa_k$ known as the *parent nodes* of $x_k$. Then the joint distribution defined by the directed acyclic graph over all nodes is given by the product of a conditional distribution for *each node conditioned on its parent nodes*. A graph with $K$ nodes has the joint distribution defined as\n", 25 | "\n", 26 | "$$ p\\left(\\mathbf{x}\\right) = \\prod_{k=1}^K p\\left(x_k \\mid pa_k \\right) $$\n", 27 | "\n", 28 | "

Graphical Notation

\n", 29 | "There are several conventions used to convey information in graphical models. They are briefly summarized here - see Bishop 363-365 for more detail.\n", 30 | "\n", 31 | "+ Represent multiple nodes of the form $t_1, \\ldots, t_N$ as a single node with a box, known as a *plate*, drawn around it and annotated with $N$, the number of repeated nodes.\n", 32 | "\n", 33 | "+ Deterministic model parameters, e.g. the mean of a distribution, are drawn as solid circles, and random variables as open circles\n", 34 | "\n", 35 | "+ Observed variables are drawn as shaded open circles\n", 36 | "\n" 37 | ] 38 | }, 39 | { 40 | "cell_type": "markdown", 41 | "metadata": {}, 42 | "source": [ 43 | "

Conditional Independence

\n", 44 | "Consider three random variables $a,b,c$, we use the following notation to indicate that $a$ is *conditionally idependent* of $b$ given $c$\n", 45 | "\n", 46 | "$$a \\perp\\\\!\\\\!\\\\!\\perp b \\mid c$$\n", 47 | "\n", 48 | "When this condition holds, using the product rule, we may write the joint distribution of $a$ conditioned on $b$ and $c$ as\n", 49 | "\n", 50 | "$$p\\left(a, b \\mid c \\right) = p\\left(a\\mid b,c\\right)p\\left(b\\mid c\\right) = p\\left(a\\mid c\\right) p\\left(b \\mid c\\right)$$\n", 51 | "\n", 52 | "

D-separation

\n", 53 | "Given the description of conditional independence above, we now consider a general directed acyclic graph. Let $A$, $B$, and $C$ be arbitrary sets of *nonintersecting* nodes. We wish to determine if the graph structure implies the conditional independence $A \\perp\\\\!\\\\!\\\\!\\perp B \\mid C$. It turns out this conidition is satisfied if **all** possible paths from **any** node in $A$ to **any** node in $B$ are *blocked*, in which case $A$ is said to be *d-separated* from $B$. A path is considered blocked if either of the following hold:\n", 54 | "\n", 55 | "**(a)** There is a node, $n \\in C$, in the path such that the arrows on the path meet head-to-tail or tail-to-tail at n\n", 56 | "\n", 57 | "**(b)** There is a node, $n \\notin C$ with none of its descendants in $C$, in the path such that the arrows meet head-to-head at the node. A node $d$ is considered to a descendant of a node $p$ if there is a path from $p$ to $d$ in which each path step is in the direction of the arrows connecting the nodes on th path.\n", 58 | "\n", 59 | "Note that model parameters (e.g. the mean of a distribution) will always be tail-to-tail and therefore play no role in determining d-separation. " 60 | ] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": {}, 65 | "source": [ 66 | "-------------------------------------------\n", 67 | "\n", 68 | "

Training a Bayes Net

\n", 69 | "MLE for training a Bayes Net, estimating the theta parameters for the nodes, is the equivalent to maximizing the likelihood of the data. So for each node, probability of the node being equal to $s$ be conditioned on say $f$ and $a$, then we have - TODO - generalize this to N conditional quantities\n", 70 | "\n", 71 | "$$\\theta_{s|ij} = \\frac{\\sum_{k=1}^K \\delta \\left(f_k = i, a_k =j, s_k =1 \\right)}{\\sum_{k=1}^K \\delta \\left(f_k=i, a_k=j \\right)}$$\n", 72 | "\n", 73 | "where\n", 74 | "$\\delta(x) = 1$ \n", 75 | "\n", 76 | "if $x$ is true, 0 otherwise" 77 | ] 78 | }, 79 | { 80 | "cell_type": "markdown", 81 | "metadata": {}, 82 | "source": [ 83 | "

Partially Observed Data

\n", 84 | "Can't use the MLE approach above due to lack of data\n", 85 | "Let $X$ be all *observed* variables values\n", 86 | "Let $Z$ be all *unobserved* variables\n", 87 | "Assume we always observe/unobserve the same variables - it is possible to generalize this\n", 88 | "\n", 89 | "Approach:\n", 90 | "\n", 91 | "$ argmax_{\\theta} E_{P(Z|X,\\theta)}\\left[log P(x,z|\\theta) \\right]$\n", 92 | "\n", 93 | "using some probability distribution on $Z$, namely, $P(Z|X,\\theta)$ - do we need a model for this distribution?\n", 94 | "\n" 95 | ] 96 | }, 97 | { 98 | "cell_type": "markdown", 99 | "metadata": {}, 100 | "source": [ 101 | "

Expectation Maximization (EM) Algorithm

\n", 102 | "guaranteed to find local maximum of expected log likelihood above\n", 103 | "\n", 104 | "$ argmax_{\\theta} E_{P(Z|X,\\theta)}\\left[log P(x,z|\\theta) \\right]$\n", 105 | "\n", 106 | "see slide at 29:59\n", 107 | "\n", 108 | "EM - general procedure for learning from partly observed data. Given observed varialbes, $X$, and unobserved variables, $Z$, define\n", 109 | "\n", 110 | "$Q\\left(\\theta' | \\theta \\right) = E_{E_{P(Z|X,\\theta)}} \\left[\\log P(X,Z | \\theta') \\right]$\n", 111 | "\n", 112 | "Iterate until convergence:\n", 113 | "\n", 114 | "* E Step: Use $X$ and current $\\theta$ to calculate $P(Z|X,\\theta)$ - done for every variable in $Z$ for each training example, \n", 115 | "\n", 116 | "* M Step: Replace current $\\theta$ by \n", 117 | "\n", 118 | "$\\theta \\leftarrow arg max_{\\theta'} Q $\n", 119 | "\n", 120 | "using step 1, plug into into last equation on slide 29:59 and pick max theta\n", 121 | "\n", 122 | "Example: see slide 38:59, 50:59\n", 123 | "\n", 124 | "More generally: Given observed variables $X$ and unobserved variables $Z$ all of which are boolean:\n", 125 | "\n", 126 | "* E Step: Calculate for each training example, $k$, the expected value of each unobserved variable in $Z$\n", 127 | "\n", 128 | "* M Step: Calculate estimates similar to MLE, but replacing each count (i.e. observed data proportions) by its expected count\n", 129 | "\n", 130 | "$\\delta(Y=1) \\rightarrow E_{Z|X,\\theta}[Y]$\n", 131 | "\n", 132 | "$\\delta(Y=0) \\rightarrow 1 - \\delta(Y=1)$\n", 133 | "\n" 134 | ] 135 | }, 136 | { 137 | "cell_type": "markdown", 138 | "metadata": {}, 139 | "source": [ 140 | "

Example: Linear Gaussian Model

\n", 141 | "*TODO* A useful example may be a Linear-Gaussian model - see Bishop pages 370-372 - " 142 | ] 143 | } 144 | ], 145 | "metadata": {} 146 | } 147 | ] 148 | } -------------------------------------------------------------------------------- /01_Introduction.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "metadata": { 3 | "name": "01_Introduction" 4 | }, 5 | "nbformat": 3, 6 | "nbformat_minor": 0, 7 | "worksheets": [ 8 | { 9 | "cells": [ 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "

Machine Learning Defined

\n", 15 | "

\n", 16 | "

  1. field of study that gives computers the ability to learn without being explicitly programmed : Arthur Samuel 1959
  2. a computer program is said to learn from experience E with respect to some task T and some performance measure P, if its performance on T, as measured by P, improves with E : Tom Mitchell 1998
\n", 21 | "

\n", 22 | "**Supervised Learning** : training data includes the \u201ccorrect\u201d answer\n", 23 | " \n", 26 | "\n", 27 | "**Unsupervised Learning** : problems where the algorithm is given a data set without any \u201cright answers\u201d. The objective is to find some underlying structure in the data, e.g. clustering\n", 28 | "

\n", 29 | "**Reinforcement Learning** : problems where a sequence of decisions are made as opposed to a single decision (or prediction) \n", 30 | "

\n", 31 | "**Learning Theory** : study of how and why (mathematically) a learning algorithm works" 32 | ] 33 | }, 34 | { 35 | "cell_type": "markdown", 36 | "metadata": {}, 37 | "source": [ 38 | "

Introduction

\n", 39 | "The material presented will include both the *Bayesian* and *frequentist* approaches. The labels, or even their *\"proper\"* use is, in practical terms, of little consequence. \n", 40 | "The important considerations are, as in any modeling, what assumptions are being made to create a given mathematical model and how do those \n", 41 | "assumptions affect model quality. However, in an effort to be consistent with the literature, I will attempt to use these terms where appropriate. \n", 42 | "With respect to Bayesian modeling vs. frequentist modeling the most fundamental difference is how data is viewed. In simplistic terms, the \n", 43 | "Bayesian approach asserts that their is some prior knowledge about the distribution of the data and model parameters. This prior knowledge alone could be used to make \n", 44 | "predictions about the future observation - however poor they may be. Once data is observed, it is used to *modulate* the prior assumptions.\n", 45 | "The frenquentist approach tends to ignore any such prior knowledge - relying more heavily, or even completely, on the observed data. Both approaches, of course, have\n", 46 | "adavantages and disadvantages that I will try to point out along the way.\n", 47 | "\n", 48 | "

A Little Probability

\n", 49 | "Assume we have a random variable (RV) $X$. In the case where $X$ is a discrete random variable, the probability that $X$ takes some particular value $x$ is $p(X=x)=p_X(x)$ where\n", 50 | "$p_X$ is the *probability distribution function* of the RV $X$. For simplicity, we will usually drop the $X$ subscript when using the probability distribution function, i.e.\n", 51 | "$p(x)=p_X(x)$ and $p(x,y)=p_{X,Y}(x,y)$\n", 52 | "Now assume we have a second RV, $Y$. The probability that $X$ and $Y$ together take some particular value $(x,y)$ is the *joint probability* $p_{X,Y}(x,y)=p(X=x, Y=y)$. The\n", 53 | "joint probability can be defined in terms of conditional and marginal probabilities as follows:\n", 54 | "

\n", 55 | "**product rule** $p(X,Y) = p_Y\\left(Y|X\\right)p_X\\left(X\\right) = p(Y,X) = p_X\\left(X|Y\\right)p_Y\\left(Y\\right)$\n", 56 | "

\n", 57 | "**sum rule** $p(X)=\\sum_Yp\\left(X,Y\\right)$\n", 58 | "

\n", 59 | "where the *conditional probability*, $p_X(X|Y)$ , is the probability distribution of X given that Y has taken some specific value and $p_X(x)$ is the *marginal probability*. Note\n", 60 | "that the marginal probability is simply the probability distribution of $X$ as described above, but when considered as part of a larger set of RVs, as in $(X,Y)$ in this case, the \n", 61 | "probability distribution of an isolated RV is referred to as the marginal probability. If $X$ and $Y$ are independent events, then \n", 62 | "$p(X,Y)=p(X)p(Y)$. \n", 63 | "

\n", 64 | "If $X$ is a continuous RV, then the probability distribution gives way to a *probability density function*, $p_X(x)$. What's the difference? In the case of a discrete RV, we think of $p_X(x)$\n", 65 | "as being a finite value in $[0,1]$ representing the probability that $X=x$. In the case of a continuous RV, we can only think of the probability that $X$ is in some range $(a,b)$ defined by\n", 66 | "

\n", 67 | "$P(X\\in (a,b)) = \\int_a^b p_X(x)dx$\n", 68 | "

\n", 69 | "From this definition, one sees that the $P(X=a)=\\int_a^a p_X(x)dx = 0$. The *product* rule remains the same rules for continuous RVs, however the *sum* rule becomes:

\n", 70 | "**sum rule** $p(X) = \\int p_{X,Y}(x,y) dy = \\int p_X(X|Y=y) p_Y(y) dy = E_Y[p(X|Y)]$

\n", 71 | "where $E_{\\Phi}[X]$ is the **expected value** of $X$ under the distribution $\\Phi$. Normally, $\\Phi$ is the marginal distribution of $X$ so that $E_X[X] = \\int x p_X(x) dx$ for the RV $X$. \n", 72 | "\n", 73 | "The following definition of a **compound distribution** will also be useful. Let $t$ be a RV with distribution $F$ paramaterized by $\\mathbf{w}$ and let $\\mathbf{w}$ be a RV distributed by $G$ \n", 74 | "parameterized by $\\mathbf{t}$, then the compound distribution $H$ parameterized by $\\mathbf{t}$ for the random variable $t$ is defined by:

\n", 75 | "$p_H(t|\\mathbf{t}) = \\int_{\\mathbf{w}} P_F(t|\\mathbf{w}) P_G(\\mathbf{w}|\\mathbf{t})d\\mathbf{w}$

\n", 76 | "\n", 77 | "

Bayesian Modeling

\n", 78 | "Using the probability rules above, it is possible to obtain the following, known as *Bayes' theorem*,

\n", 79 | "$p(Y|X) = \\frac{p\\left(X|Y\\right)p\\left(Y\\right)}{p\\left(X\\right)}$\n", 80 | "

\n", 81 | "Bayes' theorem will appear repeatadly in the discussion of machine learning.\n", 82 | "Not surprisingly, Bayes' theorem plays a fundamental role in Bayesian modelling. Assume, that we model some process where the model has \n", 83 | "free parameters contained in the vector $\\mathbf{w}$. Now assume that we have some notion of the probability distribution of these parameters, $p(\\mathbf{w})$,\n", 84 | "called the *prior*. That is, we assume that **any** set of values from some space (e.g. the real numbers) is a possible best choice \n", 85 | "for $\\mathbf{w}$ with some probability $p(\\mathbf{w})$. Finally, assume that we observe a set of data, $\\mathbf{D}$, for the output we are attempting to predict with our model. Our objective is to find\n", 86 | "the *best* set of parameters $\\mathbf{w}$ given the observed data. How we choose the *best* set is the challenge and is where Bayesian modeling\n", 87 | "departs from frequentist modeling. In the Bayesian approach we attempt to maximize the *the probability of the parameters given the data*, \n", 88 | "$p(\\mathbf{w}|D)$, known as the *posterior*. Using Bayes' theorem we can express the *posterior* as

\n", 89 | "\n", 90 | "$p(\\mathbf{w}|D) = \\frac{p(D|\\mathbf{w})p(\\mathbf{w})}{p(D)}$

\n", 91 | "\n", 92 | "In order to apply a fully Bayesian approach, we must formulate models for both the *prior*, $p(\\mathbf{w})$, and the *likelihood function*, $p(D|\\mathbf{w})$. Given\n", 93 | "these models and a set of data we can compute appropriate values for our free parameter vector $\\mathbf{w}$ by maximizing \n", 94 | "$p(\\mathbf{w}|D) \\propto p(D|\\mathbf{w})p(\\mathbf{w})$. How does this differ from frequentist modeling?\n", 95 | "The frequentist approach, or *maximum likelihood* approach, ignores the formulation of a *prior*, and goes directly to maximizing the likelihood function \n", 96 | "to find the model parameters. Thus, the frequentist approach can be described as *maximizing the probability of the data given the parameters*. Under certain\n", 97 | "conditions the results of Bayesian and frequentist modeling will conincide, but this is not true in general. \n", 98 | "\n", 99 | "One could obtain a point estimate for $\\mathbf{w}$ by maximizing the *posterior probability* model, but this not typical. Instead a *predictive distribution* of the value of the target\n", 100 | "variable, $t$, is formed based on the compound distribution definition provided above. Taking the mean of this distribution provides a point estimate of $t$ while distribution itself provides\n", 101 | "a measure of the uncertainty in the estimate, say by considering the standard deviation.\n", 102 | "\n", 103 | "**TODO:** Add a simple example illustrating the difference. For now, a good illustration is available \n", 104 | "here" 105 | ] 106 | }, 107 | { 108 | "cell_type": "markdown", 109 | "metadata": {}, 110 | "source": [ 111 | "

Maximum Likelihood

\n", 112 | "A maximum likelihood approach does not attempt to formulate models for the parameter *priors*, $p(\\mathbf{w})$, or the parameter *posterior probabilities*, $p(\\mathbf{w}|D)$. Rather it views the data, $D$, as fixed and \n", 113 | "attempts to determine the model parameters, $\\mathbf{w}$, by maximizing the likelihood function. As we will frequently use this approach, it is useful to understand the likelihood function. \n", 114 | "\n", 115 | "We assume we have specified a probability density model, $p_{\\mathbf{w}}(d)$ for the observed data elements, ${d \\in D}$ that is parameterized by $\\mathbf{w}$, i.e. $p$ is a parametric model for the distribution of $D$. As\n", 116 | "an example, if $D$ ahs a normal distribution with mean $\\mu$ and variance $\\sigma^2$, then

\n", 117 | "$\\mathbf{w} = (\\mu, \\sigma^2)$

\n", 118 | "and

\n", 119 | "$p_{\\mathbf{w}}(d) = \\frac{1}{\\sqrt{2 \\pi} \\sigma} e^{-(d-\\mu)^2/2\\sigma^2}$

\n", 120 | "The likelihood function, regardless of our choice of model $p$, is defined by

\n", 121 | "$L(\\mathbf{w}; D) = \\prod_{i=1}^N p_{\\mathbf{w}}(d_i)$

\n", 122 | "where $N$ is the number of elements in $D$. Thus the likelihood function is simply the product of the probability of all the individual data points, $d_i \\in D$, under the probability model, $p_{\\mathbf{w}}$. Note that this\n", 123 | "definition implicitly assumes these data points are independent events. \n", 124 | "\n", 125 | "Out of mathematical convenience, we will most often work with the *log-likelihood* function (which turns the product into a sum by properties of the log function), i.e. the logarithm of $L(\\mathbf{w}; D)$, defined as

\n", 126 | "$l(\\mathbf{w};D) = \\sum_{i=1}^N l(\\mathbf{w};d_i) = \\sum_{i=1}^N \\log p_{\\mathbf{w}}(d_i)$

\n", 127 | "where we recall that $\\log(ab) = \\log(a) + \\log(b)$. \n", 128 | "\n", 129 | "The method of maximum likelihood chooses the value $\\mathbf{w} = \\widehat{\\mathbf{w}}$ that maximizes the *log-likelihood* function. We will also often work with an **error function**, $E(\\mathbf{w})$, defined as the\n", 130 | "negative of the log-likelihood function

\n", 131 | "$E(\\mathbf{w}) = -l(\\mathbf{w};D)$

\n", 132 | "where we note $-\\log(a) = \\log(1/a)$\n" 133 | ] 134 | }, 135 | { 136 | "cell_type": "code", 137 | "collapsed": false, 138 | "input": [], 139 | "language": "python", 140 | "metadata": {}, 141 | "outputs": [] 142 | } 143 | ], 144 | "metadata": {} 145 | } 146 | ] 147 | } -------------------------------------------------------------------------------- /data/SAheart.data: -------------------------------------------------------------------------------- 1 | 160,12,5.73,23.11,1,49,25.3,97.2,52,1 2 | 144,0.01,4.41,28.61,0,55,28.87,2.06,63,1 3 | 118,0.08,3.48,32.28,1,52,29.14,3.81,46,0 4 | 170,7.5,6.41,38.03,1,51,31.99,24.26,58,1 5 | 134,13.6,3.5,27.78,1,60,25.99,57.34,49,1 6 | 132,6.2,6.47,36.21,1,62,30.77,14.14,45,0 7 | 142,4.05,3.38,16.2,0,59,20.81,2.62,38,0 8 | 114,4.08,4.59,14.6,1,62,23.11,6.72,58,1 9 | 114,0,3.83,19.4,1,49,24.86,2.49,29,0 10 | 132,0,5.8,30.96,1,69,30.11,0,53,1 11 | 206,6,2.95,32.27,0,72,26.81,56.06,60,1 12 | 134,14.1,4.44,22.39,1,65,23.09,0,40,1 13 | 118,0,1.88,10.05,0,59,21.57,0,17,0 14 | 132,0,1.87,17.21,0,49,23.63,0.97,15,0 15 | 112,9.65,2.29,17.2,1,54,23.53,0.68,53,0 16 | 117,1.53,2.44,28.95,1,35,25.89,30.03,46,0 17 | 120,7.5,15.33,22,0,60,25.31,34.49,49,0 18 | 146,10.5,8.29,35.36,1,78,32.73,13.89,53,1 19 | 158,2.6,7.46,34.07,1,61,29.3,53.28,62,1 20 | 124,14,6.23,35.96,1,45,30.09,0,59,1 21 | 106,1.61,1.74,12.32,0,74,20.92,13.37,20,1 22 | 132,7.9,2.85,26.5,1,51,26.16,25.71,44,0 23 | 150,0.3,6.38,33.99,1,62,24.64,0,50,0 24 | 138,0.6,3.81,28.66,0,54,28.7,1.46,58,0 25 | 142,18.2,4.34,24.38,0,61,26.19,0,50,0 26 | 124,4,12.42,31.29,1,54,23.23,2.06,42,1 27 | 118,6,9.65,33.91,0,60,38.8,0,48,0 28 | 145,9.1,5.24,27.55,0,59,20.96,21.6,61,1 29 | 144,4.09,5.55,31.4,1,60,29.43,5.55,56,0 30 | 146,0,6.62,25.69,0,60,28.07,8.23,63,1 31 | 136,2.52,3.95,25.63,0,51,21.86,0,45,1 32 | 158,1.02,6.33,23.88,0,66,22.13,24.99,46,1 33 | 122,6.6,5.58,35.95,1,53,28.07,12.55,59,1 34 | 126,8.75,6.53,34.02,0,49,30.25,0,41,1 35 | 148,5.5,7.1,25.31,0,56,29.84,3.6,48,0 36 | 122,4.26,4.44,13.04,0,57,19.49,48.99,28,1 37 | 140,3.9,7.32,25.05,0,47,27.36,36.77,32,0 38 | 110,4.64,4.55,30.46,0,48,30.9,15.22,46,0 39 | 130,0,2.82,19.63,1,70,24.86,0,29,0 40 | 136,11.2,5.81,31.85,1,75,27.68,22.94,58,1 41 | 118,0.28,5.8,33.7,1,60,30.98,0,41,1 42 | 144,0.04,3.38,23.61,0,30,23.75,4.66,30,0 43 | 120,0,1.07,16.02,0,47,22.15,0,15,0 44 | 130,2.61,2.72,22.99,1,51,26.29,13.37,51,1 45 | 114,0,2.99,9.74,0,54,46.58,0,17,0 46 | 128,4.65,3.31,22.74,0,62,22.95,0.51,48,0 47 | 162,7.4,8.55,24.65,1,64,25.71,5.86,58,1 48 | 116,1.91,7.56,26.45,1,52,30.01,3.6,33,1 49 | 114,0,1.94,11.02,0,54,20.17,38.98,16,0 50 | 126,3.8,3.88,31.79,0,57,30.53,0,30,0 51 | 122,0,5.75,30.9,1,46,29.01,4.11,42,0 52 | 134,2.5,3.66,30.9,0,52,27.19,23.66,49,0 53 | 152,0.9,9.12,30.23,0,56,28.64,0.37,42,1 54 | 134,8.08,1.55,17.5,1,56,22.65,66.65,31,1 55 | 156,3,1.82,27.55,0,60,23.91,54,53,0 56 | 152,5.99,7.99,32.48,0,45,26.57,100.32,48,0 57 | 118,0,2.99,16.17,0,49,23.83,3.22,28,0 58 | 126,5.1,2.96,26.5,0,55,25.52,12.34,38,1 59 | 103,0.03,4.21,18.96,0,48,22.94,2.62,18,0 60 | 121,0.8,5.29,18.95,1,47,22.51,0,61,0 61 | 142,0.28,1.8,21.03,0,57,23.65,2.93,33,0 62 | 138,1.15,5.09,27.87,1,61,25.65,2.34,44,0 63 | 152,10.1,4.71,24.65,1,65,26.21,24.53,57,0 64 | 140,0.45,4.3,24.33,0,41,27.23,10.08,38,0 65 | 130,0,1.82,10.45,0,57,22.07,2.06,17,0 66 | 136,7.36,2.19,28.11,1,61,25,61.71,54,0 67 | 124,4.82,3.24,21.1,1,48,28.49,8.42,30,0 68 | 112,0.41,1.88,10.29,0,39,22.08,20.98,27,0 69 | 118,4.46,7.27,29.13,1,48,29.01,11.11,33,0 70 | 122,0,3.37,16.1,0,67,21.06,0,32,1 71 
| 118,0,3.67,12.13,0,51,19.15,0.6,15,0 72 | 130,1.72,2.66,10.38,0,68,17.81,11.1,26,0 73 | 130,5.6,3.37,24.8,0,58,25.76,43.2,36,0 74 | 126,0.09,5.03,13.27,1,50,17.75,4.63,20,0 75 | 128,0.4,6.17,26.35,0,64,27.86,11.11,34,0 76 | 136,0,4.12,17.42,0,52,21.66,12.86,40,0 77 | 134,0,5.9,30.84,0,49,29.16,0,55,0 78 | 140,0.6,5.56,33.39,1,58,27.19,0,55,1 79 | 168,4.5,6.68,28.47,0,43,24.25,24.38,56,1 80 | 108,0.4,5.91,22.92,1,57,25.72,72,39,0 81 | 114,3,7.04,22.64,1,55,22.59,0,45,1 82 | 140,8.14,4.93,42.49,0,53,45.72,6.43,53,1 83 | 148,4.8,6.09,36.55,1,63,25.44,0.88,55,1 84 | 148,12.2,3.79,34.15,0,57,26.38,14.4,57,1 85 | 128,0,2.43,13.15,1,63,20.75,0,17,0 86 | 130,0.56,3.3,30.86,0,49,27.52,33.33,45,0 87 | 126,10.5,4.49,17.33,0,67,19.37,0,49,1 88 | 140,0,5.08,27.33,1,41,27.83,1.25,38,0 89 | 126,0.9,5.64,17.78,1,55,21.94,0,41,0 90 | 122,0.72,4.04,32.38,0,34,28.34,0,55,0 91 | 116,1.03,2.83,10.85,0,45,21.59,1.75,21,0 92 | 120,3.7,4.02,39.66,0,61,30.57,0,64,1 93 | 143,0.46,2.4,22.87,0,62,29.17,15.43,29,0 94 | 118,4,3.95,18.96,0,54,25.15,8.33,49,1 95 | 194,1.7,6.32,33.67,0,47,30.16,0.19,56,0 96 | 134,3,4.37,23.07,0,56,20.54,9.65,62,0 97 | 138,2.16,4.9,24.83,1,39,26.06,28.29,29,0 98 | 136,0,5,27.58,1,49,27.59,1.47,39,0 99 | 122,3.2,11.32,35.36,1,55,27.07,0,51,1 100 | 164,12,3.91,19.59,0,51,23.44,19.75,39,0 101 | 136,8,7.85,23.81,1,51,22.69,2.78,50,0 102 | 166,0.07,4.03,29.29,0,53,28.37,0,27,0 103 | 118,0,4.34,30.12,1,52,32.18,3.91,46,0 104 | 128,0.42,4.6,26.68,0,41,30.97,10.33,31,0 105 | 118,1.5,5.38,25.84,0,64,28.63,3.89,29,0 106 | 158,3.6,2.97,30.11,0,63,26.64,108,64,0 107 | 108,1.5,4.33,24.99,0,66,22.29,21.6,61,1 108 | 170,7.6,5.5,37.83,1,42,37.41,6.17,54,1 109 | 118,1,5.76,22.1,0,62,23.48,7.71,42,0 110 | 124,0,3.04,17.33,0,49,22.04,0,18,0 111 | 114,0,8.01,21.64,0,66,25.51,2.49,16,0 112 | 168,9,8.53,24.48,1,69,26.18,4.63,54,1 113 | 134,2,3.66,14.69,0,52,21.03,2.06,37,0 114 | 174,0,8.46,35.1,1,35,25.27,0,61,1 115 | 116,31.2,3.17,14.99,0,47,19.4,49.06,59,1 116 | 128,0,10.58,31.81,1,46,28.41,14.66,48,0 117 | 140,4.5,4.59,18.01,0,63,21.91,22.09,32,1 118 | 154,0.7,5.91,25,0,13,20.6,0,42,0 119 | 150,3.5,6.99,25.39,1,50,23.35,23.48,61,1 120 | 130,0,3.92,25.55,0,68,28.02,0.68,27,0 121 | 128,2,6.13,21.31,0,66,22.86,11.83,60,0 122 | 120,1.4,6.25,20.47,0,60,25.85,8.51,28,0 123 | 120,0,5.01,26.13,0,64,26.21,12.24,33,0 124 | 138,4.5,2.85,30.11,0,55,24.78,24.89,56,1 125 | 153,7.8,3.96,25.73,0,54,25.91,27.03,45,0 126 | 123,8.6,11.17,35.28,1,70,33.14,0,59,1 127 | 148,4.04,3.99,20.69,0,60,27.78,1.75,28,0 128 | 136,3.96,2.76,30.28,1,50,34.42,18.51,38,0 129 | 134,8.8,7.41,26.84,0,35,29.44,29.52,60,1 130 | 152,12.18,4.04,37.83,1,63,34.57,4.17,64,0 131 | 158,13.5,5.04,30.79,0,54,24.79,21.5,62,0 132 | 132,2,3.08,35.39,0,45,31.44,79.82,58,1 133 | 134,1.5,3.73,21.53,0,41,24.7,11.11,30,1 134 | 142,7.44,5.52,33.97,0,47,29.29,24.27,54,0 135 | 134,6,3.3,28.45,0,65,26.09,58.11,40,0 136 | 122,4.18,9.05,29.27,1,44,24.05,19.34,52,1 137 | 116,2.7,3.69,13.52,0,55,21.13,18.51,32,0 138 | 128,0.5,3.7,12.81,1,66,21.25,22.73,28,0 139 | 120,0,3.68,12.24,0,51,20.52,0.51,20,0 140 | 124,0,3.95,36.35,1,59,32.83,9.59,54,0 141 | 160,14,5.9,37.12,0,58,33.87,3.52,54,1 142 | 130,2.78,4.89,9.39,1,63,19.3,17.47,25,1 143 | 128,2.8,5.53,14.29,0,64,24.97,0.51,38,0 144 | 130,4.5,5.86,37.43,0,61,31.21,32.3,58,0 145 | 109,1.2,6.14,29.26,0,47,24.72,10.46,40,0 146 | 144,0,3.84,18.72,0,56,22.1,4.8,40,0 147 | 118,1.05,3.16,12.98,1,46,22.09,16.35,31,0 148 | 136,3.46,6.38,32.25,1,43,28.73,3.13,43,1 149 | 136,1.5,6.06,26.54,0,54,29.38,14.5,33,1 150 | 
124,15.5,5.05,24.06,0,46,23.22,0,61,1 151 | 148,6,6.49,26.47,0,48,24.7,0,55,0 152 | 128,6.6,3.58,20.71,0,55,24.15,0,52,0 153 | 122,0.28,4.19,19.97,0,61,25.63,0,24,0 154 | 108,0,2.74,11.17,0,53,22.61,0.95,20,0 155 | 124,3.04,4.8,19.52,1,60,21.78,147.19,41,1 156 | 138,8.8,3.12,22.41,1,63,23.33,120.03,55,1 157 | 127,0,2.81,15.7,0,42,22.03,1.03,17,0 158 | 174,9.45,5.13,35.54,0,55,30.71,59.79,53,0 159 | 122,0,3.05,23.51,0,46,25.81,0,38,0 160 | 144,6.75,5.45,29.81,0,53,25.62,26.23,43,1 161 | 126,1.8,6.22,19.71,0,65,24.81,0.69,31,0 162 | 208,27.4,3.12,26.63,0,66,27.45,33.07,62,1 163 | 138,0,2.68,17.04,0,42,22.16,0,16,0 164 | 148,0,3.84,17.26,0,70,20,0,21,0 165 | 122,0,3.08,16.3,0,43,22.13,0,16,0 166 | 132,7,3.2,23.26,0,77,23.64,23.14,49,0 167 | 110,12.16,4.99,28.56,0,44,27.14,21.6,55,1 168 | 160,1.52,8.12,29.3,1,54,25.87,12.86,43,1 169 | 126,0.54,4.39,21.13,1,45,25.99,0,25,0 170 | 162,5.3,7.95,33.58,1,58,36.06,8.23,48,0 171 | 194,2.55,6.89,33.88,1,69,29.33,0,41,0 172 | 118,0.75,2.58,20.25,0,59,24.46,0,32,0 173 | 124,0,4.79,34.71,0,49,26.09,9.26,47,0 174 | 160,0,2.42,34.46,0,48,29.83,1.03,61,0 175 | 128,0,2.51,29.35,1,53,22.05,1.37,62,0 176 | 122,4,5.24,27.89,1,45,26.52,0,61,1 177 | 132,2,2.7,21.57,1,50,27.95,9.26,37,0 178 | 120,0,2.42,16.66,0,46,20.16,0,17,0 179 | 128,0.04,8.22,28.17,0,65,26.24,11.73,24,0 180 | 108,15,4.91,34.65,0,41,27.96,14.4,56,0 181 | 166,0,4.31,34.27,0,45,30.14,13.27,56,0 182 | 152,0,6.06,41.05,1,51,40.34,0,51,0 183 | 170,4.2,4.67,35.45,1,50,27.14,7.92,60,1 184 | 156,4,2.05,19.48,1,50,21.48,27.77,39,1 185 | 116,8,6.73,28.81,1,41,26.74,40.94,48,1 186 | 122,4.4,3.18,11.59,1,59,21.94,0,33,1 187 | 150,20,6.4,35.04,0,53,28.88,8.33,63,0 188 | 129,2.15,5.17,27.57,0,52,25.42,2.06,39,0 189 | 134,4.8,6.58,29.89,1,55,24.73,23.66,63,0 190 | 126,0,5.98,29.06,1,56,25.39,11.52,64,1 191 | 142,0,3.72,25.68,0,48,24.37,5.25,40,1 192 | 128,0.7,4.9,37.42,1,72,35.94,3.09,49,1 193 | 102,0.4,3.41,17.22,1,56,23.59,2.06,39,1 194 | 130,0,4.89,25.98,0,72,30.42,14.71,23,0 195 | 138,0.05,2.79,10.35,0,46,21.62,0,18,0 196 | 138,0,1.96,11.82,1,54,22.01,8.13,21,0 197 | 128,0,3.09,20.57,0,54,25.63,0.51,17,0 198 | 162,2.92,3.63,31.33,0,62,31.59,18.51,42,0 199 | 160,3,9.19,26.47,1,39,28.25,14.4,54,1 200 | 148,0,4.66,24.39,0,50,25.26,4.03,27,0 201 | 124,0.16,2.44,16.67,0,65,24.58,74.91,23,0 202 | 136,3.15,4.37,20.22,1,59,25.12,47.16,31,1 203 | 134,2.75,5.51,26.17,0,57,29.87,8.33,33,0 204 | 128,0.73,3.97,23.52,0,54,23.81,19.2,64,0 205 | 122,3.2,3.59,22.49,1,45,24.96,36.17,58,0 206 | 152,3,4.64,31.29,0,41,29.34,4.53,40,0 207 | 162,0,5.09,24.6,1,64,26.71,3.81,18,0 208 | 124,4,6.65,30.84,1,54,28.4,33.51,60,0 209 | 136,5.8,5.9,27.55,0,65,25.71,14.4,59,0 210 | 136,8.8,4.26,32.03,1,52,31.44,34.35,60,0 211 | 134,0.05,8.03,27.95,0,48,26.88,0,60,0 212 | 122,1,5.88,34.81,1,69,31.27,15.94,40,1 213 | 116,3,3.05,30.31,0,41,23.63,0.86,44,0 214 | 132,0,0.98,21.39,0,62,26.75,0,53,0 215 | 134,0,2.4,21.11,0,57,22.45,1.37,18,0 216 | 160,7.77,8.07,34.8,0,64,31.15,0,62,1 217 | 180,0.52,4.23,16.38,0,55,22.56,14.77,45,1 218 | 124,0.81,6.16,11.61,0,35,21.47,10.49,26,0 219 | 114,0,4.97,9.69,0,26,22.6,0,25,0 220 | 208,7.4,7.41,32.03,0,50,27.62,7.85,57,0 221 | 138,0,3.14,12,0,54,20.28,0,16,0 222 | 164,0.5,6.95,39.64,1,47,41.76,3.81,46,1 223 | 144,2.4,8.13,35.61,0,46,27.38,13.37,60,0 224 | 136,7.5,7.39,28.04,1,50,25.01,0,45,1 225 | 132,7.28,3.52,12.33,0,60,19.48,2.06,56,0 226 | 143,5.04,4.86,23.59,0,58,24.69,18.72,42,0 227 | 112,4.46,7.18,26.25,1,69,27.29,0,32,1 228 | 134,10,3.79,34.72,0,42,28.33,28.8,52,1 229 | 
138,2,5.11,31.4,1,49,27.25,2.06,64,1 230 | 188,0,5.47,32.44,1,71,28.99,7.41,50,1 231 | 110,2.35,3.36,26.72,1,54,26.08,109.8,58,1 232 | 136,13.2,7.18,35.95,0,48,29.19,0,62,0 233 | 130,1.75,5.46,34.34,0,53,29.42,0,58,1 234 | 122,0,3.76,24.59,0,56,24.36,0,30,0 235 | 138,0,3.24,27.68,0,60,25.7,88.66,29,0 236 | 130,18,4.13,27.43,0,54,27.44,0,51,1 237 | 126,5.5,3.78,34.15,0,55,28.85,3.18,61,0 238 | 176,5.76,4.89,26.1,1,46,27.3,19.44,57,0 239 | 122,0,5.49,19.56,0,57,23.12,14.02,27,0 240 | 124,0,3.23,9.64,0,59,22.7,0,16,0 241 | 140,5.2,3.58,29.26,0,70,27.29,20.17,45,1 242 | 128,6,4.37,22.98,1,50,26.01,0,47,0 243 | 190,4.18,5.05,24.83,0,45,26.09,82.85,41,0 244 | 144,0.76,10.53,35.66,0,63,34.35,0,55,1 245 | 126,4.6,7.4,31.99,1,57,28.67,0.37,60,1 246 | 128,0,2.63,23.88,0,45,21.59,6.54,57,0 247 | 136,0.4,3.91,21.1,1,63,22.3,0,56,1 248 | 158,4,4.18,28.61,1,42,25.11,0,60,0 249 | 160,0.6,6.94,30.53,0,36,25.68,1.42,64,0 250 | 124,6,5.21,33.02,1,64,29.37,7.61,58,1 251 | 158,6.17,8.12,30.75,0,46,27.84,92.62,48,0 252 | 128,0,6.34,11.87,0,57,23.14,0,17,0 253 | 166,3,3.82,26.75,0,45,20.86,0,63,1 254 | 146,7.5,7.21,25.93,1,55,22.51,0.51,42,0 255 | 161,9,4.65,15.16,1,58,23.76,43.2,46,0 256 | 164,13.02,6.26,29.38,1,47,22.75,37.03,54,1 257 | 146,5.08,7.03,27.41,1,63,36.46,24.48,37,1 258 | 142,4.48,3.57,19.75,1,51,23.54,3.29,49,0 259 | 138,12,5.13,28.34,0,59,24.49,32.81,58,1 260 | 154,1.8,7.13,34.04,1,52,35.51,39.36,44,0 261 | 118,0,2.39,12.13,0,49,18.46,0.26,17,1 262 | 124,0.61,2.69,17.15,1,61,22.76,11.55,20,0 263 | 124,1.04,2.84,16.42,1,46,20.17,0,61,0 264 | 136,5,4.19,23.99,1,68,27.8,25.86,35,0 265 | 132,9.9,4.63,27.86,1,46,23.39,0.51,52,1 266 | 118,0.12,1.96,20.31,0,37,20.01,2.42,18,0 267 | 118,0.12,4.16,9.37,0,57,19.61,0,17,0 268 | 134,12,4.96,29.79,0,53,24.86,8.23,57,0 269 | 114,0.1,3.95,15.89,1,57,20.31,17.14,16,0 270 | 136,6.8,7.84,30.74,1,58,26.2,23.66,45,1 271 | 130,0,4.16,39.43,1,46,30.01,0,55,1 272 | 136,2.2,4.16,38.02,0,65,37.24,4.11,41,1 273 | 136,1.36,3.16,14.97,1,56,24.98,7.3,24,0 274 | 154,4.2,5.59,25.02,0,58,25.02,1.54,43,0 275 | 108,0.8,2.47,17.53,0,47,22.18,0,55,1 276 | 136,8.8,4.69,36.07,1,38,26.56,2.78,63,1 277 | 174,2.02,6.57,31.9,1,50,28.75,11.83,64,1 278 | 124,4.25,8.22,30.77,0,56,25.8,0,43,0 279 | 114,0,2.63,9.69,0,45,17.89,0,16,0 280 | 118,0.12,3.26,12.26,0,55,22.65,0,16,0 281 | 106,1.08,4.37,26.08,0,67,24.07,17.74,28,1 282 | 146,3.6,3.51,22.67,0,51,22.29,43.71,42,0 283 | 206,0,4.17,33.23,0,69,27.36,6.17,50,1 284 | 134,3,3.17,17.91,0,35,26.37,15.12,27,0 285 | 148,15,4.98,36.94,1,72,31.83,66.27,41,1 286 | 126,0.21,3.95,15.11,0,61,22.17,2.42,17,0 287 | 134,0,3.69,13.92,0,43,27.66,0,19,0 288 | 134,0.02,2.8,18.84,0,45,24.82,0,17,0 289 | 123,0.05,4.61,13.69,0,51,23.23,2.78,16,0 290 | 112,0.6,5.28,25.71,0,55,27.02,27.77,38,1 291 | 112,0,1.71,15.96,0,42,22.03,3.5,16,0 292 | 101,0.48,7.26,13,0,50,19.82,5.19,16,0 293 | 150,0.18,4.14,14.4,0,53,23.43,7.71,44,0 294 | 170,2.6,7.22,28.69,1,71,27.87,37.65,56,1 295 | 134,0,5.63,29.12,0,68,32.33,2.02,34,0 296 | 142,0,4.19,18.04,0,56,23.65,20.78,42,1 297 | 132,0.1,3.28,10.73,0,73,20.42,0,17,0 298 | 136,0,2.28,18.14,0,55,22.59,0,17,0 299 | 132,12,4.51,21.93,0,61,26.07,64.8,46,1 300 | 166,4.1,4,34.3,1,32,29.51,8.23,53,0 301 | 138,0,3.96,24.7,1,53,23.8,0,45,0 302 | 138,2.27,6.41,29.07,0,58,30.22,2.93,32,1 303 | 170,0,3.12,37.15,0,47,35.42,0,53,0 304 | 128,0,8.41,28.82,1,60,26.86,0,59,1 305 | 136,1.2,2.78,7.12,0,52,22.51,3.41,27,0 306 | 128,0,3.22,26.55,1,39,26.59,16.71,49,0 307 | 150,14.4,5.04,26.52,1,60,28.84,0,45,0 308 | 
132,8.4,3.57,13.68,0,42,18.75,15.43,59,1 309 | 142,2.4,2.55,23.89,0,54,26.09,59.14,37,0 310 | 130,0.05,2.44,28.25,1,67,30.86,40.32,34,0 311 | 174,3.5,5.26,21.97,1,36,22.04,8.33,59,1 312 | 114,9.6,2.51,29.18,0,49,25.67,40.63,46,0 313 | 162,1.5,2.46,19.39,1,49,24.32,0,59,1 314 | 174,0,3.27,35.4,0,58,37.71,24.95,44,0 315 | 190,5.15,6.03,36.59,0,42,30.31,72,50,0 316 | 154,1.4,1.72,18.86,0,58,22.67,43.2,59,0 317 | 124,0,2.28,24.86,1,50,22.24,8.26,38,0 318 | 114,1.2,3.98,14.9,0,49,23.79,25.82,26,0 319 | 168,11.4,5.08,26.66,1,56,27.04,2.61,59,1 320 | 142,3.72,4.24,32.57,0,52,24.98,7.61,51,0 321 | 154,0,4.81,28.11,1,56,25.67,75.77,59,0 322 | 146,4.36,4.31,18.44,1,47,24.72,10.8,38,0 323 | 166,6,3.02,29.3,0,35,24.38,38.06,61,0 324 | 140,8.6,3.9,32.16,1,52,28.51,11.11,64,1 325 | 136,1.7,3.53,20.13,0,56,19.44,14.4,55,0 326 | 156,0,3.47,21.1,0,73,28.4,0,36,1 327 | 132,0,6.63,29.58,1,37,29.41,2.57,62,0 328 | 128,0,2.98,12.59,0,65,20.74,2.06,19,0 329 | 106,5.6,3.2,12.3,0,49,20.29,0,39,0 330 | 144,0.4,4.64,30.09,0,30,27.39,0.74,55,0 331 | 154,0.31,2.33,16.48,0,33,24,11.83,17,0 332 | 126,3.1,2.01,32.97,1,56,28.63,26.74,45,0 333 | 134,6.4,8.49,37.25,1,56,28.94,10.49,51,1 334 | 152,19.45,4.22,29.81,0,28,23.95,0,59,1 335 | 146,1.35,6.39,34.21,0,51,26.43,0,59,1 336 | 162,6.94,4.55,33.36,1,52,27.09,32.06,43,0 337 | 130,7.28,3.56,23.29,1,20,26.8,51.87,58,1 338 | 138,6,7.24,37.05,0,38,28.69,0,59,0 339 | 148,0,5.32,26.71,1,52,32.21,32.78,27,0 340 | 124,4.2,2.94,27.59,0,50,30.31,85.06,30,0 341 | 118,1.62,9.01,21.7,0,59,25.89,21.19,40,0 342 | 116,4.28,7.02,19.99,1,68,23.31,0,52,1 343 | 162,6.3,5.73,22.61,1,46,20.43,62.54,53,1 344 | 138,0.87,1.87,15.89,0,44,26.76,42.99,31,0 345 | 137,1.2,3.14,23.87,0,66,24.13,45,37,0 346 | 198,0.52,11.89,27.68,1,48,28.4,78.99,26,1 347 | 154,4.5,4.75,23.52,1,43,25.76,0,53,1 348 | 128,5.4,2.36,12.98,0,51,18.36,6.69,61,0 349 | 130,0.08,5.59,25.42,1,50,24.98,6.27,43,1 350 | 162,5.6,4.24,22.53,0,29,22.91,5.66,60,0 351 | 120,10.5,2.7,29.87,1,54,24.5,16.46,49,0 352 | 136,3.99,2.58,16.38,1,53,22.41,27.67,36,0 353 | 176,1.2,8.28,36.16,1,42,27.81,11.6,58,1 354 | 134,11.79,4.01,26.57,1,38,21.79,38.88,61,1 355 | 122,1.7,5.28,32.23,1,51,24.08,0,54,0 356 | 134,0.9,3.18,23.66,1,52,23.26,27.36,58,1 357 | 134,0,2.43,22.24,0,52,26.49,41.66,24,0 358 | 136,6.6,6.08,32.74,0,64,33.28,2.72,49,0 359 | 132,4.05,5.15,26.51,1,31,26.67,16.3,50,0 360 | 152,1.68,3.58,25.43,0,50,27.03,0,32,0 361 | 132,12.3,5.96,32.79,1,57,30.12,21.5,62,1 362 | 124,0.4,3.67,25.76,0,43,28.08,20.57,34,0 363 | 140,4.2,2.91,28.83,1,43,24.7,47.52,48,0 364 | 166,0.6,2.42,34.03,1,53,26.96,54,60,0 365 | 156,3.02,5.35,25.72,1,53,25.22,28.11,52,1 366 | 132,0.72,4.37,19.54,0,48,26.11,49.37,28,0 367 | 150,0,4.99,27.73,0,57,30.92,8.33,24,0 368 | 134,0.12,3.4,21.18,1,33,26.27,14.21,30,0 369 | 126,3.4,4.87,15.16,1,65,22.01,11.11,38,0 370 | 148,0.5,5.97,32.88,0,54,29.27,6.43,42,0 371 | 148,8.2,7.75,34.46,1,46,26.53,6.04,64,1 372 | 132,6,5.97,25.73,1,66,24.18,145.29,41,0 373 | 128,1.6,5.41,29.3,0,68,29.38,23.97,32,0 374 | 128,5.16,4.9,31.35,1,57,26.42,0,64,0 375 | 140,0,2.4,27.89,1,70,30.74,144,29,0 376 | 126,0,5.29,27.64,0,25,27.62,2.06,45,0 377 | 114,3.6,4.16,22.58,0,60,24.49,65.31,31,0 378 | 118,1.25,4.69,31.58,1,52,27.16,4.11,53,0 379 | 126,0.96,4.99,29.74,0,66,33.35,58.32,38,0 380 | 154,4.5,4.68,39.97,0,61,33.17,1.54,64,1 381 | 112,1.44,2.71,22.92,0,59,24.81,0,52,0 382 | 140,8,4.42,33.15,1,47,32.77,66.86,44,0 383 | 140,1.68,11.41,29.54,1,74,30.75,2.06,38,1 384 | 128,2.6,4.94,21.36,0,61,21.3,0,31,0 385 | 
126,19.6,6.03,34.99,0,49,26.99,55.89,44,0 386 | 160,4.2,6.76,37.99,1,61,32.91,3.09,54,1 387 | 144,0,4.17,29.63,1,52,21.83,0,59,0 388 | 148,4.5,10.49,33.27,0,50,25.92,2.06,53,1 389 | 146,0,4.92,18.53,0,57,24.2,34.97,26,0 390 | 164,5.6,3.17,30.98,1,44,25.99,43.2,53,1 391 | 130,0.54,3.63,22.03,1,69,24.34,12.86,39,1 392 | 154,2.4,5.63,42.17,1,59,35.07,12.86,50,1 393 | 178,0.95,4.75,21.06,0,49,23.74,24.69,61,0 394 | 180,3.57,3.57,36.1,0,36,26.7,19.95,64,0 395 | 134,12.5,2.73,39.35,0,48,35.58,0,48,0 396 | 142,0,3.54,16.64,0,58,25.97,8.36,27,0 397 | 162,7,7.67,34.34,1,33,30.77,0,62,0 398 | 218,11.2,2.77,30.79,0,38,24.86,90.93,48,1 399 | 126,8.75,6.06,32.72,1,33,27,62.43,55,1 400 | 126,0,3.57,26.01,0,61,26.3,7.97,47,0 401 | 134,6.1,4.77,26.08,0,47,23.82,1.03,49,0 402 | 132,0,4.17,36.57,0,57,30.61,18,49,0 403 | 178,5.5,3.79,23.92,1,45,21.26,6.17,62,1 404 | 208,5.04,5.19,20.71,1,52,25.12,24.27,58,1 405 | 160,1.15,10.19,39.71,0,31,31.65,20.52,57,0 406 | 116,2.38,5.67,29.01,1,54,27.26,15.77,51,0 407 | 180,25.01,3.7,38.11,1,57,30.54,0,61,1 408 | 200,19.2,4.43,40.6,1,55,32.04,36,60,1 409 | 112,4.2,3.58,27.14,0,52,26.83,2.06,40,0 410 | 120,0,3.1,26.97,0,41,24.8,0,16,0 411 | 178,20,9.78,33.55,0,37,27.29,2.88,62,1 412 | 166,0.8,5.63,36.21,0,50,34.72,28.8,60,0 413 | 164,8.2,14.16,36.85,0,52,28.5,17.02,55,1 414 | 216,0.92,2.66,19.85,1,49,20.58,0.51,63,1 415 | 146,6.4,5.62,33.05,1,57,31.03,0.74,46,0 416 | 134,1.1,3.54,20.41,1,58,24.54,39.91,39,1 417 | 158,16,5.56,29.35,0,36,25.92,58.32,60,0 418 | 176,0,3.14,31.04,1,45,30.18,4.63,45,0 419 | 132,2.8,4.79,20.47,1,50,22.15,11.73,48,0 420 | 126,0,4.55,29.18,0,48,24.94,36,41,0 421 | 120,5.5,3.51,23.23,0,46,22.4,90.31,43,0 422 | 174,0,3.86,21.73,0,42,23.37,0,63,0 423 | 150,13.8,5.1,29.45,1,52,27.92,77.76,55,1 424 | 176,6,3.98,17.2,1,52,21.07,4.11,61,1 425 | 142,2.2,3.29,22.7,0,44,23.66,5.66,42,1 426 | 132,0,3.3,21.61,0,42,24.92,32.61,33,0 427 | 142,1.32,7.63,29.98,1,57,31.16,72.93,33,0 428 | 146,1.16,2.28,34.53,0,50,28.71,45,49,0 429 | 132,7.2,3.65,17.16,1,56,23.25,0,34,0 430 | 120,0,3.57,23.22,0,58,27.2,0,32,0 431 | 118,0,3.89,15.96,0,65,20.18,0,16,0 432 | 108,0,1.43,26.26,0,42,19.38,0,16,0 433 | 136,0,4,19.06,0,40,21.94,2.06,16,0 434 | 120,0,2.46,13.39,0,47,22.01,0.51,18,0 435 | 132,0,3.55,8.66,1,61,18.5,3.87,16,0 436 | 136,0,1.77,20.37,0,45,21.51,2.06,16,0 437 | 138,0,1.86,18.35,1,59,25.38,6.51,17,0 438 | 138,0.06,4.15,20.66,0,49,22.59,2.49,16,0 439 | 130,1.22,3.3,13.65,0,50,21.4,3.81,31,0 440 | 130,4,2.4,17.42,0,60,22.05,0,40,0 441 | 110,0,7.14,28.28,0,57,29,0,32,0 442 | 120,0,3.98,13.19,1,47,21.89,0,16,0 443 | 166,6,8.8,37.89,0,39,28.7,43.2,52,0 444 | 134,0.57,4.75,23.07,0,67,26.33,0,37,0 445 | 142,3,3.69,25.1,0,60,30.08,38.88,27,0 446 | 136,2.8,2.53,9.28,1,61,20.7,4.55,25,0 447 | 142,0,4.32,25.22,0,47,28.92,6.53,34,1 448 | 130,0,1.88,12.51,1,52,20.28,0,17,0 449 | 124,1.8,3.74,16.64,1,42,22.26,10.49,20,0 450 | 144,4,5.03,25.78,1,57,27.55,90,48,1 451 | 136,1.81,3.31,6.74,0,63,19.57,24.94,24,0 452 | 120,0,2.77,13.35,0,67,23.37,1.03,18,0 453 | 154,5.53,3.2,28.81,1,61,26.15,42.79,42,0 454 | 124,1.6,7.22,39.68,1,36,31.5,0,51,1 455 | 146,0.64,4.82,28.02,0,60,28.11,8.23,39,1 456 | 128,2.24,2.83,26.48,0,48,23.96,47.42,27,1 457 | 170,0.4,4.11,42.06,1,56,33.1,2.06,57,0 458 | 214,0.4,5.98,31.72,0,64,28.45,0,58,0 459 | 182,4.2,4.41,32.1,0,52,28.61,18.72,52,1 460 | 108,3,1.59,15.23,0,40,20.09,26.64,55,0 461 | 118,5.4,11.61,30.79,0,64,27.35,23.97,40,0 -------------------------------------------------------------------------------- /04_Neural_Networks.ipynb: 
-------------------------------------------------------------------------------- 1 | { 2 | "metadata": { 3 | "name": "04_Neural_Networks" 4 | }, 5 | "nbformat": 3, 6 | "nbformat_minor": 0, 7 | "worksheets": [ 8 | { 9 | "cells": [ 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "

Neural Networks

\n", 15 | "\n", 16 | "Neural networks are one approach to machine learning that attempts to deal with the problem of large data dimensionality. The neural network approach uses a fixed number of basis functions - in contrast\n", 17 | "to methods such as support vector machines that attempt to adapt the number of basis functions - that are themselves parameterized by the model parameters. This is a significant departure from \n", 18 | "linear regression and logistic regression methods where the models consisted of linear combinations of fixed basis functions, $\\phi(\\mathbf{x})$, that dependend only on the input vector, $\\mathbf{x}$. In neural networks, the basis functions can now depend on both the model parameters and the input vector and thus take the form $\\phi(\\mathbf{x} | \\mathbf{w})$. \n", 19 | "\n", 20 | "Here we will cover only **feed-forward** neural networks. One can envision a neural network as a series of layers where each layer has some number of nodes and each node is a function. Each layer represents a single linear regression model. The nodes are connected to each other through inputs accepted from and outputs passed to other nodes. A *feed-forward* network is one in which these connections do not form any directed cycles. See [here](http://en.wikipedia.org/wiki/Feedforward_neural_network) for more detail.\n", 21 | "\n", 22 | "![alt text](files/images/Feed_forward_neural_net.gif \"A feed forward network.\")\n", 23 | "\n", 24 | "As a matter of convention, we will refer the model as a $N$ layer model where $N$ is the number of layers for which adaptive parameters, $\\mathbf{w}$, must be determined. Thus for a model consisting of an input layer, one hidden layer, and an output layer, we consider $N$ to be 2 since parameters are determined only for the hidden and output layers." 25 | ] 26 | }, 27 | { 28 | "cell_type": "markdown", 29 | "metadata": {}, 30 | "source": [ 31 | "

Feed-forward Network Functions

\n", 32 | "We will consider a basic two layer nueral network model, i.e a model that maps inputs to a hidden layer and then to an output layer. We will make **the following assumptions**\n", 33 | "\n", 34 | "1. The final output will be a vector $Y$ with $K$ elements, $y_k$, where $y_k(\\mathbf{x},\\mathbf{w}) = p(C_1|\\mathbf{x})$ is the probability that node $k$ is in class $C_1$ and $p(C_2|\\mathbf{x}) = 1-p(C_1|\\mathbf{x})$\n", 35 | "2. The activation function at a given layer is an arbitrary nonlinear function of a linear combination of the inputs and parameters for that layer\n", 36 | "3. The network is fully connected, i.e. every node at the input layer is connected to every node in the hidden layer and every node in the hidden layer is connected to every node in the output layer\n", 37 | "4. A bias parameter is included at the hidden and output layers\n", 38 | "\n", 39 | "Working from the input layer toward the output layer, we can build this model as follows:\n", 40 | "\n", 41 | "

Input Layer

\n", 42 | "Assume we have an input vector $\\mathbf{x} \\in \\Re^D$. Then the input layer consists of $D+1$ nodes where the value of the $i^{th}$ node for $i=0\\ldots D$, is 0 if $i=0$ and $x_i$, i.e. the $i^{th}$ value of $\\mathbf{x}$, otherwise.\n", 43 | "\n", 44 | "

Hidden Layer

\n", 45 | "At the hidden layer we construct $M$ nodes where the value of $M$ depends on the specifics of the particular modelling problem. For each node, we define a *unit activation*, $a_m$, for $m=1\\ldots M$ as
\n", 46 | "$a_m = \\sum_{i=0}^D w_{ji}^{(1)}x_i$
\n", 47 | "where the $(1)$ superscript indicates this weight is for the hidden layer. The output from each node, $z_m$, is then given by the value of a *fixed nonlinear function*, $h$, known as the *activation function*, acting on the unit activation
\n", 48 | "$z_m = h(a_m) = h \\left( \\sum_{i=0}^D w_{mi}^{(1)}x_i \\right)$
\n", 49 | "Notice that $h$ is the same function for all nodes.\n", 50 | "

Output Layer

\n", 51 | "The process at the output layer is essentially the same as at the hidden layer. We construct $K$ nodes, where again the value of $K$ depends on the specific modeling problem. For each node, we again define a *unit activation*, $a_k$, for $k=1 \\ldots K$ by
\n", 52 | "$a_k = \\sum_{m=0}^M w_{km}^{(2)} z_m$
\n", 53 | "We again apply a nonlinear activation function, say $y$, to produce the output
\n", 54 | "$y_k = y(a_k)$\n", 55 | "\n", 56 | "Thus, the entire model can be summarized as a $K$ dimensional output vector $Y \\in \\Re^K$ where each element $y_k$ by
\n", 57 | "$y_k(\\mathbf{x},\\mathbf{w}) = y \\left( \\sum_{m=0}^M w_{km}^{(2)} h \\left( \\sum_{i=0}^D w_{mi}^{(1)}x_i \\right) \\right)$\n", 58 | "\n", 59 | "

Generalizations

\n", 60 | "There are a wide variety of generalizations possible for this model. Some of the more important ones for practical applications include\n", 61 | "\n", 62 | "* Addition of hidden layers\n", 63 | "* Inclusion of *skip-layer* connections, e.g. a connection from an input node directly to an output node\n", 64 | "* Sparse network, i.e. not a fully connected network" 65 | ] 66 | }, 67 | { 68 | "cell_type": "markdown", 69 | "metadata": {}, 70 | "source": [ 71 | "

Network Training

\n", 72 | "Here we will consider how to determine the network model parameters. We will use a *maximum likelihood* approach. The likelihood function for the network is dependent on the type of problem the network models. Of particular importance is the number and type of values generated at the output layer. We consider several cases below.\n", 73 | "

Regression Single Gaussian Target

\n", 74 | "*Assume* that our target variable, *t*, is Gaussian distributed with an $\\mathbf{x}$ dependent mean, $\\mu(\\mathbf{x})$ and constant variance $\\sigma^2 = 1/\\beta$. Our network model is designed to estimate the unknown function for the mean, i.e. $y(\\mathbf{x},\\mathbf{w}) \\approx \\mu(\\mathbf{x})$ so that the distribution for the target value, *t*, is modeled by

\n", 75 | "$p(t|\\mathbf{x},\\mathbf{w}) = ND(t|y(\\mathbf{x},\\mathbf{w}), \\beta^{-1})$

\n", 76 | "where $ND(\\mu,\\sigma^2)$ is the normal distribution with mean $\\mu$ and variance $\\sigma^2$. *Assume* the output layer activation function is the identity, i.e. the output is simply the the unit activations, and that we have $N$ i.i.d. observations $\\mathbf{X}=\\\\{\\mathbf{x}_1, \\ldots, \\mathbf{x}_2 \\\\}$ and target values $\\mathbf{t} = \\\\{t_1, \\ldots, t_2\\\\}$, the likelihood function is

\n", 77 | "$p(\\mathbf{t}|\\mathbf{X},\\mathbf{w}, \\beta) = \\prod_{n=1}^N p(t_n|\\mathbf{x}_n, \\mathbf{w}, \\beta)$

\n", 78 | "The **total error function** is defined as the negative logarithm of the likelihood function given by
\n", 79 | "\n", 80 | "$\\frac {\\beta}{2} \\sum_{n=1}^N \\\\{y(\\mathbf{x}_n,\\mathbf{w}) - t_n \\\\}^2 - \\frac{N}{2} \\ln(\\beta) + \\frac{N}{2} \\ln (2\\pi)$\n", 81 | "\n", 82 | "The parameter estimates, $\\mathbf{w}^{(ML)}$ (ML indicates maximum likelihood) are found by maximizing the equivalent sum-of-squares error function\n", 83 | "\n", 84 | "$E(\\mathbf{w}) = \\frac{1}{2} \\sum_{n=1}^N \\\\{y(\\mathbf{x}_n,\\mathbf{w}) - t_n \\\\}^2$\n", 85 | "\n", 86 | "Note, due to the nonlinearity of the network model, $E(\\mathbf{w})$ is not necessisarily convex, and thus may have *local minima* and hence it is not possible to know that the global minimum has been obtained. The model parameter, $\\beta$ is found by first finding $\\mathbf{w}_{ML}$ and then solving\n", 87 | "\n", 88 | "$\\frac{1}{\\beta_{ML}} = \\frac{1}{N}\\sum_{n=1}^N \\\\{ y(\\mathbf{x}_n,\\mathbf{w}^{(ML)}) - t_n \\\\}^2$" 89 | ] 90 | }, 91 | { 92 | "cell_type": "markdown", 93 | "metadata": {}, 94 | "source": [ 95 | "

Regression Multiple Gaussian Targets

\n", 96 | "Now *assume* that we have $K$ Gaussian distributed target variables, $\\[t_1, \\ldots, t_K\\]$, each with a mean that is independently conditional on $\\mathbf{x}$, i.e. the mean of $t_k$ is defined by some function $\\mu_k(\\mathbf{x})$. Also assume that all $K$ variables share the same variance, $\\sigma^2=1/\\beta$. Assuming the network output layer has $K$ nodes where $y_k(\\mathbf{x},\\mathbf{w}) \\approx \\mu_k(\\mathbf{x})$ and letting $\\mathbf{y}(\\mathbf{x},\\mathbf{w}) = \\[y_1(\\mathbf{x},\\mathbf{w}), \\ldots, y_K(\\mathbf{x},\\mathbf{w})\\]$, and that we again have $N$ training target values $\\mathbf{t}$ ($\\mathbf{t}$ is a $K \\times N$ matrix of the training values), the conditional distribution of the target training values is given by\n", 97 | "\n", 98 | "$p(\\mathbf{t}|\\mathbf{x},\\mathbf{w}) = ND(\\mathbf{t} | \\mathbf{y}(\\mathbf{x},\\mathbf{w}), \\beta^{-1} \\mathbf{I})$\n", 99 | "\n", 100 | "The parameter estimates, $\\mathbf{w}^{(ML)}$ are again found by minimizing the sum-of-squares error function, $E(\\mathbf{w})$, and the estimate for $\\beta$ is found from\n", 101 | "\n", 102 | "$\\frac{1}{\\beta_{ML}} = \\frac{1}{NK}\\sum_{n=1}^N ||y(\\mathbf{x}_n,\\mathbf{w}^{(ML)}) - t_n ||^2$" 103 | ] 104 | }, 105 | { 106 | "cell_type": "markdown", 107 | "metadata": {}, 108 | "source": [ 109 | "

Binary Classification Single Target

\n", 110 | "Now *assume* we have a single target variable, $t\\in \\\\{0,1\\\\}$, such that $t=1$ denotes class $C_1$ and $t=0$ denotes class $C_2$. *Assume* that the network has a single output node whos activation function is the logistic sigmoid\n", 111 | "\n", 112 | "$y=1/(1+\\exp(-a))$\n", 113 | "\n", 114 | "where $a$ is the output of the hidden layer. We can interpret the network output, $y(\\mathbf{x},\\mathbf{w})$ as the conditional probability $p(C_1|\\mathbf{x})$ with $p(C_2|\\mathbf{x})=1-y(\\mathbf{x},\\mathbf{w})$ so that the coniditional probability takes a Bernoulli distribution\n", 115 | "\n", 116 | "$p(t|\\mathbf{x},\\mathbf{w}) = y(\\mathbf{x},\\mathbf{w})^t\\[1 - y(\\mathbf{x},\\mathbf{w}) \\]^{1-t}$\n", 117 | "\n", 118 | "Given a training set with $N$ observations the error function is given by\n", 119 | "\n", 120 | "$E(\\mathbf{w}) = - \\sum_{n=1}^N \\\\{t_n \\ln (y_n) + (1 - t_n) \\ln (1-y_n) \\\\}$\n", 121 | "\n", 122 | "where $y_n = y(\\mathbf{x_n},\\mathbf{w})$. This model *assumes* all training inputs are correctly labeled. " 123 | ] 124 | }, 125 | { 126 | "cell_type": "markdown", 127 | "metadata": {}, 128 | "source": [ 129 | "

Binary Classification $K$ Seperate Targets

\n", 130 | "Assume we have $K$ seperate binary classification target variables (i.e. there are K classification sets each with two classes. The K sets are independent and the input will be assigned to one class from each set). This can be modeled with network having $K$ nodes in the output layer where the activation function of each node is the logistic sigmoid. *Assume* the class labels are independent, i.e. $p(x_i \\in C_1| x_j \\in C_1) = p(x_i \\in C_1)$ $\\forall \\\\{i,j\\\\}$ (this is often an invalid assumption in practice), and that we have some training set, $\\mathbf{t}$, then the coniditional probability of a single output vector $\\mathbf{t}$ is\n", 131 | "\n", 132 | "$p(\\mathbf{t}|\\mathbf{x},\\mathbf{w}) = \\prod_{k=1}^K y_k(\\mathbf{x},\\mathbf{w})^{t_k} \\[ 1 - y_k(\\mathbf{x},\\mathbf{w})\\]^{1-t_k}$\n", 133 | "\n", 134 | "Given a training input with $N$ values (note that the training set is an $K \\times N$ matrix) the error function is \n", 135 | "\n", 136 | "$E(\\mathbf{w} = - \\sum_{n=1}^N \\sum_{k=1}^K \\[ t_{nk} \\ln (y_{nk}) + (1-t_{nk}) \\ln (1-y_{nk}) \\]$\n", 137 | "\n", 138 | "where $y_{nk} = y_k(\\mathbf{x}_n, \\mathbf{w}). " 139 | ] 140 | }, 141 | { 142 | "cell_type": "markdown", 143 | "metadata": {}, 144 | "source": [ 145 | "

Multi-class Classification

\n", 146 | "Assume we have a $K$ **mutually exclusive** classes (i.e. the input can only be in one of the $K$ classes). This network is also modelled with $K$ output nodes where each output node represents the probability that the input is in class $k$, $y_k(\\mathbf{x},\\mathbf{w}) = p(t_k = 1 | \\mathbf{x})$. The error function is given by \n", 147 | "\n", 148 | "$E(\\mathbf{w}) = - \\sum_{n=1}^N \\sum_{k=1}^K t_{nk} \\ln y_k(\\mathbf{x},\\mathbf{w})$\n", 149 | "\n", 150 | "where the output activation function $y_k$ is the *softmax function*\n", 151 | "\n", 152 | "$y_k(\\mathbf{x},\\mathbf{w}) = \\frac{\\exp(a_k(\\mathbf{x},\\mathbf{w})}{\\sum_j \\exp(a_j(\\mathbf{x},\\mathbf{w}))}$" 153 | ] 154 | }, 155 | { 156 | "cell_type": "markdown", 157 | "metadata": {}, 158 | "source": [ 159 | "

Error Backpropagation

\n", 160 | "Most training algorithms involve a two step procedure for minimizing the model parameter dependent error function\n", 161 | "\n", 162 | "1. Evaluate the derivatives of the error function with respect to the parameters\n", 163 | "2. Use the derivatives to iterate the values of the parameters\n", 164 | "\n", 165 | "In this section, we consider **error backpropagation** which provides an efficient way to evaluate a feed-forward neural network's error function. *Note* that this only satisfies step 1 above. It is still necessary to choose an optimization technique in order to use the derivative information provided by error backpropagation to update the parameter estimates as indicated by step 2. \n", 166 | "\n", 167 | "The results presented here are applicable to \n", 168 | "\n", 169 | "1. An **arbitary** feed-forward network toplogy\n", 170 | "2. **Arbitrary** differentiable nonlinear activation functions\n", 171 | "\n", 172 | "The one *assumption* that is made is that the error function can be expressed as a *sum of terms*, one for each training data point, $t_n$ for $n \\in \\\\{1\\ldots N\\\\}$, so that\n", 173 | "\n", 174 | "$E(\\mathbf{w}) = \\sum_{n=1}^N E_n(\\mathbf{w},t_n)$\n", 175 | "\n", 176 | "For such error functions, the derivative of the error function with respect to $\\mathbf{w}) takes the form\n", 177 | "\n", 178 | "$\\bigtriangledown E(\\mathbf{w}) = \\sum_{n=1}^N \\bigtriangledown E_n(\\mathbf{w},t_n)$\n", 179 | "\n", 180 | "Thus we need only consider the evaluation of $\\bigtriangledown E_n(\\mathbf{w},t_n)$ which may be used directly for sequential optimization techniques or summed over the training set for batch techniques (see next section). The error backpropagation method can be summarized as follows\n", 181 | "\n", 182 | "1. Given an input vector, $\\mathbf{x}_n$, forward propagate through the network using \n", 183 | " 1. $a_j = \\sum_i w_{ji}z_i$\n", 184 | " 2. $z_j = h(a_j)$\n", 185 | "2. Evaluate $\\delta_k \\equiv \\frac{\\partial E_n} {\\partial a_k}$ for the ouptput layer as\n", 186 | " 1. $\\delta_k = y_k - t_k$\n", 187 | "3. Backpropagate the $\\delta$'s to obtain $\\delta_j$ for each hidden unit in the network using\n", 188 | " 1. $\\delta_j = h'(a_j) \\sum_k w_{kj}\\delta_k$\n", 189 | "4. Evaluate the derivatives using \n", 190 | " 1. $\\frac{\\partial E_n} {\\partial w_{ji}} = \\delta_j z_i$\n" 191 | ] 192 | }, 193 | { 194 | "cell_type": "markdown", 195 | "metadata": {}, 196 | "source": [ 197 | "

Parameter Optimization

\n", 198 | "Given the neural network error function derivative estimate provided by *error backpropagation*, one can use a variety of numerical optimization techniques to find appropriate parameter estimates. The simplest technique is the method of *steepest descent* where the new parameter estimate $\\mathbf{w}^{(\\tau+1)}$ is given by\n", 199 | "\n", 200 | "$\\mathbf{w}^{(\\tau+1)} = \\mathbf{w}^{(\\tau)} - \\eta \\bigtriangledown E(\\mathbf{w}^{(\\tau)})$\n", 201 | "\n", 202 | "where $\\eta>0$ is the *learning rate*. Other **more advanced** optimization algorithms inlcude *conjugate gradients* and *quasi-Newton* methods." 203 | ] 204 | }, 205 | { 206 | "cell_type": "markdown", 207 | "metadata": {}, 208 | "source": [ 209 | "

Example: Exlusive or

\n", 210 | "As a first example, let's consider the classical [Exlusive Or (XOR) problem](http://en.wikipedia.org/wiki/Exclusive_or). We will consider the XOR problem involving two inputs $x_1$ and $x_2$ where $x_i \\in \\\\{0,1\\\\}$. This problem states that the output should be 1 if exactly one of the inputs is 1 and 0 otherwise. Thus this problem has a very simple known input output relationship\n", 211 | "\n", 212 | "\n", 213 | " \n", 214 | " \n", 215 | " \n", 216 | " \n", 217 | " \n", 218 | " \n", 219 | " \n", 220 | "
$x_1$$x_2$Output
000
101
011
110
\n", 221 | "\n", 222 | "While this may be a trivial example, it illustrates all of the features of an arbitary feed-forward network while remaining simple enough to understand everything that is happening in the network. The XOR network is typically presented as having 2 input nodes, 2 hidden layer nodes, and one output nodes, for example see [here](http://www.heatonresearch.com/online/introduction-neural-networks-java-edition-2/chapter-1/page4.html). Such an arrangement however requires that the nodes in the hidden layer use a thresholding scheme whereby the output of the node is 0 or 1 depending on the sum of the input being greater than some fixed threshold value. However, such a network would violoate our model assumptions for training the network. Specifically, we could not use error backpropagation because the node activation functions would be step functions which are not differentiable. To overcome this, we will the *hyperbolic tangent*, *tanh*, as the hidden layer activation function and add a third node to the hidden layer representing a bias term. Our output layer will have a single node. We will interpret the out values as being 0 (or false) for output values less than 0 and 1 (or true) for output values greater than 0. The figure below provides a graphical representation of the network. Note, the parameters $w$ and $a$ are distinct for each layer, e.g. $w_{11}$ on the edge between $x_1$ and $h(a_1)$ is not the same $w_{11}$ on the edge from $h(a_1)$ to $\\sigma(a_1)$.\n", 223 | "\n", 224 | "\n", 225 | "\n", 226 | "\"The feedforward neural network for the 2 input Exclusive Or problem\"" 227 | ] 228 | }, 229 | { 230 | "cell_type": "code", 231 | "collapsed": false, 232 | "input": [ 233 | "import networkx as nx\n", 234 | "import sys\n", 235 | "from math import tanh\n", 236 | "sys.path.append('pysrc')\n", 237 | "import neural_networks as nn\n", 238 | "from functools import partial\n", 239 | "\n", 240 | "DG = nx.DiGraph()\n", 241 | "DG.add_nodes_from(['X1','X2','H0','H1','H2','O1'])\n", 242 | "\n", 243 | "#set the activation functions\n", 244 | "DG.node['H0']['af']=1.0\n", 245 | "for node in ['H1','H2','O1']:\n", 246 | " DG.node[node]['af']=tanh\n", 247 | " \n", 248 | "#set the derivatives of the activation functions for all nodes except the output, this is done below\n", 249 | "def zero(x):\n", 250 | " return 0\n", 251 | "for node in ['X1','X2','H0']: #the inputs and bias terms have zero derivatives\n", 252 | " DG.node[node]['daf'] = zero\n", 253 | "def dtanh(x):\n", 254 | " return 1.0 - tanh(x) * tanh(x)\n", 255 | "for node in ['H1','H2']:\n", 256 | " DG.node[node]['daf']=dtanh\n", 257 | " \n", 258 | "#create the edges\n", 259 | "for source in ['X1','X2']:\n", 260 | " for target in ['H1','H2']:\n", 261 | " DG.add_weighted_edges_from([(source,target,1.0)])\n", 262 | "for source in ['H0','H1','H2']:\n", 263 | " DG.add_weighted_edges_from([(source, 'O1', 1.0)])\n", 264 | " \n", 265 | "#set the input values\n", 266 | "DG.node['X1']['af']=0\n", 267 | "DG.node['X2']['af']=1\n", 268 | "\n", 269 | "#given these inputs, the correct output should be 1\n", 270 | "#we'll use a partial function so we can assign the correct 'daf' value\n", 271 | "#dynamically when we iteratively train the network\n", 272 | "def dout(x, t):\n", 273 | " if x<0:\n", 274 | " xx = 0\n", 275 | " else:\n", 276 | " xx = 1\n", 277 | " return xx - t\n", 278 | "DG.node['O1']['daf']=partial(dout, t=1)\n", 279 | "\n", 280 | "nn.forward_prop(DG)\n", 281 | "\n", 282 | "nn.error_back_prop(DG)\n", 283 | "\n", 284 | "for node in 
['X1','X2','H0','H1','H2','O1']:\n", 285 | " print \"node {0} has output {1} and error {2}\".format(node, DG.node[node]['o'], DG.node[node]['e'])" 286 | ], 287 | "language": "python", 288 | "metadata": {}, 289 | "outputs": [ 290 | { 291 | "output_type": "stream", 292 | "stream": "stdout", 293 | "text": [ 294 | "node X1 has output 0 and error 0.0\n", 295 | "node X2 has output 1 and error 0.0\n", 296 | "node H0 has output 1.0 and error 0.0\n", 297 | "node H1 has output 0.761594155956 and error 0.0\n", 298 | "node H2 has output 0.761594155956 and error 0.0\n", 299 | "node O1 has output 0.987217029728 and error 0.0\n" 300 | ] 301 | } 302 | ], 303 | "prompt_number": 5 304 | }, 305 | { 306 | "cell_type": "code", 307 | "collapsed": false, 308 | "input": [], 309 | "language": "python", 310 | "metadata": {}, 311 | "outputs": [] 312 | } 313 | ], 314 | "metadata": {} 315 | } 316 | ] 317 | } 318 | -------------------------------------------------------------------------------- /pysrc/ipynbs_src/02_Linear_Regression.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # 2 3 | 4 | # 5 | 6 | #

Readings

7 | # 13 | #

Regression: Given the value of a D-dimensional input vector $\mathbf{x}$, predict the value of one or more target variables

14 | #

Linear: The models discussed in this section are linear with respect to the adjustable parameters, not 15 | # necessisarily with respect to the input variables.

16 | 17 | # 18 | 19 | #

Creating A Model

20 | # In this notebook, our objective is to construct models that can predict the value of some target variable, $t$, given some 21 | # input vector, $\mathbf{x}$, where the target value can occupy any value in some space - though here we'll only consider the space of 22 | # real valued vectors. We want the models to allow for uncertainty in the accuracy of the model and/or noise on the observed data. 23 | # We also want the model to provide some information on our confidence in a given prediction. 24 | # 25 | # The first step is to contruct a mathematical model that adequately represents the observations we wish to predict. 26 | # The model we will use is described in the next two subsections. It is **important to note** that the model itself is independent 27 | # of the use of a frequentist or Bayesian viewpoint. It is *how we obtain the free parameters* of the model that is affected by using 28 | # frequentist or Bayesian approaches. However, if the model is a poor choice for a particular observation, then its predictive 29 | # capability is likely to be poor whether we use a frequentist or Bayesian approach to obtain the parameters. 30 | 31 | # 32 | 33 | #

Gaussian Noise: Model Assumption 1

34 | # We will *assume* throughout this notebook that the target variable is described by

35 | # $t = y(\mathbf{x},\mathbf{w}) + \epsilon$ 36 | #

37 | # where $y(\mathbf{x},\mathbf{w})$ is an as of yet undefined function of $\mathbf{x}$ and $\mathbf{w}$ and $\epsilon$ is a Gaussian distributed noise component. 38 | # 39 | # **Gaussian Noise?** The derivations provided below all assume Gaussian noise on the target data. Is this a good assumption? In many cases yes. The argument hinges 40 | # on the use of the [Central_Limit_Theorem](http://en.wikipedia.org/wiki/Central_limit_theorem) that basically says the the **sum** of many independent random 41 | # variables behaves behaves like a Gaussian distributed random variable. The _noise_ term in this model, $\epsilon$, can be thought of as the sum of features 42 | # not included in the model function, $y(\mathbf{x},\mathbf{w})$. Assuming these features are themselves independent random variables then the Central Limit Theorom suggests a Gaussian model 43 | # is appropriate, assuming there are many independent unaccounted for features. It is possible that there is only a small number of unaccounted for features 44 | # or that there is genuine _non-Gauisian_ noise in our observation measurements, e.g. sensor shot noise that often has a Poisson distribution. In such cases, the assumption is no longer valid. 45 | 46 | # 47 | 48 | #

General Linear Model: Model Assumption 2

49 | # In order to proceed, we need to define a model for $y(\mathbf{x},\mathbf{w})$. We will use the *general linear regression* model defined as follows

50 | # $y(\mathbf{x},\mathbf{w}) = \sum_{j=0}^{M-1} w_j\phi_j(\mathbf{x}) = \mathbf{w}^T\mathbf{\phi}(\mathbf{x})$

51 | # where $\mathbf{x}$ is a $D$ dimensional input vector, $M$ is the number of free parameters in the model, $\mathbf{w}$ is a column 52 | # vector of the free parameters, and 53 | # $\phi(\mathbf{x}) = \\{\phi_0(\mathbf{x}),\phi_1(\mathbf{x}), \ldots,\phi_{M-1}(\mathbf{x})\\}$ with $\phi_0(\mathbf{x})=1$ is a set of basis functions where 54 | # each $\phi_i$ is in the real valued function space 55 | # $\\{f \in \mathbf{R}^D\Rightarrow\mathbf{R}^1\\}$. It is important to note that the set of basis functions, $\phi$, need 56 | # not be linear with respect to $\mathbf{x}$. Further, note that this model defines an entire class of models. In order to 57 | # contruct an actual predictive model for some observable quantity, we will have to make a further assumption on the choice of the 58 | # set of basis functions, $\phi$. However, for the purposes of deriving general results, we can delay this choice. 59 | # 60 | # Note that that $\mathbf{w}^T$ is an $1 \times M$ vector and that $\mathbf{\phi}(\mathbf{x})$ is a $M \times 1$ vector so that the target, $y$ 61 | # is a scalar. This will be exteneded to $K$ dimensional target variables below. 62 | # 63 | # 64 | 65 | # 66 | 67 | #

Frequentist View: Maximum Likelihood

68 | # Let's now embark on the path of obtaining the free parameters, $\mathbf{w}$, of our model. We will begin using a *frequentist*, or 69 | # *maximum likelihood*, approach. This approach assumes that we first obtain observation training data, $\mathbf{t}$, and that the *best* 70 | # value of $\mathbf{w}$, is that which maximizes the likelihood function, $p(\mathbf{t}|\mathbf{w})$. 71 | # 72 | #

Under the Gaussian noise condition it can be shown that the maximum likelihood function for the training data is

73 | # 74 | # $p(\mathbf{t}|\mathbf{X},\mathbf{w},\sigma^2) = \prod_{n=1}^N ND(t_n|\mathbf{w}^T\phi(\mathbf{x}_n),\sigma^2)$

75 | # 76 | # $=\frac{N}{2}\ln\frac{1}{\sigma^2} -\frac{N}{2}\ln(2\pi) - \frac{1}{2\sigma^2}\sum_{n=1}^N 77 | # \{t_n -\mathbf{w}^T\phi(\mathbf{x}_n)\}^2$

78 | # 79 | # where $\mathbf{X}=\{\mathbf{x}_1,\ldots,\mathbf{x}_N\}$ is the input value set for the corresponding $N$ oberved output values contained in the vector 80 | # $\mathbf{t}$, and $ND(\mu,\sigma^2)$ is the Normal Distribution (Gaussian). (I used ND instead of the standard N to avoid confusion 81 | # with the product limit). 82 | # 83 | # Taking the logarithm of the maximum likelihood and setting the derivative with respect to $\mathbf{w}$ equal to zero, one can obtain 84 | # the maximum likelikhood parameters given by the normal equations:

85 | # $\mathbf{w}_{ML} = \left(\mathbf{\Phi}^T\mathbf{\Phi}\right)^{-1}\mathbf{\Phi}^T\mathbf{t}$

86 | # where $\Phi$ is the $N \times M$ design matrix with elements $\Phi_{n,j}=\phi_j(\mathbf{x}_n)$, and $\mathbf{t}$ is the $N \times K$ 87 | # matrix of training set target values (for $K=1$, it is simply a column vector). Note that $\mathbf{\Phi}^T$ is a $M \times N$ matrix, so that $\mathbf{w}_{ML}=\left(\mathbf{\Phi}^T \mathbf{\Phi}\right)^{-1}\mathbf{\Phi}^T\mathbf{t}$ is 88 | # $(M \times N)\times(N \times M)\times(M\times N)\times(N \times K) = M \times K$, where $M$ is the number of free parameters and $K$ is the number of predicted 89 | # target values for a given input.
90 | #

91 | # 92 | # Note that the only term in the likelihood function that depends on $\mathbf{w}$ is the last term. Thus, maximizing the likelihood 93 | # function with respect to $\mathbf{w}$ __under the assumption of Gaussian noise__ is equivalent to minimizing a 94 | # sum-of-squares error function. 95 | # 96 | #

97 | # The quantity, $\mathbf{\Phi}^\dagger=\left(\mathbf{\Phi}^T\mathbf{\Phi}\right)^{-1}\mathbf{\Phi}^T$ is known as the 98 | # Moore-Penrose pseudo-inverse of $\Phi$. When $\Phi^T\Phi$ is invertible, the pseudo-inverse is 99 | # equivalent to the inverse. When this condition fails, the pseudo-inverse can be found with techniques such as singular value decomposition. 100 | #

101 | 102 | # 103 | 104 | #

Example 1

105 | #

(a) Linear Data

106 | #

Let's generate data of the form $y = m*x + b + \epsilon $ where $\epsilon$ is a random Gaussian component with zero mean. Given this data, let's apply the maximum likelihood 107 | # solution to find values for the parameters $m$ and $b$. Given that we know our data is linear, we choose basis functions $\phi_0(x)=1$ and $\phi_1(x)=x$. Thus, 108 | # our model will be $y=\theta_0\phi_0(x) + \theta_1\phi_1(x)$, where presumably the solution should yield $\theta_0 \approx b$ and $\theta_1 \approx 109 | # m$ 110 | #

111 | 112 | # 113 | 114 | import numpy as np 115 | from matplotlib import pyplot as plt 116 | #in order to compare between examples, set a seed in random 117 | seed = 123456789 118 | np.random.seed(seed) 119 | def y(x,m,b,mu=0,sigma=1): return m*x + b + np.random.normal(mu,sigma,1)[0] 120 | #training data, with N data points 121 | N = 101 122 | M = 2 123 | t = np.empty(N) 124 | domain_bound = 1.0/N 125 | domain = np.empty(N) 126 | for i in range(N): domain[i] = i*domain_bound 127 | for i in range(N): t[i] = y(x=domain[i],m=4.89,b=0.57) 128 | #design matrix, phi, N X M 129 | phi = np.array([np.ones(N),domain]).T 130 | #find the solution 131 | #in this case case phi.T X phi is invertible so do the folloing: 132 | temp1 = np.linalg.inv(np.dot(phi.T,phi)) #inverse of phi.T X phi 133 | temp2 = np.dot(temp1, phi.T) 134 | w1 = np.dot(temp2,t) #solution 135 | print 'w1=',w1 136 | #assuming that phi.T X phi was not invertible we could find the pseudo inverse using the pinv function 137 | #we expect to obtain the same solution 138 | phi_pi = np.linalg.pinv(phi) 139 | w2 = np.dot(phi_pi,t) 140 | print 'w2=',w2 141 | #compute the model predicted values for the training data domain 142 | predicted_t = [w2[0]+w2[1]*x for x in domain] 143 | plt.plot(domain,t) 144 | plt.plot(domain,predicted_t) 145 | plt.show() 146 | 147 | # 148 | 149 | #

(b) Trigonometric Data

150 | #

One common misconception regarding linear regression is that the basis functions are required to be linear. This is not the case. Indeed, the basis 151 | # functions need not even be polynomials. They must be linearly independent, i.e. orthoganol. It is only the dependence on the model parameters, $\mathbf{w}_{ML}$, that must be linear. 152 | # Note that an example of a nonlinear parameter model would be $y=\exp(a)\sin(x)$ where $a$ is the free parameter. 153 | # Let's generate trigonometric data of the form $y = a + b\sin(x) + c\cos(x) + \epsilon $ where again $\epsilon$ is a random Gaussian component with zero mean. 154 | # Here we chose basis functions $\phi_0=1$, $\phi_1(x)=sin(x)$ and $\phi_2(x)=cos(x)$. If you're wondering if we're cheating a bit here, the answer is yes. 155 | # In reality, we may not know ahead of time what the appropriate basis functions for the observed data should be. The appropraite choice may be suggested 156 | # by the data, knowledge of the problem, and other machine learing techniques (hopefully to be discussed later). In this example, our 157 | # our model will be $y=\theta_0\phi_0(x) + \theta_1\phi_1(x) + + \theta_2\phi_2(x)$, where presumabely the solution should yield $\theta_0 \approx a$, 158 | # $\theta_1 \approx b$ and $\theta_2 \approx c$. 159 | #

160 | 161 | # 162 | 163 | import numpy as np 164 | import math 165 | from matplotlib import pyplot as plt 166 | #in order to compare between examples, set a seed in random 167 | seed = 123456789 168 | np.random.seed(seed) 169 | def y(x,a,b,c,mu=0,sigma=1): return a + b*math.sin(x) + c*math.cos(x) + np.random.normal(mu,sigma,1)[0] 170 | #training data, with N data points 171 | N = 101 172 | M = 3 173 | t = np.empty(N) 174 | domain = np.empty(N) 175 | domain_bound = 4.0*math.pi/N 176 | for i in range(N): domain[i] = i * domain_bound 177 | for i in range (N): t[i] = y(x=domain[i],a=1.85,b=0.57,c=4.37) 178 | #design matrix, phi, N X M 179 | c1 = [math.sin(x) for x in domain] 180 | c2 = [math.cos(x) for x in domain] 181 | phi = np.array([np.ones(N),c1,c2]).T 182 | #find the solution 183 | #in this case case phi.T X phi is invertible so do the folloing: 184 | temp1 = np.linalg.inv(np.dot(phi.T,phi)) #inverse of phi.T X phi 185 | temp2 = np.dot(temp1, phi.T) 186 | w1 = np.dot(temp2,t) #solution 187 | print 'w1=',w1 188 | #assuming that phi.T X phi was not invertible we could find the pseudo inverse using the pinv function 189 | #we expect to obtain the same solution 190 | phi_pi = np.linalg.pinv(phi) 191 | w2 = np.dot(phi_pi,t) 192 | print 'w2=',w2 193 | #compute the model predicted values for the training data domain 194 | predicted_t = [w2[0]+w2[1]*math.sin(x)+w2[2]*math.cos(x) for x in domain] 195 | plt.plot(domain,t) 196 | plt.plot(domain, predicted_t) 197 | plt.show() 198 | 199 | # 200 | 201 | #

Stochastic Gradient Descent

202 | #

In cases where the training data set is very large or data is received in a stream, a direct solution using the normal equations 203 | # may not be possible. An alternative approach is the stochastic gradient descent algorithm. If the total error function, $TE$, is the sum of 204 | # a given error function, $E$, evaluated at each of the $N$ trianing inputs, $TE = \sum_{i=1}^N E(\mathbf{x}_i)$ then the stochastic gradient descent 205 | # algorithm is

206 | # $\mathbf{w}^{\tau + 1} = \mathbf{w}^\tau - \eta \bigtriangledown E_\tau$

207 | # where ${\tau}$ is the iteration number and $\eta$ is a learning rate parameter. For this type of total error function, the order of evaluation 208 | # does not change the result. If the error function is the sum-of-squares function, then the 209 | # algorithm is

210 | # $\mathbf{w}^{\tau + 1} = \mathbf{w}^\tau + \eta \left(t_n - \mathbf{w}^{(\tau)T}\phi_n\right)\phi_n$ 211 | #

212 | 213 | # 214 | 215 | #

Example 2

216 | # Repeat the evaluation in example 1(b) using the gradient descent approach. Is the answer the same when using the same training data? 217 | 218 | # 219 | 220 | import numpy as np 221 | import math 222 | from matplotlib import pyplot as plt 223 | #in order to compare between examples, set a seed in random 224 | seed = 123456789 225 | np.random.seed(seed) 226 | def y(x,a,b,c,mu=0,sigma=1): return a + b*math.sin(x) + c*math.cos(x) + np.random.normal(mu,sigma,1)[0] 227 | N = 101 228 | M = 3 229 | w = np.zeros((M,1)) 230 | phi = np.empty((M,1)) 231 | eta = 0.25 232 | #create arrays to store the values as they are generated so they can be plotted at the end 233 | x = np.empty(N) 234 | t = np.empty(N) 235 | domain_bound = 4.0*math.pi/N 236 | for i in range(N): 237 | x[i] = i * domain_bound 238 | t[i] = y(x[i],a=1.85,b=0.57,c=4.37) 239 | phi = np.array([[1],[math.sin(x[i])],[math.cos(x[i])]]) 240 | w = w + eta*(t[i]-np.dot(w.T,phi))*phi #the learning model 241 | 242 | print w.T 243 | #compute the model predicted values for the training data domain 244 | predicted_t = [w[0]+w[1]*math.sin(item)+w[2]*math.cos(item) for item in x] 245 | plt.plot(x,t) 246 | plt.plot(x,predicted_t) 247 | plt.show() 248 | 249 | # 250 | 251 | #

Over and Under Fitting

252 | #

This section describes some of the techniques used to account for over fitting or under fitting a given model when using a 253 | # maximum likelihood approach.

254 | #

Regularized Least Squares

255 | #

Regularization attempts to address the problem of overfitting a given model to the training data and/or eliminating parameters that are not important 256 | # based on the data. The general total error function with a regularization term is given by

257 | # $E_D(\mathbf{w}) + \lambda E_W(\mathbf{w})$

258 | # where $\lambda$ is the regularization coefficient and $E_W$ is the regularization term. A commonly used regularization term is the sum-of-squares of the 259 | # model parameter elements

260 | # $E_W(\mathbf{w}) = \frac1{2}\mathbf{w}^T\mathbf{w}$

261 | # known as the weight decay regularizer. This regularization terms leads to the optimal solution, assuming a linear regression model with 262 | # Gaussian noise on the training data, of

263 | # $\mathbf{w} = \left(\lambda \mathbf{I} + \mathbf{\Phi}^T \mathbf{\Phi}\right)^{-1} \mathbf{\Phi}^T\mathbf{t}$ 264 | #

265 | 266 | # 267 | 268 | #

Example 4

269 | #

In this example, we use the same training data as example 1(a), except that here, the model is erroneously chosen to be a 7th order polynomial. 270 | # This example is somewhat contrived, but it illustrates the point that an overfit model can be corrected to some extend using regularization.

271 | 272 | # 273 | 274 | import numpy as np 275 | from matplotlib import pyplot as plt 276 | #in order to compare between examples, set a seed in random 277 | seed = 123456789 278 | np.random.seed(seed) 279 | def y(x,m,b,mu=0,sigma=1): return m*x + b + np.random.normal(mu,sigma,1)[0] 280 | def el_pow(x,pow): 281 | temp = x 282 | for i in range(pow-1): 283 | temp = temp * x 284 | return temp 285 | def prediction(params, x): 286 | pred = 0 287 | for i in range(len(params)):pred += params[i]*math.pow(x,i) 288 | return pred 289 | #training data, with N data points 290 | N = 101 291 | M = 8 292 | t = np.empty(N) 293 | domain = np.empty(N) 294 | domain_bound = 1.0/N 295 | for i in range(N): domain[i] = i*domain_bound 296 | for i in range(N): t[i] = y(x=domain[i],m=4.89,b=0.57) 297 | #find the solution without using regularization 298 | #design matrix, phi, N X M 299 | phi = np.array([np.ones(N),domain, el_pow(domain,2),el_pow(domain,3),el_pow(domain,4),el_pow(domain,5),el_pow(domain,6),el_pow(domain,7)]).T 300 | temp1 = np.linalg.inv(np.dot(phi.T,phi)) #inverse of phi.T X phi 301 | temp2 = np.dot(temp1, phi.T) 302 | w1 = np.dot(temp2,t) #solution 303 | print 'w1=',w1 304 | predicted_t = [prediction(w1,x) for x in domain] 305 | 306 | #find the regularized solution 307 | lam = 0.1 308 | temp1 = np.linalg.inv(lam*np.eye(M)+np.dot(phi.T,phi)) 309 | temp2 = np.dot(temp1,phi.T) 310 | w2 = np.dot(temp2,t) 311 | print 'w2=',w2 312 | predicted_t_reg = [prediction(w2,x) for x in domain] 313 | 314 | #add some plots 315 | plt.plot(domain,t) 316 | plt.plot(domain,predicted_t) 317 | plt.plot(domain,predicted_t_reg) 318 | plt.legend(("training","un-regularized","regularized"), loc='lower right') 319 | plt.show() 320 | 321 | # 322 | 323 | #

Locally Weighted Linear Regression

324 | #

In attempting to predict a target value, $y$, for some new input $x$ and given a training set $\mathbf{t}$, locally weighted linear regression weights 325 | # training inputs relative to their distance from the new input $x$. The adjusted regression model is one of finding $\theta$ to minimize

326 | # $\sum_i w^{(i)} \left(y^{(i)} - \theta^Tx^{(i)}\right)$

327 | # where

328 | #

329 | #

330 | # $w^{(i)} = \exp \left( -\frac{\left( x^{(i)} - x \right)^2}{\left( 2\tau^2\right)} \right)$

331 | # The parameter $\tau$, called the **bandwidth** controls how the weight of a training input, $x^{(i)}$, falls off with distance from the new input, $x$. Also, notice that the weights, 332 | # $w^{(i)}$, require that the training data is available for each evaluation of a new input $x$, making it a so-called non-parametric model. 333 | # 334 | # The solution is obtained as

335 | # $\mathbf{w}_{ML} = \left(\mathbf{\Phi}^T\mathbf{W}\mathbf{\Phi}\right)^{-1}\mathbf{\Phi}^T\mathbf{W}\mathbf{t}$

336 | # where $\mathbf{W}$ is the $N \times N$ weighting matrix with elements $w_{i,j} = \left[w^{(i)}\right]^2$ for $i=j$ and 0 otherwise. 337 | #

338 | #

Example 5

339 | # Here we apply locally weighted regression to training data that is generated from a second order polynomial with Gaussian 340 | # noise. 341 | 342 | # 343 | 344 | import numpy as np 345 | from matplotlib import pyplot as plt 346 | #in order to compare between examples, set a seed in random 347 | seed = 123456789 348 | np.random.seed(seed) 349 | def y(x,coefs,inc_ran = True,mu=0,sigma=0.2): 350 | ans = 0 351 | for i in range(len(coefs)): ans += coefs[i]*math.pow(x,i) 352 | if inc_ran: return ans + np.random.normal(mu,sigma,1)[0] 353 | else: return ans 354 | #training data, with N = 101 data points 355 | N = 101 356 | M = 2 357 | t = np.empty(N) 358 | domain = np.empty(N) 359 | domain_bound = 3.0/N 360 | for i in range(N): domain[i] = i*domain_bound 361 | for i in range(N): t[i] = y(x=domain[i],coefs=[1.75, 0.25, -1.0]) 362 | 363 | #first find the standard linear regression fit using model y = phi_0 + phi_1 * x 364 | #design matrix, phi, N X M 365 | phi = np.array([np.ones(N),domain]).T 366 | temp1 = np.linalg.inv(np.dot(phi.T,phi)) #inverse of phi.T X phi 367 | temp2 = np.dot(temp1, phi.T) 368 | w1 = np.dot(temp2,t) #solution 369 | predicted_t = [y(x,w1,inc_ran=False) for x in domain] 370 | 371 | #now construct the weighted locally weighted solution 372 | def wt(x,x_i,tau=1): return math.exp(-(x_i-x)*(x_i-x)/(2*tau*tau)) 373 | def lws(x, wt_func, td_x,td_y, Phi): 374 | _M = Phi.shape[1] 375 | _N = Phi.shape[0] 376 | _W = np.zeros((_N,_N)) 377 | for i in range(_N): _W[i,i] = wt_func(x,td_x[i]) 378 | temp1 = np.linalg.inv(np.dot(np.dot(Phi.T,_W),Phi)) #inverse of phi.T X phi 379 | temp2 = np.dot(np.dot(temp1,Phi.T),_W) 380 | w1 = np.dot(temp2,t) #local solution for parameters 381 | return y(x,w1,inc_ran=False) 382 | predicted_local = [lws(x, wt, domain, t, phi) for x in domain] 383 | plt.plot(domain,t) 384 | plt.plot(domain,predicted_t) 385 | plt.plot(domain,predicted_local) 386 | plt.legend(('training','regression', 'local regression'),loc='lower left') 387 | 388 | # 389 | 390 | #

Bayesian View

391 | # Here we consider a fully Bayesian approach to the regression problem. First note, that our fundamental **model** assumptions still hold. Namely, 392 | # we assume that the observation we are trying to predict is modeled by

393 | # $t = y(\mathbf{x},\mathbf{w}) + \epsilon$ 394 | #

395 | # where $\epsilon$ is a Gaussian distributed noise component. Additionally, the assumption of a linear dependence on the model parameters, $\mathbf{w}$ also still holds:

396 | # $y(\mathbf{x},\mathbf{w}) = \sum_{j=0}^{M-1} w_j\phi_j(\mathbf{x}) = \mathbf{w}^T\mathbf{\phi}(\mathbf{x})$

397 | # 398 | # Recall that the Bayesian approach implies that the *best* model parameters, $\mathbf{w}$, are those that maximize the **posterior** probability, which from Bayes' Theorem is given by:

399 | # $p(\mathbf{w}|\mathbf{t}) = \frac{p(\mathbf{t}|\mathbf{w})p(\mathbf{w})}{p(\mathbf{t})}$

400 | # 401 | #

Prior Model

402 | # The first step in formulating a Bayesian model is to construct a model for the *prior* probability model, $p(\mathbf{w})$. **NOTE:** In general any probability distribution model could be chosen to 403 | # model the prior. The appropriate choice is dependent on different factors including prior knowledge of the problem and mathematical convenience. Often, the prior is chosen to be the *conjugate prior* of the 404 | # likelihood function. This is a choice of mathematical convenience becuase it implies that the *posterior* can be derived analytically. It is often a reasonable choice, as is this case here for 405 | # the problem of linear regression. Thus, given the Gaussian distribution used for the likelihood function above, we assume a Gaussian distribution for our *prior* of the form:

406 | # $p(\mathbf{w}) = ND(\mathbf{w}|\mathbf{m}_0,\mathbf{S}_0)$

407 | # where $\mathbf{m}_0$ is the prior mean and $\mathbf{S}_0$ is the prior covariance. We will *assume* $\mathbf{m}_0 = 0$ and an infinitely broad prior so that $\mathbf{S}_0=\alpha^{-1}\mathbf{I}$ where $\alpha$ 408 | # is a precision parameter that we will have to choose. Given these choices of *prior* and *likelihood* functions the *posterior* probability is given by

409 | # $p(\mathbf{w}|\mathbf{t}) = N\left(\mathbf{w}|\mathbf{m}_N, \mathbf{S}_N\right)$

410 | # where

411 | # $\mathbf{m}_N = \beta \mathbf{S}_N \mathbf{\Phi}^T \mathbf{t}$

412 | # $\mathbf{S}_N^{-1} = \alpha\mathbf{I} + \beta \mathbf{\Phi}^T\mathbf{\Phi}$ 413 | # 414 | # where $\beta = 1/\sigma^2$ is the inverse variance of the random noise component associated with the target variable, $t = y(\mathbf{x},\mathbf{w}) + ND(0, \sigma^2)$. 415 | # Finally, the log of the *posterior* is seen to be the sum of the log likelihood **and** the log of the prior:

416 | # $\ln p(\mathbf{w}|\mathbf{t}) = -\frac{\beta}{2}\left[\sum_{n=1}^N\\{t_n - \mathbf{w}^T \phi(\mathbf{x}_n)\\}^2\right] - \frac{\alpha}{2}\mathbf{w}^T\mathbf{w} + constant$

417 | # 418 | # **NOTE:** Maximizing the posterior function with respect to $\mathbf{w}$ __under the assumption of Gaussian noise and a Gaussian prior__ is equivalent to the least-squares error solution 419 | # with the addition of a regulariztion term $\lambda = \alpha/\beta$. 420 | # 421 | # Rather than find a point estimate for $\mathbf{w}$ by maximizing the posterior and thereby make point predictions for the target variable $t$, it is more instructive to use the posterior 422 | # to formulate a *predictive distribution* for $t$. For **our model assumptions** this is given by

423 | # $p(t|\mathbf{t},\alpha,\beta) = \int p(t|\mathbf{w},\beta) p(\mathbf{w}|\mathbf{t}, \alpha, \beta) d\mathbf{w} = 424 | # ND(t|\mathbf{m}_N^T\phi(\mathbf{x}), \frac{1}{\beta} + \phi(\mathbf{x})^T \mathbf{S}_N \phi(\mathbf{x})$

425 | # where a point estimate of $t$ is given my the mean $\mu = \mathbf{m}_N^T\phi(\mathbf{x})$ and an estimate of the uncertainty is given by the standard deviation 426 | # $\sigma_N^2(\mathbf{x}) = \frac{1}{\beta} + \phi(\mathbf{x})^T \mathbf{S}_N \phi(\mathbf{x})$ 427 | # 428 | # There is one final issue with completing the Bayesian model, namely the determination of $\alpha$ and $\beta$. This can be done in a fully Bayesian manner by developing prior models but 429 | # this tends to make the equations intractible. Instead the so called *evidence function* approach is used. See section 3.5 of Bishop for more detail. 430 | 431 | # 432 | 433 | #

Example 6

434 | # Use the data provided in example 5 to formulate a Bayesian model. Here we know that $\beta = (1/0.2) = 5$. For this example, we will simply choose $\alpha = .4$ so that $\alpha/\beta = 0.08$ (See Bishop 3.5 for methods 435 | # on determining $\alpha$ and $\beta$. 436 | 437 | # 438 | 439 | import numpy as np 440 | from matplotlib import pyplot as plt 441 | #in order to compare between examples, set a seed in random 442 | seed = 123456789 443 | np.random.seed(seed) 444 | alpha = 0.4 445 | beta = 5.0 446 | def y(x,coefs, mu=0, sigma=1.0/beta): 447 | ans = 0 448 | for i in range(len(coefs)): ans += coefs[i]*math.pow(x,i) 449 | return ans + np.random.normal(mu,sigma,1)[0] 450 | 451 | #training data, with N = 101 data points 452 | N = 101 453 | M = 4 454 | t = np.empty(N) 455 | domain = np.empty(N) 456 | domain_bound = 3.0/N 457 | for i in range(N): domain[i] = i*domain_bound 458 | for i in range(N): t[i] = y(x=domain[i],coefs=[1.75, 0.25, -1.0]) 459 | 460 | #Let's assume that we want to fit a 3rd order polynomial to the data even though we know its a second order 461 | #polynomial. Given the Bayesain approach, we should see the so that the unecessary terms are damped out. We have 462 | #y = phi_0 + phi_1 * x + phi_2 x^2 + phi_3 x^4 463 | #design matrix, phi, N X M where N = 101 and M = 4 464 | d2 = domain * domain 465 | phi = np.array([np.ones(N),domain, d2, d2 * domain]).T 466 | alphaI = alpha * np.eye(M) 467 | SN = np.linalg.inv(alphaI + beta * np.dot(phi.T,phi)) #posterior variance 468 | mN = beta * np.dot(np.dot(SN, phi.T), t) 469 | point_estimates = [np.dot(mN, phi[i]) for i in range(N)] 470 | uncertain_t = [1.0/beta + np.dot(np.dot(phi[i].T, SN), phi[i]) for i in range(N)] 471 | plt.plot(domain,t) 472 | plt.errorbar(domain,point_estimates, uncertain_t, ecolor = "red") 473 | plt.legend(('training','Bayes'),loc='lower left') 474 | 475 | # 476 | 477 | 478 | -------------------------------------------------------------------------------- /05_Decision_Trees.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "metadata": { 3 | "name": "05_Decision_Trees" 4 | }, 5 | "nbformat": 3, 6 | "nbformat_minor": 0, 7 | "worksheets": [ 8 | { 9 | "cells": [ 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "

Decision Trees

\n", 15 | "

Introduction

\n", 16 | "Decision trees divide the input space into some number of cuboid regions. **Every** input within a given region is assigned the *same output*. A decision tree model can be represented by a binary tree where \n", 17 | "\n", 18 | "* Each tree node tests one attribute from the input vector $X$\n", 19 | "\n", 20 | "* Each tree branch selects one value (or range) for the given attribute\n", 21 | "\n", 22 | "* Each leaf node predicts an output, Y\n", 23 | "\n", 24 | "Decision tree models cab be applied to both regression and classification problems, however the focus here will be on classification trees. Note also, that decision trees can represent **any** boolean function. The tree will have $2^N$ leaf nodes where $N$ is the number of boolean variables, however there are $2^{2^N}$ possible trees given $N$ variables. For other decision trees (i.e. for non-Boolean variables) the optimal number of leaves is unknown and the input vector selection attribute may be repeated at multiple nodes. \n", 25 | "\n", 26 | "The algorithms presented here are the so-called *CART* (classification and regression trees) methods. Another popular method, is the C5.0 (formally ID3) method presented by Quinlan. The resulting tree models will be binary decision trees (i.e. there are exactly 2 edges from each node except for leaf nodes)" 27 | ] 28 | }, 29 | { 30 | "cell_type": "markdown", 31 | "metadata": {}, 32 | "source": [ 33 | "

Classification Trees

\n", 34 | "Assume our training data consists of $N$ input vectors each with $p$ elements and an associated single output from a set of $K$ classes, i.e. each input is of the form $X_i = \\\\{x_{i1}, x_{i2}, \\ldots , x_{ip}\\\\}$ and with an associated output $y_i \\in \\\\{c_1, c_2, \\ldots, c_K\\\\}$ where $i \\in \\\\{1 \\ldots N\\\\}$. Assume the decision tree model has $M$ leaf nodes, i.e. the input space is divided into $M$ regions, denoted $R_m$. Every input in the region $R_m$ will be assigned to class $k(m)$ where \n", 35 | "\n", 36 | "$k(m) = \\arg\\max_k p^e_{mk} = \\arg\\max_k \\frac{1}{N_m} \\sum_{x_i \\in R_m} I(y_i = k)$\n", 37 | "\n", 38 | "where $I$ returns 1 if $y_i=k$ and 0 otherwise, $p^e_{mk}$ is the estimate of the probability of an observation in $R_m$ being in class $k$ given the training data, and \n", 39 | "\n", 40 | "$N_m = num\\\\{x_i \\in R_m\\\\}$\n", 41 | "\n", 42 | "is the number of inputs from the training set in region $R_m$. NOTE: I would prefer to use $\\hat{p}$ instead of $p^e$ but there appears to be a bug in IPython Notebook." 43 | ] 44 | }, 45 | { 46 | "cell_type": "markdown", 47 | "metadata": {}, 48 | "source": [ 49 | "
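As a small illustration of these estimates (hypothetical helper, not part of the repo's `decision_trees` module), the class proportions $p^e_{mk}$ and the assigned class $k(m)$ for a single region can be computed directly from the training outputs that fall in that region:

```python
def region_class_estimates(region_outputs, classes):
    """Return (p_e, k_m): per-class probability estimates for a region and its majority class k(m)."""
    N_m = float(len(region_outputs))
    # p_e[k] = (1/N_m) * sum of I(y_i == k) over the training inputs that landed in the region
    p_e = dict((k, sum(1 for y in region_outputs if y == k) / N_m) for k in classes)
    k_m = max(classes, key=lambda k: p_e[k])  # k(m) = argmax_k p_e[k]
    return p_e, k_m

# Toy region holding 5 training outputs from the classes {0, 1}
p_e, k_m = region_class_estimates([0, 1, 1, 1, 0], classes=[0, 1])
print(p_e)  # {0: 0.4, 1: 0.6}
print(k_m)  # 1
```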

Growing The Tree

\n", 50 | "The standard approach to creating a decision tree model is to use a greedy algorithm. The tree is started with a single root node that splits the data into two regions based on the selection of a single input feature, $j$, and an associated splitting value, $s$. That is, $j$ is the choice of feature and $s$ is the value of that feature used to split the training data. From there, new nodes are added by successively repeating this process for the training data subregions resulting from the previous split. This process is continued until some stopping criterion is reached. Once growing stops, tree pruning is applied as a means to prevent overfitting the data. \n", 51 | "
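The greedy procedure can be sketched as a short recursion. This is only a sketch under the notation above; the implementation actually used in the examples lives in `pysrc/decision_trees.py`, and `choose_split` here stands in for a split-selection routine such as the one sketched under the Split Criteria discussion below.

```python
def grow(inputs, outputs, choose_split, min_region_size):
    """Recursively partition the training data; stop once a region holds at most min_region_size points."""
    if len(outputs) <= min_region_size or len(set(outputs)) == 1:
        # Leaf node: predict the majority class of the region
        return {'predict': max(set(outputs), key=outputs.count)}
    j, s = choose_split(inputs, outputs)  # feature index j and split value s for this node
    left = [i for i, x in enumerate(inputs) if x[j] <= s]
    right = [i for i, x in enumerate(inputs) if x[j] > s]
    if not left or not right:             # degenerate split: fall back to a leaf
        return {'predict': max(set(outputs), key=outputs.count)}
    return {'feature': j, 'split': s,
            'left': grow([inputs[i] for i in left], [outputs[i] for i in left], choose_split, min_region_size),
            'right': grow([inputs[i] for i in right], [outputs[i] for i in right], choose_split, min_region_size)}
```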

\n", 52 | "\n", 53 | "

Split Criteria

\n", 54 | "The first task is to define how to select the split variable, $j$, and value $s$. This is done by minimizing an impurity measure, $Q_m(T)$, summed over the two nodes resulting from the split. Some of the more common impurity measures for classification problems are\n", 55 | "\n", 56 | "Misclassification error: $\\frac{1}{N_m} \\sum_{i\\in R_m} I \\left( y_i \\ne k(m) \\right) = 1 - p^e_{mk(m)}$\n", 57 | "\n", 58 | "Gini Index: $\\sum_{k \\ne k'} p^e_{mk} p^e_{mk'} = \\sum_{k=1}^{K} p^e_{mk} \\left(1 - p^e_{mk} \\right)$\n", 59 | "\n", 60 | "Cross-entropy: $- \\sum_{k=1}^K p^e_{mk} \\log p^e_{mk}$\n", 61 | "\n", 62 | "Given a choice of $Q_m$, it is a straightforward matter of iterating over each feature, $j$, and split value, $s$, to determine the optimal choice. Note that even if a feature is continuous, implying an infinite space of possible split values, the training data set, from which $s$ must be selected, is finite.\n", 63 | "\n", 64 | "The stopping criterion is generally chosen as some number, $D$, of training values associated with the leaf nodes. That is, if any region $R_m$ contains more than $D$ inputs from the training data, the splitting process continues.\n", 65 | "
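For concreteness, the three impurity measures and the exhaustive search over $(j, s)$ can be written out as below. This is a sketch under the same notation (the helper names are hypothetical); `p_e` is the list of class proportions in a candidate node.

```python
import math

def class_proportions(outputs, classes):
    n = float(len(outputs))
    return [sum(1 for y in outputs if y == k) / n for k in classes]

def misclassification(p_e):
    return 1.0 - max(p_e)

def gini(p_e):
    return sum(p * (1.0 - p) for p in p_e)

def cross_entropy(p_e):
    return -sum(p * math.log(p) for p in p_e if p > 0)

def best_split(inputs, outputs, classes, impurity=gini):
    """Brute-force search over every feature j and every observed value s, minimizing the
    impurity of the two resulting nodes weighted by their sizes."""
    best = None
    for j in range(len(inputs[0])):
        for s in set(x[j] for x in inputs):  # the candidate split values are finite
            left = [y for x, y in zip(inputs, outputs) if x[j] <= s]
            right = [y for x, y in zip(inputs, outputs) if x[j] > s]
            if not left or not right:
                continue
            cost = (len(left) * impurity(class_proportions(left, classes)) +
                    len(right) * impurity(class_proportions(right, classes)))
            if best is None or cost < best[0]:
                best = (cost, j, s)
    return best  # (cost, feature index j, split value s), or None if no split is possible
```

Swapping `impurity=cross_entropy` or `impurity=misclassification` into `best_split` leaves the search itself unchanged.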

" 66 | ] 67 | }, 68 | { 69 | "cell_type": "markdown", 70 | "metadata": {}, 71 | "source": [ 72 | "**TODO** \n", 73 | "

Pruning

\n", 74 | "Given a starting tree, $T_0$, pruning is done by minimizing the cost complexity function\n", 75 | "\n", 76 | "$C_{\\lambda}(T) = \\sum_{m=1}^{|T|} N_m Q_m(T) + \\lambda |T|$\n", 77 | "\n", 78 | "where $T \\subseteq T_0$ is a subtree obtained by pruning $T_0$, $|T|$ is the number of leaf nodes in $T$, and $\\lambda$ is a tuning parameter.\n", 79 | "\n", 80 | "Reduced-Error Pruning: (1) split the data into a training and a validation set; (2) create a tree that classifies the training data; (3) until further pruning is harmful, evaluate the impact on the validation data of pruning each possible node and greedily remove the node that most improves the validation accuracy.\n", 81 | "\n", 82 | "Rule Post-Pruning: (1) convert the tree to an equivalent set of rules; (2) prune each rule independently of the others; (3) sort the final rules into the desired sequence for use." 83 | ] 84 | }, 85 | { 86 | "cell_type": "markdown", 87 | "metadata": {}, 88 | "source": [ 89 | "
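A short sketch of the cost complexity criterion itself (hypothetical helper; it assumes each leaf of $T$ is summarized by its training count $N_m$ and impurity $Q_m$):

```python
def cost_complexity(leaves, lam):
    """C_lambda(T) = sum_m N_m * Q_m(T) + lambda * |T|, with leaves given as (N_m, Q_m) pairs."""
    return sum(n_m * q_m for n_m, q_m in leaves) + lam * len(leaves)

# A tree with three leaf nodes; a larger lambda penalizes the larger tree more heavily.
leaves = [(40, 0.10), (35, 0.05), (25, 0.20)]
print(cost_complexity(leaves, lam=0.0))  # 10.75
print(cost_complexity(leaves, lam=2.0))  # 16.75
```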

Example 1

\n", 90 | "Uses artificially generated data in 4 regions." 91 | ] 92 | }, 93 | { 94 | "cell_type": "code", 95 | "collapsed": false, 96 | "input": [ 97 | "sys.path.append('pysrc')\n", 98 | "import decision_trees as dt\n", 99 | "import networkx as nx\n", 100 | "import numpy as np\n", 101 | "\n", 102 | "inputs = np.zeros((16, 2))\n", 103 | "outputs = []\n", 104 | "row = 0\n", 105 | "for x in range(4):\n", 106 | " for y in range(4):\n", 107 | " inputs[row][0] = x\n", 108 | " inputs[row][1] = y\n", 109 | " row += 1\n", 110 | "\n", 111 | "for row in inputs:\n", 112 | " if (row[0] > 1 and row[1] < 2) or (row[0] < 2 and row[1] > 1):outputs.append(1)\n", 113 | " else: outputs.append(0)\n", 114 | " \n", 115 | "clazz = [0,1]\n", 116 | "meta = ['x','y']\n", 117 | "tree = dt.build_tree(inputs, outputs, clazz, meta)\n", 118 | "dt.draw_tree(tree)\n", 119 | "data = np.zeros((16,4))\n", 120 | "for r in range(16):\n", 121 | " data[r][0] = r\n", 122 | " data[r][1] = inputs[r][0]\n", 123 | " data[r][2] = inputs[r][1]\n", 124 | " data[r][3] = outputs[r]\n", 125 | "#print data" 126 | ], 127 | "language": "python", 128 | "metadata": {}, 129 | "outputs": [ 130 | { 131 | "output_type": "display_data", 132 | "png": "iVBORw0KGgoAAAANSUhEUgAAAX0AAAD9CAYAAABQvqc9AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xd4VHXa//H3zCSTMikkBBKBwEJQSkgQWYoKSFGKBduu\niiBSXcVVV9ZH3FV/us8+7K7lYddHFFAXEQGVJqirBEnoRZCSBoTegoSE9D7l+/sjZEwQkklykpnk\n3K/rmusyMyfnfM/XMx9OZu5zH4NSSiGEEEIXjO4egBBCiKYjoS+EEDoioS+EEDoioS+EEDoioS+E\nEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoi\noS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+E\nEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoioS+EEDoi\noS+EEDoioS+EEDri5e4BCOFuBQUFrFy5kp0JCezbtYtLubmYTCY6XHcdfQcPZvioUYwePRqTyeTu\noQrRYAallHL3IIRwh9zcXF6bNYtPP/2UoUYjI4qKuAloCziAk8Beg4G1AQFc8PHhhVdf5amnn5bw\nF82ahL7QpfXr1zN13DjuKSri5bIy2tey/F7geYsFx/XXs3jVKrp06dIUwxRCcxL6QneWLlnCC088\nwdKSEobX4fccwD+NRuYEB/P9tm307NmzsYYoRKOR0Be6Eh8fz4SxY4kvLqa+kb3EYOBPoaHsO3SI\nNm3aaDo+IRqbhL7Qjfz8fGKiovggK4tRDVzXf5nNnBk5ki++/lqTsQnRVKRkU+jG7Nde4/bCQmfg\nLwduASzAMBd+fxnQCQgA0srL+TEhgQ0bNlx12VOnTjFs2DAsFgs9evQgPj6+xnXPmjWLsLAwwsLC\neOmll1zdJSHqTEJftAgZGRk1vl5SUsLCjz7i5dJS53OtgZmAKxGbCjwJLAUygECgdXExc//xj6su\nP27cOPr27Ut2djazZ8/mN7/5DVlZWVdddsGCBaxdu5akpCSSkpL4+uuvWbBggQujEqIelBAe5M03\n31QPPvhgteeeeeYZ9dxzz/1i2aKiIrV48WI1bNgwFR0dXeN6v/jiCzUyMFAp+MXjQ1BDr/J81cef\nQI2v8vNxUGZQQWazyszMrLattLQ05ePjowoLC53PDRkyRM2fP/+qY7v55pvVhx9+6Px54cKFauDA\ngbXOlRD1IWf6wqM89thjrFu3jry8PABsNhtffPEFjz/+uHOZnTt3Mn36dDp06MCSJUuYPn06e/fu\ndb4+Y8YMQkJCqj0mTpzIloICbqznuA4Cvav83AXwAW4wm/nxxx+rLZuamkqXLl2wWCzO53r37k1q\naurV133wIL17/7z22NjYay4rRENJ6AuPEhERweDBg1mxYgUA69ato02bNvTp04fly5fTvXt3Jk+e\nTFRUFCkpKcTFxTFu3Dh8fHyc63j//ffJycmp9rglOpq1wIF6jqsQCL7iuSAgqqSEAweqr7WwsJDg\n4OpLBwUFUVBQcPV1X7F8UFAQhYWF9RypEDWT0Bce5/HHH2fJkiUALFmyhMceewyA9PR0zp8/T58+\nfYiNjaVt27YurzMnJ4fWDRhTAJB3xXN5QFu7nZzMzOrLBgSQn59f7bnc3FyCgoKuvu4rls/LyyMg\nIKABoxXi2iT0hce59957SUpKIiUlhf/85z+MHz8egOeff5709HSGDx/O7NmziYyMZObMmb84037y\nyScJDAys9kg6dYohQMxVtmdwYUzRQGKVn48D5UAIYPL2rr5sdDQnTpyodraemJhIdHT01dcdHV1t\nHxITE+nVq5cLoxKiHtz9pYIQVzNt2jQVExOjRowYcc1l0tLS1J/+9CfVoUOHGpdTSqkH7rhDLbvi\ny1k7qBJQ80ANAVUKqvwaX+SmggoCtRVUIahxlx+T/P3V/Pnz1WuvvaaGDh3q3N7AgQPVCy+8oEpK\nStSqVatUq1atVFZWllJKqZMnTyqDwaBOnz6tlFJq/vz5qkePHio9PV2dO3dO9ezZUy1YsECDWRTi\nlyT0hUfaunWrMhgMatGiRbUu63A41K5du2pc5m+zZ6vnvb2rBfnHoAxXPCZXeT0A1LYqPy8D1RGU\nBdR9oHJA9QoMVHv27FFTpkxRr7zyinN7p06dUkOHDlV+fn6qe/fuKj4+3vnali1bVOfOnZXNZnM+\n9+KLL
6rQ0FAVGhqqZs2aVY8ZE8I1ckWu8Ehnz56le/fuZGRkaPL59o8//shDQ4dytKgIrXpkHgZu\nCwzkbFYWAwYMICEhgZCQkFp/b/bs2bRt25bp06drNBIhXCehLzyOw+Fg5syZFBYW8tFHH2m23l93\n68brR45wt0bre85sxvLss/ztrbc0WqMQjU9CX3iUoqIiwsPD6dy5M+vWraN9+9qaHrtu9erVvDxx\nIvuLivBt4LpSgGEWCwfS0jQdoxCNTUJf6Mrdw4fTecsW3rXb672OEmCg2cy0t97imWef1W5wQjQB\nKdkUupGamkrMgAGsCgjgryYT9TnbKQbGentjiorCarOReUWNvhCeTkJf6MLOnTtZsWIFPj4+vDl3\nLis6dWKcnx9Xb4F2dQeAWywW2tx1F088+yz5+fksXLiQM2fONNawhdCchL5o0ZRSxMXFERcXB8Ad\nd9zB+PHj2ZWcTPspU+jl58dfTSZ+qmEdB4AnfH0ZGRDA83PnsnT1aqZOnUq3bt0oKSlh8eLFHDp0\nqEn2R4iGks/0RYtls9lYs2YNKSkpmEwm7r33XmJjY6stk5yczHv/+798/sUXdDSbuclmI7ysDIfB\nwAk/P/Y6HNh9fPjds8/yxFNPVWv94HA4+Pbbb/nxxx8xGAyMGTOG/v37N/VuClEnEvqiRSotLeXz\nzz/n1KlT+Pj48PDDD9d4M/OysjKSk5PZv38/ly5dwmQy0aFDB/r27UvXrl0xGq/+R7FSim3btjlv\nknLrrbdy++23YzC40txBiKYnoS9anPz8fJYsWcLFixcJDAxk/PjxRERENOo2Dxw4wFdffYXD4SA2\nNpZ7770Xk0mry8CE0I6EvmhRLl68yJIlS8jPzycsLIwJEybQqlWrJtn2sWPHWL58OeXl5XTp0oWH\nH364WstnITyBhL5oMU6dOsXnn39OaWkpHTt2ZNy4cfj5+TXpGM6fP8+yZcsoLCwkIiKC8ePHExgY\n2KRjEKImEvqiRUhNTWX16tXY7XZ69OjBAw88gPcVLY+bSk5ODkuWLOHSpUsEBwczYcIE2rRp45ax\nCHElCX3R7O3cudNZktm/f39Gjx59zS9em0pxcTHLli3j3Llz+Pn5MW7cODp27OjWMQkBEvqiGVNK\nsX79enbu3AlU1ODfcsstHlM5Y7VaWblyJWlpaXh5efHggw/So0cPdw9L6JyEvmiWXKnB9wRSyy88\njYS+aHbqWoPvblLLLzyJhL5oVtxRg68VqeUXnkBCXzQb7qzB14rU8gt3k9AXzYIn1OBrRWr5hTtJ\n6AuP50k1+FqRWn7hLhL6wqN5Yg2+VqSWX7iDhL7wSJ5eg68VqeUXTU1CX3ic5lKDrxWp5RdNSUJf\neJTmVoOvFanlF01FQl94jOZcg68VqeUXjU1CX3iEllCDrxWp5ReNSUJfuF1LqsHXitTyi8YioS/c\nqiXW4GtFavlFY5DQF27TkmvwtSK1/EJrEvqiyemlBl8rUssvtCShL5qU3mrwtSK1/EIrEvqiyei1\nBl8rV9byDxo0iBEjRshfSKJOJPRFk5AafO1ILb9oCAl90eikBl97Ussv6ktCXzQqqcFvPFLLL+pD\nQl80GqnBb3xSyy/qSkJfNAqpwW86Ussv6kJCX2hKavDdQ2r5hask9IVmpAbfvaSWX7hCQl9oQmrw\nPYNSiq1bt5KQkABILb/4JQl90WBSg+95pJZfXIuEvmgQqcH3XFLLL65GQl/Um9Tge77z58+zdOlS\nioqKpJZfABL6op6kBr/5kFp+UZWEvqgzqcFvfqSWX1SS0Bcukxr85k1q+QVI6AsXSQ1+yyC1/EJC\nX9RKavBbFqnl1zcJfeGklOLSpUuEhYU5n5Ma/Jarplr+qx0LomWQb98EUPFn/7p165g3bx4nT54E\nKmrwP/roIy5evEhYWBhTp06VwG9BbrzxRh599FHMZjNJSUksXbqUsrIyADZv3sy8efNITU118yiF\n1uRMX2Cz2Vi9ejUHDx4EwMfHh9tvv534+HipwdeBK2v5o6OjnbdkNBgMjBo1ioEDB7p5lEIrEvo6\nV1JSwueff87p06edz2VmZnLixAl69+5Nnz59pAZfBypr+Y8cOcKxY8eIiYnB39/f+fott9zCHXfc\nIZ/7twDy8Y6O5eXlsXDhwmqBf+7cOVJTUykpKaGgoIC7775bAl8HQkJCGDNmDOfOnaOkpIT9+/eT\nl5fnfH3Hjh2sWrUKm83mxlEKLUjo61RGRgYfffQRmZmZzueOHz/OsWPHAIiKiqJNmzYsX75c3ug6\nkJ2dzZdffklMTAxhYWFYrVYSExPJyspyLpOSksLSpUspLS1140hFQ0no69DJkydZuHAhBQUFQMWX\nuAcPHuTs2bMYjUZ69OhBZGQkAKdPn2bbtm3uHK5oZEopvvzyS4qKijAajURHR9OuXTscDgepqamk\np6c7lz158iQff/wx+fn5bhyxaAj5TF9nkpOTWbNmDXa7Haj4EjclJYXc3Fy8vLyIjo4mJCTEuXz3\n7t158MEH5SOeFi47O5slS5aQnZ3tfO706dPOSq6OHTtWuzYjODiY8ePH07Zt2yYfq2gYCX2dUEqx\nc+dO1q9f73yurKyMpKQkioqK8PHxISYmhoCAAOfr/fr1Y8yYMdJXRyeKiopYtmxZtTP7CxcukJaW\nhlKK8PBwunfv7vwy19fXl3HjxtGpUyd3DVnUg4S+DjgcDuLi4vjhhx+czxUVFZGUlERZWRn+/v7E\nxsbi6+vrfH3EiBEMGjRIqjV0pry8nBUrVnD06FHnc9nZ2aSmpmK32wkJCaFXr17Oi7hMJhMPPPAA\n0dHR7hqyqCMJ/WbK4XBw/Phx9u7dy5kzZ3A4HISEhNCnT59qAX5lDT5Abm4uKSkp2Gw2goODiYmJ\nwcvLCwCj0cjYsWO58cYb3bJfwv0cDgfffPMN+/btcz5XUFBAcnIy5eXlBAQEEBsbi9lsBq5ey5+T\nk8PevXtJTU2luLgYHx8funXrRt++feUCPzeT0G9msrKy+HD+fOa/8w6GkhL6Go10KSnBqBSZPj7s\n9/LiRFkZv33wQaY98wyHDh36RQ3+oUOHcDgctGnThh49ejg/vjGbzTz88MNERUW5a/eEh1BKsXnz\nZjZt2uR8rrS0lKSkJIqLi/H19SU2NrZaLf/AgQMpLS1l3ptvsnPvXm708yO2tJRAq5USLy9SfX3Z\nW1ZGt65dmTFrFg8//LDzHw7RdCT0mwmlFMuWLmXmjBncbbXydGkpN11j2YvAR0Yj/zKZ6BoTw9BR\nozCbzZw7d85Zktm+fXu6du3q/PgmICCA8ePHc9111zXNDolmYd++fXzzzTc4HA6goj1zcnIy+fn5\neHt706tXL4KDg8nJyeG7lSvxzc7mxdJSfgP4XmV9NuA/wL8sFrIjIli0YgV9+vRpwj0SEvrNgN1u\n53ePP86uNWtYVFTEr138vRxghpcXmy0WBg4f7qzMiIqKcpZkAr
Ru3ZoJEyZUq9oRotKRI0dYsWIF\nVqsV+LnENysrC6PRSGBgIJu/+46XbTZmKoUrt19XwBLgj35+vPX++zw+aVIj7oGoSkLfwymlmDZh\nAqfWrGFtcTEBtf/KL7xhMPAPb2+i+/UjJiaG8PBw52uRkZGMGzeu2p/pQlwpPT2dZcuWUVRUBFQc\nl0ePHuXgwYOcTE7mPw4Hg+qx3sPAHf7+vPHBBzw6frymYxZXJ7V4Hu7TxYvZs3YtX9Uz8AFmKcVU\nq5Ws06er1VV369aNiRMnSuCLWrVv356pU6cSGhoKVHx5GxkZyelDh1hZz8AH6A58V1zMs088wfHj\nxzUbr7g2OdP3YBkZGcR07UpcYSEN/dSzHOjt7U3Pu+8mJjaWX//619x5551Sgy/qpGot/39WrqTf\n4cO8f/lCv4aYYzTy1U03sXH3bikTbmTyjvdg8+fO5X6r1Rn4y4FbAAswzIXfXwZ0AgKAh4G/Wa3s\nTkhg+PDh3HXXXdUC/9VXXyUmJgZvb2/+8pe/1LruWbNmERYWRlhYGC+99FId90w0VxaLhccff5zW\nrVuTlpbGG5cDvyHH5v3ARIeDc4cOsWvXrl8sW5djc+PGjQwbNoxWrVrRuXPnOu2bXkjou0lGRkaN\nr9vtdj547z2evnxTC4DWwEzAlYhNBZ4ElgIZgD8Vb0y/8nKMRuMvzqauv/563nrrLe66665az7QW\nLFjA2rVrSUpKIikpia+//poFCxa4MCrRHNR2bJrNZjJ/+onHDAYCLz/X0GPzGeCp4mLef/vtXyxf\nl2MzICCAadOm8dZbb7kwEn2S0G+g48eP07p1a/bv3w9U3JCiTZs2bNmy5RfLFhcX8+mnnzJ8+HBG\njBhR43oPHTqEv9VK1VuPjwB+A7hSVLkUGAsMouLs66/AamBsWRnfx8X9YvmJEycyevRoAgMDqe0T\nv08++YQXXniBdu3a0a5dO1544QUWLVrkwqhEU2qsYxMg/quv+M3lah7Q5ti8Uyk2XL55S1V1OTb7\n9evH+PHj5Sy/BhL6DRQVFcUbb7zBhAkTKCkpYfLkyUyePJkhQ4Y4l9m5cyfTp0+nQ4cOLFmyhOnT\np7N3717n6zNmzCAkJKTaY8CAAZwuLKS+18UeBHpX+bkL4ANEOBzs3by5nmu9vO6DB+nd++e1x8bG\nym31PFBjHZshISHsSE3luXqO61rHZglgLSvj/Pnz9VyzcIWEvgamTZtG165d6d+/PxkZGcyePRuA\n5cuX0717dyZPnkxUVBQpKSnExcUxbtw4fHx8nL///vvvk5OTU+3x4osvMgs4UM8xFQLBVzwXBIRC\ntSt067XuwkKCg39ee1BQEIWFhQ1ap2gcjXFsHjp0iDa+viTVc0zXOjYLga4+Pg0+PkXNJPQ1Mm3a\nNFJTU3nmmWecbYjT09M5f/68sx9OXdrQ2u12vBowngAg74rn8oDAy+tuiICAgGr91PPy8qp15xSe\npVGOzQZUfdV0bJpo+PEpaiahr4HCwkL+8Ic/MG3aNF577TVycnIAeP7550lPT2f48OHMnj2byMhI\nZs6cyYED1c/fn3zySQIDA6s93nzzTWYDMVfZnisFbdFAYpWfj1NRthkIhLRqVePv1vZlWXR0dLV9\nSExMpFevXi6MSjS1xjg2u3fvzk/FxZofmzcAWZcbB16LlHM2nIS+Bp577jn69+/PBx98wF133cWT\nTz7pfC0wMJDp06ezfft2Nm/ejK+vL/fccw+33367c5n58+dTUFBQ7fHdd9/RLziY5CrbcQClgPXy\nf5dd/u+rGQ98DWwDioBXgQeBQ8CNAwfy+uuvM2zYz8V1NpuN0tJS7HY7VquV0tJSZ7+VU6dOYTQa\nOXPmDFDxxdqcOXM4f/486enpzJkzh0lyGb1Haoxjs6CggOsjIvisyna0ODYdwLnSUr744ot6H5tK\nKUpLS7FarSilKCsro7y8vEFz2OIo0SBr1qxRHTp0UDk5OUoppQoLC1XXrl3VsmXLrvk7DodD7dq1\nq8b15ubmqkCzWeWCUpcfH4MyXPGYXOX1AFDbqvy8DFRHUBZQ94HKATU2IEAtXrxYTZkyRb3yyivO\n7T3++OPKYDBUe3zyySdKKaW2bNmiOnfurGw2m3P5F198UYWGhqrQ0FA1a9ashkyhaCSNdWwqpdRj\nDzyg3tX42PwG1K0xMQ06Njdu3Ohcxmg0KoPBoIYNG9aQaWxx5IpcD/bw3Xcz6NtveUaj/0Vngd5+\nfpy5eJHBgweTkJDgUpO12bNn07ZtW6ZPn67JOETzl5CQwHP33ktSYaFLH+m4YmxAAPf+61/MnTtX\njs1GJKHvwbZu3crE0aNJKS7GosH6Zvj44D1pEu/Mn6/B2oSeKaXo1bkzs0+f5j4N1rcXGBMYyKkL\nF6QXVCOTz/Q92ODBgxl85528VKWErr42AV/5+/P63//e4HUJYTAYmLd4MU/7+XGpgesqByZZLMx5\n7z0J/CYgoe/h3vngA74ODubfDahaOAKM9/fnw6VLpWe+0MyQIUN4dNo0HvT3p7ie67ADk3196TZk\nCOMnTNByeOIaJPQ9XEhICOu3buX10FD+ZjJR1wrmLcAwf3/+5//+jzFjxjTGEIWO/eOf/6TjnXcy\n0t+fs3X83Rzgt35+ZPTpw6erVkk5ZhOR0G8GbrjhBrbv28d/evWiv7c3P1Bx56Ga/ATMMJn4bUAA\nHyxfzuSpU5tgpEJvTCYTi774gtEvvEBvb2/eo6KdQk2swOdAd29vrnv0Ub5JSMDPz6/xBysAML3+\n+uuvu3sQonZBQUHYTSZ+slr5v6wslnl7k2m1UkxFfXMuFR/jfAW8abHwR5MJW3Q09z/6KA8//LC8\nqUSjMRgMBIWEUOpwsCY7m7+UlHDaaCTbbsdARS3+eSq+V/rYy4tpPj7svu46bhs7lkcfe4zrr7/e\nrePXG6neaSZSUlJYuXIlFouF3//+92zZsoX4devYu3Urp8+dw+FwEBocTJ8BAxg4bBi//e1viY+P\nJzExkZ49e/LQQw+5exdEC1VeXs67775LQUEBDzzwAIGBgaxcsYIfN20iJTmZ4tJSfMxmunfrRt/b\nbuO+Bx4gODiYhQsX4u3tzTPPPENQUJC7d0M3JPSbAZvNxty5c8nNzeWee+6hb9++Lv1efn4+7777\nLlarlSlTptCxY8dGHqnQo02bNrFp0ybat2/PtGnTXP5sfsWKFaSmptK7d2/uv//+Rh6lqCSf6TcD\nu3btIjc3l/DwcPr0cf3GiUFBQdxyyy0AxMXF1dqLXIi6ys/PZ/v27QCMGjWqTl/G3n777ZhMJhIT\nE6WdchOS0PdwRUVFbN26FYCRI0fW+Z62t956K4GBgaSnp5OcnFz7LwhRBwkJCVitVnr27FnnvyRD\nQkIYOHAgICclTUlC38Nt3LiRsrIyrr/+eqKiour8+2azmeHDhwOwYcMGrNZrtcESom7Onz/PgQMH\nMJlM1Zq01cXgwYPx9
/fn9OnTHD58WOMRiquR0PdgFy9eZO/evRiNRkaOHFnv9fTu3ZuIiAjy8/PZ\nuXOnhiMUeqWUYv369QAMGDCA0NDQeq3H19fX2VHz+++/l176TUBC34OtX78epRR9+/alTZs29V6P\n0Whk1KhRAGzbtk3uciUaLC0tjVOnTuHv71/t9ov1UXl8Z2dns3v3bo1GKK5FQt9DHTt2jGPHjuHj\n48PQoUMbvL7OnTvTrVs3ysvLSUhIaPgAhW7Z7XbnWf7QoUPx9fVt0Pqq/iW7efNmiovr29RBuEJC\n3wM5HA7nm2rIkCFYLFr02IQ77rgDo9HI/v37uXDhgibrFPqzZ88esrOzCQsLc7l8uDZdu3YlKiqK\n0tJSNm/erMk6xdVJ6Hugffv2cfHiRUJCQhgwYIBm6w0LC6Nfv37Oz2OlWkLUVUlJiTOUR44ciclk\n0mS9BoOBkSNHYjAY2LNnD1lZWZqsV/yShL6HKSsrY+PGjUBFHbOXV0Nuj/5Lt912G76+vpw4cYKj\nR49qum7R8m3evJmSkhK6dOmiefuE8PBwbrrpJhwOB99//72m6xY/k9D3MFu3bqWoqIjIyEh69uyp\n+fr9/f257bbbgIoviqVaQrjq0qVL7N69G4PBUOcLsVw1bNgwzGYzaWlpnDx5UvP1Cwl9j5Kbm8uu\nXbuAul/dWBf9+/cnNDSUrKws9u7d2yjbEC3P999/j8PhoE+fPoSHhzfKNgICAhg8eDBQccFW5Q3Q\nhXYk9D3Ihg0bsNlsxMTE0KFDh0bbjslk4o477gAq+qaUlpY22rZEy3Dy5EkOHz5c7WK/xjJw4ECC\ng4O5cOECiYmJjbotPZLQ9xBnz54lJSUFLy+vel/dWBfdu3enU6dOFBcXs2XLlkbfnmi+HA4HcXFx\nAAwaNIiAgIBG3Z63t7fzPZCQkEB5eXmjbk9vJPQ9gFLK+aa6+eabCQ4ObvRtVn4uC/DDDz+QnZ3d\n6NsUzVNSUhIXLlwgODiYm2++uUm22atXL9q3b09BQYGzoZvQhoS+B0hNTeXcuXNYLBYGDRrUZNtt\n164dvXv3xm63s2HDhibbrmg+ysvLiY+PB2DEiBF4e3s3yXarnpTs2LGD/Pz8JtmuHkjou5nNZnMG\n7vDhw/Hx8WnS7Ve+kQ8ePMiZM2eadNvC8+3YsYOCggLat29PTExMk267Y8eOREdHY7Vanf/wiIaT\n0Hez+vbK14r03BfX0pBe+VqRnvvak9B3o4b2yteK9NwXV9OQXvlakZ772pPQd6OG9srXivTcF1fS\nole+VqTnvrYk9N1Eq175WpGe+6KSVr3ytSI997Uloe8mWvXK14r03BeVtOyVrxXpua8dCX030LpX\nvlak577Qule+VqTnvnYk9JtYY/XK14r03Ne3xuiVrxXpua8NCf0m1li98rUiPff1q7F65WtFeu5r\nQ0K/CTV2r3ytSM99fWrMXvlakZ77DSeh34Qau1e+VqTnvv40Ra98rUjP/YaR0G8iTdUrXyvSc19f\nmqJXvlak537DSOg3kabqla8V6bmvH03ZK18r0nO//iT0m0BT98rXivTcb/maule+VqTnfv1J6Dcy\nd/TK14r03G/53NErXyvSc79+JPQbmbt65WtFeu63XO7qla8V6blfPxL6jcjdvfK1Ij33W6bt27e7\nrVe+VqTnft1J6Dcid/fK14r03G958vPz2bFjB9A8qslqIj3360ZCv5F4Sq98rUjP/ZYlPj7e7b3y\ntSI99+umeSeRB/OUXvlakZ77Lcf58+dJTEz0iF75WpGe+66T0G8EntYrXyvSc7/5q1pN5gm98rVy\nZc99m80MT4qgAAAUdElEQVTm5hF5Lgn9RuBpvfK1Ij33m7/Dhw9z+vRpj+qVr5WqPff37Nnj7uF4\nLAl9jXlqr3ytSM/95stutzublHlSr3ytSM9910joa8jTe+VrRXruN0+7d+/22F75WpGe+7WT0NeQ\np/fK14r03G9+iouLPbpXvlak537tJPQ10lx65WtFeu43L5s3b6a0tNSje+VrRXru10xCv55KS0tZ\ntGiRM/CaS698rVyt535OTg4rVqzgyJEjbh6d2LBhAxs3bqS8vJysrCz27NnTLHrla6Vqz/0TJ05g\nt9vZtWsXX375pbuH5nYt+3S0EW3bto1Tp05x6tQp2rVrx4kTJ/D19dXNmwoqeu7v2bOHCxcu8P77\n75Obm4vdbicjI4OoqKgW+xGCp8vKymLHjh04HA727t2L1WrFbrfTt29fj++Vr5XKnvsbNmxg8eLF\nhISEkJOTA1SUHnfp0sXNI3QfOdOvh5ycnGp16hs2bGDXrl2UlZXRqlUrN46saRkMBiIiIvjhhx/4\n6quvKCsrA5Abr7hZ5Q1RAM6dO0dcXBwHDhzQXdB16tSJI0eOsGnTJg4dOuR8fv369bq+8YqEfj3E\nx8c7byGYn5/PxYsXnW0W3n33XbZu3driLw45evQo8+bN4+DBg1gsFqxWK6dPn3a+vmnTJkpKStw4\nQn06ceIEaWlpQMWFWMeOHQMgNDSUlStX8tlnn7X4LzcLCgpYs2YNH3/8MSEhIQDOj3gA3d94RUK/\njipviFKp8k0VGRmJj48PZWVl7Nq1q8XfVzYlJYXMzEwAZ5uJ9PR0Z9AXFxc7ew+JplG1ZBggIyOD\nwsJCfH19iYyMBCAtLa3FNyXLycnhwIEDKKVo27YtQUFBlJeXc/bsWecy8fHxur3xioR+HVS9hB0q\n2i3k5+djNpurNa1qzm2UXTV8+HBn//XAwEAiIiJwOBycOHHCuYzceKVpJSYmOq+bsNvtzpuGd+7c\n2fmXaLt27ZptG2VXdezYsVoxReVJydmzZ50fQRYWFur2xisS+nVQeUMUoFrAde7c2fmlZdu2bZt1\nG2VXBQcHO9stw89zkJmZSV5eHoDceKUJVb0hCvwccEFBQdW+vNVLoUFlu2WoOFbbtm1b7R9CqLjx\nSuWxqicS+i6yWq3Van7PnTtHaWkpAQEBREREOJ8fNWpUs2+j7Kpbb73VeU9VHx8f50cIx48fdy5z\n8ODBap/1i8axfft2Zy+ksrIy50cZVTu89uzZk06dOrllfE0tNDS02gWSXbp0wWg0cuHCBQoKCoCK\n97QeW4noI53qQSlFfn4+ly5dorS0lB9++MF5VmC1Wp13kIqKinKeObWUNsquMpvNjBgxwvlz5fca\n+fn5ZGRkOJ+Pi4vDZrORk5NDdnZ2i/+SuymUlZVx6dIl8vPzyc3Ndd4QBeDkyZPY7XbatGnjvCdz\nS2qj7KohQ4bg7+8PVHTh7NChA1D9pKTyxiuFhYVcunRJF8UHUqdfRV5eHos/+YRvv/iCH5OSKCkr\nw9topMRmI8xiIaJ9e3r07YvNZsNms9G6dWtndUBLa6Psqt69e/PDDz9w4cIFTCYTnTt35vDhw5w4\ncQKj0Ujyvn2cP3aMPzz9NObLfwFZlSImKorBI0cybcYMunXr5ua98HxKKRISEv
h0wQJ279zJiZ9+\nws/LC6vDgbfJRLvWrekYHU2Xrl25cOECRqOxWolmS2qj7CpfX1+GDh3Kt99+C1R81v/TTz+Rm5vL\nTz/9xIULFziZmMjcN9+k2GrF12Si2GajfevW9OvXj0emTePuu+9ucVfXG5Q0TqGsrIy/vvoq782d\nyyijkUeKiugHtAMMgBU4BGwB3vP2JsNgILJ7d0aNGuU8k+jXrx933XWX2/bBnU6cOMHixYuBinDa\ntm0bafv2UVRQwFSluMfhoA8QdHn5AmA/sM7Li397efHrAQN4b9EifvWrX7lnBzxcQkICT0+ahFdO\nDtMLCxkCRAOVtzE/D/wIfOrtzbd2O+HXXceAwYOd/5j6+/vz7LPPtriumq6w2+3MmzfPWaZ67tw5\nNick8NOZM9xqMjGlvJwBQCcq3ut24AiwA1gYGMgZs5m333uPhx56qMV8F6L70D906BC/GTOGGzIz\nmVtcTPtallfAemCyyUTH6GjuGDvW+aZqqV01XfHZZ5+RlpZGanIy67/+mudtNl5Sitpipgz4l8nE\nWz4+zHnvPSZOmtQEo20erFYrM2fMYM2yZcwrLuYuKoKpJj8BvzMa2WOxcN/48YSHh3PnnXfSv3//\nJhixZzpy5AjLli0jPz+fNUuXEn7pEovtdlzpQLQTmGqx0HPIEBYtX+78DqtZUzqWnJysIoKD1UcG\ng3KAUnV45IEa4+Wlojt3Vps2bXL3rrhdZmamGnv33SrC21sdqONcKlApoH7l76/e/de/3L0rHsFq\ntaoHRo9WY/z9VU495vMTUK3MZjVr1ixls9ncvTtu5XA41Jw5c1Qbi0XNNhiUvY5zWQJqiq+vujk2\nVhUUFLh7dxpMt6Gfk5OjOrZpo5bW4w1V+SgHNdrbWz09bZq7d8ftEhISVFsfH3W0AfN5ElQHf3+1\nbt06d++O273wzDNqpL+/KmvAfC4HdV1wsMrKynL37rhVaWmp6tmpk3rLYKj3XDpATfX1VQ+MHq0c\nDoe7d6lBdBv6kx95RD3p41Pvg6DykQ2qvZ+f2rhxo7t3yW0KCgrUr9q2Vf9p4FwqUN+DimzdWuXm\n5rp7t9xm+/btKsLPT2VqMJ/P+fioR++/39275FYvv/iiutffv85/zV/5KAXV02JRny1b5u5dahBd\nhn5iYqJq5++v8jV4UylQq0Hd2LVrsz8DqK/Z//3f6lE/P03mUoGa7OurXv3Tn9y9W24zsFcvtUyj\nuSwC1dHfX+3evdvdu+UW6enpKsTXV53XaD53gmofGqrKy8vdvWv11mxDv6ysTE2ZMkV16tRJBQYG\nqhtvvFF99913Nf7OnDlzVEREhDJ7e6ubDIY6/emcB+o5UB1BBYCKAvUHUFmg7KC6Wixq586dtY77\n0qVL6r777lMWi0V16tRJLfOQs4a6zmdycrIaOXKkCgsLU4DaX8c3T03zmQrqulatXHpjvfvuu6pv\n377Kx8dHTZo0Scspqbe6zuWiRYtU3759VVBQkAoPD1eBXl6aHZsK1BtGo5r8yCO1jtsT51Kpus/n\nZ599prp166aCgoKUv5+f6mw0qnQN53NQYKBatWpVreP21Pd6s704y2az0bFjR7Zs2UJ+fj7/8z//\nw0MPPXTNqz/j4uJ44403iI+Px8/LCx+leM3FbZUDI6go24yjouRwJxAG7KbiCrfJxcV8tmhRret6\n+umn8fX15eLFiyxdupSnnnqKgwcPujiSxlPX+TSbzTzyyCPMnDkTgBvrsK3a5rMn0MnhcOkep+3b\nt+fVV19lypQpdRhB46rrXJaUlPDOO+9w6dIlHrjnHgJsNv7p4rZqm0uAyQ4HK778stYmgJ44l1D3\n+bz11lvZsmULeXl5dAgJoZvDwUwXt+XKfE4pKOCzDz+sdV2e+l5vtmf6VxMbG6tWr1591dfGjRun\nXn75ZXX06FHV0WJRCaAiXPyX/0NQ4Zf/VL7WMhtB3RIdXeP4CgsLldlsVkePHnU+N3HiRPXSSy9p\nOg9aqWk+K/35z39W1PEs35X5nOntrf7+t7+5PNZXXnnFo85Or+TKXCql1LCbblJPgLpHw7lUoK4P\nCFApKSkujdXT51Ip1+YzPz9f+Xl5qccun6lrNZ+HQf2qTZsat+3J7/Vme6Z/pYyMDI4cOUJ0dPRV\nXz948CC9e/cmOTmZWJOJWCADyHFh3RuAMYB/Dcv8HdiRmkpISMgvHmPHjgUq6oW9vLzo2rWr8/d6\n9+5NamqqazvZhGqbz0pp+/fXWjt+JVfmM95q5f+99lqN81mVUqqOo2g6rs4lQMqRI5wGerm4blfm\n8m7gVGEhAwYMaPZzCa7N57Zt22jXrh0lNhtngTdcXLcr8/lH4FRmZrN9r7eI64utVivjx49n0qRJ\n3HDDDVddprCwkODgYC5evEigw1Ht6tCQWtafDfy6lmX+A/gYDM5bsl1rDEFBQdWeCwwMdDaA8hSu\nzGel4qKiOq/flfl8Dfhk0CDWuNgQy1OvlqzLXALklZSQCixzcf2uzOU3wBSLhVvfeYepU6fWuk5P\nnUtwfT4HDRrE2rVreeW++/AuKOC/gHdcWL+r89nG15eUw4eveftJT36vN/szfYfDwWOPPYavry9z\n58695nIBAQHk5+dX3OjEaKSyoWqgC9toTcWl7jUpBXxq6dFROYaq8vLyCAx0ZRRNw9X5rGSux30D\nXJ3PuqzbE89O6zqXa9aswWq38zngapccV+YSoNRgwGw2u7ROT5xLqMexaTaD0chfgcUubsOV+VRA\nmcNR4z0zPPm93qxDXynF1KlTyczMZNWqVTXeiDs6OpoDBw7QrVs3DipFIhBO7Wf5ALdT8aVOcQ3L\njATKbDYCAwN/8ajsyXPDDTdgs9mcd9uCii5/vXq5+sd846rLfFbq0rMndY0IV+bzVWBtQkKN81mV\np52d1nUu161bxxNPPEHPTp2oy91bXZnLMcAXhYU8+eSTzXIuoX7H5g033MCh0lLKqfnjmqpcmc/h\nQGF5OZGRkc3zve6+rxMa7ne/+50aOHCgKiwsvOrrBoNBbd68WSml1Lp161RERIRKSkpSfl5eahCo\nP1X5cuZxUJOu8cVNGah+oEZf/hLHfrl8azaoby8vMxfU1HHjah3zI488osaNG6eKiorU1q1bVXBw\nsDp48KCm81JfdZlPpZQqKSlR8+fPV1Bx4UqphvM5KihIrVmzptYx22w2VVJSol566SX12GOPqdLS\nUo9oO1CXuYyPj1ehoaFq69atasbUqertK+aroXNZBMrP21uVlJTUOGZPnUul6jafS5YsUWfOnFFK\nKXVdcLDqB+oZDedzFagxt95a65g99b3ebEP/1KlTymAwKD8/PxUQEOB8VNbCnjlzRgUFBans7Gzn\n78yZM0eFh4crLy8vdQsVbRQq/2ePAPVRDd/Y512uAIjk59rdP1JxRa4CNSAwUK1du7bWcWdnZ1er\n3f3ss88abY7qoq7zefLkSWUwGJTBY
FCAMoDqrNF8poMK8fNTeXl5tY77tddec46j8vGXv/ylUeeq\nNnWdy2HDhilvb28VEBCg/Pz8lBHUnRoemwtB3XXbbbWO2xPnUqm6z+fLL7+sOnTooCwWiwoMDFS3\nGAyqRMP5vN9iUfPnzat13J76Xm+2oV+bJUuWqD//+c9XfS0+Pl5FBwQoW5V/3XuC8+e6PnaB6hgW\n5jFnRY2hpvn8w1NPqf/y9q52ttSQ+XzFZFJPTZ7cxHvYdGqaS7vdrrped53arNFc2kH1CQhQ3377\nbRPvZdOpaT5TU1NVhJ+fswSzofN58vIJSXNuvNZiQ78mDodDDevfX71pMtXrf3zVRzmoGy0W9fHC\nhe7eLbc5e/asCrNYVGID51KBOgiqtb+/OnnypLt3y20+W7ZM9fT3r/ZxWX0f7xiN6pbevZXdbnf3\nbrnNo/ffr57XoM+WA9Qd/v5qtgf89dMQugx9pZQ6ceKECrNY1A8NPBD+6O2t7rztNt323an0748+\nUjEWi8ptwFwWgLrJYlHvz53r7t1xK4fDoe4fNUr93mxuUJOwfZf/AU1LS3P3LrlVVlaWuq5VK+dn\n8vV9/K/JpPp2766sVqu7d6lBdBv6Sin11VdfqbZ+fmpXPf/Vf9XbW/Xo1EllZma6e1fczuFwqKen\nTVMD/f3VpXrMZy6oIf7+atqECbr/B1Spis+DY6Oi1Itmc537vytQe0FF+PmplStWuHtXPML27dtV\nmL9/vYN/rtGoIsPCWsRfoLoOfaWU+vrrr1WYxaL+6uVV7Yvdmh6nQN1usagBvXqpjIwMd++Cx7Db\n7erF555THfz969RmeT0VnSCfe/JJXX8McaWsrCw1qE8fNdRiUcddnEsrqL+bTKq1v79LbR/0ZPv2\n7eq6Vq3UTLO51rYVlY8MUL/x81M9OnVSJ06ccPcuaEL3oa9Uxbf/owcPVp0tFvWG0ahOXz6Tr/o/\nvxzUDiruoNPK11f97b//u9n/mddY4uPjVefwcHVbYKD6/PJZ/JVvpjxQK0ANDwxUHcPCVFxcnLuH\n7ZFsNpt66x//UCF+fmqin5/aBr/owOkAdRbU2waDirJY1IiBA1vEGWljyMzMVI+MHauu8/NT/89k\nUkev8l63UdE19vc+PirU11e9+Ic/1Fru2pzo/h65Ve3Zs4f33n6bb7/7DoPNRnezGTOQCxwsLuZX\nERFMeOIJpj7xBG3btnX3cD2a1WplzZo1LHj7bXYdOEA7s5nIyxfUpDscnC0tpX9sLE/88Y88+OCD\nLl8tqldZWVl8/O9/s3jePI6np9PD359Qg4FyIK28HLvJxOiRI5nxwgsMHDjQIy+w8iQpKSm8/89/\nsnb1akpLSujp64svFW1ZUouLiWjdmkcmTeKJGTOIjIx093A1JaF/FUopzp07x7Fjx7BarQQFBREd\nHe0Rl1A3RzabjcOHD5ORkYFSivDwcHr06IFXLW0rxNUVFhaSmppKXl4e3t7eREVFERkZKUFfTxcu\nXCAtLY2ysjIsFgvR0dG0atXK3cNqNBL6QgihI826944QQoi6kdAXQggdkdAXQggdkdAXQggdkdAX\nQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggd\nkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAX\nQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggd\nkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAXQggdkdAX\nQggd+f+smSJTnTuXvgAAAABJRU5ErkJggg==\n" 133 | } 134 | ], 135 | "prompt_number": 3 136 | }, 137 | { 138 | "cell_type": "markdown", 139 | "metadata": {}, 140 | "source": [ 141 | "

Example 2

\n", 142 | "Uses data provided from the \"The Elements of Statistical Learning\" website on South African heart disease [here](http://www-stat.stanford.edu/~tibs/ElemStatLearn/)." 143 | ] 144 | }, 145 | { 146 | "cell_type": "code", 147 | "collapsed": false, 148 | "input": [ 149 | "import sys; sys.path.append('pysrc')\n", 150 | "import decision_trees as dt\n", 151 | "import numpy as np\n", 152 | "import networkx as nx\n", 153 | "from matplotlib import pyplot as plt\n", 154 | "\n", 155 | "p = 'data/SAheart.data'\n", 156 | "f = open(p,'r')\n", 157 | "all_lines = f.readlines()\n", 158 | "train_cnt = int(0.75 * len(all_lines))\n", 159 | "lines = all_lines[0:train_cnt]\n", 160 | "col_cnt = len(lines[0].split(','))\n", 161 | "row_cnt = len(lines)\n", 162 | "outputs = [int(line.split(',')[col_cnt-1][0]) for line in lines]\n", 163 | "inputs = np.zeros((row_cnt, col_cnt-1))\n", 164 | "for row in range(row_cnt):\n", 165 | " line = lines[row].split(',')[0:col_cnt-1]\n", 166 | " inputs[row] = [float(v) for v in line] \n", 167 | "clazz = [0,1]\n", 168 | "\n", 169 | "tree = dt.build_tree(inputs, outputs, clazz, meta=['sbp','tob','ldl','adip','famhist','typea','obes','alc','age'], max_rm=5)\n", 170 | "#dt.draw_tree(tree)\n", 171 | "\n", 172 | "#compare the tree prediction to training values, since we haven't pruned this should be very accurate\n", 173 | "diff = []\n", 174 | "for row in range(train_cnt):\n", 175 | " p = dt.decide(tree, inputs[row])\n", 176 | " if p == outputs[row]: diff.append(0)\n", 177 | " else: diff.append(1)\n", 178 | " \n", 179 | "misses = sum(diff)\n", 180 | "print \"In the training data, there were {0} misclassifications for {1} inputs, a rate of {2}%\".format(misses, train_cnt, 100*misses/float(train_cnt))\n", 181 | "x = range(train_cnt)\n", 182 | "f, axarr = plt.subplots(2,1)\n", 183 | "f.subplots_adjust(right=1.5)\n", 184 | "f.subplots_adjust(top=1.5)\n", 185 | "\n", 186 | "#plot training comparison\n", 187 | "ax1 = axarr[0]\n", 188 | "ax1.scatter(x,diff)\n", 189 | "\n", 190 | "#compare the tree prediction to actual values not used in training set\n", 191 | "test_lines = all_lines[train_cnt+1:len(all_lines)-1]\n", 192 | "actual_out = [int(line.split(',')[col_cnt-1][0]) for line in test_lines]\n", 193 | "row_cnt = len(test_lines)\n", 194 | "test_in = np.zeros((row_cnt, col_cnt-1))\n", 195 | "for row in range(row_cnt):\n", 196 | " line = test_lines[row].split(',')[0:col_cnt-1]\n", 197 | " test_in[row] = [float(v) for v in line]\n", 198 | "\n", 199 | "diff = []\n", 200 | "for row in range(len(test_in)):\n", 201 | " p = dt.decide(tree, test_in[row])\n", 202 | " if p == actual_out[row]: diff.append(0)\n", 203 | " else: diff.append(1)\n", 204 | "misses = sum(diff) \n", 205 | "print \"In the hold out data, there were {0} misclassifications for {1} inputs, a rate of {2}%\".format(misses, len(test_in), 100*misses/float(len(test_in)))\n", 206 | "\n", 207 | "x = range(len(diff))\n", 208 | "ax2 = axarr[1]\n", 209 | "ax2.scatter(x,diff)\n" 210 | ], 211 | "language": "python", 212 | "metadata": {}, 213 | "outputs": [ 214 | { 215 | "output_type": "stream", 216 | "stream": "stdout", 217 | "text": [ 218 | "In the training data, there were 95 misclassifications for 345 inputs, a rate of 27.5362318841%\n", 219 | "In the hold out data, there were 27 misclassifications for 114 inputs, a rate of 23.6842105263%" 220 | ] 221 | }, 222 | { 223 | "output_type": "stream", 224 | "stream": "stdout", 225 | "text": [ 226 | "\n" 227 | ] 228 | }, 229 | { 230 | "output_type": "pyout", 231 | "prompt_number": 8, 232 |
"text": [ 233 | "" 234 | ] 235 | }, 236 | { 237 | "output_type": "display_data", 238 | "png": "iVBORw0KGgoAAAANSUhEUgAAAoMAAAGqCAYAAABwJU7tAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzs3Xl8VOW9P/DPmcwksySTnZANQkhIAgkhsmndoohaFayo\nLdp6vUCRcmutva21trWCbRX0elsrvb9Sq9ZWK2pdsBVSRQ3gAhEIWzZCSMhODAkBsk0y8/39MSeT\nzGQmCckAlfN5v155ycx5zjnf8zzf58k3ZxYVEREQERERkSbpzncARERERHT+sBgkIiIi0jAWg0RE\nREQaxmKQiIiISMNYDBIRERFpGItBIiIiIg0bUzG4dOlSxMTEICsry+v2l19+GdnZ2Zg+fTouvfRS\n7N+/fyynIyIiIiI/G1MxuGTJEuTl5fncnpycjG3btmH//v14+OGHcc8994zldERERETkZ2MqBi+/\n/HKEh4f73H7JJZcgNDQUADB37lzU1taO5XRERERE5Gf6c3Wi5557DjfccMOg5xVFOVchEBEREV1w\nxvo/kzsnHyD56KOP8Pzzz2Pt2rVet4uI5n8eeeSR8x7Dv8MP+4H9wH5gP7Af2A/sh5H/+MNZvzO4\nf/9+LF++HHl5eUO+pExERERE595ZvTNYXV2NRYsW4aWXXkJKSsrZPBURERERjcKY7gzecccd2Lp1\nK5qbm5GYmIjVq1ejp6cHALBixQo8+uijaG1txcqVKwEABoMBBQUFY4/6ApSbm3u+Q/i3wH5wYj84\nsR+c2A9O7Acn9oMT+8F/FPHXC86jDUBR/PaaNxEREZGW+KOO4v+BhIiIiEjDWAwSERERaRiLQSIi\nIiINYzFIREREpGEsBomIiIg0jMUgERERkYaxGCQiIiLSMBaDRERERBrGYpCIiIhIw1gMEhEREWkY\ni0EiIiIiDWMxSERERKRhLAaJiIiINIzFIBEREZGGsRgkIiIi0jAWg0REREQaNqZicOnSpYiJiUFW\nVpbPNvfddx9SU1ORnZ2NwsLCsZyOiIiIiPxMP5adlyxZgu9973v4j//4D6/bN23ahMOHD6O8vBw7\nd+7EypUrsWPHjrGc8oLW1dWF0tJShIeHY+LEiYO2t7e349ChQxg3bhwURUFlZSXsdjtEBDExMZgy\nZQp0Ou/1/dGjR3HixAlMmjQJVVVVsFgsSE5OhqIorjbNzc2orq5GcnIywsLCAAAdHR0oKyvDyZMn\nAcDV3uFwICYmBnFxcThy5AjGjx+P2NhYt3NWV1ejpaUFaWlpMJlMrhjS0tIQFBSEyspK1NbWIjAw\nEFOmTEFERIRrXxFBeXk5GhoaYDKZkJaWhtDQ0EH9EB8ff8b93NraisrKSkycOBGRkZE+23nGMGXK\nFBw7dgy9vb1ITExERUXFiGNoa2tDRUUFEhMT0dHR4eoHo9HoatPd3Y3S0lJYrVZMmjRpyOM1Njai\nvr4eqampCAkJ8drm5MmTOHz4MOLj4xETE+OXGDzzwWw2Iz093RXD6dOnUV5e7jUfPPshOjp60Pbj\nx4+jvLwcNpsNiYmJPmM4dOgQoqKikJCQMGheHDt2DKmpqQgODvbZfz09PSgtLYXRaERKSorbPPDk\nbV54am9vR2lpKU6dOuWaF+Xl5Whvb3fNTYfDgdLSUjQ3N7vmqYggLi7OawyeYzHc+jBQfX09jh07\nhuTkZNTV1UGn0w1aH3yNhYigsrISp0+fds3VPgNjMJlMKC8vh8PhgMPhQFRU1BnNC2/rQ3V1NfR6\nPVJTU3Hy5ElXDIGBgThy5Ag6OjqQnp4OvV7vypO+x4D3eTFcTgLu+RAZGYny8nJ0dXXB4XAgIiIC\n6enpbjEkJSWhoqLCbSxEBIcOHRq0PvjKyd7eXpSWliIwMBCpqamu8R/JWu1tLGprazF58mTXOulp\nYD/Y7XZ88cUXSEtLg9lsHnKcfBnpvPCcm8PlZGlpKbq6unzOCzpLZIwqKyslMzPT67YVK1bIhg0b\nXI/T0tKksbHRrY0fQrgglJaWSkzMJAkJmSZGY6SsWPF9cTgcru27du2S8PA4sVqzJCAgRHQ6swBm\nAcIEGCcGw3i59NL50tHR4XZch8MhS5f+lxiNUWKxpEpAQKhYLKliMsXIbbfdJb29vSIi8vzzL4rJ\nFC5W63SxWCLlH//4p+zfv18iIxMkICBaAKsAQeo5wwWIloCAKAkICJaQkEwJCgqTxx570nXOe+/9\nkRiNkRISMk2ioibILbfcKUZjlISETJXx45PlxhtvE70+VACTKEqyGI1h8vrrb4iIiM1mk+uuu0X0\n+ggBTKLTJYvFEilbtmxx64egoHB5+OFfnlE/v/XW22I2R4jVOl1MpnD5619f9trOMwZFSRKdLlSC\nguLEaIyXgIAQ9bqHj2Hz5s1isURJSEiW6HRWMRjCJSRkqsTGTpby8nIRETly5IjEx6dKSEiGGI3R\nctdd97iN/0BPPvlbCQoKE6s1S0JCxsn27dsHtfnoo48kJCRarNbpYjSGyYoV/zXmGAbng3NsQkKi\nZfv27fLpp5+K1Rqjjk1/Pnj2Q19Mf/zjc27bN2x4VYKCrKIooQIkisEQ6TWGqKhEsVozxWiMkG9+\n8z/d5kVAQIhYrVkSFhYrO3fu9Np/DQ0NkpKSLcHBqWIyxcqNN94uPT09Xtt6mxeedu3aJWFh40Wn\ni3LNC53OIjpdpGtuzplzpaSnzxSDYZwARgFMAoQKECsGw7hBMXiOxcKFXx9yfRjo0UcfF6PROb4B\nAaFiMk0UsznRbX3wNRZ2u10WL14iRuM4CQlJl8TENKmqqhIR9zVKr7eKTmcSIFiACAEiRa8fN6J5\n4Wt90OtD1PVlkpqjkRISki4JCVNk/vybxWQaL8HBUyQpaZrk5t4gJlOcBAenSFraRdLU1OR1XgyX\nk575YDCEq9cVql6XVfT6JImP74/BbE6SgACrBAdPdY1FR0eHXHnlV8VsTnRbH3zlZHNzs0ydOluC\ngyeL2Rwv8+YtkM7OzhGt1d7GIiAgWKzW6RIcHCVbtmwZdI0D+0GnCxG9PlSs1kyJiIiXwsJCr+M0\nlJHOC8+5OVxOmkwRoijhPucFeeePOuqsFoM33XSTfPLJJ67H8+bNk127drkHwGJQRESysi4RRVkn\ngAhwQiyWLHn77bdd2+PjUwXYIMAOAeIFiBNgoQBLBbAL0CNBQbfKgw/+3O24r732mlgsMwQ4qbZ/\nWD1Hu5jNl8mzzz4rR48eFZMpUoASddtnYjZHSHJylgC3CHCVALECJAvwNQHuEKBHjeMf6j61YjbH\ny+7du+Wf//ynWCwZArSo2+4RRUlTYxBRlDtEUaapi+0+tc0eMZsj5Pjx47JmzZMSFHS5ANECVKjb\nP5Lg4CiJi+vrBxHgmFgsSV6LIW9O
nDghZnOEAAXq/gfFZIqQurq6QW0Hx7Ba7b8eAUYeQ0dHh1gs\nkQJsF+A1AbJd/aDT/UZmzrxSRETmzp0nOt0T6jFPicUyS15+eXChum/fPjGbYwWoVttulrCw8WK3\n211tenp6xGodJ8D7aptStQAZWwwpKdke+dAfQ2hojEREeM+Hwf0gAhwSkylKjhw5IiIix44dE5Mp\nQoBLBOiPwWye6SWG59Xtx0VRrAPmRYIADeq2N2TcuCSvY7JgwWIxGH4sgEOATjGbr5Hf/vbpQe18\nzYtTp065tXPOTc954T43dbopoig3CJAmQKQA1wrQH4PROM8tBs+xUJRoUZRnfK4PfXbs2CFmc6La\nD/cKcLcrBqPxdnnwwZ8PORYvvPCCWCwXC3BaAJGAgF/J5Zd/VUQGrlGH1HkxVc2Hr53RvPC2PgAp\n6npQIMALAsxxxaAoC0Snu0SATgEcoijzRKe7SoBuARxiMNwv11yzYNC8GC4nB+dDqxpDrgA3C9Df\nD8BNA2K4RIDfuY3F7bffKSaTZz/4zsnFi5eKwXCvOv42MZluksWL7xx2re4zeCzc18nu7m5XW4fD\nMaAfPhTnOv6F2v6vMnHiVK/zxJczmxee/TBcTg49L8g7f9RRZ/0DJM44+3m75btq1SrXT35+/tkO\n6d/S4cPFEFmsPgpFV9dXUVRUBACw2Wyor68AcDuAYgCXAWgDcBrAN+B866ce3d1fx65dRW7HLSoq\nQnv7TQBCABQB+Ka6xYyOjptRWFiEw4cPIzBwGoB0ddvF0OkiUFVVCiAAwDQAFwOoAtAL4OsA2gGc\nAHCTuk88dLrLUFxcjOLiYths1wMIV7dFQOQ2NQZAJBwilwNIBjBdbZMDvT4BlZWV2LOnGN3d2QBm\nqm0AIBd2uwENDX39AADjIHI1iouLR9THR48eRUDAeACz1WemITAwDRUVFYPaDo6hWL1uB4CRx1Bf\nXw8gGM4xKwKwwNUPDsdilJU5x6u0tBgOR9/4B6O9/SYcOFA06HglJSUICPgKgET1mevR2dmFlpYW\nV5svvvgCNpsAuEZ9Rg8gdMwxVFYWoz8fBsdw8qT3fBjcDwCQisDAbBw6dAgAcOTIEQQGJsOZY/0x\ndHQs8BJD/3aR0+ifF1cBGK9uuwXNzXXo6OgY1IcHDhSjp+cbABQARnR03ILduwf3ta95UVtb62rT\nPzc954X73HQ49BCZqPZbMIBGdbszhq6uRW4xeI6FSAdE7lAfu68PAxUXF0NRctV+KAZwpyuGrq7b\nsGtX0ZBjsX9/MdrbFwKwAADs9sUoLi5S+6NvjSqDc16Uw5kPt+FM5oW39QHoi3m2GvfXXDGIhMHh\n+AYAIwAFIlY4HHcACASgoKdnMQ4cOOh1XgyVk33686FajaEGQDSA/n4AwgfE0Nev/WOxb18ZOjs9\n+8F3Tu7bNzAHDejsvBWFhQeHXav7DB6L/nXS4QhEY2Ojq+3Jkydx6lRfPxQDmA8gSt36DVRXl8Lh\ncHgOk09nNi88+2G4nBx6XpBTfn6+W93kD2e1GIyPj0dNTY3rcW1trdf3kQy8qNzc3LMZ0r+tSZPS\noChvqo9Ow2h8D+npzskWGBiImJgkABsBpAH4DM6JEwzgTQACwI7AwDeRk5Phdtz09HRYLHlw/pJK\nB/C6uqULZvO7mD49HZMnT4bNVgTgsLptNxyO40hMnALn4lYGYCeACQAMAN5Szx0C4F/qPsfgcHyC\ntLQ09T0+7wM4qW47AUV5W40BcBayn8K5aPYtzAfQ01ODiRMnIjs7DUFBBwHsgXOBBoBPEBBgw7hx\nff0AAMehKPmufhrOhAkT0NvbAGCv+kwZbLYyJCcnD2o7OIY09boDAIw8Bud7lE7B2X/pADa7+kFR\n3kRKinO/1NQ06HR9498Bi2Uzpk3LGHS8tLQ02O2fAWhQn/kAQUGBbu+3jIqKgsEgALaqzzjg7POx\nxTBhQhr682FwDMHB3vNhcD8AQCVstv1ISUkBACQlJcFmqwAwEc6cdsZgNm/yEkP/dmce9s2LbQC+\nULe9i4iI8TCZTIP6cOrUNOj1ffPGBpPpH8jOHjx+vuZFQkKCq03/3PScF+5zU1HsUJQa9fpPAYgZ\nsN0Go/Edtxg8x0JRzADeUB+7rw8DpaWlQaSvH9IA/N0Vg9H4NnJyMoYci2nT0mA2vwugEwCg072J\ntDTnefrXqFQ458VkOPNhI85kXnhbH4CP4cynvWrc/3DFAJyETvd3AN0AAEU5rT7uASAwGN7A1Knp\nXufFUDnZpz8fEtX9EwAcB9DfD0DbgBjSBo1FZmYKjEbPfvCdk5mZaTAY+sa/FybTRmRlDb9W9xk8\nFv3rpE7XjZiYGFdbq9UKi6WvH9IBfACgVd36FhISfL/X3Jszmxee/TBcTg49L8gpNzfX78XgWX2Z\n+N1335WvftX5EsNnn30mc+fOHdTGDyFcEIqKitT3Ql0kJtN4+c//XOn2nqAdO3aI1RojoaGzJCAg\nVAICgqX//XsJYjBMkNmzr5TTp0+7HdfhcMiddy4TkylWgoOniU4XIhZLppjNCXLTTV93vQ/lD394\nVkymCAkNnS1mc6S8+eZbsmfPHgkLGy8BAePU85jUc0YIECcBAeNFpwsWq3WmGI2R8sgjv3adc/ny\n74nJFCNW60USEREvN9xwq5hMsWK15khU1AS55pqFYjBECGARRUkXozFcXnnF+f7Srq4uyc29QQyG\naAEsotOli9kcKZs3b3brB6MxSh54wP1l8eG8+urrrus0mSLkT396wWs7zxgUZYrodFYxGieKyZQ0\n4LqHj+Gdd/4hZnOkWK2zRKcLFYMhWqzWHImOniilpaUiInLo0CGJiZkkVusMMZvj5Otfv9vtpd+B\nfvWrtWI0Rkpo6CwJDo6WDz/8cFCb9957TyyWKLWfIuTuu5eNOYbB+RAsOl26WCxR8uGHH8rWrVsl\nJGSces7+fPDsh9DQ2WI0Rsgzz/w/t+0vvvhXCQoKVd8zOFkCA2O8xhAeHiehoTPFZBont956h9u8\n0OvDJDR0llitMW5vURmorq5OkpKmSkhIplgsE+WaaxaKzWbz2tbbvPC0Y8cOCQmJVt8z6JwXitL3\nnkHn3Jwx4xJJTp4uBsN4dQ4Zxfm+tIkSGBg/KAbPsbjuupslMtL3+jDQQw+tEqMxSqzWGaLTWcVk\nShWLZbLb+uBrLHp7e+WWW+4UszlerNZsiY2dLBUVFSLivkbp9eHq+5b73jM4XgIC4kY0L3ytDwZD\nuNo3aaLTWSUwMEas1mwZPz5ZrrjierFYJkhISJYkJk6Riy+eJxZLkoSETJPJk7OkoaHB67wYLic9\n8yEwMFoUpe89g5ECRIheP8UtBrM5VXS6YAkJyXGNxenTp9WYkt3WB1852dTUJKmpMyQkZKpYLJPk\nssuulY6OjhGt1d7GIiAgREJDZ4vFEiWbN28edI0D+8EZU7iEhs6UsLBY+fzzz72O01BGOi885
+Zw\nOWk0hqvvGfQ+L8g7f9RRinqgUbnjjjuwdetWNDc3IyYmBqtXr0ZPTw8AYMWKFQCAe++9F3l5ebBY\nLHjhhRdw0UUXuR1DUZRBLyVr1enTp1FUVITw8HBMmTJl0Pa2tjaUlJQgJiYGiqLg0KFDsNvtcDgc\nSEhIQGZmJgICAgbtJ+qn3Nra2pCcnIzKykpYLBZkZGS4vWxfX1+Po0ePIiUlxfXpwlOnTuHAgQM4\nfvw4dDodAgICICJwOByIi4tDQkICKioqEBsbO+gTjocPH8bx48cxbdo0WCwWVwzTpk2D2WxGaWkp\njh49CqPRiIyMDLe/Zh0OB4qKilBTUwOLxYLMzEzXJ38H9sNwn7r1pqmpCRUVFUhKSvL56UJvMWRk\nZKChoQF2ux0TJkzA4cOHRxxDc3MzysvLkZiYiPb2dlc/WCwWV5uOjg4cPHgQVqsVaWlpQ36Krrq6\nGvX19UhLS0N4eLjXNi0tLTh06BDi4+ORmJjolxg88yE0NBTTpk1zxdDa2oqysjKv+TCwHyZMmOD1\nVYLGxkaUlJSgu7sbSUlJPmMoLi5GZGQkUlJSBs2LxsZGpKen+/yEI+D8pO6BAwdgNBoxbdq0Ifva\n27zw1NbWhgMHDqCtrc01L8rKytDW1uaam3a7HQcOHEBjY6Pbp4knTJjgNQbPsWhvbx9yfRioqqoK\njY2NmDx5MmpqahAQEDBoffA1FiKC0tJSnD59GpmZmW53VweuUX1zuKenB729vYiPjz+jeeFtfais\nrERgYCDS09PR1tbmisFoNKK4uBidnZ3IzMxEYGAgioqKYLPZkJmZ6frEs7d5MVxOAu75EB0djZKS\nEnR0dMBut7u+Pm1gDAM/Tdw3Fna7HQcPHhy0PvjKSZvNhgMHDiAwMBDTpk2DTqcb8VrtbSxqamow\nZcoUn9+QMLAf7HY7mpqaMHXqVFit1iHHyZeRzgvPuTlcTjrf2tTuc17QYP6oo8ZUDPoDi0EiIiKi\n0fFHHcX/AwkRERGRhrEYJCIiItIwFoNEREREGsZikIiIiEjDWAwSERERaRiLQSIiIiINYzFIRERE\npGEsBomIiIg0jMUgERERkYaxGCQiIiLSMBaDRERERBrGYpCIiIhIw1gMEhEREWkYi0EiIiIiDWMx\nSERERKRhLAaJiIiINIzFIBEREZGGjakYzMvLQ3p6OlJTU7F27dpB25ubm3H99ddjxowZyMzMxJ//\n/OexnI6IiIiI/EwRERnNjna7HWlpadiyZQvi4+Mxe/ZsvPLKK8jIyHC1WbVqFbq7u/H444+jubkZ\naWlpOHbsGPR6fX8AioJRhkBERESkaf6oo0Z9Z7CgoAApKSlISkqCwWDA4sWLsXHjRrc2sbGxOHny\nJADg5MmTiIyMdCsEiYiIiOj8GnVlVldXh8TERNfjhIQE7Ny5063N8uXLcfXVVyMuLg6nTp3Ca6+9\n5vVYq1atcv07NzcXubm5ow2LiIiI6IKVn5+P/Px8vx5z1MWgoijDtnnssccwY8YM5Ofno6KiAvPn\nz8e+ffsQEhLi1m5gMUhERERE3nneNFu9evWYjznql4nj4+NRU1PjelxTU4OEhAS3Np9++iluv/12\nAMDkyZMxadIklJWVjfaURERERORnoy4GZ82ahfLyclRVVcFms+HVV1/FwoUL3dqkp6djy5YtAIBj\nx46hrKwMycnJY4uYiIiIiPxm1C8T6/V6rFu3Dtdddx3sdjuWLVuGjIwMrF+/HgCwYsUK/PSnP8WS\nJUuQnZ0Nh8OBJ554AhEREX4LnoiIiIjGZtRfLeO3APjVMkRERESjcl6/WoaIiIiIvvxYDBIRERFp\nGItBIiIiIg1jMUhERESkYSwGiYiIiDSMxSARERGRhrEYJCIiItIwFoNEREREGsZikIiIiEjDWAwS\nERERaRiLQSIiIiINYzFIREREpGEsBomIiIg0jMUgERERkYaxGCQiIiLSMBaDRERERBrGYpCIiIhI\nw8ZUDObl5SE9PR2pqalYu3at1zb5+fnIyclBZmYmcnNzx3I6IiIiIvIzRURkNDva7XakpaVhy5Yt\niI+Px+zZs/HKK68gIyPD1ebEiRO49NJL8a9//QsJCQlobm5GVFSUewCKglGGQERERKRp/qijRn1n\nsKCgACkpKUhKSoLBYMDixYuxceNGtzZ/+9vfcOuttyIhIQEABhWCRERERHR+6Ue7Y11dHRITE12P\nExISsHPnTrc25eXl6OnpwVVXXYVTp07h+9//Pu66665Bx1q1apXr37m5uXw5mYiIiMiL/Px85Ofn\n+/WYoy4GFUUZtk1PTw/27NmDDz74AB0dHbjkkktw8cUXIzU11a3dwGKQiIiIiLzzvGm2evXqMR9z\n1MVgfHw8ampqXI9rampcLwf3SUxMRFRUFEwmE0wmE6644grs27dvUDFIREREROfHqN8zOGvWLJSX\nl6Oqqgo2mw2vvvoqFi5c6Nbm5ptvxscffwy73Y6Ojg7s3LkTU6dOHXPQREREROQfo74zqNfrsW7d\nOlx33XWw2+1YtmwZMjIysH79egDAihUrkJ6ejuuvvx7Tp0+HTqfD8uXLWQwSERER/RsZ9VfL+C0A\nfrUMERER0aic16+WISIiIqIvPxaDRERERBrGYpCIiIhIw1gMEhEREWkYi0EiIiIiDWMxSERERKRh\nLAaJiIiINIzFIBEREZGGsRgkIiIi0jAWg0REREQaxmKQiIiISMNYDBIRERFpGItBIiIiIg1jMUhE\nRESkYSwGiYiIiDSMxSARERGRhrEYJCIiItKwMRWDeXl5SE9PR2pqKtauXeuz3eeffw69Xo8333xz\nLKcjIiIiIj8bdTFot9tx7733Ii8vD8XFxXjllVdQUlLitd2DDz6I66+/HiIypmCJiIiIyL9GXQwW\nFBQgJSUFSUlJMBgMWLx4MTZu3Dio3TPPPIPbbrsN0dHRYwqUiIiIiPxPP9od6+rqkJiY6HqckJCA\nnTt3DmqzceNGfPjhh/j888+hKIrXY61atcr179zcXOTm5o42LCIiIqILVn5+PvLz8/16zFEXg74K\nu4Huv/9+rFmzBoqiQER8vkw8sBgkIiIiIu88b5qtXr16zMccdTEYHx+Pmpoa1+OamhokJCS4tdm9\nezcWL14MAGhubsbmzZthMBiwcOHC0Z6WiIiIiPxIkVF+qqO3txdpaWn44IMPEBcXhzlz5uCVV15B\nRkaG1/ZLlizBggULsGjRIvcA1LuGRERERHRm/FFHjfrOoF6vx7p163DdddfBbrdj2bJlyMjIwPr1\n6wEAK1asGFNgRERERHT2jfrOoN8C4J1BIiIiolHxRx3F/wMJERERkYaxGCQiIiLSMBaDRERERBrG\nYpCIiIhIw1gMEhEREWkYi0EiIiIiDWMxSERERKRhLAaJiIiINIzFIBEREZGGsRgkIiIi0jAWg0RE\nREQaxmKQiIiISMNYDBIRERFpGItBIiIiIg1jMUhE
RESkYSwGiYiIiDRszMVgXl4e0tPTkZqairVr\n1w7a/vLLLyM7OxvTp0/HpZdeiv3794/1lERERETkJ4qIyGh3ttvtSEtLw5YtWxAfH4/Zs2fjlVde\nQUZGhqvNZ599hqlTpyI0NBR5eXlYtWoVduzY0R+AomAMIRARERFplj/qqDHdGSwoKEBKSgqSkpJg\nMBiwePFibNy40a3NJZdcgtDQUADA3LlzUVtbO5ZTEhEREZEf6ceyc11dHRITE12PExISsHPnTp/t\nn3vuOdxwww2Dnl+1apXr37m5ucjNzR1LWEREREQXpPz8fOTn5/v1mGMqBhVFGXHbjz76CM8//zw+\n+eSTQdsGFoNERERE5J3nTbPVq1eP+ZhjKgbj4+NRU1PjelxTU4OEhIRB7fbv34/ly5cjLy8P4eHh\nYzklEREREfnRmN4zOGvWLJSXl6Oqqgo2mw2vvvoqFi5c6NamuroaixYtwksvvYSUlJQxBUtERERE\n/jWmO4N6vR7r1q3DddddB7vdjmXLliEjIwPr168HAKxYsQKPPvooWltbsXLlSgCAwWBAQUHB2CMn\nIiIiojEb01fL+CUAfrUMERER0aic96+WISIiIqIvNxaDRERERBrGYpCIiIhIw1gMEhEREWkYi0Ei\nIiIiDWMxSERERKRhLAaJiIiINIzFIBEREZGGsRgkIiIi0jAWg0REREQaxmKQiIiISMNYDBIRERFp\nGItBIiK3zxf7AAAgAElEQVQiIg1jMUhERESkYSwGiYiIiDSMxSARERGRhrEYJCIiItKwMRWDeXl5\nSE9PR2pqKtauXeu1zX333YfU1FRkZ2ejsLBwLKcjIiIiIj/Tj3ZHu92Oe++9F1u2bEF8fDxmz56N\nhQsXIiMjw9Vm06ZNOHz4MMrLy7Fz506sXLkSO3bs8EvgF5LOzk48+eT/4uDBw0hKGodjx1qxe/ce\n2O29ABS1lYLY2HjMmZOFY8daAQDf+c5/Yu7cufjiiy/w+OP/g/z87ejs7AIAiAgAQKdToNcHYebM\n6YiKCsPRo01ITIxCW1sHdu3aje7ubgAKdDodUlOnYNq0SSgsLEFlZRVEHBg/Ph6LFn0V3/3uSgQE\nBLhibmlpwdq1T6GqqgGTJ8eioqIWBw8WweGwA1BgMAQiOzsLsbGR2LmzEK2tJ10x7NixBydPtmPu\n3Jl48MH7cfDgQfzpTy+houIIRJz7G40WXHbZHDz44H9j69Zt+MtfXkN19VG3fqiubsD+/Qdc/SSi\nQFHE1V+e/eBwOP+bnJyM7OwpOHiwAuXl5RBxuO0DOLzG8M47WxAZGQKDQY+GhhYkJ8eOKAYASEpK\nxkUXpaO6uglGowFhYcHYsWMPmpuPQwRQFEFk5DhcfPEMtLS0YdeuPejp6VGPIXA4HNDpFAAKAgL0\nmDp1KiZPjseuXQdRU1MLQNTzOmPQ6YDExImYNWsqysqqUVpagsTEpDHFICJu+aDT6fDnP7+Iv/3t\nTdTU1EJRgISERMyenTkoHzz7oaTkCMrKygZsV6AoOqSmpmD58m8hLS0NTzzxW+zc+bnXGObOnY66\nuibs27cf0dHjXfPCuT0Cn3++D/X1DQC850N4eCTmzp2Bjo4uFBTscs2Dvn7o28fbvHA47K6xGDg3\nb7/9Jtx++6148snfoKqqAddeexkWLLgJa9Y8hfz87dDrjbj22ivw4IM/REhICACgpqYGjz/+FD79\ndMegGDzHIiwsEhdfPN3r+uDcBxBRoNfrkZU1DRMnxqKysgExMWHo7RXs2FHgWh+GGovAQBNmzZqB\n8PBgfPbZbrS0tLrmRXR0LC6+eDpaWk6ju7sHS5fegauuugptbW1Yu/YpHD5cM6J54W19mD07Bz/+\n8fdRUVGBl19+C8HBQQgJMeOzz3ajvb0Lc+bMQHCwGZ99tgtdXT2YM2cGjMZA7NixGydPnvQ6L4bL\nSc986O7uwYkTp3H77Tdi0aJF6O7uxlNP/RbvvvueWwy1tc2D1mqTKQRz5+YgMNDgWh+GysmQkHBc\nckkOurtt2LnTmYMjWau9jUVXlw0TJ8Zg166DqK4+6pYPANz6oa7uOAICdIiODkVBwV40Nh4btE4q\nSv86ONZ54Tk3h8vJu+++HZde+hXXvDCbQ3DDDVfjRz/6AYxGI+jsUKQvC87QZ599htWrVyMvLw8A\nsGbNGgDAT37yE1eb73znO7jqqqvwjW98AwCQnp6OrVu3IiYmpj8ARcEoQ7gg2O12XHrptdi3Lwxd\nXdkAngAQAyAEQCuANgBhAC4HkAjgGQC/AKDAbF6LN974K7797ftQX2+BSCeAJgAG9Ri9AI4D+BmA\n59Tj3ADgVwAS1HbNALoA3AugBMB29XEkgIsAXIegoD/jttvS8dJLfwIAtLe3IytrLurqvgKbzQpg\nvRrvOHXfZgAPA3hJjb9tQAw29fHPAdQhKOj3AILR3d2u7m8EcArAg1CUUphMf4XDYUVXV4tHP/wO\nQDiAKACn1XOaBsTg2Q9Nao//J4AWAG+rj2MG9EOnzxiAcejo+C6Ax9UYJp1BDN8A0APgVQCrADwP\noFuNo0eNYRKAywA8BiBOPc5x9biRA/LhJIAH1HEqUh+HeInhZgCBAF5QH/sjBvd8SEiIw9NPb0BX\nV5Maw7Vqf3jmg2c/vAIgAEAsnL84mtQxugfABAQG/hKK0onu7r6x8IwhA8D/qtfsOS82A6gEcMLH\nWMTAmUc3A3hU3bcvhr5+GG5eeJubVyIo6P8hMLAe3d0LYLPNhsn0ewQENKK9PRYiQQC+jYCAD5Ce\nfgS7d29Da2srpk6didbWCDgNjMFzLDIAzITv9SFczYUWAD9V+6FJ7dNfqsfoHsFYHFf3/z/1cSv6\n50W6GsOTcOZgGMzmx/GXv6zDz372GKqqstDdHYfh54W39eHnAJoRFPRbKIoVXV0PDojhFICfAPiN\nmg/tAH6k9oVV3d/XvBguJwfmwy8BLAOQCrP5f7BmzQ/w5pub8PHH9ejtPT4ghikA5nqMRSeA++C+\nPgyVk0YAi+HMwQSMfK32NRa7AXwOoMMjH4579MMf4Zz/bwJoVPPFc50UuM/NscwLz7k5XE5mIiho\nDRSlDV1dcep5F8Ng2Ig5c2zYtm0zdDq+u82TP+qoUReDf//73/Gvf/0Lzz77LADgpZdews6dO/HM\nM8+42ixYsAAPPfQQvvKVrwAArrnmGqxduxYzZ87sD0BR8Mgjj7ge5+bmIjc3dzQhfSkVFBRg3ry7\ncfr0QQA/gHPh2wegCs4JbAFQCuBjAHcCuALASnXvZ5GZ+UccOTIOHR1bAcwBMBnOBeAjOBeMh9X/\nfgXAUQBrAZQB+ATOYuErcE78x+FcdHLV5/cC2AXnOwlOIzAwDg0NVYiIiMDrr7+OpUv/iNOn34dz\nEY0BUAegL4b/gnMhSPcSQwyAp9XzAMB4ANfDuQAVqNf/LwDT1O3BAG7y0g/BAI6o1zEPQLxHDJ79\nMBXOxe1
PAGYAyIGzMB3YD0PFcABAMZy/nM8khnYAfwNwKZwL/cB+uGhADOVqPzUMiGHgWFTBmQ+z\n4cyTYLXfIr3E0ARnsTsfQJIfY+jPB4MhFoAdPT19MVQB8JUPnv2Qrh5724AYQuAsIgDgW+p5PvcR\nwz3qWHnOi6UAQocYi758qIOz0CjwiMEzH3zNi76xGJiTCoA/w1l0fKZexzoAb6jX0aBeoyAkZC7e\neOPX2LdvH3760wL09HiLwXMshlofStS4rodzzPv6oQHAOwBeO4OxeBrOIu4mDJ4XfTFEwfnLHQDe\nRnLyw/jiCytOnRrpvPC2PtygHm8igBc9YnhZjeFu9M/NLwD8N4aeF1UYOic986EUzvUBAAoRHn4T\nbDYD2ts9Y/A2FrsxeH0YKicb4CyG38eZrdW+xiIC3vPBsx9uBfA1OAsxX+uk59wcy7zw7IfhchIA\nHgRwCM55dBRAEIBeWCwZ2L79NeTk5EDr8vPzkZ+f73q8evXqMReDo36ZWFGU4RsBgwL0tt+qVatG\nG8aXXnd3N3S6YDj/Ou+G8y/GEPXfAudEsML5y6Zb/XefUHR32yBiUffvgXNITep+PWp7m/pc4IBz\nmOGcgBhwzG71v0FqDH1/gZmg0wXCZrO5YnbfRw/nAtAXg1V93lsMNo9r6HuJ1gjnQt7ssb3XRz+E\nq+372njG4NkPfdv7YtZ56YehYui7hjONYWA/efbDwBj06B+bvhiA/rHoywer2md29cfgJQbPsfFX\nDO750Nt7ekAMQ+WDt5jMHjEM7G+dus1XDN3qvz3nhd2jH3zlg1E9r2cMI50X3uYm1P8OvA4MiMHk\naqMoVthsNnR322C3+4rBcyyGWh9C0J8PA/vBorY7k7Hom7ve5kVfDKEDri9UXRfOZF54Wx/6OLzE\nYIXzD5yBc7MWw8+L4XLSMx/c19aeHhsUJdRLDN7Gwtv6MFROmtQ2Z7pW+xoLX/ng2Q995xhqnfTn\nvPDsh+FyEuo5LQP6AQACoNNZXL+DtM7zptnq1avHfMyAVaOsxE6ePIl33nkH3/rWtwAA7777LkJD\nQ3HZZZe52mzbtg0RERHIzMwEADz22GO47777EBwc7GqzevVqTReD48aNw7PPPoOOjkY4HFPgvDty\nCs6XIpoA7ITzdr4Dzr/mHoXzr7cKmM334xe/+D4++ODPsNkmwDm59sJ5FysGzgn1Bpx/zb2vPp8D\n4Ck4J2w8nH/Bvw/nHaQuAPvh/MvzJJwvOQRBr38YF11kwfe/vxKKoiAmJgb/938/R3e3Hs6/TF8C\ncBhAXwxvALgY/X9NvjUghjD1cTqAHQgI2AiD4TTs9gI4F4YkABvV6/8Aev0HCAgAHA7PfvgTnL8Q\nUuBctHbD+dekr34oArADzjuRAQA+BFAI52LZ1w++YwgM3Ine3svhvINgV/d5bgQx7IHzr3aj2u+X\nD+iHQ+o+BgA1cN6J/F/1+EnqsXeoY9GXD/+A8yXDJjjvMuwGUO0RQwmcvzCsAP6uxuGPGPrzYeZM\nC2bPvhjl5VVwOArgvCtg8JEPnv3wLpx5ZoXzzkUZnHcJEgG0IjBwPYAd6vtHvcWQpI5DE9znRbYa\nxwl1bH3lw170v8QWOiCGvn4Ybl54m5s6BAQ8i4CAjyFiBiAICnoOwDaIJAP4FEAsFOUvCA3dgief\nfByJiYl4/vmfoKcnxEsMnmPRrJ73aQxeH47BOZ/a4MzbLLUf/qGO9VPqdYxkLN5U93kbzjtAleif\nF30x/AJAKoAGmM3fww9/uASff/4uurp6IRKH4eeFt/UhA8Ae6PWvIzBwuzrX+mJ4C8CVcN45SlKv\n8Ur1PFPgvEvobV5UYeic9MyHX6rnOw2z+V4sWXIjGhqqcOJEtNovfTE0qv09cK3+B5x3wAauD7+E\n75z8DM67Yv8D55o00rXa11icgvNu8uce+fC5Rz/8Cs67qAfgLFD3Y/A6uRfuc3Ms88Jzbg6Xk10w\nGP4ARfkEImFw3mGMhKI8g7i4Mvzyl7+AXj/qe1gXLH/UUaN+mbi3txdpaWn44IMPEBcXhzlz5uCV\nV14Z9AGSdevWYdOmTdixYwfuv//+QR8g0fp7BgGgvr4e3/nOD1FaehixseFoajqBmpp6OBw9cL5p\nPgCKosBqDUFGRhKam9uhKAoeeGAF7rrrWzhw4AC+850HcOBAEXp6ugDooCg6iAgUxYGAACMSE+MQ\nHR2GhobjGDcuFG1tHaiqqobd7vxLS1ECER0dieTkeBw6VIGWlpMAFAQHW3D99fOwbt0TsFr7/3or\nLS3Fd7/7IGprGxAfH4GjRxvR2NgEh8P513nfOceNC0dxcRm6uuxITIxDVFQoiovL0N3tQErKJDzx\nxM/x6aef47nn/orm5jY1ZoFeH4SMjFT85jeP4o03/okNG15Ha+tpAP39UFf3BerqGiHigIgdihKg\nvmFa4FyQ3fvB+ZeqHhER4ZgyJREVFdVobm6Fw+FQtwMiAWp/DI7h3Xc/QEiICXq9AS0tbYiLCx9R\nDDqdHmFhoUhPn4iGhhMIDNTDajWjuLgMHR0dcC6IgNlsRkbGJLS0nMbRo7VwOHoBOCCig/OOkgPO\nD4oEISYmGklJsSgpKUVbW4frDd59MQABCA119lNVVT2amloRGhoyphgUReeWD4GBgXjggYfx97+/\niba2DgAKQkOtyMhIGpQPnv1w5EgNmppa3PpeUQIRFRWB5cu/idmzZ+DHP/4lKiqqvMYwdWoyGhqO\no66uEcHBFte8EHEgOjoMJSWlOHmyw2s+AIDRaEJ6+iS0t3fjyJEq1zwAAgbki/d54exrx6C5uWDB\ntVi5cgkeeGA1amsbMG/eZViy5A7cf//DOHiwGDpdAGbPzsH69U9h0qRJAIAdO3bge997CKWl5W4x\neI5FdXUdjEYjpk6d5HV96MtZEefcGz9+HBITY1Bf34yIiGDYbHaUlx9xrQ9DjUVAgBETJ8YjLMyK\nkpJSdHZ2uq7TYrFg6tRJOHGiC3a7HStWfBP33rsSR44cwcqVD+Do0doRzQtv68OkSROxZs1D2L+/\nGH/5yxsICtLDYjGjpKQUPT1AUlICQkLMKCk5hN5eIDk5ESaTCaWlh9Dd3el1XgyXk5750N3tQEdH\nJ2677UasWvVTNDc3Y+XK/8bWrR+7YrBag9HU1DJorTYYTEhJmYDAwCDX+uArJxUFCAoyIi1tErq7\ne3D4cCXsdtuI1mpvY9HT04Px48NRVlaO1tbTXtYDnasfjh07BZ0OiIy0oqSkDCdPdrh+D/etk841\np38ejGVeeM7NoXIyPDwM3/zmIixYcB1+8IOHUVZ2GAZDIC67bC7Wr/9fjB8/3vcvUg07r+8ZBIDN\nmzfj/vvvh91ux7Jly/DQQw9h/fr1AIAVK1YAAO69917k5eXBYrHghRdewEUXXeQeAItBIiIiolE5\n78WgP7AYJCIiIhodf9RR/Iw2ERERkYaxGCQiIiLSMBaDRERERBrG
YpCIiIhIw1gMEhEREWkYi0Ei\nIiIiDWMxSERERKRhLAaJiIiINIzFIBEREZGGsRgkIiIi0jAWg0REREQaxmKQiIiISMNYDBIRERFp\nGItBIiIiIg1jMUhERESkYSwGiYiIiDSMxeC/ifz8/PMdwr8F9oMT+8GJ/eDEfnBiPzixH5zYD/4z\n6mKwpaUF8+fPx5QpU3DttdfixIkTg9rU1NTgqquuwrRp05CZmYnf/e53Ywr2QsakdmI/OLEfnNgP\nTuwHJ/aDE/vBif3gP6MuBtesWYP58+fj0KFDmDdvHtasWTOojcFgwG9+8xsUFRVhx44d+P3vf4+S\nkpIxBUxERERE/jPqYvCdd97B3XffDQC4++678fbbbw9qM378eMyYMQMAEBwcjIyMDNTX14/2lERE\nRETkZ4qIyGh2DA8PR2trKwBARBAREeF67E1VVRWuvPJKFBUVITg4uD8ARRnN6YmIiIgIzjpsLPRD\nbZw/fz4aGxsHPf/rX//a7bGiKEMWdadPn8Ztt92Gp59+2q0QBMZ+AUREREQ0ekMWg++//77PbTEx\nMWhsbMT48ePR0NCAcePGeW3X09ODW2+9Fd/61rfwta99bWzREhEREZFfjfo9gwsXLsSLL74IAHjx\nxRe9FnoigmXLlmHq1Km4//77Rx8lEREREZ0Vo37PYEtLC77+9a+juroaSUlJeO211xAWFob6+nos\nX74c7777Lj7++GNcccUVmD59uutl5McffxzXX3+9Xy+CiIiIiEZn1HcGIyIisGXLFhw6dAjvvfce\nwsLCAABxcXF49913AQCXXXYZHA4H9u7di8LCQhQWFroKwVWrViEhIQE5OTnIycnB5s2bXcd+/PHH\nkZqaivT0dLz33ntjub4vhby8PKSnpyM1NRVr16493+GcU0lJSZg+fTpycnIwZ84cACP7Dssvu6VL\nlyImJgZZWVmu54a67gt1TnjrBy2uDb6+k1VrOeGrH7SWE11dXZg7dy5mzJiBqVOn4qGHHgKgvXzw\n1Q9ay4c+drsdOTk5WLBgAQA/54OcJ6tWrZKnnnpq0PNFRUWSnZ0tNptNKisrZfLkyWK3289DhOdG\nb2+vTJ48WSorK8Vms0l2drYUFxef77DOmaSkJDl+/Ljbcw888ICsXbtWRETWrFkjDz744PkI7aza\ntm2b7NmzRzIzM13P+bruC3lOeOsHLa4NDQ0NUlhYKCIip06dkilTpkhxcbHmcsJXP2gxJ9rb20VE\npKenR+bOnSvbt2/XXD6IeO8HLeaDiMhTTz0ld955pyxYsEBE/Ps747z+7+jEyyvUGzduxB133AGD\nwYCkpCSkpKSgoKDgPER3bhQUFCAlJQVJSUkwGAxYvHgxNm7ceL7DOqc882Ak32H5ZXf55ZcjPDzc\n7Tlf130hzwlv/QBob23w9p2sdXV1mssJX/0AaC8nzGYzAMBms8FutyM8PFxz+QB47wdAe/lQW1uL\nTZs24dvf/rbr2v2ZD+e1GHzmmWeQnZ2NZcuWuW5v1tfXIyEhwdUmISHBtRhciOrq6pCYmOh6fKFf\nrydFUXDNNddg1qxZePbZZwEAx44dQ0xMDADnp9aPHTt2PkM8Z3xdt9bmBKDttaGqqgqFhYWYO3eu\npnOirx8uvvhiANrLCYfDgRkzZiAmJsb10rkW88FbPwDay4cf/OAHePLJJ6HT9Zdt/syHs1oMzp8/\nH1lZWYN+3nnnHaxcuRKVlZXYu3cvYmNj8cMf/tDncS7kL6a+kK9tJD755BMUFhZi8+bN+P3vf4/t\n27e7bR/uOywvVMNd94XcJ1peG06fPo1bb70VTz/9NEJCQty2aSknPL+bVos5odPpsHfvXtTW1mLb\ntm346KOP3LZrJR88+yE/P19z+fDPf/4T48aNQ05Ojs/vZh5rPgz5PYNjNdT3FA707W9/2/WGyPj4\neNTU1Li21dbWIj4+/qzE9+/A83pramrcKvoLXWxsLAAgOjoat9xyCwoKCkb8HZYXGl/XrbU5MXC8\ntbQ29H0n61133eX6qi4t5oS376bVak4AQGhoKG688Ubs3r1bk/nQp68fdu3ahdzcXNfzWsiHTz/9\nFO+88w42bdqErq4unDx5EnfddZdf8+G8vUzc0NDg+vdbb73l+jThwoULsWHDBthsNlRWVqK8vNz1\nKdML0axZs1BeXo6qqirYbDa8+uqrWLhw4fkO65zo6OjAqVOnAADt7e147733kJWVNaLvsLwQ+bpu\nrc0JLa4N4uM7WbWWE776QWs50dzc7Hrps7OzE++//z5ycnI0lw+++mHg/xlNC/nw2GOPoaamBpWV\nldiwYQOuvvpq/PWvf/VvPpyVj7yMwF133SVZWVkyffp0ufnmm6WxsdG17de//rVMnjxZ0tLSJC8v\n73yFeM5s2rRJpkyZIpMnT5bHHnvsfIdzzhw5ckSys7MlOztbpk2b5rr248ePy7x58yQ1NVXmz58v\nra2t5zlS/1u8eLHExsaKwWCQhIQEef7554e87gt1Tnj2w3PPPafJtWH79u2iKIpkZ2fLjBkzZMaM\nGbJ582bN5YS3fti0aZPmcmL//v2Sk5Mj2dnZkpWVJU888YSIDL02aqkftJYPA+Xn57s+TezPfBj1\nl04TERER0Zffef00MRERERGdXywGiYiIiDSMxSARERGRhrEYJCIiItIwFoNEREREGsZikIiIiEjD\nWAwSERERaRiLQSIiIiINYzFIREREpGEsBomIiIg0jMUgERERkYaxGCQiIiLSMBaDRERERBrGYpCI\niIhIw1gMEhEREWkYi0EiIiIiDWMxSERERKRhLAaJiIiINIzFIBEREZGGjakYXLp0KWJiYpCVleV1\n+8svv4zs7GxMnz4dl156Kfbv3z+W0xERERGRn42pGFyyZAny8vJ8bk9OTsa2bduwf/9+PPzww7jn\nnnvGcjoiIiIi8rMxFYOXX345wsPDfW6/5JJLEBoaCgCYO3cuamtrx3I6IiIiIvIz/bk60XPPPYcb\nbrhh0POKopyrEIiIiIguOCIypv3PyQdIPvroIzz//PNYu3at1+0iwh8//jzyyCPnPYYL7Yd9yj79\nMvywT9mnX4Yf9ql/f/zhrN8Z3L9/P5YvX468vLwhX1ImIiIionPvrN4ZrK6uxqJFi/DSSy8hJSXl\nbJ6KiIiIiEZhTHcG77jjDmzduhXNzc1ITEzE6tWr0dPTAwBYsWIFHn30UbS2tmLlypUAAIPBgIKC\ngrFHTUPKzc093yFccNin/sc+9T/2qf+xT/2PffrvRxF/veA82gAUxW+veRMRERFpiT/qKP4fSIiI\niIg0jMUgERERkYaxGCQiIiLSMBaDRERERBrGYpCIiIhIw1gMEhEREWkYi0EiIiIiDWMxSERERKRh\nLAaJiIiINIzFIBEREZGGsRgkIiIi0jAWg0REREQaxmKQiIiISMNYDBIRERFpGItBIiIiIg1jMUhE\nRESkYWMqBpcuXYqYmBh
kZWX5bHPfffchNTUV2dnZKCwsHMvpiIiIiMjPxlQMLlmyBHl5eT63b9q0\nCYcPH0Z5eTn++Mc/YuXKlWM5HY1QXV0dCgsL0d7e7nV7V1cX9u3bh6qqKp/HqK2txd69e9HR0eF1\ne2dnJ/bu3Yvq6uoRx9XS0oI9e/agpaVlxPucC+3t7SgsLER9ff35DuW8qK+vHzJfPIkIysrKUFRU\nBLvdPqJ9+nLu6NGjPtvU1NRg79696Ozs9LrdM+ccDgdKSkpQUlICh8MBAKiurh7yGJ7sdjuKi4tR\nWlrqOsb50Nvbi4MHD+LQoUMQEYgIKioqsH//fthstvMW10i0tbVhz549+OKLL7xuFxEcOnQIRUVF\n6O3tBTD8GnU2+CNfaOz68qWpqWnE+5zpGuVNbW0tCgsLXb/TWltbsWfPHjQ3NwMY2Rrl6d/1d9qo\nyBhVVlZKZmam120rVqyQDRs2uB6npaVJY2OjWxs/hEAD/OxnqyUoKFys1iwJD4+T3bt3u20vLy+X\n2NjJEhIyVYzGKFm69L/E4XC4tfnRj34mRmOEWK2ZEhmZIPv27XPbXlJSIuPGJUlIyDQxGiNl5cof\nDDqGpw0bXhWTKVys1uliNkfIa6/93T8XPEYFBQUSFhYrVmuWGI3h8sgjvz7fIZ1TjzzyazEanfkS\nFhYrBQUFQ7bv7OyUK6/8qpjNiWKxJEtW1sXS0tIy5D5lZWUyfnyyK+eWL//eoHy5//4HXTkXFTVB\nDhw44La9qKhIoqMnunLunnvuk1mzrhSLZaJYLBNl9uxcWb78XjEaIyUkZJqMG5ckJSUlQ8bV2toq\n2dlfEYtlkpjNiXL55ddJR0fHkPucDU1NTZKWdpEEB6eIyRQn1177NVm06JtiMsVIcHCaTJw4VWpq\nas55XCPx3nvvSXBwlFit08VoDJM//OFZt+1dXV1y9dU3idmcIBZLskybNkf++79/MuQadTacOnVK\n5sy5ypUvM2deIStWfN+VL9HRE6WoqOisx6F1W7ZsccuXdev+MOw+Z7pGefOTn/xCzblMiYiIl9/+\n9rdiNkeI1TpdTKZwWbv2yWHXKE+vvfZ3t2Ns2PDaGcflL/6oo85qMXjTTTfJJ5984no8b9482bVr\nl3sALAb9Ztu2bWI2JwlwTAARYIPEx09xazNz5pWi0/2vuv2kWCw58tpr/Un8/vvvi8WSIkCz2ubP\nkpyc5XaMzMyLRVF+r25vFYslUzZu3OgzrmPHjonJFCHAXnWfPWI2R0hzc7N/O2AUxo9PFuB1Na5G\nMVHGFaEAACAASURBVJsnyKeffnq+wzonPv30UzGbJwjQqF7/6zJ+fPKQ+/z856vFZPqaAD0C2CUw\ncIXcffd3htwnO/tSUZTfqedoE4tlurzxxhuu7Xl5eWKxpAlwXG3znKSm5rgdIz19lijKenV7i+j1\n48Vg+KYAvQL0isFwhRgMUwQ4IYCIovxeMjMvHjKuZcu+K0FB3xbALkCPGI23ykMP/WKYXvO/RYu+\nJQbDDwRwCNAtBsNUMRjmCtAhgEMCAh6Rq69eeM7jGk5nZ6cEB0cJsFUdl3IxmaLk8OHDrjarVv1K\nTKYFAtgEcIhe/zUJCIgbco06G+6//8diNN7pyhe9/irR6ycL0KLmy3rJyJh91uPQsu7ubjVfPlLH\nvkJMpnFSVlbmc5/RrFGePvroI7FYkgVoUo/xrCiKSYCd6uMiUZRQUZSnfa5Rnpqbm8VkChdgj7rP\nXjGZIqSpqemMYvMXf9RRZ/0DJM44+ymKMqjNqlWrXD/5+flnO6QLVnFxMYB5AMapz9yO+vrD6Onp\ncbUpKyuGw7FYfRSCjo4bUVRU5HaM3t5rAUSqzyxGVVWx2ziWlxdB5A71URi6ur7qdgxPlZWVCAyc\nBCBbfSYHen3ikC9TnwtdXV1oaqr+/+3deVxU9f4/8NfADMwZVgEBBRSVgWFR1MA9Qw01U+u6FFZq\nbpnGLc1K697fTbslkmlaWpllWRnara5WKrkklal43Rc0ceErIunFJXeW4f37w+M0I4s2M16yeT0f\njx555nw+7/mcz3nzmTfnjEcA/dVXQqDRdFHn8c8vLy8PGk0XACHqK/1x8uRRXLlypcY+27bl4fLl\nAQC0ANxQVvYgtm+v+dwDQH5+HkSu5ZwvLl/uZZMve/fuRXl5DwAB6itpOHzYNubhw9Yx6qGiwh/l\n5WkA3AG4o7y8EcrLewPwAwCIpCE/v/Zxbd+eh9LSB3D12zJaXLkyEFu21N7nVti5M089Fg0AD5SX\nh6G8fAAABYAGZnNarT9fdaW4uBgiCoDO6itR8PBohQMHDlja/JYvOgAaVFQ0Q2VlF9S2Rt0KW7fm\n4cqVgbiWLxUVkaio6AWgHoCr+XLo0B9vjv9MfvnlF1RWegBIUV9pCg+PO/Dzzz/X2MeeNaq6GJWV\ndwOor76SBJEAAG3U7TiIlFl9plVdo65XUFAAna4RgFbqK4nw8GiCw4cP3/S4HJGTk2NTNznDLS0G\nw8LCUFhYaNk+duwYwsLCqrSzPqiUlJRbOaQ/NZPJBI1mHYBT6ivLEBraBDqdztImKioGGs2X6tZF\nGAzZMJlMlv0xMTHQatcCOKu+8iUaNTLZFPFNm5qg0Xyhbp2HoqyyiXG9yMhIlJUdAXCtyNqD8vKj\naNy4sf0H6wR6vR5BQWEAvlZfKQHwfa3H8mcSExMDke9x9bgB4GsEBYVDr9fX2CcxMQZ6/TIAZgAC\nne5LtGhR+3w1bWqdcxegKN8iNjbWst9kMkGnWw3gV/WVL9G4sW3Mxo1jAFyLcQ5a7a/Q6T4HUAmg\nElrtMeh0KwGcBwBoNF+iWbNY1KZ58xh4ePwbgAAwQ69filat/vfnPj4+Blrtl+o4yqHTFUOnWwrg\n6geeu/uXf8icDA0NBXARwEb1lQKUle2A0Wi0tLmaL0sBVAAQuLsXwM0tB7WtUbdCYmIMPD2X4rd8\nKYRW+y2Ac2qLLxEZ+ceb4z+TkJAQuLmVAvhJfeUoysq22uTL9exZo6qL4ea2DsAZ9ZWd0GhOA7j2\nF1oPQKPxsPpMq7pGXa9x48YoLz8KYI/6Sh7Kyo4gMjLypsfliJSUFKcXg7f0NvHy5cvlnnvuERGR\njRs3Stu2bau0ccIQyMrV7/sFiZ9fkvj5hUpubq7N/v3790v9+o3F17eVKEoDeeihETbfjaisrJS/\n/vUZUZRg8fO7Q/z9G1T5Ts+ePXskKChCfH1bi6KEVvu9w+t99NEnoigB4ueXLIoSIIsWZTnvoB2w\nYcMG8fUNET+/JNHrg2TSpP/9bcK6NGnSPyz54usbcsNb5BcvXpR27bqJl1dT8faOFZPpjhve7s/L\ny5OgoEaWfBk8+LEqOff44+MsORcQECbbt2+3ibFz504JDAxXY4TIo4+OlhYt2ou3d7R4exulZcuO\nMnjwKFGUUPH1bS1BQRGyZ8+eWsd16tQpiYtLFm9vk3h5NZM2bbrIhQsXbjBjzldc
XCxNmzYXH594\n8fKKlLvuukd69RogBkO4+Pq2kPDwaCkoKPifj+tmLF++XAyGQPHzSxa9PkBmz55rs//SpUvSsWOq\neHk1EW/vWImObiVPPPF0rWvUrXDu3Dlp2bKjeHsbxds7Wpo3bydDh44WRQkRX9/WEhgYLrt27brl\n43B1K1euFC+vIHW9DZAZM964YZ/fu0ZVZ9y4iaIo9cXPL0n8/RtIRkamzefRSy+9XOsaVZ1Fi7Js\nYnz00Se/e1zO4ow6SqMGssugQYPw/fffo6SkBCEhIZgyZYrlcv/o0aMBAOnp6cjOzoaXlxc++OAD\ntG7d2iaGRqOpciuZHHPkyBGcOHECsbGx8PPzq7L/4sWL2Lt3L/z8/BAdHV3trftDhw6hpKQEsbGx\n8PX1rbL/woUL2Lt3LwICAmr9zc7aL7/8giNHjqBp06YICQm5cYf/kbNnz2L//v0ICQlBkyZN6no4\n/3PX8sVkMsHf3/+G7c1mM/bs2QOz2YzmzZvf1FWdaznn7++P6OjoatscPHgQp06dQlxcHHx8fKrs\nP3/+PPLy8hAYGIioqChUVFRg9+7d0Gg0SEhIgFarRX5+Pk6fPo34+Hh4e3vfcFzl5eXYvXs33N3d\nkZCQAHd39xv2uRVKS0uxZ88eeHh4ID4+HhqNBvv27cPFixeRkJAARVHqZFw3o6SkBPn5+WjUqFG1\nd34qKyuxZ88eVFRUICEhAR4eHjdco26Fa39jW0TQvHlzaLVaS87dbL6Q406dOoUDBw4gIiIC4eHh\nN9Xn965R1Tl8+DBOnjyJuLg4+Pr64uTJkzh06BAiIyPRoEGDm1qjrnfixAkcPnwYTZo0Ua+U1w1n\n1FEOFYPOwGKQiIiIyD7OqKP4L5AQERERuTAWg0REREQujMUgERERkQtjMUhERETkwlgMEhEREbkw\nFoNERERELozFIBEREZELYzFIRERE5MJYDBIRERG5MBaDRERERC6MxSARERGRC2MxSEREROTCWAwS\nERERuTAWg0REREQujMUgERERkQtjMUhERETkwlgMEhEREbkwh4rB7OxsmEwmGI1GZGZmVtlfUlKC\nnj17omXLlkhISMCHH37oyNsRERERkZNpRETs6Wg2mxETE4M1a9YgLCwMycnJyMrKQmxsrKXN5MmT\nUVpaioyMDJSUlCAmJgYnTpyAVqv9bQAaDewcAhEREZFLc0YdZfeVwc2bNyMqKgqRkZHQ6XRIS0vD\nsmXLbNo0aNAA586dAwCcO3cOgYGBNoUgEREREdUtuyuzoqIiREREWLbDw8ORm5tr02bUqFHo2rUr\nGjZsiPPnz+Ozzz6rNtbkyZMtf05JSUFKSoq9wyIiIiL608rJyUFOTo5TY9pdDGo0mhu2mTp1Klq2\nbImcnBwcOnQIqamp2LlzJ3x8fGzaWReDRERERFS96y+aTZkyxeGYdt8mDgsLQ2FhoWW7sLAQ4eHh\nNm02bNiAgQMHAgCaNWuGJk2a4Oeff7b3LYmIiIjIyewuBpOSkpCfn4+CggKUlZVhyZIl6Nu3r00b\nk8mENWvWAABOnDiBn3/+GU2bNnVsxERERETkNHbfJtZqtZgzZw569OgBs9mMESNGIDY2FvPmzQMA\njB49Gi+88AKGDRuGxMREVFZW4tVXX0VAQIDTBk9EREREjrH70TJOGwAfLUNERERklzp9tAwRERER\n3f5YDBIRERG5MBaDRERERC6MxSARERGRC2MxSEREROTCWAwSERERuTAWg0REREQujMUgERERkQtj\nMUhERETkwlgMEhEREbkwFoNERERELozFIBEREZELYzFIRERE5MJYDBIRERG5MBaDRERERC6MxSAR\nERGRC2MxSEREROTCHCoGs7OzYTKZYDQakZmZWW2bnJwctGrVCgkJCUhJSXHk7YiIiIjIyTQiIvZ0\nNJvNiImJwZo1axAWFobk5GRkZWUhNjbW0ubs2bPo2LEjvv32W4SHh6OkpARBQUG2A9BoYOcQiIiI\niFyaM+oou68Mbt68GVFRUYiMjIROp0NaWhqWLVtm0+bTTz9F//79ER4eDgBVCkEiIiIiqltaezsW\nFRUhIiLCsh0eHo7c3FybNvn5+SgvL0eXLl1w/vx5PPXUUxg8eHCVWJMnT7b8OSUlhbeTiYiIiKqR\nk5ODnJwcp8a0uxjUaDQ3bFNeXo5t27Zh7dq1uHTpEtq3b4927drBaDTatLMuBomIiIioetdfNJsy\nZYrDMe0uBsPCwlBYWGjZLiwstNwOviYiIgJBQUFQFAWKoqBz587YuXNnlWKQiIiIiOqG3d8ZTEpK\nQn5+PgoKClBWVoYlS5agb9++Nm3uu+8+rF+/HmazGZcuXUJubi7i4uIcHjQREREROYfdVwa1Wi3m\nzJmDHj16wGw2Y8SIEYiNjcW8efMAAKNHj4bJZELPnj3RokULuLm5YdSoUSwGiYiIiP5A7H60jNMG\nwEfLEBEREdmlTh8tQ0RERES3PxaDRERERC6MxSARERGRC2MxSEREROTCWAwSERERuTAWg0REREQu\njMUgERERkQtjMUhERETkwlgMEhEREbkwFoNERERELozFIBEREZELYzFIRERE5MJYDBIRERG5MBaD\nRERERC6MxSARERGRC2MxSEREROTCWAwSERERuTCHisHs7GyYTCYYjUZkZmbW2O4///kPtFotvvzy\nS0fejoiIiIiczO5i0Gw2Iz09HdnZ2cjLy0NWVhb27dtXbbuJEyeiZ8+eEBGHBktEREREzmV3Mbh5\n82ZERUUhMjISOp0OaWlpWLZsWZV2b775JgYMGID69es7NFAiIiIicj6tvR2LiooQERFh2Q4PD0du\nbm6VNsuWLcN3332H//znP9BoNNXGmjx5suXPKSkpSElJsXdYRERERH9aOTk5yMnJcWpMu4vBmgo7\na+PGjcO0adOg0WggIjXeJrYuBomIiIioetdfNJsyZYrDMe0uBsPCwlBYWGjZLiwsRHh4uE2brVu3\nIi0tDQBQUlKClStXQqfToW/fvva+LRERERE5kUbs/FsdFRUViImJwdq1a9GwYUO0adMGWVlZiI2N\nrbb9sGHD0KdPH/Tr1892AOpVQyIiIiL6fZxRR9l9ZVCr1WLOnDno0aMHzGYzRowYgdjYWMybNw8A\nMHr0aIcGRkRERES3nt1XBp02AF4ZJCIiIrKLM+oo/gskRERERC6MxSARERGRC2MxSEREROTCWAwS\nERERuTAWg0REREQujMUgERERkQtjMUhERETkwlgMEhEREbkwFoNERERELozFIBEREZELYzFIRERE\n5MJYDBIRERG5MBaDRERERC6MxSARERGRC2MxSEREROTCWAwSERERuTCHi8Hs7GyYTCYYjUZkZmZW\n2b9o0SIkJiaiRYsW6NixI3bt2uXoWxIRERGRk2hEROztbDabERMTgzVr1iAsLAzJycnIyspCbGys\npc3GjRsRFxcHPz8/ZGdnY/Lkydi0adN
[... base64-encoded PNG plot output omitted ...]\n"
239 | }
240 | ],
241 | "prompt_number": 8
242 | },
243 | {
244 | "cell_type": "code",
245 | "collapsed": false,
246 | "input": [],
247 | "language": "python",
248 | "metadata": {},
249 | "outputs": [],
250 | "prompt_number": 2
251 | }
252 | ],
253 | "metadata": {}
254 | }
255 | ]
256 | }
--------------------------------------------------------------------------------