├── AvellanedaLeeStatArb20090616.pdf ├── Clean_dp_qlbs_oneset_m3_ex3_v4.ipynb ├── FINAL_Bank_failure_rand_forests_m2_ex2.ipynb ├── Final_Bank_failure_m1_ex4_v4.ipynb ├── Final_DJI_tSNE_m2_ex4_corrected.ipynb ├── Final_absorp_ratio_m2_ex5.ipynb ├── Final_discrete_black_scholes_m3_ex1_v3.ipynb ├── Final_linear_regress_m1_ex2_v4.ipynb ├── Final_pca_eigen_portfolios_m2_ex3.ipynb ├── MY_Euclidian_Distance_m1_ex1_v3.ipynb ├── MY_Tobit_regression_m1_ex3_v4.ipynb ├── MY_dp_qlbs_oneset_m3_ex2_v3.ipynb ├── P3_QLBS Q-Learner in the Black-Scholes(-Merton) Worlds.pdf └── README.md /AvellanedaLeeStatArb20090616.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/michaelsyao/Machine-Learning-and-Reinforcement-Learning-in-Finance/49abda7cb6b21405f9697d6cdf85893769e057b6/AvellanedaLeeStatArb20090616.pdf -------------------------------------------------------------------------------- /Final_discrete_black_scholes_m3_ex1_v3.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Discrete-Time Black-Scholes\n", 8 | "Welcome to your 1st assignment in Reinforcement Learning in Finance. This exercise introduces the Black-Scholes model as viewed through the lens of pricing an option with a discrete-time replicating portfolio of stock and bond.\n", 9 | "\n", 10 | "**Instructions:**\n", 11 | "- You will be using Python 3.\n", 12 | "- Avoid using for-loops and while-loops, unless you are explicitly told to do so.\n", 13 | "- Do not modify the (# GRADED FUNCTION [function name]) comment in some cells. Your work will not be graded if you change this. Each cell containing that comment should contain only one function.\n", 14 | "- After coding your function, run the cell right below it to check whether your result is correct.\n", 15 | "\n", 16 | "\n", 17 | "Let's get started!"
18 | ] 19 | }, 20 | { 21 | "cell_type": "markdown", 22 | "metadata": {}, 23 | "source": [ 24 | "## About iPython Notebooks ##\n", 25 | "\n", 26 | "iPython Notebooks are interactive coding environments embedded in a webpage. You will be using iPython notebooks in this class. You only need to write code between the ### START CODE HERE ### and ### END CODE HERE ### comments. After writing your code, you can run the cell by either pressing \"SHIFT\"+\"ENTER\" or by clicking on \"Run Cell\" (denoted by a play symbol) in the upper bar of the notebook. \n", 27 | "\n", 28 | "We will often specify \"(≈ X lines of code)\" in the comments to tell you about how much code you need to write. It is just a rough estimate, so don't feel bad if your code is longer or shorter." 29 | ] 30 | }, 31 | { 32 | "cell_type": "code", 33 | "execution_count": 32, 34 | "metadata": { 35 | "collapsed": true 36 | }, 37 | "outputs": [], 38 | "source": [ 39 | "import numpy as np\n", 40 | "import matplotlib.pyplot as plt\n", 41 | "%matplotlib inline\n", 42 | "\n", 43 | "from numpy.random import standard_normal, seed\n", 44 | "\n", 45 | "import scipy.stats as stats\n", 46 | "from scipy.stats import norm\n", 47 | "\n", 48 | "import sys\n", 49 | "\n", 50 | "sys.path.append(\"..\")\n", 51 | "import grading\n", 52 | "\n", 53 | "import datetime \n", 54 | "import time\n", 55 | "import bspline\n", 56 | "import bspline.splinelab as splinelab" 57 | ] 58 | }, 59 | { 60 | "cell_type": "code", 61 | "execution_count": 33, 62 | "metadata": { 63 | "collapsed": true 64 | }, 65 | "outputs": [], 66 | "source": [ 67 | "### ONLY FOR GRADING. DO NOT EDIT ###\n", 68 | "submissions=dict()\n", 69 | "assignment_key=\"J_L65CoiEeiwfQ53m1Mlug\" \n", 70 | "all_parts=[\"9jLRK\",\"YoMns\",\"Wc3NN\",\"fcl3r\"]\n", 71 | "### ONLY FOR GRADING. 
DO NOT EDIT ###" 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "execution_count": 34, 77 | "metadata": { 78 | "collapsed": true 79 | }, 80 | "outputs": [], 81 | "source": [ 82 | "COURSERA_TOKEN = 'qNi79w1QVsdabsuQ' # the key provided to the student on the submission page\n", 83 | "COURSERA_EMAIL = 'michaelsyao@gmail.com' # the email" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": 35, 89 | "metadata": { 90 | "collapsed": true 91 | }, 92 | "outputs": [], 93 | "source": [ 94 | "# The Black-Scholes prices\n", 95 | "def bs_put(t, S0, K, r, sigma, T):\n", 96 | " d1 = (np.log(S0/K) + (r + 1/2 * sigma**2) * (T-t)) / sigma / np.sqrt(T-t)\n", 97 | " d2 = (np.log(S0/K) + (r - 1/2 * sigma**2) * (T-t)) / sigma / np.sqrt(T-t)\n", 98 | " price = K * np.exp(-r * (T-t)) * norm.cdf(-d2) - S0 * norm.cdf(-d1)\n", 99 | " return price\n", 100 | "\n", 101 | "def bs_call(t, S0, K, r, sigma, T):\n", 102 | " d1 = (np.log(S0/K) + (r + 1/2 * sigma**2) * (T-t)) / sigma / np.sqrt(T-t)\n", 103 | " d2 = (np.log(S0/K) + (r - 1/2 * sigma**2) * (T-t)) / sigma / np.sqrt(T-t)\n", 104 | " price = S0 * norm.cdf(d1) - K * np.exp(-r * (T-t)) * norm.cdf(d2)\n", 105 | " return price\n", 106 | "\n", 107 | "def d1(S0, K, r, sigma, T):\n", 108 | " return (np.log(S0/K) + (r + sigma**2 / 2) * T)/(sigma * np.sqrt(T))\n", 109 | " \n", 110 | "def d2(S0, K, r, sigma, T):\n", 111 | " return (np.log(S0 / K) + (r - sigma**2 / 2) * T) / (sigma * np.sqrt(T))\n", 112 | " " 113 | ] 114 | }, 115 | { 116 | "cell_type": "markdown", 117 | "metadata": {}, 118 | "source": [ 119 | "Simulate $N_{MC}$ stock price sample paths with $T$ steps by the classical Black-Scholes formula.\n", 120 | "\n", 121 | "$$dS_t=\mu S_tdt+\sigma S_tdW_t\quad\quad S_{t+1}=S_te^{\left(\mu-\frac{1}{2}\sigma^2\right)\Delta t+\sigma\sqrt{\Delta t}Z}$$\n", 122 | "\n", 123 | "where $Z$ is a standard normal random variable.\n", 124 | "\n", 125 | "MC paths are simulated by the gen_paths() method of the 
DiscreteBlackScholes class." 126 | ] 127 | }, 128 | { 129 | "cell_type": "markdown", 130 | "metadata": {}, 131 | "source": [ 132 | "### Part 1\n", 133 | "\n", 134 | "\n", 135 | "Class DiscreteBlackScholes implements the above calculations with the following mapping of class variables to math symbols:\n", 136 | "\n", 137 | "$$\Delta S_t=S_{t+1} - e^{r\Delta t} S_t \quad t=T-1,...,0$$\n", 138 | " \n", 139 | "**Instructions:**\n", 140 | "Some portions of code in DiscreteBlackScholes have been taken out. You are to implement the missing portions of code in the DiscreteBlackScholes class.\n", 141 | "\n", 142 | "$$\Pi_t=e^{-r\Delta t}\left[\Pi_{t+1}-u_t \Delta S_t\right]\quad t=T-1,...,0$$\n", 143 | "\n", 144 | "- implement the DiscreteBlackScholes.function_A_vec() method\n", 145 | "$$A_{nm}^{\left(t\right)}=\sum_{k=1}^{N_{MC}}{\Phi_n\left(X_t^k\right)\Phi_m\left(X_t^k\right)\left(\Delta\hat{S}_t^k\right)^2}\quad\quad$$ \n", 146 | "\n", 147 | "- implement the DiscreteBlackScholes.function_B_vec() method\n", 148 | "$$B_n^{\left(t\right)}=\sum_{k=1}^{N_{MC}}{\Phi_n\left(X_t^k\right)\left[\hat\Pi_{t+1}^k\Delta\hat{S}_t^k+\frac{1}{2\gamma\lambda}\Delta S_t^k\right]}$$\n", 149 | "- implement the DiscreteBlackScholes.gen_paths() method using the following relation:\n", 150 | "$$S_{t+1}=S_te^{\left(\mu-\frac{1}{2}\sigma^2\right)\Delta t+\sigma\sqrt{\Delta t}Z}$$\n", 151 | "where $Z \sim N(0,1)$\n", 152 | "- implement parts of DiscreteBlackScholes.roll_backward()\n", 153 | " - DiscreteBlackScholes.bVals corresponds to $B_t$ and is computed as $$B_t = e^{-r\Delta t}\left[B_{t+1} + (u_{t+1} - u_t)S_{t+1}\right]\quad t=T-1,...,0$$\n", 154 | " \n", 155 | "DiscreteBlackScholes.opt_hedge holds the hedge $u_t=\Phi\left(X_t\right)\phi_t$, where the coefficients $\phi_t$ are computed as \n", 156 | " $$\phi_t=\mathbf A_t^{-1}\mathbf B_t$$" 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": 36, 162 | "metadata": { 163 | "collapsed": true 164 | }, 165 | "outputs": [], 166 | "source": 
[ 167 | "class DiscreteBlackScholes:\n", 168 | " \"\"\"\n", 169 | " Class implementing discrete Black-Scholes\n", 170 | " DiscreteBlackScholes is a class for pricing and hedging under\n", 171 | " the real-world measure for a one-dimensional Black-Scholes setting\n", 172 | " \"\"\"\n", 173 | "\n", 174 | " def __init__(self,\n", 175 | " s0,\n", 176 | " strike,\n", 177 | " vol,\n", 178 | " T,\n", 179 | " r,\n", 180 | " mu,\n", 181 | " numSteps,\n", 182 | " numPaths):\n", 183 | " \"\"\"\n", 184 | " :param s0: initial price of the underlying\n", 185 | " :param strike: option strike\n", 186 | " :param vol: volatility\n", 187 | " :param T: time to maturity, in years\n", 188 | " :param r: risk-free rate\n", 189 | " :param mu: real-world drift of the asset\n", 190 | " :param numSteps: number of time steps\n", 191 | " :param numPaths: number of Monte Carlo paths\n", 192 | " \"\"\"\n", 193 | " self.s0 = s0\n", 194 | " self.strike = strike\n", 195 | " self.vol = vol\n", 196 | " self.T = T\n", 197 | " self.r = r\n", 198 | " self.mu = mu\n", 199 | " self.numSteps = numSteps\n", 200 | " self.numPaths = numPaths\n", 201 | "\n", 202 | " self.dt = self.T / self.numSteps # time step\n", 203 | " self.gamma = np.exp(-r * self.dt) # discount factor for one time step, i.e. 
gamma in the QLBS paper\n", 204 | "\n", 205 | " self.sVals = np.zeros((self.numPaths, self.numSteps + 1), 'float') # matrix of stock values\n", 206 | "\n", 207 | " # the disabled branch below would start half of the paths at prices between 0.5*s0 and 1.5*s0\n", 208 | " # and the other half at s0; as written, all paths start at s0\n", 209 | " half_paths = int(numPaths / 2)\n", 210 | "\n", 211 | " if False:\n", 212 | " # Grau (2010) \"Applications of Least-Squares Regressions to Pricing and Hedging of Financial Derivatives\"\n", 213 | " self.sVals[:, 0] = (np.hstack((np.linspace(0.5 * s0, 1.5 * s0, half_paths),\n", 214 | " s0 * np.ones(half_paths, 'float')))).T\n", 215 | "\n", 216 | " self.sVals[:, 0] = s0 * np.ones(numPaths, 'float')\n", 217 | " self.optionVals = np.zeros((self.numPaths, self.numSteps + 1), 'float') # matrix of option values\n", 218 | " self.intrinsicVals = np.zeros((self.numPaths, self.numSteps + 1), 'float')\n", 219 | "\n", 220 | " self.bVals = np.zeros((self.numPaths, self.numSteps + 1), 'float') # matrix of cash position values\n", 221 | " self.opt_hedge = np.zeros((self.numPaths, self.numSteps + 1),\n", 222 | " 'float') # matrix of optimal hedges calculated from cross-sectional information F_t\n", 223 | " self.X = None\n", 224 | " self.data = None # matrix of features, i.e. 
self.X as sum of basis functions\n", 225 | " self.delta_S_hat = None\n", 226 | "\n", 227 | " # coef = 1.0/(2 * gamma * risk_lambda)\n", 228 | " # override it by zero to have a pure risk hedge\n", 229 | " self.coef = 0.\n", 230 | "\n", 231 | " def gen_paths(self):\n", 232 | " \"\"\"\n", 233 | " The simplest possible path generator\n", 234 | " \"\"\"\n", 235 | " np.random.seed(42)\n", 236 | " # Spline basis of order p on knots k\n", 237 | "\n", 238 | " ### START CODE HERE ### (≈ 3-4 lines of code)\n", 239 | " #############################################################Forward PASS\n", 240 | " # self.sVals = your code goes here ...\n", 241 | " z = np.random.normal(0, 1, size=(self.numSteps + 1, self.numPaths)).T\n", 243 | " \n", 244 | " # a for-loop or while-loop is allowed here\n", 245 | " for t in range(0, self.numSteps):\n", 246 | " self.sVals[:, t+1] = self.sVals[:,t]*np.exp((self.mu - 0.5*self.vol**2)*self.dt \\\n", 247 | " +(self.vol * np.sqrt(self.dt)*z[:, t+1]))\n", 248 | " print(self.sVals)\n", 249 | " \n", 252 | " ### END CODE HERE ###\n", 253 | "\n", 254 | " # like in QLBS\n", 255 | " delta_S = self.sVals[:, 1:] - np.exp(self.r * self.dt) * self.sVals[:, :self.numSteps]\n", 256 | " self.delta_S_hat = np.apply_along_axis(lambda x: x - np.mean(x), axis=0, arr=delta_S)\n", 257 | "\n", 258 | " # state variable\n", 259 | " # the deterministic time shift follows the QLBS state-variable convention\n", 260 | " self.X = - (self.mu - 0.5 * self.vol ** 2) * np.arange(self.numSteps + 1) * self.dt + np.log(self.sVals)\n", 261 | "\n", 262 | " X_min = np.min(np.min(self.X))\n", 263 | " X_max = np.max(np.max(self.X))\n", 264 | "\n", 265 | " print('X.shape = ', self.X.shape)\n", 266 | " print('X_min, X_max = ', X_min, X_max)\n", 267 | "\n", 268 | " p = 4 # order of the B-spline basis\n", 269 | " ncolloc = 12\n", 270 | " tau = np.linspace(X_min, X_max, ncolloc) # These are the sites to which we would like to interpolate\n", 
271 | "\n", 272 | " # k is a knot vector that adds endpoint repeats as appropriate for a spline of order p\n", 273 | " # To get meaningful results, one should have ncolloc >= p+1\n", 274 | " k = splinelab.aptknt(tau, p)\n", 275 | " basis = bspline.Bspline(k, p)\n", 276 | "\n", 277 | " num_basis = ncolloc\n", 278 | " self.data = np.zeros((self.numSteps + 1, self.numPaths, num_basis))\n", 279 | "\n", 280 | " print('num_basis = ', num_basis)\n", 281 | " print('dim self.data = ', self.data.shape)\n", 282 | "\n", 283 | " # fill it: expand each state in the finite-dimensional basis\n", 284 | " # (in a neural network, the basis would be the network itself)\n", 285 | " t_0 = time.time()\n", 286 | " for ix in np.arange(self.numSteps + 1):\n", 287 | " x = self.X[:, ix]\n", 288 | " self.data[ix, :, :] = np.array([basis(el) for el in x])\n", 289 | " t_end = time.time()\n", 290 | " print('\nTime Cost of basis expansion:', t_end - t_0, 'seconds')\n", 291 | "\n", 292 | " def function_A_vec(self, t, reg_param=1e-3):\n", 293 | " \"\"\"\n", 294 | " function_A_vec - compute the matrix A_{nm} from Eq. (52) (with regularization)\n", 295 | " Eq. (52) in QLBS Q-Learner in the Black-Scholes-Merton article\n", 296 | "\n", 297 | " Arguments:\n", 298 | " t - time index, a scalar, an index into time axis of data_mat\n", 299 | " reg_param - a scalar, regularization parameter\n", 300 | "\n", 301 | " Return:\n", 302 | " - np.array, i.e. matrix A_{nm} of dimension num_basis x num_basis\n", 303 | " \"\"\"\n", 304 | " X_mat = self.data[t, :, :]\n", 305 | " num_basis_funcs = X_mat.shape[1]\n", 306 | " this_dS = self.delta_S_hat[:, t]\n", 307 | " hat_dS2 = (this_dS ** 2).reshape(-1, 1)\n", 308 | " A_mat = np.dot(X_mat.T, X_mat * hat_dS2) + reg_param * np.eye(num_basis_funcs)\n", 309 | " return A_mat\n", 310 | "\n", 311 | " def function_B_vec(self, t, Pi_hat):\n", 312 | " \"\"\"\n", 313 | " function_B_vec - compute vector B_{n} from Eq. 
(52) of the QLBS Q-Learner in the Black-Scholes-Merton article\n", 314 | "\n", 315 | " Arguments:\n", 316 | " t - time index, a scalar, an index into time axis of delta_S_hat\n", 317 | " Pi_hat - np.array of dimension N_MC of de-meaned portfolio values\n", 318 | " Return:\n", 319 | " B_vec - np.array() of dimension num_basis x 1\n", 320 | " \"\"\"\n", 321 | " tmp = Pi_hat * self.delta_S_hat[:, t] + self.coef * (np.exp((self.mu - self.r) * self.dt)) * self.sVals[:, t]\n", 322 | " X_mat = self.data[t, :, :] # matrix of dimension N_MC x num_basis\n", 323 | "\n", 324 | " B_vec = np.dot(X_mat.T, tmp)\n", 325 | " return B_vec\n", 326 | "\n", 327 | " def seed_intrinsic(self, strike=None, cp='P'):\n", 328 | " \"\"\"\n", 329 | " initialize the option value and intrinsic value for each node\n", 330 | " \"\"\"\n", 331 | " if strike is not None:\n", 332 | " self.strike = strike\n", 333 | "\n", 334 | " if cp == 'P':\n", 335 | " # payoff function at maturity T: max(K - S(T), 0) for all paths\n", 336 | " self.optionVals = np.maximum(self.strike - self.sVals[:, -1], 0).copy()\n", 337 | " # payoff function for all paths, at all time slices\n", 338 | " self.intrinsicVals = np.maximum(self.strike - self.sVals, 0).copy()\n", 339 | " elif cp == 'C':\n", 340 | " # payoff function at maturity T: max(S(T) - K, 0) for all paths\n", 341 | " self.optionVals = np.maximum(self.sVals[:, -1] - self.strike, 0).copy()\n", 342 | " # payoff function for all paths, at all time slices\n", 343 | " self.intrinsicVals = np.maximum(self.sVals - self.strike, 0).copy()\n", 344 | " else:\n", 345 | " raise Exception('Invalid parameter: %s' % cp)\n", 346 | "\n", 347 | " self.bVals[:, -1] = self.intrinsicVals[:, -1]\n", 348 | "\n", 349 | " def roll_backward(self):\n", 350 | " \"\"\"\n", 351 | " Roll the price and optimal hedge back in time starting from maturity\n", 352 | " \"\"\"\n", 353 | "\n", 354 | " for t in range(self.numSteps - 1, -1, -1):\n", 355 | "\n", 356 | " # determine the expected portfolio value at the next 
time node\n", 357 | " piNext = self.bVals[:, t+1] + self.opt_hedge[:, t+1] * self.sVals[:, t+1]\n", 358 | " pi_hat = piNext - np.mean(piNext)\n", 359 | "\n", 360 | " A_mat = self.function_A_vec(t)\n", 361 | " B_vec = self.function_B_vec(t, pi_hat)\n", 362 | " phi = np.dot(np.linalg.inv(A_mat), B_vec)\n", 363 | " self.opt_hedge[:, t] = np.dot(self.data[t, :, :], phi)\n", 364 | "\n", 365 | " ### START CODE HERE ### (≈ 1-2 lines of code)\n", 366 | " #############################################################Backward PASS\n", 367 | " # (4)\n", 368 | " # Bt = exp(-dt*r) [B_t+1 + (u_t+1 - u_t)*S_t+1]\n", 369 | " ##########################################################################\n", 370 | " # implement code to update self.bVals\n", 371 | " # self.bVals[:,t] = your code goes here ....\n", 372 | "\n", 376 | " self.bVals[:,t] = np.exp(-self.r * self.dt) * (self.bVals[:, t+1] + (self.opt_hedge[:, t+1] - self.opt_hedge[:, t]) * self.sVals[:, t+1])\n", 377 | " \n", 378 | " ### END CODE HERE ###\n", 379 | "\n", 380 | " # calculate the initial portfolio value\n", 381 | " initPortfolioVal = self.bVals[:, 0] + self.opt_hedge[:, 0] * self.sVals[:, 0]\n", 382 | "\n", 383 | " # all paths start from S0, so average over all of them\n", 384 | " optionVal = np.mean(initPortfolioVal)\n", 385 | " optionValVar = np.std(initPortfolioVal) # standard deviation of the initial portfolio value\n", 386 | " delta = np.mean(self.opt_hedge[:, 0])\n", 387 | "\n", 388 | " return optionVal, delta, optionValVar" 389 | ] 390 | }, 391 | { 392 | "cell_type": "code", 393 | "execution_count": 37, 394 | "metadata": {}, 395 | "outputs": [ 396 | { 397 | "name": "stdout", 398 | "output_type": "stream", 399 | "text": [ 400 | "[[ 100. 98.23650359 98.6842395 ..., 111.52820537\n", 401 | " 111.93345414 111.50088104]\n", 402 | " [ 100. 
99.47538589 100.18466561 ..., 69.58859259\n", 403 | " 69.36721589 68.46903615]\n", 404 | " [ 100. 99.57310236 100.94511135 ..., 110.66761375\n", 405 | " 110.53260244 110.37282496]\n", 406 | " ..., \n", 407 | " [ 100. 100.19783913 100.59050962 ..., 151.7887043 151.63565543\n", 408 | " 152.14692905]\n", 409 | " [ 100. 100.07733423 101.11151453 ..., 103.08321744\n", 410 | " 101.41095506 101.46651123]\n", 411 | " [ 100. 98.57422289 99.36322314 ..., 91.31429149\n", 412 | " 91.06798685 92.50219743]]\n", 413 | "X.shape = (100, 253)\n", 414 | "X_min, X_max = 4.10743882917 5.16553756345\n", 415 | "num_basis = 12\n", 416 | "dim self.data = (253, 100, 12)\n", 417 | "\n", 418 | "Time Cost of basis expansion: 7.917062282562256 seconds\n" 419 | ] 420 | }, 421 | { 422 | "data": { 423 | "text/plain": [ 424 | "" 425 | ] 426 | }, 427 | "execution_count": 37, 428 | "metadata": {}, 429 | "output_type": "execute_result" 430 | }, 431 | { 432 | "data": { 433 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAWkAAAEcCAYAAAAFlEU8AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAGrtJREFUeJzt3X+U3HV97/HnK5tN3KCycAlqFmK8/ginihIaPbT0B7+u\nsf4iordSwZZrKae9tS2WEyTiVejRIzQ9V+y1p23Aej38kCDEFbytASv01p6CJmxiDJArys8BZCms\ntWSBzeZ9/5jvhMlkZndmdma+n+/s63HOHnbmOzP7/gby2g/v7+fz+SoiMDOzNC3IuwAzM2vMIW1m\nljCHtJlZwhzSZmYJc0ibmSXMIW1mljCHtJlZwhzSZmYJc0ib1ZD0vyV9Ju86zMAhbR0g6UFJk5L+\no+rriznXc1oXP7v2XJ+T9N1u/Lx2SDpL0jZJE5KekXSHpJG867L2OKStU94TES+t+vpo3gV10QHn\nCvx+3gVVSDoHuBQ4DzgMeD1wDfBMjmXZHDikrWskvVbS05KOzx4vkzQu6aTs8YOS1ku6JxvxfVnS\nS6rev0zSTdl7HpD0x1XHjpa0OTv2b5WRu6SrgeXALdko98KZPid7zypJd0v6uaRNwEvoMkkfl3Rj\nzXNfkPSXVcdLWU27JZ3a5Ef/LvC3EbEtyp6KiKsiYk+nz8F6wyFtXRMRPwY+DlwjaQnwZeArEXFH\n1cvOAtYArwXeAHwSQNIC4BZgBzACnAqcL2mNpAHgm8BDwIrs+PXZz/ww8DDZaBf4i0afk/2cRcAo\ncDVwOPA14P2d/9M4yPXAOyW9LKtjAPhN4DpJK4GPAm+NiJdR/vN5sMnPnQQ+Iuk3JR3R+bKt1xzS\n1imjWQ+08vV7ABFxJXA/cBfwKuDimvd9MSIeiYingc8Cv5U9/1ZgaUT8WUS8EBE/Aa4EzgTeBiwD\n1kXEsxHxXEQ06gnP9DkAJwCDwBURMRURNwLfn8sfhKTLJf2zpKslDdZ7TUQ8BNwNvC976hRgT0Tc\nCUwDi4FfkDQYEQ9mv/Ca
UcRzoRzK12R/\nZ+4GTinCuTT75WXhZmYJ64t2h5lZv3JIm5klzCFtZpYwh7SZWcIc0mZmCXNIm5klzCFtlpH0KknX\nS9oq6f9Juj3vmswW5l2AWUKuBq6MiE0Ako7NuR4zL2YxA5A0ADwPHBXl5dFmSXC7wwyIiGnKy9Z3\nSPpbSSdWjqV6JxybHxzSZi/6DeD9wM+Ab0mq3KDg8/mVZPOde9JmmSj3/r4LfDcbPb9Z0nPAMZLW\nRcSGfCu0+cgjaTNA0hpJi7LvjwR+BbiN8paX1zigLS8OabOyDwD3StoBfBP4HxHxr8CbKd/M1CwX\nbneYARHxew0OPQWcK+mpiLi3lzWZgafgmZklze0OM7OEOaTNzBLmkDYzS5hD2swsYQ5pM7OEOaTN\nzBLmkDYzS5hD2swsYQ5pM7OE/X/tRcYsgGPH/wAAAABJRU5ErkJggg==\n", 434 | "text/plain": [ 435 | "" 436 | ] 437 | }, 438 | "metadata": {}, 439 | "output_type": "display_data" 440 | } 441 | ], 442 | "source": [ 443 | "np.random.seed(42)\n", 444 | "strike_k = 95\n", 445 | "test_vol = 0.2\n", 446 | "test_mu = 0.03\n", 447 | "dt = 0.01\n", 448 | "rfr = 0.05\n", 449 | "num_paths = 100\n", 450 | "num_periods = 252\n", 451 | "\n", 452 | "hMC = DiscreteBlackScholes(100, strike_k, test_vol, 1., rfr, test_mu, num_periods, num_paths)\n", 453 | "hMC.gen_paths()\n", 454 | "\n", 455 | "t = hMC.numSteps - 1\n", 456 | "piNext = hMC.bVals[:, t+1] + 0.1 * hMC.sVals[:, t+1]\n", 457 | "pi_hat = piNext - np.mean(piNext)\n", 458 | "\n", 459 | "A_mat = hMC.function_A_vec(t)\n", 460 | "B_vec = hMC.function_B_vec(t, pi_hat)\n", 461 | "phi = np.dot(np.linalg.inv(A_mat), B_vec)\n", 462 | "opt_hedge = np.dot(hMC.data[t, :, :], phi)\n", 463 | "\n", 464 | "# plot the results\n", 465 | "fig = plt.figure(figsize=(12,4))\n", 466 | "ax1 = fig.add_subplot(121)\n", 467 | "\n", 468 | "ax1.scatter(hMC.sVals[:,t], pi_hat)\n", 469 | "ax1.set_title(r'Expected $\\Pi_0$ vs. 
$S_t$')\n", 470 | "ax1.set_xlabel(r'$S_t$')\n", 471 | "ax1.set_ylabel(r'$\\Pi_0$')" 472 | ] 473 | }, 474 | { 475 | "cell_type": "code", 476 | "execution_count": 38, 477 | "metadata": {}, 478 | "outputs": [ 479 | { 480 | "name": "stdout", 481 | "output_type": "stream", 482 | "text": [ 483 | "Submission successful, please check on the coursera grader page for the status\n" 484 | ] 485 | }, 486 | { 487 | "data": { 488 | "text/plain": [ 489 | "array([ 0.81274895, -3.49043554, 0.69994334, 1.61239986, -0.25153316,\n", 490 | " -3.19082265, 0.8848621 , -2.0380868 , 0.45033564, 3.74872863,\n", 491 | " -0.6568227 , 1.74148929, 0.94314331, -4.19716113, 1.72135256,\n", 492 | " -0.66188482, 6.95675041, -2.20512677, -0.14942482, 0.30067272,\n", 493 | " 3.33419402, 0.68536713, 1.65097153, 2.69898611, 1.22528159,\n", 494 | " 1.47188744, -2.48129898, -0.37360224, 0.81064666, -1.05269459,\n", 495 | " 0.02476551, -1.88267258, 0.11748169, -0.9038195 , 0.69753811,\n", 496 | " -0.54805029, 1.97594593, -0.44331403, 0.62134931, -1.86191032,\n", 497 | " -3.21226413, 2.24508097, -2.23451292, -0.13488281, 3.64364848,\n", 498 | " -0.11270281, -1.15582237, -3.30169455, 1.74454841, -1.10425448,\n", 499 | " 2.10192819, 1.80570507, -1.68587001, -1.42113397, -2.70292006,\n", 500 | " 0.79454199, -2.05396827, 3.13973887, -1.08786662, 0.42347686,\n", 501 | " 1.32787012, 0.55924965, -3.54140814, -3.70258632, 2.14853641,\n", 502 | " 1.11495458, 3.69639676, 0.62864736, -2.62282995, -0.05315552,\n", 503 | " 1.05789698, 1.8023196 , -3.35217374, -2.30436466, -2.68609519,\n", 504 | " 0.95284884, -1.35963013, -0.56273408, -0.08311276, 0.79044269,\n", 505 | " 0.46247485, -1.04921463, -2.18122285, 1.82920128, 1.05635272,\n", 506 | " 0.90161346, -1.93870347, -0.37549305, -1.96383274, 1.9772888 ,\n", 507 | " -1.37386984, 0.95230068, 0.88842589, -1.42214528, -2.60256696,\n", 508 | " -1.53509699, 4.47491253, 4.87735375, -0.19068803, -1.08711941])" 509 | ] 510 | }, 511 | "execution_count": 38, 512 | "metadata": {}, 
513 | "output_type": "execute_result" 514 | } 515 | ], 516 | "source": [ 517 | "### GRADED PART (DO NOT EDIT) ###\n", 518 | "\n", 519 | "part_1 = list(pi_hat)\n", 520 | "try:\n", 521 | " part1 = \" \".join(map(repr, part_1))\n", 522 | "except TypeError:\n", 523 | " part1 = repr(part_1)\n", 524 | "submissions[all_parts[0]]=part1\n", 525 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:1],all_parts,submissions)\n", 526 | "pi_hat\n", 527 | "### GRADED PART (DO NOT EDIT) ###" 528 | ] 529 | }, 530 | { 531 | "cell_type": "code", 532 | "execution_count": 39, 533 | "metadata": {}, 534 | "outputs": [ 535 | { 536 | "name": "stdout", 537 | "output_type": "stream", 538 | "text": [ 539 | "[[ 100. 101.44740793 119.84140463 ..., 192.78653975 210.7076386\n", 540 | " 167.37134738]\n", 541 | " [ 100. 98.79378416 81.67103247 ..., 78.75163254\n", 542 | " 104.69106128 114.29766651]\n", 543 | " [ 100. 116.62110943 127.89787986 ..., 85.9631909 79.72061217\n", 544 | " 78.03372489]\n", 545 | " ..., \n", 546 | " [ 100. 106.73222875 103.49782882 ..., 108.30352919\n", 547 | " 96.76512324 114.08668191]\n", 548 | " [ 100. 96.45073828 98.70345177 ..., 89.5899346 75.07626471\n", 549 | " 91.91332688]\n", 550 | " [ 100. 
101.81014094 115.21893111 ..., 68.72837469\n", 551 | " 64.71929858 65.04500528]]\n", 552 | "X.shape = (50000, 7)\n", 553 | "X_min, X_max = 2.96880459823 6.37164911461\n", 554 | "num_basis = 12\n", 555 | "dim self.data = (7, 50000, 12)\n", 556 | "\n", 557 | "Time Cost of basis expansion: 94.9500093460083 seconds\n", 558 | "Option value = 13.1083499076\n", 559 | "Option value variance = 5.17079676287\n", 560 | "Option delta = -0.356133722933\n", 561 | "BS value 13.1458939003\n" 562 | ] 563 | } 564 | ], 565 | "source": [ 566 | "# input parameters\n", 567 | "s0 = 100.0\n", 568 | "strike = 100.0\n", 569 | "r = 0.05\n", 570 | "mu = 0.07 # 0.05\n", 571 | "vol = 0.4\n", 572 | "T = 1.0\n", 573 | "\n", 574 | "# Simulation Parameters\n", 575 | "numPaths = 50000 # number of Monte Carlo trials\n", 576 | "numSteps = 6\n", 577 | "\n", 578 | "# create the class object\n", 579 | "hMC = DiscreteBlackScholes(s0, strike, vol, T, r, mu, numSteps, numPaths)\n", 580 | "\n", 581 | "# calculation\n", 582 | "hMC.gen_paths()\n", 583 | "hMC.seed_intrinsic()\n", 584 | "option_val, delta, option_val_variance = hMC.roll_backward()\n", 585 | "bs_call_value = bs_put(0, s0, K=strike, r=r, sigma=vol, T=T)\n", 586 | "print('Option value = ', option_val)\n", 587 | "print('Option value variance = ', option_val_variance)\n", 588 | "print('Option delta = ', delta) \n", 589 | "print('BS value', bs_call_value)" 590 | ] 591 | }, 592 | { 593 | "cell_type": "code", 594 | "execution_count": 40, 595 | "metadata": {}, 596 | "outputs": [ 597 | { 598 | "name": "stdout", 599 | "output_type": "stream", 600 | "text": [ 601 | "Submission successful, please check on the coursera grader page for the status\n" 602 | ] 603 | }, 604 | { 605 | "data": { 606 | "text/plain": [ 607 | "13.10834990762385" 608 | ] 609 | }, 610 | "execution_count": 40, 611 | "metadata": {}, 612 | "output_type": "execute_result" 613 | } 614 | ], 615 | "source": [ 616 | "### GRADED PART (DO NOT EDIT) ###\n", 617 | "part2 = str(option_val)\n", 618 | 
"submissions[all_parts[1]]=part2\n", 619 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:2],all_parts,submissions)\n", 620 | "option_val\n", 621 | "### GRADED PART (DO NOT EDIT) ###" 622 | ] 623 | }, 624 | { 625 | "cell_type": "code", 626 | "execution_count": 41, 627 | "metadata": {}, 628 | "outputs": [ 629 | { 630 | "name": "stdout", 631 | "output_type": "stream", 632 | "text": [ 633 | "[[ 100. 101.44740793 119.84140463 ..., 192.78653975 210.7076386\n", 634 | " 167.37134738]\n", 635 | " [ 100. 98.79378416 81.67103247 ..., 78.75163254\n", 636 | " 104.69106128 114.29766651]\n", 637 | " [ 100. 116.62110943 127.89787986 ..., 85.9631909 79.72061217\n", 638 | " 78.03372489]\n", 639 | " ..., \n", 640 | " [ 100. 106.73222875 103.49782882 ..., 108.30352919\n", 641 | " 96.76512324 114.08668191]\n", 642 | " [ 100. 96.45073828 98.70345177 ..., 89.5899346 75.07626471\n", 643 | " 91.91332688]\n", 644 | " [ 100. 101.81014094 115.21893111 ..., 68.72837469\n", 645 | " 64.71929858 65.04500528]]\n", 646 | "X.shape = (50000, 7)\n", 647 | "X_min, X_max = 2.96880459823 6.37164911461\n", 648 | "num_basis = 12\n", 649 | "dim self.data = (7, 50000, 12)\n", 650 | "\n", 651 | "Time Cost of basis expansion: 95.37499976158142 seconds\n" 652 | ] 653 | }, 654 | { 655 | "data": { 656 | "text/plain": [ 657 | "array([ 6.70326307, 8.59543726, 10.74614496, 13.1458939 ,\n", 658 | " 15.78197485, 18.63949388])" 659 | ] 660 | }, 661 | "execution_count": 41, 662 | "metadata": {}, 663 | "output_type": "execute_result" 664 | } 665 | ], 666 | "source": [ 667 | "strikes = np.linspace(85, 110, 6)\n", 668 | "results = [None] * len(strikes)\n", 669 | "bs_prices = np.zeros(len(strikes))\n", 670 | "bs_deltas = np.zeros(len(strikes))\n", 671 | "numPaths = 50000\n", 672 | "hMC = DiscreteBlackScholes(s0, strike, vol, T, r, mu, numSteps, numPaths)\n", 673 | "hMC.gen_paths()\n", 674 | "for ix, k_strike in enumerate(strikes):\n", 675 | " hMC.seed_intrinsic(k_strike)\n", 676 | " results[ix] = 
hMC.roll_backward()\n", 677 | " bs_prices[ix] = bs_put(0, s0, K=k_strike, r=r, sigma=vol, T=T)\n", 678 | " bs_deltas[ix] = norm.cdf(d1(s0, K=k_strike, r=r, sigma=vol, T=T)) - 1\n", 679 | "bs_prices" 680 | ] 681 | }, 682 | { 683 | "cell_type": "code", 684 | "execution_count": 43, 685 | "metadata": { 686 | "collapsed": true 687 | }, 688 | "outputs": [], 689 | "source": [ 690 | "mc_prices = np.array([x[0] for x in results])\n", 691 | "mc_deltas = np.array([x[1] for x in results])\n", 692 | "price_variances = np.array([x[-1] for x in results])\n", 693 | "prices_diff = mc_prices - bs_prices\n", 694 | "deltas_diff = mc_deltas - bs_deltas\n", 695 | "# price_variances" 696 | ] 697 | }, 698 | { 699 | "cell_type": "code", 700 | "execution_count": 44, 701 | "metadata": {}, 702 | "outputs": [ 703 | { 704 | "name": "stdout", 705 | "output_type": "stream", 706 | "text": [ 707 | "Submission successful, please check on the coursera grader page for the status\n" 708 | ] 709 | }, 710 | { 711 | "data": { 712 | "text/plain": [ 713 | "array([-0.03641511, -0.04034139, -0.03996597, -0.03754399, -0.03240009,\n", 714 | " -0.02997062])" 715 | ] 716 | }, 717 | "execution_count": 44, 718 | "metadata": {}, 719 | "output_type": "execute_result" 720 | } 721 | ], 722 | "source": [ 723 | "### GRADED PART (DO NOT EDIT) ###\n", 724 | "\n", 725 | "part_3 = list(prices_diff)\n", 726 | "try:\n", 727 | " part3 = \" \".join(map(repr, part_3))\n", 728 | "except TypeError:\n", 729 | " part3 = repr(part_3)\n", 730 | "submissions[all_parts[2]]=part3\n", 731 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:3],all_parts,submissions)\n", 732 | "prices_diff\n", 733 | "### GRADED PART (DO NOT EDIT) ###" 734 | ] 735 | }, 736 | { 737 | "cell_type": "code", 738 | "execution_count": 46, 739 | "metadata": {}, 740 | "outputs": [ 741 | { 742 | "name": "stdout", 743 | "output_type": "stream", 744 | "text": [ 745 | "Submission successful, please check on the coursera grader page for the 
status\n" 746 | ] 747 | }, 748 | { 749 | "data": { 750 | "text/plain": [ 751 | "array([ 0.01279798, 0.01416019, 0.01532701, 0.01645681, 0.01715345,\n", 752 | " 0.01780652])" 753 | ] 754 | }, 755 | "execution_count": 46, 756 | "metadata": {}, 757 | "output_type": "execute_result" 758 | } 759 | ], 760 | "source": [ 761 | "### GRADED PART (DO NOT EDIT) ###\n", 762 | "part_4 = list(deltas_diff)\n", 763 | "try:\n", 764 | " part4 = \" \".join(map(repr, part_4))\n", 765 | "except TypeError:\n", 766 | " part4= repr(part_4)\n", 767 | "submissions[all_parts[3]]=part4\n", 768 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:4],all_parts,submissions)\n", 769 | "deltas_diff\n", 770 | "### GRADED PART (DO NOT EDIT) ###" 771 | ] 772 | } 773 | ], 774 | "metadata": { 775 | "coursera": { 776 | "course_slug": "reinforcement-learning-in-finance" 777 | }, 778 | "kernelspec": { 779 | "display_name": "Python 3", 780 | "language": "python", 781 | "name": "python3" 782 | }, 783 | "language_info": { 784 | "codemirror_mode": { 785 | "name": "ipython", 786 | "version": 3 787 | }, 788 | "file_extension": ".py", 789 | "mimetype": "text/x-python", 790 | "name": "python", 791 | "nbconvert_exporter": "python", 792 | "pygments_lexer": "ipython3", 793 | "version": "3.6.0" 794 | } 795 | }, 796 | "nbformat": 4, 797 | "nbformat_minor": 2 798 | } 799 | -------------------------------------------------------------------------------- /Final_linear_regress_m1_ex2_v4.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Linear Regression\n", 8 | "\n", 9 | "Welcome to your first assignment. This exercise gives you a brief introduction to linear regression. The exercise is to be implemented in Python. Even if you've used Python before, this will help familiarize you with functions we'll need. 
\n", 10 | "\n", 11 | "**Instructions:**\n", 12 | "- You will be using Python 3.\n", 13 | "- Avoid using for-loops and while-loops, unless you are explicitly told to do so.\n", 14 | "- Do not modify the (# GRADED FUNCTION [function name]) comment in some cells. Your work would not be graded if you change this. Each cell containing that comment should only contain one function.\n", 15 | "- After coding your function, run the cell right below it to check if your result is correct.\n", 16 | "\n", 17 | "**After this assignment you will:**\n", 18 | "- Be able to implement linear regression model using statsmodels, scikit-learn, and tensorflow\n", 19 | "- Work with simulated non-linear dataset\n", 20 | "- Compare model performance (quality of fit) of both models\n", 21 | "\n", 22 | "Let's get started!" 23 | ] 24 | }, 25 | { 26 | "cell_type": "markdown", 27 | "metadata": {}, 28 | "source": [ 29 | "## About iPython Notebooks ##\n", 30 | "\n", 31 | "iPython Notebooks are interactive coding environments embedded in a webpage. You will be using iPython notebooks in this class. You only need to write code between the ### START CODE HERE ### and ### END CODE HERE ### comments. After writing your code, you can run the cell by either pressing \"SHIFT\"+\"ENTER\" or by clicking on \"Run Cell\" (denoted by a play symbol) in the upper bar of the notebook. \n", 32 | "\n", 33 | "We will often specify \"(≈ X lines of code)\" in the comments to tell you about how much code you need to write. It is just a rough estimate, so don't feel bad if your code is longer or shorter." 
34 | ] 35 | }, 36 | { 37 | "cell_type": "code", 38 | "execution_count": 1, 39 | "metadata": { 40 | "collapsed": true 41 | }, 42 | "outputs": [], 43 | "source": [ 44 | "import os\n", 45 | "import numpy as np\n", 46 | "\n", 47 | "import sys\n", 48 | "sys.path.append(\"..\")\n", 49 | "import grading\n", 50 | "\n", 51 | "try:\n", 52 | " import matplotlib.pyplot as plt\n", 53 | " %matplotlib inline\n", 54 | "except: pass\n", 55 | "\n", 56 | "import pandas as pd\n", 57 | "\n", 58 | "import tensorflow as tf\n", 59 | "from tensorflow.python.layers import core as core_layers\n", 60 | "try:\n", 61 | " from mpl_toolkits.mplot3d import Axes3D\n", 62 | "except: pass" 63 | ] 64 | }, 65 | { 66 | "cell_type": "code", 67 | "execution_count": 2, 68 | "metadata": { 69 | "collapsed": true 70 | }, 71 | "outputs": [], 72 | "source": [ 73 | "### ONLY FOR GRADING. DO NOT EDIT ###\n", 74 | "submissions=dict()\n", 75 | "assignment_key=\"QNZTAPW2Eeeg_w5MCivhhg\" \n", 76 | "all_parts=[\"dtA5d\", \"2inmf\", \"FCpek\",\"78aDd\",\"qlQVj\"]\n", 77 | "### ONLY FOR GRADING. 
DO NOT EDIT ###" 78 | ] 79 | }, 80 | { 81 | "cell_type": "code", 82 | "execution_count": 7, 83 | "metadata": { 84 | "collapsed": true 85 | }, 86 | "outputs": [], 87 | "source": [ 88 | "COURSERA_TOKEN = \"sfh6Zg5vcIS3LyHC\"# the key provided to the Student under his/her email on submission page\n", 89 | "COURSERA_EMAIL = \"michaelsyao@gmail.com\"# the email" 90 | ] 91 | }, 92 | { 93 | "cell_type": "code", 94 | "execution_count": 3, 95 | "metadata": { 96 | "collapsed": true 97 | }, 98 | "outputs": [], 99 | "source": [ 100 | "def reset_graph(seed=42):\n", 101 | " \"\"\"\n", 102 | " Utility function to reset current tensorflow computation graph\n", 103 | " and set the random seed \n", 104 | " \"\"\"\n", 105 | " # to make results reproducible across runs\n", 106 | " tf.reset_default_graph()\n", 107 | " tf.set_random_seed(seed)\n", 108 | " np.random.seed(seed)" 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "metadata": { 114 | "collapsed": true 115 | }, 116 | "source": [ 117 | "## We use artificial data for the following two specifications of regression:\n", 118 | "\n", 119 | "### Linear Regression\n", 120 | "\n", 121 | "$ y(x) = a + b_1 \cdot X_1 + b_2 \cdot X_2 + b_3 \cdot X_3 + \sigma \cdot \varepsilon $ \n", 122 | "\n", 123 | "where $ \varepsilon \sim N(0, 1) $ is Gaussian noise, and $ \sigma $ is its volatility, \n", 124 | "with the following choice of parameters:\n", 125 | "\n", 126 | "$ a = 1.0 $\n", 127 | "\n", 128 | "$ b_1, b_2, b_3 = (0.5, 0.2, 0.1) $\n", 129 | "\n", 130 | "$ \sigma = 0.1 $\n", 131 | "\n", 132 | "$ X_1, X_2, X_3 $ will be uniformly distributed in $ [-1,1] $\n", 133 | "\n", 134 | "### Non-Linear Regression\n", 135 | "\n", 136 | "$ y(x) = a + w_{00} \cdot X_1 + w_{01} \cdot X_2 + w_{02} \cdot X_3 + w_{10} \cdot X_1^2 \n", 137 | "+ w_{11} \cdot X_2^2 + w_{12} \cdot X_3^2 + \sigma \cdot \varepsilon $ \n", 138 | "\n", 139 | "where\n", 140 | "\n", 141 | "$ w = [[1.0, 0.5, 0.2],[0.5, 0.3, 0.15]] $\n", 142 | 
"\n", 143 | "and the rest of the parameters are as above, with the same values of $ X_i $" 144 | ] 145 | }, 146 | { 147 | "cell_type": "markdown", 148 | "metadata": { 149 | "collapsed": true 150 | }, 151 | "source": [ 152 | "### Generate Data" 153 | ] 154 | }, 155 | { 156 | "cell_type": "code", 157 | "execution_count": 8, 158 | "metadata": {}, 159 | "outputs": [ 160 | { 161 | "data": { 162 | "text/plain": [ 163 | "((7500, 3), (7500, 1))" 164 | ] 165 | }, 166 | "execution_count": 8, 167 | "metadata": {}, 168 | "output_type": "execute_result" 169 | } 170 | ], 171 | "source": [ 172 | "def generate_data(n_points=10000, n_features=3, use_nonlinear=True, \n", 173 | " noise_std=0.1, train_test_split = 4):\n", 174 | " \"\"\"\n", 175 | " Arguments:\n", 176 | " n_points - number of data points to generate\n", 177 | " n_features - a positive integer - number of features\n", 178 | " use_nonlinear - if True, generate non-linear data\n", 179 | " train_test_split - an integer - what portion of data to use for testing\n", 180 | " \n", 181 | " Return:\n", 182 | " X_train, Y_train, X_test, Y_test, n_train, n_features\n", 183 | " \"\"\"\n", 184 | " \n", 185 | " # Linear data or non-linear data?\n", 186 | " if use_nonlinear:\n", 187 | " weights = np.array([[1.0, 0.5, 0.2],[0.5, 0.3, 0.15]])\n", 188 | " else:\n", 189 | " weights = np.array([1.0, 0.5, 0.2])\n", 190 | " \n", 191 | "\n", 192 | " \n", 193 | " bias = np.ones(n_points).reshape((-1,1))\n", 194 | " low = - np.ones((n_points,n_features),'float')\n", 195 | " high = np.ones((n_points,n_features),'float')\n", 196 | " \n", 197 | " np.random.seed(42)\n", 198 | " X = np.random.uniform(low=low, high=high)\n", 199 | " \n", 200 | " np.random.seed(42)\n", 201 | " noise = np.random.normal(size=(n_points, 1))\n", 202 | " # noise_std is taken from the function argument\n", 203 | " \n", 204 | " if use_nonlinear:\n", 205 | " Y = (weights[0,0] * bias + np.dot(X, weights[0, :]).reshape((-1,1)) + \n", 206 | " np.dot(X*X, weights[1, :]).reshape([-1,1]) +\n", 207 | " noise_std * 
noise)\n", 208 | " else:\n", 209 | " Y = (weights[0] * bias + np.dot(X, weights[:]).reshape((-1,1)) + \n", 210 | " noise_std * noise)\n", 211 | " \n", 212 | " n_test = int(n_points/train_test_split)\n", 213 | " n_train = n_points - n_test\n", 214 | " \n", 215 | " X_train = X[:n_train,:]\n", 216 | " Y_train = Y[:n_train].reshape((-1,1))\n", 217 | "\n", 218 | " X_test = X[n_train:,:]\n", 219 | " Y_test = Y[n_train:].reshape((-1,1))\n", 220 | " \n", 221 | " return X_train, Y_train, X_test, Y_test, n_train, n_features\n", 222 | "\n", 223 | "X_train, Y_train, X_test, Y_test, n_train, n_features = generate_data(use_nonlinear=False)\n", 224 | "X_train.shape, Y_train.shape" 225 | ] 226 | }, 227 | { 228 | "cell_type": "markdown", 229 | "metadata": {}, 230 | "source": [ 231 | "### Linear Regression with Numpy" 232 | ] 233 | }, 234 | { 235 | "cell_type": "code", 236 | "execution_count": 9, 237 | "metadata": { 238 | "collapsed": true 239 | }, 240 | "outputs": [], 241 | "source": [ 242 | "# GRADED FUNCTION: numpy_lin_regress\n", 243 | "def numpy_lin_regress(X_train, Y_train):\n", 244 | " \"\"\"\n", 245 | " numpy_lin_regress - Implements linear regression model using numpy module\n", 246 | " Arguments:\n", 247 | " X_train - np.array of size (n by k) where n is number of observations \n", 248 | " of independent variables and k is number of variables\n", 249 | " Y_train - np.array of size (n by 1) where n is the number of observations of dependend variable\n", 250 | " \n", 251 | " Return:\n", 252 | " np.array of size (k+1 by 1) of regression coefficients\n", 253 | " \"\"\"\n", 254 | " ### START CODE HERE ### (≈ 3 lines of code)\n", 255 | " \n", 256 | " # number of features\n", 257 | " ndim = X_train.shape[1]\n", 258 | " \n", 259 | " # add the column of ones\n", 260 | " X_train = np.hstack((np.ones((X_train.shape[0], 1)), X_train))\n", 261 | "\n", 262 | " # default answer, replace this\n", 263 | " theta_numpy = np.linalg.inv(X_train.T.dot(X_train)).dot(X_train.T).dot(Y_train)\n", 
264 | "\n", 265 | " \n", 266 | " # default answer, replace this\n", 267 | " # theta_numpy = np.array([0.] * (ndim + 1)) \n", 268 | " ### END CODE HERE ###\n", 269 | " return theta_numpy" 270 | ] 271 | }, 272 | { 273 | "cell_type": "code", 274 | "execution_count": 10, 275 | "metadata": {}, 276 | "outputs": [ 277 | { 278 | "name": "stdout", 279 | "output_type": "stream", 280 | "text": [ 281 | "Submission successful, please check on the coursera grader page for the status\n" 282 | ] 283 | }, 284 | { 285 | "data": { 286 | "text/plain": [ 287 | "array([ 0.99946227, 0.99579039, 0.499198 , 0.20019798])" 288 | ] 289 | }, 290 | "execution_count": 10, 291 | "metadata": {}, 292 | "output_type": "execute_result" 293 | } 294 | ], 295 | "source": [ 296 | "### GRADED PART (DO NOT EDIT) ###\n", 297 | "theta_numpy = numpy_lin_regress(X_train, Y_train)\n", 298 | "part_1 = list(theta_numpy.squeeze())\n", 299 | "try:\n", 300 | " part1 = \" \".join(map(repr, part_1))\n", 301 | "except TypeError:\n", 302 | " part1 = repr(part_1)\n", 303 | "submissions[all_parts[0]]=part1\n", 304 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:1],all_parts,submissions)\n", 305 | "theta_numpy.squeeze()\n", 306 | "### GRADED PART (DO NOT EDIT) ###" 307 | ] 308 | }, 309 | { 310 | "cell_type": "markdown", 311 | "metadata": {}, 312 | "source": [ 313 | "### Linear Regression with Sklearn" 314 | ] 315 | }, 316 | { 317 | "cell_type": "code", 318 | "execution_count": 11, 319 | "metadata": { 320 | "collapsed": true 321 | }, 322 | "outputs": [], 323 | "source": [ 324 | "# GRADED FUNCTION: sklearn_lin_regress\n", 325 | "def sklearn_lin_regress(X_train, Y_train):\n", 326 | " \"\"\"\n", 327 | " Arguments:\n", 328 | " X_train - np.array of size (n by k) where n is number of observations \n", 329 | " of independent variables and k is number of variables\n", 330 | " Y_train - np.array of size (n by 1) where n is the number of observations of dependend variable\n", 331 | " \n", 332 | " 
Return:\n", 333 | " np.array of size (k+1 by 1) of regression coefficients\n", 334 | " \"\"\" \n", 335 | " from sklearn.linear_model import LinearRegression\n", 336 | " lin_reg = LinearRegression()\n", 337 | " ### START CODE HERE ### (≈ 3 lines of code)\n", 338 | " X_train = np.hstack((np.ones((X_train.shape[0], 1)), X_train))\n", 339 | " \n", 340 | " # use lin_reg to fit training data\n", 341 | " regSKL = LinearRegression(fit_intercept=False)\n", 342 | " regSKL.fit(X_train, Y_train)\n", 343 | " theta_sklearn = regSKL.coef_\n", 344 | "\n", 345 | " ### END CODE HERE ###\n", 346 | " return theta_sklearn" 347 | ] 348 | }, 349 | { 350 | "cell_type": "code", 351 | "execution_count": 12, 352 | "metadata": {}, 353 | "outputs": [ 354 | { 355 | "name": "stdout", 356 | "output_type": "stream", 357 | "text": [ 358 | "Submission successful, please check on the coursera grader page for the status\n" 359 | ] 360 | }, 361 | { 362 | "data": { 363 | "text/plain": [ 364 | "array([ 0.99946227, 0.99579039, 0.499198 , 0.20019798])" 365 | ] 366 | }, 367 | "execution_count": 12, 368 | "metadata": {}, 369 | "output_type": "execute_result" 370 | } 371 | ], 372 | "source": [ 373 | "### GRADED PART (DO NOT EDIT) ###\n", 374 | "theta_sklearn = sklearn_lin_regress(X_train, Y_train)\n", 375 | "part_2 = list(theta_sklearn.squeeze())\n", 376 | "try:\n", 377 | " part2 = \" \".join(map(repr, part_2))\n", 378 | "except TypeError:\n", 379 | " part2 = repr(part_2)\n", 380 | "submissions[all_parts[1]]=part2\n", 381 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:2],all_parts,submissions)\n", 382 | "theta_sklearn.squeeze()\n", 383 | "### GRADED PART (DO NOT EDIT) ###" 384 | ] 385 | }, 386 | { 387 | "cell_type": "markdown", 388 | "metadata": {}, 389 | "source": [ 390 | "### Linear Regression with Tensorflow" 391 | ] 392 | }, 393 | { 394 | "cell_type": "code", 395 | "execution_count": 15, 396 | "metadata": {}, 397 | "outputs": [ 398 | { 399 | "data": { 400 | "text/plain": [ 401 
| "array([[ 1. , -0.25091976, 0.90142861, 0.46398788],\n", 402 | " [ 1. , 0.19731697, -0.68796272, -0.68801096],\n", 403 | " [ 1. , -0.88383278, 0.73235229, 0.20223002],\n", 404 | " ..., \n", 405 | " [ 1. , -0.41969273, 0.7395476 , 0.49515624],\n", 406 | " [ 1. , -0.15112479, 0.42105835, 0.73537775],\n", 407 | " [ 1. , 0.95130426, 0.90470301, -0.34934435]])" 408 | ] 409 | }, 410 | "execution_count": 15, 411 | "metadata": {}, 412 | "output_type": "execute_result" 413 | } 414 | ], 415 | "source": [ 416 | "np.hstack((np.ones((X_train.shape[0], 1)), X_train))" 417 | ] 418 | }, 419 | { 420 | "cell_type": "code", 421 | "execution_count": 23, 422 | "metadata": { 423 | "collapsed": true 424 | }, 425 | "outputs": [], 426 | "source": [ 427 | "# GRADED FUNCTION: tf_lin_regress\n", 428 | "def tf_lin_regress(X_train, Y_train):\n", 429 | " \"\"\"\n", 430 | " Arguments:\n", 431 | " X_train - np.array of size (n by k) where n is number of observations \n", 432 | " of independent variables and k is number of variables\n", 433 | " Y_train - np.array of size (n by 1) where n is the number of observations of dependent variable\n", 434 | " \n", 435 | " Return:\n", 436 | " np.array of size (k+1 by 1) of regression coefficients\n", 437 | " \"\"\"\n", 438 | " ### START CODE HERE ### (≈ 7-8 lines of code)\n", 439 | " ndim = X_train.shape[1] \n", 440 | " # add the column of ones (assign the result, or the intercept column is lost)\n", 441 | " X_train = np.hstack((np.ones((X_train.shape[0], 1)), X_train))\n", 442 | " \n", 443 | " # tf setup\n", 444 | " X = tf.constant(X_train, dtype=tf.float32, name=\"X\")\n", 445 | " Y = tf.constant(Y_train, dtype=tf.float32, name=\"Y\")\n", 446 | " XT = tf.transpose(X)\n", 447 | " \n", 448 | " # define theta for later evaluation\n", 449 | " ## theta_numpy = np.linalg.inv(X_train.T.dot(X_train)).dot(X_train.T).dot(Y_train)\n", 450 | " theta = tf.matmul(tf.matmul(tf.matrix_inverse(tf.matmul(XT, X)), XT), Y)\n", 451 | " \n", 452 | " ### END CODE HERE ###\n", 453 | " with tf.Session() as sess:\n", 454 | " theta_value = 
theta.eval()\n", 455 | " return theta_value" 456 | ] 457 | }, 458 | { 459 | "cell_type": "code", 460 | "execution_count": 24, 461 | "metadata": {}, 462 | "outputs": [ 463 | { 464 | "name": "stdout", 465 | "output_type": "stream", 466 | "text": [ 467 | "Submission successful, please check on the coursera grader page for the status\n" 468 | ] 469 | }, 470 | { 471 | "data": { 472 | "text/plain": [ 473 | "array([ 1.02695167, 0.50324714, 0.18036205], dtype=float32)" 474 | ] 475 | }, 476 | "execution_count": 24, 477 | "metadata": {}, 478 | "output_type": "execute_result" 479 | } 480 | ], 481 | "source": [ 482 | "### GRADED PART (DO NOT EDIT) ###\n", 483 | "theta_tf = tf_lin_regress(X_train, Y_train)\n", 484 | "part_3 = list(theta_tf.squeeze())\n", 485 | "try:\n", 486 | " part3 = \" \".join(map(repr, part_3))\n", 487 | "except TypeError:\n", 488 | " part3 = repr(part_3)\n", 489 | "submissions[all_parts[2]]=part3\n", 490 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:3],all_parts,submissions)\n", 491 | "theta_tf.squeeze()\n", 492 | "### GRADED PART (DO NOT EDIT) ###" 493 | ] 494 | }, 495 | { 496 | "cell_type": "code", 497 | "execution_count": 19, 498 | "metadata": { 499 | "collapsed": true 500 | }, 501 | "outputs": [], 502 | "source": [ 503 | "class LinRegressNormalEq:\n", 504 | " \"\"\"\n", 505 | " class LinRegressNormalEq - implements normal equation, maximum likelihood estimator (MLE) solution\n", 506 | " \"\"\"\n", 507 | " def __init__(self, n_features, learning_rate=0.05, L=0):\n", 508 | " import math as m\n", 509 | " # input placeholders\n", 510 | " self.X = tf.placeholder(tf.float32, [None, n_features], name=\"X\") \n", 511 | " self.Y = tf.placeholder(tf.float32, [None, 1], name=\"Y\")\n", 512 | " \n", 513 | " # regression parameters for the analytical solution using the Normal equation\n", 514 | " self.theta_in = tf.placeholder(tf.float32, [n_features+1,None])\n", 515 | "\n", 516 | " # Augmented data matrix is obtained by adding a 
column of ones to the data matrix\n", 517 | " data_plus_bias = tf.concat([tf.ones([tf.shape(self.X)[0], 1]), self.X], axis=1)\n", 518 | " \n", 519 | " XT = tf.transpose(data_plus_bias)\n", 520 | " \n", 521 | " #############################################\n", 522 | " # The normal equation for Linear Regression\n", 523 | " \n", 524 | " self.theta = tf.matmul(tf.matmul(\n", 525 | " tf.matrix_inverse(tf.matmul(XT, data_plus_bias)), XT), self.Y)\n", 526 | " \n", 527 | " # mean square error in terms of theta = theta_in\n", 528 | " self.lr_mse = tf.reduce_mean(tf.square(\n", 529 | " tf.matmul(data_plus_bias, self.theta_in) - self.Y))\n", 530 | " \n", 531 | " #############################################\n", 532 | " # Estimate the model using the Maximum Likelihood Estimation (MLE)\n", 533 | " \n", 534 | " # regression parameters for the Maximum Likelihood method\n", 535 | " # Note that there are n_features+2 parameters, as one is added for the intercept, \n", 536 | " # and another one for the std of noise \n", 537 | " self.weights = tf.Variable(tf.random_normal([n_features+2, 1]))\n", 538 | " \n", 539 | " # prediction from the model\n", 540 | " self.output = tf.matmul(data_plus_bias, self.weights[:-1, :])\n", 541 | "\n", 542 | " gauss = tf.distributions.Normal(loc=0.0, scale=1.0)\n", 543 | "\n", 544 | " # Standard deviation of the Gaussian noise is modelled as a square of the \n", 545 | " # last model weight\n", 546 | " sigma = 0.0001 + tf.square(self.weights[-1]) \n", 547 | " \n", 548 | " # though a constant sqrt(2*pi) is not needed to find the best parameters, here we keep it\n", 549 | " # to get the value of the log-LL right \n", 550 | " pi = tf.constant(m.pi)\n", 551 | " \n", 552 | " log_LL = tf.log(0.00001 + (1/( tf.sqrt(2*pi)*sigma)) * gauss.prob((self.Y - self.output) / sigma )) \n", 553 | " self.loss = - tf.reduce_mean(log_LL)\n", 554 | " \n", 555 | " self.train_step = (tf.train.AdamOptimizer(learning_rate).minimize(self.loss), -self.loss)" 556 | ] 557 | }, 558 | 
{ 559 | "cell_type": "code", 560 | "execution_count": 30, 561 | "metadata": {}, 562 | "outputs": [], 563 | "source": [ 564 | "# GRADED FUNCTION: run_normal_eq\n", 565 | "def run_normal_eq(X_train, Y_train, X_test, Y_test, learning_rate=0.05):\n", 566 | " \"\"\"\n", 567 | " Implements normal equation using tensorflow, trains the model using training data set\n", 568 | " Tests the model quality by computing mean square error (MSE) of the test data set\n", 569 | " \n", 570 | " Arguments:\n", 571 | " X_train - np.array of size (n by k) where n is number of observations \n", 572 | " of independent variables and k is number of variables\n", 573 | " Y_train - np.array of size (n by 1) where n is the number of observations of dependend variable\n", 574 | " \n", 575 | " X_test - np.array of size (n by k) where n is number of observations \n", 576 | " of independent variables and k is number of variables\n", 577 | " Y_test - np.array of size (n by 1) where n is the number of observations of dependend variable\n", 578 | " \n", 579 | " \n", 580 | " Return a tuple of:\n", 581 | " - np.array of size (k+1 by 1) of regression coefficients\n", 582 | " - mean square error (MSE) of the test data set\n", 583 | " - mean square error (MSE) of the training data set\n", 584 | " \"\"\"\n", 585 | " # create an instance of the Linear Regression model class \n", 586 | " n_features = X_train.shape[1]\n", 587 | " model = LinRegressNormalEq(n_features=n_features, learning_rate=learning_rate)\n", 588 | "\n", 589 | " ### START CODE HERE ### (≈ 10-15 lines of code)\n", 590 | " # tf setup\n", 591 | " X = tf.placeholder(tf.float32, [None, n_features], name=\"X\")\n", 592 | " Y = tf.placeholder(tf.float32, [None, 1], name=\"Y\")\n", 593 | " theta_in = tf.placeholder(tf.float32, [n_features + 1, None])\n", 594 | " data_plus_bias = tf.concat([tf.ones([tf.shape(X)[0], 1]), X], axis=1)\n", 595 | " \n", 596 | " # train the model\n", 597 | " XT = tf.transpose(data_plus_bias)\n", 598 | " theta = 
tf.matmul(tf.matmul(tf.matrix_inverse(tf.matmul(XT, data_plus_bias)), XT), Y)\n", 599 | " \n", 600 | " lr_mse = tf.reduce_mean(tf.square(tf.matmul(data_plus_bias, theta_in) - Y))\n", 601 | " \n", 602 | " # Normal equation for Linear Regression\n", 603 | " with tf.Session() as temp:\n", 604 | " temp.run(tf.global_variables_initializer())\n", 605 | " \n", 606 | " theta_value = temp.run(theta, feed_dict={X: X_train, Y: Y_train})\n", 607 | " lr_mse_train = temp.run(lr_mse, feed_dict={X: X_train, Y: Y_train, theta_in: theta_value})\n", 608 | " lr_mse_test = temp.run(lr_mse, feed_dict={X: X_test, Y: Y_test, theta_in: theta_value})\n", 609 | " \n", 610 | " ### END CODE HERE ###\n", 611 | " return theta_value, lr_mse_train, lr_mse_test\n", 612 | "\n", 613 | "### (DO NOT EDIT) ###\n", 614 | "theta_value, lr_mse_train, lr_mse_test = run_normal_eq(X_train, Y_train, X_test, Y_test)\n", 615 | "### (DO NOT EDIT) ###" 616 | ] 617 | }, 618 | { 619 | "cell_type": "code", 620 | "execution_count": 31, 621 | "metadata": {}, 622 | "outputs": [ 623 | { 624 | "name": "stdout", 625 | "output_type": "stream", 626 | "text": [ 627 | "Submission successful, please check on the coursera grader page for the status\n" 628 | ] 629 | }, 630 | { 631 | "data": { 632 | "text/plain": [ 633 | "array([ 0.99946201, 0.99579054, 0.49919799, 0.20019798], dtype=float32)" 634 | ] 635 | }, 636 | "execution_count": 31, 637 | "metadata": {}, 638 | "output_type": "execute_result" 639 | } 640 | ], 641 | "source": [ 642 | "### GRADED PART (DO NOT EDIT) ###\n", 643 | "part_4 = list(theta_value.squeeze())\n", 644 | "try:\n", 645 | " part4 = \" \".join(map(repr, part_4))\n", 646 | "except TypeError:\n", 647 | " part4 = repr(part_4)\n", 648 | "submissions[all_parts[3]]=part4\n", 649 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:4],all_parts,submissions)\n", 650 | "theta_value.squeeze()\n", 651 | "### GRADED PART (DO NOT EDIT) ###" 652 | ] 653 | }, 654 | { 655 | "cell_type": "code", 656 | 
"execution_count": 32, 657 | "metadata": { 658 | "collapsed": true 659 | }, 660 | "outputs": [], 661 | "source": [ 662 | "# GRADED FUNCTION: run_mle\n", 663 | "def run_mle(X_train, Y_train, X_test, Y_test, learning_rate=0.05, num_iter=5000):\n", 664 | " \"\"\"\n", 665 | " Maximum Likelihood Estimate (MLE)\n", 666 | " Trains the model parameters by maximizing the log-likelihood on the training data set\n", 667 | " \n", 668 | " Arguments:\n", 669 | " X_train - np.array of size (n by k) where n is number of observations \n", 670 | " of independent variables and k is number of variables\n", 671 | " Y_train - np.array of size (n by 1) where n is the number of observations of dependent variable\n", 672 | " \n", 673 | " X_test - np.array of size (n by k) where n is number of observations \n", 674 | " of independent variables and k is number of variables\n", 675 | " Y_test - np.array of size (n by 1) where n is the number of observations of dependent variable\n", 676 | " \n", 677 | " \n", 678 | " Return a tuple of:\n", 679 | " - np.array of size (k+1 by 1) of regression coefficients\n", 680 | " - the final value of the negative log-likelihood loss on the training set\n", 681 | " - the estimated standard deviation of the model noise\n", 682 | " \"\"\"\n", 683 | " # create an instance of the Linear Regression model class \n", 684 | " n_features = X_train.shape[1]\n", 685 | " model = LinRegressNormalEq(n_features=n_features, learning_rate=learning_rate)\n", 686 | " \n", 687 | " # train the model\n", 688 | " with tf.Session() as sess:\n", 689 | " sess.run(tf.global_variables_initializer())\n", 690 | "\n", 691 | " # Now train the MLE parameters \n", 692 | " for _ in range(num_iter):\n", 693 | " (_ , loss), weights = sess.run((model.train_step, model.weights), feed_dict={\n", 694 | " model.X: X_train,\n", 695 | " model.Y: Y_train\n", 696 | " })\n", 697 | "\n", 698 | " # make test_prediction\n", 699 | " Y_test_predicted = sess.run(model.output, feed_dict={model.X: X_test})\n", 700 | "\n", 701 | "# 
output std sigma is the square of the last weight\n", 702 | " std_model = weights[-1]**2 \n", 703 | " sess.close()\n", 704 | " return weights[0:-1].squeeze(), loss, std_model\n", 705 | "\n", 706 | "weights, loss, std_model = run_mle(X_train, Y_train, X_test, Y_test)" 707 | ] 708 | }, 709 | { 710 | "cell_type": "code", 711 | "execution_count": 33, 712 | "metadata": {}, 713 | "outputs": [ 714 | { 715 | "name": "stdout", 716 | "output_type": "stream", 717 | "text": [ 718 | "Submission successful, please check on the coursera grader page for the status\n" 719 | ] 720 | }, 721 | { 722 | "data": { 723 | "text/plain": [ 724 | "array([ 0.99938166, 0.99561447, 0.49928492, 0.20008941], dtype=float32)" 725 | ] 726 | }, 727 | "execution_count": 33, 728 | "metadata": {}, 729 | "output_type": "execute_result" 730 | } 731 | ], 732 | "source": [ 733 | "### GRADED PART (DO NOT EDIT) ###\n", 734 | "part_5 = list(weights.squeeze())\n", 735 | "try:\n", 736 | " part5 = \" \".join(map(repr, part_5))\n", 737 | "except TypeError:\n", 738 | " part5 = repr(part_5)\n", 739 | "submissions[all_parts[4]]=part5\n", 740 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:5],all_parts,submissions)\n", 741 | "weights.squeeze()\n", 742 | "### GRADED PART (DO NOT EDIT) ###" 743 | ] 744 | } 745 | ], 746 | "metadata": { 747 | "anaconda-cloud": {}, 748 | "celltoolbar": "Edit Metadata", 749 | "coursera": { 750 | "course_slug": "guided-tour-machine-learning-finance" 751 | }, 752 | "kernelspec": { 753 | "display_name": "Python 3", 754 | "language": "python", 755 | "name": "python3" 756 | }, 757 | "language_info": { 758 | "codemirror_mode": { 759 | "name": "ipython", 760 | "version": 3 761 | }, 762 | "file_extension": ".py", 763 | "mimetype": "text/x-python", 764 | "name": "python", 765 | "nbconvert_exporter": "python", 766 | "pygments_lexer": "ipython3", 767 | "version": "3.6.0" 768 | } 769 | }, 770 | "nbformat": 4, 771 | "nbformat_minor": 1 772 | } 773 | 
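The `run_normal_eq` cell in the notebook above solves the least-squares problem in closed form with TensorFlow ops. The same estimator can be sketched in a few lines of plain NumPy (the synthetic data and coefficient values below are assumptions for illustration, not the course's dataset):

```python
import numpy as np

# Synthetic regression data; the "true" coefficients are made up for illustration
rng = np.random.RandomState(42)
n_obs = 1000
X = rng.uniform(-1.0, 1.0, size=(n_obs, 3))
Y = 0.2 + X @ np.array([1.0, 1.0, 0.5]) + 0.1 * rng.standard_normal(n_obs)

# Prepend a bias column, then apply the normal equation: theta = (X'X)^{-1} X'Y
X_b = np.hstack([np.ones((n_obs, 1)), X])
theta = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ Y

# Training MSE of the fitted model; should be close to the noise variance
mse_train = np.mean((X_b @ theta - Y) ** 2)
```

In practice `np.linalg.solve(X_b.T @ X_b, X_b.T @ Y)` is preferred over forming the explicit inverse, for numerical stability; the explicit inverse is kept here only to mirror the `tf.matrix_inverse` call in the notebook.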
-------------------------------------------------------------------------------- /MY_Euclidian_Distance_m1_ex1_v3.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "collapsed": true 7 | }, 8 | "source": [ 9 | "# Euclidean Distance \n", 10 | "\n", 11 | "Welcome to your first assignment. By working through this exercise you will learn how to\n", 12 | "\n", 13 | "- generate uniformly distributed random points in an $n$-dimensional space\n", 14 | "- compute Euclidean distances between pairs of points, both in a loop and with vectorized operations\n", 15 | "- study how the distribution of these distances changes with the dimension $n$\n", 16 | "\n", 17 | "**Instructions:**\n", 18 | "- You will be using Python 3.\n", 19 | "- Avoid using for-loops and while-loops, unless you are explicitly told to do so.\n", 20 | "- Do not modify the (# GRADED FUNCTION [function name]) comment in some cells. Your work will not be graded if you change this. Each cell containing that comment should only contain one function.\n", 21 | "- After coding your function, run the cell right below it to check if your result is correct.\n", 22 | "\n", 23 | "**After this assignment you will:**\n", 24 | "- know how to vectorize distance computations with NumPy\n", 25 | "- understand how Euclidean distances behave in high-dimensional spaces\n", 26 | "\n", 27 | "Let's get started!" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "metadata": { 33 | "collapsed": true 34 | }, 35 | "source": [ 36 | "## Dataset\n", 37 | "Suppose we have an $n$-dimensional space $\mathbb{R}^{n}$. We want to generate $1000000$ pairs of points whose coordinates are uniformly distributed random\n", 38 | "numbers, $X\sim\mathscr{U}\left(0,\:1\right)$. \n", 39 | "\n", 40 | "For instance, if $n=1$, we generate $p_{1}=\left(x_{1},\:y_{1}\right)$, $p_{2}=\left(x_{2},\:y_{2}\right)$, $\cdots$, $p_{1000000}=\left(x_{1000000},\:y_{1000000}\right)$, where $x_{1}$, $x_{2}$, $\cdots$, $x_{1000000}$ are uniformly distributed, $y_{1}$, $y_{2}$, $\cdots$, $y_{1000000}$ are uniformly distributed too. 
\n", 41 | "\n", 42 | "If $n=2$, we generate pairs $\mathbf{p}_{i}=\left(\mathbf{x}_{i},\:\mathbf{y}_{i}\right)$ for $i=1,\:2,\:\cdots,\:1000000$, where $\mathbf{x}_{i}=\left(x_{i}^{\left(1\right)},\:x_{i}^{\left(2\right)}\right)$ and $\mathbf{y}_{i}=\left(y_{i}^{\left(1\right)},\:y_{i}^{\left(2\right)}\right)$, and each coordinate sequence $x_{1}^{\left(j\right)},\:x_{2}^{\left(j\right)},\:\cdots,\:x_{1000000}^{\left(j\right)}$ and $y_{1}^{\left(j\right)},\:y_{2}^{\left(j\right)},\:\cdots,\:y_{1000000}^{\left(j\right)}$, $j=1,\:2$, is uniformly distributed too. 
" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": 4, 48 | "metadata": { 49 | "collapsed": true 50 | }, 51 | "outputs": [], 52 | "source": [ 53 | "# imports \n", 54 | "import numpy as np\n", 55 | "# import matplotlib.pyplot as plt \n", 56 | "# %matplotlib inline\n", 57 | "\n", 58 | "from sklearn.metrics.pairwise import euclidean_distances\n", 59 | "\n", 60 | "import sys\n", 61 | "sys.path.append(\"..\")\n", 62 | "import grading\n", 63 | "\n", 64 | "import timeit\n", 65 | "import matplotlib.mlab\n", 66 | "import scipy.stats\n", 67 | "from scipy.stats import norm" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": 4, 73 | "metadata": { 74 | "collapsed": true 75 | }, 76 | "outputs": [], 77 | "source": [ 78 | "### ONLY FOR GRADING. DO NOT EDIT ###\n", 79 | "submissions=dict()\n", 80 | "assignment_key=\"2RRok_GPEeeQZgq5AVms2g\" \n", 81 | "all_parts=[\"pmqxU\", \"VrXL6\", \"XsLp1\",\"jD7SY\",\"Ad4J0\",\"1nPFm\"]\n", 82 | "### ONLY FOR GRADING. DO NOT EDIT ###" 83 | ] 84 | }, 85 | { 86 | "cell_type": "code", 87 | "execution_count": 5, 88 | "metadata": { 89 | "collapsed": true 90 | }, 91 | "outputs": [], 92 | "source": [ 93 | "COURSERA_TOKEN = \" \"# the key provided to the Student under his/her email on submission page\n", 94 | "COURSERA_EMAIL = \" \"# the email" 95 | ] 96 | }, 97 | { 98 | "cell_type": "code", 99 | "execution_count": 6, 100 | "metadata": { 101 | "collapsed": true 102 | }, 103 | "outputs": [], 104 | "source": [ 105 | "def euclidean_distances_stats(euclidean_distances_vector):\n", 106 | " \"\"\"\n", 107 | " Calculate Euclidean distances statistics\n", 108 | " \n", 109 | " Arguments:\n", 110 | " euclidean_distances_vector - 1-D vector of Euclidean distances\n", 111 | " \n", 112 | " Return:\n", 113 | " np.array() of length 4\n", 114 | " the first element of array is the mean\n", 115 | " the second element is variance\n", 116 | " the third element is skew of the distribution\n", 117 | " the fourth element is kurtosis 
of the distribution\n", 118 | " \"\"\"\n", 119 | " if len(euclidean_distances_vector) > 0:\n", 120 | " this_mean = np.mean( euclidean_distances_vector )\n", 121 | " this_variance = np.var( euclidean_distances_vector )\n", 122 | " this_skewness = scipy.stats.skew( euclidean_distances_vector ) \n", 123 | " this_kurtosis = scipy.stats.kurtosis( euclidean_distances_vector )\n", 124 | " result = np.array([this_mean, this_variance, this_skewness, this_kurtosis])\n", 125 | " else:\n", 126 | " result = np.array([0.] * 4)\n", 127 | " return result\n", 128 | "\n", 129 | "\n", 130 | "def print_stats(euclidean_stats):\n", 131 | " \"\"\"\n", 132 | " Print Euclidean distances statistics\n", 133 | " \n", 134 | " Arguments: \n", 135 | " euclidean_stats - np.array() of length 4\n", 136 | " the first element of array is the mean\n", 137 | " the second element is variance\n", 138 | " the third element is skew of the distribution\n", 139 | " the fourth element is kurtosis of the distribution\n", 140 | " \"\"\"\n", 141 | " this_mean = euclidean_stats[0]\n", 142 | " this_variance = euclidean_stats[1]\n", 143 | " this_skewness = euclidean_stats[2]\n", 144 | " this_kurtosis = euclidean_stats[3]\n", 145 | " print( 'Expectation of Euclidean distances: ', this_mean, '\\n' )\n", 146 | " print( 'Variance of Euclidean distances: ', this_variance, '\\n' )\n", 147 | " print( 'Skewness of Euclidean distances: ', this_skewness, '\\n' )\n", 148 | " print( 'Kurtosis of Euclidean distances: ',this_kurtosis, '\\n' )\n", 149 | "\n", 150 | "\n", 151 | "def plot_distribution(euclidean_distances_vector, euclidean_stats, dim_space, bins_number=30):\n", 152 | " \"\"\"\n", 153 | " Plot histogram of Euclidean distances against normal distribution PDF\n", 154 | " \n", 155 | " Arguments: \n", 156 | " \n", 157 | " euclidean_distances_vector - 1-D vector of Euclidean distances\n", 158 | " \n", 159 | " euclidean_stats - np.array() of length 4\n", 160 | " the first element of array is the mean\n", 161 | " the second 
element is variance\n", 162 | " the third element is skew of the distribution\n", 163 | " the fourth element is kurtosis of the distribution\n", 164 | " \n", 165 | " dim_space - dimension of the space\n", 166 | " bins_number - number of bins in the histogram\n", 167 | " \"\"\"\n", 168 | " # verbose, but this is for clarity\n", 169 | " this_mean = euclidean_stats[0]\n", 170 | " this_variance = euclidean_stats[1]\n", 171 | " this_skewness = euclidean_stats[2]\n", 172 | " this_kurtosis = euclidean_stats[3]\n", 173 | " \n", 174 | " sample_size = len(euclidean_distances_vector)\n", 175 | " try:\n", 176 | " fig_l, ax_l = plt.subplots()\n", 177 | " n_bins_l, bins_l, patches_l = ax_l.hist( euclidean_distances_vector, bins_number, normed=1 ) \n", 178 | " y_l = matplotlib.mlab.normpdf( bins_l, this_mean, np.sqrt( this_variance ) )\n", 179 | " ax_l.plot( bins_l, y_l, 'r--' )\n", 180 | " plt.title( 'Histogram for dimension = %d and sample size = %d \\n $\mu$ = %.3f, $\sigma^2$ = %.3f, Skewness = %.3f, Kurtosis = %.3f' \\\n", 181 | " % (dim_space, sample_size, this_mean, this_variance, this_skewness, this_kurtosis ) )\n", 182 | " fig_l.tight_layout()\n", 183 | " plt.grid( True, which='both')\n", 184 | " plt.minorticks_on()\n", 185 | " return fig_l\n", 186 | " except:\n", 187 | " return None" 188 | ] 189 | }, 190 | { 191 | "cell_type": "code", 192 | "execution_count": 7, 193 | "metadata": { 194 | "scrolled": true 195 | }, 196 | "outputs": [ 197 | { 198 | "name": "stdout", 199 | "output_type": "stream", 200 | "text": [ 201 | "X: [[ 0.09220363 0.85065196 0.90075012 0.59361319 0.84875299]\n", 202 | " [ 0.13300259 0.50209599 0.76796562 0.92047036 0.47544869]\n", 203 | " [ 0.72927521 0.8054414 0.4002669 0.01355402 0.31719426]\n", 204 | " ..., \n", 205 | " [ 0.82071112 0.46084335 0.92036074 0.31746465 0.03535725]\n", 206 | " [ 0.21581585 0.12317179 0.42738517 0.35466096 0.93360429]\n", 207 | " [ 0.84577044 0.67545711 0.22706133 0.58893715 0.98216918]]\n", 208 | "Y: [[ 0.32900813 
0.34963352 0.52804383 0.38208285 0.03237214]\n", 209 | " [ 0.11760546 0.46402303 0.12260294 0.18876132 0.99071561]\n", 210 | " [ 0.49587495 0.18125864 0.61421199 0.29089588 0.71308158]\n", 211 | " ..., \n", 212 | " [ 0.14440936 0.38925149 0.50634999 0.29421895 0.96282509]\n", 213 | " [ 0.15239208 0.4741476 0.84900715 0.70515312 0.22175127]\n", 214 | " [ 0.46490389 0.50546926 0.04574762 0.75900819 0.25636212]]\n" 215 | ] 216 | } 217 | ], 218 | "source": [ 219 | "lower_boundary = 0\n", 220 | "upper_boundary = 1\n", 221 | "n = 5 # dimension\n", 222 | "sample_size = 10000\n", 223 | "\n", 224 | "np.random.seed(9001) # set the seed to yield reproducible results\n", 225 | "\n", 226 | "X = np.random.uniform( low=lower_boundary, high=upper_boundary, size=(sample_size, n) )\n", 227 | "Y = np.random.uniform( low=lower_boundary, high=upper_boundary, size=(sample_size, n) )\n", 228 | "\n", 229 | "print( 'X: ', X )\n", 230 | "print( 'Y: ', Y )" 231 | ] 232 | }, 233 | { 234 | "cell_type": "markdown", 235 | "metadata": {}, 236 | "source": [ 237 | "## Part 1\n", 238 | "Calculate the Euclidean distance between the two points of each pair. Do this in a loop. Hint: use sklearn to do the computation.\n", 239 | "\n", 240 | "Plot the histogram of the Euclidean distance. 
In an $n$-dimensional space $\mathbb{R}^{n}$, the Euclidean distance between $\mathbf{x}=\left(x_{1},\:x_{2},\:\cdots,\:x_{n}\right)$ and $\mathbf{y}=\left(y_{1},\:y_{2},\:\cdots,\:y_{n}\right)$ is given\n", 241 | "by \n", 242 | "\begin{equation}\n", 243 | "\begin{aligned}d_{E}\left(\mathbf{x},\:\mathbf{y}\right) & =\sqrt{\left(x_{1}-y_{1}\right)^{2}+\left(x_{2}-y_{2}\right)^{2}+\cdots+\left(x_{n}-y_{n}\right)^{2}}\\\n", 244 | " & =\sqrt{\sum_{i=1}^{n}\left(x_{i}-y_{i}\right)^{2}}\\\n", 245 | " & =\left\Vert \mathbf{x}-\mathbf{y}\right\Vert _{2}\n", 246 | "\end{aligned}\n", 247 | "\end{equation}" 248 | ] 249 | }, 250 | { 251 | "cell_type": "code", 252 | "execution_count": 10, 253 | "metadata": {}, 254 | "outputs": [ 255 | { 256 | "name": "stdout", 257 | "output_type": "stream", 258 | "text": [ 259 | "Running time: 0.6099418379599229\n" 260 | ] 261 | } 262 | ], 263 | "source": [ 264 | "start = timeit.default_timer()\n", 265 | "### START CODE HERE ### (≈ 4 lines of code)\n", 266 | "# implement a loop which computes Euclidean distances between each element in X and Y\n", 267 | "# store results in euclidean_distances_vector_l list\n", 268 | "\n", 269 | "### END CODE HERE ###\n", 270 | "stop = timeit.default_timer()\n", 271 | "print( 'Running time: ', stop-start )" 272 | ] 273 | }, 274 | { 275 | "cell_type": "code", 276 | "execution_count": 11, 277 | "metadata": {}, 278 | "outputs": [ 279 | { 280 | "name": "stdout", 281 | "output_type": "stream", 282 | "text": [ 283 | "Submission successful, please check on the coursera grader page for the status\n" 284 | ] 285 | }, 286 | { 287 | "data": { 288 | "text/plain": [ 289 | "array([ 0.87662633, 0.06098537, -0.03504537, -0.26237711])" 290 | ] 291 | }, 292 | "execution_count": 11, 293 | "metadata": {}, 294 | "output_type": "execute_result" 295 | } 296 | ], 297 | "source": [ 298 | "# Filename: SklearnDistance, PART: pmqxU\n", 299 | "### GRADED PART (DO NOT EDIT) ###\n", 300 | "result = 
euclidean_distances_stats(euclidean_distances_vector_l)\n", 301 | "part_1 = list(result.squeeze())\n", 302 | "try:\n", 303 | " part1 = \" \".join(map(repr, part_1))\n", 304 | "except TypeError:\n", 305 | " part1 = repr(part_1)\n", 306 | "submissions[all_parts[0]]=part1\n", 307 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:1],all_parts,submissions)\n", 308 | "result\n", 309 | "### GRADED PART (DO NOT EDIT) ###" 310 | ] 311 | }, 312 | { 313 | "cell_type": "code", 314 | "execution_count": 12, 315 | "metadata": {}, 316 | "outputs": [ 317 | { 318 | "name": "stdout", 319 | "output_type": "stream", 320 | "text": [ 321 | "Expectation of Euclidean distances: 0.876626326649 \n", 322 | "\n", 323 | "Variance of Euclidean distances: 0.0609853651691 \n", 324 | "\n", 325 | "Skewness of Euclidean distances: -0.0350453681886 \n", 326 | "\n", 327 | "Kurtosis of Euclidean distances: -0.262377106269 \n", 328 | "\n" 329 | ] 330 | } 331 | ], 332 | "source": [ 333 | "print_stats(result)\n", 334 | "plot_distribution(euclidean_distances_vector_l, result, n)\n", 335 | "try:\n", 336 | " plt.show()\n", 337 | "except: pass" 338 | ] 339 | }, 340 | { 341 | "cell_type": "markdown", 342 | "metadata": {}, 343 | "source": [ 344 | "## Part 2\n", 345 | "Calculate the Euclidean distance between the two points of each pair using vectorized operations and inner product." 
346 | ] 347 | }, 348 | { 349 | "cell_type": "code", 350 | "execution_count": 13, 351 | "metadata": {}, 352 | "outputs": [ 353 | { 354 | "name": "stdout", 355 | "output_type": "stream", 356 | "text": [ 357 | "Running time: 1.8537122719571926\n" 358 | ] 359 | } 360 | ], 361 | "source": [ 362 | "# using vectorization by calculating inner product\n", 363 | "start = timeit.default_timer()\n", 364 | "# variables needed for grading\n", 365 | "euclidean_distances_vector_l_vectorized = []\n", 366 | "### START CODE HERE ### (≈ 3 lines of code)\n", 367 | "# compute Euclidean distances between each element in X and Y using a vectorized implementation\n", 368 | "# store results in euclidean_distances_vector_l_vectorized \n", 369 | "\n", 370 | "\n", 371 | "### END CODE HERE ###\n", 372 | "stop = timeit.default_timer()\n", 373 | "print( 'Running time: ', stop-start )" 374 | ] 375 | }, 376 | { 377 | "cell_type": "code", 378 | "execution_count": 15, 379 | "metadata": {}, 380 | "outputs": [ 381 | { 382 | "name": "stdout", 383 | "output_type": "stream", 384 | "text": [ 385 | "Submission successful, please check on the coursera grader page for the status\n" 386 | ] 387 | }, 388 | { 389 | "data": { 390 | "text/plain": [ 391 | "array([ 0.87662633, 0.06098537, -0.03504537, -0.26237711])" 392 | ] 393 | }, 394 | "execution_count": 15, 395 | "metadata": {}, 396 | "output_type": "execute_result" 397 | } 398 | ], 399 | "source": [ 400 | "# Filename: VectorizedDistance, PART: VrXL6\n", 401 | "### GRADED PART (DO NOT EDIT) ### \n", 402 | "result = euclidean_distances_stats(euclidean_distances_vector_l_vectorized)\n", 403 | "part_2 = result.squeeze()\n", 404 | "try:\n", 405 | " part2 = \" \".join(map(repr, part_2))\n", 406 | "except TypeError:\n", 407 | " part2 = repr(part_2)\n", 408 | "submissions[all_parts[1]]=part2\n", 409 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:2],all_parts,submissions)\n", 410 | "result\n", 411 | "### GRADED PART (DO NOT EDIT) ###" 412 | ] 413 | }, 
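A minimal sketch of the kind of vectorized computation Part 2 asks for (the variable name matches the notebook's graded cell; the specific one-liner is an assumption, not the official solution):

```python
import numpy as np

# Same setup as the notebook: paired samples with uniform coordinates
np.random.seed(9001)
sample_size, n = 10000, 5
X = np.random.uniform(low=0, high=1, size=(sample_size, n))
Y = np.random.uniform(low=0, high=1, size=(sample_size, n))

# Row-wise distances via the inner product: ||x - y||_2 = sqrt(<x - y, x - y>)
diff = X - Y
euclidean_distances_vector_l_vectorized = np.sqrt(np.einsum('ij,ij->i', diff, diff))

# Cross-check against an equivalent formulation
assert np.allclose(euclidean_distances_vector_l_vectorized,
                   np.linalg.norm(diff, axis=1))
```

The `einsum` call computes all 10000 inner products in one pass, which is why the vectorized version beats the element-by-element loop of Part 1.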
414 | { 415 | "cell_type": "code", 416 | "execution_count": 17, 417 | "metadata": {}, 418 | "outputs": [ 419 | { 420 | "name": "stdout", 421 | "output_type": "stream", 422 | "text": [ 423 | "Expectation of Euclidean distances: 0.876626326649 \n", 424 | "\n", 425 | "Variance of Euclidean distances: 0.0609853651691 \n", 426 | "\n", 427 | "Skewness of Euclidean distances: -0.0350453681886 \n", 428 | "\n", 429 | "Kurtosis of Euclidean distances: -0.262377106269 \n", 430 | "\n" 431 | ] 432 | } 433 | ], 434 | "source": [ 435 | "print_stats(result)\n", 436 | "fig = plot_distribution(euclidean_distances_vector_l_vectorized, result, n)\n", 437 | "try:\n", 438 | " plt.plot()\n", 439 | "except: pass" 440 | ] 441 | }, 442 | { 443 | "cell_type": "markdown", 444 | "metadata": {}, 445 | "source": [ 446 | "## Part 3 \n", 447 | "We repeat Part 1 and Part 2 for a range of dimensions from $n=2$ up to $n=1000$. Then plot the expectation and variance as a function of $n$.\n", 448 | "You need to generate two sets of $n$-dimensional samples, compute the Euclidean distances between the paired samples, and collect the distance statistics for each dimension." 449 | ] 450 | }, 451 | { 452 | "cell_type": "code", 453 | "execution_count": 18, 454 | "metadata": { 455 | "collapsed": true 456 | }, 457 | "outputs": [], 458 | "source": [ 459 | "def VectorizationMethod(dim_space, sample_size, lower_boundary, upper_boundary, bins_number=30):\n", 460 | " \"\"\"\n", 461 | " Generate sample_size elements from dim_space-dimensional space. 
The coordinates of each element in the space\n", 462 | " are sampled from uniform distribution between lower_boundary and upper_boundary\n", 463 | " \n", 464 | " Arguments: \n", 465 | " \n", 466 | " dim_space - dimension of the space, a positive integer\n", 467 | " sample_size - number of samples in the dim_space-dimensional space\n", 468 | " lower_boundary - lower boundary of coordinates sampled from U(lower_boundary, upper_boundary)\n", 469 | " upper_boundary - upper boundary of coordinates sampled from U(lower_boundary, upper_boundary)\n", 470 | " bins_number - number of bins to plot a histogram\n", 471 | " \n", 472 | " stats_result - np.array() of length 4\n", 473 | " the first element of array is the mean\n", 474 | " the second element is variance\n", 475 | " the third element is skew of the distribution\n", 476 | " the fourth element is kurtosis of the distribution\n", 477 | " \"\"\"\n", 478 | " np.random.seed(42)\n", 479 | " # variables needed for grading\n", 480 | " euclidean_distances_vector_v = []\n", 481 | " ### START CODE HERE ### (≈ 7-10 lines of code)\n", 482 | " # store results in euclidean_distances_vector_v\n", 483 | "\n", 484 | " ### END CODE HERE ###\n", 485 | " stats_result = euclidean_distances_stats(euclidean_distances_vector_v)\n", 486 | " return tuple(stats_result.tolist())" 487 | ] 488 | }, 489 | { 490 | "cell_type": "code", 491 | "execution_count": 19, 492 | "metadata": {}, 493 | "outputs": [ 494 | { 495 | "name": "stdout", 496 | "output_type": "stream", 497 | "text": [ 498 | "Calculating finished for sample size = 10000, dimension = 2\n", 499 | "\n", 500 | "Calculating finished for sample size = 10000, dimension = 5\n", 501 | "\n", 502 | "Calculating finished for sample size = 10000, dimension = 10\n", 503 | "\n", 504 | "Calculating finished for sample size = 10000, dimension = 20\n", 505 | "\n", 506 | "Calculating finished for sample size = 10000, dimension = 40\n", 507 | "\n", 508 | "Calculating finished for sample size = 10000, 
dimension = 60\n", 509 | "\n", 510 | "Calculating finished for sample size = 10000, dimension = 80\n", 511 | "\n", 512 | "Calculating finished for sample size = 10000, dimension = 100\n", 513 | "\n", 514 | "Calculating finished for sample size = 10000, dimension = 200\n", 515 | "\n", 516 | "Calculating finished for sample size = 10000, dimension = 400\n", 517 | "\n", 518 | "Calculating finished for sample size = 10000, dimension = 600\n", 519 | "\n", 520 | "Calculating finished for sample size = 10000, dimension = 800\n", 521 | "\n", 522 | "Calculating finished for sample size = 10000, dimension = 1000\n", 523 | "\n", 524 | "Running time: 20.341083290986717\n" 525 | ] 526 | } 527 | ], 528 | "source": [ 529 | "start = timeit.default_timer()\n", 530 | "\n", 531 | "sample_size = 10000\n", 532 | "lower_boundary = 0\n", 533 | "upper_boundary = 1\n", 534 | "dimension_vector = [2, 5, 10, 20, 40, 60, 80, 100, 200, 400, 600, 800, 1000] \n", 535 | "n_dims = len(dimension_vector)\n", 536 | "\n", 537 | "euclidean_distances_mean_vector = [np.nan] * n_dims\n", 538 | "euclidean_distances_variance_vector = [np.nan] * n_dims\n", 539 | "euclidean_distances_skewness_vector = [np.nan] * n_dims\n", 540 | "euclidean_distances_kurtosis_vector = [np.nan] * n_dims\n", 541 | "\n", 542 | "for idx, space_dims in enumerate(dimension_vector):\n", 543 | " \n", 544 | " # using vectorization\n", 545 | " euclidean_distances_mean, euclidean_distances_variance, euclidean_distances_skewness, euclidean_distances_kurtosis = \\\n", 546 | " VectorizationMethod( space_dims, sample_size, lower_boundary, upper_boundary )\n", 547 | " \n", 548 | " euclidean_distances_mean_vector[idx] = euclidean_distances_mean\n", 549 | " euclidean_distances_variance_vector[idx] = euclidean_distances_variance\n", 550 | " euclidean_distances_skewness_vector[idx] = euclidean_distances_skewness\n", 551 | " euclidean_distances_kurtosis_vector[idx] = euclidean_distances_kurtosis\n", 552 | " \n", 553 | " print( 'Calculating finished 
for sample size = %d, dimension = %d\\n' %( sample_size, space_dims) )\n", 554 | "\n", 555 | "stop = timeit.default_timer()\n", 556 | "print( 'Running time: ', stop-start )" 557 | ] 558 | }, 559 | { 560 | "cell_type": "code", 561 | "execution_count": 20, 562 | "metadata": {}, 563 | "outputs": [ 564 | { 565 | "name": "stdout", 566 | "output_type": "stream", 567 | "text": [ 568 | "Submission successful, please check on the coursera grader page for the status\n" 569 | ] 570 | }, 571 | { 572 | "data": { 573 | "text/plain": [ 574 | "[0.5244117684024786,\n", 575 | " 0.8822841161864812,\n", 576 | " 1.2676717606162842,\n", 577 | " 1.8110504380007288,\n", 578 | " 2.5684460728327534,\n", 579 | " 3.1487610877583165,\n", 580 | " 3.64396853019095,\n", 581 | " 4.073344650824303,\n", 582 | " 5.768449828048197,\n", 583 | " 8.160150803731382,\n", 584 | " 9.997217189326255,\n", 585 | " 11.543203181243685,\n", 586 | " 12.906928018524363]" 587 | ] 588 | }, 589 | "execution_count": 20, 590 | "metadata": {}, 591 | "output_type": "execute_result" 592 | } 593 | ], 594 | "source": [ 595 | "# Filename : DistancesMean, PART: XsLp1\n", 596 | "### GRADED PART (DO NOT EDIT) ###\n", 597 | "part_3 = list(euclidean_distances_mean_vector)\n", 598 | "try:\n", 599 | " part3 = \" \".join(map(repr, part_3))\n", 600 | "except TypeError:\n", 601 | " part3 = repr(part_3)\n", 602 | "submissions[all_parts[2]]=part3\n", 603 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:3],all_parts,submissions)\n", 604 | "euclidean_distances_mean_vector\n", 605 | "### GRADED PART (DO NOT EDIT) ###" 606 | ] 607 | }, 608 | { 609 | "cell_type": "code", 610 | "execution_count": 21, 611 | "metadata": {}, 612 | "outputs": [ 613 | { 614 | "name": "stdout", 615 | "output_type": "stream", 616 | "text": [ 617 | "Submission successful, please check on the coursera grader page for the status\n" 618 | ] 619 | }, 620 | { 621 | "data": { 622 | "text/plain": [ 623 | "[0.06230677292748971,\n", 624 | " 
0.061198079555789694,\n", 625 | " 0.0608126495018327,\n", 626 | " 0.059183678488410246,\n", 627 | " 0.05949007814616248,\n", 628 | " 0.057252681257966946,\n", 629 | " 0.05935452158486421,\n", 630 | " 0.05831142832530561,\n", 631 | " 0.05928563431624706,\n", 632 | " 0.059076129472239725,\n", 633 | " 0.05762985490169308,\n", 634 | " 0.059174927565307574,\n", 635 | " 0.05815990596103261]" 636 | ] 637 | }, 638 | "execution_count": 21, 639 | "metadata": {}, 640 | "output_type": "execute_result" 641 | } 642 | ], 643 | "source": [ 644 | "# Filename: DistancesVariance, PART jD7SY\n", 645 | "### GRADED PART (DO NOT EDIT) ###\n", 646 | "part_4 = list(euclidean_distances_variance_vector)\n", 647 | "try:\n", 648 | " part4 = \" \".join(map(repr, part_4))\n", 649 | "except TypeError:\n", 650 | " part4 = repr(part_4)\n", 651 | "submissions[all_parts[3]]=part4\n", 652 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:4],all_parts,submissions)\n", 653 | "euclidean_distances_variance_vector\n", 654 | "### GRADED PART (DO NOT EDIT) ###" 655 | ] 656 | }, 657 | { 658 | "cell_type": "code", 659 | "execution_count": 22, 660 | "metadata": {}, 661 | "outputs": [ 662 | { 663 | "name": "stdout", 664 | "output_type": "stream", 665 | "text": [ 666 | "Submission successful, please check on the coursera grader page for the status\n" 667 | ] 668 | }, 669 | { 670 | "data": { 671 | "text/plain": [ 672 | "[0.1988768646152347,\n", 673 | " -0.021074633737255218,\n", 674 | " -0.057498176201923115,\n", 675 | " -0.07189621539115594,\n", 676 | " -0.006116609407693535,\n", 677 | " -0.023983251393225706,\n", 678 | " -0.05204557015527248,\n", 679 | " -0.018424595473803224,\n", 680 | " -0.004037890673925173,\n", 681 | " -0.02085334934652273,\n", 682 | " -0.014025628984888351,\n", 683 | " 0.029458241353260365,\n", 684 | " -0.043966380540847866]" 685 | ] 686 | }, 687 | "execution_count": 22, 688 | "metadata": {}, 689 | "output_type": "execute_result" 690 | } 691 | ], 692 | "source": [ 
693 | "# Filename: DistancesSkewness, PART: Ad4J0\n", 694 | "### GRADED PART (DO NOT EDIT) ###\n", 695 | "part_5 = list(euclidean_distances_skewness_vector)\n", 696 | "try:\n", 697 | " part5 = \" \".join(map(repr, part_5))\n", 698 | "except TypeError:\n", 699 | " part5 = repr(part_5)\n", 700 | "submissions[all_parts[4]]=part5\n", 701 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:5],all_parts,submissions)\n", 702 | "euclidean_distances_skewness_vector\n", 703 | "### GRADED PART (DO NOT EDIT) ###" 704 | ] 705 | }, 706 | { 707 | "cell_type": "code", 708 | "execution_count": 23, 709 | "metadata": {}, 710 | "outputs": [ 711 | { 712 | "name": "stdout", 713 | "output_type": "stream", 714 | "text": [ 715 | "Submission successful, please check on the coursera grader page for the status\n" 716 | ] 717 | }, 718 | { 719 | "data": { 720 | "text/plain": [ 721 | "[-0.6384013133225133,\n", 722 | " -0.27584397346027867,\n", 723 | " -0.15223233078033216,\n", 724 | " -0.07988375526844216,\n", 725 | " -0.010447691485871324,\n", 726 | " -0.08064860279897523,\n", 727 | " -0.02331335574782667,\n", 728 | " -0.020166667252636383,\n", 729 | " 0.10669665209383972,\n", 730 | " -0.05369066310062731,\n", 731 | " 0.024930971487188813,\n", 732 | " 0.00307535205057885,\n", 733 | " 0.06775391815498555]" 734 | ] 735 | }, 736 | "execution_count": 23, 737 | "metadata": {}, 738 | "output_type": "execute_result" 739 | } 740 | ], 741 | "source": [ 742 | "# Filename: DistancesKurtosis, PART: 1nPFm\n", 743 | "### GRADED PART (DO NOT EDIT) ###\n", 744 | "part_6 = list(euclidean_distances_kurtosis_vector)\n", 745 | "try:\n", 746 | " part6 = \" \".join(map(repr, part_6))\n", 747 | "except TypeError:\n", 748 | " part6 = repr(part_6)\n", 749 | "submissions[all_parts[5]]=part6\n", 750 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:6],all_parts,submissions)\n", 751 | "euclidean_distances_kurtosis_vector\n", 752 | "### GRADED PART (DO NOT EDIT) ###" 753 | 
] 754 | }, 755 | { 756 | "cell_type": "code", 757 | "execution_count": 24, 758 | "metadata": { 759 | "collapsed": true 760 | }, 761 | "outputs": [], 762 | "source": [ 763 | "# here we plot the stats for different dimensions\n", 764 | "try:\n", 765 | " plt.figure()\n", 766 | " plt.plot( dimension_vector, euclidean_distances_mean_vector, 'r-', marker='o' )\n", 767 | " plt.grid( True, which='both')\n", 768 | " plt.minorticks_on()\n", 769 | " plt.title( 'Mean of Euclidean Distances Distribution' )\n", 770 | " plt.xlabel( 'Dimension' )\n", 771 | " plt.ylabel( 'Mean of Euclidean Distances' )\n", 772 | "\n", 773 | " plt.figure()\n", 774 | " plt.plot( dimension_vector, euclidean_distances_variance_vector, 'r-', marker='o' )\n", 775 | " plt.grid( True, which='both')\n", 776 | " plt.minorticks_on()\n", 777 | " plt.title( 'Variance of Euclidean Distances Distribution' )\n", 778 | " plt.xlabel( 'Dimension' )\n", 779 | " plt.ylabel( 'Variance of Euclidean Distances' )\n", 780 | "\n", 781 | " plt.figure()\n", 782 | " plt.plot( dimension_vector, euclidean_distances_skewness_vector, 'r-', marker='o' )\n", 783 | " plt.grid( True, which='both')\n", 784 | " plt.minorticks_on()\n", 785 | " plt.title( 'Skewness of Euclidean Distances Distribution' )\n", 786 | " plt.xlabel( 'Dimension' )\n", 787 | " plt.ylabel( 'Skewness of Euclidean Distances' )\n", 788 | "\n", 789 | " plt.figure()\n", 790 | " plt.plot( dimension_vector, euclidean_distances_kurtosis_vector, 'r-', marker='o' )\n", 791 | " plt.grid( True, which='both')\n", 792 | " plt.minorticks_on()\n", 793 | " plt.title( 'Kurtosis of Euclidean Distances Distribution' )\n", 794 | " plt.xlabel( 'Dimension' )\n", 795 | " plt.ylabel( 'Kurtosis of Euclidean Distances' )\n", 796 | "\n", 797 | " matplotlib.pyplot.show()\n", 798 | "except: pass" 799 | ] 800 | }, 801 | { 802 | "cell_type": "code", 803 | "execution_count": null, 804 | "metadata": { 805 | "collapsed": true 806 | }, 807 | "outputs": [], 808 | "source": [] 809 | } 810 | ], 811 | 
"metadata": { 812 | "coursera": { 813 | "course_slug": "guided-tour-machine-learning-finance", 814 | "graded_item_id": "qoIPX", 815 | "launcher_item_id": "rsGVU" 816 | }, 817 | "kernelspec": { 818 | "display_name": "Python 3", 819 | "language": "python", 820 | "name": "python3" 821 | }, 822 | "language_info": { 823 | "codemirror_mode": { 824 | "name": "ipython", 825 | "version": 3 826 | }, 827 | "file_extension": ".py", 828 | "mimetype": "text/x-python", 829 | "name": "python", 830 | "nbconvert_exporter": "python", 831 | "pygments_lexer": "ipython3", 832 | "version": "3.6.0" 833 | } 834 | }, 835 | "nbformat": 4, 836 | "nbformat_minor": 2 837 | } 838 | -------------------------------------------------------------------------------- /MY_Tobit_regression_m1_ex3_v4.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Tobit regression with TensorFlow\n", 8 | "\n", 9 | "Tobit regression fits the following model for non-negative data $ y $: \n", 10 | "\n", 11 | "$ y({\bf X}) = \max (0, w_0 + \sum_{i=1}^{N} w_i X_i + w_{N+1} \cdot \varepsilon) $ \n", 12 | "\n", 13 | "Here $ X_i $ are predictors, $ \varepsilon \sim N(0,1) $ is a standard Gaussian noise, and $ w_{N+1} $ is the noise\n", 14 | "volatility (standard deviation).\n", 15 | "\n", 16 | "Our problem is to fit the $ N+2 $ parameters $ w_{i} $ for $ i = 0, \ldots, N+1 $ to the observed set of pairs $ \left({\bf X}_i, y_i \right) $ \n", 17 | "\n", 18 | "We use synthetic data with known parameters to learn how to implement Tobit Regression in TensorFlow. " 19 | ] 20 | }, 21 | { 22 | "cell_type": "markdown", 23 | "metadata": {}, 24 | "source": [ 25 | "## About iPython Notebooks ##\n", 26 | "\n", 27 | "iPython Notebooks are interactive coding environments embedded in a webpage. You will be using iPython notebooks in this class. 
You only need to write code between the ### START CODE HERE ### and ### END CODE HERE ### comments. After writing your code, you can run the cell by either pressing \"SHIFT\"+\"ENTER\" or by clicking on \"Run Cell\" (denoted by a play symbol) in the upper bar of the notebook. \n", 28 | "\n", 29 | "We will often specify \"(≈ X lines of code)\" in the comments to tell you about how much code you need to write. It is just a rough estimate, so don't feel bad if your code is longer or shorter." 30 | ] 31 | }, 32 | { 33 | "cell_type": "code", 34 | "execution_count": 1, 35 | "metadata": { 36 | "collapsed": true 37 | }, 38 | "outputs": [], 39 | "source": [ 40 | "import numpy as np\n", 41 | "import tensorflow as tf\n", 42 | "\n", 43 | "import sys\n", 44 | "sys.path.append(\"..\")\n", 45 | "import grading\n", 46 | "\n", 47 | "try:\n", 48 | " import matplotlib.pyplot as plt\n", 49 | " %matplotlib inline\n", 50 | "except:\n", 51 | " pass\n", 52 | "\n", 53 | "try:\n", 54 | " from mpl_toolkits.mplot3d import Axes3D\n", 55 | "except:\n", 56 | " pass" 57 | ] 58 | }, 59 | { 60 | "cell_type": "code", 61 | "execution_count": null, 62 | "metadata": { 63 | "collapsed": true 64 | }, 65 | "outputs": [], 66 | "source": [ 67 | "### ONLY FOR GRADING. DO NOT EDIT ###\n", 68 | "submissions=dict()\n", 69 | "assignment_key=\"w3Hc-vZdEeehlBIKDnZryg\" \n", 70 | "all_parts=[\"pLnY5\", \"RKR6p\", \"IU1pw\", \"ISVtY\", \"Cutr3\"]\n", 71 | "### ONLY FOR GRADING. 
DO NOT EDIT ###" 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "execution_count": null, 77 | "metadata": { 78 | "collapsed": true 79 | }, 80 | "outputs": [], 81 | "source": [ 82 | "COURSERA_TOKEN = \"\" # the key provided to the Student under his/her email on submission page\n", 83 | "COURSERA_EMAIL = \"\" # the email" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": 2, 89 | "metadata": { 90 | "collapsed": true 91 | }, 92 | "outputs": [], 93 | "source": [ 94 | "# utility function to reset the TF graph to the same state each time\n", 95 | "def reset_graph(seed=42):\n", 96 | "    # to make results reproducible across runs\n", 97 | "    tf.reset_default_graph()\n", 98 | "    tf.set_random_seed(seed)\n", 99 | "    np.random.seed(seed)\n", 100 | "    " 101 | ] 102 | }, 103 | { 104 | "cell_type": "markdown", 105 | "metadata": {}, 106 | "source": [ 107 | "## Tobit Regression class\n", 108 | "\n", 109 | "**Instructions**:\n", 110 | "Complete the code for the calculation of the loss function (the negative log-likelihood)." 
111 | ] 112 | }, 113 | { 114 | "cell_type": "code", 115 | "execution_count": 3, 116 | "metadata": { 117 | "collapsed": true 118 | }, 119 | "outputs": [], 120 | "source": [ 121 | "class Tobit_Regression:\n", 122 | " \n", 123 | " def __init__(self, n_features, learning_rate=0.005, L=0):\n", 124 | " \n", 125 | " self.input = tf.placeholder(tf.float32, [None, n_features], name=\"Input\")\n", 126 | " self.target = tf.placeholder(tf.float32, [None, 1], name=\"Target\")\n", 127 | " \n", 128 | " # the first weight is for the intercept, the last one is for a square root of the noise std \n", 129 | " self.weights = tf.Variable(tf.random_normal([n_features + 2, 1]))\n", 130 | " \n", 131 | " # Augmented data matrix is obtained by adding a column of ones to the data matrix\n", 132 | " self.data_plus_bias = tf.concat([tf.ones([tf.shape(self.input)[0], 1]), self.input], axis=1)\n", 133 | "\n", 134 | " #######################################################################\n", 135 | " # MLE for Tobit regression \n", 136 | " \n", 137 | " # noise volatility is obtained as a square of the last weight to ensure positivity \n", 138 | " self.sigma = 0.0001 + tf.square(self.weights[-1])\n", 139 | " \n", 140 | " # term1 and term2 are just placeholders initialized such that the code runs\n", 141 | " # students need to initialize them appropriately to solve this assignment\n", 142 | " term1 = tf.Variable(np.zeros(shape=(n_features + 2, 1)))\n", 143 | " term2 = tf.Variable(np.zeros(shape=(n_features + 2, 1)))\n", 144 | " # THIS IS THE PART THAT STUDENTS ARE SUPPOSED TO WRITE THEMSELVES TO COMPLETE THE IMPLEMENTATION \n", 145 | " # OF THE TOBIT REGRESSION MODEL\n", 146 | " \n", 147 | " # FOR THE ASSIGNMENT: complete the code for the calculation of loss function \n", 148 | " # (the negative log-likelihood)\n", 149 | " ### START CODE HERE ### (≈ 6-7 lines of code)\n", 150 | "\n", 151 | " \n", 152 | " ### END CODE HERE ###\n", 153 | " self.loss = - tf.reduce_mean(term1 + term2)\n", 154 | " \n", 
155 | "        #####################################################################\n", 156 | "\n", 157 | "        # Use Adam optimization for training\n", 158 | "        self.train_step = (tf.train.AdamOptimizer(learning_rate).minimize(self.loss), -self.loss)\n", 159 | "        \n", 160 | "        # prediction made from the model: Use a ReLU neuron!\n", 161 | "        self.output = tf.nn.relu(tf.matmul(self.data_plus_bias[:, :], self.weights[:-1]))\n", 162 | "        \n", 163 | "        # Check the output L1-norm error \n", 164 | "        self.output_L1_error = tf.reduce_mean(tf.abs(self.target - self.output))\n", 165 | "\n", 166 | "    def generate_data(n_points,\n", 167 | "                      n_features,\n", 168 | "                      weights,\n", 169 | "                      noise_std):\n", 170 | "\n", 171 | "        # Bounds of [-1,1] in space of n_points x n_features\n", 172 | "        np.random.seed(42)\n", 173 | "        bias = np.ones(n_points).reshape((-1,1))\n", 174 | "        low = - np.ones((n_points,n_features),'float')\n", 175 | "        high = np.ones((n_points,n_features),'float')\n", 176 | "\n", 177 | "        # simulated features are uniformly distributed on [-1,1].\n", 178 | "        # The size n_points x n_features of array X is inferred by broadcasting of 'low' and 'high'\n", 179 | "        X = np.random.uniform(low=low, high=high)\n", 180 | "        \n", 181 | "        # simulated noise\n", 182 | "        noise = np.random.normal(size=(n_points, 1))\n", 183 | "        \n", 184 | "        # outputs \n", 185 | "        Y = weights[0] * bias + np.dot(X, weights[1:]).reshape((-1,1)) + noise_std * noise\n", 186 | "\n", 187 | "        # truncate negative values of Y \n", 188 | "        np.clip(Y, a_min=0, a_max=None, out=Y)\n", 189 | "\n", 190 | "        return X, Y " 191 | ] 192 | }, 193 | { 194 | "cell_type": "code", 195 | "execution_count": 4, 196 | "metadata": { 197 | "collapsed": true 198 | }, 199 | "outputs": [], 200 | "source": [ 201 | "def gen_tobit_dataset(n_points, n_features, train_test_split=4):\n", 202 | "    \"\"\"\n", 203 | "    Generate dataset for Tobit regression model and split it into training and test portions\n", 204 | "    \n", 205 | "    \"\"\"\n", 206 | "    # n_features + 1 
weights (one for a constant feature)\n", 207 | "    data_weights = np.array([-0.25, 0.5, 0.2, .1]) \n", 208 | "    noise_std = 0.1\n", 209 | "    \n", 210 | "    # Generate dataset\n", 211 | "    X, Y = Tobit_Regression.generate_data(n_points=n_points,\n", 212 | "                                          n_features=n_features,\n", 213 | "                                          weights=data_weights,\n", 214 | "                                          noise_std=noise_std)\n", 215 | "    \n", 216 | "    # split into the train and test sets\n", 217 | "    # 1/4 of the data is used for testing\n", 218 | "    \n", 219 | "    n_test = int(n_points / train_test_split)\n", 220 | "    n_train = n_points - n_test\n", 221 | "    \n", 222 | "    X_train = X[:n_train,:]\n", 223 | "    Y_train = Y[:n_train].reshape((-1,1))\n", 224 | "\n", 225 | "    X_test = X[n_train:,:]\n", 226 | "    Y_test = Y[n_train:].reshape((-1,1))\n", 227 | "    return X_train, Y_train, X_test, Y_test\n", 228 | "\n", 229 | "def train_model(n_features, learning_rate, n_steps=1000):\n", 230 | "    \"\"\"\n", 231 | "    Train Tobit Regression model\n", 232 | "    \n", 233 | "    Return:\n", 234 | "    a tuple of:\n", 235 | "    - Model fitted weights, np.array\n", 236 | "    - loss, double, and fitted noise std error, double\n", 237 | "    - L1 error on the test set, double\n", 238 | "    - model predictions on the test set, np.array\n", 239 | "    \"\"\"\n", 240 | "    # create an instance of the Tobit Regression class \n", 241 | "    model = Tobit_Regression(n_features=n_features, learning_rate=learning_rate)\n", 242 | "\n", 243 | "    # train the model\n", 244 | "    with tf.Session() as sess:\n", 245 | "        sess.run(tf.global_variables_initializer())\n", 246 | "        \n", 247 | "        for _ in range(0, n_steps):\n", 248 | "            (_, loss), weights = sess.run((model.train_step, model.weights), feed_dict={\n", 249 | "                model.input: X_train,\n", 250 | "                model.target: Y_train\n", 251 | "            })\n", 252 | "        \n", 253 | "        # predictions for the test set\n", 254 | "        # std_model = weights[-1]**2 \n", 255 | "        output, std_model = sess.run([model.output,model.sigma], \n", 256 | "                                     feed_dict={model.input: X_test})\n", 257 | "        \n", 258 | "        output_L1_error = sess.run(model.output_L1_error,\n", 259 | "                                   
feed_dict={model.input: X_test,\n", 260 | " model.target: Y_test})\n", 261 | " sess.close()\n", 262 | " return weights[:-1], loss, std_model[0], output_L1_error, output\n", 263 | "\n", 264 | "def plot_results(): \n", 265 | " # Plot a projection of test prediction on the first two predictors\n", 266 | " fig = plt.figure()\n", 267 | " ax = fig.add_subplot(111, projection='3d')\n", 268 | " ax.scatter(X_test[:,1], X_test[:,2], Y_test, s=1, c=\"#000000\")\n", 269 | " ax.scatter(X_test[:,1], X_test[:,2], output.reshape([-1,1]), s=1, c=\"#FF0000\")\n", 270 | " plt.xlabel('X_1')\n", 271 | " plt.ylabel('X_2')\n", 272 | " plt.show()" 273 | ] 274 | }, 275 | { 276 | "cell_type": "code", 277 | "execution_count": 5, 278 | "metadata": {}, 279 | "outputs": [ 280 | { 281 | "data": { 282 | "text/plain": [ 283 | "array([ 1.08004403, -2.47319293, -1.57543015, -0.40005195], dtype=float32)" 284 | ] 285 | }, 286 | "execution_count": 5, 287 | "metadata": {}, 288 | "output_type": "execute_result" 289 | } 290 | ], 291 | "source": [ 292 | "### GRADED PART (DO NOT EDIT) ###\n", 293 | "n_points = 5000\n", 294 | "n_features = 3\n", 295 | "learning_rate = 0.05\n", 296 | "n_steps = 1000\n", 297 | "\n", 298 | "X_train, Y_train, X_test, Y_test = gen_tobit_dataset(n_points, n_features)\n", 299 | "reset_graph()\n", 300 | "weights, loss, std_model, error_L1, output = train_model(n_features, learning_rate, n_steps)\n", 301 | "\n", 302 | "part_1=list(weights.squeeze())\n", 303 | "try:\n", 304 | " part1 = \" \".join(map(repr, part_1))\n", 305 | "except TypeError:\n", 306 | " part1 = repr(part_1)\n", 307 | "submissions[all_parts[0]]=part1\n", 308 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key, all_parts[0],all_parts,submissions)\n", 309 | "weights.squeeze()\n", 310 | "### GRADED PART (DO NOT EDIT) ###" 311 | ] 312 | }, 313 | { 314 | "cell_type": "code", 315 | "execution_count": 6, 316 | "metadata": {}, 317 | "outputs": [ 318 | { 319 | "data": { 320 | "text/plain": [ 321 | 
"[99.899992522975907, 0.19905642, 1.3988718]" 322 | ] 323 | }, 324 | "execution_count": 6, 325 | "metadata": {}, 326 | "output_type": "execute_result" 327 | } 328 | ], 329 | "source": [ 330 | "### GRADED PART (DO NOT EDIT) ###\n", 331 | "part_2=[loss, std_model, error_L1]\n", 332 | "try:\n", 333 | " part2 = \" \".join(map(repr, part_2))\n", 334 | "except TypeError:\n", 335 | " part2 = repr(part_2) \n", 336 | " \n", 337 | "submissions[all_parts[1]]=part2\n", 338 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key, all_parts[:2],all_parts,submissions)\n", 339 | "[loss, std_model, error_L1]\n", 340 | "### GRADED PART (DO NOT EDIT) ###" 341 | ] 342 | }, 343 | { 344 | "cell_type": "code", 345 | "execution_count": 7, 346 | "metadata": {}, 347 | "outputs": [ 348 | { 349 | "data": { 350 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAWQAAADuCAYAAAAOR30qAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzsnXl8XHW5/9/nnFmSyaRJmjZNm7bZuu+2KW1ZCkWq7Nor\nVxYVxIULoghWBRdAFgFREEQBvci1LC4slyu39aKI8lMQbaG1pXuz73sy+3aW3x8nZzqTzExmkmmb\n1vN+vfpqm5z5zplkzmee83yf5/MImqZhYmJiYnLiEU/0CZiYmJiY6JiCbGJiYjJBMAXZxMTEZIJg\nCrKJiYnJBMEUZBMTE5MJginIJiYmJhMEU5BNTExMJgimIJuYmJhMEExBNjExMZkgWDI83mzrMzEx\nMckcIZ2DzAjZxMTEZIJgCrKJiYnJBMEUZBMTE5MJginIJiYmJhMEU5BNTExMJgimIJuYmJhMEExB\nNjExMZkgmIJsYmJiMkEwBdnExMRkgmAKsomJickEwRRkExMTkwmCKcgmJiYmE4RMzYVMTFKiaRqK\nogAgSRKCkJaniomJCaYgm2QJVVVRFAVZlgmFQtGvC4KAJEnRP6IoIooigiCYYm1iMgxTkE3Ghaqq\nyLIcjYoFQYgKrqbpbq2GUMeiKAqhUIjCwkIsFosp1CYmmIJsMgY0TUNVVfx+PxaL/hYyhNQQYeNr\nsX/H4vf7aW5uxuFwEA6H4x4jiiKSJJlCbfIvhynIJmljCLEsy4TDYXbt2sVpp502JqEURX0/WZKk\nhM+hKEqcUBvHDk9/mEJtciphCrLJqMQKsaqq0bywpmljFsPh0XTs1xOtGSvUsc/b0dHBzJkzowI9\nPE9tYnIyYQqySVKMiglZlqMiaPwRRTGhoGa6frokE+r29nZmzJgxQqiBqEAn2lA0MZmImIJsMoJE\nQmykGAySRbjpki1RTHRuoL8GTdOIRCKEw2FTqE1OCkxBNomiaRo+nw9N07BYLEnFbjTSEbbxCno6\n6ydLfQDIskwkEgGgu7ubvLw88vPzTaE2OaGYgmyCpmnIsowsy7S2tmKz2SgrKzumz3msBTnV88b+\nDeD1erHb7UC8UMc+ZnjVh9n0YnIsMAX5X5jYZg4ga7nhUCiE1WodU3R9oojNj8cyvJbaSOFompYy\n9WGKtclYMAX
LXyayejW+H/8Y59NPw5lnxuejYy8yjwciEezf+AbSe+8hf/CDKBdeiLB8eVyF\ng1pVRfBnP4svxRuF4MyZ+B58EHs4jNDTg+By6bnfZ55BrahAKyrCWlBA0fLlFAGzAMJh7I8/jnz4\nMK1r1uAvLGRvTQ3uykoK16/HWl1NvqbhDAax2+165NrXh/3WW5EvuAB5aAKLVlpK5MoroaeHWf/9\n31BTk9JkSejtBZ8PLbb7MgZnXR3Snj0oNTVj8kKG7AnyeCP12AaXuTGvZfiYrdihtcnGbGXS7WdG\nyCcBRlph+K1cOBymoaGBvr4+KioqmD9/PoKm0bdiBSUJpjEnWzsbEbKiKDQ3N9Pe3s4cUSSvshJx\n+vS0y+mSduD5/Qh1dQgLF46YPDKccDhMvduNpaODSU4n8xcswFlainLWWbqD3AMPQE4OSk2NvhF3\nzjn0qiqOnBympDhP609+ojuk1dWhlpQQ+cIXsGzbRv4779C7YQPSX/+KsmwZtp/9DLW8HPnjH9cf\n99BDCAMDhO+5J17oXC49uo5tk8/PR/zd77D8/vcoF1+M9be/JXzzzQQSfajabMif+QxCfz/TFy9m\nxyc/yerVq+NMgFwuF21tbYS9Xibv30+B10uRoiBbLFiHRE96/XWszz8Pg4OUdnXBnXfGtVEDelQ9\n1F5teeopxJYWQj/4AeL+/dgee4zQ7bejDaU8emtqCG/apA+EHSMTaQxUotrhdMZsGbXqxtBawy0u\nnVpkt9ttCvJEx/CzMAQ5EonQ2NhId3c35eXlzJ079+ibWBA4tHkzxQk2mBIx6qaepun52smTE4q3\nqqq0tbXR3NzMjBkzWLt2Lbb770f8/e+xn3UWnH32qOcQO6V6BNOnIz/5ZMqpyYqi0NjYSGdnJxUV\nFcz45jc5dOgQU61WCAQQ3n8f4dAhpBdeQL7nnqMPbG4md9++qEdxMtQzzkDt74d//3e00lK08nIs\nW7ficLlwlJSQ8/3vI59/PvKFF6IZ5juA9I9/IPb0EFaUo00ufj95a9agLlxI4JVX4p7HtmWL7kPx\n2GOoixahpsjtGub0EN9uG2sCJL7/PtJbbyH+7W+oXi99F19M69y5+N57D4Ci3FxKTjsN+ayzaO/q\nYkkCrwXrf/4nYn09oXvvRbnkEtTeXsjNRejqQmxs1D0/jPOwWsclxpAdQc6WF0S6zRzD5/AZGNF0\nX18fwWCQnUNe4MZ4J6MMMjaadrvdZpXFRGC0qSHy0NDQpqYmOjs7mT17NuvWrRv3mzfaSNLbi/j/\n/p+en43xfxV/8Qss3/0ukWeeQVq5MiqcmqbR0dFBY2MjU6dO5bTTTou2tyqbNxPYuJGBNG/ToqOZ\nkpGklTT2w6CsrIy1a9dGLyAjDSLs2oX0gx+gbtqEVlODtm7d0ef97W+Zfvfd9IXDsHlz0qdXVq/W\nBdUQVVUlcs01eHNz8U2bhjp9Olp+Psr69Yj19bqrXE4Ooe98R+9yi72zsdlQVq/WmyqI79BUFy/W\no1GHY9RNwXQQ330XafduIp//PNq0aUyaM4dFQ+3qqqrq9buLF+NyuegPBNi+fTt2uz0qFE6nE2dF\nhb6JZ7HETy+54AL869fH5YqzNenjWNUyZ0omlpmJMKJpY/zavHnzEg6tDYVCRCIRnnnmGXp6emhu\nbqa8vHyE0VgmtLS0cPXVV9PV1YUgCFx33XV8+ctfHvN66XLKCHIqJEmiqamJwcFBZs2alZYQZzI1\nxOfzIf3sZ0iPPIKckxOXx9WqqvROu2nTotF0d3c3dXV1FBUVJRzdRG4uwsqVqP/8J8Lu3WhFRfEW\nnAleXyaTS1RVjdqVTpkyJe7DwEAURcTmZrQ5c1BuuQVt5UrUT30q7hht/XpClZVEkuREAQgGsd9z\nD2pVle5hDOB2Y3vgASZZLOTcfDOBN94AQUA8dAjL888jf+ITaJMn49i0CeW00+Knf
lgs8f+PIbbx\nJBOS/Z7lK69E3L8f6f33CcdsggpNTVjr6sjfsIH8/HyKi4sJBoMsX748Whft8Xh0p7biYqSSEpy1\ntXE5UkmS4jfuAoGsRKYTSZCz6WMRGygkiqa9Xi+XXnop9913H88//zy3334769at45FHHhnTc1os\nFh566CFWrlyJx+Nh1apVbNy4kUWZ+JKP5XmP6erHkYS+u0N52a6uLkpKSli3bl1ab5CkJvVdXYjb\ntqH++79H3ccMMVSuukpvQR5WbaCdfTbyUNrB1duL2+2mu7tbb3OOjRqHIUkSuFxY7r0Xbc4c5Kee\nQnzxRYQ33tDHJeXmQiiE+D//g2XhQtQ0rAlBv9i2b99Ofn5+/NSSUAjL5z+Punw56ubN2FpbmXLj\njYiXXILywAMJ19KWL6ftpZf+P3vnHSVFnb39T3V1mJnuyTkyTI6EGYacEUUERcGAKAbAiAkT6ipg\nQDAHVEy4iqCg6MqSlrASJOc0wwSGyTmnzlXvHzXTTkR09d1d9/ecwzmc7prq6uqqW/d773Ofp9v3\nHL9HK9dXas9WcHHB3r8/9sZGRaKzdVupd29st96K1GpYarvySuztMvI/CnLrwIZQV+ewjwKUevmY\nMV1+I+2LL6Leu1cR1I+IUFTarFbEw4dxSkrCycenwzSczWZzNLJKi4vxWLkSo48PzVdcgcFgwCcn\nB5+lS3G//noYOPBf+i6SJKEymVBv2YJ94MDfVAL5T9OxuJRM22AwcNVVV7FkyRL++te//surjcDA\nQIfAvaurK/Hx8RQXF/9fQP41aFu+SpJEYWEhhYWFBAcHEx4ejpOT0yVfHD0FZHHNGsQXX8Tm5oY0\nbVqHbQkPx97Dkqaurs4x5uzk5ERSUhJCVhaaSZOwPfww0n33dftdrM7OSnYaHIywezfiggUIlZWg\n0WB/+22EAwdQP/YYrrfcQsXMmV0/WJYhPx9CQ2lsaSErKwuLxUJSUlLXxofFgpCbi9D6uuTri3HU\nKJzj4xHnzEGaNw+5nYloG9rq18Lp04rmcedSgShivf32jq9ptZhXrKC5tBRLRkaH16WICMTdu5Ei\nIxW9ikuFJKFZtgwpKgr7xIlQX49gsXQMsJ1hs6H+4AM8nJ3Rffst6vXradm8ucMEnn3iROU/dXVo\nVq5EaGzENnWqMunX+uCQZRlDdja6deuwPPBAB2qb6tgx9F9+iWbePDxCQsBoxKmoCLtWS13v3srS\nW5IQDQaaNBqOHz/eoeTRZciisRFVerriiNKDHrKmogL1xo3IajX23xCQ/9My5EsVp/89dZDbIy8v\nj+PHj//h4vTwJwvI7QNxQEAAgwcPRq1WU1RU9LuI1NunT1dUuSZMcLx2saZemwiRJEnExsbi5ubG\nvn37EDIyEA4eRPbwcPipdYYgCGhraxF27EC+9VZUq1cjlJcjxcai2rEDe1MTqg0bkL28ME2d2i3L\nQjh4ENVTT5E7bRrlaWnExMQ4nLe7wNUV644djptcMBioevppwk6eRNy5E/nyy7sNyIIgoDt0CPXz\nzyMPGICtnXsGgHrlSrSvvILp008VayejEaGuDuH8eTyWLsWlrg769EE8fhzZ2xspIgLViRPK+W4N\neL8EWZahvh7t++8jRUdjnDgR9bffoqqowPLYY10HMWw2tEuWIAUHo121Cv+ICOxXX60Mhej1aJYt\nwz5kiMMGSty2DfU33yhOIIWFChOkXflGlmWM0dFKMG51A+/mIJXz8c03SL6+WB9+GL1er0hFXn45\n0mWXIR87RmJioqKM127IQhAEB2/Xd8cOPD/+GNvYsVifeqqDsS0o94AcGYn5uecu6sN4MfymQGo2\nIx47hj052VGK+T0NTn+NL9/vqYXc1NTE1KlTeeuttzqMl/9R+FMF5JMnT6LX67vURDUazS8quLVH\nj1Q2X98uddTuJvXatC9MJhPR0dGKW3E7iK+8g
nDyJNbNm7uOLreDprYW1bFjyGlp2JcsQbr9doWf\nmp+PauVKRVw9MhIhLAx7WZnyR7IMZjMWlYp8mw2XuDjco6Lo5eICHh4dHT46QTh3DtWXX2J/+mkE\nQUDMy0OOjsayZQv07q3oMKjVHShogiAg1tQgBwdjv/VWsNmwHzlCaU4OVnd3Avfuxam0FCEzEwYO\nRHXiBOpdu9C+9hqyLNM0dCg6Jyc0X3yBFBKC5eWXlVpz50ZkTQ2adeuwjRyp8HhbSy2Om8/TE+Oa\nNQ59CKG6Whkw6Q6NjcqSvk8fjCtWkJefj/uVV2K76SZUmZlo33wT+4kTmD75RNlXRQVCczOWu+9G\n99Zbymg5KFn5xx9j1+uR+vZVmpdtaGhA3LcP+5AhmN944+fXnZwUe6hO368tK9VqtXh7e3cYsmgv\nAFQSE4M5IgLt/v1Uf/IJcloaTlFRDtGlNufqX62z0elYfm1AFo8dQ7twIZaHH8beOop+qS4fvwSb\nzXZJGscmk+lfauR1htVqZerUqcyYMYPrOhkR/FH4UwXk/v3796iJ/EfZOLXftv2Yc0/aFwD2+fOh\noEBRXLsImiMjsX73neLWoVIhtwq2C8ePI65di+zjgzRkCCq93hFk5dWrMa9dy+nZswlOTSXw/fdR\nP/sswvHj2D7/HEEQemwAqr74AvHjj5HHjkWVnIzbV1+h0usVPYzCQtT33490yy1IrTxhUEoWDePG\n4TtzJpJOR/GRI4jLl+NfUYEtIoLCxx8nd9Ikav38EI8exUOtxn3gQHzmzAGNhtzLLiPeYMC8aNHP\nQaobHqnQ3Ixw4QLOy5cjGI00nzrVZcku9e378zkeMwapVRNa3WrLZJ07VxlY8fTEuHYtsqsrwqlT\n9Pr6a4TkZOSQEKSYGEyffOJgcQDYbr4Z27RpIMtYZFnhYYPy8GtpQVCru44ZnzmD9sMPsbi6Ym83\nkGNrd+7a42JN5A4CQEFBkJqKkJ6O25tv0mS1Uuzv7xBdatNfaSt76PX6Xx1cf4tbiD05WQnG7Zb1\n/79LFnV1db9bFivLMrNmzSI+Pp558+b9Lvu8FPypAvL/T5F6ITsbcckSePxxrFYrmZmZVFdXExkZ\n2XXkWpaV8kLrElyKiUGIje3wPjU1SubU+absJmjLo0dj8/dHfOklhLo6Rx23oKCAxupqeoWGkjp8\nOKo2itaNNyKkpaHavZvw77/HRRAQe/fGvmBBh4k6+zPPII8bh3T55ajKy6m94QYMwcFQUIC4bJli\nM9SpxCIIApIsU1pXR25uLuGVlQQHBmK7/35yqquR7HZiWhkKNpvNsRxPDwigsbERs9mMnJ6uBBCt\nFtceGjhyaCiWZ55BVVGheAt+8kmHQNcGcdcu1GvXYlmwwKGhrCotRSgvVyRGW7+v7O+P7qGHkFxd\nMfv5/exE3dCAUFvbUSnOakV15gxSVJRi2+T4MBHrww9jbGlByMvrcBz21FTMzz2naDNfAn5V3Van\nQ+7XD/u8eTgFBhLRbpV16NAhhzh9cXExzc3NyLLsmIRrz93tCb8pkBoMjsz4X9pPN/h3aCHv3buX\nlStXkpycTL9WpcLFixczsa2n8AfhTxeQu8NvEan/xYB89izCjz9S0b8/LTExGAwGYmJiuj+GsjLU\nc+ciJScjPvss9tJSdN98g/2mm8DfH2HHDtQvvYRt8eIOPF/oIXNSq5GTk7GtWoUsCFSXlFBTU4Ob\nmxux993X5eKVk5ORk5MRH34Yt0OH0NbXo7JakW6+Gbn9g8HLC6n1glOpVJhCQ5EjIhD27UM4fRr7\nU08hjxzZYd/Nzc0UFxfj5+fHgKQk9LfdhpCZifmmm9idl4e2pISYmBiQZXQHD6IJCsKzdTltMpnI\nzMwkNDSUxsZGKisryc3NxW634+zs7Agerq6uCvlfr8f0ySeojh9H9+CDUFJCzPHj8MYb0CbEX1mJ\nqrgYWgVuA
Czz5inBuP15MZlQ//gj9qgoCp99Fv/WEoF65050f/kLgCP4CoWFaFavxnrddYoMp9n8\nM9e8NQno8hvpdEid68nV1coxdNM3+DV6DW2f20HS1PGygKenZ4cyWXvubk1NTQeVts4NREEQ/uOa\nev8OLeThw4f/YU3Ci+FPFZB7wm/JkDvXnIXdu1E//DC2997DNnAgeX36ULt4Md6DB6MvLib4YlZC\nAQHY3ngDOSICtSQhbNyI+NJLyG5uSHfcASEhyImJyJ3qyW2Uup4uxupW9obBYMBDlonJy1NEh3rY\n3r54MeevvRZ/oxGPurqL6iW03ZgA8pAhSrOu3fE1NzeTlZWF2WzG29tboQM1NSEnJVGdlsa69esZ\ndOWVhLY15urqED77DDE5GVsrG6UtAHWnIGY0GmlsbKS+vp6ioiLF9qlV5jFszRpcMjKwDx2qSIoC\nWCzKSPTUqYq/X/vapSj+7B7S1IT4z38ipaZiXrQIu83WIRDaxoyBxYuxDR+OKiMDVU4OtiuvxHrX\nXUihoahOnMD5llswz5+PNGAAspubUvroHEytVrSvvqo0AGfMUETpn34aPD0xd0Mj/NXSm2Yz2pde\nQkpIwHbTTV3fN5mUkfSUFFTe3l24u+1V2toehm0OIm217Pr6evR6fZfrT3X8OEJDg2KtdRH8ETzk\ni6Guru6/emwa/i8g97h9F9cQiwWamqgsKiJTkggMDKTP9dcjiiJFJSUX36EgILUK0ognTmCaPBmN\nlxfS5ZcDIMfFYVu2rOPfFBcT8eGHMG8edHK5bWhoIDs7G1EUSU5OxsXFhZz161Fv2YJtyRLknsa+\nXVyQAgKweHsjdXaRrq9HOHYMwWhEGjy4Y/NPEBT36/x8TLNmkXPhAg0NDcTExCDLMpWVlcp2BgP2\nBQuwHT9O2ltv4Tt6tKOm11hdzT5vb/zi4girrsbLywtBEFDX1yMUFiKHhqI6dAhx1y6sc+bg4uGB\noaAAoaRE4QILArYTJ1A//zzaI0doDg6mzGQi7667CNm5k97vvUfDa6+hHTsW1UX0JITGRsSMDOSg\nIDRffommoQGWLkUoL1dYLMnJyGo1zjNmIHl7I2ZlYU9IQDh/Hvz9kfV6ZD8/ZBcXtK+9Bg0NuLm5\nERAUhOraa3/OigUBWa12PAjEv/8doakJ69VXA60i9yEhDlpem+N0B8gyNDV167iNJCmO4I2N3f7U\nqlOn0L78MtZ77um2bt2TSpvdbic3NxeLxUJZWZnDNNXZ2dmxWvH/+mvUlZUKR1yrVR62ZnOXhOL/\ndw35v11YCP5kAbmnDKNt9PJS0TmAy7JMaWIiF1aswNffn4Hh4V04ynJNDapTp5Ql/UWWe6IoYndx\nUQK0LCNkZCjiMu1res3NiGvW4HbmDHJODuKqVchRUTRNn05OTg5ms5mYmJifl2e1tdSmpmIbOBA5\nJaX7D7bZoKamR5aF+PzziCtWgChinzsX1QMPdDxn587RdOwYJxMSCI+PJz4+HkEQqKmp6bidRoNX\nYiLuDzyA2dsbqaICtFo8X3iBAdXVnEpO5qfVq4mNiSFFEJC2bCFfoyFkwQJUotiBxaF9/XWFZeLm\nhuXll1FptWjNZqQpU1AHBBDa0kKNwYBercbm5UV5czMVrToT7bUOfJYvR11UpGgbBwZiuesu0Gox\nvfEGktkMLS2IO3YgHj+Oef58JchJkhLMbDZUGRk4PfYY5nnzsM2ZQ8v27Wg++AB7v36IW7bgdPgw\nblFRiM7OPwdktRrrE0/8fE3t3q08XIYORSgowPn225F69cLyyCOK0Wn7koUsozp1CtXBg2g//RTT\nW28hdR6QcXbGvHQp4rFjUFfXpREq9e2L5dlnsXdT1rgYRFFEq9Xi5uaGf2uA7Sy6VHH55dgaG7Gd\nOYNvaSlRjz+OoFZj+v57aKcf8mtU2i6GSy3n/F9A/pOiLSC3ZX/nz5/Hw8O
D1LS0bvmQoiiieuMN\nNCtXYl29+qIyl6IoKt1/FxeEn35Cc+ut2B96CPtjjzm2Uf34I6oPPqDmxhtxHTQI5xdeoDE9nZMJ\nCQ6R+rYLVDh0CPXs2fhdeSVCbCxyc3NXxTFAnDcP8auvcFq1CqkbJTvpppuU2qi3N9KECRjeeYe6\n4cORY2IUvYuBAwm+6ioGxcV1yOTaB/i2wGzX6aiOiEC1di1uKhW5jY3EbdpEUHAwak9PbP7+ZO3b\nx+A9ewipquJ8//74qVRUBAejnz3bUb6wPP444saNaN9/H6G4GPvUqZhWrfq5JCPL2M+exfXMGViy\nRFHse+ABLHPmUB8X51iKN4oiGr0e+wcf4FFWhhAfj1tuLrZ587C7uCCcOoU9LAzN66+j7tcP9bp1\nStbu54cUE4PTrFnYe/dGvXevQu8bPBjNihVIkZHYxoxBd/gwhtOnESorFW5wNzC/8AJCUxN4eiIb\nDFjuvBNx717UmzYpjJB2dVshPx/tyy8jRUQgq1Q4zZ2LaeXKLg1C1YULaD76CGbOxN5ZA1unU1YW\ntbVKpi0IiglsejpSbGwHvZXO6JzZdhZdagu6VqsVedcuBJOJxshIzhQUYKupUTjTej1Wq/V3y5Iv\nBQ0NDQS0M9L9b8SfKiD/K5rI7aFWq2lpaeHQoUPo9Xr69et3UX6jWq3GPHUqKoMBuR31qjto6+rw\nmDMH8fLLFZpVbGyXcWtpxAhYtIjakBBqampIf/55QmNiGBwV1dW7zscHOTYWXXU16oULsRkMHWhp\ngDKt19KCHB+P7OXVbYYsp6U5eLTCP/+Jy7p1qDUaDhgMeHp6kjZ0aNdRcn5mtrRNSEqSRH5+PitW\nrEBVUsL0adMQL1wAwFZbi7VfP1KcnDhx/DgHBgxg5NGj9A8Lo6i8nNdeew1XV1ceHz8ev7w8rDfe\niPTIIzRNn46ubTncfukqCKirq3GbNw/Zzw/z+++j3rkToaUFj7lzcW1TdHviCWRZRnziCYRTpyjq\n1496nY7iU6dQ6/UYjUYaCgpwrq5GyM3FMncu4smTCpdXEJTAGBeHYLMh7t6N9u23MT/3HJrvv0eo\nq8Pu40PFVVfhcf31Px9aVRVYLMhBQcoLej1yG61Po8H60ENKbbnNxaXdtSmHhGCdMwcpLg77+PFo\nVqxQhog6QYqOxnrDDag3bkQKC0OKiurwviojA928eVhnz8Y2dSqqgwdxmjcPy733Yus8Pdl+v5fY\n1NNoNHDbbVhGjECTkkJKSQni11/TMGYMLq2yomd0OsdgR3uWh7Oz8+86wAH/lyH/V6Etk/ulp3V9\nfT2ZmZk0NzczcODALiIm3UEURazR0Wh7mtJqfxyurlgSEnBKTET84ANF8L3TSKbk6krB8OGUZmcT\nKAgMuPLKnm+QiAhs33xDwfbtBE+ahHTZZV02Ec6fB8D2ySfIOl3P2smtaEhJIf/xx6kJCmJwSwvO\nq1YhTZ+u1Ka7uYna+K9tbJYtW7bg7OxMv6lT8R80CM+RI8kPCOCzTZsYlp5Ozpkz1NTWUh8RgfPh\nw6gFAS8vL+Lj41m5ciX6HTu4PyEB1yuvJKewkK1btzJ16lSC2oJbKy5cuMDhvDwShw1DV1CAFBlJ\n2ddf4zdzJrq//AXb7t1Y77oLuVcvpUm5aBGC0Uhga800iNbm5J49GF1cOLVsGbVeXqh0OgwTJ2Ko\nqlLcLZ5+2nHdqD/+GDE9HSklBXtNDfZBgyh/5BGF5dIuIGreeguhshLze+/12GSV29XxOwRBtdqR\n8dqDg7GPGYP2+ecRGhoUAaV2OiFydDRs3oxgtXZJOGRPT6T4eITTp1HX1iK1ikAJv9Dz6CmrFcrK\nlLr0zTcjtV2zXl5IrcM4qsJCNPv24ZqSgsbDA9HJiZSUFGRZ7iq6ZDQiimIHwSWH6FI7/BrGx/8F\n5P8wXOyJ2+Y+3VNAbmpqIjs7G0mSiIm
JISMj45KCMfTcNBS2b0e9aBG2999XfOxQAnLlwoUEBwcj\npaR08JyTZZmysjIuXLiAr68vaatW4ZadDVu39qxnXF4OFotSl+4mGIMicGRPSICAAFSFhT0GZJPJ\nRHZ2NkajkZDx45GqqnBZsADV1q1KeSAlxXEccnU1cnExuuhoXFxcOH36NLbmZrxyctCbzUSlpTFi\n2DDkceOo9/Rk1803c6isjGkvv8ygkhLOTprE91u34llXR7ynJ2EeHvTu3RutVssPTU34xMQwe9Mm\n3N3dCQwMRKvVYrFYUKlUnDp1ioKCAnQ6HZnZ2RiTk3HKyUHIzsZ59myOxMUROGIEPmvWoBoxAnub\nGp2Li+K43Q5arZbgnTvpde4cprffRu7VC5vNRnNREQ0mE8UNDTQ3N2M0GtGePk1YXBxOX3yBwWBA\nvuMOZSc1NV3OpW3SJKVE0RaM7Xalt9DDNfqLqzejUflntSplh6QkRdIzJgbzO++ASoVUW4vvgQMQ\nH69k5AEBmF9/He1LL6FeswZUKqVu3s0IPAB1dajy8pC02u6DYGMjqvz8Hicg7YMHY1q2DDkw/q+i\nRwAAIABJREFUEPvYsZQePkwIyn3p5OSEk5NTz6JLpaU0NTUhSVIHzrROp/ufcQuBP1lAvhjapvU6\n14B7GnP+NU3AHvUs2pTM2t1oHaQyIyIcr1dXV5OdnY2bmxupqak47d6NvGcPKlFEuoi0puammxSG\nwAcf9HxTiyK01ta600622Wzk5uZSVVVFVFQU/qdPw/33Uz9nDrZly1AdOKA0lZydHaUJ+bPPsPzj\nH/D++8S2cZlzchDWrMEaH0+z3U7Nyy/jkZ+PpaoKvV7PiBEj8GppwfXcOcL+8Q/6GAx8N2gQMwYM\nIAyorKzk2muvZdSoUQR7eXHuySdpCAri+jffpKW5mdLcXIKjoqiqqqKkuJgZycl4Dh6McfhwxEce\nQSwqwrWlhZDISLZERyNNm8a0sWPJSk8nIyODKQYDThkZlN9wAwYfH7RqNdLevVT27Yv/5Zdj8fdH\nA2jq6wl48EF8hw3D8uSTAOzcupXI117DNSKC/Pff58KFC9hsNpydnR2/qclkQifLCE5OSK1TlYAi\nKHTHHUgJCViee67Lz6M6cgTnkydRXcRvz7J4MQDqb79Ft3Ah5ldewXbVVYoU6FdfYZ05E/WuXUS1\nPVQmT1b+UBCwzJ+POGQIquxs5bh6CHDaZctQ//3vaP/yF8RudETk6GiMq1Z10c9wQBAcJZpfWoWB\nck96eHh0CKKyLDs403V1ddTV1dHc3PzLoksoAbmzTMF/G/5nAnJbhtyGNkPTNmdpHx+f31zT6sl5\nWh43DmunZosoih04zg0NDWRlZaFWq0lOTnbM7AtZWdg9PKh46y38LpKp22+6SalXajSOgCzs3Qs2\nG3I3PNH2TThJkigqKqKwsJDeokhUQgIqDw+FUmUyIdtsEByMNHUqss2GsH07UmgoUkQEpa6uNFRU\nIJaVEdOrF0JJCaqDB7EHBrK9qYnQoiIGBQezffx4qlxcSIiKIjk5mfLmZnKbmmjZs4deCQk8068f\nnp6eSJLEtGnTkGXZQZXLu/9+1K3Z2vmvv6bmhx/Qv/IKEyZMoDI4GOHFF0kMC0MfEYEUE4Nq2zZU\nFgvegsC4sWORx49HpVZTXV1NaWkpsosLLWfP8vF771Evy1yXkEDvV1+lODKSglGjMH//PZNeew21\nXo+9Xz9FGrR1gjI5NZWahx/Gq39/ZdAFkOvq0N5zD2VpaZTFxFCzYQPhy5dTccUVVD3xhCPLc1Gr\nlVp/D8FC+957aI4exZafjzhgAHJ4eNfGa+u1aU9LwzptmoM9ocrKQvznP7EPH07LqFGUzplDYKee\nBGq1MtVotyNUV3dwZXGgvh6hrAzbmDEYAwJ6zkovQVOi7dr6Lc28NiElvV6Pv78/jY2NFBUVERkZ\n2aP
oklqt5vz58zQ3N/8uGfKdd97Jhg0b8PPz48yZM//y/n4N/lQB+ZdcQ6xWawcfvYiICAd961/B\nb/HVa6mrw/j449TFxxM9a1aXCSPp3nspGT0a+Rdm86W771b2e+gQUk6O0nWvrETQ67GNHNllidw2\nZl1eXs758+fx9fVlUGQkLsOHI/Xrh+3bb5HHj8c4fDjG9HTlbz78EMliQcjLQ9W3L3JkJEF2O0GC\ngF2tRr14MZq33sJ27bWoPDyY9tBDSMCBkhLmf/opTk5O/Bgf34FL+m1pKZkXLlBYUEBOXR0T4uLw\nzM/HcvnlNHh64urqysARIxw3da8BA3Cvr8fJw4OTJ0+iAuIefhiXbdto3LULa69e6NLSsCcnI02Y\ngIePD7IsY7fbGTRoEKnx8Yi7d1N1//1Ebt2KrrISMSqK/GnTqHFxoX99PWJ1NSqTCdzdsV1zDbrn\nnkOOicE2ZQre3t7o77oLQaVC8/rr2IcPRw4LQz57FrW7OwEGA4FlZahDQ/FMSsLu6UljYyNVVVW0\ntLQg3nGHkt0VF+O/fj06tRr7nDkAmBcuhAcfxPfzzyErC/uwYd0yYQDk8HBsN92EUFmpMD7GjcMU\nHY3cqxd2o5HqiRMJ7CYoqfLz0Xz+OVazGXtb9tz+/YICxCNHsE6fjlWt/pcn9brjDqvOnkXcsgXr\nnXfCJWaybVN6FxNdKiwsZN26deTl5TFkyBDCw8OZMWMG17drsv4a3H777cydO5eZ3Una/sH4UwVk\n6FnPQqVSUVxcTFZWVlcfvc6QZXRlZUpWeAmE9F/j2CHLMhUVFdRnZpK2bx9+goC9u3FPlQrBwwN7\nc3MHDYaLHYNsMiHU1mKfORM5IaHbeqXRaKS4uBiz2fyzQL3Vin3ixA4cZt3bbxNUXIw9ORnNyy8j\nGgyYNm+m1m7HWFpK0O23I48aRYbFQu+oKLz79sX25JOY/PyoKyrCZDIRHx/PtddeS1NTU5dzbTAY\niEpPx/fHH9kGJBkMuDc1URccTNWwYdRs3coFJyfMfn7YbDaOHj1KUkoK+sJCYmJiHEvTcyoVX3/3\nHWlLlzIxJQXLnj0AaFqZHypBQPjb36C6Gs3KlTSMH4/TgQNcFhCAOioK4cwZwo8eJbCiAikwkJpH\nHuGnadPQAKHXXENsaxnhwoULfPnll1wVEcGId9/FfuoUjZ98wltTpiC6uTF11Cis11yDyc8PlUqF\nN3QIHu3rpeovv8RmNnOkb19cWvnS2mnTcA8JweXRR5F/wXVbu2gRQm0txk2bQBR/1ki5SANMiorC\n8tRTSD2owElJSZi++AIpLAz7mTNdzUlfeAGsViyLFl3UWbv99+28DyEnB3H/fuwjRzp0Ri5lPz0N\nhbSJLiUmJvLRRx8xcuRIjh075hgN/60YOXIkeZ20Sf5/4U8XkDtDkiQKCgooLCzE09PzkuybhJ9+\nov+jj0JVFdx1V8f3Dh9GfP99bIsWOWyVuhu17gybzUZeXh4lJSU4OzuTMnGi4rHWzn23MzS1tYRM\nnoxq7FhISsI+c6bCHy0uhvYaFLTWhqOi0Hz4Ybdd/ZbmZrIyMjBarXi7uZEUE/PzMIpGg/2ddxzb\nyrKM5ptv8KqtpTApCffgYAq9vAjXalm/cSPFxcXMmzePWjc3ssaOpTQwkHHbt5Ofn0/hiRNERESQ\nkJBAU1MToigyYsQIVKA0IP38QBCYMGECKldXysrKmHXgAK5lZVgHDiRbFIkTRdyWLkXq04eWjz5i\n//btDP74Y7SjRmGZOZPMzEyampo4duwY1aWlhDY0cNX27Wjefpv6kSPR+vryww8/UFZWxpwbb0T/\n0ks0+vhQ++ij+PXvT9OwYaiDg6mrrMR53Tq81GrE06eRXFxwsts5FR2NTZK44OdHtJsbOVlZbH7r\nLVIbG3GZOBHT558jRUej0Wi4dcQIGgUBnJ3JM5v58uWXufbaa9myZ
QspKSmMbg3o7eulwjffIMgy\nqc7O8MUX1KSkUCuKFF59NXJVFU5NTR10PHQ6XYdVnOXxxxGMxi614IsyEkRRaQT2BEFwKNx114sQ\nGhoU+dVLRHdMDfukSai3bUP74YeYUlJ6rGW3x6/15VOpVPT+F6RH/9340wXk9q4hJSUl5OfnExgY\nSHR0tKNL/0uQw8NpSk7GKTGRzppYQnY2wsmTCKWlDnucNt5yd2gTzS8qKiIkJIS+ffuSn5+vXPC/\nIMCucnbGGhCAU0MDqrffRo6KQvX556h278b67bcdhH4cteFOF6/FYiE3Nxft6tXEW620PPwwtnff\nRbV1K9KTT3bIdhwNO1km49VXOfHuu0xeu5b0xERS1qyheepUPObNw83NjcrKSvQ2Gwm9eiG6uHBy\n40Z8kpM5efIkZ8+e5b777lO4p2fPciEnByk4GN2qVdhuuw05IYGSkhJO1Ncz6qWX8P/gA2rT0kiP\niGDdxo3cHxSEe0YGssnE0aNHCfb3JzI8nGKtFnWvXvj6+nLs+HGqqqqYrNeT9uOPnB0yhP0lJXju\n3cuQoUMRRRHXykrOzJlDwJ13UrZqFQ07djDi6quJ1uv5/KWX8LbbuamigowJE2hwdSVs/35yw8IY\nM2YM/TIykN3dsdvtGAwGkvz8SLNaWfvDD6ivvppoWSY3I4Owfftw9fGhOjYWbX09aefPs2vdOk7n\n5hIbG8v58+cJDQ1Fq9ViNBqVFUlr40vzz3+iOXMGbUAAHjt20DJwIK4DBmBycaGxsVH5t3UrFosF\nY79+PwfpuDhFCAjAZkPz3ntIISFI48b9LqJA0LX8Z37llbY3Lunvu6XOiSK2q69GaGz8xRXfRffT\nDcxm80UV7P5b8KcLyG1uzhcuXOhg4NkmnnJJCA0l78UXFRpWp7ek6dORxo3rILTTHcuijcKWm5uL\nv78/gwYNcgTuSy1vqDw9yfnsM+JCQhD271eadHl5ysMgJKTDtp3LJm0DGiUlJYSHhxMycCBCcTEm\nUcRt1y5Ubm5I8+d3OF673e7IjrwSEnCaPBnJzY2UESPQVFbiOWQIkyZNUpbeZWXo5s/Hw9sbz6ws\nXO+5h5pZs/CKjXVQm7RqNXHl5Rw8e5ZViYncNnQocivbI+fcOfL/9jfqH3uMyPfeA6CPyYRnYCDn\nsrJw7dMH5+ZmQmWZoP79KXn/fd555x2eu+oqtNHR1M2ezR133EG8tzfa/HxitVpCTSbKN2+mLCKC\nkJAQnP7+d3rt3EluaCihQUEQGEhGRgaNjz3G3bt3s3H2bHJeeIEPt22jRavlqvh4pLAwYn18MMgy\nZVVVqJqbCQoKIvCOO7C98gqGsjJCFy3CmpnJy/37M7xfPy4fOxZZlgnQatG5uHCuvJxJkyYxYP9+\nDsyfz+mFCxkxejSLFy9mSu/ejJwyBdnFBc1HHyFFR2ObOpVaPz+c//pXhPPnMV17LX59++Ln54dz\n6wRn/YwZjqaWoy4tirjqdPTOzkaFkk3+HgG5fclPyM5GvX698iDtrH9yEfQUSO2t+i2XCpvNhktP\nrI52+D2V3v6dEBcuXPhrtv9VG/87kJ6ejtFoJDExEf9Wp2dQxjzr6uqU0c9LQHV1NXq9vuuEniB0\ndAvuZt9VZWWcOXoUWa0mMTERv9a6IiiBsqysrMuQQ3ewWCzU19fjFxKijKuKIvKgQUizZnVpilRV\nVTlcI0pLSzl9+rSS2SUl4e7uzndnzpD+ww8kHj2KraEBXUAAqm3boKICe58+2O12bDYb//jHP5Ak\niV69ehHXrx/a+HgEFxfkG27ghMFAdnY2/v7+lJaXI6SnUyIIBPXqhejkRMmwYazat8+xdK6prUVO\nTOSYJJE4bBg5Gg1ewcE4OzsT0tBA2t/+hoePD6qffkL9yitU1tayPieHL1atwrdvX8TiYva7uNBn\n+HAMbm5ERUYS9OOP2HQ6XsnNp
aioiBETJiBHR6MKCkIqLOSflZUI/fszZMgQfEeOxBobi/aGGyjs\n25fdLS3s27cPsaoK78JCWmbMIGzCBJKDgth95gzfFhVhGDCAiVddRYOHB2s++YTSU6dIPnUKcdgw\nNJGRJN1wAyqVimNFRRTGx1NQX09BRQVxcXHow8J4/8ABbP7+zOrbF9uyZWSWl5MREsKgQYOIXLSI\nERs2oDl2DNuddyIbDEjDhiEHBJBZW8ve774jcO9ePD//HFt1NeIVVyAlJ2MfOxZVQAAuLi54eHjg\n5+dHcHAwfn5+aJ2dqUlJYbfJxIcffYSHhwc2mw2TyQSFhWjz8hCCghTlukOHkKOiHFluUVGRQ9mt\n7Xpbvnw5xcXFDGw1WxVPnkT9j38gpab+qoDc0NCALMv/cpCsaqVM/pITSElJCUeOHOGGHgwAfi3q\n6upYvXo193Xjd/kbsehSNvrTZchxcXHdciD/CJH6NrRlp/X19WRlZRH17rsMKSpC2rChi2aA5tAh\nohcvhuXLoW1goQe0MSIu9Rjq6urIzMzEzc2NtLS0Dks4jUaDT0MD2vp6Mp99lqT4eLQjRyI3NCDd\ncguCINDS0sKBAwcwGo24uLhgs9k4d+4cCeHh+AUFsWzZMk6ePMndd9/NlClTcKmspCUnh3XPPEND\n//7cOHUqEzw9SUxMxM3NjbVr1zJs2DAmTZ/Onj17OHv2LA0NDQzo25deixfjdPAgp+12wsvK0BUX\ns2//fg6MHo23tzeD5s+nqqqKEa0sE0EQkICKr77Cy8uLa2fNwmfTJkz33IP7+++jOnGC6rVr6VVa\nquhSFxcjnDyJafRoCk6fxsvVFV8/P87n5jImOJhgjQZBFDn95pvEfvEFE66+mgJ3d3bv3s3IESPo\nv2sXt+fmsq2qimKtltCZM5FTUhAA9Z13EjFpElecPcvazz6j9/79nN62jYMDB3LT3Xej0+nQvPkm\nvrW1BD75JGMHDcKyfTu9KysxubujuusuGpqacJ0wgaamJgrT0xVe7eTJmA4fpqK4GN9WsfeLCdyr\n1Wrc3d1xd3enqKwMeyvzJdzXl8aGBpxffBGT2cy5Bx6g15YtuOfmUh8Tgz40lKamJl588UVSU1MZ\nN24coaGhWK1WBxe+DVJsrDLyXVvb/UHIMuLWrcpIfrtGXRc9jOpqVNnZynj+r6DDXWoNuaFVde/3\nwPTp09m5cydVVVWEhISwaNEiZs2a9bvs+5fwpwvIFxOp/6MCstVqpaamBqvVSmxsLB6DByOkp3fb\nXBPz8tDn5iJUVCjecBfBpbI3mpqaKC8vR6vVduAyt8c1xcUIw4fTdPfdFBw5grWwkD7btoGrqyN7\n9/Dw4OGHH0aSJF599VVsNhsN1dW4nThBXUICoydPZsCAAVxzzTV4eXmhuu8+gmpr+SYnh6NHjzJ8\n+HCuvfZaAHJyciguLqa6uprc3FzOnTvHo48+SpSvL/n790N2NlZB4HOzmcCBAzEEBrI9PZ3LLruM\nHTt2sHz5ch588EG8WpuetbW1LF++nD59+nDrrbcyPi4OCguVEkthIUJBARvuvZdcf3+Sly1DWrkS\nafVqqubPJyU9HW1dHWHPP09qaip/feklboyJISIggEyrlaKAAIbeeCMZmzej1usp/e47fDZuRD1+\nPNVJSWSHhqJ68UX84+Ph3nvJz8+nsrKS3no9fu7uaPV6hh4/TlNhId5PPokhN5emQ4fY7+tL8GWX\nEbxyJaozZzi4bBm9Bg6k1mzmg9ZgeP78eTIyMhg6dCiSWs3qPn0Iveoqpo8f3+Ua++mnn4iPj+9W\nQGfw4MGE6fUEzZ+P9swZXCdORFdZiW3SJPqPGYM5Pp6GoiLqRZGizExaWlqIjY1FkiSeeuopbrvt\nNoKCgrBYLPRvLwGgUnVUIgSEggI0n32GdcYM5NBQ1OvXIwcFYekUkNsnBOLmzah/+AFp8eKL6nB
3\nxr9DC/mrr776XfbzW/A/E5D/CF89s9nM+fPnqa+vR6vVMqDVa026yDJHvuUWjvv4kHYJtB9RpcJw\n+LBSnuimAWg2m8nJyaGpqQkvLy98fHx6NIOU7XZkux3RzY0LGzawIz+fifPm4e/vj6urK8Xff098\nTg7ezz2H5O3NqFGj+Oqrr4iLiVG+X0QEN954Y4ebo27wYCoqKjj5wQf4NzZy+sQJEry9ESQJs9mM\nWq3mm2++YfDgwQ79XfPbb1O/eTOnZBmXoCCyvLyoVqkY36cPz82YQUNDA1VVVZw9e5b09HSioqJw\ndXXF2dmZ6Oho+vTpo5i4hoSwPCKCXnv3cu0XX8Dx40Q//zy26mqOHj2KPjaWmOeeI2bCBITQUOxl\nZWg2bCBg2DBENzfkigrkPXvYLEkURkayLDCQWzZswDJkCM633IK9oYEgi4WGNWv4KTER923baC4o\noCgqisDcXAL69ydw7Vqud3fH8uabpH/zDYmDB/P2228TnpPD+Lw8Er28KH3gAfKvuALx9ttZuWcP\n0/z9iU9IICEhgbJz55i2ahXnvbz4a0EB15lM1EVE0OTp6XgQnzx5koCAAIxGIx999BGTJ09m2rRp\nGN57T3HnNpkQCguxLF0KYWE0Dx+OOjISrFbMb72FFBmpjC4HBuIUGEh7Tk9qaiqlpaVIkoRerycr\nKwuTyYQkSRQXFyu8aT8/5Lff7nAtCUYjQmUlgtGIrNNhefZZ5E5Bu3OGbJ8wATkigu9OnCBvwwYe\nfPDBSwq0/0tayPAnrCG3MQU6Q6VSUVBQQGho6CXtp7m5GavV2u0ops1m4/z58+Tk5BAUFERsbCzF\nxcWXtm9BIL+ykrCWFoTS0ou6Tgv5+fjccQfq7GykadM6fH5ubi7Z2dkEBwcTGxuL0WhEEIRul22y\nLGPr3x/7iBGoRJHUTz5hpFpNxGOPodPpqK2tZfOHH+KSl0dlVBS1rcyMn376iXEDB1Lk5cWFggIE\nX1+8vb3RarVUV1fz8ssvU1NTQ+/sbGa6uZHauzfcfTfqb77hSEoK2QcPMsrHB0NMDJMnTyY+Ph50\nOtLr6tjn70/c7NkMve46JEkiNTWVuLg4goKCGDp0KGPHjsXT05Pm5mZKS0s5fPgw69evx1mrJSIz\nE+cdO7C6uBBrteL9zDNcGD8en4kTaYmOJt7Dg6T33kPTqqjXxmpQf/MNcnAwxWo1Wb17k/DggxQc\nOYLHgQP0Gj6cGEHAa9QoXMePx/3KK5HXriV81y5Szp5F6+lJ1ZIlDKqtJfSFFyAsjJN2O/8oKsIz\nKgrPtDTsQUFUVlbS5O1NnlZLdHExdRYL9rQ0fKZNI3D7dgZu3ozhiivwjYxk/5o13HTqFPGVlQQn\nJjK6Xz9G3Hor/VsHN6qqqnjttddobGxkxIgRxMTEUH/0KIdXrqSvqytCSQlSVRWa3FyksDBqk5Ox\n9O8Pn32GdtMm5O7cu9tBFEU8PDxIS0sjPDycxMRELr/8csxmMwaDgZqaGn766Sc2bdqEXq/HYrEo\ngTYgAHnCBOTAQGVHBgM4O6PKyEDIykIOC+vag3FxQQ4NZf2GDWRmZnLFFVdcUqAtLi4mKCjoF5uV\nBw4cwNnZmUGdhLr+g/C/WUP+vST9unMNaaOwFRYWEhYWxuBWZ43fAvHZZ6GhAdvGjT3W1IRevcif\nMYPwVm83WZYpLi4mPz+f4ODgDpzq7oTn2x5OO3fuZO3atdx3330kJSUh/uUvVNfUUJaeTnJyMh4e\nHsz84ANHcN6xYwfe3t5cOWYMl2/dirW0lKDMTN7IycG0YAGjR49Gr9eTmJhISkoKwRMm4NXSgurm\nm2morSVj0CAio6MZsmUL/o2NvJCfz6A332TDhg3k5uZy0wsvkGKz4b9vHy4VFfRrx/W22Wx8++23\nyLLM9OnT8fLyYs+OHfRNSiJ24UJ8CwvxfPZZ9gYEcDgykpQ
ff0Sdns7qhQsZ9+qrjJo6FZfduxEP\nHcJ44ADv7N3LwIEDiYyI4HsPD0Z9+y2XrVlD6TvvoPXy4raUFJrPncPVz09hADQ1Kec+JwdtYCC1\nY8fidPw4/l5e6JKTEWJiMM2Zg+sVV9D31lsxOzuT0EpJa2hoYOzAgSx9+218LRaKLBbW+/lxaONG\nnnByYmJZGZLZTFVdHS0aDVXu7ux7/nlGLFnCSKuVwkcf5c1332WqiwsDBw4kMDCQh2+/Ha+wMDSi\nyKAvviDy7FlKNRqOzpzJypIS3Pz8eOq999D17o1UXo7FYuELm42Y2FiuMxiwWq0U799P7JtvYps1\ni/ToaEwmE327kYltaGigpqYGr+JiItesoWH2bLYWFXHixAmmTJmCRqOhurqavLw8bDabQzDIbrcT\nHh6O29/+hqqkBHNqarcsi+zsbGbOnIm7u7tC/7sEXKoDdn19PRHttGH+W/GnC8i/F9qXLNpT6QIC\nAhhqMqE+eRKplYf8W2B79FEEi+WiDQ6VWk3xNdcQNnAgVcXFnM/OJriykkGDB6PulI2LougoybTX\nJgZobGzk2LFjbNq0iaSkJKTUVFa99Ra5f/87r9x5J25Ll1I/ZQpvbNuGt7c3FRUVvPDCC1w/dSrF\nTz7J6fp6PAcPJu3669F7eXH06FFHVuvq6opsMPD9iROcjYxkckQE/ZycqNLrKU9KwrOwkMvnzKGw\nsJAjR45gsVjQ6XQsWbyYaw8dInX0aNzbUaG+/vpr/vrXvzJ0yBBIT6fO2ZmYWbMwh4VRu3QpHlFR\nmO+/n0+//JKKgwdxVqlI8vUloKiI6uXLOTlnDm4FBegXLaIpLY2CFStwcXGhoaGBnOZm4gGtRoO6\ndSUhHTxIRlYWpXfdxXWNjTjLMsbx4zm2bx91W7eSM2UK4+fNo2XnTr6YPx+PXr0oKCjgKZsNnd1O\nbJ8++Pr6sm/fPorOnuWmc+e4Raul5uqrCXnjDcI3beLYd99xqrwc34QEFmzeTOWcOSydO5e3dDps\n/v6Y/f0pa/X9i6urw2//fhgwAFV1Nf3//nfsQ4eSHRFB7N/+RmBEBJ6ffca7P/xAwtmzXCOKqJ2d\nkU+fRrtnD5qpU7nmxhvxz8/nu9WrOV9cTPWxY7xYW4vObufDDz+krq6Ojz/+uEvAfPfdd8nIyODZ\nCRMQamt57403KDIYWLBgAZHtnEAAysvLyWntHaxfv5677rqLwJQUtH36oCkspKmpCbPZjNTYiMrJ\nidqmJp577jmioqJYtOiSkkUH/lfcQuB/LCD/GkfdtppzZWUlOTk5jqWdVqtFM3MmVFUhTZgAGg2q\nr74ibO9eGDLEQSkSfvoJ1e7d2OfN6+LgoVKpkNLSLuk47HY7R44cIW7xYobY7airqpBGjcL+8svK\nBnl5qJ9/Ht2MGZjCw2lqakI+dQq377/Hdv/9EBrKpEmTEAThZ1U24IYbbqCurg6X06cRduzA6OmJ\nl5cXs2fPRqVS4e/vzwdLlnDZ/v14iiIVn31GlFbraChJkkRzczONjY2cP3+eL7/8EpPZzM1aLfLp\n06isVgLXreNIdjbHjx9nzJgxPPXUU1itVtzd3Rk+ciRvpqczJSyM9v3r8PBwbrjhBm7t14+qGTPI\n792bNDc3qlUqbnjkEW674QZmhofjHBJCoEZDsY8PcQ89RPDSpbjp9Qzbtw/1smVYw8IoGDqUu665\nhg9Wr2bXrl3cd9995Obm8rXRyOOtAeaowcA5b2/88/LYmJqK64ABpH/6KR4eHrwEVK1bEvLpAAAg\nAElEQVRYQfOxY8wrKqJvTAwfnTuHj48P57KzuZCVRW1dHQ8uWcInn3yCaLMxNDKS1MsuQxw+HKsk\nMWfOHO686ip0CxeSk5qKHBhIhNmMa2Ul5SdPYvjyS9J9fFja3Iz7ggXcD0SWlWE3GhEqKhSPv+ho\nPvr8cyxDhvDc889Tevw
4N994I6q4OKq//ZZ9u3YxoKQEn4ICNHo9QeXlaDduJF2v57xWy/W33kpN\nTAyfrFlDdHQ0o0aN4sK+ffTaswfd6NGoioqwTZxIzPHjTD14ENWUKZg++gi/r77CXxCIatV5Tk9P\nJzs7m0mTJvHD999T8f33jLvnHprGjmXYkCF4bduGuXdv6r29qa6upjw/H6fXX8cYEkL1nXcyYMAA\ntm/fzvr165kyZcovXvu/Bg0NDf8XkP8T8UsCQzab7ZImeoxGI1VVVQD07du3AzldDgxUFNYqKiA4\nGNX27XifPKloSbTWzFQbNqDatg379OnQaZSzbZDkYsdhNBrJzs7GZDLRp08f3AcPRjCbkUDxfLNa\nQaNByMpC2LkTXd++1Li58emnnxKQl8fDGo2i2Abk5+fz8ccfM3nyZKKjoxGOHyc0MxNLWhoHevXC\nf8UKeqWl8aKnp0N3uPTCBVI2bEAwGJDHjePNN95AamnhmRdfdPCqXVsZGs3Nzbz22mt4eHigEQSK\nKyooKi+nOTeX2tpaNm3aREtzM4NFEa/EREr9/UlLS2PFihWU1NV1+N7Dhw9n+PDh0NxMjl6P75Ej\nNN96K6WRkfTavJlRNhuu8+fzl7lz8Zk79+c65BVXIBQVoR4+XNECnj2bEIMBp9tv54mICP5+/fVs\n376dEydO8MQTT1BTU0Nubi7fHT6MU0AAqxoa0NXX43fmDMHBwdx77714e3vjsWgRafn5nLnuOj74\n4QdC9HoWjR9PkZMT5zQaPEJDqa2txa+pifAhQ3jx1CkeOnaMkpUr2RgQwJIFC+DDD2nIyEA/bBiz\nnZxI2b0b/+PHOX7jjfQqKyNyxAheCA1l54kTlERGYhs/Ht26dfT+7jtqbr6Z8pYWbr/9diwWC/lb\ntmBavhzNkiX0njSJRRs30rx3LxNeeQWDSqXIcS5fjjRqFNVFRfiWlTEhKAjnq66it7s7pTfdhMFg\n4J2nnmK20Uiijw+qM2doiI0l2McHH42G5taV2y233NLht9m0aRNHjx5l2LBhXNe3L+5ffIFzQQFD\nHnoIoaoKzZdfokpJQVywAJ1OR1x8PC6jR2MLCsIpOJgJEyZQUlKCyWRyuPG0dxHpfD9cqsMP/HkG\nQ/50AfliaKO+XSwQNjc3k52djdVqxdnZWenqd4L92WcRsrIczSLbm2+SsX8/fTQaxwm1P/000p13\ndgnGcHE6m7W6mpZnnqGob18Cp02jsbERd3d3ZcwZUN92G6qffsJ+770QEYF02WWYNm1CGxZGYE0N\noaGhlGu1rA4Lo/+PP6Ktq6PUZnM0/QDklSup3bGDF0JDuXfhQrxGjeLpp58mKSmJu+++mx9++IHt\nW7fytrc3tvR03vD1JbmwkBmFhci33KLoUbRiw4YNbNmyhTlz5rBu3TrGjh2L2WwmNDSU4OBgZFlW\ntAVqaqiaPZu/1dfzkSiyYMECPDw8KC4uxmKxdP1N9HoCb76ZU8uXczAigknTp/NEcjKYTEg+PgRc\ncw3bfvwR0Whk7LBh4O2NHBSE9aWXkOPjFfH/pibsgwdz2mwmLzcXqfX3bRMnmnvffdxcWEh4cjJP\nxcQwYMAAEhMT0ev15Obm0q9fP/wBp4IC7LJMk83GLV5eJJ46hfuAAXw9dSrTbruNEJuNl1xcUDk7\nU+jkxOmWFnIsFmJHjSL7zBmCCwsxzJ2LeexYPFevxpqQQHlUFO9UVjLrnXe48v77Udts3NLcjNO5\nc+zt35/MkyfxOXKEf+r1rKqsZN7UqfQ9cYLS1FTKRo/GydeXgwcPcvfdd1NdXa04bbu4oHnoIcQz\nZ7AVFXHfzJn41teTfvQonrJMs68vs2fPxmAwMP6++9CEhLDm9GnOV1Tg89hjjFi6FNvTT2MyGrG9\n8ALOFgu2uXNRf/cdtsmTufPOOxk0aBBlZWUER0XhtGgR9laxe9nHB/OrryK1CiPZ7XZEr
RbrPfcA\n4A64u7vz+uuvA8oKq033uLq62iEI5OTk5AjSv8bm6c+ghQx/QpaFIAg9Brv202ydYTabyczMVLSB\ne/cmIiLioswJsbUOJvfpAzodJfX1+Pj4/Jyx6XTQg2pXRUUFHh4eHYJQmwjShT17CP/uO/xjYnC5\n7DKKioo6HIM0dCjSZZchJyYiSRJ2SUL28kKlVuPq6sratWs5d+4cNf+PvfeOivLc3r8/05iBYei9\nN+kgCoKNYMXYE4m9RY0l1ng0RkM0xZKoSUyiKWpibCfRqFETewN7Q1REpBfpvQ112vsHMEdNPed7\n3vf9Lb+/ay2XCxbr4WGe5973vvfe13Xdvs3IxEQMzc1p8fXFwsICBwcHSkpKKLKxQdmtGwUGBohE\nIjp16sSlS5ews7NDLpfz66+/4uziQoiFBUWPHnHGxASjlhYcamvJCQ7G3t+f8vJyFAoFpgoF0jt3\neFRSwi+nTpGSksKECROwt7dv02ZuX1Du/v48Uqk4UVaG2MSEN954Ay8vLxwcHFCpVNy8eZOEhASa\nm5upqanB1NQUi7NnMTE0RDBiBJUPH+KXkYF1z57oevRApFCwdetWor/8Eru9e1FPm9amCR0cjM7e\nnpSUFGqbmjCdNQv7X35hYHIyIRs2ENmnD+7u7ogSE7G7fJnCTp2wHzSIycuW0atXLzw8PLBql+68\ndu0aBfv24d3URJm9PTMSEnAtKUEaHo7xzz+jycykQqslaOhQxEZGiB88wObHHznr5IQ4JQXzpiYC\np07FctQoso2M2LpkCf4pKajHjcN9/Xq8vL3p3KULwgED2KNWc6KqCkulkuybNymqqqJvURF5KhU1\nYjGzpk/H5uFDcuzsWH/pEmqBALVaTX5+Pl999RVSqbSNmerqilKhYFVjI5caGsjSarn5+DHXXFzo\nuXAhvr6++mduaWNDWWwsXe7cabOz8vamRiLh1q1bNG3bhr1Gw0ORiJYffsCoWzdknp58//33bNu2\njbv379P71VcRWViQnZ2NTCZj/XffUVZfj7+/P4WFhTg9Q+/vgE6n4/r16wiFQpydnbG0tMTe3h4H\nBwdMTU0RCATU19dTXFxMfX09VVVVKJVKWltbEQgEiMXi3wTq7du3M3v27P+T9Sz+d05ZwB9LcP4e\nOUStVpOTk0N5eblepUyfSf6Ra4hKhaCqCurq9N/6dyQ4n/xZnU5HaWkp2dnZ2NjY0HnUKAS9e+tp\nqs/WvXU2NmgsLdHW1aFubiYzMZF6pZKQa9eQzZ6NZ1wcMZmZbAwOpnHJEmwiI/E1MyMjI4MzZ84w\nffp0LD09qaurI+2XX7hz4ADut2+TnZFBcnKyPmvNy8vDZ9IkopYs4Y30dCorK6k0NCQoKIh9+/Zx\n9uxZNmzYgF1qKq/s3s0BW1vGLlpEYmIi27dvZ+zYsXh6epKTk0NsbCw6jYa58+dz/An9jB9++IGq\nqipGjRrFxYsXSUxM5OOPP0apVPLNN99gHBaGOigII8CktBT5pUt8cuUKmoAA3nzzTd739cX08OE2\n8kK7AasgPx8OH6buu+8427s3Mjc3/BIS8BAI8La2xt7FBZVKRcPevQSnpeG7cyfGra2IlyxBFRuL\nwNZW77AcKZWSI5Xyi7U1psOHI4qLo1NTE9pTpygKCMC8thaRUsmt+/cJiIrCKDwcZUAAwZaWBC1b\nhoFMhsTaGoDCx48pFYmQR0WRceYMe2/dYvEXX2BkZIQuOBi36GjqiopQv/ACjt98g41KhcH9+/Ra\ntIjeLS1IvbxInz4dZWkps2bNIiUlhYsXL7Ju3TqcnZ3x8PCgoaGBEhMTGkaNwjsggAe3blFnbMx7\n5eWIjYxQbN6MbtMmihYswK6wEOrrGZKURJOlJVMlEswTEvhk/HjS0tLY7O5Ot+7dMV29GmFVFd+d\nOIF1SQnjxo3Dzc0N16QkFMePc83JiW++/prX589Hq
VS2Ubb/AtXV1WzZsoXAwEBiY2P13xcIBBga\nGmJoaIi1tTUNDQ3k5ubi7e2t1/HIy8t7SpxeJBJRUFBAa2vrX9Kr/w5OnTrFokWL0Gg0vPbaayx/\n4n39/wLPXYYM/GFgrK2t1RsrdojvpKSkYGlpib+/PwqF4qmdNz8///czZFNTtK++2mZOqlIh/PVX\najQajOzs/tY4T2VlJYaGhrS0tJCUlIRarSYwMPBfmhcKhX76QnnqFFZpaRAYiC4lBXFsLIKHDzH4\n5hsqrl2j8PPP+TkxkdDsbBQREXgplThnZVEWGorZwIGIjI25f/8+69atIzU1lcmTJ+uzkoSEBObk\n5/PCnTsYDh/Ow8pKunTpQoifH3UNDXh6enLo0CFu3brFjBkz8PLywlAmo/bKFc7Ex3Pj3j1sy8vx\nzMzkor095XZ2TJ8+naNHj+Lq6oqHhwcSiYTa7Gyiz59HbmSEY4dCXXMzLg4ONKvVJCUlMXLkSLp0\n6UJKSgp1dXX07NkTJy8vvENCyMjI4J2vv0YQHk62SIShkRFNTU0o1qzBoqwMna0tmtdeA4EA0b59\nGHz7LTYGBrjMm0dCVhY53t6kBAUha+8D7N69G3lkJHGNjZzLyCCqogLx/v1oIiPB2ZnCwkJOnjyJ\nsLmZmitXMKis5EJ6OgOuXsXs+++RqFSkfvEFXyQn0+TkxOPYWPKvXKHG05MChQKnmhrEw4dT2Lcv\nwvYAox4wgJjUVEpdXJDcuMHgzExKRCIc+vVDtGcPnc6cIersWey7daPQ0hKJkxNOo0djUlKCYX4+\nNb/8gubsWRoPHCBo/HhC+/enS5cueHh44OnpiYWFBebm5tjZ2eHs7IyPiQldly1D1tCAwejRJD98\niNulS2jy8yk9cgTrU6cQVFbS4OpK8ty5OEZFMe/0aYxOnOCirS2lOTm4XLtGS3U19TodvxgaUlRT\nw5QpUwgJCaHTvXs0P3yI9sEDPNPS+Lm6moWLFtG93bqqqKgIB3t71M3NiJ5wKxeUliIzN8fL25te\nvXr9Kd25qamJxsbGNpElQ0NMTU31Oh62trbIZDJKS0vZunUrycnJ7Nu3j/j4eBQKhb4R+e9Ao9Ew\nePBgTp8+zYoVK1i4cCFRUVFYt2+q/0P8rQz5f1VArq+vR6fT0dDQwIMHD5DL5QQGBmJubv67tao/\nDMiglw8U3L2LeO7cNibcCy/86S4tSExEePUqZVZWFBcXU1dXh5+fH05OTn84JC/euBGja9dI8/fH\nrLERyenT6IKC2soBgwdTrVDgMG0a9tOmYRQeTmNYGJl79iDKzGS/UolarSY0NBQHBwd69+5Nr169\n2u5FIKBv37449uuHICgIr5kzGT5yJJ3FYgJXr6b3iBGYRkSwcOFCrl27xq5duzAyMsIxJYWQFSvo\nWVvLUZUKM0tLIuzsEEyYQOf+/fH39ycyMpKAgAAEAgEGSiU99u/HTCjkUlkZ4W+9hejaNcQ7dmBz\n+TLnHRxITk7m5ZdfxsbGhsDAQLp3705UVBTGhoYYGRtjampKU0sLg8aM4ZXRowkNDeXKlSt8//gx\n8oULKRw3jtLKSqqqqrh09y4tly+j7N+fj+7fp0+fPowfP56ffvqJ0tJSrKys2LlzJ4hEdL94EcuC\nAlw/+4zGvn1Z9sMPVLZfZ//+/bh26UK8VMrp0lJKDA15+cIFCouLORMUxMW6OoaFh9M7MhLdzz8j\nk0gIW7AAbzs7LDZvpqm0lAVHj5KQkIC1tTVGjx5hYGSE4ZYtGPr54ZSbi8no0ZzNysIpPx8DlQqd\nhweal17CxdYWp5gYWqVSMhsaqDM1xSY4mMdVVeSkpCDt2xfv0FDkcvm/JD2fgUQoxPj+fVzHjEE7\neDDrT58mLygIx4gIhCUlNHp5kejjQ97MmVQKhYjUaoJ37UJYVESnLVvo3r8/+0tK+E6loiQ6mnWf\nfcZAHx9kOh1CE
+/fRTbG1tUavVAqtEX19f\nUP57XIPiv+Dk8d/CP6up17x58+e+5lkz9EeOHBEoRiNGjKBz587/uIAsFouxsrLC2dmZlJQUTE1N\nUSqVxMTEEB4ezo4dO4iOjkZPT0/QFAgICKBp06Z1KjWUlZWRkJCAkZERgYGBSCSSGtvz+/fvC1ZZ\n1evRdeVey+XySjdnjeapnOi2bdsKOswAzg8e1Hhe6+lJRVwcekZGOIpEgvRn1Dvv0CIvjz+SkhC7\nu6ORSOjcuTOurq789NNPgvKdCBDJZFBRATIZaLWVAVkiQbV48TPXr6MS5uTk0Lx58+eOtOvEn0xM\nTIiLi8Pe3p6lS5dSWlpKVFQUarVamBZ9UdRFf+LIkSOEhITQp08f0tPT8fDwYOTIkURGRuLr68vc\nuXMBmDFjBvBkGePvrEHxsvDKBOS64Fkd4OzsbOzs7ACwtbUlOzv7L1njXw2pVMqCBQuEf0skEnx8\nfPDx8WHcuHFotVpkMhnXrl0TsrKkpCSsra1rUO+qC/xXH6n28PCoMa0nFosxNzev4aSis8oqKiqq\nk1WWVqvl/v37ZGRkPDEg8R+hlh2AdP9+Tp86Re8zZzBdswZZeLggg6pjdFRUVFSKAdWvj2V0NBa2\nthjUMduXyWTExcVhZWX1TNeaxxEVFcXHH3/M4MGDOX/+vMC8eP05WfifRV3YE7W95v79+6+8/sTL\nxN8qIHfv3v0JvzeoNMDs37//C39+9TpyaWkpBQUFT9SRExISGDx4sPDve/fusWjRIqZOncrChQv5\n/vvvBSW0pUuX/k/6dr0IdNlrly5d6NKlC4CgkaET+A8NDSU3N1fgyeoEePz9/euUSf8ZqywjIyMe\nPXqEVCr9U1SwukKhUJCYmIjayIjgGTMw+vxztDIZEhOTJ4KKVqtFLpdz4cMP0To40GDwYBQKRY0R\n5sdpbTqz1kePHv0pO6WKigqWLl1KREQEYWFhddoh/ov/ffytAvKZM2de6P3PmoNv2LAhc+fOpVu3\nbowYMQJvb2+WLVv2RNnCw8ODmzdvApWZnIODg1D3Avj4449fla5wnSESiYTjoDsWubm5fPDBB0RF\nRQkaHmq1+gmB/7oG0MdlP9VqNUlJSeTm5mJhYUFJSQlRUVE1RsFfRJq0uiFA48aNBd0IZRVr4GnH\nQZGfT6fDh3mor4/TV189QWtLTk4WGlcSiYS8vDzs7Ozw8/Or81ojIyOZNm0aQ4cO5dy5cy/9JlQb\n6qIh8bTXKJXKV1p/4mXibxWQXxQBAQFPnYPv168f27ZtIzo6mu3bt/P2229z+PDhZ9aRz549S5Mm\nTXB2dv5vfYW/DYyNjRk3bhx9qukzlJWVcf36dSIjI1m7di1xcXFYWFjUKHU4ODg8NzDpHDBsbW3p\n0KGD8HqNRiOMgmdkZPzHVlm6AQxDQ8M/bQhg4eDAtblzMak6J0QiEaamppiamgoNNqVSSXx8PLm5\nuZiZmZGdnS3cWJ5VNy8vL+eLL77g+vXr7Nq1i2bNmtV5XS+KZ107OvTr148NGzYwZMgQIiIisLS0\nxM7ODmtr61daf+Jl4pVhWRw6dIhJkyaRm5uLVCqldevWnDx5kgcPHjBq1CiOHz8OVHaCp06dKszB\n67q4+fn52Nra4uLigrOzM/v27aNJkyYUVvFKa8PIkSPx9fVl4sSJQCVvcuvWrZiamlJUVISRkRGN\nGzd+KoVOJ6Sip6cndJrhn0PB02q15OXlERkZSUREBJGRkWRmZuLs7Cxwo/38/LCsEvrRKa4pFAqa\nNWtWJ7bHn7HK0mq1ZGRk8ODBgz81gPFnUFBQQEJCAvb29jRq1EgIujoz0uo8bgMDA8rLy0lOTkYq\nlbJixQree+89pkyZ8lImI2vDs8493bUjl8vR09PDyMiIgoICXnvtNfbv349Wq6VNmzbcuHEDiURC\no0aNWLt2Lb17937qdfcPwj+L9lZ
XPKsOPWLEiBoBWKdEVRsUCgX29vbExsYK29ns7GysrKyYOXMm\nUVFRuLq64u7u/lQK3dPm9v/JFDyNRsPdu3eFAB0VFUVpaSn16tUjNTWVtWvXEhQU9EIjvzpFOV2Q\nVigUGBoaUlpailQqxd3d/aVLUapUKpKTkwWzgrrcTHQ62kuXLiW2SvHO3d2dESNGMGjQoJe6Ph3q\ncu5lZWWRlZWFr68vMpkMPz8/Dh8+jKen56s2zPEy8W9A/rPw8PDg/Pnz2NnZkZWVRefOnWtQpKrj\nyJEjfPPNN5w6darWz9m5cycjR47k9OnTT/2cpwXkP7OOVx0ymYxBgwZhYmJChw4diI6OJiYmBkND\nwxoC/02aNPmP6sU67nVubi4NGzYUhlm0Wi1mZmZ1nox7FvLz80lKSqJRo0Z1Fo6HSseMGTNmMGLE\nCCZNmoRYLCYtLQ25XI6Hh8d/tJbn4T859/r378/EiRMJDg5+ZkDOyMggKCiIa9euUb9+fQoKCvD1\n9eXcuXO4uLjUeO3NmzcZN24cxcXF6OnpMWfOnBrN9L8h/g3IfxYzZsygQYMGQnbw6NEjvvrqq1pf\nO2TIEHr27CkYoEJl5mBnZydQxyIiItizZw/16tWrtfTh6uqKpaXlE0IqUqlUeL1OyOdZpZNXGVqt\nlujoaFq1alXjsaKiIkHgPzIyknv37mFvby9wo/39/bGysmo6FdEAAA9wSURBVHpm8NONhTds2BAn\nJ6caAbe6ML1uOERnlaUrddTFKispKQm5XE7z5s3rnNWXlpayaNEiYmJi+O6772jatGmd3vcy8GfP\nvdTUVIKCgoiJiRHsrrZu3SoY8q5atapGue2rr74iOTmZ7777jjFjxuDi4lKrElxiYiIikYimTZvy\n4MED/Pz8iIuLe8J78W+EfwPyn0V+fj7vvPOOYHG0f/9+6tev/0QdWre1bdq0aY2LODMzE0tLSzIz\nM+nTpw+hoaHY2dk9tfRx//79GkIq69evJygoCKlUyr1794RaXmpqKjk5OU/UkTMyMggJCSE7OxuR\nSMTo0aOZMmUKwD+CglcdulFjXYC+evUqBQUFTwj8GxsbU1hYyL1794DKgaK6joVXdzgpKiqivLy8\nVqssqGSZJCcn4+Ligq2tbZ2yYq1Wy6VLl/jss88YOXIk48eP/z+pFb+ssl1JSQmdOnVizpw5vFll\nvKor24lEIubNm0dWVhZbtmwR3qNUKvHz82PkyJF8//333Lx5s05N01atWnHw4MH/6s3pJePfgPxX\n4T/Z9lXf6nl4eNCtWzecnJwECt7IkSP/reX9SahUKmJjY4mIiODq1avcuHFD0Hj+f+2de0zU17bH\nv5uHVm+gCK2KldeAU2BeIq3nHKSFKihSpLFNOdZKpsRAjHpre4PWQ6pY0+AF0kDS5iK2vdUSi6k0\nKigD2FP6OPJqa7EFD/YeLVHKoxW0haFhGGbdP2bmd2YYZhxwhuf+JL9k9pq99uzfz5nlj+9ev7WV\nSiWeeeYZhIaG3lfQM5YmNWrSGo0GWq0Wbm5uCA4Oho+Pj13jq9VqHDp0CG1tbTh27BiCg4MnPKf7\nwd7v7vDwMJKSkrBhwwb8l6G+x2ja29uRlJSElpYWM3t1dTUSEhJQU1OD+Pj4e86pqakJSqUSra2t\nDq+9PYnYFZBn7NlNZ5KTk3HixAkAwIkTJ8Z8aEWtVgtlK9VqNWpqaiCVSgX/06dPQ6lUmqXgjcbX\n11fYfsfDwwNhYWFCoW+O/tFihUKBjIwMvPvuu4iKioJEIkFBQQE8PT2Rl5eHNWvWIDExEQcOHMC5\nc+fQ2dmJ8dykPPDAA1i8eDFWrFiB5cuXg4jg7++PgIAA9Pb24ttvv0VjYyOuXr0qPBpuOj4R4csv\nv0R8fDzCw8Nx8eLFKQvGgH3fXSLC9u3bERYWZhGMu7q6hNdnzpwRvtOmqFQq+Pr6WgTqsejq6kJq\
naio++OCDmRyM7YbfITsBe6QPW4VUJpKCN14tby7S2dlpUWzHuC9dY2OjcCfd3d0NkUgkFFSKiIjQ\nV4uzIjsMDQ2hra0Nrq6uEIvFFnWmjaVJjXfSarUajY2NaG1txd27d3Hnzh2UlJQIdSKcgb2plH5+\nfrhz5w6Gh4cxf/58tLe3w9vbG62trYiJicGiRYvg5eWFb775BjKZTAiSRkksNTUVzc3NYIwhMDBQ\nkO2MNDc348UXX4RKpUJ0dDQaGxvN3jfl999/R2xsLLKyspyWVTKJcMliuuNMLa+0tBSHDh3CyMgI\n/Pz8EBQUZKblERH27NmDyspKLFy4EMePHxfutu9VZnG2o9Pp8OOPP6KhoQFNTU24fPkyNBqNRYF/\nV1dX1NfXw8XFBSEhIYJefy+ICJWVlSgsLMRDDz0EnU6HmzdvIisry2mZBPamUjozFZOIEBUVhcOH\nDwtrJg0NDTh58qRFX41Gg40bN2LTpk145ZVXxney0xP7UmuIaDwHZ5IQi8XU2dlJRESdnZ0kFovH\n7KfRaGj9+vX01ltvCTatVksikYiuX79OQ0NDFBoaSsHBwWZ+Fy5coISEBNLpdFRfX0+rV68e01cu\nl1Nra6uTznLm8Mcff1B9fT0VFBTQ1q1bKTw8nPz8/Cg2NpaOHz9OLS0t1N/fT2q12ubR3d1N6enp\nFB8fTz/99JMwvk6nI41G47T52/t9CggIoF9//XXC/rYoLi6mlJQUoa3VaikiIoI+//xzi74lJSXk\n5uZGCoVCOL777rtxf+Y0wq4YywPyNCUzM5OOHDlCRERHjhyhvXv3WvTR6XSUmppKe/bsMbPX1dVR\nTEyM0E5MTCSZTGbWJyMjgz766COhbfzB1dXV0fr16wV7Tk4O5eTkOOKUZg2XL1+mlStX0ieffEIq\nlYqys7MpMTGRJBIJJSQk0Ouvv05nz56lW7du0cDAAKnVahoYGKDz58+TXC6n4uJiGhkZmdQ5P/jg\ng8JrnU5n1jYlMDCQFAoFrVq1ioqLi8ftz7GKXTF2TtWymEns378fKSkpeP/99wUdGoCZDn3p0iWU\nlJRAJpMJuwjn5ORgcHAQHR0dkMlkYIzB1dUVkZGRZuPzUokTRyqV4quvvhIqsxl3tNDpdGhvb0dD\nQwNqa2uRn5+P/v5+iMVi/PLLL1iwYAEqKirg7+/vlHnZksBMYYxZ1cPt2dPOlj/n/uABeZri4+OD\nv5vs6WZk2bJlQj50dHT0mBkBZWVliI2NxXvvvQcAKCkpmVBQraqqwr59+zA4OIjly5dbaMknT55E\nbm4uiAgeHh4oKioSHuCwVqdjNuDu7j5m7qyLiwtEIhFEIhG2bt0KQJ8e9v3336OiogIHDx50aqaA\nrWqIS5YsER5c6urqsloz2liFbfHixdi8eTOamprw5JNP2u0/Xn744Qekpqaa2ebPnz9nbwJ4QJ6F\nOKJU4sjICHbt2oVt27bB09MTpaWlSE5ONtshIigoCF988QUWLVoElUqFjIwMsx9SbW3tmDtIzyXc\n3d0RGRlp8ReKI7EngyI6OhqrVq3CkiVL0NPTg97eXhQWFprV8TYWU8rNzUVMTAxqampw8OBBAP9O\nh9u/f7/VdLiJIJPJhHK2HHANeTYyPDxMQUFBdOPGDWFhrqWlxazP+fPnzRb1Hn/8cTPfsrIyiouL\nE3zvpSX39fXRsmXLhLa1xSGO49m7d6/ZesO+ffss+ty+fZvWrl1LISEhtHbtWnr44Yepvb2dfv75\nZwoJCaH8/Hy6fv06yeVyksvlFB4eTm+++eaY/uvWraPe3t5JO79ZAl/Um8tcuHCBVqxYQSKRSPhh\nFRUVUVFRERHpF2Z27txJIpGIpFIpff3112a+S5cuJQ8PD8H3ww8/pF27dln9vPz8fNq+fbvQtrY4\nxHE8482AqK6upqioKKGdnZ1N+fn5Tp0jhwdkzn1w+vRpswBrK
yB/9tlnFBoaSrdv3xZsHR0dRETU\n09NDcrmc8vLySCwWU3BwsHA3Z0ptbS15enoKKU5vvPGG8J5KpbLpO9cZbwZEWloavf3220I7Ozub\n/P39SSaTUVpaGvX19TltrnMYHpA5E8fe9LcrV66QSCSia9euWR3rwIED5O3tbTO3uba2lp5++mkL\nX54XrWfdunUkkUgsjrNnz1oEYC8vL6vjDA0NkY+PD3V3dwu27u5u0mq1NDIyQllZWZSWlua085jD\n8LQ3zsSxZ8uemzdv4tlnn0VJSQnEYrFgV6vVwr5xarUaZ86cQWBgoM0t5K1hz/bzcwFHZFAA+joS\nxsU9U38j6enpSEpKcsykOeNm9lfrmIXcunULQUFB6OvrA6DfFigoKAjt7e1j9k9ISICXl9e4fmhu\nbm545513sGHDBoSFhSElJQUSiQRHjx7F0aNHAQCHDx9Gb28vdu7ciZUrV+Kxxx4DoC/BGB0dDYVC\ngdWrV0MqlSIiIkIY25jzPJq6ujrI5XJs3LgRra2tAKznS3P+jT0FgYyUlpbihRdeMLPZUxCIM0nY\neytNXLKYVuTm5lJ6ejoR6Z+6s5UB8emnn1J5efmYksBkYI8e/dtvv1F/fz8R6RcVQ0JCxvTNzMwk\nLy8vq3pyXl6eoENLJBJycXERMgICAgJIKpWSQqGgyMhIh5+nI/n4448pPDycGGNmC66jUalUFBwc\nTAsWLCAfHx8hA6K3t5eeeOIJWrhwIcXFxVFfXx8NDAyQt7c33b1712yMbdu2kVQqJZlMRps2bRIW\nCDkOhWvIsxmNRkMymYwKCgooPDz8nnUQrGm0k8FEHsc2ps2Z+mq1WvL29qbMzEy79OTy8nJ66qmn\nLMacCVy9epXa2tooJibGakC2pa/bkwrHmVTsirFcspihuLu7Iz8/H6+++ioKCwvHtVX9ZGOqR2s0\nGpw6dQrJyclmfbq7u4WnDpuamqDT6eDj42Pme+nSJWg0Grz00kuYN2+eoCdbY6w/z2cKYWFh99w3\nz1RfH309zp07B6VSCQBQKpVj1tPmTD94QJ7BjKfQ91Rijx5dVlYGqVQKhUKBl19+GadOnQJjzMz3\n+eefx6OPPgqJRALAtp48ODiIqqoqPPfcc4KNMYa4uDhERkbi2LFjzj9xJ2NLX+/p6RHqDC9duhQ9\nPT1TMkfO+OBZFjOU5uZmXLx4EQ0NDYiOjsaWLVusFvqeDiQmJlrs6bdjxw7h9e7du7F7926bvmVl\nZaiqqrLr8yoqKrBmzRrhcWDAvHBOaGgoXnvtNTzyyCNj/odG5Px60baKATnq0WSAFwOaUdirbfBj\n+hzQF7uuBxBvaP8ngJP38IkFcH6q536f5/0XANUm7b8B+JuVvmcAbLUx1gcACgC0WHk/EYDKcK3/\nDKDRYHcFcB2ACMA8AFcAhDvxnD8H8Nh4rweAawB8Da99AVyb6n8/ftz74JLFzCQdwE0iumho/w+A\nMMZYzFidGWNfATgNYB1jrIMxtmGS5ulovgawgjEWxBibB2ALgPLRnRhjDwKIAXDOxPYfjDEP42sA\njwKwVdXmGQAfkp4GAF6MMV8AqwH8i4huEJEGwClD36nA1vUoB6A0vFbC5Fpwpi88IM9AiOgYEf3V\npD1CRKuI6Asr/Z8gooeJaAERLSei6smbreMgIi2A3QCqAfwTwMdE1MoY28EY22HSdTOAGiJSm9iW\nAPgHY+wKgCYAFwCMeb0MPALglkm7w2CzZncojLHNjLEO6O+CLzDGqg32ZYyxSsD69TAM8d8A4hlj\n/wcgztDmTHO4hsyZURBRJYDKUbajo9rHARwfZbsBQGFqY4wFOmGKDoGIzkAvu4y2d0IvpxjbFtfD\nYO8FsM6Zc+Q4Hh6QZwmMMRmAklHmISL601TMZxbwMwA/k/Zyg83dip3DuW94QJ4lENEPAFZO9Txm\nEeUAdjPGTgH4E4DfiKiLM
fYrDLot9IF4C4CtUzhPziyCB2TOnIQxVgp95slDBq02G/q7X6MEUgm9\nNPAvAIMA0gzvaRljRt3WFcD/mui2HM59wYgs92TjcDgczuTDsyw4HA5nmsADMofD4UwTeEDmcDic\nacL/A3X/H5B8Z/2EAAAAAElFTkSuQmCC\n", 351 | "text/plain": [ 352 | "" 353 | ] 354 | }, 355 | "metadata": {}, 356 | "output_type": "display_data" 357 | } 358 | ], 359 | "source": [ 360 | "plot_results()" 361 | ] 362 | }, 363 | { 364 | "cell_type": "markdown", 365 | "metadata": {}, 366 | "source": [ 367 | "### Fitting Linear regression and Neural Network to Non-linear data" 368 | ] 369 | }, 370 | { 371 | "cell_type": "code", 372 | "execution_count": 8, 373 | "metadata": {}, 374 | "outputs": [ 375 | { 376 | "data": { 377 | "text/plain": [ 378 | "((7500, 3), (7500, 1))" 379 | ] 380 | }, 381 | "execution_count": 8, 382 | "metadata": {}, 383 | "output_type": "execute_result" 384 | } 385 | ], 386 | "source": [ 387 | "def generate_data(n_points=10000, n_features=3, use_nonlinear=True, \n", 388 | " noise_std=0.1, train_test_split = 4):\n", 389 | " \"\"\"\n", 390 | " Arguments:\n", 391 | " n_points - number of data points to generate\n", 392 | " n_features - a positive integer - number of features\n", 393 | " use_nonlinear - if True, generate non-linear data\n", 394 | " train_test_split - an integer - what portion of data to use for testing\n", 395 | " \n", 396 | " Return:\n", 397 | " X_train, Y_train, X_test, Y_test, n_train, n_features\n", 398 | " \"\"\"\n", 399 | " # Linear data or non-linear data?\n", 400 | " if use_nonlinear:\n", 401 | " weights = np.array([[1.0, 0.5, 0.2],[0.5, 0.3, 0.15]], dtype=np.float32)\n", 402 | " else:\n", 403 | " weights = np.array([1.0, 0.5, 0.2], dtype=np.float32)\n", 404 | " \n", 405 | " np.random.seed(42)\n", 406 | " bias = np.ones(n_points).reshape((-1,1))\n", 407 | " low = - np.ones((n_points,n_features), dtype=np.float32)\n", 408 | " high = np.ones((n_points,n_features), dtype=np.float32)\n", 409 | " \n", 410 | " X = np.random.uniform(low=low, high=high)\n", 411 | " noise = np.random.normal(size=(n_points, 
1))\n", 412 | " noise_std = 0.1\n", 413 | " \n", 414 | " if use_nonlinear:\n", 415 | " Y = (weights[0,0] * bias + np.dot(X, weights[0, :]).reshape((-1,1)) + \n", 416 | " np.dot(X*X, weights[1, :]).reshape([-1,1]) +\n", 417 | " noise_std * noise)\n", 418 | " else:\n", 419 | " Y = (weights[0] * bias + np.dot(X, weights[:]).reshape((-1,1)) + \n", 420 | " noise_std * noise)\n", 421 | " \n", 422 | " n_test = int(n_points/train_test_split)\n", 423 | " n_train = n_points - n_test\n", 424 | " \n", 425 | " X_train = X[:n_train,:]\n", 426 | " Y_train = Y[:n_train].reshape((-1,1))\n", 427 | "\n", 428 | " X_test = X[n_train:,:]\n", 429 | " Y_test = Y[n_train:].reshape((-1,1))\n", 430 | " \n", 431 | " return X_train, Y_train, X_test, Y_test, n_train, n_features\n", 432 | "\n", 433 | "X_train, Y_train, X_test, Y_test, n_train, n_features = generate_data(use_nonlinear=False)\n", 434 | "X_train.shape, Y_train.shape" 435 | ] 436 | }, 437 | { 438 | "cell_type": "code", 439 | "execution_count": 9, 440 | "metadata": {}, 441 | "outputs": [ 442 | { 443 | "data": { 444 | "text/plain": [ 445 | "((7500, 3), (7500, 1))" 446 | ] 447 | }, 448 | "execution_count": 9, 449 | "metadata": {}, 450 | "output_type": "execute_result" 451 | } 452 | ], 453 | "source": [ 454 | "np.random.seed(42)\n", 455 | "X_train, Y_train, X_test, Y_test, n_train, n_features = generate_data(use_nonlinear=True)\n", 456 | "X_train.shape, Y_train.shape" 457 | ] 458 | }, 459 | { 460 | "cell_type": "markdown", 461 | "metadata": {}, 462 | "source": [ 463 | "**Instructions**\n", 464 | "Implement sklearn_lin_regress() function which returns a tuple of\n", 465 | "\n", 466 | "- coefficients of linear regression\n", 467 | "- an instance of LinearRegression class trained to X_train, Y_train\n" 468 | ] 469 | }, 470 | { 471 | "cell_type": "code", 472 | "execution_count": 10, 473 | "metadata": { 474 | "collapsed": true 475 | }, 476 | "outputs": [], 477 | "source": [ 478 | "# GRADED FUNCTION: sklearn_lin_regress\n", 479 | "def 
sklearn_lin_regress(X_train, Y_train):\n", 480 | "    \"\"\"\n", 481 | "    Arguments:\n", 482 | "    X_train - np.array of size (n by k) where n is the number of observations \n", 483 | "              of independent variables and k is the number of variables\n", 484 | "    Y_train - np.array of size (n by 1) where n is the number of observations of the dependent variable\n", 485 | "    \n", 486 | "    Return: a tuple of \n", 487 | "    - np.array of size (k+1 by 1) of regression coefficients\n", 488 | "    - an instance of LinearRegression\n", 489 | "    \"\"\"\n", 490 | "    from sklearn.linear_model import LinearRegression\n", 491 | "    lr_model = None\n", 492 | "    theta_sklearn = np.array([], dtype=np.float32)\n", 493 | "    ### START CODE HERE ### (≈ 2-3 lines of code)\n", 494 | "\n", 495 | "    ### END CODE HERE ###\n", 496 | "    return theta_sklearn, lr_model" 497 | ] 498 | }, 499 | { 500 | "cell_type": "code", 501 | "execution_count": 11, 502 | "metadata": {}, 503 | "outputs": [ 504 | { 505 | "data": { 506 | "text/plain": [ 507 | "array([], dtype=float32)" 508 | ] 509 | }, 510 | "execution_count": 11, 511 | "metadata": {}, 512 | "output_type": "execute_result" 513 | } 514 | ], 515 | "source": [ 516 | "# you can make submission with answers so far to check yourself at this stage\n", 517 | "### GRADED PART (DO NOT EDIT) ###\n", 518 | "theta_sklearn, lr_model = sklearn_lin_regress(X_train, Y_train)\n", 519 | "\n", 520 | "part_3 = list(theta_sklearn.squeeze())\n", 521 | "try:\n", 522 | "    part3 = \" \".join(map(repr, part_3))\n", 523 | "except TypeError:\n", 524 | "    part3 = repr(part_3)\n", 525 | "    \n", 526 | "submissions[all_parts[2]]=part3\n", 527 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key, all_parts[:3],all_parts,submissions)\n", 528 | "\n", 529 | "theta_sklearn.squeeze()\n", 530 | "### GRADED PART (DO NOT EDIT) ###" 531 | ] 532 | }, 533 | { 534 | "cell_type": "markdown", 535 | "metadata": {}, 536 | "source": [ 537 | "LinearRegression.score() computes the $R^2$ coefficient. 
The coefficient $R^2$ is defined as $(1 - \\frac{u}{v})$, where u is the residual sum of squares $\\sum (y\\_true - y\\_pred)^2$ and v is the total sum of squares $\\sum (y\\_true - \\bar{y\\_true})^2$" 538 | ] 539 | }, 540 | { 541 | "cell_type": "code", 542 | "execution_count": 12, 543 | "metadata": {}, 544 | "outputs": [ 545 | { 546 | "data": { 547 | "text/plain": [ 548 | "0.0" 549 | ] 550 | }, 551 | "execution_count": 12, 552 | "metadata": {}, 553 | "output_type": "execute_result" 554 | } 555 | ], 556 | "source": [ 557 | "# you can make submission with answers so far to check yourself at this stage\n", 558 | "### GRADED PART (DO NOT EDIT) ###\n", 559 | "# calculate Linear Regression score\n", 560 | "model_score = 0.\n", 561 | "if lr_model is not None:\n", 562 | "    model_score = lr_model.score(X_test, Y_test)\n", 563 | "part4=str(model_score)\n", 564 | "submissions[all_parts[3]]=part4\n", 565 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key, all_parts[:4],all_parts,submissions)\n", 566 | "model_score\n", 567 | "### GRADED PART (DO NOT EDIT) ###" 568 | ] 569 | }, 570 | { 571 | "cell_type": "markdown", 572 | "metadata": {}, 573 | "source": [ 574 | "### Neural Network with Tensorflow \n", 575 | "\n", 576 | "**Instructions**\n", 577 | "\n", 578 | "Construct a two-layer neural network using the neuron_layer() function. The numbers of nodes in the two hidden layers are given by n_hidden1 and n_hidden2, respectively. Use the Gradient Descent optimizer.\n", 579 | "\n", 580 | "Then train the network using X_train / y_train and compute the accuracy of the prediction on the X_test data set."
581 | ] 582 | }, 583 | { 584 | "cell_type": "code", 585 | "execution_count": 13, 586 | "metadata": { 587 | "collapsed": true 588 | }, 589 | "outputs": [], 590 | "source": [ 591 | "def random_batch(X_train, y_train, batch_size):\n", 592 | " np.random.seed(42)\n", 593 | " rnd_indices = np.random.randint(0, len(X_train), batch_size)\n", 594 | " X_batch = X_train[rnd_indices]\n", 595 | " y_batch = y_train[rnd_indices]\n", 596 | " return X_batch, y_batch\n", 597 | " \n", 598 | "def neuron_layer(X, n_neurons, name, activation_fn=None):\n", 599 | " with tf.name_scope(name):\n", 600 | " n_inputs = int(X.get_shape()[1])\n", 601 | " stddev = 2 / np.sqrt(n_inputs)\n", 602 | " init = tf.truncated_normal((n_inputs, n_neurons), stddev=stddev)\n", 603 | " W = tf.Variable(init, name=\"kernel\")\n", 604 | " b = tf.Variable(tf.zeros([n_neurons]), name=\"bias\")\n", 605 | " Z = tf.matmul(X, W) + b\n", 606 | " if activation_fn is not None:\n", 607 | " return activation_fn(Z)\n", 608 | " else:\n", 609 | " return Z" 610 | ] 611 | }, 612 | { 613 | "cell_type": "code", 614 | "execution_count": 14, 615 | "metadata": { 616 | "collapsed": true 617 | }, 618 | "outputs": [], 619 | "source": [ 620 | "n_hidden1 = 100\n", 621 | "n_hidden2 = 120\n", 622 | "n_outputs = 1 # single value prediction\n", 623 | "n_inputs = X_test.shape[1]\n", 624 | "\n", 625 | "reset_graph()\n", 626 | "X = tf.placeholder(tf.float32, shape=(None, n_inputs), name=\"X\")\n", 627 | "y = tf.placeholder(tf.float32, shape=(None), name=\"y\")\n", 628 | "\n", 629 | "### START CODE HERE ### (≈ 10-15 lines of code)\n", 630 | "\n", 631 | "### END CODE HERE ###\n", 632 | "\n", 633 | "init = tf.global_variables_initializer()" 634 | ] 635 | }, 636 | { 637 | "cell_type": "code", 638 | "execution_count": 15, 639 | "metadata": { 640 | "collapsed": true 641 | }, 642 | "outputs": [], 643 | "source": [ 644 | "learning_rate = 0.01\n", 645 | "n_epochs = 200\n", 646 | "batch_size = 60\n", 647 | "num_rec = X_train.shape[0]\n", 648 | "n_batches 
= int(np.ceil(num_rec / batch_size))\n", 649 | "acc_test = 0. # assign the result of accuracy testing to this variable\n", 650 | "\n", 651 | "### START CODE HERE ### (≈ 9-10 lines of code)\n", 652 | "with tf.Session() as sess:\n", 653 | "\n", 654 | "\n", 655 | "### END CODE HERE ###" 656 | ] 657 | }, 658 | { 659 | "cell_type": "code", 660 | "execution_count": 16, 661 | "metadata": {}, 662 | "outputs": [ 663 | { 664 | "data": { 665 | "text/plain": [ 666 | "0.0" 667 | ] 668 | }, 669 | "execution_count": 16, 670 | "metadata": {}, 671 | "output_type": "execute_result" 672 | } 673 | ], 674 | "source": [ 675 | "### GRADED PART (DO NOT EDIT) ###\n", 676 | "part5=str(acc_test)\n", 677 | "submissions[all_parts[4]]=part5\n", 678 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key, all_parts[:5],all_parts,submissions)\n", 679 | "acc_test\n", 680 | "### GRADED PART (DO NOT EDIT) ###" 681 | ] 682 | } 683 | ], 684 | "metadata": { 685 | "coursera": { 686 | "course_slug": "guided-tour-machine-learning-finance" 687 | }, 688 | "kernelspec": { 689 | "display_name": "Python 3", 690 | "language": "python", 691 | "name": "python3" 692 | }, 693 | "language_info": { 694 | "codemirror_mode": { 695 | "name": "ipython", 696 | "version": 3 697 | }, 698 | "file_extension": ".py", 699 | "mimetype": "text/x-python", 700 | "name": "python", 701 | "nbconvert_exporter": "python", 702 | "pygments_lexer": "ipython3", 703 | "version": "3.6.0" 704 | } 705 | }, 706 | "nbformat": 4, 707 | "nbformat_minor": 2 708 | } 709 | -------------------------------------------------------------------------------- /P3_QLBS Q-Learner in the Black-Scholes(-Merton) Worlds.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/michaelsyao/Machine-Learning-and-Reinforcement-Learning-in-Finance/49abda7cb6b21405f9697d6cdf85893769e057b6/P3_QLBS Q-Learner in the Black-Scholes(-Merton) Worlds.pdf 
-------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Machine Learning and Reinforcement Learning in Finance 2 | 3 | ## Guided Tour of Machine Learning in Finance 4 | 1. [Euclidean Distance Calculation](MY_Euclidian_Distance_m1_ex1_v3.ipynb) 5 | 2. [Linear Regression](Final_linear_regress_m1_ex2_v4.ipynb) 6 | 3. [Tobit Regression](MY_Tobit_regression_m1_ex3_v4.ipynb) 7 | 4. [Bank defaults prediction using FDIC dataset](Final_Bank_failure_m1_ex4_v4.ipynb) 8 | 9 | ## Fundamentals of Machine Learning in Finance 10 | 1. [Random Forests And Decision Trees](FINAL_Bank_failure_rand_forests_m2_ex2.ipynb) 11 | 2. [Eigen Portfolio construction via PCA](Final_pca_eigen_portfolios_m2_ex3.ipynb) 12 | 3. [Data Visualization with t-SNE](Final_DJI_tSNE_m2_ex4_corrected.ipynb) 13 | 4. [Sequence Modeling and Reinforcement Learning](Final_absorp_ratio_m2_ex5.ipynb) 14 | 15 | ## Reinforcement Learning in Finance 16 | 1. [Discrete-time Black Scholes model](Final_discrete_black_scholes_m3_ex1_v3.ipynb) 17 | 2. [QLBS Model Implementation](MY_dp_qlbs_oneset_m3_ex2_v3.ipynb) 18 | 3. [Fitted Q-Iteration](Clean_dp_qlbs_oneset_m3_ex3_v4.ipynb) 19 | 20 | ## Reference Paper 21 | 1. [QLBS Q-Learner in the Black-Scholes(-Merton) Worlds](https://arxiv.org/abs/1712.04609) 22 | 2. [Algorithm Trading using Q-Learning and Recurrent Reinforcement Learning](http://cs229.stanford.edu/proj2009/LvDuZhai.pdf) 23 | 3. [Algorithmic Trading with Fitted Q Iteration and 24 | Heston Model](https://arxiv.org/abs/1805.07478) 25 | 4. [Value Investing: The Use of Historical Financial Statement Information to Separate Winners from Losers](https://www.chicagobooth.edu/~/media/FE874EE65F624AAEBD0166B1974FD74D.pdf) 26 | 27 | 28 | --------------------------------------------------------------------------------
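As a quick taste of the first notebook in the Reinforcement Learning section, here is the closed-form Black-Scholes call price that the discrete-time replicating portfolio approximates as the hedging interval shrinks. This is an illustrative sketch only, not taken from the course notebooks, and the function names are my own:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF expressed via the error function (stdlib only)
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call_price(S, K, T, r, sigma):
    """Closed-form Black-Scholes price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# At-the-money call: S = K = 100, one year to maturity, r = 5%, sigma = 20%
print(round(bs_call_price(100.0, 100.0, 1.0, 0.05, 0.2), 4))
```

Comparing notebook outputs against this benchmark (about 10.45 for the parameters above) is a useful sanity check on any discrete-time hedging implementation.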