├── Marc Potters_Hedged Monte-Carlo_2000.pdf
├── README.md
├── Week1
│   ├── Week1Exercise
│   └── discrete_black_scholes_m3_ex1_v3.ipynb
├── Week2
│   ├── Week2
│   └── dp_qlbs_oneset_m3_ex2_v3.ipynb
└── Week3
    ├── Week3
    └── dp_qlbs_oneset_m3_ex3_v4.ipynb

/Marc Potters_Hedged Monte-Carlo_2000.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuxx1016/Reinforcement-Learning-in-Finance/578fe31da5b7027869f8a715e9a0febd0c28422e/Marc Potters_Hedged Monte-Carlo_2000.pdf
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Reinforcement-Learning-in-Finance
2 | Reinforcement Learning in Finance
3 | This is my repository of practice solutions for the Coursera course "Reinforcement Learning in Finance" by Igor Halperin.
4 | The course covers Q-learning for option pricing and replicates the work in the paper:
5 | https://arxiv.org/abs/1712.04609
6 | QLBS: Q-Learner in the Black-Scholes(-Merton) Worlds
7 | 
--------------------------------------------------------------------------------
/Week1/Week1Exercise:
--------------------------------------------------------------------------------
1 | 
2 | 
--------------------------------------------------------------------------------
/Week1/discrete_black_scholes_m3_ex1_v3.ipynb:
--------------------------------------------------------------------------------
1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Discrete-Time Black Scholes\n", 8 | "Welcome to your 1st assignment in Reinforcement Learning in Finance. This exercise will introduce the Black-Scholes model as viewed through the lens of pricing an option with a discrete-time replicating portfolio of stock and bond.\n", 9 | "\n", 10 | "**Instructions:**\n", 11 | "- You will be using Python 3.\n", 12 | "- Avoid using for-loops and while-loops, unless you are explicitly told to do so.\n", 13 | "- Do not modify the (# GRADED FUNCTION [function name]) comment in some cells. Your work will not be graded if you change this. Each cell containing that comment should only contain one function.\n", 14 | "- After coding your function, run the cell right below it to check if your result is correct.\n", 15 | "\n", 16 | "\n", 17 | "Let's get started!" 18 | ] 19 | }, 20 | { 21 | "cell_type": "markdown", 22 | "metadata": {}, 23 | "source": [ 24 | "## About iPython Notebooks ##\n", 25 | "\n", 26 | "iPython Notebooks are interactive coding environments embedded in a webpage. You will be using iPython notebooks in this class. You only need to write code between the ### START CODE HERE ### and ### END CODE HERE ### comments. After writing your code, you can run the cell by either pressing \"SHIFT\"+\"ENTER\" or by clicking on \"Run Cell\" (denoted by a play symbol) in the upper bar of the notebook. \n", 27 | "\n", 28 | "We will often specify \"(≈ X lines of code)\" in the comments to tell you about how much code you need to write. It is just a rough estimate, so don't feel bad if your code is longer or shorter." 
29 | ] 30 | }, 31 | { 32 | "cell_type": "code", 33 | "execution_count": 62, 34 | "metadata": { 35 | "collapsed": true 36 | }, 37 | "outputs": [], 38 | "source": [ 39 | "import numpy as np\n", 40 | "import matplotlib.pyplot as plt\n", 41 | "%matplotlib inline\n", 42 | "\n", 43 | "from numpy.random import standard_normal, seed\n", 44 | "\n", 45 | "import scipy.stats as stats\n", 46 | "from scipy.stats import norm\n", 47 | "\n", 48 | "import sys\n", 49 | "\n", 50 | "sys.path.append(\"..\")\n", 51 | "import grading\n", 52 | "\n", 53 | "import datetime \n", 54 | "import time\n", 55 | "import bspline\n", 56 | "import bspline.splinelab as splinelab" 57 | ] 58 | }, 59 | { 60 | "cell_type": "code", 61 | "execution_count": 63, 62 | "metadata": { 63 | "collapsed": true 64 | }, 65 | "outputs": [], 66 | "source": [ 67 | "### ONLY FOR GRADING. DO NOT EDIT ###\n", 68 | "submissions=dict()\n", 69 | "assignment_key=\"J_L65CoiEeiwfQ53m1Mlug\" \n", 70 | "all_parts=[\"9jLRK\",\"YoMns\",\"Wc3NN\",\"fcl3r\"]\n", 71 | "### ONLY FOR GRADING. DO NOT EDIT ###" 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "execution_count": 64, 77 | "metadata": { 78 | "collapsed": true 79 | }, 80 | "outputs": [], 81 | "source": [ 82 | "COURSERA_TOKEN = 'tXLFoYydXMEsuFwX' # the key provided to the Student under his/her email on submission page\n", 83 | "COURSERA_EMAIL = 'wudig159@163.com'# the email" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": 65, 89 | "metadata": { 90 | "collapsed": true 91 | }, 92 | "outputs": [], 93 | "source": [ 94 | "# The Black-Scholes prices\n", 95 | "def bs_put(t, S0, K, r, sigma, T):\n", 96 | " d1 = (np.log(S0/K) + (r + 1/2 * sigma**2) * (T-t)) / sigma / np.sqrt(T-t)\n", 97 | " d2 = (np.log(S0/K) + (r - 1/2 * sigma**2) * (T-t)) / sigma / np.sqrt(T-t)\n", 98 | " price = K * np.exp(-r * (T-t)) * norm.cdf(-d2) - S0 * norm.cdf(-d1)\n", 99 | " return price\n", 100 | "\n", 101 | "def bs_call(t, S0, K, r, sigma, T):\n", 102 | " d1 = (np.log(S0/K) + (r + 1/2 * sigma**2) * (T-t)) / sigma / np.sqrt(T-t)\n", 103 | " d2 = (np.log(S0/K) + (r - 1/2 * sigma**2) * (T-t)) / sigma / np.sqrt(T-t)\n", 104 | " price = S0 * norm.cdf(d1) - K * np.exp(-r * (T-t)) * norm.cdf(d2)\n", 105 | " return price\n", 106 | "\n", 107 | "def d1(S0, K, r, sigma, T):\n", 108 | " return (np.log(S0/K) + (r + sigma**2 / 2) * T)/(sigma * np.sqrt(T))\n", 109 | " \n", 110 | "def d2(S0, K, r, sigma, T):\n", 111 | " return (np.log(S0 / K) + (r - sigma**2 / 2) * T) / (sigma * np.sqrt(T))\n", 112 | " " 113 | ] 114 | }, 115 | { 116 | "cell_type": "markdown", 117 | "metadata": {}, 118 | "source": [ 119 | "Simulate $N_{MC}$ stock price sample paths with $T$ steps by the classical Black-Scholes formula.\n", 120 | "\n", 121 | "$$dS_t=\\mu S_tdt+\\sigma S_tdW_t\\quad\\quad S_{t+1}=S_te^{\\left(\\mu-\\frac{1}{2}\\sigma^2\\right)\\Delta t+\\sigma\\sqrt{\\Delta t}Z}$$\n", 122 | "\n", 123 | "where $Z$ is a standard normal random variable.\n", 124 | "\n", 125 | "MC paths are simulated by the gen_paths() method of the DiscreteBlackScholes class." 126 | ] 127 | }, 128 | { 129 | "cell_type": "markdown", 130 | "metadata": {}, 131 | "source": [ 132 | "### Part 1\n", 133 | "\n", 134 | "\n", 135 | "The DiscreteBlackScholes class implements the above calculations, with the following mapping of class variables to math symbols:\n", 136 | "\n", 137 | "$$\\Delta S_t=S_{t+1} - e^{-r\\Delta t} S_t\\space \\quad t=T-1,...,0$$\n", 138 | " \n", 139 | "**Instructions:**\n", 140 | "Some portions of code in DiscreteBlackScholes have been taken out. 
You are to implement the missing portions of code in DiscreteBlackScholes class.\n", 141 | "\n", 142 | "$$\\Pi_t=e^{-r\\Delta t}\\left[\\Pi_{t+1}-u_t \\Delta S_t\\right]\\quad t=T-1,...,0$$\n", 143 | "\n", 144 | "- implement DiscreteBlackScholes.function_A_vec() method\n", 145 | "$$A_{nm}^{\\left(t\\right)}=\\sum_{k=1}^{N_{MC}}{\\Phi_n\\left(X_t^k\\right)\\Phi_m\\left(X_t^k\\right)\\left(\\Delta\\hat{S}_t^k\\right)^2}\\quad\\quad$$ \n", 146 | "\n", 147 | "- implement DiscreteBlackScholes.function_B_vec() method\n", 148 | "$$B_n^{\\left(t\\right)}=\\sum_{k=1}^{N_{MC}}{\\Phi_n\\left(X_t^k\\right)\\left[\\hat\\Pi_{t+1}^k\\Delta\\hat{S}_t^k+\\frac{1}{2\\gamma\\lambda}\\Delta S_t^k\\right]}$$\n", 149 | "- implement DiscreteBlackScholes.gen_paths() method using the following relation:\n", 150 | "$$S_{t+1}=S_te^{\\left(\\mu-\\frac{1}{2}\\sigma^2\\right)\\Delta t+\\sigma\\sqrt{\\Delta t}Z}$$\n", 151 | "where $Z \\sim N(0,1)$\n", 152 | "- implement parts of DiscreteBlackScholes.roll_backward()\n", 153 | " - DiscreteBlackScholes.bVals corresponds to $B_t$ and is computed as $$B_t = e^{-r\\Delta t}\\left[B_{t+1} + (u_{t+1} - u_t)S_{t+1}\\right]\\quad t=T-1,...,0$$\n", 154 | " \n", 155 | "DiscreteBlackScholes.opt_hedge corresponds to $\\phi_t$ and is computed as \n", 156 | " $$\\phi_t=\\mathbf A_t^{-1}\\mathbf B_t$$" 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": 92, 162 | "metadata": { 163 | "collapsed": true 164 | }, 165 | "outputs": [], 166 | "source": [ 167 | "class DiscreteBlackScholes:\n", 168 | " \"\"\"\n", 169 | " Class implementing discrete Black Scholes\n", 170 | " DiscreteBlackScholes is class for pricing and hedging under\n", 171 | " the real-world measure for a one-dimensional Black-Scholes setting\n", 172 | " \"\"\"\n", 173 | "\n", 174 | " def __init__(self,\n", 175 | " s0,\n", 176 | " strike,\n", 177 | " vol,\n", 178 | " T,\n", 179 | " r,\n", 180 | " mu,\n", 181 | " numSteps,\n", 182 | " numPaths):\n", 183 | " \"\"\"\n", 184 | " :param s0: initial price of the underlying\n", 185 | " :param strike: option strike\n", 186 | " :param vol: volatility\n", 187 | " :param T: time to maturity, in years\n", 188 | " :param r: risk-free rate,\n", 189 | " :param mu: real drift, asset drift\n", 190 | " :param numSteps: number of time steps\n", 191 | " :param numPaths: number of Monte Carlo paths\n", 192 | " \"\"\"\n", 193 | " self.s0 = s0\n", 194 | " self.strike = strike\n", 195 | " self.vol = vol\n", 196 | " self.T = T\n", 197 | " self.r = r\n", 198 | " self.mu = mu\n", 199 | " self.numSteps = numSteps\n", 200 | " self.numPaths = numPaths\n", 201 | "\n", 202 | " self.dt = self.T / self.numSteps # time step\n", 203 | " self.gamma = np.exp(-r * self.dt) # discount factor for one time step, i.e. 
gamma in the QLBS paper\n", 204 | "\n", 205 | " self.sVals = np.zeros((self.numPaths, self.numSteps + 1), 'float') # matrix of stock values\n", 206 | "\n", 207 | " # initialize half of the paths with stock price values ranging from 0.5 to 1.5 of s0\n", 208 | " # the other half of the paths start with s0\n", 209 | " half_paths = int(numPaths / 2)\n", 210 | "\n", 211 | " if False:\n", 212 | " # Grau (2010) \"Applications of Least-Squares Regressions to Pricing and Hedging of Financial Derivatives\"\n", 213 | " self.sVals[:, 0] = (np.hstack((np.linspace(0.5 * s0, 1.5 * s0, half_paths),\n", 214 | " s0 * np.ones(half_paths, 'float')))).T\n", 215 | "\n", 216 | " self.sVals[:, 0] = s0 * np.ones(numPaths, 'float')\n", 217 | " self.optionVals = np.zeros((self.numPaths, self.numSteps + 1), 'float') # matrix of option values\n", 218 | " self.intrinsicVals = np.zeros((self.numPaths, self.numSteps + 1), 'float')\n", 219 | "\n", 220 | " self.bVals = np.zeros((self.numPaths, self.numSteps + 1), 'float') # matrix of cash position values\n", 221 | " self.opt_hedge = np.zeros((self.numPaths, self.numSteps + 1),\n", 222 | " 'float') # matrix of optimal hedges calculated from cross-sectional information F_t\n", 223 | " self.X = None\n", 224 | " self.data = None # matrix of features, i.e. self.X as sum of basis functions\n", 225 | " self.delta_S_hat = None\n", 226 | "\n", 227 | " # coef = 1.0/(2 * gamma * risk_lambda)\n", 228 | " # override it by zero to have pure risk hedge\n", 229 | " self.coef = 0.\n", 230 | "\n", 231 | " def gen_paths(self):\n", 232 | " \"\"\"\n", 233 | " A simplest path generator\n", 234 | " \"\"\"\n", 235 | " np.random.seed(42)\n", 236 | " # Spline basis of order p on knots k\n", 237 | "\n", 238 | " ### START CODE HERE ### (≈ 3-4 lines of code)\n", 239 | " # self.sVals = your code goes here ...\n", 240 | " # for-loop or while loop is allowed heres\n", 241 | " #Z = np.random.normal(0, 1, size=(self.numSteps+1, self.numPaths)).T\n", 242 | " #for step in range(numSteps):\n", 243 | " #self.sVals[:,step+1] = self.sVals[:,step]*np.exp((self.mu - 1/2*self.vol**2)*self.dt+(self.vol*np.sqrt(self.dt))*Z[:,step+1])\n", 244 | " Z = np.random.normal(0, 1, size=(self.numSteps+1, self.numPaths)).T\n", 245 | " print('Z=',Z)\n", 246 | " print('size of Z',Z.shape)\n", 247 | " Z1 = np.random.normal(0,1,size=(self.numPaths, self.numSteps+1))\n", 248 | " print('Z1=',Z1)\n", 249 | " print('size of Z1',Z1.shape)\n", 250 | " \n", 251 | " for step in range(self.numSteps):\n", 252 | " self.sVals[:, step+1] = self.sVals[:, step] * np.exp( (self.mu - 1/2 * self.vol ** 2)*self.dt + (self.vol * np.sqrt(self.dt) * Z[:, step+1]))\n", 253 | "\n", 254 | " ### END CODE HERE ###\n", 255 | "\n", 256 | " # like in QLBS\n", 257 | " delta_S = self.sVals[:, 1:] - np.exp(self.r * self.dt) * self.sVals[:, :self.numSteps]\n", 258 | " self.delta_S_hat = np.apply_along_axis(lambda x: x - np.mean(x), axis=0, arr=delta_S)\n", 259 | "\n", 260 | " # state variable\n", 261 | " # delta_t here is due to their conventions\n", 262 | " self.X = - (self.mu - 0.5 * self.vol ** 2) * np.arange(self.numSteps + 1) * self.dt + np.log(self.sVals)\n", 263 | "\n", 264 | " X_min = np.min(np.min(self.X))\n", 265 | " X_max = np.max(np.max(self.X))\n", 266 | "\n", 267 | " print('X.shape = ', self.X.shape)\n", 268 | " print('X_min, X_max = ', X_min, X_max)\n", 269 | "\n", 270 | " p = 4 # order of spline (as-is; 3 = cubic, 4: B-spline?)\n", 271 | " ncolloc = 12\n", 272 | " tau = np.linspace(X_min, X_max, ncolloc) # These are the sites to which we would like 
to interpolate\n", 273 | "\n", 274 | " # k is a knot vector that adds endpoints repeats as appropriate for a spline of order p\n", 275 | " # To get meaningful results, one should have ncolloc >= p+1\n", 276 | " k = splinelab.aptknt(tau, p)\n", 277 | " basis = bspline.Bspline(k, p)\n", 278 | "\n", 279 | " num_basis = ncolloc # len(k) #\n", 280 | " self.data = np.zeros((self.numSteps + 1, self.numPaths, num_basis))\n", 281 | "\n", 282 | " print('num_basis = ', num_basis)\n", 283 | " print('dim self.data = ', self.data.shape)\n", 284 | "\n", 285 | " # fill it, expand function in finite dimensional space\n", 286 | " # in neural network the basis is the neural network itself\n", 287 | " t_0 = time.time()\n", 288 | " for ix in np.arange(self.numSteps + 1):\n", 289 | " x = self.X[:, ix]\n", 290 | " self.data[ix, :, :] = np.array([basis(el) for el in x])\n", 291 | " t_end = time.time()\n", 292 | " print('\\nTime Cost of basis expansion:', t_end - t_0, 'seconds')\n", 293 | "\n", 294 | " def function_A_vec(self, t, reg_param=1e-3):\n", 295 | " \"\"\"\n", 296 | " function_A_vec - compute the matrix A_{nm} from Eq. (52) (with a regularization!)\n", 297 | " Eq. (52) in QLBS Q-Learner in the Black-Scholes-Merton article\n", 298 | "\n", 299 | " Arguments:\n", 300 | " t - time index, a scalar, an index into time axis of data_mat\n", 301 | " reg_param - a scalar, regularization parameter\n", 302 | "\n", 303 | " Return:\n", 304 | " - np.array, i.e. matrix A_{nm} of dimension num_basis x num_basis\n", 305 | " \"\"\"\n", 306 | " X_mat = self.data[t, :, :]\n", 307 | " num_basis_funcs = X_mat.shape[1]\n", 308 | " this_dS = self.delta_S_hat[:, t]\n", 309 | " hat_dS2 = (this_dS ** 2).reshape(-1, 1)\n", 310 | " A_mat = np.dot(X_mat.T, X_mat * hat_dS2) + reg_param * np.eye(num_basis_funcs)\n", 311 | " return A_mat\n", 312 | "\n", 313 | " def function_B_vec(self, t, Pi_hat):\n", 314 | " \"\"\"\n", 315 | " function_B_vec - compute vector B_{n} from Eq. 
(52) QLBS Q-Learner in the Black-Scholes-Merton article\n", 316 | "\n", 317 | " Arguments:\n", 318 | " t - time index, a scalar, an index into time axis of delta_S_hat\n", 319 | " Pi_hat - pandas.DataFrame of dimension N_MC x T of portfolio values\n", 320 | " Return:\n", 321 | " B_vec - np.array() of dimension num_basis x 1\n", 322 | " \"\"\"\n", 323 | " tmp = Pi_hat * self.delta_S_hat[:, t] + self.coef * (np.exp((self.mu - self.r) * self.dt)) * self.sVals[:, t]\n", 324 | " X_mat = self.data[t, :, :] # matrix of dimension N_MC x num_basis\n", 325 | "\n", 326 | " B_vec = np.dot(X_mat.T, tmp)\n", 327 | " return B_vec\n", 328 | "\n", 329 | " def seed_intrinsic(self, strike=None, cp='P'):\n", 330 | " \"\"\"\n", 331 | " initilaize option value and intrinsic value for each node\n", 332 | " \"\"\"\n", 333 | " if strike is not None:\n", 334 | " self.strike = strike\n", 335 | "\n", 336 | " if cp == 'P':\n", 337 | " # payoff function at maturity T: max(K - S(T),0) for all paths\n", 338 | " self.optionVals = np.maximum(self.strike - self.sVals[:, -1], 0).copy()\n", 339 | " # payoff function for all paths, at all time slices\n", 340 | " self.intrinsicVals = np.maximum(self.strike - self.sVals, 0).copy()\n", 341 | " elif cp == 'C':\n", 342 | " # payoff function at maturity T: max(S(T) -K,0) for all paths\n", 343 | " self.optionVals = np.maximum(self.sVals[:, -1] - self.strike, 0).copy()\n", 344 | " # payoff function for all paths, at all time slices\n", 345 | " self.intrinsicVals = np.maximum(self.sVals - self.strike, 0).copy()\n", 346 | " else:\n", 347 | " raise Exception('Invalid parameter: %s'% cp)\n", 348 | "\n", 349 | " self.bVals[:, -1] = self.intrinsicVals[:, -1]\n", 350 | "\n", 351 | " def roll_backward(self):\n", 352 | " \"\"\"\n", 353 | " Roll the price and optimal hedge back in time starting from maturity\n", 354 | " \"\"\"\n", 355 | "\n", 356 | " for t in range(self.numSteps - 1, -1, -1):\n", 357 | "\n", 358 | " # determine the expected portfolio value at the next time node\n", 359 | " piNext = self.bVals[:, t+1] + self.opt_hedge[:, t+1] * self.sVals[:, t+1]\n", 360 | " pi_hat = piNext - np.mean(piNext)\n", 361 | "\n", 362 | " A_mat = self.function_A_vec(t)\n", 363 | " B_vec = self.function_B_vec(t, pi_hat)\n", 364 | " phi = np.dot(np.linalg.inv(A_mat), B_vec)\n", 365 | " self.opt_hedge[:, t] = np.dot(self.data[t, :, :], phi)\n", 366 | "\n", 367 | " ### START CODE HERE ### (≈ 1-2 lines of code)\n", 368 | " # implement code to update self.bVals\n", 369 | " # self.bVals[:,t] = your code goes here ....\n", 370 | " self.bVals[:, t] = np.exp(-self.r * self.dt) * (self.bVals[:, t+1] + (self.opt_hedge[:, t+1] - self.opt_hedge[:, t])*self.sVals[:, t+1])\n", 371 | " \n", 372 | "\n", 373 | " ### END CODE HERE ###\n", 374 | "\n", 375 | " # calculate the initial portfolio value\n", 376 | " initPortfolioVal = self.bVals[:, 0] + self.opt_hedge[:, 0] * self.sVals[:, 0]\n", 377 | "\n", 378 | " # use only the second half of the paths generated with paths starting from S0\n", 379 | " optionVal = np.mean(initPortfolioVal)\n", 380 | " optionValVar = np.std(initPortfolioVal)\n", 381 | " delta = np.mean(self.opt_hedge[:, 0])\n", 382 | "\n", 383 | " return optionVal, delta, optionValVar" 384 | ] 385 | }, 386 | { 387 | "cell_type": "code", 388 | "execution_count": 93, 389 | "metadata": {}, 390 | "outputs": [ 391 | { 392 | "name": "stdout", 393 | "output_type": "stream", 394 | "text": [ 395 | "Z= [[ 0.49671415 -1.41537074 0.35778736 ..., 0.17087353 0.28473556\n", 396 | " -0.31048356]\n", 397 | " [-0.1382643 
-0.42064532 0.56078453 ..., 0.01225543 -0.25605379\n", 398 | " -1.03759187]\n", 399 | " [ 0.64768854 -0.34271452 1.08305124 ..., -0.43115507 -0.10004104\n", 400 | " -0.11796758]\n", 401 | " ..., \n", 402 | " [ 0.26105527 0.15372511 0.30729952 ..., 1.78993649 -0.08322158\n", 403 | " 0.26402245]\n", 404 | " [ 0.00511346 0.05820872 0.81286212 ..., -0.87234029 -1.30132528\n", 405 | " 0.04032121]\n", 406 | " [-0.23458713 -1.1429703 0.62962884 ..., -0.5559135 -0.21753278\n", 407 | " 1.23713071]]\n", 408 | "size of Z (100, 253)\n", 409 | "Z1= [[-0.82509297 1.72633531 -1.28722318 ..., 0.19499029 -0.85910398\n", 410 | " -0.65880283]\n", 411 | " [ 0.45357985 0.93344041 -0.3227902 ..., 0.54517618 -0.61843117\n", 412 | " -0.65012904]\n", 413 | " [ 0.79540447 0.81199893 -0.17199499 ..., -1.37594032 -0.31606297\n", 414 | " 0.25565865]\n", 415 | " ..., \n", 416 | " [ 0.55016759 -1.54009227 1.27179887 ..., 0.47173665 -0.40921016\n", 417 | " -0.93454102]\n", 418 | " [-0.96226679 1.59233315 -1.82476061 ..., -0.604804 0.95590354\n", 419 | " -0.16326198]\n", 420 | " [ 0.59741217 -0.65335565 0.80366925 ..., -0.69656287 1.31561273\n", 421 | " 1.30238677]]\n", 422 | "size of Z1 (100, 253)\n", 423 | "X.shape = (100, 253)\n", 424 | "X_min, X_max = 4.10743882917 5.16553756345\n", 425 | "num_basis = 12\n", 426 | "dim self.data = (253, 100, 12)\n", 427 | "\n", 428 | "Time Cost of basis expansion: 5.966109037399292 seconds\n" 429 | ] 430 | }, 431 | { 432 | "data": { 433 | "text/plain": [ 434 | "" 435 | ] 436 | }, 437 | "execution_count": 93, 438 | "metadata": {}, 439 | "output_type": "execute_result" 440 | }, 441 | { 442 | "data": { 443 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAWkAAAEcCAYAAAAFlEU8AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAG0lJREFUeJzt3X+U3HV97/HnK5tN3KC4eAnVbIjBX+GUHxK6eujFtvzS\n4C+IaCsVql61Ob33WovVKBGvQo8e8cZzBa9e24AVLz8jEFewrQErtLWnQRM2MYaQSuVXFpClsNpL\nFths3veP+U6YTGZ2Z2Zn5vuZ2dfjnD3sfL8zs5/vDnnlk/f380MRgZmZpWlO3g0wM7PqHNJmZglz\nSJuZJcwhbWaWMIe0mVnCHNJmZglzSJuZJcwhbWaWMIe0WRlJV0n6XN7tMAOHtDWBpAckjUv6fyVf\nX825PWe08L3Lr/UZST9qxc9rhKTzJG2RNCbpSUl3ShrIu13WGIe0NcvbI+KFJV8fzrtBLXTAtQJ/\nkneDiiS9H7gEWAUcBrwGuAZ4Ksdm2Qw4pK1lJL0y68mdmD1eJOkJSadkjx+QtEbSPZKekvRNSS8o\nef0iSTdLGpV0v6SPlJw7UtKG7Ny/F3vukq4GlgC3Zr3cT0z1Ptlrlku6W9J/SFoPvIAWk3ShpJvK\njl0u6SvZ95+UNJK1aZek02t86w8CfxURW6LgiYi4MiL2NPsarD0c0tYyEfFvwCeBayUtAL4JXBUR\nd5Y87TxgBfBKCr2+TwNImgPcCmwDBoDTgQskrZDUA3wPeBBYmp2/IfuZfwQ8RNbbBb5U7X2ynzMP\nGAKuBl4C3Ai8s/m/jYNcD7xF0qFZO3qAPwCuk7QM+DDwuoh4EYXfzwM1vu848AFJfyDp8OY329rN\nIW3NMpTVQItffwwQEVcAPwfuAl4GXFT2uq9GxMMR8STweeAPs+OvAxZGxF9ExHMR8QvgCuBc4PXA\nImB1RDwdEc9ERLWa8FTvA3AS0AtcFhETEXET8JOZ/CIkfVHSP0m6WlJvpedExIPA3cDK7NBpwJ6I\n2ARMAvOB35TUGxEPZH/h1eK9wEYKfzn9UtKtko4oadupkpY0eGmWA4e0NcvKiOgv+bqi5NwVwLHA\n/46IZ8te93DJ9w9SCF+AlwOLSoMf+BTwG8CRwIMRsbeGdk31PmQ/byQOXLP3wRretyJJrwUGIuJ3\ngHuZuld+Hc//pfSe7DERcR9wAXAx8LikGyQtqvgOZSLisYi4ICKWUPjL7HgK/5op+gDg9Yk7iEPa\nWkrSC4HLgG8AF0t6SdlTjiz5fgnwSPb9w8D9ZcH/ooh4S3ZuiaS5VX5saQhN9T4AjwIDklTWjkad\nDNyWff994A1TPPdG4BRJi4F3kIU0QERcFxFvoPCXTABfrLchEbEF2A4cAiDpLOBtwNWSzq/3/Swf\nDmlrtcuBLRHxIeBvgL8sO//fJS3OwvtTwPrs+I+BX2c30Pok9Ug6VtLrsnOPApdKOkTSCySdXPKe\nvwReUcP7APwLsBf4iKS5ks6h0ANtVD/w6+z7X1Goc1cUEaPAnRRq9fdHxE4AScsknSZpPvAMhTrz\n5HQ/OLsZebKk+dnX+4FTsveHQh1/S0ScEhHXNHJx1n4OaWuW4miK4td3JJ0NnMnzQ9T+HDhR0nkl\nr7uOQs/zF9nX5wAiYhJ4O3ACcD/wBHAl8OKSc6+icJNwN/Dukvf8AvDprLTx0Wrvk/2c54BzgPdT\nGKb2bmDDDH4PY8Ch2fcvBp6c5vnXAWdQ0oumUI++NGvrY8ARFP4CQ9LfSfpUlfc6lEIg/zuF38u7\ngdMj4q7s/Kso3B+wDiJvn2V5kfQA8KGI+EHebWkWSScAfx4R783C9P6IuD7vdgFIegfw8oi4LO+2\nWO3ckzZroojYSmFUxT8BxwA359ykUru
AD0lySHcQ96QtN93YkzZrNoe0mVnCXO4wM0uYQ9rMLGHV\nJgN0hMMPPzyWLl2adzPMzOq2ZcuWJyJi4XTP6+iQXrp0KZs3b867GWZmdZNU0/IDLneYmSXMIW1m\nljCHtJlZwhzSZmYJc0ibmSWso0d3mJnlZWh4hLUbd/HI2DiL+vtYvWIZK5c3f1N2h7SZWZ2GhkdY\ns2E74xOFZb5HxsZZs2E7QNOD2uUOM7M6rd24a39AF41PTLJ2466m/yyHtJlZnR4ZG6/r+Ew4pM3M\n6rSov6+u4zPhkDYzq9PqFcvo6+054Fhfbw+rVyxr+s/yjUMzszoVbw56dIeZWaJWLh9oSSiXc7nD\nzCxhDmkzs4Q5pM3MEuaQNjNLmEPazCxhDmkzs4Q5pM3MEuaQNjNLWFIhLalf0k2S7pW0U9Jv590m\nM7M8pTbj8HLg+xHxLknzgAV5N8jMLE/JhLSkQ4HfBd4PEBHPAc/l2SYzs7ylVO54BTAKfFPSsKQr\nJR2Sd6PMzPKUUkjPBU4Evh4Ry4GngQvLnyRplaTNkjaPjo62u41mZm2VUkjvBnZHxF3Z45sohPYB\nImJdRAxGxODChQvb2kAzs3ZLJqQj4jHgYUnFVbNPB+7JsUlmZrlL5sZh5k+Ba7ORHb8A/kvO7TEz\ny1VSIR0RW4HBvNthZpaKZModZmZ2MIe0mVnCHNJmZglzSJuZJcwhbWaWMIe0mVnCHNJmZglzSJuZ\nJcwhbWaWMIe0mVnCHNJmZglLau0OM+t+Q8MjrN24i0fGxlnU38fqFctYuXwg72YlyyFtZm0zNDzC\nmg3bGZ+YBGBkbJw1G7YDOKircLnDzNpm7cZd+wO6aHxikrUbd+XUovQ5pM2sbR4ZG6/ruDmkzayN\nFvX31XXcHNJm1karVyyjr7fngGN9vT2sXrGsyivMNw7NrG2KNwc9uqN2Dmkza6uVywccynVwSJvZ\njHjcc2s5pM2sYR733Hq+cWhmDfO459ZzSJtZwzzuufUc0mbWMI97bj2HtJk1zOOeW883Ds2sYR73\n3HoOaTObEY97bq3kyh2SeiQNS/pe3m0xM8tbciEN/BmwM+9GmJmlIKmQlrQYeCtwZd5tMTNLQVIh\nDVwGfALYV+0JklZJ2ixp8+joaPtaZmaWg2RuHEp6G/B4RGyRdEq150XEOmAdwODgYLSpeWZdyetu\npC+lnvTJwFmSHgBuAE6TdE2+TTLrXsV1N0bGxgkK625csH4ry//iNoaGR/JunmWSCemIWBMRiyNi\nKXAu8MOIOD/nZpl1paHhET727W0HrbsB8NSeCdZs2O6gTkQyIW1m7VHsQU9G9WqhF0lKRzI16VIR\ncSdwZ87NMOtKlVauq8SLJKXBPWmzWabW8PUiSWlwSJvNMrWErxdJSodD2myWqbRyXe8ccdiCXgQM\n9PfxhXOO81C8RCRZkzaz1imG7yW37uCpPRMAHDJ/Lp99+zEO5gS5J202Sz0z8fzE3rFxD7tLlUPa\nbBby3oSdw+UOsw7XyNRu703YOdyTNutglaZ211K28N6EncMhbdbBGi1beG/CzuFyh1kHa7Rs4b0J\nO4dD2qyDLervY6RCINdStvDehJ3B5Q6zDuayRfdzT9qsg7ls0f0c0mYdzmWL7uaQNsuBt62yWjmk\nzdqsOLa5OHSuOLYZcFDbQXzj0KzNPCXb6uGQNmszT8m2ejikzdrMU7KtHg5pszbz2Garh28cmrWZ\nxzZbPRzSZjnw2GarlUParA08Ltoa5ZA2azGPi7aZcEibtUix91xplbriuGiHtE3HIW3WJKUljRf3\n9fL0c3uZmIyqz/e4aKtFMkPwJB0p6Q5JOyXtkPRnebfJrFbl21iNjU9MGdDgcdFWm5R60nuBj0XE\n3ZJeBGyRdHtE3JN3w8ymU2mq91Q8LtpqlUxPOiIejYi7s+//A9gJuGBnHaGe0oWAd/6Wh+BZbZIJ\n6VKSlgLLgbsqnFslabOkzaOjo+1umtlBhoZHmCPV/PwA7rjX/+9abVIqdwAg6YXAzcAFEfHr8vMR\nsQ5YBzA4ODh10c+sBUpHbYhC6NbLNw2tVkmFtKReCgF9bURsyLs9ZjD1qI1Gewm+aWi1SiakJQn4\nBrAzIv5X3u0xGxoe4ZJbd/DUnon9x8bGJ6Z4RW1809DqkVJN+mTgj4DTJG3Nvt6Sd6NsdioOqSsN\n6EYM9Pdx/klLGOjvQ9njL5xznG8aWs2S6UlHxI8o3Pg2y129Q+oqEfDPF57WnAbZrJVST9osGc24\nsee6szWDQ9qsgpkGrOvO1ix1h7Sk90i6QdK1kq6T9IetaJhZnk49emHdr+nJxkq77mzN1EhN+vci\n4tziA0lfA65vXpPM8ldtsskcFcJ4Yt/zg+/6enscytYyjYT0fElvBR4GjgRceLOuU60mvS+gZw70\n9/Xyq/EJL+BvLddISP834BzgWGA38OGmtsgsZ8Vp3pNRearKxGRwyPy5bP3sm9rcMpuNpg1pSV+p\ndgoYiIhrm9sks/wUx0dXC+giT+u2dqmlJ3028JlWN8SsHabba7DW8dEeXmftUktIPxkR32p5S8xa\nrNJegxes38olt+7grce/jDvuHa241VU5D6+zdqolpL3SnHWFar3kp/ZMcM2mh6Z8bY/EvgjfKLS2\nS2ZauFmr1dJLrsRD7CxPtYT0ayUdtK4zhRuHERGHNrlNZg2pVG8Gqu7YXYsB95wtZ9OGdET0tKMh\nZjNRqd68+qZtTO4L9jVYsBvo7/MCSZY7lzusK1xy646D6s3T7dY9Fd8ctFR4gSXreEPDIzNe99lr\nPluq3JO2jnfJrTtm9PpD5vXwuZXHNak1Zs3lnrR1tJn2onvmiM+/wwFt6XJP2jpK+QiOp5/d2/B7\neeSGdQKHtHWMSiM46nXYgl4++/ZjHMzWMRzSlqzyXvOe5/Y2vO/gA5e+tcmtM2sPh7QlqRm95qIB\nL4ZkHcw3Di1JzditG6C3Rx7vbB3NPWlLUjPWa3b92bqBQ9qStKi/b0brbXg6t3ULlzssSatXLKOv\nt/5lYzyd27qNe9KWpGKJ4pJbd0w5WeWwBb0smDe36k4rZp0uqZCWdCZwOdADXBkRl+bcJGuh6bay\nAnhmYl/V1/f19rjmbF0vmZCW1AN8DXgjhV3IfyLploi4J9+WWStUGmK3ZsN24Ple9FQjPDxb0GaL\nZEIaeD1wX0T8AkDSDRQ2wXVId6FKATw+MckF67dywfqtDExx41DgG4M2a6R043AAeLjk8e7smHWR\noeERTr70h9OO3BgZG0dVznmnbptNUupJV/ozedCq7ZJWAasAlixZ0uo2WROVlzimE2R7tJUc8+gN\nm21S6knvBo4sebwYeKT8SRGxLiIGI2Jw4cKFbWuczczQ8Agf+/a2umcRBngxfpvVUupJ/wR4taSj\ngBHgXOA9+TbJmqHYg56M+rez8sQUm+2SCemI2Cvpw8BGCkPw/joiZrblhuWmdHjdHKmhgBa4tGGz
\nXjIhDRARfwv8bd7tsJkprz03GtDnnbTEpQ2b9ZIKaetcM+k5F28O9mSv8xhos+c5pG3GZtJzdiCb\nTc0hbTN28S07Glr72TcFzaaX0hA86zBDwyOccMltjI3Xv1u3xzub1cYhbQ0pljhqDejzT1ri8c5m\nDXC5wxpSz/ZWA/19fG7lcS1ukVl3ck/aGlLr9lYua5jNjEPaGlLLIkcua5jNnMsdVpPyBfpPPXoh\nN28ZqVjy6J0j1v7+ax3OZk3gnrRNq3iTcGRsnKCwjOjNW0Z4528NMJD1qHtUWMRwoL/PAW3WRO5J\n27SqLdB/zaaHGOjv47J3n+BQNmsRh7TtNzQ8csDGr+VrOVdSadsrM2selzsMKAT06pu2HbAzd62T\nu8cnJlm7cVdrGmY2yzmkDSiUNCYm61+trqjWIXlmVh+HtAEzD1nvO2jWGg5pA2YWsp6wYtY6DmkD\n4NSja98vco6gv6/X63CYtYFHd8wC5aM2+vt6ufisY/YH69DwCOt/8nBN79XX2+NQNmsjh3SXK47a\nKL0pODY+weobtwGFYXNT3TQ8bEEvC+bN3T/T0Av0m7WXQ7rLVQvgiX3B2o27WLl8YMqbhmN7Jhj+\nzJta2UQzm4Jr0l1uZIoALobzVDcNPWrDLF8O6S5XXFOjkmIAr16xjN6eg5/XO0cetWGWM5c7utxU\nm8IWA7hYY57q5qKZ5cMh3eV6pIpBLR241sbK5QMOZLMEudzR5ar1pKfoYJtZQtyT7mCVFuK/497R\nA4bLDfT3Vbx5OOAbgmYdwT3pDvXpoe18dP3WAxbiv2bTQwc8Xn3jNk49eiF9vT0HvNbTuM06RxIh\nLWmtpHsl/VTSdyT1592mlA0Nj3DtpoemXUp0Yl/wvW2P8oVzjmOgv8/TuM06UCrljtuBNRGxV9IX\ngTXAJ3NuU7LWbtxV81rPY+MTvilo1sGS6ElHxG0RsTd7uAlYnGd7UlfvsqJDwyMtaomZtVoqPelS\nHwDWVzspaRWwCmDJkiXtalPuSm8SzqkyrK6a4vRvM+s8bQtpST8AXlrh1EUR8d3sORcBe4Frq71P\nRKwD1gEMDg7OioFk5Ysk1RPQ4F1TzDpZ20I6Is6Y6ryk9wFvA06P8Cje0p6zBPtm8Bvx+htmnSuJ\ncoekMyncKPy9iNiTd3vyNjQ8wpoN2xmfmARmNvHEw+3MOlsSIQ18FZgP3K7CgkCbIuJP8m1SftZu\n3LU/oOvV39eLVFhi1Os/m3W+JEI6Il6VdxtS0mgNeaC/j3++8LQmt8bM8pRESM9m5VO7V69YxqIq\nU7mn4rKGWXdKYpz0bFWsPZdO5V6zYXvFqdy9PTpg89fzT1riWYRms4B70jkZGh7hY9/edtBwuvGJ\nyf1Tuct72A5hs9nHId1mQ8MjXHzLDsbGJ6o+p3jO9WUzc7mjjYrljakCumjtxl1taJGZpc4h3Ub1\nDK3zLEEzA4d0W9UTvJ4laGbgkG6rWoPXw+nMrMgh3UarVyw7aGhdOQ+nM7NSHt3RRsXgXbtxFyNj\n4/t38h7wEDszq8Ih3WbeJcXM6uFyh5lZwtyTnoHydTdOPXohd9w76lmCZtY0DukGla/5PDI2zjWb\nHtp/vrgOB+CgNrOGudzRoItv2THtxJTxiUnPHDSzGXFPugaVyhq1TO0Gzxw0s5lxSE+jUlnj2pKy\nxnQ8c9DMZsLljmlUWm+j1i0HPXPQzGbKPekKSssb9ewBe8i8HvoXzPPoDjNrGod0mfLyRq36env4\n/Ds8ndvMmsvljjKN7tTt9TbMrBUc0mUaGY0x0N/ngDazlnBIl6l3NIZvDppZKzmky9SynGiRlxU1\ns1bzjcMypcuJTjW6Q3ijWDNrPYd0BaXLiZ586Q8ZqVCn9iQVM2uHpModkj4uKSQdnndbiiqVP1yH\nNrN2SaYnLelI4I1A7XOu26C8/OFJKmbWTsmENPBl4BPAd/NuSDnvpmJmeUmi3CHpLGAkIrbV8NxV\nkjZL2jw6OtqG1pmZ5adtPWlJPwBeWuHURcCngDfV8j4RsQ5YBzA4OFjP0hoHLTnqsoWZpa5tIR0R\nZ1Q6Luk44ChgmySAxcDdkl4fEY816+dXWnLUO6eYWepyL3dExPaIOCIilkbEUmA3cGIzAxoqr8nh\nnVPMLHW5h3S7VFuTwzunmFnKkgvprEf9RLPft9rkE09KMbOUJRfSreJJKWbWiVIaJ91SnpRiZp1o\n1oQ0eFKKmXWeWVPuMDPrRA5pM7OEOaTNzBLmkDYzS5hD2swsYYqoa42ipEgaBR6scvpwoOmTYnLi\na0lPt1wH+Fry8vKIWDjdkzo6pKciaXNEDObdjmbwtaSnW64DfC2pc7nDzCxhDmkzs4R1c0ivy7sB\nTeRrSU+3XAf4WpLWtTVpM7Nu0M09aTOzjueQNjNLWNeEtKR+STdJulfSTkm/Leklkm6X9PPsv4fl\n3c7pSPqopB2SfibpekkvkHSUpLuy61gvaV7e7axE0l9LelzSz0qOVfwMVPAVSfdJ+qmkE/Nr+cGq\nXMva7P+vn0r6jqT+knNrsmvZJWlFPq2urNK1lJz7uKSQdHj2uOM+l+z4n2a/+x2S/mfJ8WQ/l1p1\nTUgDlwPfj4ijgdcCO4ELgb+PiFcDf589TpakAeAjwGBEHAv0AOcCXwS+nF3HU8AH82vllK4Cziw7\nVu0zeDPw6uxrFfD1NrWxVldx8LXcDhwbEccD/wqsAZD0mxQ+p2Oy1/wfST2k4yoOvhYkHQm8EXio\n5HDHfS6STgXOBo6PiGOAL2XHU/9catIVIS3pUOB3gW8ARMRzETFG4YP7Vva0bwEr82lhXeYCfZLm\nAguAR4HTgJuy88leR0T8I/Bk2eFqn8HZwP+Ngk1Av6SXtael06t0LRFxW0TszR5uorCzPRSu5YaI\neDYi7gfuA17ftsZOo8rnAvBl4BNA6eiBjvtcgP8KXBoRz2bPeTw7nvTnUquuCGngFcAo8E1Jw5Ku\nlHQI8BsR8ShA9t8j8mzkdCJihEIv4CEK4fwrYAswVhIOu4FO2rmg2mcwADxc8rxOu64PAH+Xfd9x\n1yLpLGAkIraVneq4awFeA/xOVhL8B0mvy4534rUcpFtCei5wIvD1iFgOPE3ipY1Ksnrt2cBRwCLg\nEAr//CzXDeMmVeFYR1yXpIuAvcC1xUMVnpbstUhaAFwEfKbS6QrHkr2WzFzgMOAkYDXwbUmiM6/l\nIN0S0ruB3RFxV/b4Jgqh/cviP9Wy/z5e5fWpOAO4PyJGI2IC2AD8Zwr/5CxudbYYeCSvBjag2mew\nGziy5HkdcV2S3ge8DTgvnp9k0GnX8koKHYFtkh6g0N67Jb2UzrsWKLR5Q1ai+TGwj8JCS514LQfp\nipCOiMeAhyUVt/4+HbgHuAV4X3bsfcB3c2hePR4CTpK0IOsJFK/jDuBd2XM64TpKVfsMbgHem40m\nOAn4VbEskipJZwKfBM6KiD0lp24BzpU0X9JRFG66/Ti
PNtYiIrZHxBERsTQillIIsxOzP0cd97kA\nQxTu2yDpNcA8CivhddTnUlVEdMUXcAKwGfgphQ/tMOA/URhR8PPsvy/Ju501XMclwL3Az4CrgfkU\nau4/pnDj40Zgft7trNL26ynU0ico/MH/YLXPgMI/Rb8G/BuwncKIltyvYZpruY9CjXNr9vWXJc+/\nKLuWXcCb827/dNdSdv4B4PAO/lzmAddkf2buBk7rhM+l1i9PCzczS1hXlDvMzLqVQ9rMLGEOaTOz\nhDmkzcwS5pA2M0uYQ9rMLGEOabOMpJdJukHSZkn/KumOvNtkNnf6p5jNGlcDV0TEegBJx+XcHjNP\nZjEDyNYZfhZYHIXp0WZJcLnDDIiISeAHFBYd+itJJxfPdcKOPta9HNJmz3sz8E4K63h/X1Jxg4Iv\n59ckm+1ckzbLRKH29yPgR1nv+XhJzwBHS/p4RHwp3xbabOSetBkgaUVxg19JRwBvoLCn4RPANQ5o\ny4tD2qzgXcBOSduA7wH/IyL+BTgeKN9iyqxtXO4wAyLij6ucegL4kKQnImJnO9tkBh6CZ2aWNJc7\nzMwS5pA2M0uYQ9rMLGEOaTOzhDmkzcwS5pA2M0uYQ9rMLGEOaTOzhDmkzcwS9v8BeGVsUhWvQbwA\nAAAASUVORK5CYII=\n", 444 | "text/plain": [ 445 | "" 446 | ] 447 | }, 448 | "metadata": {}, 449 | "output_type": "display_data" 450 | } 451 | ], 452 | "source": [ 453 | "np.random.seed(42)\n", 454 | "strike_k = 95\n", 455 | "test_vol = 0.2\n", 456 | "test_mu = 0.03\n", 457 | "dt = 0.01\n", 458 | "rfr = 0.05\n", 459 | "num_paths = 100\n", 460 | "num_periods = 252\n", 461 | "\n", 462 | "hMC = DiscreteBlackScholes(100, strike_k, test_vol, 1., rfr, test_mu, num_periods, num_paths)\n", 463 | "hMC.gen_paths()\n", 464 | "\n", 465 | "t = hMC.numSteps - 1\n", 466 | "piNext = hMC.bVals[:, t+1] + 0.1 * hMC.sVals[:, t+1]\n", 467 | "pi_hat = piNext - np.mean(piNext)\n", 468 | "\n", 469 | "A_mat = hMC.function_A_vec(t)\n", 470 | "B_vec = hMC.function_B_vec(t, pi_hat)\n", 471 | "phi = np.dot(np.linalg.inv(A_mat), B_vec)\n", 472 | "opt_hedge = np.dot(hMC.data[t, :, :], phi)\n", 473 | "\n", 474 | "# plot the results\n", 475 | "fig = plt.figure(figsize=(12,4))\n", 476 | "ax1 = fig.add_subplot(121)\n", 477 | "\n", 478 | "ax1.scatter(hMC.sVals[:,t], pi_hat)\n", 479 | "ax1.set_title(r'Expected $\\Pi_0$ vs. $S_t$')\n", 480 | "ax1.set_xlabel(r'$S_t$')\n", 481 | "ax1.set_ylabel(r'$\\Pi_0$')" 482 | ] 483 | }, 484 | { 485 | "cell_type": "code", 486 | "execution_count": 79, 487 | "metadata": {}, 488 | "outputs": [ 489 | { 490 | "name": "stdout", 491 | "output_type": "stream", 492 | "text": [ 493 | "Submission successful, please check on the coursera grader page for the status\n" 494 | ] 495 | }, 496 | { 497 | "data": { 498 | "text/plain": [ 499 | "array([ 0.81274895, -3.49043554, 0.69994334, 1.61239986, -0.25153316,\n", 500 | " -3.19082265, 0.8848621 , -2.0380868 , 0.45033564, 3.74872863,\n", 501 | " -0.6568227 , 1.74148929, 0.94314331, -4.19716113, 1.72135256,\n", 502 | " -0.66188482, 6.95675041, -2.20512677, -0.14942482, 0.30067272,\n", 503 | " 3.33419402, 0.68536713, 1.65097153, 2.69898611, 1.22528159,\n", 504 | " 1.47188744, -2.48129898, -0.37360224, 0.81064666, -1.05269459,\n", 505 | " 0.02476551, -1.88267258, 0.11748169, -0.9038195 , 0.69753811,\n", 506 | " -0.54805029, 1.97594593, -0.44331403, 0.62134931, -1.86191032,\n", 507 | " -3.21226413, 2.24508097, -2.23451292, -0.13488281, 3.64364848,\n", 508 | " -0.11270281, -1.15582237, -3.30169455, 1.74454841, -1.10425448,\n", 509 | " 2.10192819, 1.80570507, -1.68587001, -1.42113397, -2.70292006,\n", 510 | " 0.79454199, -2.05396827, 3.13973887, -1.08786662, 0.42347686,\n", 511 | " 1.32787012, 0.55924965, -3.54140814, -3.70258632, 2.14853641,\n", 512 | " 1.11495458, 3.69639676, 0.62864736, -2.62282995, -0.05315552,\n", 513 | " 1.05789698, 1.8023196 , -3.35217374, -2.30436466, -2.68609519,\n", 514 | " 0.95284884, -1.35963013, -0.56273408, -0.08311276, 0.79044269,\n", 515 | " 0.46247485, -1.04921463, -2.18122285, 1.82920128, 1.05635272,\n", 516 | " 0.90161346, -1.93870347, -0.37549305, -1.96383274, 1.9772888 ,\n", 517 | " 
-1.37386984, 0.95230068, 0.88842589, -1.42214528, -2.60256696,\n", 518 | " -1.53509699, 4.47491253, 4.87735375, -0.19068803, -1.08711941])" 519 | ] 520 | }, 521 | "execution_count": 79, 522 | "metadata": {}, 523 | "output_type": "execute_result" 524 | } 525 | ], 526 | "source": [ 527 | "### GRADED PART (DO NOT EDIT) ###\n", 528 | "\n", 529 | "part_1 = list(pi_hat)\n", 530 | "try:\n", 531 | " part1 = \" \".join(map(repr, part_1))\n", 532 | "except TypeError:\n", 533 | " part1 = repr(part_1)\n", 534 | "submissions[all_parts[0]]=part1\n", 535 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:1],all_parts,submissions)\n", 536 | "pi_hat\n", 537 | "### GRADED PART (DO NOT EDIT) ###" 538 | ] 539 | }, 540 | { 541 | "cell_type": "code", 542 | "execution_count": 80, 543 | "metadata": {}, 544 | "outputs": [ 545 | { 546 | "name": "stdout", 547 | "output_type": "stream", 548 | "text": [ 549 | "X.shape = (50000, 7)\n", 550 | "X_min, X_max = 2.96880459823 6.37164911461\n", 551 | "num_basis = 12\n", 552 | "dim self.data = (7, 50000, 12)\n", 553 | "\n", 554 | "Time Cost of basis expansion: 75.7322986125946 seconds\n", 555 | "Option value = 13.1083498903\n", 556 | "Option value variance = 5.17079676287\n", 557 | "Option delta = -0.356133676933\n", 558 | "BS value 13.1458939003\n" 559 | ] 560 | } 561 | ], 562 | "source": [ 563 | "# input parameters\n", 564 | "s0 = 100.0\n", 565 | "strike = 100.0\n", 566 | "r = 0.05\n", 567 | "mu = 0.07 # 0.05\n", 568 | "vol = 0.4\n", 569 | "T = 1.0\n", 570 | "\n", 571 | "# Simulation Parameters\n", 572 | "numPaths = 50000 # number of Monte Carlo trials\n", 573 | "numSteps = 6\n", 574 | "\n", 575 | "# create the class object\n", 576 | "hMC = DiscreteBlackScholes(s0, strike, vol, T, r, mu, numSteps, numPaths)\n", 577 | "\n", 578 | "# calculation\n", 579 | "hMC.gen_paths()\n", 580 | "hMC.seed_intrinsic()\n", 581 | "option_val, delta, option_val_variance = hMC.roll_backward()\n", 582 | "bs_call_value = bs_put(0, s0, K=strike, r=r, sigma=vol, T=T)\n", 583 | "print('Option value = ', option_val)\n", 584 | "print('Option value variance = ', option_val_variance)\n", 585 | "print('Option delta = ', delta) \n", 586 | "print('BS value', bs_call_value)" 587 | ] 588 | }, 589 | { 590 | "cell_type": "code", 591 | "execution_count": 81, 592 | "metadata": {}, 593 | "outputs": [ 594 | { 595 | "name": "stdout", 596 | "output_type": "stream", 597 | "text": [ 598 | "Submission successful, please check on the coursera grader page for the status\n" 599 | ] 600 | }, 601 | { 602 | "data": { 603 | "text/plain": [ 604 | "13.108349890270032" 605 | ] 606 | }, 607 | "execution_count": 81, 608 | "metadata": {}, 609 | "output_type": "execute_result" 610 | } 611 | ], 612 | "source": [ 613 | "### GRADED PART (DO NOT EDIT) ###\n", 614 | "part2 = str(option_val)\n", 615 | "submissions[all_parts[1]]=part2\n", 616 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:2],all_parts,submissions)\n", 617 | "option_val\n", 618 | "### GRADED PART (DO NOT EDIT) ###" 619 | ] 620 | }, 621 | { 622 | "cell_type": "code", 623 | "execution_count": 82, 624 | "metadata": {}, 625 | "outputs": [ 626 | { 627 | "name": "stdout", 628 | "output_type": "stream", 629 | "text": [ 630 | "X.shape = (50000, 7)\n", 631 | "X_min, X_max = 2.96880459823 6.37164911461\n", 632 | "num_basis = 12\n", 633 | "dim self.data = (7, 50000, 12)\n", 634 | "\n", 635 | "Time Cost of basis expansion: 70.51716732978821 seconds\n" 636 | ] 637 | }, 638 | { 639 | "data": { 640 | "text/plain": [ 641 | "array([ 
6.70326307, 8.59543726, 10.74614496, 13.1458939 ,\n", 642 | " 15.78197485, 18.63949388])" 643 | ] 644 | }, 645 | "execution_count": 82, 646 | "metadata": {}, 647 | "output_type": "execute_result" 648 | } 649 | ], 650 | "source": [ 651 | "strikes = np.linspace(85, 110, 6)\n", 652 | "results = [None] * len(strikes)\n", 653 | "bs_prices = np.zeros(len(strikes))\n", 654 | "bs_deltas = np.zeros(len(strikes))\n", 655 | "numPaths = 50000\n", 656 | "hMC = DiscreteBlackScholes(s0, strike, vol, T, r, mu, numSteps, numPaths)\n", 657 | "hMC.gen_paths()\n", 658 | "for ix, k_strike in enumerate(strikes):\n", 659 | " hMC.seed_intrinsic(k_strike)\n", 660 | " results[ix] = hMC.roll_backward()\n", 661 | " bs_prices[ix] = bs_put(0, s0, K=k_strike, r=r, sigma=vol, T=T)\n", 662 | " bs_deltas[ix] = norm.cdf(d1(s0, K=k_strike, r=r, sigma=vol, T=T)) - 1\n", 663 | "bs_prices" 664 | ] 665 | }, 666 | { 667 | "cell_type": "code", 668 | "execution_count": 83, 669 | "metadata": { 670 | "collapsed": true 671 | }, 672 | "outputs": [], 673 | "source": [ 674 | "mc_prices = np.array([x[0] for x in results])\n", 675 | "mc_deltas = np.array([x[1] for x in results])\n", 676 | "price_variances = np.array([x[-1] for x in results])\n", 677 | "prices_diff = mc_prices - bs_prices\n", 678 | "deltas_diff = mc_deltas - bs_deltas\n", 679 | "# price_variances" 680 | ] 681 | }, 682 | { 683 | "cell_type": "code", 684 | "execution_count": 84, 685 | "metadata": {}, 686 | "outputs": [ 687 | { 688 | "name": "stdout", 689 | "output_type": "stream", 690 | "text": [ 691 | "Submission successful, please check on the coursera grader page for the status\n" 692 | ] 693 | }, 694 | { 695 | "data": { 696 | "text/plain": [ 697 | "array([-0.03641513, -0.0403414 , -0.03996596, -0.03754401, -0.03240004,\n", 698 | " -0.02997067])" 699 | ] 700 | }, 701 | "execution_count": 84, 702 | "metadata": {}, 703 | "output_type": "execute_result" 704 | } 705 | ], 706 | "source": [ 707 | "### GRADED PART (DO NOT EDIT) ###\n", 708 | "\n", 709 | "part_3 = list(prices_diff)\n", 710 | "try:\n", 711 | " part3 = \" \".join(map(repr, part_3))\n", 712 | "except TypeError:\n", 713 | " part3 = repr(part_3)\n", 714 | "submissions[all_parts[2]]=part3\n", 715 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:3],all_parts,submissions)\n", 716 | "prices_diff\n", 717 | "### GRADED PART (DO NOT EDIT) ###" 718 | ] 719 | }, 720 | { 721 | "cell_type": "code", 722 | "execution_count": 85, 723 | "metadata": {}, 724 | "outputs": [ 725 | { 726 | "name": "stdout", 727 | "output_type": "stream", 728 | "text": [ 729 | "Submission successful, please check on the coursera grader page for the status\n" 730 | ] 731 | }, 732 | { 733 | "data": { 734 | "text/plain": [ 735 | "array([ 0.01279805, 0.01416023, 0.01532697, 0.01645686, 0.01715332,\n", 736 | " 0.01780665])" 737 | ] 738 | }, 739 | "execution_count": 85, 740 | "metadata": {}, 741 | "output_type": "execute_result" 742 | } 743 | ], 744 | "source": [ 745 | "### GRADED PART (DO NOT EDIT) ###\n", 746 | "part_4 = list(deltas_diff)\n", 747 | "try:\n", 748 | " part4 = \" \".join(map(repr, part_4))\n", 749 | "except TypeError:\n", 750 | " part4= repr(part_4)\n", 751 | "submissions[all_parts[3]]=part4\n", 752 | "grading.submit(COURSERA_EMAIL, COURSERA_TOKEN, assignment_key,all_parts[:4],all_parts,submissions)\n", 753 | "deltas_diff\n", 754 | "### GRADED PART (DO NOT EDIT) ###" 755 | ] 756 | }, 757 | { 758 | "cell_type": "code", 759 | "execution_count": null, 760 | "metadata": { 761 | "collapsed": true 762 | }, 763 | "outputs": 
[], 764 | "source": [] 765 | } 766 | ], 767 | "metadata": { 768 | "coursera": { 769 | "course_slug": "reinforcement-learning-in-finance" 770 | }, 771 | "kernelspec": { 772 | "display_name": "Python 3", 773 | "language": "python", 774 | "name": "python3" 775 | }, 776 | "language_info": { 777 | "codemirror_mode": { 778 | "name": "ipython", 779 | "version": 3 780 | }, 781 | "file_extension": ".py", 782 | "mimetype": "text/x-python", 783 | "name": "python", 784 | "nbconvert_exporter": "python", 785 | "pygments_lexer": "ipython3", 786 | "version": "3.6.2" 787 | } 788 | }, 789 | "nbformat": 4, 790 | "nbformat_minor": 2 791 | } 792 | 
--------------------------------------------------------------------------------
/Week2/Week2:
--------------------------------------------------------------------------------
1 | Week2 is about Igor Halperin's paper on the QLBS model.
2 | 
--------------------------------------------------------------------------------
/Week3/Week3:
--------------------------------------------------------------------------------
1 | Fitted Q-Learning
2 | 
--------------------------------------------------------------------------------
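The Part 1 instructions in discrete_black_scholes_m3_ex1_v3.ipynb define the optimal hedge through the normal equations phi_t = A_t^{-1} B_t, with A_t and B_t built from basis functions of the state X_t. The sketch below is not part of the repository; it is a minimal restatement of that single regression step in plain NumPy, assuming simple polynomial features in place of the notebook's B-spline basis and dropping the risk-aversion term (the notebook itself sets self.coef = 0). The names poly_features and regress_hedge are ad hoc.

```python
import numpy as np

def poly_features(x, degree=3):
    """Feature matrix Phi with columns 1, x, x^2, ..., x^degree."""
    return np.vander(x, N=degree + 1, increasing=True)

def regress_hedge(X_t, delta_S_hat, Pi_hat_next, reg_param=1e-3):
    """
    Solve A phi = B for the basis coefficients of the hedge u_t(X_t), where
      A_nm = sum_k Phi_n(X_t^k) Phi_m(X_t^k) (dS_hat_t^k)^2  + reg_param * I
      B_n  = sum_k Phi_n(X_t^k) Pi_hat_{t+1}^k dS_hat_t^k
    (the 1/(2*gamma*lambda) term is omitted, matching self.coef = 0).
    """
    Phi = poly_features(X_t)
    A = Phi.T @ (Phi * (delta_S_hat ** 2)[:, None]) + reg_param * np.eye(Phi.shape[1])
    B = Phi.T @ (Pi_hat_next * delta_S_hat)
    coef = np.linalg.solve(A, B)   # solve the linear system directly
    return Phi @ coef              # hedge ratio u_t(X_t^k) for every path k

# Toy usage with synthetic cross-sectional data for one time slice:
# the "true" hedge ratio is 0.4 regardless of the state.
rng = np.random.default_rng(0)
X_t = rng.normal(4.6, 0.1, size=1000)            # state variable per path
delta_S_hat = rng.normal(0.0, 2.0, size=1000)    # de-meaned discounted stock move
Pi_hat_next = 0.4 * delta_S_hat + rng.normal(0.0, 0.5, size=1000)
u_t = regress_hedge(X_t, delta_S_hat, Pi_hat_next)
print(u_t[:5])   # values close to 0.4
```

Using np.linalg.solve instead of explicitly inverting A (as roll_backward does with np.linalg.inv) is a small numerical-stability improvement; the result is the same up to rounding when the regularized A is well conditioned.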
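As a standalone sanity check on the numbers printed in the notebook (a Monte Carlo option value of about 13.11 against a Black-Scholes value of about 13.15), here is a hedged sketch that simulates the same GBM dynamics and delta-hedges a short put at discrete dates. It deliberately uses the analytic Black-Scholes delta rather than the notebook's regression-based hedge, so it is a benchmark rather than a reimplementation; the names simulate_gbm and hedging_error are invented for this example.

```python
import numpy as np
from scipy.stats import norm

def bs_put_price(t, S, K, r, sigma, T):
    """Black-Scholes put price at time t for spot S."""
    tau = T - t
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * np.sqrt(tau))
    d2 = d1 - sigma * np.sqrt(tau)
    return K * np.exp(-r * tau) * norm.cdf(-d2) - S * norm.cdf(-d1)

def bs_put_delta(t, S, K, r, sigma, T):
    """Black-Scholes put delta at time t for spot S."""
    tau = T - t
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * np.sqrt(tau))
    return norm.cdf(d1) - 1.0

def simulate_gbm(s0, mu, sigma, T, num_steps, num_paths, seed=42):
    """Paths of S_{t+1} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z)."""
    rng = np.random.default_rng(seed)
    dt = T / num_steps
    z = rng.standard_normal((num_paths, num_steps))
    log_increments = (mu - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z
    log_paths = np.cumsum(log_increments, axis=1)
    return np.hstack([np.full((num_paths, 1), s0), s0 * np.exp(log_paths)])

def hedging_error(s0=100.0, strike=100.0, r=0.05, mu=0.07, sigma=0.4,
                  T=1.0, num_steps=6, num_paths=50000):
    """P&L statistics of a short put hedged with the BS delta at each step."""
    dt = T / num_steps
    paths = simulate_gbm(s0, mu, sigma, T, num_steps, num_paths)
    premium = bs_put_price(0.0, s0, strike, r, sigma, T)
    delta = bs_put_delta(0.0, s0, strike, r, sigma, T)
    cash = premium - delta * s0                      # premium received, stock position financed in cash
    for step in range(1, num_steps):
        t = step * dt
        cash = cash * np.exp(r * dt)                 # cash accrues the risk-free rate
        new_delta = bs_put_delta(t, paths[:, step], strike, r, sigma, T)
        cash = cash - (new_delta - delta) * paths[:, step]  # rebalance the stock holding
        delta = new_delta
    cash = cash * np.exp(r * dt)
    payoff = np.maximum(strike - paths[:, -1], 0.0)
    pnl = cash + delta * paths[:, -1] - payoff       # terminal portfolio minus option payoff
    return premium, pnl.mean(), pnl.std()

if __name__ == "__main__":
    premium, mean_pnl, std_pnl = hedging_error()
    print("BS put premium    :", premium)    # about 13.15 for these inputs
    print("mean hedging P&L  :", mean_pnl)   # close to zero
    print("std of hedging P&L:", std_pnl)    # shrinks as num_steps grows
```

With only six rebalancing dates and sigma = 0.4 the hedge is coarse, so the P&L standard deviation stays at a few units of currency; that is the analogue of the "Option value variance" of about 5.2 reported in the notebook (a figure computed with np.std, so it is in fact a standard deviation). In both cases the spread should shrink roughly like one over the square root of the number of time steps.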