├── DRL_GBM.ipynb
├── LICENSE
├── README.md
├── ddpg_agent.py
├── main.py
├── model.py
└── report.pdf
/DRL_GBM.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
20 | {
21 | "cell_type": "markdown",
22 | "metadata": {},
23 | "source": [
24 | "# Deep Reinforcement Learning for Optimal Execution of Portfolio Transactions "
25 | ]
26 | },
27 | {
28 | "cell_type": "markdown",
29 | "metadata": {},
30 | "source": [
31 | "# Introduction\n",
32 | "\n",
33 | "This notebook demonstrates how to use Deep Reinforcement Learning (DRL) for optimizing the execution of large portfolio transactions. We begin with a brief review of reinforcement learning and actor-critic methods. Then, you will use an actor-critic method to generate optimal trading strategies that maximize profit when liquidating a block of shares. \n",
34 | "\n",
35 | "# Actor-Critic Methods\n",
36 | "\n",
37 | "In reinforcement learning, an agent makes observations and takes actions within an environment, and in return it receives rewards. Its objective is to learn to act in a way that will maximize its expected long-term rewards. \n",
38 | "\n",
39 | " \n",
40 | "\n",
41 | " \n",
42 | " Fig 1. - Reinforcement Learning. \n",
43 | " \n",
44 | " \n",
45 | "\n",
46 | "There are several types of RL algorithms, and they can be divided into three groups:\n",
47 | "\n",
48 | "- **Critic-Only**: Critic-Only methods, also known as Value-Based methods, first find the optimal value function and then derive an optimal policy from it. \n",
49 | "\n",
50 | "\n",
51 | "- **Actor-Only**: Actor-Only methods, also known as Policy-Based methods, search directly for the optimal policy in policy space. This is typically done by using a parameterized family of policies over which optimization procedures can be used directly. \n",
52 | "\n",
53 | "\n",
54 | "- **Actor-Critic**: Actor-Critic methods combine the advantages of actor-only and critic-only methods. In this method, the critic learns the value function and uses it to determine how the actor's policy parramerters should be changed. In this case, the actor brings the advantage of computing continuous actions without the need for optimization procedures on a value function, while the critic supplies the actor with knowledge of the performance. Actor-critic methods usually have good convergence properties, in contrast to critic-only methods. The **Deep Deterministic Policy Gradients (DDPG)** algorithm is one example of an actor-critic method.\n",
55 | "\n",
56 | " \n",
57 | "\n",
58 | " \n",
59 | " Fig 2. - Actor-Critic Reinforcement Learning. \n",
60 | " \n",
61 | " \n",
62 | "\n",
63 | "In this notebook, we will use DDPG to determine the optimal execution of portfolio transactions. In other words, we will use the DDPG algorithm to solve the optimal liquidation problem. But before we can apply the DDPG algorithm we first need to formulate the optimal liquidation problem so that in can be solved using reinforcement learning. In the next section we will see how to do this. "
64 | ]
65 | },
66 | {
67 | "cell_type": "markdown",
68 | "metadata": {},
69 | "source": [
70 | "# Modeling Optimal Execution as a Reinforcement Learning Problem\n",
71 | "\n",
72 | "As we learned in the previous lessons, the optimal liquidation problem is a minimization problem, *i.e.* we need to find the trading list that minimizes the implementation shortfall. In order to solve this problem through reinforcement learning, we need to restate the optimal liquidation problem in terms of **States**, **Actions**, and **Rewards**. Let's start by defining our States.\n",
73 | "\n",
74 | "### States\n",
75 | "\n",
76 | "The optimal liquidation problem entails that we sell all our shares within a given time frame. Therefore, our state vector must contain some information about the time remaining, or what is equivalent, the number trades remaning. We will use the latter and use the following features to define the state vector at time $t_k$:\n",
77 | "\n",
78 | "\n",
79 | "$$\n",
80 | "[r_{k-5},\\, r_{k-4},\\, r_{k-3},\\, r_{k-2},\\, r_{k-1},\\, r_{k},\\, m_{k},\\, i_{k}]\n",
81 | "$$\n",
82 | "\n",
83 | "where:\n",
84 | "\n",
85 | "- $r_{k} = \\log\\left(\\frac{\\tilde{S}_k}{\\tilde{S}_{k-1}}\\right)$ is the log-return at time $t_k$\n",
86 | "\n",
87 | "\n",
88 | "- $m_{k} = \\frac{N_k}{N}$ is the number of trades remaining at time $t_k$ normalized by the total number of trades.\n",
89 | "\n",
90 | "\n",
91 | "- $i_{k} = \\frac{x_k}{X}$ is the remaining number of shares at time $t_k$ normalized by the total number of shares.\n",
92 | "\n",
93 | "The log-returns capture information about stock prices before time $t_k$, which can be used to detect possible price trends. The number of trades and shares remaining allow the agent to learn to sell all the shares within a given time frame. It is important to note that in real world trading scenarios, this state vector can hold many more variables. \n",
94 | "\n",
95 | "### Actions\n",
96 | "\n",
97 | "Since the optimal liquidation problem only requires us to sell stocks, it is reasonable to define the action $a_k$ to be the number of shares to sell at time $t_{k}$. However, if we start with millions of stocks, intepreting the action directly as the number of shares to sell at each time step can lead to convergence problems, because, the agent will need to produce actions with very high values. Instead, we will interpret the action $a_k$ as a **percentage**. In this case, the actions produced by the agent will only need to be between 0 and 1. Using this interpretation, we can determine the number of shares to sell at each time step using:\n",
98 | "\n",
99 | "$$\n",
100 | "n_k = a_k \\times x_k\n",
101 | "$$\n",
102 | "\n",
103 | "where $x_k$ is the number of shares remaining at time $t_k$.\n",
104 | "\n",
105 | "### Rewards\n",
106 | "\n",
107 | "Defining the rewards is trickier than defining states and actions, since the original problem is a minimization problem. One option is to use the difference between two consecutive utility functions. Remeber the utility function is given by:\n",
108 | "\n",
109 | "$$\n",
110 | "U(x) = E(x) + λ V(x)\n",
111 | "$$\n",
112 | "\n",
113 | "After each time step, we compute the utility using the equations for $E(x)$ and $V(x)$ from the Almgren and Chriss model for the remaining time and inventory while holding parameter λ constant. Denoting the optimal trading trajectory computed at time $t$ as $x^*_t$, we define the reward as: \n",
114 | "\n",
115 | "$$\n",
116 | "R_{t} = {{U_t(x^*_t) - U_{t+1}(x^*_{t+1})}\\over{U_t(x^*_t)}}\n",
117 | "$$\n",
118 | "\n",
119 | "Where we have normalized the difference to train the actor-critic model easier."
120 | ]
121 | },
122 | {
123 | "cell_type": "markdown",
124 | "metadata": {},
125 | "source": [
126 | "# Simulation Environment\n",
127 | "\n",
128 | "In order to train our DDPG algorithm we will use a very simple simulated trading environment. This environment simulates stock prices that follow a discrete arithmetic random walk and that the permanent and temporary market impact functions are linear functions of the rate of trading, just like in the Almgren and Chriss model. This simple trading environment serves as a starting point to create more complex trading environments. You are encouraged to extend this simple trading environment by adding more complexity to simulte real world trading dynamics, such as book orders, network latencies, trading fees, etc... \n",
129 | "\n",
130 | "The simulated enviroment is contained in the **syntheticChrissAlmgren.py** module. You are encouraged to take a look it and modify its parameters as you wish. Let's take a look at the default parameters of our simulation environment. We have set the intial stock price to be $S_0 = 50$, and the total number of shares to sell to one million. This gives an initial portfolio value of $\\$50$ Million dollars. We have also set the trader's risk aversion to $\\lambda = 10^{-6}$.\n",
131 | "\n",
132 | "The stock price will have 12\\% annual volatility, a [bid-ask spread](https://www.investopedia.com/terms/b/bid-askspread.asp) of 1/8 and an average daily trading volume of 5 million shares. Assuming there are 250 trading days in a year, this gives a daily volatility in stock price of $0.12 / \\sqrt{250} \\approx 0.8\\%$. We will use a liquiditation time of $T = 60$ days and we will set the number of trades $N = 60$. This means that $\\tau=\\frac{T}{N} = 1$ which means we will be making one trade per day. \n",
133 | "\n",
134 | "For the temporary cost function we will set the fixed cost of selling to be 1/2 of the bid-ask spread, $\\epsilon = 1/16$. we will set $\\eta$ such that for each one percent of the daily volume we trade, we incur a price impact equal to the bid-ask\n",
135 | "spread. For example, trading at a rate of $5\\%$ of the daily trading volume incurs a one-time cost on each trade of 5/8. Under this assumption we have $\\eta =(1/8)/(0.01 \\times 5 \\times 10^6) = 2.5 \\times 10^{-6}$.\n",
136 | "\n",
137 | "For the permanent costs, a common rule of thumb is that price effects become significant when we sell $10\\%$ of the daily volume. If we suppose that significant means that the price depression is one bid-ask spread, and that the effect is linear for smaller and larger trading rates, then we have $\\gamma = (1/8)/(0.1 \\times 5 \\times 10^6) = 2.5 \\times 10^{-7}$. \n",
138 | "\n",
139 | "The tables below summarize the default parameters of the simulation environment"
140 | ]
141 | },
142 | {
143 | "cell_type": "code",
144 | "execution_count": 1,
145 | "metadata": {},
146 | "outputs": [],
147 | "source": [
148 | "import utils\n",
149 | "\n",
150 | "# Get the default financial and AC Model parameters\n",
151 | "financial_params, ac_params = utils.get_env_param()"
152 | ]
153 | },
154 | {
155 | "cell_type": "code",
156 | "execution_count": 2,
157 | "metadata": {},
158 | "outputs": [
159 | {
160 | "data": {
161 | "text/html": [
162 | "\n",
163 | "Financial Parameters \n",
164 | "\n",
165 | " Annual Volatility: 12% Bid-Ask Spread: 0.125 \n",
166 | " \n",
167 | "\n",
168 | " Daily Volatility: 0.8% Daily Trading Volume: 5,000,000 \n",
169 | " \n",
170 | "
"
171 | ],
172 | "text/plain": [
173 | ""
174 | ]
175 | },
176 | "execution_count": 2,
177 | "metadata": {},
178 | "output_type": "execute_result"
179 | }
180 | ],
181 | "source": [
182 | "financial_params"
183 | ]
184 | },
185 | {
186 | "cell_type": "code",
187 | "execution_count": 3,
188 | "metadata": {},
189 | "outputs": [
190 | {
191 | "data": {
192 | "text/html": [
193 | "\n",
194 | "Almgren and Chriss Model Parameters \n",
195 | "\n",
196 | " Total Number of Shares to Sell: 1,000,000 Fixed Cost of Selling per Share: $0.062 \n",
197 | " \n",
198 | "\n",
199 | " Starting Price per Share: $50.00 Trader's Risk Aversion: 1e-06 \n",
200 | " \n",
201 | "\n",
202 | " Price Impact for Each 1% of Daily Volume Traded: $2.5e-06 Permanent Impact Constant: 2.5e-07 \n",
203 | " \n",
204 | "\n",
205 | " Number of Days to Sell All the Shares: 60 Single Step Variance: 0.144 \n",
206 | " \n",
207 | "\n",
208 | " Number of Trades: 60 Time Interval between trades: 1.0 \n",
209 | " \n",
210 | "
"
211 | ],
212 | "text/plain": [
213 | ""
214 | ]
215 | },
216 | "execution_count": 3,
217 | "metadata": {},
218 | "output_type": "execute_result"
219 | }
220 | ],
221 | "source": [
222 | "ac_params"
223 | ]
224 | },
225 | {
226 | "cell_type": "code",
227 | "execution_count": 8,
228 | "metadata": {},
229 | "outputs": [],
230 | "source": [
231 | "import importlib\n",
232 | "import syntheticChrissAlmgren as sca"
233 | ]
234 | },
235 | {
236 | "cell_type": "code",
237 | "execution_count": 14,
238 | "metadata": {},
239 | "outputs": [
240 | {
241 | "data": {
242 | "text/plain": [
243 | ""
244 | ]
245 | },
246 | "execution_count": 14,
247 | "metadata": {},
248 | "output_type": "execute_result"
249 | }
250 | ],
251 | "source": [
252 | "importlib.reload(sca)"
253 | ]
254 | },
262 | {
263 | "cell_type": "code",
264 | "execution_count": 15,
265 | "metadata": {},
266 | "outputs": [
267 | {
268 | "name": "stdout",
269 | "output_type": "stream",
270 | "text": [
271 | "Average Stock Price: $54.48\n",
272 | "Standard Deviation in Stock Price: $6.42\n"
273 | ]
274 | },
275 | {
276 | "data": {
277 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAaIAAAEKCAYAAABQRFHsAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzsnXncHtPZ+L9XFlFEFjsJQUIRa6zV1lYEtZVWtEqUKrVEy9sWbyUoL16tolpCra2ltaZqqdL0R2uL5Q2iJfagaotGCFmu3x9nxnOeeWY5s91zP89zvp/P/bnv+8zMOWdmzpxrrutc5zqiqng8Ho/H0xR9mq6Ax+PxeHo3XhB5PB6Pp1G8IPJ4PB5Po3hB5PF4PJ5G8YLI4/F4PI3iBZHH4/F4GsULIo/H4/E0ihdEHo/H42kUL4g8Ho/H0yj9mq5Ak4iIDyvh8Xg8+XlbVZerKjOvEXk8Ho8nLy9XmZkXRB6Px+NpFC+IPB6Px9MoXhB5PB6Pp1F6tbNCHEOGDGHSpEmMHDmSPn16j5xetGgRM2fOZNKkSbz33ntNV8fj8fQipDevRxTnNXfeeeex+eab069f75PRCxYs4OGHH2bChAlNV8Xj8bQ3j6rqplVlVvsrv4iMEJHx1v9zReSJ4POsiMy2ti20tk1JyG+AiFwvIjNF5CERGWFtOyFI/6eI7FykviNHjuyVQgigX79+jBw5sulqeDyeXkatPa6IHAEcCywRCKNxqvo9a/vRwMbWIR+p6kYZ2R4CvKeqI0VkHHAWsJ+IrAuMA9YDVgb+LCJrqerCPHXuTea4OHr7+Xs8ntZTW68jIgOBU4ADgR8D44G5kd32B67NmfWewJXB7xuAHUREgvTrVPVjVX0RmAlsXqz2Hk8PYvhwGDu26Vp4PInUqREtAhYDlgZQ1ZfsjSKyGrA6cK+VvLiITAMWAGeq6i0x+a4CvBrkuUBE3geWCdIftPabFaR1QkQOAw4rdkqt47LLLuPOO++kb9++iAgnnngi06dP5ytf+QqLL7547vwmT57MZz7zGb75zW/WUFtPWzN9OgweDCJN18TjiaU2QaSqc0XkQOAMYEURGQ2crKofBruMA26ImM5WVdXXRWQN4F4ReVJVn49kHfc0aUp6tF6TgcnQviF+pk+fzv33389vfvMbFltsMWbPns38+fO57rrr2HXXXQsJIk8vZvDgpmvg8aRS64CAqk4BvgqcDSwHHGdtHkfELKeqrwffLwBT6Tx+FDILGA4gIv2AQcC7dnrAMOD1Ck6j5bz99tsMGjSIxRZbDIDBgwdzzz338NZbb3H44Ydz+OGHA3DXXXcxbtw49ttvPy644IJPj//73//OAQccwNe//nWOOOKILvnffPPNHHPMMcybN681J+TxeDwp1KYRichSGJMZwBzgGWBosG1tYAjwgLX/EOBDVf1YRJYFtsYIsChTgIOCY/cF7lVVDbzsrhGRn2GcFUYBD5c5h58OH86zSyxRJosurPXhhxz36qup+2y55ZZceuml7LPPPmy22WbsuOOOjBs3jmuuuYaLLrqIwYMH89Zbb3HBBRdw9dVXM3DgQI4++mimTp3KhhtuyOmnn87kyZNZZZVVeP/99zvl/bvf/Y4HH3yQc84551NB5/F4PE1S5xhRf+BiYFmMQHoF+HqwbX+MY4FtGlsHuFhEFmE0tTNVdQaAiJwKTAs0rF8DV4vITIwmNA5AVZ8Wkd8BMzBjTEfm9ZhrF5ZYYgmuvvpqnnjiCaZNm8aJJ57IUUcd1WmfGTNmMGbMGIYMGQLA2LFjefzxx+nTpw8bb7wxq6xihscGDRr06TF33HEHyy+/POecc06vdVH3eDztR51jRO8BY4N5Ptuq6hXWtkkx+/8dWD8hr5Ot3/Mw5r64/U4HTi9R7U5kaS510rdvX8aMGcOYMWMYOXIkt912W6ftSRORVRVJGJReY401ePbZZ3nzzTc/FVQej6eHsNxysNpqMG1a0zXJTSsmjcwGnmhBOT2Gl156iVdeeeXT/88++ywrrbQSSyyxBHPnGg/40aNH89hjjzF79mwWLlzIXXfdxSabbMIGG2zAY489xmuvvQbQyTS39tprc+KJJ3Lcccfx1ltvtfakPB5PdfTrB9/7nvkOefxxeOSR5upUgtrtM6rqBVFOPvroI/73f/+XDz74gL59+zJs2DBOOukk7rrrLiZMmMCyyy7LRRddxJFHHsnhhx+OqrL11luzzTbbAHDiiSfygx/8AFVlyJAhXHjhhZ/mvdFGGzFhwgSOPfZYLrzwQgZ7jyqPp/tx1FHws59Bnz7w05+atG5s5fCx5iLccccdLLvssk1Upy14++232WWXXZquhqdKwme8p8wj2mwzePhh2HBDM0eqN3LyyXDKKeYzaZJJa+197l6x5jwej6dS9t7bfO+6a7P1aJKe8lIR4AWRx+PpXvSwTtiJPfaANdfsmt5DLFpeEEVYtGhR01VolN5+/p5uRNgJjx8PY8Y0WpXaufVWmDmz6VrUhp9MEmHmzJkMHTq0V86zWbBgATN7cGP39FAuv9x890ZNqYfQ+3rbDCZNmpR/hdb+/c3b2YIF9VauRuwVWj0eT5vTw4SuF0QR3nvvvfwrlPY0rySPx9M98GNEHo+nR7D44nDttTBsWNM16cxmm8Wn+xe+HocXRB5Pb2f33WHcuI6Jke3AMceYuUJf+lLyPj1EG/B4QeTxeEJPyXZaJn70aPM9YkSj1fC0hjZqeR6PpxFCQdS3b7P1cMWb5nocXhB5PL2ddtSIPOn0MGHsW57H09sJBVE7dm5xdWrHsaEDDzT1CtYHaxnteC0K4AWRx9Pb8aa58oRTPlZfvdl6dFO8IPK4s846rTPfjByZ7L7rqZZ2Ns2lvfG3ozbQjnXqBrRhy/O0JaNHw4wZ8N//3ZrynnvOuO92l7f07kwoiEaOhBVXbLYuUeK0n3bUiFpND7sGtQsiERkhIuOt/+eKyBPB51kRmW1tO0hEngs+ByXkN1RE7g72uVtEhgTpIiLni8hMEZkuIpvUfW69inCy45ZbtrbcMI6Ypz5CQTRqFLzxRrN1yYPXPnrMNahVEInIEcBdwGkiMlVEVlTV76nqRqq6EXABcFOw71BgIrAFsDkwMRQyEX4E3KOqo4B7gv8AuwCjgs9hwK9qPDVPq/jmN5uuQc/HR1wvTw/TUFpNbYJIRAYCpwAHAj8GxgNzI7vtD1wb/N4ZuFtV31XV94C7gbExWe8JXBn8vhLYy0q/Sg0PAoNFZKWKTscTPmg95A2sV5LUWXpBVB2tej6qEnwjRsAKK1STVwnq1IgWAYsBSwOo6kuqOifcKCKrAasD9wZJqwCvWsfPCtKirKCqbwR5vgEsn+d4ETlMRKaJyLQiJ9U4u+5qGns487xVtFIQ9eKl2huhHQWRf+Fxo+x1evFF+Ne/qqlLCWoTRKo6F6MNnYExzZ0jIktYu4wDblDVhcH/OBGf5yo7Ha+qk1V10yrXW28pewUK4FZbNVuPOnnrraZr0DPpThp
R2ht/dzCDbb017Lhj/eV0h2vhQK1jRKo6BfgqcDawHHCctXkcHWY5MBrMcOv/MOD1mGzfDE1uwfe/cx7fvfFvip6qaUdB5ELZZ2GppeDeezsvwb3OOsU69+gx998Pf/pTufol5d0DqXOMaKnA/AYwB3gGGBhsWxsYAjxgHXIXsJOIDAmcFHYK0qJMAUKPuoOAW630AwPvuS2B90MTnsfjwV0jUm1+TlErXrh23RW22w7OOMP8X399M0XhpJOK51lHvdMEUQ95Ma2ztfUHLgZuwHjD7QKcF2zbH7hOteMqquq7wGnAI8Hn1CANEblUREJT2pnAjiLyHLBj8B/gduAFYCZwCfDd+k6tDWj1W1IveCurhIEDTedw+OFN18SdOI2oX8NrZrbCNBdd0HLVVc13q6coZNELnr3aWlvg+TZWREYA26rqFda2SQnHXAZcFpN+qPX7HWCHmH0UOLJktetl8cVNo/roo+J5NP0G1HT57U443+qYY+Cii5qtS5SkDi3unvaCzq/SlZU32qh8Hkn0gkm9rdC/ZwNPtKCc9uedd+DDD6vJK60hLrlk82+0vZXQpLVwYfp+TTBxYnvWK45WhPaJCqKi+a5U8yyRvKa5qoTUGmu0zERbeymqOltVvSACWGKJ7H2ycHmL++ADuP328mXZ9LA3sNoIQxK1owPAf/93fMcSd2+feAI22KD+OkXZYgvTxocPT96natNcWZZcsvo8bZp49tZYA55/Hk45pSXF+VhzPZW6XEebMs0NGpS+fdgw04k1TTtrREnEdXSf/WzHIH4r+W4wtLv99tn7qpbrpKs0zVVNnz7Qv7/5nbd+VZzPyiub7222KZ+XA14QdRf69oWdd26u/CYf1nHjYPZs2Hjj5H1eeAEefLB1dUqinTWiphCBK6/M96KQ9sJjt8V2EER1vJxNnQqffGJ++zEiT21suaVxHXXlxBPhzjthp53qq5MLTWhE4TmnDQiHb49NEwqi7q4RVcnQoWbhuD/+Mf+xWeMj7SCI6uALX+j43dQYUQvxI9pN8UAwhcq10ay9tvkOw990w8ZWmKpWEN1yy/q1Ji+IknF5icnrOFCFIKpyQL4dxoi6Yd/gNaLuQnfs4Kqiig5j992N8P/Od6qpU5QVVzQu291xjCiJqjq0KoRFE3m3C94052kbvCAq9/CttZb5HjWqfH3iuOkmOO+8jnLaeYwoKtDr7tTKuEdn1a2nmuZsvEbkAczbripsu21zdQgF0YIF5rupyAppnclyy8H48dWXXUWHEc6rCq9f1SyzjPlebDHz3c4vDK6CqGqNqF1Nc2EeRfOqW7OKq1feMtdZB444opr61IAXRC6EAqgus44LUY2oHQXRDTeYFVVXWy15nyJUIYhCZ4a6BFFId9Bcy7adgw+Gr389/3FFOuyswfg6NKK89bT3b9UYUd4QSI8/Dr/8ZXV1qhgviFwIJ6JWFRWhCN1h7CGcYV61B1uVGtH8+V23bbstXHJJ8bxtwvvUk01zl10Gv/2t+/5l7ltv9ZqzqcJrbsCA6upTA14QudAOgqjMm7aqGcPorhTtMJZaCq6+GoYMiRdERx4JY8fCX/4Chx4an0deusMLg6vTR7ua5qrSiKrMw1MK777tQncSREmdzN57V1ufOOp6oIsKoiOOgAMOgDfe6NBQbNPcL37Ref8+fcprMmFEi+4kiNrJfTvvMe2mEfU09+0WtQ0viFxoJ0GU5axQV5DCPA2y6oexaIcRCoO+fbtevzj69euYzV6UL3+5c9ntSHf3mrPziW5fZx0TI+/667PzrkIQfeMbMHJkcn2qoAn37Ra7tnvTnM2QITBmTNd0L4g6aGLuRdEJrbYgShsjCverMmJ5O48RtaOjS55900xzM2bAddcVq1/IuuuaSOUu/OY3MGlScl5l2WijaiMrtKkZ0gsim6lTYdq0rumhIJo7t6XV6URZ01xZmmzARSe0hteqT5+u7tt77dV1PxdBtPfe8Mor2ft6jahc/i5Cq6yzQpLb9hprGOGy9NLF866Kxx+v9v40vfJuAu1Zq6ZICnv/mc+Y748/bl1dorgG0wz3q4sm50zkfSDDa2VrRAsWwJ57ws03d+yXRxD96ldmiYJw3lASPUEQVS2gqmo7VTkrhNeh3cdUqnDfdtnWILULIhEZISLjI2lfE5EZIvK0iFxjpS8UkSeCz5SE/AaIyPUiMlNEHgpWgA23nRCk/1NEqgtVHd68Js0toYAJI1A3ZZpLo92cFULtJyqIVlih8355BJFrXbqTIKqbPPctz74jRxqvx6JEBVG07CLtuVWCKKSHmOZqdVYQkSOAY4ElAmE0DhgInABsrarvicjy1iEfqWrWmruHAO+p6kgRGQecBewnIusG+a8HrAz8WUTWUtX8PYJI/IBokzfRtfNoB9NcmTffKjUie4zIntAaRj+I7ucy/8l1vKqdx4jKmOa+8pX85YWTX/O0C5d9jzrKfIoSfa6reHbihFlZTbBKwdKm4YJqezUSkYHAKcCBwI+B8cBc4NvAhar6HoCq/jtn1nsCVwa/bwB2EBEJ0q9T1Y9V9UVgJrB5oconPahNCqKoya2nOivUbZor66zgOl7VKkHUt6+JdGDXp08fOOyw5PMp8+Z/443563jWWeY7ru185jOm/k0Q1YjqeHaqMJVXKTxczZo9yGtuEbAYsDSAqr6kqnOAtYC1RORvIvKgiNi69eIiMi1I3ysmT4BVgFeDPBcA7wPL2OkBs4K0TojIYUEZMV4JAUmdfjsJoiTaaTBy+eVhwoR8x5QVRH37dswij/OaW7iwaz5VmOaS8qybo44ykQ6+/e2OtEMOgYsvhu9/P/6YptpIXOd29tmm/uGij618xuoQRNH6FxFE0QUgqxQY0ReWNqG2mqjqXIw2dAZwmoicIyJLYMyBo4Btgf2BS0VkcHDYqqq6KfB14OcismZM1nF3RVPSo/WarKqbBuXEkyWIBg40a9u0ku4oiK68En7+885OIL/8Zfocj7KC6LbbYN488zvOay7uwa1jjKhV92G55cx3uE4VmIXo7O8o7eSsEI7XDRzYucxWvJFHz68Kb8KsPF147LFy9XDViIrUbdw4GDw4e7+c1Pq0qOoU4KvA2cBywHEYTeVWVZ0fmND+iRFMqOrrwfcLwFQgbm3oWcBwABHpBwwC3rXTA4YBrxeqeJZp7uabzdo2oVt3Ej/8Ycfa72WJCqIllzSrtkbT28lZYamlzHfYyYCJdvC1r7nnAXD88cnbotiD17ZpLm38J48gch0jatWKsUU67KYiK6TxrW91/p90Xmuvbeb7VUGWRlSFIOquprk4Ro2Ca681c6cqps4xoqVEJAzDPAd4BuOocAuwXbDPshhT3QsiMkREBljpWwMzYrKeAhwU/N4XuFdVNUgfF3jVrY4Rbg8XqnyWRhROeo0Oetussw6ceaaJSF0F0TpNmgSnn26WYLap21nBdX4HwEcfme/Q/T1POXm3xRHnNSfSNZ9QuFQ5RhRqZeutl77EeZ0k3at2Ms2F7LKL+c66x//4R9c2X5QsZ4W0cdgVV3Qrw85zzT
Xhlltg8cWL1dOlbnnyytsOwud4+PD0/QpQp9dcf+BiYFnMGM4rGJPb68BOIjIDWAj8l6q+IyKfAy4WkUUYAXmmqs4AEJFTgWmBhvVr4GoRmYnRhMYBqOrTIvI7jPBaABxZyGMOkhvkbrvB9Olu5pmwUwu1grJEBVH4xh19824HQRQSdsZ5Hry0a1p0Qqs9bhSXfx2muVdeMd9PPeW2fyupwk25CEW85orWzSVuYJb7dlJ7O+cc+N734ueSpWlE551n+pDtt4fbb0+vW1qeNmXct/M+TzVGK69NEAVecWODeT7bquoV1ubvBx97/78D6yfkdbL1ex7G3Be33+nA6bkq+rnPwb/+BS+80JGWpE7vvrv5vPuu+e9yw+MaSmhqykNSnaKD4u00RlREI0qjjNdcKAzj8gg1pyoE0SefGE05bvvQoR1tpwoGDer4nefatKNprmqKCCLX6xK6sMdFXqhijCgrzzKUcVaoURC1oteaDTzRgnKK8be/wfPPd07LapB5bkicINpzT/f6JdUpJPqwtdMYUdOmuSSNqAqvuaQ2kjRb//jj4Z13YNVV4/NdZRWzxo+r9rjrrjB7NnzhC27727R6GYiQPLHm8hwTh8s5Zpnm+vQx0dS33rpzemiFiAuQ26oxoqLPSZJGtHlkpsv6MTpBdxZEqjpbVdtXEMWR1XhcbkirfPTbUSNqlzEi22vOXhisDvftEHssymaPPcx30uq1555rJn6G0buz+OIXzfdWW7ntb9MdloGowjTnuk+aRvSnP8H993dOz7Pab90aUd6+JEkQPfRQx++11oKLLipfVg7aoNdqQ7Imj+YRRHE3Ly5tySWLuUVGBVFdsebydAjhvuEYUVOCKLzOWaa5PJEVXJ0VqhhMDtl5Zzj66M7OMaEmXKSj6w6CqCx77AGjR6fvU9RZIa2ddCevuaS2E660XFV9HPCCKI6sBplnWYK4hy/Odv366/Dee/nygdZrRHnMK7Nnm+88b+1VCqKQLNNcHq85e9ypStKu6513wvnnm6C7ocdmVBDlcctt5zGhquYRXX89PPlk+j5ZGlHSsxQKIheTWRMvhq4OP71sjKj74aoRpd3ItAYZ95AlhZwfNCjdVbRVgqiIRhRGK19jjWrKiW7bZBNzLddeO33/qCCKzmOqI8RPlRqRTTiROk0jyurMu4NG1ArtqaizQh5BlOcFIYkq749LfZLSvSBqMVU4K0Q7AzvPPA/ZCy+Ypa6TKOKs4GKCuuYaOPnkrunRui+5pPnefntYffX8dYmSRxDtv7/5ThpXiRNE++zTdYA/FETbbJNdv6KCKOueu7aJMF+XEEJlBVETzgp1le1Shut1CU2kLnWM26fMuI5LHscc09mMazseiMCIEeZ4V6ep3/zGjBn5MaIWU8cYkZ1nnoCYSWFaQvJqRGPGGG+ftPD5e+1lOvlTTknP65JL4IMPjGC7556O9KjtPQ95BFGWUAj332abDo3TDoUTEl5DF7f6KjWiPn3yxRWzt0fbUBn37SaJdm5x2twqq1Q3BcAmvA5bbGFeZlxNcyFJGpEd/qsKgZrXa27iRDjpJPN7m23M3Ef7uNBDLmlicDTvb3wDvvMdt7IL0kYtso1wjazsYporqxFlkSWIoo0mdEUNZ7LHYS8al5QPwKGHmu8w3ll037AudSyIZueb540+ri4u3k8hSWNE0bKWW84EfU1j4kQTVywqjNIIr2mZ6N51mOa+9a2u7SBKUdPcrFlwxx3F6pWGfd5/+EP+6xK3fZllTPivtH2qdD5IuqbhC+yoUV3zSjJFutbPC6IWUeUYUZxGVKRjLuqsUESby1uP116LL7Nq01w0vyKCKK4DzxMp21UjOvRQePPN9DqFAmjYMPfyqxBEVXcka64Jv/51ejBbKPdC4mI2zUuWBlREI4pqbvY+Rc+/yP1Kms9mn1N0W7huVFLZdaxoG2ZdeY49gT59YPx4eOYZ87+MDT2u4yrSiSSVlTVGVLUgciEqiIo4OrhsS/NezBpwjcvHBVdBlKfcPn3qGSNKoohGtMwyxmQbR9j5ZmlELtTVLlddtXOw1IEDswVPXF0mTUrfHqVq01yfPvDNb2a3v3B72jlFt/32tx1jvmn1iGpZFVDrCq3dlr594fLLO/5X4TVX1jSXFGBVxNhvH3oInniiPo+oKt/KipYTnReSJlT79CmmEYmk35+igiiOOEHaxBiRy4vWbbclL30SavtZwtF1Tp2dXsbd2Obll+H9981cvdGjjWu3bUKLyz/uukycmL49r9ecSygi+7ijjjIx6+bMST+miCAKtye1hxrHFr1GFIdtRkvrzFxuTFxnWUQQJQ3W9uljPFoefzy+TnnNWUkUESb2w7DxxvDXv5YrZ889O5to0s4l6YFKW48oq3yodh5REY3onHPMIHNa55V1Dnk7lLXXTl9/K7wWWR1qkXZvu9SXfakKY/OF62NF57dV5ayQtY99HRYuNHPEXMuJrt+UdE3De5JWn7ztoEYrihdEcdg3qG9fd40o7e2nrGkuSRAddFDn/3WZ5vI4HkS95kTMgnhhWBowb6NxDhNZ9bLNAmnaSVGNKIuyGpFdpzih5nJfrryytc4KtkdkWn5FNKIo0brY16Zub79Wz6/abjvzffTR6fuVsUbEnVP0+Uwr73OfK1cPR7wgiiPa+F0FUdxAYFUaUVIwzK9GApG7CqK8RM/HZd+khwHMG/bVV5vfG23UIaSyGrtdfhHTXFmNyM7fJk+9Q4poRCFl5hG5vLXbpK27Be4OFEW85myN6Je/TD+mbEeZdU+znJji0tLqdO+9bvVqlbOCa541kDlGJCICfANYQ1VPFZFVgRVVtdiic92BKjWiuH3rfNMq+1ZXxjEjqS5JZrIwLTQrrrWWseO70p1Mc2naWbTdrLsuzIhbE9IiKojKjBGVpcwYURa2IAqnCyRRtSCK/o/GgmzCWSFK1qTlvKa5M86AkSPz16MkLi3yl8BWQDCNnTnAhbXVqB3IMge4CKI8ZrsqcXlrS0tPirpQdozIJc9nn81ezTSPRhSHi7OCS/lVOivY9+z734enn+4alj/p2CJUHVnB1TTnQrRMl7BL0XpkUTTiRNQqUVYjcqVIHt/8pvnO66xw9NHJcwwbdlbYQlWPBOYBBAveZejq3ZwsjSjJWSHuJpfRiNIGiJMoO0aU5p3nSlQQZWlENiNGuJeT5r5dxmsujbrct0PCoKZJy0VEj40jr7NC2c7S1VkhD3GmuSzKnke0rCLu3Wl1Klq/MnnkFUSu9agYl5rMF5G+gJq6yHKAc4sTkREiMj6S9jURmSEiT4vINVb6QSLyXPA5qEtmZp+hInJ3sM/dIjIkSBcROV9EZorIdBHZxLWOXahijCgqiIo0pqh7qQtpNuGk+tjUIYiShEIR4jSiPG6oWeahujWiOGeFuDGirGgPLp2+65t/FlnXpE7TnEtcxJC484oGuE0jy5rgIojaRSMKSXNWyNsOGtaIzgduBpYXkdOB+4EzXDIXkSOAu4DTRGSqiKwoIqOAE4CtVXU94Nhg36HARGALYHNgYihkIvwIu
EdVRwH3BP8BdgFGBZ/DgF+51DEWexCx6BiRiyNDGlW8PaXVvRWCKMs7J448jb0qZ4U8LwlVuW8PH54exTmvmavMNa5qbCWPINpnH/N/v/3i61JEI4qLUJEV7cEmWlaWUMkriFoxdSLr2DJj1TVqRJl3WVV/KyKPAjsAAuylqs9kHSciA4FTgN2BdYCpwFzg+8CFgYkPVf13cMjOwN2q+m5w/N3AWODaSNZ7AtsGv68M8v1hkH6VqirwoIgMFpGVVDUldHWERYu6PqRxnXnHSSb/TzPN/ec/2XWp4o27f3/3xrPyyiZUT9QLL5pvnrfaIqa5PFQ1oTWPIHJ5CcliwAB45ZWO/337dr2uWZ16mfLzvrVn3fO884j++Eez1LkLeQRROC5SlOjLRRHTnMu1rVIjzbp3eZ0V0mhSIxKRLYHXVPVCVf0FMEtEtnDIexFmLGlpAFV9SVXnAGsBa4nI30TkQRHdq3ULAAAgAElEQVQJw0CvArxqHT8rSIuyQihcgu8wsqTr8fmI69CSxohcTXOhuSBNIFWhEQ0a5K4R7bST+Z4wIT7fIvOI8jorJKUlkRXip25BdPrp8KMfpe8fPQ46L1se5lVWENnkHSPKyqcq01xImhAqoxGVJU0jEnETREX2gfTFI6s2zaXVI42Gx4h+BXxg/Z+Lg9lLVecCB2LMeKeJyDkisgRGCxuF0Wr2By4VkcEYbatLNg71C3E6XkQOE5FpIjLNKde+fZPHWYqa5lyoQhBNm9b5/9ix8NnPxucf1i/J5NQqZ4U8b+dFvObiBGkec4V97088Ef7nf9L3j7Ltth0z/OPKDykjiELKrkfkiqtpbsUVjeadRhlB5HoeWdEI4vKL6wfyakRpYzN//3t8nbJIO+c4a479LJZ1fKgQl7ssgbkLAFVdJCJOrUNVp4jIdIx5blPgOIym8qCqzgdeFJF/YgTTLDpMbgDDMGa3KG+GJjcRWQkITXuzgOGR41+PqdNkYDKAiGQLuj59ygmirLkuSVQhiIYN61xHO5R+NP/wIaxDEOVxVshTThHTXBkt7NBD4fOfN7+zvNqSmDixq0YQV36Ws0KZQXH7Hg8YYMar0qjKNLfSSvkncTapEdnPTr9+9WpEaRTtJ7KEZ9565FlpOScuNXlBRI4Rkf7BZwLwQtZBIrKUiIRP6xzgGWAgcAuwXbDPshhT3QsYp4adRGRI4KSwU5AWZQoQetQdBNxqpR8YeM9tCbyfa3woiTwaUZppLsmzK4kqBFGefOrWiFzzLKIR5fGay7oPaeXb2s/PfpZezzTWW6/z/7hrXoVGlIRd3u9/Dz/9afr+RU1zq6/e1estLXLzRhvBzjt3LrOVgihNI1psseoEUd5n+8EHix3bty9suGHntDLOCldemW//HLi05sOBzwGvYbSOLTBeaVn0By4GbsB4w+0CnIcRLu+IyAzgL8B/qeo7gZPCacAjwedUy3HhUhHZNMj3TGBHEXkO2DH4D3A7RqDNBC4BvptZwzSPEjvNdYxo5ZU7Zn+XbZBVOCuklTFoUOexivAhTHrwizgrpJkiqqDIPKIygqjMedjXLartVD1GlIXd4e6+e/F8ovnZdT7wQLPMfTQsT1q9H3+8q9dmu8wjWnLJYoJot93MvbVXBs5779Jc2NPOeccduzofldGIasTFa+7fwLi8GQdecWNFZASwrapeYW3+fvCJHnMZcFlM+qHW73cwHnzRfRQ4Mm89M4nTiMK3jGj6lClmkbA77mhOI3LNZ999zdtpGMmgp40RbbKJGZPIW2Yr7knWGlIulOlEXDv3rJePiy82QvW++8z/8LyGDet4e95++2J1LKIRlb0/0bZv/3cVRNG0733PfNur8FYpANLOeZ110vfvDoJIRH6gqmeLyAXEDPqr6jGOZcwGnihYv+aJGyOyt9mES0P3798eY0SQ/iDbanuSRrT22vDlL8NHH7nXYaedTFyuOia02oTX9vjju277wx/ij6lDI/rKV4wZypU4QRTt7LM6iarGiFxIyvuwwDASTrwOz8t+Afjkk3xlRcvMI4hWKekkGy3Lvk5LLdV1fxeNKNQS7bxbJQDiNCn7pbA7CCLMmA6Am3dZAqravoIo6r0UR5xGFJKW7uI11wpBlLR8RJSf/9x8Rzupv/8dhg6F444z/11Mc2cE853D6Np1aETrr995xU1XksyvLuUnbbvxxnx1yFrePaseSce4UvW4S9Q0Z9+XooIoJE9dDz7YbT9Xrzm77CWX7Poy1g6CKK2dJAkil2NbTOJdVtU/BKF9Rqvqf7WwTq1jzTWz90kTRNEbaZvh2kUjchVEIdGHcemli9enyDwi14d0+vT89UnKP23NlZ//HF5/Hc4+u/O2f/+7QwPOi4sgyroOaW3X1bmgKsJOtkpB1A5ec/b/pZaCjz/uvL2oIKpSAOyyC9x0Ezz3XNdtcdcuzqGqDUitiaouBMa0qC7tSR7TXIhI9hjRE0+0xllhiSXyHZ80uzxMLxpZIY4iD2TR9ZTSyrTD+z//fOdtEybAWWeZ3/Y9iXZKeXAZI8q6/4cfnrwta2zH7qCqiJgd5heeV5WCqGqhmUYdY0RVa0TR/LfZxkStjyNLI+ougijgcRGZIiLfFJGvhJ/aa9YuFDHNPfssfOELndOiDeiTT9pTI0ryHCrSIYR1STLNpR1TF1kP35AhHcswR7HrVuZN3WWMaNVV8+UZd93WWguWWaZr+t57u71YuN6LqCCyl0uYP98tjyRa2VlmaURFvObiBFH//h1R1m1U4aST3OubRTcyzbnc5aHAO8D2mImpuwNfrrNSbUWaRpQUlw3gyMCBb4cd4t+m5s+PbwgnnGC+qxJEkydnH7PSSh2/kzrYIoKojgmtZTUil44tyeQWneCYB7veLhrR5Zebt11XVl2164D6uHHwTExYyF12MVEh4upShLD+cXllTczNogqNyDUMU9oYUdEJreE1sfM6+WQT9SSOn/zEra4uxLXRNnVWcKnJf6nqwZHPt2qvWbvwv/8Lm24av+2QQ9zyWGONeI2oXz8Tb852mgjfiMoKotD5YK21so953QpAkfTgh+nLLedelzpMc2VxefiSztGub57rECVujChOwEYjU6dxwAHw0ENd05PqOWKE+S4r2CG9gy4qiKrsLF3DMKVpRHFavYsgCs/fziuvtluUnqARicjuIvIWMF1EZonI55L27dFsu235POLepkLb+cCBsNlmHemqxl36298uVlbYuOzoznnIEkS77grHHuuWV5ZGJAKTJnVNS6MVGpGLICpDVHNIuuZ5O6x113Xft8pOKJyEGjfRu6wgqtPtP0raGJE97munRUkbI6riXPLkESeIuqGzwunAF1R1ZWAfIGd0x15I2uBwnEYUYs8mVzVzYM48k1IUHSROGsi2H8qkpYSjZGlEYGKv5aXKwd44oktCh2lJazXlJSqI+vWLbztLLllNeXGE16HMSq8hYT3jzqEdTHOupM0jijPR5zHN7bOPCTrcSrI0IttJp2HSnugFqvoPAFV9CBMnzpNG2ptW3BhRiN1gqvIKK9oB2A+fLYjs
h9R1XCF0lEjTiFzS0uqYFxchFleHcB5VFUSvX5KAa4UgqsI0F/XMrPLNv0mvOTsygatGlOSsEMbQy6IKL8aQLEHURqSNuC4vIt9P+q+qJaI+9jL69esqGGyNxW4w4bydotQliOwHbOzYzubEJOwAllUKojIaUVFBVHYQ2e7wo51N1YLI5Rq6aESuhDEL42L/lR3rbNJrzvZgixNEcfcnyTTnysKF5RyDbLKcFdqINEF0CZ21oOh/D3S+qWmmuegDb2tEVZl87PqUtc1DsmkO4OGH3fPM05m4jBHVrRHV3fnFaURJprm66lKlRhQKorg5c2U1miY1Ips401zckhZJzgquZAmu7baLTx8/vmta1hhRG5EWWeGUVlak2+LydhGnESWZ5qqqT9H5G0lRost0CHnewvbYI3ufuseI6n5jjHb+AwZ0b0EUfZGqYkC8CdNcmkt+nEYUR92CKMkSETdfsBuZ5tpTPDZBFQ0+6aHu379rA6hDIxLpiKZdVCNKMiGVEZZ5QvxsvXV6Xq3QiMJ6/eAHxctJY/31O/9PE0StFoo2rmXXoRHVaZpz9ZqL1qeIIMprmiv63L79dte0JNPcj39crIwa8YIopOhMeZeHdcCAZG8aqE4QbbVVx0RF1wZ9WGRpqSTT3BFHFK9XnjEiF8p0Ti6BbsN6haF9qiDtXOPC9YN5y61LIzrgAPNd5RhRnBZTtv7tohGlTWy3id7nvPUv6qwQV7edduqaJpK+OGFDZF5ZERkQkza0nuo0SNE3fldBlDaIWZVpznZ0cBVEF1/c+b/doKvy4EkSREU6qbIakctDWEfnn9ZONtkkeQ7QvHnV18WmyjGiOC2mrEZU9AXxllvyH1OHRpS3/kVfDAZ06abjcemv4pZVqRmXJ+4mEfm0pxSRlYC766tSQ1Q5ThMlbpnhOjQi+0EqquLb16EqQVRlx77MMnDVVfmPi64UmoYI7L9//jKy8kzDXsGz7rJC+vevViPq08csoW5PUi47RlT0+A8+yH9MHWNEeQVx0YgdroLoSIe1Q2fNKlaHErjc5VuA34tI32C11buAE+qsVCNMmJD/mAULOj/0QxMUxQEDOpYQD6lDI9p33851K0sVnRRUO85x4YXuE2ptwrWRXNh+e7jmmvxlpJF1DebMqbY8FxZfvFpnhf79O1YkDekpprmigiguuGkduAqiffbJ3ifuuZ89O199cpJ5ZVX1EowGdAvwB+BwVf2TawEiMkJExlv/x4vIWyLyRPA51Nq20EqfkpDfABG5XkRmishDgXAMt50QpP9TRBxnkAXYS/m60q+fm+q91lpd37Dtm12VILJdOKsQRFXkAaYzadpbJ49QdRlHykvW+VfZYeVxMqjSWSFuocKyprmixxcRsGllfeYzyWN5Nk21c1dB5ELctavZ7TttqXB7MqsAwzErrW4pIlu6TGgVkSOAY4ElAmE0Lth0vaoeFXPIR6q6UUa2hwDvqepIERkHnAXsJyLrBvmvB6wM/FlE1grWVMqmqC3apQHEdWy2RlTlPKKQKoRIlbO8m56/UJV2V5Smzz+OqjQi2zQXpex5t9JrLq0POP10t7ybus9VCqK4Z6VmAZvW+0Ynr96ckB6LiAwETsEsG7EOMBWYm7N+cewJTAp+3wD8QkQkSL9OVT8GXhSRmcDmwANOuRbVSlze2OIaiX2z67jJ7SaImtaI8nS4e+5ZfflNn38ciy9e7RhR3LPQlEZUhJVXLp9HTxVETWlEFUxoXQQsBiwd5PcSgJEZ7CMiXwSeBb6nqq8GxywuItOABcCZqhrn+rIK8GqQ5wIReR9YJkh/0NpvVpDWCRE5DDA+y7Y5pE5BFBdEMyl8TlW0myBqWiOo4s2/DO0oiLJMc673LNTo456FdpvQ+v3v54tM/9RTMHq0+/49QRDFtYma26+L+/bdIjLY+j9ERO7KOk5V5wIHAmcAp4nIOSKyBGacaYSqbgD8GbjSOmxVVd0U+DrwcxFZM65KccWlpEfrNVlVNw3K6aBOQRQ369nu5PNOnBw2LNsNuQpBtN565fMIaVoQtdI0FxfVoh0FUZZG9LnPGe0wq+5pprkyK9lC9WNEP/1pvnzyus+3yxiR6xpMcTSgEbnkvpyqfuoyoarvAQlLWHZGVacAXwXOBpYDjlPVdwLzGZj4dWOs/V8Pvl/AmPLiPAhmYcarEJF+wCDgXTs9YBjwepejkyj6wBTViMp0jP/+N8ycmb5P2SWaq6ZpQdRKjWhujAW66fOPI0sjgq7enkn5QPyzsOGG+esF2RpRmecnTzDZvO2mXTSiX/+6uDBqYIzI5aotFJFPV+gSkdWI0TSiiMhSwb4Ac4BngIHBPKSQPYL0UNMaEPxeFtgamBGT9RTgoOD3vsC9qqpB+rjAq251YBTgHpmz6JuXy3HRMPlQzuzl8nCU0YjmzTPCrkpaaeuPo5Ua0UcfdU3bcsvqy4kL65IHF2eFhQuz90kTREXJEkRl2vcVV7jvG7Yb1/bThCD6z3+6CiLV4hHjGzDNuagBJwH3i8hfg/9fJBxjSac/cDGwLGYM5xWMye0YEdkDMw70LjA+2H8d4GIRWYQRkGeq6gwAETkVmBZoWL8Grg6cEd4l8MRT1adF5HcY4bUAONLZY84U4rxr57N0MOnFvYHV3TGWEXTvvguPPFLtoH3TpqlWakR1R0MIKTuG56IRRefKJeUD1Qqi0IqQJojq8DaNEhc/L40mBNGcOfHRt4u2+XZyVghR1TtFZBMgfKX7nqpmvooFJryxwTyfbVX1imDTCcRMiFXVvwPrR9ODbSdbv+dhzH1x+52OWVk2P3EXetGi7BtwzjnZecfNUWpnjWjRouo77qZNU60URB9/nL1PFSS9zLgK/YED3QRR1r0LhUYd9zgpz6z2XZWjTd4XxiZeuD78sGuaavGX3TY1zQF8Dtg2+OS1MczGzD9qb4qO4xTVGspoRGHncVTcVKyA3iyIHngAbr+9c1orBVGrzIBJ5ay9ttvxyy+fXVcXQRRSxz1O0oiyxkCrugfdYYwobp5iGUEUd841m9ZdvObOBCZgTF4zgAki4jwKpqqzVbX9BVHcrOk6O688b2zPPtv5f1ivtIZWVtB1Z0H0zjuw226tK68p4u7xjjvC177mdvzyy7tpRK6dUB2dVZJjgb3CcRy9SRAtH+M7VuYZbmDyt8tV2xXYUVUvU9XLgLFAL3jK6biRF11Ufd55bnbSvmkNrYwgqaMh9ibTXJ1lXWnNdoi7T8OHd01LYoUV3DSiJgVREk1qRGmLNzbdzkOqNs3VjOtVG2z9riEQVy8kj0YUbRjhw1FGED35ZHp5accXCdBZ9wP61792/I6zZ/cUQWS3hbgOI4+346BB2cJj4cL2HKhvlUYUR9qz27RTTkgZjaiByd8uLed/gMdF5AoRuRJ4NEjrPdTRuPI8KEkNv4wgSnuYshpx1FToQt2d1G9/W2/+eajzQbbzjrumf/iDe16LLZa97MD8+e2pEbWrIGoXjQjKmeaSrl/WdS+IS/TtazEOCjcFn62CtJ5PeCObfstJavhpD1tWI8waX0o
7vshk2bo7qazYfU2H+KkK+zzKtssBA+Dpp5O3z5njvjIptI8guvTSegVR2nVvF0FUpr1HzXr28z5xIvzzn8XzTsDFWeEeVX1DVaeo6q2q+i8RuafymrQjdQqiPA2ljjGiLCGWdnwRj7y6H9CsjqcnCqKyHf+AAfDaa8nbP/44flHHJNrFNPf979c7RpT2Itb0S2tI2TFi++XXft4XLID33y+edwJpy0AsDiwBLCsiQ+iI5bY0ZpmFnk+7CKI6THNlOu4iczRaKYia7gzqLD/LNJeHuCXsbUJB5Eq7OCukmZbKMnp0+jXpCRpR9PrNn98RL7Oma5t21b6DGQ/6bPAdfm4FLqy8Ju2MCNxd8eroVWhEWQ3im9/Mnydkd6TtohHZArGdTHM9RRB98km+qM7tYpor4zEWl5fN00/nj1h+8MHV1CWJ//yna1re9m4/S6rJGlEdcwxJEUSqep6qrg4cr6prqOrqwWdDVf1F5TVpR2yNaKed4NoCQ2NJ9tTozbwlbsWLgKIa0W9+kz9PMOdbtUZUNgpzHNts0/Hb7njiBGVPNM3FdUB5yBJE8+e3r0ZUlyC66qpsh4+8gui994rVxZXRo+Heezun5Wnv06bBhAkd/6OmOVv7bLVGJCKbiciKqnpB8P9AEblVRM4XkaGV16SdCR/WIm+6SQ0imp6Wd6tNc1mCqKolxMtiBxe1r1Hc9cq6HlXFh/vkk9ZpROedVy6vNEH01FMminge4dKkac4OAFums/zTn9LnCUF6W4q7nnXPy3n1VVNvmzyCaLPNOjutRK9fkxoRJmDpJwDBInZnAlcB7wOTK69JO1NGECWRRxC12lmhiEZ01VXp5dVB9E0tpIjGdvPN2fu48M471eSThH1fyr4QDBiQbN5bf30TNX733dPzsN/2m3RW+OIXO36X0YjCa7rvvh15RcmrEbVigmjZ2Hp2vaOmuSY1IqCvqr4b/N4PmKyqN6rqj4GRldekHWmls0IRjagu9+0iGlETg7TRN7W49JC081l66WJm1ziuuKJ1GlEV7ttpedgx65Kii9gLzTVpmrMnWZfpLMOAtWkvFN1BEOXVWux2sGgRPGgtdh0VRC3WiPoGC88B7ADYRsgaDP5tSFQQ5X3wH300O++Qsqa5X/wiflvS2EwZZ4W4+jQhiJI0ojyC6MgjTSdWxcP1r3/BSSe1ThDZHX/cmldZZAkim6SIDbZ5dKWV4vepg6ggsq9LGY3IJXJ6HkFUpwefTVlBZNd70SIYN64jz+hztvPOxeqYVnzKtmuBv4rIrcBHwH0AIjISY57r+ZQRRFOnGgeHKsaIkoSJncevfx2/rYg2FVc/G1eN6H/+B7785fRyymA/IPYYTx4zxS9/ab6rEERvv23y+exny+flgn3NP/oIzjor3/F5BFHSInxZY2tpL2NlqEsQuUQOyDNGVKUHXxpVCiJVMz74cLCuaNTyUMMyJ2lec6cDxwFXAJ8PVkENjzm68pq0M0UE0UUXmQXmksgjiLbbLj49afZzXP5RynjNhQ3z73/vSIsTRGecAa+8kl6PMtjnbK/JUsRZoTt61UWved5zWHJJ2GCD5O12p5zUQWcJorqua/QeR2PwVaURlR0jandBdOON5tv27g3rG/ZJSSbwCkm1p6jqg6p6s6rOtdKeVdXHaqlNTyLrrTyqVcyenbxvklnEbmx5w+5kmeZcnCfshzY6PiACH3wQv2hXVdjXcO7c+PSQVgiiVguzMoIoHNsZNix5H1v4JLWvVgqiyZPhxRfN76RAwCFlNaK09t+TxohCp4yXXup4bqKCyL7HTQiiKhCRESIy3vo/XkTeEpEngs+h1raDROS54HNQQn5DReTuYJ+7g6gPiOF8EZkpItODVWWrOonO30m8/Tbcd5/5He0MH3kE/vKXjv/R7d/9bnK+hxwSn57mQVU2soKLILIbf9IYUZ0PYZJGVMSbzPXBHTbMLJ/QDlxxRef/ea51dOHAOOzrmySI7DGiunn5ZXjoIfM7er+qEkRlx4habZo76STzXcWKtOEzFJ5f+Ezb16Q7CiIROQK4CzhNRKaKyIrBputVdaPgc2mw71BgIrAFsDkwMRQyEX4E3KOqo4B7gv8AuwCjgs9hwK8qPJHO30m8/37Hgxl2huFNPekkMzAeEu0s0+I3JU2IS9OIynrNpS3RHDf+lLT/G2+k18NeviEv9jnbGlGdpjlVo6GmxWirk7CeRx3VtV3k0T5chHWa6TeklRqRHQMxS/AU7ZjLjhHtumvXfau6Bj/8Yde0M84w32XHiGzCa9kTBJGIDAROAQ4EfgyMB+amHLIzcLeqvquq7wF3Yxbhi7InEK4OdiWwl5V+lRoeBAaLSLobj2vn5CqIVDtuXtgwwmMWLer8MOd5UJLMW2meYlnnluZqmySIRo2CZZeN14iSrs28eXD55clllQkr38QYUQOLhjlTtSCy72nSfSpzXfMKC7tTL2qa23rr9DJcxojykKUR5dEo0zxTvSBKZBGwGCZIKqr6kqqGzv77BOazG0QkXFJyFeBV6/hZQVqUFVT1jSDPN4BwnVyn40XkMBGZJiLTnC9qeEOyBNGiRR37Rh/0hQs7d5z29qWWSs836W20jEaUtpS2PUZ0/vkd6W++aTS38LrZ169PHxgzJj6/upY0z2Oaq+qttDcJIpukNhj3krTjjm51yjvvKKoRDRzYsS16X5LuU9Z5p40RrbFGR9l5qKrNRK+X7SlbpSDqSaa5wMHhQOAMjGnuHBFZAvgDMEJVNwD+TId2E9fL57maTser6mRV3VRVN01dpRQ6JnW5Rg2IE0T2G1ySIJqbpiiS3KjKCKI0bEEUF0YnSRAlrdya9uZbxradpBHWOUbUzoIoT93yXqOk/V98EXbYoXPcu6grtSv33w/f/nbydlsQLVxonGGSyikqiNI0olBI5TmnKseIohrRoYd2/M5rEUmjh2lEqOoU4KvA2cBywHGq+o6qhmd2CRC+Rs8ChluHDwNej8n2zdDkFnyHLmWux3eQ1Qm+9JLpkMOBXRfTXPjWEm0YaYIoi6RGlRbWpqwgChuhLYiiAig6RpRUZl2CyMYuuynTXCu95spGGK9KI1q40ATcTBMKrnzhC3D99en7hHlHX96iZSY5HbhqRPfdZ4IGH3ZYx7awXbWLILLxGlE8IrKUiKwW/J0DPAMMjIzb7BGkg3Fq2ElEhgROCjsFaVGmAKFH3UGYZSnC9AMD77ktgfdDE15l5BkjijMVFB0jcok1V7UgitOIooIoqhGFxJklk6gqgKpqxwS8pgRRkYgK9hLnZT226hRESfvHdc72NapCOIcOEaod0cCjgih6Xy69ND6vrGkO4fb5880yKi+80LGtiCCKq1tRbNPc1Kmdt1Xx/EfH38Jn2h4f7G6CCOiPCZx6A8YbbhfgPOAYEXlaRP4POAbjxEAQ1+404JHgc2oY605ELhWRTYN8zwR2FJHngB2D/wC3Ay8AMzGaVoo/tCPRjiXPGFGcF0/VGlHSw592jAtJGlHU3JikEeXxYJo1q3g9bVThjjs61zPv8S5U/SAecEDHb1
urKEJ4Dtddl71v3muUdA+jbSLtdxrhxMq4/cPwVaodoYyiY1PR4z75BI47Lrm+UULvzjQvwKgD0syZyQLPrldam8kTGivc9+yzTdSWuLqVwXasssvrzhqRqr6nqmOBfYFTVHUbVX1NVU9Q1fWCdY22U9V/WMdcpqojg8/lVvqhqjot+P2Oqu6gqqOC73eDdFXVI1V1TVVdP9y/FGUEUfTtqahp7itfKWaaK4IdSj5tVnWSRhQnoLLqlrReU15U4bLLzFyTOC+9IhpRXMicJPdhlzKeegqeeSZ5e9ZYYRZh+TNnZu9blSAK20CS8LnzTrf8v/71rseGhG1RtWOl0CzTnF03m+h5//GP5nviRFNOWkcbHhs+46rZS3G4CKK0NmHzf/9nvu+7r6tml1cj2nhjWHPN+G09yTRnMRt4ogXlNI89RhRnmisyoJ4WkDMa1iRal7yEZg9bI4rrgOIaYxFBdM01cNtt+esZh6oJJzRiRHxYoSKCyPXcXfn2t2HPPZO3lxVEecj74pK0f9SBJfr79NOTozecckrH7zQ3flsQhRpRlmkuKS1p/p7L8xLViFSzzakugujss7PLBvj9700cw7hnJq8geuKJzmZH+5iepBGFqOpsVe2egiiqASWp0XcFQ1lZprkiGkyfPsk33zZrVS2Iomq6TTQESFjPEFdBdNpp1Y4RVX181W7gWSH0k+aLrbNO8TKTqFojShoXUk2e/HvTTV3Totfnz3+OF0RZppX1okUAABsnSURBVLlonUJcpkIkER5r18dlQcW0zvuOO/K1qSQLQpIg2nBDOPdct7yTTHPdfIyo55Flz00TRNH/rh1BWgDSZ59NPq6MIILk87Dztq+HXc88Y0RVeZqVdUbIEkSTJ8Nzz3XteB54wK1+kB2MM2ly43PPdU0ru9REUof84ovmzTtaRlJ7jTPNZXVWc+YYs9D06V23Re/D7rt37viXXNL8rso0l3Z8lDyCKAxUnKURffWr1TwDSYJo+nSzgqvNW2+l5xV9vpsOeuqJkDQBb8AA8217zaVFBwZ3QdS3b3JDjZozdtih43dW444TDq6mufBtNLTXQzHTXN++1TXsqrzibOx6X3YZrLVW1/q6mlWgq0YUHXBOEkR1uIUntb/Pfha+9rWu6UXHiOJQ7WoWSmLevM4dfzieEnVyWbTIhD2yI8K7arkudbb3cRFE9hhx2rSGefOqeQbSTHPR8rfcMj6PJNOcnbcXRA0QfetMEkThDO+0MaJoY3AVRGmmuSj33ptcXpS4jqV/f/OdZZoL7ep2RIgiXnP9+3etZ9QttVVkjRFVYUKMdkrRMm1BZNvl0+7lbrvBqafmr0uWhhOlqGkuL1nOCsceawR41ESlChde2DmET9UaUVx9ksaIbAGd9fxW8aKRJ/p+0r2MPvNh1P+eMEbUo8gKSZJmmouSRxCVmROQRJZGFG2UtvYVRlCwQ6zYtm7XgdN+/bpuS1vDqQxlTXMuY3rRPDbeuOs+aR32DTeY73XWgcUXj98veszttxuPLxfssZq8Cybm0YjKdLxZgujFF+Huu92OyxojssfeigqiJCeLNEeiuHzKEo1qktZmXJ+FPfYwE3r/9a+ObV4QNYCrRhRqEnGCKE0td6EuQWQ3qOuvN1GwTzvN/I8KohVXNJ+QMJzL0kub73nz4Cc/6dgePbekFWb79692/lMaZb3m8gYJha4R1bPu5dVXGy3zH/9I3qcM9pttXo0oS5PIY5qzy9htt86hauKYNMkseRFdhTgpz7R62OdhmweLCqKs+rRKI6pCEEUtGq+/Dpdc4iZUS+IFURqugih8y3rttfSxFZu6NaI09tuv87m99BJsu23Hg2mPES1aZIKd2ksOhIIo1Ij+9S9TxyRznu0EYROnEd1zT96zcaMVgihkm23MYHU0z+g8lbgyXV24izgr2OcTZ8pJy7PKMaJttun4ffvtnQVM3LHvvgsHH5y+yKLrM2KfR54627gIIrtTz4qMUrcgipKloaW9HHpB1AYkec09+SQcdJBZwO7WIOJQlonJtWN7+eXqb/4nn3Q+l0mTzHfc+kJx9Xz0UfN92WXmO+plE+20Qk+nKE8+2fWB+dWvYNNN4/evkypMcyH/7//Fj3VFPSCrfsGICpK33+78P23Q+Q9/SM8v6fzjxgWz2mtasOG6NOKssvKU+8475tsek00rI+t6VPF8V2mai+IFUZuRZpq76iqjKfzwh2YFz6gginYSLoLoJz8xnX7VD6etvUCH5489Pyg817gO6M03zT6/+11HfpDsqBG3zMVee5nl0eMadlVhf2yq1oiK3BMXjagqhg0zc0hs0oTpHnuk55cliOsUsHVRtM6vv27W5jr+eLe8a+q8O5GnfWaZ5qJ401ybkTTWYacvWtThbZKGyxt2OFZQhyCK0+7COs2fnxxFPJqP/Z0kvGynhpC0MbSqztdemr2ss4LLi0PavC6oXyOy83vttc5LM0C5TiTr/G2NatEiM+/KJeZdlCqvSfTl78c/Ti4rb7kzZ7q1CReNaOjQrmlXX20+RSmiESWZZr1G1DCukRXyLvAF1QQ9LcqiRfGNLmxkb7/tNtYVNcskHROnESW5etvbyjI2boHfBPIKojCGmm0SOfjgYuXkIQxf9Oab+csqE48w69iTTur8cvGd78D+++cvp05BFH1+VbualavGRRDZjkAhjzzS1bSaRZIzR/Sa5hUmXiNqmLxec3EkPVguIf9Dqr75SRpROJbz9ttuGpGdH+QzzUWPjUsre962a20RjSjtTfC73zVx7WbPNv8ffTQ7enYVprlzz4W9945ft8clKG9RsgTRokUd7uHtapqLE0RhlIEVVqinzDRBFPYBgwd33VbkXmU5fWRtO/ZYY6ZPi1rhBVEbkCSIkkx2ACeeaLx9ojPJ86w90yrTXPhAvPlm+hhRUv3Cc3r55c77HXYYPP54/LFxxE2SDLnzTjjySNhoo+Tj0+qah7Q3wQULup5n3noUrdMtt+QvC8y9LKoVuXRAVbxA2HX+1reK5xNHXJsPNczhw7tuy8uZZ3b8Tuq8d9+94/cWW5jv00/vmleS1aIIrqa5yy830VLSosJ4QdQAVZjmbrvNaBpprqch9pgGVC+IQm83Oz87LNCdd8LFF5uOPs8YUcjMmSac/ze+0Tn9mWdMTK20Y+O2RRv9CiuY6NW//GVHSPyqyNKI8gzWx6X97nfw0EP1OitkdTgLF8L66xuzmQtJQW2zyi9zXnVqU3HncN995vvpp4vnu99+xgx5wgkdaeF0h4kTO99zO3L2U0+Z77fe6pjDF1LkOrge04Zecymv8p4uFNGIotxwQ8dE0Cjbbx//xlxnhxU+DGBMWYcfbn7nGSOy87v22uwyk45N2jfExQkkiSKmOfuhKzO+8tJLprOKllO3CStOED3zjPv6NyG//W2HCTKNrEncrcblRXLaNFhttfilQ1wJvUdtPv64o/w4Z4QoJ5/c2Zki6tnqgqs3nB8j6uYkCZy//tU9j69+FXbeOV+5Vd38MDirS+yoIl5zaeSZy1BHR1a3IHLtNFrhyhtS1
lkhPP7oo/PtX5VprmpCQXTVVZ3TywihJMoGCi1yHZLCDeV59rKO94KoAVxNc3E2XheWXx5WXjl5u0tHv8EG6Yut2YSamB16JqlzctGI0gKjRkk6h7TF56qykUfL/+xn07dH6wHJjghlHuqqO93o9Sr7Jpzn/tr7tbtGdPDB6Q40VVCFIMrb/l0doHqjIBKRESIyPiZ9XxFREdnU2u8jEXki+FyUkN9QEblbRJ4LvocE6SIi54vITBGZLiKbVH4ycaa5efOKP3hvvQVvvNE5bZllzMx8m7T8n3wSpkxxKy+cz2PPL8nSiNIEkb1kcl7CcufN6zpmEeYn0jn4Z1W4uoyHaTfeWK48u0Np1YTWuPyT7qXL0uJ5ymsXQRTFDluVZzXcbbdNXjohiSLmrOefN9/33BNv7suiLo2ou5vmROQI4C7gNBGZKiIrBukDgWOAhyKHPK+qGwWfwxOy/RFwj6qOAu4J/gPsAowKPocBv6rgBDr/jwqiIUOMVlMl777bVTiVvfnhxLg4QZS2VhCkm+byRA9OexjCqNNJ+1ZBliYSTbv99s4CMYkydvzw92OP5cvDJe+4/0ntaPTofOXYDi5x5VWpyVZJ0blCf/2rcTTJQ5HOe4MNzIvol77UNWCuC0kaUVnNuDtrRIGwOQU4EPgxMB4IX0NOA84GHNbZ7cKewJXB7yuBvaz0q9TwIDBYRFYqVvuA6NhPtCHPnt01xlMVRDv4sh3zgQeaPMMxLheNKI9proggSttmd2hVCaWshylazm67VWcitPO2xxnD9E03rWdCpatGlNSBTZtmvqMBUpNirEUnNpel7L13Na3XQRHT3IcfllsGJUkjitLLNKJFwGLA0gCq+pKqzhGRjYHhqnpbzDGri8jjIvJXEflCQr4rqOobQZ5vAKFKsgpgr4k7K0jrhIgcJiLTRMQ8ZQ8+GF/KyiubEOg2RSIoFKFqQRTFXlWyHTSiaOO2BUAd2lHcwxR3b1061rDjiM6TSsJ+cbHvbxXnmTVGlNdZYe+9Yaut3KYe2OW1ssPPQ6ueX6jG49J+QXBpX64aURuOEdXmvq2qc0XkQOAMYEURGQ2cDJyL0Y6ivAGsqqrviMgY4BYRWU9V/xOzbxxxr61drriqTgYmA4iI8qUvGfNadMJp1DwG+dy0q6BqQbT//l3DrrRKI4qSZiqzO7Q6NKK4PO0wK4cd1rUeSbz4ohk/eOIJ93o8+6xZdrzV5O0Q58xJflGLI800t9xyHVEMmqIpjSjP6qk2p55qns/LL++8qGES3VgjqrVnVdUpIjId2B3YFPgvYDQwVUxjXRGYIiJ7qOo04OPguEdF5HlgLWBaJNs3RWQlVX0jML2Fk0tmAfb06GHA65mVnDvXdCYutKohRx/kqm7+ddd1DURZhUZUxGvORRBVSdZb3V/+YsLm/PCHHRETXE1zWeMHSZpKnfPD4v5Hz/u449zbvgtp1ytv3LQqWbTIPLutFERVPLMffAA/+lH2fiFZy5Yn/c+im48RLSUiqwV/5wDPAEuq6rKqOkJVRwAPAnuo6jQRWU5E+gbHroFxOnghJuspwEHB74OAW630AwPvuS2B90MTXmXkHbAsSyu8kFrlNZfHNFf3pM+4POfNg3HjOofteeAB+OgjOOus6sqx05v2mvvZz+Dmm6svr5Ud/qhRZjwvjSlTzLjohRe2pk7Q2vliIVlrRiX9z6Kba0T9gYuBZYFlgFeAr6fs/0XgVBFZACwEDlfVdwFE5FLgokBrOhP4nYgcEuQZxo65HdgVmAl8CDiEQs7J3nvDvvt2LAjXKppwh23lGFH4O3yQ6hBERd7q3n0XlliimvLj6pLn3B55BDbbLH2fOoOeulDH/K8sZs5Mdj8P6/HaazBoUOvqBOUicdRNLxsjeg8YKyIjgG1V9YqYfba1ft8IxE7YUNVDrd/vAF38R1VVgSNLVjudOXNMh1A3WYPOraBur7m4xh2m1THXpm4tK45wkD/qnl1k4ufWW7uPUeaZPFwlWc4dq63mNi/s5ZfNvmUpM4ZZliY0oiR6s/u2xWzAcSS3G9CKRh19gJpo1FVrRFHiBEOdM/Nb8DB14e23jdfZAQd0Ti9S/vz5xkzoQpJGUrcg+trXTEDa6dPjt7/ySvbigeC2BHe7066TeqHXmeYAUFUviPKS5r49Zkz95UNrNaJoWt3aSysFe5zXWaucFaKUFUQrr5yex8yZJnK7p72IRqovY5qryezqo2/npZVvOnEdVlWz8L/whfQVTF00IjtkShbR62bHbgu3zZqVnU+UBQvcTFZNmOaSqEsQ1T1GFDeloQ5uv93Eg3Odm5VEu0Z4qJMvfhE237xz2mOPmYgN77xj/pfRiBZbrFz9EvCCKC+t6MQmToR11um6PlGV3H+/+STh4jVXVCPab7/Oncwnn5g1jOxIFtdcA1dckZ3vaqvFL7WcVn7T9vu6y29qjKgqbrjBfQ2vNMJ5NXlWQ+7u3HdfxxpLNmUiNtjtqaa5YF4Q5SW8KWVubBZPPmkEUZO4aEShq7NLUNDwus2ZEx/Q8ZprOv+PLq6XxOuvm08emtaIqg6Fk7fc7kBZIQRmIuiIEXDGGeXz6s2E7ebNNztHZakQL4iKEq7A2ErCFVZbwe23m2WN0zqv114zgVSTlkiIoykh0E4aUd3zbZpyVmg35s+HE09suhbdnxY4TXlBlJdwTZ9WC6KNN652FnwWX/uaMXllCQ5XIdSqSZxZ5TdZh2j5rR7D6G2CyFMNoQDygqiNCF1o//a31pbrGsusKubNM0tcV0XTgsimaY2o7omfSdfYZczNUw+nngorlVsMoDG8RtSGPPWU8Ux54IGmaxLP+ee7j6+0kibnRNnlR383QROhcHqjB1k7MXFi0zUoTjTySQ20abz2Nue++9IH8ZtkwgRYdtmma9GVJme5R+npGpEXOu3NiBH5FyNskhaY5rwgCvnpT5uuQc+mjCC64ALYbrvq6tK0IGpCI/K0Dy+/DE8/3XQt3GmBNcM/CSHHH+/fJOukjCA65hiYOrW6ujStlbk6K1xwQbFJnU2fn6dn4seIPN2ePGsX1U3TdXA1zR1zTP118XiyyBNBpWgRteXs8di0oDE707TG4E1znu6EF0SeHkOeRfTqpmlh2MS6PR5PUbwg8vQYZs8235dc0mw9oHlh6DUiT3eiBYLIjxFF+fznawvs16v58EPo37993d5bideIPN2JnqARicgIERkfk76viKiIbGqlnSAiM0XknyKyc0J+q4vIQyLynIhcLyKLBekDgv8zg+0jClX4b39zW7zLkx8vhAw9QSNaf31Yd92ma+FpBd1dEInIEcBdwGkiMlVEVgzSBwLHAA9Z+64LjAPWA8YCvxSRvjHZngWcq6qjgPeAQ4L0Q4D3VHUkcG6wn8fTfrzwgvl+//168m+FpvXUU/DMM/WX42meOXPM94wZtRVRm2kuEDanALsD6wBTgbnB5tOAs4HjrUP2BK5T1Y+BF0VkJrA58ICVpwDbA18Pkq4EJgG/Co6fFKTfAPxCRES16QEBT1ty+eXlF14ryrHHwl13wcMP
N1O+p3ey117FFjd87jnYaaf09ctKUucY0SJgMWBpAFV9CUBENgaGq+ptImILolUAe13lWUGazTLAbFVdELPPKsCrQVkLROT9YP+37QxE5DDgsFJn5un+fOtbzZU9bx7cfHN9+ft3L08ct95a/Ni7766uHjHUJohUda6IHAicAawoIqOBkzFms/Exh8TZE6JPVNo+LsejqpOByQAi4p9Yj8fjaZhax4hUdQrwVYwZbjngv4DRwFQReQnYEpgSOCzMAoZbhw8Doktvvg0MFpF+Mft8enywfRBQ4zKqHk+bcdNN5vv3v2+2Hh5PTuocI1oKYxoDmAM8AwxV1WWtfaYCx6vqNBH5CLhGRH4GrAyMAjoZ0VVVReQvwL7AdcBBQKhvTgn+PxBsv9ePD3k68fnPw5e+1HQt6mPGDO8S7umW1DlG1B+4GFgWI5BeocPJoAuq+rSI/A6YASwAjlTVhQAicjtwqKq+DvwQuE5EfgI8Dvw6yOLXwNWBk8O7GA88j6eDv/2t9QsaenoXl18OgwY1XYtuh9StNATzebZV1StqLagAfoyoF9DUstztQm8/f09dPKqqm2bv5kYrZtTNBlq8zrXH4/F4ugu1h/hRVS+IPB6Px5NIN44x4vF4PJ6egBdEHo/H42kUH33b4+nJjB0Lr7zSdC08nlRq95prZ7zXXC/Ae415PHXQ7bzmPB6Px+NJxJvmPD2bMWNgiy2aroXH40nBm+Y8Ho/HkxdvmvN4PB5Pz8ELIo/H4/E0ihdEHo/H42kUL4g8Ho/H0yheEHk8Ho+nUbwg8ng8Hk+jeEHk8Xg8nkbxgsjj8Xg8jdLbIyt8APyz6Uq0CcsCbzddiTbBX4sO/LXowF+LDtauMrPeLoj+WeXs4O6MiEzz18Lgr0UH/lp04K9FByIyrcr8vGnO4/F4PI3iBZHH4/F4GqW3C6LJTVegjfDXogN/LTrw16IDfy06qPRa9Oro2x6Px+Npnt6uEXk8Ho+nYXqtIBKRsSLyTxGZKSI/aro+dSMiw0XkLyLyjIg8LSITgvShInK3iDwXfA8J0kVEzg+uz3QR2aTZM6gWEekrIo+LyG3B/9VF5KHgOlwvIosF6QOC/zOD7SOarHcdiMhgEblBRP4RtI+tenG7+F7wfDwlIteKyOK9pW2IyGUi8m8RecpKy90OROSgYP/nROQgl7J7pSASkb7AhcAuwLrA/iKybrO1qp0FwHGqug6wJXBkcM4/Au5R1VHAPcF/MNdmVPA5DPhV66tcKxOAZ6z/ZwHnBtfhPeCQIP0Q4D1VHQmcG+zX0zgPuFNVPwtsiLkuva5diMgqwDHApqo6GugLjKP3tI0rgLGRtFztQESGAhOBLYDNgYmh8EpFVXvdB9gKuMv6fwJwQtP1avE1uBXYETOhd6UgbSXM3CqAi4H9rf0/3a+7f4BhwUO1PXAbIJiJiv2i7QO4C9gq+N0v2E+aPocKr8XSwIvRc+ql7WIV4FVgaHCvbwN27k1tAxgBPFW0HQD7Axdb6Z32S/r0So2IjgYXMitI6xUEJoSNgYeAFVT1DYDge/lgt558jX4O/ABYFPxfBpitqguC//a5fnodgu3vB/v3FNYA3gIuD0yVl4rIkvTCdqGqrwHnAK8Ab2Du9aP03rYB+dtBofbRWwWRxKT1CvdBEVkKuBE4VlX/k7ZrTFq3v0Yi8mXg36r6qJ0cs6s6bOsJ9AM2AX6lqhsDc+kwv8TRY69HYELaE1gdWBlYEmOCitJb2kYaSede6Jr0VkE0Cxhu/R8GvN5QXVqGiPTHCKHfqupNQfKbIrJSsH0l4N9Bek+9RlsDe4jIS8B1GPPcz4HBIhKGvLLP9dPrEGwfBLzbygrXzCxglqo+FPy/ASOYelu7APgS8KKqvqWq84GbgM/Re9sG5G8HhdpHbxVEjwCjAm+YxTADklMarlOtiIgAvwaeUdWfWZumAKFny0GYsaMw/cDAO2ZL4P1QRe/OqOoJqjpMVUdg7vu9qvoN4C/AvsFu0esQXp99g/17zFuvqv4LeFVEwiCWOwAz6GXtIuAVYEsRWSJ4XsJr0SvbRkDednAXsJOIDAk0zJ2CtHSaHhxrcFBuV+BZ4HngpKbr04Lz/TxGRZ4OPBF8dsXYtO8Bngu+hwb7C8az8HngSYwnUePnUfE12Ra4Lfi9BvAwMBP4PTAgSF88+D8z2L5G0/Wu4TpsBEwL2sYtwJDe2i6AU4B/AE8BVwMDekvbAK7FjI3Nx2g2hxRpB8C3gmsyEzjYpWwfWcHj8Xg8jdJbTXMej8fjaRO8IPJ4PB5Po3hB5PF4PJ5G8YLI4/F4PI3iBZHH4/F4GsULIo+nIURkkogc33Q9PJ6m8YLI4/F4PI3iBZHH00JE5CQx62D9GVg7SPu2iDwiIv8nIjcGM/sHisiLQVgmRGRpEXlJRPqLyDEiMiNYB+a6Rk/I46kAL4g8nhYhImMwYYU2Br4CbBZsuklVN1PVcC2gQ1R1DjAV2C3YZxxwo5oYaD8CNlbVDYDDW3gKHk8teEHk8bSOLwA3q+qHaiKfh/ENR4vIfSLyJPANYL0g/VLg4OD3wcDlwe/pwG9F5ADMgoceT7fGCyKPp7XExdS6AjhKVdfHxDpbHEBV/waMEJFtgL6qGi7hvBsmztcY4FErMrTH0y3xgsjjaR3/D9hbRD4jIgOB3YP0gcAbwXjQNyLHXIUJRnk5gIj0AYar6l8wi/sNBpZqReU9nrrwQU89nhYiIicBBwIvYyIcz8AsRveDIO1JYKCqjg/2XxGzlPdKqjo7EFZ/wax9I8BvVPXMVp/H/2/vjm0AhGEoCtrr0dDQMhUDm4IViH6KuwncPSWyZPiTEMHGuvuqqnNm7vQssIq/ZdhUdz/1nao+0rPASl5EAERZVgAgSogAiBIiAKKECIAoIQIgSogAiHoBeAw+zaLe6CMAAAAASUVORK5CYII=\n",
278 | "text/plain": [
279 | ""
280 | ]
281 | },
282 | "metadata": {
283 | "needs_background": "light"
284 | },
285 | "output_type": "display_data"
286 | }
287 | ],
288 | "source": [
289 | "utils.plot_price_model()"
290 | ]
291 | },
292 | {
293 | "cell_type": "code",
294 | "execution_count": 16,
295 | "metadata": {},
296 | "outputs": [
297 | {
298 | "data": {
299 | "text/html": [
300 | "\n",
301 | "AC Optimal Strategy \n",
302 | "\n",
303 | " Number of Days to Sell All the Shares: 60 Initial Portfolio Value: $50,000,000.00 \n",
304 | " \n",
305 | "\n",
306 | " Half-Life of The Trade: 4.1 Expected Shortfall: $477,712.60 \n",
307 | " \n",
308 | "\n",
309 | " Utility: $704,723.22 Standard Deviation of Shortfall: $476,456.32 \n",
310 | " \n",
311 | "
"
312 | ],
313 | "text/plain": [
314 | ""
315 | ]
316 | },
317 | "execution_count": 16,
318 | "metadata": {},
319 | "output_type": "execute_result"
320 | }
321 | ],
322 | "source": [
323 | "utils.get_optimal_vals()"
324 | ]
325 | },
340 | {
341 | "cell_type": "markdown",
342 | "metadata": {},
343 | "source": [
344 | "# Reinforcement Learning\n",
345 | "\n",
346 | "In the code below we use DDPG to find a policy that can generate optimal trading trajectories that minimize implementation shortfall, and can be benchmarked against the Almgren and Chriss model. We will implement a typical reinforcement learning workflow to train the actor and critic using the simulation environment. We feed the states observed from our simulator to an agent. The Agent first predicts an action using the actor model and performs the action in the environment. Then, environment returns the reward and new state. This process continues for the given number of episodes. To get accurate results, you should run the code at least 10,000 episodes."
347 | ]
348 | },
349 | {
350 | "cell_type": "markdown",
351 | "metadata": {},
352 | "source": [
353 | "# Todo\n",
354 | "\n",
355 | "The above code should provide you with a starting framework for incorporating more complex dynamics into our model. Here are a few things you can try out:\n",
356 | "\n",
357 | "- Incorporate your own reward function in the simulation environmet to see if you can achieve a expected shortfall that is better (lower) than that produced by the Almgren and Chriss model.\n",
358 | "\n",
359 | "\n",
360 | "- Experiment rewarding the agent at every step and only giving a reward at the end.\n",
361 | "\n",
362 | "\n",
363 | "- Use more realistic price dynamics, such as geometric brownian motion (GBM). The equations used to model GBM can be found in section 3b of this [paper](https://ro.uow.edu.au/cgi/viewcontent.cgi?referer=https://www.google.com/&httpsredir=1&article=1705&context=aabfj)\n",
364 | "\n",
365 | "\n",
366 | "- Try different functions for the action. You can change the values of the actions produced by the agent by using different functions. You can choose your function depending on the interpretation you give to the action. For example, you could set the action to be a function of the trading rate.\n",
367 | "\n",
368 | "\n",
369 | "- Add more complex dynamics to the environment. Try incorporate trading fees, for example. This can be done by adding and extra term to the fixed cost of selling, $\\epsilon$."
370 | ]
371 | },
372 | {
373 | "cell_type": "code",
374 | "execution_count": 17,
375 | "metadata": {
376 | "scrolled": false
377 | },
378 | "outputs": [
379 | {
380 | "name": "stdout",
381 | "output_type": "stream",
382 | "text": [
383 | "Episode [100/10000]\tAverage Shortfall: $1,955,559.82\n",
384 | "Episode [200/10000]\tAverage Shortfall: $2,562,500.00\n",
385 | "Episode [300/10000]\tAverage Shortfall: $2,562,500.00\n",
386 | "Episode [400/10000]\tAverage Shortfall: $2,562,500.00\n",
387 | "Episode [500/10000]\tAverage Shortfall: $2,562,500.00\n",
388 | "Episode [600/10000]\tAverage Shortfall: $2,562,500.00\n",
389 | "Episode [700/10000]\tAverage Shortfall: $2,562,500.00\n",
390 | "Episode [800/10000]\tAverage Shortfall: $2,562,500.00\n",
391 | "Episode [900/10000]\tAverage Shortfall: $2,562,500.00\n",
392 | "Episode [1000/10000]\tAverage Shortfall: $2,562,500.00\n",
393 | "Episode [1100/10000]\tAverage Shortfall: $2,562,500.00\n",
394 | "Episode [1200/10000]\tAverage Shortfall: $2,562,500.00\n",
395 | "Episode [1300/10000]\tAverage Shortfall: $2,562,500.00\n",
396 | "Episode [1400/10000]\tAverage Shortfall: $2,562,500.00\n",
397 | "Episode [1500/10000]\tAverage Shortfall: $2,562,500.00\n",
398 | "Episode [1600/10000]\tAverage Shortfall: $2,562,500.00\n",
399 | "Episode [1700/10000]\tAverage Shortfall: $2,562,500.00\n",
400 | "Episode [1800/10000]\tAverage Shortfall: $2,562,500.00\n",
401 | "Episode [1900/10000]\tAverage Shortfall: $2,562,500.00\n",
402 | "Episode [2000/10000]\tAverage Shortfall: $2,365,766.34\n",
403 | "Episode [2100/10000]\tAverage Shortfall: $-4,961,069.15\n",
404 | "Episode [2200/10000]\tAverage Shortfall: $-9,865,626.73\n",
405 | "Episode [2300/10000]\tAverage Shortfall: $-10,111,475.89\n",
406 | "Episode [2400/10000]\tAverage Shortfall: $-9,667,788.71\n",
407 | "Episode [2500/10000]\tAverage Shortfall: $-12,157,641.08\n",
408 | "Episode [2600/10000]\tAverage Shortfall: $-9,094,492.66\n",
409 | "Episode [2700/10000]\tAverage Shortfall: $-9,853,531.99\n",
410 | "Episode [2800/10000]\tAverage Shortfall: $-1,816,561.93\n",
411 | "Episode [2900/10000]\tAverage Shortfall: $-10,270,762.94\n",
412 | "Episode [3000/10000]\tAverage Shortfall: $-10,202,420.94\n",
413 | "Episode [3100/10000]\tAverage Shortfall: $-10,528,299.52\n",
414 | "Episode [3200/10000]\tAverage Shortfall: $-10,470,899.05\n",
415 | "Episode [3300/10000]\tAverage Shortfall: $-9,406,544.15\n",
416 | "Episode [3400/10000]\tAverage Shortfall: $-8,838,479.19\n",
417 | "Episode [3500/10000]\tAverage Shortfall: $-11,441,167.07\n",
418 | "Episode [3600/10000]\tAverage Shortfall: $-9,947,253.26\n",
419 | "Episode [3700/10000]\tAverage Shortfall: $-10,928,396.01\n",
420 | "Episode [3800/10000]\tAverage Shortfall: $-11,946,627.98\n",
421 | "Episode [3900/10000]\tAverage Shortfall: $-6,900,682.33\n",
422 | "Episode [4000/10000]\tAverage Shortfall: $-6,695.71\n",
423 | "Episode [4100/10000]\tAverage Shortfall: $-8,904,595.61\n",
424 | "Episode [4200/10000]\tAverage Shortfall: $-9,659,425.17\n",
425 | "Episode [4300/10000]\tAverage Shortfall: $-10,491,365.42\n",
426 | "Episode [4400/10000]\tAverage Shortfall: $-9,667,934.38\n",
427 | "Episode [4500/10000]\tAverage Shortfall: $-9,290,083.81\n",
428 | "Episode [4600/10000]\tAverage Shortfall: $-9,772,319.45\n",
429 | "Episode [4700/10000]\tAverage Shortfall: $-9,741,509.38\n",
430 | "Episode [4800/10000]\tAverage Shortfall: $-9,893,239.53\n",
431 | "Episode [4900/10000]\tAverage Shortfall: $-9,595,988.92\n",
432 | "Episode [5000/10000]\tAverage Shortfall: $-12,501,569.45\n",
433 | "Episode [5100/10000]\tAverage Shortfall: $-11,811,195.05\n",
434 | "Episode [5200/10000]\tAverage Shortfall: $-10,138,894.46\n",
435 | "Episode [5300/10000]\tAverage Shortfall: $-10,504,233.09\n",
436 | "Episode [5400/10000]\tAverage Shortfall: $-5,038,707.51\n",
437 | "Episode [5500/10000]\tAverage Shortfall: $-8,936,829.38\n",
438 | "Episode [5600/10000]\tAverage Shortfall: $-10,781,792.26\n",
439 | "Episode [5700/10000]\tAverage Shortfall: $-8,302,488.86\n",
440 | "Episode [5800/10000]\tAverage Shortfall: $-9,594,546.28\n",
441 | "Episode [5900/10000]\tAverage Shortfall: $-9,814,352.78\n",
442 | "Episode [6000/10000]\tAverage Shortfall: $-10,684,048.74\n",
443 | "Episode [6100/10000]\tAverage Shortfall: $-10,373,680.60\n",
444 | "Episode [6200/10000]\tAverage Shortfall: $-10,625,956.59\n",
445 | "Episode [6300/10000]\tAverage Shortfall: $-11,631,723.77\n",
446 | "Episode [6400/10000]\tAverage Shortfall: $-10,585,561.44\n",
447 | "Episode [6500/10000]\tAverage Shortfall: $-9,381,861.37\n",
448 | "Episode [6600/10000]\tAverage Shortfall: $-11,218,867.50\n",
449 | "Episode [6700/10000]\tAverage Shortfall: $-7,701,202.56\n",
450 | "Episode [6800/10000]\tAverage Shortfall: $-10,332,167.94\n",
451 | "Episode [6900/10000]\tAverage Shortfall: $-9,952,372.31\n",
452 | "Episode [7000/10000]\tAverage Shortfall: $-10,887,028.27\n",
453 | "Episode [7100/10000]\tAverage Shortfall: $-9,948,462.96\n",
454 | "Episode [7200/10000]\tAverage Shortfall: $-10,970,077.82\n",
455 | "Episode [7300/10000]\tAverage Shortfall: $-9,718,404.47\n",
456 | "Episode [7400/10000]\tAverage Shortfall: $-8,927,763.57\n",
457 | "Episode [7500/10000]\tAverage Shortfall: $-9,266,710.71\n",
458 | "Episode [7600/10000]\tAverage Shortfall: $-10,700,665.61\n",
459 | "Episode [7700/10000]\tAverage Shortfall: $-10,407,278.17\n",
460 | "Episode [7800/10000]\tAverage Shortfall: $-10,812,626.73\n",
461 | "Episode [7900/10000]\tAverage Shortfall: $-9,742,862.11\n",
462 | "Episode [8000/10000]\tAverage Shortfall: $-8,727,273.28\n",
463 | "Episode [8100/10000]\tAverage Shortfall: $-11,945,952.88\n",
464 | "Episode [8200/10000]\tAverage Shortfall: $-9,305,444.17\n",
465 | "Episode [8300/10000]\tAverage Shortfall: $-9,569,747.92\n",
466 | "Episode [8400/10000]\tAverage Shortfall: $-11,008,021.52\n",
467 | "Episode [8500/10000]\tAverage Shortfall: $-10,176,711.27\n",
468 | "Episode [8600/10000]\tAverage Shortfall: $-8,852,801.73\n",
469 | "Episode [8700/10000]\tAverage Shortfall: $-9,340,764.19\n",
470 | "Episode [8800/10000]\tAverage Shortfall: $-9,508,081.87\n",
471 | "Episode [8900/10000]\tAverage Shortfall: $-9,887,704.43\n",
472 | "Episode [9000/10000]\tAverage Shortfall: $-10,422,108.32\n",
473 | "Episode [9100/10000]\tAverage Shortfall: $-11,640,813.81\n",
474 | "Episode [9200/10000]\tAverage Shortfall: $-8,812,642.36\n",
475 | "Episode [9300/10000]\tAverage Shortfall: $-10,013,637.13\n",
476 | "Episode [9400/10000]\tAverage Shortfall: $-10,609,369.46\n",
477 | "Episode [9500/10000]\tAverage Shortfall: $-9,189,505.15\n",
478 | "Episode [9600/10000]\tAverage Shortfall: $-9,111,533.09\n",
479 | "Episode [9700/10000]\tAverage Shortfall: $-9,875,456.01\n",
480 | "Episode [9800/10000]\tAverage Shortfall: $-10,223,336.47\n",
481 | "Episode [9900/10000]\tAverage Shortfall: $-10,590,809.41\n",
482 | "Episode [10000/10000]\tAverage Shortfall: $-11,169,677.73\n",
483 | "\n",
484 | "Average Implementation Shortfall: $-7,262,618.76 \n",
485 | "\n"
486 | ]
487 | }
488 | ],
489 | "source": [
490 | "import numpy as np\n",
491 | "\n",
492 | "import syntheticChrissAlmgren as sca\n",
493 | "from ddpg_agent import Agent\n",
494 | "from workspace_utils import keep_awake\n",
495 | "from collections import deque\n",
496 | "\n",
497 | "# Create simulation environment\n",
498 | "env = sca.MarketEnvironment()\n",
499 | "\n",
500 | "# Initialize Feed-forward DNNs for Actor and Critic models. \n",
501 | "agent = Agent(state_size=env.observation_space_dimension(), action_size=env.action_space_dimension(), random_seed=0)\n",
502 | "\n",
503 | "# Set the liquidation time\n",
504 | "lqt = 60\n",
505 | "\n",
506 | "# Set the number of trades\n",
507 | "n_trades = 60\n",
508 | "\n",
509 | "# Set trader's risk aversion\n",
510 | "tr = 1e-6\n",
511 | "\n",
512 | "# Set the number of episodes to run the simulation\n",
513 | "episodes = 10000\n",
514 | "\n",
515 | "shortfall_hist = np.array([])\n",
516 | "shortfall_deque = deque(maxlen=100)\n",
517 | "\n",
518 | "\n",
519 | "for episode in keep_awake(range(episodes)): \n",
520 | " # Reset the enviroment\n",
521 | " cur_state = env.reset(seed = episode, liquid_time = lqt, num_trades = n_trades, lamb = tr)\n",
522 | "\n",
523 | " # set the environment to make transactions\n",
524 | " env.start_transactions()\n",
525 | "\n",
526 | " for i in range(n_trades + 1):\n",
527 | " \n",
528 | " # Predict the best action for the current state. \n",
529 | " action = agent.act(cur_state, add_noise = True)\n",
530 | " \n",
531 | " # Action is performed and new state, reward, info are received. \n",
532 | " new_state, reward, done, info = env.step(action)\n",
533 | " \n",
534 | " # current state, action, reward, new state are stored in the experience replay\n",
535 | " agent.step(cur_state, action, reward, new_state, done)\n",
536 | " \n",
537 | " # roll over new state\n",
538 | " cur_state = new_state\n",
539 | "\n",
540 | " if info.done:\n",
541 | " shortfall_hist = np.append(shortfall_hist, info.implementation_shortfall)\n",
542 | " shortfall_deque.append(info.implementation_shortfall)\n",
543 | " break\n",
544 | " \n",
545 | " if (episode + 1) % 100 == 0: # print average shortfall over last 100 episodes\n",
546 | " print('\\rEpisode [{}/{}]\\tAverage Shortfall: ${:,.2f}'.format(episode + 1, episodes, np.mean(shortfall_deque))) \n",
547 | "\n",
548 | "print('\\nAverage Implementation Shortfall: ${:,.2f} \\n'.format(np.mean(shortfall_hist)))"
549 | ]
550 | },
551 | {
552 | "cell_type": "code",
553 | "execution_count": null,
554 | "metadata": {},
555 | "outputs": [],
556 | "source": []
557 | }
558 | ],
559 | "metadata": {
560 | "kernelspec": {
561 | "display_name": "Python 3",
562 | "language": "python",
563 | "name": "python3"
564 | },
565 | "language_info": {
566 | "codemirror_mode": {
567 | "name": "ipython",
568 | "version": 3
569 | },
570 | "file_extension": ".py",
571 | "mimetype": "text/x-python",
572 | "name": "python",
573 | "nbconvert_exporter": "python",
574 | "pygments_lexer": "ipython3",
575 | "version": "3.6.3"
576 | }
577 | },
578 | "nbformat": 4,
579 | "nbformat_minor": 2
580 | }
581 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2019 Akshay Sathe
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Actor_Critic-Method-for-Financial-Trading
2 |
3 |
4 | ## Implementation of the DDPG algorithm for Automated Financial Trading
5 |
6 | In this project, I implemented the DDPG algorithm to solve the optimal execution problem for large portfolio transactions. It is an example of how Deep Reinforcement Learning can solve a real-world problem once that problem is simulated in the form of an environment. Just as Deep RL algorithms such as DQN, DDPG, and Multi-Agent DDPG are used for tasks in robotics, gaming, and even autonomous driving, they can be applied to many other real-world problems by properly formulating the task as an environment.
7 |
8 | The 'environment' is a Python class whose methods and attributes model the task: it generates 'States' for an agent. The agent is a class that implements one of the algorithms above; it produces 'Actions' and is rewarded based on each action. The crucial step is to define the 'State', 'Action', and 'Reward' carefully, based on the problem at hand (see the interaction sketch at the end of this README).
9 |
10 | In this repository, I show how this can be done for a finance-related problem.
11 | 
12 | Please see the report.pdf file for a full description.
13 |
14 | The basic code for this project was provided by Udacity. I modified it to simulate stock prices with a more realistic Geometric Brownian Motion model.
15 |
16 |
17 |
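18 | To make the interaction loop concrete, here is a minimal sketch of how the pieces fit together (a trimmed-down version of main.py; the hyperparameter values are illustrative):
19 | 
20 | ```python
21 | import syntheticChrissAlmgren as sca
22 | from ddpg_agent import Agent
23 | 
24 | # The environment generates States; the agent produces Actions and receives Rewards
25 | env = sca.MarketEnvironment()
26 | agent = Agent(state_size=env.observation_space_dimension(),
27 |               action_size=env.action_space_dimension(),
28 |               random_seed=0)
29 | 
30 | state = env.reset(seed=0, liquid_time=60, num_trades=60, lamb=1e-6)
31 | env.start_transactions()
32 | 
33 | while True:
34 |     action = agent.act(state, add_noise=True)            # Action in [0, 1]
35 |     next_state, reward, done, info = env.step(action)    # Reward derived from shortfall
36 |     agent.step(state, action, reward, next_state, done)  # store experience, learn
37 |     state = next_state                                   # roll the State forward
38 |     if info.done:
39 |         print(info.implementation_shortfall)
40 |         break
41 | ```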
--------------------------------------------------------------------------------
/ddpg_agent.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import random
3 | import copy
4 | from collections import namedtuple, deque
5 |
6 | from model import Actor, Critic
7 |
8 | import torch
9 | import torch.nn.functional as F
10 | import torch.optim as optim
11 |
12 | BUFFER_SIZE = int(1e4) # replay buffer size
13 | BATCH_SIZE = 128 # minibatch size
14 | GAMMA = 0.99 # discount factor
15 | TAU = 1e-3 # for soft update of target parameters
16 | LR_ACTOR = 1e-4 # learning rate of the actor
17 | LR_CRITIC = 1e-3 # learning rate of the critic
18 | WEIGHT_DECAY = 0 # L2 weight decay
19 |
20 | device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
21 |
22 | class Agent():
23 | """Interacts with and learns from the environment."""
24 |
25 | def __init__(self, state_size, action_size, random_seed):
26 | """Initialize an Agent object.
27 |
28 | Params
29 | ======
30 | state_size (int): dimension of each state
31 | action_size (int): dimension of each action
32 | random_seed (int): random seed
33 | """
34 | self.state_size = state_size
35 | self.action_size = action_size
36 | self.seed = random.seed(random_seed)
37 |
38 | # Actor Network (w/ Target Network)
39 | self.actor_local = Actor(state_size, action_size, random_seed).to(device)
40 | self.actor_target = Actor(state_size, action_size, random_seed).to(device)
41 | self.actor_optimizer = optim.Adam(self.actor_local.parameters(), lr=LR_ACTOR)
42 |
43 | # Critic Network (w/ Target Network)
44 | self.critic_local = Critic(state_size, action_size, random_seed).to(device)
45 | self.critic_target = Critic(state_size, action_size, random_seed).to(device)
46 | self.critic_optimizer = optim.Adam(self.critic_local.parameters(), lr=LR_CRITIC, weight_decay=WEIGHT_DECAY)
47 |
48 | # Noise process
49 | self.noise = OUNoise(action_size, random_seed)
50 |
51 | # Replay memory
52 | self.memory = ReplayBuffer(action_size, BUFFER_SIZE, BATCH_SIZE, random_seed)
53 |
54 | def step(self, state, action, reward, next_state, done):
55 | """Save experience in replay memory, and use random sample from buffer to learn."""
56 | # Save experience / reward
57 | self.memory.add(state, action, reward, next_state, done)
58 |
59 | # Learn, if enough samples are available in memory
60 | if len(self.memory) > BATCH_SIZE:
61 | experiences = self.memory.sample()
62 | self.learn(experiences, GAMMA)
63 |
64 | def act(self, state, add_noise=True):
65 | """Returns actions for given state as per current policy."""
66 | state = torch.from_numpy(state).float().to(device)
67 | self.actor_local.eval()
68 | with torch.no_grad():
69 | action = self.actor_local(state).cpu().data.numpy()
70 | self.actor_local.train()
71 | if add_noise:
72 | action += self.noise.sample()
73 |         action = (action + 1.0) / 2.0   # rescale tanh output from [-1, 1] to [0, 1]
74 | return np.clip(action, 0, 1)
75 |
76 |
77 | def reset(self):
78 | self.noise.reset()
79 |
80 | def learn(self, experiences, gamma):
81 | """Update policy and value parameters using given batch of experience tuples.
82 | Q_targets = r + γ * critic_target(next_state, actor_target(next_state))
83 | where:
84 | actor_target(state) -> action
85 | critic_target(state, action) -> Q-value
86 |
87 | Params
88 | ======
89 | experiences (Tuple[torch.Tensor]): tuple of (s, a, r, s', done) tuples
90 | gamma (float): discount factor
91 | """
92 | states, actions, rewards, next_states, dones = experiences
93 |
94 | # ---------------------------- update critic ---------------------------- #
95 | # Get predicted next-state actions and Q values from target models
96 | actions_next = self.actor_target(next_states)
97 | Q_targets_next = self.critic_target(next_states, actions_next)
98 | # Compute Q targets for current states (y_i)
99 | Q_targets = rewards + (gamma * Q_targets_next * (1 - dones))
100 | # Compute critic loss
101 | Q_expected = self.critic_local(states, actions)
102 | critic_loss = F.mse_loss(Q_expected, Q_targets)
103 | # Minimize the loss
104 | self.critic_optimizer.zero_grad()
105 | critic_loss.backward()
106 | self.critic_optimizer.step()
107 |
108 | # ---------------------------- update actor ---------------------------- #
109 | # Compute actor loss
110 | actions_pred = self.actor_local(states)
111 | actor_loss = -self.critic_local(states, actions_pred).mean()
112 | # Minimize the loss
113 | self.actor_optimizer.zero_grad()
114 | actor_loss.backward()
115 | self.actor_optimizer.step()
116 |
117 | # ----------------------- update target networks ----------------------- #
118 | self.soft_update(self.critic_local, self.critic_target, TAU)
119 | self.soft_update(self.actor_local, self.actor_target, TAU)
120 |
121 | def soft_update(self, local_model, target_model, tau):
122 | """Soft update model parameters.
123 | θ_target = τ*θ_local + (1 - τ)*θ_target
124 |
125 | Params
126 | ======
127 | local_model: PyTorch model (weights will be copied from)
128 | target_model: PyTorch model (weights will be copied to)
129 | tau (float): interpolation parameter
130 | """
131 | for target_param, local_param in zip(target_model.parameters(), local_model.parameters()):
132 | target_param.data.copy_(tau*local_param.data + (1.0-tau)*target_param.data)
133 |
134 | class OUNoise:
135 | """Ornstein-Uhlenbeck process."""
136 |
137 | def __init__(self, size, seed, mu=0., theta=0.15, sigma=0.2):
138 | """Initialize parameters and noise process."""
139 | self.mu = mu * np.ones(size)
140 | self.theta = theta
141 | self.sigma = sigma
142 | self.seed = random.seed(seed)
143 | self.reset()
144 |
145 | def reset(self):
146 | """Reset the internal state (= noise) to mean (mu)."""
147 | self.state = copy.copy(self.mu)
148 |
149 | def sample(self):
150 | """Update internal state and return it as a noise sample."""
151 | x = self.state
152 |         dx = self.theta * (self.mu - x) + self.sigma * np.array([random.gauss(0, 1) for i in range(len(x))])  # Gaussian increments (random.random() is uniform on [0, 1) and would bias the noise upward)
153 | self.state = x + dx
154 | return self.state
155 |
156 | class ReplayBuffer:
157 | """Fixed-size buffer to store experience tuples."""
158 |
159 | def __init__(self, action_size, buffer_size, batch_size, seed):
160 | """Initialize a ReplayBuffer object.
161 | Params
162 | ======
163 | buffer_size (int): maximum size of buffer
164 | batch_size (int): size of each training batch
165 | """
166 | self.action_size = action_size
167 | self.memory = deque(maxlen=buffer_size) # internal memory (deque)
168 | self.batch_size = batch_size
169 | self.experience = namedtuple("Experience", field_names=["state", "action", "reward", "next_state", "done"])
170 | self.seed = random.seed(seed)
171 |
172 | def add(self, state, action, reward, next_state, done):
173 | """Add a new experience to memory."""
174 | e = self.experience(state, action, reward, next_state, done)
175 | self.memory.append(e)
176 |
177 | def sample(self):
178 | """Randomly sample a batch of experiences from memory."""
179 | experiences = random.sample(self.memory, k=self.batch_size)
180 |
181 | states = torch.from_numpy(np.vstack([e.state for e in experiences if e is not None])).float().to(device)
182 | actions = torch.from_numpy(np.vstack([e.action for e in experiences if e is not None])).float().to(device)
183 | rewards = torch.from_numpy(np.vstack([e.reward for e in experiences if e is not None])).float().to(device)
184 | next_states = torch.from_numpy(np.vstack([e.next_state for e in experiences if e is not None])).float().to(device)
185 | dones = torch.from_numpy(np.vstack([e.done for e in experiences if e is not None]).astype(np.uint8)).float().to(device)
186 |
187 | return (states, actions, rewards, next_states, dones)
188 |
189 | def __len__(self):
190 | """Return the current size of internal memory."""
191 | return len(self.memory)
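192 | 
193 | # --- Usage sketch (illustrative smoke test, not part of the training pipeline) ---
194 | # The state/action sizes below are hypothetical stand-ins for the dimensions
195 | # reported by syntheticChrissAlmgren.MarketEnvironment.
196 | if __name__ == "__main__":
197 |     agent = Agent(state_size=8, action_size=1, random_seed=0)
198 |     state = np.zeros(8, dtype=np.float32)
199 |     for t in range(5):
200 |         action = agent.act(state, add_noise=True)            # action clipped to [0, 1]
201 |         next_state = np.random.randn(8).astype(np.float32)   # stand-in for env.step()
202 |         agent.step(state, action, 0.0, next_state, t == 4)   # buffer experience; learn once full
203 |         state = next_state
204 |     print("sample action:", action)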
--------------------------------------------------------------------------------
/main.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 |
3 | import syntheticChrissAlmgren as sca
4 | from ddpg_agent import Agent
5 | from workspace_utils import keep_awake
6 | from collections import deque
7 |
8 | # Create simulation environment
9 | env = sca.MarketEnvironment()
10 |
11 | # Initialize Feed-forward DNNs for Actor and Critic models.
12 | agent = Agent(state_size=env.observation_space_dimension(), action_size=env.action_space_dimension(), random_seed=0)
13 |
14 | # Set the liquidation time
15 | lqt = 60
16 |
17 | # Set the number of trades
18 | n_trades = 60
19 |
20 | # Set trader's risk aversion
21 | tr = 1e-6
22 |
23 | # Set the number of episodes to run the simulation
24 | episodes = 10000
25 |
26 | shortfall_hist = np.array([])
27 | shortfall_deque = deque(maxlen=100)
28 |
29 |
30 | for episode in keep_awake(range(episodes)):
31 |     # Reset the environment
32 | cur_state = env.reset(seed = episode, liquid_time = lqt, num_trades = n_trades, lamb = tr)
33 |
34 | # set the environment to make transactions
35 | env.start_transactions()
36 |
37 | for i in range(n_trades + 1):
38 |
39 | # Predict the best action for the current state.
40 | action = agent.act(cur_state, add_noise = True)
41 |
42 |         # Perform the action; receive the new state, reward, done flag, and info.
43 | new_state, reward, done, info = env.step(action)
44 |
45 | # current state, action, reward, new state are stored in the experience replay
46 | agent.step(cur_state, action, reward, new_state, done)
47 |
48 | # roll over new state
49 | cur_state = new_state
50 |
51 | if info.done:
52 | shortfall_hist = np.append(shortfall_hist, info.implementation_shortfall)
53 | shortfall_deque.append(info.implementation_shortfall)
54 | break
55 |
56 | if (episode + 1) % 100 == 0: # print average shortfall over last 100 episodes
57 | print('\rEpisode [{}/{}]\tAverage Shortfall: ${:,.2f}'.format(episode + 1, episodes, np.mean(shortfall_deque)))
58 |
59 | print('\nAverage Implementation Shortfall: ${:,.2f} \n'.format(np.mean(shortfall_hist)))
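60 | 
61 | # Console output has the form below (values vary run to run; DRL_GBM.ipynb shows a full log):
62 | #   Episode [100/10000]    Average Shortfall: $...
63 | #   ...
64 | #   Average Implementation Shortfall: $...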
--------------------------------------------------------------------------------
/model.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 |
3 | import torch
4 | import torch.nn as nn
5 | import torch.nn.functional as F
6 |
7 | def hidden_init(layer):
8 |     fan_in = layer.weight.data.size()[1]  # nn.Linear weight is (out_features, in_features); fan-in is dim 1
9 | lim = 1. / np.sqrt(fan_in)
10 | return (-lim, lim)
11 |
12 | class Actor(nn.Module):
13 | """Actor (Policy) Model."""
14 |
15 | def __init__(self, state_size, action_size, seed, fc1_units=24, fc2_units=48):
16 | """Initialize parameters and build model.
17 | Params
18 | ======
19 | state_size (int): Dimension of each state
20 | action_size (int): Dimension of each action
21 | seed (int): Random seed
22 | fc1_units (int): Number of nodes in first hidden layer
23 | fc2_units (int): Number of nodes in second hidden layer
24 | """
25 | super(Actor, self).__init__()
26 | self.seed = torch.manual_seed(seed)
27 | self.fc1 = nn.Linear(state_size, fc1_units)
28 | self.fc2 = nn.Linear(fc1_units, fc2_units)
29 | self.fc3 = nn.Linear(fc2_units, action_size)
30 | self.reset_parameters()
31 |
32 | def reset_parameters(self):
33 | self.fc1.weight.data.uniform_(*hidden_init(self.fc1))
34 | self.fc2.weight.data.uniform_(*hidden_init(self.fc2))
35 | self.fc3.weight.data.uniform_(-3e-3, 3e-3)
36 |
37 | def forward(self, state):
38 | """Build an actor (policy) network that maps states -> actions."""
39 | x = F.relu(self.fc1(state))
40 | x = F.relu(self.fc2(x))
41 |         return torch.tanh(self.fc3(x))  # torch.tanh (F.tanh is deprecated)
42 |
43 |
44 | class Critic(nn.Module):
45 | """Critic (Value) Model."""
46 |
47 | def __init__(self, state_size, action_size, seed, fcs1_units=24, fc2_units=48):
48 | """Initialize parameters and build model.
49 | Params
50 | ======
51 | state_size (int): Dimension of each state
52 | action_size (int): Dimension of each action
53 | seed (int): Random seed
54 | fcs1_units (int): Number of nodes in the first hidden layer
55 | fc2_units (int): Number of nodes in the second hidden layer
56 | """
57 | super(Critic, self).__init__()
58 | self.seed = torch.manual_seed(seed)
59 | self.fcs1 = nn.Linear(state_size, fcs1_units)
60 | self.fc2 = nn.Linear(fcs1_units+action_size, fc2_units)
61 | self.fc3 = nn.Linear(fc2_units, 1)
62 | self.reset_parameters()
63 |
64 | def reset_parameters(self):
65 | self.fcs1.weight.data.uniform_(*hidden_init(self.fcs1))
66 | self.fc2.weight.data.uniform_(*hidden_init(self.fc2))
67 | self.fc3.weight.data.uniform_(-3e-3, 3e-3)
68 |
69 | def forward(self, state, action):
70 | """Build a critic (value) network that maps (state, action) pairs -> Q-values."""
71 | xs = F.relu(self.fcs1(state))
72 | x = torch.cat((xs, action), dim=1)
73 | x = F.relu(self.fc2(x))
74 | return self.fc3(x)
75 |
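76 | # --- Shape check (illustrative; the dimensions below are hypothetical) ---
77 | if __name__ == "__main__":
78 |     actor = Actor(state_size=8, action_size=1, seed=0)
79 |     critic = Critic(state_size=8, action_size=1, seed=0)
80 |     states = torch.rand(4, 8)              # batch of 4 states
81 |     actions = actor(states)                # (4, 1), in (-1, 1) via tanh
82 |     q_values = critic(states, actions)     # (4, 1) Q-value estimates
83 |     print(actions.shape, q_values.shape)
84 | 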
--------------------------------------------------------------------------------
/report.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AkshayS21/Reinforcement-Learning-for-Optimal-Financial-Trading/b34069de0b73c60d5fa9a5af1359ad82119d937e/report.pdf
--------------------------------------------------------------------------------