├── .gitignore
├── Ch 1 - Introduction.ipynb
├── Ch 2 - Rank Nullspace and Determinants.ipynb
├── Ch 3 - Eigenvalues and Eigenvectors.ipynb
├── Ch 4 - Least Squares and Least Norm.ipynb
├── Ch 5 - Singular Value Decomposition.ipynb
├── README.md
├── helpers.py
├── img
│   ├── Loop-left-shad.png
│   ├── determinant_1.png
│   ├── determinant_2.png
│   ├── disclaimer.png
│   ├── dyn_eig.png
│   ├── eig.png
│   ├── excel.jpg
│   ├── gps.png
│   ├── hamster.png
│   ├── hard_disk.jpg
│   ├── independence.png
│   ├── intro_least_norm_control.png
│   ├── intro_naive_control.png
│   ├── intro_system_modes.png
│   ├── intro_system_ringing.png
│   ├── markov_1.png
│   ├── markov_matrix.png
│   ├── nullspace.png
│   ├── qr_factorization.png
│   ├── radio_tower.png
│   ├── receiver.png
│   ├── receiver2.png
│   ├── rigid_body.png
│   ├── rocket.png
│   ├── satellite-clipart-satellite-signal-21.zip
│   ├── sensor_interp.png
│   ├── svd.png
│   ├── svd_matrices.png
│   ├── taylor_series.png
│   ├── vector.png
│   └── vector_addition.png
├── requirements.txt
└── runtime.txt
/.gitignore:
--------------------------------------------------------------------------------
1 | *.pptx
2 |
--------------------------------------------------------------------------------
/Ch 1 - Introduction.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# A Crash Course in Applied Linear Algebra\n",
8 | "Patrick Landreman | Spry Health"
9 | ]
10 | },
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "### Some Motivation: Predicting the Future\n",
16 | "\n",
17 | "Since we are all busy people with too much to do, why should you spend your precious time reading this notebook? I'd like to give you a sample of the kind of power you'll have by the end.\n",
18 | "\n",
19 | "Consider the following time series plot:\n",
20 | "\n",
21 | "\n",
22 | "\n",
23 | "This could be the value of a stock price, or the position of an aircraft, or some biological signal, or any number of things. You are tasked with predicting the value at some future time. Looks unpleasant, no? Maybe there is some periodicity emerging by the end, but surely this is not a trivial task."
24 | ]
25 | },
26 | {
27 | "cell_type": "markdown",
28 | "metadata": {},
29 | "source": [
30 | "Now suppose you know that the series above was a mixture of a number of much simpler functions, and you could tell the amount of each function:\n",
31 | "\n",
32 | "\n",
33 | "\n",
34 | "Predicting how each of these functions will evolve should be a much simpler task. The question is - how do you find these simpler functions, and how do you find the weight of each function?"
35 | ]
36 | },
37 | {
38 | "cell_type": "markdown",
39 | "metadata": {},
40 | "source": [
41 | "Perhaps more interesting, suppose the system above comes from a robotic arm which can move in a 2D plane. The time series corresponds to the position of the end of the arm in one direction - we kicked the system, which caused some shaking and oscillating that subsides over time. \n",
42 | "\n",
43 | "We'll describe the complete position with a pair of numbers, $(y_1, y_2)$. We also can control our robot with some inputs, $(u_1, u_2)$. If we wanted to drive the arm to a specific location, say $(y_1, y_2) = (0.2, -0.2)$, we can expect that the arm will move, and then because of momentum or other issues the system might overshoot a bit and take some time to stabilize (this is exactly the kind of thing that happens with the read head in a spinning-disk hard drive, for instance). The next image shows what happens if at 50 seconds we apply a specific set of inputs to reach the target:"
44 | ]
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "\n",
51 | "\n",
52 | "Ok, we get there after ~3000 seconds, but it takes a while for the ringing to settle down."
53 | ]
54 | },
55 | {
56 | "cell_type": "markdown",
57 | "metadata": {},
58 | "source": [
59 | "Instead, however, it is possible to choose a more complicated sequence of inputs that brings the arm to the target in **exactly** a specified amount of time:\n",
60 | "\n",
61 | "\n",
62 | "\n",
63 | "Here, the arm reaches the specified coordinates at 800 seconds and remains there, perfectly stable. From looking at the sequence of inputs, there is **no way** a human could intuit the sequence.\n",
64 | "\n",
65 | "This is just one of the kinds of problems I hope to expose you to today."
66 | ]
67 | },
68 | {
69 | "attachments": {},
70 | "cell_type": "markdown",
71 | "metadata": {},
72 | "source": [
73 | "### Acknowledgements\n",
74 | "\n",
75 | "*This presentation is dedicated to Prof. Stephen Boyd and Prof. Sanjay Lall, who first exposed me to this way of seeing the world. Many of the ideas I present here were inspired largely by them and their teachers, and they deserve credit for the enormous amount of work they have done.*\n",
76 | "\n",
77 | "*If this subject interests you, I* strongly *suggest exploring their [books](http://vmls-book.stanford.edu/), [course notes](http://ee263.stanford.edu/), and lecture series on [YouTube](https://www.youtube.com/playlist?list=PL06960BA52D0DB32B).*"
78 | ]
79 | },
80 | {
81 | "cell_type": "markdown",
82 | "metadata": {},
83 | "source": [
84 | "### Python Setup\n",
85 | "\n",
 86 | "Everything in this talk can be done with a basic installation of Numpy and Scipy. The versions should not be important. Scipy is used exclusively for some convenience functions, and Matplotlib is included only for visualization purposes. Neither is necessary for linear algebra. This notebook was written using Python 3.6, Numpy 1.16.4, Scipy 1.2.1, and Matplotlib 3.1.1."
87 | ]
88 | },
89 | {
90 | "cell_type": "code",
91 | "execution_count": 9,
92 | "metadata": {
93 | "ExecuteTime": {
94 | "end_time": "2019-11-04T04:34:25.444430Z",
95 | "start_time": "2019-11-04T04:34:25.431666Z"
96 | }
97 | },
98 | "outputs": [],
99 | "source": [
100 | "# Render MPL figures within notebook cells\n",
101 | "%matplotlib inline\n",
102 | "\n",
103 | "# Import python libraries\n",
104 | "import numpy as np\n",
105 | "import matplotlib.pyplot as plt\n",
106 | "from matplotlib import rcParams"
107 | ]
108 | },
109 | {
110 | "cell_type": "code",
111 | "execution_count": 10,
112 | "metadata": {
113 | "ExecuteTime": {
114 | "end_time": "2019-11-04T04:34:25.501434Z",
115 | "start_time": "2019-11-04T04:34:25.448882Z"
116 | }
117 | },
118 | "outputs": [],
119 | "source": [
120 | "# Configure some defaults for plots\n",
121 | "rcParams['font.size'] = 16\n",
122 | "rcParams['figure.figsize'] = (10, 3)"
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "execution_count": 11,
128 | "metadata": {
129 | "ExecuteTime": {
130 | "end_time": "2019-11-04T04:34:25.513684Z",
131 | "start_time": "2019-11-04T04:34:25.506798Z"
132 | }
133 | },
134 | "outputs": [],
135 | "source": [
136 | "# Set Numpy's random number generator so the same results are produced each time the notebook is run\n",
137 | "np.random.seed(0)"
138 | ]
139 | },
140 | {
141 | "cell_type": "markdown",
142 | "metadata": {},
143 | "source": [
144 | "### Why did I make this notebook?\n",
145 | "\n",
146 | "\n",
147 | "If your exposure to linear algebra was anything like my first course, you probably don't think back on the experience longingly, wishing you could relive those days. My professor literally opened the class by saying he didn't understand why linear algebra was interesting or useful. It was a pretty painful semester, full of abstract concepts like vector subspaces, and I just couldn't connect with the material. It wasn't until the end of my graduate school career, when I took a class that beat me over the head with examples of how powerful and magical this subject can be. I want to share that experience with more people.\n",
148 | "\n",
149 | "Associating linear algebra with a kind of fascination or awe is especially helpful, because things like matrices and vectors have a way of showing up in just about every quantitative situation. In my world of signal processing and statistics, it's virtually impossible to get through a research paper without being comfortable manipulating vectors and matrices. \n",
150 | "\n",
151 | "Where the magic really becomes clear is when - for whatever discipline you're working in - you manage to cram your problem into something that looks like \n",
152 | "\n",
153 | " \n",
154 | "\n",
155 | "
\n",
156 | "    $y = Ax$\n",
157 | "
\n",
163 | "\n",
164 | "suddenly you can apply ideas from finance, quantum mechanics, operations research, RADAR, medical imaging, or any number of other fields to solve your problem. The only thing you need to learn to be dangerous in any of these fields is an intuition for the properties and structure of the matrix $A$.\n",
165 | "\n",
166 | "So how can four ASCII characters (not including the spaces) apply to so many situations in the real world almost stupidly well? In part because each of these symbols encompasses a great deal of complexity. Let's start by unpacking them one by one:"
167 | ]
168 | },
169 | {
170 | "cell_type": "markdown",
171 | "metadata": {},
172 | "source": [
173 | "### Vectors and Matrices\n",
174 | "\n",
175 | "All of this should be material you've seen in courses before, so I'll go through it very quickly just to put us on the same footing as far as notation.\n",
176 | "\n",
177 | "Beginning with $y$:\n",
178 | "\n",
179 | "$y$ is a **vector**, which is essentially a list of numbers. It's often useful to think of vectors as arrows that point from the origin to some point in space, where each entry of $y$ corresponds to a dimension in that space. For a 2-dimensional example, we could have a vector with entries $[2, 3]$ that one could visualize as \n",
180 | "\n",
181 | "\n",
182 | "\n",
183 | "As convention, lowercase variables without subscripts will be vectors. If $y$ is a vector, then $y_1$, $y_2 \\ldots$ are the entries of $y$."
184 | ]
185 | },
186 | {
187 | "cell_type": "markdown",
188 | "metadata": {},
189 | "source": [
190 | "Vectors add elementwise - that is, you add the elements in the corresponding positions. Graphically, this has the effect of starting one vector from the point of the other vector. The vector sum is the final point that is reached:\n",
191 | "\n",
192 | "\n"
193 | ]
194 | },
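In Numpy, 1-D arrays add elementwise by default, matching the picture above. A minimal sketch, with made-up numbers:

```python
import numpy as np

# Two 2-dimensional vectors (values chosen arbitrarily for illustration)
a = np.array([2, 3])
b = np.array([1, -1])

# Elementwise addition: entries in corresponding positions are summed
c = a + b
```

Here `c` works out to `[3, 2]` — the endpoint reached by chaining the two arrows head to tail.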
195 | {
196 | "cell_type": "markdown",
197 | "metadata": {},
198 | "source": [
199 | "Vectors can be written in row or column form. For this presentation, all vectors are column vectors, and the corresponding row vectors are notated as e.g. $y^\\mathsf{T}$. The inner product (or dot product) of two vectors is \n",
200 | "\n",
201 | "
\n",
202 | "    $y^\mathsf{T}x = y_1 x_1 + y_2 x_2 + \cdots + y_n x_n$\n",
203 | "
\n",
489 | "\n",
490 | "\n",
491 | "* $x$ is the coefficients of the model for each lag\n",
492 | "* $y$ is the time series output of the model\n",
493 | "* $\\mathbf{A}$ is a stack of lagged windows of the time series"
494 | ]
495 | },
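To make the autoregressive picture concrete, here is one way to build the stack of lagged windows and fit the lag coefficients with least squares. The toy series and the window length `p` are made up for illustration; this is a sketch, not the notebook's actual model:

```python
import numpy as np

y_full = np.arange(10.0)  # a toy time series (a simple ramp)
p = 3                     # number of lags (assumed for illustration)

# Each row of A is a window of p consecutive past values;
# the matching entry of y is the value that follows that window
A = np.array([y_full[i:i + p] for i in range(len(y_full) - p)])
y = y_full[p:]

# Least-squares estimate of the lag coefficients x in y = Ax
x, *_ = np.linalg.lstsq(A, y, rcond=None)
```

For this ramp the fit is exact, since each value is a linear combination of its predecessors.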
496 | {
497 | "cell_type": "markdown",
498 | "metadata": {},
499 | "source": [
500 | "### More than just Lines\n",
501 | "\n",
502 | "This last example of an autoregressive model, like the time series forecasting example at the beginning, is great for highlighting that linear algebra applies to situations that look anything but linear! Let's explore this idea more closely - consider the case of a polynomial evaluated at $m$ different points:\n",
503 | "\n",
504 | " \n",
505 | "\n",
506 | "
\n",
507 | "    $y_i = c_0 + c_1 x_i + c_2 x_i^2 + \cdots + c_{n-1} x_i^{n-1}, \quad i = 1, \ldots, m$\n",
508 | "
\n",
516 | "\n",
517 | " \n",
518 | "\n",
519 | "It's certainly true that these equations are not linear as far as $x$ is concerned. BUT! The coefficients of the polynomial are all first order, which means we can write the system of equations this way:\n",
520 | " \n",
521 | "\n",
522 | "
\n",
523 | "    $\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{bmatrix} = \begin{bmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^{n-1} \\ 1 & x_2 & x_2^2 & \cdots & x_2^{n-1} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_m & x_m^2 & \cdots & x_m^{n-1} \end{bmatrix} \begin{bmatrix} c_0 \\ c_1 \\ \vdots \\ c_{n-1} \end{bmatrix}$\n",
524 | "
"
545 | ]
546 | },
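Numpy can build exactly this matrix of powers. A small sketch using `np.vander` with `increasing=True`, so the columns run $1, x, x^2, \ldots$ (the points and coefficients here are invented for illustration):

```python
import numpy as np

x_pts = np.array([1.0, 2.0, 3.0, 4.0])  # m = 4 evaluation points
c = np.array([1.0, -2.0, 0.5])          # coefficients of c0 + c1*x + c2*x^2

# Rows are [1, x_i, x_i^2]; one matrix-vector product then evaluates
# the polynomial at every point simultaneously
A = np.vander(x_pts, N=len(c), increasing=True)
y = A @ c
```

Fitting a polynomial to data is the same construction run in reverse: build `A` from the sample points and solve for `c`.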
547 | {
548 | "cell_type": "markdown",
549 | "metadata": {},
550 | "source": [
551 | "In fact, there is absolutely no requirement on where the entries of $A$ come from! They often will arise from some horrendous expressions involving sines, logarithms, or enhanced spherical Riemann functions of the 60th kind (I made that up). As long as you can remain calm, slowly step back, and recognize that the **mixture** of these functions is still linear, then you still have\n",
552 | "\n",
553 | " \n",
554 | "\n",
555 | "
\n",
556 | "    $y = Ax$\n",
557 | "
\n",
262 | "\n",
263 | "This would be a silly thing to do, when one could immediately save half of the memory by storing only the first column, and remembering to double the value whenever we need to index into the 2nd column. We can achieve this by representing $A$ as the following:\n",
264 | "\n",
265 | "
\n",
266 | "    $A = \begin{bmatrix} a_1 \end{bmatrix} \begin{bmatrix} 1 & 2 \end{bmatrix}$\n",
267 | "
\n",
268 | "\n",
269 | "where $a_1$ denotes the first column of $A$."
286 | ]
287 | },
288 | {
289 | "cell_type": "markdown",
290 | "metadata": {},
291 | "source": [
292 | "This idea generalizes to matrices of any shape:\n",
293 | "\n",
294 | "> A matrix $A$ with rank $r$ can be factored into matrices $Q \\in \\mathbb{R}^{m \\times r}$ and $R \\in \\mathbb{R}^{r \\times n}$\n",
295 | "\n",
296 | "\n",
297 | "\n",
298 | "Our lovely Mathematician friends have given us multiple algorithms that will produce these smaller matrices for us (see [QR decomposition](https://en.wikipedia.org/wiki/QR_decomposition)). We'll see one such algorithm in a later chapter.\n",
299 | "\n",
300 | "Armed with this fact, we can quickly calculate how many numbers we have to store for the original matrix versus the factorized matrices:\n",
301 | "\n",
302 | "$\\;\\;$ Size of $A = mn$\n",
303 | "\n",
304 | "$\\;\\;$ Size of $Q$ + Size of $R = mr + rn = r(m+n)$\n",
305 | "\n",
306 | "As soon as $r$ becomes appreciably smaller than $m$ and $n$, this trick can save a lot of space:"
307 | ]
308 | },
309 | {
310 | "cell_type": "code",
311 | "execution_count": 8,
312 | "metadata": {
313 | "ExecuteTime": {
314 | "end_time": "2019-11-04T04:37:09.341092Z",
315 | "start_time": "2019-11-04T04:37:09.330100Z"
316 | }
317 | },
318 | "outputs": [
319 | {
320 | "data": {
321 | "text/plain": [
322 | "1000000"
323 | ]
324 | },
325 | "execution_count": 8,
326 | "metadata": {},
327 | "output_type": "execute_result"
328 | }
329 | ],
330 | "source": [
331 | "m = 1000\n",
332 | "n = 1000\n",
333 | "r = 10\n",
334 | "\n",
335 | "# Storage cost of A\n",
336 | "m*n"
337 | ]
338 | },
339 | {
340 | "cell_type": "code",
341 | "execution_count": 9,
342 | "metadata": {
343 | "ExecuteTime": {
344 | "end_time": "2019-11-04T04:37:09.354373Z",
345 | "start_time": "2019-11-04T04:37:09.345070Z"
346 | }
347 | },
348 | "outputs": [
349 | {
350 | "data": {
351 | "text/plain": [
352 | "20000"
353 | ]
354 | },
355 | "execution_count": 9,
356 | "metadata": {},
357 | "output_type": "execute_result"
358 | }
359 | ],
360 | "source": [
361 | "# Storage cost of QR\n",
362 | "m*r + r*n"
363 | ]
364 | },
365 | {
366 | "cell_type": "markdown",
367 | "metadata": {},
368 | "source": [
369 | "**Multiplication Efficiency**\n",
370 | "\n",
371 | "Amusingly, the math is identical when you estimate how many floating-point operations (FLOPs) it takes to compute $Ax = Q(Rx)$\n",
372 | "\n",
373 | "$\;\;$ FLOPs to compute $Ax$ = (# rows) x FLOPs to compute $(\tilde{a}^\mathsf{T}x) \sim O(mn)$\n",
374 | "\n",
375 | "$\\;\\;$ FLOPs to compute $QRx\\sim O(mr + rn) = O(r(m+n))$"
376 | ]
377 | },
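We can confirm numerically that the factored product gives the same answer. A sketch that builds a rank-$r$ matrix directly from random factors (for illustration only; an actual QR routine would produce such factors from $A$ itself):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 200, 150, 5

Q = rng.standard_normal((m, r))  # tall factor, m x r
R = rng.standard_normal((r, n))  # wide factor, r x n
A = Q @ R                        # a rank-r matrix, stored densely

x = rng.standard_normal(n)

y_dense = A @ x           # ~ m*n multiply-adds
y_factored = Q @ (R @ x)  # ~ r*(m+n) multiply-adds
```

The two results agree to floating-point precision, but the factored version touches far fewer numbers.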
378 | {
379 | "cell_type": "markdown",
380 | "metadata": {},
381 | "source": [
382 | "### Nullspace\n",
383 | "\n",
384 | "\"The Nullspace\" is a term that haunted me from my first linear algebra class. I remember thinking it sounded cool, but I had no grasp of what it really meant. I think the simplest way of understanding the idea is this: every matrix has a nullspace, which is a list of vectors. If you multiply your matrix $A$ by a vector $z$ and the result is the zero vector, then you put $z$ in the list.\n",
385 | "\n",
386 | "> The nullspace of $A$ is the set of vectors, $z$, such that $Az = 0$\n",
387 | "\n",
388 | "If we were to express this in code, `A.nullspace()` would return a list of vectors. This list would be infinitely long, but one can instead return a finite list of vectors whose linear combinations reconstruct any vector in the nullspace (we would call this finite set a **basis** for the nullspace). There isn't a direct way to return the nullspace of $A$ with Numpy, but we will cover how to do this in a later chapter (see Ch 5 - Singular Value Decomposition).\n",
389 | "\n",
390 | "The **key idea** about vectors in the nullspace of $A$ is that they represent redundancy or flexibility in the system:\n",
391 | "\n",
392 | "$A(x + z) = Ax + Az = Ax + 0 = Ax$\n",
393 | "\n",
394 | "We can add any vector from the nullspace \"for free.\" Visually, if we plot the points $x$ where $Ax$ equals some constant $b$, then any vector that is parallel to the line is in the nullspace of $A$:\n",
395 | "\n",
396 | "\n",
397 | "\n",
398 | "The points making up the green line represent all the vectors in the nullspace of A, for this example.\n",
399 | "\n",
400 | "This is great if, for instance, we have $y=Ax$ where $y$ is a target we are trying to achieve. Lots of vectors in the nullspace means we have options to choose from for our input, and we can optimize on some other criteria.\n",
401 | "\n",
402 | "However, if you recall our transmitter/receiver analogy from above, having vectors in the nullspace of our matrix can be a very bad thing! Imagine you are trying to reconstruct what your transmitters were sending from the sensor measurements you detect. If there are multiple ways to transmit signals that produce the same received measurements, **you can never know which signals were sent.**"
403 | ]
404 | },
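Numpy has no `nullspace` routine, but as a preview of the SVD chapter, a basis can be extracted from the right singular vectors whose singular values are (numerically) zero. The matrix here is invented for illustration:

```python
import numpy as np

# A wide 2x3 matrix: 3 inputs but only 2 outputs, so the nullspace
# must be non-trivial
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

_, s, Vt = np.linalg.svd(A)

# Rows of Vt beyond the numerical rank span the nullspace of A
rank = np.sum(s > 1e-10)
null_basis = Vt[rank:]
```

Each row `z` of `null_basis` satisfies `A @ z ≈ 0`, so it can be added to any input "for free."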
405 | {
406 | "cell_type": "markdown",
407 | "metadata": {},
408 | "source": [
409 | "### Singularity\n",
410 | "\n",
411 | "This last idea, that you can't \"undo\" the transformation from multiplying $x$ by $A$, is mathematically expressed by saying that $A$ is not invertible. Precisely, there is no matrix $B$ such that \n",
412 | "\n",
413 | "
\n",
414 | "    $x = B(Ax)$\n",
415 | "
"
416 | ]
417 | },
418 | {
419 | "cell_type": "markdown",
420 | "metadata": {},
421 | "source": [
422 | "This idea is also expressed by saying that \"$A$ has no left inverse\". In the special case where $A$ is square, you will encounter people using the term **singular** to mean a matrix which has no inverse. As best I can find, the term comes from the observation that \"most\" square matrices are invertible, so the non-invertible ones are special, or singular. "
423 | ]
424 | },
425 | {
426 | "cell_type": "markdown",
427 | "metadata": {},
428 | "source": [
429 | "### The Determinant\n",
430 | "\n",
431 | "Another abstract concept you would undoubtedly find in a beginning linear algebra class is **the determinant** of a matrix. You probably were drilled at length on computing determinants by [cofactor expansion](https://en.wikipedia.org/wiki/Laplace_expansion), or on using them in [Cramer's Rule](https://en.wikipedia.org/wiki/Cramer%27s_rule), and that was probably all you would have done with them.\n",
432 | "\n",
433 | "The determinant is connected to all of the previous topics in this section, which I have grouped together because they all indicate whether or not a matrix operation can be undone. In applied terms, whether or not an estimation problem has a unique solution, or if a design problem has multiple choices.\n",
434 | "\n",
435 | "The determinant is a number that **determines** if a square matrix is singular. That's where the name comes from. If $\\det(A)=0$, then $A$ has no inverse.\n",
436 | "\n",
437 | "Ok great. Why?\n",
438 | "\n",
439 | "There is a great visual interpretation - we'll work in a 3-dimensional space so that we can draw it, but this extends to any number of dimensions. Suppose we have a cube with volume 1. Applying the matrix $A$ to the vectors which define the points on the boundary of the cube results in a parallelepiped with volume equal to the absolute value of the determinant of $A$:\n",
440 | "\n",
441 | "\n",
442 | "\n",
443 | "If the determinant of $A$ is 0, the resulting parallelepiped has volume 0, which implies at least one dimension of the cube is flattened:\n",
444 | "\n",
445 | "\n",
446 | "\n",
447 | "In the picture above, any information in the vertical direction is lost - you can't unflatten the shape because you don't know how high to stretch!"
448 | ]
449 | },
450 | {
451 | "cell_type": "markdown",
452 | "metadata": {},
453 | "source": [
454 | "We can demonstrate this volume concept in code:"
455 | ]
456 | },
457 | {
458 | "cell_type": "code",
459 | "execution_count": 10,
460 | "metadata": {
461 | "ExecuteTime": {
462 | "end_time": "2019-11-04T04:37:09.369929Z",
463 | "start_time": "2019-11-04T04:37:09.359000Z"
464 | }
465 | },
466 | "outputs": [
467 | {
468 | "data": {
469 | "text/plain": [
470 | "array([[1., 0., 0.],\n",
471 | " [0., 1., 0.],\n",
472 | " [0., 0., 1.]])"
473 | ]
474 | },
475 | "execution_count": 10,
476 | "metadata": {},
477 | "output_type": "execute_result"
478 | }
479 | ],
480 | "source": [
481 | "# Define a 3d unit cube\n",
482 | "C = np.eye(3)\n",
483 | "C"
484 | ]
485 | },
486 | {
487 | "cell_type": "code",
488 | "execution_count": 11,
489 | "metadata": {
490 | "ExecuteTime": {
491 | "end_time": "2019-11-04T04:37:09.385231Z",
492 | "start_time": "2019-11-04T04:37:09.373653Z"
493 | }
494 | },
495 | "outputs": [
496 | {
497 | "data": {
498 | "text/plain": [
499 | "array([[1, 4, 2],\n",
500 | " [1, 4, 4],\n",
501 | " [4, 4, 2]])"
502 | ]
503 | },
504 | "execution_count": 11,
505 | "metadata": {},
506 | "output_type": "execute_result"
507 | }
508 | ],
509 | "source": [
510 | "# Define a transformation matrix\n",
511 | "A = np.random.randint(low=1, high=5, size=(3, 3))\n",
512 | "A"
513 | ]
514 | },
515 | {
516 | "cell_type": "code",
517 | "execution_count": 12,
518 | "metadata": {
519 | "ExecuteTime": {
520 | "end_time": "2019-11-04T04:37:09.400901Z",
521 | "start_time": "2019-11-04T04:37:09.389461Z"
522 | }
523 | },
524 | "outputs": [
525 | {
526 | "data": {
527 | "text/plain": [
528 | "array([[1., 4., 2.],\n",
529 | " [1., 4., 4.],\n",
530 | " [4., 4., 2.]])"
531 | ]
532 | },
533 | "execution_count": 12,
534 | "metadata": {},
535 | "output_type": "execute_result"
536 | }
537 | ],
538 | "source": [
539 | "# Since the unit cube vectors correspond to an Identity matrix, the output of the transform\n",
540 | "# is the same as the transformation matrix\n",
541 | "A @ C"
542 | ]
543 | },
544 | {
545 | "cell_type": "code",
546 | "execution_count": 13,
547 | "metadata": {
548 | "ExecuteTime": {
549 | "end_time": "2019-11-04T04:37:09.417521Z",
550 | "start_time": "2019-11-04T04:37:09.404196Z"
551 | }
552 | },
553 | "outputs": [
554 | {
555 | "data": {
556 | "text/plain": [
557 | "24.000000000000004"
558 | ]
559 | },
560 | "execution_count": 13,
561 | "metadata": {},
562 | "output_type": "execute_result"
563 | }
564 | ],
565 | "source": [
566 | "# The volume of the transformed cube is given by the determinant\n",
567 | "np.linalg.det(A)"
568 | ]
569 | },
570 | {
571 | "cell_type": "code",
572 | "execution_count": 14,
573 | "metadata": {
574 | "ExecuteTime": {
575 | "end_time": "2019-11-04T04:37:09.434914Z",
576 | "start_time": "2019-11-04T04:37:09.420512Z"
577 | }
578 | },
579 | "outputs": [
580 | {
581 | "data": {
582 | "text/plain": [
583 | "0.0"
584 | ]
585 | },
586 | "execution_count": 14,
587 | "metadata": {},
588 | "output_type": "execute_result"
589 | }
590 | ],
591 | "source": [
592 | "# If we make two of the columns of A identical, this will have the effect of\n",
593 | "# making the parallelepiped effectively 2-dimensional. The determinant will be\n",
594 | "# zero to reflect this. Note the copy, so we don't silently mutate A itself.\n",
595 | "A_flat = A.copy()\n",
596 | "A_flat[:, 2] = A[:, 1]\n",
597 | "np.linalg.det(A_flat)"
598 | ]
599 | },
600 | {
601 | "cell_type": "markdown",
602 | "metadata": {},
603 | "source": [
604 | "### Unification\n",
605 | "\n",
606 | "You may have noticed just now that, to create a matrix with a zero determinant, I made one column of A equal to another column. This dropped the rank of $A$. This was not a coincidence. In fact, the rank of $A$ can also be interpreted as the number of dimensions in the accessible output space (the **range**) of $A$ - if this number is less than the number of dimensions of $x$, then some kind of geometric flattening is going to happen and the determinant will be zero. \n",
607 | "\n",
608 | "At the same time, removing a dimension from the range of $A$ **adds** a dimension to the nullspace of $A$! There is a handy property:\n",
609 | "\n",
610 | "> $\\text{Rank}(A) + \\text{dim Null}(A) = n$\n",
611 | "\n",
612 | "All of the topics in this section are intimately connected, each communicating whether or not a matrix is invertible. To summarize, the following statements are equivalent:\n",
613 | "\n",
614 | "- $A$ is full rank\n",
615 | "- $A$ has only the zero vector in its nullspace\n",
616 | "- the determinant of $A$ is non-zero\n",
617 | "- $A$ has a left inverse"
618 | ]
619 | },
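We can check these connections numerically. A sketch with an arbitrarily chosen rank-deficient matrix, using `np.linalg.matrix_rank` for the rank and counting (numerically) zero singular values for the nullspace dimension:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # twice the first row, which drops the rank
              [0.0, 1.0, 1.0]])

n = A.shape[1]
rank = np.linalg.matrix_rank(A)

# dim Null(A) = number of singular values that are numerically zero
s = np.linalg.svd(A, compute_uv=False)
dim_null = int(np.sum(s < 1e-10))
```

Here `rank` is 2, `dim_null` is 1, their sum is `n`, and `np.linalg.det(A)` is numerically zero — the equivalent statements all line up.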
620 | {
621 | "cell_type": "code",
622 | "execution_count": null,
623 | "metadata": {},
624 | "outputs": [],
625 | "source": []
626 | }
627 | ],
628 | "metadata": {
629 | "author": "",
630 | "kernelspec": {
631 | "display_name": "Python 3",
632 | "language": "python",
633 | "name": "python3"
634 | },
635 | "language_info": {
636 | "codemirror_mode": {
637 | "name": "ipython",
638 | "version": 3
639 | },
640 | "file_extension": ".py",
641 | "mimetype": "text/x-python",
642 | "name": "python",
643 | "nbconvert_exporter": "python",
644 | "pygments_lexer": "ipython3",
645 | "version": "3.6.7"
646 | },
647 | "latex_envs": {
648 | "LaTeX_envs_menu_present": true,
649 | "autoclose": true,
650 | "autocomplete": false,
651 | "bibliofile": "biblio.bib",
652 | "cite_by": "apalike",
653 | "current_citInitial": 1,
654 | "eqLabelWithNumbers": false,
655 | "eqNumInitial": 1,
656 | "hotkeys": {
657 | "equation": "Ctrl-E",
658 | "itemize": "Ctrl-I"
659 | },
660 | "labels_anchors": false,
661 | "latex_user_defs": false,
662 | "report_style_numbering": false,
663 | "user_envs_cfg": false
664 | },
665 | "toc": {
666 | "base_numbering": 1,
667 | "nav_menu": {},
668 | "number_sections": false,
669 | "sideBar": true,
670 | "skip_h1_title": false,
671 | "title_cell": "Table of Contents",
672 | "title_sidebar": "Contents",
673 | "toc_cell": false,
674 | "toc_position": {},
675 | "toc_section_display": true,
676 | "toc_window_display": true
677 | }
678 | },
679 | "nbformat": 4,
680 | "nbformat_minor": 2
681 | }
682 |
--------------------------------------------------------------------------------
/Ch 3 - Eigenvalues and Eigenvectors.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Eigenvalues and Eigenvectors\n",
8 | "\n",
9 | "This chapter will revisit another significant topic that you would have seen in a typical linear algebra course. During my class experience, I recall going through mountainous piles of loose-leaf paper, solving characteristic polynomials to find eigenvectors of 2x2 matrices, without any motivation for the practice. \n",
10 | "\n",
11 | "Eigenvalues and vectors turn out to have myriad applications, and hopefully the content here will help remind you how these things work and provide a context to understand them when they show up elsewhere."
12 | ]
13 | },
14 | {
15 | "cell_type": "markdown",
16 | "metadata": {},
17 | "source": [
18 | "### Python Setup"
19 | ]
20 | },
21 | {
22 | "cell_type": "code",
23 | "execution_count": 2,
24 | "metadata": {
25 | "ExecuteTime": {
26 | "end_time": "2019-11-04T15:28:26.182425Z",
27 | "start_time": "2019-11-04T15:28:24.643827Z"
28 | }
29 | },
30 | "outputs": [],
31 | "source": [
32 | "# Render MPL figures within notebook cells\n",
33 | "%matplotlib inline\n",
34 | "\n",
35 | "# Import python libraries\n",
36 | "import numpy as np\n",
37 | "import matplotlib.pyplot as plt\n",
38 | "from matplotlib import rcParams"
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "execution_count": 3,
44 | "metadata": {
45 | "ExecuteTime": {
46 | "end_time": "2019-11-04T15:28:26.193748Z",
47 | "start_time": "2019-11-04T15:28:26.187041Z"
48 | }
49 | },
50 | "outputs": [],
51 | "source": [
52 | "# Configure some defaults for plots\n",
53 | "rcParams['font.size'] = 16\n",
54 | "# rcParams['figure.figsize'] = (10, 3)"
55 | ]
56 | },
57 | {
58 | "cell_type": "code",
59 | "execution_count": 4,
60 | "metadata": {
61 | "ExecuteTime": {
62 | "end_time": "2019-11-04T15:28:26.205917Z",
63 | "start_time": "2019-11-04T15:28:26.198928Z"
64 | }
65 | },
66 | "outputs": [],
67 | "source": [
68 | "# Set Numpy's random number generator so the same results are produced each time the notebook is run\n",
69 | "np.random.seed(0)"
70 | ]
71 | },
72 | {
73 | "cell_type": "markdown",
74 | "metadata": {},
75 | "source": [
76 | "### A Quick Review\n",
77 | "\n",
78 | "It's helpful to start with a geometric picture. If you multiply a vector by a square matrix, the output will be another vector in the same space, but in general will \"point\" in another direction. For any matrix $A$, there is a set of special vectors $v$ whose output is a scaled version of the input:\n",
79 | "\n",
80 | "\n",
81 | "\n",
82 | "This relationship is expressed mathematically as \n",
83 | "\n",
84 | "
\n",
85 | "    $Av = \lambda v$\n",
86 | "
\n",
87 | "\n",
88 | "where $v$ is called an **eigenvector** and the scaling factor $\lambda$ is the associated **eigenvalue**. To emphasize, since the output has the same dimensions as the input, eigenvectors only make sense in the context of square matrices. A square matrix of dimension $n$ may have up to $n$ linearly independent eigenvectors."
89 | ]
90 | },
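The defining relationship is easy to check numerically. A sketch with a small symmetric matrix (chosen arbitrarily), verifying $Av = \lambda v$ for every pair returned by `np.linalg.eig`:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# lam[i] pairs with the i-th *column* of V
lam, V = np.linalg.eig(A)

checks = [np.allclose(A @ V[:, i], lam[i] * V[:, i]) for i in range(len(lam))]
```

For this matrix the eigenvalues are 3 and 1, and every check passes.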
91 | {
92 | "cell_type": "markdown",
93 | "metadata": {},
94 | "source": [
95 | "#### Finding Eigenvectors and Eigenvalues\n",
96 | "\n",
97 | "Given the definition of an eigenpair is $Av=\\lambda v$, it has to be true that $(A-\\lambda I)v=0$. Referring back to the previous chapter, this means that $v$ is in the nullspace of the matrix $M = (A - \\lambda I)$. This, in turn, means that $M$ **has** non-zero vectors in its nullspace, which means that the determinant must be zero.\n",
98 | "\n",
99 | "This is where in a linear algebra class, one would start writing out the determinants of matrices and solving for values of $\\lambda$ which produce a zero determinant. In practice, this is not an efficient way to find eigenpairs. Numpy uses the LAPACK routine `geev`:"
100 | ]
101 | },
102 | {
103 | "cell_type": "code",
104 | "execution_count": 5,
105 | "metadata": {
106 | "ExecuteTime": {
107 | "end_time": "2019-11-04T15:28:26.225379Z",
108 | "start_time": "2019-11-04T15:28:26.211441Z"
109 | }
110 | },
111 | "outputs": [
112 | {
113 | "data": {
114 | "text/plain": [
115 | "array([[1, 0, 0, 0, 0],\n",
116 | " [0, 2, 0, 0, 0],\n",
117 | " [0, 0, 3, 0, 0],\n",
118 | " [0, 0, 0, 4, 0],\n",
119 | " [0, 0, 0, 0, 5]])"
120 | ]
121 | },
122 | "execution_count": 5,
123 | "metadata": {},
124 | "output_type": "execute_result"
125 | }
126 | ],
127 | "source": [
128 | "A = np.diag([1, 2, 3, 4, 5])\n",
129 | "A"
130 | ]
131 | },
132 | {
133 | "cell_type": "code",
134 | "execution_count": 6,
135 | "metadata": {
136 | "ExecuteTime": {
137 | "end_time": "2019-11-04T15:28:26.251245Z",
138 | "start_time": "2019-11-04T15:28:26.230343Z"
139 | }
140 | },
141 | "outputs": [
142 | {
143 | "data": {
144 | "text/plain": [
145 | "array([1., 2., 3., 4., 5.])"
146 | ]
147 | },
148 | "execution_count": 6,
149 | "metadata": {},
150 | "output_type": "execute_result"
151 | }
152 | ],
153 | "source": [
154 | "λ, v = np.linalg.eig(A)\n",
155 | "\n",
156 | "# λ contains the eigenvalues\n",
157 | "λ"
158 | ]
159 | },
160 | {
161 | "cell_type": "code",
162 | "execution_count": 7,
163 | "metadata": {
164 | "ExecuteTime": {
165 | "end_time": "2019-11-04T15:28:26.267187Z",
166 | "start_time": "2019-11-04T15:28:26.256265Z"
167 | }
168 | },
169 | "outputs": [
170 | {
171 | "data": {
172 | "text/plain": [
173 | "array([[1., 0., 0., 0., 0.],\n",
174 | " [0., 1., 0., 0., 0.],\n",
175 | " [0., 0., 1., 0., 0.],\n",
176 | " [0., 0., 0., 1., 0.],\n",
177 | " [0., 0., 0., 0., 1.]])"
178 | ]
179 | },
180 | "execution_count": 7,
181 | "metadata": {},
182 | "output_type": "execute_result"
183 | }
184 | ],
185 | "source": [
186 | "# v contains the eigenvectors as columns\n",
187 | "v"
188 | ]
189 | },
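As a quick sanity check of the defining relation $Av = \lambda v$, we can verify every eigenpair at once: multiplying `v * λ` scales each column of `v` by its matching eigenvalue, so it should equal `A @ v`. A minimal sketch, using the same diagonal matrix as above:

```python
import numpy as np

A = np.diag([1, 2, 3, 4, 5])
lam, v = np.linalg.eig(A)

# Each column i of v should satisfy A @ v[:, i] == lam[i] * v[:, i];
# v * lam scales column i by lam[i] via broadcasting
print(np.allclose(A @ v, v * lam))  # True
```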
190 | {
191 | "cell_type": "markdown",
192 | "metadata": {},
193 | "source": [
194 | "### Time Dynamics\n",
195 | "\n",
196 | "One of my favorite applications of eigenvectors is with systems that evolve over time - think object tracking or evolution of financial portfolios, etc. Any kind of time series data.\n",
197 | "\n",
198 | "In this context, a vector is a list of the parameters that describe the state of your system at a particular time. A nice, concrete example of \"the state\" from classical mechanics would be the position, velocity, and acceleration of an object. In finance it could be the dollar amounts in a list of assets.\n",
199 | "\n",
200 | "The matrix $A$ describes the process that updates the state from time $t$ to $t+1$: \n",
201 | " \n",
202 | "<center>\n",
203 | "    $x(t+1) = Ax(t)$\n",
204 | "</center>\n",
205 | " \n",
206 | "It's possible to describe the state at an arbitrary time as a function of the initial state using only terms that depend on the eigenvectors and eigenvalues:\n",
207 | " \n",
208 | "<center>\n",
209 | "    $x(t) = (\\alpha_1 \\lambda_1^t + \\ldots)v_1 + (\\alpha_2 \\lambda_2^t + \\ldots)v_2 + \\ldots + (\\alpha_n \\lambda_n^t + \\ldots)v_n$\n",
210 | "</center>\n",
211 | " \n",
212 | "\n",
213 | "Now, the $\\ldots$ above are hiding an ugly mess involving binomial coefficients and such - my aim is simply to highlight that every term contains one eigenvector scaled by something that looks like the matching eigenvalue raised to a power that grows with time. In the special case where $A$ is diagonalizable, the terms in the equation above become extremely simple: \n",
214 | " \n",
215 | "<center>\n",
216 | "    $x(t) = \\alpha_1 \\lambda_1^t v_1 + \\alpha_2 \\lambda_2^t v_2 + \\ldots + \\alpha_n \\lambda_n^t v_n$\n",
217 | "</center>\n",
218 | " \n",
219 | "\n",
220 | "First of all, how amazing is it that eigenvectors appear in this setting at all! These weird geometric oddities can describe how systems evolve in time from any starting state. Second, and just as amazing, this structure means that linear systems can only evolve in time in a few specific ways:\n",
221 | "\n",
222 | "\n",
223 | "\n",
224 | "1. Terms associated with eigenvalues of magnitude = 1 stay the same over time\n",
225 | "2. Terms associated with eigenvalues of magnitude < 1 decay over time\n",
226 | "3. Terms associated with eigenvalues of magnitude > 1 grow exponentially over time\n",
227 | "\n",
228 | "For a matrix with all real entries, it's possible to have complex eigenvalues and eigenvectors. Complex values will be associated with oscillating behavior, shown by dashed lines in the image above. The asymptotic behavior is still determined by the magnitude of the eigenvalue. "
229 | ]
230 | },
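For a diagonalizable $A$, the simple eigen-expansion is easy to check numerically: expand the initial state in the eigenvector basis, scale each coefficient by $\lambda_i^t$, and compare against directly iterating $x(t+1) = Ax(t)$. Below is a minimal sketch with an assumed 2x2 toy matrix (not one from this notebook):

```python
import numpy as np

# A small diagonalizable update matrix (made up for illustration)
A = np.array([[0.9, 0.2],
              [0.0, 0.5]])
x0 = np.array([1.0, 1.0])

lam, v = np.linalg.eig(A)

# Expand the initial state in the eigenvector basis: x0 = sum_i alpha_i v_i
alpha = np.linalg.solve(v, x0)

t = 10
# Direct simulation: apply A ten times
x_direct = np.linalg.matrix_power(A, t) @ x0
# Eigen-expansion: each coefficient is just scaled by lambda_i^t
x_eig = v @ (alpha * lam**t)

print(np.allclose(x_direct, x_eig))  # True
```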
231 | {
232 | "cell_type": "markdown",
233 | "metadata": {},
234 | "source": [
235 | "### Markov Processes\n",
236 | "\n",
237 | "A fun example to show this idea is that of **Markov Processes**. In a Markov process, at any point in time we can be in one of several states. Each time step, we can transition to one of the other states with some probability based on where we are currently. An example system could be an employee - at any point the employee could be thinking, working, or idle:\n",
238 | "\n",
239 | "\n",
240 | "\n",
241 | "If we ran a simulation using coin flips to decide if we change states, we would end up with a sequence like \n",
242 | "\n",
243 | "*idle - working - idle - thinking - thinking - working - thinking...* \n",
244 | "\n",
245 | "This sequence is called a **Markov Chain**. A classic question one might ask with a model like this is \"what is the probability at any time that the employee is producing results?\""
246 | ]
247 | },
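The coin-flip simulation described above can be sketched in a few lines. The transition probabilities here are made up for illustration: column $j$ holds the probabilities of moving out of state $j$, so each column sums to 1.

```python
import numpy as np

# Hypothetical transition probabilities for the employee example
P = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.4],
              [0.2, 0.3, 0.4]])
states = ["thinking", "working", "idle"]

rng = np.random.default_rng(0)
chain, s = [], 2  # start in the "idle" state
for _ in range(8):
    chain.append(states[s])
    # The "coin flip": draw the next state from column s of P
    s = rng.choice(3, p=P[:, s])

print(" - ".join(chain))
```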
248 | {
249 | "cell_type": "markdown",
250 | "metadata": {},
251 | "source": [
252 | "Markov processes are linear. The entries of the vector $x(t)$ are the probabilities of being in each of states $1$ through $n$ at time $t$. The square matrix $P$ is called the state transition matrix, and the entry $p_{i,j}$ is the probability of moving from state $j$ to state $i$:\n",
253 | "\n",
254 | ""
255 | ]
256 | },
257 | {
258 | "cell_type": "markdown",
259 | "metadata": {},
260 | "source": [
261 | "Ok, let's do an example. When I was in high school, I had a particularly boring hamster that had a set routine: *sleep - run on wheel - eat - sleep...*\n",
262 | "\n",
263 | "Here's the Markov process for my hamster:\n",
264 | "\n",
265 | ""
266 | ]
267 | },
268 | {
269 | "cell_type": "code",
270 | "execution_count": 8,
271 | "metadata": {
272 | "ExecuteTime": {
273 | "end_time": "2019-11-04T15:28:26.283172Z",
274 | "start_time": "2019-11-04T15:28:26.272017Z"
275 | }
276 | },
277 | "outputs": [],
278 | "source": [
279 | "# The associated transition matrix is P\n",
280 | "P = np.array([[0, 0, 1],\n",
281 | " [1, 0, 0],\n",
282 | " [0, 1, 0]])\n",
283 | "\n",
284 | "λ, v = np.linalg.eig(P)"
285 | ]
286 | },
287 | {
288 | "cell_type": "code",
289 | "execution_count": 9,
290 | "metadata": {
291 | "ExecuteTime": {
292 | "end_time": "2019-11-04T15:28:26.303171Z",
293 | "start_time": "2019-11-04T15:28:26.289061Z"
294 | }
295 | },
296 | "outputs": [
297 | {
298 | "data": {
299 | "text/plain": [
300 | "array([-0.5+0.8660254j, -0.5-0.8660254j, 1. +0.j ])"
301 | ]
302 | },
303 | "execution_count": 9,
304 | "metadata": {},
305 | "output_type": "execute_result"
306 | }
307 | ],
308 | "source": [
309 | "λ"
310 | ]
311 | },
312 | {
313 | "cell_type": "code",
314 | "execution_count": 10,
315 | "metadata": {
316 | "ExecuteTime": {
317 | "end_time": "2019-11-04T15:28:26.320703Z",
318 | "start_time": "2019-11-04T15:28:26.307590Z"
319 | }
320 | },
321 | "outputs": [
322 | {
323 | "data": {
324 | "text/plain": [
325 | "array([[ 0.57735027+0.j , 0.57735027-0.j , -0.57735027+0.j ],\n",
326 | " [-0.28867513-0.5j, -0.28867513+0.5j, -0.57735027+0.j ],\n",
327 | " [-0.28867513+0.5j, -0.28867513-0.5j, -0.57735027+0.j ]])"
328 | ]
329 | },
330 | "execution_count": 10,
331 | "metadata": {},
332 | "output_type": "execute_result"
333 | }
334 | ],
335 | "source": [
336 | "v"
337 | ]
338 | },
339 | {
340 | "cell_type": "markdown",
341 | "metadata": {},
342 | "source": [
343 | "We have one eigenvalue of 1, corresponding to an eigenvector with equal weighting on all three states. The remaining eigenvalues form a complex conjugate pair, also with magnitude 1 - we would expect complex values to appear given the clearly periodic nature of the example.\n",
344 | "\n",
345 | "We can look at another transition matrix, this time with all real eigenvalues. For the sake of concreteness, this matrix could correspond to a chemical reaction, where the probabilities are now concentrations of mass in different reactants:"
346 | ]
347 | },
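An eigenvalue of exactly 1 is special for Markov processes: its eigenvector, rescaled so the entries sum to 1, is a probability distribution that $P$ leaves unchanged - a steady state. A short sketch of that rescaling for the hamster matrix:

```python
import numpy as np

P = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])

lam, v = np.linalg.eig(P)

# Pick the eigenvector whose eigenvalue is (numerically) closest to 1
k = np.argmin(np.abs(lam - 1))

# Rescale so the entries sum to 1, giving a valid distribution
pi = np.real(v[:, k] / v[:, k].sum())

print(pi)                       # equal probability 1/3 for each state
print(np.allclose(P @ pi, pi))  # True: applying P leaves it unchanged
```

The equal weighting matches the intuition for a cycle: over the long run, the hamster spends a third of its time in each state.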
348 | {
349 | "cell_type": "code",
350 | "execution_count": 11,
351 | "metadata": {
352 | "ExecuteTime": {
353 | "end_time": "2019-11-04T15:28:26.332138Z",
354 | "start_time": "2019-11-04T15:28:26.323908Z"
355 | }
356 | },
357 | "outputs": [],
358 | "source": [
359 | "# Construct an example where all the non-unity eigenvalues are associated with decaying exponentials\n",
360 | "\n",
361 | "P = np.array([[1, 0.05, 0.0],\n",
362 | " [0, 0.95, 0.2],\n",
363 | " [0, 0.00, 0.8]])\n",
364 | "\n",
365 | "λ, v = np.linalg.eig(P)"
366 | ]
367 | },
368 | {
369 | "cell_type": "code",
370 | "execution_count": 12,
371 | "metadata": {
372 | "ExecuteTime": {
373 | "end_time": "2019-11-04T15:28:26.347257Z",
374 | "start_time": "2019-11-04T15:28:26.337148Z"
375 | }
376 | },
377 | "outputs": [
378 | {
379 | "data": {
380 | "text/plain": [
381 | "array([1. , 0.95, 0.8 ])"
382 | ]
383 | },
384 | "execution_count": 12,
385 | "metadata": {},
386 | "output_type": "execute_result"
387 | }
388 | ],
389 | "source": [
390 | "λ"
391 | ]
392 | },
393 | {
394 | "cell_type": "code",
395 | "execution_count": 13,
396 | "metadata": {
397 | "ExecuteTime": {
398 | "end_time": "2019-11-04T15:28:26.362670Z",
399 | "start_time": "2019-11-04T15:28:26.351972Z"
400 | }
401 | },
402 | "outputs": [
403 | {
404 | "data": {
405 | "text/plain": [
406 | "array([[ 1. , -0.70710678, 0.19611614],\n",
407 | " [ 0. , 0.70710678, -0.78446454],\n",
408 | " [ 0. , 0. , 0.58834841]])"
409 | ]
410 | },
411 | "execution_count": 13,
412 | "metadata": {},
413 | "output_type": "execute_result"
414 | }
415 | ],
416 | "source": [
417 | "v"
418 | ]
419 | },
420 | {
421 | "cell_type": "markdown",
422 | "metadata": {},
423 | "source": [
424 | "Before reading further, try to inspect the eigenvalues and eigenvectors and predict what this system will do over time. Does your prediction make sense if you try to imagine the process by looking at the transition matrix?\n",
425 | "\n",
426 | "Ok, let's run a simulation to see what this does over time:"
427 | ]
428 | },
429 | {
430 | "cell_type": "code",
431 | "execution_count": 17,
432 | "metadata": {
433 | "ExecuteTime": {
434 | "end_time": "2019-11-04T15:29:21.234730Z",
435 | "start_time": "2019-11-04T15:29:20.863133Z"
436 | }
437 | },
438 | "outputs": [
439 | {
440 | "data": {
441 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAe0AAAF+CAYAAACvRkIWAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nOzdd3xV9f3H8dc3e5OdAGGFsJEZGYobQbQqLsS9qBRHtcOq1Eql1lbraPVXB3ULYquiqDgRFEVRAdkz7BWyyd7f3x8nICPCBZKcm9z38/E4j3PvGfd+gpg333O+5/s11lpERETE+/m5XYCIiIh4RqEtIiLSTCi0RUREmgmFtoiISDOh0BYREWkmFNoiIiLNRIDbBRxJfHy87dixo9tliIiINIlFixblWGsT6tvn9aHdsWNHFi5c6HYZIiIiTcIYs+Xn9unyuIiISDOh0BYREWkmFNoiIiLNhEJbRESkmVBoi4iINBMKbRERkWbCo0e+jDEpwN1AOtAXCAU6WWs3e3BuCPAX4GogGlgC3G2tnXeMNR+isLCQrKwsqqqqGuojW5TAwEASExOJiopyuxQRETkOnj6nnQaMARYBXwEjjuI7XgDOA+4CNgK3Ap8YY4Zaa5ccxefUq7CwkN27d9O2bVtCQ0MxxhzvR7Yo1lrKysrYsWMHgIJbRKQZ8/Ty+DxrbZK19lzgTU8/3BjTF7gS+I219j/W2s9xwn8rMPmoq61HVlYWbdu2JSwsTIFdD2MMYWFhtG3blqysLLfLERGR4+BRaFtra4/x8y8AqoD/7vdZ1cAbwEhjTPAxfu4+VVVVhIaGHu/HtHihoaG6fSAi0sw1dke0XsAma23pQdtXAkE4l92Pm1rYR6Y/IxGR5q+xxx6PBfLr2Z633/5DGGNuBm4GaN++feNUJiIiUqe21lJVW0tVjaW6xllX1dRSXeNsr977vtZSs+84S3VtLZ0TImgXG9YkdTZ2aBvA/sz2n2WtnQJMAUhPT6/vfBERaaastVTVWMqra6ioqqWiuobyunVFde2+bRXVtXXva6iscbZX1tRSWV231L2u2O991d51jbO9qu51VbUTunv3VdXYfcc6QXzsUfOnX/TkpmGdGvBP6Oc1dmjnAfU1lWP22y8iIl7CWktFdS2llTWUVFRTVlVDaWUNpRXVlFbWUFZVQ1llDaWV1ZRV1da9r65b11JeVUN5Vd1xVU4Y791WXvVTOB9HRgLgZyAowI8gfz+CAvwJDvAjKMCPQH+zb3ugvx8RwQH7XgcG+BHoZ+qO27sYAv39CNi79jMHbXdeB/g5xwT4GWdb3TrA35AS03T9qho7tFcCFxljwg66r90TqAQyGvn7RURaPGst5VW1FFVUUVReTVF5NcXl1RTv/76impIKZ733dUlFDSWVzvvSutellTVH1eo0BkID/QkN9Cck0J/QoL2vncCMj3C2hwT4OetAP4IDnHVIoD/BgU7ghtStncWf4EAnePcevzeI924P8PfNscEaO7TfAx4ALgNeATDGBACXA59aaysa+fubpYyMDHr06MHEiRN54IEH9m2fMGECU6dOZe7cuaSnp7tYoYg0tNpaS1F5NQVlleSXVlFQWsmesioKy6qcdXk1e0qrKCyvW8qqKSx3QrmwrIpqD4I2KMCPyOAAwuuWiGB/YsODaBcTRliQf912f8KCApz3QQGEBfsTFuRPaKCzLSzICeawoIB94ayOrk3H49A2xlxa93Jg3XqUMSYbyLbWfmmM6QBsACZbaycDWGuXGGP+C/zTGBMIbAImAJ2Aqxrqh2hp0tLSGDduHE888QS333478fHxTJ48mRdffJFZs2YpsEW8XG2tpbC8itySSvJKKsktriS/1HmdX1JJXmklBaVV5O+33lNWhT1M7oYE+tEqNJCokECiQgOJjwgiNSGcyJAAokICiQwJJDIkYN8SERxYt3behwcHEOijrdOW
5Gha2gcPqvJ03fpL4HSczmX+HPoY2Q3AX4EHcYYxXQqcY61dfLTFeuqB91eyamdhY328R3q2iWLS+b2O+fxJkybx6quv8vDDD9O9e3ceeOABpk+fzvDhwxuwShHxVG2tJa+0kqzCCrKLK8guqiCnuIKcIud9bnEluSWV5BZXkFdS+bMt37Agf2LCgogJDyQmLIiUmDBiwgKJDg0kOiyI6LBAosMCaRXqLFF16+AA/yb+icUbeRza1toj9fjeTD29wq21ZcBv6xbxUHJyMnfeeSePPfYY1dXVPPnkk4wZM2bf/oceeohXXnmF9evXM2PGDEaPHu1itSLNl7WWPWVVZBaWk7nHWXYXVrC7qJyswrrXheXkllTWe683NNCf+Mgg4iOCaRsdSt+UVsSGBxEXEUxceBCxdUtcRBAxYUGEBCp85dg19j1tVxxPC9ebdOnShYqKCoYNG8att956wL6zzjqLyy+/nJtuusml6kSah9LKanYWlLGjoJydBWV1Szm79pSRuaecXXvKKauqOeS8uPAgEiKDSYoKoXtyJIlRwSRGhpAYGUxCZDDxEc46PLhF/hoVL6W/bV5qzpw5jB8/nqFDhzJ//nyWLl1K37599+0fPHiwi9WJeI+K6hq255exLa/UWfLL2J5fyvb8Mrbnl5FXUnnA8X4GkqJCaN0qhB5tojirR2Ld+1CSWzkhnRgZQlCA7v+K91Foe6HFixczevTofZ3RunbtysSJE5k1a5bbpYm4orSyms05pWzOLWFzbglbckrZklfC1txSdhWWH9CBKyjAj5ToUNrGhNKrTStSYkJpW/e+TXQoSZHBPvu4kDR/Cm0vk5GRwahRoxgxYgRPPfUUfn5+TJo0iRtvvJF58+Zx6qmnul2iSKOorbXsKCgjI7uYjdklbKxbb8opIbOw/IBj4yOCaB8bxuDUONrHhtEhLox2sWG0jw0jISIYPz89giQtk0Lbi2RmZjJixAh69OjBtGnT8PNzWgPXXnstjzzyCPfccw/ffPONy1WKHJ+aWsvWvFLWZhaxfncR67OKycgqZmNOMeVVP00oGBUSQGpCBCelxZEaH06HuHA6xYfTIS6MyJBAF38CEfcotL1IcnIyGzduPGS7v78/q1evdqEikeOTU1zB6l2FrM0sYvWuItbuLmT97mIqqn8K57bRoaQlRjC0cxxpiRF0Toigc0I4seFBGrRD5CAK7WbqwQcf5NlnnyU7O5sVK1Zw2223sXDhQpKTk90uTXyQtU7recWOQlbu3MOqXYWs2llIVtFPgx4mRAbTPTmSa4Z0oGtyJF2TIklLjCBCva9FPKb/W5qp++67j/vuu8/tMsQH7Q3oZdv3sGx7Act37GHlzkKKyqsBCPAzdEmK5JQuCfRoHUnP1lF0bx1FbHiQy5WLNH8KbRE5rILSSpZsK+DHrQX8uK2AZdsLKCitApye2j1aR3Fhvzb0btOK3m1b0SUpQqN3iTQShbaI7FNba9mYU8zCzfks2pLPoq35bMwuAZznm7smRTKyZzJ92rWib0o03ZIjNZ61SBNSaIv4sOqaWlbuLOT7TXl8vzmPHzbn7WtFx4QFMqB9DJcMSKF/+2j6pETr/rOIy/R/oIgPqam1rNixh2835vLthlwWbs6jpNIZwrNjXBhn90jixI6xDOwYQ2p8uHpvi3gZhbZIC2atZWNOCfMzcvh6fQ7fbszd12EsLTGCiwa0ZXCnOAZ1iiUpKsTlakXkSBTaIi3MnrIq5mfk8OXabOatz2bXHmc0sXaxofyiT2uGdo5nSGosiZEKaZHmRqEt0sxZa1mTWcScNVnMXZPFj9sKqKm1RIYEMCwtntvOjOeUtATax4W5XaqIHCeFtkgzVFFdwzcbcpm9ajdz12Sxs641fULbVtxyemdO65pAv3bRmhhDpIVRaIs0E4XlVcxdk8Wnq3bzxZosSiprCAvyZ1haPHcM78IZ3RJJ1H1pkRZNoS3ixQpKK/l01W4+XpHJV+uzqaqx
xEcEc0G/NozomczQznGEBGogExFfodAW8TKF5VV8unI37y3dyTcZOVTXWlJiQrn+pI6c0zuZ/u1iNPWkiI9SaIt4gfKqGj5fncXMJTv4Ym02lTW1tIsNZdwpqZx3Qmt6t43SM9MiotAWcUttreX7zXm8s3gHHy7fRVFFNYmRwVw9pAPn921Nv3bRCmoROYBC2wtlZGTQo0cPJk6cyAMPPLBv+4QJE5g6dSpz584lPT3dxQrleGzLK+WtRdt5a9F2dhSUER7kzzm9W3PxgLYMSY3DX5e+ReRnKLS9UFpaGuPGjeOJJ57g9ttvJz4+nsmTJ/Piiy8ya9YsBXYzVFFdw8crMvnfwm3Mz8jFGBiWFs8fzunG2T2TCAvS/4oicmQt8zfFR/dA5nJ3a0g+AUb9/ZhPnzRpEq+++ioPP/ww3bt354EHHmD69OkMHz68AYuUxrYhu5g3vt/KW4u2k19aRbvYUH57dlcuGZhC2+hQt8sTkWamZYZ2C5CcnMydd97JY489RnV1NU8++SRjxowBoLy8nLFjx7J27VqCg4NJSkrimWeeITU11eWqBZyZsz5btZtXv93CtxtzCfAzjOiVxJWDOnBS5zj1/BaRY9YyQ/s4WrjepEuXLlRUVDBs2DBuvfXWA/ZNmDCBkSNHAvB///d/jBs3jjlz5rhRptTJKa7gje+3Mu27rezaU07b6FDuGtmNy9JTNM63iDSIlhnaLcCcOXMYP348Q4cOZf78+SxdupS+ffsCEBISsi+wAYYMGcKjjz7qVqk+b01mIS98tYmZS3ZSWVPLsLR4HrigF2f1SFKnMhFpUAptL7R48WJGjx69rzNa165dmThxIrNmzar3+KeeeooLL7ywiav0bdZa5q3P4fmvNvLV+hxCAv24LD2FG07uSFpipNvliUgLpdD2MhkZGYwaNYoRI0bw1FNP4efnx6RJk7jxxhuZN28ep5566gHH/+1vf2PdunV8/vnnLlXsW6prapm1fBfPfLGBNZlFJEYGc9fIblw5qD0x4UFulyciLZyx1rpdw2Glp6fbhQsX/uz+1atX06NHjyasqPFkZmZy0kkn0b59ez755BOCg4MBqKmpoXfv3sTExPDNN9/sO/7RRx/ljTfeYPbs2URHRx/x81vSn1VTK6+q4c1F25kybwPb8spIS4xg/KmpXNivLUEBmklLRBqOMWaRtbbeZ3vV0vYiycnJbNy48ZDt/v7+rF69+oBtjz/+ONOnT/c4sOXYlFXW8Pr3W3nuyw1kFVXQr100fzqvJ8N7JKkXuIg0OYV2M7R9+3Z+97vfkZqayhlnnAFAQEAAh7siIUentLKaqQu2MGXeRnKKKxmSGss/x/ZjaGqchhYVEdcotJuhlJQUvP22RnNVUV3D699t5d9zN5BTXMGwtHhuPzONwalxbpcmIqLQFgGoqqnl7UXbefLz9ezcU86Q1FievXoA6R1j3S5NRGQfhbb4NGstn6zczSMfr2FjTgl920XzyKV9OTlNl8FFxPsotMVnLdqSx0MfrmHRlnzSEiOYcs1Azu6ZpLAWEa+l0BafszW3lL99tJqPVmSSGBnM3y4+gcsGphDgr0e3RMS7KbTFZxRXVPP03Aye/2oT/n6G3wzvyi9P7aRpMUWk2dBvK2nxrLXMWLyDv3+8huyiCi7u35Y/nNOd5FaaxENEmheFtrRoq3cVcv/MFfywOZ9+7aKZcs1A+rePcbssEZFjotCWFqmovIonPlvPK99uplVoII9c0odLB6ZoFDMRadYU2tLifLxiF/fPXEl2cQVXDGrPH0Z2IzpMk3mISPOn0JYWI3NPOffPXMGnq3bTo3UUU65Np187jcsuIi2HQluavdpay7Tvt/LIR2uorKnl7nO6M+6UTgTqES4RaWEU2tKsbcsr5a63lrJgYx4np8Xx0EUn0CEu3O2yREQahUJbmqW9reu/fbgaP2P4+8UncPmJ7TSamYi0aB5dPzTGtDPGvGWM2WOMKTTGzDDGtPfw
3PbGmFeMMVuNMaXGmHXGmAeNMWoO/YyMjAwCAwOZNGnSAdsnTJhAZGSkz0/BubOgjGte/I4/vbuCAe1j+OQ3pzJ2UHsFtoi0eEcMbWNMGDAH6A5cB1wDdAHmHil46/bPBk4F/gScBzwP/A548bgqb8HS0tIYN24cTzzxBDk5OQBMnjyZF198kXfeeYf09HSXK3TP+0t3cs4/5/Hj1gL+elFvXrtpEG2jQ90uS0SkSXhyefyXQCrQzVqbAWCMWQasB8YDjx/m3JNxAn6ktfbTum1zjTGxwO+NMWHW2tJjrv5nPPz9w6zJW9PQH3tUusd25+5Bdx/z+ZMmTeLVV1/l4Ycfpnv37jzwwANMnz6d4cOHN2CVzUdheRWTZq7knR930L99NP+8vJ/uXYuIz/EktC8AFuwNbABr7SZjzHzgQg4f2nsfji08aHsBTitf1zN/RnJyMnfeeSePPfYY1dXVPPnkk4wZM2bf/rPOOoucnByMMURGRvLUU0/Rr18/FytuPIu25PHr6UvILCznzuFduO2MNE3uISI+yZPQ7gXMrGf7SuCyI5w7G6dF/rAxZgKwFRgE3AE8a60tOYpaPXY8LVxv0qVLFyoqKhg2bBi33nrrAftmzJhBq1atAHjnnXe4/vrrWbJkiRtlNpraWsuz8zbw2KfraBMdwpu/GsoADUEqIj7Mk9COBfLr2Z4HHPY3qLW23BgzDHgbJ+T3eh64zdMifdGcOXMYP348Q4cOZf78+SxdupS+ffvu2783sAEKCw++kNH8ZRdV8Nv/LeGr9Tmc16c1f7v4BKJCAt0uS0TEVZ4+8mXr2XbES9vGmBDgv0AiTge2vS3t+4FqYMLPnHczcDNA+/YedVJvURYvXszo0aP3dUbr2rUrEydOZNasWQccd9VVV/Hll1/i5+fHhx9+6FK1DW/Bxlxun/4jhWVVPHTRCVwxSI9yiYiAZ4985eO0tg8WQ/0t8P3dBJwOnGutnWqtnWetfRSn9/ivjDF96zvJWjvFWpturU1PSEjwoMSWIyMjg1GjRjFixAieeuopgoKCmDRpEh9++CHz5s074Nhp06axfft27r//fu6+u/nfErDWMmXeBq56/jsiQwKYedvJXDlYj3KJiOzlSWivxLmvfbCewKojnHsCkG+t3XDQ9u/r1j08+H6fkZmZyYgRI+jRowfTpk3Dz8/5z3PttdfSvXt37rnnnnrPu+mmm/jss8/Izc1tynIbVHFFNbdMW8xDH65hRM8kZt56Mt2To9wuS0TEq3hyefw94FFjTKq1diOAMaYjzuNc9afITzKBGGNM2v69z4HBdesdR1duy5acnMzGjRsP2e7v78/q1av3vc/Pz6e8vJzWrVsD8Pbbb5OYmEhsbH0XRLxfRlYRN7+2iC25pfzx3B6MO6WTWtciIvXwJLT/g9NpbKYx5j6c+9t/AbYBz+09yBjTAdgATLbWTq7b/DLwW+BDY8xfce5pp+MMtLIImN8wP4Zvyc/P5/LLL6e8vBw/Pz8SExP54IMPmmXQzVmzm19PX0JIoB/Txg1mSGqc2yWJiHitI4a2tbbEGHMm8ATwGk4HtM+BO621xfsdagB/9rvkbq3dbIwZAvwZeBCIxwn7KcBfrbW1DfRz+JTU1FR++OEHt8s4LtZanv1yI498soZebaKYck06bTSymYjIYXnUe9xauxW45AjHbKaeHuXW2lXAmENOEJ9VXlXD3W8vY+aSnZzftw2PXNKH0CB/t8sSEfF6muVLmlR2UQW/fHUhS7cXcNfIbtxyeudmeVlfRMQNCm1pMut3F3HDyz+QW1zJs1cPZGSvZLdLEhFpVhTa0iTmZ+Twq6mLCAn057/jh9AnJdrtkkREmp0WEdrWWl1iPQJr6xvUrmm8uXAb985YTueECF64Pp2UmDDXahERac6afWgHBARQXV1NYKDGpT6c6upqAgKa9j+3tZanv9jAPz5Zy7C0eJ6+eoDGDxcROQ7NPrRDQkIoLi4mJkazPx1O
UVERISEhTfZ9tbWWyR+s4uVvNnNhvzb849K+BAVoOk0RkePR7H+LJiQkkJ2dTWlpqauXgL2VtZbS0lJycnJoqnHcK6pruP2NH3n5m82MG9aJJ8b0U2CLiDSAFtHSTkpKIjMzk4qKCrfL8UrBwcEkJSU1SUu7pKKam19byPyMXCae252bT+3c6N8pIuIrmn1ogzO39P7zS4s79pRVccNL37NkWwGPXtaXSwemuF2SiEiL0iJCW9yXW1zBNS98z/qsIv595QBGndDa7ZJERFochbYct8w95Vz1/AJ2FJTx/HUnclpX35oDXUSkqSi05bjsKCjjiikLyCup5JUbBjFYs3SJiDQahbYcs+35pVzxnwUUlFYxddxg+rXTKGciIo1JoS3HZFueE9h7yqqYetNg+iqwRUQanUJbjtq2vFLGTllAUXkV08YN1jjiIiJNRKEtR2VnQRljpyyguKKaaeOGcEKKHrUTEWkqGqZKPJZVVM5Vz39HYd0lcQW2iEjTUktbPJJXUsnVz3/H7sJyXrtpkAJbRMQFamnLEe0pq+LaF79jS24pz1+XzsAOsW6XJCLikxTaclilldXc8NL3rM0s4tlrBnJS53i3SxIR8VkKbflZVTW1TJi6mCXbCnjqiv6c0S3R7ZJERHya7mlLvWprLb9/cylfrsvm7xefwDm9NZa4iIjb1NKWQ1hrmfzBKmYu2cldI7sxdlB7t0sSEREU2lKPp7/YwMvfbObGkztxy+maD1tExFsotOUAby7cxj8+Wcvofm2477weGGPcLklEROootGWfr9fncO+M5ZycFscjl/bFz0+BLSLiTRTaAsCazEImTF1E54QInrl6IEEB+qshIuJt9JtZyNxTzg0v/UBYsD8v3XAiUSGBbpckIiL1UGj7uOKKam54+QcKy6p48foTaRMd6nZJIiLyM/Sctg+rqbXcMf1H1u0u4oXr0unVRuOJi4h4M7W0fdgjH6/h8zVZTDq/J6drtDMREa+n0PZRby7cxnPzNnL1kPZcO7Sj2+WIiIgHFNo+6IfNeUx8x3m0a9L5vdwuR0REPKTQ9jHb8koZ/9oiUmLCePrKgQT666+AiEhzod/YPqSssobxry2iqqaW569Lp1WYHu0SEWlO1HvcR1hruXfGMlZnFvLCdel0TohwuyQRETlKamn7iBfnb+bdJTv57fCunNk9ye1yRETkGCi0fcA3G3J46MPVjOiZxK1npLldjoiIHCOFdgu3o6CM217/kU7x4Tx+eT9NAiIi0owptFuwiuoabpm2mMrqWp67ZiARwerCICLSnOm3eAv20KzVLN1WwDNXDVDHMxGRFkAt7RbqvaU7eeXbLdw0rBOjTmjtdjkiItIAFNotUEZWMfe8vYyBHWK4Z1R3t8sREZEGotBuYUorq7ll2iJCAv35vyv7a8QzEZEWRPe0W5j7Z65kfVYxr944iNatNDe2iEhLomZYC/LOj9t5a9F2bjsjjVO6JLhdjoiINDCFdguxKaeE+95ZwYkdY7jjrC5ulyMiIo3Ao9A2xrQzxrxljNljjCk0xswwxrT39EuMMT2MMW8aY3KMMWXGmLXGmDuOvWzZX0V1DbdPX0yAvx//GtufAN3HFhFpkY54T9sYEwbMASqA6wALPAjMNcb0sdaWHOH89LrzvwDGAXuALoAeHG4gj3y8lhU7CplyzUDaROs+tohIS+VJR7RfAqlAN2ttBoAxZhmwHhgPPP5zJxpj/IBXgM+ttRftt2vuMVcsB5izZjcvfL2J64Z2YESvZLfLERGRRuTJddQLgAV7AxvAWrsJmA9ceIRzTwd6cphgl2OXXVTBXW8uo0frKO49t4fb5YiISCPzJLR7ASvq2b4SJ5APZ1jdOsQYs8AYU2WMyTLGPGmM0XXc42Ct5Q9vLaW4oponx/YjJNDf7ZJERKSReRLasUB+PdvzgJgjnNumbv1f4FPgbOARnHvbr//cScaYm40xC40xC7Ozsz0o0fdMXbCFuWuzuXdU
d7okRbpdjoiINAFPB1ex9WzzZI7Hvf8omGqtvb/u9RfGGH/g78aYntbaVYd8mbVTgCkA6enp9X23T8vIKuLBWas5rWsC153U0e1yRESkiXjS0s7HaW0fLIb6W+D7y61bf3bQ9k/r1v08+H7ZT2V1LXe8sYTw4AD+cVkfjNH82CIivsKTlvZKnPvaB+sJHNJKrudcOLSlvjdpaj34ftnPE7PXsXKn83hXYmSI2+WIiEgT8qSl/R4wxBiTuneDMaYjcHLdvsP5COf57nMO2j6ybr3QoyoFgEVb8njuyw2MPbGdHu8SEfFBnoT2f4DNwExjzIXGmAuAmcA24Lm9BxljOhhjqo0xe+9dY63NBf4G/MoY85AxZrgx5h7gfuCV/R8jk8Mrrazmd/9bSpvoUO77xZE67YuISEt0xMvj1toSY8yZwBPAaziXtj8H7rTWFu93qAH8OfQfApOBIuAW4PfALuAfwF+Ou3of8vBHa9icW8obNw8hIliTs4mI+CKPfvtba7cClxzhmM3U06PcWmtxBlfRACvHaH5GDq98u4UbT+7EkNQ4t8sRERGXaGYJL1dYXsVdby4lNSGcP5zTze1yRETERbrO6uX+8v4qMgvLmXHLyRr1TETEx6ml7cXmrs3izUXb+dVpnenXLtrtckRExGUKbS9VVF7FxBnLSUuM4I7hXdwuR0REvIAuj3uphz5cw+7Cct6ecBLBAbosLiIiaml7pfkZOUz/fivjTkmlf/sjzckiIiK+QqHtZUoqqrn77WV0ig/nt2d3dbscERHxIro87mUe+XgNOwrK+N/4oeotLiIiB1BL24ss2pLHqwu2cO2QDpzYsb6J1URExJcptL1ERXUNd7+9nNZRIdx1Tne3yxERES+ky+Ne4pkvNpCRVcyL16drbHEREamXWtpeYP3uIv49N4ML+rbhzO5JbpcjIiJeSqHtstpayz0zlhMeHMD952vKTRER+XkKbZdN+34ri7bkc995PYmPCHa7HBER8WIKbRftLizn4Y/WMCwtnksGtHW7HBER8XIKbRc98P5Kqmpq+etFvTHmkKnIRUREDqDQdsmcNbv5cHkmvz6rCx3iwt0uR0REmgGFtgtKK6v507sr6ZIYwS9PSXW7HBERaSb0QBOKqfoAACAASURBVLAL/jV7/b6hSoMC9O8mERHxjBKjia3aWcjzX29i7IntGNRJQ5WKiIjnFNpNqLbW8sd3lxMdGsg9ozRUqYiIHB2FdhOa/sNWftxawB/P60F0WJDb5YiISDOj0G4iOcUVPPLxWoakxnJRfz2TLSIiR0+h3UT+9uEaSiureXC0nskWEZFjo9BuAt9tzOXtxdv55SmppCVGul2OiIg0UwrtRlZVU8ufZq6gbXQot5/Zxe1yRESkGdNz2o3sha83sW53Mc9fm05okL/b5YiISDOmlnYj2lFQxr9mr+fsnkkM76l5skVE5PgotBvRX95fhcUySfNki4hIA1BoN5Iv1mbx8cpMbj+zCykxYW6XIyIiLYBCuxGUV9Uw6b2VpMaHM+6UTm6XIyIiLYQ6ojWCKfM2siW3lKk3DSY4QJ3PRESkYail3cC25ZXy77kZnNenNcO6xLtdjoiItCAK7Qb2wPsr8fcz3HdeD7dLERGRFkaXx4/Vnu2w8h3Y9h0ERUJoDBuKA4lbV8K9Z4yhdatQtysUEZEWRqF9NCqKYel0WPE2bP3W2RabCjVV2LJ8OlcW83Ag2G9eguyRMOAaSDsb/PXHLCIix09p4qnCnTDtMti9AhJ6wJn3Qa+LIa4zAE98upZn5qzhzUsT6Jf3ESx9A9bOgohkOGMi9L8G/HQ3QkREjp1C2xNZq2HqpVBeAFe9BV3OPmD35pwSnv1yI+f1a0+/9P7ASXDW/bD+M/jmSXj/17DkdfjFE5CkgVZEROTYqOl3JJu/hhdGQm0V3PDRIYFtrWXSeysJCvBj4rn7dT7zD4Tu5zrnXPhvyFkHz50Cn90PVWVN/EOIiEhLoNA+
nLUfw2sXQWQSjJsNrfsccsgnK3fz5bpsfnN2VxKjQg79DGOg/9Vw20LoMxbm/wteOBvyNzd+/SIi0qIotH9OaR7MvAUSusONn0B0+0MPqaxm8vsr6Z4cyXVDOxz+88LjYPS/4cr/Qf5WmHI6ZHzeOLWLiEiLpND+OZ/eB+V7YPQzEBZb7yFPzclg555yJl/YmwB/D/8ou46Em+dCZGuYdil89ThY24CFi4hIS6XQrs/GL2HJNDjpdkjuXe8h63cX8Z95G7lkQAqDOtUf6j8rrrNzub3naPj8AZhxM9RUNUDhIiLSkqn3+MGqyuGD30BMJzjt7noPsdbyp5krCAvy595zux/b9wSFw6UvOr3J5zwIZfkw5hVnu4iISD3U0j7YV49C3gbn8azA+kc1m7lkJws25vGHc7oTHxF87N9lDJx6F5z/JGz4HF690LmXLiIiUg+F9v6yVsPXTzi9vDufUe8he8qqeHDWavqmtOKKQYd2TjsmA6+DMa/CrmXw0ijYs6NhPldERFoUj0LbGNPOGPOWMWaPMabQGDPDGHPUiWWMudcYY40xXx99qU3g0z9BcBSM/OvPHvL4p2vJLangwdEn4O9nGu67e5wPV7/tBPbL50LBtob7bBERaRGOGNrGmDBgDtAduA64BugCzDXGeHwD1hiTCvwRyDq2UhtZ/mbImA2Dbobw+qfUXLFjD68t2MI1QzpwQkqrhq+h0ylw7UwozYdXfqHgFhGRA3jS0v4lkAqMtta+a62dCVwAdADGH8V3PQNMA1YfdZVNYdErzj3mAdfWu7um1jLxneXEhgfzuxHdGq+OlIFwzTsKbhEROYQnoX0BsMBam7F3g7V2EzAfuNCTLzHGXAkMAO49liIbXU0V/DgVuoyEVm3rPeTVbzezbPse7j+/J61CAxu3HgW3iIjUw5PQ7gWsqGf7SuCIs18YY2KAJ4A/WGu9s2v0mllQkgXpN9S7e9eeMh79ZC2ndU3g/D6tm6amA4L7fCjKbJrvFRERr+VJaMcC+fVszwNiPDj/H8A64GXPy2pii16CVu0gbXi9uyfNXEmNtTw4ujfGNGDnsyNJGQjXzIDiLOdxsJLcpvtuERHxOp4+8lXfOJtHTC9jzCnAtcAEaz0fq9MYc7MxZqExZmF2dranpx2b3A2w8QvnXraf/yG7P1mZyaerdnPn8K60iw1r3Frqk5IOV/7X6Sg39SJnaFUREfFJnoR2Pk5r+2Ax1N8C399zwAvAdmNMtDEmGmcUNv+69/WOTGKtnWKtTbfWpickJHhQ4nFY/AoYf+h/zSG7iiuq+fN7zoQgNw3r1Lh1HE6nU2DMa7B7FUwbA5Ul7tUiIiKu8SS0V+Lc1z5YT2DVEc7tAfwKJ9z3LicDQ+peT/C40sZQXQk/ToNuoyDq0HvVj36ylszCch66+AQCPZ0QpLF0HQGXPA/bv4c3roLqCnfrERGRJufJ2OPvAY8aY1KttRsBjDEdccL3niOcW9+wYv8E/IHbgYx69jedNe9DaQ4MPLQD2g+b83jl281cO6QDA9p7cuu+CfQa7bSyZ97iTDJy6Yv1XtIXEZGWyZPQ/g9wGzDTGHMfzv3tvwDbcC5/A2CM6QBsACZbaycDWGu/OPjDjDEFQEB9+5rc4tecebI7n3nA5rLKGv7w1jJSYkL5wznHOCFIY+l/lTO5yKd/hFkxzhjpTdk5TkREXHPEa77W2hLgTJwe4K/hDJCyCTjTWlu836EGpwXdPMYzryiGzV8702P6HVjyE7PXsSmnhL9f3IfwYC+cCO2k22DYb51e73P+4nY1IiLSRDxKJGvtVuCSIxyzGQ96lFtrT/fkOxvdlvlQWwVpZx2w+cet+Tz/1UauGNSek9PqH87UK5x1P5TmwlePQWisE+QiItKieWEzsolkfA4BodBuyL5N5VU13PXWMpKjQph4rPNkNxVjnEvj5QXOpfKwOOh3hdtViYhII/Ld0N4wBzoOg8CQfZv+9fl6MrKKefmGE4kM
aeShShuCnz9c/B/nHvfMWyEsFrqOdLsqERFpJM3j/nNDK9gKuesP6IC2YGMuz365gTHpKZzeLdHF4o5SQDCMfR2ST4D/XQdbv3O7IhERaSS+Gdob5jjruvvZBaWV/Oa/S+gYF86k8+t7JN3LBUfCVW9BVBt4/TJnEBYREWlxfDO0Mz6HqLYQ3xVrLffOWE52UQX/GtvPO3uLeyIiwRmnPCAUpl7sXE0QEZEWxfdCu6YaNn0Jnc8AY/jfwm18tCKT34/sRp+UaLerOz4xHZ3griqF1y6Ckhy3KxIRkQbke6G980dn0o3OZ7Ehu5g/v7eKkzrHcfMpqW5X1jCSesEV/4U922HapVBR5HZFIiLSQHwvtDd8DhhKU4Zx2+s/EhLox+Nj+uHn14JGFeswFC57GXYtg/9erXHKRURaCB8M7TnYNgP43QfbWJtZyBOX9yO5VciRz2tuuo2CC//PmXb0nfFQW+N2RSIicpx8K7TLCmD7Qr7z68tHKzKZeG6P5vV419HqdyWc/RdY+Q58+HvwfEpzERHxQs20q/Qx2jQPbA2PbkjhsoEp7s6R3VRO/rUzk9n8fzmjpp15n9sViYjIMfKp0N6+dBa1fmH4tTuRBy/qjfGV2bGGPwCleTDvH8445UNvcbsiERE5Bj4T2qXlVVxeNp8TYtvx72sGExzgQ/NQGwO/+KczTvkn9zrDnfYd63ZVIiJylHzmnnZYkD/tQjqwKyGWhMhgt8tpev4BcPHz0OlUePcWWPOh2xWJiMhR8pnQxs+Pod1HsLViNxU1PvoIVGCIM055m37w5vVOz3IREWk2fCe0gd5xvam21azJW+N2Ke7ZO055XGeYfiVs+97tikRExEM+Fdq94p3JQFbkrHC5EpeFxcI170BEojNqWuZytysSEREP+FRoJ4UlER8az8qclW6X4r7IZLh2JgRFOOOU52S4XZGIiByBT4W2MYZecb1YmavQBiCmA1zzrjPoyqsXQP5mtysSEZHD8KnQBucS+aY9myipKnG7FO+Q0NVpcVeVwivnOxONiIiIV/K50O4d1xuLZVXuKrdL8R7JvZ173GUF8MoFUJTpdkUiIlIP3wvt+N6AOqMdok1/p1d5USa8eqHm4hYR8UI+F9oxITG0jWir0K5P+8Fw5X8hf4vT4i7JdbsiERHZj8+FNqDOaIfT6RS48g3I2+B0TlNwi4h4Dd8M7fhe7CjeQX55vtuleKfU0+GKNyA3w7lUXprndkUiIoKPhnbvOOe+tlrbh9H5DGfI05x1TotbwS0i4jqfDO2ecT0xGN3XPpK0s+CK1yF7Xd09bnVOExFxk0+GdkRQBB1bddTIaJ5IGw5XTIfc9fDyL6Bot9sViYj4LJ8MbXAukevyuIfSzoKr3oSCLfDyeVC40+2KRER8ks+Gdq/4XmSXZbO7RC1Hj3Q6Fa6e4TzH/dK5ULDN7YpERHyO74Z2XN2MX7m6r+2xDkPh2nedTmkvjdIkIyIiTcxnQ7t7bHcCTIDuax+tlHS4/n2oKoOXztG0niIiTchnQzskIIQuMV1Ylr3M7VKan9Z94caPwT/Iuce97Xu3KxIR8Qk+G9oAA5MGsiR7CZU1lW6X0vzEd3GCOyzeGYBlwxy3KxIRafF8OrQHtx5MRU0FS7OXul1K8xTd3gnu2FSYNgaWv+V2RSIiLZpPh/bApIH4GT++z9Tl3WMWkQjXz4J2g+Dtm2DBM25XJCLSYvl0aEcGRdIrrhff7frO7VKat9Bo53GwHufDx/fA7D+DtW5XJSLS4vh0aAMMSh7E8uzllFaVul1K8xYYApe9Auk3wtdPwLsToFp9BUREGpJCu/Ugqm01i7MWu11K8+fnD+c9Dmf8EZZOh2mXQFmB21WJiLQYPh/a/RP7E+AXwPe7dF+7QRgDp/0BRj8LW76FF0dCwVa3qxIRaRF8PrRDA0Lpm9CX7zJ1X7tB9bsCrn4bCnfBf86CHbqSISJyvHw+tAEGJw9mde5q9lTscbuUliX1
NLjpUwgIccYrX/mu2xWJiDRrCm2c+9oWy8LdC90upeVJ7A6//ByST4A3r4MvHlbPchGRY6TQBvrE9yHEP0T3tRtLRCJc/wH0vQK+eAjeugEq1VtfRORoBbhdgDcI9A9kQNIADbLSmAKCYfQzkNDdeY47bxOMnQatUtyuTESk2VBLu86g5EFkFGSQU5bjdiktlzEw7E64YjrkboDnToNNX7ldlYhIs6HQrjO49WAAfsj8weVKfEC3UXDzXAiLdSYb+fZp3ecWEfGAR6FtjGlnjHnLGLPHGFNojJlhjGnvwXnpxpgpxpg1xphSY8xWY8w0Y0yn4y+9YXWP7U5kYKSGNG0q8V1g3OdOgH9yL8y4GSpL3K5KRMSrHTG0jTFhwBygO3AdcA3QBZhrjAk/wuljgV7Ak8Ao4B5gALDQGNPuOOpucAF+AaQnp/PNzm+wavU1jZAoGPManPknWP4m/OdMyF7rdlUiIl7Lk5b2L4FUYLS19l1r7UzgAqADMP4I5z5srT3ZWvu0tfZLa+3rwDlATN3nepUz25/JrpJdrMpd5XYpvsPPD079PVzzDpTkwJQzYNmbblclIuKVPAntC4AF1tqMvRustZuA+cCFhzvRWptdz7YtQDbQ9uhKbXynp5yOv/Fn9tbZbpfiezqfAb/6Clr3gRnj4P07oarM7apERLyKJ6HdC1hRz/aVQM+j/UJjTA8gEVh9tOc2tuiQaE5MPpHZW2brErkbotrAdR/AyXfAopecy+VZXvfXRETENZ6EdiyQX8/2PJzL3B4zxgQAz+K0tF84zHE3G2MWGmMWZmcf0lhvVMPbD2dz4WY2FGxo0u+VOv4BcPZkuOptKMmGKafDDy+od7mICJ4/8lXfb0xzDN/3f8BJwNXW2vr+IeB8mbVTrLXp1tr0hISEY/iaY3dm+zMxGD7b+lmTfq8cpMtwmPANdDgZZv0W/ns1lOa5XZWIiKs8Ce18nNb2wWKovwVeL2PM34CbgRuttZ96el5TSwhLoF9iPz7f8rnbpUhEIlz1Fox4ENZ9Ak8PhfXqbyAivsuT0F6Jc1/7YD0Bj7pZG2P+iPO41x3W2tc8L88dw9sPZ23+WrYVbnO7FPHzg5Nuh1/OgdAYmHYJfPBbPdMtIj7Jk9B+DxhijEndu8EY0xE4uW7fYRljfg08CPzRWvvUsZXZtIZ3GA6gXuTepHUfuPkLGHobLHwRnh0GWzUQjoj4Fk9C+z/AZmCmMeZCY8wFwExgG/Dc3oOMMR2MMdXGmPv32zYW+CfwMTDHGDNkv+Woe543lTYRbegZ15PZWxTaXiUwBEb+1ZkxrKYaXhwJH9+rGcNExGccMbSttSXAmcA64DVgGrAJONNaW7zfoQbwP+gzz6nbfg7w7UHL0w1Qf6M5u8PZLMtZRmZJptulyME6DoNbvoETb4IFT8MzJ8Hmr92uSkSk0XnUe9xau9Vae4m1NspaG2mtHW2t3XzQMZuttcZa++f9tl1ft62+5fQG/Uka2FntzwLg863qkOaVgiPhvMec57qx8PJ58MFvoKzA7cpERBqNZvn6GZ1adSItOk2XyL1dp1OcR8OG3AqLXoZ/D4IVM/Rct4i0SArtwxjRcQSLdi9ie9F2t0uRwwkKh3MecnqYRybDWzfA62Mgf4vblYmINCiF9mFclHYRxhjeXv+226WIJ9r0h3FzYORDsHm+0+r+8hGoKne7MhGRBqHQPozk8GROTTmVGetnUFVT5XY54gn/ABh6K9z2PXQ9B+b+FZ4e4gzOIiLSzCm0j+CyrpeRV57HnG1z3C5FjkarFBjzClzzLvgFOJfLX78ccjKOfK6IiJdSaB/ByW1OpnV4a95cpzmem6XOZzgd1YY/4Fwyf3owfDwRyjwegVdExGsotI/A38+fS7teyne7vmNr4Va3y5FjERAEw+6E2xdBvyudZ7ufHADfTQHd9hCRZkSh7YGL0i4iwATw1rq33C5FjkdkElzwFIyfB0m94KO7nM5qK9/RI2Ii
0iwotD2QEJbA6e1O592Md6msqXS7HDlerfvAde/Dlf+DgBB483p4/izY9JXblYmIHJZC20OXdb2M/Ip8DbbSUhgDXUfCr76GC5+Gokx45Rfw6mjYvtDt6kRE6qXQ9tCQNkNIiUhRh7SWxs8f+l/l3O8e+RBkLnda3a+PhV3L3K5OROQACm0P+Rk/xnQbw8LdC1mevdztcqShBYY6z3ffsRTOuh+2fgPPnQLTr4SdP7pdnYgIoNA+KmO6jSE6OJpnlj7jdinSWIIj4JTfwR3L4PSJsOVrmHI6TLsMtv3gdnUi4uMU2kchPDCc63pdx1c7vmJZti6dtmih0XD63XDnCqflvX0hvDAcXv4FrJ+t3uYi4gqF9lG6svuVxATH8PRSr54OXBpKSJTT8r5zOYz4K+RugGmXwLOnwPK3oKba7QpFxIcotI9SWGAY1/e+nvk75rMka4nb5UhTCY6Ak25z7nlf+DTUVMDbN8GT/WD+k5rHW0SahEL7GIztNpbYkFjd2/ZFAUFOb/NbvoOx0yGmI3z2J3iiF3x0t9MSFxFpJArtYxAWGMYNvW7gm53fqLXtq/z8oPu5cP0Hzghr3X8BP7wATw2AqZc6s4rV1rpdpYi0MArtYzSm2xhiQ2L595J/u12KuK11X7j4OfjNCqfHeeZyZ1axp/rD1/+E4my3KxSRFkKhfYzCAsO4qfdNLNi1gHnb57ldjniDyGSnx/lvVsClL0FkG5g9CR7v4QyVuvFL9ToXkeNirJf/EklPT7cLF3rnsJJVNVVc9v5llFaX8u6F7xIWGOZ2SeJtstbA4ldgyetQXuDcA+93NfS7wpnzW0TkIMaYRdba9Pr2qaV9HAL9A5l00iR2lezSZXKpX2J3OOdv8Ls1cNEUaNUO5j4IT/SG1y5yHhurLHW7ShFpJhTax6l/Yn8u63oZU1dPZVXuKrfLEW8VGAp9L3c6rv16CZz2B8hZ7zw29mgXeGcCbPwCamvcrlREvJgujzeAwspCLnz3QhLDEpl27jQC/ALcLkmag9pa2DIflr0Bq96DikKIbA29LoLel0Dbgc5sZCLiU3R5vJFFBUVx96C7WZW7iulrprtdjjQXfn7Q6RS48N/w+3VO57W2A+GH552Zxv7VF2b/GXYsVgc2EQHU0m4w1lpum3MbP2T+wNvnv027qHZulyTNVfkeWP0BrHirrsd5DbRqDz0vgB4XQMqJTuCLSIt0uJa2QrsB7SrexaXvX0pSeBJTR01Vb3I5fqV5sPZD5/L5hjlQWwXhidBtlDOgS6dTITDE7SpFpAEptJvQ/B3zmTB7AqM6jeLvp/wdo3uS0lDKCiBjNqz5ANZ/BpXFEBgOnc+AriOhywjnWXERadYOF9rqMdXATm57Mrf1v42nfnyKPgl9uKrHVW6XJC1FaDSccKmzVFfApq+cVvi6T5wgB2d0trSzIW24cxndX/+Li7Qkamk3glpby51z7+Sr7V/xnxH/IT253n8wiTQMayFrFaz7GNZ9Ctt/cO6DB7eC1FMh9QxIPR1iU9UbXaQZ0OVxFxRVFnHlrCspqixi+nnTaR3R2u2SxFeUFcCmLyHjc2cp3O5sj27vhHfHU51e67qULuKVFNou2VCwgas/vJqYkBheGvkSSeFJbpckvsZaZ7rQjXOdwVs2fQUVe5x9cV2c8O5wMnQ4CaLauFqqiDgU2i5amr2U8Z+NJz40npdGvkRCWILbJYkvq62BXUth81dOgG/91unQBs646B1OhnaDof0QiO+qy+kiLlBou2xJ1hLGfzaepPAkXhz5IvGh8W6XJOKoqYbMZU54b/nGWcrynH2hMU6Ap5wIKenQZgCERLlbr4gPUGh7gUW7FzFh9gTahLfh+ZHPK7jFO1kLuRmwdQFsWwBbv4Pc9XU7DSR0d0Zta9vfCfGk3hAQ5GrJIi2NQttL/JD5A7d+fiutglvxrzP+Rc+4nm6XJHJkZfmwYxFsX+T0TN+x6KfWuH+QE9yt+/60JPWCgGB3
axZpxhTaXmR17mrumHsHeeV5TD5pMuemnut2SSJHx1oo2Ao7Fzvjou9a4twnL6/r4OYXAPHdIPkESO7trBN7QYT6c4h4QqHtZXLLcvntF79lcdZibup9E7f3vx1/P3+3yxI5dtZC/mYnwDOX/7QU7frpmPAESOzptMQTe0BCD0jopvvkIgdRaHuhqpoq/v793/nfuv8xIHEAfz7pz3Rq1cntskQaVkmOE95Zq2D3Kti9ArLXQHX5T8dEpUBCV6d1vncd38UJefVeFx+k0PZi7214j4e/f5jy6nIm9JvA9b2u13zc0rLV1kDBFsha7SzZayFnLeSsh6rSn44LbgVxnZ0Aj0tzRnSLTXW2hbRyr36RRqbQ9nI5ZTk89N1DfLblM3rE9uBPQ/7ECQknuF2WSNOqrXVGb8te5/Rgz13vBHluBhTuOPDYsDiI6QSxnZwgj+kI0R2cdWRrTV0qzZpCu5n4bMtn/HXBX8ktz+XMdmdyW//b6BLTxe2yRNxXWQr5m5zR3fI2QN4m533eZifobe1Px/oHQat2EN3OGbq1VXvndasUZ4lso8fUxKsptJuR4spiXlv9Gq+ufJWSqhJGdRrF+L7jSW2V6nZpIt6pugL2bHc6wuVvdi69F2ytW7ZBSdZBJxhn3PWoNhDV1gnyqDZOCz2qzU+v9diauESh3QwVlBfw0sqXeH3165TXlDOk9RDGdh/LaSmn6Z63yNGoKoM9O2DPNifc964Ld0DhTmdfVcmh54XGOuEemQwRyRCZtN86CcITISIRgiPVYU4alEK7Gcspy2HG+hn8b+3/2F26m+TwZC5Ku4iRHUfSObqz2+WJNH/WOs+YF+1yQrxwp/O6KNNZiveud0Nt9aHnB4TUBXiC0+M9PN5Zh8U7r8PiITzOWYfFQVBY0/+M0qwotFuA6tpqvtz+JW+seYPvdn2HxZIWncaIDiM4q8NZdInugtG/9kUaT22tMzrc3hAvyYbiLOfye3GW874kp27Jhtqq+j8nIMQJ79BYCIupW8fut45xlpBoCI3+aR0Y2rQ/r7hGod3CZJdm89mWz/h0y6cs3r0YiyU+NJ7BrQczpPUQBicPJjk8WSEu4pa9rffSXCfES3OgNM95v2/Jc4aD3bsuyz+wQ93B/IPrQrzVgUtwVN3rKOd1cNR+ryPrligIjtB9+mbiuEPbGNMOeAI4GzDAbOBOa+1WD84NAf4CXA1EA0uAu6218zwpXqF9eNml2Xy942u+3fUt3+36jrxyZ0zohNAEesf3pk9CH3rF9aJLTBfiQuIU5CLeqrYWKouc8C6tC/HyAigr2G+9Z7+lAMoLoaLQeV9TeeTv8AusC/EICNq7joCgcGd7UPh+SwQEhv30fu/rwDDnEn9guLMOCNUjdg3suELbGBMGLAUqgPsACzwIhAF9rLX19OA44PxpwHnAXcBG4FZgFDDUWrvkSMUrtD1Xa2tZn7+ehbsXsjxnOStyVrClcMu+/THBMaTFpNG5VWfaR7UnJSKFdpHtSIlMISQgxMXKReS4VZU7AV5R5IR4RSFUFDvvKwp/el+5d1vd68piqCxx3leVOK89+QfA/gJCncv3gWEQGOK83rct1LklsHdfQKjT4t+3ve793u0BIQet6xb/vduCnNf+gS22A+DxhvYdwONAN2ttRt22TsB64A/W2scPc25fnJb1jdbal+q2BQArgbXW2guOVLxC+/jsqdjDytyVbCjYwIaCDawvWM/Ggo0UVxUfcFxMcAxJ4UkkhTlLfGg8sSGxxIXGERsSS3RINK2CWhEVHEWgX6BLP42INInqSifMq0qdEN+7VJXWbSv9aV91ed2+/2/v3mPkOuszjn+fmdmLN9kQm7pSc3EuChIkTYCItkGhNUmkOEXgCJFwqUTbIKCEqqJClXoTV9FQlYq2gQpIm4IEEUEQIEZIkAtQaJu0pFFSMDRtaJI6reMEX2PvbS4//njPrM8ez+6ctSc7c7zPxz4+M+95z867jzzzmzmX
ObPZ8tnU1r3dnVpzS+fRPvFx1rsFffzovD6eK+zd291prDAfTxe46bbVxrJlvW43jrbVGtl62bKNRtIU0QAACplJREFU56SzDAbkRIv2vcBkRFxeaP9HgIjYusK67wHeA5weETO59g8AfwScFhHzKz2+i/bgRQT75/fz5LNPsuvZXex6dhd7Zvbw9MzT7Dmyhz0zezgwf2DZ9acaU0yPT3Pq2KmcMn4K02PTTI1NsaGxgQ2NDel2fQMTjQkm6hNM1ieZaEwwXhtnvJ5NtXHG6mOM1cZo1BpL5nXVadQa1Gt1GmpQU416rU5ddWryZjizk0K7Ba3ZdJ59ay5tKWjNpjcMrbncNJ+m9nzh9kLveXsB2s2s30JuaqZ5az6dBbCkvbn8gYNlbPswvPydA4tmpaJd5oTfi4A7e7TvBK4vse5j+YKdW3ccuCC7bWtIEpsmN7FpchOXbL6kZ59mp8mBuQPsm9vH3tm9HFw4yMH5NB2YP8Dh5mGONI9weOEwhxYOsfvIbmZbs8y2ZplpzrDQWeXmtVXoFu+eEzUkUVMNISQhlt4HlizrzrvZLP7JbXrLr7dkXmgvKh5DUOx3zP0Sm/uWe6wVVjA7udVIFWVJw4ZsKimyfyKOziOATrask1u29P6bp6e5ZgC/RhllivYmYH+P9n3AxhNYt7v8GJLeDrwdYMuWLSWGaIM2Vhtj89RmNk8d3zWQ25028+35xWmuNcdCZ4Fmu8l8e56FzgKtTotmu0mzk6ZWp0Wr06IdbZqdJu1Om050aEVr8XY7js4jYklbkO5HBO1s01u3rZMdlRsRLP7Jbqe/hbasb/dn9LyfzY/Olm61OuZ+9FlO/4NCy3RZ9c80sxMyNn3Gmj1W2a/W6vXML/P+XcezbkTcAtwCafN4icexEVOv1ZmqTTE15i+SMDMblDI7CPfT+xPxRnp/is7bt8K63eVmZmZWQpmivZO0b7roQuBHJdY9LzttrLjuAvBoicc3MzMzyhXtHcBlkhYvMyXpXODybFm/dcfIHbCWnfL1BuCufkeOm5mZ2VFlivbfAY8Dd0q6VtJ20tHku4BPdTtJOkdSS9J7u23Zl6d8AfhrSW+VdBVwO3Ae8L7B/RpmZmYnv75FO/vGsyuB/wI+C9wGPAZcGRH5b+gQUO/xM28APk36FrWvA2cD10TEgyc8ejMzs3Wk1NHj2XeMv65Pn8fpcVR4RMwC784mMzMzO07+eikzM7OKcNE2MzOrCBdtMzOzinDRNjMzqwgXbTMzs4pw0TYzM6sIF20zM7OKUPFygaNG0jPAEwP8kT8H/HSAP2+9co6D4RwHwzkOhnMcjBPN8ZyI6Hld5JEv2oMm6YGIeNmwx1F1znEwnONgOMfBcI6D8Vzm6M3jZmZmFeGibWZmVhHrsWjfMuwBnCSc42A4x8FwjoPhHAfjOctx3e3TNjMzq6r1+EnbzMysktZF0ZZ0tqQvSToo6ZCkL0vaMuxxjSpJ10m6Q9ITkmYlPSLpw5KmC/02Svp7ST+VdETSPZIuHta4R52kb0gKSR8qtDvHEiS9StJ3JR3OnscPSLoyt9w59iHpckl3SXo6y/BBSW8p9HGOGUlnSfqYpPskzWTP33N79CuVmaRJSR+RtDt7bb1P0q+tZkwnfdGWNAV8C3gh8FvAm4EXAN+WdMowxzbC/gBoA38CXAN8ArgRuFtSDUCSgB3Z8t8jXW99jJTrWcMY9CiT9CbgxT3anWMJkn4HuBP4d+C1wPXAF4GpbLlz7EPSJcA9pFzeRsro+8Ctkm7M+jjHpS4AXg/sB77Xq8MqM7uVlP17gVcDu4FvSnpJ6RFFxEk9Ae8iFaALcm3nAS3g3cMe3yhOwOYebb8JBHBldv/a7P4VuT7PA/YBNw/7dxilCTgdeAp4U5bZh3LLnGP//M4FZoHfX6GPc+yf403AAnBqof1+4D7n2DOzWu72W7Nszi30KZUZ6U17ADfk2hrAI8COsmM66T9pA9uB+yPi0W5DRDwG
/DMpbCuIiGd6NH8/m5+ZzbcD/x8R386tdxD4Gs616C+AnRHx+R7LnGN/bwE6wCdX6OMc+xsHmqQ3QHkHOLrV1TnmRESnRLeymW0n5f+FXL8WcDuwTdJEmTGth6J9EfDDHu07gQvXeCxVtjWb/zibr5TrFkmnrsmoRpykV5C2UrxzmS7Osb9XAP8JvFHSTyS1JD0q6XdzfZxjf5/J5jdLOkPS6ZLeBlwF/FW2zDmuXtnMLgIei4iZHv3GSZvi+1oPRXsTaX9E0T5g4xqPpZIknQl8ELgnIh7ImlfKFZwtksaATwF/GRGPLNPNOfZ3Buk4lI8Afw5cDdwNfFzSu7I+zrGPiPgh8ErSp7//I+X1t8A7IuL2rJtzXL2ymfXrt6nMgzVWNbTq6nUyutZ8FBWUvUu8k3QMwA35RTjXfv4Q2AD82Qp9nGN/NWAa+O2I+HLW9q3sKN4/lnQzzrEvSS8A7iB9snsHaTP5tcAnJc1FxG04x+NRNrOBZLseivZ+er+D2Ujvdz2WkTRJOiryfGBrRDyZW7yP5XOFdZ5tdkrhn5IOXpko7K+akHQ68CzOsYy9pE/adxfa7yIdsfsLOMcybiLtU311RDSztnslPR/4G0mfxzkej7KZ7QN6nWq8Mbe8r/WweXwnaV9C0YXAj9Z4LJWRbdq9A/hl4FUR8YNCl5Vy/d+IOPwcD3HUnQ9MAp8jPWm7E6RT6vYDF+Mcy9i5THv3E0oH51jGxcDDuYLd9W/A84Gfxzkej7KZ7QTOy05DLvZbAB6lhPVQtHcAl0k6v9uQbVa7PFtmBdm52LeRDlC5NiLu79FtB3CmpK259U4DXoNzBXgIuKLHBKmQX0F6kjrH/r6SzbcV2rcBT0bEUzjHMp4CXiJpvND+K8Ac6ZOec1y9spntIJ2/fX2uXwN4A3BXRMyXerRhnwe3BufZnUJ6cfwBaf/NduBh4H8onK/oaTGzT5CdTwxcVpjOyvrUgH8BdgFvJL2Afof0xD972L/DqE4ce562c+yfmUhfkLSXtC/2atIFGYK0n9s5lsvxuiyzb2avhVcDH8/aPuocV8ztutzr4o3Z/a2rzYx0etd+0m6zq4Avkd4wXVp6PMMOZI1C30La1HuItB/xqxROkPe0JK/Hs/+cvab35/ptAv4h+885A9wLvHjY4x/lqVi0nWPp3E4jHem8h7Qp8T+A33COq87x17OC8kz2WvgQ6XTEunNcNrPlXgu/s9rMSAemfpS01WMO+FfglasZj6/yZWZmVhHrYZ+2mZnZScFF28zMrCJctM3MzCrCRdvMzKwiXLTNzMwqwkXbzMysIly0zczMKsJF28yWkHSapPdLetGwx2JmS7lom1nRy4D3kb4n2cxGiIu2mRW9FJjHV8EzGzn+GlMzWyTpx8ALC813RMR1wxiPmS3lom1miyT9EulKRDuBm7Lm3RHxxPBGZWZdjWEPwMxGysPAWcDHovd11M1siLxP28zyLgLGgQeHPRAzO5aLtpnlXUq6VvBDwx6ImR3LRdvM8l4K/CQiDg17IGZ2LBdtM8u7EJ/qZTayfCCameUdAC6VtA04CPx3ROwd8pjMLONTvsxskaRfBG4FLgEmgV+NiH8a7qjMrMtF28zMrCK8T9vMzKwiXLTNzMwqwkXbzMysIly0zczMKsJF28zMrCJctM3MzCrCRdvMzKwiXLTNzMwqwkXbzMysIn4Gy+mjIj8xopoAAAAASUVORK5CYII=\n",
442 | "text/plain": [
443 | ""
444 | ]
445 | },
446 | "metadata": {
447 | "needs_background": "light"
448 | },
449 | "output_type": "display_data"
450 | }
451 | ],
452 | "source": [
453 | "# Simulate the evolution of the system, starting from a uniform state\n",
454 | "T = 100\n",
455 | "x = np.ones((3, T)) / 3.\n",
456 | "\n",
457 | "for t in np.arange(1, T):\n",
458 | " x[:, t] = P @ x[:, t-1]\n",
459 | "\n",
460 | "plt.figure(figsize=(8, 6))\n",
461 | "plt.plot(x.T)\n",
462 | "plt.legend(labels=['$x_1$', '$x_2$', '$x_3$'], loc='upper left')\n",
463 | "plt.xlabel('$t$')\n",
464 | "plt.show()"
465 | ]
466 | },
467 | {
468 | "cell_type": "markdown",
469 | "metadata": {},
470 | "source": [
471 |     "We have two eigenvalues with magnitude less than 1, so any contribution to the initial state from those eigenvectors will damp out over time. The one eigenvalue with magnitude 1 dominates in the long run - it corresponds to the eigenvector with 100% occupation of state 1, so the probability of being in state 1 eventually approaches 100%.\n",
472 | "\n",
473 |     "Markov transition matrices, because their columns hold probabilities that sum to 1, always have an eigenvalue equal to 1. This idea is one of the foundational principles of Google's PageRank algorithm. Thus, whether you like it or not, this theory impacts your life every single day!"
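This is easy to check numerically. Below is a minimal sketch using an illustrative 3-by-3 column-stochastic matrix (an assumed example, not necessarily the `P` defined above):

```python
import numpy as np

# An illustrative Markov transition matrix: each column sums to 1,
# and state 1 is absorbing (first column is e_1)
P = np.array([[1.0, 0.3, 0.2],
              [0.0, 0.5, 0.3],
              [0.0, 0.2, 0.5]])

vals, vecs = np.linalg.eig(P)
order = np.argsort(-np.abs(vals))
vals = vals[order]

# The largest eigenvalue has magnitude 1; the others have magnitude < 1,
# so their contributions damp out under repeated application of P
print(np.abs(vals))
```

The eigenvector paired with the unit eigenvalue here is exactly the "100% in state 1" distribution, matching the behavior described above.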
474 | ]
475 | },
476 | {
477 | "cell_type": "code",
478 | "execution_count": null,
479 | "metadata": {},
480 | "outputs": [],
481 | "source": []
482 | }
483 | ],
484 | "metadata": {
485 | "kernelspec": {
486 | "display_name": "Python 3",
487 | "language": "python",
488 | "name": "python3"
489 | },
490 | "language_info": {
491 | "codemirror_mode": {
492 | "name": "ipython",
493 | "version": 3
494 | },
495 | "file_extension": ".py",
496 | "mimetype": "text/x-python",
497 | "name": "python",
498 | "nbconvert_exporter": "python",
499 | "pygments_lexer": "ipython3",
500 | "version": "3.6.7"
501 | },
502 | "latex_envs": {
503 | "LaTeX_envs_menu_present": true,
504 | "autoclose": true,
505 | "autocomplete": false,
506 | "bibliofile": "biblio.bib",
507 | "cite_by": "apalike",
508 | "current_citInitial": 1,
509 | "eqLabelWithNumbers": false,
510 | "eqNumInitial": 1,
511 | "hotkeys": {
512 | "equation": "Ctrl-E",
513 | "itemize": "Ctrl-I"
514 | },
515 | "labels_anchors": false,
516 | "latex_user_defs": false,
517 | "report_style_numbering": false,
518 | "user_envs_cfg": false
519 | },
520 | "toc": {
521 | "base_numbering": 1,
522 | "nav_menu": {},
523 | "number_sections": false,
524 | "sideBar": true,
525 | "skip_h1_title": false,
526 | "title_cell": "Table of Contents",
527 | "title_sidebar": "Contents",
528 | "toc_cell": false,
529 | "toc_position": {},
530 | "toc_section_display": true,
531 | "toc_window_display": true
532 | }
533 | },
534 | "nbformat": 4,
535 | "nbformat_minor": 2
536 | }
537 |
--------------------------------------------------------------------------------
/Ch 5 - Singular Value Decomposition.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Singular Value Decomposition\n",
8 | "\n",
9 |     "The Singular Value Decomposition (SVD) is a topic that is not part of the standard linear algebra introduction, but is possibly the most powerful tool in the field. The SVD makes many mathematical proofs very straightforward, and has immediate applications in\n",
10 | "\n",
11 | "* compression,\n",
12 | "* error analysis,\n",
13 | "* convex **and** non-convex optimization,\n",
14 | "\n",
15 | "...just to name a few."
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "### Python Setup"
23 | ]
24 | },
25 | {
26 | "cell_type": "code",
27 | "execution_count": 1,
28 | "metadata": {
29 | "ExecuteTime": {
30 | "end_time": "2019-11-03T21:54:57.746169Z",
31 | "start_time": "2019-11-03T21:54:56.391964Z"
32 | }
33 | },
34 | "outputs": [],
35 | "source": [
36 | "# Render MPL figures within notebook cells\n",
37 | "%matplotlib inline\n",
38 | "\n",
39 | "# Import python libraries\n",
40 | "import numpy as np\n",
41 | "import matplotlib.pyplot as plt\n",
42 | "from matplotlib import rcParams"
43 | ]
44 | },
45 | {
46 | "cell_type": "code",
47 | "execution_count": 2,
48 | "metadata": {
49 | "ExecuteTime": {
50 | "end_time": "2019-11-03T21:54:57.758468Z",
51 | "start_time": "2019-11-03T21:54:57.750295Z"
52 | }
53 | },
54 | "outputs": [],
55 | "source": [
56 | "# Configure some defaults for plots\n",
57 | "rcParams['font.size'] = 16\n",
58 | "rcParams['figure.figsize'] = (10, 3)"
59 | ]
60 | },
61 | {
62 | "cell_type": "code",
63 | "execution_count": 3,
64 | "metadata": {
65 | "ExecuteTime": {
66 | "end_time": "2019-11-03T21:54:57.767433Z",
67 | "start_time": "2019-11-03T21:54:57.762978Z"
68 | }
69 | },
70 | "outputs": [],
71 | "source": [
72 | "# Set Numpy's random number generator so the same results are produced each time the notebook is run\n",
73 | "np.random.seed(0)"
74 | ]
75 | },
76 | {
77 | "cell_type": "markdown",
78 | "metadata": {},
79 | "source": [
80 | "### Derivation\n",
81 | "\n",
82 |     "The SVD is a way of decomposing a matrix into smaller matrices with very well-defined structure. We can do a nice derivation, again, geometrically:\n",
83 | "\n",
84 | "We start with the fun fact that if you transform a sphere of unit radius by **any** matrix $A$, the resulting points make an ellipse. Notice that we are **not restricted in any way** on the matrix $A$ here.\n",
85 | "\n",
86 | "\n",
87 | "\n",
88 | "The output ellipse is defined by a set of mutually orthogonal principal axis vectors, $u$. Orthogonality, as a reminder, means that the inner product between two different vectors in the set is zero: \n",
89 | " \n",
90 | "
\n",
91 | " $u_1^{\\mathsf{T}}u_2 = 0$\n",
92 | "
\n",
93 | " \n",
94 | "\n",
95 |     "An absolutely remarkable result is that - working backwards - the input vectors $v$ that map onto the principal axes **also** form a mutually orthogonal set. For simplicity, we'll scale the $u$ and $v$ vectors to have unit length, and use coefficients $\\sigma$ to make the equality work out.\n",
96 | "\n",
97 | " \n",
98 | "
\n",
99 | " $Av=\\sigma u$\n",
100 | "
\n",
101 | " \n",
102 | "\n",
103 | "Now, we can horizontally stack all the vectors into matrices: \n",
104 | "\n",
105 | " \n",
106 | "
\n",
107 | " $AV=U\\Sigma$\n",
108 | "
\n",
109 | " \n",
110 | "\n",
111 | "The matrix $\\Sigma$ contains the scaling coefficients $\\sigma$ along its main diagonal:\n",
112 | "\n",
113 | "\n",
114 | "\n",
115 | "We need one last piece, which comes from exploiting the fact that these new matrices $U$ and $V$ are built from mutually orthogonal vectors with length 1. These are called **orthogonal** matrices, and have the property that \n",
116 | "\n",
117 | " \n",
118 | "
\n",
119 | " $V^\\mathsf{T} V = VV^\\mathsf{T} = I \\\\\n",
120 | " U^\\mathsf{T} U = UU^\\mathsf{T} = I$\n",
121 | "
\n",
122 | " \n",
123 | "\n",
124 | "Therefore, we can conclude that **any** matrix $A$ can be written in the form \n",
125 | "\n",
126 | " \n",
127 | "
\n",
128 |     "    $A = U\\Sigma V^{\\mathsf{T}}$\n",
130 | " \n",
131 | "\n",
132 | "This is the singular value decomposition, where \"singular values\" refers to the scaling coefficients $\\sigma$. Let's dive into how we can use this thing for fun and profit.\n"
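We can verify this conclusion numerically. A quick sketch with NumPy (note that `np.linalg.svd` returns the singular values as a vector and returns $V$ already transposed):

```python
import numpy as np

np.random.seed(0)
A = np.random.randn(3, 5)          # an arbitrary matrix, no restrictions

U, s, Vt = np.linalg.svd(A)        # s is a vector; Vt is V transposed
Sigma = np.zeros((3, 5))
Sigma[:3, :3] = np.diag(s)

# U and V are orthogonal, and A factors exactly as U @ Sigma @ V.T
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(5))
assert np.allclose(A, U @ Sigma @ Vt)
```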
133 | ]
134 | },
135 | {
136 | "cell_type": "markdown",
137 | "metadata": {},
138 | "source": [
139 | "### Pseudoinverse\n",
140 | "\n",
141 |     "We saw in the previous section the least squares and least norm solutions when solving problems of the form $y = Ax$. The SVD provides a remarkably simple way of arriving at these solutions *simultaneously*. \n",
142 | "\n",
143 | "1. Drop any rows or columns in $\\Sigma$ that have only zeros such that it becomes a square, diagonal matrix. If A has rank $r$, this will give $U \\in \\mathbb{R}^{m, r}$, $V \\in \\mathbb{R}^{n, r}$, and $\\Sigma \\in \\mathbb{R}^{r, r}$.\n",
144 | "2. Now, by construction the matrix $V\\Sigma^{-1}U^{\\mathsf{T}}$ is an inverse to $U\\Sigma V^{\\mathsf{T}}$ (this is super easy to test)\n",
145 | "3. Therefore, $x = V\\Sigma^{-1}U^{\\mathsf{T}}y$\n",
146 | "\n",
147 | "It's easy to show just by substituting the SVD into the least squares and least norm solutions that one arrives at the same result. Try this at home - cancelling everything out is immensely satisfying.\n",
148 | "\n",
149 | "The matrix $V\\Sigma^{-1}U^{\\mathsf{T}}$ is so significant that it has a fancy name, the **Moore-Penrose Pseudoinverse**. It also has a fancy notation, $A^{\\dagger}$."
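A sketch of steps 1-3 in NumPy for a tall, full-column-rank matrix (passing `full_matrices=False` gives the reduced form directly in this case; `np.linalg.pinv` computes the same matrix):

```python
import numpy as np

np.random.seed(0)
A = np.random.randn(5, 3)          # tall matrix, full column rank
y = np.random.randn(5)

# Reduced SVD, then build the pseudoinverse V @ Sigma^{-1} @ U.T
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_dagger = Vt.T @ np.diag(1.0 / s) @ U.T

x = A_dagger @ y

# Matches the least-squares solution and NumPy's built-in pseudoinverse
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
assert np.allclose(x, x_ls)
assert np.allclose(A_dagger, np.linalg.pinv(A))
```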
150 | ]
151 | },
152 | {
153 | "cell_type": "markdown",
154 | "metadata": {},
155 | "source": [
156 | "### Basis for the Nullspace\n",
157 | "\n",
158 |     "I mentioned in the discussion on the nullspace back in chapter 2 that there would be a way to get the basis vectors for the nullspace. If $A$ has rank $r$, then the last $n-r$ columns of $V$ are those basis vectors! These are the directions in the input space that get multiplied by singular values of zero, and are, thus, flattened."
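A sketch of this in NumPy, using a matrix built to have rank 2:

```python
import numpy as np

np.random.seed(0)
A = np.random.randn(4, 2) @ np.random.randn(2, 4)   # rank 2 by construction

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))     # numerical rank: count the nonzero singular values
null_basis = Vt[r:].T          # last n - r columns of V

# Directions in the nullspace are flattened to (numerically) zero
assert r == 2
assert np.abs(A @ null_basis).max() < 1e-10
```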
159 | ]
160 | },
161 | {
162 | "cell_type": "markdown",
163 | "metadata": {},
164 | "source": [
165 | "### Error Analysis\n",
166 | "\n",
167 | "Suppose you have a linear system with some small error, $\\delta$ in the observed measurement such that $y = Ax + \\delta$. If you solve using the pseudoinverse, the resulting error in the estimate of $x$ will be $V\\Sigma^{-1} U^{\\mathsf{T}}\\delta$. A property of $U$ and $V$ as orthogonal matrices is that, when viewed as a transform, they only rotate or reflect - lengths of vectors are preserved. This means the only thing that affects the magnitude of the error is $\\Sigma^{-1}$. This is a diagonal matrix whose entries are $\\sigma_i^{-1}$ - if $\\sigma_i$ is very, very small, then this means you have very, very big numbers multiplying your errors!\n",
168 | "\n",
169 | "You will see people refer to matrices as **well-conditioned** or **ill-conditioned**. These terms indicate whether the matrix has these very small singular values and is at risk of producing large errors. How small is small, though? Since size is relative, the standard is to normalize to the largest singular value. This gives us the concept of the **condition number**: \n",
170 | "\n",
171 | " \n",
172 | "
\n",
173 |     "    $k = \\frac{\\sigma_{max}}{\\sigma_{min}}$\n",
175 | " \n",
176 | "\n",
177 | "When $k$ is large, there is a wide spread in the singular values and our estimates may be very sensitive to small changes or noise."
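NumPy gives the singular values directly, and `np.linalg.cond` uses this same $\sigma_{max}/\sigma_{min}$ definition by default:

```python
import numpy as np

np.random.seed(0)
A = np.random.randn(4, 4)

# Condition number from the singular values
s = np.linalg.svd(A, compute_uv=False)
k = s.max() / s.min()

# np.linalg.cond defaults to the same 2-norm definition
assert np.isclose(k, np.linalg.cond(A))
```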
178 | ]
179 | },
180 | {
181 | "cell_type": "markdown",
182 | "metadata": {},
183 | "source": [
184 | "### Low-Rank Approximation\n",
185 | "\n",
186 | "For any matrix that you would obtain from *measurements* of something, as opposed to analytically, it's basically guaranteed that the matrix will be full rank. Suppose you build a measurement system whose corresponding matrix is **not** full rank - if you so much as speak loudly nearby, the infinitesimal noise you introduce will add some floating point-level error that a computer will correctly identify as \"full rank\". Practically, though, your matrix is still singular.\n",
187 | "\n",
188 | "The SVD provides a handy solution - simply round the smallest singular values to zero. One can show this is actually an optimal approximation to the original matrix $A$ under a particular cost function (the details are beyond the scope of what I'm covering here, but the Boyd lectures explain it well!). How small should you set the threshold? That depends on your application. We'll see an example below."
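As a sketch of the rounding idea (with an assumed noise level and threshold), here is a matrix built with rank 3 plus tiny "measurement" noise; truncating the small singular values recovers, approximately, the noiseless matrix:

```python
import numpy as np

np.random.seed(0)
A = np.random.randn(6, 3) @ np.random.randn(3, 6)   # rank 3 by construction
A_noisy = A + 1e-8 * np.random.randn(6, 6)          # tiny "measurement" noise

# The noise makes the matrix numerically full rank...
U, s, Vt = np.linalg.svd(A_noisy)

# ...but rounding the small singular values to zero gives a rank-3 approximation
k = int(np.sum(s > 1e-6))                 # threshold chosen for this noise level
A_approx = (U[:, :k] * s[:k]) @ Vt[:k]

assert k == 3
assert np.allclose(A_approx, A, atol=1e-6)
```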
189 | ]
190 | },
191 | {
192 | "cell_type": "markdown",
193 | "metadata": {},
194 | "source": [
195 | "#### Compression\n",
196 | "\n",
197 |     "I mentioned in the discussion on rank that any matrix $A$ can be factored into component matrices such that $A = QR$, which may require storing fewer elements than the original matrix, depending on the rank of $A$. The SVD provides us with an immediate way of getting $Q$ and $R$. If $A$ is low rank - implying that transforming by $A$ flattens some dimensions - then some of the singular values will be zero. We can drop the rows and columns of $U$ and $V$ corresponding to those singular values and merge $\\Sigma$ into either of them:\n",
198 | "\n",
199 | " \n",
200 | "