├── 2.1 Scalars, Vectors, Matrices and Tensors
│   ├── 2.1 Scalars Vectors Matrices and Tensors.ipynb
│   └── images
│       ├── dimensions-transposition-matrix.png
│       ├── matrix-addition.png
│       ├── non-squared-matrix-transposition.png
│       ├── scalar-vector-matrix-tensor.png
│       ├── square-matrix-transposition.png
│       └── vector-transposition.png
├── 2.10 The Trace Operator
│   ├── 2.10 The Trace Operator.ipynb
│   └── images
│       └── trace-matrix.png
├── 2.11 The determinant
│   ├── 2.11 The determinant.ipynb
│   ├── images
│   │   ├── positive-negative-determinant.png
│   │   ├── unit-square-area-transformed-1.png
│   │   ├── unit-square-area-transformed.png
│   │   └── unit-square-area.png
│   └── test_svd.jpg
├── 2.12 Example - Principal Components Analysis
│   ├── 2.12 Example - Principal Components Analysis.ipynb
│   └── images
│       ├── first-principal-component.png
│       ├── gradient-descent-local-minima.png
│       ├── gradient-descent.png
│       ├── orthogonal-vectors.png
│       ├── principal-component-analysis-variance-explained.png
│       ├── principal-components-analysis-PCA-change-coordinates.png
│       ├── principal-components-analysis-PCA-decoding-function.png
│       ├── principal-components-analysis-PCA-encoding-function.png
│       └── principal-components-analysis-PCA-reconstruction-function.png
├── 2.2 Multiplying Matrices and Vectors
│   ├── 2.2 Multiplying Matrices and Vectors.ipynb
│   └── images
│       ├── dot-product.png
│       ├── plot-linear-equation.png
│       └── system-linear-equations-matrix-form.png
├── 2.3 Identity and Inverse Matrices
│   ├── 2.3 Identity and Inverse Matrices.ipynb
│   └── images
│       └── identity-matrix.png
├── 2.4 Linear Dependence and Span
│   ├── 2.4 Linear Dependence and Span.ipynb
│   └── images
│       ├── intersection-2-planes-line.png
│       ├── number-solutions-system-equations.png
│       ├── overdetermined-system-linear-equations.png
│       ├── python-adding-vectors-combination-1.png
│       ├── python-adding-vectors-combination.png
│       ├── python-one-equation.png
│       ├── python-system-equations-with-linear-dependence.png
│       ├── python-three-equations.png
│       ├── python-two-equations-1.png
│       ├── python-two-equations-no-solution.png
│       ├── python-two-equations.png
│       ├── python-two-vectors.png
│       ├── representing-features.png
│       └── underdetermined-system-linear-equations.png
├── 2.5 Norms
│   ├── 2.5 Norms.ipynb
│   └── images
│       ├── l1-norm.png
│       ├── l2-norm.png
│       └── squared-L2-Norm.png
├── 2.6 Special Kinds of Matrices and Vectors
│   ├── 2.6 Special Kinds of Matrices and Vectors.ipynb
│   └── images
│       ├── diagonal-and-symmetric-matrices.png
│       ├── diagonal-matrix.png
│       ├── orthogonal-matrix.png
│       └── symmetric-matrix.png
├── 2.7 Eigendecomposition
│   ├── 2.7 Eigendecomposition.ipynb
│   └── images
│       ├── output_59_0.png
│       ├── quadratic-functions-indefinite-form.png
│       ├── quadratic-functions-negative-definite-form.png
│       └── quadratic-functions-positive-definite-form.png
├── 2.8 Singular Value Decomposition
│   ├── 2.8 Singular Value Decomposition.ipynb
│   ├── images
│   │   ├── SVD_image_dim.png
│   │   ├── dimensions-reconstruction-image-singular-value-decomposition.png
│   │   ├── non-square-matrix-change-dimensions.png
│   │   ├── output_35_7.png
│   │   ├── rescaled-circle-rotated.png
│   │   ├── singular-value-decomposition-understanding-dimensions.png
│   │   ├── singular-value-decomposition.png
│   │   ├── transformation-vector-by-matrix.png
│   │   ├── unit-circle-transformation.png
│   │   ├── unit-circle-transformation1.png
│   │   ├── unit-circle.png
│   │   └── unit-vectors-rotation.png
│   └── test_svd.jpg
├── 2.9 The Moore-Penrose Pseudoinverse
│   ├── 2.9 The Moore-Penrose Pseudoinverse.ipynb
│   └── images
│       ├── dataset-representation.png
│       ├── linear-regression-r.png
│       └── overdetermined-system-equations-python.png
├── 3.1-3.3 Probability Mass and Density Functions
│   ├── 3.1-3.3 Probability Mass and Density Functions.ipynb
│   └── images
│       ├── all_dice.png
│       ├── area-under-curve-derivative.png
│       ├── area_under_curve_more_1.png
│       ├── intro_image.png
│       ├── marginal-probabilities-empty.png
│       ├── marginal-probabilities.png
│       ├── mass.png
│       ├── negative-and-positive-covariance.png
│       ├── probability-density-function-area-under-the-curve-1.png
│       ├── probability-density-function-area-under-the-curve-2.png
│       └── probability-density-function.png
├── 3.4-3.5 Marginal and Conditional Probability
│   ├── 3.4 - 3.5 Marginal and Conditional Probability.ipynb
│   └── images
│       ├── bivariate-gaussian-curves.png
│       ├── conditional-probability.png
│       ├── inside-probability.png
│       ├── integral-probability.png
│       ├── intro.png
│       ├── marginal-probabilities-empty.png
│       ├── marginal-probabilities.png
│       ├── negative-and-positive-covariance.png
│       ├── sum-rule-example.png
│       ├── sum_rule_1.png
│       ├── sum_rule_2.png
│       ├── sum_rule_3.png
│       ├── sum_rule_4.png
│       └── summary-mathematical-notation.png
├── LICENSE
├── deep-learning-book-goodfellow-cover.jpg
└── readme.md

/2.1 Scalars, Vectors, Matrices and Tensors/2.1 Scalars Vectors Matrices and Tensors.ipynb:
--------------------------------------------------------------------------------
1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 39, 6 | "metadata": { 7 | "collapsed": true 8 | }, 9 | "outputs": [], 10 | "source": [
11 | "import numpy as np"
12 | ] 13 | }, 14 | { 15 | "cell_type": "code", 16 | "execution_count": 2, 17 | "metadata": {}, 18 | "outputs": [ 19 | { 20 | "data": { 21 | "text/html": [
22 | "" 49 | ],
50 | "text/plain": [ 51 | "" 52 | ] 53 | }, 54 | "metadata": {}, 55 | "output_type": "display_data" 56 | } 57 | ], 58 | "source": [
59 | "%%html\n",
60 | "" 87 | ]
88 | }, 89 | { 90 | "cell_type": "markdown", 91 | "metadata": {}, 92 | "source": [
93 | "$$\n",
94 | "\\newcommand\\bs[1]{\\boldsymbol{#1}}\n",
95 | "$$"
96 | ] 97 | }, 98 | { 99 | "cell_type": "markdown", 100 | "metadata": {}, 101 | "source": [
102 | "\n",
103 | " This content is part of a series following the chapter 2 on linear algebra from the [Deep Learning Book](http://www.deeplearningbook.org/) by Goodfellow, I., Bengio, Y., and Courville, A. (2016). It aims to provide intuitions/drawings/python code on mathematical theories and reflects my own understanding of these concepts. You can check the syllabus in the [introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/).\n",
104 | "" 105 | ] 106 | }, 107 | { 108 | "cell_type": "markdown", 109 | "metadata": {}, 110 | "source": [
111 | "# Introduction\n",
112 | "\n",
113 | "This is the first post/notebook of a series following the syllabus of the [linear algebra chapter from the Deep Learning Book](http://www.deeplearningbook.org/contents/linear_algebra.html) by Goodfellow et al. This work is a collection of thoughts/details/developments/examples I made while reading this chapter. It is designed to help you go through their introduction to linear algebra. For more details about this series and the syllabus, please see the [introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/).\n",
114 | "\n",
115 | "This first chapter is quite light and concerns the basic elements used in linear algebra and their definitions. It also introduces important functions in Python/Numpy that we will use throughout this series. 
It will explain how to create and use vectors and matrices through examples."
116 | ] 117 | }, 118 | { 119 | "cell_type": "markdown", 120 | "metadata": {}, 121 | "source": [
122 | "# 2.1 Scalars, Vectors, Matrices and Tensors\n",
123 | "\n",
124 | "Let's start with some basic definitions:\n",
125 | "\n",
127 | "Difference between a scalar, a vector, a matrix and a tensor\n",
128 | "\n",
129 | "- A scalar is a single number.\n",
130 | "- A vector is an array of numbers.\n",
131 | "\n",
132 | "$$\n",
133 | "\\bs{x} =\\begin{bmatrix}\n",
134 | "    x_1 \\\\\\\\\n",
135 | "    x_2 \\\\\\\\\n",
136 | "    \\cdots \\\\\\\\\n",
137 | "    x_n\n",
138 | "\\end{bmatrix}\n",
139 | "$$\n",
140 | "\n",
141 | "- A matrix is a $2$-dimensional array.\n",
142 | "\n",
143 | "$$\n",
144 | "\\bs{A}=\n",
145 | "\\begin{bmatrix}\n",
146 | "    A_{1,1} & A_{1,2} & \\cdots & A_{1,n} \\\\\\\\\n",
147 | "    A_{2,1} & A_{2,2} & \\cdots & A_{2,n} \\\\\\\\\n",
148 | "    \\cdots & \\cdots & \\cdots & \\cdots \\\\\\\\\n",
149 | "    A_{m,1} & A_{m,2} & \\cdots & A_{m,n}\n",
150 | "\\end{bmatrix}\n",
151 | "$$\n",
152 | "\n",
153 | "- A tensor is an $n$-dimensional array with $n>2$.\n",
154 | "\n",
155 | "We will follow the conventions used in the [Deep Learning Book](http://www.deeplearningbook.org/):\n",
156 | "\n",
157 | "- scalars are written in lowercase and italics. For instance: $n$\n",
158 | "- vectors are written in lowercase, italics and bold type. For instance: $\\bs{x}$\n",
159 | "- matrices are written in uppercase, italics and bold. For instance: $\\bs{X}$"
160 | ] 161 | }, 162 | { 163 | "cell_type": "markdown", 164 | "metadata": {}, 165 | "source": [
166 | "### Example 1.\n",
167 | "\n",
168 | "#### Create a vector with Python and Numpy\n",
169 | "\n",
170 | "*Coding tip*: Unlike the `matrix()` function, which necessarily creates $2$-dimensional matrices, the `array()` function can create $n$-dimensional arrays. The main advantage of `matrix()` is its convenient methods (conjugate transpose, inverse, matrix operations...). We will use the `array()` function in this series.\n",
171 | "\n",
172 | "We will start by creating a vector. This is just a $1$-dimensional array:"
173 | ] 174 | }, 175 | { 176 | "cell_type": "code", 177 | "execution_count": 41, 178 | "metadata": {}, 179 | "outputs": [ 180 | { 181 | "data": { 182 | "text/plain": [
183 | "array([1, 2, 3, 4])"
184 | ] 185 | }, 186 | "execution_count": 41, 187 | "metadata": {}, 188 | "output_type": "execute_result" 189 | } 190 | ], 191 | "source": [
192 | "x = np.array([1, 2, 3, 4])\n",
193 | "x"
194 | ] 195 | }, 196 | { 197 | "cell_type": "markdown", 198 | "metadata": {}, 199 | "source": [
200 | "### Example 2.\n",
201 | "\n",
202 | "#### Create a (3x2) matrix with nested brackets\n",
203 | "\n",
204 | "The `array()` function can also create $2$-dimensional arrays with nested brackets:"
205 | ] 206 | }, 207 | { 208 | "cell_type": "code", 209 | "execution_count": 42, 210 | "metadata": {}, 211 | "outputs": [ 212 | { 213 | "data": { 214 | "text/plain": [
215 | "array([[1, 2],\n",
216 | "       [3, 4],\n",
217 | "       [5, 6]])"
218 | ] 219 | }, 220 | "execution_count": 42, 221 | "metadata": {}, 222 | "output_type": "execute_result" 223 | } 224 | ], 225 | "source": [
226 | "A = np.array([[1, 2], [3, 4], [5, 6]])\n",
227 | "A"
228 | ] 229 | },
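The definitions above also cover tensors, but the notebook itself only builds vectors and matrices. A small sketch of my own (assuming the `import numpy as np` from the first cell) showing how a tensor looks with one more level of nested brackets:

```python
import numpy as np

# A 3-dimensional array: a tensor by the book's convention (n > 2).
# It stacks two (2, 2) matrices along a third dimension.
T = np.array([[[1, 2], [3, 4]],
              [[5, 6], [7, 8]]])
T.shape  # (2, 2, 2)
```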
230 | { 231 | "cell_type": "markdown", 232 | "metadata": {}, 233 | "source": [
234 | "### Shape\n",
235 | "\n",
236 | "The shape of an array (that is to say its dimensions) tells you the number of values for each dimension. For a $2$-dimensional array it will give you the number of rows and the number of columns. Let's find the shape of our preceding $2$-dimensional array `A`. Since `A` is a Numpy array (it was created with the `array()` function) you can access its shape with:"
237 | ] 238 | }, 239 | { 240 | "cell_type": "code", 241 | "execution_count": 43, 242 | "metadata": {}, 243 | "outputs": [ 244 | { 245 | "data": { 246 | "text/plain": [
247 | "(3, 2)"
248 | ] 249 | }, 250 | "execution_count": 43, 251 | "metadata": {}, 252 | "output_type": "execute_result" 253 | } 254 | ], 255 | "source": [
256 | "A.shape"
257 | ] 258 | }, 259 | { 260 | "cell_type": "markdown", 261 | "metadata": {}, 262 | "source": [
263 | "We can see that $\\bs{A}$ has 3 rows and 2 columns.\n",
264 | "\n",
265 | "Let's check the shape of our first vector:"
266 | ] 267 | }, 268 | { 269 | "cell_type": "code", 270 | "execution_count": 44, 271 | "metadata": {}, 272 | "outputs": [ 273 | { 274 | "data": { 275 | "text/plain": [
276 | "(4,)"
277 | ] 278 | }, 279 | "execution_count": 44, 280 | "metadata": {}, 281 | "output_type": "execute_result" 282 | } 283 | ], 284 | "source": [
285 | "x.shape"
286 | ] 287 | }, 288 | { 289 | "cell_type": "markdown", 290 | "metadata": {}, 291 | "source": [
292 | "As expected, you can see that $\\bs{x}$ has only one dimension. The number corresponds to the length of the array:"
293 | ] 294 | }, 295 | { 296 | "cell_type": "code", 297 | "execution_count": 45, 298 | "metadata": {}, 299 | "outputs": [ 300 | { 301 | "data": { 302 | "text/plain": [
303 | "4"
304 | ] 305 | }, 306 | "execution_count": 45, 307 | "metadata": {}, 308 | "output_type": "execute_result" 309 | } 310 | ], 311 | "source": [
312 | "len(x)"
313 | ] 314 | },
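Two related attributes can make these checks more explicit; a quick sketch of mine (`ndim` and `size` are standard NumPy array attributes):

```python
import numpy as np

x = np.array([1, 2, 3, 4])
A = np.array([[1, 2], [3, 4], [5, 6]])

# ndim gives the number of dimensions, size the total number of values.
x.ndim, x.size  # (1, 4)
A.ndim, A.size  # (2, 6)
```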
315 | { 316 | "cell_type": "markdown", 317 | "metadata": {}, 318 | "source": [
319 | "# Transposition\n",
320 | "\n",
321 | "With transposition you can convert a row vector to a column vector and vice versa:\n",
322 | "\n",
324 | "Vector transposition\n",
325 | "\n",
326 | "The transpose $\\bs{A}^{\\text{T}}$ of the matrix $\\bs{A}$ is the matrix mirrored across its main diagonal. If the matrix is a square matrix (same number of columns and rows):\n",
327 | "\n",
329 | "Square matrix transposition\n",
330 | "\n",
331 | "If the matrix is not square the idea is the same:\n",
332 | "\n",
334 | "Non-square matrix transposition\n",
335 | "\n",
336 | "\n",
337 | "The superscript $^\\text{T}$ is used for transposed matrices.\n",
338 | "\n",
339 | "$$\n",
340 | "\\bs{A}=\n",
341 | "\\begin{bmatrix}\n",
342 | "    A_{1,1} & A_{1,2} \\\\\\\\\n",
343 | "    A_{2,1} & A_{2,2} \\\\\\\\\n",
344 | "    A_{3,1} & A_{3,2}\n",
345 | "\\end{bmatrix}\n",
346 | "$$\n",
347 | "\n",
348 | "$$\n",
349 | "\\bs{A}^{\\text{T}}=\n",
350 | "\\begin{bmatrix}\n",
351 | "    A_{1,1} & A_{2,1} & A_{3,1} \\\\\\\\\n",
352 | "    A_{1,2} & A_{2,2} & A_{3,2}\n",
353 | "\\end{bmatrix}\n",
354 | "$$\n",
355 | "\n",
356 | "The shape ($m \\times n$) is flipped and becomes ($n \\times m$).\n",
357 | "\n",
359 | "Dimensions of matrix transposition"
360 | ] 361 | }, 362 | { 363 | "cell_type": "markdown", 364 | "metadata": {}, 365 | "source": [
366 | "### Example 3.\n",
367 | "\n",
368 | "#### Create a matrix A and transpose it"
369 | ] 370 | }, 371 | { 372 | "cell_type": "code", 373 | "execution_count": 46, 374 | "metadata": {}, 375 | "outputs": [ 376 | { 377 | "data": { 378 | "text/plain": [
379 | "array([[1, 2],\n",
380 | "       [3, 4],\n",
381 | "       [5, 6]])"
382 | ] 383 | }, 384 | "execution_count": 46, 385 | "metadata": {}, 386 | "output_type": "execute_result" 387 | } 388 | ], 389 | "source": [
390 | "A = np.array([[1, 2], [3, 4], [5, 6]])\n",
391 | "A"
392 | ] 393 | }, 394 | { 395 | "cell_type": "code", 396 | "execution_count": 47, 397 | "metadata": {}, 398 | "outputs": [ 399 | { 400 | "data": { 401 | "text/plain": [
402 | "array([[1, 3, 5],\n",
403 | "       [2, 4, 6]])"
404 | ] 405 | }, 406 | "execution_count": 47, 407 | "metadata": {}, 408 | "output_type": "execute_result" 409 | } 410 | ], 411 | "source": [
412 | "A_t = A.T\n",
413 | "A_t"
414 | ] 415 | }, 416 | { 417 | "cell_type": "markdown", 418 | "metadata": {}, 419 | "source": [
420 | "We can check the dimensions of the matrices:"
421 | ] 422 | }, 423 | { 424 | "cell_type": "code", 425 | "execution_count": 48, 426 | "metadata": {}, 427 | "outputs": [ 428 | { 429 | "data": { 430 | "text/plain": [
431 | "(3, 2)"
432 | ] 433 | }, 434 | "execution_count": 48, 435 | "metadata": {}, 436 | "output_type": "execute_result" 437 | } 438 | ], 439 | "source": [
440 | "A.shape"
441 | ] 442 | }, 443 | { 444 | "cell_type": "code", 445 | "execution_count": 49, 446 | "metadata": {}, 447 | "outputs": [ 448 | { 449 | "data": { 450 | "text/plain": [
451 | "(2, 3)"
452 | ] 453 | }, 454 | "execution_count": 49, 455 | "metadata": {}, 456 | "output_type": "execute_result" 457 | } 458 | ], 459 | "source": [
460 | "A_t.shape"
461 | ] 462 | }, 463 | { 464 | "cell_type": "markdown", 465 | "metadata": {}, 466 | "source": [
467 | "We can see that the number of columns becomes the number of rows with transposition and vice versa."
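One caveat worth knowing here: `.T` only swaps axes that already exist, so it does nothing to a $1$-dimensional array. A sketch of my own showing how to get a real column vector out of our vector `x`:

```python
import numpy as np

x = np.array([1, 2, 3, 4])
x.T.shape        # (4,): unchanged, there is no second axis to swap

# Add an explicit second dimension to get a (4, 1) column vector.
x_col = x.reshape(-1, 1)
x_col.shape      # (4, 1)
x_col.T.shape    # (1, 4): now transposition behaves as expected
```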
468 | ] 469 | }, 470 | { 471 | "cell_type": "markdown", 472 | "metadata": {}, 473 | "source": [
474 | "# Addition\n",
475 | "\n",
477 | "Addition of two matrices\n",
478 | "\n",
479 | "Matrices can be added if they have the same shape:\n",
480 | "\n",
481 | "$$\\bs{A} + \\bs{B} = \\bs{C}$$\n",
482 | "\n",
483 | "Each cell of $\\bs{A}$ is added to the corresponding cell of $\\bs{B}$:\n",
484 | "\n",
485 | "$$\\bs{A}_{i,j} + \\bs{B}_{i,j} = \\bs{C}_{i,j}$$\n",
486 | "\n",
487 | "$i$ is the row index and $j$ the column index.\n",
488 | "\n",
489 | "$$\n",
490 | "\\begin{bmatrix}\n",
491 | "    A_{1,1} & A_{1,2} \\\\\\\\\n",
492 | "    A_{2,1} & A_{2,2} \\\\\\\\\n",
493 | "    A_{3,1} & A_{3,2}\n",
494 | "\\end{bmatrix}+\n",
495 | "\\begin{bmatrix}\n",
496 | "    B_{1,1} & B_{1,2} \\\\\\\\\n",
497 | "    B_{2,1} & B_{2,2} \\\\\\\\\n",
498 | "    B_{3,1} & B_{3,2}\n",
499 | "\\end{bmatrix}=\n",
500 | "\\begin{bmatrix}\n",
501 | "    A_{1,1} + B_{1,1} & A_{1,2} + B_{1,2} \\\\\\\\\n",
502 | "    A_{2,1} + B_{2,1} & A_{2,2} + B_{2,2} \\\\\\\\\n",
503 | "    A_{3,1} + B_{3,1} & A_{3,2} + B_{3,2}\n",
504 | "\\end{bmatrix}\n",
505 | "$$\n",
506 | "\n",
507 | "The shapes of $\\bs{A}$, $\\bs{B}$ and $\\bs{C}$ are identical. Let's check that in an example:"
508 | ] 509 | }, 510 | { 511 | "cell_type": "markdown", 512 | "metadata": {}, 513 | "source": [
514 | "### Example 4.\n",
515 | "\n",
516 | "#### Create two matrices A and B and add them\n",
517 | "\n",
518 | "With Numpy you can add matrices just as you would add vectors or scalars."
519 | ] 520 | }, 521 | { 522 | "cell_type": "code", 523 | "execution_count": 50, 524 | "metadata": {}, 525 | "outputs": [ 526 | { 527 | "data": { 528 | "text/plain": [
529 | "array([[1, 2],\n",
530 | "       [3, 4],\n",
531 | "       [5, 6]])"
532 | ] 533 | }, 534 | "execution_count": 50, 535 | "metadata": {}, 536 | "output_type": "execute_result" 537 | } 538 | ], 539 | "source": [
540 | "A = np.array([[1, 2], [3, 4], [5, 6]])\n",
541 | "A"
542 | ] 543 | }, 544 | { 545 | "cell_type": "code", 546 | "execution_count": 51, 547 | "metadata": {}, 548 | "outputs": [ 549 | { 550 | "data": { 551 | "text/plain": [
552 | "array([[2, 5],\n",
553 | "       [7, 4],\n",
554 | "       [4, 3]])"
555 | ] 556 | }, 557 | "execution_count": 51, 558 | "metadata": {}, 559 | "output_type": "execute_result" 560 | } 561 | ], 562 | "source": [
563 | "B = np.array([[2, 5], [7, 4], [4, 3]])\n",
564 | "B"
565 | ] 566 | }, 567 | { 568 | "cell_type": "code", 569 | "execution_count": 52, 570 | "metadata": {}, 571 | "outputs": [ 572 | { 573 | "data": { 574 | "text/plain": [
575 | "array([[ 3,  7],\n",
576 | "       [10,  8],\n",
577 | "       [ 9,  9]])"
578 | ] 579 | }, 580 | "execution_count": 52, 581 | "metadata": {}, 582 | "output_type": "execute_result" 583 | } 584 | ], 585 | "source": [
586 | "# Add matrices A and B\n",
587 | "C = A + B\n",
588 | "C"
589 | ] 590 | },
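NumPy enforces the same-shape requirement. A minimal sketch of mine showing what happens when the shapes are incompatible:

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])     # shape (3, 2)

try:
    A + np.array([[1, 2, 3], [4, 5, 6]])   # shape (2, 3): incompatible
except ValueError as e:
    print(e)  # operands could not be broadcast together ...
```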
591 | { 592 | "cell_type": "markdown", 593 | "metadata": {}, 594 | "source": [
595 | "It is also possible to add a scalar to a matrix. This means adding this scalar to each cell of the matrix.\n",
596 | "\n",
597 | "$$\n",
598 | "\\alpha + \\begin{bmatrix}\n",
599 | "    A_{1,1} & A_{1,2} \\\\\\\\\n",
600 | "    A_{2,1} & A_{2,2} \\\\\\\\\n",
601 | "    A_{3,1} & A_{3,2}\n",
602 | "\\end{bmatrix}=\n",
603 | "\\begin{bmatrix}\n",
604 | "    \\alpha + A_{1,1} & \\alpha + A_{1,2} \\\\\\\\\n",
605 | "    \\alpha + A_{2,1} & \\alpha + A_{2,2} \\\\\\\\\n",
606 | "    \\alpha + A_{3,1} & \\alpha + A_{3,2}\n",
607 | "\\end{bmatrix}\n",
608 | "$$"
609 | ] 610 | }, 611 | { 612 | "cell_type": "markdown", 613 | "metadata": {}, 614 | "source": [
615 | "### Example 5.\n",
616 | "\n",
617 | "#### Add a scalar to a matrix"
618 | ] 619 | }, 620 | { 621 | "cell_type": "code", 622 | "execution_count": 53, 623 | "metadata": {}, 624 | "outputs": [ 625 | { 626 | "data": { 627 | "text/plain": [
628 | "array([[1, 2],\n",
629 | "       [3, 4],\n",
630 | "       [5, 6]])"
631 | ] 632 | }, 633 | "execution_count": 53, 634 | "metadata": {}, 635 | "output_type": "execute_result" 636 | } 637 | ], 638 | "source": [
639 | "A"
640 | ] 641 | }, 642 | { 643 | "cell_type": "code", 644 | "execution_count": 54, 645 | "metadata": {}, 646 | "outputs": [ 647 | { 648 | "data": { 649 | "text/plain": [
650 | "array([[ 5,  6],\n",
651 | "       [ 7,  8],\n",
652 | "       [ 9, 10]])"
653 | ] 654 | }, 655 | "execution_count": 54, 656 | "metadata": {}, 657 | "output_type": "execute_result" 658 | } 659 | ], 660 | "source": [
661 | "# Example: Add 4 to the matrix A\n",
662 | "C = A + 4\n",
663 | "C"
664 | ] 665 | }, 666 | { 667 | "cell_type": "markdown", 668 | "metadata": {}, 669 | "source": [
670 | "# Broadcasting\n",
671 | "\n",
672 | "Numpy can handle operations on arrays of different shapes. The smaller array will be extended to match the shape of the bigger one. The advantage is that this is done in `C` under the hood (like any vectorized operation in Numpy). Actually, we used broadcasting in Example 5: the scalar was converted into an array of the same shape as $\\bs{A}$.\n",
673 | "\n",
674 | "Here is another generic example:\n",
675 | "\n",
676 | "$$\n",
677 | "\\begin{bmatrix}\n",
678 | "    A_{1,1} & A_{1,2} \\\\\\\\\n",
679 | "    A_{2,1} & A_{2,2} \\\\\\\\\n",
680 | "    A_{3,1} & A_{3,2}\n",
681 | "\\end{bmatrix}+\n",
682 | "\\begin{bmatrix}\n",
683 | "    B_{1,1} \\\\\\\\\n",
684 | "    B_{2,1} \\\\\\\\\n",
685 | "    B_{3,1}\n",
686 | "\\end{bmatrix}\n",
687 | "$$\n",
688 | "\n",
689 | "is equivalent to\n",
690 | "\n",
691 | "$$\n",
692 | "\\begin{bmatrix}\n",
693 | "    A_{1,1} & A_{1,2} \\\\\\\\\n",
694 | "    A_{2,1} & A_{2,2} \\\\\\\\\n",
695 | "    A_{3,1} & A_{3,2}\n",
696 | "\\end{bmatrix}+\n",
697 | "\\begin{bmatrix}\n",
698 | "    B_{1,1} & B_{1,1} \\\\\\\\\n",
699 | "    B_{2,1} & B_{2,1} \\\\\\\\\n",
700 | "    B_{3,1} & B_{3,1}\n",
701 | "\\end{bmatrix}=\n",
702 | "\\begin{bmatrix}\n",
703 | "    A_{1,1} + B_{1,1} & A_{1,2} + B_{1,1} \\\\\\\\\n",
704 | "    A_{2,1} + B_{2,1} & A_{2,2} + B_{2,1} \\\\\\\\\n",
705 | "    A_{3,1} + B_{3,1} & A_{3,2} + B_{3,1}\n",
706 | "\\end{bmatrix}\n",
707 | "$$\n",
708 | "\n",
709 | "where the ($3 \\times 1$) matrix is converted to the right shape ($3 \\times 2$) by copying its single column. Numpy will do that automatically if the shapes can match."
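To convince yourself that broadcasting really is equivalent to this explicit copy, you can materialize the repetition with `np.tile` and compare; a sketch of mine, not part of the original notebook:

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])  # shape (3, 2)
B = np.array([[2], [4], [6]])           # shape (3, 1)

# np.tile(B, (1, 2)) explicitly repeats B's single column twice,
# producing the (3, 2) matrix that broadcasting builds implicitly.
np.array_equal(A + B, A + np.tile(B, (1, 2)))  # True
```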
710 | ] 711 | }, 712 | { 713 | "cell_type": "markdown", 714 | "metadata": {}, 715 | "source": [
716 | "### Example 6.\n",
717 | "\n",
718 | "#### Add two matrices of different shapes"
719 | ] 720 | }, 721 | { 722 | "cell_type": "code", 723 | "execution_count": 55, 724 | "metadata": {}, 725 | "outputs": [ 726 | { 727 | "data": { 728 | "text/plain": [
729 | "array([[1, 2],\n",
730 | "       [3, 4],\n",
731 | "       [5, 6]])"
732 | ] 733 | }, 734 | "execution_count": 55, 735 | "metadata": {}, 736 | "output_type": "execute_result" 737 | } 738 | ], 739 | "source": [
740 | "A = np.array([[1, 2], [3, 4], [5, 6]])\n",
741 | "A"
742 | ] 743 | }, 744 | { 745 | "cell_type": "code", 746 | "execution_count": 56, 747 | "metadata": {}, 748 | "outputs": [ 749 | { 750 | "data": { 751 | "text/plain": [
752 | "array([[2],\n",
753 | "       [4],\n",
754 | "       [6]])"
755 | ] 756 | }, 757 | "execution_count": 56, 758 | "metadata": {}, 759 | "output_type": "execute_result" 760 | } 761 | ], 762 | "source": [
763 | "B = np.array([[2], [4], [6]])\n",
764 | "B"
765 | ] 766 | }, 767 | { 768 | "cell_type": "code", 769 | "execution_count": 57, 770 | "metadata": {}, 771 | "outputs": [ 772 | { 773 | "data": { 774 | "text/plain": [
775 | "array([[ 3,  4],\n",
776 | "       [ 7,  8],\n",
777 | "       [11, 12]])"
778 | ] 779 | }, 780 | "execution_count": 57, 781 | "metadata": {}, 782 | "output_type": "execute_result" 783 | } 784 | ], 785 | "source": [
786 | "# Broadcasting\n",
787 | "C = A + B\n",
788 | "C"
789 | ] 790 | }, 791 | { 792 | "cell_type": "markdown", 793 | "metadata": {}, 794 | "source": [
795 | "You can find basic operations on matrices simply explained [here](https://www.mathsisfun.com/algebra/matrix-introduction.html)."
796 | ] 797 | }, 798 | { 799 | "cell_type": "markdown", 800 | "metadata": {}, 801 | "source": [
802 | "\n",
803 | " Feel free to drop me an email or a comment. The syllabus of this series can be found [in the introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/). 
All the notebooks can be found on [Github](https://github.com/hadrienj/deepLearningBook-Notes).\n", 804 | "" 805 | ] 806 | }, 807 | { 808 | "cell_type": "markdown", 809 | "metadata": {}, 810 | "source": [ 811 | "# References\n", 812 | "\n", 813 | "- [Broadcasting in Numpy](https://docs.scipy.org/doc/numpy-1.13.0/user/basics.broadcasting.html)\n", 814 | "\n", 815 | "- [Discussion on Arrays and matrices](https://stackoverflow.com/questions/4151128/what-are-the-differences-between-numpy-arrays-and-matrices-which-one-should-i-u)\n", 816 | "\n", 817 | "- [Math is fun - Matrix introduction](https://www.mathsisfun.com/algebra/matrix-introduction.html)" 818 | ] 819 | } 820 | ], 821 | "metadata": { 822 | "kernelspec": { 823 | "display_name": "Python 3", 824 | "language": "python", 825 | "name": "python3" 826 | }, 827 | "language_info": { 828 | "codemirror_mode": { 829 | "name": "ipython", 830 | "version": 3 831 | }, 832 | "file_extension": ".py", 833 | "mimetype": "text/x-python", 834 | "name": "python", 835 | "nbconvert_exporter": "python", 836 | "pygments_lexer": "ipython3", 837 | "version": "3.7.2" 838 | }, 839 | "toc": { 840 | "base_numbering": 1, 841 | "nav_menu": {}, 842 | "number_sections": false, 843 | "sideBar": true, 844 | "skip_h1_title": false, 845 | "title_cell": "Table of Contents", 846 | "title_sidebar": "Contents", 847 | "toc_cell": false, 848 | "toc_position": {}, 849 | "toc_section_display": true, 850 | "toc_window_display": false 851 | } 852 | }, 853 | "nbformat": 4, 854 | "nbformat_minor": 2 855 | } 856 | -------------------------------------------------------------------------------- /2.1 Scalars, Vectors, Matrices and Tensors/images/dimensions-transposition-matrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.1 Scalars, Vectors, Matrices and Tensors/images/dimensions-transposition-matrix.png -------------------------------------------------------------------------------- /2.1 Scalars, Vectors, Matrices and Tensors/images/matrix-addition.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.1 Scalars, Vectors, Matrices and Tensors/images/matrix-addition.png -------------------------------------------------------------------------------- /2.1 Scalars, Vectors, Matrices and Tensors/images/non-squared-matrix-transposition.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.1 Scalars, Vectors, Matrices and Tensors/images/non-squared-matrix-transposition.png -------------------------------------------------------------------------------- /2.1 Scalars, Vectors, Matrices and Tensors/images/scalar-vector-matrix-tensor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.1 Scalars, Vectors, Matrices and Tensors/images/scalar-vector-matrix-tensor.png -------------------------------------------------------------------------------- /2.1 Scalars, Vectors, Matrices and Tensors/images/square-matrix-transposition.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.1 Scalars, Vectors, Matrices and Tensors/images/square-matrix-transposition.png
--------------------------------------------------------------------------------
/2.1 Scalars, Vectors, Matrices and Tensors/images/vector-transposition.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.1 Scalars, Vectors, Matrices and Tensors/images/vector-transposition.png
--------------------------------------------------------------------------------
/2.10 The Trace Operator/2.10 The Trace Operator.ipynb:
--------------------------------------------------------------------------------
1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 7, 6 | "metadata": { 7 | "collapsed": true, 8 | "scrolled": false 9 | }, 10 | "outputs": [], 11 | "source": [
12 | "import numpy as np\n",
13 | "import matplotlib.pyplot as plt\n",
14 | "import seaborn as sns"
15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": 8, 20 | "metadata": {}, 21 | "outputs": [ 22 | { 23 | "name": "stdout", 24 | "output_type": "stream", 25 | "text": [
26 | "Populating the interactive namespace from numpy and matplotlib\n"
27 | ] 28 | } 29 | ], 30 | "source": [
31 | "# Plot style\n",
32 | "sns.set()\n",
33 | "%pylab inline\n",
34 | "pylab.rcParams['figure.figsize'] = (4, 4)"
35 | ] 36 | }, 37 | { 38 | "cell_type": "code", 39 | "execution_count": 1, 40 | "metadata": {}, 41 | "outputs": [ 42 | { 43 | "data": { 44 | "text/html": [
45 | "" 72 | ],
73 | "text/plain": [ 74 | "" 75 | ] 76 | }, 77 | "metadata": {}, 78 | "output_type": "display_data" 79 | } 80 | ], 81 | "source": [
82 | "%%html\n",
83 | "" 110 | ]
111 | }, 112 | { 113 | "cell_type": "markdown", 114 | "metadata": {}, 115 | "source": [
116 | "$$\n",
117 | "\\newcommand\\norm[1]{\\left\\lVert#1\\right\\rVert} \n",
118 | "\\DeclareMathOperator{\\Tr}{Tr}\n",
119 | "\\newcommand\\bs[1]{\\boldsymbol{#1}}\n",
120 | "$$"
121 | ] 122 | }, 123 | { 124 | "cell_type": "markdown", 125 | "metadata": {}, 126 | "source": [
127 | "\n",
128 | " This content is part of a series following the chapter 2 on linear algebra from the [Deep Learning Book](http://www.deeplearningbook.org/) by Goodfellow, I., Bengio, Y., and Courville, A. (2016). It aims to provide intuitions/drawings/python code on mathematical theories and reflects my own understanding of these concepts. You can check the syllabus in the [introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/).\n",
129 | "" 130 | ] 131 | }, 132 | { 133 | "cell_type": "markdown", 134 | "metadata": {}, 135 | "source": [
136 | "# Introduction\n",
137 | "\n",
138 | "This chapter is very light! I can assure you that you will read it in 1 minute! It is a nice break after the last two chapters, which were quite big! We will see what the trace of a matrix is. It will be needed for the last chapter on the Principal Component Analysis (PCA)."
139 | ] 140 | }, 141 | { 142 | "cell_type": "markdown", 143 | "metadata": {}, 144 | "source": [
145 | "# 2.10 The Trace Operator"
146 | ] 147 | }, 148 | { 149 | "cell_type": "markdown", 150 | "metadata": {}, 151 | "source": [
153 | "The trace of a matrix\n",
154 | "\n",
155 | "\n",
156 | "The trace is the sum of all values in the diagonal of a square matrix.\n",
157 | "\n",
158 | "$$\n",
159 | "\\bs{A}=\n",
160 | "\\begin{bmatrix}\n",
161 | "    2 & 9 & 8 \\\\\\\\\n",
162 | "    4 & 7 & 1 \\\\\\\\\n",
163 | "    8 & 2 & 5\n",
164 | "\\end{bmatrix}\n",
165 | "$$\n",
166 | "\n",
167 | "$$\n",
168 | "\\mathrm{Tr}(\\bs{A}) = 2 + 7 + 5 = 14\n",
169 | "$$\n",
170 | "\n",
171 | "Numpy provides the function `trace()` to calculate it:"
172 | ] 173 | }, 174 | { 175 | "cell_type": "code", 176 | "execution_count": 10, 177 | "metadata": {}, 178 | "outputs": [ 179 | { 180 | "data": { 181 | "text/plain": [
182 | "array([[2, 9, 8],\n",
183 | "       [4, 7, 1],\n",
184 | "       [8, 2, 5]])"
185 | ] 186 | }, 187 | "execution_count": 10, 188 | "metadata": {}, 189 | "output_type": "execute_result" 190 | } 191 | ], 192 | "source": [
193 | "A = np.array([[2, 9, 8], [4, 7, 1], [8, 2, 5]])\n",
194 | "A"
195 | ] 196 | }, 197 | { 198 | "cell_type": "code", 199 | "execution_count": 11, 200 | "metadata": {}, 201 | "outputs": [ 202 | { 203 | "data": { 204 | "text/plain": [
205 | "14"
206 | ] 207 | }, 208 | "execution_count": 11, 209 | "metadata": {}, 210 | "output_type": "execute_result" 211 | } 212 | ], 213 | "source": [
214 | "A_tr = np.trace(A)\n",
215 | "A_tr"
216 | ] 217 | }, 218 | { 219 | "cell_type": "markdown", 220 | "metadata": {}, 221 | "source": [
222 | "Goodfellow et al. explain that the trace can be used to specify the Frobenius norm of a matrix (see [2.5](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.5-Norms/)). The Frobenius norm is the equivalent of the $L^2$ norm for matrices. It is defined by:\n",
223 | "\n",
224 | "$$\n",
225 | "\\norm{\\bs{A}}_F=\\sqrt{\\sum_{i,j}A^2_{i,j}}\n",
226 | "$$\n",
227 | "\n",
228 | "Square all the elements, sum them, and take the square root of the result. This norm can also be calculated with:\n",
229 | "\n",
230 | "$$\n",
231 | "\\norm{\\bs{A}}_F=\\sqrt{\\Tr({\\bs{AA}^T})}\n",
232 | "$$\n",
233 | "\n",
234 | "We can check this. 
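First, a check straight from the definition; this is a sketch of mine, reusing the matrix `A` defined above:

```python
import numpy as np

A = np.array([[2, 9, 8], [4, 7, 1], [8, 2, 5]])

# Square every entry, sum them, take the square root.
np.sqrt(np.sum(A ** 2))  # 17.549928774784245
```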
The norm can be computed directly with the command `np.linalg.norm()`:"
235 | ] 236 | }, 237 | { 238 | "cell_type": "code", 239 | "execution_count": 12, 240 | "metadata": {}, 241 | "outputs": [ 242 | { 243 | "data": { 244 | "text/plain": [
245 | "17.549928774784245"
246 | ] 247 | }, 248 | "execution_count": 12, 249 | "metadata": {}, 250 | "output_type": "execute_result" 251 | } 252 | ], 253 | "source": [
254 | "np.linalg.norm(A)"
255 | ] 256 | }, 257 | { 258 | "cell_type": "markdown", 259 | "metadata": {}, 260 | "source": [
261 | "The Frobenius norm of $\\bs{A}$ is 17.549928774784245.\n",
262 | "\n",
263 | "With the trace the result is identical:"
264 | ] 265 | }, 266 | { 267 | "cell_type": "code", 268 | "execution_count": 13, 269 | "metadata": {}, 270 | "outputs": [ 271 | { 272 | "data": { 273 | "text/plain": [
274 | "17.549928774784245"
275 | ] 276 | }, 277 | "execution_count": 13, 278 | "metadata": {}, 279 | "output_type": "execute_result" 280 | } 281 | ], 282 | "source": [
283 | "np.sqrt(np.trace(A.dot(A.T)))"
284 | ] 285 | }, 286 | { 287 | "cell_type": "markdown", 288 | "metadata": {}, 289 | "source": [
290 | "Since the transposition of a matrix doesn't change the diagonal, the trace of the matrix is equal to the trace of its transpose:\n",
291 | "\n",
292 | "$$\n",
293 | "\\Tr(\\bs{A})=\\Tr(\\bs{A}^T)\n",
294 | "$$"
295 | ] 296 | }, 297 | { 298 | "cell_type": "markdown", 299 | "metadata": {}, 300 | "source": [
301 | "## Trace of a product\n",
302 | "\n",
303 | "$$\n",
304 | "\\Tr(\\bs{ABC}) = \\Tr(\\bs{CAB}) = \\Tr(\\bs{BCA})\n",
305 | "$$\n",
306 | "\n",
307 | "\n",
308 | "### Example 1.\n",
309 | "\n",
310 | "Let's see an example of this property.\n",
311 | "\n",
312 | "$$\n",
313 | "\\bs{A}=\n",
314 | "\\begin{bmatrix}\n",
315 | "    4 & 12 \\\\\\\\\n",
316 | "    7 & 6\n",
317 | "\\end{bmatrix}\n",
318 | "$$\n",
319 | "\n",
320 | "$$\n",
321 | "\\bs{B}=\n",
322 | "\\begin{bmatrix}\n",
323 | "    1 & -3 \\\\\\\\\n",
324 | "    4 & 3\n",
325 | "\\end{bmatrix}\n",
326 | "$$\n",
327 | "\n",
328 | "$$\n",
329 | "\\bs{C}=\n",
330 | "\\begin{bmatrix}\n",
331 | "    6 & 6 \\\\\\\\\n",
332 | "    2 & 5\n",
333 | "\\end{bmatrix}\n",
334 | "$$"
335 | ] 336 | }, 337 | { 338 | "cell_type": "code", 339 | "execution_count": 14, 340 | "metadata": {}, 341 | "outputs": [ 342 | { 343 | "data": { 344 | "text/plain": [
345 | "531"
346 | ] 347 | }, 348 | "execution_count": 14, 349 | "metadata": {}, 350 | "output_type": "execute_result" 351 | } 352 | ], 353 | "source": [
354 | "A = np.array([[4, 12], [7, 6]])\n",
355 | "B = np.array([[1, -3], [4, 3]])\n",
356 | "C = np.array([[6, 6], [2, 5]])\n",
357 | "\n",
358 | "np.trace(A.dot(B).dot(C))"
359 | ] 360 | }, 361 | { 362 | "cell_type": "code", 363 | "execution_count": 15, 364 | "metadata": {}, 365 | "outputs": [ 366 | { 367 | "data": { 368 | "text/plain": [
369 | "531"
370 | ] 371 | }, 372 | "execution_count": 15, 373 | "metadata": {}, 374 | "output_type": "execute_result" 375 | } 376 | ], 377 | "source": [
378 | "np.trace(C.dot(A).dot(B))"
379 | ] 380 | }, 381 | { 382 | "cell_type": "code", 383 | "execution_count": 16, 384 | "metadata": {}, 385 | "outputs": [ 386 | { 387 | "data": { 388 | "text/plain": [
389 | "531"
390 | ] 391 | }, 392 | "execution_count": 16, 393 | "metadata": {}, 394 | "output_type": "execute_result" 395 | } 396 | ], 397 | "source": [
398 | "np.trace(B.dot(C).dot(A))"
399 | ] 400 | }, 401 | { 402 | "cell_type": "markdown", 403 | "metadata": {}, 404 | "source": [
405 | "$$\n",
406 | "\\bs{ABC}=\n",
407 | "\\begin{bmatrix}\n",
408 | 
" 360 & 432 \\\\\\\\\n", 409 | " 180 & 171\n", 410 | "\\end{bmatrix}\n", 411 | "$$\n", 412 | "\n", 413 | "$$\n", 414 | "\\bs{CAB}=\n", 415 | "\\begin{bmatrix}\n", 416 | " 498 & 126 \\\\\\\\\n", 417 | " 259 & 33\n", 418 | "\\end{bmatrix}\n", 419 | "$$\n", 420 | "\n", 421 | "$$\n", 422 | "\\bs{BCA}=\n", 423 | "\\begin{bmatrix}\n", 424 | " -63 & -54 \\\\\\\\\n", 425 | " 393 & 594\n", 426 | "\\end{bmatrix}\n", 427 | "$$\n", 428 | "\n", 429 | "$$\n", 430 | "\\Tr(\\bs{ABC}) = \\Tr(\\bs{CAB}) = \\Tr(\\bs{BCA}) = 531\n", 431 | "$$" 432 | ] 433 | }, 434 | { 435 | "cell_type": "markdown", 436 | "metadata": {}, 437 | "source": [ 438 | "\n", 439 | " Feel free to drop me an email or a comment. The syllabus of this series can be found [in the introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/). All the notebooks can be found on [Github](https://github.com/hadrienj/deepLearningBook-Notes).\n", 440 | "" 441 | ] 442 | }, 443 | { 444 | "cell_type": "markdown", 445 | "metadata": {}, 446 | "source": [ 447 | "# References\n", 448 | "\n", 449 | "[Trace (linear algebra) - Wikipedia](https://en.wikipedia.org/wiki/Trace_(linear_algebra))\n", 450 | "\n", 451 | "[Numpy Trace operator](https://docs.scipy.org/doc/numpy/reference/generated/numpy.trace.html)" 452 | ] 453 | } 454 | ], 455 | "metadata": { 456 | "kernelspec": { 457 | "display_name": "Python 2", 458 | "language": "python", 459 | "name": "python2" 460 | }, 461 | "language_info": { 462 | "codemirror_mode": { 463 | "name": "ipython", 464 | "version": 2 465 | }, 466 | "file_extension": ".py", 467 | "mimetype": "text/x-python", 468 | "name": "python", 469 | "nbconvert_exporter": "python", 470 | "pygments_lexer": "ipython2", 471 | "version": "2.7.10" 472 | } 473 | }, 474 | "nbformat": 4, 475 | "nbformat_minor": 2 476 | } 477 | -------------------------------------------------------------------------------- /2.10 The Trace Operator/images/trace-matrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.10 The Trace Operator/images/trace-matrix.png -------------------------------------------------------------------------------- /2.11 The determinant/2.11 The determinant.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 14, 6 | "metadata": { 7 | "collapsed": true, 8 | "scrolled": false 9 | }, 10 | "outputs": [], 11 | "source": [ 12 | "import numpy as np\n", 13 | "import matplotlib.pyplot as plt\n", 14 | "import seaborn as sns" 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": 15, 20 | "metadata": {}, 21 | "outputs": [ 22 | { 23 | "name": "stdout", 24 | "output_type": "stream", 25 | "text": [ 26 | "Populating the interactive namespace from numpy and matplotlib\n" 27 | ] 28 | } 29 | ], 30 | "source": [ 31 | "# Plot style\n", 32 | "sns.set()\n", 33 | "%pylab inline\n", 34 | "pylab.rcParams['figure.figsize'] = (4, 4)" 35 | ] 36 | }, 37 | { 38 | "cell_type": "code", 39 | "execution_count": 1, 40 | "metadata": {}, 41 | "outputs": [ 42 | { 43 | "data": { 44 | "text/html": [ 45 | "" 72 | ], 73 | "text/plain": [ 74 | "" 75 | ] 76 | }, 77 | "metadata": {}, 78 | "output_type": "display_data" 79 | } 80 | ], 81 | "source": [ 82 | "%%html\n", 83 | "" 110 | ] 111 | }, 112 | { 113 | "cell_type": "code", 114 | "execution_count": 17, 115 | "metadata": { 116 | 
"collapsed": true 117 | }, 118 | "outputs": [], 119 | "source": [ 120 | "def plotVectors(vecs, cols, alpha=1):\n", 121 | " \"\"\"\n", 122 | " Plot set of vectors.\n", 123 | "\n", 124 | " Parameters\n", 125 | " ----------\n", 126 | " vecs : array-like\n", 127 | " Coordinates of the vectors to plot. Each vectors is in an array. For\n", 128 | " instance: [[1, 3], [2, 2]] can be used to plot 2 vectors.\n", 129 | " cols : array-like\n", 130 | " Colors of the vectors. For instance: ['red', 'blue'] will display the\n", 131 | " first vector in red and the second in blue.\n", 132 | " alpha : float\n", 133 | " Opacity of vectors\n", 134 | "\n", 135 | " Returns:\n", 136 | "\n", 137 | " fig : instance of matplotlib.figure.Figure\n", 138 | " The figure of the vectors\n", 139 | " \"\"\"\n", 140 | " plt.axvline(x=0, color='#A9A9A9', zorder=0)\n", 141 | " plt.axhline(y=0, color='#A9A9A9', zorder=0)\n", 142 | "\n", 143 | " for i in range(len(vecs)):\n", 144 | " if (isinstance(alpha, list)):\n", 145 | " alpha_i = alpha[i]\n", 146 | " else:\n", 147 | " alpha_i = alpha\n", 148 | " x = np.concatenate([[0,0],vecs[i]])\n", 149 | " plt.quiver([x[0]],\n", 150 | " [x[1]],\n", 151 | " [x[2]],\n", 152 | " [x[3]],\n", 153 | " angles='xy', scale_units='xy', scale=1, color=cols[i],\n", 154 | " alpha=alpha_i)" 155 | ] 156 | }, 157 | { 158 | "cell_type": "markdown", 159 | "metadata": {}, 160 | "source": [ 161 | "$$\n", 162 | "\\newcommand\\norm[1]{\\left\\lVert#1\\right\\rVert} \n", 163 | "\\DeclareMathOperator{\\Tr}{Tr}\n", 164 | "\\newcommand\\bs[1]{\\boldsymbol{#1}}\n", 165 | "$$" 166 | ] 167 | }, 168 | { 169 | "cell_type": "markdown", 170 | "metadata": {}, 171 | "source": [ 172 | "\n", 173 | " This content is part of a series following the chapter 2 on linear algebra from the [Deep Learning Book](http://www.deeplearningbook.org/) by Goodfellow, I., Bengio, Y., and Courville, A. (2016). It aims to provide intuitions/drawings/python code on mathematical theories and is constructed as my understanding of these concepts. You can check the syllabus in the [introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/).\n", 174 | "" 175 | ] 176 | }, 177 | { 178 | "cell_type": "markdown", 179 | "metadata": {}, 180 | "source": [ 181 | "# Introduction\n", 182 | "\n", 183 | "This chapter is also very light! We will see what is the meaning of the determinant of a matrix. This special number can tell us a lot of things about our matrix!" 184 | ] 185 | }, 186 | { 187 | "cell_type": "markdown", 188 | "metadata": {}, 189 | "source": [ 190 | "# 2.11 The determinant" 191 | ] 192 | }, 193 | { 194 | "cell_type": "markdown", 195 | "metadata": {}, 196 | "source": [ 197 | "We saw in [2.8](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.8-Singular-Value-Decomposition/) that a matrix can be seen as a linear transformation of the space. The determinant of a matrix $\\bs{A}$ is a number corresponding to the *multiplicative change* you get when you transform your space with this matrix (see a comment by Pete L. Clark in [this SE question](https://math.stackexchange.com/questions/668/whats-an-intuitive-way-to-think-about-the-determinant)). A negative determinant means that there is a change in orientation (and not just a rescaling and/or a rotation). 
As outlined by Nykamp DQ on [Math Insight](https://mathinsight.org/determinant_linear_transformation), a change in orientation means, for instance in 2D, that we take the plane out of these 2 dimensions, flip it over, and put it back into the initial 2D space. Here is an example distinguishing between a positive and a negative determinant:\n",
198 | "\n",
200 | "The determinant of a matrix can tell you a lot of things about the transformation associated with this matrix\n",
201 | "\n",
202 | "You can see that the second transformation can't be obtained through rotation and rescaling. Thus the sign can tell you the nature of the transformation associated with the matrix!\n",
203 | "\n",
204 | "In addition, the determinant also gives you the *amount* of transformation. If you take the *n*-dimensional unit cube and apply the matrix $\\bs{A}$ to it, the absolute value of the determinant corresponds to the area of the transformed figure. You might believe me more easily after the following example.\n",
205 | "\n",
206 | "### Example 1.\n",
207 | "\n",
208 | "To calculate the area of the shapes, we will use simple squares in 2 dimensions. The area of the unit square is just the product of the lengths of its sides, the two unit vectors.\n",
209 | "\n",
211 | "The unit square area\n",
212 | "\n",
213 | "The lengths of $i$ and $j$ are $1$, thus the area of the unit square is $1$.\n",
214 | "\n",
215 | "Let's start by creating both vectors in Python:"
216 | ] 217 | }, 218 | { 219 | "cell_type": "code", 220 | "execution_count": 18, 221 | "metadata": {}, 222 | "outputs": [ 223 | { 224 | "data": {
225 | "image/png": "[base64 PNG data omitted: plot of the two unit vectors, i in blue and j in orange]",
226 | "text/plain": [ 227 | "" 228 | ] 229 | }, 230 | "metadata": {}, 231 | "output_type": "display_data" 232 | } 233 | ], 234 | "source": [
235 | "orange = '#FF9A13'\n",
236 | "blue = '#1190FF'\n",
237 | " \n",
238 | "i = [0, 1]\n",
239 | "j = [1, 0]\n",
240 | "\n",
241 | "plotVectors([i, j], [[blue], [orange]])\n",
242 | "plt.xlim(-0.5, 3)\n",
243 | "plt.ylim(-0.5, 3)\n",
244 | "plt.show()"
245 | ] 246 | }, 247 | { 248 | "cell_type": "markdown", 249 | "metadata": {}, 250 | "source": [
251 | "We will apply\n",
252 | "\n",
253 | "$$\n",
254 | "\\bs{A}=\\begin{bmatrix}\n",
255 | "    2 & 0\\\\\\\\\n",
256 | "    0 & 2\n",
257 | "\\end{bmatrix}\n",
258 | "$$\n",
259 | "\n",
260 | "to $i$ and $j$. You can notice that this matrix is special: it is diagonal. So it will only rescale our space. No rotation here. More precisely, it will rescale each dimension the same way because the diagonal values are identical. Let's create the matrix $\\bs{A}$:"
261 | ] 262 | }, 263 | { 264 | "cell_type": "code", 265 | "execution_count": 19, 266 | "metadata": {}, 267 | "outputs": [ 268 | { 269 | "data": { 270 | "text/plain": [
271 | "array([[2, 0],\n",
272 | "       [0, 2]])"
273 | ] 274 | }, 275 | "execution_count": 19, 276 | "metadata": {}, 277 | "output_type": "execute_result" 278 | } 279 | ], 280 | "source": [
281 | "A = np.array([[2, 0], [0, 2]])\n",
282 | "A"
283 | ] 284 | }, 285 | { 286 | "cell_type": "markdown", 287 | "metadata": {}, 288 | "source": [
289 | "Now we will apply $\\bs{A}$ to our two unit vectors $i$ and $j$ and plot the resulting new vectors:"
290 | ] 291 | }, 292 | { 293 | "cell_type": "code", 294 | "execution_count": 20, 295 | "metadata": {}, 296 | "outputs": [ 297 | { 298 | "data": {
299 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAQ8AAAECCAYAAADgsVLgAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAERhJREFUeJzt3X+MXXWZx/H3lNpa6wwNcEF31RCDPBB/LbQr0i2sKP4I\nMgXiJsYNuKKIoGsCJmatBjbZiAuLYQMx6kr5lUBQcSWKlR9R6Eprg4FlIwT2oUiW3Sgsg0intS39\nwd0/7p31MpZ2+tyZORf6fiWT3nO+957n+bZ3PnPumXN6htrtNpK0t+Y03YCklybDQ1KJ4SGpxPCQ\nVGJ4SCoxPCSVzK28KCLmAFcCATwPnJOZD/WMjwIXANuBazJz5TT0KmmAVPc8RoF2Zi6jExJfmRiI\niLnAZcCJwLuAsyPi4D77lDRgSuGRmT8Azu4uHgr8rmf4SGB9Zo5n5nZgDXBcP01KGjyljy0Amfl8\nRFwLnAr8Vc/QCLChZ3kjsH+1jqTB1NcB08z8GHA4sDIiFnRXj9MJkAnDwLP91JE0eKoHTE8HXpeZ\nFwNbgZ3dL4CHgcMiYhGwGTgeuHRP22y32+2hoaFKO5L6U/rGG6pcGBcRrwKuAV5DJ4AuBl4NLMzM\nlRHxQeDvu01dlZnfnMJm22NjG/e6l+mwbt1qli8fpan6AK3WcGP1m6xt/YGoXwqP0p5HZm4GPryb\n8VXAqsq2Jb00eJKYpBLDQ1KJ4SGpxPCQVGJ4SCoxPCSVGB6SSgwPSSWGh6QSw0NSieEhqcTwkFRi\neEgqMTwklRgekkoMD0klhoekEsNDUonhIanE8JBUYnhIKjE8JJUYHpJKDA9JJdXbTc4FrgYOBeYB\nF2XmLT3j5wOfAJ7qrvpUZq7vr1VJg6QUHsDpwNOZ+dGIOAC4H7ilZ/xo4IzMvL/fBiUNpmp4fBe4\nqft4CNg+aXwxsCIiXgus6t4QW9LLSOmYR2ZuzszfR8QwnRD50qSn3AicA5wALIuIk/prU9KgGWq3\n26UXRsTrge8DX8vM6yaNjWTmePfxucABmXnRHjZZa0RSv4YqL6oeMD0EuB34TGbeNWlsBHgwIo4A\ntgDvBq6aynbHxjZW2unbunWrWb58tLH6AK3WcGP1m6xt/cGoX1E95rECWARcEBEX0tlruBJYmJkr\nI2IFsBrYCvw0M28r1pE0oErhkZnnAeftZvwG4IZqU5IGnyeJSSoxPCSVGB6SSgwPSSWGh6QSw0NS\nieEhqcTwkFRieEgqMTwklRgekkoMD0klhoekEsNDUonhIanE8JBUYnhIKjE8JJUYHpJKDA9JJYaH\npBLDQ1KJ4SGpxPCQVFK93eRc4GrgUGAecFFm3tIzPgpcAGwHrsnMlf23KmmQVPc8TgeezszjgZOA\nr00MdIPlMuBE4F3A2RFxcJ99Show1fD4Lp09C+jcYXt7z9iRwPrMHM/M7cAa4Lh6i5IGUfVetZsB\nImIYuAn4Us/wCLChZ3kjsH+1QUmDqRQeABHxeuD7wNcy8zs9Q+N0AmTCMPDsVLbZag1X2+nL8uWj\ntNvN1Z/QZP19ee7Wr6keMD0EuB34TGbeNWn4YeCwiFgEbAaOBy6dynbHxjZW2unbunWraR85ytJF\nzdSHzpunqfk3Wdv6g1G/orrnsQJYBFwQERcCbeBKYGFmroyIzwF30DkesjIznyjWmRU72kNc+DO4\n/YMwb7+mu5FeGqrHPM4DztvN+CpgVbWp2ZZbD+Q3m+C+sf049jU7m25HeknwJDHgPzYfAsCdvy4f\nApL2Oft8eLTbfwiPu349l3a74Yakl4h9Pjwe3TCHsR0LAfifTXP41fg+/1ciTck+/51y129e+FHl\nLj+6SFOyz4fHz5/Yj8PmPwPAUQft5OdP+usWaSr26fBot+HSpVt538hjAHw0tvFPx271uIc0Bfv0\nPvrQELQWvDApJi9L2rV9es9DUp3hIanE8JBUYnhIKjE8JJUYHpJKDA9JJYaHpBLDQ1KJ4SGpxPCQ\nVGJ4SCoxPCSVGB6SSgwPSSWGh6QSw0NSSV//k1hEHANcnJknTFp/PvAJ4Knuqk9l5vp+akkaLP3c\n6PrzwBnApl0MHw2ckZn3V7cvabD187HlUeC0FxlbDKyIiLsj4gt91JA0oMrhkZk3AzteZPhG4Bzg\nBGBZRJxUrSNpMM3U/55+eWaOA0TEKuAo4Md7elGrNTxD7eze4iVL4FYYGVlAq9VIC0Bz82+6tvWb\nr18xHeEx1LsQESPAgxFxBLAFeDdw1VQ2NDa2cRra2Xv33ZvAEsbHtzA29mI7UzOr1RpubP5N1rb+\nYNSvmI7waANExEeAhZm5MiJWAKuBrcBPM/O2aagjaYD0FR6Z+TiwtPv4xp71NwA39NeapEHmSWKS\nSgwPSSWGh6QSw0NSieEhqcTwkFRieEgqMTwklRgekkoMD0klhoekEsNDUonhIanE8JBUYnhIKjE8\nJJUYHpJKDA9JJYaHpBLDQ1KJ4SGpxPCQVGJ4SCoxPCSV9BUeEXFMRNy1i/WjEfGLiFgbEWf1U0PS\nYCqHR0R8HrgSmD9p/VzgMuBE4F3A2RFxcB89ShpA/ex5PAqctov1RwLrM3M8M7cDa4Dj+qgjaQCV\nwyMzbwZ2dUv5EWBDz/JGYP9qHUmDqa8bXb+IcToBMmEYeHYqL2y1hmegnT1bvGQJ3AojIwtotRpp\nAWhu/k3Xtn7z9SumIzyGJi0/DBwWEYuAzcDxwKVT2dDY2MZpaGfv3XdvAksYH9/C2NiudqZmXqs1\n3Nj8m6xt/cGoXzEd4dEGiIiPAAszc2VEfA64g06wrMzMJ6ahjqQB0ld4ZObjwNLu4xt71q8CVvXX\nmqRB5klikkoMD0klhoekEsNDUonhIanE8JBUYnhIKjE8JJUYHpJKDA9JJYaHpBLDQ1KJ4SGpxPCQ\nVGJ4SCoxPCSVGB6SSgwPSSWGh6QSw0NSieEhqcTwkFRieEgqMTwklZRu+hQRQ8DXgbcDW4GzMvOx\nnvHL6dwMauIeeqdkZnP305M07ap3jDsVmJ+ZSyPiGOCy7roJRwPvz8xn+m1Q0mCqfmxZBtwGkJn3\nAEsmBrp7JW8CvhURayLizL67lDRwquExAmzoWd4RERPbWghcAZwOfAD4dES8pd6ipEFU/dgyDgz3\nLM/JzOe7jzcDV2TmVoCIuJPOsZEH97TRVmt4T0+ZEYuXLIFbYWRkAa1WIy0Azc2/6drWb75+RTU8\n1gInA9+LiHcCD/SMHQ58OyKO6m5/GXDtVDY6NtbMMdX77k1gCePjWxgb29FID63WcGPzb7K29Qej\nfkU1PG4G3hsRa7vLZ0bE+cD6zPxRRFwP3ANsA67LzIeLdSQNqFJ4ZGYbOHfS6kd6xr8KfLWPviQN\nOE8Sk1RieEgqMTwklRgekkoMD0klhoekEsNDUonhIanE8JBUYnhIKjE8JJUYHpJKDA9JJYaHpBLD\nQ1KJ4SGpxPCQVGJ4SCoxPCSVGB6SSgwPSSWGh6QSw0NS
ieEhqaR006eIGAK+TucetFuBszLzsZ7x\nTwJnA9uBizJz1TT0KmmAVPc8TgXmZ+ZSYAVw2cRARBwCfBY4FvgA8I8R8Yp+G5XK2u2mO3hZqobH\nMuA2gMy8B1jSM/YOYE1m7sjMcWA98La+upT6MPTc0+y/9nQWPPJN9ht/xDCZJtUbXY8AG3qWd0TE\nnMx8fhdjm4D9p7LRdetWF9vpz7btB3HE0H8y/6Ef8/hjv2ukh9bixTx+3337XO3Zqv/GzU9ywNgl\n8NAl/H6/Fv877+08Of/P+O28YPSU0xp77wEsXz7aeP2KaniMA8M9yxPBMTE20jM2DDw7lY1WJ9Gv\nN/0Wlq+5hGVPXQrPNdIC3PnC3bd9pnYD9RfuHOONW37CG7f8BBb+CTy2H8tHR2FoaBa7eKGm3vv9\nqIbHWuBk4HsR8U7ggZ6xXwBfjoh5wALgCODBqWx0bGxjsZ3+rF+3muXHncL4f7+hkfoAI8MLGN+4\nZZ+rPVv1F/zqGl7xzL8DsPNVb+C517yHba99D9sP/HNahxzY2HsPoNUabrx+RTU8bgbeGxFru8tn\nRsT5wPrM/FFEXAGsAYaAL2bmtmKd2XPAETy380+bq98a5rmm3kBN1p6F+kNbx3jlf32bTW/+O7Yd\n8h52Dh/W6F7Gy0UpPDKzDZw7afUjPeNXAVf10Zc0bdrzD2LDX1zfdBsvO54kppc/9zJmhOEhqcTw\nkFRieEgqMTwklRgekkoMD0klhoekEsNDUonhIanE8JBUYnhIKjE8JJUYHpJKDA9JJYaHpBLDQ1KJ\n4SGpxPCQVGJ4SCoxPCSVGB6SSgwPSSWl+7ZExCuB64GD6dxe8m8y87eTnvMD4ABgO7AlMz/YZ6+S\nBkj1jnHnAr/MzH+IiA8DFwDnTXrOYZn55r66kzSwqh9blgG3dR/fCpzYOxgRBwOLIuKHEfGziHCv\nQ3qZ2eOeR0R8HDgfaHdXDQFPAhu6yxuBkUkvmwd8FbgcOBBYGxH3ZObT09G0pObtMTwy82rg6t51\nEfGvwMSttYeBZye97EngXzLzeWAsIu4HAjA8pJeJ6jGPtcBJwL3dP++eNH4i8LfAyRHxauDNwMN7\n2OZQqzW8h6fMjOXLRwFoqv6EJuvvy3O3fs1Qu93e87MmiYgFwHXAa4HngL/OzKci4hLgpsy8NyIu\nA44FdgKXZOYt09i3pIaVwkOSPElMUonhIanE8JBUYnhIKqn+qrZvTVwfExFDwNeBtwNbgbMy87Ge\n8U8CZ3frXZSZq/qpV6h/ObCUzol3AKdk5sY/2lD/fRwDXJyZJ0xaP0rnUoPtwDWZuXK6a++h/vnA\nJ4Cnuqs+lZnrp7HuXDrnLB1K50TGi3p/CziT859C7Rmde7fGHOBKOudcPQ+ck5kP9Yzv1fwbCw+a\nuT7mVGB+Zi7tvoEv664jIg4BPgscDbwKWBMRd2Tm9tmo33U08P7MfGYaa75ARHweOAPYNGn93G4/\ni4EtdM4K/mFmPvXHW5n++l1HA2dk5v3TWbPH6cDTmfnRiDgAuB+4pdvXTM//RWt3zfTcAUaBdmYu\ni4i/BL7CH97/ez3/Jj+2NHF9zP/XzMx7gCU9Y+8A1mTmjswcB9YDb5uGmlOq390reRPwrYhYExFn\nTnPtCY8Cp+1i/ZHA+swc7wbmGuC4WawPnTfuioi4OyK+MAO1v0vnhxR0LrPo/cEw0/PfXW2Y+bmT\nmT+gs2cNnT2g3/UM7/X8ZyU8IuLjEfFARPyy+/UAnethpnJ9zKnAh4B/joiD+myltybAju6u3K7G\nNgH791lvb+ovBK6g8xPqA8CnI+It01yfzLwZ2DGF3jYy/fPfXX2AG4FzgBOAZRFx0jTX3pyZv4+I\nYeAm4Es9wzM6/z3Uhhmee08fz0fEtXSuO7uhZ2iv5z8r4ZGZV2fmWzPzbd2vt9I5zjGl62Myc4zO\nbl702UpvTYA53etvJsZ6A2xXPfVrd/U3A1dk5tbM3ATcSefYyGyZjfnvyeWZ+Uxm7gBWAUdNd4GI\neD2dv9vrMvM7PUMzPv/d1IZZmPuEzPwYcDiwsnu2OBTm3+Qxj5m4PmYqNU8GvhcR7wQe6Bn7BfDl\niJgHLACOAB7ss97e1D8c+HZEHEXn32UZcO001+81NGn5YeCwiFhEJ8iOBy6drfoRMQI8GBFH0PnM\n/W7gquks2D2udTvwmcy8a9LwjM5/d7VnY+7dOqcDr8vMi+kcsN/Z/YLC/JsMj28A10XE3XSvjwHo\nuT7mtoh4X0SsozPBFdNwIPFm4L0Rsba7fGb3KPf6zPxRRFxB57PeEPDFzNzWZ729rX89cA+wjc5P\np37DcnfaABHxEWBhZq6MiM8Bd9CZ/8rMfGKW668AVtN5Y/80M2/bzesrVgCLgAsi4sJuD1cyO/Pf\nU+2ZnjvA94FrIuLf6Hzvnwd8KCJK8/faFkklniQmqcTwkFRieEgqMTwklRgekkoMD0klhoekEsND\nUsn/AcPgudf36t92AAAAAElFTkSuQmCC\n", 300 | "text/plain": [ 301 | "" 302 | ] 303 | }, 304 | "metadata": {}, 305 | "output_type": "display_data" 306 | } 307 | ], 308 | "source": [ 309 | "new_i = A.dot(i)\n", 310 | "new_j = A.dot(j)\n", 311 | "plotVectors([new_i, new_j], [['#1190FF'], ['#FF9A13']])\n", 312 | "plt.xlim(-0.5, 3)\n", 313 | "plt.ylim(-0.5, 3)\n", 314 | "plt.show()" 315 | ] 316 | }, 317 | { 318 | "cell_type": "markdown", 319 | "metadata": {}, 320 | "source": [ 321 | "As expected, we can see that the square corresponding to $i$ and $j$ didn't rotate but the lengths of $i$ and $j$ have doubled. 
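You can verify the doubling numerically; a quick sketch of mine reusing `A`, `i` and `j` from the cells above:

```python
import numpy as np

A = np.array([[2, 0], [0, 2]])
i = [0, 1]
j = [1, 0]

# Both transformed basis vectors now have length 2.
np.linalg.norm(A.dot(i)), np.linalg.norm(A.dot(j))  # (2.0, 2.0)
```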
We will now calculate the determinant of $\\bs{A}$ (you can go to the [Wikipedia article](https://en.wikipedia.org/wiki/Determinant) for more details about the calculation of the determinant):\n", 322 | "\n", 323 | "[image]\n", 324 | "The unit square transformed by the matrix" 325 | ] 326 | }, 327 | { 328 | "cell_type": "code", 329 | "execution_count": 21, 330 | "metadata": {}, 331 | "outputs": [ 332 | { 333 | "data": { 334 | "text/plain": [ 335 | "4.0" 336 | ] 337 | }, 338 | "execution_count": 21, 339 | "metadata": {}, 340 | "output_type": "execute_result" 341 | } 342 | ], 343 | "source": [ 344 | "np.linalg.det(A)" 345 | ] 346 | }, 347 | { 348 | "cell_type": "markdown", 349 | "metadata": {}, 350 | "source": [ 351 | "And yes, the transformation has multiplied the area of the unit square by 4. The lengths of $new_i$ and $new_j$ are $2$ (thus $2\\cdot2=4$)." 352 | ] 353 | }, 354 | { 355 | "cell_type": "markdown", 356 | "metadata": { 357 | "collapsed": true 358 | }, 359 | "source": [ 360 | "### Example 2.\n", 361 | "\n", 362 | "Let's now see an example with a negative determinant.\n", 363 | "\n", 364 | "We will transform the unit square with the matrix:\n", 365 | "\n", 366 | "$$\n", 367 | "\\bs{B}=\\begin{bmatrix}\n", 368 | "    -2 & 0\\\\\\\\\n", 369 | "    0 & 2\n", 370 | "\\end{bmatrix}\n", 371 | "$$\n", 372 | "\n", 373 | "Its determinant is $-4$:" 374 | ] 375 | }, 376 | { 377 | "cell_type": "code", 378 | "execution_count": 22, 379 | "metadata": {}, 380 | "outputs": [ 381 | { 382 | "data": { 383 | "text/plain": [ 384 | "-4.0" 385 | ] 386 | }, 387 | "execution_count": 22, 388 | "metadata": {}, 389 | "output_type": "execute_result" 390 | } 391 | ], 392 | "source": [ 393 | "B = np.array([[-2, 0], [0, 2]])\n", 394 | "np.linalg.det(B)" 395 | ] 396 | },
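As a quick numerical aside (my addition, not a cell from the original notebook): in 2D, the signed area of the parallelogram spanned by the transformed basis vectors can be computed directly with the cross product, and it matches the determinant just shown, sign included.

```python
import numpy as np

B = np.array([[-2, 0], [0, 2]])
new_i = B.dot(np.array([1, 0]))
new_j = B.dot(np.array([0, 1]))

# In 2D, np.cross returns the signed area of the parallelogram
# spanned by the two vectors; the negative sign flags the flip.
signed_area = np.cross(new_i, new_j)
print(signed_area)                                # -4
print(np.isclose(signed_area, np.linalg.det(B)))  # True
```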
"iVBORw0KGgoAAAANSUhEUgAAAQ8AAAECCAYAAADgsVLgAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAEIpJREFUeJzt3X+s3XV9x/HnLVCs5d6SsgOYzI1M4F3mr0gbkVqYbqAG\nbYXExLjhDyYizLmAxmg1aLKI02GaQYxOKSCZjA2cRLEDifyItlYUhptE8qZIpotAPIj0FtrSFs7+\nOOfOw11/3Ps+5/Sccp+PpOn5fj/f832/P/ne+8r3fHu+/Y61Wi0kabbmDbsBSQcmw0NSieEhqcTw\nkFRieEgqMTwklRxceVNEzAOuAAJ4Fjg/M3/WNb4SuBjYCVydmWv70KukEVI981gJtDJzBe2Q+MzU\nQEQcDKwBTgNeB5wXEUf22KekEVMKj8z8JnBeZ/EY4LddwycAmzJzMjN3AuuBU3ppUtLoKX1sAcjM\nZyPiq8CZwNu6hiaAzV3LW4BF1TqSRlNPF0wz8z3A8cDaiFjQWT1JO0CmjANP9FJH0uipXjA9G/j9\nzPwssB14pvMH4H7g2Ig4HNgKnApcuq99tlqt1tjYWKUdSb0p/eKNVW6Mi4gXAlcDR9MOoM8ChwEL\nM3NtRLwZ+FSnqSsz8x9nsNtWs7ll1r30S6MxzlytP5fnPuz6GzfeyapVK4c9/1J4lM48MnMr8Pa9\njK8D1lX2LenA4JfEJJUYHpJKDA9JJYaHpBLDQ1KJ4SGpxPCQVGJ4SCoxPCSVGB6SSgwPSSWGh6QS\nw0NSieEhqcTwkFRieEgqMTwklRgekkoMD0klhoekEsNDUonhIanE8JBUYnhIKqk+bvJg4CrgGGA+\ncElm3tQ1fhHwXuDXnVXvz8xNvbUqaZSUwgM4G3gsM98VEYuBe4GbusZPBN6Zmff22qCk0VQNj+uB\nGzqvx4Cd08aXAqsj4kXAus4DsSU9j5SueWTm1sx8KiLGaYfIJ6Ztch1wPvB6YEVEnNFbm5JGzVir\n1Sq9MSJeDHwD+EJmXjNtbCIzJzuvLwAWZ+Yl+9hlrRFJvRqrvKl6wfQo4DvABzLzjmljE8B9EbEE\n2Ab8KXDlTPbbbG6ptNMXjcb4nK0/l+c+7PobN97JqlUrhz7/iuo1j9XA4cDFEfFJ2mcNVwALM3Nt\nRKwG7gS2A7dl5i3FOpJGVCk8MvNC4MK9jF8LXFttStLo80tikkoMD0klhoekEsNDUonhIanE8JBU\nYnhIKjE8JJUYHpJKDA9JJYaHpBLDQ1KJ4SGpxPCQVGJ4SCoxPCSVGB6SSgwPSSWGh6QSw0NSieEh\nqcTwkFRieEgqMTwklVQfN3kwcBVwDDAfuCQzb+oaXwlcDOwErs7Mtb23KmmUVM88zgYey8xTgTOA\nL0wNdIJlDXAa8DrgvIg4ssc+JY2YanhcT/vMAtpP2N7ZNXYCsCkzJzNzJ7AeOKXeoqRRVH1W7VaA\niBgHbgA+0TU8AWzuWt4CLKo2KGk0lcIDICJeDHwD+EJm/mvX0CTtAJkyDjwxk302GuPVdvpiLtef\ny3MfZv1Vq1bSag1//hXVC6ZHAd8BPpCZd0wbvh84NiIOB7YCpwKXzmS/zeaWSjt90WiMz9n6c3nu\nw66/ceOdtE5YyfLDhzv/iuqZx2rgcODiiPgk0AKuABZm5tqI+BBwK+3rIWsz85FiHel5bVdrjE9+\nD77zZph/0LC7mZ3qNY8LgQv3Mr4OWFdtSporcvsRPPwk3NM8iJOPfmbY7cyKXxKThugnW48C4PZf\nlS8/Do3hIQ1Jq/W78LjjVwfTag25oVkyPKQheXDzPJq7FgLwP0/O4+eTB9av44HVrfQ8csfDz/2o\ncscB9tHF8JCG5AePHMSxhz4OwKt+7xl+8OiB9c8thoc0BK0WXLp8O2+YeAiAd8UO/v7k7QfUdY8D\n6zxJep4YG4PGgucmxfTlUeeZh6QSw0NSieEhqcTwkFRieEgqMTwklRgekkoMD0klhoekEsNDUonh\nIanE8JBUYnhIKjE8JJUYHpJKDA9JJYaHpJKe/iexiDgJ+Gxmvn7a+ouA9wK/7qx6f2Zu6qWWpNHS\ny4OuPwK8E3hyN8MnAu/MzHur+5c02nr52PIgcNYexpYCqyPi+xHxsR5qSBpR5fDIzBuBXXsYvg44\nH3g9sCIizqjWkTSaBvW/p1+WmZMAEbEOeBXw7/t6U6MxPqB2ZmYu15/Lcx9m/aXLlsHNMDGxgEZj\nKC2U9SM8xroXImICuC8ilgDbgD8FrpzJjprNLX1op6bRGJ+z9efy3Idd/567E1jG5OQ2ms09ncgP\nVjU4+xEeLYCIeAewMDPXRsRq4E5gO3BbZt7ShzqSRkhP4ZGZvwCWd15f17X+WuDa3lqTNMr8kpik\nEsNDUonhIanE8JBUYnhIKjE8JJUYHpJKDA9JJYaHpBLDQ1KJ4SGpxPCQVGJ4SCoxPCSVGB6SSgwP\nSSWGh6QSw0NSieEhqcTwkFRieEgqMTwklRgekkoMD0klPYVHRJwUEXfsZv3KiPhRRGyIiHN7qSFp\nNJXDIyI+AlwBHDpt/cHAGuA04HXAeRFxZA89ShpBvZx5PAictZv1JwCbMnMyM3cC64FTeqgjaQSV\nwyMzbwR291jvCWBz1/IWYFG1jqTR1NODrvdgknaATBkHnpjJGxuN8QG0M3Nzuf5cnvsw6y9dtgxu\nhomJBTQaQ2mhrB/hMTZt+X7g2Ig4HNgKnApcOpMdNZtb+tBOTaMxPmfrz+W5D7v+PXcnsIzJyW00\nm7s7kR+8anD2IzxaABHxDmBhZq6NiA8Bt9IOlrWZ+Ugf6kgaIT2FR2b+AljeeX1d1/p1wLreWpM0\nyvySmKQSw0NSieEhqcTwkFRieEgqMTwklRgekkoMD0klhoekEsNDUonhIanE8JBUYnhIKjE8JJUY\nHpJKDA9JJYaHpBLDQ1KJ4SGpxPCQVGJ4SCoxPCSVGB6SSgwPSSWlhz5FxBjwReCVwHbg3Mx8qGv8\nMtoPg5p6ht9bM3N4zxOU1HfVJ8adCRyamcsj4iRgTWfdlBOBN2bm4702KGk0VT+2rABuAcjMu4Bl\nUwOds5LjgK9ExPqIOKfnLiWNnGp4TACbu5Z3RcTUvhYClwNnA28C/ioiXlZvUdIoqn5smQTGu5bn\nZeaznddbgcszcztARNxO+9rIffvaaaMxvq9NBmou15/Lcx9m/aXLlsHNMDGxgEZjKC2UVcNjA/AW\n4OsR8Rrgp11jxwP/EhGv6ux/BfDVmey02RzeNdVGY3zO1p/Lcx92/XvuTmAZk5PbaDZ3DaWHanBW\nw+NG4PSI2NBZPiciLgI2Zea3I+JrwF3ADuCazLy/WEfSiCqFR2a2gAumrX6ga/zzwOd76EvSiPNL\nYpJKDA9JJYaHpBLDQ1KJ4SGpxPCQVGJ4SCoxPCSVGB6SSgwPSSWGh6QSw0NSieEhqcTwkFRieEgq\nMTwklRgekkoMD0klhoekEsNDUonhIanE8JBUYnhIKjE8
JJWUHvoUEWPAF2k/g3Y7cG5mPtQ1/j7g\nPGAncElmrutDr5JGSPXM40zg0MxcDqwG1kwNRMRRwAeBk4E3AX8XEYf02uic12oNuwPpOarhsQK4\nBSAz7wKWdY29GlifmbsycxLYBLyipy7nolaLgyYfYMEDX2LRhrMZe/qxYXckPUf1QdcTwOau5V0R\nMS8zn93N2JPAopnsdOPGO4vt9G7VqpVDr//DH3yXI3YkRz/9E47a8Z8sfKYJwOOHvIT/3nj1wGo3\nli7lF/fcM7D9W3/PlmwfZ8nYYWx64Ek2PvzIUHpYtWpl6X3V8JgExruWp4Jjamyia2wceGImO61O\nol+GVr/Vgoe+xcqnPgVPPfz/hhfv/DmLN39pcPVvf+6p4343h+svA475wxZHrPgoJxwxpCaKquGx\nAXgL8PWIeA3w066xHwGfjoj5wAJgCXDfTHbabG4pttO7RmN8uPX/aBXNha/lkN/8mPmP3Mahj97G\nQVt/CcDOxSey7SXnDKz2xPgCJrdsG9j+rb9nDzzwM1ac8laaz2yh2RxKCzQa4/veaDeq4XEjcHpE\nbOgsnxMRFwGbMvPbEXE5sB4YAz6emTuKdeaWefPZ2XgtOxuv5amXX8xBWx5k/qO3Mb+5nh1HnETr\nBY3B1G2M8/QQg3Mu13/4l4fB4iUwzPkXlcIjM1vABdNWP9A1fiVwZQ99aWyMZyaOY9vEcWw7/nz/\ntUUjxy+JHSjGxobdgfQchoekEsNDUonhIanE8JBUYnhIKjE8JJUYHpJKDA9JJYaHpBLDQ1KJ4SGp\nxPCQVGJ4SCoxPCSVGB6SSgwPSSWGh6QSw0NSieEhqcTwkFRieEgqMTwklZSe2xIRLwC+BhxJ+/GS\n787M30zb5pvAYmAnsC0z39xjr5JGSPWJcRcA/5WZfxsRbwcuBi6cts2xmfnSnrqTNLKqH1tWALd0\nXt8MnNY9GBFHAodHxLci4nsR4VmH9DyzzzOPiPhL4CJg6nmHY8CjwObO8hZgYtrb5gOfBy4DjgA2\nRMRdmflYP5qWNHz7DI/MvAq4qntdRPwbMPVo7XHgiWlvexT4cmY+CzQj4l4gAMNDep6oXvPYAJwB\n3N35+/vTxk8D/hp4S0QcBrwUuH8f+xxrNMb3sclgzeX6c3nuw6y/atXKodbvxVir8PT1iFgAXAO8\nCHga+PPM/HVEfA64ITPvjog1wMnAM8DnMvOmPvYtachK4SFJfklMUonhIanE8JBUYnhIKqn+U21P\nIuKFwD/TvvdlO+17Yx6Zts37gPNo3xtzSWau62P9Cdr35kwAhwAfzswfTtvmMmA57S/BAbw1M7fQ\nBzOsP7D5d9U4C3hbZv7FbsYGNv8Z1h/k8R/KvVkRMQZ8EXgl7Z/7czPzoa7xgR7zGdSf1TEfSngA\n7wPuzsxPR8S7gY/SdW9MRBwFfBA4EXghsD4ibs3MnX2q/yHgu5l5eUQcD1wHLJ22zYnAGzPz8T7V\nnHH9/TB/IuIfgDcAP9nDJoOc/17r74f5D+verDOBQzNzeUScBKzprNsvx3xv9TtmdcyH8rElMy8D\nLuks/gHw22mbvBpYn5m7MnMS2AS8oo8trAG+3Hl9CLCte7CT0McBX4mI9RFxTh9r77M+g58/tL/o\nd8HuBvbD/Pdan8HPf1j3Zv1f3cy8C1jWNbY/jvke61eO+cDPPKbdGzPW+fuczLwnIm4DXgacPu1t\nE/zu3hmAJ4FFA6h/NPBPwN9Me9tC4HLav+QHA3dExI8z8779VH9/zP+GiPiTPbxtf8x/b/UHNX8Y\n7r1Z0+e1KyLmdW7j6Nuci/VnfcwHHh67uzema+zPIiKAdcCxXUOTPPeA7u7+mZ7qR8TLaV93+XBm\nrp82vBW4PDO3d7a9nfbnxFn/8hTrD3z++zDw+e/DQOc/xHuzJrvqAkz94k6N9WXOxfqzPuZD+dgS\nER+LiLM7i1uBXdM2+RGwIiLmR8QiYAmFH9y91P9j4HraX6u/dTebHE/7M+dYRBxC+3TvP/Zj/YHO\nfwYGOv8ZGPT8p+7Ngj3fm3U9wCzuzZpV3Yh4DfDTrrH9ccz3Vn/Wx3xYF0yvAq6JiPfSDrD3AETE\nRcCmzPx2RFwOrKd9mvnxzNzRx/qfAQ4FLut81nsiM8+aVv9rwF3ADuCazOzHD89s6g9y/ru1H+c/\nk/qDnP+XaP/8fZ/OvVmd+lP3Zt0SEW+IiI20781a3acLxzcCp0fEhs7yOfv5mO+r/qyOufe2SCrx\nS2KSSgwPSSWGh6QSw0NSieEhqcTwkFRieEgqMTwklfwv+sPnD1rNqNwAAAAASUVORK5CYII=\n", 405 | "text/plain": [ 406 | "" 407 | ] 408 | }, 409 | "metadata": {}, 410 | "output_type": "display_data" 411 | } 412 | ], 413 | "source": [ 414 | "new_i_1 = B.dot(i)\n", 415 | "new_j_1 = B.dot(j)\n", 416 | "plotVectors([new_i_1, new_j_1], [['#1190FF'], ['#FF9A13']])\n", 417 | "plt.xlim(-3, 0.5)\n", 418 | "plt.ylim(-0.5, 3)\n", 419 | "plt.show()" 420 | ] 421 | }, 422 | { 423 | "cell_type": "markdown", 424 | "metadata": {}, 425 | "source": [ 426 | "We can see that the matrices with determinant $2$ and $-2$ modified the area of the unit square the same way.\n", 427 | "\n", 428 | "\"Areas\n", 429 | "The unit square transformed by the matrix with a negative determinant\n", 430 | "\n", 431 | "The absolute value of the determinant shows that, as in the first example, the area of the new square is 4 times the area of the unit square. But this time, it was not just a rescaling but also a transformation. It is not obvious with only the unit vectors so let's transform some random points. 
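Another property worth checking numerically (an aside of mine, not part of the notebook's text here): determinants are multiplicative, $\det(\bs{AB}) = \det(\bs{A})\det(\bs{B})$, so composing transformations multiplies their area scalings.

```python
import numpy as np

A = np.array([[2, 0], [0, 2]])   # assumed value of A, implied by the text above
B = np.array([[-2, 0], [0, 2]])

print(np.linalg.det(A.dot(B)))              # -16.0
print(np.linalg.det(A) * np.linalg.det(B))  # -16.0
```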
We will use the matrix\n", 432 | "\n", 433 | "$$\n", 434 | "\\bs{C}=\\begin{bmatrix}\n", 435 | "    -1 & 0\\\\\\\\\n", 436 | "    0 & 1\n", 437 | "\\end{bmatrix}\n", 438 | "$$\n", 439 | "\n", 440 | "that has a determinant equal to $-1$ for simplicity:" 441 | ] 442 | }, 443 | { 444 | "cell_type": "code", 445 | "execution_count": 24, 446 | "metadata": { 447 | "collapsed": true 448 | }, 449 | "outputs": [], 450 | "source": [ 451 | "# Some random points\n", 452 | "points = np.array([[1, 3], [2, 2], [3, 1], [4, 7], [5, 4]])" 453 | ] 454 | }, 455 | { 456 | "cell_type": "code", 457 | "execution_count": 25, 458 | "metadata": {}, 459 | "outputs": [ 460 | { 461 | "data": { 462 | "text/plain": [ 463 | "-1.0" 464 | ] 465 | }, 466 | "execution_count": 25, 467 | "metadata": {}, 468 | "output_type": "execute_result" 469 | } 470 | ], 471 | "source": [ 472 | "C = np.array([[-1, 0], [0, 1]])\n", 473 | "np.linalg.det(C)" 474 | ] 475 | }, 476 | { 477 | "cell_type": "markdown", 478 | "metadata": {}, 479 | "source": [ 480 | "Since the determinant is $-1$, the area of the space will not be changed. However, since it is negative, we will observe a transformation that we can't obtain through rotation:" 481 | ] 482 | }, 483 | { 484 | "cell_type": "code", 485 | "execution_count": 26, 486 | "metadata": {}, 487 | "outputs": [ 488 | { 489 | "data": { 490 | "image/png": "<base64-encoded PNG omitted: plot of the original and mirrored sets of points>\n", 491 | "text/plain": [ 492 | "" 493 | ] 494 | }, 495 | "metadata": {}, 496 | "output_type": "display_data" 497 | } 498 | ], 499 | "source": [ 500 | "newPoints = points.dot(C)\n", 501 | "\n", 502 | "plt.figure()\n", 503 | "plt.plot(points[:, 0], points[:, 1])\n", 504 | "plt.plot(newPoints[:, 0], newPoints[:, 1])\n", 505 | "plt.show()" 506 | ] 507 | },
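A quick numerical check of that claim (my addition, not from the notebook): any 2D rotation matrix has determinant $+1$, while this mirror matrix has determinant $-1$.

```python
import numpy as np

theta = np.pi / 4
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
mirror = np.array([[-1, 0], [0, 1]])

print(np.linalg.det(rotation))  # 1.0 (up to float rounding): rotations preserve orientation
print(np.linalg.det(mirror))    # -1.0: reflections reverse it
```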
"metadata": {}, 511 | "source": [ 512 | "You can see that the transformation mirrored the initial shape.\n", 513 | "\n", 514 | "# Conclusion\n", 515 | "\n", 516 | "We have seen that the determinant of a matrix is a special value telling us a lot of things on the transformation corresponding to this matrix. Now hang on and go to the *last chapter* on the Principal Component Analysis (PCA)." 517 | ] 518 | }, 519 | { 520 | "cell_type": "markdown", 521 | "metadata": {}, 522 | "source": [ 523 | "\n", 524 | " Feel free to drop me an email or a comment. The syllabus of this series can be found [in the introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/). All the notebooks can be found on [Github](https://github.com/hadrienj/deepLearningBook-Notes).\n", 525 | "" 526 | ] 527 | }, 528 | { 529 | "cell_type": "markdown", 530 | "metadata": {}, 531 | "source": [ 532 | "# References\n", 533 | "\n", 534 | "## Linear transformations\n", 535 | "\n", 536 | "- [Nykamp DQ, “Determinants and linear transformations.” From Math Insight](https://mathinsight.org/determinant_linear_transformation)\n", 537 | "\n", 538 | "- [Determinant intuition - SE](https://math.stackexchange.com/questions/668/whats-an-intuitive-way-to-think-about-the-determinant)\n", 539 | "\n", 540 | "## Numpy\n", 541 | "\n", 542 | "- [Numpy determinant](https://docs.scipy.org/doc/numpy-1.13.0/reference/generated/numpy.linalg.det.html)\n", 543 | "\n" 544 | ] 545 | } 546 | ], 547 | "metadata": { 548 | "kernelspec": { 549 | "display_name": "Python 2", 550 | "language": "python", 551 | "name": "python2" 552 | }, 553 | "language_info": { 554 | "codemirror_mode": { 555 | "name": "ipython", 556 | "version": 2 557 | }, 558 | "file_extension": ".py", 559 | "mimetype": "text/x-python", 560 | "name": "python", 561 | "nbconvert_exporter": "python", 562 | "pygments_lexer": "ipython2", 563 | "version": "2.7.10" 564 | } 565 | }, 566 | "nbformat": 4, 567 | "nbformat_minor": 2 568 | } 569 | -------------------------------------------------------------------------------- /2.11 The determinant/images/positive-negative-determinant.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.11 The determinant/images/positive-negative-determinant.png -------------------------------------------------------------------------------- /2.11 The determinant/images/unit-square-area-transformed-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.11 The determinant/images/unit-square-area-transformed-1.png -------------------------------------------------------------------------------- /2.11 The determinant/images/unit-square-area-transformed.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.11 The determinant/images/unit-square-area-transformed.png -------------------------------------------------------------------------------- /2.11 The determinant/images/unit-square-area.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.11 The determinant/images/unit-square-area.png 
-------------------------------------------------------------------------------- /2.11 The determinant/test_svd.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.11 The determinant/test_svd.jpg -------------------------------------------------------------------------------- /2.12 Example - Principal Components Analysis/images/first-principal-component.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.12 Example - Principal Components Analysis/images/first-principal-component.png -------------------------------------------------------------------------------- /2.12 Example - Principal Components Analysis/images/gradient-descent-local-minima.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.12 Example - Principal Components Analysis/images/gradient-descent-local-minima.png -------------------------------------------------------------------------------- /2.12 Example - Principal Components Analysis/images/gradient-descent.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.12 Example - Principal Components Analysis/images/gradient-descent.png -------------------------------------------------------------------------------- /2.12 Example - Principal Components Analysis/images/orthogonal-vectors.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.12 Example - Principal Components Analysis/images/orthogonal-vectors.png -------------------------------------------------------------------------------- /2.12 Example - Principal Components Analysis/images/principal-component-analysis-variance-explained.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.12 Example - Principal Components Analysis/images/principal-component-analysis-variance-explained.png -------------------------------------------------------------------------------- /2.12 Example - Principal Components Analysis/images/principal-components-analysis-PCA-change-coordinates.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.12 Example - Principal Components Analysis/images/principal-components-analysis-PCA-change-coordinates.png -------------------------------------------------------------------------------- /2.12 Example - Principal Components Analysis/images/principal-components-analysis-PCA-decoding-function.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.12 Example - Principal Components Analysis/images/principal-components-analysis-PCA-decoding-function.png 
-------------------------------------------------------------------------------- /2.12 Example - Principal Components Analysis/images/principal-components-analysis-PCA-encoding-function.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.12 Example - Principal Components Analysis/images/principal-components-analysis-PCA-encoding-function.png -------------------------------------------------------------------------------- /2.12 Example - Principal Components Analysis/images/principal-components-analysis-PCA-reconstruction-function.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.12 Example - Principal Components Analysis/images/principal-components-analysis-PCA-reconstruction-function.png -------------------------------------------------------------------------------- /2.2 Multiplying Matrices and Vectors/images/dot-product.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.2 Multiplying Matrices and Vectors/images/dot-product.png -------------------------------------------------------------------------------- /2.2 Multiplying Matrices and Vectors/images/plot-linear-equation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.2 Multiplying Matrices and Vectors/images/plot-linear-equation.png -------------------------------------------------------------------------------- /2.2 Multiplying Matrices and Vectors/images/system-linear-equations-matrix-form.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.2 Multiplying Matrices and Vectors/images/system-linear-equations-matrix-form.png -------------------------------------------------------------------------------- /2.3 Identity and Inverse Matrices/2.3 Identity and Inverse Matrices.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 61, 6 | "metadata": { 7 | "collapsed": true 8 | }, 9 | "outputs": [], 10 | "source": [ 11 | "import numpy as np\n", 12 | "import matplotlib.pyplot as plt\n", 13 | "import seaborn as sns" 14 | ] 15 | }, 16 | { 17 | "cell_type": "code", 18 | "execution_count": 62, 19 | "metadata": {}, 20 | "outputs": [ 21 | { 22 | "name": "stdout", 23 | "output_type": "stream", 24 | "text": [ 25 | "Populating the interactive namespace from numpy and matplotlib\n" 26 | ] 27 | } 28 | ], 29 | "source": [ 30 | "# Plot parameters\n", 31 | "sns.set()\n", 32 | "%pylab inline\n", 33 | "pylab.rcParams['figure.figsize'] = (4, 4)" 34 | ] 35 | }, 36 | { 37 | "cell_type": "code", 38 | "execution_count": 63, 39 | "metadata": { 40 | "collapsed": true 41 | }, 42 | "outputs": [], 43 | "source": [ 44 | "# Avoid inaccurate floating values (for inverse matrices in dot product for instance)\n", 45 | "# See https://stackoverflow.com/questions/24537791/numpy-matrix-inversion-rounding-errors\n", 46 | "np.set_printoptions(suppress=True)" 47 | ] 
48 | }, 49 | { 50 | "cell_type": "code", 51 | "execution_count": 1, 52 | "metadata": {}, 53 | "outputs": [ 54 | { 55 | "data": { 56 | "text/html": [ 57 | "" 84 | ], 85 | "text/plain": [ 86 | "" 87 | ] 88 | }, 89 | "metadata": {}, 90 | "output_type": "display_data" 91 | } 92 | ], 93 | "source": [ 94 | "%%html\n", 95 | "" 122 | ] 123 | }, 124 | { 125 | "cell_type": "markdown", 126 | "metadata": {}, 127 | "source": [ 128 | "$$\n", 129 | "\\newcommand\\bs[1]{\\boldsymbol{#1}}\n", 130 | "$$" 131 | ] 132 | }, 133 | { 134 | "cell_type": "markdown", 135 | "metadata": {}, 136 | "source": [ 137 | "\n", 138 | "    This content is part of a series following the chapter 2 on linear algebra from the [Deep Learning Book](http://www.deeplearningbook.org/) by Goodfellow, I., Bengio, Y., and Courville, A. (2016). It aims to provide intuitions/drawings/python code on mathematical theories and is constructed as my understanding of these concepts. You can check the syllabus in the [introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/).\n", 139 | "" 140 | ] 141 | }, 142 | { 143 | "cell_type": "markdown", 144 | "metadata": {}, 145 | "source": [ 146 | "# Introduction\n", 147 | "\n", 148 | "This chapter is light but contains some important definitions. The identity matrix and the inverse of a matrix are concepts that will be very useful in the next chapters. We will see at the end of this chapter that we can solve systems of linear equations by using the inverse matrix. So hang on!" 149 | ] 150 | }, 151 | { 152 | "cell_type": "markdown", 153 | "metadata": {}, 154 | "source": [ 155 | "# 2.3 Identity and Inverse Matrices\n", 156 | "\n", 157 | "\n", 158 | "# Identity matrices\n", 159 | "\n", 160 | "The identity matrix $\\bs{I}_n$ is a special matrix of shape ($n \\times n$) that is filled with $0$ except for the diagonal, which is filled with $1$.\n", 161 | "\n", 162 | "[image]\n", 163 | "A 3 by 3 identity matrix" 164 | ] 165 | }, 166 | { 167 | "cell_type": "markdown", 168 | "metadata": {}, 169 | "source": [ 170 | "An identity matrix can be created with the Numpy function `eye()`:" 171 | ] 172 | }, 173 | { 174 | "cell_type": "code", 175 | "execution_count": 65, 176 | "metadata": {}, 177 | "outputs": [ 178 | { 179 | "data": { 180 | "text/plain": [ 181 | "array([[ 1.,  0.,  0.],\n", 182 | "       [ 0.,  1.,  0.],\n", 183 | "       [ 0.,  0.,  1.]])" 184 | ] 185 | }, 186 | "execution_count": 65, 187 | "metadata": {}, 188 | "output_type": "execute_result" 189 | } 190 | ], 191 | "source": [ 192 | "np.eye(3)" 193 | ] 194 | }, 195 | { 196 | "cell_type": "markdown", 197 | "metadata": {}, 198 | "source": [ 199 | "When we *apply* the identity matrix to a vector, the result is this same vector:\n", 200 | "\n", 201 | "$$\\bs{I}_n\\bs{x} = \\bs{x}$$\n", 202 | "\n", 203 | "### Example 1.\n", 204 | "\n", 205 | "$$\n", 206 | "\\begin{bmatrix}\n", 207 | "    1 & 0 & 0 \\\\\\\\\n", 208 | "    0 & 1 & 0 \\\\\\\\\n", 209 | "    0 & 0 & 1\n", 210 | "\\end{bmatrix}\n", 211 | "\\times\n", 212 | "\\begin{bmatrix}\n", 213 | "    x_{1} \\\\\\\\\n", 214 | "    x_{2} \\\\\\\\\n", 215 | "    x_{3}\n", 216 | "\\end{bmatrix}=\n", 217 | "\\begin{bmatrix}\n", 218 | "    1 \\times x_1 + 0 \\times x_2 + 0\\times x_3 \\\\\\\\\n", 219 | "    0 \\times x_1 + 1 \\times x_2 + 0\\times x_3 \\\\\\\\\n", 220 | "    0 \\times x_1 + 0 \\times x_2 + 1\\times x_3\n", 221 | "\\end{bmatrix}=\n", 222 | "\\begin{bmatrix}\n", 223 | "    x_{1} \\\\\\\\\n", 224 | "    x_{2} \\\\\\\\\n", 225 | "    x_{3}\n", 226 | "\\end{bmatrix}\n", 227 | "$$" 228 | ] 229 | },
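The same neutrality holds for matrix-matrix products, $\bs{I}_n\bs{A} = \bs{A}\bs{I}_n = \bs{A}$. A quick check (my addition, not a cell from the original notebook):

```python
import numpy as np

M = np.array([[3, 0, 2], [2, 0, -2], [0, 1, 1]])
I = np.eye(3)

# The identity matrix is neutral on both sides of the dot product.
print(np.array_equal(I.dot(M), M))  # True
print(np.array_equal(M.dot(I), M))  # True
```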
| "execution_count": 66, 233 | "metadata": {}, 234 | "outputs": [ 235 | { 236 | "data": { 237 | "text/plain": [ 238 | "array([[2],\n", 239 | " [6],\n", 240 | " [3]])" 241 | ] 242 | }, 243 | "execution_count": 66, 244 | "metadata": {}, 245 | "output_type": "execute_result" 246 | } 247 | ], 248 | "source": [ 249 | "x = np.array([[2], [6], [3]])\n", 250 | "x" 251 | ] 252 | }, 253 | { 254 | "cell_type": "code", 255 | "execution_count": 67, 256 | "metadata": {}, 257 | "outputs": [ 258 | { 259 | "data": { 260 | "text/plain": [ 261 | "array([[ 2.],\n", 262 | " [ 6.],\n", 263 | " [ 3.]])" 264 | ] 265 | }, 266 | "execution_count": 67, 267 | "metadata": {}, 268 | "output_type": "execute_result" 269 | } 270 | ], 271 | "source": [ 272 | "xid = np.eye(x.shape[0]).dot(x)\n", 273 | "xid" 274 | ] 275 | }, 276 | { 277 | "cell_type": "markdown", 278 | "metadata": {}, 279 | "source": [ 280 | "## Intuition\n", 281 | "\n", 282 | "You can think of a matrix as a way to transform objects in a $n$-dimensional space. It applies a linear transformation of the space. We can say that we *apply* a matrix to an element: this means that we do the dot product between this matrix and the element (more details about the dot product in [2.2](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.2-Multiplying-Matrices-and-Vectors/)). We will see this notion thoroughly in the next chapters but the identity matrix is a good first example. It is a particular example because the space doesn't change when we *apply* the identity matrix to it.\n", 283 | "\n", 284 | "\n", 285 | " The space doesn't change when we *apply* the identity matrix to it\n", 286 | "\n", 287 | "\n", 288 | "We saw that $\\bs{x}$ was not altered after being multiplied by $\\bs{I}$." 289 | ] 290 | }, 291 | { 292 | "cell_type": "markdown", 293 | "metadata": {}, 294 | "source": [ 295 | "# Inverse Matrices\n", 296 | "\n", 297 | "The matrix inverse of $\\bs{A}$ is denoted $\\bs{A}^{-1}$. It is the matrix that results in the identity matrix when it is multiplied by $\\bs{A}$:\n", 298 | "\n", 299 | "$$\\bs{A}^{-1}\\bs{A}=\\bs{I}_n$$\n", 300 | "\n", 301 | "This means that if we apply a linear transformation to the space with $\\bs{A}$, it is possible to go back with $\\bs{A}^{-1}$. It provides a way to cancel the transformation.\n", 302 | "\n", 303 | "### Example 2.\n", 304 | "\n", 305 | "$$\n", 306 | "\\bs{A}=\\begin{bmatrix}\n", 307 | " 3 & 0 & 2 \\\\\\\\\n", 308 | " 2 & 0 & -2 \\\\\\\\\n", 309 | " 0 & 1 & 1\n", 310 | "\\end{bmatrix}\n", 311 | "$$\n", 312 | "\n", 313 | "For this example, we will use the Numpy function `linalg.inv()` to calculate the inverse of $\\bs{A}$. Let's start by creating $\\bs{A}$:" 314 | ] 315 | }, 316 | { 317 | "cell_type": "code", 318 | "execution_count": 68, 319 | "metadata": {}, 320 | "outputs": [ 321 | { 322 | "data": { 323 | "text/plain": [ 324 | "array([[ 3, 0, 2],\n", 325 | " [ 2, 0, -2],\n", 326 | " [ 0, 1, 1]])" 327 | ] 328 | }, 329 | "execution_count": 68, 330 | "metadata": {}, 331 | "output_type": "execute_result" 332 | } 333 | ], 334 | "source": [ 335 | "A = np.array([[3, 0, 2], [2, 0, -2], [0, 1, 1]])\n", 336 | "A" 337 | ] 338 | }, 339 | { 340 | "cell_type": "markdown", 341 | "metadata": {}, 342 | "source": [ 343 | "Now we calculate its inverse:" 344 | ] 345 | }, 346 | { 347 | "cell_type": "code", 348 | "execution_count": 69, 349 | "metadata": {}, 350 | "outputs": [ 351 | { 352 | "data": { 353 | "text/plain": [ 354 | "array([[ 0.2, 0.2, 0. ],\n", 355 | " [-0.2, 0.3, 1. ],\n", 356 | " [ 0.2, -0.3, -0. 
]])" 357 | ] 358 | }, 359 | "execution_count": 69, 360 | "metadata": {}, 361 | "output_type": "execute_result" 362 | } 363 | ], 364 | "source": [ 365 | "A_inv = np.linalg.inv(A)\n", 366 | "A_inv" 367 | ] 368 | }, 369 | { 370 | "cell_type": "markdown", 371 | "metadata": {}, 372 | "source": [ 373 | "We can check that $\\bs{A_{inv}}$ is well the inverse of $\\bs{A}$ with Python:" 374 | ] 375 | }, 376 | { 377 | "cell_type": "code", 378 | "execution_count": 70, 379 | "metadata": {}, 380 | "outputs": [ 381 | { 382 | "data": { 383 | "text/plain": [ 384 | "array([[ 1., 0., -0.],\n", 385 | " [ 0., 1., -0.],\n", 386 | " [ 0., 0., 1.]])" 387 | ] 388 | }, 389 | "execution_count": 70, 390 | "metadata": {}, 391 | "output_type": "execute_result" 392 | } 393 | ], 394 | "source": [ 395 | "A_bis = A_inv.dot(A)\n", 396 | "A_bis" 397 | ] 398 | }, 399 | { 400 | "cell_type": "markdown", 401 | "metadata": {}, 402 | "source": [ 403 | "We will see that inverse of matrices can be very usefull, for instance to solve a set of linear equations. We must note however that non square matrices (matrices with more columns than rows or more rows than columns) don't have inverse." 404 | ] 405 | }, 406 | { 407 | "cell_type": "markdown", 408 | "metadata": {}, 409 | "source": [ 410 | "# Sovling a system of linear equations\n", 411 | "\n", 412 | "An introduction on system of linear equations can be found in [2.2](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.2-Multiplying-Matrices-and-Vectors/).\n", 413 | "\n", 414 | "The inverse matrix can be used to solve the equation $\\bs{Ax}=\\bs{b}$ by adding it to each term:\n", 415 | "\n", 416 | "$$\\bs{A}^{-1}\\bs{Ax}=\\bs{A}^{-1}\\bs{b}$$\n", 417 | "\n", 418 | "Since we know by definition that $\\bs{A}^{-1}\\bs{A}=\\bs{I}$, we have:\n", 419 | "\n", 420 | "$$\\bs{I}_n\\bs{x}=\\bs{A}^{-1}\\bs{b}$$\n", 421 | "\n", 422 | "We saw that a vector is not changed when multiplied by the identity matrix. So we can write:\n", 423 | "\n", 424 | "$$\\bs{x}=\\bs{A}^{-1}\\bs{b}$$\n", 425 | "\n", 426 | "This is great! We can solve a set of linear equation just by computing the inverse of $\\bs{A}$ and apply this matrix to the vector of results $\\bs{b}$!\n", 427 | "\n", 428 | "Let's try that!" 429 | ] 430 | }, 431 | { 432 | "cell_type": "markdown", 433 | "metadata": {}, 434 | "source": [ 435 | "### Example 3.\n", 436 | "\n", 437 | "We will take a simple solvable example:\n", 438 | "\n", 439 | "$$\n", 440 | "\\begin{cases}\n", 441 | "y = 2x \\\\\\\\\n", 442 | "y = -x +3\n", 443 | "\\end{cases}\n", 444 | "$$\n", 445 | "\n", 446 | "We will use the notation that we saw in [2.2](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.2-Multiplying-Matrices-and-Vectors/):\n", 447 | "\n", 448 | "$$\n", 449 | "\\begin{cases}\n", 450 | "A_{1,1}x_1 + A_{1,2}x_2 = b_1 \\\\\\\\\n", 451 | "A_{2,1}x_1 + A_{2,2}x_2= b_2\n", 452 | "\\end{cases}\n", 453 | "$$\n", 454 | "\n", 455 | "Here, $x_1$ corresponds to $x$ and $x_2$ corresponds to $y$. 
431 | { 432 | "cell_type": "markdown", 433 | "metadata": {}, 434 | "source": [ 435 | "### Example 3.\n", 436 | "\n", 437 | "We will take a simple solvable example:\n", 438 | "\n", 439 | "$$\n", 440 | "\\begin{cases}\n", 441 | "y = 2x \\\\\\\\\n", 442 | "y = -x + 3\n", 443 | "\\end{cases}\n", 444 | "$$\n", 445 | "\n", 446 | "We will use the notation that we saw in [2.2](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.2-Multiplying-Matrices-and-Vectors/):\n", 447 | "\n", 448 | "$$\n", 449 | "\\begin{cases}\n", 450 | "A_{1,1}x_1 + A_{1,2}x_2 = b_1 \\\\\\\\\n", 451 | "A_{2,1}x_1 + A_{2,2}x_2 = b_2\n", 452 | "\\end{cases}\n", 453 | "$$\n", 454 | "\n", 455 | "Here, $x_1$ corresponds to $x$ and $x_2$ corresponds to $y$. So we have:\n", 456 | "\n", 457 | "$$\n", 458 | "\\begin{cases}\n", 459 | "2x_1 - x_2 = 0 \\\\\\\\\n", 460 | "x_1 + x_2 = 3\n", 461 | "\\end{cases}\n", 462 | "$$\n", 463 | "\n", 464 | "Our matrix $\\bs{A}$ of weights is:\n", 465 | "\n", 466 | "$$\n", 467 | "\\bs{A}=\n", 468 | "\\begin{bmatrix}\n", 469 | "    2 & -1 \\\\\\\\\n", 470 | "    1 & 1\n", 471 | "\\end{bmatrix}\n", 472 | "$$\n", 473 | "\n", 474 | "And the vector $\\bs{b}$ containing the right-hand side values of the individual equations is:\n", 475 | "\n", 476 | "$$\n", 477 | "\\bs{b}=\n", 478 | "\\begin{bmatrix}\n", 479 | "    0 \\\\\\\\\n", 480 | "    3\n", 481 | "\\end{bmatrix}\n", 482 | "$$\n", 483 | "\n", 484 | "In matrix form, our system becomes:\n", 485 | "\n", 486 | "$$\n", 487 | "\\begin{bmatrix}\n", 488 | "    2 & -1 \\\\\\\\\n", 489 | "    1 & 1\n", 490 | "\\end{bmatrix}\n", 491 | "\\begin{bmatrix}\n", 492 | "    x_1 \\\\\\\\\n", 493 | "    x_2\n", 494 | "\\end{bmatrix}=\n", 495 | "\\begin{bmatrix}\n", 496 | "    0 \\\\\\\\\n", 497 | "    3\n", 498 | "\\end{bmatrix}\n", 499 | "$$\n", 500 | "\n", 501 | "Let's find the inverse of $\\bs{A}$:" 502 | ] 503 | }, 504 | { 505 | "cell_type": "code", 506 | "execution_count": 71, 507 | "metadata": {}, 508 | "outputs": [ 509 | { 510 | "data": { 511 | "text/plain": [ 512 | "array([[ 2, -1],\n", 513 | "       [ 1,  1]])" 514 | ] 515 | }, 516 | "execution_count": 71, 517 | "metadata": {}, 518 | "output_type": "execute_result" 519 | } 520 | ], 521 | "source": [ 522 | "A = np.array([[2, -1], [1, 1]])\n", 523 | "A" 524 | ] 525 | }, 526 | { 527 | "cell_type": "code", 528 | "execution_count": 72, 529 | "metadata": {}, 530 | "outputs": [ 531 | { 532 | "data": { 533 | "text/plain": [ 534 | "array([[ 0.33333333,  0.33333333],\n", 535 | "       [-0.33333333,  0.66666667]])" 536 | ] 537 | }, 538 | "execution_count": 72, 539 | "metadata": {}, 540 | "output_type": "execute_result" 541 | } 542 | ], 543 | "source": [ 544 | "A_inv = np.linalg.inv(A)\n", 545 | "A_inv" 546 | ] 547 | }, 548 | { 549 | "cell_type": "markdown", 550 | "metadata": {}, 551 | "source": [ 552 | "We also have:" 553 | ] 554 | }, 555 | { 556 | "cell_type": "code", 557 | "execution_count": 73, 558 | "metadata": { 559 | "collapsed": true 560 | }, 561 | "outputs": [], 562 | "source": [ 563 | "b = np.array([[0], [3]])" 564 | ] 565 | }, 566 | { 567 | "cell_type": "markdown", 568 | "metadata": {}, 569 | "source": [ 570 | "Since we saw that\n", 571 | "\n", 572 | "$$\\bs{x}=\\bs{A}^{-1}\\bs{b}$$\n", 573 | "\n", 574 | "we have:" 575 | ] 576 | }, 577 | { 578 | "cell_type": "code", 579 | "execution_count": 74, 580 | "metadata": {}, 581 | "outputs": [ 582 | { 583 | "data": { 584 | "text/plain": [ 585 | "array([[ 1.],\n", 586 | "       [ 2.]])" 587 | ] 588 | }, 589 | "execution_count": 74, 590 | "metadata": {}, 591 | "output_type": "execute_result" 592 | } 593 | ], 594 | "source": [ 595 | "x = A_inv.dot(b)\n", 596 | "x" 597 | ] 598 | }, 599 | { 600 | "cell_type": "markdown", 601 | "metadata": {}, 602 | "source": [ 603 | "This is our solution! \n", 604 | "\n", 605 | "$$\n", 606 | "\\bs{x}=\n", 607 | "\\begin{bmatrix}\n", 608 | "    1 \\\\\\\\\n", 609 | "    2\n", 610 | "\\end{bmatrix}\n", 611 | "$$\n", 612 | "\n", 613 | "This means that the point of coordinates (1, 2) is the solution and is at the intersection of the lines representing the equations. 
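In practice, NumPy can also solve the system directly with `np.linalg.solve()`, which factorizes $\bs{A}$ instead of forming the inverse explicitly and is generally cheaper and numerically safer for larger systems (an aside of mine; the notebook uses the inverse on purpose, to illustrate it):

```python
import numpy as np

A = np.array([[2, -1], [1, 1]])
b = np.array([[0], [3]])

# Solves Ax = b without computing A^-1 explicitly.
x = np.linalg.solve(A, b)
print(x)  # [[ 1.]
          #  [ 2.]]
```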
Let's plot them to check this solution:" 614 | ] 615 | }, 616 | { 617 | "cell_type": "code", 618 | "execution_count": 75, 619 | "metadata": {}, 620 | "outputs": [ 621 | { 622 | "data": { 623 | "image/png": "<base64-encoded PNG omitted: plot of the two lines crossing at the point (1, 2)>\n", 624 | "text/plain": [ 625 | "" 626 | ] 627 | }, 628 | "metadata": {}, 629 | "output_type": "display_data" 630 | } 631 | ], 632 | "source": [ 633 | "x = np.arange(-10, 10)\n", 634 | "y = 2*x\n", 635 | "y1 = -x + 3\n", 636 | "\n", 637 | "plt.figure()\n", 638 | "plt.plot(x, y)\n", 639 | "plt.plot(x, y1)\n", 640 | "plt.xlim(0, 3)\n", 641 | "plt.ylim(0, 3)\n", 642 | "# draw axes\n", 643 | "plt.axvline(x=0, color='grey')\n", 644 | "plt.axhline(y=0, color='grey')\n", 645 | "plt.show()\n", 646 | "plt.close()" 647 | ] 648 | }, 649 | { 650 | "cell_type": "markdown", 651 | "metadata": {}, 652 | "source": [ 653 | "We can see that the solution (corresponding to the crossing of the lines) is when $x=1$ and $y=2$. It confirms what we found with the matrix inversion!" 654 | ] 655 | }, 656 | { 657 | "cell_type": "markdown", 658 | "metadata": {}, 659 | "source": [ 660 | "## BONUS: Coding tip - Draw an equation\n", 661 | "\n", 662 | "To draw the equation with Matplotlib, we first need to create a vector with all the $x$ values. Actually, since this is a line, only two points would have been sufficient. But with more complex functions, the length of the vector $x$ determines how finely the curve is sampled. So here we used the Numpy function `arange()` (see the [doc](https://docs.scipy.org/doc/numpy/reference/generated/numpy.arange.html)) to create a vector from $-10$ to $10$ (not included)." 663 | ] 664 | }, 665 | { 666 | "cell_type": "code", 667 | "execution_count": 76, 668 | "metadata": {}, 669 | "outputs": [ 670 | { 671 | "data": { 672 | "text/plain": [ 673 | "array([-10,  -9,  -8,  -7,  -6,  -5,  -4,  -3,  -2,  -1,   0,   1,   2,\n", 674 | "         3,   4,   5,   6,   7,   8,   9])" 675 | ] 676 | }, 677 | "execution_count": 76, 678 | "metadata": {}, 679 | "output_type": "execute_result" 680 | } 681 | ], 682 | "source": [ 683 | "np.arange(-10, 10)" 684 | ] 685 | }, 686 | { 687 | "cell_type": "markdown", 688 | "metadata": {}, 689 | "source": [ 690 | "The first argument is the starting point and the second the ending point. You can add a third argument to specify the step:" 691 | ] 692 | }, 693 | { 694 | "cell_type": "code", 695 | "execution_count": 77, 696 | "metadata": {}, 697 | "outputs": [ 698 | { 699 | "data": { 700 | "text/plain": [ 701 | "array([-10,  -8,  -6,  -4,  -2,   0,   2,   4,   6,   8])" 702 | ] 703 | }, 704 | "execution_count": 77, 705 | "metadata": {}, 706 | "output_type": "execute_result" 707 | } 708 | ], 709 | "source": [ 710 | "np.arange(-10, 10, 2)" 711 | ] 712 | }, 713 | { 714 | "cell_type": "markdown", 715 | "metadata": {}, 716 | "source": [ 717 | "Then we create a second vector $y$ that depends on the $x$ vector. Numpy will take each value of $x$ and apply the equation formula to it." 718 | ] 719 | },
718 | ] 719 | }, 720 | { 721 | "cell_type": "code", 722 | "execution_count": 78, 723 | "metadata": {}, 724 | "outputs": [ 725 | { 726 | "data": { 727 | "text/plain": [ 728 | "array([-19, -17, -15, -13, -11, -9, -7, -5, -3, -1, 1, 3, 5,\n", 729 | " 7, 9, 11, 13, 15, 17, 19])" 730 | ] 731 | }, 732 | "execution_count": 78, 733 | "metadata": {}, 734 | "output_type": "execute_result" 735 | } 736 | ], 737 | "source": [ 738 | "x = np.arange(-10, 10)\n", 739 | "y = 2*x + 1\n", 740 | "y" 741 | ] 742 | }, 743 | { 744 | "cell_type": "markdown", 745 | "metadata": {}, 746 | "source": [ 747 | "Finally, you just need to plot these vectors." 748 | ] 749 | }, 750 | { 751 | "cell_type": "markdown", 752 | "metadata": { 753 | "collapsed": true 754 | }, 755 | "source": [ 756 | "# Singular matrices\n", 757 | "\n", 758 | "Some matrices are not invertible. They are called **singular**." 759 | ] 760 | }, 761 | { 762 | "cell_type": "markdown", 763 | "metadata": {}, 764 | "source": [ 765 | "# Conclusion\n", 766 | "\n", 767 | "This introduces different cases according to the linear system because $\\bs{A}^{-1}$ exists only if the equation $\\bs{Ax}=\\bs{b}$ has one and only one solution. [The next chapter](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.4-Linear-Dependence-and-Span/) is almost all about systems of linear equations and number of solutions." 768 | ] 769 | }, 770 | { 771 | "cell_type": "markdown", 772 | "metadata": {}, 773 | "source": [ 774 | "\n", 775 | " Feel free to drop me an email or a comment. The syllabus of this series can be found [in the introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/). All the notebooks can be found on [Github](https://github.com/hadrienj/deepLearningBook-Notes).\n", 776 | "" 777 | ] 778 | } 779 | ], 780 | "metadata": { 781 | "kernelspec": { 782 | "display_name": "Python 2", 783 | "language": "python", 784 | "name": "python2" 785 | }, 786 | "language_info": { 787 | "codemirror_mode": { 788 | "name": "ipython", 789 | "version": 2 790 | }, 791 | "file_extension": ".py", 792 | "mimetype": "text/x-python", 793 | "name": "python", 794 | "nbconvert_exporter": "python", 795 | "pygments_lexer": "ipython2", 796 | "version": "2.7.10" 797 | } 798 | }, 799 | "nbformat": 4, 800 | "nbformat_minor": 2 801 | } 802 | -------------------------------------------------------------------------------- /2.3 Identity and Inverse Matrices/images/identity-matrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.3 Identity and Inverse Matrices/images/identity-matrix.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/intersection-2-planes-line.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/intersection-2-planes-line.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/number-solutions-system-equations.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/number-solutions-system-equations.png 
-------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/overdetermined-system-linear-equations.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/overdetermined-system-linear-equations.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/python-adding-vectors-combination-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/python-adding-vectors-combination-1.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/python-adding-vectors-combination.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/python-adding-vectors-combination.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/python-one-equation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/python-one-equation.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/python-system-equations-with-linear-dependence.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/python-system-equations-with-linear-dependence.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/python-three-equations.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/python-three-equations.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/python-two-equations-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/python-two-equations-1.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/python-two-equations-no-solution.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/python-two-equations-no-solution.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/python-two-equations.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/python-two-equations.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/python-two-vectors.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/python-two-vectors.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/representing-features.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/representing-features.png -------------------------------------------------------------------------------- /2.4 Linear Dependence and Span/images/underdetermined-system-linear-equations.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.4 Linear Dependence and Span/images/underdetermined-system-linear-equations.png -------------------------------------------------------------------------------- /2.5 Norms/images/l1-norm.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.5 Norms/images/l1-norm.png -------------------------------------------------------------------------------- /2.5 Norms/images/l2-norm.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.5 Norms/images/l2-norm.png -------------------------------------------------------------------------------- /2.5 Norms/images/squared-L2-Norm.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.5 Norms/images/squared-L2-Norm.png -------------------------------------------------------------------------------- /2.6 Special Kinds of Matrices and Vectors/2.6 Special Kinds of Matrices and Vectors.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 2, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "import numpy as np\n", 10 | "import matplotlib.pyplot as plt\n", 11 | "import seaborn as sns" 12 | ] 13 | }, 14 | { 15 | "cell_type": "code", 16 | "execution_count": 3, 17 | "metadata": {}, 18 | "outputs": [ 19 | { 20 | "name": "stdout", 21 | "output_type": "stream", 22 | "text": [ 23 | "Populating the interactive namespace from numpy and matplotlib\n" 24 | ] 25 | } 26 | ], 27 | "source": [ 28 | "# Plot parameters\n", 29 | "sns.set()\n", 30 | "%pylab inline\n", 31 | "pylab.rcParams['figure.figsize'] = (4, 4)\n", 32 | "plt.rcParams['xtick.major.size'] = 0\n", 33 | "plt.rcParams['ytick.major.size'] = 0\n", 34 | "# Avoid inaccurate floating values (for inverse matrices in dot 
product for instance)\n", 35 | "# See https://stackoverflow.com/questions/24537791/numpy-matrix-inversion-rounding-errors\n", 36 | "np.set_printoptions(suppress=True)" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "execution_count": 4, 42 | "metadata": {}, 43 | "outputs": [ 44 | { 45 | "data": { 46 | "text/html": [ 47 | "" 74 | ], 75 | "text/plain": [ 76 | "" 77 | ] 78 | }, 79 | "metadata": {}, 80 | "output_type": "display_data" 81 | } 82 | ], 83 | "source": [ 84 | "%%html\n", 85 | "" 112 | ] 113 | }, 114 | { 115 | "cell_type": "markdown", 116 | "metadata": {}, 117 | "source": [ 118 | "$$\n", 119 | "\\newcommand\\bs[1]{\\boldsymbol{#1}}\n", 120 | "$$" 121 | ] 122 | }, 123 | { 124 | "cell_type": "markdown", 125 | "metadata": {}, 126 | "source": [ 127 | "\n", 128 | " This content is part of a series following the chapter 2 on linear algebra from the [Deep Learning Book](http://www.deeplearningbook.org/) by Goodfellow, I., Bengio, Y., and Courville, A. (2016). It aims to provide intuitions/drawings/python code on mathematical theories and is constructed as my understanding of these concepts. You can check the syllabus in the [introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/).\n", 129 | "" 130 | ] 131 | }, 132 | { 133 | "cell_type": "markdown", 134 | "metadata": {}, 135 | "source": [ 136 | "# Introduction\n", 137 | "\n", 138 | "We have seen some interesting kinds of matrices in [2.3](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.3-Identity-and-Inverse-Matrices/). In this chapter we will see other types of vectors and matrices. It is not a long chapter, but it is important for understanding the ones that follow.\n", 139 | "\n", 140 | "# 2.6 Special Kinds of Matrices and Vectors" 141 | ] 142 | }, 143 | { 144 | "cell_type": "markdown", 145 | "metadata": {}, 146 | "source": [ 147 | "![Diagonal and symmetric matrices](images/diagonal-and-symmetric-matrices.png)\n", 148 | "*Example of diagonal and symmetric matrices*\n", 149 | "\n", 150 | "\n", 151 | "# Diagonal matrices" 152 | ] 153 | }, 154 | { 155 | "cell_type": "markdown", 156 | "metadata": {}, 157 | "source": [ 158 | "![Example of a diagonal matrix](images/diagonal-matrix.png)\n", 159 | "*Example of a diagonal matrix*\n", 160 | "\n", 161 | "A matrix $\\bs{A}$ is diagonal if all of its entries $A_{i,j}$ are zero except those on the main diagonal (where $i=j$).\n", 162 | "\n", 163 | "### Example 1.\n", 164 | "\n", 165 | "$$\n", 166 | "\\bs{D}=\n", 167 | "\\begin{bmatrix}\n", 168 | "    2 & 0 & 0 & 0\\\\\\\\\n", 169 | "    0 & 4 & 0 & 0\\\\\\\\\n", 170 | "    0 & 0 & 3 & 0\\\\\\\\\n", 171 | "    0 & 0 & 0 & 1\n", 172 | "\\end{bmatrix}\n", 173 | "$$\n", 174 | "\n", 175 | "In this case the matrix is also square, but there are also non-square diagonal matrices.\n", 176 | "\n", 177 | "### Example 2.\n", 178 | "\n", 179 | "$$\n", 180 | "\\bs{D}=\n", 181 | "\\begin{bmatrix}\n", 182 | "    2 & 0 & 0\\\\\\\\\n", 183 | "    0 & 4 & 0\\\\\\\\\n", 184 | "    0 & 0 & 3\\\\\\\\\n", 185 | "    0 & 0 & 0\n", 186 | "\\end{bmatrix}\n", 187 | "$$\n", 188 | "\n", 189 | "Or\n", 190 | "\n", 191 | "$$\n", 192 | "\\bs{D}=\n", 193 | "\\begin{bmatrix}\n", 194 | "    2 & 0 & 0 & 0\\\\\\\\\n", 195 | "    0 & 4 & 0 & 0\\\\\\\\\n", 196 | "    0 & 0 & 3 & 0\n", 197 | "\\end{bmatrix}\n", 198 | "$$\n", 199 | "\n", 200 | "A diagonal matrix can be denoted $\\text{diag}(\\bs{v})$, where $\\bs{v}$ is the vector containing the diagonal values.\n", 201 | "\n", 202 | "### Example 3.\n", 203 | "\n", 204 | "$$\n", 205 | "\\bs{D}=\n", 206 | "\\begin{bmatrix}\n", 207 | "    2 & 0 & 0 & 0\\\\\\\\\n", 208 | "    0 & 4 & 0 & 0\\\\\\\\\n", 209 | "    0 & 0 & 3 & 0\\\\\\\\\n", 210 | "    0 & 0 & 0 & 1\n", 211 | "\\end{bmatrix}\n", 212 |
"$$\n", 213 | "\n", 214 | "In this matrix, $\\bs{v}$ is the following vector:\n", 215 | "\n", 216 | "$$\n", 217 | "\\bs{v}=\n", 218 | "\\begin{bmatrix}\n", 219 | " 2\\\\\\\\\n", 220 | " 4\\\\\\\\\n", 221 | " 3\\\\\\\\\n", 222 | " 1\n", 223 | "\\end{bmatrix}\n", 224 | "$$\n", 225 | "\n", 226 | "The Numpy function `diag()` can be used to create square diagonal matrices:" 227 | ] 228 | }, 229 | { 230 | "cell_type": "code", 231 | "execution_count": 35, 232 | "metadata": {}, 233 | "outputs": [ 234 | { 235 | "data": { 236 | "text/plain": [ 237 | "array([[2, 0, 0, 0],\n", 238 | " [0, 4, 0, 0],\n", 239 | " [0, 0, 3, 0],\n", 240 | " [0, 0, 0, 1]])" 241 | ] 242 | }, 243 | "execution_count": 35, 244 | "metadata": {}, 245 | "output_type": "execute_result" 246 | } 247 | ], 248 | "source": [ 249 | "v = np.array([2, 4, 3, 1])\n", 250 | "np.diag(v)" 251 | ] 252 | }, 253 | { 254 | "cell_type": "markdown", 255 | "metadata": {}, 256 | "source": [ 257 | "The mutliplication between a diagonal matrix and a vector is thus just a ponderation of each element of the vector by $v$:\n", 258 | "\n", 259 | "### Example 4.\n", 260 | "\n", 261 | "$$\n", 262 | "\\bs{D}=\n", 263 | "\\begin{bmatrix}\n", 264 | " 2 & 0 & 0 & 0\\\\\\\\\n", 265 | " 0 & 4 & 0 & 0\\\\\\\\\n", 266 | " 0 & 0 & 3 & 0\\\\\\\\\n", 267 | " 0 & 0 & 0 & 1\n", 268 | "\\end{bmatrix}\n", 269 | "$$\n", 270 | "\n", 271 | "and\n", 272 | "\n", 273 | "$$\n", 274 | "\\bs{x}=\n", 275 | "\\begin{bmatrix}\n", 276 | " 3\\\\\\\\\n", 277 | " 2\\\\\\\\\n", 278 | " 2\\\\\\\\\n", 279 | " 7\n", 280 | "\\end{bmatrix}\n", 281 | "$$\n", 282 | "\n", 283 | "$$\n", 284 | "\\begin{align*}\n", 285 | "&\\bs{Dx}=\n", 286 | "\\begin{bmatrix}\n", 287 | " 2 & 0 & 0 & 0\\\\\\\\\n", 288 | " 0 & 4 & 0 & 0\\\\\\\\\n", 289 | " 0 & 0 & 3 & 0\\\\\\\\\n", 290 | " 0 & 0 & 0 & 1\n", 291 | "\\end{bmatrix} \\times\n", 292 | "\\begin{bmatrix}\n", 293 | " 3\\\\\\\\\n", 294 | " 2\\\\\\\\\n", 295 | " 2\\\\\\\\\n", 296 | " 7\n", 297 | "\\end{bmatrix}\\\\\\\\\n", 298 | "&=\\begin{bmatrix}\n", 299 | " 2\\times3 + 0\\times2 + 0\\times2 + 0\\times7\\\\\\\\\n", 300 | " 0\\times3 + 4\\times2 + 0\\times2 + 0\\times7\\\\\\\\\n", 301 | " 0\\times3 + 0\\times2 + 3\\times2 + 0\\times7\\\\\\\\\n", 302 | " 0\\times3 + 0\\times2 + 0\\times2 + 1\\times7\n", 303 | "\\end{bmatrix}\\\\\\\\\n", 304 | "&=\n", 305 | "\\begin{bmatrix}\n", 306 | " 2\\times3\\\\\\\\\n", 307 | " 4\\times2\\\\\\\\\n", 308 | " 3\\times2\\\\\\\\\n", 309 | " 1\\times7\n", 310 | "\\end{bmatrix}\n", 311 | "\\end{align*}\n", 312 | "$$" 313 | ] 314 | }, 315 | { 316 | "cell_type": "markdown", 317 | "metadata": {}, 318 | "source": [ 319 | "Non square matrices have the same properties:\n", 320 | "\n", 321 | "### Example 5.\n", 322 | "\n", 323 | "$$\n", 324 | "\\bs{D}=\n", 325 | "\\begin{bmatrix}\n", 326 | " 2 & 0 & 0\\\\\\\\\n", 327 | " 0 & 4 & 0\\\\\\\\\n", 328 | " 0 & 0 & 3\\\\\\\\\n", 329 | " 0 & 0 & 0\n", 330 | "\\end{bmatrix}\n", 331 | "$$\n", 332 | "\n", 333 | "and\n", 334 | "\n", 335 | "$$\n", 336 | "\\bs{x}=\n", 337 | "\\begin{bmatrix}\n", 338 | " 3\\\\\\\\\n", 339 | " 2\\\\\\\\\n", 340 | " 2\n", 341 | "\\end{bmatrix}\n", 342 | "$$\n", 343 | "\n", 344 | "$$\n", 345 | "\\bs{Dx}=\n", 346 | "\\begin{bmatrix}\n", 347 | " 2 & 0 & 0\\\\\\\\\n", 348 | " 0 & 4 & 0\\\\\\\\\n", 349 | " 0 & 0 & 3\\\\\\\\\n", 350 | " 0 & 0 & 0\n", 351 | "\\end{bmatrix}\n", 352 | "\\times\n", 353 | "\\begin{bmatrix}\n", 354 | " 3\\\\\\\\\n", 355 | " 2\\\\\\\\\n", 356 | " 2\n", 357 | "\\end{bmatrix}\n", 358 | "=\n", 359 | "\\begin{bmatrix}\n", 360 | " 2\\times3\\\\\\\\\n", 361 
| " 4\\times2\\\\\\\\\n", 362 | " 3\\times2\\\\\\\\\n", 363 | " 0\n", 364 | "\\end{bmatrix}\n", 365 | "$$" 366 | ] 367 | }, 368 | { 369 | "cell_type": "markdown", 370 | "metadata": {}, 371 | "source": [ 372 | "The invert of a square diagonal matrix exists if all entries of the diagonal are non-zeros. If it is the case, the invert is easy to find. Also, the inverse doen't exist if the matrix is non-square.\n", 373 | "\n", 374 | "$$\n", 375 | "\\bs{D}=\n", 376 | "\\begin{bmatrix}\n", 377 | " 2 & 0 & 0 & 0\\\\\\\\\n", 378 | " 0 & 4 & 0 & 0\\\\\\\\\n", 379 | " 0 & 0 & 3 & 0\\\\\\\\\n", 380 | " 0 & 0 & 0 & 1\n", 381 | "\\end{bmatrix}\n", 382 | "$$\n", 383 | "\n", 384 | "$$\n", 385 | "\\bs{D}^{-1}=\n", 386 | "\\begin{bmatrix}\n", 387 | " \\frac{1}{2} & 0 & 0 & 0\\\\\\\\\n", 388 | " 0 & \\frac{1}{4} & 0 & 0\\\\\\\\\n", 389 | " 0 & 0 & \\frac{1}{3} & 0\\\\\\\\\n", 390 | " 0 & 0 & 0 & \\frac{1}{1}\n", 391 | "\\end{bmatrix}\n", 392 | "$$\n", 393 | "\n", 394 | "$$\n", 395 | "\\bs{D}=\n", 396 | "\\begin{bmatrix}\n", 397 | " 2 & 0 & 0 & 0\\\\\\\\\n", 398 | " 0 & 4 & 0 & 0\\\\\\\\\n", 399 | " 0 & 0 & 3 & 0\\\\\\\\\n", 400 | " 0 & 0 & 0 & 1\n", 401 | "\\end{bmatrix}\n", 402 | "\\begin{bmatrix}\n", 403 | " \\frac{1}{2} & 0 & 0 & 0\\\\\\\\\n", 404 | " 0 & \\frac{1}{4} & 0 & 0\\\\\\\\\n", 405 | " 0 & 0 & \\frac{1}{3} & 0\\\\\\\\\n", 406 | " 0 & 0 & 0 & \\frac{1}{1}\n", 407 | "\\end{bmatrix}=\n", 408 | "\\begin{bmatrix}\n", 409 | " 1 & 0 & 0 & 0\\\\\\\\\n", 410 | " 0 & 1 & 0 & 0\\\\\\\\\n", 411 | " 0 & 0 & 1 & 0\\\\\\\\\n", 412 | " 0 & 0 & 0 & 1\n", 413 | "\\end{bmatrix}\n", 414 | "$$\n", 415 | "\n", 416 | "Let's check with Numpy that the multiplication of the matrix with its invert gives us the identity matrix:" 417 | ] 418 | }, 419 | { 420 | "cell_type": "code", 421 | "execution_count": 36, 422 | "metadata": {}, 423 | "outputs": [ 424 | { 425 | "data": { 426 | "text/plain": [ 427 | "array([[2, 0, 0, 0],\n", 428 | " [0, 4, 0, 0],\n", 429 | " [0, 0, 3, 0],\n", 430 | " [0, 0, 0, 1]])" 431 | ] 432 | }, 433 | "execution_count": 36, 434 | "metadata": {}, 435 | "output_type": "execute_result" 436 | } 437 | ], 438 | "source": [ 439 | "A = np.array([[2, 0, 0, 0], [0, 4, 0, 0], [0, 0, 3, 0], [0, 0, 0, 1]])\n", 440 | "A" 441 | ] 442 | }, 443 | { 444 | "cell_type": "code", 445 | "execution_count": 37, 446 | "metadata": {}, 447 | "outputs": [ 448 | { 449 | "data": { 450 | "text/plain": [ 451 | "array([[ 0.5 , 0. , 0. , 0. ],\n", 452 | " [ 0. , 0.25 , 0. , 0. ],\n", 453 | " [ 0. , 0. , 0.33333333, 0. ],\n", 454 | " [ 0. , 0. , 0. , 1. ]])" 455 | ] 456 | }, 457 | "execution_count": 37, 458 | "metadata": {}, 459 | "output_type": "execute_result" 460 | } 461 | ], 462 | "source": [ 463 | "A_inv = np.array([[1/2., 0, 0, 0], [0, 1/4., 0, 0], [0, 0, 1/3., 0], [0, 0, 0, 1/1.]])\n", 464 | "A_inv" 465 | ] 466 | }, 467 | { 468 | "cell_type": "code", 469 | "execution_count": 38, 470 | "metadata": {}, 471 | "outputs": [ 472 | { 473 | "data": { 474 | "text/plain": [ 475 | "array([[ 1., 0., 0., 0.],\n", 476 | " [ 0., 1., 0., 0.],\n", 477 | " [ 0., 0., 1., 0.],\n", 478 | " [ 0., 0., 0., 1.]])" 479 | ] 480 | }, 481 | "execution_count": 38, 482 | "metadata": {}, 483 | "output_type": "execute_result" 484 | } 485 | ], 486 | "source": [ 487 | "A.dot(A_inv)" 488 | ] 489 | }, 490 | { 491 | "cell_type": "markdown", 492 | "metadata": {}, 493 | "source": [ 494 | "Great! 
497 | { 498 | "cell_type": "markdown", 499 | "metadata": {}, 500 | "source": [ 501 | "# Symmetric matrices" 502 | ] 503 | }, 504 | { 505 | "cell_type": "markdown", 506 | "metadata": {}, 507 | "source": [ 508 | "![Illustration of a symmetric matrix](images/symmetric-matrix.png)\n", 509 | "*Illustration of a symmetric matrix*\n", 510 | "\n", 511 | "The matrix $\\bs{A}$ is symmetric if it is equal to its transpose:\n", 512 | "    \n", 513 | "$$\n", 514 | "\\bs{A} = \\bs{A}^\\text{T}\n", 515 | "$$\n", 516 | "\n", 517 | "This concerns only square matrices.\n", 518 | "\n", 519 | "### Example 6.\n", 520 | "\n", 521 | "$$\n", 522 | "\\bs{A}=\n", 523 | "\\begin{bmatrix}\n", 524 | "    2 & 4 & -1\\\\\\\\\n", 525 | "    4 & -8 & 0\\\\\\\\\n", 526 | "    -1 & 0 & 3\n", 527 | "\\end{bmatrix}\n", 528 | "$$" 529 | ] 530 | }, 531 | { 532 | "cell_type": "code", 533 | "execution_count": 39, 534 | "metadata": {}, 535 | "outputs": [ 536 | { 537 | "data": { 538 | "text/plain": [ 539 | "array([[ 2,  4, -1],\n", 540 | "       [ 4, -8,  0],\n", 541 | "       [-1,  0,  3]])" 542 | ] 543 | }, 544 | "execution_count": 39, 545 | "metadata": {}, 546 | "output_type": "execute_result" 547 | } 548 | ], 549 | "source": [ 550 | "A = np.array([[2, 4, -1], [4, -8, 0], [-1, 0, 3]])\n", 551 | "A" 552 | ] 553 | }, 554 | { 555 | "cell_type": "code", 556 | "execution_count": 40, 557 | "metadata": {}, 558 | "outputs": [ 559 | { 560 | "data": { 561 | "text/plain": [ 562 | "array([[ 2,  4, -1],\n", 563 | "       [ 4, -8,  0],\n", 564 | "       [-1,  0,  3]])" 565 | ] 566 | }, 567 | "execution_count": 40, 568 | "metadata": {}, 569 | "output_type": "execute_result" 570 | } 571 | ], 572 | "source": [ 573 | "A.T" 574 | ] 575 | }, 576 | { 577 | "cell_type": "markdown", 578 | "metadata": {}, 579 | "source": [ 580 | "# Unit vectors\n", 581 | "\n", 582 | "A unit vector is a vector whose length (norm) is equal to 1. It can be denoted by a letter with a hat: $\\hat{u}$" 583 | ] 584 | }, 585 | { 586 | "cell_type": "markdown", 587 | "metadata": {}, 588 | "source": [ 589 | "# Orthogonal vectors\n", 590 | "\n", 591 | "Two orthogonal vectors are separated by a 90° angle; the dot product of two orthogonal vectors is 0.\n", 592 | "\n", 593 | "### Example 7."
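As a quick numeric sanity check of this claim (a minimal sketch, using the same two vectors that are plotted and worked out just below):

```python
import numpy as np

u = np.array([2, 2])
v = np.array([2, -2])
print(u.dot(v))  # 0 -> u and v are orthogonal
```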
594 | ] 595 | }, 596 | { 597 | "cell_type": "code", 598 | "execution_count": 41, 599 | "metadata": {}, 600 | "outputs": [ 601 | { 602 | "data": { 603 | "image/png": "(base64 PNG output omitted — quiver plot of the orthogonal vectors u = (2, 2) and v = (2, -2))\n", 604 | "text/plain": [ 605 | "" 606 | ] 607 | }, 608 | "metadata": {}, 609 | "output_type": "display_data" 610 | } 611 | ], 612 | "source": [ 613 | "x = [0,0,2,2]\n", 614 | "y = [0,0,2,-2]\n", 615 | "\n", 616 | "plt.quiver([x[0], y[0]],\n", 617 | "           [x[1], y[1]],\n", 618 | "           [x[2], y[2]],\n", 619 | "           [x[3], y[3]],\n", 620 | "           angles='xy', scale_units='xy', scale=1)\n", 621 | "\n", 622 | "plt.xlim(-2, 4)\n", 623 | "plt.ylim(-3, 3)\n", 624 | "plt.axvline(x=0, color='grey')\n", 625 | "plt.axhline(y=0, color='grey')\n", 626 | "\n", 627 | "plt.text(1, 1.5, r'$\\vec{u}$', size=18)\n", 628 | "plt.text(1.5, -1, r'$\\vec{v}$', size=18)\n", 629 | "\n", 630 | "plt.show()\n", 631 | "plt.close()" 632 | ] 633 | }, 634 | { 635 | "cell_type": "markdown", 636 | "metadata": {}, 637 | "source": [ 638 | "$$\n", 639 | "\\bs{x}= \n", 640 |
"\\begin{bmatrix}\n", 641 | " 2\\\\\\\\\n", 642 | " 2\n", 643 | "\\end{bmatrix}\n", 644 | "$$\n", 645 | "\n", 646 | "and\n", 647 | "\n", 648 | "$$\n", 649 | "\\bs{y}=\n", 650 | "\\begin{bmatrix}\n", 651 | " 2\\\\\\\\\n", 652 | " -2\n", 653 | "\\end{bmatrix}\n", 654 | "$$\n", 655 | "\n", 656 | "$$\n", 657 | "\\bs{x^\\text{T}y}=\n", 658 | "\\begin{bmatrix}\n", 659 | " 2 & 2\n", 660 | "\\end{bmatrix}\n", 661 | "\\begin{bmatrix}\n", 662 | " 2\\\\\\\\\n", 663 | " -2\n", 664 | "\\end{bmatrix}=\n", 665 | "\\begin{bmatrix}\n", 666 | " 2\\times2 + 2\\times-2\n", 667 | "\\end{bmatrix}=0\n", 668 | "$$\n", 669 | "\n", 670 | "In addition, when the norm of orthogonal vectors is the unit norm they are called **orthonormal**." 671 | ] 672 | }, 673 | { 674 | "cell_type": "markdown", 675 | "metadata": {}, 676 | "source": [ 677 | "\n", 678 | " It is impossible to have more than $n$ vectors mutually orthogonal in $\\mathbb{R}^n$.\n", 679 | "\n", 680 | "\n", 681 | "It is impossible to have more than $n$ vectors mutually orthogonal in $\\mathbb{R}^n$. For instance try to draw 3 vectors in a 2-dimensional space ($\\mathbb{R}^2$) that are mutually orthogonal...\n" 682 | ] 683 | }, 684 | { 685 | "cell_type": "markdown", 686 | "metadata": {}, 687 | "source": [ 688 | "# Orthogonal matrices\n", 689 | "\n", 690 | "Orthogonal matrices are important because they have interesting properties. A matrix is orthogonal if columns are mutually orthogonal and have a unit norm (orthonormal) and rows are mutually orthonormal and have unit norm. \n", 691 | "\n", 692 | "\n", 693 | "\"Under\n", 694 | "Under the hood of an orthogonal matrix\n", 695 | "\n", 696 | "$$\n", 697 | "\\bs{A}= \n", 698 | "\\begin{bmatrix}\n", 699 | " A_{1,1} & A_{1,2}\\\\\\\\\n", 700 | " A_{2,1} & A_{2,2}\n", 701 | "\\end{bmatrix}\n", 702 | "$$\n", 703 | "\n", 704 | "This means that \n", 705 | "\n", 706 | "$$\n", 707 | "\\begin{bmatrix}\n", 708 | " A_{1,1}\\\\\\\\\n", 709 | " A_{2,1}\n", 710 | "\\end{bmatrix}\n", 711 | "$$\n", 712 | "\n", 713 | "and \n", 714 | "\n", 715 | "$$\n", 716 | "\\begin{bmatrix}\n", 717 | " A_{1,2}\\\\\\\\\n", 718 | " A_{2,2}\n", 719 | "\\end{bmatrix}\n", 720 | "$$\n", 721 | "\n", 722 | "are orthogonal vectors and also that the rows\n", 723 | "\n", 724 | "$$\n", 725 | "\\begin{bmatrix}\n", 726 | " A_{1,1} & A_{1,2}\n", 727 | "\\end{bmatrix}\n", 728 | "$$\n", 729 | "\n", 730 | "and\n", 731 | "\n", 732 | "$$\n", 733 | "\\begin{bmatrix}\n", 734 | " A_{2,1} & A_{2,2}\n", 735 | "\\end{bmatrix}\n", 736 | "$$\n", 737 | "\n", 738 | "are orthogonal vectors (cf. above for definition of orthogonal vectors)." 
739 | ] 740 | }, 741 | { 742 | "cell_type": "markdown", 743 | "metadata": {}, 744 | "source": [ 745 | "## Property 1: $\\bs{A^\\text{T}A}=\\bs{I}$\n", 746 | "\n", 747 | "\n", 748 | "An orthogonal matrix has this property:\n", 749 | "\n", 750 | "$$\n", 751 | "\\bs{A^\\text{T}A}=\\bs{AA^\\text{T}}=\\bs{I}\n", 752 | "$$\n", 753 | "\n", 754 | "We can see why this statement is true with the following reasoning:\n", 755 | "\n", 756 | "Let's take the following matrix:\n", 757 | "\n", 758 | "$$\n", 759 | "\\bs{A}=\\begin{bmatrix}\n", 760 | "    a & b\\\\\\\\\n", 761 | "    c & d\n", 762 | "\\end{bmatrix}\n", 763 | "$$\n", 764 | "\n", 765 | "and thus\n", 766 | "\n", 767 | "$$\n", 768 | "\\bs{A}^\\text{T}=\\begin{bmatrix}\n", 769 | "    a & c\\\\\\\\\n", 770 | "    b & d\n", 771 | "\\end{bmatrix}\n", 772 | "$$\n", 773 | "\n", 774 | "Let's do the product:\n", 775 | "\n", 776 | "$$\n", 777 | "\\begin{align*}\n", 778 | "&\\bs{A^\\text{T}A}=\\begin{bmatrix}\n", 779 | "    a & c\\\\\\\\\n", 780 | "    b & d\n", 781 | "\\end{bmatrix}\n", 782 | "\\begin{bmatrix}\n", 783 | "    a & b\\\\\\\\\n", 784 | "    c & d\n", 785 | "\\end{bmatrix}\n", 786 | "=\n", 787 | "\\begin{bmatrix}\n", 788 | "    aa + cc & ab + cd\\\\\\\\\n", 789 | "    ab + cd & bb + dd\n", 790 | "\\end{bmatrix}\\\\\\\\\n", 791 | "&=\n", 792 | "\\begin{bmatrix}\n", 793 | "    a^2 + c^2 & ab + cd\\\\\\\\\n", 794 | "    ab + cd & b^2 + d^2\n", 795 | "\\end{bmatrix}\n", 796 | "\\end{align*}\n", 797 | "$$\n", 798 | "\n", 799 | "We saw in [2.5](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.5-Norms/) that the squared $L^2$ norm of the vector $\\begin{bmatrix}\n", 800 | "    a & c\n", 801 | "\\end{bmatrix}$ is equal to $a^2+c^2$. In addition, the columns of $\\bs{A}$ have a unit norm because $\\bs{A}$ is orthogonal. This means that $a^2+c^2=1$ and $b^2+d^2=1$. 
So we now have:\n", 802 | "\n", 803 | "$$\n", 804 | "\\bs{A^\\text{T}A}=\n", 805 | "\\begin{bmatrix}\n", 806 | "    1 & ab + cd\\\\\\\\\n", 807 | "    ab + cd & 1\n", 808 | "\\end{bmatrix}\n", 809 | "$$\n", 810 | "\n", 811 | "Also, $ab+cd$ corresponds to the dot product of $\\begin{bmatrix}\n", 812 | "    a & c\n", 813 | "\\end{bmatrix}$ and $\\begin{bmatrix}\n", 814 | "    b & d\n", 815 | "\\end{bmatrix}$:\n", 816 | "\n", 817 | "$$\n", 818 | "\\begin{bmatrix}\n", 819 | "    a & c\n", 820 | "\\end{bmatrix}\n", 821 | "\\begin{bmatrix}\n", 822 | "    b\\\\\\\\\n", 823 | "    d\n", 824 | "\\end{bmatrix}\n", 825 | "=\n", 826 | "ab+cd\n", 827 | "$$\n", 828 | "\n", 829 | "And we know that the columns are orthogonal, which means that:\n", 830 | "\n", 831 | "$$\n", 832 | "\\begin{bmatrix}\n", 833 | "    a & c\n", 834 | "\\end{bmatrix}\n", 835 | "\\begin{bmatrix}\n", 836 | "    b\\\\\\\\\n", 837 | "    d\n", 838 | "\\end{bmatrix}=0\n", 839 | "$$\n", 840 | "\n", 841 | "We thus have the identity matrix:\n", 842 | "\n", 843 | "$$\n", 844 | "\\bs{A^\\text{T}A}=\\begin{bmatrix}\n", 845 | "    1 & 0\\\\\\\\\n", 846 | "    0 & 1\n", 847 | "\\end{bmatrix}\n", 848 | "$$" 849 | ] 850 | },
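The 2×2 computation above can also be checked symbolically. A minimal sketch with SymPy (SymPy is not used elsewhere in these notebooks — it is brought in here only for the symbolic check):

```python
from sympy import symbols, Matrix

a, b, c, d = symbols('a b c d')
A = Matrix([[a, b],
            [c, d]])

print(A.T * A)
# Matrix([[a**2 + c**2, a*b + c*d], [a*b + c*d, b**2 + d**2]])
# With orthonormal columns (a**2 + c**2 = 1, b**2 + d**2 = 1, a*b + c*d = 0),
# this reduces to the 2x2 identity matrix, exactly as derived above.
```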
851 | { 852 | "cell_type": "markdown", 853 | "metadata": {}, 854 | "source": [ 855 | "## Property 2: $\\bs{A}^\\text{T}=\\bs{A}^{-1}$\n", 856 | "\n", 857 | "We can show that if $\\bs{A^\\text{T}A}=\\bs{I}$ then $\n", 858 | "\\bs{A}^\\text{T}=\\bs{A}^{-1}$.\n", 859 | "\n", 860 | "If we multiply each side of the equation $\\bs{A^\\text{T}A}=\\bs{I}$ on the right by $\\bs{A}^{-1}$, we have:\n", 861 | "\n", 862 | "$$\n", 863 | "(\\bs{A^\\text{T}A})\\bs{A}^{-1}=\\bs{I}\\bs{A}^{-1}\n", 864 | "$$\n", 865 | "\n", 866 | "Recall from [2.3](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.3-Identity-and-Inverse-Matrices/) that a matrix or vector doesn't change when it is multiplied by the identity matrix. So we have:\n", 867 | "\n", 868 | "$$\n", 869 | "(\\bs{A^\\text{T}A})\\bs{A}^{-1}=\\bs{A}^{-1}\n", 870 | "$$\n", 871 | "\n", 872 | "We also saw in [2.2](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.2-Multiplying-Matrices-and-Vectors/) that matrix multiplication is associative, so we can remove the parentheses:\n", 873 | "\n", 874 | "$$\n", 875 | "\\bs{A^\\text{T}A}\\bs{A}^{-1}=\\bs{A}^{-1}\n", 876 | "$$\n", 877 | "\n", 878 | "We also know that $\\bs{A}\\bs{A}^{-1}=\\bs{I}$ (see [2.3](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.3-Identity-and-Inverse-Matrices/)) so we can substitute:\n", 879 | "\n", 880 | "$$\n", 881 | "\\bs{A^\\text{T}}\\bs{I}=\\bs{A}^{-1}\n", 882 | "$$\n", 883 | "\n", 884 | "This shows that\n", 885 | "\n", 886 | "$$\\bs{A}^\\text{T}=\\bs{A}^{-1}$$\n", 887 | "\n", 888 | "For more details, you can refer to [this question](https://math.stackexchange.com/questions/1936020/why-is-the-inverse-of-an-orthogonal-matrix-equal-to-its-transpose).\n", 889 | "\n", 890 | "### Example 8.\n", 891 | "\n", 892 | "Sines and cosines are convenient for creating orthogonal matrices. Let's take the following rotation matrix (the angle $50$ is in radians, since Numpy's `cos()` and `sin()` expect radians):\n", 893 | "\n", 894 | "$$\n", 895 | "\\bs{A}= \n", 896 | "\\begin{bmatrix}\n", 897 | "    \\cos(50) & -\\sin(50)\\\\\\\\\n", 898 | "    \\sin(50) & \\cos(50)\n", 899 | "\\end{bmatrix}\n", 900 | "$$" 901 | ] 902 | }, 903 | { 904 | "cell_type": "code", 905 | "execution_count": 5, 906 | "metadata": {}, 907 | "outputs": [ 908 | { 909 | "data": { 910 | "text/plain": [ 911 | "array([[ 0.96496603,  0.26237485],\n", 912 | "       [-0.26237485,  0.96496603]])" 913 | ] 914 | }, 915 | "execution_count": 5, 916 | "metadata": {}, 917 | "output_type": "execute_result" 918 | } 919 | ], 920 | "source": [ 921 | "A = np.array([[np.cos(50), -np.sin(50)], [np.sin(50), np.cos(50)]])\n", 922 | "A" 923 | ] 924 | }, 925 | { 926 | "cell_type": "code", 927 | "execution_count": 21, 928 | "metadata": {}, 929 | "outputs": [], 930 | "source": [ 931 | "col0 = A[:, [0]]\n", 932 | "col1 = A[:, [1]]\n", 933 | "row0 = A[0].reshape(A.shape[1], 1)\n", 934 | "row1 = A[1].reshape(A.shape[1], 1)" 935 | ] 936 | }, 937 | { 938 | "cell_type": "markdown", 939 | "metadata": {}, 940 | "source": [ 941 | "Let's check that the rows and the columns are orthogonal:" 942 | ] 943 | }, 944 | { 945 | "cell_type": "code", 946 | "execution_count": 44, 947 | "metadata": {}, 948 | "outputs": [ 949 | { 950 | "data": { 951 | "text/plain": [ 952 | "array([[ 0.]])" 953 | ] 954 | }, 955 | "execution_count": 44, 956 | "metadata": {}, 957 | "output_type": "execute_result" 958 | } 959 | ], 960 | "source": [ 961 | "col0.T.dot(col1)" 962 | ] 963 | }, 964 | { 965 | "cell_type": "code", 966 | "execution_count": 45, 967 | "metadata": {}, 968 | "outputs": [ 969 | { 970 | "data": { 971 | "text/plain": [ 972 | "array([[ 0.]])" 973 | ] 974 | }, 975 | "execution_count": 45, 976 | "metadata": {}, 977 | "output_type": "execute_result" 978 | } 979 | ], 980 | "source": [ 981 | "row0.T.dot(row1)" 982 | ] 983 | }, 984 | { 985 | "cell_type": "markdown", 986 | "metadata": {}, 987 | "source": [ 988 | "Let's check that\n", 989 | "\n", 990 | "$$\n", 991 | "\\bs{A^\\text{T}A}=\\bs{AA^\\text{T}}=\\bs{I}\n", 992 | "$$\n", 993 | "\n", 994 | "and thus\n", 995 | "\n", 996 | "$$\n", 997 | "\\bs{A}^\\text{T}=\\bs{A}^{-1}\n", 998 | "$$" 999 | ] 1000 | }, 1001 | { 1002 | "cell_type": "code", 1003 | "execution_count": 46, 1004 | "metadata": {}, 1005 | "outputs": [ 1006 | { 1007 | "data": { 1008 | "text/plain": [ 1009 | "array([[ 1.,  0.],\n", 1010 | "       [ 0.,  1.]])" 1011 | ] 1012 | }, 1013 | "execution_count": 46, 1014 | "metadata": {}, 1015 | "output_type": "execute_result" 1016 | } 1017 | ], 1018 | "source": [ 1019 | "A.T.dot(A)" 1020 | ] 1021 | }, 1022 | { 1023 | "cell_type": "code", 1024 | "execution_count": 47, 1025 | "metadata": {}, 1026 | "outputs": [ 1027 | { 1028 | "data": { 1029 | "text/plain": [ 1030 | "array([[ 0.96496603, -0.26237485],\n", 1031 | "       [ 0.26237485,  0.96496603]])" 1032 | ] 1033 | }, 1034 | "execution_count": 47, 1035 | "metadata": {}, 1036 | "output_type": "execute_result" 1037 | } 1038 | ], 1039 | "source": [ 1040 | "A.T" 1041 | ] 1042 | }, 1043 | { 1044 | "cell_type": "code", 1045 | "execution_count": 48, 1046 | "metadata": {}, 1047 | "outputs": [ 1048 | { 1049 | "data": { 1050 | "text/plain": [ 1051 | "array([[ 0.96496603, -0.26237485],\n", 1052 | "       [ 0.26237485,  0.96496603]])" 1053 | ] 1054 | }, 1055 | "execution_count": 48, 1056 | "metadata": {}, 1057 | "output_type": "execute_result" 1058 | } 1059 | ], 1060 | "source": [ 1061 | "np.linalg.inv(A)" 1062 | ] 1063 | }, 1064 | { 1065 | "cell_type": "markdown", 1066 | "metadata": {}, 1067 | "source": [
| "Everything is correct!\n", 1069 | "\n", 1070 | "# Conclusion\n", 1071 | "\n", 1072 | "In this chapter we saw different interesting type of matrices with specific properties. It is generally useful to recall them while we deal with this kind of matrices.\n", 1073 | "\n", 1074 | "In the next chapter we will saw a central idea in linear algebra: the eigendecomposition. Keep reading!" 1075 | ] 1076 | }, 1077 | { 1078 | "cell_type": "markdown", 1079 | "metadata": {}, 1080 | "source": [ 1081 | "\n", 1082 | " Feel free to drop me an email or a comment. The syllabus of this series can be found [in the introduction post](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/). All the notebooks can be found on [Github](https://github.com/hadrienj/deepLearningBook-Notes).\n", 1083 | "" 1084 | ] 1085 | }, 1086 | { 1087 | "cell_type": "markdown", 1088 | "metadata": {}, 1089 | "source": [ 1090 | "# References\n", 1091 | "\n", 1092 | "## Inverse and transpose of orthogonal matrix\n", 1093 | "\n", 1094 | "- https://math.stackexchange.com/questions/1936020/why-is-the-inverse-of-an-orthogonal-matrix-equal-to-its-transpose\n", 1095 | "\n", 1096 | "- https://dyinglovegrape.wordpress.com/2010/11/30/the-inverse-of-an-orthogonal-matrix-is-its-transpose/" 1097 | ] 1098 | } 1099 | ], 1100 | "metadata": { 1101 | "kernelspec": { 1102 | "display_name": "Python 2", 1103 | "language": "python", 1104 | "name": "python2" 1105 | }, 1106 | "language_info": { 1107 | "codemirror_mode": { 1108 | "name": "ipython", 1109 | "version": 2 1110 | }, 1111 | "file_extension": ".py", 1112 | "mimetype": "text/x-python", 1113 | "name": "python", 1114 | "nbconvert_exporter": "python", 1115 | "pygments_lexer": "ipython2", 1116 | "version": "2.7.10" 1117 | }, 1118 | "varInspector": { 1119 | "cols": { 1120 | "lenName": 16, 1121 | "lenType": 16, 1122 | "lenVar": 40 1123 | }, 1124 | "kernels_config": { 1125 | "python": { 1126 | "delete_cmd_postfix": "", 1127 | "delete_cmd_prefix": "del ", 1128 | "library": "var_list.py", 1129 | "varRefreshCmd": "print(var_dic_list())" 1130 | }, 1131 | "r": { 1132 | "delete_cmd_postfix": ") ", 1133 | "delete_cmd_prefix": "rm(", 1134 | "library": "var_list.r", 1135 | "varRefreshCmd": "cat(var_dic_list()) " 1136 | } 1137 | }, 1138 | "types_to_exclude": [ 1139 | "module", 1140 | "function", 1141 | "builtin_function_or_method", 1142 | "instance", 1143 | "_Feature" 1144 | ], 1145 | "window_display": false 1146 | } 1147 | }, 1148 | "nbformat": 4, 1149 | "nbformat_minor": 2 1150 | } 1151 | -------------------------------------------------------------------------------- /2.6 Special Kinds of Matrices and Vectors/images/diagonal-and-symmetric-matrices.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.6 Special Kinds of Matrices and Vectors/images/diagonal-and-symmetric-matrices.png -------------------------------------------------------------------------------- /2.6 Special Kinds of Matrices and Vectors/images/diagonal-matrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.6 Special Kinds of Matrices and Vectors/images/diagonal-matrix.png -------------------------------------------------------------------------------- /2.6 Special Kinds of Matrices and Vectors/images/orthogonal-matrix.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.6 Special Kinds of Matrices and Vectors/images/orthogonal-matrix.png -------------------------------------------------------------------------------- /2.6 Special Kinds of Matrices and Vectors/images/symmetric-matrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.6 Special Kinds of Matrices and Vectors/images/symmetric-matrix.png -------------------------------------------------------------------------------- /2.7 Eigendecomposition/images/output_59_0.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.7 Eigendecomposition/images/output_59_0.png -------------------------------------------------------------------------------- /2.7 Eigendecomposition/images/quadratic-functions-indefinite-form.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.7 Eigendecomposition/images/quadratic-functions-indefinite-form.png -------------------------------------------------------------------------------- /2.7 Eigendecomposition/images/quadratic-functions-negative-definite-form.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.7 Eigendecomposition/images/quadratic-functions-negative-definite-form.png -------------------------------------------------------------------------------- /2.7 Eigendecomposition/images/quadratic-functions-positive-definite-form.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.7 Eigendecomposition/images/quadratic-functions-positive-definite-form.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/SVD_image_dim.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/SVD_image_dim.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/dimensions-reconstruction-image-singular-value-decomposition.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/dimensions-reconstruction-image-singular-value-decomposition.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/non-square-matrix-change-dimensions.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value 
Decomposition/images/non-square-matrix-change-dimensions.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/output_35_7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/output_35_7.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/rescaled-circle-rotated.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/rescaled-circle-rotated.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/singular-value-decomposition-understanding-dimensions.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/singular-value-decomposition-understanding-dimensions.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/singular-value-decomposition.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/singular-value-decomposition.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/transformation-vector-by-matrix.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/transformation-vector-by-matrix.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/unit-circle-transformation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/unit-circle-transformation.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/unit-circle-transformation1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/unit-circle-transformation1.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/unit-circle.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/unit-circle.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/images/unit-vectors-rotation.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/images/unit-vectors-rotation.png -------------------------------------------------------------------------------- /2.8 Singular Value Decomposition/test_svd.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.8 Singular Value Decomposition/test_svd.jpg -------------------------------------------------------------------------------- /2.9 The Moore-Penrose Pseudoinverse/images/dataset-representation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.9 The Moore-Penrose Pseudoinverse/images/dataset-representation.png -------------------------------------------------------------------------------- /2.9 The Moore-Penrose Pseudoinverse/images/linear-regression-r.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.9 The Moore-Penrose Pseudoinverse/images/linear-regression-r.png -------------------------------------------------------------------------------- /2.9 The Moore-Penrose Pseudoinverse/images/overdetermined-system-equations-python.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/2.9 The Moore-Penrose Pseudoinverse/images/overdetermined-system-equations-python.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/all_dice.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/all_dice.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/area-under-curve-derivative.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/area-under-curve-derivative.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/area_under_curve_more_1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/area_under_curve_more_1.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/intro_image.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density 
Functions/images/intro_image.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/marginal-probabilities-empty.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/marginal-probabilities-empty.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/marginal-probabilities.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/marginal-probabilities.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/mass.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/mass.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/negative-and-positive-covariance.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/negative-and-positive-covariance.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/probability-density-function-area-under-the-curve-1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/probability-density-function-area-under-the-curve-1.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/probability-density-function-area-under-the-curve-2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/probability-density-function-area-under-the-curve-2.png -------------------------------------------------------------------------------- /3.1-3.3 Probability Mass and Density Functions/images/probability-density-function.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.1-3.3 Probability Mass and Density Functions/images/probability-density-function.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/bivariate-gaussian-curves.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 
Marginal and Conditional Probability/images/bivariate-gaussian-curves.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/conditional-probability.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/conditional-probability.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/inside-probability.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/inside-probability.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/integral-probability.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/integral-probability.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/intro.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/intro.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/marginal-probabilities-empty.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/marginal-probabilities-empty.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/marginal-probabilities.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/marginal-probabilities.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/negative-and-positive-covariance.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/negative-and-positive-covariance.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/sum-rule-example.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/sum-rule-example.png 
-------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/sum_rule_1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/sum_rule_1.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/sum_rule_2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/sum_rule_2.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/sum_rule_3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/sum_rule_3.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/sum_rule_4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/sum_rule_4.png -------------------------------------------------------------------------------- /3.4-3.5 Marginal and Conditional Probability/images/summary-mathematical-notation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/3.4-3.5 Marginal and Conditional Probability/images/summary-mathematical-notation.png -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | The MIT License (MIT) 2 | 3 | Copyright (c) 2016 redVi 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
--------------------------------------------------------------------------------
/deep-learning-book-goodfellow-cover.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hadrienj/deepLearningBook-Notes/f8f489bd4945bf64509986c0fd378bee61a6706a/deep-learning-book-goodfellow-cover.jpg
--------------------------------------------------------------------------------
/readme.md:
--------------------------------------------------------------------------------
Cover of the deep learning book by Goodfellow, Bengio and Courville

**The Deep Learning Book - Goodfellow, I., Bengio, Y., and Courville, A. (2016)**

This content is part of a series following Chapter 2 on linear algebra from the [Deep Learning Book](http://www.deeplearningbook.org/) by Goodfellow, I., Bengio, Y., and Courville, A. (2016). It aims to provide intuitions, drawings and Python code to accompany the mathematical theories, and reflects my own understanding of these concepts.

# Boost your data science skills. Learn linear algebra.

I'd like to introduce a series of blog posts and their corresponding Python notebooks gathering notes on [the Deep Learning Book](http://www.deeplearningbook.org/) by Ian Goodfellow, Yoshua Bengio, and Aaron Courville (2016). The aim of these notebooks is to help beginners and advanced beginners grasp the linear algebra concepts underlying deep learning and machine learning. Acquiring these skills can boost your ability to understand and apply data science algorithms: in my opinion, linear algebra is one of the bedrocks of machine learning, deep learning and data science.

These notes cover Chapter 2, on linear algebra. I liked this chapter because it gives a sense of what is most used in machine learning and deep learning. It is thus a great syllabus for anyone who wants to dive into deep learning and acquire the linear algebra concepts needed to better understand deep learning algorithms.

You can find all the articles [here](https://hadrienj.github.io).

# Getting started with linear algebra

The goal of this series is to provide content for beginners who want to understand enough linear algebra to be comfortable with machine learning and deep learning. However, I think that the linear algebra chapter of the [Deep Learning Book](http://www.deeplearningbook.org/) is a bit tough for beginners, so I decided to produce code, examples and drawings for each part of this chapter, adding the steps that may not be obvious to newcomers. I also think that you can convey as much information and knowledge through examples as through general definitions, and that illustrations are a way to see the big picture of an idea. Finally, coding is a great tool to experiment with these abstract mathematical notions: along with pen and paper, it gives you one more way to push your understanding further.

Graphical representation is also very helpful to understand linear algebra, so I tried to bind the concepts with plots (and the code to produce them). The representation I liked most while doing this series is the idea that you can see any matrix as a linear transformation of space. In several chapters we will extend this idea and see how it helps to understand eigendecomposition, the Singular Value Decomposition (SVD) and Principal Components Analysis (PCA).
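To make the "matrix as a transformation" idea concrete right away, here is a minimal sketch (assuming only that Numpy is installed) that applies a matrix to the two unit vectors of the plane and shows how it stretches space:

```python
import numpy as np

# A 2x2 matrix seen as a linear transformation of the plane
A = np.array([[2, 0],
              [0, 1]])

# The unit vectors i and j
i = np.array([1, 0])
j = np.array([0, 1])

# Applying A stretches i by a factor of 2 and leaves j unchanged
print(A @ i)  # [2 0]
print(A @ j)  # [0 1]
```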
# The use of Python/Numpy

In addition, I noticed that creating and reading examples is really helpful to understand the theory. This is why I built Python notebooks. The goal is twofold:

1. To provide a starting point for using Python/Numpy to apply linear algebra concepts. Since the final goal is to use linear algebra concepts for data science, it seems natural to continuously go back and forth between theory and code. All you will need is a working Python installation with the major mathematical libraries: Numpy/Scipy/Matplotlib.

2. To give a more concrete vision of the underlying concepts. I found it hugely useful to play and experiment with these notebooks in order to build my understanding of somewhat complicated theoretical concepts or notations. I hope that reading them will be just as useful for you.

# Syllabus

The syllabus follows the [Deep Learning Book](http://www.deeplearningbook.org/) exactly, so you can go back to the book for more details whenever a specific point is unclear. Here is a short description of the content:

1. [Scalars, Vectors, Matrices and Tensors](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.1-Scalars-Vectors-Matrices-and-Tensors/)

An example of a scalar, a vector, a matrix and a tensor

**Difference between a scalar, a vector, a matrix and a tensor**

A light introduction to vectors, matrices, the transpose and basic operations (addition of vectors and matrices). It also introduces Numpy functions and ends with a word on broadcasting.

2. [Multiplying Matrices and Vectors](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.2-Multiplying-Matrices-and-Vectors/)

An example of how to calculate the dot product

**The dot product explained**

This chapter is mainly about the dot product (vector and/or matrix multiplication). We will also see some of its properties. Then we will see how to write a system of linear equations in matrix notation. This is a major building block for the following chapters.

3. [Identity and Inverse Matrices](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.3-Identity-and-Inverse-Matrices/)

Example of an identity matrix

**An identity matrix**

We will see two important matrices: the identity matrix and the inverse matrix. We will see why they are important in linear algebra and how to use them with Numpy. Finally, we will see an example of how to solve a system of linear equations with the inverse matrix.

4. [Linear Dependence and Span](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.4-Linear-Dependence-and-Span/)

Examples of systems of equations with 0, 1 and an infinite number of solutions

**A system of equations has no solution, 1 solution or an infinite number of solutions**

In this chapter we will continue to study systems of linear equations. We will see that such a system has either no solution, exactly one solution or infinitely many solutions: it can never have, say, exactly two. We will see the intuition, the graphical representation and the proof behind this statement. Then we will go back to the matrix form of the system and consider what Gilbert Strang calls the *row figure* (we are looking at the rows, that is to say multiple equations) and the *column figure* (looking at the columns, that is to say the linear combination of the coefficients). We will also see what a linear combination is, illustrated in the short sketch below.
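As a quick illustration (just a sketch with made-up numbers, assuming only Numpy), a linear combination is nothing more than a weighted sum of vectors:

```python
import numpy as np

u = np.array([1, 2])
v = np.array([3, -1])

# The linear combination 2u + 3v: scale each vector, then add
w = 2 * u + 3 * v
print(w)  # [11  1]
```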
Finally, we will see examples of overdetermined and underdetermined systems of equations.

5. [Norms](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.5-Norms/)

Representation of the squared L2 norm in 3 dimensions

**Shape of a squared L2 norm in 3 dimensions**

The norm of a vector is a function that takes a vector as input and outputs a positive value. It can be thought of as the *length* of the vector. It is, for example, used to evaluate the distance between the prediction of a model and the actual value. We will see different kinds of norms ($L^0$, $L^1$, $L^2$...) with examples.

6. [Special Kinds of Matrices and Vectors](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.6-Special-Kinds-of-Matrices-and-Vectors/)

Example of a diagonal matrix and of a symmetric matrix

**A diagonal (left) and a symmetric matrix (right)**

We saw in [2.3](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.3-Identity-and-Inverse-Matrices/) some special matrices that are very interesting. We will see other types of vectors and matrices in this chapter. It is not a big chapter, but it is important for understanding the ones that follow.

7. [Eigendecomposition](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.7-Eigendecomposition/)

We will see some major concepts of linear algebra in this chapter. We will start by building some intuition about eigenvectors and eigenvalues. We will see that a matrix can be seen as a linear transformation, and that applying a matrix to its eigenvectors gives new vectors pointing in the same direction. Then we will see how to express quadratic equations in matrix form. We will see that the eigendecomposition of the matrix corresponding to a quadratic equation can be used to find its minimum and maximum. As a bonus, we will also see how to visualize linear transformations in Python!

8. [Singular Value Decomposition](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.8-Singular-Value-Decomposition/)

We will see another way to decompose matrices: the Singular Value Decomposition, or SVD. Since the beginning of this series I have emphasized that you can see matrices as linear transformations of space. With the SVD, you decompose a matrix into three other matrices. We will see that these new matrices can be viewed as *sub-transformations* of the space: instead of doing the transformation in one movement, we decompose it into three movements. As a bonus, we will apply the SVD to image processing and see its effect on an example image of Lucy the goose. So keep on reading!

9. [The Moore-Penrose Pseudoinverse](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.9-The-Moore-Penrose-Pseudoinverse/)

We saw that not all matrices have an inverse. That is unfortunate, because the inverse is used to solve systems of equations. In some cases a system of equations has no solution, and thus the inverse doesn't exist. However, it can be useful to find a value that is almost a solution (in the sense of minimizing the error). This can be done with the pseudoinverse! We will see, for instance, how to find the best-fit line for a set of data points with the pseudoinverse, as in the sketch below.
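To give a taste of it, here is a minimal sketch (the data points are made up for the example, and only Numpy is assumed) that fits a line with `np.linalg.pinv`:

```python
import numpy as np

# Made-up points lying roughly on the line y = 2x + 1
x = np.array([0., 1., 2., 3.])
y = np.array([1.1, 2.9, 5.2, 6.8])

# Design matrix: one column for the slope, one column of ones for the intercept
X = np.column_stack([x, np.ones_like(x)])

# The pseudoinverse gives the least-squares solution of X w = y
w = np.linalg.pinv(X) @ y
print(w)  # [slope, intercept], close to [2, 1]
```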
10. [The Trace Operator](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.10-The-Trace-Operator/)

Calculating the trace of a matrix

**The trace of a matrix**

We will see what the trace of a matrix is. It will be needed for the last chapter on Principal Components Analysis (PCA).

11. [The Determinant](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.11-The-determinant/)

Comparison of positive and negative determinant

**Link between the determinant of a matrix and the transformation associated with it**

This chapter is about the determinant of a matrix. This special number can tell us a lot of things about our matrix!

12. [Example: Principal Components Analysis](https://hadrienj.github.io/posts/Deep-Learning-Book-Series-2.12-Example-Principal-Components-Analysis/)

Mechanism of the gradient descent algorithm

**Gradient descent**

This is the last chapter of this series on linear algebra! It is about Principal Components Analysis (PCA). We will use some of the knowledge acquired in the preceding chapters to understand this important data analysis tool!

# Requirements

This content is aimed at beginners, but it helps to have at least some experience with mathematics.

# Enjoy

I hope that you will find something interesting in this series. I tried to be as accurate as I could. If you find errors, misunderstandings or typos, please report them! You can send me an email, or open issues and pull requests on the notebooks' GitHub repository.

# References

Goodfellow, I., Bengio, Y., & Courville, A. (2016). *Deep Learning*. MIT Press.
--------------------------------------------------------------------------------