├── docs
│   ├── _config.yml
│   ├── 000-Intro_to_Study_Group.pdf
│   ├── 001-3b1b_Revision
│   │   ├── Images
│   │   │   ├── Lect2_1.png
│   │   │   ├── Lect2_2.png
│   │   │   ├── Lect3_1.png
│   │   │   ├── Lect3_2.png
│   │   │   ├── Lect4_1.png
│   │   │   ├── Lect5_1.png
│   │   │   ├── Lect6_1.png
│   │   │   ├── Lect6_2.png
│   │   │   ├── Lect6_3.png
│   │   │   ├── Lect7_1.png
│   │   │   ├── Lect7_2.png
│   │   │   ├── Lect9_1.png
│   │   │   ├── Lect9_2.png
│   │   │   ├── Lect9_3.png
│   │   │   ├── Lect9_4.png
│   │   │   ├── Lect9_5.png
│   │   │   ├── Det_Proof.png
│   │   │   ├── Lect10_1.png
│   │   │   ├── Lect10_2.png
│   │   │   ├── Lect10_3.png
│   │   │   ├── Lect10_4.png
│   │   │   ├── Lect11_1.png
│   │   │   ├── Lect11_2.png
│   │   │   ├── Lect11_3.png
│   │   │   ├── Lect12_1.png
│   │   │   ├── Lect12_2.png
│   │   │   ├── Lect13_1.png
│   │   │   ├── Lect13_2.png
│   │   │   ├── Lect13_3.png
│   │   │   ├── Lect13_4.png
│   │   │   ├── Lect13_5.png
│   │   │   ├── Lect13_6.png
│   │   │   ├── Lect13_7.png
│   │   │   ├── Lect14_1.png
│   │   │   ├── Lect14_2.png
│   │   │   ├── Lect14_3.png
│   │   │   ├── Lect14_4.png
│   │   │   ├── Lect14_5.png
│   │   │   ├── Lect14_6.png
│   │   │   ├── Lect14_7.png
│   │   │   ├── Lect14_8.png
│   │   │   ├── Lect14_9.png
│   │   │   ├── Lect14_10.png
│   │   │   ├── Lect14_11.png
│   │   │   ├── Lect15_1.png
│   │   │   └── Lect15_2.png
│   │   ├── 001A.md
│   │   ├── 001B.md
│   │   └── 001C.md
│   ├── 002-MIT_1to4
│   │   ├── Images
│   │   │   ├── col_picture.PNG
│   │   │   └── row_picture.PNG
│   │   └── 002A.md
│   ├── index.md
│   ├── 003-MIT_5to10
│   │   ├── 003A.md
│   │   └── 003B.md
│   ├── 005-MIT_11to20
│   │   ├── 005A.md
│   │   └── 005B.md
│   └── 006-MIT_21to25
│       └── 006.md
└── Schedule.md
/docs/_config.yml:
--------------------------------------------------------------------------------
1 | theme: jekyll-theme-cayman
--------------------------------------------------------------------------------
/docs/000-Intro_to_Study_Group.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/000-Intro_to_Study_Group.pdf
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect2_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect2_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect2_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect2_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect3_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect3_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect3_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect3_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect4_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect4_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect5_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect5_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect6_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect6_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect6_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect6_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect6_3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect6_3.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect7_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect7_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect7_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect7_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect9_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect9_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect9_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect9_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect9_3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect9_3.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect9_4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect9_4.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect9_5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect9_5.png
--------------------------------------------------------------------------------
/docs/002-MIT_1to4/Images/col_picture.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/002-MIT_1to4/Images/col_picture.PNG
--------------------------------------------------------------------------------
/docs/002-MIT_1to4/Images/row_picture.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/002-MIT_1to4/Images/row_picture.PNG
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Det_Proof.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Det_Proof.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect10_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect10_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect10_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect10_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect10_3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect10_3.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect10_4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect10_4.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect11_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect11_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect11_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect11_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect11_3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect11_3.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect12_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect12_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect12_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect12_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect13_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect13_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect13_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect13_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect13_3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect13_3.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect13_4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect13_4.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect13_5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect13_5.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect13_6.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect13_6.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect13_7.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect13_7.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_10.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_10.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_11.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_11.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_2.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_3.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_4.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_5.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_6.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_6.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_7.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_7.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_8.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_8.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect14_9.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect14_9.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect15_1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect15_1.png
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/Images/Lect15_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/SRA-VJTI/linear-algebra-study-group/HEAD/docs/001-3b1b_Revision/Images/Lect15_2.png
--------------------------------------------------------------------------------
/docs/002-MIT_1to4/002A.md:
--------------------------------------------------------------------------------
1 | ## Geometry Of Linear Equations
2 |
3 |
4 | Consider the given set of equations:
5 | * `2x - y = 0`
6 | * `-x + 2y = 3`
7 | * Row Picture
8 |
9 |
10 | This visualization is generally used for solving systems of two equations in two unknowns: each row is a line, and the solution is where the lines intersect.
11 | 
12 |
13 | * Column Picture
14 |
15 | This provides better intuition when dealing with higher dimensions or more complex systems: `Ax` is a linear combination of the columns of A, and we ask which combination produces `b`.
16 | 
17 |
18 |
19 | - *NOTE* : If the 3 columns are linearly dependent, then their combinations are restricted to the particular plane (or line) the columns span, so a solution exists only when `b` lies in that plane.
20 |
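A minimal NumPy sketch of both pictures for this system (illustrative, not part of the original notes); `A` and `b` are just the two equations above in matrix form:

```python
import numpy as np

# 2x - y = 0  and  -x + 2y = 3, written as Ax = b
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
b = np.array([0.0, 3.0])

# Row picture: each row is a line; the solution is their intersection
x = np.linalg.solve(A, b)
print(x)  # [1. 2.]  ->  x = 1, y = 2

# Column picture: the same solution combines the columns of A to produce b
print(np.allclose(x[0] * A[:, 0] + x[1] * A[:, 1], b))  # True
```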
--------------------------------------------------------------------------------
/docs/index.md:
--------------------------------------------------------------------------------
1 | # Linear Algebra
2 |
3 | * Lecture 0 - [Intro to LA Study Group](000-Intro_to_Study_Group.pdf)
4 | * Lecture 1 - 3b1b Revision [Part1](001-3b1b_Revision/001A.md), [Part2](001-3b1b_Revision/001B.md), [Practice Problems](001-3b1b_Revision/001C.md)
5 | * Lecture 2 - [Pictures, Vector Space, Elimination, Multiplication, Inverse, LU](002-MIT_1to4/002A.md)
6 | * Lecture 3 - MIT Lecture 5 to 10
7 | * [Permutations, Vector Space & Sub Space, Computing Null Space, RREF](003-MIT_5to10/003A.md)
8 | * [Solving `Ax = b`, Rank, Linear Independence, Span, Basis, Four Subspaces](003-MIT_5to10/003B.md)
9 | * Lecture 5 - MIT Lecture 11 to 20
10 | * [Rank1 matrices, Graphs, Orthogonality, Projections, Least Squares](005-MIT_11to20/005A.md)
11 | * [Straight line fitting, Gram-Schmidt, Determinants, Cofactors, Tridiagonal matrices, Inverse](005-MIT_11to20/005B.md)
12 | * Lecture 6 - MIT Lecture 21 to 25
13 | * [Eigen Values and Eigen Vectors, Diagonalization, Markov matrices, Positive Definite Matrices](006-MIT_21to25/006.md)
--------------------------------------------------------------------------------
/Schedule.md:
--------------------------------------------------------------------------------
1 | # Study Group Timeline
2 |
3 | * [3b1b](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)
4 | * [MIT 18.06 LA](https://www.youtube.com/playlist?list=PL221E2BBF13BECF6C)
5 |
6 | | Date | Topic | Notes | Lecturer |
7 | | :------------------: | :---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------: | :---: | :------------------: |
8 | | 12-10-2020 Monday | Introduction to the Course | | Saurabh Gupta |
9 | | 17-10-2020 Saturday | 3b1b - Essence of linear algebra | | Shamit and Shubham |
10 | | 25-10-2020 Sunday | 18.06 Pictures, Vector Space, Elimination, Multiplication, Inverse, LU | | Shantanu and Abhinav |
11 | | 04-11-2020 Wednesday | 18.06 Computing solutions and null spaces, Span, Basis, 4 fundamental subspaces | | Saharsh |
12 | | 11-11-2020 Wednesday | Problem Solving and assignment discussion | | Saharsh |
13 | | 16-01-2021 Saturday | 18.06 Rank1 matrices, Graphs, Orthogonality, Projections, Least Squares | | Saharsh |
14 | | 07-08-2021 Saturday | Straight line fitting, Gram-Schmidt, Determinants, Cofactors, Tridiagonal matrices, Inverse, Eigen Values and Eigen Vectors, Diagonalization, Markov matrices, Positive Definite Matrices | | Saharsh |
15 |
--------------------------------------------------------------------------------
/docs/003-MIT_5to10/003A.md:
--------------------------------------------------------------------------------
1 | # 5 - Permutations, Vector Spaces and Sub Spaces
2 |
3 | * Permutation matrices execute row exchanges
4 | * MATLAB does this to choose a larger pivot, improving numerical accuracy
5 | * Number of permutation matrices possible for an n×n matrix = n! (_arrangements of rows_)
6 | * `PA = LU`
7 | * P is an `orthogonal` matrix, i.e. P⁻¹ = Pᵀ, and its columns are orthonormal unit vectors
8 |
9 | * AᵀA is always symmetric: (AᵀA)ᵀ = Aᵀ(Aᵀ)ᵀ = AᵀA
10 |
11 | * **Vector Spaces** => set of vectors which satisfy closure property on linear combination
12 | * Linear combination = k₁v₁ + k₂v₂
13 | * where the kᵢ are scalars and the vᵢ are vectors from that set
14 | * Closure = after taking a linear combination, the resultant vector must belong to the same set
15 | * Hence the `zero (null)` vector must be present in any vector space (take all kᵢ = 0)
16 |
17 | * **Sub Spaces** => subset of a vector space which itself satisfies the above closure property
18 | * R² has as subspaces: R² itself, any line (R¹) through the origin, and the origin alone (R⁰)
19 |
20 | * **Column Space** => Span of columns of a matrix
21 |
22 |
23 | # 6 - Solutions of Ax=b, Null Space
24 |
25 | * If P and L are two subspaces
26 | * Their `intersection` is always a subspace
27 | * Their `union` may or may not be a subspace (it is one only when one of them contains the other)
28 |
29 | * Does Ax=b have a solution for every b?
30 | * No, only those `b` which lie in C(A) have a solution
31 | * Since `Ax` is nothing but a linear combination of the columns of A, it can only reach their span
32 |
33 | * **Null Space** => set of vectors satisfying `Ax = 0`
34 | * Always a subspace
35 |
36 | * Every solution of `Ax = b` can be expressed as **x = xₚ + c·xₛ**
37 | * xₚ = particular solution
38 | * xₛ = special solution, Axₛ = 0
39 | * A(xₚ + c·xₛ) = Axₚ + c·Axₛ = b + 0 = b
40 |
41 |
42 | # 7 - Computing Null Space, Pivot/Free variables, Reduced Row Echelon form
43 |
44 | * `pivot` = first non zero element in every row after elimination
45 | * `pivot columns` = columns containing pivot
46 | * `pivot variables` = variables corresponding to pivot columns
47 | * `free columns` = columns without pivot
48 | * `free variables` = variables corresponding to free columns
49 |
50 | > add image of matrix elimination process
51 |
52 | * **Reduced row echelon form (rref)** = pivot should be `1` and elements above pivot should be `0`
53 |
54 | > add image of rref matrix
55 |
56 | * R in block form (pivot columns first, then free columns):
57 |
58 | | pivot cols | free cols |
59 | | --- | --- |
60 | | I | F |
61 | | 0 | 0 |
63 |
64 |
65 | * `RN = 0`, where the columns of `N` (the null space matrix) are the special solutions
66 |
67 | > add image of N = [x_pivot ; x_free] = [-F ; I]
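A small SymPy sketch of the pivot/free/RREF story above; the matrix is a hypothetical example with two pivot columns and one free column:

```python
from sympy import Matrix

A = Matrix([[1, 2, 2],
            [2, 4, 6],
            [3, 6, 8]])

R, pivot_cols = A.rref()   # reduced row echelon form + pivot column indices
print(R)                   # pivots are 1, entries above/below the pivots are 0
print(pivot_cols)          # (0, 2): columns 1 and 3 are pivot columns, column 2 is free

# Special solutions of Ax = 0, one per free column; together they span N(A)
for n in A.nullspace():
    print(n.T, (A * n).T)  # A*n is the zero vector
```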
--------------------------------------------------------------------------------
/docs/003-MIT_5to10/003B.md:
--------------------------------------------------------------------------------
1 | # 8 - Complete solution of `Ax = b`, rank of matrix
2 |
3 | * > add image of Ax = b
4 |
5 | * For x_particular => set all free variables to `0` and solve for the pivot variables
6 |
7 | * **Rank** =>
8 | * Number of pivots
9 | * Number of independent columns/rows
10 | * Dimension of column space
11 | * r <= `min`(m, n)
12 | * m = rows, n = columns
13 |
14 | | Condition | Solution | Comment |
15 | | :------------: | :-----------: | :-------------------------------------------: |
16 | | r = n < m | 0 or 1 | No free variables; the null space contains only the zero vector |
17 | | r = m < n | Infinite | Every row has pivot, `n-r` free variables |
18 | | r = m = n | Unique | Invertible matrix |
19 | | r < m && r < n | 0 or Infinite | depends on `b` |
20 |
21 |
22 | # 9 - Linear Independence, Span, Basis
23 |
24 | * **Independent vectors** => Linear combination is 0 only if all scalars are 0
25 | * **Span** => vectors v₁, v₂, ..., vₙ span a space if the whole space can be generated using linear combinations of these vectors
26 | * **Basis** => Set of vectors that span a space and are linearly independent
27 | * just enough vectors to span the space, with none to spare
28 | * for a given space there exist infinitely many bases, but every basis consists of the same number of vectors (equal to the dimension of the space)
29 |
30 |
31 | # 10 - Four fundamental subspaces
32 |
33 | For a matrix `A`
34 |
35 | | Subspace | Description | Dimension and ambient space |
36 | | :----------------------------------: | :--------------------------------: | :------------------------------: |
37 | | **Column Space C(A)** | combinations of columns of A | dimension r, inside Rᵐ |
38 | | **Null Space N(A)** | all solutions of `Ax = 0` | dimension n - r, inside Rⁿ |
39 | | **Row Space C(Aᵀ)** | combinations of rows of A | dimension r, inside Rⁿ |
40 | | **Left Null Space N(Aᵀ)** | all solutions of `Aᵀy = 0` | dimension m - r, inside Rᵐ |
41 |
42 | > add image showing orthogonality between the subspaces
43 |
44 | > add image of matrix R with elimination steps
45 |
46 | * Basis of `row` space of A ==> first `r` rows of R or A
47 | * Basis of `column` space of A ==> pivot columns of `A`
48 | * Basis of `null` space of A ==> special solution of A
49 | * Basis of `left null` space of A ==> transform `[A | I] ---> [R | E]` and look for the combinations of rows (the rows of E next to the zero rows of R) which give a zero row
50 |
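A SymPy sketch that computes bases for all four subspaces at once; the rank-1 example matrix is hypothetical:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])    # m = 2, n = 3, rank r = 1

print(A.rank())            # 1
print(A.columnspace())     # basis of C(A):   r vectors in R^2 (pivot columns)
print(A.rowspace())        # basis of C(A^T): r vectors in R^3 (rows of R)
print(A.nullspace())       # basis of N(A):   n - r = 2 special solutions
print(A.T.nullspace())     # basis of N(A^T): m - r = 1 vector
```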
51 |
52 |
53 | ### Do all 3x3 matrices form a vector space?
54 |
55 |
56 | View Answer
57 |
58 | > _Yes_
59 |
60 |
61 | ### What are its subspaces?
62 |
63 |
64 | View Answer
65 |
66 | > _upper triangular, symmetric, diagonal..._
67 |
--------------------------------------------------------------------------------
/docs/005-MIT_11to20/005B.md:
--------------------------------------------------------------------------------
1 | # 16 - Projections and Least square fitting of straight line
2 |
3 | * if `b` in column space of `A`, then Pb = b
4 | * if `b` ⊥ column space, then Pb = 0, since b is in the left null space N(Aᵀ)
5 |
6 | * Show example of fitting a straight line (see the sketch at the end of this section)
7 |
8 | * If A has independent columns, then AᵀA is invertible
9 | * To prove the above statement, show that the null space of AᵀA contains only the zero vector
10 | * AᵀAx = 0
11 | * xᵀAᵀAx = 0
12 | * (Ax)ᵀ(Ax) = 0, i.e. ‖Ax‖² = 0
13 | * Ax = 0
14 | * x = 0 (since the columns of A are independent)
15 |
16 | * Columns are definitely independent if they are non-zero perpendicular vectors
17 |
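For the line-fitting example flagged above, here is a sketch using the classic three points (1, 1), (2, 2), (3, 2) and the normal equations AᵀAx̂ = Aᵀb (NumPy, illustrative):

```python
import numpy as np

# Fit y = C + D*t through (1, 1), (2, 2), (3, 2): Ax = b has no exact solution
t = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0])
A = np.column_stack([np.ones_like(t), t])   # columns: [1, t]

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations
print(x_hat)                                # [0.667 0.5] -> best line y = 2/3 + t/2

p = A @ x_hat                               # projection of b onto C(A)
e = b - p                                   # error vector
print(np.allclose(A.T @ e, 0))              # True: e is perpendicular to the columns
```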
18 |
19 | # 17 - Orthogonal basis, Orthogonal square matrix (Q), Gram-Schmidt (A --> Q)
20 |
21 | * Orthonormal vectors -
22 | * qᵢᵀqⱼ = 0, if i ≠ j
23 | * qᵢᵀqⱼ = 1, if i = j
24 |
25 | * QᵀQ = I
26 | * if Q is square - Qᵀ = Q⁻¹
27 | * What is made easier?
28 | * Projection matrix calculation
29 | * P = Q(QᵀQ)⁻¹Qᵀ = QQᵀ (= I if Q is square)
30 |
31 | * Gram-Schmidt (turn matrix A into Q)
32 | * [Diagram]
33 | * take independent vectors of A - a, b
34 | * make them orthogonal by subtracting projections
35 | * normalize to make them orthonormal
36 | * for multiple vectors - take each vector in turn and remove the components along all previous vectors (see the sketch below)
37 | * [Diagram]
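A minimal classical Gram-Schmidt sketch, assuming the columns of `A` are independent; `gram_schmidt` is a hypothetical helper written for illustration, not a library routine:

```python
import numpy as np

def gram_schmidt(A):
    """Turn the independent columns of A into orthonormal columns of Q."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):                     # remove components along previous q's
            v = v - (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)        # normalize
    return Q

A = np.array([[1.0, 1.0], [1.0, 0.0], [1.0, 2.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(2)))         # True: Q^T Q = I
```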
38 |
39 |
40 | # 18 - Determinants, Properties
41 |
42 | 1. det(I) = 1
43 | 2. row exchange reverses the sign of determinant
44 | 3. [Diagram]
45 | 1. A multiplier common to one row can be factored out, and the determinant gets multiplied by that same factor
46 | 2. Determinant can be split with respect to one row
47 | 4. 2 equal or dependent rows --> det = 0
48 | 5. Subtract ℓ × rowᵢ from rowₖ --> determinant doesn't change
49 | 6. row of 0's --> det = 0
50 | 7. upper triangular --> det = product of diagonal entries
51 | 8. det(A) = 0 ↔ A is singular **OR** det(A) ≠ 0 ↔ A is invertible
52 | 9. det(AB) = det(A)*det(B)
53 | 10. det(Aᵀ) = det(A)
54 |
55 | **All the above properties also hold for columns**
56 |
57 | * determinant of an odd-sized skew-symmetric matrix = 0 (use rule `10`: det(A) = det(Aᵀ) = det(-A) = (-1)ⁿ det(A))
58 |
59 |
60 | # 19 - Determinant Formula, Cofactors formula, Tridiagonal matrices
61 |
62 | * Use rule 3.2 of determinant properties
63 | * n! terms
64 |
65 | * Cofactors - connect the n×n determinant to (n-1)×(n-1) determinants
66 | * det(A) = a₁₁C₁₁ + a₁₂C₁₂ + ..... + a₁ₙC₁ₙ (expansion along row 1)
67 |
68 | * Tridiagonal matrices [Diagram]
69 | * det(Aₙ) = det(Aₙ₋₁) - det(Aₙ₋₂)
70 |
71 |
72 | # 20 - Formula for A⁻¹, Cramer's rule, det(A) = Volume of box
73 |
74 | * ACᵀ = det(A) I
75 | * A⁻¹ = (1 / det(A)) Cᵀ .... **C = cofactor matrix**
76 | * rowᵢ · cofactorsⱼ = 0 .... **if i ≠ j**
77 | * Hint: identical rows
78 | * Cramer's Rule:
79 | * Used for solving **Ax = b**
80 | * x = A⁻¹b
81 | * x = (1 / det(A)) Cᵀb
82 | * x₁ = det(B₁) / det(A), x₂ = det(B₂) / det(A), ...
83 | * Bⱼ = matrix A with its j-th column replaced by `b`
84 | * Determinant of a 3x3 matrix = volume of the box (parallelepiped) spanned by the column vectors (refer to the 3b1b video for visualization)
85 |
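A small NumPy sketch of Cramer's rule as stated above (illustrative; `np.linalg.solve` is the practical choice, since determinants are expensive):

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b via x_j = det(B_j) / det(A)."""
    det_A = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        B_j = A.copy()
        B_j[:, j] = b                     # B_j = A with column j replaced by b
        x[j] = np.linalg.det(B_j) / det_A
    return x

A = np.array([[2.0, -1.0], [-1.0, 2.0]])
b = np.array([0.0, 3.0])
print(cramer_solve(A, b))                 # [1. 2.]
print(np.linalg.solve(A, b))              # same answer
```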
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/001C.md:
--------------------------------------------------------------------------------
1 | # Practice Problems
2 |
3 | ### 1. Which of the following is true for any matrix A?
4 | **_NOTE_: dim(C(A)) ==> dimension of column space of A, dim(N(A)) ==> dimension of null space of A**
5 | * rank(A) > dim(C(A))
6 | * dim(C(A)) = dim(N(A))
7 | * dim(N(A)) + dim(C(A)) = dimension of whole vector space
8 | * None
9 |
10 |
11 | View Answer
12 |
13 | > _dim(N(A)) + dim(C(A)) = dimension of the whole (input) vector space, i.e. n: the rank-nullity theorem, r + (n - r) = n_
14 |
15 |
16 | ### 2. Let A, B, C, D be n × n matrices. If ABCD = I, then B⁻¹ = ?
17 | * D⁻¹C⁻¹A⁻¹
18 | * CDA
19 | * ADC
20 | * None
21 | * Insufficient information
22 |
23 |
24 | View Answer
25 |
26 | > _CDA. Since the matrices are square and ABCD = I, det(A)det(B)det(C)det(D) = 1, so each matrix is invertible. Then BCD = A⁻¹, so BCDA = A⁻¹A = I, hence B⁻¹ = CDA_
27 |
28 |
29 | ### 3. Find a transformation matrix in 3D space that first rotates the space about the X axis by 270°, then rotates the result about its Y axis by 90°, and finally rotates the result about its Z axis by 180° (all rotations clockwise)
30 |
31 |
32 | View Answer
33 |
34 | Let A ==> Rotation about X by -270°
35 | B ==> Rotation about Y by -90°
36 | C ==> Rotation about Z by -180° (angles negative since the rotations are clockwise)
37 | A =
38 |
39 | | 1 | 0 | 0 |
40 | | :---: | :---: | :---: |
41 | | 0 | 0 | 1 |
42 | | 0 | -1 | 0 |
43 |
44 |
45 | B =
46 |
47 |
48 | | 0 | 0 | -1 |
49 | | :---: | :---: | :---: |
50 | | 0 | 1 | 0 |
51 | | 1 | 0 | 0 |
52 |
53 |
54 | C =
55 |
56 | | -1 | 0 | 0 |
57 | | :---: | :---: | :---: |
58 | | 0 | -1 | 0 |
59 | | 0 | 0 | 1 |
60 |
61 | Resultant transformation = **C × B × A**
62 | Think about why the matrices multiply in this right-to-left order (see the sketch below).
63 |
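A quick NumPy check of the composition order, using the three matrices above (illustrative): the rightmost factor acts first, so C × B × A applies A, then B, then C.

```python
import numpy as np

A = np.array([[1, 0, 0], [0, 0, 1], [0, -1, 0]])    # rotation about X
B = np.array([[0, 0, -1], [0, 1, 0], [1, 0, 0]])    # rotation about Y
C = np.array([[-1, 0, 0], [0, -1, 0], [0, 0, 1]])   # rotation about Z

M = C @ B @ A                      # composite transformation
v = np.array([1, 2, 3])
print(M @ v)                       # one-shot result
print(C @ (B @ (A @ v)))           # step-by-step (A first): identical result
```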
64 |
65 | ### 4. Do all natural numbers in the range (0, ∞) form a vector space or subspace? Justify
66 |
67 |
68 | View Answer
69 |
70 | > _No, Since zero vector is not present_
71 |
72 |
73 | ### 5. Do all real numbers in the range (-∞, 0] form a vector space or subspace? Justify
74 |
75 |
76 | View Answer
77 |
78 | > _For any set to be a vector space/subspace, there must be closure, i.e. the resultant vector after a
79 | linear combination must lie in the same set. Here the resultant can be positive (e.g. scaling a
80 | negative number by a negative scalar), hence the answer is NO_
81 |
82 |
83 | ### 6. Do all 2x2 matrices form a vector space? Justify
84 |
85 |
86 | View Answer
87 |
88 | > _Yes_
89 |
90 |
91 | ### 7. A transformation matrix squishes 4D space into a line. What is the rank of that matrix?
92 | * 0
93 | * 1
94 | * 2
95 | * 3
96 | * 4
97 |
98 |
99 | View Answer
100 |
101 | > _1 (since the output is a line, which is one-dimensional)_
102 |
103 |
104 | ### 8. How many bases exist for a 4D vector space, and how many vectors are there in each basis?
105 |
106 |
107 | View Answer
108 |
109 | > _Infinitely many bases; each basis consists of 4 vectors_
110 |
111 |
112 | ### 9. Prove that the determinant of the matrix A given below is det(A) = ad - bc
113 |
114 | | | |
115 | | --- | --- |
116 | | a | b |
117 | | c | d |
118 |
119 |
120 | View Answer
121 |
122 | 
123 | Area of the resultant parallelogram = (a+b)(c+d) - [2bc + ac/2 + ac/2 + bd/2 + bd/2] = ac + ad + bc + bd - 2bc - ac - bd = ad - bc
124 |
125 |
--------------------------------------------------------------------------------
/docs/005-MIT_11to20/005A.md:
--------------------------------------------------------------------------------
1 | # 11 - Bases of new vector spaces, Rank one matrices
2 |
3 | **All for 3x3 cases**
4 | * Basis - 9 dimensional (1 at each cell)
5 | * Basis of symmetric matrices - 6 dimensional
6 | * Basis of upper triangular matrices - 6 dimensional
7 | * Basis of (Symmetric ∩ Upper triangular) - 3 dimensional (since diagonal)
8 | * Basis of (Symmetric + Upper triangular) - 9 dimensional (the sum gives all 3x3 matrices; note the union itself is not a subspace, the sum is)
9 |
10 | * Every rank 1 matrix can be expressed as u·vᵀ
11 | * where `u` and `v` are column matrices
12 |
13 | * Let M be all 5x17 matrices with rank 4. Is M a subspace?
14 | * No, since the `0` matrix (rank 0) is not in M
15 |
16 |
17 | # 12 - Graphs and Networks, Incidence matrices, Kirchhoff's Law
18 |
19 | Consider this directed graph -
20 |
21 | * Here `m` = 5 edges, `n` = 4 nodes
22 | * `A` - Incidence matrix can be used to denote this graph
23 |
24 | * Let Null space solution of `A` be `x` -
25 |
26 | * `Ax` gives the potential differences across the edges (x = potentials at the nodes)
27 | * current flows only when there is a potential difference
28 |
29 | * Let `C` be a matrix which takes us from Potential Difference to Current through edges (Ohm's law)
30 | * `C` - Conductance matrix
31 | * Let the current vector be `y` -
32 |
33 | * Aᵀy = 0 - solving the left null space (Kirchhoff's current law)
34 | * solving the left null space just means the current coming `IN` to a node equals the current going `OUT` of it
35 | * can find a basis by assuming a current (`y`) through one edge and solving so that no charge accumulates
36 | * Dimension of the left null space = number of independent loops
37 | * Rank = number of nodes - 1
38 | * dim(N(Aᵀ)) = m - r
39 | * #loops = #edges - (#nodes - 1)
40 | * #nodes - #edges + #loops = 1 --> `Euler's Formula`
41 |
42 |
43 | # 13 - Quiz Review
44 |
45 |
46 |
47 | # 14 - Orthogonal vectors and subspaces
48 |
49 | * `x` and `y` are orthogonal if xᵀy = 0
50 | * Two subspaces are orthogonal if every vector in subspace 1 is perpendicular to every vector in subspace 2
51 | * row space is perpendicular to null space
52 | * column space is perpendicular to left null space
53 |
54 |
55 | # 15 - Projections, Least squares, Projection matrix
56 |
57 | * Why projections?
58 | * Because `Ax = b` may not have a solution
59 | * projection means changing `b` to closest vector in column space of `A`
60 | * Ax̂ = p
61 |
62 | ### Projection for line
63 |
64 | [Diagram]
65 |
66 | * a ⊥ e
67 | * aᵀe = 0
68 | * aᵀ(b - p) = 0
69 | * aᵀ(b - xa) = 0
70 | * x·aᵀa = aᵀb
71 | * **x = (aᵀb) / (aᵀa)** - scalar
72 | * **p = ax = a·(aᵀb) / (aᵀa)** - vector (the closest vector to `b` on the line)
73 |
74 | * Since `p` is the projection of `b`, it is governed by `b`, i.e. scaling `b` also scales `p`
75 | * So we can write `p` as:
76 | * **p = (a aᵀ / (aᵀa)) b**
77 | * here (a aᵀ / (aᵀa)) is known as the projection matrix (`P`)
78 |
79 | * Column space of `P` - the line through `a`
80 | * rank(P) = 1
81 | * P = Pᵀ .... symmetric
82 | * P² = P .... projecting onto the same line twice or thrice will not change the projection
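A NumPy sketch verifying these properties for projection onto a line; the vectors `a` and `b` are hypothetical examples:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])        # direction of the line
b = np.array([1.0, 1.0, 1.0])        # vector to project

P = np.outer(a, a) / (a @ a)         # P = a a^T / (a^T a)
p = P @ b                            # projection of b onto the line
e = b - p                            # error

print(np.allclose(a @ e, 0))         # True: e is perpendicular to a
print(np.allclose(P, P.T))           # True: P is symmetric
print(np.allclose(P @ P, P))         # True: projecting twice changes nothing
print(np.linalg.matrix_rank(P))      # 1
```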
83 |
84 | ### Projection for matrix
85 |
86 | [Diagram]
87 |
88 | * a₁ᵀe = 0 = a₂ᵀe
89 | * e = b - p = b - Ax̂
90 | * a₁ᵀ(b - Ax̂) = 0 = a₂ᵀ(b - Ax̂)
91 | * Combining both equations (writing them in matrix form)
92 | * Aᵀ(b - Ax̂) = 0
93 | * AᵀAx̂ = Aᵀb
94 |
95 | * **x̂ = (AᵀA)⁻¹Aᵀb**
96 | * **p = A(AᵀA)⁻¹Aᵀb**
97 | * Here the projection matrix `P` is A(AᵀA)⁻¹Aᵀ
98 |
99 | * If `A` is square (and invertible) - `P = I` .... since the column space is the whole space
100 | * P = Pᵀ .... symmetric
101 | * P² = P .... projecting twice or thrice will not change the projection
102 |
103 | ### Application - least square fitting by a line
104 |
105 | * fit points on a straight line
106 |
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/001B.md:
--------------------------------------------------------------------------------
1 | # **Dot Product and Duality**
2 | 
3 |
4 |
5 | - The dot product projects one vector onto the other and multiplies the projected length by the other vector's length. If the projection points in the opposite direction the dot product is negative, and if the vectors are perpendicular it is zero
6 |
7 | 
8 |
9 |
10 | - Order doesn't matter: it makes no difference which vector projects onto which
11 |
12 | 
13 |
14 |
15 | - Vector multiplication (a 1×2 matrix corresponds to a 2D vector)
16 |
17 | 
18 |
19 |
20 | 
21 |
22 |
23 | - Duality: a natural correspondence between two computations that look different. Every linear transformation from vectors to the number line corresponds to some vector in that space, and applying the transformation is the same as taking the dot product with that vector
24 |
25 | # **Cross Product**
26 | 
27 | - Take a vector v and move w to its tip, then repeat in the other order (take w and move v to its tip); the figure obtained is a parallelogram, and its area is given by the determinant
28 | - The +ve or -ve sign comes from the order of the vectors (as with î and ĵ, where ĵ sits anticlockwise, to the left of î)
29 |
30 | 
31 | 
32 | - The cross product is not just a number: it is a vector whose magnitude is given by that number (the area) and whose direction follows the right-hand thumb rule
33 | 
34 |
35 | # Cross products in the light of linear transformations
36 |
37 | The **Cross product** a × b is defined as a vector c that is perpendicular (orthogonal) to both a and b, with a direction given by the right-hand rule and a magnitude equal to the area of the parallelogram that the vectors span
38 |
39 | 
40 |
41 | A linear transformation to the number line can be matched to a vector, called the dual vector of that transformation, such that performing the linear transformation is the same as taking the dot product with that vector.
42 |
43 | 
44 |
45 | 
46 |
47 | # Cramer's rule, explained geometrically
48 |
49 | An orthogonal transformation is a linear transformation which preserves a symmetric inner product. In particular, an orthogonal transformation (technically, an orthonormal transformation) preserves lengths of vectors and angles between vectors.
50 |
51 | 
52 |
53 | 
54 |
55 | # Change of Basis
56 |
57 | In linear algebra, a basis for a vector space is a linearly independent set spanning the vector space.
58 |
59 | 
60 |
61 | Geometrically the change-of-basis matrix moves the other grid onto our grid, but numerically it translates coordinates written in the other basis into our coordinates, which feels like the opposite direction.
62 |
63 | 
64 |
65 | 
66 | The inverse of a matrix represents the reverse linear transformation, so the inverse matrix is the transformation that takes a vector from our grid back to the other grid.
67 |
68 | Translating matrices is not the same as transforming vectors. The following pictures show the various steps in translating a matrix (a transformation) from one coordinate system to another.
69 |
70 | 
71 | 
72 | 
73 | 
74 |
75 | 
76 |
77 | # **Eigen Vectors and Eigen Values**
78 |
79 | - When a vector is transformed, its span generally changes, but some vectors stay on their own span even after the transformation
80 | - Vectors that remain on their span after the transformation, only stretched or squished, are called the eigen vectors of the transformation; the eigen values are the factors by which they are stretched or squished
79 |
80 | 
81 |
82 | 
83 |
84 | 
85 |
86 | 
87 |
88 | - We have to find the values of λ for which the determinant det(A - λI) is zero, so that a non-zero eigen vector can exist
89 |
90 | 
91 |
92 | 
93 |
94 | 
95 |
96 | 
97 |
98 | 
99 |
100 | 
101 |
102 | Not all matrices have an eigen basis
103 |
104 | 
105 |
106 | 
107 |
108 |
--------------------------------------------------------------------------------
/docs/006-MIT_21to25/006.md:
--------------------------------------------------------------------------------
1 | # 21 - Eigen Values and Eigen Vectors, det(A - λI) = 0, Trace = λ₁ + λ₂ + ... + λₙ
2 |
3 | * x is an Eigen vector if **Ax || x**
4 | * λ is an Eigen value if **Ax = λx**
5 | * If A is singular, λ = 0 is an eigen value
6 | * Set of eigen values - spectrum
7 |
8 | * * *
9 |
10 | - What are the eigen vectors and values for Projection matrix?
11 |
12 | View Answer
13 |
14 | > Eigen vectors - All vectors in projection plane, Eigen value - 1
15 | > Eigen vectors - All vectors perpendicular to projection plane, Eigen value - 0
16 |
17 |
18 | - What are the eigen vectors and values for [[0 1] [1 0]]?
19 | - Hint: Recall visualizing eigen vectors and values from 3b1b video (vectors which only scales up or down after transforming the space by matrix)
20 |
21 | View Answer
22 |
23 | > Looking at the matrix, it swaps the X and Y axes (a reflection across the 45° line), so vectors along the 45° lines keep their orientation
24 | > Eigen vectors - [1 1], [1 -1]; Eigen values - 1, -1
25 |
26 |
27 | * * *
28 |
29 | * Sum(λ) = Trace, Product(λ) = Determinant
30 | * Solving Ax = λx is same as solving (A - λI)x = 0
31 | * (A - λI) is singular, since x != 0 :thinking:
32 | * Hence we solve |A - λI| = 0
33 | * This is known as characteristic equation
34 |
35 | * * *
36 |
37 | - What are the eigen vectors and values for the rotation matrix R(90°)?
38 |
39 | View Answer
40 |
41 | > The rotation matrix is [[0 -1] [1 0]]
42 | > No real eigen values or vectors (the eigen values come out as the complex pair ±i)
43 |
44 |
45 | - if Ax = λx, find eigen values for (A + 2I)
46 |
47 | View Answer
48 |
49 | > (A + 2I)x = Ax + 2Ix = λx + 2x = (λ + 2)x
50 |
51 |
52 | - if A and B have eigen values λ and α respectively, find eigen values for (A + B)
53 |
54 | View Answer
55 |
56 | > (λ + α) ? :grin:
57 | > No, since eigen vectors of A and B can be different
58 |
59 |
60 |
61 | # 22 - Diagonalizing a matrix (S⁻¹AS = Λ), Powers of A, uₖ₊₁ = Auₖ
62 |
63 | * Suppose there are n independent eigen vectors of A
64 | * Let `S` be the matrix whose columns are the eigen vectors
65 | * AS = A[x₁ x₂ ... xₙ] = [λ₁x₁ λ₂x₂ ... λₙxₙ] = [x₁ x₂ ... xₙ]·diag(λ₁, λ₂, ..., λₙ)
66 | * AS = SΛ
67 | * **A = SΛS⁻¹** OR **Λ = S⁻¹AS**
68 | * Aᵏ = SΛᵏS⁻¹
69 | * A is sure to have n independent eigen vectors and be diagonalizable if all λ's are different
70 | * Repeated eigen values may or may not result in n independent eigen vectors
71 | * eg. Identity matrix
72 | * Algebraic multiplicity - Number of times an eigen value is repeated
73 | * Geometric multiplicity - Number of eigen vectors for a particular eigen value
74 |
75 | * * *
76 |
77 | * Fibonacci series - 0, 1, 1, 2, 3, 5, 8, ... F₁₀₀ = ?
78 | * Fₖ₊₂ = Fₖ₊₁ + Fₖ ... *We want to write this as uₖ₊₁ = Auₖ*
79 | * Fₖ₊₁ = Fₖ₊₁
80 | * **Trick**: Let uₖ = [Fₖ₊₁  Fₖ]
81 | * Your work:-
82 | * Find A and its eigen values and eigen vectors
83 | * Use the following logic for finding F₁₀₀ (a sketch follows below)
84 | * u₀ = c₁x₁ + c₂x₂ + ... + cₙxₙ = Sc
85 | * Aᵏu₀ = SΛᵏc
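A hedged NumPy sketch of this plan via the eigen decomposition (floating point, so the answer is approximate):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0]])                     # from F(k+2) = F(k+1) + F(k)

lam, S = np.linalg.eig(A)                      # eigen values are (1 ± sqrt(5)) / 2
c = np.linalg.solve(S, np.array([1.0, 0.0]))   # u0 = [F1, F0] = S c

k = 100
u_k = S @ (lam**k * c)                         # A^k u0 = S Λ^k c
print(u_k[1])                                  # ≈ 3.5422e+20, which is F100
```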
86 |
87 |
88 | # 23 - Differential equation (du/dt = Au), Exponent of matrix (eAt)
89 |
90 | * du/dt = Au ... u(t) = c₁e^(λ₁t)x₁ + c₂e^(λ₂t)x₂ + ... + cₙe^(λₙt)xₙ
91 | * Stability, steady state or blow up based on λ
92 | * Stability - all Re(λ) < 0
93 | * Steady state - one λ = 0 and the others have Re(λ) < 0
94 | * Blow up - if any Re(λ) > 0
95 | * **TODO** - Explain eAt
96 |
97 |
98 | # 24 - Markov matrices, Steady state, Fourier series and Projections
99 |
100 | * Steady state -
101 | * λ = 0 for exponentials (e^(λt))
102 | * λ = 1 for powers (λᵏ)
103 | * Markov matrix -
104 | * All elements >= 0
105 | * All columns add to 1
106 | * Power of Markov matrix is another Markov matrix
107 | * λ = 1 is an eigen value of Markov matrix and corresponding eigen vector has all its components > 0 => positive steady state
108 | * All other eigen values have |λ| < 1, so their components die out under repeated multiplication
109 | * **TODO**: Explain example of Markov matrix by migration system model
110 | * v = Qc (v = q₁c₁ + q₂c₂ + ...) .... the qᵢ form an orthonormal basis
111 | * c = Q⁻¹v = Qᵀv .... hence we can find all the coefficients
112 | * **TODO**: Explain Fourier series
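A tiny NumPy sketch of a hypothetical two-region migration model; the columns of the Markov matrix sum to 1, and repeated multiplication drives the populations to the steady state along the λ = 1 eigen vector:

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])            # columns add to 1, all entries >= 0

lam, S = np.linalg.eig(A)
print(lam)                            # eigen values 1.0 and 0.7 (order may vary)

u = np.array([0.0, 1000.0])           # initial populations of the two regions
for _ in range(50):
    u = A @ u                         # the 0.7-component dies out each step
print(u)                              # ≈ [666.67, 333.33]: the steady state
```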
113 |
114 |
115 | # 25 - Symmetric matrices (Eigen values and vectors) and Positive definite matrices
116 |
117 | * Real symmetric matrices - real eigen values, and eigen vectors (for distinct eigen values) are orthogonal
118 | * Positive definite symmetric matrix -
119 | * All eigen values are positive
120 | * All pivots are positive
121 | * All leading sub-determinants are positive
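A NumPy sketch checking these tests on a small hypothetical symmetric matrix (`eigvalsh` is NumPy's eigenvalue routine for symmetric matrices):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                         # symmetric

print(np.linalg.eigvalsh(A))                       # [1. 3.]: all eigen values positive
print(np.linalg.det(A[:1, :1]), np.linalg.det(A))  # leading sub-determinants: 2.0, 3.0
# Elimination pivots would be 2 and 3/2, also positive: A is positive definite
```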
--------------------------------------------------------------------------------
/docs/001-3b1b_Revision/001A.md:
--------------------------------------------------------------------------------
1 | # Vectors, what even are they?
2 |
3 | ## Meaning of Vector
4 | * Viewpoints of various domains of study.
5 | * for Physicist (arrows pointing in space)
6 | * CS Student (ordered list of numbers, sets)
7 | * Mathematician (Generalizes both the ideas together).
8 |
9 | * In linear algebra the vector will almost always be rooted at the origin (0, 0) of the space, with its tail fixed, which differs from the way physics students define vectors (free to sit anywhere).
10 |
11 | * A vector's coordinates tell how far to move along the X-axis and Y-axis, and it can also be represented in the form of a 2×1 matrix
12 |
13 | ## Addition
14 | * the process of finding one vector that is equivalent to the result of the successive application of two or more given vectors. (OR in simple words) To add vectors, lay the first one on a set of axes with its tail at the origin. Place the next vector with its tail at the previous vector's head. When there are no more vectors, draw a straight line from the origin to the head of the last vector. This line is the sum of the vectors.
15 |
16 |
17 |
18 | ## Scaling
19 | * Multiplication of a vector by a positive scalar changes the magnitude of the vector, but leaves its direction unchanged. The scalar changes the size of the vector. The scalar "scales" the vector.
20 |
21 | # Linear combinations, span, and basis vectors
22 |
23 | * The i vector is the unit vector along the x-axis and j is the unit vector along the y-axis.
24 |
25 | * i and j form the basis of the coordinate system (everything else is just a scaled combination of these basis vectors).
26 |
27 | * **Basis**: The basis of a vector space is a set of linearly independent vectors that span the full space.
28 |
29 | * **Linear combination** is an expression constructed from a set of terms by multiplying each term by a constant and adding the results (e.g. a linear combination of x and y would be any expression of the form ax + by, where a and b are constants).
30 |
31 | * The choice of basis vectors matters when choosing a coordinate system.
32 |
33 | * **Span** : Span of two vectors is set of all linear combinations (ie the points they can reach).
34 |
35 | * If adding a vector does not extend the span (the resultant stays within the same dimensions), the vectors are linearly dependent; if it adds a new dimension to the span, they are "linearly independent".
36 |
37 | * Same concept of Basis and Span is followed in 3 dimensions as well. Where Linear combination of 3 vectors can span the entire 3-D space.
38 |
39 | 
40 |
41 |
42 | 
43 |
44 | # Linear transformations and matrices
45 |
46 | **Linear Transformation**: Linear (all lines must remain straight, evenly spaced, and the origin stays fixed) + Transformation (a function that takes an input vector and gives an output vector; the word transformation is used instead of function to suggest the movement of a vector from its initial position to its final position).
47 |
48 | Now since we need to think about a lot of vectors and their transformation at a time, it is always better to visualize them as points in space.
49 |
50 | 
51 |
52 | * A 2-D linear transformation depends on only 4 numbers:
53 | * 2 coordinates where i lands.
54 | * 2 coordinates where j lands.
55 |
56 | These four numbers can be represented in the matrix form as follows:
57 |
58 | 
59 |
60 | Here each column represents the point where the i and j vectors land after the transformation.
61 |
62 | # Matrix multiplication as composition
63 |
64 | * **Composition** is a way of chaining transformations together. The composition of matrix transformations corresponds to a notion of multiplying two matrices together
65 |
66 | * Matrix Multiplication is Not Commutative
67 |
68 | * Matrix Multiplication is Associative: (AB)C = A(BC), since both groupings apply the same transformations in the same order
69 |
70 | 
71 |
72 | # Three-dimensional linear transformations
73 |
74 | All the concepts of 2-D matrix transformations carry over to 3-D matrix transformations as well. The only difference is that earlier we worked with 4 numbers in a matrix, whereas now we work with 9.
75 |
76 | Each of the 3 columns represents the landing position of the i, j and k basis vectors respectively.
77 |
78 | 
79 |
80 | # **Determinant**
81 |
82 | - The determinant of a matrix is the factor by which the transformation scales the area covered by the vectors
83 | 
84 | - The determinant is zero if the area squeezes down to a lower dimension
85 | - A -ve determinant is caused by flipping the orientation of space
86 | - For a 3D matrix the determinant is the scaling factor for volume
87 |
88 | 
89 | 
90 |
91 | # **Inverse , Rank**
92 |
93 | 
94 | - Solving Ax = v can be visually interpreted as finding the vector x that lands on v after the transformation A
95 | - The inverse transformation is the transformation required to go from v back to x
96 | - The inverse exists as long as the determinant is not zero (if it is zero, the area collapses to a line, and you cannot decompose a line back into an area or volume with a single function)
97 | - A⁻¹ × A is "doing nothing": the identity matrix
98 | - A solution can still exist when the determinant is zero, provided v lies on that same line
99 | - When the determinant is 0 and space squeezes down to a line, the rank of the matrix is said to be 1; if the transformation lands on a 2-D plane instead of a line, it has rank 2
100 | - RANK: the number of dimensions in the output of the transformation
101 | - Be it a line or a plane, the set of all possible outputs is called the column space of the matrix, and each column tells where a basis vector lands
102 | - Span of columns = column space
103 |
104 | 
105 |
106 |
107 | - (0, 0) is always in the column space, since a linear transformation keeps the origin fixed. The set of vectors that land on the origin after the transformation is called the null space (or kernel) of the matrix
108 |
109 | # **Non Square Matrices and Transformation**
110 | - Transformations can also map between different dimensions, e.g. between 1D and 2D: a non-square matrix has one column per input dimension and one row per output dimension
--------------------------------------------------------------------------------