├── README.md
└── images
├── 3blue1brown.png
├── Gilbert_Strang.png
├── d2lai.gif
├── d2lai.png
├── gilbert_strang.png
├── matrix_calc.png
├── ml_math.png
└── seeing_theory.png
/README.md:
--------------------------------------------------------------------------------
1 | # Free online math resources for Machine Learning
2 |
3 | # Content
4 |
5 | - [Books](#books)
6 | - [Blog posts](#blog-posts)
7 | - [Interactive Tools](#interactive-tools)
8 | - [Videos](#videos)
9 | - [Online courses](#online-courses)
10 |
11 |
12 | # Books
13 |
14 | ## Dive into Deep Learning
15 |
16 | An interactive deep learning book with code, math, and discussions, based on the NumPy interface, by Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola. At roughly 900 pages, it is one of the most comprehensive one-stop resources, going from Linear Neural Networks and Multilayer Perceptrons all the way to modern Deep Learning architectures, including Attention Mechanisms and Optimization Algorithms – giving you all three: theory, math, and code.
17 |
18 | [![Dive into Deep Learning](images/d2lai.gif)](http://d2l.ai/)
19 |
20 | [![Dive into Deep Learning](images/d2lai.png)](http://d2l.ai/)
21 |
22 |
23 | ## Mathematics for Machine Learning
24 |
25 | Note: We hold bi-weekly remote reading sessions going through all chapters of the book. If you'd like to join, check out this [blog post](https://machinelearningtokyo.com/2019/11/28/ml-math-reading-sessions/) and join us on [Meetup](https://www.meetup.com/Machine-Learning-Tokyo/).
26 |
27 | **Part I: Mathematical Foundations**
28 |
29 | 1. Introduction and Motivation
30 | 2. Linear Algebra
31 | 3. Analytic Geometry
32 | 4. Matrix Decompositions
33 | 5. Vector Calculus
34 | 6. Probability and Distributions
35 | 7. Continuous Optimization
36 |
37 | **Part II: Central Machine Learning Problems**
38 |
39 | 8. When Models Meet Data
40 | 9. Linear Regression
41 | 10. Dimensionality Reduction with Principal Component Analysis
42 | 11. Density Estimation with Gaussian Mixture Models
43 | 12. Classification with Support Vector Machines
44 |
45 | [![Mathematics for Machine Learning](images/ml_math.png)](https://mml-book.github.io/)
46 |
47 | # Blog posts
48 |
49 | ## The Matrix Calculus You Need For Deep Learning
50 |
51 | by Terence Parr and Jeremy Howard
52 |
53 | Abstract
54 |
55 | "This paper is an attempt to explain all the matrix calculus you need in order to understand the training of deep neural networks. We assume no math knowledge beyond what you learned in calculus 1, and provide links to help you refresh the necessary math where needed. Note that you do not need to understand this material before you start learning to train and use deep learning in practice; rather, this material is for those who are already familiar with the basics of neural networks, and wish to deepen their understanding of the underlying math. Don't worry if you get stuck at some point along the way---just go back and reread the previous section, and try writing down and working through some examples. And if you're still stuck, we're happy to answer your questions in the Theory category at forums.fast.ai. Note: There is a reference section at the end of the paper summarizing all the key matrix calculus rules and terminology discussed here."
56 |
57 | - [Source](https://explained.ai/matrix-calculus/index.html)
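As a taste of the rules the paper covers, the Jacobian of an element-wise function is diagonal. Below is a small pure-Python sanity check of that rule against central finite differences (the function `f` and all names here are illustrative, not taken from the paper):

```python
def f(x):
    # element-wise function: f_i(x) = x_i^2, so dy_i/dx_j = 2*x_i if i == j, else 0
    return [xi * xi for xi in x]

def jacobian_analytic(x):
    # Jacobian of an element-wise function is diagonal
    n = len(x)
    return [[2 * x[i] if i == j else 0.0 for j in range(n)] for i in range(n)]

def jacobian_numeric(func, x, h=1e-6):
    # central finite differences, one input coordinate (column) at a time
    n = len(x)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        xp = list(x); xp[j] += h
        xm = list(x); xm[j] -= h
        fp, fm = func(xp), func(xm)
        for i in range(n):
            J[i][j] = (fp[i] - fm[i]) / (2 * h)
    return J

x = [1.0, -2.0, 3.0]
Ja = jacobian_analytic(x)
Jn = jacobian_numeric(f, x)
# the two Jacobians should agree to finite-difference accuracy
assert all(abs(Ja[i][j] - Jn[i][j]) < 1e-4 for i in range(3) for j in range(3))
```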
58 |
59 | [![The Matrix Calculus You Need For Deep Learning](images/matrix_calc.png)](https://explained.ai/matrix-calculus/index.html)
60 |
61 |
62 | # Interactive Tools
63 |
64 | ## Seeing Theory: Probability and Stats
65 | A visual introduction to probability and statistics.
66 |
67 | [![Seeing Theory](images/seeing_theory.png)](https://seeing-theory.brown.edu/)
68 |
69 | ## Sage Interactions
70 |
71 | This is a collection of pages demonstrating the use of the **interact** command in Sage. It should be easy to just scroll through and copy/paste examples into Sage notebooks.
72 |
73 | Examples include Algebra, Bioinformatics, Calculus, Cryptography, Differential Equations, Drawing Graphics, Dynamical Systems, Fractals, Games and Diversions, Geometry, Graph Theory, Linear Algebra, Loop Quantum Gravity, Number Theory, Statistics/Probability, Topology, Web Applications.
74 |
75 | [Sage Interactions](https://wiki.sagemath.org/interact/)
76 |
77 |
78 | ## Probability Distributions
79 |
80 | by Simon Ward-Jones. A visual 👀 tour of probability distributions.
81 |
82 | - Bernoulli Distribution
83 | - Binomial Distribution
84 | - Normal Distribution
85 | - Beta Distribution
86 | - LogNormal Distribution
87 |
88 | [Probability Distributions](https://www.simonwardjones.co.uk/posts/probability_distributions/)
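For a concrete feel for two of the distributions above, here is a small standard-library-only sketch of the Binomial pmf (with Bernoulli as the n = 1 special case) and the Normal density; the parameter values are just examples:

```python
import math

def binomial_pmf(k, n, p):
    # P(X = k) for X ~ Binomial(n, p); Bernoulli is the n = 1 special case
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def normal_pdf(x, mu=0.0, sigma=1.0):
    # density of the Normal(mu, sigma^2) distribution
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

bern = binomial_pmf(1, 1, 0.3)   # Bernoulli(0.3): P(success) = 0.3
binom = binomial_pmf(2, 4, 0.5)  # P(2 heads in 4 fair flips) = 0.375
```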
89 |
90 | ## Bayesian Inference
91 |
92 | by Simon Ward-Jones. Explaining the basics of Bayesian inference with the example of flipping a coin.
93 |
94 | [Bayesian Inference](https://www.simonwardjones.co.uk/posts/probability_distributions/)
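The coin-flip example has a closed-form answer: with a Beta prior on the heads probability, observing flips simply updates the Beta parameters. A minimal sketch of that conjugate update (the prior and the flip counts here are made up for illustration):

```python
# Conjugate Beta-Bernoulli update for the coin-flip example:
# prior Beta(a, b); after h heads and t tails the posterior is Beta(a + h, b + t).
def posterior(a, b, heads, tails):
    return a + heads, b + tails

def beta_mean(a, b):
    # mean of a Beta(a, b) distribution
    return a / (a + b)

a, b = posterior(1, 1, heads=7, tails=3)  # uniform Beta(1, 1) prior
mean = beta_mean(a, b)                    # posterior mean = 8/12 ≈ 0.667
```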
95 |
96 |
97 | # Videos
98 |
99 | ## 3blue1brown
100 |
101 | 3blue1brown, by Grant Sanderson, is some combination of math and entertainment, depending on your disposition. The goal is for explanations to be driven by animations and for difficult problems to be made simple with changes in perspective.
102 |
103 | [![3blue1brown](images/3blue1brown.png)](https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw/featured)
104 |
105 | ## Recommended video series
106 | - **[Essence of Linear Algebra](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)**
107 | - **[Essence of Calculus](https://www.youtube.com/playlist?list=PLZHQObOWTQDMsr9K-rj53DwVRMYO3t5Yr)**
108 |
109 |
110 | ## The classic: Gilbert Strang MIT lectures on Linear Algebra
111 |
112 | **Matrix Methods in Data Analysis, Signal Processing, and Machine Learning**
113 | - **Spring 2018**
114 | - **Level:** Undergraduate / Graduate
115 | - **Course description:** "Linear algebra concepts are key for understanding and creating machine learning algorithms, especially as applied to deep learning and neural networks. This course reviews linear algebra with applications to probability and statistics and optimization–and above all a full explanation of deep learning."
116 | - **Format:** [Video lectures](https://ocw.mit.edu/courses/mathematics/18-065-matrix-methods-in-data-analysis-signal-processing-and-machine-learning-spring-2018/)
117 |
118 | [![Gilbert Strang](images/Gilbert_Strang.png)](https://ocw.mit.edu/courses/mathematics/18-065-matrix-methods-in-data-analysis-signal-processing-and-machine-learning-spring-2018/video-lectures/index.htm)
119 |
120 | # Online courses
121 |
122 | ## Mathematics for Machine Learning – Linear Algebra
123 |
124 | Imperial College London. "This course offers an introduction to the linear algebra required for common machine learning techniques. We start at the very beginning with thinking about vectors and what vectors are, and the basic mathematical operations we can do with vectors, like how to add vectors. We then move on to think about how to find the product of vectors and what the modulus or size of a vector is. In physical spaces that then lets us think about linear algebra geometrically, and therefore when vectors are perpendicular to each other or have an angle between them. We can think about the basis – the fundamental vectors that make up a vector space – and how to change basis and transform between vector frames. That then lets us think about how to combine matrix transformations and how to do inverse transformations. That then takes us on to think about the eigenvectors and eigenvalues of a transformation and what these “eigen-things” mean. We then finish up the course by applying all this to a machine learning problem – the Google PageRank algorithm."
125 |
126 | - [YouTube Playlist](https://www.youtube.com/playlist?list=PLiiljHvN6z1_o1ztXTKWPrShrMrBLo5P3)
127 | - [Coursera](https://www.coursera.org/specializations/mathematics-machine-learning)
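The course ends with the Google PageRank algorithm. As a taste, here is a toy power-iteration sketch in pure Python; the four-page link graph is invented for illustration and is not from the course:

```python
# links[i] = pages that page i links to, for a toy 4-page web
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}

def pagerank(links, d=0.85, iters=100):
    # power iteration on the damped link matrix: each page splits its
    # current rank evenly among its outgoing links, plus a (1 - d)
    # teleportation share so the chain stays irreducible
    n = len(links)
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1 - d) / n] * n
        for i, outs in links.items():
            if outs:
                share = d * rank[i] / len(outs)
                for j in outs:
                    new[j] += share
            else:
                # dangling page: spread its rank over all pages
                for j in range(n):
                    new[j] += d * rank[i] / n
        rank = new
    return rank

ranks = pagerank(links)  # page 2, with the most inbound links, ranks highest
```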
128 |
129 | ## Essential Math for Machine Learning: Python Edition
130 |
131 | - Equations, Functions, and Graphs
132 | - Differentiation and Optimization
133 | - Vectors and Matrices
134 | - Statistics and Probability
135 |
136 | - [EdX](https://www.edx.org/course/essential-math-for-machine-learning-python-edition-3)
137 |
138 |
--------------------------------------------------------------------------------
/images/3blue1brown.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Machine-Learning-Tokyo/Math_resources/00712c2bcae206576e0f2553d41e4407663a7003/images/3blue1brown.png
--------------------------------------------------------------------------------
/images/Gilbert_Strang.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Machine-Learning-Tokyo/Math_resources/00712c2bcae206576e0f2553d41e4407663a7003/images/Gilbert_Strang.png
--------------------------------------------------------------------------------
/images/d2lai.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Machine-Learning-Tokyo/Math_resources/00712c2bcae206576e0f2553d41e4407663a7003/images/d2lai.gif
--------------------------------------------------------------------------------
/images/d2lai.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Machine-Learning-Tokyo/Math_resources/00712c2bcae206576e0f2553d41e4407663a7003/images/d2lai.png
--------------------------------------------------------------------------------
/images/gilbert_strang.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Machine-Learning-Tokyo/Math_resources/00712c2bcae206576e0f2553d41e4407663a7003/images/gilbert_strang.png
--------------------------------------------------------------------------------
/images/matrix_calc.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Machine-Learning-Tokyo/Math_resources/00712c2bcae206576e0f2553d41e4407663a7003/images/matrix_calc.png
--------------------------------------------------------------------------------
/images/ml_math.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Machine-Learning-Tokyo/Math_resources/00712c2bcae206576e0f2553d41e4407663a7003/images/ml_math.png
--------------------------------------------------------------------------------
/images/seeing_theory.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Machine-Learning-Tokyo/Math_resources/00712c2bcae206576e0f2553d41e4407663a7003/images/seeing_theory.png
--------------------------------------------------------------------------------