├── C1 ├── w1 │ ├── C1w1notes.pdf │ ├── Readme.md │ ├── lab │ │ ├── C1_W1_Lab_1_introduction_to_numpy_arrays.ipynb │ │ ├── C1_W1_Lab_2_solving_linear_systems_2_variables.ipynb │ │ ├── __pycache__ │ │ │ └── quiz.cpython-38.pyc │ │ ├── images │ │ │ ├── 2x2x2.PNG │ │ │ ├── XtuAbxu3QWybgG8bt1Fsjw_c9f74854fefa4d3692c62a39e8b16ae1_snapshotmeteoorite (2).png │ │ │ ├── applesbananas.PNG │ │ │ ├── broadcasting.png │ │ │ ├── hstack.PNG │ │ │ ├── hubble.jpg │ │ │ ├── imagereshaped.PNG │ │ │ ├── visualization.png │ │ │ └── vstack.PNG │ │ └── quiz.py │ ├── pq1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ ├── ss3.png │ │ └── ss4.png │ └── q1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ ├── ss3.png │ │ ├── ss4.png │ │ ├── ss5.png │ │ ├── ss6.png │ │ └── ss7.png ├── w2 │ ├── C1w2_graded_lab │ │ ├── C1_W2_Assignment.ipynb │ │ ├── __pycache__ │ │ │ └── w2_unittest.cpython-38.pyc │ │ └── w2_unittest.py │ ├── C1w2_ungraded_lab.ipynb │ ├── C1w2notes.pdf │ ├── Readme.md │ └── q1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ ├── ss3.png │ │ ├── ss4.png │ │ └── ss5.png ├── w3 │ ├── C1w3_graded_lab │ │ ├── C1_W3_Assignment.ipynb │ │ ├── __pycache__ │ │ │ ├── w3_tools.cpython-38.pyc │ │ │ └── w3_unittest.cpython-38.pyc │ │ ├── data │ │ │ └── house_prices_train.csv │ │ ├── images │ │ │ ├── nn_model_linear_regression_multiple.png │ │ │ └── nn_model_linear_regression_simple.png │ │ ├── w3_tools.py │ │ └── w3_unittest.py │ ├── C1w3notes.pdf │ ├── Readme.md │ ├── lab │ │ ├── C1_W3_Lab_1_vector_operations.ipynb │ │ ├── C1_W3_Lab_2_matrix_multiplication.ipynb │ │ ├── C1_W3_Lab_3_linear_transformations.ipynb │ │ └── images │ │ │ ├── barnsley_fern.png │ │ │ ├── dot_product_geometric.png │ │ │ ├── leaf_original.png │ │ │ ├── shear_transformation.png │ │ │ └── sum_of_vectors.png │ ├── pq1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ ├── ss3.png │ │ ├── ss4.png │ │ └── ss5.png │ └── q1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ ├── ss3.png │ │ └── ss4.png 
└── w4 │ ├── C1w4_graded_lab │ ├── C1_W4_Assignment.ipynb │ ├── __pycache__ │ │ └── w4_unittest.cpython-38.pyc │ ├── images │ │ └── shear_transformation.png │ └── w4_unittest.py │ ├── C1w4notes.pdf │ ├── Readme.md │ └── q1 │ ├── Readme.md │ ├── ss1.png │ ├── ss2.png │ ├── ss3.png │ ├── ss4.png │ ├── ss5.png │ └── ss6.png ├── C2 ├── w1 │ ├── C2_W1_Lab_1_differentiation_in_python.ipynb │ ├── C2w1_graded_lab │ │ ├── C2_W1_Assignment.ipynb │ │ ├── data │ │ │ └── prices.csv │ │ └── w1_unittest.py │ ├── C2w1notes.pdf │ ├── Readme.md │ ├── pq1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ └── ss3.png │ └── q1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ ├── ss3.png │ │ └── ss4.png ├── w2 │ ├── C2w2_graded_lab │ │ ├── C2_W2_Assignment.ipynb │ │ ├── data │ │ │ └── tvmarketing.csv │ │ └── w2_unittest.py │ ├── C2w2notes.pdf │ ├── Readme.md │ ├── lab │ │ ├── C2_W2_Lab_1_Optimization_Using_Gradient_Descent_in_One_Variable.ipynb │ │ ├── C2_W2_Lab_2_Optimization_Using_Gradient_Descent_in_Two_Variables.ipynb │ │ └── w2_tools.py │ ├── pq1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ └── ss2.png │ └── q1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ └── ss2.png └── w3 │ ├── C2w3_graded_lab │ ├── C2_W3_Assignment.ipynb │ ├── images │ │ └── nn_model_2_layers.png │ └── w3_unittest.py │ ├── C2w3notes.pdf │ ├── Readme.md │ ├── lab │ ├── C2_W3_Lab_1_Regression_with_Perceptron.ipynb │ ├── C2_W3_Lab_2_Classification_with_Perceptron.ipynb │ ├── C2_W3_Lab_3_Optimization_Using_Newtons_Method.ipynb │ ├── data │ │ ├── house_prices_train.csv │ │ └── tvmarketing.csv │ └── images │ │ ├── nn_model_classification_1_layer.png │ │ ├── nn_model_linear_regression_multiple.png │ │ └── nn_model_linear_regression_simple.png │ ├── pq1 │ ├── Readme.md │ ├── ss1.png │ ├── ss2.png │ └── ss3.png │ └── q1 │ ├── Readme.md │ ├── ss1.png │ ├── ss2.png │ ├── ss3.png │ └── ss4.png ├── C3 ├── w1 │ ├── C3w1_graded_lab │ │ ├── C3_W1_Assignment.ipynb │ │ ├── __pycache__ │ │ │ └── utils.cpython-310.pyc │ │ ├── assets │ │ 
│ ├── binomial2.png │ │ │ └── gaussian.png │ │ ├── emails.csv │ │ └── utils.py │ ├── Readme.md │ ├── lab │ │ ├── C3_W1_Lab_2_Birthday_Problems.ipynb │ │ ├── images │ │ │ ├── first.png │ │ │ ├── fourth.png │ │ │ ├── second.png │ │ │ └── third.png │ │ └── utils.py │ ├── pq1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ └── ss3.png │ └── q1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ ├── ss3.png │ │ ├── ss4.png │ │ ├── ss5.png │ │ ├── ss6.png │ │ └── ss7.png ├── w2 │ ├── C3w2_graded_lab │ │ ├── C3_W2_Assignment.ipynb │ │ ├── __pycache__ │ │ │ └── utils.cpython-310.pyc │ │ ├── answers.json │ │ ├── images │ │ │ ├── 4_side_hists.png │ │ │ ├── 4_side_uf.png │ │ │ ├── 4_sided_hist_no_prob.png │ │ │ ├── 6_sided_cond_blue.png │ │ │ ├── 6_sided_cond_blue2.png │ │ │ ├── 6_sided_cond_brown.png │ │ │ ├── 6_sided_cond_brown2.png │ │ │ ├── 6_sided_cond_green.png │ │ │ ├── 6_sided_cond_green2.png │ │ │ ├── 6_sided_cond_red.png │ │ │ ├── 6_sided_cond_red2.png │ │ │ ├── fair_dice.png │ │ │ ├── hist_sum_4_3l.png │ │ │ ├── hist_sum_4_4l.png │ │ │ ├── hist_sum_4_uf.png │ │ │ ├── hist_sum_5_side.png │ │ │ ├── hist_sum_6_side.png │ │ │ ├── hist_sum_6_uf.png │ │ │ ├── loaded_6_cdf.png │ │ │ └── loaded_6_side.png │ │ └── utils.py │ ├── Readme.md │ ├── lab │ │ ├── C3_W2_Lab_2_Dice_Simulations.ipynb │ │ ├── datasaurus.csv │ │ ├── df_anscombe.csv │ │ ├── ugl_datasets.ipynb │ │ └── utils.py │ ├── pq1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ └── ss3.png │ └── q1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ ├── ss3.png │ │ ├── ss4.png │ │ ├── ss5.png │ │ ├── ss6.png │ │ └── ss7.png ├── w3 │ ├── Readme.md │ ├── lab │ │ ├── C3_W3_Lab_1_Central_Limit_Theorem.ipynb │ │ └── utils.py │ ├── pq1 │ │ ├── Readme.md │ │ └── ss1.png │ └── q1 │ │ ├── Readme.md │ │ ├── ss1.png │ │ ├── ss2.png │ │ └── ss3.png └── w4 │ ├── C3w4_graded_lab │ ├── C3_W4_Assignment.ipynb │ ├── __pycache__ │ │ └── utils.cpython-310.pyc │ └── utils.py │ ├── Readme.md │ ├── pq1 │ ├── Readme.md │ 
└── ss1.png │ └── q1 │ ├── Readme.md │ ├── ss1.png │ ├── ss2.png │ └── ss3.png └── README.md /C1/w1/C1w1notes.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/C1w1notes.pdf -------------------------------------------------------------------------------- /C1/w1/Readme.md: -------------------------------------------------------------------------------- 1 | ## Course 1 - Week 1 2 | 3 | - [Ungraded Lab - Introduction To Numpy Arrays](/C1/w1/lab/C1_W1_Lab_1_introduction_to_numpy_arrays.ipynb) 4 | - [Ungraded Lab - Solving Linear Systems : 2 Variables](/C1/w1/lab/C1_W1_Lab_2_solving_linear_systems_2_variables.ipynb) 5 | - [Practice Quiz - Solving Systems of Linear Equations](/C1/w1/pq1/) 6 | - [Graded Quiz - Matrices](/C1/w1/q1/) 7 | - [Lecture Materials](/C1/w1/C1w1notes.pdf) -------------------------------------------------------------------------------- /C1/w1/lab/__pycache__/quiz.cpython-38.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/__pycache__/quiz.cpython-38.pyc -------------------------------------------------------------------------------- /C1/w1/lab/images/2x2x2.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/images/2x2x2.PNG -------------------------------------------------------------------------------- /C1/w1/lab/images/XtuAbxu3QWybgG8bt1Fsjw_c9f74854fefa4d3692c62a39e8b16ae1_snapshotmeteoorite (2).png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/images/XtuAbxu3QWybgG8bt1Fsjw_c9f74854fefa4d3692c62a39e8b16ae1_snapshotmeteoorite (2).png -------------------------------------------------------------------------------- /C1/w1/lab/images/applesbananas.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/images/applesbananas.PNG -------------------------------------------------------------------------------- /C1/w1/lab/images/broadcasting.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/images/broadcasting.png -------------------------------------------------------------------------------- /C1/w1/lab/images/hstack.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/images/hstack.PNG -------------------------------------------------------------------------------- /C1/w1/lab/images/hubble.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/images/hubble.jpg -------------------------------------------------------------------------------- 
/C1/w1/lab/images/imagereshaped.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/images/imagereshaped.PNG -------------------------------------------------------------------------------- /C1/w1/lab/images/visualization.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/images/visualization.png -------------------------------------------------------------------------------- /C1/w1/lab/images/vstack.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/lab/images/vstack.PNG -------------------------------------------------------------------------------- /C1/w1/lab/quiz.py: -------------------------------------------------------------------------------- 1 | import ipywidgets as widgets 2 | from ipywidgets import interact, Dropdown 3 | 4 | 5 | question1 = "\033[1mSelect one of the options given:" 6 | solution1 = {'A':'\033[91mNot quite. While both functions are used to create empty arrays, np.zeros() is initialized with the value 0.', 7 | 'B':"\033[91mNot quite. np.zeros() is initialized, and it gives an output of 0's.", 8 | 'C':"\033[91mNot quite. Most often, np.empty() is faster since it is not initialized.", 9 | 'D':''' \033[92mTrue!
np.empty() creates an array with uninitialized elements from available memory space and may be faster to execute.'''} 10 | 11 | 12 | 13 | def mcq(question, solution): 14 | s = '' 15 | # print(question) 16 | print("\033[1mPlease select the correct option:") 17 | answer_w = Dropdown(options = solution.keys(), value=None, layout=widgets.Layout(width='25%')) 18 | 19 | @interact(Answer = answer_w) 20 | def print_city(Answer): 21 | if(Answer != None): 22 | s = solution[Answer] 23 | # print("\n") 24 | print(s) 25 | -------------------------------------------------------------------------------- /C1/w1/pq1/Readme.md: -------------------------------------------------------------------------------- 1 | # Practice Quiz - Solving systems of linear equations 2 | 3 | 4 | ![](/C1/w1/pq1/ss1.png) 5 | ![](/C1/w1/pq1/ss2.png) 6 | ![](/C1/w1/pq1/ss3.png) 7 | ![](/C1/w1/pq1/ss4.png) -------------------------------------------------------------------------------- /C1/w1/pq1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/pq1/ss1.png -------------------------------------------------------------------------------- /C1/w1/pq1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/pq1/ss2.png -------------------------------------------------------------------------------- /C1/w1/pq1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/pq1/ss3.png 
-------------------------------------------------------------------------------- /C1/w1/pq1/ss4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/pq1/ss4.png -------------------------------------------------------------------------------- /C1/w1/q1/Readme.md: -------------------------------------------------------------------------------- 1 | ## Graded Quiz - Matrices 2 | 3 | 4 | ![](/C1/w1/q1/ss1.png) 5 | ![](/C1/w1/q1/ss2.png) 6 | ![](/C1/w1/q1/ss3.png) 7 | ![](/C1/w1/q1/ss4.png) 8 | ![](/C1/w1/q1/ss5.png) 9 | ![](/C1/w1/q1/ss6.png) 10 | ![](/C1/w1/q1/ss7.png) -------------------------------------------------------------------------------- /C1/w1/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/q1/ss1.png -------------------------------------------------------------------------------- /C1/w1/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/q1/ss2.png -------------------------------------------------------------------------------- /C1/w1/q1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/q1/ss3.png -------------------------------------------------------------------------------- /C1/w1/q1/ss4.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/q1/ss4.png -------------------------------------------------------------------------------- /C1/w1/q1/ss5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/q1/ss5.png -------------------------------------------------------------------------------- /C1/w1/q1/ss6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/q1/ss6.png -------------------------------------------------------------------------------- /C1/w1/q1/ss7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w1/q1/ss7.png -------------------------------------------------------------------------------- /C1/w2/C1w2_graded_lab/__pycache__/w2_unittest.cpython-38.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w2/C1w2_graded_lab/__pycache__/w2_unittest.cpython-38.pyc -------------------------------------------------------------------------------- /C1/w2/C1w2notes.pdf: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w2/C1w2notes.pdf -------------------------------------------------------------------------------- /C1/w2/Readme.md: -------------------------------------------------------------------------------- 1 | ## Course 1 - Week 2 2 | 3 | - [Ungraded Lab - Solving Linear Systems : 3 Variables](/C1/w2/C1w2_ungraded_lab.ipynb) 4 | - [Graded Quiz - The Rank of a Matrix](/C1/w2/q1/) 5 | - [Programming Assignment - System of Linear Equations](/C1/w2/C1w2_graded_lab/) 6 | - [Lecture Materials](/C1/w2/C1w2notes.pdf) -------------------------------------------------------------------------------- /C1/w2/q1/Readme.md: -------------------------------------------------------------------------------- 1 | ## Graded Quiz - The Rank of a Matrix 2 | 3 | ![](/C1/w2/q1/ss1.png) 4 | ![](/C1/w2/q1/ss2.png) 5 | ![](/C1/w2/q1/ss3.png) 6 | ![](/C1/w2/q1/ss4.png) 7 | ![](/C1/w2/q1/ss5.png) -------------------------------------------------------------------------------- /C1/w2/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w2/q1/ss1.png -------------------------------------------------------------------------------- /C1/w2/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w2/q1/ss2.png -------------------------------------------------------------------------------- /C1/w2/q1/ss3.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w2/q1/ss3.png -------------------------------------------------------------------------------- /C1/w2/q1/ss4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w2/q1/ss4.png -------------------------------------------------------------------------------- /C1/w2/q1/ss5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w2/q1/ss5.png -------------------------------------------------------------------------------- /C1/w3/C1w3_graded_lab/__pycache__/w3_tools.cpython-38.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/C1w3_graded_lab/__pycache__/w3_tools.cpython-38.pyc -------------------------------------------------------------------------------- /C1/w3/C1w3_graded_lab/__pycache__/w3_unittest.cpython-38.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/C1w3_graded_lab/__pycache__/w3_unittest.cpython-38.pyc -------------------------------------------------------------------------------- /C1/w3/C1w3_graded_lab/images/nn_model_linear_regression_multiple.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/C1w3_graded_lab/images/nn_model_linear_regression_multiple.png -------------------------------------------------------------------------------- /C1/w3/C1w3_graded_lab/images/nn_model_linear_regression_simple.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/C1w3_graded_lab/images/nn_model_linear_regression_simple.png -------------------------------------------------------------------------------- /C1/w3/C1w3_graded_lab/w3_tools.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | 3 | def backward_propagation(A, X, Y): 4 | """ 5 | Implements the backward propagation, calculating gradients 6 | 7 | Arguments: 8 | 9 | A -- the output of the neural network of shape (1, number of examples) 10 | X -- input data of shape (n_x, number of examples) 11 | Y -- "true" labels vector of shape (n_y, number of examples) 12 | 13 | Returns: 14 | grads -- python dictionary containing gradients with respect to different parameters 15 | """ 16 | m = X.shape[1] 17 | 18 | # Backward propagation: calculate dW, db.
19 | dZ = A - Y 20 | dW = 1/m * np.matmul(dZ, X.T) 21 | db = 1/m * np.sum(dZ, axis = 1, keepdims = True) 22 | 23 | grads = {"dW": dW, 24 | "db": db} 25 | 26 | return grads 27 | 28 | def update_parameters(parameters, grads, learning_rate = 1.2): 29 | """ 30 | Updates parameters using the gradient descent update rule 31 | 32 | Arguments: 33 | parameters -- python dictionary containing parameters 34 | grads -- python dictionary containing gradients 35 | 36 | Returns: 37 | parameters -- python dictionary containing updated parameters 38 | """ 39 | # Retrieve each parameter from the dictionary "parameters". 40 | W = parameters["W"] 41 | b = parameters["b"] 42 | 43 | # Retrieve each gradient from the dictionary "grads". 44 | dW = grads["dW"] 45 | db = grads["db"] 46 | 47 | # Update rule for each parameter. 48 | W = W - learning_rate * dW 49 | b = b - learning_rate * db 50 | 51 | parameters = {"W": W, 52 | "b": b} 53 | 54 | return parameters 55 | 56 | def train_nn(parameters, A, X, Y): 57 | # Backpropagation. Inputs: "A, X, Y". Outputs: "grads". 58 | grads = backward_propagation(A, X, Y) 59 | 60 | # Gradient descent parameter update. Inputs: "parameters, grads". Outputs: "parameters". 
61 | parameters = update_parameters(parameters, grads) 62 | 63 | return parameters 64 | -------------------------------------------------------------------------------- /C1/w3/C1w3notes.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/C1w3notes.pdf -------------------------------------------------------------------------------- /C1/w3/Readme.md: -------------------------------------------------------------------------------- 1 | ## Course 1 - Week 3 2 | 3 | - [Ungraded Lab - Vector Operations](/C1/w3/lab/C1_W3_Lab_1_vector_operations.ipynb) 4 | - [Ungraded Lab - Matrix Multiplication](/C1/w3/lab/C1_W3_Lab_2_matrix_multiplication.ipynb) 5 | - [Ungraded Lab - Linear Transformations](/C1/w3/lab/C1_W3_Lab_3_linear_transformations.ipynb) 6 | - [Practice Quiz - Vector operations: Sum, difference, multiplication, dot product](/C1/w3/pq1) 7 | - [Graded Quiz - Vector and Matrix Operations, Types of Matrices](/C1/w3/q1/) 8 | - [Programming Assignment - Single Perceptron Neural Networks for Linear Regression](/C1/w3/C1w3_graded_lab/) 9 | - [Lecture Materials](/C1/w3/C1w3notes.pdf) -------------------------------------------------------------------------------- /C1/w3/lab/C1_W3_Lab_2_matrix_multiplication.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "48446b1e", 6 | "metadata": {}, 7 | "source": [ 8 | "# Matrix Multiplication" 9 | ] 10 | }, 11 | { 12 | "cell_type": "markdown", 13 | "id": "4b6934ea", 14 | "metadata": {}, 15 | "source": [ 16 | "In this lab you will use `NumPy` functions to perform matrix multiplication and see how it can be used in the Machine Learning applications. 
" 17 | ] 18 | }, 19 | { 20 | "cell_type": "markdown", 21 | "id": "4439a0e9", 22 | "metadata": {}, 23 | "source": [ 24 | "# Table of Contents\n", 25 | "\n", 26 | "- [ 1 - Definition of Matrix Multiplication](#1)\n", 27 | "- [ 2 - Matrix Multiplication using Python](#2)\n", 28 | "- [ 3 - Matrix Convention and Broadcasting](#3)" 29 | ] 30 | }, 31 | { 32 | "cell_type": "markdown", 33 | "id": "c2743d75", 34 | "metadata": {}, 35 | "source": [ 36 | "## Packages\n", 37 | "\n", 38 | "Load the `NumPy` package to access its functions." 39 | ] 40 | }, 41 | { 42 | "cell_type": "code", 43 | "execution_count": null, 44 | "id": "b463e5b7", 45 | "metadata": {}, 46 | "outputs": [], 47 | "source": [ 48 | "import numpy as np" 49 | ] 50 | }, 51 | { 52 | "cell_type": "markdown", 53 | "id": "5f9be5bb", 54 | "metadata": {}, 55 | "source": [ 56 | "\n", 57 | "## 1 - Definition of Matrix Multiplication\n", 58 | "\n", 59 | "If $A$ is an $m \\times n$ matrix and $B$ is an $n \\times p$ matrix, the matrix product $C = AB$ (denoted without multiplication signs or dots) is defined to be the $m \\times p$ matrix such that \n", 60 | "$c_{ij}=a_{i1}b_{1j}+a_{i2}b_{2j}+\\ldots+a_{in}b_{nj}=\\sum_{k=1}^{n} a_{ik}b_{kj}, \\tag{4}$\n", 61 | "\n", 62 | "where $a_{ik}$ are the elements of matrix $A$, $b_{kj}$ are the elements of matrix $B$, and $i = 1, \\ldots, m$, $k=1, \\ldots, n$, $j = 1, \\ldots, p$. In other words, $c_{ij}$ is the dot product of the $i$-th row of $A$ and the $j$-th column of $B$." 63 | ] 64 | }, 65 | { 66 | "cell_type": "markdown", 67 | "id": "ecd63af9", 68 | "metadata": {}, 69 | "source": [ 70 | "\n", 71 | "## 2 - Matrix Multiplication using Python\n", 72 | "\n", 73 | "Like with the dot product, there are a few ways to perform matrix multiplication in Python. As discussed in the previous lab, the calculations are more efficient in the vectorized form. Let's discuss the most commonly used functions in the vectorized form. 
First, define two matrices:" 74 | ] 75 | }, 76 | { 77 | "cell_type": "code", 78 | "execution_count": null, 79 | "id": "8b0f59f5", 80 | "metadata": {}, 81 | "outputs": [], 82 | "source": [ 83 | "A = np.array([[4, 9, 9], [9, 1, 6], [9, 2, 3]])\n", 84 | "print(\"Matrix A (3 by 3):\\n\", A)\n", 85 | "\n", 86 | "B = np.array([[2, 2], [5, 7], [4, 4]])\n", 87 | "print(\"Matrix B (3 by 2):\\n\", B)" 88 | ] 89 | }, 90 | { 91 | "cell_type": "markdown", 92 | "id": "cdf047c9", 93 | "metadata": {}, 94 | "source": [ 95 | "You can multiply matrices $A$ and $B$ using the `NumPy` function `np.matmul()`:" 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": null, 101 | "id": "43452598", 102 | "metadata": {}, 103 | "outputs": [], 104 | "source": [ 105 | "np.matmul(A, B)" 106 | ] 107 | }, 108 | { 109 | "cell_type": "markdown", 110 | "id": "7be5d42a", 111 | "metadata": {}, 112 | "source": [ 113 | "This will output a $3 \\times 2$ matrix as a `np.array`. The Python operator `@` will also work here, giving the same result:" 114 | ] 115 | }, 116 | { 117 | "cell_type": "code", 118 | "execution_count": null, 119 | "id": "bb36ba42", 120 | "metadata": {}, 121 | "outputs": [], 122 | "source": [ 123 | "A @ B" 124 | ] 125 | }, 126 | { 127 | "cell_type": "markdown", 128 | "id": "0186638b", 129 | "metadata": {}, 130 | "source": [ 131 | "\n", 132 | "## 3 - Matrix Convention and Broadcasting\n", 133 | "\n", 134 | "Mathematically, matrix multiplication is defined only if the number of columns of matrix $A$ is equal to the number of rows of matrix $B$ (you can check the definition again in section [1](#1) and see that otherwise the dot products between rows and columns will not be defined). \n", 135 | "\n", 136 | "Thus, in the example above ([2](#2)), changing the order of matrices when performing the multiplication $BA$ will not work, as the above rule no longer holds. You can check it by running the cells below - both of them will give errors."
137 | ] 138 | }, 139 | { 140 | "cell_type": "code", 141 | "execution_count": null, 142 | "id": "3ecc05e5", 143 | "metadata": {}, 144 | "outputs": [], 145 | "source": [ 146 | "try:\n", 147 | " np.matmul(B, A)\n", 148 | "except ValueError as err:\n", 149 | " print(err)" 150 | ] 151 | }, 152 | { 153 | "cell_type": "code", 154 | "execution_count": null, 155 | "id": "ea9c6d13", 156 | "metadata": {}, 157 | "outputs": [], 158 | "source": [ 159 | "try:\n", 160 | " B @ A\n", 161 | "except ValueError as err:\n", 162 | " print(err)" 163 | ] 164 | }, 165 | { 166 | "cell_type": "markdown", 167 | "id": "05d9a674", 168 | "metadata": {}, 169 | "source": [ 170 | "So when using matrix multiplication you will need to be very careful about the dimensions - the number of columns in the first matrix should match the number of rows in the second matrix. This is very important for your future understanding of Neural Networks and how they work. \n", 171 | "\n", 172 | "However, for multiplying vectors, `NumPy` has a shortcut. You can define two vectors $x$ and $y$ of the same size (which one can understand as two $3 \\times 1$ matrices). If you check the shape of the vector $x$, you can see that:" 173 | ] 174 | }, 175 | { 176 | "cell_type": "code", 177 | "execution_count": null, 178 | "id": "fab77ce6", 179 | "metadata": {}, 180 | "outputs": [], 181 | "source": [ 182 | "x = np.array([1, -2, -5])\n", 183 | "y = np.array([4, 3, -1])\n", 184 | "\n", 185 | "print(\"Shape of vector x:\", x.shape)\n", 186 | "print(\"Number of dimensions of vector x:\", x.ndim)\n", 187 | "print(\"Shape of vector x, reshaped to a matrix:\", x.reshape((3, 1)).shape)\n", 188 | "print(\"Number of dimensions of vector x, reshaped to a matrix:\", x.reshape((3, 1)).ndim)" 189 | ] 190 | }, 191 | { 192 | "cell_type": "markdown", 193 | "id": "5bd337df", 194 | "metadata": {}, 195 | "source": [ 196 | "Following the matrix convention, multiplication of matrices $3 \\times 1$ and $3 \\times 1$ is not defined.
For matrix multiplication you would expect an error in the following cell, but let's check the output:" 197 | ] 198 | }, 199 | { 200 | "cell_type": "code", 201 | "execution_count": null, 202 | "id": "f655677c", 203 | "metadata": {}, 204 | "outputs": [], 205 | "source": [ 206 | "np.matmul(x,y)" 207 | ] 208 | }, 209 | { 210 | "cell_type": "markdown", 211 | "id": "2fc01d74", 212 | "metadata": {}, 213 | "source": [ 214 | "You can see that there is no error and that the result is actually the dot product $x \cdot y\,$! So the vector $x$ was automatically treated as a $1 \times 3$ row vector and the matrix multiplication $x^Ty$ was calculated. While this is very convenient, you need to keep this behaviour in mind and pay attention not to use it in the wrong way. The following cell will return an error:" 215 | ] 216 | }, 217 | { 218 | "cell_type": "code", 219 | "execution_count": null, 220 | "id": "d92006f1", 221 | "metadata": {}, 222 | "outputs": [], 223 | "source": [ 224 | "try:\n", 225 | " np.matmul(x.reshape((3, 1)), y.reshape((3, 1)))\n", 226 | "except ValueError as err:\n", 227 | " print(err)" 228 | ] 229 | }, 230 | { 231 | "cell_type": "markdown", 232 | "id": "ace12c7d", 233 | "metadata": {}, 234 | "source": [ 235 | "You might be wondering: does the `np.dot()` function also work for matrix multiplication? Let's try it:" 236 | ] 237 | }, 238 | { 239 | "cell_type": "code", 240 | "execution_count": null, 241 | "id": "f296e528", 242 | "metadata": {}, 243 | "outputs": [], 244 | "source": [ 245 | "np.dot(A, B)" 246 | ] 247 | }, 248 | { 249 | "cell_type": "markdown", 250 | "id": "8dbbdc0f", 251 | "metadata": {}, 252 | "source": [ 253 | "Yes, it works! For two-dimensional arrays, `np.dot()` performs ordinary matrix multiplication, equivalent to `np.matmul()`: the dot products between all rows of $A$ and all columns of $B$ form the product matrix. A related `NumPy` feature is **broadcasting**, which stretches arrays of compatible shapes in element-wise operations - for instance, `A * 2` doubles every element of $A$.
Broadcasting also works in other cases, for example:" 254 | ] 255 | }, 256 | { 257 | "cell_type": "code", 258 | "execution_count": null, 259 | "id": "68ded501", 260 | "metadata": {}, 261 | "outputs": [], 262 | "source": [ 263 | "A - 2" 264 | ] 265 | }, 266 | { 267 | "cell_type": "markdown", 268 | "id": "eec1d0d2", 269 | "metadata": {}, 270 | "source": [ 271 | "Mathematically, subtraction of the $3 \\times 3$ matrix $A$ and a scalar is not defined, but Python broadcasts the scalar, creating a $3 \\times 3$ `np.array` and performing subtraction element by element. A practical example of matrix multiplication can be seen in a linear regression model. You will implement it in this week's assignment!" 272 | ] 273 | }, 274 | { 275 | "cell_type": "markdown", 276 | "id": "86605d6f", 277 | "metadata": {}, 278 | "source": [ 279 | "Congratulations on finishing this lab!" 280 | ] 281 | }, 282 | { 283 | "cell_type": "code", 284 | "execution_count": null, 285 | "id": "76db64ac", 286 | "metadata": {}, 287 | "outputs": [], 288 | "source": [] 289 | } 290 | ], 291 | "metadata": { 292 | "kernelspec": { 293 | "display_name": "Python 3 (ipykernel)", 294 | "language": "python", 295 | "name": "python3" 296 | }, 297 | "language_info": { 298 | "codemirror_mode": { 299 | "name": "ipython", 300 | "version": 3 301 | }, 302 | "file_extension": ".py", 303 | "mimetype": "text/x-python", 304 | "name": "python", 305 | "nbconvert_exporter": "python", 306 | "pygments_lexer": "ipython3", 307 | "version": "3.9.12" 308 | } 309 | }, 310 | "nbformat": 4, 311 | "nbformat_minor": 5 312 | } 313 | -------------------------------------------------------------------------------- /C1/w3/lab/images/barnsley_fern.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/lab/images/barnsley_fern.png 
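The matrix-multiplication lab above (`C1_W3_Lab_2_matrix_multiplication.ipynb`) walks through `np.matmul`, the `@` operator, 1-D dot products, and scalar broadcasting. As a quick recap, the same behaviour can be reproduced in a standalone script - a sketch re-using the lab's own matrices and vectors:

```python
import numpy as np

# Matrices from the lab: A is 3x3, B is 3x2.
A = np.array([[4, 9, 9], [9, 1, 6], [9, 2, 3]])
B = np.array([[2, 2], [5, 7], [4, 4]])

# Defined: inner dimensions match, so (3x3)(3x2) gives a 3x2 result.
C = np.matmul(A, B)
print(C.shape)                    # (3, 2)
print(np.array_equal(C, A @ B))   # True: `@` performs the same operation

# Undefined: (3x2)(3x3) fails because 2 != 3.
try:
    np.matmul(B, A)
except ValueError as err:
    print("BA is not defined:", err)

# 1-D arrays: np.matmul (like np.dot) returns the scalar dot product.
x = np.array([1, -2, -5])
y = np.array([4, 3, -1])
print(np.matmul(x, y))            # 3  (= 1*4 + (-2)*3 + (-5)*(-1))

# Scalar broadcasting: the scalar 2 is stretched to A's shape.
print(A - 2)
```

Note that `np.matmul` on two 1-D arrays never raises a dimension error - it silently falls back to the dot product, which is exactly the convenience (and potential pitfall) the notebook highlights.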
-------------------------------------------------------------------------------- /C1/w3/lab/images/dot_product_geometric.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/lab/images/dot_product_geometric.png -------------------------------------------------------------------------------- /C1/w3/lab/images/leaf_original.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/lab/images/leaf_original.png -------------------------------------------------------------------------------- /C1/w3/lab/images/shear_transformation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/lab/images/shear_transformation.png -------------------------------------------------------------------------------- /C1/w3/lab/images/sum_of_vectors.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/lab/images/sum_of_vectors.png -------------------------------------------------------------------------------- /C1/w3/pq1/Readme.md: -------------------------------------------------------------------------------- 1 | ## Practice Quiz - Vector operations: Sum, difference, multiplication, dot product 2 | 3 | ![](/C1/w3/pq1/ss1.png) 4 | ![](/C1/w3/pq1/ss2.png) 5 | ![](/C1/w3/pq1/ss3.png) 6 | 
![](/C1/w3/pq1/ss4.png) 7 | ![](/C1/w3/pq1/ss5.png) -------------------------------------------------------------------------------- /C1/w3/pq1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/pq1/ss1.png -------------------------------------------------------------------------------- /C1/w3/pq1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/pq1/ss2.png -------------------------------------------------------------------------------- /C1/w3/pq1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/pq1/ss3.png -------------------------------------------------------------------------------- /C1/w3/pq1/ss4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/pq1/ss4.png -------------------------------------------------------------------------------- /C1/w3/pq1/ss5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/pq1/ss5.png -------------------------------------------------------------------------------- /C1/w3/q1/Readme.md: 
-------------------------------------------------------------------------------- 1 | ## Graded Quiz - Vector and Matrix Operations, Types of Matrices 2 | 3 | ![](/C1/w3/q1/ss1.png) 4 | ![](/C1/w3/q1/ss2.png) 5 | ![](/C1/w3/q1/ss3.png) 6 | ![](/C1/w3/q1/ss4.png) -------------------------------------------------------------------------------- /C1/w3/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/q1/ss1.png -------------------------------------------------------------------------------- /C1/w3/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/q1/ss2.png -------------------------------------------------------------------------------- /C1/w3/q1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/q1/ss3.png -------------------------------------------------------------------------------- /C1/w3/q1/ss4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w3/q1/ss4.png -------------------------------------------------------------------------------- /C1/w4/C1w4_graded_lab/__pycache__/w4_unittest.cpython-38.pyc: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w4/C1w4_graded_lab/__pycache__/w4_unittest.cpython-38.pyc -------------------------------------------------------------------------------- /C1/w4/C1w4_graded_lab/images/shear_transformation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w4/C1w4_graded_lab/images/shear_transformation.png -------------------------------------------------------------------------------- /C1/w4/C1w4notes.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w4/C1w4notes.pdf -------------------------------------------------------------------------------- /C1/w4/Readme.md: -------------------------------------------------------------------------------- 1 | ## Course 1 - Week 4 2 | 3 | - [Graded Quiz - Eigenvalues and Eigenvectors](/C1/w4/q1/) 4 | - [Programming Assignment](/C1/w4/C1w4_graded_lab/) 5 | - [Lecture Materials](/C1/w4/C1w4notes.pdf) -------------------------------------------------------------------------------- /C1/w4/q1/Readme.md: -------------------------------------------------------------------------------- 1 | ## Graded Quiz - Eigenvalues and Eigenvectors 2 | 3 | ![](/C1/w4/q1/ss1.png) 4 | ![](/C1/w4/q1/ss2.png) 5 | ![](/C1/w4/q1/ss3.png) 6 | ![](/C1/w4/q1/ss4.png) 7 | ![](/C1/w4/q1/ss5.png) 8 | ![](/C1/w4/q1/ss6.png) -------------------------------------------------------------------------------- /C1/w4/q1/ss1.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w4/q1/ss1.png -------------------------------------------------------------------------------- /C1/w4/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w4/q1/ss2.png -------------------------------------------------------------------------------- /C1/w4/q1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w4/q1/ss3.png -------------------------------------------------------------------------------- /C1/w4/q1/ss4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w4/q1/ss4.png -------------------------------------------------------------------------------- /C1/w4/q1/ss5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w4/q1/ss5.png -------------------------------------------------------------------------------- /C1/w4/q1/ss6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C1/w4/q1/ss6.png 
-------------------------------------------------------------------------------- /C2/w1/C2w1_graded_lab/data/prices.csv: -------------------------------------------------------------------------------- 1 | date,price_supplier_a_dollars_per_item,price_supplier_b_dollars_per_item 2 | 1/02/2016,104,76 3 | 1/03/2016,108,76 4 | 1/04/2016,101,84 5 | 1/05/2016,104,79 6 | 1/06/2016,102,81 7 | 1/07/2016,105,84 8 | 1/08/2016,114,90 9 | 1/09/2016,102,93 10 | 1/10/2016,105,93 11 | 1/11/2016,101,99 12 | 1/12/2016,109,98 13 | 1/01/2017,103,96 14 | 1/02/2017,93,94 15 | 1/03/2017,98,104 16 | 1/04/2017,92,101 17 | 1/05/2017,97,102 18 | 1/06/2017,96,104 19 | 1/07/2017,94,106 20 | 1/08/2017,97,105 21 | 1/09/2017,93,103 22 | 1/10/2017,99,106 23 | 1/11/2017,93,104 24 | 1/12/2017,98,113 25 | 1/01/2018,94,115 26 | 1/02/2018,93,114 27 | 1/03/2018,92,124 28 | 1/04/2018,96,119 29 | 1/05/2018,98,115 30 | 1/06/2018,98,112 31 | 1/07/2018,93,111 32 | 1/08/2018,97,106 33 | 1/09/2018,102,107 34 | 1/10/2018,103,108 35 | 1/11/2018,100,108 36 | 1/12/2018,100,102 37 | 1/01/2019,104,104 38 | 1/02/2019,100,101 39 | 1/03/2019,103,101 40 | 1/04/2019,104,100 41 | 1/05/2019,101,103 42 | 1/06/2019,102,106 43 | 1/07/2019,100,100 44 | 1/08/2019,102,97 45 | 1/09/2019,108,98 46 | 1/10/2019,107,90 47 | 1/11/2019,107,92 48 | 1/12/2019,103,92 49 | 1/01/2020,109,99 50 | 1/02/2020,108,94 51 | 1/03/2020,108,91 52 | -------------------------------------------------------------------------------- /C2/w1/C2w1_graded_lab/w1_unittest.py: -------------------------------------------------------------------------------- 1 | # + 2 | import jax.numpy as np 3 | from math import isclose 4 | 5 | # Variables for the default_check test cases. 
6 | prices_A = np.array([ 7 | 104., 108., 101., 104., 102., 105., 114., 102., 105., 101., 109., 103., 93., 98., 92., 97., 96., 8 | 94., 97., 93., 99., 93., 98., 94., 93., 92., 96., 98., 98., 93., 97., 102., 103., 100., 100., 104., 9 | 100., 103., 104., 101., 102., 100., 102., 108., 107., 107., 103., 109., 108., 108., 10 | ]) 11 | prices_B = np.array([ 12 | 76., 76., 84., 79., 81., 84., 90., 93., 93., 99., 98., 96., 94., 104., 101., 102., 104., 106., 105., 13 | 103., 106., 104., 113., 115., 114., 124., 119., 115., 112., 111., 106., 107., 108., 108., 102., 104., 14 | 101., 101., 100., 103., 106., 100., 97., 98., 90., 92., 92., 99., 94., 91. 15 | ]) 16 | 17 | 18 | # - 19 | 20 | def test_load_and_convert_data(target_A, target_B): 21 | successful_cases = 0 22 | failed_cases = [] 23 | 24 | test_cases = [ 25 | { 26 | "name": "default_check", 27 | "expected": {"prices_A": prices_A, 28 | "prices_B": prices_B,}, 29 | }, 30 | ] 31 | 32 | for test_case in test_cases: 33 | 34 | try: 35 | assert type(target_A) == type(test_case["expected"]["prices_A"]) 36 | successful_cases += 1 37 | except: 38 | failed_cases.append( 39 | { 40 | "name": test_case["name"], 41 | "expected": type(test_case["expected"]["prices_A"]), 42 | "got": type(target_A), 43 | } 44 | ) 45 | print( 46 | f"prices_A has incorrect type. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 47 | ) 48 | break 49 | 50 | try: 51 | assert type(target_B) == type(test_case["expected"]["prices_B"]) 52 | successful_cases += 1 53 | except: 54 | failed_cases.append( 55 | { 56 | "name": test_case["name"], 57 | "expected": type(test_case["expected"]["prices_B"]), 58 | "got": type(target_B), 59 | } 60 | ) 61 | print( 62 | f"prices_B has incorrect type. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 63 | ) 64 | break 65 | 66 | try: 67 | # Check only one element - no need to check all array. 
68 | assert type(target_A[0].item()) == type(test_case["expected"]["prices_A"][0].item()) 69 | successful_cases += 1 70 | except: 71 | failed_cases.append( 72 | { 73 | "name": test_case["name"], 74 | "expected": type(test_case["expected"]["prices_A"][0].item()), 75 | "got": type(target_A[0].item()), 76 | } 77 | ) 78 | print( 79 | f"Elements of prices_A array have incorrect type. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 80 | ) 81 | 82 | try: 83 | # Check only one element - no need to check all array. 84 | assert type(target_B[0].item()) == type(test_case["expected"]["prices_B"][0].item()) 85 | successful_cases += 1 86 | except: 87 | failed_cases.append( 88 | { 89 | "name": test_case["name"], 90 | "expected": type(test_case["expected"]["prices_B"][0].item()), 91 | "got": type(target_B[0].item()), 92 | } 93 | ) 94 | print( 95 | f"Elements of prices_B array have incorrect type. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 96 | ) 97 | 98 | try: 99 | assert target_A.shape == test_case["expected"]["prices_A"].shape 100 | successful_cases += 1 101 | except: 102 | failed_cases.append( 103 | { 104 | "name": test_case["name"], 105 | "expected": test_case["expected"]["prices_A"].shape, 106 | "got": target_A.shape, 107 | } 108 | ) 109 | print( 110 | f"Wrong shape of prices_A array. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 111 | ) 112 | break 113 | 114 | try: 115 | assert target_B.shape == test_case["expected"]["prices_B"].shape 116 | successful_cases += 1 117 | except: 118 | failed_cases.append( 119 | { 120 | "name": test_case["name"], 121 | "expected": test_case["expected"]["prices_B"].shape, 122 | "got": target_B.shape, 123 | } 124 | ) 125 | print( 126 | f"Wrong shape of prices_B array. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 
127 | ) 128 | break 129 | 130 | try: 131 | assert np.allclose(target_A, test_case["expected"]["prices_A"]) 132 | successful_cases += 1 133 | except: 134 | failed_cases.append( 135 | { 136 | "name": test_case["name"], 137 | "expected": test_case["expected"]["prices_A"], 138 | "got": target_A, 139 | } 140 | ) 141 | print( 142 | f"Wrong array prices_A. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 143 | ) 144 | 145 | try: 146 | assert np.allclose(target_B, test_case["expected"]["prices_B"]) 147 | successful_cases += 1 148 | except: 149 | failed_cases.append( 150 | { 151 | "name": test_case["name"], 152 | "expected": test_case["expected"]["prices_B"], 153 | "got": target_B, 154 | } 155 | ) 156 | print( 157 | f"Wrong array prices_B. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 158 | ) 159 | 160 | if len(failed_cases) == 0: 161 | print("\033[92m All tests passed") 162 | else: 163 | print("\033[92m", successful_cases, " Tests passed") 164 | print("\033[91m", len(failed_cases), " Tests failed") 165 | 166 | 167 | def test_f_of_omega(target_f_of_omega): 168 | successful_cases = 0 169 | failed_cases = [] 170 | 171 | test_cases = [ 172 | { 173 | "name": "default_check", 174 | "input": {"omega": 0,}, 175 | "expected": {"f_of_omega": prices_B,}, 176 | }, 177 | { 178 | "name": "extra_check_1", 179 | "input": {"omega": 0.2,}, 180 | "expected": {"f_of_omega": prices_A * 0.2 + prices_B * (1-0.2),}, 181 | }, 182 | { 183 | "name": "extra_check_2", 184 | "input": {"omega": 0.8,}, 185 | "expected": {"f_of_omega": prices_A * 0.8 + prices_B * (1-0.8),}, 186 | }, 187 | { 188 | "name": "extra_check_3", 189 | "input": {"omega": 1,}, 190 | "expected": {"f_of_omega": prices_A,}, 191 | }, 192 | ] 193 | 194 | for test_case in test_cases: 195 | result = target_f_of_omega(test_case["input"]["omega"]) 196 | 197 | try: 198 | assert result.shape == test_case["expected"]["f_of_omega"].shape 199 | successful_cases 
+= 1 200 | except: 201 | failed_cases.append( 202 | { 203 | "name": test_case["name"], 204 | "expected": test_case["expected"]["f_of_omega"].shape, 205 | "got": result.shape, 206 | } 207 | ) 208 | print( 209 | f"Test case \"{failed_cases[-1].get('name')}\". Wrong shape of f_of_omega output for omega = {test_case['input']['omega']}. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 210 | ) 211 | 212 | try: 213 | assert np.allclose(result, test_case["expected"]["f_of_omega"]) 214 | successful_cases += 1 215 | 216 | except: 217 | failed_cases.append( 218 | { 219 | "name": test_case["name"], 220 | "expected": test_case["expected"]["f_of_omega"], 221 | "got": result, 222 | } 223 | ) 224 | print( 225 | f"Test case \"{failed_cases[-1].get('name')}\". Wrong output of f_of_omega for omega = {test_case['input']['omega']}. \n\tExpected: \n{failed_cases[-1].get('expected')}\n\tGot: \n{failed_cases[-1].get('got')}" 226 | ) 227 | 228 | if len(failed_cases) == 0: 229 | print("\033[92m All tests passed") 230 | else: 231 | print("\033[92m", successful_cases, " Tests passed") 232 | print("\033[91m", len(failed_cases), " Tests failed") 233 | 234 | 235 | def test_L_of_omega_array(target_L_of_omega_array): 236 | successful_cases = 0 237 | failed_cases = [] 238 | 239 | # Not all of the values of the output array will be checked - only some of them. 240 | # In graders all of the output array gets checked. 
241 | test_cases = [ 242 | { 243 | "name": "default_check", 244 | "input": {"omega_array": np.linspace(0, 1, 1001, endpoint=True),}, 245 | "expected": {"shape": (1001,), 246 | "L_of_omega_array": [ 247 | {"i": 0, "L_of_omega": 110.72,}, 248 | {"i": 1000, "L_of_omega": 27.48,}, 249 | {"i": 400, "L_of_omega": 28.051199,}, 250 | ],} 251 | }, 252 | { 253 | "name": "extra_check", 254 | "input": {"omega_array": np.linspace(0, 1, 11, endpoint=True),}, 255 | "expected": {"shape": (11,), 256 | "L_of_omega_array": [ 257 | {"i": 0, "L_of_omega": 110.72,}, 258 | {"i": 11, "L_of_omega": 27.48,}, 259 | {"i": 5, "L_of_omega": 17.67,}, 260 | ],} 261 | }, 262 | ] 263 | 264 | for test_case in test_cases: 265 | result = target_L_of_omega_array(test_case["input"]["omega_array"]) 266 | 267 | try: 268 | assert result.shape == test_case["expected"]["shape"] 269 | successful_cases += 1 270 | except: 271 | failed_cases.append( 272 | { 273 | "name": test_case["name"], 274 | "expected": test_case["expected"]["shape"], 275 | "got": result.shape, 276 | } 277 | ) 278 | print( 279 | f"Test case \"{failed_cases[-1].get('name')}\". Wrong shape of L_of_omega_array output. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 280 | ) 281 | 282 | for test_case_i in test_case["expected"]["L_of_omega_array"]: 283 | i = test_case_i["i"] 284 | 285 | try: 286 | assert isclose(result[i], test_case_i["L_of_omega"], abs_tol=1e-5) 287 | successful_cases += 1 288 | 289 | except: 290 | failed_cases.append( 291 | { 292 | "name": test_case["name"], 293 | "expected": test_case_i["L_of_omega"], 294 | "got": result[i], 295 | } 296 | ) 297 | print( 298 | f"Test case \"{failed_cases[-1].get('name')}\". Wrong output of L_of_omega_array for omega_array = \n{test_case['input']['omega_array']}\nTest for index i = {i}. 
\n\tExpected: \n{failed_cases[-1].get('expected')}\n\tGot: \n{failed_cases[-1].get('got')}" 299 | ) 300 | 301 | if len(failed_cases) == 0: 302 | print("\033[92m All tests passed") 303 | else: 304 | print("\033[92m", successful_cases, " Tests passed") 305 | print("\033[91m", len(failed_cases), " Tests failed") 306 | 307 | 308 | def test_dLdOmega_of_omega_array(target_dLdOmega_of_omega_array): 309 | successful_cases = 0 310 | failed_cases = [] 311 | 312 | # Not all of the values of the output array will be checked - only some of them. 313 | # In graders all of the output array gets checked. 314 | test_cases = [ 315 | { 316 | "name": "default_check", 317 | "input": {"omega_array": np.linspace(0, 1, 1001, endpoint=True),}, 318 | "expected": {"shape": (1001,), 319 | "dLdOmega_of_omega_array": [ 320 | {"i": 0, "dLdOmega_of_omega": -288.96,}, 321 | {"i": 1000, "dLdOmega_of_omega": 122.47999,}, 322 | {"i": 400, "dLdOmega_of_omega": -124.38398,}, 323 | ],} 324 | }, 325 | { 326 | "name": "extra_check", 327 | "input": {"omega_array": np.linspace(0, 1, 11, endpoint=True),}, 328 | "expected": {"shape": (11,), 329 | "dLdOmega_of_omega_array": [ 330 | {"i": 0, "dLdOmega_of_omega": -288.96,}, 331 | {"i": 11, "dLdOmega_of_omega": 122.47999,}, 332 | {"i": 5, "dLdOmega_of_omega": -83.240036,}, 333 | ],} 334 | }, 335 | ] 336 | 337 | for test_case in test_cases: 338 | result = target_dLdOmega_of_omega_array(test_case["input"]["omega_array"]) 339 | 340 | try: 341 | assert result.shape == test_case["expected"]["shape"] 342 | successful_cases += 1 343 | except: 344 | failed_cases.append( 345 | { 346 | "name": test_case["name"], 347 | "expected": test_case["expected"]["shape"], 348 | "got": result.shape, 349 | } 350 | ) 351 | print( 352 | f"Test case \"{failed_cases[-1].get('name')}\". Wrong shape of dLdOmega_of_omega_array output. \n\tExpected: {failed_cases[-1].get('expected')}.\n\tGot: {failed_cases[-1].get('got')}." 
353 | ) 354 | 355 | for test_case_i in test_case["expected"]["dLdOmega_of_omega_array"]: 356 | i = test_case_i["i"] 357 | 358 | try: 359 | assert isclose(result[i], test_case_i["dLdOmega_of_omega"], abs_tol=1e-5) 360 | successful_cases += 1 361 | 362 | except: 363 | failed_cases.append( 364 | { 365 | "name": test_case["name"], 366 | "expected": test_case_i["dLdOmega_of_omega"], 367 | "got": result[i], 368 | } 369 | ) 370 | print( 371 | f"Test case \"{failed_cases[-1].get('name')}\". Wrong output of dLdOmega_of_omega_array for omega_array = \n{test_case['input']['omega_array']}\nTest for index i = {i}. \n\tExpected: \n{failed_cases[-1].get('expected')}\n\tGot: \n{failed_cases[-1].get('got')}" 372 | ) 373 | 374 | if len(failed_cases) == 0: 375 | print("\033[92m All tests passed") 376 | else: 377 | print("\033[92m", successful_cases, " Tests passed") 378 | print("\033[91m", len(failed_cases), " Tests failed") 379 | -------------------------------------------------------------------------------- /C2/w1/C2w1notes.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w1/C2w1notes.pdf -------------------------------------------------------------------------------- /C2/w1/Readme.md: -------------------------------------------------------------------------------- 1 | ## Course 2 - Week 1 2 | 3 | - [Practice Quiz - Derivatives](/C2/w1/pq1/) 4 | - [Ungraded Lab - Differentiation in Python](/C2/w1/C2_W1_Lab_1_differentiation_in_python.ipynb) 5 | - [Graded Quiz - Derivatives and Optimization](/C2/w1/q1/) 6 | - [Programming Assignment](/C2/w1/C2w1_graded_lab/) 7 | - [Lecture Materials](/C2/w1/C2w1notes.pdf) -------------------------------------------------------------------------------- /C2/w1/pq1/Readme.md: 
-------------------------------------------------------------------------------- 1 | # Practice Quiz - Derivatives 2 | 3 | ![](/C2/w1/pq1/ss1.png) 4 | ![](/C2/w1/pq1/ss2.png) 5 | ![](/C2/w1/pq1/ss3.png) -------------------------------------------------------------------------------- /C2/w1/pq1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w1/pq1/ss1.png -------------------------------------------------------------------------------- /C2/w1/pq1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w1/pq1/ss2.png -------------------------------------------------------------------------------- /C2/w1/pq1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w1/pq1/ss3.png -------------------------------------------------------------------------------- /C2/w1/q1/Readme.md: -------------------------------------------------------------------------------- 1 | # Graded Quiz - Derivatives and Optimization 2 | 3 | ![](/C2/w1/q1/ss1.png) 4 | ![](/C2/w1/q1/ss2.png) 5 | ![](/C2/w1/q1/ss3.png) 6 | ![](/C2/w1/q1/ss4.png) -------------------------------------------------------------------------------- /C2/w1/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w1/q1/ss1.png 
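The `w1_unittest.py` file above exercises an `f_of_omega` function that linearly blends the two supplier price series from `data/prices.csv`. A minimal implementation consistent with its test cases might look like this - a sketch that uses plain NumPy instead of the assignment's `jax.numpy` and, for brevity, only the first five prices of each series:

```python
import numpy as np

# First five monthly prices from data/prices.csv (dollars per item).
prices_A = np.array([104., 108., 101., 104., 102.])  # supplier A
prices_B = np.array([76., 76., 84., 79., 81.])       # supplier B

def f_of_omega(omega):
    """Blend of the two price series: omega=1 -> all A, omega=0 -> all B.

    This formula is inferred from the unit test's expected values,
    e.g. extra_check_1 expects prices_A * 0.2 + prices_B * (1 - 0.2).
    """
    return prices_A * omega + prices_B * (1 - omega)

# Boundary cases match test_f_of_omega's default_check and extra_check_3.
assert np.allclose(f_of_omega(0), prices_B)
assert np.allclose(f_of_omega(1), prices_A)
assert np.allclose(f_of_omega(0.2), 0.2 * prices_A + 0.8 * prices_B)
```

The graded assignment evaluates `f_of_omega` over a whole `np.linspace(0, 1, 1001)` grid of blend weights, which is what `test_L_of_omega_array` checks.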
-------------------------------------------------------------------------------- /C2/w1/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w1/q1/ss2.png -------------------------------------------------------------------------------- /C2/w1/q1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w1/q1/ss3.png -------------------------------------------------------------------------------- /C2/w1/q1/ss4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w1/q1/ss4.png -------------------------------------------------------------------------------- /C2/w2/C2w2_graded_lab/data/tvmarketing.csv: -------------------------------------------------------------------------------- 1 | TV,Sales 2 | 230.1,22.1 3 | 44.5,10.4 4 | 17.2,9.3 5 | 151.5,18.5 6 | 180.8,12.9 7 | 8.7,7.2 8 | 57.5,11.8 9 | 120.2,13.2 10 | 8.6,4.8 11 | 199.8,10.6 12 | 66.1,8.6 13 | 214.7,17.4 14 | 23.8,9.2 15 | 97.5,9.7 16 | 204.1,19 17 | 195.4,22.4 18 | 67.8,12.5 19 | 281.4,24.4 20 | 69.2,11.3 21 | 147.3,14.6 22 | 218.4,18 23 | 237.4,12.5 24 | 13.2,5.6 25 | 228.3,15.5 26 | 62.3,9.7 27 | 262.9,12 28 | 142.9,15 29 | 240.1,15.9 30 | 248.8,18.9 31 | 70.6,10.5 32 | 292.9,21.4 33 | 112.9,11.9 34 | 97.2,9.6 35 | 265.6,17.4 36 | 95.7,9.5 37 | 290.7,12.8 38 | 266.9,25.4 39 | 74.7,14.7 40 | 43.1,10.1 41 | 228,21.5 42 | 202.5,16.6 43 | 177,17.1 44 | 293.6,20.7 45 | 206.9,12.9 46 | 25.1,8.5 47 | 175.1,14.9 48 | 
89.7,10.6 49 | 239.9,23.2 50 | 227.2,14.8 51 | 66.9,9.7 52 | 199.8,11.4 53 | 100.4,10.7 54 | 216.4,22.6 55 | 182.6,21.2 56 | 262.7,20.2 57 | 198.9,23.7 58 | 7.3,5.5 59 | 136.2,13.2 60 | 210.8,23.8 61 | 210.7,18.4 62 | 53.5,8.1 63 | 261.3,24.2 64 | 239.3,15.7 65 | 102.7,14 66 | 131.1,18 67 | 69,9.3 68 | 31.5,9.5 69 | 139.3,13.4 70 | 237.4,18.9 71 | 216.8,22.3 72 | 199.1,18.3 73 | 109.8,12.4 74 | 26.8,8.8 75 | 129.4,11 76 | 213.4,17 77 | 16.9,8.7 78 | 27.5,6.9 79 | 120.5,14.2 80 | 5.4,5.3 81 | 116,11 82 | 76.4,11.8 83 | 239.8,12.3 84 | 75.3,11.3 85 | 68.4,13.6 86 | 213.5,21.7 87 | 193.2,15.2 88 | 76.3,12 89 | 110.7,16 90 | 88.3,12.9 91 | 109.8,16.7 92 | 134.3,11.2 93 | 28.6,7.3 94 | 217.7,19.4 95 | 250.9,22.2 96 | 107.4,11.5 97 | 163.3,16.9 98 | 197.6,11.7 99 | 184.9,15.5 100 | 289.7,25.4 101 | 135.2,17.2 102 | 222.4,11.7 103 | 296.4,23.8 104 | 280.2,14.8 105 | 187.9,14.7 106 | 238.2,20.7 107 | 137.9,19.2 108 | 25,7.2 109 | 90.4,8.7 110 | 13.1,5.3 111 | 255.4,19.8 112 | 225.8,13.4 113 | 241.7,21.8 114 | 175.7,14.1 115 | 209.6,15.9 116 | 78.2,14.6 117 | 75.1,12.6 118 | 139.2,12.2 119 | 76.4,9.4 120 | 125.7,15.9 121 | 19.4,6.6 122 | 141.3,15.5 123 | 18.8,7 124 | 224,11.6 125 | 123.1,15.2 126 | 229.5,19.7 127 | 87.2,10.6 128 | 7.8,6.6 129 | 80.2,8.8 130 | 220.3,24.7 131 | 59.6,9.7 132 | 0.7,1.6 133 | 265.2,12.7 134 | 8.4,5.7 135 | 219.8,19.6 136 | 36.9,10.8 137 | 48.3,11.6 138 | 25.6,9.5 139 | 273.7,20.8 140 | 43,9.6 141 | 184.9,20.7 142 | 73.4,10.9 143 | 193.7,19.2 144 | 220.5,20.1 145 | 104.6,10.4 146 | 96.2,11.4 147 | 140.3,10.3 148 | 240.1,13.2 149 | 243.2,25.4 150 | 38,10.9 151 | 44.7,10.1 152 | 280.7,16.1 153 | 121,11.6 154 | 197.6,16.6 155 | 171.3,19 156 | 187.8,15.6 157 | 4.1,3.2 158 | 93.9,15.3 159 | 149.8,10.1 160 | 11.7,7.3 161 | 131.7,12.9 162 | 172.5,14.4 163 | 85.7,13.3 164 | 188.4,14.9 165 | 163.5,18 166 | 117.2,11.9 167 | 234.5,11.9 168 | 17.9,8 169 | 206.8,12.2 170 | 215.4,17.1 171 | 284.3,15 172 | 50,8.4 173 | 164.5,14.5 174 | 19.6,7.6 175 | 168.4,11.7 
176 | 222.4,11.5 177 | 276.9,27 178 | 248.4,20.2 179 | 170.2,11.7 180 | 276.7,11.8 181 | 165.6,12.6 182 | 156.6,10.5 183 | 218.5,12.2 184 | 56.2,8.7 185 | 287.6,26.2 186 | 253.8,17.6 187 | 205,22.6 188 | 139.5,10.3 189 | 191.1,17.3 190 | 286,15.9 191 | 18.7,6.7 192 | 39.5,10.8 193 | 75.5,9.9 194 | 17.2,5.9 195 | 166.8,19.6 196 | 149.7,17.3 197 | 38.2,7.6 198 | 94.2,9.7 199 | 177,12.8 200 | 283.6,25.5 201 | 232.1,13.4 202 | -------------------------------------------------------------------------------- /C2/w2/C2w2notes.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w2/C2w2notes.pdf -------------------------------------------------------------------------------- /C2/w2/Readme.md: -------------------------------------------------------------------------------- 1 | ## Course 2 - Week 2 2 | 3 | - [Practice Quiz - Partial Derivatives and Gradient](/C2/w2/pq1/) 4 | - [Ungraded Lab - Optimization Using Gradient Descent in One Variable](/C2/w2/lab/C2_W2_Lab_1_Optimization_Using_Gradient_Descent_in_One_Variable.ipynb) 5 | - [Ungraded Lab - Optimization Using Gradient Descent in Two Variables](/C2/w2/lab/C2_W2_Lab_2_Optimization_Using_Gradient_Descent_in_Two_Variables.ipynb) 6 | - [Graded Quiz - Partial Derivatives and Gradient Descent](/C2/w2/q1/) 7 | - [Programming Assignment - Optimization Using Gradient Descent: Linear Regression](/C2/w2/C2w2_graded_lab/) 8 | - [Lecture Materials](/C2/w2/C2w2notes.pdf) -------------------------------------------------------------------------------- /C2/w2/lab/C2_W2_Lab_1_Optimization_Using_Gradient_Descent_in_One_Variable.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "5e797b86", 6 | "metadata": {}, 7 | "source": [ 8 | "# 
Optimization Using Gradient Descent in One Variable" 9 | ] 10 | }, 11 | { 12 | "cell_type": "markdown", 13 | "id": "c29d4eb9", 14 | "metadata": {}, 15 | "source": [ 16 | "To understand how to optimize functions using gradient descent, start from simple examples - functions of one variable. In this lab, you will implement the gradient descent method for functions with single and multiple minima, experiment with the parameters and visualize the results. This will allow you to understand the advantages and disadvantages of the gradient descent method." 17 | ] 18 | }, 19 | { 20 | "cell_type": "markdown", 21 | "id": "18491c07", 22 | "metadata": {}, 23 | "source": [ 24 | "# Table of Contents\n", 25 | "\n", 26 | "- [ 1 - Function with One Global Minimum](#1)\n", 27 | "- [ 2 - Function with Multiple Minima](#2)" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "id": "4bd109eb", 33 | "metadata": {}, 34 | "source": [ 35 | "## Packages\n", 36 | "\n", 37 | "Run the following cell to load the packages you'll need." 38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "execution_count": null, 43 | "id": "ae466878", 44 | "metadata": {}, 45 | "outputs": [], 46 | "source": [ 47 | "import numpy as np\n", 48 | "import matplotlib.pyplot as plt\n", 49 | "# Some functions defined specifically for this notebook.\n", 50 | "from w2_tools import plot_f, gradient_descent_one_variable, f_example_2, dfdx_example_2\n", 51 | "# Magic command to make matplotlib plots interactive.\n", 52 | "%matplotlib widget" 53 | ] 54 | }, 55 | { 56 | "cell_type": "markdown", 57 | "id": "cc7350e4", 58 | "metadata": {}, 59 | "source": [ 60 | "\n", 61 | "## 1 - Function with One Global Minimum" 62 | ] 63 | }, 64 | { 65 | "cell_type": "markdown", 66 | "id": "8d92c5b7", 67 | "metadata": {}, 68 | "source": [ 69 | "Function $f\\left(x\\right)=e^x - \\log(x)$ (defined for $x>0$) is a function of one variable which has only one **minimum point** (called **global minimum**). 
However, sometimes that minimum point cannot be found analytically by solving the equation $\\frac{df}{dx}=0$. In such cases it can be approximated numerically using the gradient descent method.\n", 70 | "\n", 71 | "To implement gradient descent, you need to start from some initial point $x_0$. Aiming to find a point where the derivative equals zero, you want to move \"down the hill\". Calculate the derivative $\\frac{df}{dx}(x_0)$ (called a **gradient**) and step to the next point using the expression:\n", 72 | "\n", 73 | "$$x_1 = x_0 - \\alpha \\frac{df}{dx}(x_0),\\tag{1}$$\n", 74 | "\n", 75 | "where $\\alpha>0$ is a parameter called a **learning rate**. Repeat the process iteratively. The number of iterations $n$ is usually also a parameter.\n", 76 | "\n", 77 | "By subtracting $\\frac{df}{dx}(x_0)$ you move \"down the hill\", against the increase of the function - toward the minimum point. So, $\\frac{df}{dx}(x_0)$ generally defines the direction of movement, while the parameter $\\alpha$ serves as a scaling factor.\n", 78 | "\n", 79 | "Now it's time to implement the gradient descent method and experiment with the parameters!\n", 80 | "\n", 81 | "First, define function $f\\left(x\\right)=e^x - \\log(x)$ and its derivative $\\frac{df}{dx}\\left(x\\right)=e^x - \\frac{1}{x}$:" 82 | ] 83 | }, 84 | { 85 | "cell_type": "code", 86 | "execution_count": null, 87 | "id": "018e0f4c", 88 | "metadata": {}, 89 | "outputs": [], 90 | "source": [ 91 | "def f_example_1(x):\n", 92 | " return np.exp(x) - np.log(x)\n", 93 | "\n", 94 | "def dfdx_example_1(x):\n", 95 | " return np.exp(x) - 1/x" 96 | ] 97 | }, 98 | { 99 | "cell_type": "markdown", 100 | "id": "88843676", 101 | "metadata": {}, 102 | "source": [ 103 | "Function $f\\left(x\\right)$ has one global minimum. 
Let's plot the function:" 104 | ] 105 | }, 106 | { 107 | "cell_type": "code", 108 | "execution_count": null, 109 | "id": "f5d4b4f0", 110 | "metadata": {}, 111 | "outputs": [], 112 | "source": [ 113 | "plot_f([0.001, 2.5], [-0.3, 13], f_example_1, 0.0)" 114 | ] 115 | }, 116 | { 117 | "cell_type": "markdown", 118 | "id": "d3ece7aa", 119 | "metadata": {}, 120 | "source": [ 121 | "Gradient descent can be implemented with the following function: " 122 | ] 123 | }, 124 | { 125 | "cell_type": "code", 126 | "execution_count": null, 127 | "id": "3d35a4cb", 128 | "metadata": {}, 129 | "outputs": [], 130 | "source": [ 131 | "def gradient_descent(dfdx, x, learning_rate = 0.1, num_iterations = 100):\n", 132 | " for iteration in range(num_iterations):\n", 133 | " x = x - learning_rate * dfdx(x)\n", 134 | " return x" 135 | ] 136 | }, 137 | { 138 | "cell_type": "markdown", 139 | "id": "abd7a1c4", 140 | "metadata": {}, 141 | "source": [ 142 | "Note that there are three parameters in this implementation: `num_iterations`, `learning_rate`, and the initial point `x_initial`. Model parameters for methods such as gradient descent are usually found experimentally. For now, just assume that you know the parameters that will work in this model - you will see a discussion of that later. To optimize the function, set up the parameters and call the defined function `gradient_descent`:" 143 | ] 144 | }, 145 | { 146 | "cell_type": "code", 147 | "execution_count": null, 148 | "id": "a3aa3c23", 149 | "metadata": {}, 150 | "outputs": [], 151 | "source": [ 152 | "num_iterations = 25; learning_rate = 0.1; x_initial = 1.6\n", 153 | "print(\"Gradient descent result: x_min =\", gradient_descent(dfdx_example_1, x_initial, learning_rate, num_iterations)) " 154 | ] 155 | }, 156 | { 157 | "cell_type": "markdown", 158 | "id": "b5656b68", 159 | "metadata": {}, 160 | "source": [ 161 | "The code in the following cell will help you to visualize and understand the gradient descent method more deeply. 
After the end of the animation, you can click on the plot to choose a new initial point and investigate how the gradient descent method behaves.\n", 162 | "\n", 163 | "You can see that it works successfully here, bringing the point to the global minimum!\n", 164 | "\n", 165 | "What if some of the parameters are changed? Will the method always work? Uncomment the lines in the cell below and rerun the code to investigate what happens if other parameter values are chosen. Try to analyse the results. You can read some comments below.\n", 166 | "\n", 167 | "*Notes related to this animation*: \n", 168 | "- Gradient descent is performed with some pauses between the iterations for visualization purposes. The actual implementation is much faster.\n", 169 | "- The animation stops when the minimum point is reached with a certain accuracy (this may take fewer steps than `num_iterations`) - to avoid long runs of the code and for teaching purposes.\n", 170 | "- Please wait for the end of the animation before making any code changes or rerunning the cell. In case of any issues, you can try to restart the Kernel and rerun the notebook."
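,
"\n",
"*A quick numeric check of equation (1), using the function from this lab (the numbers below are rounded approximations):* starting from $x_0 = 1.6$ with $\\alpha = 0.1$,\n",
"\n",
"$$x_1 = x_0 - \\alpha \\frac{df}{dx}(x_0) = 1.6 - 0.1\\left(e^{1.6} - \\frac{1}{1.6}\\right) \\approx 1.6 - 0.1 \\cdot 4.33 \\approx 1.17,$$\n",
"\n",
"so even the first step already moves the point a noticeable distance toward the minimum."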
171 | ] 172 | }, 173 | { 174 | "cell_type": "code", 175 | "execution_count": null, 176 | "id": "b9eece7b", 177 | "metadata": {}, 178 | "outputs": [], 179 | "source": [ 180 | "num_iterations = 25; learning_rate = 0.1; x_initial = 1.6\n", 181 | "# num_iterations = 25; learning_rate = 0.3; x_initial = 1.6\n", 182 | "# num_iterations = 25; learning_rate = 0.5; x_initial = 1.6\n", 183 | "# num_iterations = 25; learning_rate = 0.04; x_initial = 1.6\n", 184 | "# num_iterations = 75; learning_rate = 0.04; x_initial = 1.6\n", 185 | "# num_iterations = 25; learning_rate = 0.1; x_initial = 0.05\n", 186 | "# num_iterations = 25; learning_rate = 0.1; x_initial = 0.03\n", 187 | "# num_iterations = 25; learning_rate = 0.1; x_initial = 0.02\n", 188 | "\n", 189 | "gd_example_1 = gradient_descent_one_variable([0.001, 2.5], [-0.3, 13], f_example_1, dfdx_example_1, \n", 190 | " gradient_descent, num_iterations, learning_rate, x_initial, 0.0, [0.35, 9.5])" 191 | ] 192 | }, 193 | { 194 | "cell_type": "markdown", 195 | "id": "17d96ced", 196 | "metadata": {}, 197 | "source": [ 198 | "Comments related to the choice of the parameters in the animation above:\n", 199 | "\n", 200 | "- Choosing `num_iterations = 25`, `learning_rate = 0.1`, `x_initial = 1.6` you get to the minimum point successfully - in fact a little earlier, at iteration 21. So for this choice of the learning rate and initial point, fewer than `25` iterations would have been enough, saving some computation time.\n", 201 | "\n", 202 | "- Increasing the `learning_rate` to `0.3` you can see that the method converges even faster - you need fewer iterations. But note that the steps are larger and this may cause some problems.\n", 203 | "\n", 204 | "- Increasing the `learning_rate` further to `0.5` the method doesn't converge anymore! You stepped too far away from the minimum point. So, be careful - with a larger `learning_rate` the method may converge significantly faster... 
or not converge at all.\n", 205 | "\n", 206 | "- To be \"safe\", you may think, why not simply decrease `learning_rate`? Set it to `0.04`, keeping the rest of the parameters the same. Now the model does not run enough iterations to converge!\n", 207 | "\n", 208 | "- Increasing `num_iterations`, say to `75`, the model will converge, but slowly. This would be more \"expensive\" computationally.\n", 209 | "\n", 210 | "- What if you go back to the original parameters `num_iterations = 25`, `learning_rate = 0.1`, but choose some other `x_initial`, e.g. `0.05`? The function is steeper at that point, thus the gradient is larger in absolute value, and the first step is larger. But it will work - you will get to the minimum point.\n", 211 | "\n", 212 | "- If you take `x_initial = 0.03` the function is even steeper, making the first step significantly larger. You risk \"missing\" the minimum point.\n", 213 | "\n", 214 | "- Taking `x_initial = 0.02` the method doesn't converge anymore...\n", 215 | "\n", 216 | "This is a very simple example, but hopefully it gives you an idea of how important the choice of the initial parameters is." 217 | ] 218 | }, 219 | { 220 | "cell_type": "markdown", 221 | "id": "b223b5fc", 222 | "metadata": {}, 223 | "source": [ 224 | "\n", 225 | "## 2 - Function with Multiple Minima" 226 | ] 227 | }, 228 | { 229 | "cell_type": "markdown", 230 | "id": "0c5ab6e6", 231 | "metadata": {}, 232 | "source": [ 233 | "Now you can take a slightly more complicated example - a function in one variable, but with multiple minima. 
Such an example was shown in the videos, and you can plot the function with the following code:" 234 | ] 235 | }, 236 | { 237 | "cell_type": "code", 238 | "execution_count": null, 239 | "id": "4d7c3352", 240 | "metadata": {}, 241 | "outputs": [], 242 | "source": [ 243 | "plot_f([0.001, 2], [-6.3, 5], f_example_2, -6)" 244 | ] 245 | }, 246 | { 247 | "cell_type": "markdown", 248 | "id": "8c394dac", 249 | "metadata": {}, 250 | "source": [ 251 | "Function `f_example_2` and its derivative `dfdx_example_2` are pre-defined and loaded into this notebook. At this stage, while you are mastering the optimization method, do not worry about the corresponding expressions - just concentrate on gradient descent and the related parameters.\n", 252 | "\n", 253 | "Use the following code to run gradient descent with the same `learning_rate` and `num_iterations`, but with two different starting points:" 254 | ] 255 | }, 256 | { 257 | "cell_type": "code", 258 | "execution_count": null, 259 | "id": "78dab02b", 260 | "metadata": {}, 261 | "outputs": [], 262 | "source": [ 263 | "print(\"Gradient descent results\")\n", 264 | "print(\"Global minimum: x_min =\", gradient_descent(dfdx_example_2, x=1.3, learning_rate=0.005, num_iterations=35)) \n", 265 | "print(\"Local minimum: x_min =\", gradient_descent(dfdx_example_2, x=0.25, learning_rate=0.005, num_iterations=35)) " 266 | ] 267 | }, 268 | { 269 | "cell_type": "markdown", 270 | "id": "ef98246a", 271 | "metadata": {}, 272 | "source": [ 273 | "The results are different. Both times the point fell into one of the minima, but in the first run it was the global minimum, while in the second run it got \"stuck\" in a local one. To see the visualization of what is happening, run the code below. You can uncomment the lines to try different sets of parameters or click on the plot to choose the initial point (after the end of the animation)."
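,
"\n",
"*Why gradient descent can get \"stuck\" in a local minimum:* at any minimum point $x^*$, local or global, the derivative vanishes,\n",
"\n",
"$$\\frac{df}{dx}(x^*) = 0,$$\n",
"\n",
"so near a local minimum the update steps of equation (1) become smaller and smaller, and the method has no information that a deeper minimum exists elsewhere."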
274 | ] 275 | }, 276 | { 277 | "cell_type": "code", 278 | "execution_count": null, 279 | "id": "ca1f892d", 280 | "metadata": {}, 281 | "outputs": [], 282 | "source": [ 283 | "num_iterations = 35; learning_rate = 0.005; x_initial = 1.3\n", 284 | "# num_iterations = 35; learning_rate = 0.005; x_initial = 0.25\n", 285 | "# num_iterations = 35; learning_rate = 0.01; x_initial = 1.3\n", 286 | "\n", 287 | "gd_example_2 = gradient_descent_one_variable([0.001, 2], [-6.3, 5], f_example_2, dfdx_example_2, \n", 288 | " gradient_descent, num_iterations, learning_rate, x_initial, -6, [0.1, -0.5])" 289 | ] 290 | }, 291 | { 292 | "cell_type": "markdown", 293 | "id": "7a1b84d9", 294 | "metadata": {}, 295 | "source": [ 296 | "You can see that the gradient descent method is powerful - it allows you to optimize a function with a small number of calculations - but it has some drawbacks. The efficiency of the method depends a lot on the choice of the initial parameters, and it is a challenge in machine learning applications to choose the \"right\" set of parameters to train the model!"
297 | ] 298 | }, 299 | { 300 | "cell_type": "code", 301 | "execution_count": null, 302 | "id": "d67c0446", 303 | "metadata": {}, 304 | "outputs": [], 305 | "source": [] 306 | } 307 | ], 308 | "metadata": { 309 | "kernelspec": { 310 | "display_name": "Python 3 (ipykernel)", 311 | "language": "python", 312 | "name": "python3" 313 | }, 314 | "language_info": { 315 | "codemirror_mode": { 316 | "name": "ipython", 317 | "version": 3 318 | }, 319 | "file_extension": ".py", 320 | "mimetype": "text/x-python", 321 | "name": "python", 322 | "nbconvert_exporter": "python", 323 | "pygments_lexer": "ipython3", 324 | "version": "3.10.6" 325 | } 326 | }, 327 | "nbformat": 4, 328 | "nbformat_minor": 5 329 | } 330 | -------------------------------------------------------------------------------- /C2/w2/lab/C2_W2_Lab_2_Optimization_Using_Gradient_Descent_in_Two_Variables.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "5e797b86", 6 | "metadata": {}, 7 | "source": [ 8 | "# Optimization Using Gradient Descent in Two Variables" 9 | ] 10 | }, 11 | { 12 | "cell_type": "markdown", 13 | "id": "c29d4eb9", 14 | "metadata": {}, 15 | "source": [ 16 | "In this lab, you will implement and visualize the gradient descent method optimizing some functions in two variables. You will have a chance to experiment with the initial parameters, and investigate the results and limitations of the method." 17 | ] 18 | }, 19 | { 20 | "cell_type": "markdown", 21 | "id": "18491c07", 22 | "metadata": {}, 23 | "source": [ 24 | "# Table of Contents\n", 25 | "\n", 26 | "- [ 1 - Function with One Global Minimum](#1)\n", 27 | "- [ 2 - Function with Multiple Minima](#2)" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "id": "4bd109eb", 33 | "metadata": {}, 34 | "source": [ 35 | "## Packages\n", 36 | "\n", 37 | "Run the following cell to load the packages you'll need." 
38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "execution_count": null, 43 | "id": "ae466878", 44 | "metadata": {}, 45 | "outputs": [], 46 | "source": [ 47 | "import numpy as np\n", 48 | "import matplotlib.pyplot as plt\n", 49 | "# Some functions defined specifically for this notebook.\n", 50 | "from w2_tools import (plot_f_cont_and_surf, gradient_descent_two_variables, \n", 51 | " f_example_3, dfdx_example_3, dfdy_example_3, \n", 52 | " f_example_4, dfdx_example_4, dfdy_example_4)\n", 53 | "# Magic command to make matplotlib plots interactive.\n", 54 | "%matplotlib widget" 55 | ] 56 | }, 57 | { 58 | "cell_type": "markdown", 59 | "id": "a605afb0", 60 | "metadata": {}, 61 | "source": [ 62 | "\n", 63 | "## 1 - Function with One Global Minimum" 64 | ] 65 | }, 66 | { 67 | "cell_type": "markdown", 68 | "id": "7df25271", 69 | "metadata": {}, 70 | "source": [ 71 | "Let's explore a simple example of a function in two variables $f\\left(x, y\\right)$ with one global minimum. Such a function was discussed in the videos; it is predefined and loaded into this notebook as `f_example_3` with its partial derivatives `dfdx_example_3` and `dfdy_example_3`. At this stage, you do not need to worry about the exact expression for that function and its partial derivatives, so you can focus on the implementation of gradient descent and the choice of the related parameters. Run the following cell to plot the function."
72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "execution_count": null, 77 | "id": "d83526e4", 78 | "metadata": {}, 79 | "outputs": [], 80 | "source": [ 81 | "plot_f_cont_and_surf([0, 5], [0, 5], [74, 85], f_example_3, cmap='coolwarm', view={'azim':-60,'elev':28})" 82 | ] 83 | }, 84 | { 85 | "cell_type": "markdown", 86 | "id": "f84421da", 87 | "metadata": {}, 88 | "source": [ 89 | "To find the minimum, you can implement gradient descent starting from the initial point $\\left(x_0, y_0\\right)$ and taking steps, iteration by iteration, using the following equations:\n", 90 | "\n", 91 | "\n", 92 | "$$x_1 = x_0 - \\alpha \\frac{\\partial f}{\\partial x}(x_0, y_0),$$ \n", 93 | "$$y_1 = y_0 - \\alpha \\frac{\\partial f}{\\partial y}(x_0, y_0),\\tag{1}$$\n", 94 | "\n", 95 | "where $\\alpha>0$ is a learning rate. The number of iterations is also a parameter. The method is implemented with the following code:" 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": null, 101 | "id": "e5eb04ad", 102 | "metadata": {}, 103 | "outputs": [], 104 | "source": [ 105 | "def gradient_descent(dfdx, dfdy, x, y, learning_rate = 0.1, num_iterations = 100):\n", 106 | " for iteration in range(num_iterations):\n", 107 | " x, y = x - learning_rate * dfdx(x, y), y - learning_rate * dfdy(x, y)\n", 108 | " return x, y" 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "id": "eb8bdca3", 114 | "metadata": {}, 115 | "source": [ 116 | "Now to optimize the function, set up the parameters `num_iterations`, `learning_rate`, `x_initial`, `y_initial` and run gradient descent:" 117 | ] 118 | }, 119 | { 120 | "cell_type": "code", 121 | "execution_count": null, 122 | "id": "5673ecfb", 123 | "metadata": {}, 124 | "outputs": [], 125 | "source": [ 126 | "num_iterations = 30; learning_rate = 0.25; x_initial = 0.5; y_initial = 0.6\n", 127 | "print(\"Gradient descent result: x_min, y_min =\", \n", 128 | " gradient_descent(dfdx_example_3, dfdy_example_3, x_initial, y_initial, 
learning_rate, num_iterations)) " 129 | ] 130 | }, 131 | { 132 | "cell_type": "markdown", 133 | "id": "73a835a0", 134 | "metadata": {}, 135 | "source": [ 136 | "You can see the visualization running the following code. Note that gradient descent in two variables performs steps on the plane, in a direction opposite to the gradient vector $\\begin{bmatrix}\\frac{\\partial f}{\\partial x}(x_0, y_0) \\\\ \\frac{\\partial f}{\\partial y}(x_0, y_0)\\end{bmatrix}$ with the learning rate $\\alpha$ as a scaling factor.\n", 137 | "\n", 138 | "By uncommenting different lines you can experiment with various sets of the parameter values and corresponding results. At the end of the animation, you can also click on the contour plot to choose the initial point and restart the animation automatically.\n", 139 | "\n", 140 | "Run a few experiments and try to explain what is actually happening in each of the cases." 141 | ] 142 | }, 143 | { 144 | "cell_type": "code", 145 | "execution_count": null, 146 | "id": "20711620", 147 | "metadata": {}, 148 | "outputs": [], 149 | "source": [ 150 | "num_iterations = 20; learning_rate = 0.25; x_initial = 0.5; y_initial = 0.6\n", 151 | "# num_iterations = 20; learning_rate = 0.5; x_initial = 0.5; y_initial = 0.6\n", 152 | "# num_iterations = 20; learning_rate = 0.15; x_initial = 0.5; y_initial = 0.6\n", 153 | "# num_iterations = 20; learning_rate = 0.15; x_initial = 3.5; y_initial = 3.6\n", 154 | "\n", 155 | "gd_example_3 = gradient_descent_two_variables([0, 5], [0, 5], [74, 85], \n", 156 | " f_example_3, dfdx_example_3, dfdy_example_3, \n", 157 | " gradient_descent, num_iterations, learning_rate, \n", 158 | " x_initial, y_initial, \n", 159 | " [0.1, 0.1, 81.5], 2, [4, 1, 171], \n", 160 | " cmap='coolwarm', view={'azim':-60,'elev':28})" 161 | ] 162 | }, 163 | { 164 | "cell_type": "markdown", 165 | "id": "88824f8d", 166 | "metadata": {}, 167 | "source": [ 168 | "\n", 169 | "## 2 - Function with Multiple Minima" 170 | ] 171 | }, 172 | { 173 | 
"cell_type": "markdown", 174 | "id": "6e1c647c", 175 | "metadata": {}, 176 | "source": [ 177 | "Let's investigate a more complicated case of a function, which was also shown in the videos:" 178 | ] 179 | }, 180 | { 181 | "cell_type": "code", 182 | "execution_count": null, 183 | "id": "d2ea18d3", 184 | "metadata": {}, 185 | "outputs": [], 186 | "source": [ 187 | "plot_f_cont_and_surf([0, 5], [0, 5], [6, 9.5], f_example_4, cmap='terrain', view={'azim':-63,'elev':21})" 188 | ] 189 | }, 190 | { 191 | "cell_type": "markdown", 192 | "id": "3ddc17a8", 193 | "metadata": {}, 194 | "source": [ 195 | "You can find its global minimum point by using gradient descent with the following parameters:" 196 | ] 197 | }, 198 | { 199 | "cell_type": "code", 200 | "execution_count": null, 201 | "id": "8fef9c35", 202 | "metadata": {}, 203 | "outputs": [], 204 | "source": [ 205 | "num_iterations = 100; learning_rate = 0.2; x_initial = 0.5; y_initial = 3\n", 206 | "\n", 207 | "print(\"Gradient descent result: x_min, y_min =\", \n", 208 | " gradient_descent(dfdx_example_4, dfdy_example_4, x_initial, y_initial, learning_rate, num_iterations)) " 209 | ] 210 | }, 211 | { 212 | "cell_type": "markdown", 213 | "id": "ba6499be", 214 | "metadata": {}, 215 | "source": [ 216 | "However, the shape of the surface is much more complicated and not every initial point will bring you to the global minimum of this surface. Use the following code to explore various sets of parameters and the results of gradient descent." 
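,
"\n",
"One simple practical remedy (not implemented in this lab) is to restart gradient descent from several initial points $\\left(x_0^{(i)}, y_0^{(i)}\\right)$ and keep the best of the returned candidates:\n",
"\n",
"$$\\left(x^*, y^*\\right) = \\arg\\min_{i} f\\left(x^{(i)}, y^{(i)}\\right),$$\n",
"\n",
"where $\\left(x^{(i)}, y^{(i)}\\right)$ is the point returned by run $i$. You can imitate this manually here by clicking several different initial points on the contour plot."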
217 | ] 218 | }, 219 | { 220 | "cell_type": "code", 221 | "execution_count": null, 222 | "id": "c9269ef2", 223 | "metadata": {}, 224 | "outputs": [], 225 | "source": [ 226 | "# Converges to the global minimum point.\n", 227 | "num_iterations = 30; learning_rate = 0.2; x_initial = 0.5; y_initial = 3\n", 228 | "# Converges to a local minimum point.\n", 229 | "# num_iterations = 20; learning_rate = 0.2; x_initial = 2; y_initial = 3\n", 230 | "# Converges to another local minimum point.\n", 231 | "# num_iterations = 20; learning_rate = 0.2; x_initial = 4; y_initial = 0.5\n", 232 | "\n", 233 | "gd_example_4 = gradient_descent_two_variables([0, 5], [0, 5], [6, 9.5], \n", 234 | " f_example_4, dfdx_example_4, dfdy_example_4, \n", 235 | " gradient_descent, num_iterations, learning_rate, \n", 236 | " x_initial, y_initial, \n", 237 | " [2, 2, 6], 0.5, [2, 1, 63], \n", 238 | " cmap='terrain', view={'azim':-63,'elev':21})" 239 | ] 240 | }, 241 | { 242 | "cell_type": "markdown", 243 | "id": "1028397e", 244 | "metadata": {}, 245 | "source": [ 246 | "You had a chance to experience the robustness and limitations of the gradient descent methods for a function in two variables. 
" 247 | ] 248 | }, 249 | { 250 | "cell_type": "code", 251 | "execution_count": null, 252 | "id": "009daf3f", 253 | "metadata": {}, 254 | "outputs": [], 255 | "source": [] 256 | } 257 | ], 258 | "metadata": { 259 | "kernelspec": { 260 | "display_name": "Python 3 (ipykernel)", 261 | "language": "python", 262 | "name": "python3" 263 | }, 264 | "language_info": { 265 | "codemirror_mode": { 266 | "name": "ipython", 267 | "version": 3 268 | }, 269 | "file_extension": ".py", 270 | "mimetype": "text/x-python", 271 | "name": "python", 272 | "nbconvert_exporter": "python", 273 | "pygments_lexer": "ipython3", 274 | "version": "3.10.6" 275 | } 276 | }, 277 | "nbformat": 4, 278 | "nbformat_minor": 5 279 | } 280 | -------------------------------------------------------------------------------- /C2/w2/lab/w2_tools.py: -------------------------------------------------------------------------------- 1 | import time 2 | import numpy as np 3 | import matplotlib.pyplot as plt 4 | from matplotlib.widgets import Button 5 | from matplotlib.patches import FancyArrowPatch 6 | from matplotlib.gridspec import GridSpec 7 | from IPython.display import display, clear_output 8 | 9 | 10 | def plot_f(x_range, y_range, f, ox_position): 11 | x = np.linspace(*x_range, 100) 12 | fig, ax = plt.subplots(1,1,figsize=(8,4)) 13 | 14 | fig.canvas.toolbar_visible = False 15 | fig.canvas.header_visible = False 16 | fig.canvas.footer_visible = False 17 | 18 | ax.set_ylim(*y_range) 19 | ax.set_xlim(*x_range) 20 | ax.set_ylabel('$f$') 21 | ax.set_xlabel('$x$') 22 | ax.spines['left'].set_position('zero') 23 | ax.spines['bottom'].set_position(('data', ox_position)) 24 | ax.spines['right'].set_color('none') 25 | ax.spines['top'].set_color('none') 26 | ax.xaxis.set_ticks_position('bottom') 27 | ax.yaxis.set_ticks_position('left') 28 | ax.autoscale(enable=False) 29 | 30 | pf = ax.plot(x, f(x), 'k') 31 | 32 | return fig, ax 33 | 34 | 35 | class gradient_descent_one_variable: 36 | """ class to run one interactive plot 
""" 37 | def __init__(self, x_range, y_range, f, dfdx, gd, n_it, lr, x_0, ox_position, t_position): 38 | x = np.linspace(*x_range, 100) 39 | fig, ax = plot_f(x_range, y_range, f, ox_position) 40 | 41 | # Initialize plot. 42 | self.fig = fig 43 | self.ax = ax 44 | self.x = x 45 | self.f = f 46 | self.dfdx = dfdx 47 | self.gd = gd 48 | self.n_it = n_it 49 | self.lr = lr 50 | self.x_0 = x_0 51 | self.x_range = x_range 52 | self.i = 0 53 | self.ox_position = ox_position 54 | self.t_position = t_position 55 | 56 | self.update_plot_point(firsttime=True) 57 | self.path = path(self.x_0, self.ax, self.ox_position) # initialize an empty path, avoids existence check 58 | 59 | time.sleep(0.2) 60 | clear_output(wait=True) 61 | display(self.fig) 62 | 63 | self.run_gd() 64 | self.cpoint = self.fig.canvas.mpl_connect('button_press_event', self.click_plot) 65 | 66 | def click_plot(self, event): 67 | ''' Called when the user clicks on the plot. ''' 68 | # event.xdata is None when the click lands outside the axes. 69 | if (event.xdata is not None and event.xdata <= max(self.x) and event.xdata >= min(self.x)): 70 | self.x_0 = event.xdata 71 | self.i = 0 72 | self.path.re_init(self.x_0) 73 | self.update_plot_point() 74 | time.sleep(0.2) 75 | self.run_gd() 76 | 77 | def update_plot_point(self, firsttime=False): 78 | 79 | # Remove items and re-add them on plot. 
80 | if not firsttime: 81 | for artist in self.p_items: 82 | artist.remove() 83 | 84 | a = self.ax.scatter(self.x_0, self.f(self.x_0), marker='o', s=100, color='r', zorder=10) 85 | b = self.ax.scatter(self.x_0, self.ox_position, marker='o', s=100, color='k', zorder=10) 86 | c = self.ax.hlines(self.f(self.x_0), 0, self.x_0, lw=2, ls='dotted', color='k') 87 | d = self.ax.vlines(self.x_0, self.ox_position, self.f(self.x_0), lw=2, ls='dotted', color='k') 88 | t_it = self.ax.annotate(f"Iteration #${self.i}$", xy=(self.t_position[0], self.t_position[1]), 89 | xytext=(4,4), textcoords='offset points', size=10) 90 | t_x_0 = self.ax.annotate(f"$x_0 = {self.x_0:0.4f}$", xy=(self.t_position[0], self.t_position[1]-1), 91 | xytext=(4,4), textcoords='offset points', size=10) 92 | t_f = self.ax.annotate(f"$f\\,\\left(x_0\\right) = {self.f(self.x_0):0.2f}$", 93 | xy=(self.t_position[0], self.t_position[1]-2), xytext=(4,4), 94 | textcoords='offset points', size=10) 95 | t_dfdx = self.ax.annotate(f"$f\\,'\\left(x_0\\right) = {self.dfdx(self.x_0):0.4f}$", 96 | xy=(self.t_position[0], self.t_position[1]-3), 97 | xytext=(4,4), textcoords='offset points', size=10) 98 | 99 | self.p_items = [a, b, c, d, t_it, t_x_0, t_f, t_dfdx] 100 | self.fig.canvas.draw() 101 | 102 | def run_gd(self): 103 | self.i = 1 104 | x_0_new = self.gd(self.dfdx, self.x_0, self.lr, 1) 105 | while (self.i <= self.n_it and abs(self.dfdx(x_0_new)) >= 0.00001 and x_0_new >= self.x_range[0]): 106 | x_0_new = self.gd(self.dfdx, self.x_0, self.lr, 1) 107 | self.path.add_path_item(x_0_new, self.f) 108 | self.x_0 = x_0_new 109 | time.sleep(0.05) 110 | self.update_plot_point() 111 | clear_output(wait=True) 112 | display(self.fig) 113 | self.i += 1 114 | 115 | # Not converged: the gradient is still large, or the point left the plotted range. 116 | if abs(self.dfdx(self.x_0)) >= 0.00001 or self.x_0 < self.x_range[0]: 117 | t_res = self.ax.annotate("Has Not Converged", xy=(self.t_position[0], self.t_position[1]-4), 118 | xytext=(4,4), textcoords='offset points', size=10) 119 | else: 
118 | t_res = self.ax.annotate("Converged", xy=(self.t_position[0], self.t_position[1]-4), 119 | xytext=(4,4), textcoords='offset points', size=10) 120 | t_instruction = self.ax.text(0.3,0.95,"[Click on the plot to choose initial point]", 121 | size=10, color="r", transform=self.ax.transAxes) 122 | self.p_items.append(t_res) 123 | self.p_items.append(t_instruction) 124 | # Clear last time at the end, so there is no duplicate with the cell output. 125 | clear_output(wait=True) 126 | # plt.close() 127 | 128 | 129 | class path: 130 | ''' tracks paths during gradient descent on the plot ''' 131 | def __init__(self, x_0, ax, ox_position): 132 | ''' x_0 at start of path ''' 133 | self.path_items = [] 134 | self.x_0 = x_0 135 | self.ax = ax 136 | self.ox_position = ox_position 137 | 138 | def re_init(self, x_0): 139 | for artist in self.path_items: 140 | artist.remove() 141 | self.path_items = [] 142 | self.x_0 = x_0 143 | 144 | def add_path_item(self, x_0, f): 145 | a = FancyArrowPatch( 146 | posA=(self.x_0, self.ox_position), posB=(x_0, self.ox_position), color='r', 147 | arrowstyle='simple, head_width=5, head_length=10, tail_width=1.0', 148 | ) 149 | b = self.ax.scatter(self.x_0, f(self.x_0), facecolors='none', edgecolors='r', ls='dotted', s=100, zorder=10) 150 | self.ax.add_artist(a) 151 | self.path_items.append(a) 152 | self.path_items.append(b) 153 | self.x_0 = x_0 154 | 155 | 156 | # + 157 | def f_example_2(x): 158 | return (np.exp(x) - np.log(x))*np.sin(np.pi*x*2) 159 | 160 | def dfdx_example_2(x): 161 | return (np.exp(x) - 1/x)*np.sin(np.pi*x*2) + (np.exp(x) - \ 162 | np.log(x))*np.cos(np.pi*x*2)*2*np.pi 163 | 164 | 165 | # + 166 | def f_example_3(x,y): 167 | return (85+ 0.1*(- 1/9*(x-6)*x**2*y**3 + 2/3*(x-6)*x**2*y**2)) 168 | 169 | def dfdx_example_3(x,y): 170 | return 0.1/3*x*y**2*(2-y/3)*(3*x-12) 171 | 172 | def dfdy_example_3(x,y): 173 | return 0.1/3*(x-6)*x**2*y*(4-y) 174 | 175 | 176 | # + 177 | def f_example_4(x,y): 178 | return 
-(10/(3+3*(x-.5)**2+3*(y-.5)**2) + \ 179 | 2/(1+2*((x-3)**2)+2*(y-1.5)**2) + \ 180 | 3/(1+.5*((x-3.5)**2)+0.5*(y-4)**2))+10 181 | 182 | def dfdx_example_4(x,y): 183 | return -(-2*3*(x-0.5)*10/(3+3*(x-0.5)**2+3*(y-0.5)**2)**2 + \ 184 | -2*2*(x-3)*2/(1+2*((x-3)**2)+2*(y-1.5)**2)**2 +\ 185 | -2*0.5*(x-3.5)*3/(1+.5*((x-3.5)**2)+0.5*(y-4)**2)**2) 186 | 187 | def dfdy_example_4(x,y): 188 | return -(-2*3*(y-0.5)*10/(3+3*(x-0.5)**2+3*(y-0.5)**2)**2 + \ 189 | -2*2*(y-1.5)*2/(1+2*((x-3)**2)+2*(y-1.5)**2)**2 +\ 190 | -0.5*2*(y-4)*3/(1+.5*((x-3.5)**2)+0.5*(y-4)**2)**2) 191 | 192 | 193 | # - 194 | 195 | def plot_f_cont_and_surf(x_range, y_range, z_range, f, cmap, view): 196 | 197 | fig = plt.figure( figsize=(10,5)) 198 | fig.canvas.toolbar_visible = False 199 | fig.canvas.header_visible = False 200 | fig.canvas.footer_visible = False 201 | fig.set_facecolor('#ffffff') #white 202 | gs = GridSpec(1, 2, figure=fig) 203 | axc = fig.add_subplot(gs[0, 0]) 204 | axs = fig.add_subplot(gs[0, 1], projection='3d') 205 | 206 | x = np.linspace(*x_range, 51) 207 | y = np.linspace(*y_range, 51) 208 | X,Y = np.meshgrid(x,y) 209 | 210 | cont = axc.contour(X, Y, f(X, Y), cmap=cmap, levels=18, linewidths=2, alpha=0.7) 211 | axc.set_xlabel('$x$') 212 | axc.set_ylabel('$y$') 213 | axc.set_xlim(*x_range) 214 | axc.set_ylim(*y_range) 215 | axc.set_aspect("equal") 216 | axc.autoscale(enable=False) 217 | 218 | surf = axs.plot_surface(X,Y, f(X,Y), cmap=cmap, 219 | antialiased=True, cstride=1, rstride=1, alpha=0.69) 220 | axs.set_xlabel('$x$') 221 | axs.set_ylabel('$y$') 222 | axs.set_zlabel('$f$') 223 | axs.set_xlim(*x_range) 224 | axs.set_ylim(*y_range) 225 | axs.set_zlim(*z_range) 226 | axs.view_init(elev=view['elev'], azim=view['azim']) 227 | axs.autoscale(enable=False) 228 | 229 | return fig, axc, axs 230 | 231 | 232 | class gradient_descent_two_variables: 233 | """ class to run one interactive plot """ 234 | def __init__(self, x_range, y_range, z_range, f, dfdx, dfdy, gd, n_it, lr, x_0, y_0, 235 | 
t_position, t_space, instr_position, cmap, view): 236 | 237 | x = np.linspace(*x_range, 51) 238 | y = np.linspace(*y_range, 51) 239 | fig, axc, axs = plot_f_cont_and_surf(x_range, y_range, z_range, f, cmap, view) 240 | 241 | # Initialize plot. 242 | self.fig = fig 243 | self.axc = axc 244 | self.axs = axs 245 | self.x = x 246 | self.y = y 247 | self.f = f 248 | self.dfdx = dfdx 249 | self.dfdy = dfdy 250 | self.gd = gd 251 | self.n_it = n_it 252 | self.lr = lr 253 | self.x_0 = x_0 254 | self.y_0 = y_0 255 | self.x_range = x_range 256 | self.y_range = y_range 257 | self.i = 0 258 | self.t_position = t_position 259 | self.t_space = t_space 260 | self.instr_position = instr_position 261 | 262 | self.update_plot_point(firsttime=True) 263 | self.path = path_2(self.x_0, self.y_0, self.axc, self.axs) # initialize an empty path, avoids existence check 264 | 265 | time.sleep(0.2) 266 | clear_output(wait=True) 267 | display(self.fig) 268 | 269 | self.run_gd() 270 | self.cpoint = self.fig.canvas.mpl_connect('button_press_event', self.click_plot) 271 | 272 | def click_plot(self, event): 273 | ''' Called on a click inside the contour plot axes. ''' 274 | if (event.xdata is not None and event.ydata is not None and # None for clicks outside the axes 275 | min(self.x) <= event.xdata <= max(self.x) and min(self.y) <= event.ydata <= max(self.y)): 276 | self.x_0 = event.xdata 277 | self.y_0 = event.ydata 278 | self.i = 0 279 | self.path.re_init(self.x_0, self.y_0) 280 | self.update_plot_point() 281 | time.sleep(0.2) 282 | self.run_gd() 283 | 284 | def update_plot_point(self, firsttime=False): 285 | 286 | # Remove items and re-add them on plot. 
287 | if not firsttime: 288 | for artist in self.p_items: 289 | artist.remove() 290 | 291 | a = self.axc.scatter(self.x_0, self.y_0, marker='o', s=100, color='k', zorder=10) 292 | b = self.axc.hlines(self.y_0, self.axc.get_xlim()[0], self.x_0, lw=2, ls='dotted', color='k') 293 | c = self.axc.vlines(self.x_0, self.axc.get_ylim()[0], self.y_0, lw=2, ls='dotted', color='k') 294 | d = self.axs.scatter3D(self.x_0, self.y_0, self.f(self.x_0, self.y_0), s=100, color='r', zorder=10) 295 | t_it = self.axs.text(self.t_position[0], self.t_position[1], self.t_position[2], 296 | f"Iteration #${self.i}$", size=10, zorder=20) 297 | t_x_y = self.axs.text(self.t_position[0], self.t_position[1], self.t_position[2]-self.t_space, 298 | f"$x_0, y_0 = {self.x_0:0.2f}, {self.y_0:0.2f}$", size=10, zorder=20) 299 | t_f = self.axs.text(self.t_position[0], self.t_position[1], self.t_position[2]-self.t_space*2, 300 | f"$f\\,\\left(x_0, y_0\\right) = {self.f(self.x_0, self.y_0):0.2f}$", size=10, zorder=20) 301 | t_dfdx = self.axs.text(self.t_position[0], self.t_position[1], self.t_position[2]-self.t_space*3, 302 | f"$f\\,'_x\\left(x_0, y_0\\right) = {self.dfdx(self.x_0, self.y_0):0.2f}$", size=10, zorder=20) 303 | t_dfdy = self.axs.text(self.t_position[0], self.t_position[1], self.t_position[2]-self.t_space*4, 304 | f"$f\\,'_y\\left(x_0, y_0\\right) = {self.dfdy(self.x_0, self.y_0):0.2f}$", size=10, zorder=20) 305 | self.p_items = [a, b, c, d, t_it, t_x_y, t_f, t_dfdx, t_dfdy] 306 | self.fig.canvas.draw() 307 | 308 | def run_gd(self): 309 | self.i = 1 310 | x_0_new, y_0_new = self.gd(self.dfdx, self.dfdy, self.x_0, self.y_0, self.lr, 1) 311 | 312 | while (self.i <= self.n_it and 313 | (abs(self.dfdx(x_0_new, y_0_new)) >= 0.001 or abs(self.dfdy(x_0_new, y_0_new)) >= 0.001) and 314 | x_0_new >= self.x_range[0] and x_0_new <= self.x_range[1] and 315 | y_0_new >= self.y_range[0] and y_0_new <= self.y_range[1]): 316 | x_0_new, y_0_new = self.gd(self.dfdx, self.dfdy, self.x_0, self.y_0, self.lr, 1) 
317 | self.path.add_path_item(x_0_new, y_0_new, self.f) 318 | self.x_0 = x_0_new 319 | self.y_0 = y_0_new 320 | time.sleep(0.05) 321 | self.update_plot_point() 322 | clear_output(wait=True) 323 | display(self.fig) 324 | self.i += 1 325 | 326 | if abs(self.dfdx(x_0_new, y_0_new)) >= 0.001 or abs(self.dfdy(x_0_new, y_0_new)) >= 0.001 or self.x_0 < self.x_range[0] or self.x_0 > self.x_range[1] or self.y_0 < self.y_range[0] or self.y_0 > self.y_range[1]: 327 | t_res = self.axs.text(self.t_position[0], self.t_position[1], self.t_position[2]-self.t_space*5, 328 | "Has Not Converged", size=10, zorder=20) 329 | else: 330 | t_res = self.axs.text(self.t_position[0], self.t_position[1], self.t_position[2]-self.t_space*5, 331 | "Converged", size=10, zorder=20) 332 | t_instruction = self.axs.text(*self.instr_position, "[Click on the contour plot to choose initial point]", 333 | size=10, color="r", transform=self.axs.transAxes) 334 | self.p_items.append(t_res) 335 | self.p_items.append(t_instruction) 336 | # Clear last time at the end, so there is no duplicate with the cell output. 
337 | clear_output(wait=True) 338 | 339 | 340 | class path_2: 341 | ''' tracks paths during gradient descent on contour and surface plots ''' 342 | def __init__(self, x_0, y_0, axc, axs): 343 | ''' x_0, y_0 at start of path ''' 344 | self.path_items = [] 345 | self.x_0 = x_0 346 | self.y_0 = y_0 347 | self.axc = axc 348 | self.axs = axs 349 | 350 | def re_init(self, x_0, y_0): 351 | for artist in self.path_items: 352 | artist.remove() 353 | self.path_items = [] 354 | self.x_0 = x_0 355 | self.y_0 = y_0 356 | 357 | def add_path_item(self, x_0, y_0, f): 358 | a = FancyArrowPatch( 359 | posA=(self.x_0, self.y_0), posB=(x_0, y_0), color='r', 360 | arrowstyle='simple, head_width=5, head_length=10, tail_width=1.0', 361 | ) 362 | b = self.axs.scatter3D(self.x_0, self.y_0, f(self.x_0, self.y_0), 363 | facecolors='none', edgecolors='r', ls='dotted', s=100, zorder=10) 364 | self.axc.add_artist(a) 365 | self.path_items.append(a) 366 | self.path_items.append(b) 367 | self.x_0 = x_0 368 | self.y_0 = y_0 369 | -------------------------------------------------------------------------------- /C2/w2/pq1/Readme.md: -------------------------------------------------------------------------------- 1 | # Practice Quiz - Partial Derivatives and Gradient 2 | 3 | ![](/C2/w2/pq1/ss1.png) 4 | ![](/C2/w2/pq1/ss2.png) -------------------------------------------------------------------------------- /C2/w2/pq1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w2/pq1/ss1.png -------------------------------------------------------------------------------- /C2/w2/pq1/ss2.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w2/pq1/ss2.png -------------------------------------------------------------------------------- /C2/w2/q1/Readme.md: -------------------------------------------------------------------------------- 1 | # Graded Quiz - Partial Derivatives and Gradient Descent 2 | 3 | ![](/C2/w2/q1/ss1.png) 4 | ![](/C2/w2/q1/ss2.png) -------------------------------------------------------------------------------- /C2/w2/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w2/q1/ss1.png -------------------------------------------------------------------------------- /C2/w2/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w2/q1/ss2.png -------------------------------------------------------------------------------- /C2/w3/C2w3_graded_lab/images/nn_model_2_layers.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/C2w3_graded_lab/images/nn_model_2_layers.png -------------------------------------------------------------------------------- /C2/w3/C2w3notes.pdf: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/C2w3notes.pdf -------------------------------------------------------------------------------- /C2/w3/Readme.md: -------------------------------------------------------------------------------- 1 | # Course 2 - Week 3 2 | 3 | - [Practice Quiz - Optimization in Neural Networks](/C2/w3/pq1/) 4 | - [Ungraded Lab - Regression with Perceptron](/C2/w3/lab/C2_W3_Lab_1_Regression_with_Perceptron.ipynb) 5 | - [Ungraded Lab - Classification with Perceptron](/C2/w3/lab/C2_W3_Lab_2_Classification_with_Perceptron.ipynb) 6 | - [Ungraded Lab - Optimization Using Newtons Method](/C2/w3/lab/C2_W3_Lab_3_Optimization_Using_Newtons_Method.ipynb) 7 | - [Graded Quiz - Optimization in Neural Networks and Newton's Method](/C2/w3/q1/) 8 | - [Programming Assignment - Neural Network with Two Layers](/C2/w3/C2w3_graded_lab/) 9 | - [Lecture Materials](/C2/w3/C2w3notes.pdf) 10 | -------------------------------------------------------------------------------- /C2/w3/lab/C2_W3_Lab_3_Optimization_Using_Newtons_Method.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "fa9e9f53", 6 | "metadata": {}, 7 | "source": [ 8 | "# Optimization Using Newton's Method" 9 | ] 10 | }, 11 | { 12 | "cell_type": "markdown", 13 | "id": "14154c8a", 14 | "metadata": {}, 15 | "source": [ 16 | "In this lab you will implement Newton's method to optimize functions in one and two variables. You will also compare it with gradient descent, exploring the advantages and disadvantages of each method." 
17 | ] 18 | }, 19 | { 20 | "cell_type": "markdown", 21 | "id": "8aff2e4d", 22 | "metadata": {}, 23 | "source": [ 24 | "# Table of Contents\n", 25 | "\n", 26 | "- [ 1 - Function in One Variable](#1)\n", 27 | "- [ 2 - Function in Two Variables](#2)" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "id": "5202ca39", 33 | "metadata": {}, 34 | "source": [ 35 | "## Packages\n", 36 | "\n", 37 | "Run the following cell to load the packages you'll need." 38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "execution_count": null, 43 | "id": "21767de7", 44 | "metadata": {}, 45 | "outputs": [], 46 | "source": [ 47 | "import numpy as np\n", 48 | "import matplotlib.pyplot as plt\n", 49 | "from matplotlib.gridspec import GridSpec" 50 | ] 51 | }, 52 | { 53 | "cell_type": "markdown", 54 | "id": "f12db316", 55 | "metadata": {}, 56 | "source": [ 57 | "\n", 58 | "## 1 - Function in One Variable" 59 | ] 60 | }, 61 | { 62 | "cell_type": "markdown", 63 | "id": "9d07eb61", 64 | "metadata": {}, 65 | "source": [ 66 | "You will use Newton's method to optimize a function $f\\left(x\\right)$. To find a point where the derivative equals zero, you start from some initial point $x_0$, calculate the first and second derivatives ($f'(x_0)$ and $f''(x_0)$), and step to the next point using the expression:\n", 67 | "\n", 68 | "$$x_1 = x_0 - \\frac{f'(x_0)}{f''(x_0)},\\tag{1}$$\n", 69 | "\n", 70 | "then repeat the process iteratively. The number of iterations $n$ is usually also a parameter. \n", 71 | "\n", 72 | "Let's optimize function $f\\left(x\\right)=e^x - \\log(x)$ (defined for $x>0$) using Newton's method. 
To implement it in the code, define function $f\\left(x\\right)=e^x - \\log(x)$, its first and second derivatives $f'(x)=e^x - \\frac{1}{x}$, $f''(x)=e^x + \\frac{1}{x^2}$:" 73 | ] 74 | }, 75 | { 76 | "cell_type": "code", 77 | "execution_count": null, 78 | "id": "314b980f", 79 | "metadata": {}, 80 | "outputs": [], 81 | "source": [ 82 | "def f_example_1(x):\n", 83 | " return np.exp(x) - np.log(x)\n", 84 | "\n", 85 | "def dfdx_example_1(x):\n", 86 | " return np.exp(x) - 1/x\n", 87 | "\n", 88 | "def d2fdx2_example_1(x):\n", 89 | " return np.exp(x) + 1/(x**2)\n", 90 | "\n", 91 | "x_0 = 1.6\n", 92 | "print(f\"f({x_0}) = {f_example_1(x_0)}\")\n", 93 | "print(f\"f'({x_0}) = {dfdx_example_1(x_0)}\")\n", 94 | "print(f\"f''({x_0}) = {d2fdx2_example_1(x_0)}\")" 95 | ] 96 | }, 97 | { 98 | "cell_type": "markdown", 99 | "id": "d69f6ee0", 100 | "metadata": {}, 101 | "source": [ 102 | "Plot the function to visualize the global minimum:" 103 | ] 104 | }, 105 | { 106 | "cell_type": "code", 107 | "execution_count": null, 108 | "id": "905afb0d", 109 | "metadata": {}, 110 | "outputs": [], 111 | "source": [ 112 | "def plot_f(x_range, y_range, f, ox_position):\n", 113 | " x = np.linspace(*x_range, 100)\n", 114 | " fig, ax = plt.subplots(1,1,figsize=(8,4))\n", 115 | "\n", 116 | " ax.set_ylim(*y_range)\n", 117 | " ax.set_xlim(*x_range)\n", 118 | " ax.set_ylabel('$f\\,(x)$')\n", 119 | " ax.set_xlabel('$x$')\n", 120 | " ax.spines['left'].set_position('zero')\n", 121 | " ax.spines['bottom'].set_position(('data', ox_position))\n", 122 | " ax.spines['right'].set_color('none')\n", 123 | " ax.spines['top'].set_color('none')\n", 124 | " ax.xaxis.set_ticks_position('bottom')\n", 125 | " ax.yaxis.set_ticks_position('left')\n", 126 | " ax.autoscale(enable=False)\n", 127 | " \n", 128 | " pf = ax.plot(x, f(x), 'k')\n", 129 | " \n", 130 | " return fig, ax\n", 131 | "\n", 132 | "plot_f([0.001, 2.5], [-0.3, 13], f_example_1, 0.0)" 133 | ] 134 | }, 135 | { 136 | "cell_type": "markdown", 137 | "id": 
"3f3686a8", 138 | "metadata": {}, 139 | "source": [ 140 | "Implement Newton's method described above:" 141 | ] 142 | }, 143 | { 144 | "cell_type": "code", 145 | "execution_count": null, 146 | "id": "103eb240", 147 | "metadata": {}, 148 | "outputs": [], 149 | "source": [ 150 | "def newtons_method(dfdx, d2fdx2, x, num_iterations=100):\n", 151 | " for iteration in range(num_iterations):\n", 152 | " x = x - dfdx(x) / d2fdx2(x)\n", 153 | " print(x)\n", 154 | " return x" 155 | ] 156 | }, 157 | { 158 | "cell_type": "markdown", 159 | "id": "e94e6071", 160 | "metadata": {}, 161 | "source": [ 162 | "In addition to the first and second derivatives, there are two other parameters in this implementation: number of iterations `num_iterations`, initial point `x`. To optimize the function, set up the parameters and call the defined function gradient_descent:" 163 | ] 164 | }, 165 | { 166 | "cell_type": "code", 167 | "execution_count": null, 168 | "id": "5f41dab9", 169 | "metadata": {}, 170 | "outputs": [], 171 | "source": [ 172 | "num_iterations_example_1 = 25; x_initial = 1.6\n", 173 | "newtons_example_1 = newtons_method(dfdx_example_1, d2fdx2_example_1, x_initial, num_iterations_example_1)\n", 174 | "print(\"Newton's method result: x_min =\", newtons_example_1)" 175 | ] 176 | }, 177 | { 178 | "cell_type": "markdown", 179 | "id": "47e0c03e", 180 | "metadata": {}, 181 | "source": [ 182 | "You can see that starting from the initial point $x_0 = 1.6$ Newton's method converges after $6$ iterations. You could actually exit the loop when there is no significant change of $x$ each step (or when first derivative is close to zero).\n", 183 | "\n", 184 | "What if gradient descent was used starting from the same intial point?" 
185 | ] 186 | }, 187 | { 188 | "cell_type": "code", 189 | "execution_count": null, 190 | "id": "e77a4ff0", 191 | "metadata": {}, 192 | "outputs": [], 193 | "source": [ 194 | "def gradient_descent(dfdx, x, learning_rate=0.1, num_iterations=100):\n", 195 | " for iteration in range(num_iterations):\n", 196 | " x = x - learning_rate * dfdx(x)\n", 197 | " print(x)\n", 198 | " return x\n", 199 | "\n", 200 | "num_iterations = 25; learning_rate = 0.1; x_initial = 1.6\n", 201 | "# num_iterations = 25; learning_rate = 0.2; x_initial = 1.6\n", 202 | "gd_example_1 = gradient_descent(dfdx_example_1, x_initial, learning_rate, num_iterations)\n", 203 | "print(\"Gradient descent result: x_min =\", gd_example_1) " 204 | ] 205 | }, 206 | { 207 | "cell_type": "markdown", 208 | "id": "37d76af5", 209 | "metadata": {}, 210 | "source": [ 211 | "The gradient descent method has an extra parameter, `learning_rate`. If you take it equal to `0.1` in this example, the method will start to converge after about $15$ iterations (aiming for an accuracy of 4-5 decimal places). If you increase it to $0.2$, gradient descent will converge within about $12$ iterations, which is still slower than Newton's method. \n", 212 | "\n", 213 | "So those are the disadvantages of gradient descent in comparison with Newton's method: there is an extra parameter to control, and it converges more slowly. However, it has an advantage: at each step you do not need to calculate the second derivative, which in more complicated cases is quite computationally expensive to find. So one step of gradient descent is cheaper to compute than one step of Newton's method.\n", 214 | "\n", 215 | "This is the reality of numerical optimization - convergence and the actual result depend on the initial parameters. Also, there is no \"perfect\" algorithm - every method has its advantages and disadvantages." 
216 | ] 217 | }, 218 | { 219 | "cell_type": "markdown", 220 | "id": "a08f02b9", 221 | "metadata": {}, 222 | "source": [ 223 | "\n", 224 | "## 2 - Function in Two Variables" 225 | ] 226 | }, 227 | { 228 | "cell_type": "markdown", 229 | "id": "13ae722c", 230 | "metadata": {}, 231 | "source": [ 232 | "For a function in two variables, Newton's method requires even more computation. Starting from the initial point $(x_0, y_0)$, the step to the next point should be done using the expression: \n", 233 | "\n", 234 | "$$\\begin{bmatrix}x_1 \\\\ y_1\\end{bmatrix} = \\begin{bmatrix}x_0 \\\\ y_0\\end{bmatrix} - \n", 235 | "H^{-1}\\left(x_0, y_0\\right)\\nabla f\\left(x_0, y_0\\right),\\tag{2}$$\n", 236 | "\n", 237 | "where $H^{-1}\\left(x_0, y_0\\right)$ is the inverse of the Hessian matrix at the point $(x_0, y_0)$ and $\\nabla f\\left(x_0, y_0\\right)$ is the gradient at that point.\n", 238 | "\n", 239 | "Let's implement that in the code. Define the function $f(x, y)$ as in the videos, along with its gradient and Hessian:\n", 240 | "\n", 241 | "\\begin{align}\n", 242 | "f\\left(x, y\\right) &= x^4 + 0.8 y^4 + 4x^2 + 2y^2 - xy - 0.2x^2y,\\\\\n", 243 | "\\nabla f\\left(x, y\\right) &= \\begin{bmatrix}4x^3 + 8x - y - 0.4xy \\\\ 3.2y^3 + 4y - x - 0.2x^2\\end{bmatrix}, \\\\\n", 244 | "H\\left(x, y\\right) &= \\begin{bmatrix}12x^2 + 8 - 0.4y && -1 - 0.4x \\\\ -1 - 0.4x && 9.6y^2 + 4\\end{bmatrix}.\n", 245 | "\\end{align}" 246 | ] 247 | }, 248 | { 249 | "cell_type": "code", 250 | "execution_count": null, 251 | "id": "a5dd8cc2", 252 | "metadata": {}, 253 | "outputs": [], 254 | "source": [ 255 | "def f_example_2(x, y):\n", 256 | " return x**4 + 0.8*y**4 + 4*x**2 + 2*y**2 - x*y - 0.2*x**2*y\n", 257 | "\n", 258 | "def grad_f_example_2(x, y):\n", 259 | " return np.array([[4*x**3 + 8*x - y - 0.4*x*y],\n", 260 | " [3.2*y**3 + 4*y - x - 0.2*x**2]])\n", 261 | "\n", 262 | "def hessian_f_example_2(x, y):\n", 263 | " hessian_f = np.array([[12*x**2 + 8 - 0.4*y, -1 - 0.4*x],\n", 264 | " [-1 - 0.4*x, 
9.6*y**2 + 4]])\n", 265 | " return hessian_f\n", 266 | "\n", 267 | "x_0, y_0 = 4, 4\n", 268 | "print(f\"f{x_0, y_0} = {f_example_2(x_0, y_0)}\")\n", 269 | "print(f\"grad f{x_0, y_0} = \\n{grad_f_example_2(x_0, y_0)}\")\n", 270 | "print(f\"H{x_0, y_0} = \\n{hessian_f_example_2(x_0, y_0)}\")" 271 | ] 272 | }, 273 | { 274 | "cell_type": "markdown", 275 | "id": "53b87b92", 276 | "metadata": {}, 277 | "source": [ 278 | "Run the following cell to plot the function:" 279 | ] 280 | }, 281 | { 282 | "cell_type": "code", 283 | "execution_count": null, 284 | "id": "1d243fba", 285 | "metadata": {}, 286 | "outputs": [], 287 | "source": [ 288 | "def plot_f_cont_and_surf(f):\n", 289 | " \n", 290 | " fig = plt.figure( figsize=(10,5))\n", 291 | " fig.canvas.toolbar_visible = False\n", 292 | " fig.canvas.header_visible = False\n", 293 | " fig.canvas.footer_visible = False\n", 294 | " fig.set_facecolor('#ffffff')\n", 295 | " gs = GridSpec(1, 2, figure=fig)\n", 296 | " axc = fig.add_subplot(gs[0, 0])\n", 297 | " axs = fig.add_subplot(gs[0, 1], projection='3d')\n", 298 | " \n", 299 | " x_range = [-4, 5]\n", 300 | " y_range = [-4, 5]\n", 301 | " z_range = [0, 1200]\n", 302 | " x = np.linspace(*x_range, 100)\n", 303 | " y = np.linspace(*y_range, 100)\n", 304 | " X,Y = np.meshgrid(x,y)\n", 305 | " \n", 306 | " cont = axc.contour(X, Y, f(X, Y), cmap='terrain', levels=18, linewidths=2, alpha=0.7)\n", 307 | " axc.set_xlabel('$x$')\n", 308 | " axc.set_ylabel('$y$')\n", 309 | " axc.set_xlim(*x_range)\n", 310 | " axc.set_ylim(*y_range)\n", 311 | " axc.set_aspect(\"equal\")\n", 312 | " axc.autoscale(enable=False)\n", 313 | " \n", 314 | " surf = axs.plot_surface(X,Y, f(X,Y), cmap='terrain', \n", 315 | " antialiased=True,cstride=1,rstride=1, alpha=0.69)\n", 316 | " axs.set_xlabel('$x$')\n", 317 | " axs.set_ylabel('$y$')\n", 318 | " axs.set_zlabel('$f$')\n", 319 | " axs.set_xlim(*x_range)\n", 320 | " axs.set_ylim(*y_range)\n", 321 | " axs.set_zlim(*z_range)\n", 322 | " axs.view_init(elev=20, 
azim=-100)\n", 323 | " axs.autoscale(enable=False)\n", 324 | " \n", 325 | " return fig, axc, axs\n", 326 | "\n", 327 | "plot_f_cont_and_surf(f_example_2)" 328 | ] 329 | }, 330 | { 331 | "cell_type": "markdown", 332 | "id": "88731f3c", 333 | "metadata": {}, 334 | "source": [ 335 | "Newton's method $(2)$ is implemented in the following function:" 336 | ] 337 | }, 338 | { 339 | "cell_type": "code", 340 | "execution_count": null, 341 | "id": "f2f020fa", 342 | "metadata": {}, 343 | "outputs": [], 344 | "source": [ 345 | "def newtons_method_2(f, grad_f, hessian_f, x_y, num_iterations=100):\n", 346 | " for iteration in range(num_iterations):\n", 347 | " x_y = x_y - np.matmul(np.linalg.inv(hessian_f(x_y[0,0], x_y[1,0])), grad_f(x_y[0,0], x_y[1,0]))\n", 348 | " print(x_y.T)\n", 349 | " return x_y" 350 | ] 351 | }, 352 | { 353 | "cell_type": "markdown", 354 | "id": "775700eb", 355 | "metadata": {}, 356 | "source": [ 357 | "Now run the following code to find the minimum:" 358 | ] 359 | }, 360 | { 361 | "cell_type": "code", 362 | "execution_count": null, 363 | "id": "4d3066a3", 364 | "metadata": {}, 365 | "outputs": [], 366 | "source": [ 367 | "num_iterations_example_2 = 25; x_y_initial = np.array([[4], [4]])\n", 368 | "newtons_example_2 = newtons_method_2(f_example_2, grad_f_example_2, hessian_f_example_2, \n", 369 | " x_y_initial, num_iterations=num_iterations_example_2)\n", 370 | "print(\"Newton's method result: x_min, y_min =\", newtons_example_2.T)" 371 | ] 372 | }, 373 | { 374 | "cell_type": "markdown", 375 | "id": "66c1cb88", 376 | "metadata": {}, 377 | "source": [ 378 | "In this example starting from the initial point $(4, 4)$ it will converge after about $9$ iterations. 
Try to compare it with gradient descent now:" 379 | ] 380 | }, 381 | { 382 | "cell_type": "code", 383 | "execution_count": null, 384 | "id": "61f82d17", 385 | "metadata": {}, 386 | "outputs": [], 387 | "source": [ 388 | "def gradient_descent_2(grad_f, x_y, learning_rate=0.1, num_iterations=100):\n", 389 | " for iteration in range(num_iterations):\n", 390 | " x_y = x_y - learning_rate * grad_f(x_y[0,0], x_y[1,0])\n", 391 | " print(x_y.T)\n", 392 | " return x_y\n", 393 | "\n", 394 | "num_iterations_2 = 300; learning_rate_2 = 0.02; x_y_initial = np.array([[4], [4]])\n", 395 | "# num_iterations_2 = 300; learning_rate_2 = 0.03; x_y_initial = np.array([[4], [4]])\n", 396 | "gd_example_2 = gradient_descent_2(grad_f_example_2, x_y_initial, learning_rate_2, num_iterations_2)\n", 397 | "print(\"Gradient descent result: x_min, y_min =\", gd_example_2) " 398 | ] 399 | }, 400 | { 401 | "cell_type": "markdown", 402 | "id": "9fa2602d", 403 | "metadata": {}, 404 | "source": [ 405 | "Obviously, gradient descent converges much more slowly than Newton's method, and if you try to increase the learning rate it might not converge at all. This again illustrates the disadvantages of gradient descent in comparison with Newton's method. Note, however, that Newton's method requires calculating the inverse of the Hessian matrix, which is very computationally expensive when you have, say, a thousand parameters.\n", 406 | "\n", 407 | "Well done on finishing this lab!" 
408 | ] 409 | }, 410 | { 411 | "cell_type": "code", 412 | "execution_count": null, 413 | "id": "6c418025", 414 | "metadata": {}, 415 | "outputs": [], 416 | "source": [] 417 | } 418 | ], 419 | "metadata": { 420 | "kernelspec": { 421 | "display_name": "Python 3 (ipykernel)", 422 | "language": "python", 423 | "name": "python3" 424 | }, 425 | "language_info": { 426 | "codemirror_mode": { 427 | "name": "ipython", 428 | "version": 3 429 | }, 430 | "file_extension": ".py", 431 | "mimetype": "text/x-python", 432 | "name": "python", 433 | "nbconvert_exporter": "python", 434 | "pygments_lexer": "ipython3", 435 | "version": "3.10.6" 436 | } 437 | }, 438 | "nbformat": 4, 439 | "nbformat_minor": 5 440 | } 441 | -------------------------------------------------------------------------------- /C2/w3/lab/data/tvmarketing.csv: -------------------------------------------------------------------------------- 1 | TV,Sales 2 | 230.1,22.1 3 | 44.5,10.4 4 | 17.2,9.3 5 | 151.5,18.5 6 | 180.8,12.9 7 | 8.7,7.2 8 | 57.5,11.8 9 | 120.2,13.2 10 | 8.6,4.8 11 | 199.8,10.6 12 | 66.1,8.6 13 | 214.7,17.4 14 | 23.8,9.2 15 | 97.5,9.7 16 | 204.1,19 17 | 195.4,22.4 18 | 67.8,12.5 19 | 281.4,24.4 20 | 69.2,11.3 21 | 147.3,14.6 22 | 218.4,18 23 | 237.4,12.5 24 | 13.2,5.6 25 | 228.3,15.5 26 | 62.3,9.7 27 | 262.9,12 28 | 142.9,15 29 | 240.1,15.9 30 | 248.8,18.9 31 | 70.6,10.5 32 | 292.9,21.4 33 | 112.9,11.9 34 | 97.2,9.6 35 | 265.6,17.4 36 | 95.7,9.5 37 | 290.7,12.8 38 | 266.9,25.4 39 | 74.7,14.7 40 | 43.1,10.1 41 | 228,21.5 42 | 202.5,16.6 43 | 177,17.1 44 | 293.6,20.7 45 | 206.9,12.9 46 | 25.1,8.5 47 | 175.1,14.9 48 | 89.7,10.6 49 | 239.9,23.2 50 | 227.2,14.8 51 | 66.9,9.7 52 | 199.8,11.4 53 | 100.4,10.7 54 | 216.4,22.6 55 | 182.6,21.2 56 | 262.7,20.2 57 | 198.9,23.7 58 | 7.3,5.5 59 | 136.2,13.2 60 | 210.8,23.8 61 | 210.7,18.4 62 | 53.5,8.1 63 | 261.3,24.2 64 | 239.3,15.7 65 | 102.7,14 66 | 131.1,18 67 | 69,9.3 68 | 31.5,9.5 69 | 139.3,13.4 70 | 237.4,18.9 71 | 216.8,22.3 72 | 199.1,18.3 73 | 109.8,12.4 
74 | 26.8,8.8 75 | 129.4,11 76 | 213.4,17 77 | 16.9,8.7 78 | 27.5,6.9 79 | 120.5,14.2 80 | 5.4,5.3 81 | 116,11 82 | 76.4,11.8 83 | 239.8,12.3 84 | 75.3,11.3 85 | 68.4,13.6 86 | 213.5,21.7 87 | 193.2,15.2 88 | 76.3,12 89 | 110.7,16 90 | 88.3,12.9 91 | 109.8,16.7 92 | 134.3,11.2 93 | 28.6,7.3 94 | 217.7,19.4 95 | 250.9,22.2 96 | 107.4,11.5 97 | 163.3,16.9 98 | 197.6,11.7 99 | 184.9,15.5 100 | 289.7,25.4 101 | 135.2,17.2 102 | 222.4,11.7 103 | 296.4,23.8 104 | 280.2,14.8 105 | 187.9,14.7 106 | 238.2,20.7 107 | 137.9,19.2 108 | 25,7.2 109 | 90.4,8.7 110 | 13.1,5.3 111 | 255.4,19.8 112 | 225.8,13.4 113 | 241.7,21.8 114 | 175.7,14.1 115 | 209.6,15.9 116 | 78.2,14.6 117 | 75.1,12.6 118 | 139.2,12.2 119 | 76.4,9.4 120 | 125.7,15.9 121 | 19.4,6.6 122 | 141.3,15.5 123 | 18.8,7 124 | 224,11.6 125 | 123.1,15.2 126 | 229.5,19.7 127 | 87.2,10.6 128 | 7.8,6.6 129 | 80.2,8.8 130 | 220.3,24.7 131 | 59.6,9.7 132 | 0.7,1.6 133 | 265.2,12.7 134 | 8.4,5.7 135 | 219.8,19.6 136 | 36.9,10.8 137 | 48.3,11.6 138 | 25.6,9.5 139 | 273.7,20.8 140 | 43,9.6 141 | 184.9,20.7 142 | 73.4,10.9 143 | 193.7,19.2 144 | 220.5,20.1 145 | 104.6,10.4 146 | 96.2,11.4 147 | 140.3,10.3 148 | 240.1,13.2 149 | 243.2,25.4 150 | 38,10.9 151 | 44.7,10.1 152 | 280.7,16.1 153 | 121,11.6 154 | 197.6,16.6 155 | 171.3,19 156 | 187.8,15.6 157 | 4.1,3.2 158 | 93.9,15.3 159 | 149.8,10.1 160 | 11.7,7.3 161 | 131.7,12.9 162 | 172.5,14.4 163 | 85.7,13.3 164 | 188.4,14.9 165 | 163.5,18 166 | 117.2,11.9 167 | 234.5,11.9 168 | 17.9,8 169 | 206.8,12.2 170 | 215.4,17.1 171 | 284.3,15 172 | 50,8.4 173 | 164.5,14.5 174 | 19.6,7.6 175 | 168.4,11.7 176 | 222.4,11.5 177 | 276.9,27 178 | 248.4,20.2 179 | 170.2,11.7 180 | 276.7,11.8 181 | 165.6,12.6 182 | 156.6,10.5 183 | 218.5,12.2 184 | 56.2,8.7 185 | 287.6,26.2 186 | 253.8,17.6 187 | 205,22.6 188 | 139.5,10.3 189 | 191.1,17.3 190 | 286,15.9 191 | 18.7,6.7 192 | 39.5,10.8 193 | 75.5,9.9 194 | 17.2,5.9 195 | 166.8,19.6 196 | 149.7,17.3 197 | 38.2,7.6 198 | 94.2,9.7 199 | 177,12.8 200 | 
283.6,25.5 201 | 232.1,13.4 202 | -------------------------------------------------------------------------------- /C2/w3/lab/images/nn_model_classification_1_layer.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/lab/images/nn_model_classification_1_layer.png -------------------------------------------------------------------------------- /C2/w3/lab/images/nn_model_linear_regression_multiple.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/lab/images/nn_model_linear_regression_multiple.png -------------------------------------------------------------------------------- /C2/w3/lab/images/nn_model_linear_regression_simple.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/lab/images/nn_model_linear_regression_simple.png -------------------------------------------------------------------------------- /C2/w3/pq1/Readme.md: -------------------------------------------------------------------------------- 1 | # Practice Quiz - Optimization in Neural Networks 2 | 3 | ![](/C2/w3/pq1/ss1.png) 4 | ![](/C2/w3/pq1/ss2.png) 5 | ![](/C2/w3/pq1/ss3.png) -------------------------------------------------------------------------------- /C2/w3/pq1/ss1.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/pq1/ss1.png -------------------------------------------------------------------------------- /C2/w3/pq1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/pq1/ss2.png -------------------------------------------------------------------------------- /C2/w3/pq1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/pq1/ss3.png -------------------------------------------------------------------------------- /C2/w3/q1/Readme.md: -------------------------------------------------------------------------------- 1 | # Graded Quiz - Optimization in Neural Networks and Newton's Method 2 | 3 | ![](/C2/w3/q1/ss1.png) 4 | ![](/C2/w3/q1/ss2.png) 5 | ![](/C2/w3/q1/ss3.png) 6 | ![](/C2/w3/q1/ss4.png) -------------------------------------------------------------------------------- /C2/w3/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/q1/ss1.png -------------------------------------------------------------------------------- /C2/w3/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/q1/ss2.png 
-------------------------------------------------------------------------------- /C2/w3/q1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/q1/ss3.png -------------------------------------------------------------------------------- /C2/w3/q1/ss4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C2/w3/q1/ss4.png -------------------------------------------------------------------------------- /C3/w1/C3w1_graded_lab/__pycache__/utils.cpython-310.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/C3w1_graded_lab/__pycache__/utils.cpython-310.pyc -------------------------------------------------------------------------------- /C3/w1/C3w1_graded_lab/assets/binomial2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/C3w1_graded_lab/assets/binomial2.png -------------------------------------------------------------------------------- /C3/w1/C3w1_graded_lab/assets/gaussian.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/C3w1_graded_lab/assets/gaussian.png 
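The graded-lab helper file that follows (`C3/w1/C3w1_graded_lab/utils.py`) estimates distribution parameters directly from samples: mean and standard deviation for a Gaussian, the success probability `p` for a binomial with a fixed trial count, and the endpoints for a uniform. A minimal, self-contained sketch of the same estimates, assuming NumPy and an illustrative seeded sample (note that `np.std` defaults to the biased `ddof=0` estimator, which is what the lab uses):

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian: estimate mu and sigma by the sample mean and (biased) sample std.
gaussian_sample = rng.normal(loc=10.0, scale=2.0, size=100_000)
mu, sigma = gaussian_sample.mean(), gaussian_sample.std()

# Binomial with known n: estimate p from the mean fraction of successes.
n = 30
binomial_sample = rng.binomial(n=n, p=0.4, size=100_000)
p = (binomial_sample / n).mean()

# Uniform: estimate the endpoints a and b by the sample min and max.
uniform_sample = rng.uniform(low=2.0, high=5.0, size=100_000)
a, b = uniform_sample.min(), uniform_sample.max()
```

In the assignment itself the binomial trial count is fixed at `n = 30`, so only `p` is estimated from the data; the specific distribution parameters and sample size above are illustrative.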
-------------------------------------------------------------------------------- /C3/w1/C3w1_graded_lab/utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import matplotlib.pyplot as plt 3 | 4 | 5 | def estimate_gaussian_params(sample): 6 | ### START CODE HERE ### 7 | mu = np.mean(sample) 8 | sigma = np.std(sample) 9 | ### END CODE HERE ### 10 | 11 | return mu, sigma 12 | 13 | 14 | def estimate_binomial_params(sample): 15 | ### START CODE HERE ### 16 | n = 30 17 | p = (sample / n).mean() 18 | ### END CODE HERE ### 19 | 20 | return n, p 21 | 22 | 23 | def estimate_uniform_params(sample): 24 | ### START CODE HERE ### 25 | a = sample.min() 26 | b = sample.max() 27 | ### END CODE HERE ### 28 | 29 | return a, b 30 | 31 | 32 | def plot_gaussian_distributions(gaussian_0, gaussian_1, gaussian_2): 33 | fig, ax = plt.subplots(1, 1, figsize=(10, 4)) 34 | ax.hist(gaussian_0, alpha=0.5, label="gaussian_0", bins=32) 35 | ax.hist(gaussian_1, alpha=0.5, label="gaussian_1", bins=32) 36 | ax.hist(gaussian_2, alpha=0.5, label="gaussian_2", bins=32) 37 | ax.set_title("Histograms of Gaussian distributions") 38 | ax.set_xlabel("Values") 39 | ax.set_ylabel("Frequencies") 40 | ax.legend() 41 | plt.show() 42 | 43 | 44 | def plot_binomial_distributions(binomial_0, binomial_1, binomial_2): 45 | fig, ax = plt.subplots(1, 1, figsize=(10, 4)) 46 | ax.hist(binomial_0, alpha=0.5, label="binomial_0") 47 | ax.hist(binomial_1, alpha=0.5, label="binomial_1") 48 | ax.hist(binomial_2, alpha=0.5, label="binomial_2") 49 | ax.set_title("Histograms of Binomial distributions") 50 | ax.set_xlabel("Values") 51 | ax.set_ylabel("Frequencies") 52 | ax.legend() 53 | plt.show() -------------------------------------------------------------------------------- /C3/w1/Readme.md: -------------------------------------------------------------------------------- 1 | # Course 3 - Week 1 2 | 3 | - [Ungraded Lab - Birthday 
Problems](/C3/w1/lab/C3_W1_Lab_2_Birthday_Problems.ipynb) 4 | - [Practice Quiz](/C3/w1/pq1/) 5 | - [Graded Quiz - Summative quiz](/C3/w1/q1/) 6 | - [Programming Assignment: Probability Distributions / Naive Bayes](/C3/w1/C3w1_graded_lab/) 7 | -------------------------------------------------------------------------------- /C3/w1/lab/images/first.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/lab/images/first.png -------------------------------------------------------------------------------- /C3/w1/lab/images/fourth.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/lab/images/fourth.png -------------------------------------------------------------------------------- /C3/w1/lab/images/second.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/lab/images/second.png -------------------------------------------------------------------------------- /C3/w1/lab/images/third.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/lab/images/third.png -------------------------------------------------------------------------------- /C3/w1/lab/utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | from datetime 
import timedelta, date 3 | import matplotlib.pyplot as plt 4 | import seaborn as sns 5 | import ipywidgets as widgets 6 | from ipywidgets import interact_manual 7 | 8 | 9 | 10 | class your_bday: 11 | def __init__(self) -> None: 12 | fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4)) 13 | 14 | self.fig = fig 15 | self.ax = ax1 16 | self.ax_hist = ax2 17 | self.dates = [ 18 | (date(2015, 1, 1) + timedelta(days=n)).strftime("%m-%d") for n in range(365) 19 | ] 20 | self.match = False 21 | self.bday_str = None 22 | self.bday_index = None 23 | self.n_students = 0 24 | self.history = [] 25 | self.bday_picker = widgets.DatePicker(description="Pick your bday", disabled=False, style={'description_width': 'initial'}) 26 | self.start_button = widgets.Button(description="Simulate!") 27 | 28 | display(self.bday_picker) 29 | display(self.start_button) 30 | 31 | self.start_button.on_click(self.on_button_clicked) 32 | 33 | def on_button_clicked(self, b): 34 | self.match = False 35 | self.n_students = 0 36 | 37 | self.get_bday() 38 | self.add_students() 39 | 40 | def get_bday(self): 41 | try: 42 | self.bday_str = self.bday_picker.value.strftime("%m-%d") 43 | except AttributeError: 44 | self.ax.set_title(f"Input a valid date and try again!") 45 | return 46 | self.bday_index = self.dates.index(self.bday_str) 47 | 48 | 49 | def generate_bday(self): 50 | # gen_bdays = np.random.randint(0, 365, (n_people)) 51 | gen_bday = np.random.randint(0, 365) 52 | # if not np.isnan(self.y[gen_bday]): 53 | if gen_bday == self.bday_index: 54 | self.match = True 55 | 56 | def add_students(self): 57 | 58 | if not self.bday_str: 59 | return 60 | 61 | while True: 62 | if self.match: 63 | self.history.append(self.n_students) 64 | # print(f"Match found. 
It took {self.n_students} students to get a match") 65 | n_runs = [i for i in range(len(self.history))] 66 | self.ax.scatter(n_runs, self.history) 67 | # counts, bins = np.histogram(self.history) 68 | # plt.stairs(counts, bins) 69 | # self.ax_hist.hist(bins[:-1], bins, weights=counts) 70 | self.ax_hist.clear() 71 | sns.histplot(data=self.history, ax=self.ax_hist, bins=16) 72 | # plt.show() 73 | break 74 | 75 | self.generate_bday() 76 | self.n_students += 1 77 | self.ax.set_title(f"Match found. It took {self.n_students} students.\nNumber of runs: {len(self.history)+1}") 78 | # self.fig.canvas.draw() 79 | # self.fig.canvas.flush_events() 80 | 81 | 82 | big_classroom_sizes = [*range(1,1000, 5)] 83 | small_classroom_sizes = [*range(1, 80)] 84 | 85 | def plot_simulated_probs(sim_probs, class_size): 86 | fig, ax = plt.subplots(1, 1, figsize=(10, 4)) 87 | # ax.scatter(class_size, sim_probs) 88 | sns.scatterplot(x=class_size, y=sim_probs, ax=ax, label="simulated probabilities") 89 | ax.set_ylabel("Simulated Probability") 90 | ax.set_xlabel("Classroom Size") 91 | ax.set_title("Probability vs Number of Students") 92 | ax.plot([0, max(class_size)], [0.5, 0.5], color='red', label="p = 0.5") 93 | ax.legend() 94 | plt.show() 95 | 96 | 97 | 98 | class third_bday_problem: 99 | def __init__(self) -> None: 100 | fig, axes = plt.subplot_mosaic( 101 | [["top row", "top row"], ["bottom left", "bottom right"]], figsize=(10, 8) 102 | ) 103 | self.fig = fig 104 | self.ax = axes["top row"] 105 | self.count_ax = axes["bottom left"] 106 | self.ax_hist = axes["bottom right"] 107 | self.ax.spines["top"].set_color("none") 108 | self.ax.spines["right"].set_color("none") 109 | self.ax.spines["left"].set_color("none") 110 | self.ax.get_yaxis().set_visible(False) 111 | x = np.arange(365) 112 | y = np.zeros((365,)) 113 | y[y == 0] = np.nan 114 | 115 | y_match = np.zeros((365,)) 116 | y_match[y_match == 0] = np.nan 117 | 118 | self.x = x 119 | self.y = y 120 | self.y_match = y_match 121 | self.match 
= False 122 | self.n_students = 0 123 | 124 | self.dates = [ 125 | (date(2015, 1, 1) + timedelta(days=n)).strftime("%m-%d") for n in range(365) 126 | ] 127 | self.month_names = [ 128 | "January", 129 | "February", 130 | "March", 131 | "April", 132 | "May", 133 | "June", 134 | "July", 135 | "August", 136 | "September", 137 | "October", 138 | "November", 139 | "December", 140 | ] 141 | 142 | self.history = [] 143 | self.match_index = None 144 | self.match_str = None 145 | 146 | self.cpoint = self.fig.canvas.mpl_connect("button_press_event", self.on_button_clicked) 147 | 148 | # self.start_button = widgets.Button(description="Simulate!") 149 | 150 | # display(self.start_button) 151 | 152 | # self.start_button.on_click(self.on_button_clicked) 153 | 154 | def generate_bday(self): 155 | gen_bday = np.random.randint(0, 365) 156 | 157 | if not np.isnan(self.y[gen_bday]): 158 | self.match_index = gen_bday 159 | self.match_str = self.dates[gen_bday] 160 | self.y_match[gen_bday] = 1 161 | self.match = True 162 | 163 | self.y[gen_bday] = 0.5 164 | 165 | def on_button_clicked(self, event): 166 | if event.inaxes in [self.ax]: 167 | self.new_run() 168 | self.add_students() 169 | 170 | def add_students(self): 171 | 172 | while True: 173 | if self.match: 174 | self.history.append(self.n_students) 175 | n_runs = [i for i in range(len(self.history))] 176 | self.count_ax.scatter(n_runs, self.history) 177 | self.count_ax.set_ylabel("# of students") 178 | self.count_ax.set_xlabel("# of simulations") 179 | 180 | month_str = self.month_names[int(self.match_str.split("-")[0]) - 1] 181 | day_value = self.match_str.split("-")[1] 182 | self.ax.set_title( 183 | f"Match found for {month_str} {day_value}\nIt took {self.n_students} students to get a match" 184 | ) 185 | self.ax_hist.clear() 186 | sns.histplot(data=self.history, ax=self.ax_hist, bins="auto") 187 | break 188 | 189 | self.generate_bday() 190 | self.n_students += 1 191 | self.ax.set_title(f"Number of students: {self.n_students}") 192 
| 193 | self.fig.canvas.draw() 194 | self.fig.canvas.flush_events() 195 | 196 | if not np.isnan(self.y_match).all(): 197 | markerline, stemlines, baseline = self.ax.stem( 198 | self.x, self.y_match, markerfmt="*" 199 | ) 200 | plt.setp(markerline, color="green") 201 | plt.setp(stemlines, "color", plt.getp(markerline, "color")) 202 | plt.setp(stemlines, "linestyle", "dotted") 203 | self.ax.stem(self.x, self.y, markerfmt="o") 204 | 205 | def new_run(self): 206 | y = np.zeros((365,)) 207 | y[y == 0] = np.nan 208 | y_match = np.zeros((365,)) 209 | y_match[y_match == 0] = np.nan 210 | self.y_match = y_match 211 | self.y = y 212 | self.n_students = 0 213 | self.match = False 214 | self.ax.clear() 215 | -------------------------------------------------------------------------------- /C3/w1/pq1/Readme.md: -------------------------------------------------------------------------------- 1 | # Practice Quiz 2 | 3 | ![](/C3/w1/pq1/ss1.png) 4 | ![](/C3/w1/pq1/ss2.png) 5 | ![](/C3/w1/pq1/ss3.png) -------------------------------------------------------------------------------- /C3/w1/pq1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/pq1/ss1.png -------------------------------------------------------------------------------- /C3/w1/pq1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/pq1/ss2.png -------------------------------------------------------------------------------- /C3/w1/pq1/ss3.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/pq1/ss3.png -------------------------------------------------------------------------------- /C3/w1/q1/Readme.md: -------------------------------------------------------------------------------- 1 | # Graded Quiz - Summative quiz 2 | 3 | ![](/C3/w1/q1/ss1.png) 4 | ![](/C3/w1/q1/ss2.png) 5 | ![](/C3/w1/q1/ss3.png) 6 | ![](/C3/w1/q1/ss4.png) 7 | ![](/C3/w1/q1/ss5.png) 8 | ![](/C3/w1/q1/ss6.png) 9 | ![](/C3/w1/q1/ss7.png) -------------------------------------------------------------------------------- /C3/w1/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/q1/ss1.png -------------------------------------------------------------------------------- /C3/w1/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/q1/ss2.png -------------------------------------------------------------------------------- /C3/w1/q1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/q1/ss3.png -------------------------------------------------------------------------------- /C3/w1/q1/ss4.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/q1/ss4.png -------------------------------------------------------------------------------- /C3/w1/q1/ss5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/q1/ss5.png -------------------------------------------------------------------------------- /C3/w1/q1/ss6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/q1/ss6.png -------------------------------------------------------------------------------- /C3/w1/q1/ss7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w1/q1/ss7.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/__pycache__/utils.cpython-310.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/__pycache__/utils.cpython-310.pyc -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/answers.json: -------------------------------------------------------------------------------- 1 | {"ex1": {"mean": 3.5, "var": 2.91666}, "ex2": {"hist": "left"}, "ex3": {"sum_2_8": 0.0, 
"sum_3_7": 0.0, "sum_4_6": 0.0, "sum_5": 0.125}, "ex4": {"mean": 5.0, "var": 2.5, "cov": 0.0}, "ex5": {"hist": "right"}, "ex6": {"max_sum": 6}, "ex7": {"hist": "left-center"}, "ex8": {"hist": "right-most"}, "ex9": {"mean": "increases", "var": "increases", "cov": "stays the same"}, "ex10": {"options": "changing the loaded side from 1 to 6 will yield a higher mean but the same variance"}, "ex11": {"options": "changing the direction of the inequality will change the sign of the covariance"}, "ex12": {"options": "yes, but only if the die is fair"}} -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/4_side_hists.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/4_side_hists.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/4_side_uf.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/4_side_uf.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/4_sided_hist_no_prob.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/4_sided_hist_no_prob.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/6_sided_cond_blue.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/6_sided_cond_blue.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/6_sided_cond_blue2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/6_sided_cond_blue2.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/6_sided_cond_brown.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/6_sided_cond_brown.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/6_sided_cond_brown2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/6_sided_cond_brown2.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/6_sided_cond_green.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/6_sided_cond_green.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/6_sided_cond_green2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/6_sided_cond_green2.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/6_sided_cond_red.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/6_sided_cond_red.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/6_sided_cond_red2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/6_sided_cond_red2.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/fair_dice.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/fair_dice.png -------------------------------------------------------------------------------- 
/C3/w2/C3w2_graded_lab/images/hist_sum_4_3l.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/hist_sum_4_3l.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/hist_sum_4_4l.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/hist_sum_4_4l.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/hist_sum_4_uf.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/hist_sum_4_uf.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/hist_sum_5_side.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/hist_sum_5_side.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/hist_sum_6_side.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/hist_sum_6_side.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/hist_sum_6_uf.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/hist_sum_6_uf.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/loaded_6_cdf.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/loaded_6_cdf.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/images/loaded_6_side.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/C3w2_graded_lab/images/loaded_6_side.png -------------------------------------------------------------------------------- /C3/w2/C3w2_graded_lab/utils.py: -------------------------------------------------------------------------------- 1 | import json 2 | from IPython.display import display 3 | import ipywidgets as widgets 4 | from ipywidgets import interact 5 | 6 | 7 | 8 | 9 | def reset_answers(): 10 | with open("answers.json", "w") as f: 11 | json.dump({}, f) 12 | 13 | 14 | 15 | def exercise_1(): 16 | mean = widgets.FloatText( 17 | # value='', 18 | 
placeholder=0.0, 19 | description='Mean:', 20 | disabled=False 21 | ) 22 | 23 | var = widgets.FloatText( 24 | # value='', 25 | placeholder=0.0, 26 | description='Variance:', 27 | disabled=False 28 | ) 29 | 30 | 31 | button = widgets.Button(description="Save your answer!", button_style="success") 32 | output = widgets.Output() 33 | 34 | display(mean) 35 | display(var) 36 | # display(cov) 37 | 38 | display(button, output) 39 | 40 | def on_button_clicked(b): 41 | 42 | with open("answers.json", "r") as f: 43 | source_dict = json.load(f) 44 | 45 | answer_dict = { 46 | "ex1": { 47 | "mean": mean.value, 48 | "var": var.value, 49 | } 50 | } 51 | 52 | source_dict.update(answer_dict) 53 | 54 | with open("answers.json", "w") as f: 55 | json.dump(source_dict, f) 56 | 57 | with output: 58 | print("Answer for exercise 1 saved.") 59 | 60 | 61 | button.on_click(on_button_clicked) 62 | 63 | 64 | def exercise_2(): 65 | hist = widgets.ToggleButtons( 66 | options=['left', 'center', 'right'], 67 | description='Your answer:', 68 | disabled=False, 69 | button_style='', # 'success', 'info', 'warning', 'danger' or '' 70 | ) 71 | 72 | button = widgets.Button(description="Save your answer!", button_style="success") 73 | output = widgets.Output() 74 | 75 | display(hist) 76 | 77 | display(button, output) 78 | 79 | def on_button_clicked(b): 80 | 81 | with open("answers.json", "r") as f: 82 | source_dict = json.load(f) 83 | 84 | answer_dict = { 85 | "ex2": { 86 | "hist": hist.value 87 | } 88 | } 89 | 90 | source_dict.update(answer_dict) 91 | 92 | with open("answers.json", "w") as f: 93 | json.dump(source_dict, f) 94 | 95 | with output: 96 | print("Answer for exercise 2 saved.") 97 | 98 | 99 | button.on_click(on_button_clicked) 100 | 101 | 102 | def exercise_3(): 103 | sum_2_8 = widgets.FloatText( 104 | # value='', 105 | placeholder=0.0, 106 | description='P for sum=2|8', 107 | style = {'description_width': 'initial'}, 108 | disabled=False 109 | ) 110 | 111 | sum_3_7 = widgets.FloatText( 112 | # 
value='', 113 | placeholder=0.0, 114 | description='P for sum=3|7:', 115 | style = {'description_width': 'initial'}, 116 | disabled=False 117 | ) 118 | 119 | sum_4_6 = widgets.FloatText( 120 | # value='', 121 | placeholder=0.0, 122 | description='P for sum=4|6:', 123 | style = {'description_width': 'initial'}, 124 | disabled=False 125 | ) 126 | 127 | sum_5 = widgets.FloatText( 128 | # value='', 129 | placeholder=0.0, 130 | description='P for sum=5:', 131 | style = {'description_width': 'initial'}, 132 | disabled=False 133 | ) 134 | 135 | button = widgets.Button(description="Save your answer!", button_style="success") 136 | output = widgets.Output() 137 | 138 | display(sum_2_8) 139 | display(sum_3_7) 140 | display(sum_4_6) 141 | display(sum_5) 142 | 143 | display(button, output) 144 | 145 | def on_button_clicked(b): 146 | 147 | with open("answers.json", "r") as f: 148 | source_dict = json.load(f) 149 | 150 | answer_dict = { 151 | "ex3": { 152 | "sum_2_8": sum_2_8.value, 153 | "sum_3_7": sum_3_7.value, 154 | "sum_4_6": sum_4_6.value, 155 | "sum_5": sum_5.value 156 | } 157 | } 158 | 159 | source_dict.update(answer_dict) 160 | 161 | with open("answers.json", "w") as f: 162 | json.dump(source_dict, f) 163 | 164 | with output: 165 | print("Answer for exercise 3 saved.") 166 | 167 | 168 | button.on_click(on_button_clicked) 169 | 170 | def exercise_4(): 171 | mean = widgets.FloatText( 172 | # value='', 173 | placeholder=0.0, 174 | description='Mean:', 175 | disabled=False 176 | ) 177 | 178 | var = widgets.FloatText( 179 | # value='', 180 | placeholder=0.0, 181 | description='Variance:', 182 | disabled=False 183 | ) 184 | 185 | cov = widgets.FloatText( 186 | # value='', 187 | placeholder=0.0, 188 | description='Covariance:', 189 | disabled=False 190 | ) 191 | 192 | button = widgets.Button(description="Save your answer!", button_style="success") 193 | output = widgets.Output() 194 | 195 | display(mean) 196 | display(var) 197 | display(cov) 198 | 199 | display(button, output) 
200 | 201 | def on_button_clicked(b): 202 | 203 | with open("answers.json", "r") as f: 204 | source_dict = json.load(f) 205 | 206 | answer_dict = { 207 | "ex4": { 208 | "mean": mean.value, 209 | "var": var.value, 210 | "cov": cov.value 211 | } 212 | } 213 | 214 | source_dict.update(answer_dict) 215 | 216 | with open("answers.json", "w") as f: 217 | json.dump(source_dict, f) 218 | 219 | with output: 220 | print("Answer for exercise 4 saved.") 221 | 222 | 223 | button.on_click(on_button_clicked) 224 | 225 | 226 | def exercise_5(): 227 | hist = widgets.ToggleButtons( 228 | options=['left', 'center', 'right'], 229 | description='Your answer:', 230 | disabled=False, 231 | button_style='', # 'success', 'info', 'warning', 'danger' or '' 232 | ) 233 | 234 | button = widgets.Button(description="Save your answer!", button_style="success") 235 | output = widgets.Output() 236 | 237 | display(hist) 238 | 239 | display(button, output) 240 | 241 | def on_button_clicked(b): 242 | 243 | with open("answers.json", "r") as f: 244 | source_dict = json.load(f) 245 | 246 | answer_dict = { 247 | "ex5": { 248 | "hist": hist.value 249 | } 250 | } 251 | 252 | source_dict.update(answer_dict) 253 | 254 | with open("answers.json", "w") as f: 255 | json.dump(source_dict, f) 256 | 257 | with output: 258 | print("Answer for exercise 5 saved.") 259 | 260 | 261 | button.on_click(on_button_clicked) 262 | 263 | 264 | def exercise_6(): 265 | max_sum = widgets.IntSlider( 266 | value=2, 267 | min=2, 268 | max=12, 269 | step=1, 270 | description='Sum:', 271 | disabled=False, 272 | continuous_update=False, 273 | orientation='horizontal', 274 | readout=True, 275 | readout_format='d' 276 | ) 277 | 278 | button = widgets.Button(description="Save your answer!", button_style="success") 279 | output = widgets.Output() 280 | 281 | display(max_sum) 282 | 283 | display(button, output) 284 | 285 | def on_button_clicked(b): 286 | 287 | with open("answers.json", "r") as f: 288 | source_dict = json.load(f) 289 | 290 | 
answer_dict = { 291 | "ex6": { 292 | "max_sum": max_sum.value 293 | } 294 | } 295 | 296 | source_dict.update(answer_dict) 297 | 298 | with open("answers.json", "w") as f: 299 | json.dump(source_dict, f) 300 | 301 | with output: 302 | print("Answer for exercise 6 saved.") 303 | 304 | 305 | button.on_click(on_button_clicked) 306 | 307 | 308 | def exercise_7(): 309 | hist = widgets.ToggleButtons( 310 | options=['left-most', 'left-center', 'right-center', 'right-most'], 311 | description='Your answer:', 312 | disabled=False, 313 | button_style='', # 'success', 'info', 'warning', 'danger' or '' 314 | ) 315 | 316 | button = widgets.Button(description="Save your answer!", button_style="success") 317 | output = widgets.Output() 318 | 319 | display(hist) 320 | 321 | display(button, output) 322 | 323 | def on_button_clicked(b): 324 | 325 | with open("answers.json", "r") as f: 326 | source_dict = json.load(f) 327 | 328 | answer_dict = { 329 | "ex7": { 330 | "hist": hist.value 331 | } 332 | } 333 | 334 | source_dict.update(answer_dict) 335 | 336 | with open("answers.json", "w") as f: 337 | json.dump(source_dict, f) 338 | 339 | with output: 340 | print("Answer for exercise 7 saved.") 341 | 342 | 343 | button.on_click(on_button_clicked) 344 | 345 | 346 | def exercise_8(): 347 | hist = widgets.ToggleButtons( 348 | options=['left-most', 'left-center', 'right-center', 'right-most'], 349 | description='Your answer:', 350 | disabled=False, 351 | button_style='', # 'success', 'info', 'warning', 'danger' or '' 352 | ) 353 | 354 | button = widgets.Button(description="Save your answer!", button_style="success") 355 | output = widgets.Output() 356 | 357 | display(hist) 358 | 359 | display(button, output) 360 | 361 | def on_button_clicked(b): 362 | 363 | with open("answers.json", "r") as f: 364 | source_dict = json.load(f) 365 | 366 | answer_dict = { 367 | "ex8": { 368 | "hist": hist.value 369 | } 370 | } 371 | 372 | source_dict.update(answer_dict) 373 | 374 | with open("answers.json", 
"w") as f: 375 | json.dump(source_dict, f) 376 | 377 | with output: 378 | print("Answer for exercise 8 saved.") 379 | 380 | 381 | button.on_click(on_button_clicked) 382 | 383 | 384 | def exercise_9(): 385 | mean = widgets.ToggleButtons( 386 | options=['stays the same', 'increases', 'decreases'], 387 | description='The mean of the sum:', 388 | disabled=False, 389 | button_style='', 390 | ) 391 | 392 | var = widgets.ToggleButtons( 393 | options=['stays the same', 'increases', 'decreases'], 394 | description='The variance of the sum:', 395 | disabled=False, 396 | button_style='', 397 | ) 398 | 399 | cov = widgets.ToggleButtons( 400 | options=['stays the same', 'increases', 'decreases'], 401 | description='The covariance of the joint distribution:', 402 | disabled=False, 403 | button_style='', 404 | ) 405 | 406 | button = widgets.Button(description="Save your answer!", button_style="success") 407 | output = widgets.Output() 408 | 409 | print("As the number of sides in the die increases:") 410 | display(mean) 411 | display(var) 412 | display(cov) 413 | 414 | display(button, output) 415 | 416 | def on_button_clicked(b): 417 | 418 | with open("answers.json", "r") as f: 419 | source_dict = json.load(f) 420 | 421 | answer_dict = { 422 | "ex9": { 423 | "mean": mean.value, 424 | "var": var.value, 425 | "cov": cov.value, 426 | } 427 | } 428 | 429 | source_dict.update(answer_dict) 430 | 431 | with open("answers.json", "w") as f: 432 | json.dump(source_dict, f) 433 | 434 | with output: 435 | print("Answer for exercise 9 saved.") 436 | 437 | 438 | button.on_click(on_button_clicked) 439 | 440 | 441 | 442 | def exercise_10(): 443 | options = widgets.RadioButtons( 444 | options=[ 445 | 'the mean and variance is the same regardless of which side is loaded', 446 | 'having the sides 3 or 4 loaded will yield a higher covariance than any other sides', 447 | 'the mean will decrease as the value of the loaded side increases', 448 | 'changing the loaded side from 1 to 6 will yield a higher 
mean but the same variance' 449 | ], 450 | layout={'width': 'max-content'} 451 | ) 452 | 453 | button = widgets.Button(description="Save your answer!", button_style="success") 454 | output = widgets.Output() 455 | 456 | display(options) 457 | 458 | display(button, output) 459 | 460 | def on_button_clicked(b): 461 | 462 | with open("answers.json", "r") as f: 463 | source_dict = json.load(f) 464 | 465 | answer_dict = { 466 | "ex10": { 467 | "options": options.value, 468 | } 469 | } 470 | 471 | source_dict.update(answer_dict) 472 | 473 | with open("answers.json", "w") as f: 474 | json.dump(source_dict, f) 475 | 476 | with output: 477 | print("Answer for exercise 10 saved.") 478 | 479 | 480 | button.on_click(on_button_clicked) 481 | 482 | 483 | def exercise_11(): 484 | options = widgets.RadioButtons( 485 | options=[ 486 | 'changing the direction of the inequality will change the sign of the covariance', 487 | 'changing the direction of the inequality will change the sign of the mean', 488 | 'changing the direction of the inequality does not affect the possible values of the sum', 489 | 'covariance will always be equal to 0' 490 | ], 491 | layout={'width': 'max-content'} 492 | ) 493 | 494 | button = widgets.Button(description="Save your answer!", button_style="success") 495 | output = widgets.Output() 496 | 497 | display(options) 498 | 499 | display(button, output) 500 | 501 | def on_button_clicked(b): 502 | 503 | with open("answers.json", "r") as f: 504 | source_dict = json.load(f) 505 | 506 | answer_dict = { 507 | "ex11": { 508 | "options": options.value, 509 | } 510 | } 511 | 512 | source_dict.update(answer_dict) 513 | 514 | with open("answers.json", "w") as f: 515 | json.dump(source_dict, f) 516 | 517 | with output: 518 | print("Answer for exercise 11 saved.") 519 | 520 | 521 | button.on_click(on_button_clicked) 522 | 523 | 524 | def exercise_12(): 525 | options = widgets.RadioButtons( 526 | options=[ 527 | 'yes, but only if one of the sides is loaded', 528 | 'no, 
regardless if the die is fair or not', 529 | 'yes, but only if the die is fair', 530 | 'yes, regardless if the die is fair or not' 531 | ], 532 | layout={'width': 'max-content'} 533 | ) 534 | 535 | button = widgets.Button(description="Save your answer!", button_style="success") 536 | output = widgets.Output() 537 | 538 | display(options) 539 | 540 | display(button, output) 541 | 542 | def on_button_clicked(b): 543 | 544 | with open("answers.json", "r") as f: 545 | source_dict = json.load(f) 546 | 547 | answer_dict = { 548 | "ex12": { 549 | "options": options.value, 550 | } 551 | } 552 | 553 | source_dict.update(answer_dict) 554 | 555 | with open("answers.json", "w") as f: 556 | json.dump(source_dict, f) 557 | 558 | with output: 559 | print("Answer for exercise 12 saved.") 560 | 561 | 562 | button.on_click(on_button_clicked) 563 | 564 | 565 | def check_submissions(): 566 | with open("./answers.json", "r") as f: 567 | answer_dict = json.load(f) 568 | 569 | saved_exercises = [k for k in answer_dict.keys()] 570 | expected = ['ex1', 'ex2', 'ex3', 'ex4', 'ex5', 'ex6', 'ex7', 'ex8', 'ex9', 'ex10', 'ex11', 'ex12'] 571 | missing = [e for e in expected if not e in saved_exercises] 572 | 573 | if missing: 574 | print(f"missing answers for exercises {[ex.split('ex')[1] for ex in missing]}\n\nSave your answers before submitting for grading!") 575 | return 576 | 577 | print("All answers saved, you can submit the assignment for grading!") -------------------------------------------------------------------------------- /C3/w2/Readme.md: -------------------------------------------------------------------------------- 1 | # Course 3 - Week 2 2 | 3 | - [Practice Quiz](/C3/w2/pq1/) 4 | - [Graded Quiz - Summative Quiz](/C3/w2/q1/) 5 | - [Optional Lab - Summary statistics and visualization of data sets](/C3/w2/lab/ugl_datasets.ipynb) 6 | - [Optional Lab - Dice Simulations](/C3/w2/lab/C3_W2_Lab_2_Dice_Simulations.ipynb) 7 | - [Programming Assignment: Loaded Dice](/C3/w2/C3w2_graded_lab/) 
-------------------------------------------------------------------------------- /C3/w2/lab/df_anscombe.csv: -------------------------------------------------------------------------------- 1 | x,y,group 2 | 10.0,8.04,1 3 | 8.0,6.95,1 4 | 13.0,7.58,1 5 | 9.0,8.81,1 6 | 11.0,8.33,1 7 | 14.0,9.96,1 8 | 6.0,7.24,1 9 | 4.0,4.26,1 10 | 12.0,10.84,1 11 | 7.0,4.82,1 12 | 5.0,5.68,1 13 | 10.0,9.14,2 14 | 8.0,8.14,2 15 | 13.0,8.74,2 16 | 9.0,8.77,2 17 | 11.0,9.26,2 18 | 14.0,8.1,2 19 | 6.0,6.13,2 20 | 4.0,3.1,2 21 | 12.0,9.13,2 22 | 7.0,7.26,2 23 | 5.0,4.74,2 24 | 10.0,7.46,3 25 | 8.0,6.77,3 26 | 13.0,12.74,3 27 | 9.0,7.11,3 28 | 11.0,7.81,3 29 | 14.0,8.84,3 30 | 6.0,6.08,3 31 | 4.0,5.39,3 32 | 12.0,8.15,3 33 | 7.0,6.42,3 34 | 5.0,5.73,3 35 | 8.0,6.58,4 36 | 8.0,5.76,4 37 | 8.0,7.71,4 38 | 8.0,8.84,4 39 | 8.0,8.47,4 40 | 8.0,7.04,4 41 | 8.0,5.25,4 42 | 19.0,12.5,4 43 | 8.0,5.56,4 44 | 8.0,7.91,4 45 | 8.0,6.89,4 46 | -------------------------------------------------------------------------------- /C3/w2/lab/ugl_datasets.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "c6dea874", 6 | "metadata": {}, 7 | "source": [ 8 | "# Ungraded Lab - Datasets\n", 9 | "\n", 10 | "In this notebook, you will be working with two distinct datasets. You will notice that relying solely on the main statistical measures such as mean, variance (or standard deviation), and correlation may not always effectively describe the datasets. Therefore, it is always advisable to supplement these measures with visualization techniques and/or other statistical measures to gain a deeper understanding of the data.\n", 11 | "\n", 12 | "You will be working with two well-known datasets: Anscombe's quartet and the Datasaurus Dozen dataset. These datasets are artificially generated and are used to illustrate the fact that some metrics can fail to capture important information present in a dataset. 
More specifically, these datasets are used to demonstrate how relying solely on metrics can sometimes be misleading. If you're interested, you can read more about Anscombe's quartet and the Datasaurus Dozen dataset at their respective [Wikipedia](https://en.wikipedia.org/wiki/Anscombe%27s_quartet) and [Autodesk Research](https://www.autodesk.com/research/publications/same-stats-different-graphs) pages." 13 | ] 14 | }, 15 | { 16 | "cell_type": "code", 17 | "execution_count": null, 18 | "id": "a74d2cd6", 19 | "metadata": {}, 20 | "outputs": [], 21 | "source": [ 22 | "import numpy as np\n", 23 | "import pandas as pd\n", 24 | "import os\n", 25 | "import matplotlib.pyplot as plt\n", 26 | "import utils\n", 27 | "%matplotlib widget\n" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "id": "d084f9c7", 33 | "metadata": {}, 34 | "source": [ 35 | "# 1.1 First data set - Anscombe's quartet\n", 36 | "\n", 37 | "This first dataset was initially constructed by the statistician Francis Anscombe to demonstrate both the importance of graphing data when analyzing it, and the effect of outliers and other influential observations on statistical properties. (From wikipedia)\n", 38 | "\n", 39 | "Let us explore the first dataset. To do this, you will be utilizing the popular Python library called [pandas](https://pandas.pydata.org/), specifically its main object known as [DataFrame](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html#pandas.DataFrame).\n", 40 | "\n", 41 | "To read the dataset, which is stored in a `.csv file`, you can use the read_csv function in pandas. This function enables you to load a DataFrame immediately. For further information on this function, you can type help(pd.read_csv) in your code editor." 
42 | ] 43 | }, 44 | { 45 | "cell_type": "code", 46 | "execution_count": null, 47 | "id": "8a5ad75e", 48 | "metadata": {}, 49 | "outputs": [], 50 | "source": [ 51 | "# This line of code reads the dataset named 'df_anscombe.csv', which is stored in the same directory as this notebook.\n", 52 | "df_anscombe = pd.read_csv('df_anscombe.csv')" 53 | ] 54 | }, 55 | { 56 | "cell_type": "markdown", 57 | "id": "9fac4348", 58 | "metadata": {}, 59 | "source": [ 60 | "The call `df_anscombe.head()` will show you the first five rows of the data set, so you can have a look on its data." 61 | ] 62 | }, 63 | { 64 | "cell_type": "code", 65 | "execution_count": null, 66 | "id": "a20f11a7", 67 | "metadata": {}, 68 | "outputs": [], 69 | "source": [ 70 | "df_anscombe.head()" 71 | ] 72 | }, 73 | { 74 | "cell_type": "code", 75 | "execution_count": null, 76 | "id": "2080e010", 77 | "metadata": {}, 78 | "outputs": [], 79 | "source": [ 80 | "# Let's determine the number of groups present in this dataset.\n", 81 | "df_anscombe.group.nunique()" 82 | ] 83 | }, 84 | { 85 | "cell_type": "markdown", 86 | "id": "bd9ce36c", 87 | "metadata": {}, 88 | "source": [ 89 | "This dataset comprises of four groups of data, each containing two components - `x` and `y`. To analyze the data, you can obtain the mean and variance of each group, as well as the correlation between x and y within each group. Pandas provides a built-in function called `DataFrame.describe` that displays common statistics for each variable. To group the data by the group column, you can use the `DataFrame.groupby` function." 90 | ] 91 | }, 92 | { 93 | "cell_type": "markdown", 94 | "id": "70946c36", 95 | "metadata": {}, 96 | "source": [ 97 | "The next block of code first groups the `DataFrame` based on the group column, and then applies the describe function to obtain the common statistics for each variable in each group." 
98 | ] 99 | }, 100 | { 101 | "cell_type": "code", 102 | "execution_count": null, 103 | "id": "e66e9c08", 104 | "metadata": {}, 105 | "outputs": [], 106 | "source": [ 107 | "df_anscombe.groupby('group').describe()" 108 | ] 109 | }, 110 | { 111 | "cell_type": "markdown", 112 | "id": "02532823", 113 | "metadata": {}, 114 | "source": [ 115 | "The groups appear to be quite similar, as evidenced by the identical mean and standard deviation values for both `x` and `y` within each group.\n", 116 | "\n", 117 | "Additionally, you can analyze the correlation between `x` and `y` within each group.\n", 118 | "\n", 119 | "To obtain the correlation matrix for each group, you can follow the same approach as before. First, group the data by the `group` column using `DataFrame.groupby`, and then apply the `.corr` function." 120 | ] 121 | }, 122 | { 123 | "cell_type": "code", 124 | "execution_count": null, 125 | "id": "655b0377", 126 | "metadata": {}, 127 | "outputs": [], 128 | "source": [ 129 | "df_anscombe.groupby('group').corr()" 130 | ] 131 | }, 132 | { 133 | "cell_type": "markdown", 134 | "id": "facfef83", 135 | "metadata": {}, 136 | "source": [ 137 | "As observed, the correlation between `x` and `y` is identical within each group up to three decimal places. Moreover, the high correlation coefficient values suggest a strong linear correlation between `x` and `y` within each group.\n", 138 | "\n", 139 | "Despite the similarities in the statistical measures for the groups, it is still necessary to visualize the data to get a better understanding of the differences, if any." 
140 | ] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "execution_count": null, 145 | "id": "ba869efd", 146 | "metadata": {}, 147 | "outputs": [], 148 | "source": [ 149 | "utils.plot_anscombes_quartet()" 150 | ] 151 | }, 152 | { 153 | "cell_type": "markdown", 154 | "id": "af7aa0f3", 155 | "metadata": {}, 156 | "source": [ 157 | "Upon visualizing the data, the four groups appear to be quite distinct:\n", 158 | "\n", 159 | "1. The first group shows a clear linear relationship between `x` and `y`.\n", 160 | "2. The second group, on the other hand, exhibits a non-linear pattern, indicating that the usual Pearson correlation may not be appropriate to describe the dataset.\n", 161 | "3. The third group would be linear if it were not for a single outlier.\n", 162 | "4. The fourth group demonstrates that `y` can have different values for the same `x`, suggesting that there is no clear relationship between `x` and `y`. However, there is also an outlier in this group.\n", 163 | "\n", 164 | "These four groups illustrate that summary statistics alone are not sufficient for investigating data. Visualizing the data, analyzing possible outliers, and identifying more complex relationships are essential to gain a better understanding of the underlying patterns in the data." 165 | ] 166 | }, 167 | { 168 | "cell_type": "markdown", 169 | "id": "4d829a1f", 170 | "metadata": {}, 171 | "source": [ 172 | "# 2 - Second data set - Datasaurus Dozen\n", 173 | "\n", 174 | "The creation of Anscombe's quartet inspired other authors to generate datasets that have different relationships among its points but share the same summary statistics. One such dataset is the Datasaurus Dozen, which was created by AutoDesk. \n", 175 | "\n", 176 | "In this case, you will take a different approach. Instead of analyzing summary statistics and then plotting the data points, you will compare two datasets from the dozen and compute their statistics." 
177 | ] 178 | }, 179 | { 180 | "cell_type": "code", 181 | "execution_count": null, 182 | "id": "76d174b1", 183 | "metadata": {}, 184 | "outputs": [], 185 | "source": [ 186 | "df_datasaurus = pd.read_csv(\"datasaurus.csv\")" 187 | ] 188 | }, 189 | { 190 | "cell_type": "markdown", 191 | "id": "f0aaf119", 192 | "metadata": {}, 193 | "source": [ 194 | "The next cell will run a widget where you can investigate this dataset, which has different groups in it." 195 | ] 196 | }, 197 | { 198 | "cell_type": "code", 199 | "execution_count": null, 200 | "id": "cc2eb2cc", 201 | "metadata": {}, 202 | "outputs": [], 203 | "source": [ 204 | "utils.display_widget()" 205 | ] 206 | }, 207 | { 208 | "cell_type": "markdown", 209 | "id": "37513ad2", 210 | "metadata": {}, 211 | "source": [ 212 | "As you have observed, the first dataset was not an anomaly; it is possible to have different datasets with the same summary statistics. Hence, it is essential to keep in mind while analyzing data that the summary statistics alone may not provide a complete picture of the underlying patterns and relationships." 213 | ] 214 | }, 215 | { 216 | "cell_type": "markdown", 217 | "id": "941c9278", 218 | "metadata": {}, 219 | "source": [ 220 | "Congratulations! You have completed this ungraded lab and now understand that summary statistics may not capture all the necessary information to describe your data accurately. Keep in mind that visualizations and more in-depth analyses are often needed to get a better understanding of your data." 
221 | ] 222 | } 223 | ], 224 | "metadata": { 225 | "kernelspec": { 226 | "display_name": "Python 3 (ipykernel)", 227 | "language": "python", 228 | "name": "python3" 229 | }, 230 | "language_info": { 231 | "codemirror_mode": { 232 | "name": "ipython", 233 | "version": 3 234 | }, 235 | "file_extension": ".py", 236 | "mimetype": "text/x-python", 237 | "name": "python", 238 | "nbconvert_exporter": "python", 239 | "pygments_lexer": "ipython3", 240 | "version": "3.10.9" 241 | } 242 | }, 243 | "nbformat": 4, 244 | "nbformat_minor": 5 245 | } 246 | -------------------------------------------------------------------------------- /C3/w2/lab/utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import pandas as pd 3 | import os 4 | import matplotlib.pyplot as plt 5 | from IPython.display import display 6 | import ipywidgets as widgets 7 | from ipywidgets import interact,HBox, VBox 8 | import matplotlib.gridspec as gridspec 9 | 10 | df_anscombe = pd.read_csv('df_anscombe.csv') 11 | df_datasaurus = pd.read_csv("datasaurus.csv") 12 | 13 | def plot_anscombes_quartet(): 14 | fig, axs = plt.subplots(2,2, figsize = (8,5), tight_layout = True) 15 | i = 1 16 | fig.suptitle("Anscombe's quartet", fontsize = 16) 17 | for line in axs: 18 | for ax in line: 19 | ax.scatter(df_anscombe[df_anscombe.group == i]['x'],df_anscombe[df_anscombe.group == i]['y']) 20 | ax.set_title(f'Group {i}') 21 | ax.set_ylim(2,15) 22 | ax.set_xlim(0,21) 23 | ax.set_xlabel('x') 24 | ax.set_ylabel('y') 25 | i+=1 26 | 27 | def display_widget(): 28 | 29 | dropdown_graph_1 = widgets.Dropdown( 30 | options=df_datasaurus.group.unique(), 31 | value='dino', 32 | description='Data set 1: ', 33 | disabled=False, 34 | ) 35 | 36 | statistics_graph_1 = widgets.Button( 37 | value=False, 38 | description='Compute stats', 39 | disabled=False, 40 | button_style='', 41 | tooltip='Description', 42 | icon='' 43 | ) 44 | 45 | dropdown_graph_2 = widgets.Dropdown( 46 | 
options=df_datasaurus.group.unique(), 47 | value='h_lines', 48 | description='Data set 2: ', 49 | disabled=False, 50 | ) 51 | 52 | statistics_graph_2 = widgets.Button( 53 | value=False, 54 | description='Compute stats', 55 | disabled=False, 56 | button_style='', 57 | tooltip='Description', 58 | icon='' 59 | ) 60 | plotted_stats_graph_1 = None 61 | plotted_stats_graph_2 = None 62 | 63 | fig = plt.figure(figsize = (8,4), tight_layout = True) 64 | gs = gridspec.GridSpec(2,2) 65 | ax_1 = fig.add_subplot(gs[0,0]) 66 | ax_2 = fig.add_subplot(gs[1,0]) 67 | ax_text_1 = fig.add_subplot(gs[0,1]) 68 | ax_text_2 = fig.add_subplot(gs[1,1]) 69 | df_group_1 = df_datasaurus.groupby('group').get_group('dino') 70 | df_group_2 = df_datasaurus.groupby('group').get_group('h_lines') 71 | sc_1 = ax_1.scatter(df_group_1['x'],df_group_1['y'], s = 4) 72 | sc_2 = ax_2.scatter(df_group_2['x'],df_group_2['y'], s = 4) 73 | ax_1.set_xlabel('x') 74 | ax_1.set_ylabel('y') 75 | ax_2.set_xlabel('x') 76 | ax_2.set_ylabel('y') 77 | ax_text_1.axis('off') 78 | ax_text_2.axis('off') 79 | 80 | def dropdown_choice(value, plotted_stats, ax_text, sc): 81 | if value.new != plotted_stats: 82 | ax_text.clear() 83 | ax_text.axis('off') 84 | sc.set_offsets(df_datasaurus.groupby('group').get_group(value.new)[['x', 'y']]) 85 | fig.canvas.draw_idle() 86 | 87 | 88 | def get_stats(value, plotted_stats, ax_text, dropdown, val): 89 | value = dropdown.value 90 | if value == plotted_stats: 91 | return 92 | ax_text.clear() 93 | ax_text.axis('off') 94 | df_group = df_datasaurus.groupby('group').get_group(value) 95 | means = df_group.mean() 96 | var = df_group.var() 97 | corr = df_group.corr() 98 | ax_text.text(0, 99 | 0, 100 | f"Statistics:\n Mean x: {means['x']:.2f}\n Variance x: {var['x']:.2f}\n\n Mean y: {means['y']:.2f}\n Variance y: {var['y']:.2f}\n\n Correlation: {corr['x']['y']:.2f}" 101 | ) 102 | if val == 1: 103 | plotted_stats_graph_1 = value 104 | if val == 2: 105 | plotted_stats_graph_2 = value 106 | 107 | 108 | 
109 | 110 | dropdown_graph_1.observe(lambda value: dropdown_choice(value,plotted_stats_graph_1, ax_text_1, sc_1), names = 'value') 111 | statistics_graph_1.on_click(lambda value: get_stats(value, plotted_stats_graph_1, ax_text_1, dropdown_graph_1,1)) 112 | dropdown_graph_2.observe(lambda value: dropdown_choice(value,plotted_stats_graph_2, ax_text_2, sc_2), names = 'value') 113 | statistics_graph_2.on_click(lambda value: get_stats(value, plotted_stats_graph_2, ax_text_2, dropdown_graph_2,2)) 114 | graph_1_box = HBox([dropdown_graph_1, statistics_graph_1]) 115 | graph_2_box = HBox([dropdown_graph_2, statistics_graph_2]) 116 | display(VBox([graph_1_box,graph_2_box])) 117 | 118 | 119 | def plot_datasaurus(): 120 | 121 | fig, axs = plt.subplots(6,2, figsize = (7,9), tight_layout = True) 122 | i = 0 123 | fig.suptitle("Datasaurus", fontsize = 16) 124 | for line in axs: 125 | for ax in line: 126 | if i > 12: 127 | ax.axis('off') 128 | else: 129 | group = df_datasaurus.group.unique()[i] 130 | ax.scatter(df_datasaurus[df_datasaurus.group == group]['x'],df_datasaurus[df_datasaurus.group == group]['y'], s = 4) 131 | ax.set_title(f'Group {group}') 132 | ax.set_ylim(-5,110) 133 | ax.set_xlim(10,110) 134 | ax.set_xlabel('x') 135 | ax.set_ylabel('y') 136 | i+=1 137 | 138 | -------------------------------------------------------------------------------- /C3/w2/pq1/Readme.md: -------------------------------------------------------------------------------- 1 | # Practice Quiz 2 | 3 | ![](/C3/w2/pq1/ss1.png) 4 | ![](/C3/w2/pq1/ss2.png) 5 | ![](/C3/w2/pq1/ss3.png) -------------------------------------------------------------------------------- /C3/w2/pq1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/pq1/ss1.png 
-------------------------------------------------------------------------------- /C3/w2/pq1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/pq1/ss2.png -------------------------------------------------------------------------------- /C3/w2/pq1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/pq1/ss3.png -------------------------------------------------------------------------------- /C3/w2/q1/Readme.md: -------------------------------------------------------------------------------- 1 | # Graded Quiz - Summative Quiz 2 | 3 | ![](/C3/w2/q1/ss1.png) 4 | ![](/C3/w2/q1/ss2.png) 5 | ![](/C3/w2/q1/ss3.png) 6 | ![](/C3/w2/q1/ss4.png) 7 | ![](/C3/w2/q1/ss5.png) 8 | ![](/C3/w2/q1/ss6.png) 9 | ![](/C3/w2/q1/ss7.png) -------------------------------------------------------------------------------- /C3/w2/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/q1/ss1.png -------------------------------------------------------------------------------- /C3/w2/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/q1/ss2.png -------------------------------------------------------------------------------- /C3/w2/q1/ss3.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/q1/ss3.png -------------------------------------------------------------------------------- /C3/w2/q1/ss4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/q1/ss4.png -------------------------------------------------------------------------------- /C3/w2/q1/ss5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/q1/ss5.png -------------------------------------------------------------------------------- /C3/w2/q1/ss6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/q1/ss6.png -------------------------------------------------------------------------------- /C3/w2/q1/ss7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w2/q1/ss7.png -------------------------------------------------------------------------------- /C3/w3/Readme.md: -------------------------------------------------------------------------------- 1 | # Course 3 - Week 3 2 | 3 | - [Optional Lab - Sampling data from different distribution and studying the 
distribution of the sample mean](/C3/w3/lab/) 4 | - [Practice Quiz](/C3/w3/pq1/) 5 | - [Graded Quiz - Summative Quiz](/C3/w3/q1/) -------------------------------------------------------------------------------- /C3/w3/lab/utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import seaborn as sns 3 | import matplotlib.pyplot as plt 4 | from scipy import stats 5 | from scipy.stats import norm 6 | import ipywidgets as widgets 7 | from ipywidgets import interact, interact_manual 8 | 9 | 10 | def sample_means(data, sample_size): 11 | means = [] 12 | 13 | for _ in range(10_000): 14 | sample = np.random.choice(data, size=sample_size) 15 | means.append(np.mean(sample)) 16 | 17 | return np.array(means) 18 | 19 | 20 | def gaussian_clt(): 21 | def _plot(mu, sigma, sample_size): 22 | # mu = 10 23 | # sigma = 5 24 | 25 | gaussian_population = np.random.normal(mu, sigma, 100_000) 26 | gaussian_sample_means = sample_means(gaussian_population, sample_size) 27 | x_range = np.linspace( 28 | min(gaussian_sample_means), max(gaussian_sample_means), 100 29 | ) 30 | 31 | sample_means_mean = np.mean(gaussian_sample_means) 32 | sample_means_std = np.std(gaussian_sample_means) 33 | clt_std = sigma / np.sqrt(sample_size) 34 | 35 | estimated_pop_sigma = sample_means_std * np.sqrt(sample_size) 36 | 37 | std_err = abs(clt_std - sample_means_std) / clt_std 38 | 39 | clt_holds = std_err < 0.1 40 | 41 | # print(f"Mean of sample means: {sample_means_mean:.2f}\n") 42 | # print(f"Std of sample means: {sample_means_std:.2f}\n") 43 | # print(f"Theoretical sigma: {clt_std:.2f}\n") 44 | # print(f"Estimated population sigma: {estimated_pop_sigma:.2f}\n") 45 | 46 | # print(f"Error: {std_err:.2f}\n") 47 | # print(f"CLT holds?: {clt_holds}\n") 48 | 49 | mu2 = mu 50 | sigma2 = sigma / np.sqrt(sample_size) 51 | # fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(10, 6)) 52 | fig, axes = plt.subplot_mosaic( 53 | [["top row", "top
row"], ["bottom left", "bottom right"]], figsize=(10, 5) 54 | ) 55 | 56 | ax1 = axes["top row"] 57 | ax2 = axes["bottom left"] 58 | ax3 = axes["bottom right"] 59 | sns.histplot(gaussian_population, stat="density", ax=ax1) 60 | ax1.set_title("Population Distribution") 61 | ax2.set_title("Sample Means Distribution") 62 | ax3.set_title("QQ Plot of Sample Means") 63 | 64 | sns.histplot(gaussian_sample_means, stat="density", ax=ax2, label="hist") 65 | sns.kdeplot( 66 | data=gaussian_sample_means, 67 | color="crimson", 68 | ax=ax2, 69 | label="kde", 70 | linestyle="dashed", 71 | fill=True, 72 | ) 73 | ax2.plot( 74 | x_range, 75 | norm.pdf(x_range, loc=mu2, scale=sigma2), 76 | color="black", 77 | label="gaussian", 78 | linestyle="solid", 79 | ) 80 | ax2.legend() 81 | 82 | stats.probplot(gaussian_sample_means, plot=ax3, fit=True) 83 | plt.tight_layout() 84 | plt.show() 85 | 86 | mu_selection = widgets.FloatSlider( 87 | value=10.0, 88 | min=0.01, 89 | max=50.0, 90 | step=1.0, 91 | description="mu", 92 | disabled=False, 93 | continuous_update=False, 94 | orientation="horizontal", 95 | readout=True, 96 | readout_format=".1f", 97 | ) 98 | 99 | sigma_selection = widgets.FloatSlider( 100 | value=5.0, 101 | min=0.01, 102 | max=20.0, 103 | step=0.1, 104 | description="sigma", 105 | disabled=False, 106 | continuous_update=False, 107 | orientation="horizontal", 108 | readout=True, 109 | readout_format=".1f", 110 | ) 111 | 112 | sample_size_selection = widgets.IntSlider( 113 | value=2, 114 | min=2, 115 | max=100, 116 | step=1, 117 | description="sample_size", 118 | disabled=False, 119 | continuous_update=False, 120 | orientation="horizontal", 121 | readout=True, 122 | readout_format="d", 123 | ) 124 | 125 | interact_manual( 126 | _plot, sample_size=sample_size_selection, mu=mu_selection, sigma=sigma_selection 127 | ) 128 | 129 | 130 | def binomial_clt(): 131 | def _plot(n, p, sample_size): 132 | mu = n * p 133 | sigma = np.sqrt(n * p * (1 - p)) / np.sqrt(sample_size) 134 | N = n *
sample_size 135 | # sigma = np.sqrt(n * p * (1 - p)) / np.sqrt(N) 136 | 137 | binomial_population = np.random.binomial(n, p, 100_000) 138 | 139 | binomial_sample_means = sample_means(binomial_population, sample_size) 140 | 141 | x_range = np.linspace( 142 | min(binomial_sample_means), max(binomial_sample_means), 100 143 | ) 144 | 145 | condition_val = np.min([N * p, N * (1 - p)]) 146 | 147 | condition = True if condition_val >= 5 else False 148 | 149 | sample_means_mean = np.mean(binomial_sample_means) 150 | sample_means_std = np.std(binomial_sample_means) 151 | clt_std = np.std(binomial_population) / np.sqrt(sample_size) 152 | 153 | estimated_pop_sigma = sample_means_std * np.sqrt(sample_size) 154 | 155 | std_err = abs(clt_std - sample_means_std) / clt_std 156 | 157 | clt_holds = True if std_err < 0.1 else False 158 | 159 | # print(f"Value of N: {N}\n") 160 | print(f"Condition value: {condition_val:.1f}") 161 | 162 | # fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(15, 6)) 163 | fig, axes = plt.subplot_mosaic( 164 | [["top row", "top row"], ["bottom left", "bottom right"]], figsize=(10, 5) 165 | ) 166 | 167 | ax1 = axes["top row"] 168 | ax2 = axes["bottom left"] 169 | ax3 = axes["bottom right"] 170 | ax1.set_title("Population Distribution") 171 | ax2.set_title("Sample Means Distribution") 172 | ax3.set_title("QQ Plot of Sample Means") 173 | sns.histplot(binomial_population, stat="density", ax=ax1) 174 | 175 | sns.histplot(binomial_sample_means, stat="density", ax=ax2, label="hist") 176 | sns.kdeplot( 177 | data=binomial_sample_means, 178 | color="crimson", 179 | ax=ax2, 180 | label="kde", 181 | linestyle="dashed", 182 | fill=True, 183 | ) 184 | ax2.plot( 185 | x_range, 186 | norm.pdf(x_range, loc=mu, scale=sigma), 187 | color="black", 188 | label="gaussian", 189 | linestyle="solid", 190 | ) 191 | ax2.legend() 192 | stats.probplot(binomial_sample_means, plot=ax3, fit=True) 193 | plt.tight_layout() 194 | plt.show() 195 | 196 | # print(f"Condition holds?: {condition} 
with value of {condition_val:.2f}\n") 197 | 198 | # print(f"Mean of sample means: {sample_means_mean:.2f}\n") 199 | # print(f"Std of sample means: {sample_means_std:.2f}\n") 200 | # print(f"Theoretical sigma: {clt_std:.2f}\n") 201 | # print(f"Estimated population sigma: {estimated_pop_sigma:.2f}\n") 202 | 203 | sample_size_selection = widgets.IntSlider( 204 | value=2, 205 | min=2, 206 | max=50, 207 | step=1, 208 | description="sample_size", 209 | disabled=False, 210 | continuous_update=False, 211 | orientation="horizontal", 212 | readout=True, 213 | readout_format="d", 214 | ) 215 | 216 | n_selection = widgets.IntSlider( 217 | value=2, 218 | min=2, 219 | max=50, 220 | step=1, 221 | description="n", 222 | disabled=False, 223 | continuous_update=False, 224 | orientation="horizontal", 225 | readout=True, 226 | readout_format="d", 227 | ) 228 | 229 | prob_success_selection = widgets.FloatSlider( 230 | value=0.5, 231 | min=0.01, 232 | max=0.99, 233 | step=0.1, 234 | description="p", 235 | disabled=False, 236 | continuous_update=False, 237 | orientation="horizontal", 238 | readout=True, 239 | readout_format=".1f", 240 | ) 241 | 242 | interact_manual( 243 | _plot, 244 | sample_size=sample_size_selection, 245 | p=prob_success_selection, 246 | n=n_selection, 247 | ) 248 | 249 | 250 | def poisson_clt(): 251 | def _plot(mu, sample_size): 252 | sigma = np.sqrt(mu) / np.sqrt(sample_size) 253 | 254 | poisson_population = np.random.poisson(mu, 100_000) 255 | 256 | poisson_sample_means = sample_means(poisson_population, sample_size) 257 | 258 | x_range = np.linspace(min(poisson_sample_means), max(poisson_sample_means), 100) 259 | 260 | fig, axes = plt.subplot_mosaic( 261 | [["top row", "top row"], ["bottom left", "bottom right"]], figsize=(10, 5) 262 | ) 263 | 264 | ax1 = axes["top row"] 265 | ax2 = axes["bottom left"] 266 | ax3 = axes["bottom right"] 267 | ax1.set_title("Population Distribution") 268 | ax2.set_title("Sample Means Distribution") 269 | ax3.set_title("QQ Plot of 
Sample Means") 270 | sns.histplot(poisson_population, stat="density", ax=ax1) 271 | 272 | sns.histplot(poisson_sample_means, stat="density", ax=ax2, label="hist") 273 | sns.kdeplot( 274 | data=poisson_sample_means, 275 | color="crimson", 276 | ax=ax2, 277 | label="kde", 278 | linestyle="dashed", 279 | fill=True, 280 | ) 281 | ax2.plot( 282 | x_range, 283 | norm.pdf(x_range, loc=mu, scale=sigma), 284 | color="black", 285 | label="gaussian", 286 | linestyle="solid", 287 | ) 288 | ax2.legend() 289 | stats.probplot(poisson_sample_means, plot=ax3, fit=True) 290 | plt.tight_layout() 291 | plt.show() 292 | 293 | sample_size_selection = widgets.IntSlider( 294 | value=2, 295 | min=2, 296 | max=50, 297 | step=1, 298 | description="sample_size", 299 | disabled=False, 300 | continuous_update=False, 301 | orientation="horizontal", 302 | readout=True, 303 | readout_format="d", 304 | ) 305 | 306 | mu_selection = widgets.FloatSlider( 307 | value=1.5, 308 | min=0.01, 309 | max=5.0, 310 | # step=1.0, 311 | description="mu", 312 | disabled=False, 313 | continuous_update=False, 314 | orientation="horizontal", 315 | readout=True, 316 | readout_format=".1f", 317 | ) 318 | 319 | interact_manual(_plot, sample_size=sample_size_selection, mu=mu_selection) 320 | 321 | 322 | def plot_kde_and_qq(sample_means_data, mu_sample_means, sigma_sample_means): 323 | fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4)) 324 | 325 | # Define the x-range for the Gaussian curve (this is just for plotting purposes) 326 | x_range = np.linspace(min(sample_means_data), max(sample_means_data), 100) 327 | 328 | # Histogram of sample means (blue) 329 | sns.histplot(sample_means_data, stat="density", label="hist", ax=ax1) 330 | 331 | # Estimated PDF of sample means (red) 332 | sns.kdeplot( 333 | data=sample_means_data, 334 | color="crimson", 335 | label="kde", 336 | linestyle="dashed", 337 | fill=True, 338 | ax=ax1, 339 | ) 340 | 341 | # Gaussian curve with estimated mu and sigma (black) 342 | ax1.plot( 343 | 
x_range, 344 | norm.pdf(x_range, loc=mu_sample_means, scale=sigma_sample_means), 345 | color="black", 346 | label="gaussian", 347 | ) 348 | 349 | res = stats.probplot(sample_means_data, plot=ax2, fit=True) 350 | 351 | ax1.legend() 352 | plt.show() 353 | -------------------------------------------------------------------------------- /C3/w3/pq1/Readme.md: -------------------------------------------------------------------------------- 1 | # Practice Quiz 2 | 3 | ![](/C3/w3/pq1/ss1.png) -------------------------------------------------------------------------------- /C3/w3/pq1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w3/pq1/ss1.png -------------------------------------------------------------------------------- /C3/w3/q1/Readme.md: -------------------------------------------------------------------------------- 1 | # Graded Quiz - Summative Quiz 2 | 3 | ![](/C3/w3/q1/ss1.png) 4 | ![](/C3/w3/q1/ss2.png) 5 | ![](/C3/w3/q1/ss3.png) -------------------------------------------------------------------------------- /C3/w3/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w3/q1/ss1.png -------------------------------------------------------------------------------- /C3/w3/q1/ss2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w3/q1/ss2.png -------------------------------------------------------------------------------- /C3/w3/q1/ss3.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w3/q1/ss3.png -------------------------------------------------------------------------------- /C3/w4/C3w4_graded_lab/__pycache__/utils.cpython-310.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w4/C3w4_graded_lab/__pycache__/utils.cpython-310.pyc -------------------------------------------------------------------------------- /C3/w4/C3w4_graded_lab/utils.py: -------------------------------------------------------------------------------- 1 | import string 2 | import random 3 | import math 4 | import numpy as np 5 | import pandas as pd 6 | import scipy.stats as stats 7 | from scipy.stats import lognorm 8 | import ipywidgets as widgets 9 | from ipywidgets import interact_manual 10 | from dataclasses import dataclass 11 | 12 | 13 | def sample_size_diff_means(mu1, mu2, sigma, alpha=0.05, beta=0.20, two_sided=True): 14 | delta = abs(mu2 - mu1) 15 | 16 | if two_sided: 17 | alpha = alpha / 2 18 | 19 | n = ( 20 | (np.square(sigma) + np.square(sigma)) 21 | * np.square(stats.norm.ppf(1 - alpha) + stats.norm.ppf(1 - beta)) 22 | ) / np.square(delta) 23 | 24 | return math.ceil(n) 25 | 26 | 27 | def sample_size_diff_proportions(p1, p2, alpha=0.05, beta=0.20, two_sided=True): 28 | k = 1 29 | 30 | q1, q2 = (1 - p1), (1 - p2) 31 | p_bar = (p1 + k * p2) / (1 + k) 32 | q_bar = 1 - p_bar 33 | delta = abs(p2 - p1) 34 | 35 | if two_sided: 36 | alpha = alpha / 2 37 | 38 | n = np.square( 39 | np.sqrt(p_bar * q_bar * (1 + (1 / k))) * stats.norm.ppf(1 - (alpha)) 40 | + np.sqrt((p1 * q1) + (p2 * q2 / k)) * stats.norm.ppf(1 - beta) 41 | ) / 
np.square(delta) 42 | 43 | return math.ceil(n) 44 | 45 | 46 | def generate_user_ids(num_users): 47 | 48 | user_ids = [] 49 | 50 | while len(user_ids) < num_users: 51 | new_id = ''.join(random.choices(string.ascii_uppercase + string.digits, k=10)) 52 | 53 | if new_id not in user_ids: 54 | user_ids.append(new_id) 55 | 56 | return user_ids 57 | 58 | 59 | def run_ab_test_background_color(n_days): 60 | 61 | np.random.seed(42) 62 | 63 | daily_users = 104 64 | n_control = int(daily_users*n_days*np.random.uniform(0.98, 1.02)) 65 | n_variation = int(daily_users*n_days*np.random.uniform(0.98, 1.02)) 66 | data_control = lognorm.rvs(0.5, loc=0, scale=np.exp(1)*10.5, size=n_control) 67 | data_variation = lognorm.rvs(0.5, loc=0, scale=np.exp(1)*11.01, size=n_variation) 68 | 69 | user_ids = generate_user_ids(n_control+n_variation) 70 | 71 | control_dict = {"user_id": user_ids[:n_control], "user_type": "control", "session_duration": data_control} 72 | variation_dict = {"user_id": user_ids[n_control:], "user_type": "variation", "session_duration": data_variation} 73 | 74 | control_df = pd.DataFrame(control_dict) 75 | variation_df = pd.DataFrame(variation_dict) 76 | 77 | df_ab_test = pd.concat([control_df, variation_df]) 78 | 79 | df_ab_test = df_ab_test.sample(frac=1).reset_index(drop=True) 80 | 81 | return df_ab_test 82 | 83 | 84 | 85 | def run_ab_test_personalized_feed(n_days): 86 | 87 | np.random.seed(69) 88 | 89 | daily_users = 519 90 | n_control = int(daily_users*n_days*np.random.uniform(0.98, 1.02)) 91 | n_variation = int(daily_users*n_days*np.random.uniform(0.98, 1.02)) 92 | data_control = np.random.choice([0, 1], size=n_control, p=[1-0.12, 0.12]) 93 | data_variation = np.random.choice([0, 1], size=n_variation, p=[1-0.15, 0.15]) 94 | 95 | user_ids = generate_user_ids(n_control+n_variation) 96 | 97 | control_dict = {"user_id": user_ids[:n_control], "user_type": "control", "converted": data_control} 98 | variation_dict = {"user_id": user_ids[n_control:], "user_type": 
"variation", "converted": data_variation} 99 | 100 | control_df = pd.DataFrame(control_dict) 101 | variation_df = pd.DataFrame(variation_dict) 102 | 103 | df_ab_test = pd.concat([control_df, variation_df]) 104 | 105 | df_ab_test = df_ab_test.sample(frac=1).reset_index(drop=True) 106 | 107 | return df_ab_test 108 | 109 | 110 | @dataclass 111 | class estimation_metrics_prop: 112 | n: int 113 | x: int 114 | p: float 115 | 116 | def __repr__(self): 117 | return f"sample_params(n={self.n}, x={self.x}, p={self.p:.3f})" 118 | 119 | 120 | def AB_test_dashboard(z_statistic_diff_proportions, reject_nh_z_statistic): 121 | def _AB(n1, x1, n2, x2, alpha): 122 | 123 | m1 = estimation_metrics_prop(n=n1, x=x1, p=x1/n1) 124 | m2 = estimation_metrics_prop(n=n2, x=x2, p=x2/n2) 125 | z = z_statistic_diff_proportions(m1, m2) 126 | reject_nh = reject_nh_z_statistic(z, alpha=alpha) 127 | print(f"The null hypothesis can be rejected at the {alpha:.5f} level of significance: {reject_nh}\n") 128 | 129 | msg = "" if reject_nh else " not" 130 | print(f"There is{msg} enough statistical evidence against H0.\nThus it can be concluded that there is{msg} a statistically significant difference between the two proportions.") 131 | 132 | n1_selection = widgets.IntText( 133 | value=4632, 134 | description='Users A:', 135 | disabled=False 136 | ) 137 | 138 | n2_selection = widgets.IntText( 139 | value=4728, 140 | description='Users B:', 141 | disabled=False 142 | ) 143 | 144 | x1_selection = widgets.IntText( 145 | value=576, 146 | description='Conversions A:', 147 | disabled=False, 148 | style = {'description_width': 'initial'} 149 | ) 150 | 151 | x2_selection = widgets.IntText( 152 | value=718, 153 | description='Conversions B:', 154 | disabled=False, 155 | style = {'description_width': 'initial'} 156 | ) 157 | 158 | alpha_selection = widgets.FloatSlider( 159 | value=0.05, 160 | min=0, 161 | max=1, 162 | step=0.001, 163 | description='Alpha:', 164 | disabled=False, 165 | continuous_update=False, 166 | 
orientation='horizontal', 167 | readout=True, 168 | readout_format='.2f', 169 | ) 170 | 171 | 172 | interact_manual(_AB, n1=n1_selection, x1=x1_selection, n2=n2_selection, x2=x2_selection, alpha=alpha_selection) 173 | -------------------------------------------------------------------------------- /C3/w4/Readme.md: -------------------------------------------------------------------------------- 1 | # Course 3 - Week 4 2 | 3 | - [Practice Quiz](/C3/w4/pq1/) 4 | - [Graded Quiz - Summative Quiz](/C3/w4/q1/) 5 | - [Programming Assignment: A/B Testing](/C3/w4/C3w4_graded_lab/) -------------------------------------------------------------------------------- /C3/w4/pq1/Readme.md: -------------------------------------------------------------------------------- 1 | # Practice Quiz 2 | 3 | ![](/C3/w4/pq1/ss1.png) -------------------------------------------------------------------------------- /C3/w4/pq1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w4/pq1/ss1.png -------------------------------------------------------------------------------- /C3/w4/q1/Readme.md: -------------------------------------------------------------------------------- 1 | # Graded Quiz - Summative Quiz 2 | 3 | ![](/C3/w4/q1/ss1.png) 4 | ![](/C3/w4/q1/ss2.png) 5 | ![](/C3/w4/q1/ss3.png) -------------------------------------------------------------------------------- /C3/w4/q1/ss1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w4/q1/ss1.png -------------------------------------------------------------------------------- /C3/w4/q1/ss2.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w4/q1/ss2.png -------------------------------------------------------------------------------- /C3/w4/q1/ss3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/dc449536802ee1b3dfb31986b26a05f955441658/C3/w4/q1/ss3.png -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Mathematics for Machine Learning and Data Science Specialization - Coursera 2 | 3 | ![title-banner](https://github.com/greyhatguy007/Mathematics-for-Machine-Learning-and-Data-Science-Specialization-Coursera/assets/77543865/42742826-89a3-41c4-aa6a-6d0f83b260b6) 4 | 5 | Mathematics for Machine Learning and Data Science Specialization offered by deeplearning.ai on Coursera, instructed by Luis Serrano. 6 | 7 |
8 | 9 |
10 | 11 | ## Course 1 : [Linear Algebra for Machine Learning and Data Science](https://www.coursera.org/learn/machine-learning-linear-algebra) 12 | 13 |
14 | 15 | - [Week 1](/C1/w1/) 16 | - [Ungraded Lab - Introduction To Numpy Arrays](/C1/w1/lab/C1_W1_Lab_1_introduction_to_numpy_arrays.ipynb) 17 | - [Ungraded Lab - Solving Linear Systems : 2 Variables](/C1/w1/lab/C1_W1_Lab_2_solving_linear_systems_2_variables.ipynb) 18 | - [Practice Quiz - Solving Systems of Linear Equations](/C1/w1/pq1/) 19 | - [Graded Quiz - Matrices](/C1/w1/q1/) 20 | - [Lecture Materials](/C1/w1/C1w1notes.pdf) 21 | 22 |
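As a quick, editor-added illustration of the Week 1 topics above (NumPy arrays and 2-variable linear systems) — a minimal sketch, not taken from the course notebooks, assuming NumPy is installed:

```python
import numpy as np

# The system  3x + 2y = 12,  x - y = 1  as a coefficient matrix and a vector
A = np.array([[3.0, 2.0],
              [1.0, -1.0]])
b = np.array([12.0, 1.0])

# Exact solve for a square, non-singular system: x = 2.8, y = 1.8
solution = np.linalg.solve(A, b)
print(solution)
```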
23 | 24 | - [Week 2](/C1/w2/) 25 | - [Ungraded Lab - Solving Linear Systems : 3 Variables](/C1/w2/C1w2_ungraded_lab.ipynb) 26 | - [Graded Quiz - The Rank of a Matrix](/C1/w2/q1/) 27 | - [Programming Assignment - System of Linear Equations](/C1/w2/C1w2_graded_lab/) 28 | - [Lecture Materials](/C1/w2/C1w2notes.pdf) 29 | 30 |
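The Week 2 items above center on the rank of a matrix and 3-variable systems; a minimal sketch of the connection (illustrative only, with made-up numbers):

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 2.0],
              [1.0, 0.0, 0.0]])
b = np.array([4.0, 5.0, 6.0])

# Rank 3 = full rank, so the 3-variable system has exactly one solution
rank = np.linalg.matrix_rank(A)
x = np.linalg.solve(A, b)
print(rank, x)
```

A rank below 3 would mean the rows carry redundant information and `np.linalg.solve` would raise `LinAlgError`.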
31 | 32 | - [Week 3](/C1/w3/) 33 | - [Ungraded Lab - Vector Operations](/C1/w3/lab/C1_W3_Lab_1_vector_operations.ipynb) 34 | - [Ungraded Lab - Matrix Multiplication](/C1/w3/lab/C1_W3_Lab_2_matrix_multiplication.ipynb) 35 | - [Ungraded Lab - Linear Transformations](/C1/w3/lab/C1_W3_Lab_3_linear_transformations.ipynb) 36 | - [Practice Quiz - Vector operations: Sum, difference, multiplication, dot product](/C1/w3/pq1) 37 | - [Graded Quiz - Vector and Matrix Operations, Types of Matrices](/C1/w3/q1/) 38 | - [Programming Assignment - Single Perceptron Neural Networks for Linear Regression](/C1/w3/C1w3_graded_lab/) 39 | - [Lecture Materials](/C1/w3/C1w3notes.pdf) 40 | 41 |
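A tiny sketch of the Week 3 lab topics above — dot products and a matrix acting as a linear transformation (the shear is in the spirit of the lab's shear example, but the numbers here are illustrative):

```python
import numpy as np

v = np.array([1.0, 2.0])
w = np.array([3.0, 4.0])

dot = np.dot(v, w)  # 1*3 + 2*4 = 11.0

# A linear transformation is just a matrix applied to a vector;
# this one shears along the x-axis.
shear = np.array([[1.0, 0.5],
                  [0.0, 1.0]])
sheared = shear @ v  # [1 + 0.5*2, 2] = [2.0, 2.0]
print(dot, sheared)
```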
42 | 43 | - [Week 4](/C1/w4/) 44 | - [Graded Quiz - Eigenvalues and Eigenvectors](/C1/w4/q1/) 45 | - [Programming Assignment - Eigenvalues and Eigenvectors](/C1/w4/C1w4_graded_lab/) 46 | - [Lecture Materials](/C1/w4/C1w4notes.pdf) 47 | 48 |
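For the Week 4 topic above, a minimal eigenvalue/eigenvector check (editor-added sketch, not the assignment's code):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For this symmetric matrix the eigenvalues are 1 and 3 (order may vary)
eigenvalues, eigenvectors = np.linalg.eig(A)

# Every eigenpair satisfies A v = lambda v
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
print(sorted(eigenvalues))
```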
49 | 50 | ### [Certificate Of Completion](https://coursera.org/share/4dcac0c68e690f1947739cc62143dc78) 51 | 52 |
53 | 54 |
55 | 56 | ## Course 2 : [Calculus For Machine Learning and Data Science](https://www.coursera.org/learn/machine-learning-calculus) 57 | 58 |
59 | 60 | - [Week 1](/C2/w1/) 61 | - [Practice Quiz - Derivatives](/C2/w1/pq1/) 62 | - [Ungraded Lab - Differentiation in Python](/C2/w1/C2_W1_Lab_1_differentiation_in_python.ipynb) 63 | - [Graded Quiz - Derivatives and Optimization](/C2/w1/q1/) 64 | - [Programming Assignment - Optimizing Functions of One Variable: Cost Minimization](/C2/w1/C2w1_graded_lab/) 65 | - [Lecture Materials](/C2/w1/C2w1notes.pdf) 66 | 67 |
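The Week 1 lab above differentiates functions in Python; as one minimal, dependency-free way to approximate a derivative numerically (a sketch, not the lab's own approach):

```python
def central_difference(f, x, h=1e-5):
    """Approximate f'(x) with the symmetric difference quotient."""
    return (f(x + h) - f(x - h)) / (2 * h)

# d/dx x^3 = 3x^2, so the derivative at x = 2 should be close to 12
derivative = central_difference(lambda x: x ** 3, 2.0)
print(derivative)
```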
68 | 69 | - [Week 2](/C2/w2/) 70 | - [Practice Quiz - Partial Derivatives and Gradient](/C2/w2/pq1/) 71 | - [Ungraded Lab - Optimization Using Gradient Descent in One Variable](/C2/w2/lab/C2_W2_Lab_1_Optimization_Using_Gradient_Descent_in_One_Variable.ipynb) 72 | - [Ungraded Lab - Optimization Using Gradient Descent in Two Variables](/C2/w2/lab/C2_W2_Lab_2_Optimization_Using_Gradient_Descent_in_Two_Variables.ipynb) 73 | - [Graded Quiz - Partial Derivatives and Gradient Descent](/C2/w2/q1/) 74 | - [Programming Assignment - Optimization Using Gradient Descent: Linear Regression](/C2/w2/C2w2_graded_lab/) 75 | - [Lecture Materials](/C2/w2/C2w2notes.pdf) 76 | 77 |
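The one-variable gradient descent covered in the Week 2 labs above fits in a few lines; a minimal sketch (illustrative, with a hand-picked function and learning rate):

```python
def gradient_descent(dfdx, x0, learning_rate=0.1, num_iterations=100):
    """Repeatedly step against the derivative to approach a minimum."""
    x = x0
    for _ in range(num_iterations):
        x = x - learning_rate * dfdx(x)
    return x

# f(x) = (x - 3)^2 has derivative 2(x - 3) and its minimum at x = 3
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)
```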
78 | 79 | - [Week 3](/C2/w3/) 80 | - [Practice Quiz - Optimization in Neural Networks](/C2/w3/pq1/) 81 | - [Ungraded Lab - Regression with Perceptron](/C2/w3/lab/C2_W3_Lab_1_Regression_with_Perceptron.ipynb) 82 | - [Ungraded Lab - Classification with Perceptron](/C2/w3/lab/C2_W3_Lab_2_Classification_with_Perceptron.ipynb) 83 | - [Ungraded Lab - Optimization Using Newtons Method](/C2/w3/lab/C2_W3_Lab_3_Optimization_Using_Newtons_Method.ipynb) 84 | - [Graded Quiz - Optimization in Neural Networks and Newton's Method](/C2/w3/q1/) 85 | - [Programming Assignment - Neural Network with Two Layers](/C2/w3/C2w3_graded_lab/) 86 | - [Lecture Materials](/C2/w3/C2w3notes.pdf) 87 | 88 |
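Newton's method for optimization, as named in the Week 3 lab above, replaces the fixed learning rate with the second derivative; a minimal sketch (example function chosen by the editor):

```python
def newton_optimize(dfdx, d2fdx2, x0, num_iterations=20):
    """Newton's method for optimization: x <- x - f'(x) / f''(x)."""
    x = x0
    for _ in range(num_iterations):
        x = x - dfdx(x) / d2fdx2(x)
    return x

# f(x) = x^4 - 4x: f'(x) = 4x^3 - 4, f''(x) = 12x^2, minimum at x = 1
x_min = newton_optimize(lambda x: 4 * x ** 3 - 4, lambda x: 12 * x ** 2, x0=2.0)
print(x_min)
```

Convergence is quadratic near the minimum, which is why a handful of iterations suffices here where plain gradient descent would need many more.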
89 | 90 | ### [Certificate Of Completion](https://coursera.org/share/5fa3a336a4fdfcb89879b8b828f8abbe) 91 | 92 |
93 | 94 |
95 | 96 | ## Course 3 : [Probability & Statistics for Machine Learning & Data Science](https://www.coursera.org/learn/machine-learning-probability-and-statistics) 97 | 98 |
99 | 100 | - [Week 1](/C3/w1/) 101 | - [Ungraded Lab - Birthday Problems](/C3/w1/lab/C3_W1_Lab_2_Birthday_Problems.ipynb) 102 | - [Practice Quiz](/C3/w1/pq1/) 103 | - [Graded Quiz - Summative Quiz](/C3/w1/q1/) 104 | - [Programming Assignment: Probability Distributions / Naive Bayes](/C3/w1/C3w1_graded_lab/) 105 | 106 |
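A small, self-contained sketch of the probability-distribution material in Week 1 above — the binomial PMF from first principles (illustrative; the assignment itself covers more):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Exactly 2 heads in 4 fair flips: C(4,2) / 2^4 = 6/16 = 0.375
prob = binomial_pmf(2, 4, 0.5)
print(prob)
```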
107 | 108 | - [Week 2](/C3/w2/) 109 | - [Practice Quiz](/C3/w2/pq1/) 110 | - [Graded Quiz - Summative Quiz](/C3/w2/q1/) 111 | - [Optional Lab - Summary statistics and visualization of data sets](/C3/w2/lab/ugl_datasets.ipynb) 112 | - [Optional Lab - Dice Simulations](/C3/w2/lab/C3_W2_Lab_2_Dice_Simulations.ipynb) 113 | - [Programming Assignment: Loaded Dice](/C3/w2/C3w2_graded_lab/) 114 | 115 |
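The Week 2 items above simulate dice, including loaded ones; a minimal sketch of the idea (the 3x weighting on face 6 is an arbitrary choice for illustration, not the assignment's setup):

```python
import random

random.seed(0)  # reproducible rolls

# A loaded die: face 6 is three times as likely as each other face
faces = [1, 2, 3, 4, 5, 6]
weights = [1, 1, 1, 1, 1, 3]
rolls = random.choices(faces, weights=weights, k=100_000)

expected_mean = sum(f * w for f, w in zip(faces, weights)) / sum(weights)  # 33/8 = 4.125
empirical_mean = sum(rolls) / len(rolls)
print(expected_mean, empirical_mean)  # the two should be close
```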
116 | 117 | - [Week 3](/C3/w3/) 118 | - [Optional Lab - Sampling data from different distributions and studying the distribution of the sample mean](/C3/w3/lab/) 119 | - [Practice Quiz](/C3/w3/pq1/) 120 | - [Graded Quiz - Summative Quiz](/C3/w3/q1/) 121 | 122 |
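The Week 3 lab above demonstrates the Central Limit Theorem interactively (see `/C3/w3/lab/utils.py`); the core numerical claim can be sketched without the widgets — a minimal, editor-added version with an arbitrary exponential population:

```python
import numpy as np

rng = np.random.default_rng(0)

# A decidedly non-Gaussian population: exponential with sigma = 1
population = rng.exponential(scale=1.0, size=100_000)

# 10,000 samples of size 30, in the spirit of the lab's sample_means helper
sample_size = 30
means = rng.choice(population, size=(10_000, sample_size)).mean(axis=1)

# CLT prediction: the std of the sample means approaches sigma / sqrt(n)
predicted = population.std() / np.sqrt(sample_size)
print(means.std(), predicted)  # the two should be close
```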
123 | 124 | - [Week 4](/C3/w4/) 125 | - [Practice Quiz](/C3/w4/pq1/) 126 | - [Graded Quiz - Summative Quiz](/C3/w4/q1/) 127 | - [Programming Assignment: A/B Testing](/C3/w4/C3w4_graded_lab/) 128 | 129 |
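The A/B-testing assignment above builds a two-proportion z-test (its dashboard in `/C3/w4/C3w4_graded_lab/utils.py` defaults to 576/4632 conversions for A and 718/4728 for B); a minimal stdlib-only sketch of that test — not the assignment's own implementation:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-statistic for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# The dashboard's default counts: 576/4632 conversions (A) vs 718/4728 (B)
z = two_proportion_z(576, 4632, 718, 4728)
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(z, p_value)  # |z| > 1.96, so H0 is rejected at alpha = 0.05
```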
130 | 131 | ### [Certificate Of Completion](https://coursera.org/share/10ba65d22dca9278c5119d7511bcec0b) 132 | 133 |
134 | 135 |
136 | 137 | ## [Specialization Certificate](https://coursera.org/share/ea6107e80f98b4d1f05b9263413f39c6) 138 | --------------------------------------------------------------------------------