├── Machine learning LS Intro (3).pdf
├── README.md
├── WEEK 4
│   ├── PROJECT1.md
│   ├── PROJECT2.md
│   └── Readme.md
├── Week 1
│   ├── Assignment
│   │   ├── Linear Regression Assignment
│   │   │   ├── Problem Statement
│   │   │   ├── Test data.xlsx
│   │   │   └── Training data.xlsx
│   │   ├── Logistic Regression
│   │   │   ├── Probem Statement
│   │   │   └── data.txt
│   │   └── Python Modules Assigment
│   │       └── Problem Statement
│   ├── Classifiers (OPTIONAL)
│   │   └── README.md
│   ├── Linear Regression
│   │   └── README.md
│   ├── Logistic Regression
│   │   └── README.md
│   ├── Python Modules
│   │   └── README.md
│   └── README.md
├── Week 2
│   ├── Assignment
│   │   ├── Classifiers Assignment
│   │   │   ├── README.md
│   │   │   └── utils.zip
│   │   ├── Neural Network Assignment
│   │   │   ├── README.md
│   │   │   └── homer_bart.zip
│   │   └── YOLOv8 Assignment
│   │       └── README.md
│   ├── Classifiers
│   │   └── README.md
│   ├── Framework + NN
│   │   └── README.md
│   ├── README.md
│   └── YOLOv8
│       └── README.md
└── Week 3
    ├── Assignment
    │   ├── CNN
    │   │   ├── README.md
    │   │   └── data
    │   └── OpenCV
    │       ├── ImagePreprocessing-OpenCV.ipynb
    │       ├── README.md
    │       └── images.zip
    ├── CNN
    │   └── README.md
    ├── Image Preprocessing - OpenCV
    │   └── README.md
    └── README.md
/Machine learning LS Intro (3).pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wncc/Machine-Learning-LS-24/1f8ed556eec2d37960867d4b6e86e07886071f7e/Machine learning LS Intro (3).pdf
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Machine-Learning-LS-24
2 |
 3 | ### Web and Coding Club welcomes you to the Machine Learning Course, where we aim to dive deep into the world of Deep Learning in Computer Vision. We hope you learned a lot through the last 3 weeks. Now we reach the concluding week, where we shall build the final project. Please visit [WEEK 4](https://github.com/wncc/Machine-Learning-LS-24/tree/main/WEEK%204)
4 |
5 | ### For detailed information about the course, check out- [Introduction to ML Course](./Machine%20learning%20LS%20Intro%20(3).pdf)
6 |
 7 | ## For any further queries, feel free to reach out to our course moderators-
8 |
9 | * Veeraditya: 9449007525
10 | * Samarth: 8368657754
11 | * Lopamudra: 7735361219
12 | * Tushar: 8974359817
13 | * Shahu: 9146050850
14 |
--------------------------------------------------------------------------------
/WEEK 4/PROJECT1.md:
--------------------------------------------------------------------------------
1 | ### PROJECT 1
 2 | > In this project you will learn about facial recognition, which introduces a few new concepts beyond the topics of the previous weeks, and implement it.
3 | ***
4 | ### FACIAL RECOGNITION
 5 | >You have to go through the given [playlist](https://youtube.com/playlist?list=PLgNJO2hghbmhHuhURAGbe6KWpiYZt0AMH&si=6b0qNPymO5H6a-FD) and record a video of a model that identifies whether two images are the same or different.
6 |
 7 | > The base image will be your own image, and you may use any other image that does not match the base image.
 8 |
 9 | > You will learn about a new type of network called a Siamese Neural Network.
10 | ***
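Once a Siamese network has produced an embedding vector for each image, verification reduces to comparing distances between embeddings. A minimal sketch of that final step, assuming you already have embeddings from a trained network (the `verify` helper, the example vectors, and the 0.5 threshold are all hypothetical — in practice the threshold is tuned on validation pairs):

```python
import numpy as np

def verify(embedding_a, embedding_b, threshold=0.5):
    """Declare a match if two image embeddings are close enough."""
    # L1 distance between embedding vectors, a common choice for Siamese networks
    distance = np.mean(np.abs(np.asarray(embedding_a) - np.asarray(embedding_b)))
    return bool(distance < threshold)

# Identical embeddings -> zero distance -> match
same = verify([0.1, 0.9, 0.3], [0.1, 0.9, 0.3])        # True
# Very different embeddings -> large distance -> no match
different = verify([0.1, 0.9, 0.3], [0.9, 0.1, 0.8])   # False
```

The network itself is trained so that embeddings of the same person end up close together and embeddings of different people end up far apart; this comparison is the easy part.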
11 |
12 | ### SUBMISSION MODE
13 |
14 | > Share the link of a GitHub repo where the project's .ipynb file is uploaded, along with a video showing that the base image does not match the other image but does match a different orientation of the base image.
15 | ***
16 |
17 | ### ALL THE BEST !!!
--------------------------------------------------------------------------------
/WEEK 4/PROJECT2.md:
--------------------------------------------------------------------------------
1 | ### PROJECT 2
 2 | > This project aims to develop a benchmark for long-tailed, multi-class medical image classification. We will use the HAM10000 dataset, which was acquired with a variety of dermatoscope types; each image is categorized into one of seven possible disease categories.
3 |
4 |
5 | ### INSTRUCTIONS
 6 | > Go through the [instruction doc](https://docs.google.com/document/d/1xpSxyq8pRyjPRciQb3tOjV24K7pqniHY/edit?usp=sharing&ouid=102124429604486730187&rtpof=true&sd=true) to get a better understanding of the problem statement and the datasets.
 7 | > Tip: Explore various OpenCV techniques to preprocess the images well before using any model.
8 |
 9 | ### TYPES OF METRICS
10 | >For scientific completeness, predicted responses will also have the following metrics computed (comparing prediction vs. ground truth) for each image:
11 |
12 | * [Accuracy](https://en.wikipedia.org/wiki/Accuracy_and_precision#In_binary_classification)
13 | * [Area under the receiver operating characteristic curve (AUC)](https://en.wikipedia.org/wiki/Receiver_operating_characteristic#Area_under_the_curve)
14 | * [Mean average precision](http://fastml.com/what-you-wanted-to-know-about-mean-average-precision/)
15 | * [F1 score](https://en.wikipedia.org/wiki/F1_score)
16 |
17 | ### SUBMISSION MODE
18 | > Submit a GitHub repo containing the .ipynb file, along with the normalized multi-class accuracy achieved.
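The normalized multi-class accuracy asked for above is commonly computed as the mean per-class recall (balanced accuracy), so each of the seven classes counts equally despite the long-tailed class distribution. A minimal numpy sketch, assuming integer class labels (the function name is our own):

```python
import numpy as np

def normalized_accuracy(y_true, y_pred):
    """Mean per-class recall: every class counts equally, no matter
    how many images it has (important for long-tailed data)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    # Fraction of each class's images that were predicted correctly
    recalls = [np.mean(y_pred[y_true == c] == c) for c in np.unique(y_true)]
    return float(np.mean(recalls))

# Class 0: 2/2 correct, class 1: 1/2 correct -> (1.0 + 0.5) / 2 = 0.75
score = normalized_accuracy([0, 0, 1, 1], [0, 0, 1, 0])
```

Note how plain accuracy on this example would be 3/4 as well, but on heavily imbalanced data the two metrics diverge sharply: a model that always predicts the majority class can score high plain accuracy yet low normalized accuracy.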
19 |
--------------------------------------------------------------------------------
/WEEK 4/Readme.md:
--------------------------------------------------------------------------------
1 | ### WELCOME TO WEEK 4
2 |
 3 | This week we will apply everything we have learned so far and complete the final project.
4 |
 5 | ### Instructions
 6 | Two projects are provided:
 7 | 1) [FACIAL RECOGNITION USING SIAMESE NETWORKS](https://github.com/wncc/Machine-Learning-LS-24/blob/main/WEEK%204/PROJECT1.md)
 8 | 2) [MULTI-CLASS IMAGE CLASSIFICATION ON A MEDICAL DATASET](https://github.com/wncc/Machine-Learning-LS-24/blob/main/WEEK%204/PROJECT2.md)
 9 |
10 | You may complete either of the two.
11 | Deadline: 11:59 pm, 31st July 2024
12 |
13 | ## HAPPY LEARNING AND ALL THE BEST !!!
14 |
--------------------------------------------------------------------------------
/Week 1/Assignment/ Linear Regression Assignment/Problem Statement:
--------------------------------------------------------------------------------
1 | https://colab.research.google.com/drive/1HFNcaz7YkmkqBsCUZfb_fyR1ijenzAFU?usp=sharing
2 |
--------------------------------------------------------------------------------
/Week 1/Assignment/ Linear Regression Assignment/Test data.xlsx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wncc/Machine-Learning-LS-24/1f8ed556eec2d37960867d4b6e86e07886071f7e/Week 1/Assignment/ Linear Regression Assignment/Test data.xlsx
--------------------------------------------------------------------------------
/Week 1/Assignment/ Linear Regression Assignment/Training data.xlsx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wncc/Machine-Learning-LS-24/1f8ed556eec2d37960867d4b6e86e07886071f7e/Week 1/Assignment/ Linear Regression Assignment/Training data.xlsx
--------------------------------------------------------------------------------
/Week 1/Assignment/Logistic Regression/Probem Statement:
--------------------------------------------------------------------------------
1 | https://colab.research.google.com/drive/1oRnVWpXmK5JDKIHOOJm5bjwy8d9v6pRb#scrollTo=BqlxZOXoHh4z
2 |
--------------------------------------------------------------------------------
/Week 1/Assignment/Logistic Regression/data.txt:
--------------------------------------------------------------------------------
1 | 0.051267,0.69956,1
2 | -0.092742,0.68494,1
3 | -0.21371,0.69225,1
4 | -0.375,0.50219,1
5 | -0.51325,0.46564,1
6 | -0.52477,0.2098,1
7 | -0.39804,0.034357,1
8 | -0.30588,-0.19225,1
9 | 0.016705,-0.40424,1
10 | 0.13191,-0.51389,1
11 | 0.38537,-0.56506,1
12 | 0.52938,-0.5212,1
13 | 0.63882,-0.24342,1
14 | 0.73675,-0.18494,1
15 | 0.54666,0.48757,1
16 | 0.322,0.5826,1
17 | 0.16647,0.53874,1
18 | -0.046659,0.81652,1
19 | -0.17339,0.69956,1
20 | -0.47869,0.63377,1
21 | -0.60541,0.59722,1
22 | -0.62846,0.33406,1
23 | -0.59389,0.005117,1
24 | -0.42108,-0.27266,1
25 | -0.11578,-0.39693,1
26 | 0.20104,-0.60161,1
27 | 0.46601,-0.53582,1
28 | 0.67339,-0.53582,1
29 | -0.13882,0.54605,1
30 | -0.29435,0.77997,1
31 | -0.26555,0.96272,1
32 | -0.16187,0.8019,1
33 | -0.17339,0.64839,1
34 | -0.28283,0.47295,1
35 | -0.36348,0.31213,1
36 | -0.30012,0.027047,1
37 | -0.23675,-0.21418,1
38 | -0.06394,-0.18494,1
39 | 0.062788,-0.16301,1
40 | 0.22984,-0.41155,1
41 | 0.2932,-0.2288,1
42 | 0.48329,-0.18494,1
43 | 0.64459,-0.14108,1
44 | 0.46025,0.012427,1
45 | 0.6273,0.15863,1
46 | 0.57546,0.26827,1
47 | 0.72523,0.44371,1
48 | 0.22408,0.52412,1
49 | 0.44297,0.67032,1
50 | 0.322,0.69225,1
51 | 0.13767,0.57529,1
52 | -0.0063364,0.39985,1
53 | -0.092742,0.55336,1
54 | -0.20795,0.35599,1
55 | -0.20795,0.17325,1
56 | -0.43836,0.21711,1
57 | -0.21947,-0.016813,1
58 | -0.13882,-0.27266,1
59 | 0.18376,0.93348,0
60 | 0.22408,0.77997,0
61 | 0.29896,0.61915,0
62 | 0.50634,0.75804,0
63 | 0.61578,0.7288,0
64 | 0.60426,0.59722,0
65 | 0.76555,0.50219,0
66 | 0.92684,0.3633,0
67 | 0.82316,0.27558,0
68 | 0.96141,0.085526,0
69 | 0.93836,0.012427,0
70 | 0.86348,-0.082602,0
71 | 0.89804,-0.20687,0
72 | 0.85196,-0.36769,0
73 | 0.82892,-0.5212,0
74 | 0.79435,-0.55775,0
75 | 0.59274,-0.7405,0
76 | 0.51786,-0.5943,0
77 | 0.46601,-0.41886,0
78 | 0.35081,-0.57968,0
79 | 0.28744,-0.76974,0
80 | 0.085829,-0.75512,0
81 | 0.14919,-0.57968,0
82 | -0.13306,-0.4481,0
83 | -0.40956,-0.41155,0
84 | -0.39228,-0.25804,0
85 | -0.74366,-0.25804,0
86 | -0.69758,0.041667,0
87 | -0.75518,0.2902,0
88 | -0.69758,0.68494,0
89 | -0.4038,0.70687,0
90 | -0.38076,0.91886,0
91 | -0.50749,0.90424,0
92 | -0.54781,0.70687,0
93 | 0.10311,0.77997,0
94 | 0.057028,0.91886,0
95 | -0.10426,0.99196,0
96 | -0.081221,1.1089,0
97 | 0.28744,1.087,0
98 | 0.39689,0.82383,0
99 | 0.63882,0.88962,0
100 | 0.82316,0.66301,0
101 | 0.67339,0.64108,0
102 | 1.0709,0.10015,0
103 | -0.046659,-0.57968,0
104 | -0.23675,-0.63816,0
105 | -0.15035,-0.36769,0
106 | -0.49021,-0.3019,0
107 | -0.46717,-0.13377,0
108 | -0.28859,-0.060673,0
109 | -0.61118,-0.067982,0
110 | -0.66302,-0.21418,0
111 | -0.59965,-0.41886,0
112 | -0.72638,-0.082602,0
113 | -0.83007,0.31213,0
114 | -0.72062,0.53874,0
115 | -0.59389,0.49488,0
116 | -0.48445,0.99927,0
117 | -0.0063364,0.99927,0
118 | 0.63265,-0.030612,0
119 |
--------------------------------------------------------------------------------
/Week 1/Assignment/Python Modules Assigment/Problem Statement:
--------------------------------------------------------------------------------
1 | https://colab.research.google.com/drive/10Mj-ju7t82lhKXkpO5gr9mwxzJ8MtrHV?usp=sharing
2 |
--------------------------------------------------------------------------------
/Week 1/Classifiers (OPTIONAL)/README.md:
--------------------------------------------------------------------------------
1 | # CLASSIFIERS
2 | In this course, you'll learn how to use classifiers, powerful tools for data science and machine learning. You'll explore various classification algorithms, including logistic regression, decision trees, random forests, and support vector machines. By the end, you'll be well-equipped to handle various data science and machine learning projects involving classification tasks.
3 | ## Regression:
4 | >[!Note]
5 | >You can *skip* the regression part if done before
6 |
7 | Regression is a key technique in data science and machine learning for modeling relationships between variables. You'll learn logistic regression for binary classification and softmax regression for multi-class classification. By mastering these methods, you'll enhance your ability to analyze and interpret data effectively.
8 |
9 | [Logistic Regression](https://youtu.be/zM4VZR0px8E?si=1GkpCaVPTQMuHi6D) (till 16:00 at 2x speed)
10 | [Softmax Regression](https://youtu.be/J5bXOOmkopc?si=F54C3YoWqGoL-mG-) (till 11:21 at 2x speed)
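As a quick reference for the softmax video: the function simply turns a vector of raw class scores into probabilities that sum to 1. A minimal numpy sketch (subtracting the max score first is a standard numerical-stability trick and does not change the result):

```python
import numpy as np

def softmax(scores):
    """Convert raw class scores into probabilities that sum to 1."""
    # Subtract the max score for numerical stability before exponentiating
    exp = np.exp(scores - np.max(scores))
    return exp / exp.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
# The highest raw score gets the highest probability
predicted_class = int(np.argmax(probs))
```

Softmax regression is just logistic regression generalized this way: one score per class, with the sigmoid replaced by softmax.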
11 | ## SVM:
12 | Support Vector Machines (SVMs) are a powerful tool in data science for classification tasks. You'll delve into the theory behind SVMs, understanding how they optimize decision boundaries to maximize the margin and handle complex data distributions. Optional study includes polynomial kernels for non-linear data transformations. Through practical applications, you'll master SVMs for robust data analysis and prediction.
13 |
14 | [Basic Intuition](https://youtu.be/H9yACitf-KM?si=JCrZzpSIFY0aaV0u) (at 1.5x speed)
15 | [Maths and Theory](https://youtu.be/Js3GLb1xPhc?si=AMAxwVn-cK6nKIL1) (at 1.5x speed)
16 | [Application](https://youtu.be/FB5EdxAGxQg?si=AiNDpo0t0f3zjQbP) (at 2x speed)
17 | [Polynomial Kernels SVM](https://youtu.be/8bFKyb77vp0?si=c7gLv7to7dy-NeCF) (Optional)
18 | [Polynomial Kernels Application](https://youtu.be/dl_ZsuHSIFE?si=vPj0FCbwNFBnqaY9) (Optional)
19 | ## Decision Trees & Random Forest:
20 | Decision Trees and Random Forests are fundamental and often first-chosen techniques in data science for classification tasks. You'll learn how Decision Trees partition data based on features to make predictions, and how Random Forests aggregate multiple trees for improved accuracy and robustness. Through practical applications, you'll gain proficiency in building, evaluating, and optimizing these models for various data science tasks.
21 |
22 | [Decision Tree in ML](https://youtu.be/RmajweUFKvM?si=kMkMDKvsUNn9GtJc&t=393) (6:32 to 14:31 at 1.5x speed)
23 | [Random Forest in ML](https://youtu.be/eM4uJ6XGnSM?si=U9eHNFiLz-TtABwv&t=858) (14:18 to 17:18 at 1.5x speed)
24 | [Random Forest Classifier and Regressor](https://youtu.be/nxFG5xdpDto?si=3ACkJrx7H-McFssa&t=272) (4:30 to end at 1.5x speed)
25 | [Application](https://youtu.be/ok2s1vV9XW0?si=Imgl-oYGEcBrIvGu&t=154) (2:43 to 10:36 at 2x speed)
26 | ## KNN:
27 | K-nearest neighbors (KNN) is a non-parametric classification and regression method. It predicts new data points based on their similarity to existing data points, using a majority vote (for classification) or averaging (for regression) among its nearest neighbors in the feature space.
28 |
29 | [KNN in ML](https://youtu.be/CQveSaMyEwM?si=-DrgOoalYD89MaSD) (Optional)
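The majority-vote idea above fits in a few lines of numpy; a minimal sketch (the toy data and the choice of `k` here are arbitrary, for illustration only):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest training points."""
    distances = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to every training point
    nearest = y_train[np.argsort(distances)[:k]]      # labels of the k closest points
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]                  # majority label

# Two tiny clusters: class 0 near the origin, class 1 near (1, 1)
X = np.array([[0.0, 0.0], [0.1, 0.1], [0.9, 1.0], [1.0, 0.9]])
y = np.array([0, 0, 1, 1])
label = knn_predict(X, y, np.array([0.95, 0.95]), k=3)   # lands in the class-1 cluster
```

Note that KNN has no training phase at all — it stores the data and defers all work to prediction time, which is why it is called a "lazy" learner.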
30 | ## Naive Bayes:
31 | Naive Bayes is a probabilistic classifier based on Bayes' theorem. It assumes that features are independent of each other, hence "naive," which simplifies computation and training. It's efficient with minimal training data, making it ideal for tasks like text classification and spam filtering, where it often yields effective results despite its simplicity. Despite this simplification, Naive Bayes classifiers often perform well in practice, especially with high-dimensional data and large datasets.
32 |
33 | [Intuition](https://youtu.be/jS1CKhALUBQ?si=8LC2Qn9oFexke3aM) (at 1.5x speed)
34 | [Naive Bayes on Text Data](https://youtu.be/temQ8mHpe3k?si=u_I8AyPMoRho6KHV) (at 1.5x speed)
35 | [Application](https://youtu.be/nHIUYwN-5rM?si=bHFlE9KX23WmaZH-) (at 2x speed)
36 | >[!Warning]
>NLP is done with advanced Deep Learning algorithms
38 | ## K-means Clustering:
39 | K-means clustering is a partitioning algorithm that aims to divide a dataset into K clusters by iteratively assigning each data point to the cluster with the nearest centroid and recalculating centroids until convergence. It's widely used for data segmentation and pattern recognition tasks.
40 |
41 | [K-Means in ML](https://youtu.be/EItlUEPCIzM?si=f9Or7ZgnpzlJmudt) (Optional)
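The assign-then-recompute loop described above can be sketched as follows (toy data; a real implementation would also check for convergence and handle clusters that lose all their points):

```python
import numpy as np

def kmeans(X, k=2, iters=10, seed=0):
    """Alternate between assigning points to the nearest centroid
    and moving each centroid to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # random initial centroids
    for _ in range(iters):
        # Distance of every point to every centroid, then nearest-centroid assignment
        labels = np.argmin(np.linalg.norm(X[:, None] - centroids[None], axis=2), axis=1)
        centroids = np.array([X[labels == c].mean(axis=0) for c in range(k)])
    return labels, centroids

# Two well-separated pairs of points should end up in two clusters
X = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.2, 4.9]])
labels, centroids = kmeans(X, k=2)
```

Because the result depends on the random initial centroids, K-means is usually run several times with different seeds and the clustering with the lowest total within-cluster distance is kept.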
42 |
43 |
--------------------------------------------------------------------------------
/Week 1/Linear Regression/README.md:
--------------------------------------------------------------------------------
After learning about the important libraries, you are now ready to start your machine learning journey.
2 |
3 |
4 | ## 1. Supervised vs. Unsupervised Machine Learning
5 | ***
6 |
7 |
8 | Each type of machine learning model serves different purposes and is chosen based on the nature of the problem and the type of data available. By understanding these basic categories, we can better select and apply the appropriate ML techniques to real-world problems.
9 |
10 |
11 | Recommended to watch at 2X
12 | + [What is Machine Learning?](https://youtu.be/XtlwSmJfUs4?si=5n6aB220kPeJYlQN)
13 | + [Supervised Learning 1](https://youtu.be/sca5rQ9x1cA?si=ex3vuimTkJIAroZe)
14 | + [Supervised Learning 2](https://youtu.be/hh6gE0LxfO8?si=s3Ni3qN_CUY87mmI)
15 | + [Unsupervised Learning 1](https://youtu.be/gG_wI_uGfIE?si=2QOIlKa1OM_phnEV)
16 | + [Unsupervised Learning 2](https://youtu.be/_0bhZBqtCCs?si=2wK_W4aubLNXZh0p)
17 |
18 |
19 | You are required to install Jupyter Notebook and learn its basics, as we are going to use it throughout the course.
20 |
21 | + [Installation](https://jupyter.org/install)
22 | + [Jupyter Notebook](https://youtu.be/IMrxB8Mq5KU?si=kZGc7BZKFaQfvnvU&t=82) (watch it till 6:00)
23 |
24 |
25 | Note: No need to go in depth into Jupyter Notebook, as you only need its basics and
26 | will learn more throughout the course. *Don't forget to try some code in your notebook!*
27 |
28 | ## 2. Regression Model
29 | ***
30 |
31 | In this module you will learn about linear regression. It may seem quite simple, but you are advised to listen carefully and practice what you are taught. Give time to understanding the optional labs mentioned in the videos.
32 |
33 | + [Linear Regression 1](https://youtu.be/dLc-lfEEYss?si=Ptr0SgDIrZPLwYW8)
34 | + [Linear Regression 2](https://youtu.be/KWULpBYzIYk?si=6wAylXMUiuN-iPYp)
35 | + [Optional Lab : Model Representation](https://github.com/greyhatguy007/Machine-Learning-Specialization-Coursera/blob/main/C1%20-%20Supervised%20Machine%20Learning%20-%20Regression%20and%20Classification/week1/Optional%20Labs/C1_W1_Lab03_Model_Representation_Soln.ipynb)
36 | + [Cost Function 1](https://youtu.be/CFN5zHzEuGY?si=pvA5xLMfW_oVsApL)
37 | + [Cost Function 2](https://youtu.be/peNRqkfukYY?si=xolqXp-fwkLDqY9n)
38 | + [Cost Function 3](https://youtu.be/bFNz2u0hl9E?si=76e_BT57XE_hvA74)
39 | + [Cost Function 4](https://youtu.be/L5INhX5cbWU?si=F_ivtQmiqeNU3bnv)
40 | + [Optional Lab : Cost Function](https://github.com/greyhatguy007/Machine-Learning-Specialization-Coursera/blob/main/C1%20-%20Supervised%20Machine%20Learning%20-%20Regression%20and%20Classification/week1/Optional%20Labs/C1_W1_Lab04_Cost_function_Soln.ipynb)
41 |
42 | ## 3. Training with Gradient Descent
43 | ***
44 |
45 |
46 | In this module, we are going to provide an overview of gradient descent, a fundamental optimization algorithm widely used in machine learning to minimize functions and improve model performance.
47 |
48 |
49 | + [Gradient Descent 1](https://youtu.be/WtlvKq_zxPI?si=EPKHXj529fwJd-GU)
50 | + [Gradient Descent 2](https://youtu.be/w_2vCijLiiM?si=L_HR-2NN1GpnnBrB)
51 | + [Gradient Descent 3](https://youtu.be/PKm61nrqpCA?si=2-lZfYDIb9WOI0EP)
52 | + [Learning Rate](https://youtu.be/k0h8emRAAHE?si=2HEtFo4n03NV6lSk)
53 | + [Gradient Descent for LR](https://youtu.be/RGL_XUjPkGo?si=HF0sB7aXBbOsji8I)
54 | + [Running Gradient Descent](https://youtu.be/tHDDbqYfflM?si=5iXFZTfpicf2wwuM)
55 | + [Optional Lab : Gradient Descent](https://github.com/greyhatguy007/Machine-Learning-Specialization-Coursera/blob/main/C1%20-%20Supervised%20Machine%20Learning%20-%20Regression%20and%20Classification/week1/Optional%20Labs/C1_W1_Lab05_Gradient_Descent_Soln.ipynb) (Take this one seriously!)
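The update rule from these videos can be sketched in a few lines of numpy for single-feature linear regression (an illustrative version — the learning rate and iteration count here are arbitrary choices, and real data would need the convergence checks covered later):

```python
import numpy as np

def gradient_descent(x, y, alpha=0.1, iters=1000):
    """Fit y ≈ w*x + b by repeatedly stepping opposite the cost gradient."""
    w, b = 0.0, 0.0
    m = len(x)
    for _ in range(iters):
        error = (w * x + b) - y            # prediction error for every point
        w -= alpha * (error @ x) / m       # dJ/dw = (1/m) * sum(error_i * x_i)
        b -= alpha * error.sum() / m       # dJ/db = (1/m) * sum(error_i)
    return w, b

# Points lying exactly on y = 2x + 1 should recover w ≈ 2, b ≈ 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 * x + 1
w, b = gradient_descent(x, y)
```

Try lowering `alpha` and watching convergence slow down, or raising it until the updates diverge — the same behaviour the learning-rate video demonstrates.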
56 |
57 | Now you are ready to build a linear regression model. Let us try it on some
58 | sets of points you are all familiar with.
59 |
60 | Problem Statement : Plot a best fit line for the following set of points.
61 | 1. [(1,1) (2,2) (3,1)] Can you tell from which special point the line passes?
62 | 2. [(1,1) (1,2) (2,2) (2,1)]
63 | 3. [(1,1) (2,1.5) (2,0.5) (3,1)]
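After plotting your lines by hand, you can check your answers against `np.polyfit`, which returns the least-squares slope and intercept. For example, for the first set of points:

```python
import numpy as np

# Least-squares best-fit line for set 1: [(1,1), (2,2), (3,1)]
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 1.0])
slope, intercept = np.polyfit(x, y, deg=1)   # coefficients of the degree-1 fit
```

Compare the fitted line with the points — and with the other two sets — before looking for the special point the question hints at.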
64 |
65 | ## 4. Multiple Linear Regression
66 | ***
67 |
68 |
69 | In this module, we are going to provide an overview of multiple linear regression, a statistical technique used to model the relationship between one dependent variable and multiple independent variables.
70 |
71 |
72 | We would recommend attempting the problems solved in the optional labs of this section before looking at their methods.
73 |
74 | + [Multiple Features](https://youtu.be/jXg0vU0y1ak?si=TX3xr55v3TNSXF4_)
75 | + [Vectorisation 1](https://youtu.be/U6zuBcmLxSg?si=j23VaM2TRPX4NC_8)
76 | + [Vectorisation 2](https://youtu.be/uvTL1N02f04?si=RliYfKn2G69IVSZ_)
77 | + [Optional Lab : Vectorisation](https://github.com/greyhatguy007/Machine-Learning-Specialization-Coursera/blob/main/C1%20-%20Supervised%20Machine%20Learning%20-%20Regression%20and%20Classification/week2/Optional%20Labs/C1_W2_Lab01_Python_Numpy_Vectorization_Soln.ipynb) (Go through it once for revision)
78 | + [Gradient Descent for MR](https://youtu.be/YjpCQof9tI8?si=CDWQydaNV1DpGb1C)
79 | + [Optional Lab : MR](https://github.com/greyhatguy007/Machine-Learning-Specialization-Coursera/blob/main/C1%20-%20Supervised%20Machine%20Learning%20-%20Regression%20and%20Classification/week2/Optional%20Labs/C1_W2_Lab02_Multiple_Variable_Soln.ipynb) (Try to solve the problem statement first)
80 | + [Feature Scaling 1](https://youtu.be/YVtP5UGdgXg?si=0__uhJjcL4eduQdZ)
81 | + [Feature Scaling 2](https://youtu.be/gmJqLGrUscg?si=1KhP7-uxC4qLZAFn)
82 | + [Checking Gradient for Convergence](https://youtu.be/5g4H5_gsTpU?si=PzegloLpm9yeCyh7)
83 | + [Choosing the Learning Rate](https://youtu.be/P_9hNBVRldM?si=WSbgWxmNj3M-UrqN)
84 | + [Optional Lab : Feature Scaling](https://github.com/greyhatguy007/Machine-Learning-Specialization-Coursera/blob/main/C1%20-%20Supervised%20Machine%20Learning%20-%20Regression%20and%20Classification/week2/Optional%20Labs/C1_W2_Lab03_Feature_Scaling_and_Learning_Rate_Soln.ipynb) (Apply feature scaling first in the model)
85 | + [Feature Engineering](https://youtu.be/ecOdZlY9jsQ?si=2VAGLer97_NSYir2)
86 | + [Polynomial Regression](https://youtu.be/IFkRKJ5iBDE?si=EkfR0KKOGd7FBG20)
87 | + [Optional Lab : Feature Engineering](https://github.com/greyhatguy007/Machine-Learning-Specialization-Coursera/blob/main/C1%20-%20Supervised%20Machine%20Learning%20-%20Regression%20and%20Classification/week2/Optional%20Labs/C1_W2_Lab04_FeatEng_PolyReg_Soln.ipynb)
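As a companion to the feature-scaling videos, z-score normalisation is one common choice; a minimal sketch (the house-size/bedroom numbers are illustrative — the key point is to reuse the training mean and standard deviation when scaling test data):

```python
import numpy as np

def zscore_scale(X):
    """Z-score normalisation: each feature gets mean 0 and standard deviation 1."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    # Return mu and sigma so new/test data can be scaled the same way
    return (X - mu) / sigma, mu, sigma

# Features on very different scales (e.g. house size vs number of bedrooms)
X = np.array([[2104.0, 3.0], [1416.0, 2.0], [852.0, 1.0]])
X_scaled, mu, sigma = zscore_scale(X)
```

With both features on the same scale, the cost-function contours become more circular and gradient descent can take a much larger learning rate.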
88 |
89 | Now that you are all capable of building a regression model, here is an interesting
90 | problem for you. But before that, think: does feature scaling always improve the
91 | accuracy (reduce the cost more) and the speed of the model? Look it up.
92 |
93 |
94 |
95 |
--------------------------------------------------------------------------------
/Week 1/Logistic Regression/README.md:
--------------------------------------------------------------------------------
 1 | # LOGISTIC REGRESSION
2 | ***
3 | Logistic regression is used for binary classification where we use a sigmoid function that takes input as independent variables and produces a probability value between 0 and 1.
4 | To get an intuition refer to the [article](https://ml-cheatsheet.readthedocs.io/en/latest/logistic_regression.html).
5 |
 6 | For example, suppose we have two classes, Class 0 and Class 1. If the value of the logistic function for an input is greater than 0.5 (the threshold value), it belongs to Class 1; otherwise it belongs to Class 0. It's referred to as regression because it is an extension of linear regression, but it is mainly used for classification problems. This [Video (32 to 36)](https://www.youtube.com/watch?v=xuTiAW0OR40&list=PLkDaE6sCZn6FNC6YRfRQc_FbeQrF8BwGI&index=32)
7 | provides a deep insight to it.
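The sigmoid-plus-threshold rule described above can be sketched in a few lines (the 0.5 threshold matches the example in the text; here `z` stands for the model's linear output w·x + b):

```python
import numpy as np

def sigmoid(z):
    """Squash any real number into a probability between 0 and 1."""
    return 1 / (1 + np.exp(-z))

def predict(z, threshold=0.5):
    """Class 1 if the predicted probability crosses the threshold, else class 0."""
    return int(sigmoid(z) >= threshold)

p = sigmoid(0.0)      # exactly 0.5: the decision boundary sits at z = 0
label = predict(3.0)  # large positive z -> probability near 1 -> class 1
```

Notice that thresholding at 0.5 is equivalent to checking the sign of z, which is why the decision boundary of logistic regression is linear in the inputs.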
8 |
 9 | >>Now, what happens at the backend 🤔? Let's dive into its [maths](https://towardsdatascience.com/logistic-regression-detailed-overview-46c4da4303bc).
10 |
11 |
12 |
13 | ***
14 | # REGULARISATION
15 | ***
16 | >While developing machine learning models, you must have encountered a situation in which the training accuracy of the model is high but the validation or testing accuracy is low. This is popularly known as overfitting in the domain of machine learning, and it is the last thing a machine learning practitioner would like to have in their model. In this section, we will learn about a method known as regularisation, which helps us solve the problem of overfitting. But before that, let's understand the role of regularisation and what underfitting and overfitting are.
17 | Refer to the [Video(37-39,41)](https://www.youtube.com/watch?v=8upNQi-40Q8&list=PLkDaE6sCZn6FNC6YRfRQc_FbeQrF8BwGI&index=37) to get proper intuition.
18 |
19 |
20 | Now, let's dive into types of logistic regression
21 | ***
22 | ## BINARY LOGISTIC REGRESSION
23 | >In Binary Logistic regression, there can be only two possible types of the dependent variables, such as 0 or 1, Pass or Fail, etc. Refer to the [article](https://onezero.blog/modelling-binary-logistic-regression-using-python-research-oriented-modelling-and-interpretation/) to explore more.
24 |
25 | ## MULTINOMIAL LOGISTIC REGRESSION
26 | >In Multinomial Logistic Regression, there can be 3 or more possible unordered types of the dependent variable, such as "cat", "dogs", or "sheep”. Refer to the [article](https://www.pycodemates.com/2022/03/multinomial-logistic-regression-definition-math-and-implementation.html) to get better insights.
27 | ***
28 | # FEATURE MAPPING
29 | Feature mapping is a technique used in data analysis and machine learning to transform input data from a lower-dimensional space to a higher-dimensional space, where it can be more easily analyzed or classified.
30 | Go through the content below…
31 | >One way to fit the data better is to create more features from each data point. We will map the features into all polynomial terms of x₁ and x₂ up to the sixth power. As a result of this mapping, our vector of two features (the scores on two QA tests) is transformed into a 28-dimensional vector (27 polynomial terms plus an intercept term). A logistic regression classifier trained on this higher-dimensional feature vector will have a more complex decision boundary and will appear nonlinear when drawn in our 2-dimensional plot. The mapping is:
32 |
33 | map_feature(x) = [x₁, x₂, x₁², x₁x₂, x₁³, …, x₁x₂⁵, x₂⁶]
34 |
35 | >
36 | >This can be executed as
37 | ```python
38 | import numpy as np
39 |
40 | def map_feature(X1, X2, degree=6):
41 |     # All polynomial terms of X1 and X2 up to `degree`; the intercept column is added separately
42 |     X1, X2 = np.atleast_1d(X1), np.atleast_1d(X2)
43 |     out = [X1 ** (i - j) * X2 ** j
44 |            for i in range(1, degree + 1)
45 |            for j in range(i + 1)]
46 |     return np.stack(out, axis=1)
47 | ```
48 |
49 | This idea will come up again with CNNs in the coming weeks.
50 |
51 |
52 | # ASSIGNMENT
53 | ***
54 | Plot the decision boundary and find the accuracy of your model on [data.txt](./data.txt).
55 | For further information refer to the [colab_notebook](https://colab.research.google.com/drive/1oRnVWpXmK5JDKIHOOJm5bjwy8d9v6pRb#scrollTo=BqlxZOXoHh4z).
56 | Save a copy of it in your Drive, complete the blanks, and save your colab notebook to GitHub.
57 | Share the GitHub repo for verification.
58 |
59 |
--------------------------------------------------------------------------------
/Week 1/Python Modules/README.md:
--------------------------------------------------------------------------------
1 | # Learning Python Modules: NumPy, Pandas, and Matplotlib
2 | ***
3 | ## Numpy Basics:(1 day)
4 | NumPy is a Python library used for working with arrays.
5 | It also has functions for working in the domain of linear algebra, Fourier transform, and matrices.
6 |
7 | ### Numpy Documentation:
8 | https://numpy.org/doc/stable/user/absolute_beginners.html
9 |
10 | Note: Do go through numpy broadcasting carefully: https://numpy.org/doc/stable/user/basics.broadcasting.html
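As a quick taste of what the broadcasting page covers: NumPy automatically stretches arrays with compatible shapes so you can combine them without writing loops. For example:

```python
import numpy as np

# A (3, 1) column combined with a length-4 row broadcasts to a full (3, 4) result
col = np.array([[0], [10], [20]])    # shape (3, 1)
row = np.array([1, 2, 3, 4])         # shape (4,), treated as (1, 4)
table = col + row                    # shape (3, 4): every pairwise sum

# A common ML use: subtract a per-feature mean from every row of a dataset
X = np.array([[1.0, 2.0], [3.0, 4.0]])
centered = X - X.mean(axis=0)        # mean has shape (2,), broadcast over rows
```

No data is actually copied during broadcasting — NumPy just pretends the smaller array is repeated, which keeps these operations fast.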
11 |
12 | ### Refer to this for the Numpy Video Tutorial in Hindi:
13 | https://www.youtube.com/watch?v=awP79Yb3NaU
14 | Skip to 19:56
15 |
16 | ### Refer to this for the Numpy Video Tutorial in English:
17 | https://www.youtube.com/watch?v=QUT1VHiLmmI
18 |
19 |
20 | If you have never used NumPy or are not familiar with it, please go through this documentation carefully, step by step.
21 | If you have learned some of the topics before, you can navigate to the respective topics from the menu bar on the right.
22 | You can go through any one of the 3 materials provided.
23 |
24 | ## Numpy Advanced:(1 day)
25 | https://numpy.org/doc/stable/user/quickstart.html
26 |
27 | If you are familiar with NumPy or have completed the basics documentation, go through this carefully.
28 | ***
29 | ## Pandas:(2 days)
30 | Pandas is a Python library used for working with data sets.
31 | It has functions for analyzing, cleaning, exploring, and manipulating data.
32 | ***
33 | ### Pandas Compiled Documentation:
34 |
35 | https://www.w3schools.com/python/pandas/default.asp
36 |
37 | The following documentation contains a tutorial on Pandas from basic to advanced. Navigate through it using the menu bar on the left side according to your need, but do make sure on completion, you are aware of all the topics mentioned.
38 | You can go through any one of the 3 materials provided.
39 |
40 | ### Refer to this for Pandas Video Tutorial in Hindi:
41 |
42 | https://www.youtube.com/watch?v=JjuLJ3Sb_9U&list=PLjVLYmrlmjGdEE2jFpL71LsVH5QjDP5s4&index=2
43 |
44 |
45 | ### Refer to this for Pandas Video Tutorial in English:
46 |
47 | https://www.youtube.com/watch?v=ZyhVh-qRZPA&list=PL-osiE80TeTsWmV9i9c58mdDCSskIFdDS
48 |
49 |
50 | ***
51 |
52 | ## Matplotlib:(1 day)
53 | Matplotlib is a low-level graph plotting library in Python that serves as a visualization utility.
54 | ***
55 | ### Matplotlib Compiled Documentation:
56 | https://www.w3schools.com/python/matplotlib_intro.asp
57 |
58 | ### Refer to this for Matplotlib Video Tutorial in Hindi:
59 | https://www.youtube.com/watch?v=9GvnrQv138s&list=PLjVLYmrlmjGcC0B_FP3bkJ-JIPkV5GuZR
60 |
61 |
62 | ### Refer to this for Matplotlib Video Tutorial in English:
63 | https://www.youtube.com/watch?v=ANQSqWQrLSU&list=PLc20sA5NNOvpG9WlvjY7XTDWPAFBNgv_N&index=3
64 |
65 |
66 | The following documentation contains a tutorial on Matplotlib from basic to advanced. Navigate through it using the menu bar on the left side according to your need, but do make sure on completion, you are aware of all the topics mentioned.
67 | ***
68 | ## Please make sure you know these modules thoroughly, as they will be used very frequently in the course.
69 |
--------------------------------------------------------------------------------
/Week 1/README.md:
--------------------------------------------------------------------------------
1 | ## Welcome to Week 1
 2 | Welcome to the first week of our **Machine Learning Course!** This week, we'll start by equipping ourselves with essential ML libraries such as NumPy, Pandas, and Matplotlib, which are crucial for data manipulation and visualization. Following this, we'll delve into the foundational concepts of machine learning, including regression and classifiers. We'll also practice on many datasets to test our learning.
3 |
4 | Stay engaged and practice consistently to make the most of this learning experience. Keep your enthusiasm high, and let’s begin this exciting journey into machine learning together!
5 |
 6 | 1. We'll start with the [ML Libraries / Python Modules](./Python%20Modules) to facilitate efficient data manipulation and visualization.
7 | 2. Our first ML concept will be [Linear Regression](./Linear%20Regression), a powerful technique for modeling relationships between variables.
8 | 3. Next, we'll explore [Logistic Regression](./Logistic%20Regression), a fundamental method for binary classification tasks.
9 | 4. Now we'll cover [Classifiers](https://github.com/wncc/Machine-Learning-LS-24/tree/main/Week%201/Classifiers%20(OPTIONAL)), which are algorithms used to classify data into different categories based on patterns in the input features.
10 | *This section is optional for week 1 but is **compulsory** in week 2. After completing the above 3 sections, if time permits, you can refer to the **Classifiers** section.*
11 | 5. It's time to put our knowledge to the test. Let's tackle the [Assignments](./Assignment) and practice our skills to reinforce our understanding.
12 |
13 |
14 | >[!Note]
15 | For submitting all assignments, make a GitHub repo and store the assignments in that repo.
16 | Refer to the [**video**](https://www.youtube.com/watch?v=PQsJR8ci3J0) for making a repo.
17 |
18 | **Congratulations** on completing the first week! We hope you've gained valuable insights that have sparked your enthusiasm and interest in machine learning. Next week, we'll delve into the expansive realm of deep learning, a powerful approach for constructing diverse AI models. Keep your enthusiasm high, and we are excited to see you in the next week.
19 |
--------------------------------------------------------------------------------
/Week 2/Assignment/Classifiers Assignment/README.md:
--------------------------------------------------------------------------------
1 | # Assignment 1
2 | Open the [Colab File](https://colab.research.google.com/drive/1hbaJHFO8KtDi9xEPYxAfMI-IiPJYC37q?usp=sharing)
3 |
4 | Also download the [utils.zip](./utils.zip) file to your computer. It contains the necessary files required to execute the colab cells. Upload the zip file to your colab.
5 |
6 | You can achieve the required accuracy with the non-optional classifiers, though you may also use the optional ones.
7 | *Here, "optional" refers to the optional classifier videos in the [Classifiers](./../../Classifiers) section.*
8 |
9 | **BEST OF LUCK**
10 |
--------------------------------------------------------------------------------
/Week 2/Assignment/Classifiers Assignment/utils.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wncc/Machine-Learning-LS-24/1f8ed556eec2d37960867d4b6e86e07886071f7e/Week 2/Assignment/Classifiers Assignment/utils.zip
--------------------------------------------------------------------------------
/Week 2/Assignment/Neural Network Assignment/README.md:
--------------------------------------------------------------------------------
1 | # Assignment 2
2 | In this assignment, you need to classify images into 2 classes using a simple Neural Network.
3 |
4 | Download the [Images](./homer_bart.zip) file.
5 |
6 | Suggestions:
7 | 1. The images are of different dimensions; while importing/preprocessing, resize them all to `64x64`.
8 | 2. Distribute them into batches (e.g. `batch_size = 32`).
9 | 3. Since the number of images is small, there is no need for a validation dataset; use training and test datasets only, preferably with a 9:1 train-test split.
10 | 4. Use `Dense` (i.e. Fully-Connected) layers with `activation = 'relu'` in each layer. Use `activation = 'sigmoid'` in the last layer.
11 | 5. Only the test dataset's `accuracy` metric will be checked.
12 |
13 | Mandatory:
14 | 1. Don't use `Conv2D`, `MaxPool2D` layers.
15 | 2. For passing the assignment, test dataset accuracy should be more than 90%. [Test Accuracy > 0.9]
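The suggestions above translate almost directly into a Keras model. A minimal sketch, assuming TensorFlow/Keras (the layer widths and optimizer below are illustrative choices, not requirements):

```python
import tensorflow as tf

# Illustrative skeleton only -- layer widths and optimizer are assumptions.
# Input: 64x64 RGB images; output: probability of class 1.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),               # resized images
    tf.keras.layers.Flatten(),                       # no Conv2D/MaxPool2D layers
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),  # binary output
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.summary()
```

Train with `model.fit(...)` on your 9:1 split and check `model.evaluate(...)` on the test set.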
16 |
17 | **BEST OF LUCK**
18 |
--------------------------------------------------------------------------------
/Week 2/Assignment/Neural Network Assignment/homer_bart.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wncc/Machine-Learning-LS-24/1f8ed556eec2d37960867d4b6e86e07886071f7e/Week 2/Assignment/Neural Network Assignment/homer_bart.zip
--------------------------------------------------------------------------------
/Week 2/Assignment/YOLOv8 Assignment/README.md:
--------------------------------------------------------------------------------
1 | # Assignment 3
2 | :white_check_mark: After going through the contents, it's time to implement YOLOv8.
3 |
4 | ### Here is the Assignment
5 | ***
6 | ### DISCLAIMER:
7 | >⚠️ TRY TO START THIS ASSIGNMENT AS EARLY AS POSSIBLE. LOADING THE DATA AND TRAINING THE MODEL BOTH TAKE TIME.
8 |
9 | >>You will be given a dataset in which all the images in the train dataset are labelled.
10 | >>The dataset is based on car parts.
11 | >>You have to train the model and make predictions on the test dataset.
12 |
13 | ***
14 |
15 | ### :bangbang: CAUTION
16 | > Do not forget to change the paths of the train, test and valid images
17 |
18 | ```
19 | train: .../train/images
20 | test: .../test/images
21 | valid: .../valid/images
22 | ```
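For context, these paths live in the dataset's `data.yaml`, which Ultralytics reads when training. A hedged sketch of what such a file might look like (the paths, class count and class names below are placeholders, not the actual car-parts dataset's values; note that Ultralytics itself expects the key `val`, while some dataset exports write `valid`):

```
# Hypothetical data.yaml -- substitute your own paths and classes
train: /content/dataset/train/images
test: /content/dataset/test/images
val: /content/dataset/valid/images

nc: 3                              # number of classes (placeholder)
names: ['door', 'wheel', 'bumper'] # class names (placeholders)
```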
23 | ***
24 |
25 | > [**DATASET**](https://drive.google.com/drive/folders/1rPFkpiEhn5CKgT7EPZRF9sOCepsvL-hM?usp=drive_link)
26 |
27 | ***
28 |
29 | ### SUBMISSION MODE:
30 | >A colab notebook along with images of the confusion matrix and results.png. You will find both in the runs/detect/train folder.
31 |
32 | 💡TRY TO MAKE YOUR MODEL AS ACCURATE AS POSSIBLE…
33 |
--------------------------------------------------------------------------------
/Week 2/Classifiers/README.md:
--------------------------------------------------------------------------------
1 | # CLASSIFIERS
2 | In this course, you'll learn how to use classifiers, powerful tools for data science and machine learning. You'll explore various classification algorithms, including logistic regression, decision trees, random forests, and support vector machines. By the end, you'll be well-equipped to handle various data science and machine learning projects involving classification tasks.
3 | ## Regression:
4 | >[!Note]
5 | >You can *skip* the regression part if done before
6 |
7 | Regression is a key technique in data science and machine learning for modeling relationships between variables. You'll learn logistic regression for binary classification and softmax regression for multi-class classification. By mastering these methods, you'll enhance your ability to analyze and interpret data effectively.
8 |
9 | [Logistic Regression](https://youtu.be/zM4VZR0px8E?si=1GkpCaVPTQMuHi6D) (till 16:00 at 2x speed)
10 | [Softmax Regression](https://youtu.be/J5bXOOmkopc?si=F54C3YoWqGoL-mG-) (till 11:21 at 2x speed)
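As a taste of what the softmax-regression video covers, here is a minimal NumPy sketch of the softmax function and a multi-class prediction (the weights below are random placeholders; in practice they are learned):

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability, then normalize the exponentials.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Toy softmax-regression prediction: 4 features -> 3 class probabilities.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))   # placeholder weights (normally learned)
b = np.zeros(3)
x = rng.normal(size=(1, 4))

probs = softmax(x @ W + b)
print(probs)                  # each row sums to 1
```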
11 | ## SVM:
12 | Support Vector Machines (SVMs) are a powerful tool in data science for classification tasks. You'll delve into the theory behind SVMs, understanding how they optimize decision boundaries to maximize margin and handle complex data distributions. Optional study includes polynomial kernels for non-linear data transformations. Through practical applications, you'll master SVMs for robust data analysis and prediction.
13 |
14 | [Basic Intuition](https://youtu.be/H9yACitf-KM?si=JCrZzpSIFY0aaV0u) (at 1.5x speed)
15 | [Maths and Theory](https://youtu.be/Js3GLb1xPhc?si=AMAxwVn-cK6nKIL1) (at 1.5x speed)
16 | [Application](https://youtu.be/FB5EdxAGxQg?si=AiNDpo0t0f3zjQbP) (at 2x speed)
17 | [Polynomial Kernels SVM](https://youtu.be/8bFKyb77vp0?si=c7gLv7to7dy-NeCF) (Optional)
18 | [Polynomial Kernels Application](https://youtu.be/dl_ZsuHSIFE?si=vPj0FCbwNFBnqaY9) (Optional)
19 | ## Decision Trees & Random Forest:
20 | Decision Trees and Random Forests are fundamental, often first-choice techniques in data science for classification tasks. You'll learn how Decision Trees partition data based on features to make predictions, and how Random Forests aggregate multiple trees for improved accuracy and robustness. Through practical applications, you'll gain proficiency in building, evaluating, and optimizing these models for various data science tasks.
21 |
22 | [Decision Tree in ML](https://youtu.be/RmajweUFKvM?si=kMkMDKvsUNn9GtJc&t=393) (6:32 to 14:31 at 1.5x speed)
23 | [Random Forest in ML](https://youtu.be/eM4uJ6XGnSM?si=U9eHNFiLz-TtABwv&t=858) (14:18 to 17:18 at 1.5x speed)
24 | [Random Forest Classifier and Regressor](https://youtu.be/nxFG5xdpDto?si=3ACkJrx7H-McFssa&t=272) (4:30 to end at 1.5x speed)
25 | [Application](https://youtu.be/ok2s1vV9XW0?si=Imgl-oYGEcBrIvGu&t=154) (2:43 to 10:36 at 2x speed)
26 | ## KNN (Optional):
27 | K-nearest neighbors (KNN) is a non-parametric classification and regression method. It predicts new data points based on their similarity to existing data points, using a majority vote (for classification) or averaging (for regression) among its nearest neighbors in the feature space.
28 |
29 | [KNN in ML](https://youtu.be/CQveSaMyEwM?si=-DrgOoalYD89MaSD) (Optional)
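The core of the KNN majority vote fits in a few lines of NumPy. A toy sketch (the value of `k` and the toy points are arbitrary):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    # Distance from x to every training point, then majority vote among the k nearest.
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy 2D data: class 0 near the origin, class 1 near (5, 5).
X = np.array([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]])
y = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X, y, np.array([0.5, 0.5])))  # -> 0
print(knn_predict(X, y, np.array([5.5, 5.5])))  # -> 1
```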
30 | ## Naive Bayes (Optional):
31 | Naive Bayes is a probabilistic classifier based on Bayes' theorem. It assumes that features are independent of each other, hence "naive," which simplifies computation and training. It's efficient with minimal training data, making it ideal for tasks like text classification and spam filtering. Despite this simplifying assumption, Naive Bayes classifiers often perform well in practice, especially with high-dimensional data and large datasets.
32 |
33 | [Intuition](https://youtu.be/jS1CKhALUBQ?si=8LC2Qn9oFexke3aM) (at 1.5x speed)
34 | [Naive Bayes on Text Data](https://youtu.be/temQ8mHpe3k?si=u_I8AyPMoRho6KHV) (at 1.5x speed)
35 | [Application](https://youtu.be/nHIUYwN-5rM?si=bHFlE9KX23WmaZH-) (at 2x speed)
36 | >[!Warning]
37 | NLP is done with advanced Deep Learning algorithms
38 | ## K-means Clustering (Optional):
39 | K-means clustering is a partitioning algorithm that aims to divide a dataset into K clusters by iteratively assigning each data point to the cluster with the nearest centroid and recalculating centroids until convergence. It's widely used for data segmentation and pattern recognition tasks.
40 |
41 | [K-Means in ML](https://youtu.be/EItlUEPCIzM?si=f9Or7ZgnpzlJmudt) (Optional)
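The assign-then-update loop described above can be sketched in plain NumPy (the choice of `k`, the iteration count and the toy blobs are arbitrary):

```python
import numpy as np

def kmeans(X, k=2, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]  # random initial centroids
    for _ in range(iters):
        # Assign each point to its nearest centroid...
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # ...then move each centroid to the mean of its assigned points.
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centroids, labels

# Two well-separated blobs
X = np.array([[0., 0.], [0., 1.], [1., 0.], [9., 9.], [9., 10.], [10., 9.]])
centroids, labels = kmeans(X, k=2)
print(centroids)  # one centroid per blob
```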
42 |
--------------------------------------------------------------------------------
/Week 2/Framework + NN/README.md:
--------------------------------------------------------------------------------
1 | # FRAMEWORK
2 | In this course, you'll learn how to use TensorFlow/PyTorch, two powerful tools for data science and machine learning. Starting with creating tensors and performing operations on them, you'll quickly move to building and deploying deep learning models. You'll construct neural networks from scratch, utilize Convolutional Neural Networks (CNNs) for tasks like image recognition, and apply regularization techniques such as L2 and Dropout to improve model performance. The course also includes tutorials on saving and loading models, ensuring your work is reusable and accessible. By the end, you'll be well-equipped to handle various data science and machine learning projects.
3 | >[!IMPORTANT]
4 | >Use either TensorFlow or PyTorch
5 | ## Tensorflow
6 | >[!Tip]
7 | All the videos can be watched at **1.5x speed**
8 | USE NOTEBOOK [**COLAB**/JUPYTER]
9 |
10 | 1. **[Install and Setup](https://youtu.be/5Ym-dOS9ssA?si=lpQeaEaSqo9jOtvi)**
11 | `pip3 install tensorflow` (if using on local notebook/editor)
12 | Setup isn’t required for working in colab
13 |
14 | 2. **[Tensorflow Basics](https://youtu.be/HPjBY1H-U4U?si=xTni0ae-S0vgbQVA)**
15 | TensorFlow basics encompass creating tensors using constants or variables, performing operations such as element-wise and matrix multiplication, reshaping tensors, computing dot products, applying broadcasting, and utilizing activation functions for neural network training and inference.
16 | *You can skip `os.environ[...] = '2'`*
17 |
18 | 3. **[Neural Network](https://youtu.be/pAhPiF3yiXI?si=jodY8SIy5PUapbzo)**
19 | Neural networks consist of interconnected neurons arranged in layers, processing input data through weights and activation functions to make predictions.
20 |
21 | 4. **[Regularization](https://youtu.be/kJSUq1PLmWg?si=yfzLDoywCIZ3zE92)**
22 | Regularization techniques in neural networks prevent overfitting by adding constraints or penalties, such as L1/L2 regularization or dropout, to the model's training process.
23 |
24 | 5. **[Model Subclassing](https://youtu.be/WcZ_1IAH_nM?si=iftvnOL3IBpq-qYp)** (OPTIONAL)
25 | TensorFlow's model subclassing allows custom model architectures by subclassing `tf.keras.Model`, enabling flexibility in defining layers, forward pass, and custom training loops for complex neural network designs.
26 |
27 | 6. **[Model Saving and Loading](https://youtu.be/idus3KO6Wic?si=SFuwbX3sZWX6CEsv)**
28 | Model saving and loading in neural networks involve storing trained model parameters to disk and reloading them for inference or further training, using formats like HDF5.
29 |
30 | 7. **[Custom Dataset for Images](https://youtu.be/q7ZuZ8ZOErE?si=Xu9uxKgzL9oYQvIa)**
31 | Creating a custom dataset of images is necessary for training neural networks on specific tasks or domains not covered by existing datasets, ensuring better model performance and accuracy.
32 |
33 | **EXPLORE THE [TENSORFLOW OFFICIAL DOCUMENTATION](https://www.tensorflow.org/api_docs/python/tf/keras) FOR MORE**
34 | `tf.keras` will be used mainly
35 |
36 | ## PyTorch
37 | >[!Tip]
38 | All the videos can be watched at **1.5x speed**
39 | USE NOTEBOOK [**COLAB**/JUPYTER]
40 |
41 | 1. **[Install and Setup](https://youtu.be/2S1dgHpqCdk?si=5tSQQbP7UELku9bc)**
42 | Setup isn’t required for working in colab
43 |
44 | 1. **[Pytorch Basics](https://youtu.be/x9JiIFvlUwk?si=jRkoBXGdU4FC6i8P)**
45 | PyTorch basics involve creating tensors with constants or variables, performing operations like element-wise and matrix multiplication, reshaping tensors, computing dot products, applying broadcasting, and utilizing activation functions for neural network training and inference.
46 |
47 | 1. **[Neural Network](https://youtu.be/Jy4wM2X21u0?si=DHMg84nHd8VL6Pjk)**
48 | Neural networks consist of interconnected neurons arranged in layers, processing input data through weights and activation functions to make predictions.
49 |
50 | 1. **[Model Saving and Loading](https://youtu.be/g6kQl_EFn84?si=UMABNixyBJaNxF7_)**
51 | Model saving and loading in neural networks involve storing trained model parameters to disk and reloading them for inference or further training, using formats like HDF5.
52 |
53 | 1. **[Custom Dataset for Images](https://youtu.be/ZoZHd0Zm3RY?si=nl0R64asjzMYPR4m)**
54 | Creating a custom dataset of images is necessary for training neural networks on specific tasks or domains not covered by existing datasets, ensuring better model performance and accuracy.
55 |
56 |
57 | **EXPLORE THE [PYTORCH OFFICIAL DOCUMENTATION](https://pytorch.org/tutorials/beginner/basics/intro.html) FOR MORE**
58 |
59 | # NEURAL NETWORK
60 | In this course, you'll learn the essentials of Neural Networks for data science and machine learning. Starting with the basics, you'll understand how neural networks work and how to build them from scratch. You'll cover topics such as feedforward networks, activation functions, backpropagation, and managing overfitting and underfitting. You'll also explore learnable parameters and batch normalization. The course also covers the train, validation, and test sets.
61 | >[!Tip]
62 | All the videos can be watched at **1.5x speed**
63 |
64 | Watch the [playlist](https://youtube.com/playlist?list=PLZbbT5o_s2xq7LwI2y8_QtvuXZedL6tQU&si=lgaRNzQ9qPih4Hnt) videos 1-18, 25-38 (skip 34)
65 |
66 |
--------------------------------------------------------------------------------
/Week 2/README.md:
--------------------------------------------------------------------------------
1 | ## Welcome to Week 2
2 | Welcome to the second week of our **Machine Learning Course!** This week, we'll dive into the world of Deep Learning. We’ll start by equipping ourselves with the most essential Deep Learning frameworks, TensorFlow/PyTorch, which are crucial for building and training Deep Learning models. Following this, we’ll delve into the foundational concepts of neural networks and explore the YOLOv8 architecture for object classification. Additionally, we'll practice on various datasets to test our learning.
3 |
4 | Stay engaged and practice consistently to make the most of this learning experience. Keep your enthusiasm high, and let’s continue this exciting journey of machine learning together!
5 |
6 | 1. Let's start with the Deep Learning frameworks, [TensorFlow/PyTorch](./Framework%20+%20NN), followed by concepts of [Neural Network](./Framework%20+%20NN).
7 | 2. Now we'll cover [Classifiers](./Classifiers), which are algorithms used to classify data into different categories based on patterns in the input features.
8 | You can skip this if it was done in week 1.
9 | 3. Let's explore [YOLOv8](./YOLOv8), focusing on its capabilities for object detection and classification.
10 | 4. It's time to put our knowledge to the test. Let's tackle the [Assignments](./Assignment) and practice our skills to reinforce our understanding.
11 |
12 | **Congratulations** on completing the second week! We hope you've gained valuable insights that have sparked your enthusiasm and interest in machine learning. Next week, we'll delve into the expansive realm of CNN, a powerful approach for image related problems. Keep your enthusiasm high, and we are excited to see you in the next week.
13 |
14 |
15 |
--------------------------------------------------------------------------------
/Week 2/YOLOv8/README.md:
--------------------------------------------------------------------------------
1 | > ## What is YOLOv8?
2 | YOLO (You Only Look Once) is a real-time object detection algorithm. It is a single-stage object detector that uses a convolutional neural network (CNN) to predict the bounding boxes and class probabilities of objects in input images. The YOLO algorithm divides the input image into a grid of cells, and for each cell, it predicts the probability of the presence of an object and the bounding box coordinates of the object. It also predicts the class of the object. Unlike two-stage object detectors such as R-CNN and its variants, YOLO processes the entire image in one pass, making it faster and more efficient.
3 |
4 | + #### To start with the basic concepts please watch this video:
5 |
6 | https://www.youtube.com/watch?v=ag3DLKsl2vk&list=PLiwz66k_RGNwIbE16anNQ37z_gNaVuLdx
7 |
8 | + #### For deeper understanding go through this video:
9 |
10 | https://www.youtube.com/watch?v=iy34dSwfEsY
11 |
12 | + #### Official GitHub repo of YOLO v8
13 | https://github.com/ultralytics/ultralytics
14 | + #### For visual experience of Yolo:
15 |
16 | https://www.youtube.com/watch?v=ag3DLKsl2vk
17 |
--------------------------------------------------------------------------------
/Week 3/Assignment/CNN/README.md:
--------------------------------------------------------------------------------
1 | [Problem Statement](https://colab.research.google.com/drive/1Iak8gGbKmIzbWoe0K_U8K4BMWCrfkHdp?usp=sharing)
2 |
3 | Use the given [data](./data)
--------------------------------------------------------------------------------
/Week 3/Assignment/CNN/data:
--------------------------------------------------------------------------------
1 | https://drive.google.com/drive/folders/1Nxwrymd9n1qfum_2OEgU_1dtwbsVLbdv?usp=sharing
2 |
--------------------------------------------------------------------------------
/Week 3/Assignment/OpenCV/ImagePreprocessing-OpenCV.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "a57d17d9-a84a-4bfe-a335-18b67cc0db32",
6 | "metadata": {},
7 | "source": [
8 | "**RUN ALL THE CELLS AND DON'T EDIT ANY CELL**\n",
9 | "\n",
10 | "**CODE IN THE MENTIONED CELLS ONLY**"
11 | ]
12 | },
13 | {
14 | "cell_type": "markdown",
15 | "id": "be62504c-e291-4502-b880-5da1d3ec9ff7",
16 | "metadata": {},
17 | "source": [
18 | "## Import Necessary Files"
19 | ]
20 | },
21 | {
22 | "cell_type": "code",
23 | "execution_count": null,
24 | "id": "998981a4-938d-4f61-ac33-8f33b1417121",
25 | "metadata": {},
26 | "outputs": [],
27 | "source": [
28 | "import cv2\n",
29 | "import numpy as np\n",
30 | "import matplotlib.pyplot as plt"
31 | ]
32 | },
33 | {
34 | "cell_type": "markdown",
35 | "id": "75a7ca5c-7739-4d9c-835c-89e8f927bb59",
36 | "metadata": {},
37 | "source": [
38 | "Add ```/path/to/your/folder``` to load the images"
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "execution_count": null,
44 | "id": "ff141a98-3ddb-4bc5-aff0-a0d48071e377",
45 | "metadata": {},
46 | "outputs": [],
47 | "source": [
48 | "cartoon = cv2.imread(\"/path/to/your/folder/cartoon.bmp\")\n",
49 | "cartoon_y = cv2.imread(\"/path/to/your/folder/cartoon_y.jpg\")\n",
50 | "girl = cv2.imread(\"/path/to/your/folder/girl.png\") #grayscale image\n",
51 | "girl_y = cv2.imread(\"/path/to/your/folder/girl_y.jpg\")\n",
52 | "fingerprint = cv2.imread(\"/path/to/your/folder/fingerprint.jpg\")\n",
53 | "fingerprint_y = cv2.imread(\"/path/to/your/folder/fingerprint_y.jpg\")"
54 | ]
55 | },
56 | {
57 | "cell_type": "markdown",
58 | "id": "174718bd-90c7-46c0-b018-b15be8dd4a6d",
59 | "metadata": {},
60 | "source": [
61 | "## Raw and Final Image\n",
62 | "You need to process the raw images: ```cartoon, girl, fingerprint``` to get the processed images: ```cartoon_y, girl_y, fingerprint_y```\n",
63 | "\n",
64 | "Your processed image need not be 100% identical to the given processed image. They are given to give you an idea of what to process in the raw images."
65 | ]
66 | },
67 | {
68 | "cell_type": "code",
69 | "execution_count": null,
70 | "id": "3638c757-ff4b-4622-b4f2-dc3acffabae3",
71 | "metadata": {},
72 | "outputs": [],
73 | "source": [
74 | "fig, axs = plt.subplots(3,2,figsize=(8,12))\n",
75 | "axs[0,0].imshow(cv2.cvtColor(cartoon,cv2.COLOR_BGR2RGB))\n",
76 | "axs[0,1].imshow(cartoon_y)\n",
77 | "axs[0,0].axis(\"off\")\n",
78 | "axs[0,1].axis(\"off\")\n",
79 | "axs[0,0].set_title(\"Raw\")\n",
80 | "axs[0,1].set_title(\"Masked\")\n",
81 | "axs[1,0].imshow(girl)\n",
82 | "axs[1,1].imshow(girl_y)\n",
83 | "axs[1,0].axis(\"off\")\n",
84 | "axs[1,1].axis(\"off\")\n",
85 | "axs[1,0].set_title(\"Raw\")\n",
86 | "axs[1,1].set_title(\"Edges\")\n",
87 | "axs[2,0].imshow(cv2.cvtColor(fingerprint,cv2.COLOR_BGR2RGB))\n",
88 | "axs[2,1].imshow(fingerprint_y)\n",
89 | "axs[2,0].axis(\"off\")\n",
90 | "axs[2,1].axis(\"off\")\n",
91 | "axs[2,0].set_title(\"Raw\")\n",
92 | "axs[2,1].set_title(\"Binary\")\n",
93 | "plt.show()"
94 | ]
95 | },
96 | {
97 | "cell_type": "markdown",
98 | "id": "c7b6f679-6153-4fad-b6ec-0d0bf133775f",
99 | "metadata": {},
100 | "source": [
101 | "## Code in the following 3 cells only"
102 | ]
103 | },
104 | {
105 | "cell_type": "markdown",
106 | "id": "9ca454f5-55bc-4cae-9ba0-25e00fd34d35",
107 | "metadata": {},
108 | "source": [
109 | "### Cartoon\n",
110 | "For the cartoon image, you need to mask the dress. Write all your code in the cell below and the final image should be stored in ```cartoon_y_mentee``` variable only or it'll show error later."
111 | ]
112 | },
113 | {
114 | "cell_type": "code",
115 | "execution_count": null,
116 | "id": "d6659747-9da3-48a3-9ee9-0858f62b702f",
117 | "metadata": {},
118 | "outputs": [],
119 | "source": [
120 | "#START CODE HERE (Do all your processing here)\n",
121 | "\n",
122 | "cartoon_y_mentee = None #(in RGB format only)\n",
123 | "\n",
124 | "#END CODE HERE (the final processed image should be cartoon_y_mentee)"
125 | ]
126 | },
127 | {
128 | "cell_type": "markdown",
129 | "id": "9cd5295a-d3e2-4889-a1c8-0f956ef528c8",
130 | "metadata": {},
131 | "source": [
132 | "### Girl\n",
133 | "For the girl image, you need to detect the edges. Write all your code in the cell below and the final image should be stored in ```girl_y_mentee``` variable only or it'll show error later."
134 | ]
135 | },
136 | {
137 | "cell_type": "code",
138 | "execution_count": null,
139 | "id": "80ac55b4-f34c-4f5d-af7c-f6df602f56f8",
140 | "metadata": {},
141 | "outputs": [],
142 | "source": [
143 | "#START CODE HERE (Do all your processing here)\n",
144 | "\n",
145 | "girl_y_mentee = None\n",
146 | "\n",
147 | "#END CODE HERE (the final processed image should be girl_y_mentee)"
148 | ]
149 | },
150 | {
151 | "cell_type": "markdown",
152 | "id": "8bc35193-0397-4e48-a198-bd17f4cffaa6",
153 | "metadata": {},
154 | "source": [
155 | "### Fingerprint\n",
156 | "For the fingerprint image, you need to convert it to a binary image. Binary images have 2 intensities only (0 or 255). Write all your code in the cell below and the final image should be stored in ```fingerprint_y_mentee``` variable only or it'll show error later."
157 | ]
158 | },
159 | {
160 | "cell_type": "code",
161 | "execution_count": null,
162 | "id": "c183ef1c-74a3-47db-9322-b0f78636e8a2",
163 | "metadata": {},
164 | "outputs": [],
165 | "source": [
166 | "#START CODE HERE (Do all your processing here)\n",
167 | "\n",
168 | "fingerprint_y_mentee = None #(in Binary format only i.e. pixel intensity = 0 or 255)\n",
169 | "\n",
170 | "#END CODE HERE (the final processed image should be fingerprint_y_mentee)"
171 | ]
172 | },
173 | {
174 | "cell_type": "markdown",
175 | "id": "23a6edaf-aea5-410d-b897-3b12aa1113b2",
176 | "metadata": {},
177 | "source": [
178 | "## Comparison"
179 | ]
180 | },
181 | {
182 | "cell_type": "code",
183 | "execution_count": null,
184 | "id": "23d388f3-a17a-4e99-9845-c9ef26e16283",
185 | "metadata": {},
186 | "outputs": [],
187 | "source": [
188 | "fig, axs1 = plt.subplots(3,2,figsize=(8,12))\n",
189 | "axs1[0,0].imshow(cartoon_y_mentee)\n",
190 | "axs1[0,1].imshow(cartoon_y)\n",
191 | "axs1[0,0].axis(\"off\")\n",
192 | "axs1[0,1].axis(\"off\")\n",
193 | "axs1[0,0].set_title(\"Masked_Mentee\")\n",
194 | "axs1[0,1].set_title(\"Masked\")\n",
195 | "axs1[1,0].imshow(girl_y_mentee,cmap=\"gray\")\n",
196 | "axs1[1,1].imshow(girl_y)\n",
197 | "axs1[1,0].axis(\"off\")\n",
198 | "axs1[1,1].axis(\"off\")\n",
199 | "axs1[1,0].set_title(\"Edges_Mentee\")\n",
200 | "axs1[1,1].set_title(\"Edges\")\n",
201 | "axs1[2,0].imshow(fingerprint_y_mentee,cmap=\"gray\")\n",
202 | "axs1[2,1].imshow(fingerprint_y)\n",
203 | "axs1[2,0].axis(\"off\")\n",
204 | "axs1[2,1].axis(\"off\")\n",
205 | "axs1[2,0].set_title(\"Binary_Mentee\")\n",
206 | "axs1[2,1].set_title(\"Binary\")\n",
207 | "plt.show()"
208 | ]
209 | }
210 | ],
211 | "metadata": {
212 | "kernelspec": {
213 | "display_name": "Python 3 (ipykernel)",
214 | "language": "python",
215 | "name": "python3"
216 | },
217 | "language_info": {
218 | "codemirror_mode": {
219 | "name": "ipython",
220 | "version": 3
221 | },
222 | "file_extension": ".py",
223 | "mimetype": "text/x-python",
224 | "name": "python",
225 | "nbconvert_exporter": "python",
226 | "pygments_lexer": "ipython3",
227 | "version": "3.11.4"
228 | }
229 | },
230 | "nbformat": 4,
231 | "nbformat_minor": 5
232 | }
233 |
--------------------------------------------------------------------------------
/Week 3/Assignment/OpenCV/README.md:
--------------------------------------------------------------------------------
1 | A [Python Notebook](./ImagePreprocessing-OpenCV.ipynb) is provided. Download the [images.zip](./images.zip).
2 |
3 | You need to code in the mentioned cells only and upload the .ipynb file (not .py) to your GitHub repo.
4 | Don't edit any other cell or add any extra cell.
5 |
6 | The [Image Preprocessing - OpenCV](./../../Image%20Preprocessing%20-%20OpenCV) folder was updated on 17th July, 2:00 a.m.
7 | Check the new content **HSV** under **Image Color Channels** (necessary for this assignment).
8 |
--------------------------------------------------------------------------------
/Week 3/Assignment/OpenCV/images.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wncc/Machine-Learning-LS-24/1f8ed556eec2d37960867d4b6e86e07886071f7e/Week 3/Assignment/OpenCV/images.zip
--------------------------------------------------------------------------------
/Week 3/CNN/README.md:
--------------------------------------------------------------------------------
1 | # Convolutional Neural Network
2 |
3 | In the previous week we had learnt about Neural Networks. This week we will learn a special kind of network called Convolutional Neural Network.
4 |
5 | A Convolutional Neural Network (CNN) is a category of machine learning model, namely a type of deep learning algorithm well suited to analysing visual data. CNNs (sometimes referred to as convnets) use principles from linear algebra, particularly convolution operations, to extract features and identify patterns within images. Although CNNs are predominantly used to process images, they can also be adapted to work with audio and other signal data.
6 |
7 | ## Architecture
8 | Convolutional neural networks have three main kinds of layers, which are:
9 | * Convolutional layer
10 | * Pooling layer
11 | * Fully-connected layer
12 |
13 | To dig deeper into these layers, refer to this [theory](https://medium.com/latinxinai/convolutional-neural-network-from-scratch-6b1c856e1c07).
14 |
15 |
16 | ## How does CNN work?
17 | Before going through the applications, it is important to know how the CNN architecture works.
18 |
19 | Go through the [video](https://www.youtube.com/watch?v=zfiSAzpy9NM) to get a basic intuition.
20 | Then go through the [theory](https://medium.com/thedeephub/convolutional-neural-networks-a-comprehensive-guide-5cc0b5eae175) for some more insights.
21 |
22 |
23 | ## Maths Behind
24 | For a better understanding of the maths behind CNNs, refer to either of these videos.
25 |
26 | [Video 1](https://www.youtube.com/watch?v=Lakz2MoHy6o)
27 | [Playlist](https://www.youtube.com/playlist?list=PLuhqtP7jdD8CD6rOWy20INGM44kULvrHu)
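To connect the maths to code, here is a naive "valid"-mode convolution in NumPy (strictly speaking, CNN layers compute cross-correlation, which is what this sketch does; the Sobel kernel and toy image are illustrative):

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 'valid' cross-correlation, as used in CNN convolutional layers."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output pixel is the elementwise product of the kernel
            # with the patch under it, summed.
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge detector on a simple image: left half dark, right half bright.
img = np.zeros((5, 6))
img[:, 3:] = 1.0
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)

print(conv2d(img, sobel_x))  # strong response only at the vertical edge
```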
28 |
29 |
30 |
31 |
--------------------------------------------------------------------------------
/Week 3/Image Preprocessing - OpenCV/README.md:
--------------------------------------------------------------------------------
1 | # Image Preprocessing - OpenCV
2 | Image preprocessing using OpenCV is crucial for deep learning projects as it enhances image quality, ensuring better model accuracy. Techniques include resizing, normalization, noise reduction, and augmentation. These steps standardize inputs, improve feature extraction, and increase dataset diversity, leading to more robust and efficient neural network training and performance.
3 |
4 | Frameworks like TensorFlow also have image preprocessing methods, but they aren't as efficient as OpenCV (which is built for computer vision only). You can use a framework's preprocessing methods, such as resizing, normalization and rotation, for simple image data, but for advanced image preprocessing techniques you need OpenCV.
5 |
6 | >[!Tip]
7 | All the videos can be watched at **2x speed**
8 |
9 | ## Image Basics
10 |
11 | In the context of computer vision and deep learning, an image represented as a NumPy array is a multi-dimensional array where each element corresponds to a pixel value. Each pixel's value ranges from 0 to 255, indicating the intensity of the color. This representation allows for efficient manipulation and processing using NumPy's array operations.
12 |
13 | [What is Image?](https://youtu.be/oUJs03eZ0S8?si=tyYuoZIt091fhbYp)
14 | [Image Reading](https://youtu.be/wRtAoZF50Jc?si=m7RWW3Qy7pQxYaed)
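You can see this representation directly by building a tiny image by hand (a made-up 2x2 example):

```python
import numpy as np

# A 2x2 grayscale "image": one uint8 intensity (0-255) per pixel.
gray = np.array([[0, 64],
                 [128, 255]], dtype=np.uint8)
print(gray.shape)   # (2, 2) -> (height, width)
print(gray.max())   # 255 = white, 0 = black

# A color image adds a channel axis: (height, width, 3).
color = np.zeros((2, 2, 3), dtype=np.uint8)
color[0, 0] = [255, 0, 0]   # one pixel set to a pure channel value
print(color.shape)  # (2, 2, 3)
```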
15 |
16 | ## Image Color Channels
17 | For a grayscale image, it's a 2D array with dimensions (height, width), and for a color image, it's a 3D array with dimensions (height, width, channels), where channels typically represent RGB values.
18 | >[!Note]
19 | >OpenCV reads image as BGR not RGB
20 |
21 | [BGR2GRAY](https://youtu.be/AFrZ3JOQ0Qg?si=wcdHgOTnidg7bHYV)
22 | [BGR Channels](https://youtu.be/wlH9w1eA6PQ?si=AFraoDNEW-3YYJB1)
23 | [BGR vs RGB](https://youtu.be/kSqxn6zGE0c?si=_ZK8MVWV5SLiJWi1&t=581) (9:41 to 11:50)
24 | [HSV](https://youtu.be/eDIj5LuIL4A?si=ex0w92Y_b4CC8nbq&t=7771) (2:09:31 to 2:29:00)
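Because OpenCV loads images in BGR order, a common step before displaying with Matplotlib is converting to RGB (e.g. with `cv2.cvtColor(img, cv2.COLOR_BGR2RGB)`). The same swap can be done with plain NumPy slicing, shown here on a made-up 1x2 array:

```python
import numpy as np

# Toy 1x2 "BGR image": pixel 0 is pure blue, pixel 1 is pure red (in BGR order).
bgr = np.array([[[255, 0, 0], [0, 0, 255]]], dtype=np.uint8)

# Reversing the last axis converts BGR <-> RGB.
rgb = bgr[..., ::-1]

print(rgb[0, 0])  # [0, 0, 255] -> the blue pixel expressed in RGB
print(rgb[0, 1])  # [255, 0, 0] -> the red pixel expressed in RGB
```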
25 |
26 | ## Playing with Images
27 |
28 | Image cropping, resizing, rotation and flipping are necessary for standardizing input sizes, enhancing model training, and augmenting data to improve the robustness and performance of deep learning models. Drawing bounding boxes is essential for object detection projects.
29 |
30 | [Resizing](https://youtu.be/DPkpI2ezVO4?si=-xlW6J5h0TW8Fnh5)
31 | [Flipping](https://youtu.be/Y_78ARbpSwo?si=iCVqHqxZdX-3-lCp)
32 | [Cropping](https://youtu.be/fanEPKLRbPk?si=E4yGRKhyIByJq7ov)
33 | [Rotation](https://youtu.be/MtHvL1emJSE?si=uzGlA-SX9vedrJfL)
34 | [Drawing Shapes](https://youtu.be/shfXj_Og7ak?si=Yv7_qiBL7IVulHQl)
35 |
36 | ## Blurring and Noise Removal
37 |
38 | Blurring and noise removal are essential preprocessing steps in image processing. Blurring, using techniques like Gaussian blur, smooths the image, reducing detail and noise. Noise removal enhances image quality by eliminating random variations, ensuring clearer, more accurate inputs for deep learning models.
39 |
40 | [Blurring](https://youtu.be/eDIj5LuIL4A?si=uiqeiB6PriZciqRz&t=3099) (51:39 to 1:07:00)
41 |
42 | ## Threshold and Edge Detection
43 |
44 | Thresholding converts grayscale images to binary by setting pixel intensity limits, simplifying analysis. Edge detection, using methods like Canny or Sobel, identifies boundaries within images, highlighting significant structural information essential for feature extraction in deep learning models.
45 |
46 | [Threshold and Edge Detection](https://youtu.be/eDIj5LuIL4A?si=RtmvrTYhClNw_6b4&t=4026) (1:07:06 to 1:31:30)
47 |
48 | ## Contours
49 |
50 | **Contours:**
51 | Contours are *curves joining continuous points along the boundary of an object* in an image. They represent the shape and structure of objects present in the image, making them useful for tasks such as object detection, shape analysis, and recognition.
52 |
53 | **Edge Detection:**
54 | Edge detection algorithms aim to identify points in an image where the *brightness changes abruptly*. These points typically correspond to boundaries between objects or significant changes in texture within the image.
55 |
56 | [Contours Detection](https://youtu.be/eDIj5LuIL4A?si=-FRfiF9eB5D0z--_&t=6317) (1:45:17 to 2:01:20)
57 |
58 | ## Image Saving
59 |
60 | After processing, save your images to disk so the results can be reused without re-running the whole pipeline.
61 |
62 | [Image Saving](https://youtu.be/b_vVNCVDrbw?si=iaix9mp4M9mSzzi6)
63 |
--------------------------------------------------------------------------------
/Week 3/README.md:
--------------------------------------------------------------------------------
1 | ## Welcome to Week 3
2 | Welcome to the third week of our **Machine Learning Course!** This week, we'll dive into the world of Convolutional Neural Networks (CNNs) and image preprocessing using OpenCV. We'll explore foundational concepts of CNNs, emphasizing image preprocessing techniques such as resizing, normalization, blurring, and noise removal using OpenCV. Through practical exercises on various datasets, we'll enhance our understanding and skills in building and training CNN models.
3 |
4 | Stay engaged and practice consistently to make the most of this learning experience. Keep your enthusiasm high, and let’s continue this exciting journey of machine learning together!
5 |
6 | 1. Let's start with [OpenCV](./Image%20Preprocessing%20-%20OpenCV), essential for preprocessing the raw image data.
7 | 2. Now it's time to explore [Convolutional Neural Network](./CNN), the most efficient NN to deal with image data.
8 | 3. Let's put our knowledge to the test and tackle the [Assignments](./Assignment).
9 |
10 | **Congratulations** on completing the third week! We hope you've gained valuable insights that have sparked your enthusiasm and interest in machine learning. Keep your enthusiasm high, and we are excited to see you in the next week.
11 |
--------------------------------------------------------------------------------