├── 0_introduction
│   └── lessons
│       └── 0. Introduction.pdf
├── 1_python
│   ├── client_data.csv
│   ├── intro.py
│   └── lessons
│       └── 1. Python Crash Course.pdf
├── 2_data_preprocessing
│   ├── client_data.csv
│   ├── data_preprocessing.py
│   └── lessons
│       └── 3. Data Preprocessing.pdf
├── 3_supervised_ml
│   ├── classification
│   │   ├── Classification Course.ipynb
│   │   ├── Social_Network_Ads.csv
│   │   └── Wine.csv
│   ├── lessons
│   │   ├── 4. Supervised Learning - Classification.pdf
│   │   └── 4. Supervised Learning - Regression.pdf
│   └── regression
│       ├── 50_Startups.csv
│       ├── Regression Practical.ipynb
│       ├── Salary_Data.csv
│       ├── data_2d.csv
│       ├── data_poly.csv
│       ├── diamonds.csv
│       ├── height_weight.csv
│       └── height_weight_small.csv
├── 4_model_evaluation
│   └── lessons
│       └── 6. Evaluation Metrics.pdf
├── 5_unsupervised_ml
│   └── Mall_Customers.csv
├── 6_deep_learning
│   └── Churn_Modelling.csv
├── MLBC Syllabus AR.pdf
├── MLBC Syllabus EN.pdf
├── README.md
└── course_image.png
/0_introduction/lessons/0. Introduction.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Younes-Charfaoui/Machine-Learning-Beginner-Course/b39d9dea1d7f79c1b14590ea338a33d47e9ca49e/0_introduction/lessons/0. Introduction.pdf
--------------------------------------------------------------------------------
/1_python/client_data.csv:
--------------------------------------------------------------------------------
1 | Country,Age,Salary,Purchased
2 | Algeria,44,72000,No
3 | Tunis,27,48000,Yes
4 | Morocco,30,54000,No
5 | Tunis,38,61000,No
6 | Morocco,40,,Yes
7 | Algeria,35,58000,Yes
8 | Tunis,,52000,No
9 | Algeria,48,79000,Yes
10 | Morocco,50,83000,No
11 | Algeria,37,67000,Yes
12 | Tunis,22,50000,No
13 | Algeria,33,53000,No
14 | Morocco,44,47000,Yes
15 |
--------------------------------------------------------------------------------
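client_data.csv above has two gaps: a missing Salary in the Morocco/40 row and a missing Age in a Tunis row. A minimal pandas sketch to confirm this (the CSV is inlined here so the snippet is self-contained; in the course you would simply `pd.read_csv('client_data.csv')`):

```python
import io
import pandas as pd

# Inline copy of client_data.csv, same rows as the file above.
csv_text = """Country,Age,Salary,Purchased
Algeria,44,72000,No
Tunis,27,48000,Yes
Morocco,30,54000,No
Tunis,38,61000,No
Morocco,40,,Yes
Algeria,35,58000,Yes
Tunis,,52000,No
Algeria,48,79000,Yes
Morocco,50,83000,No
Algeria,37,67000,Yes
Tunis,22,50000,No
Algeria,33,53000,No
Morocco,44,47000,Yes
"""

data = pd.read_csv(io.StringIO(csv_text))
print(data.shape)           # (13, 4)
print(data.isna().sum())    # Age and Salary each have exactly one missing value
```

These two missing cells are what the imputation step in 2_data_preprocessing exists to handle.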
/1_python/intro.py:
--------------------------------------------------------------------------------
1 |
2 |
3 |
--------------------------------------------------------------------------------
/1_python/lessons/1. Python Crash Course.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Younes-Charfaoui/Machine-Learning-Beginner-Course/b39d9dea1d7f79c1b14590ea338a33d47e9ca49e/1_python/lessons/1. Python Crash Course.pdf
--------------------------------------------------------------------------------
/2_data_preprocessing/client_data.csv:
--------------------------------------------------------------------------------
1 | Country,Age,Salary,Purchased
2 | Algeria,44,72000,No
3 | Tunis,27,48000,Yes
4 | Morocco,30,54000,No
5 | Tunis,38,61000,No
6 | Morocco,40,,Yes
7 | Algeria,35,58000,Yes
8 | Tunis,,52000,No
9 | Algeria,48,79000,Yes
10 | Morocco,50,83000,No
11 | Algeria,37,67000,Yes
12 | Tunis,22,50000,No
13 | Algeria,33,53000,No
14 | Morocco,44,47000,Yes
15 |
--------------------------------------------------------------------------------
/2_data_preprocessing/data_preprocessing.py:
--------------------------------------------------------------------------------
1 | # Importing the libraries.
2 | import pandas as pd
3 | import matplotlib.pyplot as plt
4 | import numpy as np
5 |
6 | # Importing the dataset.
7 | data = pd.read_csv('client_data.csv')
8 | X = data.iloc[:, :3].values
9 | y = data.iloc[:, -1].values
10 |
11 | # Handle missing data (SimpleImputer replaced the removed Imputer in scikit-learn 0.22+).
12 | from sklearn.impute import SimpleImputer
13 | im = SimpleImputer(strategy='mean')
14 | X[:, 1:3] = im.fit_transform(X[:, 1:3])
15 |
16 | # Encode the data.
17 | from sklearn.preprocessing import LabelEncoder, OneHotEncoder
18 | from sklearn.compose import ColumnTransformer
19 | label_y = LabelEncoder()
20 | y = label_y.fit_transform(y)
21 |
22 | # One-hot encode the Country column (the categorical_features argument was removed;
23 | # use a ColumnTransformer instead).
24 | ct = ColumnTransformer([('country', OneHotEncoder(), [0])], remainder='passthrough')
25 | X = ct.fit_transform(X)
26 |
27 | # Train/Test Splitting (split before scaling so the scaler never sees the test set).
28 | from sklearn.model_selection import train_test_split
29 | x_train, x_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=2424)
30 |
31 | # Feature Scaling: fit on the training set only, then apply the same scaling to the test set.
32 | from sklearn.preprocessing import StandardScaler
33 | sc = StandardScaler()
34 | x_train = sc.fit_transform(x_train)
35 | x_test = sc.transform(x_test)
36 |
--------------------------------------------------------------------------------
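The steps in data_preprocessing.py (impute, encode, scale) can also be bundled into a single scikit-learn ColumnTransformer, which keeps each column's treatment in one place. This is a sketch on a tiny stand-in DataFrame with the same columns and missing values as client_data.csv, not the course's own code:

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Tiny stand-in for client_data.csv: same schema, one missing Age and one missing Salary.
data = pd.DataFrame({
    'Country': ['Algeria', 'Tunis', 'Morocco', 'Tunis'],
    'Age': [44.0, 27.0, np.nan, 38.0],
    'Salary': [72000.0, np.nan, 54000.0, 61000.0],
    'Purchased': ['No', 'Yes', 'No', 'No'],
})
X = data[['Country', 'Age', 'Salary']]

# Numeric columns: mean-impute, then standardize. Categorical column: one-hot encode.
numeric = Pipeline([
    ('impute', SimpleImputer(strategy='mean')),
    ('scale', StandardScaler()),
])
pre = ColumnTransformer([
    ('num', numeric, ['Age', 'Salary']),
    ('cat', OneHotEncoder(), ['Country']),
])
Xt = pre.fit_transform(X)
print(Xt.shape)  # (4, 5): 2 scaled numeric columns + 3 country indicator columns
```

Fitting the transformer only on the training split (and calling `transform` on the test split) gives the same leak-free behaviour as the script above, with less bookkeeping.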
/2_data_preprocessing/lessons/3. Data Preprocessing.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Younes-Charfaoui/Machine-Learning-Beginner-Course/b39d9dea1d7f79c1b14590ea338a33d47e9ca49e/2_data_preprocessing/lessons/3. Data Preprocessing.pdf
--------------------------------------------------------------------------------
/3_supervised_ml/classification/Classification Course.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import pandas as pd\n",
10 | "import numpy as np\n",
11 | "import matplotlib.pyplot as plt\n",
12 | "import seaborn as sns\n"
13 | ]
14 | },
15 | {
16 | "cell_type": "code",
17 | "execution_count": 2,
18 | "metadata": {},
19 | "outputs": [],
20 | "source": [
21 | "data = pd.read_csv('Wine.csv')"
22 | ]
23 | },
24 | {
25 | "cell_type": "code",
26 | "execution_count": 32,
27 | "metadata": {},
28 | "outputs": [
29 | {
30 | "data": {
31 | "text/plain": [
32 | "(178, 14)"
33 | ]
34 | },
35 | "execution_count": 32,
36 | "metadata": {},
37 | "output_type": "execute_result"
38 | }
39 | ],
40 | "source": [
41 | "data.shape"
42 | ]
43 | },
44 | {
45 | "cell_type": "code",
46 | "execution_count": 3,
47 | "metadata": {},
48 | "outputs": [
49 | {
50 | "data": {
176 | "text/plain": [
177 | " Alcohol Malic_Acid Ash Ash_Alcanity Magnesium Total_Phenols \\\n",
178 | "0 14.23 1.71 2.43 15.6 127 2.80 \n",
179 | "1 13.20 1.78 2.14 11.2 100 2.65 \n",
180 | "2 13.16 2.36 2.67 18.6 101 2.80 \n",
181 | "3 14.37 1.95 2.50 16.8 113 3.85 \n",
182 | "4 13.24 2.59 2.87 21.0 118 2.80 \n",
183 | "\n",
184 | " Flavanoids Nonflavanoid_Phenols Proanthocyanins Color_Intensity Hue \\\n",
185 | "0 3.06 0.28 2.29 5.64 1.04 \n",
186 | "1 2.76 0.26 1.28 4.38 1.05 \n",
187 | "2 3.24 0.30 2.81 5.68 1.03 \n",
188 | "3 3.49 0.24 2.18 7.80 0.86 \n",
189 | "4 2.69 0.39 1.82 4.32 1.04 \n",
190 | "\n",
191 | " OD280 Proline Customer_Segment \n",
192 | "0 3.92 1065 1 \n",
193 | "1 3.40 1050 1 \n",
194 | "2 3.17 1185 1 \n",
195 | "3 3.45 1480 1 \n",
196 | "4 2.93 735 1 "
197 | ]
198 | },
199 | "execution_count": 3,
200 | "metadata": {},
201 | "output_type": "execute_result"
202 | }
203 | ],
204 | "source": [
205 | "data.head()"
206 | ]
207 | },
208 | {
209 | "cell_type": "code",
210 | "execution_count": 4,
211 | "metadata": {},
212 | "outputs": [
213 | {
214 | "data": {
391 | "text/plain": [
392 | " Alcohol Malic_Acid Ash Ash_Alcanity Magnesium \\\n",
393 | "count 178.000000 178.000000 178.000000 178.000000 178.000000 \n",
394 | "mean 13.000618 2.336348 2.366517 19.494944 99.741573 \n",
395 | "std 0.811827 1.117146 0.274344 3.339564 14.282484 \n",
396 | "min 11.030000 0.740000 1.360000 10.600000 70.000000 \n",
397 | "25% 12.362500 1.602500 2.210000 17.200000 88.000000 \n",
398 | "50% 13.050000 1.865000 2.360000 19.500000 98.000000 \n",
399 | "75% 13.677500 3.082500 2.557500 21.500000 107.000000 \n",
400 | "max 14.830000 5.800000 3.230000 30.000000 162.000000 \n",
401 | "\n",
402 | " Total_Phenols Flavanoids Nonflavanoid_Phenols Proanthocyanins \\\n",
403 | "count 178.000000 178.000000 178.000000 178.000000 \n",
404 | "mean 2.295112 2.029270 0.361854 1.590899 \n",
405 | "std 0.625851 0.998859 0.124453 0.572359 \n",
406 | "min 0.980000 0.340000 0.130000 0.410000 \n",
407 | "25% 1.742500 1.205000 0.270000 1.250000 \n",
408 | "50% 2.355000 2.135000 0.340000 1.555000 \n",
409 | "75% 2.800000 2.875000 0.437500 1.950000 \n",
410 | "max 3.880000 5.080000 0.660000 3.580000 \n",
411 | "\n",
412 | " Color_Intensity Hue OD280 Proline Customer_Segment \n",
413 | "count 178.000000 178.000000 178.000000 178.000000 178.000000 \n",
414 | "mean 5.058090 0.957449 2.611685 746.893258 1.938202 \n",
415 | "std 2.318286 0.228572 0.709990 314.907474 0.775035 \n",
416 | "min 1.280000 0.480000 1.270000 278.000000 1.000000 \n",
417 | "25% 3.220000 0.782500 1.937500 500.500000 1.000000 \n",
418 | "50% 4.690000 0.965000 2.780000 673.500000 2.000000 \n",
419 | "75% 6.200000 1.120000 3.170000 985.000000 3.000000 \n",
420 | "max 13.000000 1.710000 4.000000 1680.000000 3.000000 "
421 | ]
422 | },
423 | "execution_count": 4,
424 | "metadata": {},
425 | "output_type": "execute_result"
426 | }
427 | ],
428 | "source": [
429 | "data.describe()"
430 | ]
431 | },
432 | {
433 | "cell_type": "code",
434 | "execution_count": 5,
435 | "metadata": {},
436 | "outputs": [
437 | {
438 | "data": {
439 | "text/plain": [
440 | "array([1, 2, 3], dtype=int64)"
441 | ]
442 | },
443 | "execution_count": 5,
444 | "metadata": {},
445 | "output_type": "execute_result"
446 | }
447 | ],
448 | "source": [
449 | "data.Customer_Segment.unique()"
450 | ]
451 | },
452 | {
453 | "cell_type": "code",
454 | "execution_count": 6,
455 | "metadata": {},
456 | "outputs": [
457 | {
458 | "data": {
459 | "text/plain": [
460 | ""
461 | ]
462 | },
463 | "execution_count": 6,
464 | "metadata": {},
465 | "output_type": "execute_result"
466 | },
467 | {
468 | "data": {
470 | "text/plain": [
471 | ""
472 | ]
473 | },
474 | "metadata": {
475 | "needs_background": "light"
476 | },
477 | "output_type": "display_data"
478 | }
479 | ],
480 | "source": [
481 | "sns.countplot(data.Customer_Segment)"
482 | ]
483 | },
484 | {
485 | "cell_type": "code",
486 | "execution_count": 7,
487 | "metadata": {},
488 | "outputs": [],
489 | "source": [
490 | "from sklearn.model_selection import train_test_split"
491 | ]
492 | },
493 | {
494 | "cell_type": "code",
495 | "execution_count": 8,
496 | "metadata": {},
497 | "outputs": [],
498 | "source": [
499 | "X = data.drop(['Customer_Segment'] , axis = 1)\n",
500 | "y = data.Customer_Segment"
501 | ]
502 | },
503 | {
504 | "cell_type": "code",
505 | "execution_count": 10,
506 | "metadata": {},
507 | "outputs": [],
508 | "source": [
509 | "x_train, x_test, y_train, y_test = train_test_split(X, y, test_size = 0.2)"
510 | ]
511 | },
512 | {
513 | "cell_type": "code",
514 | "execution_count": 11,
515 | "metadata": {},
516 | "outputs": [],
517 | "source": [
518 | "from sklearn.tree import DecisionTreeClassifier"
519 | ]
520 | },
521 | {
522 | "cell_type": "code",
523 | "execution_count": 28,
524 | "metadata": {},
525 | "outputs": [],
526 | "source": [
527 | "clf = DecisionTreeClassifier(max_depth = 7)"
528 | ]
529 | },
530 | {
531 | "cell_type": "code",
532 | "execution_count": 29,
533 | "metadata": {},
534 | "outputs": [
535 | {
536 | "data": {
537 | "text/plain": [
538 | "DecisionTreeClassifier(class_weight=None, criterion='gini', max_depth=8,\n",
539 | " max_features=None, max_leaf_nodes=None,\n",
540 | " min_impurity_decrease=0.0, min_impurity_split=None,\n",
541 | " min_samples_leaf=1, min_samples_split=2,\n",
542 | " min_weight_fraction_leaf=0.0, presort=False,\n",
543 | " random_state=None, splitter='best')"
544 | ]
545 | },
546 | "execution_count": 29,
547 | "metadata": {},
548 | "output_type": "execute_result"
549 | }
550 | ],
551 | "source": [
552 | "clf.fit(x_train, y_train)"
553 | ]
554 | },
555 | {
556 | "cell_type": "code",
557 | "execution_count": 30,
558 | "metadata": {},
559 | "outputs": [],
560 | "source": [
561 | "y_pred = clf.predict(x_test)"
562 | ]
563 | },
564 | {
565 | "cell_type": "code",
566 | "execution_count": 31,
567 | "metadata": {},
568 | "outputs": [
569 | {
570 | "data": {
571 | "text/plain": [
572 | "0.8888888888888888"
573 | ]
574 | },
575 | "execution_count": 31,
576 | "metadata": {},
577 | "output_type": "execute_result"
578 | }
579 | ],
580 | "source": [
581 | "(y_pred == y_test).mean()"
582 | ]
583 | },
584 | {
585 | "cell_type": "code",
586 | "execution_count": 26,
587 | "metadata": {},
588 | "outputs": [],
589 | "source": [
590 | "y_pred = clf.predict(x_train)"
591 | ]
592 | },
593 | {
594 | "cell_type": "code",
595 | "execution_count": 27,
596 | "metadata": {},
597 | "outputs": [
598 | {
599 | "data": {
600 | "text/plain": [
601 | "1.0"
602 | ]
603 | },
604 | "execution_count": 27,
605 | "metadata": {},
606 | "output_type": "execute_result"
607 | }
608 | ],
609 | "source": [
610 | "(y_pred == y_train).mean()"
611 | ]
612 | },
613 | {
614 | "cell_type": "code",
615 | "execution_count": 33,
616 | "metadata": {},
617 | "outputs": [],
618 | "source": [
619 | "from sklearn.linear_model import LogisticRegression"
620 | ]
621 | },
622 | {
623 | "cell_type": "code",
624 | "execution_count": 34,
625 | "metadata": {},
626 | "outputs": [
627 | {
628 | "name": "stderr",
629 | "output_type": "stream",
630 | "text": [
631 | "C:\\Anaconda\\Anaconda\\lib\\site-packages\\sklearn\\linear_model\\logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n",
632 | " FutureWarning)\n",
633 | "C:\\Anaconda\\Anaconda\\lib\\site-packages\\sklearn\\linear_model\\logistic.py:469: FutureWarning: Default multi_class will be changed to 'auto' in 0.22. Specify the multi_class option to silence this warning.\n",
634 | " \"this warning.\", FutureWarning)\n"
635 | ]
636 | }
637 | ],
638 | "source": [
639 | "clf = LogisticRegression()\n",
640 | "clf.fit(x_train, y_train)\n",
641 | "y_pred = clf.predict(x_test)"
642 | ]
643 | },
644 | {
645 | "cell_type": "code",
646 | "execution_count": 35,
647 | "metadata": {},
648 | "outputs": [
649 | {
650 | "data": {
651 | "text/plain": [
652 | "0.9722222222222222"
653 | ]
654 | },
655 | "execution_count": 35,
656 | "metadata": {},
657 | "output_type": "execute_result"
658 | }
659 | ],
660 | "source": [
661 | "(y_pred == y_test).mean()"
662 | ]
663 | },
664 | {
665 | "cell_type": "code",
666 | "execution_count": 36,
667 | "metadata": {},
668 | "outputs": [],
669 | "source": [
670 | "from sklearn.svm import SVC"
671 | ]
672 | },
673 | {
674 | "cell_type": "code",
675 | "execution_count": 43,
676 | "metadata": {},
677 | "outputs": [
678 | {
679 | "data": {
680 | "text/plain": [
681 | "SVC(C=1.0, cache_size=200, class_weight=None, coef0=0.0,\n",
682 | " decision_function_shape='ovr', degree=3, gamma='auto_deprecated',\n",
683 | " kernel='linear', max_iter=-1, probability=False, random_state=None,\n",
684 | " shrinking=True, tol=0.001, verbose=False)"
685 | ]
686 | },
687 | "execution_count": 43,
688 | "metadata": {},
689 | "output_type": "execute_result"
690 | }
691 | ],
692 | "source": [
693 | "clf = SVC(kernel='linear')\n",
694 | "clf.fit(x_train, y_train)"
695 | ]
696 | },
697 | {
698 | "cell_type": "code",
699 | "execution_count": 44,
700 | "metadata": {},
701 | "outputs": [
702 | {
703 | "data": {
704 | "text/plain": [
705 | "0.9444444444444444"
706 | ]
707 | },
708 | "execution_count": 44,
709 | "metadata": {},
710 | "output_type": "execute_result"
711 | }
712 | ],
713 | "source": [
714 | "y_pred = clf.predict(x_test)\n",
715 | "(y_pred == y_test).mean()"
716 | ]
717 | },
718 | {
719 | "cell_type": "code",
720 | "execution_count": 45,
721 | "metadata": {},
722 | "outputs": [],
723 | "source": [
724 | "from sklearn.naive_bayes import GaussianNB"
725 | ]
726 | },
727 | {
728 | "cell_type": "code",
729 | "execution_count": 46,
730 | "metadata": {},
731 | "outputs": [
732 | {
733 | "data": {
734 | "text/plain": [
735 | "GaussianNB(priors=None, var_smoothing=1e-09)"
736 | ]
737 | },
738 | "execution_count": 46,
739 | "metadata": {},
740 | "output_type": "execute_result"
741 | }
742 | ],
743 | "source": [
744 | "clf = GaussianNB()\n",
745 | "clf.fit(x_train, y_train)"
746 | ]
747 | },
748 | {
749 | "cell_type": "code",
750 | "execution_count": 47,
751 | "metadata": {},
752 | "outputs": [
753 | {
754 | "data": {
755 | "text/plain": [
756 | "0.9722222222222222"
757 | ]
758 | },
759 | "execution_count": 47,
760 | "metadata": {},
761 | "output_type": "execute_result"
762 | }
763 | ],
764 | "source": [
765 | "y_pred = clf.predict(x_test)\n",
766 | "(y_pred == y_test).mean()"
767 | ]
768 | },
769 | {
770 | "cell_type": "code",
771 | "execution_count": 48,
772 | "metadata": {},
773 | "outputs": [],
774 | "source": [
775 | "from sklearn.neighbors import KNeighborsClassifier"
776 | ]
777 | },
778 | {
779 | "cell_type": "code",
780 | "execution_count": 51,
781 | "metadata": {},
782 | "outputs": [],
783 | "source": [
784 | "from sklearn.preprocessing import StandardScaler\n",
785 | "std = StandardScaler()\n",
786 | "x_train = std.fit_transform(x_train)\n",
787 | "x_test = std.transform(x_test)"
788 | ]
789 | },
790 | {
791 | "cell_type": "code",
792 | "execution_count": 52,
793 | "metadata": {},
794 | "outputs": [
795 | {
796 | "data": {
797 | "text/plain": [
798 | "KNeighborsClassifier(algorithm='auto', leaf_size=30, metric='minkowski',\n",
799 | " metric_params=None, n_jobs=None, n_neighbors=5, p=2,\n",
800 | " weights='uniform')"
801 | ]
802 | },
803 | "execution_count": 52,
804 | "metadata": {},
805 | "output_type": "execute_result"
806 | }
807 | ],
808 | "source": [
809 | "clf = KNeighborsClassifier()\n",
810 | "clf.fit(x_train, y_train)"
811 | ]
812 | },
813 | {
814 | "cell_type": "code",
815 | "execution_count": null,
816 | "metadata": {},
817 | "outputs": [],
818 | "source": []
819 | },
820 | {
821 | "cell_type": "code",
822 | "execution_count": 53,
823 | "metadata": {},
824 | "outputs": [
825 | {
826 | "data": {
827 | "text/plain": [
828 | "0.9722222222222222"
829 | ]
830 | },
831 | "execution_count": 53,
832 | "metadata": {},
833 | "output_type": "execute_result"
834 | }
835 | ],
836 | "source": [
837 | "y_pred = clf.predict(x_test)\n",
838 | "(y_pred == y_test).mean()"
839 | ]
840 | },
841 | {
842 | "cell_type": "code",
843 | "execution_count": 54,
844 | "metadata": {},
845 | "outputs": [
846 | {
847 | "data": {
1864 | "text/plain": [
1865 | " 0 1 2 3 4 5 6 \\\n",
1866 | "0 1.353998 0.202060 0.397610 0.148821 -0.612437 -0.997888 -1.357240 \n",
1867 | "1 0.998013 -0.724401 1.134897 1.616358 -0.957815 1.005442 0.786499 \n",
1868 | "2 1.603188 -0.316758 0.475219 -0.790403 0.907225 2.425125 1.411756 \n",
1869 | "3 0.642027 -0.539109 -0.262067 -0.966507 1.183527 1.320927 1.213262 \n",
1870 | "4 1.092942 -0.520579 -0.417286 -0.614299 0.561847 0.895022 1.461380 \n",
1871 | "5 0.582696 -0.455727 1.057288 -0.144687 0.699998 0.058987 0.458983 \n",
1872 | "6 -1.209097 1.054403 -1.465009 -0.144687 -0.888740 -0.493112 -0.424317 \n",
1873 | "7 0.974281 1.693661 0.009565 0.002067 -0.750589 -0.808597 -1.228219 \n",
1874 | "8 2.054103 -0.520579 0.048369 -2.375343 -0.612437 1.242056 1.610250 \n",
1875 | "9 0.036852 -0.520579 -1.309791 -2.081836 -0.543362 0.642634 1.193412 \n",
1876 | "10 0.843753 1.916011 -0.456090 0.882589 -0.819664 -1.628858 -1.585508 \n",
1877 | "11 -0.959907 -1.020868 -2.435123 -0.790403 3.532096 -0.729726 -0.781606 \n",
1878 | "12 1.615055 -0.390875 0.009565 -2.199239 0.147394 1.557541 1.560627 \n",
1879 | "13 -1.422689 -0.761459 -1.503814 0.383627 -0.957815 -0.524661 -0.464015 \n",
1880 | "14 0.214845 -0.474256 -0.921745 -2.434045 0.009243 0.532214 0.687252 \n",
1881 | "15 -1.624414 -0.576167 0.940874 1.909865 -0.819664 -0.619306 -0.454091 \n",
1882 | "16 -0.390331 -0.705871 -0.456090 0.354276 -1.372269 -1.471116 -0.602961 \n",
1883 | "17 -1.909202 -1.437775 0.475219 0.442328 -0.819664 0.264052 -0.057102 \n",
1884 | "18 -1.114168 -0.835576 0.475219 0.882589 -1.095966 0.390246 0.220790 \n",
1885 | "19 -0.592056 0.127943 -0.805331 0.442328 -0.819664 0.374472 0.200941 \n",
1886 | "20 0.796288 0.896905 0.630438 0.148821 0.492771 -0.761274 -1.496186 \n",
1887 | "21 0.013120 -1.289541 -2.590342 -1.025209 -0.957815 -0.571983 -0.037253 \n",
1888 | "22 -0.675119 0.683819 1.018483 2.203373 -0.197984 -0.650855 -1.476337 \n",
1889 | "23 1.615055 1.202637 -0.378481 -1.025209 0.147394 1.478670 1.094165 \n",
1890 | "24 1.520125 -0.344552 1.328920 0.148821 1.390754 0.768828 1.064391 \n",
1891 | "25 0.998013 -0.872634 -0.417286 -1.025209 -0.128909 1.052765 1.074316 \n",
1892 | "26 0.867485 -0.520579 0.125978 -1.025209 -0.750589 0.453343 0.687252 \n",
1893 | "27 -0.176739 2.147626 0.397610 0.589082 -0.957815 -0.966340 -1.406864 \n",
1894 | "28 -0.295401 1.054403 -1.542618 -1.025209 -1.372269 -1.076760 -0.811381 \n",
1895 | "29 1.045478 -0.566902 -0.999354 -1.025209 0.078318 1.242056 1.312509 \n",
1896 | ".. ... ... ... ... ... ... ... \n",
1897 | "112 0.594563 0.813523 1.328920 1.176097 -0.197984 -1.202954 -1.535885 \n",
1898 | "113 1.567590 -0.566902 1.251311 1.616358 -0.128909 0.768828 -0.751832 \n",
1899 | "114 -0.698851 -0.705871 -0.339677 0.589082 -0.957815 0.674183 1.074316 \n",
1900 | "115 2.149033 -0.603961 -0.805331 -1.612224 -0.197984 0.768828 0.905595 \n",
1901 | "116 -1.126034 -0.122201 -0.805331 0.442328 -1.026891 0.453343 0.578080 \n",
1902 | "117 1.425196 -0.548373 -0.300872 -0.937157 1.252602 1.399798 0.925445 \n",
1903 | "118 -0.188605 -0.872634 -0.223263 -0.438194 1.528905 -1.266051 -0.811381 \n",
1904 | "119 -0.414063 -1.215425 -0.533699 -0.438194 -0.059833 -0.177627 -0.126575 \n",
1905 | "120 -1.636280 -0.214847 0.320001 0.618433 -1.095966 -0.571983 -0.374693 \n",
1906 | "121 0.452168 1.499104 0.397610 1.029343 0.147394 -0.808597 -1.307616 \n",
1907 | "122 0.167380 0.063091 1.134897 -0.262090 0.078318 0.768828 1.163638 \n",
1908 | "123 0.962414 -0.594696 0.863265 -0.673000 -0.405211 0.216729 0.915520 \n",
1909 | "124 -0.770048 -1.002339 0.708047 -0.408843 -0.128909 0.169407 0.578080 \n",
1910 | "125 -0.864978 -0.631755 -0.650113 0.266224 0.216469 -1.912795 -1.039649 \n",
1911 | "126 0.772555 -0.965280 -1.775446 -0.438194 -0.405211 -0.335370 -0.275446 \n",
1912 | "127 0.665760 0.266912 1.212506 1.469604 0.354620 -1.202954 -1.218294 \n",
1913 | "128 -0.247936 0.016768 0.087174 1.322850 -0.128909 -1.833924 -0.970176 \n",
1914 | "129 -0.354732 -0.502050 -0.378481 0.882589 -1.095966 -1.471116 -0.305220 \n",
1915 | "130 -0.378464 1.462046 0.087174 1.029343 0.078318 0.816151 0.478833 \n",
1916 | "131 -0.603922 -0.974545 -0.494895 -0.584948 -1.026891 -0.493112 -1.476337 \n",
1917 | "132 -1.849871 1.341606 -2.163491 0.002067 0.492771 1.368250 0.508607 \n",
1918 | "133 0.072451 1.489840 -0.068045 0.589082 0.907225 -1.423793 -0.672434 \n",
1919 | "134 -0.200472 -0.641019 0.552828 -0.496896 -0.336135 0.264052 0.300188 \n",
1920 | "135 0.903083 0.433675 -0.300872 0.735835 -0.681513 -1.518439 -1.377089 \n",
1921 | "136 1.425196 1.573221 0.514024 -1.847030 1.943358 1.084313 0.965144 \n",
1922 | "137 -0.864978 0.498527 -0.611309 -0.438194 -0.819664 0.216729 0.181091 \n",
1923 | "138 -0.449662 -0.863370 -1.387400 -0.790403 0.009243 -0.461564 -0.652585 \n",
1924 | "139 1.187872 -0.566902 -0.650113 -1.025209 -0.267060 0.532214 0.260489 \n",
1925 | "140 1.021745 2.546004 -0.572504 0.148821 -1.372269 -2.102086 -1.714530 \n",
1926 | "141 -0.948041 -0.928222 -1.697837 -0.144687 -0.543362 0.074761 -0.027328 \n",
1927 | "\n",
1928 | " 7 8 9 10 11 12 \n",
1929 | "0 0.652892 -0.704687 1.930695 -1.481679 -1.278588 -0.287348 \n",
1930 | "1 -1.214310 0.423266 -0.753014 1.756836 0.744165 -1.080386 \n",
1931 | "2 -0.970762 0.978291 1.123884 -0.431350 1.148716 2.313817 \n",
1932 | "3 -0.158935 1.264756 0.414739 -0.037477 1.051066 0.140893 \n",
1933 | "4 -0.321300 0.799251 1.590985 0.706507 0.660465 1.615943 \n",
1934 | "5 -0.564849 -0.167567 -0.404811 0.618979 0.339614 1.092538 \n",
1935 | "6 0.084613 0.423266 -1.644752 -0.125004 0.590715 -0.591874 \n",
1936 | "7 0.977622 -0.131759 1.637695 -1.700498 -1.376239 -0.858335 \n",
1937 | "8 0.571709 2.106245 0.104754 1.275435 0.144314 1.267007 \n",
1938 | "9 -1.539041 2.285286 0.869101 0.706507 0.395414 1.267007 \n",
1939 | "10 1.302353 -0.865824 0.622812 -0.781460 -1.222788 -0.731449 \n",
1940 | "11 -1.782589 1.551220 -0.978072 1.406726 0.618615 -0.103363 \n",
1941 | "12 -0.564849 2.374806 0.996493 1.056617 0.520965 2.526351 \n",
1942 | "13 -0.483666 -0.185471 -1.360245 -0.037477 0.981315 -0.810753 \n",
1943 | "14 -0.808397 -0.633071 -0.328376 0.400161 1.078966 0.949791 \n",
1944 | "15 0.328161 -0.525647 -1.084231 1.756836 0.813915 -0.598219 \n",
1945 | "16 1.789449 -0.024334 -0.893144 0.006287 -0.790338 -0.810753 \n",
1946 | "17 0.490526 -0.346607 -0.880405 0.618979 -0.441587 -1.004254 \n",
1947 | "18 0.571709 -1.062768 -0.956840 -0.125004 0.786015 -1.159690 \n",
1948 | "19 -0.808397 -0.740496 -1.339013 -0.256295 0.214064 -1.350019 \n",
1949 | "20 1.139988 -1.492465 0.308580 0.006287 -1.125138 -0.223905 \n",
1950 | "21 -0.970762 -0.310799 -0.234956 1.012853 -0.204437 -1.137485 \n",
1951 | "22 2.195363 -0.883728 0.996493 -1.262861 -1.250688 0.410526 \n",
1952 | "23 -0.727214 0.996195 -0.107565 0.356397 1.134766 0.997374 \n",
1953 | "24 -0.240118 0.602307 0.444464 0.487688 0.032714 1.679386 \n",
1954 | "25 -1.133127 0.387458 0.877594 0.225106 1.288216 0.933931 \n",
1955 | "26 -0.564849 0.315842 0.189681 0.837798 0.381464 1.806273 \n",
1956 | "27 0.896440 -1.385041 1.060188 -1.831789 -1.069338 -0.398373 \n",
1957 | "28 0.571709 -1.438753 -0.744521 -1.131570 -0.706637 -1.200928 \n",
1958 | "29 -1.214310 0.906675 0.402000 -0.212531 0.981315 0.743602 \n",
1959 | ".. ... ... ... ... ... ... \n",
1960 | "112 1.139988 -1.940066 -0.341115 -0.300059 -0.790338 -0.731449 \n",
1961 | "113 1.383536 1.909301 3.331999 -1.700498 -0.929838 -0.287348 \n",
1962 | "114 0.246978 0.244226 -0.515217 -1.175333 0.297764 -1.264371 \n",
1963 | "115 -0.564849 0.620211 0.019826 0.531452 0.311714 0.933931 \n",
1964 | "116 0.084613 -0.507743 -1.016289 -0.431350 0.911565 -1.181895 \n",
1965 | "117 -0.808397 0.709731 0.520899 -0.081240 0.953415 0.696019 \n",
1966 | "118 -1.214310 -1.241808 -0.447275 -0.868987 -1.864489 -0.382512 \n",
1967 | "119 -0.483666 -0.310799 -1.075738 1.187908 0.744165 -0.953500 \n",
1968 | "120 0.977622 -0.507743 -0.999304 0.181342 0.172214 -0.223905 \n",
1969 | "121 0.571709 -0.400319 0.911565 -1.131570 -1.487839 -0.001854 \n",
1970 | "122 -0.483666 2.106245 0.223652 0.312633 0.758115 1.378032 \n",
1971 | "123 -1.133127 1.175236 0.189681 1.231671 1.037115 1.631804 \n",
1972 | "124 0.084613 0.799251 -0.234956 1.012853 -0.455537 -0.230249 \n",
1973 | "125 0.084613 -0.310799 -0.893144 -0.230037 -1.125138 0.378804 \n",
1974 | "126 -0.321300 -1.617793 -0.574666 1.187908 -0.232337 -0.382512 \n",
1975 | "127 0.246978 -0.167567 1.489072 -0.956515 -1.153038 -0.001854 \n",
1976 | "128 -0.727214 -1.438753 0.232145 -1.306625 -1.766839 -0.604563 \n",
1977 | "129 0.977622 -0.006430 -0.786985 -0.343823 -0.288137 -0.832958 \n",
1978 | "130 0.571709 0.566499 -1.096970 1.012853 0.702315 -0.912262 \n",
1979 | "131 1.951815 -0.686783 0.125986 -0.912751 -1.557589 -0.319069 \n",
1980 | "132 -0.970762 3.484856 -0.956840 -0.912751 0.255914 -0.598219 \n",
1981 | "133 -0.158935 -0.883728 1.803304 -1.700498 -1.808689 -0.636285 \n",
1982 | "134 -0.808397 -0.310799 -0.519463 0.575216 1.399816 0.838766 \n",
1983 | "135 0.409344 -1.080672 1.879738 -1.131570 -1.320439 -0.430095 \n",
1984 | "136 -1.295493 0.799251 -0.022637 -0.300059 1.260316 0.029867 \n",
1985 | "137 -0.889579 0.638115 -1.275318 0.837798 0.939465 -1.461044 \n",
1986 | "138 1.383536 -1.814737 0.253377 0.093815 -1.445989 -0.953500 \n",
1987 | "139 -0.808397 0.620211 -0.192492 0.356397 1.344016 0.902209 \n",
1988 | "140 0.328161 -1.707313 -0.107565 -1.656734 -1.808689 -1.064525 \n",
1989 | "141 0.246978 0.799251 -1.041767 -0.431350 0.548865 -1.391257 \n",
1990 | "\n",
1991 | "[142 rows x 13 columns]"
1992 | ]
1993 | },
1994 | "execution_count": 54,
1995 | "metadata": {},
1996 | "output_type": "execute_result"
1997 | }
1998 | ],
1999 | "source": [
2000 | "pd.DataFrame(x_train)"
2001 | ]
2002 | },
2003 | {
2004 | "cell_type": "code",
2005 | "execution_count": null,
2006 | "metadata": {},
2007 | "outputs": [],
2008 | "source": []
2009 | }
2010 | ],
2011 | "metadata": {
2012 | "kernelspec": {
2013 | "display_name": "Python 3",
2014 | "language": "python",
2015 | "name": "python3"
2016 | },
2017 | "language_info": {
2018 | "codemirror_mode": {
2019 | "name": "ipython",
2020 | "version": 3
2021 | },
2022 | "file_extension": ".py",
2023 | "mimetype": "text/x-python",
2024 | "name": "python",
2025 | "nbconvert_exporter": "python",
2026 | "pygments_lexer": "ipython3",
2027 | "version": "3.7.3"
2028 | }
2029 | },
2030 | "nbformat": 4,
2031 | "nbformat_minor": 2
2032 | }
2033 |
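The `pd.DataFrame(x_train)` output above (142 rows x 13 columns, every column centered near 0) is what standardized features look like. A minimal sketch of producing such a matrix, assuming scikit-learn's `StandardScaler` as used in the course; the random matrix here is a stand-in for the notebook's real 13 Wine features:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Hypothetical raw feature matrix standing in for the notebook's x_train;
# the real notebook standardizes the 13 Wine features the same way.
rng = np.random.default_rng(0)
raw = rng.uniform(low=10.0, high=15.0, size=(142, 13))

scaler = StandardScaler()
x_scaled = scaler.fit_transform(raw)  # each column: mean ~0, std ~1

print(x_scaled.shape)  # (142, 13)
```

`fit_transform` learns each column's mean and standard deviation and rescales in one step; on new data you would call `scaler.transform` so the test set is scaled with the training statistics.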
--------------------------------------------------------------------------------
/3_supervised_ml/classification/Social_Network_Ads.csv:
--------------------------------------------------------------------------------
1 | User ID,Gender,Age,EstimatedSalary,Purchased
2 | 15624510,Male,19,19000,0
3 | 15810944,Male,35,20000,0
4 | 15668575,Female,26,43000,0
5 | 15603246,Female,27,57000,0
6 | 15804002,Male,19,76000,0
7 | 15728773,Male,27,58000,0
8 | 15598044,Female,27,84000,0
9 | 15694829,Female,32,150000,1
10 | 15600575,Male,25,33000,0
11 | 15727311,Female,35,65000,0
12 | 15570769,Female,26,80000,0
13 | 15606274,Female,26,52000,0
14 | 15746139,Male,20,86000,0
15 | 15704987,Male,32,18000,0
16 | 15628972,Male,18,82000,0
17 | 15697686,Male,29,80000,0
18 | 15733883,Male,47,25000,1
19 | 15617482,Male,45,26000,1
20 | 15704583,Male,46,28000,1
21 | 15621083,Female,48,29000,1
22 | 15649487,Male,45,22000,1
23 | 15736760,Female,47,49000,1
24 | 15714658,Male,48,41000,1
25 | 15599081,Female,45,22000,1
26 | 15705113,Male,46,23000,1
27 | 15631159,Male,47,20000,1
28 | 15792818,Male,49,28000,1
29 | 15633531,Female,47,30000,1
30 | 15744529,Male,29,43000,0
31 | 15669656,Male,31,18000,0
32 | 15581198,Male,31,74000,0
33 | 15729054,Female,27,137000,1
34 | 15573452,Female,21,16000,0
35 | 15776733,Female,28,44000,0
36 | 15724858,Male,27,90000,0
37 | 15713144,Male,35,27000,0
38 | 15690188,Female,33,28000,0
39 | 15689425,Male,30,49000,0
40 | 15671766,Female,26,72000,0
41 | 15782806,Female,27,31000,0
42 | 15764419,Female,27,17000,0
43 | 15591915,Female,33,51000,0
44 | 15772798,Male,35,108000,0
45 | 15792008,Male,30,15000,0
46 | 15715541,Female,28,84000,0
47 | 15639277,Male,23,20000,0
48 | 15798850,Male,25,79000,0
49 | 15776348,Female,27,54000,0
50 | 15727696,Male,30,135000,1
51 | 15793813,Female,31,89000,0
52 | 15694395,Female,24,32000,0
53 | 15764195,Female,18,44000,0
54 | 15744919,Female,29,83000,0
55 | 15671655,Female,35,23000,0
56 | 15654901,Female,27,58000,0
57 | 15649136,Female,24,55000,0
58 | 15775562,Female,23,48000,0
59 | 15807481,Male,28,79000,0
60 | 15642885,Male,22,18000,0
61 | 15789109,Female,32,117000,0
62 | 15814004,Male,27,20000,0
63 | 15673619,Male,25,87000,0
64 | 15595135,Female,23,66000,0
65 | 15583681,Male,32,120000,1
66 | 15605000,Female,59,83000,0
67 | 15718071,Male,24,58000,0
68 | 15679760,Male,24,19000,0
69 | 15654574,Female,23,82000,0
70 | 15577178,Female,22,63000,0
71 | 15595324,Female,31,68000,0
72 | 15756932,Male,25,80000,0
73 | 15726358,Female,24,27000,0
74 | 15595228,Female,20,23000,0
75 | 15782530,Female,33,113000,0
76 | 15592877,Male,32,18000,0
77 | 15651983,Male,34,112000,1
78 | 15746737,Male,18,52000,0
79 | 15774179,Female,22,27000,0
80 | 15667265,Female,28,87000,0
81 | 15655123,Female,26,17000,0
82 | 15595917,Male,30,80000,0
83 | 15668385,Male,39,42000,0
84 | 15709476,Male,20,49000,0
85 | 15711218,Male,35,88000,0
86 | 15798659,Female,30,62000,0
87 | 15663939,Female,31,118000,1
88 | 15694946,Male,24,55000,0
89 | 15631912,Female,28,85000,0
90 | 15768816,Male,26,81000,0
91 | 15682268,Male,35,50000,0
92 | 15684801,Male,22,81000,0
93 | 15636428,Female,30,116000,0
94 | 15809823,Male,26,15000,0
95 | 15699284,Female,29,28000,0
96 | 15786993,Female,29,83000,0
97 | 15709441,Female,35,44000,0
98 | 15710257,Female,35,25000,0
99 | 15582492,Male,28,123000,1
100 | 15575694,Male,35,73000,0
101 | 15756820,Female,28,37000,0
102 | 15766289,Male,27,88000,0
103 | 15593014,Male,28,59000,0
104 | 15584545,Female,32,86000,0
105 | 15675949,Female,33,149000,1
106 | 15672091,Female,19,21000,0
107 | 15801658,Male,21,72000,0
108 | 15706185,Female,26,35000,0
109 | 15789863,Male,27,89000,0
110 | 15720943,Male,26,86000,0
111 | 15697997,Female,38,80000,0
112 | 15665416,Female,39,71000,0
113 | 15660200,Female,37,71000,0
114 | 15619653,Male,38,61000,0
115 | 15773447,Male,37,55000,0
116 | 15739160,Male,42,80000,0
117 | 15689237,Male,40,57000,0
118 | 15679297,Male,35,75000,0
119 | 15591433,Male,36,52000,0
120 | 15642725,Male,40,59000,0
121 | 15701962,Male,41,59000,0
122 | 15811613,Female,36,75000,0
123 | 15741049,Male,37,72000,0
124 | 15724423,Female,40,75000,0
125 | 15574305,Male,35,53000,0
126 | 15678168,Female,41,51000,0
127 | 15697020,Female,39,61000,0
128 | 15610801,Male,42,65000,0
129 | 15745232,Male,26,32000,0
130 | 15722758,Male,30,17000,0
131 | 15792102,Female,26,84000,0
132 | 15675185,Male,31,58000,0
133 | 15801247,Male,33,31000,0
134 | 15725660,Male,30,87000,0
135 | 15638963,Female,21,68000,0
136 | 15800061,Female,28,55000,0
137 | 15578006,Male,23,63000,0
138 | 15668504,Female,20,82000,0
139 | 15687491,Male,30,107000,1
140 | 15610403,Female,28,59000,0
141 | 15741094,Male,19,25000,0
142 | 15807909,Male,19,85000,0
143 | 15666141,Female,18,68000,0
144 | 15617134,Male,35,59000,0
145 | 15783029,Male,30,89000,0
146 | 15622833,Female,34,25000,0
147 | 15746422,Female,24,89000,0
148 | 15750839,Female,27,96000,1
149 | 15749130,Female,41,30000,0
150 | 15779862,Male,29,61000,0
151 | 15767871,Male,20,74000,0
152 | 15679651,Female,26,15000,0
153 | 15576219,Male,41,45000,0
154 | 15699247,Male,31,76000,0
155 | 15619087,Female,36,50000,0
156 | 15605327,Male,40,47000,0
157 | 15610140,Female,31,15000,0
158 | 15791174,Male,46,59000,0
159 | 15602373,Male,29,75000,0
160 | 15762605,Male,26,30000,0
161 | 15598840,Female,32,135000,1
162 | 15744279,Male,32,100000,1
163 | 15670619,Male,25,90000,0
164 | 15599533,Female,37,33000,0
165 | 15757837,Male,35,38000,0
166 | 15697574,Female,33,69000,0
167 | 15578738,Female,18,86000,0
168 | 15762228,Female,22,55000,0
169 | 15614827,Female,35,71000,0
170 | 15789815,Male,29,148000,1
171 | 15579781,Female,29,47000,0
172 | 15587013,Male,21,88000,0
173 | 15570932,Male,34,115000,0
174 | 15794661,Female,26,118000,0
175 | 15581654,Female,34,43000,0
176 | 15644296,Female,34,72000,0
177 | 15614420,Female,23,28000,0
178 | 15609653,Female,35,47000,0
179 | 15594577,Male,25,22000,0
180 | 15584114,Male,24,23000,0
181 | 15673367,Female,31,34000,0
182 | 15685576,Male,26,16000,0
183 | 15774727,Female,31,71000,0
184 | 15694288,Female,32,117000,1
185 | 15603319,Male,33,43000,0
186 | 15759066,Female,33,60000,0
187 | 15814816,Male,31,66000,0
188 | 15724402,Female,20,82000,0
189 | 15571059,Female,33,41000,0
190 | 15674206,Male,35,72000,0
191 | 15715160,Male,28,32000,0
192 | 15730448,Male,24,84000,0
193 | 15662067,Female,19,26000,0
194 | 15779581,Male,29,43000,0
195 | 15662901,Male,19,70000,0
196 | 15689751,Male,28,89000,0
197 | 15667742,Male,34,43000,0
198 | 15738448,Female,30,79000,0
199 | 15680243,Female,20,36000,0
200 | 15745083,Male,26,80000,0
201 | 15708228,Male,35,22000,0
202 | 15628523,Male,35,39000,0
203 | 15708196,Male,49,74000,0
204 | 15735549,Female,39,134000,1
205 | 15809347,Female,41,71000,0
206 | 15660866,Female,58,101000,1
207 | 15766609,Female,47,47000,0
208 | 15654230,Female,55,130000,1
209 | 15794566,Female,52,114000,0
210 | 15800890,Female,40,142000,1
211 | 15697424,Female,46,22000,0
212 | 15724536,Female,48,96000,1
213 | 15735878,Male,52,150000,1
214 | 15707596,Female,59,42000,0
215 | 15657163,Male,35,58000,0
216 | 15622478,Male,47,43000,0
217 | 15779529,Female,60,108000,1
218 | 15636023,Male,49,65000,0
219 | 15582066,Male,40,78000,0
220 | 15666675,Female,46,96000,0
221 | 15732987,Male,59,143000,1
222 | 15789432,Female,41,80000,0
223 | 15663161,Male,35,91000,1
224 | 15694879,Male,37,144000,1
225 | 15593715,Male,60,102000,1
226 | 15575002,Female,35,60000,0
227 | 15622171,Male,37,53000,0
228 | 15795224,Female,36,126000,1
229 | 15685346,Male,56,133000,1
230 | 15691808,Female,40,72000,0
231 | 15721007,Female,42,80000,1
232 | 15794253,Female,35,147000,1
233 | 15694453,Male,39,42000,0
234 | 15813113,Male,40,107000,1
235 | 15614187,Male,49,86000,1
236 | 15619407,Female,38,112000,0
237 | 15646227,Male,46,79000,1
238 | 15660541,Male,40,57000,0
239 | 15753874,Female,37,80000,0
240 | 15617877,Female,46,82000,0
241 | 15772073,Female,53,143000,1
242 | 15701537,Male,42,149000,1
243 | 15736228,Male,38,59000,0
244 | 15780572,Female,50,88000,1
245 | 15769596,Female,56,104000,1
246 | 15586996,Female,41,72000,0
247 | 15722061,Female,51,146000,1
248 | 15638003,Female,35,50000,0
249 | 15775590,Female,57,122000,1
250 | 15730688,Male,41,52000,0
251 | 15753102,Female,35,97000,1
252 | 15810075,Female,44,39000,0
253 | 15723373,Male,37,52000,0
254 | 15795298,Female,48,134000,1
255 | 15584320,Female,37,146000,1
256 | 15724161,Female,50,44000,0
257 | 15750056,Female,52,90000,1
258 | 15609637,Female,41,72000,0
259 | 15794493,Male,40,57000,0
260 | 15569641,Female,58,95000,1
261 | 15815236,Female,45,131000,1
262 | 15811177,Female,35,77000,0
263 | 15680587,Male,36,144000,1
264 | 15672821,Female,55,125000,1
265 | 15767681,Female,35,72000,0
266 | 15600379,Male,48,90000,1
267 | 15801336,Female,42,108000,1
268 | 15721592,Male,40,75000,0
269 | 15581282,Male,37,74000,0
270 | 15746203,Female,47,144000,1
271 | 15583137,Male,40,61000,0
272 | 15680752,Female,43,133000,0
273 | 15688172,Female,59,76000,1
274 | 15791373,Male,60,42000,1
275 | 15589449,Male,39,106000,1
276 | 15692819,Female,57,26000,1
277 | 15727467,Male,57,74000,1
278 | 15734312,Male,38,71000,0
279 | 15764604,Male,49,88000,1
280 | 15613014,Female,52,38000,1
281 | 15759684,Female,50,36000,1
282 | 15609669,Female,59,88000,1
283 | 15685536,Male,35,61000,0
284 | 15750447,Male,37,70000,1
285 | 15663249,Female,52,21000,1
286 | 15638646,Male,48,141000,0
287 | 15734161,Female,37,93000,1
288 | 15631070,Female,37,62000,0
289 | 15761950,Female,48,138000,1
290 | 15649668,Male,41,79000,0
291 | 15713912,Female,37,78000,1
292 | 15586757,Male,39,134000,1
293 | 15596522,Male,49,89000,1
294 | 15625395,Male,55,39000,1
295 | 15760570,Male,37,77000,0
296 | 15566689,Female,35,57000,0
297 | 15725794,Female,36,63000,0
298 | 15673539,Male,42,73000,1
299 | 15705298,Female,43,112000,1
300 | 15675791,Male,45,79000,0
301 | 15747043,Male,46,117000,1
302 | 15736397,Female,58,38000,1
303 | 15678201,Male,48,74000,1
304 | 15720745,Female,37,137000,1
305 | 15637593,Male,37,79000,1
306 | 15598070,Female,40,60000,0
307 | 15787550,Male,42,54000,0
308 | 15603942,Female,51,134000,0
309 | 15733973,Female,47,113000,1
310 | 15596761,Male,36,125000,1
311 | 15652400,Female,38,50000,0
312 | 15717893,Female,42,70000,0
313 | 15622585,Male,39,96000,1
314 | 15733964,Female,38,50000,0
315 | 15753861,Female,49,141000,1
316 | 15747097,Female,39,79000,0
317 | 15594762,Female,39,75000,1
318 | 15667417,Female,54,104000,1
319 | 15684861,Male,35,55000,0
320 | 15742204,Male,45,32000,1
321 | 15623502,Male,36,60000,0
322 | 15774872,Female,52,138000,1
323 | 15611191,Female,53,82000,1
324 | 15674331,Male,41,52000,0
325 | 15619465,Female,48,30000,1
326 | 15575247,Female,48,131000,1
327 | 15695679,Female,41,60000,0
328 | 15713463,Male,41,72000,0
329 | 15785170,Female,42,75000,0
330 | 15796351,Male,36,118000,1
331 | 15639576,Female,47,107000,1
332 | 15693264,Male,38,51000,0
333 | 15589715,Female,48,119000,1
334 | 15769902,Male,42,65000,0
335 | 15587177,Male,40,65000,0
336 | 15814553,Male,57,60000,1
337 | 15601550,Female,36,54000,0
338 | 15664907,Male,58,144000,1
339 | 15612465,Male,35,79000,0
340 | 15810800,Female,38,55000,0
341 | 15665760,Male,39,122000,1
342 | 15588080,Female,53,104000,1
343 | 15776844,Male,35,75000,0
344 | 15717560,Female,38,65000,0
345 | 15629739,Female,47,51000,1
346 | 15729908,Male,47,105000,1
347 | 15716781,Female,41,63000,0
348 | 15646936,Male,53,72000,1
349 | 15768151,Female,54,108000,1
350 | 15579212,Male,39,77000,0
351 | 15721835,Male,38,61000,0
352 | 15800515,Female,38,113000,1
353 | 15591279,Male,37,75000,0
354 | 15587419,Female,42,90000,1
355 | 15750335,Female,37,57000,0
356 | 15699619,Male,36,99000,1
357 | 15606472,Male,60,34000,1
358 | 15778368,Male,54,70000,1
359 | 15671387,Female,41,72000,0
360 | 15573926,Male,40,71000,1
361 | 15709183,Male,42,54000,0
362 | 15577514,Male,43,129000,1
363 | 15778830,Female,53,34000,1
364 | 15768072,Female,47,50000,1
365 | 15768293,Female,42,79000,0
366 | 15654456,Male,42,104000,1
367 | 15807525,Female,59,29000,1
368 | 15574372,Female,58,47000,1
369 | 15671249,Male,46,88000,1
370 | 15779744,Male,38,71000,0
371 | 15624755,Female,54,26000,1
372 | 15611430,Female,60,46000,1
373 | 15774744,Male,60,83000,1
374 | 15629885,Female,39,73000,0
375 | 15708791,Male,59,130000,1
376 | 15793890,Female,37,80000,0
377 | 15646091,Female,46,32000,1
378 | 15596984,Female,46,74000,0
379 | 15800215,Female,42,53000,0
380 | 15577806,Male,41,87000,1
381 | 15749381,Female,58,23000,1
382 | 15683758,Male,42,64000,0
383 | 15670615,Male,48,33000,1
384 | 15715622,Female,44,139000,1
385 | 15707634,Male,49,28000,1
386 | 15806901,Female,57,33000,1
387 | 15775335,Male,56,60000,1
388 | 15724150,Female,49,39000,1
389 | 15627220,Male,39,71000,0
390 | 15672330,Male,47,34000,1
391 | 15668521,Female,48,35000,1
392 | 15807837,Male,48,33000,1
393 | 15592570,Male,47,23000,1
394 | 15748589,Female,45,45000,1
395 | 15635893,Male,60,42000,1
396 | 15757632,Female,39,59000,0
397 | 15691863,Female,46,41000,1
398 | 15706071,Male,51,23000,1
399 | 15654296,Female,50,20000,1
400 | 15755018,Male,36,33000,0
401 | 15594041,Female,49,36000,1
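This dataset drives the classification notebook: `Purchased` is the binary target, and `User ID` carries no predictive signal. A minimal loading sketch, assuming pandas and scikit-learn as used in the course; the inline rows (copied from the file above) stand in for reading `Social_Network_Ads.csv` from disk:

```python
import io
import pandas as pd
from sklearn.model_selection import train_test_split

# A few rows copied from Social_Network_Ads.csv; the full file has 400 records.
csv_text = """User ID,Gender,Age,EstimatedSalary,Purchased
15624510,Male,19,19000,0
15810944,Male,35,20000,0
15668575,Female,26,43000,0
15694829,Female,32,150000,1
15733883,Male,47,25000,1
15617482,Male,45,26000,1
"""
data = pd.read_csv(io.StringIO(csv_text))

# Drop User ID; Age and EstimatedSalary are the usual feature pair here.
X = data[['Age', 'EstimatedSalary']].values
y = data['Purchased'].values

x_train, x_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
print(x_train.shape, x_test.shape)
```

With features on such different scales (Age vs. EstimatedSalary), a `StandardScaler` fit on `x_train` would normally follow before training a classifier.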
--------------------------------------------------------------------------------
/3_supervised_ml/classification/Wine.csv:
--------------------------------------------------------------------------------
1 | Alcohol,Malic_Acid,Ash,Ash_Alcanity,Magnesium,Total_Phenols,Flavanoids,Nonflavanoid_Phenols,Proanthocyanins,Color_Intensity,Hue,OD280,Proline,Customer_Segment
2 | 14.23,1.71,2.43,15.6,127,2.8,3.06,0.28,2.29,5.64,1.04,3.92,1065,1
3 | 13.2,1.78,2.14,11.2,100,2.65,2.76,0.26,1.28,4.38,1.05,3.4,1050,1
4 | 13.16,2.36,2.67,18.6,101,2.8,3.24,0.3,2.81,5.68,1.03,3.17,1185,1
5 | 14.37,1.95,2.5,16.8,113,3.85,3.49,0.24,2.18,7.8,0.86,3.45,1480,1
6 | 13.24,2.59,2.87,21,118,2.8,2.69,0.39,1.82,4.32,1.04,2.93,735,1
7 | 14.2,1.76,2.45,15.2,112,3.27,3.39,0.34,1.97,6.75,1.05,2.85,1450,1
8 | 14.39,1.87,2.45,14.6,96,2.5,2.52,0.3,1.98,5.25,1.02,3.58,1290,1
9 | 14.06,2.15,2.61,17.6,121,2.6,2.51,0.31,1.25,5.05,1.06,3.58,1295,1
10 | 14.83,1.64,2.17,14,97,2.8,2.98,0.29,1.98,5.2,1.08,2.85,1045,1
11 | 13.86,1.35,2.27,16,98,2.98,3.15,0.22,1.85,7.22,1.01,3.55,1045,1
12 | 14.1,2.16,2.3,18,105,2.95,3.32,0.22,2.38,5.75,1.25,3.17,1510,1
13 | 14.12,1.48,2.32,16.8,95,2.2,2.43,0.26,1.57,5,1.17,2.82,1280,1
14 | 13.75,1.73,2.41,16,89,2.6,2.76,0.29,1.81,5.6,1.15,2.9,1320,1
15 | 14.75,1.73,2.39,11.4,91,3.1,3.69,0.43,2.81,5.4,1.25,2.73,1150,1
16 | 14.38,1.87,2.38,12,102,3.3,3.64,0.29,2.96,7.5,1.2,3,1547,1
17 | 13.63,1.81,2.7,17.2,112,2.85,2.91,0.3,1.46,7.3,1.28,2.88,1310,1
18 | 14.3,1.92,2.72,20,120,2.8,3.14,0.33,1.97,6.2,1.07,2.65,1280,1
19 | 13.83,1.57,2.62,20,115,2.95,3.4,0.4,1.72,6.6,1.13,2.57,1130,1
20 | 14.19,1.59,2.48,16.5,108,3.3,3.93,0.32,1.86,8.7,1.23,2.82,1680,1
21 | 13.64,3.1,2.56,15.2,116,2.7,3.03,0.17,1.66,5.1,0.96,3.36,845,1
22 | 14.06,1.63,2.28,16,126,3,3.17,0.24,2.1,5.65,1.09,3.71,780,1
23 | 12.93,3.8,2.65,18.6,102,2.41,2.41,0.25,1.98,4.5,1.03,3.52,770,1
24 | 13.71,1.86,2.36,16.6,101,2.61,2.88,0.27,1.69,3.8,1.11,4,1035,1
25 | 12.85,1.6,2.52,17.8,95,2.48,2.37,0.26,1.46,3.93,1.09,3.63,1015,1
26 | 13.5,1.81,2.61,20,96,2.53,2.61,0.28,1.66,3.52,1.12,3.82,845,1
27 | 13.05,2.05,3.22,25,124,2.63,2.68,0.47,1.92,3.58,1.13,3.2,830,1
28 | 13.39,1.77,2.62,16.1,93,2.85,2.94,0.34,1.45,4.8,0.92,3.22,1195,1
29 | 13.3,1.72,2.14,17,94,2.4,2.19,0.27,1.35,3.95,1.02,2.77,1285,1
30 | 13.87,1.9,2.8,19.4,107,2.95,2.97,0.37,1.76,4.5,1.25,3.4,915,1
31 | 14.02,1.68,2.21,16,96,2.65,2.33,0.26,1.98,4.7,1.04,3.59,1035,1
32 | 13.73,1.5,2.7,22.5,101,3,3.25,0.29,2.38,5.7,1.19,2.71,1285,1
33 | 13.58,1.66,2.36,19.1,106,2.86,3.19,0.22,1.95,6.9,1.09,2.88,1515,1
34 | 13.68,1.83,2.36,17.2,104,2.42,2.69,0.42,1.97,3.84,1.23,2.87,990,1
35 | 13.76,1.53,2.7,19.5,132,2.95,2.74,0.5,1.35,5.4,1.25,3,1235,1
36 | 13.51,1.8,2.65,19,110,2.35,2.53,0.29,1.54,4.2,1.1,2.87,1095,1
37 | 13.48,1.81,2.41,20.5,100,2.7,2.98,0.26,1.86,5.1,1.04,3.47,920,1
38 | 13.28,1.64,2.84,15.5,110,2.6,2.68,0.34,1.36,4.6,1.09,2.78,880,1
39 | 13.05,1.65,2.55,18,98,2.45,2.43,0.29,1.44,4.25,1.12,2.51,1105,1
40 | 13.07,1.5,2.1,15.5,98,2.4,2.64,0.28,1.37,3.7,1.18,2.69,1020,1
41 | 14.22,3.99,2.51,13.2,128,3,3.04,0.2,2.08,5.1,0.89,3.53,760,1
42 | 13.56,1.71,2.31,16.2,117,3.15,3.29,0.34,2.34,6.13,0.95,3.38,795,1
43 | 13.41,3.84,2.12,18.8,90,2.45,2.68,0.27,1.48,4.28,0.91,3,1035,1
44 | 13.88,1.89,2.59,15,101,3.25,3.56,0.17,1.7,5.43,0.88,3.56,1095,1
45 | 13.24,3.98,2.29,17.5,103,2.64,2.63,0.32,1.66,4.36,0.82,3,680,1
46 | 13.05,1.77,2.1,17,107,3,3,0.28,2.03,5.04,0.88,3.35,885,1
47 | 14.21,4.04,2.44,18.9,111,2.85,2.65,0.3,1.25,5.24,0.87,3.33,1080,1
48 | 14.38,3.59,2.28,16,102,3.25,3.17,0.27,2.19,4.9,1.04,3.44,1065,1
49 | 13.9,1.68,2.12,16,101,3.1,3.39,0.21,2.14,6.1,0.91,3.33,985,1
50 | 14.1,2.02,2.4,18.8,103,2.75,2.92,0.32,2.38,6.2,1.07,2.75,1060,1
51 | 13.94,1.73,2.27,17.4,108,2.88,3.54,0.32,2.08,8.9,1.12,3.1,1260,1
52 | 13.05,1.73,2.04,12.4,92,2.72,3.27,0.17,2.91,7.2,1.12,2.91,1150,1
53 | 13.83,1.65,2.6,17.2,94,2.45,2.99,0.22,2.29,5.6,1.24,3.37,1265,1
54 | 13.82,1.75,2.42,14,111,3.88,3.74,0.32,1.87,7.05,1.01,3.26,1190,1
55 | 13.77,1.9,2.68,17.1,115,3,2.79,0.39,1.68,6.3,1.13,2.93,1375,1
56 | 13.74,1.67,2.25,16.4,118,2.6,2.9,0.21,1.62,5.85,0.92,3.2,1060,1
57 | 13.56,1.73,2.46,20.5,116,2.96,2.78,0.2,2.45,6.25,0.98,3.03,1120,1
58 | 14.22,1.7,2.3,16.3,118,3.2,3,0.26,2.03,6.38,0.94,3.31,970,1
59 | 13.29,1.97,2.68,16.8,102,3,3.23,0.31,1.66,6,1.07,2.84,1270,1
60 | 13.72,1.43,2.5,16.7,108,3.4,3.67,0.19,2.04,6.8,0.89,2.87,1285,1
61 | 12.37,0.94,1.36,10.6,88,1.98,0.57,0.28,0.42,1.95,1.05,1.82,520,2
62 | 12.33,1.1,2.28,16,101,2.05,1.09,0.63,0.41,3.27,1.25,1.67,680,2
63 | 12.64,1.36,2.02,16.8,100,2.02,1.41,0.53,0.62,5.75,0.98,1.59,450,2
64 | 13.67,1.25,1.92,18,94,2.1,1.79,0.32,0.73,3.8,1.23,2.46,630,2
65 | 12.37,1.13,2.16,19,87,3.5,3.1,0.19,1.87,4.45,1.22,2.87,420,2
66 | 12.17,1.45,2.53,19,104,1.89,1.75,0.45,1.03,2.95,1.45,2.23,355,2
67 | 12.37,1.21,2.56,18.1,98,2.42,2.65,0.37,2.08,4.6,1.19,2.3,678,2
68 | 13.11,1.01,1.7,15,78,2.98,3.18,0.26,2.28,5.3,1.12,3.18,502,2
69 | 12.37,1.17,1.92,19.6,78,2.11,2,0.27,1.04,4.68,1.12,3.48,510,2
70 | 13.34,0.94,2.36,17,110,2.53,1.3,0.55,0.42,3.17,1.02,1.93,750,2
71 | 12.21,1.19,1.75,16.8,151,1.85,1.28,0.14,2.5,2.85,1.28,3.07,718,2
72 | 12.29,1.61,2.21,20.4,103,1.1,1.02,0.37,1.46,3.05,0.906,1.82,870,2
73 | 13.86,1.51,2.67,25,86,2.95,2.86,0.21,1.87,3.38,1.36,3.16,410,2
74 | 13.49,1.66,2.24,24,87,1.88,1.84,0.27,1.03,3.74,0.98,2.78,472,2
75 | 12.99,1.67,2.6,30,139,3.3,2.89,0.21,1.96,3.35,1.31,3.5,985,2
76 | 11.96,1.09,2.3,21,101,3.38,2.14,0.13,1.65,3.21,0.99,3.13,886,2
77 | 11.66,1.88,1.92,16,97,1.61,1.57,0.34,1.15,3.8,1.23,2.14,428,2
78 | 13.03,0.9,1.71,16,86,1.95,2.03,0.24,1.46,4.6,1.19,2.48,392,2
79 | 11.84,2.89,2.23,18,112,1.72,1.32,0.43,0.95,2.65,0.96,2.52,500,2
80 | 12.33,0.99,1.95,14.8,136,1.9,1.85,0.35,2.76,3.4,1.06,2.31,750,2
81 | 12.7,3.87,2.4,23,101,2.83,2.55,0.43,1.95,2.57,1.19,3.13,463,2
82 | 12,0.92,2,19,86,2.42,2.26,0.3,1.43,2.5,1.38,3.12,278,2
83 | 12.72,1.81,2.2,18.8,86,2.2,2.53,0.26,1.77,3.9,1.16,3.14,714,2
84 | 12.08,1.13,2.51,24,78,2,1.58,0.4,1.4,2.2,1.31,2.72,630,2
85 | 13.05,3.86,2.32,22.5,85,1.65,1.59,0.61,1.62,4.8,0.84,2.01,515,2
86 | 11.84,0.89,2.58,18,94,2.2,2.21,0.22,2.35,3.05,0.79,3.08,520,2
87 | 12.67,0.98,2.24,18,99,2.2,1.94,0.3,1.46,2.62,1.23,3.16,450,2
88 | 12.16,1.61,2.31,22.8,90,1.78,1.69,0.43,1.56,2.45,1.33,2.26,495,2
89 | 11.65,1.67,2.62,26,88,1.92,1.61,0.4,1.34,2.6,1.36,3.21,562,2
90 | 11.64,2.06,2.46,21.6,84,1.95,1.69,0.48,1.35,2.8,1,2.75,680,2
91 | 12.08,1.33,2.3,23.6,70,2.2,1.59,0.42,1.38,1.74,1.07,3.21,625,2
92 | 12.08,1.83,2.32,18.5,81,1.6,1.5,0.52,1.64,2.4,1.08,2.27,480,2
93 | 12,1.51,2.42,22,86,1.45,1.25,0.5,1.63,3.6,1.05,2.65,450,2
94 | 12.69,1.53,2.26,20.7,80,1.38,1.46,0.58,1.62,3.05,0.96,2.06,495,2
95 | 12.29,2.83,2.22,18,88,2.45,2.25,0.25,1.99,2.15,1.15,3.3,290,2
96 | 11.62,1.99,2.28,18,98,3.02,2.26,0.17,1.35,3.25,1.16,2.96,345,2
97 | 12.47,1.52,2.2,19,162,2.5,2.27,0.32,3.28,2.6,1.16,2.63,937,2
98 | 11.81,2.12,2.74,21.5,134,1.6,0.99,0.14,1.56,2.5,0.95,2.26,625,2
99 | 12.29,1.41,1.98,16,85,2.55,2.5,0.29,1.77,2.9,1.23,2.74,428,2
100 | 12.37,1.07,2.1,18.5,88,3.52,3.75,0.24,1.95,4.5,1.04,2.77,660,2
101 | 12.29,3.17,2.21,18,88,2.85,2.99,0.45,2.81,2.3,1.42,2.83,406,2
102 | 12.08,2.08,1.7,17.5,97,2.23,2.17,0.26,1.4,3.3,1.27,2.96,710,2
103 | 12.6,1.34,1.9,18.5,88,1.45,1.36,0.29,1.35,2.45,1.04,2.77,562,2
104 | 12.34,2.45,2.46,21,98,2.56,2.11,0.34,1.31,2.8,0.8,3.38,438,2
105 | 11.82,1.72,1.88,19.5,86,2.5,1.64,0.37,1.42,2.06,0.94,2.44,415,2
106 | 12.51,1.73,1.98,20.5,85,2.2,1.92,0.32,1.48,2.94,1.04,3.57,672,2
107 | 12.42,2.55,2.27,22,90,1.68,1.84,0.66,1.42,2.7,0.86,3.3,315,2
108 | 12.25,1.73,2.12,19,80,1.65,2.03,0.37,1.63,3.4,1,3.17,510,2
109 | 12.72,1.75,2.28,22.5,84,1.38,1.76,0.48,1.63,3.3,0.88,2.42,488,2
110 | 12.22,1.29,1.94,19,92,2.36,2.04,0.39,2.08,2.7,0.86,3.02,312,2
111 | 11.61,1.35,2.7,20,94,2.74,2.92,0.29,2.49,2.65,0.96,3.26,680,2
112 | 11.46,3.74,1.82,19.5,107,3.18,2.58,0.24,3.58,2.9,0.75,2.81,562,2
113 | 12.52,2.43,2.17,21,88,2.55,2.27,0.26,1.22,2,0.9,2.78,325,2
114 | 11.76,2.68,2.92,20,103,1.75,2.03,0.6,1.05,3.8,1.23,2.5,607,2
115 | 11.41,0.74,2.5,21,88,2.48,2.01,0.42,1.44,3.08,1.1,2.31,434,2
116 | 12.08,1.39,2.5,22.5,84,2.56,2.29,0.43,1.04,2.9,0.93,3.19,385,2
117 | 11.03,1.51,2.2,21.5,85,2.46,2.17,0.52,2.01,1.9,1.71,2.87,407,2
118 | 11.82,1.47,1.99,20.8,86,1.98,1.6,0.3,1.53,1.95,0.95,3.33,495,2
119 | 12.42,1.61,2.19,22.5,108,2,2.09,0.34,1.61,2.06,1.06,2.96,345,2
120 | 12.77,3.43,1.98,16,80,1.63,1.25,0.43,0.83,3.4,0.7,2.12,372,2
121 | 12,3.43,2,19,87,2,1.64,0.37,1.87,1.28,0.93,3.05,564,2
122 | 11.45,2.4,2.42,20,96,2.9,2.79,0.32,1.83,3.25,0.8,3.39,625,2
123 | 11.56,2.05,3.23,28.5,119,3.18,5.08,0.47,1.87,6,0.93,3.69,465,2
124 | 12.42,4.43,2.73,26.5,102,2.2,2.13,0.43,1.71,2.08,0.92,3.12,365,2
125 | 13.05,5.8,2.13,21.5,86,2.62,2.65,0.3,2.01,2.6,0.73,3.1,380,2
126 | 11.87,4.31,2.39,21,82,2.86,3.03,0.21,2.91,2.8,0.75,3.64,380,2
127 | 12.07,2.16,2.17,21,85,2.6,2.65,0.37,1.35,2.76,0.86,3.28,378,2
128 | 12.43,1.53,2.29,21.5,86,2.74,3.15,0.39,1.77,3.94,0.69,2.84,352,2
129 | 11.79,2.13,2.78,28.5,92,2.13,2.24,0.58,1.76,3,0.97,2.44,466,2
130 | 12.37,1.63,2.3,24.5,88,2.22,2.45,0.4,1.9,2.12,0.89,2.78,342,2
131 | 12.04,4.3,2.38,22,80,2.1,1.75,0.42,1.35,2.6,0.79,2.57,580,2
132 | 12.86,1.35,2.32,18,122,1.51,1.25,0.21,0.94,4.1,0.76,1.29,630,3
133 | 12.88,2.99,2.4,20,104,1.3,1.22,0.24,0.83,5.4,0.74,1.42,530,3
134 | 12.81,2.31,2.4,24,98,1.15,1.09,0.27,0.83,5.7,0.66,1.36,560,3
135 | 12.7,3.55,2.36,21.5,106,1.7,1.2,0.17,0.84,5,0.78,1.29,600,3
136 | 12.51,1.24,2.25,17.5,85,2,0.58,0.6,1.25,5.45,0.75,1.51,650,3
137 | 12.6,2.46,2.2,18.5,94,1.62,0.66,0.63,0.94,7.1,0.73,1.58,695,3
138 | 12.25,4.72,2.54,21,89,1.38,0.47,0.53,0.8,3.85,0.75,1.27,720,3
139 | 12.53,5.51,2.64,25,96,1.79,0.6,0.63,1.1,5,0.82,1.69,515,3
140 | 13.49,3.59,2.19,19.5,88,1.62,0.48,0.58,0.88,5.7,0.81,1.82,580,3
141 | 12.84,2.96,2.61,24,101,2.32,0.6,0.53,0.81,4.92,0.89,2.15,590,3
142 | 12.93,2.81,2.7,21,96,1.54,0.5,0.53,0.75,4.6,0.77,2.31,600,3
143 | 13.36,2.56,2.35,20,89,1.4,0.5,0.37,0.64,5.6,0.7,2.47,780,3
144 | 13.52,3.17,2.72,23.5,97,1.55,0.52,0.5,0.55,4.35,0.89,2.06,520,3
145 | 13.62,4.95,2.35,20,92,2,0.8,0.47,1.02,4.4,0.91,2.05,550,3
146 | 12.25,3.88,2.2,18.5,112,1.38,0.78,0.29,1.14,8.21,0.65,2,855,3
147 | 13.16,3.57,2.15,21,102,1.5,0.55,0.43,1.3,4,0.6,1.68,830,3
148 | 13.88,5.04,2.23,20,80,0.98,0.34,0.4,0.68,4.9,0.58,1.33,415,3
149 | 12.87,4.61,2.48,21.5,86,1.7,0.65,0.47,0.86,7.65,0.54,1.86,625,3
150 | 13.32,3.24,2.38,21.5,92,1.93,0.76,0.45,1.25,8.42,0.55,1.62,650,3
151 | 13.08,3.9,2.36,21.5,113,1.41,1.39,0.34,1.14,9.4,0.57,1.33,550,3
152 | 13.5,3.12,2.62,24,123,1.4,1.57,0.22,1.25,8.6,0.59,1.3,500,3
153 | 12.79,2.67,2.48,22,112,1.48,1.36,0.24,1.26,10.8,0.48,1.47,480,3
154 | 13.11,1.9,2.75,25.5,116,2.2,1.28,0.26,1.56,7.1,0.61,1.33,425,3
155 | 13.23,3.3,2.28,18.5,98,1.8,0.83,0.61,1.87,10.52,0.56,1.51,675,3
156 | 12.58,1.29,2.1,20,103,1.48,0.58,0.53,1.4,7.6,0.58,1.55,640,3
157 | 13.17,5.19,2.32,22,93,1.74,0.63,0.61,1.55,7.9,0.6,1.48,725,3
158 | 13.84,4.12,2.38,19.5,89,1.8,0.83,0.48,1.56,9.01,0.57,1.64,480,3
159 | 12.45,3.03,2.64,27,97,1.9,0.58,0.63,1.14,7.5,0.67,1.73,880,3
160 | 14.34,1.68,2.7,25,98,2.8,1.31,0.53,2.7,13,0.57,1.96,660,3
161 | 13.48,1.67,2.64,22.5,89,2.6,1.1,0.52,2.29,11.75,0.57,1.78,620,3
162 | 12.36,3.83,2.38,21,88,2.3,0.92,0.5,1.04,7.65,0.56,1.58,520,3
163 | 13.69,3.26,2.54,20,107,1.83,0.56,0.5,0.8,5.88,0.96,1.82,680,3
164 | 12.85,3.27,2.58,22,106,1.65,0.6,0.6,0.96,5.58,0.87,2.11,570,3
165 | 12.96,3.45,2.35,18.5,106,1.39,0.7,0.4,0.94,5.28,0.68,1.75,675,3
166 | 13.78,2.76,2.3,22,90,1.35,0.68,0.41,1.03,9.58,0.7,1.68,615,3
167 | 13.73,4.36,2.26,22.5,88,1.28,0.47,0.52,1.15,6.62,0.78,1.75,520,3
168 | 13.45,3.7,2.6,23,111,1.7,0.92,0.43,1.46,10.68,0.85,1.56,695,3
169 | 12.82,3.37,2.3,19.5,88,1.48,0.66,0.4,0.97,10.26,0.72,1.75,685,3
170 | 13.58,2.58,2.69,24.5,105,1.55,0.84,0.39,1.54,8.66,0.74,1.8,750,3
171 | 13.4,4.6,2.86,25,112,1.98,0.96,0.27,1.11,8.5,0.67,1.92,630,3
172 | 12.2,3.03,2.32,19,96,1.25,0.49,0.4,0.73,5.5,0.66,1.83,510,3
173 | 12.77,2.39,2.28,19.5,86,1.39,0.51,0.48,0.64,9.899999,0.57,1.63,470,3
174 | 14.16,2.51,2.48,20,91,1.68,0.7,0.44,1.24,9.7,0.62,1.71,660,3
175 | 13.71,5.65,2.45,20.5,95,1.68,0.61,0.52,1.06,7.7,0.64,1.74,740,3
176 | 13.4,3.91,2.48,23,102,1.8,0.75,0.43,1.41,7.3,0.7,1.56,750,3
177 | 13.27,4.28,2.26,20,120,1.59,0.69,0.43,1.35,10.2,0.59,1.56,835,3
178 | 13.17,2.59,2.37,20,120,1.65,0.68,0.53,1.46,9.3,0.6,1.62,840,3
179 | 14.13,4.1,2.74,24.5,96,2.05,0.76,0.56,1.35,9.2,0.61,1.6,560,3
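Wine.csv has 13 chemical measurements per sample and a `Customer_Segment` label (1, 2 or 3) in the last column, which makes the feature/target split a straightforward `iloc` slice. A minimal sketch, assuming pandas; the three inline rows are copied from the file above (one per segment) in place of reading the full 178-record file:

```python
import io
import pandas as pd

# Three rows copied from Wine.csv, one per Customer_Segment;
# the full file has 178 records.
csv_text = """Alcohol,Malic_Acid,Ash,Ash_Alcanity,Magnesium,Total_Phenols,Flavanoids,Nonflavanoid_Phenols,Proanthocyanins,Color_Intensity,Hue,OD280,Proline,Customer_Segment
14.23,1.71,2.43,15.6,127,2.8,3.06,0.28,2.29,5.64,1.04,3.92,1065,1
12.37,0.94,1.36,10.6,88,1.98,0.57,0.28,0.42,1.95,1.05,1.82,520,2
12.86,1.35,2.32,18,122,1.51,1.25,0.21,0.94,4.1,0.76,1.29,630,3
"""
wine = pd.read_csv(io.StringIO(csv_text))

X = wine.iloc[:, :-1].values  # 13 feature columns
y = wine.iloc[:, -1].values   # Customer_Segment target
print(X.shape, sorted(set(y)))
```

Features like Proline (hundreds) and Hue (around 1) differ by orders of magnitude, so this dataset is a natural candidate for standardization before classification.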
--------------------------------------------------------------------------------
/3_supervised_ml/lessons/4. Supervised Learning - Classification.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Younes-Charfaoui/Machine-Learning-Beginner-Course/b39d9dea1d7f79c1b14590ea338a33d47e9ca49e/3_supervised_ml/lessons/4. Supervised Learning - Classification.pdf
--------------------------------------------------------------------------------
/3_supervised_ml/lessons/4. Supervised Learning - Regression.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Younes-Charfaoui/Machine-Learning-Beginner-Course/b39d9dea1d7f79c1b14590ea338a33d47e9ca49e/3_supervised_ml/lessons/4. Supervised Learning - Regression.pdf
--------------------------------------------------------------------------------
/3_supervised_ml/regression/50_Startups.csv:
--------------------------------------------------------------------------------
1 | R&D Spend,Administration,Marketing Spend,State,Profit
2 | 165349.2,136897.8,471784.1,New York,192261.83
3 | 162597.7,151377.59,443898.53,California,191792.06
4 | 153441.51,101145.55,407934.54,Florida,191050.39
5 | 144372.41,118671.85,383199.62,New York,182901.99
6 | 142107.34,91391.77,366168.42,Florida,166187.94
7 | 131876.9,99814.71,362861.36,New York,156991.12
8 | 134615.46,147198.87,127716.82,California,156122.51
9 | 130298.13,145530.06,323876.68,Florida,155752.6
10 | 120542.52,148718.95,311613.29,New York,152211.77
11 | 123334.88,108679.17,304981.62,California,149759.96
12 | 101913.08,110594.11,229160.95,Florida,146121.95
13 | 100671.96,91790.61,249744.55,California,144259.4
14 | 93863.75,127320.38,249839.44,Florida,141585.52
15 | 91992.39,135495.07,252664.93,California,134307.35
16 | 119943.24,156547.42,256512.92,Florida,132602.65
17 | 114523.61,122616.84,261776.23,New York,129917.04
18 | 78013.11,121597.55,264346.06,California,126992.93
19 | 94657.16,145077.58,282574.31,New York,125370.37
20 | 91749.16,114175.79,294919.57,Florida,124266.9
21 | 86419.7,153514.11,0,New York,122776.86
22 | 76253.86,113867.3,298664.47,California,118474.03
23 | 78389.47,153773.43,299737.29,New York,111313.02
24 | 73994.56,122782.75,303319.26,Florida,110352.25
25 | 67532.53,105751.03,304768.73,Florida,108733.99
26 | 77044.01,99281.34,140574.81,New York,108552.04
27 | 64664.71,139553.16,137962.62,California,107404.34
28 | 75328.87,144135.98,134050.07,Florida,105733.54
29 | 72107.6,127864.55,353183.81,New York,105008.31
30 | 66051.52,182645.56,118148.2,Florida,103282.38
31 | 65605.48,153032.06,107138.38,New York,101004.64
32 | 61994.48,115641.28,91131.24,Florida,99937.59
33 | 61136.38,152701.92,88218.23,New York,97483.56
34 | 63408.86,129219.61,46085.25,California,97427.84
35 | 55493.95,103057.49,214634.81,Florida,96778.92
36 | 46426.07,157693.92,210797.67,California,96712.8
37 | 46014.02,85047.44,205517.64,New York,96479.51
38 | 28663.76,127056.21,201126.82,Florida,90708.19
39 | 44069.95,51283.14,197029.42,California,89949.14
40 | 20229.59,65947.93,185265.1,New York,81229.06
41 | 38558.51,82982.09,174999.3,California,81005.76
42 | 28754.33,118546.05,172795.67,California,78239.91
43 | 27892.92,84710.77,164470.71,Florida,77798.83
44 | 23640.93,96189.63,148001.11,California,71498.49
45 | 15505.73,127382.3,35534.17,New York,69758.98
46 | 22177.74,154806.14,28334.72,California,65200.33
47 | 1000.23,124153.04,1903.93,New York,64926.08
48 | 1315.46,115816.21,297114.46,Florida,49490.75
49 | 0,135426.92,0,California,42559.73
50 | 542.05,51743.15,0,New York,35673.41
51 | 0,116983.8,45173.06,California,14681.4
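50_Startups.csv mixes numeric spend columns with one categorical column, `State`, which must be encoded before a multiple linear regression can use it. A minimal sketch of that step, assuming pandas' `get_dummies`; the inline rows are copied from the file above rather than read from disk:

```python
import io
import pandas as pd

# First rows copied from 50_Startups.csv; State is the one categorical column.
csv_text = """R&D Spend,Administration,Marketing Spend,State,Profit
165349.2,136897.8,471784.1,New York,192261.83
162597.7,151377.59,443898.53,California,191792.06
153441.51,101145.55,407934.54,Florida,191050.39
"""
startups = pd.read_csv(io.StringIO(csv_text))

# One-hot encode State, dropping the first dummy (California) to avoid
# the dummy-variable trap (perfect collinearity among the indicators).
encoded = pd.get_dummies(startups, columns=['State'], drop_first=True)
print(sorted(encoded.columns))
```

After encoding, every column is numeric, so the frame can be split into features and the `Profit` target and passed straight to `LinearRegression`.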
--------------------------------------------------------------------------------
/3_supervised_ml/regression/Regression Practical.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import numpy as np\n",
10 | "import pandas as pd\n",
11 | "import matplotlib.pyplot as plt"
12 | ]
13 | },
14 | {
15 | "cell_type": "code",
16 | "execution_count": 2,
17 | "metadata": {},
18 | "outputs": [],
19 | "source": [
20 | "data = pd.read_csv('housing.csv')"
21 | ]
22 | },
23 | {
24 | "cell_type": "code",
25 | "execution_count": 3,
26 | "metadata": {},
27 | "outputs": [
28 | {
29 | "data": {
30 |        "text/html": [
31 |        "<div>\n",
32 |        "<style scoped>\n",
33 |        "    .dataframe tbody tr th:only-of-type {\n",
34 |        "        vertical-align: middle;\n",
35 |        "    }\n",
36 |        "\n",
37 |        "    .dataframe tbody tr th {\n",
38 |        "        vertical-align: top;\n",
39 |        "    }\n",
40 |        "\n",
41 |        "    .dataframe thead th {\n",
42 |        "        text-align: right;\n",
43 |        "    }\n",
44 |        "</style>\n",
45 |        "<table border=\"1\" class=\"dataframe\">\n",
46 |        "  <thead>\n",
47 |        "    <tr style=\"text-align: right;\">\n",
48 |        "      <th></th>\n",
49 |        "      <th>RM</th>\n",
50 |        "      <th>LSTAT</th>\n",
51 |        "      <th>PTRATIO</th>\n",
52 |        "      <th>MEDV</th>\n",
53 |        "    </tr>\n",
54 |        "  </thead>\n",
55 |        "  <tbody>\n",
56 |        "    <tr>\n",
57 |        "      <th>0</th>\n",
58 |        "      <td>6.575</td>\n",
59 |        "      <td>4.98</td>\n",
60 |        "      <td>15.3</td>\n",
61 |        "      <td>504000.0</td>\n",
62 |        "    </tr>\n",
63 |        "    <tr>\n",
64 |        "      <th>1</th>\n",
65 |        "      <td>6.421</td>\n",
66 |        "      <td>9.14</td>\n",
67 |        "      <td>17.8</td>\n",
68 |        "      <td>453600.0</td>\n",
69 |        "    </tr>\n",
70 |        "    <tr>\n",
71 |        "      <th>2</th>\n",
72 |        "      <td>7.185</td>\n",
73 |        "      <td>4.03</td>\n",
74 |        "      <td>17.8</td>\n",
75 |        "      <td>728700.0</td>\n",
76 |        "    </tr>\n",
77 |        "    <tr>\n",
78 |        "      <th>3</th>\n",
79 |        "      <td>6.998</td>\n",
80 |        "      <td>2.94</td>\n",
81 |        "      <td>18.7</td>\n",
82 |        "      <td>701400.0</td>\n",
83 |        "    </tr>\n",
84 |        "    <tr>\n",
85 |        "      <th>4</th>\n",
86 |        "      <td>7.147</td>\n",
87 |        "      <td>5.33</td>\n",
88 |        "      <td>18.7</td>\n",
89 |        "      <td>760200.0</td>\n",
90 |        "    </tr>\n",
91 |        "  </tbody>\n",
92 |        "</table>\n",
93 |        "</div>"
94 |        ],
95 | "text/plain": [
96 | " RM LSTAT PTRATIO MEDV\n",
97 | "0 6.575 4.98 15.3 504000.0\n",
98 | "1 6.421 9.14 17.8 453600.0\n",
99 | "2 7.185 4.03 17.8 728700.0\n",
100 | "3 6.998 2.94 18.7 701400.0\n",
101 | "4 7.147 5.33 18.7 760200.0"
102 | ]
103 | },
104 | "execution_count": 3,
105 | "metadata": {},
106 | "output_type": "execute_result"
107 | }
108 | ],
109 | "source": [
110 | "data.head()"
111 | ]
112 | },
113 | {
114 | "cell_type": "code",
115 | "execution_count": 5,
116 | "metadata": {},
117 | "outputs": [
118 | {
119 | "data": {
120 | "text/plain": [
121 | "RM 0\n",
122 | "LSTAT 0\n",
123 | "PTRATIO 0\n",
124 | "MEDV 0\n",
125 | "dtype: int64"
126 | ]
127 | },
128 | "execution_count": 5,
129 | "metadata": {},
130 | "output_type": "execute_result"
131 | }
132 | ],
133 | "source": [
134 | "data.isnull().sum()"
135 | ]
136 | },
137 | {
138 | "cell_type": "code",
139 | "execution_count": 6,
140 | "metadata": {},
141 | "outputs": [
142 | {
143 | "name": "stdout",
144 | "output_type": "stream",
145 | "text": [
146 |        "<class 'pandas.core.frame.DataFrame'>\n",
147 | "RangeIndex: 489 entries, 0 to 488\n",
148 | "Data columns (total 4 columns):\n",
149 | "RM 489 non-null float64\n",
150 | "LSTAT 489 non-null float64\n",
151 | "PTRATIO 489 non-null float64\n",
152 | "MEDV 489 non-null float64\n",
153 | "dtypes: float64(4)\n",
154 | "memory usage: 15.4 KB\n"
155 | ]
156 | }
157 | ],
158 | "source": [
159 | "data.info()"
160 | ]
161 | },
162 | {
163 | "cell_type": "code",
164 | "execution_count": 73,
165 | "metadata": {},
166 | "outputs": [],
167 | "source": [
168 | "X = data.iloc[:,:-1]\n",
169 | "y = data.iloc[:,-1]"
170 | ]
171 | },
172 | {
173 | "cell_type": "code",
174 | "execution_count": 74,
175 | "metadata": {},
176 | "outputs": [],
177 | "source": [
178 | "from sklearn.preprocessing import StandardScaler"
179 | ]
180 | },
181 | {
182 | "cell_type": "code",
183 | "execution_count": 75,
184 | "metadata": {},
185 | "outputs": [],
186 | "source": [
187 | "scaler = StandardScaler()\n",
188 | "X = scaler.fit_transform(X)"
189 | ]
190 | },
191 | {
192 | "cell_type": "code",
193 | "execution_count": 17,
194 | "metadata": {},
195 | "outputs": [],
196 | "source": [
197 | "from sklearn.model_selection import train_test_split"
198 | ]
199 | },
200 | {
201 | "cell_type": "code",
202 | "execution_count": 84,
203 | "metadata": {},
204 | "outputs": [],
205 | "source": [
206 | "x_train, x_test, y_train, y_test = train_test_split(X_poly, y, test_size = 0.2, shuffle = True)"
207 | ]
208 | },
209 | {
210 | "cell_type": "code",
211 | "execution_count": 85,
212 | "metadata": {},
213 | "outputs": [],
214 | "source": [
215 | "from sklearn.linear_model import LinearRegression"
216 | ]
217 | },
218 | {
219 | "cell_type": "code",
220 | "execution_count": 86,
221 | "metadata": {},
222 | "outputs": [],
223 | "source": [
224 | "regressor = LinearRegression()"
225 | ]
226 | },
227 | {
228 | "cell_type": "code",
229 | "execution_count": 87,
230 | "metadata": {},
231 | "outputs": [
232 | {
233 | "data": {
234 | "text/plain": [
235 | "LinearRegression(copy_X=True, fit_intercept=True, n_jobs=1, normalize=False)"
236 | ]
237 | },
238 | "execution_count": 87,
239 | "metadata": {},
240 | "output_type": "execute_result"
241 | }
242 | ],
243 | "source": [
244 | "regressor.fit(x_train,y_train)"
245 | ]
246 | },
247 | {
248 | "cell_type": "code",
249 | "execution_count": 88,
250 | "metadata": {},
251 | "outputs": [
252 | {
253 | "data": {
254 | "text/plain": [
255 | "0.8123763373367298"
256 | ]
257 | },
258 | "execution_count": 88,
259 | "metadata": {},
260 | "output_type": "execute_result"
261 | }
262 | ],
263 | "source": [
264 | "regressor.score(x_test, y_test)"
265 | ]
266 | },
267 | {
268 | "cell_type": "code",
269 | "execution_count": 89,
270 | "metadata": {},
271 | "outputs": [],
272 | "source": [
273 | "from sklearn.neighbors import KNeighborsRegressor\n",
274 | "reg = KNeighborsRegressor(n_neighbors = 10)"
275 | ]
276 | },
277 | {
278 | "cell_type": "code",
279 | "execution_count": 90,
280 | "metadata": {},
281 | "outputs": [
282 | {
283 | "data": {
284 | "text/plain": [
285 | "KNeighborsRegressor(algorithm='auto', leaf_size=30, metric='minkowski',\n",
286 | " metric_params=None, n_jobs=1, n_neighbors=10, p=2,\n",
287 | " weights='uniform')"
288 | ]
289 | },
290 | "execution_count": 90,
291 | "metadata": {},
292 | "output_type": "execute_result"
293 | }
294 | ],
295 | "source": [
296 | "reg.fit(x_train, y_train)"
297 | ]
298 | },
299 | {
300 | "cell_type": "code",
301 | "execution_count": 91,
302 | "metadata": {},
303 | "outputs": [],
304 | "source": [
305 | "y_pred = reg.predict(x_test)"
306 | ]
307 | },
308 | {
309 | "cell_type": "code",
310 | "execution_count": 92,
311 | "metadata": {},
312 | "outputs": [],
313 | "source": [
314 | "from sklearn.metrics import r2_score"
315 | ]
316 | },
317 | {
318 | "cell_type": "code",
319 | "execution_count": 93,
320 | "metadata": {},
321 | "outputs": [
322 | {
323 | "data": {
324 | "text/plain": [
325 | "0.7528827030083509"
326 | ]
327 | },
328 | "execution_count": 93,
329 | "metadata": {},
330 | "output_type": "execute_result"
331 | }
332 | ],
333 | "source": [
334 | "r2_score(y_test, y_pred)"
335 | ]
336 | },
337 | {
338 | "cell_type": "code",
339 | "execution_count": 61,
340 | "metadata": {},
341 | "outputs": [],
342 | "source": [
343 | "from sklearn.tree import DecisionTreeRegressor\n",
344 | "re = DecisionTreeRegressor(max_depth = 10)"
345 | ]
346 | },
347 | {
348 | "cell_type": "code",
349 | "execution_count": 62,
350 | "metadata": {},
351 | "outputs": [
352 | {
353 | "data": {
354 | "text/plain": [
355 | "DecisionTreeRegressor(criterion='mse', max_depth=10, max_features=None,\n",
356 | " max_leaf_nodes=None, min_impurity_decrease=0.0,\n",
357 | " min_impurity_split=None, min_samples_leaf=1,\n",
358 | " min_samples_split=2, min_weight_fraction_leaf=0.0,\n",
359 | " presort=False, random_state=None, splitter='best')"
360 | ]
361 | },
362 | "execution_count": 62,
363 | "metadata": {},
364 | "output_type": "execute_result"
365 | }
366 | ],
367 | "source": [
368 | "re.fit(x_train, y_train)"
369 | ]
370 | },
371 | {
372 | "cell_type": "code",
373 | "execution_count": 63,
374 | "metadata": {},
375 | "outputs": [],
376 | "source": [
377 | "y_pred = re.predict(x_test)"
378 | ]
379 | },
380 | {
381 | "cell_type": "code",
382 | "execution_count": 64,
383 | "metadata": {},
384 | "outputs": [
385 | {
386 | "data": {
387 | "text/plain": [
388 | "0.7613669853377097"
389 | ]
390 | },
391 | "execution_count": 64,
392 | "metadata": {},
393 | "output_type": "execute_result"
394 | }
395 | ],
396 | "source": [
397 | "r2_score(y_test, y_pred)"
398 | ]
399 | },
400 | {
401 | "cell_type": "code",
402 | "execution_count": 60,
403 | "metadata": {},
404 | "outputs": [],
405 | "source": [
406 | "??DecisionTreeRegressor"
407 | ]
408 | },
409 | {
410 | "cell_type": "code",
411 | "execution_count": 65,
412 | "metadata": {},
413 | "outputs": [],
414 | "source": [
415 | "from sklearn.svm import SVR"
416 | ]
417 | },
418 | {
419 | "cell_type": "code",
420 | "execution_count": 66,
421 | "metadata": {},
422 | "outputs": [],
423 | "source": [
424 | "rrrr = SVR()"
425 | ]
426 | },
427 | {
428 | "cell_type": "code",
429 | "execution_count": 67,
430 | "metadata": {},
431 | "outputs": [
432 | {
433 | "data": {
434 | "text/plain": [
435 | "SVR(C=1.0, cache_size=200, coef0=0.0, degree=3, epsilon=0.1, gamma='auto',\n",
436 | " kernel='rbf', max_iter=-1, shrinking=True, tol=0.001, verbose=False)"
437 | ]
438 | },
439 | "execution_count": 67,
440 | "metadata": {},
441 | "output_type": "execute_result"
442 | }
443 | ],
444 | "source": [
445 | "rrrr.fit(x_train, y_train)"
446 | ]
447 | },
448 | {
449 | "cell_type": "code",
450 | "execution_count": 68,
451 | "metadata": {},
452 | "outputs": [],
453 | "source": [
454 | "y = rrrr.predict(x_test)"
455 | ]
456 | },
457 | {
458 | "cell_type": "code",
459 | "execution_count": 69,
460 | "metadata": {},
461 | "outputs": [
462 | {
463 | "data": {
464 | "text/plain": [
465 | "-15057497.241008699"
466 | ]
467 | },
468 | "execution_count": 69,
469 | "metadata": {},
470 | "output_type": "execute_result"
471 | }
472 | ],
473 | "source": [
474 | "r2_score(y_test, y)"
475 | ]
476 | },
477 | {
478 | "cell_type": "code",
479 | "execution_count": 83,
480 | "metadata": {},
481 | "outputs": [],
482 | "source": [
483 | "from sklearn.preprocessing import PolynomialFeatures\n",
484 | "poly_reg = PolynomialFeatures(degree = 4)\n",
485 | "X_poly = poly_reg.fit_transform(X)\n"
486 | ]
487 | },
488 | {
489 | "cell_type": "code",
490 | "execution_count": 77,
491 | "metadata": {},
492 | "outputs": [
493 | {
494 | "data": {
495 | "text/plain": [
496 | "array([[ 1. , 0.52055395, -1.1250769 , ..., -1.93044717,\n",
497 | " -2.61679213, -3.54715794],\n",
498 | " [ 1. , 0.28104837, -0.53706982, ..., -0.09799818,\n",
499 | " -0.06199315, -0.03921656],\n",
500 | " [ 1. , 1.46924486, -1.25935736, ..., -0.53883334,\n",
501 | " -0.14536571, -0.03921656],\n",
502 | " ...,\n",
503 | " [ 1. , 1.14420158, -1.03178731, ..., 1.25352939,\n",
504 | " -1.43053502, 1.63253487],\n",
505 | " [ 1. , 0.86114953, -0.91305511, ..., 0.98163072,\n",
506 | " -1.26591721, 1.63253487],\n",
507 | " [ 1. , -0.32704695, -0.71516812, ..., 0.60224117,\n",
508 | " -0.99155419, 1.63253487]])"
509 | ]
510 | },
511 | "execution_count": 77,
512 | "metadata": {},
513 | "output_type": "execute_result"
514 | }
515 | ],
516 | "source": [
517 | "X_poly"
518 | ]
519 | },
520 | {
521 | "cell_type": "code",
522 | "execution_count": null,
523 | "metadata": {},
524 | "outputs": [],
525 | "source": []
526 | }
527 | ],
528 | "metadata": {
529 | "kernelspec": {
530 | "display_name": "Python 3",
531 | "language": "python",
532 | "name": "python3"
533 | },
534 | "language_info": {
535 | "codemirror_mode": {
536 | "name": "ipython",
537 | "version": 3
538 | },
539 | "file_extension": ".py",
540 | "mimetype": "text/x-python",
541 | "name": "python",
542 | "nbconvert_exporter": "python",
543 | "pygments_lexer": "ipython3",
544 | "version": "3.7.1"
545 | }
546 | },
547 | "nbformat": 4,
548 | "nbformat_minor": 2
549 | }
550 |
--------------------------------------------------------------------------------
/3_supervised_ml/regression/Salary_Data.csv:
--------------------------------------------------------------------------------
1 | YearsExperience,Salary
2 | 1.1,39343.00
3 | 1.3,46205.00
4 | 1.5,37731.00
5 | 2.0,43525.00
6 | 2.2,39891.00
7 | 2.9,56642.00
8 | 3.0,60150.00
9 | 3.2,54445.00
10 | 3.2,64445.00
11 | 3.7,57189.00
12 | 3.9,63218.00
13 | 4.0,55794.00
14 | 4.0,56957.00
15 | 4.1,57081.00
16 | 4.5,61111.00
17 | 4.9,67938.00
18 | 5.1,66029.00
19 | 5.3,83088.00
20 | 5.9,81363.00
21 | 6.0,93940.00
22 | 6.8,91738.00
23 | 7.1,98273.00
24 | 7.9,101302.00
25 | 8.2,113812.00
26 | 8.7,109431.00
27 | 9.0,105582.00
28 | 9.5,116969.00
29 | 9.6,112635.00
30 | 10.3,122391.00
31 | 10.5,121872.00
32 |
--------------------------------------------------------------------------------
/3_supervised_ml/regression/data_2d.csv:
--------------------------------------------------------------------------------
1 | 0,1,2
2 | 97.14469719,69.59328198,404.6344715
3 | 81.77590078,5.737648097,181.4851077
4 | 55.85434242,70.32590168,321.773638
5 | 49.36654999,75.11404016,322.4654856
6 | 3.192702465,29.25629886,94.6188109
7 | 49.20078406,86.14443851,356.3480927
8 | 21.8828039,46.8415052,181.6537692
9 | 79.50986272,87.39735554,423.5577432
10 | 88.1538875,65.20564193,369.2292454
11 | 60.74385434,99.9576339,427.6058037
12 | 67.41558195,50.36830961,292.4718216
13 | 48.31811577,99.12895314,395.5298114
14 | 28.82997197,87.18494885,319.0313485
15 | 43.85374266,64.47363908,287.4281441
16 | 25.31369409,83.54529426,292.7689088
17 | 10.80772668,45.69556859,159.6633077
18 | 98.36574588,82.69739353,438.7989639
19 | 29.14690997,66.36510676,250.986309
20 | 65.1003019,33.3538835,231.7115079
21 | 24.64411349,39.54005274,163.3981608
22 | 37.55980488,1.345727842,83.48015514
23 | 88.16450624,95.15366257,466.2658058
24 | 13.83462084,25.4940482,100.8864303
25 | 64.41084375,77.25983813,365.6410481
26 | 68.9259918,97.4536008,426.1400155
27 | 39.48844224,50.85612819,235.5323895
28 | 52.46317768,59.77650969,283.2916403
29 | 48.48478698,66.97035422,298.5814404
30 | 8.062087814,98.24260014,309.2341088
31 | 32.73188771,18.85353553,129.610139
32 | 11.6523788,66.26451174,224.1505421
33 | 13.73035353,70.47250913,235.3056656
34 | 8.185551765,41.85198942,153.4841895
35 | 53.60987615,94.56012164,394.9394443
36 | 95.36860989,47.29550696,336.1267389
37 | 87.33360921,93.80393433,449.363352
38 | 66.35761109,81.84755128,387.0148165
39 | 19.75471753,65.52330092,240.3894416
40 | 21.13344045,47.43718199,177.1482806
41 | 22.37386481,25.95562754,119.611258
42 | 93.99040405,0.12789052,196.7161667
43 | 86.7201981,18.41376679,236.2608084
44 | 98.99837299,60.23126569,384.3813447
45 | 3.593965644,96.25221732,293.237183
46 | 15.10236337,92.55690357,304.8908828
47 | 97.83414077,2.023908104,201.2935984
48 | 19.93821969,46.77827346,170.610093
49 | 30.37351114,58.77752516,242.3734842
50 | 73.29288315,67.66962776,353.0829913
51 | 52.23090088,81.90244825,348.7256886
52 | 86.42957611,66.5402276,365.959971
53 | 93.40080214,18.07524594,235.4723816
54 | 13.21346006,91.48885878,300.6068783
55 | 4.593462704,46.33593152,145.8187453
56 | 15.66929158,35.543744,138.8803347
57 | 52.95935977,68.72020961,317.1637077
58 | 56.81752123,47.57273192,254.9036313
59 | 51.13354308,78.04216746,334.5843335
60 | 7.862164715,17.72908178,69.35558881
61 | 54.6986037,92.74458414,386.8599372
62 | 86.39906301,41.88869459,294.8717136
63 | 11.94750602,42.96138674,156.7542195
64 | 70.35840106,83.70623451,391.8061353
65 | 29.02236633,84.32778308,319.3104629
66 | 42.75947991,97.49332608,376.2915895
67 | 96.21565644,25.83428258,280.6170439
68 | 53.22772766,27.90550857,194.4304651
69 | 30.36098967,0.939644215,69.64886318
70 | 83.27756539,73.17934857,384.597185
71 | 30.18769248,7.146538599,89.5390084
72 | 11.78841846,51.69776084,181.5506828
73 | 18.29242401,61.97797605,224.7733829
74 | 96.71266769,9.029101513,219.5670937
75 | 31.01273869,78.28338246,298.4902165
76 | 11.39726075,61.72869324,199.9440447
77 | 17.39255579,4.241140863,43.91569235
78 | 72.18269373,34.53907217,256.0683782
79 | 73.98002079,3.716493438,159.3725806
80 | 94.49305835,88.41719702,447.1327036
81 | 84.56282073,20.24116219,233.0788296
82 | 51.74247397,11.00974796,131.0701796
83 | 53.7485904,60.0251023,298.8143327
84 | 85.05083476,95.73699695,451.803523
85 | 46.77725045,90.20220624,368.366436
86 | 49.75843417,52.83449436,254.706774
87 | 24.1192565,42.10281078,168.3084328
88 | 27.20157645,29.97874929,146.34226
89 | 7.009596168,55.87605839,176.810149
90 | 97.64694967,8.147625127,219.1602804
91 | 1.382982509,84.94408692,252.905653
92 | 22.32353035,27.51507504,127.5704788
93 | 45.04540623,93.52040222,375.8223403
94 | 40.16399147,0.161699235,80.38901933
95 | 53.18273979,8.170316162,142.7181831
96 | 46.45677916,82.00017091,336.8761544
97 | 77.13030069,95.18875945,438.4605861
98 | 68.60060757,72.57118072,355.9002869
99 | 41.69388712,69.24112597,284.8346367
100 | 4.142669398,52.25472638,168.0344009
101 | 17.93020121,94.52059195,320.2595296
102 |
--------------------------------------------------------------------------------
/3_supervised_ml/regression/data_poly.csv:
--------------------------------------------------------------------------------
1 | 76.7007086033,663.797275569
2 | 95.2735441552,1014.3622816
3 | 73.0957232493,618.938826916
4 | 46.9516354572,288.012877367
5 | 33.3137480056,144.977555864
6 | 58.8001284334,412.327812022
7 | 86.4775958313,844.415014024
8 | 26.1438291437,89.351909563
9 | 97.6793058523,1053.20525097
10 | 43.7453159626,240.908778383
11 | 27.3960599569,115.936712225
12 | 97.7709380144,1034.55054041
13 | 0.477748646463,3.44508498982
14 | 45.8230852085,254.058885722
15 | 54.8058554165,353.273447055
16 | 90.1526491199,926.945048811
17 | 7.25055401942,30.7577367238
18 | 45.32559552,241.735203861
19 | 78.0299288822,692.797188617
20 | 56.9992954919,382.995647161
21 | 94.0058295504,978.718285869
22 | 26.9031773559,98.2290062278
23 | 23.0527128171,84.9785703587
24 | 72.1620607036,584.521550996
25 | 99.03856132,1082.25149734
26 | 3.8567282135,5.37807424483
27 | 29.1370660814,122.013630198
28 | 86.1406581482,841.297543579
29 | 30.6637414672,131.606512241
30 | 45.7038082016,267.357505947
31 | 20.8365297577,52.1846807158
32 | 57.1870229314,388.795855138
33 | 35.6458427285,155.657694842
34 | 19.3869531878,87.1260781016
35 | 21.9669393722,83.7232155716
36 | 68.2754384479,530.711261039
37 | 0.0499320826943,14.4928408734
38 | 88.0160403117,843.24850174
39 | 52.302680128,340.811945309
40 | 30.8083679391,128.066959237
41 | 36.7450495784,172.783502591
42 | 32.497674786,147.228458255
43 | 69.2984401147,538.180640028
44 | 8.50775977301,18.1584427007
45 | 70.340432818,559.410466793
46 | 57.5800635973,384.884606047
47 | 88.4806455787,875.860995014
48 | 18.4605500858,58.9483005575
49 | 55.8041981373,376.843030927
50 | 16.4395710474,24.9459291843
51 | 11.5900497701,32.6626606312
52 | 46.8705089511,262.885902286
53 | 84.33492545,798.336107587
54 | 47.0879172063,278.115426472
55 | 25.3595168427,104.171319469
56 | 57.5279893076,411.579592637
57 | 64.1988259571,471.645130755
58 | 4.21211019461,-4.60107066464
59 | 54.0943345308,363.848079875
60 | 74.1390699605,634.232884203
61 | 73.6494046203,614.61698055
62 | 36.128161214,165.31378043
63 | 52.0563603735,326.302161959
64 | 47.1946296354,282.326194483
65 | 38.4175210129,199.427772234
66 | 54.6207713889,349.23680114
67 | 64.7683580294,489.500982377
68 | 84.4005467356,805.013871913
69 | 21.4404800201,67.0955374565
70 | 73.2615461079,611.057504623
71 | 20.1698478817,69.446653099
72 | 31.4337548498,138.906777891
73 | 12.5179786376,48.9516001981
74 | 3.61761195371,5.32627355217
75 | 81.2515736807,731.3845794
76 | 56.8424497509,378.289439943
77 | 82.2665238066,748.343138324
78 | 37.4722110019,175.748739041
79 | 84.6946340015,800.047623475
80 | 39.7685020685,203.056324686
81 | 2.05616850915,6.23518914756
82 | 97.1247091279,1042.6919584
83 | 31.1291991206,130.113663552
84 | 84.6126409599,790.864163252
85 | 85.9561287143,827.455642687
86 | 80.9652012247,738.33866824
87 | 88.8979965529,893.81365673
88 | 58.3641047494,402.882690162
89 | 50.3024480011,314.117036852
90 | 12.7419734701,12.9681174938
91 | 77.320666256,687.654958048
92 | 98.9646405094,1089.94766197
93 | 56.3266253207,372.340305154
94 | 79.8360680928,722.043849621
95 | 84.2993047538,798.10497093
96 | 45.3458596295,251.755172506
97 | 3.13987798949,4.26261828882
98 | 70.1062835334,550.923455067
99 | 80.3106828908,728.06984768
100 | 72.0680436903,581.130210853
101 |
--------------------------------------------------------------------------------
/4_model_evaluation/lessons/6. Evaluation Metrics.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Younes-Charfaoui/Machine-Learning-Beginner-Course/b39d9dea1d7f79c1b14590ea338a33d47e9ca49e/4_model_evaluation/lessons/6. Evaluation Metrics.pdf
--------------------------------------------------------------------------------
/5_unsupervised_ml/Mall_Customers.csv:
--------------------------------------------------------------------------------
1 | CustomerID,Genre,Age,Annual Income (k$),Spending Score (1-100)
2 | 0001,Male,19,15,39
3 | 0002,Male,21,15,81
4 | 0003,Female,20,16,6
5 | 0004,Female,23,16,77
6 | 0005,Female,31,17,40
7 | 0006,Female,22,17,76
8 | 0007,Female,35,18,6
9 | 0008,Female,23,18,94
10 | 0009,Male,64,19,3
11 | 0010,Female,30,19,72
12 | 0011,Male,67,19,14
13 | 0012,Female,35,19,99
14 | 0013,Female,58,20,15
15 | 0014,Female,24,20,77
16 | 0015,Male,37,20,13
17 | 0016,Male,22,20,79
18 | 0017,Female,35,21,35
19 | 0018,Male,20,21,66
20 | 0019,Male,52,23,29
21 | 0020,Female,35,23,98
22 | 0021,Male,35,24,35
23 | 0022,Male,25,24,73
24 | 0023,Female,46,25,5
25 | 0024,Male,31,25,73
26 | 0025,Female,54,28,14
27 | 0026,Male,29,28,82
28 | 0027,Female,45,28,32
29 | 0028,Male,35,28,61
30 | 0029,Female,40,29,31
31 | 0030,Female,23,29,87
32 | 0031,Male,60,30,4
33 | 0032,Female,21,30,73
34 | 0033,Male,53,33,4
35 | 0034,Male,18,33,92
36 | 0035,Female,49,33,14
37 | 0036,Female,21,33,81
38 | 0037,Female,42,34,17
39 | 0038,Female,30,34,73
40 | 0039,Female,36,37,26
41 | 0040,Female,20,37,75
42 | 0041,Female,65,38,35
43 | 0042,Male,24,38,92
44 | 0043,Male,48,39,36
45 | 0044,Female,31,39,61
46 | 0045,Female,49,39,28
47 | 0046,Female,24,39,65
48 | 0047,Female,50,40,55
49 | 0048,Female,27,40,47
50 | 0049,Female,29,40,42
51 | 0050,Female,31,40,42
52 | 0051,Female,49,42,52
53 | 0052,Male,33,42,60
54 | 0053,Female,31,43,54
55 | 0054,Male,59,43,60
56 | 0055,Female,50,43,45
57 | 0056,Male,47,43,41
58 | 0057,Female,51,44,50
59 | 0058,Male,69,44,46
60 | 0059,Female,27,46,51
61 | 0060,Male,53,46,46
62 | 0061,Male,70,46,56
63 | 0062,Male,19,46,55
64 | 0063,Female,67,47,52
65 | 0064,Female,54,47,59
66 | 0065,Male,63,48,51
67 | 0066,Male,18,48,59
68 | 0067,Female,43,48,50
69 | 0068,Female,68,48,48
70 | 0069,Male,19,48,59
71 | 0070,Female,32,48,47
72 | 0071,Male,70,49,55
73 | 0072,Female,47,49,42
74 | 0073,Female,60,50,49
75 | 0074,Female,60,50,56
76 | 0075,Male,59,54,47
77 | 0076,Male,26,54,54
78 | 0077,Female,45,54,53
79 | 0078,Male,40,54,48
80 | 0079,Female,23,54,52
81 | 0080,Female,49,54,42
82 | 0081,Male,57,54,51
83 | 0082,Male,38,54,55
84 | 0083,Male,67,54,41
85 | 0084,Female,46,54,44
86 | 0085,Female,21,54,57
87 | 0086,Male,48,54,46
88 | 0087,Female,55,57,58
89 | 0088,Female,22,57,55
90 | 0089,Female,34,58,60
91 | 0090,Female,50,58,46
92 | 0091,Female,68,59,55
93 | 0092,Male,18,59,41
94 | 0093,Male,48,60,49
95 | 0094,Female,40,60,40
96 | 0095,Female,32,60,42
97 | 0096,Male,24,60,52
98 | 0097,Female,47,60,47
99 | 0098,Female,27,60,50
100 | 0099,Male,48,61,42
101 | 0100,Male,20,61,49
102 | 0101,Female,23,62,41
103 | 0102,Female,49,62,48
104 | 0103,Male,67,62,59
105 | 0104,Male,26,62,55
106 | 0105,Male,49,62,56
107 | 0106,Female,21,62,42
108 | 0107,Female,66,63,50
109 | 0108,Male,54,63,46
110 | 0109,Male,68,63,43
111 | 0110,Male,66,63,48
112 | 0111,Male,65,63,52
113 | 0112,Female,19,63,54
114 | 0113,Female,38,64,42
115 | 0114,Male,19,64,46
116 | 0115,Female,18,65,48
117 | 0116,Female,19,65,50
118 | 0117,Female,63,65,43
119 | 0118,Female,49,65,59
120 | 0119,Female,51,67,43
121 | 0120,Female,50,67,57
122 | 0121,Male,27,67,56
123 | 0122,Female,38,67,40
124 | 0123,Female,40,69,58
125 | 0124,Male,39,69,91
126 | 0125,Female,23,70,29
127 | 0126,Female,31,70,77
128 | 0127,Male,43,71,35
129 | 0128,Male,40,71,95
130 | 0129,Male,59,71,11
131 | 0130,Male,38,71,75
132 | 0131,Male,47,71,9
133 | 0132,Male,39,71,75
134 | 0133,Female,25,72,34
135 | 0134,Female,31,72,71
136 | 0135,Male,20,73,5
137 | 0136,Female,29,73,88
138 | 0137,Female,44,73,7
139 | 0138,Male,32,73,73
140 | 0139,Male,19,74,10
141 | 0140,Female,35,74,72
142 | 0141,Female,57,75,5
143 | 0142,Male,32,75,93
144 | 0143,Female,28,76,40
145 | 0144,Female,32,76,87
146 | 0145,Male,25,77,12
147 | 0146,Male,28,77,97
148 | 0147,Male,48,77,36
149 | 0148,Female,32,77,74
150 | 0149,Female,34,78,22
151 | 0150,Male,34,78,90
152 | 0151,Male,43,78,17
153 | 0152,Male,39,78,88
154 | 0153,Female,44,78,20
155 | 0154,Female,38,78,76
156 | 0155,Female,47,78,16
157 | 0156,Female,27,78,89
158 | 0157,Male,37,78,1
159 | 0158,Female,30,78,78
160 | 0159,Male,34,78,1
161 | 0160,Female,30,78,73
162 | 0161,Female,56,79,35
163 | 0162,Female,29,79,83
164 | 0163,Male,19,81,5
165 | 0164,Female,31,81,93
166 | 0165,Male,50,85,26
167 | 0166,Female,36,85,75
168 | 0167,Male,42,86,20
169 | 0168,Female,33,86,95
170 | 0169,Female,36,87,27
171 | 0170,Male,32,87,63
172 | 0171,Male,40,87,13
173 | 0172,Male,28,87,75
174 | 0173,Male,36,87,10
175 | 0174,Male,36,87,92
176 | 0175,Female,52,88,13
177 | 0176,Female,30,88,86
178 | 0177,Male,58,88,15
179 | 0178,Male,27,88,69
180 | 0179,Male,59,93,14
181 | 0180,Male,35,93,90
182 | 0181,Female,37,97,32
183 | 0182,Female,32,97,86
184 | 0183,Male,46,98,15
185 | 0184,Female,29,98,88
186 | 0185,Female,41,99,39
187 | 0186,Male,30,99,97
188 | 0187,Female,54,101,24
189 | 0188,Male,28,101,68
190 | 0189,Female,41,103,17
191 | 0190,Female,36,103,85
192 | 0191,Female,34,103,23
193 | 0192,Female,32,103,69
194 | 0193,Male,33,113,8
195 | 0194,Female,38,113,91
196 | 0195,Female,47,120,16
197 | 0196,Female,35,120,79
198 | 0197,Female,45,126,28
199 | 0198,Male,32,126,74
200 | 0199,Male,32,137,18
201 | 0200,Male,30,137,83
--------------------------------------------------------------------------------
/MLBC Syllabus AR.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Younes-Charfaoui/Machine-Learning-Beginner-Course/b39d9dea1d7f79c1b14590ea338a33d47e9ca49e/MLBC Syllabus AR.pdf
--------------------------------------------------------------------------------
/MLBC Syllabus EN.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Younes-Charfaoui/Machine-Learning-Beginner-Course/b39d9dea1d7f79c1b14590ea338a33d47e9ca49e/MLBC Syllabus EN.pdf
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Machine Learning Beginner 💻📚👨🏫
2 | This repository holds the project code, the datasets, and the slides of the course. Pull requests, issues, and any other contributions that enhance the program content are welcome.
3 |
4 | # Introduction 🎉💯🎓
5 | The Machine Learning Beginner Course (MLBC) is a course devoted to beginners in the domain of machine learning. It reveals the idea behind machine learning and its philosophy, along with the types, techniques, and tools a newcomer should know. We’ll kick off our Python and machine learning journey with the basic yet important concepts: what machine learning is about and why we need it. We will then discuss typical machine learning tasks and explore several essential techniques for working with data and with models. It is a great starting point for the subject, and we will learn it in a fun way. At the end of the course, a final challenge will take place to practice what we have learned.
6 |
7 | # Registration 📚✅📌
8 | Please note that places in this course are limited, so hurry up and register by providing your information through this [link](https://docs.google.com/forms/d/e/1FAIpQLSdWNNNypECt4ueQ6s8FucsZT3uNCgiMEET0O7QKjNFe0uOZwQ/viewform).
9 |
10 | # Place & Time 🕧⏳📅
11 | Every Wednesday at `12:30` in Classroom Number 11 of the `B` Building Block.
12 |
13 | # Outline 👨🎓📋📏
14 | - Introduction.
15 | - Python Refresher.
16 | - Data Preprocessing.
17 | - Supervised Machine Learning.
18 | - Model Evaluation.
19 | - Unsupervised Machine Learning.
20 | - Deep Learning.
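
To give a flavor of where the outline leads, here is a minimal supervised-learning sketch in scikit-learn: preprocess, split, fit, evaluate. The synthetic data and parameter choices are illustrative only, not taken from the course notebooks:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Synthetic data standing in for a course CSV: two features,
# a noisy linear target.
rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(200, 2))
y = 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(0, 5, 200)

# Preprocessing: scale features to zero mean and unit variance.
X_scaled = StandardScaler().fit_transform(X)

# Hold out 20% of the rows for evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X_scaled, y, test_size=0.2, random_state=0)

# Fit a model and score it on the held-out data.
model = LinearRegression().fit(X_train, y_train)
print(r2_score(y_test, model.predict(X_test)))  # R² close to 1.0
```

The same pattern repeats throughout the supervised-learning notebooks, with `LinearRegression` swapped for other estimators such as `KNeighborsRegressor` or `DecisionTreeRegressor`.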
21 |
--------------------------------------------------------------------------------
/course_image.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Younes-Charfaoui/Machine-Learning-Beginner-Course/b39d9dea1d7f79c1b14590ea338a33d47e9ca49e/course_image.png
--------------------------------------------------------------------------------