├── 01_Linear_Regression ├── Linear_Regression.ipynb └── headbrain.csv ├── 02_Logistic_Regression ├── Logistic_Regression.ipynb └── titanic.xls ├── 03_KNN_Classifier ├── KNN_Classifier.ipynb └── credit_data.csv ├── 04_Naive_Bayes_Classifier ├── Naive_Bayes_Classifier.ipynb └── credit_data.csv ├── 05_Support_Vector_Machine ├── Support_Vector_Machine_(Digit_Recognition).ipynb └── Support_Vector_Machine_(Iris_Dataset).ipynb ├── 06_Decision_Trees ├── Decision_Trees.ipynb └── iris_data.csv ├── 07_Random_Forest_Classifier ├── Random_Forest_Classifier.ipynb └── credit_data.csv ├── 08_Boosting ├── Boosting.ipynb └── wine.csv ├── 09_Clustering ├── DBSCAN.ipynb ├── Hierarchical_Clustering.ipynb ├── K_Means_Clustering.ipynb └── Principal_Component_Analysis.ipynb └── README.md /01_Linear_Regression/Linear_Regression.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "Linear Regression.ipynb", 7 | "provenance": [], 8 | "collapsed_sections": [] 9 | }, 10 | "kernelspec": { 11 | "name": "python3", 12 | "display_name": "Python 3" 13 | } 14 | }, 15 | "cells": [ 16 | { 17 | "cell_type": "markdown", 18 | "metadata": { 19 | "id": "jPpVjmofy4u6", 20 | "colab_type": "text" 21 | }, 22 | "source": [ 23 | "# **Linear Regression**\n", 24 | "\n", 25 | "Linear regression is a common Statistical Data Analysis technique. It is used to determine the extent to which there is a linear relationship between a dependent variable and one or more independent variables.\n", 26 | "\n", 27 | "We have implemented here using Python libraries and a ***headbrain.csv*** dataset (from Kaggle). \n", 28 | "\n", 29 | "By [Muhammad Huzaifa Shahbaz](https://www.linkedin.com/in/mhuzaifadev)" 30 | ] 31 | }, 32 | { 33 | "cell_type": "markdown", 34 | "metadata": { 35 | "id": "QI18bF5Jutaa", 36 | "colab_type": "text" 37 | }, 38 | "source": [ 39 | "## **Importing libraries**\n", 40 | "\n", 41 | "We will import NumPy library as *np*, Pandas as *pd*, math, Matplotlib as *plt*, mean_squared_error from sklearn.metrics and Linear regression from sklearn.linear_model. " 42 | ] 43 | }, 44 | { 45 | "cell_type": "code", 46 | "metadata": { 47 | "id": "DU794seugbR3", 48 | "colab_type": "code", 49 | "colab": {} 50 | }, 51 | "source": [ 52 | "import numpy as np\n", 53 | "import pandas as pd\n", 54 | "import matplotlib.pyplot as plt\n", 55 | "from sklearn.linear_model import LinearRegression\n", 56 | "from sklearn.metrics import mean_squared_error\n", 57 | "import math " 58 | ], 59 | "execution_count": 0, 60 | "outputs": [] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": { 65 | "id": "qdBizDuGv6U1", 66 | "colab_type": "text" 67 | }, 68 | "source": [ 69 | "## **DataFraming**\n", 70 | "\n", 71 | "Read .csv data into a Dataframe" 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "metadata": { 77 | "id": "YNmbLZUewP0q", 78 | "colab_type": "code", 79 | "colab": { 80 | "base_uri": "https://localhost:8080/", 81 | "height": 439 82 | }, 83 | "outputId": "e0c9f03e-b602-4573-b77d-298005bfd0d9" 84 | }, 85 | "source": [ 86 | "dataset = pd.read_csv(\"./headbrain.csv\")\n", 87 | "size = dataset['Head Size(cm^3)']\n", 88 | "weight = dataset['Brain Weight(grams)']\n", 89 | "\n", 90 | "print(size)\n", 91 | "print(weight)" 92 | ], 93 | "execution_count": 2, 94 | "outputs": [ 95 | { 96 | "output_type": "stream", 97 | "text": [ 98 | "0 4512\n", 99 | "1 3738\n", 100 | "2 4261\n", 101 | "3 3777\n", 102 | "4 4177\n", 103 | " ... 
\n", 104 | "232 3214\n", 105 | "233 3394\n", 106 | "234 3233\n", 107 | "235 3352\n", 108 | "236 3391\n", 109 | "Name: Head Size(cm^3), Length: 237, dtype: int64\n", 110 | "0 1530\n", 111 | "1 1297\n", 112 | "2 1335\n", 113 | "3 1282\n", 114 | "4 1590\n", 115 | " ... \n", 116 | "232 1110\n", 117 | "233 1215\n", 118 | "234 1104\n", 119 | "235 1170\n", 120 | "236 1120\n", 121 | "Name: Brain Weight(grams), Length: 237, dtype: int64\n" 122 | ], 123 | "name": "stdout" 124 | } 125 | ] 126 | }, 127 | { 128 | "cell_type": "markdown", 129 | "metadata": { 130 | "id": "-Lo-TWfLyHZd", 131 | "colab_type": "text" 132 | }, 133 | "source": [ 134 | "## **Converting dataframes into NumPy Arrays**\n", 135 | "\n", 136 | "Machine Learning handle arrays not dataframes" 137 | ] 138 | }, 139 | { 140 | "cell_type": "code", 141 | "metadata": { 142 | "id": "h9-uO1pcyyUP", 143 | "colab_type": "code", 144 | "colab": {} 145 | }, 146 | "source": [ 147 | " x = np.array(size).reshape(-1,1)\n", 148 | " y = np.array(weight).reshape(-1,1)" 149 | ], 150 | "execution_count": 0, 151 | "outputs": [] 152 | }, 153 | { 154 | "cell_type": "markdown", 155 | "metadata": { 156 | "id": "XiPwFWyEy2oc", 157 | "colab_type": "text" 158 | }, 159 | "source": [ 160 | "## **Training the Model**\n", 161 | "\n", 162 | "We are using Linear regression model as imported from sklearn library and then it's being trained on x and y (any 2 major axis of datasets)" 163 | ] 164 | }, 165 | { 166 | "cell_type": "code", 167 | "metadata": { 168 | "id": "-8bgiHGp0ELK", 169 | "colab_type": "code", 170 | "colab": { 171 | "base_uri": "https://localhost:8080/", 172 | "height": 34 173 | }, 174 | "outputId": "f04ef507-4042-423e-e548-c30395f9884d" 175 | }, 176 | "source": [ 177 | "model = LinearRegression()\n", 178 | "model.fit(x,y)" 179 | ], 180 | "execution_count": 4, 181 | "outputs": [ 182 | { 183 | "output_type": "execute_result", 184 | "data": { 185 | "text/plain": [ 186 | "LinearRegression(copy_X=True, fit_intercept=True, n_jobs=None, normalize=False)" 187 | ] 188 | }, 189 | "metadata": { 190 | "tags": [] 191 | }, 192 | "execution_count": 4 193 | } 194 | ] 195 | }, 196 | { 197 | "cell_type": "markdown", 198 | "metadata": { 199 | "id": "j2Fy1UoQ0G2g", 200 | "colab_type": "text" 201 | }, 202 | "source": [ 203 | "## **Mean Sqaured Error & R - Squared Value**\n", 204 | "\n", 205 | "The **mean squared error** or mean squared deviation of an estimator measures the average of the squares of the errors—that is, the average squared difference between the estimated values and the actual value. MSE is a risk function, corresponding to the expected value of the squared error loss.\n", 206 | "\n", 207 | "The **coefficient of determination**, denoted R² or r² and pronounced \"R squared\", is the proportion of the variance in the dependent variable that is predictable from the independent variable." 
208 | ] 209 | }, 210 | { 211 | "cell_type": "code", 212 | "metadata": { 213 | "id": "K8UKYT-F08nX", 214 | "colab_type": "code", 215 | "colab": { 216 | "base_uri": "https://localhost:8080/", 217 | "height": 52 218 | }, 219 | "outputId": "57af052f-9bbe-42da-f5d5-f6e5c7b27d35" 220 | }, 221 | "source": [ 222 | "regression_model_mse = mean_squared_error(y, model.predict(x))  # MSE between actual and predicted brain weights\n", 223 | "print('Root Mean Squared Error:\\t',math.sqrt(regression_model_mse))\n", 224 | "print(\"R squared value\\t\\t\",model.score(x,y))" 225 | ], 226 | "execution_count": 7, 227 | "outputs": [ 228 | { 229 | "output_type": "stream", 230 | "text": [ 231 | "Mean Squared Error:\t 2367.495611943946\n", 232 | "R squared value\t\t 0.639311719957\n" 233 | ], 234 | "name": "stdout" 235 | } 236 | ] 237 | }, 238 | { 239 | "cell_type": "markdown", 240 | "metadata": { 241 | "id": "bMnyRGHT1hmT", 242 | "colab_type": "text" 243 | }, 244 | "source": [ 245 | "**Getting the b values after the model fit**\n", 246 | "\n", 247 | "These are the slope b1 (model.coef_) and the intercept b0 (model.intercept_)." 248 | ] 249 | }, 250 | { 251 | "cell_type": "code", 252 | "metadata": { 253 | "id": "qRV2OMY_12kP", 254 | "colab_type": "code", 255 | "colab": { 256 | "base_uri": "https://localhost:8080/", 257 | "height": 52 258 | }, 259 | "outputId": "40482de6-966d-419c-8aa1-a32d20bc931a" 260 | }, 261 | "source": [ 262 | "print(model.coef_[0])\n", 263 | "print(model.intercept_[0])" 264 | ], 265 | "execution_count": 8, 266 | "outputs": [ 267 | { 268 | "output_type": "stream", 269 | "text": [ 270 | "[0.26342934]\n", 271 | "325.5734210494426\n" 272 | ], 273 | "name": "stdout" 274 | } 275 | ] 276 | }, 277 | { 278 | "cell_type": "markdown", 279 | "metadata": { 280 | "id": "8FU4kKET2Drz", 281 | "colab_type": "text" 282 | }, 283 | "source": [ 284 | "## **Visualization**\n", 285 | "\n", 286 | "Visualize the dataset with the fitted model using matplotlib."
287 | ] 288 | }, 289 | { 290 | "cell_type": "code", 291 | "metadata": { 292 | "id": "NgDeBCym2S3x", 293 | "colab_type": "code", 294 | "colab": { 295 | "base_uri": "https://localhost:8080/", 296 | "height": 313 297 | }, 298 | "outputId": "0936b932-374c-4f3a-f60f-89f9cd79557e" 299 | }, 300 | "source": [ 301 | "plt.scatter(x,y, color = 'orange')\n", 302 | "plt.plot(x,model.predict(x), color = 'black')\n", 303 | "plt.title('Linear Regression')\n", 304 | "plt.xlabel('Head Size')\n", 305 | "plt.ylabel('Brain Weight')" 306 | ], 307 | "execution_count": 12, 308 | "outputs": [ 309 | { 310 | "output_type": "execute_result", 311 | "data": { 312 | "text/plain": [ 313 | "Text(0, 0.5, 'Brain Weight')" 314 | ] 315 | }, 316 | "metadata": { 317 | "tags": [] 318 | }, 319 | "execution_count": 12 320 | }, 321 | { 322 | "output_type": "display_data", 323 | "data": { 324 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYsAAAEWCAYAAACXGLsWAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nO29e5xVZdn///4wDCqKKAdLkcOYaKmlKaHmk9FT5iGL9MmyUFFTs/Sr/tKnNB+zLDqXmqVGSmKhRllGZQmVqZmgeEQ84OBAIqYEoiLIYeb6/bHWnll7z1prrz2zjzPX+/XaL/a+1tprXXvP5r7u+zrdMjMcx3EcJ40BtVbAcRzHqX/cWDiO4zhFcWPhOI7jFMWNheM4jlMUNxaO4zhOUdxYOI7jOEVxY+E0JJLeI+npWuvRF5C0WNKkWuvh1DduLJy6RtIySR8olJvZPWa2Zy10KkTSVyRtlrRO0lpJ/5R0cK31yoqZ7W1mf6+1Hk5948bCcUpA0sCEQ780s+2AEcCdwK8qcG9J8v+zTk3wH57TkEiaJGlF5PUySRdIekzSK5J+KWnryPGjJT0Smfm/I3LsQklLJb0m6QlJx0SOnSzpXkmXS1oNfCVNLzPbAswCRkkaGV5jqKTrJb0g6XlJX5fUFB5rkvR9Sf+R1CbpbEmWM0qS/i5pmqR7gfXAbpLeKmmepDWSnpb08Yi+R4Wf4bXwXheE8hGS/hB+/jWS7skZnujqTdJWkq6QtDJ8XCFpq+h3Lul8SS+Fn+eUnv0FnUbDjYXTl/g4cATQArwDOBlA0juBGcBngOHAT4A5uUEQWAq8BxgKfBX4haSdI9c9EHgWeBMwLU0BSYOAk4DVwMuh+AZgC7A78E7gg8Bp4bHTgSOB/YD9gY/GXPZE4AxgCLAKmAfcBOwEHA9cLWmv8Nzrgc+Y2RBgH+Bvofx8YAUwMvwcXwLiev1cDBwU6rMvMBH4v8jxNxN8T6OATwM/lrRj2nfi9A3cWDh9iR+a2UozWwP8nmDAg2Cg/YmZLTCzdjObCWwkGBQxs1+F7+sws18CzxAMkjlWmtlVZrbFzDYk3PvjktYCGwgMwMfMbIukNwFHAeeZ2etm9hJwOcEgD4GBu9LMVpjZy8C3Yq59g5ktDlctRwDLzOxnoT4PA7cCx4Xnbgb2krS9mb1sZg9F5DsDY81scxjziTMWU4DLzOwlM1tFYDxPjBzfHB7fbGa3A+uAuogdOZXFjYXTl/h35Pl6YLvw+Vjg/NAFszYc1EcDuwBIOiniolpLMCMfEbnWcxnuPdvMdiCYtT8OHBC5dzPwQuT6PyFYFRDqEL1+3L2isrHAgQWfZQrBjB/gfwiM03JJd0UC7d8FWoG5kp6VdGHC59gFWB55vTyU5VgdGq0c0e/Z6cMkBescpy/xHDDNzLq5kCSNBX4KvB+4z8zaJT0CKHJa5tbMZvYfSWcACyXdFN57IzCiYJDN8QKwa+T16LjLFnyWu8zssIT7PwBMltQMnA3MBkab2WsErqjzJe0D/E3SA2b214JLrCQwSIvD12NCmdPP8ZWF0wg0S9o68ih1kvNT4ExJB4YZRdtK+pCkIcC2BIPxKoAwYLtPb5Q1s6eBO4AvmNkLwFzg+5K2lzRA0lskvTc8fTZwrqRRknYAvljk8n8A9pB0oqTm8PEuSW+TNEjSFElDzWwz8CrQEX6uoyXtLknAK0B77lgBNwP/J2mkpBHAl4Ff9Ob7cPoGbiycRuB2glhA7vGVUt5sZgsJ4gg/Igg6txIGv83sCeD7wH3Ai8DbgXvLoPN3gTMk7UQQ8B4EPBHe/9cE8QMIDNlc4DHgYYLPuoVgMI/7LK8RBMiPJ5jx/xv4NpAL1p8ILJP0KnAmgYsKYDzwF4IYw33A1WZ2Z8wtvg4sDPVZBDwUypx+jnzzI8epHyQdCVxrZmNrrYvjRPGVhePUEEnbhLURAyWNAi4FfltrvRynEF9ZOE4NkTQYuAt4K4GL7Y/AuWb2ak0Vc5wC3Fg4juM4RXE3lOM4jlOUPllnMWLECBs3blyt1XAcx2koHnzwwf+Y2ci4Y33SWIwbN46FCxfWWg3HcZyGQtLypGPuhnIcx3GK4sbCcRzHKYobC8dxHKcobiwcx3GcorixcBzHcYrixsJxHKcv0DYLbhsHNw0I/m2bVdbL98nUWcdxnH5F2yy4/wxoXx+8Xr88eA3QMiX5fSXgKwvHcZxG59GLuwxFjvb1gbxMuLFwHMdpdNb/qzR5D3Bj4TiO0+gMHlOavAe4sXAcx4GKB4gryr7ToGlwvqxpcCAvEx7gdhzHqUKAuKLkdHz04sD1NHhMYCjKqHuf3M9iwoQJ5o0EHcfJzG3jAgNRyOCx8NFl1damZkh60MwmxB1zN5TjOE4VAsSNjhsLx3GcKgSIG52KGQtJMyS9JOnxAvn/k/SUpMWSvhORXySpVdLTkg6PyI8IZa2SLqyUvo7j9GOqECBudCq5srgBOCIqkPQ+YDKwr5ntDXwvlO8FHA/sHb7naklNkpqAHwNHAnsBnwzPdRzHKR8tU2Di9CBGgYJ/J05vjOB2lahYNpSZ3S1pXIH4s8C3zGxjeM5LoXwycEsob5PUCkwMj7Wa2bMAkm4Jz32iUno7jtNPaZnixiGFascs9gDeI2mBpLskvSuUjwKei
5y3IpQlybsh6QxJCyUtXLVqVQVUdxynLDRyPUM/ptrGYiAwDDgI+F9gtiSV48JmNt3MJpjZhJEjY/cbd5y+R6MNvLl6hvXLAeuqZ6h3vZ2qG4sVwG8s4H6gAxgBPA+Mjpy3ayhLkjuO04gDbxUa3jmVodrG4jbgfQCS9gAGAf8B5gDHS9pKUgswHrgfeAAYL6lF0iCCIPicKuvsOPVJIw68Xs/QsFQswC3pZmASMELSCuBSYAYwI0yn3QRMtaCEfLGk2QSB6y3AWWbWHl7nbOAOoAmYYWaLK6Wz4zQUjTjwDh6TUCnt9Qz1TiWzoT6ZcOiEhPOnAd2Sms3sduD2MqrmOH2DRhx4952W34MJvJ6hQfAKbsepN7IGrRuxkMzrGSrKn//8Z5YuXVqRa3vXWcepJ0rpflqFTqMVwesZys7MmTM5+eSTAZg0aRJ33nln2e/hXWcdp57w7qdOCcyePZtPfOITna+32mornn32WXbZZZceXc+7zjpOo9CIQetSaLS6kDrl97//PZLyDEVbWxtvvPFGjw1FMdxYOE490Ze7nzZiXUidMXfuXCTxkY98pFO2ZMkSzIxx48ZV9N5uLBynnmjEoHVWGrEupE64++67kcThh3c25Obxxx/HzBg/fnxVdHBj4Tj1RF/OFurrLrYKMH/+fCTx3ve+t1P20EMPYWbsvffeVdXFs6Ecp97oq9lCjVgXUiMefvhh9t9//zzZ/PnzOfDAA2ukka8sHMepFn3ZxVYmFi9ejKQ8Q3HXXXdhZjU1FODGwnGcatGXXWy9ZMmSJUhin3326ZTNnTsXM+PQQw+toWZduBvKcZzq0VddbD1k2bJltLS05Ml+//vfc/TRR9dIo2R8ZeE4jlNlVqxYwaBBg/IMxezZszGzujQU4MbCcRyn55RYZPjiiy+yww47MHr0aDZv3gzAjTfeiJlx3HHHVV7fXuDGwnGcxqcWleElFBmuXr2aUaNG8eY3v5lXXnkFgJ/85CeYGSeeeGLldS0Dbiwcx6k9vRnsa1UZnqHI8JVXXmH8+PGMGDGClStXAnDFFVdgZpxxxhmV1a/MuLFwnHqgP/dM6u1gX6vK8JQiw3Xr1rHvvvuyww470NraCsA3vvENzIxzzz23snpVCDcWjlNr+nvPpN4O9rWqDI8pJtywCQ7+ajNDhgzhscceA+CSSy7BzLjooosqq0+FcWPhOLWmv/dM6u1gX6vmi5Eiw42b4f3fgMGnwPwlmwD4/Oc/T0dHB5dddlll9agSXmfhOLWmv/dM6m0bkHJt1do2q7SNpFqmsHHjZrZ+2yl54jPPPJOrr74aSaXdv87xlYXj1Jq+3JY8C+VoA9K0Tdfz5uGlV4bf/zm478TMrsDNmzfT1NSUZyhOOukk2tvbueaaa/qcoYAKGgtJMyS9JOnxiOwrkp6X9Ej4OCpy7CJJrZKelnR4RH5EKGuVdGGl9HWcmtHfeyb1pg1ILt6zaXWXrGNDafdvmwWt1wIFu4bGuALb29sZNmwYgwYNoqOjo1O+adMmZs6cyYABfXf+XbFtVSUdCqwDbjSzfULZV4B1Zva9gnP3Am4GJgK7AH8B9ggPLwEOA1YADwCfNLMn0u7t26o6DUepLhAnoBzb0CZdAwDBpzowM3bbbTeWLcu/5oYNG9h6662z61vnpG2rWrGYhZndLWlcxtMnA7eY2UagTVIrgeEAaDWzZwEk3RKem2osHKfh8J5JPaMc8Z6Uc22b0bxzv/149NFH8+Tr1q1j2223zX6PPkAt1kxnS3osdFPtGMpGAc9FzlkRypLk3ZB0hqSFkhauWrWqEno7Tv+j3us/yhHvSTh30tdhwLH/yjMUa9euxcz6naGA6huLa4C3APsBLwDfL9eFzWy6mU0wswkjR44s12Udp//SCPUf5Yj3FFxj8vdBU+CuJ7tOWbVqFWbG0KFDe6lw41JVY2FmL5pZu5l1AD+ly9X0PDA6cuquoSxJ7jhOJWmbBfOn1kf9R9rqphx7ZITXOOmn26IpMOehrkMrV67EzBgxYkSZPkzjUtU6C0k7m9kL4ctjgFym1BzgJkk/IAhwjwfuBwSMl9RCYCSOBz5VTZ0dp9+RW1FYe/zxatZ/5HTJGa3c6ga6DEIv4z1nnXUWV199dZ5s+fLljBnTT1KXM1IxYyHpZmASMELSCuBSYJKk/Qhy1JYBnwEws8WSZhMErrcAZ5kFv1RJZwN3AE3ADDNbXCmdHcchvqI8SjXrP9Kq23uZEHDhhRfy7W9/O0+2ZMkSxo8f36vr9lUqmQ31yRjx9SnnTwO6ORrN7Hbg9jKq5jhOGmkrh2rXf1Sguv3rX/86l1xySZ5s0aJFeVuaOt3puxUkjlNP1HtWUZSklYOaqr9ndhmr26+88kok5RmKhQsXYmZuKDLgxsJxKk0jZBXlaJsFW9Z1lzcNhoNmVr8WpAzZTtdddx2SOO+88zpl//jHPzAzDjjggHJp2udxY+E4laZRusrGtc6AnvVaKhe9yHa66aabkMTpp5/eKZs3bx5mxiGHHFJBpfsm3nXW6b9Uq8VGo3SVTQpsN29X2+ryErOdbrvtNo455pg82Zw5c/jwhz9cbs36Fb6ycPon1XQNNUpX2UYxagnMnTsXSXmG4pZbbsHM3FCUATcWTv+kmq6hRukq2yhGrYB77rkHSRx+eGezaq6//nrMjE984hM11Kxv4cbC6Z9UcxZdjirjapBm1Oowm2vhwoVI4tBDD+2UXXXVVZgZp556ag0165t4zMLpn/R2d7ZSaYSusjn9CuM4ULyKuoosWrSId7zjHXmyb37zm1x4oW93U0ncWDj9k3JtxdnXiDNqt42rWBV1KSxZsoQ999wzT/alL32JadP6+d+sSrgbyumfNIprqCeU22VU48D3smXLkJRnKM455xzMrPqGog7dcdXCVxZO/6URXEOlkqXxXqlU22UXsnLlSkaNyt++5uSTT+ZnP/tZRe+bSCW+2wbCVxaOU0/0duZaiSyvuMA3gl2OKu06GT/bqlWrkJRnKI499ljMrHaGAhqnuLJC+MrCceqFtJkrdAWeBw0L+jZvXtO9mLASLqOWKbDqXmi9luDGBP+2zYSRh2SbVWeYla9du5Ydd9wx723vf//7+ctf/tJz3ctJg9eh9BY3Fo5TLyTNXBeeCx0buo5F23EUDrrldhl1VrnHXLOUIHfKrHzdyMkMGTIk79C73vUuFixYgKSe6V0JauSOqxfcDeU49ULSDHXz6vT9JaKukHIWALbNggWnxg+QObLOqmPO27AJdMzyPEOxxx570NHRwf33319fhgIap7iyQvjKwnHqhaSZaxZyg3FSrURPArAPngsdm9LPyTqrjny2TVtgq6n5h3faaSdeeOEFBgyo4/lrOb/bBsSNhePUC0m1H03bdO8EW4gGBCuBXIZXOQawYvcsZVa97zTa55/O9qdsYP3GLvFWzbDuhm0YePAPoBqGorfNI/tiBl1G3Fg4Tr2QtYI6Dmuvbhrn4LGZB9qOjg52PeR/eeGFDXnyjTNh0ECADdUp8Ovnqa+9xY2F4/SESrU3T5u5Pnhu
+my/3FXVzcODeEmc/KPLir7dzNhrr7146qmn8uSvz4DBWxWcXI2Mogru590fqGMHoePUKbXY+a5lCgzcrvh5PY15xDHhSlBzvkzNgTwFM+Oggw5iwIABeYbi1cd+it3U1N1QQHUyivp56mtvqZixkDRD0kuSHo85dr4kkzQifC1JP5TUKukxSftHzp0q6ZnwMbXwWo5TdWpVnJXJEKh8RqtlChz0s/yWKAf9LHUWfsQRRzBgwAAWLFjQKVuzZg327C8Y8sS5gbuskGplFDVoC/Z6oZIrixuAIwqFkkYDHwSi5vxIYHz4OAO4Jjx3GHApcCAwEbhUUn7VjuNUm0rNUItVOKspw0Ws90YrqsejFwcD+ac6AtdTgqH4xCc+gSTuuOOOTtmLL76ImQWFdkm78Kmpej25iqW+9uO+T1momLEws7uBNTGHLge+QFcpKMBk4EYLmA/sIGln4HBgnpmtMbOXgXnEGCDHqSqVmKFmcW3Fzcrj6I3RKtHFdvrppyOJ2bNnd8pWrFiBmbHTTjsV18k6qhcvSGseWQvXYoNR1FhIaskiy4KkycDzZvZowaFRwHOR1ytCWZLccWpHJYqzsri2Bo/Ndq3eGK1ieoSz788fJSRx3XXXdZ62dOlSzKxb879UnartAmqZEqyQCldK/bzvUxayrCxujZH9utQbSRoMfAn4cqnvzXj9MyQtlLRw1apVlbiF4wRUor15FtdWbEO/AnprtNL0aJvFly84GR2znMv/1HXoyXnfwczYbbfdkq9b79XPHvwuSmLqrKS3AnsDQyUdGzm0PbB1D+71FqAFeDQs498VeEjSROB5YHTk3F1D2fPApAL53+MubmbTgekAEyZMsLhzHKdslLs4K0vfodz95k+Nd0mVw/+foMd37xjKF6ackCd75Buw71hg3Y+B/02/br1XP/fzvk9ZSFtZ7AkcDewAfDjy2B84vdQbmdkiM9vJzMaZ2TgCl9L+ZvZvYA5wUpgVdRDwipm9ANwBfFDSjmFg+4OhzHH6Flln3i1TAj9/HDl5T4K0ueDu+uVAV0+mq+eBpsAXblzbKZv/VbBZoaGA7rPvpEBxkguoHgLL9b7yqQMSVxZm9jvgd5IONrP7Sr2wpJsJVgUjJK0ALjWz6xNOvx04CmgF1gOnhDqskfQ14IHwvMvMLC5o7jiNTSkz76RZcPOwnlUoF1Y2Y8y8G07+Sf5pf//am3jvbi/G65N0rWI61EtVdb2vfOoAmaV7bCSNJFhJjCNiXMzs1Ipq1gsmTJhgCxcurLUaTj1SqcrrclJMx/s/V7C3BOk9pAaPTa64bpuV59b61QL4+A/zT7n99ts58sgjY4xKeN+o66tzdZJRh1LPdyqKpAfNbELcsSztPn4H3AP8BciYu+c4dUi9zGLTKKZj26xg06G8zHNBy9TQgMSQFKTN3cva+ePDcPT38g/feh4ce3nkPllm30UC5N3e64HlhiHLyuIRM9uvSvqUBV9ZOLE0wiy2mI5px6HkWf2dDyznv7+RL/75Z+GE/0p5XxpJ+jUPz9/ACXq+GnIqRtrKIkvq7B8klbjZruPUIY0wiy2mY9rxEoK09913Hzom31Bce2oQuD7hv5Lfl0hCgLxTBxFfx2B4YLlBSDQWkl6T9CpwLoHB2CDp1YjccRqLeikMS6OYjmnH8+o/CFJpc4VlYYbRww8/jCTe/e53d771+1MCI/GZ99P1vlJScPOqnyHPRdY8PLjWpoS8lM1ryl+z4lSERGNhZkPMbPvw3wFmtk3k9fbVVNJxekRhSuYuR9X/LLbY6iDu+IBBsHldVy+n3OfM1WKsX84TvzkNSey/f2ePTr5y7rHYLYP5fNRv0DQYDppZ2mCd1PcJAtcTFDdycSm1Tl2Rpd3H/jGPt0jyvTCc+iWu10/bzCAQXOosttDo3P+55LqA3tYMFKsOLzw+aDiYhftOhJ+z9drOwXvpi0GdxN4XvNF5iwsuuAAz49Irbi3PrD7NjZdb2XgdQ8OTJcA9n6AQb1EoejvwODAU+KyZza2ohj3AA9xO2YLZbbNg/ilgm5PPyaWPQvy5zcO79oB49OJALzUFM//cjnNpx9IG74TP+dxqGHNOvuwz/w3X/tXKnz6c9F13omDV0Ahpy/2ctAB3FmPxG+ASM1scvt4LuIygc+xv6jFTyo2Fw00DyE8vzREOXFn51Yj43eIKGTw2cAUlnatmkKBjU2nHCusYCin4nC++Am/+XP4pnzwYbjqbLuNTrFaiVOLqL6J4ZlPD0NtsqD1yhgLAzJ4A3mpmz5ZLQccpO+UKZmcxFBDMltPOtc3xxqDYsWKdT8PPs/q1wN0UNRQf2i8IXN90Nl0un0p0V825xpqHdz/mrqY+QxZjsVjSNZLeGz6uBp6QtBWQsjZ36o566MFTTtpmBTP/mxQ8fj2i6zNV20deyYyquJhA+Ld89T/L0RQYcWbXof/aMzASf8j19otmN1UqfbhlChz3Hzj4F57Z1EfJEqQ+GfgccF74+l7gAgJD8b7KqOWUnUaoXi6FuFjCptWwIOxCk1ZtXIrvfNDw+KKxKDkj9OC5xc/tCYWGqG0W6+85nW2nbsgTv300PPrNwKOVR3SDoUp3Vy13N94cbbNg4bldq7dBw+GAKxvzt9ugFI1ZNCIes4ihEaqXSyEtqFqsF1IpPvu2WYEBirqJNBAGDg1qBAqNUOG5ne8pT8xi48aNbL11/g4BY4ZD2xUwoKkpvnV59Pso9fPXA0lJBgMGwYEz6lfvBqRHvaEkzTazj0taREyk0MzeUUYdnUrTCNXLpZCmd9qxNJ993KBTSjfSvHNLzHhKO9Yyhc2bN7P11lvT0dEVnB86GFb/BJpyzuSkbVe32z00rqH+LVNh5e2Nk5X06MXx2Wgdm5L/br3Bs7ZiSVxZSNrZzF6QFLuXo5ml5crVFF9ZxOAri4ByZUlVifb2dkaOHMnLL7+cJ980E5oLp3pKWFkgunWoreeVRCGJfzMo+9+tEVdeZaRH2VDh5kNRozA+fP4S4HtKNBp9oSgqGqDfso7Yn++AQemfqRFafgBmRktLCwMHDswzFBs2bMCe/QXNW8X8LZNWFoUDbcye2nWd9JD2tyn338334k4kSwX36QR7bue2QtkVuK2SSjkVoBL7RleTworsTathwEAYsG3XOYOGx/uwowPi5nWBQYkSNZqlDp7R8381IsjI6unA2zYL++1Y9hsrBgwYwLJlyzoPrVu3DjML4hVJf8vBsU6AeHItwwur3O8/o/4Mxr7TgrhOIcUmBj2hr7lry0iWbKizgInAAgAze0bSThXVyqkMlcpUqQZxM76OTTB4Z/jouuT3FboVNq8OBp5Bw4PmdjmfNHQvwCt1l7dS3htzrUkfOom7nsx3qax9ZDpD9z093o8e52rrVhxX4ILKMXhM6fGbWpHTpRrZUL4XdyJZ6iw2mllnmkbYE6rvpVA59U1PZ3xxA6JthoHbdTWug2CQjSuqi3NB5FYT952QXLWc9N4YJk+ejHY7Ic9Q/OfaoFZiaNu07CuAuBXH7mcmux9L/U5r6bLK1XF8yoLHx/5
TGYPWF9y1FSKLsbhL0peAbSQdBvwK+H1l1XKcAtJiDWmDWJYBMa1rauG53dpxF2H98sRB9aSTTkISc+bM6ZS98OPASAwfErl3Fj96pwE7MXh98M8DQzjx6iD7SU2BXE3B65YppcVvGsVl1Vsa3V1bQbK4oS4EPk3QSPAzwO3AdZVUynG6sctR0HpNd/l2u6cXG2ZxKxRbnUTPLWZY4ph/Spc+wFlnncXVV1+dd8ry6aMYs+3z8fcuZvDSCi4h6LabC35be/B65CHJfaLiZtGN4rIqB43srq0gaZsfrZZ0O3AR0ApMNbOPmdlPLUMln6QZkl6S9HhE9jVJj0l6RNJcSbuEckn6oaTW8Pj+kfdMlfRM+Jjaq0/rNC4rb4+Xr/p7+qw7i1th0LDk+xae25NAp22Ghedy4YUXIinPUCxZsgQzY8wHvp2sZ7EVQNpAXmyQL5xFt0wNjhWu0pJWUllXWE7Dk7ayaAEOAt5NYDAOkNRG0O7jXjObXeTaNwA/Am6MyL5rZpcASDoH+DJwJnAkMD58HAhcAxwoaRhwKTCBIE7yoKQ5ZpafdO70fZIG6aR00dz5WYrq0qY+VtA7I2mlksLXfwuX/Ho18O1O2ePfhr33GAsD7wfGF9czbQXQk3hO9PvJ3SNthZJUw5Fzbzl9nkRjYWavAnPDB5K2BU4h6BF1NpBqLMzsbknjYq6ZY1u6/ptOBm4MVyzzJe0gaWdgEjDPzNaEOswDjgBuzvbxnD5D0iCdNIhFZ+PF3AqbU8qGOl7PdyMlucNiuPLPcN7P82ULvw4HtIQvCl1mSXoWMyRprraktunNBauptlkwf2r37zK3Ckkyyom1HU5fI80NtYukj0n6gaR7gD8DuwP/B+zW0xtKmibpOWAKwcoCYBTwXOS0FaEsSe70N5LcSW85o7scgkEya/A1zQ0FgRsp59ZKcodFuO7OoF141FD848tB4LrTUOTIWvCVtvVomqutsKlgjqg8t6JIW6Ul1XCUUtvhNDRp2VArCILbDwLvN7P3mNl5ZnZLb1p9mNnFZjYamEWwQikLks6QtFDSwlWrVpXrsk6tKMxwgnz/evNwaNom2EK0aRto2jb//ZtXZ8/WyZIIvn550R3hbro3MBKnR9I/5l0UGIlD9ky7dhnagydl8GxKWDVF5cWC9oPHBCuqQsvjKaX9ijRjcQhwE3AMcJ+kWyVdIOmQcC+L3jIL+J/w+fPA6MixXUNZkrwbZjbdzCaY2YSRI0eWQT2nZiSlaUIwqz7459CxIWwHHlZzxw12WWftaW6oKAmG4raFgZGYEklwmnN+YCQ+sE+G65aj4Ctp5ZElPV5mDgMAACAASURBVDbNWDUNDgxF20zyraq6UnCdfkFab6j7zOwHYQbUAcD5wEZgJvBKT24maXzk5WTgqfD5HOCkMCvqIOCVsDfVHcAHJe0oaUfgg6HM6csUqyuInQknLA/WLw82Rrp5YPBvXDFZDwfruY8FRuKYy7tkt5wdGIkP75/8vjzUHAzGlSp2y5INlvb5cx1q477vDC45p++QWmch6a0E2VDvJlhp7ADMB64tdmFJNxMEqEdIWkGQ1XSUpD2BDmA5QSYUBLUbRxGk6K4nCKRjZmskfQ14IDzvslyw26kzytnWuVh2T4/SV0N/fFwbjrh6gxTueQoO/Vq+7Prrr+fUrU+j9OYGBkuv62rBXe5NqbJkg6V9/raZyd+L90vqV6S1KP8PsBK4jyBd9p9m1lpF3XqMtyivMuVu61ysnXpi7CChD1IchW3Mo8ZOA2KDvQ8shYlfzpddddVVnH322el69wQ1BTvcVWs/haRsqE5dimyq5PQJetSiHHiLmb3DzD5jZjc2iqFwakC52zoXc50kHd/9zCDwnYXCWXHLlK4COGsnGsx97F+BuylqKL75yWbs2V8EhiIXjF+/nOT0oxKxdnrUVqOn/ZtapgTGKUkX75fU70mLWfQoLuH0Q8rd1rlYf56k4yMPCQLfWYjZ1zq/55Px9MrASOx7UddpX5oM9tuxXDjtZ11bqRa8r2wGI0dWw9vb/k2JwfCx3i/J8T24nTKQ5H5pHg7N21Vve8pS3EAH/yJfl8h7l62ClvPyTz/nnHO48sors98zcde6AgYMArP4bUPzL1h8R7je7obYz3eJc3ruhnKcbMS5hdQM7a9Vt0tp1pXMoOHdB7/1/2Lly8FKImooTj402LWum6HIcz3FkGYoBg2nc4Z+4Aw46GfFi9uyZGz1doXnHVedFIp2nQ1rKv4HGBc938wuq5xaTkMRl3GzZV1YBxGh0l1Ks/RtahocbJoTYdWqVew0JX+Ffey74NbziB/E42bghfQkKJx0zazxgXJs3FOLjqvlzKRzKkaWlcXvCGoitgCvRx6O00VhUVhS5XAl0y2TVjjRmXxkprx27VoksdNOXRs/vn/voE7i1vMorV13lKQ2JGmDftI11ZR9dt+IG/f0l30y+gBFYxaSHjezLHWodYPHLGpA4ewwqYHd4LHB4FWpmWRUj0HDgnjz5jVdLSuWz2bda6sZ8un8t71rN1hwGagwNq2mYOCfGCnPvmkAiSm6uc+XC35n/ZyJ18wQq4hSeM9djgqK5+p11t7bOItTVtJiFlmMxXTgKjNbVAnlKoEbiyoT55YZMAg6thDUX4aoGd5yWvdCr0oEUWN02rAJBp+Sf9oeO8NT340xEoXs/tkug1GJAa4S12yEgHW5jKRTFnob4P4vgn0kng43Jlok6bHyqug0NHEulI5N5BkKCEbk5bNLq8noad1ARKdNW4LAddRQvGkotP8cnv5eBkMBsHR61/M4dw8Km+31kDQXUhm+g056U/9SCUrZ2tWpKVm2VT2y4lo4jU3WOETHJuiIcU0lXSNtM56kmXGnG2Y5W9ph+9OCFUWOrZph3fUwsNQ9e6LB6pYpsOreoONt56zYurYr7cmsPaktR+F9SmkHUu76l0pQytauTk1J289i+/DpawkPp69S6ky2HLNADeh+n1JnxqFx6Vi3nJ3PguaT8g3Fxpnwxg09MBTQfUe4lbfTzX3S21l7YZIAFBikEu/TCLN2T9dtGNJWFjcBRxPsZ1FYlmr0YgMkp44pZTYfmcVn7ss0aDi0b+huBKy9+33SZsYxwWN75Eu87f9bz9Mv5J/++gwY3Num+m85o7sOSbrl6G1K6KMXk9xNN8PqoFFm7bVI13VKJm1b1aPDfwv39nL6Mmmz+eh/6G7B09x8IsVgRGsc0rbwLLZd6KBhefe215dz0GEncf/S/BjJq9fBkG1SPuug4TDm42G2UEp9RjS4nSNNN+iZC62QNIOQZXWQpeOs42QkS8yCcC+J8cDWOZmZ3V0ppZwakjRoFg5ciXtKDKBbYBsCedS9cN+Jxe8T2zpb0P5Gp+zwb8HcReTdc8102LFg47w8ohlGbbMi+zIUGLu0zKF9p8GCU8NAfoTNr3atKLIY3TQSiwyVfXXgs3anTBTNhpJ0GnA3waZDXw3//Upl1XJqQtssEpvgFc5kE2e9HUHabJQBg+DgG/MHrSz+9JYpweY7hR7Q9tf5xA+DDKe5kY
TuF68OCupSDYWauwbatCaAxXznLVOgaUh3uW2GB88tT3A5Ketq9zPjXYKV2kDJcciWOnsu8C5guZm9D3gnsLaiWjm1IdFHHjOTTetQeuCM/IDlbp8Orh0dyLKmnxYEkk//aWAkZi/oOmXFVYGR2GloweUGbEueoWnaNujDFHXPxK2OciuPYjPypO1YN62O+WwhpbbeKAz+Hvzz7i4xr4J2qkAWY/GGmb0BQZ8oM3sKSNt+3mlUEme91n3gTKsLiGb17DstSCmN2087btXQNjN/kAtn/Z//RWAkrvt716FnLw+MxKhhCWrLyDd+BYawt7P/tIG/PaYjTnRVk5WkvbWjNEI9hdPwZDEWKyTtANwGzJP0O4ItUZ2+RuLgp+6z1Kwpj2kDWbH007ZZXPabwEhc/qeuU578bmAkWt6UkgOrpuIDaG9TS0sd+Ju3r0z8oBHqKZyGp6ixMLNjzGytmX0FuAS4HvhopRVzasC+04iPWVj8LDXLrDdtIEs5NmfOHLTbCVx6a5f4kW8ERuKtu+TU6gj2pYhb4SS1CM+l3SbtbFeYWpoWC2iZkn1nPkhurpiFND0aoZ7CaXhSjYWkJklP5V6b2V1mNsfMNqW9z2lQWqbQq7z+ONIGsphjdzwGmmJMnjy5U/bA1wIjsW9ht/DBY5JXOEn7QzQPyx7UTooF3P+5roFbBO6lPDImCRSj00AoyB5Likk0YrdZp+FITZ01s/awJ9QYM/M1bV8ml+6ZRE9nqcUKw8Jjf38C3lcwti3+4c7sNbygwi5H1P+flB4ad1+RHtSOkuRCi1ZVb1odZHsNHJ7f3TauWWIpg3dsHUuBHrk0XK+ncKpAljqLHYHFku4nso+FmX0k7U2SZhBUgL+Ua3Eu6bvAh4FNwFLgFDNbGx67CPg00A6cY2Z3hPIjgCuBJuA6M/tWSZ/QKU6WzXy2rAvOK3UAKjKQ3ffQM7z7+K/mveXhabDfWwZDy0e7D7oQZDkd+JN0XZLum6W+I00GdBu4OzbB1tvBcf/pko08pPfV22l/j0L9vJ7CqTBZWpS/N05uZncVed+hwDrgxoix+CDwNzPbIunb4XW+KGkv4GZgIrAL8Bdgj/BSS4DDgBXAA8AnzeyJtHt7i/ISybp3dRnbWz/00EMccMABebL5X4UDd48IBo8NZulLpwcxiLi9JUqllFbgpezpXe6W2ml7ZuTwPR+cMtOrFuVhnOKu0DgsBu4uZijC990NrCmQzTWzLeHL+cCu4fPJwC1mttHM2oBWAsMxEWg1s2fDOMkt4blOOckajyhDOubjf/4WkvIMxd2XBDGJPEMBwUDdNrMrWG3t+am1PSlEK8W/n1QLEke5g8nFrucxCafKpHWdPUjS3yX9RtI7JT0OPA68GLqGesupQC4hchTwXOTYilCWJI/T9wxJCyUtXLVqVRnU60eUMtD1MNC9ZMkSJPH2Iy/qlM27COyWwbznHUkZRQOS0197WohWSpfTuHN3P7M6weQ0Q+WdWZ0akBaz+BHwJWAo8DfgSDObL+mtBC6jP/f0ppIuJtjTu2wlpmY2HZgOgRuqXNftFyT1YIpzg5Q4g25ra2O33fIbFP/+fDh6//BF+3qwpN2HEtw66//Vu95Lpfj3487tbTwi633Bg9ZO3ZBmLAaa2VwASZeZ2XwAM3tKmbYWi0fSyQSB7/dbV8DkeWB05LRdQxkpcqdcxA1MvczoWbFiBS0tLWzZsqVT9qtz4GMHxpzcEVPtnMbgMbUtRKtWMNmD1k4dkRaziE7rNhQc69HMPXRffQH4iJlFp4VzgOMlbSWphaDD7f0EAe3xklokDQKOD891SiGLb7+wwG7kIdAU6e/dPDxoz1HY46ngHi9eI4ZuO4DRo0d3Goobb7wRM+Nj70uofSiFnMGqZiGaN+lznNSVxb6SXiXwR2wTPid8vXXy28KTpJuBScAISSuAS4GLgK0I2oYAzDezM81ssaTZwBME7qmzzIKopqSzCTrdNgEzzGxx6R+zH9PTrUkL3VKbV0PrNV2vo9cBVv/1dPb53w38ey3k5hLTv/FpTr/ouq737DsN5p8SdGbNoeagDcamhO1WC2mZ2qV3NTb2Kce+FI7TByiaOtuIeOpshFJSRSEYHOM2JkpgrY1mwgUvsPTfXe6mK0+Ec46IuUfbrO57QAwYFHSlbZ1OUGJTjLBF98Sre78TXRZK/f4cp4FJS53NtPmR08Ak+vZjBsDcLDqDoXhtAxzyVVj0XFey2jc/ARdGSzXjNkwq3CyoYxMsnw0DmqAji7GwoIJ65CHV8el7kz7HAbJ1nXUambROstEeR7eNCzbtKVI1vH4jHPRl2P40WBTaiUuOG4rNKjAUhfdum5Vc4LZ5dXcjkkpCY8NKkPj9WXnjFx4XceocNxZ9nbROsq3X5tcppMQNNm6G/54G254KC5YGsvM/NJCOpT/nsm//OL32ILdiKSdpM/tyDryx9Q45Hcq0yZBvXuQ0AG4s+jppnWQzJLVt3gIf/h5sfTLcGTZZ+ewHoOM3Y/jeVTeg3U4oXuiW1ueoaTAMSijKGzSckiumyz3w5n22GMqxyZBvXuQ0AB6z6A8MHltCj6OA9g741I/yty+d+r5tmfHTaxjwlphmfGnxg7RVwMTpwb9xRYGbVgdboRbuOpeW9dSbYr0kcp8tqV9Tb+MXHhdxGgBfWfRl0jb5SZixdzQN45Trt2XgiV2G4rjjjmPz5s3c8Ld18YaiGGn7decG4rzZe6R6vP31IL02t8oo1uqikgNvpWo7fPMipwFwY9FXyXPHQLdNfgp6HJnBWTMH0nT8Gm74WzCTP+qoo9i0aROzZ89m4MBeLEKzNO/LFQUOHku32btthoHbpe/Il6OSA2+lNhnyzYucBsCNRV8lNk4Q2eRn4tUwcTq2zRj+9yYYcAJcPTeolZg0aRIbNmzgj3/8I83NhbvA9YBSmvf1dmVQyYG3lM9RD9d1nDLiMYu+SoZB99IblnDZZV2vJ06cyJ133sngwQnZP1lJKpbLMvgNHhMfX9GAIGZQrPiu0g34KlXb4X2gnDrHjUVfJWnQHTyG73znO3zxi1/sFO2zzz7885//ZMiQIb2/b2/bY8R2wKWrUDDL9XzgdZyy426ovkqMO+aqec3omOWdhmK33Xbj5ZdfZtGiReUxFJA9DTSpFqLQJaOm7vfIklbqRW6OU1bcWDQSpQyAkUH3ujtBU+CcG4IGfm9+85tZtWoVS5cuZYcddiivjlliDsVqIXLB7oN/ntx6pFhRnhe5OU5ZcWPRKPRgAJz1T9Axyzk9bPw6ZMgQVq5cyQsvvMCIESOK368nM/Ms2UhZVh+5poOl3ifr9aP4KsRxiuLGolEoYQC89dZbkcQJJ5wAQFNTE//617949dVX2XnnnYvfqzcz8yzZSFlWHw+em9wvqlh2UykZVb4KcZxMuLFoFDIMgH/84x+RxMc+9rFO2dJrdmHLMzMZPXp03Lvj6U37ibQ00NwMPqnNSHS1kLa/RbG00lJqLbzVhuNkwrOhGoWU7Ka//OUvHHbYYXnip74Le+4CsLL0zXp6W+sQl
40Ut6FSlFJqIYp9jriMqqTre6sNx8mErywahRj3zj+e2QodszzPUDx2xc7YrJyhCCl1plyJKui0ZoKxRWhJP80MP9lSity81YbjZMKNRaMQGQDvXxpkN73nKxs7Dy9cuBAz4+0j/x3//lJmysXiDj0JCCfeXwktPDriTk6RF1C4p3jSasRbbThOJtwN1UA8+uo+7HdMvivq3nvv5d3vfneXIMVdlZm0KuisRXeFVdzNw4JNjrLqldQpN6lVeE+pdMW34/QRfA/uBuDJJ59kr732ypPdeeedTJo0qfvJcbGBpsHl6zWUtCc1BAN5bkZeqMOAQUG3QtucTa/efI5q7M3tOH2QtD24K+aGkjRD0kuSHo/IjpO0WFKHpAkF518kqVXS05IOj8iPCGWtki6slL71SGtrK5LyDMWf/vQnzCzeUEDvm9IVczGlubNyq4yFMduzdmyC5u2z69XTz+GpsI5TESq2spB0KLAOuNHM9gllbyNwOv8EuMDMFobyvYCbgYnALsBfgD3CSy0BDgNWAA8AnzSzJ9Lu3egri+XLlzNu3Lg82W233cbkyZMre+Mss/m0lUVRFMQQ0u7f2xVBkn65bruO4yRSk5WFmd0NrCmQPWlmT8ecPhm4xcw2mlkb0EpgOCYCrWb2rJltAm4Jz+2TrFy5ksGDB+cZiptvvhkzq7yhgGw1B2l7UhcjLW5SrhWBp8I6TkWol2yoUcBzkdcrQlmSvBuSzpC0UNLCVatWVUzRSvDSSy8xYsQIRo0axYYNGwCYMWMGZsbxxx/f/Q2Vak+RZaAttic1BLvalZphVK7iOE+FdZyKUC/GoteY2XQzm2BmE0aOHFlrdTLx8ssvM3bsWN70pjexenWQKfTjH/8YM+OUU06Jf1MlffJZB9rORn+/iDcKB1xZeryhXCsCT4V1nIpQL6mzzwPRfhS7hjJS5A3Lq6++yoEHHshTTz3VKfve977H+eefX/zNaTPw3mb8lFL5DMXTTkvRpxwpv1l0chynR9SLsZgD3CTpBwQB7vHA/QSbRo+X1EJgJI4HPlUzLXvJ66+/zqGHHspDDz3UKbvsssu45JJLsl+k1CZ5hYMmFB/cSxloy7XRUKmGKg3f/Mhxyk7FjIWkm4FJwAhJK4BLCQLeVwEjgT9KesTMDjezxZJmA08AW4CzzIKNDCSdDdwBNAEzzGxxpXQuSg+zdd544w0OO+ww/vGPf3TKLrroIqZNm4ak0nTIOgOPK55bcGp+rUNcQV0lBtos35uvCBynrvGivKz0oEhs06ZNfPjDH2bu3LmdsnPPPZfLL7883khkGVSz6lFKimsl00orXSToOE7ZqEnqbJ+jhGydLVu2cOyxx7LVVlt1GorTTjuN9vZ2rrjiimRDkSVwnbVYrZTAcCXTSr0FuOP0CeolZlH/ZIgVtLe3M3XqVGbN6hrgP/WpT3HjjTfS1BSzl3SUUgLXWVxFSe6qpHMrhdc9OE6fwFcWWUlJK+3o6OCMM85g4MCBnYbiox/9KJs3b2bWrFnFDQWkDKo9rJaOSyEdMAjUnC/rSRC5lDoPr3twnD6BG4usxAy+NmAbzvvdW2lqauKnP/0pAIcddhhvvPEGv/3tbxk4sISFW+LgqZ7VUMS5qw6cAQf9rOd9o6D0Og+ve3CcPoEHuEshDEDb68u5+Lfb881bX+08dMghhzBv3jy22Wabnl/7vhOJ3XK0nvoa9aT3kneBdZyGIC3A7TGLUmiZwtdntYV1EYGheOc738ndd9/Ndttt1+trc98J8cfqyb/fkxiE1z04TsPjbqiM/OAHP0BSZwHdnnvuydq1a3nooYd6byhyJPVbqif/vscgHKdf4saiCNdccw2SOltxjB49mtWrV/PUU08xdOjQ8t6sEfz7jaCj4zhlx91QCdxwww15zfyGDRvGk08+yU477VS5mzZCFXMj6Og4TtnxAHcBv/71rznuuOM6X2+99da0trYyalRsZ3THcZw+gwe4M9Le3p5nKJYtW8bYsSn7NjiO4/QT3FhEaGpqYsGCBey4446MHz++1uo4juPUDW4sCpg4cWL5Lub1BY7j9BE8G6pSVHJHu1J0qMT2q47j9DvcWFSKWndbrQdj5ThOn8GNRaWodbfVWhsrx3H6FG4sKkVSRXPzsOrcv9bGynGcPoUbi0qx77Tu7cAB2l+rjivI23I4jlNG3FhUipYp0Lx9d3nHpuq4grwth+M4ZcSNRSXZtCZeXg1XUNbtVx3HcTJQMWMhaYaklyQ9HpENkzRP0jPhvzuGckn6oaRWSY9J2j/ynqnh+c9ImlopfStCrV1BLVOCPSY+1RH864bCcZweUsmVxQ3AEQWyC4G/mtl44K/ha4AjgfHh4wzgGgiMC3ApcCAwEbg0Z2AaAncFOY7TR6iYsTCzu4FCP8xkYGb4fCbw0Yj8RguYD+wgaWfgcGCema0xs5eBeXQ3QPWLu4Icx+kjVLvdx5vM7IXw+b+BN4XPRwHPRc5bEcqS5N2QdAbBqoQxY+oo48d3iXMcpw9QswC3Bb3Ry9Yf3cymm9kEM5swcuTIcl3WcRzHofrG4sXQvUT470uh/HlgdOS8XUNZktxxHMepItU2FnOAXEbTVOB3EflJYVbUQcArobvqDuCDknYMA9sfDGWO4zhOFalYzELSzcAkYISkFQRZTd8CZkv6NLAc+Hh4+u3AUUArsB44BcDM1kj6GvBAeN5lZpZQvOA4juNUCt9W1XEcxwHSt1X1Cm7HcRynKG4sHMdxnKK4sXAcx3GK4sbCcRzHKYobiyi+Z7XjOE4s1W73Ub/k9qzObUWa27MavF2H4zj9Hl9Z5PA9qx3HcRJxY5HD96x2HMdJxI1FjlpvVOQ4jlPHuLHI4RsVOY7jJOLGIodvVOQ4jpOIZ0NF8Y2KHMdxYvGVheM4jlMUNxaO4zhOUdxYOI7jOEVxY+E4juMUxY2F4ziOU5Q+uVOepFUE27aWixHAf8p4vXJRj3rVo05Qn3rVo05Qn3rVo05Qn3r1RqexZjYy7kCfNBblRtLCpK0Ga0k96lWPOkF96lWPOkF96lWPOkF96lUpndwN5TiO4xTFjYXjOI5TFDcW2ZheawUSqEe96lEnqE+96lEnqE+96lEnqE+9KqKTxywcx3GcovjKwnEcxymKGwvHcRynKP3SWEgaLelOSU9IWizp3FD+S0mPhI9lkh4J5eMkbYgcuzZyrQMkLZLUKumHktQLvbaWdL+kR0O9vhrKWyQtCO/xS0mDQvlW4evW8Pi4yLUuCuVPSzq8AjrNCq/9uKQZkppD+SRJr0S+qy9HrnVE+J5WSRf2VKciet0gqS1y//1CucK/T6ukxyTtH7nWVEnPhI+pFdDpnog+KyXdFsqr8l1Frtkk6WFJfwhf1+x3laJTTX9XKXrV7HeVolN1f1dm1u8ewM7A/uHzIcASYK+Cc74PfDl8Pg54POFa9wMHAQL+BBzZC70EbBc+bwYWhNeeDRwfyq8FPhs+/xxwbfj8eOCX4fO9gEeBrYAWYCnQVGadjgqPCbg5otMk4A8x12kK9dgNGBTqt1dPdCqi1w3A
x2LOPyr8+yg8b0EoHwY8G/67Y/h8x3LqVHDOrcBJ1fyuItf9PHBT7p61/F2l6FTT31WKXjX7XSXpVO3fVb9cWZjZC2b2UPj8NeBJYFTuuCQBHyf4sSYiaWdgezObb8Ff40bgo73Qy8xsXfiyOXwY8N/Ar0P5zMg9JoevCY+/P9R9MnCLmW00szagFZhYTp3M7PbwmBEYzF2LXGoi0Gpmz5rZJuCWUM8ekfJdJTEZuDF833xgh/Dvdzgwz8zWmNnLwDzgiEroJGl7gr/lbUUuVdbvKrz3rsCHgOvC16KGv6s4nQBq/btK0iuFiv+uiulUrd9VvzQWUcIl9jsJZoE53gO8aGbPRGQt4RLwLknvCWWjgBWRc1YQMTo91KdJgfvrJYIf2FJgrZltibnHKOA5gPD4K8DwqLwcehXqZGYLIseagROBP0fecnDoivmTpL0LdS2HTkX0mha6BC6XtFWR+1ftuyIYjP9qZq9GZFX5roArgC8AHeHr4dT4dxWjUye1/F2l6FWz31WKTlCl31W/NhaStiNYvp1X8EV/kvxVxQvAGDN7J+FSMLTmZcfM2s1sP4IZ1UTgrZW4TykU6iRpn8jhq4G7zeye8PVDBP1l9gWuovhsp9x6XUTwnb2LwAXwxUrdvwSdchT+rqryXUk6GnjJzB6sxPV7QgadavK7StGrZr+rDN9VVX5X/dZYhDOXW4FZZvabiHwgcCzwy5wsXHavDp8/SDDb3wN4nvxl8q6hrNeY2VrgTuBggqVtbgvc6D2eB0ZH9B4KrI7Ky6lXRKcjwnteCowkMKC5c17NuWLM7HagWdKISulUqFfoYjQz2wj8jC43SdL9q/VdjQh1+WPknGp9V4cAH5G0jMD18N/AldT2d9VNJ0m/CO9Zy99VrF41/l2lfVfV+11ZLwNBjfggCEbdCFwRc+wI4K4C2UjCQB5BcOh5YFj4ujDAfVQv9BoJ7BA+3wa4Bzga+BX5gcjPhc/PIj8QOTt8vjf5gchn6XmAO0mn04B/AtsUnP9muoo9JwL/Cr+bgaEeLXQF1/auwHe1c+RvfAXwrfD1h8gPRN4fyocBbQRByB3D58PKqVP4+kxgZi2+q4J7TqIraFuz31WKTjX9XaXoVbPfVZJO1f5d9foLbcQH8F8EgcfHgEfCx1HhsRuAMwvO/x9gcXjeQ8CHI8cmAI8TrDZ+lPsj9VCvdwAPh3o9Tlc21m4ERqk1/A++VSjfOnzdGh7fLXKti0OdnqZ3GVpJOm0Jr5/7/nLys8Pv6lFgPvDuyLWOIsg8Wwpc3Mu/YZJefwMWhbJf0JWdJODH4b0XARMi1zo1/A5bgVPKrVN47O8EK5/o+VX5rgruOYmuAbBmv6sUnWr6u0rRq2a/qySdqv278nYfjuM4TlH6bczCcRzHyY4bC8dxHKcobiwcx3GcorixcBzHcYrixsJxHMcpihsLxylA0rqC1ydL+lGZrv13SRNi5EeH7WQeVdAN+TOh/ExJJ5Xj3o7TGwYWP8VxnEoSdhOYDkw0sxVh36FxAGZ2bdp7Hada+MrCcUpA0khJt0p6IHwcEsonSrovXB38U9KeoXwbSbdIelLSbwkquwsZQjBxy7WUqnNeFgAAAbtJREFU2WhmT4fv/4qkCyTtEtmf4BFJ7ZLGJunjOOXGVxaO051tws6xOYYBc8LnVwKXm9k/JI0B7gDeBjwFvMfMtkj6APANgsr/zwLrzextkt5B0AEgDzNbI2kOsFzSX4E/ADebWUfknJVAbsOds4D3mtlySTcl6OM4ZcWNheN0Z4MFnWOBIGZB0NYF4APAXuraEHH7sHvxUGCmpPEErWSaw+OHAj8EMLPHJD0Wd0MzO03S28PrXwAcBpxceF64cjidoGVNoj7Wta+G45QFNxaOUxoDCHa/eyMqDAPgd5rZMeEeKX8v9cJmtghYJOnnBI3nTi64x87A9cBHIsYgVh/HKTces3Cc0pgL/L/cC4V7MROsLHLtnk+OnH838Knw3H0Img3mIWk7SZMiov2A5QXnNBM09/uimS3JoI/jlBU3Fo5TGucAE8Id054gaBEN8B3gm5IeJn/Ffg2wnaQngcuAuA1sBHxB0tNhrOSrdHdBvZvAFfbVSJB7lxR9HKeseNdZx3Ecpyi+snAcx3GK4sbCcRzHKYobC8dxHKcobiwcx3GcorixcBzHcYrixsJxHMcpihsLx3Ecpyj/P4jiirPrf/HlAAAAAElFTkSuQmCC\n", 325 | "text/plain": [ 326 | "
" 327 | ] 328 | }, 329 | "metadata": { 330 | "tags": [], 331 | "needs_background": "light" 332 | } 333 | } 334 | ] 335 | }, 336 | { 337 | "cell_type": "markdown", 338 | "metadata": { 339 | "id": "HUgdZsIp6vKx", 340 | "colab_type": "text" 341 | }, 342 | "source": [ 343 | "## **Prediction**\n", 344 | "\n", 345 | "We will predict the Brain weight by giving an input of Brain Size" 346 | ] 347 | }, 348 | { 349 | "cell_type": "code", 350 | "metadata": { 351 | "id": "TfgYfaHk67kX", 352 | "colab_type": "code", 353 | "colab": { 354 | "base_uri": "https://localhost:8080/", 355 | "height": 34 356 | }, 357 | "outputId": "c6adf2de-9904-4337-c5ea-91a8b0534c23" 358 | }, 359 | "source": [ 360 | "print('Prediction by the model:\\t', model.predict([[2000]]))" 361 | ], 362 | "execution_count": 14, 363 | "outputs": [ 364 | { 365 | "output_type": "stream", 366 | "text": [ 367 | "Prediction by the model:\t [[852.43210003]]\n" 368 | ], 369 | "name": "stdout" 370 | } 371 | ] 372 | } 373 | ] 374 | } -------------------------------------------------------------------------------- /01_Linear_Regression/headbrain.csv: -------------------------------------------------------------------------------- 1 | Gender,Age Range,Head Size(cm^3),Brain Weight(grams) 2 | 1,1,4512,1530 3 | 1,1,3738,1297 4 | 1,1,4261,1335 5 | 1,1,3777,1282 6 | 1,1,4177,1590 7 | 1,1,3585,1300 8 | 1,1,3785,1400 9 | 1,1,3559,1255 10 | 1,1,3613,1355 11 | 1,1,3982,1375 12 | 1,1,3443,1340 13 | 1,1,3993,1380 14 | 1,1,3640,1355 15 | 1,1,4208,1522 16 | 1,1,3832,1208 17 | 1,1,3876,1405 18 | 1,1,3497,1358 19 | 1,1,3466,1292 20 | 1,1,3095,1340 21 | 1,1,4424,1400 22 | 1,1,3878,1357 23 | 1,1,4046,1287 24 | 1,1,3804,1275 25 | 1,1,3710,1270 26 | 1,1,4747,1635 27 | 1,1,4423,1505 28 | 1,1,4036,1490 29 | 1,1,4022,1485 30 | 1,1,3454,1310 31 | 1,1,4175,1420 32 | 1,1,3787,1318 33 | 1,1,3796,1432 34 | 1,1,4103,1364 35 | 1,1,4161,1405 36 | 1,1,4158,1432 37 | 1,1,3814,1207 38 | 1,1,3527,1375 39 | 1,1,3748,1350 40 | 1,1,3334,1236 41 | 1,1,3492,1250 42 | 1,1,3962,1350 43 | 1,1,3505,1320 44 | 1,1,4315,1525 45 | 1,1,3804,1570 46 | 1,1,3863,1340 47 | 1,1,4034,1422 48 | 1,1,4308,1506 49 | 1,1,3165,1215 50 | 1,1,3641,1311 51 | 1,1,3644,1300 52 | 1,1,3891,1224 53 | 1,1,3793,1350 54 | 1,1,4270,1335 55 | 1,1,4063,1390 56 | 1,1,4012,1400 57 | 1,1,3458,1225 58 | 1,1,3890,1310 59 | 1,2,4166,1560 60 | 1,2,3935,1330 61 | 1,2,3669,1222 62 | 1,2,3866,1415 63 | 1,2,3393,1175 64 | 1,2,4442,1330 65 | 1,2,4253,1485 66 | 1,2,3727,1470 67 | 1,2,3329,1135 68 | 1,2,3415,1310 69 | 1,2,3372,1154 70 | 1,2,4430,1510 71 | 1,2,4381,1415 72 | 1,2,4008,1468 73 | 1,2,3858,1390 74 | 1,2,4121,1380 75 | 1,2,4057,1432 76 | 1,2,3824,1240 77 | 1,2,3394,1195 78 | 1,2,3558,1225 79 | 1,2,3362,1188 80 | 1,2,3930,1252 81 | 1,2,3835,1315 82 | 1,2,3830,1245 83 | 1,2,3856,1430 84 | 1,2,3249,1279 85 | 1,2,3577,1245 86 | 1,2,3933,1309 87 | 1,2,3850,1412 88 | 1,2,3309,1120 89 | 1,2,3406,1220 90 | 1,2,3506,1280 91 | 1,2,3907,1440 92 | 1,2,4160,1370 93 | 1,2,3318,1192 94 | 1,2,3662,1230 95 | 1,2,3899,1346 96 | 1,2,3700,1290 97 | 1,2,3779,1165 98 | 1,2,3473,1240 99 | 1,2,3490,1132 100 | 1,2,3654,1242 101 | 1,2,3478,1270 102 | 1,2,3495,1218 103 | 1,2,3834,1430 104 | 1,2,3876,1588 105 | 1,2,3661,1320 106 | 1,2,3618,1290 107 | 1,2,3648,1260 108 | 1,2,4032,1425 109 | 1,2,3399,1226 110 | 1,2,3916,1360 111 | 1,2,4430,1620 112 | 1,2,3695,1310 113 | 1,2,3524,1250 114 | 1,2,3571,1295 115 | 1,2,3594,1290 116 | 1,2,3383,1290 117 | 1,2,3499,1275 118 | 1,2,3589,1250 119 | 1,2,3900,1270 120 | 1,2,4114,1362 121 | 1,2,3937,1300 122 | 
1,2,3399,1173 123 | 1,2,4200,1256 124 | 1,2,4488,1440 125 | 1,2,3614,1180 126 | 1,2,4051,1306 127 | 1,2,3782,1350 128 | 1,2,3391,1125 129 | 1,2,3124,1165 130 | 1,2,4053,1312 131 | 1,2,3582,1300 132 | 1,2,3666,1270 133 | 1,2,3532,1335 134 | 1,2,4046,1450 135 | 1,2,3667,1310 136 | 2,1,2857,1027 137 | 2,1,3436,1235 138 | 2,1,3791,1260 139 | 2,1,3302,1165 140 | 2,1,3104,1080 141 | 2,1,3171,1127 142 | 2,1,3572,1270 143 | 2,1,3530,1252 144 | 2,1,3175,1200 145 | 2,1,3438,1290 146 | 2,1,3903,1334 147 | 2,1,3899,1380 148 | 2,1,3401,1140 149 | 2,1,3267,1243 150 | 2,1,3451,1340 151 | 2,1,3090,1168 152 | 2,1,3413,1322 153 | 2,1,3323,1249 154 | 2,1,3680,1321 155 | 2,1,3439,1192 156 | 2,1,3853,1373 157 | 2,1,3156,1170 158 | 2,1,3279,1265 159 | 2,1,3707,1235 160 | 2,1,4006,1302 161 | 2,1,3269,1241 162 | 2,1,3071,1078 163 | 2,1,3779,1520 164 | 2,1,3548,1460 165 | 2,1,3292,1075 166 | 2,1,3497,1280 167 | 2,1,3082,1180 168 | 2,1,3248,1250 169 | 2,1,3358,1190 170 | 2,1,3803,1374 171 | 2,1,3566,1306 172 | 2,1,3145,1202 173 | 2,1,3503,1240 174 | 2,1,3571,1316 175 | 2,1,3724,1280 176 | 2,1,3615,1350 177 | 2,1,3203,1180 178 | 2,1,3609,1210 179 | 2,1,3561,1127 180 | 2,1,3979,1324 181 | 2,1,3533,1210 182 | 2,1,3689,1290 183 | 2,1,3158,1100 184 | 2,1,4005,1280 185 | 2,1,3181,1175 186 | 2,1,3479,1160 187 | 2,1,3642,1205 188 | 2,1,3632,1163 189 | 2,2,3069,1022 190 | 2,2,3394,1243 191 | 2,2,3703,1350 192 | 2,2,3165,1237 193 | 2,2,3354,1204 194 | 2,2,3000,1090 195 | 2,2,3687,1355 196 | 2,2,3556,1250 197 | 2,2,2773,1076 198 | 2,2,3058,1120 199 | 2,2,3344,1220 200 | 2,2,3493,1240 201 | 2,2,3297,1220 202 | 2,2,3360,1095 203 | 2,2,3228,1235 204 | 2,2,3277,1105 205 | 2,2,3851,1405 206 | 2,2,3067,1150 207 | 2,2,3692,1305 208 | 2,2,3402,1220 209 | 2,2,3995,1296 210 | 2,2,3318,1175 211 | 2,2,2720,955 212 | 2,2,2937,1070 213 | 2,2,3580,1320 214 | 2,2,2939,1060 215 | 2,2,2989,1130 216 | 2,2,3586,1250 217 | 2,2,3156,1225 218 | 2,2,3246,1180 219 | 2,2,3170,1178 220 | 2,2,3268,1142 221 | 2,2,3389,1130 222 | 2,2,3381,1185 223 | 2,2,2864,1012 224 | 2,2,3740,1280 225 | 2,2,3479,1103 226 | 2,2,3647,1408 227 | 2,2,3716,1300 228 | 2,2,3284,1246 229 | 2,2,4204,1380 230 | 2,2,3735,1350 231 | 2,2,3218,1060 232 | 2,2,3685,1350 233 | 2,2,3704,1220 234 | 2,2,3214,1110 235 | 2,2,3394,1215 236 | 2,2,3233,1104 237 | 2,2,3352,1170 238 | 2,2,3391,1120 239 | -------------------------------------------------------------------------------- /02_Logistic_Regression/Logistic_Regression.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "Logistic Regression.ipynb", 7 | "provenance": [], 8 | "collapsed_sections": [] 9 | }, 10 | "kernelspec": { 11 | "name": "python3", 12 | "display_name": "Python 3" 13 | } 14 | }, 15 | "cells": [ 16 | { 17 | "cell_type": "markdown", 18 | "metadata": { 19 | "id": "EeWekSPEB-Y-", 20 | "colab_type": "text" 21 | }, 22 | "source": [ 23 | "## **Logistic Regression**\n", 24 | "\n", 25 | "Logistic regression is a statistical model that in its basic form uses a logistic function to model a binary dependent variable, although many more complex extensions exist. 
In regression analysis, logistic regression (or logit regression) is estimating the parameters of a logistic model (a form of binary regression).\n", 26 | "\n", 27 | "For Example: spam detection for emails, predicting if a customer will default in a loan, etc.\n", 28 | "\n", 29 | "By [Muhammad Huzaifa Shahbaz](https://www.linkedin.com/in/mhuzaifadev)" 30 | ] 31 | }, 32 | { 33 | "cell_type": "markdown", 34 | "metadata": { 35 | "id": "fRehRCOpBXAB", 36 | "colab_type": "text" 37 | }, 38 | "source": [ 39 | "## **Importing libraries**\n", 40 | "\n", 41 | "We will import Pandas as *pd*, train_test_split from sklearn.model_selection, confusion_matrix & accuracy_score from sklearn.metrics and Logistic Regression from sklearn.linear_model. " 42 | ] 43 | }, 44 | { 45 | "cell_type": "code", 46 | "metadata": { 47 | "id": "j1UhjdP0B6m7", 48 | "colab_type": "code", 49 | "colab": {} 50 | }, 51 | "source": [ 52 | "import pandas as pd\n", 53 | "from sklearn.linear_model import LogisticRegression\n", 54 | "from sklearn.model_selection import train_test_split\n", 55 | "from sklearn.metrics import confusion_matrix\n", 56 | "from sklearn.metrics import accuracy_score" 57 | ], 58 | "execution_count": 0, 59 | "outputs": [] 60 | }, 61 | { 62 | "cell_type": "markdown", 63 | "metadata": { 64 | "id": "PsAOn1BiC-_o", 65 | "colab_type": "text" 66 | }, 67 | "source": [ 68 | "## **DataFraming**\n", 69 | "\n", 70 | "Read .csv data into a Dataframe " 71 | ] 72 | }, 73 | { 74 | "cell_type": "code", 75 | "metadata": { 76 | "id": "A9gmwPuuDTq7", 77 | "colab_type": "code", 78 | "colab": { 79 | "base_uri": "https://localhost:8080/", 80 | "height": 527 81 | }, 82 | "outputId": "3c95a830-36ec-4651-dc88-43de30614140" 83 | }, 84 | "source": [ 85 | "creditData = pd.read_csv(\"./titanic.xls\")\n", 86 | "print(creditData.head())\n", 87 | "print(creditData.describe())\n", 88 | "print(creditData.corr())" 89 | ], 90 | "execution_count": 13, 91 | "outputs": [ 92 | { 93 | "output_type": "stream", 94 | "text": [ 95 | " Survived Pclass ... Parents/Children Aboard Fare\n", 96 | "0 0 3 ... 0 7.2500\n", 97 | "1 1 1 ... 0 71.2833\n", 98 | "2 1 3 ... 0 7.9250\n", 99 | "3 1 1 ... 0 53.1000\n", 100 | "4 0 3 ... 0 8.0500\n", 101 | "\n", 102 | "[5 rows x 7 columns]\n", 103 | " Survived Pclass ... Parents/Children Aboard Fare\n", 104 | "count 887.000000 887.000000 ... 887.000000 887.00000\n", 105 | "mean 0.385569 2.305524 ... 0.383315 32.30542\n", 106 | "std 0.487004 0.836662 ... 0.807466 49.78204\n", 107 | "min 0.000000 1.000000 ... 0.000000 0.00000\n", 108 | "25% 0.000000 2.000000 ... 0.000000 7.92500\n", 109 | "50% 0.000000 3.000000 ... 0.000000 14.45420\n", 110 | "75% 1.000000 3.000000 ... 0.000000 31.13750\n", 111 | "max 1.000000 3.000000 ... 6.000000 512.32920\n", 112 | "\n", 113 | "[8 rows x 7 columns]\n", 114 | " Survived Pclass ... Parents/Children Aboard Fare\n", 115 | "Survived 1.000000 -0.336528 ... 0.080097 0.256179\n", 116 | "Pclass -0.336528 1.000000 ... 0.020252 -0.548919\n", 117 | "Sex -0.542152 0.129507 ... -0.244337 -0.181137\n", 118 | "Age -0.059665 -0.391492 ... -0.193741 0.112329\n", 119 | "Siblings/Spouses Aboard -0.037082 0.085026 ... 0.414244 0.158839\n", 120 | "Parents/Children Aboard 0.080097 0.020252 ... 1.000000 0.215470\n", 121 | "Fare 0.256179 -0.548919 ... 
0.215470  1.000000\n", 122 | "\n", 123 | "[7 rows x 7 columns]\n" 124 | ], 125 | "name": "stdout" 126 | } 127 | ] 128 | }, 129 | { 130 | "cell_type": "markdown", 131 | "metadata": { 132 | "id": "N0c60VUJEfr-", 133 | "colab_type": "text" 134 | }, 135 | "source": [ 136 | "## **Feature Extraction**\n", 137 | "\n", 138 | "Extracting features and splitting the data into training and test sets." 139 | ] 140 | }, 141 | { 142 | "cell_type": "code", 143 | "metadata": { 144 | "id": "2KYihcjTE067", 145 | "colab_type": "code", 146 | "colab": {} 147 | }, 148 | "source": [ 149 | "features = creditData[[\"Sex\",\"Age\",\"Pclass\"]]\n", 150 | "target = creditData[[\"Survived\"]]\n", 151 | "\n", 152 | "feature_train, feature_test, target_train, target_test = train_test_split(features,target)" 153 | ], 154 | "execution_count": 0, 155 | "outputs": [] 156 | }, 157 | { 158 | "cell_type": "markdown", 159 | "metadata": { 160 | "id": "nHxk6jzYLx0m", 161 | "colab_type": "text" 162 | }, 163 | "source": [ 164 | "## **Training the Model**\n", 165 | "\n", 166 | "We use the LogisticRegression model imported from scikit-learn and train it on feature_train and target_train." 167 | ] 168 | }, 169 | { 170 | "cell_type": "code", 171 | "metadata": { 172 | "id": "zcVLyHh3L5Rv", 173 | "colab_type": "code", 174 | "colab": { 175 | "base_uri": "https://localhost:8080/", 176 | "height": 72 177 | }, 178 | "outputId": "3e52b837-ceab-4b49-e0dc-025588880736" 179 | }, 180 | "source": [ 181 | "model = LogisticRegression()\n", 182 | "model.fit(feature_train, target_train)\n", 183 | "predictions = model.predict(feature_test)" 184 | ], 185 | "execution_count": 21, 186 | "outputs": [ 187 | { 188 | "output_type": "stream", 189 | "text": [ 190 | "/usr/local/lib/python3.6/dist-packages/sklearn/utils/validation.py:760: DataConversionWarning: A column-vector y was passed when a 1d array was expected. 
Please change the shape of y to (n_samples, ), for example using ravel().\n", 191 | " y = column_or_1d(y, warn=True)\n" 192 | ], 193 | "name": "stderr" 194 | } 195 | ] 196 | }, 197 | { 198 | "cell_type": "markdown", 199 | "metadata": { 200 | "id": "tFBsaZpeMdIa", 201 | "colab_type": "text" 202 | }, 203 | "source": [ 204 | "## **Printing an Error Matrix and Accuracy Score**" 205 | ] 206 | }, 207 | { 208 | "cell_type": "code", 209 | "metadata": { 210 | "id": "urqzo3v7MzYp", 211 | "colab_type": "code", 212 | "colab": { 213 | "base_uri": "https://localhost:8080/", 214 | "height": 70 215 | }, 216 | "outputId": "4c38340a-d31e-4d14-ec18-2d22ab6b47a4" 217 | }, 218 | "source": [ 219 | "print(confusion_matrix(target_test,predictions))\n", 220 | "print(accuracy_score(target_test,predictions))" 221 | ], 222 | "execution_count": 22, 223 | "outputs": [ 224 | { 225 | "output_type": "stream", 226 | "text": [ 227 | "[[128 18]\n", 228 | " [ 18 58]]\n", 229 | "0.8378378378378378\n" 230 | ], 231 | "name": "stdout" 232 | } 233 | ] 234 | } 235 | ] 236 | } -------------------------------------------------------------------------------- /02_Logistic_Regression/titanic.xls: -------------------------------------------------------------------------------- 1 | Survived,Pclass,Sex,Age,Siblings/Spouses Aboard,Parents/Children Aboard,Fare 2 | 0,3,1,22,1,0,7.25 3 | 1,1,0,38,1,0,71.2833 4 | 1,3,0,26,0,0,7.925 5 | 1,1,0,35,1,0,53.1 6 | 0,3,1,35,0,0,8.05 7 | 0,3,1,27,0,0,8.4583 8 | 0,1,1,54,0,0,51.8625 9 | 0,3,1,2,3,1,21.075 10 | 1,3,0,27,0,2,11.1333 11 | 1,2,0,14,1,0,30.0708 12 | 1,3,0,4,1,1,16.7 13 | 1,1,0,58,0,0,26.55 14 | 0,3,1,20,0,0,8.05 15 | 0,3,1,39,1,5,31.275 16 | 0,3,0,14,0,0,7.8542 17 | 1,2,0,55,0,0,16 18 | 0,3,1,2,4,1,29.125 19 | 1,2,1,23,0,0,13 20 | 0,3,0,31,1,0,18 21 | 1,3,0,22,0,0,7.225 22 | 0,2,1,35,0,0,26 23 | 1,2,1,34,0,0,13 24 | 1,3,0,15,0,0,8.0292 25 | 1,1,1,28,0,0,35.5 26 | 0,3,0,8,3,1,21.075 27 | 1,3,0,38,1,5,31.3875 28 | 0,3,1,26,0,0,7.225 29 | 0,1,1,19,3,2,263 30 | 1,3,0,24,0,0,7.8792 31 | 0,3,1,23,0,0,7.8958 32 | 0,1,1,40,0,0,27.7208 33 | 1,1,0,48,1,0,146.5208 34 | 1,3,0,18,0,0,7.75 35 | 0,2,1,66,0,0,10.5 36 | 0,1,1,28,1,0,82.1708 37 | 0,1,1,42,1,0,52 38 | 1,3,1,18,0,0,7.2292 39 | 0,3,1,21,0,0,8.05 40 | 0,3,0,18,2,0,18 41 | 1,3,0,14,1,0,11.2417 42 | 0,3,0,40,1,0,9.475 43 | 0,2,0,27,1,0,21 44 | 1,2,0,3,1,2,41.5792 45 | 1,3,0,19,0,0,7.8792 46 | 0,3,1,30,0,0,8.05 47 | 0,3,1,20,1,0,15.5 48 | 1,3,0,27,0,0,7.75 49 | 0,3,1,16,2,0,21.6792 50 | 0,3,0,18,1,0,17.8 51 | 0,3,1,7,4,1,39.6875 52 | 0,3,1,21,0,0,7.8 53 | 1,1,0,49,1,0,76.7292 54 | 1,2,0,29,1,0,26 55 | 0,1,1,65,0,1,61.9792 56 | 1,1,1,46,0,0,35.5 57 | 1,2,0,21,0,0,10.5 58 | 0,3,1,28.5,0,0,7.2292 59 | 1,2,0,5,1,2,27.75 60 | 0,3,1,11,5,2,46.9 61 | 0,3,1,22,0,0,7.2292 62 | 1,1,0,38,0,0,80 63 | 0,1,1,45,1,0,83.475 64 | 0,3,1,4,3,2,27.9 65 | 0,1,1,64,0,0,27.7208 66 | 1,3,1,7,1,1,15.2458 67 | 1,2,0,29,0,0,10.5 68 | 0,3,1,19,0,0,8.1583 69 | 1,3,0,17,4,2,7.925 70 | 0,3,1,26,2,0,8.6625 71 | 0,2,1,32,0,0,10.5 72 | 0,3,0,16,5,2,46.9 73 | 0,2,1,21,0,0,73.5 74 | 0,3,1,26,1,0,14.4542 75 | 1,3,1,32,0,0,56.4958 76 | 0,3,1,25,0,0,7.65 77 | 0,3,1,23,0,0,7.8958 78 | 0,3,1,28,0,0,8.05 79 | 1,2,1,0.83,0,2,29 80 | 1,3,0,30,0,0,12.475 81 | 0,3,1,22,0,0,9 82 | 1,3,1,29,0,0,9.5 83 | 1,3,0,31,0,0,7.7875 84 | 0,1,1,28,0,0,47.1 85 | 1,2,0,17,0,0,10.5 86 | 1,3,0,33,3,0,15.85 87 | 0,3,1,16,1,3,34.375 88 | 0,3,1,20,0,0,8.05 89 | 1,1,0,23,3,2,263 90 | 0,3,1,24,0,0,8.05 91 | 0,3,1,29,0,0,8.05 92 | 0,3,1,20,0,0,7.8542 93 | 0,1,1,46,1,0,61.175 94 | 0,3,1,26,1,2,20.575 95 | 
0,3,1,59,0,0,7.25 96 | 0,3,1,22,0,0,8.05 97 | 0,1,1,71,0,0,34.6542 98 | 1,1,1,23,0,1,63.3583 99 | 1,2,0,34,0,1,23 100 | 0,2,1,34,1,0,26 101 | 0,3,0,28,0,0,7.8958 102 | 0,3,1,29,0,0,7.8958 103 | 0,1,1,21,0,1,77.2875 104 | 0,3,1,33,0,0,8.6542 105 | 0,3,1,37,2,0,7.925 106 | 0,3,1,28,0,0,7.8958 107 | 1,3,0,21,0,0,7.65 108 | 1,3,1,29,0,0,7.775 109 | 0,3,1,38,0,0,7.8958 110 | 1,3,0,28,1,0,24.15 111 | 0,1,1,47,0,0,52 112 | 0,3,0,14.5,1,0,14.4542 113 | 0,3,1,22,0,0,8.05 114 | 0,3,0,20,1,0,9.825 115 | 0,3,0,17,0,0,14.4583 116 | 0,3,1,21,0,0,7.925 117 | 0,3,1,70.5,0,0,7.75 118 | 0,2,1,29,1,0,21 119 | 0,1,1,24,0,1,247.5208 120 | 0,3,0,2,4,2,31.275 121 | 0,2,1,21,2,0,73.5 122 | 0,3,1,19,0,0,8.05 123 | 0,2,1,32.5,1,0,30.0708 124 | 1,2,0,32.5,0,0,13 125 | 0,1,1,54,0,1,77.2875 126 | 1,3,1,12,1,0,11.2417 127 | 0,3,1,19,0,0,7.75 128 | 1,3,1,24,0,0,7.1417 129 | 1,3,0,2,1,1,22.3583 130 | 0,3,1,45,0,0,6.975 131 | 0,3,1,33,0,0,7.8958 132 | 0,3,1,20,0,0,7.05 133 | 0,3,0,47,1,0,14.5 134 | 1,2,0,29,1,0,26 135 | 0,2,1,25,0,0,13 136 | 0,2,1,23,0,0,15.0458 137 | 1,1,0,19,0,2,26.2833 138 | 0,1,1,37,1,0,53.1 139 | 0,3,1,16,0,0,9.2167 140 | 0,1,1,24,0,0,79.2 141 | 0,3,0,40,0,2,15.2458 142 | 1,3,0,22,0,0,7.75 143 | 1,3,0,24,1,0,15.85 144 | 0,3,1,19,0,0,6.75 145 | 0,2,1,18,0,0,11.5 146 | 0,2,1,19,1,1,36.75 147 | 1,3,1,27,0,0,7.7958 148 | 0,3,0,9,2,2,34.375 149 | 0,2,1,36.5,0,2,26 150 | 0,2,1,42,0,0,13 151 | 0,2,1,51,0,0,12.525 152 | 1,1,0,22,1,0,66.6 153 | 0,3,1,55.5,0,0,8.05 154 | 0,3,1,40.5,0,2,14.5 155 | 0,3,1,27,0,0,7.3125 156 | 0,1,1,51,0,1,61.3792 157 | 1,3,0,16,0,0,7.7333 158 | 0,3,1,30,0,0,8.05 159 | 0,3,1,37,0,0,8.6625 160 | 0,3,1,5,8,2,69.55 161 | 0,3,1,44,0,1,16.1 162 | 1,2,0,40,0,0,15.75 163 | 0,3,1,26,0,0,7.775 164 | 0,3,1,17,0,0,8.6625 165 | 0,3,1,1,4,1,39.6875 166 | 1,3,1,9,0,2,20.525 167 | 1,1,0,48,0,1,55 168 | 0,3,0,45,1,4,27.9 169 | 0,1,1,60,0,0,25.925 170 | 0,3,1,28,0,0,56.4958 171 | 0,1,1,61,0,0,33.5 172 | 0,3,1,4,4,1,29.125 173 | 1,3,0,1,1,1,11.1333 174 | 0,3,1,21,0,0,7.925 175 | 0,1,1,56,0,0,30.6958 176 | 0,3,1,18,1,1,7.8542 177 | 0,3,1,5,3,1,25.4667 178 | 0,1,0,50,0,0,28.7125 179 | 0,2,1,30,0,0,13 180 | 0,3,1,36,0,0,0 181 | 0,3,0,8,8,2,69.55 182 | 0,2,1,39,0,0,15.05 183 | 0,3,1,9,4,2,31.3875 184 | 1,2,1,1,2,1,39 185 | 1,3,0,4,0,2,22.025 186 | 0,1,1,39,0,0,50 187 | 1,3,0,26,1,0,15.5 188 | 1,1,1,45,0,0,26.55 189 | 0,3,1,40,1,1,15.5 190 | 0,3,1,36,0,0,7.8958 191 | 1,2,0,32,0,0,13 192 | 0,2,1,19,0,0,13 193 | 1,3,0,19,1,0,7.8542 194 | 1,2,1,3,1,1,26 195 | 1,1,0,44,0,0,27.7208 196 | 1,1,0,58,0,0,146.5208 197 | 0,3,1,28,0,0,7.75 198 | 0,3,1,42,0,1,8.4042 199 | 1,3,0,21,0,0,7.75 200 | 0,2,0,24,0,0,13 201 | 0,3,1,28,0,0,9.5 202 | 0,3,1,17,8,2,69.55 203 | 0,3,1,34,0,0,6.4958 204 | 0,3,1,45.5,0,0,7.225 205 | 1,3,1,18,0,0,8.05 206 | 0,3,0,2,0,1,10.4625 207 | 0,3,1,32,1,0,15.85 208 | 1,3,1,26,0,0,18.7875 209 | 1,3,0,16,0,0,7.75 210 | 1,1,1,40,0,0,31 211 | 0,3,1,24,0,0,7.05 212 | 1,2,0,35,0,0,21 213 | 0,3,1,22,0,0,7.25 214 | 0,2,1,30,0,0,13 215 | 0,3,1,22,1,0,7.75 216 | 1,1,0,31,1,0,113.275 217 | 1,3,0,27,0,0,7.925 218 | 0,2,1,42,1,0,27 219 | 1,1,0,32,0,0,76.2917 220 | 0,2,1,30,0,0,10.5 221 | 1,3,1,16,0,0,8.05 222 | 0,2,1,27,0,0,13 223 | 0,3,1,51,0,0,8.05 224 | 0,3,1,22,0,0,7.8958 225 | 1,1,1,38,1,0,90 226 | 0,3,1,22,0,0,9.35 227 | 1,2,1,19,0,0,10.5 228 | 0,3,1,20.5,0,0,7.25 229 | 0,2,1,18,0,0,13 230 | 0,3,0,12,3,1,25.4667 231 | 1,1,0,35,1,0,83.475 232 | 0,3,1,29,0,0,7.775 233 | 0,2,1,59,0,0,13.5 234 | 1,3,0,5,4,2,31.3875 235 | 0,2,1,24,0,0,10.5 236 | 0,3,0,21,0,0,7.55 237 | 0,2,1,44,1,0,26 238 | 
1,2,0,8,0,2,26.25 239 | 0,2,1,19,0,0,10.5 240 | 0,2,1,33,0,0,12.275 241 | 0,3,0,19,1,0,14.4542 242 | 1,3,0,18,1,0,15.5 243 | 0,2,1,29,0,0,10.5 244 | 0,3,1,22,0,0,7.125 245 | 0,3,1,30,0,0,7.225 246 | 0,1,1,44,2,0,90 247 | 0,3,0,25,0,0,7.775 248 | 1,2,0,24,0,2,14.5 249 | 1,1,1,37,1,1,52.5542 250 | 0,2,1,54,1,0,26 251 | 0,3,1,18,0,0,7.25 252 | 0,3,0,29,1,1,10.4625 253 | 0,1,1,62,0,0,26.55 254 | 0,3,1,30,1,0,16.1 255 | 0,3,0,41,0,2,20.2125 256 | 1,3,0,29,0,2,15.2458 257 | 1,1,0,38,0,0,79.2 258 | 1,1,0,30,0,0,86.5 259 | 1,1,0,35,0,0,512.3292 260 | 1,2,0,50,0,1,26 261 | 1,3,1,3,4,2,31.3875 262 | 0,1,1,52,1,1,79.65 263 | 0,1,1,40,0,0,0 264 | 0,3,0,21,0,0,7.75 265 | 0,2,1,36,0,0,10.5 266 | 0,3,1,16,4,1,39.6875 267 | 1,3,1,25,1,0,7.775 268 | 1,1,0,58,0,1,153.4625 269 | 1,1,0,35,0,0,135.6333 270 | 0,1,1,28,0,0,31 271 | 1,3,1,25,0,0,0 272 | 1,2,0,41,0,1,19.5 273 | 0,1,1,37,0,1,29.7 274 | 1,3,0,33,0,0,7.75 275 | 1,1,0,63,1,0,77.9583 276 | 0,3,0,45,0,0,7.75 277 | 0,2,1,21,0,0,0 278 | 0,3,1,7,4,1,29.125 279 | 1,3,0,35,1,1,20.25 280 | 0,3,1,65,0,0,7.75 281 | 0,3,1,28,0,0,7.8542 282 | 0,3,1,16,0,0,9.5 283 | 1,3,1,19,0,0,8.05 284 | 0,1,1,57,0,0,26 285 | 0,3,1,33,0,0,8.6625 286 | 1,3,1,30,0,0,9.5 287 | 0,3,1,22,0,0,7.8958 288 | 1,2,1,42,0,0,13 289 | 1,3,0,22,0,0,7.75 290 | 1,1,0,26,0,0,78.85 291 | 1,1,0,19,1,0,91.0792 292 | 0,2,1,36,0,0,12.875 293 | 0,3,0,24,0,0,8.85 294 | 0,3,1,24,0,0,7.8958 295 | 0,1,1,30,0,0,27.7208 296 | 0,3,1,23.5,0,0,7.2292 297 | 0,1,0,2,1,2,151.55 298 | 1,1,1,47,0,0,30.5 299 | 1,1,0,50,0,1,247.5208 300 | 1,3,0,20,0,0,7.75 301 | 1,3,1,24,2,0,23.25 302 | 0,3,1,19,0,0,0 303 | 1,2,0,46,0,0,12.35 304 | 0,3,1,28,0,0,8.05 305 | 1,1,1,0.92,1,2,151.55 306 | 1,1,0,42,0,0,110.8833 307 | 1,1,0,17,1,0,108.9 308 | 0,2,1,30,1,0,24 309 | 1,1,0,30,0,0,56.9292 310 | 1,1,0,24,0,0,83.1583 311 | 1,1,0,18,2,2,262.375 312 | 0,2,0,26,1,1,26 313 | 0,3,1,28,0,0,7.8958 314 | 0,2,1,43,1,1,26.25 315 | 1,3,0,26,0,0,7.8542 316 | 1,2,0,24,1,0,26 317 | 0,2,1,54,0,0,14 318 | 1,1,0,31,0,2,164.8667 319 | 1,1,0,40,1,1,134.5 320 | 0,3,1,22,0,0,7.25 321 | 0,3,1,27,0,0,7.8958 322 | 1,2,0,30,0,0,12.35 323 | 1,2,0,22,1,1,29 324 | 0,3,1,20,8,2,69.55 325 | 1,1,0,36,0,0,135.6333 326 | 0,3,1,61,0,0,6.2375 327 | 1,2,0,36,0,0,13 328 | 1,3,0,31,1,1,20.525 329 | 1,1,0,16,0,1,57.9792 330 | 1,3,0,28,2,0,23.25 331 | 0,1,1,45.5,0,0,28.5 332 | 0,1,1,38,0,1,153.4625 333 | 0,3,1,16,2,0,18 334 | 1,1,0,42,1,0,133.65 335 | 0,3,1,30,0,0,7.8958 336 | 0,1,1,29,1,0,66.6 337 | 1,1,0,41,0,0,134.5 338 | 1,3,1,45,0,0,8.05 339 | 0,1,1,45,0,0,35.5 340 | 1,2,1,2,1,1,26 341 | 1,1,0,24,3,2,263 342 | 0,2,1,28,0,0,13 343 | 0,2,1,25,0,0,13 344 | 0,2,1,36,0,0,13 345 | 1,2,0,24,0,0,13 346 | 1,2,0,40,0,0,13 347 | 1,3,0,34,1,0,16.1 348 | 1,3,1,3,1,1,15.9 349 | 0,3,1,42,0,0,8.6625 350 | 0,3,1,23,0,0,9.225 351 | 0,1,1,43,0,0,35 352 | 0,3,1,15,1,1,7.2292 353 | 0,3,1,25,1,0,17.8 354 | 0,3,1,23,0,0,7.225 355 | 0,3,1,28,0,0,9.5 356 | 1,1,0,22,0,1,55 357 | 0,2,0,38,0,0,13 358 | 1,3,0,22,0,0,7.8792 359 | 1,3,0,23,0,0,7.8792 360 | 0,3,1,40,1,4,27.9 361 | 0,2,1,29,1,0,27.7208 362 | 0,3,0,45,0,1,14.4542 363 | 0,3,1,35,0,0,7.05 364 | 0,3,1,27,1,0,15.5 365 | 0,3,1,30,0,0,7.25 366 | 1,1,0,60,1,0,75.25 367 | 1,3,0,35,0,0,7.2292 368 | 1,3,0,22,0,0,7.75 369 | 1,1,0,24,0,0,69.3 370 | 1,1,1,25,1,0,55.4417 371 | 0,3,1,18,1,0,6.4958 372 | 0,3,1,19,0,0,8.05 373 | 0,1,1,22,0,0,135.6333 374 | 0,3,0,3,3,1,21.075 375 | 1,1,0,25,1,0,82.1708 376 | 1,3,0,22,0,0,7.25 377 | 0,1,1,27,0,2,211.5 378 | 0,3,1,20,0,0,4.0125 379 | 0,3,1,19,0,0,7.775 380 | 1,1,0,42,0,0,227.525 381 | 
1,3,0,1,0,2,15.7417 382 | 0,3,1,32,0,0,7.925 383 | 1,1,0,35,1,0,52 384 | 0,3,1,27,0,0,7.8958 385 | 0,2,1,18,0,0,73.5 386 | 0,3,1,1,5,2,46.9 387 | 1,2,0,36,0,0,13 388 | 0,3,1,19,0,0,7.7292 389 | 1,2,0,17,0,0,12 390 | 1,1,1,36,1,2,120 391 | 1,3,1,21,0,0,7.7958 392 | 0,3,1,28,2,0,7.925 393 | 1,1,0,23,1,0,113.275 394 | 1,3,0,24,0,2,16.7 395 | 0,3,1,22,0,0,7.7958 396 | 0,3,0,31,0,0,7.8542 397 | 0,2,1,46,0,0,26 398 | 0,2,1,23,0,0,10.5 399 | 1,2,0,28,0,0,12.65 400 | 1,3,1,39,0,0,7.925 401 | 0,3,1,26,0,0,8.05 402 | 0,3,0,21,1,0,9.825 403 | 0,3,1,28,1,0,15.85 404 | 0,3,0,20,0,0,8.6625 405 | 0,2,1,34,1,0,21 406 | 0,3,1,51,0,0,7.75 407 | 1,2,1,3,1,1,18.75 408 | 0,3,1,21,0,0,7.775 409 | 0,3,0,3,3,1,25.4667 410 | 0,3,1,42,0,0,7.8958 411 | 0,3,1,27,0,0,6.8583 412 | 1,1,0,33,1,0,90 413 | 0,2,1,22,0,0,0 414 | 1,3,1,44,0,0,7.925 415 | 0,3,0,32,0,0,8.05 416 | 1,2,0,34,1,1,32.5 417 | 1,2,0,18,0,2,13 418 | 0,2,1,30,0,0,13 419 | 0,3,0,10,0,2,24.15 420 | 0,3,1,21,0,0,7.7333 421 | 0,3,1,29,0,0,7.875 422 | 0,3,0,28,1,1,14.4 423 | 0,3,1,18,1,1,20.2125 424 | 0,3,1,54,0,0,7.25 425 | 1,2,0,28,1,0,26 426 | 1,2,0,19,0,0,26 427 | 0,3,1,28,0,0,7.75 428 | 1,3,1,32,0,0,8.05 429 | 1,1,1,28,0,0,26.55 430 | 1,3,0,33,1,0,16.1 431 | 1,2,0,42,1,0,26 432 | 0,3,1,17,0,0,7.125 433 | 0,1,1,50,1,0,55.9 434 | 1,1,0,14,1,2,120 435 | 0,3,0,21,2,2,34.375 436 | 1,2,0,24,2,3,18.75 437 | 0,1,1,64,1,4,263 438 | 0,2,1,31,0,0,10.5 439 | 1,2,0,45,1,1,26.25 440 | 0,3,1,20,0,0,9.5 441 | 0,3,1,25,1,0,7.775 442 | 1,2,0,28,0,0,13 443 | 1,3,1,29,0,0,8.1125 444 | 1,1,1,4,0,2,81.8583 445 | 1,2,0,13,0,1,19.5 446 | 1,1,1,34,0,0,26.55 447 | 1,3,0,5,2,1,19.2583 448 | 1,1,1,52,0,0,30.5 449 | 0,2,1,36,1,2,27.75 450 | 0,3,1,28,1,0,19.9667 451 | 0,1,1,30,0,0,27.75 452 | 1,1,1,49,1,0,89.1042 453 | 0,3,1,24,0,0,8.05 454 | 1,3,1,29,0,0,7.8958 455 | 0,1,1,65,0,0,26.55 456 | 1,1,0,41,1,0,51.8625 457 | 1,2,0,50,0,0,10.5 458 | 0,3,1,17,0,0,7.75 459 | 1,1,1,48,0,0,26.55 460 | 0,3,1,34,0,0,8.05 461 | 0,1,1,47,0,0,38.5 462 | 0,2,1,48,0,0,13 463 | 0,3,1,34,0,0,8.05 464 | 0,3,1,38,0,0,7.05 465 | 0,2,1,21,0,0,0 466 | 0,1,1,56,0,0,26.55 467 | 0,3,1,22,0,0,7.725 468 | 1,3,0,0.75,2,1,19.2583 469 | 0,3,1,39,0,0,7.25 470 | 0,3,1,38,0,0,8.6625 471 | 1,2,0,33,1,2,27.75 472 | 1,2,0,23,0,0,13.7917 473 | 0,3,0,22,0,0,9.8375 474 | 0,1,1,40,0,0,52 475 | 0,2,1,34,1,0,21 476 | 0,3,1,29,1,0,7.0458 477 | 0,3,1,22,0,0,7.5208 478 | 1,3,0,2,0,1,12.2875 479 | 0,3,1,9,5,2,46.9 480 | 0,2,1,37,0,0,0 481 | 0,3,1,50,0,0,8.05 482 | 1,3,0,63,0,0,9.5875 483 | 1,1,1,25,1,0,91.0792 484 | 0,3,0,8,3,1,25.4667 485 | 1,1,0,35,1,0,90 486 | 0,1,1,58,0,0,29.7 487 | 0,3,1,30,0,0,8.05 488 | 1,3,1,9,1,1,15.9 489 | 0,3,1,19,1,0,19.9667 490 | 0,3,1,21,0,0,7.25 491 | 0,1,1,55,0,0,30.5 492 | 0,1,1,71,0,0,49.5042 493 | 0,3,1,21,0,0,8.05 494 | 0,3,1,26,0,0,14.4583 495 | 1,1,0,54,1,0,78.2667 496 | 0,3,1,55,0,0,15.1 497 | 0,1,0,25,1,2,151.55 498 | 0,3,1,24,0,0,7.7958 499 | 0,3,1,17,0,0,8.6625 500 | 0,3,0,21,0,0,7.75 501 | 0,3,0,21,0,0,7.6292 502 | 0,3,0,37,0,0,9.5875 503 | 1,1,0,16,0,0,86.5 504 | 0,1,1,18,1,0,108.9 505 | 1,2,0,33,0,2,26 506 | 1,1,1,37,0,0,26.55 507 | 0,3,1,28,0,0,22.525 508 | 1,3,1,26,0,0,56.4958 509 | 1,3,1,29,0,0,7.75 510 | 0,3,1,66,0,0,8.05 511 | 1,1,1,36,0,0,26.2875 512 | 1,1,0,54,1,0,59.4 513 | 0,3,1,24,0,0,7.4958 514 | 0,1,1,47,0,0,34.0208 515 | 1,2,0,34,0,0,10.5 516 | 0,3,1,30,0,0,24.15 517 | 1,2,0,36,1,0,26 518 | 0,3,1,32,0,0,7.8958 519 | 1,1,0,30,0,0,93.5 520 | 0,3,1,22,0,0,7.8958 521 | 0,3,1,35,0,0,7.225 522 | 1,1,0,44,0,1,57.9792 523 | 0,3,1,18,0,0,7.2292 524 | 0,3,1,40.5,0,0,7.75 525 | 
1,2,0,50,0,0,10.5 526 | 0,1,1,49,0,0,221.7792 527 | 0,3,1,39,0,0,7.925 528 | 0,2,1,23,2,1,11.5 529 | 1,2,0,2,1,1,26 530 | 0,3,1,17,0,0,7.2292 531 | 0,3,1,17,1,1,7.2292 532 | 1,3,0,24,0,2,22.3583 533 | 0,3,0,30,0,0,8.6625 534 | 1,2,0,7,0,2,26.25 535 | 0,1,1,45,0,0,26.55 536 | 1,1,0,30,0,0,106.425 537 | 0,3,1,69,0,0,14.5 538 | 1,1,0,22,0,2,49.5 539 | 1,1,0,36,0,2,71 540 | 0,3,0,9,4,2,31.275 541 | 0,3,0,11,4,2,31.275 542 | 1,2,1,32,1,0,26 543 | 0,1,1,50,1,0,106.425 544 | 0,1,1,64,0,0,26 545 | 1,2,0,19,1,0,26 546 | 1,2,1,27,0,0,13.8625 547 | 0,3,1,33,1,1,20.525 548 | 1,2,1,8,1,1,36.75 549 | 1,1,1,17,0,2,110.8833 550 | 0,2,1,27,0,0,26 551 | 0,3,1,21,0,0,7.8292 552 | 1,3,1,22,0,0,7.225 553 | 1,3,0,22,0,0,7.775 554 | 0,1,1,62,0,0,26.55 555 | 1,1,0,48,1,0,39.6 556 | 0,1,1,45,0,0,227.525 557 | 1,1,0,39,1,1,79.65 558 | 1,3,0,36,1,0,17.4 559 | 0,3,1,30,0,0,7.75 560 | 0,3,1,40,0,0,7.8958 561 | 0,2,1,28,0,0,13.5 562 | 0,3,1,40,0,0,8.05 563 | 0,3,0,62,0,0,8.05 564 | 0,3,1,24,2,0,24.15 565 | 0,3,1,19,0,0,7.8958 566 | 0,3,0,29,0,4,21.075 567 | 0,3,1,28,0,0,7.2292 568 | 1,3,1,32,0,0,7.8542 569 | 1,2,1,62,0,0,10.5 570 | 1,1,0,53,2,0,51.4792 571 | 1,1,1,36,0,0,26.3875 572 | 1,3,0,22,0,0,7.75 573 | 0,3,1,16,0,0,8.05 574 | 0,3,1,19,0,0,14.5 575 | 1,2,0,34,0,0,13 576 | 1,1,0,39,1,0,55.9 577 | 0,3,0,18,1,0,14.4583 578 | 1,3,1,32,0,0,7.925 579 | 1,2,0,25,1,1,30 580 | 1,1,0,39,1,1,110.8833 581 | 0,2,1,54,0,0,26 582 | 0,1,1,36,0,0,40.125 583 | 0,3,1,16,0,0,8.7125 584 | 1,1,0,18,0,2,79.65 585 | 0,2,1,47,0,0,15 586 | 1,1,1,60,1,1,79.2 587 | 0,3,1,22,0,0,8.05 588 | 0,3,1,22,0,0,8.05 589 | 0,3,1,35,0,0,7.125 590 | 1,1,0,52,1,0,78.2667 591 | 0,3,1,47,0,0,7.25 592 | 0,3,0,40,0,2,7.75 593 | 0,2,1,37,1,0,26 594 | 0,3,1,36,1,1,24.15 595 | 1,2,0,31,0,0,33 596 | 0,3,1,49,0,0,0 597 | 0,3,1,18,0,0,7.225 598 | 1,1,1,49,1,0,56.9292 599 | 1,2,0,24,2,1,27 600 | 0,3,1,42,0,0,7.8958 601 | 0,1,1,37,0,0,42.4 602 | 0,3,1,44,0,0,8.05 603 | 1,1,1,35,0,0,26.55 604 | 0,3,1,36,1,0,15.55 605 | 0,3,1,30,0,0,7.8958 606 | 1,1,1,27,0,0,30.5 607 | 1,2,0,22,1,2,41.5792 608 | 1,1,0,40,0,0,153.4625 609 | 0,3,0,39,1,5,31.275 610 | 0,3,1,21,0,0,7.05 611 | 1,3,0,18,1,0,15.5 612 | 0,3,1,22,0,0,7.75 613 | 0,3,1,35,0,0,8.05 614 | 1,2,0,24,1,2,65 615 | 0,3,1,34,1,1,14.4 616 | 0,3,0,26,1,0,16.1 617 | 1,2,0,4,2,1,39 618 | 0,2,1,26,0,0,10.5 619 | 0,3,1,27,1,0,14.4542 620 | 1,1,1,42,1,0,52.5542 621 | 1,3,1,20,1,1,15.7417 622 | 0,3,1,21,0,0,7.8542 623 | 0,3,1,21,0,0,16.1 624 | 0,1,1,61,0,0,32.3208 625 | 0,2,1,57,0,0,12.35 626 | 1,1,0,21,0,0,77.9583 627 | 0,3,1,26,0,0,7.8958 628 | 0,3,1,18,0,0,7.7333 629 | 1,1,1,80,0,0,30 630 | 0,3,1,51,0,0,7.0542 631 | 1,1,1,32,0,0,30.5 632 | 0,1,1,30,0,0,0 633 | 0,3,0,9,3,2,27.9 634 | 1,2,0,28,0,0,13 635 | 0,3,1,32,0,0,7.925 636 | 0,2,1,31,1,1,26.25 637 | 0,3,0,41,0,5,39.6875 638 | 0,3,1,37,1,0,16.1 639 | 0,3,1,20,0,0,7.8542 640 | 1,1,0,24,0,0,69.3 641 | 0,3,0,2,3,2,27.9 642 | 1,3,1,32,0,0,56.4958 643 | 1,3,0,0.75,2,1,19.2583 644 | 1,1,1,48,1,0,76.7292 645 | 0,3,1,19,0,0,7.8958 646 | 1,1,1,56,0,0,35.5 647 | 0,3,1,21,0,0,7.55 648 | 1,3,0,23,0,0,7.55 649 | 0,3,1,23,0,0,7.8958 650 | 1,2,0,18,0,1,23 651 | 0,3,1,21,0,0,8.4333 652 | 1,3,0,16,0,0,7.8292 653 | 0,3,0,18,0,0,6.75 654 | 0,2,1,24,2,0,73.5 655 | 0,3,1,27,0,0,7.8958 656 | 0,3,0,32,1,1,15.5 657 | 0,2,1,23,0,0,13 658 | 0,1,1,58,0,2,113.275 659 | 1,1,1,50,2,0,133.65 660 | 0,3,1,40,0,0,7.225 661 | 0,1,1,47,0,0,25.5875 662 | 0,3,1,36,0,0,7.4958 663 | 1,3,1,20,1,0,7.925 664 | 0,2,1,32,2,0,73.5 665 | 0,2,1,25,0,0,13 666 | 0,3,1,49,0,0,7.775 667 | 0,3,1,43,0,0,8.05 668 | 
1,1,0,48,1,0,52 669 | 1,2,0,40,1,1,39 670 | 0,1,1,31,1,0,52 671 | 0,2,1,70,0,0,10.5 672 | 1,2,1,31,0,0,13 673 | 0,2,1,19,0,0,0 674 | 0,3,1,18,0,0,7.775 675 | 0,3,1,24.5,0,0,8.05 676 | 1,3,0,18,0,0,9.8417 677 | 0,3,0,43,1,6,46.9 678 | 1,1,1,36,0,1,512.3292 679 | 0,3,0,28,0,0,8.1375 680 | 1,1,1,27,0,0,76.7292 681 | 0,3,1,20,0,0,9.225 682 | 0,3,1,14,5,2,46.9 683 | 0,2,1,60,1,1,39 684 | 0,2,1,25,1,2,41.5792 685 | 0,3,1,14,4,1,39.6875 686 | 0,3,1,19,0,0,10.1708 687 | 0,3,1,18,0,0,7.7958 688 | 1,1,0,15,0,1,211.3375 689 | 1,1,1,31,1,0,57 690 | 1,3,0,4,0,1,13.4167 691 | 1,3,1,37,0,0,56.4958 692 | 0,3,1,25,0,0,7.225 693 | 0,1,1,60,0,0,26.55 694 | 0,2,1,52,0,0,13.5 695 | 0,3,1,44,0,0,8.05 696 | 1,3,0,19,0,0,7.7333 697 | 0,1,1,49,1,1,110.8833 698 | 0,3,1,42,0,0,7.65 699 | 1,1,0,18,1,0,227.525 700 | 1,1,1,35,0,0,26.2875 701 | 0,3,0,18,0,1,14.4542 702 | 0,3,1,25,0,0,7.7417 703 | 0,3,1,26,1,0,7.8542 704 | 0,2,1,39,0,0,26 705 | 1,2,0,45,0,0,13.5 706 | 1,1,1,42,0,0,26.2875 707 | 1,1,0,22,0,0,151.55 708 | 1,3,1,4,1,1,15.2458 709 | 1,1,0,24,0,0,49.5042 710 | 0,1,1,41,0,0,26.55 711 | 1,1,1,48,1,0,52 712 | 0,3,1,29,0,0,9.4833 713 | 0,2,1,52,0,0,13 714 | 0,3,1,19,0,0,7.65 715 | 1,1,0,38,0,0,227.525 716 | 1,2,0,27,0,0,10.5 717 | 0,3,1,33,0,0,7.775 718 | 1,2,0,6,0,1,33 719 | 0,3,1,17,1,0,7.0542 720 | 0,2,1,34,0,0,13 721 | 0,2,1,50,0,0,13 722 | 1,1,1,27,1,0,53.1 723 | 0,3,1,20,0,0,8.6625 724 | 1,2,0,30,3,0,21 725 | 1,3,0,28,0,0,7.7375 726 | 0,2,1,25,1,0,26 727 | 0,3,0,25,1,0,7.925 728 | 1,1,0,29,0,0,211.3375 729 | 0,3,1,11,0,0,18.7875 730 | 0,2,1,41,0,0,0 731 | 0,2,1,23,0,0,13 732 | 0,2,1,23,0,0,13 733 | 0,3,1,28.5,0,0,16.1 734 | 0,3,0,48,1,3,34.375 735 | 1,1,1,35,0,0,512.3292 736 | 0,3,1,20,0,0,7.8958 737 | 0,3,1,32,0,0,7.8958 738 | 1,1,1,45,0,0,30 739 | 0,1,1,36,1,0,78.85 740 | 1,1,0,21,2,2,262.375 741 | 0,3,1,24,1,0,16.1 742 | 1,3,1,31,0,0,7.925 743 | 0,1,1,70,1,1,71 744 | 0,3,1,16,1,1,20.25 745 | 1,2,0,30,0,0,13 746 | 0,1,1,19,1,0,53.1 747 | 0,3,1,31,0,0,7.75 748 | 1,2,0,4,1,1,23 749 | 1,3,1,6,0,1,12.475 750 | 0,3,1,33,0,0,9.5 751 | 0,3,1,23,0,0,7.8958 752 | 1,2,0,48,1,2,65 753 | 1,2,1,0.67,1,1,14.5 754 | 0,3,1,28,0,0,7.7958 755 | 0,2,1,18,0,0,11.5 756 | 0,3,1,34,0,0,8.05 757 | 1,1,0,33,0,0,86.5 758 | 0,3,1,23,0,0,14.5 759 | 0,3,1,41,0,0,7.125 760 | 1,3,1,20,0,0,7.2292 761 | 1,1,0,36,1,2,120 762 | 0,3,1,16,0,0,7.775 763 | 1,1,0,51,1,0,77.9583 764 | 0,1,1,46,0,0,39.6 765 | 0,3,0,30.5,0,0,7.75 766 | 0,3,1,28,1,0,24.15 767 | 0,3,1,32,0,0,8.3625 768 | 0,3,1,24,0,0,9.5 769 | 0,3,1,48,0,0,7.8542 770 | 0,2,0,57,0,0,10.5 771 | 0,3,1,29,0,0,7.225 772 | 1,2,0,54,1,3,23 773 | 0,3,1,18,0,0,7.75 774 | 0,3,1,20,0,0,7.75 775 | 1,3,0,5,0,0,12.475 776 | 0,3,1,22,0,0,7.7375 777 | 1,1,0,43,0,1,211.3375 778 | 1,3,0,13,0,0,7.2292 779 | 1,1,0,17,1,0,57 780 | 0,1,1,29,0,0,30 781 | 0,3,1,35,1,2,23.45 782 | 0,3,1,25,0,0,7.05 783 | 0,3,1,25,0,0,7.25 784 | 1,3,0,18,0,0,7.4958 785 | 0,3,1,8,4,1,29.125 786 | 1,3,1,1,1,2,20.575 787 | 0,1,1,46,0,0,79.2 788 | 0,3,1,20,0,0,7.75 789 | 0,2,1,16,0,0,26 790 | 0,3,0,21,8,2,69.55 791 | 0,1,1,43,0,0,30.6958 792 | 0,3,1,25,0,0,7.8958 793 | 0,2,1,39,0,0,13 794 | 1,1,0,49,0,0,25.9292 795 | 1,3,0,31,0,0,8.6833 796 | 0,3,1,30,0,0,7.2292 797 | 0,3,0,30,1,1,24.15 798 | 0,2,1,34,0,0,13 799 | 1,2,0,31,1,1,26.25 800 | 1,1,1,11,1,2,120 801 | 1,3,1,0.42,0,1,8.5167 802 | 1,3,1,27,0,0,6.975 803 | 0,3,1,31,0,0,7.775 804 | 0,1,1,39,0,0,0 805 | 0,3,0,18,0,0,7.775 806 | 0,2,1,39,0,0,13 807 | 1,1,0,33,1,0,53.1 808 | 0,3,1,26,0,0,7.8875 809 | 0,3,1,39,0,0,24.15 810 | 0,2,1,35,0,0,10.5 811 | 0,3,0,6,4,2,31.275 812 | 
0,3,1,30.5,0,0,8.05 813 | 0,1,1,39,0,0,0 814 | 0,3,0,23,0,0,7.925 815 | 0,2,1,31,1,1,37.0042 816 | 0,3,1,43,0,0,6.45 817 | 0,3,1,10,3,2,27.9 818 | 1,1,0,52,1,1,93.5 819 | 1,3,1,27,0,0,8.6625 820 | 0,1,1,38,0,0,0 821 | 1,3,0,27,0,1,12.475 822 | 0,3,1,2,4,1,39.6875 823 | 0,3,1,36,0,0,6.95 824 | 0,3,1,23,0,0,56.4958 825 | 1,2,1,1,0,2,37.0042 826 | 1,3,1,19,0,0,7.75 827 | 1,1,0,62,0,0,80 828 | 1,3,0,15,1,0,14.4542 829 | 1,2,1,0.83,1,1,18.75 830 | 0,3,1,30,0,0,7.2292 831 | 0,3,1,23,0,0,7.8542 832 | 0,3,1,18,0,0,8.3 833 | 1,1,0,39,1,1,83.1583 834 | 0,3,1,21,0,0,8.6625 835 | 0,3,1,20,0,0,8.05 836 | 1,3,1,32,0,0,56.4958 837 | 1,1,1,29,0,0,29.7 838 | 0,3,1,20,0,0,7.925 839 | 0,2,1,16,0,0,10.5 840 | 1,1,0,30,0,0,31 841 | 0,3,1,34.5,0,0,6.4375 842 | 0,3,1,17,0,0,8.6625 843 | 0,3,1,42,0,0,7.55 844 | 0,3,1,18,8,2,69.55 845 | 0,3,1,35,0,0,7.8958 846 | 0,2,1,28,0,1,33 847 | 1,1,0,40,1,0,89.1042 848 | 0,3,1,4,4,2,31.275 849 | 0,3,1,74,0,0,7.775 850 | 0,3,0,9,1,1,15.2458 851 | 1,1,0,16,0,1,39.4 852 | 0,2,0,44,1,0,26 853 | 1,3,0,18,0,1,9.35 854 | 1,1,0,45,1,1,164.8667 855 | 1,1,1,51,0,0,26.55 856 | 1,3,0,24,0,3,19.2583 857 | 0,3,1,30,0,0,7.2292 858 | 0,3,1,41,2,0,14.1083 859 | 0,2,1,21,1,0,11.5 860 | 1,1,0,48,0,0,25.9292 861 | 0,3,0,14,8,2,69.55 862 | 0,2,1,24,0,0,13 863 | 1,2,0,42,0,0,13 864 | 1,2,0,27,1,0,13.8583 865 | 0,1,1,31,0,0,50.4958 866 | 0,3,1,23,0,0,9.5 867 | 1,3,1,4,1,1,11.1333 868 | 0,3,1,26,0,0,7.8958 869 | 1,1,0,47,1,1,52.5542 870 | 0,1,1,33,0,0,5 871 | 0,3,1,47,0,0,9 872 | 1,2,0,28,1,0,24 873 | 1,3,0,15,0,0,7.225 874 | 0,3,1,20,0,0,9.8458 875 | 0,3,1,19,0,0,7.8958 876 | 0,3,1,23,0,0,7.8958 877 | 1,1,0,56,0,1,83.1583 878 | 1,2,0,25,0,1,26 879 | 0,3,1,33,0,0,7.8958 880 | 0,3,0,22,0,0,10.5167 881 | 0,2,1,28,0,0,10.5 882 | 0,3,1,25,0,0,7.05 883 | 0,3,0,39,0,5,29.125 884 | 0,2,1,27,0,0,13 885 | 1,1,0,19,0,0,30 886 | 0,3,0,7,1,2,23.45 887 | 1,1,1,26,0,0,30 888 | 0,3,1,32,0,0,7.75 889 | -------------------------------------------------------------------------------- /03_KNN_Classifier/KNN_Classifier.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "KNN Classifier.ipynb", 7 | "provenance": [], 8 | "collapsed_sections": [] 9 | }, 10 | "kernelspec": { 11 | "name": "python3", 12 | "display_name": "Python 3" 13 | } 14 | }, 15 | "cells": [ 16 | { 17 | "cell_type": "markdown", 18 | "metadata": { 19 | "id": "7AqXLpH1eh0r", 20 | "colab_type": "text" 21 | }, 22 | "source": [ 23 | "# **k-nearest neighbors algorithm**\n", 24 | "\n", 25 | "In pattern recognition, the k-nearest neighbors algorithm is a non-parametric method used for classification and regression. In both cases, the input consists of the k closest training examples in the feature space.\n", 26 | "\n", 27 | "By [Muhammad Huzaifa Shahbaz](https://www.linkedin.com/in/mhuzaifadev)" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "metadata": { 33 | "id": "EkXtM692ev1u", 34 | "colab_type": "text" 35 | }, 36 | "source": [ 37 | "## **Importing libraries**\n", 38 | "\n", 39 | "We will import NumPy as np, Pandas as *pd*, train_test_split & cross_val_score from sklearn.model_selection, confusion_matrix & accuracy_score from sklearn.metrics, KNeighborsClassifier from sklearn.neighbors and preprocessing from sklearn." 
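The introduction above explains that a prediction is based on the k closest training examples in feature space; for classification this comes down to a majority vote among those neighbours. As a minimal NumPy-only illustration of that voting rule (not part of the original notebook):

```python
import numpy as np

def knn_predict(X_train, y_train, x_query, k=3):
    """Classify one query point by majority vote of its k nearest neighbours (Euclidean distance)."""
    distances = np.linalg.norm(X_train - x_query, axis=1)   # distance to every training point
    nearest = np.argsort(distances)[:k]                     # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                        # the most common label among them wins

# tiny made-up data: two well separated classes
X = np.array([[1.0, 1.0], [1.2, 0.8], [8.0, 8.0], [8.2, 7.9]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([1.1, 0.9]), k=3))   # prints 0
```

The KNeighborsClassifier imported below implements the same rule with efficient neighbour search.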
40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "metadata": { 45 | "id": "hewapbOjdYLB", 46 | "colab_type": "code", 47 | "colab": {} 48 | }, 49 | "source": [ 50 | "import pandas as pd\n", 51 | "import numpy as np\n", 52 | "from sklearn.neighbors import KNeighborsClassifier\n", 53 | "from sklearn.model_selection import train_test_split\n", 54 | "from sklearn.model_selection import cross_val_score\n", 55 | "from sklearn.metrics import confusion_matrix\n", 56 | "from sklearn.metrics import accuracy_score\n", 57 | "from sklearn import preprocessing" 58 | ], 59 | "execution_count": 0, 60 | "outputs": [] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": { 65 | "id": "PsAOn1BiC-_o", 66 | "colab_type": "text" 67 | }, 68 | "source": [ 69 | "## **DataFraming**\n", 70 | "\n", 71 | "Read .csv data into a Dataframe " 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "metadata": { 77 | "id": "A9gmwPuuDTq7", 78 | "colab_type": "code", 79 | "outputId": "a25c3145-3b50-4fa2-c8c0-d83e4168d343", 80 | "colab": { 81 | "base_uri": "https://localhost:8080/", 82 | "height": 439 83 | } 84 | }, 85 | "source": [ 86 | "data = pd.read_csv(\"credit_data.csv\")\n", 87 | "print(data.head())\n", 88 | "print(data.describe())\n", 89 | "print(data.corr())" 90 | ], 91 | "execution_count": 22, 92 | "outputs": [ 93 | { 94 | "output_type": "stream", 95 | "text": [ 96 | " clientid income age loan LTI default\n", 97 | "0 1 66155.925095 59.017015 8106.532131 0.122537 0\n", 98 | "1 2 34415.153966 48.117153 6564.745018 0.190752 0\n", 99 | "2 3 57317.170063 63.108049 8020.953296 0.139940 0\n", 100 | "3 4 42709.534201 45.751972 6103.642260 0.142911 0\n", 101 | "4 5 66952.688845 18.584336 8770.099235 0.130989 1\n", 102 | " clientid income ... LTI default\n", 103 | "count 2000.000000 2000.000000 ... 2000.000000 2000.000000\n", 104 | "mean 1000.500000 45331.600018 ... 0.098403 0.141500\n", 105 | "std 577.494589 14326.327119 ... 0.057620 0.348624\n", 106 | "min 1.000000 20014.489470 ... 0.000049 0.000000\n", 107 | "25% 500.750000 32796.459717 ... 0.047903 0.000000\n", 108 | "50% 1000.500000 45789.117313 ... 0.099437 0.000000\n", 109 | "75% 1500.250000 57791.281668 ... 0.147585 0.000000\n", 110 | "max 2000.000000 69995.685578 ... 0.199938 1.000000\n", 111 | "\n", 112 | "[8 rows x 6 columns]\n", 113 | " clientid income age loan LTI default\n", 114 | "clientid 1.000000 0.039280 -0.030341 0.018931 0.002538 -0.020145\n", 115 | "income 0.039280 1.000000 -0.034984 0.441117 -0.019862 0.002284\n", 116 | "age -0.030341 -0.034984 1.000000 0.006561 0.021588 -0.444765\n", 117 | "loan 0.018931 0.441117 0.006561 1.000000 0.847495 0.377160\n", 118 | "LTI 0.002538 -0.019862 0.021588 0.847495 1.000000 0.433261\n", 119 | "default -0.020145 0.002284 -0.444765 0.377160 0.433261 1.000000\n" 120 | ], 121 | "name": "stdout" 122 | } 123 | ] 124 | }, 125 | { 126 | "cell_type": "markdown", 127 | "metadata": { 128 | "id": "N0c60VUJEfr-", 129 | "colab_type": "text" 130 | }, 131 | "source": [ 132 | "## **Features Extraction**\n", 133 | "\n", 134 | "Extracting features and splitting data into test and train." 
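The code cell that follows assigns new attributes directly on the DataFrame (`data.features = ...`), which is what produces the Pandas UserWarning visible in its output. A sketch of the same extraction with ordinary variables avoids the warning (column names taken from the printout above; the fixed `random_state` is an optional addition, not in the notebook):

```python
import pandas as pd
from sklearn import preprocessing
from sklearn.model_selection import train_test_split

data = pd.read_csv("credit_data.csv")

features = data[["income", "age", "loan"]]   # plain variables instead of data.features
target = data["default"]

# rescale every feature to the [0, 1] range, as the notebook does
scaled = preprocessing.MinMaxScaler().fit_transform(features)

feature_train, feature_test, target_train, target_test = train_test_split(
    scaled, target, test_size=0.25, random_state=42)
```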
135 | ] 136 | }, 137 | { 138 | "cell_type": "code", 139 | "metadata": { 140 | "id": "X4X2kjPahc0V", 141 | "colab_type": "code", 142 | "colab": { 143 | "base_uri": "https://localhost:8080/", 144 | "height": 107 145 | }, 146 | "outputId": "8578c424-511f-48fd-870a-20a87293bf4d" 147 | }, 148 | "source": [ 149 | "data.features = data[[\"income\",\"age\",\"loan\"]]\n", 150 | "data.target = data.default\n", 151 | "\n", 152 | "data.features = preprocessing.MinMaxScaler().fit_transform(data.features)\n", 153 | "\n", 154 | "feature_train, feature_test, target_train, target_test = train_test_split(data.features,data.target)" 155 | ], 156 | "execution_count": 23, 157 | "outputs": [ 158 | { 159 | "output_type": "stream", 160 | "text": [ 161 | "/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:1: UserWarning: Pandas doesn't allow columns to be created via a new attribute name - see https://pandas.pydata.org/pandas-docs/stable/indexing.html#attribute-access\n", 162 | " \"\"\"Entry point for launching an IPython kernel.\n", 163 | "/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:2: UserWarning: Pandas doesn't allow columns to be created via a new attribute name - see https://pandas.pydata.org/pandas-docs/stable/indexing.html#attribute-access\n", 164 | " \n" 165 | ], 166 | "name": "stderr" 167 | } 168 | ] 169 | }, 170 | { 171 | "cell_type": "markdown", 172 | "metadata": { 173 | "id": "nHxk6jzYLx0m", 174 | "colab_type": "text" 175 | }, 176 | "source": [ 177 | "## **Training the Model**\n", 178 | "\n", 179 | "We are using KNeighborsClassifier model as imported from sklearn.neighbors library and then it's being trained on feature_train and target_train" 180 | ] 181 | }, 182 | { 183 | "cell_type": "code", 184 | "metadata": { 185 | "id": "zcVLyHh3L5Rv", 186 | "colab_type": "code", 187 | "colab": {} 188 | }, 189 | "source": [ 190 | "model = KNeighborsClassifier(n_neighbors=32)\n", 191 | "fitModel = model.fit(feature_train, target_train)\n", 192 | "predictions = fitModel.predict(feature_test)" 193 | ], 194 | "execution_count": 0, 195 | "outputs": [] 196 | }, 197 | { 198 | "cell_type": "markdown", 199 | "metadata": { 200 | "colab_type": "text", 201 | "id": "MldzyDb_kWJt" 202 | }, 203 | "source": [ 204 | "## **Finding Optimal K Value**\n", 205 | "\n", 206 | "We are going to use cross-validation in order to find the optimal k value. This optimal value is not going to have as good accuracy and precision as we have seen for any neighbors close to ***n*** but it is going to be much more realistic because we use cross-validation. So we are going to use the cross-validation scores and then we are going to make a simple iteration. 
Basically we are going to consider k values from 1 up to 100.\n" 207 | ] 208 | }, 209 | { 210 | "cell_type": "code", 211 | "metadata": { 212 | "id": "MMj7k6ETlAmD", 213 | "colab_type": "code", 214 | "colab": { 215 | "base_uri": "https://localhost:8080/", 216 | "height": 34 217 | }, 218 | "outputId": "f37a8951-0e7a-4499-d3a0-ef1f62acd9bb" 219 | }, 220 | "source": [ 221 | "cross_valid_scores = []\n", 222 | "\n", 223 | "for k in range(1, 100):\n", 224 | " knn = KNeighborsClassifier(n_neighbors = k)\n", 225 | " scores = cross_val_score(knn,data.features, data.target, cv = 10, scoring = 'accuracy')\n", 226 | " cross_valid_scores.append(scores.mean())\n", 227 | "\n", 228 | "print(\"Optimal k with cross-validation: \\t\",np.argmax(cross_valid_scores))" 229 | ], 230 | "execution_count": 26, 231 | "outputs": [ 232 | { 233 | "output_type": "stream", 234 | "text": [ 235 | "Optimal k with cross-validation: \t 32\n" 236 | ], 237 | "name": "stdout" 238 | } 239 | ] 240 | }, 241 | { 242 | "cell_type": "markdown", 243 | "metadata": { 244 | "id": "tFBsaZpeMdIa", 245 | "colab_type": "text" 246 | }, 247 | "source": [ 248 | "## **Printing an Error Matrix and Accuracy Score**" 249 | ] 250 | }, 251 | { 252 | "cell_type": "code", 253 | "metadata": { 254 | "id": "urqzo3v7MzYp", 255 | "colab_type": "code", 256 | "outputId": "fd356366-5a2d-4b70-bad5-1ebda9f26903", 257 | "colab": { 258 | "base_uri": "https://localhost:8080/", 259 | "height": 70 260 | } 261 | }, 262 | "source": [ 263 | "print(confusion_matrix(target_test,predictions))\n", 264 | "print(accuracy_score(target_test,predictions))" 265 | ], 266 | "execution_count": 28, 267 | "outputs": [ 268 | { 269 | "output_type": "stream", 270 | "text": [ 271 | "[[437 2]\n", 272 | " [ 10 51]]\n", 273 | "0.976\n" 274 | ], 275 | "name": "stdout" 276 | } 277 | ] 278 | } 279 | ] 280 | } -------------------------------------------------------------------------------- /04_Naive_Bayes_Classifier/Naive_Bayes_Classifier.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "Naive_Bayes Classifier.ipynb", 7 | "provenance": [], 8 | "collapsed_sections": [] 9 | }, 10 | "kernelspec": { 11 | "name": "python3", 12 | "display_name": "Python 3" 13 | } 14 | }, 15 | "cells": [ 16 | { 17 | "cell_type": "markdown", 18 | "metadata": { 19 | "id": "7AqXLpH1eh0r", 20 | "colab_type": "text" 21 | }, 22 | "source": [ 23 | "# **Naive Bayes Classifier**\n", 24 | "\n", 25 | "Naive Bayes classifiers are a collection of classification algorithms based on Bayes' Theorem. It is not a single algorithm but a family of algorithms where all of them share a common principle, i.e. every pair of features being classified is independent of each other.\n", 26 | "\n", 27 | "By [Muhammad Huzaifa Shahbaz](https://www.linkedin.com/in/mhuzaifadev)" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "metadata": { 33 | "id": "EkXtM692ev1u", 34 | "colab_type": "text" 35 | }, 36 | "source": [ 37 | "## **Importing libraries**\n", 38 | "\n", 39 | "We will import Pandas as *pd*, train_test_split from sklearn.model_selection, confusion_matrix & accuracy_score from sklearn.metrics and GaussianNB from sklearn.naive_bayes." 
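The description above states that every pair of features is treated as independent given the class. As a purely illustrative sketch of what a Gaussian naive Bayes classifier computes, each class is scored by its prior combined with per-feature Gaussian likelihoods, and the highest score wins:

```python
import numpy as np

def gaussian_nb_predict(X_train, y_train, x_query):
    """Return the class with the largest log prior plus summed per-feature Gaussian log likelihoods."""
    best_class, best_log_post = None, -np.inf
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        prior = len(Xc) / len(X_train)
        mean, var = Xc.mean(axis=0), Xc.var(axis=0) + 1e-9      # small epsilon avoids division by zero
        # independence assumption: the log likelihoods of the features simply add up
        log_likelihood = -0.5 * np.sum(np.log(2 * np.pi * var) + (x_query - mean) ** 2 / var)
        log_post = np.log(prior) + log_likelihood
        if log_post > best_log_post:
            best_class, best_log_post = c, log_post
    return best_class

X = np.array([[1.0, 2.0], [1.2, 1.9], [5.0, 6.0], [5.1, 6.2]])
y = np.array([0, 0, 1, 1])
print(gaussian_nb_predict(X, y, np.array([1.1, 2.1])))   # prints 0
```

The GaussianNB model used in the cells below fits exactly these per-class means, variances and priors from the training data.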
40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "metadata": { 45 | "id": "hewapbOjdYLB", 46 | "colab_type": "code", 47 | "colab": {} 48 | }, 49 | "source": [ 50 | "import pandas as pd\n", 51 | "from sklearn.naive_bayes import GaussianNB\n", 52 | "from sklearn.model_selection import train_test_split\n", 53 | "from sklearn.metrics import confusion_matrix\n", 54 | "from sklearn.metrics import accuracy_score" 55 | ], 56 | "execution_count": 0, 57 | "outputs": [] 58 | }, 59 | { 60 | "cell_type": "markdown", 61 | "metadata": { 62 | "id": "PsAOn1BiC-_o", 63 | "colab_type": "text" 64 | }, 65 | "source": [ 66 | "## **DataFraming**\n", 67 | "\n", 68 | "Read .csv data into a Dataframe " 69 | ] 70 | }, 71 | { 72 | "cell_type": "code", 73 | "metadata": { 74 | "id": "A9gmwPuuDTq7", 75 | "colab_type": "code", 76 | "outputId": "8a4009cd-79d1-4487-e057-e5da813dffce", 77 | "colab": { 78 | "base_uri": "https://localhost:8080/", 79 | "height": 439 80 | } 81 | }, 82 | "source": [ 83 | "data = pd.read_csv(\"/content/sample_data/credit_data.csv\")\n", 84 | "print(data.head())\n", 85 | "print(data.describe())\n", 86 | "print(data.corr())" 87 | ], 88 | "execution_count": 3, 89 | "outputs": [ 90 | { 91 | "output_type": "stream", 92 | "text": [ 93 | " clientid income age loan LTI default\n", 94 | "0 1 66155.925095 59.017015 8106.532131 0.122537 0\n", 95 | "1 2 34415.153966 48.117153 6564.745018 0.190752 0\n", 96 | "2 3 57317.170063 63.108049 8020.953296 0.139940 0\n", 97 | "3 4 42709.534201 45.751972 6103.642260 0.142911 0\n", 98 | "4 5 66952.688845 18.584336 8770.099235 0.130989 1\n", 99 | " clientid income ... LTI default\n", 100 | "count 2000.000000 2000.000000 ... 2000.000000 2000.000000\n", 101 | "mean 1000.500000 45331.600018 ... 0.098403 0.141500\n", 102 | "std 577.494589 14326.327119 ... 0.057620 0.348624\n", 103 | "min 1.000000 20014.489470 ... 0.000049 0.000000\n", 104 | "25% 500.750000 32796.459717 ... 0.047903 0.000000\n", 105 | "50% 1000.500000 45789.117313 ... 0.099437 0.000000\n", 106 | "75% 1500.250000 57791.281668 ... 0.147585 0.000000\n", 107 | "max 2000.000000 69995.685578 ... 0.199938 1.000000\n", 108 | "\n", 109 | "[8 rows x 6 columns]\n", 110 | " clientid income age loan LTI default\n", 111 | "clientid 1.000000 0.039280 -0.030341 0.018931 0.002538 -0.020145\n", 112 | "income 0.039280 1.000000 -0.034984 0.441117 -0.019862 0.002284\n", 113 | "age -0.030341 -0.034984 1.000000 0.006561 0.021588 -0.444765\n", 114 | "loan 0.018931 0.441117 0.006561 1.000000 0.847495 0.377160\n", 115 | "LTI 0.002538 -0.019862 0.021588 0.847495 1.000000 0.433261\n", 116 | "default -0.020145 0.002284 -0.444765 0.377160 0.433261 1.000000\n" 117 | ], 118 | "name": "stdout" 119 | } 120 | ] 121 | }, 122 | { 123 | "cell_type": "markdown", 124 | "metadata": { 125 | "id": "N0c60VUJEfr-", 126 | "colab_type": "text" 127 | }, 128 | "source": [ 129 | "## **Features Extraction**\n", 130 | "\n", 131 | "Extracting features and splitting data into test and train." 
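One detail worth noting before the split: the summary statistics above show a mean of 0.1415 for `default`, so only about 14% of clients default. An optional refinement (not used in the notebook) is to stratify the split so both halves keep that ratio:

```python
import pandas as pd
from sklearn.model_selection import train_test_split

data = pd.read_csv("credit_data.csv")
features = data[["income", "age", "loan"]]
target = data["default"]

# stratify=target keeps roughly 14% defaulters in both the train and the test portion
feature_train, feature_test, target_train, target_test = train_test_split(
    features, target, test_size=0.25, stratify=target, random_state=0)

print(target_train.mean(), target_test.mean())   # both close to 0.1415
```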
132 | ] 133 | }, 134 | { 135 | "cell_type": "code", 136 | "metadata": { 137 | "id": "X4X2kjPahc0V", 138 | "colab_type": "code", 139 | "colab": {} 140 | }, 141 | "source": [ 142 | "data.features = data[[\"income\",\"age\",\"loan\"]]\n", 143 | "data.target = data.default\n", 144 | "\n", 145 | "feature_train, feature_test, target_train, target_test = train_test_split(data.features,data.target)" 146 | ], 147 | "execution_count": 0, 148 | "outputs": [] 149 | }, 150 | { 151 | "cell_type": "markdown", 152 | "metadata": { 153 | "id": "nHxk6jzYLx0m", 154 | "colab_type": "text" 155 | }, 156 | "source": [ 157 | "## **Training the Model**\n", 158 | "\n", 159 | "We are using GaussianNB model as imported from sklearn.naive_bayes library and then it's being trained on feature_train and target_train" 160 | ] 161 | }, 162 | { 163 | "cell_type": "code", 164 | "metadata": { 165 | "id": "zcVLyHh3L5Rv", 166 | "colab_type": "code", 167 | "colab": {} 168 | }, 169 | "source": [ 170 | "model = GaussianNB()\n", 171 | "fitModel = model.fit(feature_train, target_train)\n", 172 | "predictions = fitModel.predict(feature_test)" 173 | ], 174 | "execution_count": 0, 175 | "outputs": [] 176 | }, 177 | { 178 | "cell_type": "markdown", 179 | "metadata": { 180 | "id": "tFBsaZpeMdIa", 181 | "colab_type": "text" 182 | }, 183 | "source": [ 184 | "## **Printing an Error Matrix and Accuracy Score**" 185 | ] 186 | }, 187 | { 188 | "cell_type": "code", 189 | "metadata": { 190 | "id": "urqzo3v7MzYp", 191 | "colab_type": "code", 192 | "outputId": "33289bcf-32b1-4e83-b511-f134d47b2781", 193 | "colab": { 194 | "base_uri": "https://localhost:8080/", 195 | "height": 70 196 | } 197 | }, 198 | "source": [ 199 | "print(confusion_matrix(target_test,predictions))\n", 200 | "print(accuracy_score(target_test,predictions))" 201 | ], 202 | "execution_count": 7, 203 | "outputs": [ 204 | { 205 | "output_type": "stream", 206 | "text": [ 207 | "[[415 12]\n", 208 | " [ 29 44]]\n", 209 | "0.918\n" 210 | ], 211 | "name": "stdout" 212 | } 213 | ] 214 | } 215 | ] 216 | } -------------------------------------------------------------------------------- /05_Support_Vector_Machine/Support_Vector_Machine_(Digit_Recognition).ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "Copy of Support_Vector_Machine (Digit_Recognition).ipynb", 7 | "provenance": [], 8 | "collapsed_sections": [] 9 | }, 10 | "kernelspec": { 11 | "name": "python3", 12 | "display_name": "Python 3" 13 | } 14 | }, 15 | "cells": [ 16 | { 17 | "cell_type": "markdown", 18 | "metadata": { 19 | "id": "HFenp3yV-gcO", 20 | "colab_type": "text" 21 | }, 22 | "source": [ 23 | "## **Support Vector Machine (SVM)**\n", 24 | "\n", 25 | "A support vector machine (SVM) is a supervised machine learning model that uses classification algorithms for two-group classification problems. 
After giving an SVM model sets of labeled training data for each category, they're able to categorize new text.\n", 26 | "\n", 27 | "By [Muhammad Huzaifa Shahbaz](https://www.linkedin.com/in/mhuzaifadev)" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "metadata": { 33 | "id": "EkXtM692ev1u", 34 | "colab_type": "text" 35 | }, 36 | "source": [ 37 | "## **Importing libraries**\n" 38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "metadata": { 43 | "id": "hewapbOjdYLB", 44 | "colab_type": "code", 45 | "colab": {} 46 | }, 47 | "source": [ 48 | "import numpy as np\n", 49 | "import pandas as pd\n", 50 | "import matplotlib.pyplot as plt\n", 51 | "import matplotlib.cm as cm\n", 52 | "from sklearn import svm, datasets, metrics\n", 53 | "from sklearn.model_selection import train_test_split\n", 54 | "from sklearn.metrics import accuracy_score" 55 | ], 56 | "execution_count": 0, 57 | "outputs": [] 58 | }, 59 | { 60 | "cell_type": "markdown", 61 | "metadata": { 62 | "id": "PsAOn1BiC-_o", 63 | "colab_type": "text" 64 | }, 65 | "source": [ 66 | "### **Digits Loading**\n", 67 | "\n", 68 | "Loading data from sklearn datasets" 69 | ] 70 | }, 71 | { 72 | "cell_type": "code", 73 | "metadata": { 74 | "id": "6djJF7NEAx_I", 75 | "colab_type": "code", 76 | "colab": {} 77 | }, 78 | "source": [ 79 | "digits = datasets.load_digits()\n", 80 | "#print(digits)\n", 81 | "images_and_labels = list(zip(digits.images,digits.target))" 82 | ], 83 | "execution_count": 0, 84 | "outputs": [] 85 | }, 86 | { 87 | "cell_type": "markdown", 88 | "metadata": { 89 | "id": "N0c60VUJEfr-", 90 | "colab_type": "text" 91 | }, 92 | "source": [ 93 | "## **Flattening the Images**\n", 94 | "\n", 95 | "To apply a classifier on this data, we need to flatten the images to turn the data into (samples, features) matrix." 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "metadata": { 101 | "id": "pwLzYvYWBQcQ", 102 | "colab_type": "code", 103 | "colab": {} 104 | }, 105 | "source": [ 106 | "n_samples = len(digits.images)\n", 107 | "data = digits.images.reshape((n_samples, -1))" 108 | ], 109 | "execution_count": 0, 110 | "outputs": [] 111 | }, 112 | { 113 | "cell_type": "markdown", 114 | "metadata": { 115 | "id": "nHxk6jzYLx0m", 116 | "colab_type": "text" 117 | }, 118 | "source": [ 119 | "## **Training the Model**\n", 120 | "\n", 121 | "We are using SVM model as imported from sklearn library and then it's being trained on 75% of the digits." 
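The training cell below forms the 75% training set by slicing off the first rows, which implicitly assumes the digits are not ordered by class. A shuffled split gives the same 75/25 proportion without relying on that assumption (a sketch, not the notebook's cell; the `random_state` value is arbitrary):

```python
from sklearn import svm, datasets, metrics
from sklearn.model_selection import train_test_split

digits = datasets.load_digits()
data = digits.images.reshape((len(digits.images), -1))   # flatten the 8x8 images to 64 features

# shuffled 75/25 split instead of taking the first 75% of the rows
X_train, X_test, y_train, y_test = train_test_split(
    data, digits.target, train_size=0.75, random_state=0)

model = svm.SVC()
model.fit(X_train, y_train)
print(metrics.accuracy_score(y_test, model.predict(X_test)))
```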
122 | ] 123 | }, 124 | { 125 | "cell_type": "code", 126 | "metadata": { 127 | "id": "1eAhZM71COWr", 128 | "colab_type": "code", 129 | "colab": {} 130 | }, 131 | "source": [ 132 | "trainTestSplit = int(n_samples*0.75)\n", 133 | "model = svm.SVC()\n", 134 | "model.fit(data[:trainTestSplit], digits.target[:trainTestSplit])\n", 135 | "target = digits.target[trainTestSplit:]\n", 136 | "predictions = model.predict(data[trainTestSplit:])" 137 | ], 138 | "execution_count": 0, 139 | "outputs": [] 140 | }, 141 | { 142 | "cell_type": "markdown", 143 | "metadata": { 144 | "id": "tFBsaZpeMdIa", 145 | "colab_type": "text" 146 | }, 147 | "source": [ 148 | "## **Printing an Error Matrix and Accuracy Score**" 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "metadata": { 154 | "id": "urqzo3v7MzYp", 155 | "colab_type": "code", 156 | "outputId": "236c2b2d-6eb7-4994-ed81-2a1b917f87aa", 157 | "colab": { 158 | "base_uri": "https://localhost:8080/", 159 | "height": 210 160 | } 161 | }, 162 | "source": [ 163 | "print(metrics.confusion_matrix(target,predictions))\n", 164 | "print(accuracy_score(target,predictions))" 165 | ], 166 | "execution_count": 11, 167 | "outputs": [ 168 | { 169 | "output_type": "stream", 170 | "text": [ 171 | "[[42 0 0 0 1 0 0 0 0 0]\n", 172 | " [ 0 46 0 0 0 0 0 0 0 0]\n", 173 | " [ 0 0 43 0 0 0 0 0 0 0]\n", 174 | " [ 0 0 0 37 0 2 0 2 6 0]\n", 175 | " [ 0 0 0 0 45 0 0 0 1 2]\n", 176 | " [ 0 0 0 0 0 44 1 0 0 0]\n", 177 | " [ 0 0 0 0 0 0 47 0 0 0]\n", 178 | " [ 0 0 0 0 0 0 0 44 1 0]\n", 179 | " [ 0 2 0 0 0 0 0 0 38 1]\n", 180 | " [ 0 0 0 1 0 1 0 1 1 41]]\n", 181 | "0.9488888888888889\n" 182 | ], 183 | "name": "stdout" 184 | } 185 | ] 186 | }, 187 | { 188 | "cell_type": "markdown", 189 | "metadata": { 190 | "id": "5E1tYLLJJRSB", 191 | "colab_type": "text" 192 | }, 193 | "source": [ 194 | "## **Testing**" 195 | ] 196 | }, 197 | { 198 | "cell_type": "code", 199 | "metadata": { 200 | "id": "6VDPd6RDJqxf", 201 | "colab_type": "code", 202 | "colab": { 203 | "base_uri": "https://localhost:8080/", 204 | "height": 282 205 | }, 206 | "outputId": "3f17080d-fda6-40fa-bd60-95375186d3cd" 207 | }, 208 | "source": [ 209 | "plt.imshow(digits.images[-3], cmap = plt.cm.gray_r, interpolation = \"nearest\")\n", 210 | "print(\"Prediction for test image:\", model.predict(data[-3].reshape(1,-1)))\n", 211 | "\n", 212 | "plt.show()" 213 | ], 214 | "execution_count": 18, 215 | "outputs": [ 216 | { 217 | "output_type": "stream", 218 | "text": [ 219 | "Prediction for test image: [8]\n" 220 | ], 221 | "name": "stdout" 222 | }, 223 | { 224 | "output_type": "display_data", 225 | "data": { 226 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAPUAAAD4CAYAAAA0L6C7AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAKnElEQVR4nO3d3Ytd5RmG8fvuqLRWO4EmFMmETA4kIA1NZAhIitiIJVYxOehBAgqRgidVDC2I9iT4D2hyUASJ2oCp0sYPRKxWMEMrtNYkTlqT0ZLGCZmgTUIZNR40JD49mBWIMnbW3nt9zeP1g+B8bOZ9Nnq59l6zsl5HhADk8Y22BwBQLaIGkiFqIBmiBpIhaiCZy+r4oYsXL47R0dE6fnSrLly40Oh6H3zwQWNrnT17trG1Vq1a1dhaQ0NDja3VpKmpKZ05c8Zzfa+WqEdHR7V///46fnSrZmZmGl1v69atja01Pj7e2Fr79u1rbK1FixY1tlaTxsbGvvJ7vPwGkiFqIBmiBpIhaiAZogaSIWogGaIGkiFqIBmiBpIpFbXtDbbft33U9oN1DwWgf/NGbXtI0q8l3SrpOklbbF9X92AA+lPmSL1W0tGIOBYR5yQ9K2ljvWMB6FeZqJdKOnHJ59PF177A9j2299vef/r06armA9Cjyk6URcTjETEWEWNLliyp6scC6FGZqE9KWnbJ5yPF1wB0UJmo35Z0re0Vtq+QtFnSS/WOBaBf894kISLO275X0muShiQ9GRGHa58MQF9K3fkkIl6R9ErNswCoAFeUAckQNZAMUQPJEDWQDFEDyRA1kAxRA8nUskNHVjfddFOj6x06dKixtbZv397YWk3vdNKkLuwIwpEaSIaogWSIGkiGqIFkiBpIhqiBZIgaSIaogWSIGkiGqIFkyuzQ8aTtU7bfbWIgAIMpc6T+jaQNNc8BoCLzRh0Rf5L0nwZmAVCByt5Ts+0O0A1suwMkw9lvIBmiBpIp8yutZyT9RdJK29O2f1b/WAD6VWYvrS1NDAKgGrz8BpIhaiAZogaSIWogGaIGkiFqIBmiBpJh250eNLkNjiQNDw83ttbU1FRja61YsaKxtV588cXG1pKkjRs3NrreXDhSA8kQNZAMUQPJEDWQDFEDyRA1kAxRA8kQNZAMUQPJEDWQTJl7lC2zvc/2EduHbd/fxGAA+lPm2u/zkn4ZEQdtXy3pgO3XI+JIzbMB6EOZbXc+jIiDxcefSpqUtLTuwQD0p6f31LZHJa2R9NYc32PbHaADSkdt+ypJz0naFhGffPn7bLsDdEOpqG1frtmg90TE8/WOBGAQZc5+W9ITkiYj4pH6RwIwiDJH6nWS7pK03vZE8ecnNc8FoE9ltt15U5IbmAVABbiiDEiGqIFkiBpIhqiBZIgaSIaogWSIGkiGqIFk2EurB9u3b290vYcffrixtXbv3t3YWo8++mhja3Vhb6umcaQGkiFqIBmiBpIhaiAZogaSIWogGaIGkiFqIBmiBpIpc+PBb9r+m+1DxbY7zV3mBKBnZS4T/a+k9RFxtrhV8Ju2/xARf615NgB9KHPjwZB0tvj08uJP1DkUgP6VvZn/kO0JSackvR4RbLsDdFSpqCPiQkSsljQiaa3t78/xGLbdATqgp7PfETEjaZ+kDfWMA2BQZc5+L7G9qPj4W5JukfRe3YMB6E+Zs9/XSNpte0iz/xP4XUS8XO9YAPpV5uz33zW7JzWABYAryoBkiBpIhqiBZIgaSIaogWSIGkiGqIFkiBpIhm13ejAzM9P2CCmsXr267RFS40gNJEPUQDJEDSRD1EAyRA0kQ9RAMkQNJEPUQDJEDSRD1EAypaMubuj/jm1uOgh0WC9H6vslTdY1CIBqlN12Z0TSbZJ21TsOgEGVPVLvkPSApM+/6gHspQV0Q5kdOm6XdCoiDvy/x7GXFtANZY7U6yTdYXtK0rOS1tt+utapAPRt3qgj4qGIGImIUUmbJb0REXfWPhmAvvB7aiCZnm5nFBHjksZrmQRAJThSA8kQNZAMUQPJEDWQDFEDyRA1kAxRA8mw7U4Pdu7c2eh6y5cvb2yt48ePN7bWpk2bGlvr67hVEkdqIBmiBpIhaiAZogaSIWogGaIGkiFqIBmiBpIhaiAZogaSKXWZaHEn0U8lXZB0PiLG6hwKQP96ufb7RxFxprZJAFSCl99AMmWjDkl/tH3A9j1zPYBtd4BuKBv1DyPiekm3Svq57Ru//AC23QG6oVTUEXGy+OcpSS9IWlvnUAD6V2aDvG/bvvrix5J+LOndugcD0J8yZ7+/J+kF2xcf/9uIeLXWqQD0bd6oI+KYpB80MAuACvArLSAZogaSIWogGaIGkiFqIBmiBpIhaiAZtt3pwfDwcKPrNbllTJPP7eOPP25sra8jjtRAMkQNJEPUQDJEDSRD1EAyRA0kQ9RAMkQNJEPUQDJEDSRTKmrbi2zvtf2e7UnbN9Q9GID+lL32e6ekVyPip7avkHRljTMBGMC8UdselnSjpK2SFBHnJJ2rdywA/Srz8nuFpNOSnrL9ju1dxf2/v4Btd4BuKBP1ZZKul/RYRKyR9JmkB7/8ILbdAbqhTNTTkqYj4q3i872ajRxAB80bdUR8JOmE7ZXFl26WdKTWqQD0rezZ7/sk7SnOfB+TdHd9IwEYRKmoI2JC0ljNswCoAFeUAckQNZAMUQPJEDWQDFEDyRA1kAxRA8kQNZAMe2n1YGJiotH1duzY0dha4+Pjja21bdu2xtb6OuJIDSRD1EAyRA0kQ9RAMkQNJEPUQDJEDSRD1EAyRA0kM2/Utlfanrjkzye2uSQI6Kh5LxONiPclrZYk20OSTkp6oea5APSp15ffN0v6V0Qcr2MYAIPrNerNkp6Z6xtsuwN0Q+moi3t+3yHp93N9n213gG7o5Uh9q6SDEfHvuoYBMLheot6ir3jpDaA7SkVdbF17i6Tn6x0HwKDKbrvzmaTv1jwLgApwRRmQDFEDyRA1kAxRA8kQNZAMUQPJEDWQDFEDyTgiqv+h9mlJvf71zMWSzlQ+TDdkfW48r/Ysj4g5/+ZULVH3w/b+iBhre446ZH1uPK9u4uU3kAxRA8l0KerH2x6gRlmfG8+rgzrznhpANbp0pAZQAaIGkulE1LY32H7f9lHbD7Y9TxVsL7O9z/YR24dt39/2TFWyPWT7Hdsvtz1LlWwvsr3X9nu2J23f0PZMvWr9PXWxQcA/NXu7pGlJb0vaEhFHWh1sQLavkXRNRBy0fbWkA5I2LfTndZHtX0gak/SdiLi97XmqYnu3pD9HxK7iDrpXRsRM23P1ogtH6rWSjkbEsYg4J+lZSRtbnmlgEfFhRBwsPv5U0qSkpe1OVQ3bI5Juk7Sr7VmqZHtY0o2SnpCkiDi30IKWuhH1UkknLvl8Wkn+47/I9qikNZLeaneSyuyQ9ICkz9sepGIrJJ2W9FTx1mJXcdPNBaULUadm+ypJz0naFhGftD3PoGzfLulURBxoe5YaXCbpekmPRcQaSZ9JWnDneLoQ9UlJyy75fKT42oJn+3LNBr
0nIrLcXnmdpDtsT2n2rdJ620+3O1JlpiVNR8TFV1R7NRv5gtKFqN+WdK3tFcWJic2SXmp5poHZtmbfm01GxCNtz1OViHgoIkYiYlSz/67eiIg7Wx6rEhHxkaQTtlcWX7pZ0oI7sVnqvt91iojztu+V9JqkIUlPRsThlseqwjpJd0n6h+2J4mu/iohXWpwJ87tP0p7iAHNM0t0tz9Oz1n+lBaBaXXj5DaBCRA0kQ9RAMkQNJEPUQDJEDSRD1EAy/wOyjq4i4utuBAAAAABJRU5ErkJggg==\n", 227 | "text/plain": [ 228 | "
" 229 | ] 230 | }, 231 | "metadata": { 232 | "tags": [], 233 | "needs_background": "light" 234 | } 235 | } 236 | ] 237 | } 238 | ] 239 | } -------------------------------------------------------------------------------- /05_Support_Vector_Machine/Support_Vector_Machine_(Iris_Dataset).ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "Support_Vector_Machine (Iris Dataset).ipynb", 7 | "provenance": [], 8 | "collapsed_sections": [] 9 | }, 10 | "kernelspec": { 11 | "name": "python3", 12 | "display_name": "Python 3" 13 | } 14 | }, 15 | "cells": [ 16 | { 17 | "cell_type": "markdown", 18 | "metadata": { 19 | "id": "HFenp3yV-gcO", 20 | "colab_type": "text" 21 | }, 22 | "source": [ 23 | "## **Support Vector Machine (SVM)**\n", 24 | "\n", 25 | "A support vector machine (SVM) is a supervised machine learning model that uses classification algorithms for two-group classification problems. After giving an SVM model sets of labeled training data for each category, they're able to categorize new text.\n", 26 | "\n", 27 | "By [Muhammad Huzaifa Shahbaz](https://www.linkedin.com/in/mhuzaifadev)" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "metadata": { 33 | "id": "EkXtM692ev1u", 34 | "colab_type": "text" 35 | }, 36 | "source": [ 37 | "## **Importing libraries**\n", 38 | "\n", 39 | "We will import Numpy as *np*, Pandas as *pd*, MatplotLib.pyplot as *plt*, *svm* & *datasets* from sklearn, train_test_split from sklearn.model_selection and confusion_matrix & accuracy_score from sklearn.metrics." 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "metadata": { 45 | "id": "hewapbOjdYLB", 46 | "colab_type": "code", 47 | "colab": {} 48 | }, 49 | "source": [ 50 | "import numpy as np\n", 51 | "import pandas as pd\n", 52 | "import matplotlib.pyplot as plt\n", 53 | "from sklearn import svm\n", 54 | "from sklearn import datasets\n", 55 | "from sklearn.model_selection import train_test_split\n", 56 | "from sklearn.metrics import confusion_matrix\n", 57 | "from sklearn.metrics import accuracy_score" 58 | ], 59 | "execution_count": 0, 60 | "outputs": [] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": { 65 | "id": "PsAOn1BiC-_o", 66 | "colab_type": "text" 67 | }, 68 | "source": [ 69 | "## **Data Loading**\n", 70 | "\n", 71 | "Loading data from sklearn datasets" 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "metadata": { 77 | "id": "6djJF7NEAx_I", 78 | "colab_type": "code", 79 | "colab": { 80 | "base_uri": "https://localhost:8080/", 81 | "height": 1000 82 | }, 83 | "outputId": "276a6c92-6bd6-4b46-ba24-f169f707a7bd" 84 | }, 85 | "source": [ 86 | "dataset = datasets.load_iris()\n", 87 | "#print(dataset)" 88 | ], 89 | "execution_count": 3, 90 | "outputs": [ 91 | { 92 | "output_type": "stream", 93 | "text": [ 94 | "{'data': array([[5.1, 3.5, 1.4, 0.2],\n", 95 | " [4.9, 3. , 1.4, 0.2],\n", 96 | " [4.7, 3.2, 1.3, 0.2],\n", 97 | " [4.6, 3.1, 1.5, 0.2],\n", 98 | " [5. , 3.6, 1.4, 0.2],\n", 99 | " [5.4, 3.9, 1.7, 0.4],\n", 100 | " [4.6, 3.4, 1.4, 0.3],\n", 101 | " [5. , 3.4, 1.5, 0.2],\n", 102 | " [4.4, 2.9, 1.4, 0.2],\n", 103 | " [4.9, 3.1, 1.5, 0.1],\n", 104 | " [5.4, 3.7, 1.5, 0.2],\n", 105 | " [4.8, 3.4, 1.6, 0.2],\n", 106 | " [4.8, 3. , 1.4, 0.1],\n", 107 | " [4.3, 3. , 1.1, 0.1],\n", 108 | " [5.8, 4. 
, 1.2, 0.2],\n", 109 | " [5.7, 4.4, 1.5, 0.4],\n", 110 | " [5.4, 3.9, 1.3, 0.4],\n", 111 | " [5.1, 3.5, 1.4, 0.3],\n", 112 | " [5.7, 3.8, 1.7, 0.3],\n", 113 | " [5.1, 3.8, 1.5, 0.3],\n", 114 | " [5.4, 3.4, 1.7, 0.2],\n", 115 | " [5.1, 3.7, 1.5, 0.4],\n", 116 | " [4.6, 3.6, 1. , 0.2],\n", 117 | " [5.1, 3.3, 1.7, 0.5],\n", 118 | " [4.8, 3.4, 1.9, 0.2],\n", 119 | " [5. , 3. , 1.6, 0.2],\n", 120 | " [5. , 3.4, 1.6, 0.4],\n", 121 | " [5.2, 3.5, 1.5, 0.2],\n", 122 | " [5.2, 3.4, 1.4, 0.2],\n", 123 | " [4.7, 3.2, 1.6, 0.2],\n", 124 | " [4.8, 3.1, 1.6, 0.2],\n", 125 | " [5.4, 3.4, 1.5, 0.4],\n", 126 | " [5.2, 4.1, 1.5, 0.1],\n", 127 | " [5.5, 4.2, 1.4, 0.2],\n", 128 | " [4.9, 3.1, 1.5, 0.2],\n", 129 | " [5. , 3.2, 1.2, 0.2],\n", 130 | " [5.5, 3.5, 1.3, 0.2],\n", 131 | " [4.9, 3.6, 1.4, 0.1],\n", 132 | " [4.4, 3. , 1.3, 0.2],\n", 133 | " [5.1, 3.4, 1.5, 0.2],\n", 134 | " [5. , 3.5, 1.3, 0.3],\n", 135 | " [4.5, 2.3, 1.3, 0.3],\n", 136 | " [4.4, 3.2, 1.3, 0.2],\n", 137 | " [5. , 3.5, 1.6, 0.6],\n", 138 | " [5.1, 3.8, 1.9, 0.4],\n", 139 | " [4.8, 3. , 1.4, 0.3],\n", 140 | " [5.1, 3.8, 1.6, 0.2],\n", 141 | " [4.6, 3.2, 1.4, 0.2],\n", 142 | " [5.3, 3.7, 1.5, 0.2],\n", 143 | " [5. , 3.3, 1.4, 0.2],\n", 144 | " [7. , 3.2, 4.7, 1.4],\n", 145 | " [6.4, 3.2, 4.5, 1.5],\n", 146 | " [6.9, 3.1, 4.9, 1.5],\n", 147 | " [5.5, 2.3, 4. , 1.3],\n", 148 | " [6.5, 2.8, 4.6, 1.5],\n", 149 | " [5.7, 2.8, 4.5, 1.3],\n", 150 | " [6.3, 3.3, 4.7, 1.6],\n", 151 | " [4.9, 2.4, 3.3, 1. ],\n", 152 | " [6.6, 2.9, 4.6, 1.3],\n", 153 | " [5.2, 2.7, 3.9, 1.4],\n", 154 | " [5. , 2. , 3.5, 1. ],\n", 155 | " [5.9, 3. , 4.2, 1.5],\n", 156 | " [6. , 2.2, 4. , 1. ],\n", 157 | " [6.1, 2.9, 4.7, 1.4],\n", 158 | " [5.6, 2.9, 3.6, 1.3],\n", 159 | " [6.7, 3.1, 4.4, 1.4],\n", 160 | " [5.6, 3. , 4.5, 1.5],\n", 161 | " [5.8, 2.7, 4.1, 1. ],\n", 162 | " [6.2, 2.2, 4.5, 1.5],\n", 163 | " [5.6, 2.5, 3.9, 1.1],\n", 164 | " [5.9, 3.2, 4.8, 1.8],\n", 165 | " [6.1, 2.8, 4. , 1.3],\n", 166 | " [6.3, 2.5, 4.9, 1.5],\n", 167 | " [6.1, 2.8, 4.7, 1.2],\n", 168 | " [6.4, 2.9, 4.3, 1.3],\n", 169 | " [6.6, 3. , 4.4, 1.4],\n", 170 | " [6.8, 2.8, 4.8, 1.4],\n", 171 | " [6.7, 3. , 5. , 1.7],\n", 172 | " [6. , 2.9, 4.5, 1.5],\n", 173 | " [5.7, 2.6, 3.5, 1. ],\n", 174 | " [5.5, 2.4, 3.8, 1.1],\n", 175 | " [5.5, 2.4, 3.7, 1. ],\n", 176 | " [5.8, 2.7, 3.9, 1.2],\n", 177 | " [6. , 2.7, 5.1, 1.6],\n", 178 | " [5.4, 3. , 4.5, 1.5],\n", 179 | " [6. , 3.4, 4.5, 1.6],\n", 180 | " [6.7, 3.1, 4.7, 1.5],\n", 181 | " [6.3, 2.3, 4.4, 1.3],\n", 182 | " [5.6, 3. , 4.1, 1.3],\n", 183 | " [5.5, 2.5, 4. , 1.3],\n", 184 | " [5.5, 2.6, 4.4, 1.2],\n", 185 | " [6.1, 3. , 4.6, 1.4],\n", 186 | " [5.8, 2.6, 4. , 1.2],\n", 187 | " [5. , 2.3, 3.3, 1. ],\n", 188 | " [5.6, 2.7, 4.2, 1.3],\n", 189 | " [5.7, 3. , 4.2, 1.2],\n", 190 | " [5.7, 2.9, 4.2, 1.3],\n", 191 | " [6.2, 2.9, 4.3, 1.3],\n", 192 | " [5.1, 2.5, 3. , 1.1],\n", 193 | " [5.7, 2.8, 4.1, 1.3],\n", 194 | " [6.3, 3.3, 6. , 2.5],\n", 195 | " [5.8, 2.7, 5.1, 1.9],\n", 196 | " [7.1, 3. , 5.9, 2.1],\n", 197 | " [6.3, 2.9, 5.6, 1.8],\n", 198 | " [6.5, 3. , 5.8, 2.2],\n", 199 | " [7.6, 3. , 6.6, 2.1],\n", 200 | " [4.9, 2.5, 4.5, 1.7],\n", 201 | " [7.3, 2.9, 6.3, 1.8],\n", 202 | " [6.7, 2.5, 5.8, 1.8],\n", 203 | " [7.2, 3.6, 6.1, 2.5],\n", 204 | " [6.5, 3.2, 5.1, 2. ],\n", 205 | " [6.4, 2.7, 5.3, 1.9],\n", 206 | " [6.8, 3. , 5.5, 2.1],\n", 207 | " [5.7, 2.5, 5. , 2. ],\n", 208 | " [5.8, 2.8, 5.1, 2.4],\n", 209 | " [6.4, 3.2, 5.3, 2.3],\n", 210 | " [6.5, 3. 
, 5.5, 1.8],\n", 211 | " [7.7, 3.8, 6.7, 2.2],\n", 212 | " [7.7, 2.6, 6.9, 2.3],\n", 213 | " [6. , 2.2, 5. , 1.5],\n", 214 | " [6.9, 3.2, 5.7, 2.3],\n", 215 | " [5.6, 2.8, 4.9, 2. ],\n", 216 | " [7.7, 2.8, 6.7, 2. ],\n", 217 | " [6.3, 2.7, 4.9, 1.8],\n", 218 | " [6.7, 3.3, 5.7, 2.1],\n", 219 | " [7.2, 3.2, 6. , 1.8],\n", 220 | " [6.2, 2.8, 4.8, 1.8],\n", 221 | " [6.1, 3. , 4.9, 1.8],\n", 222 | " [6.4, 2.8, 5.6, 2.1],\n", 223 | " [7.2, 3. , 5.8, 1.6],\n", 224 | " [7.4, 2.8, 6.1, 1.9],\n", 225 | " [7.9, 3.8, 6.4, 2. ],\n", 226 | " [6.4, 2.8, 5.6, 2.2],\n", 227 | " [6.3, 2.8, 5.1, 1.5],\n", 228 | " [6.1, 2.6, 5.6, 1.4],\n", 229 | " [7.7, 3. , 6.1, 2.3],\n", 230 | " [6.3, 3.4, 5.6, 2.4],\n", 231 | " [6.4, 3.1, 5.5, 1.8],\n", 232 | " [6. , 3. , 4.8, 1.8],\n", 233 | " [6.9, 3.1, 5.4, 2.1],\n", 234 | " [6.7, 3.1, 5.6, 2.4],\n", 235 | " [6.9, 3.1, 5.1, 2.3],\n", 236 | " [5.8, 2.7, 5.1, 1.9],\n", 237 | " [6.8, 3.2, 5.9, 2.3],\n", 238 | " [6.7, 3.3, 5.7, 2.5],\n", 239 | " [6.7, 3. , 5.2, 2.3],\n", 240 | " [6.3, 2.5, 5. , 1.9],\n", 241 | " [6.5, 3. , 5.2, 2. ],\n", 242 | " [6.2, 3.4, 5.4, 2.3],\n", 243 | " [5.9, 3. , 5.1, 1.8]]), 'target': array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", 244 | " 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", 245 | " 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,\n", 246 | " 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,\n", 247 | " 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,\n", 248 | " 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,\n", 249 | " 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2]), 'target_names': array(['setosa', 'versicolor', 'virginica'], dtype='= 7:\n", 106 | " return 1\n", 107 | " else:\n", 108 | " return 0" 109 | ], 110 | "execution_count": 0, 111 | "outputs": [] 112 | }, 113 | { 114 | "cell_type": "markdown", 115 | "metadata": { 116 | "id": "N0c60VUJEfr-", 117 | "colab_type": "text" 118 | }, 119 | "source": [ 120 | "## **Features Extraction**\n", 121 | "\n", 122 | "To apply a classifier on this data, we need to extract features and target data and split it into test and train." 123 | ] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "metadata": { 128 | "id": "CLm4haVcGf-N", 129 | "colab_type": "code", 130 | "colab": {} 131 | }, 132 | "source": [ 133 | "features = data[[\"fixed acidity\",\"volatile acidity\",\"citric acid\",\"residual sugar\",\"chlorides\",\"free sulfur dioxide\",\"total sulfur dioxide\",\"density\",\"pH\",\"sulphates\",\"alcohol\"]]\n", 134 | "data['tasty'] = data['quality'].apply(isTasty)\n", 135 | "targets = data['tasty']\n", 136 | "\n", 137 | "feature_train, feature_test, target_train, target_test = train_test_split(features,targets,test_size=0.2)" 138 | ], 139 | "execution_count": 0, 140 | "outputs": [] 141 | }, 142 | { 143 | "cell_type": "markdown", 144 | "metadata": { 145 | "id": "avcFhrNaO1TB", 146 | "colab_type": "text" 147 | }, 148 | "source": [ 149 | "## **Finding an Optimal Value**\n", 150 | "\n", 151 | "It may take upto 45min in execution on Colab or maybe hours in your local desktop, based on processing power." 
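The grid in the next cell covers 6 x 5 = 30 parameter combinations, each evaluated with 10-fold cross-validation, which is why the search runs so long. If the runtime is a problem, a randomized search over the same grid with parallel workers is a common shortcut (a sketch with hypothetical settings, not the notebook's cell; `feature_train` and `target_train` come from the feature-extraction split above):

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import RandomizedSearchCV

param_grid = {
    "n_estimators": [50, 100, 200, 300, 500, 1000],
    "learning_rate": [0.01, 0.05, 0.1, 0.3, 1],
}

# sample 10 of the 30 combinations, 5-fold CV, all CPU cores: far cheaper than the full grid
search = RandomizedSearchCV(AdaBoostClassifier(), param_grid,
                            n_iter=10, cv=5, n_jobs=-1, random_state=0)
# search.fit(feature_train, target_train)
# print(search.best_params_)
```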
152 | ] 153 | }, 154 | { 155 | "cell_type": "code", 156 | "metadata": { 157 | "id": "CFjn9HyMO1ml", 158 | "colab_type": "code", 159 | "outputId": "7e1a906b-744f-447c-869b-348502d5c62c", 160 | "colab": { 161 | "base_uri": "https://localhost:8080/", 162 | "height": 35 163 | } 164 | }, 165 | "source": [ 166 | "param_grid = {\n", 167 | " 'n_estimators' : [50,100,200,300,500,1000],\n", 168 | " 'learning_rate' : [0.01,0.05,0.1,0.3,1],\n", 169 | " }\n", 170 | "\n", 171 | "grid_search = GridSearchCV(estimator=AdaBoostClassifier(), param_grid=param_grid,cv=10)\n", 172 | "grid_search.fit(feature_train, target_train)\n", 173 | "\n", 174 | "print(grid_search.best_params_)\n", 175 | "\n", 176 | "optimal_estimators = grid_search.best_params_.get(\"n_estimators\")\n", 177 | "optimal_lrate = grid_search.best_params_.get(\"learning_rate\")" 178 | ], 179 | "execution_count": 0, 180 | "outputs": [ 181 | { 182 | "output_type": "stream", 183 | "text": [ 184 | "{'learning_rate': 0.3, 'n_estimators': 300}\n" 185 | ], 186 | "name": "stdout" 187 | } 188 | ] 189 | }, 190 | { 191 | "cell_type": "markdown", 192 | "metadata": { 193 | "id": "nHxk6jzYLx0m", 194 | "colab_type": "text" 195 | }, 196 | "source": [ 197 | "## **Training the Model**\n", 198 | "\n", 199 | "We will use AdaBoostClassifier for training the model." 200 | ] 201 | }, 202 | { 203 | "cell_type": "code", 204 | "metadata": { 205 | "id": "1eAhZM71COWr", 206 | "colab_type": "code", 207 | "colab": {} 208 | }, 209 | "source": [ 210 | "best_model = AdaBoostClassifier(n_estimators=optimal_estimators,learning_rate=optimal_lrate)\n", 211 | "best_model.fit(feature_train, target_train)\n", 212 | "\n", 213 | "predictions = best_model.predict(feature_test)" 214 | ], 215 | "execution_count": 0, 216 | "outputs": [] 217 | }, 218 | { 219 | "cell_type": "markdown", 220 | "metadata": { 221 | "id": "tFBsaZpeMdIa", 222 | "colab_type": "text" 223 | }, 224 | "source": [ 225 | "## **Printing an Error Matrix and Accuracy Score**" 226 | ] 227 | }, 228 | { 229 | "cell_type": "code", 230 | "metadata": { 231 | "id": "urqzo3v7MzYp", 232 | "colab_type": "code", 233 | "outputId": "4be01f9b-3308-4b38-df9b-eeb432505f38", 234 | "colab": { 235 | "base_uri": "https://localhost:8080/", 236 | "height": 72 237 | } 238 | }, 239 | "source": [ 240 | "print(confusion_matrix(target_test,predictions))\n", 241 | "print(accuracy_score(target_test,predictions))" 242 | ], 243 | "execution_count": 0, 244 | "outputs": [ 245 | { 246 | "output_type": "stream", 247 | "text": [ 248 | "[[717 45]\n", 249 | " [135 83]]\n", 250 | "0.8163265306122449\n" 251 | ], 252 | "name": "stdout" 253 | } 254 | ] 255 | } 256 | ] 257 | } -------------------------------------------------------------------------------- /09_Clustering/Hierarchical_Clustering.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "Hierarchical Clustering.ipynb", 7 | "provenance": [] 8 | }, 9 | "kernelspec": { 10 | "name": "python3", 11 | "display_name": "Python 3" 12 | } 13 | }, 14 | "cells": [ 15 | { 16 | "cell_type": "markdown", 17 | "metadata": { 18 | "id": "OTbnV4K1CV4i", 19 | "colab_type": "text" 20 | }, 21 | "source": [ 22 | "## **Hierarchical Clustering**\n", 23 | "\n", 24 | "Hierarchical clustering, also known as hierarchical cluster analysis, is an algorithm that groups similar objects into groups called clusters. 
The endpoint is a set of clusters, where each cluster is distinct from each other cluster, and the objects within each cluster are broadly similar to each other." 25 | ] 26 | }, 27 | { 28 | "cell_type": "markdown", 29 | "metadata": { 30 | "id": "j3F3bpdAWNCO", 31 | "colab_type": "text" 32 | }, 33 | "source": [ 34 | "## **Importing Libraries**" 35 | ] 36 | }, 37 | { 38 | "cell_type": "code", 39 | "metadata": { 40 | "id": "CPIMeNgoCjZL", 41 | "colab_type": "code", 42 | "colab": { 43 | "base_uri": "https://localhost:8080/", 44 | "height": 70 45 | }, 46 | "outputId": "f4425e29-071b-42c2-a14e-54c7e4af4c89" 47 | }, 48 | "source": [ 49 | "import matplotlib.pyplot as plt\n", 50 | "from scipy.cluster.hierarchy import linkage, dendrogram\n", 51 | "from sklearn.datasets.samples_generator import make_blobs\n", 52 | "import numpy as np" 53 | ], 54 | "execution_count": 2, 55 | "outputs": [ 56 | { 57 | "output_type": "stream", 58 | "text": [ 59 | "/usr/local/lib/python3.6/dist-packages/sklearn/utils/deprecation.py:144: FutureWarning: The sklearn.datasets.samples_generator module is deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.datasets. Anything that cannot be imported from sklearn.datasets is now part of the private API.\n", 60 | " warnings.warn(message, FutureWarning)\n" 61 | ], 62 | "name": "stderr" 63 | } 64 | ] 65 | }, 66 | { 67 | "cell_type": "markdown", 68 | "metadata": { 69 | "id": "4anySMcgqROX", 70 | "colab_type": "text" 71 | }, 72 | "source": [ 73 | "## **Dataset Loading**\n", 74 | "\n" 75 | ] 76 | }, 77 | { 78 | "cell_type": "code", 79 | "metadata": { 80 | "id": "H2fQIxhZDJa2", 81 | "colab_type": "code", 82 | "colab": { 83 | "base_uri": "https://localhost:8080/", 84 | "height": 264 85 | }, 86 | "outputId": "96955c3c-9304-4bb4-8971-d9445d6a4d6f" 87 | }, 88 | "source": [ 89 | "x = np.array([[1,1],[1.5,1],[3,8],[4,4],[3,3.5],[3.5,4]])\n", 90 | "plt.scatter(x[:,0],x[:,1],s=50)\n", 91 | "\n", 92 | "plt.show()" 93 | ], 94 | "execution_count": 5, 95 | "outputs": [ 96 | { 97 | "output_type": "display_data", 98 | "data": { 99 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAWoAAAD4CAYAAADFAawfAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAP1UlEQVR4nO3dX2xb533G8ecZbamhHDSzRWxRnI69KrAGdpLRQroMQZY/Q9oKzsUCzAXSzQYKA9uwJtuAYs3Fiq63Q9H9AVYYSe1sTf8tTYZMbYoGSIChF7VDO7ab2NmQdUqbKKtpB01iaZMm9bcLHrsqQ5mHNg/5kvp+AMIUzyvx99NrPTo6fHmOI0IAgHT90qALAABcGkENAIkjqAEgcQQ1ACSOoAaAxG0q4otOTk5GtVot4ksDwEg6evTo2YiotNtWSFBXq1XV6/UivjQAjCTbr663jUMfAJA4ghoAEkdQA0DiCGoASFwhLyYCKM75pRXNnpjX3LkFVbdNaGbnlLaM86M8ynLNru0/lfQJSSHpB5L2RcT/FlkYgHd7fu5N7T14RBHS4vKqymMlfe5bp3Ro37R2VbcOujwUpOOhD9vXSfqkpFpE3CCpJGlP0YUB+EXnl1a09+ARLSytanF5VVIzrBeWVrPHVwZcIYqS9xj1JklX2d4kqSxpvriSALQze2Je652VOEKaPcmP5ajqGNQR8bqkv5b0I0lvSHorIr7bOs72ftt12/VGo9H7SoENbu7cwsU96VaLy6uaO7vY54rQL3kOffyypHslvV/SlKQJ2/e3jouIAxFRi4hapdL2XZAArkB124TKY6W228pjJVUny32uCP2S59DHXZL+KyIaEfF/kp6Q9JvFlgWg1czOKdntt9nSzI6p/haEvskT1D+SdIvtsm1LulPS6WLLAtBqy/gmHdo3rYnx0sU96/JYSRPjpexxluiNqo4zGxGHbT8u6ZikFUkvSDpQdGEA3m1XdauOPHSXZk/Oa+7soqqTZc3smCKkR5yLuLhtrVYLzp4HAPnZPhoRtXbbeAs5ACSOoAaAxBHUAJA4ghoAEkdQA0DiCGoASBxBDQCJI6gBIHEENQAkjqAGgMQR1ACQOIIaABJHUANA4ghqAEgcQQ0AiSOoASBxBDUAJI6gBoDEEdQAkLiOQW37A7aPr7m9bfvBfhQHAMh3FfJ/l3SjJNkuSXpd0pMF1wUAyHR76ONOSf8ZEa8WUQwA4N26Deo9kr7aboPt/bbrtuuNRuPKKwMASOoiqG2PSdot6Z/bbY+IAxFRi4hapVLpVX0AsOF1s0f9YUnHIuInRRUDAHi3boL6Y1rnsAcAoDi5gtr2hKS7JT1RbDkAgFYdl+dJUkQsSNpWcC0AgDZ4ZyIAJI6gBoDEEdQAkDiCGgASR1ADQOIIagBIHEENAIkjqAEgcQQ1ACSOoAaAxBHUAJA4ghoAEkdQA0DiCGoASBxBDQCJI6gBIHEENQAkjqAGgMQR1ACQOIIaABKX9yrk19h+3PbLtk/b/lDRhQEAmnJdhVzS30j6TkTcZ3tMUrnAmgAAa3QMatvvlXSbpL2SFBHLkpaLLQsAcEGeQx/vl9SQdND2C7Yftj3ROsj2ftt12/VGo9HzQgFgo8oT1Jsk3SzpHyLiJkkLkv6idVBEHIiIWkTUKpVKj8sEgI0rT1C/Jum1iDicffy4msENAOiDjkEdEf8t6ce2P5A9dKekU4VWBQC4KO+qjz+R9Fi24uOHkvYVVxIAYK1cQR0RxyXVCq4FANAG70wEgMQR1ACQOIIaABJHUANA4ghqAEgcQQ0AiSOoASBxBDUAJI6gBoDEEdQAkDiCGgASR1ADQOIIagBIHEENAIkjqAEgcQQ1ACSOoAaAxBHUAJA4ghoAEpfrmom25yS9I2lV0kpEcP1EAOiTvFchl6TfjoizhVUCAGiLQx8AkLi8QR2Svmv7qO397QbY3m+7brveaDR6VyEAbHB5g/q3IuJmSR+W9Me2b2sdEBEHIqIWEbVKpdLTIgFgI8sV1BHxevbvGUlPSpousigAwM91DGrbE7avvnBf0u9IerHowgAATXlWffyKpCdtXxj/lYj4TqFVAQAu6hjUEfFDSTv7UAsAoA2W5wFA4ghqAEgcQQ0AiSOoASBxBDUAJI6gBoDEEdQAkDiCGgASR1ADQOIIagBIHEENAIkjqAEgcQQ1ACSOoAaAxBHUAJA4ghoAEkdQA0DiCGoASFyeayYCQM+dX1rR7Il5zZ1bUHXbhGZ2TmnL+HBGUtG9OCLyDbRLkuqSXo+ImUuNrdVqUa/Xe1AegFH0/Nyb2nvwiCKkxeVVlcdKsqVD+6a1q7p10OV1pVe92D4aEbV227o59PGApNNdjAeAdzm/tKK9B49oYWlVi8urkpoBt7C0mj2+MuAK8+tXL7mC2vZ2SR+V9HBPnhXAhjV7Yl7r/SEfIc2enO9vQVegX73k3aP+gqRPSfrZegNs77ddt11vNBo9KQ7A6Jk7t3Bx77PV4vKq5s4u9rmiy9evXjoGte0ZSWci4uilxkXEgYioRUStUqn0pDgAo6e6bULlsVLbbeWxkqqT5T5XdPn61UuePepbJe22PSfpa5LusP3lnjw7gA1nZueU7PbbbGlmx1R/C7oC/eqlY1BHxKcjYntEVCXtkfRsRNzfk2cHsOFsGd+kQ/umNTFeurg3Wh4raWK8lD0+PEv0+tXL8HxHAIyMXdWtOvLQXZo9Oa+5s4uqTpY1s2NqqEL6gn70knsddTdYRw0A3enVOmoAwAAQ1ACQOIIaABJHUANA4obvJVZggxuls84hH2YXGCLtztT2uW+dGsqzziE/Dn0AQ2KUzjqH7hDUwJAYpbPOoTsENTAkRumsc+gOQQ0MiVE66xy6Q1ADQ2KUzjqH7hDUwJAYpbPOoTvMLDBERumsc8iP2QWGzMT4Jv3ervcNugz0EYc+ACBxBDUAJI6gBoDEEdQAkDiCGgAS1zGobb/H9hHbJ2y/ZPuz/SgMANCUZ3nekqQ7IuK87c2Svmf76Yj4fsG1AQCUI6ijeZny89mHm7Nb7y9dDgBoK9cxatsl28clnZH0TEQcbjNmv+267Xqj0eh1nQCwYeUK6ohYjYgbJW2XNG37hjZjDkRELSJqlUql13UCwIbV1aqPiPippOck3VNMOQCAVnlWfVRsX5Pdv0rS3ZJeLrowAEBTnlUf10p61HZJzWD/RkTMFlsWAOCCPKs+Tkq6qQ+1AADa4J2JAJA4ghoAEkdQA0DiCGoASBxBDQCJI6gBIHEENQAkjqAGgMQR1ACQOIIaABJHUANA4ghqAEgcQQ0AiSOoASBxBDUAJI6gBoDEEdQAkDiCGgASR1ADQOIIagBIXMegtn297edsn7L9ku0H+lEYAKCp41XIJa1I+vOIOGb7aklHbT8TEacKrg0AoBx71BHxRkQcy+6/I+m0pOuKLgwA0NTVMWrbVUk3STrcZtt+23Xb9Uaj0ZvqAAD5g9r2FknflPRgRLzduj0iDkRELSJqlUqllzUCwIaWK6htb1YzpB+LiCeKLQ
kAsFaeVR+W9Iik0xHx+eJLAgCslWeP+lZJH5d0h+3j2e0jBdcFAMh0XJ4XEd+T5D7UAgBog3cmAkDiCGoASBxBDQCJI6gBIHEENQAkjqAGgMQR1ACQOIIaABJHUANA4ghqAEgcQQ0AiSOoASBxBDUAJI6gBoDEEdQAkDiCGgASR1ADQOIIagBIXMdLcfXL+aUVzZ6Y19y5BVW3TWhm55S2jCdTXldGqRcAg+eIuPQA+0uSZiSdiYgb8nzRWq0W9Xo9dxHPz72pvQePKEJaXF5VeawkWzq0b1q7qltzf50UjFIvAPrH9tGIqLXblufQxyFJ9/S0ojXOL61o78EjWlha1eLyqqRmwC0srWaPrxT11D03Sr0ASEfHoI6If5P0ZlEFzJ6Y13o79RHS7Mn5op6650apFwDp6NmLibb3267brjcajdyfN3du4eLeZ6vF5VXNnV3sVYmFG6VeAKSjZ0EdEQciohYRtUqlkvvzqtsmVB4rtd1WHiupOlnuVYmFG6VeAKRj4MvzZnZOyW6/zZZmdkz1t6ArMEq9AEjHwIN6y/gmHdo3rYnx0sW90fJYSRPjpezx4VnWNkq9AEhHnuV5X5V0u6RJST+R9JmIeORSn9Pt8jxJWlha0ezJec2dXVR1sqyZHVNDG2yj1AuA/rjU8ryOQX05LieoAWAju9J11ACAASKoASBxBDUAJI6gBoDEFfJiou2GpFcv89MnJZ3tYTmDNCq9jEofEr2kaFT6kK6sl1+LiLbvFiwkqK+E7fp6r3wOm1HpZVT6kOglRaPSh1RcLxz6AIDEEdQAkLgUg/rAoAvooVHpZVT6kOglRaPSh1RQL8kdowYA/KIU96gBAGsQ1ACQuIEEte0v2T5j+8V1ttv239p+xfZJ2zf3u8a8cvRyu+23bB/Pbn/Z7xrzsH297edsn7L9ku0H2owZinnJ2cuwzMt7bB+xfSLr5bNtxozb/no2L4dtV/tf6aXl7GOv7caaOfnEIGrNy3bJ9gu2Z9ts6+2cRETfb5Juk3SzpBfX2f4RSU9LsqRbJB0eRJ096uV2SbODrjNHH9dKujm7f7Wk/5D068M4Lzl7GZZ5saQt2f3Nkg5LuqVlzB9J+mJ2f4+krw+67svsY6+kvx90rV309GeSvtLu/1Gv52Qge9TR+YK590r6x2j6vqRrbF/bn+q6k6OXoRARb0TEsez+O5JOS7quZdhQzEvOXoZC9r0+n324Obu1rgC4V9Kj2f3HJd1pr3etocHI2cfQsL1d0kclPbzOkJ7OSarHqK+T9OM1H7+mIf1By3wo+5PvadsfHHQxnWR/pt2k5l7PWkM3L5foRRqSecn+xD4u6YykZyJi3XmJiBVJb0na1t8qO8vRhyT9bnZY7XHb1/e5xG58QdKnJP1sne09nZNUg3qUHFPzPfw7Jf2dpH8ZcD2XZHuLpG9KejAi3h50PVeiQy9DMy8RsRoRN0raLmna9g2Druly5OjjXyVVI2KHpGf08z3SpNiekXQmIo726zlTDerXJa39bbo9e2zoRMTbF/7ki4hvS9pse3LAZbVle7OawfZYRDzRZsjQzEunXoZpXi6IiJ9Kek7SPS2bLs6L7U2S3ivpXH+ry2+9PiLiXEQsZR8+LOk3+l1bTrdK2m17TtLXJN1h+8stY3o6J6kG9VOSfj9bZXCLpLci4o1BF3U5bP/qhWNTtqfV/J4n90OU1fiIpNMR8fl1hg3FvOTpZYjmpWL7muz+VZLulvRyy7CnJP1Bdv8+Sc9G9ipWKvL00fJ6x241X1tITkR8OiK2R0RVzRcKn42I+1uG9XROBnLFVa+5YK7t1yR9Rs0XFxQRX5T0bTVXGLwiaVHSvkHUmUeOXu6T9Ie2VyT9j6Q9qf0QZW6V9HFJP8iOI0rSQ5LeJw3dvOTpZVjm5VpJj9ouqfnL5BsRMWv7ryTVI+IpNX8p/ZPtV9R8YXvP4MpdV54+Pml7t6QVNfvYO7BqL0ORc8JbyAEgcake+gAAZAhqAEgcQQ0AiSOoASBxBDUAJI6gBoDEEdQAkLj/B9N1KpKQspU6AAAAAElFTkSuQmCC\n", 100 | "text/plain": [ 101 | "
" 102 | ] 103 | }, 104 | "metadata": { 105 | "tags": [], 106 | "needs_background": "light" 107 | } 108 | } 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "metadata": { 114 | "id": "3OY0KOqzO7dy", 115 | "colab_type": "text" 116 | }, 117 | "source": [ 118 | "## **Nearest Point Algorithm**" 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "metadata": { 124 | "id": "S7fTugJUPG6y", 125 | "colab_type": "code", 126 | "colab": { 127 | "base_uri": "https://localhost:8080/", 128 | "height": 100 129 | }, 130 | "outputId": "5b0f6c8a-4e7f-4257-b31c-f1cec265a84c" 131 | }, 132 | "source": [ 133 | "linkage_matrix = linkage(x,\"complete\")\n", 134 | "print(linkage_matrix)" 135 | ], 136 | "execution_count": 6, 137 | "outputs": [ 138 | { 139 | "output_type": "stream", 140 | "text": [ 141 | "[[0. 1. 0.5 2. ]\n", 142 | " [3. 5. 0.5 2. ]\n", 143 | " [4. 7. 1.11803399 3. ]\n", 144 | " [6. 8. 4.24264069 5. ]\n", 145 | " [2. 9. 7.28010989 6. ]]\n" 146 | ], 147 | "name": "stdout" 148 | } 149 | ] 150 | }, 151 | { 152 | "cell_type": "markdown", 153 | "metadata": { 154 | "id": "EztAqL_GPNEM", 155 | "colab_type": "text" 156 | }, 157 | "source": [ 158 | "## **Pruning Tree Like Structure with Tuncade Mode**" 159 | ] 160 | }, 161 | { 162 | "cell_type": "code", 163 | "metadata": { 164 | "id": "48lpPyFzPNZo", 165 | "colab_type": "code", 166 | "colab": { 167 | "base_uri": "https://localhost:8080/", 168 | "height": 284 169 | }, 170 | "outputId": "890ef2c7-c1e6-4f20-f4bf-b63a6a862c84" 171 | }, 172 | "source": [ 173 | "dendrogram = dendrogram(linkage_matrix,truncate_mode='none')\n", 174 | "plt.title(\"Hierarchical Clustering\")\n", 175 | "plt.show()" 176 | ], 177 | "execution_count": 8, 178 | "outputs": [ 179 | { 180 | "output_type": "display_data", 181 | "data": { 182 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAWoAAAELCAYAAADwcMwcAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAASj0lEQVR4nO3deZCkdX3H8ffHBUQ5o4ygcqyJSrziKFtYxqgbjwAmEa2YBNeYYELWo7A0xihWNEETE2OZs0StjSBEXM+AUTwSjG4ZKI/s6nhweYKArAxeLKsuAb7543kGZ8fZnV7onv7NzPtV1cX08/ym+9O9zad//TzP9JOqQpLUrruMO4AkafcsaklqnEUtSY2zqCWpcRa1JDXOopakxlnUIsklSdY2kOPkJBftZv1HkvzBKO9jgN/flOSUO5NhGJI8NskV486hxWFRL3NJrkzypDnLdiqrqnpIVW1a9HB7qKpOqKpzRnkfSfZJcnqSrybZ3j9/ZyVZPcT7uFNvFgBV9T9VdfSwMqltFrXusCR73YHfWTWKLEP0PuCpwDrgIODhwBbgieMMNdsded61tFnU2mnWneQuSU5L8vUk303yniT36NetTlJJ/ijJt4CP98vfm2Rrkh8m+WSSh8y67bOTvDnJh5NsB341yRFJzksy3d/HG+fkeUOS7yf5ZpITZi3fabNDkj9OclmSbUkuTfLIfvlM/pnlTx/weXgS8GTgxKr636q6pap+WFVnVNWZ84w/Pcm5s67PPD979ddPTvKNPsc3kzwryYOAtwCPTnJTkh/0Y+/aP+5vJflOkrckuVu/bm2Sa5K8PMlW4G0zy+b8G740yRf7f4d3J9l31vqXJbkuybeTnNLnvP8gz4vGz6LWXC8EngY8HrgP8H3gjDljHg88CDiuv/4R4AHAvYDPAe+YM34d8FrgAOBTwAXAVcBq4L7Au2aNfRRwBXAI8HrgzCSZGzLJbwOnA78PHEg3C/5uv/rrwGPpZsSvBs5Ncu8BHvuTgM9W1dUDjN2tJPsB/wKcUFUHAL8MTFXVZcDzgE9V1f5VdXD/K68DHghMAvene17+YtZNHgbcAzgKWL+Lu/0d4HjgfsAvASf3WY4HXtI/vvsDa+/s49PisqhXhvcn+cHMBXjTbsY+D/jzqrqmqnbQleEz5nzcPr2qtlfVjwGq6qyq2jZr/MOTHDRr/H9U1cVVdRtdgdwH+LP+Nn5SVbO3115VVf9aVbcC5wD3Bg6dJ+cpwOv7mW9V1deq6qo+z3ur6ttVdVtVvRv4KnDsAM/TPYHrBhg3qNuAhya5W1VdV1WXzDeofyNaD/xJVX2vqrYBfwOcNOe2/rKqdsw87/P4l/5xfw/4IF3pQ1fgb6uqS6rqR3T/RlpCLOqV4WlVdfDMBXjBbsYeBZw/q9QvA25l57K8fcaZZFWS1/WbGm4EruxXHTLfeOAIujK+ZRf3v3Xmh75UAPafZ9wRdDPnn5Hk95NMzXoMD52TZ1e+S/fGcKdV1Xbgd+ne+K5L8qEkv7iL4RPA3YEtszJ/tF8+Y7qqfrLA3W6d9fOP+Onzdh92/je4058YtLgsas11Nd3H9YNnXfatqmtnjZn9lYvrgBPpPlYfRLc5AyC7GH81cOQQdohdDfzC3IVJjgL+FTgVuGf/xvTlOXl25WPAsUkOHzDDdrqCnXHY7JVV9Z9V9WS68r+8zwU7Px8ANwA/Bh4y6zk/qKpmv0Hdma+5vA6Y/ZiOuBO3pTGwqDXXW4DX9oVHkokkJ+5m/AHADrrZ6N3pPrLvzmfpiuN1SfZLsm+Sx9yBnG8FXprkmHTu32fej67Upvv8z6GbUS+oqj4GXEj3ieKYJHslOSDJ85L84Ty/MgU8LsmR/aaeV8ysSHJokhP7bdU7gJvoNl8AfAc4PMk+/f3eRlfi/5jkXv3v3zfJcQzHe4DnJHlQkrsDrxrS7WqRWNSa65+BDwD/lWQb8Gm6HXy78m90OwavBS7tx+9Sv+35N+l2an0LuIZuE8Eeqar30u2g3AhsA94P3KOqLgX+nm6n5XeAhwEX78FNPwP4MPBu4Id0s/E1dLPtuRku7Md9ke4Qvgtmrb4L3Q68bwPfo9sB+/x+3ceBS4CtSW7ol70c+Brw6X4T0seAoRwnXVUfodux+YmZ++hX7RjG7Wv04okDpJWlP0Twy8Bdd7OvQA1xRi2tAEme3h+r/XPA3wEftKSXDotaWhmeC1xPd6TMrfx0M4yWADd9SFLjnFFLUuNG8uUuhxxySK1evXoUNy1Jy9KWLVtuqKqJ+daNpKhXr17N5s2bR3HTkrQsJblqV+vc9CFJjbOoJalxFrUkNc6ilqTGWdSS1DiLWpIaZ1FLUuMsaklq3JI+7fyGDbBx47hTaLlYtw7W7+q0sdIYLekZ9caNMDU17hRaDqamfNNXu5b0jBpgchI2bRp3Ci11a9eOO4G0a0t6Ri1JK4FFLUmNs6glqXELFnWSo5NMzbrcmOTFixFOkjTAzsSqugKYBEiyCrgWOH/EuSRJvT3d9PFE4OtVtcsvuJYkDdeeFvVJwDvnW5FkfZLNSTZPT0/f+WSSJGAPijrJPsBTgffOt76qNlTVmqpaMzEx72m/JEl3wJ7MqE8APldV3xlVGEnSz9qTon4mu9jsIUkanYGKOsl+wJOB80YbR5I010Df9VFV24F7jjiLJGke/mWiJDXOopakxlnUktQ4i1qSGmdRS1LjLGpJapxFLUmNs6glqXEWtSQ1zqKWpMZZ1JLUOItakhpnUUtS4yxqSWqcRS1JjbOoJalxFrUkNc6ilqTGWdSS1LhBT257cJL3Jbk8yWVJHj3qYJKkzkAntwX+GfhoVT0jyT7A3UeYSZI0y4JFneQg4HHAyQBVdTNw82hjSZJmDLLp437ANPC2JJ9P8tYk+80dlGR9ks1JNk9PTw89qCStVIMU9V7AI4E3V9UjgO3AaXMHVdWGqlpTVWsmJiaGHFOSVq5Bivoa4Jqq+kx//X10xS1JWgQLFnVVbQWuTnJ0v+iJwKUjTSVJut2gR328EHhHf8THN4DnjC6SJGm2gYq6qqaANSPOIkmah3+ZKEmNs6glqXEWtSQ1zqKWpMZZ1JLUOItakhpnUUtS4yxqSWqcRS1JjbOoJalxFrUkNc6ilqTGWdSS1DiLWpIaZ1FLUuMsaklqnEUtSY2zqCWpcRa1JDVuoHMmJrkS2AbcCtxSVZ4/UZIWyaBnIQf41aq6YWRJJEnzctOHJDVu0KIu4L+SbEmyfpSBJEk7G3TTx69U1bVJ7gVcmOTyqvrk7AF9ga8HOPLII4ccU5JWroFm1FV1bf/f64HzgWPnGbOhqtZU1ZqJiYnhppSkFWzBok6yX5IDZn4Gfg348qiDSZI6g2z6OBQ4P8nM+I1V9dGRppIk3W7Boq6qbwAPX4QskqR5eHieJDXOopakxlnUktQ4i1qSGmdRS1LjLGpJapxFLUmNs6glqXEWtSQ1zqKWpMZZ1JLUOItakhpnUUtS4yxqSWqcRS1JjRv0nIlaATZs2cDGL20cd4yxmNr6Tw
CsPfvFY04yHuseto71x3je6lZZ1Lrdxi9tZGrrFJOHTY47yqKbPG1lFjTA1NYpAIu6YRa1djJ52CSbTt407hhaRGvPXjvuCFqA26glqXEWtSQ1buCiTrIqyeeTXDDKQJKkne3JjPpFwGWjCiJJmt9ARZ3kcODXgbeONo4kaa5BZ9T/BLwMuG1XA5KsT7I5yebp6emhhJMkDVDUSX4DuL6qtuxuXFVtqKo1VbVmYmJiaAElaaUbZEb9GOCpSa4E3gU8Icm5I00lSbrdgkVdVa+oqsOrajVwEvDxqvq9kSeTJAEeRy1JzdujPyGvqk3AppEkkSTNyxm1JDXOopakxlnUktQ4i1qSGmdRS1LjLGpJapxFLUmNs6glqXEWtSQ1zqKWpMZZ1JLUOItakhpnUUtS4yxqSWqcRS1JjbOoJalxFrUkNc6ilqTGWdSS1LgFizrJvkk+m+QLSS5J8urFCCZJ6gxyctsdwBOq6qYkewMXJflIVX16xNkkSQxQ1FVVwE391b37S40ylCTppwbaRp1kVZIp4Hrgwqr6zDxj1ifZnGTz9PT0sHNK0oo1UFFX1a1VNQkcDhyb5KHzjNlQVWuqas3ExMSwc0rSirVHR31U1Q+ATwDHjyaOJGmuQY76mEhycP/z3YAnA5ePOpgkqTPIUR/3Bs5Jsoqu2N9TVReMNpYkacYgR318EXjEImSRJM3Dv0yUpMZZ1JLUOItakhpnUUtS4yxqSWqcRS1JjbOoJalxFrUkNc6ilqTGWdSS1DiLWpIaZ1FLUuMsaklqnEUtSY2zqCWpcRa1JDXOopakxlnUktQ4i1qSGmdRS1LjFizqJEck+USSS5NckuRFixFMktRZ8CzkwC3An1bV55IcAGxJcmFVXTribJIkBphRV9V1VfW5/udtwGXAfUcdTJLU2aNt1ElWA48APjPPuvVJNifZPD09PZx0kqTBizrJ/sC/Ay+uqhvnrq+qDVW1pqrWTExMDDOjJK1oAxV1kr3pSvodVXXeaCNJkmYb5KiPAGcCl1XVP4w+kiRptkFm1I8Bng08IclUf3nKiHNJknoLHp5XVRcBWYQskqR5+JeJktQ4i1qSGmdRS1LjLGpJapxFLUmNs6glqXEWtSQ1zqKWpMZZ1JLUOItakhpnUUtS4yxqSWqcRS1JjbOoJalxFrUkNc6ilqTGWdSS1DiLWpIat+CpuCTdMRu2bGDjlzaOO8aCprZOAbD27LXjDTKAdQ9bx/pj1o87xqIb5CzkZyW5PsmXFyOQtFxs/NLG20uwZZOHTTJ52OS4YyxoauvUknjjG4VBZtRnA28E/m20UaTlZ/KwSTadvGncMZaFpTDjH5UFZ9RV9Unge4uQRZI0j6HtTEyyPsnmJJunp6eHdbOStOINrairakNVramqNRMTE8O6WUla8Tw8T5IaZ1FLUuMGOTzvncCngKOTXJPkj0YfS5I0Y8HD86rqmYsRRJI0Pzd9SFLjLGpJapxFLUmNs6glqXEWtSQ1zqKWpMZZ1JLUOItakhrnGV4atthnCBnHmT5W6hk7loPl/vps6bXpjLphi32GkMU+08dKPmPHcrCcX5+tvTadUTduOZ8hZCWfsWO5WK6vz9Zem86oJalxFrUkNc6ilqTGWdSS1DiLWpIaZ1FLUuMsaklqnEUtSY2zqCWpcQMVdZLjk1yR5GtJTht1KEnSTy1Y1ElWAWcAJwAPBp6Z5MGjDiZJ6gwyoz4W+FpVfaOqbgbeBZw42liSpBmpqt0PSJ4BHF9Vp/TXnw08qqpOnTNuPTDznYBHA1cMP64kLVtHVdXEfCuG9u15VbUB2DCs25MkdQbZ9HEtcMSs64f3yyRJi2CQov5f4AFJ7pdkH+Ak4AOjjSVJmrHgpo+quiXJqcB/AquAs6rqkpEnkyQBA+xMlCSNl3+ZKEmNs6glqXEWtSQ1bskWdZK7JjkzyVVJtiWZSnLCuHMNS5J7JDk/yfb+Ma4bd6ZhSnJqks1JdiQ5e9x5RiXJA5L8JMm5484yTEnOTXJdkhuTfCXJKePONExJNvX/bjf1l7H+Ad+SLWq6I1auBh4PHAS8EnhPktVjzDRMZwA3A4cCzwLenOQh4400VN8G/ho4a9xBRuwMukNcl5u/BVZX1YHAU4G/TnLMmDMN26lVtX9/OXqcQZZsUVfV9qo6vaqurKrbquoC4JvAkn+xJNkP+C3gVVV1U1VdRHfs+rPHm2x4quq8qno/8N1xZxmVJCcBPwD+e9xZhq2qLqmqHTNX+8svjDHSsrZki3quJIcCDwSWwzHeDwRuqaqvzFr2BWA5zaiXtSQHAq8BXjLuLKOS5E1JfgRcDlwHfHjMkYbtb5PckOTiJGvHGWRZFHWSvYF3AOdU1eXjzjME+wM3zln2Q+CAMWTRHfNXwJlVdc24g4xKVb2A7jX5WOA8YMfuf2NJeTnw88B96b7D6INJxvaJYckXdZK7AG+n25576gLDl4qbgAPnLDsQ2DaGLNpDSSaBJwH/OO4so1ZVt/ab5g4Hnj/uPMNSVZ+pqm1VtaOqzgEuBp4yrjxD+/a8cUgS4Ey6HW5Pqar/G3OkYfkKsFeSB1TVV/tlD2d5bNZZCdYCq4FvdS9R9gdWJXlwVT1yjLlGaS+W9zbqAjKuO1/qM+o3Aw8CfrOqfjzuMMNSVdvpPkq+Jsl+SR5Dd7KGt4832fAk2SvJvnTfH7Mqyb5JlvTEYZYNdKU12V/eAnwIOG6coYYlyb2SnJRk/ySrkhwHPJNlstM0ycFJjpt5TSZ5FvA44KPjyrRkizrJUcBz6f5H2DrreMdnjTnasLwAuBtwPfBO4PnL7MuwXgn8GDgN+L3+51eONdGQVNWPqmrrzIVuU9ZPqmp63NmGpOg2c1wDfB94A/Diqlou36q5N92ho9PADcALgafN2bm/qPxSJklq3JKdUUvSSmFRS1LjLGpJapxFLUmNs6glqXEWtSQ1zqKWpMZZ1JLUuP8HY4IJBagTmiYAAAAASUVORK5CYII=\n", 183 | "text/plain": [ 184 | "
" 185 | ] 186 | }, 187 | "metadata": { 188 | "tags": [], 189 | "needs_background": "light" 190 | } 191 | } 192 | ] 193 | } 194 | ] 195 | } -------------------------------------------------------------------------------- /09_Clustering/K_Means_Clustering.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "K Means Clustering.ipynb", 7 | "provenance": [], 8 | "collapsed_sections": [] 9 | }, 10 | "kernelspec": { 11 | "name": "python3", 12 | "display_name": "Python 3" 13 | }, 14 | "accelerator": "TPU" 15 | }, 16 | "cells": [ 17 | { 18 | "cell_type": "markdown", 19 | "metadata": { 20 | "id": "lL35lKpnTYH1", 21 | "colab_type": "text" 22 | }, 23 | "source": [ 24 | "## **K Means Clustering**\n", 25 | "\n", 26 | "k-means clustering is a method of vector quantization, originally from signal processing, that aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean, serving as a prototype of the cluster." 27 | ] 28 | }, 29 | { 30 | "cell_type": "markdown", 31 | "metadata": { 32 | "id": "j3F3bpdAWNCO", 33 | "colab_type": "text" 34 | }, 35 | "source": [ 36 | "## **Importing Libraries**" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "metadata": { 42 | "id": "-Hr1KUL9piT0", 43 | "colab_type": "code", 44 | "colab": { 45 | "base_uri": "https://localhost:8080/", 46 | "height": 90 47 | }, 48 | "outputId": "e807b520-d422-4548-e063-ba18d159bc26" 49 | }, 50 | "source": [ 51 | "import collections\n", 52 | "import nltk\n", 53 | "nltk.download('stopwords')\n", 54 | "nltk.download('punkt')\n", 55 | "from nltk import word_tokenize\n", 56 | "from nltk.corpus import stopwords\n", 57 | "from nltk.stem import PorterStemmer\n", 58 | "from sklearn.cluster import KMeans\n", 59 | "from sklearn.feature_extraction.text import TfidfVectorizer\n", 60 | "from pprint import pprint" 61 | ], 62 | "execution_count": 17, 63 | "outputs": [ 64 | { 65 | "output_type": "stream", 66 | "text": [ 67 | "[nltk_data] Downloading package stopwords to /root/nltk_data...\n", 68 | "[nltk_data] Package stopwords is already up-to-date!\n", 69 | "[nltk_data] Downloading package punkt to /root/nltk_data...\n", 70 | "[nltk_data] Package punkt is already up-to-date!\n" 71 | ], 72 | "name": "stdout" 73 | } 74 | ] 75 | }, 76 | { 77 | "cell_type": "markdown", 78 | "metadata": { 79 | "id": "4anySMcgqROX", 80 | "colab_type": "text" 81 | }, 82 | "source": [ 83 | "## **Defining a function tokenizer(text)**\n", 84 | "\n" 85 | ] 86 | }, 87 | { 88 | "cell_type": "code", 89 | "metadata": { 90 | "id": "YFXxYQrUqoPA", 91 | "colab_type": "code", 92 | "colab": {} 93 | }, 94 | "source": [ 95 | "def tokenizer(text):\n", 96 | " tokens = word_tokenize(text)\n", 97 | " stemmer = PorterStemmer()\n", 98 | " tokens = [stemmer.stem(t) for t in tokens if t not in stopwords.words('english')]\n", 99 | " return tokens" 100 | ], 101 | "execution_count": 0, 102 | "outputs": [] 103 | }, 104 | { 105 | "cell_type": "markdown", 106 | "metadata": { 107 | "colab_type": "text", 108 | "id": "NHTdg_vevRu1" 109 | }, 110 | "source": [ 111 | "## **Defining a function cluster_sentences(sentences,k=(int))**\n", 112 | "\n", 113 | "### ***(KMeans Clustering)***\n", 114 | "\n" 115 | ] 116 | }, 117 | { 118 | "cell_type": "code", 119 | "metadata": { 120 | "id": "LuBTUARBvtpu", 121 | "colab_type": "code", 122 | "colab": {} 123 | }, 124 | "source": [ 125 | "def cluster_sentences(sentences, k):\n", 126 | " 
#Create tf ifd again: stopwords--> we filter out common words (I,my, the,and...)\n", 127 | " tfidf_vectorizer = TfidfVectorizer(tokenizer=tokenizer, stop_words=stopwords.words('english'),lowercase=True)\n", 128 | " #builds a tf-idf matrix for the sentences \n", 129 | " tfidf_matrix = tfidf_vectorizer.fit_transform(sentences)\n", 130 | " kmeans = KMeans(n_clusters=k)\n", 131 | " kmeans.fit(tfidf_matrix)\n", 132 | " clusters = collections.defaultdict(list)\n", 133 | " for i, label in enumerate(kmeans.labels_):\n", 134 | " clusters[label].append(i)\n", 135 | " return dict(clusters)" 136 | ], 137 | "execution_count": 0, 138 | "outputs": [] 139 | }, 140 | { 141 | "cell_type": "markdown", 142 | "metadata": { 143 | "id": "cqOBCOx9xbU6", 144 | "colab_type": "text" 145 | }, 146 | "source": [ 147 | "## **Main Body**" 148 | ] 149 | }, 150 | { 151 | "cell_type": "code", 152 | "metadata": { 153 | "id": "ZZ1rKmCxxcLy", 154 | "colab_type": "code", 155 | "colab": { 156 | "base_uri": "https://localhost:8080/", 157 | "height": 492 158 | }, 159 | "outputId": "7794636e-593b-4bfa-d598-e8202fa0b304" 160 | }, 161 | "source": [ 162 | "if __name__ == \"__main__\":\n", 163 | " sentences= [\"Graphics designers are most creative people\",\n", 164 | " \"Snooker is a billiards sport for normally two players.\",\n", 165 | " \"Snooker is played on a large (12 feet by 6 feet) table that is covered with a smooth green material.\",\n", 166 | " \"FOREX is the stock market for trading currencies\",\n", 167 | " \"Software Engineering is hotter and hotter topic in Silicon Valley\",\n", 168 | " \"Love is blind\",\n", 169 | " \"Snooker is popular in the United Kingdom and many other countries.\",\n", 170 | " \"The flying or operating of aircraft is known as aviation.\",\n", 171 | " \"Falling in love is like being on drugs.\",\n", 172 | " \"Warren Buffet is famous for making good investments.He knows stock markets\",\n", 173 | " \"The biggest of the many uses of aviation are in air travel and military aircraft.\",\n", 174 | " \"All giant majors in Silicon Valley is focusing AI for their business productivity\",\n", 175 | " \"Investing in stocks and trading with them are not that easy\",\n", 176 | " \"Being in love is the number one reason why people wed.\",\n", 177 | " \"Aviation refers to flying using an aircraft, like an aeroplane.\",\n", 178 | " \"Graphics Designing is high rated freelance subject\",\n", 179 | " \"Loving from a long distance actually strengthens a relationship.\"\n", 180 | " ]\n", 181 | " k = 6\n", 182 | " clusters = cluster_sentences(sentences,k)\n", 183 | " for cluster in range (k):\n", 184 | " print(\"CLUSTER \",cluster,\":\")\n", 185 | " for i, sentence in enumerate(clusters[cluster]):\n", 186 | " print(\"\\t\",(i+1),\": \",sentences[sentence])\n" 187 | ], 188 | "execution_count": 27, 189 | "outputs": [ 190 | { 191 | "output_type": "stream", 192 | "text": [ 193 | "CLUSTER 0 :\n", 194 | "\t 1 : Software Engineering is hotter and hotter topic in Silicon Valley\n", 195 | "\t 2 : All giant majors in Silicon Valley is focusing AI for their business productivity\n", 196 | "CLUSTER 1 :\n", 197 | "\t 1 : Snooker is a billiards sport for normally two players.\n", 198 | "\t 2 : Snooker is played on a large (12 feet by 6 feet) table that is covered with a smooth green material.\n", 199 | "\t 3 : Snooker is popular in the United Kingdom and many other countries.\n", 200 | "CLUSTER 2 :\n", 201 | "\t 1 : Love is blind\n", 202 | "\t 2 : Falling in love is like being on drugs.\n", 203 | "\t 3 : Being in love is the number 
one reason why people wed.\n", 204 | "\t 4 : Loving from a long distance actually strengthens a relationship.\n", 205 | "CLUSTER 3 :\n", 206 | "\t 1 : FOREX is the stock market for trading currencies\n", 207 | "\t 2 : Warren Buffet is famous for making good investments.He knows stock markets\n", 208 | "\t 3 : Investing in stocks and trading with them are not that easy\n", 209 | "CLUSTER 4 :\n", 210 | "\t 1 : Graphics designers are most creative people\n", 211 | "\t 2 : Graphics Designing is high rated freelance subject\n", 212 | "CLUSTER 5 :\n", 213 | "\t 1 : The flying or operating of aircraft is known as aviation.\n", 214 | "\t 2 : The biggest of the many uses of aviation are in air travel and military aircraft.\n", 215 | "\t 3 : Aviation refers to flying using an aircraft, like an aeroplane.\n" 216 | ], 217 | "name": "stdout" 218 | }, 219 | { 220 | "output_type": "stream", 221 | "text": [ 222 | "/usr/local/lib/python3.6/dist-packages/sklearn/feature_extraction/text.py:385: UserWarning: Your stop_words may be inconsistent with your preprocessing. Tokenizing the stop words generated tokens [\"'d\", \"'ll\", \"'re\", \"'s\", \"'ve\", 'could', 'might', 'must', \"n't\", 'need', 'sha', 'wo', 'would'] not in stop_words.\n", 223 | " 'stop_words.' % sorted(inconsistent))\n" 224 | ], 225 | "name": "stderr" 226 | } 227 | ] 228 | } 229 | ] 230 | } -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Machine_Learning_Algorithms 2 | In this repository you will get all necessary algorithms, for supervised and unsupervised learning (Clustering). It's simple and will be a good code guide for Kick-start in ML. 3 | 4 | https://www.linkedin.com/in/mhuzaifadev/ 5 | --------------------------------------------------------------------------------