├── README.md ├── Quizzes └── Integration Python SQL Tableau - Quiz Questions.pdf ├── 6. Machine Learning ├── 65 A Note on Pickling │ └── A Note on Pickling.pdf ├── 66 Saving the Model (and Scaler) - EXERCISE │ └── EXERCISE - Saving the Model (and Scaler).pdf ├── 66 EXERCISE - Saving the Model (and Scaler) │ ├── Absenteeism Exercise - Logistic Regression.ipynb │ └── Absenteeism Exercise - Logistic Regression_with_comments.ipynb ├── 60 Omitting the Dummy Variables from the Standardization │ └── Absenteeism Exercise - Logistic Regression_prior to custom_scaler.ipynb ├── 62 Simplifying the Model (Backward Elimination) │ └── Absenteeism Exercise - Logistic Regression_prior_to_backward_elimination.ipynb └── 52 Exploring the Problem from a Machine Learning Point of View │ └── Absenteeism_preprocessed.csv ├── 8. Connecting Python and SQL ├── 71 Are You Sure You're All Set │ ├── The 5 files rar.zip │ └── NOTE - Are You Sure You're All Set.pdf └── 77 Create 'df_new_obs' - EXERCISE │ └── EXERCISE - Create 'df_new_obs'.pdf ├── 5. Preprocessing ├── 48 Removing Columns - EXERCISES │ └── EXERCISES - Removing Columns.pdf ├── 33 Dummy Variables - Reasoning │ └── ARTICLE - Dummy Variables - Reasoning.pdf ├── 27 Removing Irrelevant Data - EXERCISE │ └── EXERCISE - Removing Irrelevant Data.pdf ├── 28 Removing Irrelevant Data - SOLUTION │ └── SOLUTION - Removing Irrelevant Data.pdf ├── 20 What to Expect from the Next Couple of Sections │ ├── data_preprocessing_homework.pptx │ ├── NOTICE - What to Expect from the Next Couple of Sections.pdf │ ├── df_preprocessed.csv │ └── Absenteeism_data.csv ├── 37 Concatenating Columns in Python - EXERCISE │ └── EXERCISE - Concatenating Columns in Python.pdf ├── 38 Concatenating Columns in Python - SOLUTION │ └── SOLUTION - Concatenating Columns in Python.pdf ├── 24 A Brief Overview of Regression Analysis │ └── ARTICLE - A Brief Overview of Regression Analysis.pdf ├── 43 Implementing Checkpoints in Coding - EXERCISE │ └── EXERCISE - Implementing Checkpoints in Coding.pdf ├── 44 Implementing Checkpoints in Coding - SOLUTION │ └── SOLUTION - Implementing Checkpoints in Coding.pdf ├── 31 Splitting a Column into Multiple Dummies - EXERCISE │ └── EXERCISE - Splitting a Column into Multiple Dummies.pdf ├── 32 Splitting a Column into Multiple Dummies - SOLUTION │ └── SOLUTION - Splitting a Column into Multiple Dummies.pdf ├── 40 Changing Column Order in Pandas DataFrame - EXERCISE │ └── EXERCISE - Changing Column Order in Pandas DataFrame.pdf ├── 41 Changing Column Order in Pandas DataFrame - SOLUTION │ └── SOLUTION - Changing Column Order in Pandas DataFrame.pdf ├── 51 A Final Note on Preprocessing │ └── Absenteeism Exercise - EXERCISES and SOLUTIONS.ipynb ├── 42 Implementing Checkpoints in Coding │ └── Absenteeism Exercise - Preprocessing - df_reason_mod.ipynb └── 48 Removing Columns - EXERCISE │ ├── Absenteeism Exercise - Preprocessing - df_reason_date_mod.ipynb │ └── Absenteeism Exercise - Removing the Date Column - SOLUTION.ipynb ├── 3. Setting up the working environment ├── 12 Shortcuts for Jupyter │ └── Shortcuts-for-Jupyter.pdf ├── 14 Installing sklearn - EXERCISE │ └── EXERCISE - Installing sklearn.pdf └── 15 Installing sklearn - SOLUTION │ └── SOLUTION - Installing sklearn.pdf ├── 9. 
Analyzing the Obtained data in Tableau ├── 83 Age vs Probability - EXERCISE │ └── EXERCISE - Age vs Probability.pdf ├── 85 Reasons vs Probability - EXERCISE │ └── EXERCISE - Reasons vs Probability.pdf └── 87 Tranportation Expense vs Probability - EXERCISE │ └── EXERCISE - Transportation Expense vs Probability.pdf ├── 4. What's next in the course └── 19 Important Notice Regarding Datasets │ └── NOTICE - Important Notice Regarding Datasets.pdf └── LICENSE /README.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | # Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau 5 | Python + SQL + Tableau: Integrating Python, SQL, and Tableau, published by Packt 6 | -------------------------------------------------------------------------------- /Quizzes/Integration Python SQL Tableau - Quiz Questions.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/Quizzes/Integration Python SQL Tableau - Quiz Questions.pdf -------------------------------------------------------------------------------- /6. Machine Learning/65 A Note on Pickling/A Note on Pickling.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/6. Machine Learning/65 A Note on Pickling/A Note on Pickling.pdf -------------------------------------------------------------------------------- /8. Connecting Python and SQL/71 Are You Sure You're All Set/The 5 files rar.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/8. Connecting Python and SQL/71 Are You Sure You're All Set/The 5 files rar.zip -------------------------------------------------------------------------------- /5. Preprocessing/48 Removing Columns - EXERCISES/EXERCISES - Removing Columns.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/48 Removing Columns - EXERCISES/EXERCISES - Removing Columns.pdf -------------------------------------------------------------------------------- /3. Setting up the working environment/12 Shortcuts for Jupyter/Shortcuts-for-Jupyter.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/3. Setting up the working environment/12 Shortcuts for Jupyter/Shortcuts-for-Jupyter.pdf -------------------------------------------------------------------------------- /5. Preprocessing/33 Dummy Variables - Reasoning/ARTICLE - Dummy Variables - Reasoning.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/33 Dummy Variables - Reasoning/ARTICLE - Dummy Variables - Reasoning.pdf -------------------------------------------------------------------------------- /5. 
Preprocessing/27 Removing Irrelevant Data - EXERCISE/EXERCISE - Removing Irrelevant Data.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/27 Removing Irrelevant Data - EXERCISE/EXERCISE - Removing Irrelevant Data.pdf -------------------------------------------------------------------------------- /5. Preprocessing/28 Removing Irrelevant Data - SOLUTION/SOLUTION - Removing Irrelevant Data.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/28 Removing Irrelevant Data - SOLUTION/SOLUTION - Removing Irrelevant Data.pdf -------------------------------------------------------------------------------- /8. Connecting Python and SQL/71 Are You Sure You're All Set/NOTE - Are You Sure You're All Set.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/8. Connecting Python and SQL/71 Are You Sure You're All Set/NOTE - Are You Sure You're All Set.pdf -------------------------------------------------------------------------------- /8. Connecting Python and SQL/77 Create 'df_new_obs' - EXERCISE/EXERCISE - Create 'df_new_obs'.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/8. Connecting Python and SQL/77 Create 'df_new_obs' - EXERCISE/EXERCISE - Create 'df_new_obs'.pdf -------------------------------------------------------------------------------- /5. Preprocessing/20 What to Expect from the Next Couple of Sections/data_preprocessing_homework.pptx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/20 What to Expect from the Next Couple of Sections/data_preprocessing_homework.pptx -------------------------------------------------------------------------------- /3. Setting up the working environment/14 Installing sklearn - EXERCISE/EXERCISE - Installing sklearn.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/3. Setting up the working environment/14 Installing sklearn - EXERCISE/EXERCISE - Installing sklearn.pdf -------------------------------------------------------------------------------- /3. Setting up the working environment/15 Installing sklearn - SOLUTION/SOLUTION - Installing sklearn.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/3. Setting up the working environment/15 Installing sklearn - SOLUTION/SOLUTION - Installing sklearn.pdf -------------------------------------------------------------------------------- /5. 
Preprocessing/37 Concatenating Columns in Python - EXERCISE/EXERCISE - Concatenating Columns in Python.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/37 Concatenating Columns in Python - EXERCISE/EXERCISE - Concatenating Columns in Python.pdf -------------------------------------------------------------------------------- /5. Preprocessing/38 Concatenating Columns in Python - SOLUTION/SOLUTION - Concatenating Columns in Python.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/38 Concatenating Columns in Python - SOLUTION/SOLUTION - Concatenating Columns in Python.pdf -------------------------------------------------------------------------------- /6. Machine Learning/66 Saving the Model (and Scaler) - EXERCISE/EXERCISE - Saving the Model (and Scaler).pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/6. Machine Learning/66 Saving the Model (and Scaler) - EXERCISE/EXERCISE - Saving the Model (and Scaler).pdf -------------------------------------------------------------------------------- /9. Analyzing the Obtained data in Tableau/83 Age vs Probability - EXERCISE/EXERCISE - Age vs Probability.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/9. Analyzing the Obtained data in Tableau/83 Age vs Probability - EXERCISE/EXERCISE - Age vs Probability.pdf -------------------------------------------------------------------------------- /5. Preprocessing/24 A Brief Overview of Regression Analysis/ARTICLE - A Brief Overview of Regression Analysis.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/24 A Brief Overview of Regression Analysis/ARTICLE - A Brief Overview of Regression Analysis.pdf -------------------------------------------------------------------------------- /4. What's next in the course/19 Important Notice Regarding Datasets/NOTICE - Important Notice Regarding Datasets.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/4. What's next in the course/19 Important Notice Regarding Datasets/NOTICE - Important Notice Regarding Datasets.pdf -------------------------------------------------------------------------------- /5. Preprocessing/43 Implementing Checkpoints in Coding - EXERCISE/EXERCISE - Implementing Checkpoints in Coding.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/43 Implementing Checkpoints in Coding - EXERCISE/EXERCISE - Implementing Checkpoints in Coding.pdf -------------------------------------------------------------------------------- /5. 
Preprocessing/44 Implementing Checkpoints in Coding - SOLUTION/SOLUTION - Implementing Checkpoints in Coding.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/44 Implementing Checkpoints in Coding - SOLUTION/SOLUTION - Implementing Checkpoints in Coding.pdf -------------------------------------------------------------------------------- /9. Analyzing the Obtained data in Tableau/85 Reasons vs Probability - EXERCISE/EXERCISE - Reasons vs Probability.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/9. Analyzing the Obtained data in Tableau/85 Reasons vs Probability - EXERCISE/EXERCISE - Reasons vs Probability.pdf -------------------------------------------------------------------------------- /5. Preprocessing/20 What to Expect from the Next Couple of Sections/NOTICE - What to Expect from the Next Couple of Sections.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/20 What to Expect from the Next Couple of Sections/NOTICE - What to Expect from the Next Couple of Sections.pdf -------------------------------------------------------------------------------- /5. Preprocessing/31 Splitting a Column into Multiple Dummies - EXERCISE/EXERCISE - Splitting a Column into Multiple Dummies.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/31 Splitting a Column into Multiple Dummies - EXERCISE/EXERCISE - Splitting a Column into Multiple Dummies.pdf -------------------------------------------------------------------------------- /5. Preprocessing/32 Splitting a Column into Multiple Dummies - SOLUTION/SOLUTION - Splitting a Column into Multiple Dummies.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/32 Splitting a Column into Multiple Dummies - SOLUTION/SOLUTION - Splitting a Column into Multiple Dummies.pdf -------------------------------------------------------------------------------- /5. Preprocessing/40 Changing Column Order in Pandas DataFrame - EXERCISE/EXERCISE - Changing Column Order in Pandas DataFrame.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. Preprocessing/40 Changing Column Order in Pandas DataFrame - EXERCISE/EXERCISE - Changing Column Order in Pandas DataFrame.pdf -------------------------------------------------------------------------------- /5. Preprocessing/41 Changing Column Order in Pandas DataFrame - SOLUTION/SOLUTION - Changing Column Order in Pandas DataFrame.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/5. 
Preprocessing/41 Changing Column Order in Pandas DataFrame - SOLUTION/SOLUTION - Changing Column Order in Pandas DataFrame.pdf -------------------------------------------------------------------------------- /9. Analyzing the Obtained data in Tableau/87 Tranportation Expense vs Probability - EXERCISE/EXERCISE - Transportation Expense vs Probability.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Python-SQL-Tableau-Integrating-Python-SQL-and-Tableau/HEAD/9. Analyzing the Obtained data in Tableau/87 Tranportation Expense vs Probability - EXERCISE/EXERCISE - Transportation Expense vs Probability.pdf -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2019 Packt 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /5. Preprocessing/51 A Final Note on Preprocessing/Absenteeism Exercise - EXERCISES and SOLUTIONS.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "import pandas as pd\n", 10 | "raw_csv_data = pd.read_csv(\"D:/Test/Absenteeism_data.csv\")\n", 11 | "df = raw_csv_data.copy()\n", 12 | "df = df.drop(['ID'], axis = 1)\n", 13 | "df" 14 | ] 15 | }, 16 | { 17 | "cell_type": "markdown", 18 | "metadata": {}, 19 | "source": [ 20 | "## Exercise: Dropping a Column from a DataFrame in Python" 21 | ] 22 | }, 23 | { 24 | "cell_type": "markdown", 25 | "metadata": {}, 26 | "source": [ 27 | "Create a new notebook file where you will need to solve the exercises we provide throughout this business case study.\n", 28 | "Then, drop the ‘Age’ column from df and assign the newly obtained data set to a new variable, called df_no_age. 
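The notebook's solution (just below) drops the column with df.drop(['Age'], axis = 1). As a minimal standalone sketch of the same operation, using a small made-up frame rather than the course data:

import pandas as pd

# Toy stand-in for the absenteeism data (illustrative values only).
df = pd.DataFrame({'Age': [30, 40], 'Pets': [1, 0]})

# drop() returns a new DataFrame; the original df is left unchanged.
df_no_age = df.drop(['Age'], axis=1)       # axis=1 -> drop along columns
df_no_age = df.drop(columns=['Age'])       # equivalent keyword form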
" 29 | ] 30 | }, 31 | { 32 | "cell_type": "code", 33 | "execution_count": null, 34 | "metadata": {}, 35 | "outputs": [], 36 | "source": [ 37 | "df_no_age = df.drop(['Age'], axis = 1)\n", 38 | "df_no_age" 39 | ] 40 | }, 41 | { 42 | "cell_type": "markdown", 43 | "metadata": {}, 44 | "source": [ 45 | "## Exercise: Obtaining Dummies from a Single Feature" 46 | ] 47 | }, 48 | { 49 | "cell_type": "markdown", 50 | "metadata": {}, 51 | "source": [ 52 | "Split the ‘Age’ column into multiple dummies. Store the output in a new variable, called age_dummies." 53 | ] 54 | }, 55 | { 56 | "cell_type": "code", 57 | "execution_count": null, 58 | "metadata": {}, 59 | "outputs": [], 60 | "source": [ 61 | "age_dummies = pd.get_dummies(df['Age'])\n", 62 | "age_dummies" 63 | ] 64 | }, 65 | { 66 | "cell_type": "markdown", 67 | "metadata": {}, 68 | "source": [ 69 | "## Exercise: Concatenation in Python" 70 | ] 71 | }, 72 | { 73 | "cell_type": "markdown", 74 | "metadata": {}, 75 | "source": [ 76 | "Concatenate the df_no_age and age_dummies objects you previously obtained. Store the result in a new object called df_concatenated." 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": null, 82 | "metadata": {}, 83 | "outputs": [], 84 | "source": [ 85 | "df_concatenated = pd.concat([df_no_age, age_dummies], axis=1)\n", 86 | "df_concatenated" 87 | ] 88 | }, 89 | { 90 | "cell_type": "markdown", 91 | "metadata": {}, 92 | "source": [ 93 | "## Exercise: Reordering Columns" 94 | ] 95 | }, 96 | { 97 | "cell_type": "markdown", 98 | "metadata": {}, 99 | "source": [ 100 | "Reorder the columns from df_concatented in such a way that the ‘Absenteeism Time in Hours’ column appears at the far right of the data set." 101 | ] 102 | }, 103 | { 104 | "cell_type": "code", 105 | "execution_count": null, 106 | "metadata": {}, 107 | "outputs": [], 108 | "source": [ 109 | "df_concatenated.columns.values" 110 | ] 111 | }, 112 | { 113 | "cell_type": "code", 114 | "execution_count": null, 115 | "metadata": {}, 116 | "outputs": [], 117 | "source": [ 118 | "column_names_updated = ['Reason for Absence', 'Date', 'Transportation Expense',\n", 119 | " 'Distance to Work', 'Daily Work Load Average', 'Body Mass Index',\n", 120 | " 'Education', 'Children', 'Pets', 27,\n", 121 | " 28, 29, 30, 31, 32, 33, 34, 36, 37, 38, 39, 40, 41, 43, 46, 47, 48,\n", 122 | " 49, 50, 58, 'Absenteeism Time in Hours']" 123 | ] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "execution_count": null, 128 | "metadata": {}, 129 | "outputs": [], 130 | "source": [ 131 | "df_concatenated.columns = column_names_updated" 132 | ] 133 | }, 134 | { 135 | "cell_type": "code", 136 | "execution_count": null, 137 | "metadata": {}, 138 | "outputs": [], 139 | "source": [ 140 | "df_concatenated" 141 | ] 142 | }, 143 | { 144 | "cell_type": "markdown", 145 | "metadata": {}, 146 | "source": [ 147 | "## Exercise Creating Checkpoints" 148 | ] 149 | }, 150 | { 151 | "cell_type": "markdown", 152 | "metadata": {}, 153 | "source": [ 154 | "Create a checkpoint of your work on the exercises, storing the current output in a variable called df_checkpoint." 
155 | ] 156 | }, 157 | { 158 | "cell_type": "code", 159 | "execution_count": null, 160 | "metadata": {}, 161 | "outputs": [], 162 | "source": [ 163 | "df_checkpoint = df_concatenated.copy()\n", 164 | "df_checkpoint" 165 | ] 166 | } 167 | ], 168 | "metadata": { 169 | "kernelspec": { 170 | "display_name": "Python 3", 171 | "language": "python", 172 | "name": "python3" 173 | }, 174 | "language_info": { 175 | "codemirror_mode": { 176 | "name": "ipython", 177 | "version": 3 178 | }, 179 | "file_extension": ".py", 180 | "mimetype": "text/x-python", 181 | "name": "python", 182 | "nbconvert_exporter": "python", 183 | "pygments_lexer": "ipython3", 184 | "version": "3.6.4" 185 | } 186 | }, 187 | "nbformat": 4, 188 | "nbformat_minor": 2 189 | } 190 | -------------------------------------------------------------------------------- /5. Preprocessing/42 Implementing Checkpoints in Coding/Absenteeism Exercise - Preprocessing - df_reason_mod.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "# Below you can find the code that builds up to the ‘df_reason_mod’ checkpoint.\n", 10 | "# Additionally, in the comments you can see the code that we ran in the lectures to check \n", 11 | "# the current state of a specific object while explaining various programming or data analytics concepts. \n", 12 | "\n", 13 | "\n", 14 | "import pandas as pd\n", 15 | "raw_csv_data = pd.read_csv(\"D:/Test/Absenteeism_data.csv\")\n", 16 | "\n", 17 | "# type(raw_csv_data)\n", 18 | "# raw_csv_data\n", 19 | "\n", 20 | "df = raw_csv_data.copy()\n", 21 | "# df\n", 22 | "\n", 23 | "# pd.options.display.max_columns = None\n", 24 | "# pd.options.display.max_rows = None\n", 25 | "# display(df)\n", 26 | "# df.info()\n", 27 | "\n", 28 | "\n", 29 | "\n", 30 | "########## Drop 'ID': ############################\n", 31 | "##################################################\n", 32 | "\n", 33 | "\n", 34 | "# df.drop(['ID'])\n", 35 | "# df.drop(['ID'], axis = 1)\n", 36 | "df = df.drop(['ID'], axis = 1)\n", 37 | "\n", 38 | "# df\n", 39 | "# raw_csv_data\n", 40 | "\n", 41 | "\n", 42 | "\n", 43 | "########## 'Reason for Absence' ##################\n", 44 | "##################################################\n", 45 | "\n", 46 | "\n", 47 | "# df['Reason for Absence'].min()\n", 48 | "# df['Reason for Absence'].max()\n", 49 | "# pd.unique(df['Reason for Absence'])\n", 50 | "# df['Reason for Absence'].unique()\n", 51 | "# len(df['Reason for Absence'].unique())\n", 52 | "# sorted(df['Reason for Absence'].unique())\n", 53 | "\n", 54 | "\n", 55 | "\n", 56 | "########## '.get_dummies()' ######################\n", 57 | "##################################################\n", 58 | "\n", 59 | "\n", 60 | "reason_columns = pd.get_dummies(df['Reason for Absence'])\n", 61 | "# reason_columns\n", 62 | "\n", 63 | "reason_columns['check'] = reason_columns.sum(axis=1)\n", 64 | "# reason_columns\n", 65 | "\n", 66 | "# reason_columns['check'].sum(axis=0)\n", 67 | "# reason_columns['check'].unique()\n", 68 | "\n", 69 | "reason_columns = reason_columns.drop(['check'], axis = 1)\n", 70 | "# reason_columns\n", 71 | "\n", 72 | "reason_columns = pd.get_dummies(df['Reason for Absence'], drop_first = True)\n", 73 | "# reason_columns\n", 74 | "\n", 75 | "\n", 76 | "\n", 77 | "########## Group the Reasons for Absence##########\n", 78 | "##################################################\n", 79 | "\n", 80 | 
"\n", 81 | "# df.columns.values\n", 82 | "# reason_columns.columns.values\n", 83 | "df = df.drop(['Reason for Absence'], axis = 1)\n", 84 | "# df\n", 85 | "\n", 86 | "# reason_columns.loc[:, 1:14].max(axis=1)\n", 87 | "reason_type_1 = reason_columns.loc[:, 1:14].max(axis=1)\n", 88 | "reason_type_2 = reason_columns.loc[:, 15:17].max(axis=1)\n", 89 | "reason_type_3 = reason_columns.loc[:, 18:21].max(axis=1)\n", 90 | "reason_type_4 = reason_columns.loc[:, 22:].max(axis=1)\n", 91 | "\n", 92 | "# reason_type_1\n", 93 | "# reason_type_2\n", 94 | "# reason_type_3\n", 95 | "# reason_type_4\n", 96 | "\n", 97 | "\n", 98 | "\n", 99 | "########## Concatenate Column Values #############\n", 100 | "##################################################\n", 101 | "\n", 102 | "\n", 103 | "# df\n", 104 | "\n", 105 | "df = pd.concat([df, reason_type_1, reason_type_2, reason_type_3, reason_type_4], axis = 1)\n", 106 | "# df\n", 107 | "\n", 108 | "# df.columns.values\n", 109 | "column_names = ['Date', 'Transportation Expense', 'Distance to Work', 'Age',\n", 110 | " 'Daily Work Load Average', 'Body Mass Index', 'Education',\n", 111 | " 'Children', 'Pets', 'Absenteeism Time in Hours', 'Reason_1', 'Reason_2', 'Reason_3', 'Reason_4']\n", 112 | "\n", 113 | "df.columns = column_names\n", 114 | "# df.head()\n", 115 | "\n", 116 | "\n", 117 | "\n", 118 | "########## Reorder Columns #######################\n", 119 | "##################################################\n", 120 | "\n", 121 | "\n", 122 | "column_names_reordered = ['Reason_1', 'Reason_2', 'Reason_3', 'Reason_4', \n", 123 | " 'Date', 'Transportation Expense', 'Distance to Work', 'Age',\n", 124 | " 'Daily Work Load Average', 'Body Mass Index', 'Education',\n", 125 | " 'Children', 'Pets', 'Absenteeism Time in Hours']\n", 126 | "\n", 127 | "df = df[column_names_reordered]\n", 128 | "# df.head()\n", 129 | "\n", 130 | "\n", 131 | "\n", 132 | "########## Create a Checkpoint #######################\n", 133 | "##################################################\n", 134 | "\n", 135 | "\n", 136 | "df_reason_mod = df.copy()\n", 137 | "df_reason_mod" 138 | ] 139 | } 140 | ], 141 | "metadata": { 142 | "kernelspec": { 143 | "display_name": "Python 3", 144 | "language": "python", 145 | "name": "python3" 146 | }, 147 | "language_info": { 148 | "codemirror_mode": { 149 | "name": "ipython", 150 | "version": 3 151 | }, 152 | "file_extension": ".py", 153 | "mimetype": "text/x-python", 154 | "name": "python", 155 | "nbconvert_exporter": "python", 156 | "pygments_lexer": "ipython3", 157 | "version": "3.6.4" 158 | } 159 | }, 160 | "nbformat": 4, 161 | "nbformat_minor": 2 162 | } 163 | -------------------------------------------------------------------------------- /5. Preprocessing/48 Removing Columns - EXERCISE/Absenteeism Exercise - Preprocessing - df_reason_date_mod.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "# Below you can find the code that builds up to the ‘df_reason_mod’ checkpoint.\n", 10 | "# Additionally, in the comments you can see the code that we ran in the lectures to check \n", 11 | "# the current state of a specific object while explaining various programming or data analytics concepts. 
\n", 12 | "\n", 13 | "\n", 14 | "import pandas as pd\n", 15 | "raw_csv_data = pd.read_csv(\"D:/Test/Absenteeism_data.csv\")\n", 16 | "\n", 17 | "# type(raw_csv_data)\n", 18 | "# raw_csv_data\n", 19 | "\n", 20 | "df = raw_csv_data.copy()\n", 21 | "# df\n", 22 | "\n", 23 | "# pd.options.display.max_columns = None\n", 24 | "# pd.options.display.max_rows = None\n", 25 | "# display(df)\n", 26 | "# df.info()\n", 27 | "\n", 28 | "\n", 29 | "\n", 30 | "########## Drop 'ID': ############################\n", 31 | "##################################################\n", 32 | "\n", 33 | "\n", 34 | "# df.drop(['ID'])\n", 35 | "# df.drop(['ID'], axis = 1)\n", 36 | "df = df.drop(['ID'], axis = 1)\n", 37 | "\n", 38 | "# df\n", 39 | "# raw_csv_data\n", 40 | "\n", 41 | "\n", 42 | "\n", 43 | "########## 'Reason for Absence' ##################\n", 44 | "##################################################\n", 45 | "\n", 46 | "\n", 47 | "# df['Reason for Absence'].min()\n", 48 | "# df['Reason for Absence'].max()\n", 49 | "# pd.unique(df['Reason for Absence'])\n", 50 | "# df['Reason for Absence'].unique()\n", 51 | "# len(df['Reason for Absence'].unique())\n", 52 | "# sorted(df['Reason for Absence'].unique())\n", 53 | "\n", 54 | "\n", 55 | "\n", 56 | "########## '.get_dummies()' ######################\n", 57 | "##################################################\n", 58 | "\n", 59 | "\n", 60 | "reason_columns = pd.get_dummies(df['Reason for Absence'])\n", 61 | "# reason_columns\n", 62 | "\n", 63 | "reason_columns['check'] = reason_columns.sum(axis=1)\n", 64 | "# reason_columns\n", 65 | "\n", 66 | "# reason_columns['check'].sum(axis=0)\n", 67 | "# reason_columns['check'].unique()\n", 68 | "\n", 69 | "reason_columns = reason_columns.drop(['check'], axis = 1)\n", 70 | "# reason_columns\n", 71 | "\n", 72 | "reason_columns = pd.get_dummies(df['Reason for Absence'], drop_first = True)\n", 73 | "# reason_columns\n", 74 | "\n", 75 | "\n", 76 | "\n", 77 | "########## Group the Reasons for Absence##########\n", 78 | "##################################################\n", 79 | "\n", 80 | "\n", 81 | "# df.columns.values\n", 82 | "# reason_columns.columns.values\n", 83 | "df = df.drop(['Reason for Absence'], axis = 1)\n", 84 | "# df\n", 85 | "\n", 86 | "# reason_columns.loc[:, 1:14].max(axis=1)\n", 87 | "reason_type_1 = reason_columns.loc[:, 1:14].max(axis=1)\n", 88 | "reason_type_2 = reason_columns.loc[:, 15:17].max(axis=1)\n", 89 | "reason_type_3 = reason_columns.loc[:, 18:21].max(axis=1)\n", 90 | "reason_type_4 = reason_columns.loc[:, 22:].max(axis=1)\n", 91 | "\n", 92 | "# reason_type_1\n", 93 | "# reason_type_2\n", 94 | "# reason_type_3\n", 95 | "# reason_type_4\n", 96 | "\n", 97 | "\n", 98 | "\n", 99 | "########## Concatenate Column Values #############\n", 100 | "##################################################\n", 101 | "\n", 102 | "\n", 103 | "# df\n", 104 | "\n", 105 | "df = pd.concat([df, reason_type_1, reason_type_2, reason_type_3, reason_type_4], axis = 1)\n", 106 | "# df\n", 107 | "\n", 108 | "# df.columns.values\n", 109 | "column_names = ['Date', 'Transportation Expense', 'Distance to Work', 'Age',\n", 110 | " 'Daily Work Load Average', 'Body Mass Index', 'Education',\n", 111 | " 'Children', 'Pets', 'Absenteeism Time in Hours', 'Reason_1', 'Reason_2', 'Reason_3', 'Reason_4']\n", 112 | "\n", 113 | "df.columns = column_names\n", 114 | "# df.head()\n", 115 | "\n", 116 | "\n", 117 | "\n", 118 | "########## Reorder Columns #######################\n", 119 | "##################################################\n", 120 | 
"\n", 121 | "\n", 122 | "column_names_reordered = ['Reason_1', 'Reason_2', 'Reason_3', 'Reason_4', \n", 123 | " 'Date', 'Transportation Expense', 'Distance to Work', 'Age',\n", 124 | " 'Daily Work Load Average', 'Body Mass Index', 'Education',\n", 125 | " 'Children', 'Pets', 'Absenteeism Time in Hours']\n", 126 | "\n", 127 | "df = df[column_names_reordered]\n", 128 | "# df.head()\n", 129 | "\n", 130 | "\n", 131 | "\n", 132 | "########## Create a Checkpoint ###################\n", 133 | "##################################################\n", 134 | "\n", 135 | "\n", 136 | "df_reason_mod = df.copy()\n", 137 | "# df_reason_mod\n", 138 | "\n", 139 | "\n", 140 | "\n", 141 | "########## 'Date' ################################\n", 142 | "##################################################\n", 143 | "\n", 144 | "\n", 145 | "# df_reason_mod['Date']\n", 146 | "# df_reason_mod['Date'][0]\n", 147 | "# type(df_reason_mod['Date'][0])\n", 148 | "\n", 149 | "# df_reason_mod['Date'] = pd.to_datetime(df_reason_mod['Date'])\n", 150 | "# df_reason_mod['Date']\n", 151 | "\n", 152 | "df_reason_mod['Date'] = pd.to_datetime(df_reason_mod['Date'], format = '%d/%m/%Y')\n", 153 | "# df_reason_mod['Date']\n", 154 | "# type(df_reason_mod['Date'])\n", 155 | "# df_reason_mod.info()\n", 156 | "\n", 157 | "\n", 158 | "\n", 159 | "########## Extract the Month Value ###############\n", 160 | "##################################################\n", 161 | "\n", 162 | "\n", 163 | "# df_reason_mod['Date'][0]\n", 164 | "# df_reason_mod['Date'][0].month\n", 165 | "\n", 166 | "list_months = []\n", 167 | "# list_months\n", 168 | "\n", 169 | "# df_reason_mod.shape\n", 170 | "\n", 171 | "for i in range(df_reason_mod.shape[0]):\n", 172 | " list_months.append(df_reason_mod['Date'][i].month)\n", 173 | " \n", 174 | "# list_months\n", 175 | "# len(list_months)\n", 176 | "df_reason_mod['Month Value'] = list_months\n", 177 | "# df_reason_mod.head(20)\n", 178 | "\n", 179 | "\n", 180 | "\n", 181 | "########## Extract the Month Value ###############\n", 182 | "##################################################\n", 183 | "\n", 184 | "\n", 185 | "# df_reason_mod['Date'][699].weekday()\n", 186 | "# df_reason_mod['Date'][699]\n", 187 | "\n", 188 | "def date_to_weekday(date_value):\n", 189 | " return date_value.weekday()\n", 190 | "\n", 191 | "df_reason_mod['Day of the Week'] = df_reason_mod['Date'].apply(date_to_weekday)\n", 192 | "\n", 193 | "# df_reason_mod.head()\n", 194 | "\n", 195 | "\n", 196 | "\n", 197 | "########## Exercise ##############################\n", 198 | "##################################################\n", 199 | "\n", 200 | "\n", 201 | "df_reason_mod = df_reason_mod.drop(['Date'], axis = 1)\n", 202 | "# df_reason_mod.head()\n", 203 | "# df_reason_mod.columns.values\n", 204 | "column_names_upd = ['Reason_1', 'Reason_2', 'Reason_3', 'Reason_4', 'Month Value', 'Day of the Week',\n", 205 | " 'Transportation Expense', 'Distance to Work', 'Age',\n", 206 | " 'Daily Work Load Average', 'Body Mass Index', 'Education', 'Children',\n", 207 | " 'Pets', 'Absenteeism Time in Hours']\n", 208 | "df_reason_mod = df_reason_mod[column_names_upd]\n", 209 | "# df_reason_mod.head()\n", 210 | "\n", 211 | "\n", 212 | "\n", 213 | "########## Create a Checkpoint ###################\n", 214 | "##################################################\n", 215 | "\n", 216 | "\n", 217 | "df_reason_date_mod = df_reason_mod.copy()\n", 218 | "df_reason_date_mod" 219 | ] 220 | } 221 | ], 222 | "metadata": { 223 | "kernelspec": { 224 | "display_name": "Python 3", 225 | 
"language": "python", 226 | "name": "python3" 227 | }, 228 | "language_info": { 229 | "codemirror_mode": { 230 | "name": "ipython", 231 | "version": 3 232 | }, 233 | "file_extension": ".py", 234 | "mimetype": "text/x-python", 235 | "name": "python", 236 | "nbconvert_exporter": "python", 237 | "pygments_lexer": "ipython3", 238 | "version": "3.6.4" 239 | } 240 | }, 241 | "nbformat": 4, 242 | "nbformat_minor": 2 243 | } 244 | -------------------------------------------------------------------------------- /5. Preprocessing/48 Removing Columns - EXERCISE/Absenteeism Exercise - Removing the Date Column - SOLUTION.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "# Below you can find the code that builds up to the ‘df_reason_mod’ checkpoint.\n", 10 | "# Additionally, in the comments you can see the code that we ran in the lectures to check \n", 11 | "# the current state of a specific object while explaining various programming or data analytics concepts. \n", 12 | "\n", 13 | "\n", 14 | "import pandas as pd\n", 15 | "raw_csv_data = pd.read_csv(\"D:/Test/Absenteeism_data.csv\")\n", 16 | "\n", 17 | "# type(raw_csv_data)\n", 18 | "# raw_csv_data\n", 19 | "\n", 20 | "df = raw_csv_data.copy()\n", 21 | "# df\n", 22 | "\n", 23 | "# pd.options.display.max_columns = None\n", 24 | "# pd.options.display.max_rows = None\n", 25 | "# display(df)\n", 26 | "# df.info()\n", 27 | "\n", 28 | "\n", 29 | "\n", 30 | "########## Drop 'ID': ############################\n", 31 | "##################################################\n", 32 | "\n", 33 | "\n", 34 | "# df.drop(['ID'])\n", 35 | "# df.drop(['ID'], axis = 1)\n", 36 | "df = df.drop(['ID'], axis = 1)\n", 37 | "\n", 38 | "# df\n", 39 | "# raw_csv_data\n", 40 | "\n", 41 | "\n", 42 | "\n", 43 | "########## 'Reason for Absence' ##################\n", 44 | "##################################################\n", 45 | "\n", 46 | "\n", 47 | "# df['Reason for Absence'].min()\n", 48 | "# df['Reason for Absence'].max()\n", 49 | "# pd.unique(df['Reason for Absence'])\n", 50 | "# df['Reason for Absence'].unique()\n", 51 | "# len(df['Reason for Absence'].unique())\n", 52 | "# sorted(df['Reason for Absence'].unique())\n", 53 | "\n", 54 | "\n", 55 | "\n", 56 | "########## '.get_dummies()' ######################\n", 57 | "##################################################\n", 58 | "\n", 59 | "\n", 60 | "reason_columns = pd.get_dummies(df['Reason for Absence'])\n", 61 | "# reason_columns\n", 62 | "\n", 63 | "reason_columns['check'] = reason_columns.sum(axis=1)\n", 64 | "# reason_columns\n", 65 | "\n", 66 | "# reason_columns['check'].sum(axis=0)\n", 67 | "# reason_columns['check'].unique()\n", 68 | "\n", 69 | "reason_columns = reason_columns.drop(['check'], axis = 1)\n", 70 | "# reason_columns\n", 71 | "\n", 72 | "reason_columns = pd.get_dummies(df['Reason for Absence'], drop_first = True)\n", 73 | "# reason_columns\n", 74 | "\n", 75 | "\n", 76 | "\n", 77 | "########## Group the Reasons for Absence##########\n", 78 | "##################################################\n", 79 | "\n", 80 | "\n", 81 | "# df.columns.values\n", 82 | "# reason_columns.columns.values\n", 83 | "df = df.drop(['Reason for Absence'], axis = 1)\n", 84 | "# df\n", 85 | "\n", 86 | "# reason_columns.loc[:, 1:14].max(axis=1)\n", 87 | "reason_type_1 = reason_columns.loc[:, 1:14].max(axis=1)\n", 88 | "reason_type_2 = reason_columns.loc[:, 
15:17].max(axis=1)\n", 89 | "reason_type_3 = reason_columns.loc[:, 18:21].max(axis=1)\n", 90 | "reason_type_4 = reason_columns.loc[:, 22:].max(axis=1)\n", 91 | "\n", 92 | "# reason_type_1\n", 93 | "# reason_type_2\n", 94 | "# reason_type_3\n", 95 | "# reason_type_4\n", 96 | "\n", 97 | "\n", 98 | "\n", 99 | "########## Concatenate Column Values #############\n", 100 | "##################################################\n", 101 | "\n", 102 | "\n", 103 | "# df\n", 104 | "\n", 105 | "df = pd.concat([df, reason_type_1, reason_type_2, reason_type_3, reason_type_4], axis = 1)\n", 106 | "# df\n", 107 | "\n", 108 | "# df.columns.values\n", 109 | "column_names = ['Date', 'Transportation Expense', 'Distance to Work', 'Age',\n", 110 | " 'Daily Work Load Average', 'Body Mass Index', 'Education',\n", 111 | " 'Children', 'Pets', 'Absenteeism Time in Hours', 'Reason_1', 'Reason_2', 'Reason_3', 'Reason_4']\n", 112 | "\n", 113 | "df.columns = column_names\n", 114 | "# df.head()\n", 115 | "\n", 116 | "\n", 117 | "\n", 118 | "########## Reorder Columns #######################\n", 119 | "##################################################\n", 120 | "\n", 121 | "\n", 122 | "column_names_reordered = ['Reason_1', 'Reason_2', 'Reason_3', 'Reason_4', \n", 123 | " 'Date', 'Transportation Expense', 'Distance to Work', 'Age',\n", 124 | " 'Daily Work Load Average', 'Body Mass Index', 'Education',\n", 125 | " 'Children', 'Pets', 'Absenteeism Time in Hours']\n", 126 | "\n", 127 | "df = df[column_names_reordered]\n", 128 | "# df.head()\n", 129 | "\n", 130 | "\n", 131 | "\n", 132 | "########## Create a Checkpoint ###################\n", 133 | "##################################################\n", 134 | "\n", 135 | "\n", 136 | "df_reason_mod = df.copy()\n", 137 | "# df_reason_mod\n", 138 | "\n", 139 | "\n", 140 | "\n", 141 | "########## 'Date' ################################\n", 142 | "##################################################\n", 143 | "\n", 144 | "\n", 145 | "# df_reason_mod['Date']\n", 146 | "# df_reason_mod['Date'][0]\n", 147 | "# type(df_reason_mod['Date'][0])\n", 148 | "\n", 149 | "# df_reason_mod['Date'] = pd.to_datetime(df_reason_mod['Date'])\n", 150 | "# df_reason_mod['Date']\n", 151 | "\n", 152 | "df_reason_mod['Date'] = pd.to_datetime(df_reason_mod['Date'], format = '%d/%m/%Y')\n", 153 | "# df_reason_mod['Date']\n", 154 | "# type(df_reason_mod['Date'])\n", 155 | "# df_reason_mod.info()\n", 156 | "\n", 157 | "\n", 158 | "\n", 159 | "########## Extract the Month Value ###############\n", 160 | "##################################################\n", 161 | "\n", 162 | "\n", 163 | "# df_reason_mod['Date'][0]\n", 164 | "# df_reason_mod['Date'][0].month\n", 165 | "\n", 166 | "list_months = []\n", 167 | "# list_months\n", 168 | "\n", 169 | "# df_reason_mod.shape\n", 170 | "\n", 171 | "for i in range(df_reason_mod.shape[0]):\n", 172 | " list_months.append(df_reason_mod['Date'][i].month)\n", 173 | " \n", 174 | "# list_months\n", 175 | "# len(list_months)\n", 176 | "df_reason_mod['Month Value'] = list_months\n", 177 | "# df_reason_mod.head(20)\n", 178 | "\n", 179 | "\n", 180 | "\n", 181 | "########## Extract the Month Value ###############\n", 182 | "##################################################\n", 183 | "\n", 184 | "\n", 185 | "# df_reason_mod['Date'][699].weekday()\n", 186 | "# df_reason_mod['Date'][699]\n", 187 | "\n", 188 | "def date_to_weekday(date_value):\n", 189 | " return date_value.weekday()\n", 190 | "\n", 191 | "df_reason_mod['Day of the Week'] = df_reason_mod['Date'].apply(date_to_weekday)\n", 
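The month loop and the date_to_weekday helper above spell the extraction out step by step. Once 'Date' has been converted with pd.to_datetime, pandas' .dt accessor can produce both columns directly; a short equivalent sketch:

import pandas as pd

# Two sample dates in the course's day/month/year format.
df_reason_mod = pd.DataFrame(
    {'Date': pd.to_datetime(['07/07/2015', '14/07/2015'], format='%d/%m/%Y')})

df_reason_mod['Month Value'] = df_reason_mod['Date'].dt.month        # 1-12
df_reason_mod['Day of the Week'] = df_reason_mod['Date'].dt.weekday  # Monday = 0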
192 | "\n", 193 | "# df_reason_mod.head()" 194 | ] 195 | }, 196 | { 197 | "cell_type": "markdown", 198 | "metadata": {}, 199 | "source": [ 200 | "## Exercise:" 201 | ] 202 | }, 203 | { 204 | "cell_type": "markdown", 205 | "metadata": {}, 206 | "source": [ 207 | "**Drop the 'Date' column from the df_reason_mod DataFrame.** " 208 | ] 209 | }, 210 | { 211 | "cell_type": "code", 212 | "execution_count": null, 213 | "metadata": {}, 214 | "outputs": [], 215 | "source": [ 216 | "df_reason_mod = df_reason_mod.drop(['Date'], axis = 1)\n", 217 | "df_reason_mod.head()" 218 | ] 219 | }, 220 | { 221 | "cell_type": "markdown", 222 | "metadata": {}, 223 | "source": [ 224 | "**Re-order the columns in df_reason_mod so that “Month Value” and “Day of the Week” appear exactly where “Date” used to be. That is, between “Reason_4” and “Transportation Expense”.**" 225 | ] 226 | }, 227 | { 228 | "cell_type": "code", 229 | "execution_count": null, 230 | "metadata": {}, 231 | "outputs": [], 232 | "source": [ 233 | "df_reason_mod.columns.values" 234 | ] 235 | }, 236 | { 237 | "cell_type": "code", 238 | "execution_count": null, 239 | "metadata": {}, 240 | "outputs": [], 241 | "source": [ 242 | "column_names_upd = ['Reason_1', 'Reason_2', 'Reason_3', 'Reason_4', 'Month Value', 'Day of the Week',\n", 243 | " 'Transportation Expense', 'Distance to Work', 'Age',\n", 244 | " 'Daily Work Load Average', 'Body Mass Index', 'Education', 'Children',\n", 245 | " 'Pets', 'Absenteeism Time in Hours']" 246 | ] 247 | }, 248 | { 249 | "cell_type": "code", 250 | "execution_count": null, 251 | "metadata": {}, 252 | "outputs": [], 253 | "source": [ 254 | "df_reason_mod = df_reason_mod[column_names_upd]\n", 255 | "df_reason_mod.head()" 256 | ] 257 | }, 258 | { 259 | "cell_type": "markdown", 260 | "metadata": {}, 261 | "source": [ 262 | "**Create another checkpoint, calling the new variable df_reason_date_mod.**" 263 | ] 264 | }, 265 | { 266 | "cell_type": "code", 267 | "execution_count": null, 268 | "metadata": {}, 269 | "outputs": [], 270 | "source": [ 271 | "df_reason_date_mod = df_reason_mod.copy()\n", 272 | "df_reason_date_mod" 273 | ] 274 | } 275 | ], 276 | "metadata": { 277 | "kernelspec": { 278 | "display_name": "Python 3", 279 | "language": "python", 280 | "name": "python3" 281 | }, 282 | "language_info": { 283 | "codemirror_mode": { 284 | "name": "ipython", 285 | "version": 3 286 | }, 287 | "file_extension": ".py", 288 | "mimetype": "text/x-python", 289 | "name": "python", 290 | "nbconvert_exporter": "python", 291 | "pygments_lexer": "ipython3", 292 | "version": "3.6.4" 293 | } 294 | }, 295 | "nbformat": 4, 296 | "nbformat_minor": 2 297 | } 298 | -------------------------------------------------------------------------------- /6. 
Machine Learning/66 EXERCISE - Saving the Model (and Scaler)/Absenteeism Exercise - Logistic Regression.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Creating a logistic regression to predict absenteeism" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "## Import the relevant libraries" 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": null, 20 | "metadata": {}, 21 | "outputs": [], 22 | "source": [ 23 | "import pandas as pd\n", 24 | "import numpy as np" 25 | ] 26 | }, 27 | { 28 | "cell_type": "markdown", 29 | "metadata": {}, 30 | "source": [ 31 | "## Load the data" 32 | ] 33 | }, 34 | { 35 | "cell_type": "code", 36 | "execution_count": null, 37 | "metadata": {}, 38 | "outputs": [], 39 | "source": [ 40 | "data_preprocessed = pd.read_csv('Absenteeism_preprocessed.csv')" 41 | ] 42 | }, 43 | { 44 | "cell_type": "code", 45 | "execution_count": null, 46 | "metadata": {}, 47 | "outputs": [], 48 | "source": [ 49 | "data_preprocessed.head()" 50 | ] 51 | }, 52 | { 53 | "cell_type": "markdown", 54 | "metadata": {}, 55 | "source": [ 56 | "## Create the targets" 57 | ] 58 | }, 59 | { 60 | "cell_type": "code", 61 | "execution_count": null, 62 | "metadata": {}, 63 | "outputs": [], 64 | "source": [ 65 | "data_preprocessed['Absenteeism Time in Hours'].median()" 66 | ] 67 | }, 68 | { 69 | "cell_type": "code", 70 | "execution_count": null, 71 | "metadata": {}, 72 | "outputs": [], 73 | "source": [ 74 | "targets = np.where(data_preprocessed['Absenteeism Time in Hours'] > \n", 75 | " data_preprocessed['Absenteeism Time in Hours'].median(), 1, 0)" 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": null, 81 | "metadata": { 82 | "scrolled": true 83 | }, 84 | "outputs": [], 85 | "source": [ 86 | "targets" 87 | ] 88 | }, 89 | { 90 | "cell_type": "code", 91 | "execution_count": null, 92 | "metadata": {}, 93 | "outputs": [], 94 | "source": [ 95 | "data_preprocessed['Excessive Absenteeism'] = targets" 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": null, 101 | "metadata": {}, 102 | "outputs": [], 103 | "source": [ 104 | "data_preprocessed.head()" 105 | ] 106 | }, 107 | { 108 | "cell_type": "markdown", 109 | "metadata": {}, 110 | "source": [ 111 | "## A comment on the targets" 112 | ] 113 | }, 114 | { 115 | "cell_type": "code", 116 | "execution_count": null, 117 | "metadata": {}, 118 | "outputs": [], 119 | "source": [ 120 | "targets.sum() / targets.shape[0]" 121 | ] 122 | }, 123 | { 124 | "cell_type": "code", 125 | "execution_count": null, 126 | "metadata": {}, 127 | "outputs": [], 128 | "source": [ 129 | "data_with_targets = data_preprocessed.drop(['Absenteeism Time in Hours','Day of the Week',\n", 130 | " 'Daily Work Load Average','Distance to Work'],axis=1)" 131 | ] 132 | }, 133 | { 134 | "cell_type": "code", 135 | "execution_count": null, 136 | "metadata": {}, 137 | "outputs": [], 138 | "source": [ 139 | "data_with_targets is data_preprocessed" 140 | ] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "execution_count": null, 145 | "metadata": {}, 146 | "outputs": [], 147 | "source": [ 148 | "data_with_targets.head()" 149 | ] 150 | }, 151 | { 152 | "cell_type": "markdown", 153 | "metadata": {}, 154 | "source": [ 155 | "## Select the inputs for the regression" 156 | ] 157 | }, 158 | { 159 | "cell_type": "code", 160 | "execution_count": null, 161 | "metadata": {}, 162 
| "outputs": [], 163 | "source": [ 164 | "data_with_targets.shape" 165 | ] 166 | }, 167 | { 168 | "cell_type": "code", 169 | "execution_count": null, 170 | "metadata": { 171 | "scrolled": true 172 | }, 173 | "outputs": [], 174 | "source": [ 175 | "data_with_targets.iloc[:,:14]" 176 | ] 177 | }, 178 | { 179 | "cell_type": "code", 180 | "execution_count": null, 181 | "metadata": { 182 | "scrolled": true 183 | }, 184 | "outputs": [], 185 | "source": [ 186 | "data_with_targets.iloc[:,:-1]" 187 | ] 188 | }, 189 | { 190 | "cell_type": "code", 191 | "execution_count": null, 192 | "metadata": {}, 193 | "outputs": [], 194 | "source": [ 195 | "unscaled_inputs = data_with_targets.iloc[:,:-1]" 196 | ] 197 | }, 198 | { 199 | "cell_type": "markdown", 200 | "metadata": {}, 201 | "source": [ 202 | "## Standardize the data" 203 | ] 204 | }, 205 | { 206 | "cell_type": "code", 207 | "execution_count": null, 208 | "metadata": {}, 209 | "outputs": [], 210 | "source": [ 211 | "from sklearn.preprocessing import StandardScaler\n", 212 | "\n", 213 | "absenteeism_scaler = StandardScaler()" 214 | ] 215 | }, 216 | { 217 | "cell_type": "code", 218 | "execution_count": null, 219 | "metadata": {}, 220 | "outputs": [], 221 | "source": [ 222 | "from sklearn.base import BaseEstimator, TransformerMixin\n", 223 | "from sklearn.preprocessing import StandardScaler\n", 224 | "\n", 225 | "class CustomScaler(BaseEstimator,TransformerMixin): \n", 226 | " \n", 227 | " def __init__(self,columns,copy=True,with_mean=True,with_std=True):\n", 228 | " self.scaler = StandardScaler(copy,with_mean,with_std)\n", 229 | " self.columns = columns\n", 230 | " self.mean_ = None\n", 231 | " self.var_ = None\n", 232 | "\n", 233 | " def fit(self, X, y=None):\n", 234 | " self.scaler.fit(X[self.columns], y)\n", 235 | " self.mean_ = np.mean(X[self.columns])\n", 236 | " self.var_ = np.var(X[self.columns])\n", 237 | " return self\n", 238 | "\n", 239 | " def transform(self, X, y=None, copy=None):\n", 240 | " init_col_order = X.columns\n", 241 | " X_scaled = pd.DataFrame(self.scaler.transform(X[self.columns]), columns=self.columns)\n", 242 | " X_not_scaled = X.loc[:,~X.columns.isin(self.columns)]\n", 243 | " return pd.concat([X_not_scaled, X_scaled], axis=1)[init_col_order]" 244 | ] 245 | }, 246 | { 247 | "cell_type": "code", 248 | "execution_count": null, 249 | "metadata": {}, 250 | "outputs": [], 251 | "source": [ 252 | "unscaled_inputs.columns.values" 253 | ] 254 | }, 255 | { 256 | "cell_type": "code", 257 | "execution_count": null, 258 | "metadata": {}, 259 | "outputs": [], 260 | "source": [ 261 | "#columns_to_scale = ['Month Value','Day of the Week', 'Transportation Expense', 'Distance to Work',\n", 262 | " #'Age', 'Daily Work Load Average', 'Body Mass Index', 'Children', 'Pet']\n", 263 | "\n", 264 | "columns_to_omit = ['Reason_1', 'Reason_2', 'Reason_3', 'Reason_4','Education']" 265 | ] 266 | }, 267 | { 268 | "cell_type": "code", 269 | "execution_count": null, 270 | "metadata": {}, 271 | "outputs": [], 272 | "source": [ 273 | "columns_to_scale = [x for x in unscaled_inputs.columns.values if x not in columns_to_omit]" 274 | ] 275 | }, 276 | { 277 | "cell_type": "code", 278 | "execution_count": null, 279 | "metadata": {}, 280 | "outputs": [], 281 | "source": [ 282 | "absenteeism_scaler = CustomScaler(columns_to_scale)" 283 | ] 284 | }, 285 | { 286 | "cell_type": "code", 287 | "execution_count": null, 288 | "metadata": {}, 289 | "outputs": [], 290 | "source": [ 291 | "absenteeism_scaler.fit(unscaled_inputs)" 292 | ] 293 | }, 294 | { 295 | "cell_type": 
"code", 296 | "execution_count": null, 297 | "metadata": {}, 298 | "outputs": [], 299 | "source": [ 300 | "scaled_inputs = absenteeism_scaler.transform(unscaled_inputs)" 301 | ] 302 | }, 303 | { 304 | "cell_type": "code", 305 | "execution_count": null, 306 | "metadata": { 307 | "scrolled": true 308 | }, 309 | "outputs": [], 310 | "source": [ 311 | "scaled_inputs" 312 | ] 313 | }, 314 | { 315 | "cell_type": "code", 316 | "execution_count": null, 317 | "metadata": {}, 318 | "outputs": [], 319 | "source": [ 320 | "scaled_inputs.shape" 321 | ] 322 | }, 323 | { 324 | "cell_type": "markdown", 325 | "metadata": {}, 326 | "source": [ 327 | "## Split the data into train & test and shuffle" 328 | ] 329 | }, 330 | { 331 | "cell_type": "markdown", 332 | "metadata": {}, 333 | "source": [ 334 | "### Import the relevant module" 335 | ] 336 | }, 337 | { 338 | "cell_type": "code", 339 | "execution_count": null, 340 | "metadata": {}, 341 | "outputs": [], 342 | "source": [ 343 | "from sklearn.model_selection import train_test_split" 344 | ] 345 | }, 346 | { 347 | "cell_type": "markdown", 348 | "metadata": {}, 349 | "source": [ 350 | "### Split" 351 | ] 352 | }, 353 | { 354 | "cell_type": "code", 355 | "execution_count": null, 356 | "metadata": { 357 | "scrolled": true 358 | }, 359 | "outputs": [], 360 | "source": [ 361 | "train_test_split(scaled_inputs, targets)" 362 | ] 363 | }, 364 | { 365 | "cell_type": "code", 366 | "execution_count": null, 367 | "metadata": {}, 368 | "outputs": [], 369 | "source": [ 370 | "x_train, x_test, y_train, y_test = train_test_split(scaled_inputs, targets, #train_size = 0.8, \n", 371 | " test_size = 0.2, random_state = 20)" 372 | ] 373 | }, 374 | { 375 | "cell_type": "code", 376 | "execution_count": null, 377 | "metadata": {}, 378 | "outputs": [], 379 | "source": [ 380 | "print (x_train.shape, y_train.shape)" 381 | ] 382 | }, 383 | { 384 | "cell_type": "code", 385 | "execution_count": null, 386 | "metadata": {}, 387 | "outputs": [], 388 | "source": [ 389 | "print (x_test.shape, y_test.shape)" 390 | ] 391 | }, 392 | { 393 | "cell_type": "markdown", 394 | "metadata": {}, 395 | "source": [ 396 | "## Logistic regression with sklearn" 397 | ] 398 | }, 399 | { 400 | "cell_type": "code", 401 | "execution_count": null, 402 | "metadata": {}, 403 | "outputs": [], 404 | "source": [ 405 | "from sklearn.linear_model import LogisticRegression\n", 406 | "from sklearn import metrics" 407 | ] 408 | }, 409 | { 410 | "cell_type": "markdown", 411 | "metadata": {}, 412 | "source": [ 413 | "### Training the model" 414 | ] 415 | }, 416 | { 417 | "cell_type": "code", 418 | "execution_count": null, 419 | "metadata": {}, 420 | "outputs": [], 421 | "source": [ 422 | "reg = LogisticRegression()" 423 | ] 424 | }, 425 | { 426 | "cell_type": "code", 427 | "execution_count": null, 428 | "metadata": {}, 429 | "outputs": [], 430 | "source": [ 431 | "reg.fit(x_train,y_train)" 432 | ] 433 | }, 434 | { 435 | "cell_type": "code", 436 | "execution_count": null, 437 | "metadata": {}, 438 | "outputs": [], 439 | "source": [ 440 | "reg.score(x_train,y_train)" 441 | ] 442 | }, 443 | { 444 | "cell_type": "markdown", 445 | "metadata": {}, 446 | "source": [ 447 | "### Manually check the accuracy" 448 | ] 449 | }, 450 | { 451 | "cell_type": "code", 452 | "execution_count": null, 453 | "metadata": { 454 | "scrolled": true 455 | }, 456 | "outputs": [], 457 | "source": [ 458 | "model_outputs = reg.predict(x_train)\n", 459 | "model_outputs" 460 | ] 461 | }, 462 | { 463 | "cell_type": "code", 464 | "execution_count": null, 465 | 
"metadata": {}, 466 | "outputs": [], 467 | "source": [ 468 | "y_train" 469 | ] 470 | }, 471 | { 472 | "cell_type": "code", 473 | "execution_count": null, 474 | "metadata": { 475 | "scrolled": true 476 | }, 477 | "outputs": [], 478 | "source": [ 479 | "model_outputs == y_train" 480 | ] 481 | }, 482 | { 483 | "cell_type": "code", 484 | "execution_count": null, 485 | "metadata": {}, 486 | "outputs": [], 487 | "source": [ 488 | "np.sum((model_outputs==y_train))" 489 | ] 490 | }, 491 | { 492 | "cell_type": "code", 493 | "execution_count": null, 494 | "metadata": {}, 495 | "outputs": [], 496 | "source": [ 497 | "model_outputs.shape[0]" 498 | ] 499 | }, 500 | { 501 | "cell_type": "code", 502 | "execution_count": null, 503 | "metadata": {}, 504 | "outputs": [], 505 | "source": [ 506 | "np.sum((model_outputs==y_train)) / model_outputs.shape[0]" 507 | ] 508 | }, 509 | { 510 | "cell_type": "markdown", 511 | "metadata": {}, 512 | "source": [ 513 | "### Finding the intercept and coefficients" 514 | ] 515 | }, 516 | { 517 | "cell_type": "code", 518 | "execution_count": null, 519 | "metadata": {}, 520 | "outputs": [], 521 | "source": [ 522 | "reg.intercept_" 523 | ] 524 | }, 525 | { 526 | "cell_type": "code", 527 | "execution_count": null, 528 | "metadata": {}, 529 | "outputs": [], 530 | "source": [ 531 | "reg.coef_" 532 | ] 533 | }, 534 | { 535 | "cell_type": "code", 536 | "execution_count": null, 537 | "metadata": {}, 538 | "outputs": [], 539 | "source": [ 540 | "unscaled_inputs.columns.values" 541 | ] 542 | }, 543 | { 544 | "cell_type": "code", 545 | "execution_count": null, 546 | "metadata": {}, 547 | "outputs": [], 548 | "source": [ 549 | "feature_name = unscaled_inputs.columns.values" 550 | ] 551 | }, 552 | { 553 | "cell_type": "code", 554 | "execution_count": null, 555 | "metadata": {}, 556 | "outputs": [], 557 | "source": [ 558 | "summary_table = pd.DataFrame (columns=['Feature name'], data = feature_name)\n", 559 | "\n", 560 | "summary_table['Coefficient'] = np.transpose(reg.coef_)\n", 561 | "\n", 562 | "summary_table" 563 | ] 564 | }, 565 | { 566 | "cell_type": "code", 567 | "execution_count": null, 568 | "metadata": { 569 | "scrolled": true 570 | }, 571 | "outputs": [], 572 | "source": [ 573 | "summary_table.index = summary_table.index + 1\n", 574 | "summary_table.loc[0] = ['Intercept', reg.intercept_[0]]\n", 575 | "summary_table = summary_table.sort_index()\n", 576 | "summary_table" 577 | ] 578 | }, 579 | { 580 | "cell_type": "markdown", 581 | "metadata": {}, 582 | "source": [ 583 | "## Interpreting the coefficients" 584 | ] 585 | }, 586 | { 587 | "cell_type": "code", 588 | "execution_count": null, 589 | "metadata": {}, 590 | "outputs": [], 591 | "source": [ 592 | "summary_table['Odds_ratio'] = np.exp(summary_table.Coefficient)" 593 | ] 594 | }, 595 | { 596 | "cell_type": "code", 597 | "execution_count": null, 598 | "metadata": {}, 599 | "outputs": [], 600 | "source": [ 601 | "summary_table" 602 | ] 603 | }, 604 | { 605 | "cell_type": "code", 606 | "execution_count": null, 607 | "metadata": {}, 608 | "outputs": [], 609 | "source": [ 610 | "summary_table.sort_values('Odds_ratio', ascending=False)" 611 | ] 612 | }, 613 | { 614 | "cell_type": "markdown", 615 | "metadata": {}, 616 | "source": [ 617 | "## Testing the model" 618 | ] 619 | }, 620 | { 621 | "cell_type": "code", 622 | "execution_count": null, 623 | "metadata": {}, 624 | "outputs": [], 625 | "source": [ 626 | "reg.score(x_test,y_test)" 627 | ] 628 | }, 629 | { 630 | "cell_type": "code", 631 | "execution_count": null, 632 | "metadata": 
{}, 633 | "outputs": [], 634 | "source": [ 635 | "predicted_proba = reg.predict_proba(x_test)\n", 636 | "predicted_proba" 637 | ] 638 | }, 639 | { 640 | "cell_type": "code", 641 | "execution_count": null, 642 | "metadata": {}, 643 | "outputs": [], 644 | "source": [ 645 | "predicted_proba.shape" 646 | ] 647 | }, 648 | { 649 | "cell_type": "code", 650 | "execution_count": null, 651 | "metadata": {}, 652 | "outputs": [], 653 | "source": [ 654 | "predicted_proba[:,1]" 655 | ] 656 | }, 657 | { 658 | "cell_type": "markdown", 659 | "metadata": {}, 660 | "source": [ 661 | "## Save the model" 662 | ] 663 | }, 664 | { 665 | "cell_type": "code", 666 | "execution_count": null, 667 | "metadata": {}, 668 | "outputs": [], 669 | "source": [ 670 | "import pickle" 671 | ] 672 | }, 673 | { 674 | "cell_type": "code", 675 | "execution_count": null, 676 | "metadata": {}, 677 | "outputs": [], 678 | "source": [ 679 | "with open('model', 'wb') as file:\n", 680 | " pickle.dump(reg, file)" 681 | ] 682 | }, 683 | { 684 | "cell_type": "code", 685 | "execution_count": null, 686 | "metadata": {}, 687 | "outputs": [], 688 | "source": [ 689 | "with open('scaler','wb') as file:\n", 690 | " pickle.dump(absenteeism_scaler, file)" 691 | ] 692 | } 693 | ], 694 | "metadata": { 695 | "kernelspec": { 696 | "display_name": "Python 3", 697 | "language": "python", 698 | "name": "python3" 699 | }, 700 | "language_info": { 701 | "codemirror_mode": { 702 | "name": "ipython", 703 | "version": 3 704 | }, 705 | "file_extension": ".py", 706 | "mimetype": "text/x-python", 707 | "name": "python", 708 | "nbconvert_exporter": "python", 709 | "pygments_lexer": "ipython3", 710 | "version": "3.6.4" 711 | } 712 | }, 713 | "nbformat": 4, 714 | "nbformat_minor": 2 715 | } 716 | -------------------------------------------------------------------------------- /6. 
Machine Learning/60 Omitting the Dummy Variables from the Standardization/Absenteeism Exercise - Logistic Regression_prior to custom_scaler.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Creating a logistic regression to predict absenteeism" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "## Import the relevant libraries" 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": null, 20 | "metadata": {}, 21 | "outputs": [], 22 | "source": [ 23 | "# import the relevant libraries\n", 24 | "import pandas as pd\n", 25 | "import numpy as np" 26 | ] 27 | }, 28 | { 29 | "cell_type": "markdown", 30 | "metadata": {}, 31 | "source": [ 32 | "## Load the data" 33 | ] 34 | }, 35 | { 36 | "cell_type": "code", 37 | "execution_count": null, 38 | "metadata": {}, 39 | "outputs": [], 40 | "source": [ 41 | "# load the preprocessed CSV data\n", 42 | "data_preprocessed = pd.read_csv('Absenteeism_preprocessed.csv')" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": null, 48 | "metadata": {}, 49 | "outputs": [], 50 | "source": [ 51 | "# eyeball the data\n", 52 | "data_preprocessed.head()" 53 | ] 54 | }, 55 | { 56 | "cell_type": "markdown", 57 | "metadata": {}, 58 | "source": [ 59 | "## Create the targets" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "metadata": {}, 66 | "outputs": [], 67 | "source": [ 68 | "# find the median of 'Absenteeism Time in Hours'\n", 69 | "data_preprocessed['Absenteeism Time in Hours'].median()" 70 | ] 71 | }, 72 | { 73 | "cell_type": "code", 74 | "execution_count": null, 75 | "metadata": {}, 76 | "outputs": [], 77 | "source": [ 78 | "# create targets for our logistic regression\n", 79 | "# they have to be categories and we must find a way to say if someone is 'being absent too much' or not\n", 80 | "# what we've decided to do is to take the median of the dataset as a cut-off line\n", 81 | "# in this way the dataset will be balanced (there will be roughly equal number of 0s and 1s for the logistic regression)\n", 82 | "# as balancing is a great problem for ML, this will work great for us\n", 83 | "# alternatively, if we had more data, we could have found other ways to deal with the issue \n", 84 | "# for instance, we could have assigned some arbitrary value as a cut-off line, instead of the median\n", 85 | "\n", 86 | "# note that what line does is to assign 1 to anyone who has been absent 4 hours or more (more than 3 hours)\n", 87 | "# that is the equivalent of taking half a day off\n", 88 | "\n", 89 | "# initial code from the lecture\n", 90 | "# targets = np.where(data_preprocessed['Absenteeism Time in Hours'] > 3, 1, 0)\n", 91 | "\n", 92 | "# parameterized code\n", 93 | "targets = np.where(data_preprocessed['Absenteeism Time in Hours'] > \n", 94 | " data_preprocessed['Absenteeism Time in Hours'].median(), 1, 0)" 95 | ] 96 | }, 97 | { 98 | "cell_type": "code", 99 | "execution_count": null, 100 | "metadata": { 101 | "scrolled": true 102 | }, 103 | "outputs": [], 104 | "source": [ 105 | "# eyeball the targets\n", 106 | "targets" 107 | ] 108 | }, 109 | { 110 | "cell_type": "code", 111 | "execution_count": null, 112 | "metadata": {}, 113 | "outputs": [], 114 | "source": [ 115 | "# create a Series in the original data frame that will contain the targets for the regression\n", 116 | "data_preprocessed['Excessive Absenteeism'] = targets" 
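Because the comparison above is a strict "greater than the median", observations whose hours equal the median fall into class 0. A toy example with invented hours (not the course data), just to show how np.where produces the 0/1 targets:

import numpy as np
import pandas as pd

hours = pd.Series([0, 2, 3, 3, 8, 40])           # invented absenteeism hours
print(hours.median())                            # 3.0
print(np.where(hours > hours.median(), 1, 0))    # [0 0 0 0 1 1] - the two 3s stay in class 0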
117 | ] 118 | }, 119 | { 120 | "cell_type": "code", 121 | "execution_count": null, 122 | "metadata": {}, 123 | "outputs": [], 124 | "source": [ 125 | "# check what happened\n", 126 | "# maybe manually see how the targets were created\n", 127 | "data_preprocessed.head()" 128 | ] 129 | }, 130 | { 131 | "cell_type": "markdown", 132 | "metadata": {}, 133 | "source": [ 134 | "## A comment on the targets" 135 | ] 136 | }, 137 | { 138 | "cell_type": "code", 139 | "execution_count": null, 140 | "metadata": {}, 141 | "outputs": [], 142 | "source": [ 143 | "# check if dataset is balanced (what % of targets are 1s)\n", 144 | "# targets.sum() will give us the number of 1s that there are\n", 145 | "# the shape[0] will give us the length of the targets array\n", 146 | "targets.sum() / targets.shape[0]" 147 | ] 148 | }, 149 | { 150 | "cell_type": "code", 151 | "execution_count": null, 152 | "metadata": {}, 153 | "outputs": [], 154 | "source": [ 155 | "# create a checkpoint by dropping the unnecessary variables\n", 156 | "data_with_targets = data_preprocessed.drop(['Absenteeism Time in Hours'],axis=1)" 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": null, 162 | "metadata": {}, 163 | "outputs": [], 164 | "source": [ 165 | "# check if the line above is a checkpoint :)\n", 166 | "\n", 167 | "# if data_with_targets is data_preprocessed = True, then the two are pointing to the same object\n", 168 | "# if it is False, then the two variables are completely different and this is in fact a checkpoint\n", 169 | "data_with_targets is data_preprocessed" 170 | ] 171 | }, 172 | { 173 | "cell_type": "code", 174 | "execution_count": null, 175 | "metadata": {}, 176 | "outputs": [], 177 | "source": [ 178 | "# check what's inside\n", 179 | "data_with_targets.head()" 180 | ] 181 | }, 182 | { 183 | "cell_type": "markdown", 184 | "metadata": {}, 185 | "source": [ 186 | "## Select the inputs for the regression" 187 | ] 188 | }, 189 | { 190 | "cell_type": "code", 191 | "execution_count": null, 192 | "metadata": {}, 193 | "outputs": [], 194 | "source": [ 195 | "data_with_targets.shape" 196 | ] 197 | }, 198 | { 199 | "cell_type": "code", 200 | "execution_count": null, 201 | "metadata": { 202 | "scrolled": true 203 | }, 204 | "outputs": [], 205 | "source": [ 206 | "# selects all rows and all columns until 14 (excluding)\n", 207 | "data_with_targets.iloc[:,:14]" 208 | ] 209 | }, 210 | { 211 | "cell_type": "code", 212 | "execution_count": null, 213 | "metadata": { 214 | "scrolled": true 215 | }, 216 | "outputs": [], 217 | "source": [ 218 | "# selects all rows and all columns but the last one (basically the same operation)\n", 219 | "data_with_targets.iloc[:,:-1]" 220 | ] 221 | }, 222 | { 223 | "cell_type": "code", 224 | "execution_count": null, 225 | "metadata": {}, 226 | "outputs": [], 227 | "source": [ 228 | "# create a variable that will contain the inputs (everything without the targets)\n", 229 | "unscaled_inputs = data_with_targets.iloc[:,:-1]" 230 | ] 231 | }, 232 | { 233 | "cell_type": "markdown", 234 | "metadata": {}, 235 | "source": [ 236 | "## Standardize the data" 237 | ] 238 | }, 239 | { 240 | "cell_type": "code", 241 | "execution_count": null, 242 | "metadata": {}, 243 | "outputs": [], 244 | "source": [ 245 | "# standardize the inputs\n", 246 | "\n", 247 | "# standardization is one of the most common preprocessing tools\n", 248 | "# since data of different magnitude (scale) can be biased towards high values,\n", 249 | "# we want all inputs to be of similar magnitude\n", 250 | "# this is a 
peculiarity of machine learning in general - most (but not all) algorithms do badly with unscaled data\n", 251 | "\n", 252 | "# a very useful module we can use is StandardScaler \n", 253 | "# it has much more capabilities than the straightforward 'preprocessing' method\n", 254 | "from sklearn.preprocessing import StandardScaler\n", 255 | "\n", 256 | "\n", 257 | "# we will create a variable that will contain the scaling information for this particular dataset\n", 258 | "# here's the full documentation: http://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.StandardScaler.html\n", 259 | "\n", 260 | "# define scaler as an object\n", 261 | "absenteeism_scaler = StandardScaler()" 262 | ] 263 | }, 264 | { 265 | "cell_type": "code", 266 | "execution_count": null, 267 | "metadata": {}, 268 | "outputs": [], 269 | "source": [ 270 | "# fit the unscaled_inputs\n", 271 | "# this basically calculates the mean and standard deviation of each feature\n", 272 | "absenteeism_scaler.fit(unscaled_inputs)" 273 | ] 274 | }, 275 | { 276 | "cell_type": "code", 277 | "execution_count": null, 278 | "metadata": {}, 279 | "outputs": [], 280 | "source": [ 281 | "# transform the unscaled inputs\n", 282 | "# this is the scaling itself - we subtract the mean and divide by the standard deviation\n", 283 | "scaled_inputs = absenteeism_scaler.transform(unscaled_inputs)" 284 | ] 285 | }, 286 | { 287 | "cell_type": "code", 288 | "execution_count": null, 289 | "metadata": {}, 290 | "outputs": [], 291 | "source": [ 292 | "# eyeball the newly created variable\n", 293 | "scaled_inputs" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": null, 299 | "metadata": {}, 300 | "outputs": [], 301 | "source": [ 302 | "# check the shape of the array\n", 303 | "scaled_inputs.shape" 304 | ] 305 | }, 306 | { 307 | "cell_type": "markdown", 308 | "metadata": {}, 309 | "source": [ 310 | "## Split the data into train & test and shuffle" 311 | ] 312 | }, 313 | { 314 | "cell_type": "markdown", 315 | "metadata": {}, 316 | "source": [ 317 | "### Import the relevant module" 318 | ] 319 | }, 320 | { 321 | "cell_type": "code", 322 | "execution_count": null, 323 | "metadata": {}, 324 | "outputs": [], 325 | "source": [ 326 | "# import train_test_split so we can split our data into train and test\n", 327 | "from sklearn.model_selection import train_test_split" 328 | ] 329 | }, 330 | { 331 | "cell_type": "markdown", 332 | "metadata": {}, 333 | "source": [ 334 | "### Split" 335 | ] 336 | }, 337 | { 338 | "cell_type": "code", 339 | "execution_count": null, 340 | "metadata": { 341 | "scrolled": true 342 | }, 343 | "outputs": [], 344 | "source": [ 345 | "# check how this method works\n", 346 | "train_test_split(scaled_inputs, targets)" 347 | ] 348 | }, 349 | { 350 | "cell_type": "code", 351 | "execution_count": null, 352 | "metadata": {}, 353 | "outputs": [], 354 | "source": [ 355 | "# declare 4 variables for the split\n", 356 | "x_train, x_test, y_train, y_test = train_test_split(scaled_inputs, targets, #train_size = 0.8, \n", 357 | " test_size = 0.2, random_state = 20)" 358 | ] 359 | }, 360 | { 361 | "cell_type": "code", 362 | "execution_count": null, 363 | "metadata": {}, 364 | "outputs": [], 365 | "source": [ 366 | "# check the shape of the train inputs and targets\n", 367 | "print (x_train.shape, y_train.shape)" 368 | ] 369 | }, 370 | { 371 | "cell_type": "code", 372 | "execution_count": null, 373 | "metadata": {}, 374 | "outputs": [], 375 | "source": [ 376 | "# check the shape of the test inputs and targets\n", 377 
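Note that in this version of the notebook the plain StandardScaler is fitted on every input column, so the Reason_1 through Reason_4 and Education dummies get standardized along with the numerical features; avoiding exactly that is the motivation for the CustomScaler introduced in the later lectures. A toy sketch (an invented dummy column) of why scaled dummies become hard to read:

import numpy as np
from sklearn.preprocessing import StandardScaler

dummy = np.array([[0], [0], [0], [1]])            # an invented 0/1 dummy column
print(StandardScaler().fit_transform(dummy).ravel())
# roughly [-0.58 -0.58 -0.58  1.73] - the original "reason present or not" reading is lost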
| "print (x_test.shape, y_test.shape)" 378 | ] 379 | }, 380 | { 381 | "cell_type": "markdown", 382 | "metadata": {}, 383 | "source": [ 384 | "## Logistic regression with sklearn" 385 | ] 386 | }, 387 | { 388 | "cell_type": "code", 389 | "execution_count": null, 390 | "metadata": {}, 391 | "outputs": [], 392 | "source": [ 393 | "# import the LogReg model from sklearn\n", 394 | "from sklearn.linear_model import LogisticRegression\n", 395 | "\n", 396 | "# import the 'metrics' module, which includes important metrics we may want to use\n", 397 | "from sklearn import metrics" 398 | ] 399 | }, 400 | { 401 | "cell_type": "markdown", 402 | "metadata": {}, 403 | "source": [ 404 | "### Training the model" 405 | ] 406 | }, 407 | { 408 | "cell_type": "code", 409 | "execution_count": null, 410 | "metadata": {}, 411 | "outputs": [], 412 | "source": [ 413 | "# create a logistic regression object\n", 414 | "reg = LogisticRegression()" 415 | ] 416 | }, 417 | { 418 | "cell_type": "code", 419 | "execution_count": null, 420 | "metadata": {}, 421 | "outputs": [], 422 | "source": [ 423 | "# fit our train inputs\n", 424 | "# that is basically the whole training part of the machine learning\n", 425 | "reg.fit(x_train,y_train)" 426 | ] 427 | }, 428 | { 429 | "cell_type": "code", 430 | "execution_count": null, 431 | "metadata": {}, 432 | "outputs": [], 433 | "source": [ 434 | "# assess the train accuracy of the model\n", 435 | "reg.score(x_train,y_train)" 436 | ] 437 | }, 438 | { 439 | "cell_type": "markdown", 440 | "metadata": {}, 441 | "source": [ 442 | "### Manually check the accuracy" 443 | ] 444 | }, 445 | { 446 | "cell_type": "code", 447 | "execution_count": null, 448 | "metadata": { 449 | "scrolled": true 450 | }, 451 | "outputs": [], 452 | "source": [ 453 | "# find the model outputs according to our model\n", 454 | "model_outputs = reg.predict(x_train)\n", 455 | "model_outputs" 456 | ] 457 | }, 458 | { 459 | "cell_type": "code", 460 | "execution_count": null, 461 | "metadata": {}, 462 | "outputs": [], 463 | "source": [ 464 | "# compare them with the targets\n", 465 | "y_train" 466 | ] 467 | }, 468 | { 469 | "cell_type": "code", 470 | "execution_count": null, 471 | "metadata": { 472 | "scrolled": true 473 | }, 474 | "outputs": [], 475 | "source": [ 476 | "# ACTUALLY compare the two variables\n", 477 | "model_outputs == y_train" 478 | ] 479 | }, 480 | { 481 | "cell_type": "code", 482 | "execution_count": null, 483 | "metadata": {}, 484 | "outputs": [], 485 | "source": [ 486 | "# find out in how many instances we predicted correctly\n", 487 | "np.sum((model_outputs==y_train))" 488 | ] 489 | }, 490 | { 491 | "cell_type": "code", 492 | "execution_count": null, 493 | "metadata": {}, 494 | "outputs": [], 495 | "source": [ 496 | "# get the total number of instances\n", 497 | "model_outputs.shape[0]" 498 | ] 499 | }, 500 | { 501 | "cell_type": "code", 502 | "execution_count": null, 503 | "metadata": {}, 504 | "outputs": [], 505 | "source": [ 506 | "# calculate the accuracy of the model\n", 507 | "# divide the number of correctly predicted outputs by the number of all outputs\n", 508 | "np.sum((model_outputs==y_train)) / model_outputs.shape[0]" 509 | ] 510 | }, 511 | { 512 | "cell_type": "markdown", 513 | "metadata": {}, 514 | "source": [ 515 | "### Finding the intercept and coefficients" 516 | ] 517 | }, 518 | { 519 | "cell_type": "code", 520 | "execution_count": null, 521 | "metadata": {}, 522 | "outputs": [], 523 | "source": [ 524 | "# get the intercept (bias) of our model\n", 525 | "reg.intercept_" 526 | ] 527 | 
}, 528 | { 529 | "cell_type": "code", 530 | "execution_count": null, 531 | "metadata": {}, 532 | "outputs": [], 533 | "source": [ 534 | "# get the coefficients (weights) of our model\n", 535 | "reg.coef_" 536 | ] 537 | }, 538 | { 539 | "cell_type": "code", 540 | "execution_count": null, 541 | "metadata": {}, 542 | "outputs": [], 543 | "source": [ 544 | "# check what were the names of our columns\n", 545 | "unscaled_inputs.columns.values" 546 | ] 547 | }, 548 | { 549 | "cell_type": "code", 550 | "execution_count": null, 551 | "metadata": {}, 552 | "outputs": [], 553 | "source": [ 554 | "# save the names of the columns in an ad-hoc variable\n", 555 | "feature_name = unscaled_inputs.columns.values" 556 | ] 557 | }, 558 | { 559 | "cell_type": "code", 560 | "execution_count": null, 561 | "metadata": {}, 562 | "outputs": [], 563 | "source": [ 564 | "# use the coefficients from this table (they will be exported later and will be used in Tableau)\n", 565 | "# transpose the model coefficients (model.coef_) and throws them into a df (a vertical organization, so that they can be\n", 566 | "# multiplied by certain matrices later) \n", 567 | "summary_table = pd.DataFrame (columns=['Feature name'], data = feature_name)\n", 568 | "\n", 569 | "# add the coefficient values to the summary table\n", 570 | "summary_table['Coefficient'] = np.transpose(reg.coef_)\n", 571 | "\n", 572 | "# display the summary table\n", 573 | "summary_table" 574 | ] 575 | }, 576 | { 577 | "cell_type": "code", 578 | "execution_count": null, 579 | "metadata": { 580 | "scrolled": true 581 | }, 582 | "outputs": [], 583 | "source": [ 584 | "# do a little Python trick to move the intercept to the top of the summary table\n", 585 | "# move all indices by 1\n", 586 | "summary_table.index = summary_table.index + 1\n", 587 | "\n", 588 | "# add the intercept at index 0\n", 589 | "summary_table.loc[0] = ['Intercept', reg.intercept_[0]]\n", 590 | "\n", 591 | "# sort the df by index\n", 592 | "summary_table = summary_table.sort_index()\n", 593 | "summary_table" 594 | ] 595 | }, 596 | { 597 | "cell_type": "markdown", 598 | "metadata": {}, 599 | "source": [ 600 | "## Interpreting the coefficients" 601 | ] 602 | }, 603 | { 604 | "cell_type": "code", 605 | "execution_count": null, 606 | "metadata": {}, 607 | "outputs": [], 608 | "source": [ 609 | "# create a new Series called: 'Odds ratio' which will show the.. 
odds ratio of each feature\n", 610 | "summary_table['Odds_ratio'] = np.exp(summary_table.Coefficient)" 611 | ] 612 | }, 613 | { 614 | "cell_type": "code", 615 | "execution_count": null, 616 | "metadata": {}, 617 | "outputs": [], 618 | "source": [ 619 | "# display the df\n", 620 | "summary_table" 621 | ] 622 | }, 623 | { 624 | "cell_type": "code", 625 | "execution_count": null, 626 | "metadata": {}, 627 | "outputs": [], 628 | "source": [ 629 | "# sort the table according to odds ratio\n", 630 | "# note that by default, the sort_values method sorts values by 'ascending'\n", 631 | "summary_table.sort_values('Odds_ratio', ascending=False)" 632 | ] 633 | } 634 | ], 635 | "metadata": { 636 | "kernelspec": { 637 | "display_name": "Python 3", 638 | "language": "python", 639 | "name": "python3" 640 | }, 641 | "language_info": { 642 | "codemirror_mode": { 643 | "name": "ipython", 644 | "version": 3 645 | }, 646 | "file_extension": ".py", 647 | "mimetype": "text/x-python", 648 | "name": "python", 649 | "nbconvert_exporter": "python", 650 | "pygments_lexer": "ipython3", 651 | "version": "3.6.4" 652 | } 653 | }, 654 | "nbformat": 4, 655 | "nbformat_minor": 2 656 | } 657 | -------------------------------------------------------------------------------- /6. Machine Learning/62 Simplifying the Model (Backward Elimination)/Absenteeism Exercise - Logistic Regression_prior_to_backward_elimination.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Creating a logistic regression to predict absenteeism" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "## Import the relevant libraries" 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": null, 20 | "metadata": {}, 21 | "outputs": [], 22 | "source": [ 23 | "# import the relevant libraries\n", 24 | "import pandas as pd\n", 25 | "import numpy as np" 26 | ] 27 | }, 28 | { 29 | "cell_type": "markdown", 30 | "metadata": {}, 31 | "source": [ 32 | "## Load the data" 33 | ] 34 | }, 35 | { 36 | "cell_type": "code", 37 | "execution_count": null, 38 | "metadata": {}, 39 | "outputs": [], 40 | "source": [ 41 | "# load the preprocessed CSV data\n", 42 | "data_preprocessed = pd.read_csv('Absenteeism_preprocessed.csv')" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": null, 48 | "metadata": {}, 49 | "outputs": [], 50 | "source": [ 51 | "# eyeball the data\n", 52 | "data_preprocessed.head()" 53 | ] 54 | }, 55 | { 56 | "cell_type": "markdown", 57 | "metadata": {}, 58 | "source": [ 59 | "## Create the targets" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "metadata": {}, 66 | "outputs": [], 67 | "source": [ 68 | "# find the median of 'Absenteeism Time in Hours'\n", 69 | "data_preprocessed['Absenteeism Time in Hours'].median()" 70 | ] 71 | }, 72 | { 73 | "cell_type": "code", 74 | "execution_count": null, 75 | "metadata": {}, 76 | "outputs": [], 77 | "source": [ 78 | "# create targets for our logistic regression\n", 79 | "# they have to be categories and we must find a way to say if someone is 'being absent too much' or not\n", 80 | "# what we've decided to do is to take the median of the dataset as a cut-off line\n", 81 | "# in this way the dataset will be balanced (there will be roughly equal number of 0s and 1s for the logistic regression)\n", 82 | "# as balancing is a great problem for ML, this will work great for us\n", 83 
| "# alternatively, if we had more data, we could have found other ways to deal with the issue \n", 84 | "# for instance, we could have assigned some arbitrary value as a cut-off line, instead of the median\n", 85 | "\n", 86 | "# note that what line does is to assign 1 to anyone who has been absent 4 hours or more (more than 3 hours)\n", 87 | "# that is the equivalent of taking half a day off\n", 88 | "\n", 89 | "# initial code from the lecture\n", 90 | "# targets = np.where(data_preprocessed['Absenteeism Time in Hours'] > 3, 1, 0)\n", 91 | "\n", 92 | "# parameterized code\n", 93 | "targets = np.where(data_preprocessed['Absenteeism Time in Hours'] > \n", 94 | " data_preprocessed['Absenteeism Time in Hours'].median(), 1, 0)" 95 | ] 96 | }, 97 | { 98 | "cell_type": "code", 99 | "execution_count": null, 100 | "metadata": { 101 | "scrolled": true 102 | }, 103 | "outputs": [], 104 | "source": [ 105 | "# eyeball the targets\n", 106 | "targets" 107 | ] 108 | }, 109 | { 110 | "cell_type": "code", 111 | "execution_count": null, 112 | "metadata": {}, 113 | "outputs": [], 114 | "source": [ 115 | "# create a Series in the original data frame that will contain the targets for the regression\n", 116 | "data_preprocessed['Excessive Absenteeism'] = targets" 117 | ] 118 | }, 119 | { 120 | "cell_type": "code", 121 | "execution_count": null, 122 | "metadata": {}, 123 | "outputs": [], 124 | "source": [ 125 | "# check what happened\n", 126 | "# maybe manually see how the targets were created\n", 127 | "data_preprocessed.head()" 128 | ] 129 | }, 130 | { 131 | "cell_type": "markdown", 132 | "metadata": {}, 133 | "source": [ 134 | "## A comment on the targets" 135 | ] 136 | }, 137 | { 138 | "cell_type": "code", 139 | "execution_count": null, 140 | "metadata": {}, 141 | "outputs": [], 142 | "source": [ 143 | "# check if dataset is balanced (what % of targets are 1s)\n", 144 | "# targets.sum() will give us the number of 1s that there are\n", 145 | "# the shape[0] will give us the length of the targets array\n", 146 | "targets.sum() / targets.shape[0]" 147 | ] 148 | }, 149 | { 150 | "cell_type": "code", 151 | "execution_count": null, 152 | "metadata": {}, 153 | "outputs": [], 154 | "source": [ 155 | "# create a checkpoint by dropping the unnecessary variables\n", 156 | "# also drop the variables we 'eliminated' after exploring the weights\n", 157 | "data_with_targets = data_preprocessed.drop(['Absenteeism Time in Hours'],axis=1)" 158 | ] 159 | }, 160 | { 161 | "cell_type": "code", 162 | "execution_count": null, 163 | "metadata": {}, 164 | "outputs": [], 165 | "source": [ 166 | "# check if the line above is a checkpoint :)\n", 167 | "\n", 168 | "# if data_with_targets is data_preprocessed = True, then the two are pointing to the same object\n", 169 | "# if it is False, then the two variables are completely different and this is in fact a checkpoint\n", 170 | "data_with_targets is data_preprocessed" 171 | ] 172 | }, 173 | { 174 | "cell_type": "code", 175 | "execution_count": null, 176 | "metadata": {}, 177 | "outputs": [], 178 | "source": [ 179 | "# check what's inside\n", 180 | "data_with_targets.head()" 181 | ] 182 | }, 183 | { 184 | "cell_type": "markdown", 185 | "metadata": {}, 186 | "source": [ 187 | "## Select the inputs for the regression" 188 | ] 189 | }, 190 | { 191 | "cell_type": "code", 192 | "execution_count": null, 193 | "metadata": {}, 194 | "outputs": [], 195 | "source": [ 196 | "data_with_targets.shape" 197 | ] 198 | }, 199 | { 200 | "cell_type": "code", 201 | "execution_count": null, 202 | "metadata": 
{ 203 | "scrolled": true 204 | }, 205 | "outputs": [], 206 | "source": [ 207 | "# Selects all rows and all columns until 14 (excluding)\n", 208 | "data_with_targets.iloc[:,:14]" 209 | ] 210 | }, 211 | { 212 | "cell_type": "code", 213 | "execution_count": null, 214 | "metadata": { 215 | "scrolled": true 216 | }, 217 | "outputs": [], 218 | "source": [ 219 | "# Selects all rows and all columns but the last one (basically the same operation)\n", 220 | "data_with_targets.iloc[:,:-1]" 221 | ] 222 | }, 223 | { 224 | "cell_type": "code", 225 | "execution_count": null, 226 | "metadata": {}, 227 | "outputs": [], 228 | "source": [ 229 | "# Create a variable that will contain the inputs (everything without the targets)\n", 230 | "unscaled_inputs = data_with_targets.iloc[:,:-1]" 231 | ] 232 | }, 233 | { 234 | "cell_type": "markdown", 235 | "metadata": {}, 236 | "source": [ 237 | "## Standardize the data" 238 | ] 239 | }, 240 | { 241 | "cell_type": "code", 242 | "execution_count": null, 243 | "metadata": {}, 244 | "outputs": [], 245 | "source": [ 246 | "# standardize the inputs\n", 247 | "\n", 248 | "# standardization is one of the most common preprocessing tools\n", 249 | "# since data of different magnitude (scale) can be biased towards high values,\n", 250 | "# we want all inputs to be of similar magnitude\n", 251 | "# this is a peculiarity of machine learning in general - most (but not all) algorithms do badly with unscaled data\n", 252 | "\n", 253 | "# a very useful module we can use is StandardScaler \n", 254 | "# it has much more capabilities than the straightforward 'preprocessing' method\n", 255 | "from sklearn.preprocessing import StandardScaler\n", 256 | "\n", 257 | "\n", 258 | "# we will create a variable that will contain the scaling information for this particular dataset\n", 259 | "# here's the full documentation: http://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.StandardScaler.html\n", 260 | "\n", 261 | "# define scaler as an object\n", 262 | "absenteeism_scaler = StandardScaler()" 263 | ] 264 | }, 265 | { 266 | "cell_type": "code", 267 | "execution_count": null, 268 | "metadata": {}, 269 | "outputs": [], 270 | "source": [ 271 | "# import the libraries needed to create the Custom Scaler\n", 272 | "# note that all of them are a part of the sklearn package\n", 273 | "# moreover, one of them is actually the StandardScaler module, \n", 274 | "# so you can imagine that the Custom Scaler is build on it\n", 275 | "\n", 276 | "from sklearn.base import BaseEstimator, TransformerMixin\n", 277 | "from sklearn.preprocessing import StandardScaler\n", 278 | "\n", 279 | "# create the Custom Scaler class\n", 280 | "\n", 281 | "class CustomScaler(BaseEstimator,TransformerMixin): \n", 282 | " \n", 283 | " # init or what information we need to declare a CustomScaler object\n", 284 | " # and what is calculated/declared as we do\n", 285 | " \n", 286 | " def __init__(self,columns,copy=True,with_mean=True,with_std=True):\n", 287 | " \n", 288 | " # scaler is nothing but a Standard Scaler object\n", 289 | " self.scaler = StandardScaler(copy,with_mean,with_std)\n", 290 | " # with some columns 'twist'\n", 291 | " self.columns = columns\n", 292 | " self.mean_ = None\n", 293 | " self.var_ = None\n", 294 | " \n", 295 | " \n", 296 | " # the fit method, which, again based on StandardScale\n", 297 | " \n", 298 | " def fit(self, X, y=None):\n", 299 | " self.scaler.fit(X[self.columns], y)\n", 300 | " self.mean_ = np.mean(X[self.columns])\n", 301 | " self.var_ = np.var(X[self.columns])\n", 302 | " 
return self\n", 303 | " \n", 304 | " # the transform method which does the actual scaling\n", 305 | "\n", 306 | " def transform(self, X, y=None, copy=None):\n", 307 | " \n", 308 | " # record the initial order of the columns\n", 309 | " init_col_order = X.columns\n", 310 | " \n", 311 | " # scale all features that you chose when creating the instance of the class\n", 312 | " X_scaled = pd.DataFrame(self.scaler.transform(X[self.columns]), columns=self.columns)\n", 313 | " \n", 314 | " # declare a variable containing all information that was not scaled\n", 315 | " X_not_scaled = X.loc[:,~X.columns.isin(self.columns)]\n", 316 | " \n", 317 | " # return a data frame which contains all scaled features and all 'not scaled' features\n", 318 | " # use the original order (that you recorded in the beginning)\n", 319 | " return pd.concat([X_not_scaled, X_scaled], axis=1)[init_col_order]" 320 | ] 321 | }, 322 | { 323 | "cell_type": "code", 324 | "execution_count": null, 325 | "metadata": {}, 326 | "outputs": [], 327 | "source": [ 328 | "# check what are all columns that we've got\n", 329 | "unscaled_inputs.columns.values" 330 | ] 331 | }, 332 | { 333 | "cell_type": "code", 334 | "execution_count": null, 335 | "metadata": {}, 336 | "outputs": [], 337 | "source": [ 338 | "# choose the columns to scale\n", 339 | "# we later augmented this code and put it in comments\n", 340 | "# columns_to_scale = ['Month Value','Day of the Week', 'Transportation Expense', 'Distance to Work',\n", 341 | " #'Age', 'Daily Work Load Average', 'Body Mass Index', 'Children', 'Pet']\n", 342 | " \n", 343 | "# select the columns to omit\n", 344 | "columns_to_omit = ['Reason_1', 'Reason_2', 'Reason_3', 'Reason_4','Education']" 345 | ] 346 | }, 347 | { 348 | "cell_type": "code", 349 | "execution_count": null, 350 | "metadata": {}, 351 | "outputs": [], 352 | "source": [ 353 | "# create the columns to scale, based on the columns to omit\n", 354 | "# use list comprehension to iterate over the list\n", 355 | "columns_to_scale = [x for x in unscaled_inputs.columns.values if x not in columns_to_omit]" 356 | ] 357 | }, 358 | { 359 | "cell_type": "code", 360 | "execution_count": null, 361 | "metadata": {}, 362 | "outputs": [], 363 | "source": [ 364 | "# declare a scaler object, specifying the columns you want to scale\n", 365 | "absenteeism_scaler = CustomScaler(columns_to_scale)" 366 | ] 367 | }, 368 | { 369 | "cell_type": "code", 370 | "execution_count": null, 371 | "metadata": {}, 372 | "outputs": [], 373 | "source": [ 374 | "# fit the data (calculate mean and standard deviation); they are automatically stored inside the object \n", 375 | "absenteeism_scaler.fit(unscaled_inputs)" 376 | ] 377 | }, 378 | { 379 | "cell_type": "code", 380 | "execution_count": null, 381 | "metadata": {}, 382 | "outputs": [], 383 | "source": [ 384 | "# standardizes the data, using the transform method \n", 385 | "# in the last line, we fitted the data - in other words\n", 386 | "# we found the internal parameters of a model that will be used to transform data. 
\n", 387 | "# transforming applies these parameters to our data\n", 388 | "# note that when you get new data, you can just call 'scaler' again and transform it in the same way as now\n", 389 | "scaled_inputs = absenteeism_scaler.transform(unscaled_inputs)" 390 | ] 391 | }, 392 | { 393 | "cell_type": "code", 394 | "execution_count": null, 395 | "metadata": { 396 | "scrolled": true 397 | }, 398 | "outputs": [], 399 | "source": [ 400 | "# the scaled_inputs are now an ndarray, because sklearn works with ndarrays\n", 401 | "scaled_inputs" 402 | ] 403 | }, 404 | { 405 | "cell_type": "code", 406 | "execution_count": null, 407 | "metadata": {}, 408 | "outputs": [], 409 | "source": [ 410 | "# check the shape of the inputs\n", 411 | "scaled_inputs.shape" 412 | ] 413 | }, 414 | { 415 | "cell_type": "markdown", 416 | "metadata": {}, 417 | "source": [ 418 | "## Split the data into train & test and shuffle" 419 | ] 420 | }, 421 | { 422 | "cell_type": "markdown", 423 | "metadata": {}, 424 | "source": [ 425 | "### Import the relevant module" 426 | ] 427 | }, 428 | { 429 | "cell_type": "code", 430 | "execution_count": null, 431 | "metadata": {}, 432 | "outputs": [], 433 | "source": [ 434 | "# import train_test_split so we can split our data into train and test\n", 435 | "from sklearn.model_selection import train_test_split" 436 | ] 437 | }, 438 | { 439 | "cell_type": "markdown", 440 | "metadata": {}, 441 | "source": [ 442 | "### Split" 443 | ] 444 | }, 445 | { 446 | "cell_type": "code", 447 | "execution_count": null, 448 | "metadata": { 449 | "scrolled": true 450 | }, 451 | "outputs": [], 452 | "source": [ 453 | "# check how this method works\n", 454 | "train_test_split(scaled_inputs, targets)" 455 | ] 456 | }, 457 | { 458 | "cell_type": "code", 459 | "execution_count": null, 460 | "metadata": {}, 461 | "outputs": [], 462 | "source": [ 463 | "# declare 4 variables for the split\n", 464 | "x_train, x_test, y_train, y_test = train_test_split(scaled_inputs, targets, #train_size = 0.8, \n", 465 | " test_size = 0.2, random_state = 20)" 466 | ] 467 | }, 468 | { 469 | "cell_type": "code", 470 | "execution_count": null, 471 | "metadata": {}, 472 | "outputs": [], 473 | "source": [ 474 | "# check the shape of the train inputs and targets\n", 475 | "print (x_train.shape, y_train.shape)" 476 | ] 477 | }, 478 | { 479 | "cell_type": "code", 480 | "execution_count": null, 481 | "metadata": {}, 482 | "outputs": [], 483 | "source": [ 484 | "# check the shape of the test inputs and targets\n", 485 | "print (x_test.shape, y_test.shape)" 486 | ] 487 | }, 488 | { 489 | "cell_type": "markdown", 490 | "metadata": {}, 491 | "source": [ 492 | "## Logistic regression with sklearn" 493 | ] 494 | }, 495 | { 496 | "cell_type": "code", 497 | "execution_count": null, 498 | "metadata": {}, 499 | "outputs": [], 500 | "source": [ 501 | "# import the LogReg model from sklearn\n", 502 | "from sklearn.linear_model import LogisticRegression\n", 503 | "\n", 504 | "# import the 'metrics' module, which includes important metrics we may want to use\n", 505 | "from sklearn import metrics" 506 | ] 507 | }, 508 | { 509 | "cell_type": "markdown", 510 | "metadata": {}, 511 | "source": [ 512 | "### Training the model" 513 | ] 514 | }, 515 | { 516 | "cell_type": "code", 517 | "execution_count": null, 518 | "metadata": {}, 519 | "outputs": [], 520 | "source": [ 521 | "# create a logistic regression object\n", 522 | "reg = LogisticRegression()" 523 | ] 524 | }, 525 | { 526 | "cell_type": "code", 527 | "execution_count": null, 528 | "metadata": {}, 529 | 
"outputs": [], 530 | "source": [ 531 | "# fit our train inputs\n", 532 | "# that is basically the whole training part of the machine learning\n", 533 | "reg.fit(x_train,y_train)" 534 | ] 535 | }, 536 | { 537 | "cell_type": "code", 538 | "execution_count": null, 539 | "metadata": {}, 540 | "outputs": [], 541 | "source": [ 542 | "# assess the train accuracy of the model\n", 543 | "reg.score(x_train,y_train)" 544 | ] 545 | }, 546 | { 547 | "cell_type": "markdown", 548 | "metadata": {}, 549 | "source": [ 550 | "### Manually check the accuracy" 551 | ] 552 | }, 553 | { 554 | "cell_type": "code", 555 | "execution_count": null, 556 | "metadata": { 557 | "scrolled": true 558 | }, 559 | "outputs": [], 560 | "source": [ 561 | "# find the model outputs according to our model\n", 562 | "model_outputs = reg.predict(x_train)\n", 563 | "model_outputs" 564 | ] 565 | }, 566 | { 567 | "cell_type": "code", 568 | "execution_count": null, 569 | "metadata": {}, 570 | "outputs": [], 571 | "source": [ 572 | "# compare them with the targets\n", 573 | "y_train" 574 | ] 575 | }, 576 | { 577 | "cell_type": "code", 578 | "execution_count": null, 579 | "metadata": { 580 | "scrolled": true 581 | }, 582 | "outputs": [], 583 | "source": [ 584 | "# ACTUALLY compare the two variables\n", 585 | "model_outputs == y_train" 586 | ] 587 | }, 588 | { 589 | "cell_type": "code", 590 | "execution_count": null, 591 | "metadata": {}, 592 | "outputs": [], 593 | "source": [ 594 | "# find out in how many instances we predicted correctly\n", 595 | "np.sum((model_outputs==y_train))" 596 | ] 597 | }, 598 | { 599 | "cell_type": "code", 600 | "execution_count": null, 601 | "metadata": {}, 602 | "outputs": [], 603 | "source": [ 604 | "# get the total number of instances\n", 605 | "model_outputs.shape[0]" 606 | ] 607 | }, 608 | { 609 | "cell_type": "code", 610 | "execution_count": null, 611 | "metadata": {}, 612 | "outputs": [], 613 | "source": [ 614 | "# calculate the accuracy of the model\n", 615 | "np.sum((model_outputs==y_train)) / model_outputs.shape[0]" 616 | ] 617 | }, 618 | { 619 | "cell_type": "markdown", 620 | "metadata": {}, 621 | "source": [ 622 | "### Finding the intercept and coefficients" 623 | ] 624 | }, 625 | { 626 | "cell_type": "code", 627 | "execution_count": null, 628 | "metadata": {}, 629 | "outputs": [], 630 | "source": [ 631 | "# get the intercept (bias) of our model\n", 632 | "reg.intercept_" 633 | ] 634 | }, 635 | { 636 | "cell_type": "code", 637 | "execution_count": null, 638 | "metadata": {}, 639 | "outputs": [], 640 | "source": [ 641 | "# get the coefficients (weights) of our model\n", 642 | "reg.coef_" 643 | ] 644 | }, 645 | { 646 | "cell_type": "code", 647 | "execution_count": null, 648 | "metadata": {}, 649 | "outputs": [], 650 | "source": [ 651 | "# check what were the names of our columns\n", 652 | "unscaled_inputs.columns.values" 653 | ] 654 | }, 655 | { 656 | "cell_type": "code", 657 | "execution_count": null, 658 | "metadata": {}, 659 | "outputs": [], 660 | "source": [ 661 | "# save the names of the columns in an ad-hoc variable\n", 662 | "feature_name = unscaled_inputs.columns.values" 663 | ] 664 | }, 665 | { 666 | "cell_type": "code", 667 | "execution_count": null, 668 | "metadata": {}, 669 | "outputs": [], 670 | "source": [ 671 | "# use the coefficients from this table (they will be exported later and will be used in Tableau)\n", 672 | "# transpose the model coefficients (model.coef_) and throws them into a df (a vertical organization, so that they can be\n", 673 | "# multiplied by certain matrices later) 
\n", 674 | "summary_table = pd.DataFrame (columns=['Feature name'], data = feature_name)\n", 675 | "\n", 676 | "# add the coefficient values to the summary table\n", 677 | "summary_table['Coefficient'] = np.transpose(reg.coef_)\n", 678 | "\n", 679 | "# display the summary table\n", 680 | "summary_table" 681 | ] 682 | }, 683 | { 684 | "cell_type": "code", 685 | "execution_count": null, 686 | "metadata": { 687 | "scrolled": true 688 | }, 689 | "outputs": [], 690 | "source": [ 691 | "# do a little Python trick to move the intercept to the top of the summary table\n", 692 | "# move all indices by 1\n", 693 | "summary_table.index = summary_table.index + 1\n", 694 | "\n", 695 | "# add the intercept at index 0\n", 696 | "summary_table.loc[0] = ['Intercept', reg.intercept_[0]]\n", 697 | "\n", 698 | "# sort the df by index\n", 699 | "summary_table = summary_table.sort_index()\n", 700 | "summary_table" 701 | ] 702 | }, 703 | { 704 | "cell_type": "markdown", 705 | "metadata": {}, 706 | "source": [ 707 | "## Interpreting the coefficients" 708 | ] 709 | }, 710 | { 711 | "cell_type": "code", 712 | "execution_count": null, 713 | "metadata": {}, 714 | "outputs": [], 715 | "source": [ 716 | "# create a new Series called: 'Odds ratio' which will show the.. odds ratio of each feature\n", 717 | "summary_table['Odds_ratio'] = np.exp(summary_table.Coefficient)" 718 | ] 719 | }, 720 | { 721 | "cell_type": "code", 722 | "execution_count": null, 723 | "metadata": {}, 724 | "outputs": [], 725 | "source": [ 726 | "# display the df\n", 727 | "summary_table" 728 | ] 729 | }, 730 | { 731 | "cell_type": "code", 732 | "execution_count": null, 733 | "metadata": {}, 734 | "outputs": [], 735 | "source": [ 736 | "# sort the table according to odds ratio\n", 737 | "# note that by default, the sort_values method sorts values by 'ascending'\n", 738 | "summary_table.sort_values('Odds_ratio', ascending=False)" 739 | ] 740 | } 741 | ], 742 | "metadata": { 743 | "kernelspec": { 744 | "display_name": "Python 3", 745 | "language": "python", 746 | "name": "python3" 747 | }, 748 | "language_info": { 749 | "codemirror_mode": { 750 | "name": "ipython", 751 | "version": 3 752 | }, 753 | "file_extension": ".py", 754 | "mimetype": "text/x-python", 755 | "name": "python", 756 | "nbconvert_exporter": "python", 757 | "pygments_lexer": "ipython3", 758 | "version": "3.6.4" 759 | } 760 | }, 761 | "nbformat": 4, 762 | "nbformat_minor": 2 763 | } 764 | -------------------------------------------------------------------------------- /6. 
Machine Learning/66 EXERCISE - Saving the Model (and Scaler)/Absenteeism Exercise - Logistic Regression_with_comments.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Creating a logistic regression to predict absenteeism" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "## Import the relevant libraries" 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": null, 20 | "metadata": {}, 21 | "outputs": [], 22 | "source": [ 23 | "# import the relevant libraries\n", 24 | "import pandas as pd\n", 25 | "import numpy as np" 26 | ] 27 | }, 28 | { 29 | "cell_type": "markdown", 30 | "metadata": {}, 31 | "source": [ 32 | "## Load the data" 33 | ] 34 | }, 35 | { 36 | "cell_type": "code", 37 | "execution_count": null, 38 | "metadata": {}, 39 | "outputs": [], 40 | "source": [ 41 | "# load the preprocessed CSV data\n", 42 | "data_preprocessed = pd.read_csv('Absenteeism_preprocessed.csv')" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": null, 48 | "metadata": {}, 49 | "outputs": [], 50 | "source": [ 51 | "# eyeball the data\n", 52 | "data_preprocessed.head()" 53 | ] 54 | }, 55 | { 56 | "cell_type": "markdown", 57 | "metadata": {}, 58 | "source": [ 59 | "## Create the targets" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "metadata": {}, 66 | "outputs": [], 67 | "source": [ 68 | "# find the median of 'Absenteeism Time in Hours'\n", 69 | "data_preprocessed['Absenteeism Time in Hours'].median()" 70 | ] 71 | }, 72 | { 73 | "cell_type": "code", 74 | "execution_count": null, 75 | "metadata": {}, 76 | "outputs": [], 77 | "source": [ 78 | "# create targets for our logistic regression\n", 79 | "# they have to be categories and we must find a way to say if someone is 'being absent too much' or not\n", 80 | "# what we've decided to do is to take the median of the dataset as a cut-off line\n", 81 | "# in this way the dataset will be balanced (there will be roughly equal number of 0s and 1s for the logistic regression)\n", 82 | "# as balancing is a great problem for ML, this will work great for us\n", 83 | "# alternatively, if we had more data, we could have found other ways to deal with the issue \n", 84 | "# for instance, we could have assigned some arbitrary value as a cut-off line, instead of the median\n", 85 | "\n", 86 | "# note that what line does is to assign 1 to anyone who has been absent 4 hours or more (more than 3 hours)\n", 87 | "# that is the equivalent of taking half a day off\n", 88 | "\n", 89 | "# initial code from the lecture\n", 90 | "# targets = np.where(data_preprocessed['Absenteeism Time in Hours'] > 3, 1, 0)\n", 91 | "\n", 92 | "# parameterized code\n", 93 | "targets = np.where(data_preprocessed['Absenteeism Time in Hours'] > \n", 94 | " data_preprocessed['Absenteeism Time in Hours'].median(), 1, 0)" 95 | ] 96 | }, 97 | { 98 | "cell_type": "code", 99 | "execution_count": null, 100 | "metadata": { 101 | "scrolled": true 102 | }, 103 | "outputs": [], 104 | "source": [ 105 | "# eyeball the targets\n", 106 | "targets" 107 | ] 108 | }, 109 | { 110 | "cell_type": "code", 111 | "execution_count": null, 112 | "metadata": {}, 113 | "outputs": [], 114 | "source": [ 115 | "# create a Series in the original data frame that will contain the targets for the regression\n", 116 | "data_preprocessed['Excessive Absenteeism'] = targets" 117 | ] 118 | }, 119 
| { 120 | "cell_type": "code", 121 | "execution_count": null, 122 | "metadata": {}, 123 | "outputs": [], 124 | "source": [ 125 | "# check what happened\n", 126 | "# maybe manually see how the targets were created\n", 127 | "data_preprocessed.head()" 128 | ] 129 | }, 130 | { 131 | "cell_type": "markdown", 132 | "metadata": {}, 133 | "source": [ 134 | "## A comment on the targets" 135 | ] 136 | }, 137 | { 138 | "cell_type": "code", 139 | "execution_count": null, 140 | "metadata": {}, 141 | "outputs": [], 142 | "source": [ 143 | "# check if dataset is balanced (what % of targets are 1s)\n", 144 | "# targets.sum() will give us the number of 1s that there are\n", 145 | "# the shape[0] will give us the length of the targets array\n", 146 | "targets.sum() / targets.shape[0]" 147 | ] 148 | }, 149 | { 150 | "cell_type": "code", 151 | "execution_count": null, 152 | "metadata": {}, 153 | "outputs": [], 154 | "source": [ 155 | "# create a checkpoint by dropping the unnecessary variables\n", 156 | "# also drop the variables we 'eliminated' after exploring the weights\n", 157 | "data_with_targets = data_preprocessed.drop(['Absenteeism Time in Hours','Day of the Week',\n", 158 | " 'Daily Work Load Average','Distance to Work'],axis=1)" 159 | ] 160 | }, 161 | { 162 | "cell_type": "code", 163 | "execution_count": null, 164 | "metadata": {}, 165 | "outputs": [], 166 | "source": [ 167 | "# check if the line above is a checkpoint :)\n", 168 | "\n", 169 | "# if data_with_targets is data_preprocessed = True, then the two are pointing to the same object\n", 170 | "# if it is False, then the two variables are completely different and this is in fact a checkpoint\n", 171 | "data_with_targets is data_preprocessed" 172 | ] 173 | }, 174 | { 175 | "cell_type": "code", 176 | "execution_count": null, 177 | "metadata": {}, 178 | "outputs": [], 179 | "source": [ 180 | "# check what's inside\n", 181 | "data_with_targets.head()" 182 | ] 183 | }, 184 | { 185 | "cell_type": "markdown", 186 | "metadata": {}, 187 | "source": [ 188 | "## Select the inputs for the regression" 189 | ] 190 | }, 191 | { 192 | "cell_type": "code", 193 | "execution_count": null, 194 | "metadata": {}, 195 | "outputs": [], 196 | "source": [ 197 | "data_with_targets.shape" 198 | ] 199 | }, 200 | { 201 | "cell_type": "code", 202 | "execution_count": null, 203 | "metadata": { 204 | "scrolled": true 205 | }, 206 | "outputs": [], 207 | "source": [ 208 | "# Selects all rows and all columns until 14 (excluding)\n", 209 | "data_with_targets.iloc[:,:14]" 210 | ] 211 | }, 212 | { 213 | "cell_type": "code", 214 | "execution_count": null, 215 | "metadata": { 216 | "scrolled": true 217 | }, 218 | "outputs": [], 219 | "source": [ 220 | "# Selects all rows and all columns but the last one (basically the same operation)\n", 221 | "data_with_targets.iloc[:,:-1]" 222 | ] 223 | }, 224 | { 225 | "cell_type": "code", 226 | "execution_count": null, 227 | "metadata": {}, 228 | "outputs": [], 229 | "source": [ 230 | "# Create a variable that will contain the inputs (everything without the targets)\n", 231 | "unscaled_inputs = data_with_targets.iloc[:,:-1]" 232 | ] 233 | }, 234 | { 235 | "cell_type": "markdown", 236 | "metadata": {}, 237 | "source": [ 238 | "## Standardize the data" 239 | ] 240 | }, 241 | { 242 | "cell_type": "code", 243 | "execution_count": null, 244 | "metadata": {}, 245 | "outputs": [], 246 | "source": [ 247 | "# standardize the inputs\n", 248 | "\n", 249 | "# standardization is one of the most common preprocessing tools\n", 250 | "# since data of different 
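For context on the extra columns dropped in the checkpoint above: 'Day of the Week', 'Daily Work Load Average' and 'Distance to Work' are the variables 'eliminated' after exploring the weights of the earlier model, the idea of backward elimination being to remove features whose coefficients are so close to 0 (odds ratios so close to 1) that they barely affect the prediction. A hedged sketch of how such candidates could be flagged from a coefficient summary table like the one built further down (the 0.05 cutoff is arbitrary, for illustration only):

# summary_table refers to the 'Feature name' / 'Coefficient' table produced
# by the earlier, pre-elimination run of this notebook
weak_features = summary_table[summary_table['Coefficient'].abs() < 0.05]['Feature name']
print(list(weak_features))     # the features one might consider dropping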
magnitude (scale) can be biased towards high values,\n", 251 | "# we want all inputs to be of similar magnitude\n", 252 | "# this is a peculiarity of machine learning in general - most (but not all) algorithms do badly with unscaled data\n", 253 | "\n", 254 | "# a very useful module we can use is StandardScaler \n", 255 | "# it has much more capabilities than the straightforward 'preprocessing' method\n", 256 | "from sklearn.preprocessing import StandardScaler\n", 257 | "\n", 258 | "\n", 259 | "# we will create a variable that will contain the scaling information for this particular dataset\n", 260 | "# here's the full documentation: http://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.StandardScaler.html\n", 261 | "\n", 262 | "# define scaler as an object\n", 263 | "absenteeism_scaler = StandardScaler()" 264 | ] 265 | }, 266 | { 267 | "cell_type": "code", 268 | "execution_count": null, 269 | "metadata": {}, 270 | "outputs": [], 271 | "source": [ 272 | "# import the libraries needed to create the Custom Scaler\n", 273 | "# note that all of them are a part of the sklearn package\n", 274 | "# moreover, one of them is actually the StandardScaler module, \n", 275 | "# so you can imagine that the Custom Scaler is build on it\n", 276 | "\n", 277 | "from sklearn.base import BaseEstimator, TransformerMixin\n", 278 | "from sklearn.preprocessing import StandardScaler\n", 279 | "\n", 280 | "# create the Custom Scaler class\n", 281 | "\n", 282 | "class CustomScaler(BaseEstimator,TransformerMixin): \n", 283 | " \n", 284 | " # init or what information we need to declare a CustomScaler object\n", 285 | " # and what is calculated/declared as we do\n", 286 | " \n", 287 | " def __init__(self,columns,copy=True,with_mean=True,with_std=True):\n", 288 | " \n", 289 | " # scaler is nothing but a Standard Scaler object\n", 290 | " self.scaler = StandardScaler(copy,with_mean,with_std)\n", 291 | " # with some columns 'twist'\n", 292 | " self.columns = columns\n", 293 | " self.mean_ = None\n", 294 | " self.var_ = None\n", 295 | " \n", 296 | " \n", 297 | " # the fit method, which, again based on StandardScale\n", 298 | " \n", 299 | " def fit(self, X, y=None):\n", 300 | " self.scaler.fit(X[self.columns], y)\n", 301 | " self.mean_ = np.mean(X[self.columns])\n", 302 | " self.var_ = np.var(X[self.columns])\n", 303 | " return self\n", 304 | " \n", 305 | " # the transform method which does the actual scaling\n", 306 | "\n", 307 | " def transform(self, X, y=None, copy=None):\n", 308 | " \n", 309 | " # record the initial order of the columns\n", 310 | " init_col_order = X.columns\n", 311 | " \n", 312 | " # scale all features that you chose when creating the instance of the class\n", 313 | " X_scaled = pd.DataFrame(self.scaler.transform(X[self.columns]), columns=self.columns)\n", 314 | " \n", 315 | " # declare a variable containing all information that was not scaled\n", 316 | " X_not_scaled = X.loc[:,~X.columns.isin(self.columns)]\n", 317 | " \n", 318 | " # return a data frame which contains all scaled features and all 'not scaled' features\n", 319 | " # use the original order (that you recorded in the beginning)\n", 320 | " return pd.concat([X_not_scaled, X_scaled], axis=1)[init_col_order]" 321 | ] 322 | }, 323 | { 324 | "cell_type": "code", 325 | "execution_count": null, 326 | "metadata": {}, 327 | "outputs": [], 328 | "source": [ 329 | "# check what are all columns that we've got\n", 330 | "unscaled_inputs.columns.values" 331 | ] 332 | }, 333 | { 334 | "cell_type": "code", 335 | "execution_count": null, 
336 | "metadata": {}, 337 | "outputs": [], 338 | "source": [ 339 | "# choose the columns to scale\n", 340 | "# we later augmented this code and put it in comments\n", 341 | "# columns_to_scale = ['Month Value','Day of the Week', 'Transportation Expense', 'Distance to Work',\n", 342 | " #'Age', 'Daily Work Load Average', 'Body Mass Index', 'Children', 'Pet']\n", 343 | " \n", 344 | "# select the columns to omit\n", 345 | "columns_to_omit = ['Reason_1', 'Reason_2', 'Reason_3', 'Reason_4','Education']" 346 | ] 347 | }, 348 | { 349 | "cell_type": "code", 350 | "execution_count": null, 351 | "metadata": {}, 352 | "outputs": [], 353 | "source": [ 354 | "# create the columns to scale, based on the columns to omit\n", 355 | "# use list comprehension to iterate over the list\n", 356 | "columns_to_scale = [x for x in unscaled_inputs.columns.values if x not in columns_to_omit]" 357 | ] 358 | }, 359 | { 360 | "cell_type": "code", 361 | "execution_count": null, 362 | "metadata": {}, 363 | "outputs": [], 364 | "source": [ 365 | "# declare a scaler object, specifying the columns you want to scale\n", 366 | "absenteeism_scaler = CustomScaler(columns_to_scale)" 367 | ] 368 | }, 369 | { 370 | "cell_type": "code", 371 | "execution_count": null, 372 | "metadata": {}, 373 | "outputs": [], 374 | "source": [ 375 | "# fit the data (calculate mean and standard deviation); they are automatically stored inside the object \n", 376 | "absenteeism_scaler.fit(unscaled_inputs)" 377 | ] 378 | }, 379 | { 380 | "cell_type": "code", 381 | "execution_count": null, 382 | "metadata": {}, 383 | "outputs": [], 384 | "source": [ 385 | "# standardizes the data, using the transform method \n", 386 | "# in the last line, we fitted the data - in other words\n", 387 | "# we found the internal parameters of a model that will be used to transform data. 
\n", 388 | "# transforming applies these parameters to our data\n", 389 | "# note that when you get new data, you can just call 'scaler' again and transform it in the same way as now\n", 390 | "scaled_inputs = absenteeism_scaler.transform(unscaled_inputs)" 391 | ] 392 | }, 393 | { 394 | "cell_type": "code", 395 | "execution_count": null, 396 | "metadata": { 397 | "scrolled": true 398 | }, 399 | "outputs": [], 400 | "source": [ 401 | "# the scaled_inputs are now an ndarray, because sklearn works with ndarrays\n", 402 | "scaled_inputs" 403 | ] 404 | }, 405 | { 406 | "cell_type": "code", 407 | "execution_count": null, 408 | "metadata": {}, 409 | "outputs": [], 410 | "source": [ 411 | "# check the shape of the inputs\n", 412 | "scaled_inputs.shape" 413 | ] 414 | }, 415 | { 416 | "cell_type": "markdown", 417 | "metadata": {}, 418 | "source": [ 419 | "## Split the data into train & test and shuffle" 420 | ] 421 | }, 422 | { 423 | "cell_type": "markdown", 424 | "metadata": {}, 425 | "source": [ 426 | "### Import the relevant module" 427 | ] 428 | }, 429 | { 430 | "cell_type": "code", 431 | "execution_count": null, 432 | "metadata": {}, 433 | "outputs": [], 434 | "source": [ 435 | "# import train_test_split so we can split our data into train and test\n", 436 | "from sklearn.model_selection import train_test_split" 437 | ] 438 | }, 439 | { 440 | "cell_type": "markdown", 441 | "metadata": {}, 442 | "source": [ 443 | "### Split" 444 | ] 445 | }, 446 | { 447 | "cell_type": "code", 448 | "execution_count": null, 449 | "metadata": { 450 | "scrolled": true 451 | }, 452 | "outputs": [], 453 | "source": [ 454 | "# check how this method works\n", 455 | "train_test_split(scaled_inputs, targets)" 456 | ] 457 | }, 458 | { 459 | "cell_type": "code", 460 | "execution_count": null, 461 | "metadata": {}, 462 | "outputs": [], 463 | "source": [ 464 | "# declare 4 variables for the split\n", 465 | "x_train, x_test, y_train, y_test = train_test_split(scaled_inputs, targets, #train_size = 0.8, \n", 466 | " test_size = 0.2, random_state = 20)" 467 | ] 468 | }, 469 | { 470 | "cell_type": "code", 471 | "execution_count": null, 472 | "metadata": {}, 473 | "outputs": [], 474 | "source": [ 475 | "# check the shape of the train inputs and targets\n", 476 | "print (x_train.shape, y_train.shape)" 477 | ] 478 | }, 479 | { 480 | "cell_type": "code", 481 | "execution_count": null, 482 | "metadata": {}, 483 | "outputs": [], 484 | "source": [ 485 | "# check the shape of the test inputs and targets\n", 486 | "print (x_test.shape, y_test.shape)" 487 | ] 488 | }, 489 | { 490 | "cell_type": "markdown", 491 | "metadata": {}, 492 | "source": [ 493 | "## Logistic regression with sklearn" 494 | ] 495 | }, 496 | { 497 | "cell_type": "code", 498 | "execution_count": null, 499 | "metadata": {}, 500 | "outputs": [], 501 | "source": [ 502 | "# import the LogReg model from sklearn\n", 503 | "from sklearn.linear_model import LogisticRegression\n", 504 | "\n", 505 | "# import the 'metrics' module, which includes important metrics we may want to use\n", 506 | "from sklearn import metrics" 507 | ] 508 | }, 509 | { 510 | "cell_type": "markdown", 511 | "metadata": {}, 512 | "source": [ 513 | "### Training the model" 514 | ] 515 | }, 516 | { 517 | "cell_type": "code", 518 | "execution_count": null, 519 | "metadata": {}, 520 | "outputs": [], 521 | "source": [ 522 | "# create a logistic regression object\n", 523 | "reg = LogisticRegression()" 524 | ] 525 | }, 526 | { 527 | "cell_type": "code", 528 | "execution_count": null, 529 | "metadata": {}, 530 | 
"outputs": [], 531 | "source": [ 532 | "# fit our train inputs\n", 533 | "# that is basically the whole training part of the machine learning\n", 534 | "reg.fit(x_train,y_train)" 535 | ] 536 | }, 537 | { 538 | "cell_type": "code", 539 | "execution_count": null, 540 | "metadata": {}, 541 | "outputs": [], 542 | "source": [ 543 | "# assess the train accuracy of the model\n", 544 | "reg.score(x_train,y_train)" 545 | ] 546 | }, 547 | { 548 | "cell_type": "markdown", 549 | "metadata": {}, 550 | "source": [ 551 | "### Manually check the accuracy" 552 | ] 553 | }, 554 | { 555 | "cell_type": "code", 556 | "execution_count": null, 557 | "metadata": { 558 | "scrolled": true 559 | }, 560 | "outputs": [], 561 | "source": [ 562 | "# find the model outputs according to our model\n", 563 | "model_outputs = reg.predict(x_train)\n", 564 | "model_outputs" 565 | ] 566 | }, 567 | { 568 | "cell_type": "code", 569 | "execution_count": null, 570 | "metadata": {}, 571 | "outputs": [], 572 | "source": [ 573 | "# compare them with the targets\n", 574 | "y_train" 575 | ] 576 | }, 577 | { 578 | "cell_type": "code", 579 | "execution_count": null, 580 | "metadata": { 581 | "scrolled": true 582 | }, 583 | "outputs": [], 584 | "source": [ 585 | "# ACTUALLY compare the two variables\n", 586 | "model_outputs == y_train" 587 | ] 588 | }, 589 | { 590 | "cell_type": "code", 591 | "execution_count": null, 592 | "metadata": {}, 593 | "outputs": [], 594 | "source": [ 595 | "# find out in how many instances we predicted correctly\n", 596 | "np.sum((model_outputs==y_train))" 597 | ] 598 | }, 599 | { 600 | "cell_type": "code", 601 | "execution_count": null, 602 | "metadata": {}, 603 | "outputs": [], 604 | "source": [ 605 | "# get the total number of instances\n", 606 | "model_outputs.shape[0]" 607 | ] 608 | }, 609 | { 610 | "cell_type": "code", 611 | "execution_count": null, 612 | "metadata": {}, 613 | "outputs": [], 614 | "source": [ 615 | "# calculate the accuracy of the model\n", 616 | "np.sum((model_outputs==y_train)) / model_outputs.shape[0]" 617 | ] 618 | }, 619 | { 620 | "cell_type": "markdown", 621 | "metadata": {}, 622 | "source": [ 623 | "### Finding the intercept and coefficients" 624 | ] 625 | }, 626 | { 627 | "cell_type": "code", 628 | "execution_count": null, 629 | "metadata": {}, 630 | "outputs": [], 631 | "source": [ 632 | "# get the intercept (bias) of our model\n", 633 | "reg.intercept_" 634 | ] 635 | }, 636 | { 637 | "cell_type": "code", 638 | "execution_count": null, 639 | "metadata": {}, 640 | "outputs": [], 641 | "source": [ 642 | "# get the coefficients (weights) of our model\n", 643 | "reg.coef_" 644 | ] 645 | }, 646 | { 647 | "cell_type": "code", 648 | "execution_count": null, 649 | "metadata": {}, 650 | "outputs": [], 651 | "source": [ 652 | "# check what were the names of our columns\n", 653 | "unscaled_inputs.columns.values" 654 | ] 655 | }, 656 | { 657 | "cell_type": "code", 658 | "execution_count": null, 659 | "metadata": {}, 660 | "outputs": [], 661 | "source": [ 662 | "# save the names of the columns in an ad-hoc variable\n", 663 | "feature_name = unscaled_inputs.columns.values" 664 | ] 665 | }, 666 | { 667 | "cell_type": "code", 668 | "execution_count": null, 669 | "metadata": {}, 670 | "outputs": [], 671 | "source": [ 672 | "# use the coefficients from this table (they will be exported later and will be used in Tableau)\n", 673 | "# transpose the model coefficients (model.coef_) and throws them into a df (a vertical organization, so that they can be\n", 674 | "# multiplied by certain matrices later) 
\n", 675 | "summary_table = pd.DataFrame (columns=['Feature name'], data = feature_name)\n", 676 | "\n", 677 | "# add the coefficient values to the summary table\n", 678 | "summary_table['Coefficient'] = np.transpose(reg.coef_)\n", 679 | "\n", 680 | "# display the summary table\n", 681 | "summary_table" 682 | ] 683 | }, 684 | { 685 | "cell_type": "code", 686 | "execution_count": null, 687 | "metadata": { 688 | "scrolled": true 689 | }, 690 | "outputs": [], 691 | "source": [ 692 | "# do a little Python trick to move the intercept to the top of the summary table\n", 693 | "# move all indices by 1\n", 694 | "summary_table.index = summary_table.index + 1\n", 695 | "\n", 696 | "# add the intercept at index 0\n", 697 | "summary_table.loc[0] = ['Intercept', reg.intercept_[0]]\n", 698 | "\n", 699 | "# sort the df by index\n", 700 | "summary_table = summary_table.sort_index()\n", 701 | "summary_table" 702 | ] 703 | }, 704 | { 705 | "cell_type": "markdown", 706 | "metadata": {}, 707 | "source": [ 708 | "## Interpreting the coefficients" 709 | ] 710 | }, 711 | { 712 | "cell_type": "code", 713 | "execution_count": null, 714 | "metadata": {}, 715 | "outputs": [], 716 | "source": [ 717 | "# create a new Series called: 'Odds ratio' which will show the.. odds ratio of each feature\n", 718 | "summary_table['Odds_ratio'] = np.exp(summary_table.Coefficient)" 719 | ] 720 | }, 721 | { 722 | "cell_type": "code", 723 | "execution_count": null, 724 | "metadata": {}, 725 | "outputs": [], 726 | "source": [ 727 | "# display the df\n", 728 | "summary_table" 729 | ] 730 | }, 731 | { 732 | "cell_type": "code", 733 | "execution_count": null, 734 | "metadata": {}, 735 | "outputs": [], 736 | "source": [ 737 | "# sort the table according to odds ratio\n", 738 | "# note that by default, the sort_values method sorts values by 'ascending'\n", 739 | "summary_table.sort_values('Odds_ratio', ascending=False)" 740 | ] 741 | }, 742 | { 743 | "cell_type": "markdown", 744 | "metadata": {}, 745 | "source": [ 746 | "## Testing the model" 747 | ] 748 | }, 749 | { 750 | "cell_type": "code", 751 | "execution_count": null, 752 | "metadata": {}, 753 | "outputs": [], 754 | "source": [ 755 | "# assess the test accuracy of the model\n", 756 | "reg.score(x_test,y_test)" 757 | ] 758 | }, 759 | { 760 | "cell_type": "code", 761 | "execution_count": null, 762 | "metadata": {}, 763 | "outputs": [], 764 | "source": [ 765 | "# find the predicted probabilities of each class\n", 766 | "# the first column shows the probability of a particular observation to be 0, while the second one - to be 1\n", 767 | "predicted_proba = reg.predict_proba(x_test)\n", 768 | "\n", 769 | "# let's check that out\n", 770 | "predicted_proba" 771 | ] 772 | }, 773 | { 774 | "cell_type": "code", 775 | "execution_count": null, 776 | "metadata": {}, 777 | "outputs": [], 778 | "source": [ 779 | "predicted_proba.shape" 780 | ] 781 | }, 782 | { 783 | "cell_type": "code", 784 | "execution_count": null, 785 | "metadata": {}, 786 | "outputs": [], 787 | "source": [ 788 | "# select ONLY the probabilities referring to 1s\n", 789 | "predicted_proba[:,1]" 790 | ] 791 | }, 792 | { 793 | "cell_type": "markdown", 794 | "metadata": {}, 795 | "source": [ 796 | "## Save the model" 797 | ] 798 | }, 799 | { 800 | "cell_type": "code", 801 | "execution_count": null, 802 | "metadata": {}, 803 | "outputs": [], 804 | "source": [ 805 | "# import the relevant module\n", 806 | "import pickle" 807 | ] 808 | }, 809 | { 810 | "cell_type": "code", 811 | "execution_count": null, 812 | "metadata": {}, 813 | 
"outputs": [], 814 | "source": [ 815 | "# pickle the model file\n", 816 | "with open('model', 'wb') as file:\n", 817 | " pickle.dump(reg, file)" 818 | ] 819 | }, 820 | { 821 | "cell_type": "code", 822 | "execution_count": null, 823 | "metadata": {}, 824 | "outputs": [], 825 | "source": [ 826 | "# pickle the scaler file\n", 827 | "with open('scaler','wb') as file:\n", 828 | " pickle.dump(absenteeism_scaler, file)" 829 | ] 830 | } 831 | ], 832 | "metadata": { 833 | "kernelspec": { 834 | "display_name": "Python 3", 835 | "language": "python", 836 | "name": "python3" 837 | }, 838 | "language_info": { 839 | "codemirror_mode": { 840 | "name": "ipython", 841 | "version": 3 842 | }, 843 | "file_extension": ".py", 844 | "mimetype": "text/x-python", 845 | "name": "python", 846 | "nbconvert_exporter": "python", 847 | "pygments_lexer": "ipython3", 848 | "version": "3.6.4" 849 | } 850 | }, 851 | "nbformat": 4, 852 | "nbformat_minor": 2 853 | } 854 | -------------------------------------------------------------------------------- /5. Preprocessing/20 What to Expect from the Next Couple of Sections/df_preprocessed.csv: -------------------------------------------------------------------------------- 1 | Reason_1,Reason_2,Reason_3,Reason_4,Month Value,Day of the Week,Transportation Expense,Distance to Work,Age,Daily Work Load Average,Body Mass Index,Education,Children,Pets,Absenteeism Time in Hours 2 | 0,0,0,1,7,1,289,36,33,239.554,30,0,2,1,4 3 | 0,0,0,0,7,1,118,13,50,239.554,31,0,1,0,0 4 | 0,0,0,1,7,2,179,51,38,239.554,31,0,0,0,2 5 | 1,0,0,0,7,3,279,5,39,239.554,24,0,2,0,4 6 | 0,0,0,1,7,3,289,36,33,239.554,30,0,2,1,2 7 | 0,0,0,1,7,4,179,51,38,239.554,31,0,0,0,2 8 | 0,0,0,1,7,4,361,52,28,239.554,27,0,1,4,8 9 | 0,0,0,1,7,4,260,50,36,239.554,23,0,4,0,4 10 | 0,0,1,0,7,0,155,12,34,239.554,25,0,2,0,40 11 | 0,0,0,1,7,0,235,11,37,239.554,29,1,1,1,8 12 | 1,0,0,0,7,0,260,50,36,239.554,23,0,4,0,8 13 | 1,0,0,0,7,1,260,50,36,239.554,23,0,4,0,8 14 | 1,0,0,0,7,2,260,50,36,239.554,23,0,4,0,8 15 | 1,0,0,0,7,2,179,51,38,239.554,31,0,0,0,1 16 | 0,0,0,1,7,2,179,51,38,239.554,31,0,0,0,4 17 | 1,0,0,0,7,4,246,25,41,239.554,23,0,0,0,8 18 | 0,0,0,1,7,4,179,51,38,239.554,31,0,0,0,2 19 | 0,0,1,0,7,0,179,51,38,239.554,31,0,0,0,8 20 | 1,0,0,0,7,3,189,29,33,239.554,25,0,2,2,8 21 | 0,0,0,1,8,2,248,25,47,205.917,32,0,2,1,2 22 | 1,0,0,0,8,2,330,16,28,205.917,25,1,0,0,8 23 | 1,0,0,0,8,0,179,51,38,205.917,31,0,0,0,1 24 | 1,0,0,0,8,0,361,52,28,205.917,27,0,1,4,40 25 | 0,0,0,1,8,4,260,50,36,205.917,23,0,4,0,4 26 | 0,0,1,0,8,0,289,36,33,205.917,30,0,2,1,8 27 | 0,0,0,1,8,0,361,52,28,205.917,27,0,1,4,7 28 | 0,0,0,1,8,1,289,36,33,205.917,30,0,2,1,1 29 | 0,0,0,1,8,2,157,27,29,205.917,22,0,0,0,4 30 | 0,0,1,0,8,2,289,36,33,205.917,30,0,2,1,8 31 | 0,0,0,1,8,4,179,51,38,205.917,31,0,0,0,2 32 | 0,0,1,0,8,0,179,51,38,205.917,31,0,0,0,8 33 | 0,0,1,0,8,3,235,29,48,205.917,33,0,1,5,8 34 | 0,0,0,1,8,3,235,11,37,205.917,29,1,1,1,4 35 | 0,0,1,0,8,0,235,29,48,205.917,33,0,1,5,8 36 | 0,0,0,1,8,0,179,51,38,205.917,31,0,0,0,2 37 | 0,0,0,1,8,0,361,52,28,205.917,27,0,1,4,1 38 | 0,0,0,1,8,1,289,36,33,205.917,30,0,2,1,8 39 | 1,0,0,0,8,3,291,50,32,205.917,23,0,0,0,4 40 | 0,0,0,1,8,4,235,29,48,205.917,33,0,1,5,8 41 | 0,0,0,1,8,4,260,50,36,205.917,23,0,4,0,4 42 | 0,0,0,1,9,1,184,42,27,241.476,21,0,0,0,2 43 | 0,0,0,1,9,0,118,10,37,241.476,28,0,0,0,4 44 | 0,0,0,1,9,1,179,51,38,241.476,31,0,0,0,4 45 | 0,0,1,0,9,1,235,20,43,241.476,38,0,1,0,8 46 | 0,0,0,1,9,2,155,12,34,241.476,25,0,2,0,2 47 | 0,0,0,1,9,6,118,10,37,241.476,28,0,0,0,3 48 | 
0,0,0,1,9,0,179,51,38,241.476,31,0,0,0,3 49 | 0,0,0,1,9,3,291,31,40,241.476,25,0,1,1,4 50 | 0,0,0,1,9,4,260,50,36,241.476,23,0,4,0,8 51 | 1,0,0,0,9,0,291,31,40,241.476,25,0,1,1,32 52 | 0,0,0,0,9,0,260,50,36,241.476,23,0,4,0,0 53 | 0,0,0,0,9,0,225,26,28,241.476,24,0,1,2,0 54 | 0,0,0,1,9,1,225,26,28,241.476,24,0,1,2,2 55 | 0,0,0,1,9,1,118,10,37,241.476,28,0,0,0,2 56 | 0,0,0,0,9,1,289,36,33,241.476,30,0,2,1,0 57 | 0,0,0,0,9,1,118,13,50,241.476,31,0,1,0,0 58 | 0,0,1,0,9,2,225,26,28,241.476,24,0,1,2,3 59 | 0,0,0,1,9,2,179,51,38,241.476,31,0,0,0,3 60 | 0,0,0,0,9,2,369,17,31,241.476,25,0,3,0,0 61 | 0,0,0,1,9,4,248,25,47,241.476,32,0,2,1,1 62 | 0,0,0,1,9,4,179,51,38,241.476,31,0,0,0,3 63 | 0,0,0,1,9,4,260,50,36,241.476,23,0,4,0,4 64 | 0,0,0,1,10,1,179,51,38,253.465,31,0,0,0,3 65 | 0,0,0,1,10,1,118,10,37,253.465,28,0,0,0,3 66 | 0,0,0,0,10,2,118,13,50,253.465,31,0,1,0,0 67 | 0,0,0,1,10,3,179,26,30,253.465,19,1,0,0,1 68 | 0,0,0,1,10,4,179,51,38,253.465,31,0,0,0,3 69 | 0,0,0,1,10,4,225,26,28,253.465,24,0,1,2,3 70 | 0,0,0,1,10,1,118,10,37,253.465,28,0,0,0,3 71 | 0,0,0,1,10,2,225,26,28,253.465,24,0,1,2,2 72 | 0,0,0,1,10,2,248,25,47,253.465,32,0,2,1,2 73 | 0,0,0,1,10,3,291,31,40,253.465,25,0,1,1,5 74 | 0,0,0,1,10,2,179,51,38,253.465,31,0,0,0,8 75 | 0,0,0,1,10,2,225,26,28,253.465,24,0,1,2,3 76 | 0,0,1,0,10,3,260,50,36,253.465,23,0,4,0,16 77 | 1,0,0,0,10,1,291,31,40,253.465,25,0,1,1,8 78 | 0,0,0,1,10,1,225,26,28,253.465,24,0,1,2,2 79 | 0,0,0,1,10,2,289,36,33,253.465,30,0,2,1,8 80 | 0,0,0,1,10,4,361,52,28,253.465,27,0,1,4,1 81 | 0,0,0,1,10,4,260,50,36,253.465,23,0,4,0,3 82 | 0,0,0,1,11,3,179,51,38,306.345,31,0,0,0,1 83 | 0,0,0,1,11,2,225,26,28,306.345,24,0,1,2,1 84 | 1,0,0,0,11,3,179,51,38,306.345,31,0,0,0,8 85 | 0,0,1,0,11,3,179,22,40,306.345,22,1,2,0,8 86 | 0,0,0,1,11,3,291,31,40,306.345,25,0,1,1,5 87 | 1,0,0,0,11,0,155,12,34,306.345,25,0,2,0,32 88 | 0,0,0,1,11,0,189,29,33,306.345,25,0,2,2,8 89 | 1,0,0,0,11,0,291,31,40,306.345,25,0,1,1,40 90 | 0,0,0,1,11,2,225,26,28,306.345,24,0,1,2,1 91 | 1,0,0,0,11,4,155,12,34,306.345,25,0,2,0,8 92 | 0,0,0,1,11,2,225,26,28,306.345,24,0,1,2,3 93 | 0,0,1,0,11,2,179,22,40,306.345,22,1,2,0,8 94 | 1,0,0,0,11,4,225,26,28,306.345,24,0,1,2,3 95 | 0,0,0,1,11,4,260,50,36,306.345,23,0,4,0,4 96 | 0,0,0,1,11,0,248,25,47,306.345,32,0,2,1,1 97 | 0,0,0,1,11,1,225,26,28,306.345,24,0,1,2,3 98 | 1,0,0,0,11,2,289,36,33,306.345,30,0,2,1,24 99 | 0,0,0,1,11,3,291,31,40,306.345,25,0,1,1,3 100 | 0,0,0,1,12,1,248,25,47,261.306,32,0,2,1,1 101 | 0,0,1,0,12,1,118,10,37,261.306,28,0,0,0,64 102 | 0,0,0,1,12,2,118,13,50,261.306,31,0,1,0,2 103 | 0,0,0,1,12,2,235,11,37,261.306,29,1,1,1,8 104 | 0,0,0,1,12,3,225,26,28,261.306,24,0,1,2,2 105 | 0,0,0,1,12,4,260,50,36,261.306,23,0,4,0,8 106 | 0,0,1,0,12,1,118,10,37,261.306,28,0,0,0,56 107 | 0,0,0,1,12,2,361,52,28,261.306,27,0,1,4,8 108 | 0,0,0,1,12,3,225,26,28,261.306,24,0,1,2,3 109 | 0,0,0,1,12,4,260,50,36,261.306,23,0,4,0,3 110 | 0,0,0,1,12,1,225,26,28,261.306,24,0,1,2,2 111 | 0,0,0,1,12,2,361,52,28,261.306,27,0,1,4,8 112 | 0,0,0,1,12,4,118,10,37,261.306,28,0,0,0,2 113 | 0,0,1,0,12,4,246,25,41,261.306,23,0,0,0,8 114 | 0,0,0,1,12,4,225,26,28,261.306,24,0,1,2,2 115 | 0,0,0,1,1,2,225,26,28,308.593,24,0,1,2,1 116 | 0,0,1,0,1,0,118,10,37,308.593,28,0,0,0,1 117 | 0,0,0,1,1,1,118,10,37,308.593,28,0,0,0,1 118 | 0,0,1,0,1,1,155,12,34,308.593,25,0,2,0,8 119 | 0,0,0,1,1,2,225,26,28,308.593,24,0,1,2,2 120 | 0,0,0,1,1,3,184,42,27,308.593,21,0,0,0,2 121 | 0,0,0,1,1,3,225,26,28,308.593,24,0,1,2,2 122 | 0,0,0,1,1,4,225,26,28,308.593,24,0,1,2,1 123 | 
0,0,0,1,1,0,118,10,37,308.593,28,0,0,0,2 124 | 0,0,0,1,1,1,225,26,28,308.593,24,0,1,2,2 125 | 0,0,0,1,1,1,118,10,37,308.593,28,0,0,0,2 126 | 0,0,0,1,1,2,118,10,37,308.593,28,0,0,0,2 127 | 0,0,0,1,1,3,118,10,37,308.593,28,0,0,0,2 128 | 0,0,0,1,1,4,118,10,37,308.593,28,0,0,0,2 129 | 0,0,0,1,1,0,118,10,37,308.593,28,0,0,0,2 130 | 0,0,0,1,1,1,118,10,37,308.593,28,0,0,0,2 131 | 0,0,1,0,1,1,179,26,30,308.593,19,1,0,0,8 132 | 0,0,1,0,1,1,289,36,33,308.593,30,0,2,1,8 133 | 0,0,0,1,1,2,118,10,37,308.593,28,0,0,0,2 134 | 0,0,0,1,1,3,184,42,27,308.593,21,0,0,0,2 135 | 0,0,0,1,1,3,118,10,37,308.593,28,0,0,0,2 136 | 0,0,0,1,1,0,118,10,37,308.593,28,0,0,0,0 137 | 0,0,0,1,1,1,225,26,28,308.593,24,0,1,2,1 138 | 0,0,0,1,1,3,289,36,33,308.593,30,0,2,1,3 139 | 0,0,0,1,2,4,184,42,27,302.585,21,0,0,0,1 140 | 1,0,0,0,2,2,246,25,41,302.585,23,0,0,0,8 141 | 1,0,0,0,2,2,179,51,38,302.585,31,0,0,0,8 142 | 0,0,0,1,2,3,155,12,34,302.585,25,0,2,0,2 143 | 0,0,0,1,2,3,189,29,33,302.585,25,0,2,2,8 144 | 0,0,0,1,2,4,260,50,36,302.585,23,0,4,0,2 145 | 0,0,0,1,2,4,289,36,33,302.585,30,0,2,1,8 146 | 1,0,0,0,2,0,388,15,50,302.585,24,0,0,0,8 147 | 1,0,0,0,2,1,388,15,50,302.585,24,0,0,0,8 148 | 0,0,0,1,2,0,225,26,28,302.585,24,0,1,2,2 149 | 0,0,0,1,2,1,225,26,28,302.585,24,0,1,2,2 150 | 0,0,0,1,2,1,179,26,30,302.585,19,1,0,0,1 151 | 0,0,0,1,2,1,184,42,27,302.585,21,0,0,0,8 152 | 0,0,0,1,2,3,225,26,28,302.585,24,0,1,2,3 153 | 0,0,1,0,2,0,330,16,28,302.585,25,1,0,0,8 154 | 0,0,0,1,2,1,330,16,28,302.585,25,1,0,0,1 155 | 0,0,0,1,2,2,225,26,28,302.585,24,0,1,2,1 156 | 0,0,1,0,2,3,189,29,33,302.585,25,0,2,2,8 157 | 0,0,0,1,3,1,291,50,32,343.253,23,0,0,0,2 158 | 0,0,1,0,3,1,260,50,36,343.253,23,0,4,0,8 159 | 0,0,1,0,3,1,157,27,29,343.253,22,0,0,0,3 160 | 0,1,0,0,3,1,179,22,40,343.253,22,1,2,0,8 161 | 0,0,0,1,3,2,291,31,40,343.253,25,0,1,1,8 162 | 1,0,0,0,3,2,260,50,36,343.253,23,0,4,0,8 163 | 1,0,0,0,3,3,179,26,30,343.253,19,1,0,0,8 164 | 1,0,0,0,3,4,248,25,47,343.253,32,0,2,1,3 165 | 1,0,0,0,3,4,260,50,36,343.253,23,0,4,0,40 166 | 1,0,0,0,3,0,179,22,40,343.253,22,1,2,0,40 167 | 1,0,0,0,3,0,155,12,34,343.253,25,0,2,0,16 168 | 0,0,0,1,3,0,260,50,36,343.253,23,0,4,0,16 169 | 1,0,0,0,3,1,155,12,34,343.253,25,0,2,0,8 170 | 1,0,0,0,3,3,289,36,33,343.253,30,0,2,1,8 171 | 1,0,0,0,3,3,179,22,40,343.253,22,1,2,0,8 172 | 0,0,0,1,3,4,260,50,36,343.253,23,0,4,0,4 173 | 0,0,0,1,3,4,225,26,28,343.253,24,0,1,2,1 174 | 1,0,0,0,3,0,279,5,39,343.253,24,0,2,0,8 175 | 1,0,0,0,3,1,179,51,38,343.253,31,0,0,0,24 176 | 0,0,0,1,3,2,225,26,28,343.253,24,0,1,2,2 177 | 1,0,0,0,3,0,225,26,28,343.253,24,0,1,2,8 178 | 1,0,0,0,3,0,179,26,30,343.253,19,1,0,0,1 179 | 1,0,0,0,3,1,225,26,28,343.253,24,0,1,2,8 180 | 1,0,0,0,3,2,225,26,28,343.253,24,0,1,2,16 181 | 1,0,0,0,3,2,179,51,38,343.253,31,0,0,0,3 182 | 1,0,0,0,3,3,279,5,39,343.253,24,0,2,0,16 183 | 0,0,0,1,3,4,225,26,28,343.253,24,0,1,2,2 184 | 1,0,0,0,3,4,248,25,47,343.253,32,0,2,1,3 185 | 0,0,0,1,3,0,225,26,28,343.253,24,0,1,2,1 186 | 0,0,0,1,4,2,291,31,40,326.452,25,0,1,1,1 187 | 0,0,0,1,4,2,225,26,28,326.452,24,0,1,2,1 188 | 0,0,0,1,4,1,155,12,34,326.452,25,0,2,0,1 189 | 1,0,0,0,4,2,246,25,41,326.452,23,0,0,0,24 190 | 0,0,0,1,4,3,155,12,34,326.452,25,0,2,0,1 191 | 0,0,0,1,4,4,225,26,28,326.452,24,0,1,2,2 192 | 0,0,0,1,4,4,260,50,36,326.452,23,0,4,0,4 193 | 1,0,0,0,4,2,179,51,38,326.452,31,0,0,0,24 194 | 0,0,0,1,4,3,118,13,50,326.452,31,0,1,0,1 195 | 0,0,0,1,4,4,291,31,40,326.452,25,0,1,1,3 196 | 1,0,0,0,4,4,246,25,41,326.452,23,0,0,0,8 197 | 0,0,0,1,4,4,291,31,40,326.452,25,0,1,1,1 198 | 
0,0,0,1,4,4,248,25,47,326.452,32,0,2,1,8 199 | 0,0,1,0,4,4,260,50,36,326.452,23,0,4,0,56 200 | 0,0,1,0,4,1,289,36,33,326.452,30,0,2,1,8 201 | 1,0,0,0,4,2,155,12,34,326.452,25,0,2,0,24 202 | 0,0,1,0,4,2,378,49,36,326.452,21,0,2,4,8 203 | 1,0,0,0,4,3,289,36,33,326.452,30,0,2,1,16 204 | 1,0,0,0,4,4,235,11,37,326.452,29,1,1,1,3 205 | 0,0,0,0,4,0,235,29,48,326.452,33,0,1,5,0 206 | 1,0,0,0,5,2,289,36,33,378.884,30,0,2,1,8 207 | 0,0,0,1,5,3,155,12,34,378.884,25,0,2,0,2 208 | 0,0,0,1,5,0,155,12,34,378.884,25,0,2,0,1 209 | 0,0,1,0,5,1,179,51,38,378.884,31,0,0,0,8 210 | 0,0,1,0,5,1,225,26,28,378.884,24,0,1,2,8 211 | 1,0,0,0,5,2,184,42,27,378.884,21,0,0,0,4 212 | 0,0,0,1,5,0,155,12,34,378.884,25,0,2,0,2 213 | 1,0,0,0,5,1,179,51,38,378.884,31,0,0,0,1 214 | 1,0,0,0,5,2,289,36,33,378.884,30,0,2,1,24 215 | 0,0,0,0,5,2,279,5,39,378.884,24,0,2,0,0 216 | 0,0,0,0,5,2,330,16,28,378.884,25,1,0,0,0 217 | 0,0,0,0,5,2,378,49,36,378.884,21,0,2,4,0 218 | 0,0,0,0,5,2,388,15,50,378.884,24,0,0,0,0 219 | 1,0,0,0,5,1,179,51,38,378.884,31,0,0,0,1 220 | 1,0,0,0,5,2,118,13,50,378.884,31,0,1,0,24 221 | 0,0,0,1,5,4,361,52,28,378.884,27,0,1,4,8 222 | 0,0,1,0,6,0,246,25,41,377.55,23,0,0,0,8 223 | 0,0,0,1,6,0,361,52,28,377.55,27,0,1,4,8 224 | 1,0,0,0,6,1,246,25,41,377.55,23,0,0,0,24 225 | 0,0,0,1,6,3,291,31,40,377.55,25,0,1,1,4 226 | 1,0,0,0,6,4,246,25,41,377.55,23,0,0,0,8 227 | 1,0,0,0,6,0,179,51,38,377.55,31,0,0,0,8 228 | 0,0,0,1,6,0,155,12,34,377.55,25,0,2,0,4 229 | 1,0,0,0,6,0,246,25,41,377.55,23,0,0,0,8 230 | 1,0,0,0,6,2,118,13,50,377.55,31,0,1,0,8 231 | 1,0,0,0,6,4,235,11,37,377.55,29,1,1,1,16 232 | 0,0,0,1,6,1,118,13,50,377.55,31,0,1,0,1 233 | 1,0,0,0,6,2,118,13,50,377.55,31,0,1,0,80 234 | 0,0,0,1,6,3,378,49,36,377.55,21,0,2,4,8 235 | 1,0,0,0,6,4,179,51,38,377.55,31,0,0,0,2 236 | 0,0,0,1,6,0,289,48,49,377.55,36,0,0,2,2 237 | 0,0,0,1,6,3,225,26,28,377.55,24,0,1,2,2 238 | 0,0,1,0,7,1,155,12,34,275.312,25,0,2,0,16 239 | 1,0,0,0,7,2,118,13,50,275.312,31,0,1,0,8 240 | 1,0,0,0,7,4,118,10,37,275.312,28,0,0,0,8 241 | 0,0,0,1,7,4,118,10,37,275.312,28,0,0,0,4 242 | 0,0,0,1,7,1,330,16,28,275.312,25,1,0,0,8 243 | 0,0,1,0,7,3,179,26,30,275.312,19,1,0,0,8 244 | 0,0,0,1,7,4,155,12,34,275.312,25,0,2,0,2 245 | 1,0,0,0,7,0,330,16,28,275.312,25,1,0,0,8 246 | 1,0,0,0,7,1,330,16,28,275.312,25,1,0,0,8 247 | 0,0,0,1,7,0,157,27,29,275.312,22,0,0,0,3 248 | 0,0,0,1,7,1,361,52,28,275.312,27,0,1,4,8 249 | 0,0,0,1,7,2,289,36,33,275.312,30,0,2,1,8 250 | 0,0,0,1,7,3,179,51,38,275.312,31,0,0,0,8 251 | 0,0,1,0,7,0,289,36,33,275.312,30,0,2,1,32 252 | 0,0,1,0,7,3,289,36,33,275.312,30,0,2,1,8 253 | 0,0,0,0,7,3,260,50,36,275.312,23,0,4,0,0 254 | 0,0,1,0,8,3,289,36,33,265.615,30,0,2,1,8 255 | 0,0,1,0,8,3,157,27,29,265.615,22,0,0,0,3 256 | 0,0,0,1,8,6,289,36,33,265.615,30,0,2,1,1 257 | 0,0,1,0,8,0,228,14,58,265.615,22,0,2,1,8 258 | 1,0,0,0,8,2,300,26,43,265.615,25,0,2,1,1 259 | 1,0,0,0,8,2,300,26,43,265.615,25,0,2,1,2 260 | 0,0,0,1,8,3,260,50,36,265.615,23,0,4,0,4 261 | 0,0,0,1,8,0,289,36,33,265.615,30,0,2,1,4 262 | 0,0,0,1,8,1,248,25,47,265.615,32,0,2,1,1 263 | 1,0,0,0,8,2,268,11,33,265.615,25,1,0,0,8 264 | 0,0,0,1,8,2,179,26,30,265.615,19,1,0,0,1 265 | 1,0,0,0,8,2,118,13,50,265.615,31,0,1,0,3 266 | 0,0,0,1,8,6,248,25,47,265.615,32,0,2,1,2 267 | 0,0,0,1,8,0,235,11,37,265.615,29,1,1,1,1 268 | 0,0,0,1,8,2,118,13,50,265.615,31,0,1,0,1 269 | 0,0,1,0,8,2,235,11,37,265.615,29,1,1,1,8 270 | 1,0,0,0,8,0,361,52,28,265.615,27,0,1,4,8 271 | 1,0,0,0,8,1,184,42,27,265.615,21,0,0,0,8 272 | 1,0,0,0,9,0,179,51,38,294.217,31,0,0,0,8 273 | 
0,0,0,1,9,4,179,51,38,294.217,31,0,0,0,3 274 | 0,0,1,0,9,2,289,36,33,294.217,30,0,2,1,24 275 | 0,0,0,0,9,3,235,20,43,294.217,38,0,1,0,0 276 | 1,0,0,0,9,0,246,25,41,294.217,23,0,0,0,16 277 | 0,0,0,1,9,1,291,31,40,294.217,25,0,1,1,3 278 | 0,0,0,0,9,1,231,35,39,294.217,35,0,2,2,0 279 | 0,0,0,0,9,1,291,50,32,294.217,23,0,0,0,0 280 | 1,0,0,0,9,2,179,51,38,294.217,31,0,0,0,8 281 | 1,0,0,0,9,2,246,25,41,294.217,23,0,0,0,32 282 | 0,0,0,1,9,3,179,51,38,294.217,31,0,0,0,1 283 | 0,0,0,1,9,4,291,31,40,294.217,25,0,1,1,4 284 | 0,0,0,1,9,4,260,50,36,294.217,23,0,4,0,4 285 | 0,0,0,1,9,2,235,20,43,294.217,38,0,1,0,8 286 | 0,0,0,1,9,3,118,13,50,294.217,31,0,1,0,1 287 | 0,0,0,0,9,3,235,20,43,294.217,38,0,1,0,0 288 | 0,0,0,1,9,4,291,31,40,294.217,25,0,1,1,3 289 | 1,0,0,0,9,0,291,31,40,294.217,25,0,1,1,40 290 | 1,0,0,0,9,0,179,51,38,294.217,31,0,0,0,8 291 | 0,0,0,1,10,2,289,36,33,265.017,30,0,2,1,8 292 | 0,0,0,1,10,2,235,11,37,265.017,29,1,1,1,4 293 | 0,0,0,1,10,2,289,36,33,265.017,30,0,2,1,8 294 | 0,0,0,1,10,3,289,36,33,265.017,30,0,2,1,8 295 | 0,0,0,0,10,3,118,13,50,265.017,31,0,1,0,0 296 | 0,0,0,0,10,3,248,25,47,265.017,32,0,2,1,0 297 | 1,0,0,0,10,2,179,26,30,265.017,19,1,0,0,8 298 | 1,0,0,0,10,2,118,10,37,265.017,28,0,0,0,3 299 | 0,0,0,1,10,2,369,17,31,265.017,25,0,3,0,8 300 | 0,0,0,1,10,4,179,51,38,265.017,31,0,0,0,1 301 | 1,0,0,0,10,4,179,26,30,265.017,19,1,0,0,64 302 | 0,0,0,0,10,4,235,20,43,265.017,38,0,1,0,0 303 | 0,0,1,0,10,5,289,36,33,265.017,30,0,2,1,16 304 | 0,0,0,1,10,6,260,50,36,265.017,23,0,4,0,3 305 | 0,0,0,0,10,6,235,20,43,265.017,38,0,1,0,0 306 | 0,0,0,1,10,2,235,20,43,265.017,38,0,1,0,2 307 | 0,0,0,1,10,2,235,20,43,265.017,38,0,1,0,2 308 | 0,0,0,1,10,3,118,13,50,265.017,31,0,1,0,1 309 | 0,0,0,1,10,3,291,31,40,265.017,25,0,1,1,4 310 | 0,0,0,1,10,5,179,26,30,265.017,19,1,0,0,16 311 | 0,0,0,1,10,5,118,13,50,265.017,31,0,1,0,1 312 | 1,0,0,0,10,2,361,52,28,265.017,27,0,1,4,8 313 | 0,0,0,0,10,3,260,50,36,265.017,23,0,4,0,0 314 | 0,0,0,0,10,3,291,31,40,265.017,25,0,1,1,0 315 | 0,0,0,0,10,3,157,27,29,265.017,22,0,0,0,0 316 | 1,0,0,0,10,4,179,26,30,265.017,19,1,0,0,5 317 | 1,0,0,0,10,4,179,26,30,265.017,19,1,0,0,5 318 | 0,0,0,1,10,5,118,13,50,265.017,31,0,1,0,1 319 | 1,0,0,0,11,0,118,10,37,284.031,28,0,0,0,8 320 | 0,0,0,1,11,0,248,25,47,284.031,32,0,2,1,2 321 | 1,0,0,0,11,1,179,51,38,284.031,31,0,0,0,8 322 | 0,0,0,1,11,4,260,50,36,284.031,23,0,4,0,3 323 | 0,0,0,1,11,0,291,31,40,284.031,25,0,1,1,1 324 | 1,0,0,0,11,0,378,49,36,284.031,21,0,2,4,8 325 | 1,0,0,0,11,0,155,12,34,284.031,25,0,2,0,120 326 | 0,0,0,1,11,0,235,20,43,284.031,38,0,1,0,8 327 | 0,0,0,0,11,1,330,16,28,284.031,25,1,0,0,0 328 | 0,0,1,0,11,2,235,11,37,284.031,29,1,1,1,1 329 | 1,0,0,0,11,2,118,10,37,284.031,28,0,0,0,3 330 | 0,0,0,1,11,3,235,11,37,284.031,29,1,1,1,2 331 | 0,0,0,1,11,3,179,51,38,284.031,31,0,0,0,3 332 | 1,0,0,0,11,4,246,25,41,284.031,23,0,0,0,8 333 | 1,0,0,0,11,4,291,31,40,284.031,25,0,1,1,4 334 | 1,0,0,0,11,0,246,25,41,284.031,23,0,0,0,8 335 | 0,0,0,1,11,1,179,51,38,284.031,31,0,0,0,1 336 | 1,0,0,0,11,2,260,50,36,284.031,23,0,4,0,8 337 | 0,1,0,0,11,4,260,50,36,284.031,23,0,4,0,8 338 | 0,0,0,0,11,4,378,49,36,284.031,21,0,2,4,0 339 | 0,0,0,0,11,1,279,5,39,284.031,24,0,2,0,0 340 | 0,0,0,1,11,3,179,51,38,284.031,31,0,0,0,1 341 | 1,0,0,0,12,0,225,26,28,236.629,24,0,1,2,3 342 | 0,0,0,1,12,0,179,51,38,236.629,31,0,0,0,2 343 | 0,0,0,1,12,0,179,51,38,236.629,31,0,0,0,1 344 | 0,0,0,1,12,0,235,11,37,236.629,29,1,1,1,3 345 | 0,0,0,1,12,1,118,13,50,236.629,31,0,1,0,1 346 | 0,0,0,1,12,4,260,50,36,236.629,23,0,4,0,4 347 | 
1,0,0,0,12,3,246,25,41,236.629,23,0,0,0,8 348 | 0,0,0,1,12,3,179,51,38,236.629,31,0,0,0,1 349 | 0,0,0,1,12,4,179,51,38,236.629,31,0,0,0,1 350 | 0,0,0,1,12,1,179,26,30,236.629,19,1,0,0,1 351 | 0,0,0,1,12,1,118,10,37,236.629,28,0,0,0,8 352 | 0,0,0,1,12,3,235,11,37,236.629,29,1,1,1,2 353 | 0,0,0,1,12,4,179,51,38,236.629,31,0,0,0,1 354 | 1,0,0,0,12,1,235,20,43,236.629,38,0,1,0,8 355 | 1,0,0,0,12,1,235,11,37,236.629,29,1,1,1,4 356 | 0,0,0,1,12,2,260,50,36,236.629,23,0,4,0,8 357 | 0,0,0,1,12,0,157,27,29,236.629,22,0,0,0,2 358 | 0,0,0,1,12,0,179,51,38,236.629,31,0,0,0,3 359 | 0,0,1,0,12,0,289,36,33,236.629,30,0,2,1,8 360 | 0,0,0,1,1,2,225,26,28,330.061,24,0,1,2,5 361 | 0,0,1,0,1,0,118,10,37,330.061,28,0,0,0,32 362 | 0,0,0,1,1,0,155,12,34,330.061,25,0,2,0,2 363 | 1,0,0,0,1,1,235,11,37,330.061,29,1,1,1,1 364 | 0,0,0,1,1,1,155,12,34,330.061,25,0,2,0,4 365 | 0,0,0,1,1,0,289,36,33,330.061,30,0,2,1,8 366 | 1,0,0,0,1,2,291,31,40,330.061,25,0,1,1,8 367 | 0,0,0,1,1,0,235,20,43,330.061,38,0,1,0,8 368 | 0,0,0,1,1,0,118,13,50,330.061,31,0,1,0,4 369 | 0,0,0,1,1,2,179,51,38,330.061,31,0,0,0,1 370 | 0,0,0,1,1,4,179,51,38,330.061,31,0,0,0,1 371 | 0,0,0,1,2,1,118,10,37,251.818,28,0,0,0,2 372 | 0,0,0,1,2,2,179,51,38,251.818,31,0,0,0,3 373 | 1,0,0,0,2,2,225,26,28,251.818,24,0,1,2,1 374 | 0,0,0,1,2,4,289,36,33,251.818,30,0,2,1,3 375 | 0,0,0,1,2,4,260,50,36,251.818,23,0,4,0,3 376 | 0,0,0,1,2,4,179,51,38,251.818,31,0,0,0,3 377 | 0,0,0,1,2,0,179,51,38,251.818,31,0,0,0,2 378 | 0,0,0,1,2,2,179,51,38,251.818,31,0,0,0,3 379 | 1,0,0,0,2,3,179,51,38,251.818,31,0,0,0,8 380 | 0,0,0,1,2,3,246,25,41,251.818,23,0,0,0,8 381 | 0,0,0,1,2,4,179,51,38,251.818,31,0,0,0,3 382 | 0,0,0,1,2,0,189,29,33,251.818,25,0,2,2,8 383 | 0,0,0,1,2,0,179,51,38,251.818,31,0,0,0,3 384 | 0,0,0,1,2,1,246,25,41,251.818,23,0,0,0,2 385 | 0,0,0,1,2,1,291,31,40,251.818,25,0,1,1,2 386 | 1,0,0,0,2,2,157,27,29,251.818,22,0,0,0,16 387 | 0,0,0,1,2,2,179,51,38,251.818,31,0,0,0,3 388 | 0,0,0,1,2,4,179,51,38,251.818,31,0,0,0,3 389 | 1,0,0,0,2,4,246,25,41,251.818,23,0,0,0,24 390 | 0,0,0,1,2,2,179,51,38,251.818,31,0,0,0,3 391 | 0,0,0,1,2,4,179,51,38,251.818,31,0,0,0,3 392 | 0,0,1,0,3,1,118,10,37,244.387,28,0,0,0,8 393 | 0,0,1,0,3,2,246,25,41,244.387,23,0,0,0,16 394 | 0,0,0,1,3,4,246,25,41,244.387,23,0,0,0,2 395 | 0,0,0,1,3,4,260,50,36,244.387,23,0,4,0,4 396 | 0,0,0,1,3,0,179,51,38,244.387,31,0,0,0,2 397 | 0,0,0,1,3,0,235,11,37,244.387,29,1,1,1,8 398 | 0,0,0,1,3,1,179,22,40,244.387,22,1,2,0,8 399 | 0,0,0,1,3,1,378,49,36,244.387,21,0,2,4,8 400 | 0,0,0,1,3,0,179,51,38,244.387,31,0,0,0,16 401 | 0,0,0,1,3,2,361,52,28,244.387,27,0,1,4,8 402 | 0,0,0,0,3,2,369,17,31,244.387,25,0,3,0,0 403 | 0,0,1,0,3,3,235,11,37,244.387,29,1,1,1,8 404 | 0,0,0,1,3,4,118,13,50,244.387,31,0,1,0,2 405 | 1,0,0,0,3,1,118,13,50,244.387,31,0,1,0,3 406 | 1,0,0,0,3,2,118,13,50,244.387,31,0,1,0,8 407 | 0,0,0,0,3,3,235,11,37,244.387,29,1,1,1,0 408 | 0,0,0,0,3,3,246,25,41,244.387,23,0,0,0,0 409 | 0,0,0,0,3,3,118,13,50,244.387,31,0,1,0,0 410 | 0,0,0,1,3,4,179,51,38,244.387,31,0,0,0,8 411 | 0,0,0,1,3,4,289,36,33,244.387,30,0,2,1,8 412 | 0,0,1,0,3,0,260,50,36,244.387,23,0,4,0,8 413 | 0,0,0,1,3,1,246,25,41,244.387,23,0,0,0,2 414 | 0,0,0,1,4,2,179,51,38,239.409,31,0,0,0,4 415 | 0,0,0,1,4,4,260,50,36,239.409,23,0,4,0,3 416 | 0,0,0,1,4,4,330,16,28,239.409,25,1,0,0,4 417 | 0,0,0,1,4,0,369,17,31,239.409,25,0,3,0,4 418 | 0,0,0,1,4,0,248,25,47,239.409,32,0,2,1,4 419 | 0,0,0,1,4,2,330,16,28,239.409,25,1,0,0,8 420 | 0,0,0,1,4,2,179,51,38,239.409,31,0,0,0,8 421 | 0,0,0,1,4,0,118,13,50,239.409,31,0,1,0,1 422 | 
1,0,0,0,4,2,118,13,50,239.409,31,0,1,0,120 423 | 0,0,0,1,4,4,300,26,43,239.409,25,0,2,1,8 424 | 0,0,0,1,4,4,260,50,36,239.409,23,0,4,0,4 425 | 0,0,0,1,4,0,179,51,38,239.409,31,0,0,0,4 426 | 1,0,0,0,4,2,118,10,37,239.409,28,0,0,0,2 427 | 1,0,0,0,5,0,235,20,43,246.074,38,0,1,0,16 428 | 0,0,0,1,5,2,248,25,47,246.074,32,0,2,1,2 429 | 1,0,0,0,5,0,369,17,31,246.074,25,0,3,0,8 430 | 0,0,0,1,5,2,179,26,30,246.074,19,1,0,0,3 431 | 0,0,0,1,5,2,179,51,38,246.074,31,0,0,0,4 432 | 0,0,0,1,5,3,361,52,28,246.074,27,0,1,4,1 433 | 0,0,0,1,5,4,260,50,36,246.074,23,0,4,0,3 434 | 1,0,0,0,5,0,179,22,40,246.074,22,1,2,0,2 435 | 1,0,0,0,5,0,179,22,40,246.074,22,1,2,0,3 436 | 0,0,1,0,5,2,228,14,58,246.074,22,0,2,1,8 437 | 0,0,0,1,5,2,225,26,28,246.074,24,0,1,2,3 438 | 1,0,0,0,5,4,330,16,28,246.074,25,1,0,0,8 439 | 0,0,0,1,5,0,179,26,30,246.074,19,1,0,0,2 440 | 0,0,0,1,5,0,118,10,37,246.074,28,0,0,0,1 441 | 1,0,0,0,5,0,235,11,37,246.074,29,1,1,1,8 442 | 0,0,0,1,5,2,179,26,30,246.074,19,1,0,0,3 443 | 0,0,0,1,6,0,118,10,37,253.957,28,0,0,0,3 444 | 0,0,0,1,6,0,179,51,38,253.957,31,0,0,0,3 445 | 0,0,0,1,6,1,118,10,37,253.957,28,0,0,0,2 446 | 0,0,0,1,6,3,225,26,28,253.957,24,0,1,2,4 447 | 0,0,0,1,6,4,260,50,36,253.957,23,0,4,0,4 448 | 0,0,0,0,6,4,179,51,38,253.957,31,0,0,0,0 449 | 1,0,0,0,6,0,291,31,40,253.957,25,0,1,1,40 450 | 0,0,0,1,6,0,179,51,38,253.957,31,0,0,0,24 451 | 0,0,0,1,6,1,246,25,41,253.957,23,0,0,0,3 452 | 0,0,0,1,6,0,179,51,38,253.957,31,0,0,0,4 453 | 0,0,0,1,6,1,235,20,43,253.957,38,0,1,0,8 454 | 0,0,0,1,6,0,179,51,38,253.957,31,0,0,0,2 455 | 0,0,0,1,6,2,225,26,28,253.957,24,0,1,2,2 456 | 0,0,0,1,6,2,118,13,50,253.957,31,0,1,0,2 457 | 1,0,0,0,6,2,179,51,38,253.957,31,0,0,0,8 458 | 0,0,1,0,6,2,179,26,30,253.957,19,1,0,0,2 459 | 0,0,0,1,6,4,246,25,41,253.957,23,0,0,0,2 460 | 1,0,0,0,6,1,330,16,28,253.957,25,1,0,0,1 461 | 1,0,0,0,6,1,235,11,37,253.957,29,1,1,1,8 462 | 0,0,0,1,7,3,179,26,30,230.29,19,1,0,0,2 463 | 0,0,0,1,7,3,225,26,28,230.29,24,0,1,2,4 464 | 1,0,0,0,7,4,260,50,36,230.29,23,0,4,0,8 465 | 1,0,0,0,7,0,268,11,33,230.29,25,1,0,0,8 466 | 0,0,0,1,7,4,330,16,28,230.29,25,1,0,0,8 467 | 0,0,0,1,7,4,118,10,37,230.29,28,0,0,0,8 468 | 0,0,0,1,7,0,260,50,36,230.29,23,0,4,0,4 469 | 0,0,0,1,7,1,118,10,37,230.29,28,0,0,0,8 470 | 0,1,0,0,7,0,300,26,43,230.29,25,0,2,1,8 471 | 0,0,0,1,7,0,235,29,48,230.29,33,0,1,5,1 472 | 0,0,0,1,7,1,246,25,41,230.29,23,0,0,0,2 473 | 1,0,0,0,7,1,225,26,28,230.29,24,0,1,2,112 474 | 0,0,0,1,7,1,179,51,38,230.29,31,0,0,0,1 475 | 0,0,0,1,7,4,118,13,50,230.29,31,0,1,0,1 476 | 0,0,0,1,7,4,361,52,28,230.29,27,0,1,4,8 477 | 0,0,0,1,7,0,289,36,33,230.29,30,0,2,1,8 478 | 0,0,0,1,7,0,235,20,43,230.29,38,0,1,0,8 479 | 0,0,0,1,7,1,246,25,41,230.29,23,0,0,0,2 480 | 0,0,0,1,7,3,291,31,40,230.29,25,0,1,1,1 481 | 0,0,0,1,7,3,279,5,39,230.29,24,0,2,0,2 482 | 0,0,0,1,8,3,179,51,38,249.797,31,0,0,0,4 483 | 0,0,0,1,8,0,179,22,40,249.797,22,1,2,0,1 484 | 0,0,0,1,8,1,246,25,41,249.797,23,0,0,0,4 485 | 0,0,0,1,8,1,118,10,37,249.797,28,0,0,0,4 486 | 0,0,0,1,8,1,289,36,33,249.797,30,0,2,1,8 487 | 0,0,0,1,8,1,235,20,43,249.797,38,0,1,0,8 488 | 0,0,0,1,8,3,291,31,40,249.797,25,0,1,1,4 489 | 0,0,0,1,8,0,179,51,38,249.797,31,0,0,0,4 490 | 0,0,0,1,8,1,179,22,40,249.797,22,1,2,0,8 491 | 0,0,0,1,8,3,330,16,28,249.797,25,1,0,0,16 492 | 0,0,0,1,8,1,235,11,37,249.797,29,1,1,1,4 493 | 0,0,0,1,8,1,246,25,41,249.797,23,0,0,0,1 494 | 0,0,0,1,8,1,118,10,37,249.797,28,0,0,0,5 495 | 0,0,0,1,8,3,291,31,40,249.797,25,0,1,1,2 496 | 0,0,0,1,8,0,260,50,36,249.797,23,0,4,0,3 497 | 0,0,0,1,9,1,246,25,41,261.756,23,0,0,0,1 
498 | 0,0,0,1,9,1,246,25,41,261.756,23,0,0,0,1 499 | 0,0,0,1,9,1,118,10,37,261.756,28,0,0,0,3 500 | 0,0,0,1,9,1,155,12,34,261.756,25,0,2,0,2 501 | 0,0,0,1,9,3,291,31,40,261.756,25,0,1,1,2 502 | 0,0,0,1,9,4,179,26,30,261.756,19,1,0,0,8 503 | 0,0,0,1,9,4,248,25,47,261.756,32,0,2,1,1 504 | 0,0,0,1,9,0,179,51,38,261.756,31,0,0,0,4 505 | 0,0,0,1,9,2,225,26,28,261.756,24,0,1,2,1 506 | 0,0,0,1,9,0,179,26,30,261.756,19,1,0,0,2 507 | 0,0,0,1,9,1,369,17,31,261.756,25,0,3,0,8 508 | 0,0,0,1,9,1,361,52,28,261.756,27,0,1,4,8 509 | 1,0,0,0,10,3,289,48,49,284.853,36,0,0,2,1 510 | 1,0,0,0,10,3,235,16,32,284.853,25,1,0,0,3 511 | 0,0,0,1,10,4,246,25,41,284.853,23,0,0,0,8 512 | 1,0,0,0,10,2,289,48,49,284.853,36,0,0,2,3 513 | 0,0,0,1,10,2,291,31,40,284.853,25,0,1,1,2 514 | 0,0,0,1,10,1,118,10,37,284.853,28,0,0,0,2 515 | 0,0,0,1,10,3,289,48,49,284.853,36,0,0,2,2 516 | 0,0,0,1,10,4,291,31,40,284.853,25,0,1,1,1 517 | 0,0,0,1,10,1,225,26,28,284.853,24,0,1,2,2 518 | 0,0,0,1,10,1,369,17,31,284.853,25,0,3,0,8 519 | 0,0,0,1,10,1,369,17,31,284.853,25,0,3,0,3 520 | 0,0,0,1,10,1,225,26,28,284.853,24,0,1,2,4 521 | 0,0,0,1,10,1,369,17,31,284.853,25,0,3,0,8 522 | 0,0,0,1,10,2,179,51,38,284.853,31,0,0,0,3 523 | 1,0,0,0,10,2,228,14,58,284.853,22,0,2,1,1 524 | 0,0,0,1,10,2,291,31,40,284.853,25,0,1,1,1 525 | 1,0,0,0,10,3,369,17,31,284.853,25,0,3,0,8 526 | 1,0,0,0,10,3,225,26,28,284.853,24,0,1,2,1 527 | 1,0,0,0,10,4,369,17,31,284.853,25,0,3,0,8 528 | 1,0,0,0,10,4,225,26,28,284.853,24,0,1,2,3 529 | 0,0,0,1,10,6,189,29,33,284.853,25,0,2,2,8 530 | 1,0,0,0,10,6,235,16,32,284.853,25,1,0,0,8 531 | 1,0,0,0,10,6,248,25,47,284.853,32,0,2,1,8 532 | 0,0,0,0,10,6,225,26,28,284.853,24,0,1,2,0 533 | 1,0,0,0,10,1,225,26,28,284.853,24,0,1,2,3 534 | 0,0,1,0,11,1,179,51,38,268.519,31,0,0,0,1 535 | 0,0,0,1,11,2,118,10,37,268.519,28,0,0,0,3 536 | 1,0,0,0,11,2,330,16,28,268.519,25,1,0,0,24 537 | 0,0,0,1,11,4,179,51,38,268.519,31,0,0,0,1 538 | 1,0,0,0,11,1,118,10,37,268.519,28,0,0,0,8 539 | 0,0,0,1,11,2,289,36,33,268.519,30,0,2,1,8 540 | 1,0,0,0,11,4,235,16,32,268.519,25,1,0,0,8 541 | 0,0,0,1,11,4,225,26,28,268.519,24,0,1,2,4 542 | 0,0,0,1,11,1,361,52,28,268.519,27,0,1,4,8 543 | 0,0,0,1,11,2,291,31,40,268.519,25,0,1,1,2 544 | 1,0,0,0,11,3,118,10,37,268.519,28,0,0,0,2 545 | 1,0,0,0,11,3,225,26,28,268.519,24,0,1,2,3 546 | 0,0,0,1,11,0,179,51,38,268.519,31,0,0,0,1 547 | 0,0,0,1,11,0,118,10,37,268.519,28,0,0,0,8 548 | 1,0,0,0,11,1,118,10,37,268.519,28,0,0,0,8 549 | 0,0,0,1,11,1,225,26,28,268.519,24,0,1,2,2 550 | 0,0,0,0,11,1,291,31,40,268.519,25,0,1,1,0 551 | 0,0,0,0,11,2,289,36,33,268.519,30,0,2,1,0 552 | 1,0,0,0,11,3,248,25,47,268.519,32,0,2,1,4 553 | 0,0,0,0,11,3,235,20,43,268.519,38,0,1,0,0 554 | 0,0,0,1,11,4,225,26,28,268.519,24,0,1,2,2 555 | 0,0,0,1,11,4,369,17,31,268.519,25,0,3,0,8 556 | 0,0,0,1,11,0,361,52,28,268.519,27,0,1,4,2 557 | 1,0,0,0,12,1,179,51,38,280.549,31,0,0,0,32 558 | 0,0,0,1,12,2,291,31,40,280.549,25,0,1,1,1 559 | 0,0,0,1,12,2,225,26,28,280.549,24,0,1,2,3 560 | 1,0,0,0,12,4,179,26,30,280.549,19,1,0,0,1 561 | 0,0,0,1,12,4,225,26,28,280.549,24,0,1,2,3 562 | 0,0,0,1,12,2,225,26,28,280.549,24,0,1,2,3 563 | 1,0,0,0,12,3,361,52,28,280.549,27,0,1,4,4 564 | 0,0,1,0,12,4,179,22,40,280.549,22,1,2,0,2 565 | 0,0,0,1,12,4,235,20,43,280.549,38,0,1,0,8 566 | 0,0,1,0,12,0,233,51,31,280.549,21,1,1,8,8 567 | 1,0,0,0,12,1,179,26,30,280.549,19,1,0,0,16 568 | 0,0,0,1,12,1,225,26,28,280.549,24,0,1,2,2 569 | 0,0,0,1,12,3,225,26,28,280.549,24,0,1,2,3 570 | 0,0,0,1,12,0,225,26,28,280.549,24,0,1,2,2 571 | 0,0,1,0,12,1,155,12,34,280.549,25,0,2,0,80 572 | 
1,0,0,0,1,0,179,26,30,313.532,19,1,0,0,24 573 | 1,0,0,0,1,3,179,26,30,313.532,19,1,0,0,16 574 | 0,0,0,1,1,3,179,22,40,313.532,22,1,2,0,2 575 | 0,0,0,1,1,4,179,22,40,313.532,22,1,2,0,2 576 | 1,0,0,0,1,0,179,26,30,313.532,19,1,0,0,3 577 | 0,0,0,1,1,2,179,22,40,313.532,22,1,2,0,2 578 | 1,0,0,0,1,3,289,48,49,313.532,36,0,0,2,8 579 | 0,0,1,0,1,4,179,22,40,313.532,22,1,2,0,3 580 | 0,0,0,1,1,0,179,26,30,313.532,19,1,0,0,2 581 | 0,0,1,0,1,1,155,12,34,313.532,25,0,2,0,8 582 | 0,0,0,1,1,2,179,26,30,313.532,19,1,0,0,2 583 | 0,0,0,1,1,2,179,51,38,313.532,31,0,0,0,3 584 | 1,0,0,0,1,2,289,36,33,313.532,30,0,2,1,8 585 | 0,0,0,1,1,3,179,51,38,313.532,31,0,0,0,3 586 | 0,0,0,1,1,4,179,51,38,313.532,31,0,0,0,2 587 | 1,0,0,0,2,1,179,51,38,264.249,31,0,0,0,8 588 | 0,0,0,1,2,1,225,26,28,264.249,24,0,1,2,3 589 | 1,0,0,0,2,2,248,25,47,264.249,32,0,2,1,8 590 | 0,0,0,1,2,2,179,51,38,264.249,31,0,0,0,2 591 | 0,0,0,1,2,3,225,26,28,264.249,24,0,1,2,3 592 | 0,0,0,1,2,3,179,51,38,264.249,31,0,0,0,2 593 | 0,0,0,1,2,3,179,26,30,264.249,19,1,0,0,2 594 | 0,0,0,1,2,4,225,15,41,264.249,28,1,2,2,2 595 | 0,0,0,1,2,4,179,51,38,264.249,31,0,0,0,2 596 | 0,0,1,0,2,0,233,51,31,264.249,21,1,1,8,2 597 | 0,0,0,1,2,0,179,51,38,264.249,31,0,0,0,2 598 | 1,0,0,0,2,1,225,26,28,264.249,24,0,1,2,8 599 | 0,0,0,1,2,2,179,51,38,264.249,31,0,0,0,3 600 | 0,0,0,1,2,3,179,51,38,264.249,31,0,0,0,3 601 | 0,0,0,1,2,3,225,26,28,264.249,24,0,1,2,3 602 | 1,0,0,0,2,3,179,26,30,264.249,19,1,0,0,2 603 | 0,0,0,1,2,4,179,22,40,264.249,22,1,2,0,2 604 | 0,0,0,1,2,4,179,51,38,264.249,31,0,0,0,3 605 | 1,0,0,0,2,2,233,51,31,264.249,21,1,1,8,3 606 | 0,0,0,1,2,2,179,26,30,264.249,19,1,0,0,2 607 | 0,0,0,1,2,2,179,51,38,264.249,31,0,0,0,2 608 | 1,0,0,0,2,3,179,51,38,264.249,31,0,0,0,8 609 | 0,0,0,1,2,4,179,51,38,264.249,31,0,0,0,2 610 | 0,0,0,1,2,0,155,12,34,264.249,25,0,2,0,5 611 | 0,0,0,1,2,0,235,16,32,264.249,25,1,0,0,3 612 | 0,0,0,1,2,0,179,51,38,264.249,31,0,0,0,2 613 | 1,0,0,0,2,0,225,26,28,264.249,24,0,1,2,2 614 | 0,0,0,1,2,1,179,51,38,264.249,31,0,0,0,2 615 | 0,0,0,1,2,1,248,25,47,264.249,32,0,2,1,2 616 | 0,0,0,1,2,1,225,26,28,264.249,24,0,1,2,2 617 | 0,0,0,1,2,2,179,51,38,264.249,31,0,0,0,2 618 | 0,0,0,1,2,3,179,51,38,264.249,31,0,0,0,2 619 | 0,0,0,1,2,4,235,16,32,264.249,25,1,0,0,2 620 | 0,0,0,1,3,0,179,51,38,222.196,31,0,0,0,2 621 | 0,0,0,1,3,0,248,25,47,222.196,32,0,2,1,2 622 | 0,0,0,1,3,1,228,14,58,222.196,22,0,2,1,3 623 | 0,0,0,1,3,1,248,25,47,222.196,32,0,2,1,3 624 | 1,0,0,0,3,1,228,14,58,222.196,22,0,2,1,112 625 | 0,0,0,1,3,2,179,51,38,222.196,31,0,0,0,2 626 | 0,0,0,1,3,3,225,26,28,222.196,24,0,1,2,2 627 | 0,0,0,1,3,3,179,51,38,222.196,31,0,0,0,3 628 | 0,0,0,1,3,3,225,26,28,222.196,24,0,1,2,2 629 | 0,0,0,1,3,4,179,26,30,222.196,19,1,0,0,3 630 | 0,0,0,1,3,0,235,16,32,222.196,25,1,0,0,3 631 | 0,0,1,0,3,0,361,52,28,222.196,27,0,1,4,8 632 | 1,0,0,0,3,1,179,51,38,222.196,31,0,0,0,8 633 | 0,0,0,1,3,2,179,51,38,222.196,31,0,0,0,2 634 | 0,0,0,1,3,3,179,51,38,222.196,31,0,0,0,3 635 | 0,0,0,1,3,4,179,26,30,222.196,19,1,0,0,2 636 | 1,0,0,0,3,0,179,51,38,222.196,31,0,0,0,4 637 | 1,0,0,0,3,0,248,25,47,222.196,32,0,2,1,2 638 | 0,0,0,1,3,0,179,51,38,222.196,31,0,0,0,3 639 | 1,0,0,0,3,0,225,26,28,222.196,24,0,1,2,8 640 | 0,0,0,1,3,1,179,51,38,222.196,31,0,0,0,2 641 | 0,0,0,1,3,2,289,36,33,222.196,30,0,2,1,8 642 | 0,0,0,1,3,2,228,14,58,222.196,22,0,2,1,2 643 | 0,0,0,1,3,2,179,51,38,222.196,31,0,0,0,2 644 | 0,0,0,1,3,3,248,25,47,222.196,32,0,2,1,3 645 | 0,0,0,1,3,3,179,51,38,222.196,31,0,0,0,3 646 | 0,0,0,1,3,4,179,26,30,222.196,19,1,0,0,2 647 | 
0,0,0,1,3,4,179,51,38,222.196,31,0,0,0,3 648 | 0,0,0,1,3,1,179,51,38,222.196,31,0,0,0,3 649 | 0,0,0,1,3,2,118,15,46,222.196,25,0,2,0,8 650 | 1,0,0,0,3,2,155,12,34,222.196,25,0,2,0,24 651 | 0,0,0,1,3,2,179,51,38,222.196,31,0,0,0,3 652 | 0,0,0,1,3,3,179,51,38,222.196,31,0,0,0,3 653 | 1,0,0,0,3,0,179,26,30,222.196,19,1,0,0,2 654 | 0,0,1,0,3,0,289,36,33,222.196,30,0,2,1,104 655 | 0,0,0,1,3,2,369,17,31,222.196,25,0,3,0,8 656 | 1,0,0,0,4,0,225,26,28,246.288,24,0,1,2,8 657 | 1,0,0,0,4,0,118,10,37,246.288,28,0,0,0,8 658 | 0,0,1,0,4,1,361,52,28,246.288,27,0,1,4,8 659 | 0,0,1,0,4,2,248,25,47,246.288,32,0,2,1,8 660 | 1,0,0,0,4,3,189,29,33,246.288,25,0,2,2,8 661 | 0,0,0,1,4,4,179,26,30,246.288,19,1,0,0,2 662 | 1,0,0,0,4,0,369,17,31,246.288,25,0,3,0,24 663 | 0,1,0,0,4,1,179,22,40,246.288,22,1,2,0,2 664 | 0,0,0,1,4,1,118,13,50,246.288,31,0,1,0,3 665 | 0,0,0,1,4,1,361,52,28,246.288,27,0,1,4,2 666 | 1,0,0,0,4,2,118,10,37,246.288,28,0,0,0,2 667 | 0,0,0,1,4,4,235,11,37,246.288,29,1,1,1,8 668 | 0,0,0,1,4,4,179,26,30,246.288,19,1,0,0,2 669 | 0,0,1,0,4,0,225,26,28,246.288,24,0,1,2,8 670 | 0,1,0,0,4,1,235,16,32,246.288,25,1,0,0,3 671 | 0,0,0,1,4,4,179,26,30,246.288,19,1,0,0,2 672 | 0,0,0,1,4,1,155,12,34,246.288,25,0,2,0,4 673 | 0,0,1,0,4,3,225,26,28,246.288,24,0,1,2,8 674 | 1,0,0,0,4,3,118,13,50,246.288,31,0,1,0,2 675 | 0,0,0,1,4,4,179,26,30,246.288,19,1,0,0,2 676 | 0,0,0,1,5,0,235,11,37,237.656,29,1,1,1,8 677 | 0,0,1,0,5,2,225,15,41,237.656,28,1,2,2,3 678 | 0,0,0,1,5,2,235,16,32,237.656,25,1,0,0,2 679 | 1,0,0,0,5,2,118,10,37,237.656,28,0,0,0,3 680 | 0,0,0,1,5,2,235,20,43,237.656,38,0,1,0,8 681 | 1,0,0,0,5,3,179,26,30,237.656,19,1,0,0,1 682 | 0,0,0,1,5,3,291,31,40,237.656,25,0,1,1,2 683 | 1,0,0,0,5,3,225,15,41,237.656,28,1,2,2,8 684 | 0,0,1,0,5,4,300,26,43,237.656,25,0,2,1,64 685 | 0,0,0,1,5,4,225,15,41,237.656,28,1,2,2,8 686 | 0,0,0,1,5,4,179,26,30,237.656,19,1,0,0,2 687 | 0,0,0,1,5,0,118,13,50,237.656,31,0,1,0,2 688 | 1,0,0,0,5,1,118,13,50,237.656,31,0,1,0,3 689 | 0,0,0,1,5,1,118,10,37,237.656,28,0,0,0,1 690 | 0,0,0,0,5,1,118,13,50,237.656,31,0,1,0,0 691 | 0,0,0,1,5,2,179,26,30,237.656,19,1,0,0,2 692 | 0,0,0,0,5,2,378,49,36,237.656,21,0,2,4,0 693 | 0,1,0,0,5,4,179,22,40,237.656,22,1,2,0,1 694 | 1,0,0,0,5,0,155,12,34,237.656,25,0,2,0,48 695 | 1,0,0,0,5,0,235,16,32,237.656,25,1,0,0,8 696 | 0,0,0,1,5,2,291,31,40,237.656,25,0,1,1,8 697 | 1,0,0,0,5,2,179,22,40,237.656,22,1,2,0,8 698 | 1,0,0,0,5,2,225,26,28,237.656,24,0,1,2,3 699 | 1,0,0,0,5,3,330,16,28,237.656,25,1,0,0,8 700 | 0,0,0,1,5,3,235,16,32,237.656,25,1,0,0,2 701 | 0,0,0,1,5,3,291,31,40,237.656,25,0,1,1,2 702 | -------------------------------------------------------------------------------- /6. 
Machine Learning/52 Exploring the Problem from a Machine Learning Point of View/Absenteeism_preprocessed.csv: -------------------------------------------------------------------------------- 1 | Reason_1,Reason_2,Reason_3,Reason_4,Month Value,Day of the Week,Transportation Expense,Distance to Work,Age,Daily Work Load Average,Body Mass Index,Education,Children,Pet,Absenteeism Time in Hours 2 | 0,0,0,1,7,1,289,36,33,239.554,30,0,2,1,4 3 | 0,0,0,0,7,1,118,13,50,239.554,31,0,1,0,0 4 | 0,0,0,1,7,2,179,51,38,239.554,31,0,0,0,2 5 | 1,0,0,0,7,3,279,5,39,239.554,24,0,2,0,4 6 | 0,0,0,1,7,3,289,36,33,239.554,30,0,2,1,2 7 | 0,0,0,1,10,2,179,51,38,239.554,31,0,0,0,2 8 | 0,0,0,1,7,4,361,52,28,239.554,27,0,1,4,8 9 | 0,0,0,1,7,4,260,50,36,239.554,23,0,4,0,4 10 | 0,0,1,0,6,6,155,12,34,239.554,25,0,2,0,40 11 | 0,0,0,1,7,0,235,11,37,239.554,29,1,1,1,8 12 | 1,0,0,0,7,0,260,50,36,239.554,23,0,4,0,8 13 | 1,0,0,0,7,1,260,50,36,239.554,23,0,4,0,8 14 | 1,0,0,0,7,2,260,50,36,239.554,23,0,4,0,8 15 | 1,0,0,0,7,2,179,51,38,239.554,31,0,0,0,1 16 | 0,0,0,1,7,2,179,51,38,239.554,31,0,0,0,4 17 | 1,0,0,0,7,4,246,25,41,239.554,23,0,0,0,8 18 | 0,0,0,1,7,4,179,51,38,239.554,31,0,0,0,2 19 | 0,0,1,0,7,0,179,51,38,239.554,31,0,0,0,8 20 | 1,0,0,0,7,3,189,29,33,239.554,25,0,2,2,8 21 | 0,0,0,1,5,4,248,25,47,205.917,32,0,2,1,2 22 | 1,0,0,0,12,1,330,16,28,205.917,25,1,0,0,8 23 | 1,0,0,0,3,6,179,51,38,205.917,31,0,0,0,1 24 | 1,0,0,0,10,3,361,52,28,205.917,27,0,1,4,40 25 | 0,0,0,1,8,4,260,50,36,205.917,23,0,4,0,4 26 | 0,0,1,0,8,0,289,36,33,205.917,30,0,2,1,8 27 | 0,0,0,1,8,0,361,52,28,205.917,27,0,1,4,7 28 | 0,0,0,1,4,2,289,36,33,205.917,30,0,2,1,1 29 | 0,0,0,1,12,1,157,27,29,205.917,22,0,0,0,4 30 | 0,0,1,0,8,2,289,36,33,205.917,30,0,2,1,8 31 | 0,0,0,1,8,4,179,51,38,205.917,31,0,0,0,2 32 | 0,0,1,0,8,0,179,51,38,205.917,31,0,0,0,8 33 | 0,0,1,0,8,3,235,29,48,205.917,33,0,1,5,8 34 | 0,0,0,1,8,3,235,11,37,205.917,29,1,1,1,4 35 | 0,0,1,0,8,0,235,29,48,205.917,33,0,1,5,8 36 | 0,0,0,1,8,0,179,51,38,205.917,31,0,0,0,2 37 | 0,0,0,1,8,0,361,52,28,205.917,27,0,1,4,1 38 | 0,0,0,1,4,2,289,36,33,205.917,30,0,2,1,8 39 | 1,0,0,0,8,3,291,50,32,205.917,23,0,0,0,4 40 | 0,0,0,1,8,4,235,29,48,205.917,33,0,1,5,8 41 | 0,0,0,1,8,4,260,50,36,205.917,23,0,4,0,4 42 | 0,0,0,1,1,4,184,42,27,241.476,21,0,0,0,2 43 | 0,0,0,1,7,3,118,10,37,241.476,28,0,0,0,4 44 | 0,0,0,1,1,4,179,51,38,241.476,31,0,0,0,4 45 | 0,0,1,0,8,6,235,20,43,241.476,38,0,1,0,8 46 | 0,0,0,1,9,2,155,12,34,241.476,25,0,2,0,2 47 | 0,0,0,1,9,6,118,10,37,241.476,28,0,0,0,3 48 | 0,0,0,1,9,0,179,51,38,241.476,31,0,0,0,3 49 | 0,0,0,1,9,3,291,31,40,241.476,25,0,1,1,4 50 | 0,0,0,1,4,3,260,50,36,241.476,23,0,4,0,8 51 | 1,0,0,0,9,0,291,31,40,241.476,25,0,1,1,32 52 | 0,0,0,0,9,0,260,50,36,241.476,23,0,4,0,0 53 | 0,0,0,0,9,0,225,26,28,241.476,24,0,1,2,0 54 | 0,0,0,1,8,6,225,26,28,241.476,24,0,1,2,2 55 | 0,0,0,1,9,1,118,10,37,241.476,28,0,0,0,2 56 | 0,0,0,0,9,1,289,36,33,241.476,30,0,2,1,0 57 | 0,0,0,0,9,1,118,13,50,241.476,31,0,1,0,0 58 | 0,0,1,0,9,2,225,26,28,241.476,24,0,1,2,3 59 | 0,0,0,1,9,2,179,51,38,241.476,31,0,0,0,3 60 | 0,0,0,0,9,2,369,17,31,241.476,25,0,3,0,0 61 | 0,0,0,1,11,0,248,25,47,241.476,32,0,2,1,1 62 | 0,0,0,1,9,4,179,51,38,241.476,31,0,0,0,3 63 | 0,0,0,1,9,4,260,50,36,241.476,23,0,4,0,4 64 | 0,0,0,1,6,2,179,51,38,253.465,31,0,0,0,3 65 | 0,0,0,1,10,1,118,10,37,253.465,28,0,0,0,3 66 | 0,0,0,0,10,2,118,13,50,253.465,31,0,1,0,0 67 | 0,0,0,1,10,3,179,26,30,253.465,19,1,0,0,1 68 | 0,0,0,1,10,4,179,51,38,253.465,31,0,0,0,3 69 | 0,0,0,1,10,4,225,26,28,253.465,24,0,1,2,3 70 | 
0,0,0,1,6,2,118,10,37,253.465,28,0,0,0,3 71 | 0,0,0,1,10,2,225,26,28,253.465,24,0,1,2,2 72 | 0,0,0,1,10,2,248,25,47,253.465,32,0,2,1,2 73 | 0,0,0,1,10,3,291,31,40,253.465,25,0,1,1,5 74 | 0,0,0,1,10,2,179,51,38,253.465,31,0,0,0,8 75 | 0,0,0,1,10,2,225,26,28,253.465,24,0,1,2,3 76 | 0,0,1,0,10,3,260,50,36,253.465,23,0,4,0,16 77 | 1,0,0,0,10,1,291,31,40,253.465,25,0,1,1,8 78 | 0,0,0,1,10,1,225,26,28,253.465,24,0,1,2,2 79 | 0,0,0,1,10,2,289,36,33,253.465,30,0,2,1,8 80 | 0,0,0,1,10,4,361,52,28,253.465,27,0,1,4,1 81 | 0,0,0,1,10,4,260,50,36,253.465,23,0,4,0,3 82 | 0,0,0,1,5,0,179,51,38,306.345,31,0,0,0,1 83 | 0,0,0,1,4,5,225,26,28,306.345,24,0,1,2,1 84 | 1,0,0,0,5,0,179,51,38,306.345,31,0,0,0,8 85 | 0,0,1,0,12,4,179,22,40,306.345,22,1,2,0,8 86 | 0,0,0,1,11,3,291,31,40,306.345,25,0,1,1,5 87 | 1,0,0,0,2,2,155,12,34,306.345,25,0,2,0,32 88 | 0,0,0,1,9,4,189,29,33,306.345,25,0,2,2,8 89 | 1,0,0,0,11,0,291,31,40,306.345,25,0,1,1,40 90 | 0,0,0,1,11,2,225,26,28,306.345,24,0,1,2,1 91 | 1,0,0,0,11,4,155,12,34,306.345,25,0,2,0,8 92 | 0,0,0,1,11,2,225,26,28,306.345,24,0,1,2,3 93 | 0,0,1,0,11,2,179,22,40,306.345,22,1,2,0,8 94 | 1,0,0,0,11,4,225,26,28,306.345,24,0,1,2,3 95 | 0,0,0,1,11,4,260,50,36,306.345,23,0,4,0,4 96 | 0,0,0,1,2,2,248,25,47,306.345,32,0,2,1,1 97 | 0,0,0,1,10,6,225,26,28,306.345,24,0,1,2,3 98 | 1,0,0,0,11,2,289,36,33,306.345,30,0,2,1,24 99 | 0,0,0,1,11,3,291,31,40,306.345,25,0,1,1,3 100 | 0,0,0,1,1,0,248,25,47,261.306,32,0,2,1,1 101 | 0,0,1,0,1,0,118,10,37,261.306,28,0,0,0,64 102 | 0,0,0,1,2,3,118,13,50,261.306,31,0,1,0,2 103 | 0,0,0,1,2,3,235,11,37,261.306,29,1,1,1,8 104 | 0,0,0,1,3,3,225,26,28,261.306,24,0,1,2,2 105 | 0,0,0,1,4,6,260,50,36,261.306,23,0,4,0,8 106 | 0,0,1,0,8,2,118,10,37,261.306,28,0,0,0,56 107 | 0,0,0,1,9,5,361,52,28,261.306,27,0,1,4,8 108 | 0,0,0,1,10,0,225,26,28,261.306,24,0,1,2,3 109 | 0,0,0,1,11,3,260,50,36,261.306,23,0,4,0,3 110 | 0,0,0,1,12,1,225,26,28,261.306,24,0,1,2,2 111 | 0,0,0,1,12,2,361,52,28,261.306,27,0,1,4,8 112 | 0,0,0,1,11,3,118,10,37,261.306,28,0,0,0,2 113 | 0,0,1,0,12,4,246,25,41,261.306,23,0,0,0,8 114 | 0,0,0,1,12,4,225,26,28,261.306,24,0,1,2,2 115 | 0,0,0,1,6,2,225,26,28,308.593,24,0,1,2,1 116 | 0,0,1,0,4,4,118,10,37,308.593,28,0,0,0,1 117 | 0,0,0,1,5,6,118,10,37,308.593,28,0,0,0,1 118 | 0,0,1,0,5,6,155,12,34,308.593,25,0,2,0,8 119 | 0,0,0,1,6,2,225,26,28,308.593,24,0,1,2,2 120 | 0,0,0,1,7,4,184,42,27,308.593,21,0,0,0,2 121 | 0,0,0,1,7,4,225,26,28,308.593,24,0,1,2,2 122 | 0,0,0,1,8,0,225,26,28,308.593,24,0,1,2,1 123 | 0,0,0,1,11,1,118,10,37,308.593,28,0,0,0,2 124 | 0,0,0,1,12,3,225,26,28,308.593,24,0,1,2,2 125 | 0,0,0,1,12,3,118,10,37,308.593,28,0,0,0,2 126 | 0,0,0,1,1,2,118,10,37,308.593,28,0,0,0,2 127 | 0,0,0,1,1,3,118,10,37,308.593,28,0,0,0,2 128 | 0,0,0,1,1,4,118,10,37,308.593,28,0,0,0,2 129 | 0,0,0,1,11,1,118,10,37,308.593,28,0,0,0,2 130 | 0,0,0,1,12,3,118,10,37,308.593,28,0,0,0,2 131 | 0,0,1,0,12,3,179,26,30,308.593,19,1,0,0,8 132 | 0,0,1,0,12,3,289,36,33,308.593,30,0,2,1,8 133 | 0,0,0,1,1,2,118,10,37,308.593,28,0,0,0,2 134 | 0,0,0,1,1,3,184,42,27,308.593,21,0,0,0,2 135 | 0,0,0,1,1,3,118,10,37,308.593,28,0,0,0,2 136 | 0,0,0,1,1,0,118,10,37,308.593,28,0,0,0,0 137 | 0,0,0,1,1,1,225,26,28,308.593,24,0,1,2,1 138 | 0,0,0,1,1,3,289,36,33,308.593,30,0,2,1,3 139 | 0,0,0,1,5,0,184,42,27,302.585,21,0,0,0,1 140 | 1,0,0,0,3,2,246,25,41,302.585,23,0,0,0,8 141 | 1,0,0,0,10,6,179,51,38,302.585,31,0,0,0,8 142 | 0,0,0,1,11,2,155,12,34,302.585,25,0,2,0,2 143 | 0,0,0,1,11,2,189,29,33,302.585,25,0,2,2,8 144 | 0,0,0,1,12,4,260,50,36,302.585,23,0,4,0,2 145 | 
0,0,0,1,12,4,289,36,33,302.585,30,0,2,1,8 146 | 1,0,0,0,8,1,388,15,50,302.585,24,0,0,0,8 147 | 1,0,0,0,9,4,388,15,50,302.585,24,0,0,0,8 148 | 0,0,0,1,2,0,225,26,28,302.585,24,0,1,2,2 149 | 0,0,0,1,9,4,225,26,28,302.585,24,0,1,2,2 150 | 0,0,0,1,2,1,179,26,30,302.585,19,1,0,0,1 151 | 0,0,0,1,2,1,184,42,27,302.585,21,0,0,0,8 152 | 0,0,0,1,2,3,225,26,28,302.585,24,0,1,2,3 153 | 0,0,1,0,2,0,330,16,28,302.585,25,1,0,0,8 154 | 0,0,0,1,2,1,330,16,28,302.585,25,1,0,0,1 155 | 0,0,0,1,2,2,225,26,28,302.585,24,0,1,2,1 156 | 0,0,1,0,2,3,189,29,33,302.585,25,0,2,2,8 157 | 0,0,0,1,1,6,291,50,32,343.253,23,0,0,0,2 158 | 0,0,1,0,1,6,260,50,36,343.253,23,0,4,0,8 159 | 0,0,1,0,8,2,157,27,29,343.253,22,0,0,0,3 160 | 0,1,0,0,3,1,179,22,40,343.253,22,1,2,0,8 161 | 0,0,0,1,3,2,291,31,40,343.253,25,0,1,1,8 162 | 1,0,0,0,2,2,260,50,36,343.253,23,0,4,0,8 163 | 1,0,0,0,3,3,179,26,30,343.253,19,1,0,0,8 164 | 1,0,0,0,4,6,248,25,47,343.253,32,0,2,1,3 165 | 1,0,0,0,4,6,260,50,36,343.253,23,0,4,0,40 166 | 1,0,0,0,7,6,179,22,40,343.253,22,1,2,0,40 167 | 1,0,0,0,3,0,155,12,34,343.253,25,0,2,0,16 168 | 0,0,0,1,3,0,260,50,36,343.253,23,0,4,0,16 169 | 1,0,0,0,3,1,155,12,34,343.253,25,0,2,0,8 170 | 1,0,0,0,3,3,289,36,33,343.253,30,0,2,1,8 171 | 1,0,0,0,3,3,179,22,40,343.253,22,1,2,0,8 172 | 0,0,0,1,3,4,260,50,36,343.253,23,0,4,0,4 173 | 0,0,0,1,3,4,225,26,28,343.253,24,0,1,2,1 174 | 1,0,0,0,3,0,279,5,39,343.253,24,0,2,0,8 175 | 1,0,0,0,3,1,179,51,38,343.253,31,0,0,0,24 176 | 0,0,0,1,3,2,225,26,28,343.253,24,0,1,2,2 177 | 1,0,0,0,3,0,225,26,28,343.253,24,0,1,2,8 178 | 1,0,0,0,3,0,179,26,30,343.253,19,1,0,0,1 179 | 1,0,0,0,3,1,225,26,28,343.253,24,0,1,2,8 180 | 1,0,0,0,3,2,225,26,28,343.253,24,0,1,2,16 181 | 1,0,0,0,3,2,179,51,38,343.253,31,0,0,0,3 182 | 1,0,0,0,3,3,279,5,39,343.253,24,0,2,0,16 183 | 0,0,0,1,3,4,225,26,28,343.253,24,0,1,2,2 184 | 1,0,0,0,3,4,248,25,47,343.253,32,0,2,1,3 185 | 0,0,0,1,3,0,225,26,28,343.253,24,0,1,2,1 186 | 0,0,0,1,6,5,291,31,40,326.452,25,0,1,1,1 187 | 0,0,0,1,4,2,225,26,28,326.452,24,0,1,2,1 188 | 0,0,0,1,12,6,155,12,34,326.452,25,0,2,0,1 189 | 1,0,0,0,4,2,246,25,41,326.452,23,0,0,0,24 190 | 0,0,0,1,4,3,155,12,34,326.452,25,0,2,0,1 191 | 0,0,0,1,4,4,225,26,28,326.452,24,0,1,2,2 192 | 0,0,0,1,4,4,260,50,36,326.452,23,0,4,0,4 193 | 1,0,0,0,6,5,179,51,38,326.452,31,0,0,0,24 194 | 0,0,0,1,7,0,118,13,50,326.452,31,0,1,0,1 195 | 0,0,0,1,8,3,291,31,40,326.452,25,0,1,1,3 196 | 1,0,0,0,8,3,246,25,41,326.452,23,0,0,0,8 197 | 0,0,0,1,4,4,291,31,40,326.452,25,0,1,1,1 198 | 0,0,0,1,4,4,248,25,47,326.452,32,0,2,1,8 199 | 0,0,1,0,4,4,260,50,36,326.452,23,0,4,0,56 200 | 0,0,1,0,4,1,289,36,33,326.452,30,0,2,1,8 201 | 1,0,0,0,4,2,155,12,34,326.452,25,0,2,0,24 202 | 0,0,1,0,4,2,378,49,36,326.452,21,0,2,4,8 203 | 1,0,0,0,4,3,289,36,33,326.452,30,0,2,1,16 204 | 1,0,0,0,8,3,235,11,37,326.452,29,1,1,1,3 205 | 0,0,0,0,4,0,235,29,48,326.452,33,0,1,5,0 206 | 1,0,0,0,4,1,289,36,33,378.884,30,0,2,1,8 207 | 0,0,0,1,5,3,155,12,34,378.884,25,0,2,0,2 208 | 0,0,0,1,2,4,155,12,34,378.884,25,0,2,0,1 209 | 0,0,1,0,3,5,179,51,38,378.884,31,0,0,0,8 210 | 0,0,1,0,3,5,225,26,28,378.884,24,0,1,2,8 211 | 1,0,0,0,11,5,184,42,27,378.884,21,0,0,0,4 212 | 0,0,0,1,9,0,155,12,34,378.884,25,0,2,0,2 213 | 1,0,0,0,10,2,179,51,38,378.884,31,0,0,0,1 214 | 1,0,0,0,4,1,289,36,33,378.884,30,0,2,1,24 215 | 0,0,0,0,11,5,279,5,39,378.884,24,0,2,0,0 216 | 0,0,0,0,5,2,330,16,28,378.884,25,1,0,0,0 217 | 0,0,0,0,5,2,378,49,36,378.884,21,0,2,4,0 218 | 0,0,0,0,5,2,388,15,50,378.884,24,0,0,0,0 219 | 1,0,0,0,5,1,179,51,38,378.884,31,0,0,0,1 220 | 
1,0,0,0,11,5,118,13,50,378.884,31,0,1,0,24 221 | 0,0,0,1,5,4,361,52,28,378.884,27,0,1,4,8 222 | 0,0,1,0,6,0,246,25,41,377.55,23,0,0,0,8 223 | 0,0,0,1,6,0,361,52,28,377.55,27,0,1,4,8 224 | 1,0,0,0,6,1,246,25,41,377.55,23,0,0,0,24 225 | 0,0,0,1,6,3,291,31,40,377.55,25,0,1,1,4 226 | 1,0,0,0,6,4,246,25,41,377.55,23,0,0,0,8 227 | 1,0,0,0,6,0,179,51,38,377.55,31,0,0,0,8 228 | 0,0,0,1,6,0,155,12,34,377.55,25,0,2,0,4 229 | 1,0,0,0,6,0,246,25,41,377.55,23,0,0,0,8 230 | 1,0,0,0,6,2,118,13,50,377.55,31,0,1,0,8 231 | 1,0,0,0,3,6,235,11,37,377.55,29,1,1,1,16 232 | 0,0,0,1,7,2,118,13,50,377.55,31,0,1,0,1 233 | 1,0,0,0,8,5,118,13,50,377.55,31,0,1,0,80 234 | 0,0,0,1,9,1,378,49,36,377.55,21,0,2,4,8 235 | 1,0,0,0,10,3,179,51,38,377.55,31,0,0,0,2 236 | 0,0,0,1,6,0,289,48,49,377.55,36,0,0,2,2 237 | 0,0,0,1,6,3,225,26,28,377.55,24,0,1,2,2 238 | 0,0,1,0,5,5,155,12,34,275.312,25,0,2,0,16 239 | 1,0,0,0,6,1,118,13,50,275.312,31,0,1,0,8 240 | 1,0,0,0,8,6,118,10,37,275.312,28,0,0,0,8 241 | 0,0,0,1,8,6,118,10,37,275.312,28,0,0,0,4 242 | 0,0,0,1,12,2,330,16,28,275.312,25,1,0,0,8 243 | 0,0,1,0,7,3,179,26,30,275.312,19,1,0,0,8 244 | 0,0,0,1,7,4,155,12,34,275.312,25,0,2,0,2 245 | 1,0,0,0,7,0,330,16,28,275.312,25,1,0,0,8 246 | 1,0,0,0,7,1,330,16,28,275.312,25,1,0,0,8 247 | 0,0,0,1,7,0,157,27,29,275.312,22,0,0,0,3 248 | 0,0,0,1,7,1,361,52,28,275.312,27,0,1,4,8 249 | 0,0,0,1,7,2,289,36,33,275.312,30,0,2,1,8 250 | 0,0,0,1,7,3,179,51,38,275.312,31,0,0,0,8 251 | 0,0,1,0,11,0,289,36,33,275.312,30,0,2,1,32 252 | 0,0,1,0,7,3,289,36,33,275.312,30,0,2,1,8 253 | 0,0,0,0,7,3,260,50,36,275.312,23,0,4,0,0 254 | 0,0,1,0,4,4,289,36,33,265.615,30,0,2,1,8 255 | 0,0,1,0,11,1,157,27,29,265.615,22,0,0,0,3 256 | 0,0,0,1,7,4,289,36,33,265.615,30,0,2,1,1 257 | 0,0,1,0,8,0,228,14,58,265.615,22,0,2,1,8 258 | 1,0,0,0,10,5,300,26,43,265.615,25,0,2,1,1 259 | 1,0,0,0,10,5,300,26,43,265.615,25,0,2,1,2 260 | 0,0,0,1,11,1,260,50,36,265.615,23,0,4,0,4 261 | 0,0,0,1,8,0,289,36,33,265.615,30,0,2,1,4 262 | 0,0,0,1,9,3,248,25,47,265.615,32,0,2,1,1 263 | 1,0,0,0,10,5,268,11,33,265.615,25,1,0,0,8 264 | 0,0,0,1,10,5,179,26,30,265.615,19,1,0,0,1 265 | 1,0,0,0,10,5,118,13,50,265.615,31,0,1,0,3 266 | 0,0,0,1,8,6,248,25,47,265.615,32,0,2,1,2 267 | 0,0,0,1,8,0,235,11,37,265.615,29,1,1,1,1 268 | 0,0,0,1,8,2,118,13,50,265.615,31,0,1,0,1 269 | 0,0,1,0,8,2,235,11,37,265.615,29,1,1,1,8 270 | 1,0,0,0,8,0,361,52,28,265.615,27,0,1,4,8 271 | 1,0,0,0,8,1,184,42,27,265.615,21,0,0,0,8 272 | 1,0,0,0,5,0,179,51,38,294.217,31,0,0,0,8 273 | 0,0,0,1,9,4,179,51,38,294.217,31,0,0,0,3 274 | 0,0,1,0,7,5,289,36,33,294.217,30,0,2,1,24 275 | 0,0,0,0,8,1,235,20,43,294.217,38,0,1,0,0 276 | 1,0,0,0,12,4,246,25,41,294.217,23,0,0,0,16 277 | 0,0,0,1,9,1,291,31,40,294.217,25,0,1,1,3 278 | 0,0,0,0,9,1,231,35,39,294.217,35,0,2,2,0 279 | 0,0,0,0,9,1,291,50,32,294.217,23,0,0,0,0 280 | 1,0,0,0,9,2,179,51,38,294.217,31,0,0,0,8 281 | 1,0,0,0,9,2,246,25,41,294.217,23,0,0,0,32 282 | 0,0,0,1,9,3,179,51,38,294.217,31,0,0,0,1 283 | 0,0,0,1,9,4,291,31,40,294.217,25,0,1,1,4 284 | 0,0,0,1,9,4,260,50,36,294.217,23,0,4,0,4 285 | 0,0,0,1,9,2,235,20,43,294.217,38,0,1,0,8 286 | 0,0,0,1,9,3,118,13,50,294.217,31,0,1,0,1 287 | 0,0,0,0,9,3,235,20,43,294.217,38,0,1,0,0 288 | 0,0,0,1,9,4,291,31,40,294.217,25,0,1,1,3 289 | 1,0,0,0,9,0,291,31,40,294.217,25,0,1,1,40 290 | 1,0,0,0,9,0,179,51,38,294.217,31,0,0,0,8 291 | 0,0,0,1,5,1,289,36,33,265.017,30,0,2,1,8 292 | 0,0,0,1,5,1,235,11,37,265.017,29,1,1,1,4 293 | 0,0,0,1,5,1,289,36,33,265.017,30,0,2,1,8 294 | 0,0,0,1,6,4,289,36,33,265.017,30,0,2,1,8 295 | 
0,0,0,0,6,4,118,13,50,265.017,31,0,1,0,0 296 | 0,0,0,0,6,4,248,25,47,265.017,32,0,2,1,0 297 | 1,0,0,0,12,5,179,26,30,265.017,19,1,0,0,8 298 | 1,0,0,0,12,5,118,10,37,265.017,28,0,0,0,3 299 | 0,0,0,1,12,5,369,17,31,265.017,25,0,3,0,8 300 | 0,0,0,1,10,4,179,51,38,265.017,31,0,0,0,1 301 | 1,0,0,0,10,4,179,26,30,265.017,19,1,0,0,64 302 | 0,0,0,0,10,4,235,20,43,265.017,38,0,1,0,0 303 | 0,0,1,0,10,5,289,36,33,265.017,30,0,2,1,16 304 | 0,0,0,1,10,6,260,50,36,265.017,23,0,4,0,3 305 | 0,0,0,0,10,6,235,20,43,265.017,38,0,1,0,0 306 | 0,0,0,1,10,2,235,20,43,265.017,38,0,1,0,2 307 | 0,0,0,1,10,2,235,20,43,265.017,38,0,1,0,2 308 | 0,0,0,1,10,3,118,13,50,265.017,31,0,1,0,1 309 | 0,0,0,1,10,3,291,31,40,265.017,25,0,1,1,4 310 | 0,0,0,1,10,5,179,26,30,265.017,19,1,0,0,16 311 | 0,0,0,1,10,5,118,13,50,265.017,31,0,1,0,1 312 | 1,0,0,0,10,2,361,52,28,265.017,27,0,1,4,8 313 | 0,0,0,0,10,3,260,50,36,265.017,23,0,4,0,0 314 | 0,0,0,0,10,3,291,31,40,265.017,25,0,1,1,0 315 | 0,0,0,0,10,3,157,27,29,265.017,22,0,0,0,0 316 | 1,0,0,0,10,4,179,26,30,265.017,19,1,0,0,5 317 | 1,0,0,0,10,4,179,26,30,265.017,19,1,0,0,5 318 | 0,0,0,1,10,5,118,13,50,265.017,31,0,1,0,1 319 | 1,0,0,0,7,0,118,10,37,284.031,28,0,0,0,8 320 | 0,0,0,1,7,0,248,25,47,284.031,32,0,2,1,2 321 | 1,0,0,0,8,3,179,51,38,284.031,31,0,0,0,8 322 | 0,0,0,1,11,4,260,50,36,284.031,23,0,4,0,3 323 | 0,0,0,1,11,0,291,31,40,284.031,25,0,1,1,1 324 | 1,0,0,0,11,0,378,49,36,284.031,21,0,2,4,8 325 | 1,0,0,0,11,0,155,12,34,284.031,25,0,2,0,120 326 | 0,0,0,1,11,0,235,20,43,284.031,38,0,1,0,8 327 | 0,0,0,0,11,1,330,16,28,284.031,25,1,0,0,0 328 | 0,0,1,0,11,2,235,11,37,284.031,29,1,1,1,1 329 | 1,0,0,0,11,2,118,10,37,284.031,28,0,0,0,3 330 | 0,0,0,1,11,3,235,11,37,284.031,29,1,1,1,2 331 | 0,0,0,1,11,3,179,51,38,284.031,31,0,0,0,3 332 | 1,0,0,0,11,4,246,25,41,284.031,23,0,0,0,8 333 | 1,0,0,0,11,4,291,31,40,284.031,25,0,1,1,4 334 | 1,0,0,0,11,0,246,25,41,284.031,23,0,0,0,8 335 | 0,0,0,1,11,1,179,51,38,284.031,31,0,0,0,1 336 | 1,0,0,0,11,2,260,50,36,284.031,23,0,4,0,8 337 | 0,1,0,0,11,4,260,50,36,284.031,23,0,4,0,8 338 | 0,0,0,0,11,4,378,49,36,284.031,21,0,2,4,0 339 | 0,0,0,0,11,1,279,5,39,284.031,24,0,2,0,0 340 | 0,0,0,1,11,3,179,51,38,284.031,31,0,0,0,1 341 | 1,0,0,0,5,3,225,26,28,236.629,24,0,1,2,3 342 | 0,0,0,1,5,3,179,51,38,236.629,31,0,0,0,2 343 | 0,0,0,1,5,3,179,51,38,236.629,31,0,0,0,1 344 | 0,0,0,1,5,3,235,11,37,236.629,29,1,1,1,3 345 | 0,0,0,1,6,6,118,13,50,236.629,31,0,1,0,1 346 | 0,0,0,1,9,0,260,50,36,236.629,23,0,4,0,4 347 | 1,0,0,0,12,3,246,25,41,236.629,23,0,0,0,8 348 | 0,0,0,1,12,3,179,51,38,236.629,31,0,0,0,1 349 | 0,0,0,1,12,4,179,51,38,236.629,31,0,0,0,1 350 | 0,0,0,1,12,1,179,26,30,236.629,19,1,0,0,1 351 | 0,0,0,1,12,1,118,10,37,236.629,28,0,0,0,8 352 | 0,0,0,1,12,3,235,11,37,236.629,29,1,1,1,2 353 | 0,0,0,1,12,4,179,51,38,236.629,31,0,0,0,1 354 | 1,0,0,0,12,1,235,20,43,236.629,38,0,1,0,8 355 | 1,0,0,0,12,1,235,11,37,236.629,29,1,1,1,4 356 | 0,0,0,1,12,2,260,50,36,236.629,23,0,4,0,8 357 | 0,0,0,1,12,0,157,27,29,236.629,22,0,0,0,2 358 | 0,0,0,1,12,0,179,51,38,236.629,31,0,0,0,3 359 | 0,0,1,0,12,0,289,36,33,236.629,30,0,2,1,8 360 | 0,0,0,1,4,5,225,26,28,330.061,24,0,1,2,5 361 | 0,0,1,0,9,4,118,10,37,330.061,28,0,0,0,32 362 | 0,0,0,1,9,4,155,12,34,330.061,25,0,2,0,2 363 | 1,0,0,0,3,2,235,11,37,330.061,29,1,1,1,1 364 | 0,0,0,1,3,2,155,12,34,330.061,25,0,2,0,4 365 | 0,0,0,1,9,4,289,36,33,330.061,30,0,2,1,8 366 | 1,0,0,0,11,2,291,31,40,330.061,25,0,1,1,8 367 | 0,0,0,1,1,0,235,20,43,330.061,38,0,1,0,8 368 | 0,0,0,1,1,0,118,13,50,330.061,31,0,1,0,4 369 | 
0,0,0,1,1,2,179,51,38,330.061,31,0,0,0,1 370 | 0,0,0,1,1,4,179,51,38,330.061,31,0,0,0,1 371 | 0,0,0,1,7,6,118,10,37,251.818,28,0,0,0,2 372 | 0,0,0,1,1,0,179,51,38,251.818,31,0,0,0,3 373 | 1,0,0,0,1,0,225,26,28,251.818,24,0,1,2,1 374 | 0,0,0,1,3,3,289,36,33,251.818,30,0,2,1,3 375 | 0,0,0,1,10,0,260,50,36,251.818,23,0,4,0,3 376 | 0,0,0,1,2,4,179,51,38,251.818,31,0,0,0,3 377 | 0,0,0,1,2,0,179,51,38,251.818,31,0,0,0,2 378 | 0,0,0,1,2,2,179,51,38,251.818,31,0,0,0,3 379 | 1,0,0,0,2,3,179,51,38,251.818,31,0,0,0,8 380 | 0,0,0,1,2,3,246,25,41,251.818,23,0,0,0,8 381 | 0,0,0,1,2,4,179,51,38,251.818,31,0,0,0,3 382 | 0,0,0,1,2,0,189,29,33,251.818,25,0,2,2,8 383 | 0,0,0,1,2,0,179,51,38,251.818,31,0,0,0,3 384 | 0,0,0,1,2,1,246,25,41,251.818,23,0,0,0,2 385 | 0,0,0,1,2,1,291,31,40,251.818,25,0,1,1,2 386 | 1,0,0,0,2,2,157,27,29,251.818,22,0,0,0,16 387 | 0,0,0,1,2,2,179,51,38,251.818,31,0,0,0,3 388 | 0,0,0,1,2,4,179,51,38,251.818,31,0,0,0,3 389 | 1,0,0,0,2,4,246,25,41,251.818,23,0,0,0,24 390 | 0,0,0,1,2,2,179,51,38,251.818,31,0,0,0,3 391 | 0,0,0,1,2,4,179,51,38,251.818,31,0,0,0,3 392 | 0,0,1,0,7,0,118,10,37,244.387,28,0,0,0,8 393 | 0,0,1,0,8,3,246,25,41,244.387,23,0,0,0,16 394 | 0,0,0,1,10,1,246,25,41,244.387,23,0,0,0,2 395 | 0,0,0,1,10,1,260,50,36,244.387,23,0,4,0,4 396 | 0,0,0,1,3,0,179,51,38,244.387,31,0,0,0,2 397 | 0,0,0,1,3,0,235,11,37,244.387,29,1,1,1,8 398 | 0,0,0,1,3,1,179,22,40,244.387,22,1,2,0,8 399 | 0,0,0,1,3,1,378,49,36,244.387,21,0,2,4,8 400 | 0,0,0,1,3,0,179,51,38,244.387,31,0,0,0,16 401 | 0,0,0,1,3,2,361,52,28,244.387,27,0,1,4,8 402 | 0,0,0,0,3,2,369,17,31,244.387,25,0,3,0,0 403 | 0,0,1,0,3,3,235,11,37,244.387,29,1,1,1,8 404 | 0,0,0,1,3,4,118,13,50,244.387,31,0,1,0,2 405 | 1,0,0,0,3,1,118,13,50,244.387,31,0,1,0,3 406 | 1,0,0,0,3,2,118,13,50,244.387,31,0,1,0,8 407 | 0,0,0,0,3,3,235,11,37,244.387,29,1,1,1,0 408 | 0,0,0,0,3,3,246,25,41,244.387,23,0,0,0,0 409 | 0,0,0,0,3,3,118,13,50,244.387,31,0,1,0,0 410 | 0,0,0,1,3,4,179,51,38,244.387,31,0,0,0,8 411 | 0,0,0,1,3,4,289,36,33,244.387,30,0,2,1,8 412 | 0,0,1,0,3,0,260,50,36,244.387,23,0,4,0,8 413 | 0,0,0,1,3,1,246,25,41,244.387,23,0,0,0,2 414 | 0,0,0,1,5,3,179,51,38,239.409,31,0,0,0,4 415 | 0,0,0,1,7,1,260,50,36,239.409,23,0,4,0,3 416 | 0,0,0,1,7,1,330,16,28,239.409,25,1,0,0,4 417 | 0,0,0,1,10,2,369,17,31,239.409,25,0,3,0,4 418 | 0,0,0,1,10,2,248,25,47,239.409,32,0,2,1,4 419 | 0,0,0,1,12,0,330,16,28,239.409,25,1,0,0,8 420 | 0,0,0,1,12,0,179,51,38,239.409,31,0,0,0,8 421 | 0,0,0,1,4,0,118,13,50,239.409,31,0,1,0,1 422 | 1,0,0,0,4,2,118,13,50,239.409,31,0,1,0,120 423 | 0,0,0,1,4,4,300,26,43,239.409,25,0,2,1,8 424 | 0,0,0,1,4,4,260,50,36,239.409,23,0,4,0,4 425 | 0,0,0,1,4,0,179,51,38,239.409,31,0,0,0,4 426 | 1,0,0,0,4,2,118,10,37,239.409,28,0,0,0,2 427 | 1,0,0,0,1,3,235,20,43,246.074,38,0,1,0,16 428 | 0,0,0,1,3,6,248,25,47,246.074,32,0,2,1,2 429 | 1,0,0,0,8,5,369,17,31,246.074,25,0,3,0,8 430 | 0,0,0,1,10,3,179,26,30,246.074,19,1,0,0,3 431 | 0,0,0,1,10,3,179,51,38,246.074,31,0,0,0,4 432 | 0,0,0,1,11,6,361,52,28,246.074,27,0,1,4,1 433 | 0,0,0,1,12,1,260,50,36,246.074,23,0,4,0,3 434 | 1,0,0,0,5,0,179,22,40,246.074,22,1,2,0,2 435 | 1,0,0,0,5,0,179,22,40,246.074,22,1,2,0,3 436 | 0,0,1,0,5,2,228,14,58,246.074,22,0,2,1,8 437 | 0,0,0,1,5,2,225,26,28,246.074,24,0,1,2,3 438 | 1,0,0,0,5,4,330,16,28,246.074,25,1,0,0,8 439 | 0,0,0,1,5,0,179,26,30,246.074,19,1,0,0,2 440 | 0,0,0,1,5,0,118,10,37,246.074,28,0,0,0,1 441 | 1,0,0,0,5,0,235,11,37,246.074,29,1,1,1,8 442 | 0,0,0,1,5,2,179,26,30,246.074,19,1,0,0,3 443 | 0,0,0,1,5,5,118,10,37,253.957,28,0,0,0,3 444 | 
0,0,0,1,5,5,179,51,38,253.957,31,0,0,0,3 445 | 0,0,0,1,6,1,118,10,37,253.957,28,0,0,0,2 446 | 0,0,0,1,8,6,225,26,28,253.957,24,0,1,2,4 447 | 0,0,0,1,9,2,260,50,36,253.957,23,0,4,0,4 448 | 0,0,0,0,9,2,179,51,38,253.957,31,0,0,0,0 449 | 1,0,0,0,12,2,291,31,40,253.957,25,0,1,1,40 450 | 0,0,0,1,12,2,179,51,38,253.957,31,0,0,0,24 451 | 0,0,0,1,6,1,246,25,41,253.957,23,0,0,0,3 452 | 0,0,0,1,6,0,179,51,38,253.957,31,0,0,0,4 453 | 0,0,0,1,6,1,235,20,43,253.957,38,0,1,0,8 454 | 0,0,0,1,6,0,179,51,38,253.957,31,0,0,0,2 455 | 0,0,0,1,6,2,225,26,28,253.957,24,0,1,2,2 456 | 0,0,0,1,6,2,118,13,50,253.957,31,0,1,0,2 457 | 1,0,0,0,6,2,179,51,38,253.957,31,0,0,0,8 458 | 0,0,1,0,6,2,179,26,30,253.957,19,1,0,0,2 459 | 0,0,0,1,6,4,246,25,41,253.957,23,0,0,0,2 460 | 1,0,0,0,6,1,330,16,28,253.957,25,1,0,0,1 461 | 1,0,0,0,6,1,235,11,37,253.957,29,1,1,1,8 462 | 0,0,0,1,6,2,179,26,30,230.29,19,1,0,0,2 463 | 0,0,0,1,6,2,225,26,28,230.29,24,0,1,2,4 464 | 1,0,0,0,7,4,260,50,36,230.29,23,0,4,0,8 465 | 1,0,0,0,10,5,268,11,33,230.29,25,1,0,0,8 466 | 0,0,0,1,7,4,330,16,28,230.29,25,1,0,0,8 467 | 0,0,0,1,7,4,118,10,37,230.29,28,0,0,0,8 468 | 0,0,0,1,7,0,260,50,36,230.29,23,0,4,0,4 469 | 0,0,0,1,7,1,118,10,37,230.29,28,0,0,0,8 470 | 0,1,0,0,7,0,300,26,43,230.29,25,0,2,1,8 471 | 0,0,0,1,7,0,235,29,48,230.29,33,0,1,5,1 472 | 0,0,0,1,7,1,246,25,41,230.29,23,0,0,0,2 473 | 1,0,0,0,7,1,225,26,28,230.29,24,0,1,2,112 474 | 0,0,0,1,7,1,179,51,38,230.29,31,0,0,0,1 475 | 0,0,0,1,7,4,118,13,50,230.29,31,0,1,0,1 476 | 0,0,0,1,7,4,361,52,28,230.29,27,0,1,4,8 477 | 0,0,0,1,7,0,289,36,33,230.29,30,0,2,1,8 478 | 0,0,0,1,7,0,235,20,43,230.29,38,0,1,0,8 479 | 0,0,0,1,7,1,246,25,41,230.29,23,0,0,0,2 480 | 0,0,0,1,7,3,291,31,40,230.29,25,0,1,1,1 481 | 0,0,0,1,7,3,279,5,39,230.29,24,0,2,0,2 482 | 0,0,0,1,3,2,179,51,38,249.797,31,0,0,0,4 483 | 0,0,0,1,7,5,179,22,40,249.797,22,1,2,0,1 484 | 0,0,0,1,8,1,246,25,41,249.797,23,0,0,0,4 485 | 0,0,0,1,8,1,118,10,37,249.797,28,0,0,0,4 486 | 0,0,0,1,8,1,289,36,33,249.797,30,0,2,1,8 487 | 0,0,0,1,8,1,235,20,43,249.797,38,0,1,0,8 488 | 0,0,0,1,10,6,291,31,40,249.797,25,0,1,1,4 489 | 0,0,0,1,8,0,179,51,38,249.797,31,0,0,0,4 490 | 0,0,0,1,8,1,179,22,40,249.797,22,1,2,0,8 491 | 0,0,0,1,8,3,330,16,28,249.797,25,1,0,0,16 492 | 0,0,0,1,8,1,235,11,37,249.797,29,1,1,1,4 493 | 0,0,0,1,8,1,246,25,41,249.797,23,0,0,0,1 494 | 0,0,0,1,8,1,118,10,37,249.797,28,0,0,0,5 495 | 0,0,0,1,8,3,291,31,40,249.797,25,0,1,1,2 496 | 0,0,0,1,8,0,260,50,36,249.797,23,0,4,0,3 497 | 0,0,0,1,5,1,246,25,41,261.756,23,0,0,0,1 498 | 0,0,0,1,5,1,246,25,41,261.756,23,0,0,0,1 499 | 0,0,0,1,5,1,118,10,37,261.756,28,0,0,0,3 500 | 0,0,0,1,5,1,155,12,34,261.756,25,0,2,0,2 501 | 0,0,0,1,7,6,291,31,40,261.756,25,0,1,1,2 502 | 0,0,0,1,8,2,179,26,30,261.756,19,1,0,0,8 503 | 0,0,0,1,8,2,248,25,47,261.756,32,0,2,1,1 504 | 0,0,0,1,11,3,179,51,38,261.756,31,0,0,0,4 505 | 0,0,0,1,9,2,225,26,28,261.756,24,0,1,2,1 506 | 0,0,0,1,9,0,179,26,30,261.756,19,1,0,0,2 507 | 0,0,0,1,9,1,369,17,31,261.756,25,0,3,0,8 508 | 0,0,0,1,9,1,361,52,28,261.756,27,0,1,4,8 509 | 1,0,0,0,5,2,289,48,49,284.853,36,0,0,2,1 510 | 1,0,0,0,5,2,235,16,32,284.853,25,1,0,0,3 511 | 0,0,0,1,6,5,246,25,41,284.853,23,0,0,0,8 512 | 1,0,0,0,11,4,289,48,49,284.853,36,0,0,2,3 513 | 0,0,0,1,11,4,291,31,40,284.853,25,0,1,1,2 514 | 0,0,0,1,10,1,118,10,37,284.853,28,0,0,0,2 515 | 0,0,0,1,10,3,289,48,49,284.853,36,0,0,2,2 516 | 0,0,0,1,10,4,291,31,40,284.853,25,0,1,1,1 517 | 0,0,0,1,10,1,225,26,28,284.853,24,0,1,2,2 518 | 0,0,0,1,10,1,369,17,31,284.853,25,0,3,0,8 519 | 
0,0,0,1,10,1,369,17,31,284.853,25,0,3,0,3 520 | 0,0,0,1,10,1,225,26,28,284.853,24,0,1,2,4 521 | 0,0,0,1,10,1,369,17,31,284.853,25,0,3,0,8 522 | 0,0,0,1,10,2,179,51,38,284.853,31,0,0,0,3 523 | 1,0,0,0,10,2,228,14,58,284.853,22,0,2,1,1 524 | 0,0,0,1,10,2,291,31,40,284.853,25,0,1,1,1 525 | 1,0,0,0,10,3,369,17,31,284.853,25,0,3,0,8 526 | 1,0,0,0,10,3,225,26,28,284.853,24,0,1,2,1 527 | 1,0,0,0,10,4,369,17,31,284.853,25,0,3,0,8 528 | 1,0,0,0,10,4,225,26,28,284.853,24,0,1,2,3 529 | 0,0,0,1,10,6,189,29,33,284.853,25,0,2,2,8 530 | 1,0,0,0,10,6,235,16,32,284.853,25,1,0,0,8 531 | 1,0,0,0,10,6,248,25,47,284.853,32,0,2,1,8 532 | 0,0,0,0,10,6,225,26,28,284.853,24,0,1,2,0 533 | 1,0,0,0,10,1,225,26,28,284.853,24,0,1,2,3 534 | 0,0,1,0,7,1,179,51,38,268.519,31,0,0,0,1 535 | 0,0,0,1,8,4,118,10,37,268.519,28,0,0,0,3 536 | 1,0,0,0,8,4,330,16,28,268.519,25,1,0,0,24 537 | 0,0,0,1,10,2,179,51,38,268.519,31,0,0,0,1 538 | 1,0,0,0,11,1,118,10,37,268.519,28,0,0,0,8 539 | 0,0,0,1,11,2,289,36,33,268.519,30,0,2,1,8 540 | 1,0,0,0,11,4,235,16,32,268.519,25,1,0,0,8 541 | 0,0,0,1,11,4,225,26,28,268.519,24,0,1,2,4 542 | 0,0,0,1,11,1,361,52,28,268.519,27,0,1,4,8 543 | 0,0,0,1,11,2,291,31,40,268.519,25,0,1,1,2 544 | 1,0,0,0,11,3,118,10,37,268.519,28,0,0,0,2 545 | 1,0,0,0,11,3,225,26,28,268.519,24,0,1,2,3 546 | 0,0,0,1,11,0,179,51,38,268.519,31,0,0,0,1 547 | 0,0,0,1,11,0,118,10,37,268.519,28,0,0,0,8 548 | 1,0,0,0,11,1,118,10,37,268.519,28,0,0,0,8 549 | 0,0,0,1,11,1,225,26,28,268.519,24,0,1,2,2 550 | 0,0,0,0,11,1,291,31,40,268.519,25,0,1,1,0 551 | 0,0,0,0,11,2,289,36,33,268.519,30,0,2,1,0 552 | 1,0,0,0,11,3,248,25,47,268.519,32,0,2,1,4 553 | 0,0,0,0,11,3,235,20,43,268.519,38,0,1,0,0 554 | 0,0,0,1,11,4,225,26,28,268.519,24,0,1,2,2 555 | 0,0,0,1,11,4,369,17,31,268.519,25,0,3,0,8 556 | 0,0,0,1,11,0,361,52,28,268.519,27,0,1,4,2 557 | 1,0,0,0,5,4,179,51,38,280.549,31,0,0,0,32 558 | 0,0,0,1,6,0,291,31,40,280.549,25,0,1,1,1 559 | 0,0,0,1,6,0,225,26,28,280.549,24,0,1,2,3 560 | 1,0,0,0,8,5,179,26,30,280.549,19,1,0,0,1 561 | 0,0,0,1,8,5,225,26,28,280.549,24,0,1,2,3 562 | 0,0,0,1,12,2,225,26,28,280.549,24,0,1,2,3 563 | 1,0,0,0,12,3,361,52,28,280.549,27,0,1,4,4 564 | 0,0,1,0,12,4,179,22,40,280.549,22,1,2,0,2 565 | 0,0,0,1,12,4,235,20,43,280.549,38,0,1,0,8 566 | 0,0,1,0,12,0,233,51,31,280.549,21,1,1,8,8 567 | 1,0,0,0,12,1,179,26,30,280.549,19,1,0,0,16 568 | 0,0,0,1,12,1,225,26,28,280.549,24,0,1,2,2 569 | 0,0,0,1,12,3,225,26,28,280.549,24,0,1,2,3 570 | 0,0,0,1,12,0,225,26,28,280.549,24,0,1,2,2 571 | 0,0,1,0,12,1,155,12,34,280.549,25,0,2,0,80 572 | 1,0,0,0,1,0,179,26,30,313.532,19,1,0,0,24 573 | 1,0,0,0,4,6,179,26,30,313.532,19,1,0,0,16 574 | 0,0,0,1,4,6,179,22,40,313.532,22,1,2,0,2 575 | 0,0,0,1,5,1,179,22,40,313.532,22,1,2,0,2 576 | 1,0,0,0,8,2,179,26,30,313.532,19,1,0,0,3 577 | 0,0,0,1,10,0,179,22,40,313.532,22,1,2,0,2 578 | 1,0,0,0,11,3,289,48,49,313.532,36,0,0,2,8 579 | 0,0,1,0,12,5,179,22,40,313.532,22,1,2,0,3 580 | 0,0,0,1,1,0,179,26,30,313.532,19,1,0,0,2 581 | 0,0,1,0,1,1,155,12,34,313.532,25,0,2,0,8 582 | 0,0,0,1,1,2,179,26,30,313.532,19,1,0,0,2 583 | 0,0,0,1,1,2,179,51,38,313.532,31,0,0,0,3 584 | 1,0,0,0,1,2,289,36,33,313.532,30,0,2,1,8 585 | 0,0,0,1,1,3,179,51,38,313.532,31,0,0,0,3 586 | 0,0,0,1,1,4,179,51,38,313.532,31,0,0,0,2 587 | 1,0,0,0,6,5,179,51,38,264.249,31,0,0,0,8 588 | 0,0,0,1,6,5,225,26,28,264.249,24,0,1,2,3 589 | 1,0,0,0,7,0,248,25,47,264.249,32,0,2,1,8 590 | 0,0,0,1,7,0,179,51,38,264.249,31,0,0,0,2 591 | 0,0,0,1,8,3,225,26,28,264.249,24,0,1,2,3 592 | 0,0,0,1,8,3,179,51,38,264.249,31,0,0,0,2 593 | 
0,0,0,1,8,3,179,26,30,264.249,19,1,0,0,2 594 | 0,0,0,1,9,6,225,15,41,264.249,28,1,2,2,2 595 | 0,0,0,1,9,6,179,51,38,264.249,31,0,0,0,2 596 | 0,0,1,0,12,6,233,51,31,264.249,21,1,1,8,2 597 | 0,0,0,1,12,6,179,51,38,264.249,31,0,0,0,2 598 | 1,0,0,0,2,1,225,26,28,264.249,24,0,1,2,8 599 | 0,0,0,1,2,2,179,51,38,264.249,31,0,0,0,3 600 | 0,0,0,1,2,3,179,51,38,264.249,31,0,0,0,3 601 | 0,0,0,1,2,3,225,26,28,264.249,24,0,1,2,3 602 | 1,0,0,0,2,3,179,26,30,264.249,19,1,0,0,2 603 | 0,0,0,1,2,4,179,22,40,264.249,22,1,2,0,2 604 | 0,0,0,1,2,4,179,51,38,264.249,31,0,0,0,3 605 | 1,0,0,0,2,2,233,51,31,264.249,21,1,1,8,3 606 | 0,0,0,1,2,2,179,26,30,264.249,19,1,0,0,2 607 | 0,0,0,1,2,2,179,51,38,264.249,31,0,0,0,2 608 | 1,0,0,0,2,3,179,51,38,264.249,31,0,0,0,8 609 | 0,0,0,1,2,4,179,51,38,264.249,31,0,0,0,2 610 | 0,0,0,1,2,0,155,12,34,264.249,25,0,2,0,5 611 | 0,0,0,1,2,0,235,16,32,264.249,25,1,0,0,3 612 | 0,0,0,1,2,0,179,51,38,264.249,31,0,0,0,2 613 | 1,0,0,0,2,0,225,26,28,264.249,24,0,1,2,2 614 | 0,0,0,1,2,1,179,51,38,264.249,31,0,0,0,2 615 | 0,0,0,1,2,1,248,25,47,264.249,32,0,2,1,2 616 | 0,0,0,1,2,1,225,26,28,264.249,24,0,1,2,2 617 | 0,0,0,1,2,2,179,51,38,264.249,31,0,0,0,2 618 | 0,0,0,1,2,3,179,51,38,264.249,31,0,0,0,2 619 | 0,0,0,1,2,4,235,16,32,264.249,25,1,0,0,2 620 | 0,0,0,1,5,3,179,51,38,222.196,31,0,0,0,2 621 | 0,0,0,1,5,3,248,25,47,222.196,32,0,2,1,2 622 | 0,0,0,1,6,6,228,14,58,222.196,22,0,2,1,3 623 | 0,0,0,1,6,6,248,25,47,222.196,32,0,2,1,3 624 | 1,0,0,0,6,6,228,14,58,222.196,22,0,2,1,112 625 | 0,0,0,1,7,1,179,51,38,222.196,31,0,0,0,2 626 | 0,0,0,1,8,4,225,26,28,222.196,24,0,1,2,2 627 | 0,0,0,1,8,4,179,51,38,222.196,31,0,0,0,3 628 | 0,0,0,1,8,4,225,26,28,222.196,24,0,1,2,2 629 | 0,0,0,1,9,0,179,26,30,222.196,19,1,0,0,3 630 | 0,0,0,1,12,0,235,16,32,222.196,25,1,0,0,3 631 | 0,0,1,0,12,0,361,52,28,222.196,27,0,1,4,8 632 | 1,0,0,0,3,1,179,51,38,222.196,31,0,0,0,8 633 | 0,0,0,1,3,2,179,51,38,222.196,31,0,0,0,2 634 | 0,0,0,1,3,3,179,51,38,222.196,31,0,0,0,3 635 | 0,0,0,1,3,4,179,26,30,222.196,19,1,0,0,2 636 | 1,0,0,0,3,0,179,51,38,222.196,31,0,0,0,4 637 | 1,0,0,0,3,0,248,25,47,222.196,32,0,2,1,2 638 | 0,0,0,1,3,0,179,51,38,222.196,31,0,0,0,3 639 | 1,0,0,0,3,0,225,26,28,222.196,24,0,1,2,8 640 | 0,0,0,1,3,1,179,51,38,222.196,31,0,0,0,2 641 | 0,0,0,1,3,2,289,36,33,222.196,30,0,2,1,8 642 | 0,0,0,1,3,2,228,14,58,222.196,22,0,2,1,2 643 | 0,0,0,1,3,2,179,51,38,222.196,31,0,0,0,2 644 | 0,0,0,1,3,3,248,25,47,222.196,32,0,2,1,3 645 | 0,0,0,1,3,3,179,51,38,222.196,31,0,0,0,3 646 | 0,0,0,1,3,4,179,26,30,222.196,19,1,0,0,2 647 | 0,0,0,1,3,4,179,51,38,222.196,31,0,0,0,3 648 | 0,0,0,1,3,1,179,51,38,222.196,31,0,0,0,3 649 | 0,0,0,1,3,2,118,15,46,222.196,25,0,2,0,8 650 | 1,0,0,0,3,2,155,12,34,222.196,25,0,2,0,24 651 | 0,0,0,1,3,2,179,51,38,222.196,31,0,0,0,3 652 | 0,0,0,1,3,3,179,51,38,222.196,31,0,0,0,3 653 | 1,0,0,0,3,0,179,26,30,222.196,19,1,0,0,2 654 | 0,0,1,0,3,0,289,36,33,222.196,30,0,2,1,104 655 | 0,0,0,1,3,2,369,17,31,222.196,25,0,3,0,8 656 | 1,0,0,0,2,6,225,26,28,246.288,24,0,1,2,8 657 | 1,0,0,0,2,6,118,10,37,246.288,28,0,0,0,8 658 | 0,0,1,0,3,6,361,52,28,246.288,27,0,1,4,8 659 | 0,0,1,0,4,2,248,25,47,246.288,32,0,2,1,8 660 | 1,0,0,0,5,4,189,29,33,246.288,25,0,2,2,8 661 | 0,0,0,1,6,0,179,26,30,246.288,19,1,0,0,2 662 | 1,0,0,0,9,1,369,17,31,246.288,25,0,3,0,24 663 | 0,1,0,0,10,3,179,22,40,246.288,22,1,2,0,2 664 | 0,0,0,1,10,3,118,13,50,246.288,31,0,1,0,3 665 | 0,0,0,1,10,3,361,52,28,246.288,27,0,1,4,2 666 | 1,0,0,0,11,6,118,10,37,246.288,28,0,0,0,2 667 | 0,0,0,1,4,4,235,11,37,246.288,29,1,1,1,8 668 | 
0,0,0,1,4,4,179,26,30,246.288,19,1,0,0,2 669 | 0,0,1,0,4,0,225,26,28,246.288,24,0,1,2,8 670 | 0,1,0,0,4,1,235,16,32,246.288,25,1,0,0,3 671 | 0,0,0,1,4,4,179,26,30,246.288,19,1,0,0,2 672 | 0,0,0,1,4,1,155,12,34,246.288,25,0,2,0,4 673 | 0,0,1,0,4,3,225,26,28,246.288,24,0,1,2,8 674 | 1,0,0,0,4,3,118,13,50,246.288,31,0,1,0,2 675 | 0,0,0,1,4,4,179,26,30,246.288,19,1,0,0,2 676 | 0,0,0,1,7,3,235,11,37,237.656,29,1,1,1,8 677 | 0,0,1,0,9,2,225,15,41,237.656,28,1,2,2,3 678 | 0,0,0,1,9,2,235,16,32,237.656,25,1,0,0,2 679 | 1,0,0,0,9,2,118,10,37,237.656,28,0,0,0,3 680 | 0,0,0,1,9,2,235,20,43,237.656,38,0,1,0,8 681 | 1,0,0,0,10,4,179,26,30,237.656,19,1,0,0,1 682 | 0,0,0,1,10,4,291,31,40,237.656,25,0,1,1,2 683 | 1,0,0,0,10,4,225,15,41,237.656,28,1,2,2,8 684 | 0,0,1,0,11,0,300,26,43,237.656,25,0,2,1,64 685 | 0,0,0,1,11,0,225,15,41,237.656,28,1,2,2,8 686 | 0,0,0,1,11,0,179,26,30,237.656,19,1,0,0,2 687 | 0,0,0,1,5,0,118,13,50,237.656,31,0,1,0,2 688 | 1,0,0,0,5,1,118,13,50,237.656,31,0,1,0,3 689 | 0,0,0,1,5,1,118,10,37,237.656,28,0,0,0,1 690 | 0,0,0,0,5,1,118,13,50,237.656,31,0,1,0,0 691 | 0,0,0,1,5,2,179,26,30,237.656,19,1,0,0,2 692 | 0,0,0,0,5,2,378,49,36,237.656,21,0,2,4,0 693 | 0,1,0,0,5,4,179,22,40,237.656,22,1,2,0,1 694 | 1,0,0,0,5,0,155,12,34,237.656,25,0,2,0,48 695 | 1,0,0,0,5,0,235,16,32,237.656,25,1,0,0,8 696 | 0,0,0,1,5,2,291,31,40,237.656,25,0,1,1,8 697 | 1,0,0,0,5,2,179,22,40,237.656,22,1,2,0,8 698 | 1,0,0,0,5,2,225,26,28,237.656,24,0,1,2,3 699 | 1,0,0,0,5,3,330,16,28,237.656,25,1,0,0,8 700 | 0,0,0,1,5,3,235,16,32,237.656,25,1,0,0,2 701 | 0,0,0,1,5,3,291,31,40,237.656,25,0,1,1,2 702 | -------------------------------------------------------------------------------- /5. Preprocessing/20 What to Expect from the Next Couple of Sections/Absenteeism_data.csv: -------------------------------------------------------------------------------- 1 | ID,Reason for Absence,Date,Transportation Expense,Distance to Work,Age,Daily Work Load Average,Body Mass Index,Education,Children,Pets,Absenteeism Time in Hours 2 | 11,26,07/07/2015,289,36,33,239.554,30,1,2,1,4 3 | 36,0,14/07/2015,118,13,50,239.554,31,1,1,0,0 4 | 3,23,15/07/2015,179,51,38,239.554,31,1,0,0,2 5 | 7,7,16/07/2015,279,5,39,239.554,24,1,2,0,4 6 | 11,23,23/07/2015,289,36,33,239.554,30,1,2,1,2 7 | 3,23,10/07/2015,179,51,38,239.554,31,1,0,0,2 8 | 10,22,17/07/2015,361,52,28,239.554,27,1,1,4,8 9 | 20,23,24/07/2015,260,50,36,239.554,23,1,4,0,4 10 | 14,19,06/07/2015,155,12,34,239.554,25,1,2,0,40 11 | 1,22,13/07/2015,235,11,37,239.554,29,3,1,1,8 12 | 20,1,20/07/2015,260,50,36,239.554,23,1,4,0,8 13 | 20,1,14/07/2015,260,50,36,239.554,23,1,4,0,8 14 | 20,11,15/07/2015,260,50,36,239.554,23,1,4,0,8 15 | 3,11,15/07/2015,179,51,38,239.554,31,1,0,0,1 16 | 3,23,15/07/2015,179,51,38,239.554,31,1,0,0,4 17 | 24,14,17/07/2015,246,25,41,239.554,23,1,0,0,8 18 | 3,23,17/07/2015,179,51,38,239.554,31,1,0,0,2 19 | 3,21,27/07/2015,179,51,38,239.554,31,1,0,0,8 20 | 6,11,30/07/2015,189,29,33,239.554,25,1,2,2,8 21 | 33,23,05/08/2015,248,25,47,205.917,32,1,2,1,2 22 | 18,10,12/08/2015,330,16,28,205.917,25,2,0,0,8 23 | 3,11,03/08/2015,179,51,38,205.917,31,1,0,0,1 24 | 10,13,10/08/2015,361,52,28,205.917,27,1,1,4,40 25 | 20,28,14/08/2015,260,50,36,205.917,23,1,4,0,4 26 | 11,18,17/08/2015,289,36,33,205.917,30,1,2,1,8 27 | 10,25,24/08/2015,361,52,28,205.917,27,1,1,4,7 28 | 11,23,04/08/2015,289,36,33,205.917,30,1,2,1,1 29 | 30,28,12/08/2015,157,27,29,205.917,22,1,0,0,4 30 | 11,18,19/08/2015,289,36,33,205.917,30,1,2,1,8 31 | 3,23,28/08/2015,179,51,38,205.917,31,1,0,0,2 32 | 
3,18,17/08/2015,179,51,38,205.917,31,1,0,0,8 33 | 2,18,27/08/2015,235,29,48,205.917,33,1,1,5,8 34 | 1,23,27/08/2015,235,11,37,205.917,29,3,1,1,4 35 | 2,18,17/08/2015,235,29,48,205.917,33,1,1,5,8 36 | 3,23,17/08/2015,179,51,38,205.917,31,1,0,0,2 37 | 10,23,17/08/2015,361,52,28,205.917,27,1,1,4,1 38 | 11,24,04/08/2015,289,36,33,205.917,30,1,2,1,8 39 | 19,11,20/08/2015,291,50,32,205.917,23,1,0,0,4 40 | 2,28,21/08/2015,235,29,48,205.917,33,1,1,5,8 41 | 20,23,28/08/2015,260,50,36,205.917,23,1,4,0,4 42 | 27,23,01/09/2015,184,42,27,241.476,21,1,0,0,2 43 | 34,23,07/09/2015,118,10,37,241.476,28,1,0,0,4 44 | 3,23,01/09/2015,179,51,38,241.476,31,1,0,0,4 45 | 5,19,08/09/2015,235,20,43,241.476,38,1,1,0,8 46 | 14,23,09/09/2015,155,12,34,241.476,25,1,2,0,2 47 | 34,23,13/09/2015,118,10,37,241.476,28,1,0,0,3 48 | 3,23,14/09/2015,179,51,38,241.476,31,1,0,0,3 49 | 15,23,24/09/2015,291,31,40,241.476,25,1,1,1,4 50 | 20,22,04/09/2015,260,50,36,241.476,23,1,4,0,8 51 | 15,14,14/09/2015,291,31,40,241.476,25,1,1,1,32 52 | 20,0,21/09/2015,260,50,36,241.476,23,1,4,0,0 53 | 29,0,28/09/2015,225,26,28,241.476,24,1,1,2,0 54 | 28,23,08/09/2015,225,26,28,241.476,24,1,1,2,2 55 | 34,23,15/09/2015,118,10,37,241.476,28,1,0,0,2 56 | 11,0,22/09/2015,289,36,33,241.476,30,1,2,1,0 57 | 36,0,29/09/2015,118,13,50,241.476,31,1,1,0,0 58 | 28,18,16/09/2015,225,26,28,241.476,24,1,1,2,3 59 | 3,23,23/09/2015,179,51,38,241.476,31,1,0,0,3 60 | 13,0,30/09/2015,369,17,31,241.476,25,1,3,0,0 61 | 33,23,11/09/2015,248,25,47,241.476,32,1,2,1,1 62 | 3,23,18/09/2015,179,51,38,241.476,31,1,0,0,3 63 | 20,23,25/09/2015,260,50,36,241.476,23,1,4,0,4 64 | 3,23,06/10/2015,179,51,38,253.465,31,1,0,0,3 65 | 34,23,13/10/2015,118,10,37,253.465,28,1,0,0,3 66 | 36,0,14/10/2015,118,13,50,253.465,31,1,1,0,0 67 | 22,23,15/10/2015,179,26,30,253.465,19,3,0,0,1 68 | 3,23,16/10/2015,179,51,38,253.465,31,1,0,0,3 69 | 28,23,16/10/2015,225,26,28,253.465,24,1,1,2,3 70 | 34,23,06/10/2015,118,10,37,253.465,28,1,0,0,3 71 | 28,23,14/10/2015,225,26,28,253.465,24,1,1,2,2 72 | 33,23,21/10/2015,248,25,47,253.465,32,1,2,1,2 73 | 15,23,22/10/2015,291,31,40,253.465,25,1,1,1,5 74 | 3,23,21/10/2015,179,51,38,253.465,31,1,0,0,8 75 | 28,23,21/10/2015,225,26,28,253.465,24,1,1,2,3 76 | 20,19,22/10/2015,260,50,36,253.465,23,1,4,0,16 77 | 15,14,13/10/2015,291,31,40,253.465,25,1,1,1,8 78 | 28,28,20/10/2015,225,26,28,253.465,24,1,1,2,2 79 | 11,26,21/10/2015,289,36,33,253.465,30,1,2,1,8 80 | 10,23,23/10/2015,361,52,28,253.465,27,1,1,4,1 81 | 20,28,30/10/2015,260,50,36,253.465,23,1,4,0,3 82 | 3,23,05/11/2015,179,51,38,306.345,31,1,0,0,1 83 | 28,23,04/11/2015,225,26,28,306.345,24,1,1,2,1 84 | 3,13,05/11/2015,179,51,38,306.345,31,1,0,0,8 85 | 17,21,12/11/2015,179,22,40,306.345,22,2,2,0,8 86 | 15,23,19/11/2015,291,31,40,306.345,25,1,1,1,5 87 | 14,10,02/11/2015,155,12,34,306.345,25,1,2,0,32 88 | 6,22,09/11/2015,189,29,33,306.345,25,1,2,2,8 89 | 15,14,16/11/2015,291,31,40,306.345,25,1,1,1,40 90 | 28,23,18/11/2015,225,26,28,306.345,24,1,1,2,1 91 | 14,6,20/11/2015,155,12,34,306.345,25,1,2,0,8 92 | 28,23,18/11/2015,225,26,28,306.345,24,1,1,2,3 93 | 17,21,25/11/2015,179,22,40,306.345,22,2,2,0,8 94 | 28,13,20/11/2015,225,26,28,306.345,24,1,1,2,3 95 | 20,28,27/11/2015,260,50,36,306.345,23,1,4,0,4 96 | 33,28,02/11/2015,248,25,47,306.345,32,1,2,1,1 97 | 28,28,10/11/2015,225,26,28,306.345,24,1,1,2,3 98 | 11,7,11/11/2015,289,36,33,306.345,30,1,2,1,24 99 | 15,23,26/11/2015,291,31,40,306.345,25,1,1,1,3 100 | 33,23,01/12/2015,248,25,47,261.306,32,1,2,1,1 101 | 34,19,01/12/2015,118,10,37,261.306,28,1,0,0,64 102 | 
36,23,02/12/2015,118,13,50,261.306,31,1,1,0,2 103 | 1,26,02/12/2015,235,11,37,261.306,29,3,1,1,8 104 | 28,23,03/12/2015,225,26,28,261.306,24,1,1,2,2 105 | 20,26,04/12/2015,260,50,36,261.306,23,1,4,0,8 106 | 34,19,08/12/2015,118,10,37,261.306,28,1,0,0,56 107 | 10,22,09/12/2015,361,52,28,261.306,27,1,1,4,8 108 | 28,28,10/12/2015,225,26,28,261.306,24,1,1,2,3 109 | 20,28,11/12/2015,260,50,36,261.306,23,1,4,0,3 110 | 28,23,15/12/2015,225,26,28,261.306,24,1,1,2,2 111 | 10,22,16/12/2015,361,52,28,261.306,27,1,1,4,8 112 | 34,27,11/12/2015,118,10,37,261.306,28,1,0,0,2 113 | 24,19,18/12/2015,246,25,41,261.306,23,1,0,0,8 114 | 28,23,18/12/2015,225,26,28,261.306,24,1,1,2,2 115 | 28,23,06/01/2016,225,26,28,308.593,24,1,1,2,1 116 | 34,19,04/01/2016,118,10,37,308.593,28,1,0,0,1 117 | 34,27,05/01/2016,118,10,37,308.593,28,1,0,0,1 118 | 14,18,05/01/2016,155,12,34,308.593,25,1,2,0,8 119 | 28,27,06/01/2016,225,26,28,308.593,24,1,1,2,2 120 | 27,23,07/01/2016,184,42,27,308.593,21,1,0,0,2 121 | 28,28,07/01/2016,225,26,28,308.593,24,1,1,2,2 122 | 28,27,08/01/2016,225,26,28,308.593,24,1,1,2,1 123 | 34,27,11/01/2016,118,10,37,308.593,28,1,0,0,2 124 | 28,27,12/01/2016,225,26,28,308.593,24,1,1,2,2 125 | 34,27,12/01/2016,118,10,37,308.593,28,1,0,0,2 126 | 34,27,13/01/2016,118,10,37,308.593,28,1,0,0,2 127 | 34,27,14/01/2016,118,10,37,308.593,28,1,0,0,2 128 | 34,27,15/01/2016,118,10,37,308.593,28,1,0,0,2 129 | 34,27,11/01/2016,118,10,37,308.593,28,1,0,0,2 130 | 34,27,12/01/2016,118,10,37,308.593,28,1,0,0,2 131 | 22,18,12/01/2016,179,26,30,308.593,19,3,0,0,8 132 | 11,18,12/01/2016,289,36,33,308.593,30,1,2,1,8 133 | 34,27,13/01/2016,118,10,37,308.593,28,1,0,0,2 134 | 27,23,21/01/2016,184,42,27,308.593,21,1,0,0,2 135 | 34,27,28/01/2016,118,10,37,308.593,28,1,0,0,2 136 | 34,27,25/01/2016,118,10,37,308.593,28,1,0,0,0 137 | 28,23,26/01/2016,225,26,28,308.593,24,1,1,2,1 138 | 11,22,28/01/2016,289,36,33,308.593,30,1,2,1,3 139 | 27,23,05/02/2016,184,42,27,302.585,21,1,0,0,1 140 | 24,1,03/02/2016,246,25,41,302.585,23,1,0,0,8 141 | 3,11,10/02/2016,179,51,38,302.585,31,1,0,0,8 142 | 14,28,11/02/2016,155,12,34,302.585,25,1,2,0,2 143 | 6,23,11/02/2016,189,29,33,302.585,25,1,2,2,8 144 | 20,28,12/02/2016,260,50,36,302.585,23,1,4,0,2 145 | 11,22,12/02/2016,289,36,33,302.585,30,1,2,1,8 146 | 31,11,08/02/2016,388,15,50,302.585,24,1,0,0,8 147 | 31,1,09/02/2016,388,15,50,302.585,24,1,0,0,8 148 | 28,28,15/02/2016,225,26,28,302.585,24,1,1,2,2 149 | 28,23,09/02/2016,225,26,28,302.585,24,1,1,2,2 150 | 22,23,16/02/2016,179,26,30,302.585,19,3,0,0,1 151 | 27,23,23/02/2016,184,42,27,302.585,21,1,0,0,8 152 | 28,25,25/02/2016,225,26,28,302.585,24,1,1,2,3 153 | 18,18,15/02/2016,330,16,28,302.585,25,2,0,0,8 154 | 18,23,16/02/2016,330,16,28,302.585,25,2,0,0,1 155 | 28,23,17/02/2016,225,26,28,302.585,24,1,1,2,1 156 | 6,19,25/02/2016,189,29,33,302.585,25,1,2,2,8 157 | 19,28,01/03/2016,291,50,32,343.253,23,1,0,0,2 158 | 20,19,01/03/2016,260,50,36,343.253,23,1,4,0,8 159 | 30,19,08/03/2016,157,27,29,343.253,22,1,0,0,3 160 | 17,17,15/03/2016,179,22,40,343.253,22,2,2,0,8 161 | 15,22,16/03/2016,291,31,40,343.253,25,1,1,1,8 162 | 20,13,02/03/2016,260,50,36,343.253,23,1,4,0,8 163 | 22,13,03/03/2016,179,26,30,343.253,19,3,0,0,8 164 | 33,14,04/03/2016,248,25,47,343.253,32,1,2,1,3 165 | 20,13,04/03/2016,260,50,36,343.253,23,1,4,0,40 166 | 17,11,07/03/2016,179,22,40,343.253,22,2,2,0,40 167 | 14,1,14/03/2016,155,12,34,343.253,25,1,2,0,16 168 | 20,26,21/03/2016,260,50,36,343.253,23,1,4,0,16 169 | 14,13,22/03/2016,155,12,34,343.253,25,1,2,0,8 170 | 
11,6,24/03/2016,289,36,33,343.253,30,1,2,1,8 171 | 17,8,24/03/2016,179,22,40,343.253,22,2,2,0,8 172 | 20,28,25/03/2016,260,50,36,343.253,23,1,4,0,4 173 | 28,23,25/03/2016,225,26,28,343.253,24,1,1,2,1 174 | 7,14,28/03/2016,279,5,39,343.253,24,1,2,0,8 175 | 3,13,29/03/2016,179,51,38,343.253,31,1,0,0,24 176 | 28,23,30/03/2016,225,26,28,343.253,24,1,1,2,2 177 | 28,11,28/03/2016,225,26,28,343.253,24,1,1,2,8 178 | 22,13,28/03/2016,179,26,30,343.253,19,3,0,0,1 179 | 28,11,29/03/2016,225,26,28,343.253,24,1,1,2,8 180 | 28,11,30/03/2016,225,26,28,343.253,24,1,1,2,16 181 | 3,13,30/03/2016,179,51,38,343.253,31,1,0,0,3 182 | 7,14,17/03/2016,279,5,39,343.253,24,1,2,0,16 183 | 28,28,18/03/2016,225,26,28,343.253,24,1,1,2,2 184 | 33,14,25/03/2016,248,25,47,343.253,32,1,2,1,3 185 | 28,28,28/03/2016,225,26,28,343.253,24,1,1,2,1 186 | 15,28,06/04/2016,291,31,40,326.452,25,1,1,1,1 187 | 28,23,13/04/2016,225,26,28,326.452,24,1,1,2,1 188 | 14,28,12/04/2016,155,12,34,326.452,25,1,2,0,1 189 | 24,13,13/04/2016,246,25,41,326.452,23,1,0,0,24 190 | 14,23,14/04/2016,155,12,34,326.452,25,1,2,0,1 191 | 28,28,15/04/2016,225,26,28,326.452,24,1,1,2,2 192 | 20,28,15/04/2016,260,50,36,326.452,23,1,4,0,4 193 | 3,13,06/04/2016,179,51,38,326.452,31,1,0,0,24 194 | 36,23,07/04/2016,118,13,50,326.452,31,1,1,0,1 195 | 15,23,08/04/2016,291,31,40,326.452,25,1,1,1,3 196 | 24,14,08/04/2016,246,25,41,326.452,23,1,0,0,8 197 | 15,28,22/04/2016,291,31,40,326.452,25,1,1,1,1 198 | 33,28,29/04/2016,248,25,47,326.452,32,1,2,1,8 199 | 20,19,29/04/2016,260,50,36,326.452,23,1,4,0,56 200 | 11,19,26/04/2016,289,36,33,326.452,30,1,2,1,8 201 | 14,12,27/04/2016,155,12,34,326.452,25,1,2,0,24 202 | 23,19,27/04/2016,378,49,36,326.452,21,1,2,4,8 203 | 11,13,28/04/2016,289,36,33,326.452,30,1,2,1,16 204 | 1,7,08/04/2016,235,11,37,326.452,29,3,1,1,3 205 | 2,0,04/04/2016,235,29,48,326.452,33,1,1,5,0 206 | 11,13,04/05/2016,289,36,33,378.884,30,1,2,1,8 207 | 14,28,05/05/2016,155,12,34,378.884,25,1,2,0,2 208 | 14,28,02/05/2016,155,12,34,378.884,25,1,2,0,1 209 | 3,18,03/05/2016,179,51,38,378.884,31,1,0,0,8 210 | 28,19,03/05/2016,225,26,28,378.884,24,1,1,2,8 211 | 27,7,11/05/2016,184,42,27,378.884,21,1,0,0,4 212 | 14,28,09/05/2016,155,12,34,378.884,25,1,2,0,2 213 | 3,12,10/05/2016,179,51,38,378.884,31,1,0,0,1 214 | 11,13,04/05/2016,289,36,33,378.884,30,1,2,1,24 215 | 7,0,11/05/2016,279,5,39,378.884,24,1,2,0,0 216 | 18,0,18/05/2016,330,16,28,378.884,25,2,0,0,0 217 | 23,0,25/05/2016,378,49,36,378.884,21,1,2,4,0 218 | 31,0,25/05/2016,388,15,50,378.884,24,1,0,0,0 219 | 3,11,24/05/2016,179,51,38,378.884,31,1,0,0,1 220 | 36,13,11/05/2016,118,13,50,378.884,31,1,1,0,24 221 | 10,22,27/05/2016,361,52,28,378.884,27,1,1,4,8 222 | 24,19,06/06/2016,246,25,41,377.55,23,1,0,0,8 223 | 10,22,13/06/2016,361,52,28,377.55,27,1,1,4,8 224 | 24,10,14/06/2016,246,25,41,377.55,23,1,0,0,24 225 | 15,23,16/06/2016,291,31,40,377.55,25,1,1,1,4 226 | 24,10,17/06/2016,246,25,41,377.55,23,1,0,0,8 227 | 3,11,13/06/2016,179,51,38,377.55,31,1,0,0,8 228 | 14,23,20/06/2016,155,12,34,377.55,25,1,2,0,4 229 | 24,10,27/06/2016,246,25,41,377.55,23,1,0,0,8 230 | 36,13,29/06/2016,118,13,50,377.55,31,1,1,0,8 231 | 1,13,03/06/2016,235,11,37,377.55,29,3,1,1,16 232 | 36,23,07/06/2016,118,13,50,377.55,31,1,1,0,1 233 | 36,13,08/06/2016,118,13,50,377.55,31,1,1,0,80 234 | 23,22,09/06/2016,378,49,36,377.55,21,1,2,4,8 235 | 3,11,10/06/2016,179,51,38,377.55,31,1,0,0,2 236 | 32,28,20/06/2016,289,48,49,377.55,36,1,0,2,2 237 | 28,28,30/06/2016,225,26,28,377.55,24,1,1,2,2 238 | 14,19,05/07/2016,155,12,34,275.312,25,1,2,0,16 
239 | 36,1,06/07/2016,118,13,50,275.312,31,1,1,0,8 240 | 34,5,08/07/2016,118,10,37,275.312,28,1,0,0,8 241 | 34,26,08/07/2016,118,10,37,275.312,28,1,0,0,4 242 | 18,26,12/07/2016,330,16,28,275.312,25,2,0,0,8 243 | 22,18,14/07/2016,179,26,30,275.312,19,3,0,0,8 244 | 14,25,15/07/2016,155,12,34,275.312,25,1,2,0,2 245 | 18,1,18/07/2016,330,16,28,275.312,25,2,0,0,8 246 | 18,1,19/07/2016,330,16,28,275.312,25,2,0,0,8 247 | 30,25,18/07/2016,157,27,29,275.312,22,1,0,0,3 248 | 10,22,19/07/2016,361,52,28,275.312,27,1,1,4,8 249 | 11,26,20/07/2016,289,36,33,275.312,30,1,2,1,8 250 | 3,26,21/07/2016,179,51,38,275.312,31,1,0,0,8 251 | 11,19,11/07/2016,289,36,33,275.312,30,1,2,1,32 252 | 11,19,14/07/2016,289,36,33,275.312,30,1,2,1,8 253 | 20,0,21/07/2016,260,50,36,275.312,23,1,4,0,0 254 | 11,19,04/08/2016,289,36,33,265.615,30,1,2,1,8 255 | 30,19,11/08/2016,157,27,29,265.615,22,1,0,0,3 256 | 11,23,07/08/2016,289,36,33,265.615,30,1,2,1,1 257 | 9,18,08/08/2016,228,14,58,265.615,22,1,2,1,8 258 | 26,13,10/08/2016,300,26,43,265.615,25,1,2,1,1 259 | 26,14,10/08/2016,300,26,43,265.615,25,1,2,1,2 260 | 20,28,11/08/2016,260,50,36,265.615,23,1,4,0,4 261 | 11,23,08/08/2016,289,36,33,265.615,30,1,2,1,4 262 | 33,23,09/08/2016,248,25,47,265.615,32,1,2,1,1 263 | 21,11,10/08/2016,268,11,33,265.615,25,2,0,0,8 264 | 22,23,10/08/2016,179,26,30,265.615,19,3,0,0,1 265 | 36,13,10/08/2016,118,13,50,265.615,31,1,1,0,3 266 | 33,25,14/08/2016,248,25,47,265.615,32,1,2,1,2 267 | 1,23,15/08/2016,235,11,37,265.615,29,3,1,1,1 268 | 36,23,17/08/2016,118,13,50,265.615,31,1,1,0,1 269 | 1,19,24/08/2016,235,11,37,265.615,29,3,1,1,8 270 | 10,8,08/08/2016,361,52,28,265.615,27,1,1,4,8 271 | 27,6,30/08/2016,184,42,27,265.615,21,1,0,0,8 272 | 3,11,05/09/2016,179,51,38,294.217,31,1,0,0,8 273 | 3,23,09/09/2016,179,51,38,294.217,31,1,0,0,3 274 | 11,19,07/09/2016,289,36,33,294.217,30,1,2,1,24 275 | 5,0,08/09/2016,235,20,43,294.217,38,1,1,0,0 276 | 24,9,12/09/2016,246,25,41,294.217,23,1,0,0,16 277 | 15,28,13/09/2016,291,31,40,294.217,25,1,1,1,3 278 | 8,0,13/09/2016,231,35,39,294.217,35,1,2,2,0 279 | 19,0,13/09/2016,291,50,32,294.217,23,1,0,0,0 280 | 3,13,14/09/2016,179,51,38,294.217,31,1,0,0,8 281 | 24,9,14/09/2016,246,25,41,294.217,23,1,0,0,32 282 | 3,23,15/09/2016,179,51,38,294.217,31,1,0,0,1 283 | 15,28,16/09/2016,291,31,40,294.217,25,1,1,1,4 284 | 20,28,16/09/2016,260,50,36,294.217,23,1,4,0,4 285 | 5,26,14/09/2016,235,20,43,294.217,38,1,1,0,8 286 | 36,28,15/09/2016,118,13,50,294.217,31,1,1,0,1 287 | 5,0,15/09/2016,235,20,43,294.217,38,1,1,0,0 288 | 15,28,16/09/2016,291,31,40,294.217,25,1,1,1,3 289 | 15,7,26/09/2016,291,31,40,294.217,25,1,1,1,40 290 | 3,13,26/09/2016,179,51,38,294.217,31,1,0,0,8 291 | 11,24,05/10/2016,289,36,33,265.017,30,1,2,1,8 292 | 1,26,05/10/2016,235,11,37,265.017,29,3,1,1,4 293 | 11,26,05/10/2016,289,36,33,265.017,30,1,2,1,8 294 | 11,22,06/10/2016,289,36,33,265.017,30,1,2,1,8 295 | 36,0,06/10/2016,118,13,50,265.017,31,1,1,0,0 296 | 33,0,06/10/2016,248,25,47,265.017,32,1,2,1,0 297 | 22,1,12/10/2016,179,26,30,265.017,19,3,0,0,8 298 | 34,7,12/10/2016,118,10,37,265.017,28,1,0,0,3 299 | 13,22,12/10/2016,369,17,31,265.017,25,1,3,0,8 300 | 3,28,14/10/2016,179,51,38,265.017,31,1,0,0,1 301 | 22,1,14/10/2016,179,26,30,265.017,19,3,0,0,64 302 | 5,0,14/10/2016,235,20,43,265.017,38,1,1,0,0 303 | 11,19,15/10/2016,289,36,33,265.017,30,1,2,1,16 304 | 20,28,16/10/2016,260,50,36,265.017,23,1,4,0,3 305 | 5,0,16/10/2016,235,20,43,265.017,38,1,1,0,0 306 | 5,23,19/10/2016,235,20,43,265.017,38,1,1,0,2 307 | 5,23,19/10/2016,235,20,43,265.017,38,1,1,0,2 
308 | 36,28,20/10/2016,118,13,50,265.017,31,1,1,0,1 309 | 15,28,20/10/2016,291,31,40,265.017,25,1,1,1,4 310 | 22,23,22/10/2016,179,26,30,265.017,19,3,0,0,16 311 | 36,28,22/10/2016,118,13,50,265.017,31,1,1,0,1 312 | 10,10,26/10/2016,361,52,28,265.017,27,1,1,4,8 313 | 20,0,27/10/2016,260,50,36,265.017,23,1,4,0,0 314 | 15,0,27/10/2016,291,31,40,265.017,25,1,1,1,0 315 | 30,0,27/10/2016,157,27,29,265.017,22,1,0,0,0 316 | 22,1,28/10/2016,179,26,30,265.017,19,3,0,0,5 317 | 22,7,28/10/2016,179,26,30,265.017,19,3,0,0,5 318 | 36,23,29/10/2016,118,13,50,265.017,31,1,1,0,1 319 | 34,11,07/11/2016,118,10,37,284.031,28,1,0,0,8 320 | 33,23,07/11/2016,248,25,47,284.031,32,1,2,1,2 321 | 3,6,08/11/2016,179,51,38,284.031,31,1,0,0,8 322 | 20,28,11/11/2016,260,50,36,284.031,23,1,4,0,3 323 | 15,23,14/11/2016,291,31,40,284.031,25,1,1,1,1 324 | 23,1,14/11/2016,378,49,36,284.031,21,1,2,4,8 325 | 14,11,14/11/2016,155,12,34,284.031,25,1,2,0,120 326 | 5,26,14/11/2016,235,20,43,284.031,38,1,1,0,8 327 | 18,0,15/11/2016,330,16,28,284.031,25,2,0,0,0 328 | 1,18,16/11/2016,235,11,37,284.031,29,3,1,1,1 329 | 34,11,16/11/2016,118,10,37,284.031,28,1,0,0,3 330 | 1,25,17/11/2016,235,11,37,284.031,29,3,1,1,2 331 | 3,28,17/11/2016,179,51,38,284.031,31,1,0,0,3 332 | 24,13,18/11/2016,246,25,41,284.031,23,1,0,0,8 333 | 15,12,18/11/2016,291,31,40,284.031,25,1,1,1,4 334 | 24,13,21/11/2016,246,25,41,284.031,23,1,0,0,8 335 | 3,28,22/11/2016,179,51,38,284.031,31,1,0,0,1 336 | 20,10,23/11/2016,260,50,36,284.031,23,1,4,0,8 337 | 20,15,25/11/2016,260,50,36,284.031,23,1,4,0,8 338 | 23,0,25/11/2016,378,49,36,284.031,21,1,2,4,0 339 | 7,0,29/11/2016,279,5,39,284.031,24,1,2,0,0 340 | 3,23,24/11/2016,179,51,38,284.031,31,1,0,0,1 341 | 28,12,05/12/2016,225,26,28,236.629,24,1,1,2,3 342 | 3,28,05/12/2016,179,51,38,236.629,31,1,0,0,2 343 | 3,28,05/12/2016,179,51,38,236.629,31,1,0,0,1 344 | 1,23,05/12/2016,235,11,37,236.629,29,3,1,1,3 345 | 36,28,06/12/2016,118,13,50,236.629,31,1,1,0,1 346 | 20,28,09/12/2016,260,50,36,236.629,23,1,4,0,4 347 | 24,4,15/12/2016,246,25,41,236.629,23,1,0,0,8 348 | 3,28,15/12/2016,179,51,38,236.629,31,1,0,0,1 349 | 3,28,16/12/2016,179,51,38,236.629,31,1,0,0,1 350 | 22,23,20/12/2016,179,26,30,236.629,19,3,0,0,1 351 | 34,25,20/12/2016,118,10,37,236.629,28,1,0,0,8 352 | 1,25,22/12/2016,235,11,37,236.629,29,3,1,1,2 353 | 3,28,23/12/2016,179,51,38,236.629,31,1,0,0,1 354 | 5,13,20/12/2016,235,20,43,236.629,38,1,1,0,8 355 | 1,14,20/12/2016,235,11,37,236.629,29,3,1,1,4 356 | 20,26,21/12/2016,260,50,36,236.629,23,1,4,0,8 357 | 30,28,19/12/2016,157,27,29,236.629,22,1,0,0,2 358 | 3,28,19/12/2016,179,51,38,236.629,31,1,0,0,3 359 | 11,19,19/12/2016,289,36,33,236.629,30,1,2,1,8 360 | 28,23,04/01/2017,225,26,28,330.061,24,1,1,2,5 361 | 34,19,09/01/2017,118,10,37,330.061,28,1,0,0,32 362 | 14,23,09/01/2017,155,12,34,330.061,25,1,2,0,2 363 | 1,13,03/01/2017,235,11,37,330.061,29,3,1,1,1 364 | 14,23,03/01/2017,155,12,34,330.061,25,1,2,0,4 365 | 11,26,09/01/2017,289,36,33,330.061,30,1,2,1,8 366 | 15,3,11/01/2017,291,31,40,330.061,25,1,1,1,8 367 | 5,26,16/01/2017,235,20,43,330.061,38,1,1,0,8 368 | 36,26,16/01/2017,118,13,50,330.061,31,1,1,0,4 369 | 3,28,25/01/2017,179,51,38,330.061,31,1,0,0,1 370 | 3,28,27/01/2017,179,51,38,330.061,31,1,0,0,1 371 | 34,28,07/02/2017,118,10,37,251.818,28,1,0,0,2 372 | 3,27,01/02/2017,179,51,38,251.818,31,1,0,0,3 373 | 28,7,01/02/2017,225,26,28,251.818,24,1,1,2,1 374 | 11,22,03/02/2017,289,36,33,251.818,30,1,2,1,3 375 | 20,28,10/02/2017,260,50,36,251.818,23,1,4,0,3 376 | 3,23,17/02/2017,179,51,38,251.818,31,1,0,0,3 
377 | 3,27,13/02/2017,179,51,38,251.818,31,1,0,0,2 378 | 3,27,15/02/2017,179,51,38,251.818,31,1,0,0,3 379 | 3,10,16/02/2017,179,51,38,251.818,31,1,0,0,8 380 | 24,26,16/02/2017,246,25,41,251.818,23,1,0,0,8 381 | 3,27,17/02/2017,179,51,38,251.818,31,1,0,0,3 382 | 6,22,13/02/2017,189,29,33,251.818,25,1,2,2,8 383 | 3,27,13/02/2017,179,51,38,251.818,31,1,0,0,3 384 | 24,23,21/02/2017,246,25,41,251.818,23,1,0,0,2 385 | 15,23,21/02/2017,291,31,40,251.818,25,1,1,1,2 386 | 30,11,22/02/2017,157,27,29,251.818,22,1,0,0,16 387 | 3,27,22/02/2017,179,51,38,251.818,31,1,0,0,3 388 | 3,27,24/02/2017,179,51,38,251.818,31,1,0,0,3 389 | 24,10,24/02/2017,246,25,41,251.818,23,1,0,0,24 390 | 3,27,22/02/2017,179,51,38,251.818,31,1,0,0,3 391 | 3,27,24/02/2017,179,51,38,251.818,31,1,0,0,3 392 | 34,18,07/03/2017,118,10,37,244.387,28,1,0,0,8 393 | 24,19,08/03/2017,246,25,41,244.387,23,1,0,0,16 394 | 24,28,10/03/2017,246,25,41,244.387,23,1,0,0,2 395 | 20,28,10/03/2017,260,50,36,244.387,23,1,4,0,4 396 | 3,28,13/03/2017,179,51,38,244.387,31,1,0,0,2 397 | 1,22,13/03/2017,235,11,37,244.387,29,3,1,1,8 398 | 17,22,14/03/2017,179,22,40,244.387,22,2,2,0,8 399 | 23,22,14/03/2017,378,49,36,244.387,21,1,2,4,8 400 | 3,28,20/03/2017,179,51,38,244.387,31,1,0,0,16 401 | 10,22,22/03/2017,361,52,28,244.387,27,1,1,4,8 402 | 13,0,22/03/2017,369,17,31,244.387,25,1,3,0,0 403 | 1,21,23/03/2017,235,11,37,244.387,29,3,1,1,8 404 | 36,23,24/03/2017,118,13,50,244.387,31,1,1,0,2 405 | 36,14,28/03/2017,118,13,50,244.387,31,1,1,0,3 406 | 36,13,29/03/2017,118,13,50,244.387,31,1,1,0,8 407 | 1,0,30/03/2017,235,11,37,244.387,29,3,1,1,0 408 | 24,0,30/03/2017,246,25,41,244.387,23,1,0,0,0 409 | 36,0,30/03/2017,118,13,50,244.387,31,1,1,0,0 410 | 3,28,31/03/2017,179,51,38,244.387,31,1,0,0,8 411 | 11,22,31/03/2017,289,36,33,244.387,30,1,2,1,8 412 | 20,19,27/03/2017,260,50,36,244.387,23,1,4,0,8 413 | 24,28,28/03/2017,246,25,41,244.387,23,1,0,0,2 414 | 3,28,05/04/2017,179,51,38,239.409,31,1,0,0,4 415 | 20,28,07/04/2017,260,50,36,239.409,23,1,4,0,3 416 | 18,26,07/04/2017,330,16,28,239.409,25,2,0,0,4 417 | 13,22,10/04/2017,369,17,31,239.409,25,1,3,0,4 418 | 33,26,10/04/2017,248,25,47,239.409,32,1,2,1,4 419 | 18,23,12/04/2017,330,16,28,239.409,25,2,0,0,8 420 | 3,28,12/04/2017,179,51,38,239.409,31,1,0,0,8 421 | 36,23,17/04/2017,118,13,50,239.409,31,1,1,0,1 422 | 36,13,19/04/2017,118,13,50,239.409,31,1,1,0,120 423 | 26,28,28/04/2017,300,26,43,239.409,25,1,2,1,8 424 | 20,28,28/04/2017,260,50,36,239.409,23,1,4,0,4 425 | 3,28,24/04/2017,179,51,38,239.409,31,1,0,0,4 426 | 34,11,26/04/2017,118,10,37,239.409,28,1,0,0,2 427 | 5,13,01/05/2017,235,20,43,246.074,38,1,1,0,16 428 | 33,23,03/05/2017,248,25,47,246.074,32,1,2,1,2 429 | 13,10,08/05/2017,369,17,31,246.074,25,1,3,0,8 430 | 22,23,10/05/2017,179,26,30,246.074,19,3,0,0,3 431 | 3,28,10/05/2017,179,51,38,246.074,31,1,0,0,4 432 | 10,23,11/05/2017,361,52,28,246.074,27,1,1,4,1 433 | 20,28,12/05/2017,260,50,36,246.074,23,1,4,0,3 434 | 17,11,15/05/2017,179,22,40,246.074,22,2,2,0,2 435 | 17,8,15/05/2017,179,22,40,246.074,22,2,2,0,3 436 | 9,18,17/05/2017,228,14,58,246.074,22,1,2,1,8 437 | 28,25,17/05/2017,225,26,28,246.074,24,1,1,2,3 438 | 18,13,19/05/2017,330,16,28,246.074,25,2,0,0,8 439 | 22,25,22/05/2017,179,26,30,246.074,19,3,0,0,2 440 | 34,28,22/05/2017,118,10,37,246.074,28,1,0,0,1 441 | 1,1,29/05/2017,235,11,37,246.074,29,3,1,1,8 442 | 22,23,31/05/2017,179,26,30,246.074,19,3,0,0,3 443 | 34,23,05/06/2017,118,10,37,253.957,28,1,0,0,3 444 | 3,28,05/06/2017,179,51,38,253.957,31,1,0,0,3 445 | 
34,28,06/06/2017,118,10,37,253.957,28,1,0,0,2 446 | 28,23,08/06/2017,225,26,28,253.957,24,1,1,2,4 447 | 20,28,09/06/2017,260,50,36,253.957,23,1,4,0,4 448 | 3,0,09/06/2017,179,51,38,253.957,31,1,0,0,0 449 | 15,13,12/06/2017,291,31,40,253.957,25,1,1,1,40 450 | 3,28,12/06/2017,179,51,38,253.957,31,1,0,0,24 451 | 24,28,13/06/2017,246,25,41,253.957,23,1,0,0,3 452 | 3,28,19/06/2017,179,51,38,253.957,31,1,0,0,4 453 | 5,26,20/06/2017,235,20,43,253.957,38,1,1,0,8 454 | 3,28,26/06/2017,179,51,38,253.957,31,1,0,0,2 455 | 28,23,28/06/2017,225,26,28,253.957,24,1,1,2,2 456 | 36,23,28/06/2017,118,13,50,253.957,31,1,1,0,2 457 | 3,5,28/06/2017,179,51,38,253.957,31,1,0,0,8 458 | 22,21,28/06/2017,179,26,30,253.957,19,3,0,0,2 459 | 24,28,30/06/2017,246,25,41,253.957,23,1,0,0,2 460 | 18,11,27/06/2017,330,16,28,253.957,25,2,0,0,1 461 | 1,13,27/06/2017,235,11,37,253.957,29,3,1,1,8 462 | 22,23,06/07/2017,179,26,30,230.29,19,3,0,0,2 463 | 28,25,06/07/2017,225,26,28,230.29,24,1,1,2,4 464 | 20,13,07/07/2017,260,50,36,230.29,23,1,4,0,8 465 | 21,7,10/07/2017,268,11,33,230.29,25,2,0,0,8 466 | 18,25,14/07/2017,330,16,28,230.29,25,2,0,0,8 467 | 34,26,14/07/2017,118,10,37,230.29,28,1,0,0,8 468 | 20,26,17/07/2017,260,50,36,230.29,23,1,4,0,4 469 | 34,28,18/07/2017,118,10,37,230.29,28,1,0,0,8 470 | 26,15,24/07/2017,300,26,43,230.29,25,1,2,1,8 471 | 2,23,24/07/2017,235,29,48,230.29,33,1,1,5,1 472 | 24,28,25/07/2017,246,25,41,230.29,23,1,0,0,2 473 | 28,9,25/07/2017,225,26,28,230.29,24,1,1,2,112 474 | 3,28,25/07/2017,179,51,38,230.29,31,1,0,0,1 475 | 36,23,28/07/2017,118,13,50,230.29,31,1,1,0,1 476 | 10,22,28/07/2017,361,52,28,230.29,27,1,1,4,8 477 | 11,22,24/07/2017,289,36,33,230.29,30,1,2,1,8 478 | 5,26,24/07/2017,235,20,43,230.29,38,1,1,0,8 479 | 24,28,25/07/2017,246,25,41,230.29,23,1,0,0,2 480 | 15,28,27/07/2017,291,31,40,230.29,25,1,1,1,1 481 | 7,23,27/07/2017,279,5,39,230.29,24,1,2,0,2 482 | 3,25,03/08/2017,179,51,38,249.797,31,1,0,0,4 483 | 17,25,07/08/2017,179,22,40,249.797,22,2,2,0,1 484 | 24,28,08/08/2017,246,25,41,249.797,23,1,0,0,4 485 | 34,28,08/08/2017,118,10,37,249.797,28,1,0,0,4 486 | 11,26,08/08/2017,289,36,33,249.797,30,1,2,1,8 487 | 5,26,08/08/2017,235,20,43,249.797,38,1,1,0,8 488 | 15,28,10/08/2017,291,31,40,249.797,25,1,1,1,4 489 | 3,25,14/08/2017,179,51,38,249.797,31,1,0,0,4 490 | 17,25,15/08/2017,179,22,40,249.797,22,2,2,0,8 491 | 18,23,17/08/2017,330,16,28,249.797,25,2,0,0,16 492 | 1,23,22/08/2017,235,11,37,249.797,29,3,1,1,4 493 | 24,28,22/08/2017,246,25,41,249.797,23,1,0,0,1 494 | 34,28,22/08/2017,118,10,37,249.797,28,1,0,0,5 495 | 15,28,24/08/2017,291,31,40,249.797,25,1,1,1,2 496 | 20,28,28/08/2017,260,50,36,249.797,23,1,4,0,3 497 | 24,28,05/09/2017,246,25,41,261.756,23,1,0,0,1 498 | 24,28,05/09/2017,246,25,41,261.756,23,1,0,0,1 499 | 34,28,05/09/2017,118,10,37,261.756,28,1,0,0,3 500 | 14,23,05/09/2017,155,12,34,261.756,25,1,2,0,2 501 | 15,28,07/09/2017,291,31,40,261.756,25,1,1,1,2 502 | 22,23,08/09/2017,179,26,30,261.756,19,3,0,0,8 503 | 33,23,08/09/2017,248,25,47,261.756,32,1,2,1,1 504 | 3,23,11/09/2017,179,51,38,261.756,31,1,0,0,4 505 | 28,23,13/09/2017,225,26,28,261.756,24,1,1,2,1 506 | 22,23,18/09/2017,179,26,30,261.756,19,3,0,0,2 507 | 13,23,19/09/2017,369,17,31,261.756,25,1,3,0,8 508 | 10,22,26/09/2017,361,52,28,261.756,27,1,1,4,8 509 | 32,4,05/10/2017,289,48,49,284.853,36,1,0,2,1 510 | 25,11,05/10/2017,235,16,32,284.853,25,3,0,0,3 511 | 24,26,06/10/2017,246,25,41,284.853,23,1,0,0,8 512 | 32,14,11/10/2017,289,48,49,284.853,36,1,0,2,3 513 | 15,28,11/10/2017,291,31,40,284.853,25,1,1,1,2 514 | 
34,23,17/10/2017,118,10,37,284.853,28,1,0,0,2 515 | 32,23,19/10/2017,289,48,49,284.853,36,1,0,2,2 516 | 15,23,20/10/2017,291,31,40,284.853,25,1,1,1,1 517 | 28,23,24/10/2017,225,26,28,284.853,24,1,1,2,2 518 | 13,23,24/10/2017,369,17,31,284.853,25,1,3,0,8 519 | 13,23,24/10/2017,369,17,31,284.853,25,1,3,0,3 520 | 28,23,24/10/2017,225,26,28,284.853,24,1,1,2,4 521 | 13,26,24/10/2017,369,17,31,284.853,25,1,3,0,8 522 | 3,28,25/10/2017,179,51,38,284.853,31,1,0,0,3 523 | 9,1,25/10/2017,228,14,58,284.853,22,1,2,1,1 524 | 15,23,25/10/2017,291,31,40,284.853,25,1,1,1,1 525 | 13,10,26/10/2017,369,17,31,284.853,25,1,3,0,8 526 | 28,13,26/10/2017,225,26,28,284.853,24,1,1,2,1 527 | 13,10,27/10/2017,369,17,31,284.853,25,1,3,0,8 528 | 28,10,27/10/2017,225,26,28,284.853,24,1,1,2,3 529 | 6,23,29/10/2017,189,29,33,284.853,25,1,2,2,8 530 | 25,6,29/10/2017,235,16,32,284.853,25,3,0,0,8 531 | 33,10,29/10/2017,248,25,47,284.853,32,1,2,1,8 532 | 28,0,29/10/2017,225,26,28,284.853,24,1,1,2,0 533 | 28,13,31/10/2017,225,26,28,284.853,24,1,1,2,3 534 | 3,21,07/11/2017,179,51,38,268.519,31,1,0,0,1 535 | 34,28,08/11/2017,118,10,37,268.519,28,1,0,0,3 536 | 18,2,08/11/2017,330,16,28,268.519,25,2,0,0,24 537 | 3,28,10/11/2017,179,51,38,268.519,31,1,0,0,1 538 | 34,9,14/11/2017,118,10,37,268.519,28,1,0,0,8 539 | 11,24,15/11/2017,289,36,33,268.519,30,1,2,1,8 540 | 25,1,17/11/2017,235,16,32,268.519,25,3,0,0,8 541 | 28,23,17/11/2017,225,26,28,268.519,24,1,1,2,4 542 | 10,22,21/11/2017,361,52,28,268.519,27,1,1,4,8 543 | 15,28,22/11/2017,291,31,40,268.519,25,1,1,1,2 544 | 34,13,23/11/2017,118,10,37,268.519,28,1,0,0,2 545 | 28,14,23/11/2017,225,26,28,268.519,24,1,1,2,3 546 | 3,28,27/11/2017,179,51,38,268.519,31,1,0,0,1 547 | 34,23,27/11/2017,118,10,37,268.519,28,1,0,0,8 548 | 34,8,28/11/2017,118,10,37,268.519,28,1,0,0,8 549 | 28,23,28/11/2017,225,26,28,268.519,24,1,1,2,2 550 | 15,0,28/11/2017,291,31,40,268.519,25,1,1,1,0 551 | 11,0,29/11/2017,289,36,33,268.519,30,1,2,1,0 552 | 33,14,30/11/2017,248,25,47,268.519,32,1,2,1,4 553 | 5,0,30/11/2017,235,20,43,268.519,38,1,1,0,0 554 | 28,23,24/11/2017,225,26,28,268.519,24,1,1,2,2 555 | 13,26,24/11/2017,369,17,31,268.519,25,1,3,0,8 556 | 10,28,27/11/2017,361,52,28,268.519,27,1,1,4,2 557 | 3,13,05/12/2017,179,51,38,280.549,31,1,0,0,32 558 | 15,28,06/12/2017,291,31,40,280.549,25,1,1,1,1 559 | 28,23,06/12/2017,225,26,28,280.549,24,1,1,2,3 560 | 22,13,08/12/2017,179,26,30,280.549,19,3,0,0,1 561 | 28,23,08/12/2017,225,26,28,280.549,24,1,1,2,3 562 | 28,23,13/12/2017,225,26,28,280.549,24,1,1,2,3 563 | 10,14,14/12/2017,361,52,28,280.549,27,1,1,4,4 564 | 17,18,15/12/2017,179,22,40,280.549,22,2,2,0,2 565 | 5,26,15/12/2017,235,20,43,280.549,38,1,1,0,8 566 | 12,18,18/12/2017,233,51,31,280.549,21,2,1,8,8 567 | 22,13,19/12/2017,179,26,30,280.549,19,3,0,0,16 568 | 28,23,19/12/2017,225,26,28,280.549,24,1,1,2,2 569 | 28,23,21/12/2017,225,26,28,280.549,24,1,1,2,3 570 | 28,23,18/12/2017,225,26,28,280.549,24,1,1,2,2 571 | 14,18,19/12/2017,155,12,34,280.549,25,1,2,0,80 572 | 22,12,01/01/2018,179,26,30,313.532,19,3,0,0,24 573 | 22,12,04/01/2018,179,26,30,313.532,19,3,0,0,16 574 | 17,25,04/01/2018,179,22,40,313.532,22,2,2,0,2 575 | 17,25,05/01/2018,179,22,40,313.532,22,2,2,0,2 576 | 22,13,08/01/2018,179,26,30,313.532,19,3,0,0,3 577 | 17,25,10/01/2018,179,22,40,313.532,22,2,2,0,2 578 | 32,10,11/01/2018,289,48,49,313.532,36,1,0,2,8 579 | 17,18,12/01/2018,179,22,40,313.532,22,2,2,0,3 580 | 22,27,15/01/2018,179,26,30,313.532,19,3,0,0,2 581 | 14,18,16/01/2018,155,12,34,313.532,25,1,2,0,8 582 | 
22,27,17/01/2018,179,26,30,313.532,19,3,0,0,2 583 | 3,27,17/01/2018,179,51,38,313.532,31,1,0,0,3 584 | 11,13,17/01/2018,289,36,33,313.532,30,1,2,1,8 585 | 3,27,18/01/2018,179,51,38,313.532,31,1,0,0,3 586 | 3,27,26/01/2018,179,51,38,313.532,31,1,0,0,2 587 | 3,13,06/02/2018,179,51,38,264.249,31,1,0,0,8 588 | 28,23,06/02/2018,225,26,28,264.249,24,1,1,2,3 589 | 33,1,07/02/2018,248,25,47,264.249,32,1,2,1,8 590 | 3,27,07/02/2018,179,51,38,264.249,31,1,0,0,2 591 | 28,28,08/02/2018,225,26,28,264.249,24,1,1,2,3 592 | 3,27,08/02/2018,179,51,38,264.249,31,1,0,0,2 593 | 22,27,08/02/2018,179,26,30,264.249,19,3,0,0,2 594 | 29,28,09/02/2018,225,15,41,264.249,28,4,2,2,2 595 | 3,27,09/02/2018,179,51,38,264.249,31,1,0,0,2 596 | 12,19,12/02/2018,233,51,31,264.249,21,2,1,8,2 597 | 3,27,12/02/2018,179,51,38,264.249,31,1,0,0,2 598 | 28,7,13/02/2018,225,26,28,264.249,24,1,1,2,8 599 | 3,27,14/02/2018,179,51,38,264.249,31,1,0,0,3 600 | 3,27,15/02/2018,179,51,38,264.249,31,1,0,0,3 601 | 28,25,15/02/2018,225,26,28,264.249,24,1,1,2,3 602 | 22,13,15/02/2018,179,26,30,264.249,19,3,0,0,2 603 | 17,23,16/02/2018,179,22,40,264.249,22,2,2,0,2 604 | 3,27,16/02/2018,179,51,38,264.249,31,1,0,0,3 605 | 12,12,21/02/2018,233,51,31,264.249,21,2,1,8,3 606 | 22,27,21/02/2018,179,26,30,264.249,19,3,0,0,2 607 | 3,27,21/02/2018,179,51,38,264.249,31,1,0,0,2 608 | 3,13,22/02/2018,179,51,38,264.249,31,1,0,0,8 609 | 3,27,23/02/2018,179,51,38,264.249,31,1,0,0,2 610 | 14,25,26/02/2018,155,12,34,264.249,25,1,2,0,5 611 | 25,25,26/02/2018,235,16,32,264.249,25,3,0,0,3 612 | 3,27,26/02/2018,179,51,38,264.249,31,1,0,0,2 613 | 28,7,26/02/2018,225,26,28,264.249,24,1,1,2,2 614 | 3,27,27/02/2018,179,51,38,264.249,31,1,0,0,2 615 | 33,23,27/02/2018,248,25,47,264.249,32,1,2,1,2 616 | 28,25,27/02/2018,225,26,28,264.249,24,1,1,2,2 617 | 3,27,28/02/2018,179,51,38,264.249,31,1,0,0,2 618 | 3,27,22/02/2018,179,51,38,264.249,31,1,0,0,2 619 | 25,25,23/02/2018,235,16,32,264.249,25,3,0,0,2 620 | 3,27,05/03/2018,179,51,38,222.196,31,1,0,0,2 621 | 33,23,05/03/2018,248,25,47,222.196,32,1,2,1,2 622 | 9,25,06/03/2018,228,14,58,222.196,22,1,2,1,3 623 | 33,25,06/03/2018,248,25,47,222.196,32,1,2,1,3 624 | 9,12,06/03/2018,228,14,58,222.196,22,1,2,1,112 625 | 3,27,07/03/2018,179,51,38,222.196,31,1,0,0,2 626 | 28,27,08/03/2018,225,26,28,222.196,24,1,1,2,2 627 | 3,27,08/03/2018,179,51,38,222.196,31,1,0,0,3 628 | 28,25,08/03/2018,225,26,28,222.196,24,1,1,2,2 629 | 22,27,09/03/2018,179,26,30,222.196,19,3,0,0,3 630 | 25,25,12/03/2018,235,16,32,222.196,25,3,0,0,3 631 | 10,19,12/03/2018,361,52,28,222.196,27,1,1,4,8 632 | 3,13,13/03/2018,179,51,38,222.196,31,1,0,0,8 633 | 3,27,14/03/2018,179,51,38,222.196,31,1,0,0,2 634 | 3,27,15/03/2018,179,51,38,222.196,31,1,0,0,3 635 | 22,27,16/03/2018,179,26,30,222.196,19,3,0,0,2 636 | 3,10,19/03/2018,179,51,38,222.196,31,1,0,0,4 637 | 33,13,19/03/2018,248,25,47,222.196,32,1,2,1,2 638 | 3,27,19/03/2018,179,51,38,222.196,31,1,0,0,3 639 | 28,7,19/03/2018,225,26,28,222.196,24,1,1,2,8 640 | 3,27,20/03/2018,179,51,38,222.196,31,1,0,0,2 641 | 11,23,21/03/2018,289,36,33,222.196,30,1,2,1,8 642 | 9,25,21/03/2018,228,14,58,222.196,22,1,2,1,2 643 | 3,27,21/03/2018,179,51,38,222.196,31,1,0,0,2 644 | 33,23,22/03/2018,248,25,47,222.196,32,1,2,1,3 645 | 3,27,22/03/2018,179,51,38,222.196,31,1,0,0,3 646 | 22,23,23/03/2018,179,26,30,222.196,19,3,0,0,2 647 | 3,27,23/03/2018,179,51,38,222.196,31,1,0,0,3 648 | 3,27,27/03/2018,179,51,38,222.196,31,1,0,0,3 649 | 16,23,28/03/2018,118,15,46,222.196,25,1,2,0,8 650 | 14,13,28/03/2018,155,12,34,222.196,25,1,2,0,24 651 | 
3,27,28/03/2018,179,51,38,222.196,31,1,0,0,3 652 | 3,27,29/03/2018,179,51,38,222.196,31,1,0,0,3 653 | 22,13,26/03/2018,179,26,30,222.196,19,3,0,0,2 654 | 11,19,26/03/2018,289,36,33,222.196,30,1,2,1,104 655 | 13,22,28/03/2018,369,17,31,222.196,25,1,3,0,8 656 | 28,13,02/04/2018,225,26,28,246.288,24,1,1,2,8 657 | 34,10,02/04/2018,118,10,37,246.288,28,1,0,0,8 658 | 10,19,03/04/2018,361,52,28,246.288,27,1,1,4,8 659 | 33,19,04/04/2018,248,25,47,246.288,32,1,2,1,8 660 | 6,13,05/04/2018,189,29,33,246.288,25,1,2,2,8 661 | 22,27,06/04/2018,179,26,30,246.288,19,3,0,0,2 662 | 13,7,09/04/2018,369,17,31,246.288,25,1,3,0,24 663 | 17,16,10/04/2018,179,22,40,246.288,22,2,2,0,2 664 | 36,23,10/04/2018,118,13,50,246.288,31,1,1,0,3 665 | 10,23,10/04/2018,361,52,28,246.288,27,1,1,4,2 666 | 34,10,11/04/2018,118,10,37,246.288,28,1,0,0,2 667 | 1,22,13/04/2018,235,11,37,246.288,29,3,1,1,8 668 | 22,27,13/04/2018,179,26,30,246.288,19,3,0,0,2 669 | 28,19,16/04/2018,225,26,28,246.288,24,1,1,2,8 670 | 25,16,17/04/2018,235,16,32,246.288,25,3,0,0,3 671 | 22,27,20/04/2018,179,26,30,246.288,19,3,0,0,2 672 | 14,28,24/04/2018,155,12,34,246.288,25,1,2,0,4 673 | 28,19,26/04/2018,225,26,28,246.288,24,1,1,2,8 674 | 36,14,26/04/2018,118,13,50,246.288,31,1,1,0,2 675 | 22,27,27/04/2018,179,26,30,246.288,19,3,0,0,2 676 | 1,22,07/05/2018,235,11,37,237.656,29,3,1,1,8 677 | 29,19,09/05/2018,225,15,41,237.656,28,4,2,2,3 678 | 25,28,09/05/2018,235,16,32,237.656,25,3,0,0,2 679 | 34,8,09/05/2018,118,10,37,237.656,28,1,0,0,3 680 | 5,26,09/05/2018,235,20,43,237.656,38,1,1,0,8 681 | 22,13,10/05/2018,179,26,30,237.656,19,3,0,0,1 682 | 15,28,10/05/2018,291,31,40,237.656,25,1,1,1,2 683 | 29,14,10/05/2018,225,15,41,237.656,28,4,2,2,8 684 | 26,19,11/05/2018,300,26,43,237.656,25,1,2,1,64 685 | 29,22,11/05/2018,225,15,41,237.656,28,4,2,2,8 686 | 22,27,11/05/2018,179,26,30,237.656,19,3,0,0,2 687 | 36,23,14/05/2018,118,13,50,237.656,31,1,1,0,2 688 | 36,5,15/05/2018,118,13,50,237.656,31,1,1,0,3 689 | 34,28,15/05/2018,118,10,37,237.656,28,1,0,0,1 690 | 36,0,15/05/2018,118,13,50,237.656,31,1,1,0,0 691 | 22,27,16/05/2018,179,26,30,237.656,19,3,0,0,2 692 | 23,0,16/05/2018,378,49,36,237.656,21,1,2,4,0 693 | 17,16,18/05/2018,179,22,40,237.656,22,2,2,0,1 694 | 14,10,21/05/2018,155,12,34,237.656,25,1,2,0,48 695 | 25,10,21/05/2018,235,16,32,237.656,25,3,0,0,8 696 | 15,22,23/05/2018,291,31,40,237.656,25,1,1,1,8 697 | 17,10,23/05/2018,179,22,40,237.656,22,2,2,0,8 698 | 28,6,23/05/2018,225,26,28,237.656,24,1,1,2,3 699 | 18,10,24/05/2018,330,16,28,237.656,25,2,0,0,8 700 | 25,23,24/05/2018,235,16,32,237.656,25,3,0,0,2 701 | 15,28,31/05/2018,291,31,40,237.656,25,1,1,1,2 702 | --------------------------------------------------------------------------------