├── Basic Probability and Statistics (Recap) ├── ANOVA.pdf ├── Correlation Analysis.pdf ├── Descriptive Statistics.pdf ├── Probability Distributions.pdf ├── Sampling Distributions.pdf ├── Simple Linear Regression.pdf ├── Testing of Hypothesis.pdf └── Theory of Estimation.pdf ├── Handwritten Classnotes ├── MDA_Chapter_0.pdf ├── MDA_Chapter_1.pdf ├── MDA_Chapter_10.pdf ├── MDA_Chapter_2.pdf ├── MDA_Chapter_3.pdf ├── MDA_Chapter_4.pdf ├── MDA_Chapter_5.pdf ├── MDA_Chapter_6.pdf ├── MDA_Chapter_7.pdf ├── MDA_Chapter_8.pdf └── MDA_Chapter_9.pdf ├── Introduction to MDA Course.pdf ├── Practical Slides and Codes ├── Advertising.csv ├── Amount.csv ├── Big_Mart_Dataset.csv ├── Credit_Card_Expenses.csv ├── DC_Simple_Reg.csv ├── E15demand.csv ├── Logistic_Reg.csv ├── Missing_Values_Telecom.csv ├── Missing_Values_Telecom_mod.csv ├── Mult_Reg_Conversion.csv ├── Mult_Reg_Yield.csv ├── Nonlinear_Thrust.csv ├── PO_Processing.csv ├── Practical_1_Intro2R.pdf ├── Practical_1_Intro2R_Code.R ├── Practical_2_EDA.pdf ├── Practical_2_EDA_Code.R ├── Practical_3_Distribution.pdf ├── Practical_3_Distribution_Code.R ├── Practical_4_ToH_ANOVA.pdf ├── Practical_4_ToH_ANOVA_Code.R ├── Practical_5_Linear_Regression.pdf ├── Practical_5_Linear_Regression_Code.R ├── Practical_6_MLR.pdf ├── Practical_6_MLR_Code.R ├── Practical_7_Logistic_Regression.pdf ├── Practical_7_Logistic_Regression_Code.R ├── Practical_8_TS_Forecasting.pdf ├── Practical_8_TS_Forecasting_Code.R ├── Reg_Spline_DFR.csv ├── Resort_Visit.csv ├── ST_Defects.csv ├── Sales_Revenue_Anova.csv ├── Seasonal_sales.csv ├── Supply_Chain.csv ├── Travel_dummy_Reg.csv ├── Trend_&_Seasonal.csv ├── Trens_GDP.csv ├── Visits.csv ├── bank_data.csv └── shipment.csv ├── README.md └── Tutorial Worksheets ├── MDA_TD_1.pdf ├── MDA_TD_1_Sol.pdf ├── MDA_TD_2.pdf ├── MDA_TD_2_Sol.pdf ├── MDA_TD_3.pdf ├── MDA_TD_3_Sol.pdf ├── MDA_TD_4.pdf ├── MDA_TD_4_Sol.pdf ├── MDA_TD_5.pdf ├── MDA_TD_5_Sol.pdf └── Statistical Tables.pdf /Basic Probability and Statistics (Recap)/ANOVA.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Basic Probability and Statistics (Recap)/ANOVA.pdf -------------------------------------------------------------------------------- /Basic Probability and Statistics (Recap)/Correlation Analysis.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Basic Probability and Statistics (Recap)/Correlation Analysis.pdf -------------------------------------------------------------------------------- /Basic Probability and Statistics (Recap)/Descriptive Statistics.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Basic Probability and Statistics (Recap)/Descriptive Statistics.pdf -------------------------------------------------------------------------------- /Basic Probability and Statistics (Recap)/Probability Distributions.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Basic Probability and Statistics (Recap)/Probability Distributions.pdf -------------------------------------------------------------------------------- /Basic Probability and Statistics (Recap)/Sampling 
Distributions.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Basic Probability and Statistics (Recap)/Sampling Distributions.pdf -------------------------------------------------------------------------------- /Basic Probability and Statistics (Recap)/Simple Linear Regression.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Basic Probability and Statistics (Recap)/Simple Linear Regression.pdf -------------------------------------------------------------------------------- /Basic Probability and Statistics (Recap)/Testing of Hypothesis.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Basic Probability and Statistics (Recap)/Testing of Hypothesis.pdf -------------------------------------------------------------------------------- /Basic Probability and Statistics (Recap)/Theory of Estimation.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Basic Probability and Statistics (Recap)/Theory of Estimation.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_0.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_0.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_1.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_1.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_10.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_10.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_2.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_2.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_3.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_3.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_4.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_4.pdf -------------------------------------------------------------------------------- 
/Handwritten Classnotes/MDA_Chapter_5.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_5.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_6.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_6.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_7.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_7.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_8.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_8.pdf -------------------------------------------------------------------------------- /Handwritten Classnotes/MDA_Chapter_9.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Handwritten Classnotes/MDA_Chapter_9.pdf -------------------------------------------------------------------------------- /Introduction to MDA Course.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Introduction to MDA Course.pdf -------------------------------------------------------------------------------- /Practical Slides and Codes/Advertising.csv: -------------------------------------------------------------------------------- 1 | ,TV,radio,newspaper,sales 2 | 1,230.1,37.8,69.2,22.1 3 | 2,44.5,39.3,45.1,10.4 4 | 3,17.2,45.9,69.3,9.3 5 | 4,151.5,41.3,58.5,18.5 6 | 5,180.8,10.8,58.4,12.9 7 | 6,8.7,48.9,75,7.2 8 | 7,57.5,32.8,23.5,11.8 9 | 8,120.2,19.6,11.6,13.2 10 | 9,8.6,2.1,1,4.8 11 | 10,199.8,2.6,21.2,10.6 12 | 11,66.1,5.8,24.2,8.6 13 | 12,214.7,24,4,17.4 14 | 13,23.8,35.1,65.9,9.2 15 | 14,97.5,7.6,7.2,9.7 16 | 15,204.1,32.9,46,19 17 | 16,195.4,47.7,52.9,22.4 18 | 17,67.8,36.6,114,12.5 19 | 18,281.4,39.6,55.8,24.4 20 | 19,69.2,20.5,18.3,11.3 21 | 20,147.3,23.9,19.1,14.6 22 | 21,218.4,27.7,53.4,18 23 | 22,237.4,5.1,23.5,12.5 24 | 23,13.2,15.9,49.6,5.6 25 | 24,228.3,16.9,26.2,15.5 26 | 25,62.3,12.6,18.3,9.7 27 | 26,262.9,3.5,19.5,12 28 | 27,142.9,29.3,12.6,15 29 | 28,240.1,16.7,22.9,15.9 30 | 29,248.8,27.1,22.9,18.9 31 | 30,70.6,16,40.8,10.5 32 | 31,292.9,28.3,43.2,21.4 33 | 32,112.9,17.4,38.6,11.9 34 | 33,97.2,1.5,30,9.6 35 | 34,265.6,20,0.3,17.4 36 | 35,95.7,1.4,7.4,9.5 37 | 36,290.7,4.1,8.5,12.8 38 | 37,266.9,43.8,5,25.4 39 | 38,74.7,49.4,45.7,14.7 40 | 39,43.1,26.7,35.1,10.1 41 | 40,228,37.7,32,21.5 42 | 41,202.5,22.3,31.6,16.6 43 | 42,177,33.4,38.7,17.1 44 | 43,293.6,27.7,1.8,20.7 45 | 44,206.9,8.4,26.4,12.9 46 | 45,25.1,25.7,43.3,8.5 47 | 46,175.1,22.5,31.5,14.9 48 | 47,89.7,9.9,35.7,10.6 49 | 48,239.9,41.5,18.5,23.2 50 | 49,227.2,15.8,49.9,14.8 51 | 50,66.9,11.7,36.8,9.7 52 | 51,199.8,3.1,34.6,11.4 53 | 
52,100.4,9.6,3.6,10.7 54 | 53,216.4,41.7,39.6,22.6 55 | 54,182.6,46.2,58.7,21.2 56 | 55,262.7,28.8,15.9,20.2 57 | 56,198.9,49.4,60,23.7 58 | 57,7.3,28.1,41.4,5.5 59 | 58,136.2,19.2,16.6,13.2 60 | 59,210.8,49.6,37.7,23.8 61 | 60,210.7,29.5,9.3,18.4 62 | 61,53.5,2,21.4,8.1 63 | 62,261.3,42.7,54.7,24.2 64 | 63,239.3,15.5,27.3,15.7 65 | 64,102.7,29.6,8.4,14 66 | 65,131.1,42.8,28.9,18 67 | 66,69,9.3,0.9,9.3 68 | 67,31.5,24.6,2.2,9.5 69 | 68,139.3,14.5,10.2,13.4 70 | 69,237.4,27.5,11,18.9 71 | 70,216.8,43.9,27.2,22.3 72 | 71,199.1,30.6,38.7,18.3 73 | 72,109.8,14.3,31.7,12.4 74 | 73,26.8,33,19.3,8.8 75 | 74,129.4,5.7,31.3,11 76 | 75,213.4,24.6,13.1,17 77 | 76,16.9,43.7,89.4,8.7 78 | 77,27.5,1.6,20.7,6.9 79 | 78,120.5,28.5,14.2,14.2 80 | 79,5.4,29.9,9.4,5.3 81 | 80,116,7.7,23.1,11 82 | 81,76.4,26.7,22.3,11.8 83 | 82,239.8,4.1,36.9,12.3 84 | 83,75.3,20.3,32.5,11.3 85 | 84,68.4,44.5,35.6,13.6 86 | 85,213.5,43,33.8,21.7 87 | 86,193.2,18.4,65.7,15.2 88 | 87,76.3,27.5,16,12 89 | 88,110.7,40.6,63.2,16 90 | 89,88.3,25.5,73.4,12.9 91 | 90,109.8,47.8,51.4,16.7 92 | 91,134.3,4.9,9.3,11.2 93 | 92,28.6,1.5,33,7.3 94 | 93,217.7,33.5,59,19.4 95 | 94,250.9,36.5,72.3,22.2 96 | 95,107.4,14,10.9,11.5 97 | 96,163.3,31.6,52.9,16.9 98 | 97,197.6,3.5,5.9,11.7 99 | 98,184.9,21,22,15.5 100 | 99,289.7,42.3,51.2,25.4 101 | 100,135.2,41.7,45.9,17.2 102 | 101,222.4,4.3,49.8,11.7 103 | 102,296.4,36.3,100.9,23.8 104 | 103,280.2,10.1,21.4,14.8 105 | 104,187.9,17.2,17.9,14.7 106 | 105,238.2,34.3,5.3,20.7 107 | 106,137.9,46.4,59,19.2 108 | 107,25,11,29.7,7.2 109 | 108,90.4,0.3,23.2,8.7 110 | 109,13.1,0.4,25.6,5.3 111 | 110,255.4,26.9,5.5,19.8 112 | 111,225.8,8.2,56.5,13.4 113 | 112,241.7,38,23.2,21.8 114 | 113,175.7,15.4,2.4,14.1 115 | 114,209.6,20.6,10.7,15.9 116 | 115,78.2,46.8,34.5,14.6 117 | 116,75.1,35,52.7,12.6 118 | 117,139.2,14.3,25.6,12.2 119 | 118,76.4,0.8,14.8,9.4 120 | 119,125.7,36.9,79.2,15.9 121 | 120,19.4,16,22.3,6.6 122 | 121,141.3,26.8,46.2,15.5 123 | 122,18.8,21.7,50.4,7 124 | 123,224,2.4,15.6,11.6 125 | 124,123.1,34.6,12.4,15.2 126 | 125,229.5,32.3,74.2,19.7 127 | 126,87.2,11.8,25.9,10.6 128 | 127,7.8,38.9,50.6,6.6 129 | 128,80.2,0,9.2,8.8 130 | 129,220.3,49,3.2,24.7 131 | 130,59.6,12,43.1,9.7 132 | 131,0.7,39.6,8.7,1.6 133 | 132,265.2,2.9,43,12.7 134 | 133,8.4,27.2,2.1,5.7 135 | 134,219.8,33.5,45.1,19.6 136 | 135,36.9,38.6,65.6,10.8 137 | 136,48.3,47,8.5,11.6 138 | 137,25.6,39,9.3,9.5 139 | 138,273.7,28.9,59.7,20.8 140 | 139,43,25.9,20.5,9.6 141 | 140,184.9,43.9,1.7,20.7 142 | 141,73.4,17,12.9,10.9 143 | 142,193.7,35.4,75.6,19.2 144 | 143,220.5,33.2,37.9,20.1 145 | 144,104.6,5.7,34.4,10.4 146 | 145,96.2,14.8,38.9,11.4 147 | 146,140.3,1.9,9,10.3 148 | 147,240.1,7.3,8.7,13.2 149 | 148,243.2,49,44.3,25.4 150 | 149,38,40.3,11.9,10.9 151 | 150,44.7,25.8,20.6,10.1 152 | 151,280.7,13.9,37,16.1 153 | 152,121,8.4,48.7,11.6 154 | 153,197.6,23.3,14.2,16.6 155 | 154,171.3,39.7,37.7,19 156 | 155,187.8,21.1,9.5,15.6 157 | 156,4.1,11.6,5.7,3.2 158 | 157,93.9,43.5,50.5,15.3 159 | 158,149.8,1.3,24.3,10.1 160 | 159,11.7,36.9,45.2,7.3 161 | 160,131.7,18.4,34.6,12.9 162 | 161,172.5,18.1,30.7,14.4 163 | 162,85.7,35.8,49.3,13.3 164 | 163,188.4,18.1,25.6,14.9 165 | 164,163.5,36.8,7.4,18 166 | 165,117.2,14.7,5.4,11.9 167 | 166,234.5,3.4,84.8,11.9 168 | 167,17.9,37.6,21.6,8 169 | 168,206.8,5.2,19.4,12.2 170 | 169,215.4,23.6,57.6,17.1 171 | 170,284.3,10.6,6.4,15 172 | 171,50,11.6,18.4,8.4 173 | 172,164.5,20.9,47.4,14.5 174 | 173,19.6,20.1,17,7.6 175 | 174,168.4,7.1,12.8,11.7 176 | 175,222.4,3.4,13.1,11.5 177 | 176,276.9,48.9,41.8,27 
178 | 177,248.4,30.2,20.3,20.2 179 | 178,170.2,7.8,35.2,11.7 180 | 179,276.7,2.3,23.7,11.8 181 | 180,165.6,10,17.6,12.6 182 | 181,156.6,2.6,8.3,10.5 183 | 182,218.5,5.4,27.4,12.2 184 | 183,56.2,5.7,29.7,8.7 185 | 184,287.6,43,71.8,26.2 186 | 185,253.8,21.3,30,17.6 187 | 186,205,45.1,19.6,22.6 188 | 187,139.5,2.1,26.6,10.3 189 | 188,191.1,28.7,18.2,17.3 190 | 189,286,13.9,3.7,15.9 191 | 190,18.7,12.1,23.4,6.7 192 | 191,39.5,41.1,5.8,10.8 193 | 192,75.5,10.8,6,9.9 194 | 193,17.2,4.1,31.6,5.9 195 | 194,166.8,42,3.6,19.6 196 | 195,149.7,35.6,6,17.3 197 | 196,38.2,3.7,13.8,7.6 198 | 197,94.2,4.9,8.1,9.7 199 | 198,177,9.3,6.4,12.8 200 | 199,283.6,42,66.2,25.5 201 | 200,232.1,8.6,8.7,13.4 202 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Amount.csv: -------------------------------------------------------------------------------- 1 | Month,Amount 2 | 1,9 3 | 2,8 4 | 3,9 5 | 4,12 6 | 5,9 7 | 6,12 8 | 7,11 9 | 8,7 10 | 9,13 11 | 10,9 12 | 11,11 13 | 12,10 14 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Credit_Card_Expenses.csv: -------------------------------------------------------------------------------- 1 | Month ,CC_Expenses 2 | 1,55 3 | 2,65 4 | 3,59 5 | 4,59 6 | 5,57 7 | 6,61 8 | 7,53 9 | 8,63 10 | 9,59 11 | 10,57 12 | 11,63 13 | 12,55 14 | 13,61 15 | 14,61 16 | 15,57 17 | 16,59 18 | 17,61 19 | 18,57 20 | 19,59 21 | 20,63 22 | -------------------------------------------------------------------------------- /Practical Slides and Codes/DC_Simple_Reg.csv: -------------------------------------------------------------------------------- 1 | SL No,Dryer Temperature,Dry Content 2 | 1,55,73.3 3 | 2,56,74.6 4 | 3,55.5,74 5 | 4,59,78.5 6 | 5,56,74.6 7 | 6,55.5,74 8 | 7,56.5,75.2 9 | 8,58,77.2 10 | 9,57,75.9 11 | 10,56,74.6 12 | 11,55,73.3 13 | 12,57,75.9 14 | 13,57,76 15 | 14,56,74.6 16 | 15,56,74.7 17 | 16,56,74.5 18 | 17,53,70.7 19 | 18,54,72 20 | 19,54,72.1 21 | 20,54,72.2 22 | 21,53,70.7 23 | 22,56,74.6 24 | 23,56.5,75.2 25 | 24,55.5,74 26 | 25,57,75.9 27 | 26,56.5,75.3 28 | 27,55.5,74 29 | 28,59,78.5 30 | 29,57.5,76.5 31 | 30,56,74.5 32 | 31,57,76 33 | 32,57.5,76.5 34 | 33,57.5,76.7 35 | 34,57,76 36 | 35,57,75.8 37 | 36,55.5,73.8 38 | 37,55,73.3 39 | 38,56,74.6 40 | 39,55,73.4 41 | 40,57,76 42 | 41,55.5,74 43 | 42,56.5,75.2 44 | -------------------------------------------------------------------------------- /Practical Slides and Codes/E15demand.csv: -------------------------------------------------------------------------------- 1 | Month,Demand 2 | 1,139 3 | 2,137 4 | 3,174 5 | 4,142 6 | 5,141 7 | 6,162 8 | 7,180 9 | 8,164 10 | 9,171 11 | 10,206 12 | 11,193 13 | 12,207 14 | 13,218 15 | 14,229 16 | 15,225 17 | 16,204 18 | 17,227 19 | 18,223 20 | 19,242 21 | 20,239 22 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Logistic_Reg.csv: -------------------------------------------------------------------------------- 1 | Ind_Exp_Act_Score,Tran_Speed_Score,Peer_Comb_Score,Outcome 2 | 6.2,9.3,7.4,1 3 | 2.6,2.2,8.7,1 4 | 9.5,1.5,8.2,1 5 | 2.6,5,0.4,0 6 | 10,7.7,7.2,1 7 | 0.2,2.3,3.4,0 8 | 2.5,2.8,4.6,0 9 | 0.4,1.6,5.3,0 10 | 0.2,9.8,0.8,0 11 | 8.9,6.1,9.2,1 12 | 9.3,5.2,2.1,1 13 | 9.4,2.1,4.1,1 14 | 0.8,5.1,7.5,1 15 | 8.2,3.1,0.9,1 16 | 6.5,0.7,5.5,1 17 | 1.5,5.7,5.9,1 18 | 7.3,0.9,4.8,1 19 | 3.2,3.3,3,0 20 | 6.9,9.4,9.3,1 21 | 6.8,9.8,8.3,1 22 | 3.6,4.4,7.1,1 23 | 0.6,5.9,0.2,0 24 | 2.4,3.2,8.2,1 25 | 
7.2,1.6,5.8,1 26 | 3.7,6.6,0.7,0 27 | 9.4,9.7,6,1 28 | 6.5,8.2,8.3,1 29 | 6.1,2,4.5,1 30 | 7.1,1.9,4.8,1 31 | 1.2,6.7,7.6,1 32 | 3.8,3.8,9,1 33 | 8.3,2.9,6.4,1 34 | 0.9,7.7,2.1,0 35 | 3.6,6.9,9.7,1 36 | 6.9,2.5,2.9,0 37 | 3.6,9.7,8.8,1 38 | 7.9,9.3,2.5,1 39 | 9.2,7.6,2.4,1 40 | 7.8,1.7,9.9,1 41 | 9.4,6.9,6.3,1 42 | 4.8,0.8,5.5,0 43 | 7.3,0.8,0.2,0 44 | 3.9,6.2,9.4,1 45 | 0.4,6.7,2.4,0 46 | 8.2,6.1,2.2,1 47 | 9.2,7.8,9.2,1 48 | 1.2,1.7,0.1,0 49 | 2.5,0.3,4.3,0 50 | 2,1.9,9.5,1 51 | 1.5,8,1.3,0 52 | 1.6,0.4,4,0 53 | 7.8,7.1,9.3,1 54 | 9,0.9,9.9,1 55 | 9.8,8.7,6.6,1 56 | 5.2,2.3,1,0 57 | 3.6,4.3,4.9,1 58 | 6.5,3,3.5,1 59 | 8.5,2.5,5.1,1 60 | 9.3,4.7,1.1,1 61 | 2.6,4.8,3.6,0 62 | 9.9,8.3,7.6,1 63 | 7.7,9.5,4.1,1 64 | 9.2,0.3,1.8,0 65 | 7.4,8.4,2.8,1 66 | 3.7,6.8,7.5,1 67 | 2.7,9.7,6.2,1 68 | 9.6,9,5.7,1 69 | 0,2.5,7.9,0 70 | 0.7,5.8,7.9,1 71 | 8.3,0.4,1.5,0 72 | 0.6,0.9,2.7,0 73 | 8.5,9.7,3.4,1 74 | 8.2,7.9,8.9,1 75 | 7,6.9,10,1 76 | 7.3,3.3,2.9,1 77 | 7.5,0.2,2.1,0 78 | 1.6,3.2,5.6,0 79 | 8.9,1.2,8.9,1 80 | 4.6,1.5,3.5,0 81 | 9.6,3.7,4,1 82 | 5.4,1.7,2.9,0 83 | 1.3,1.4,1.1,0 84 | 5,0.2,7.4,1 85 | 4.9,0.5,6.4,1 86 | 2.4,0.7,3.3,0 87 | 6.8,2.1,6.4,1 88 | 8.6,9.4,4.4,1 89 | 3.4,7.8,2,1 90 | 9.9,2.8,6.9,1 91 | 5.6,8.6,2.2,1 92 | 2.7,4.5,4.6,0 93 | 2.4,8.2,0.7,0 94 | 3.3,8.1,1,0 95 | 7.4,2.9,3.8,1 96 | 2.6,0,5.2,0 97 | 4.5,1.8,2.6,0 98 | 7.1,6.6,0.9,1 99 | 3.1,8.7,9.1,1 100 | 9.8,5.9,7.6,1 101 | 4.1,5.5,1.4,0 102 | 8,6.4,1.3,1 103 | 6.7,0.6,6.1,1 104 | 0.1,5,5,0 105 | 3.5,9.5,4.2,1 106 | 8.6,2.2,8.9,1 107 | 2.6,4.9,8.1,1 108 | 9.1,8.1,3.8,1 109 | 6.9,4.8,0.9,0 110 | 7.9,9.2,0.6,1 111 | 9.9,0.4,8.8,1 112 | 1.2,9,3,1 113 | 0.7,0.6,8.2,0 114 | 2.9,6.7,2.7,0 115 | 0.3,9.1,2.7,0 116 | 6.5,5.8,8,1 117 | 7.7,2.3,8.6,1 118 | 8.5,0,8,1 119 | 3.9,1,9.7,1 120 | 2.3,3.1,3.6,0 121 | 5.2,4.4,2.2,0 122 | 7.2,3.9,3.8,1 123 | 0.7,3.4,9.8,1 124 | 4,3.5,2.3,0 125 | 8.6,9.7,9,1 126 | 9.1,0.9,2.8,1 127 | 7.9,1.6,0.5,0 128 | 6.1,7.1,4.2,1 129 | 2.4,5.5,6.2,1 130 | 1.5,8.4,4.9,1 131 | 8.4,0.3,4.4,1 132 | 1.4,1.2,6.6,0 133 | 9.4,4.5,6.6,1 134 | 7,1.7,0.2,0 135 | 1.9,9.6,1.4,0 136 | 7.2,1.1,2.1,0 137 | 9.6,9.3,0.2,1 138 | 5.7,5.6,8.7,1 139 | 6,4.6,1.6,1 140 | 8.8,1.5,0.9,0 141 | 0.3,1.9,4.3,0 142 | 7.6,2.6,7.6,1 143 | 3.3,0.2,2.2,0 144 | 1,2.5,2.7,0 145 | 4.1,0.4,1.6,0 146 | 10,0.1,9.8,1 147 | 3.7,8.9,8.7,1 148 | 3.6,9.2,5.1,1 149 | 1.1,9.4,2,0 150 | 8.2,3.4,5.1,1 151 | 7.3,0.8,3.1,0 152 | 2.1,6.5,3.3,1 153 | 1.3,1.5,5.6,0 154 | 4.3,7.3,3.5,1 155 | 4.1,3.2,3.3,0 156 | 1.1,6.9,2.1,0 157 | 2,6.8,2.4,0 158 | 6.2,4.3,2.5,1 159 | 1.3,8.7,3.9,1 160 | 3.2,0.5,1.3,0 161 | 5.8,5.3,9.6,1 162 | 1.2,1.8,4.7,0 163 | 1.8,7.4,0.4,0 164 | 8.5,4.6,6.3,1 165 | 2.8,4.2,0.1,0 166 | 6.8,5.1,10,1 167 | 4.7,8.7,4.2,1 168 | 6.3,3.9,1.3,0 169 | 2.4,5.7,3,0 170 | 4.8,8.4,7.8,1 171 | 8.8,1.1,4.4,1 172 | 5.3,4.3,10,1 173 | 3.1,1,7.1,1 174 | 4.1,5.4,6.8,1 175 | 8.8,9.3,7.7,1 176 | 7.5,7.9,1.9,1 177 | 6.9,9.5,4.4,1 178 | 9.1,2.9,8.9,1 179 | 9.2,8,1.5,1 180 | 2.8,2,4.2,0 181 | 3.9,8.5,7.5,1 182 | 4.7,6.2,4.2,1 183 | 0.1,7.2,8.5,1 184 | 1.2,9.3,9.8,1 185 | 4.1,0.6,8.4,1 186 | 3.4,0.1,0,0 187 | 6.9,2.3,6.1,1 188 | 0.2,9.7,8.5,1 189 | 4.8,5.9,5.2,1 190 | 5.3,7.1,1.6,1 191 | 4.1,7.8,9.4,1 192 | 2.3,3.3,0.4,0 193 | 8.6,1.7,2.4,0 194 | 7.7,4.6,3.8,1 195 | 4.6,6.6,3.2,1 196 | 7.6,0,2.8,0 197 | 5.6,3.4,2.9,1 198 | 3.2,7.9,9.1,1 199 | 8.2,8.4,0.3,1 200 | 3.4,7.3,1,0 201 | 5.8,5.5,6.4,1 202 | 1.9,1.8,4.9,0 203 | 2.3,0.8,7.8,1 204 | 5.1,7.7,9.7,1 205 | 6.5,0.8,0.5,0 206 | 3.8,8.8,3.9,1 207 | 2.2,6.4,1.1,0 208 | 7.2,1.3,8.9,1 209 | 0.3,6.6,0.5,0 210 | 8.9,9,8.7,1 
211 | 2.6,5.1,5,1 212 | 2.5,5.4,4,1 213 | 7.3,8.2,0.9,1 214 | 4.4,9.7,1.3,1 215 | 3.5,7,2,1 216 | 8.3,2.7,9.5,1 217 | 7.5,0.4,1.8,0 218 | 3.1,3.5,9.8,1 219 | 0.3,9.4,9.5,1 220 | 9.9,3.1,8.3,1 221 | 8.2,3,8.6,1 222 | 2.4,0.5,8.7,0 223 | 7.6,7.6,2.8,1 224 | 2.1,6.7,0.1,0 225 | 2,1.5,5.8,0 226 | 7.4,0.7,0.9,0 227 | 8,3.6,6.7,1 228 | 7.4,3.5,2.5,1 229 | 9.1,0.4,6.3,1 230 | 4.9,3.3,8.1,1 231 | 3.5,7.4,5.6,1 232 | 5.1,2.6,10,1 233 | 1.1,4.5,1.8,0 234 | 3.6,7.1,6.5,1 235 | 0.2,2.5,3.1,0 236 | 4,3,8.7,1 237 | 1.8,9.8,5.9,1 238 | 5.5,4.1,5.5,1 239 | 7.5,8,6.4,1 240 | 6.2,6.6,0.6,1 241 | 5.7,8.2,5.8,1 242 | 7.9,4.3,7.6,1 243 | 7.9,6.7,0.3,1 244 | 1.5,0.7,3,0 245 | 2.7,7.2,6.2,1 246 | 3.4,5,2.9,0 247 | 1.5,6,7.3,1 248 | 10,0.7,1.2,0 249 | 2.8,9.3,9.8,1 250 | 2.1,8.8,1.4,0 251 | 5.6,1,4.8,0 252 | 8.9,1.1,9.9,1 253 | 3.5,4.7,3.9,1 254 | 4,5.8,0.4,0 255 | 2.9,6.6,8.1,1 256 | 1.1,2,9.1,1 257 | 0,3.1,1.2,0 258 | 6.4,6,2.5,1 259 | 3.3,5.3,2,0 260 | 6.4,8.8,4.2,1 261 | 4.4,2.1,1.6,0 262 | 4.6,6.7,7.8,1 263 | 6.9,7.2,9,1 264 | 4.8,6,9.7,1 265 | 4.5,4.2,5.2,1 266 | 3.3,0.5,3.6,0 267 | 3.6,9.6,6.5,1 268 | 3.8,6.5,8.5,1 269 | 5.2,9.5,1.9,1 270 | 2.6,7.5,5.3,1 271 | 0.9,5.1,9.8,1 272 | 5.9,0.1,0.3,0 273 | 1.7,5.3,9.2,1 274 | 7.6,7.7,7.4,1 275 | 1.3,3.8,3.7,0 276 | 1.2,4.2,2.7,0 277 | 6.1,7,9.6,1 278 | 3.2,8.4,2.7,1 279 | 4.9,5.7,0.7,0 280 | 9.7,4.1,8.3,1 281 | 6.3,9.1,3.2,1 282 | 1.9,2.2,8.5,1 283 | 9,0.8,0.9,0 284 | 8.7,4.5,7,1 285 | 9.6,4.4,7.3,1 286 | 3.8,9.2,0.1,1 287 | 6.3,6.3,1.6,1 288 | 0.8,5.4,8.9,1 289 | 3.4,0.2,4.6,0 290 | 6.1,9.4,2.3,1 291 | 9.4,5,5.4,1 292 | 5.5,1.6,0.6,0 293 | 9.8,1.2,4.4,1 294 | 1.1,9.5,4.3,1 295 | 1.3,8.8,8.9,1 296 | 7.3,3.3,8,1 297 | 9.8,3.3,2.7,1 298 | 6.4,8.7,2,1 299 | 2.9,9.3,4.9,1 300 | 0.7,8.6,4.7,1 301 | 1.7,0.3,1.7,0 302 | 3.7,4.2,2.6,0 303 | 5.1,2.9,6.8,1 304 | 4.5,4.5,2,0 305 | 4.1,0.5,3.9,0 306 | 4.1,3,6.5,1 307 | 0.4,9.6,1.9,1 308 | 1.5,3.4,2.4,0 309 | 2.3,1.2,9.2,1 310 | 7.7,4.8,3.5,1 311 | 6.2,6.9,7.4,1 312 | 9,5.6,6,1 313 | 1,5,7.6,1 314 | 1.8,7.2,3.7,1 315 | 5.1,7.3,6.1,1 316 | 2.4,0.9,0.1,0 317 | 0.2,2,6.6,0 318 | 1.7,2.8,4.3,0 319 | 9.9,0.9,5.3,1 320 | 6.2,9.3,0,1 321 | 5.3,2.2,8.7,1 322 | 2.2,1.5,8.2,1 323 | 2.6,5,0.4,0 324 | 3.2,8.3,0.5,1 325 | 0.8,2.9,4,0 326 | 3.1,3.4,5.2,1 327 | 1,2.3,5.9,0 328 | 0.8,0.4,1.4,0 329 | 9.5,9.4,2.5,1 330 | 9.9,5.8,5.4,1 331 | 0,2.7,4.7,0 332 | 4.1,5.7,8.2,1 333 | 1.4,3.7,1.5,0 334 | 7.1,4,6.1,1 335 | 2.1,6.3,6.5,1 336 | 7.9,4.2,5.4,1 337 | 3.8,3.9,3.6,0 338 | 7.6,2.7,9.9,1 339 | 7.4,3,8.9,1 340 | 4.9,3.2,5.8,1 341 | 9.4,4.6,8.9,1 342 | 1.1,1.9,6.9,0 343 | 5.9,0.3,4.5,0 344 | 2.5,5.3,9.4,1 345 | 8.1,1.1,4.7,1 346 | 5.2,7,7.1,1 347 | 4.9,0.7,3.2,0 348 | 5.8,0.6,3.5,0 349 | 10,5.5,6.3,1 350 | 5.2,2.5,7.8,1 351 | 8.9,6.2,7,1 352 | 1.5,0.9,5.3,0 353 | 4.2,7.5,0.3,0 354 | 7.5,3.1,3.5,1 355 | 4.2,0.3,2,0 356 | 8.5,9.9,5.7,1 357 | 2.4,0.8,5.6,0 358 | 1,5,0.5,0 359 | 2.7,7.5,7.6,1 360 | 1.2,4.7,1.5,0 361 | 5.1,7.4,0.7,1 362 | 1.6,0.6,6.3,0 363 | 9.4,7.4,3.4,1 364 | 0.4,1.7,3.1,0 365 | 2.4,2.9,1.3,0 366 | 3.8,1.6,5.5,0 367 | 3.2,3.2,0.7,0 368 | 5.4,9.2,5.2,1 369 | 2.8,1.7,5.2,0 370 | 9.1,8.3,0.6,1 371 | 0.2,4.8,1.1,0 372 | 1,2.5,0.5,0 373 | 6.4,3.5,2.2,1 374 | 4.9,5.6,6.1,1 375 | 7.8,6.9,4.7,1 376 | 9.7,6.4,6.3,1 377 | 3.2,5.9,2.4,0 378 | 6.5,6,4.8,1 379 | 1.1,2.2,1.5,0 380 | 1.6,3.4,5.3,0 381 | 0.4,4.1,3.1,0 382 | 8.6,9.6,6.7,1 383 | 4.9,8,8.7,1 384 | 4,0.9,7.4,1 385 | 0.9,2.9,6.9,1 386 | 3.9,3.7,9.1,1 387 | 1.9,7,9.2,1 388 | 9.5,4.3,2.7,1 389 | 4.5,2.1,3.9,0 390 | 9.7,3.6,4.7,1 391 | 2.1,1.8,2.8,0 392 | 8.2,8.1,1.2,1 393 | 8.6,7.1,4.1,1 
394 | 2.8,4.4,6.9,1 395 | 0.1,5.1,0.2,0 396 | 5.8,2.7,4.8,1 397 | 3.5,5,5.3,1 398 | 6.6,2.9,4.1,1 399 | 2.5,2.6,2.3,0 400 | 6.3,1.4,1.3,0 401 | 6.2,1.7,7.6,1 402 | 3.7,1.9,4.5,0 403 | 9.8,3.3,5.6,1 404 | 4.6,9,5.9,1 405 | 3.8,4,8.1,1 406 | 6.8,9.8,6.1,1 407 | 6.6,5.7,5.8,1 408 | 3.6,8.8,4,1 409 | 3.9,8.7,4.3,1 410 | 8.1,6.2,4.4,1 411 | 5.1,2.4,3.2,0 412 | 7.7,7.2,1.5,1 413 | 3.1,3.7,6.6,1 414 | 3.7,9.3,2.3,1 415 | 0.4,6.5,8.2,1 416 | 4.7,6.1,2,1 417 | 8.7,7,1.9,1 418 | 7.3,1.2,6.7,1 419 | 8.9,6.4,3.8,1 420 | 4.9,8.3,5.6,1 421 | 7.4,0.9,0.3,0 422 | 1.4,3.6,6.9,1 423 | 7.8,6.8,2.5,1 424 | 5.6,3.6,9.7,1 425 | 6.6,7.9,9.3,1 426 | 1.3,1.8,0.2,0 427 | 10,0.4,4.4,1 428 | 9.4,2,6.9,1 429 | 1.6,5.4,1.4,0 430 | 1.7,0.5,4.1,0 431 | 6.4,1,10,1 432 | 7.2,8.8,6.7,1 433 | 2.6,9.8,1.1,0 434 | 3.7,1.8,2.3,0 435 | 6.6,3.1,0.9,0 436 | 9.4,4.8,8.6,1 437 | 2.7,2.2,1,0 438 | 7.3,8.4,7.7,1 439 | 7.8,9.6,4.2,1 440 | 6.7,0.4,1.9,0 441 | 4.8,5.8,2.9,1 442 | 1.1,4.2,5,0 443 | 0.2,7.1,6.3,1 444 | 7.1,9.1,5.8,1 445 | 0.1,2.6,5.3,0 446 | 8.2,5.9,5.4,1 447 | 5.7,0.5,1.6,0 448 | 0.7,1,2.8,0 449 | 6,9.8,3.5,1 450 | 8.3,8,9,1 451 | 4.5,4.3,7.4,1 452 | 4.8,3.4,0.3,0 453 | 7.6,0.3,2.2,0 454 | 9,3.3,3.1,1 455 | 6.3,1.3,6.4,1 456 | 4.7,8.9,3.6,1 457 | 9.7,3.8,1.5,1 458 | 2.9,9.1,3,1 459 | 1.4,1.5,1.2,0 460 | 5.1,7.6,7.5,1 461 | 5,7.9,6.5,1 462 | 9.9,0.7,3.4,1 463 | 6.9,2.2,3.8,1 464 | 6.1,9.5,4.5,1 465 | 0.8,5.2,2.1,0 466 | 1.9,4.8,6.2,1 467 | 7.5,7.9,4.2,1 468 | 4.7,3.8,3.9,0 469 | 4.3,7.5,2.7,1 470 | 2.6,7.4,3,1 471 | 6.8,4.9,3.1,1 472 | 1.9,2,7.2,0 473 | 6.5,1.1,4.5,1 474 | 6.4,5.9,0.2,0 475 | 1.8,5.1,5.3,1 476 | 2.4,8,1.1,0 477 | 1.8,7.8,6.9,1 478 | 3.4,4.8,0.7,0 479 | 7.4,5.8,0.6,1 480 | 6,9.9,8.1,1 481 | 7.6,5.1,2.5,1 482 | 3.6,7,4.3,1 483 | 6.1,9.6,9,1 484 | 2.7,5,5.6,1 485 | 6.5,5.5,1.2,1 486 | 8,6.6,8,1 487 | 10,0.5,8.9,1 488 | 8.7,9.1,3.1,1 489 | 8.1,0.7,8.3,1 490 | 0.3,6.8,0.1,0 491 | 0.4,9.2,2.8,0 492 | 4,3.2,5.5,1 493 | 5.2,9.7,8.7,1 494 | 5.9,7.5,5.4,1 495 | 4,8.5,9.8,1 496 | 2.4,3.2,3.7,0 497 | 5.3,1.8,2.3,0 498 | 4.6,1.3,3.9,0 499 | 8.1,3.5,7.3,1 500 | 1.4,3.6,2.4,0 501 | 6,7.1,6.4,1 502 | 6.5,8.3,2.9,1 503 | 5.4,9.1,0.6,1 504 | 3.6,4.5,1.6,0 505 | 2.5,5.6,3.7,0 506 | 8.9,8.5,5,1 507 | 5.8,7.8,4.5,1 508 | 8.9,1.3,6.7,1 509 | 6.9,4.6,4.1,1 510 | 7.1,9.2,0.3,1 511 | 9.4,9.7,1.5,1 512 | 7.3,8.5,2.2,1 513 | 7,6.7,7.7,1 514 | 5.8,3,6.1,1 515 | 6.1,2.1,9,1 516 | 6.3,9,0.9,1 517 | 7.7,2,4.4,1 518 | 5,0,7.7,1 519 | 3.4,7.6,2.3,1 520 | 8.4,2.5,2.8,1 521 | 4.2,7.8,1.7,1 522 | 0.1,0.2,9.9,1 523 | 3.8,6.3,6.2,1 524 | 3.7,6.6,5.2,1 525 | 8.6,8.8,1.5,1 526 | 5,0.3,4.6,0 527 | 6.8,7.6,2.6,1 528 | 1.6,5.9,0.2,0 529 | 8.1,1,5.1,1 530 | 3.7,4.1,0.4,0 531 | 0.9,2.6,2.7,0 532 | 0.5,6.4,8.9,1 533 | 1.5,6.3,9.2,1 534 | 5.6,1.1,2,0 535 | 8.2,8.2,3.4,1 536 | 2.7,7.3,0.8,0 537 | 5.3,2.1,6.5,1 538 | 8,1.3,4.1,1 539 | 1.2,6.9,7.3,1 540 | 8,4.1,3.1,1 541 | 2.3,3.7,6.9,1 542 | 3.6,2,6.8,1 543 | 2.2,6.1,4.3,1 544 | 3.8,1.3,1.4,0 545 | 9.8,5.8,0.5,1 546 | 2.3,5.8,5.2,1 547 | 8.9,1.2,4.4,1 548 | 5.4,1.8,7.4,1 549 | 3.2,1.1,7.2,1 550 | 6.2,6.7,5.1,1 551 | 7.5,5.3,9.3,1 552 | 7,6.9,4.5,1 553 | 6.6,3,6.3,1 554 | 6.6,5.5,9,1 555 | 2.8,2.1,4.3,0 556 | 4,5.9,4.9,1 557 | 4.8,3.7,1.6,0 558 | 0.2,7.3,6,1 559 | 8.7,9.4,9.9,1 560 | 1.5,8.1,8.5,1 561 | 3.5,7.5,0.1,0 562 | 4.3,9.7,6.2,1 563 | 7.6,9.8,8.6,1 564 | 4.9,3.4,2.6,0 565 | 2.7,4.5,9.1,1 566 | 4.2,5.3,6.8,1 567 | 2.4,3.4,7.8,1 568 | 8.7,1.8,2.5,1 569 | 7.8,4.7,1.2,1 570 | 4.7,4,0.7,0 571 | 5.1,7.5,2.9,1 572 | 5.7,0.8,3,0 573 | 3.3,5.4,6.5,1 574 | 5.6,5.9,7.7,1 575 | 3.2,2.9,3.9,0 576 | 2,1.9,5,0 
577 | 2.4,8.3,7.9,1 578 | 2.5,5.2,7.2,1 579 | 6.6,8.2,0.6,1 580 | 3.9,6.2,4,1 581 | 9.6,6.5,8.5,1 582 | 4.6,8.8,9,1 583 | 0.4,6.7,7.9,1 584 | 8.2,8.3,8,1 585 | 1.9,7.1,4.3,1 586 | 1.8,7.4,3.3,1 587 | 9.3,7.5,0.2,1 588 | 3.7,9,3.3,1 589 | 5.5,6.3,1.3,1 590 | 0.3,4.7,8.9,1 591 | 6.8,9.7,3.8,1 592 | 2.4,5.5,9.1,1 593 | 9.6,1.3,1.4,0 594 | 9.2,5.1,7.6,1 595 | 0.2,5,7.9,1 596 | 4.3,9.8,0.7,1 597 | 9.5,6.9,2.1,1 598 | 1.4,8.7,9.5,1 599 | 4,3.4,7.8,1 600 | 6.7,10,2.8,1 601 | 10,5.6,6,1 602 | 6.7,2.8,4.5,1 603 | 1,2.4,8.2,1 604 | 4.9,3.3,8.1,1 605 | 3.5,7.5,3,1 606 | 5.1,0,0.1,0 607 | 8.5,4.5,9.2,1 608 | 3.6,7.2,3.9,1 609 | 7.6,9.9,3.2,1 610 | 4.1,3.1,8.8,1 611 | 1.9,9.8,5.9,1 612 | 2.9,4.2,5.6,1 613 | 4.9,5.4,3.8,1 614 | 6.3,4.1,8,1 615 | 5.7,5.7,3.2,1 616 | 7.9,1.7,7.7,1 617 | 5.3,4.2,7.7,1 618 | 1.6,0.8,3,0 619 | 2.7,7.3,3.6,1 620 | 3.5,5,3,1 621 | 8.9,6,4.7,1 622 | 7.4,8.1,8.6,1 623 | 0.2,9.4,7.2,1 624 | 2.2,8.9,8.8,1 625 | 5.7,8.4,4.9,1 626 | 9,8.5,7.3,1 627 | 3.6,4.7,4,1 628 | 4.1,5.9,7.8,1 629 | 2.9,6.6,5.5,1 630 | 1.1,2.1,9.2,1 631 | 7.4,0.5,1.2,0 632 | 6.5,3.4,9.9,1 633 | 3.4,5.4,9.4,1 634 | 6.4,6.2,1.6,1 635 | 4.4,9.5,1.7,1 636 | 2,6.7,5.2,1 637 | 7,4.6,6.4,1 638 | 2.2,6.1,7.1,1 639 | 4.6,4.3,5.3,1 640 | 0.7,0.6,3.7,0 641 | 1.1,9.6,6.6,1 642 | 3.9,6.5,5.9,1 643 | 5.3,6.9,9.4,1 644 | 2,7,2,0 645 | 7.7,4.6,6.6,1 646 | 5.4,6.8,7.1,1 647 | 8.5,4.8,6,1 648 | 4.4,4.5,4.2,1 649 | 8.1,3.3,3.1,1 650 | 8,3.6,9.5,1 651 | 5.5,3.7,6.4,1 652 | 2.6,7.8,9.5,1 653 | 1.7,5.2,7.5,1 654 | 6.5,0.9,7.7,1 655 | 5.7,5.9,0,0 656 | 8.7,1.7,7.9,1 657 | 8.5,7.6,7.7,1 658 | 5.5,1.3,6.4,1 659 | 6.4,1.2,6.8,1 660 | 0.5,8.7,6.9,1 661 | 5.7,3.1,8.3,1 662 | 7.6,4.9,5.7,1 663 | 0.2,9.6,4,1 664 | 5.6,6.2,9,1 665 | 6.2,1.8,4.8,1 666 | 2.9,9,0.7,0 667 | 7.2,8.6,4.5,1 668 | 1.1,9.5,4.3,1 669 | 1.4,8.9,6.3,1 670 | 7.4,0.8,8,1 671 | 9.9,3.4,2.8,1 672 | 3.9,6.1,9.4,1 673 | 0.3,9.3,5,1 674 | 8.1,6.1,2.1,1 675 | 9.1,0.4,1.8,0 676 | 3.8,4.3,2.7,0 677 | 2.5,2.9,6.9,1 678 | 1.9,4.5,9.4,1 679 | 4.1,7.9,3.9,1 680 | 4.2,3,6.6,1 681 | 7.8,7,9.3,1 682 | 8.9,3.5,2.5,1 683 | 9.7,1.3,9.2,1 684 | 5.1,2.3,3.6,0 685 | 6.2,4.3,4.8,1 686 | 9.1,5.6,3.4,1 687 | 8.4,5.1,5,1 688 | 1.9,7.3,1.1,0 689 | 5.2,4.7,3.5,1 690 | 9.8,0.9,0.2,0 691 | 0.3,2.1,6.7,0 692 | 9.2,2.9,4.4,1 693 | 7.3,8.3,5.4,1 694 | 3.6,6.7,7.4,1 695 | 2.7,9.6,8.8,1 696 | 9.6,1.6,8.3,1 697 | 2.6,5.1,7.8,1 698 | 0.6,8.4,7.9,1 699 | 8.2,3,4.1,1 700 | 3.2,3.5,5.3,1 701 | 8.4,2.3,6,1 702 | 0.8,0.5,1.5,0 703 | 7,6.8,9.9,1 704 | 9.1,7.6,4.6,1 705 | 1.9,4.5,6.5,1 706 | 3.3,7.6,7.3,1 707 | 0.6,5.6,0.7,0 708 | 9,3.2,7.9,1 709 | 4,8.1,5.7,1 710 | 7.1,3.4,7.3,1 711 | 5.7,5.8,5.4,1 712 | 9.4,1.9,1.8,1 713 | 9.3,2.2,0.8,0 714 | 4.1,5,7.7,1 715 | 1.2,6.4,8.1,1 716 | 0.3,3.8,8.8,1 717 | 5.1,9.5,6.3,1 718 | 4.3,7.2,8.6,1 719 | 9.9,0.3,6.6,1 720 | 7.1,6.2,6.3,1 721 | 6.7,9.9,5.1,1 722 | 5,9.8,5.4,1 723 | 9.2,7.3,5.5,1 724 | 4.4,4.4,9.6,1 725 | 8.9,3.5,6.9,1 726 | 8.8,8.3,2.6,1 727 | 4.2,7.5,7.7,1 728 | 4.8,0.4,3.5,0 729 | 4.2,0.3,9.3,1 730 | 5.8,7.2,3.1,1 731 | 9.8,8.2,3,1 732 | 8.4,2.3,0.5,0 733 | 10,7.5,4.9,1 734 | 6,9.4,6.7,1 735 | 8.5,2,1.4,0 736 | 5.1,7.4,8,1 737 | 8.9,7.9,3.6,1 738 | 6.7,4.7,0.8,0 739 | 0.4,9,0.4,0 740 | 2.4,2.9,1.3,0 741 | 1.1,1.5,5.5,0 742 | 0.5,3.1,0.7,0 743 | 2.8,9.2,2.5,1 744 | 2.8,1.6,5.2,0 745 | 6.4,5.6,7.9,1 746 | 7.6,2.1,1.1,0 747 | 8.3,9.9,7.8,1 748 | 6.4,0.9,2.2,0 749 | 4.8,5.6,6.1,1 750 | 7.7,4.2,4.7,1 751 | 7,3.7,6.3,1 752 | 0.5,5.9,9.7,1 753 | 3.8,6,4.8,1 754 | 8.4,9.5,8.8,1 755 | 8.9,0.7,5.3,1 756 | 7.8,1.5,3,1 757 | 6,6.9,4,1 758 | 4.9,8,6.1,1 759 | 1.3,0.9,7.4,0 
760 | 8.2,0.2,6.9,1 761 | 1.3,3.7,9.1,1 762 | 9.3,7,6.5,1 763 | 9.5,1.6,2.7,1 764 | 1.2,1.4,3.3,0 765 | 9.1,0.3,4,1 766 | 8.8,8.5,9.5,1 767 | 7.6,4.8,7.9,1 768 | 7.9,3.8,0.8,1 769 | 8.1,0.7,2.7,1 770 | 9.5,3.8,6.2,1 771 | 6.8,1.8,9.5,1 772 | 5.2,9.4,4.1,1 773 | 0.2,4.3,4.6,0 774 | 6,9.6,3.5,1 775 | 1.9,2,1.7,0 776 | 5.6,8.1,8,1 777 | 5.5,8.4,7,1 778 | 0.4,1.2,3.9,0 779 | 7.4,2.7,7,1 780 | 4,8.3,2.6,1 781 | 0.5,3.4,7.5,0 782 | 6.1,6.5,2.8,1 783 | 3.3,5,5.1,1 784 | 2.9,8.8,1.3,1 785 | 8,3.5,4.4,1 786 | 0.6,0.6,5.8,0 787 | 5.1,9.7,3.2,1 788 | 7.7,4.5,8.9,1 789 | 0.4,3.7,6.5,0 790 | 3.7,9.3,9.7,1 791 | 0.4,6.5,5.5,1 792 | 4.7,6.1,9.3,1 793 | 6,4.4,9.2,1 794 | 4.6,8.5,6.7,1 795 | 6.2,3.7,3.8,1 796 | 2.2,8.2,2.9,1 797 | 4.7,8.2,7.6,1 798 | 1.3,3.6,6.9,1 799 | 7.8,4.2,9.8,1 800 | 5.6,3.5,9.6,1 801 | 6.6,5.2,6.6,1 802 | 8.6,9.1,7.5,1 803 | 10,7.7,1.7,1 804 | 9.4,9.4,6.9,1 805 | 9,5.4,8.8,1 806 | 9,7.9,1.4,1 807 | 5.3,4.5,6.7,1 808 | 6.4,8.3,7.3,1 809 | 2.6,9.7,8.4,1 810 | 1.1,1.8,2.3,0 811 | 3.9,0.5,0.9,0 812 | 6.7,2.1,8.6,1 813 | 0,2.2,1,0 814 | 7.3,5.8,5,1 815 | 5.1,6.9,1.5,1 816 | 6.6,7.7,9.2,1 817 | 4.8,5.8,0.2,0 818 | 1.1,4.2,4.9,0 819 | 0.2,7.1,3.6,0 820 | 7.1,6.4,3.1,1 821 | 9.3,1.8,7.2,1 822 | 0,5.1,7.2,1 823 | 9.9,0.2,2,0 824 | 7.8,9,2.7,1 825 | 7.5,7.2,8.2,1 826 | 6.3,6.2,9.3,1 827 | 6.6,2.5,2.1,0 828 | 6.8,9.4,1.4,1 829 | 0.9,2.5,4.9,0 830 | 8.2,0.5,8.2,1 831 | 3.9,0.8,2.8,0 832 | 8.9,3,3.3,1 833 | 4.7,1,2.2,0 834 | 0.6,0.7,0.4,0 835 | 4.3,9.5,6.7,1 836 | 4.2,9.8,5.7,1 837 | 1.7,9.9,2.6,1 838 | 6.1,1.4,5.7,1 839 | 7.9,8.7,3.7,1 840 | 2.7,7.1,1.3,0 841 | 9.2,2.1,6.2,1 842 | 4.9,7.9,1.5,1 843 | 2,3.7,3.9,0 844 | 1.6,7.5,10,1 845 | 2.6,7.4,0.3,0 846 | 6.7,2.2,3.1,1 847 | 1.9,9.3,4.5,1 848 | 3.8,1.1,1.9,0 849 | 6.4,5.8,0.2,1 850 | 9.1,2.4,5.2,1 851 | 2.4,8,8.4,1 852 | 9.1,5.2,6.9,1 853 | 3.4,4.8,0.6,0 854 | 7.3,5.7,0.5,1 855 | 5.9,9.9,5.4,1 856 | 7.5,2.4,2.5,1 857 | 0.9,6.9,1.6,0 858 | 6.1,9.6,6.3,1 859 | 0,2.3,5.6,0 860 | 6.5,5.5,1.2,1 861 | 4.3,2.2,8.3,1 862 | 5.3,6.6,8,1 863 | 7.3,7.8,6.2,1 864 | 8.7,6.5,0.4,1 865 | 8.1,8.1,5.6,1 866 | 0.3,4.1,0.1,0 867 | 7.7,6.6,0.1,1 868 | 4,3.2,5.5,1 869 | 5.1,9.7,6,1 870 | 5.9,7.4,5.4,1 871 | 1.3,8.4,7.1,1 872 | 9.8,0.5,1,0 873 | 2.7,1.8,9.6,1 874 | 4.6,1.3,1.2,0 875 | 8.1,0.8,7.3,1 876 | 1.4,0.9,9.7,1 877 | 6,7.1,6.4,1 878 | 6.5,8.3,0.2,1 879 | 5.3,9,8,1 880 | 3.5,4.5,1,0 881 | 9.2,2.3,3,1 882 | 8.2,5.2,1.7,1 883 | 5.2,7.1,1.2,1 884 | 8.2,8,3.4,1 885 | 6.2,1.3,3.4,0 886 | 3.8,8.5,7,1 887 | 8.7,6.4,8.2,1 888 | 4,7.9,8.9,1 889 | 6.4,6.1,7,1 890 | 2.5,2.4,5.5,1 891 | 2.8,1.4,8.4,1 892 | 5.6,8.3,7.6,1 893 | 7.1,8.7,1.1,1 894 | 4.4,9.4,4.4,1 895 | 0.1,7,9,1 896 | 7.8,9.2,9.5,1 897 | 0.9,7.2,8.4,1 898 | 6.8,6.9,6.6,1 899 | 0.5,5.7,5.5,1 900 | 0.4,6,1.9,0 901 | 7.9,6.2,8.8,1 902 | 5,0.2,1.9,0 903 | 4.1,7.6,9.9,1 904 | 8.9,3.3,0.1,0 905 | 8.1,8.3,2.4,1 906 | 1.1,4.1,0.3,0 907 | 0.9,10,0.1,0 908 | 7.9,3.7,8.8,1 909 | 8.8,3.6,9.2,1 910 | 2.9,1.1,9.3,1 911 | 8.1,5.5,0.7,1 912 | 10,7.3,8.1,1 913 | 2.6,2.1,6.4,1 914 | 8,8.6,1.5,1 915 | 8.6,4.2,7.2,1 916 | 5.3,1.4,3.1,0 917 | 9.6,1,6.9,1 918 | 3.5,1.9,6.8,1 919 | 2.2,6.1,1.6,0 920 | 3.8,1.3,8.7,1 921 | 9.8,3.2,0.5,1 922 | 2.3,5.8,5.2,1 923 | 6.3,8.5,1.8,1 924 | 2.7,1.7,7.4,1 925 | 0.5,8.5,4.6,1 926 | 6.2,6.7,5.1,1 927 | 4.9,5.3,9.3,1 928 | 4.3,6.9,1.8,1 929 | 6.5,0.3,6.3,1 930 | 6.6,5.4,9,1 931 | 0.2,9.4,1.7,1 932 | 1.3,5.9,4.9,1 933 | 2.1,3.7,1.6,0 934 | 7.5,4.7,6,1 935 | 8.6,6.7,7.2,1 936 | 1.5,8,5.8,1 937 | 0.8,7.5,7.4,1 938 | 4.3,9.7,3.5,1 939 | 7.6,7.1,5.9,1 940 | 4.1,5.2,4.5,1 941 | 4.6,6.4,0.9,0 942 | 
3.4,7.1,8.7,1 943 | 1.6,2.6,9.7,1 944 | 7.9,1,1.7,0 945 | 7,3.9,3.1,1 946 | 3.9,5.8,2.5,0 947 | 6.9,9.3,2.1,1 948 | 4.9,2.6,2.1,0 949 | 2.5,7.2,8.4,1 950 | 7.4,7.7,9.5,1 951 | 2.7,6.6,0.3,0 952 | 5.1,4.8,5.7,1 953 | 1.2,1.1,4.2,0 954 | 1.5,0.1,7.1,0 955 | 4.3,7,9,1 956 | 5.8,0.1,9.8,1 957 | 3.1,8.1,3.1,1 958 | 1.5,5.7,0.4,0 959 | 9.6,5.9,9.8,1 960 | 8.2,8.3,7.9,1 961 | 1.9,4.4,4.2,0 962 | 6.6,7.5,0.2,1 963 | 3.7,8.9,0.6,1 964 | 2.8,6.3,1.3,0 965 | 7.6,2,8.8,1 966 | 6.8,9.6,1.1,1 967 | 2.4,2.8,9,1 968 | 9.6,8.7,8.8,1 969 | 9.2,2.4,7.6,1 970 | 7.5,2.3,7.9,1 971 | 1.7,9.8,8,1 972 | 6.8,6.9,2.1,1 973 | 1.4,6,9.4,1 974 | 1.3,0.8,5.1,0 975 | 6.7,10,0.2,1 976 | 7.3,2.9,6,1 977 | 6.7,2.7,1.8,0 978 | 8.3,9.7,5.6,1 979 | 2.3,0.7,5.5,0 980 | 0.9,4.8,3,0 981 | 2.5,0,7.4,0 982 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Missing_Values_Telecom.csv: -------------------------------------------------------------------------------- 1 | SL No.,Current Month's Usage,Last 3 Month's Usage,Average Recharge, Projected Growth,Circle 2 | 1,5.1,3.5,99.4,99.2,A 3 | 2,4.9,3,98.6,99.2,A 4 | 3,,3.2,,99.2,A 5 | 4,4.6,3.1,98.5,9..2,A 6 | 5,5,,98.4,99.2,A 7 | 6,5.4,3.9,98.3,99.4,A 8 | 7,7,3.2,95.3,98.4.,B 9 | 8,6.4,3.2,95.5,98.5,B 10 | 9,6.9,3.1,95.1,98.5,B 11 | 10,,2.3,96,98.3,B 12 | 11,6.5,2.8,95.4,98.5,B 13 | 12,5.7,,95.5,98.3,B 14 | 13,6.3,3.3,,98.6,B 15 | 14,6.7,3.3,94.3,97.5,C 16 | 15,6.7,3,94.8,97.3,C 17 | 16,6.3,2.5,95,98.9,C 18 | 17,,3,94.8,98,C 19 | 18,6.2,3.4,94.6,97.3,C 20 | 19,5.9,3,94.9,98.8,C 21 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Missing_Values_Telecom_mod.csv: -------------------------------------------------------------------------------- 1 | "","cmusage","l3musage","avrecharge","","" 2 | "1","5.1","3.5","99.4","99.2","A" 3 | "2","4.9","3","98.6","99.2","A" 4 | "3","5.975","3.2","96.1411764705882","99.2","A" 5 | "4","4.6","3.1","98.5","9..2","A" 6 | "5","5","3.10588235294118","98.4","99.2","A" 7 | "6","5.4","3.9","98.3","99.4","A" 8 | "7","7","3.2","95.3","98.4.","B" 9 | "8","6.4","3.2","95.5","98.5","B" 10 | "9","6.9","3.1","95.1","98.5","B" 11 | "10","5.975","2.3","96","98.3","B" 12 | "11","6.5","2.8","95.4","98.5","B" 13 | "12","5.7","3.10588235294118","95.5","98.3","B" 14 | "13","6.3","3.3","96.1411764705882","98.6","B" 15 | "14","6.7","3.3","94.3","97.5","C" 16 | "15","6.7","3","94.8","97.3","C" 17 | "16","6.3","2.5","95","98.9","C" 18 | "17","5.975","3","94.8","98","C" 19 | "18","6.2","3.4","94.6","97.3","C" 20 | "19","5.9","3","94.9","98.8","C" 21 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Mult_Reg_Conversion.csv: -------------------------------------------------------------------------------- 1 | SL No.,Temperature,Time,Kappa number,Percent_Conversion 2 | 1,1300,0.012,7.5,49 3 | 2,1300,0.012,9,50.2 4 | 3,1300,0.0115,11,50.5 5 | 4,1300,0.013,13.5,48.5 6 | 5,1300,0.0135,17,47.5 7 | 6,1300,0.012,23,44.5 8 | 7,1200,0.04,5.3,28 9 | 8,1200,0.038,7.5,31.5 10 | 9,1200,0.032,11,34.5 11 | 10,1200,0.026,13.5,35 12 | 11,1200,0.034,17,38 13 | 12,1200,0.041,23,38.5 14 | 13,1100,0.084,5.3,15 15 | 14,1100,0.098,7.5,17 16 | 15,1100,0.092,11,20.5 17 | 16,1100,0.086,17,29.5 18 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Mult_Reg_Yield.csv: -------------------------------------------------------------------------------- 1 | SL 
No.,Time,Temperature,%Yield 2 | 1,130,190,35 3 | 2,174,176,81.7 4 | 3,134,205,42.5 5 | 4,191,210,98.3 6 | 5,165,230,52.7 7 | 6,194,192,82 8 | 7,143,220,34.5 9 | 8,186,235,95.4 10 | 9,139,240,56.7 11 | 10,188,230,84.4 12 | 11,175,200,94.3 13 | 12,156,218,44.3 14 | 13,190,220,83.3 15 | 14,178,210,91.4 16 | 15,132,208,43.5 17 | 16,148,225,51.7 18 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Nonlinear_Thrust.csv: -------------------------------------------------------------------------------- 1 | x1,x2,x3,y 2 | 3,3,10,4.095789556 3 | 8,8,10,2.878296124 4 | 6,6,10,3.881744606 5 | 4,12,10,2.621305565 6 | 6,5,10,4.49181515 7 | 5,5,10,4.402884344 8 | 3,3,25,5.249046578 9 | 4,4,25,5.210749248 10 | 12,4,25,12.162786 11 | 8,4,25,7.335729412 12 | 8,8,25,4.273256678 13 | 3,3,25,5.570909689 14 | 3,3,50,6.090830347 15 | 8,3,50,8.930656889 16 | 4,8,50,3.827508353 17 | 2,2,50,4.872198626 18 | 2,3,50,3.954824573 19 | 3,3,50,6.255752908 20 | 2,3,50,3.960425828 21 | 2,3,75,4.930255551 22 | 3,3,75,7.035385055 23 | 2,2,75,5.736351056 24 | 6,4,75,8.273348183 25 | 8,6,75,7.491533143 26 | 8,8,75,6.217925159 27 | -------------------------------------------------------------------------------- /Practical Slides and Codes/PO_Processing.csv: -------------------------------------------------------------------------------- 1 | Processing_Time 2 | 49 3 | 46 4 | 56 5 | 56 6 | 68 7 | 46 8 | 65 9 | 21 10 | 10 11 | 26 12 | 35 13 | 25 14 | 54 15 | 105 16 | 2 17 | 39 18 | 92 19 | 44 20 | 30 21 | 36 22 | 42 23 | 43 24 | 83 25 | 68 26 | 45 27 | 102 28 | 5 29 | 54 30 | 20 31 | 52 32 | 93 33 | 66 34 | 10 35 | 38 36 | 89 37 | 8 38 | 60 39 | 61 40 | 55 41 | 34 42 | 80 43 | 58 44 | 48 45 | 64 46 | 83 47 | 62 48 | 77 49 | 44 50 | 35 51 | 45 52 | 67 53 | 14 54 | 68 55 | 68 56 | 68 57 | 44 58 | 8 59 | 33 60 | 81 61 | 71 62 | 38 63 | 20 64 | 46 65 | 63 66 | 56 67 | 34 68 | 97 69 | 69 70 | 40 71 | 75 72 | 39 73 | 61 74 | 93 75 | 48 76 | 3 77 | 55 78 | 26 79 | 61 80 | 55 81 | 77 82 | 51 83 | 50 84 | 2 85 | 60 86 | 44 87 | 41 88 | 10 89 | 57 90 | 47 91 | 55 92 | 1 93 | 61 94 | 60 95 | 4 96 | 51 97 | 36 98 | 37 99 | 12 100 | 42 101 | 72 102 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_1_Intro2R.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Practical Slides and Codes/Practical_1_Intro2R.pdf -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_1_Intro2R_Code.R: -------------------------------------------------------------------------------- 1 | 2 | ####################### Installation of R and RStudio ######################### 3 | # (editor comment symbol in R is # ) 4 | 5 | # Download R software from http://cran.r-project.org/bin/windows/base/ 6 | # Run the R set up (exe) file and follow instructions 7 | # Double click on the R icon in the desktop and R window will open 8 | # Download RStudio from http://www.rstudio.com/ 9 | # Run R studio set up file and follow instructions 10 | # Click on R studio icon, R Studio IDE Studio will load 11 | # Go to R-Script (Ctrl + Shift + N) 12 | # Write 'Hello World' 13 | # Save & Run (Ctrl + Enter) 14 | 15 | ################################################################################ 16 | 17 | 'Hello World' 18 | 19 | ############ Simple Maths in R #################### 
20 | #
21 | 3+5
22 | 12+3/4-5+3*8
23 | (12+3/4-5)+3*8
24 | pi*2^3-sqrt(4)
25 | factorial(4)
26 | log(2,10)
27 | log(2,base=10)
28 | log10(2)
29 | log(2)
30 |
31 | x = 3+5
32 | x
33 | y = 12+3/4-5+3*8
34 | y
35 | z = (12+3/4-5)+3*8
36 | z
37 |
38 | ######## R is case sensitive and no space should be between '<' & '-' ##########
39 |
40 | A <- 6+8
41 | a
42 | A
43 |
44 | ########## Write numeric / text data ################
45 |
46 | data1 = c(3, 6, 9, 12, 78, 34, 5, 7, 7) ## numerical data
47 | data1
48 | data1.text = c('Mon', "Tue",'Wed') ## text data
49 | data1.text
50 | data2.text = c(data1.text , 'Thu' , "Fri") ## Single or double quotes work
51 | data2.text
52 |
53 | ############ Scan command for Reading Data Values ###############
54 | ### Double enter in Console to stop ###
55 | data3 = scan(what = 'character')
56 | mon
57 | tue
58 | wed
59 | thur
60 | data3
61 | data3[2]
62 | data3[2] = 'mon'
63 | data3
64 | data3[6]= 'sat'
65 | data3
66 | data3[2] = 'tue'
67 | data3[5] = 'fri'
68 | data3
69 |
70 | ############# Working directory ################
71 |
72 | getwd() ### Session - Get Working Directory - To Source File Location
73 | setwd("Path of working directory") ### Session - Set Working Directory - To Source File Location
74 | dir() ### working directory listing
75 | ls() ### Workspace listing of objects
76 | rm('object') ### Remove the object "object", if it exists
77 | rm(list = ls(all = TRUE)) ### Cleaning
78 |
79 |
80 | ##### Importing Data and accessing it #####
81 |
82 | data = read.csv('Logistic_Reg.csv',header = T,sep = ",")
83 | data = read.table('Logistic_Reg.csv',header = T,sep = ",")
84 | data
85 | str(data)
86 | data$Ind_Exp_Act_Score
87 | data$Ind_Exp_Act_Score[1] = 5.2 ### This change has happened in the workspace only, not in the file
88 | data$Ind_Exp_Act_Score ### Save the changes ###
89 | write.table(data, file = 'Logistic_Reg_mod.csv',row.names = FALSE,sep = ",")
90 | write.csv(data, file = 'Logistic_Reg_mod.csv',row.names = TRUE)
91 |
92 | ##### Vectors in R #####
93 |
94 | x = c(1,2,3,4,56)
95 | x
96 | ## [1] 1 2 3 4 56
97 |
98 | # Creating double and integer vectors
99 |
100 | dbl_var <- c(1, 2.5, 4.5) # double-precision values
101 | dbl_var
102 | ## [1] 1.0 2.5 4.5
103 |
104 | int_var <- c(1L, 6L, 10L) # Integer vectors
105 | int_var
106 | ## [1] 1 6 10
107 |
108 | # Checking for numeric type
109 |
110 | typeof(dbl_var)
111 | ## [1] "double"
112 |
113 | typeof(int_var)
114 | ## [1] "integer"
115 |
116 | # Converting between types
117 |
118 | # converts integers to double-precision values
119 | as.double(int_var)
120 | ## [1] 1 6 10
121 |
122 | # identical to as.double()
123 | as.numeric(int_var)
124 | ## [1] 1 6 10
125 |
126 | # converts doubles to integers
127 | as.integer(dbl_var)
128 | ## [1] 1 2 4
129 |
130 | mean(x)
131 | x[2]
132 | x = c(3, 4, NA, 5)
133 | mean(x)
134 | mean(x, na.rm = TRUE)
135 | x = c(3,4, NULL, 5)
136 | mean(x)
137 | length(x)
138 |
139 | ##### Long Vectors in AP (Arithmetic Progression) #####
140 |
141 | x = 1:20
142 | x
143 | y = seq(2,5,0.3)
144 | y
145 | length(y)
146 |
147 | ############### Operation on Vectors ##################
148 | x = 1:5
149 | x
150 | mean(x)
151 | x^2
152 | x+1
153 | 2*x
154 | exp(sqrt(x))
155 | y = c(0,3,4,0)
156 | x+y
157 | y = c(0,3,4,0,9)
158 | x+y
159 |
160 | x1 <- c(1,4,5)
161 | y1 <- c(2,3,6)
162 | print(x1+y1)
163 | print(x1 > y1)
164 |
165 | ################# R Logical Operators ##########################
166 |
167 | # ! : Logical NOT
168 | # & : Element-wise logical AND
169 | # && : Logical AND
170 | # | : Element-wise logical OR
171 | # || : Logical OR
172 |
173 | # Operators & and | perform element-wise operations, producing a result with the length of the longer operand.
174 | # But && and || examine only the first element of each operand, resulting in a logical vector of length one.
175 | # Zero is considered FALSE and non-zero numbers are taken as TRUE. An example run:
176 |
177 | x2 <- c(TRUE,FALSE,TRUE)
178 | y2 <- c(FALSE,TRUE,TRUE)
179 | print(!x2) # [1] FALSE TRUE FALSE
180 | print(x2&y2) # [1] FALSE FALSE TRUE
181 | print(x2&&y2) # [1] FALSE
182 | print(x2|y2) # [1] TRUE TRUE TRUE
183 | print(x2||y2) # [1] TRUE
184 |
185 | ############## Comparing numbers in R / Relational Operators in R ##############
186 |
187 | x <- 6
188 | y <- 20
189 | # < : Less than
190 | # > : Greater than
191 | # <= : Less than or equal to
192 | # >= : Greater than or equal to
193 | # == : Equal to
194 | # != : Not equal to
195 |
196 | print(x<y) # [1] TRUE
197 | print(x>y) # [1] FALSE
198 | print(x<=y) # [1] TRUE
199 | print(x>=y) # [1] FALSE
200 | print(x==y) # [1] FALSE
201 | print(x!=y) # [1] TRUE
202 |
203 | x <- c(1, 4, 9, 12)
204 | y <- c(4, 4, 9, 13)
205 | x == y
206 | ## [1] FALSE TRUE TRUE FALSE
207 |
208 | x <- c(1, 4, 9, 12)
209 | y <- c(4, 4, 9, 13)
210 | # How many pairwise equal values are in vectors x and y
211 | sum(x == y)
212 | ## [1] 2
213 |
214 | # Where are the pairwise equal values located in vectors x and y
215 | which(x == y)
216 | ## [1] 2 3
217 |
218 | #### Generating Random numbers: Random sampling
219 |
220 | # simple random sampling with replacement
221 | x <- c(1:10)
222 | sample(x, size = 5, replace = TRUE)
223 | ## [1] 1 1 8 4 3
224 |
225 | # simple random sampling without replacement
226 | x <- c(1:10)
227 | sample(x, size = 5, replace = FALSE)
228 | ## [1] 4 2 10 3 9
229 |
230 |
231 | ## Generating Random numbers: Discrete and continuous distributions
232 |
233 | # random sample of size 10 from a Bin(4, 0.8) distribution
234 | rbinom(n = 10, size = 4, prob = 0.8)
235 | ## [1] 2 4 4 3 4 4 3 1 3 3
236 |
237 | # random sample of size 10 from a Poisson(2) distribution
238 | rpois(n = 10, lambda = 2)
239 | ## [1] 3 3 5 3 0 3 2 1 2 0
240 |
241 | # random sample of size 5 from a Uniform(2,4) distribution
242 | runif(n = 5, min = 2, max = 4)
243 | ## [1] 2.147157 3.766859 3.293724 3.071439 3.903576
244 |
245 | # random sample of size 10 from a Normal(2,4) distribution
246 | rnorm(10, mean = 2, sd = 2) # note the parameter spec.
here 247 | ## [1] 2.0745291 0.1719257 2.3249074 3.2447822 1.4329557 0.1173869 0.5231311 1.4754708 1.7591520 -1.8763400 248 | 249 | ## Setting the seed 250 | 251 | rnorm(n = 5, mean = 1, sd = 2) 252 | ## [1] 1.7727385 -3.6486144 0.9951339 2.5762444 -0.8415272 253 | 254 | rnorm(n = 5, mean = 1, sd = 2) 255 | ## [1] 1.22932369 -1.98687014 0.59244070 2.38830593 -0.01179321 256 | 257 | set.seed(100) 258 | rnorm(n = 5, mean = 1, sd = 2) 259 | ## [1] -0.004384701 1.263062331 0.842165820 2.773569619 1.233942541 260 | 261 | set.seed(100) 262 | rnorm(n = 5, mean = 1, sd = 2) 263 | 264 | ## [1] -0.004384701 1.263062331 0.842165820 2.773569619 1.233942541 265 | 266 | #### Rounding #### 267 | 268 | x <- runif(5, 2, 4) 269 | x 270 | ## [1] 3.249993 3.764331 2.560708 2.796976 3.525102 271 | 272 | round(x) # round to nearest integer 273 | ## [1] 3 4 3 3 4 274 | round(x,3) # round to 3 places of decimal 275 | ## [1] 3.250 3.764 2.561 2.797 3.525 276 | 277 | x 278 | ## [1] 3.249993 3.764331 2.560708 2.796976 3.525102 279 | 280 | floor(x) # round down 281 | ## [1] 3 3 2 2 3 282 | 283 | ceiling(x) # round up 284 | ## [1] 4 4 3 3 4 285 | 286 | ############# Character strings in R ################## 287 | 288 | a <- "This course is on" # create string a 289 | b <- "Multivariate Data Analysis" # create string b 290 | # paste together string a & b 291 | paste(a, b) 292 | ## [1] "This course is on Multivariate Data Analysis" 293 | 294 | # paste character and number strings (converts numbers to character class) 295 | paste("I want to eat a", pi) 296 | ## [1] "I want to eat a 3.14159265358979" 297 | 298 | # paste multiple strings 299 | paste("Multivariate", " Data Analysis", "with", "R") 300 | ## [1] "Multivariate Data Analysis with R" 301 | 302 | # paste multiple strings with a separating character 303 | paste("Data analysis", "applications","with", "R", sep = "-") 304 | ## [1] "Data analysis-applications-with-R" 305 | 306 | # use paste0() to paste without spaces btwn characters 307 | paste0("Data analysis", "applications", "with", "R") 308 | ## [1] "Data analysisapplicationswithR" 309 | 310 | # paste objects with different lengths 311 | paste("R", 1:5, sep = " v1.") 312 | ## [1] "R v1.1" "R v1.2" "R v1.3" "R v1.4" "R v1.5" 313 | 314 | ############### Coercing in atomic vectors ################## 315 | 316 | c("A",10) 317 | ## [1] "A" "10" 318 | 319 | typeof(c("A",10)) 320 | ## [1] "character" 321 | 322 | # Combining logical and numeric creates numeric 323 | 324 | x <- c(10, 57, 3, 90) 325 | x > 20 326 | ## [1] FALSE TRUE FALSE TRUE 327 | 328 | sum(x > 20) # numeric function applied on logical 329 | ## [1] 2 330 | 331 | ############### Subsetting vectors in R ############### 332 | 333 | x_1 <- c(1:10) 334 | x_2 <- 4 335 | x_3 <- c(101:110) 336 | x_4 <- c(2,3) 337 | x_5 <- c(1,2,3) 338 | 339 | x_1 340 | ## [1] 1 2 3 4 5 6 7 8 9 10 341 | 342 | x_1[4] #get the fourth element of vector x 343 | ## [1] 4 344 | 345 | x_1[c(2,7)] # get elements in position 2 and 7 346 | ## [1] 2 7 347 | 348 | x_1[-3] # everything except element in position 3 349 | ## [1] 1 2 4 5 6 7 8 9 10 350 | 351 | x_1[c(-1,-3)] # everything except elements in pos. 
1 & 3 352 | ## [1] 2 4 5 6 7 8 9 10 353 | 354 | names(x_1) <- letters[1: length(x_1)] # assign names to x 355 | x_1 356 | ## a b c d e f g h i j 357 | ## 1 2 3 4 5 6 7 8 9 10 358 | 359 | x_1[c("b","j")] # get elements named "b" and "j" 360 | ## b j 361 | ## 2 10 362 | 363 | x_1[x_1 > 6] 364 | ## g h i j 365 | ## 7 8 9 10 366 | 367 | x_1[4] #preserves (here, keeps the name) 368 | ## d 369 | ## 4 370 | 371 | x_1[[4]] # simplifies (here, name is removed) 372 | ## [1] 4 373 | 374 | ############### Matrices in R ############### 375 | 376 | #a.matrix <- matrix(vector, nrow = r, ncol = c, byrow = FALSE, dimnames = list(char-vector-rownames, char-vector-col-names)) 377 | y <- matrix(1:20, nrow = 4, ncol = 5) 378 | y 379 | A = matrix(c(1,2,3,4),nrow=2,byrow=T) 380 | A 381 | A = matrix(c(1,2,3,4),ncol=2) 382 | A 383 | B = matrix(2:7,nrow=2) 384 | B 385 | C = matrix(5:2,ncol=2) 386 | C 387 | mr <- matrix(1:20, nrow = 5, ncol = 4, byrow = T) 388 | mr 389 | mc <- matrix(1:20, nrow = 5, ncol = 4) 390 | mc 391 | 392 | ########## Arithmetic Operations on Matrix ############# 393 | 394 | A <- matrix(c(1:4), 2, 2) 395 | B <- matrix(c(11:14), 2, 2) 396 | A + B # elementwise addition 397 | ## [,1] [,2] 398 | ## [1,] 12 16 399 | ## [2,] 14 18 400 | 401 | A * B # elementwise multiplication 402 | ## [,1] [,2] 403 | ## [1,] 11 39 404 | ## [2,] 24 56 405 | 406 | A%*%B #Matrix multiplication. 407 | ## [,1] [,2] 408 | ##[1,] 47 55 409 | ##[2,] 70 82 410 | 411 | dim(B) #Dimension 412 | # [1] 2 2 413 | 414 | nrow(B) 415 | # [1] 2 416 | 417 | ncol(B) 418 | # [1] 2 419 | 420 | t(A) #Transpose 421 | ## [,1] [,2] 422 | ##[1,] 1 2 423 | ##[2,] 3 4 424 | 425 | rownames(A) = c("a","b") 426 | colnames(A) = c("d", "e") 427 | A 428 | ## d e 429 | ## a 1 3 430 | ## b 2 4 431 | 432 | ## Subsetting matrices are similar to atomic vectors 433 | 434 | A[1,2] # element in row 1, column 2 435 | ## [1] 3 436 | 437 | A[2, ] # row 2 438 | ## [1] 2 4 439 | 440 | A[ ,1] # column 1 441 | ## [1] 1 2 442 | 443 | B[,-1] # all except column 1 444 | ## [1] 13 14 445 | 446 | 447 | ########## Examples of lists in R ############### 448 | 449 | x = list(name = 'Tanujit', nationality = 'Indian' , height =5.5 , marks =c(95,45,80)) 450 | names(x) 451 | ## [1] "name" "nationality" "height" "marks" 452 | 453 | str(x) 454 | ## List of 4 455 | ## $ name : chr "Tanujit" 456 | ## $ nationality: chr "Indian" 457 | ## $ height : num 5.5 458 | ## $ marks : num [1:3] 95 45 80 459 | 460 | x$name 461 | ## [1] "Tanujit" 462 | 463 | x$hei #abbreviations are OK 464 | ## [1] 5.5 465 | 466 | x$marks 467 | ## [1] 95 45 80 468 | 469 | x$m[2] 470 | ## [1] 45 471 | 472 | typeof(x) 473 | ## [1] "list" 474 | 475 | ############## Adding lists ############## 476 | 477 | l1 <- list(1:3, "a", c(TRUE, FALSE, TRUE), c(2.3, 5.9)) 478 | l2 <- list(l1, c(1:10)) 479 | str(l2) 480 | 481 | ## List of 2 482 | ## $ :List of 4 483 | ## ..$ : int [1:3] 1 2 3 484 | ## ..$ : chr "a" 485 | ## ..$ : logi [1:3] TRUE FALSE TRUE 486 | ## ..$ : num [1:2] 2.3 5.9 487 | ## $ : int [1:10] 1 2 3 4 5 6 7 8 9 10 488 | 489 | #To add an additional list component without creating nested lists we use the append() function. 490 | l3 <- append(l1, list(c(1:10))) 491 | str(l3) 492 | ## List of 5 493 | ## $ : int [1:3] 1 2 3 494 | ## $ : chr "a" 495 | ## $ : logi [1:3] TRUE FALSE TRUE 496 | ## $ : num [1:2] 2.3 5.9 497 | ## $ : int [1:10] 1 2 3 4 5 6 7 8 9 10 498 | 499 | # Alternatively, we can add additional elements by using the $ sign and naming the item. 
500 |
501 | l1$x <- c(1.4,2.3)
502 | str(l1)
503 | ## List of 5
504 | ## $ : int [1:3] 1 2 3
505 | ## $ : chr "a"
506 | ## $ : logi [1:3] TRUE FALSE TRUE
507 | ## $ : num [1:2] 2.3 5.9
508 | ## $ x: num [1:2] 1.4 2.3
509 |
510 | ############### Data Frame in R ###############
511 |
512 | d <- c(1,2,3,4)
513 | e <- c("Tanujit", "Subhajit", "Indrajit", NA)
514 | f <- c(TRUE,TRUE,TRUE,FALSE)
515 | myframe <- data.frame(d,e,f)
516 | names(myframe) <- c("ID","Name","Passed") # Variable names
517 | myframe
518 | myframe[1:3,] # Rows 1,2,3 of data frame
519 | myframe[,1:2] # Columns 1,2 of data frame
520 | myframe[c("ID","Name")] # Columns ID and Name from the data frame
521 | myframe$ID # Variable ID in the data frame
522 |
523 | ############### Factors in R ###############
524 | ### Example: variable gender with 20 "male" entries and 30 "female" entries ###
525 |
526 | gender <- c(rep("male",20), rep("female", 30))
527 | gender <- factor(gender) # Stores gender as 30 1s and 20 2s
528 | # 1=female, 2=male internally (levels are sorted alphabetically)
529 | # R now treats gender as a nominal variable
530 | summary(gender)
531 | ## female male
532 | ## 30 20
533 |
534 |
535 | section <- rep(c("A","B"), each = 3)
536 | section
537 | ## [1] "A" "A" "A" "B" "B" "B"
538 |
539 | typeof(section)
540 | ## [1] "character"
541 |
542 | section <- as.factor(section)
543 | section
544 | ## [1] A A A B B B
545 | ## Levels: A B
546 |
547 | class(section)
548 | ## [1] "factor"
549 |
550 | ############### Functions in R ###############
551 |
552 | g = function(x,y) (x+2*y)/5
553 | g(5,10)
554 | g(10,5)
555 |
556 |
557 | ###################### Use different packages after installation ###########
558 |
559 | install.packages('MASS')
560 | library('MASS')
561 |
562 | ################### R Arithmetic operators #############################
563 |
564 | # + : Addition
565 | # - : Subtraction
566 | # * : Multiplication
567 | # / : Division
568 | # ^ or ** : Exponentiation
569 | # x%%y : Modulus (Remainder from division) (x mod y) : e.g., 5%%2 is 1
570 | # x%/%y : Integer division
571 |
572 | ##### Define variables to do the above arithmetic operations #####
573 |
574 | x <- 6
575 | y <- 20
576 | print(x+y)
577 | print(x-y)
578 | print(x*y)
579 | print(y/x)
580 | print(y%%x)
581 | print(y%/%x)
582 | print(y^x)
583 |
584 |
585 | ######### R Assignment Operators #########################
586 |
587 | # <-, = : Leftwards assignment
588 | # -> : Rightwards assignment
589 | x3=4
590 | print(x3 <- 3)
591 | print(10->x3)
592 |
593 | ########### Matrix Multiplication ##################
594 |
595 | ?matrix ### Get help on a function ###
596 | A = matrix(c(21,57,89,31,7,98), nrow =2, ncol=3, byrow = TRUE)
597 | B = matrix(c(24, 35, 15, 34, 56,25), nrow = 3, ncol = 2, byrow = TRUE)
598 | print(A)
599 | print(B)
600 | C = A%*%B ### Multiply A and B and store the result in matrix C ###
601 | print(C)
602 | print(det(C))
603 | Inv <- solve(C) ### Find the inverse of a matrix ###
604 | print(Inv)
605 | print(eigen(C)) ### Find Eigenvalues ###
606 |
607 | ######### Alternative way to find the inverse of a matrix ##############
608 |
609 | ## Using the inv() function:
610 | ## inv() is a function from the matlib package that is used to find the inverse of a matrix
611 | ## For that you need to install the matlib package in your environment and load it using library()
612 | ## It takes time .. Do it later
Do it later 613 | ## install.packages('matlib') 614 | ## library("matlib") 615 | ## print(inv(C)) 616 | 617 | ##### R Functions for Probability Distributions ##### 618 | 619 | # Every distribution that R handles has four functions. There is a root name, for example, 620 | # the root name for the normal distribution is norm. This root is prefixed by one of the letters 621 | 622 | # p for "probability", the cumulative distribution function (c. d. f.) 623 | # q for "quantile", the inverse c. d. f. 624 | # d for "density", the density function (p. f. or p. d. f.) 625 | # r for "random", a random variable having the specified distribution 626 | # For the normal distribution, these functions are pnorm, qnorm, dnorm, and rnorm. 627 | # For the binomial distribution, these functions are pbinom, qbinom, dbinom, and rbinom. And so forth. 628 | # R has functions to handle many probability distributions like, Beta, Cauchy, Gamma, Poisson, etc.. 629 | 630 | #### Example of Normal Distribution ############## 631 | 632 | # Direct Look-Up 633 | # pnorm is the R function that calculates the c. d. f. 634 | # F(x) = P(X <= x) where X is normal. 635 | 636 | print(pnorm(27.4, 50, 20)) # Here it look up P(X < 27.4) when X is normal with mean 50 and standard deviation 20. 637 | 638 | # Inverse Look-Up 639 | 640 | # qnorm is the R function that calculates the inverse c. d. f. F-1 of the normal distribution The c. d. f. and the inverse c. d. f. are related by 641 | 642 | # p = F(x) 643 | # x = F-1(p) 644 | 645 | # So given a number p between zero and one, qnorm looks up the p-th quantile of the normal distribution. 646 | 647 | # Q: What is F^(-1)(0.95) when X has the N(100, 15^2) distribution? 648 | 649 | print(qnorm(0.95, mean=100, sd=15)) 650 | 651 | ### Random Variates 652 | 653 | # rnorm is the R function that simulates random variates having a specified normal distribution. 654 | # As with pnorm, qnorm, and dnorm, optional arguments specify the mean and standard deviation of the distribution. 655 | 656 | x <- rnorm(100, mean=10, sd=5) 657 | print(x) 658 | 659 | ###### below it plots the histogram of the above 100 random points generated from normal distribution with mean=10 and sd=5 660 | 661 | print(hist(x, probability=TRUE)) ### hist plots Histogram ### 662 | 663 | ### Home Task: Do the same for other distributions for hands on study 664 | 665 | ##### For any Functional Help write the following ##### 666 | 667 | # ?rnorm() 668 | # ?pnorm() 669 | # ?dnorm() 670 | # ?qnorm() 671 | 672 | 673 | 674 | ################## Functions in R ####################### 675 | 676 | # Total accumulated value of an investment 677 | compound_amount <- function(P, r, n, t){ 678 | A <- P*(1 + r/n)^(n*t) 679 | return(A) 680 | } 681 | 682 | # Suppose someone invested a sum of 10000 INR at an annual interest rate of 6.5%, 683 | # compounded quarterly for a period of 6 years. 
To calculate the accumulated sum, 684 | # we can use 685 | 686 | P <- 10000; r <- 0.065; n <- 4; t <- 6 687 | compound_amount(P, r, n, t) 688 | 689 | formals(compound_amount) 690 | 691 | body(compound_amount) 692 | 693 | environment(compound_amount) 694 | 695 | 696 | # Function arguments 697 | # We can call the arguments in several ways 698 | 699 | # using argument names 700 | compound_amount(P = 10000, r = 0.065, n = 4, t = 6) 701 | ## [1] 14723.58 702 | 703 | # same as above but without using names ("positional matching") 704 | compound_amount(10000, .065, 4, 6) 705 | ## [1] 14723.58 706 | 707 | # if using names you can change the order 708 | compound_amount(r = .065, P = 10000, n = 4, t = 6) 709 | ## [1] 14723.58 710 | 711 | # if not using names you must insert arguments in proper order 712 | compound_amount(.08, 10000, 4, 6) 713 | ## [1] 2.869582e+80 714 | 715 | # While writing a function we can also set default values to arguments. 716 | 717 | compound_amount <- function(P, r = 0.065, n = 4, t){ 718 | A <- P*(1 + r/n)^(n*t) 719 | return(A) 720 | } 721 | 722 | compound_amount(P = 10000, t = 6) # uses default values of r and n 723 | ## [1] 14723.58 724 | 725 | compound_amount(P = 10000, r = 0.07, t = 6) # uses specified value of r, default for n 726 | ## [1] 15164.43 727 | 728 | 729 | # Lexical scoping 730 | # What is the output of this code? 731 | x <- 10 732 | g01 <- function() { 733 | x <- 20 734 | x 735 | } 736 | 737 | g01() 738 | 739 | # R uses lexical scoping 740 | # This means that a function will first look inside the function to identify all the variables being called. 741 | # If all variables exist then their is no additional search required to identify variables. 742 | 743 | P <- 1 744 | compound_amount <- function(P, r, n, t){ 745 | P <- 10000; r <- 0.065; n <- 4; t <- 6 746 | A <- P*(1 + r/n)^(n*t) 747 | return(A) 748 | } 749 | compound_amount(P, r, n, t) 750 | ## [1] 14723.58 751 | 752 | #If a variable does not exist within the function, R will look one level up to see if the variable exists. 753 | P <- 10000 754 | compound_amount <- function(P, r, n, t){ 755 | r <- 0.065; n <- 4; t <- 6 756 | A <- P*(1 + r/n)^(n*t) 757 | return(A) 758 | } 759 | compound_amount(P, r, n, t) 760 | ## [1] 14723.58 761 | 762 | #The same rule applies for functions within functions 763 | P <- 100 764 | compound_amount <- function(P){ 765 | r <- 0.065; n <- 4; t <- 6 766 | interest_factor <- function(){ 767 | out <- (1 + r/n)^(n*t) 768 | } 769 | A <- P*interest_factor() 770 | return(A) 771 | } 772 | compound_amount(P) 773 | ## [1] 147.2358 774 | 775 | # This also applies for functions in which some arguments are called but not all variables used in the body are identified as arguments. 776 | r <- 0.07 777 | compound_amount <- function(P, t){ 778 | n <- 4 779 | A <- P*(1 + r/n)^(n*t) 780 | return(A) 781 | } 782 | # n is defined within the function 783 | # r is defined outside the function 784 | compound_amount(P = 10000, t = 6) 785 | ## [1] 15164.43 786 | 787 | # Functions vs variables 788 | # In R, functions are ordinary objects. 789 | # This means the scoping rules described above also apply to functions. 790 | cool_func <- function(x) x + 10 791 | ubercool_func <- function() { 792 | cool_func <- function(x) x + 99 793 | cool_func(10) 794 | } 795 | ubercool_func() 796 | ## [1] 109 797 | 798 | # Dynamic lookup 799 | # Lexical scoping determines where, but not when to look for values. 800 | # R looks for values when the function is run, not when the function is created. 
801 | # Together, these two properties tell us that the output of a function can differ depending on the objects outside the function’s environment. 802 | 803 | foo <- function() x + 1 804 | x <- 25 805 | foo() 806 | ## [1] 26 807 | x <- 50 808 | foo() 809 | ## [1] 51 810 | 811 | # Lazy evaluation 812 | # In R, function arguments are lazily evaluated: they are only evaluated if accessed. 813 | # For example, this code does not generate an error because x is never used. 814 | h01 <- function(x) { 815 | 10 816 | } 817 | h01(stop("This is an error!")) 818 | ## [1] 10 819 | 820 | # the y argument is not used so not included it causes no harm 821 | lazy_func <- function(x, y){ 822 | x^2 823 | } 824 | lazy_func(2) 825 | ## [1] 4 826 | 827 | # if both arguments are required in the body an error will result if an argument is missing 828 | lazy_func2 <- function(x, y){ 829 | (x + y)^2 830 | } 831 | lazy_func2(2) 832 | ## Error in lazy_func2(2): argument "y" is missing, with no default 833 | 834 | # Returning multiple outputs 835 | bad_func <- function(x, y) { 836 | 2*x + y 837 | x + 2*y 838 | 2*x + 2*y 839 | x/y 840 | } 841 | bad_func(2,3) 842 | ## [1] 0.6666667 843 | 844 | good_func <- function(x, y) { 845 | output1 <- 2*x + y 846 | output2 <- x + 2*y 847 | output3 <- 2*x + 2*y 848 | output4 <- x/y 849 | c(output1, output2, output3, output4) 850 | } 851 | good_func(2, 3) 852 | ## [1] 7.0000000 8.0000000 10.0000000 0.6666667 853 | 854 | good_list <- function(x, y) { 855 | output1 <- 2*x + y 856 | output2 <- x + 2*y 857 | output3 <- 2*x + 2*y 858 | output4 <- x/y 859 | c(list(Output1 = output1, Output2 = output2, 860 | Output3 = output3, Output4 = output4)) 861 | } 862 | good_list(1, 2) 863 | 864 | ## $Output1 865 | ## [1] 4 866 | ## $Output2 867 | ## [1] 5 868 | ## $Output3 869 | ## [1] 6 870 | ## $Output4 871 | ## [1] 0.5 872 | 873 | ############## Conditional Executions ############## 874 | 875 | # if Statement 876 | # syntax of if statement 877 | if (test_expression) { 878 | statement 879 | } 880 | 881 | x <- 2 882 | if(x < 3){ 883 | y <- x + 2 884 | y 885 | } 886 | ## [1] 4 887 | 888 | # second example 889 | coin <- rbinom(1,1,0.5) # tossing a fair coin once 890 | toss_call <- "Heads" 891 | coin # toss outcome 892 | ## [1] 1 893 | 894 | if(coin == 1){ 895 | print("Heads it is!") 896 | } 897 | ## [1] "Heads it is!" 898 | 899 | # if else statement 900 | # syntax of if...else statement 901 | if (test_expression) { 902 | # statement 1 903 | } else { 904 | # statement 2 905 | } 906 | 907 | coin <- rbinom(1,1,0.5) # tossing a fair coin once 908 | toss_call <- "Heads" 909 | coin # toss outcome 910 | ## [1] 1 911 | 912 | if(coin == 1){ 913 | print("Heads it is!") 914 | } else { 915 | print("No, it's Tails!") 916 | } 917 | ## [1] "Heads it is!" 918 | 919 | set.seed(100) 920 | coin <- rbinom(1,1,0.5) # tossing a fair coin once 921 | toss_call <- "Tails" 922 | coin # toss outcome 923 | ## [1] 0 924 | 925 | if(coin == 1 & toss_call == "Heads"){ 926 | print("Heads it is!") 927 | } else if(coin == 1 & toss_call == "Tails"){ 928 | print("No, it's Heads!") 929 | } else if(coin == 0 & toss_call == "Heads"){ 930 | print("No it's Tails!") 931 | } else if(coin == 0 & toss_call == "Tails"){ 932 | print("Tails it is!") 933 | } 934 | ## [1] "Heads it is!" 
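############ Vectorised conditionals: ifelse() and switch() ############

# A minimal illustrative sketch (it assumes nothing beyond the fair-coin idea above).
# if() inspects a single logical value; for an element-wise decision over a whole
# vector the vectorised ifelse() is usually more convenient, and switch() gives a
# compact multi-way branch on a single string.

tosses <- rbinom(5, 1, 0.5)               # five tosses of a fair coin
ifelse(tosses == 1, "Heads", "Tails")     # one label per toss, no explicit loop

describe_toss <- function(label) {
  switch(label,
         Heads = "Heads it is!",
         Tails = "No, it's Tails!",
         "Unknown outcome")               # unnamed last element acts as the default
}
describe_toss("Heads")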
935 | 936 | # FOR LOOP 937 | 938 | # syntax of for loop 939 | for(i in 1:100) { 940 | # 941 | } 942 | 943 | # squaring elements in a vector 944 | x <- seq(from = 0, to = 10, by = 0.5) 945 | xsq <- 0 #initialization 946 | 947 | for(i in 1:length(x)){ 948 | xsq[i] <- x[i]*x[i] 949 | } 950 | 951 | xsq 952 | ## [1] 0.00 0.25 1.00 2.25 4.00 6.25 9.00 12.25 16.00 20.25 953 | ## [11] 25.00 30.25 36.00 42.25 49.00 56.25 64.00 72.25 81.00 90.25 954 | ## [21] 100.00 955 | 956 | #Note: This was just for demonstration. We could have just done this: 957 | 958 | xsq <- x^2 959 | xsq 960 | ## [1] 0.00 0.25 1.00 2.25 4.00 6.25 9.00 12.25 16.00 20.25 961 | ## [11] 25.00 30.25 36.00 42.25 49.00 56.25 64.00 72.25 81.00 90.25 962 | ## [21] 100.00 963 | 964 | for (i in 2023:2029){ 965 | output <- paste("Good times shall come in ", i) # the "promise" 966 | print(output) 967 | } 968 | ## [1] "Good times shall come in 2023" 969 | ## [1] "Good times shall come in 2024" 970 | ## [1] "Good times shall come in 2025" 971 | ## [1] "Good times shall come in 2026" 972 | ## [1] "Good times shall come in 2027" 973 | ## [1] "Good times shall come in 2028" 974 | ## [1] "Good times shall come in 2029" 975 | 976 | print("Good times are yet to come!") # the "reality" 977 | ## [1] "Good times are yet to come!" 978 | 979 | ######### While loop ############# 980 | 981 | # syntax of while loop 982 | counter <- 1 983 | 984 | while(test_expression) { 985 | statement 986 | counter <- counter + 1 987 | } 988 | #The primary difference between a for loop and a while loop is: 989 | # a for loop is used when the number of iterations a code should be run is known 990 | # where as a while loop is used when the number of iterations is not known. 991 | 992 | wins <- 0; games <- 0 993 | while(wins < 3) { 994 | x <- rbinom(1,1,0.5) 995 | wins <- wins + x 996 | games <- games + 1 997 | print(c(toss = x, games = games, wins = wins)) 998 | } 999 | ## toss games wins 1000 | ## 1 1 1 1001 | ## toss games wins 1002 | ## 0 2 1 1003 | ## toss games wins 1004 | ## 0 3 1 1005 | ## toss games wins 1006 | ## 1 4 2 1007 | ## toss games wins 1008 | ## 1 5 3 1009 | 1010 | ########### Repeat Statement ############### 1011 | 1012 | # We monitor a captain's toss winning record, and count how many games are required to win at least 3 tosses 1013 | 1014 | games <- 0 # games counter 1015 | wins <- 0 # toss wins counter 1016 | 1017 | repeat { 1018 | games <- games + 1 # new game 1019 | x <- rbinom(1,1,0.5) # toss outcome 1020 | wins <- wins + x # wins update 1021 | if(wins == 3){ 1022 | break 1023 | } 1024 | } 1025 | 1026 | c(wins, games) # toss wins and number of games 1027 | ## [1] 3 6 1028 | 1029 | ################# BREAK STATEMENT ################# 1030 | 1031 | 1032 | # Example: Suppose we want to create a 10 x 10 lower-triangular matrix with elements given by the product of the row and column numbers. 1033 | # This means that all matrix elements for which row number <= column number must be zero. 
1034 | 1035 | # use of breaks 1036 | 1037 | m <- n <- 10 # matrix dimensions 1038 | # Create a 10 x 10 matrix with zeroes 1039 | mymat <- matrix(0, nrow = m, ncol = n) 1040 | ctr <- 0 # counter to count the assignment 1041 | 1042 | for(i in 1:m) { 1043 | for(j in 1:n) { 1044 | if(i == j) { 1045 | break 1046 | } else { 1047 | mymat[i,j] <- i*j # assign the values only when i > j 1048 | ctr <- ctr+1 1049 | } 1050 | } 1051 | } 1052 | 1053 | # Print how many matrix cells were assigned 1054 | print(ctr) 1055 | ## [1] 45 1056 | 1057 | ################## Descriptive Statistics ##################### 1058 | 1059 | # The monthly credit card expenses of an individual in 1000 rupees is 1060 | # given in the file Credit_Card_Expenses.csv. 1061 | # Q1. Read the data set 1062 | # Q2. Compute mean, median minimum, maximum, range, variance, standard 1063 | # deviation, skewness, kurtosis and quantile of Credit Card Expenses 1064 | # Q3. Compute default summary of Credit Card Expenses 1065 | # Q4. Draw Histogram of Credit Card Expenses 1066 | 1067 | 1068 | #### read the csv file using read.csv() function 1069 | 1070 | 1071 | Credit_Card_Expenses <- read.csv("Credit_Card_Expenses.csv") 1072 | Credit_Card_Expenses 1073 | mydata = Credit_Card_Expenses ### load it to another variable 1074 | print(mydata) ### print the data frame 1075 | 1076 | 1077 | ### To read a particular column or variable of data set to a new variable Example: 1078 | ### Read CC_Expenses to CC 1079 | 1080 | CC=mydata$CC_Expenses 1081 | print(CC) 1082 | 1083 | ###### Descriptive statistics for variable ##### 1084 | 1085 | Mean = mean(CC) 1086 | print(Mean) 1087 | Median=median(CC) 1088 | print(Median) 1089 | StandaradDeviation=sd(CC) 1090 | print(StandaradDeviation) 1091 | Variance=var(CC) 1092 | print(Variance) 1093 | Minimum=min(CC) 1094 | print(Minimum) 1095 | Maximum=max(CC) 1096 | print(Maximum) 1097 | Range=range(CC) 1098 | print(Range) 1099 | Quantile=quantile(CC) 1100 | print(Quantile) 1101 | 1102 | Summary=summary(CC) 1103 | print(Summary) 1104 | 1105 | ##### Another way to calculate Descriptive statistics ##### 1106 | 1107 | # install.packages("psych") 1108 | library('psych') 1109 | data_descriptive=describe(CC) 1110 | print(data_descriptive) 1111 | 1112 | ##### Plotting of the Descriptive Statistics ##### 1113 | 1114 | hist(CC) 1115 | hist(CC,col="blue") 1116 | dotchart(CC) 1117 | boxplot(CC) 1118 | boxplot(CC, col="dark green") 1119 | 1120 | ########################## Data Visualization techniques ########################## 1121 | 1122 | #install.packages("ggplot2") 1123 | # Library Call (for use) 1124 | library("ggplot2") 1125 | 1126 | # Installing packages 1127 | install.packages("tidyverse") 1128 | # Library Call (for use) 1129 | library("tidyverse") 1130 | 1131 | ######### Basics of ggplot2 ######### 1132 | 1133 | mpg 1134 | 1135 | # create the blank canvas 1136 | ggplot(mpg) 1137 | 1138 | # variables of interest mapped 1139 | ggplot(mpg, aes(x = displ, y = hwy)) 1140 | 1141 | # type of display 1142 | ggplot(mpg, aes(x = displ, y = hwy)) + 1143 | geom_point() 1144 | 1145 | ########## Aesthetics mapping ############ 1146 | 1147 | # plot displ vs hwy w.r.t different class of cars in mpg data. 
1148 | 1149 | ggplot(data = mpg) + 1150 | geom_point(mapping = aes(x = displ, y = hwy, color = class)) 1151 | 1152 | 1153 | # alpha aesthetic, which controls the transparency of the points 1154 | ggplot(data = mpg) + 1155 | geom_point(mapping = aes(x = displ, y = hwy, alpha = class)) 1156 | 1157 | # shape aesthetic, which controls the shape of the points. 1158 | ggplot(data = mpg) + 1159 | geom_point(mapping = aes(x = displ, y = hwy, shape = class)) 1160 | 1161 | # You can also set the aesthetic properties of your geom manually. 1162 | # For example, we can make all of the points in our plot red: 1163 | 1164 | ggplot(data = mpg) + 1165 | geom_point(mapping = aes(x = displ, y = hwy), color = "red") 1166 | 1167 | ############## Geometric Objects ################### 1168 | 1169 | ggplot(mpg, aes(x = displ, y = hwy)) + 1170 | geom_point() + 1171 | geom_smooth() 1172 | # `geom_smooth()` using method = 'loess' and formula = 'y ~ x' 1173 | 1174 | ggplot(data = mpg, aes(x = class)) + 1175 | geom_bar() 1176 | 1177 | ggplot(data = mpg, aes(x = hwy)) + 1178 | geom_histogram() 1179 | # `stat_bin()` using `bins = 30`. Pick better value with `binwidth`. 1180 | 1181 | ggplot(mpg, aes(x = displ, y = hwy)) + 1182 | geom_point(color = "blue") + 1183 | geom_smooth(color = "red") 1184 | # `geom_smooth()` using method = 'loess' and formula = 'y ~ x' 1185 | 1186 | ggplot(data = mpg, mapping = aes(x = displ, y = hwy)) + 1187 | geom_point(mapping = aes(color = class)) + 1188 | geom_smooth() 1189 | # `geom_smooth()` using method = 'loess' and formula = 'y ~ x' 1190 | 1191 | ggplot(data = mpg, mapping = aes(x = displ, y = hwy)) + 1192 | geom_point(mapping = aes(color = class)) + 1193 | geom_smooth(data = filter(mpg, class == "subcompact"), se = FALSE) 1194 | # `geom_smooth()` using method = 'loess' and formula = 'y ~ x' 1195 | 1196 | ################ Statistical Transformation ################ 1197 | ggplot(data = mpg, aes(x = class)) + geom_bar() 1198 | 1199 | ggplot(data = mpg, aes(x = class)) + 1200 | geom_bar(mapping = aes(x = class, y = stat(prop), group = 1)) 1201 | 1202 | ############### Position Adjustments ############### 1203 | 1204 | # bar chart of class, colored by drive (front, rear, 4-wheel) 1205 | ggplot(mpg, aes(x = class, fill = drv)) + 1206 | geom_bar() 1207 | 1208 | # position = "dodge": values next to each other 1209 | ggplot(mpg, aes(x = class, fill = drv)) + 1210 | geom_bar(position = "dodge") 1211 | 1212 | # position = "fill": percentage/proportions chart 1213 | ggplot(mpg, aes(x = class, fill = drv)) + 1214 | geom_bar(position = "fill") + 1215 | ylab("proportion") 1216 | 1217 | ################ Co-ordinate Systems ################### 1218 | 1219 | # flip x and y axis with coord_flip 1220 | ggplot(mpg, aes(x = class)) + 1221 | geom_bar() + 1222 | coord_flip() 1223 | 1224 | ################ Labels ################## 1225 | # Adding title 1226 | ggplot(mpg, aes(displ, hwy)) + 1227 | geom_point(aes(color = class)) + 1228 | geom_smooth(se = FALSE) + 1229 | labs(title = "Fuel efficiency generally decreases with engine size") 1230 | 1231 | # We can also use labs() to replace the axis and legend titles. 
1232 | ggplot(mpg, aes(displ, hwy)) +
1233 | geom_point(aes(color = class)) +
1234 | geom_smooth(se = FALSE) +
1235 | labs(title = "Fuel efficiency generally decreases with engine size", x = "Engine displacement (L)",
1236 | y = "Highway fuel economy (mpg)", color = "Car types")
1237 | 
1238 | 
1239 | ################ Scales ###############
1240 | ggplot(mpg, aes(displ, hwy)) +
1241 | geom_point(aes(colour = class))
1242 | 
1243 | ggplot(mpg, aes(displ, hwy)) +
1244 | geom_point(aes(colour = class)) +
1245 | scale_x_continuous() +
1246 | scale_y_continuous() +
1247 | scale_colour_discrete() # note the parentheses: the scale must be called, not just named
1248 | 
1249 | # Change the scale of the y axis
1250 | ggplot(mpg, aes(displ, hwy)) +
1251 | geom_point() +
1252 | scale_y_continuous(breaks = seq(15, 40, by = 5))
1253 | 
1254 | # Change the legend position
1255 | base <- ggplot(mpg, aes(displ, hwy)) +
1256 | geom_point(aes(colour = class))
1257 | base + theme(legend.position = "left") # 'right' is the default
1258 | 
1259 | #################### Facets #####################
1260 | 
1261 | # Adding another variable to the plot
1262 | ggplot(data = mpg) +
1263 | geom_point(mapping = aes(x = displ, y = hwy)) +
1264 | facet_wrap(~ class, nrow = 2)
1265 | 
1266 | # Facet plots with a combination of two variables
1267 | ggplot(data = mpg) +
1268 | geom_point(mapping = aes(x = displ, y = hwy)) +
1269 | facet_grid(drv ~ cyl)
1270 | 
1271 | #################### Saving your plots ##################
1272 | ggplot(mpg, aes(displ, hwy)) + geom_point()
1273 | ggsave("plot1.png")
1274 | # Saving 7.29 x 5.56 in image
1275 | ################ Big Mart Data Visualization ####################
1276 | 
1277 | data_mart = read.csv("Big_Mart_Dataset.csv")
1278 | print(data_mart) # Read and print the BigMart dataset
1279 | 
1280 | library(ggplot2) # Scatter plot of Item_Visibility & Item_MRP
1281 | print(ggplot(data_mart, aes(Item_Visibility, Item_MRP)) + geom_point() +
1282 | scale_x_continuous("Item Visibility", breaks = seq(0,0.35,0.05))+
1283 | scale_y_continuous("Item MRP", breaks = seq(0,270,by = 30))+ theme_bw())
1284 | 
1285 | 
1286 | # We can also view a third variable in the same chart, say a categorical variable (Item_Type),
1287 | # which gives the item category of each observation. Different categories are shown with
1288 | # different colours in the chart below. Another scatter plot using ggplot() with geom_point().
1289 | 
1290 | 
1291 | print(ggplot(data_mart, aes(Item_Visibility, Item_MRP)) +
1292 | geom_point(aes(color = Item_Type)) + scale_x_continuous("Item Visibility", breaks = seq(0,0.35,0.05)) +
1293 | scale_y_continuous("Item MRP", breaks = seq(0,270,by = 30))+ theme_bw() +
1294 | labs(title="Scatterplot"))
1295 | 
1296 | # We can make this even clearer by creating a separate
1297 | # scatter plot for each Item_Type, as shown below.
1298 | # Another scatter plot using ggplot() with geom_point().
1299 | 
1300 | print(ggplot(data_mart, aes(Item_Visibility, Item_MRP)) +
1301 | geom_point(aes(color = Item_Type)) +
1302 | scale_x_continuous("Item Visibility", breaks = seq(0,0.35,0.05)) +
1303 | scale_y_continuous("Item MRP", breaks = seq(0,270, by = 30))+
1304 | theme_bw() + labs(title="Scatterplot") + facet_wrap( ~ Item_Type))
1305 | 
1306 | ########## Histogram Plot ########################
1307 | 
1308 | # For Big_Mart_Dataset, if we want to know the count of items on the basis of their
1309 | # cost, we can plot a histogram of the continuous variable Item_MRP as shown below.
1310 | # Histogram plot using function ggplot() with geom_ histogram() 1311 | 1312 | print(ggplot(data_mart, aes(Item_MRP)) + geom_histogram(binwidth = 2)+ 1313 | scale_x_continuous("Item MRP", breaks = seq(0,270,by = 30))+ 1314 | scale_y_continuous("Count", breaks = seq(0,200,by = 20))+ labs(title = "Histogram")) 1315 | 1316 | 1317 | ############## Bar Chart Plot ############################### 1318 | 1319 | # For Big_Mart_Dataset, if we want to know item weights (continuous variable) 1320 | # on basis of Outlet Type (categorical variable) on single bar chart as shown below. 1321 | # Vertical Bar plot using function ggplot() 1322 | 1323 | print(ggplot(data_mart, aes(Item_Type, Item_Weight)) + geom_bar(stat = "identity", fill = 1324 | "darkblue") + scale_x_discrete("Outlet Type")+ 1325 | scale_y_continuous("Item Weight", breaks = seq(0,15000, by = 500))+ 1326 | theme(axis.text.x = element_text(angle = 90, vjust = 0.5)) + 1327 | labs(title = "Bar Chart")) 1328 | 1329 | ################## Stack Bar Chart ######################### 1330 | 1331 | # For Big_Mart_Dataset, if we want to know the count of outlets on basis of 1332 | # categorical variables like its type (Outlet Type) and location (Outlet Location 1333 | # Type) both, stack chart will visualize the scenario in most useful manner. 1334 | # Stack Bar Chart using function ggplot() 1335 | 1336 | print(ggplot(data_mart, aes(Outlet_Location_Type, fill = Outlet_Type)) + 1337 | geom_bar()+labs(title = "Stacked Bar Chart", x = "Outlet Location Type", y = 1338 | "Count of Outlets")) 1339 | 1340 | ############ Box Plot ########################################## 1341 | 1342 | # For Big_Mart_Dataset, if we want to identify each outlet's detailed item sales 1343 | # including minimum, maximum & median numbers, box plot can be helpful. In 1344 | # addition, it also gives values of outlier of item sales for each outlet as shown 1345 | # in below chart. 1346 | 1347 | print(ggplot(data_mart, aes(Outlet_Identifier, Item_Outlet_Sales)) + 1348 | geom_boxplot(fill = "red")+ 1349 | scale_y_continuous("Item Outlet Sales", breaks= seq(0,15000, by=500))+ 1350 | labs(title = "Box Plot", x = "Outlet Identifier")) 1351 | 1352 | ### To save these charts, click on Export - Save as ... ### 1353 | 1354 | ##################### Area Chart #################################### 1355 | 1356 | # For Big_Mart_Data set, when we want to analyze the trend of item outlet sales, 1357 | # area chart can be plotted as shown below. It shows count of outlets on basis of sales. 1358 | 1359 | print(ggplot(data_mart, aes(Item_Outlet_Sales)) + 1360 | geom_area(stat = "bin", bins = 30, fill = "steelblue") + 1361 | scale_x_continuous(breaks = seq(0,11000,1000))+ 1362 | labs(title = "Area Chart", x = "Item Outlet Sales", y = "Count")) 1363 | 1364 | # Area chart shows continuity of Item Outlet Sales. 1365 | 1366 | 1367 | ###################### Heat Map: ############################################ 1368 | 1369 | # For Big_Mart_Dataset, if we want to know cost of each item on every outlet, 1370 | # we can plot heatmap as shown below using three variables Item MRP, Outlet 1371 | # Identifier & Item Type from our mart dataset 1372 | 1373 | print(ggplot(data_mart, aes(Outlet_Identifier, Item_Type))+ 1374 | geom_raster(aes(fill = Item_MRP))+ 1375 | labs(title ="Heat Map", x = "Outlet Identifier", y = "Item Type")+ 1376 | scale_fill_continuous(name = "Item MRP")) 1377 | 1378 | # The dark portion indicates Item MRP is close 50. The brighter portion indicates 1379 | # Item MRP is close to 250. 
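##################### Density Plot (illustrative sketch) ####################

# A density plot is a smoothed alternative to the histogram. This is a hedged,
# illustrative example: the column names Item_Outlet_Sales and Outlet_Type are
# assumed from the charts above. It overlays the sales distribution of each outlet type.

print(ggplot(data_mart, aes(Item_Outlet_Sales, fill = Outlet_Type)) +
        geom_density(alpha = 0.4) +
        labs(title = "Density Plot", x = "Item Outlet Sales", y = "Density"))

# Overlapping, semi-transparent densities make it easy to compare how item sales
# are distributed across outlet types on a single canvas.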
1380 | 
1381 | ##################### Correlogram ##########################
1382 | 
1383 | # For Big_Mart_Dataset, check the correlation between item cost, weight and visibility,
1384 | # along with outlet establishment year and outlet sales, using the plot below.
1385 | # install.packages("corrgram")
1386 | 
1387 | install.packages("corrgram")
1388 | library(corrgram)
1389 | print(corrgram(data_mart, order=NULL, panel=panel.shade, text.panel=panel.txt,
1390 | main="Correlogram"))
1391 | 
1392 | # The darker the colour, the higher the correlation between the variables. Positive
1393 | # correlations are displayed in blue and negative correlations in red. Colour
1394 | # intensity is proportional to the magnitude of the correlation.
1395 | 
1396 | # We can see that item cost & outlet sales are positively correlated, while item
1397 | # weight & item visibility are negatively correlated.
1398 | 
1399 | ev_od_check <- function(mu, sig){
1400 | x = round(rnorm(1, mu, sig))
1401 | if (x %% 2 == 0){
1402 | out <- "It is even"
1403 | }else{
1404 | out <- "It is odd"
1405 | }
1406 | out_1 <- list(out, x)
1407 | return(out_1)
1408 | }
1409 | ev_od_check(50, 10)
1410 | 
1411 | ev_od_check_vec <- function(n, mu, sig){
1412 | x = round(rnorm(n, mu, sig))
1413 | out <- c()
1414 | for(i in 1:n){
1415 | if (x[i] %% 2 == 0){
1416 | out[i] <- "It is even"
1417 | }else{
1418 | out[i] <- "It is odd"
1419 | }
1420 | }
1421 | 
1422 | return(out)
1423 | }
1424 | ev_od_check_vec(10, 50, 10)
1425 | 
1426 | die_play <- function(){ ## Lexical scoping: x1, x2, val and out live only inside the function
1427 | x1 <- rbinom(1, 5, p =1/6) + 1
1428 | x2 <- rbinom(1, 5, p =1/6) + 1
1429 | val <- x1 + x2
1430 | if(val > 10 & x1 == x2){
1431 | out <- "Congratulations, you won the game!"
1432 | }else if(val <= 10 & val > 5 & x1 == x2){
1433 | out <- "You still have a winning chance"
1434 | }else if(val <= 5 & x1 == x2){
1435 | out <- "You can play once more"
1436 | }else{
1437 | out <- "Sorry, you lost the game!"
1438 | } 1439 | return(out) 1440 | } 1441 | die_play() 1442 | 1443 | 1444 | wins <- 0; games <- 0 1445 | while (wins < 3){ 1446 | x <- rbinom(1, 1, 0.5) 1447 | wins <- wins + x 1448 | games <- games +1 1449 | print(c(toss = x, games = games, wins = wins)) 1450 | } 1451 | 1452 | ################################# END OF SESSION ############################### 1453 | 1454 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_2_EDA.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Practical Slides and Codes/Practical_2_EDA.pdf -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_2_EDA_Code.R: -------------------------------------------------------------------------------- 1 | ###################### Data Pre-processing ###################### 2 | 3 | # Installing packages 4 | install.packages("tidyverse") 5 | # Library Call (for use) 6 | library("tidyverse") 7 | 8 | ### Use Import Dataset - From Text (base) - Missing_Values_Telecom - Heading - Yes ### 9 | mydata = read.csv("Missing_Values_Telecom.csv") 10 | newdata = na.omit(mydata) # Discard all records with missing values 11 | write.csv(newdata, file = 'newdata.csv',row.names = TRUE) 12 | 13 | ### Compute the means excluding the missing values 14 | cmusage = mydata[,2] 15 | l3musage = mydata[,3] 16 | avrecharge = mydata[,4] 17 | cmusage_mean = mean(cmusage, na.rm = TRUE) 18 | l3musage_mean = mean(l3musage, na.rm = TRUE) 19 | avrecharge_mean = mean(avrecharge, na.rm = TRUE) 20 | 21 | ### Replace the missing values with mean 22 | cmusage[is.na(cmusage)]=cmusage_mean 23 | l3musage[is.na(l3musage)]= l3musage_mean 24 | avrecharge[is.na(avrecharge)]=avrecharge_mean 25 | 26 | mynewdata = cbind(cmusage, l3musage, avrecharge, mydata[,5],mydata[,6]) 27 | mydata 28 | mynewdata 29 | write.csv(mynewdata, file = 'Missing_Values_Telecom_mod.csv',row.names = TRUE) 30 | 31 | ############## Data Normalization and Random Sampling ################# 32 | 33 | mydata = read.csv("Supply_Chain.csv") 34 | # mydata = Supply_Chain 35 | mystddata = scale(mydata) 36 | mystddata 37 | 38 | mydata= read.csv("bank_data.csv") 39 | nrow(mydata) 40 | sample = sample(2, nrow(mydata), replace = TRUE, prob = c(0.750, 0.250)) 41 | sample1 = mydata[sample ==1,] 42 | nrow(sample1) 43 | sample2 = mydata[sample ==2,] 44 | nrow(sample2) 45 | 46 | ################## Playing with the data ####################### 47 | library(tidyverse) 48 | 49 | # load packages 50 | suppressMessages(library(dplyr)) 51 | install.packages("hflights") 52 | library(hflights) 53 | 54 | data(hflights) 55 | head(hflights) 56 | 57 | # Revert to the original data frame to view all the columns 58 | head(data.frame(hflights)) 59 | 60 | flights <- hflights 61 | 62 | ######## Using 'filter' function ############ 63 | 64 | ## Extract details of flights operating on January 01. 65 | 66 | # base R approach to view all flights on January 1 67 | flights[flights$Month==1 & flights$DayofMonth==1, ] 68 | 69 | # try the filter function of dplyr for the same question 70 | # dplyr approach 71 | # note: you can use comma or ampersand to represent AND condition 72 | filter(flights, Month==1, DayofMonth==1) 73 | 74 | # Check the two answers 75 | 76 | ## Extract details of flights operated by American Airlines or United Airlines. 
77 | 78 | # use pipe for OR condition 79 | filter(flights, UniqueCarrier=="AA" | UniqueCarrier=="UA") 80 | 81 | # you can also use %in% operator 82 | filter(flights, UniqueCarrier %in% c("AA", "UA")) 83 | 84 | # Check the two answers 85 | 86 | ########## Using 'select' function ########### 87 | 88 | # Show flight details with Dep/Arr times alongwith flight numbers only. 89 | # base R approach to select DepTime, ArrTime, and FlightNum columns 90 | flights[, c("DepTime", "ArrTime", "FlightNum")] 91 | 92 | # dplyr approach 93 | select(flights, DepTime, ArrTime, FlightNum) 94 | # Check the two answers 95 | 96 | # Use colon to select multiple contiguous columns, and use contains to match columns by name 97 | # starts_with, ends_with, and matches (for regular expressions) can also be used to match columns by name 98 | select(flights, Year:DayofMonth, contains("Taxi"), contains("Delay")) 99 | 100 | select(flights, Year:DayofMonth, starts_with("Taxi"), ends_with("Delay")) 101 | 102 | ############## Using 'pipe' operator ############# 103 | # Show unique carrier details for flights having departure delays of more than 1 hour. 104 | # nesting method 105 | filter(select(flights, UniqueCarrier, DepDelay), DepDelay > 60) 106 | 107 | # chaining method 108 | flights %>% 109 | select(UniqueCarrier, DepDelay) %>% 110 | filter(DepDelay > 60) 111 | 112 | # Advantage of chaining 113 | 114 | # create two vectors and calculate Euclidian distance between them 115 | x1 <- 1:5; x2 <- 2:6 116 | sqrt(sum((x1-x2)^2)) 117 | 118 | # chaining method 119 | (x1-x2)^2 %>% sum() %>% sqrt() 120 | ## [1] 2.236068 121 | 122 | ############ Using arrange: Reorder rows ############ 123 | #Extract flight carrier details and show departure delays only sorted by delay lengths. 124 | # base R approach 125 | flights[order(flights$DepDelay), c("UniqueCarrier", "DepDelay")] 126 | 127 | # dplyr approach 128 | flights %>% 129 | select(UniqueCarrier, DepDelay) %>% 130 | arrange(DepDelay)# arrange(desc(DepDelay)) for descending order 131 | 132 | ############ Using mutate: Add new variables ############ 133 | # Find mean speed of flights. 134 | # base R approach 135 | flights$Speed <- flights$Distance / flights$AirTime*60 136 | flights[, c("Distance", "AirTime", "Speed")] 137 | 138 | # dplyr approach (prints the new variable but does not store it) 139 | flights %>% 140 | select(Distance, AirTime) %>% 141 | mutate(Speed = Distance/AirTime*60) 142 | 143 | flights <- flights %>% mutate(Speed = Distance/AirTime*60) 144 | flights 145 | 146 | ############## Using summarise ############# 147 | 148 | # Finding the average delay to each destination 149 | 150 | # dplyr approach: create a table grouped by Dest, and then summarise each group by taking the mean of ArrDelay 151 | flights %>% 152 | group_by(Dest) %>% 153 | summarise(avg_delay = mean(ArrDelay, na.rm=TRUE)) 154 | 155 | # For each day of the year, count the total number of flights and sort in descending order. 156 | flights %>% 157 | group_by(Month, DayofMonth) %>% 158 | summarise(flight_count = n()) %>% 159 | arrange(desc(flight_count)) 160 | 161 | ## `summarise()` has grouped output by 'Month'. You can override using the 162 | ## `.groups` argument. 163 | 164 | ########### Uisng n_distinct(vector) ########## 165 | # For each destination, count the total number of flights and the number of distinct planes that flew there. 
166 | flights %>% 167 | group_by(Dest) %>% 168 | summarise(flight_count = n(), plane_count = n_distinct(TailNum)) 169 | 170 | ########## Using Window function ############ 171 | 172 | # For each carrier, calculate which two days of the year they had their longest departure delays. 173 | # Note: smallest (not largest) value is ranked as 1, 174 | # so you have to use `desc` to rank by largest value 175 | flights %>% 176 | group_by(UniqueCarrier) %>% 177 | select(Month, DayofMonth, DepDelay) %>% 178 | filter(min_rank(desc(DepDelay)) <= 2) %>% 179 | arrange(UniqueCarrier, desc(DepDelay)) 180 | 181 | # rewrite more simply with the `top_n` function 182 | flights %>% 183 | group_by(UniqueCarrier) %>% 184 | select(Month, DayofMonth, DepDelay) %>% 185 | top_n(2) %>% 186 | arrange(UniqueCarrier, desc(DepDelay)) 187 | 188 | ######## Useful Convenience Functions ############# 189 | # randomly sample a fixed number of rows, without replacement 190 | flights %>% sample_n(5) 191 | 192 | # randomly sample a fraction of rows, with replacement 193 | flights %>% sample_frac(0.25, replace=TRUE) 194 | 195 | # base R approach to view the structure of an object 196 | str(flights) 197 | 198 | # dplyr approach: better formatting, and adapts to your screen width 199 | glimpse(flights) 200 | 201 | 202 | ############################ EDA Email Case Study ########################### 203 | 204 | # Loading email dataset 205 | library(tidyverse) 206 | library(openintro) #for loading our case study dataset 207 | data(email) 208 | email 209 | 210 | # Exploring number of characters in spam vs non-spam emails 211 | email <- email %>% 212 | mutate(spam = as.factor(spam)) %>% 213 | mutate(spam = recode(spam, "0" = "Not_spam", "1" = "Spam")) 214 | 215 | email 216 | 217 | # Compute summary statistics 218 | email %>% 219 | group_by(spam) %>% 220 | summarize(median(num_char), 221 | IQR(num_char)) 222 | 223 | # Visualize 224 | email %>% 225 | ggplot(aes(x = spam, y = num_char)) + 226 | geom_boxplot() + 227 | labs(x = "Spam status", y = "No. of characters") 228 | 229 | # It might be a good idea to change the scale of the y-axis. 230 | 231 | # Visualize 232 | email %>% 233 | mutate(log_num_char = log(num_char)) %>% 234 | ggplot(aes(x = spam, y = log_num_char)) + 235 | geom_boxplot()+ 236 | labs(x = "Spam status", y = "No. of characters(log-scale)") 237 | 238 | # exclaim_mess vs spam 239 | email %>% 240 | group_by(spam) %>% 241 | summarise(MED = median(exclaim_mess), 242 | IQR = IQR(exclaim_mess), SD = sd(exclaim_mess)) 243 | 244 | # Visualize 245 | email %>% 246 | mutate(log_exclaim_mess = log(exclaim_mess + 0.001)) %>% 247 | mutate(custom_fill = as.factor(exclaim_mess > 0)) %>% 248 | ggplot(aes(x = log_exclaim_mess, fill = custom_fill)) + 249 | geom_histogram() + theme(legend.position = "none") + facet_wrap(~ spam) 250 | 251 | # Visualize: Box-plots 252 | email %>% 253 | mutate(log_exclaim_mess = log(exclaim_mess + 0.001)) %>% 254 | ggplot(aes(x = spam, y = log_exclaim_mess)) + 255 | geom_boxplot() 256 | 257 | # Tackling inflated Zeroes 258 | email %>% 259 | mutate(zero_exclaim = (exclaim_mess == 0)) %>% 260 | ggplot(aes(x = zero_exclaim)) + 261 | geom_bar() + 262 | facet_wrap(~spam) 263 | 264 | email %>% 265 | mutate(zero_exclaim = (exclaim_mess == 0)) %>% 266 | ggplot(aes(x = zero_exclaim, fill = spam)) + 267 | geom_bar(position = "fill") 268 | 269 | # Tackling inflated zeroes: example with the image variable 270 | email %>% 271 | count(image) 272 | 273 | # There are 3811 images with no images attached. 
This poses a zero inflation problem. 274 | email %>% 275 | mutate(image_present = (image > 0)) %>% 276 | ggplot(aes(x = image_present, fill = spam)) + 277 | geom_bar(position = "fill") 278 | 279 | 280 | 281 | ################################# END OF SESSION ############################### 282 | 283 | 284 | 285 | 286 | 287 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_3_Distribution.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Practical Slides and Codes/Practical_3_Distribution.pdf -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_3_Distribution_Code.R: -------------------------------------------------------------------------------- 1 | ########################### Maximum Likelihood Estimation ############################# 2 | 3 | rm(list=ls()) 4 | library(MASS) 5 | library(stats4) 6 | 7 | 8 | ################# Coalmine Dataset ##################### 9 | 10 | coal.dis=c(1 ,4 ,4, 7 ,11, 13 ,15, 15 ,17 ,18, 19 ,19 ,20 ,20 ,22 ,23 ,28 ,29, 31, 32, 36 ,37 ,47 ,48, 49 ,50 ,54 ,54, 55, 59, 59 ,61, 61, 11 | 66, 72, 72 ,75, 78, 78, 81, 93 ,96 ,99 ,108 ,113 ,114 ,120 ,120 ,120 ,123 ,124 ,129, 131 ,137 ,145, 151, 156, 171, 12 | 176 ,182 ,188 ,189 ,195 ,203, 208, 215, 217 ,217 ,217 ,224, 228 ,233, 255 ,271, 275, 275 ,275 ,286 ,291, 312, 13 | 312 ,312, 315, 326, 326 ,329, 330, 336, 338, 345, 348, 354, 361, 364 ,369, 378 ,390, 457 ,467 ,498 ,517 ,566, 14 | 644, 745, 871, 1312, 1357, 1613, 1630) 15 | 16 | hist(coal.dis,breaks=15,prob=TRUE, main = "Histogram of Coalmine Data") 17 | summary(coal.dis) 18 | var(coal.dis) 19 | 20 | Exp=fitdistr(coal.dis,"exponential") 21 | pa=as.vector(Exp$estimate);pa 22 | l_exp = logLik(Exp)[1];l_exp 23 | 24 | EE=function(alp,lam) 25 | { 26 | x=coal.dis 27 | a1=sum((alp-1)*log(1-exp(-lam*x),base=exp(1))) 28 | -(length(x)*log(alp*lam)-lam*sum(x)+a1) 29 | } 30 | s=mle(EE,start=list(alp=5,lam=5)) 31 | pa3=as.vector(coef(s));pa3 32 | l_ee = logLik(s)[1];l_ee 33 | 34 | p1=ks.test(coal.dis,"pexp",0.00429) 35 | 36 | pEE=function(x,alp,lam) 37 | { 38 | (1-exp(-lam*x))^alp 39 | } 40 | 41 | p2=ks.test(coal.dis,"pEE",coef(s)[1],coef(s)[2]) 42 | 43 | 44 | # Exponential distribution 45 | hist(coal.dis,breaks=15,prob=TRUE, main = "Exponential Distribution") 46 | s=seq(0,max(coal.dis),length=10000) 47 | lamb_hat_e = pa[1] 48 | ed = lamb_hat_e*exp(-1*lamb_hat_e*s) 49 | lines(s,ed,col="red",lwd=2) 50 | legend("topright", c("Exponential"), col=c("red"), lwd=10) 51 | 52 | ########### Exponentiated exponential distribution ########### 53 | hist(coal.dis,breaks=15,prob=TRUE, main = "Exponentiated Exponential Distribution") 54 | s=seq(0,max(coal.dis),length=10000) 55 | alph_hat=pa3[1] 56 | lamb_hat=pa3[2] 57 | eed = lamb_hat*alph_hat*(1-exp(-1*lamb_hat*s))^(alph_hat-1)*exp(-1*lamb_hat*s) 58 | lines(s,eed,col="blue",lwd=2) 59 | legend("topright", c("Exponentiated exponential"), col=c("blue"), lwd=10) 60 | 61 | # Combined plot 62 | hist(coal.dis,breaks=15,prob=TRUE, main = "Combined plot") 63 | s=seq(0,max(coal.dis),length=10000) 64 | lines(s,ed,col="red",lwd=2) 65 | lines(s,eed,col="blue",lwd=2) 66 | legend("topright", c("Exponential", "Exponentiated exponential"), col=c("red", "blue"), lwd=10) 67 | 68 | #------Table 69 | Dist=c("Exponential","Exponentiated exponential") 70 | par1=c(pa[1],pa3[1]) 71 | par2=c(pa[2],pa3[2]) 72 
| Likelihood=c(l_exp,l_ee) 73 | p_value=c(p1$p.value,p2$p.value) 74 | cbind(Dist,par1,par2,Likelihood,p_value) 75 | 76 | ################# Guinea Pigs Dataset ##################### 77 | 78 | pigs = c(12,15,22,24,24,32,32,33,34,38,38,43,44,48,52,53,54,54,55,56,57,58,58,59,60,60,60,60,61,62,63,65,65,67,68,70,70,72,73,75,76,76,81,83,84,85,87,91,95,96,98,99,109,110,121,127,129,131,143,146,146,175,175,211,233,258,258,263,297,341,341,376) 79 | 80 | summary(pigs) 81 | sd(pigs) 82 | ############# Exponential Distribution ############# 83 | Exp=fitdistr(pigs,"exponential") 84 | pa=as.vector(Exp$estimate);pa 85 | l_exp = logLik(Exp)[1];l_exp 86 | 87 | ############# Exponentiated exponential Distribution ############# 88 | EE=function(alp,lam) 89 | { 90 | x=pigs 91 | a1=sum((alp-1)*log(1-exp(-lam*x),base=exp(1))) 92 | -(length(x)*log(alp*lam)-lam*sum(x)+a1) 93 | 94 | } 95 | s=mle(EE,start=list(alp=5,lam=5)) 96 | pa1=as.vector(coef(s));pa1 97 | l_ee = logLik(s)[1];l_ee 98 | 99 | ############ Kolmogorov-Smirnov Tests ################ 100 | 101 | p1=ks.test(pigs,"pexp",0.01001809) 102 | 103 | pEE=function(x,alp,lam) 104 | { 105 | (1-exp(-lam*x))^alp 106 | } 107 | 108 | p2=ks.test(pigs,"pEE",pa1[1],pa1[2]) 109 | 110 | ############### Histogram ################## 111 | 112 | # Exponential distribution 113 | hist(pigs,breaks=15,prob=TRUE, main = "Exponential Distribution") 114 | s=seq(0,max(pigs),length=10000) 115 | lamb_hat_e = pa[1] 116 | ed = lamb_hat_e*exp(-1*lamb_hat_e*s) 117 | lines(s,ed,col="red",lwd=2) 118 | legend("topright", c("Exponential"), col=c("red"), lwd=10) 119 | 120 | #Exponentiated exponential distribution 121 | hist(pigs,breaks=15,prob=TRUE, main = "Exponentiated Exponential Distribution") 122 | s=seq(0,max(pigs),length=10000) 123 | alph_hat=pa1[1] 124 | lamb_hat=pa1[2] 125 | eed = lamb_hat*alph_hat*(1-exp(-1*lamb_hat*s))^(alph_hat-1)*exp(-1*lamb_hat*s) 126 | lines(s,eed,col="blue",lwd=2) 127 | legend("topright", c("Exponentiated exponential"), col=c("blue"), lwd=10) 128 | 129 | 130 | # Combined plot 131 | hist(pigs,breaks=15,prob=TRUE, main = "Combined plot") 132 | s=seq(0,max(pigs),length=10000) 133 | lines(s,ed,col="red",lwd=2) 134 | lines(s,eed,col="blue",lwd=2) 135 | legend("topright", c("Exponential", "Exponentiated exponential"), col=c("red", "blue"), lwd=10) 136 | 137 | #------Table 138 | Dist=c("Exponential","Exponentiated exponential") 139 | par1=c(pa[1],pa1[1]) 140 | par2=c(pa[2],pa1[2]) 141 | Likelihood=c(l_exp,l_ee) 142 | p_value=c(p1$p.value,p2$p.value) 143 | cbind(Dist,par1,par2,Likelihood,p_value) 144 | 145 | ################################ END OF SESSION ############################### 146 | 147 | 148 | 149 | 150 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_4_ToH_ANOVA.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Practical Slides and Codes/Practical_4_ToH_ANOVA.pdf -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_4_ToH_ANOVA_Code.R: -------------------------------------------------------------------------------- 1 | ####################### One sample t-test ################################# 2 | 3 | set.seed(8885) 4 | Xsamp <- rnorm(n = 25, mean = 2, sd = 2) 5 | 6 | mytest <- t.test(Xsamp, alternative = "greater", mu = 3) 7 | mytest 8 | 9 | mytest$p.value 10 | 11 | # Another alternative 
hypothesis 12 | t.test(Xsamp, alternative = "greater", mu = 2.1) 13 | 14 | # Another alternative hypothesis 15 | t.test(Xsamp, alternative = "greater", mu = 1.9) 16 | 17 | # Another alternative hypothesis 18 | t.test(Xsamp, alternative = "greater", mu = 1) 19 | # p-value as a function of μ0 and n 20 | 21 | func.p.val <- function(n, mu0){ 22 | set.seed(100) 23 | Xsamp <- rnorm(n, mean = 2, sd = 2) #generating random sample of size n from N(2,4) 24 | p.val <- t.test(Xsamp, alternative = "greater", mu = mu0)$p.value #calculate p-value 25 | return(p.val) 26 | } 27 | n <- seq(5,25, by = 5) 28 | mu0 <- seq(-5,5, by = 0.1) 29 | 30 | pvals <- matrix(0, nrow = length(n), ncol = length(mu0)) 31 | 32 | for(i in 1:length(n)){ 33 | for(j in 1:length(mu0)){ 34 | pvals[i,j] <- func.p.val(n[i],mu0[j]) 35 | } 36 | } 37 | 38 | # Plotting 39 | cl <- rainbow(length(n)) 40 | plot(mu0, pvals[1, ], type = "l", main = "Plots of p-values", xlab = "Null value", 41 | ylab = "p-value", ylim = c(min(pvals),1), col = cl[1]) 42 | 43 | for(i in 2:length(n)){ 44 | lines(mu0, pvals[i, ], type = "l", col = cl[i]) 45 | } 46 | abline(h = 0.05) 47 | abline(h = 0.01) 48 | legend("topleft", legend = n, lty = 1, col = cl) 49 | 50 | # Let us zoom it in near the true value of 2. 51 | 52 | cl <- rainbow(length(n)) 53 | 54 | plot(mu0, pvals[1, ], type = "l", main = "Plots of p-values", xlab = "Null value", 55 | ylab = "p-value", ylim = c(0.009,0.052), col = cl[1]) 56 | 57 | for(i in 2:length(n)){ 58 | lines(mu0, pvals[i, ], type = "l", col = cl[i]) 59 | } 60 | abline(h = 0.05) 61 | abline(h = 0.01) 62 | abline(v = 2) 63 | legend("topleft", legend = n, lty = 1, col = cl) 64 | 65 | ####################### Two Sample t test ############################ 66 | 67 | Xsamp <- rnorm(25, mean = 0, sd = 2) 68 | Ysamp <- rnorm(20, mean = 1, sd = 1) 69 | 70 | boxplot(Xsamp, Ysamp) 71 | 72 | #Take null value of difference to be zero. 73 | 74 | mytest2 <- t.test(Xsamp, Ysamp, alternative = "g", mu = 0) 75 | mytest2 76 | 77 | # In case you had to test whether μ2 is greater than μ1 by 0.5, 78 | # we can do it in two different ways as follows. 79 | 80 | t.test(Xsamp, Ysamp, alternative = "l", mu = -0.5) 81 | 82 | t.test(Ysamp, Xsamp, alternative = "g", mu = 0.5) 83 | 84 | # Pooled t-test 85 | 86 | Xsamp <- rnorm(20, mean = 2, sd = 1) 87 | Ysamp <- rnorm(30, mean = 1, sd = 1) 88 | boxplot(Xsamp, Ysamp) 89 | 90 | mytest.pooled <- t.test(Xsamp, Ysamp, alternative = "t", mu = 0, var.equal = TRUE) 91 | mytest.pooled 92 | 93 | ################### Paired t-test ######################### 94 | 95 | Xsamp <- rnorm(25, mean = 2, sd = 2) 96 | Ysamp <- 0.5*Xsamp + rnorm(25, mean = 1, sd = 1) 97 | # Notice that the Y samples are linearly related to X samples, 98 | # and hence a correlation between them are established. 99 | # You can plot the data to visualize this. 100 | 101 | plot(Xsamp, Ysamp) 102 | 103 | boxplot(Xsamp, Ysamp) 104 | 105 | mytest.paired <- t.test(Xsamp, Ysamp, alternative = "t", mu = 0, paired = TRUE) 106 | mytest.paired 107 | # Note the usage of the argument paired = TRUE in the command above. 108 | 109 | ####################### Testing of variances ############################### 110 | 111 | # We use the command var.test for this. 
112 | 113 | Xsamp <- rnorm(15, mean = 2, sd = 2) 114 | Ysamp <- rnorm(20, mean = 5, sd = 4) 115 | 116 | vartest <- var.test(Xsamp, Ysamp, alternative = "g") 117 | vartest 118 | 119 | 120 | ######## Testing of Hypothesis PO Processing Time dataset ######## 121 | 122 | # Installing all the required packages for the R Notebook 123 | 124 | install.packages("car") 125 | install.packages("ggplot2") 126 | install.packages("gplots") 127 | install.packages("qqplotr") 128 | library(car) 129 | library(gplots) 130 | library(qqplotr) 131 | library(ggplot2) 132 | 133 | ######## One Sample t Test ####### 134 | 135 | # Problem: Testing whether the average Processing Time of PO_Processing 136 | # data set is less than equal to 40. 137 | 138 | # Step 1: Reading the data as mydata 139 | 140 | # SESSION -> SET WORKING DIRECTORY -> TO SOURCE FILE LOCATION 141 | 142 | mydata = read.csv('PO_Processing.csv',header = T,sep = ",") 143 | 144 | # IMPORT DATASET -> FROM TEXT -> CHOOSE DATA -> HEADING = YES, RENAME THE DATA 145 | 146 | PT = mydata$Processing_Time 147 | 148 | # Step 2: Using the t Test function to test our hypothesis 149 | 150 | # ?t.test 151 | 152 | t.test(PT, alternative = 'greater', mu = 40) 153 | 154 | # p-value < 0.05 => Reject H0. 155 | 156 | ######## Normality Test ######## 157 | 158 | # Problem: Checking whether the Processing Time Data is Normally Distributed 159 | 160 | qqnorm(PT) 161 | qqline(PT) 162 | 163 | # Normality Check using Shapiro-Wilk test 164 | 165 | shapiro.test(PT) 166 | 167 | ########################### ANOVA ###################################### 168 | 169 | data("InsectSprays") 170 | dat <- InsectSprays 171 | head(InsectSprays) 172 | 173 | str(InsectSprays) 174 | 175 | levels(dat$spray) 176 | 177 | library(ggplot2) 178 | ggplot(dat, aes(x=spray,y=count))+ 179 | geom_point() 180 | 181 | anova.fit <- aov(count ~ spray, data = dat) 182 | summary(anova.fit) 183 | 184 | summary.lm(anova.fit) 185 | 186 | TukeyHSD(anova.fit) 187 | 188 | plot(TukeyHSD(anova.fit)) 189 | 190 | ######## One Way ANOVA Sales Revenue Example ######## 191 | 192 | # Reading data and variables in R 193 | 194 | mydata = read.csv('Sales_Revenue_Anova.csv',header = T,sep = ",") 195 | location = mydata$Location 196 | revenue = mydata$Sales.Revenue 197 | 198 | # Converting location to factor 199 | 200 | location = factor(location) 201 | 202 | # H0 : location-wise sales figures are equal. 203 | 204 | # Computing ANOVA table 205 | 206 | fit = aov(revenue ~ location) 207 | summary(fit) 208 | # H0 rejected. Sales figures are not equal. 
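######## Checking the ANOVA assumptions (illustrative sketch) ########

# Before interpreting the group differences it is worth inspecting the residuals of
# the fitted model. This sketch reuses only the 'fit' object created above; nothing
# else is assumed.

anova_res <- residuals(fit)
qqnorm(anova_res)
qqline(anova_res)                 # residuals should lie close to the line if normal
shapiro.test(anova_res)           # formal normality check of the residuals
plot(fitted(fit), anova_res,
     xlab = "Fitted values", ylab = "Residuals")   # look for a roughly constant spread
abline(h = 0)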
209 | aggregate(revenue ~ location, FUN = mean) 210 | boxplot(revenue ~ location) 211 | plotmeans(revenue ~ location) 212 | 213 | # Tukey's Honestly Significant Difference (HSD) Test 214 | 215 | TukeyHSD(fit) 216 | 217 | # Bartlett's test 218 | 219 | bartlett.test(revenue, location, data = mydata) 220 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_5_Linear_Regression.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Practical Slides and Codes/Practical_5_Linear_Regression.pdf -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_5_Linear_Regression_Code.R: -------------------------------------------------------------------------------- 1 | ################# Simple Linear Regression Analysis in RStudio ################# 2 | 3 | ################ Fitting Linear Regression using lm function ################### 4 | 5 | # Installing all the required packages for the R Notebook 6 | 7 | install.packages("car") 8 | install.packages("pls") 9 | install.packages("ggplot2") 10 | install.packages("gplots") 11 | install.packages("glmnet") 12 | install.packages("DAAG") 13 | install.packages("boot") 14 | install.packages("MASS") 15 | library(DAAG) 16 | library(glmnet) 17 | library(car) 18 | library(MASS) 19 | library(gplots) 20 | library(ggplot2) 21 | library(pls) 22 | library(boot) 23 | 24 | #### Simple Linear Regression #### 25 | #(dependent variable is regressed upon one predictor variable) 26 | 27 | # Reading the data and variables 28 | 29 | mydata = read.csv('DC_Simple_Reg.csv',header = T,sep = ",") 30 | Temp = mydata$Dryer.Temperature 31 | DContent = mydata$Dry.Content 32 | 33 | # Constructing Scatter Plot 34 | plot(Temp, DContent) 35 | 36 | # Computing Correlation Matrix 37 | cor(Temp, DContent) 38 | # Correlation between y & x need to be high (preferably 0.8 to 1 to -0.8 to -1.0) 39 | 40 | # Fitting Regression Model 41 | model = lm(DContent ~ Temp) 42 | summary(model) 43 | 44 | # DContent = 2.183813 + 1.293432*Temp 45 | 46 | # Regression Performance 47 | anova(model) 48 | 49 | # Residual Analysis 50 | pred = fitted(model) 51 | Res = residuals(model) 52 | 53 | # write.csv(pred,"Pred.csv") # Stores the predicted values in 'Pred.csv' file at your working directory 54 | # write.csv(Res,"Res.csv") # Stores the residual values in 'Res.csv' file at your working directory 55 | 56 | # Visualizing Actual vs Predicted Values 57 | plot(DContent, pred) 58 | 59 | # Checking whether the distribution of the Residuals is bell shaped (Normality Test) 60 | qqnorm(Res) 61 | qqline(Res) 62 | 63 | # Normality Check using Shapiro-Wilk test 64 | shapiro.test(Res) 65 | 66 | # Visualizing the relationship between Residuals and Predicted values 67 | plot(pred,Res) 68 | 69 | # Visualizing the relationship between Residuals and Predictor 70 | plot(Temp,Res) 71 | 72 | ################################################################################ 73 | 74 | ################# Fitting Simple Linear Regression Manually ################# 75 | 76 | # Reading the data and variables 77 | df = data.frame(DContent = c(73.3,74.6,74,78.5,74.6,74,75.2,77.2,75.9,74.6,73.3,75.9,76,74.6,74.7,74.5,70.7,72,72.1,72.2,70.7, 78 | 74.6,75.2,74,75.9,75.3,74,78.5,76.5,74.5,76,76.5,76.7,76,75.8,73.8,73.3,74.6,73.4,76,74,75.2), 79 | Temp = 
c(55,56,55.5,59,56,55.5,56.5,58,57,56,55,57,57,56,56,56,53,54,54,54,53,56,56.5,55.5,57,56.5,55.5,59, 80 | 57.5,56,57,57.5,57.5,57,57,55.5,55,56,55,57,55.5,56.5)) 81 | 82 | # Constructing Scatter Plot 83 | plot(Temp, DContent) 84 | 85 | # Computing Correlation Matrix 86 | cor(Temp, DContent) 87 | # Correlation between y & x need to be high (preferably 0.8 to 1 to -0.8 to -1.0) 88 | 89 | # Fitting Regression Model 90 | 91 | # y_hat = beta_0_hat + beta_1_hat*x 92 | # beta_1_hat = S_xy / S_xx 93 | # beta_0_hat = y_bar - beta_1_hat*x_bar 94 | 95 | be1_hat = cov(Temp,DContent) / var(Temp) 96 | be0_hat = mean(DContent) - (be1_hat * mean(Temp)) 97 | 98 | pred = be0_hat + (be1_hat *Temp) 99 | 100 | A = data.frame (DContent, Temp, pred) 101 | names(A)=c("Observed DContent","Observed Temp ","Fitted DContent") 102 | print(A) 103 | 104 | cat("\n The fitted linear regression model is : DContent=",be0_hat,"+",be1_hat,"Temp\n") 105 | # DContent = 2.183813 + 1.293432*Temp 106 | 107 | # Regression Performance 108 | 109 | Res = DContent - pred # Calculating the residuals 110 | 111 | # Performing ANOVA 112 | 113 | # SS_Res = Sum (Res^2) | MS_Res = SS__Res/df_Res 114 | SS_Res =sum(Res^2) 115 | df_Res = length(DContent)-2 # Number of observations - No of predictors - 1 116 | MS_Res = SS_Res / df_Res 117 | 118 | # SS_Total = Sum(y - y_bar)^2 | MS_Total = SS_Total/df_total 119 | SS_Total = sum((DContent - mean(DContent))^2) 120 | df_Total = length(DContent) - 1 # Number of observations - 1 121 | MS_Total = SS_Total / df_Total 122 | 123 | # SS_Reg = SS_Total - SS_Res | MS_Reg = SS_Reg/df_Reg 124 | SS_Reg = SS_Total - SS_Res 125 | df_Reg = 1 # No of predictors 126 | MS_Reg = SS_Reg / df_Reg 127 | 128 | # Calculating F statistic values 129 | F_obs = MS_Reg / MS_Res 130 | 131 | # Tabulated F vlaue 132 | F_pvalue = 1 - pf(F_obs, df_Reg, df_Res) 133 | 134 | # ANOVA Table 135 | 136 | cat("\n \t\t\t Analysis of Variance Table\t\t\t\t\n")+ 137 | cat("\nSource","\t\t","Df","\t","Sum Sq","\t","Mean Sq","\t","F value","\t","Pr(>F)","\n")+ 138 | cat("\n Temp","\t\t",df_Reg,"\t",SS_Reg,"\t",MS_Reg,"\t",F_obs,"\t",F_pvalue,"\n")+ 139 | cat("\n Residual","\t",df_Res,"\t",SS_Res,"\t",MS_Res,"\n")+ 140 | cat("\n Total","\t\t",df_Total,"\t",SS_Total,"\t",MS_Total,"\n") 141 | 142 | # Visualizing Actual vs Predicted Values 143 | plot(DContent, pred) 144 | 145 | # Checking whether the distribution of the Residuals is bell shaped (Normality Test) 146 | qqnorm(Res) 147 | qqline(Res) 148 | 149 | # Normality Check using Shapiro-Wilk test 150 | shapiro.test(Res) 151 | 152 | # Visualizing the relationship between Residuals and Predicted values 153 | plot(pred,Res) 154 | 155 | # Visualizing the relationship between Residuals and Predictor 156 | plot(Temp,Res) 157 | 158 | ################################################################################ 159 | 160 | library(modelr) 161 | dat <- read.csv("Advertising.csv", header = TRUE) 162 | 163 | # visualize the data 164 | par(mfrow=c(1,3)) 165 | plot(dat$TV, dat$sales, xlab = "TV", ylab = "Sales", col = "red") 166 | plot(dat$radio, dat$sales, xlab="Radio", ylab="Sales", col="blue") 167 | plot(dat$newspaper, dat$sales, xlab="Newspaper", 168 | ylab="Sales", col="forestgreen") 169 | 170 | library(tidyverse) 171 | 172 | # Split the data into training and testing set 173 | dat.split <- resample_partition(dat, c(test = 0.3, train = 0.7)) 174 | 175 | train <- as_tibble(dat.split$train) 176 | test <- as_tibble(dat.split$test) 177 | 178 | # Time for SLR model fitting 179 | mod1 <- lm(sales ~ TV, data 
= train) 180 | 181 | # Summary 182 | summary(mod1) 183 | 184 | # Tidy summary (available in 'broom' package) 185 | library(broom) 186 | tidy(mod1) 187 | 188 | confint(mod1) 189 | 190 | # Checking model accuracy 191 | 192 | MSE <- mse(mod1, train) 193 | MSE 194 | ## [1] 10.16292 195 | 196 | rsquare(mod1, train) 197 | ## [1] 0.6067217 198 | 199 | # In case of simple linear regression, R2 is exactly equal to the square of the 200 | # correlation between the response variable y and predictor variable X. 201 | 202 | r = cor(train$sales, train$TV) 203 | r^2 204 | ## [1] 0.6067217 205 | 206 | # Visual assessment of model fit 207 | 208 | ggplot(train, aes(x = TV, y = sales)) + 209 | geom_point() + 210 | geom_smooth(method = "lm") 211 | 212 | # Predictions 213 | 214 | test %>% add_predictions(mod1) 215 | 216 | pred.MSE <- mse(mod1, test) 217 | c(Prediction_MSE = pred.MSE, Train_MSE = MSE) 218 | ## Prediction_MSE Train_MSE 219 | ## 11.59707 10.16292 220 | 221 | # Confidence intervals and prediction intervals of fit 222 | test.predict <- predict(mod1, test, interval = "prediction") 223 | 224 | new_df <- cbind(test, test.predict) 225 | 226 | ggplot(new_df, aes(x = TV, y = sales))+ 227 | geom_point() + 228 | geom_line(aes(y=lwr), color = "red", linetype = "dashed")+ 229 | geom_line(aes(y=upr), color = "red", linetype = "dashed")+ 230 | geom_smooth(method=lm, se=TRUE) 231 | 232 | # Residual Analysis 233 | # Equal variance assumptions 234 | # add model diagnostics to our training data 235 | mod1_results <- augment(mod1, train) 236 | 237 | p1 <- ggplot(mod1_results, aes(.fitted, .resid)) + 238 | geom_ref_line(h = 0) + 239 | geom_point() + 240 | geom_smooth(se = FALSE) + 241 | ggtitle("Residuals vs Fitted") 242 | 243 | p2 <- ggplot(mod1_results, aes(.fitted, .std.resid)) + 244 | geom_ref_line(h = 0) + 245 | geom_point() + 246 | geom_smooth(se = FALSE) + 247 | ggtitle("Standardized Residuals vs Fitted") 248 | 249 | gridExtra::grid.arrange(p1, p2, nrow = 1) 250 | 251 | par(mfrow = c(1,1)) 252 | 253 | # Normality check of errors 254 | qqnorm(mod1_results$.resid) 255 | qqline(mod1_results$.resid) 256 | 257 | # Leverage, Influence, and Outliers 258 | 259 | par(mfrow=c(1, 2)) 260 | 261 | plot(mod1, which = 4, id.n = 5) 262 | plot(mod1, which = 5, id.n = 5) 263 | 264 | ######################### End of Session ######################################### -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_6_MLR.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Practical Slides and Codes/Practical_6_MLR.pdf -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_6_MLR_Code.R: -------------------------------------------------------------------------------- 1 | ################ Multiple Linear Regression Analysis in RStudio ################ 2 | 3 | # Installing all the required packages for the R Notebook 4 | 5 | # install.packages("car") 6 | # install.packages("pls") 7 | # install.packages("ggplot2") 8 | # install.packages("gplots") 9 | # install.packages("glmnet") 10 | # install.packages("DAAG") 11 | # install.packages("boot") 12 | # install.packages("MASS") 13 | library(broom) 14 | library(DAAG) 15 | library(glmnet) 16 | library(car) 17 | library(MASS) 18 | library(gplots) 19 | library(ggplot2) 20 | library(pls) 21 | library(boot) 22 | library(dplyr) 23 | 
library(tidyverse) 24 | 25 | #### Multiple Linear Regression #### 26 | 27 | # Revisiting the Advertising dataset 28 | library(modelr) 29 | dat <- read.csv("Advertising.csv") 30 | 31 | # visualize the data 32 | par(mfrow=c(1,3)) 33 | plot(dat$TV, dat$sales, xlab = "TV", ylab = "Sales", col = "red") 34 | plot(dat$radio, dat$sales, xlab="Radio", ylab="Sales", col="blue") 35 | plot(dat$radio, dat$newspaper, xlab="Newspaper", 36 | ylab="Sales", col="forestgreen") 37 | 38 | par(mfrow = c(1,1)) 39 | dat <- as_tibble(dat) 40 | #dat <- as_tibble(dat[,-1]) #%>% select(-X) # X represents serial number hence can be removed 41 | 42 | # Split the data into training and testing set 43 | set.seed(8885) 44 | dat.split <- resample_partition(dat, c(test = 0.3, train = 0.7)) 45 | 46 | train <- as_tibble(dat.split$train) 47 | test <- as_tibble(dat.split$test) 48 | 49 | # Previously we did SLR model fitting as follows 50 | mod1 <- lm(sales ~ TV, data = train) 51 | 52 | # Time for MLR model fitting 53 | mod2 <- lm(sales ~ TV + radio + newspaper, data = train) 54 | 55 | summary(mod2) 56 | 57 | tidy(mod2) 58 | 59 | # Getting 95% confidence intervals 60 | confint(mod2) 61 | 62 | glance(mod2, train) 63 | 64 | # Comparing the two fitted models 65 | list(`model-1` = glance(mod1,train), `model-2` = glance(mod2,train)) 66 | 67 | # Visual assessment of model fit 68 | # add model diagnostics to our training data 69 | mod1_results <- augment(mod1, train) %>% 70 | mutate(Model = "Model-1") 71 | 72 | mod2_results <- augment(mod2, train) %>% 73 | mutate(Model = "Model-2") %>% 74 | rbind(mod1_results) 75 | 76 | ggplot(mod2_results, aes(.fitted, .resid)) + 77 | geom_ref_line(h = 0) + 78 | geom_point() + 79 | geom_smooth(se = FALSE) + 80 | facet_wrap(~ Model) + 81 | ggtitle("Residuals vs Fitted") 82 | 83 | # predictions 84 | test %>% add_predictions(mod2) 85 | 86 | MSE <- mse(mod2, train) 87 | pred.MSE <- mse(mod2, test) 88 | c(Prediction_MSE = pred.MSE, Train_MSE = MSE) 89 | 90 | # Using base R approach (gives confidence and prediction interval of fit) 91 | test.predict <- predict(mod2, test, interval = "prediction") 92 | as_tibble(test.predict) 93 | 94 | # interaction effect 95 | 96 | mod3 <- lm(sales ~ TV + radio + TV:radio + newspaper, data = train) 97 | 98 | #Aliter: 99 | mod3 <- lm(sales ~ TV*radio + newspaper, data = train) 100 | 101 | summary(mod3) 102 | 103 | tidy(mod3) 104 | ######################## Multicollinearity ################### 105 | 106 | X1 <- runif(100) 107 | X2 <- rnorm(100,10,10) 108 | X3 <- rnorm(100,-20,20) 109 | X4 <- X2 - X3 # exact linear relationship 110 | 111 | Y <- 1 + X1 - 0.5*X2 + 0.5*X3 + rnorm(100,0,4) 112 | 113 | dat <- as_tibble(data.frame(Y = Y, X1 = X1, X2 = X2, X3 = X3, X4 = X4)) 114 | dat 115 | 116 | cor(dat) 117 | 118 | simu.model <- lm(Y ~ X1 + X2 + X3 + X4) 119 | summary(simu.model) 120 | 121 | #Let us re-run this example with near-linear relationship among some of the predictors. 
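# (Added note, not in the original script) With the exact relationship X4 = X2 - X3 used above,
# the design matrix is rank deficient, so lm() reports an NA coefficient for X4: that column is
# aliased with X2 and X3 and adds no information.
alias(simu.model)   # lists the exact linear dependency among the fitted terms
# In the near-linear re-run below all coefficients are estimable, but their variances are inflated,
# which is what the VIF values will show.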
122 | library(tidyverse) 123 | #### Near-linear dependency 124 | set.seed(100) 125 | X1 <- runif(100) 126 | X2 <- rnorm(100,10,10) 127 | X3 <- rnorm(100,-20,20) 128 | X4 <- X2 - X3 + runif(100) # near-linear relationship 129 | 130 | Y <- 1 + X1 - 0.5*X2 + 0.5*X3 + rnorm(100,0,4) 131 | 132 | dat <- as_tibble(data.frame(Y = Y, X1 = X1, X2 = X2, X3 = X3, X4 = X4)) 133 | dat 134 | 135 | cor(dat) 136 | 137 | simu.model <- lm(Y ~ X1 + X2 + X3 + X4) 138 | summary(simu.model) 139 | 140 | library(car) 141 | vif(simu.model) 142 | 143 | 144 | #dependent variable is regressed upon two or more predictor variables 145 | 146 | # Reading the data and variables 147 | data = read.csv('Mult_Reg_Yield.csv',header = T,sep = ",") 148 | mydata= data[,-1] # Removing SL.NO. Column 149 | attach(mydata) 150 | # Computing Correlation Matrix 151 | cor(mydata) 152 | 153 | # Fitting Multiple Linear Regression 154 | model = lm(X.Yield ~ Temperature + Time) 155 | summary(model) 156 | 157 | # X.Yield = -67.88436 - 0.06419*Temperature + 0.90609*Time 158 | 159 | # Temperature is NOT a causal variable. 160 | 161 | # Regression Model Performance 162 | anova(model) 163 | 164 | # From the ANOVA Table we can say only time is related to % yield as p value < 0.05, so we modify our model 165 | # Fitting Linear Regression Model with Time as the only Predictor variable 166 | 167 | model_m = lm(X.Yield ~ Time) 168 | summary(model_m) 169 | 170 | # X.Yield = -81.6205 + 0.9065*Time 171 | 172 | # Regression Model Performance 173 | anova(model_m) 174 | 175 | # Residual Analysis 176 | pred = fitted(model_m) 177 | Res = residuals(model_m) 178 | plot(Res) 179 | qqnorm(Res) 180 | 181 | #Standardizing the Residuals using scale function 182 | #"center" parameter (when set to TRUE) is responsible for subtracting the mean on the numeric object from each observation. 183 | #The "scale" parameter (when set to TRUE) is responsible for dividing the resulting difference by the standard deviation of the numeric object. 184 | Std_Res = scale(Res, center = TRUE, scale = TRUE) 185 | 186 | # Normality Check using Shapiro - Wilk test 187 | shapiro.test(Res) 188 | 189 | # Bonferonni Outlier Test 190 | outlierTest(model_m) # Bonferroni p-value < 0.05 indicates potential outlier 191 | 192 | # Leave One Out Cross Validation 193 | mymodel = glm(X.Yield ~ Time) 194 | valid = cv.glm(mydata, mymodel) 195 | valid$delta[1] 196 | 197 | # Multiple Linear Regression (dependent variable is regressed upon two or more predictor variables) 198 | 199 | # Reading the data and variables 200 | data = read.csv('Mult_Reg_Conversion.csv',header = T,sep = ",") 201 | mydata = data[,-1] # Removing SL.NO. Column 202 | attach(mydata) 203 | 204 | # Computing Correlation Matrix 205 | cor(mydata) 206 | # High Correlation between Percent_Conversion and Temperature & Time 207 | # High Correlation between Temperature & Time - Multicollinearity 208 | 209 | # Fitting Multiple Linear Regression 210 | model = lm(Percent_Conversion ~ Kappa.number + Temperature + Time) 211 | summary(model) 212 | 213 | # Regression ANOVA 214 | anova(model) 215 | 216 | # Checking Multi-collinearity using Variance Inflation Factor 217 | #library(car) 218 | vif(model) 219 | 220 | # VIF > 5 indicates multi-collinearity. 
Hence, multi-collinearity exists between Time and Temperature 221 | 222 | # Tackling Multi-collinearity 223 | 224 | # Method 1: Removing highly correlated variable - Stepwise Regression 225 | 226 | #library(MASS) 227 | mymodel = lm(Percent_Conversion ~ Temperature + Time + Kappa.number) 228 | step = stepAIC(mymodel, direction = "both") 229 | summary(step) 230 | 231 | # Check for multicollinearity in the new model 232 | vif(step) 233 | # vif values <5 indicates no multicollinearity 234 | 235 | # Predicting the values 236 | pred = predict(step) 237 | res = residuals(step) 238 | cbind(Percent_Conversion, pred, res) 239 | mse = mean(res^2) 240 | rmse = sqrt(mse) 241 | 242 | # K Fold Validation 243 | 244 | #install.packages("DAAG") 245 | #library(DAAG) 246 | 247 | cv.lm(data = mydata, form.lm = formula(Percent_Conversion ~ Kappa.number + Temperature + Time), m=10, dots = 248 | FALSE, seed=29, plotit=TRUE, printit=TRUE) 249 | ################################################################################### 250 | # Method 2: Principal Component Regression 251 | 252 | #install.packages("pls") 253 | #library(pls) 254 | mymodel = pcr(Percent_Conversion ~ ., data = mydata, scale = TRUE) 255 | summary(mymodel) 256 | mymodel$loadings 257 | mymodel$scores 258 | 259 | pred = predict(mymodel, type = "response", ncomp = 1) 260 | res = Percent_Conversion - pred 261 | mse = mean(res^2) 262 | prednew = predict(mymodel, type = "response", ncomp = 2) 263 | resnew = Percent_Conversion - prednew 264 | msenew = mean(resnew^2) 265 | 266 | # Since there is not much reduction in MSE by including the second principal component , only PC1 is required for modelling 267 | 268 | # Method 3: Partial Least Square Regression 269 | 270 | mymodel = plsr(Percent_Conversion ~ ., data = mydata, scale = TRUE) 271 | summary(mymodel) 272 | ps = mymodel$scores 273 | score = ps[,1:2] 274 | 275 | #Identifying the required number of components in the model 276 | pred = predict(mymodel, data = mydata, scale = TRUE, ncomp = 1) 277 | res = Percent_Conversion - pred 278 | mse = mean(res^2) 279 | 280 | prednew = predict(mymodel, data = mydata, scale = TRUE , ncomp = 2) 281 | resnew = Percent_Conversion - prednew 282 | msenew = mean(resnew^2) 283 | # Not much reduction in MSE by including the second component , only PLS1 is required for modelling 284 | 285 | # Method 4: Ridge regression 286 | 287 | #library(glmnet) 288 | set.seed(1) 289 | y = mydata[,4] 290 | x = mydata[,1:3] 291 | x = as.matrix(x) 292 | mymodel = cv.glmnet(x , y, alpha =0) 293 | plot(mymodel) 294 | 295 | # Choose the lambda which minimizes the mean square error 296 | bestlambda = mymodel$lambda.min 297 | bestlambda 298 | 299 | # Develop the model with best lambda and identify the coefficients 300 | mynewmodel = glmnet(x, y, alpha = 0) 301 | predict(mynewmodel, type = "coefficients", s = bestlambda)[1:4,] 302 | 303 | ########## Ridge and lasso implementation in R ########## 304 | # We shall be working with the Ames Housing data, available in the AmesHousing package. 
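# (Added note) Before the worked example: in glmnet()/cv.glmnet() the 'alpha' argument mixes the
# two penalties -- alpha = 0 gives ridge, alpha = 1 gives lasso, and values in between give an
# elastic net. glmnet() expects a numeric predictor matrix (hence the model.matrix() calls below)
# and standardizes the columns internally by default (standardize = TRUE).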
305 | 306 | library(tidyverse) 307 | library(modelr) 308 | library(glmnet) 309 | library(AmesHousing) 310 | library(leaps) 311 | library(broom) 312 | 313 | # Create training and test data for Ames Housing data 314 | 315 | dat <- make_ames() 316 | set.seed(8885) 317 | 318 | ames_split <- resample_partition(dat, c(test = 0.3, train = 0.7)) 319 | 320 | ames_train <- as_tibble(ames_split$train) 321 | ames_test <- as_tibble(ames_split$test) 322 | 323 | # Create training and testing feature model matrices and response vectors. 324 | # use model.matrix(...)[, -1] to discard the intercept 325 | 326 | ames_train_x <- model.matrix(Sale_Price ~ ., ames_train)[, -1] 327 | ames_train_y <- log(ames_train$Sale_Price) 328 | 329 | ames_test_x <- model.matrix(Sale_Price ~ ., ames_test)[, -1] 330 | ames_test_y <- log(ames_test$Sale_Price) 331 | 332 | # Check dimensions 333 | dim(ames_train_x) 334 | ## [1] 2052 308 335 | 336 | dim(ames_test_x) 337 | ## [1] 878 308 338 | 339 | # Apply Ridge regression to ames data 340 | ames_ridge <- glmnet(x = ames_train_x, y = ames_train_y, alpha = 0) 341 | 342 | plot(ames_ridge, xvar = "lambda") 343 | 344 | # lambdas applied to penalty parameter 345 | ames_ridge$lambda %>% head() 346 | 347 | # coefficients for the largest and smallest lambda parameters 348 | coef(ames_ridge)[c("Gr_Liv_Area", "TotRms_AbvGrd"), 100] 349 | 350 | coef(ames_ridge)[c("Gr_Liv_Area", "TotRms_AbvGrd"), 1] 351 | 352 | # Tuning the penalty parameter λ. 353 | 354 | # Apply CV Ridge regression to ames data 355 | ames_ridge <- cv.glmnet(x = ames_train_x, y = ames_train_y, alpha = 0) 356 | 357 | # plot results 358 | plot(ames_ridge) 359 | 360 | min(ames_ridge$cvm) # minimum MSE 361 | ## [1] 0.02284662 362 | 363 | ames_ridge$lambda.min # lambda for this min MSE 364 | ## [1] 0.2450304 365 | 366 | ames_ridge$cvm[ames_ridge$lambda == ames_ridge$lambda.1se] # 1 st.error of min MSE 367 | ## [1] 0.02545676 368 | 369 | ames_ridge$lambda.1se # lambda for this MSE 370 | ## [1] 0.7482874 371 | 372 | # Prediction performance of the fitted ridge regression model. 373 | ridge.pred <- predict(ames_ridge, s = ames_ridge$lambda.min, newx = ames_test_x) 374 | mean((ridge.pred - ames_test_y)^2) 375 | ## [1] 0.01787904 376 | 377 | ############## LASSO Regression #################### 378 | # We now apply the ridge regression method to the Ames Housing data. 379 | # Apply lasso regression to ames data 380 | ames_lasso <- glmnet(x = ames_train_x, y = ames_train_y, alpha = 1) 381 | 382 | plot(ames_lasso, xvar = "lambda") 383 | 384 | # Tuning the penalty parameter λ. 385 | # Apply CV lasso regression to ames data 386 | ames_lasso <- cv.glmnet(x = ames_train_x, y = ames_train_y, alpha = 1) 387 | 388 | # plot results 389 | plot(ames_lasso) 390 | 391 | min(ames_lasso$cvm) # minimum MSE 392 | ## [1] 0.02828329 393 | 394 | ames_lasso$lambda.min # lambda for this min MSE 395 | ## [1] 0.003315378 396 | 397 | ames_lasso$cvm[ames_lasso$lambda == ames_lasso$lambda.1se] # 1st.error of min MSE 398 | ## [1] 0.03308106 399 | 400 | ames_lasso$lambda.1se # lambda for this MSE 401 | ## [1] 0.03091933 402 | 403 | #Prediction performance of the fitted lasso regression model. 
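# (Added note) lambda.min minimizes the cross-validated MSE, while lambda.1se is the largest
# lambda whose CV error lies within one standard error of that minimum; the 1se choice trades a
# little apparent accuracy for a sparser, more heavily penalized model. Both are compared on the
# test set below.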
404 | lasso.pred <- predict(ames_lasso, s = ames_lasso$lambda.min, newx = ames_test_x) 405 | mean((lasso.pred - ames_test_y)^2) 406 | ## [1] 0.02093244 407 | 408 | lasso.pred2 <- predict(ames_lasso, s = ames_lasso$lambda.1se, newx = ames_test_x) 409 | mean((lasso.pred2 - ames_test_y)^2) 410 | ## [1] 0.02726125 411 | 412 | # Advantage of choosing λ with MSE within 1 standard error 413 | ames_lasso_min <- glmnet( 414 | x = ames_train_x, 415 | y = ames_train_y, 416 | alpha = 1 417 | ) 418 | 419 | plot(ames_lasso_min, xvar = "lambda") 420 | abline(v = log(ames_lasso$lambda.min), col = "red", lty = "dashed") 421 | abline(v = log(ames_lasso$lambda.1se), col = "red", lty = "dashed") 422 | 423 | ######## Linear Regression with Dummy Variables ######### 424 | 425 | mydata = read.csv('Travel_dummy_Reg.csv',header = T,sep = ",") 426 | attach(mydata) 427 | mydata = mydata[,2:4] 428 | 429 | # Converting categorical x's to factors 430 | gender = factor(Gender) 431 | income = factor(Income) 432 | # Fitting the model 433 | mymodel = lm(Attitude ~ gender + income) 434 | summary(mymodel) 435 | # Regression Model Performance 436 | anova(mymodel) 437 | 438 | ################################################################################ 439 | 440 | ################## Fitting Multiple Linear Regression manually ################# 441 | 442 | # Read the data 443 | df1<-data.frame( 444 | x0=c(1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1), 445 | x1=c(410,368,357,373,361,385,380,400,407,410,421,446,478,441,454,440,427), 446 | x2=c(9,11,15,12,9,9,10,12,12,13,12,19,19,18,12,12,12) 447 | ) 448 | df1 449 | 450 | df2 = data.frame( 451 | y=c(776,580,539,648,538,891,673,783,571,627,727,867,1042,804,832,764,727) 452 | ) 453 | df2 454 | 455 | Mat1 = data.matrix (df1) 456 | Mat1 457 | Maty = data.matrix (df2) 458 | Maty 459 | 460 | ####### Transpose of Matrix Mat1 ######### 461 | TMat1 = t(Mat1) 462 | TMat1 463 | 464 | ############ Matrix multiplication############### 465 | Mat2 = TMat1 %*% Mat1 466 | Mat2 467 | 468 | ############ Inverse Matrix ############ 469 | Mat3 = solve (Mat2) 470 | Mat3 471 | beta = Mat3 %*% TMat1 %*%Maty 472 | beta 473 | 474 | haty = Mat1 %*% beta 475 | haty 476 | 477 | ############# SSResid ################ 478 | Residsquare = (Maty - haty)^2 479 | Residsquare 480 | SSResid = sum (Residsquare) 481 | SSResid 482 | 483 | ### SSResid with Matrix ### 484 | TMaty = t(Maty) 485 | TMaty 486 | Tbeta = t(beta) 487 | Tbeta 488 | SSResid = (TMaty %*% Maty) - (Tbeta %*% TMat1 %*% Maty) 489 | SSResid 490 | 491 | ##################SSTOTAL######################### 492 | Matyy = Maty^2 493 | Matyy 494 | bary = mean(Maty) 495 | bary 496 | SSTot = sum(Matyy)-17*(bary^2) 497 | SSTot 498 | 499 | ################ R^2 Calculation ############## 500 | R2 = 1-(SSResid/SSTot) 501 | R2 502 | 503 | #########################Sigma Hat ####################### 504 | sigmahat2 = SSResid/(17-3) 505 | sigmahat2 506 | 507 | #####################Confidence interval for Sigma Square############### 508 | q1 = qchisq (0.025, df = 14) 509 | q2 = qchisq (0.975, df = 14) 510 | CI = c(SSResid/q2 , SSResid/q1) 511 | CI 512 | 513 | ##################### SSRegression ############ 514 | SSReg = SSTot - SSResid 515 | SSReg 516 | 517 | ##### SSRegression with Matrix #### 518 | SSReg = (Tbeta %*% TMat1 %*% Maty) - (17*(bary^2)) 519 | SSReg 520 | 521 | ############################ F value######################### 522 | MSReg = SSReg/ (3-1) 523 | MSResid = SSResid/ (17-3) 524 | F = MSReg/MSResid 525 | F 526 | qf(p=.025, df1=2, df2=14, lower.tail=FALSE) 527 | 
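# (Added sketch, not part of the original practical) Cross-checking the manual matrix computations
# against lm(); 'check_dat' and 'check_fit' are hypothetical helper objects built from df1 and df2 above.
check_dat = data.frame(y = df2$y, x1 = df1$x1, x2 = df1$x2)
check_fit = lm(y ~ x1 + x2, data = check_dat)
coef(check_fit)                # should match 'beta' from solve(t(X) %*% X) %*% t(X) %*% y
summary(check_fit)$r.squared   # should match R2 computed above
summary(check_fit)$sigma^2     # should match sigmahat2 = SSResid/(17 - 3)
summary(check_fit)$fstatistic  # overall F on (2, 14) df, matching MSReg/MSResid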
################################################################################ 528 | 529 | 530 | 531 | 532 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_7_Logistic_Regression.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Practical Slides and Codes/Practical_7_Logistic_Regression.pdf -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_7_Logistic_Regression_Code.R: -------------------------------------------------------------------------------- 1 | ################ Logistic and Non linear Regression in RStudio ################# 2 | 3 | # Installing all the required packages for the R Notebook 4 | 5 | install.packages("car") 6 | install.packages("pls") 7 | install.packages("ggplot2") 8 | install.packages("gplots") 9 | install.packages("boot") 10 | install.packages ("MASS") 11 | install.packages("glmnet") 12 | install.packages("DAAG") 13 | library(DAAG) 14 | library(glmnet) 15 | library(car) 16 | library(MASS) 17 | library(boot) 18 | library(gplots) 19 | library(ggplot2) 20 | library(pls) 21 | 22 | #################### Nonlinear Regression #################### 23 | 24 | # Read the data file and variables 25 | mydata = read.csv('Nonlinear_Thrust.csv', header = T, sep = ",") 26 | mydata 27 | cor(mydata) 28 | plot(mydata$x1,mydata$y) 29 | plot(mydata$x2,mydata$y) 30 | plot(mydata$x3,mydata$y) 31 | mymodel = lm(y ~ x1 + x2 + x3, data = mydata) 32 | summary(mymodel) 33 | 34 | # Install car package 35 | #install.packages("car") 36 | library(car) 37 | crPlots(mymodel) 38 | 39 | # Design polynomial model-1 40 | newmodel1 = lm(y ~ poly(x1, 2, raw = TRUE) + x2 + x3, data = mydata) 41 | summary(newmodel1) 42 | crPlots(newmodel1) 43 | 44 | # Design polynomial model-2 45 | newmodel2 = lm(y ~ poly(x1, 3, raw = TRUE) + x2 + x3, data = mydata) 46 | summary(newmodel2) 47 | crPlots(newmodel2) 48 | 49 | # Design Final Polynomial model 50 | finalmodel = lm(y ~ poly(x1, 3, raw = TRUE) + 51 | poly(x2, 2, raw = TRUE) + 52 | sqrt(x3), data = mydata) 53 | crPlots(finalmodel) 54 | summary(finalmodel) 55 | 56 | finalmodel = lm(y ~ poly(x1, 3, raw = TRUE) + 57 | poly(x2, 3, raw = TRUE) + 58 | poly(x3, 3, raw = TRUE), data = mydata) 59 | res = residuals(finalmodel) 60 | qqnorm(res) 61 | qqline(res) 62 | shapiro.test(res) 63 | 64 | ##################### Regression Spline ##################### 65 | 66 | # Read the data file and variables 67 | mydata = read.csv('Reg_Spline_DFR.csv', header = T, sep = ",") 68 | mydata 69 | design = mydata$Design 70 | coding = mydata$Coding 71 | plot(design, coding) 72 | mymodel = lm(coding ~ design) 73 | summary(mymodel) 74 | pred = predict(mymodel) 75 | plot(design, coding) 76 | lines( design, pred, col = "blue") 77 | 78 | 79 | ################################## 80 | design44 = design - 0.44 81 | design44[design44 < 0] = 0 82 | mymodel = lm(coding ~ design + design44) 83 | summary(mymodel) 84 | pred = predict(mymodel) 85 | plot(design, coding) 86 | lines(design, pred, col = "blue") 87 | 88 | design44cb = design44^3 89 | mymodel = lm(coding ~ poly(design, 3, raw = TRUE) + design44cb) 90 | summary(mymodel) 91 | pred = predict(mymodel) 92 | plot(design, coding) 93 | lines(design, pred, col = "blue") 94 | 95 | ####################### Logistic Regression with 'default' data in ISLR ##################### 96 | library(ISLR2) 
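# (Added line) The pipe and mutate() calls below assume dplyr/tidyverse is attached, which the
# package list at the top of this script does not appear to include, so it is loaded here explicitly.
library(tidyverse)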
97 | 98 | Default %>% head() 99 | 100 | Default %>% 101 | mutate(prob = ifelse(default == "Yes", 1, 0)) %>% 102 | ggplot(aes(balance, prob)) + 103 | geom_point(alpha = .25) + 104 | geom_smooth(method = "lm") + 105 | ggtitle("Linear regression model fit") + 106 | xlab("Balance") + 107 | ylab("Probability of Default") 108 | 109 | #Fitting a simple logistic regression model. 110 | 111 | library(modelr) 112 | 113 | set.seed(100) 114 | 115 | # Split into training and testing data 116 | default_split <- resample_partition(Default, c(test = 0.3, train = 0.7)) 117 | 118 | default_train <- as_tibble(default_split$train) 119 | default_test <- as_tibble(default_split$test) 120 | 121 | # Fitting a simple logistic regression model 122 | model1 <- glm(default ~ balance, family = binomial, data = default_train) 123 | 124 | summary(model1) 125 | 126 | # Quick check whether "default = Yes" is coded as "1" by R 127 | contrasts(default_train$default) 128 | 129 | # Interpretation of the model coefficients 130 | 131 | exp(coef(model1)) 132 | ## (Intercept) balance 133 | ## 1.868042e-05 1.005699e+00 134 | 135 | # Let us visualize the model fit. 136 | 137 | Default %>% 138 | mutate(prob = ifelse(default == "Yes", 1, 0)) %>% 139 | ggplot(aes(balance, prob)) + 140 | geom_point(alpha = .25) + 141 | geom_smooth(method = "glm", method.args = list(family = "binomial")) + 142 | ggtitle("Logistic regression model fit") + 143 | xlab("Balance") + 144 | ylab("Probability of Default") 145 | 146 | # Predictions 147 | 148 | test.predictions.m1 <- predict(model1, newdata = default_test, type = "response") 149 | test.predictions.m1 <- tibble(balance = default_test$balance, 150 | true.default.status = default_test$default, 151 | pred.default = test.predictions.m1) 152 | test.predictions.m1 153 | 154 | test.predictions.m1 %>% 155 | filter(true.default.status == "Yes") 156 | 157 | ############# Multiple Logistic Regression 158 | 159 | # Let us now demonstrate the procedure with the Default dataset taking into account all the available predictors, namely, student, balance and income. 160 | 161 | model2 <- glm(default ~ student + balance + income, family = binomial, data = default_train) 162 | 163 | summary(model2) 164 | 165 | # We can do predictions similarly. 
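# (Added sketch) Before turning to predictions: as with the single-predictor model, exponentiating
# the coefficients gives multiplicative effects on the odds of default, holding the others fixed.
exp(coef(model2))             # odds ratios for studentYes, balance and income
exp(confint.default(model2))  # Wald-type 95% confidence intervals on the odds-ratio scale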
166 | test.predictions.m2 <- predict(model2, newdata = default_test, type = "response") 167 | test.predictions.m2 <- tibble(default_test, 168 | pred.default = test.predictions.m2) 169 | test.predictions.m2 170 | 171 | ## Take a look at the predicted default probabilities who actually defaulted 172 | 173 | test.predictions.m2 %>% 174 | filter(default == "Yes") 175 | 176 | #Model validation via Classification performance assessment 177 | 178 | # First model: only predictor is balance 179 | 180 | glm.pred.m1 <- rep("No", dim(default_test)[1]) 181 | glm.pred.m1[test.predictions.m1$pred.default > 0.5] <- "Yes" 182 | 183 | table(glm.pred.m1, default_test$default) 184 | 185 | # Second model: all predictors included 186 | 187 | glm.pred.m2 <- rep("No", dim(default_test)[1]) 188 | glm.pred.m2[test.predictions.m2$pred.default > 0.5] <- "Yes" 189 | 190 | table(glm.pred.m2, default_test$default) 191 | 192 | # ROC Curve and AUC 193 | 194 | library(ROCR) 195 | 196 | test.predictions.m1 <- predict(model1, newdata = default_test, type = "response") 197 | prediction(test.predictions.m1, default_test$default) %>% 198 | performance(measure = "tpr", x.measure = "fpr") %>% 199 | plot() %>% 200 | abline(a=0,b=1, lty = "dashed") 201 | 202 | # AUC 203 | prediction(test.predictions.m1, default_test$default) %>% 204 | performance(measure = "auc") %>% 205 | .@y.values 206 | ## [[1]] 207 | ## [1] 0.9409003 208 | 209 | ########### Binary Logistic Regression with visit data ########### 210 | # Response variable is Categorical 211 | 212 | #Reading the file and variables 213 | mydata = read.csv('Resort_Visit.csv',header = T,sep = ",") 214 | # either attach the data or use data$. to extract columns 215 | visit = mydata$Resort_Visit 216 | income = mydata$Family_Income 217 | attitude = mydata$Attitude.Towards.Travel 218 | importance = mydata$Importance_Vacation 219 | size = mydata$House_Size 220 | age = mydata$Age._Head 221 | 222 | #Converting response variable to discrete 223 | visit = factor(visit) 224 | 225 | # Computing Correlation Matrix 226 | cor(mydata) 227 | # Correlation between X variables should be low otherwise it will indicate Multicollinearity 228 | 229 | # Checking relation between Xs and Y 230 | aggregate(income ~visit, FUN = mean) 231 | aggregate(attitude ~visit, FUN = mean) 232 | aggregate(importance ~visit, FUN = mean) 233 | aggregate(size ~visit, FUN = mean) 234 | aggregate(age ~visit, FUN = mean) 235 | # Higher the difference in means, stronger will be the relation to response variable 236 | 237 | # Checking relation between Xs and Y - box plot 238 | boxplot(income ~ visit) 239 | boxplot(attitude ~ visit) 240 | boxplot(importance ~ visit) 241 | boxplot(size ~ visit) 242 | boxplot(age ~ visit) 243 | 244 | # Fitting Logistic Regression Model 245 | model = glm(visit ~ income + attitude + importance + size + age, family = binomial(logit)) 246 | summary(model) 247 | 248 | # Perform Logistic regression - ANOVA 249 | anova(model,test = 'Chisq') 250 | # Since p value < 0.05 for Income redo the modeling with important factors only 251 | 252 | # Modifying the Logistic Regression Model 253 | model_m = glm(visit ~ income, family = binomial(logit)) 254 | summary(model_m) 255 | 256 | # Perform Logistic regression - Anova 257 | anova(model_m,test = 'Chisq') 258 | 259 | cdplot(visit ~ attitude) 260 | 261 | # Fitted Value & Residual 262 | predict(model_m,type = 'response') 263 | residuals(model_m,type = 'deviance') 264 | predclass = ifelse(predict(model_m, type ='response')>0.5,"1","0") 265 | predclass 266 | 267 | # 
Model Evaluation 268 | mytable = table(visit, predclass) 269 | mytable 270 | prop.table(mytable) 271 | 272 | 273 | prop.table(mytable) 274 | 275 | ######### Ordinal Logistic Regression ########### 276 | 277 | # Read the data file and variables 278 | mydata = read.csv('ST_Defects.csv', header = T, sep = ",") 279 | dd = mydata$DD 280 | effort = mydata$Effort 281 | coverage = mydata$Test.Coverage 282 | dd = factor(dd) 283 | dd 284 | 285 | # Install MASS Package 286 | 287 | #install.packages("MASS") 288 | #library(MASS) 289 | 290 | # Fitting the model 291 | mymodel = polr(dd ~ effort + coverage) 292 | summary(mymodel) 293 | 294 | # Predicting Values 295 | pred = predict(mymodel) 296 | pred 297 | fit = fitted(mymodel) 298 | fit 299 | output = cbind(dd, pred) 300 | output 301 | 302 | # Comparing Actual Vs Predicted 303 | mytable = table(dd, pred) 304 | mytable 305 | 306 | 307 | ################################# Exercise Solution of Ridge regression and Lasso ##################################### 308 | 309 | 310 | # Apply Ridge Regression and Lasso approach to the Hitters data from ISLR2 package. 311 | # Aim is to predict a basketball player's Salary on the basis of various features 312 | 313 | install.packages("ISLR2") 314 | library (ISLR2) 315 | 316 | # Loading the data 317 | View(Hitters) 318 | names(Hitters) 319 | dim(Hitters) 320 | 321 | # Removing missing values 322 | sum(is.na(Hitters$Salary)) # Salary is missing for 59 players 323 | Hitters <- na.omit(Hitters) # remove all rows that have missing value in any variable 324 | dim(Hitters) 325 | sum(is.na(Hitters)) # no missing data 326 | 327 | # Use glmnet package for Ridge Regression and Lasso. The alpha argument determines the type of model to be fitted. 328 | install.packages("glmnet") 329 | library(glmnet) 330 | 331 | # Transform the dependent variable into a matrix, and convert all qualitative variable (if any) into dummy variables using model.matrix() function 332 | x <- model.matrix(Salary ~ . 
, Hitters)[,-1] 333 | print(x) 334 | y <- Hitters$Salary 335 | 336 | ########################### Ridge Regression ########################### 337 | 338 | # alpha argument of glmnet() function is set to 0 for ridge regression 339 | # grid <- 10^seq(10, -2, length = 100) # value of lambda 340 | # ridge.mod <- glmnet(x, y, alpha = 0, lambda = grid) 341 | # 342 | # dim(coef(ridge.mod)) # ridge coefficients associated with each value of lambda, arranged in a matrix form 343 | # 344 | # # coefficient estimates in terms of l2 norm are much smaller for a larger value of lambda, compared to a smaller value of lambda 345 | # ridge.mod$lambda[50] # lambda = 11498 346 | # coef(ridge.mod)[,50] 347 | # sqrt(sum(coef(ridge.mod)[-1,50]^2)) # l2 norm = 6.36 348 | # 349 | # ridge.mod$lambda[60] # lambda = 705 350 | # coef(ridge.mod)[,60] 351 | # sqrt(sum(coef(ridge.mod)[-1,60]^2)) # l2 norm = 57.1 352 | # 353 | # # obtain the ridge coefficients for a new value of lambda = 50 354 | # predict(ridge.mod, s = 50, type = "coefficients")[1:20,] 355 | 356 | # we will split the sample into train and test set to obtain the test error 357 | # set.seed(1) 358 | # train <- sample(1:nrow(x) , nrow(x) / 2) 359 | # test <- (-train) 360 | # y.test <- y[test] 361 | # 362 | # # fit ridge regression model on training data 363 | # ridge.mod <- glmnet(x[train, ], y[train], alpha = 0, lambda = grid, thresh = 1e-12) 364 | # 365 | # # use the fitted model to get predictions for a test set, using lambda = 4 366 | # ridge.pred <- predict(ridge.mod, s = 4, newx = x[test, ]) # we can use the predict function for various purposes by modifying the argument 367 | # mean((ridge.pred - y.test)^2) # computing test MSE 368 | # 369 | # # Compare a ridge regression model with lambda = 4 with a model fitting only an intercept 370 | # mean((mean(y[train]) - y.test)^2) # MSE for a model with just an intercept 371 | # 372 | # # We would get the same result by fitting a ridge regression model with a very large value of lambda 373 | # ridge.pred <- predict(ridge.mod , s = 1e10, newx = x[test, ]) # 1e10 means 10^10 374 | # mean((ridge.pred - y.test)^2) 375 | # # Ridge regression with lambda = 4 leads to a much lower test MSE than fitting a model with just an intercept 376 | # 377 | # # Compare a ridge regression model with lambda = 4 with a least square regression model 378 | # ridge.pred <- predict(ridge.mod, s = 0, newx = x[test, ], exact = T, x = x[train, ], y = y[train]) # least square regression is equivalent to ridge regression with lambda = 0 379 | # mean((ridge.pred - y.test)^2) 380 | # 381 | # lm(y~x, subset = train) 382 | # predict(ridge.mod, s = 0, exact = T, type = "coefficients", x = x[train, ], y = y[train])[1:20, ] 383 | 384 | # use cross-validation method to choose the value of lambda 385 | set.seed(1) 386 | cv.out <- cv.glmnet(x[train, ], y[train], alpha = 0) 387 | bestlam <- cv.out$lambda.min 388 | bestlam 389 | 390 | ridge.pred <- predict(ridge.mod, s = bestlam, newx = x[test, ]) # the best lambda to obtain predictions for a test set 391 | mean((ridge.pred - y.test)^2) # test MSE for best lambda 392 | 393 | # This model is a further improvement over the model with lambda = 4 in terms of the test MSE 394 | # Refit the ridge regression model on full dataset using the the value of lambda obtained by cross-validation approach 395 | out <- glmnet(x, y, alpha = 0) 396 | predict (out, type = "coefficients", s = bestlam)[1:20, ] 397 | 398 | ############################ LASSO ########################### 399 | 400 | # alpha argument of 
glmnet() function is set to 1 for lasso 401 | lasso.mod <- glmnet(x[train, ], y[train], alpha = 1, lambda = grid) 402 | plot(lasso.mod) 403 | 404 | # perform cross-validation and compute the test error 405 | set.seed(1) 406 | cv.out <- cv.glmnet(x[train, ], y[train], alpha = 1) 407 | bestlam <- cv.out$lambda.min 408 | lasso.pred <- predict(lasso.mod, s = bestlam, newx = x[test, ]) 409 | mean((lasso.pred - y.test)^2) # MSE is substantially lower than the test MSE of null model and least square regrssion model; it is very similar to the test MSE of ridge regression with lambda chosen by cross validation 410 | 411 | # lasso has an advantage over ridge regression in that the resulting coefficient estimates are sparse. 412 | out <- glmnet(x, y, alpha = 1, lambda = grid) 413 | lasso.coef <- predict(out, type = "coefficients", s = bestlam)[1:20, ] 414 | lasso.coef # 8 out of 19 coefficients are exactly 0 415 | lasso.coef[lasso.coef != 0] # model contains only 11 variables 416 | 417 | 418 | ########################################################## 419 | # EXPLORATORY NON-PARAMETRIC REGRESSION IN R WITH GGPLOT # 420 | ########################################################## 421 | 422 | require(smooth) 423 | install.packages("ggpubr") 424 | library(ggpubr) 425 | library(ggplot2) 426 | 427 | ############## Creating the data ###################### 428 | set.seed(999) 429 | n <- 120 430 | t <- seq(0,3.5*pi/2,length=n) 431 | class <- as.factor(sample(c("A","B","C"),n/3,replace = TRUE)) 432 | data <- data.frame(Predictor=t,Output=sin(t)-.5*sin(as.numeric(class)*t)+rnorm(n,0,.35),class=class) 433 | 434 | ################################################################# 435 | # ggplot do all this stuff for us with the function "geom_smooth" 436 | ################################################################# 437 | 438 | # Linear regression 439 | gg.lm <- ggplot(data,aes(x=Predictor,y=Output,col=class))+geom_point()+ 440 | geom_smooth(method='lm') + facet_grid(class ~.) 441 | print(gg.lm) 442 | 443 | # Quadratic regression 444 | gg.qua <- ggplot(data,aes(x=Predictor,y=Output,col=class))+geom_point()+ 445 | geom_smooth(method='lm',formula= y ~ poly(x,2)) + facet_grid(class ~.) 446 | print(gg.qua) 447 | 448 | # Cubic regression 449 | gg.cub <- ggplot(data,aes(x=Predictor,y=Output,col=class))+geom_point()+ 450 | geom_smooth(method='lm',formula= y ~ poly(x,3)) + facet_grid(class ~.) 451 | print(gg.cub) 452 | 453 | # Natural splines 454 | gg.ns2 <- ggplot(data,aes(x=Predictor,y=Output,col=class))+geom_point()+ 455 | geom_smooth(method='gam',formula=y ~ splines::ns(x, 2)) + facet_grid(class ~.) 456 | print(gg.ns2) 457 | 458 | gg.ns3 <- ggplot(data,aes(x=Predictor,y=Output,col=class))+geom_point()+ 459 | geom_smooth(method='gam',formula=y ~ splines::ns(x, 3)) + facet_grid(class ~.) 460 | print(gg.ns3) 461 | 462 | gg.bs3 <- ggplot(data,aes(x=Predictor,y=Output,col=class))+geom_point()+ 463 | geom_smooth(method='gam',formula=y ~ splines::bs(x, 3)) + facet_grid(class ~.) 464 | print(gg.bs3) 465 | 466 | gg.ns30 <- ggplot(data,aes(x=Predictor,y=Output,col=class))+geom_point()+ 467 | geom_smooth(method='gam',formula=y ~ splines::ns(x, 30)) + facet_grid(class ~.) 468 | print(gg.ns30) 469 | 470 | gg.bs30 <- ggplot(data,aes(x=Predictor,y=Output,col=class))+geom_point()+ 471 | geom_smooth(method='gam',formula=y ~ splines::bs(x, 30)) + facet_grid(class ~.) 
472 | print(gg.bs30) 473 | 474 | ################################# END OF SESSION ############################### -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_8_TS_Forecasting.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Practical Slides and Codes/Practical_8_TS_Forecasting.pdf -------------------------------------------------------------------------------- /Practical Slides and Codes/Practical_8_TS_Forecasting_Code.R: -------------------------------------------------------------------------------- 1 | 2 | ####################### Time series analysis using R ########################### 3 | 4 | 5 | # Installing all the required packages for the R Notebook 6 | 7 | #install.packages(stats) 8 | library(stats) 9 | 10 | ############### Time series plot ############### 11 | mydata <- read.csv("E15demand.csv") 12 | E15 = ts(mydata$Demand, start = c(2012,4), end = c(2013,10), frequency = 12) 13 | E15 14 | plot(E15, type = "b") 15 | 16 | E15 = ts(mydata$Demand) 17 | E15 18 | plot(E15, type = "b") 19 | 20 | ############# Trend in GDP data ############### 21 | mydata <- read.csv("Trens_GDP.csv") 22 | GDP <- ts(mydata$GDP, start = 1993, end = 2003) 23 | plot(GDP, type = "b") 24 | 25 | ############# Seasonality in Sales data ############### 26 | mydata <- read.csv("Seasonal_sales.csv") 27 | sales = ts(mydata$Sales, start = c(2002,1), end = c(2005,12), frequency = 12) 28 | plot(sales, type = "b") 29 | 30 | ############# Trend & Seasonality in TS data ############### 31 | mydata <- read.csv("Trend_&_Seasonal.csv") 32 | sales = ts(mydata$Sales) 33 | plot(sales, type = "b") 34 | 35 | ################## Stationarity Test ######################## 36 | mydata <- read.csv("shipment.csv") 37 | shipments = ts(mydata$Shipments) 38 | plot(shipments, type = "b") 39 | 40 | # ADF Test 41 | install.packages("tseries") 42 | library("tseries") 43 | adf.test(shipments) 44 | 45 | # KPSS Test 46 | kpss.test(shipments) 47 | 48 | ############# Stationarity in GDP data ############### 49 | mydata <- read.csv("Trens_GDP.csv") 50 | GDP <- ts(mydata$GDP, start = 1993, end = 2003) 51 | plot(GDP, type = "b") 52 | 53 | kpss.test(GDP) 54 | 55 | # Differencing 56 | 57 | install.packages("forecast") 58 | library(forecast) 59 | ndiffs(GDP) 60 | 61 | mydiffdata = diff(GDP, difference = 1) 62 | plot(mydiffdata, type = "b") 63 | adf.test(mydiffdata) 64 | kpss.test(mydiffdata) 65 | 66 | ################### Simple Exponential Smoothing ################## 67 | mydata <- read.csv("Amount.csv") 68 | amount = ts(mydata$Amount) 69 | plot(amount, type ="b") 70 | 71 | # ADF Test & KPSS Test 72 | adf.test(amount) 73 | kpss.test(amount) 74 | 75 | #Fitting the model 76 | mymodel = HoltWinters(amount, beta = FALSE, gamma = FALSE) 77 | mymodel 78 | 79 | # Actual Vs Fitted plot 80 | plot(mymodel) 81 | 82 | # Computing predicted values and residuals (errors) 83 | pred = fitted(mymodel) 84 | res = residuals(mymodel) 85 | outputdata = cbind(amount, pred[,1], res) 86 | write.csv(outputdata, "amount_outputdata.csv") 87 | 88 | # Model diagnostics 89 | abs_res = abs(res) 90 | res_sq = res^2 91 | pae = abs_res/ amount 92 | mean(res) # Mean Residuals 93 | mean(abs_res) # Mean Absolute Residuals 94 | mean(res_sq) # Mean Residuals Square 95 | sqrt(mean(res_sq)) # Root Mean Residuals Square 96 | mean(pae)*100 # Mean absolute percentage error 97 | 98 | #Normality 
Test of Errors with zero 99 | qqnorm(res) 100 | qqline(res) 101 | shapiro.test(res) 102 | mean(res) 103 | 104 | #Forecast 105 | forecast = forecast(mymodel, 1) 106 | forecast 107 | plot(forecast) 108 | 109 | ######################### Autocorrelation ########################### 110 | mydata <- read.csv("Trens_GDP.csv") 111 | GDP <- ts(mydata$GDP, start = 1993, end = 2003) 112 | acf(GDP, 3) 113 | acf(GDP) 114 | 115 | ######################## ARIMA Model ################################# 116 | mydata <- read.csv("Visits.csv") 117 | mydata <- ts(mydata$Data) 118 | plot(mydata, type = "b") 119 | 120 | # Descriptive Statistics 121 | summary(mydata) 122 | 123 | # Check whether the series is stationary 124 | library(tseries) 125 | adf.test(mydata) 126 | kpss.test(mydata) 127 | ndiffs(mydata) 128 | 129 | #Draw ACF & PACF Graphs 130 | acf(mydata) 131 | pacf(mydata) 132 | 133 | # ARIMA Model Fitting 134 | mymodel = auto.arima(mydata) 135 | mymodel 136 | 137 | # Identification of model manually 138 | arima(mydata, c(0,0,1)) 139 | arima(mydata, c(1,0,0)) 140 | arima(mydata, c(1,0,1)) 141 | 142 | # Model diagnostics 143 | summary(mymodel) 144 | pred = fitted(mymodel) 145 | res = residuals(mymodel) 146 | 147 | #Normality check on Residuals 148 | qqnorm(res) 149 | qqline(res) 150 | shapiro.test(res) 151 | hist(res, col = "grey") 152 | 153 | # Autocorrelation check on Residuals 154 | acf(res) 155 | Box.test(res, lag = 15, type = "Ljung-Box") 156 | 157 | # Forecasting the future 158 | forecast = forecast(mymodel, h = 3) 159 | forecast 160 | 161 | ############################ End of Session #################################### -------------------------------------------------------------------------------- /Practical Slides and Codes/Reg_Spline_DFR.csv: -------------------------------------------------------------------------------- 1 | Design,Coding 2 | 0.05,0.02 3 | 0.21,0.04 4 | 0.24,0.05 5 | 0.29,0.065 6 | 0.32,0.07 7 | 0.38,0.08 8 | 0.39,0.085 9 | 0.42,0.085 10 | 0.425,0.09 11 | 0.435,0.09 12 | 0.5,0.2 13 | 0.52,0.22 14 | 0.53,0.25 15 | 0.54,0.295 16 | 0.54,0.275 17 | 0.55,0.3 18 | 0.565,0.278 19 | 0.57,0.275 20 | 0.585,0.32 21 | 0.59,0.34 22 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Resort_Visit.csv: -------------------------------------------------------------------------------- 1 | Resort_Visit,Family_Income,Attitude Towards Travel,Importance_Vacation,House_Size,Age _Head 2 | 0,20.2,5,8,3,43 3 | 0,70.3,6,7,4,61 4 | 0,62.9,7,5,6,52 5 | 0,48.5,7,5,5,36 6 | 0,52.7,6,6,4,55 7 | 0,75,8,7,5,68 8 | 0,46.2,5,3,3,62 9 | 0,57,2,4,6,51 10 | 0,64.1,7,5,4,57 11 | 0,68.1,7,6,5,45 12 | 0,73.4,6,7,5,44 13 | 0,71.9,5,8,4,64 14 | 0,56.2,1,8,6,54 15 | 0,49.3,4,2,3,56 16 | 0,62,5,6,2,58 17 | 1,32.1,5,4,3,58 18 | 1,36.2,4,3,2,55 19 | 1,43.2,2,5,2,57 20 | 1,50.4,5,2,4,37 21 | 1,44.1,6,6,3,42 22 | 1,38.3,6,6,2,45 23 | 1,55,1,2,2,57 24 | 1,46.1,3,5,3,51 25 | 1,35,6,4,5,64 26 | 1,37.3,2,7,4,54 27 | 1,41.8,5,1,3,56 28 | 1,57,8,3,2,36 29 | 1,33.4,6,8,2,50 30 | 1,37.5,3,2,3,48 31 | 1,41.3,3,3,2,42 32 | -------------------------------------------------------------------------------- /Practical Slides and Codes/ST_Defects.csv: -------------------------------------------------------------------------------- 1 | DD,Effort,Test Coverage 2 | High,35,26 3 | Medium,33,36 4 | High,39,42 5 | High,37,32 6 | High,31,51 7 | Medium,36,39 8 | High,36,46 9 | High,31,31 10 | High,41,41 11 | High,37,31 12 | High,44,41 13 | Low,33,41 14 | High,31,36 15 | High,44,41 16 | 
High,35,33 17 | Medium,44,46 18 | Medium,46,51 19 | High,46,46 20 | Low,46,36 21 | High,49,56 22 | Medium,39,41 23 | High,44,46 24 | High,47,51 25 | High,44,31 26 | Low,37,46 27 | Low,38,46 28 | Medium,49,51 29 | Medium,44,51 30 | Low,41,41 31 | Low,50,51 32 | Medium,44,36 33 | High,39,26 34 | Low,40,31 35 | Low,46,41 36 | High,41,26 37 | High,39,46 38 | High,33,36 39 | Low,59,51 40 | Medium,41,41 41 | Low,47,51 42 | Medium,31,51 43 | Low,42,61 44 | Low,55,41 45 | High,40,47 46 | Low,44,48 47 | High,54,41 48 | High,46,61 49 | Medium,54,36 50 | High,42,51 51 | Low,49,51 52 | Low,41,56 53 | High,41,41 54 | Medium,44,51 55 | Low,57,56 56 | Low,52,51 57 | Medium,49,56 58 | Low,41,56 59 | Medium,57,41 60 | High,62,46 61 | Medium,33,31 62 | High,54,46 63 | Low,40,56 64 | Low,52,61 65 | High,57,51 66 | High,49,51 67 | High,46,46 68 | Medium,57,43 69 | Low,49,61 70 | Low,52,56 71 | Medium,53,41 72 | Low,54,52 73 | Low,57,41 74 | Low,41,56 75 | Medium,52,57 76 | High,57,61 77 | Medium,54,56 78 | High,52,56 79 | Medium,52,56 80 | Medium,44,46 81 | Low,46,61 82 | High,49,36 83 | Low,54,41 84 | Low,52,56 85 | Medium,44,41 86 | High,49,46 87 | Low,46,66 88 | Medium,54,61 89 | Medium,39,46 90 | Medium,59,61 91 | High,45,51 92 | High,52,36 93 | Medium,54,46 94 | High,44,56 95 | Low,50,58 96 | High,62,31 97 | Medium,59,56 98 | Low,52,61 99 | High,52,56 100 | High,46,61 101 | Medium,41,52 102 | Low,52,61 103 | High,52,56 104 | High,55,61 105 | High,41,31 106 | Medium,49,41 107 | Low,59,66 108 | Medium,54,51 109 | Medium,54,51 110 | Low,59,61 111 | Low,55,51 112 | Low,62,61 113 | Low,54,51 114 | Low,54,56 115 | Low,54,51 116 | Low,59,46 117 | High,57,46 118 | Low,61,61 119 | Low,52,51 120 | Low,59,56 121 | Low,62,51 122 | High,60,37 123 | Low,59,66 124 | Low,65,56 125 | Low,61,31 126 | Low,59,46 127 | Low,59,61 128 | Low,59,61 129 | Low,59,51 130 | Low,65,61 131 | Low,57,61 132 | Low,59,71 133 | Low,49,56 134 | Medium,49,46 135 | Low,59,66 136 | Low,62,46 137 | Medium,59,61 138 | Low,54,66 139 | Low,63,46 140 | Low,43,44 141 | Low,54,61 142 | Low,52,61 143 | Low,57,56 144 | Medium,57,56 145 | Medium,57,51 146 | Low,61,56 147 | Medium,62,61 148 | Medium,62,66 149 | Medium,65,51 150 | Low,57,41 151 | Low,59,56 152 | Low,59,51 153 | Low,62,61 154 | Low,65,61 155 | High,62,51 156 | Medium,62,51 157 | Low,62,41 158 | Low,54,56 159 | Low,62,56 160 | Low,65,56 161 | Low,62,66 162 | Medium,67,61 163 | Medium,65,66 164 | Low,60,66 165 | Low,59,51 166 | Low,59,71 167 | High,59,61 168 | Medium,65,66 169 | Low,62,66 170 | Low,65,61 171 | Medium,59,66 172 | Medium,59,46 173 | Low,61,66 174 | Low,65,56 175 | High,67,56 176 | Low,65,61 177 | Low,65,56 178 | Low,67,71 179 | Low,65,71 180 | Low,62,61 181 | Low,52,61 182 | Low,63,61 183 | Low,59,51 184 | Low,65,66 185 | Medium,59,66 186 | Low,67,66 187 | Low,67,66 188 | Low,60,66 189 | Low,67,71 190 | Low,62,66 191 | Low,54,66 192 | Low,65,71 193 | Low,62,61 194 | Low,59,56 195 | Low,60,71 196 | Low,63,66 197 | Low,65,71 198 | High,63,66 199 | Low,67,66 200 | Low,65,56 201 | Low,62,66 202 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Sales_Revenue_Anova.csv: -------------------------------------------------------------------------------- 1 | Location,Sales Revenue 2 | 1,1.55 3 | 1,2.36 4 | 1,1.84 5 | 1,1.72 6 | 2,4.23 7 | 2,3.05 8 | 2,3.93 9 | 2,3.08 10 | 2,5 11 | 2,3.23 12 | 2,4.25 13 | 2,3.54 14 | 3,2.85 15 | 3,2.77 16 | 3,2.75 17 | 3,2.39 18 | 3,2.72 19 | 3,2.07 20 | 
-------------------------------------------------------------------------------- /Practical Slides and Codes/Seasonal_sales.csv: -------------------------------------------------------------------------------- 1 | Month,Sales 2 | Jan-02,164 3 | Feb-02,148 4 | Mar-02,152 5 | Apr-02,144 6 | May-02,155 7 | Jun-02,125 8 | Jul-02,153 9 | Aug-02,146 10 | Sep-02,138 11 | Oct-02,190 12 | Nov-02,192 13 | Dec-02,192 14 | Jan-03,147 15 | Feb-03,133 16 | Mar-03,163 17 | Apr-03,150 18 | May-03,129 19 | Jun-03,131 20 | Jul-03,145 21 | Aug-03,137 22 | Sep-03,138 23 | Oct-03,168 24 | Nov-03,176 25 | Dec-03,188 26 | Jan-04,139 27 | Feb-04,143 28 | Mar-04,150 29 | Apr-04,154 30 | May-04,137 31 | Jun-04,129 32 | Jul-04,128 33 | Aug-04,140 34 | Sep-04,143 35 | Oct-04,151 36 | Nov-04,177 37 | Dec-04,184 38 | Jan-05,151 39 | Feb-05,134 40 | Mar-05,164 41 | Apr-05,126 42 | May-05,131 43 | Jun-05,125 44 | Jul-05,127 45 | Aug-05,143 46 | Sep-05,143 47 | Oct-05,160 48 | Nov-05,190 49 | Dec-05,182 50 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Supply_Chain.csv: -------------------------------------------------------------------------------- 1 | Cycle time,Cost,Satisfaction 2 | 10,12035,0.98 3 | 5,45632,0.99 4 | 16,32000,0.975 5 | 5.3,48236,0.995 6 | 8.2,75236,0.976 7 | 4.5,63521,0.954 8 | 19.2,98654,0.989 9 | 11.5,10000,0.998 10 | 12.8,25321,0.997 11 | 10.5,65231,0.987 12 | 5.9,25000,0.978 13 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Travel_dummy_Reg.csv: -------------------------------------------------------------------------------- 1 | SL No,Gender,Income,Attitude 2 | 1,1,1,2 3 | 2,1,1,3 4 | 3,1,1,1 5 | 4,1,1,1 6 | 5,1,1,2 7 | 6,1,2,4 8 | 7,1,2,5 9 | 8,1,2,5 10 | 9,1,2,6 11 | 10,1,2,6 12 | 11,1,3,8 13 | 12,1,3,9 14 | 13,1,3,8 15 | 14,1,3,7 16 | 15,1,3,7 17 | 16,2,1,1 18 | 17,2,1,1 19 | 18,2,1,2 20 | 19,2,1,2 21 | 20,2,1,1 22 | 21,2,2,3 23 | 22,2,2,4 24 | 23,2,2,5 25 | 24,2,2,3 26 | 25,2,2,3 27 | 26,2,3,5 28 | 27,2,3,5 29 | 28,2,3,5 30 | 29,2,3,4 31 | 30,2,3,6 32 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Trend_&_Seasonal.csv: -------------------------------------------------------------------------------- 1 | Month ,Sales 2 | 1,742 3 | 2,697 4 | 3,776 5 | 4,898 6 | 5,1030 7 | 6,1107 8 | 7,1165 9 | 8,1216 10 | 9,1208 11 | 10,1131 12 | 11,971 13 | 12,783 14 | 13,741 15 | 14,700 16 | 15,774 17 | 16,932 18 | 17,1099 19 | 18,1223 20 | 19,1290 21 | 20,1349 22 | 21,1341 23 | 22,1296 24 | 23,1066 25 | 24,901 26 | 25,896 27 | 26,793 28 | 27,885 29 | 28,1055 30 | 29,1204 31 | 30,1326 32 | 31,1303 33 | 32,1436 34 | 33,1473 35 | 34,1453 36 | 35,1170 37 | 36,1023 38 | 37,951 39 | 38,861 40 | 39,938 41 | 40,1109 42 | 41,1274 43 | 42,1422 44 | 43,1486 45 | 44,1555 46 | 45,1604 47 | 46,1600 48 | 47,1403 49 | 48,1209 50 | 49,1030 51 | 50,1032 52 | 51,1126 53 | 52,1285 54 | 53,1468 55 | 54,1637 56 | 55,1611 57 | 56,1608 58 | 57,1528 59 | 58,1420 60 | 59,1119 61 | 60,1013 62 | -------------------------------------------------------------------------------- /Practical Slides and Codes/Trens_GDP.csv: -------------------------------------------------------------------------------- 1 | Year,GDP 2 | 1993,94.43 3 | 1994,100 4 | 1995,107.25 5 | 1996,115.13 6 | 1997,124.16 7 | 1998,130.11 8 | 1999,138.57 9 | 2000,146.97 10 | 2001,153.4 11 | 2002,162.28 12 | 2003,168.73 13 | 
-------------------------------------------------------------------------------- /Practical Slides and Codes/Visits.csv: -------------------------------------------------------------------------------- 1 | SL No.,Data 2 | 1,259 3 | 2,310 4 | 3,268 5 | 4,379 6 | 5,275 7 | 6,102 8 | 7,139 9 | 8,60 10 | 9,93 11 | 10,45 12 | 11,101 13 | 12,161 14 | 13,288 15 | 14,372 16 | 15,291 17 | 16,416 18 | 17,248 19 | 18,314 20 | 19,351 21 | 20,417 22 | 21,276 23 | 22,164 24 | 23,120 25 | 24,379 26 | 25,277 27 | 26,208 28 | 27,361 29 | 28,289 30 | 29,138 31 | 30,206 32 | -------------------------------------------------------------------------------- /Practical Slides and Codes/bank_data.csv: -------------------------------------------------------------------------------- 1 | id,age,sex,region,income,married,children,car,save_act,current_act,mortgage,pep 2 | ID12101,48,FEMALE,INNER_CITY,17546,NO,1,NO,NO,NO,NO,YES 3 | ID12102,40,MALE,TOWN,30085.1,YES,3,YES,NO,YES,YES,NO 4 | ID12103,51,FEMALE,INNER_CITY,16575.4,YES,0,YES,YES,YES,NO,NO 5 | ID12104,23,FEMALE,TOWN,20375.4,YES,3,NO,NO,YES,NO,NO 6 | ID12105,57,FEMALE,RURAL,50576.3,YES,0,NO,YES,NO,NO,NO 7 | ID12106,57,FEMALE,TOWN,37869.6,YES,2,NO,YES,YES,NO,YES 8 | ID12107,22,MALE,RURAL,8877.07,NO,0,NO,NO,YES,NO,YES 9 | ID12108,58,MALE,TOWN,24946.6,YES,0,YES,YES,YES,NO,NO 10 | ID12109,37,FEMALE,SUBURBAN,25304.3,YES,2,YES,NO,NO,NO,NO 11 | ID12110,54,MALE,TOWN,24212.1,YES,2,YES,YES,YES,NO,NO 12 | ID12111,66,FEMALE,TOWN,59803.9,YES,0,NO,YES,YES,NO,NO 13 | ID12112,52,FEMALE,INNER_CITY,26658.8,NO,0,YES,YES,YES,YES,NO 14 | ID12113,44,FEMALE,TOWN,15735.8,YES,1,NO,YES,YES,YES,YES 15 | ID12114,66,FEMALE,TOWN,55204.7,YES,1,YES,YES,YES,YES,YES 16 | ID12115,36,MALE,RURAL,19474.6,YES,0,NO,YES,YES,YES,NO 17 | ID12116,38,FEMALE,INNER_CITY,22342.1,YES,0,YES,YES,YES,YES,NO 18 | ID12117,37,FEMALE,TOWN,17729.8,YES,2,NO,NO,NO,YES,NO 19 | ID12118,46,FEMALE,SUBURBAN,41016,YES,0,NO,YES,NO,YES,NO 20 | ID12119,62,FEMALE,INNER_CITY,26909.2,YES,0,NO,YES,NO,NO,YES 21 | ID12120,31,MALE,TOWN,22522.8,YES,0,YES,YES,YES,NO,NO 22 | ID12121,61,MALE,INNER_CITY,57880.7,YES,2,NO,YES,NO,NO,YES 23 | ID12122,50,MALE,TOWN,16497.3,YES,2,NO,YES,YES,NO,NO 24 | ID12123,54,MALE,INNER_CITY,38446.6,YES,0,NO,YES,YES,NO,NO 25 | ID12124,27,FEMALE,TOWN,15538.8,NO,0,YES,YES,YES,YES,NO 26 | ID12125,22,MALE,INNER_CITY,12640.3,NO,2,YES,YES,YES,NO,NO 27 | ID12126,56,MALE,INNER_CITY,41034,YES,0,YES,YES,YES,YES,NO 28 | ID12127,45,MALE,INNER_CITY,20809.7,YES,0,NO,YES,YES,YES,NO 29 | ID12128,39,FEMALE,TOWN,20114,YES,1,NO,NO,YES,NO,YES 30 | ID12129,39,FEMALE,INNER_CITY,29359.1,NO,3,YES,NO,YES,YES,NO 31 | ID12130,61,MALE,RURAL,24270.1,YES,1,NO,NO,YES,NO,YES 32 | ID12131,61,FEMALE,RURAL,22942.9,YES,2,NO,YES,YES,NO,NO 33 | ID12132,20,FEMALE,TOWN,16325.8,YES,2,NO,YES,NO,NO,NO 34 | ID12133,45,MALE,SUBURBAN,23443.2,YES,1,YES,YES,YES,NO,YES 35 | ID12134,33,FEMALE,INNER_CITY,29921.3,NO,3,YES,YES,NO,NO,NO 36 | ID12135,43,MALE,SUBURBAN,37521.9,NO,0,NO,YES,YES,NO,YES 37 | ID12136,27,FEMALE,INNER_CITY,19868,YES,2,NO,YES,YES,NO,NO 38 | ID12137,19,MALE,RURAL,10953,YES,3,YES,YES,YES,NO,NO 39 | ID12138,36,FEMALE,RURAL,13381,NO,0,YES,NO,YES,NO,YES 40 | ID12139,43,FEMALE,TOWN,18504.3,YES,0,YES,YES,YES,NO,NO 41 | ID12140,66,FEMALE,SUBURBAN,25391.5,NO,2,NO,NO,YES,NO,NO 42 | ID12141,55,MALE,TOWN,26774.2,YES,0,NO,NO,YES,YES,YES 43 | ID12142,47,FEMALE,INNER_CITY,26952.6,YES,0,YES,NO,YES,NO,NO 44 | ID12143,67,MALE,TOWN,55716.5,NO,2,YES,YES,NO,NO,YES 45 | ID12144,32,FEMALE,TOWN,27571.5,YES,0,YES,NO,YES,YES,NO 46 | 
ID12145,20,MALE,INNER_CITY,13740,NO,2,YES,YES,YES,YES,NO 47 | ID12146,64,MALE,INNER_CITY,52670.6,YES,2,NO,YES,YES,YES,YES 48 | ID12147,50,FEMALE,INNER_CITY,13283.9,NO,1,YES,YES,YES,NO,YES 49 | ID12148,29,MALE,INNER_CITY,13106.6,NO,2,NO,YES,YES,YES,YES 50 | ID12149,52,MALE,INNER_CITY,39547.8,NO,2,YES,NO,YES,NO,YES 51 | ID12150,47,FEMALE,RURAL,17867.3,YES,2,YES,YES,NO,NO,NO 52 | ID12151,24,MALE,TOWN,14309.7,NO,2,YES,YES,NO,NO,NO 53 | ID12152,36,MALE,TOWN,23894.8,YES,0,NO,NO,NO,NO,NO 54 | ID12153,43,MALE,TOWN,16259.7,YES,1,NO,YES,YES,NO,YES 55 | ID12154,48,MALE,SUBURBAN,29794.1,NO,1,NO,YES,YES,NO,YES 56 | ID12155,63,MALE,TOWN,56842.5,YES,0,NO,YES,YES,YES,NO 57 | ID12156,52,FEMALE,RURAL,47835.8,NO,3,NO,YES,NO,NO,YES 58 | ID12157,58,FEMALE,INNER_CITY,24977.5,NO,0,NO,NO,YES,NO,YES 59 | ID12158,28,MALE,INNER_CITY,23124.9,YES,0,NO,YES,YES,NO,YES 60 | ID12159,29,FEMALE,INNER_CITY,15143.8,YES,0,NO,NO,YES,NO,NO 61 | ID12160,34,MALE,INNER_CITY,25334.3,NO,1,YES,YES,YES,YES,YES 62 | ID12161,42,FEMALE,INNER_CITY,24763.3,YES,1,NO,YES,YES,YES,YES 63 | ID12162,65,FEMALE,INNER_CITY,36589,NO,1,YES,NO,YES,NO,YES 64 | ID12163,47,MALE,INNER_CITY,27022.6,YES,2,NO,YES,YES,NO,NO 65 | ID12164,20,MALE,INNER_CITY,11700.4,YES,0,NO,YES,NO,NO,NO 66 | ID12165,21,MALE,TOWN,5014.21,NO,0,YES,YES,YES,YES,NO 67 | ID12166,42,MALE,INNER_CITY,17390.1,YES,0,NO,YES,YES,NO,NO 68 | ID12167,19,MALE,TOWN,10861,NO,2,NO,YES,YES,NO,NO 69 | ID12168,41,FEMALE,TOWN,34892.9,NO,0,NO,YES,YES,YES,NO 70 | ID12169,30,MALE,TOWN,19403.1,NO,2,NO,YES,YES,NO,NO 71 | ID12170,31,FEMALE,RURAL,10441.9,YES,2,NO,NO,YES,NO,YES 72 | ID12171,25,MALE,INNER_CITY,14064.9,YES,3,YES,YES,YES,NO,NO 73 | ID12172,21,MALE,INNER_CITY,8062.73,NO,0,NO,NO,YES,NO,YES 74 | ID12173,36,MALE,INNER_CITY,31982,YES,3,YES,YES,YES,YES,YES 75 | ID12174,58,FEMALE,INNER_CITY,23197.5,YES,2,NO,YES,NO,YES,NO 76 | ID12175,64,FEMALE,INNER_CITY,52674,NO,2,YES,YES,YES,NO,YES 77 | ID12176,59,FEMALE,RURAL,35610.5,NO,2,YES,NO,NO,NO,YES 78 | ID12177,45,FEMALE,TOWN,26948,NO,0,NO,YES,YES,YES,NO 79 | ID12178,61,MALE,INNER_CITY,49456.7,YES,1,YES,YES,YES,YES,YES 80 | ID12179,30,FEMALE,INNER_CITY,14724.5,YES,0,YES,NO,YES,NO,NO 81 | ID12180,58,FEMALE,TOWN,34524.9,YES,2,YES,YES,NO,NO,YES 82 | ID12181,50,FEMALE,TOWN,22052.1,NO,3,NO,YES,YES,NO,YES 83 | ID12182,30,MALE,INNER_CITY,27808.1,NO,3,NO,NO,YES,NO,NO 84 | ID12183,29,FEMALE,INNER_CITY,12591.4,YES,1,NO,YES,YES,NO,NO 85 | ID12184,35,MALE,INNER_CITY,16394.4,YES,1,NO,YES,YES,NO,YES 86 | ID12185,62,MALE,INNER_CITY,24026.1,YES,0,NO,NO,YES,YES,YES 87 | ID12186,36,MALE,INNER_CITY,31683.1,YES,1,YES,YES,YES,NO,YES 88 | ID12187,25,FEMALE,INNER_CITY,15525,YES,0,NO,YES,YES,NO,NO 89 | ID12188,66,FEMALE,TOWN,22562.2,NO,0,YES,YES,YES,YES,NO 90 | ID12189,30,MALE,SUBURBAN,15848.7,YES,0,YES,YES,NO,YES,NO 91 | ID12190,54,FEMALE,INNER_CITY,31095.6,YES,2,NO,NO,YES,NO,YES 92 | ID12191,37,MALE,TOWN,24814.5,YES,1,YES,NO,YES,YES,YES 93 | ID12192,28,FEMALE,INNER_CITY,25429.3,NO,2,NO,YES,YES,YES,NO 94 | ID12193,53,FEMALE,RURAL,34866.5,NO,0,NO,NO,YES,NO,YES 95 | ID12194,61,MALE,INNER_CITY,42579.1,YES,2,YES,YES,YES,NO,NO 96 | ID12195,61,FEMALE,INNER_CITY,41127.4,YES,0,YES,YES,NO,NO,NO 97 | ID12196,18,FEMALE,INNER_CITY,9990.11,YES,0,NO,YES,YES,NO,NO 98 | ID12197,22,MALE,INNER_CITY,7948.62,YES,1,NO,NO,NO,YES,NO 99 | ID12198,34,MALE,TOWN,30870.8,YES,2,YES,YES,YES,YES,YES 100 | ID12199,35,FEMALE,INNER_CITY,12125.8,NO,2,NO,YES,YES,NO,NO 101 | ID12200,18,FEMALE,RURAL,15348.9,YES,0,YES,NO,YES,NO,NO 102 | ID12201,54,MALE,INNER_CITY,26707.9,YES,1,NO,YES,YES,YES,YES 103 | 
ID12202,27,FEMALE,INNER_CITY,11604.4,YES,2,YES,YES,YES,NO,NO 104 | ID12203,42,MALE,INNER_CITY,15499.9,YES,0,YES,NO,YES,YES,YES 105 | ID12204,43,MALE,TOWN,33088.5,NO,0,NO,YES,YES,YES,NO 106 | ID12205,64,FEMALE,INNER_CITY,34513.6,YES,1,NO,YES,YES,NO,YES 107 | ID12206,43,MALE,TOWN,32395.5,YES,3,YES,YES,YES,NO,NO 108 | ID12207,49,MALE,RURAL,46633,YES,0,YES,YES,NO,NO,NO 109 | ID12208,23,MALE,INNER_CITY,13039.9,YES,0,NO,NO,YES,NO,NO 110 | ID12209,23,MALE,INNER_CITY,12681.9,NO,0,NO,YES,YES,NO,YES 111 | ID12210,30,FEMALE,INNER_CITY,24031.5,YES,2,YES,YES,YES,YES,NO 112 | ID12211,36,MALE,TOWN,37330.5,NO,2,NO,YES,YES,NO,YES 113 | ID12212,34,MALE,INNER_CITY,25333.2,YES,3,YES,NO,NO,YES,NO 114 | ID12213,51,FEMALE,INNER_CITY,37094.2,YES,0,YES,NO,YES,NO,NO 115 | ID12214,36,MALE,TOWN,33630.6,NO,2,YES,YES,YES,NO,YES 116 | ID12215,56,MALE,INNER_CITY,43228.2,YES,1,YES,YES,YES,NO,YES 117 | ID12216,54,FEMALE,INNER_CITY,47796.8,YES,0,NO,YES,YES,NO,NO 118 | ID12217,56,FEMALE,TOWN,21730.3,YES,2,NO,YES,NO,NO,NO 119 | ID12218,26,MALE,INNER_CITY,10044.1,YES,3,NO,YES,YES,YES,NO 120 | ID12219,39,MALE,TOWN,17270.1,NO,0,YES,NO,NO,NO,YES 121 | ID12220,64,FEMALE,RURAL,45765,YES,3,YES,YES,YES,NO,YES 122 | ID12221,46,MALE,RURAL,29525.5,NO,2,NO,YES,NO,YES,NO 123 | ID12222,62,FEMALE,RURAL,54863.8,YES,1,YES,YES,YES,NO,YES 124 | ID12223,36,FEMALE,TOWN,20799,YES,1,NO,YES,NO,YES,YES 125 | ID12224,35,FEMALE,RURAL,33028.3,NO,1,NO,YES,YES,NO,YES 126 | ID12225,47,MALE,RURAL,45031.9,NO,3,YES,YES,NO,NO,YES 127 | ID12226,47,MALE,INNER_CITY,39010.8,YES,2,NO,NO,YES,NO,YES 128 | ID12227,37,FEMALE,TOWN,25257.7,YES,0,YES,YES,YES,NO,NO 129 | ID12228,48,FEMALE,INNER_CITY,42603.9,YES,0,NO,YES,YES,NO,NO 130 | ID12229,41,MALE,TOWN,14092.7,YES,3,YES,YES,YES,NO,NO 131 | ID12230,27,FEMALE,RURAL,21350.3,NO,0,YES,YES,YES,NO,YES 132 | ID12231,43,MALE,INNER_CITY,23246.4,NO,2,NO,NO,YES,NO,NO 133 | ID12232,61,MALE,RURAL,41609.5,YES,3,NO,YES,YES,YES,NO 134 | ID12233,52,FEMALE,SUBURBAN,16716.1,NO,2,YES,YES,YES,NO,NO 135 | ID12234,64,FEMALE,SUBURBAN,36436.4,YES,0,YES,NO,YES,NO,NO 136 | ID12235,66,FEMALE,TOWN,59503.8,YES,2,YES,YES,YES,YES,YES 137 | ID12236,53,FEMALE,TOWN,31334.8,YES,1,YES,YES,YES,NO,YES 138 | ID12237,20,FEMALE,INNER_CITY,14048.9,YES,2,YES,NO,YES,YES,NO 139 | ID12238,57,FEMALE,INNER_CITY,39205.3,NO,0,NO,YES,YES,NO,YES 140 | ID12239,65,FEMALE,RURAL,42173.9,YES,0,NO,YES,YES,NO,NO 141 | ID12240,64,FEMALE,INNER_CITY,55263,NO,1,NO,YES,YES,NO,YES 142 | ID12241,52,MALE,INNER_CITY,37095.2,YES,3,NO,YES,YES,NO,NO 143 | ID12242,47,FEMALE,INNER_CITY,22791.4,YES,0,NO,NO,YES,NO,NO 144 | ID12243,28,FEMALE,TOWN,17240.6,YES,1,YES,NO,NO,YES,NO 145 | ID12244,64,MALE,TOWN,48974.8,YES,0,YES,YES,YES,YES,NO 146 | ID12245,25,MALE,INNER_CITY,18923,YES,1,NO,YES,YES,NO,YES 147 | ID12246,58,MALE,SUBURBAN,51204.2,YES,0,NO,YES,NO,YES,YES 148 | ID12247,34,MALE,TOWN,20236.2,YES,2,YES,NO,YES,NO,NO 149 | ID12248,20,FEMALE,INNER_CITY,18860.3,NO,2,NO,YES,YES,NO,NO 150 | ID12249,63,MALE,RURAL,25732.5,YES,0,YES,YES,NO,NO,NO 151 | ID12250,30,FEMALE,SUBURBAN,28240.4,YES,3,YES,YES,YES,NO,NO 152 | ID12251,53,MALE,RURAL,28193.6,YES,1,YES,YES,YES,NO,YES 153 | ID12252,43,MALE,TOWN,36432.8,NO,2,YES,NO,YES,NO,YES 154 | ID12253,63,MALE,TOWN,54618.8,YES,2,NO,YES,NO,YES,YES 155 | ID12254,33,MALE,INNER_CITY,24760.8,YES,1,NO,YES,YES,YES,YES 156 | ID12255,41,MALE,RURAL,23356.1,NO,2,YES,NO,NO,NO,NO 157 | ID12256,20,FEMALE,SUBURBAN,8143.75,YES,2,NO,YES,YES,NO,YES 158 | ID12257,50,MALE,TOWN,26462.5,YES,0,YES,YES,YES,NO,NO 159 | ID12258,24,MALE,RURAL,20467.3,YES,2,YES,YES,YES,NO,NO 160 | 
ID12259,60,FEMALE,TOWN,21506.2,YES,0,NO,NO,YES,NO,YES 161 | ID12260,44,FEMALE,TOWN,15315.3,YES,1,NO,YES,YES,NO,YES 162 | ID12261,23,MALE,INNER_CITY,18875.7,YES,2,YES,YES,YES,NO,NO 163 | ID12262,40,FEMALE,INNER_CITY,12977.2,YES,0,NO,NO,NO,NO,NO 164 | ID12263,49,FEMALE,TOWN,20708.5,NO,3,NO,NO,YES,NO,NO 165 | ID12264,21,FEMALE,TOWN,7549.38,NO,1,YES,NO,YES,NO,NO 166 | ID12265,40,FEMALE,INNER_CITY,24904,YES,0,NO,NO,NO,NO,NO 167 | ID12266,26,MALE,RURAL,24071.8,YES,1,NO,NO,YES,YES,YES 168 | ID12267,20,MALE,TOWN,9589.91,NO,1,NO,YES,YES,YES,NO 169 | ID12268,24,MALE,INNER_CITY,8562.86,NO,1,NO,NO,NO,YES,NO 170 | ID12269,37,FEMALE,TOWN,26707.5,NO,0,YES,YES,YES,NO,NO 171 | ID12270,56,MALE,INNER_CITY,34020.5,YES,0,NO,NO,YES,NO,NO 172 | ID12271,52,MALE,INNER_CITY,49175.7,YES,0,YES,YES,YES,NO,NO 173 | ID12272,22,MALE,INNER_CITY,19726.3,YES,2,NO,NO,YES,NO,NO 174 | ID12273,35,MALE,INNER_CITY,24346.6,YES,3,YES,YES,YES,NO,NO 175 | ID12274,34,MALE,RURAL,26999.4,YES,1,YES,YES,YES,NO,YES 176 | ID12275,67,FEMALE,TOWN,41558.1,YES,2,NO,YES,YES,NO,YES 177 | ID12276,58,FEMALE,INNER_CITY,56340.3,NO,0,NO,YES,YES,NO,YES 178 | ID12277,40,MALE,TOWN,37558.5,YES,0,YES,YES,YES,NO,NO 179 | ID12278,41,FEMALE,INNER_CITY,30099.3,YES,0,YES,YES,YES,YES,YES 180 | ID12279,43,MALE,INNER_CITY,15254.8,YES,0,NO,NO,NO,NO,YES 181 | ID12280,63,MALE,INNER_CITY,36086.1,YES,2,NO,NO,YES,NO,YES 182 | ID12281,22,FEMALE,INNER_CITY,17655,YES,1,NO,YES,YES,YES,YES 183 | ID12282,60,MALE,RURAL,56658.9,NO,0,NO,YES,YES,NO,YES 184 | ID12283,65,FEMALE,INNER_CITY,37706.5,NO,0,YES,YES,YES,YES,NO 185 | ID12284,48,FEMALE,INNER_CITY,18516,YES,2,YES,YES,YES,YES,NO 186 | ID12285,38,FEMALE,INNER_CITY,29622,NO,0,YES,YES,NO,YES,NO 187 | ID12286,49,MALE,RURAL,32669.9,YES,1,YES,YES,YES,YES,YES 188 | ID12287,20,FEMALE,INNER_CITY,18275.5,YES,0,NO,NO,NO,YES,YES 189 | ID12288,48,FEMALE,TOWN,34410,YES,1,NO,NO,YES,NO,YES 190 | ID12289,38,MALE,INNER_CITY,34866.9,YES,2,YES,NO,YES,YES,NO 191 | ID12290,41,FEMALE,INNER_CITY,21796.6,YES,0,NO,NO,YES,NO,NO 192 | ID12291,67,FEMALE,SUBURBAN,63130.1,YES,2,YES,YES,YES,NO,YES 193 | ID12292,39,MALE,INNER_CITY,14996.4,YES,1,YES,YES,NO,NO,NO 194 | ID12293,64,FEMALE,RURAL,49024.9,YES,3,NO,YES,YES,NO,YES 195 | ID12294,41,MALE,INNER_CITY,16249.8,YES,0,YES,NO,YES,YES,YES 196 | ID12295,55,MALE,SUBURBAN,36192.1,YES,2,YES,NO,YES,NO,YES 197 | ID12296,52,MALE,INNER_CITY,17839.9,YES,1,NO,YES,NO,YES,YES 198 | ID12297,30,FEMALE,INNER_CITY,18802.4,NO,0,NO,NO,YES,YES,YES 199 | ID12298,52,MALE,INNER_CITY,48720.3,YES,2,YES,YES,YES,NO,YES 200 | ID12299,26,MALE,INNER_CITY,14585.9,NO,2,NO,NO,YES,YES,NO 201 | ID12300,26,FEMALE,INNER_CITY,20819,YES,0,NO,YES,YES,NO,YES 202 | ID12301,46,MALE,TOWN,26077.8,YES,1,YES,YES,YES,NO,YES 203 | ID12302,46,FEMALE,TOWN,41627.1,YES,0,NO,YES,YES,YES,NO 204 | ID12303,52,MALE,INNER_CITY,16977.3,YES,0,NO,NO,YES,NO,NO 205 | ID12304,37,MALE,INNER_CITY,19012.8,NO,3,YES,NO,YES,NO,NO 206 | ID12305,22,MALE,INNER_CITY,12764.8,YES,1,NO,YES,YES,YES,YES 207 | ID12306,18,MALE,INNER_CITY,14388.6,NO,0,YES,NO,YES,YES,YES 208 | ID12307,63,MALE,INNER_CITY,59409.1,NO,0,YES,YES,YES,NO,YES 209 | ID12308,25,FEMALE,INNER_CITY,14960.2,YES,0,NO,YES,YES,YES,NO 210 | ID12309,67,MALE,INNER_CITY,39666.6,YES,0,YES,YES,YES,NO,NO 211 | ID12310,27,MALE,INNER_CITY,20771.9,NO,0,NO,YES,NO,NO,YES 212 | ID12311,61,MALE,INNER_CITY,24474.1,NO,0,YES,NO,NO,NO,YES 213 | ID12312,58,MALE,TOWN,33123.7,YES,1,YES,YES,YES,NO,YES 214 | ID12313,22,MALE,INNER_CITY,14433.4,YES,0,YES,YES,YES,NO,YES 215 | ID12314,28,MALE,TOWN,13175.5,NO,1,NO,NO,YES,NO,YES 216 | 
ID12315,23,MALE,INNER_CITY,9824.37,YES,0,NO,YES,YES,NO,NO 217 | ID12316,27,MALE,SUBURBAN,17610.3,YES,0,NO,YES,YES,NO,NO 218 | ID12317,27,FEMALE,SUBURBAN,15156.2,YES,0,YES,NO,YES,NO,NO 219 | ID12318,40,FEMALE,INNER_CITY,31774.1,YES,3,YES,YES,YES,YES,NO 220 | ID12319,39,MALE,TOWN,31693.5,NO,0,YES,YES,NO,NO,YES 221 | ID12320,35,FEMALE,INNER_CITY,28598.7,YES,0,YES,YES,YES,NO,NO 222 | ID12321,37,FEMALE,INNER_CITY,26261.7,NO,2,YES,NO,NO,NO,NO 223 | ID12322,47,MALE,INNER_CITY,42124.1,YES,2,NO,YES,YES,NO,YES 224 | ID12323,42,FEMALE,INNER_CITY,39308.7,YES,1,NO,YES,YES,NO,YES 225 | ID12324,67,FEMALE,INNER_CITY,43530,YES,0,NO,YES,YES,YES,NO 226 | ID12325,57,MALE,RURAL,49874.4,YES,3,NO,YES,YES,NO,YES 227 | ID12326,47,FEMALE,RURAL,27434.8,NO,0,YES,YES,YES,NO,YES 228 | ID12327,67,MALE,INNER_CITY,50474.6,YES,2,YES,YES,YES,NO,YES 229 | ID12328,56,MALE,TOWN,24888.2,NO,0,NO,YES,YES,NO,YES 230 | ID12329,37,MALE,RURAL,28021.6,NO,0,YES,NO,YES,YES,YES 231 | ID12330,27,MALE,INNER_CITY,12279.5,NO,0,YES,NO,NO,NO,YES 232 | ID12331,59,FEMALE,INNER_CITY,30189.4,YES,0,YES,NO,YES,YES,YES 233 | ID12332,31,MALE,INNER_CITY,28969.4,NO,1,NO,NO,YES,YES,NO 234 | ID12333,31,MALE,SUBURBAN,14058.5,YES,0,NO,NO,YES,YES,YES 235 | ID12334,32,FEMALE,TOWN,30404.3,YES,0,YES,YES,YES,NO,NO 236 | ID12335,57,FEMALE,RURAL,41438.2,NO,3,NO,YES,NO,NO,NO 237 | ID12336,49,FEMALE,INNER_CITY,16711.3,NO,1,NO,YES,YES,YES,YES 238 | ID12337,65,MALE,TOWN,52255.9,NO,2,YES,YES,YES,NO,YES 239 | ID12338,22,FEMALE,INNER_CITY,17866.9,YES,2,NO,YES,YES,YES,NO 240 | ID12339,26,FEMALE,RURAL,18067.5,YES,2,NO,YES,NO,NO,NO 241 | ID12340,23,MALE,INNER_CITY,12823.7,YES,0,YES,YES,YES,NO,NO 242 | ID12341,26,FEMALE,RURAL,11299.3,YES,2,YES,NO,NO,NO,NO 243 | ID12342,59,FEMALE,INNER_CITY,56031.1,NO,0,YES,YES,YES,NO,YES 244 | ID12343,67,MALE,INNER_CITY,35263.5,YES,1,YES,YES,YES,NO,YES 245 | ID12344,34,FEMALE,INNER_CITY,19968.1,YES,0,YES,YES,YES,NO,NO 246 | ID12345,50,MALE,RURAL,27825.5,YES,2,YES,YES,NO,NO,NO 247 | ID12346,46,MALE,SUBURBAN,37773.9,NO,0,YES,YES,YES,YES,NO 248 | ID12347,23,FEMALE,INNER_CITY,7606.25,NO,3,YES,NO,NO,NO,NO 249 | ID12348,26,MALE,RURAL,21384.4,YES,0,NO,NO,YES,NO,NO 250 | ID12349,40,MALE,TOWN,20347,YES,3,NO,YES,YES,NO,NO 251 | ID12350,36,MALE,TOWN,21332.3,YES,3,YES,NO,YES,NO,NO 252 | ID12351,65,MALE,INNER_CITY,57671.7,NO,0,NO,YES,YES,YES,NO 253 | ID12352,45,FEMALE,TOWN,36057.8,YES,1,YES,YES,YES,YES,YES 254 | ID12353,23,MALE,INNER_CITY,14290.5,YES,2,NO,NO,YES,YES,NO 255 | ID12354,42,FEMALE,TOWN,17882.9,YES,1,NO,NO,NO,NO,YES 256 | ID12355,21,FEMALE,RURAL,10629.1,NO,3,NO,YES,YES,YES,NO 257 | ID12356,62,FEMALE,INNER_CITY,24262.8,NO,0,YES,NO,YES,NO,YES 258 | ID12357,49,FEMALE,SUBURBAN,26097.9,NO,2,NO,YES,YES,NO,NO 259 | ID12358,28,FEMALE,TOWN,23371,YES,2,NO,NO,YES,YES,NO 260 | ID12359,38,FEMALE,TOWN,21495.6,NO,3,NO,NO,YES,YES,NO 261 | ID12360,36,MALE,TOWN,12166.9,NO,0,NO,NO,NO,NO,NO 262 | ID12361,22,MALE,SUBURBAN,17180.2,YES,0,NO,YES,YES,NO,NO 263 | ID12362,40,FEMALE,TOWN,28882.3,YES,1,NO,YES,YES,YES,YES 264 | ID12363,40,FEMALE,TOWN,21612.2,YES,0,NO,NO,YES,NO,NO 265 | ID12364,60,FEMALE,INNER_CITY,46358.4,YES,0,YES,YES,YES,YES,NO 266 | ID12365,23,MALE,INNER_CITY,19166,NO,0,YES,YES,NO,NO,YES 267 | ID12366,21,MALE,INNER_CITY,17921.8,YES,1,YES,NO,NO,NO,YES 268 | ID12367,58,MALE,TOWN,33229,NO,0,YES,YES,YES,NO,YES 269 | ID12368,48,FEMALE,SUBURBAN,30396.1,NO,0,NO,YES,YES,NO,YES 270 | ID12369,63,FEMALE,TOWN,34625.2,YES,0,YES,NO,YES,NO,NO 271 | ID12370,20,MALE,TOWN,16672.8,NO,3,YES,NO,YES,YES,NO 272 | ID12371,67,FEMALE,SUBURBAN,60747.5,NO,2,NO,YES,YES,YES,YES 273 
| ID12372,62,FEMALE,INNER_CITY,56394.3,NO,0,YES,YES,YES,NO,YES 274 | ID12373,36,MALE,TOWN,13236.4,YES,0,NO,YES,YES,NO,NO 275 | ID12374,31,MALE,INNER_CITY,28409.4,YES,1,YES,NO,YES,NO,YES 276 | ID12375,42,FEMALE,INNER_CITY,27056.5,YES,0,YES,YES,YES,NO,NO 277 | ID12376,18,MALE,RURAL,9362.58,YES,0,YES,NO,YES,YES,YES 278 | ID12377,46,FEMALE,SUBURBAN,28702.7,NO,1,NO,YES,YES,YES,YES 279 | ID12378,25,MALE,TOWN,22366.1,YES,1,YES,YES,YES,NO,YES 280 | ID12379,65,FEMALE,RURAL,24477.5,NO,0,NO,YES,NO,YES,NO 281 | ID12380,40,MALE,TOWN,36972.4,YES,1,NO,NO,YES,YES,YES 282 | ID12381,32,MALE,INNER_CITY,22327.8,YES,1,NO,NO,NO,NO,YES 283 | ID12382,18,FEMALE,INNER_CITY,15610.2,YES,0,NO,YES,YES,NO,NO 284 | ID12383,64,MALE,INNER_CITY,54314.5,YES,1,YES,YES,NO,NO,YES 285 | ID12384,43,FEMALE,INNER_CITY,39175.8,YES,3,NO,YES,YES,YES,NO 286 | ID12385,22,FEMALE,INNER_CITY,13739,YES,3,NO,YES,YES,NO,NO 287 | ID12386,25,MALE,TOWN,9485.84,YES,0,NO,NO,NO,NO,NO 288 | ID12387,39,MALE,INNER_CITY,24675.7,YES,1,YES,YES,YES,YES,YES 289 | ID12388,58,FEMALE,INNER_CITY,28253.6,YES,3,NO,NO,YES,NO,NO 290 | ID12389,33,MALE,INNER_CITY,14136.5,YES,1,NO,YES,NO,NO,NO 291 | ID12390,52,FEMALE,RURAL,37162.1,YES,1,YES,YES,YES,NO,YES 292 | ID12391,23,MALE,INNER_CITY,13519.2,NO,0,NO,YES,YES,NO,YES 293 | ID12392,44,FEMALE,INNER_CITY,39253.6,NO,0,YES,YES,NO,NO,YES 294 | ID12393,51,MALE,RURAL,46323.8,YES,2,YES,YES,YES,YES,YES 295 | ID12394,26,FEMALE,TOWN,20950.7,YES,0,NO,YES,YES,NO,NO 296 | ID12395,42,MALE,TOWN,22495.7,YES,0,NO,YES,NO,YES,NO 297 | ID12396,34,MALE,TOWN,32548.9,YES,0,YES,YES,YES,YES,NO 298 | ID12397,54,FEMALE,RURAL,24583.4,NO,2,YES,YES,YES,YES,NO 299 | ID12398,18,MALE,RURAL,8639.24,YES,2,NO,NO,NO,NO,NO 300 | ID12399,47,FEMALE,INNER_CITY,17139.5,NO,2,YES,NO,YES,NO,NO 301 | ID12400,24,FEMALE,INNER_CITY,13667.7,YES,0,NO,YES,YES,NO,NO 302 | ID12401,19,FEMALE,INNER_CITY,8162.42,YES,1,YES,YES,YES,YES,NO 303 | ID12402,37,FEMALE,TOWN,15349.6,YES,0,NO,YES,NO,NO,NO 304 | ID12403,45,FEMALE,TOWN,29231.4,YES,0,NO,YES,NO,NO,NO 305 | ID12404,49,MALE,RURAL,41462.3,YES,3,NO,YES,YES,YES,NO 306 | ID12405,67,FEMALE,RURAL,57398.1,NO,3,NO,YES,YES,NO,YES 307 | ID12406,35,FEMALE,RURAL,11520.8,YES,0,NO,NO,YES,NO,NO 308 | ID12407,63,MALE,INNER_CITY,52117.3,NO,2,YES,YES,YES,NO,YES 309 | ID12408,38,MALE,RURAL,26281.4,NO,0,YES,YES,YES,NO,YES 310 | ID12409,48,MALE,TOWN,25683.4,NO,2,YES,YES,YES,NO,NO 311 | ID12410,28,FEMALE,INNER_CITY,11920.7,NO,1,NO,YES,YES,NO,NO 312 | ID12411,46,MALE,TOWN,30658.7,YES,0,NO,YES,YES,NO,NO 313 | ID12412,66,MALE,INNER_CITY,36646.4,NO,1,NO,YES,YES,NO,YES 314 | ID12413,61,FEMALE,TOWN,30760.4,NO,2,YES,NO,YES,NO,YES 315 | ID12414,18,FEMALE,RURAL,16109.9,NO,2,YES,YES,YES,NO,NO 316 | ID12415,54,FEMALE,TOWN,18036.7,YES,0,YES,YES,NO,NO,NO 317 | ID12416,45,MALE,RURAL,42628.3,NO,0,YES,YES,YES,YES,NO 318 | ID12417,60,MALE,INNER_CITY,22110.1,NO,2,YES,YES,YES,NO,NO 319 | ID12418,45,FEMALE,TOWN,37689.1,NO,1,NO,YES,YES,YES,YES 320 | ID12419,31,FEMALE,INNER_CITY,23171.8,NO,2,NO,YES,NO,NO,NO 321 | ID12420,39,MALE,SUBURBAN,21951.3,NO,0,YES,YES,YES,YES,NO 322 | ID12421,53,MALE,INNER_CITY,38103.4,NO,2,YES,YES,YES,NO,YES 323 | ID12422,35,MALE,RURAL,22882.9,YES,0,YES,NO,NO,YES,YES 324 | ID12423,25,FEMALE,TOWN,11043.7,YES,1,YES,YES,NO,YES,NO 325 | ID12424,32,MALE,TOWN,24027.6,NO,0,NO,YES,YES,NO,YES 326 | ID12425,36,MALE,SUBURBAN,28495.1,YES,0,NO,YES,YES,NO,NO 327 | ID12426,24,FEMALE,TOWN,9465.21,NO,0,NO,NO,YES,NO,YES 328 | ID12427,39,MALE,INNER_CITY,34852.3,YES,1,NO,YES,NO,YES,YES 329 | ID12428,24,MALE,INNER_CITY,21268.4,YES,0,NO,NO,YES,YES,YES 330 | 
ID12429,57,FEMALE,RURAL,50849.2,NO,1,NO,YES,NO,YES,YES 331 | ID12430,27,FEMALE,TOWN,18555.9,YES,3,NO,NO,YES,NO,NO 332 | ID12431,66,FEMALE,RURAL,52769.9,YES,3,YES,YES,YES,NO,YES 333 | ID12432,18,FEMALE,INNER_CITY,11601.4,YES,2,NO,YES,YES,YES,NO 334 | ID12433,33,FEMALE,TOWN,29541.7,YES,0,NO,YES,NO,NO,NO 335 | ID12434,48,MALE,SUBURBAN,17861,YES,2,NO,YES,YES,NO,NO 336 | ID12435,23,FEMALE,SUBURBAN,21042,YES,1,NO,NO,YES,NO,YES 337 | ID12436,44,MALE,TOWN,26688.1,YES,1,NO,YES,NO,YES,YES 338 | ID12437,39,MALE,INNER_CITY,26900.6,YES,2,NO,NO,NO,YES,NO 339 | ID12438,65,MALE,INNER_CITY,38080.9,YES,1,YES,YES,NO,YES,NO 340 | ID12439,60,MALE,SUBURBAN,37554.1,YES,2,YES,YES,YES,YES,YES 341 | ID12440,20,MALE,INNER_CITY,18184.6,YES,0,NO,NO,YES,YES,YES 342 | ID12441,45,FEMALE,SUBURBAN,28864.9,YES,0,NO,YES,YES,NO,NO 343 | ID12442,66,MALE,RURAL,48346.1,YES,1,YES,YES,NO,NO,YES 344 | ID12443,64,MALE,INNER_CITY,53104.3,YES,0,YES,YES,YES,NO,NO 345 | ID12444,51,FEMALE,INNER_CITY,19416.8,YES,0,NO,YES,YES,NO,NO 346 | ID12445,34,MALE,RURAL,23638.1,YES,2,YES,YES,YES,NO,NO 347 | ID12446,65,FEMALE,TOWN,42378.2,YES,1,NO,YES,YES,NO,YES 348 | ID12447,50,MALE,RURAL,39745.3,YES,0,YES,YES,NO,YES,NO 349 | ID12448,66,FEMALE,INNER_CITY,45189.8,YES,0,NO,YES,NO,NO,NO 350 | ID12449,63,MALE,INNER_CITY,37930.9,YES,0,NO,YES,NO,YES,NO 351 | ID12450,53,FEMALE,INNER_CITY,24042,NO,1,NO,NO,YES,NO,YES 352 | ID12451,33,FEMALE,INNER_CITY,31207.1,NO,0,YES,NO,YES,YES,YES 353 | ID12452,38,MALE,TOWN,24424.3,YES,0,NO,YES,YES,YES,NO 354 | ID12453,56,FEMALE,RURAL,24607.8,NO,1,NO,YES,YES,YES,NO 355 | ID12454,48,MALE,TOWN,43057,NO,1,YES,YES,YES,NO,YES 356 | ID12455,49,FEMALE,SUBURBAN,30198.5,YES,0,YES,NO,YES,YES,YES 357 | ID12456,54,FEMALE,RURAL,50186.1,YES,2,NO,YES,YES,YES,YES 358 | ID12457,41,FEMALE,SUBURBAN,22916.1,YES,0,NO,NO,YES,NO,NO 359 | ID12458,19,MALE,RURAL,9592.73,NO,0,NO,YES,YES,NO,YES 360 | ID12459,52,FEMALE,INNER_CITY,34253.6,NO,3,YES,NO,YES,NO,NO 361 | ID12460,52,FEMALE,INNER_CITY,22792.3,YES,1,YES,YES,NO,NO,YES 362 | ID12461,64,FEMALE,TOWN,51620.8,NO,2,YES,YES,YES,NO,YES 363 | ID12462,56,MALE,TOWN,19918.9,YES,3,NO,YES,YES,YES,NO 364 | ID12463,56,MALE,TOWN,29625.1,YES,2,NO,NO,YES,NO,NO 365 | ID12464,19,FEMALE,TOWN,12549,YES,0,NO,YES,YES,NO,NO 366 | ID12465,56,FEMALE,TOWN,51299.3,YES,1,YES,YES,NO,NO,YES 367 | ID12466,27,MALE,INNER_CITY,17364.8,YES,2,YES,YES,NO,NO,YES 368 | ID12467,59,FEMALE,INNER_CITY,29866.9,NO,1,YES,NO,YES,NO,YES 369 | ID12468,56,MALE,INNER_CITY,47750.2,YES,0,YES,YES,YES,YES,NO 370 | ID12469,21,MALE,TOWN,11281.5,YES,0,NO,YES,YES,YES,NO 371 | ID12470,64,MALE,INNER_CITY,34073.8,YES,3,NO,YES,NO,YES,NO 372 | ID12471,62,MALE,INNER_CITY,46870.4,YES,0,NO,YES,YES,NO,NO 373 | ID12472,44,FEMALE,INNER_CITY,38453.7,NO,2,NO,YES,YES,NO,YES 374 | ID12473,24,FEMALE,INNER_CITY,7756.36,NO,0,NO,NO,NO,NO,YES 375 | ID12474,52,FEMALE,INNER_CITY,28413.8,YES,0,NO,NO,YES,NO,NO 376 | ID12475,67,FEMALE,SUBURBAN,47198.6,YES,2,NO,YES,YES,NO,YES 377 | ID12476,41,MALE,INNER_CITY,20866.3,YES,0,YES,YES,YES,NO,NO 378 | ID12477,58,FEMALE,TOWN,33204.3,NO,1,NO,NO,NO,YES,YES 379 | ID12478,40,MALE,INNER_CITY,24823.5,NO,2,NO,NO,YES,NO,NO 380 | ID12479,19,MALE,SUBURBAN,17986.8,YES,0,NO,NO,YES,YES,YES 381 | ID12480,20,FEMALE,INNER_CITY,9909.82,YES,3,NO,NO,YES,NO,NO 382 | ID12481,56,FEMALE,TOWN,26542.8,NO,0,YES,YES,YES,NO,YES 383 | ID12482,46,FEMALE,INNER_CITY,32583.5,YES,2,YES,YES,YES,NO,NO 384 | ID12483,30,MALE,INNER_CITY,14606.6,YES,1,NO,NO,YES,NO,NO 385 | ID12484,40,FEMALE,TOWN,34836.8,YES,1,YES,YES,YES,NO,YES 386 | 
ID12485,36,FEMALE,TOWN,26920.8,YES,0,NO,NO,YES,NO,NO 387 | ID12486,57,FEMALE,INNER_CITY,38248.3,NO,3,YES,NO,YES,YES,NO 388 | ID12487,49,MALE,INNER_CITY,15689.1,NO,0,NO,NO,YES,NO,NO 389 | ID12488,61,MALE,INNER_CITY,30157.7,NO,1,YES,NO,NO,NO,YES 390 | ID12489,29,MALE,INNER_CITY,14642.2,NO,0,YES,NO,YES,NO,YES 391 | ID12490,48,FEMALE,INNER_CITY,15933.3,YES,0,NO,YES,NO,NO,NO 392 | ID12491,56,FEMALE,INNER_CITY,44288.3,YES,0,NO,YES,YES,NO,NO 393 | ID12492,40,MALE,TOWN,22197.1,NO,0,YES,NO,YES,NO,YES 394 | ID12493,58,FEMALE,TOWN,38248.3,YES,3,YES,YES,NO,NO,NO 395 | ID12494,60,FEMALE,INNER_CITY,22053.2,YES,2,YES,YES,NO,YES,NO 396 | ID12495,58,FEMALE,SUBURBAN,25468.5,NO,0,YES,NO,NO,YES,YES 397 | ID12496,67,FEMALE,TOWN,23485.9,YES,3,YES,NO,NO,YES,NO 398 | ID12497,40,MALE,RURAL,25768.6,YES,0,YES,YES,NO,NO,NO 399 | ID12498,48,MALE,INNER_CITY,34182.2,YES,2,NO,YES,YES,NO,YES 400 | ID12499,64,FEMALE,INNER_CITY,57444.5,NO,1,NO,YES,YES,NO,YES 401 | ID12500,43,MALE,TOWN,38059.8,YES,0,YES,YES,NO,YES,NO 402 | ID12501,34,FEMALE,RURAL,19481.3,NO,0,NO,NO,YES,YES,YES 403 | ID12502,26,MALE,RURAL,19563.8,NO,3,YES,NO,NO,YES,NO 404 | ID12503,48,MALE,INNER_CITY,38598.4,YES,0,YES,NO,YES,NO,NO 405 | ID12504,35,MALE,INNER_CITY,20754.3,NO,0,NO,NO,NO,YES,YES 406 | ID12505,24,FEMALE,INNER_CITY,13864.6,YES,3,NO,YES,YES,NO,YES 407 | ID12506,47,MALE,SUBURBAN,36599,YES,3,YES,YES,YES,NO,YES 408 | ID12507,52,MALE,SUBURBAN,45856.1,NO,1,NO,YES,YES,NO,YES 409 | ID12508,31,MALE,SUBURBAN,22362.3,NO,0,NO,YES,YES,NO,YES 410 | ID12509,41,FEMALE,SUBURBAN,21984,YES,1,YES,NO,NO,NO,YES 411 | ID12510,23,MALE,SUBURBAN,11073,YES,2,NO,YES,NO,NO,NO 412 | ID12511,27,FEMALE,INNER_CITY,18158.5,NO,1,YES,NO,YES,YES,YES 413 | ID12512,22,MALE,INNER_CITY,7304.2,NO,0,YES,YES,YES,YES,NO 414 | ID12513,67,FEMALE,INNER_CITY,58092,NO,2,YES,YES,YES,NO,YES 415 | ID12514,26,FEMALE,INNER_CITY,16518.6,YES,0,YES,NO,YES,NO,YES 416 | ID12515,58,FEMALE,SUBURBAN,46461.5,YES,0,NO,YES,YES,NO,NO 417 | ID12516,27,MALE,RURAL,20058.7,YES,0,NO,YES,YES,YES,NO 418 | ID12517,36,MALE,INNER_CITY,12533.2,NO,1,NO,YES,NO,YES,NO 419 | ID12518,31,MALE,RURAL,22848.5,YES,1,YES,NO,NO,NO,YES 420 | ID12519,28,FEMALE,TOWN,25699.4,YES,2,YES,YES,NO,NO,YES 421 | ID12520,57,MALE,TOWN,21612.6,YES,0,NO,YES,YES,YES,NO 422 | ID12521,64,MALE,INNER_CITY,48950.9,YES,0,YES,YES,YES,YES,NO 423 | ID12522,49,MALE,INNER_CITY,41438,NO,1,NO,YES,NO,NO,YES 424 | ID12523,22,MALE,INNER_CITY,11411,YES,1,YES,NO,YES,NO,NO 425 | ID12524,58,FEMALE,INNER_CITY,43940.6,YES,0,YES,YES,YES,NO,NO 426 | ID12525,20,MALE,TOWN,17239.5,NO,1,YES,NO,YES,NO,YES 427 | ID12526,44,FEMALE,RURAL,30488.7,YES,0,NO,YES,YES,NO,NO 428 | ID12527,65,FEMALE,TOWN,29866.3,YES,3,NO,YES,NO,NO,NO 429 | ID12528,53,FEMALE,INNER_CITY,32184.4,YES,2,YES,NO,NO,NO,YES 430 | ID12529,34,FEMALE,INNER_CITY,17308.7,YES,1,YES,NO,YES,YES,YES 431 | ID12530,35,MALE,RURAL,27863.9,NO,2,NO,YES,YES,YES,NO 432 | ID12531,48,MALE,TOWN,28920.6,YES,0,NO,YES,NO,YES,NO 433 | ID12532,64,FEMALE,TOWN,58367.3,YES,1,YES,YES,YES,NO,YES 434 | ID12533,46,MALE,TOWN,16849.3,NO,0,YES,YES,YES,YES,NO 435 | ID12534,42,FEMALE,RURAL,28138.5,YES,0,NO,NO,YES,YES,YES 436 | ID12535,47,MALE,INNER_CITY,23038.2,YES,0,YES,YES,NO,NO,NO 437 | ID12536,23,MALE,TOWN,11736.9,NO,2,YES,YES,YES,YES,NO 438 | ID12537,35,MALE,INNER_CITY,16479.5,YES,0,YES,NO,NO,YES,YES 439 | ID12538,64,FEMALE,INNER_CITY,31415.7,NO,1,YES,YES,NO,YES,YES 440 | ID12539,18,FEMALE,SUBURBAN,12117.3,YES,1,NO,NO,YES,NO,NO 441 | ID12540,19,MALE,INNER_CITY,15417.1,YES,1,YES,NO,YES,NO,NO 442 | 
ID12541,40,MALE,INNER_CITY,29414.6,YES,1,NO,YES,NO,YES,YES 443 | ID12542,47,MALE,INNER_CITY,44682.1,YES,0,YES,YES,YES,NO,NO 444 | ID12543,43,FEMALE,RURAL,36281,YES,0,YES,YES,YES,NO,NO 445 | ID12544,38,MALE,TOWN,33302.8,NO,0,YES,NO,YES,YES,YES 446 | ID12545,21,FEMALE,RURAL,15797.1,YES,0,NO,YES,YES,NO,YES 447 | ID12546,40,MALE,RURAL,31864.8,YES,0,YES,YES,YES,NO,NO 448 | ID12547,52,FEMALE,TOWN,43719.5,YES,0,NO,YES,YES,NO,NO 449 | ID12548,35,FEMALE,TOWN,30799.5,YES,2,NO,NO,YES,NO,YES 450 | ID12549,53,MALE,RURAL,48971.6,YES,3,YES,YES,NO,NO,YES 451 | ID12550,38,FEMALE,RURAL,34061.4,NO,0,YES,YES,YES,NO,YES 452 | ID12551,42,MALE,INNER_CITY,28938.6,YES,3,YES,NO,NO,NO,NO 453 | ID12552,43,MALE,TOWN,38540,NO,0,YES,YES,NO,YES,NO 454 | ID12553,59,FEMALE,INNER_CITY,27045.1,NO,0,NO,NO,YES,NO,YES 455 | ID12554,59,FEMALE,RURAL,51284.3,NO,0,YES,YES,YES,YES,NO 456 | ID12555,24,MALE,INNER_CITY,16352.2,NO,0,NO,YES,YES,YES,NO 457 | ID12556,27,FEMALE,INNER_CITY,11866.4,YES,0,YES,YES,YES,NO,NO 458 | ID12557,32,FEMALE,TOWN,13267.6,YES,0,YES,YES,YES,YES,NO 459 | ID12558,65,FEMALE,INNER_CITY,61554.6,YES,0,NO,YES,YES,NO,NO 460 | ID12559,18,MALE,SUBURBAN,13700.2,NO,1,NO,YES,YES,NO,NO 461 | ID12560,66,MALE,TOWN,46963.9,YES,1,NO,YES,YES,NO,YES 462 | ID12561,41,MALE,INNER_CITY,23475.6,YES,0,YES,NO,YES,YES,YES 463 | ID12562,64,FEMALE,SUBURBAN,24554.1,YES,0,YES,YES,NO,YES,NO 464 | ID12563,23,FEMALE,INNER_CITY,18050,YES,0,YES,NO,YES,YES,YES 465 | ID12564,29,FEMALE,RURAL,15237.6,YES,2,YES,NO,YES,NO,NO 466 | ID12565,28,MALE,TOWN,20555,NO,0,YES,YES,YES,NO,YES 467 | ID12566,57,FEMALE,TOWN,28421.7,YES,2,YES,YES,YES,YES,NO 468 | ID12567,38,MALE,INNER_CITY,21876.5,YES,0,NO,NO,YES,NO,NO 469 | ID12568,34,FEMALE,SUBURBAN,12810.2,NO,3,NO,YES,YES,NO,YES 470 | ID12569,43,MALE,SUBURBAN,15109.4,YES,0,NO,NO,YES,NO,YES 471 | ID12570,63,MALE,TOWN,37414.7,NO,3,YES,YES,YES,NO,NO 472 | ID12571,62,FEMALE,INNER_CITY,41521.6,YES,0,NO,YES,YES,YES,NO 473 | ID12572,51,FEMALE,INNER_CITY,25372.8,YES,0,YES,YES,YES,NO,NO 474 | ID12573,61,MALE,INNER_CITY,21139.8,YES,2,YES,YES,NO,NO,NO 475 | ID12574,41,FEMALE,TOWN,27757.6,NO,0,NO,YES,YES,NO,YES 476 | ID12575,31,FEMALE,TOWN,22678.1,NO,1,YES,YES,YES,YES,YES 477 | ID12576,33,FEMALE,TOWN,12178.5,YES,2,NO,YES,YES,YES,NO 478 | ID12577,43,MALE,RURAL,26106.7,NO,1,NO,NO,YES,NO,YES 479 | ID12578,40,MALE,INNER_CITY,27417.6,YES,0,NO,YES,YES,YES,NO 480 | ID12579,47,MALE,TOWN,23337.2,YES,2,NO,YES,YES,YES,NO 481 | ID12580,46,MALE,TOWN,43395.5,NO,1,YES,YES,YES,NO,YES 482 | ID12581,30,MALE,RURAL,11536.2,YES,2,NO,NO,YES,YES,NO 483 | ID12582,47,MALE,INNER_CITY,44658.6,NO,2,YES,YES,YES,NO,YES 484 | ID12583,44,MALE,RURAL,32762.5,NO,0,NO,YES,YES,YES,NO 485 | ID12584,23,MALE,RURAL,16403.8,YES,0,YES,YES,NO,NO,NO 486 | ID12585,28,FEMALE,INNER_CITY,21184.7,YES,1,YES,YES,YES,NO,YES 487 | ID12586,64,FEMALE,INNER_CITY,49917.3,YES,0,YES,YES,YES,YES,NO 488 | ID12587,35,FEMALE,TOWN,21623.8,YES,0,NO,YES,NO,YES,NO 489 | ID12588,19,MALE,RURAL,16625.9,YES,1,NO,NO,YES,NO,YES 490 | ID12589,27,FEMALE,TOWN,14014.5,YES,3,NO,YES,NO,NO,NO 491 | ID12590,27,MALE,INNER_CITY,20409.3,YES,2,NO,YES,NO,NO,YES 492 | ID12591,58,FEMALE,INNER_CITY,31671.3,YES,1,YES,YES,NO,YES,YES 493 | ID12592,46,FEMALE,TOWN,17149.2,NO,1,YES,YES,YES,NO,YES 494 | ID12593,61,FEMALE,TOWN,27756.3,YES,0,YES,YES,NO,YES,YES 495 | ID12594,59,MALE,INNER_CITY,40949.9,NO,0,NO,YES,YES,NO,YES 496 | ID12595,47,FEMALE,RURAL,43743.2,YES,3,NO,YES,YES,NO,NO 497 | ID12596,44,MALE,INNER_CITY,38459.9,YES,0,NO,NO,NO,YES,YES 498 | ID12597,50,MALE,TOWN,40972.9,NO,2,YES,YES,YES,YES,YES 499 | 
ID12598,64,MALE,INNER_CITY,46587.9,NO,0,NO,YES,YES,NO,YES 500 | ID12599,51,FEMALE,TOWN,43799.6,NO,0,NO,YES,YES,YES,NO 501 | ID12600,46,FEMALE,TOWN,18912.2,YES,0,YES,NO,YES,NO,YES 502 | ID12601,39,FEMALE,INNER_CITY,27765.8,YES,3,YES,YES,NO,NO,NO 503 | ID12602,58,FEMALE,INNER_CITY,33007.3,NO,1,YES,NO,YES,NO,YES 504 | ID12603,32,FEMALE,TOWN,26325.3,YES,0,YES,NO,NO,NO,NO 505 | ID12604,22,FEMALE,INNER_CITY,15308.2,NO,0,YES,YES,YES,YES,NO 506 | ID12605,63,FEMALE,INNER_CITY,59805.6,YES,1,YES,YES,YES,NO,YES 507 | ID12606,35,FEMALE,TOWN,28658.3,YES,0,YES,YES,YES,NO,NO 508 | ID12607,59,FEMALE,INNER_CITY,23175,YES,0,NO,NO,NO,YES,YES 509 | ID12608,22,FEMALE,SUBURBAN,11595.4,NO,0,YES,YES,YES,NO,YES 510 | ID12609,60,MALE,SUBURBAN,50409.9,NO,2,NO,YES,YES,YES,YES 511 | ID12610,23,FEMALE,RURAL,11215.3,YES,2,YES,YES,YES,NO,YES 512 | ID12611,29,FEMALE,TOWN,13327.8,YES,0,NO,YES,NO,NO,NO 513 | ID12612,44,MALE,INNER_CITY,16088.8,YES,3,NO,YES,YES,NO,NO 514 | ID12613,63,FEMALE,TOWN,43943,YES,0,YES,YES,YES,YES,NO 515 | ID12614,25,FEMALE,RURAL,14505.3,NO,3,NO,YES,YES,NO,NO 516 | ID12615,37,MALE,INNER_CITY,33886.4,NO,0,YES,YES,NO,NO,YES 517 | ID12616,48,FEMALE,INNER_CITY,16662.5,YES,1,NO,YES,YES,NO,YES 518 | ID12617,35,FEMALE,TOWN,20262.6,NO,0,NO,YES,NO,NO,YES 519 | ID12618,51,MALE,TOWN,33615.4,YES,1,NO,YES,NO,YES,YES 520 | ID12619,27,FEMALE,TOWN,22007.1,NO,3,NO,NO,YES,NO,NO 521 | ID12620,56,FEMALE,INNER_CITY,28981.1,YES,3,YES,YES,YES,YES,NO 522 | ID12621,38,FEMALE,INNER_CITY,12163.9,YES,2,YES,NO,YES,NO,YES 523 | ID12622,36,FEMALE,TOWN,17247.7,YES,2,YES,YES,YES,NO,NO 524 | ID12623,27,MALE,SUBURBAN,12683.6,NO,3,NO,YES,YES,YES,NO 525 | ID12624,34,FEMALE,TOWN,16291,YES,0,YES,YES,NO,YES,NO 526 | ID12625,44,MALE,SUBURBAN,18707.3,YES,0,YES,YES,YES,YES,YES 527 | ID12626,43,FEMALE,INNER_CITY,19326.9,YES,1,NO,NO,YES,NO,YES 528 | ID12627,32,MALE,INNER_CITY,14511.8,NO,2,NO,YES,NO,YES,NO 529 | ID12628,21,FEMALE,TOWN,10672,NO,1,NO,YES,YES,YES,NO 530 | ID12629,30,FEMALE,INNER_CITY,25830.5,YES,2,NO,NO,YES,NO,NO 531 | ID12630,55,MALE,RURAL,43499.5,YES,1,YES,YES,YES,NO,NO 532 | ID12631,64,MALE,SUBURBAN,59175.1,YES,1,NO,YES,YES,NO,YES 533 | ID12632,36,FEMALE,INNER_CITY,27642.9,NO,1,YES,NO,NO,NO,YES 534 | ID12633,59,MALE,INNER_CITY,30067.5,NO,1,NO,YES,YES,NO,YES 535 | ID12634,60,MALE,INNER_CITY,29714.4,YES,0,YES,NO,YES,YES,YES 536 | ID12635,27,FEMALE,TOWN,13950.4,NO,1,YES,YES,YES,YES,NO 537 | ID12636,29,MALE,INNER_CITY,10072.6,YES,0,NO,NO,YES,NO,NO 538 | ID12637,53,FEMALE,INNER_CITY,37850.6,NO,1,NO,YES,YES,NO,YES 539 | ID12638,61,FEMALE,RURAL,57176.4,NO,2,NO,YES,YES,NO,YES 540 | ID12639,43,FEMALE,INNER_CITY,38784,YES,0,NO,YES,YES,NO,NO 541 | ID12640,19,MALE,INNER_CITY,10191.8,YES,0,NO,YES,YES,YES,YES 542 | ID12641,48,FEMALE,INNER_CITY,21821.4,YES,1,YES,YES,NO,YES,YES 543 | ID12642,39,MALE,RURAL,37389,YES,2,NO,YES,YES,NO,YES 544 | ID12643,32,MALE,INNER_CITY,14627.9,YES,2,YES,YES,YES,NO,NO 545 | ID12644,63,MALE,SUBURBAN,48770.5,YES,1,YES,YES,NO,YES,YES 546 | ID12645,46,MALE,RURAL,21096.2,YES,1,YES,YES,YES,NO,YES 547 | ID12646,40,MALE,INNER_CITY,36256.9,NO,0,NO,NO,YES,YES,YES 548 | ID12647,27,FEMALE,INNER_CITY,15281.8,NO,0,YES,NO,NO,YES,YES 549 | ID12648,22,MALE,SUBURBAN,9316.98,YES,2,YES,YES,YES,NO,NO 550 | ID12649,42,MALE,RURAL,20736.2,YES,1,YES,YES,YES,NO,YES 551 | ID12650,58,FEMALE,TOWN,52662.5,YES,1,YES,YES,YES,YES,YES 552 | ID12651,23,FEMALE,INNER_CITY,8020.19,YES,1,YES,NO,YES,NO,NO 553 | ID12652,58,MALE,SUBURBAN,32245.4,YES,3,NO,NO,NO,NO,NO 554 | ID12653,45,MALE,INNER_CITY,41107.2,YES,2,YES,YES,YES,NO,NO 555 | 
ID12654,49,FEMALE,SUBURBAN,39358.3,YES,0,YES,NO,NO,YES,NO 556 | ID12655,67,FEMALE,INNER_CITY,36095.9,YES,3,YES,YES,YES,NO,NO 557 | ID12656,20,FEMALE,TOWN,7723.93,YES,2,YES,YES,YES,NO,NO 558 | ID12657,43,FEMALE,RURAL,18565.8,YES,1,YES,YES,YES,YES,YES 559 | ID12658,41,MALE,RURAL,25132.9,YES,0,NO,NO,YES,NO,NO 560 | ID12659,38,FEMALE,SUBURBAN,31290.6,YES,0,NO,YES,YES,NO,NO 561 | ID12660,67,FEMALE,SUBURBAN,24858.4,YES,0,NO,NO,YES,YES,YES 562 | ID12661,40,MALE,INNER_CITY,16398.8,NO,1,NO,YES,YES,NO,YES 563 | ID12662,25,FEMALE,SUBURBAN,23287.9,NO,0,NO,NO,YES,YES,YES 564 | ID12663,57,FEMALE,RURAL,50897.6,YES,0,NO,YES,YES,NO,NO 565 | ID12664,32,FEMALE,INNER_CITY,22446.5,NO,1,NO,NO,YES,YES,YES 566 | ID12665,44,MALE,SUBURBAN,23092.1,YES,0,YES,NO,YES,YES,YES 567 | ID12666,30,MALE,TOWN,24867.6,YES,0,YES,YES,NO,NO,NO 568 | ID12667,43,MALE,TOWN,22234.7,YES,0,NO,NO,YES,YES,NO 569 | ID12668,19,FEMALE,INNER_CITY,17371.1,YES,2,YES,YES,YES,NO,NO 570 | ID12669,37,FEMALE,TOWN,29574,YES,0,YES,YES,YES,NO,NO 571 | ID12670,49,MALE,INNER_CITY,17944.2,NO,1,YES,YES,YES,YES,YES 572 | ID12671,49,FEMALE,RURAL,33665.5,NO,3,YES,NO,YES,NO,NO 573 | ID12672,40,MALE,INNER_CITY,36166.2,YES,0,NO,NO,NO,NO,NO 574 | ID12673,33,FEMALE,SUBURBAN,27712.9,NO,2,NO,YES,YES,NO,YES 575 | ID12674,39,MALE,TOWN,22400.7,YES,2,YES,YES,YES,NO,NO 576 | ID12675,43,FEMALE,INNER_CITY,28469.9,YES,0,YES,YES,YES,NO,NO 577 | ID12676,37,MALE,TOWN,30488,YES,0,NO,YES,YES,YES,NO 578 | ID12677,24,FEMALE,TOWN,19160.3,YES,1,YES,YES,YES,NO,YES 579 | ID12678,62,MALE,TOWN,45342.5,YES,0,YES,YES,NO,NO,YES 580 | ID12679,18,MALE,INNER_CITY,6294.21,NO,0,NO,YES,YES,YES,NO 581 | ID12680,47,FEMALE,TOWN,25127.7,YES,0,YES,NO,NO,NO,NO 582 | ID12681,63,MALE,SUBURBAN,51879.3,YES,2,YES,YES,NO,YES,YES 583 | ID12682,20,MALE,TOWN,12644.9,YES,2,YES,NO,YES,YES,NO 584 | ID12683,46,FEMALE,TOWN,21984.4,NO,3,NO,YES,NO,NO,NO 585 | ID12684,47,FEMALE,RURAL,29093.1,NO,2,YES,YES,YES,NO,NO 586 | ID12685,33,MALE,TOWN,23528.4,YES,1,NO,YES,YES,YES,YES 587 | ID12686,20,MALE,TOWN,9516.91,NO,3,YES,YES,YES,NO,NO 588 | ID12687,20,MALE,INNER_CITY,18364.9,YES,1,NO,NO,YES,NO,YES 589 | ID12688,43,FEMALE,TOWN,31273.8,NO,1,YES,YES,NO,NO,YES 590 | ID12689,58,MALE,RURAL,49673.6,YES,0,NO,YES,YES,NO,NO 591 | ID12690,29,MALE,SUBURBAN,12623.4,YES,1,YES,YES,YES,NO,NO 592 | ID12691,25,MALE,INNER_CITY,23818.6,YES,0,NO,NO,NO,NO,NO 593 | ID12692,40,FEMALE,INNER_CITY,31473.9,NO,2,NO,YES,YES,YES,YES 594 | ID12693,48,MALE,TOWN,20268,YES,0,NO,YES,YES,NO,NO 595 | ID12694,65,MALE,SUBURBAN,51417,YES,2,NO,YES,YES,NO,YES 596 | ID12695,59,FEMALE,RURAL,30971.8,YES,3,YES,YES,YES,YES,NO 597 | ID12696,61,FEMALE,INNER_CITY,47025,NO,2,YES,YES,YES,YES,NO 598 | ID12697,30,FEMALE,INNER_CITY,9672.25,YES,0,YES,YES,YES,NO,NO 599 | ID12698,31,FEMALE,TOWN,15976.3,YES,0,YES,YES,NO,NO,YES 600 | ID12699,29,MALE,INNER_CITY,14711.8,YES,0,NO,YES,NO,YES,NO 601 | ID12700,38,MALE,TOWN,26671.6,NO,0,YES,NO,YES,YES,YES 602 | -------------------------------------------------------------------------------- /Practical Slides and Codes/shipment.csv: -------------------------------------------------------------------------------- 1 | Day,Shipments 2 | 1,99 3 | 2,103 4 | 3,92 5 | 4,100 6 | 5,99 7 | 6,99 8 | 7,103 9 | 8,101 10 | 9,100 11 | 10,100 12 | 11,102 13 | 12,101 14 | 13,101 15 | 14,111 16 | 15,94 17 | 16,101 18 | 17,104 19 | 18,99 20 | 19,94 21 | 20,110 22 | 21,108 23 | 22,102 24 | 23,100 25 | 24,98 26 | -------------------------------------------------------------------------------- /README.md: 
-------------------------------------------------------------------------------- 1 | # MATH-260 (Multivariate Data Analysis) 2 | Course Name: Multivariate Data Analysis (MDA) 3 | 4 | Participants: BSc Mathematics and Data Science students of Sorbonne University 5 | 6 | Faculty Name: Dr. Tanujit Chakraborty, Assistant Professor of Statistics at Sorbonne University and Sorbonne Center for AI (SCAI) 7 | 8 | Timeline: January 2024 to April 2024 | Total Teaching: 45 Sessions (20 Theory + 5 Tutorials + 20 Practicals) 9 | 10 | Course Website: https://www.ctanujit.org/mda.html 11 | 12 | The details of the course are given below: 13 | 14 | Theoretical class notes (written on boards) are available under the folder *Handwritten Classnotes*. 15 | 16 | Tutorial Exercises are available under the folder *Tutorial Worksheets*. 17 | 18 | Slides, Data, and Codes are available under the *Practical Slides and Codes* folder. 19 | 20 | In case you want to recap the basics of probability and statistics, check the *Basic Probability and Statistics (Recap)* folder. 21 | 22 | Happy Learning! Share your feedback at ctanujit@gmail.com 23 | 24 | -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_1.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_1.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_1_Sol.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_1_Sol.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_2.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_2.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_2_Sol.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_2_Sol.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_3.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_3.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_3_Sol.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_3_Sol.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_4.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_4.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_4_Sol.pdf:
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_4_Sol.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_5.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_5.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/MDA_TD_5_Sol.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/MDA_TD_5_Sol.pdf -------------------------------------------------------------------------------- /Tutorial Worksheets/Statistical Tables.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ctanujit/MATH-260/237d20369f59bfcb9b2c35efefc9ff5155acc166/Tutorial Worksheets/Statistical Tables.pdf --------------------------------------------------------------------------------
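
For readers who want to try the datasets listed above right away, here is a minimal R sketch (R is the language of the `Practical_*_Code.R` scripts in *Practical Slides and Codes*). It assumes the repository has been cloned and that R is started from the repository root; the column names follow the headers shown in the dumps of `bank_data.csv` (`id,age,...,pep`) and `shipment.csv` (`Day,Shipments`) above. This is only an illustrative loading sketch, not one of the course's own scripts.

```r
# Minimal sketch: load two of the practical datasets shown above.
# Assumption: working directory is the repository root after cloning.

# Bank customer data (600 rows; categorical fields read as factors)
bank <- read.csv("Practical Slides and Codes/bank_data.csv",
                 stringsAsFactors = TRUE)
str(bank)             # id, age, sex, region, income, married, children,
                      # car, save_act, current_act, mortgage, pep
summary(bank$income)  # quick look at the income distribution
table(bank$pep)       # counts of the YES/NO pep column

# Daily shipment counts (24 days)
shipments <- read.csv("Practical Slides and Codes/shipment.csv")
plot(shipments$Day, shipments$Shipments, type = "l",
     xlab = "Day", ylab = "Shipments",
     main = "Daily shipments (shipment.csv)")
```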