├── README.md
├── data_banknote_authentication.csv
├── decision_tree_classification.py
├── ex1data1.txt
├── insurance.csv
├── kmeans.py
├── mnist.pkl.gz
├── neural_network.py
├── pl_logo.png
├── sgd_linear_regression_multivariable.py
├── sgd_linear_regression_simple.py
├── simple_linear_regression.py
└── winequality-white.csv
/README.md:
--------------------------------------------------------------------------------
# PerceptiLabs Machine Learning Code Examples
We have put together code examples for some of the well-known machine learning algorithms discussed in our [Machine Learning Handbook](https://www.perceptilabs.com/resources/handbook). The handbook is a free resource that you can download and use to become more familiar with approaches like linear regression, decision trees, k-nearest neighbor, support vector machines (SVMs), clustering, and of course, neural networks. The handbook goes into the architecture and math behind these powerful algorithms.

We provide Python examples for the following machine learning algorithms in this repo:

* [Neural Network](#neural-network)
* [K-means](#k-means)
* [Decision Tree Classification](#decision-tree-classification)
* [Linear Regression](#linear-regression)

Happy hacking!

## Neural Network

Play around with this code if you want to understand how to build a neural network from scratch using [NumPy](https://numpy.org/).

Working through it is a great way to build intuition for what actually happens when a neural network is trained.

In this example we are using the [MNIST dataset](http://yann.lecun.com/exdb/mnist/), a collection of grayscale images of handwritten digits from 0 through 9. Note that each sample has a corresponding label.

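To give a feel for the moving parts before you dive into [neural_network.py](./neural_network.py), here is a minimal sketch (not the repo code itself) of one stochastic-gradient-descent training step for a one-hidden-layer classifier in NumPy; the layer sizes and learning rate are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 784 inputs (28x28 MNIST pixels), 64 hidden units, 10 classes
W1, b1 = rng.normal(0, 0.01, (784, 64)), np.zeros(64)
W2, b2 = rng.normal(0, 0.01, (64, 10)), np.zeros(10)

def train_step(x, y_onehot, lr=0.1):
    """One forward/backward pass on a single sample with cross-entropy loss."""
    global W1, b1, W2, b2
    # Forward pass: affine -> ReLU -> affine -> softmax
    z1 = x @ W1 + b1
    a1 = np.maximum(z1, 0.0)
    z2 = a1 @ W2 + b2
    p = np.exp(z2 - z2.max())
    p /= p.sum()                          # softmax probabilities
    # Backward pass: chain rule from the loss back to each parameter
    dz2 = p - y_onehot                    # gradient at the output layer
    dW2, db2 = np.outer(a1, dz2), dz2
    dz1 = (W2 @ dz2) * (z1 > 0)           # backprop through the ReLU
    dW1, db1 = np.outer(x, dz1), dz1
    # Gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    return -np.log(p[y_onehot.argmax()])  # loss, handy for monitoring

# One step on a random stand-in for an MNIST sample labeled "3":
x = rng.random(784)
y = np.zeros(10); y[3] = 1.0
print(train_step(x, y))
```

Training the full example boils down to repeating steps like this over shuffled, labeled MNIST samples.
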
## K-means
Use K-means clustering in its most common form, with seeding and/or constraints.

In this code example we are using the [Iris dataset](https://archive.ics.uci.edu/ml/datasets/iris), which is easy to play around with for this task.

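For a feel of the algorithm, here is a minimal sketch of vanilla (Lloyd's) K-means with random seeding, as a simplified stand-in for [kmeans.py](./kmeans.py) rather than the repo code itself:

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Cluster the rows of X into k groups; returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    # Seeding: pick k distinct data points as the initial centroids
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: attach each point to its nearest centroid
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its points
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break  # converged
        centroids = new_centroids
    return centroids, labels

# Two obvious blobs as toy input; Iris works just as well as a 4-D array.
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.1, 4.8]])
print(kmeans(X, 2))
```
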
## Decision Tree Classification
You can use decision trees for classification, using the Gini index to choose each split.

In this code example we are using the Banknote Authentication dataset ([data_banknote_authentication.csv](./data_banknote_authentication.csv)), which can easily be replaced with another dataset.

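The Gini index scores how mixed the classes are in the groups a candidate split produces: 0.0 for perfectly pure groups, with each group weighted by its size. A tiny worked example with made-up rows, following the same logic as `gini_index` in [decision_tree_classification.py](./decision_tree_classification.py):

```python
# Two candidate groups after a split; the last value in each row is the class
groups = [
    [[2.7, 0], [1.3, 0]],            # pure group: contributes 0
    [[3.6, 1], [0.9, 1], [2.2, 0]],  # mixed group: 1 - (2/3)**2 - (1/3)**2 = 4/9
]
classes = [0, 1]

n = sum(len(g) for g in groups)      # 5 samples in total
gini = 0.0
for g in groups:
    if not g:
        continue                     # skip empty groups
    score = sum(([row[-1] for row in g].count(c) / len(g)) ** 2 for c in classes)
    gini += (1.0 - score) * (len(g) / n)

print(round(gini, 4))                # 0.2667 = (2/5) * 0 + (3/5) * (4/9)
```

The tree-building code simply evaluates this score for every candidate split and keeps the one with the lowest value.
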
## Linear Regression
Here we provide three different versions of linear regression:

1. Simple Linear Regression
2. Simple Linear Regression using SGD
3. Multi-variable Linear Regression using SGD

These examples use the Insurance dataset ([insurance.csv](./insurance.csv)) for (1), the Wine Quality dataset ([winequality-white.csv](./winequality-white.csv)) for (2), and the [ex1data1](./ex1data1.txt) dataset for (3).

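To make the difference between versions (1) and (2) concrete, here is a minimal sketch of simple linear regression fitted with SGD on toy data; the learning rate and epoch count are illustrative assumptions, not the values used by the repo's scripts:

```python
import random

def sgd_simple_linear_regression(xs, ys, lr=0.01, epochs=200, seed=1):
    """Fit y = b0 + b1*x by stochastic gradient descent on squared error."""
    random.seed(seed)
    b0, b1 = 0.0, 0.0
    for _ in range(epochs):
        order = list(range(len(xs)))
        random.shuffle(order)               # visit the samples in random order
        for i in order:
            error = (b0 + b1 * xs[i]) - ys[i]
            b0 -= lr * error                # gradient of error**2 / 2 w.r.t. b0
            b1 -= lr * error * xs[i]        # ... and w.r.t. b1
    return b0, b1

# Toy data lying close to y = 1 + 2x:
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.1, 4.9, 7.2, 8.8, 11.1]
print(sgd_simple_linear_regression(xs, ys))  # roughly (1.0, 2.0)
```

Version (1) instead solves for `b0` and `b1` in closed form from the data's means and covariance, and version (3) extends the same SGD update to one coefficient per input variable.
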
# Community

Got questions, feedback, or want to join a community of machine learning practitioners working with exciting tools and projects? Check out our [Community page](https://www.perceptilabs.com/community)!

--------------------------------------------------------------------------------
/data_banknote_authentication.csv:
--------------------------------------------------------------------------------
1 | 3.6216,8.6661,-2.8073,-0.44699,0
2 | 4.5459,8.1674,-2.4586,-1.4621,0
3 | 3.866,-2.6383,1.9242,0.10645,0
4 | 3.4566,9.5228,-4.0112,-3.5944,0
5 | 0.32924,-4.4552,4.5718,-0.9888,0
6 | 4.3684,9.6718,-3.9606,-3.1625,0
7 | 3.5912,3.0129,0.72888,0.56421,0
8 | 2.0922,-6.81,8.4636,-0.60216,0
9 | 3.2032,5.7588,-0.75345,-0.61251,0
10 | 1.5356,9.1772,-2.2718,-0.73535,0
11 | 1.2247,8.7779,-2.2135,-0.80647,0
12 | 3.9899,-2.7066,2.3946,0.86291,0
13 | 1.8993,7.6625,0.15394,-3.1108,0
14 | -1.5768,10.843,2.5462,-2.9362,0
15 | 3.404,8.7261,-2.9915,-0.57242,0
16 | 4.6765,-3.3895,3.4896,1.4771,0
17 | 2.6719,3.0646,0.37158,0.58619,0
18 | 0.80355,2.8473,4.3439,0.6017,0
19 | 1.4479,-4.8794,8.3428,-2.1086,0
20 | 5.2423,11.0272,-4.353,-4.1013,0
21 | 5.7867,7.8902,-2.6196,-0.48708,0
22 | 0.3292,-4.4552,4.5718,-0.9888,0
23 | 3.9362,10.1622,-3.8235,-4.0172,0
24 | 0.93584,8.8855,-1.6831,-1.6599,0
25 | 4.4338,9.887,-4.6795,-3.7483,0
26 | 0.7057,-5.4981,8.3368,-2.8715,0
27 | 1.1432,-3.7413,5.5777,-0.63578,0
28 | -0.38214,8.3909,2.1624,-3.7405,0
29 | 6.5633,9.8187,-4.4113,-3.2258,0
30 | 4.8906,-3.3584,3.4202,1.0905,0
31 | -0.24811,-0.17797,4.9068,0.15429,0
32 | 1.4884,3.6274,3.308,0.48921,0
33 | 4.2969,7.617,-2.3874,-0.96164,0
34 | -0.96511,9.4111,1.7305,-4.8629,0
35 | -1.6162,0.80908,8.1628,0.60817,0
36 | 2.4391,6.4417,-0.80743,-0.69139,0
37 | 2.6881,6.0195,-0.46641,-0.69268,0
38 | 3.6289,0.81322,1.6277,0.77627,0
39 | 4.5679,3.1929,-2.1055,0.29653,0
40 | 3.4805,9.7008,-3.7541,-3.4379,0
41 | 4.1711,8.722,-3.0224,-0.59699,0
42 | -0.2062,9.2207,-3.7044,-6.8103,0
43 | -0.0068919,9.2931,-0.41243,-1.9638,0
44 | 0.96441,5.8395,2.3235,0.066365,0
45 | 2.8561,6.9176,-0.79372,0.48403,0
46 | -0.7869,9.5663,-3.7867,-7.5034,0
47 | 2.0843,6.6258,0.48382,-2.2134,0
48 | -0.7869,9.5663,-3.7867,-7.5034,0
49 | 3.9102,6.065,-2.4534,-0.68234,0
50 | 1.6349,3.286,2.8753,0.087054,0
51 | 4.3239,-4.8835,3.4356,-0.5776,0
52 | 5.262,3.9834,-1.5572,1.0103,0
53 | 3.1452,5.825,-0.51439,-1.4944,0
54 | 2.549,6.1499,-1.1605,-1.2371,0
55 | 4.9264,5.496,-2.4774,-0.50648,0
56 | 4.8265,0.80287,1.6371,1.1875,0
57 | 2.5635,6.7769,-0.61979,0.38576,0
58 | 5.807,5.0097,-2.2384,0.43878,0
59 | 3.1377,-4.1096,4.5701,0.98963,0
60 | -0.78289,11.3603,-0.37644,-7.0495,0
61 | 2.888,0.44696,4.5907,-0.24398,0
62 | 0.49665,5.527,1.7785,-0.47156,0
63 | 4.2586,11.2962,-4.0943,-4.3457,0
64 | 1.7939,-1.1174,1.5454,-0.26079,0
65 | 5.4021,3.1039,-1.1536,1.5651,0
66 | 2.5367,2.599,2.0938,0.20085,0
67 | 4.6054,-4.0765,2.7587,0.31981,0
68 | 2.4235,9.5332,-3.0789,-2.7746,0
69 | 1.0009,7.7846,-0.28219,-2.6608,0
70 | 0.12326,8.9848,-0.9351,-2.4332,0
71 | 3.9529,-2.3548,2.3792,0.48274,0
72 | 4.1373,0.49248,1.093,1.8276,0
73 | 4.7181,10.0153,-3.9486,-3.8582,0
74 | 4.1654,-3.4495,3.643,1.0879,0
75 | 4.4069,10.9072,-4.5775,-4.4271,0
76 | 2.3066,3.5364,0.57551,0.41938,0
77 | 3.7935,7.9853,-2.5477,-1.872,0
78 | 0.049175,6.1437,1.7828,-0.72113,0
79 | 0.24835,7.6439,0.9885,-0.87371,0
80 | 1.1317,3.9647,3.3979,0.84351,0
81 | 2.8033,9.0862,-3.3668,-1.0224,0
82 | 4.4682,2.2907,0.95766,0.83058,0
83 | 5.0185,8.5978,-2.9375,-1.281,0
84 | 1.8664,7.7763,-0.23849,-2.9634,0
85 | 3.245,6.63,-0.63435,0.86937,0
86 | 4.0296,2.6756,0.80685,0.71679,0
87 | -1.1313,1.9037,7.5339,1.022,0
88 | 0.87603,6.8141,0.84198,-0.17156,0
89 | 4.1197,-2.7956,2.0707,0.67412,0
90 | 3.8027,0.81529,2.1041,1.0245,0
91 | 1.4806,7.6377,-2.7876,-1.0341,0
92 | 4.0632,3.584,0.72545,0.39481,0
93 | 4.3064,8.2068,-2.7824,-1.4336,0
94 | 2.4486,-6.3175,7.9632,0.20602,0
95 | 3.2718,1.7837,2.1161,0.61334,0
96 | -0.64472,-4.6062,8.347,-2.7099,0
97 | 2.9543,1.076,0.64577,0.89394,0
98 | 2.1616,-6.8804,8.1517,-0.081048,0
99 | 3.82,10.9279,-4.0112,-5.0284,0
100 | -2.7419,11.4038,2.5394,-5.5793,0
101 | 3.3669,-5.1856,3.6935,-1.1427,0
102 | 4.5597,-2.4211,2.6413,1.6168,0
103 | 5.1129,-0.49871,0.62863,1.1189,0
104 | 3.3397,-4.6145,3.9823,-0.23751,0
105 | 4.2027,0.22761,0.96108,0.97282,0
106 | 3.5438,1.2395,1.997,2.1547,0
107 | 2.3136,10.6651,-3.5288,-4.7672,0
108 | -1.8584,7.886,-1.6643,-1.8384,0
109 | 3.106,9.5414,-4.2536,-4.003,0
110 | 2.9163,10.8306,-3.3437,-4.122,0
111 | 3.9922,-4.4676,3.7304,-0.1095,0
112 | 1.518,5.6946,0.094818,-0.026738,0
113 | 3.2351,9.647,-3.2074,-2.5948,0
114 | 4.2188,6.8162,-1.2804,0.76076,0
115 | 1.7819,6.9176,-1.2744,-1.5759,0
116 | 2.5331,2.9135,-0.822,-0.12243,0
117 | 3.8969,7.4163,-1.8245,0.14007,0
118 | 2.108,6.7955,-0.1708,0.4905,0
119 | 2.8969,0.70768,2.29,1.8663,0
120 | 0.9297,-3.7971,4.6429,-0.2957,0
121 | 3.4642,10.6878,-3.4071,-4.109,0
122 | 4.0713,10.4023,-4.1722,-4.7582,0
123 | -1.4572,9.1214,1.7425,-5.1241,0
124 | -1.5075,1.9224,7.1466,0.89136,0
125 | -0.91718,9.9884,1.1804,-5.2263,0
126 | 2.994,7.2011,-1.2153,0.3211,0
127 | -2.343,12.9516,3.3285,-5.9426,0
128 | 3.7818,-2.8846,2.2558,-0.15734,0
129 | 4.6689,1.3098,0.055404,1.909,0
130 | 3.4663,1.1112,1.7425,1.3388,0
131 | 3.2697,-4.3414,3.6884,-0.29829,0
132 | 5.1302,8.6703,-2.8913,-1.5086,0
133 | 2.0139,6.1416,0.37929,0.56938,0
134 | 0.4339,5.5395,2.033,-0.40432,0
135 | -1.0401,9.3987,0.85998,-5.3336,0
136 | 4.1605,11.2196,-3.6136,-4.0819,0
137 | 5.438,9.4669,-4.9417,-3.9202,0
138 | 5.032,8.2026,-2.6256,-1.0341,0
139 | 5.2418,10.5388,-4.1174,-4.2797,0
140 | -0.2062,9.2207,-3.7044,-6.8103,0
141 | 2.0911,0.94358,4.5512,1.234,0
142 | 1.7317,-0.34765,4.1905,-0.99138,0
143 | 4.1736,3.3336,-1.4244,0.60429,0
144 | 3.9232,-3.2467,3.4579,0.83705,0
145 | 3.8481,10.1539,-3.8561,-4.2228,0
146 | 0.5195,-3.2633,3.0895,-0.9849,0
147 | 3.8584,0.78425,1.1033,1.7008,0
148 | 1.7496,-0.1759,5.1827,1.2922,0
149 | 3.6277,0.9829,0.68861,0.63403,0
150 | 2.7391,7.4018,0.071684,-2.5302,0
151 | 4.5447,8.2274,-2.4166,-1.5875,0
152 | -1.7599,11.9211,2.6756,-3.3241,0
153 | 5.0691,0.21313,0.20278,1.2095,0
154 | 3.4591,11.112,-4.2039,-5.0931,0
155 | 1.9358,8.1654,-0.023425,-2.2586,0
156 | 2.486,-0.99533,5.3404,-0.15475,0
157 | 2.4226,-4.5752,5.947,0.21507,0
158 | 3.9479,-3.7723,2.883,0.019813,0
159 | 2.2634,-4.4862,3.6558,-0.61251,0
160 | 1.3566,4.2358,2.1341,0.3211,0
161 | 5.0452,3.8964,-1.4304,0.86291,0
162 | 3.5499,8.6165,-3.2794,-1.2009,0
163 | 0.17346,7.8695,0.26876,-3.7883,0
164 | 2.4008,9.3593,-3.3565,-3.3526,0
165 | 4.8851,1.5995,-0.00029081,1.6401,0
166 | 4.1927,-3.2674,2.5839,0.21766,0
167 | 1.1166,8.6496,-0.96252,-1.8112,0
168 | 1.0235,6.901,-2.0062,-2.7125,0
169 | -1.803,11.8818,2.0458,-5.2728,0
170 | 0.11739,6.2761,-1.5495,-2.4746,0
171 | 0.5706,-0.0248,1.2421,-0.5621,0
172 | 4.0552,-2.4583,2.2806,1.0323,0
173 | -1.6952,1.0657,8.8294,0.94955,0
174 | -1.1193,10.7271,2.0938,-5.6504,0
175 | 1.8799,2.4707,2.4931,0.37671,0
176 | 3.583,-3.7971,3.4391,-0.12501,0
177 | 0.19081,9.1297,-3.725,-5.8224,0
178 | 3.6582,5.6864,-1.7157,-0.23751,0
179 | -0.13144,-1.7775,8.3316,0.35214,0
180 | 2.3925,9.798,-3.0361,-2.8224,0
181 | 1.6426,3.0149,0.22849,-0.147,0
182 | -0.11783,-1.5789,8.03,-0.028031,0
183 | -0.69572,8.6165,1.8419,-4.3289,0
184 | 2.9421,7.4101,-0.97709,-0.88406,0
185 | -1.7559,11.9459,3.0946,-4.8978,0
186 | -1.2537,10.8803,1.931,-4.3237,0
187 | 3.2585,-4.4614,3.8024,-0.15087,0
188 | 1.8314,6.3672,-0.036278,0.049554,0
189 | 4.5645,-3.6275,2.8684,0.27714,0
190 | 2.7365,-5.0325,6.6608,-0.57889,0
191 | 0.9297,-3.7971,4.6429,-0.2957,0
192 | 3.9663,10.1684,-4.1131,-4.6056,0
193 | 1.4578,-0.08485,4.1785,0.59136,0
194 | 4.8272,3.0687,0.68604,0.80731,0
195 | -2.341,12.3784,0.70403,-7.5836,0
196 | -1.8584,7.886,-1.6643,-1.8384,0
197 | 4.1454,7.257,-1.9153,-0.86078,0
198 | 1.9157,6.0816,0.23705,-2.0116,0
199 | 4.0215,-2.1914,2.4648,1.1409,0
200 | 5.8862,5.8747,-2.8167,-0.30087,0
201 | -2.0897,10.8265,2.3603,-3.4198,0
202 | 4.0026,-3.5943,3.5573,0.26809,0
203 | -0.78689,9.5663,-3.7867,-7.5034,0
204 | 4.1757,10.2615,-3.8552,-4.3056,0
205 | 0.83292,7.5404,0.65005,-0.92544,0
206 | 4.8077,2.2327,-0.26334,1.5534,0
207 | 5.3063,5.2684,-2.8904,-0.52716,0
208 | 2.5605,9.2683,-3.5913,-1.356,0
209 | 2.1059,7.6046,-0.47755,-1.8461,0
210 | 2.1721,-0.73874,5.4672,-0.72371,0
211 | 4.2899,9.1814,-4.6067,-4.3263,0
212 | 3.5156,10.1891,-4.2759,-4.978,0
213 | 2.614,8.0081,-3.7258,-1.3069,0
214 | 0.68087,2.3259,4.9085,0.54998,0
215 | 4.1962,0.74493,0.83256,0.753,0
216 | 6.0919,2.9673,-1.3267,1.4551,0
217 | 1.3234,3.2964,0.2362,-0.11984,0
218 | 1.3264,1.0326,5.6566,-0.41337,0
219 | -0.16735,7.6274,1.2061,-3.6241,0
220 | -1.3,10.2678,-2.953,-5.8638,0
221 | -2.2261,12.5398,2.9438,-3.5258,0
222 | 2.4196,6.4665,-0.75688,0.228,0
223 | 1.0987,0.6394,5.989,-0.58277,0
224 | 4.6464,10.5326,-4.5852,-4.206,0
225 | -0.36038,4.1158,3.1143,-0.37199,0
226 | 1.3562,3.2136,4.3465,0.78662,0
227 | 0.5706,-0.0248,1.2421,-0.5621,0
228 | -2.6479,10.1374,-1.331,-5.4707,0
229 | 3.1219,-3.137,1.9259,-0.37458,0
230 | 5.4944,1.5478,0.041694,1.9284,0
231 | -1.3389,1.552,7.0806,1.031,0
232 | -2.3361,11.9604,3.0835,-5.4435,0
233 | 2.2596,-0.033118,4.7355,-0.2776,0
234 | 0.46901,-0.63321,7.3848,0.36507,0
235 | 2.7296,2.8701,0.51124,0.5099,0
236 | 2.0466,2.03,2.1761,-0.083634,0
237 | -1.3274,9.498,2.4408,-5.2689,0
238 | 3.8905,-2.1521,2.6302,1.1047,0
239 | 3.9994,0.90427,1.1693,1.6892,0
240 | 2.3952,9.5083,-3.1783,-3.0086,0
241 | 3.2704,6.9321,-1.0456,0.23447,0
242 | -1.3931,1.5664,7.5382,0.78403,0
243 | 1.6406,3.5488,1.3964,-0.36424,0
244 | 2.7744,6.8576,-1.0671,0.075416,0
245 | 2.4287,9.3821,-3.2477,-1.4543,0
246 | 4.2134,-2.806,2.0116,0.67412,0
247 | 1.6472,0.48213,4.7449,1.225,0
248 | 2.0597,-0.99326,5.2119,-0.29312,0
249 | 0.3798,0.7098,0.7572,-0.4444,0
250 | 1.0135,8.4551,-1.672,-2.0815,0
251 | 4.5691,-4.4552,3.1769,0.0042961,0
252 | 0.57461,10.1105,-1.6917,-4.3922,0
253 | 0.5734,9.1938,-0.9094,-1.872,0
254 | 5.2868,3.257,-1.3721,1.1668,0
255 | 4.0102,10.6568,-4.1388,-5.0646,0
256 | 4.1425,-3.6792,3.8281,1.6297,0
257 | 3.0934,-2.9177,2.2232,0.22283,0
258 | 2.2034,5.9947,0.53009,0.84998,0
259 | 3.744,0.79459,0.95851,1.0077,0
260 | 3.0329,2.2948,2.1135,0.35084,0
261 | 3.7731,7.2073,-1.6814,-0.94742,0
262 | 3.1557,2.8908,0.59693,0.79825,0
263 | 1.8114,7.6067,-0.9788,-2.4668,0
264 | 4.988,7.2052,-3.2846,-1.1608,0
265 | 2.483,6.6155,-0.79287,-0.90863,0
266 | 1.594,4.7055,1.3758,0.081882,0
267 | -0.016103,9.7484,0.15394,-1.6134,0
268 | 3.8496,9.7939,-4.1508,-4.4582,0
269 | 0.9297,-3.7971,4.6429,-0.2957,0
270 | 4.9342,2.4107,-0.17594,1.6245,0
271 | 3.8417,10.0215,-4.2699,-4.9159,0
272 | 5.3915,9.9946,-3.8081,-3.3642,0
273 | 4.4072,-0.070365,2.0416,1.1319,0
274 | 2.6946,6.7976,-0.40301,0.44912,0
275 | 5.2756,0.13863,0.12138,1.1435,0
276 | 3.4312,6.2637,-1.9513,-0.36165,0
277 | 4.052,-0.16555,0.45383,0.51248,0
278 | 1.3638,-4.7759,8.4182,-1.8836,0
279 | 0.89566,7.7763,-2.7473,-1.9353,0
280 | 1.9265,7.7557,-0.16823,-3.0771,0
281 | 0.20977,-0.46146,7.7267,0.90946,0
282 | 4.068,-2.9363,2.1992,0.50084,0
283 | 2.877,-4.0599,3.6259,-0.32544,0
284 | 0.3223,-0.89808,8.0883,0.69222,0
285 | -1.3,10.2678,-2.953,-5.8638,0
286 | 1.7747,-6.4334,8.15,-0.89828,0
287 | 1.3419,-4.4221,8.09,-1.7349,0
288 | 0.89606,10.5471,-1.4175,-4.0327,0
289 | 0.44125,2.9487,4.3225,0.7155,0
290 | 3.2422,6.2265,0.12224,-1.4466,0
291 | 2.5678,3.5136,0.61406,-0.40691,0
292 | -2.2153,11.9625,0.078538,-7.7853,0
293 | 4.1349,6.1189,-2.4294,-0.19613,0
294 | 1.934,-9.2828e-06,4.816,-0.33967,0
295 | 2.5068,1.1588,3.9249,0.12585,0
296 | 2.1464,6.0795,-0.5778,-2.2302,0
297 | 0.051979,7.0521,-2.0541,-3.1508,0
298 | 1.2706,8.035,-0.19651,-2.1888,0
299 | 1.143,0.83391,5.4552,-0.56984,0
300 | 2.2928,9.0386,-3.2417,-1.2991,0
301 | 0.3292,-4.4552,4.5718,-0.9888,0
302 | 2.9719,6.8369,-0.2702,0.71291,0
303 | 1.6849,8.7489,-1.2641,-1.3858,0
304 | -1.9177,11.6894,2.5454,-3.2763,0
305 | 2.3729,10.4726,-3.0087,-3.2013,0
306 | 1.0284,9.767,-1.3687,-1.7853,0
307 | 0.27451,9.2186,-3.2863,-4.8448,0
308 | 1.6032,-4.7863,8.5193,-2.1203,0
309 | 4.616,10.1788,-4.2185,-4.4245,0
310 | 4.2478,7.6956,-2.7696,-1.0767,0
311 | 4.0215,-2.7004,2.4957,0.36636,0
312 | 5.0297,-4.9704,3.5025,-0.23751,0
313 | 1.5902,2.2948,3.2403,0.18404,0
314 | 2.1274,5.1939,-1.7971,-1.1763,0
315 | 1.1811,8.3847,-2.0567,-0.90345,0
316 | 0.3292,-4.4552,4.5718,-0.9888,0
317 | 5.7353,5.2808,-2.2598,0.075416,0
318 | 2.6718,5.6574,0.72974,-1.4892,0
319 | 1.5799,-4.7076,7.9186,-1.5487,0
320 | 2.9499,2.2493,1.3458,-0.037083,0
321 | 0.5195,-3.2633,3.0895,-0.9849,0
322 | 3.7352,9.5911,-3.9032,-3.3487,0
323 | -1.7344,2.0175,7.7618,0.93532,0
324 | 3.884,10.0277,-3.9298,-4.0819,0
325 | 3.5257,1.2829,1.9276,1.7991,0
326 | 4.4549,2.4976,1.0313,0.96894,0
327 | -0.16108,-6.4624,8.3573,-1.5216,0
328 | 4.2164,9.4607,-4.9288,-5.2366,0
329 | 3.5152,6.8224,-0.67377,-0.46898,0
330 | 1.6988,2.9094,2.9044,0.11033,0
331 | 1.0607,2.4542,2.5188,-0.17027,0
332 | 2.0421,1.2436,4.2171,0.90429,0
333 | 3.5594,1.3078,1.291,1.6556,0
334 | 3.0009,5.8126,-2.2306,-0.66553,0
335 | 3.9294,1.4112,1.8076,0.89782,0
336 | 3.4667,-4.0724,4.2882,1.5418,0
337 | 3.966,3.9213,0.70574,0.33662,0
338 | 1.0191,2.33,4.9334,0.82929,0
339 | 0.96414,5.616,2.2138,-0.12501,0
340 | 1.8205,6.7562,0.0099913,0.39481,0
341 | 4.9923,7.8653,-2.3515,-0.71984,0
342 | -1.1804,11.5093,0.15565,-6.8194,0
343 | 4.0329,0.23175,0.89082,1.1823,0
344 | 0.66018,10.3878,-1.4029,-3.9151,0
345 | 3.5982,7.1307,-1.3035,0.21248,0
346 | -1.8584,7.886,-1.6643,-1.8384,0
347 | 4.0972,0.46972,1.6671,0.91593,0
348 | 3.3299,0.91254,1.5806,0.39352,0
349 | 3.1088,3.1122,0.80857,0.4336,0
350 | -4.2859,8.5234,3.1392,-0.91639,0
351 | -1.2528,10.2036,2.1787,-5.6038,0
352 | 0.5195,-3.2633,3.0895,-0.9849,0
353 | 0.3292,-4.4552,4.5718,-0.9888,0
354 | 0.88872,5.3449,2.045,-0.19355,0
355 | 3.5458,9.3718,-4.0351,-3.9564,0
356 | -0.21661,8.0329,1.8848,-3.8853,0
357 | 2.7206,9.0821,-3.3111,-0.96811,0
358 | 3.2051,8.6889,-2.9033,-0.7819,0
359 | 2.6917,10.8161,-3.3,-4.2888,0
360 | -2.3242,11.5176,1.8231,-5.375,0
361 | 2.7161,-4.2006,4.1914,0.16981,0
362 | 3.3848,3.2674,0.90967,0.25128,0
363 | 1.7452,4.8028,2.0878,0.62627,0
364 | 2.805,0.57732,1.3424,1.2133,0
365 | 5.7823,5.5788,-2.4089,-0.056479,0
366 | 3.8999,1.734,1.6011,0.96765,0
367 | 3.5189,6.332,-1.7791,-0.020273,0
368 | 3.2294,7.7391,-0.37816,-2.5405,0
369 | 3.4985,3.1639,0.22677,-0.1651,0
370 | 2.1948,1.3781,1.1582,0.85774,0
371 | 2.2526,9.9636,-3.1749,-2.9944,0
372 | 4.1529,-3.9358,2.8633,-0.017686,0
373 | 0.74307,11.17,-1.3824,-4.0728,0
374 | 1.9105,8.871,-2.3386,-0.75604,0
375 | -1.5055,0.070346,6.8681,-0.50648,0
376 | 0.58836,10.7727,-1.3884,-4.3276,0
377 | 3.2303,7.8384,-3.5348,-1.2151,0
378 | -1.9922,11.6542,2.6542,-5.2107,0
379 | 2.8523,9.0096,-3.761,-3.3371,0
380 | 4.2772,2.4955,0.48554,0.36119,0
381 | 1.5099,0.039307,6.2332,-0.30346,0
382 | 5.4188,10.1457,-4.084,-3.6991,0
383 | 0.86202,2.6963,4.2908,0.54739,0
384 | 3.8117,10.1457,-4.0463,-4.5629,0
385 | 0.54777,10.3754,-1.5435,-4.1633,0
386 | 2.3718,7.4908,0.015989,-1.7414,0
387 | -2.4953,11.1472,1.9353,-3.4638,0
388 | 4.6361,-2.6611,2.8358,1.1991,0
389 | -2.2527,11.5321,2.5899,-3.2737,0
390 | 3.7982,10.423,-4.1602,-4.9728,0
391 | -0.36279,8.2895,-1.9213,-3.3332,0
392 | 2.1265,6.8783,0.44784,-2.2224,0
393 | 0.86736,5.5643,1.6765,-0.16769,0
394 | 3.7831,10.0526,-3.8869,-3.7366,0
395 | -2.2623,12.1177,0.28846,-7.7581,0
396 | 1.2616,4.4303,-1.3335,-1.7517,0
397 | 2.6799,3.1349,0.34073,0.58489,0
398 | -0.39816,5.9781,1.3912,-1.1621,0
399 | 4.3937,0.35798,2.0416,1.2004,0
400 | 2.9695,5.6222,0.27561,-1.1556,0
401 | 1.3049,-0.15521,6.4911,-0.75346,0
402 | 2.2123,-5.8395,7.7687,-0.85302,0
403 | 1.9647,6.9383,0.57722,0.66377,0
404 | 3.0864,-2.5845,2.2309,0.30947,0
405 | 0.3798,0.7098,0.7572,-0.4444,0
406 | 0.58982,7.4266,1.2353,-2.9595,0
407 | 0.14783,7.946,1.0742,-3.3409,0
408 | -0.062025,6.1975,1.099,-1.131,0
409 | 4.223,1.1319,0.72202,0.96118,0
410 | 0.64295,7.1018,0.3493,-0.41337,0
411 | 1.941,0.46351,4.6472,1.0879,0
412 | 4.0047,0.45937,1.3621,1.6181,0
413 | 3.7767,9.7794,-3.9075,-3.5323,0
414 | 3.4769,-0.15314,2.53,2.4495,0
415 | 1.9818,9.2621,-3.521,-1.872,0
416 | 3.8023,-3.8696,4.044,0.95343,0
417 | 4.3483,11.1079,-4.0857,-4.2539,0
418 | 1.1518,1.3864,5.2727,-0.43536,0
419 | -1.2576,1.5892,7.0078,0.42455,0
420 | 1.9572,-5.1153,8.6127,-1.4297,0
421 | -2.484,12.1611,2.8204,-3.7418,0
422 | -1.1497,1.2954,7.701,0.62627,0
423 | 4.8368,10.0132,-4.3239,-4.3276,0
424 | -0.12196,8.8068,0.94566,-4.2267,0
425 | 1.9429,6.3961,0.092248,0.58102,0
426 | 1.742,-4.809,8.2142,-2.0659,0
427 | -1.5222,10.8409,2.7827,-4.0974,0
428 | -1.3,10.2678,-2.953,-5.8638,0
429 | 3.4246,-0.14693,0.80342,0.29136,0
430 | 2.5503,-4.9518,6.3729,-0.41596,0
431 | 1.5691,6.3465,-0.1828,-2.4099,0
432 | 1.3087,4.9228,2.0013,0.22024,0
433 | 5.1776,8.2316,-3.2511,-1.5694,0
434 | 2.229,9.6325,-3.1123,-2.7164,0
435 | 5.6272,10.0857,-4.2931,-3.8142,0
436 | 1.2138,8.7986,-2.1672,-0.74182,0
437 | 0.3798,0.7098,0.7572,-0.4444,0
438 | 0.5415,6.0319,1.6825,-0.46122,0
439 | 4.0524,5.6802,-1.9693,0.026279,0
440 | 4.7285,2.1065,-0.28305,1.5625,0
441 | 3.4359,0.66216,2.1041,1.8922,0
442 | 0.86816,10.2429,-1.4912,-4.0082,0
443 | 3.359,9.8022,-3.8209,-3.7133,0
444 | 3.6702,2.9942,0.85141,0.30688,0
445 | 1.3349,6.1189,0.46497,0.49826,0
446 | 3.1887,-3.4143,2.7742,-0.2026,0
447 | 2.4527,2.9653,0.20021,-0.056479,0
448 | 3.9121,2.9735,0.92852,0.60558,0
449 | 3.9364,10.5885,-3.725,-4.3133,0
450 | 3.9414,-3.2902,3.1674,1.0866,0
451 | 3.6922,-3.9585,4.3439,1.3517,0
452 | 5.681,7.795,-2.6848,-0.92544,0
453 | 0.77124,9.0862,-1.2281,-1.4996,0
454 | 3.5761,9.7753,-3.9795,-3.4638,0
455 | 1.602,6.1251,0.52924,0.47886,0
456 | 2.6682,10.216,-3.4414,-4.0069,0
457 | 2.0007,1.8644,2.6491,0.47369,0
458 | 0.64215,3.1287,4.2933,0.64696,0
459 | 4.3848,-3.0729,3.0423,1.2741,0
460 | 0.77445,9.0552,-2.4089,-1.3884,0
461 | 0.96574,8.393,-1.361,-1.4659,0
462 | 3.0948,8.7324,-2.9007,-0.96682,0
463 | 4.9362,7.6046,-2.3429,-0.85302,0
464 | -1.9458,11.2217,1.9079,-3.4405,0
465 | 5.7403,-0.44284,0.38015,1.3763,0
466 | -2.6989,12.1984,0.67661,-8.5482,0
467 | 1.1472,3.5985,1.9387,-0.43406,0
468 | 2.9742,8.96,-2.9024,-1.0379,0
469 | 4.5707,7.2094,-3.2794,-1.4944,0
470 | 0.1848,6.5079,2.0133,-0.87242,0
471 | 0.87256,9.2931,-0.7843,-2.1978,0
472 | 0.39559,6.8866,1.0588,-0.67587,0
473 | 3.8384,6.1851,-2.0439,-0.033204,0
474 | 2.8209,7.3108,-0.81857,-1.8784,0
475 | 2.5817,9.7546,-3.1749,-2.9957,0
476 | 3.8213,0.23175,2.0133,2.0564,0
477 | 0.3798,0.7098,0.7572,-0.4444,0
478 | 3.4893,6.69,-1.2042,-0.38751,0
479 | -1.7781,0.8546,7.1303,0.027572,0
480 | 2.0962,2.4769,1.9379,-0.040962,0
481 | 0.94732,-0.57113,7.1903,-0.67587,0
482 | 2.8261,9.4007,-3.3034,-1.0509,0
483 | 0.0071249,8.3661,0.50781,-3.8155,0
484 | 0.96788,7.1907,1.2798,-2.4565,0
485 | 4.7432,2.1086,0.1368,1.6543,0
486 | 3.6575,7.2797,-2.2692,-1.144,0
487 | 3.8832,6.4023,-2.432,-0.98363,0
488 | 3.4776,8.811,-3.1886,-0.92285,0
489 | 1.1315,7.9212,1.093,-2.8444,0
490 | 2.8237,2.8597,0.19678,0.57196,0
491 | 1.9321,6.0423,0.26019,-2.053,0
492 | 3.0632,-3.3315,5.1305,0.8267,0
493 | -1.8411,10.8306,2.769,-3.0901,0
494 | 2.8084,11.3045,-3.3394,-4.4194,0
495 | 2.5698,-4.4076,5.9856,0.078002,0
496 | -0.12624,10.3216,-3.7121,-6.1185,0
497 | 3.3756,-4.0951,4.367,1.0698,0
498 | -0.048008,-1.6037,8.4756,0.75558,0
499 | 0.5706,-0.0248,1.2421,-0.5621,0
500 | 0.88444,6.5906,0.55837,-0.44182,0
501 | 3.8644,3.7061,0.70403,0.35214,0
502 | 1.2999,2.5762,2.0107,-0.18967,0
503 | 2.0051,-6.8638,8.132,-0.2401,0
504 | 4.9294,0.27727,0.20792,0.33662,0
505 | 2.8297,6.3485,-0.73546,-0.58665,0
506 | 2.565,8.633,-2.9941,-1.3082,0
507 | 2.093,8.3061,0.022844,-3.2724,0
508 | 4.6014,5.6264,-2.1235,0.19309,0
509 | 5.0617,-0.35799,0.44698,0.99868,0
510 | -0.2951,9.0489,-0.52725,-2.0789,0
511 | 3.577,2.4004,1.8908,0.73231,0
512 | 3.9433,2.5017,1.5215,0.903,0
513 | 2.6648,10.754,-3.3994,-4.1685,0
514 | 5.9374,6.1664,-2.5905,-0.36553,0
515 | 2.0153,1.8479,3.1375,0.42843,0
516 | 5.8782,5.9409,-2.8544,-0.60863,0
517 | -2.3983,12.606,2.9464,-5.7888,0
518 | 1.762,4.3682,2.1384,0.75429,0
519 | 4.2406,-2.4852,1.608,0.7155,0
520 | 3.4669,6.87,-1.0568,-0.73147,0
521 | 3.1896,5.7526,-0.18537,-0.30087,0
522 | 0.81356,9.1566,-2.1492,-4.1814,0
523 | 0.52855,0.96427,4.0243,-1.0483,0
524 | 2.1319,-2.0403,2.5574,-0.061652,0
525 | 0.33111,4.5731,2.057,-0.18967,0
526 | 1.2746,8.8172,-1.5323,-1.7957,0
527 | 2.2091,7.4556,-1.3284,-3.3021,0
528 | 2.5328,7.528,-0.41929,-2.6478,0
529 | 3.6244,1.4609,1.3501,1.9284,0
530 | -1.3885,12.5026,0.69118,-7.5487,0
531 | 5.7227,5.8312,-2.4097,-0.24527,0
532 | 3.3583,10.3567,-3.7301,-3.6991,0
533 | 2.5227,2.2369,2.7236,0.79438,0
534 | 0.045304,6.7334,1.0708,-0.9332,0
535 | 4.8278,7.7598,-2.4491,-1.2216,0
536 | 1.9476,-4.7738,8.527,-1.8668,0
537 | 2.7659,0.66216,4.1494,-0.28406,0
538 | -0.10648,-0.76771,7.7575,0.64179,0
539 | 0.72252,-0.053811,5.6703,-1.3509,0
540 | 4.2475,1.4816,-0.48355,0.95343,0
541 | 3.9772,0.33521,2.2566,2.1625,0
542 | 3.6667,4.302,0.55923,0.33791,0
543 | 2.8232,10.8513,-3.1466,-3.9784,0
544 | -1.4217,11.6542,-0.057699,-7.1025,0
545 | 4.2458,1.1981,0.66633,0.94696,0
546 | 4.1038,-4.8069,3.3491,-0.49225,0
547 | 1.4507,8.7903,-2.2324,-0.65259,0
548 | 3.4647,-3.9172,3.9746,0.36119,0
549 | 1.8533,6.1458,1.0176,-2.0401,0
550 | 3.5288,0.71596,1.9507,1.9375,0
551 | 3.9719,1.0367,0.75973,1.0013,0
552 | 3.534,9.3614,-3.6316,-1.2461,0
553 | 3.6894,9.887,-4.0788,-4.3664,0
554 | 3.0672,-4.4117,3.8238,-0.81682,0
555 | 2.6463,-4.8152,6.3549,0.003003,0
556 | 2.2893,3.733,0.6312,-0.39786,0
557 | 1.5673,7.9274,-0.056842,-2.1694,0
558 | 4.0405,0.51524,1.0279,1.106,0
559 | 4.3846,-4.8794,3.3662,-0.029324,0
560 | 2.0165,-0.25246,5.1707,1.0763,0
561 | 4.0446,11.1741,-4.3582,-4.7401,0
562 | -0.33729,-0.64976,7.6659,0.72326,0
563 | -2.4604,12.7302,0.91738,-7.6418,0
564 | 4.1195,10.9258,-3.8929,-4.1802,0
565 | 2.0193,0.82356,4.6369,1.4202,0
566 | 1.5701,7.9129,0.29018,-2.1953,0
567 | 2.6415,7.586,-0.28562,-1.6677,0
568 | 5.0214,8.0764,-3.0515,-1.7155,0
569 | 4.3435,3.3295,0.83598,0.64955,0
570 | 1.8238,-6.7748,8.3873,-0.54139,0
571 | 3.9382,0.9291,0.78543,0.6767,0
572 | 2.2517,-5.1422,4.2916,-1.2487,0
573 | 5.504,10.3671,-4.413,-4.0211,0
574 | 2.8521,9.171,-3.6461,-1.2047,0
575 | 1.1676,9.1566,-2.0867,-0.80647,0
576 | 2.6104,8.0081,-0.23592,-1.7608,0
577 | 0.32444,10.067,-1.1982,-4.1284,0
578 | 3.8962,-4.7904,3.3954,-0.53751,0
579 | 2.1752,-0.8091,5.1022,-0.67975,0
580 | 1.1588,8.9331,-2.0807,-1.1272,0
581 | 4.7072,8.2957,-2.5605,-1.4905,0
582 | -1.9667,11.8052,-0.40472,-7.8719,0
583 | 4.0552,0.40143,1.4563,0.65343,0
584 | 2.3678,-6.839,8.4207,-0.44829,0
585 | 0.33565,6.8369,0.69718,-0.55691,0
586 | 4.3398,-5.3036,3.8803,-0.70432,0
587 | 1.5456,8.5482,0.4187,-2.1784,0
588 | 1.4276,8.3847,-2.0995,-1.9677,0
589 | -0.27802,8.1881,-3.1338,-2.5276,0
590 | 0.93611,8.6413,-1.6351,-1.3043,0
591 | 4.6352,-3.0087,2.6773,1.212,0
592 | 1.5268,-5.5871,8.6564,-1.722,0
593 | 0.95626,2.4728,4.4578,0.21636,0
594 | -2.7914,1.7734,6.7756,-0.39915,0
595 | 5.2032,3.5116,-1.2538,1.0129,0
596 | 3.1836,7.2321,-1.0713,-2.5909,0
597 | 0.65497,5.1815,1.0673,-0.42113,0
598 | 5.6084,10.3009,-4.8003,-4.3534,0
599 | 1.105,7.4432,0.41099,-3.0332,0
600 | 3.9292,-2.9156,2.2129,0.30817,0
601 | 1.1558,6.4003,1.5506,0.6961,0
602 | 2.5581,2.6218,1.8513,0.40257,0
603 | 2.7831,10.9796,-3.557,-4.4039,0
604 | 3.7635,2.7811,0.66119,0.34179,0
605 | -2.6479,10.1374,-1.331,-5.4707,0
606 | 1.0652,8.3682,-1.4004,-1.6509,0
607 | -1.4275,11.8797,0.41613,-6.9978,0
608 | 5.7456,10.1808,-4.7857,-4.3366,0
609 | 5.086,3.2798,-1.2701,1.1189,0
610 | 3.4092,5.4049,-2.5228,-0.89958,0
611 | -0.2361,9.3221,2.1307,-4.3793,0
612 | 3.8197,8.9951,-4.383,-4.0327,0
613 | -1.1391,1.8127,6.9144,0.70127,0
614 | 4.9249,0.68906,0.77344,1.2095,0
615 | 2.5089,6.841,-0.029423,0.44912,0
616 | -0.2062,9.2207,-3.7044,-6.8103,0
617 | 3.946,6.8514,-1.5443,-0.5582,0
618 | -0.278,8.1881,-3.1338,-2.5276,0
619 | 1.8592,3.2074,-0.15966,-0.26208,0
620 | 0.56953,7.6294,1.5754,-3.2233,0
621 | 3.4626,-4.449,3.5427,0.15429,0
622 | 3.3951,1.1484,2.1401,2.0862,0
623 | 5.0429,-0.52974,0.50439,1.106,0
624 | 3.7758,7.1783,-1.5195,0.40128,0
625 | 4.6562,7.6398,-2.4243,-1.2384,0
626 | 4.0948,-2.9674,2.3689,0.75429,0
627 | 1.8384,6.063,0.54723,0.51248,0
628 | 2.0153,0.43661,4.5864,-0.3151,0
629 | 3.5251,0.7201,1.6928,0.64438,0
630 | 3.757,-5.4236,3.8255,-1.2526,0
631 | 2.5989,3.5178,0.7623,0.81119,0
632 | 1.8994,0.97462,4.2265,0.81377,0
633 | 3.6941,-3.9482,4.2625,1.1577,0
634 | 4.4295,-2.3507,1.7048,0.90946,0
635 | 6.8248,5.2187,-2.5425,0.5461,0
636 | 1.8967,-2.5163,2.8093,-0.79742,0
637 | 2.1526,-6.1665,8.0831,-0.34355,0
638 | 3.3004,7.0811,-1.3258,0.22283,0
639 | 2.7213,7.05,-0.58808,0.41809,0
640 | 3.8846,-3.0336,2.5334,0.20214,0
641 | 4.1665,-0.4449,0.23448,0.27843,0
642 | 0.94225,5.8561,1.8762,-0.32544,0
643 | 5.1321,-0.031048,0.32616,1.1151,0
644 | 0.38251,6.8121,1.8128,-0.61251,0
645 | 3.0333,-2.5928,2.3183,0.303,0
646 | 2.9233,6.0464,-0.11168,-0.58665,0
647 | 1.162,10.2926,-1.2821,-4.0392,0
648 | 3.7791,2.5762,1.3098,0.5655,0
649 | 0.77765,5.9781,1.1941,-0.3526,0
650 | -0.38388,-1.0471,8.0514,0.49567,0
651 | 0.21084,9.4359,-0.094543,-1.859,0
652 | 2.9571,-4.5938,5.9068,0.57196,0
653 | 4.6439,-3.3729,2.5976,0.55257,0
654 | 3.3577,-4.3062,6.0241,0.18274,0
655 | 3.5127,2.9073,1.0579,0.40774,0
656 | 2.6562,10.7044,-3.3085,-4.0767,0
657 | -1.3612,10.694,1.7022,-2.9026,0
658 | -0.278,8.1881,-3.1338,-2.5276,0
659 | 1.04,-6.9321,8.2888,-1.2991,0
660 | 2.1881,2.7356,1.3278,-0.1832,0
661 | 4.2756,-2.6528,2.1375,0.94437,0
662 | -0.11996,6.8741,0.91995,-0.6694,0
663 | 2.9736,8.7944,-3.6359,-1.3754,0
664 | 3.7798,-3.3109,2.6491,0.066365,0
665 | 5.3586,3.7557,-1.7345,1.0789,0
666 | 1.8373,6.1292,0.84027,0.55257,0
667 | 1.2262,0.89599,5.7568,-0.11596,0
668 | -0.048008,-0.56078,7.7215,0.453,0
669 | 0.5706,-0.024841,1.2421,-0.56208,0
670 | 4.3634,0.46351,1.4281,2.0202,0
671 | 3.482,-4.1634,3.5008,-0.078462,0
672 | 0.51947,-3.2633,3.0895,-0.98492,0
673 | 2.3164,-2.628,3.1529,-0.08622,0
674 | -1.8348,11.0334,3.1863,-4.8888,0
675 | 1.3754,8.8793,-1.9136,-0.53751,0
676 | -0.16682,5.8974,0.49839,-0.70044,0
677 | 0.29961,7.1328,-0.31475,-1.1828,0
678 | 0.25035,9.3262,-3.6873,-6.2543,0
679 | 2.4673,1.3926,1.7125,0.41421,0
680 | 0.77805,6.6424,-1.1425,-1.0573,0
681 | 3.4465,2.9508,1.0271,0.5461,0
682 | 2.2429,-4.1427,5.2333,-0.40173,0
683 | 3.7321,-3.884,3.3577,-0.0060486,0
684 | 4.3365,-3.584,3.6884,0.74912,0
685 | -2.0759,10.8223,2.6439,-4.837,0
686 | 4.0715,7.6398,-2.0824,-1.1698,0
687 | 0.76163,5.8209,1.1959,-0.64613,0
688 | -0.53966,7.3273,0.46583,-1.4543,0
689 | 2.6213,5.7919,0.065686,-1.5759,0
690 | 3.0242,-3.3378,2.5865,-0.54785,0
691 | 5.8519,5.3905,-2.4037,-0.061652,0
692 | 0.5706,-0.0248,1.2421,-0.5621,0
693 | 3.9771,11.1513,-3.9272,-4.3444,0
694 | 1.5478,9.1814,-1.6326,-1.7375,0
695 | 0.74054,0.36625,2.1992,0.48403,0
696 | 0.49571,10.2243,-1.097,-4.0159,0
697 | 1.645,7.8612,-0.87598,-3.5569,0
698 | 3.6077,6.8576,-1.1622,0.28231,0
699 | 3.2403,-3.7082,5.2804,0.41291,0
700 | 3.9166,10.2491,-4.0926,-4.4659,0
701 | 3.9262,6.0299,-2.0156,-0.065531,0
702 | 5.591,10.4643,-4.3839,-4.3379,0
703 | 3.7522,-3.6978,3.9943,1.3051,0
704 | 1.3114,4.5462,2.2935,0.22541,0
705 | 3.7022,6.9942,-1.8511,-0.12889,0
706 | 4.364,-3.1039,2.3757,0.78532,0
707 | 3.5829,1.4423,1.0219,1.4008,0
708 | 4.65,-4.8297,3.4553,-0.25174,0
709 | 5.1731,3.9606,-1.983,0.40774,0
710 | 3.2692,3.4184,0.20706,-0.066824,0
711 | 2.4012,1.6223,3.0312,0.71679,0
712 | 1.7257,-4.4697,8.2219,-1.8073,0
713 | 4.7965,6.9859,-1.9967,-0.35001,0
714 | 4.0962,10.1891,-3.9323,-4.1827,0
715 | 2.5559,3.3605,2.0321,0.26809,0
716 | 3.4916,8.5709,-3.0326,-0.59182,0
717 | 0.5195,-3.2633,3.0895,-0.9849,0
718 | 2.9856,7.2673,-0.409,-2.2431,0
719 | 4.0932,5.4132,-1.8219,0.23576,0
720 | 1.7748,-0.76978,5.5854,1.3039,0
721 | 5.2012,0.32694,0.17965,1.1797,0
722 | -0.45062,-1.3678,7.0858,-0.40303,0
723 | 4.8451,8.1116,-2.9512,-1.4724,0
724 | 0.74841,7.2756,1.1504,-0.5388,0
725 | 5.1213,8.5565,-3.3917,-1.5474,0
726 | 3.6181,-3.7454,2.8273,-0.71208,0
727 | 0.040498,8.5234,1.4461,-3.9306,0
728 | -2.6479,10.1374,-1.331,-5.4707,0
729 | 0.37984,0.70975,0.75716,-0.44441,0
730 | -0.95923,0.091039,6.2204,-1.4828,0
731 | 2.8672,10.0008,-3.2049,-3.1095,0
732 | 1.0182,9.109,-0.62064,-1.7129,0
733 | -2.7143,11.4535,2.1092,-3.9629,0
734 | 3.8244,-3.1081,2.4537,0.52024,0
735 | 2.7961,2.121,1.8385,0.38317,0
736 | 3.5358,6.7086,-0.81857,0.47886,0
737 | -0.7056,8.7241,2.2215,-4.5965,0
738 | 4.1542,7.2756,-2.4766,-1.2099,0
739 | 0.92703,9.4318,-0.66263,-1.6728,0
740 | 1.8216,-6.4748,8.0514,-0.41855,0
741 | -2.4473,12.6247,0.73573,-7.6612,0
742 | 3.5862,-3.0957,2.8093,0.24481,0
743 | 0.66191,9.6594,-0.28819,-1.6638,0
744 | 4.7926,1.7071,-0.051701,1.4926,0
745 | 4.9852,8.3516,-2.5425,-1.2823,0
746 | 0.75736,3.0294,2.9164,-0.068117,0
747 | 4.6499,7.6336,-1.9427,-0.37458,0
748 | -0.023579,7.1742,0.78457,-0.75734,0
749 | 0.85574,0.0082678,6.6042,-0.53104,0
750 | 0.88298,0.66009,6.0096,-0.43277,0
751 | 4.0422,-4.391,4.7466,1.137,0
752 | 2.2546,8.0992,-0.24877,-3.2698,0
753 | 0.38478,6.5989,-0.3336,-0.56466,0
754 | 3.1541,-5.1711,6.5991,0.57455,0
755 | 2.3969,0.23589,4.8477,1.437,0
756 | 4.7114,2.0755,-0.2702,1.2379,0
757 | 4.0127,10.1477,-3.9366,-4.0728,0
758 | 2.6606,3.1681,1.9619,0.18662,0
759 | 3.931,1.8541,-0.023425,1.2314,0
760 | 0.01727,8.693,1.3989,-3.9668,0
761 | 3.2414,0.40971,1.4015,1.1952,0
762 | 2.2504,3.5757,0.35273,0.2836,0
763 | -1.3971,3.3191,-1.3927,-1.9948,1
764 | 0.39012,-0.14279,-0.031994,0.35084,1
765 | -1.6677,-7.1535,7.8929,0.96765,1
766 | -3.8483,-12.8047,15.6824,-1.281,1
767 | -3.5681,-8.213,10.083,0.96765,1
768 | -2.2804,-0.30626,1.3347,1.3763,1
769 | -1.7582,2.7397,-2.5323,-2.234,1
770 | -0.89409,3.1991,-1.8219,-2.9452,1
771 | 0.3434,0.12415,-0.28733,0.14654,1
772 | -0.9854,-6.661,5.8245,0.5461,1
773 | -2.4115,-9.1359,9.3444,-0.65259,1
774 | -1.5252,-6.2534,5.3524,0.59912,1
775 | -0.61442,-0.091058,-0.31818,0.50214,1
776 | -0.36506,2.8928,-3.6461,-3.0603,1
777 | -5.9034,6.5679,0.67661,-6.6797,1
778 | -1.8215,2.7521,-0.72261,-2.353,1
779 | -0.77461,-1.8768,2.4023,1.1319,1
780 | -1.8187,-9.0366,9.0162,-0.12243,1
781 | -3.5801,-12.9309,13.1779,-2.5677,1
782 | -1.8219,-6.8824,5.4681,0.057313,1
783 | -0.3481,-0.38696,-0.47841,0.62627,1
784 | 0.47368,3.3605,-4.5064,-4.0431,1
785 | -3.4083,4.8587,-0.76888,-4.8668,1
786 | -1.6662,-0.30005,1.4238,0.024986,1
787 | -2.0962,-7.1059,6.6188,-0.33708,1
788 | -2.6685,-10.4519,9.1139,-1.7323,1
789 | -0.47465,-4.3496,1.9901,0.7517,1
790 | 1.0552,1.1857,-2.6411,0.11033,1
791 | 1.1644,3.8095,-4.9408,-4.0909,1
792 | -4.4779,7.3708,-0.31218,-6.7754,1
793 | -2.7338,0.45523,2.4391,0.21766,1
794 | -2.286,-5.4484,5.8039,0.88231,1
795 | -1.6244,-6.3444,4.6575,0.16981,1
796 | 0.50813,0.47799,-1.9804,0.57714,1
797 | 1.6408,4.2503,-4.9023,-2.6621,1
798 | 0.81583,4.84,-5.2613,-6.0823,1
799 | -5.4901,9.1048,-0.38758,-5.9763,1
800 | -3.2238,2.7935,0.32274,-0.86078,1
801 | -2.0631,-1.5147,1.219,0.44524,1
802 | -0.91318,-2.0113,-0.19565,0.066365,1
803 | 0.6005,1.9327,-3.2888,-0.32415,1
804 | 0.91315,3.3377,-4.0557,-1.6741,1
805 | -0.28015,3.0729,-3.3857,-2.9155,1
806 | -3.6085,3.3253,-0.51954,-3.5737,1
807 | -6.2003,8.6806,0.0091344,-3.703,1
808 | -4.2932,3.3419,0.77258,-0.99785,1
809 | -3.0265,-0.062088,0.68604,-0.055186,1
810 | -1.7015,-0.010356,-0.99337,-0.53104,1
811 | -0.64326,2.4748,-2.9452,-1.0276,1
812 | -0.86339,1.9348,-2.3729,-1.0897,1
813 | -2.0659,1.0512,-0.46298,-1.0974,1
814 | -2.1333,1.5685,-0.084261,-1.7453,1
815 | -1.2568,-1.4733,2.8718,0.44653,1
816 | -3.1128,-6.841,10.7402,-1.0172,1
817 | -4.8554,-5.9037,10.9818,-0.82199,1
818 | -2.588,3.8654,-0.3336,-1.2797,1
819 | 0.24394,1.4733,-1.4192,-0.58535,1
820 | -1.5322,-5.0966,6.6779,0.17498,1
821 | -4.0025,-13.4979,17.6772,-3.3202,1
822 | -4.0173,-8.3123,12.4547,-1.4375,1
823 | -3.0731,-0.53181,2.3877,0.77627,1
824 | -1.979,3.2301,-1.3575,-2.5819,1
825 | -0.4294,-0.14693,0.044265,-0.15605,1
826 | -2.234,-7.0314,7.4936,0.61334,1
827 | -4.211,-12.4736,14.9704,-1.3884,1
828 | -3.8073,-8.0971,10.1772,0.65084,1
829 | -2.5912,-0.10554,1.2798,1.0414,1
830 | -2.2482,3.0915,-2.3969,-2.6711,1
831 | -1.4427,3.2922,-1.9702,-3.4392,1
832 | -0.39416,-0.020702,-0.066267,-0.44699,1
833 | -1.522,-6.6383,5.7491,-0.10691,1
834 | -2.8267,-9.0407,9.0694,-0.98233,1
835 | -1.7263,-6.0237,5.2419,0.29524,1
836 | -0.94255,0.039307,-0.24192,0.31593,1
837 | -0.89569,3.0025,-3.6067,-3.4457,1
838 | -6.2815,6.6651,0.52581,-7.0107,1
839 | -2.3211,3.166,-1.0002,-2.7151,1
840 | -1.3414,-2.0776,2.8093,0.60688,1
841 | -2.258,-9.3263,9.3727,-0.85949,1
842 | -3.8858,-12.8461,12.7957,-3.1353,1
843 | -1.8969,-6.7893,5.2761,-0.32544,1
844 | -0.52645,-0.24832,-0.45613,0.41938,1
845 | 0.0096613,3.5612,-4.407,-4.4103,1
846 | -3.8826,4.898,-0.92311,-5.0801,1
847 | -2.1405,-0.16762,1.321,-0.20906,1
848 | -2.4824,-7.3046,6.839,-0.59053,1
849 | -2.9098,-10.0712,8.4156,-1.9948,1
850 | -0.60975,-4.002,1.8471,0.6017,1
851 | 0.83625,1.1071,-2.4706,-0.062945,1
852 | 0.60731,3.9544,-4.772,-4.4853,1
853 | -4.8861,7.0542,-0.17252,-6.959,1
854 | -3.1366,0.42212,2.6225,-0.064238,1
855 | -2.5754,-5.6574,6.103,0.65214,1
856 | -1.8782,-6.5865,4.8486,-0.021566,1
857 | 0.24261,0.57318,-1.9402,0.44007,1
858 | 1.296,4.2855,-4.8457,-2.9013,1
859 | 0.25943,5.0097,-5.0394,-6.3862,1
860 | -5.873,9.1752,-0.27448,-6.0422,1
861 | -3.4605,2.6901,0.16165,-1.0224,1
862 | -2.3797,-1.4402,1.1273,0.16076,1
863 | -1.2424,-1.7175,-0.52553,-0.21036,1
864 | 0.20216,1.9182,-3.2828,-0.61768,1
865 | 0.59823,3.5012,-3.9795,-1.7841,1
866 | -0.77995,3.2322,-3.282,-3.1004,1
867 | -4.1409,3.4619,-0.47841,-3.8879,1
868 | -6.5084,8.7696,0.23191,-3.937,1
869 | -4.4996,3.4288,0.56265,-1.1672,1
870 | -3.3125,0.10139,0.55323,-0.2957,1
871 | -1.9423,0.3766,-1.2898,-0.82458,1
872 | -0.75793,2.5349,-3.0464,-1.2629,1
873 | -0.95403,1.9824,-2.3163,-1.1957,1
874 | -2.2173,1.4671,-0.72689,-1.1724,1
875 | -2.799,1.9679,-0.42357,-2.1125,1
876 | -1.8629,-0.84841,2.5377,0.097399,1
877 | -3.5916,-6.2285,10.2389,-1.1543,1
878 | -5.1216,-5.3118,10.3846,-1.0612,1
879 | -3.2854,4.0372,-0.45356,-1.8228,1
880 | -0.56877,1.4174,-1.4252,-1.1246,1
881 | -2.3518,-4.8359,6.6479,-0.060358,1
882 | -4.4861,-13.2889,17.3087,-3.2194,1
883 | -4.3876,-7.7267,11.9655,-1.4543,1
884 | -3.3604,-0.32696,2.1324,0.6017,1
885 | -1.0112,2.9984,-1.1664,-1.6185,1
886 | 0.030219,-1.0512,1.4024,0.77369,1
887 | -1.6514,-8.4985,9.1122,1.2379,1
888 | -3.2692,-12.7406,15.5573,-0.14182,1
889 | -2.5701,-6.8452,8.9999,2.1353,1
890 | -1.3066,0.25244,0.7623,1.7758,1
891 | -1.6637,3.2881,-2.2701,-2.2224,1
892 | -0.55008,2.8659,-1.6488,-2.4319,1
893 | 0.21431,-0.69529,0.87711,0.29653,1
894 | -0.77288,-7.4473,6.492,0.36119,1
895 | -1.8391,-9.0883,9.2416,-0.10432,1
896 | -0.63298,-5.1277,4.5624,1.4797,1
897 | 0.0040545,0.62905,-0.64121,0.75817,1
898 | -0.28696,3.1784,-3.5767,-3.1896,1
899 | -5.2406,6.6258,-0.19908,-6.8607,1
900 | -1.4446,2.1438,-0.47241,-1.6677,1
901 | -0.65767,-2.8018,3.7115,0.99739,1
902 | -1.5449,-10.1498,9.6152,-1.2332,1
903 | -2.8957,-12.0205,11.9149,-2.7552,1
904 | -0.81479,-5.7381,4.3919,0.3211,1
905 | 0.50225,0.65388,-1.1793,0.39998,1
906 | 0.74521,3.6357,-4.4044,-4.1414,1
907 | -2.9146,4.0537,-0.45699,-4.0327,1
908 | -1.3907,-1.3781,2.3055,-0.021566,1
909 | -1.786,-8.1157,7.0858,-1.2112,1
910 | -1.7322,-9.2828,7.719,-1.7168,1
911 | 0.55298,-3.4619,1.7048,1.1008,1
912 | 2.031,1.852,-3.0121,0.003003,1
913 | 1.2279,4.0309,-4.6435,-3.9125,1
914 | -4.2249,6.2699,0.15822,-5.5457,1
915 | -2.5346,-0.77392,3.3602,0.00171,1
916 | -1.749,-6.332,6.0987,0.14266,1
917 | -0.539,-5.167,3.4399,0.052141,1
918 | 1.5631,0.89599,-1.9702,0.65472,1
919 | 2.3917,4.5565,-4.9888,-2.8987,1
920 | 0.89512,4.7738,-4.8431,-5.5909,1
921 | -5.4808,8.1819,0.27818,-5.0323,1
922 | -2.8833,1.7713,0.68946,-0.4638,1
923 | -1.4174,-2.2535,1.518,0.61981,1
924 | 0.4283,-0.94981,-1.0731,0.3211,1
925 | 1.5904,2.2121,-3.1183,-0.11725,1
926 | 1.7425,3.6833,-4.0129,-1.7207,1
927 | -0.23356,3.2405,-3.0669,-2.7784,1
928 | -3.6227,3.9958,-0.35845,-3.9047,1
929 | -6.1536,7.9295,0.61663,-3.2646,1
930 | -3.9172,2.6652,0.78886,-0.7819,1
931 | -2.2214,-0.23798,0.56008,0.05602,1
932 | -0.49241,0.89392,-1.6283,-0.56854,1
933 | 0.26517,2.4066,-2.8416,-0.59958,1
934 | -0.10234,1.8189,-2.2169,-0.56725,1
935 | -1.6176,1.0926,-0.35502,-0.59958,1
936 | -1.8448,1.254,0.27218,-1.0728,1
937 | -1.2786,-2.4087,4.5735,0.47627,1
938 | -2.902,-7.6563,11.8318,-0.84268,1
939 | -4.3773,-5.5167,10.939,-0.4082,1
940 | -2.0529,3.8385,-0.79544,-1.2138,1
941 | 0.18868,0.70148,-0.51182,0.0055892,1
942 | -1.7279,-6.841,8.9494,0.68058,1
943 | -3.3793,-13.7731,17.9274,-2.0323,1
944 | -3.1273,-7.1121,11.3897,-0.083634,1
945 | -2.121,-0.05588,1.949,1.353,1
946 | -1.7697,3.4329,-1.2144,-2.3789,1
947 | -0.0012852,0.13863,-0.19651,0.0081754,1
948 | -1.682,-6.8121,7.1398,1.3323,1
949 | -3.4917,-12.1736,14.3689,-0.61639,1
950 | -3.1158,-8.6289,10.4403,0.97153,1
951 | -2.0891,-0.48422,1.704,1.7435,1
952 | -1.6936,2.7852,-2.1835,-1.9276,1
953 | -1.2846,3.2715,-1.7671,-3.2608,1
954 | -0.092194,0.39315,-0.32846,-0.13794,1
955 | -1.0292,-6.3879,5.5255,0.79955,1
956 | -2.2083,-9.1069,8.9991,-0.28406,1
957 | -1.0744,-6.3113,5.355,0.80472,1
958 | -0.51003,-0.23591,0.020273,0.76334,1
959 | -0.36372,3.0439,-3.4816,-2.7836,1
960 | -6.3979,6.4479,1.0836,-6.6176,1
961 | -2.2501,3.3129,-0.88369,-2.8974,1
962 | -1.1859,-1.2519,2.2635,0.77239,1
963 | -1.8076,-8.8131,8.7086,-0.21682,1
964 | -3.3863,-12.9889,13.0545,-2.7202,1
965 | -1.4106,-7.108,5.6454,0.31335,1
966 | -0.21394,-0.68287,0.096532,1.1965,1
967 | 0.48797,3.5674,-4.3882,-3.8116,1
968 | -3.8167,5.1401,-0.65063,-5.4306,1
969 | -1.9555,0.20692,1.2473,-0.3707,1
970 | -2.1786,-6.4479,6.0344,-0.20777,1
971 | -2.3299,-9.9532,8.4756,-1.8733,1
972 | 0.0031201,-4.0061,1.7956,0.91722,1
973 | 1.3518,1.0595,-2.3437,0.39998,1
974 | 1.2309,3.8923,-4.8277,-4.0069,1
975 | -5.0301,7.5032,-0.13396,-7.5034,1
976 | -3.0799,0.60836,2.7039,-0.23751,1
977 | -2.2987,-5.227,5.63,0.91722,1
978 | -1.239,-6.541,4.8151,-0.033204,1
979 | 0.75896,0.29176,-1.6506,0.83834,1
980 | 1.6799,4.2068,-4.5398,-2.3931,1
981 | 0.63655,5.2022,-5.2159,-6.1211,1
982 | -6.0598,9.2952,-0.43642,-6.3694,1
983 | -3.518,2.8763,0.1548,-1.2086,1
984 | -2.0336,-1.4092,1.1582,0.36507,1
985 | -0.69745,-1.7672,-0.34474,-0.12372,1
986 | 0.75108,1.9161,-3.1098,-0.20518,1
987 | 0.84546,3.4826,-3.6307,-1.3961,1
988 | -0.55648,3.2136,-3.3085,-2.7965,1
989 | -3.6817,3.2239,-0.69347,-3.4004,1
990 | -6.7526,8.8172,-0.061983,-3.725,1
991 | -4.577,3.4515,0.66719,-0.94742,1
992 | -2.9883,0.31245,0.45041,0.068951,1
993 | -1.4781,0.14277,-1.1622,-0.48579,1
994 | -0.46651,2.3383,-2.9812,-1.0431,1
995 | -0.8734,1.6533,-2.1964,-0.78061,1
996 | -2.1234,1.1815,-0.55552,-0.81165,1
997 | -2.3142,2.0838,-0.46813,-1.6767,1
998 | -1.4233,-0.98912,2.3586,0.39481,1
999 | -3.0866,-6.6362,10.5405,-0.89182,1
1000 | -4.7331,-6.1789,11.388,-1.0741,1
1001 | -2.8829,3.8964,-0.1888,-1.1672,1
1002 | -0.036127,1.525,-1.4089,-0.76121,1
1003 | -1.7104,-4.778,6.2109,0.3974,1
1004 | -3.8203,-13.0551,16.9583,-2.3052,1
1005 | -3.7181,-8.5089,12.363,-0.95518,1
1006 | -2.899,-0.60424,2.6045,1.3776,1
1007 | -0.98193,2.7956,-1.2341,-1.5668,1
1008 | -0.17296,-1.1816,1.3818,0.7336,1
1009 | -1.9409,-8.6848,9.155,0.94049,1
1010 | -3.5713,-12.4922,14.8881,-0.47027,1
1011 | -2.9915,-6.6258,8.6521,1.8198,1
1012 | -1.8483,0.31038,0.77344,1.4189,1
1013 | -2.2677,3.2964,-2.2563,-2.4642,1
1014 | -0.50816,2.868,-1.8108,-2.2612,1
1015 | 0.14329,-1.0885,1.0039,0.48791,1
1016 | -0.90784,-7.9026,6.7807,0.34179,1
1017 | -2.0042,-9.3676,9.3333,-0.10303,1
1018 | -0.93587,-5.1008,4.5367,1.3866,1
1019 | -0.40804,0.54214,-0.52725,0.6586,1
1020 | -0.8172,3.3812,-3.6684,-3.456,1
1021 | -4.8392,6.6755,-0.24278,-6.5775,1
1022 | -1.2792,2.1376,-0.47584,-1.3974,1
1023 | -0.66008,-3.226,3.8058,1.1836,1
1024 | -1.7713,-10.7665,10.2184,-1.0043,1
1025 | -3.0061,-12.2377,11.9552,-2.1603,1
1026 | -1.1022,-5.8395,4.5641,0.68705,1
1027 | 0.11806,0.39108,-0.98223,0.42843,1
1028 | 0.11686,3.735,-4.4379,-4.3741,1
1029 | -2.7264,3.9213,-0.49212,-3.6371,1
1030 | -1.2369,-1.6906,2.518,0.51636,1
1031 | -1.8439,-8.6475,7.6796,-0.66682,1
1032 | -1.8554,-9.6035,7.7764,-0.97716,1
1033 | 0.16358,-3.3584,1.3749,1.3569,1
1034 | 1.5077,1.9596,-3.0584,-0.12243,1
1035 | 0.67886,4.1199,-4.569,-4.1414,1
1036 | -3.9934,5.8333,0.54723,-4.9379,1
1037 | -2.3898,-0.78427,3.0141,0.76205,1
1038 | -1.7976,-6.7686,6.6753,0.89912,1
1039 | -0.70867,-5.5602,4.0483,0.903,1
1040 | 1.0194,1.1029,-2.3,0.59395,1
1041 | 1.7875,4.78,-5.1362,-3.2362,1
1042 | 0.27331,4.8773,-4.9194,-5.8198,1
1043 | -5.1661,8.0433,0.044265,-4.4983,1
1044 | -2.7028,1.6327,0.83598,-0.091393,1
1045 | -1.4904,-2.2183,1.6054,0.89394,1
1046 | -0.014902,-1.0243,-0.94024,0.64955,1
1047 | 0.88992,2.2638,-3.1046,-0.11855,1
1048 | 1.0637,3.6957,-4.1594,-1.9379,1
1049 | -0.8471,3.1329,-3.0112,-2.9388,1
1050 | -3.9594,4.0289,-0.35845,-3.8957,1
1051 | -5.8818,7.6584,0.5558,-2.9155,1
1052 | -3.7747,2.5162,0.83341,-0.30993,1
1053 | -2.4198,-0.24418,0.70146,0.41809,1
1054 | -0.83535,0.80494,-1.6411,-0.19225,1
1055 | -0.30432,2.6528,-2.7756,-0.65647,1
1056 | -0.60254,1.7237,-2.1501,-0.77027,1
1057 | -2.1059,1.1815,-0.53324,-0.82716,1
1058 | -2.0441,1.2271,0.18564,-1.091,1
1059 | -1.5621,-2.2121,4.2591,0.27972,1
1060 | -3.2305,-7.2135,11.6433,-0.94613,1
1061 | -4.8426,-4.9932,10.4052,-0.53104,1
1062 | -2.3147,3.6668,-0.6969,-1.2474,1
1063 | -0.11716,0.60422,-0.38587,-0.059065,1
1064 | -2.0066,-6.719,9.0162,0.099985,1
1065 | -3.6961,-13.6779,17.5795,-2.6181,1
1066 | -3.6012,-6.5389,10.5234,-0.48967,1
1067 | -2.6286,0.18002,1.7956,0.97282,1
1068 | -0.82601,2.9611,-1.2864,-1.4647,1
1069 | 0.31803,-0.99326,1.0947,0.88619,1
1070 | -1.4454,-8.4385,8.8483,0.96894,1
1071 | -3.1423,-13.0365,15.6773,-0.66165,1
1072 | -2.5373,-6.959,8.8054,1.5289,1
1073 | -1.366,0.18416,0.90539,1.5806,1
1074 | -1.7064,3.3088,-2.2829,-2.1978,1
1075 | -0.41965,2.9094,-1.7859,-2.2069,1
1076 | 0.37637,-0.82358,0.78543,0.74524,1
1077 | -0.55355,-7.9233,6.7156,0.74394,1
1078 | -1.6001,-9.5828,9.4044,0.081882,1
1079 | -0.37013,-5.554,4.7749,1.547,1
1080 | 0.12126,0.22347,-0.47327,0.97024,1
1081 | -0.27068,3.2674,-3.5562,-3.0888,1
1082 | -5.119,6.6486,-0.049987,-6.5206,1
1083 | -1.3946,2.3134,-0.44499,-1.4905,1
1084 | -0.69879,-3.3771,4.1211,1.5043,1
1085 | -1.48,-10.5244,9.9176,-0.5026,1
1086 | -2.6649,-12.813,12.6689,-1.9082,1
1087 | -0.62684,-6.301,4.7843,1.106,1
1088 | 0.518,0.25865,-0.84085,0.96118,1
1089 | 0.64376,3.764,-4.4738,-4.0483,1
1090 | -2.9821,4.1986,-0.5898,-3.9642,1
1091 | -1.4628,-1.5706,2.4357,0.49826,1
1092 | -1.7101,-8.7903,7.9735,-0.45475,1
1093 | -1.5572,-9.8808,8.1088,-1.0806,1
1094 | 0.74428,-3.7723,1.6131,1.5754,1
1095 | 2.0177,1.7982,-2.9581,0.2099,1
1096 | 1.164,3.913,-4.5544,-3.8672,1
1097 | -4.3667,6.0692,0.57208,-5.4668,1
1098 | -2.5919,-1.0553,3.8949,0.77757,1
1099 | -1.8046,-6.8141,6.7019,1.1681,1
1100 | -0.71868,-5.7154,3.8298,1.0233,1
1101 | 1.4378,0.66837,-2.0267,1.0271,1
1102 | 2.1943,4.5503,-4.976,-2.7254,1
1103 | 0.7376,4.8525,-4.7986,-5.6659,1
1104 | -5.637,8.1261,0.13081,-5.0142,1
1105 | -3.0193,1.7775,0.73745,-0.45346,1
1106 | -1.6706,-2.09,1.584,0.71162,1
1107 | -0.1269,-1.1505,-0.95138,0.57843,1
1108 | 1.2198,2.0982,-3.1954,0.12843,1
1109 | 1.4501,3.6067,-4.0557,-1.5966,1
1110 | -0.40857,3.0977,-2.9607,-2.6892,1
1111 | -3.8952,3.8157,-0.31304,-3.8194,1
1112 | -6.3679,8.0102,0.4247,-3.2207,1
1113 | -4.1429,2.7749,0.68261,-0.71984,1
1114 | -2.6864,-0.097265,0.61663,0.061192,1
1115 | -1.0555,0.79459,-1.6968,-0.46768,1
1116 | -0.29858,2.4769,-2.9512,-0.66165,1
1117 | -0.49948,1.7734,-2.2469,-0.68104,1
1118 | -1.9881,0.99945,-0.28562,-0.70044,1
1119 | -1.9389,1.5706,0.045979,-1.122,1
1120 | -1.4375,-1.8624,4.026,0.55127,1
1121 | -3.1875,-7.5756,11.8678,-0.57889,1
1122 | -4.6765,-5.6636,10.969,-0.33449,1
1123 | -2.0285,3.8468,-0.63435,-1.175,1
1124 | 0.26637,0.73252,-0.67891,0.03533,1
1125 | -1.7589,-6.4624,8.4773,0.31981,1
1126 | -3.5985,-13.6593,17.6052,-2.4927,1
1127 | -3.3582,-7.2404,11.4419,-0.57113,1
1128 | -2.3629,-0.10554,1.9336,1.1358,1
1129 | -2.1802,3.3791,-1.2256,-2.6621,1
1130 | -0.40951,-0.15521,0.060545,-0.088807,1
1131 | -2.2918,-7.257,7.9597,0.9211,1
1132 | -4.0214,-12.8006,15.6199,-0.95647,1
1133 | -3.3884,-8.215,10.3315,0.98187,1
1134 | -2.0046,-0.49457,1.333,1.6543,1
1135 | -1.7063,2.7956,-2.378,-2.3491,1
1136 | -1.6386,3.3584,-1.7302,-3.5646,1
1137 | -0.41645,0.32487,-0.33617,-0.36036,1
1138 | -1.5877,-6.6072,5.8022,0.31593,1
1139 | -2.5961,-9.349,9.7942,-0.28018,1
1140 | -1.5228,-6.4789,5.7568,0.87325,1
1141 | -0.53072,-0.097265,-0.21793,1.0426,1
1142 | -0.49081,2.8452,-3.6436,-3.1004,1
1143 | -6.5773,6.8017,0.85483,-7.5344,1
1144 | -2.4621,2.7645,-0.62578,-2.8573,1
1145 | -1.3995,-1.9162,2.5154,0.59912,1
1146 | -2.3221,-9.3304,9.233,-0.79871,1
1147 | -3.73,-12.9723,12.9817,-2.684,1
1148 | -1.6988,-7.1163,5.7902,0.16723,1
1149 | -0.26654,-0.64562,-0.42014,0.89136,1
1150 | 0.33325,3.3108,-4.5081,-4.012,1
1151 | -4.2091,4.7283,-0.49126,-5.2159,1
1152 | -2.3142,-0.68494,1.9833,-0.44829,1
1153 | -2.4835,-7.4494,6.8964,-0.64484,1
1154 | -2.7611,-10.5099,9.0239,-1.9547,1
1155 | -0.36025,-4.449,2.1067,0.94308,1
1156 | 1.0117,0.9022,-2.3506,0.42714,1
1157 | 0.96708,3.8426,-4.9314,-4.1323,1
1158 | -5.2049,7.259,0.070827,-7.3004,1
1159 | -3.3203,-0.02691,2.9618,-0.44958,1
1160 | -2.565,-5.7899,6.0122,0.046968,1
1161 | -1.5951,-6.572,4.7689,-0.94354,1
1162 | 0.7049,0.17174,-1.7859,0.36119,1
1163 | 1.7331,3.9544,-4.7412,-2.5017,1
1164 | 0.6818,4.8504,-5.2133,-6.1043,1
1165 | -6.3364,9.2848,0.014275,-6.7844,1
1166 | -3.8053,2.4273,0.6809,-1.0871,1
1167 | -2.1979,-2.1252,1.7151,0.45171,1
1168 | -0.87874,-2.2121,-0.051701,0.099985,1
1169 | 0.74067,1.7299,-3.1963,-0.1457,1
1170 | 0.98296,3.4226,-3.9692,-1.7116,1
1171 | -0.3489,3.1929,-3.4054,-3.1832,1
1172 | -3.8552,3.5219,-0.38415,-3.8608,1
1173 | -6.9599,8.9931,0.2182,-4.572,1
1174 | -4.7462,3.1205,1.075,-1.2966,1
1175 | -3.2051,-0.14279,0.97565,0.045675,1
1176 | -1.7549,-0.080711,-0.75774,-0.3707,1
1177 | -0.59587,2.4811,-2.8673,-0.89828,1
1178 | -0.89542,2.0279,-2.3652,-1.2746,1
1179 | -2.0754,1.2767,-0.64206,-1.2642,1
1180 | -3.2778,1.8023,0.1805,-2.3931,1
1181 | -2.2183,-1.254,2.9986,0.36378,1
1182 | -3.5895,-6.572,10.5251,-0.16381,1
1183 | -5.0477,-5.8023,11.244,-0.3901,1
1184 | -3.5741,3.944,-0.07912,-2.1203,1
1185 | -0.7351,1.7361,-1.4938,-1.1582,1
1186 | -2.2617,-4.7428,6.3489,0.11162,1
1187 | -4.244,-13.0634,17.1116,-2.8017,1
1188 | -4.0218,-8.304,12.555,-1.5099,1
1189 | -3.0201,-0.67253,2.7056,0.85774,1
1190 | -2.4941,3.5447,-1.3721,-2.8483,1
1191 | -0.83121,0.039307,0.05369,-0.23105,1
1192 | -2.5665,-6.8824,7.5416,0.70774,1
1193 | -4.4018,-12.9371,15.6559,-1.6806,1
1194 | -3.7573,-8.2916,10.3032,0.38059,1
1195 | -2.4725,-0.40145,1.4855,1.1189,1
1196 | -1.9725,2.8825,-2.3086,-2.3724,1
1197 | -2.0149,3.6874,-1.9385,-3.8918,1
1198 | -0.82053,0.65181,-0.48869,-0.52716,1
1199 | -1.7886,-6.3486,5.6154,0.42584,1
1200 | -2.9138,-9.4711,9.7668,-0.60216,1
1201 | -1.8343,-6.5907,5.6429,0.54998,1
1202 | -0.8734,-0.033118,-0.20165,0.55774,1
1203 | -0.70346,2.957,-3.5947,-3.1457,1
1204 | -6.7387,6.9879,0.67833,-7.5887,1
1205 | -2.7723,3.2777,-0.9351,-3.1457,1
1206 | -1.6641,-1.3678,1.997,0.52283,1
1207 | -2.4349,-9.2497,8.9922,-0.50001,1
1208 | -3.793,-12.7095,12.7957,-2.825,1
1209 | -1.9551,-6.9756,5.5383,-0.12889,1
1210 | -0.69078,-0.50077,-0.35417,0.47498,1
1211 | 0.025013,3.3998,-4.4327,-4.2655,1
1212 | -4.3967,4.9601,-0.64892,-5.4719,1
1213 | -2.456,-0.24418,1.4041,-0.45863,1
1214 | -2.62,-6.8555,6.2169,-0.62285,1
1215 | -2.9662,-10.3257,8.784,-2.1138,1
1216 | -0.71494,-4.4448,2.2241,0.49826,1
1217 | 0.6005,0.99945,-2.2126,0.097399,1
1218 | 0.61652,3.8944,-4.7275,-4.3948,1
1219 | -5.4414,7.2363,0.10938,-7.5642,1
1220 | -3.5798,0.45937,2.3457,-0.45734,1
1221 | -2.7769,-5.6967,5.9179,0.37671,1
1222 | -1.8356,-6.7562,5.0585,-0.55044,1
1223 | 0.30081,0.17381,-1.7542,0.48921,1
1224 | 1.3403,4.1323,-4.7018,-2.5987,1
1225 | 0.26877,4.987,-5.1508,-6.3913,1
1226 | -6.5235,9.6014,-0.25392,-6.9642,1
1227 | -4.0679,2.4955,0.79571,-1.1039,1
1228 | -2.564,-1.7051,1.5026,0.32757,1
1229 | -1.3414,-1.9162,-0.15538,-0.11984,1
1230 | 0.23874,2.0879,-3.3522,-0.66553,1
1231 | 0.6212,3.6771,-4.0771,-2.0711,1
1232 | -0.77848,3.4019,-3.4859,-3.5569,1
1233 | -4.1244,3.7909,-0.6532,-4.1802,1
1234 | -7.0421,9.2,0.25933,-4.6832,1
1235 | -4.9462,3.5716,0.82742,-1.4957,1
1236 | -3.5359,0.30417,0.6569,-0.2957,1
1237 | -2.0662,0.16967,-1.0054,-0.82975,1
1238 | -0.88728,2.808,-3.1432,-1.2035,1
1239 | -1.0941,2.3072,-2.5237,-1.4453,1
1240 | -2.4458,1.6285,-0.88541,-1.4802,1
1241 | -3.551,1.8955,0.1865,-2.4409,1
1242 | -2.2811,-0.85669,2.7185,0.044382,1
1243 | -3.6053,-5.974,10.0916,-0.82846,1
1244 | -5.0676,-5.1877,10.4266,-0.86725,1
1245 | -3.9204,4.0723,-0.23678,-2.1151,1
1246 | -1.1306,1.8458,-1.3575,-1.3806,1
1247 | -2.4561,-4.5566,6.4534,-0.056479,1
1248 | -4.4775,-13.0303,17.0834,-3.0345,1
1249 | -4.1958,-8.1819,12.1291,-1.6017,1
1250 | -3.38,-0.7077,2.5325,0.71808,1
1251 | -2.4365,3.6026,-1.4166,-2.8948,1
1252 | -0.77688,0.13036,-0.031137,-0.35389,1
1253 | -2.7083,-6.8266,7.5339,0.59007,1
1254 | -4.5531,-12.5854,15.4417,-1.4983,1
1255 | -3.8894,-7.8322,9.8208,0.47498,1
1256 | -2.5084,-0.22763,1.488,1.2069,1
1257 | -2.1652,3.0211,-2.4132,-2.4241,1
1258 | -1.8974,3.5074,-1.7842,-3.8491,1
1259 | -0.62043,0.5587,-0.38587,-0.66423,1
1260 | -1.8387,-6.301,5.6506,0.19567,1
1261 | -3,-9.1566,9.5766,-0.73018,1
1262 | -1.9116,-6.1603,5.606,0.48533,1
1263 | -1.005,0.084831,-0.2462,0.45688,1
1264 | -0.87834,3.257,-3.6778,-3.2944,1
1265 | -6.651,6.7934,0.68604,-7.5887,1
1266 | -2.5463,3.1101,-0.83228,-3.0358,1
1267 | -1.4377,-1.432,2.1144,0.42067,1
1268 | -2.4554,-9.0407,8.862,-0.86983,1
1269 | -3.9411,-12.8792,13.0597,-3.3125,1
1270 | -2.1241,-6.8969,5.5992,-0.47156,1
1271 | -0.74324,-0.32902,-0.42785,0.23317,1
1272 | -0.071503,3.7412,-4.5415,-4.2526,1
1273 | -4.2333,4.9166,-0.49212,-5.3207,1
1274 | -2.3675,-0.43663,1.692,-0.43018,1
1275 | -2.5526,-7.3625,6.9255,-0.66811,1
1276 | -3.0986,-10.4602,8.9717,-2.3427,1
1277 | -0.89809,-4.4862,2.2009,0.50731,1
1278 | 0.56232,1.0015,-2.2726,-0.0060486,1
1279 | 0.53936,3.8944,-4.8166,-4.3418,1
1280 | -5.3012,7.3915,0.029699,-7.3987,1
1281 | -3.3553,0.35591,2.6473,-0.37846,1
1282 | -2.7908,-5.7133,5.953,0.45946,1
1283 | -1.9983,-6.6072,4.8254,-0.41984,1
1284 | 0.15423,0.11794,-1.6823,0.59524,1
1285 | 1.208,4.0744,-4.7635,-2.6129,1
1286 | 0.2952,4.8856,-5.149,-6.2323,1
1287 | -6.4247,9.5311,0.022844,-6.8517,1
1288 | -3.9933,2.6218,0.62863,-1.1595,1
1289 | -2.659,-1.6058,1.3647,0.16464,1
1290 | -1.4094,-2.1252,-0.10397,-0.19225,1
1291 | 0.11032,1.9741,-3.3668,-0.65259,1
1292 | 0.52374,3.644,-4.0746,-1.9909,1
1293 | -0.76794,3.4598,-3.4405,-3.4276,1
1294 | -3.9698,3.6812,-0.60008,-4.0133,1
1295 | -7.0364,9.2931,0.16594,-4.5396,1
1296 | -4.9447,3.3005,1.063,-1.444,1
1297 | -3.5933,0.22968,0.7126,-0.3332,1
1298 | -2.1674,0.12415,-1.0465,-0.86208,1
1299 | -0.9607,2.6963,-3.1226,-1.3121,1
1300 | -1.0802,2.1996,-2.5862,-1.2759,1
1301 | -2.3277,1.4381,-0.82114,-1.2862,1
1302 | -3.7244,1.9037,-0.035421,-2.5095,1
1303 | -2.5724,-0.95602,2.7073,-0.16639,1
1304 | -3.9297,-6.0816,10.0958,-1.0147,1
1305 | -5.2943,-5.1463,10.3332,-1.1181,1
1306 | -3.8953,4.0392,-0.3019,-2.1836,1
1307 | -1.2244,1.7485,-1.4801,-1.4181,1
1308 | -2.6406,-4.4159,5.983,-0.13924,1
1309 | -4.6338,-12.7509,16.7166,-3.2168,1
1310 | -4.2887,-7.8633,11.8387,-1.8978,1
1311 | -3.3458,-0.50491,2.6328,0.53705,1
1312 | -1.1188,3.3357,-1.3455,-1.9573,1
1313 | 0.55939,-0.3104,0.18307,0.44653,1
1314 | -1.5078,-7.3191,7.8981,1.2289,1
1315 | -3.506,-12.5667,15.1606,-0.75216,1
1316 | -2.9498,-8.273,10.2646,1.1629,1
1317 | -1.6029,-0.38903,1.62,1.9103,1
1318 | -1.2667,2.8183,-2.426,-1.8862,1
1319 | -0.49281,3.0605,-1.8356,-2.834,1
1320 | 0.66365,-0.045533,-0.18794,0.23447,1
1321 | -0.72068,-6.7583,5.8408,0.62369,1
1322 | -1.9966,-9.5001,9.682,-0.12889,1
1323 | -0.97325,-6.4168,5.6026,1.0323,1
1324 | -0.025314,-0.17383,-0.11339,1.2198,1
1325 | 0.062525,2.9301,-3.5467,-2.6737,1
1326 | -5.525,6.3258,0.89768,-6.6241,1
1327 | -1.2943,2.6735,-0.84085,-2.0323,1
1328 | -0.24037,-1.7837,2.135,1.2418,1
1329 | -1.3968,-9.6698,9.4652,-0.34872,1
1330 | -2.9672,-13.2869,13.4727,-2.6271,1
1331 | -1.1005,-7.2508,6.0139,0.36895,1
1332 | 0.22432,-0.52147,-0.40386,1.2017,1
1333 | 0.90407,3.3708,-4.4987,-3.6965,1
1334 | -2.8619,4.5193,-0.58123,-4.2629,1
1335 | -1.0833,-0.31247,1.2815,0.41291,1
1336 | -1.5681,-7.2446,6.5537,-0.1276,1
1337 | -2.0545,-10.8679,9.4926,-1.4116,1
1338 | 0.2346,-4.5152,2.1195,1.4448,1
1339 | 1.581,0.86909,-2.3138,0.82412,1
1340 | 1.5514,3.8013,-4.9143,-3.7483,1
1341 | -4.1479,7.1225,-0.083404,-6.4172,1
1342 | -2.2625,-0.099335,2.8127,0.48662,1
1343 | -1.7479,-5.823,5.8699,1.212,1
1344 | -0.95923,-6.7128,4.9857,0.32886,1
1345 | 1.3451,0.23589,-1.8785,1.3258,1
1346 | 2.2279,4.0951,-4.8037,-2.1112,1
1347 | 1.2572,4.8731,-5.2861,-5.8741,1
1348 | -5.3857,9.1214,-0.41929,-5.9181,1
1349 | -2.9786,2.3445,0.52667,-0.40173,1
1350 | -1.5851,-2.1562,1.7082,0.9017,1
1351 | -0.21888,-2.2038,-0.0954,0.56421,1
1352 | 1.3183,1.9017,-3.3111,0.065071,1
1353 | 1.4896,3.4288,-4.0309,-1.4259,1
1354 | 0.11592,3.2219,-3.4302,-2.8457,1
1355 | -3.3924,3.3564,-0.72004,-3.5233,1
1356 | -6.1632,8.7096,-0.21621,-3.6345,1
1357 | -4.0786,2.9239,0.87026,-0.65389,1
1358 | -2.5899,-0.3911,0.93452,0.42972,1
1359 | -1.0116,-0.19038,-0.90597,0.003003,1
1360 | 0.066129,2.4914,-2.9401,-0.62156,1
1361 | -0.24745,1.9368,-2.4697,-0.80518,1
1362 | -1.5732,1.0636,-0.71232,-0.8388,1
1363 | -2.1668,1.5933,0.045122,-1.678,1
1364 | -1.1667,-1.4237,2.9241,0.66119,1
1365 | -2.8391,-6.63,10.4849,-0.42113,1
1366 | -4.5046,-5.8126,10.8867,-0.52846,1
1367 | -2.41,3.7433,-0.40215,-1.2953,1
1368 | 0.40614,1.3492,-1.4501,-0.55949,1
1369 | -1.3887,-4.8773,6.4774,0.34179,1
1370 | -3.7503,-13.4586,17.5932,-2.7771,1
1371 | -3.5637,-8.3827,12.393,-1.2823,1
1372 | -2.5419,-0.65804,2.6842,1.1952,1
--------------------------------------------------------------------------------
/decision_tree_classification.py:
--------------------------------------------------------------------------------
1 | from random import seed
2 | from random import randrange
3 | from csv import reader
4 |
5 | # Load a CSV file
6 | def load_csv(filename):
7 | 	with open(filename, "r") as file:
8 | 		lines = reader(file)
9 | 		dataset = list(lines)
10 | 	return dataset
11 |
12 | # Convert string column to float
13 | def str_column_to_float(dataset, column):
14 | for row in dataset:
15 | row[column] = float(row[column].strip())
16 |
17 | # Split a dataset into k folds
18 | def cross_validation_split(dataset, n_folds):
19 | dataset_split = list()
20 | dataset_copy = list(dataset)
21 | fold_size = int(len(dataset) / n_folds)
22 | for i in range(n_folds):
23 | fold = list()
24 | while len(fold) < fold_size:
25 | index = randrange(len(dataset_copy))
26 | fold.append(dataset_copy.pop(index))
27 | dataset_split.append(fold)
28 | return dataset_split
29 |
30 | # Calculate accuracy percentage
31 | def accuracy_metric(actual, predicted):
32 | correct = 0
33 | for i in range(len(actual)):
34 | if actual[i] == predicted[i]:
35 | correct += 1
36 | return correct / float(len(actual)) * 100.0
37 |
38 | # Evaluate an algorithm using a cross validation split
39 | def evaluate_algorithm(dataset, algorithm, n_folds, *args):
40 | folds = cross_validation_split(dataset, n_folds)
41 | scores = list()
42 | for fold in folds:
43 | train_set = list(folds)
44 | train_set.remove(fold)
45 | train_set = sum(train_set, [])
46 | test_set = list()
47 | for row in fold:
48 | row_copy = list(row)
49 | test_set.append(row_copy)
50 | row_copy[-1] = None
51 | predicted = algorithm(train_set, test_set, *args)
52 | actual = [row[-1] for row in fold]
53 | accuracy = accuracy_metric(actual, predicted)
54 | scores.append(accuracy)
55 | return scores
56 |
57 | # Split a dataset based on an attribute and an attribute value
58 | def test_split(index, value, dataset):
59 | left, right = list(), list()
60 | for row in dataset:
61 | if row[index] < value:
62 | left.append(row)
63 | else:
64 | right.append(row)
65 | return left, right
66 |
67 | # Calculate the Gini index for a split dataset
68 | def gini_index(groups, classes):
69 | # count all samples at split point
70 | n_instances = float(sum([len(group) for group in groups]))
71 | # sum weighted Gini index for each group
72 | gini = 0.0
73 | for group in groups:
74 | size = float(len(group))
75 | # avoid divide by zero
76 | if size == 0:
77 | continue
78 | score = 0.0
79 | # score the group based on the score for each class
80 | for class_val in classes:
81 | p = [row[-1] for row in group].count(class_val) / size
82 | score += p * p
83 | # weight the group score by its relative size
84 | gini += (1.0 - score) * (size / n_instances)
85 | return gini
86 |
87 | # Select the best split point for a dataset
88 | def get_split(dataset):
89 | class_values = list(set(row[-1] for row in dataset))
90 | 	b_index, b_value, b_score, b_groups = 999, 999, float('inf'), None  # inf so any real Gini score wins
91 | for index in range(len(dataset[0])-1):
92 | for row in dataset:
93 | groups = test_split(index, row[index], dataset)
94 | gini = gini_index(groups, class_values)
95 | if gini < b_score:
96 | b_index, b_value, b_score, b_groups = index, row[index], gini, groups
97 | return {'index':b_index, 'value':b_value, 'groups':b_groups}
98 |
99 | # Create a terminal node value
100 | def to_terminal(group):
101 | outcomes = [row[-1] for row in group]
102 | return max(set(outcomes), key=outcomes.count)
103 |
104 | # Create child splits for a node or make terminal
105 | def split(node, max_depth, min_size, depth):
106 | left, right = node['groups']
107 | del(node['groups'])
108 | # check for a no split
109 | if not left or not right:
110 | node['left'] = node['right'] = to_terminal(left + right)
111 | return
112 | # check for max depth
113 | if depth >= max_depth:
114 | node['left'], node['right'] = to_terminal(left), to_terminal(right)
115 | return
116 | # process left child
117 | if len(left) <= min_size:
118 | node['left'] = to_terminal(left)
119 | else:
120 | node['left'] = get_split(left)
121 | split(node['left'], max_depth, min_size, depth+1)
122 | # process right child
123 | if len(right) <= min_size:
124 | node['right'] = to_terminal(right)
125 | else:
126 | node['right'] = get_split(right)
127 | split(node['right'], max_depth, min_size, depth+1)
128 |
129 | # Build a decision tree
130 | def build_tree(train, max_depth, min_size):
131 | root = get_split(train)
132 | split(root, max_depth, min_size, 1)
133 | return root
134 |
135 | # Make a prediction with a decision tree
136 | def predict(node, row):
137 | if row[node['index']] < node['value']:
138 | if isinstance(node['left'], dict):
139 | return predict(node['left'], row)
140 | else:
141 | return node['left']
142 | else:
143 | if isinstance(node['right'], dict):
144 | return predict(node['right'], row)
145 | else:
146 | return node['right']
147 |
148 | # Classification and Regression Tree Algorithm
149 | def decision_tree(train, test, max_depth, min_size):
150 | tree = build_tree(train, max_depth, min_size)
151 | predictions = list()
152 | for row in test:
153 | prediction = predict(tree, row)
154 | predictions.append(prediction)
155 | 	return predictions
156 |
157 | # Test CART on Bank Note dataset
158 | seed(1)
159 | # load and prepare data
160 | filename = 'data_banknote_authentication.csv'
161 | dataset = load_csv(filename)
162 | # convert string attributes to floats
163 | for i in range(len(dataset[0])):
164 | str_column_to_float(dataset, i)
165 | # evaluate algorithm
166 | n_folds = 5
167 | max_depth = 5
168 | min_size = 10
169 | scores = evaluate_algorithm(dataset, decision_tree, n_folds, max_depth, min_size)
170 | print('Scores: %s' % scores)
171 | print('Mean Accuracy: %.3f%%' % (sum(scores)/float(len(scores))))
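172 | 
173 | # A minimal usage sketch (not part of the evaluation above): train one tree on
174 | # the full dataset and classify a single banknote row.
175 | # tree = build_tree(dataset, max_depth, min_size)
176 | # print(predict(tree, dataset[0]))  # -> 0.0 or 1.0
177 | #
178 | # Worked Gini examples for the split scoring used above:
179 | # gini_index([[[1, 0], [1, 0]], [[1, 1], [1, 1]]], [0, 1])  # -> 0.0 (pure split)
180 | # gini_index([[[1, 0], [1, 1]], [[1, 0], [1, 1]]], [0, 1])  # -> 0.5 (worst split)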
--------------------------------------------------------------------------------
/ex1data1.txt:
--------------------------------------------------------------------------------
1 | 6.1101,17.592
2 | 5.5277,9.1302
3 | 8.5186,13.662
4 | 7.0032,11.854
5 | 5.8598,6.8233
6 | 8.3829,11.886
7 | 7.4764,4.3483
8 | 8.5781,12
9 | 6.4862,6.5987
10 | 5.0546,3.8166
11 | 5.7107,3.2522
12 | 14.164,15.505
13 | 5.734,3.1551
14 | 8.4084,7.2258
15 | 5.6407,0.71618
16 | 5.3794,3.5129
17 | 6.3654,5.3048
18 | 5.1301,0.56077
19 | 6.4296,3.6518
20 | 7.0708,5.3893
21 | 6.1891,3.1386
22 | 20.27,21.767
23 | 5.4901,4.263
24 | 6.3261,5.1875
25 | 5.5649,3.0825
26 | 18.945,22.638
27 | 12.828,13.501
28 | 10.957,7.0467
29 | 13.176,14.692
30 | 22.203,24.147
31 | 5.2524,-1.22
32 | 6.5894,5.9966
33 | 9.2482,12.134
34 | 5.8918,1.8495
35 | 8.2111,6.5426
36 | 7.9334,4.5623
37 | 8.0959,4.1164
38 | 5.6063,3.3928
39 | 12.836,10.117
40 | 6.3534,5.4974
41 | 5.4069,0.55657
42 | 6.8825,3.9115
43 | 11.708,5.3854
44 | 5.7737,2.4406
45 | 7.8247,6.7318
46 | 7.0931,1.0463
47 | 5.0702,5.1337
48 | 5.8014,1.844
49 | 11.7,8.0043
50 | 5.5416,1.0179
51 | 7.5402,6.7504
52 | 5.3077,1.8396
53 | 7.4239,4.2885
54 | 7.6031,4.9981
55 | 6.3328,1.4233
56 | 6.3589,-1.4211
57 | 6.2742,2.4756
58 | 5.6397,4.6042
59 | 9.3102,3.9624
60 | 9.4536,5.4141
61 | 8.8254,5.1694
62 | 5.1793,-0.74279
63 | 21.279,17.929
64 | 14.908,12.054
65 | 18.959,17.054
66 | 7.2182,4.8852
67 | 8.2951,5.7442
68 | 10.236,7.7754
69 | 5.4994,1.0173
70 | 20.341,20.992
71 | 10.136,6.6799
72 | 7.3345,4.0259
73 | 6.0062,1.2784
74 | 7.2259,3.3411
75 | 5.0269,-2.6807
76 | 6.5479,0.29678
77 | 7.5386,3.8845
78 | 5.0365,5.7014
79 | 10.274,6.7526
80 | 5.1077,2.0576
81 | 5.7292,0.47953
82 | 5.1884,0.20421
83 | 6.3557,0.67861
84 | 9.7687,7.5435
85 | 6.5159,5.3436
86 | 8.5172,4.2415
87 | 9.1802,6.7981
88 | 6.002,0.92695
89 | 5.5204,0.152
90 | 5.0594,2.8214
91 | 5.7077,1.8451
92 | 7.6366,4.2959
93 | 5.8707,7.2029
94 | 5.3054,1.9869
95 | 8.2934,0.14454
96 | 13.394,9.0551
97 | 5.4369,0.61705
98 |
--------------------------------------------------------------------------------
/insurance.csv:
--------------------------------------------------------------------------------
1 | 108 392,5
2 | 19 46,2
3 | 13 15,7
4 | 124 422,2
5 | 40 119,4
6 | 57 170,9
7 | 23 56,9
8 | 14 77,5
9 | 45 214
10 | 10 65,3
11 | 5 20,9
12 | 48 248,1
13 | 11 23,5
14 | 23 39,6
15 | 7 48,8
16 | 2 6,6
17 | 24 134,9
18 | 6 50,9
19 | 3 4,4
20 | 23 113
21 | 6 14,8
22 | 9 48,7
23 | 9 52,1
24 | 3 13,2
25 | 29 103,9
26 | 7 77,5
27 | 4 11,8
28 | 20 98,1
29 | 7 27,9
30 | 4 38,1
31 | 0 0
32 | 25 69,2
33 | 6 14,6
34 | 5 40,3
35 | 22 161,5
36 | 11 57,2
37 | 61 217,6
38 | 12 58,1
39 | 4 12,6
40 | 16 59,6
41 | 13 89,9
42 | 60 202,4
43 | 41 181,3
44 | 37 152,8
45 | 55 162,8
46 | 41 73,4
47 | 11 21,3
48 | 27 92,6
49 | 8 76,1
50 | 3 39,9
51 | 17 142,1
52 | 13 93
53 | 13 31,9
54 | 15 32,1
55 | 8 55,6
56 | 29 133,3
57 | 30 194,5
58 | 24 137,9
59 | 9 87,4
60 | 31 209,8
61 | 14 95,5
62 | 53 244,6
63 | 26 187,5
--------------------------------------------------------------------------------
/kmeans.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 | from sklearn import datasets
5 | from random import randint
6 | import time
7 | from scipy.spatial.distance import cdist
8 | iris = datasets.load_iris() # Used for Dataset 1
9 |
10 | # Function: Constraint
11 | # -------------
12 | # Compute k-means with a constraint: the seeded
13 | # subset S keeps its labels and is excluded from
14 | # the k-means iterations that run on the
15 | # remaining points
16 | def constraint(dataSet, idx_seed_labels, part1_labels, k):
17 | labels=np.zeros([dataSet.shape[0],dataSet.shape[1]+1])
18 | for i in range(k):
19 | new_dSet=np.delete(dataSet, idx_seed_labels[i], axis=0)
20 | idx_part2_labels=np.delete(np.array(range(0,dataSet.shape[0])), idx_seed_labels[i])
21 | labels[idx_seed_labels[i],:]=part1_labels[i]
22 | centroids,part2_labels=kmeans(new_dSet,k)
23 | labels[idx_part2_labels,:]=part2_labels
24 | return centroids,labels
25 |
26 | # Function: Seeding
27 | # -------------
28 | # Return seeds from the subset S to
29 | # better initialize the centroids for the
30 | # kmeans algorithm. Subsample by 10%
31 | def seeding(dataSet, k):
32 | sub_rate=0.1
33 | seed_labels = []
34 | idx_seed_labels = []
35 | centroids = k*[None]
36 | for i in range(k):
37 | S_l=dataSet[::round(sub_rate*dataSet.shape[0])]
38 | centroid,labels_S_l=kmeans(S_l,1)
39 | centroids[i] = centroid
40 | labels_S_l[:,2] = i
41 | seed_labels.append(labels_S_l)
42 | idx_tmp=np.array(range(dataSet.shape[0]))
43 | idx_seed_labels.append(idx_tmp[::round(sub_rate*dataSet.shape[0])])
44 | # This if-statement makes sure that index 0 only occurs one time
45 | if i > 0:
46 | idx_seed_labels[i] = np.delete(idx_seed_labels[i],0,axis=0)
47 | seed_labels[i] = np.delete(seed_labels[i],0,axis=0)
48 | sub_rate += 0.01
49 | centroids = np.concatenate(centroids,axis=0)
50 | return centroids,seed_labels,idx_seed_labels
51 |
52 | # Function: Get Number of Features
53 | # -------------
54 | # Returns the number of samples (rows) in the dataSet (despite the function's name)
55 | def getNumFeatures(dataSet):
56 | numFeatures=dataSet.shape[0]
57 | return numFeatures
58 |
59 | # Function: Get Random Centroids
60 | # -------------
61 | # Returns the randomized centroids
62 | def getRandomCentroids(numFeatures, k, dataSet):
63 |     # An alternative approach is to choose the k data samples
64 |     # that lie farthest apart in Euclidean distance
65 | centroids=np.zeros([k,2])
66 | for i in range(0,k):
67 | centroids[i]=dataSet[randint(0,numFeatures-1)]
68 | return centroids
69 |
70 | # Function: Should Stop
71 | # -------------
72 | # Returns True or False if k-means is done. K-means terminates either
73 | # because it has run a maximum number of iterations OR the centroids
74 | # stop changing.
75 | def shouldStop(oldCentroids, centroids, iterations):
76 | MAX_ITERATIONS=1000
77 | if iterations > MAX_ITERATIONS:
78 | print('STOP: Reason: Too many iterations')
79 | return True
80 | elif oldCentroids is None:
81 | return False
82 | elif len(oldCentroids)!=len(centroids):
83 | return False
84 | return (centroids==oldCentroids).all()
85 |
86 | # Function: Get Labels
87 | # -------------
88 | # Returns a label for each piece of data in the dataset.
89 | def getLabels(dataSet, centroids, numFeatures, k):
90 |     # For each element in the dataset, choose the closest centroid.
91 | # Make that centroid's index the element's label.
92 | ext_dataSet=np.zeros(numFeatures)
93 | labels=np.column_stack((dataSet,ext_dataSet))
94 | c_idx=np.zeros(numFeatures)
95 | dist=cdist(dataSet,centroids,'euclidean')
96 | for i in range(0,numFeatures):
97 | c_idx[i]=np.where(dist[i,:]==dist[i,:].min())[0][0]
98 | labels[:,-1]=c_idx
99 | return labels
100 |
101 | # Function: Get Centroids
102 | # -------------
103 | # Returns the k updated centroids, each the mean of its assigned points.
104 | def getCentroids(dataSet, labels, k, numFeatures):
105 | # Each centroid is the geometric mean of the points that
106 | # have that centroid's label. Important: If a centroid is empty (no points have
107 | # that centroid's label) you should randomly re-initialize it.
108 | centroids=np.zeros([k,dataSet.shape[1]])
109 | x=np.zeros([numFeatures,dataSet.shape[1]])
110 | for i in range(0,k):
111 | if any(labels[:,-1]==i):
112 | idx=np.where(labels[:,-1]==i)[0]
113 | X=dataSet[idx,:]
114 | div=X.shape[0]
115 | for j in range(0,dataSet.shape[1]):
116 | x[:,j]=np.array(sum(X[:,j])/div)
117 | centroids[i]=x[i,:]
118 | elif all(labels[:,-1]!=i):
119 | centroids[i]=getRandomCentroids(numFeatures, 1, dataSet)
120 | return centroids
121 |
122 | # Function: Plot Clusters
123 | # -------------
124 | # Plot the clusters in different colors.
125 | def plotClusters(labels):
126 |     color=['g','c','m','y','k','w','b','r','g','c','m','y','k','w','b','r'] # Supports up to 16 clusters (colors repeat after the first 8)
127 | l=list(set(labels[:,-1]))
128 | for i in range(0,len(l)):
129 | idx=np.where(labels[:,-1]==i)[0]
130 | plt.scatter(labels[idx,0], labels[idx,1],c=color[i])
131 |
132 | # Function: Evaluate Clustering
133 | # -------------
134 | # Print how many labels were assigned incorrectly in total.
135 | def evaluate(clustering_labels, true_labels):
136 | labels = []
137 | for cluster in clustering_labels:
138 | labels.append(int(cluster[2]))
139 | matches = [i for i, j in zip(labels, true_labels) if i == j]
140 | n_errors = len(clustering_labels)-len(matches)
141 |     print('Number of wrongly labeled samples: ',n_errors)
142 | print('Number of total samples: ',len(true_labels))
143 |
144 | # Function: K Means
145 | # -------------
146 | # K-Means is an algorithm that takes in a dataset and a constant
147 | # k and returns k centroids (which define clusters of data in the
148 | # dataset which are similar to one another).
149 | def kmeans(dataSet, k, seed=None, cons=None):
150 |
151 | # Initialize centroids randomly or by seeding
152 | numFeatures = getNumFeatures(dataSet)
153 |     if seed is None and cons is None:
154 | centroids = getRandomCentroids(numFeatures, k, dataSet)
155 |     elif seed is not None or cons is not None:
156 | centroids,part1_labels,idx_seed_labels=seeding(dataSet,k)
157 |         if cons is not None:
158 | centroids,labels=constraint(dataSet, idx_seed_labels, part1_labels, k)
159 | return centroids,labels
160 |
161 | # Initialize book keeping vars.
162 | iterations = 0
163 | oldCentroids = None
164 |
165 | # Run the main k-means algorithm
166 | while not shouldStop(oldCentroids, centroids, iterations):
167 | # Save old centroids for convergence test. Book keeping.
168 | oldCentroids = centroids
169 | iterations += 1
170 | # Assign labels to each datapoint based on centroids
171 | labels = getLabels(dataSet, centroids, numFeatures, k)
172 |
173 | # Assign centroids based on datapoint labels
174 | centroids = getCentroids(dataSet, labels, k, numFeatures)
175 |
176 | # We can get the labels too by calling getLabels(dataSet, centroids, numFeatures, k)
177 | return centroids,labels
178 |
179 | start_time = time.time()
180 |
181 | # Main
182 |
183 | # Dataset 1
184 |
185 | x=iris['data'][:, 3]
186 | y=iris['data'][:, 2]
187 | dataSet=np.column_stack((x,y))
188 |
189 | # Parameters and functions
190 |
191 | k=2 # Number of clusters
192 | seed=1 # Pass as a third argument, kmeans(dataSet,k,seed), to enable seeding (see the sketch at the end of this file)
193 | cons=1 # Pass as a fourth argument, kmeans(dataSet,k,seed,cons), to also apply the constraint
194 | centroids,labels=kmeans(dataSet,k)
195 |
196 | print("--- %s seconds ---" % (time.time() - start_time))
197 |
198 | # Plot the different classified clusters and the centroids
199 | plt.figure()
200 | plotClusters(labels)
201 | plt.scatter(centroids[:,0], centroids[:,1], c='r')
202 | plt.show()
203 |
204 | # Evaluate how many samples ended up in the wrong cluster, using the true iris labels
205 | evaluate(labels, iris['target'])
206 |
207 |
208 |
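209 | # A commented sketch of the two optional modes; the seed/cons flags above are
210 | # defined but never passed in the plain call on line 194.
211 | # centroids,labels=kmeans(dataSet,k,seed)       # seeded initialization only
212 | # centroids,labels=kmeans(dataSet,k,seed,cons)  # seeding plus the constraint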
--------------------------------------------------------------------------------
/mnist.pkl.gz:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PerceptiLabs/CodeExamples/41b1a9a8b095b931f797283b256d56786759b76c/mnist.pkl.gz
--------------------------------------------------------------------------------
/neural_network.py:
--------------------------------------------------------------------------------
1 | import random
2 | import matplotlib.pyplot as plt
3 | import numpy as np
4 | from random import randint, shuffle
5 | import mnist_loader  # expects a local mnist_loader.py (e.g. from Nielsen's neural-networks-and-deep-learning repo) that reads mnist.pkl.gz
6 |
7 | class Network(object):
8 |
9 | def __init__(self, sizes):
10 | self.num_layers = len(sizes)
11 | self.sizes = sizes
12 | self.biases = [np.random.randn(y, 1) for y in sizes[1:]]
13 | self.weights = [np.random.randn(y, x)
14 | for x, y in zip(sizes[:-1], sizes[1:])]
15 |
16 | def feedforward(self, a):
17 | for b, w in zip(self.biases, self.weights):
18 | a = sigmoid(np.dot(w, a)+b)
19 | return a
20 |
21 | def SGD(self, training_data, epochs, mini_batch_size, eta,
22 | test_data=None):
23 | if test_data: n_test = len(test_data)
24 | n = len(list(training_data))
25 | for j in range(epochs):
26 |             random.shuffle(training_data)  # shuffle in place each epoch (shuffling a copy had no effect)
27 | mini_batches = [
28 | training_data[k:k+mini_batch_size]
29 | for k in range(0, n, mini_batch_size)]
30 | for mini_batch in mini_batches:
31 | self.update_mini_batch(mini_batch, eta)
32 | if test_data:
33 | plt.figure()
34 | print("Epoch {0}: {1} / {2}".format(
35 | j, self.evaluate(test_data), n_test))
36 | else:
37 | print("Epoch {0} complete".format(j))
38 |
39 | def update_mini_batch(self, mini_batch, eta):
40 | nabla_b = [np.zeros(b.shape) for b in self.biases]
41 | nabla_w = [np.zeros(w.shape) for w in self.weights]
42 | for x, y in mini_batch:
43 | delta_nabla_b, delta_nabla_w = self.backprop(x, y)
44 | nabla_b = [nb+dnb for nb, dnb in zip(nabla_b, delta_nabla_b)]
45 | nabla_w = [nw+dnw for nw, dnw in zip(nabla_w, delta_nabla_w)]
46 | self.weights = [w-(eta/len(mini_batch))*nw
47 | for w, nw in zip(self.weights, nabla_w)]
48 | self.biases = [b-(eta/len(mini_batch))*nb
49 | for b, nb in zip(self.biases, nabla_b)]
50 |
51 | def backprop(self, x, y):
52 | nabla_b = [np.zeros(b.shape) for b in self.biases]
53 | nabla_w = [np.zeros(w.shape) for w in self.weights]
54 | # feedforward
55 | activation = x
56 | activations = [x] # list to store all the activations, layer by layer
57 | zs = [] # list to store all the z vectors, layer by layer
58 | for b, w in zip(self.biases, self.weights):
59 | z = np.dot(w, activation)+b
60 | zs.append(z)
61 | activation = sigmoid(z)
62 | activations.append(activation)
63 | # backward pass
64 | delta = self.cost_derivative(activations[-1], y) * \
65 | sigmoid_prime(zs[-1])
66 | nabla_b[-1] = delta
67 | nabla_w[-1] = np.dot(delta, activations[-2].transpose())
68 | # Note that the variable l in the loop below is used a little
69 | # differently to the notation in Chapter 2 of the book. Here,
70 | # l = 1 means the last layer of neurons, l = 2 is the
71 | # second-last layer, and so on. It's a renumbering of the
72 | # scheme in the book, used here to take advantage of the fact
73 | # that Python can use negative indices in lists.
74 | for l in range(2, self.num_layers):
75 | z = zs[-l]
76 | sp = sigmoid_prime(z)
77 | delta = np.dot(self.weights[-l+1].transpose(), delta) * sp
78 | nabla_b[-l] = delta
79 | nabla_w[-l] = np.dot(delta, activations[-l-1].transpose())
80 | return (nabla_b, nabla_w)
81 |
82 | def evaluate(self, test_data):
83 | test_results = [(np.argmax(self.feedforward(x)), list(y).index(1))
84 | for (x, y) in test_data]
85 | for i in range(len(test_results)):
86 | if test_results[i][0]==0:
87 | plt.plot(test_data[i][0][0],test_data[i][0][1],'ro')
88 |                 # plt.hold() was removed in Matplotlib 3.x; successive plot calls already overlay
89 | elif test_results[i][0]==1:
90 | plt.plot(test_data[i][0][0],test_data[i][0][1],'bo')
91 |                 # (plt.hold removed; see note above)
92 | plt.show()
93 | return sum(int(x == y) for (x, y) in test_results)
94 |
95 | def cost_derivative(self, output_activations, y):
96 | return (output_activations-y)
97 |
98 | #### Miscellaneous functions
99 | def sigmoid(z):
100 | return 1.0/(1.0+np.exp(-z))
101 |
102 | def sigmoid_prime(z):
103 | return sigmoid(z)*(1-sigmoid(z))
104 |
105 | # Main
106 |
107 | training_data, validation_data, test_data = mnist_loader.load_data_wrapper()
108 | # The MNIST data loaded above is immediately replaced by a synthetic 2-D toy set, so the tiny [2, 3, 2] network below trains in seconds
109 | training_data=[]
110 | test_data=[]
111 | for i in range(1000):
112 | training_data.append([np.array([[randint(0, 9)],[randint(0, 9)]]),np.array([[1],[0]])])
113 | training_data.append([np.array([[randint(90, 99)],[randint(90, 99)]]),np.array([[0],[1]])])
114 |     if i%10:  # add test samples on 9 of every 10 iterations
115 | test_data.append([np.array([[randint(0, 9)],[randint(0, 9)]]),np.array([[1],[0]])])
116 | test_data.append([np.array([[randint(90, 99)],[randint(90, 99)]]),np.array([[0],[1]])])
117 |
118 | shuffle(training_data)
119 | shuffle(test_data)
120 |
121 | print(test_data[0])
122 |
123 | net = Network([2, 3, 2])
124 | net.SGD(training_data, 30, 1, 0.01, test_data=test_data)
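125 | 
126 | # Sketch: to train on the real MNIST data instead of the toy set, re-load it
127 | # with training_data, _, test_data = mnist_loader.load_data_wrapper() and size
128 | # the input layer for 784-pixel images. Note that evaluate() above also plots
129 | # the 2-D toy points, so its plotting calls would have to be removed first.
130 | # net = Network([784, 30, 10])
131 | # net.SGD(list(training_data), 30, 10, 3.0, test_data=list(test_data))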
--------------------------------------------------------------------------------
/pl_logo.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PerceptiLabs/CodeExamples/41b1a9a8b095b931f797283b256d56786759b76c/pl_logo.png
--------------------------------------------------------------------------------
/sgd_linear_regression_multivariable.py:
--------------------------------------------------------------------------------
1 | from numpy import loadtxt, zeros, ones, array, linspace, logspace, mean, std, shape
2 | from pylab import scatter, show, title, xlabel, ylabel, plot, contour
3 | # Despite the filename, this script performs batch gradient descent on the single-feature ex1data1 set; feature_normalize is provided for multi-feature data but is not called below
4 | def feature_normalize(X):
5 | '''
6 | Returns a normalized version of X where
7 | the mean value of each feature is 0 and the standard deviation
8 | is 1. This is often a good preprocessing step to do when
9 | working with learning algorithms.
10 | '''
11 | mean_r = []
12 | std_r = []
13 |
14 | X_norm = X
15 |
16 | n_c = X.shape[1]
17 | for i in range(n_c):
18 | m = mean(X[:, i])
19 | s = std(X[:, i])
20 | mean_r.append(m)
21 | std_r.append(s)
22 | X_norm[:, i] = (X_norm[:, i] - m) / s
23 |
24 | return X_norm, mean_r, std_r
25 |
26 |
27 | def compute_cost(X, y, theta):
28 | '''
29 |     Compute the cost for linear regression
30 | '''
31 | #Number of training samples
32 | m = y.size
33 |
34 |     predictions = X.dot(theta).flatten()  # flatten to 1-D so predictions - y stays elementwise
35 | 
36 |     errors = predictions - y
37 | 
38 |     J = (1.0 / (2 * m)) * errors.dot(errors)
39 | 
40 |     return J
41 |
42 |
43 | def gradient_descent(X, y, theta, alpha, num_iters):
44 | '''
45 | Performs gradient descent to learn theta
46 |     by taking num_iters gradient steps with learning
47 | rate alpha
48 | '''
49 | m = y.size
50 |
51 | for i in range(num_iters):
52 |
53 |         predictions = X.dot(theta).flatten()  # 1-D to avoid broadcasting against y
54 |
55 | theta_size = theta.size
56 |
57 | for it in range(theta_size):
58 |
59 |             temp = X[:, it]
60 |             # temp and predictions are both 1-D now, so no reshape is needed
61 |
62 | errors_x1 = (predictions - y) * temp
63 |
64 | theta[it][0] = theta[it][0] - alpha * (1.0 / m) * errors_x1.sum()
65 |
66 | return theta
67 |
68 | #Load the dataset
69 | data = loadtxt('ex1data1.txt', delimiter=',')
70 |
71 | #Plot the data
72 | scatter(data[:, 0], data[:, 1], marker='o', c='b')
73 | title('Profits distribution')
74 | xlabel('Population of City in 10,000s')
75 | ylabel('Profit in $10,000s')
76 | #show()
77 |
78 | X = data[:, 0]
79 | y = data[:, 1]
80 |
81 |
82 | #number of training samples
83 | m = y.size
84 |
85 | #Add a column of ones to X (the intercept term)
86 | it = ones(shape=(m, 2))
87 | it[:, 1] = X
88 |
89 | #Initialize theta parameters
90 | theta = zeros(shape=(2, 1))
91 |
92 | #Some gradient descent settings
93 | iterations = 1500
94 | alpha = 0.01
95 |
96 | #compute and display initial cost
97 | cost=compute_cost(it, y, theta)
98 | print(cost)
99 |
100 | theta = gradient_descent(it, y, theta, alpha, iterations)
101 |
102 | print(theta)
103 | #Predict values for population sizes of 35,000 and 70,000
104 | predict1 = array([1, 3.5]).dot(theta).flatten()
105 | print('For population = 35,000, we predict a profit of %f' % (predict1 * 10000))
106 | predict2 = array([1, 7.0]).dot(theta).flatten()
107 | print('For population = 70,000, we predict a profit of %f' % (predict2 * 10000))
108 |
109 | #Plot the results
110 | result = it.dot(theta).flatten()
111 | plot(data[:, 0], result)
112 | show()
113 |
114 |
115 | #Grid over which we will calculate J
116 | theta0_vals = linspace(-10, 10, 100)
117 | theta1_vals = linspace(-1, 4, 100)
118 |
119 |
120 | #initialize J_vals to a matrix of 0's
121 | J_vals = zeros(shape=(theta0_vals.size, theta1_vals.size))
122 |
123 | #Fill out J_vals
124 | for t1, element in enumerate(theta0_vals):
125 | for t2, element2 in enumerate(theta1_vals):
126 | thetaT = zeros(shape=(2, 1))
127 | thetaT[0][0] = element
128 | thetaT[1][0] = element2
129 | J_vals[t1, t2] = compute_cost(it, y, thetaT)
130 |
131 | #Contour plot
132 | J_vals = J_vals.T
133 | #Plot J_vals as 20 contours spaced logarithmically between 0.01 and 1000
134 | contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 3, 20))
135 | xlabel('theta_0')
136 | ylabel('theta_1')
137 | scatter(theta[0][0], theta[1][0])
138 | show()
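139 | 
140 | #Sanity-check sketch: the closed-form normal equation theta = pinv(X'X).X'y
141 | #should land close to the gradient-descent solution printed above
142 | #from numpy.linalg import pinv
143 | #print(pinv(it.T.dot(it)).dot(it.T).dot(y))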
--------------------------------------------------------------------------------
/sgd_linear_regression_simple.py:
--------------------------------------------------------------------------------
1 | from random import seed
2 | from random import randrange
3 | from csv import reader
4 | from math import sqrt
5 | import matplotlib.pyplot as plt
6 | import numpy as np
7 |
8 | # Load a CSV file
9 | def load_csv(filename):
10 | dataset = list()
11 | with open(filename, 'r') as file:
12 | csv_reader = reader(file)
13 | for row in csv_reader:
14 | 			row=row[0].split(";")  # winequality-white.csv is semicolon-delimited
15 | 			# print(row)  # debug output, disabled
16 | if not row:
17 | continue
18 | dataset.append(row)
19 | return dataset
20 |
21 | # Convert string column to float
22 | def str_column_to_float(dataset, column):
23 | for row in dataset:
24 | row[column] = float(row[column].strip())
25 |
26 | # Find the min and max values for each column
27 | def dataset_minmax(dataset):
28 | minmax = list()
29 | for i in range(len(dataset[0])):
30 | col_values = [row[i] for row in dataset]
31 | value_min = min(col_values)
32 | value_max = max(col_values)
33 | minmax.append([value_min, value_max])
34 | return minmax
35 |
36 | # Rescale dataset columns to the range 0-1
37 | def normalize_dataset(dataset, minmax):
38 | for row in dataset:
39 | for i in range(len(row)):
40 | row[i] = (row[i] - minmax[i][0]) / (minmax[i][1] - minmax[i][0])
41 |
42 | # Split a dataset into k folds
43 | def cross_validation_split(dataset, n_folds):
44 | dataset_split = list()
45 | dataset_copy = list(dataset)
46 | fold_size = int(len(dataset) / n_folds)
47 | for i in range(n_folds):
48 | fold = list()
49 | while len(fold) < fold_size:
50 | index = randrange(len(dataset_copy))
51 | fold.append(dataset_copy.pop(index))
52 | dataset_split.append(fold)
53 | return dataset_split
54 |
55 | # Calculate root mean squared error
56 | def rmse_metric(actual, predicted):
57 | sum_error = 0.0
58 | for i in range(len(actual)):
59 | prediction_error = predicted[i] - actual[i]
60 | sum_error += (prediction_error ** 2)
61 | mean_error = sum_error / float(len(actual))
62 | return sqrt(mean_error)
63 |
64 | # Evaluate an algorithm using a cross validation split
65 | def evaluate_algorithm(dataset, algorithm, n_folds, *args):
66 | folds = cross_validation_split(dataset, n_folds)
67 | scores = list()
68 | for fold in folds:
69 | train_set = list(folds)
70 | train_set.remove(fold)
71 | train_set = sum(train_set, [])
72 | test_set = list()
73 | y=[]
74 | for row in fold:
75 | y.append(row[-1])
76 | row_copy = list(row)
77 | test_set.append(row_copy)
78 | # row_copy[-1] = None
79 | predicted = algorithm(train_set, test_set, y, *args)
80 | actual = [row[-1] for row in fold]
81 | rmse = rmse_metric(actual, predicted)
82 | scores.append(rmse)
83 | return scores
84 |
85 | # Make a prediction with coefficients
86 | def predict(row, coefficients):
87 | yhat = coefficients[0]
88 | for i in range(len(row)-1):
89 | yhat += coefficients[i + 1] * row[i]
90 | return yhat
91 |
92 | # Estimate linear regression coefficients using stochastic gradient descent
93 | def coefficients_sgd(train, l_rate, n_epoch):
94 | coef = [0.0 for i in range(len(train[0]))]
95 | for epoch in range(n_epoch):
96 | for row in train:
97 | yhat = predict(row, coef)
98 | error = yhat - row[-1]
99 | coef[0] = coef[0] - l_rate * error
100 | for i in range(len(row)-1):
101 | coef[i + 1] = coef[i + 1] - l_rate * error * row[i]
102 | # print(l_rate, n_epoch, error)
103 | 		# print(coef)  # uncomment to monitor the coefficients after each epoch
104 | return coef
105 |
106 | # Linear Regression Algorithm With Stochastic Gradient Descent
107 | def linear_regression_sgd(train, test, y, l_rate, n_epoch):
108 | predictions = list()
109 | coef = coefficients_sgd(train, l_rate, n_epoch)
110 | x=[]
111 | for row in test:
112 | yhat = predict(row, coef)
113 | predictions.append(yhat)
114 | x.append(row[0])
115 | plt.figure()
116 | plt.plot(x,predictions,'r',x,y,'bo')
117 | plt.show()
118 | 	return predictions
119 |
120 | # Linear Regression on wine quality dataset
121 | seed(1)
122 | # load and prepare data
123 | filename = 'winequality-white.csv'
124 | dataset = load_csv(filename)
125 | for i in range(len(dataset[0])):
126 | str_column_to_float(dataset, i)
127 | # normalize
128 | minmax = dataset_minmax(dataset)
129 | normalize_dataset(dataset, minmax)
130 | # evaluate algorithm
131 | n_folds = 5
132 | l_rate = 0.01
133 | n_epoch = 50
134 | scores = evaluate_algorithm(dataset, linear_regression_sgd, n_folds, l_rate, n_epoch)
135 | print('Scores: %s' % scores)
136 | print('Mean RMSE: %.3f' % (sum(scores)/float(len(scores))))
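137 | 
138 | # Usage sketch: fit once on the full normalized dataset and predict a single
139 | # sample (inputs and output are on the 0-1 scale set by normalize_dataset).
140 | # coef = coefficients_sgd(dataset, l_rate, n_epoch)
141 | # print(predict(dataset[0], coef))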
--------------------------------------------------------------------------------
/simple_linear_regression.py:
--------------------------------------------------------------------------------
1 | from random import seed
2 | from random import randrange
3 | from csv import reader
4 | from math import sqrt
5 | import matplotlib.pyplot as plt
6 |
7 | # Load a CSV file
8 | def load_csv(filename):
9 | dataset = list()
10 | with open(filename, 'r') as file:
11 | csv_reader = reader(file)
12 | for row in csv_reader:
13 | 			row = ".".join(row).split()  # rejoin decimal commas as dots before splitting on whitespace
14 | if not row:
15 | continue
16 | dataset.append(row)
17 | return dataset
18 |
19 | # Convert string column to float
20 | def str_column_to_float(dataset, column):
21 | for row in dataset:
22 | row[column] = float(row[column].strip())
23 |
24 | # Split a dataset into a train and test set
25 | def train_test_split(dataset, split):
26 | train = list()
27 | train_size = split * len(dataset)
28 | dataset_copy = list(dataset)
29 | while len(train) < train_size:
30 | index = randrange(len(dataset_copy))
31 | train.append(dataset_copy.pop(index))
32 | return train, dataset_copy
33 |
34 | # Calculate root mean squared error
35 | def rmse_metric(actual, predicted):
36 | sum_error = 0.0
37 | for i in range(len(actual)):
38 | prediction_error = predicted[i] - actual[i]
39 | sum_error += (prediction_error ** 2)
40 | mean_error = sum_error / float(len(actual))
41 | return sqrt(mean_error)
42 |
43 | # Evaluate an algorithm using a train/test split
44 | def evaluate_algorithm(dataset, algorithm, split, *args):
45 | train, test = train_test_split(dataset, split)
46 | test_set = list()
47 | y=[]
48 | for row in test:
49 | row_copy = list(row)
50 | row_copy[-1] = None
51 | test_set.append(row_copy)
52 | y.append(row[-1])
53 | predicted = algorithm(train, test_set, y, *args)
54 | actual = [row[-1] for row in test]
55 | rmse = rmse_metric(actual, predicted)
56 | return rmse
57 |
58 | # Calculate the mean value of a list of numbers
59 | def mean(values):
60 | return sum(values) / float(len(values))
61 |
62 | # Calculate the covariance between x and y (unscaled; the 1/n factor cancels in b1)
63 | def covariance(x, mean_x, y, mean_y):
64 | covar = 0.0
65 | for i in range(len(x)):
66 | covar += (x[i] - mean_x) * (y[i] - mean_y)
67 | return covar
68 |
69 | # Calculate the variance of a list of numbers (unscaled, to match covariance above)
70 | def variance(values, mean):
71 | return sum([(x-mean)**2 for x in values])
72 |
73 | # Calculate coefficients
74 | def coefficients(dataset):
75 | x = [row[0] for row in dataset]
76 | y = [row[1] for row in dataset]
77 | x_mean, y_mean = mean(x), mean(y)
78 | b1 = covariance(x, x_mean, y, y_mean) / variance(x, x_mean)
79 | b0 = y_mean - b1 * x_mean
80 | return [b0, b1]
81 |
82 | # Simple linear regression algorithm
83 | def simple_linear_regression(train, test, y):
84 | predictions = list()
85 | b0, b1 = coefficients(train)
86 | x=[]
87 | for row in test:
88 | yhat = b0 + b1 * row[0]
89 | predictions.append(yhat)
90 | x.append(row[0])
91 | plt.figure()
92 | plt.plot(x,predictions,'r',x,y,'bo')
93 | plt.show()
94 | return predictions
95 |
96 | # Simple linear regression on insurance dataset
97 | seed(1)
98 | # load and prepare data
99 | filename = 'insurance.csv'
100 | dataset = load_csv(filename)
101 | for i in range(len(dataset[0])):
102 | str_column_to_float(dataset, i)
103 | # evaluate algorithm
104 | split = 0.6
105 | rmse = evaluate_algorithm(dataset, simple_linear_regression, split)
106 | print('RMSE: %.3f' % (rmse))
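107 | 
108 | # Usage sketch: fit b0 and b1 on the full dataset, then predict the total
109 | # payment for an illustrative count of 25 claims.
110 | # b0, b1 = coefficients(dataset)
111 | # print(b0 + b1 * 25)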
--------------------------------------------------------------------------------