├── .gitignore
├── CITATION.bib
├── LICENSE
├── PyImpetus.py
├── README.md
├── setup.py
└── tutorials
├── Classification_Tutorial.ipynb
├── Regression_Tutorial.ipynb
├── ionosphere.data
└── ionosphere.names
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 | *$py.class
5 |
6 | # C extensions
7 | *.so
8 |
9 | # Distribution / packaging
10 | .Python
11 | env/
12 | build/
13 | develop-eggs/
14 | dist/
15 | downloads/
16 | eggs/
17 | .eggs/
18 | lib/
19 | lib64/
20 | parts/
21 | sdist/
22 | var/
23 | *.egg-info/
24 | .installed.cfg
25 | *.egg
26 |
27 | # PyInstaller
28 | # Usually these files are written by a python script from a template
29 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
30 | *.manifest
31 | *.spec
32 |
33 | # Installer logs
34 | pip-log.txt
35 | pip-delete-this-directory.txt
36 |
37 | # Unit test / coverage reports
38 | htmlcov/
39 | .tox/
40 | .coverage
41 | .coverage.*
42 | .cache
43 | nosetests.xml
44 | coverage.xml
45 | *.cover
46 | .hypothesis/
47 |
48 | # Translations
49 | *.mo
50 | *.pot
51 |
52 | # Django stuff:
53 | *.log
54 | local_settings.py
55 |
56 | # Flask stuff:
57 | instance/
58 | .webassets-cache
59 |
60 | # Scrapy stuff:
61 | .scrapy
62 |
63 | # Sphinx documentation
64 | docs/_build/
65 |
66 | # PyBuilder
67 | target/
68 |
69 | # IPython Notebook
70 | .ipynb_checkpoints
71 |
72 | # pyenv
73 | .python-version
74 |
75 | # celery beat schedule file
76 | celerybeat-schedule
77 |
78 | # dotenv
79 | .env
80 |
81 | # virtualenv
82 | .venv/
83 | venv/
84 | ENV/
85 |
86 | # Spyder project settings
87 | .spyderproject
88 |
89 | # Rope project settings
90 | .ropeproject
--------------------------------------------------------------------------------
/CITATION.bib:
--------------------------------------------------------------------------------
1 | @article{HASSAN2025111069,
2 | title = {A wrapper feature selection approach using Markov blankets},
3 | journal = {Pattern Recognition},
4 | volume = {158},
5 | pages = {111069},
6 | year = {2025},
7 | issn = {0031-3203},
8 | doi = {10.1016/j.patcog.2024.111069},
9 | url = {https://www.sciencedirect.com/science/article/pii/S0031320324008203},
10 | author = {Atif Hassan and Jiaul Hoque Paik and Swanand Ravindra Khare and Syed Asif Hassan},
11 | keywords = {Feature selection, Markov blanket, Conditional independence test, Classification, Regression},
12 | abstract = {In feature selection, Markov Blanket (MB) based approaches have attracted considerable attention with most MB discovery algorithms being categorized as filter based techniques. Typically, the Conditional Independence (CI) test employed by such methods is different for different data types. In this article, we propose a novel Markov Blanket based wrapper feature selection method. The proposed approach employs Predictive Permutation Independence (PPI), a novel Conditional Independence (CI) test that allows it to work out-of-the-box for both classification and regression tasks on mixed data. PPI can work with any supervised algorithm to estimate the association of a feature with the target variable while also providing a measure of feature importance. The proposed approach also includes an optional MB aggregation step that can be used to find the optimal MB under non-faithful conditions. Our method (implementation and experimental results are available at https://github.com/AnonymousMLSubmissions/PatternRecognitionSubmission) outperforms other MB discovery methods, in terms of F1-score, by 7% on average, over 3 large-scale BN datasets. It also outperforms state-of-the-art feature selection techniques on 13 real-world datasets.}
13 | }
14 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2020 Atif Hassan
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/PyImpetus.py:
--------------------------------------------------------------------------------
1 | import math
2 | import time
3 | import numpy as np
4 | import pandas as pd
5 | from sklearn.model_selection import train_test_split, KFold, StratifiedKFold
6 | from joblib import Parallel, delayed
7 | from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor
8 | from sklearn.base import TransformerMixin, BaseEstimator, clone
9 | from sklearn.metrics import log_loss, mean_squared_error
10 | import scipy.stats as ss
11 | from collections import Counter
12 | import seaborn as sns
13 | import matplotlib.pyplot as plt
14 | from tqdm import tqdm
15 | import contextlib
16 | import joblib
17 |
18 |
19 |
20 | # tqdm enabled for joblib
21 | # This is all thanks to the answer by frenzykryger at https://stackoverflow.com/questions/24983493/tracking-progress-of-joblib-parallel-execution/58936697#58936697
22 | @contextlib.contextmanager
23 | def tqdm_joblib(tqdm_object):
24 | """Context manager to patch joblib to report into tqdm progress bar given as argument"""
25 | class TqdmBatchCompletionCallback(joblib.parallel.BatchCompletionCallBack):
26 | def __init__(self, *args, **kwargs):
27 | super().__init__(*args, **kwargs)
28 |
29 | def __call__(self, *args, **kwargs):
30 | tqdm_object.update(n=self.batch_size)
31 | return super().__call__(*args, **kwargs)
32 |
33 | old_batch_callback = joblib.parallel.BatchCompletionCallBack
34 | joblib.parallel.BatchCompletionCallBack = TqdmBatchCompletionCallback
35 | try:
36 | yield tqdm_object
37 | finally:
38 | joblib.parallel.BatchCompletionCallBack = old_batch_callback
39 | tqdm_object.close()
40 |
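# Example usage (an illustrative sketch, not part of the original module): any
# joblib.Parallel call issued inside the context manager reports its batch
# completions to the supplied tqdm bar. math.sqrt is just a stand-in workload.
#
#   with tqdm_joblib(tqdm(desc="Demo", total=100)) as pbar:
#       out = Parallel(n_jobs=2)(delayed(math.sqrt)(i) for i in range(100))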
41 |
42 | def convert_numpy_to_pandas(data):
43 |     return pd.DataFrame(data, columns=['Column'+str(i) for i in range(1, data.shape[1]+1)])
44 |
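# Illustrative example (not part of the original module): default column names
# are 'Column1' .. 'ColumnN', e.g.
#   convert_numpy_to_pandas(np.zeros((2, 3))).columns
#   -> Index(['Column1', 'Column2', 'Column3'], dtype='object')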
45 |
46 |
47 |
48 |
49 |
50 |
51 | class PPIMBC(TransformerMixin, BaseEstimator):
52 | def __init__(self, model=None, p_val_thresh=0.05, num_simul=30, cv=0, simul_size=0.2, simul_type=0, sig_test_type="non-parametric", random_state=None, n_jobs=-1, verbose=2):
53 | self.random_state = random_state
54 | if model is not None:
55 | self.model = model
56 | else:
57 | self.model = DecisionTreeClassifier(random_state=self.random_state)
58 | self.p_val_thresh = p_val_thresh
59 | self.num_simul = num_simul
60 | self.simul_size = simul_size
61 | self.simul_type = simul_type
62 | self.sig_test_type = sig_test_type
63 | self.cv = cv
64 | self.n_jobs = n_jobs
65 | self.verbose = verbose
66 | self.MB = None
67 | self.feat_imp_scores = list()
68 |
69 |
70 |
71 |
72 |
73 | #---------------------------------------------------------------------------------------------
74 |     # _feature_importance() - Function that gives the score with and without feature permutation #
75 | #---------------------------------------------------------------------------------------------
76 | # data - The data provided by user
77 | # Y - Target variable
78 |     # i - The random seed for this simulation's train-test split and permutation; it ranges over the number of simulations provided by the user
79 | def _feature_importance(self, data, Y, i):
80 | # The target variable comes in 2D shape. Reshape it
81 | Y = np.ravel(Y)
82 | # Split the data into train and test
83 | if self.simul_type == 0:
84 | x_train, x_test, y_train, y_test = train_test_split(data, Y, test_size=self.simul_size, random_state=i)
85 | else:
86 | x_train, x_test, y_train, y_test = train_test_split(data, Y, test_size=self.simul_size, random_state=i, stratify=Y)
87 |         # Find all the labels. If there are more than 2 labels, then it is multi-class classification
88 | # So handle accordingly
89 | self_labels = np.unique(y_train)
90 | if len(self_labels) <= 2:
91 | self_labels = None
92 | # Clone the user provided model
93 | model = clone(self.model)
94 | # Fit the model on train data
95 | model.fit(x_train, y_train)
96 |         # SVMs and RidgeClassifiers don't have a predict_proba function, so handle them accordingly
97 | if "SVC" in type(model).__name__ or "RidgeClassifier" in type(model).__name__:
98 | preds = model.decision_function(x_test)
99 | else:
100 | preds = model.predict_proba(x_test)
101 |         # Calculate the log_loss metric. The significance test is performed on this value
102 | x = log_loss(y_test, preds, labels=self_labels)
103 |
104 | np.random.seed(i)
105 |         np.random.shuffle(x_test[:,0])  # Permute the feature under test (always the first column)
106 |
107 |         # SVMs and RidgeClassifiers don't have a predict_proba function, so handle them accordingly
108 | if "SVC" in type(model).__name__ or "RidgeClassifier" in type(model).__name__:
109 | preds = model.decision_function(x_test)
110 | else:
111 | preds = model.predict_proba(x_test)
112 |         # Calculate the log_loss metric. The significance test is performed on this value
113 | y = log_loss(y_test, preds, labels=self_labels)
114 |
115 | return [x, y]
116 |
117 |
118 |
119 |
120 | #-----------------------------------------------------
121 | # _PPI() - Function that provides feature importance #
122 | #-----------------------------------------------------
123 | # X - The feature to provide importance for
124 | # Y - The target variable
125 | # Z - The remaining variables upon which X is conditionally tested
126 | def _PPI(self, X, Y, Z, col):
127 | X = np.reshape(X, (-1, 1))
128 | # Either growth stage or shrink stage
129 | if Z is None:
130 | data = X
131 | elif Z.ndim == 1:
132 | # Always keep the variable to check, in front
133 | data = np.concatenate((X, np.reshape(Z, (-1, 1))), axis=1)
134 | else:
135 | data = np.concatenate((X, Z), axis=1)
136 |
137 |         # testX -> metric scores on the original (unpermuted) test set
138 |         # testY -> metric scores on the permuted test set
139 | testX, testY = list(), list()
140 | parallel = Parallel(n_jobs=self.n_jobs, verbose=self.verbose)
141 | p_values = parallel(delayed(self._feature_importance)(data, Y, i) for i in range(self.num_simul))
142 | p_values = np.array(p_values)
143 | testX, testY = p_values[:,0], p_values[:,1]
144 |
145 |         # Since we want the log_loss to be smaller for testX,
146 |         # we perform a one-tailed significance test (left tail, not right)
147 | if self.sig_test_type == "parametric":
148 | t_stat, p_val = ss.ttest_ind(testX, testY, alternative="less", nan_policy="omit")
149 | else:
150 | t_stat, p_val = ss.wilcoxon(testX, testY, alternative="less", zero_method="zsplit")
151 | if col is None:
152 | return p_val#ss.t.cdf(t_stat, len(testX) + len(testY) - 2) # Left part of t-test
153 | else:
154 | return [col, p_val]#ss.t.cdf(t_stat, len(testX) + len(testY) - 2)] # Left part of t-test
155 |
156 |
157 |
158 |
159 |
160 | #---------------------------------------------------------------------
161 | # _grow() - Function that performs the growth stage of the algorithm #
162 | #---------------------------------------------------------------------
163 | # data - The data provided by user
164 | # Y - Target variable
165 | def _grow(self, data, Y):
166 | MB = list()
167 |         # For each feature, find its individual importance in relation to the target variable
168 |         # Since each feature's importance is checked individually, these checks can run in parallel
169 | parallel = Parallel(n_jobs=self.n_jobs, verbose=self.verbose)
170 | feats_and_pval = parallel(delayed(self._PPI)(data[col].values, Y, None, col) for col in data.columns)
171 |
172 |         # Sort the features by p-value (in decreasing order)
173 |         feats_and_pval.sort(key = lambda x: x[1], reverse=True)
174 |         # Only significant features are added to the MB
175 | for feat, p_val in feats_and_pval:
176 | if p_val < self.p_val_thresh:
177 | MB.append(feat)
178 | return MB
179 |
180 |
181 |
182 |
183 |
184 | #-----------------------------------------------------------------------
185 | # _shrink() - Function that performs the shrink stage of the algorithm #
186 | #-----------------------------------------------------------------------
187 | # data - The data provided by user
188 | # Y - Target variable
189 | # MB - The MB currently populated by features from growth stage
190 | def _shrink(self, data, Y, MB):
191 | scores = list()
192 | # If MB is empty, return
193 | if len(MB) < 1:
194 | return list(), list()
195 | # Generate a list for false positive features
196 | remove = list()
197 | # For each feature in MB, check if it is false positive
198 | for col in MB:
199 | MB_to_consider = [i for i in MB if i not in [col]+remove]
200 | # Again, if there is only one element in MB, no shrinkage is required
201 | if len(MB_to_consider) < 1:
202 | break
203 | # Get the p-value from the Predictive Permutation Independence Test
204 | p_val = self._PPI(data[col].values, Y, data[MB_to_consider].values, None)
205 | if p_val > self.p_val_thresh:
206 | remove.append(col)
207 | else:
208 | scores.append(np.log(1/p_val))
209 |         # Finally, return only those MB features that were not marked for removal
210 | return [i for i in MB if i not in remove], scores
211 |
212 |
213 | #----------------------------------------------------------------------
214 | # _find_MB() - Function that finds the MB of provided target variable #
215 | #----------------------------------------------------------------------
216 | # data - The data provided by user
217 | # Y - Target variable
218 | def _find_MB(self, data, Y):
219 | # Keep a copy of the original data
220 | orig_data = data.copy()
221 | # The target variable needs to be reshaped for downstream processing
222 | Y = np.reshape(Y, (-1, 1))
223 |
224 | # Growth phase
225 | MB = self._grow(data, Y)
226 |
227 | # Shrink phase. This is the final MB
228 | MB, scores = self._shrink(orig_data, Y, MB)
229 |
230 | return MB, scores
231 |
232 |
233 |
234 |
235 | #------------------------------------------------------------------------------------------
236 | # fit() - Function that finds the MB of target variable using CV strategy for aggregation #
237 | #------------------------------------------------------------------------------------------
238 | # data - The data provided by user
239 | # Y - Target variable
240 | def fit(self, data, Y):
241 | # If user passed a numpy matrix then convert to pandas dataframe and proceed
242 | if not isinstance(data, pd.DataFrame):
243 | data = convert_numpy_to_pandas(data)
244 |
245 | # for a value of 0, no CV is applied
246 |         if self.cv != 0:
247 | # Find MB for each fold in parallel
248 | parallel = Parallel(n_jobs=self.n_jobs)#, verbose=self.verbose)
249 | if type(self.cv).__name__ == "StratifiedKFold":
250 | if self.verbose > 0:
251 | with tqdm_joblib(tqdm(desc="Progress bar", total=self.cv.get_n_splits())) as progress_bar:
252 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in self.cv.split(data, Y))
253 | else:
254 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in self.cv.split(data, Y))
255 | elif type(self.cv).__name__ == "KFold":
256 | if self.verbose > 0:
257 | with tqdm_joblib(tqdm(desc="Progress bar", total=self.cv.get_n_splits())) as progress_bar:
258 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in self.cv.split(data))
259 | else:
260 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in self.cv.split(data))
261 | else:
262 | kfold = KFold(n_splits=self.cv, random_state=self.random_state, shuffle=True)
263 | if self.verbose > 0:
264 | with tqdm_joblib(tqdm(desc="Progress bar", total=self.cv)) as progress_bar:
265 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in kfold.split(data))
266 | else:
267 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in kfold.split(data))
268 |
269 | # Separate out the features from the importance scores
270 | for i in range(len(tmp)):
271 | if i == 0:
272 | feature_sets, scores = [tmp[i][0]], [tmp[i][1]]
273 | else:
274 | feature_sets.append(tmp[i][0])
275 | scores.append(tmp[i][1])
276 |
277 | # Now get all the features and find their frequencies
278 | final_feats = dict()
279 | for fs in feature_sets:
280 | for i in fs:
281 | if i not in final_feats:
282 | final_feats[i] = 1
283 | else:
284 |                     final_feats[i] += 1
285 |
286 | # Now find the most robust MB
287 | final_MB, max_score, final_feat_imp = list(), 0, list()
288 | for fs, feat_imp in zip(feature_sets, scores):
289 | tmp = [final_feats[i] for i in fs]
290 | score = sum(tmp)/max(len(tmp), 1)
291 | if score > max_score:
292 | final_MB = fs
293 | final_feat_imp = feat_imp
294 | max_score = score
295 |
296 | # Sort the features according to their importance
297 | tmp_feats_and_imp = list(zip(final_MB, final_feat_imp))
298 | tmp_feats_and_imp.sort(key = lambda x: x[1], reverse=True)
299 |
300 | self.MB = [i for i, _ in tmp_feats_and_imp]
301 | self.feat_imp_scores = [i for _, i in tmp_feats_and_imp]
302 |
303 | else:
304 | final_MB, final_feat_imp = self._find_MB(data.copy(), Y)
305 |
306 | # Sort the features according to their importance
307 | tmp_feats_and_imp = list(zip(final_MB, final_feat_imp))
308 | tmp_feats_and_imp.sort(key = lambda x: x[1], reverse=True)
309 |
310 | self.MB = [i for i, _ in tmp_feats_and_imp]
311 | self.feat_imp_scores = [i for _, i in tmp_feats_and_imp]
312 |
313 |
314 |
315 |
316 | #--------------------------------------------------------------
317 | # transform() - Function that returns the MB part of the data #
318 | #--------------------------------------------------------------
319 | def transform(self, data):
320 | # If user passed a numpy matrix then convert to pandas dataframe
321 | if not isinstance(data, pd.DataFrame):
322 | data = convert_numpy_to_pandas(data)
323 | # And return a numpy matrix
324 | return data[self.MB].values
325 | else:
326 | return data[self.MB]
327 |
328 |
329 |
330 |
331 | #-----------------------------------------------------------
332 | # fit_transform() - Wrapper function for fit and transform #
333 | #-----------------------------------------------------------
334 | # data - The data provided by user
335 | # Y - Target variable
336 | def fit_transform(self, data, Y):
337 | self.fit(data, Y)
338 | return self.transform(data)
339 |
340 |
341 |
342 |
343 | #-------------------------------------------------------------------------
344 | # feature_importance() - Wrapper function for plotting feature importance #
345 | #--------------------------------------------------------------------------
346 | def feature_importance(self):
347 | y_axis = np.arange(len(self.MB))
348 | x_axis = self.feat_imp_scores
349 |
350 | sns.barplot(x=x_axis, y=y_axis, orient="h")
351 | plt.yticks(y_axis, [str(i) for i in self.MB], size='small')
352 | plt.xlabel("Importance Scores")
353 | plt.ylabel("Features")
354 | sns.despine()
355 | plt.show()
356 |
357 |
358 |
359 |
360 |
361 |
362 |
363 |
364 |
365 |
366 |
367 |
368 |
369 |
370 |
371 | class PPIMBR(TransformerMixin, BaseEstimator):
372 | def __init__(self, model=None, p_val_thresh=0.05, num_simul=30, simul_size=0.2, sig_test_type="non-parametric", cv=0, random_state=None, n_jobs=-1, verbose=2):
373 | self.random_state = random_state
374 | if model is not None:
375 | self.model = model
376 | else:
377 | self.model = DecisionTreeRegressor(random_state=self.random_state)
378 | self.p_val_thresh = p_val_thresh
379 | self.num_simul = num_simul
380 | self.simul_size = simul_size
381 | self.sig_test_type = sig_test_type
382 | self.cv = cv
383 | self.n_jobs = n_jobs
384 | self.verbose = verbose
385 | self.MB = None
386 | self.feat_imp_scores = list()
387 |
388 |
389 |
390 |
391 |
392 | #---------------------------------------------------------------------------------------------
393 |     # _feature_importance() - Function that gives the score with and without feature permutation #
394 | #---------------------------------------------------------------------------------------------
395 | # data - The data provided by user
396 | # Y - Target variable
397 |     # i - The random seed for this simulation's train-test split and permutation; it ranges over the number of simulations provided by the user
398 | def _feature_importance(self, data, Y, i):
399 | # The target variable comes in 2D shape. Reshape it
400 | Y = np.ravel(Y)
401 | # Clone the user provided model
402 | model = clone(self.model)
403 | # Split the data into train and test
404 | x_train, x_test, y_train, y_test = train_test_split(data, Y, test_size=self.simul_size, random_state=i)
405 |
406 | # Fit the model on train data
407 | model.fit(x_train, y_train)
408 | preds = model.predict(x_test)
409 |         # Calculate the MSE metric. The significance test is performed on this value
410 | x = mean_squared_error(y_test, preds)
411 |
412 |         # Permute the feature under test (always the first column)
413 | np.random.seed(i)
414 | np.random.shuffle(x_test[:,0])
415 |
416 | preds = model.predict(x_test)
417 |         # Calculate the MSE metric. The significance test is performed on this value
418 | y = mean_squared_error(y_test, preds)
419 |
420 | return [x, y]
421 |
422 |
423 |
424 |
425 |
426 | #-----------------------------------------------------
427 |     # _PPI() - Function that provides feature importance #
428 | #-----------------------------------------------------
429 | # X - The feature to provide importance for
430 | # Y - The target variable
431 | # Z - The remaining variables upon which X is conditionally tested
432 | def _PPI(self, X, Y, Z, col):
433 | X = np.reshape(X, (-1, 1))
434 | # Either growth stage or shrink stage
435 | if Z is None:
436 | data = X
437 | elif Z.ndim == 1:
438 | # Always keep the variable to check, in front
439 | data = np.concatenate((X, np.reshape(Z, (-1, 1))), axis=1)
440 | else:
441 | data = np.concatenate((X, Z), axis=1)
442 |
443 |         # testX -> metric scores on the original (unpermuted) test set
444 |         # testY -> metric scores on the permuted test set
445 | testX, testY = list(), list()
446 | parallel = Parallel(n_jobs=self.n_jobs, verbose=self.verbose)
447 | p_values = parallel(delayed(self._feature_importance)(data, Y, i) for i in range(self.num_simul))
448 | p_values = np.array(p_values)
449 | testX, testY = p_values[:,0], p_values[:,1]
450 |
451 |         # Since we want the MSE to be smaller for testX,
452 |         # we perform a one-tailed significance test (left tail, not right)
453 | if self.sig_test_type == "parametric":
454 | t_stat, p_val = ss.ttest_ind(testX, testY, alternative="less", nan_policy="omit")
455 | else:
456 | t_stat, p_val = ss.wilcoxon(testX, testY, alternative="less", zero_method="zsplit")
457 | if col is None:
458 | return p_val#ss.t.cdf(t_stat, len(testX) + len(testY) - 2) # Left part of t-test
459 | else:
460 | return [col, p_val]#ss.t.cdf(t_stat, len(testX) + len(testY) - 2)] # Left part of t-test
461 |
462 |
463 |
464 |
465 |
466 | #---------------------------------------------------------------------
467 | # _grow() - Function that performs the growth stage of the algorithm #
468 | #---------------------------------------------------------------------
469 | # data - The data provided by user
470 | # Y - Target variable
471 | def _grow(self, data, Y):
472 | MB = list()
473 |         # For each feature, find its individual importance in relation to the target variable
474 |         # Since each feature's importance is checked individually, these checks can run in parallel
475 | parallel = Parallel(n_jobs=self.n_jobs, verbose=self.verbose)
476 | feats_and_pval = parallel(delayed(self._PPI)(data[col].values, Y, None, col) for col in data.columns)
477 |         #feats_and_pval = [self._PPI(data[col].values, Y, None, col) for col in tqdm(data.columns)]
478 |
479 |         # Sort the features by p-value (in decreasing order)
480 |         feats_and_pval.sort(key = lambda x: x[1], reverse=True)
481 |         # Only significant features are added to the MB
482 | for feat, p_val in feats_and_pval:
483 | if p_val < self.p_val_thresh:
484 | MB.append(feat)
485 | return MB
486 |
487 |
488 |
489 |
490 |
491 | #-----------------------------------------------------------------------
492 | # _shrink() - Function that performs the shrink stage of the algorithm #
493 | #-----------------------------------------------------------------------
494 | # data - The data provided by user
495 | # Y - Target variable
496 | # MB - The MB currently populated by features from growth stage
497 | def _shrink(self, data, Y, MB):
498 | scores = list()
499 | # If MB is empty, return
500 | if len(MB) < 1:
501 | return list(), list() # Callers expect a pair
502 | # Generate a list for false positive features
503 | remove = list()
504 | # For each feature in MB, check if it is false positive
505 | for col in MB:
506 | MB_to_consider = [i for i in MB if i not in [col]+remove]
507 | # Again, if there is only one element in MB, no shrinkage is required
508 | if len(MB_to_consider) < 1:
509 | break
510 |             # Get the p-value from the Predictive Permutation Independence (PPI) test
511 | p_val = self._PPI(data[col].values, Y, data[MB_to_consider].values, None)
512 | if p_val > self.p_val_thresh:
513 | remove.append(col)
514 | else:
515 | scores.append(np.log(1/p_val))
516 |         # Finally, return only those MB features that were not marked for removal
517 | return [i for i in MB if i not in remove], scores
518 |
519 |
520 | #----------------------------------------------------------------------
521 | # _find_MB() - Function that finds the MB of provided target variable #
522 | #----------------------------------------------------------------------
523 | # data - The data provided by user
524 | # Y - Target variable
525 | def _find_MB(self, data, Y):
526 | # Keep a copy of the original data
527 | orig_data = data.copy()
528 | # The target variable needs to be reshaped for downstream processing
529 | Y = np.reshape(Y, (-1, 1))
530 |
531 | # Growth phase
532 | MB = self._grow(data, Y)
533 |
534 | # Shrink phase. This is the final MB
535 | MB, scores = self._shrink(orig_data, Y, MB)
536 |
537 | return MB, scores
538 |
539 |
540 |
541 |
542 | #------------------------------------------------------------------------------------------
543 | # fit() - Function that finds the MB of target variable using CV strategy for aggregation #
544 | #------------------------------------------------------------------------------------------
545 | # data - The data provided by user
546 | # Y - Target variable
547 | def fit(self, data, Y):
548 | # If user passed a numpy matrix then convert to pandas dataframe and proceed
549 | if not isinstance(data, pd.DataFrame):
550 | data = convert_numpy_to_pandas(data)
551 |
552 | # for a value of 0, no CV is applied
553 |         if self.cv != 0:
554 | # Find MB for each fold in parallel
555 | parallel = Parallel(n_jobs=self.n_jobs)#, verbose=self.verbose)
556 | if type(self.cv).__name__ == "StratifiedKFold":
557 | if self.verbose > 0:
558 | with tqdm_joblib(tqdm(desc="Progress bar", total=self.cv.get_n_splits())) as progress_bar:
559 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in self.cv.split(data, Y))
560 | else:
561 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in self.cv.split(data, Y))
562 | elif type(self.cv).__name__ == "KFold":
563 | if self.verbose > 0:
564 | with tqdm_joblib(tqdm(desc="Progress bar", total=self.cv.get_n_splits())) as progress_bar:
565 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in self.cv.split(data))
566 | else:
567 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in self.cv.split(data))
568 | else:
569 | kfold = KFold(n_splits=self.cv, random_state=self.random_state, shuffle=True)
570 | if self.verbose > 0:
571 | with tqdm_joblib(tqdm(desc="Progress bar", total=self.cv)) as progress_bar:
572 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in kfold.split(data))
573 | else:
574 | tmp = parallel(delayed(self._find_MB)(data.iloc[train].copy(), Y[train]) for train, test in kfold.split(data))
575 |
576 | # Separate out the features from the importance scores
577 | for i in range(len(tmp)):
578 | if i == 0:
579 | feature_sets, scores = [tmp[i][0]], [tmp[i][1]]
580 | else:
581 | feature_sets.append(tmp[i][0])
582 | scores.append(tmp[i][1])
583 |
584 | # Now get all the features and find their frequencies
585 | final_feats = dict()
586 | for fs in feature_sets:
587 | for i in fs:
588 | if i not in final_feats:
589 | final_feats[i] = 1
590 | else:
591 |                     final_feats[i] += 1
592 |
593 | # Now find the most robust MB
594 | final_MB, max_score, final_feat_imp = list(), 0, list()
595 | for fs, feat_imp in zip(feature_sets, scores):
596 | tmp = [final_feats[i] for i in fs]
597 | score = sum(tmp)/max(len(tmp), 1)
598 | if score > max_score:
599 | final_MB = fs
600 | final_feat_imp = feat_imp
601 | max_score = score
602 |
603 | # Sort the features according to their importance
604 | tmp_feats_and_imp = list(zip(final_MB, final_feat_imp))
605 | tmp_feats_and_imp.sort(key = lambda x: x[1], reverse=True)
606 |
607 | self.MB = [i for i, _ in tmp_feats_and_imp]
608 | self.feat_imp_scores = [i for _, i in tmp_feats_and_imp]
609 |
610 | else:
611 | final_MB, final_feat_imp = self._find_MB(data.copy(), Y)
612 |
613 | # Sort the features according to their importance
614 | tmp_feats_and_imp = list(zip(final_MB, final_feat_imp))
615 | tmp_feats_and_imp.sort(key = lambda x: x[1], reverse=True)
616 |
617 | self.MB = [i for i, _ in tmp_feats_and_imp]
618 | self.feat_imp_scores = [i for _, i in tmp_feats_and_imp]
619 |
620 |
621 |
622 |
623 | #--------------------------------------------------------------
624 | # transform() - Function that returns the MB part of the data #
625 | #--------------------------------------------------------------
626 | def transform(self, data):
627 | # If user passed a numpy matrix then convert to pandas dataframe
628 | if not isinstance(data, pd.DataFrame):
629 | data = convert_numpy_to_pandas(data)
630 | # And return a numpy matrix
631 | return data[self.MB].values
632 | else:
633 | return data[self.MB]
634 |
635 |
636 |
637 |
638 | #-----------------------------------------------------------
639 | # fit_transform() - Wrapper function for fit and transform #
640 | #-----------------------------------------------------------
641 | # data - The data provided by user
642 | # Y - Target variable
643 | def fit_transform(self, data, Y):
644 | self.fit(data, Y)
645 | return self.transform(data)
646 |
647 |
648 |
649 |
650 | #-------------------------------------------------------------------------
651 | # feature_importance() - Wrapper function for plotting feature importance #
652 | #--------------------------------------------------------------------------
653 | def feature_importance(self):
654 | y_axis = np.arange(len(self.MB))
655 | x_axis = self.feat_imp_scores
656 |
657 | sns.barplot(x=x_axis, y=y_axis, orient="h")
658 | plt.yticks(y_axis, [str(i) for i in self.MB], size='small')
659 | plt.xlabel("Importance Scores")
660 | plt.ylabel("Features")
661 | sns.despine()
662 | plt.show()
663 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | [](https://www.python.org/)
2 | [](https://github.com/atif-hassan/)
3 |
4 | [](https://pypi.python.org/pypi/PyImpetus/)
5 | [](https://pepy.tech/project/PyImpetus)
6 | [](https://github.com/atif-hassan/PyImpetus/commits/master)
7 | # PyImpetus (a.k.a PPFS)
8 | PyImpetus is a **[Markov Blanket based feature selection algorithm](https://www.sciencedirect.com/science/article/pii/S0031320324008203)** that selects a subset of features by considering their performance both individually and as a group. This allows the algorithm to not only select the best set of features, but to select the **best set of features that play well with each other**. For example, the best performing feature might not play well with others, while the remaining features, taken together, could outperform it. PyImpetus takes this into account and produces the best possible combination. Thus, the algorithm provides a minimal feature subset. So, **you do not have to decide on how many features to take. PyImpetus selects the optimal set for you.**
9 |
10 | PyImpetus has been completely revamped and now supports **binary classification, multi-class classification and regression** tasks. It has been tested on 14 datasets and outperformed state-of-the-art Markov Blanket learning algorithms on all of them, along with traditional feature selection algorithms such as Forward Feature Selection, Backward Feature Elimination and Recursive Feature Elimination.
11 |
12 | ## How to install?
13 | ```pip install PyImpetus```
14 |
15 | ## Functions and parameters
16 | ```python
17 | # The initialization of PyImpetus takes in multiple parameters as input
18 | # PPIMBC is for classification
19 | model = PPIMBC(model, p_val_thresh, num_simul, simul_size, simul_type, sig_test_type, cv, verbose, random_state, n_jobs)
20 | ```
21 | - **model** - `estimator object, default=DecisionTreeClassifier()` The model used to perform classification in order to find feature importance via significance testing.
22 | - **p_val_thresh** - `float, default=0.05` The p-value (in this case, feature importance) below which a feature will be considered as a candidate for the final MB.
23 | - **num_simul** - `int, default=30` **(This parameter has a huge impact on speed)** Number of train-test splits to perform to check the usefulness of each feature. For large datasets, the value should be considerably reduced, though do not go below 5.
24 | - **simul_size** - `float, default=0.2` The size of the test set in each train-test split
25 | - **simul_type** - `int (0 or 1), default=0` Whether to apply stratification to the train-test splits
26 | - `0` means train-test splits are not stratified.
27 | - `1` means the train-test splits will be stratified.
28 | - **sig_test_type** - `string, default="non-parametric"` This determines the type of significance test to use.
29 | - `"parametric"` means a parametric significance test will be used (Note: This test selects very few features)
30 | - `"non-parametric"` means a non-parametric significance test will be used
31 | - **cv** - `cv object/int, default=0` Determines the number of splits for cross-validation. An sklearn CV object can also be passed (see the example after this list). A value of 0 means CV is disabled.
32 | - **verbose** - `int, default=2` Controls the verbosity: the higher, the more messages.
33 | - **random_state** - `int or RandomState instance, default=None` Pass an int for reproducible output across multiple function calls.
34 | - **n_jobs** - `int, default=-1` The number of CPUs to use to do the computation.
35 |   - `None` means 1 unless in a `joblib.parallel_backend` context.
36 | - `-1` means using all processors.
37 |
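For example, a minimal sketch (assuming scikit-learn is installed) of passing an sklearn CV object instead of an integer for **cv**:

```python
from sklearn.model_selection import StratifiedKFold
from sklearn.tree import DecisionTreeClassifier
from PyImpetus import PPIMBC

# The StratifiedKFold object drives the fold-wise MB aggregation step
model = PPIMBC(model=DecisionTreeClassifier(random_state=27),
               cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=27))
```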
38 | ```python
39 | # The initialization of PyImpetus takes in multiple parameters as input
40 | # PPIMBR is for regression
41 | model = PPIMBR(model, p_val_thresh, num_simul, simul_size, sig_test_type, cv, verbose, random_state, n_jobs)
42 | ```
43 | - **model** - `estimator object, default=DecisionTreeRegressor()` The model used to perform regression in order to find feature importance via significance testing.
44 | - **p_val_thresh** - `float, default=0.05` The p-value (in this case, feature importance) below which a feature will be considered as a candidate for the final MB.
45 | - **num_simul** - `int, default=30` **(This parameter has a huge impact on speed)** Number of train-test splits to perform to check the usefulness of each feature. For large datasets, the value should be considerably reduced, though do not go below 5.
46 | - **simul_size** - `float, default=0.2` The size of the test set in each train-test split
47 | - **sig_test_type** - `string, default="non-parametric"` This determines the type of significance test to use.
48 | - `"parametric"` means a parametric significance test will be used (Note: This test selects very few features)
49 | - `"non-parametric"` means a non-parametric significance test will be used
50 | - **cv** - `cv object/int, default=0` Determines the number of splits for cross-validation. An sklearn CV object can also be passed. A value of 0 means CV is disabled.
51 | - **verbose** - `int, default=2` Controls the verbosity: the higher, the more messages.
52 | - **random_state** - `int or RandomState instance, default=None` Pass an int for reproducible output across multiple function calls.
53 | - **n_jobs** - `int, default=-1` The number of CPUs to use to do the computation.
54 |   - `None` means 1 unless in a `joblib.parallel_backend` context.
55 | - `-1` means using all processors.
56 |
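For example, a minimal sketch (assuming scikit-learn is installed) of initializing PPIMBR for a regression task:

```python
from sklearn.tree import DecisionTreeRegressor
from PyImpetus import PPIMBR

# DecisionTreeRegressor is also the default when no model is passed
model = PPIMBR(model=DecisionTreeRegressor(random_state=27), p_val_thresh=0.05, num_simul=30, cv=0)
```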
57 | ```python
58 | # To fit PyImpetus on provided dataset and find recommended features
59 | fit(data, target)
60 | ```
61 | - **data** - A pandas dataframe or a numpy matrix upon which feature selection is to be applied\
62 | (Passing pandas dataframe allows using correct column names. Numpy matrix will apply default column names)
63 | - **target** - A numpy array, denoting the target variable
64 |
65 | ```python
66 | # This function returns the data pruned down to the features that form the MB (these are the recommended features)
67 | transform(data)
68 | ```
69 | - **data** - A pandas dataframe or a numpy matrix which needs to be pruned\
70 | (Passing pandas dataframe allows using correct column names. Numpy matrix will apply default column names)
71 |
72 | ```python
73 | # To fit PyImpetus on provided dataset and return pruned data
74 | fit_transform(data, target)
75 | ```
76 | - **data** - A pandas dataframe or numpy matrix upon which feature selection is to be applied\
77 | (Passing pandas dataframe allows using correct column names. Numpy matrix will apply default column names)
78 | - **target** - A numpy array, denoting the target variable
79 |
80 | ```python
81 | # To plot XGBoost style feature importance
82 | feature_importance()
83 | ```
84 |
85 |
86 | ## How to import?
87 | ```python
88 | from PyImpetus import PPIMBC, PPIMBR
89 | ```
90 |
91 | ## Usage
92 | If data is a pandas dataframe
93 | ```python
94 | # Import the algorithm. PPIMBC is for classification and PPIMBR is for regression
95 | from PyImpetus import PPIMBC, PPIMBR
from sklearn.svm import SVC
96 | # Initialize the PyImpetus object
97 | model = PPIMBC(model=SVC(random_state=27, class_weight="balanced"), p_val_thresh=0.05, num_simul=30, simul_size=0.2, simul_type=0, sig_test_type="non-parametric", cv=5, random_state=27, n_jobs=-1, verbose=2)
98 | # The fit_transform function is a wrapper around the individual fit and transform functions.
99 | # The fit function finds the MB for the given data while the transform function returns the pruned form of the dataset
100 | df_train = model.fit_transform(df_train.drop("Response", axis=1), df_train["Response"].values)
101 | df_test = model.transform(df_test)
102 | # Check out the MB
103 | print(model.MB)
104 | # Check out the feature importance scores for the selected feature subset
105 | print(model.feat_imp_scores)
106 | # Get a plot of the feature importance scores
107 | model.feature_importance()
108 | ```
109 |
110 | If data is a numpy matrix
111 | ```python
112 | # Import the algorithm. PPIMBC is for classification and PPIMBR is for regression
113 | from PyImpetus import PPIMBC, PPIMBR
from sklearn.svm import SVC
114 | # Initialize the PyImpetus object
115 | model = PPIMBC(model=SVC(random_state=27, class_weight="balanced"), p_val_thresh=0.05, num_simul=30, simul_size=0.2, simul_type=0, sig_test_type="non-parametric", cv=5, random_state=27, n_jobs=-1, verbose=2)
116 | # The fit_transform function is a wrapper around the individual fit and transform functions.
117 | # The fit function finds the MB for the given data while the transform function returns the pruned form of the dataset
118 | df_train = model.fit_transform(x_train, y_train)
119 | df_test = model.transform(x_test)
120 | # Check out the MB
121 | print(model.MB)
122 | # Check out the feature importance scores for the selected feature subset
123 | print(model.feat_imp_scores)
124 | # Get a plot of the feature importance scores
125 | model.feature_importance()
126 | ```
127 |
128 | ## For better accuracy
129 | Note: Play with the values of **num_simul**, **simul_size**, **simul_type** and **p_val_thresh** because sometimes a specific combination of these values will end up giving the best results
130 | - ~~Increase the **cv** value~~ In all experiments, **cv** did not help in getting better accuracy. Use this only when you have an extremely small dataset
131 | - Increase the **num_simul** value
132 | - Try one of these values for **simul_size** = `{0.1, 0.2, 0.3, 0.4}`
133 | - Use non-linear models for feature selection. Apply hyper-parameter tuning on models
134 | - Increase the value of **p_val_thresh** in order to increase the number of features to include in the Markov Blanket
135 |
136 | ## For better speeds
137 | - ~~Decrease the **cv** value. For large datasets cv might not be required. Therefore, set **cv=0** to disable the aggregation step. This will result in less robust feature subset selection but at much faster speeds~~
138 | - Decrease the **num_simul** value but don't decrease it below 5
139 | - Set **n_jobs** to -1
140 | - Use linear models
141 |
142 | ## For selection of fewer features
143 | - Try reducing the **p_val_thresh** value
144 | - Try out `sig_test_type = "parametric"` (see the sketch below)
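
A minimal sketch of such a stricter configuration (the values here are illustrative, not tuned):

```python
from PyImpetus import PPIMBC

# Both a lower p_val_thresh and the parametric test shrink the selected feature set
model = PPIMBC(p_val_thresh=0.01, sig_test_type="parametric")
```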
145 |
146 | ## Performance in terms of Accuracy (classification) and MSE (regression)
147 | | Dataset | # of samples | # of features | Task Type | Score using all features | Score using [featurewiz](https://github.com/AutoViML/featurewiz) | Score using PyImpetus | # of features selected | % of features selected | Tutorial |
148 | | --- | --- | --- | --- |--- |--- |--- |--- |--- |--- |
149 | | Ionosphere | 351 | 34 | Classification | 88.01% | | 92.86% | 14 | 42.42% | [tutorial here](https://github.com/atif-hassan/PyImpetus/blob/master/tutorials/Classification_Tutorial.ipynb) |
150 | | Arcene | 100 | 10000 | Classification | 82% | | 84.72% | 304 | 3.04% | |
151 | | AlonDS2000 | 62 | 2000 | Classification | 80.55% | 86.98% | 88.49% | 75 | 3.75% | |
152 | | slice_localization_data | 53500 | 384 | Regression | 6.54 | | 5.69 | 259 | 67.45% | [tutorial here](https://github.com/atif-hassan/PyImpetus/blob/master/tutorials/Regression_Tutorial.ipynb) |
153 |
154 | **Note**: Here, for the first, second and third tasks, a higher accuracy score is better while for the fourth task, a lower MSE (Mean Squared Error) is better.
155 |
156 | ## Performance in terms of Time (in seconds)
157 | | Dataset | # of samples | # of features | Time (with PyImpetus) |
158 | | --- | --- | --- | --- |
159 | | Ionosphere | 351 | 34 | 35.37 |
160 | | Arcene | 100 | 10000 | 1570 |
161 | | AlonDS2000 | 62 | 2000 | 125.511 |
162 | | slice_localization_data | 53500 | 384 | 1296.13 |
163 |
164 | ## Future Ideas
165 | - Let me know
166 |
167 | ## Feature Request
168 | Drop me an email at **atif.hit.hassan@gmail.com** if you want any particular feature
169 |
170 | # Please cite this work as
171 | Citation is available in the [CITATION.bib file](https://github.com/atif-hassan/PyImpetus/blob/master/CITATION.bib)
172 |
173 | [Alternatively, use the following DBLP Bibtex link](https://dblp.uni-trier.de/rec/journals/pr/HassanPKS25.html)
174 |
175 | ## Corresponding paper
176 | [A wrapper feature selection approach using Markov blankets](https://www.sciencedirect.com/science/article/pii/S0031320324008203)
177 |
178 | ## Old ArXiv version
179 | [PPFS: Predictive Permutation Feature Selection](https://arxiv.org/abs/2110.10713)
180 |
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | import setuptools
2 |
3 | with open("README.md", "r") as fh:
4 | long_description = fh.read()
5 |
6 |
7 | setuptools.setup(
8 | name = 'PyImpetus',
9 | version = '4.1.1',
10 | author = "Atif Hassan",
11 | author_email = "atif.hit.hassan@gmail.com",
12 | description = "PyImpetus is a Markov Blanket based feature subset selection algorithm that considers features both separately and together as a group in order to provide not just the best set of features but also the best combination of features",
13 | long_description = long_description,
14 | long_description_content_type = "text/markdown",
15 | url = "https://github.com/atif-hassan/PyImpetus",
16 | py_modules = ["PyImpetus"],
17 |     install_requires = ["pandas", "scikit-learn", "numpy", "joblib", "scipy", "seaborn", "matplotlib", "tqdm"],
18 | include_package_data = True,
19 | classifiers = [
20 | "Programming Language :: Python :: 3",
21 | "Programming Language :: Python :: 3.6",
22 | "Programming Language :: Python :: 3.7",
23 | "License :: OSI Approved :: MIT License",
24 | "Operating System :: OS Independent",
25 | ]
26 | )
27 |
--------------------------------------------------------------------------------
/tutorials/Classification_Tutorial.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "## Get all imports"
8 | ]
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": 1,
13 | "metadata": {},
14 | "outputs": [],
15 | "source": [
16 | "import math\n",
17 | "import numpy as np\n",
18 | "import pandas as pd\n",
19 | "from collections import Counter\n",
20 | "from PyImpetus import PPIMBC\n",
21 | "from sklearn.svm import LinearSVC, SVC\n",
22 | "from sklearn.tree import DecisionTreeClassifier\n",
23 | "from sklearn.linear_model import LogisticRegression\n",
24 | "from lightgbm import LGBMClassifier\n",
25 | "from sklearn.metrics import accuracy_score\n",
26 | "from sklearn.model_selection import KFold, StratifiedKFold\n",
27 | "from sklearn.naive_bayes import GaussianNB\n",
28 | "from sklearn.preprocessing import StandardScaler, OneHotEncoder\n",
29 | "import time"
30 | ]
31 | },
32 | {
33 | "cell_type": "markdown",
34 | "metadata": {},
35 | "source": [
36 | "## Load data"
37 | ]
38 | },
39 | {
40 | "cell_type": "code",
41 | "execution_count": 2,
42 | "metadata": {},
43 | "outputs": [
44 | {
45 | "data": {
214 | "text/plain": [
215 | " 0 1 2 3 4 5 6 7 8 \\\n",
216 | "0 1 0 0.99539 -0.05889 0.85243 0.02306 0.83398 -0.37708 1.00000 \n",
217 | "1 1 0 1.00000 -0.18829 0.93035 -0.36156 -0.10868 -0.93597 1.00000 \n",
218 | "2 1 0 1.00000 -0.03365 1.00000 0.00485 1.00000 -0.12062 0.88965 \n",
219 | "3 1 0 1.00000 -0.45161 1.00000 1.00000 0.71216 -1.00000 0.00000 \n",
220 | "4 1 0 1.00000 -0.02401 0.94140 0.06531 0.92106 -0.23255 0.77152 \n",
221 | "\n",
222 | " 9 ... 25 26 27 28 29 30 \\\n",
223 | "0 0.03760 ... -0.51171 0.41078 -0.46168 0.21266 -0.34090 0.42267 \n",
224 | "1 -0.04549 ... -0.26569 -0.20468 -0.18401 -0.19040 -0.11593 -0.16626 \n",
225 | "2 0.01198 ... -0.40220 0.58984 -0.22145 0.43100 -0.17365 0.60436 \n",
226 | "3 0.00000 ... 0.90695 0.51613 1.00000 1.00000 -0.20099 0.25682 \n",
227 | "4 -0.16399 ... -0.65158 0.13290 -0.53206 0.02431 -0.62197 -0.05707 \n",
228 | "\n",
229 | " 31 32 33 34 \n",
230 | "0 -0.54487 0.18641 -0.45300 1 \n",
231 | "1 -0.06288 -0.13738 -0.02447 0 \n",
232 | "2 -0.24180 0.56045 -0.38238 1 \n",
233 | "3 1.00000 -0.32382 1.00000 0 \n",
234 | "4 -0.59573 -0.04608 -0.65697 1 \n",
235 | "\n",
236 | "[5 rows x 35 columns]"
237 | ]
238 | },
239 | "metadata": {},
240 | "output_type": "display_data"
241 | },
242 | {
243 | "name": "stdout",
244 | "output_type": "stream",
245 | "text": [
246 | "Data shape: (351, 34) Target Variable shape: (351,)\n"
247 | ]
248 | }
249 | ],
250 | "source": [
251 | "# Load the data into a dataframe\n",
252 | "df = pd.read_csv(\"ionosphere.data\", header=None)\n",
253 | "# Pre-process the data\n",
254 | "df[34] = df[34].str.replace(\"g\", '1').replace(\"b\", '0')\n",
255 | "# Lets check it out\n",
256 | "display(df.head())\n",
257 | "# Split the data into input features and target variable\n",
258 | "data, Y = df.drop([34], axis=1), df[34].values\n",
259 | "# Lets check out the shape of our data\n",
260 | "print(\"Data shape: \", data.shape, \"Target Variable shape: \", Y.shape)"
261 | ]
262 | },
263 | {
264 | "cell_type": "markdown",
265 | "metadata": {},
266 | "source": [
267 | "## Modelling with Decision Tree without PyImpetus"
268 | ]
269 | },
270 | {
271 | "cell_type": "code",
272 | "execution_count": 3,
273 | "metadata": {},
274 | "outputs": [
275 | {
276 | "name": "stdout",
277 | "output_type": "stream",
278 | "text": [
279 | "Score: 0.9436619718309859\n",
280 | "Score: 0.9142857142857143\n",
281 | "Score: 0.8285714285714286\n",
282 | "Score: 0.8428571428571429\n",
283 | "Score: 0.8714285714285714\n",
284 | "\n",
285 | "\n",
286 | "Average Accuracy: 0.8801609657947687\n",
287 | "\n",
288 | "\n",
289 | "Total Time Required (in seconds): 0.03653097152709961\n"
290 | ]
291 | }
292 | ],
293 | "source": [
294 | "# We want to time our algorithm\n",
295 | "start = time.time()\n",
296 |     "# Use KFold to evaluate the baseline performance without PyImpetus\n",
297 | "kfold = KFold(n_splits=5, random_state=27, shuffle=True)\n",
298 | "# This will hold all the accuracy scores\n",
299 | "scores = list()\n",
300 | "# Perform CV\n",
301 | "for train, test in kfold.split(data):\n",
302 | " # Split data into train and test based on folds\n",
303 | " x_train, x_test = data.iloc[train], data.iloc[test]\n",
304 | " y_train, y_test = Y[train], Y[test]\n",
305 | " \n",
306 | " # Convert the data into numpy arrays\n",
307 | " x_train, x_test = x_train.values, x_test.values\n",
308 | " \n",
309 | " model = DecisionTreeClassifier(random_state=27)\n",
310 | " model.fit(x_train, y_train)\n",
311 | " preds = model.predict(x_test)\n",
312 | " score = accuracy_score(y_test, preds)\n",
313 | " scores.append(score)\n",
314 | " print(\"Score: \", score)\n",
315 | "# Compute average score\n",
316 | "print(\"\\n\\nAverage Accuracy: \", sum(scores)/len(scores))\n",
317 | "# Finally, check out the total time taken\n",
318 | "end = time.time()\n",
319 | "print(\"\\n\\nTotal Time Required (in seconds): \", end-start)"
320 | ]
321 | },
322 | {
323 | "cell_type": "markdown",
324 | "metadata": {},
325 | "source": [
326 | "## Modelling with Decision Tree (fast but less robust feature subset selection)"
327 | ]
328 | },
329 | {
330 | "cell_type": "code",
331 | "execution_count": 4,
332 | "metadata": {},
333 | "outputs": [
334 | {
335 | "name": "stderr",
336 | "output_type": "stream",
337 | "text": [
338 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
339 | "[Parallel(n_jobs=-1)]: Done 29 out of 34 | elapsed: 3.7s remaining: 0.5s\n",
340 | "[Parallel(n_jobs=-1)]: Done 34 out of 34 | elapsed: 3.7s finished\n",
341 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
342 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
343 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
344 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
345 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
346 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
347 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
348 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
349 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
350 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
351 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
352 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
353 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
354 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
355 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
356 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
357 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
358 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
359 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
360 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
361 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
362 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
363 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
364 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
365 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
366 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
367 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
368 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
369 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
370 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
371 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
372 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
373 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
374 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
375 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
376 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
377 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
378 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
379 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
380 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
381 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
382 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
383 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
384 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
385 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
386 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
387 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
388 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
389 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
390 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
391 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
392 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
393 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
394 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
395 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
396 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
397 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
398 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
399 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
400 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
401 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
402 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
403 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
404 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
405 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
406 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
407 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
408 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
409 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
410 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
411 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
412 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n"
413 | ]
414 | },
415 | {
416 | "name": "stdout",
417 | "output_type": "stream",
418 | "text": [
419 | "Markov Blanket: [0, 21, 6, 4, 3, 2, 26, 14, 16, 7, 8, 28, 12, 24, 10]\n",
420 | "Feature importance: [13.862943611198906, 12.253505698764807, 11.223886281583647, 10.918504632032466, 9.614448369149548, 9.3856067967207, 9.16246324540649, 8.733044896275834, 6.019879594506852, 6.019879594506852, 5.109414274682476, 3.815485593252747, 3.720557275404062, 3.446363007454386, 3.446363007454386]\n"
421 | ]
422 | },
423 | {
424 | "data": {
425 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAEGCAYAAABmXi5tAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuNCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8QVMy6AAAACXBIWXMAAAsTAAALEwEAmpwYAAAYi0lEQVR4nO3de5wV9XnH8c9XLqLCgop35AUi0arxSlLBkKjRYBKNd03URE1SmjTGxEtqrDEXm1pbc7EmRl/UVFNDqAS1VWNBGhWNMSrxHhXESxQiCgoigsjK0z9mVg/L2d3Z9czMWeb7fr3Oa8+ZM+c8z4HdZ3/7m/k9o4jAzMzWfxuUnYCZmRXDBd/MrCJc8M3MKsIF38ysIlzwzcwqom/ZCXTmkEMOienTp5edhplZb6N6G5t6hL948eKyUzAzW2809Qi/ddGrLLr8l2WnYWZWqC2+fFIu79vUI3wzM2scF3wzs4oovOBLuljSXZImS+pfdHwzs6oqtOBL2gvYJiLGA48DxxQZ38ysyooe4Y8Fbk3vTwfGFRzfzKyyii74Q4Bl6f3XgM3a7yBpoqTZkma/snxZ+6fNzKyHii74S4CW9P4Q4NX2O0TEpIgYExFjNh/Y0v5pMzProaIL/h+Aj6X3JwB3FxzfzKyyCi34EfEg8KKku4BdgOuKjG9mVmWFr7SNiG8UHdPMzJq8tULfLTbLbYmxmVnVeKWtmVlFuOCbmVVEU0/prH75Bf5y2Zllp2Fm67Ftv/KjslMojEf4ZmYV4YJvZlYRuRV8SfukXTFnSZoqqZ+kqyQtknRaXnHNzKy+POfwFwATImKFpAuBI4B/AGYBA3OMa2ZmdeRW8CNiYc3D1UBrRLwo1b22rpmZ5Sz3OXxJw4GDgJsz7l/TLXNlvsmZmVVIrgVfUgtwDXBqRKzO8pq1u2VulGd6ZmaVkudB2z7AZOCCiJibVxwzM8smz4O2x5Fc0WqQpPOBy4E9gU8BfSSNjIizcoxvZmY18jxoOwWY0m7ztcC5ecU0M7OOeeGVmVlFNHUvnX5bbl+pPhdmZnnyCN/MrCJc8M3MKqKpp3TeWDSPeyYdWnYaZlaisRMzrdm0DDzCNzOrCBd8M7OKKLzgS9pf0m/TtsmHFx3fzKyqCp3DlzQAOAv4eES8VWRsM7OqK3qEPw5YCdwk6QZJWxcc38yssoou+FsBI4HDgEnAd9vvUNseecly/xFgZtYoRRf8pcDv0umc24Bd2u9Q2x5504H9C07PzGz9VXTBv493i/xewDMFxzczq6xCD9pGxCuSbpR0J7AG+HyR8c3MqqzwlbYRcRlwWdFxzcyqrqlbK2yyxY5eVm1m1iBeaWtmVhEu+GZmFdHUUzpLFj/FtKsOKTsNM8vZMadOLzuFSvAI38ysIlzwzcwqopSCL+kzkhaVEdvMrKrKaI+8AXAM8ELRsc3MqqyMEf4JwDSSlbZmZlaQQgu+pD7AccC1nezzTrfMZe6WaWbWMEWP8E8CpkZEh6P72m6ZLe6WaWbWMEUX/F2Az0maDoyW9OOC45uZVVbR3TLPabsvaXZEnFFkfDOzKivtPPyIGFNWbDOzKvLCKzOzilBElJ1Dh8aMGROzZ88uOw0zs95G9TZ6hG9mVhEu+GZmFdHU7ZFffvUpLp08oew0zCrj9BNnlJ2C5cgjfDOzinDBNzOriKJ76ewm6W5JsyT9RtLAIuObmVVZ0SP8ORGxX0R8BLgPOLLg+GZmlVVowY+I1TUPNwaeLDK+mVmVlXEBlIMlPQgcADxd5/l32iMvX+b2yGZmjVJ4wY+ImRGxF8lFUCbWef6d9sgDW9we2cysUYo+aLthzcPXgDeKjG9mVmVFL7w6WNI3SC5vuAg4peD4ZmaVVXQ//JuBm4uMaWZmCXfLNDNb/7hbpplZlbngm5lVRFNP6QzdcXAcdvHYstMwWy9ddeT0slOw/HhKx8ysylzwzcwqouiFV/tIuivtljlVUr8i45uZVVnRI/wFwIS0W+Y84IiC45uZVVbRC68W1jxcDbQWGd/MrMpKmcOXNBw4iDqrbmu7Zb7pbplmZg1TRnvkFuAa4NR2/fGBtbtlDnC3TDOzhin6oG0fYDJwQUTMLTK2mVnVFT3CPw4YB5wv6Q5Jxxcc38yssoo+aDsFmFJkTDMzS3jhlZlZRTR1Lx23RzYz6xH30jEzqzIXfDOziij6mrbd8tTSF/nEDd8vOw2zXu2WI79VdgrWJDKN8CWNkrRhen9/SadLGpJrZmZm1lBZp3SuA96WtCPwc2Ak8KvcsjIzs4bLWvDXREQrcCRwSUScAWzT2QvqtUKWNEzSjemiq++81+TNzCy7rHP4qyV9BjgZOCzd1lUv+7ZWyCskXUjSCvko4MsRsaAnyZqZWc9lHeGfCowF/ikinpU0EvhlZy+IiIURsSJ92NYkbQTwQ0m3SRrXk4TNzKxnMo3wI+JxSecAw9PHzwIXZXltTSvkK4HdgWNJ+uDfCHywzv4TgYkAA7YYnCWEmZllkPUsncOAh4Dp6eM9Jd2Y4XXvtEIGFgNzI2J+eiGUVknr/MKpbY/cv2WT7J/EzMw6lXVK57sko/GlABHxEMmZOh1q3wo5IlYCSyUNlrQJ0D89EGxmZgXIetC2NSJek9Zqz9BVE562VsiDJJ0PXA6cR3KVq37A+d3M1czM3oOsBf8xSScAfSSNBk4Hft/ZCzpphTy+eymamVkjZOqWKWljktH5x9JNM4DvR8SbOebmbplmZj1Tt1tmlyP8dC7+xog4iKTom5lZL9TlQduIeBtYIcnnSJqZ9WJZ5/DfBB6VNBN4o21jRJyeS1app5Ys5pPXXZlnCLNe7TdHf7HsFKwXyVrwf5PezMysl8q60vYXeSdiZmb5yrrS9llJz7S/dfGaQZLulbRc0m4124dLWlW7zczM8pd1SmdMzf0BJP1wNuviNSuBQ4GL220/B7g7Y1wzM2uQTCP8iHil5rYgIi4BDuziNa0Rsah2W9plM4Dne5qwmZn1TKYRvqS9ax5uQDLiH9SDeOeQdNn8biex3u2WObSrPyLMzCyrrFM6P6y53wo8S9IrJzNJowAi4rl2PXnWEhGTgEkAg0eN6HoZsJmZZZK14H8hItY6SJtOz3THHsCukqYD7wd2lHRARKzu4nVmZtYAWdsjT8u4bS2SbiHpv/PvwMCIGB8RhwAzgS+52JuZFafTEb6knYFdgcGSjqp5qoXkbJ1ORcQnOth+SjdyNDOzBuhqSmcnklMrh/DuxcsBXgf+JqeczMwsB1nbI4+NiHsKyGctbo9sZtYjPWuPnHpQ0ldIpnfemcqJiM83IDEzMytA1oO21wBbAxOAWcAwkmkdMzPrJbJO6TwYEXtJeiQidpfUD5gREZ2utn2vhozaMcb/y7/mGcLWAzcdc1TXO5lVS90pnawj/LbTJ5emTc8GAyMakJSZmRUk6xz+JEmbAucDNwIDgW/nlpWZmTVc1n74bZedmgXskOU1kgYB/0dyoHffiHhM0jDgZyTn8d8eEd/rfspmZtYTWZunbQVcCGwbER+
XtAswNiJ+3snL6rVHvhj4ckQs6GnCZmbWM1nn8K8GZgDbpo/nAl/v7AXt2yOnB3pHAD+UdJukcd1N1szMei5rwR8aEVOBNZAUc+DtbsYaCuwOnA2cAFxSbydJEyXNljT7rWWvdTOEmZl1JGvBf0PS5iQXL0HSvkB3q/FSYG5EzI+IhUCrpHWmlCJiUkSMiYgx/VsGdzOEmZl1JOtZOmeSnJ0zStLdwBbAMd0JFBErJS2VNJikp37/9C8FMzMrQFfdModHxPMR8YCkj5A0UxMwJ0tr47Q98p7ATpIuB84Dbgb6kZziaWZmBelqhP/fQNvlDa+NiKO78+YdtEce3533MDOzxuiq4Ncuz810/n0j7bjpEC+bNzNrkK4O2kYH983MrJfpaoS/h6RlJCP9jdL7pI8jIlpyzc7MzBqm04IfEX2KSqSep5cs58jrfldmCtYL3HD0h8pOwaxXyHoevpmZ9XIu+GZmFVFowZf0QUl3pLc5kn5cZHwzsyrLutK2ISLiPmB/AElXkpznb2ZmBShlSiftobMvcFcZ8c3MqqisOfwDgVkRsab9E7XdMlctW1p8ZmZm66myCv6xwK/rPVHbLXPDliHFZmVmth4rvOCn0zljgTuLjm1mVmVljPAPAO6sN51jZmb5KfQsHYCImAnMLDqumVnVeeGVmVlFFD7C745Rmw50nxQzswbxCN/MrCJc8M3MKqKpp3ReWPoWp9/wQtlpWANdeuT2ZadgVlke4ZuZVYQLvplZRRTdHnkDSb+QdFd6G1VkfDOzKit6hL8nsGFEjAcuAE4rOL6ZWWUVXfDnA0gSMARYVHB8M7PKKvosncXAGuAJYENgv/Y7SJoITAQYtMV2hSZnZrY+K3qEPwFYGRE7A0cDP2q/Q2175I1aNis4PTOz9VcZZ+ksSb8uJZnWMTOzAhQ9pXMr8FlJs0imdM4sOL6ZWWUVfRHzt4ETioxpZmaJpm6tsP2Q/l6Kb2bWIF5pa2ZWES74ZmYV0dRTOkuXtHL9tMVlp7FeOuqYoWWnYGYF8wjfzKwiXPDNzCoit4IvaZ+0I+YsSVMl9ZN0mqT70tthecU2M7N15TmHvwCYEBErJF0IHAH8HbA7sDEwA7gpx/hmZlYjt4IfEQtrHq4GWoF5wEbAIOCVvGKbmdm6cj9LR9Jw4CDg+8A2wONAH+CUDvZ/p1vm0KHD8k7PzKwycj1oK6kFuAY4lWRkPxEYDewMXJj2xV9LbbfMwS2b55memVml5HnQtg8wGbggIuaS9MF/E1gFrCBpnrZOwTczs3zkOaVzHDAOGCTpfOByYBpwD8mUzmURsSbH+GZmViPPg7ZTgCl1nvpBXjHNzKxjXnhlZlYRTd1LZ8imfd3zxcysQTzCNzOrCBd8M7OKaOopnRWLW3nwypfLTmO9stcXtyw7BTMriUf4ZmYV4YJvZlYRea60HSTpXknLJe0maRNJt0q6U9LtkkbkFdvMzNaV5wh/JXAoyepaSLplnhoRHwb+GfhGjrHNzKydPFfatgKL2vqjRcQqkh758G67ZDMzK0jhc/iS+gHfBi7t4PmJkmZLmr3kdbfMNzNrlDIO2k4CroiIp+s9WdseedNBbo9sZtYohRZ8Sd8Cno2Ia4uMa2ZmOS+8knQLsCewk6TLge8Ad0s6ELgnIs7NM76Zmb0r14IfEZ9ot+k/84xnZmYda+rWChsP7etWAGZmDeKVtmZmFeGCb2ZWEU09pbP6pVUs/MG8stPolbY+e8eyUzCzJuMRvplZRbjgm5lVRJ7dMveRdJekWZKmpi0VkDRc0ipJu+UV28zM1pXnCH8BMCEiPgLMA45It58D3J1jXDMzqyPPbpkLax6uBloljQQCeD6vuGZmVl/uc/iShgMHATeTjO5/0MX+73TLfGX5q3mnZ2ZWGbkWfEktwDXAqcBwgIh4rrPX1HbL3HzgZnmmZ2ZWKXketO0DTAYuiIi5wB7ArpKmAwcDV7QdyDUzs/zlOcI/DhgHnC/pDqBfRIyPiEOAmcCXImJ1jvHNzKxGngdtpwBTOnjulLzimplZfV54ZWZWEU3dS6ffVhu6J4yZWYN4hG9mVhEu+GZmFdHUUzqrX36dly69o+w0eqWtTt+/7BTMrMl4hG9mVhEu+GZmFZHnSttBku6VtLytFbKk4yX9XtJtkrbPK7aZma0rzxH+SuBQYBpA2kbhTGB/4Pz0ZmZmBcmt4EdEa0Qsqtk0GvhTRLwVEXcD788rtpmZravIOfwhwLKax33q7VTbHvnV5a8VkpiZWRUUWfCXAC01j9+ut1Nte+TNBg4uJjMzswoo8jz8ecAukvoDHwAeKTC2mVnl5VrwJd0C7AnsBFwO/BiYBbwJfC7P2GZmtrZcC35EfKLO5mvzjGlmZvU1dWuFflsOcosAM7MG8UpbM7OKUESUnUOHJL0OzCk7jx4YCiwuO4kecN7Fct7FqlLei9PLya6lqad0gDkRMabsJLpL0mznXRznXSznXaxG5u0pHTOzinDBNzOriGYv+JPKTqCHnHexnHexnHexGpZ3Ux+0NTOzxmn2Eb6ZmTWIC76ZWUU0ZcGXdIikOZLmSfpm2flkIWl7SbdLekLSnyR9reycukNSH0kPSrq57FyykjRE0jRJT6b/7mPLzikLSWek3yOPSZoiaUDZOdUj6T8kvSzpsZptm0maKemp9OumZeZYTwd5X5x+nzwi6QZJQ0pMsa56edc8d7akkDT0vcRouoIvqQ9wGfBxYBfgM5J2KTerTFqBsyLir4B9ga/0krzbfA14ouwkuunfgOkRsTOwB70gf0nbAacDYyJiN5LrQny63Kw6dDXQfvHON4HfRsRo4Lfp42ZzNevmPRPYLSJ2B+YC5xadVAZXs27epJeDPRh4/r0GaLqCD3wQmBcRz0TEW8B/AYeXnFOXIuLFiHggvf86SfHZrtysspE0DPgkcGXZuWQlqQX4MPBzgPRKaktLTSq7vsBGkvoCGwN/KTmfuiLiTuDVdpsPB36R3v8FcESROWVRL++IuDUiWtOHfwCGFZ5YFzr494aky/DfA+/5DJtmLPjbAS/UPJ5PLymcbSSNAPYC7i05lawuIfmGWlNyHt2xA7AIuCqdirpS0iZlJ9WViFgA/IBktPYi8FpE3FpuVt2yVUS8CMkgB9iy5Hx64vPA/5adRBaSPgUsiIiHG/F+zVjwVWdbrzl3VNJA4Drg6xGxrKv9yybpUODliPhj2bl0U19gb+DyiNgLeIPmnF5YSzrnfTgwEtgW2ETSSeVmVR2SziOZfp1cdi5dkbQxcB7w7Ua9ZzMW/PnA9jWPh9Gkf/K2J6kfSbGfHBHXl51PRvsBn5L0HMn02YGSflluSpnMB+ZHRNtfUdNIfgE0u4OAZyNiUUSsBq4HxpWcU3e8JGkbgPTryyXnk5mkk4FDgROjdyxAGkUyMHg4/fkcBjwgaeuevmEzFvz7gdGSRqaXQ/w0cGPJOXVJkkjmk5+IiB+VnU9WEXFuRAyLiBEk/9a3RUTTjzgjYiHwgqSd0k0fBR4vMaWsngf2lbRx+j3zUXrBweYaNwInp/dPBv
6nxFwyk3QIcA7wqYhYUXY+WUTEoxGxZUSMSH8+5wN7p9/7PdJ0BT89sHIaMIPkB2FqRPyp3Kwy2Q/4LMkI+aH0Vu+KX9Y4XwUmS3qE5FKaF5abTtfSv0imAQ8Aj5L8DDblkn9JU4B7gJ0kzZf0BeAi4GBJT5GcOXJRmTnW00HePwUGATPTn80rSk2yjg7ybmyM3vGXjZmZvVdNN8I3M7N8uOCbmVWEC76ZWUW44JuZVYQLvplZRbjgW+kkLS843ghJJxQZs13889JumY+kpwj+dVm5WLX0LTsBsyKlDctGACcAvyoh/liS1Z57R8SqtN1t//f4nn1rGoOZdcgjfGsakvaXNEvSVElzJV0k6URJ90l6VNKodL+rJV0h6a50v0PT7QMkXZXu+6CkA9Ltp0j6taSbgFtJFguNT0fXZ6Qj/rskPZDextXkc4fe7bk/OV0di6QPSPq9pIfT/AYpuabAxZLuT0fvf1vnY24DLI6IVQARsTgi/tLJe2b6TJI2UdJP/f50v8PT/XZN3+uhNKfR+f0PWtOLCN98K/UGLE+/7g8sJSmKGwILgO+lz30NuCS9fzUwnWTAMppkyfkA4CzgqnSfnUnaGAwATkn32awmzs018TcGBqT3RwOza/Z7jaSHyQYkqyA/RDIifwb4QLpfC8lfyxOBb6XbNgRmAyPbfdaBwEMkPdl/Bnwk3d7Re2b9TBcCJ6X3h6TvvwnwE5LeMW0xNir7/9u38m6e0rFmc3+k7XclPU0yIoekDcEBNftNjYg1wFOSniEphh8iKXBExJOS/gy8L91/ZkTU6zUO0A/4qaQ9gbdrXgNwX0TMT/N5iGQ66DXgxYi4P421LH3+Y8Duko5JXzuY5BfIs21vFhHLJe0DjE8/z7VKrur2xw7eM+tn+hhJE7yz08cDgOEkv6TOU3LNg+sj4qkO/g2sAlzwrdmsqrm/pubxGtb+fm3fEySo31q7zRudPHcG8BLJVbM2AN7sIJ+30xxUJz7p9q9GxIxOYhERbwN3AHdIepSkCdkDnbxnR2o/k4CjI2JOu32ekHQvyQVuZkj6YkTc1ll+tv7yHL71VsdK2iCd198BmAPcCZwIIOl9JCPc9gUQ4HWSRlptBpOMrteQNMDr00XsJ4FtJX0gjTUoPRg8A/iykjbZSHqf2l2URdJO7ebR9wT+3Ml7Zv1MM4Cv1hxj2Cv9ugPwTERcStLpcvcuPputxzzCt95qDjAL2Ar4UkS8KelnwBXpqLkVOCWSM2Hav/YRoFXSwyTHA34GXCfpWOB2Ov9rgIh4S9LxwE8kbQSsJOlzfyXJlM8DaeFdxLqXAByYvm5ImuM8YGIn75n1M/0jyZXLHkljP0dyNtDxwEmSVgMLgQs6+2y2fnO3TOt1JF1NctB1Wtm5mPUmntIxM6sIj/DNzCrCI3wzs4pwwTczqwgXfDOzinDBNzOrCBd8M7OK+H9QlZ733oq0uQAAAABJRU5ErkJggg==\n",
426 | "text/plain": [
427 | ""
428 | ]
429 | },
430 | "metadata": {
431 | "needs_background": "light"
432 | },
433 | "output_type": "display_data"
434 | },
435 | {
436 | "name": "stdout",
437 | "output_type": "stream",
438 | "text": [
439 | "Score: 0.9577464788732394\n"
440 | ]
441 | },
442 | {
443 | "name": "stderr",
444 | "output_type": "stream",
445 | "text": [
446 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
447 | "[Parallel(n_jobs=-1)]: Done 29 out of 34 | elapsed: 0.3s remaining: 0.0s\n",
448 | "[Parallel(n_jobs=-1)]: Done 34 out of 34 | elapsed: 0.4s finished\n",
449 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
450 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
451 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
452 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
453 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
454 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
455 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
456 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
457 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
458 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
459 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
460 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
461 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
462 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
463 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
464 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
465 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
466 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
467 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
468 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
469 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
470 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
471 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
472 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
473 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
474 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
475 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
476 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
477 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
478 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
479 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
480 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
481 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
482 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
483 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
484 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
485 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
486 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
487 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
488 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
489 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
490 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
491 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
492 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
493 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
494 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
495 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
496 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
497 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
498 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
499 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
500 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
501 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
502 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
503 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
504 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
505 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
506 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
507 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
508 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
509 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
510 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
511 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
512 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
513 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
514 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
515 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
516 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
517 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
518 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
519 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
520 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
521 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
522 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
523 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n"
524 | ]
525 | },
526 | {
527 | "name": "stdout",
528 | "output_type": "stream",
529 | "text": [
530 | "Markov Blanket: [26, 2, 0, 4, 21, 17, 14, 8, 7, 20, 6, 3, 22, 13, 30, 18, 5, 24, 12]\n",
531 | "Feature importance: [13.862943611198906, 13.862943611198906, 13.16979643063896, 13.16979643063896, 11.560358518204861, 11.560358518204861, 8.733044896275834, 8.733044896275834, 8.530224817933536, 8.530224817933536, 8.13609586361171, 7.762624659178842, 7.241537959434772, 6.452596513377882, 6.3049486526681005, 5.483634127146055, 5.483634127146055, 3.720557275404062, 3.1044876919713547]\n"
532 | ]
533 | },
534 | {
535 | "data": {
536 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAEGCAYAAABmXi5tAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuNCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8QVMy6AAAACXBIWXMAAAsTAAALEwEAmpwYAAAanUlEQVR4nO3df5RddXnv8feH/CQkk/AbJGQFw68LEQMMXAgCQcDQGvkhv6pQAW1ztYqKSKnFaEtXUQstVCphZWHBi5ECQTEGTciVEnKjBUZAEDABgWJiAhMgCQkxmSFP/9h74GRyZubM5Oy9z8z+vNY6a87Ze5/zPJNMnnznu/f32YoIzMxs4Nuh6ATMzCwfLvhmZiXhgm9mVhIu+GZmJeGCb2ZWEoOLTqA7p512WsyfP7/oNMzM+htV29jQI/zVq1cXnYKZ2YDR0CP89tbXaZ35/aLTMDPL1e6fuTCTz81shC/pSEmLJS2SdJekIZLGSpor6UFJX88qtpmZbSvLEf4KYGpEvCXpGuBM4KPAZyJiRYZxzcysisxG+BGxKiLeSl+2pV/HA/8s6QFJk7OKbWZm28p8Dl/SOOAU4BbgMOBcoB2YCxxd5fjpwHSAsbvsmnV6ZmalkelVOpKagNuBS4DVwLKIWB4Rq4B2Sdv8hxMRsyKiOSKadx3ZlGV6ZmalktkIX9IgYDZwdUQsS7etkTSaZIQ/NCLas4pvZmZby3JK5zxgMjBK0gxgJnAVMA8YAszIMLaZmXWSWcGPiDuAO6rsOj6rmGZm1rWGXng1ePddMluAYGZWNg3dWsHMzOqnoUf4ba1/YNXMvy86DTOzXO31mWwaEXiEb2ZWErkW/Gr9dfKMb2ZWZnmP8Dv665wIPE/SX8fMzHKQ6xx+usK2QxvJAiwzM8tBIXP4Ff115lXZN11Si6SW19a/te2bzcysT3Iv+JX9dSKirfP+rXvpjMg7PTOzASvvk7bb9NcxM7N85D3C7+ivMyO969X5Occ3MystRUTROXSpubk5Wlpaik7DzKy/UbWNXnhlZlYSDd1aYWPr8zx90+lFp2Fm1iuH/tXcolOoyiN8M7OSKOKyzGvT9gqzJQ3NO76ZWVnlfVnm4cDeEXE88AxwTp7xzczKLO8R/rHA/enz+SSXaJqZWQ7yLvhjgHXp87XALp0PqGyt8Mb6zXnmZmY2oOVd8N8AmtLnY4DXOx9Q2Vph55Ge4jczq5e8C/5/AR9Kn08FluQc38ystHIt+BHxOLBS0mLgEOCePOObmZVZ7guvIuKKvGOamVmDr7Tdcff9G3bFmplZf+OVtmZmJdHQI/w3Vz/Hz2/5cNFpmJn1ysl/cV/RKVTlEb6ZWUkUdU/bj0lqLSK2mVlZFdE8bQeSHjq/zzu2mVmZFTHC/zgwB9hSbWdla4U1b7q1gplZvRRxE/PzgDu7OqaytcKYUW6tYGZWL3mP8C8E7oqIqqN7MzPLTt4F/xDgE5LmAwdIuj7n+GZmpZXrdfgRcWXHc0ktEXFZnvHNzMpMEVF0Dl1qbm6OlpaWotMwM+tvVG2jF16ZmZVEQ7dWeH31c/zHrVOLTsPMGtifXbKg6BT6DY/wzcxKIrOCL+lISYslLZJ0l6Qhkm6V1Crpc1nFNTOz6rKc0lkBTI2ItyRdA5wJ/C2wCBiZYVwzM6sis4IfEasqXrYB7RGxUqp68tjMzDKW+Ry+pHHAKcC8Go9/p5fOm+vdS8fMrF4yLfiSmoDbgUsioq2W91T20hk10r10zMzqJcuTtoOA2cDVEbEsqzhmZlabLE/angdMBkZJmgHMBCYBpwODJO0XEZdnGN/MzCq4tYKZ2cDj1gpmZmXmgm9mVhIN3Uvn1def48bZ7qVjZl279AL30qmVR/hmZiWR2Qhf0ijg/wGHAscAy4D70907AkMj4vCs4puZ2daynNLZCEwDrgWIiM3AFABJFwITMoxtZmadZDalExHtEdHaxe5zgbur7ahsrbB+nVsrmJnVS+5z+OlUz74R8Uy1/ZWtFUY2ubWCmVm9FHHS9nRgbgFxzcxKrYiC3+V0jpmZZSfT6/Al/ZSkf85BkmYCPwLGRcTTWcY1M7NtuZeOmdnA4146ZmZl1tAj/L0mjI4Lv3Vs0WmYNazrzplfdArWmDzCNzMrsyzveDVK0sOS1kuaWLF9nKRNldvMzCx7WY7wO1orzOm0/UpgSYZxzcysilxbK0jaDwjg5azimplZdXnP4V8JXNfdAZW9dN5yLx0zs7rJreBLmgAQES91d1xlL50R7qVjZlY3ed7x6v3AoZLmA+8D9pd0UkS05ZiDmVlp5dZaAZgZEcen228DrnOxNzPLT0MvvHJrBTOzPvHCKzOzMnPBNzMriYae0hm9/x4x+brzi07DrGH97Mwbi07BGpOndMzMyizXgi9pB0nfk7Q4fUzIM76ZWZnlPcKfBAxLL8+8GvhczvHNzEor74K/HECSgDFAa+cDKlsrbF63Mef0zMwGrjxX2gKsBrYAzwLDgOM6HxARs4BZkJy0zTU7M7MBLO8R/lRgY0QcDJwN/EvO8c3MSquIq3TeSL+uIZnWMTOzHNRU8CVNkDQsfT5F0ucljelDvPuB90haBPyA5MStmZnloKaFV5KeAJqB8cACYC5wUET8aZbJuZeOmVmfbNfCqy0R0Q6cBdwQEZcBe9crMzMzy16tV+m0SfoYcBHwkXTbkGxSetdza17hwz/856zDmPUL93308qJTsH6u1hH+JcCxwD9GxIvpvWm/n11aZmZWbzUV/Ih4huR+tI+lr1+MiG/2NpikoyU9mD6WSrq+t59hZmZ9U+tVOh8BngDmp68nSZrb22AR8UhETImIKcBi4N7efoaZmfVNrVM6fwccTXLtPBHxBLBfX4NKGgwcQ1L0zcwsB7UW/PaIWNtp2/a0PfggsCgitnTesVUvnbUbtiOEmZlVqrXg/0bSx4FBkg6QdCPwi+2Iey5wd7UdETErIpojonno6J22I4SZmVWqteBfChwKbCJZIbsW+GJfAqbTOccCD/Xl/WZm1jc9XocvaRAwNyJOAa6qQ8yTgIeqTeeYmVl2am2tMBf48yrz+JlyawUzsz6p2lqh1pW2fwSekrQQeOdMakR8vg6JmZlZDmot+PelDzMz66dqmtIpypgJ4+MD//S1otMwawjzzv5k0SlY/9H3KR1JL1LluvuIeG837zkSuIHkloavABcAHwW+QDJFdFFE/L6W+GZmtv1qndJprng+nOQ6+l16eM8KYGpEvCXpGuBM4EvA8cBRwAxgeq+yNTOzPqu1edprFY8VEXEDyWrZ7t6zKiLeSl+2AQcCT0fE5ohYArxvexI3M7PeqXVK54iKlzuQjPhH1fjeccApwN8Cu1fsGtTF8dNJR/477rZrLSHMzKwGtU7pVN6FpB14ETivpzdJagJuJ+mnPwhoqtj9drX3RMQsYBYkJ21rzM/MzHpQa8H/VES8ULkhvQlKl9IVurOBqyNimaQhwCGShpLM4T/Zl4TNzKxvai34c4Ajqmw7spv3nAdMBkZJmgHMBK4HFpF
cpfOJ3qVqZmbbo9uCL+lgkqZpoyV9tGJXE8nVOl2KiDuAO6rsurO3SZqZ2fbraYR/EDANGMO7Ny8HeBP4y4xyesf+O+/mxSZmZnXSbcGPiB8DP5Z0bET8MqeczMwsA7XO4T8u6bMk0zvvTOVERKbD7+ffeINpd8/JMoRZQ5h37jlFp2AlUOsNUG4H9gKmkpx0HUsyrWNmZv1ErQV//4iYAWyIiO8BH6aPK2UlTZH0c0mLJJ3Rl88wM7Peq3VKpy39ukbSRGAVML63wSQNBy4H/iQiNvf2/WZm1ne1jvBnSdqZpOHZXOAZ4J/6EG8ysBH4iaQfSdqrD59hZmZ9UNMIPyJuSZ8uArpsiVyDPYH9gOOAk4G/Az5decDWvXR2245QZmZWqaYRvqQ9JX1X0s/S14dI+lQf4q0B/n86nfMAcEjnAyJiVkQ0R0Tz0KamzrvNzKyPap3SuQ1YALwnfb0M+GIf4j3Cu0X+cOCFbo41M7M6qvWk7W4RcZekrwBERLukqt0uuxMRr0maK+khkjtheRmtmVlOai34GyTtSnqbQ0nHAGv7EjAivgN8p5Zj9995Zy9IMTOrk1oL/pdIrs6ZIGkJyY1MXInNzPoRRXR9jxFJ4yLi5fT5YJJmagKWRkRbl2+skzETDowp37op6zBmmbr3nFOKTsHKR9U29nTS9t6K53dGxNMR8Zs8ir2ZmdVXTwW/8n+J7bn+PvkwaaKkJWlbhfskjdzezzQzs9r0VPCji+d9tTQijouIE0ku0TyrDp9pZmY16Omk7fslrSMZ6e+YPid9HRHRq5VRnaaCRgC/7c37zcys73q6AcqgegeUdCpJH5424FtV9le0Vtij3uHNzEqr1pW2dRMRCyPicJKboE+vsr+itcLovNMzMxuwci34koZVvFwLbMgzvplZmdW68KpeTpV0BUlbhVbg4pzjm5mVVq4FPyLmAfPyjGlmZom8R/i9sv/OTV6laGZWJ7mftDUzs2I09Aj/hTc2cu49TxadhtlW7j77sKJTMOsTj/DNzEoisxG+pCOBG0iuyHkF+EvgbmA48DZwSUS8lFV8MzPbWpYj/BXA1LRvzvPAaSRF/gTgG8AVGcY2M7NOMhvhR8SqipdtwOaIWFHxur3a+ypbK4zYbe+s0jMzK53M5/AljQNOIb3+XtIQ4GvAt6sdX9laYVjTzlmnZ2ZWGpkWfElNwO0kUzkdnTJnATdHxO+yjG1mZlvLrOBLGgTMBq6OiGXptq8CL0bEnVnFNTOz6rIc4Z8HTAZmSHpQ0kXA14EPpq+/kWFsMzPrpNubmBetubk5Wlpaik7DzKy/6dNNzM3MbIBo6NYKK9e08Y8/Wll0GjZAXHWWL/O1cvMI38ysJLK8SmeUpIclrZc0Md32UHrC9hcd28zMLB9ZjvA3AtNI7l3b4eSImAJ8Bbgsw9hmZtZJZgU/ItojorXTto7FV03AU1nFNjOzbeV60lbS7sC9wDjgjC6OeaeXzujd98ktNzOzgS7Xk7YR0RoRxwFnA9d0ccw7vXR2ato1z/TMzAa03Aq+pMGSOuKtBTbkFdvMzDKe0pH0U2AScBBwM/BJSVtIbory2Sxjm5nZ1txawcxs4HFrBTOzMnPBNzMriYbupbPmjXbuvXt10WlYP3bmubsVnYJZw/AI38ysJLLspTNR0hJJiyTdJ2mkpPPTPjoPSNo3q9hmZratLEf4SyPiuIg4EXgEOAv4EjAFmJE+zMwsJ1n20mmreDkCeBl4OiI2R8QS4H3V3idpuqQWSS3r1r2WVXpmZqWT6Ry+pFMlPQ6cBLQB6yp2D6r2nsrWCk1urWBmVjeZFvyIWBgRh5O0SD6RpEtmh7ezjG1mZlvL7LJMScMiYlP6ci0wFDhE0lDgKODJrGKbmdm2srwO/1RJV5D0zWkFLgZeBRYBfwQ+kWFsMzPrxL10zMwGHvfSMTMrs4ZurbBhdTuP3Ppq0WlYzo6+ZI+iUzAbkDzCNzMriSxbK4yS9LCk9ZImpts+J+mR9PGRrGKbmdm2spzS2QhMA66t2PZXwGEkK28XAD/JML6ZmVXIsrVCe0S0dtr8PLAjMApw3wQzsxzlfdJ2PvAMSVuFi6sdIGk6MB1gr13H5paYmdlAl9tJW0lNJIX8AOBg4BpJ21wrWtlLZ8xI99IxM6uXPEf4W0hW2G4C2oFhJIsDGnfll5nZAJJpwZf0U2AScBAwk6SJ2i9JpnS+ExFbsoxvZmbvcmsFM7OBx60VzMzKzAXfzKwkGrqXzuZX2lh+3aqi07Ccjf3yXkWnYDYgeYRvZlYSuRZ8SeMltUp6MH3snmd8M7MyK2JKZ1FEnFNAXDOzUitiSuc4SYslVV1pK2m6pBZJLa+vd7sdM7N6ybvgrwT2B04A9gDO6nxAZWuFXdxawcysbnIt+BGxKSI2RLLa6x6SVbhmZpaDvE/ajqp4eQJJu2QzM8tB3lM6H5D0K0mLgX2AH+Qc38ystNxLx8xs4HEvHTOzMmvo1gptr7zFK9c/UXQaA8qel00qOgUzK4hH+GZmJZFZwZd0ZLrAapGkuyQNSbePk7RJ0sSsYpuZ2bayHOGvAKZGxIkkl1+emW6/EliSYVwzM6siszn8iKjsa9wGtEvaj+Qeti9nFdfMzKrLfA5f0jjgFGAeyej+uh6Of7eXzoY1WadnZlYamRZ8SU3A7cAlwDiAiHipu/ds1UtnpzFZpmdmVipZnrQdBMwGro6IZcD7gUMlzQdOBW7uOJFrZmbZy3KEfx4wGZgh6UFgSEQcHxGnAQuBT0dEW4bxzcysglsrmJkNPG6tYGZWZi74ZmYl0dC9dNpfXcerNy4sOo2GtMelpxadgpn1M1lepTNK0sOS1kuaKGknSfdLekjSf0oan1VsMzPbVpZTOhuBacCc9HU7cElEnAB8A7giw9hmZtZJlq0V2oFWSR2vN5H014G01UJWsc3MbFu5n7RNF1t9Dfh2F/vfaa3w2vq1+SZnZjaAFXGVzizg5oj4XbWdla0Vdh05OufUzMwGrlwLvqSvAi9GxJ15xjUzs4wvy5T0U2AScJCkmcDXgSWSPgj8MiK+kmV8MzN7V6YFPyL+tNOm/5tlPDMz61pDL7wavEeTFxiZmdVJQzdPk/QmsLToPPpgN2B10Un0gfPOl/POV5nyXp12Jt5KQ4/wgaUR0Vx0Er0lqcV558d558t556ueebt5mplZSbjgm5mVRKMX/FlFJ9BHzjtfzjtfzjtfdcu7oU/amplZ/TT6CN/MzOrEBd/MrCQasuBLOk3SUknPS/qbovOphaR90xu7PCvpaUlfKDqn3pA0SNLjkuYVnUutJI2RNEfSb9M/92OLzqkWki5Lf0Z+I+kOScOLzqkaSf8u6VVJv6nYtoukhZKeS7/uXGSO1XSR97Xpz8mTkn4kaUyBKVZVLe+KfV+WFJJ2254YDVfwJQ0CvgP8CXAI8DFJhxSbVU3agcsj4n8BxwCf7Sd5d/gC8GzRSfTSvwLzI+Jg4P30g/wl7QN8HmiOiInAIODPis2qS7cBnRfv/A3w84g4APh5+rrR3Ma2eS8EJkbEYc
AyoBH7eN3GtnkjaV/gVODl7Q3QcAUfOBp4PiJeiIjNwH8AZxScU48iYmVEPJY+f5Ok+OxTbFa1kTQW+DBwS9G51EpSE3AC8F2AiNgcEWsKTap2g4EdJQ0GRgB/KDifqiLiIeD1TpvPAL6XPv8ecGaeOdWiWt4RcX96UyaA/wLG5p5YD7r48wa4HvhrYLuvsGnEgr8P8PuK18vpJ4WzQ3q/3sOBhwtOpVY3kPxAbSk4j954L9AK3JpORd0iaaeik+pJRKwAriMZra0E1kbE/cVm1St7RsRKSAY5wB4F59MXnwR+VnQStZB0OrAiIn5dj89rxIKvKtv6zbWjkkYC9wBfjIh1RefTE0nTgFcj4ldF59JLg4EjgJkRcTiwgcacXthKOud9BrAf8B5gJ0kXFptVeUi6imT6dXbRufRE0gjgKpI7BNZFIxb85cC+Fa/H0qC/8naW3r7xHmB2RPyw6HxqdBxwuqSXSKbPPijp+8WmVJPlwPKI6Pgtag7JfwCN7hSSmwC1RkQb8ENgcsE59cYrkvYGSL++WnA+NZN0ETANuCD6xwKkCSQDg1+n/z7HAo9J2quvH9iIBf9R4ABJ+0kaSnJCa27BOfVIyd3avws8GxH/UnQ+tYqIr0TE2IgYT/Jn/UBENPyIMyJWAb+XdFC66WTgmQJTqtXLwDGSRqQ/MyfTD042V5gLXJQ+vwj4cYG51EzSacCVwOkR8VbR+dQiIp6KiD0iYnz673M5cET6s98nDVfw0xMrnwMWkPxDuCsini42q5ocB/w5yQj5ifTR+QYwVl+XArMlPUlyZ7Vrik2nZ+lvJHOAx4CnSP4NNuSSf0l3AL8kuWPdckmfAr4JnCrpOZIrR75ZZI7VdJH3vwGjgIXpv82bC02yii7yrm+M/vGbjZmZba+GG+GbmVk2XPDNzErCBd/MrCRc8M3MSsIF38ysJFzwrXCS1uccb7ykj+cZs1P8q9JumU+mlwj+76JysXIZXHQCZnlKG5aNBz4O/KCA+MeSrPY8IiI2pe1uh27nZw6uaAxm1iWP8K1hSJoiaZGkuyQtk/RNSRdIekTSU5ImpMfdJulmSYvT46al24dLujU99nFJJ6XbL5Z0t6SfAPeTLBY6Ph1dX5aO+BdLeix9TK7I50G923N/dro6FklHSfqFpF+n+Y1Sck+BayU9mo7e/0+Vb3NvYHVEbAKIiNUR8YduPrOm70nSTkr6qT+aHndGetyh6Wc9keZ0QHZ/g9bwIsIPPwp9AOvTr1OANSRFcRiwAvj7dN8XgBvS57cB80kGLAeQLDkfDlwO3JoeczBJG4PhwMXpMbtUxJlXEX8EMDx9fgDQUnHcWpIeJjuQrIL8AMmI/AXgqPS4JpLflqcDX023DQNagP06fa8jgSdIerLfBJyYbu/qM2v9nq4BLkyfj0k/fyfgRpLeMR0xdiz679uP4h6e0rFG82ik7Xcl/Y5kRA5JG4KTKo67KyK2AM9JeoGkGH6ApMAREb+V9N/AgenxCyOiWq9xgCHAv0maBLxd8R6ARyJieZrPEyTTQWuBlRHxaBprXbr/Q8Bhks5J3zua5D+QFzs+LCLWSzoSOD79fu5Ucle3X3XxmbV+Tx8iaYL35fT1cGAcyX9SVym558EPI+K5Lv4MrARc8K3RbKp4vqXi9Ra2/nnt3BMkqN5au8OGbvZdBrxCctesHYA/dpHP22kOqhKfdPulEbGgm1hExNvAg8CDkp4iaUL2WDef2ZXK70nA2RGxtNMxz0p6mOQGNwsk/UVEPNBdfjZweQ7f+qtzJe2Qzuu/F1gKPARcACDpQJIRbucCCPAmSSOtDqNJRtdbSBrgDeoh9m+B90g6Ko01Kj0ZvAD4jJI22Ug6UJ1uyiLpoE7z6JOA/+7mM2v9nhYAl1acYzg8/fpe4IWI+DZJp8vDevjebADzCN/6q6XAImBP4NMR8UdJNwE3p6PmduDiSK6E6fzeJ4F2Sb8mOR9wE3CPpHOB/6T73waIiM2SzgdulLQjsJGkz/0tJFM+j6WFt5VtbwE4Mn3fmDTH54Hp3Xxmrd/TP5DcuezJNPZLJFcDnQ9cKKkNWAVc3d33ZgObu2VavyPpNpKTrnOKzsWsP/GUjplZSXiEb2ZWEh7hm5mVhAu+mVlJuOCbmZWEC76ZWUm44JuZlcT/AJIKkyv3NphsAAAAAElFTkSuQmCC\n",
537 | "text/plain": [
538 | ""
539 | ]
540 | },
541 | "metadata": {
542 | "needs_background": "light"
543 | },
544 | "output_type": "display_data"
545 | },
546 | {
547 | "name": "stdout",
548 | "output_type": "stream",
549 | "text": [
550 | "Score: 0.9857142857142858\n"
551 | ]
552 | },
553 | {
554 | "name": "stderr",
555 | "output_type": "stream",
556 | "text": [
557 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
558 | "[Parallel(n_jobs=-1)]: Done 29 out of 34 | elapsed: 0.3s remaining: 0.0s\n",
559 | "[Parallel(n_jobs=-1)]: Done 34 out of 34 | elapsed: 0.3s finished\n",
560 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
561 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
562 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
563 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
564 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
565 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
566 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
567 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
568 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
569 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
570 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
571 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
572 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
573 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
574 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
575 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
576 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
577 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
578 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
579 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
580 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
581 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
582 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
583 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
584 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
585 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
586 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
587 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
588 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
589 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
590 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
591 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
592 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
593 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
594 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
595 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
596 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
597 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
598 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
599 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
600 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
601 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
602 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
603 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
604 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
605 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
606 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
607 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
608 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
609 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
610 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
611 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
612 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
613 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
614 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
615 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
616 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
617 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
618 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n"
619 | ]
620 | },
621 | {
622 | "name": "stdout",
623 | "output_type": "stream",
624 | "text": [
625 | "Markov Blanket: [20, 0, 2, 4, 8, 7, 30, 6, 18, 10]\n",
626 | "Feature importance: [13.862943611198906, 13.862943611198906, 12.764331322530797, 11.917033462143593, 11.560358518204861, 10.644067786330705, 8.942962685370782, 6.452596513377882, 4.644238322891096, 3.627422228640497]\n"
627 | ]
628 | },
629 | {
630 | "name": "stderr",
631 | "output_type": "stream",
632 | "text": [
633 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
634 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
635 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
636 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
637 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
638 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
639 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n"
640 | ]
641 | },
642 | {
643 | "data": {
644 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAEGCAYAAABmXi5tAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuNCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8QVMy6AAAACXBIWXMAAAsTAAALEwEAmpwYAAAVhElEQVR4nO3de5RlZX3m8e9Dc23o5hJHBYEFEgRb5WbhCMYB75h4ywRlxBuamZ4ZI17jmCwGF3FWslzLJDoTb6uXjhglBNKaiGQCsiQ0BIxYIhcRQQYShGAEheYm0E3/5o+9GYvq6u5T1XX2rur9/ax1Vp2zzz77/Z3uqqfees/e75uqQpK07duu7wIkSd0w8CVpIAx8SRoIA1+SBsLAl6SB2L7vAjbnhBNOqAsuuKDvMiRpsclMGxd0D//uu+/uuwRJ2mYs6MCXJM2fLOQLr3Z96oF16Fv+oO8yJKlT3/3YW7f2EItvSEeSNH8MfEkaCANfkgbCwJekgTDwJWkgxhb4SZ6b5LIka5Kcm2SHJCcluSLJxUn2G1fbkqSNjbOHfwfwiqo6DrgZeB3wfuB44PT2JknqyNgCv6p+UlUPtQ/XAc8Arq+qR6vqcuA5M70uycokk0km1z90/7jKk6TBGfsYfpL9gZcC/wDcN+WpJTPtX1Wrqmqiqia2X7ps3OVJ0mCMdfK0JMuBLwFvpwn45VOefmycbUuSnmhsgZ9kCXAW8JGquinJDsCKJDsCRwPXjqttSdLGxtnDfwNwLLAsyenAZ4CPA2uAh4GtnixCkjS6sQV+VZ0NnD3DU+eMq01J0qZ54ZUkDYSBL0kDYeBL0kAs6AVQJiYmanJysu8yJGmxcQEUSRoyA1+SBsLAl6SBGOvUClvr0Tuv57aPzDjHmiRts/b/8HVjOa49fEkaCANfkgbCwJekgTDwJWkgDHxJGojOAz/Jx9rFzc9q58aXJHWg08BPciSwd1W9EPgBcGKX7UvSkHXdwz8G+EZ7/wKaBVIkSR3oOvD34JcLma8F9pq+Q5KVSSaTTP78QZe9laT50nXg38MvFzLfA/j59B2qalVVTVTVxF67LumyNknapnUd+P8IvLy9/wrg8o7bl6TB6jTwq+p7wJ1JLgNWAF/psn1JGrLOJ0+rqg923aYkyQuvJGkwDHxJGggDX5IGwsCXpIFY0Cte7bj3s9j/w5N9lyFJ2wR7+JI0EAa+JA2EgS9JA5Gq6ruGTdpt/93q8A8e3ncZkgTA5acumtlgMtNGe/iSNBAGviQNhIEvSQNh4EvSQBj4kjQQXS9i/twklyVZk+TcJDt02b4kDVnXPfw7gFdU1XHAzcDrOm5fkgar07l0quonUx6uA9Z32b4kDVkvY/hJ9gdeCpw/w3Mrk0wmmVz3wLrui5OkbVTngZ9kOfAl4O1VtVGiV9WqqpqoqokddnOIX5LmS9cf2i4BzgI+UlU3ddm2JA1d1z38NwDHAqcnuSTJSR23L0mD1fWHtmcDZ3fZpiSp4YVXkjQQBr4kDYSBL0kDsaAXQJmYmKjJSRcxl6RZcgEUSRoyA1+SBsLAl6SBMPAlaSAW9Ie2hyxbVquOPKrvMiQtYsdduqbvEvrgh7aSNGQGviQNhIEvSQNh4EvSQBj4kjQQfS1x+MYkd/XRtiQNVR9LHG4HnAj8uOu2JWnI+ujhnwysBjbM9OTURczXrnMRc0maL32safsG4JxN7TN1EfPdd3ARc0maL1338N8MnFtVM/buJUnj03XgrwDemuQC4OAkH++4fUkarK4XMf/Q4/eTTFbV+7psX5KGrLfz8Ktqoq+2JWmIvPBKkgbCwJekgTDwJWkgOv3QdraWHXLIUBcvkKR5Zw9fkgbCwJekgTDwJWkgDHxJGoiRPrRNchBwe1U9kuR44DDgz6vq3vGVBj+9fS2f/MDXx9mEpEXuXX/y6r5LWDRG7eF/BXgsya8CnwcOBP5ibFVJkubdqIG/oarWA78JfKKdA2fv8ZUlSZpvowb+uiRvBN4GnN9uc7J6SVpERg38twPHAH9YVbcmORD48vjKkiTNt5E+tK2qHyT5ELB/+/hW4KPjLEySNL9G6uEneTVwNXBB+/iIJOfNtrEk2yX5YpLL2ttBsz2GJGluRh3SOQN4HnAvQFVdTXOmzmwdAexUVS8EPgK8aw7HkCTNwaiBv76q1k7bVnNo73aAJAH2AO6avkOSlUkmk0w+8ND0JiVJczXqbJnfT3IysCTJwcC7gSvm0N7dwAbgBmAn4AXTd6iqVcAqgP2fevBcfqlIkmYwag//VOBZwCM0F1ytBd47h/ZeAfyiqg4Ffgv40zkcQ5I0B1vs4SdZApxXVS8FTpuHNu9pv95LM6wjSerAFgO/qh5L8lCS3WcYx5+tbwBvSbKGZkjn/Vt5PEnSiEYdw38YuC7JRcCDj2+sqnfPprGqegw4eTavkSTNj1ED/2/bmyRpkRr1StsvjrsQSdJ4jTof/q3McN59VT193iuSJI3FqEM6E1Pu7wy8Hthr/st5oifvu7uLG0jSPBnpPPyq+tmU2x1V9QngxeMtTZI0n0Yd0jlqysPtaHr8y8ZSkSRpLEYd0vmTKffXA7cCb5j/ciRJ45KqLU9Xk+TpVXXLtG0HtvPij83TfmXPeucrXzLOJiR16LQvr+67hKHITBtHnUtnpv8l/+ckaRHZ7JBOkkNpJk3bPcm/n/LUcpqzdSRJi8SWxvAPAV5FM8nZ1PMj7wf+05hqkiSNwWYDv6q+BnwtyTFV9a2OapIkjcGoZ+l8L8nv0Azv/P+hnKp6x1iqkiTNu1E/tP0S8FSaBUzWAPvSDOvMSpLnJbmkvd2Y5OOzPYYkaW5GDfxfrarTgQfbidR+A3jObBurqiur6viqOh64DPib2R5DkjQ3owb+uvbrvUmeDewOHDDXRpNsDzyfJvQlSR0YdQx/VZI9gdOB84DdgA9vRbsvBtZU1YbpTyRZCawE2H3pLlvRhCRpqlHnw/9ce3cNMB9TIr8eOGsTba0CVkFzpe08tCVJYsQhnSRPSfL5JH/XPl6R5Lfn0mA7nHMMcOlcXi9JmptRx/DPBC4E9mkf3wS8d45tvgi4dKbhHEnS+Iwa+E+qqnOBDQBVtR54bC4NVtVFVfXOubxWkjR3owb+g0l+hXaZwyTPB9aOrSpJ0rwb9Syd99OcnXNQksuBfwOcOLaqJEnzbkuzZe5fVbdV1VVJjqOZTC3AjVW1bnOvlSQtLFsa0vmbKffPqarrq+r7hr0kLT5bGtKZumrKfJx/Pyt7H3iQK+RI0jzZUg+/NnFfkrTIbKmHf3iS+2h6+ru092kfV1UtH2t1kqR5s6UFUJZ0VYgkabxGPS2zFw/feT83/OHFfZchLRrPPO3FfZegBWzUC68kSYucgS9JA2HgS9JAGPiSNBAGviQNxNgCP8mzk1yeZE2Sv02yW5KTklyR5OIk+42rbUnSxsbZw7+xql5QVccBVwK/STPr5vE0a+OePsa2JUnTjC3wp02wthS4Dbi+qh6tqsuB54yrbUnSxsY6hp/kZUm+R7Os4Tr
gvilPz3gVb5KVSSaTTP78wXvHWZ4kDcpYA79dzvBIYDVwHDB17p0Zl0isqlVVNVFVE3vtusc4y5OkQRnb1ApJdqqqR9qHa4EdgRVJdgSOBq4dV9uSpI2Ncy6dlyX5IM3C53cBpwA/BdYADwNvHWPbkqRpxhb4VXU+cP60zee0N0lSx7zwSpIGwsCXpIEw8CVpIBb0Aig7773MBR0kaZ7Yw5ekgTDwJWkgDHxJGggDX5IGIlXVdw2btM8++9TKlSv7LkPbiDPOOKPvEqSuZKaN9vAlaSAMfEkaCANfkgbCwJekgTDwJWkgOg/8JMcn+WaSNUle23X7kjRUnc6lk2Rn4APAK6vq0S7blqSh67qHfyzwC+DrSf46yVOn7zB1EfOHHnqo4/IkadvVdeA/BTgQeDWwCjhj+g5TFzFfunRpx+VJ0rar68C/F/iHdjjnYmBFx+1L0mB1HfhX8suQPxK4peP2JWmwOv3Qtqp+luS8JJcCG4B3dNm+JA1Z5yteVdWngE913a4kDZ0XXknSQBj4kjQQBr4kDcSCXgBlYmKiJicn+y5DkhYbF0CRpCEz8CVpIAx8SRqIzs/Dn4177rmBc//qeX2XoRG84fVX9l2CpC2why9JA2HgS9JAGPiSNBAGviQNhIEvSQNh4EvSQIwt8JMsS/LtJA8keXa77V1Jrmxvrx5X25KkjY3zPPxfAK8CPjZl2zuBw4ClwIXA18fYviRpirH18KtqfVXdNW3zzcAuwDLgZzO9LsnKJJNJJu+7b/24ypOkwen6StsLgB8AS4BTZtqhqlYBqwAOOmjXhTuVpyQtMp0FfpLlwErgYGBH4OIkF9VCnp9ZkrYhXfbwNwAPA48A64GdaOZsNvAlqQNjDfwk/wc4AjgE+AywGvgWzZDOp6pqwzjblyT90lgDv6p+fYbNfzzONiVJM/PCK0kaCANfkgbCwJekgVjQK17tueczXUlJkuaJPXxJGggDX5IGwsCXpIFY0GP4P7jnPg5ffWHfZWwTrjnxFX2XIKln9vAlaSAMfEkaCANfkgbCwJekgTDwJWkgul7E/KQkVyS5OMl+42pbkrSxcfbwH1/EfDVAkh2A9wPHA6e3N0lSR7pcxPxg4PqqerSqLgeeM662JUkb63IMfw/gvimPl8y0U5KVSSaTTK6/b20nhUnSEHQZ+PcAy6c8fmymnapqVVVNVNXE9st376YySRqALqdWuBlYkWRH4Gjg2g7blqTB63oR848Da4CHgbeOs21J0hP1sYj5OeNsU5I0My+8kqSBMPAlaSAMfEkaCANfkgZiQa94tWLP5Uy6UpMkzQt7+JI0EKmqvmvYpCT3Azf2XcccPAm4u+8i5sC6u2Xd3RpS3XdX1QnTNy7oIR3gxqqa6LuI2Uoyad3dse5uWXe35rNuh3QkaSAMfEkaiIUe+Kv6LmCOrLtb1t0t6+7WvNW9oD+0lSTNn4Xew5ckzRMDX5IGYkEGfpITktyY5OYkv9d3PaNIsl+Sv09yQ5Lrk7yn75pmI8mSJN9Lcn7ftYwqyR5JVif5YfvvfkzfNY0iyfva75HvJzk7yc591zSTJP87yU+TfH/Ktr2SXJTkR+3XPfuscSabqPtj7ffJtUn+OskePZY4o5nqnvLc7yapJE/amjYWXOAnWQJ8CnglsAJ4Y5IV/VY1kvXAB6rqmcDzgd9ZJHU/7j3ADX0XMUv/E7igqg4FDmcR1J/kacC7gYmqejbN2s7/od+qNulMYPrFO78HfLOqDga+2T5eaM5k47ovAp5dVYcBNwG/33VRIziTjesmyX7Ay4DbtraBBRf4wPOAm6vqlqp6FPhL4LU917RFVXVnVV3V3r+fJnye1m9Vo0myL/AbwOf6rmVUSZYD/w74PEBVPVpV9/Za1Oi2B3ZJsj2wFPiXnuuZUVVdCvx82ubXAl9s738ReF2XNY1iprqr6htVtb59+I/Avp0XtgWb+PeGZqXA/wZs9Rk2CzHwnwb8eMrj21kkwfm4JAcARwLf7rmUUX2C5htqQ891zMbTgbuAL7RDUZ9LsmvfRW1JVd0B/DFNb+1OYG1VfaPfqmblKVV1JzSdHODJPdczF+8A/q7vIkaR5DXAHVV1zXwcbyEGfmbYtmjOHU2yG/AV4L1VdV/f9WxJklcBP62q7/ZdyyxtDxwFfKaqjgQeZGEOLzxBO+b9WuBAYB9g1yRv7req4UhyGs3w61l917IlSZYCpwEfnq9jLsTAvx3Yb8rjfVmgf/JOl2QHmrA/q6q+2nc9I3oB8Jok/0QzfPbiJF/ut6SR3A7cXlWP/xW1muYXwEL3UuDWqrqrqtYBXwWO7bmm2fjXJHsDtF9/2nM9I0vyNuBVwJtqcVyAdBBNx+Ca9udzX+CqJE+d6wEXYuB/Bzg4yYFJdqT5QOu8nmvaoiShGU++oar+tO96RlVVv19V+1bVATT/1hdX1YLvcVbVT4AfJzmk3fQS4Ac9ljSq24DnJ1nafs+8hEXwYfMU5wFva++/Dfhaj7WMLMkJwIeA11TVQ33XM4qquq6qnlxVB7Q/n7cDR7Xf+3Oy4AK//WDlXcCFND8I51bV9f1WNZIXAG+h6SFf3d5+ve+itnGnAmcluRY4AvijfsvZsvYvktXAVcB1ND+DC/KS/yRnA98CDklye5LfBj4KvCzJj2jOHPlonzXOZBN1fxJYBlzU/mx+ttciZ7CJuue3jcXxl40kaWstuB6+JGk8DHxJGggDX5IGwsCXpIEw8CVpIAx89S7JAx23d0CSk7tsc1r7p7WzZV7bniL4b/uqRcOyfd8FSF1qJyw7ADgZ+Ise2j+G5mrPo6rqkXa62x238pjbT5kYTNoke/haMJIcn2RNknOT3JTko0nelOTKJNclOajd78wkn01yWbvfq9rtOyf5Qrvv95K8qN1+SpK/SvJ14Bs0Fwu9sO1dv6/t8V+W5Kr2duyUei7JL+fcP6u9OpYkRye5Isk1bX3L0qwp8LEk32l77/95hre5N3B3VT0CUFV3V9W/bOaYI72nJLummU/9O+1+r233e1Z7rKvbmg4e3/+gFryq8uat1xvwQPv1eOBemlDcCbgD+IP2ufcAn2jvnwlcQNNhOZjmkvOdgQ8AX2j3OZRmGoOdgVPaffaa0s75U9pfCuzc3j8YmJyy31qaOUy2o7kK8tdoeuS3AEe3+y2n+Wt5JfDf2207AZPAgdPe627A1TRzsn8aOK7dvqljjvqe/gh4c3t/j/b4uwJ/RjN3zONt7NL3/7e3/m4O6Wih+U610+8m+b80PXJopiF40ZT9zq2qDcCPktxCE4a/RhNwVNUPk/wz8Ix2/4uqaqa5xgF2AD6Z5AjgsSmvAbiyqm5v67maZjhoLXBnVX2nbeu+9vmXA4clObF97e40v0BuffxgVfVAkucCL2zfzzlpVnX77iaOOep7ejnNJHi/2z7eGdif5pfUaWnWPPhqVf1oE/8GGgADXwvNI1Pub5jyeANP/H6dPidIMfPU2o97cDPPvQ/4V5pVs7YDHt5EPY+1NWSG9mm3n1pVF26mLarqMeAS4JIk19FMQnbVZo65KV
PfU4Dfqqobp+1zQ5Jv0yxwc2GS/1hVF2+uPm27HMPXYvX6JNu14/pPB24ELgXeBJDkGTQ93OkBCHA/zURaj9udpne9gWYCvCVbaPuHwD5Jjm7bWtZ+GHwh8F/TTJNNkmdk2qIsSQ6ZNo5+BPDPmznmqO/pQuDUKZ8xHNl+fTpwS1X9L5qZLg/bwnvTNswevharG4E1wFOA/1JVDyf5NPDZtte8HjilmjNhpr/2WmB9kmtoPg/4NPCVJK8H/p7N/zVAVT2a5CTgz5LsAvyCZp77z9EM+VzVBu9dbLwE4G7t6/Zoa7wZWLmZY476nv4Hzcpl17Zt/xPN2UAnAW9Osg74CfCRzb03bducLVOLTpIzaT50Xd13LdJi4pCOJA2EPXxJGgh7+JI0EAa+JA2EgS9JA2HgS9JAGPiSNBD/D6BIWIZRDl96AAAAAElFTkSuQmCC\n",
645 | "text/plain": [
646 | ""
647 | ]
648 | },
649 | "metadata": {
650 | "needs_background": "light"
651 | },
652 | "output_type": "display_data"
653 | },
654 | {
655 | "name": "stdout",
656 | "output_type": "stream",
657 | "text": [
658 | "Score: 0.8857142857142857\n"
659 | ]
660 | },
661 | {
662 | "name": "stderr",
663 | "output_type": "stream",
664 | "text": [
665 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
666 | "[Parallel(n_jobs=-1)]: Done 29 out of 34 | elapsed: 0.3s remaining: 0.0s\n",
667 | "[Parallel(n_jobs=-1)]: Done 34 out of 34 | elapsed: 0.4s finished\n",
668 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
669 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
670 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
671 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
672 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
673 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
674 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
675 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
676 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
677 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
678 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
679 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
680 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
681 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
682 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
683 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
684 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
685 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
686 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
687 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
688 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
689 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
690 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
691 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
692 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
693 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
694 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
695 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
696 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
697 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
698 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
699 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
700 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
701 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
702 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
703 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
704 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
705 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
706 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
707 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
708 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
709 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
710 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
711 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
712 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
713 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
714 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
715 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
716 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
717 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
718 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
719 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
720 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
721 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
722 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
723 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
724 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
725 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
726 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
727 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
728 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
729 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
730 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
731 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
732 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
733 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n"
734 | ]
735 | },
736 | {
737 | "name": "stdout",
738 | "output_type": "stream",
739 | "text": [
740 | "Markov Blanket: [0, 2, 4, 26, 21, 30, 22, 17, 13, 28, 7, 6, 20, 8, 32]\n",
741 | "Feature importance: [13.862943611198906, 13.862943611198906, 13.862943611198906, 13.16979643063896, 12.253505698764807, 10.918504632032466, 9.3856067967207, 7.762624659178842, 6.757157481717635, 6.757157481717635, 6.452596513377882, 6.452596513377882, 4.424910053209942, 4.213961286420002, 3.187497697678543]\n"
742 | ]
743 | },
744 | {
745 | "data": {
746 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAEGCAYAAABmXi5tAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuNCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8QVMy6AAAACXBIWXMAAAsTAAALEwEAmpwYAAAY6UlEQVR4nO3de5wddXnH8c+X3CHZBAggCLyCAaWISCBYLiIBwaBylVtFFFC7rVVRUKoWo5ZatMVbUYSmWOCFMSUGsYiakIqEiCisgFxNCKCSSEwCJCEEyG7y9I+Z4GFzzu7s5szM2Z3v+/U6rz0zZ+Y8zyabJ7/9nfk9o4jAzMwGv63KTsDMzIrhgm9mVhEu+GZmFeGCb2ZWES74ZmYVMbTsBHpy7LHHxpw5c8pOw8xsoFG9nS09wl+5cmXZKZiZDRotPcLvWvEMK674btlpmJkVaocPnZXL+7b0CN/MzJrHBd/MrCIKL/iSLpW0QNIMScOLjm9mVlWFFnxJk4CdI+Jw4GHg1CLjm5lVWdEj/EOAW9Lnc4BDC45vZlZZRRf8ccCa9PlqYLvuB0hql9QhqePptWu6v2xmZv1UdMF/FmhLn48Dnul+QERMj4jJETF5+9Ft3V82M7N+Krrg/wp4W/p8KnBHwfHNzCqr0IIfEfcCT0laAOwD3FBkfDOzKit8pW1EXFh0TDMza/HWCkN32C63JcZmZlXjlbZmZhXhgm9mVhEtPaXTufxJ/nT5BWWnYWZWqF0+/LVc3tcjfDOzinDBNzOriKKbpx2YdsqcL2mWpGFFxjczq7KiR/hLgakRcQSwGDip4PhmZpVV6Ie2EbGsZrMT6CoyvplZlZUyhy9pd+Bo4OY6r9V0y3yh+OTMzAapMu541QZcB5wbEZ3dX39lt8xRRadnZjZoFf2h7RBgBnBxRCwqMraZWdUVPcI/neQuV9Mk3SbpjILjm5lVVtEf2s4EZhYZ08zMEl54ZWZWES3dS2fYjrvl1lPCzKxqPMI3M6sIF3wzs4po6Smd51cs5s7px5WdhplZoQ5p32xNalN4hG9mVhEu+GZmFVFWL513S1pRRmwzs6oqo5fOVsCpwJNFxzYzq7IyRvhnArOBjSXENjOrrDKap50OXN/DMS+3R3527frikjMzG+SKHuGfBcyKiIaj+9r2yNuOHl5gamZmg1vRBX8f4H2S5gB7Sfp6wfHNzCqr6G6Zn9r0XFJHRJxfZHwzsyor7Tr8iJhcVmwzsypq6dYK2+ywZ25LjM3MqsYrbc3MKsIF38ysIlp6SufZlY8y++pjy07DzKxPTj13Ttkp1OURvplZRbjgm5lVRG4FX9KBkhZImi9plqRhknaVdJOk2yR9Pq/YZma2uTzn8JcCUyNinaRLgJOAdwEfioilOcY1M7M6chvhR8SyiFiXbnamXycAX5V0q6RD84ptZmaby/0qHUm7A0cDVwH7AacBXcBNwJvqHN8OtAOM335k3umZmVVGrh/aSmoDrgPOBVYCiyJiSUQsA7okbfYfTm23zDZ3yzQza5rcRvhp7/sZwMURsSjdt0rSWJIR/vCI6MorvpmZvVKeUzqnA4cCYyRNA64ALgJuBoYB03KMbWZm3eRW8CNiJjCzzkuH5xXTzMwa88IrM7OKUESUnUNDkydPjo6OjrLTMDMbaFRvp0f4ZmYV4YJvZlYRLd0eefkzj3LZjKllp2Fmg9h575lbdgqF8QjfzKwiXPDNzCqi6PbIV0taIekjecU1M7P6im6P/E/AfGB0jnHNzKyOPFfaLqvZ7AS6IuIpqe7loWZmlrPc5/Br2iPfnPH4dkkdkjrWrlmfb3JmZhVSWHvkiOjs7Xh4ZXvk0W1uj2xm1ix5fmi7WXtkMzMrT9HtkfcHTgCGSNojIj6RY3wzM6tRdHvk64HP5BXTzMwac7dMM7PBx90yzcyqzAXfzKwiWnpKZ/yeY+P4Sw8pOw0zy9nVJ88pO4XBxlM6ZmZV5oJvZlYReS682lfSHWm3zB9LGi3pDEm/lHSrpN3yim1mZpvLc4S/MCIOi4gjgLuAk4ELgCnAtPRhZmYFya3gd+udszXwR+ChiFgfEXcAb8grtpmZbS7v5mnHSLoXOJKkRfKampeHNDjn5W6ZL7pbpplZ0+Ra8CNiXkRMAmYDRwBtNS9vaHDOy90yR7pbpplZ0+TWS0fSiIh4Kd1cDQwH9pE0HDgIuD+v2GZmtrk8u2UeI+lCYCOwAjgHWE5yi8MXgfflGNvMzLrJs1vmzWx+l6vr04eZmRXMC6/MzCqipXvpuD2ymVm/uJeOmVmVueCbmVVEnlfpbLFHVz3FO278YtlpmA1KPzn5s2WnYAXLNMKXNFHSiPT5FEnnSRqXa2ZmZtZUWad0bgA2SNoT+A6wB/C93LIyM7Omy1rwN0ZEF0nHy29ExPnAzj2dIOlASQvS9sizJI2VdIuk2yX9XNKELczdzMz6IGvB75T0buBs/rKYalgv5ywFpqbtkRcDxwLnRsRbgC8BF/YjXzMz66esBf9c4BDgXyPiCUl7AN/t6YSIWBYR69LNTmB9RCyt2e7qT8JmZtY/mQp+RDwMfAq4J91+IiK+nOVcSbsDR5P+ZiBpGPA54LIGx7/cHnn9muezhDAzswyyXqVzPHAfMCfd3l/STRnOawOuI5nK2XRDlOnAlRHxWL1zatsjD2/bJkt6ZmaWQdYpnS8AbwJWAUTEfSRX6jQkaQgwA7g4Ihal+z4LPBERbqBmZlawrAW/KyJWd9vXWxOe04FDgWmSbpN0NvB54Kh0+0t9zNXMzLZA1pW2D0o6ExgiaS/gPOCXPZ0QETOBmd12X9v3FM3MrBkydcuUtDVwEfC2dNdc4IsR8WKOublbpplZ/9TtltnrCD+di78pIo4mKfpmZjYA9TqHHxEbgHWSxhaQj5mZ5STrHP6LwAOS5gEvXxwfEeflklXq0WdX8s4brsozhFnufnzKB8tOwQzIXvB/nD7MzGyAylTwI8JX15iZDXCZCr6kJ6hz3X1EvKaHc8YA/we8HjgYWATckr48ChgeEZP6mrCZmfVP1imdyTXPRwKnAdv1cs4LwHHApQARsR6YAiDpLGBiXxI1M7Mtk7V52tM1j6UR8Q3gqF7O6YqIFQ1ePg34ft9SNTOzLZF1SueAms2tSEb8Y/oTMJ3q2S3twFnv9XagHWDk+N5+iTAzs6yyTul8teZ5F/AESa+c/jgBaNhpMyKmk3TUZOzECb0vAzYzs0yyFvwPRMTjtTvSm6D0x2l4xa6ZWeGydsucnXHfK0j6CUn/nf+S9L50Omf3iHioDzmamVkT9DjCl7Q3yWWVYyW9q+alNpKrdXoUEe+os/uAOvvMzCxnvU3pvI7k0spxwPE1+58D/jannMzMLAdZ2yMfEhF3FpDPK7g9splZv/SvPXLqXkkfJpneeXkqJyLe34TEzMysAFk/tL0OeBUwFZgP7EoyrWNmZgNE1imdeyNikqT7I2I/ScOAuRHR42rbLTVu4p5x+L/9e54hzAD40anv6v0gs4Gj7pRO1hF+Z/p1laR9gbH
AhCYkZWZmBck6hz9d0rbANJJVsqOBz+WWlZmZNV3Wfvibbjs1H2jYErlW9/bIEfGgpNuBjcBwoD0iHux7ymZm1h+ZpnQk7STpO5J+mm7vI+kDvZy2qT1y7Yrct0bEFOAzwPn9yNfMzPop6xz+NcBcYJd0exHw8Z5OqNceOSI2fRbQBjyQOUszM9tiWQv++IiYRTIdQ0R0ARv6GkzSDpLuAL4N3N7gmHZJHZI61q9Z3dcQZmbWQNaC/7yk7UlvcyjpYKDP1TgiVkTEYcApwCUNjpkeEZMjYvLwtrF9DWFmZg1kvUrnApKrcyamI/QdgFP7EkjSUGBjRGwk+c/i+b6cb2ZmW6a3bpm7R8QfI+IeSUeQNFMTsLBmPr6n838C7J+edyXwfkkbSaaGPrylyZuZWXa9jfB/yF/aGV8fEaf05c3rtEe+ti/nm5lZ8/RW8GuX52a6/r6Z9tx2nJe8m5k1SW8f2kaD52ZmNsD0NsJ/o6Q1JCP9Uelz0u2IiLZcszMzs6bpseBHxJCiEqnnsWfXcvINvygzBauIG095c9kpmOUu63X4ZmY2wLngm5lVRG4FX9KBkhZImi9plqRhkj4i6a70cXzv72JmZs2SdaVtfywFpkbEOkmXACcB/wDsB2xN0oztRznGNzOzGrkV/IhYVrPZCXQBi4FRwBjg6bxim5nZ5vIc4QNJewbgaOCLwM7Aw8AQ4JwGx7cD7QCjxu+Ud3pmZpWR64e2ktqA64BzSUb27cBewN7AJZI2u9FubbfMEW3j8kzPzKxS8vzQdggwA7g4IhaRNEx7EXgJWAeMoMGd1c3MrPnynNI5HTgUGCNpGnAFye0O7ySZ0rk8bZVsZmYFyPND25nAzDovfSWvmGZm1pgXXpmZVUTuV+lsiYnbjnaPEzOzJvEI38ysIlzwzcwqoqWndJ5ctZ7zbnyy7DRskLjs5N3KTsGsVB7hm5lVhAu+mVlFFFrwJb1J0m3pY6GkrxcZ38ysygqdw4+Iu4ApAJKuAn5YZHwzsyorZUpH0lDgYGBBGfHNzKqorDn8o4D59XrpSGqX1CGp44U1z5SQmpnZ4FRWwT8N+H69F2rbI49q267gtMzMBq/CC346nXMIcHvRsc3MqqyMEf6RwO1ujWxmVqzCV9pGxDxgXtFxzcyqrqVbK+w2briXw5uZNYlX2pqZVYQLvplZRbT0lM6qZ7v4weyVZadhg8S7Th1fdgpmpfII38ysIlzwzcwqooyFV1Mk/UzSfEknFh3fzKyqCp3DlzQS+ATw9ohYX2RsM7OqK3qEfyjwAvAjSTdKelXB8c3MKqvogr8TsAdwPDAd+EL3A2q7Za5e83TB6ZmZDV5FF/xVwC/S6ZxbgX26H1DbLXNs2/YFp2dmNngVXfDv4i9FfhLweMHxzcwqq+hbHD4t6SZJtwMbgfcXGd/MrMrK6JZ5OXB50XHNzKrOC6/MzCqipXvpjNt2qPufmJk1iUf4ZmYV4YJvZlYRLT2ls25lF/detbzsNKzGpA/uWHYKZtZPHuGbmVWEC76ZWUXkVvAlHShpQdoGeZakYZLOkPRLSbdK8t3JzcwKlOcIfykwNSKOABYDJwEXAFOAaenDzMwKklvBj4hlEbEu3ewEXgs8FBHrI+IO4A15xTYzs83lPocvaXfgaOAXwJqal4Y0OP7l9sjPPuf2yGZmzZJrwZfUBlwHnAssB9pqXt5Q75za9sjbjnF7ZDOzZsntOnxJQ4AZwMURsUjSMGAfScOBg4D784ptZmaby3Ph1ekktzQcI2kacAXwdWA+8CLwvhxjm5lZN7kV/IiYCcys89L1ecU0M7PGWrq1wtbjh3opv5lZk3ilrZlZRbjgm5lVREtP6XT++SWWfWVx2WlU2qs+uWfZKZhZk3iEb2ZWES74ZmYVUWjBl7SVpGvTLpoLJE0sMr6ZWZUVPcLfHxgREYcDFwMfKTi+mVllFV3wlwBIEjAOWFFwfDOzyir6Kp2VwEbgEWAEcFj3AyS1A+0Arx63S6HJmZkNZkWP8KcCL0TE3sApwNe6H1DbLXP70dsVnJ6Z2eBVxlU6z6ZfV5FM65iZWQGKntK5BXivpPkkUzoXFBzfzKyyCi34EbEBOLPImGZmlvDCKzOzimjpXjrDdhrhXi5mZk3iEb6ZWUW44JuZVURLT+l0Ln+OP192W9lptLSdzptSdgpmNkB4hG9mVhEu+GZmFZHblI6kfYH/BLqAtSTX338fGAlsAM6NiN/nFd/MzF4pzxH+wog4LCKOAO4CTiIp8m8BvgRcmGNsMzPrJreCHxGdNZtbAw9HxNJ0u5Nk5G9mZgXJdQ5f0jGS7gWOBB5L9w0DPgdc1uCcdkkdkjqeWbs6z/TMzCol14IfEfMiYhIwm7THPTAduDIiHmtwzsvtkbcbPTbP9MzMKiW3gi9pRM3mauB5SZ8FnoiI6/OKa2Zm9eW58OoYSReS3OFqBfBx4A/AHZKOAu6MiM/kGN/MzGrkVvAj4mbg5m67h+UVz8zMetbSrRWG7TjGrQPMzJrEK23NzCpCEVF2Dg1Jeg5YWHYe/TAeWFl2Ev3gvIvlvItVpbxXRsSx3Xe29JQOyWrdyWUn0VeSOpx3cZx3sZx3sZqZt6d0zMwqwgXfzKwiWr3gTy87gX5y3sVy3sVy3sVqWt4t/aGtmZk1T6uP8M3MrElc8M3MKqIlC76kYyUtlLRY0qfLzicLSbtJ+rmkRyQ9JOljZefUF5KGSLpXUvd2GC1L0jhJsyX9Lv1zP6TsnLKQdH76M/KgpJmSRpadUz2S/lvSckkP1uzbTtI8SY+mX7ctM8d6GuR9afpzcr+kGyWNKzHFuurlXfPaJyWFpPFbEqPlCr6kIcDlwNuBfYB3S9qn3Kwy6QI+ERF/BRwMfHiA5L3Jx4BHyk6ij/4DmBMRewNvZADkL+nVwHnA5IjYFxgC/E25WTV0DdB98c6ngZ9FxF7Az9LtVnMNm+c9D9g3IvYDFgGt2LjxGjbPG0m7AccAf9zSAC1X8IE3AYsj4vGIWA/8D3BiyTn1KiKeioh70ufPkRSfV5ebVTaSdgXeCVxVdi5ZSWoD3gJ8ByAi1kfEqlKTym4oMErSUJK7wf2p5HzqiojbgWe67T4RuDZ9fi3JrUtbSr28I+KWiNh0l71fAbsWnlgvGvx5A3wd+Edgi6+wacWC/2rgyZrtJQyQwrmJpAnAJODXJaeS1TdIfqA2lpxHX7yGpO321elU1FWStik7qd6kt/n8Cslo7SlgdUTcUm5WfbJTRDwFySAH2LHkfPrj/cBPy04iC0knAEsj4rfNeL9WLPiqs2/AXDsqaTRwA/DxiFhTdj69kXQcsDwiflN2Ln00FDgAuCK9q9rztOb0wiukc94nAnsAuwDbSDqr3KyqQ9JFJNOvM8rOpTeStgYuIrklbFO0YsFfAuxWs70rLforb3fp/XpvAGZExA/Kziejw4ATJP2eZPrsKEnfLTelTJYASyJi029Rs0n+A2h1R5Pc9W1FRHQCPwAOLTmnvvizpJ0B0q/LS84nM0lnA8cB74mBsQBpIsnA4Lfpv89dgXskva
q/b9iKBf9uYC9Je0gaTvKB1k0l59QrSSKZT34kIr5Wdj5ZRcRnImLXiJhA8md9a0S0/IgzIpYBT0p6XbrrrcDDJaaU1R+BgyVtnf7MvJUB8GFzjZuAs9PnZwP/W2IumUk6FvgUcEJErCs7nywi4oGI2DEiJqT/PpcAB6Q/+/3ScgU//WDlI8Bckn8IsyLioXKzyuQw4L0kI+T70sc7yk5qkPsoMEPS/cD+wCXlptO79DeS2cA9wAMk/wZbcsm/pJnAncDrJC2R9AHgyyS3L32U5MqRL5eZYz0N8v4WMAaYl/7bvLLUJOtokHdzYwyM32zMzGxLtdwI38zM8uGCb2ZWES74ZmYV4YJvZlYRLvhmZhXhgm+lk7S24HgTJJ1ZZMxu8S9Ku2Xen14i+Ndl5WLVMrTsBMyKlDYsmwCcCXyvhPiHkKz2PCAiXkrb3Q7fwvccWtMYzKwhj/CtZUiaImm+pFmSFkn6sqT3SLpL0gOSJqbHXSPpSkkL0uOOS/ePlHR1euy9ko5M958j6fuSfgTcQrJY6PB0dH1+OuJfIOme9HFoTT636S8992ekq2ORdJCkX0r6bZrfGCX3FLhU0t3p6P3v6nybOwMrI+IlgIhYGRF/6uE9M31PkrZR0k/97vS4E9PjXp++131pTnvl9zdoLS8i/PCj1AewNv06BVhFUhRHAEuBf05f+xjwjfT5NcAckgHLXiRLzkcCnwCuTo/Zm6SNwUjgnPSY7Wri3FwTf2tgZPp8L6Cj5rjVJD1MtiJZBflmkhH548BB6XFtJL8ttwOfTfeNADqAPbp9r6OB+0h6sn8bOCLd3+g9s35PlwBnpc/Hpe+/DfBNkt4xm2KMKvvv24/yHp7SsVZzd6TtdyU9RjIih6QNwZE1x82KiI3Ao5IeJymGbyYpcETE7yT9AXhtevy8iKjXaxxgGPAtSfsDG2rOAbgrIpak+dxHMh20GngqIu5OY61JX38bsJ+kU9Nzx5L8B/LEpjeLiLWSDgQOT7+f65Xc1e03Dd4z6/f0NpImeJ9Mt0cCu5P8J3WRknse/CAiHm3wZ2AV4IJvrealmucba7Y38sqf1+49QYL6rbU3eb6H184H/kxy16ytgBcb5LMhzUF14pPu/2hEzO0hFhGxAbgNuE3SAyRNyO7p4T0bqf2eBJwSEQu7HfOIpF+T3OBmrqQPRsStPeVng5fn8G2gOk3SVum8/muAhcDtwHsAJL2WZITbvQACPEfSSGuTsSSj640kDfCG9BL7d8Aukg5KY41JPwyeC3xISZtsJL1W3W7KIul13ebR9wf+0MN7Zv2e5gIfrfmMYVL69TXA4xFxGUmny/16+d5sEPMI3waqhcB8YCfg7yPiRUnfBq5MR81dwDmRXAnT/dz7gS5JvyX5PODbwA2STgN+Ts+/DRAR6yWdAXxT0ijgBZI+91eRTPnckxbeFWx+C8DR6Xnj0hwXA+09vGfW7+lfSO5cdn8a+/ckVwOdAZwlqRNYBlzc0/dmg5u7ZdqAI+kakg9dZ5edi9lA4ikdM7OK8AjfzKwiPMI3M6sIF3wzs4pwwTczqwgXfDOzinDBNzOriP8HFojSvIsMiUwAAAAASUVORK5CYII=\n",
747 | "text/plain": [
748 | ""
749 | ]
750 | },
751 | "metadata": {
752 | "needs_background": "light"
753 | },
754 | "output_type": "display_data"
755 | },
756 | {
757 | "name": "stdout",
758 | "output_type": "stream",
759 | "text": [
760 | "Score: 0.9\n"
761 | ]
762 | },
763 | {
764 | "name": "stderr",
765 | "output_type": "stream",
766 | "text": [
767 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
768 | "[Parallel(n_jobs=-1)]: Done 29 out of 34 | elapsed: 0.3s remaining: 0.0s\n",
769 | "[Parallel(n_jobs=-1)]: Done 34 out of 34 | elapsed: 0.4s finished\n",
770 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
771 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
772 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
773 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
774 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
775 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
776 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
777 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
778 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
779 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
780 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
781 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
782 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
783 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
784 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
785 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
786 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
787 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
788 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
789 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
790 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
791 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
792 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
793 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
794 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
795 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
796 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
797 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
798 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
799 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
800 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
801 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
802 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
803 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
804 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
805 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
806 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
807 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
808 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
809 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
810 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
811 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
812 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
813 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
814 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
815 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
816 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
817 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
818 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
819 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
820 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
821 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
822 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
823 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
824 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
825 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n"
826 | ]
827 | },
828 | {
829 | "name": "stdout",
830 | "output_type": "stream",
831 | "text": [
832 | "Markov Blanket: [0, 2, 7, 6, 4, 30, 17, 18, 8, 10, 20, 28]\n",
833 | "Feature importance: [13.862943611198906, 13.862943611198906, 13.16979643063896, 12.253505698764807, 11.560358518204861, 11.223886281583647, 9.614448369149548, 6.757157481717635, 6.160839271147856, 4.757075260819435, 4.644238322891096, 4.11144241566066]\n"
834 | ]
835 | },
836 | {
837 | "name": "stderr",
838 | "output_type": "stream",
839 | "text": [
840 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
841 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
842 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
843 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n",
844 | "[Parallel(n_jobs=-1)]: Using backend LokyBackend with 12 concurrent workers.\n",
845 | "[Parallel(n_jobs=-1)]: Done 8 out of 20 | elapsed: 0.0s remaining: 0.0s\n",
846 | "[Parallel(n_jobs=-1)]: Done 20 out of 20 | elapsed: 0.0s finished\n"
847 | ]
848 | },
849 | {
850 | "data": {
851 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAEGCAYAAABmXi5tAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuNCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8QVMy6AAAACXBIWXMAAAsTAAALEwEAmpwYAAAWUUlEQVR4nO3debRlZXnn8e+PoRirKBDFAVgglNjEENDSFo0NjqCCQwelVVTUbtJJnI0xWTa90qbb5WrS0WgcmqUtLiU0WE6IhqElIlEjEkQUlCFgECIyU4AMVdTTf+xderjcW/dUcc/e59b+ftY6656zzz77fc6te59673v2fp5UFZKkzd8WfQcgSeqGCV+SBsKEL0kDYcKXpIEw4UvSQGzVdwAbcvjhh9eZZ57ZdxiStNhkto1TPcO/+eab+w5BkjYbUz3DX3vTrdz08c/1HYYkdeqRf3DMRI471TN8SdLCMeFL0kCY8CVpIEz4kjQQnSf8JCckOT/JyUmWdD2+JA1Vpwk/yUHAY6rqWcBlwFFdji9JQ9b1DP9g4Oz2/pnAM2bukOS4JBcmufCWu1Z3Gpwkbc66TvjLgfVZ/A5gl5k7VNWJVbWyqlY+YsdlXcYmSZu1rhP+bcD6LL4cuLXj8SVpsLpO+P8IvKC9fxjw7Y7Hl6TB6jThV9UPgF8kOR/YH/hCl+NL0pB1Xkunqt7d9ZiSJC+8kqTBMOFL0kBMdXnkrR65y8TKhErS0DjDl6SBMOFL0kBM9ZLO/Tf+M9d+2HI7koZlz7eumshxneFL0kCY8CVpIEz4kjQQJnxJGoiuG6A8pe12dV6S05Js3eX4kjRkXc/wrwcOq6pDgKuAl3U8viQNVqenZVbVDSMP1wBruxxfkoaslzX8JHsCzwPOmOW5X7c4vPWu+7oPTpI2U50n/CTLgM8Cb6iqNTOfH21xuMuO23QdniRttrr+0HZL4GTgfVV1RZdjS9LQdT3DfyXwDOD4JN9McnTH40vSYHX9oe0pwCldjilJanjhlSQNhAlfkgZiqssjL3nUPhMrEypJQ+MMX5IGwoQvSQMx1Us6q2++krM+9aK+w5CkjXLYm77edwizcoYvSQNhwpekgTDhS9JAmPAlaSC6Lp72tLaGzjeTXJ7kg12OL0lD1nUtnQuAQwGSfBL4cpfjS9KQ9dUAZSvg6cD5fYwvSUPU1xr+c4DzqmrdzCdGO17dcef9PYQmSZunvhL+K4DPz/bEaMernZYu6TgsSdp89dHicCvgYOBbXY8tSUPWxwz/2cC3ZlvOkSRNTue1dKrqHOCcrseVpKHzwitJGggTviQNxFSXR16264qpLTMqSYuNM3xJGggTviQNhAlfkgZiqtfwb7rlSv73Zw/rOwxJm7Hff+1ZfYfQGWf4kjQQJnxJGggTviQNRB/F0w5N8o0k5yV5adfjS9JQdfqhbZJtgXcBL6wqi91LUoe6nuE/A7gH+GqSLyV5dMfjS9JgdZ3wdwP2Bo4ETgT+fOYOox2v7rLjlSQtmK4T/u3AP7TLOecC+8/cYbTj1Y52vJKkBdN1wr+A3yT5g4CrOx5fkgar0w9tq+qWJKcn+RawDnhjl+NL0pD10fHqo8BHux5XkobOC68kaSBM+JI0EKmqvmOY08qVK+vCCy/sOwxJWmwy20Zn+JI0ECZ8SRoIE74kDcRUr+Hvuu9OdeQJB/cdhqQp9umXn9l3CNPINXxJGjITviQNhAlfkgail4Sf5FVJbupjbEkaqj5aHG4BHAX8vOuxJWnI+pjhvxpYRVMtU5LUkU4TfpItgVcCp25gn193vLp3tR2vJGmhdD3DPwY4rarmnN2PdrzadpkdryRpoXSd8PcHXpfkTGBFkg92PL4kDVbXHa/es/5+kgur6h1dji9JQ9bbefhVtbKvsSVpiLzwSpIGwoQvSQMx1dUy7XglSZtk06tlJtknyTbt/UOTvDXJ8gUMTpI0YeMu6XwBeCDJvsCngL2Bv51YVJKkBTduwl9XVWuBlwMfak+nfMzkwpIkLbRxz8Nfk+RVwOuBI9ttW08mpN+48vYbePGXTpj0MJKm2Nde/u6+Q9hsjDvDfwNwMPA/quqaJHsDn5tcWJKkhTbWDL+qLkvyHmDP9vE1wAcmGZgkaWGNe5bOkcDFwJnt4wOTnD7BuCRJC2zcJZ0/B54G3A5QVRfTnKkzpyRPSvLtJOcl+VqSHZMcneQ7Sc5NssfDiFuStJHGTfhrq+qOGdvmu2Lr8qp6ZlUdAlxAc4bPO4FDgePbmySpI+Mm/B8neTWwZZIVST4CfGdDL6iqNSMPtweuBS6tqvur6tvAb29SxJKkTTJuwn8L8FvAfTQXXN0BvH2+FyV5fpIfAM8G1gCrR57eco7X/Lrj1f2r7x4zPEnSfOY9S6dtS3h6VT0PeO/GHLyqzgEOSvInwCHAspGnH5jjNScCJwLstO/u01voR5IWmXln+FX1APCrJDttzIHX195p3QHcBeyfZEmSZwKXbFSkkqSHZdwrbe8FfpTkHODX6yxV9dYNvOb5Sd4NrANuAo4FbgTOa4/3uk0JWJK0acZN+F9rb2OrqjOAM2ZsPrW9SZI6Nu6Vtp+ZdCCSpMkaK+EnuYZZzruvqscveESSpIkYd0lntOH4tsArgF0WPpwHW7H80VbKk6QFMtZ5+FV1y8jt+qr6EPCcyYYmSVpI4y7pPHnk4RY0M/6lE4lIkjQR4y7p/K+R+2uBa4BXLnw4kqRJGTfhv6mqrh7d0DZBmairbruVI1adPOlhJLXOOOo1fYegCRq3ls6qMbdJkqbUBmf4SZ5IUzRtpyT/fuSpZTRn60iSFon5lnT2A44AlvOb5uUAdwL/aUIxSZImYIMJv6q+AnwlycFV9d2NOXCSpcD/o/kL4enAFcDZ7dPbAUuq6qCND1mStCnG/dD2B0n+iCZ5/3opp6reuIHX3EPz18EJ7b7303S7IskxwD6bEK8kaRON+6HtZ4FHA4fRVLvcnWZZZ05Vtbaqbprj6VcAnx83SEnSwzduwt+3qo4H7m4Lqb2YTWxR2C717FFVl83x/EjHq9Wz7SJJ2gTjJvz1/WlvT/IkYCdgr00c8yXA6XM9WVUnVtXKqlq5ZNmyuXaTJG2kcRP+iUl2Bo6nSdaXAf9zE8d0OUeSejBuPfxPtnfPA8YuiZzk68CBwH5JPg58Cdizqi7dyDglSQ/TuMXTdgPeDzy2ql6YZH/g4Kr61IZeV1UvmmXzk2fZJkmasHGXdE4CzgIe2z6+Anj7BOKRJE3IuAl/16o6jaYhOVW1FnhgYlFJkhbcuBde3Z3kEbRtDpM8HbhjYlG19t15F6v3SdICGTfhv5Pm7Jx9knwbeCRw1MSikiQtuPmqZe5ZVddW1UVJDqEpphbg8qpas6HXSpKmy3xr+F8euX9qVV1aVT822UvS4jPfkk5G7o99/v1Cueq2O3nZqm90PawG6MtHPbfvEKSJm2+GX3PclyQtMvPN8H8nyWqamf527X3ax1VVFruRpEVivgYoW3YViCRpssa98EqStMhNLOEnWZrke0n
uaksqk+TNSS5ob0fOdwxJ0sIZ98KrTfGgFoetPwQOALanqc3z1QmOL0kaMbEZ/hwtDq+iaWC+FLhlttc9uOPV7ZMKT5IGZ5Iz/NmcSdM8ZUvg2Nl2qKoTgRMBlu+zn6eCStIC6SzhJ1kGHAesAJYA5yY5p6pM6pLUgS5n+OuAe4H7gLXANrTn83cYgyQN1kQT/miLQ+DjwCrguzRLOh+tqnWTHF+S9BsTTfhztDj8y0mOKUmanRdeSdJAmPAlaSC6Pi1zo+y781LL1krSAnGGL0kDYcKXpIGY6iWdn99+P2/90s/7DkOLyIdfvkffIUhTyxm+JA2ECV+SBsKEL0kDYcKXpIHoNOEn2SLJZ5Kc39726XJ8SRqyrmf4BwLbVNWzgPcBb+54fEkarK4T/nUASQIsB2Z2xHpQx6t7Vt/acXiStPnq+jz8m2nq4v+Eph7+M2fuMNrxard9D7BWviQtkK5n+IcB91TVE4HfA/6q4/ElabD6OEvntvbr7TTLOpKkDnS9pHM28Nok59Es6byz4/ElabA6TfhV9QDw6i7HlCQ1vPBKkgbChC9JAzHV5ZH3WL7EcreStECc4UvSQJjwJWkgpnpJ547b1vJ3p97cdxjaCC88ete+Q5A0B2f4kjQQJnxJGggTviQNhAlfkgZiYgk/ydIk30tyV5IntduOTvKdJOcm8QR7SerQJGf49wBHAKsAkmxNUyztUOD49iZJ6sjEEn5Vra2q0Y5WK4BLq+r+qvo28NuTGluS9FBdruEvB1aPPN5ytp1GWxyuXn1LJ4FJ0hB0mfBvA5aNPH5gtp2q6sSqWllVK5cte0Q3kUnSAHR5pe1VwP5JlgBPBS7pcGxJGryJJvwkXwcOBPYDPg58EDgPuBd43STHliQ92EQTflW9aJbNp05yTEnS7LzwSpIGwoQvSQMx1eWRd9p5K8vtStICcYYvSQNhwpekgZjqJZ37f7mGn33ohr7D0Bj2evuj+w5B0jyc4UvSQJjwJWkgTPiSNBAmfEkaiEl2vHpKkvOTnJfktCRb2/FKkvozyRn+9cBhVXUITaXMl2HHK0nqzSQ7Xt1QVb9qH64BnoAdrySpNxNfw0+yJ/A84B/YyI5Xt9xtxytJWigTTfhJlgGfBd4A3MhGdrx6xA52vJKkhTKxK22TbAmcDLyvqq5IsjV2vJKk3kyytMIrgWcAS5Mcjx2vJKlXE0v4VXUKcMosT9nxSpJ64IVXkjQQJnxJGoipLo+8ZLetLbsrSQvEGb4kDYQJX5IGYqqXdNbceDe//Ovv9h3G4Oz2toP7DkHSBDjDl6SBMOFL0kCY8CVpIEz4kjQQXXe8enOSC9rbkZMaW5L0UJM8S2d9x6tfJXk/TcerPwQOALYHzgK+OsHxJUkjJlk87YaRh2uAtTStDrcDlgJ2N5GkDk38PPyRjlf/HXgMcBlNt6tj59j/OOA4gN133m3S4UnSYHTZ8Wo7mkS+Angi8P4kmfma0Y5Xu+y48yTDk6RBmeSHtg/qeAWso2l8ch/wK2Ab4CEJX5I0GV13vFoFfJdmSeejVbVuguNLkkb00fHqLyc1piRpbl54JUkDYcKXpIGY6vLIWz9qB0v1StICcYYvSQORquo7hjkluRO4vO84NsGuwM19B7EJjLtbxt2tIcV9c1UdPnPjVC/pAJdX1cq+g9hYSS407u4Yd7eMu1sLGbdLOpI0ECZ8SRqIaU/4J/YdwCYy7m4Zd7eMu1sLFvdUf2grSVo40z7DlyQtEBO+JA3EVCb8JIcnuTzJVUn+tO94xpFkjyR/n+QnSS5N8ra+Y9oYSbZM8oMkZ/Qdy7iSLE+yKslP2+/7orgsO8k72p+RHyc5Jcm2fcc0myT/J8mNSX48sm2XJOckubL9OnVNK+aI+4T25+SSJF9KsrzHEGc1W9wjz/1xkkqy68MZY+oSfltH/6PAC4H9gVcl2b/fqMayFnhXVf0b4OnAHy2SuNd7G/CTvoPYSH8NnFlVTwR+h0UQf5LHAW8FVlbVk2hKhf+HfqOa00nAzIt3/hT4RlWtAL7RPp42J/HQuM8BnlRVBwBXAH/WdVBjOImHxk2SPYDnA9c+3AGmLuEDTwOuqqqrq+p+4P8CL+05pnlV1S+q6qL2/p00yedx/UY1niS7Ay8GPtl3LONqu6n9O+BTAFV1f1Xd3mtQ49sK2C7JVsD2wL/2HM+squpbwK0zNr8U+Ex7/zPAy7qMaRyzxV1VZ1fV2vbhPwK7dx7YPOb4fgN8EPgT4GGfYTONCf9xwM9HHl/HIkmc6yXZCzgI+F7PoYzrQzQ/UIupIc3jgZuAT7dLUZ9MskPfQc2nqq6n6QlxLfAL4I6qOrvfqDbKblX1C2gmOcCjeo5nU7wR+Lu+gxhHkpcA11fVDxfieNOY8Gdre7hozh1NsiPwBeDtVbW673jmk+QI4Maq+qe+Y9lIWwFPBj5eVQcBdzOdywsP0q55vxTYG3gssEOSY/qNajiSvJdm+fXkvmOZT5LtgfcC/3WhjjmNCf86YI+Rx7szpX/yzpRka5pkf3JVfbHveMb0TOAlSX5Gs3z2nCSf6zeksVwHXFdV6/+KWkXzH8C0ex5wTVXdVFVrgC/StAJdLH6Z5DEA7dcbe45nbEleDxwBvKYWxwVI+9BMDH7Y/n7uDlyU5NGbesBpTPjfB1Yk2TvJEpoPtE7vOaZ5JQnNevJPquqv+o5nXFX1Z1W1e1XtRfO9Preqpn7GWVU3AD9Psl+76bnAZT2GNK5rgacn2b79mXkui+DD5hGnA69v778e+EqPsYwtyeHAe4CXVNWv+o5nHFX1o6p6VFXt1f5+Xgc8uf3Z3yRTl/DbD1beDJxF84twWlVd2m9UY3km8FqaGfLF7e1FfQe1mXsLcHKSS4ADgff3G8782r9IVgEXAT+i+R2cykv+k5wCfBfYL8l1Sd4EfAB4fpIrac4c+UCfMc5mjrj/BlgKnNP+bn6i1yBnMUfcCzvG4vjLRpL0cE3dDF+SNBkmfEkaCBO+JA2ECV+SBsKEL0kDYcJX75Lc1fF4eyV5dZdjzhj/vW21zEvaUwT/bV+xaFi26jsAqUttwbK9gFcDf9vD+AfTXO355Kq6ry13u+RhHnOrkcJg0pyc4WtqJDk0yXlJTktyRZIPJHlNkguS/CjJPu1+JyX5RJLz2/2OaLdvm+TT7b4/SPLsdvuxST6f5KvA2TQXCz2rnV2/o53xn5/kovb2jJF4vpnf1Nw/ub06liRPTfKdJD9s41uapqfACUm+387ef3+Wt/kY4Oaqug+gqm6uqn/dwDHHek9JdkhTT/377X4vbff7rfZYF7cxrZjcv6CmXlV589brDbir/XoocDtNUtwGuB74b+1zbwM+1N4/CTiTZsKyguaS822BdwGfbvd5Ik0Zg22BY9t9dhkZ54yR8bcHtm3vrwAuHNnvDpoaJlvQXAX5uzQz8quBp7b7LaP5a/k44L+027YBLgT2nvFedwQupqnJ/jHgkHb7XMcc9z29Hzimvb+8Pf4OwEdoasesH2O7vv+9vfV3c0lH0+b71ZbfTfLPNDNyaMoQPH
tkv9Oqah1wZZKraZLh79IkOKrqp0n+BXhCu/85VTVbrXGArYG/SXIg8MDIawAuqKrr2nguplkOugP4RVV9vx1rdfv8C4ADkhzVvnYnmv9Arll/sKq6K8lTgGe17+fUNF3d/mmOY477nl5AUwTvj9vH2wJ70vwn9d40PQ++WFVXzvE90ACY8DVt7hu5v27k8Toe/PM6syZIMXtp7fXu3sBz7wB+SdM1awvg3jnieaCNIbOMT7v9LVV11gbGoqoeAL4JfDPJj2iKkF20gWPOZfQ9Bfi9qrp8xj4/SfI9mgY3ZyX5j1V17obi0+bLNXwtVq9IskW7rv944HLgW8BrAJI8gWaGOzMBAtxJU0hrvZ1oZtfraArgbTnP2D8FHpvkqe1YS9sPg88C/iBNmWySPCEzmrIk2W/GOvqBwL9s4JjjvqezgLeMfMZwUPv18cDVVfVhmkqXB8zz3rQZc4avxepy4DxgN+A/V9W9ST4GfKKdNa8Fjq3mTJiZr70EWJvkhzSfB3wM+EKSVwB/z4b/GqCq7k9yNPCRJNsB99DUuf8kzZLPRW3ivYmHtgDcsX3d8jbGq4DjNnDMcd/TX9B0LrukHftnNGcDHQ0ck2QNcAPwvg29N23erJapRSfJSTQfuq7qOxZpMXFJR5IGwhm+JA2EM3xJGggTviQNhAlfkgbChC9JA2HCl6SB+P8vm7MyCjx2FgAAAABJRU5ErkJggg==\n",
852 | "text/plain": [
853 | ""
854 | ]
855 | },
856 | "metadata": {
857 | "needs_background": "light"
858 | },
859 | "output_type": "display_data"
860 | },
861 | {
862 | "name": "stdout",
863 | "output_type": "stream",
864 | "text": [
865 | "Score: 0.9142857142857143\n",
866 | "\n",
867 | "\n",
868 | "Average Accuracy: 0.928692152917505\n",
869 | "\n",
870 | "\n",
871 | "Total Time Required (in seconds): 9.881650447845459\n"
872 | ]
873 | }
874 | ],
875 | "source": [
876 | "# We want to time our algorithm\n",
877 | "start = time.time()\n",
878 | "# Use KFold for understanding the performance of PyImpetus\n",
879 | "kfold = KFold(n_splits=5, random_state=27, shuffle=True)\n",
880 | "# This will hold all the accuracy scores\n",
881 | "scores = list()\n",
882 | "# Perform CV\n",
883 | "for train, test in kfold.split(data):\n",
884 | " # Split data into train and test based on folds\n",
885 | " x_train, x_test = data.iloc[train], data.iloc[test]\n",
886 | " y_train, y_test = Y[train], Y[test]\n",
887 | "\n",
888 | " # Create a PyImpetus classification object and initialize with required parameters\n",
889 | " # NOTE: To achieve fast selection, set cv=0 for disabling the use of any internal cross-validation\n",
890 | " model = PPIMBC(LogisticRegression(random_state=27, max_iter=1000, class_weight=\"balanced\"), cv=0, num_simul=20, simul_type=0, simul_size=0.2, sig_test_type=\"non-parametric\", random_state=27, verbose=2, p_val_thresh=0.05)\n",
891 | " # Fit this above object on the train part and transform the train dataset into selected feature subset\n",
892 | " # NOTE: x_train has to be a dataframe and y_train has to be a numpy array\n",
893 | " x_train = model.fit_transform(x_train, y_train)\n",
894 | " # Transform the test set as well\n",
895 | " # NOTE: x_test has to be a dataframe\n",
896 | " x_test = model.transform(x_test)\n",
897 | " # Check out the features selected\n",
898 | " print(\"Markov Blanket: \", model.MB)\n",
899 | " # Check out the scores of each feature. The scores are in order of the selected feature list\n",
900 | " # NOTE: You can use these scores ina feature selection ensemble\n",
901 | " print(\"Feature importance: \", model.feat_imp_scores)\n",
902 | " # Plot the feature importance scores\n",
903 | " model.feature_importance()\n",
904 | " # Convert the data into numpy arrays\n",
905 | " x_train, x_test = x_train.values, x_test.values\n",
906 | " \n",
907 | " model = DecisionTreeClassifier(random_state=27)\n",
908 | " model.fit(x_train, y_train)\n",
909 | " preds = model.predict(x_test)\n",
910 | " score = accuracy_score(y_test, preds)\n",
911 | " scores.append(score)\n",
912 | " print(\"Score: \", score)\n",
913 | "# Compute average score\n",
914 | "print(\"\\n\\nAverage Accuracy: \", sum(scores)/len(scores))\n",
915 | "# Finally, check out the total time taken\n",
916 | "end = time.time()\n",
917 | "print(\"\\n\\nTotal Time Required (in seconds): \", end-start)"
918 | ]
919 | },
920 | {
921 | "cell_type": "markdown",
922 | "metadata": {},
923 | "source": [
924 | "# Final Results (Accuracy is used so higher is better)"
925 | ]
926 | },
927 | {
928 | "cell_type": "markdown",
929 | "metadata": {},
930 | "source": [
931 | "### Final Accuracy using all features: 0.8801 (0.0073 seconds)\n",
932 | "### Final Accuracy using PyImpetus recommended features: 0.9286 (1.976 seconds)"
933 | ]
934 | },
935 | {
936 | "cell_type": "code",
937 | "execution_count": null,
938 | "metadata": {},
939 | "outputs": [],
940 | "source": []
941 | }
942 | ],
943 | "metadata": {
944 | "kernelspec": {
945 | "display_name": "Python 3",
946 | "language": "python",
947 | "name": "python3"
948 | },
949 | "language_info": {
950 | "codemirror_mode": {
951 | "name": "ipython",
952 | "version": 3
953 | },
954 | "file_extension": ".py",
955 | "mimetype": "text/x-python",
956 | "name": "python",
957 | "nbconvert_exporter": "python",
958 | "pygments_lexer": "ipython3",
959 | "version": "3.8.8"
960 | }
961 | },
962 | "nbformat": 4,
963 | "nbformat_minor": 4
964 | }
965 |
--------------------------------------------------------------------------------
/tutorials/ionosphere.data:
--------------------------------------------------------------------------------
1 | 1,0,0.99539,-0.05889,0.85243,0.02306,0.83398,-0.37708,1,0.03760,0.85243,-0.17755,0.59755,-0.44945,0.60536,-0.38223,0.84356,-0.38542,0.58212,-0.32192,0.56971,-0.29674,0.36946,-0.47357,0.56811,-0.51171,0.41078,-0.46168,0.21266,-0.34090,0.42267,-0.54487,0.18641,-0.45300,g
2 | 1,0,1,-0.18829,0.93035,-0.36156,-0.10868,-0.93597,1,-0.04549,0.50874,-0.67743,0.34432,-0.69707,-0.51685,-0.97515,0.05499,-0.62237,0.33109,-1,-0.13151,-0.45300,-0.18056,-0.35734,-0.20332,-0.26569,-0.20468,-0.18401,-0.19040,-0.11593,-0.16626,-0.06288,-0.13738,-0.02447,b
3 | 1,0,1,-0.03365,1,0.00485,1,-0.12062,0.88965,0.01198,0.73082,0.05346,0.85443,0.00827,0.54591,0.00299,0.83775,-0.13644,0.75535,-0.08540,0.70887,-0.27502,0.43385,-0.12062,0.57528,-0.40220,0.58984,-0.22145,0.43100,-0.17365,0.60436,-0.24180,0.56045,-0.38238,g
4 | 1,0,1,-0.45161,1,1,0.71216,-1,0,0,0,0,0,0,-1,0.14516,0.54094,-0.39330,-1,-0.54467,-0.69975,1,0,0,1,0.90695,0.51613,1,1,-0.20099,0.25682,1,-0.32382,1,b
5 | 1,0,1,-0.02401,0.94140,0.06531,0.92106,-0.23255,0.77152,-0.16399,0.52798,-0.20275,0.56409,-0.00712,0.34395,-0.27457,0.52940,-0.21780,0.45107,-0.17813,0.05982,-0.35575,0.02309,-0.52879,0.03286,-0.65158,0.13290,-0.53206,0.02431,-0.62197,-0.05707,-0.59573,-0.04608,-0.65697,g
6 | 1,0,0.02337,-0.00592,-0.09924,-0.11949,-0.00763,-0.11824,0.14706,0.06637,0.03786,-0.06302,0,0,-0.04572,-0.15540,-0.00343,-0.10196,-0.11575,-0.05414,0.01838,0.03669,0.01519,0.00888,0.03513,-0.01535,-0.03240,0.09223,-0.07859,0.00732,0,0,-0.00039,0.12011,b
7 | 1,0,0.97588,-0.10602,0.94601,-0.20800,0.92806,-0.28350,0.85996,-0.27342,0.79766,-0.47929,0.78225,-0.50764,0.74628,-0.61436,0.57945,-0.68086,0.37852,-0.73641,0.36324,-0.76562,0.31898,-0.79753,0.22792,-0.81634,0.13659,-0.82510,0.04606,-0.82395,-0.04262,-0.81318,-0.13832,-0.80975,g
8 | 0,0,0,0,0,0,1,-1,0,0,-1,-1,0,0,0,0,1,1,-1,-1,0,0,0,0,1,1,1,1,0,0,1,1,0,0,b
9 | 1,0,0.96355,-0.07198,1,-0.14333,1,-0.21313,1,-0.36174,0.92570,-0.43569,0.94510,-0.40668,0.90392,-0.46381,0.98305,-0.35257,0.84537,-0.66020,0.75346,-0.60589,0.69637,-0.64225,0.85106,-0.65440,0.57577,-0.69712,0.25435,-0.63919,0.45114,-0.72779,0.38895,-0.73420,g
10 | 1,0,-0.01864,-0.08459,0,0,0,0,0.11470,-0.26810,-0.45663,-0.38172,0,0,-0.33656,0.38602,-0.37133,0.15018,0.63728,0.22115,0,0,0,0,-0.14803,-0.01326,0.20645,-0.02294,0,0,0.16595,0.24086,-0.08208,0.38065,b
11 | 1,0,1,0.06655,1,-0.18388,1,-0.27320,1,-0.43107,1,-0.41349,0.96232,-0.51874,0.90711,-0.59017,0.89230,-0.66474,0.69876,-0.70997,0.70645,-0.76320,0.63081,-0.80544,0.55867,-0.89128,0.47211,-0.86500,0.40303,-0.83675,0.30996,-0.89093,0.22995,-0.89158,g
12 | 1,0,1,-0.54210,1,-1,1,-1,1,0.36217,1,-0.41119,1,1,1,-1,1,-0.29354,1,-0.93599,1,1,1,1,1,-0.40888,1,-0.62745,1,-1,1,-1,1,-1,b
13 | 1,0,1,-0.16316,1,-0.10169,0.99999,-0.15197,1,-0.19277,0.94055,-0.35151,0.95735,-0.29785,0.93719,-0.34412,0.94486,-0.28106,0.90137,-0.43383,0.86043,-0.47308,0.82987,-0.51220,0.84080,-0.47137,0.76224,-0.58370,0.65723,-0.68794,0.68714,-0.64537,0.64727,-0.67226,g
14 | 1,0,1,-0.86701,1,0.22280,0.85492,-0.39896,1,-0.12090,1,0.35147,1,0.07772,1,-0.14767,1,-1,1,-1,0.61831,0.15803,1,0.62349,1,-0.17012,1,0.35924,1,-0.66494,1,0.88428,1,-0.18826,b
15 | 1,0,1,0.07380,1,0.03420,1,-0.05563,1,0.08764,1,0.19651,1,0.20328,1,0.12785,1,0.10561,1,0.27087,1,0.44758,1,0.41750,1,0.20033,1,0.36743,0.95603,0.48641,1,0.32492,1,0.46712,g
16 | 1,0,0.50932,-0.93996,1,0.26708,-0.03520,-1,1,-1,0.43685,-1,0,0,-1,-0.34265,-0.37681,0.03623,1,-1,0,0,0,0,-0.16253,0.92236,0.39752,0.26501,0,0,1,0.23188,0,0,b
17 | 1,0,0.99645,0.06468,1,-0.01236,0.97811,0.02498,0.96112,0.02312,0.99274,0.07808,0.89323,0.10346,0.94212,0.05269,0.88809,0.11120,0.86104,0.08631,0.81633,0.11830,0.83668,0.14442,0.81329,0.13412,0.79476,0.13638,0.79110,0.15379,0.77122,0.15930,0.70941,0.12015,g
18 | 0,0,0,0,-1,-1,1,1,-1,1,-1,1,1,-1,1,1,-1,-1,-1,1,1,-1,-1,1,-1,1,1,-1,-1,1,-1,-1,1,-1,b
19 | 1,0,0.67065,0.02528,0.66626,0.05031,0.57197,0.18761,0.08776,0.34081,0.63621,0.12131,0.62099,0.14285,0.78637,0.10976,0.58373,0.18151,0.14395,0.41224,0.53888,0.21326,0.51420,0.22625,0.48838,0.23724,0.46167,0.24618,0.43433,0.25306,0.40663,0.25792,1,0.33036,g
20 | 0,0,1,-1,0,0,0,0,1,1,1,-1,-0.71875,1,0,0,-1,1,1,1,-1,1,1,0.56250,-1,1,1,1,1,-1,1,1,1,1,b
21 | 1,0,1,-0.00612,1,-0.09834,1,-0.07649,1,-0.10605,1,-0.11073,1,-0.39489,1,-0.15616,0.92124,-0.31884,0.86473,-0.34534,0.91693,-0.44072,0.96060,-0.46866,0.81874,-0.40372,0.82681,-0.42231,0.75784,-0.38231,0.80448,-0.40575,0.74354,-0.45039,g
22 | 0,0,1,1,0,0,0,0,-1,-1,0,0,0,0,-1,-1,-1,-1,-1,1,-1,1,0,0,0,0,1,-1,-1,1,-1,1,-1,1,b
23 | 1,0,0.96071,0.07088,1,0.04296,1,0.09313,0.90169,-0.05144,0.89263,0.02580,0.83250,-0.06142,0.87534,0.09831,0.76544,0.00280,0.75206,-0.05295,0.65961,-0.07905,0.64158,-0.05929,0.55677,-0.07705,0.58051,-0.02205,0.49664,-0.01251,0.51310,-0.00015,0.52099,-0.00182,g
24 | 0,0,-1,1,0,0,0,0,-1,1,1,1,0,0,0,0,1,-1,-1,1,1,1,0,0,-1,-1,1,-1,1,1,-1,1,0,0,b
25 | 1,0,1,-0.06182,1,0.02942,1,-0.05131,1,-0.01707,1,-0.11726,0.84493,-0.05202,0.93392,-0.06598,0.69170,-0.07379,0.65731,-0.20367,0.94910,-0.31558,0.80852,-0.31654,0.84932,-0.34838,0.72529,-0.29174,0.73094,-0.38576,0.54356,-0.26284,0.64207,-0.39487,g
26 | 1,0,1,0.57820,1,-1,1,-1,1,-1,1,-1,1,-1,1,-1,1,-1,1,-1,1,-0.62796,1,-1,1,-1,1,-1,1,-1,1,-1,1,-1,b
27 | 1,0,1,-0.08714,1,-0.17263,0.86635,-0.81779,0.94817,0.61053,0.95473,-0.41382,0.88486,-0.31736,0.87937,-0.23433,0.81051,-0.62180,0.12245,-1,0.90284,0.11053,0.62357,-0.78547,0.55389,-0.82868,0.48136,-0.86583,0.40650,-0.89674,0.32984,-0.92128,-0.13341,-1,g
28 | 0,0,-1,-1,0,0,-1,1,1,-0.37500,0,0,0,0,0,0,1,-1,-1,-1,1,-1,0,0,1,-1,-1,1,-1,-1,0,0,-1,1,b
29 | 1,0,1,0.08380,1,0.17387,1,-0.13308,0.98172,0.64520,1,0.47904,1,0.59113,1,0.70758,1,0.82777,1,0.95099,1,1,0.98042,1,0.91624,1,0.83899,1,0.74822,1,0.64358,1,0.52479,1,g
30 | 0,0,-1,-1,1,1,1,-1,-1,1,1,-1,-1,-1,0,0,1,1,-1,-1,1,-1,1,-1,1,1,1,-1,1,-1,-1,1,1,-1,b
31 | 1,0,1,-0.14236,1,-0.16256,1,-0.23656,1,-0.07514,1,-0.25010,1,-0.26161,1,-0.21975,1,-0.38606,1,-0.46162,1,-0.35519,1,-0.59661,1,-0.47643,0.98820,-0.49687,1,-0.75820,1,-0.75761,1,-0.84437,g
32 | 1,0,1,-1,1,1,1,-1,1,-1,1,-1,1,-0.01840,1,-1,1,1,1,-0.85583,1,1,1,-1,0,0,1,1,1,-0.79141,1,1,1,1,b
33 | 1,0,0.88208,-0.14639,0.93408,-0.11057,0.92100,-0.16450,0.88307,-0.17036,0.88462,-0.31809,0.85269,-0.31463,0.82116,-0.35924,0.80681,-0.33632,0.75243,-0.47022,0.70555,-0.47153,0.66150,-0.50085,0.61297,-0.48086,0.56804,-0.54629,0.50179,-0.59854,0.47075,-0.57377,0.42189,-0.58086,g
34 | 1,0,0.71253,-0.02595,0.41287,-0.23067,0.98019,-0.09473,0.99709,-0.10236,1,-0.10951,0.58965,1,0.83726,-1,0.82270,-0.17863,0.80760,-0.28257,-0.25914,0.92730,0.51933,0.05456,0.65493,-0.20392,0.93124,-0.41307,0.63811,-0.21901,0.86136,-0.87354,-0.23186,-1,b
35 | 1,0,1,-0.15899,0.72314,0.27686,0.83443,-0.58388,1,-0.28207,1,-0.49863,0.79962,-0.12527,0.76837,0.14638,1,0.39337,1,0.26590,0.96354,-0.01891,0.92599,-0.91338,1,0.14803,1,-0.11582,1,-0.11129,1,0.53372,1,-0.57758,g
36 | 1,0,0.66161,-1,1,1,1,-0.67321,0.80893,-0.40446,1,-1,1,-0.89375,1,0.73393,0.17589,0.70982,1,0.78036,1,0.85268,1,-1,1,0.85357,1,-0.08571,0.95982,-0.36250,1,0.65268,1,0.34732,b
37 | 1,0,1,0.00433,1,-0.01209,1,-0.02960,1,-0.07014,0.97839,-0.06256,1,-0.06544,0.97261,-0.07917,0.92561,-0.13665,0.94184,-0.14327,0.99589,-0.14248,0.94815,-0.13565,0.89469,-0.20851,0.89067,-0.17909,0.85644,-0.18552,0.83777,-0.20101,0.83867,-0.20766,g
38 | 0,0,1,1,1,-1,0,0,0,0,-1,-1,0,0,0,0,-1,1,1,1,-1,1,-1,1,1,-1,1,1,-1,1,1,1,0,0,b
39 | 1,0,0.91241,0.04347,0.94191,0.02280,0.94705,0.05345,0.93582,0.01321,0.91911,0.06348,0.92766,0.12067,0.92048,0.06211,0.88899,0.12722,0.83744,0.14439,0.80983,0.11849,0.77041,0.14222,0.75755,0.11299,0.73550,0.13282,0.66387,0.15300,0.70925,0.10754,0.65258,0.11447,g
40 | 1,0,1,0.02461,0.99672,0.04861,0.97545,0.07143,0.61745,-1,0.91036,0.11147,0.88462,0.53640,0.82077,0.14137,0.76929,0.15189,1,0.41003,0.65850,0.16371,0.60138,0.16516,0.54446,0.16390,0.48867,0.16019,0.43481,0.15436,0.38352,0.14677,1,1,b
41 | 1,0,1,0.06538,1,0.20746,1,0.26281,0.93051,0.32213,0.86773,0.39039,0.75474,0.50082,0.79555,0.52321,0.65954,0.60756,0.57619,0.62999,0.47807,0.67135,0.40553,0.68840,0.34384,0.72082,0.27712,0.72386,0.19296,0.70682,0.11372,0.72688,0.06990,0.71444,g
42 | 1,0,-1,-1,1,1,1,-0.14375,0,0,-1,1,1,1,0.17917,-1,-1,-1,0.08750,-1,1,-1,-1,1,-1,-1,1,-1,-1,-1,1,1,0,0,b
43 | 1,0,0.90932,0.08791,0.86528,0.16888,1,0.16598,0.55187,0.68154,0.70207,0.36719,0.16286,0.42739,0.57620,0.46086,0.51067,0.49618,0.31639,0.12967,0.37824,0.54462,0.31274,0.55826,0.24856,0.56527,0.18626,0.56605,0.12635,0.56101,0.06927,0.55061,0.12137,0.67739,g
44 | 1,0,-0.64286,-1,1,0.82857,1,-1,1,-0.23393,1,0.96161,1,-0.37679,1,-1,1,0.13839,1,-1,1,-0.03393,-0.84286,1,0.53750,0.85714,1,1,1,-1,1,-1,1,-1,b
45 | 1,0,0.99025,-0.05785,0.99793,-0.13009,0.98663,-0.19430,0.99374,-0.25843,0.92738,-0.30130,0.92651,-0.37965,0.89812,-0.43796,0.84922,-0.52064,0.87433,-0.57075,0.79016,-0.59839,0.74725,-0.64615,0.68282,-0.68479,0.65247,-0.73174,0.61010,-0.75353,0.54752,-0.80278,0.49195,-0.83245,g
46 | 0,0,0,0,0,0,0,0,1,1,1,1,0,0,0,0,-0.37500,-1,-1,-1,0,0,0,0,-1,-1,-1,-1,-1,1,1,0,0,0,b
47 | 1,0,1,-0.03730,1,-0.07383,0.99601,-0.11039,0.99838,-0.09931,0.98941,-0.13814,0.96674,-0.21695,0.95288,-0.25099,0.91236,-0.34400,0.90581,-0.32152,0.89991,-0.34691,0.87874,-0.37643,0.86213,-0.42990,0.83172,-0.43122,0.81433,-0.42593,0.77919,-0.47977,0.75115,-0.50152,g
48 | 1,0,0.94598,-0.02685,-1,0.26131,-0.36393,0.35639,0.69258,-0.63427,1,-0.03353,-0.29020,-0.00550,-0.54852,0.15452,0.91921,-0.46270,1,-0.50424,-0.29735,-0.31454,-0.73864,0.37361,0.83872,-0.46734,0.52208,-0.58130,1,-0.61393,-0.09634,0.20477,-0.06117,0.41913,b
49 | 1,0,0.98166,0.00874,0.98103,-0.03818,0.97565,-0.05699,0.95947,-0.06971,0.99004,-0.04507,0.94713,-0.11102,0.93369,-0.12790,0.94217,-0.11583,0.79682,-0.19200,0.88274,-0.17387,0.86257,-0.18739,0.88487,-0.19689,0.81813,-0.21136,0.78546,-0.23864,0.76911,-0.23095,0.74323,-0.23902,g
50 | 1,0,0,0,1,0.51724,0,0,0.10991,-1,0,0,0,0,-1,-0.22414,-0.55711,-0.83297,0.76940,0.63147,0,0,0.53448,0.35668,-0.90302,0.44828,1,-1,-1,0.81573,0,0,0,0,b
51 | 1,0,0.84134,-0.18362,0.43644,0.02919,0.93421,-0.00267,0.87947,0.13795,0.81121,-0.01789,0.88559,0.54991,0.91714,-0.57486,0.75000,-0.29520,0.86676,-0.20104,1,1,0.46610,-0.16290,0.90066,-0.02778,0.93358,-0.01158,0.61582,-0.32298,0.84463,-0.25706,0.93323,-0.01425,g
52 | 0,0,1,1,1,-1,0,0,0,0,1,1,1,1,-1,-1,1,-1,-1,1,0,0,1,-1,1,-1,1,1,-1,-1,0,0,0,0,b
53 | 1,0,1,1,1,1,0.91010,1,-0.26970,1,-0.83152,1,-1,1,-1,0.72526,-1,-0.57779,-1,-0.42052,-1,-1,-0.52838,-1,0.90014,-1,1,-1,1,-1,1,-0.34686,1,0.34845,g
54 | 1,0,-0.67935,-1,-1,1,1,0.63317,0.03515,-1,-1,-1,1,1,0.88683,-1,-1,1,0.83840,1,1,-1,-1,-1,-0.18856,1,1,-1,-1,-1,-1,1,1,0.33611,b
55 | 1,0,0.95659,0.08143,0.97487,-0.05667,0.97165,-0.08484,0.96097,-0.06561,0.94717,0.01279,0.95436,-0.16795,0.94612,-0.19497,0.99630,-0.32268,0.90343,-0.35902,0.91428,-0.27316,0.90140,-0.29807,0.99899,-0.40747,0.87244,-0.34586,0.92059,-0.30619,0.83951,-0.39061,0.82166,-0.41173,g
56 | 1,0,0.08333,-0.20685,-1,1,-1,1,0.71875,0.47173,-0.82143,-0.62723,-1,-1,-1,1,-0.02753,0.59152,-0.42113,-0.42113,-0.74628,-1,-1,-0.46801,-1,0.23810,1,-1,-1,-0.38914,-1,-1,-1,0.61458,b
57 | 1,0,1,-0.02259,1,-0.04494,1,-0.06682,1,-0.08799,1,0.56173,1,-0.12738,1,-0.14522,1,0.32407,1,-0.17639,0.99484,-0.18949,0.95601,-0.20081,1,-0.92284,0.87280,-0.21793,0.82920,-0.22370,0.78479,-0.22765,0.73992,-0.22981,g
58 | 0,0,-1,1,1,-1,-1,1,0,0,1,1,-1,-0.18750,1,1,-1,-1,1,-1,-1,-1,1,1,1,-1,1,1,1,1,0,0,-1,-1,b
59 | 1,0,1,0.05812,0.94525,0.07418,0.99952,0.13231,1,-0.01911,0.94846,0.07033,0.95713,0.14644,0.94862,0.11224,0.90896,0.20119,0.96741,0.16265,0.99695,0.14258,0.90784,0.16410,0.91667,0.22431,0.88423,0.23571,0.88568,0.22511,0.78324,0.29576,0.83574,0.31166,g
60 | 1,0,0.17188,-1,-1,1,0,0,0,0,-1,1,0,0,-0.61354,-0.67708,0.80521,0.36146,0.51979,0.14375,0,0,-1,-0.27083,-0.84792,0.96250,1,1,-1,0.67708,0,0,0,0,b
61 | 1,0,1,0.09771,1,0.12197,1,0.22574,0.98602,0.09237,0.94930,0.19211,0.92992,0.24288,0.89241,0.28343,0.85529,0.26721,0.83656,0.33129,0.83393,0.31698,0.74829,0.39597,0.76193,0.34658,0.68452,0.42746,0.62764,0.46031,0.56791,0.47033,0.54252,0.50903,g
62 | 1,0,0.01667,-0.35625,0,0,0,0,0,0,0,0,0,0,0.12292,-0.55000,0.22813,0.82813,1,-0.42292,0,0,0.08333,-1,-0.10625,-0.16667,1,-0.76667,-1,0.18854,0,0,1,-0.27292,b
63 | 1,0,1,0.16801,0.99352,0.16334,0.94616,0.33347,0.91759,0.22610,0.91408,0.37107,0.84250,0.46899,0.81011,0.49225,0.78473,0.48311,0.65091,0.56977,0.56553,0.58071,0.55586,0.64720,0.48311,0.55236,0.43317,0.69129,0.35684,0.76147,0.33921,0.66844,0.22101,0.78685,g
64 | 1,0,0.63816,1,0.20833,-1,1,1,0.87719,0.30921,-0.66886,1,-0.05921,0.58772,0.01754,0.05044,-0.51535,-1,0.14254,-0.03289,0.32675,-0.43860,-1,1,0.80921,-1,1,-0.06140,1,1,0.20614,-1,1,1,b
65 | 1,0,1,-0.41457,1,0.76131,0.87060,0.18593,1,-0.09925,0.93844,0.47990,0.65452,-0.16080,1,0.00879,0.97613,-0.50126,0.80025,-0.24497,0.88065,-0.19095,1,-0.12312,0.93593,0.10678,0.92890,-0.07249,1,-0.27387,0.43970,0.19849,0.51382,-0.05402,g
66 | 1,0,0.84783,0.10598,1,0.39130,1,-1,0.66938,0.08424,1,0.27038,1,0.60598,1,0.35507,1,0.02672,0.58424,-0.43025,1,0.63496,0.89130,0.26585,0.91033,-0.33333,1,0.15942,0.37681,-0.01947,1,0.22464,1,0.37409,b
67 | 1,0,1,0.28046,1,0.02477,1,0.07764,1,0.04317,0.98762,0.33266,1,0.05489,1,0.04384,0.95750,-0.24598,0.84371,-0.08668,1,0.04150,0.99933,0.27376,1,-0.39056,0.96414,-0.02174,0.86747,0.23360,0.94578,-0.22021,0.80355,-0.07329,g
68 | 0,0,1,-1,1,-1,1,-1,1,-1,1,1,1,1,1,-1,1,1,1,1,1,1,1,-1,1,-1,1,-1,1,0.65625,0,0,1,-1,b
69 | 1,0,1,0.67784,0.81309,0.82021,0.43019,1,0.20619,0.80541,-0.43872,1,-0.79135,0.77092,-1,0.40268,-0.39046,-0.58634,-0.97907,-0.42822,-0.73083,-0.76339,-0.37671,-0.97491,0.41366,-1,0.41778,-0.93296,0.25773,-1,0.93570,-0.35222,0.98816,0.03446,g
70 | 1,0,1,1,1,-1,1,-1,1,1,1,1,1,1,1,-1,1,1,1,1,1,1,1,1,1,1,1,0.5,0,0,1,-1,1,-1,b
71 | 1,0,1,0.03529,1,0.18281,1,0.26968,1,0.25068,1,0.28778,1,0.38643,1,0.31674,1,0.65701,1,0.53846,1,0.61267,1,0.59457,0.89593,0.68326,0.89502,0.71374,0.85611,0.67149,0.74389,0.85611,0.71493,0.75837,g
72 | 0,0,1,-1,1,1,-1,-1,1,-1,0,0,0,0,-1,1,1,-1,1,-1,-0.75000,1,1,-1,1,-1,1,-1,-1,-1,0,0,1,-1,b
73 | 1,0,0.96087,0.08620,0.96760,0.19279,0.96026,0.27451,0.98044,0.35052,0.92867,0.46281,0.86265,0.52517,0.82820,0.58794,0.73242,0.69065,0.69003,0.73140,0.54473,0.68820,0.48339,0.76197,0.40615,0.74689,0.33401,0.83796,0.24944,0.86061,0.13756,0.86835,0.09048,0.86285,g
74 | 1,0,0.69444,0.38889,0,0,-0.32937,0.69841,0,0,0,0,0,0,0.20635,-0.24206,0.21032,0.19444,0.46429,0.78175,0,0,0,0,0.73413,0.27381,0.76190,0.63492,0,0,0,0,0,0,b
75 | 1,0,1,0.05070,1,0.10827,1,0.19498,1,0.28453,1,0.34826,1,0.38261,0.94575,0.42881,0.89126,0.50391,0.75906,0.58801,0.80644,0.59962,0.79578,0.62758,0.66643,0.63942,0.59417,0.69435,0.49538,0.72684,0.47027,0.71689,0.33381,0.75243,g
76 | 0,0,1,1,0,0,1,-1,1,-1,1,1,1,1,1,-1,1,1,1,1,1,-1,-1,-1,1,-1,1,-1,1,1,0,0,1,-1,b
77 | 1,0,1,0.04078,1,0.11982,1,0.16159,1,0.27921,0.98703,0.30889,0.92745,0.37639,0.91118,0.39749,0.81939,0.46059,0.78619,0.46994,0.79400,0.56282,0.70331,0.58129,0.67077,0.59723,0.58903,0.60990,0.53952,0.60932,0.45312,0.63636,0.40442,0.62658,g
78 | 0,0,1,1,1,-1,1,1,1,1,1,1,1,1,1,1,1,-1,-1,1,-1,1,-1,1,1,-1,1,1,-1,1,-1,-1,-1,1,b
79 | 1,0,1,0.24168,1,0.48590,1,0.72973,1,1,1,1,1,1,1,0.77128,1,1,1,1,0.74468,1,0.89647,1,0.64628,1,0.38255,1,0.10819,1,-0.17370,1,-0.81383,1,g
80 | 0,0,1,1,1,-1,1,1,-1,1,0,0,1,1,0,0,0,0,-1,1,-1,1,1,1,1,-1,1,1,1,1,1,-1,-1,1,b
81 | 1,0,1,-0.06604,1,0.62937,1,0.09557,1,0.20280,1,-1,1,-0.40559,1,-0.15851,1,0.04895,1,-0.61538,1,-0.26573,1,-1,1,-0.58042,1,-0.81372,1,-1,1,-0.78555,1,-0.48252,g
82 | 0,0,1,-1,1,1,1,1,1,1,1,1,1,-1,1,-1,1,1,1,-1,1,1,1,1,1,-1,1,1,1,-1,1,1,1,-1,b
83 | 1,0,0.92277,0.07804,0.92679,0.16251,0.89702,0.24618,0.84111,0.35197,0.78801,0.42196,0.70716,0.46983,0.70796,0.56476,0.60459,0.64200,0.51247,0.64924,0.39903,0.66975,0.34232,0.68343,0.23693,0.76146,0.18765,0.73885,0.09694,0.71038,0.02735,0.77072,-0.04023,0.69509,g
84 | 1,0,0.68198,-0.17314,0.82332,0.21908,0.46643,0.32862,0.25795,0.58304,1,-0.15194,0.01060,0.44523,0.01060,0.38869,0.18681,0.41168,0.10567,0.36353,0.04325,0.30745,-0.00083,0.24936,-0.02862,0.19405,-0.04314,0.14481,-0.04779,0.10349,-0.04585,0.07064,-0.04013,0.04586,b
85 | 1,0,0.74852,-0.02811,0.65680,-0.05178,0.80621,0.02811,0.85947,0.02515,0.63462,0.08728,0.71598,0.07840,0.73077,0.05178,0.78550,-0.27811,0.65976,-0.01479,0.78698,0.06953,0.34615,-0.18639,0.65385,0.02811,0.61009,-0.06637,0.53550,-0.21154,0.59024,-0.14053,0.56361,0.02959,g
86 | 1,0,0.39179,-0.06343,0.97464,0.04328,1,1,0.35821,0.15299,0.54478,0.13060,0.61567,-0.82090,0.57836,0.67910,0.66791,-0.10448,0.46642,-0.11567,0.65574,0.14792,0.83209,0.45522,0.47015,0.16418,0.49309,0.14630,0.32463,-0.02612,0.39118,0.13521,0.34411,0.12755,b
87 | 1,0,0.67547,0.04528,0.76981,-0.10566,0.77358,0.03774,0.66038,-0.04528,0.64528,0.01132,0.66792,-0.13962,0.72075,-0.02264,0.76981,0.08679,0.61887,-0.07925,0.75849,-0.23774,0.73962,-0.14717,0.84906,-0.15094,0.73886,-0.05801,0.66792,0.02264,0.86415,0.03774,0.73208,0.00755,g
88 | 1,0,0.72727,-0.05000,0.89241,0.03462,1,0.72727,0.66364,-0.05909,0.48182,-0.16818,0.81809,0.09559,0.56818,1,0.50455,0.21818,0.66818,0.10000,1,-0.30000,0.98636,-1,0.57273,0.32727,0.56982,0.14673,0.42273,0.08182,0.48927,0.14643,1,1,b
89 | 1,0,0.57647,-0.01569,0.40392,0,0.38431,0.12941,0.40000,-0.05882,0.56471,0.14118,0.46667,0.08235,0.52549,-0.05490,0.58039,0.01569,0.50196,0,0.45882,0.06667,0.58039,0.08235,0.49804,0.00392,0.48601,0.10039,0.46275,0.08235,0.45098,0.23529,0.43137,0.17255,g
90 | 1,0,0.41932,0.12482,0.35000,0.12500,0.23182,0.27955,-0.03636,0.44318,0.04517,0.36194,-0.19091,0.33636,-0.13350,0.27322,0.02727,0.40455,-0.34773,0.12727,-0.20028,0.05078,-0.18636,0.36364,-0.14003,-0.04802,-0.09971,-0.07114,-1,-1,-0.02916,-0.07464,-0.00526,-0.06314,b
91 | 1,0,0.88305,-0.21996,1,0.36373,0.82403,0.19206,0.85086,0.05901,0.90558,-0.04292,0.85193,0.25000,0.77897,0.25322,0.69206,0.57940,0.71030,0.39056,0.73176,0.27575,1,0.34871,0.56760,0.52039,0.69811,0.53235,0.80901,0.58584,0.43026,0.70923,0.52361,0.54185,g
92 | 1,0,0.84557,-0.08580,-0.31745,-0.80553,-0.08961,-0.56435,0.80648,0.04576,0.89514,-0.00763,-0.18494,0.63966,-0.20019,-0.68065,0.85701,-0.11344,0.77979,-0.15729,-0.06959,0.50810,-0.34128,0.80934,0.78932,-0.03718,0.70882,-0.25288,0.77884,-0.14109,-0.21354,-0.78170,-0.18494,-0.59867,b
93 | 1,0,0.70870,-0.24783,0.64348,0.04348,0.45217,0.38261,0.65217,0.18261,0.5,0.26957,0.57826,-0.23043,0.50435,0.37826,0.38696,-0.42609,0.36087,-0.26087,0.26957,0.11739,0.53246,-0.03845,0.31304,-0.12174,0.49930,-0.04264,0.48348,-0.04448,0.64348,-0.25217,0.50435,0.14783,g
94 | 1,0,-0.54180,0.14861,-0.33746,0.73375,0.52012,-0.13932,0.31889,-0.06811,0.20743,-0.15170,0.47368,0.08978,0.56347,-0.15480,0.16409,0.45201,0.33746,0.03406,0.50464,0.07121,-0.63777,-0.61610,1,0.65635,0.41348,-0.40116,-0.15170,0.11146,0.02399,0.55820,0.52632,-0.08978,b
95 | 1,0,0.29202,0.13582,0.45331,0.16808,0.51783,-0.00509,0.52632,0.20883,0.52462,-0.16638,0.47368,-0.04754,0.55518,0.03905,0.81664,-0.22411,0.42445,-0.04244,0.34975,0.06621,0.28183,-0.20883,0.51731,-0.03176,0.50369,-0.03351,0.34635,0.09847,0.70798,-0.01868,0.39559,-0.03226,g
96 | 1,0,0.79157,0.16851,0,0,0.56541,0.06874,0.39468,1,0.38359,0.99557,-0.02439,0.53215,0.23725,0.12860,-0.02661,0.95122,-0.50998,0.84922,-0.10200,0.38803,-0.42572,0.23725,-0.91574,0.80710,-0.34146,0.88248,-1,0.69401,-1,0.12860,0,0,b
97 | 1,0,0.90116,0.16607,0.79299,0.37379,0.72990,0.50515,0.59784,0.72997,0.44303,0.81152,0.24412,0.87493,0.06438,0.85038,-0.12611,0.87396,-0.28739,0.79617,-0.46635,0.65924,-0.57135,0.53805,-0.68159,0.39951,-0.71844,0.25835,-0.72369,0.11218,-0.71475,-0.05525,-0.67699,-0.19904,g
98 | 1,0,0.97714,0.19049,0.82683,0.46259,0.71771,0.58732,0.47968,0.84278,0.31409,0.92643,0.10289,0.93945,-0.13254,0.84290,-0.32020,0.91624,-0.52145,0.79525,-0.68274,0.49508,-0.77408,0.33537,-0.85376,0.17849,-0.83314,-0.01358,-0.82366,-0.19321,-0.67289,-0.33662,-0.59943,-0.49700,g
99 | 1,0,-1,-1,0,0,0.50814,-0.78502,0.60586,0.32899,-1,-0.41368,0,0,0,0,1,-0.26710,0.36482,-0.63518,0.97068,-1,-1,-1,1,-0.59609,-1,-1,-1,-1,1,-1,0,0,b
100 | 1,0,0.74084,0.04974,0.79074,0.02543,0.78575,0.03793,0.66230,0.09948,0.67801,0.31152,0.75934,0.07348,0.74695,0.08442,0.70681,-0.07853,0.63613,0,0.70021,0.11355,0.68183,0.12185,0.67016,0.15445,0.64158,0.13608,0.65707,0.17539,0.59759,0.14697,0.57455,0.15114,g
101 | 1,0,1,-1,0,0,0.77941,-0.99265,0.80882,0.55147,-0.41912,-0.94853,0,0,0,0,0.72059,-0.77206,0.73529,-0.60294,0,0,0.18382,-1,-1,-1,-1,-1,1,-1,1,-1,0,0,b
102 | 1,0,1,0.01709,0.96215,-0.03142,1,-0.03436,1,-0.05071,0.99026,-0.07092,0.99173,-0.09002,1,-0.15727,1,-0.14257,0.98310,-0.11813,1,-0.18519,1,-0.19272,0.98971,-0.22083,0.96490,-0.20243,0.94599,-0.17123,0.96436,-0.22561,0.87011,-0.23296,g
103 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,-1,0,0,0,0,0,0,b
104 | 1,0,0.95704,-0.12095,0.63318,-0.12690,0.96365,-0.18242,0.97026,0.08460,0.92003,-0.01124,0.83543,-0.24719,1,-0.31395,0.99273,-0.21216,0.98678,-0.21018,1,-0.27165,0.93126,-0.39458,1,-0.19233,0.88793,-0.31565,0.81428,-0.23728,0.89095,-0.31857,0.69531,-0.41573,g
105 | 1,0,0.28409,-0.31818,0,0,0.68182,-1,0.30682,0.95833,0.64394,0.06439,0.34848,-0.84848,0,0,0.59091,-0.35985,0.45076,-0.80682,0,0,0,0,0.24242,0.17803,1,-0.23864,0.06061,-0.48485,0.16288,-0.70076,0,0,b
106 | 1,0,0.94490,-0.49311,1,-0.03692,0.98898,-0.87052,0.90083,0.66942,1,-0.10104,1,-0.12493,1,-0.15017,1,-0.17681,1,-0.20491,1,-0.23452,1,-0.26571,1,-0.29852,1,-0.33304,1,-0.36931,1,-0.40740,1,-0.44739,g
107 | 1,0,0,0,0,0,0,0,0,0,0.62195,1,0,0,0,0,0.36585,-0.71951,0.56098,-1,0,0,0,0,0,0,1,0.10976,0,0,0,0,0,0,b
108 | 1,0,0.99449,0.00526,0.84082,-0.11313,0.88237,-0.16431,0.99061,-0.06257,0.96484,-0.07496,0.85221,0.02966,0.87161,-0.20848,0.93881,-0.12977,0.98298,-0.08935,0.89876,0.00075,0.87836,-0.05882,0.93368,-0.19872,0.87579,-0.17806,0.94294,-0.16581,0.80253,-0.25741,0.76586,-0.27794,g
109 | 1,0,0.10135,0.10811,0,0,0,0,0.54730,0.82432,0.31081,1,0,0,0,0,0.37162,-1,0.33108,-1,0,0,0,0,-0.42568,-1,1,-1,0.55405,-0.23649,0,0,0,0,b
110 | 1,0,1,-0.57224,0.99150,-0.73371,0.89518,-0.97450,1,-0.35818,1,-0.23229,0.62890,-0.86402,1,-0.57535,1,-0.79603,0.76771,-0.88952,0.96601,-1,0.70120,-0.74896,0.61946,-0.76904,0.53777,-0.77986,0.81020,-1,1,-1,0.30445,-0.76112,g
111 | 1,0,0.65909,-0.62879,0,0,0,0,0.77273,1,1,-0.28030,0,0,0,0,0.62121,-0.22727,0.84091,-1,1,-1,0,0,0,0,1,-0.93939,-0.12879,-0.93182,0,0,0,0,b
112 | 1,0,0.86284,0.19310,0.80920,0.41149,0.67203,0.55785,0.54559,0.69962,0.36705,0.81533,0.19617,0.85671,-0.04061,0.86284,-0.17241,0.75785,-0.34100,0.65747,-0.48199,0.56092,-0.60230,0.40996,-0.59234,0.25747,-0.63038,0.08818,-0.57241,-0.07816,-0.54866,-0.19923,-0.42912,-0.31954,g
113 | 1,0,0.42000,-0.61000,0,0,1,-1,0.90000,1,0.43000,0.64000,0,0,0,0,0.67000,-0.29000,0.84000,-1,0,0,0,0,0.21000,0.68000,1,0.22000,0,0,0,0,0,0,b
114 | 1,0,1,0.23395,0.91404,0.52013,0.78020,0.72144,0.47660,0.84222,0.27639,0.91730,0.09467,0.88248,-0.21980,0.91404,-0.34168,0.75517,-0.51360,0.64527,-0.64527,0.44614,-0.74102,0.29162,-0.70838,0.03591,-0.71731,-0.11943,-0.64962,-0.28183,-0.51251,-0.44505,-0.37432,-0.53319,g
115 | 1,0,0.91353,0.81586,-0.72973,1,-0.39466,0.55735,0.05405,0.29730,-0.18599,-0.10241,-0.03158,-0.08970,0.01401,-0.03403,0.01108,-0.00537,0.00342,0.00097,0.00048,0.00075,-0.00003,0.00019,-0.00003,0.00002,-0.00001,0,0,0,0,0,0,0,b
116 | 1,0,0.21429,-0.09524,0.33333,0.07143,0.19048,0.19048,0.23810,0.09524,0.40476,0.02381,0.30952,-0.04762,0.30952,-0.04762,0.28571,-0.11905,0.33333,0.04762,0.30952,0,0.21429,-0.11905,0.35714,-0.04762,0.22109,-0.02290,0.19048,0,0.16997,-0.02034,0.14694,-0.01877,g
117 | 1,0,1,-0.14754,1,0.04918,0.57377,-0.01639,0.65574,0.01639,0.85246,-0.03279,0.72131,0,0.68852,-0.16393,0.19672,-0.14754,0.65558,-0.17176,0.67213,0.03279,1,-0.29508,0.31148,-0.34426,0.52385,-0.20325,0.32787,-0.03279,0.27869,-0.44262,0.49180,-0.06557,b
118 | 1,0,0.98182,0,0.88627,0.03131,0.86249,0.04572,0.80000,0,0.69091,0.04545,0.79343,0.08436,0.77118,0.09579,0.62727,0.25455,0.68182,0.12727,0.70674,0.12608,0.68604,0.13493,0.74545,0.22727,0.64581,0.15088,0.67273,0.02727,0.60715,0.16465,0.58840,0.17077,g
119 | 1,0,0.39286,0.52381,-0.78824,0.11342,-0.16628,-0.76378,0.66667,0.01190,0.82143,0.40476,-0.67230,0.30729,-0.34797,-0.63668,0.46429,0.15476,0.54762,0.05952,-0.51830,0.44961,-0.47651,-0.47594,0.32143,0.70238,0.51971,0.38848,0.57143,0.39286,-0.54891,-0.29915,0.25441,-0.55837,b
120 | 1,0,0.86889,-0.07111,1,-0.02494,1,-0.06889,0.87778,0.00222,0.83556,-0.06444,1,-0.07287,1,-0.20000,0.86889,0.05333,0.88000,-0.03778,1,-0.11526,1,-0.18667,0.84444,0.03556,1,-0.14162,0.82222,-0.14667,1,-0.15609,1,-0.44222,g
121 | 1,0,0.43636,-0.12727,0.58182,-0.14545,0.18182,-0.67273,0.34545,-0.03636,0.29091,-0.05455,0.29091,0.29091,0.36364,-0.41818,0.20000,-0.01818,0.36364,0.05455,0.12727,0.49091,0.61818,0.16364,0.32727,0.16364,0.41098,-0.07027,0.34545,-0.05455,0.12727,-0.36364,0.29091,-0.29091,b
122 | 1,0,1,-0.92453,1,0.75472,0.49057,-0.05660,0.62264,0,1,-0.00054,0.45283,0.07547,0.62264,-0.05660,0.98878,-0.00085,0.52830,0,0.52830,0.07547,0.95190,-0.00112,1,0.79245,0.92192,-0.00128,0.94340,-1,1,0.43396,0.43396,-0.11321,g
123 | 1,0,0.73810,0.83333,-0.76190,-0.23810,0.33333,-0.14286,0.45238,-0.14286,-0.67285,0.12808,0.33333,0,0.28571,-0.07143,-0.38214,0.51163,0.23810,0.02381,0.45238,0.04762,0.16667,-0.26190,-0.57255,-0.10234,0.24889,-0.51079,1,0,-0.66667,-0.04762,0.26190,0.02381,b
124 | 1,0,0.43750,0.04167,0.58333,-0.10417,0.39583,0,0.33333,-0.06250,0.47917,0,0.29167,0.10417,0.54167,0.02083,0.43750,-0.22917,0.35417,-0.22917,0.33333,0.08333,0.25000,0.18750,0.39583,-0.18750,0.44012,-0.10064,0.41667,-0.08333,0.58333,-0.31250,0.33333,-0.06250,g
125 | 1,0,1,1,0,0,0,0,0,0,0.47744,-0.89098,-0.51504,0.45489,-0.95489,0.28571,0.64662,1,0,0,0,0,0.62030,0.20301,-1,-1,1,-1,1,1,0,0,0,0,b
126 | 1,0,0.95217,0.06595,0.93614,0.13030,0.90996,0.19152,0.84881,-0.49962,0.90023,0.61320,0.77937,0.34328,0.72254,0.37988,0.66145,0.40844,0.95472,0.59862,0.53258,0.44088,0.46773,0.44511,0.40440,0.44199,0.34374,0.43221,0.90330,1,0.23405,0.39620,0.18632,0.37191,g
127 | 1,0,0.59840,0.40332,0.82809,0.80521,0.76001,0.70709,0.84010,-0.10984,0.97311,0.07981,0.95824,-0.85727,0.91962,0.88444,0.95452,-0.05206,0.88673,0.18135,0.98484,-0.69594,0.86670,-0.85755,0.28604,-0.30063,1,0.17076,0.62958,0.42677,0.87757,0.81007,0.81979,0.68822,b
128 | 1,0,0.95882,0.10129,1,-0.01918,0.98313,0.02555,0.96974,-0.09316,0.98955,-0.02716,0.97980,-0.03096,1,-0.05343,1,-0.05179,0.93840,0.01557,0.97620,-0.09284,0.97889,-0.05318,0.91567,-0.15675,0.95677,-0.06995,0.90978,0.01307,1,-0.10797,0.93144,-0.06888,g
129 | 1,0,0,0,-0.33672,0.85388,0,0,0.68869,-1,0.97078,0.31385,-0.26048,-0.59212,-0.30241,0.65565,0.94155,0.16391,0,0,0,0,-0.18043,-1,0,0,1,-1,0,0,0.04447,0.61881,0,0,b
130 | 1,0,0.96933,0.00876,1,0.00843,0.98658,-0.00763,0.97868,-0.02844,0.99820,-0.03510,1,-0.01271,1,-0.02581,1,-0.01175,0.98485,0.00025,1,-0.02612,1,-0.04744,0.96019,-0.04527,0.99188,-0.03473,0.97020,-0.02478,1,-0.03855,0.98420,-0.04112,g
131 | 1,0,0,0,0.98919,-0.22703,0.18919,-0.05405,0,0,0.93243,0.07297,1,-0.20000,1,0.07027,1,-0.11351,0,0,1,-0.21081,1,-0.41622,0,0,1,-0.17568,0,0,1,-0.25946,0.28919,-0.15676,b
132 | 1,0,0.64122,0.01403,0.34146,-0.02439,0.52751,0.03466,0.19512,0.12195,0.43313,0.04755,0.21951,0.04878,0.29268,0,0.36585,0,0.31707,0.07317,0.26829,0.12195,0.23698,0.05813,0.21951,0.09756,0.19304,0.05641,0.17410,0.05504,0.19512,0,0.17073,0.07317,g
133 | 1,0,1,1,1,-1,0,0,0,0,1,1,1,-1,1,1,1,-1,0,0,0,0,1,-0.27778,0,0,1,-1,1,1,1,-1,0,0,b
134 | 1,0,0.34694,0.20408,0.46939,0.24490,0.40816,0.20408,0.46939,0.44898,0.30612,0.59184,0.12245,0.55102,0,0.51020,-0.06122,0.55102,-0.20408,0.55102,-0.28571,0.44898,-0.28571,0.32653,-0.61224,0.22449,-0.46579,0.14895,-0.59184,0.18367,-0.34694,0,-0.26531,-0.24490,g
135 | 1,0,0,0,1,-1,0,0,0,0,1,1,1,-0.25342,1,0.23288,1,-1,0,0,0,0,1,1,0,0,1,-1,0,0,1,-1,0,0,b
136 | 1,0,0.89706,0.38235,0.91176,0.37500,0.74265,0.67647,0.45588,0.77941,0.19118,0.88971,-0.02206,0.86029,-0.20588,0.82353,-0.37500,0.67647,-0.5,0.47794,-0.73529,0.38235,-0.86029,0.08824,-0.74265,-0.12500,-0.67925,-0.24131,-0.55147,-0.42647,-0.44118,-0.50735,-0.28676,-0.56618,g
137 | 1,0,-1,0.28105,0.22222,0.15033,-0.75693,-0.70984,-0.30719,0.71242,-1,1,-0.81699,0.33987,-0.79085,-0.02614,-0.98039,-0.83007,-0.60131,-0.54248,-0.04575,-0.83007,0.94118,-0.94118,-1,-0.43137,0.74385,0.09176,-1,0.05229,0.18301,0.02614,-0.40201,-0.48241,b
138 | 1,0,0.26667,-0.10000,0.53333,0,0.33333,-0.13333,0.36667,0.11667,0.56667,0.01667,0.71667,0.08333,0.70000,-0.06667,0.53333,0.20000,0.41667,-0.01667,0.31667,0.20000,0.70000,0,0.25000,0.13333,0.46214,0.05439,0.40000,0.03333,0.46667,0.03333,0.41667,-0.05000,g
139 | 1,0,-0.26667,0.40000,-0.27303,0.12159,-0.17778,-0.04444,0.06192,-0.06879,0.04461,0.02575,-0.00885,0.02726,-0.01586,-0.00166,-0.00093,-0.00883,0.00470,-0.00153,0.00138,0.00238,-0.00114,0.00102,-0.00069,-0.00050,0.00019,-0.00043,0.00026,0.00005,0,0.00015,-0.00008,0.00002,b
140 | 1,0,1,-0.37838,0.64865,0.29730,0.64865,-0.24324,0.86486,0.18919,1,-0.27027,0.51351,0,0.62162,-0.05405,0.32432,-0.21622,0.71833,-0.17666,0.62162,0.05405,0.75676,0.13514,0.35135,-0.29730,0.61031,-0.22163,0.58478,-0.23027,0.72973,-0.59459,0.51351,-0.24324,g
141 | 1,0,0.94531,-0.03516,-1,-0.33203,-1,-0.01563,0.97266,0.01172,0.93359,-0.01953,-1,0.16406,-1,-0.00391,0.95313,-0.03516,0.92188,-0.02734,-0.99219,0.11719,-0.93359,0.34766,0.95703,-0.00391,0.82041,0.13758,0.90234,-0.06641,-1,-0.18750,-1,-0.34375,b
142 | 1,0,0.95202,0.02254,0.93757,-0.01272,0.93526,0.01214,0.96705,-0.01734,0.96936,0.00520,0.95665,-0.03064,0.95260,-0.00405,0.99480,-0.02659,0.99769,0.01792,0.93584,-0.04971,0.93815,-0.02370,0.97052,-0.04451,0.96215,-0.01647,0.97399,0.01908,0.95434,-0.03410,0.95838,0.00809,g
143 | 1,0,1,-0.05529,1,-1,0.5,-0.11111,0.36111,-0.22222,1,-0.25712,0.16667,-0.11111,1,-0.34660,1,-0.38853,1,-0.42862,0,-0.25000,1,-0.50333,1,-0.27778,1,-0.57092,1,-0.27778,1,-0.63156,1,-0.65935,b
144 | 1,0,0.31034,-0.10345,0.24138,-0.10345,0.20690,-0.06897,0.07405,-0.05431,0.03649,-0.03689,0.01707,-0.02383,0.00741,-0.01482,0.00281,-0.00893,0.00078,-0.00523,-0.00003,-0.00299,-0.00028,-0.00166,-0.00031,-0.00090,-0.00025,-0.00048,-0.00018,-0.00024,-0.00012,-0.00012,-0.00008,-0.00006,g
145 | 1,0,0.62745,-0.07843,0.72549,0,0.60784,-0.07843,0.62745,-0.11765,0.68627,-0.11765,0.66667,-0.13725,0.64706,-0.09804,0.54902,-0.11765,0.54902,-0.21569,0.58824,-0.19608,0.66667,-0.23529,0.45098,-0.25490,0.52409,-0.24668,0.56863,-0.31373,0.43137,-0.21569,0.47059,-0.27451,b
146 | 1,0,0.25000,0.16667,0.46667,0.26667,0.19036,0.23966,0.07766,0.19939,0.01070,0.14922,-0.02367,0.10188,-0.03685,0.06317,-0.03766,0.03458,-0.03230,0.01532,-0.02474,0.00357,-0.01726,-0.00273,-0.01097,-0.00539,-0.00621,-0.00586,-0.00294,-0.00520,-0.00089,-0.00408,0.00025,-0.00291,g
147 | 1,0,-0.65625,0.15625,0.06250,0,0,0.06250,0.62500,0.06250,0.18750,0,-0.03125,0.09375,0.06250,0,0.15625,-0.15625,0.43750,-0.37500,0,-0.09375,0,0,0.03125,-0.46875,0.03125,0,-0.71875,0.03125,-0.03125,0,0,0.09375,b
148 | 1,0,1,-0.01081,1,-0.02703,1,-0.06486,0.95135,-0.01622,0.98919,-0.03243,0.98919,0.08649,1,-0.06486,0.95135,0.09189,0.97838,-0.00541,1,0.06486,1,0.04324,0.97838,0.09189,0.98556,0.01251,1,-0.03243,1,0.02703,1,-0.07027,g
149 | 1,0,0.85271,0.05426,1,0.08069,1,1,0.91473,-0.00775,0.83721,0.03876,1,0.27153,1,1,0.81395,0.04651,0.90698,0.11628,1,0.50670,1,-1,0.80620,0.03876,1,0.71613,0.84496,0.06977,1,0.87317,1,1,b
150 | 1,0,0.90374,-0.01604,1,0.08021,1,0.01604,0.93048,0.00535,0.93583,-0.01604,1,0,1,0.06417,1,0.04813,0.91444,0.04278,0.96791,0.02139,0.98930,-0.01604,0.96257,0.05348,0.96974,0.04452,0.87701,0.01070,1,0.09091,0.97861,0.06417,g
151 | 1,0,-0.20500,0.28750,0.23000,0.10000,0.28250,0.31750,0.32250,0.35000,0.36285,-0.34617,0.09250,0.27500,-0.09500,0.21000,-0.08750,0.23500,-0.34187,0.31408,-0.48000,-0.08000,0.29908,0.33176,-0.58000,-0.24000,0.32190,-0.28475,-0.47000,0.18500,-0.27104,-0.31228,0.40445,0.03050,b
152 | 1,0,0.60000,0.03333,0.63333,0.06667,0.70000,0.06667,0.70000,0,0.63333,0,0.80000,0,0.73333,0,0.70000,0.10000,0.66667,0.10000,0.73333,-0.03333,0.76667,0,0.63333,0.13333,0.65932,0.10168,0.60000,0.13333,0.60000,0.16667,0.63333,0.16667,g
153 | 1,0,0.05866,-0.00838,0.06704,0.00838,0,-0.01117,0.00559,-0.03911,0.01676,-0.07542,-0.00559,0.05307,0.06425,-0.03352,0,0.09497,-0.06425,0.07542,-0.04749,0.02514,0.02793,-0.00559,0.00838,0.00559,0.10335,-0.00838,0.03073,-0.00279,0.04469,0,0.04749,-0.03352,b
154 | 1,0,0.94653,0.28713,0.72554,0.67248,0.47564,0.82455,0.01267,0.89109,-0.24871,0.84475,-0.47644,0.56079,-0.75881,0.41743,-0.66455,0.07208,-0.65426,-0.19525,-0.52475,-0.44000,-0.30851,-0.55089,-0.04119,-0.64792,0.16085,-0.56420,0.36752,-0.41901,0.46059,-0.22535,0.50376,-0.05980,g
155 | 1,0,0.05460,0.01437,-0.02586,0.04598,0.01437,0.04598,-0.07759,0.00862,0.01724,-0.06609,-0.03736,0.04310,-0.08333,-0.04598,-0.09483,0.08046,-0.04023,0.05172,0.02011,0.02299,-0.03736,-0.01149,0.03161,-0.00862,0.00862,0.01724,0.02586,0.01149,0.02586,0.01149,-0.04598,-0.00575,b
156 | 1,0,0.72414,-0.01084,0.79704,0.01084,0.80000,0.00197,0.79015,0.01084,0.78424,-0.00985,0.83350,0.03251,0.85123,0.01675,0.80099,-0.00788,0.79113,-0.02956,0.75961,0.03350,0.74778,0.05517,0.72611,-0.01478,0.78041,0.00612,0.74089,-0.05025,0.82956,0.02956,0.79015,0.00788,g
157 | 1,0,0.03852,0.02568,0.00428,0,0.01997,-0.01997,0.02140,-0.04993,-0.04850,-0.01284,0.01427,-0.02282,0,-0.03281,-0.04708,-0.02853,-0.01712,0.03566,0.02140,0.00428,0.05136,-0.02282,0.05136,0.01854,0.03994,0.01569,0.01997,0.00713,-0.02568,-0.01854,-0.01427,0.01997,b
158 | 1,0,0.47090,0.22751,0.42328,0.33598,0.25661,0.47619,0.01852,0.49471,-0.02116,0.53968,-0.34127,0.31217,-0.41270,0.32540,-0.51587,0.06878,-0.5,-0.11640,-0.14815,-0.14550,-0.14815,-0.38095,-0.23280,0.00265,0.03574,-0.31739,0.15873,-0.21693,0.24868,-0.24339,0.26720,0.04233,g
159 | 1,0,0.08696,0.00686,0.13959,-0.04119,0.10526,-0.08238,0.12586,-0.06178,0.23341,-0.01144,0.12357,0.07780,0.14645,-0.13501,0.29062,-0.04805,0.18993,0.07323,0.11670,0,0.11213,-0.00229,0.15103,-0.10297,0.08467,0.01373,0.11213,-0.06636,0.09611,-0.07323,0.11670,-0.06865,b
160 | 1,0,0.94333,0.38574,0.48263,0.64534,0.21572,0.77514,-0.55941,0.64899,-0.73675,0.42048,-0.76051,0,-0.62706,-0.31079,-0.38391,-0.62157,-0.12797,-0.69287,0.49909,-0.63620,0.71481,-0.37660,0.73857,-0.05484,0.60098,0.30384,0.45521,0.60512,0.02742,0.54479,-0.21572,0.50457,g
161 | 1,0,0.01975,0.00705,0.04090,-0.00846,0.02116,0.01128,0.01128,0.04372,0.00282,0.00141,0.01975,-0.03103,-0.01975,0.06065,-0.04090,0.02680,-0.02398,-0.00423,0.04372,-0.02539,0.01834,0,0,-0.01269,0.01834,-0.01128,0.00564,-0.01551,-0.01693,-0.02398,0.00705,0,b
162 | 1,0,0.85736,0.00075,0.81927,-0.05676,0.77521,-0.04182,0.84317,0.09037,0.86258,0.11949,0.88051,-0.06124,0.78342,0.03510,0.83719,-0.06796,0.83570,-0.14190,0.88125,0.01195,0.90515,0.02240,0.79686,-0.01942,0.82383,-0.03678,0.88125,-0.06423,0.73936,-0.01942,0.79089,-0.09186,g
163 | 1,0,1,-1,1,1,-1,1,1,-1,1,-1,-1,-1,-1,1,1,1,1,1,-1,1,1,-1,1,-1,1,1,1,1,-1,1,-1,1,b
164 | 1,0,0.85209,0.39252,0.38887,0.76432,0.08858,0.98903,-0.42625,0.88744,-0.76229,0.49980,-0.93092,0.10768,-0.85900,-0.31044,-0.66030,-0.55262,-0.19260,-0.86063,0.28444,-0.80496,0.64649,-0.35230,0.77814,-0.23324,0.71698,0.21343,0.37830,0.58310,0.19667,0.66315,-0.11215,0.64933,g
165 | 1,0,1,1,1,0.51250,0.62500,-1,1,1,0.02500,0.03125,1,1,0,0,1,-1,1,1,1,1,0.31250,1,1,1,1,1,1,1,-0.94375,1,0,0,b
166 | 1,0,1,0.54902,0.62745,1,0.01961,1,-0.49020,0.92157,-0.82353,0.58824,-1,0.11765,-0.96078,-0.33333,-0.64706,-0.68627,-0.23529,-0.86275,0.35294,-1,0.74510,-0.72549,0.92157,-0.21569,0.92874,0.21876,0.72549,0.56863,0.23529,0.90196,-0.11765,0.90196,g
167 | 1,0,0,0,-1,-1,-1,1,0,0,-1,1,1,1,1,-1,0,0,0,0,-1,-1,-1,1,1,0.43750,1,-1,0,0,-1,-1,-1,1,b
168 | 1,0,0.44444,0.44444,0.53695,0.90763,-0.22222,1,-0.33333,0.88889,-1,0.33333,-1,-0.11111,-1,-0.22222,-0.66667,-0.77778,0.55556,-1,-0.22222,-0.77778,0.77778,-0.22222,0.33333,0,0.92120,0.45019,0.57454,0.84353,0.22222,1,-0.55556,1,g
169 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,b
170 | 1,0,1,0,1,0,0.5,0.50000,0.75000,0,0.91201,0.12094,0.89067,0.14210,0.86922,0.16228,0.75000,0.25000,0.75000,0.5,0.75000,0,1,-0.25000,0.5,0.50000,0.73944,0.26388,0.75000,0.25000,0.69635,0.29074,0.67493,0.30293,g
171 | 0,0,-1,1,1,1,0,0,1,-1,1,-1,1,-1,-1,-1,0,0,-1,-1,0,0,0,0,-1,-1,1,-1,1,1,-1,-1,0,0,b
172 | 1,0,1,0,1,0,0.66667,0.11111,1,-0.11111,0.88889,-0.11111,1,-0.22222,0.77778,0,0.77778,0,1,-0.11111,0.77778,-0.11111,0.66667,-0.11111,0.66667,0,0.90347,-0.05352,1,0.11111,0.88889,-0.11111,1,0,g
173 | 0,0,0,0,0,0,0,0,0,0,0,0,-1,-1,0,0,1,0.75000,0,0,0,0,-1,1,0,0,1,-1,-1,-1,1,1,0,0,b
174 | 1,0,1,0.45455,1,-0.45455,1,0.09091,1,-0.09091,1,0,1,-0.27273,1,-0.18182,1,0.09091,1,0,1,-0.36364,1,0.09091,1,-0.09091,1,-0.04914,1,0.45455,1,-0.27273,1,-0.18182,g
175 | 1,0,0.62121,-0.63636,0,0,0,0,0.34470,0.28788,0.42803,0.39394,-0.07576,0.51894,0.36364,0.31439,-0.53788,0.32955,0.12121,-0.14773,0.01894,-0.53409,-0.57576,0.17803,0.29167,-0.27273,0.25758,-0.57576,0.43182,0.24242,0.18182,-0.02273,0.17045,-0.41667,b
176 | 1,0,1,0.11765,1,0.23529,1,0.41176,1,0.05882,1,0.23529,1,0.11765,1,0.47059,1,-0.05882,1,-0.11765,1,0.35294,1,0.41176,1,-0.11765,1,0.20225,1,0.05882,1,0.35294,1,0.23529,g
177 | 1,0,0,0,-1,-0.62766,1,0.51064,0.07979,-0.23404,-1,-0.36170,0.12766,-0.59043,1,-1,0,0,0.82979,-0.07979,-0.25000,1,0.17021,-0.70745,0,0,-0.19149,-0.46809,-0.22340,-0.48936,0.74468,0.90426,-0.67553,0.45745,b
178 | 1,0,0.91667,0.29167,0.83333,-0.16667,0.70833,0.25000,0.87500,-0.08333,0.91667,0.04167,0.83333,0.12500,0.70833,0,0.87500,0.04167,1,0.08333,0.66667,-0.08333,0.75000,0.16667,0.83333,-0.12500,0.83796,0.05503,1,0.20833,0.70833,0,0.70833,0.04167,g
179 | 1,0,0.18590,-0.16667,0,0,0,0,0,0,0,0,0.11538,-0.19071,0,0,0,0,0,0,0,0,-0.05128,-0.06571,0.07853,0.08974,0.17308,-0.10897,0.12500,0.09615,0.02564,-0.04808,0.16827,0.19551,b
180 | 1,0,1,-0.08183,1,-0.11326,0.99246,-0.29802,1,-0.33075,0.96662,-0.34281,0.85788,-0.47265,0.91904,-0.48170,0.73084,-0.65224,0.68131,-0.63544,0.82450,-0.78316,0.58829,-0.74785,0.67033,-0.96296,0.48757,-0.85669,0.37941,-0.83893,0.24117,-0.88846,0.29221,-0.89621,g
181 | 1,0,1,1,-1,1,-1,-0.82456,0.34649,0.21053,0.46053,0.07018,0.22807,0.05702,0.35088,0.34649,0.72807,-0.03947,0.22807,0.53070,0,0,-0.29825,-0.16228,1,-0.66667,1,-1,1,-0.24561,0.35088,0.20175,0.82895,0.07895,b
182 | 1,0,1,0.24077,0.99815,0.00369,0.80244,-0.30133,0.89919,-0.23486,0.70643,-0.24077,0.73855,-0.30539,0.71492,-0.36078,0.47194,-0.61189,0.40473,-0.55059,0.61041,-0.39328,0.53176,-0.32681,0.23966,-0.52142,0.29208,-0.48390,0.12777,-0.39143,0.15657,-0.51329,0.18353,-0.46603,g
183 | 0,0,-1,1,1,-1,0,0,0,0,1,-1,1,1,0,0,1,-1,0,0,0,0,1,1,-1,1,1,-1,-1,1,-1,-1,0,0,b
184 | 1,0,0.92247,-0.19448,0.96419,-0.17674,0.87024,-0.22602,0.81702,-0.27070,0.79271,-0.28909,0.70302,-0.49639,0.63338,-0.49967,0.37254,-0.70729,0.27070,-0.72109,0.40506,-0.54172,0.33509,-0.59691,0.14750,-0.63601,0.09312,-0.59589,-0.07162,-0.54928,-0.01840,-0.54074,-0.07457,-0.47898,g
185 | 1,0,-1,-1,-0.50694,1,1,-1,1,0.53819,0,0,0.23958,-1,1,1,0,0,1,1,1,1,0,0,-0.71528,1,0.33333,-1,1,-1,0.69792,-1,0.47569,1,b
186 | 1,0,0.84177,0.43460,0.5,0.76160,0.09916,0.93460,-0.37764,0.88186,-0.72363,0.61181,-0.93882,0.19409,-0.86709,-0.25527,-0.62869,-0.65612,-0.25105,-0.85654,0.16245,-0.86498,0.51477,-0.66878,0.74895,-0.28903,0.77937,0.07933,0.64135,0.42827,0.31435,0.62447,-0.00422,0.69409,g
187 | 1,0,1,1,0,0,1,-1,-1,-1,1,1,1,-1,0,0,1,-1,1,1,0,0,1,-1,-1,-1,1,1,-1,1,-1,1,0,0,b
188 | 1,0,1,0.63548,1,1,0.77123,1,-0.33333,1,-1,1,0,1,-1,1,-1,0,-1,-0.66667,-1,-0.92536,-1,-0.33333,-0.33333,-1,0.19235,-1,1,-1,0,-1,1,-0.66667,g
189 | 0,0,-1,1,-1,-1,0,0,-1,1,1,-1,-1,-1,-1,1,0,0,-1,-1,-1,1,0,0,1,-1,1,1,1,-1,1,1,0,0,b
190 | 1,0,1,0.06843,1,0.14211,1,0.22108,1,-0.12500,1,0.39495,1,0.48981,1,0.58986,-0.37500,1,1,0,1,0.92001,1,1,1,1,1,1,1,0.25000,1,1,1,1,g
191 | 0,0,-1,-1,0,0,0,0,0,0,0,0,0,0,1,-1,0,0,-1,-1,0,0,1,1,1,-1,1,-1,0,0,0,0,0,0,b
192 | 1,0,0.64947,-0.07896,0.58264,-0.14380,-0.13129,-0.21384,0.29796,0.04403,0.38096,-0.26339,0.28931,-0.31997,0.03459,-0.18947,0.20269,-0.29441,0.15196,-0.29052,0.09513,-0.31525,0.06556,-0.26795,0.03004,-0.25124,-0.00046,-0.23210,-0.02612,-0.21129,-0.04717,-0.18950,0.01336,-0.27201,g
193 | 1,0,0,0,0,0,0,0,0,0,1,-0.33333,0.16667,0.26042,0,0,0,0,0,0,-0.19792,-0.21875,-0.16667,0.90625,-1,0.5,0.04167,0.75000,-0.22917,-1,-0.12500,-0.27083,-0.19792,-0.93750,b
194 | 1,0,1,0.05149,0.99363,0.10123,0.96142,0.14756,0.95513,-0.26496,0.66026,0.54701,0.80426,0.25283,0.73781,0.27380,0.66775,0.28714,0.59615,0.29304,0.52494,0.29200,0.45582,0.28476,0.39023,0.27226,0.32930,0.25553,0.27381,0.23568,0.22427,0.21378,0.18086,0.19083,g
195 | 1,0,1,-0.09524,-1,-1,-1,-1,1,0.31746,0.81349,0.76190,-1,-1,-1,1,0.47364,1,1,1,0.68839,-1,-1,-1,0.82937,0.36508,1,1,1,0.50794,-1,-0.32540,-1,0.72831,b
196 | 1,0,0.93669,-0.00190,0.60761,0.43204,0.92314,-0.40129,0.93123,0.16828,0.96197,0.09061,0.99676,0.08172,0.91586,0.05097,0.84628,-0.25324,0.87379,-0.14482,0.84871,0.26133,0.75081,-0.03641,0.84547,-0.02589,0.87293,-0.02302,0.98544,0.09385,0.78317,-0.10194,0.85841,-0.14725,g
197 | 1,0,1,-1,1,1,1,1,1,-0.5,1,1,1,1,1,1,0,0,1,1,1,1,1,-1,1,1,1,0.62500,1,-0.75000,-0.75000,1,1,1,b
198 | 1,0,1,0.23058,1,-0.78509,1,-0.10401,1,0.15414,1,0.27820,0.98120,-0.06861,1,0.06610,0.95802,-0.18954,0.83584,-0.15633,0.97400,0.03728,0.99624,0.09242,1,-0.01253,0.96238,-0.04597,0.91165,0.03885,1,-0.13722,0.96523,-0.11717,g
199 | 1,0,0.36876,-1,-1,-1,-0.07661,1,1,0.95041,0.74597,-0.38710,-1,-0.79313,-0.09677,1,0.48684,0.46502,0.31755,-0.27461,-0.14343,-0.20188,-0.11976,0.06895,0.03021,0.06639,0.03443,-0.01186,-0.00403,-0.01672,-0.00761,0.00108,0.00015,0.00325,b
200 | 1,0,0.79847,0.38265,0.80804,-0.16964,1,-0.07653,0.98151,-0.07398,0.70217,0.20663,0.99745,0.02105,0.98214,0.02487,1,-0.13074,0.95663,0.07717,1,0.00191,0.90306,0.30804,1,-0.14541,1,-0.00394,0.75638,0.07908,1,-0.18750,1,-0.05740,g
201 | 0,0,0,0,0,0,0,0,0,0,0,0,1,1,1,1,1,-1,0,0,1,1,1,-1,1,1,1,0,1,1,1,-1,0,0,b
202 | 1,0,1,-0.28428,1,-0.25346,0.94623,-0.35094,1,-0.30566,0.92736,-0.49057,0.90818,-0.44119,0.75723,-0.58899,0.69748,-0.58019,0.59623,-0.57579,0.68459,-0.70975,0.54465,-0.87327,0.49214,-0.73333,0.35504,-0.76054,0.26352,-0.78239,0.16604,-0.73145,0.13994,-0.70000,g
203 | 1,0,0,0,0,0,0,0,-0.85000,-1,0,0,1,-1,0,0,-1,-1,-1,-1,1,-1,-0.60000,-1,1,1,-1,-0.20000,1,-1,0,1,0,0,b
204 | 1,0,1,0.09091,0.95455,-0.09091,0.77273,0,1,0,0.95455,0,1,0.04545,0.90909,-0.04545,1,0,1,0,0.86364,0.09091,0.77273,0.09091,0.90909,0.04545,0.91541,0.02897,0.95455,0.09091,0.86364,-0.09091,0.86364,0.04545,g
205 | 0,0,0,0,-1,1,1,1,-1,-1,0,0,-1,-1,-1,-0.31250,-1,-1,1,-1,1,-1,0,0,1,-1,-1,-1,0,0,1,-1,0,0,b
206 | 1,0,0.91176,-0.08824,0.97059,0.17647,0.82353,0.08824,0.91176,-0.02941,0.97059,-0.17647,0.97059,0.14706,0.94118,0.02941,1,0,1,0,0.76471,0.11765,0.88235,0.02941,0.85294,0.02941,0.92663,0.02600,0.94118,-0.11765,0.97059,0.05882,0.91176,0.05882,g
207 | 1,0,-1,1,-1,0.15244,0.28354,1,-1,1,-1,-1,1,1,-1,-0.23476,0.28301,-1,1,1,-0.31402,-1,-1,-1,1,-1,-1,-0.03578,1,-1,-1,-0.32317,0.14939,1,b
208 | 1,0,0.47368,-0.10526,0.83781,0.01756,0.83155,0.02615,0.68421,-0.05263,0.68421,0,0.79856,0.05028,0.78315,0.05756,0.84211,0.47368,1,0.05263,0.72550,0.07631,0.70301,0.08141,0.42105,0.21053,0.65419,0.08968,0.52632,-0.21053,0.60150,0.09534,0.57418,0.09719,g
209 | 1,0,-0.00641,-0.5,0,0,-0.01923,1,0,0,0,0,0,0,0,0,0,0,0.31410,0.92949,-0.35256,0.74359,-0.34615,-0.80769,0,0,-0.61538,-0.51282,0,0,0,0,0,0,b
210 | 1,0,1,0.45455,1,0.54545,0.81818,0.63636,1,-0.09091,1,0,0.81818,-0.45455,0.63636,0.27273,1,-0.63636,1,-0.27273,0.90909,-0.45455,1,0.07750,1,-0.09091,1,0.08867,1,0.36364,1,0.63636,0.72727,0.27273,g
211 | 0,0,-1,-1,1,-1,-1,1,0,0,1,-1,1,-1,0,0,0,0,0,0,-1,1,1,-1,-1,1,1,1,0,0,1,0.5,0,0,b
212 | 1,0,0.45455,0.09091,0.63636,0.09091,0.27273,0.18182,0.63636,0,0.36364,-0.09091,0.45455,-0.09091,0.48612,-0.01343,0.63636,-0.18182,0.45455,0,0.36364,-0.09091,0.27273,0.18182,0.36364,-0.09091,0.34442,-0.01768,0.27273,0,0.36364,0,0.28985,-0.01832,g
213 | 1,0,-1,-0.59677,0,0,-1,0.64516,-0.87097,1,0,0,0,0,0,0,0,0,0,0,-1,-1,0,0,0.29839,0.23387,1,0.51613,0,0,0,0,0,0,b
214 | 1,0,1,0.14286,1,0.71429,1,0.71429,1,-0.14286,0.85714,-0.14286,1,0.02534,1,0,0.42857,-0.14286,1,0.03617,1,-0.28571,1,0,0.28571,-0.28571,1,0.04891,1,0.05182,1,0.57143,1,0,g
215 | 0,0,1,1,1,-1,1,1,1,1,1,1,1,-1,1,1,1,-1,1,-1,1,1,1,1,1,-1,1,1,1,1,1,1,1,1,b
216 | 1,0,0.87032,0.46972,0.53945,0.82161,0.10380,0.95275,-0.38033,0.87916,-0.73939,0.58226,-0.92099,0.16731,-0.82417,-0.24942,-0.59383,-0.63342,-0.24012,-0.82881,0.18823,-0.78699,0.51557,-0.57430,0.69274,-0.24843,0.69097,0.10484,0.52798,0.39762,0.25974,0.56573,-0.06739,0.57552,g
217 | 0,0,1,-1,1,1,1,-1,1,1,1,-1,1,-1,1,-1,1,1,1,1,1,1,1,-1,1,1,1,1,1,1,1,1,1,-1,b
218 | 1,0,0.92657,0.04174,0.89266,0.15766,0.86098,0.19791,0.83675,0.36526,0.80619,0.40198,0.76221,0.40552,0.66586,0.48360,0.60101,0.51752,0.53392,0.52180,0.48435,0.54212,0.42546,0.55684,0.33340,0.55274,0.26978,0.54214,0.22307,0.53448,0.14312,0.49124,0.11573,0.46571,g
219 | 0,0,1,1,1,-1,1,-1,1,1,0,0,1,-1,0,0,0,0,0,0,-1,1,1,1,0,0,1,1,0,0,-1,-1,0,0,b
220 | 1,0,0.93537,0.13645,0.93716,0.25359,0.85705,0.38779,0.79039,0.47127,0.72352,0.59942,0.65260,0.75000,0.50830,0.73586,0.41629,0.82742,0.25539,0.85952,0.13712,0.85615,0.00494,0.88869,-0.07361,0.79780,-0.20995,0.78004,-0.33169,0.71454,-0.38532,0.64363,-0.47419,0.55835,g
221 | 0,0,1,-1,-1,1,-1,1,1,1,1,1,-1,-1,-1,-1,1,1,1,-1,-1,-1,-1,-1,1,0,1,-1,1,-1,-1,1,-1,1,b
222 | 1,0,0.80627,0.13069,0.73061,0.24323,0.64615,0.19038,0.36923,0.45577,0.44793,0.46439,0.25000,0.57308,0.25192,0.37115,0.15215,0.51877,-0.09808,0.57500,-0.03462,0.42885,-0.08856,0.44424,-0.14943,0.40006,-0.19940,0.34976,-0.23832,0.29541,-0.26634,0.23896,-0.23846,0.31154,g
223 | 0,0,1,-1,1,1,1,-1,1,1,1,-1,1,1,1,-1,1,-1,1,1,1,1,1,-1,1,-1,1,-1,1,1,1,-1,1,1,b
224 | 1,0,0.97467,0.13082,0.94120,0.20036,0.88783,0.32248,0.89009,0.32711,0.85550,0.45217,0.72298,0.52284,0.69946,0.58820,0.58548,0.66893,0.48869,0.70398,0.44245,0.68159,0.35289,0.75622,0.26832,0.76210,0.16813,0.78541,0.07497,0.80439,-0.02962,0.77702,-0.10289,0.74242,g
225 | 0,0,0,0,1,1,0,0,1,1,0,0,1,-1,0,0,0,0,0,0,0,0,0,0,0,0,1,-1,0,0,-1,1,0,0,b
226 | 1,0,0.92308,0.15451,0.86399,0.29757,0.72582,0.36790,0.70588,0.56830,0.57449,0.62719,0.43270,0.74676,0.31705,0.67697,0.19128,0.76818,0.04686,0.76171,-0.12064,0.76969,-0.18479,0.71327,-0.29291,0.65708,-0.38798,0.58553,-0.46799,0.50131,-0.53146,0.40732,-0.56231,0.35095,g
227 | 0,0,0,0,1,1,1,1,0,0,0,0,-1,-1,0,0,-1,-1,0,0,0,0,1,1,0,0,1,1,0,0,-1,1,0,0,b
228 | 1,0,0.88804,0.38138,0.65926,0.69431,0.29148,0.87892,-0.06726,0.90135,-0.39597,0.80441,-0.64574,0.56502,-0.82960,0.26906,-0.78940,-0.08205,-0.62780,-0.30942,-0.46637,-0.55605,-0.16449,-0.64338,0.09562,-0.61055,0.30406,-0.48392,0.43227,-0.29838,0.47029,-0.09461,0.42152,0.12556,g
229 | 0,0,1,-1,1,1,1,1,1,1,1,1,1,-1,1,1,1,1,1,-1,1,-1,1,-1,1,-1,1,1,1,-1,1,1,1,1,b
230 | 1,0,0.73523,-0.38293,0.80151,0.10278,0.78826,0.15266,0.55580,0.05252,1,0.21225,0.71947,0.28954,0.68798,0.32925,0.49672,0.17287,0.64333,-0.02845,0.57399,0.42528,0.53120,0.44872,0.94530,0.57549,0.44174,0.48200,0.12473,1,0.35070,0.49721,0.30588,0.49831,g
231 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,b
232 | 1,0,0.94649,0.00892,0.97287,-0.00260,0.98922,0.00372,0.95801,0.01598,0.94054,0.03530,0.97213,0.04719,0.98625,0.01858,0.94277,0.07135,0.98551,-0.00706,0.97770,0.04980,0.96358,0.07098,0.93274,0.08101,0.95243,0.04356,0.97473,0.00818,0.97845,0.07061,1,-0.00260,g
233 | 0,0,1,1,-1,-1,-1,-1,0,0,0,0,-1,-1,0,0,0,0,0,0,-1,1,1,1,0,0,1,-1,0,0,-1,-1,-1,-1,b
234 | 1,0,0.50466,-0.16900,0.71442,0.01513,0.71063,0.02258,0.68065,0.01282,0.34615,0.05594,0.69050,0.04393,0.68101,0.05058,0.67023,0.05692,0.63403,-0.04662,0.64503,0.06856,0.63077,0.07381,0.84033,0.18065,0.59935,0.08304,0.38228,0.06760,0.56466,0.09046,0.54632,0.09346,g
235 | 1,0,0.68729,1,0.91973,-0.76087,0.81773,0.04348,0.76087,0.10702,0.86789,0.73746,0.70067,0.18227,0.75920,0.13712,0.93478,-0.25084,0.70736,0.18729,0.64883,0.24582,0.60201,0.77425,1,-0.53846,0.89262,0.22216,0.71070,0.53846,1,-0.06522,0.56522,0.23913,b
236 | 1,0,0.76296,-0.07778,1,-0.29630,1,-0.85741,0.80000,0.06111,0.45556,-0.42778,1,-0.12581,1,-0.83519,0.49259,0.01852,0.82222,-0.05926,0.98215,-0.19938,1,0.22037,0.69630,-0.26481,0.92148,-0.24549,0.78889,0.02037,0.87492,-0.27105,1,-0.57037,g
237 | 1,0,0.38521,0.15564,0.41245,0.07393,0.26459,0.24125,0.23346,0.13230,0.19455,0.25292,0.24514,0.36965,0.08949,0.22957,-0.03891,0.36965,0.05058,0.24903,0.24903,0.09728,0.07782,0.29961,-0.02494,0.28482,-0.06024,0.26256,-0.14786,0.14786,-0.09339,0.31128,-0.19066,0.28794,b
238 | 1,0,0.57540,-0.03175,0.75198,-0.05357,0.61508,-0.01190,0.53968,0.03373,0.61706,0.09921,0.59127,-0.02381,0.62698,0.01190,0.70833,0.02579,0.60317,0.01587,0.47817,-0.02778,0.59127,0.03770,0.5,0.03968,0.61291,-0.01237,0.61706,-0.13492,0.68849,-0.01389,0.62500,-0.03175,g
239 | 1,0,0.06404,-0.15271,-0.04433,0.05911,0.08374,-0.02463,-0.01478,0.18719,0.06404,0,0.12315,-0.09852,0.05911,0,0.01970,-0.02956,-0.12808,-0.20690,0.06897,0.01478,0.06897,0.02956,0.07882,0.16256,0.28079,-0.04926,-0.05911,-0.09360,0.04433,0.05419,0.07389,-0.10837,b
240 | 1,0,0.61857,0.10850,0.70694,-0.06935,0.70358,0.01678,0.74273,0.00224,0.71029,0.15772,0.71588,-0.00224,0.79754,0.06600,0.83669,-0.16555,0.68680,-0.09060,0.62528,-0.01342,0.60962,0.11745,0.71253,-0.09508,0.69845,-0.01673,0.63311,0.04810,0.78859,-0.05145,0.65213,-0.04698,g
241 | 1,0,0.25316,0.35949,0,0,-0.29620,-1,0,0,0.07595,-0.07342,0,0,0,0,0,0,0,0,0.00759,0.68101,-0.20000,0.33671,-0.10380,0.35696,0.05570,-1,0,0,0.06329,-1,0,0,b
242 | 1,0,0.88103,-0.00857,0.89818,-0.02465,0.94105,-0.01822,0.89175,-0.12755,0.82208,-0.10932,0.88853,0.01179,0.90782,-0.13719,0.87138,-0.06109,0.90782,-0.02358,0.87996,-0.14577,0.82851,-0.12433,0.90139,-0.19507,0.88245,-0.14903,0.84352,-0.12862,0.88424,-0.18542,0.91747,-0.16827,g
243 | 1,0,0.42708,-0.5,0,0,0,0,0.46458,0.51042,0.58958,0.02083,0,0,0,0,0.16458,-0.45417,0.59167,-0.18333,0,0,0,0,0.98750,-0.40833,-1,-1,-0.27917,-0.75625,0,0,0,0,b
244 | 1,0,0.88853,0.01631,0.92007,0.01305,0.92442,0.01359,0.89179,-0.10223,0.90103,-0.08428,0.93040,-0.01033,0.93094,-0.08918,0.86025,-0.05057,0.89451,-0.04024,0.88418,-0.12126,0.88907,-0.11909,0.82980,-0.14138,0.86453,-0.11808,0.85536,-0.13051,0.83524,-0.12452,0.86786,-0.12235,g
245 | 1,0,0,0,1,0.12889,0.88444,-0.02000,0,0,1,-0.42444,1,0.19556,1,-0.05333,1,-0.81556,0,0,1,-0.04000,1,-0.18667,0,0,1,-1,0,0,1,0.11778,0.90667,-0.09556,b
246 | 1,0,0.81143,0.03714,0.85143,-0.00143,0.79000,0.00714,0.79571,-0.04286,0.87571,0,0.85571,-0.06714,0.86429,0.00286,0.82857,-0.05429,0.81000,-0.11857,0.76857,-0.08429,0.84286,-0.05000,0.77000,-0.06857,0.81598,-0.08669,0.82571,-0.10429,0.81429,-0.05000,0.82143,-0.15143,g
247 | 1,0,0,0,0,0,0,0,0,0,0,0,-1,1,1,0.55172,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,b
248 | 1,0,0.49870,0.01818,0.43117,-0.09610,0.50649,-0.04156,0.50130,0.09610,0.44675,0.05974,0.55844,-0.11948,0.51688,-0.03636,0.52727,-0.05974,0.55325,-0.01039,0.48571,-0.03377,0.49091,-0.01039,0.59221,0,0.53215,-0.03280,0.43117,0.03377,0.54545,-0.05455,0.58961,-0.08571,g
249 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,-1,0,0,0,0,0,0,b
250 | 1,0,1,0.5,1,0.25000,0.25000,1,0.16851,0.91180,-0.13336,0.80454,-0.34107,0.60793,-0.43820,0.37856,-0.43663,0.16709,-0.36676,0.00678,-0.26477,-0.09025,-0.16178,-0.12964,-0.07782,-0.12744,-0.02089,-0.10242,0.01033,-0.07036,0.02224,-0.04142,0.02249,-0.02017,g
251 | 1,0,0,0,0,0,1,1,-1,-1,0,0,1,-0.11111,0,0,0,0,-1,1,1,1,1,-1,0,0,1,-1,0,0,0,0,1,1,b
252 | 1,0,0.87048,0.38027,0.64099,0.69212,0.31347,0.86625,-0.03933,0.90740,-0.42173,0.79346,-0.70561,0.51560,-0.81049,0.22735,-0.81136,-0.12539,-0.67474,-0.38102,-0.38334,-0.62861,-0.13013,-0.70762,0.15552,-0.66421,0.38544,-0.51568,0.52573,-0.29897,0.56239,-0.05938,0.51460,0.16645,g
253 | 1,0,0,0,0,0,0,0,-1,1,0,0,1,0.37333,-0.12000,-0.12000,0,0,-1,-1,0,0,1,-1,0,0,1,0.22667,0,0,0,0,0,0,b
254 | 1,0,0.88179,0.43491,0.59573,0.77655,0.19672,0.94537,-0.24103,0.92544,-0.62526,0.71257,-0.86443,0.33652,-0.92384,-0.05338,-0.77356,-0.44707,-0.46950,-0.73285,-0.10237,-0.82217,0.26384,-0.77570,0.55984,-0.55910,0.72147,-0.24433,0.72478,0.09599,0.58137,0.38915,0.34749,0.57656,g
255 | 1,0,0.32834,0.02520,0.15236,0.21278,0.14919,0.74003,-0.25706,0.92324,-0.10312,0.19380,-0.61352,0.25786,-0.94053,-0.05409,-0.13117,-0.14329,-0.30315,-0.44615,-0.11409,-0.85597,0.02668,-0.22786,0.27942,-0.06295,0.33737,-0.11876,0.27657,-0.11409,0.15078,0.13296,0.12197,0.20468,g
256 | 1,0,0.83427,0.39121,0.54040,0.78579,0.12326,0.89402,-0.33221,0.83578,-0.70086,0.59564,-0.86622,0.21909,-0.84442,-0.24164,-0.59714,-0.61894,-0.19354,-0.87787,0.12439,-0.89064,0.51109,-0.72454,0.79143,-0.27734,0.83008,0.08718,0.66592,0.49079,0.37542,0.70011,-0.03983,0.79444,g
257 | 1,0,0.62335,-0.03490,0.59085,0.00481,0.60409,-0.07461,0.63177,0.00963,0.62455,-0.07461,0.67028,0.07220,0.62936,-0.08424,0.67509,0.09146,0.67148,0,0.58965,0.10108,0.50060,0.03129,0.65945,0.14079,0.60463,0.02019,0.51384,0.04452,0.61733,-0.00963,0.61372,-0.09146,g
258 | 1,0,0.74449,-0.02390,0.70772,0.03309,0.72243,0.16912,0.79228,0.07721,0.81434,0.43934,0.63787,0.00551,0.70772,0.21691,1,0.06066,0.61029,0.05147,0.67463,0.04228,0.52022,-0.25000,0.72978,-0.15809,0.61727,0.07124,0.30882,0.08640,0.55916,0.07458,0.60294,0.21691,g
259 | 1,0,0.61538,0.18923,0.78157,0.01780,0.77486,0.02647,0.65077,-0.10308,0.77538,0.08000,0.73961,0.05060,0.72322,0.05776,0.68615,-0.08923,0.61692,0.16308,0.66233,0.07573,0.63878,0.08041,0.60154,-0.07231,0.58803,0.08767,0.55077,0.25692,0.53389,0.09207,0.50609,0.09322,g
260 | 1,0,0.68317,0.05375,0.84803,0.00202,0.84341,0.00301,0.84300,0.09901,0.75813,0.04102,0.81892,0.00585,0.80738,0.00673,0.80622,-0.12447,0.77935,-0.03536,0.76365,0.00909,0.74635,0.00978,0.79632,-0.04243,0.70824,0.01096,0.62235,0.11598,0.66624,0.01190,0.64407,0.01227,g
261 | 1,0,0.5,0,0.38696,0.10435,0.49130,0.06522,0.46957,-0.03913,0.35652,-0.12609,0.45652,0.04783,0.50435,0.02609,0.35652,0.19565,0.42174,0.14783,0.42174,-0.02609,0.32174,-0.11304,0.47391,-0.00870,0.41789,0.06908,0.38696,0.03913,0.35217,0.14783,0.44783,0.17391,g
262 | 1,0,0.79830,0.09417,0.78129,0.20656,0.71628,0.28068,0.69320,0.41252,0.65917,0.50122,0.57898,0.60814,0.49210,0.58445,0.33354,0.67861,0.29587,0.63548,0.09599,0.68104,0.02066,0.72236,-0.08748,0.63183,-0.11925,0.60696,-0.18226,0.56015,-0.25516,0.51701,-0.27339,0.42467,g
263 | 1,0,1,0.09802,1,0.25101,0.98390,0.33044,0.80365,0.53020,0.74977,0.60297,0.56937,0.71942,0.55311,0.74079,0.29452,0.82193,0.21137,0.79777,0.09709,0.82162,-0.01734,0.79870,-0.15144,0.75596,-0.22839,0.69187,-0.31713,0.60948,-0.40291,0.54522,-0.42815,0.44534,g
264 | 1,0,0.89410,0.13425,0.87001,0.31543,0.78896,0.43388,0.63388,0.59975,0.54003,0.71016,0.39699,0.76161,0.24266,0.79523,0.09134,0.79598,-0.09159,0.76261,-0.20201,0.66926,-0.30263,0.62610,-0.40552,0.50489,-0.46215,0.40753,-0.50314,0.27252,-0.52823,0.19172,-0.48808,0.05972,g
265 | 1,0,0.94631,0.17498,0.90946,0.33143,0.85096,0.49960,0.73678,0.63842,0.59215,0.73838,0.48698,0.83614,0.30459,0.90665,0.17959,0.93429,-0.00701,0.93109,-0.18880,0.89383,-0.33023,0.82492,-0.46534,0.76482,-0.58563,0.66335,-0.67929,0.52564,-0.75321,0.42488,-0.81210,0.26092,g
266 | 1,0,0.91767,0.18198,0.86090,0.35543,0.72873,0.45747,0.60425,0.69865,0.50376,0.74922,0.36100,0.81795,0.15664,0.83558,0.00396,0.85210,-0.16390,0.77853,-0.35996,0.76193,-0.43087,0.65385,-0.53140,0.53886,-0.60328,0.40972,-0.64511,0.27338,-0.65710,0.13667,-0.64056,0.05394,g
267 | 1,0,0.76627,0.21106,0.63935,0.38112,0.48409,0.52500,0.15000,0.22273,0.13753,0.59565,-0.07727,0.44545,0,0.48636,-0.27491,0.42014,-0.56136,0.36818,-0.36591,0.18864,-0.40533,0.07588,-0.38483,-0.03229,-0.33942,-0.12486,-0.27540,-0.19714,-0.19962,-0.24648,-0.11894,-0.27218,g
268 | 1,0,0.58940,-0.60927,0.85430,0.55298,0.81126,0.07285,0.56623,0.16225,0.32781,0.24172,0.50331,0.12252,0.63907,0.19868,0.71854,0.42715,0.54305,0.13907,0.65232,0.27815,0.68874,0.07285,0.51872,0.26653,0.49013,0.27687,0.46216,0.28574,0.43484,0.29324,0.40821,0.29942,g
269 | 1,0,1,0.11385,0.70019,-0.12144,0.81594,0.09677,0.71157,0.01139,0.56167,-0.07780,0.69070,0.12524,0.58634,0.03985,0.53131,-0.03416,0.69450,0.16888,0.72676,0.07211,0.32068,0.05882,0.53321,0.37381,0.49090,0.17951,0.15180,0.32448,0.44141,0.18897,0.56167,0.15180,g
270 | 1,0,0.84843,0.06794,0.80562,-0.02299,0.77031,-0.03299,0.66725,-0.06620,0.59582,-0.07666,0.67260,-0.05771,0.64260,-0.06438,0.39199,0.04530,0.71254,0.01394,0.55970,-0.08039,0.53430,-0.08453,0.47038,-0.22822,0.48659,-0.09128,0.52613,-0.08537,0.44277,-0.09621,0.42223,-0.09808,g
271 | 1,0,1,0.08013,0.96775,-0.00482,0.96683,-0.00722,0.87980,-0.03923,1,0.01419,0.96186,-0.01436,0.95947,-0.01671,0.98497,0.01002,0.91152,-0.08848,0.95016,-0.02364,0.94636,-0.02591,0.98164,0.02003,0.93772,-0.03034,1,-0.05843,0.92774,-0.03464,0.92226,-0.03673,g
272 | 1,0,0.47938,-0.12371,0.42784,-0.12371,0.70103,-0.39175,0.73196,0.07216,0.26289,-0.21649,0.49485,0.15979,0.45361,-0.11856,0.42268,0.06186,0.5,-0.27320,0.54639,0.18557,0.42268,0.08247,0.70619,0.19588,0.53396,-0.12447,0.15464,-0.26289,0.47423,0.04124,0.45361,-0.51546,g
273 | 1,0,0.63510,-0.04388,0.76530,0.02968,0.61432,0.36028,0.65358,-0.00462,0.64203,0.08314,0.79446,-0.43418,0.72517,0.54965,0.59584,0.13857,0.63510,0.21940,0.63279,-0.25404,0.70951,0.15359,0.64665,0.23095,0.68775,0.17704,0.61663,0.07621,0.66316,0.19841,0.69053,0.36721,g
274 | 1,0,0.50112,-0.03596,0.61124,0.01348,0.58876,0.01573,0.58876,0.02472,0.66742,-0.00449,0.71685,-0.04719,0.66517,0.00899,0.57303,0.02472,0.64719,-0.07416,0.56854,0.14157,0.57528,-0.03596,0.46517,0.04944,0.56588,0.00824,0.47640,-0.03596,0.54607,0.10562,0.60674,-0.08090,g
275 | 1,0,0.71521,-0.00647,0.66667,-0.04207,0.63107,-0.05178,0.77994,0.08091,0.67314,0.09709,0.64725,0.15858,0.60194,-0.01942,0.54369,-0.04531,0.46926,-0.10032,0.64725,0.14887,0.39159,0.21683,0.52427,-0.05502,0.45105,0.00040,0.31392,-0.06796,0.49191,-0.10680,0.30421,-0.05178,g
276 | 1,0,0.68148,0.10370,0.77037,0.03457,0.65185,0.08148,0.60988,-0.00494,0.79012,0.11852,0.59753,0.04938,0.62469,0.09630,0.78272,-0.17531,0.73827,-0.10864,0.48642,0.00988,0.60988,0.08148,0.66667,-0.12840,0.63773,-0.02451,0.76543,0.02222,0.61235,-0.07160,0.51358,-0.04691,g
277 | 1,0,0.60678,-0.02712,0.67119,0.04068,0.52881,-0.04407,0.50508,0.03729,0.70508,-0.07797,0.57966,-0.02034,0.53220,0.07797,0.64068,0.11864,0.56949,-0.02373,0.53220,0.00678,0.71525,-0.03390,0.52881,-0.03390,0.57262,0.00750,0.58644,-0.00339,0.58983,-0.02712,0.50169,0.06780,g
278 | 1,0,0.49515,0.09709,0.29612,0.05825,0.34951,0,0.57282,-0.02427,0.58252,0.02427,0.33495,0.04854,0.52427,0.00485,0.47087,-0.10680,0.43204,0.00485,0.34951,0.05825,0.18932,0.25728,0.31068,-0.15049,0.36547,0.03815,0.39320,0.17476,0.26214,0,0.37379,-0.01942,g
279 | 1,0,0.98822,0.02187,0.93102,0.34100,0.83904,0.35222,0.74706,0.48906,0.73584,0.51879,0.55076,0.60179,0.43130,0.66237,0.31800,0.70443,0.28379,0.68873,0.07515,0.73696,0.06338,0.71284,-0.16489,0.69714,-0.16556,0.60510,-0.16209,0.55805,-0.34717,0.44195,-0.33483,0.37465,g
280 | 1,0,0.97905,0.15810,0.90112,0.35237,0.82039,0.48561,0.71760,0.64888,0.58827,0.73743,0.40349,0.83156,0.25140,0.84804,0.04700,0.85475,-0.12193,0.79749,-0.26180,0.80754,-0.37835,0.71676,-0.51034,0.58324,-0.57587,0.46040,-0.61899,0.30796,-0.65754,0.18345,-0.64134,0.02968,g
281 | 1,0,0.99701,0.21677,0.91966,0.47030,0.76902,0.62415,0.53312,0.78120,0.36774,0.88291,0.10107,0.83312,-0.06827,0.89274,-0.28269,0.72073,-0.43707,0.61688,-0.55769,0.48120,-0.65000,0.35534,-0.64658,0.15908,-0.66651,0.02277,-0.64872,-0.13462,-0.54615,-0.22949,-0.47201,-0.35032,g
282 | 1,0,0.94331,0.19959,0.96132,0.40803,0.80514,0.56569,0.56687,0.70830,0.41836,0.83230,0.14939,0.89489,0.05167,0.93682,-0.24742,0.83939,-0.42811,0.75554,-0.50251,0.62563,-0.65515,0.50428,-0.68851,0.30912,-0.77097,0.15619,-0.75406,-0.04399,-0.75199,-0.17921,-0.66932,-0.34367,g
283 | 1,0,0.93972,0.28082,0.80486,0.52821,0.58167,0.73151,0.34961,0.80511,0.10797,0.90403,-0.20015,0.89335,-0.39730,0.82163,-0.58835,0.62867,-0.76305,0.40368,-0.81262,0.18888,-0.81317,-0.04284,-0.75273,-0.26883,-0.63237,-0.46438,-0.46422,-0.61446,-0.26389,-0.70835,-0.08937,-0.71273,g
284 | 1,0,0.89835,0.35157,0.67333,0.62233,0.43898,0.94353,-0.03643,0.80510,-0.22838,0.75334,-0.25137,0.48816,-0.57377,0.28415,-0.66750,0.10591,-0.47359,-0.06193,-0.81056,-0.06011,-0.33197,-0.47592,-0.12897,-0.53620,0.07158,-0.51925,0.24321,-0.43478,0.36586,-0.30057,0.42805,0.13297,g
285 | 1,0,0.29073,0.10025,0.23308,0.17293,0.03759,0.34336,0.12030,0.26316,0.06266,0.21303,-0.04725,0.12767,-0.06333,0.07907,-0.06328,0.04097,-0.05431,0.01408,-0.04166,-0.00280,-0.02876,-0.01176,-0.01755,-0.01505,-0.00886,-0.01475,-0.00280,-0.01250,0.00096,-0.00948,0.00290,-0.00647,g
286 | 1,0,0.58459,-0.35526,1,0.35338,0.75376,-0.00564,0.82519,0.19361,0.50188,-0.27632,0.65977,0.06391,0.69737,0.14662,0.72368,-0.42669,0.76128,0.04511,0.66917,0.20489,0.84774,-0.40977,0.64850,-0.04699,0.56836,-0.10571,0.52820,-0.13346,0.15602,-0.12218,0.44767,-0.10309,g
287 | 1,0,0.83609,0.13215,0.72171,0.06059,0.65829,0.08315,0.23888,0.12961,0.43837,0.20330,0.49418,0.12686,0.44747,0.13507,0.29352,0.02922,0.48158,0.15756,0.32835,0.14616,0.29495,0.14638,0.26436,0.14530,0.23641,0.14314,0.26429,0.16137,0.18767,0.13632,0.16655,0.13198,g
288 | 1,0,0.94080,0.11933,0.85738,0.01038,0.85124,0.01546,0.76966,-0.00278,0.84459,0.10916,0.83289,0.03027,0.82680,0.03506,0.74838,0.01943,0.80019,0.02405,0.80862,0.04901,0.80259,0.05352,0.77336,0.02220,0.79058,0.06235,0.85939,0.09251,0.77863,0.07090,0.77269,0.07508,g
289 | 1,0,0.87111,0.04326,0.79946,0.18297,0.99009,0.29292,0.89455,-0.08337,0.88598,-0.02028,0.90446,-0.26724,0.89410,0.19964,0.88644,-0.04642,0.84452,-0.00991,0.97882,-0.34024,0.78954,-0.25101,0.86661,-0.09193,0.85967,-0.02908,0.78774,-0.04101,0.75935,0.21812,0.88238,0.09193,g
290 | 1,0,0.74916,0.02549,0.98994,0.09792,0.75855,0.12877,0.74313,-0.09188,0.95842,0.02482,0.97921,-0.00469,0.96110,0.10195,0.91482,0.03756,0.71026,0.02683,0.81221,-0.08048,1,0,0.71764,-0.01207,0.82271,0.02552,0.72435,-0.01073,0.90409,0.11066,0.72837,0.02750,g
291 | 1,0,0.47337,0.19527,0.06213,-0.18343,0.62316,0.01006,0.45562,-0.04438,0.56509,0.01775,0.44675,0.27515,0.71598,-0.03846,0.55621,0.12426,0.41420,0.11538,0.52767,0.02842,0.51183,-0.10651,0.47929,-0.02367,0.46514,0.03259,0.53550,0.25148,0.31953,-0.14497,0.34615,-0.00296,g
292 | 1,0,0.59887,0.14689,0.69868,-0.13936,0.85122,-0.13936,0.80979,0.02448,0.50471,0.02825,0.67420,-0.04520,0.80791,-0.13748,0.51412,-0.24482,0.81544,-0.14313,0.70245,-0.00377,0.33333,0.06215,0.56121,-0.33145,0.61444,-0.16837,0.52731,-0.02072,0.53861,-0.31262,0.67420,-0.22034,g
293 | 1,0,0.84713,-0.03397,0.86412,-0.08493,0.81953,0,0.73673,-0.07643,0.71975,-0.13588,0.74947,-0.11677,0.77495,-0.18684,0.78132,-0.21231,0.61996,-0.10191,0.79193,-0.15711,0.89384,-0.03397,0.84926,-0.26115,0.74115,-0.23312,0.66242,-0.22293,0.72611,-0.37792,0.65817,-0.24841,g
294 | 1,0,0.87772,-0.08152,0.83424,0.07337,0.84783,0.04076,0.77174,-0.02174,0.77174,-0.05707,0.82337,-0.10598,0.67935,-0.00543,0.88043,-0.20924,0.83424,0.03261,0.86413,-0.05978,0.97283,-0.27989,0.85054,-0.18750,0.83705,-0.10211,0.85870,-0.03261,0.78533,-0.10870,0.79076,-0.00543,g
295 | 1,0,0.74704,-0.13241,0.53755,0.16996,0.72727,0.09486,0.69565,-0.11067,0.66798,-0.23518,0.87945,-0.19170,0.73715,0.04150,0.63043,-0.00395,0.63636,-0.11858,0.79249,-0.25296,0.66403,-0.28656,0.67194,-0.10474,0.61847,-0.12041,0.60079,-0.20949,0.37549,0.06917,0.61067,-0.01383,g
296 | 1,0,0.46785,0.11308,0.58980,0.00665,0.55432,0.06874,0.47894,-0.13969,0.52993,0.01330,0.63858,-0.16186,0.67849,-0.03326,0.54545,-0.13525,0.52993,-0.04656,0.47894,-0.19512,0.50776,-0.13525,0.41463,-0.20177,0.53930,-0.11455,0.59867,-0.02882,0.53659,-0.11752,0.56319,-0.04435,g
297 | 1,0,0.88116,0.27475,0.72125,0.42881,0.61559,0.63662,0.38825,0.90502,0.09831,0.96128,-0.20097,0.89200,-0.35737,0.77500,-0.65114,0.62210,-0.78768,0.45535,-0.81856,0.19095,-0.83943,-0.08079,-0.78334,-0.26356,-0.67557,-0.45511,-0.54732,-0.60858,-0.30512,-0.66700,-0.19312,-0.75597,g
298 | 1,0,0.93147,0.29282,0.79917,0.55756,0.59952,0.71596,0.26203,0.92651,0.04636,0.96748,-0.23237,0.95130,-0.55926,0.81018,-0.73329,0.62385,-0.90995,0.36200,-0.92254,0.06040,-0.93618,-0.19838,-0.83192,-0.46906,-0.65165,-0.69556,-0.41223,-0.85725,-0.13590,-0.93953,0.10007,-0.94823,g
299 | 1,0,0.88241,0.30634,0.73232,0.57816,0.34109,0.58527,0.05717,1,-0.09238,0.92118,-0.62403,0.71996,-0.69767,0.32558,-0.81422,0.41195,-1,-0.00775,-0.78973,-0.41085,-0.76901,-0.45478,-0.57242,-0.67605,-0.31610,-0.81876,-0.02979,-0.86841,0.25392,-0.82127,0.00194,-0.81686,g
300 | 1,0,0.83479,0.28993,0.69256,0.47702,0.49234,0.68381,0.21991,0.86761,-0.08096,0.85011,-0.35558,0.77681,-0.52735,0.58425,-0.70350,0.31291,-0.75821,0.03939,-0.71225,-0.15317,-0.58315,-0.39168,-0.37199,-0.52954,-0.16950,-0.60863,0.08425,-0.61488,0.25164,-0.48468,0.40591,-0.35339,g
301 | 1,0,0.92870,0.33164,0.76168,0.62349,0.49305,0.84266,0.21592,0.95193,-0.13956,0.96167,-0.47202,0.83590,-0.70747,0.65490,-0.87474,0.36750,-0.91814,0.05595,-0.89824,-0.26173,-0.73969,-0.54069,-0.50757,-0.74735,-0.22323,-0.86122,0.07810,-0.87159,0.36021,-0.78057,0.59407,-0.60270,g
302 | 1,0,0.83367,0.31456,0.65541,0.57671,0.34962,0.70677,0.17293,0.78947,-0.18976,0.79886,-0.41729,0.66541,-0.68421,0.47744,-0.74725,0.19492,-0.72180,-0.04887,-0.62030,-0.28195,-0.49165,-0.53463,-0.26577,-0.66014,-0.01530,-0.69706,0.22708,-0.64428,0.43100,-0.51206,0.64662,-0.30075,g
303 | 1,0,0.98455,-0.02736,0.98058,-0.04104,1,-0.07635,0.98720,0.01456,0.95278,-0.02604,0.98500,-0.07458,0.99382,-0.07149,0.97396,-0.09532,0.97264,-0.12224,0.99294,-0.05252,0.95278,-0.08914,0.97352,-0.08341,0.96653,-0.12912,0.93469,-0.14916,0.97132,-0.15755,0.96778,-0.18800,g
304 | 1,0,0.94052,-0.01531,0.94170,0.01001,0.94994,-0.01472,0.95878,-0.01060,0.94641,-0.03710,0.97173,-0.01767,0.97055,-0.03887,0.95465,-0.04064,0.95230,-0.04711,0.94229,-0.02179,0.92815,-0.04417,0.92049,-0.04476,0.92695,-0.05827,0.90342,-0.07479,0.91991,-0.07244,0.92049,-0.07420,g
305 | 1,0,0.97032,-0.14384,0.91324,-0.00228,0.96575,-0.17123,0.98630,0.18265,0.91781,0.00228,0.93607,-0.08447,0.91324,-0.00228,0.86758,-0.08676,0.97032,-0.21233,1,0.10274,0.92009,-0.05251,0.92466,0.06849,0.94043,-0.09252,0.97032,-0.20091,0.85388,-0.08676,0.96575,-0.21918,g
306 | 1,0,0.52542,-0.03390,0.94915,0.08475,0.52542,-0.16949,0.30508,-0.01695,0.50847,-0.13559,0.64407,0.28814,0.83051,-0.35593,0.54237,0.01695,0.55932,0.03390,0.59322,0.30508,0.86441,0.05085,0.40678,0.15254,0.67287,-0.00266,0.66102,-0.03390,0.83051,-0.15254,0.76271,-0.10169,g
307 | 1,0,0.33333,-0.25000,0.44444,0.22222,0.38889,0.16667,0.41667,0.13889,0.5,-0.11111,0.54911,-0.08443,0.58333,0.33333,0.55556,0.02778,0.25000,-0.19444,0.47222,-0.05556,0.52778,-0.02778,0.38889,0.08333,0.41543,-0.14256,0.19444,-0.13889,0.36924,-0.14809,0.08333,-0.5,g
308 | 1,0,0.51207,1,1,0.53810,0.71178,0.80833,0.45622,0.46427,0.33081,1,0.21249,1,-0.17416,1,-0.33081,0.98722,-0.61382,1,-0.52674,0.71699,-0.88500,0.47894,-1,0.35175,-1,0.09569,-1,-0.16713,-1,-0.42226,-0.91903,-0.65557,g
309 | 1,0,0.75564,0.49638,0.83550,0.54301,0.54916,0.72063,0.35225,0.70792,0.13469,0.94749,-0.09818,0.93778,-0.37604,0.82223,-0.52742,0.71161,-0.68358,0.67989,-0.70163,0.24956,-0.79147,0.02995,-0.98988,-0.29099,-0.70352,-0.32792,-0.63312,-0.19185,-0.34131,-0.60454,-0.19609,-0.62956,g
310 | 1,0,0.83789,0.42904,0.72113,0.58385,0.45625,0.78115,0.16470,0.82732,-0.13012,0.86947,-0.46177,0.78497,-0.59435,0.52070,-0.78470,0.26529,-0.84014,0.03928,-0.62041,-0.31351,-0.47412,-0.48905,-0.37298,-0.67796,-0.05054,-0.62691,0.14690,-0.45911,0.37093,-0.39167,0.48319,-0.24313,g
311 | 1,0,0.93658,0.35107,0.75254,0.65640,0.45571,0.88576,0.15323,0.95776,-0.21775,0.96301,-0.56535,0.83397,-0.78751,0.58045,-0.93104,0.26020,-0.93641,-0.06418,-0.87028,-0.40949,-0.65079,-0.67464,-0.36799,-0.84951,-0.04578,-0.91221,0.27330,-0.85762,0.54827,-0.69613,0.74828,-0.44173,g
312 | 1,0,0.92436,0.36924,0.71976,0.68420,0.29303,0.94078,-0.11108,0.76527,-0.31605,0.92453,-0.66616,0.78766,-0.92145,0.42314,-0.94315,0.09585,-1,0.03191,-0.66431,-0.66278,-0.46010,-0.78174,-0.13486,-0.88082,0.19765,-0.85137,0.48904,-0.70247,0.69886,-0.46048,0.76066,-0.13194,g
313 | 1,0,1,0.16195,1,-0.05558,1,0.01373,1,-0.12352,1,-0.01511,1,-0.01731,1,-0.06374,1,-0.07157,1,0.05900,1,-0.10108,1,-0.02685,1,-0.22978,1,-0.06823,1,0.08299,1,-0.14194,1,-0.07439,g
314 | 1,0,0.95559,-0.00155,0.86421,-0.13244,0.94982,-0.00461,0.82809,-0.51171,0.92441,0.10368,1,-0.14247,0.99264,-0.02542,0.95853,-0.15518,0.84013,0.61739,1,-0.16321,0.87492,-0.08495,0.85741,-0.01664,0.84132,-0.01769,0.82427,-0.01867,0.80634,-0.01957,0.78761,-0.02039,g
315 | 1,0,0.79378,0.29492,0.64064,0.52312,0.41319,0.68158,0.14177,0.83548,-0.16831,0.78772,-0.42911,0.72328,-0.57165,0.41471,-0.75436,0.16755,-0.69977,-0.09856,-0.57695,-0.23503,-0.40637,-0.38287,-0.17437,-0.52540,0.01523,-0.48707,0.19030,-0.38059,0.31008,-0.23199,0.34572,-0.08036,g
316 | 1,0,0.88085,0.35232,0.68389,0.65128,0.34816,0.79784,0.05832,0.90842,-0.29784,0.86490,-0.62635,0.69590,-0.77106,0.39309,-0.85803,0.08408,-0.81641,-0.24017,-0.64579,-0.50022,-0.39766,-0.68337,-0.11147,-0.75533,0.17041,-0.71504,0.40675,-0.57649,0.56626,-0.36765,0.62765,-0.13305,g
317 | 1,0,0.89589,0.39286,0.66129,0.71804,0.29521,0.90824,-0.04787,0.94415,-0.45725,0.84605,-0.77660,0.58511,-0.92819,0.25133,-0.92282,-0.15315,-0.76064,-0.48404,-0.50931,-0.76197,-0.14895,-0.88591,0.21581,-0.85703,0.53229,-0.68593,0.74846,-0.40656,0.83142,-0.07029,0.76862,0.27926,g
318 | 1,0,1,-0.24051,1,-0.20253,0.87342,-0.10127,0.88608,0.01266,1,0.11392,0.92405,0.06329,0.84810,-0.03797,0.63291,-0.36709,0.87342,-0.01266,0.93671,0.06329,1,0.25316,0.62025,-0.37975,0.84637,-0.05540,1,-0.06329,0.53165,0.02532,0.83544,-0.02532,g
319 | 1,0,0.74790,0.00840,0.83312,0.01659,0.82638,0.02469,0.86555,0.01681,0.60504,0.05882,0.79093,0.04731,0.77441,0.05407,0.64706,0.19328,0.84034,0.04202,0.71285,0.07122,0.68895,0.07577,0.66387,0.08403,0.63728,0.08296,0.61345,0.01681,0.58187,0.08757,0.55330,0.08891,g
320 | 1,0,0.85013,0.01809,0.92211,0.01456,0.92046,0.02180,0.92765,0.08010,0.87597,0.11370,0.91161,0.04320,0.90738,0.05018,0.87339,0.02842,0.95866,0,0.89097,0.07047,0.88430,0.07697,0.83721,0.10853,0.86923,0.08950,0.87597,0.08786,0.85198,0.10134,0.84258,0.10698,g
321 | 1,0,1,-0.01179,1,-0.00343,1,-0.01565,1,-0.01565,1,-0.02809,1,-0.02187,0.99828,-0.03087,0.99528,-0.03238,0.99314,-0.03452,1,-0.03881,1,-0.05039,1,-0.04931,0.99842,-0.05527,0.99400,-0.06304,0.99057,-0.06497,0.98971,-0.06668,g
322 | 1,0,0.89505,-0.03168,0.87525,0.05545,0.89505,0.01386,0.92871,0.02772,0.91287,-0.00990,0.94059,-0.01584,0.91881,0.03366,0.93663,0,0.94257,0.01386,0.90495,0.00792,0.88713,-0.01782,0.89307,0.02376,0.89002,0.01611,0.88119,0.00198,0.87327,0.04158,0.86733,0.02376,g
323 | 1,0,0.90071,0.01773,1,-0.01773,0.90071,0.00709,0.84752,0.05674,1,0.03546,0.97872,0.01064,0.97518,0.03546,1,-0.03191,0.89716,-0.03191,0.86170,0.07801,1,0.09220,0.90071,0.04610,0.94305,0.03247,0.94681,0.02482,1,0.01064,0.93617,0.02128,g
324 | 1,0,0.39394,-0.24242,0.62655,0.01270,0.45455,0.09091,0.63636,0.09091,0.21212,-0.21212,0.57576,0.15152,0.39394,0,0.56156,0.04561,0.51515,0.03030,0.78788,0.18182,0.30303,-0.15152,0.48526,0.05929,0.46362,0.06142,0.33333,-0.03030,0.41856,0.06410,0.39394,0.24242,g
325 | 1,0,0.86689,0.35950,0.72014,0.66667,0.37201,0.83049,0.08646,0.85893,-0.24118,0.86121,-0.51763,0.67577,-0.68714,0.41524,-0.77019,0.09898,-0.69397,-0.13652,-0.49488,-0.42207,-0.32537,-0.57679,-0.02844,-0.59954,0.15360,-0.53127,0.32309,-0.37088,0.46189,-0.19681,0.40956,0.01820,g
326 | 1,0,0.89563,0.37917,0.67311,0.69438,0.35916,0.88696,-0.04193,0.93345,-0.38875,0.84414,-0.67274,0.62078,-0.82680,0.30356,-0.86150,-0.05365,-0.73564,-0.34275,-0.51778,-0.62443,-0.23428,-0.73855,0.06911,-0.73856,0.33531,-0.62296,0.52414,-0.42086,0.61217,-0.17343,0.60073,0.08660,g
327 | 1,0,0.90547,0.41113,0.65354,0.74761,0.29921,0.95905,-0.13342,0.97820,-0.52236,0.83263,-0.79657,0.55086,-0.96631,0.15192,-0.93001,-0.25554,-0.71863,-0.59379,-0.41546,-0.85205,-0.02250,-0.93788,0.36318,-0.85368,0.67538,-0.61959,0.85977,-0.28123,0.88654,0.09800,0.75495,0.46301,g
328 | 1,0,1,1,0.36700,0.06158,0.12993,0.92713,-0.27586,0.93596,-0.31527,0.37685,-0.87192,0.36946,-0.92857,-0.08867,-0.38916,-0.34236,-0.46552,-0.82512,-0.05419,-0.93596,0.25616,-0.20443,0.73792,-0.45950,0.85471,-0.06831,1,1,0.38670,0.00246,0.17758,0.79790,g
329 | 1,0,1,0.51515,0.45455,0.33333,0.06061,0.36364,-0.32104,0.73062,-0.45455,0.48485,-0.57576,0,-0.57576,-0.12121,-0.33333,-0.48485,-0.09091,-0.84848,0.48485,-0.57576,0.57576,-0.42424,1,-0.39394,0.72961,0.12331,0.96970,0.57576,0.24242,0.36364,0.09091,0.33333,g
330 | 1,0,0.88110,0,0.94817,-0.02744,0.93598,-0.01220,0.90244,0.01829,0.90244,0.01829,0.93902,0.00915,0.95732,0.00305,1,0.02744,0.94207,-0.01220,0.90854,0.02439,0.91463,0.05488,0.99695,0.04878,0.89666,0.02226,0.90854,0.00915,1,0.05488,0.97561,-0.01220,g
331 | 1,0,0.82624,0.08156,0.79078,-0.08156,0.90426,-0.01773,0.92908,0.01064,0.80142,0.08865,0.94681,-0.00709,0.94326,0,0.93262,0.20213,0.95035,-0.00709,0.91489,0.00709,0.80496,0.07092,0.91135,0.15957,0.89527,0.08165,0.77660,0.06738,0.92553,0.18085,0.92553,0,g
332 | 1,0,0.74468,0.10638,0.88706,0.00982,0.88542,0.01471,0.87234,-0.01418,0.73050,0.10638,0.87657,0.02912,0.87235,0.03382,0.95745,0.07801,0.95035,0.04255,0.85597,0.04743,0.84931,0.05178,0.87234,0.11348,0.83429,0.06014,0.74468,-0.03546,0.81710,0.06800,0.80774,0.07173,g
333 | 1,0,0.87578,0.03727,0.89951,0.00343,0.89210,0.00510,0.86335,0,0.95031,0.07453,0.87021,0.00994,0.86303,0.01151,0.83851,-0.06211,0.85714,0.02484,0.84182,0.01603,0.83486,0.01749,0.79503,-0.04348,0.82111,0.02033,0.81988,0.08696,0.80757,0.02308,0.80088,0.02441,g
334 | 1,0,0.97513,0.00710,0.98579,0.01954,1,0.01954,0.99290,0.01599,0.95737,0.02309,0.97158,0.03552,1,0.03730,0.97869,0.02131,0.98579,0.05684,0.97158,0.04796,0.94494,0.05506,0.98401,0.03552,0.97540,0.06477,0.94849,0.08171,0.99112,0.06217,0.98934,0.09947,g
335 | 1,0,1,0.01105,1,0.01105,1,0.02320,0.99448,-0.01436,0.99448,-0.00221,0.98343,0.02320,1,0.00884,0.97569,0.00773,0.97901,0.01657,0.98011,0.00663,0.98122,0.02099,0.97127,-0.00663,0.98033,0.01600,0.97901,0.01547,0.98564,0.02099,0.98674,0.02762,g
336 | 1,0,1,-0.01342,1,0.01566,1,-0.00224,1,0.06264,0.97763,0.04474,0.95973,0.02908,1,0.06488,0.98881,0.03356,1,0.03579,0.99776,0.09396,0.95749,0.07383,1,0.10067,0.99989,0.08763,0.99105,0.08501,1,0.10067,1,0.10067,g
337 | 1,0,0.88420,0.36724,0.67123,0.67382,0.39613,0.86399,0.02424,0.93182,-0.35148,0.83713,-0.60316,0.58842,-0.78658,0.38778,-0.83285,-0.00642,-0.69318,-0.32963,-0.52504,-0.53924,-0.27377,-0.68126,0.00806,-0.69774,0.26028,-0.60678,0.44569,-0.43383,0.54209,-0.21542,0.56286,0.02823,g
338 | 1,0,0.90147,0.41786,0.64131,0.75725,0.30440,0.95148,-0.20449,0.96534,-0.55483,0.81191,-0.81857,0.50949,-0.96986,0.10345,-0.91456,-0.31412,-0.70163,-0.65461,-0.32354,-0.88999,0.05865,-0.94172,0.44483,-0.82154,0.74105,-0.55231,0.89415,-0.18725,0.87893,0.20359,0.70555,0.54852,g
339 | 1,0,0.32789,0.11042,0.15970,0.29308,0.14020,0.74485,-0.25131,0.91993,-0.16503,0.26664,-0.63714,0.24865,-0.97650,-0.00337,-0.23227,-0.19909,-0.30522,-0.48886,-0.14426,-0.89991,0.09345,-0.28916,0.28307,-0.18560,0.39599,-0.11498,0.31005,0.05614,0.21443,0.20540,0.13376,0.26422,g
340 | 1,0,0.65845,0.43617,0.44681,0.74804,0.05319,0.85106,-0.32027,0.82139,-0.68253,0.52408,-0.84211,0.07111,-0.82811,-0.28723,-0.47032,-0.71725,-0.04759,-0.86002,0.23292,-0.76316,0.56663,-0.52128,0.74300,-0.18645,0.74758,0.23713,0.45185,0.59071,0.20549,0.76764,-0.18533,0.74356,g
341 | 1,0,0.19466,0.05725,0.04198,0.25191,-0.10557,0.48866,-0.18321,-0.18321,-0.41985,0.06107,-0.45420,0.09160,-0.16412,-0.30534,-0.10305,-0.39695,0.18702,-0.17557,0.34012,-0.11953,0.28626,-0.16031,0.21645,0.24692,0.03913,0.31092,-0.03817,0.26336,-0.16794,0.16794,-0.30153,-0.33588,g
342 | 1,0,0.98002,0.00075,1,0,0.98982,-0.00075,0.94721,0.02394,0.97700,0.02130,0.97888,0.03073,0.99170,0.02338,0.93929,0.05713,0.93552,0.05279,0.97738,0.05524,1,0.06241,0.94155,0.08107,0.96709,0.07255,0.95701,0.08088,0.98190,0.08126,0.97247,0.08616,g
343 | 1,0,0.82254,-0.07572,0.80462,0.00231,0.87514,-0.01214,0.86821,-0.07514,0.72832,-0.11734,0.84624,0.05029,0.83121,-0.07399,0.74798,0.06705,0.78324,0.06358,0.86763,-0.02370,0.78844,-0.06012,0.74451,-0.02370,0.76717,-0.02731,0.74046,-0.07630,0.70058,-0.04220,0.78439,0.01214,g
344 | 1,0,0.35346,-0.13768,0.69387,-0.02423,0.68195,-0.03574,0.55717,-0.06119,0.61836,-0.10467,0.62099,-0.06527,0.59361,-0.07289,0.42271,-0.26409,0.58213,0.04992,0.49736,-0.08771,0.46241,-0.08989,0.45008,-0.00564,0.39146,-0.09038,0.35588,-0.10306,0.32232,-0.08637,0.28943,-0.08300,g
345 | 1,0,0.76046,0.01092,0.86335,0.00258,0.85821,0.00384,0.79988,0.02304,0.81504,0.12068,0.83096,0.00744,0.81815,0.00854,0.82777,-0.06974,0.76531,0.03881,0.76979,0.01148,0.75071,0.01232,0.77138,-0.00303,0.70886,0.01375,0.66161,0.00849,0.66298,0.01484,0.63887,0.01525,g
346 | 1,0,0.66667,-0.01366,0.97404,0.06831,0.49590,0.50137,0.75683,-0.00273,0.65164,-0.14071,0.40164,-0.48907,0.39208,0.58743,0.76776,0.31831,0.78552,0.11339,0.47541,-0.44945,1,0.00683,0.60656,0.06967,0.68656,0.17088,0.87568,0.07787,0.55328,0.24590,0.13934,0.48087,g
347 | 1,0,0.83508,0.08298,0.73739,-0.14706,0.84349,-0.05567,0.90441,-0.04622,0.89391,0.13130,0.81197,0.06723,0.79307,-0.08929,1,-0.02101,0.96639,0.06618,0.87605,0.01155,0.77521,0.06618,0.95378,-0.04202,0.83479,0.00123,1,0.12815,0.86660,-0.10714,0.90546,-0.04307,g
348 | 1,0,0.95113,0.00419,0.95183,-0.02723,0.93438,-0.01920,0.94590,0.01606,0.96510,0.03281,0.94171,0.07330,0.94625,-0.01326,0.97173,0.00140,0.94834,0.06038,0.92670,0.08412,0.93124,0.10087,0.94520,0.01361,0.93522,0.04925,0.93159,0.08168,0.94066,-0.00035,0.91483,0.04712,g
349 | 1,0,0.94701,-0.00034,0.93207,-0.03227,0.95177,-0.03431,0.95584,0.02446,0.94124,0.01766,0.92595,0.04688,0.93954,-0.01461,0.94837,0.02004,0.93784,0.01393,0.91406,0.07677,0.89470,0.06148,0.93988,0.03193,0.92489,0.02542,0.92120,0.02242,0.92459,0.00442,0.92697,-0.00577,g
350 | 1,0,0.90608,-0.01657,0.98122,-0.01989,0.95691,-0.03646,0.85746,0.00110,0.89724,-0.03315,0.89061,-0.01436,0.90608,-0.04530,0.91381,-0.00884,0.80773,-0.12928,0.88729,0.01215,0.92155,-0.02320,0.91050,-0.02099,0.89147,-0.07760,0.82983,-0.17238,0.96022,-0.03757,0.87403,-0.16243,g
351 | 1,0,0.84710,0.13533,0.73638,-0.06151,0.87873,0.08260,0.88928,-0.09139,0.78735,0.06678,0.80668,-0.00351,0.79262,-0.01054,0.85764,-0.04569,0.87170,-0.03515,0.81722,-0.09490,0.71002,0.04394,0.86467,-0.15114,0.81147,-0.04822,0.78207,-0.00703,0.75747,-0.06678,0.85764,-0.06151,g
352 |
--------------------------------------------------------------------------------
/tutorials/ionosphere.names:
--------------------------------------------------------------------------------
1 | 1. Title: Johns Hopkins University Ionosphere database
2 |
3 | 2. Source Information:
4 | -- Donor: Vince Sigillito (vgs@aplcen.apl.jhu.edu)
5 | -- Date: 1989
6 | -- Source: Space Physics Group
7 | Applied Physics Laboratory
8 | Johns Hopkins University
9 | Johns Hopkins Road
10 | Laurel, MD 20723
11 |
12 | 3. Past Usage:
13 |    -- Sigillito, V. G., Wing, S. P., Hutton, L. V., & Baker, K. B. (1989).
14 | Classification of radar returns from the ionosphere using neural
15 | networks. Johns Hopkins APL Technical Digest, 10, 262-266.
16 |
17 | They investigated using backprop and the perceptron training algorithm
18 | on this database. Using the first 200 instances for training, which
19 | were carefully split almost 50% positive and 50% negative, they found
20 | that a "linear" perceptron attained 90.7%, a "non-linear" perceptron
21 | attained 92%, and backprop an average of over 96% accuracy on the
22 | remaining 150 test instances, consisting of 123 "good" and only 24 "bad"
23 | instances. (There was a counting error or some mistake somewhere; there
24 | are a total of 351 rather than 350 instances in this domain.) Accuracy
25 | on "good" instances was much higher than for "bad" instances. Backprop
26 | was tested with several different numbers of hidden units (in [0,15])
27 | and incremental results were also reported (corresponding to how well
28 | the different variants of backprop did after a periodic number of
29 | epochs).
30 |
31 | David Aha (aha@ics.uci.edu) briefly investigated this database.
32 | He found that nearest neighbor attains an accuracy of 92.1%, that
33 | Ross Quinlan's C4 algorithm attains 94.0% (no windowing), and that
34 |    IB3 (Aha & Kibler, IJCAI-1989) attained 96.7% (parameter settings:
35 | 70% and 80% for acceptance and dropping respectively).
36 |
37 | 4. Relevant Information:
38 | This radar data was collected by a system in Goose Bay, Labrador. This
39 | system consists of a phased array of 16 high-frequency antennas with a
40 | total transmitted power on the order of 6.4 kilowatts. See the paper
41 | for more details. The targets were free electrons in the ionosphere.
42 | "Good" radar returns are those showing evidence of some type of structure
43 | in the ionosphere. "Bad" returns are those that do not; their signals pass
44 | through the ionosphere.
45 |
46 | Received signals were processed using an autocorrelation function whose
47 | arguments are the time of a pulse and the pulse number. There were 17
48 | pulse numbers for the Goose Bay system. Instances in this database are
49 | described by 2 attributes per pulse number, corresponding to the complex
50 | values returned by the function resulting from the complex electromagnetic
51 | signal.
52 |
53 | 5. Number of Instances: 351
54 |
55 | 6. Number of Attributes: 34 plus the class attribute
56 | -- All 34 predictor attributes are continuous
57 |
58 | 7. Attribute Information:
59 | -- All 34 are continuous, as described above
60 | -- The 35th attribute is either "good" or "bad" according to the definition
61 | summarized above. This is a binary classification task.
62 |
63 | 8. Missing Values: None
64 |
65 |
66 |
--------------------------------------------------------------------------------
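As a quick orientation for the tutorials, here is a minimal sketch (not part of the repository) of how the ionosphere.data rows above could be loaded in Python. The column names, the real/imaginary pairing of adjacent columns, and the file path are illustrative assumptions based on the description in ionosphere.names, not anything the repository itself defines:

import pandas as pd

# Per ionosphere.names: 2 attributes per pulse number, 17 pulse numbers
# -> 34 continuous predictors, plus the "g"/"b" class label (35th column).
# Column names below are invented for illustration.
cols = [f"pulse{i}_{part}" for i in range(1, 18) for part in ("re", "im")]
df = pd.read_csv("tutorials/ionosphere.data", header=None, names=cols + ["label"])

X = df[cols]                          # 34 continuous predictors
y = (df["label"] == "g").astype(int)  # encode "good" as 1, "bad" as 0

# If adjacent columns are (real, imaginary) pairs, as the description
# suggests, the 17 complex autocorrelation returns per instance are:
vals = X.to_numpy()
pulses = vals[:, 0::2] + 1j * vals[:, 1::2]   # shape (351, 17), complex

print(X.shape, y.mean())  # expect (351, 34) and the "good"-class fraction

A frame of this shape is all a workflow like the one in Classification_Tutorial.ipynb needs: the 34 predictors and the binary label, with no missing values to handle.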