├── .gitignore
├── 1. Data Preprocessing.ipynb
├── 2. Feature Selection.ipynb
├── 3. Dimension Reduction.ipynb
├── GA.py
├── README.md
├── SA.py
├── images
│   ├── Embedded_Pipeline.png
│   ├── Filter_Pipeline.png
│   ├── GA_Pseudo_Code.png
│   ├── SA_Pseudo_Code.png
│   ├── Wrapper_Pipeline.png
│   ├── 基于基因算法特征选择.png
│   ├── 基于模拟退火特征选择.png
│   ├── 封装法工作流.png
│   ├── 嵌入法工作流.png
│   └── 过滤法工作流.png
├── 中文版.md
└── 中文版
    ├── 1. 数据预处理.ipynb
    ├── 2. 特征选择.ipynb
    └── 3. 特征降维.ipynb
/.gitignore:
--------------------------------------------------------------------------------
1 |
2 | .DS_Store
3 | .ipynb_checkpoints
4 |
--------------------------------------------------------------------------------
/3. Dimension Reduction.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "**Feature Engineering Notebook Three: Dimensionality Reduction** \n",
8 | "*Author: Yingxiang Chen, Zihan Yang*"
9 | ]
10 | },
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "**Reference**\n",
16 | "- https://scikit-learn.org/stable/modules/decomposition.html#principal-component-analysis-pca\n",
17 | "- https://sebastianraschka.com/faq/docs/lda-vs-pca.html\n",
18 | "- https://en.wikipedia.org/wiki/Linear_discriminant_analysis"
19 | ]
20 | },
21 | {
22 | "cell_type": "markdown",
23 | "metadata": {
24 | "toc": true
25 | },
26 | "source": [
27 | "
Table of Contents
\n",
28 | ""
29 | ]
30 | },
31 | {
32 | "cell_type": "markdown",
33 | "metadata": {},
34 | "source": [
35 | "# Dimension Reduction"
36 | ]
37 | },
38 | {
39 | "cell_type": "markdown",
40 | "metadata": {},
41 | "source": [
42 | "After data preprocessing and feature selection, we have generated a good feature subset. But sometimes, this subset might still contain too many features and cost so much computing power to train. In this case, we can use dimension reduction techniques to further compress our feature subset. But this might deprecate model performance.\n",
43 | "\n",
44 | "We can also apply dimension reduction methods directly after data preprocessing if we don't have much time on feature selection. The dimension reduction algorithm can compress the original feature space and generate a feature subset for us.\n",
45 | "\n",
46 | "Specifically, we will introduce PCA and LDA (Linear Discriminant Analysis)."
47 | ]
48 | },
49 | {
50 | "cell_type": "markdown",
51 | "metadata": {},
52 | "source": [
53 | "## Unsupervised Methods"
54 | ]
55 | },
56 | {
57 | "cell_type": "markdown",
58 | "metadata": {},
59 | "source": [
60 | "### PCA (Principal Components Analysis)"
61 | ]
62 | },
63 | {
64 | "cell_type": "markdown",
65 | "metadata": {},
66 | "source": [
67 | "PCA is an **unsupervised** technique that finds the directions of maximal variance. It uses a few unrelated features to represent original features in the dataset and tries to retain as much information (variance) as possible. More math detail can be viewed from a [repo](https://github.com/YC-Coder-Chen/Unsupervised-Notes/blob/master/PCA.md) written by us in Github."
68 | ]
69 | },
70 | {
71 | "cell_type": "code",
72 | "execution_count": 1,
73 | "metadata": {
74 | "ExecuteTime": {
75 | "end_time": "2020-03-30T03:37:18.226607Z",
76 | "start_time": "2020-03-30T03:37:15.126399Z"
77 | }
78 | },
79 | "outputs": [],
80 | "source": [
81 | "import numpy as np\n",
82 | "import pandas as pd\n",
83 | "from sklearn.decomposition import PCA\n",
84 | "\n",
85 | "# load dataset\n",
86 | "from sklearn.datasets import fetch_california_housing\n",
87 | "dataset = fetch_california_housing()\n",
88 | "X, y = dataset.data, dataset.target # use california_housing dataset as example\n",
89 | "\n",
90 | "# use the first 20000 observations as train_set\n",
91 | "# the rest observations as test_set\n",
92 | "train_set = X[0:20000,]\n",
93 | "test_set = X[20000:,]\n",
94 | "train_y = y[0:20000]\n",
95 | "\n",
96 | "# we need to standardize the data first or the PCA will only comopress features in\n",
97 | "# large scale\n",
98 | "from sklearn.preprocessing import StandardScaler\n",
99 | "model = StandardScaler()\n",
100 | "model.fit(train_set) \n",
101 | "standardized_train = model.transform(train_set)\n",
102 | "standardized_test = model.transform(test_set)\n",
103 | "\n",
104 | "# start compressing\n",
105 | "compressor = PCA(n_components=0.9) # set n_components=0.9 =>\n",
106 | "# select the number of components such that the amount of variance\n",
107 | "# explained is greater than 90% of the original variance\n",
108 | "# we can also set n_components to be the number of features we want directly\n",
109 | "\n",
110 | "compressor.fit(standardized_train) # fit on trainset\n",
111 | "transformed_trainset = compressor.transform(standardized_train) # transform trainset (20000,5)\n",
112 | "transformed_testset = compressor.transform(standardized_test) # transform test set\n",
113 | "\n",
114 | "assert transformed_trainset.shape[1] == transformed_testset.shape[1] # same number of features"
115 | ]
116 | },
117 | {
118 | "cell_type": "code",
119 | "execution_count": 2,
120 | "metadata": {
121 | "ExecuteTime": {
122 | "end_time": "2020-03-30T03:37:18.780647Z",
123 | "start_time": "2020-03-30T03:37:18.242698Z"
124 | }
125 | },
126 | "outputs": [
127 | {
128 | "data": {
129 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEKCAYAAAD9xUlFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzt3Xl8FeXZ//HPxSo7su8EIYDsYAQV6opWrXWviisuRdtStba/51Hroy192sdareIuUtwrdS9aLSiKigsQVFYFAwSJoCD7FrJdvz9mkh5jSCbI5CQ53/frlRdnzrln5joTcq4zM/d93ebuiIiIANRJdgAiIlJ9KCmIiEgJJQURESmhpCAiIiWUFEREpISSgoiIlFBSEBGREkoKIiJSQklBRERK1Et2AJXVpk0bT0tLS3YYIiI1yvz5879x97YVtatxSSEtLY3MzMxkhyEiUqOY2eoo7XT5SERESigpiIhICSUFEREpoaQgIiIllBRERKRErEnBzE40s2VmlmVm15fxenczm2lmC81slpl1iTMeEREpX2xJwczqAvcBJwH9gDFm1q9Us9uBx919EDAB+L+44hERkYrFeaYwHMhy95XungdMBU4r1aYfMDN8/FYZr4uIpLzNO/P4y/TPyP5mZ+z7inPwWmdgTcJyDjCiVJsFwFnAROAMoJmZtXb3jTHGJSJSI2zZlcfkd1fx6PvZ7MwroEOLRqS1aRLrPuNMClbGc15q+TfAvWY2FngH+BIo+M6GzMYB4wC6deu2f6MUEalmtu7KZ/LslTzyXjY79hTwo4Edufq4dPp0aBb7vuNMCjlA14TlLsDaxAbuvhY4E8DMmgJnufvW0hty90nAJICMjIzSiUVEpFbYuiufv723ikdmr2L7ngJOHtiBq49Lp2+H5lUWQ5xJYR6QbmY9CM4AzgPOT2xgZm2ATe5eBNwATIkxHhGRamnr7nymzF7FlPdWsT23gJMGBMng4I5VlwyKxZYU3L3AzMYD04G6wBR3X2JmE4BMd58GHA38n5k5weWjX8QVj4hIdbMtN0gGf5sdJIMf9m/PNcf1pl+nqk8Gxcy9Zl2NycjIcFVJFZGabFtuPo++l83kd1eyLbeAE/q155rR6fTv1CK2fZrZfHfPqKhdjSudLSJSU20vTgazV7F1dz7H92vPNcelM6BzfMmgspQURERitj03n8fez+bhd4NkMPrgdlw7une1SgbFlBRERGKyY09BmAxWsmVXPsf1DZLBwC7VLxkUU1IQEdnPdu4p4LEPsnn4nZVs3pXPsX3bcc1x6Qzu2jLZoVVISUFEZD/ZuaeAxz9YzaR3VrB5Vz5H92nLtaN7M6QGJINiSgoiIt/TrrziZLCSTTvzOKp3W64dnc7QbgcmO7RKU1IQEdlHu/IKePLD1Tz09ko27szjyN5tuea4dA7pXvOSQTElBRGRStqdVxgkg3dW8M2OPH6Q3oZrR6dzSPdWyQ7te1NSEBGJaHdeIU/NWc2DbwfJYFSvIBlkpNX8ZFBMSUFEpAK5+YU8NecLHnx7BRu272Fkr9Y8MLo3h9aiZFBMSUFEZC9y8wv5+5wveCBMBkf0bM195w9jeI/alwyKKSmIiJSSm1/I03O/4IFZK1i/fQ+HHdSKe8YM5bCDWic7tNgpKYiIhHLzC/nHvDXcPyuLr7ftYXiPVkw8byiH96z9yaCYkoKIpLw9BWEyeGsFX23LZXhaK+48dwhH9GyT7NCqnJKCiKSsPQWFPDNvDffPWsG6rbkcmnYgfz1nMIf3bI1ZWTMK135KCiKScvYUFPJsZg73v5XF2q25ZHQ/kNt/MpgjUjgZFFNSEJGUkVdQxLPz13Dfm0EyGNatJX8+exCjerVJ+WRQLFJSMLNRQLq7P2JmbYGm7r4q3tBERPaPvIIinpufw31vZfHllt0M7daSW88axA/SlQxKqzApmNktQAbQB3gEqA88CYyMNzQRke8nv7CI5+fncM+bQTIY0rUlfzpzIEcqGexVlDOFM4ChwEcA7r7WzJrFGpWIyPeQX1jECx8FySBn824Gd23J/54xgKN7t1UyqECUpJDn7m5mDmBmTWKOSURkn+QXFvHiR19yz1ufs2bTbgZ1acEfThvA0X2UDKKKkhSeMbOHgJZm9lPgMuDheMMSEYmuoLCIFz/+knvezOKLTbsY2LkFvx/bn2P6tFMyqKQKk4K7325mxwPbCO4r3Ozur0fZuJmdCEwE6gKT3f3WUq93Ax4DWoZtrnf3Vyv3FkQkVRUUFvHSJ2u5583PWb1xFwM6N2fyxRkcd7CSwb6KcqO5B/BucSIws0Zmlubu2RWsVxe4DzgeyAHmmdk0d1+a0Owm4Bl3f8DM+gGvAmn79E5EJGUUFBbxzzAZZG/cRf9OzXn44gxGKxl8b1EuHz0LHJGwXBg+d2gF6w0Hstx9JYCZTQVOAxKTggPNw8ctgLUR4hGRFFVY5Exb8CV3z8xi1Tc76dexOZMuOoTj+7VXMthPoiSFeu6eV7zg7nlm1iDCep2BNQnLOcCIUm1+B8wws18CTYDRZW3IzMYB4wC6desWYdciUpsUFjkvL1jL3TM/Z+U3Ozm4Y3MeuugQTlAy2O+iJIUNZnaqu08DMLPTgG8irFfWb8pLLY8BHnX3O8zscOAJMxvg7kXfWsl9EjAJICMjo/Q2RKSWKixyXlm4lokzP2flhp307dCMBy8cxgn9OlCnjpJBHKIkhauAp8zsXoIP+jXAxRHWywG6Jix34buXhy4HTgRw9w/M7ACgDbA+wvZFpJYqLHL+tWgdd8/8nKz1O+jTvhkPXDCMH/ZXMohblN5HK4DDzKwpYO6+PeK25wHp4Y3qL4HzgPNLtfkCOA541MwOBg4ANkQNXkRql6KEZPD5+h30bt+U+y8YxolKBlUmSu+jhsBZBL2C6hVfv3P3CeWt5+4FZjYemE7Q3XSKuy8xswlAZng56tfAw2b2K4JLS2PdXZeHRFJMUZHz6uJ1THwjSAbp7Zpy7/lDOXlARyWDKhbl8tE/ga3AfGBPZTYejjl4tdRzNyc8XopqKImkrKIi57XFXzFx5nKWf72DXu2acs+YofxooJJBskRJCl3c/cTYIxGRlFFU5Exf8hUTZ37OZ19tp2fbJtwdJoO6SgZJFSUpvG9mA919UezRiEit5u68tWw9d8xYzpK12ziobRMmnjeEUwZ1UjKoJqIkhVHAWDNbRXD5yAB390GxRiYitcp7Wd9w+4xlfPzFFrq1asxfzxnMaUM6KxlUM1GSwkmxRyEitdb81Zu4ffpyPli5kY4tDuD/zhzI2Yd0oX7dOskOTcoQpUvqagAza0fQZVREpEKLv9zK7TOWMWvZBto0bcgtP+7HmOHdOKB+3WSHJuWI0iX1VOAOoBPBoLLuwKdA/3hDE5GaaPnX2/nrjOX8e8lXtGhUn/8+sS+XHNGdxg00JXxNEOW39AfgMOANdx9qZscQlKcQESmR/c1O7npjOf9csJYmDepx7eh0LhvVg+YH1E92aFIJUZJCvrt
vNLM6ZlbH3d8ysz/HHpmI1AhfbtnNPTM/59n5OdSva1x5ZE+uPPIgDmwSpW6mVDdRksKWsMTFOwQ1kNYDBfGGJSLV3fptudz3VhZPzw2KIV90WHd+fkxP2jXTrceaLEpSOA3IBX4FXEAw70G5JS5EpPbavDOPB99ewWMfZFNQ6Pwkoyu/PLYXnVo2SnZosh9E6X20M2HxsRhjEZFqbFtuPpPfXcWU2avYmVfAGUM6c83odLq3bpLs0GQ/2mtSMLPZ7j7KzLbz7XkQigevNd/LqiJSi+zKK+DR97N56O2VbN2dz8kDO/Cr0b1Jb98s2aFJDPaaFNx9VPivfvMiKSg3v5Cn5nzBA7Oy+GZHHsf2bcd1x/dmQOcWyQ5NYlTu5SMzqwMsdPcBVRSPiCRZXkERz85fwz0zs/hqWy4je7XmoeP7cEj3A5MdmlSBcpOCuxeZ2QIz6+buX1RVUCJS9QqLnJc+/pK7Zi5nzabdDOvWkr+eO5gjerZJdmhShaL0PuoILDGzuUDJTWd3PzW2qESkyhRPcHPn68tZsWEn/Ts155GxAzi6T1uKJ9WS1BElKfw+9ihEpMq5OzM/Xc8dry/n03XbSG/XlAcvDOZBVjJIXVG6pL5dFYGISNVwd97L2sjtM5bxyZotdG/dmLvOHcKPB2tOA4lWEO8w4B7gYKABwXzLO9UlVaTmyczexF+mL2POqk10anEAt545kLNUxloSRLl8dC9wHvAskAFcDKTHGZSI7F+LcoIy1m8v30DbZg35/an9OW94VxrWUxlr+bZItWzdPcvM6rp7IfCImb0fc1wish8s+2o7f319GdOXfE3LxvW54aS+XHx4Go0aKBlI2aIkhV1m1gD4xMxuA9YBkca1m9mJwESCS06T3f3WUq/fCRwTLjYG2rl7y6jBi0jZVoVlrKctWEvTBvX41ejeXDYqjWYqYy0ViJIULgLqAOMJiuJ1Bc6qaCUzqwvcBxwP5ADzzGyauy8tbuPuv0po/0tgaKWiF5Fvydm8i3tmZvHcRzk0qFuHq44Kyli3bKwy1hJNlKQwDHjV3bdRue6pw4Esd18JYGZTCSquLt1L+zHALZXYvoiEvi4pY/0FZsYlh6fxs6N70rZZw2SHJjVMlKRwKnCXmb0DTAWmu3uU+RQ6A2sSlnOAEWU1NLPuQA/gzQjbFZHQpuIy1u9nU1jknHNoV8YfozLWsu+ijFO41MzqAycB5wP3m9nr7n5FBauW1eHZy3gOgt5Nz4U3sr+7IbNxwDiAbt26VRSySK23dXc+k99dyZTZq9idX8jpQztzzXEqYy3fX9TeR/lm9hrBh3ojgstAFSWFHIL7D8W6AGv30vY84Bfl7H8SMAkgIyNjb4lFpNbbuae4jPUKtuUW8KOBHfnV8en0aqdixrJ/RBm8diLBh/YxwCxgMnBOhG3PA9LNrAfwZbiN88vYfh/gQOCDyFGLpJjc/EKe/HA1D8xawcadeRzXtx3XndCb/p1Uxlr2ryhnCmMJ7iVc6e57om7Y3QvMbDwwnaBL6hR3X2JmE4BMd58WNh0DTHV3nQGIlJJXUMQzmWu4982gjPWoXm247oTeDOumMtYSD6tpn8UZGRmemZmZ7DBEYlVQWMRLn6xlYljGOqP7gfz6hD4c3rN1skOTGsrM5rt7RkXtIt1TEJGqUVTk/GvROu58YzkrN+xkYOcW/OHSARzVW2WspWooKYhUA+7OG5+u544Zy/jsq+30bt+UBy88hB/2b69kIFVKSUEkidyd2VnfcPuM5SxYs4W01o2ZeN4QThmkMtaSHHtNCma2iL2PK8DdB8USkUiKmLtqE7fPWMbcVZvo3LIRt501iDOHdaaeylhLEpV3pnBK+G/x+IEnwn8vAHbFFpFILbdgzRbueH0574RlrCec1p9zD1UZa6ke9poU3H01gJmNdPeRCS9db2bvARPiDk6kNvnsq23cMWM5ry/9mgMb1+fGk/ty0WEqYy3VS5R7Ck3MbJS7zwYwsyOIWDpbRGDlhh3c+cbnvLIwKGN93fG9uWxUD5o21C09qX6i/K+8HJhiZi0I7jFsBS6LNSqRWmDNpl3cPfNznv8oh4b16vKzo3oyTmWspZqLUhBvPjDYzJoTDHbbGn9YIjXX19tyuffNLKbOC8pYXzqyBz87uidtmqqMtVR/UWoftQf+BHRy95PMrB9wuLv/LfboRGqQjTv28ODbK3j8g9UUFjnnHtqV8cf2omMLlbGWmiPK5aNHgUeA34bLy4F/AEoKIgQlKR59P5s7X1/O7vxCzhjahWtHp9O1VeNkhyZSaVGSQht3f8bMboCSQndlznsgkmqWrN3K9c8vYtGXWzm2bztuPPlgerVrmuywRPZZlKSw08xaEw5kM7PDCG42i6Ss3XmF3DVzOZPfXcWBjetzz5ihnDKoo0pSSI0XJSlcB0wDeobjE9oCZ8calUg1Nvvzb7jxxUV8sWkX52R04caTD1aPIqk1ovQ++sjMjgL6EEyxuczd82OPTKSa2bwzj//916c8/1EOaa0b8/efjuCInm2SHZbIfhV19MxwIC1sP8zMcPfHY4tKpBpxd6YtWMuEl5eydXc+Pz+6J1cfl84B9TUSWWqfKF1SnwB6Ap8AxTeYHVBSkFovZ/MubnppMbOWbWBwlxY8ecUIDu7YPNlhicQmyplCBtBP02VKKiksch59P5s7ZiwD4OZT+nHJEWkqZy21XpSksBjoAKyLORaRamHp2m3c8MJCFuRs5Zg+bfnD6QPocqDGHEhqiDROAVhqZnOBPcVPuvupsUUlkgS5+YVMnPk5k95ZSctG9bl7zFB+rG6mkmKiJIXfxR2ESLK9nxV0M83euIuzD+nCb08+mAObqJuppJ4oXVLfropARJJhy648/vivT3l2fg7dWzfmqStGMLKXuplK6ipvOs7Z7j7KzLbz7Wk5DXB3r7ALhpmdCEwE6gKT3f3WMtqcQ3A24sACdz+/cm9BpPLcnZcXrmPCy0vYvCufq47qybWj1c1UpLyZ10aF/zbblw2bWV3gPuB4IAeYZ2bT3H1pQpt04AZgpLtvNrN2+7Ivkcr4cstubnpxEW8t28CgLi147LLh9O/UItlhiVQLkad+Cj+wDyhedvcvKlhlOJDl7ivD9acCpwFLE9r8FLjP3TeH21wfNR6Ryiosch57P5vbZyzDHW760cFcOrKHupmKJIgyeO1U4A6gE7Ae6A58CvSvYNXOwJqE5RxgRKk2vcN9vEdwiel37v7vMmIYB4wD6NatW0Uhi3zHZ19t47+fX8SCNVs4qndb/vf0ASptLVKGKGcKfwAOA95w96FmdgwwJsJ6ZX39Kj0Arh6QDhwNdAHeNbMB7r7lWyu5TwImAWRkZGgQnUSWm1/IPW9+zkNvr6RFo/pMPG8Ipw7upG6mInsRJSnku/tGM6tjZnXc/S0z+3OE9XKArgnLXYC1ZbT5MCywt8rMlhEkiXlRghcpzwcrNnLji4tY9c1OzhrWhZt+pG6mIhWJkhS2mFlT4B3gKTNbDxREWG8ekG5mPYAvgfOA0j2LXiI463jUzNoQXE5aGT
V4kbJs3ZXPn179lH9krqFbq8Y8efkIRqWrm6lIFFGSwmlALvAr4AKgBTChopXCGdrGA9MJ7hdMcfclZjYByHT3aeFrJ5jZUoJie//P3Tfu21uRVOfu/GvROn43bSmbd+Vx5VEHce1xvWnUQN1MRaKymlbnLiMjwzMzM5MdhlQza7fs5n9eWszMz9YzoHNzbj1zEAM6q5upSDEzm+/uGRW1K2/wWpmD1qjE4DWRuBUWOU98kM1fpi+jyOG3Jx/MpSPTqFe3TrJDE6mRyhu8tk+D1kSqyrKvtvPfzy/kkzVb+EF6G/50xkB1MxX5niINXjOzYcAogjOF2e7+caxRiZQjN7+Qe9/M4sG3V9C8UX3uPHcwpw/prG6mIvtBlMFrNwM/AV4In3rUzJ519/+NNTKRMny4ciM3vrCIld/s5MyhnbnplH60UjdTkf0mypnCGGCou+cCmNmtwEeAkoJUma2787n1tU95eu4aurZqxOOXDefI3m2THZZIrRMlKWQT1DzKDZcbAiviCkgkkbvz2uKvuGXaEjbu2MO4Iw/i2tHpNG4QuWyXiFRClL+sPcASM3ud4J7C8cBsM7sbwN2vjjE+SWHrtu7mf15awhuffk3/Ts15ZOyh6mYqErMoSeHF8KfYrHhCEQkUFTlPzlnNbf9eRkFRETee3JfLRvZQN1ORKhAlKbxWuqS1mfVx92UxxSQpbPnX27n++YV89EXQzfSPpw+kW2t1MxWpKlGSwrtm9j/u/gyAmf0auBzoF2tkklJy8wu5/60sHnh7BU0b1uOOnwzmzGHqZipS1aIkhaOBSWb2E6A9wVwKw+MMSlLL3FWbuP6FhazcsJPTh3Tif07pR+umDZMdlkhKqjApuPs6M/s3wbSZRcAN7r4j9sik1gu6mX7G03O/oMuBjXjssuEcpW6mIkkVZfDa68A6YADBnAhTzOwdd/9N3MFJ7fXvxeu4+Z9L+GbHHq4Y1YPrTuitbqYi1UCUv8L73P2l8PEWMzuC4KxBpNK+2prLzf9czIylX9OvY3MmX5LBoC4tkx2WiISiXD56ycy6A+nu/gZQH7gr9sikVikqcp6a+wW3vfYZeYVFXH9SXy4f1YP66mYqUq1EuXz0U2Ac0AroSXAJ6UHguHhDk9ri86+3c8MLi8hcvZmRvVrzpzMG0r11k2SHJSJliHL56BcEvY3mALj752bWLtaopFbYU1DI/W+t4P5ZWTRpWI/bfzKYs9TNVKRai1Tmwt3ziv+Qzawe3558R+Q75mVv4oYXFpG1fgenhd1M26ibqUi1FyUpvG1mNwKNzOx44OfAy/GGJTXVttx8/vzaZzw15ws6t2zEI5ceyjF9dGIpUlNESQrXE4xgXgRcCbwKTI4zKKmZ/r34K26ZtpgN2/dw2cge/PqE3jRpqG6mIjVJlN5HRcDD4Y/Id3y9LehmOn3J1/Tt0IxJF2UwuKu6mYrURLF+jTOzE4GJQF1gsrvfWur1scBfgC/Dp+51d52F1BBFRc7f537Bn8Nupv91Yh9++oOD1M1UpAaLLSmYWV3gPoL5F3KAeWY2zd2Xlmr6D3cfH1ccEo+s9Tu44YWFzMvezBE9g26maW3UzVSkpoucFMysibvvrMS2hwNZ7r4yXH8qcBpQOilIDZJXUMQDs1Zw31tZNGpQl9vOHsRPDumibqYitUSF5/lmdoSZLSWojoqZDTaz+yNsuzOwJmE5J3yutLPMbKGZPWdmXaMELckxf/UmfnT3u9z5xnJ+OKADb1x3FOdkdFVCEKlFopwp3An8EJgG4O4LzOzICOuV9UlRenzDy8DT7r7HzK4CHgOO/c6GzMYRjKqmW7duEXYt+9P23Hxu+/cynpyzmk4tGvHI2EM5pq+6mYrURpEuH7n7mlLfBgsjrJYDJH7z7wKsLbXdjQmLDwN/3sv+JwGTADIyMjRwrgrNWPIVN/9zCV9vz2XsEWn85oQ+6mYqUotF+eteE1ZGdTNrAFxNeCmpAvOAdDPrQdC76Dzg/MQGZtbR3deFi6dG3K5Ugc0787jxxUW8tvgr+nZoxoMXHcIQdTMVqfWiJIWrCLqVdib49j+DoB5Sudy9wMzGA9MJuqROcfclZjYByHT3acDVZnYqUABsAsbu07uQ/errbblcOHkOqzfu4v/9sA/jjlQ3U5FUYe7lX40xs7buvqGK4qlQRkaGZ2ZmJjuMWmvNpl1cMHkOG3fs4eFLMjiiZ5tkhyQi+4GZzXf3jIraRTlTeN/MVgH/AJ539y3fOzqplj7/ejsX/m0OuflFPPXTw3S5SCQFVXhNwN3TgZuA/sBHZvaKmV0Ye2RSpRblbOWchz6gyOGZKw9XQhBJUZEuFLv7XHe/jmBA2iaCrqNSS8xdtYnzH/6Qxg3q8eyVh9OnQ7NkhyQiSRJl8FpzM7vEzF4D3gfWESQHqQVmLVvPxVPm0K55Q5772eEqVSGS4qLcU1gAvARMcPcPYo5HqtCri9ZxzdSP6d2+GY9fNpzWmgRHJOVFSQoHeUVdlKTGeSZzDdc/v5Bh3Q7kb2MPpUWj+skOSUSqgb0mBTO7y92vBaaZ2XeSgrufGmtkEpsps1cx4ZWl/CC9DQ9ddAiNG2iEsogEyvs0eCL89/aqCETi5+7cPTOLO99Yzon9OzBxzBAa1qub7LBEpBrZa1Jw9/nhwyHuPjHxNTO7Bng7zsBk/3J3/vivT5k8exVnDevCn88aSD2NUhaRUqJ8KlxSxnNj93McEqPCIueGFxYxefYqxh6Rxl/OHqSEICJlKu+ewhiCAnY9zGxawkvNgI1lryXVTV5BEdc98wmvLFzHL4/txXXH99b8ByKyV+XdUygek9AGuCPh+e3AwjiDkv0jN7+Qnz05n7eWbeDGk/sy7sieyQ5JRKq58u4prAZWA4dXXTiyv2zPzeeKxzKZm72JP50xkPNHaHIiEalYlBHNh5nZPDPbYWZ5ZlZoZtuqIjjZN5t35nHB5DnMX72Zu84dooQgIpFF6aB+L8EEOc8CGcDFQK84g5J9VzIXwqZdPHTRIRx3cPtkhyQiNUjU6TizzKyuuxcCj5jZ+zHHJfsgcS6ERy89VHMhiEilRUkKu8JpOD8xs9sIbj6ralo1o7kQRGR/iNJZ/SKC6TTHAzuBrsBZcQYllaO5EERkf6nwTCHshQSwG/h9vOFIZc1dtYnLH51H80b1eeqKESp9LSLfS3mD1xYBe62O6u6DYolIIpu1bD1XPTmfzi0b8eQVI+jYolGyQxKRGq68M4VTqiwKqTTNhSAicaho8JpUQ5oLQUTiEmXw2nYz2xb+5FZm8JqZnWhmy8wsy8yuL6fd2WbmZpZRmeBT0ZTZq/iv5xYyslcbHr98uBKCiOxXUW40f2sWdzM7nQhzNJtZXeA+4HggB5hnZtPcfWmpds2Aq4E5lYg75WguBBGpCpWun+zuLwHHRmg6HMhy95XungdMBU4ro90fgNuA3MrGkiqK50K4843lnDWsC/eeP1QJQURiUeGZgpmdmbBYh6DURZQ5mzsDaxKWc4ARpbY9FOjq7q+Y2W/KiWEcMA6gW7fUquNTWOT89
sVFTJ23hrFHpHHzKf2oU0elr0UkHlFGNP844XEBkE3Z3/hLK+uTqySZmFkd4E4iTNjj7pOASQAZGRlRElKtoLkQRKSqRbmncOk+bjuHYPRzsS7A2oTlZsAAYFb4QdcBmGZmp7p75j7us9bQXAgikgxRLh/1AH4JpCW2d/dTK1h1HpAerv8lQaXV8xPW30owgU/xfmYBv1FC0FwIIpI8US4fvQT8DXgZKIq6YXcvMLPxwHSC2klT3H2JmU0AMt19WvlbSE2bd+ZxySNzWbp2G3edO4TThnROdkgikkKiJIVcd797Xzbu7q8Cr5Z67ua9tD16X/ZRm2guBBFJtihJYaKZ3QLMAPYUP+nuH8UWVQrSXAgiUh1ESQoDCcpnH8t/Lh850cYqSASaC0FEqosoSeEM4KBwAJrsZ4tytnLxlDnUq1uHZ648nD4dmlW8kohITKKMaF4A6KtrDOau2sT5D39I4wa9Zf3HAAANI0lEQVT1eFYJQUSqgShnCu2Bz8xsHt++p1BRl1Qph+ZCEJHqKEpSuCX2KFKM5kIQkeoqyojmt6sikFShuRBEpDqLMqJ5O/+pWdQAqA/sdPfmcQZWG02ZvYoJryzlB+lteOiiQ2jcIMqJmohI1YltPgX5j8S5EH7Yvz13j1HpaxGpnuKcT0H49lwIZw7rzH3nD1NCEJFqK875FFJe4lwIlxzenVt+3F9zIYhItRbnfAopLXEuhPHH9OLXJ2guBBGp/uKcTyFlJc6FcMNJfbnyKM2FICI1Q4X3FMzsMTNrmbB8oJlNiTesmmt7bj6XTJnLrOUb+OMZA5QQRKRGiXL5aJC7bylecPfN4dzKUkrxXAhLNBeCiNRQUZJCHTM70N03A5hZq4jrpZRvzYVw4SGM7qe5EESk5ony4X4H8L6ZPUfQ6+gc4I+xRlXDaC4EEaktotxoftzMMgnGJhhwprsvjT2yGiJxLoQnrxjB0G4HJjskEZF9FukyUJgElAhKKZ4LoW6dOvzjysPo20GVP0SkZtO9gX00d9UmLn90Hs0b1eepK0aQ1qZJskMSEfnelBT2QfFcCJ1aNuLJy0fQqaXmQhCR2qHStY8qw8xONLNlZpZlZteX8fpVZrbIzD4xs9lm1i/OePaHVxet46ePZ3JQm6Y8c+XhSggiUqvElhTMrC5wH3AS0A8YU8aH/t/dfaC7DwFuA/4aVzz7wzOZaxj/948Y1KUlT487jDaaHEdEapk4zxSGA1nuvtLd84CplKqZ5O7bEhabUI0L7U2ZvYr/em4hI3u14YnLh2tyHBGpleK8p9AZWJOwnAOMKN3IzH4BXEcwgU+1K8mtuRBEJJXEeaZQVknQ75wJuPt97t4T+G/gpjI3ZDbOzDLNLHPDhg37Ocy901wIIpJq4kwKOUDXhOUuwNpy2k8FTi/rBXef5O4Z7p7Rtm3b/Rji3hUWOTe8sIjJs1dxyeHduf3swdSrG+t9eRGRpIvzU24ekG5mPcysAXAeMC2xgZmlJyz+CPg8xngiyyso4pqpHzN13hrGH9OL352qyXFEJDXEdk/B3QvMbDwwHagLTHH3JWY2Ach092nAeDMbDeQDm4FL4oonKs2FICKpLNbBa+7+KvBqqeduTnh8TZz7r6ztuflc8Vgmc7M38cczBnDBiO7JDklEpEppRHNIcyGIiCgpALB+Wy4X/m0O2Rs1F4KIpLaUTwqaC0FE5D9SOilkrd/OhZPnsju/UHMhiIiQwklh8ZdbuXjKXOqYaS4EEZFQSiaFedmbuOwRzYUgIlJayiWFt5dv4MonMjUXgohIGVIqKby2aB1XT/2Y9HbNePzy4Sp9LSJSSsokhRc+yuE3zy5gaLcDmTL2UJW+FhEpQ8okhW6tGjP64Pbcdd4QGjdImbctIlIpKfPpmJHWioy0VskOQ0SkWlMtaBERKaGkICIiJZQURESkhJKCiIiUUFIQEZESSgoiIlJCSUFEREooKYiISAlz92THUClmtgFYvY+rtwG+2Y/h7C+Kq3IUV+VV19gUV+V8n7i6u3vbihrVuKTwfZhZprtnJDuO0hRX5SiuyquusSmuyqmKuHT5SERESigpiIhIiVRLCpOSHcBeKK7KUVyVV11jU1yVE3tcKXVPQUREypdqZwoiIlKOWpcUzGyKma03s8V7ed3M7G4zyzKzhWY2rJrEdbSZbTWzT8Kfm6sorq5m9paZfWpmS8zsmjLaVPkxixhXlR8zMzvAzOaa2YIwrt+X0aahmf0jPF5zzCytmsQ11sw2JByvK+KOK2Hfdc3sYzN7pYzXqvx4RYwrmccr28wWhfvNLOP1+P4m3b1W/QBHAsOAxXt5/WTgNcCAw4A51SSuo4FXknC8OgLDwsfNgOVAv2Qfs4hxVfkxC49B0/BxfWAOcFipNj8HHgwfnwf8o5rENRa4t6r/j4X7vg74e1m/r2Qcr4hxJfN4ZQNtynk9tr/JWnem4O7vAJvKaXIa8LgHPgRamlnHahBXUrj7Onf/KHy8HfgU6FyqWZUfs4hxVbnwGOwIF+uHP6VvzJ0GPBY+fg44zsysGsSVFGbWBfgRMHkvTar8eEWMqzqL7W+y1iWFCDoDaxKWc6gGHzahw8PT/9fMrH9V7zw8bR9K8C0zUVKPWTlxQRKOWXjJ4RNgPfC6u+/1eLl7AbAVaF0N4gI4K7zc8JyZdY07ptBdwH8BRXt5PSnHK0JckJzjBUFCn2Fm881sXBmvx/Y3mYpJoaxvINXhG9VHBMPQBwP3AC9V5c7NrCnwPHCtu28r/XIZq1TJMasgrqQcM3cvdPchQBdguJkNKNUkKccrQlwvA2nuPgh4g/98O4+NmZ0CrHf3+eU1K+O5WI9XxLiq/HglGOnuw4CTgF+Y2ZGlXo/tmKViUsgBEjN+F2BtkmIp4e7bik//3f1VoL6ZtamKfZtZfYIP3qfc/YUymiTlmFUUVzKPWbjPLcAs4MRSL5UcLzOrB7SgCi8d7i0ud9/o7nvCxYeBQ6ognJHAqWaWDUwFjjWzJ0u1ScbxqjCuJB2v4n2vDf9dD7wIDC/VJLa/yVRMCtOAi8O794cBW919XbKDMrMOxddRzWw4we9mYxXs14C/AZ+6+1/30qzKj1mUuJJxzMysrZm1DB83AkYDn5VqNg24JHx8NvCmh3cHkxlXqWvOpxLcp4mVu9/g7l3cPY3gJvKb7n5hqWZVfryixJWM4xXut4mZNSt+DJwAlO61GNvfZL39sZHqxMyeJuiV0sbMcoBbCG664e4PAq8S3LnPAnYBl1aTuM4GfmZmBcBu4Ly4/zBCI4GLgEXh9WiAG4FuCbEl45hFiSsZx6wj8JiZ1SVIQs+4+ytmNgHIdPdpBMnsCTPLIvjGe17MMUWN62ozOxUoCOMaWwVxlakaHK8ocSXreLUHXgy/79QD/u7u/zazqyD+v0mNaBYRkRKpePlIRET2QklBRERKKCmIiEgJJQURESmhpCAiIiWUFKRGM7NZZhb7XLpmdrUFFVufintfyWRmLc3s58mOQ5JHSUFSVjh6NqqfAye7+wVxxVNNtCR4r5KilBQkdmaWFn7LftiC
Wv8zwlG33/qmb2ZtwrIDxbXsXzKzl81slZmNN7PrLKh9/6GZtUrYxYVm9r6ZLQ5HNhePCp1iZvPCdU5L2O6zZvYyMKOMWK8Lt7PYzK4Nn3sQOAiYZma/KtW+rpndbkHt+4Vm9svw+ePC/S4K42gYPp9tZn8ysw/MLNPMhpnZdDNbUTw4yYJ5It4xsxfNbKmZPWhmdcLXxoTbXGxmf06IY4eZ/dGC4oAfmln78Pm2ZvZ8eBzmmdnI8PnfhXHNMrOVZnZ1uKlbgZ4W1PH/i5l1DGP5JNznD/b5P4LUDPurBrd+9LO3HyCNYFTokHD5GeDC8PEsICN83AbIDh+PJRit2QxoS1A586rwtTsJCuQVr/9w+PhIwvkqgD8l7KMlwXwMTcLt5gCtyojzEGBR2K4psAQYGr6WTRn17YGfEdRnqhcutwIOIKhg2Tt87vGEeLOBnyW8j4UJ73F9+PzRQC5BIqoLvE4wersT8EXYth7wJnB6uI4DPw4f3wbcFD7+OzAqfNyNoGwIwO+A94GG4XHfSDDCPo2EOT+AXwO/DR/XBZol+/+TfuL9qXVlLqTaWuXuxeUq5hN8+FTkLQ/mUthuZlsJqlZC8ME9KKHd0xDMWWFmzS2oAXQCQcGz34RtDiAskUFQVrqsgmujgBfdfSeAmb0A/AD4uJwYRxNMEFMQxrDJzAaH73d52OYx4BcEpZohqFtT/D6aJrzH3DB2gLnuvjKM4+kwtnxglrtvCJ9/iiARvgTkAcWzh80Hjk+Ir5/9Z3qC5hbW1QH+5UHBtz1mtp6gvEJp84ApFhQnfCnhdyi1lJKCVJU9CY8LgUbh4wL+cxnzgHLWKUpYLuLb/3dL12pxgtLCZ7n7ssQXzGwEsHMvMe7LxC5Wxv4r2k7i+yj9Hovf197e097ku3vxOoUJ26kDHO7uu78VYJAkSv9OvvN5ECbaIwkmo3nCzP7i7o+XE4fUcLqnIMmWzX9KEp+9j9s4F8DMRhFUi9wKTAd+aVZSRXVohO28A5xuZo0tqE55BvBuBevMAK4qvmkd3uv4DEgzs15hm4uAtyv5noabWY/wXsK5wGyCSYaOCu+91AXGRNjuDGB88YKZDamg/XaCy1nF7bsTXNZ6mKBwXZXMaS7JozMFSbbbgWfM7CKCa+T7YrOZvQ80By4Ln/sDweWahWFiyAZOKW8j7v6RmT0KzA2fmuzu5V06gmAqx97hfvIJ7m/ca2aXAs+GyWIe8GAl39MHBDd9BxIkqxfdvcjMbgDeIjhreNXd/1nBdq4G7jOzhQR/7+8AV+2tsbtvNLP3zGwxwRzAi4H/F763HcDFlXwfUsOoSqpINWNmRwO/cfdyk5hIHHT5SERESuhMQURESuhMQURESigpiIhICSUFEREpoaQgIiIllBRERKSEkoKIiJT4/x1i0Sbu5283AAAAAElFTkSuQmCC\n",
130 | "text/plain": [
131 | ""
132 | ]
133 | },
134 | "metadata": {
135 | "needs_background": "light"
136 | },
137 | "output_type": "display_data"
138 | }
139 | ],
140 | "source": [
141 | "# visualize the relationship between cumulative variance explained and number of components\n",
142 | "%matplotlib inline\n",
143 | "import matplotlib.pyplot as plt\n",
144 | "plt.plot(np.array(range(len(compressor.explained_variance_ratio_))) + 1, \n",
145 | " np.cumsum(compressor.explained_variance_ratio_))\n",
146 | "plt.xlabel('number of components')\n",
147 | "plt.ylabel('cumulative explained variance')\n",
148 | "plt.show(); # top 5 components can already explained 90% of the original variance"
149 | ]
150 | },
151 | {
152 | "cell_type": "markdown",
153 | "metadata": {},
154 | "source": [
155 | "## Supervised Methods"
156 | ]
157 | },
158 | {
159 | "cell_type": "markdown",
160 | "metadata": {},
161 | "source": [
162 | "### LDA (Linear Discriminant Analysis)"
163 | ]
164 | },
165 | {
166 | "cell_type": "markdown",
167 | "metadata": {},
168 | "source": [
169 | "Compared with PCA, LDA is a supervised technique attempts to find a feature subset to maximize class linear-separability, that is, the projected points of the observations with the same class label are as close as possible, while the distances between the centers of difference class labels are as large as possible. LDA can only be applied to classification problems. LDA assumes that classes are normally distributed and have the same covariance matrix. \n",
170 | " \n",
171 | "Math detail can be accessed at the [official website](https://scikit-learn.org/stable/modules/lda_qda.html#lda-qda) of sklearn. Traditionally, LDA will reduce dimension to (K-1) where K is the number of classes. But in sklearn, it allows further dimension by incorporating PCA into LDA."
172 | ]
173 | },
174 | {
175 | "cell_type": "code",
176 | "execution_count": 3,
177 | "metadata": {
178 | "ExecuteTime": {
179 | "end_time": "2020-03-30T03:37:18.844214Z",
180 | "start_time": "2020-03-30T03:37:18.787150Z"
181 | }
182 | },
183 | "outputs": [],
184 | "source": [
185 | "import numpy as np\n",
186 | "import pandas as pd\n",
187 | "from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA\n",
188 | "\n",
189 | "# classification example\n",
190 | "# use iris dataset\n",
191 | "from sklearn.datasets import load_iris\n",
192 | "iris = load_iris()\n",
193 | "X, y = iris.data, iris.target\n",
194 | "\n",
195 | "# random suffle the dataset\n",
196 | "# use the first 100 observations as train_set\n",
197 | "# the rest 50 observations as test_set\n",
198 | "np.random.seed(1234)\n",
199 | "idx = np.random.permutation(len(X))\n",
200 | "X = X[idx]\n",
201 | "y = y[idx]\n",
202 | "\n",
203 | "train_set = X[0:100,:]\n",
204 | "test_set = X[100:,]\n",
205 | "train_y = y[0:100]\n",
206 | "test_y = y[100:,]\n",
207 | "\n",
208 | "# we need to standardize the data because LDA assumes normal distribution\n",
209 | "from sklearn.preprocessing import StandardScaler\n",
210 | "model = StandardScaler()\n",
211 | "model.fit(train_set) \n",
212 | "standardized_train = model.transform(train_set)\n",
213 | "standardized_test = model.transform(test_set)\n",
214 | "\n",
215 | "# start compressing\n",
216 | "compressor = LDA(n_components=2) # set n_components=2\n",
217 | "# n_components <= min(n_classes - 1, n_features)\n",
218 | "\n",
219 | "compressor.fit(standardized_train, train_y) # fit on trainset\n",
220 | "transformed_trainset = compressor.transform(standardized_train) # transform trainset, (100,2)\n",
221 | "transformed_testset = compressor.transform(standardized_test) # transform test set\n",
222 | "assert transformed_trainset.shape[1] == transformed_testset.shape[1] # same number of features"
223 | ]
224 | },
225 | {
226 | "cell_type": "code",
227 | "execution_count": 4,
228 | "metadata": {
229 | "ExecuteTime": {
230 | "end_time": "2020-03-30T03:37:19.129206Z",
231 | "start_time": "2020-03-30T03:37:18.847470Z"
232 | }
233 | },
234 | "outputs": [
235 | {
236 | "data": {
237 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZIAAAEKCAYAAAA4t9PUAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzt3Xd8VfX9x/HXh7333nsTBMNwI9WKC0Vq3bOuqrX1VxVw1IG46qitq2pRabVWWaKC4AQVVEAlCTvsAMreBDI+vz/OoUYKyYHk5ma8n4/HfXDvGfe+D4H7yVmfr7k7IiIiR6pMvAOIiEjxpkIiIiL5okIiIiL5okIiIiL5okIiIiL5okIiIiL5okIiIiL5okIiIiL5okIiIiL5Ui7eAQpDvXr1vFWrVvGOISJSrMyZM2eju9fPa7lSUUhatWrF7Nmz4x1DRKRYMbOVUZbToS0REckXFRIREckXFRIREckXFRIREckXFRIREcmXmBYSMxtlZuvNLOUQ883M/mpmqWaWZGa9csy7wsyWhI8rckw/2sySw3X+amYWy20QEZHcxXqP5FVgYC7zTwfah4/rgOcBzKwOcC/QF+gD3GtmtcN1ng+X3b9ebu8vIiIxFtNC4u7Tgc25LHIOMNoDXwG1zKwxcBrwobtvdvctwIfAwHBeDXef6cEYwaOBc2O5DSIixdGarXu4/915ZGZlx/yz4n1DYlNgdY7XaeG03KanHWT6/zCz6wj2XGjRokXBJRYRKcKys53Xv17JI5MXku0wuGdTEprViulnxruQHOz8hh/B9P+d6P4i8CJAYmLiQZcRESlJlm3YybCxyXyzYjMntK/HQ4O707xOlZh/brwLSRrQPMfrZsDacHr/A6Z/Fk5vdpDlRURKrcysbF76fDlPfbSYSuXK8OdfJfCro5tRWNcixfvy34nA5eHVW/2Abe6+DpgC/NLMaocn2X8JTAnn7TCzfuHVWpcD78QtvYhInM1bu41zn/uSRz9YyICODfjojydxfmLzQisiEOM9EjP7N8GeRT0zSyO4Eqs8gLu/AEwCzgBSgd3AVeG8zWY2ApgVvtUD7r7/pP1vCa4GqwxMDh8iIqVKekYWf/tkCS9MW0btKhV4/pJenN69cVyyWHDxU8mWmJjo6v4rIiXFnJWbuWNMEks37GJIr2bcc1ZnalWpUOCfY2Zz3D0xr+XifY5EREQi2rU3kz9PWcRrM1fQpGZlXru6Dyd1yHO4kJhTIRERKQamL97A8HHJrN22hyuOacVtp3WkWsWi8RVeNFKIiMhBbd29jwffX8CYOWm0qV+Vt68/hsRWdeId62dUSEREiqjJyeu45515bNm9j5tObsvvBrSnUvmy8Y71P1RIRESKmPU70rn3nXlMTvmBrk1q8NrVvenapGa8Yx2SComISBHh7oyZk8aD7y9gT0YWdwzsyLUntKF82Xjf8pc7FRIRkSJg9ebd3Dk+mc+XbKR3q9o8MiSBtvWrxTtWJCokIiJxlJ3tjJ65gsemLMKAEed05ZK+LSlTpvgMtaRCIiISJ6nrdzB0bDJzVm7hpA71GTm4G81qx77JYkFTIRERKWQZWdm8OH0ZT3+0hCoVy/Lkr3swuGfTQu2PVZBUSEREClHKmm3cMSaJ+eu2c2b3xtw3qCv1q1eMd6x8USERESkE6RlZPP3xEl6cvow6VSvwwqVHM7Bbo3jHKhAqJCIiMTZrxWaGjkli2cZdXJDYnDvP6EzNKuXjHavAqJCIiMTIzr2ZPPbBQkbPXEmz2pX512/6cnz7evGOVeBUSEREYuDTReu5a1wy67anc/VxrbnttA5UqVAyv3JL5laJiMTJll37GPHefMZ9t4Z2Daox5oZjObpl7XjHiikVEhGRAuDuTEr+gXsnprB1dwa3DGjHTQPaUbFc0WuyWNAiFRIzOx5o7+6vmFl9oJq7L49tNBGR4mH99nTunpDC1Pk/0r1pTUZf3ZcuTWrEO1ahybOQmNm9QCLQEXiFYMz1fwHHxTaaiEjR5u68PTuNEe/PZ19mNsNP78Rvjm9NuSLeZLGgRdkjGQz0BL4FcPe1ZlY9pqlERIq4VZt2M3x8El+mbqJP6zo8OiSB1vWqxjtWXEQpJPvc3c3MAcysdP5NiYgAWdnOqzNW8PiURZQtYzx4bjcu7tOiWDVZLGhRCslbZvZ3oJaZXQtcDbwU21giIkXPkh93cMfYJL5btZWTO9Zn5ODuNKlVOd6x4i7PQuLuj5vZqcB2gvMkf3L3D2OeTESkiNiXmc0L05byzCepVK1Ylr9ccBTnHNWk2DZZLGhRTra3Bj7fXzzMrLKZtXL3FbEOJyISb0lpW7ljTBILf9jB2T2acO/ZXahXrXg3WSxoUQ5tvQ0cm+N1Vjitd0wSiYgUAXv2ZfGXjxbz0ufLqF+9Ii9dnsipXRrGO1aRFKWQlHP3fftfuPs+M6sQw0wiInH11bJNDBubxIpNu7moT3OGn9GZGpVKTpPFghalkGwws0HuPhHAzM4BNsY2lohI4duRnsEjkxfy+teraFGnCm9c05dj25W8JosFLUohuQF43cyeAQxYDVwe01QiIoXsk4U/ctf4FH7cns41x7fmj7/sSOUKJb+9SUGIctXWUqCfmVUDzN13xD6WiEjh2LxrHw+8O48J36+lQ8NqPHfJsfRsUbKbLBa0KFdtVQSGAK2Acvsvd3P3B2KaTEQkhtydd5PWcd/EeexIz+APp7Tnxv7tqFCudLU3KQhRDm29A2wD5gB7YxtHRCT2ftiWzt0TkvlowXp6NK/FY0MS6NhInZ+OVJRC0szdB8Y8iYhIjLk7b85azUPvLyAjO5u7z+zMVce1pmwpbm9SEKLsw80ws+5H8uZmNtDMFplZqpkNO8j8lmb2sZklmdlnZtYsx7xHzSwlfFyQY/ovzOxbM/vezL4ws3ZHkk1ESpeVm3Zx8UtfM3xcMt2a1mTKH07kmhPaqIgUgCh7JMcDV5rZcoJDWwa4uyfktpKZlQWeBU4F0oBZZjbR3efnWOxxYLS7v2ZmA4CHgcvM7EygF3AUUBGYZmaT3X078DxwjrsvMLMbgbuBK6NvsoiUJlnZzitfLufxqYsoX6YMD5/XnQt7N1d7kwIUpZCcfoTv3QdIdfdlAGb2JnAOkLOQdAFuDZ9/CkzIMX2au2cCmWY2FxgIvAU4sH/EmJrA2iPMJyIl3KIfgiaLc1dv5ZTODXjw3O40qlkp3rFKnCiX/64EMLMGwOH8BJoS3HOyXxrQ94Bl5hJcEfY0wbgn1c2sbjj9XjN7EqgCnMxPBegaYJKZ7SFoJNnvMDKJSCmwLzObZz9N5bnPUqleqTx/vagnZyc01l5IjES5/HcQ8ATQBFgPtAQWAF3zWvUg0/yA17cBz5jZlcB0YA2Q6e5Tzaw3MAPYAMwEMsN1bgXOcPevzex24EmC4nJg7uuA6wBatGiRR1QRKSm+X72VO8bMZfGPOzn3qCb86eyu1Kmqrk6xFOXQ1giC3/o/cveeZnYycFGE9dKA5jleN+O
Aw1DuvhY4DyC84XGIu28L540ERobz3gCWhOPF93D3r8O3+A/wwcE+3N1fBF4ESExMPLCAiUgJs2dfFk9MXcSoL5fTsEYlRl2ZyIBOarJYGKIUkgx332RmZcysjLt/amaPRlhvFtA+bEO/BrgQuDjnAmZWD9js7tnAcGBUOL0sUCv83AQgAZgarlbTzDq4+2KCE/kLImQRkRJsxtKNDBubzKrNu7mkbwuGnd6J6mqyWGiiFJKt4d7CdIKeW+v56TDTIbl7ppndDEwBygKj3H2emT0AzA6bQPYHHg6H8Z0O3BSuXh74PDyeuR24NDzxTjhK41gzywa2EIzYKCKl0Pb0DB6etIB/f7OaVnWr8OZ1/ejXpm68Y5U65p77UZ9wjPZ0gnMelxBcKfW6u2+KfbyCkZiY6LNnz453DBEpQB/O/5G7JySzYcderj2hDX84pYOaLBYwM5vj7ol5LRflqq1dOV6+lq9UIiL5tHHnXu6bOI/3ktbRqVF1Xro8kYRmteIdq1Q7ZCExsy/c/Xgz28HPr7baf0NijUOsKiJS4Nydd75fy/3vzmPX3iz+eGoHrj+prZosFgGHLCTufnz4pzqZiUhcrd26h7snpPDJwvX0bBE0WWzfUF9NRUWuh7bMrAyQ5O7dCimPiMh/ZWc7b3yzikcmLyQr2/nTWV244thW6o9VxORaSNw928zmmlkLd19VWKFERJZv3MWwsUl8vXwzx7Wry8ODE2hRt0q8Y8lBRLn8tzEwz8y+Af574t3dB8UslYiUWplZ2fzji+U8+eFiKpQrw2NDEjg/sZnamxRhUQrJ/TFPISICzF+7naFjk0hes41fdmnIiHO70bCGmiwWdVEu/51WGEFEpPTam5nFM5+k8vxnS6lVpTzPXtyLM7o30l5IMRGlaWM/4G9AZ6ACwV3qu3T5r4gUhDkrtzB0bBKp63dyXq+m3HNmF2qryWKxEuXQ1jMEfbLeBhKBy4H2sQwlIiXf7n2Z/HnKIl6dsYLGNSrxylW9Obljg3jHkiMQpZDg7qlmVtbds4BXzGxGjHOJSAn2xZKNDBuXRNqWPVx+TEvuGNiJahUjfR1JERTlJ7fbzCoA35vZY8A6oGpsY4lISbRtdwYjJ83nrdlptKlXlbeuP4Y+revEO5bkU5RCchlQBriZYFCp5gSjGoqIRPZByg/c804Km3ft47f92/L7X7SnUnk1WSwJohSSXsAkd9+OLgUWkcO0YUfQZPH95HV0aVyDV67sTbemNeMdSwpQlEIyCPiLmU0H3gSm7B8bRETkUNydcd+u4YH35rNnXxa3n9aR605sQ/myarJY0kS5j+QqMysPnE4wwuFzZvahu//POOkiIgBrtu7hznHJTFu8gaNb1ubRIQm0a1At3rEkRqJetZVhZpMJ2slXBs4BVEhE5Geys51/fb2SRycvxIH7zu7C5ce0ooyaLJZoUW5IHEhwH8nJwGfAy8CvYxtLRIqbpRt2MmxsErNWbOGE9vV4aHB3mtdRk8XSIMoeyZUE50aud/e9sY0jIsVNRlY2L32+jL98tITK5cvy+Pk9GNKrqdqblCJRzpFcWBhBRKT4SVmzjaFjk5i3djund2vE/ed0pUF1NVksbXQrqYgctvSMLP72yRJemLaM2lUq8PwlvTi9e+N4x5I4USERkcMye8Vm7hibxLINu/jV0c24+8zO1KqiJoulmQqJiESya2/QZPG1mStoUrMyo6/uw4kd6sc7lhQBhywkZpZMcLnvQbl7QkwSiUiRM23xBu4cl8zabXu44phW3H5aR6qqyaKEcvuXcFb4503hn/8M/7wE2B2zRCJSZGzdvY8R7y1g7LdptK1flbevP4bEVmqyKD93yELi7isBzOw4dz8ux6xhZvYl8ECsw4lI/ExOXsc978xjy+593HxyO24e0E5NFuWgouybVjWz4939CwAzOxa1kRcpsdZvT+dP78zjg3k/0LVJDV67ujddm6jJohxalELyG2CUmdUkOGeyDbg6pqlEpNC5O2PmpDHivfmkZ2YzdGAnrj2hNeXUZFHyEOWGxDlADzOrAZi7b4t9LBEpTKs37+bO8cl8vmQjfVrV4eEh3WlbX00WJZoovbYaAg8BTdz9dDPrAhzj7v+IeToRiamsbGf0zBX8ecoiDBhxTlcu6dtSTRblsEQ5tPUq8ApwV/h6MfAfQIVEpBhLXb+DoWOTmbNyCyd1qM9D53Wnaa3K8Y4lxVCUQlLP3d8ys+EA7p5pZlkxziUiMZKRlc3fpy3lrx+nUqViWZ78dQ8G91STRTlyUQrJLjOrS3hzopn1IzjhLiLFTMqabdw+JokF67ZzZkJj7ju7K/WrV4x3LCnmolyO8X/ARKBteP/IaOB3Ud7czAaa2SIzSzWzYQeZ39LMPjazJDP7zMya5Zj3qJmlhI8Lckw3MxtpZovNbIGZ3RIli0hplp6RxSOTF3LOs1+ycede/n7Z0Tx7cS8VESkQUa7a+tbMTgI6AgYscveMvNYzs7LAs8CpQBowy8wmuvv8HIs9Dox299fMbADwMHCZmZ0J9AKOAioC08xssrtvJxgfpTnQyd2zzazBYWyvSKnz9bJNDBuXzPKNu7ggsTl3ntGZmlXKxzuWlCBRm+X0AVqFy/cyM9x9dIR1Ut19GYCZvUkwRG/OQtIFuDV8/ikwIcf0ae6eCWSa2VxgIPAW8FvgYnfPBnD39RG3QaRU2ZGewWMfLOKfX62keZ3KvH5NX45rVy/esaQEinL57z+BtsD3wP6T7E5wiCs3TYHVOV6nAX0PWGYuMAR4GhgMVA/Px8wF7jWzJ4EqBMP87i9AbYELzGwwsAG4xd2X5LUdIqXJp4vWc9e4ZNZtT+fq41pz22kdqFJBTRYlNqL8y0oEurj7ITsBH8LBLgE58D1uA54xsyuB6cAaINPdp5pZb2AGQbGYCWSG61QE0t090czOA0YBJ/zPh5tdB1wH0KJFi8OMLlI8bdm1jxHvzWfcd2to36AaY397LL1a1I53LCnhohSSFKARsO4w3zuN4FzGfs2AtTkXcPe1wHkAZlYNGLL/znl3HwmMDOe9Aezf60gDxobPxxPc4/I/3P1F4EWAxMTEwy2CIsWKu/N+8jrufWce2/ZkcMuAdtw0oB0Vy6nJosRepPtIgPlm9g2wd/9Edx+Ux3qzgPZm1ppgT+NC4OKcC5hZPWBzeL5jOMHexf4T9bXcfZOZJQAJwNRwtQnAgHDZkwhukBQptX7cns7dE1L4cP6PdG9ak39d05fOjWvEO5aUIlEKyX1H8sbhjYs3A1OAssAod59nZg8As919ItAfeNjMnODQ1v6xT8oDn4c3SG0HLg1PvAM8ArxuZrcCO4FrjiSfSHHn7rw1ezUPvr+AfZnZ3HlGJ64+Tk0WpfDZ4Z/6KH4SExN99uzZ8Y4hUmBWbdrNsHFJzFi6ib6t6/DokARa1dPoDlKwzGyOuyfmtVxuQ+1+4e7Hm9kOfn6S3AB3d+07ixSyrGzn1RkreHzKIsqWMUYO7sZFvVuoyaLEVW4jJB4f/lm98OKIyKEs/nEHd4xJ4vvVWxnQqQ
EjB3ejcU01WZT4i3xheXgHeaX9r919VUwSicjP7MvM5vnPlvLMp0uoVrEcT194FIN6NFGTRSkyotyQOAh4AmgCrAdaAguArrGNJiJzV29l6NgkFv6wg7N7NOG+s7tQt5r6Y0nREmWPZATQD/jI3Xua2cnARbGNJVK67dmXxVMfLeblz5dRv3pFXro8kVO7NIx3LJGDilJIMsL7OcqYWRl3/9TMHo15MpFSaubSTQwfl8SKTbu5qE8Lhp/RiRqV1GRRiq4ohWRreNf5dIL7N9bzU7sSESkg29MzeGTyQt74ehUt61bhjWv7cmxbNVmUoi9KITkHSCfo0nsJUBN4IJahREqbTxb+yJ3jUli/I51rT2jN/53akcoV1N5Eioco45HsyvHytRhmESl1Nu3cywPvzeed79fSsWF1XrjsaI5qXivesUQOS243JB70RkR0Q6JIvrk7E+eu5f5357MjPYM/nNKeG/u3o0I5tTeR4ie3GxJ1I6JIDKzbtoe7x6fw8cL19Ghei8eGJNCxkf67SfEV6YZEM+sFHE+wR/KFu38X01QiJVB2tvPmrNU8PGkBGdnZ3H1mZ646rjVl1d5EirkoNyT+CTgfGBdOetXM3nb3B2OaTKQEWbFxF8PGJfHVss0c06YujwzpTsu6arIoJUOUPZKLgJ7ung5gZo8A3wIqJCJ5yMzK5pUvV/DEh4soX6YMj5zXnQt6N1d7EylRohSSFQQ9ttLD1xWBpbEKJFJSLPxhO0PHJDE3bRundG7Ag+d2p1HNSnmvKFLMRCkke4F5ZvYhwTmSU4EvzOyvAO5+SwzziRQ7ezOzePbTpTz3aSo1K5fnbxf15KyExtoLkRIrSiEZHz72+yw2UUSKv+9WbWHo2CQW/7iTwT2bcs9ZXahTtUK8Y4nEVJRCMtnd1+ecYGYd3X1RjDKJFDu792XyxNTFjPpyOY1qVGLUlYkM6KQmi1I6RCkkn5vZPe7+FoCZ/RH4DdAlpslEiokZqRsZNi6ZVZt3c2m/Fgwd2InqarIopUiUQtIfeNHMzgcaEoxF0ieWoUSKg217Mnh40gLenLWaVnWr8OZ1/ejXpm68Y4kUuii9ttaZ2QfAcCAbGO7uO2OeTKQImzrvB+6ekMLGnXu5/qQ23HpKByqVV5NFKZ2i3JD4IbAO6AY0A0aZ2XR3vy3W4USKmo0793LfxHm8l7SOTo2q8/IViSQ0U5NFKd2iHNp61t0nhM+3mtmxBHsnIqWGuzPh+zXc/+58du/N4o+nduCG/m0pX1ZNFkWiHNqaYGYtgfbu/hFQHvhLzJOJFBFrt+7hrvHJfLpoAz1bBE0W2zdUk0WR/aIc2roWuA6oA7QlOLz1AvCL2EYTia/sbOf1b1bx6OSFZGU7fzqrC1cc20pNFkUOEOXQ1k0EV2l9DeDuS8ysQUxTicTZsg07GTY2mW9WbOb4dvV4+LzuNK9TJd6xRIqkSC1S3H3f/vYOZlaOnw94JVJiZGZl8/IXy3nqw8VULFeGx36VwPlHN1N7E5FcRCkk08zsTqCymZ0K3Ai8G9tYIoVv/trt3DF2LilrtnNa14aMOKcbDWqoyaJIXqIUkmEEd7InA9cDk4CXYxlKpDDtzczimU9Sef6zpdSqUp7nLunF6d0aaS9EJKIoV21lAy+FD5ESZc7KoMli6vqdnNerKfec2YXaarIoclgiDbUrUtLs2pvJ41MX8eqMFTSpWZlXr+pN/466hkTkSKiQSKnz+ZINDB+XTNqWPVx+TEvuGNiJahX1X0HkSEX+32NmVd19VyzDiMTStt0ZPPj+fN6ek0abelV56/pj6NO6TrxjiRR7efZ3MLNjzWw+QddfzKyHmT0X5c3NbKCZLTKzVDMbdpD5Lc3sYzNLMrPPzKxZjnmPmllK+LjgIOv+zczUPFIi+SDlB055ahrjvlvDjf3bMun3J6iIiBSQKHskTwGnARMB3H2umZ2Y10pmVhZ4lmBo3jRglplNdPf5ORZ7HBjt7q+Z2QDgYeAyMzsT6AUcRTBG/DQzm+zu28P3TgTUKU/ytH5HOvdNnMek5B/o0rgGr1zZm25Na8Y7lkiJEunQlruvPuBSyKwIq/UBUt19GYCZvQmcA+QsJF2AW8PnnwITckyf5u6ZQKaZzQUGAm+FBerPwMXA4Cj5pfRxd8Z+u4YR781nT0YWt5/WketObKMmiyIxEOV/1eqw46+bWQUzu43wMFcemgKrc7xOC6flNBcYEj4fDFQ3s7rh9NPNrIqZ1QNOBpqHy90MTHT3dREySCmUtmU3V7wyi9venku7BtWYdMsJ3HRyOxURkRiJskdyA/A0QRFIA6YS9N/Ky8Hu5jqwtcptwDNmdiUwHVgDZLr7VDPrDcwANgAzCfZMmgDnE4zamPuHm11H0GySFi1aRIgrxV12tvPPr1by6AcLAbh/UFcu69eSMmqyKBJTUQqJufslR/Deafy0FwFB1+C1ORdw97XAeQBmVg0Y4u7bwnkjgZHhvDeAJUBPoB2QGh5qq2Jmqe7e7sAPd/cXgRcBEhMT1RushFu6YSdDxyQxe+UWTuxQn4cGd6NZbTVZFCkMUQrJDDNbDvwHGOvuWyO+9yygvZm1JtjTuJDgvMZ/hYetNod3zw8HRoXTywK13H2TmSUACcDU8JxJoxzr7zxYEZHSIyMrmxenL+Ppj5dQuXxZHj+/B0N6NVV7E5FCFKVFSnsz60NQCO4KLwV+093/lcd6mWZ2MzAFKAuMcvd5ZvYAMNvdJxIconrYzJzg0Nb+Q2blgc/DL4PtwKVhERH5r5Q12xg6Nol5a7dzRvdG3DeoKw2qq8miSGEz9+hHfcI9iCeBS9y9bMxSFbDExESfPXt2vGNIAUnPyOKvHy/h79OXUbtKBR48tysDuzWOdyyREsfM5rh7Yl7LRRkhsQbBFVUXEoyQOJ7g0l6RQjdrxWaGjkli2cZdnH90M+4+sws1q5SPdyyRUi3KOZK5BPd3PODuM2OcR+Sgdu7N5LEPFjJ65kqa1qrM6Kv7cGKH+vGOJSJEKyRt/HCOf4kUsGmLN3DnuGTWbtvDlce24vbTOlJVTRZFioxD/m80s7+4+x+AieHJ8J9x90ExTSal3tbd+3jgvfmM+3YNbetXZcwNx3B0S/XHEilqcvu17p/hn48XRhCRnCYlr+NP76SwdXcGN5/cjpsHtKNS+WJzfYdIqXLIQuLuc8KnR7n70znnmdnvgWmxDCal0/rt6dzzTgpT5v1It6Y1eO3qPnRtoiaLIkVZlAPNVxC0SMnpyoNMEzli7s7bc9J48L35pGdmM3RgJ649oTXl1B9LpMjL7RzJRQR3orc2s4k5ZlUHNsU6mJQeqzfvZvi4ZL5I3UifVnV4ZEh32tSvFu9YIhJRbnskM4B1QD3giRzTdwBJsQwlpUNWtjN65goe+2ARZQxGnNuNS/q0UJNFkWImt3MkK4GVwDGFF0dKi9T1O7hjTBLfrtpK/471GTm4O01rVY53LBE5AlHubO8H/A3oDFQg6Ju1y91rxDiblEAZWdm88NlS/vZJKlUqluWpC3pw7lFqsihSnEU52
f4MQXuUt4FE4HKCVu4ihyU5bRu3j5nLwh92cGZCY+4f1JV61SrGO5aI5FPUoXZTzaysu2cBr5jZjBjnkhIkPSOLpz5azEvTl1GvWkX+ftnRnNa1Ud4rikixEKWQ7DazCsD3ZvYYwQn4qrGNJSXF18s2MWxcMss37uLC3s0ZfkZnalZWk0WRkiRKIbmM4LzIzcCtBKMeDsl1DSn1dqRn8OgHC/nXV6toXqcyr1/Tl+Pa1Yt3LBGJgSgDW60Mn+4B7o9tHCkJPl24nrvGJ7Nuezq/Ob41f/xlB6pUUJNFkZIqtxsSk4FDdv1194SYJJJia/OufYx4bz7jv1tD+wbVGPvbY+nVona8Y4lIjOX2a+JZhZZCijV3572kddw3cR7b9mRwyy/ac9PJbalYTk0WRUqDvG5IFMnVj9vTuWt8Ch8t+JGEZjX51zV96dxYtxiJlCZRbkjcwU+HuCoA5dENiaWeu/MJuwFoAAAQyUlEQVSfWasZOWkB+zKzufOMTlx9nJosipRGUU62V8/52szORWO2l2qrNu1m2LgkZizdRN/WdXh0SAKt6umKcJHS6rAvpXH3CWY2LBZhpGjLynZe+XI5j09dRLkyZXhocHcu7N1cTRZFSrkoh7bOy/GyDEGbFI3hXsos+mEHd4xNYu7qrQzo1ICRg7vRuKaaLIpItD2Ss3M8zwRWAOfEJI0UOfsys3nus1Se/TSV6pXK8/SFRzGoRxM1WRSR/4pyjuSqwggiRc/c1Vu5Y0wSi37cwaAeTbj37C7UVZNFETlAlENbrYHfAa1yLu/ug2IXS+Jpz74snvxwEf/4YjkNqlfi5csTOaVLw3jHEpEiKsqhrQnAP4B3gezYxpF4m7l0E8PGJbFy024u7tuCYad3okYlNVkUkUOLUkjS3f2vMU8icbU9PYOHJy3k39+somXdKrxxbV+ObasmiyKStyiF5GkzuxeYCuzdP9Hdv41ZKilUH83/kbsmJLNhx16uO7ENt57SgcoV1N5ERKKJUki6E7SSH8BPh7Y8fC3F2Kade7n/3flMnLuWjg2r8/fLEjmqea14xxKRYiZKIRkMtHH3fbEOI4XD3Zk4dy33TZzHzr2Z3HpKB37bvy0Vyqm9iYgcviiFZC5QC1gf4yxSCNZt28Pd41P4eOF6jmpei8d+lUCHhtXzXlFE5BCiFJKGwEIzm8XPz5Ho8t9iJDvb+fesVTw8aSGZ2dncfWZnrjquNWXV3kRE8ilKIbn3SN/czAYCTxMM1fuyuz9ywPyWwCigPrAZuNTd08J5jwJnhouOcPf/hNNfJ2jTkgF8A1zv7hlHmrE0WL5xF8PGJvH18s0c27Yuj5yXQIu6VeIdS0RKiCh3tk87kjc2s7LAs8CpQBowy8wmuvv8HIs9Dox299fMbADwMHCZmZ0J9AKOAioC08xssrtvB14HLg3XfwO4Bnj+SDKWdJlZ2Yz6cjlPTF1MhbJleOS87lzQu7nam4hIgYrleCR9gFR3Xxa+z5sEPbpyFpIuwK3h808Jbn7cP32au2cCmWY2FxgIvOXuk3Jk+wZoltc2lEYL1m1n6NgkktK2cUrnhjx4bjca1awU71giUgLleZmOu1d39xrhoxIwBHgmwns3BVbneJ0WTstpbvh+EFwdVt3M6obTTzezKmZWDzgZaJ5zRTMrT3BZ8gcRspQaezOzePLDxZz9ty9Ys2UPz1zck5cuP1pFRERiJpbjkRzs+MmB7edvA54xsyuB6cAaINPdp5pZb2AGsAGYSdB5OKfngOnu/vlBP9zsOuA6gBYtWkSIW/x9u2oLQ8cksWT9Tgb3bMqfzupC7aoV4h1LREq4WI5HksbP9yKaAWtzLuDua4Hzws+pBgxx923hvJHAyHDeG8CSHJnuJThBf/2hPtzdXwReBEhMTCzR46fs3pfJE1MXM+rL5TSqUYlXruzNyZ0axDuWiJQSsRyPZBbQPuwevAa4ELg45wLhYavN7p4NDCe4gmv/ifpa7r7JzBKABIIWLZjZNcBpwC/C9Uq1L1M3MmxcEqs37+HSfi0YOrAT1dVkUUQKUczGI3H3TDO7GZhCcPnvKHefZ2YPALPdfSLQH3jYzJzg0NZN4erlgc/Dq4u2E1wWvP/Q1gvASmBmOH+cuz9wJBmLs217Mnjo/QX8Z/ZqWteryn+u60ffNnXjHUtESiFzz/2oj5m9Bvze3beGr2sDT7j71YWQr0AkJib67Nmz4x2jwEyd9wN3T0hh4869XBs2WaxUXk0WRaRgmdkcd0/Ma7koh7YS9hcRAHffYmY985VOjsiGHXu57915vJ+0jk6NqvPyFYkkNFOTRRGJryiFpIyZ1Xb3LQBmVifielJA3J0J36/h/nfns3tvFrf9sgPXn9SW8mXVZFFE4i9KQXgCmGFmYwiu1vo14dVUEntrtu7hrvHJfLZoA71aBE0W2zVQk0URKTqinGwfbWazCcYfMeC8A9qcSAxkZzuvf72SRyYvJNvh3rO7cPkxrdRkUUSKnEiHqMLCoeJRSJZt2Mmwscl8s2Izx7erx8Pndad5HTVZFJGiSec6ipDMrGxe+nw5T320mErlyvDYrxI4/+hmarIoIkWaCkkRMX/tdu4YO5eUNds5rWtDRpzTjQY11B9LRIo+FZI4S8/I4plPUnlh2lJqVanA85f04vTujeMdS0QkMhWSOJqzcjN3jEli6YZdDOnVjHvO6kytKmqyKCLFiwpJHOzam8mfpyzitZkraFKzMq9d3YeTOtSPdywRkSOiQlLIpi/ewPBxyazZuocrjmnJ7QM7Ua2ifgwiUnzpG6yQbNudwYj35zNmThpt6lfl7RuOoXerOvGOJSKSbyokheCDlHXc8848Nu/ax43923LLL9qryaKIlBgqJDG0fkc6974zj8kpP9ClcQ1eubI33ZrWjHcsEZECpUISA+7OmDlpPPj+AvZkZHH7aR257sQ2arIoIiWSCkkBW715N3eOT+bzJRtJbFmbR4Yk0K5BtXjHEhGJGRWSApKd7YyeuYLHpiwC4P5BXbmsX0vKqMmiiJRwKiQFIHX9ToaNTWL2yi2c2KE+Dw3uRrPaarIoIqWDCkk+ZGRl8+L0ZTz90RIqVyjLE+f34LxeTdVkUURKFRWSI5SyZht3jEli/rrtnNG9EfcP6kb96hXjHUtEpNCpkBym9Iwsnv54CS9OX0adqhV44dJeDOymJosiUnqpkByGWSs2M3RMEss27uL8o5tx95ldqFmlfLxjiYjElQpJBDv3ZvLYBwsZPXMlzWpX5p+/6cMJ7dVkUUQEVEjy9Nmi9dw1PoW12/Zw1XGtuO2XHamqJosiIv+lb8RcDB+XzL+/WUW7BtUYc8OxHN2ydrwjiYgUOSokuWhVtwq/G9COmwe0o2I5NVkUETkYFZJcXH9S23hHEBEp8tRFUERE8kWFRERE8kWFRERE8kWFRERE8kWFRERE8kWFRERE8kWFRERE8kWFRERE8sXcPd4ZYs7MNgArj3D1esDGAoxTHGibSwdtc8mX3+1t6e55dqgtFYUkP8xstrsnxjtHYdI2lw7a5pKvsLZXh7ZE
RCRfVEhERCRfVEjy9mK8A8SBtrl00DaXfIWyvTpHIiIi+aI9EhERyRcVEsDMRpnZejNLOcR8M7O/mlmqmSWZWa/CzljQImzzJeG2JpnZDDPrUdgZC1pe25xjud5mlmVmvyqsbLESZZvNrL+ZfW9m88xsWmHmi4UI/7Zrmtm7ZjY33OarCjtjQTKz5mb2qZktCLfn9wdZJqbfYSokgVeBgbnMPx1oHz6uA54vhEyx9iq5b/Ny4CR3TwBGUDKOLb9K7tuMmZUFHgWmFEagQvAquWyzmdUCngMGuXtX4PxCyhVLr5L7z/kmYL679wD6A0+YWYVCyBUrmcAf3b0z0A+4ycy6HLBMTL/DVEgAd58ObM5lkXOA0R74CqhlZo0LJ11s5LXN7j7D3beEL78CmhVKsBiK8HMG+B0wFlgf+0SxF2GbLwbGufuqcPliv90RttmB6mZmQLVw2czCyBYL7r7O3b8Nn+8AFgBND1gspt9hKiTRNAVW53idxv/+oEqy3wCT4x0i1sysKTAYeCHeWQpRB6C2mX1mZnPM7PJ4ByoEzwCdgbVAMvB7d8+Ob6SCYWatgJ7A1wfMiul3mMZsj8YOMq1UXO5mZicTFJLj452lEPwFGOruWcEvq6VCOeBo4BdAZWCmmX3l7ovjGyumTgO+BwYAbYEPzexzd98e31j5Y2bVCPam/3CQbYnpd5gKSTRpQPMcr5sR/DZToplZAvAycLq7b4p3nkKQCLwZFpF6wBlmlunuE+IbK6bSgI3uvgvYZWbTgR5ASS4kVwGPeHDvQ6qZLQc6Ad/EN9aRM7PyBEXkdXcfd5BFYvodpkNb0UwELg+vfOgHbHP3dfEOFUtm1gIYB1xWwn87/S93b+3urdy9FTAGuLGEFxGAd4ATzKycmVUB+hIcYy/JVhHsgWFmDYGOwLK4JsqH8FzPP4AF7v7kIRaL6XeY9kgAM/s3wdUb9cwsDbgXKA/g7i8Ak4AzgFRgN8FvNMVahG3+E1AXeC78DT2zuDe7i7DNJU5e2+zuC8zsAyAJyAZedvdcL48u6iL8nEcAr5pZMsEhn6HuXpw7Ah8HXAYkm9n34bQ7gRZQON9hurNdRETyRYe2REQkX1RIREQkX1RIREQkX1RIREQkX1RIREQkX1RIpNQJ24HE/FJmM7sl7Mj6eqw/K57MrJaZ3RjvHBI/KiQih8HMDufeqxuBM9z9kljlKSJqEWyrlFIqJFIkmVmr8Lf5l8IxFqaaWeVw3n/3KMysnpmtCJ9faWYTwrEmlpvZzWb2f2b2nZl9ZWZ1cnzEpeE4Kylm1idcv2o4lsWscJ1zcrzv22b2LjD1IFn/L3yfFDP7QzjtBaANMNHMbj1g+bJm9riZJYdjQ/wunP6L8HOTwxwVw+krzOwhM5tpZrPNrJeZTTGzpWZ2Q7hMfzObbmbjzWy+mb1gZmXCeReF75liZo/myLHTzEZaMC7HV+Fd3phZfTMbG/49zDKz48Lp94W5PjOzZWZ2S/hWjwBtLRjT5M9m1jjM8n34mScc8T8EKR7cXQ89itwDaEXQ2vuo8PVbwKXh88+AxPB5PWBF+PxKgjt3qwP1gW3ADeG8pwia2e1f/6Xw+YlASvj8oRyfUYug31TV8H3TgDoHyXk0QQfZqgQtyecBPcN5K4B6B1nntwR9kcqFr+sAlQi6s3YIp43OkXcF8Nsc25GUYxvXh9P7A+kExass8CHwK6AJQUuQ+gSdLD4Bzg3XceDs8PljwN3h8zeA48PnLQhabwDcB8wAKoZ/75sI7hhvtf/vMFzuj8Bd4fOyQPV4/3vSI7YPtUiRomy5u+9v+TCH4AsrL596MCbDDjPbBrwbTk8GEnIs928Ixq4wsxoWDPD0S2CQmd0WLlOJsM0E8KG7H2yMi+OB8R40PcTMxgEnAN/lkvEU4AV3zwwzbLZgBMrl/lNfs9cIBmD6S/h6Yo7tqJZjG9PD7ADfuPuyMMe/w2wZwGfuviGc/jpB8ZwA7APeC9edA5yaI18X+6kDcg0zqx4+f9/d9wJ7zWw90PAg2zcLGGVBI8EJOX6GUkKpkEhRtjfH8yyCNucQ7KnsPyxbKZd1snO8zubn/94P7A3kBH2Xhrj7opwzzKwvsOsQGY+k37wd5PPzep+c23HgNu7frkNt06FkuPv+dbJyvE8Z4Bh33/OzgEFhOfBn8j/fIWFxPhE4E/inmf3Z3UfnkkOKOZ0jkeJoBcEhJQgO3xyJCwDM7HiCTqjbCIbX/Z2F35hm1jPC+0wHzjWzKmZWlWBgrM/zWGcqcMP+E/fhuZuFQCszaxcucxlwuOOn9zGz1uG5kQuALwgGODopPJdUFrgowvtOBW7e/8LMjspj+R0Eh9r2L9+S4JDbSwRdaQt0fHAperRHIsXR48BbZnYZwTH/I7HFzGYANYCrw2kjCA4lJYXFZAVwVm5v4u7fmtmr/DSWxcvuntthLQjGeOkQfk4GwfmaZ8zsKuDtsMDM4vBHapxJcOK7O0GBG+/u2WY2HPiUYO9kkru/k8f73AI8a2ZJBN8R04EbDrWwu28ysy/NLIVgJM0U4PZw23YCpWHUxVJN3X9FSgAz6w/c5u65Fj6RWNChLRERyRftkYiISL5oj0RERPJFhURERPJFhURERPJFhURERPJFhURERPJFhURERPLl/wFuz7e2mCP1vAAAAABJRU5ErkJggg==\n",
238 | "text/plain": [
239 | ""
240 | ]
241 | },
242 | "metadata": {
243 | "needs_background": "light"
244 | },
245 | "output_type": "display_data"
246 | }
247 | ],
248 | "source": [
249 | "# visualize the relationship between cumulative variance explained and number of components\n",
250 | "%matplotlib inline\n",
251 | "import matplotlib.pyplot as plt\n",
252 | "plt.plot(np.array(range(len(compressor.explained_variance_ratio_))) + 1, \n",
253 | " np.cumsum(compressor.explained_variance_ratio_))\n",
254 | "plt.xlabel('number of components')\n",
255 | "plt.ylabel('cumulative explained variance')\n",
256 | "plt.show(); # LDA compresses the original 4 variables into 2 \n",
257 | "# These 2 variables can explain 100% of the variance"
258 | ]
259 | },
260 | {
261 | "cell_type": "code",
262 | "execution_count": null,
263 | "metadata": {},
264 | "outputs": [],
265 | "source": []
266 | }
267 | ],
268 | "metadata": {
269 | "kernelspec": {
270 | "display_name": "Python 3",
271 | "language": "python",
272 | "name": "python3"
273 | },
274 | "language_info": {
275 | "codemirror_mode": {
276 | "name": "ipython",
277 | "version": 3
278 | },
279 | "file_extension": ".py",
280 | "mimetype": "text/x-python",
281 | "name": "python",
282 | "nbconvert_exporter": "python",
283 | "pygments_lexer": "ipython3",
284 | "version": "3.6.8"
285 | },
286 | "toc": {
287 | "base_numbering": 1,
288 | "nav_menu": {},
289 | "number_sections": true,
290 | "sideBar": false,
291 | "skip_h1_title": false,
292 | "title_cell": "Table of Contents",
293 | "title_sidebar": "Contents",
294 | "toc_cell": true,
295 | "toc_position": {
296 | "height": "690.488px",
297 | "left": "28.9922px",
298 | "top": "134px",
299 | "width": "319.961px"
300 | },
301 | "toc_section_display": true,
302 | "toc_window_display": true
303 | }
304 | },
305 | "nbformat": 4,
306 | "nbformat_minor": 2
307 | }
308 |
--------------------------------------------------------------------------------
/GA.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | """
4 | Created on Mon Jan 27 11:18:30 2020
5 |
6 | @author: chenyingxiang
7 | """
8 |
9 | import random
10 | import numpy as np
11 | from tqdm import tqdm
12 | import random
13 | from sklearn.model_selection import KFold
14 | from deap import base, creator, tools, algorithms
15 |
16 | random.seed()
17 | np.random.seed()
18 |
19 | import warnings
20 | warnings.filterwarnings("ignore", category=RuntimeWarning)
21 |
22 | class Genetic_Algorithm(object):
23 | """
24 | Genetic Algorithm for feature selection
25 |
26 | Parameters
27 | ----------
28 | n_pop: int, default = 20
29 | The population size
30 |
31 | n_gen: int, default = 20
32 | The number of generations
33 |
34 | both: boolean, default = True
35 | If True, offspring are generated with deap.algorithms.varOr, so each offspring results from only one of crossover, mutation or reproduction.
36 | If False, deap.algorithms.varAnd is used, so an offspring can undergo both crossover and mutation.
37 |
38 | n_children: int, default = None
39 | The number of children to produce per generation when varOr is used (both = True),
40 | where each child results from one of crossover, mutation or reproduction.
41 | Default None will set n_children = n_pop.
42 | n_children corresponds to the lambda_ parameter in deap.algorithms.varOr
43 |
44 | cxpb: float, default = 0.5
45 | The probability of mating two individuals
46 | The sum of cxpb and mutpb shall be in [0,1]
47 |
48 | mutpb: float, default = 0.2
49 | The probability of mutating an individual
50 | The sum of cxpb and mutpb shall be in [0,1]
51 |
52 | cx_indpb: float, default = 0.25
53 | The independent probability for each attribute to be exchanged under uniform crossover.
54 |
55 | mu_indpb: float, default = 0.25
56 | The independent probability for each attribute to be flipped under mutFlipBit.
57 |
58 | algorithm: string, default="one-max"
59 | The offspring selection algorithm
60 | "NSGA2" is also available
61 |
62 | loss_func: object
63 | The loss function of the ML task.
64 | loss_func(y_true, y_pred) should return the loss.
65 |
66 | estimator: object
67 | A supervised learning estimator
68 | It has to have the `fit` and `predict` method (or `predict_proba` method for classification)
69 |
70 | predict_type: string, default="predict"
71 | Final prediction type.
72 | - For classification loss functions that require probability output,
73 | set predict_type to "predict_proba".
74 |
75 | Attributes
76 | ----------
77 | best_sol: list of bool
78 | Boolean mask marking the best subset of features found.
79 |
80 | best_loss: float
81 | The loss associated with the best_sol
82 |
83 | loss_dict: dictionary
84 | Store the evaluation results to speed up fitting process
85 |
86 | References
87 | ----------
88 | 1. https://deap.readthedocs.io/en/master/index.html
89 | 2. https://github.com/kaushalshetty/FeatureSelectionGA
90 | 3. Haupt, R. L. (1995). An introduction to genetic algorithms for electromagnetics.
91 | IEEE Antennas and Propagation Magazine, 37(2), 7-15.
92 | 4. Deb, K., Pratap, A., Agarwal, S., & Meyarivan, T. A. M. T. (2002). A fast and elitist multiobjective genetic algorithm: NSGA-II.
93 | IEEE transactions on evolutionary computation, 6(2), 182-197.
94 | 5. Mkaouer, W., Kessentini, M., Shaout, A., Koligheu, P., Bechikh, S., Deb, K., & Ouni, A. (2015). Many-objective software remodularization using NSGA-III.
95 | ACM Transactions on Software Engineering and Methodology (TOSEM), 24(3), 1-45.
96 | 6. Fortin, F. A., Rainville, F. M. D., Gardner, M. A., Parizeau, M., & Gagné, C. (2012). DEAP: Evolutionary algorithms made easy.
97 | Journal of Machine Learning Research, 13(Jul), 2171-2175.
98 |
99 | """
100 |
101 | def __init__(self, loss_func, estimator, n_pop = 20, n_gen = 20, both = True, n_children = None,
102 | cxpb = 0.5, mutpb = 0.2, cx_indpb = 0.25, mu_indpb = 0.25,
103 | algorithm = "one-max", predict_type = 'predict'):
104 |
105 | #### check type
106 | if not hasattr(estimator, 'fit'):
107 | raise ValueError('Estimator doesn\'t have a fit method')
108 | if not hasattr(estimator, 'predict') and not hasattr(estimator, 'predict_proba'):
109 | raise ValueError('Estimator doesn\'t have a predict or predict_proba method')
110 |
111 | for instant in [cxpb, mutpb, cx_indpb, mu_indpb]:
112 | if type(instant) != float:
113 | raise TypeError(f'{instant} should be float type')
114 | if (instant > 1) or (instant < 0):
115 | raise ValueError(f'{instant} should be within range [0,1]')
116 |
117 | for instant in [n_pop, n_gen]:
118 | if type(instant) != int:
119 | raise TypeError(f'{instant} should be int type')
120 |
121 | if type(both) != bool:
122 | raise TypeError(f'{both} should be boolean type')
123 |
124 | if predict_type not in ['predict', 'predict_proba']:
125 | raise ValueError('predict_type should be "predict" or "predict_proba"')
126 |
127 | if algorithm not in ['one-max', 'NSGA2']:
128 | raise ValueError('algorithm should be "one-max" or "NSGA2"')
129 |
130 | if not n_children:
131 | n_children = n_pop
132 |
133 | if type(n_children) != int:
134 | raise TypeError(f'{n_children} should be int type')
135 |
136 | if (cxpb + mutpb) > 1.0:
137 | raise ValueError('The sum of cxpb and mutpb shall be in [0,1]')
138 |
139 | self.n_pop = n_pop
140 | self.n_gen = n_gen
141 | self.both = both
142 | self.n_children = n_children
143 | self.cxpb = cxpb
144 | self.mutpb = mutpb
145 | self.cx_indpb = cx_indpb
146 | self.mu_indpb = mu_indpb
147 | self.algorithm = algorithm
148 | self.loss_func = loss_func
149 | self.estimator = estimator
150 | self.predict_type = predict_type
151 | self.loss_dict = dict()
152 |
153 | def _get_cost(self, X, y, estimator, loss_func, X_test = None, y_test = None):
154 |
155 | estimator.fit(X, y.ravel())
156 | if type(X_test) is np.ndarray:
157 | if self.predict_type == "predict_proba": # if loss function requires probability
158 | y_test_pred = estimator.predict_proba(X_test)
159 | return loss_func(y_test, y_test_pred)
160 | else:
161 | y_test_pred = estimator.predict(X_test)
162 | return loss_func(y_test, y_test_pred)
163 |
164 | y_pred = estimator.predict(X)
165 |
166 | return loss_func(y, y_pred)
167 |
168 |
169 | def _cross_val(self, X, y, estimator, loss_func, cv):
170 |
171 | loss_record = []
172 |
173 | for train_index, test_index in KFold(n_splits = cv).split(X): # k-fold
174 |
175 | try:
176 | X_train, X_test = X[train_index], X[test_index]
177 | y_train, y_test = y[train_index], y[test_index]
178 | estimator.fit(X_train, y_train.ravel())
179 |
180 | if self.predict_type == "predict_proba":
181 | y_test_pred = estimator.predict_proba(X_test)
182 | loss = loss_func(y_test, y_test_pred)
183 | loss_record.append(loss)
184 | else:
185 | y_test_pred = estimator.predict(X_test)
186 | loss = loss_func(y_test, y_test_pred)
187 | loss_record.append(loss)
188 | except Exception: # skip folds where the estimator fails to fit or predict
189 | continue
190 |
191 | return np.array(loss_record).mean()
192 |
193 | def _eval_fitness(self, individual):
194 |
195 | individual = [True if x else False for x in individual]
196 |
197 | if sum(individual) == 0:
198 | current_loss = np.Inf
199 | else:
200 | encoded_str = ''.join(['1' if x else '0' for x in individual])
201 | if encoded_str in self.loss_dict: # reuse cached loss (works even when the cached loss is 0.0)
202 | current_loss = self.loss_dict[encoded_str]
203 | else:
204 | if self.cv:
205 | current_loss = self._cross_val(self.X_train[:,individual], self.y_train,
206 | self.estimator, self.loss_func, self.cv)
207 | current_loss = np.round(current_loss, 4)
208 |
209 | elif type(self.X_val) is np.ndarray:
210 | current_loss = self._get_cost(self.X_train[:,individual], self.y_train,
211 | self.estimator, self.loss_func,
212 | self.X_val[:,individual], self.y_val)
213 | current_loss = np.round(current_loss, 4)
214 |
215 | else:
216 | current_loss = self._get_cost(self.X_train[:,individual], self.y_train,
217 | self.estimator, self.loss_func, None, None)
218 | current_loss = np.round(current_loss, 4)
219 | self.loss_dict[encoded_str] = current_loss
220 |
221 | if self.algorithm == "one-max":
222 | return current_loss,
223 | else:
224 | return current_loss, sum(individual)
225 |
226 | def fit(self, X_train, y_train, cv = None, X_val = None, y_val = None,
227 | init_sol = None, stop_point = 5):
228 |
229 |
230 | """
231 | Fit method.
232 |
233 | Parameters
234 | ----------
235 | X_train: numpy array shape = (n_samples, n_features).
236 | The training input samples.
237 |
238 | y_train: numpy array, shape = (n_samples,).
239 | The target values (class labels in classification, real numbers in regression).
240 |
241 | cv: int or None, default = None
242 | Specify the number of folds in KFold. None means the GA will not use
243 | k-fold cross-validation results to select features.
244 | [1] If cv = None and X_val = None, the GA will evaluate each subset on trainset.
245 | [2] If cv != None and X_val = None, the GA will evaluate each subset on generated validation set using k-fold.
246 | [3] If cv = None and X_val != None, the GA will evaluate each subset on the user-provided validation set.
247 |
248 | X_val: numpy array, shape = (n_samples, n_features) or None. default = None.
249 | The validation input samples. None means no validation set is provided.
250 | [1] If cv = None and X_val = None, the GA will evaluate each subset on trainset.
251 | [2] If cv != None and X_val = None, the GA will evaluate each subset on generated validation set using k-fold.
252 | [3] If cv = None and X_val != None, the GA will evaluate each subset on the user-provided validation set.
253 |
254 | y_val: numpy array, shape = (n_samples, ) or None. default = None.
255 | The validation target values (class labels in classification, real numbers in regression).
256 |
257 | Returns
258 | -------
259 | self : object
260 |
261 | """
262 |
263 | # make sure input has two dimensions
264 | assert len(X_train.shape) == 2
265 | num_feature = X_train.shape[1]
266 |
267 | # save them for _eval_fitness function
268 | self.X_train = X_train
269 | self.y_train = y_train
270 | self.cv = cv
271 | self.X_val = X_val
272 | self.y_val = y_val
273 |
274 | # creator
275 | if self.algorithm == "one-max":
276 | creator.create("FitnessMin", base.Fitness, weights=(-1.0,)) # minimize the loss
277 | creator.create("Individual", list, fitness=creator.FitnessMin)
278 | else:
279 | creator.create("FitnessMulti", base.Fitness, weights=(-1.0, -0.1))
280 | creator.create("Individual", list, fitness=creator.FitnessMulti)
281 |
282 | # register
283 | toolbox = base.Toolbox()
284 | toolbox.register("gene", random.randint, 0, 1)
285 | toolbox.register("individual", tools.initRepeat, creator.Individual,
286 | toolbox.gene, n = num_feature)
287 | toolbox.register("population", tools.initRepeat, list, toolbox.individual,
288 | n = self.n_pop)
289 | toolbox.register("evaluate", self._eval_fitness)
290 | toolbox.register("mate", tools.cxUniform, indpb = self.cx_indpb)
291 | toolbox.register("mutate", tools.mutFlipBit, indpb = self.mu_indpb)
292 |
293 | if self.algorithm == "one-max":
294 | toolbox.register("select", tools.selTournament, tournsize=5)
295 | else:
296 | toolbox.register("select", tools.selNSGA2)
297 |
298 | # start evolution
299 | # evaluate initial population
300 | population = toolbox.population()
301 | fits = toolbox.map(toolbox.evaluate, population)
302 | for ind, fit in zip(population, fits):
303 | ind.fitness.values = fit
304 |
305 | # evolving
306 | for gen in tqdm(range(self.n_gen)):
307 | if self.both:
308 | offspring = algorithms.varOr(population, toolbox,
309 | lambda_ = self.n_children, cxpb = self.cxpb,
310 | mutpb = self.mutpb)
311 | else:
312 | offspring = algorithms.varAnd(population, toolbox, cxpb = self.cxpb,
313 | mutpb = self.mutpb)
314 | invalid_ind = [ind for ind in offspring if not ind.fitness.valid]
315 | fitnesses = map(toolbox.evaluate, invalid_ind)
316 | for ind, fit in zip(invalid_ind, fitnesses):
317 | ind.fitness.values = fit
318 |
319 | if self.algorithm == 'one-max':
320 | population = toolbox.select(offspring, k = self.n_pop)
321 | else:
322 | population = toolbox.select(offspring + population, k = self.n_pop)
323 |
324 | fits = list(toolbox.map(toolbox.evaluate, population))
325 | if self.algorithm != "one-max":
326 | fits = [x[0] for x in fits]
327 |
328 | try:
329 | best_idx = np.argmin(np.array(fits))
330 | self.best_sol = [True if x else False for x in population[best_idx]]
331 | self.best_loss = fits[best_idx]
332 |
333 | if np.isinf(self.best_loss): # if best loss is inf
334 | best_key = min([(value, key) for key, value in self.loss_dict.items()])[1]
335 | self.best_sol = [True if x == '1' else False for x in best_key]
336 | self.best_loss = min([(value, key) for key, value in self.loss_dict.items()])[0]
337 | except Exception:
338 | best_key = min([(value, key) for key, value in self.loss_dict.items()])[1]
339 | self.best_sol = [True if x == '1' else False for x in best_key]
340 | self.best_loss = min([(value, key) for key, value in self.loss_dict.items()])[0]
341 |
342 | def transform(self, X):
343 | """
344 | Transform method.
345 |
346 | Parameters
347 | ----------
348 | X: numpy array shape = (n_samples, n_features).
349 |         The data set that needs feature reduction.
350 |
351 | Returns
352 | -------
353 | transform_X: numpy array shape = (n_samples, n_best_features).
354 | The data set after feature reduction.
355 |
356 | """
357 | transform_X = X[:, self.best_sol]
358 | return transform_X
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | Feature-Engineering-Handbook
2 | ============
3 | Welcome! This repo provides an interactive and complete practical feature engineering tutorial in Jupyter Notebook. It contains three parts: [Data Preprocessing](1.%20Data%20Preprocessing.ipynb), [Feature Selection](2.%20Feature%20Selection.ipynb) and [Dimension Reduction](3.%20Dimension%20Reduction.ipynb). Each part is demonstrated in its own notebook. Since some feature selection algorithms such as Simulated Annealing and the Genetic Algorithm lack a complete Python implementation, we also provide the corresponding Python scripts ([Simulated Annealing](SA.py), [Genetic Algorithm](GA.py)) and cover them in our tutorial for your reference.
4 |
5 |
6 | Brief Introduction
7 | ------------
8 | - [Notebook One](1.%20Data%20Preprocessing.ipynb) covers data preprocessing on static continuous features based on [scikit-learn](https://scikit-learn.org/stable/), on static categorical features based on [Category Encoders](https://contrib.scikit-learn.org/categorical-encoding/), and on time series features based on [Featuretools](https://www.featuretools.com/).
9 |
10 | - [Notebook Two](2.%20Feature%20Selection.ipynb) covers feature selection, including univariate filter methods based on [scikit-learn](https://scikit-learn.org/stable/), multivariate filter methods based on [scikit-feature](http://featureselection.asu.edu/), deterministic wrapper methods based on [scikit-learn](https://scikit-learn.org/stable/), randomized wrapper methods based on our own Python scripts (see the usage sketch below), and embedded methods based on [scikit-learn](https://scikit-learn.org/stable/).
11 |
12 | - [Notebook Three](3.%20Dimension%20Reduction.ipynb) covers supervised and unsupervised dimension reduction based on [scikit-learn](https://scikit-learn.org/stable/).
13 |
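For quick reference, here is a minimal usage sketch of the [Simulated Annealing](SA.py) selector. The estimator and loss function below (`RandomForestRegressor`, `mean_squared_error`) are illustrative choices only; any estimator exposing `fit`/`predict` and any `loss_func(y_true, y_pred)` callable should work the same way.

```python
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

from SA import Simulated_Annealing

X, y = fetch_california_housing(return_X_y=True)

# evaluate every candidate feature subset with 5-fold cross-validation
selector = Simulated_Annealing(loss_func=mean_squared_error,
                               estimator=RandomForestRegressor(n_estimators=50),
                               init_temp=100.0, min_temp=0.01,
                               alpha=0.98, iteration=50)
selector.fit(X, y, cv=5)

print(selector.best_sol, selector.best_loss)  # selected feature mask and its CV loss
X_selected = selector.transform(X)            # keep only the selected columns
```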
14 |
15 | Table of Contents
16 | ------------
17 | - 1 Data Preprocessing
- 1.1 Static Continuous Variables
- 1.1.1 Discretization
- 1.1.1.1 Binarization
- 1.1.1.2 Binning
- 1.1.2 Scaling
- 1.1.2.1 Standard Scaling (Z-score standardization)
- 1.1.2.2 MinMaxScaler (Scale to range)
- 1.1.2.3 RobustScaler (Anti-outliers scaling)
- 1.1.2.4 Power Transform (Non-linear transformation)
- 1.1.3 Normalization
- 1.1.4 Imputation of missing values
- 1.1.4.1 Univariate feature imputation
- 1.1.4.2 Multivariate feature imputation
- 1.1.4.3 Marking imputed values
- 1.1.5 Feature Transformation
- 1.1.5.1 Polynomial Transformation
- 1.1.5.2 Custom Transformation
- 1.2 Static Categorical Variables
- 1.2.1 Ordinal Encoding
- 1.2.2 One-hot Encoding
- 1.2.3 Hashing Encoding
- 1.2.4 Helmert Coding
- 1.2.5 Sum (Deviation) Coding
- 1.2.6 Target Encoding
- 1.2.7 M-estimate Encoding
- 1.2.8 James-Stein Encoder
- 1.2.9 Weight of Evidence Encoder
- 1.2.10 Leave One Out Encoder
- 1.2.11 Catboost Encoder
- 1.3 Time Series Variables
- 1.3.1 Time Series Categorical Features
- 1.3.2 Time Series Continuous Features
- 1.3.3 Implementation
- 1.3.3.1 Create EntitySet
- 1.3.3.2 Set up cut-time
- 1.3.3.3 Auto Feature Engineering
- 2 Feature Selection
- 2.1 Filter Methods
- 2.1.1 Univariate Filter Methods
- 2.1.1.1 Variance Threshold
- 2.1.1.2 Pearson Correlation (regression problem)
- 2.1.1.3 Distance Correlation (regression problem)
- 2.1.1.4 F-Score (regression problem)
- 2.1.1.5 Mutual Information (regression problem)
- 2.1.1.6 Chi-squared Statistics (classification problem)
- 2.1.1.7 F-Score (classification problem)
- 2.1.1.8 Mutual Information (classification problem)
- 2.1.2 Multivariate Filter Methods
- 2.1.2.1 Max-Relevance Min-Redundancy (mRMR)
- 2.1.2.2 Correlation-based Feature Selection (CFS)
- 2.1.2.3 Fast Correlation-based Filter (FCBF)
- 2.1.2.4 ReliefF
- 2.1.2.5 Spectral Feature Selection (SPEC)
- 2.2 Wrapper Methods
- 2.2.1 Deterministic Algorithms
- 2.2.1.1 Recursive Feature Elimination (SBS)
- 2.2.2 Randomized Algorithms
- 2.2.2.1 Simulated Annealing (SA)
- 2.2.2.2 Genetic Algorithm (GA)
- 2.3 Embedded Methods
- 2.3.1 Regularization Based Methods
- 2.3.1.1 Lasso Regression (Linear Regression with L1 Norm)
- 2.3.1.2 Logistic Regression (with L1 Norm)
- 2.3.1.3 LinearSVR/ LinearSVC
- 2.3.2 Tree Based Methods
- 3 Dimension Reduction
- 3.1 Unsupervised Methods
- 3.1.1 PCA (Principal Components Analysis)
- 3.2 Supervised Methods
- 3.2.1 LDA (Linear Discriminant Analysis)
18 |
19 | Reference
20 | ------------
21 | References have been included in each Jupyter Notebook.
22 |
23 | Author
24 | ------------
25 | [**@Yingxiang Chen**](https://github.com/YC-Coder-Chen)
26 | [**@Zihan Yang**](https://github.com/echoyang48)
27 |
28 | Contact
29 | ------------
30 | **If there are any mistakes, please feel free to reach out and correct us!**
31 |
32 | Yingxiang Chen E-mail: chenyingxiang3526@gmail.com
33 | Zihan Yang E-mail: echoyang48@gmail.com
34 |
--------------------------------------------------------------------------------
/SA.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | """
4 | Simulated Annealing
5 |
6 | @author: chenyingxiang
7 | """
8 |
9 | import random
10 | import numpy as np
11 | from sklearn.model_selection import KFold
12 |
13 | random.seed()
14 | np.random.seed()
15 |
16 | class Simulated_Annealing(object):
17 | """
18 | Simulated Annealing Algorithm for feature selection
19 |
20 | Parameters
21 | ----------
22 | init_temp: float, default: 100.0
23 | The initial temperature
24 |
25 |     min_temp: float, default: 0.01
26 |         The minimum temperature; the search stops once the temperature drops below it
27 |
28 | max_perturb: float, default: 0.2
29 |         The maximum fraction of features perturbed when generating a neighboring solution
30 |
31 | alpha: float, default: 0.98
32 | The decay coefficient of temperature
33 |
34 | k: float, default: 1.0
35 | The constant for computing probability
36 |
37 | loss_func: object
38 | The loss function of the ML task.
39 | loss_func(y_true, y_pred) should return the loss.
40 |
41 | iteration: int, default: 50
42 |         Number of iterations performed at each temperature level above min_temp.
43 |
44 | estimator: object
45 | A supervised learning estimator
46 |         It must have `fit` and `predict` methods (or a `predict_proba` method for classification)
47 |
48 | predict_type: string, default="predict"
49 | Final prediction type.
50 |         - Some classification loss functions require probability outputs;
51 |           in that case, set predict_type to "predict_proba"
52 |
53 | Attributes
54 | ----------
55 | best_sol: np.array of int
56 | The index of the best subset of features.
57 |
58 | best_loss: float
59 | The loss associated with the best_sol
60 |
61 | References
62 | ----------
63 | 1. https://blog.csdn.net/Joseph__Lagrange/article/details/94410317
64 | 2. https://github.com/JeromeBau/SimulatedAnnealing/blob/master/gibbs_annealing.py
65 |
66 | """
67 |
68 |
69 | def __init__(self, loss_func, estimator, init_temp = 100.0, min_temp = 0.01, k = 1.0,
70 | max_perturb = 0.2, alpha = 0.98, iteration = 50, predict_type = 'predict'):
71 |
72 | #### check type
73 | if not hasattr(estimator, 'fit'):
74 |             raise ValueError('Estimator doesn\'t have a fit method')
75 |         if not hasattr(estimator, 'predict') and not hasattr(estimator, 'predict_proba'):
76 |             raise ValueError('Estimator doesn\'t have a predict or predict_proba method')
77 |
78 | for instant in [init_temp, min_temp, k, max_perturb, alpha]:
79 | if type(instant) != float:
80 | raise TypeError(f'{instant} should be float type')
81 |
82 | if type(iteration) != int:
83 | raise TypeError(f'{iteration} should be int type')
84 |
85 | if predict_type not in ['predict', 'predict_proba']:
86 | raise ValueError('predict_type should be "predict" or "predict_proba"')
87 |
88 | self.loss_func = loss_func
89 | self.estimator = estimator
90 | self.init_temp = init_temp
91 | self.min_temp = min_temp
92 | self.k = k
93 | self.max_perturb = max_perturb
94 | self.alpha = alpha
95 | self.iteration = iteration
96 | self.predict_type = predict_type
97 | self.loss_dict = dict()
98 |
99 | def _judge(self, new_cost, old_cost, temp):
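        # Metropolis acceptance rule: always accept an improvement; otherwise accept
        # the worse solution with probability exp(-delta_cost / (k * temp))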
100 |
101 | delta_cost = new_cost - old_cost
102 |
103 | if delta_cost < 0: # new solution is better
104 | proceed = 1
105 | else:
106 | probability = np.exp(-1 * delta_cost / (self.k * temp))
107 | if probability > np.random.random():
108 | proceed = 1
109 |
110 | else:
111 | proceed = 0
112 |
113 | return proceed
114 |
115 | def _get_neighbor(self, num_feature, current_sol, max_perturb):
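        # propose a neighboring subset by flipping a bounded random number of features
        # into and out of the current solution (the bound is controlled by max_perturb)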
116 |
117 | all_feature = np.ones(shape=(num_feature,)).astype(bool)
118 | outside_feature = np.where(all_feature != current_sol)[0]
119 | inside_feature = np.where(all_feature == current_sol)[0]
120 | num_perturb_in = int(max(np.ceil(len(inside_feature) * max_perturb),1))
121 | num_perturb_out = int(max(np.ceil(len(outside_feature) * max_perturb),1))
122 | if len(outside_feature) == 0:
123 | feature_in = np.array([])
124 | else:
125 | feature_in = np.random.choice(outside_feature,
126 | size = min(len(outside_feature),
127 | np.random.randint(0, num_perturb_in + 1)),
128 | replace = False) # uniform distribution
129 | if len(inside_feature) == 0:
130 | feature_out = np.array([])
131 | else:
132 | feature_out = np.random.choice(inside_feature ,
133 | size = min(len(inside_feature),
134 | np.random.randint(0, num_perturb_out + 1)),
135 | replace = False) # uniform distribution
136 | feature_change = np.append(feature_in, feature_out).astype(int)
137 | all_feature[feature_change] = 1 - all_feature[feature_change]
138 |
139 | return all_feature
140 |
141 | def _get_cost(self, X, y, estimator, loss_func, X_test = None, y_test = None):
142 |
143 | estimator.fit(X, y.ravel())
144 | if type(X_test) is np.ndarray:
145 | if self.predict_type == "predict_proba": # if loss function requires probability
146 | y_test_pred = estimator.predict_proba(X_test)
147 | return loss_func(y_test, y_test_pred)
148 | else:
149 | y_test_pred = estimator.predict(X_test)
150 | return loss_func(y_test, y_test_pred)
151 |
152 | y_pred = estimator.predict(X)
153 |
154 | return loss_func(y, y_pred)
155 |
156 |
157 | def _cross_val(self, X, y, estimator, loss_func, cv):
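        # average the loss over the k folds; folds where fitting or scoring raises
        # an exception are skipped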
158 |
159 | loss_record = []
160 |
161 | for train_index, test_index in KFold(n_splits = cv).split(X): # k-fold
162 |
163 | try:
164 | X_train, X_test = X[train_index], X[test_index]
165 | y_train, y_test = y[train_index], y[test_index]
166 | estimator.fit(X_train, y_train.ravel())
167 |
168 | if self.predict_type == "predict_proba":
169 | y_test_pred = estimator.predict_proba(X_test)
170 | loss = loss_func(y_test, y_test_pred)
171 | loss_record.append(loss)
172 | else:
173 | y_test_pred = estimator.predict(X_test)
174 | loss = loss_func(y_test, y_test_pred)
175 | loss_record.append(loss)
176 | except:
177 | continue
178 |
179 | return np.array(loss_record).mean()
180 |
181 | def fit(self, X_train, y_train, cv = None, X_val = None, y_val = None,
182 | init_sol = None, stop_point = 5):
183 |
184 |
185 | """
186 | Fit method.
187 |
188 | Parameters
189 | ----------
190 | X_train: numpy array shape = (n_samples, n_features).
191 | The training input samples.
192 |
193 | y_train: numpy array, shape = (n_samples,).
194 | The target values (class labels in classification, real numbers in regression).
195 |
196 | cv: int or None, default = None
197 | Specify the number of folds in KFold. None means SA will not use
198 | k-fold cross-validation results to select features.
199 | [1] If cv = None and X_val = None, the SA will evaluate each subset on trainset.
200 | [2] If cv != None and X_val = None, the SA will evaluate each subset on generated validation set using k-fold.
201 | [3] If cv = None and X_val != None, the SA will evaluate each subset on the user-provided validation set.
202 |
203 | X_val: numpy array, shape = (n_samples, n_features) or None. default = None.
204 |         The validation input samples. None means no validation set is provided.
205 | [1] If cv = None and X_val = None, the SA will evaluate each subset on trainset.
206 | [2] If cv != None and X_val = None, the SA will evaluate each subset on generated validation set using k-fold.
207 | [3] If cv = None and X_val != None, the SA will evaluate each subset on the user-provided validation set.
208 |
209 | y_val: numpy array, shape = (n_samples, ) or None. default = None.
210 | The validation target values (class labels in classification, real numbers in regression).
211 |
212 |     init_sol: numpy array, shape = (num_feature, ) or None. default = None.
213 |         The initial solution provided by the user. It should contain bools.
214 |         A good initial solution can save the SA algorithm a lot of searching time.
215 |         None means the SA will randomly generate an initial solution.
216 |
217 | stop_point: int, default = 5.
218 |         The stopping condition. If the best loss stays unchanged for the last stop_point iterations, the search stops.
219 |
220 | Returns
221 | -------
222 | self : object
223 |
224 | """
225 |
226 | # make sure input has two dimensions
227 | assert len(X_train.shape) == 2
228 | num_feature = X_train.shape[1]
229 |
230 | # get initial solution
231 |         if init_sol is None:
232 |             init_sol = np.random.randint(2, size=num_feature).astype(bool)  # boolean mask, not integer indices
233 |             while sum(init_sol)==0:
234 |                 init_sol = np.random.randint(2, size=num_feature).astype(bool)
235 |
236 | current_sol = init_sol
237 | if cv:
238 | current_loss = self._cross_val(X_train[:,current_sol], y_train,
239 | self.estimator, self.loss_func, cv)
240 | current_loss = np.round(current_loss, 4)
241 |
242 | elif type(X_val) is np.ndarray:
243 | current_loss = self._get_cost(X_train[:,current_sol], y_train, self.estimator,
244 | self.loss_func, X_val[:,current_sol], y_val)
245 | current_loss = np.round(current_loss, 4)
246 |
247 | else:
248 | current_loss = self._get_cost(X_train[:,current_sol], y_train, self.estimator,
249 | self.loss_func, None, None)
250 | current_loss = np.round(current_loss, 4)
251 |
252 | encoded_str = ''.join(['1' if x else '0' for x in current_sol])
253 | self.loss_dict[encoded_str] = current_loss
254 | temp_history = [self.init_temp]
255 | loss_history = [current_loss]
256 | sol_history = [current_sol]
257 |
258 | current_temp = self.init_temp
259 | current_temp = np.round(current_temp, 4)
260 |
261 | best_loss = current_loss
262 | best_sol = current_sol
263 |
264 | # start looping
265 | while current_temp > self.min_temp:
266 | for step in range(self.iteration):
267 | current_sol = self._get_neighbor(num_feature, current_sol, self.max_perturb)
268 | if len(current_sol) == 0:
269 | current_loss = np.Inf
270 | else:
271 | encoded_str = ''.join(['1' if x else '0' for x in current_sol])
272 |                     if encoded_str in self.loss_dict: # reuse the cached loss (even when it is 0.0)
273 |                         current_loss = self.loss_dict[encoded_str]
274 | else:
275 | if cv:
276 | current_loss = self._cross_val(X_train[:,current_sol], y_train,
277 | self.estimator, self.loss_func, cv)
278 | current_loss = np.round(current_loss, 4)
279 |
280 | elif type(X_val) is np.ndarray:
281 | current_loss = self._get_cost(X_train[:,current_sol], y_train, self.estimator,
282 | self.loss_func, X_val[:,current_sol], y_val)
283 | current_loss = np.round(current_loss, 4)
284 |
285 | else:
286 | current_loss = self._get_cost(X_train[:,current_sol], y_train, self.estimator,
287 | self.loss_func, None, None)
288 | current_loss = np.round(current_loss, 4)
289 | self.loss_dict[encoded_str] = current_loss
290 |
291 | if (current_loss - best_loss) <= 0: # update temperature
292 | current_temp = current_temp * self.alpha
293 | current_temp = np.round(current_temp, 4)
294 |
295 | # judge
296 | if self._judge(current_loss, best_loss, current_temp): # take new solution
297 | best_sol = current_sol
298 | best_loss = current_loss
299 |
300 | # keep record
301 | temp_history.append(current_temp)
302 | loss_history.append(best_loss)
303 | sol_history.append(best_sol)
304 |
305 | # debugging Pipeline
306 | # print(f"Current temperature is {current_temp}")
307 | # print(f"Current best loss is {best_loss}")
308 | # print(f"Current best solution is {best_sol}")
309 |
310 | # check stopping condition
311 | if len(loss_history) > stop_point:
312 | if len(np.unique(loss_history[-1 * stop_point : ])) == 1:
313 | print(f"Stopping condition reached!")
314 | break
315 |
316 | best_idx = np.argmin(loss_history)
317 | self.best_sol = sol_history[best_idx]
318 | self.best_loss = loss_history[best_idx]
319 |
320 | def transform(self, X):
321 | """
322 | Transform method.
323 |
324 | Parameters
325 | ----------
326 | X: numpy array shape = (n_samples, n_features).
327 |         The data set that needs feature reduction.
328 |
329 | Returns
330 | -------
331 | transform_X: numpy array shape = (n_samples, n_best_features).
332 | The data set after feature reduction.
333 |
334 | """
335 | transform_X = X[:, self.best_sol]
336 | return transform_X
--------------------------------------------------------------------------------
/images/Embedded_Pipeline.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/Embedded_Pipeline.png
--------------------------------------------------------------------------------
/images/Filter_Pipeline.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/Filter_Pipeline.png
--------------------------------------------------------------------------------
/images/GA_Pseudo_Code.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/GA_Pseudo_Code.png
--------------------------------------------------------------------------------
/images/SA_Pseudo_Code.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/SA_Pseudo_Code.png
--------------------------------------------------------------------------------
/images/Wrapper_Pipeline.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/Wrapper_Pipeline.png
--------------------------------------------------------------------------------
/images/基于基因算法特征选择.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/基于基因算法特征选择.png
--------------------------------------------------------------------------------
/images/基于模拟退火特征选择.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/基于模拟退火特征选择.png
--------------------------------------------------------------------------------
/images/封装法工作流.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/封装法工作流.png
--------------------------------------------------------------------------------
/images/嵌入法工作流.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/嵌入法工作流.png
--------------------------------------------------------------------------------
/images/过滤法工作流.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/YC-Coder-Chen/feature-engineering-handbook/2f2cd6f314437ffdd7fa5c9d18aafd0bc8e7fd04/images/过滤法工作流.png
--------------------------------------------------------------------------------
/中文版.md:
--------------------------------------------------------------------------------
1 | 基于Jupyter的特征工程手册
2 | ============
3 | 欢迎!此项目提供了基于Jupyter Notebook的交互式实用特征工程手册。其一共包含三个部分:[数据预处理](./中文版/1.%20数据预处理.ipynb),[特征选择](./中文版/2.%20特征选择.ipynb),[特征降维](./中文版/3.%20特征降维.ipynb)。每个部分将在其单独的Notebook中演示。由于某些特征选择算法(例如“模拟退火”和“遗传算法”)在Python中缺少完整的实现,因此我们还提供了相应的Python脚本实现这些算法([模拟退火](SA.py), [基因算法](GA.py)),并将其涵盖在我们的教程中供您参考。
4 |
5 | 简单介绍
6 | ------------
7 | - [第一个笔记本](./中文版/1.%20数据预处理.ipynb) 主要涵盖了数据预处理的介绍,包含基于 [scikit-learn](https://scikit-learn.org/stable/) 处理静态连续特征,基于 [Category Encoders](https://contrib.scikit-learn.org/categorical-encoding/) 处理静态类别特征,基于[Featuretools](https://www.featuretools.com/) 处理时间序列问题。
8 |
9 | - [第二个笔记本](./中文版/2.%20特征选择.ipynb) 主要涵盖了特征选择的介绍,包含基于 [scikit-learn](https://scikit-learn.org/stable/) 实现单变量特征过滤,基于 [scikit-feature](http://featureselection.asu.edu/) 实现多变量特征过滤,基于 [scikit-learn](https://scikit-learn.org/stable/) 实现确定性封装筛选,基于我们撰写的 [模拟退火](SA.py)及[基因算法](GA.py) 脚本实现随机封装筛选,基于 [scikit-learn](https://scikit-learn.org/stable/) 实现嵌入特征筛选。
10 |
11 | - [第三个笔记本](./中文版/3.%20特征降维.ipynb) 主要涵盖了特征压缩降维的介绍,包含基于 [scikit-learn](https://scikit-learn.org/stable/) 实现监督与无监督特征降维。
12 |
13 | |项目内容|英文版地址 | 中文版地址 |
14 | |------ |------ | ------ |
15 | |README | [README](./README.md) | [中文版README](./中文版.md) |
16 | |数据预处理| [Notebook 1](./1.%20Data%20Preprocessing.ipynb) | [第一个笔记本](./中文版/1.%20数据预处理.ipynb) |
17 | |特征选择 | [Notebook 2](2.%20Feature%20Selection.ipynb) | [第二个笔记本](./中文版/2.%20特征选择.ipynb) |
18 | |特征降维 | [Notebook 3](3.%20Dimension%20Reduction.ipynb) | [第三个笔记本](./中文版/3.%20特征降维.ipynb) |
19 |
20 |
21 | 总目录
22 | ------------
23 | - 1 Data Preprocessing 数据预处理
- 1.1 Static Continuous Variables 静态连续变量
- 1.1.1 Discretization 离散化
- 1.1.1.1 Binarization 二值化
- 1.1.1.2 Binning 分箱
- 1.1.2 Scaling 缩放
- 1.1.2.1 Standard Scaling (Z-score standardization) 标准缩放 (Z值标准化)
- 1.1.2.2 MinMaxScaler (Scale to range) 最大最小缩放 (按数值范围缩放)
- 1.1.2.3 RobustScaler (Anti-outliers scaling) 稳健缩放 (抗异常值缩放)
- 1.1.2.4 Power Transform (Non-linear transformation) 幂次变换 (非线性变换)
- 1.1.3 Normalization 正则化
- 1.1.4 Imputation of missing values 缺失值填补
- 1.1.4.1 Univariate feature imputation 单变量特征插补
- 1.1.4.2 Multivariate feature imputation 多元特征插补
- 1.1.4.3 Marking imputed values 标记估算值
- 1.1.5 Feature Transformation 特征变换
- 1.1.5.1 Polynomial Transformation 多项式变换
- 1.1.5.2 Custom Transformation 自定义变换
- 1.2 Static Categorical Variables 静态类别变量
- 1.2.1 Ordinal Encoding 序数编码
- 1.2.2 One-hot Encoding 独热编码
- 1.2.3 Hashing Encoding 哈希编码
- 1.2.4 Helmert Coding Helmert 编码
- 1.2.5 Sum (Deviation) Coding 偏差编码
- 1.2.6 Target Encoding 目标编码
- 1.2.7 M-estimate Encoding M估计量编码
- 1.2.8 James-Stein Encoder James-Stein 编码
- 1.2.9 Weight of Evidence Encoder 证据权重编码
- 1.2.10 Leave One Out Encoder 留一法编码
- 1.2.11 Catboost Encoder Catboost 编码
- 1.3 Time Series Variables 时间序列变量
- 1.3.1 Time Series Categorical Features 时间序列类别变量
- 1.3.2 Time Series Continuous Features 时间序列连续变量
- 1.3.3 Implementation 代码实现
- 1.3.3.1 Create EntitySet 生成实体集
- 1.3.3.2 Set up cut-time 设置时间截断
- 1.3.3.3 Auto Feature Engineering 自动特征工程
- 2 Feature Selection 特征选择
- 2.1 Filter Methods 过滤法
- 2.1.1 Univariate Filter Methods 单变量特征过滤
- 2.1.1.1 Variance Threshold 方差选择法
- 2.1.1.2 Pearson Correlation (regression problem) 皮尔森相关系数 (回归问题)
- 2.1.1.3 Distance Correlation (regression problem) 距离相关系数 (回归问题)
- 2.1.1.4 F-Score (regression problem) F-统计量 (回归问题)
- 2.1.1.5 Mutual Information (regression problem) 互信息 (回归问题)
- 2.1.1.6 Chi-squared Statistics (classification problem) 卡方统计量 (分类问题)
- 2.1.1.7 F-Score (classification problem) F-统计量 (分类问题)
- 2.1.1.8 Mutual Information (classification problem) 互信息 (分类问题)
- 2.1.2 Multivariate Filter Methods 多元特征过滤
- 2.1.2.1 Max-Relevance Min-Redundancy (mRMR) 最大相关最小冗余
- 2.1.2.2 Correlation-based Feature Selection (CFS) 基于相关性的特征选择
- 2.1.2.3 Fast Correlation-based Filter (FCBF) 基于相关性的快速特征选择
- 2.1.2.4 ReliefF
- 2.1.2.5 Spectral Feature Selection (SPEC) 基于谱图的特征选择
- 2.2 Wrapper Methods 封装方法
- 2.2.1 Deterministic Algorithms 确定性算法
- 2.2.1.1 Recursive Feature Elimination (SBS) 递归式特征消除
- 2.2.2 Randomized Algorithms 随机方法
- 2.2.2.1 Simulated Annealing (SA) 基于模拟退火特征选择
- 2.2.2.2 Genetic Algorithm (GA) 基于基因算法特征选择
- 2.3 Embedded Methods 嵌入方法
- 2.3.1 Regularization Based Methods 基于正则化模型的方法
- 2.3.1.1 Lasso Regression (Linear Regression with L1 Norm) 套索回归
- 2.3.1.2 Logistic Regression (with L1 Norm) 逻辑回归
- 2.3.1.3 LinearSVR/ LinearSVC 线性向量支持机
- 2.3.2 Tree Based Methods 基于树模型的方法
- 3 Dimension Reduction 特征降维
- 3.1 Unsupervised Methods 非监督方法
- 3.1.1 PCA (Principal Components Analysis) 主成分分析
- 3.2 Supervised Methods 监督方法
- 3.2.1 LDA (Linear Discriminant Analysis) 线性判别分析
24 |
25 | *注:由于部分内容未有前人翻译,部分翻译可能不尽准确*
26 |
27 | 参考文献
28 | ------------
29 | 参考文献已在各独立 Notebook 中分别记录。
30 |
31 | Author
32 | ------------
33 | [**@陈颖祥**](https://github.com/YC-Coder-Chen)
34 | [**@杨子晗**](https://github.com/echoyang48)
35 |
36 | Contact
37 | ------------
38 | **此资料为笔者业余时间整理制作,欢迎各位指正其中的不足之处!让我们一起让这份学习资料更加完善~**
39 |
40 | 陈颖祥 E-mail: chenyingxiang3526@gmail.com
41 | 杨子晗 E-mail: echoyang48@gmail.com
42 |
--------------------------------------------------------------------------------
/中文版/3. 特征降维.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "**基于Jupyter的特征工程笔记本3: 特征降维** \n",
8 | "*作者: 陈颖祥,杨子晗*"
9 | ]
10 | },
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {
14 | "toc": true
15 | },
16 | "source": [
17 | "Table of Contents\n",
18 | ""
19 | ]
20 | },
21 | {
22 | "cell_type": "markdown",
23 | "metadata": {},
24 | "source": [
25 | "**Reference**\n",
26 | "- https://scikit-learn.org/stable/modules/decomposition.html#principal-component-analysis-pca\n",
27 | "- https://sebastianraschka.com/faq/docs/lda-vs-pca.html\n",
28 | "- https://en.wikipedia.org/wiki/Linear_discriminant_analysis"
29 | ]
30 | },
31 | {
32 | "cell_type": "markdown",
33 | "metadata": {},
34 | "source": [
35 | "# Dimension Reduction 特征降维"
36 | ]
37 | },
38 | {
39 | "cell_type": "markdown",
40 | "metadata": {},
41 | "source": [
42 | "经过数据预处理和特征选择,我们已经生成了一个很好的特征子集。但是有时该子集可能仍然包含过多特征,导致需要花费太多的计算能力用以训练模型。在这种情况下,我们可以使用降维技术进一步压缩特征子集。但这可能会降低模型性能。\n",
43 | "\n",
44 | "同时,如果我们没有太多时间进行特征选择,我们也可以在数据预处理之后直接应用降维方法。我们可以使用降维算法来压缩原始特征空间直接生成特征子集。\n",
45 | "\n",
46 | "具体来说,我们将分别介绍PCA和LDA(线性判别分析)。"
47 | ]
48 | },
49 | {
50 | "cell_type": "markdown",
51 | "metadata": {},
52 | "source": [
53 | "## Unsupervised Methods 非监督方法"
54 | ]
55 | },
56 | {
57 | "cell_type": "markdown",
58 | "metadata": {},
59 | "source": [
60 | "### PCA (Principal Components Analysis) 主成分分析"
61 | ]
62 | },
63 | {
64 | "cell_type": "markdown",
65 | "metadata": {},
66 | "source": [
67 | "主成分分析(PCA)是一种无监督机器学习模型,其目标为利用线性变换将原始特征投影为一系列线性不相关的单位向量,而同时保留尽可能多的信息(方差)。您可以从我们在Github中编写的[repo](https://github.com/YC-Coder-Chen/Unsupervised-Notes/blob/master/PCA.md)中查看更多数学细节。"
68 | ]
69 | },
70 | {
71 | "cell_type": "code",
72 | "execution_count": 1,
73 | "metadata": {
74 | "ExecuteTime": {
75 | "end_time": "2020-03-30T03:37:20.574170Z",
76 | "start_time": "2020-03-30T03:37:18.879555Z"
77 | }
78 | },
79 | "outputs": [],
80 | "source": [
81 | "import numpy as np\n",
82 | "import pandas as pd\n",
83 | "from sklearn.decomposition import PCA\n",
84 | "\n",
85 | "# 直接载入数据集\n",
86 | "from sklearn.datasets import fetch_california_housing\n",
87 | "dataset = fetch_california_housing()\n",
88 | "X, y = dataset.data, dataset.target # 利用 california_housing 数据集来演示\n",
89 | "\n",
90 | "# 选择前15000个观测点作为训练集\n",
91 | "# 剩下的作为测试集\n",
92 | "train_set = X[0:15000,:]\n",
93 | "test_set = X[15000:,]\n",
94 | "train_y = y[0:15000]\n",
95 | "\n",
96 | "# 在使用主成分分析前,我们需要先对变量进行缩放操作,否则PCA将会赋予高尺度的特征过多的权重\n",
97 | "from sklearn.preprocessing import StandardScaler\n",
98 | "model = StandardScaler()\n",
99 | "model.fit(train_set) \n",
100 | "standardized_train = model.transform(train_set)\n",
101 | "standardized_test = model.transform(test_set)\n",
102 | "\n",
103 | "# 开始压缩特征\n",
104 | "compressor = PCA(n_components=0.9) \n",
105 | "# 将n_components设置为0.9 =>\n",
106 | "# 即要求我们从所有主成分中选取的输出主成分至少能保留原特征中90%的方差\n",
107 | "# 我们也可以通过设置n_components参数为整数直接控制输出的变量数目\n",
108 | "\n",
109 | "compressor.fit(standardized_train) # 在训练集上训练\n",
110 | "transformed_trainset = compressor.transform(standardized_train) # 转换训练集 (15000, 5)\n",
111 | "# 即我们从8个主成分中选取了前5个主成分,而这前5个主成分可以保证保留原特征中90%的方差\n",
112 | "\n",
113 | "transformed_testset = compressor.transform(standardized_test) # 转换测试集\n",
114 | "assert transformed_trainset.shape[1] == transformed_testset.shape[1] \n",
115 | "# 转换后训练集和测试集有相同的特征数"
116 | ]
117 | },
118 | {
119 | "cell_type": "code",
120 | "execution_count": 2,
121 | "metadata": {
122 | "ExecuteTime": {
123 | "end_time": "2020-03-30T03:37:21.146354Z",
124 | "start_time": "2020-03-30T03:37:20.576580Z"
125 | }
126 | },
127 | "outputs": [
128 | {
129 | "data": {
130 | "image/png": "<base64-encoded PNG omitted: line plot of cumulative explained variance vs. number of selected principal components>\n",
131 | "text/plain": [
132 | ""
133 | ]
134 | },
135 | "metadata": {
136 | "needs_background": "light"
137 | },
138 | "output_type": "display_data"
139 | }
140 | ],
141 | "source": [
142 | "# 可视化 所解释的方差与选取的主成分数目之间的关系\n",
143 | "\n",
144 | "import matplotlib.pyplot as plt\n",
145 | "plt.rcParams['font.sans-serif']=['SimHei']\n",
146 | "%matplotlib inline\n",
147 | "\n",
148 | "\n",
149 | "plt.plot(np.array(range(len(compressor.explained_variance_ratio_))) + 1, \n",
150 | " np.cumsum(compressor.explained_variance_ratio_))\n",
151 | "plt.xlabel('选取的主成分数目')\n",
152 | "plt.ylabel('累计所解释的方差')\n",
153 | "plt.show(); # 前5个主成分可以保证保留原特征中90%的方差"
154 | ]
155 | },
156 | {
157 | "cell_type": "markdown",
158 | "metadata": {},
159 | "source": [
160 | "## Supervised Methods 监督方法"
161 | ]
162 | },
163 | {
164 | "cell_type": "markdown",
165 | "metadata": {},
166 | "source": [
167 | "### LDA (Linear Discriminant Analysis) 线性判别分析"
168 | ]
169 | },
170 | {
171 | "cell_type": "markdown",
172 | "metadata": {},
173 | "source": [
174 | "与主成分分析(PCA)不同的是,线性判别分析(LDA)是一种有监督机器学习模型,旨在找到特征子空间以最大化类间线性可分性,即希望投影后同一类别数据的投影点尽可能接近,而不同类别数据的类别中心之间的距离尽可能大。线性判别分析仅适用于分类问题,其假设各个类别的样本数据符合高斯分布,并且具有相同的协方差矩阵。\n",
175 | "\n",
176 | "可以在sklearn的[官方网站](https://scikit-learn.org/stable/modules/lda_qda.html#lda-qda)上了解更多原理方面的详细信息。LDA会将原始变量压缩为(K-1)个,其中K是目标变量类别数。但是在sklearn中,通过将主成分分析的思想合并到LDA中,其可以进一步压缩变量。"
177 | ]
178 | },
179 | {
180 | "cell_type": "code",
181 | "execution_count": 3,
182 | "metadata": {
183 | "ExecuteTime": {
184 | "end_time": "2020-03-30T03:37:21.167912Z",
185 | "start_time": "2020-03-30T03:37:21.148967Z"
186 | }
187 | },
188 | "outputs": [],
189 | "source": [
190 | "import numpy as np\n",
191 | "import pandas as pd\n",
192 | "from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA\n",
193 | "\n",
194 | "# LDA仅适用于分类问题\n",
195 | "# 载入数据集\n",
196 | "from sklearn.datasets import load_iris\n",
197 | "iris = load_iris()\n",
198 | "X, y = iris.data, iris.target\n",
199 | "\n",
200 | "# iris 数据集使用前需要被打乱顺序\n",
201 | "np.random.seed(1234)\n",
202 | "idx = np.random.permutation(len(X))\n",
203 | "X = X[idx]\n",
204 | "y = y[idx]\n",
205 | "\n",
206 | "# 选择前100个观测点作为训练集\n",
207 | "# 剩下的50个观测点作为测试集\n",
208 | "\n",
209 | "train_set = X[0:100,:]\n",
210 | "test_set = X[100:,]\n",
211 | "train_y = y[0:100]\n",
212 | "test_y = y[100:,]\n",
213 | "\n",
214 | "# 在使用线性判别分析前,我们需要先对变量进行缩放操作\n",
215 | "# 因为LDA假定数据服从正态分布\n",
216 | "\n",
217 | "from sklearn.preprocessing import StandardScaler # 我们也可以采用幂次变换\n",
218 | "model = StandardScaler()\n",
219 | "model.fit(train_set) \n",
220 | "standardized_train = model.transform(train_set)\n",
221 | "standardized_test = model.transform(test_set)\n",
222 | "\n",
223 | "# 开始压缩特征\n",
224 | "compressor = LDA(n_components=2) # 将n_components设置为2\n",
225 | "# n_components <= min(n_classes - 1, n_features)\n",
226 | "\n",
227 | "compressor.fit(standardized_train, train_y) # 在训练集上训练\n",
228 | "transformed_trainset = compressor.transform(standardized_train) # 转换训练集 (100, 2)\n",
229 | "transformed_testset = compressor.transform(standardized_test) # 转换测试集\n",
230 | "assert transformed_trainset.shape[1] == transformed_testset.shape[1]\n",
231 | "# 转换后训练集和测试集有相同的特征数"
232 | ]
233 | },
234 | {
235 | "cell_type": "code",
236 | "execution_count": 4,
237 | "metadata": {
238 | "ExecuteTime": {
239 | "end_time": "2020-03-30T03:37:21.314293Z",
240 | "start_time": "2020-03-30T03:37:21.169760Z"
241 | }
242 | },
243 | "outputs": [
244 | {
245 | "data": {
246 | "image/png": "<base64-encoded PNG output omitted>
4xARkTxRcIiEmFmJrC1Wjv5cRAIKDokZZnaemTUILb9sZo3N7OMsm5wLbDSz2aErjjYBT5rZC2b2bOijZJav94SZNQ0tlzazMTm87idmVv5IEIVaRcSZWVyWbczMJphZg6NbSJhZySPbmtljZtY79NpDzKxS6OvHHfWc681smZlNO+pjlZldc2LfSYl1Cg6JJWWAe80sARgA/B1ICL3x3uPunwBjgauBSwhmG38ExAEvElyZknXG8Sn8NAfiXCDZzNqEPkqbWQszexhIc/dk4HyCq1xOBj4BpmUJiUeA/xJce/9l6FLoA6H5JV8CF5hZRYJOBz2AWkBLoDFw0N0zQsF05Hc6FRjl7udk/QDeIeiaIHLcSua+iUixMZHgGv7WwAUEfX5ecvcrjtru3wRvrsuBDUBZd99gZikAob/udxBcDjkvdNRyMsGb+hCgG3AZ0A44B2hgZq8Cv3X3KWY20d0vDH2tEmZ2SWjbV4HMUG80zGx+aH4Joc/rE/QPuweYD3wbWm5hZjOAFsClwPdAJjDQzI7uY9QMGHSc3z8RQMEhMcLM+gG/B74CDgNXELxZZ5jZ5wRv+kdarFxNMPP2WWAzUDc0p2cTQOiv+3nufraZ9QJ+E9ruD+6+z8xeDr3GacCdwDPufpeZPRM62sHMVgC7gf8QNBz8huBoZICZ9XD3WVlqjyMIggygSWj71kAdoCMwlGA2+B3u/n2W3R7l7j/rwWVmw4/7mygSouCQmBD6S38x8Ki73xQ66f0FQYdk3P0fAKFz4e8SvEmvdvd0MysF/IqgWdwRnULDSCcB44F/hp53McGQ2EFgJEFXVszsV8BD7n449PlwYBIwDzDgFWCmu68ys7FA59B20wh+T+8lCJpHgRsI2sCnEUwEO4VgaGpVfn2/RI5FwSGx5sjEpb4EbRkgODd9IbAs9Pn1hI44Qj2BphJ0lG2b5evMd/dzQkccvdx9pZllmFlHgvMUBwmODh4gmED6GXCfmfUP1dAEuAh4meA+IKlZvnbK/4oNzkscKfJ0YATBuY1OBOHSlOC8DMCUo/ZVQ1USEQoOiTVxZtYJ+BPQm2DI5y2CbqK9CBrCXUfwBnsmwQntGgRHILWB1cf42ncDWwmCI5ng4pMJQCt3/3PoRLi5+8jQEcdEd59jZmUIQuzmYxXu7t+Y2QcE7cGnELS9TzOzeQTnNh476ikaqpKIUHBILHmM4B4sScA17r6doOldSwAzu4DgTX8OwdVHJYHyBH/Z9wfGmdmdoTbWJx81VMWRe5eYWXUPWjIkhT6/P/R4qpnVMbOf3TzM3Q+Hhs7s6EtxQ8+PA0qEWr+/RnA0MwxYG7ocuD3BOZUuodohGP7K6YhjcF6/cSJZKTgkJphZPFCNICSuIxiGKgGUDX3UITgHcVVo+zpAaaAScG3opPdNwUMWR3BnyPMsuCfCGaHn3ETwhn70/RFKmVkZdz/s7r8zszsIhspey7JNmdBrfQIcCJ3b2BH6twTwmZm9RnDl1SqCo452BEdLfyQ40hltZte6+0qC3+1X3P3Fo74Pw9HvvZwg9aoSyScWNJUrGTqSOdZ25sf5i2dmJY/MJQkdpZQ40rH5RL6uSF4oOEREJE80c1xERPJEwSEiInmi4BARkTxRcIiISJ4oOEREJE/+H37WiX38YjDpAAAAAElFTkSuQmCC\n",
247 | "text/plain": [
248 | ""
249 | ]
250 | },
251 | "metadata": {
252 | "needs_background": "light"
253 | },
254 | "output_type": "display_data"
255 | }
256 | ],
257 | "source": [
258 | "# 可视化 所解释的方差与选取的特征数目之间的关系\n",
259 | "import matplotlib.pyplot as plt\n",
260 | "plt.plot(np.array(range(len(compressor.explained_variance_ratio_))) + 1, \n",
261 | " np.cumsum(compressor.explained_variance_ratio_))\n",
262 | "plt.xlabel('选取的特征数目')\n",
263 | "plt.ylabel('累计所解释的方差累')\n",
264 | "plt.show(); # LDA将原始的4个变量压缩为2个,这2个变量即能解释100%的方差"
265 | ]
266 | },
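{
"cell_type": "markdown",
"metadata": {},
"source": [
"The next cell is a minimal, self-contained sketch (not part of the original pipeline above): it re-loads the iris data and re-fits an LDA with `n_components=2` to show how the compressed 2-D representation is actually obtained via `fit_transform`. The names `lda_sketch` and `X_reduced` are illustrative only."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Hedged sketch: standalone illustration of the LDA compression step described above.\n",
"# Assumes the iris dataset (4 features, 3 classes); variable names are illustrative.\n",
"from sklearn.datasets import load_iris\n",
"from sklearn.discriminant_analysis import LinearDiscriminantAnalysis\n",
"\n",
"X, y = load_iris(return_X_y=True)\n",
"lda_sketch = LinearDiscriminantAnalysis(n_components=2)  # LDA allows at most n_classes - 1 = 2 components\n",
"X_reduced = lda_sketch.fit_transform(X, y)  # LDA is supervised, so the labels y are required\n",
"\n",
"print(X_reduced.shape)  # (150, 2): the 4 original features are compressed into 2 components\n",
"print(lda_sketch.explained_variance_ratio_.sum())  # the 2 kept components account for all of the explained variance"
]
},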
267 | {
268 | "cell_type": "code",
269 | "execution_count": null,
270 | "metadata": {},
271 | "outputs": [],
272 | "source": []
273 | }
274 | ],
275 | "metadata": {
276 | "kernelspec": {
277 | "display_name": "Python 3",
278 | "language": "python",
279 | "name": "python3"
280 | },
281 | "language_info": {
282 | "codemirror_mode": {
283 | "name": "ipython",
284 | "version": 3
285 | },
286 | "file_extension": ".py",
287 | "mimetype": "text/x-python",
288 | "name": "python",
289 | "nbconvert_exporter": "python",
290 | "pygments_lexer": "ipython3",
291 | "version": "3.6.8"
292 | },
293 | "toc": {
294 | "base_numbering": 1,
295 | "nav_menu": {},
296 | "number_sections": true,
297 | "sideBar": false,
298 | "skip_h1_title": false,
299 | "title_cell": "Table of Contents",
300 | "title_sidebar": "Contents",
301 | "toc_cell": true,
302 | "toc_position": {
303 | "height": "786.465px",
304 | "left": "40.9922px",
305 | "top": "113.535px",
306 | "width": "370.918px"
307 | },
308 | "toc_section_display": true,
309 | "toc_window_display": true
310 | }
311 | },
312 | "nbformat": 4,
313 | "nbformat_minor": 2
314 | }
315 |
--------------------------------------------------------------------------------