├── 1 ├── indexed_operations.py ├── iris.csv ├── matrix_operations.py ├── new │ ├── indexed_operations.py │ ├── iris.csv │ ├── matrix_operations.py │ ├── rank_and_shape.py │ ├── reduction.py │ └── segmentation.py ├── rank_and_shape.py ├── reduction.py └── segmentation.py ├── 2 ├── CH2_KNN.ipynb ├── CH2_NN.py ├── CH2_kmeans-FINAL.ipynb └── CH2_kmeans.py ├── 3 ├── Multivariate Linear Regression.ipynb ├── Multivariate Linear Regression.py ├── Univariate linear regression.ipynb ├── Univariate linear regression.py └── data │ └── boston.csv ├── 4 ├── CH4_Univariate_Logistic_regression.py ├── CH4_Univariate_logistic_regression.ipynb ├── CH4_univariate_logistic_regression_skflow.ipynb ├── CH4_univariate_logistic_regression_skflow.py ├── Univariate_logistic_regression_keras.ipynb ├── data │ └── CHD.csv └── old │ ├── CH4_draft.ipynb │ ├── CH4_skflow.ipynb │ ├── Test_CH4.ipynb │ ├── binary.csv │ └── data │ └── CHD.csv ├── 5 ├── CH5_Nonlinear.ipynb ├── CH5_linear_regression_nn.ipynb ├── Ch5_third_example.ipynb └── data │ ├── mpg.csv │ └── wine.csv ├── 6 ├── CIFAR.ipynb ├── Mnist_final.ipynb ├── convolution.ipynb ├── data │ ├── blue_jay.jpg │ ├── cifar-10-batches-bin │ │ ├── batches.meta.txt │ │ ├── data_batch_1.bin │ │ ├── readme.html │ │ └── test_batch.bin │ ├── leopard.jpg │ ├── test.gif │ └── test2.gif ├── image_subsampling.ipynb └── old │ ├── CIFAR.ipynb │ ├── Mnist_final.ipynb │ ├── convolution.ipynb │ ├── data │ ├── blue_jay.jpg │ ├── leopard.jpg │ ├── test.gif │ └── test2.gif │ └── image_subsampling.ipynb ├── 7 └── Code │ ├── CH7_time_series.ipynb │ ├── CH7_time_series.py │ ├── data │ └── elec_load.csv │ ├── model.py │ ├── sample.py │ ├── save │ └── .gitignore │ ├── train.py │ └── utils.py ├── 8 ├── LICENSE.txt ├── content.jpg ├── neural_style.py ├── out.jpg ├── style.jpg ├── stylize.py ├── stylize.pyc ├── vgg.py └── vgg.pyc ├── 9 ├── cluster_pi_final.py ├── gpu_pi.py ├── start_server.py └── trainer.py ├── ERRATA_AND_UPDATES.md ├── LICENSE └── README.md /1/indexed_operations.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | sess = tf.InteractiveSession() 3 | x = tf.constant([[2, 5, 3, -5], 4 | [0, 3,-2, 5], 5 | [4, 3, 5, 3], 6 | [6, 1, 4, 0]]) 7 | listx = tf.constant([1,2,3,4,5,6,7,8]) 8 | listy = tf.constant([4,5,8,9]) 9 | 10 | boolx = tf.constant([[True,False], [False,True]]) 11 | 12 | tf.argmin(x, 1).eval() # Position of the minimum value in each row 13 | tf.argmax(x, 1).eval() # Position of the maximum value in each row 14 | tf.setdiff1d(listx, listy)[0].eval() # List differences 15 | tf.where(boolx).eval() # Show true values 16 | tf.unique(listx)[0].eval() # Unique values in list 17 | -------------------------------------------------------------------------------- /1/iris.csv: -------------------------------------------------------------------------------- 1 | 5.1,3.5,1.4,0.2,setosa 2 | 4.9,3.0,1.4,0.2,setosa 3 | 4.7,3.2,1.3,0.2,setosa 4 | 4.6,3.1,1.5,0.2,setosa 5 | 5.0,3.6,1.4,0.2,setosa 6 | 5.4,3.9,1.7,0.4,setosa 7 | 4.6,3.4,1.4,0.3,setosa 8 | 5.0,3.4,1.5,0.2,setosa 9 | 4.4,2.9,1.4,0.2,setosa 10 | 4.9,3.1,1.5,0.1,setosa 11 | 5.4,3.7,1.5,0.2,setosa 12 | 4.8,3.4,1.6,0.2,setosa 13 | 4.8,3.0,1.4,0.1,setosa 14 | 4.3,3.0,1.1,0.1,setosa 15 | 5.8,4.0,1.2,0.2,setosa 16 | 5.7,4.4,1.5,0.4,setosa 17 | 5.4,3.9,1.3,0.4,setosa 18 | 5.1,3.5,1.4,0.3,setosa 19 | 5.7,3.8,1.7,0.3,setosa 20 | 5.1,3.8,1.5,0.3,setosa 21 | 5.4,3.4,1.7,0.2,setosa 22 | 5.1,3.7,1.5,0.4,setosa 23 | 4.6,3.6,1.0,0.2,setosa 24 | 5.1,3.3,1.7,0.5,setosa 25 | 
4.8,3.4,1.9,0.2,setosa 26 | 5.0,3.0,1.6,0.2,setosa 27 | 5.0,3.4,1.6,0.4,setosa 28 | 5.2,3.5,1.5,0.2,setosa 29 | 5.2,3.4,1.4,0.2,setosa 30 | 4.7,3.2,1.6,0.2,setosa 31 | 4.8,3.1,1.6,0.2,setosa 32 | 5.4,3.4,1.5,0.4,setosa 33 | 5.2,4.1,1.5,0.1,setosa 34 | 5.5,4.2,1.4,0.2,setosa 35 | 4.9,3.1,1.5,0.1,setosa 36 | 5.0,3.2,1.2,0.2,setosa 37 | 5.5,3.5,1.3,0.2,setosa 38 | 4.9,3.1,1.5,0.1,setosa 39 | 4.4,3.0,1.3,0.2,setosa 40 | 5.1,3.4,1.5,0.2,setosa 41 | 5.0,3.5,1.3,0.3,setosa 42 | 4.5,2.3,1.3,0.3,setosa 43 | 4.4,3.2,1.3,0.2,setosa 44 | 5.0,3.5,1.6,0.6,setosa 45 | 5.1,3.8,1.9,0.4,setosa 46 | 4.8,3.0,1.4,0.3,setosa 47 | 5.1,3.8,1.6,0.2,setosa 48 | 4.6,3.2,1.4,0.2,setosa 49 | 5.3,3.7,1.5,0.2,setosa 50 | 5.0,3.3,1.4,0.2,setosa 51 | 7.0,3.2,4.7,1.4,versicolor 52 | 6.4,3.2,4.5,1.5,versicolor 53 | 6.9,3.1,4.9,1.5,versicolor 54 | 5.5,2.3,4.0,1.3,versicolor 55 | 6.5,2.8,4.6,1.5,versicolor 56 | 5.7,2.8,4.5,1.3,versicolor 57 | 6.3,3.3,4.7,1.6,versicolor 58 | 4.9,2.4,3.3,1.0,versicolor 59 | 6.6,2.9,4.6,1.3,versicolor 60 | 5.2,2.7,3.9,1.4,versicolor 61 | 5.0,2.0,3.5,1.0,versicolor 62 | 5.9,3.0,4.2,1.5,versicolor 63 | 6.0,2.2,4.0,1.0,versicolor 64 | 6.1,2.9,4.7,1.4,versicolor 65 | 5.6,2.9,3.6,1.3,versicolor 66 | 6.7,3.1,4.4,1.4,versicolor 67 | 5.6,3.0,4.5,1.5,versicolor 68 | 5.8,2.7,4.1,1.0,versicolor 69 | 6.2,2.2,4.5,1.5,versicolor 70 | 5.6,2.5,3.9,1.1,versicolor 71 | 5.9,3.2,4.8,1.8,versicolor 72 | 6.1,2.8,4.0,1.3,versicolor 73 | 6.3,2.5,4.9,1.5,versicolor 74 | 6.1,2.8,4.7,1.2,versicolor 75 | 6.4,2.9,4.3,1.3,versicolor 76 | 6.6,3.0,4.4,1.4,versicolor 77 | 6.8,2.8,4.8,1.4,versicolor 78 | 6.7,3.0,5.0,1.7,versicolor 79 | 6.0,2.9,4.5,1.5,versicolor 80 | 5.7,2.6,3.5,1.0,versicolor 81 | 5.5,2.4,3.8,1.1,versicolor 82 | 5.5,2.4,3.7,1.0,versicolor 83 | 5.8,2.7,3.9,1.2,versicolor 84 | 6.0,2.7,5.1,1.6,versicolor 85 | 5.4,3.0,4.5,1.5,versicolor 86 | 6.0,3.4,4.5,1.6,versicolor 87 | 6.7,3.1,4.7,1.5,versicolor 88 | 6.3,2.3,4.4,1.3,versicolor 89 | 5.6,3.0,4.1,1.3,versicolor 90 | 5.5,2.5,4.0,1.3,versicolor 91 | 5.5,2.6,4.4,1.2,versicolor 92 | 6.1,3.0,4.6,1.4,versicolor 93 | 5.8,2.6,4.0,1.2,versicolor 94 | 5.0,2.3,3.3,1.0,versicolor 95 | 5.6,2.7,4.2,1.3,versicolor 96 | 5.7,3.0,4.2,1.2,versicolor 97 | 5.7,2.9,4.2,1.3,versicolor 98 | 6.2,2.9,4.3,1.3,versicolor 99 | 5.1,2.5,3.0,1.1,versicolor 100 | 5.7,2.8,4.1,1.3,versicolor 101 | 6.3,3.3,6.0,2.5,virginica 102 | 5.8,2.7,5.1,1.9,virginica 103 | 7.1,3.0,5.9,2.1,virginica 104 | 6.3,2.9,5.6,1.8,virginica 105 | 6.5,3.0,5.8,2.2,virginica 106 | 7.6,3.0,6.6,2.1,virginica 107 | 4.9,2.5,4.5,1.7,virginica 108 | 7.3,2.9,6.3,1.8,virginica 109 | 6.7,2.5,5.8,1.8,virginica 110 | 7.2,3.6,6.1,2.5,virginica 111 | 6.5,3.2,5.1,2.0,virginica 112 | 6.4,2.7,5.3,1.9,virginica 113 | 6.8,3.0,5.5,2.1,virginica 114 | 5.7,2.5,5.0,2.0,virginica 115 | 5.8,2.8,5.1,2.4,virginica 116 | 6.4,3.2,5.3,2.3,virginica 117 | 6.5,3.0,5.5,1.8,virginica 118 | 7.7,3.8,6.7,2.2,virginica 119 | 7.7,2.6,6.9,2.3,virginica 120 | 6.0,2.2,5.0,1.5,virginica 121 | 6.9,3.2,5.7,2.3,virginica 122 | 5.6,2.8,4.9,2.0,virginica 123 | 7.7,2.8,6.7,2.0,virginica 124 | 6.3,2.7,4.9,1.8,virginica 125 | 6.7,3.3,5.7,2.1,virginica 126 | 7.2,3.2,6.0,1.8,virginica 127 | 6.2,2.8,4.8,1.8,virginica 128 | 6.1,3.0,4.9,1.8,virginica 129 | 6.4,2.8,5.6,2.1,virginica 130 | 7.2,3.0,5.8,1.6,virginica 131 | 7.4,2.8,6.1,1.9,virginica 132 | 7.9,3.8,6.4,2.0,virginica 133 | 6.4,2.8,5.6,2.2,virginica 134 | 6.3,2.8,5.1,1.5,virginica 135 | 6.1,2.6,5.6,1.4,virginica 136 | 7.7,3.0,6.1,2.3,virginica 137 | 6.3,3.4,5.6,2.4,virginica 138 | 6.4,3.1,5.5,1.8,virginica 139 | 
6.0,3.0,4.8,1.8,virginica 140 | 6.9,3.1,5.4,2.1,virginica 141 | 6.7,3.1,5.6,2.4,virginica 142 | 6.9,3.1,5.1,2.3,virginica 143 | 5.8,2.7,5.1,1.9,virginica 144 | 6.8,3.2,5.9,2.3,virginica 145 | 6.7,3.3,5.7,2.5,virginica 146 | 6.7,3.0,5.2,2.3,virginica 147 | 6.3,2.5,5.0,1.9,virginica 148 | 6.5,3.0,5.2,2.0,virginica 149 | 6.2,3.4,5.4,2.3,virginica 150 | 5.9,3.0,5.1,1.8,virginica 151 | 152 | -------------------------------------------------------------------------------- /1/matrix_operations.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | sess = tf.InteractiveSession() 3 | 4 | x = tf.constant([[2, 5, 3, -5], 5 | [0, 3,-2, 5], 6 | [4, 3, 5, 3], 7 | [6, 1, 4, 0]]) 8 | 9 | y = tf.constant([[4, -7, 4, -3, 4], 10 | [6, 4,-7, 4, 7], 11 | [2, 3, 2, 1, 4], 12 | [1, 5, 5, 5, 2]]) 13 | 14 | floatx = tf.constant([[2., 5., 3., -5.], 15 | [0., 3.,-2., 5.], 16 | [4., 3., 5., 3.], 17 | [6., 1., 4., 0.]]) 18 | 19 | tf.transpose(x).eval() # Transpose matrix 20 | tf.matmul(x, y).eval() # Matrix multiplication 21 | tf.matrix_determinant(floatx).eval() # Matrix determinant 22 | tf.matrix_inverse(floatx).eval() # Matrix inverse 23 | tf.matrix_solve(floatx, [[1],[1],[1],[1]]).eval() # Solve Matrix system 24 | 25 | -------------------------------------------------------------------------------- /1/new/indexed_operations.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | sess = tf.InteractiveSession() 3 | x = tf.constant([[2, 5, 3, -5], 4 | [0, 3,-2, 5], 5 | [4, 3, 5, 3], 6 | [6, 1, 4, 0]]) 7 | listx = tf.constant([1,2,3,4,5,6,7,8]) 8 | listy = tf.constant([4,5,8,9]) 9 | 10 | boolx = tf.constant([[True,False], [False,True]]) 11 | 12 | tf.argmin(x, 1).eval() # Position of the minimum value in each row 13 | tf.argmax(x, 1).eval() # Position of the maximum value in each row 14 | tf.setdiff1d(listx, listy)[0].eval() # List differences 15 | tf.where(boolx).eval() # Show true values 16 | tf.unique(listx)[0].eval() # Unique values in list 17 | -------------------------------------------------------------------------------- /1/new/iris.csv: -------------------------------------------------------------------------------- 1 | 5.1,3.5,1.4,0.2,setosa 2 | 4.9,3.0,1.4,0.2,setosa 3 | 4.7,3.2,1.3,0.2,setosa 4 | 4.6,3.1,1.5,0.2,setosa 5 | 5.0,3.6,1.4,0.2,setosa 6 | 5.4,3.9,1.7,0.4,setosa 7 | 4.6,3.4,1.4,0.3,setosa 8 | 5.0,3.4,1.5,0.2,setosa 9 | 4.4,2.9,1.4,0.2,setosa 10 | 4.9,3.1,1.5,0.1,setosa 11 | 5.4,3.7,1.5,0.2,setosa 12 | 4.8,3.4,1.6,0.2,setosa 13 | 4.8,3.0,1.4,0.1,setosa 14 | 4.3,3.0,1.1,0.1,setosa 15 | 5.8,4.0,1.2,0.2,setosa 16 | 5.7,4.4,1.5,0.4,setosa 17 | 5.4,3.9,1.3,0.4,setosa 18 | 5.1,3.5,1.4,0.3,setosa 19 | 5.7,3.8,1.7,0.3,setosa 20 | 5.1,3.8,1.5,0.3,setosa 21 | 5.4,3.4,1.7,0.2,setosa 22 | 5.1,3.7,1.5,0.4,setosa 23 | 4.6,3.6,1.0,0.2,setosa 24 | 5.1,3.3,1.7,0.5,setosa 25 | 4.8,3.4,1.9,0.2,setosa 26 | 5.0,3.0,1.6,0.2,setosa 27 | 5.0,3.4,1.6,0.4,setosa 28 | 5.2,3.5,1.5,0.2,setosa 29 | 5.2,3.4,1.4,0.2,setosa 30 | 4.7,3.2,1.6,0.2,setosa 31 | 4.8,3.1,1.6,0.2,setosa 32 | 5.4,3.4,1.5,0.4,setosa 33 | 5.2,4.1,1.5,0.1,setosa 34 | 5.5,4.2,1.4,0.2,setosa 35 | 4.9,3.1,1.5,0.1,setosa 36 | 5.0,3.2,1.2,0.2,setosa 37 | 5.5,3.5,1.3,0.2,setosa 38 | 4.9,3.1,1.5,0.1,setosa 39 | 4.4,3.0,1.3,0.2,setosa 40 | 5.1,3.4,1.5,0.2,setosa 41 | 5.0,3.5,1.3,0.3,setosa 42 | 4.5,2.3,1.3,0.3,setosa 43 | 4.4,3.2,1.3,0.2,setosa 44 | 5.0,3.5,1.6,0.6,setosa 45 | 5.1,3.8,1.9,0.4,setosa 46 | 4.8,3.0,1.4,0.3,setosa 47 | 
5.1,3.8,1.6,0.2,setosa 48 | 4.6,3.2,1.4,0.2,setosa 49 | 5.3,3.7,1.5,0.2,setosa 50 | 5.0,3.3,1.4,0.2,setosa 51 | 7.0,3.2,4.7,1.4,versicolor 52 | 6.4,3.2,4.5,1.5,versicolor 53 | 6.9,3.1,4.9,1.5,versicolor 54 | 5.5,2.3,4.0,1.3,versicolor 55 | 6.5,2.8,4.6,1.5,versicolor 56 | 5.7,2.8,4.5,1.3,versicolor 57 | 6.3,3.3,4.7,1.6,versicolor 58 | 4.9,2.4,3.3,1.0,versicolor 59 | 6.6,2.9,4.6,1.3,versicolor 60 | 5.2,2.7,3.9,1.4,versicolor 61 | 5.0,2.0,3.5,1.0,versicolor 62 | 5.9,3.0,4.2,1.5,versicolor 63 | 6.0,2.2,4.0,1.0,versicolor 64 | 6.1,2.9,4.7,1.4,versicolor 65 | 5.6,2.9,3.6,1.3,versicolor 66 | 6.7,3.1,4.4,1.4,versicolor 67 | 5.6,3.0,4.5,1.5,versicolor 68 | 5.8,2.7,4.1,1.0,versicolor 69 | 6.2,2.2,4.5,1.5,versicolor 70 | 5.6,2.5,3.9,1.1,versicolor 71 | 5.9,3.2,4.8,1.8,versicolor 72 | 6.1,2.8,4.0,1.3,versicolor 73 | 6.3,2.5,4.9,1.5,versicolor 74 | 6.1,2.8,4.7,1.2,versicolor 75 | 6.4,2.9,4.3,1.3,versicolor 76 | 6.6,3.0,4.4,1.4,versicolor 77 | 6.8,2.8,4.8,1.4,versicolor 78 | 6.7,3.0,5.0,1.7,versicolor 79 | 6.0,2.9,4.5,1.5,versicolor 80 | 5.7,2.6,3.5,1.0,versicolor 81 | 5.5,2.4,3.8,1.1,versicolor 82 | 5.5,2.4,3.7,1.0,versicolor 83 | 5.8,2.7,3.9,1.2,versicolor 84 | 6.0,2.7,5.1,1.6,versicolor 85 | 5.4,3.0,4.5,1.5,versicolor 86 | 6.0,3.4,4.5,1.6,versicolor 87 | 6.7,3.1,4.7,1.5,versicolor 88 | 6.3,2.3,4.4,1.3,versicolor 89 | 5.6,3.0,4.1,1.3,versicolor 90 | 5.5,2.5,4.0,1.3,versicolor 91 | 5.5,2.6,4.4,1.2,versicolor 92 | 6.1,3.0,4.6,1.4,versicolor 93 | 5.8,2.6,4.0,1.2,versicolor 94 | 5.0,2.3,3.3,1.0,versicolor 95 | 5.6,2.7,4.2,1.3,versicolor 96 | 5.7,3.0,4.2,1.2,versicolor 97 | 5.7,2.9,4.2,1.3,versicolor 98 | 6.2,2.9,4.3,1.3,versicolor 99 | 5.1,2.5,3.0,1.1,versicolor 100 | 5.7,2.8,4.1,1.3,versicolor 101 | 6.3,3.3,6.0,2.5,virginica 102 | 5.8,2.7,5.1,1.9,virginica 103 | 7.1,3.0,5.9,2.1,virginica 104 | 6.3,2.9,5.6,1.8,virginica 105 | 6.5,3.0,5.8,2.2,virginica 106 | 7.6,3.0,6.6,2.1,virginica 107 | 4.9,2.5,4.5,1.7,virginica 108 | 7.3,2.9,6.3,1.8,virginica 109 | 6.7,2.5,5.8,1.8,virginica 110 | 7.2,3.6,6.1,2.5,virginica 111 | 6.5,3.2,5.1,2.0,virginica 112 | 6.4,2.7,5.3,1.9,virginica 113 | 6.8,3.0,5.5,2.1,virginica 114 | 5.7,2.5,5.0,2.0,virginica 115 | 5.8,2.8,5.1,2.4,virginica 116 | 6.4,3.2,5.3,2.3,virginica 117 | 6.5,3.0,5.5,1.8,virginica 118 | 7.7,3.8,6.7,2.2,virginica 119 | 7.7,2.6,6.9,2.3,virginica 120 | 6.0,2.2,5.0,1.5,virginica 121 | 6.9,3.2,5.7,2.3,virginica 122 | 5.6,2.8,4.9,2.0,virginica 123 | 7.7,2.8,6.7,2.0,virginica 124 | 6.3,2.7,4.9,1.8,virginica 125 | 6.7,3.3,5.7,2.1,virginica 126 | 7.2,3.2,6.0,1.8,virginica 127 | 6.2,2.8,4.8,1.8,virginica 128 | 6.1,3.0,4.9,1.8,virginica 129 | 6.4,2.8,5.6,2.1,virginica 130 | 7.2,3.0,5.8,1.6,virginica 131 | 7.4,2.8,6.1,1.9,virginica 132 | 7.9,3.8,6.4,2.0,virginica 133 | 6.4,2.8,5.6,2.2,virginica 134 | 6.3,2.8,5.1,1.5,virginica 135 | 6.1,2.6,5.6,1.4,virginica 136 | 7.7,3.0,6.1,2.3,virginica 137 | 6.3,3.4,5.6,2.4,virginica 138 | 6.4,3.1,5.5,1.8,virginica 139 | 6.0,3.0,4.8,1.8,virginica 140 | 6.9,3.1,5.4,2.1,virginica 141 | 6.7,3.1,5.6,2.4,virginica 142 | 6.9,3.1,5.1,2.3,virginica 143 | 5.8,2.7,5.1,1.9,virginica 144 | 6.8,3.2,5.9,2.3,virginica 145 | 6.7,3.3,5.7,2.5,virginica 146 | 6.7,3.0,5.2,2.3,virginica 147 | 6.3,2.5,5.0,1.9,virginica 148 | 6.5,3.0,5.2,2.0,virginica 149 | 6.2,3.4,5.4,2.3,virginica 150 | 5.9,3.0,5.1,1.8,virginica 151 | 152 | -------------------------------------------------------------------------------- /1/new/matrix_operations.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | sess = 
tf.InteractiveSession() 3 | 4 | x = tf.constant([[2, 5, 3, -5], 5 | [0, 3,-2, 5], 6 | [4, 3, 5, 3], 7 | [6, 1, 4, 0]]) 8 | 9 | y = tf.constant([[4, -7, 4, -3, 4], 10 | [6, 4,-7, 4, 7], 11 | [2, 3, 2, 1, 4], 12 | [1, 5, 5, 5, 2]]) 13 | 14 | floatx = tf.constant([[2., 5., 3., -5.], 15 | [0., 3.,-2., 5.], 16 | [4., 3., 5., 3.], 17 | [6., 1., 4., 0.]]) 18 | 19 | tf.transpose(x).eval() # Transpose matrix 20 | tf.matmul(x, y).eval() # Matrix multiplication 21 | tf.matrix_determinant(floatx).eval() # Matrix determinant 22 | tf.matrix_inverse(floatx).eval() # Matrix inverse 23 | tf.matrix_solve(floatx, [[1],[1],[1],[1]]).eval() # Solve Matrix system 24 | 25 | -------------------------------------------------------------------------------- /1/new/rank_and_shape.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | sess = tf.InteractiveSession() 3 | x = tf.constant([[2, 5, 3, -5], 4 | [0, 3,-2, 5], 5 | [4, 3, 5, 3], 6 | [6, 1, 4, 0]]) 7 | 8 | tf.shape(x).eval() # Shape of the tensor 9 | tf.size(x).eval() # size of the tensor 10 | tf.rank(x).eval() # rank of the tensor 11 | tf.reshape(x, [8, 2]).eval() # converting to an 8x2 matrix 12 | tf.squeeze(x).eval() # squeezing 13 | tf.expand_dims(x,1).eval() #Expanding dims 14 | -------------------------------------------------------------------------------- /1/new/reduction.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | sess = tf.InteractiveSession() 3 | x = tf.constant([[1, 2, 3], 4 | [3, 2, 1], 5 | [-1,-2,-3]]) 6 | 7 | boolean_tensor = tf.constant([[True, False, True], 8 | [False, False, True], 9 | [True, False, False]]) 10 | 11 | tf.reduce_prod(x, reduction_indices=1).eval() # reduce prod 12 | tf.reduce_min(x, reduction_indices=1).eval() # reduce min 13 | tf.reduce_max(x, reduction_indices=1).eval() # reduce max 14 | tf.reduce_mean(x, reduction_indices=1).eval() # reduce mean 15 | tf.reduce_all(boolean_tensor, reduction_indices=1).eval() # reduce all 16 | tf.reduce_any(boolean_tensor, reduction_indices=1).eval() # reduce any 17 | -------------------------------------------------------------------------------- /1/new/segmentation.py: -------------------------------------------------------------------------------- 1 | #Segmentation Examples 2 | import tensorflow as tf 3 | sess = tf.InteractiveSession() 4 | seg_ids = tf.constant([0,1,1,2,2]) # Group indexes : 0|1,2|3,4 5 | 6 | tens1 = tf.constant([[2, 5, 3, -5], 7 | [0, 3,-2, 5], 8 | [4, 3, 5, 3], 9 | [6, 1, 4, 0], 10 | [6, 1, 4, 0]]) # A sample constant matrix 11 | 12 | tf.segment_sum(tens1, seg_ids).eval() # Sum segmentation 13 | tf.segment_prod(tens1, seg_ids).eval() # Product segmentation 14 | tf.segment_min(tens1, seg_ids).eval() # minimum value of each group 15 | tf.segment_max(tens1, seg_ids).eval() # maximum value of each group 16 | tf.segment_mean(tens1, seg_ids).eval() # mean value of each group 17 | -------------------------------------------------------------------------------- /1/rank_and_shape.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | sess = tf.InteractiveSession() 3 | x = tf.constant([[2, 5, 3, -5], 4 | [0, 3,-2, 5], 5 | [4, 3, 5, 3], 6 | [6, 1, 4, 0]]) 7 | 8 | tf.shape(x).eval() # Shape of the tensor 9 | tf.size(x).eval() # size of the tensor 10 | tf.rank(x).eval() # rank of the tensor 11 | tf.reshape(x, [8, 2]).eval() # converting to an 8x2 matrix 12 | tf.squeeze(x).eval() # squeezing 
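# squeeze only removes dimensions of size 1, so the 4x4 x above comes back
# unchanged. A minimal sketch on a tensor that does have size-1 dimensions
# (x1 is a helper name introduced here purely for illustration):
x1 = tf.constant([[[1], [2]]])  # shape [1, 2, 1]
tf.squeeze(x1).eval()           # -> array([1, 2]); shape [1, 2, 1] becomes [2]
# expand_dims(x,1) on the next line goes the other way: shape [4, 4] -> [4, 1, 4]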
13 | tf.expand_dims(x,1).eval() #Expanding dims 14 | -------------------------------------------------------------------------------- /1/reduction.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | sess = tf.InteractiveSession() 3 | x = tf.constant([[1, 2, 3], 4 | [3, 2, 1], 5 | [-1,-2,-3]]) 6 | 7 | boolean_tensor = tf.constant([[True, False, True], 8 | [False, False, True], 9 | [True, False, False]]) 10 | 11 | tf.reduce_prod(x, reduction_indices=1).eval() # reduce prod 12 | tf.reduce_min(x, reduction_indices=1).eval() # reduce min 13 | tf.reduce_max(x, reduction_indices=1).eval() # reduce max 14 | tf.reduce_mean(x, reduction_indices=1).eval() # reduce mean 15 | tf.reduce_all(boolean_tensor, reduction_indices=1).eval() # reduce all 16 | tf.reduce_any(boolean_tensor, reduction_indices=1).eval() # reduce any 17 | -------------------------------------------------------------------------------- /1/segmentation.py: -------------------------------------------------------------------------------- 1 | #Segmentation Examples 2 | import tensorflow as tf 3 | sess = tf.InteractiveSession() 4 | seg_ids = tf.constant([0,1,1,2,2]) # Group indexes : 0|1,2|3,4 5 | 6 | tens1 = tf.constant([[2, 5, 3, -5], 7 | [0, 3,-2, 5], 8 | [4, 3, 5, 3], 9 | [6, 1, 4, 0], 10 | [6, 1, 4, 0]]) # A sample constant matrix 11 | 12 | tf.segment_sum(tens1, seg_ids).eval() # Sum segmentation 13 | tf.segment_prod(tens1, seg_ids).eval() # Product segmentation 14 | tf.segment_min(tens1, seg_ids).eval() # minimum value of each group 15 | tf.segment_max(tens1, seg_ids).eval() # maximum value of each group 16 | tf.segment_mean(tens1, seg_ids).eval() # mean value of each group 17 | -------------------------------------------------------------------------------- /2/CH2_NN.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | import time 4 | 5 | import matplotlib 6 | import matplotlib.pyplot as plt 7 | 8 | from sklearn.datasets.samples_generator import make_circles 9 | 10 | N=210 11 | K=2 12 | # Maximum number of iterations if the stopping condition is not met 13 | MAX_ITERS = 1000 14 | cut=int(N*0.7) 15 | 16 | start = time.time() 17 | 18 | data, features = make_circles(n_samples=N, shuffle=True, noise= 0.12, factor=0.4) 19 | tr_data, tr_features= data[:cut], features[:cut] 20 | te_data,te_features=data[cut:], features[cut:] 21 | 22 | fig, ax = plt.subplots() 23 | ax.scatter(tr_data.transpose()[0], tr_data.transpose()[1], marker = 'o', s = 100, c = tr_features, cmap=plt.cm.coolwarm ) 24 | plt.plot() 25 | 26 | points=tf.Variable(data) 27 | cluster_assignments = tf.Variable(tf.zeros([N], dtype=tf.int64)) 28 | 29 | sess = tf.Session() 30 | sess.run(tf.initialize_all_variables()) 31 | 32 | test=[] 33 | 34 | for i, j in zip(te_data, te_features): 35 | distances = tf.reduce_sum(tf.square(tf.sub(i , tr_data)),reduction_indices=1) 36 | neighbor = tf.arg_min(distances,0) 37 | 38 | #print tr_features[sess.run(neighbor)] 39 | #print j 40 | test.append(tr_features[sess.run(neighbor)]) 41 | print test 42 | fig, ax = plt.subplots() 43 | ax.scatter(te_data.transpose()[0], te_data.transpose()[1], marker = 'o', s = 100, c = test, cmap=plt.cm.coolwarm ) 44 | plt.plot() 45 | 46 | #rep_points_v = tf.reshape(points, [1, N, 2]) 47 | #rep_points_h = tf.reshape(points, [N, 2]) 48 | #sum_squares = tf.reduce_sum(tf.square(rep_points - rep_points), reduction_indices=2) 49 | #print(sess.run(tf.square(rep_points_v - rep_points_h)))
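# The loop above implements 1-nearest-neighbour: each test point takes the
# label of its single closest training point. A sketch of a k>1 majority vote
# inside that loop, replacing the arg_min lookup (assumes tf.nn.top_k and a
# small odd k; knn_indices/knn_labels are names introduced here and are not
# part of the original script):
# _, knn_indices = tf.nn.top_k(-distances, k=3)   # indices of the 3 smallest distances
# knn_labels = tr_features[sess.run(knn_indices)]
# test.append(int(round(knn_labels.mean())))      # majority class among the 3 neighbours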
50 | 51 | end = time.time() 52 | print ("Found in %.2f seconds" % (end-start)) 53 | print "Cluster assignments:", test 54 | 55 | 56 | -------------------------------------------------------------------------------- /2/CH2_kmeans.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | import time 4 | 5 | import matplotlib 6 | import matplotlib.pyplot as plt 7 | 8 | from sklearn.datasets.samples_generator import make_blobs 9 | from sklearn.datasets.samples_generator import make_circles 10 | 11 | DATA_TYPE = 'blobs' 12 | N=200 13 | # Number of clusters; if we choose circles, 2 clusters are enough 14 | if (DATA_TYPE == 'circle'): 15 | K=2 16 | else: 17 | K=4 18 | 19 | 20 | # Maximum number of iterations if the stopping condition is not met 21 | MAX_ITERS = 1000 22 | 23 | 24 | start = time.time() 25 | 26 | 27 | centers = [(-2, -2), (-2, 1.5), (1.5, -2), (2, 1.5)] 28 | if (DATA_TYPE == 'circle'): 29 | data, features = make_circles(n_samples=200, shuffle=True, noise= 0.01, factor=0.4) 30 | else: 31 | data, features = make_blobs (n_samples=200, centers=centers, n_features = 2, cluster_std=0.8, shuffle=False, random_state=42) 32 | 33 | 34 | fig, ax = plt.subplots() 35 | ax.scatter(np.asarray(centers).transpose()[0], np.asarray(centers).transpose()[1], marker = 'o', s = 250) 36 | plt.show() 37 | 38 | 39 | fig, ax = plt.subplots() 40 | if (DATA_TYPE == 'blobs'): 41 | ax.scatter(np.asarray(centers).transpose()[0], np.asarray(centers).transpose()[1], marker = 'o', s = 250) 42 | ax.scatter(data.transpose()[0], data.transpose()[1], marker = 'o', s = 100, c = features, cmap=plt.cm.coolwarm ) 43 | plt.show() 44 | 45 | 46 | points=tf.Variable(data) 47 | cluster_assignments = tf.Variable(tf.zeros([N], dtype=tf.int64)) 48 | 49 | centroids = tf.Variable(tf.slice(points.initialized_value(), [0,0], [K,2])) 50 | 51 | sess = tf.Session() 52 | sess.run(tf.initialize_all_variables()) 53 | 54 | sess.run(centroids) 55 | 56 | 57 | rep_centroids = tf.reshape(tf.tile(centroids, [N, 1]), [N, K, 2]) 58 | rep_points = tf.reshape(tf.tile(points, [1, K]), [N, K, 2]) 59 | sum_squares = tf.reduce_sum(tf.square(rep_points - rep_centroids), 60 | reduction_indices=2) 61 | 62 | 63 | best_centroids = tf.argmin(sum_squares, 1) 64 | 65 | 66 | did_assignments_change = tf.reduce_any(tf.not_equal(best_centroids, cluster_assignments)) 67 | 68 | 69 | def bucket_mean(data, bucket_ids, num_buckets): 70 | total = tf.unsorted_segment_sum(data, bucket_ids, num_buckets) 71 | count = tf.unsorted_segment_sum(tf.ones_like(data), bucket_ids, num_buckets) 72 | return total / count 73 | 74 | 75 | means = bucket_mean(points, best_centroids, K) 76 | 77 | 78 | with tf.control_dependencies([did_assignments_change]): 79 | do_updates = tf.group( 80 | centroids.assign(means), 81 | cluster_assignments.assign(best_centroids)) 82 | 83 | changed = True 84 | iters = 0 85 | 86 | 87 | fig, ax = plt.subplots() 88 | if (DATA_TYPE == 'blobs'): 89 | colourindexes=[2,1,4,3] 90 | else: 91 | colourindexes=[2,1] 92 | while changed and iters < MAX_ITERS: 93 | fig, ax = plt.subplots() 94 | iters += 1 95 | [changed, _] = sess.run([did_assignments_change, do_updates]) 96 | [centers, assignments] = sess.run([centroids, cluster_assignments]) 97 | ax.scatter(sess.run(points).transpose()[0], sess.run(points).transpose()[1], marker = 'o', s = 200, c = assignments, cmap=plt.cm.coolwarm ) 98 | ax.scatter(centers[:,0],centers[:,1], marker = '^', s = 550, c = colourindexes, cmap=plt.cm.plasma)
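# Each pass of this loop runs one k-means step (assign each point to its
# nearest centroid, then move each centroid to the mean of its points) and
# draws a snapshot: circles are the points coloured by their current cluster,
# triangles are the centroids; plt.savefig below writes it to kmeans<N>.png.
# `changed` comes from did_assignments_change, which tf.control_dependencies
# forces to be evaluated against the previous assignments before do_updates
# overwrites them, so the loop stops on the first pass in which no point
# switches cluster (or when MAX_ITERS is reached).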
99 | ax.set_title('Iteration ' + str(iters)) 100 | plt.savefig("kmeans" + str(iters) +".png") 101 | 102 | 103 | ax.scatter(sess.run(points).transpose()[0], sess.run(points).transpose()[1], marker = 'o', s = 200, c = assignments, cmap=plt.cm.coolwarm ) 104 | plt.show() 105 | 106 | 107 | end = time.time() 108 | print ("Found in %.2f seconds" % (end-start)), iters, "iterations" 109 | print "Centroids:" 110 | print centers 111 | print "Cluster assignments:", assignments 112 | 113 | 114 | 115 | -------------------------------------------------------------------------------- /3/Multivariate Linear Regression.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | import tensorflow as tf 3 | import tensorflow.contrib.learn as skflow 4 | from sklearn.utils import shuffle 5 | import numpy as np 6 | import pandas as pd 7 | 8 | df = pd.read_csv("data/boston.csv", header=0) 9 | print df.describe() 10 | 11 | f, ax1 = plt.subplots() 12 | plt.figure() # Create a new figure 13 | 14 | y = df['MEDV'] 15 | 16 | for i in range (1,8): 17 | number = 420 + i 18 | ax1.locator_params(nbins=3) 19 | ax1 = plt.subplot(number) 20 | plt.title(list(df)[i]) 21 | ax1.scatter(df[df.columns[i]],y) #Plot a scatter plot of the datapoints 22 | plt.tight_layout(pad=0.4, w_pad=0.5, h_pad=1.0) 23 | 24 | 25 | X = tf.placeholder("float", name="X") # create symbolic variables 26 | Y = tf.placeholder("float", name = "Y") 27 | 28 | 29 | with tf.name_scope("Model"): 30 | 31 | w = tf.Variable(tf.random_normal([2], stddev=0.01), name="b0") # create a shared variable 32 | b = tf.Variable(tf.random_normal([2], stddev=0.01), name="b1") # create a shared variable 33 | 34 | def model(X, w, b): 35 | return tf.mul(X, w) + b # We just define the line as X*w + b0 36 | 37 | y_model = model(X, w, b) 38 | 39 | with tf.name_scope("CostFunction"): 40 | cost = tf.reduce_mean(tf.pow(Y-y_model, 2)) # use squared error for cost function 41 | 42 | train_op = tf.train.AdamOptimizer(0.001).minimize(cost) 43 | 44 | 45 | sess = tf.Session() 46 | init = tf.initialize_all_variables() 47 | tf.train.write_graph(sess.graph, '/home/bonnin/linear2','graph.pbtxt') 48 | cost_op = tf.scalar_summary("loss", cost) 49 | merged = tf.merge_all_summaries() 50 | sess.run(init) 51 | writer = tf.train.SummaryWriter('/home/bonnin/linear2', sess.graph) 52 | 53 | xvalues = df[[df.columns[2], df.columns[4]]].values.astype(float) 54 | yvalues = df[df.columns[12]].values.astype(float) 55 | b0temp=b.eval(session=sess) 56 | b1temp=w.eval(session=sess) 57 | 58 | 59 | for a in range (1,50): 60 | cost1=0.0 61 | for i, j in zip(xvalues, yvalues): 62 | sess.run(train_op, feed_dict={X: i, Y: j}) 63 | cost1+=sess.run(cost, feed_dict={X: i, Y: j})/506.00 64 | #writer.add_summary(summary_str, i) 65 | xvalues, yvalues = shuffle (xvalues, yvalues) 66 | print (cost1) 67 | b0temp=b.eval(session=sess) 68 | b1temp=w.eval(session=sess) 69 | print (b0temp) 70 | print (b1temp) 71 | 72 | -------------------------------------------------------------------------------- /3/Univariate linear regression.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt # import matplotlib 2 | import numpy as np # import numpy 3 | import tensorflow as tf 4 | import numpy as np 5 | 6 | trX = np.linspace(-1, 1, 101) #Create a linear space of 101 points between -1 and 1 7 | trY = 2 * trX + np.random.randn(*trX.shape) * 0.4 + 0.2 #Create the y function based on the x axis 8 | plt.figure() # Create a
new figure 9 | plt.scatter(trX,trY) #Plot a scatter draw of the random datapoints 10 | plt.plot (trX, .2 + 2 * trX) # Draw one line with the line function 11 | 12 | get_ipython().magic(u'matplotlib inline') 13 | 14 | import matplotlib.pyplot as plt 15 | import tensorflow as tf 16 | import numpy as np 17 | 18 | trX = np.linspace(-1, 1, 101) 19 | trY = 2 * trX + np.random.randn(*trX.shape) * 0.4 + 0.2 # create a y value which is approximately linear but with some random noise 20 | 21 | plt.scatter(trX,trY) 22 | plt.plot (trX, .2 + 2 * trX) 23 | 24 | X = tf.placeholder("float", name="X") # create symbolic variables 25 | Y = tf.placeholder("float", name = "Y") 26 | 27 | with tf.name_scope("Model"): 28 | 29 | def model(X, w, b): 30 | return tf.mul(X, w) + b # We just define the line as X*w + b0 31 | 32 | w = tf.Variable(-1.0, name="b0") # create a shared variable 33 | b = tf.Variable(-2.0, name="b1") # create a shared variable 34 | y_model = model(X, w, b) 35 | 36 | 37 | with tf.name_scope("CostFunction"): 38 | cost = (tf.pow(Y-y_model, 2)) # use sqr error for cost function 39 | 40 | train_op = tf.train.GradientDescentOptimizer(0.05).minimize(cost) 41 | 42 | 43 | sess = tf.Session() 44 | init = tf.initialize_all_variables() 45 | tf.train.write_graph(sess.graph, '/home/ubuntu/linear','graph.pbtxt') 46 | cost_op = tf.scalar_summary("loss", cost) 47 | merged = tf.merge_all_summaries() 48 | sess.run(init) 49 | writer = tf.train.SummaryWriter('/home/ubuntu/linear', sess.graph) 50 | 51 | for i in range(100): 52 | for (x, y) in zip(trX, trY): 53 | sess.run(train_op, feed_dict={X: x, Y: y}) 54 | summary_str = sess.run(cost_op, feed_dict={X: x, Y: y}) 55 | writer.add_summary(summary_str, i) 56 | b0temp=b.eval(session=sess) 57 | b1temp=w.eval(session=sess) 58 | plt.plot (trX, b0temp + b1temp * trX ) 59 | 60 | 61 | print sess.run(w) # Should be around 2 62 | print sess.run(b) #Should be around 0.2 63 | 64 | 65 | plt.scatter(trX,trY) 66 | plt.plot (trX, sess.run(b) + trX * sess.run(w)) 67 | 68 | -------------------------------------------------------------------------------- /3/data/boston.csv: -------------------------------------------------------------------------------- 1 | CRIM, ZN ,INDUS ,CHAS,NOX,RM,AGE,DIS,RAD,TAX,PTRATIO,LSTAT,MEDV 2 | 0.00632,18,2.31,0,0.538,6.575,65.2,4.09,1,296,15.3,4.98,24 3 | 0.02731,0,7.07,0,0.469,6.421,78.9,4.9671,2,242,17.8,9.14,21.6 4 | 0.02729,0,7.07,0,0.469,7.185,61.1,4.9671,2,242,17.8,4.03,34.7 5 | 0.03237,0,2.18,0,0.458,6.998,45.8,6.0622,3,222,18.7,2.94,33.4 6 | 0.06905,0,2.18,0,0.458,7.147,54.2,6.0622,3,222,18.7,5.33,36.2 7 | 0.02985,0,2.18,0,0.458,6.43,58.7,6.0622,3,222,18.7,5.21,28.7 8 | 0.08829,12.5,7.87,0,0.524,6.012,66.6,5.5605,5,311,15.2,12.43,22.9 9 | 0.14455,12.5,7.87,0,0.524,6.172,96.1,5.9505,5,311,15.2,19.15,27.1 10 | 0.21124,12.5,7.87,0,0.524,5.631,100,6.0821,5,311,15.2,29.93,16.5 11 | 0.17004,12.5,7.87,0,0.524,6.004,85.9,6.5921,5,311,15.2,17.1,18.9 12 | 0.22489,12.5,7.87,0,0.524,6.377,94.3,6.3467,5,311,15.2,20.45,15 13 | 0.11747,12.5,7.87,0,0.524,6.009,82.9,6.2267,5,311,15.2,13.27,18.9 14 | 0.09378,12.5,7.87,0,0.524,5.889,39,5.4509,5,311,15.2,15.71,21.7 15 | 0.62976,0,8.14,0,0.538,5.949,61.8,4.7075,4,307,21,8.26,20.4 16 | 0.63796,0,8.14,0,0.538,6.096,84.5,4.4619,4,307,21,10.26,18.2 17 | 0.62739,0,8.14,0,0.538,5.834,56.5,4.4986,4,307,21,8.47,19.9 18 | 1.05393,0,8.14,0,0.538,5.935,29.3,4.4986,4,307,21,6.58,23.1 19 | 0.7842,0,8.14,0,0.538,5.99,81.7,4.2579,4,307,21,14.67,17.5 20 | 0.80271,0,8.14,0,0.538,5.456,36.6,3.7965,4,307,21,11.69,20.2 21 | 
0.7258,0,8.14,0,0.538,5.727,69.5,3.7965,4,307,21,11.28,18.2 22 | 1.25179,0,8.14,0,0.538,5.57,98.1,3.7979,4,307,21,21.02,13.6 23 | 0.85204,0,8.14,0,0.538,5.965,89.2,4.0123,4,307,21,13.83,19.6 24 | 1.23247,0,8.14,0,0.538,6.142,91.7,3.9769,4,307,21,18.72,15.2 25 | 0.98843,0,8.14,0,0.538,5.813,100,4.0952,4,307,21,19.88,14.5 26 | 0.75026,0,8.14,0,0.538,5.924,94.1,4.3996,4,307,21,16.3,15.6 27 | 0.84054,0,8.14,0,0.538,5.599,85.7,4.4546,4,307,21,16.51,13.9 28 | 0.67191,0,8.14,0,0.538,5.813,90.3,4.682,4,307,21,14.81,16.6 29 | 0.95577,0,8.14,0,0.538,6.047,88.8,4.4534,4,307,21,17.28,14.8 30 | 0.77299,0,8.14,0,0.538,6.495,94.4,4.4547,4,307,21,12.8,18.4 31 | 1.00245,0,8.14,0,0.538,6.674,87.3,4.239,4,307,21,11.98,21 32 | 1.13081,0,8.14,0,0.538,5.713,94.1,4.233,4,307,21,22.6,12.7 33 | 1.35472,0,8.14,0,0.538,6.072,100,4.175,4,307,21,13.04,14.5 34 | 1.38799,0,8.14,0,0.538,5.95,82,3.99,4,307,21,27.71,13.2 35 | 1.15172,0,8.14,0,0.538,5.701,95,3.7872,4,307,21,18.35,13.1 36 | 1.61282,0,8.14,0,0.538,6.096,96.9,3.7598,4,307,21,20.34,13.5 37 | 0.06417,0,5.96,0,0.499,5.933,68.2,3.3603,5,279,19.2,9.68,18.9 38 | 0.09744,0,5.96,0,0.499,5.841,61.4,3.3779,5,279,19.2,11.41,20 39 | 0.08014,0,5.96,0,0.499,5.85,41.5,3.9342,5,279,19.2,8.77,21 40 | 0.17505,0,5.96,0,0.499,5.966,30.2,3.8473,5,279,19.2,10.13,24.7 41 | 0.02763,75,2.95,0,0.428,6.595,21.8,5.4011,3,252,18.3,4.32,30.8 42 | 0.03359,75,2.95,0,0.428,7.024,15.8,5.4011,3,252,18.3,1.98,34.9 43 | 0.12744,0,6.91,0,0.448,6.77,2.9,5.7209,3,233,17.9,4.84,26.6 44 | 0.1415,0,6.91,0,0.448,6.169,6.6,5.7209,3,233,17.9,5.81,25.3 45 | 0.15936,0,6.91,0,0.448,6.211,6.5,5.7209,3,233,17.9,7.44,24.7 46 | 0.12269,0,6.91,0,0.448,6.069,40,5.7209,3,233,17.9,9.55,21.2 47 | 0.17142,0,6.91,0,0.448,5.682,33.8,5.1004,3,233,17.9,10.21,19.3 48 | 0.18836,0,6.91,0,0.448,5.786,33.3,5.1004,3,233,17.9,14.15,20 49 | 0.22927,0,6.91,0,0.448,6.03,85.5,5.6894,3,233,17.9,18.8,16.6 50 | 0.25387,0,6.91,0,0.448,5.399,95.3,5.87,3,233,17.9,30.81,14.4 51 | 0.21977,0,6.91,0,0.448,5.602,62,6.0877,3,233,17.9,16.2,19.4 52 | 0.08873,21,5.64,0,0.439,5.963,45.7,6.8147,4,243,16.8,13.45,19.7 53 | 0.04337,21,5.64,0,0.439,6.115,63,6.8147,4,243,16.8,9.43,20.5 54 | 0.0536,21,5.64,0,0.439,6.511,21.1,6.8147,4,243,16.8,5.28,25 55 | 0.04981,21,5.64,0,0.439,5.998,21.4,6.8147,4,243,16.8,8.43,23.4 56 | 0.0136,75,4,0,0.41,5.888,47.6,7.3197,3,469,21.1,14.8,18.9 57 | 0.01311,90,1.22,0,0.403,7.249,21.9,8.6966,5,226,17.9,4.81,35.4 58 | 0.02055,85,0.74,0,0.41,6.383,35.7,9.1876,2,313,17.3,5.77,24.7 59 | 0.01432,100,1.32,0,0.411,6.816,40.5,8.3248,5,256,15.1,3.95,31.6 60 | 0.15445,25,5.13,0,0.453,6.145,29.2,7.8148,8,284,19.7,6.86,23.3 61 | 0.10328,25,5.13,0,0.453,5.927,47.2,6.932,8,284,19.7,9.22,19.6 62 | 0.14932,25,5.13,0,0.453,5.741,66.2,7.2254,8,284,19.7,13.15,18.7 63 | 0.17171,25,5.13,0,0.453,5.966,93.4,6.8185,8,284,19.7,14.44,16 64 | 0.11027,25,5.13,0,0.453,6.456,67.8,7.2255,8,284,19.7,6.73,22.2 65 | 0.1265,25,5.13,0,0.453,6.762,43.4,7.9809,8,284,19.7,9.5,25 66 | 0.01951,17.5,1.38,0,0.4161,7.104,59.5,9.2229,3,216,18.6,8.05,33 67 | 0.03584,80,3.37,0,0.398,6.29,17.8,6.6115,4,337,16.1,4.67,23.5 68 | 0.04379,80,3.37,0,0.398,5.787,31.1,6.6115,4,337,16.1,10.24,19.4 69 | 0.05789,12.5,6.07,0,0.409,5.878,21.4,6.498,4,345,18.9,8.1,22 70 | 0.13554,12.5,6.07,0,0.409,5.594,36.8,6.498,4,345,18.9,13.09,17.4 71 | 0.12816,12.5,6.07,0,0.409,5.885,33,6.498,4,345,18.9,8.79,20.9 72 | 0.08826,0,10.81,0,0.413,6.417,6.6,5.2873,4,305,19.2,6.72,24.2 73 | 0.15876,0,10.81,0,0.413,5.961,17.5,5.2873,4,305,19.2,9.88,21.7 74 | 
0.09164,0,10.81,0,0.413,6.065,7.8,5.2873,4,305,19.2,5.52,22.8 75 | 0.19539,0,10.81,0,0.413,6.245,6.2,5.2873,4,305,19.2,7.54,23.4 76 | 0.07896,0,12.83,0,0.437,6.273,6,4.2515,5,398,18.7,6.78,24.1 77 | 0.09512,0,12.83,0,0.437,6.286,45,4.5026,5,398,18.7,8.94,21.4 78 | 0.10153,0,12.83,0,0.437,6.279,74.5,4.0522,5,398,18.7,11.97,20 79 | 0.08707,0,12.83,0,0.437,6.14,45.8,4.0905,5,398,18.7,10.27,20.8 80 | 0.05646,0,12.83,0,0.437,6.232,53.7,5.0141,5,398,18.7,12.34,21.2 81 | 0.08387,0,12.83,0,0.437,5.874,36.6,4.5026,5,398,18.7,9.1,20.3 82 | 0.04113,25,4.86,0,0.426,6.727,33.5,5.4007,4,281,19,5.29,28 83 | 0.04462,25,4.86,0,0.426,6.619,70.4,5.4007,4,281,19,7.22,23.9 84 | 0.03659,25,4.86,0,0.426,6.302,32.2,5.4007,4,281,19,6.72,24.8 85 | 0.03551,25,4.86,0,0.426,6.167,46.7,5.4007,4,281,19,7.51,22.9 86 | 0.05059,0,4.49,0,0.449,6.389,48,4.7794,3,247,18.5,9.62,23.9 87 | 0.05735,0,4.49,0,0.449,6.63,56.1,4.4377,3,247,18.5,6.53,26.6 88 | 0.05188,0,4.49,0,0.449,6.015,45.1,4.4272,3,247,18.5,12.86,22.5 89 | 0.07151,0,4.49,0,0.449,6.121,56.8,3.7476,3,247,18.5,8.44,22.2 90 | 0.0566,0,3.41,0,0.489,7.007,86.3,3.4217,2,270,17.8,5.5,23.6 91 | 0.05302,0,3.41,0,0.489,7.079,63.1,3.4145,2,270,17.8,5.7,28.7 92 | 0.04684,0,3.41,0,0.489,6.417,66.1,3.0923,2,270,17.8,8.81,22.6 93 | 0.03932,0,3.41,0,0.489,6.405,73.9,3.0921,2,270,17.8,8.2,22 94 | 0.04203,28,15.04,0,0.464,6.442,53.6,3.6659,4,270,18.2,8.16,22.9 95 | 0.02875,28,15.04,0,0.464,6.211,28.9,3.6659,4,270,18.2,6.21,25 96 | 0.04294,28,15.04,0,0.464,6.249,77.3,3.615,4,270,18.2,10.59,20.6 97 | 0.12204,0,2.89,0,0.445,6.625,57.8,3.4952,2,276,18,6.65,28.4 98 | 0.11504,0,2.89,0,0.445,6.163,69.6,3.4952,2,276,18,11.34,21.4 99 | 0.12083,0,2.89,0,0.445,8.069,76,3.4952,2,276,18,4.21,38.7 100 | 0.08187,0,2.89,0,0.445,7.82,36.9,3.4952,2,276,18,3.57,43.8 101 | 0.0686,0,2.89,0,0.445,7.416,62.5,3.4952,2,276,18,6.19,33.2 102 | 0.14866,0,8.56,0,0.52,6.727,79.9,2.7778,5,384,20.9,9.42,27.5 103 | 0.11432,0,8.56,0,0.52,6.781,71.3,2.8561,5,384,20.9,7.67,26.5 104 | 0.22876,0,8.56,0,0.52,6.405,85.4,2.7147,5,384,20.9,10.63,18.6 105 | 0.21161,0,8.56,0,0.52,6.137,87.4,2.7147,5,384,20.9,13.44,19.3 106 | 0.1396,0,8.56,0,0.52,6.167,90,2.421,5,384,20.9,12.33,20.1 107 | 0.13262,0,8.56,0,0.52,5.851,96.7,2.1069,5,384,20.9,16.47,19.5 108 | 0.1712,0,8.56,0,0.52,5.836,91.9,2.211,5,384,20.9,18.66,19.5 109 | 0.13117,0,8.56,0,0.52,6.127,85.2,2.1224,5,384,20.9,14.09,20.4 110 | 0.12802,0,8.56,0,0.52,6.474,97.1,2.4329,5,384,20.9,12.27,19.8 111 | 0.26363,0,8.56,0,0.52,6.229,91.2,2.5451,5,384,20.9,15.55,19.4 112 | 0.10793,0,8.56,0,0.52,6.195,54.4,2.7778,5,384,20.9,13,21.7 113 | 0.10084,0,10.01,0,0.547,6.715,81.6,2.6775,6,432,17.8,10.16,22.8 114 | 0.12329,0,10.01,0,0.547,5.913,92.9,2.3534,6,432,17.8,16.21,18.8 115 | 0.22212,0,10.01,0,0.547,6.092,95.4,2.548,6,432,17.8,17.09,18.7 116 | 0.14231,0,10.01,0,0.547,6.254,84.2,2.2565,6,432,17.8,10.45,18.5 117 | 0.17134,0,10.01,0,0.547,5.928,88.2,2.4631,6,432,17.8,15.76,18.3 118 | 0.13158,0,10.01,0,0.547,6.176,72.5,2.7301,6,432,17.8,12.04,21.2 119 | 0.15098,0,10.01,0,0.547,6.021,82.6,2.7474,6,432,17.8,10.3,19.2 120 | 0.13058,0,10.01,0,0.547,5.872,73.1,2.4775,6,432,17.8,15.37,20.4 121 | 0.14476,0,10.01,0,0.547,5.731,65.2,2.7592,6,432,17.8,13.61,19.3 122 | 0.06899,0,25.65,0,0.581,5.87,69.7,2.2577,2,188,19.1,14.37,22 123 | 0.07165,0,25.65,0,0.581,6.004,84.1,2.1974,2,188,19.1,14.27,20.3 124 | 0.09299,0,25.65,0,0.581,5.961,92.9,2.0869,2,188,19.1,17.93,20.5 125 | 0.15038,0,25.65,0,0.581,5.856,97,1.9444,2,188,19.1,25.41,17.3 126 | 
0.09849,0,25.65,0,0.581,5.879,95.8,2.0063,2,188,19.1,17.58,18.8 127 | 0.16902,0,25.65,0,0.581,5.986,88.4,1.9929,2,188,19.1,14.81,21.4 128 | 0.38735,0,25.65,0,0.581,5.613,95.6,1.7572,2,188,19.1,27.26,15.7 129 | 0.25915,0,21.89,0,0.624,5.693,96,1.7883,4,437,21.2,17.19,16.2 130 | 0.32543,0,21.89,0,0.624,6.431,98.8,1.8125,4,437,21.2,15.39,18 131 | 0.88125,0,21.89,0,0.624,5.637,94.7,1.9799,4,437,21.2,18.34,14.3 132 | 0.34006,0,21.89,0,0.624,6.458,98.9,2.1185,4,437,21.2,12.6,19.2 133 | 1.19294,0,21.89,0,0.624,6.326,97.7,2.271,4,437,21.2,12.26,19.6 134 | 0.59005,0,21.89,0,0.624,6.372,97.9,2.3274,4,437,21.2,11.12,23 135 | 0.32982,0,21.89,0,0.624,5.822,95.4,2.4699,4,437,21.2,15.03,18.4 136 | 0.97617,0,21.89,0,0.624,5.757,98.4,2.346,4,437,21.2,17.31,15.6 137 | 0.55778,0,21.89,0,0.624,6.335,98.2,2.1107,4,437,21.2,16.96,18.1 138 | 0.32264,0,21.89,0,0.624,5.942,93.5,1.9669,4,437,21.2,16.9,17.4 139 | 0.35233,0,21.89,0,0.624,6.454,98.4,1.8498,4,437,21.2,14.59,17.1 140 | 0.2498,0,21.89,0,0.624,5.857,98.2,1.6686,4,437,21.2,21.32,13.3 141 | 0.54452,0,21.89,0,0.624,6.151,97.9,1.6687,4,437,21.2,18.46,17.8 142 | 0.2909,0,21.89,0,0.624,6.174,93.6,1.6119,4,437,21.2,24.16,14 143 | 1.62864,0,21.89,0,0.624,5.019,100,1.4394,4,437,21.2,34.41,14.4 144 | 3.32105,0,19.58,1,0.871,5.403,100,1.3216,5,403,14.7,26.82,13.4 145 | 4.0974,0,19.58,0,0.871,5.468,100,1.4118,5,403,14.7,26.42,15.6 146 | 2.77974,0,19.58,0,0.871,4.903,97.8,1.3459,5,403,14.7,29.29,11.8 147 | 2.37934,0,19.58,0,0.871,6.13,100,1.4191,5,403,14.7,27.8,13.8 148 | 2.15505,0,19.58,0,0.871,5.628,100,1.5166,5,403,14.7,16.65,15.6 149 | 2.36862,0,19.58,0,0.871,4.926,95.7,1.4608,5,403,14.7,29.53,14.6 150 | 2.33099,0,19.58,0,0.871,5.186,93.8,1.5296,5,403,14.7,28.32,17.8 151 | 2.73397,0,19.58,0,0.871,5.597,94.9,1.5257,5,403,14.7,21.45,15.4 152 | 1.6566,0,19.58,0,0.871,6.122,97.3,1.618,5,403,14.7,14.1,21.5 153 | 1.49632,0,19.58,0,0.871,5.404,100,1.5916,5,403,14.7,13.28,19.6 154 | 1.12658,0,19.58,1,0.871,5.012,88,1.6102,5,403,14.7,12.12,15.3 155 | 2.14918,0,19.58,0,0.871,5.709,98.5,1.6232,5,403,14.7,15.79,19.4 156 | 1.41385,0,19.58,1,0.871,6.129,96,1.7494,5,403,14.7,15.12,17 157 | 3.53501,0,19.58,1,0.871,6.152,82.6,1.7455,5,403,14.7,15.02,15.6 158 | 2.44668,0,19.58,0,0.871,5.272,94,1.7364,5,403,14.7,16.14,13.1 159 | 1.22358,0,19.58,0,0.605,6.943,97.4,1.8773,5,403,14.7,4.59,41.3 160 | 1.34284,0,19.58,0,0.605,6.066,100,1.7573,5,403,14.7,6.43,24.3 161 | 1.42502,0,19.58,0,0.871,6.51,100,1.7659,5,403,14.7,7.39,23.3 162 | 1.27346,0,19.58,1,0.605,6.25,92.6,1.7984,5,403,14.7,5.5,27 163 | 1.46336,0,19.58,0,0.605,7.489,90.8,1.9709,5,403,14.7,1.73,50 164 | 1.83377,0,19.58,1,0.605,7.802,98.2,2.0407,5,403,14.7,1.92,50 165 | 1.51902,0,19.58,1,0.605,8.375,93.9,2.162,5,403,14.7,3.32,50 166 | 2.24236,0,19.58,0,0.605,5.854,91.8,2.422,5,403,14.7,11.64,22.7 167 | 2.924,0,19.58,0,0.605,6.101,93,2.2834,5,403,14.7,9.81,25 168 | 2.01019,0,19.58,0,0.605,7.929,96.2,2.0459,5,403,14.7,3.7,50 169 | 1.80028,0,19.58,0,0.605,5.877,79.2,2.4259,5,403,14.7,12.14,23.8 170 | 2.3004,0,19.58,0,0.605,6.319,96.1,2.1,5,403,14.7,11.1,23.8 171 | 2.44953,0,19.58,0,0.605,6.402,95.2,2.2625,5,403,14.7,11.32,22.3 172 | 1.20742,0,19.58,0,0.605,5.875,94.6,2.4259,5,403,14.7,14.43,17.4 173 | 2.3139,0,19.58,0,0.605,5.88,97.3,2.3887,5,403,14.7,12.03,19.1 174 | 0.13914,0,4.05,0,0.51,5.572,88.5,2.5961,5,296,16.6,14.69,23.1 175 | 0.09178,0,4.05,0,0.51,6.416,84.1,2.6463,5,296,16.6,9.04,23.6 176 | 0.08447,0,4.05,0,0.51,5.859,68.7,2.7019,5,296,16.6,9.64,22.6 177 | 0.06664,0,4.05,0,0.51,6.546,33.1,3.1323,5,296,16.6,5.33,29.4 178 | 
0.07022,0,4.05,0,0.51,6.02,47.2,3.5549,5,296,16.6,10.11,23.2 179 | 0.05425,0,4.05,0,0.51,6.315,73.4,3.3175,5,296,16.6,6.29,24.6 180 | 0.06642,0,4.05,0,0.51,6.86,74.4,2.9153,5,296,16.6,6.92,29.9 181 | 0.0578,0,2.46,0,0.488,6.98,58.4,2.829,3,193,17.8,5.04,37.2 182 | 0.06588,0,2.46,0,0.488,7.765,83.3,2.741,3,193,17.8,7.56,39.8 183 | 0.06888,0,2.46,0,0.488,6.144,62.2,2.5979,3,193,17.8,9.45,36.2 184 | 0.09103,0,2.46,0,0.488,7.155,92.2,2.7006,3,193,17.8,4.82,37.9 185 | 0.10008,0,2.46,0,0.488,6.563,95.6,2.847,3,193,17.8,5.68,32.5 186 | 0.08308,0,2.46,0,0.488,5.604,89.8,2.9879,3,193,17.8,13.98,26.4 187 | 0.06047,0,2.46,0,0.488,6.153,68.8,3.2797,3,193,17.8,13.15,29.6 188 | 0.05602,0,2.46,0,0.488,7.831,53.6,3.1992,3,193,17.8,4.45,50 189 | 0.07875,45,3.44,0,0.437,6.782,41.1,3.7886,5,398,15.2,6.68,32 190 | 0.12579,45,3.44,0,0.437,6.556,29.1,4.5667,5,398,15.2,4.56,29.8 191 | 0.0837,45,3.44,0,0.437,7.185,38.9,4.5667,5,398,15.2,5.39,34.9 192 | 0.09068,45,3.44,0,0.437,6.951,21.5,6.4798,5,398,15.2,5.1,37 193 | 0.06911,45,3.44,0,0.437,6.739,30.8,6.4798,5,398,15.2,4.69,30.5 194 | 0.08664,45,3.44,0,0.437,7.178,26.3,6.4798,5,398,15.2,2.87,36.4 195 | 0.02187,60,2.93,0,0.401,6.8,9.9,6.2196,1,265,15.6,5.03,31.1 196 | 0.01439,60,2.93,0,0.401,6.604,18.8,6.2196,1,265,15.6,4.38,29.1 197 | 0.01381,80,0.46,0,0.422,7.875,32,5.6484,4,255,14.4,2.97,50 198 | 0.04011,80,1.52,0,0.404,7.287,34.1,7.309,2,329,12.6,4.08,33.3 199 | 0.04666,80,1.52,0,0.404,7.107,36.6,7.309,2,329,12.6,8.61,30.3 200 | 0.03768,80,1.52,0,0.404,7.274,38.3,7.309,2,329,12.6,6.62,34.6 201 | 0.0315,95,1.47,0,0.403,6.975,15.3,7.6534,3,402,17,4.56,34.9 202 | 0.01778,95,1.47,0,0.403,7.135,13.9,7.6534,3,402,17,4.45,32.9 203 | 0.03445,82.5,2.03,0,0.415,6.162,38.4,6.27,2,348,14.7,7.43,24.1 204 | 0.02177,82.5,2.03,0,0.415,7.61,15.7,6.27,2,348,14.7,3.11,42.3 205 | 0.0351,95,2.68,0,0.4161,7.853,33.2,5.118,4,224,14.7,3.81,48.5 206 | 0.02009,95,2.68,0,0.4161,8.034,31.9,5.118,4,224,14.7,2.88,50 207 | 0.13642,0,10.59,0,0.489,5.891,22.3,3.9454,4,277,18.6,10.87,22.6 208 | 0.22969,0,10.59,0,0.489,6.326,52.5,4.3549,4,277,18.6,10.97,24.4 209 | 0.25199,0,10.59,0,0.489,5.783,72.7,4.3549,4,277,18.6,18.06,22.5 210 | 0.13587,0,10.59,1,0.489,6.064,59.1,4.2392,4,277,18.6,14.66,24.4 211 | 0.43571,0,10.59,1,0.489,5.344,100,3.875,4,277,18.6,23.09,20 212 | 0.17446,0,10.59,1,0.489,5.96,92.1,3.8771,4,277,18.6,17.27,21.7 213 | 0.37578,0,10.59,1,0.489,5.404,88.6,3.665,4,277,18.6,23.98,19.3 214 | 0.21719,0,10.59,1,0.489,5.807,53.8,3.6526,4,277,18.6,16.03,22.4 215 | 0.14052,0,10.59,0,0.489,6.375,32.3,3.9454,4,277,18.6,9.38,28.1 216 | 0.28955,0,10.59,0,0.489,5.412,9.8,3.5875,4,277,18.6,29.55,23.7 217 | 0.19802,0,10.59,0,0.489,6.182,42.4,3.9454,4,277,18.6,9.47,25 218 | 0.0456,0,13.89,1,0.55,5.888,56,3.1121,5,276,16.4,13.51,23.3 219 | 0.07013,0,13.89,0,0.55,6.642,85.1,3.4211,5,276,16.4,9.69,28.7 220 | 0.11069,0,13.89,1,0.55,5.951,93.8,2.8893,5,276,16.4,17.92,21.5 221 | 0.11425,0,13.89,1,0.55,6.373,92.4,3.3633,5,276,16.4,10.5,23 222 | 0.35809,0,6.2,1,0.507,6.951,88.5,2.8617,8,307,17.4,9.71,26.7 223 | 0.40771,0,6.2,1,0.507,6.164,91.3,3.048,8,307,17.4,21.46,21.7 224 | 0.62356,0,6.2,1,0.507,6.879,77.7,3.2721,8,307,17.4,9.93,27.5 225 | 0.6147,0,6.2,0,0.507,6.618,80.8,3.2721,8,307,17.4,7.6,30.1 226 | 0.31533,0,6.2,0,0.504,8.266,78.3,2.8944,8,307,17.4,4.14,44.8 227 | 0.52693,0,6.2,0,0.504,8.725,83,2.8944,8,307,17.4,4.63,50 228 | 0.38214,0,6.2,0,0.504,8.04,86.5,3.2157,8,307,17.4,3.13,37.6 229 | 0.41238,0,6.2,0,0.504,7.163,79.9,3.2157,8,307,17.4,6.36,31.6 230 | 
0.29819,0,6.2,0,0.504,7.686,17,3.3751,8,307,17.4,3.92,46.7 231 | 0.44178,0,6.2,0,0.504,6.552,21.4,3.3751,8,307,17.4,3.76,31.5 232 | 0.537,0,6.2,0,0.504,5.981,68.1,3.6715,8,307,17.4,11.65,24.3 233 | 0.46296,0,6.2,0,0.504,7.412,76.9,3.6715,8,307,17.4,5.25,31.7 234 | 0.57529,0,6.2,0,0.507,8.337,73.3,3.8384,8,307,17.4,2.47,41.7 235 | 0.33147,0,6.2,0,0.507,8.247,70.4,3.6519,8,307,17.4,3.95,48.3 236 | 0.44791,0,6.2,1,0.507,6.726,66.5,3.6519,8,307,17.4,8.05,29 237 | 0.33045,0,6.2,0,0.507,6.086,61.5,3.6519,8,307,17.4,10.88,24 238 | 0.52058,0,6.2,1,0.507,6.631,76.5,4.148,8,307,17.4,9.54,25.1 239 | 0.51183,0,6.2,0,0.507,7.358,71.6,4.148,8,307,17.4,4.73,31.5 240 | 0.08244,30,4.93,0,0.428,6.481,18.5,6.1899,6,300,16.6,6.36,23.7 241 | 0.09252,30,4.93,0,0.428,6.606,42.2,6.1899,6,300,16.6,7.37,23.3 242 | 0.11329,30,4.93,0,0.428,6.897,54.3,6.3361,6,300,16.6,11.38,22 243 | 0.10612,30,4.93,0,0.428,6.095,65.1,6.3361,6,300,16.6,12.4,20.1 244 | 0.1029,30,4.93,0,0.428,6.358,52.9,7.0355,6,300,16.6,11.22,22.2 245 | 0.12757,30,4.93,0,0.428,6.393,7.8,7.0355,6,300,16.6,5.19,23.7 246 | 0.20608,22,5.86,0,0.431,5.593,76.5,7.9549,7,330,19.1,12.5,17.6 247 | 0.19133,22,5.86,0,0.431,5.605,70.2,7.9549,7,330,19.1,18.46,18.5 248 | 0.33983,22,5.86,0,0.431,6.108,34.9,8.0555,7,330,19.1,9.16,24.3 249 | 0.19657,22,5.86,0,0.431,6.226,79.2,8.0555,7,330,19.1,10.15,20.5 250 | 0.16439,22,5.86,0,0.431,6.433,49.1,7.8265,7,330,19.1,9.52,24.5 251 | 0.19073,22,5.86,0,0.431,6.718,17.5,7.8265,7,330,19.1,6.56,26.2 252 | 0.1403,22,5.86,0,0.431,6.487,13,7.3967,7,330,19.1,5.9,24.4 253 | 0.21409,22,5.86,0,0.431,6.438,8.9,7.3967,7,330,19.1,3.59,24.8 254 | 0.08221,22,5.86,0,0.431,6.957,6.8,8.9067,7,330,19.1,3.53,29.6 255 | 0.36894,22,5.86,0,0.431,8.259,8.4,8.9067,7,330,19.1,3.54,42.8 256 | 0.04819,80,3.64,0,0.392,6.108,32,9.2203,1,315,16.4,6.57,21.9 257 | 0.03548,80,3.64,0,0.392,5.876,19.1,9.2203,1,315,16.4,9.25,20.9 258 | 0.01538,90,3.75,0,0.394,7.454,34.2,6.3361,3,244,15.9,3.11,44 259 | 0.61154,20,3.97,0,0.647,8.704,86.9,1.801,5,264,13,5.12,50 260 | 0.66351,20,3.97,0,0.647,7.333,100,1.8946,5,264,13,7.79,36 261 | 0.65665,20,3.97,0,0.647,6.842,100,2.0107,5,264,13,6.9,30.1 262 | 0.54011,20,3.97,0,0.647,7.203,81.8,2.1121,5,264,13,9.59,33.8 263 | 0.53412,20,3.97,0,0.647,7.52,89.4,2.1398,5,264,13,7.26,43.1 264 | 0.52014,20,3.97,0,0.647,8.398,91.5,2.2885,5,264,13,5.91,48.8 265 | 0.82526,20,3.97,0,0.647,7.327,94.5,2.0788,5,264,13,11.25,31 266 | 0.55007,20,3.97,0,0.647,7.206,91.6,1.9301,5,264,13,8.1,36.5 267 | 0.76162,20,3.97,0,0.647,5.56,62.8,1.9865,5,264,13,10.45,22.8 268 | 0.7857,20,3.97,0,0.647,7.014,84.6,2.1329,5,264,13,14.79,30.7 269 | 0.57834,20,3.97,0,0.575,8.297,67,2.4216,5,264,13,7.44,50 270 | 0.5405,20,3.97,0,0.575,7.47,52.6,2.872,5,264,13,3.16,43.5 271 | 0.09065,20,6.96,1,0.464,5.92,61.5,3.9175,3,223,18.6,13.65,20.7 272 | 0.29916,20,6.96,0,0.464,5.856,42.1,4.429,3,223,18.6,13,21.1 273 | 0.16211,20,6.96,0,0.464,6.24,16.3,4.429,3,223,18.6,6.59,25.2 274 | 0.1146,20,6.96,0,0.464,6.538,58.7,3.9175,3,223,18.6,7.73,24.4 275 | 0.22188,20,6.96,1,0.464,7.691,51.8,4.3665,3,223,18.6,6.58,35.2 276 | 0.05644,40,6.41,1,0.447,6.758,32.9,4.0776,4,254,17.6,3.53,32.4 277 | 0.09604,40,6.41,0,0.447,6.854,42.8,4.2673,4,254,17.6,2.98,32 278 | 0.10469,40,6.41,1,0.447,7.267,49,4.7872,4,254,17.6,6.05,33.2 279 | 0.06127,40,6.41,1,0.447,6.826,27.6,4.8628,4,254,17.6,4.16,33.1 280 | 0.07978,40,6.41,0,0.447,6.482,32.1,4.1403,4,254,17.6,7.19,29.1 281 | 0.21038,20,3.33,0,0.4429,6.812,32.2,4.1007,5,216,14.9,4.85,35.1 282 | 
0.03578,20,3.33,0,0.4429,7.82,64.5,4.6947,5,216,14.9,3.76,45.4 283 | 0.03705,20,3.33,0,0.4429,6.968,37.2,5.2447,5,216,14.9,4.59,35.4 284 | 0.06129,20,3.33,1,0.4429,7.645,49.7,5.2119,5,216,14.9,3.01,46 285 | 0.01501,90,1.21,1,0.401,7.923,24.8,5.885,1,198,13.6,3.16,50 286 | 0.00906,90,2.97,0,0.4,7.088,20.8,7.3073,1,285,15.3,7.85,32.2 287 | 0.01096,55,2.25,0,0.389,6.453,31.9,7.3073,1,300,15.3,8.23,22 288 | 0.01965,80,1.76,0,0.385,6.23,31.5,9.0892,1,241,18.2,12.93,20.1 289 | 0.03871,52.5,5.32,0,0.405,6.209,31.3,7.3172,6,293,16.6,7.14,23.2 290 | 0.0459,52.5,5.32,0,0.405,6.315,45.6,7.3172,6,293,16.6,7.6,22.3 291 | 0.04297,52.5,5.32,0,0.405,6.565,22.9,7.3172,6,293,16.6,9.51,24.8 292 | 0.03502,80,4.95,0,0.411,6.861,27.9,5.1167,4,245,19.2,3.33,28.5 293 | 0.07886,80,4.95,0,0.411,7.148,27.7,5.1167,4,245,19.2,3.56,37.3 294 | 0.03615,80,4.95,0,0.411,6.63,23.4,5.1167,4,245,19.2,4.7,27.9 295 | 0.08265,0,13.92,0,0.437,6.127,18.4,5.5027,4,289,16,8.58,23.9 296 | 0.08199,0,13.92,0,0.437,6.009,42.3,5.5027,4,289,16,10.4,21.7 297 | 0.12932,0,13.92,0,0.437,6.678,31.1,5.9604,4,289,16,6.27,28.6 298 | 0.05372,0,13.92,0,0.437,6.549,51,5.9604,4,289,16,7.39,27.1 299 | 0.14103,0,13.92,0,0.437,5.79,58,6.32,4,289,16,15.84,20.3 300 | 0.06466,70,2.24,0,0.4,6.345,20.1,7.8278,5,358,14.8,4.97,22.5 301 | 0.05561,70,2.24,0,0.4,7.041,10,7.8278,5,358,14.8,4.74,29 302 | 0.04417,70,2.24,0,0.4,6.871,47.4,7.8278,5,358,14.8,6.07,24.8 303 | 0.03537,34,6.09,0,0.433,6.59,40.4,5.4917,7,329,16.1,9.5,22 304 | 0.09266,34,6.09,0,0.433,6.495,18.4,5.4917,7,329,16.1,8.67,26.4 305 | 0.1,34,6.09,0,0.433,6.982,17.7,5.4917,7,329,16.1,4.86,33.1 306 | 0.05515,33,2.18,0,0.472,7.236,41.1,4.022,7,222,18.4,6.93,36.1 307 | 0.05479,33,2.18,0,0.472,6.616,58.1,3.37,7,222,18.4,8.93,28.4 308 | 0.07503,33,2.18,0,0.472,7.42,71.9,3.0992,7,222,18.4,6.47,33.4 309 | 0.04932,33,2.18,0,0.472,6.849,70.3,3.1827,7,222,18.4,7.53,28.2 310 | 0.49298,0,9.9,0,0.544,6.635,82.5,3.3175,4,304,18.4,4.54,22.8 311 | 0.3494,0,9.9,0,0.544,5.972,76.7,3.1025,4,304,18.4,9.97,20.3 312 | 2.63548,0,9.9,0,0.544,4.973,37.8,2.5194,4,304,18.4,12.64,16.1 313 | 0.79041,0,9.9,0,0.544,6.122,52.8,2.6403,4,304,18.4,5.98,22.1 314 | 0.26169,0,9.9,0,0.544,6.023,90.4,2.834,4,304,18.4,11.72,19.4 315 | 0.26938,0,9.9,0,0.544,6.266,82.8,3.2628,4,304,18.4,7.9,21.6 316 | 0.3692,0,9.9,0,0.544,6.567,87.3,3.6023,4,304,18.4,9.28,23.8 317 | 0.25356,0,9.9,0,0.544,5.705,77.7,3.945,4,304,18.4,11.5,16.2 318 | 0.31827,0,9.9,0,0.544,5.914,83.2,3.9986,4,304,18.4,18.33,17.8 319 | 0.24522,0,9.9,0,0.544,5.782,71.7,4.0317,4,304,18.4,15.94,19.8 320 | 0.40202,0,9.9,0,0.544,6.382,67.2,3.5325,4,304,18.4,10.36,23.1 321 | 0.47547,0,9.9,0,0.544,6.113,58.8,4.0019,4,304,18.4,12.73,21 322 | 0.1676,0,7.38,0,0.493,6.426,52.3,4.5404,5,287,19.6,7.2,23.8 323 | 0.18159,0,7.38,0,0.493,6.376,54.3,4.5404,5,287,19.6,6.87,23.1 324 | 0.35114,0,7.38,0,0.493,6.041,49.9,4.7211,5,287,19.6,7.7,20.4 325 | 0.28392,0,7.38,0,0.493,5.708,74.3,4.7211,5,287,19.6,11.74,18.5 326 | 0.34109,0,7.38,0,0.493,6.415,40.1,4.7211,5,287,19.6,6.12,25 327 | 0.19186,0,7.38,0,0.493,6.431,14.7,5.4159,5,287,19.6,5.08,24.6 328 | 0.30347,0,7.38,0,0.493,6.312,28.9,5.4159,5,287,19.6,6.15,23 329 | 0.24103,0,7.38,0,0.493,6.083,43.7,5.4159,5,287,19.6,12.79,22.2 330 | 0.06617,0,3.24,0,0.46,5.868,25.8,5.2146,4,430,16.9,9.97,19.3 331 | 0.06724,0,3.24,0,0.46,6.333,17.2,5.2146,4,430,16.9,7.34,22.6 332 | 0.04544,0,3.24,0,0.46,6.144,32.2,5.8736,4,430,16.9,9.09,19.8 333 | 0.05023,35,6.06,0,0.4379,5.706,28.4,6.6407,1,304,16.9,12.43,17.1 334 | 
0.03466,35,6.06,0,0.4379,6.031,23.3,6.6407,1,304,16.9,7.83,19.4 335 | 0.05083,0,5.19,0,0.515,6.316,38.1,6.4584,5,224,20.2,5.68,22.2 336 | 0.03738,0,5.19,0,0.515,6.31,38.5,6.4584,5,224,20.2,6.75,20.7 337 | 0.03961,0,5.19,0,0.515,6.037,34.5,5.9853,5,224,20.2,8.01,21.1 338 | 0.03427,0,5.19,0,0.515,5.869,46.3,5.2311,5,224,20.2,9.8,19.5 339 | 0.03041,0,5.19,0,0.515,5.895,59.6,5.615,5,224,20.2,10.56,18.5 340 | 0.03306,0,5.19,0,0.515,6.059,37.3,4.8122,5,224,20.2,8.51,20.6 341 | 0.05497,0,5.19,0,0.515,5.985,45.4,4.8122,5,224,20.2,9.74,19 342 | 0.06151,0,5.19,0,0.515,5.968,58.5,4.8122,5,224,20.2,9.29,18.7 343 | 0.01301,35,1.52,0,0.442,7.241,49.3,7.0379,1,284,15.5,5.49,32.7 344 | 0.02498,0,1.89,0,0.518,6.54,59.7,6.2669,1,422,15.9,8.65,16.5 345 | 0.02543,55,3.78,0,0.484,6.696,56.4,5.7321,5,370,17.6,7.18,23.9 346 | 0.03049,55,3.78,0,0.484,6.874,28.1,6.4654,5,370,17.6,4.61,31.2 347 | 0.03113,0,4.39,0,0.442,6.014,48.5,8.0136,3,352,18.8,10.53,17.5 348 | 0.06162,0,4.39,0,0.442,5.898,52.3,8.0136,3,352,18.8,12.67,17.2 349 | 0.0187,85,4.15,0,0.429,6.516,27.7,8.5353,4,351,17.9,6.36,23.1 350 | 0.01501,80,2.01,0,0.435,6.635,29.7,8.344,4,280,17,5.99,24.5 351 | 0.02899,40,1.25,0,0.429,6.939,34.5,8.7921,1,335,19.7,5.89,26.6 352 | 0.06211,40,1.25,0,0.429,6.49,44.4,8.7921,1,335,19.7,5.98,22.9 353 | 0.0795,60,1.69,0,0.411,6.579,35.9,10.7103,4,411,18.3,5.49,24.1 354 | 0.07244,60,1.69,0,0.411,5.884,18.5,10.7103,4,411,18.3,7.79,18.6 355 | 0.01709,90,2.02,0,0.41,6.728,36.1,12.1265,5,187,17,4.5,30.1 356 | 0.04301,80,1.91,0,0.413,5.663,21.9,10.5857,4,334,22,8.05,18.2 357 | 0.10659,80,1.91,0,0.413,5.936,19.5,10.5857,4,334,22,5.57,20.6 358 | 8.98296,0,18.1,1,0.77,6.212,97.4,2.1222,24,666,20.2,17.6,17.8 359 | 3.8497,0,18.1,1,0.77,6.395,91,2.5052,24,666,20.2,13.27,21.7 360 | 5.20177,0,18.1,1,0.77,6.127,83.4,2.7227,24,666,20.2,11.48,22.7 361 | 4.26131,0,18.1,0,0.77,6.112,81.3,2.5091,24,666,20.2,12.67,22.6 362 | 4.54192,0,18.1,0,0.77,6.398,88,2.5182,24,666,20.2,7.79,25 363 | 3.83684,0,18.1,0,0.77,6.251,91.1,2.2955,24,666,20.2,14.19,19.9 364 | 3.67822,0,18.1,0,0.77,5.362,96.2,2.1036,24,666,20.2,10.19,20.8 365 | 4.22239,0,18.1,1,0.77,5.803,89,1.9047,24,666,20.2,14.64,16.8 366 | 3.47428,0,18.1,1,0.718,8.78,82.9,1.9047,24,666,20.2,5.29,21.9 367 | 4.55587,0,18.1,0,0.718,3.561,87.9,1.6132,24,666,20.2,7.12,27.5 368 | 3.69695,0,18.1,0,0.718,4.963,91.4,1.7523,24,666,20.2,14,21.9 369 | 13.5222,0,18.1,0,0.631,3.863,100,1.5106,24,666,20.2,13.33,23.1 370 | 4.89822,0,18.1,0,0.631,4.97,100,1.3325,24,666,20.2,3.26,50 371 | 5.66998,0,18.1,1,0.631,6.683,96.8,1.3567,24,666,20.2,3.73,50 372 | 6.53876,0,18.1,1,0.631,7.016,97.5,1.2024,24,666,20.2,2.96,50 373 | 9.2323,0,18.1,0,0.631,6.216,100,1.1691,24,666,20.2,9.53,50 374 | 8.26725,0,18.1,1,0.668,5.875,89.6,1.1296,24,666,20.2,8.88,50 375 | 11.1081,0,18.1,0,0.668,4.906,100,1.1742,24,666,20.2,34.77,13.8 376 | 18.4982,0,18.1,0,0.668,4.138,100,1.137,24,666,20.2,37.97,13.8 377 | 19.6091,0,18.1,0,0.671,7.313,97.9,1.3163,24,666,20.2,13.44,15 378 | 15.288,0,18.1,0,0.671,6.649,93.3,1.3449,24,666,20.2,23.24,13.9 379 | 9.82349,0,18.1,0,0.671,6.794,98.8,1.358,24,666,20.2,21.24,13.3 380 | 23.6482,0,18.1,0,0.671,6.38,96.2,1.3861,24,666,20.2,23.69,13.1 381 | 17.8667,0,18.1,0,0.671,6.223,100,1.3861,24,666,20.2,21.78,10.2 382 | 88.9762,0,18.1,0,0.671,6.968,91.9,1.4165,24,666,20.2,17.21,10.4 383 | 15.8744,0,18.1,0,0.671,6.545,99.1,1.5192,24,666,20.2,21.08,10.9 384 | 9.18702,0,18.1,0,0.7,5.536,100,1.5804,24,666,20.2,23.6,11.3 385 | 7.99248,0,18.1,0,0.7,5.52,100,1.5331,24,666,20.2,24.56,12.3 386 | 
20.0849,0,18.1,0,0.7,4.368,91.2,1.4395,24,666,20.2,30.63,8.8 387 | 16.8118,0,18.1,0,0.7,5.277,98.1,1.4261,24,666,20.2,30.81,7.2 388 | 24.3938,0,18.1,0,0.7,4.652,100,1.4672,24,666,20.2,28.28,10.5 389 | 22.5971,0,18.1,0,0.7,5,89.5,1.5184,24,666,20.2,31.99,7.4 390 | 14.3337,0,18.1,0,0.7,4.88,100,1.5895,24,666,20.2,30.62,10.2 391 | 8.15174,0,18.1,0,0.7,5.39,98.9,1.7281,24,666,20.2,20.85,11.5 392 | 6.96215,0,18.1,0,0.7,5.713,97,1.9265,24,666,20.2,17.11,15.1 393 | 5.29305,0,18.1,0,0.7,6.051,82.5,2.1678,24,666,20.2,18.76,23.2 394 | 11.5779,0,18.1,0,0.7,5.036,97,1.77,24,666,20.2,25.68,9.7 395 | 8.64476,0,18.1,0,0.693,6.193,92.6,1.7912,24,666,20.2,15.17,13.8 396 | 13.3598,0,18.1,0,0.693,5.887,94.7,1.7821,24,666,20.2,16.35,12.7 397 | 8.71675,0,18.1,0,0.693,6.471,98.8,1.7257,24,666,20.2,17.12,13.1 398 | 5.87205,0,18.1,0,0.693,6.405,96,1.6768,24,666,20.2,19.37,12.5 399 | 7.67202,0,18.1,0,0.693,5.747,98.9,1.6334,24,666,20.2,19.92,8.5 400 | 38.3518,0,18.1,0,0.693,5.453,100,1.4896,24,666,20.2,30.59,5 401 | 9.91655,0,18.1,0,0.693,5.852,77.8,1.5004,24,666,20.2,29.97,6.3 402 | 25.0461,0,18.1,0,0.693,5.987,100,1.5888,24,666,20.2,26.77,5.6 403 | 14.2362,0,18.1,0,0.693,6.343,100,1.5741,24,666,20.2,20.32,7.2 404 | 9.59571,0,18.1,0,0.693,6.404,100,1.639,24,666,20.2,20.31,12.1 405 | 24.8017,0,18.1,0,0.693,5.349,96,1.7028,24,666,20.2,19.77,8.3 406 | 41.5292,0,18.1,0,0.693,5.531,85.4,1.6074,24,666,20.2,27.38,8.5 407 | 67.9208,0,18.1,0,0.693,5.683,100,1.4254,24,666,20.2,22.98,5 408 | 20.7162,0,18.1,0,0.659,4.138,100,1.1781,24,666,20.2,23.34,11.9 409 | 11.9511,0,18.1,0,0.659,5.608,100,1.2852,24,666,20.2,12.13,27.9 410 | 7.40389,0,18.1,0,0.597,5.617,97.9,1.4547,24,666,20.2,26.4,17.2 411 | 14.4383,0,18.1,0,0.597,6.852,100,1.4655,24,666,20.2,19.78,27.5 412 | 51.1358,0,18.1,0,0.597,5.757,100,1.413,24,666,20.2,10.11,15 413 | 14.0507,0,18.1,0,0.597,6.657,100,1.5275,24,666,20.2,21.22,17.2 414 | 18.811,0,18.1,0,0.597,4.628,100,1.5539,24,666,20.2,34.37,17.9 415 | 28.6558,0,18.1,0,0.597,5.155,100,1.5894,24,666,20.2,20.08,16.3 416 | 45.7461,0,18.1,0,0.693,4.519,100,1.6582,24,666,20.2,36.98,7 417 | 18.0846,0,18.1,0,0.679,6.434,100,1.8347,24,666,20.2,29.05,7.2 418 | 10.8342,0,18.1,0,0.679,6.782,90.8,1.8195,24,666,20.2,25.79,7.5 419 | 25.9406,0,18.1,0,0.679,5.304,89.1,1.6475,24,666,20.2,26.64,10.4 420 | 73.5341,0,18.1,0,0.679,5.957,100,1.8026,24,666,20.2,20.62,8.8 421 | 11.8123,0,18.1,0,0.718,6.824,76.5,1.794,24,666,20.2,22.74,8.4 422 | 11.0874,0,18.1,0,0.718,6.411,100,1.8589,24,666,20.2,15.02,16.7 423 | 7.02259,0,18.1,0,0.718,6.006,95.3,1.8746,24,666,20.2,15.7,14.2 424 | 12.0482,0,18.1,0,0.614,5.648,87.6,1.9512,24,666,20.2,14.1,20.8 425 | 7.05042,0,18.1,0,0.614,6.103,85.1,2.0218,24,666,20.2,23.29,13.4 426 | 8.79212,0,18.1,0,0.584,5.565,70.6,2.0635,24,666,20.2,17.16,11.7 427 | 15.8603,0,18.1,0,0.679,5.896,95.4,1.9096,24,666,20.2,24.39,8.3 428 | 12.2472,0,18.1,0,0.584,5.837,59.7,1.9976,24,666,20.2,15.69,10.2 429 | 37.6619,0,18.1,0,0.679,6.202,78.7,1.8629,24,666,20.2,14.52,10.9 430 | 7.36711,0,18.1,0,0.679,6.193,78.1,1.9356,24,666,20.2,21.52,11 431 | 9.33889,0,18.1,0,0.679,6.38,95.6,1.9682,24,666,20.2,24.08,9.5 432 | 8.49213,0,18.1,0,0.584,6.348,86.1,2.0527,24,666,20.2,17.64,14.5 433 | 10.0623,0,18.1,0,0.584,6.833,94.3,2.0882,24,666,20.2,19.69,14.1 434 | 6.44405,0,18.1,0,0.584,6.425,74.8,2.2004,24,666,20.2,12.03,16.1 435 | 5.58107,0,18.1,0,0.713,6.436,87.9,2.3158,24,666,20.2,16.22,14.3 436 | 13.9134,0,18.1,0,0.713,6.208,95,2.2222,24,666,20.2,15.17,11.7 437 | 11.1604,0,18.1,0,0.74,6.629,94.6,2.1247,24,666,20.2,23.27,13.4 438 | 
14.4208,0,18.1,0,0.74,6.461,93.3,2.0026,24,666,20.2,18.05,9.6 439 | 15.1772,0,18.1,0,0.74,6.152,100,1.9142,24,666,20.2,26.45,8.7 440 | 13.6781,0,18.1,0,0.74,5.935,87.9,1.8206,24,666,20.2,34.02,8.4 441 | 9.39063,0,18.1,0,0.74,5.627,93.9,1.8172,24,666,20.2,22.88,12.8 442 | 22.0511,0,18.1,0,0.74,5.818,92.4,1.8662,24,666,20.2,22.11,10.5 443 | 9.72418,0,18.1,0,0.74,6.406,97.2,2.0651,24,666,20.2,19.52,17.1 444 | 5.66637,0,18.1,0,0.74,6.219,100,2.0048,24,666,20.2,16.59,18.4 445 | 9.96654,0,18.1,0,0.74,6.485,100,1.9784,24,666,20.2,18.85,15.4 446 | 12.8023,0,18.1,0,0.74,5.854,96.6,1.8956,24,666,20.2,23.79,10.8 447 | 10.6718,0,18.1,0,0.74,6.459,94.8,1.9879,24,666,20.2,23.98,11.8 448 | 6.28807,0,18.1,0,0.74,6.341,96.4,2.072,24,666,20.2,17.79,14.9 449 | 9.92485,0,18.1,0,0.74,6.251,96.6,2.198,24,666,20.2,16.44,12.6 450 | 9.32909,0,18.1,0,0.713,6.185,98.7,2.2616,24,666,20.2,18.13,14.1 451 | 7.52601,0,18.1,0,0.713,6.417,98.3,2.185,24,666,20.2,19.31,13 452 | 6.71772,0,18.1,0,0.713,6.749,92.6,2.3236,24,666,20.2,17.44,13.4 453 | 5.44114,0,18.1,0,0.713,6.655,98.2,2.3552,24,666,20.2,17.73,15.2 454 | 5.09017,0,18.1,0,0.713,6.297,91.8,2.3682,24,666,20.2,17.27,16.1 455 | 8.24809,0,18.1,0,0.713,7.393,99.3,2.4527,24,666,20.2,16.74,17.8 456 | 9.51363,0,18.1,0,0.713,6.728,94.1,2.4961,24,666,20.2,18.71,14.9 457 | 4.75237,0,18.1,0,0.713,6.525,86.5,2.4358,24,666,20.2,18.13,14.1 458 | 4.66883,0,18.1,0,0.713,5.976,87.9,2.5806,24,666,20.2,19.01,12.7 459 | 8.20058,0,18.1,0,0.713,5.936,80.3,2.7792,24,666,20.2,16.94,13.5 460 | 7.75223,0,18.1,0,0.713,6.301,83.7,2.7831,24,666,20.2,16.23,14.9 461 | 6.80117,0,18.1,0,0.713,6.081,84.4,2.7175,24,666,20.2,14.7,20 462 | 4.81213,0,18.1,0,0.713,6.701,90,2.5975,24,666,20.2,16.42,16.4 463 | 3.69311,0,18.1,0,0.713,6.376,88.4,2.5671,24,666,20.2,14.65,17.7 464 | 6.65492,0,18.1,0,0.713,6.317,83,2.7344,24,666,20.2,13.99,19.5 465 | 5.82115,0,18.1,0,0.713,6.513,89.9,2.8016,24,666,20.2,10.29,20.2 466 | 7.83932,0,18.1,0,0.655,6.209,65.4,2.9634,24,666,20.2,13.22,21.4 467 | 3.1636,0,18.1,0,0.655,5.759,48.2,3.0665,24,666,20.2,14.13,19.9 468 | 3.77498,0,18.1,0,0.655,5.952,84.7,2.8715,24,666,20.2,17.15,19 469 | 4.42228,0,18.1,0,0.584,6.003,94.5,2.5403,24,666,20.2,21.32,19.1 470 | 15.5757,0,18.1,0,0.58,5.926,71,2.9084,24,666,20.2,18.13,19.1 471 | 13.0751,0,18.1,0,0.58,5.713,56.7,2.8237,24,666,20.2,14.76,20.1 472 | 4.34879,0,18.1,0,0.58,6.167,84,3.0334,24,666,20.2,16.29,19.9 473 | 4.03841,0,18.1,0,0.532,6.229,90.7,3.0993,24,666,20.2,12.87,19.6 474 | 3.56868,0,18.1,0,0.58,6.437,75,2.8965,24,666,20.2,14.36,23.2 475 | 4.64689,0,18.1,0,0.614,6.98,67.6,2.5329,24,666,20.2,11.66,29.8 476 | 8.05579,0,18.1,0,0.584,5.427,95.4,2.4298,24,666,20.2,18.14,13.8 477 | 6.39312,0,18.1,0,0.584,6.162,97.4,2.206,24,666,20.2,24.1,13.3 478 | 4.87141,0,18.1,0,0.614,6.484,93.6,2.3053,24,666,20.2,18.68,16.7 479 | 15.0234,0,18.1,0,0.614,5.304,97.3,2.1007,24,666,20.2,24.91,12 480 | 10.233,0,18.1,0,0.614,6.185,96.7,2.1705,24,666,20.2,18.03,14.6 481 | 14.3337,0,18.1,0,0.614,6.229,88,1.9512,24,666,20.2,13.11,21.4 482 | 5.82401,0,18.1,0,0.532,6.242,64.7,3.4242,24,666,20.2,10.74,23 483 | 5.70818,0,18.1,0,0.532,6.75,74.9,3.3317,24,666,20.2,7.74,23.7 484 | 5.73116,0,18.1,0,0.532,7.061,77,3.4106,24,666,20.2,7.01,25 485 | 2.81838,0,18.1,0,0.532,5.762,40.3,4.0983,24,666,20.2,10.42,21.8 486 | 2.37857,0,18.1,0,0.583,5.871,41.9,3.724,24,666,20.2,13.34,20.6 487 | 3.67367,0,18.1,0,0.583,6.312,51.9,3.9917,24,666,20.2,10.58,21.2 488 | 5.69175,0,18.1,0,0.583,6.114,79.8,3.5459,24,666,20.2,14.98,19.1 489 | 
4.83567,0,18.1,0,0.583,5.905,53.2,3.1523,24,666,20.2,11.45,20.6 490 | 0.15086,0,27.74,0,0.609,5.454,92.7,1.8209,4,711,20.1,18.06,15.2 491 | 0.18337,0,27.74,0,0.609,5.414,98.3,1.7554,4,711,20.1,23.97,7 492 | 0.20746,0,27.74,0,0.609,5.093,98,1.8226,4,711,20.1,29.68,8.1 493 | 0.10574,0,27.74,0,0.609,5.983,98.8,1.8681,4,711,20.1,18.07,13.6 494 | 0.11132,0,27.74,0,0.609,5.983,83.5,2.1099,4,711,20.1,13.35,20.1 495 | 0.17331,0,9.69,0,0.585,5.707,54,2.3817,6,391,19.2,12.01,21.8 496 | 0.27957,0,9.69,0,0.585,5.926,42.6,2.3817,6,391,19.2,13.59,24.5 497 | 0.17899,0,9.69,0,0.585,5.67,28.8,2.7986,6,391,19.2,17.6,23.1 498 | 0.2896,0,9.69,0,0.585,5.39,72.9,2.7986,6,391,19.2,21.14,19.7 499 | 0.26838,0,9.69,0,0.585,5.794,70.6,2.8927,6,391,19.2,14.1,18.3 500 | 0.23912,0,9.69,0,0.585,6.019,65.3,2.4091,6,391,19.2,12.92,21.2 501 | 0.17783,0,9.69,0,0.585,5.569,73.5,2.3999,6,391,19.2,15.1,17.5 502 | 0.22438,0,9.69,0,0.585,6.027,79.7,2.4982,6,391,19.2,14.33,16.8 503 | 0.06263,0,11.93,0,0.573,6.593,69.1,2.4786,1,273,21,9.67,22.4 504 | 0.04527,0,11.93,0,0.573,6.12,76.7,2.2875,1,273,21,9.08,20.6 505 | 0.06076,0,11.93,0,0.573,6.976,91,2.1675,1,273,21,5.64,23.9 506 | 0.10959,0,11.93,0,0.573,6.794,89.3,2.3889,1,273,21,6.48,22 507 | 0.04741,0,11.93,0,0.573,6.03,80.8,2.505,1,273,21,7.88,11.9 508 | -------------------------------------------------------------------------------- /4/CH4_Univariate_Logistic_regression.py: -------------------------------------------------------------------------------- 1 | import pandas as pd 2 | import numpy as np 3 | get_ipython().magic(u'matplotlib inline') 4 | import matplotlib.pyplot as plt 5 | import tensorflow as tf 6 | 7 | df = pd.read_csv("data/CHD.csv", header=0) 8 | 9 | 10 | # Parameters 11 | 12 | learning_rate = 0.2 13 | training_epochs = 5 14 | batch_size = 100 15 | display_step = 1 16 | sess = tf.Session() 17 | b=np.zeros((100,2)) 18 | #print pd.get_dummies(df['admit']).values[1] 19 | print sess.run(tf.one_hot(indices = [1, 3, 2, 4], depth=5, on_value = 1, off_value = 0, axis = 1 , name = "a")) 20 | #print a.eval(session=sess) 21 | 22 | 23 | # tf Graph Input 24 | 25 | x = tf.placeholder("float", [None, 1]) 26 | y = tf.placeholder("float", [None, 2]) 27 | # Create model 28 | # Set model weights 29 | W = tf.Variable(tf.zeros([1, 2])) 30 | b = tf.Variable(tf.zeros([2])) 31 | 32 | 33 | # Construct model 34 | activation = tf.nn.softmax(tf.matmul(x, W) + b) 35 | # Minimize error using cross entropy 36 | cost = tf.reduce_mean(-tf.reduce_sum(y*tf.log(activation), reduction_indices=1)) # Cross entropy 37 | optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost) # Gradient Descent 38 | 39 | 40 | # Initializing the variables 41 | init = tf.initialize_all_variables() 42 | 43 | 44 | # Launch the graph 45 | 46 | with tf.Session() as sess: 47 | tf.train.write_graph(sess.graph, './graphs','graph.pbtxt') 48 | sess.run(init) 49 | writer = tf.train.SummaryWriter('./graphs', sess.graph) 50 | #Initialize the graph structure 51 | 52 | graphnumber=321 53 | 54 | #Generate a new graph 55 | plt.figure(1) 56 | 57 | #Iterate through all the epochs 58 | for epoch in range(training_epochs): 59 | avg_cost = 0. 
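# Note: CHD.csv has 100 rows and batch_size is set to 100 above, so the
# hard-coded 400/batch_size below is Python 2 integer division yielding 4;
# each "batch" feeds the whole dataset, so every epoch performs four
# full-batch gradient steps rather than true mini-batch training.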
60 | total_batch = 400/batch_size 61 | # Loop over all batches 62 | 63 | for i in range(total_batch): 64 | # Transform the array into a one hot format 65 | 66 | temp=tf.one_hot(indices = df['chd'].values, depth=2, on_value = 1, off_value = 0, axis = -1 , name = "a") 67 | batch_xs, batch_ys = (np.transpose([df['age']])-44.38)/11.721327, temp 68 | 69 | # Fit training using batch data 70 | sess.run(optimizer, feed_dict={x: batch_xs.astype(float), y: batch_ys.eval()}) 71 | 72 | # Compute average loss, summing the current cost divided by the total number of batches 73 | avg_cost += sess.run(cost, feed_dict={x: batch_xs.astype(float), y: batch_ys.eval()})/total_batch 74 | # Display logs per epoch step 75 | 76 | if epoch % display_step == 0: 77 | print "Epoch:", '%05d' % (epoch+1), "cost=", "{:.8f}".format(avg_cost) 78 | 79 | #Generate a new graph, and add it to the complete graph 80 | 81 | trX = np.linspace(-30, 30, 100) 82 | print (b.eval()) 83 | print (W.eval()) 84 | Wdos=2*W.eval()[0][0]/11.721327 85 | bdos=2*b.eval()[0] 86 | 87 | # Generate the probability function 88 | trY = np.exp(-(Wdos*trX)+bdos)/(1+np.exp(-(Wdos*trX)+bdos) ) 89 | 90 | # Draw the samples and the probability function, without the normalization 91 | plt.subplot(graphnumber) 92 | graphnumber=graphnumber+1 93 | 94 | #Plot a scatter draw of the random datapoints 95 | plt.scatter((df['age']),df['chd']) 96 | plt.plot(trX+44.38,trY) #Plot the fitted probability curve over the original age scale 97 | plt.grid(True) 98 | 99 | #Plot the final graph 100 | plt.savefig("test.svg") 101 | 102 | 103 | -------------------------------------------------------------------------------- /4/CH4_univariate_logistic_regression_skflow.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "collapsed": true 8 | }, 9 | "outputs": [], 10 | "source": [ 11 | "import tensorflow.contrib.learn as skflow\n", 12 | "from sklearn import datasets, metrics, preprocessing\n", 13 | "import numpy as np\n", 14 | "import pandas as pd\n", 15 | "\n", 16 | "df = pd.read_csv(\"data/CHD.csv\", header=0)\n", 17 | "print df.describe()\n", 18 | "\n", 19 | "def my_model(X, y):\n", 20 | " return skflow.models.logistic_regression(X, y)\n", 21 | "\n", 22 | "a = preprocessing.StandardScaler()\n", 23 | "\n", 24 | "X =a.fit_transform(df['age'].astype(float))\n", 25 | "\n", 26 | "print a.get_params()\n", 27 | "classifier = skflow.TensorFlowEstimator(model_fn=my_model, n_classes=1)\n", 28 | "classifier.fit(X, df['chd'].astype(float), logdir='/tmp/logistic')\n", 29 | "print(classifier.get_tensor_value('logistic_regression/bias:0'))\n", 30 | "print(classifier.get_tensor_value('logistic_regression/weight:0'))\n", 31 | "score = metrics.accuracy_score(df['chd'].astype(float), classifier.predict(X))\n", 32 | "print(\"Accuracy: %f\" % score)" 33 | ] 34 | } 35 | ], 36 | "metadata": { 37 | "kernelspec": { 38 | "display_name": "Python 2", 39 | "language": "python", 40 | "name": "python2" 41 | }, 42 | "language_info": { 43 | "codemirror_mode": { 44 | "name": "ipython", 45 | "version": 2 46 | }, 47 | "file_extension": ".py", 48 | "mimetype": "text/x-python", 49 | "name": "python", 50 | "nbconvert_exporter": "python", 51 | "pygments_lexer": "ipython2", 52 | "version": "2.7.11+" 53 | } 54 | }, 55 | "nbformat": 4, 56 | "nbformat_minor": 0 57 | } 58 | -------------------------------------------------------------------------------- /4/CH4_univariate_logistic_regression_skflow.py: 
-------------------------------------------------------------------------------- 1 | import tensorflow.contrib.learn as skflow 2 | from sklearn import datasets, metrics, preprocessing 3 | import numpy as np 4 | import pandas as pd 5 | 6 | df = pd.read_csv("data/CHD.csv", header=0) 7 | print df.describe() 8 | 9 | def my_model(X, y): 10 | return skflow.models.logistic_regression(X, y) 11 | 12 | a = preprocessing.StandardScaler() 13 | 14 | X =a.fit_transform(df['age'].astype(float)) 15 | 16 | print a.get_params() 17 | classifier = skflow.TensorFlowEstimator(model_fn=my_model, n_classes=1) 18 | classifier.fit(X, df['chd'].astype(float), logdir='/tmp/logistic') 19 | print(classifier.get_tensor_value('logistic_regression/bias:0')) 20 | print(classifier.get_tensor_value('logistic_regression/weight:0')) 21 | score = metrics.accuracy_score(df['chd'].astype(float), classifier.predict(X)) 22 | print("Accuracy: %f" % score) 23 | 24 | -------------------------------------------------------------------------------- /4/Univariate_logistic_regression_keras.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": { 7 | "collapsed": false 8 | }, 9 | "outputs": [ 10 | { 11 | "name": "stderr", 12 | "output_type": "stream", 13 | "text": [ 14 | "Using TensorFlow backend.\n" 15 | ] 16 | }, 17 | { 18 | "name": "stdout", 19 | "output_type": "stream", 20 | "text": [ 21 | " age chd\n", 22 | "count 100.000000 100.00000\n", 23 | "mean 44.380000 0.43000\n", 24 | "std 11.721327 0.49757\n", 25 | "min 20.000000 0.00000\n", 26 | "25% 34.750000 0.00000\n", 27 | "50% 44.000000 0.00000\n", 28 | "75% 55.000000 1.00000\n", 29 | "max 69.000000 1.00000\n" 30 | ] 31 | }, 32 | { 33 | "name": "stderr", 34 | "output_type": "stream", 35 | "text": [ 36 | "/usr/lib/python2.7/dist-packages/sklearn/utils/validation.py:420: DataConversionWarning: Data with input dtype int64 was converted to float64 by StandardScaler.\n", 37 | " warnings.warn(msg, DataConversionWarning)\n", 38 | "/usr/lib/python2.7/dist-packages/sklearn/utils/validation.py:420: DataConversionWarning: Data with input dtype int64 was converted to float64 by StandardScaler.\n", 39 | " warnings.warn(msg, DataConversionWarning)\n" 40 | ] 41 | }, 42 | { 43 | "name": "stdout", 44 | "output_type": "stream", 45 | "text": [ 46 | "Train on 60 samples, validate on 30 samples\n", 47 | "Epoch 1/100\n", 48 | "0s - loss: 0.5174 - acc: 0.7500 - val_loss: 0.6135 - val_acc: 0.6667\n", 49 | "Epoch 2/100\n", 50 | "0s - loss: 0.5170 - acc: 0.7500 - val_loss: 0.6132 - val_acc: 0.6667\n", 51 | "Epoch 3/100\n", 52 | "0s - loss: 0.5169 - acc: 0.7500 - val_loss: 0.6130 - val_acc: 0.6667\n", 53 | "Epoch 4/100\n", 54 | "0s - loss: 0.5167 - acc: 0.7500 - val_loss: 0.6127 - val_acc: 0.6667\n", 55 | "Epoch 5/100\n", 56 | "0s - loss: 0.5165 - acc: 0.7500 - val_loss: 0.6125 - val_acc: 0.6667\n", 57 | "Epoch 6/100\n", 58 | "0s - loss: 0.5164 - acc: 0.7500 - val_loss: 0.6123 - val_acc: 0.6667\n", 59 | "Epoch 7/100\n", 60 | "0s - loss: 0.5163 - acc: 0.7500 - val_loss: 0.6121 - val_acc: 0.6667\n", 61 | "Epoch 8/100\n", 62 | "0s - loss: 0.5162 - acc: 0.7500 - val_loss: 0.6119 - val_acc: 0.6667\n", 63 | "Epoch 9/100\n", 64 | "0s - loss: 0.5161 - acc: 0.7500 - val_loss: 0.6118 - val_acc: 0.6667\n", 65 | "Epoch 10/100\n", 66 | "0s - loss: 0.5160 - acc: 0.7500 - val_loss: 0.6116 - val_acc: 0.6667\n", 67 | "Epoch 11/100\n", 68 | "0s - loss: 0.5159 - acc: 0.7500 - val_loss: 0.6115 - val_acc: 
0.6667\n", 69 | "Epoch 12/100\n", 70 | "0s - loss: 0.5158 - acc: 0.7500 - val_loss: 0.6113 - val_acc: 0.6667\n", 71 | "Epoch 13/100\n", 72 | "0s - loss: 0.5157 - acc: 0.7500 - val_loss: 0.6113 - val_acc: 0.6667\n", 73 | "Epoch 14/100\n", 74 | "0s - loss: 0.5156 - acc: 0.7500 - val_loss: 0.6111 - val_acc: 0.6667\n", 75 | "Epoch 15/100\n", 76 | "0s - loss: 0.5155 - acc: 0.7500 - val_loss: 0.6110 - val_acc: 0.6667\n", 77 | "Epoch 16/100\n", 78 | "0s - loss: 0.5155 - acc: 0.7500 - val_loss: 0.6109 - val_acc: 0.6667\n", 79 | "Epoch 17/100\n", 80 | "0s - loss: 0.5153 - acc: 0.7500 - val_loss: 0.6108 - val_acc: 0.6667\n", 81 | "Epoch 18/100\n", 82 | "0s - loss: 0.5152 - acc: 0.7500 - val_loss: 0.6107 - val_acc: 0.6667\n", 83 | "Epoch 19/100\n", 84 | "0s - loss: 0.5152 - acc: 0.7500 - val_loss: 0.6105 - val_acc: 0.6667\n", 85 | "Epoch 20/100\n", 86 | "0s - loss: 0.5151 - acc: 0.7500 - val_loss: 0.6104 - val_acc: 0.6667\n", 87 | "Epoch 21/100\n", 88 | "0s - loss: 0.5150 - acc: 0.7500 - val_loss: 0.6103 - val_acc: 0.6667\n", 89 | "Epoch 22/100\n", 90 | "0s - loss: 0.5150 - acc: 0.7500 - val_loss: 0.6102 - val_acc: 0.6667\n", 91 | "Epoch 23/100\n", 92 | "0s - loss: 0.5148 - acc: 0.7500 - val_loss: 0.6100 - val_acc: 0.6667\n", 93 | "Epoch 24/100\n", 94 | "0s - loss: 0.5148 - acc: 0.7500 - val_loss: 0.6099 - val_acc: 0.6667\n", 95 | "Epoch 25/100\n", 96 | "0s - loss: 0.5147 - acc: 0.7500 - val_loss: 0.6097 - val_acc: 0.6667\n", 97 | "Epoch 26/100\n", 98 | "0s - loss: 0.5146 - acc: 0.7500 - val_loss: 0.6096 - val_acc: 0.6667\n", 99 | "Epoch 27/100\n", 100 | "0s - loss: 0.5145 - acc: 0.7500 - val_loss: 0.6095 - val_acc: 0.6667\n", 101 | "Epoch 28/100\n", 102 | "0s - loss: 0.5144 - acc: 0.7500 - val_loss: 0.6093 - val_acc: 0.6667\n", 103 | "Epoch 29/100\n", 104 | "0s - loss: 0.5143 - acc: 0.7500 - val_loss: 0.6092 - val_acc: 0.6667\n", 105 | "Epoch 30/100\n", 106 | "0s - loss: 0.5143 - acc: 0.7500 - val_loss: 0.6090 - val_acc: 0.6667\n", 107 | "Epoch 31/100\n", 108 | "0s - loss: 0.5142 - acc: 0.7500 - val_loss: 0.6089 - val_acc: 0.6667\n", 109 | "Epoch 32/100\n", 110 | "0s - loss: 0.5142 - acc: 0.7500 - val_loss: 0.6088 - val_acc: 0.6667\n", 111 | "Epoch 33/100\n", 112 | "0s - loss: 0.5140 - acc: 0.7500 - val_loss: 0.6087 - val_acc: 0.6667\n", 113 | "Epoch 34/100\n", 114 | "0s - loss: 0.5140 - acc: 0.7500 - val_loss: 0.6085 - val_acc: 0.6667\n", 115 | "Epoch 35/100\n", 116 | "0s - loss: 0.5139 - acc: 0.7500 - val_loss: 0.6085 - val_acc: 0.6667\n", 117 | "Epoch 36/100\n", 118 | "0s - loss: 0.5138 - acc: 0.7500 - val_loss: 0.6084 - val_acc: 0.6667\n", 119 | "Epoch 37/100\n", 120 | "0s - loss: 0.5137 - acc: 0.7500 - val_loss: 0.6083 - val_acc: 0.6667\n", 121 | "Epoch 38/100\n", 122 | "0s - loss: 0.5137 - acc: 0.7500 - val_loss: 0.6082 - val_acc: 0.6667\n", 123 | "Epoch 39/100\n", 124 | "0s - loss: 0.5137 - acc: 0.7500 - val_loss: 0.6081 - val_acc: 0.6667\n", 125 | "Epoch 40/100\n", 126 | "0s - loss: 0.5136 - acc: 0.7500 - val_loss: 0.6080 - val_acc: 0.6667\n", 127 | "Epoch 41/100\n", 128 | "0s - loss: 0.5134 - acc: 0.7500 - val_loss: 0.6078 - val_acc: 0.6667\n", 129 | "Epoch 42/100\n", 130 | "0s - loss: 0.5134 - acc: 0.7500 - val_loss: 0.6077 - val_acc: 0.6667\n", 131 | "Epoch 43/100\n", 132 | "0s - loss: 0.5134 - acc: 0.7500 - val_loss: 0.6077 - val_acc: 0.6667\n", 133 | "Epoch 44/100\n", 134 | "0s - loss: 0.5132 - acc: 0.7500 - val_loss: 0.6076 - val_acc: 0.6667\n", 135 | "Epoch 45/100\n", 136 | "0s - loss: 0.5132 - acc: 0.7500 - val_loss: 0.6075 - val_acc: 0.6667\n", 137 | "Epoch 46/100\n", 138 | "0s - 
loss: 0.5131 - acc: 0.7500 - val_loss: 0.6074 - val_acc: 0.6667\n", 139 | "Epoch 47/100\n", 140 | "0s - loss: 0.5131 - acc: 0.7500 - val_loss: 0.6073 - val_acc: 0.6667\n", 141 | "Epoch 48/100\n", 142 | "0s - loss: 0.5130 - acc: 0.7500 - val_loss: 0.6072 - val_acc: 0.6667\n", 143 | "Epoch 49/100\n", 144 | "0s - loss: 0.5129 - acc: 0.7500 - val_loss: 0.6070 - val_acc: 0.6667\n", 145 | "Epoch 50/100\n", 146 | "0s - loss: 0.5129 - acc: 0.7500 - val_loss: 0.6069 - val_acc: 0.6667\n", 147 | "Epoch 51/100\n", 148 | "0s - loss: 0.5128 - acc: 0.7500 - val_loss: 0.6068 - val_acc: 0.6667\n", 149 | "Epoch 52/100\n", 150 | "0s - loss: 0.5127 - acc: 0.7500 - val_loss: 0.6067 - val_acc: 0.6667\n", 151 | "Epoch 53/100\n", 152 | "0s - loss: 0.5127 - acc: 0.7500 - val_loss: 0.6066 - val_acc: 0.6667\n", 153 | "Epoch 54/100\n", 154 | "0s - loss: 0.5126 - acc: 0.7500 - val_loss: 0.6064 - val_acc: 0.6667\n", 155 | "Epoch 55/100\n", 156 | "0s - loss: 0.5125 - acc: 0.7500 - val_loss: 0.6063 - val_acc: 0.6667\n", 157 | "Epoch 56/100\n", 158 | "0s - loss: 0.5125 - acc: 0.7500 - val_loss: 0.6062 - val_acc: 0.6667\n", 159 | "Epoch 57/100\n", 160 | "0s - loss: 0.5124 - acc: 0.7500 - val_loss: 0.6061 - val_acc: 0.6667\n", 161 | "Epoch 58/100\n", 162 | "0s - loss: 0.5123 - acc: 0.7500 - val_loss: 0.6059 - val_acc: 0.6667\n", 163 | "Epoch 59/100\n", 164 | "0s - loss: 0.5122 - acc: 0.7500 - val_loss: 0.6058 - val_acc: 0.6667\n", 165 | "Epoch 60/100\n", 166 | "0s - loss: 0.5122 - acc: 0.7500 - val_loss: 0.6057 - val_acc: 0.6667\n", 167 | "Epoch 61/100\n", 168 | "0s - loss: 0.5121 - acc: 0.7500 - val_loss: 0.6056 - val_acc: 0.6667\n", 169 | "Epoch 62/100\n", 170 | "0s - loss: 0.5120 - acc: 0.7500 - val_loss: 0.6055 - val_acc: 0.6667\n", 171 | "Epoch 63/100\n", 172 | "0s - loss: 0.5120 - acc: 0.7500 - val_loss: 0.6054 - val_acc: 0.6667\n", 173 | "Epoch 64/100\n", 174 | "0s - loss: 0.5119 - acc: 0.7500 - val_loss: 0.6052 - val_acc: 0.6667\n", 175 | "Epoch 65/100\n", 176 | "0s - loss: 0.5119 - acc: 0.7500 - val_loss: 0.6051 - val_acc: 0.6667\n", 177 | "Epoch 66/100\n", 178 | "0s - loss: 0.5118 - acc: 0.7500 - val_loss: 0.6051 - val_acc: 0.6667\n", 179 | "Epoch 67/100\n", 180 | "0s - loss: 0.5117 - acc: 0.7500 - val_loss: 0.6050 - val_acc: 0.6667\n", 181 | "Epoch 68/100\n", 182 | "0s - loss: 0.5117 - acc: 0.7500 - val_loss: 0.6049 - val_acc: 0.6667\n", 183 | "Epoch 69/100\n", 184 | "0s - loss: 0.5116 - acc: 0.7500 - val_loss: 0.6048 - val_acc: 0.6667\n", 185 | "Epoch 70/100\n", 186 | "0s - loss: 0.5115 - acc: 0.7500 - val_loss: 0.6047 - val_acc: 0.6667\n", 187 | "Epoch 71/100\n", 188 | "0s - loss: 0.5115 - acc: 0.7500 - val_loss: 0.6046 - val_acc: 0.6667\n", 189 | "Epoch 72/100\n", 190 | "0s - loss: 0.5114 - acc: 0.7500 - val_loss: 0.6044 - val_acc: 0.6667\n", 191 | "Epoch 73/100\n", 192 | "0s - loss: 0.5113 - acc: 0.7500 - val_loss: 0.6043 - val_acc: 0.6667\n", 193 | "Epoch 74/100\n", 194 | "0s - loss: 0.5113 - acc: 0.7500 - val_loss: 0.6041 - val_acc: 0.6667\n", 195 | "Epoch 75/100\n", 196 | "0s - loss: 0.5112 - acc: 0.7500 - val_loss: 0.6040 - val_acc: 0.6667\n", 197 | "Epoch 76/100\n", 198 | "0s - loss: 0.5113 - acc: 0.7500 - val_loss: 0.6039 - val_acc: 0.6667\n", 199 | "Epoch 77/100\n", 200 | "0s - loss: 0.5111 - acc: 0.7500 - val_loss: 0.6038 - val_acc: 0.6667\n", 201 | "Epoch 78/100\n", 202 | "0s - loss: 0.5112 - acc: 0.7500 - val_loss: 0.6037 - val_acc: 0.6667\n", 203 | "Epoch 79/100\n", 204 | "0s - loss: 0.5111 - acc: 0.7500 - val_loss: 0.6037 - val_acc: 0.6667\n", 205 | "Epoch 80/100\n", 206 | "0s - loss: 0.5110 - 
acc: 0.7500 - val_loss: 0.6036 - val_acc: 0.6667\n", 207 | "Epoch 81/100\n", 208 | "0s - loss: 0.5109 - acc: 0.7500 - val_loss: 0.6035 - val_acc: 0.6667\n", 209 | "Epoch 82/100\n", 210 | "0s - loss: 0.5109 - acc: 0.7500 - val_loss: 0.6034 - val_acc: 0.6667\n", 211 | "Epoch 83/100\n", 212 | "0s - loss: 0.5109 - acc: 0.7500 - val_loss: 0.6034 - val_acc: 0.6667\n", 213 | "Epoch 84/100\n", 214 | "0s - loss: 0.5108 - acc: 0.7500 - val_loss: 0.6033 - val_acc: 0.6667\n", 215 | "Epoch 85/100\n", 216 | "0s - loss: 0.5107 - acc: 0.7500 - val_loss: 0.6032 - val_acc: 0.6667\n", 217 | "Epoch 86/100\n", 218 | "0s - loss: 0.5107 - acc: 0.7500 - val_loss: 0.6032 - val_acc: 0.6667\n", 219 | "Epoch 87/100\n", 220 | "0s - loss: 0.5106 - acc: 0.7500 - val_loss: 0.6031 - val_acc: 0.6667\n", 221 | "Epoch 88/100\n", 222 | "0s - loss: 0.5106 - acc: 0.7500 - val_loss: 0.6030 - val_acc: 0.6667\n", 223 | "Epoch 89/100\n", 224 | "0s - loss: 0.5105 - acc: 0.7500 - val_loss: 0.6029 - val_acc: 0.6667\n", 225 | "Epoch 90/100\n", 226 | "0s - loss: 0.5105 - acc: 0.7500 - val_loss: 0.6028 - val_acc: 0.6667\n", 227 | "Epoch 91/100\n", 228 | "0s - loss: 0.5104 - acc: 0.7500 - val_loss: 0.6027 - val_acc: 0.6667\n", 229 | "Epoch 92/100\n", 230 | "0s - loss: 0.5104 - acc: 0.7500 - val_loss: 0.6026 - val_acc: 0.6667\n", 231 | "Epoch 93/100\n", 232 | "0s - loss: 0.5104 - acc: 0.7500 - val_loss: 0.6025 - val_acc: 0.6667\n", 233 | "Epoch 94/100\n", 234 | "0s - loss: 0.5103 - acc: 0.7500 - val_loss: 0.6025 - val_acc: 0.6667\n", 235 | "Epoch 95/100\n", 236 | "0s - loss: 0.5102 - acc: 0.7500 - val_loss: 0.6024 - val_acc: 0.6667\n", 237 | "Epoch 96/100\n", 238 | "0s - loss: 0.5102 - acc: 0.7500 - val_loss: 0.6023 - val_acc: 0.6667\n", 239 | "Epoch 97/100\n", 240 | "0s - loss: 0.5101 - acc: 0.7500 - val_loss: 0.6022 - val_acc: 0.6667\n", 241 | "Epoch 98/100\n", 242 | "0s - loss: 0.5102 - acc: 0.7500 - val_loss: 0.6021 - val_acc: 0.6667\n", 243 | "Epoch 99/100\n", 244 | "0s - loss: 0.5100 - acc: 0.7500 - val_loss: 0.6020 - val_acc: 0.6667\n", 245 | "Epoch 100/100\n", 246 | "0s - loss: 0.5100 - acc: 0.7500 - val_loss: 0.6019 - val_acc: 0.6667\n", 247 | "['loss', 'acc']\n", 248 | "[0.58956193923950195, 0.80000001192092896]\n" 249 | ] 250 | } 251 | ], 252 | "source": [ 253 | "import tensorflow as tf\n", 254 | "from keras.models import Sequential\n", 255 | "from keras.layers import Dense\n", 256 | "from sklearn import datasets, metrics, preprocessing\n", 257 | "from sklearn.utils import shuffle\n", 258 | "import numpy as np\n", 259 | "import pandas as pd\n", 260 | "\n", 261 | "#Load the dataset\n", 262 | "df = pd.read_csv(\"data/CHD.csv\", header=0)\n", 263 | "#Describe the input data\n", 264 | "print df.describe()\n", 265 | "\n", 266 | "#Normalize the input data\n", 267 | "a = preprocessing.StandardScaler()\n", 268 | "X =a.fit_transform(df['age'].reshape(-1, 1))\n", 269 | "\n", 270 | "#Shuffle the data \n", 271 | "x,y = shuffle(X, df['chd'])\n", 272 | "\n", 273 | "#Define the model as a logistic regression with a single sigmoid unit\n", 274 | "model = Sequential()\n", 275 | "model.add(Dense(1, activation='sigmoid', input_dim=1))\n", 276 | "model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['accuracy'])\n", 277 | "\n", 278 | "# Fit the model with the first 90 elements, splitting them 67%/33% into training/validation sets (validation_split=0.33)\n", 279 | "\n", 280 | "model.fit(x[:90], y[:90], nb_epoch=100, validation_split=0.33, shuffle=True,verbose=2 )\n", 281 | "\n", 282 | "#Evaluate the model with the last 10 elements\n", 283 | "scores = 
model.evaluate(x[90:], y[90:], verbose=2)\n", 284 | "print model.metrics_names\n", 285 | "print scores\n" 286 | ] 287 | } 288 | ], 289 | "metadata": { 290 | "kernelspec": { 291 | "display_name": "Python 2", 292 | "language": "python", 293 | "name": "python2" 294 | }, 295 | "language_info": { 296 | "codemirror_mode": { 297 | "name": "ipython", 298 | "version": 2 299 | }, 300 | "file_extension": ".py", 301 | "mimetype": "text/x-python", 302 | "name": "python", 303 | "nbconvert_exporter": "python", 304 | "pygments_lexer": "ipython2", 305 | "version": "2.7.11+" 306 | } 307 | }, 308 | "nbformat": 4, 309 | "nbformat_minor": 0 310 | } 311 | -------------------------------------------------------------------------------- /4/data/CHD.csv: -------------------------------------------------------------------------------- 1 | age,chd 2 | 20,0 3 | 23,0 4 | 24,0 5 | 25,0 6 | 25,1 7 | 26,0 8 | 26,0 9 | 28,0 10 | 28,0 11 | 29,0 12 | 30,0 13 | 30,0 14 | 30,0 15 | 30,0 16 | 30,0 17 | 30,1 18 | 32,0 19 | 32,0 20 | 33,0 21 | 33,0 22 | 34,0 23 | 34,0 24 | 34,1 25 | 34,0 26 | 34,0 27 | 35,0 28 | 35,0 29 | 36,0 30 | 36,1 31 | 36,0 32 | 37,0 33 | 37,1 34 | 37,0 35 | 38,0 36 | 38,0 37 | 39,0 38 | 39,1 39 | 40,0 40 | 40,1 41 | 41,0 42 | 41,0 43 | 42,0 44 | 42,0 45 | 42,0 46 | 42,1 47 | 43,0 48 | 43,0 49 | 43,1 50 | 44,0 51 | 44,0 52 | 44,1 53 | 44,1 54 | 45,0 55 | 45,1 56 | 46,0 57 | 46,1 58 | 47,0 59 | 47,0 60 | 47,1 61 | 48,0 62 | 48,1 63 | 48,1 64 | 49,0 65 | 49,0 66 | 49,1 67 | 50,0 68 | 50,1 69 | 51,0 70 | 52,0 71 | 52,1 72 | 53,1 73 | 53,1 74 | 54,1 75 | 55,0 76 | 55,1 77 | 55,1 78 | 56,1 79 | 56,1 80 | 56,1 81 | 57,0 82 | 57,0 83 | 57,1 84 | 57,1 85 | 57,1 86 | 57,1 87 | 58,0 88 | 58,1 89 | 58,1 90 | 59,1 91 | 59,1 92 | 60,0 93 | 60,1 94 | 61,1 95 | 62,1 96 | 62,1 97 | 63,1 98 | 64,0 99 | 64,1 100 | 65,1 101 | 69,1 102 | -------------------------------------------------------------------------------- /4/old/CH4_skflow.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "collapsed": true 8 | }, 9 | "outputs": [], 10 | "source": [ 11 | "import tensorflow.contrib.learn as skflow\n", 12 | "from sklearn import datasets, metrics, preprocessing\n", 13 | "import numpy as np\n", 14 | "import pandas as pd\n", 15 | "\n", 16 | "df = pd.read_csv(\"data/CHD.csv\", header=0)\n", 17 | "print df.describe()\n", 18 | "\n", 19 | "def my_model(X, y):\n", 20 | " return skflow.models.logistic_regression(X, y)\n", 21 | "\n", 22 | "a = preprocessing.StandardScaler()\n", 23 | "\n", 24 | "X =a.fit_transform(df['age'].astype(float))\n", 25 | "\n", 26 | "print a.get_params()\n", 27 | "classifier = skflow.TensorFlowEstimator(model_fn=my_model, n_classes=1)\n", 28 | "classifier.fit(X, df['chd'].astype(float), logdir='/tmp/logistic')\n", 29 | "print(classifier.get_tensor_value('logistic_regression/bias:0'))\n", 30 | "print(classifier.get_tensor_value('logistic_regression/weight:0'))\n", 31 | "score = metrics.accuracy_score(df['chd'].astype(float), classifier.predict(X))\n", 32 | "print(\"Accuracy: %f\" % score)" 33 | ] 34 | } 35 | ], 36 | "metadata": { 37 | "kernelspec": { 38 | "display_name": "Python 2", 39 | "language": "python", 40 | "name": "python2" 41 | }, 42 | "language_info": { 43 | "codemirror_mode": { 44 | "name": "ipython", 45 | "version": 2 46 | }, 47 | "file_extension": ".py", 48 | "mimetype": "text/x-python", 49 | "name": "python", 50 | "nbconvert_exporter": "python", 51 | "pygments_lexer": 
"ipython2", 52 | "version": "2.7.11+" 53 | } 54 | }, 55 | "nbformat": 4, 56 | "nbformat_minor": 0 57 | } 58 | -------------------------------------------------------------------------------- /4/old/binary.csv: -------------------------------------------------------------------------------- 1 | admit,gre,gpa,rank 2 | 0,380,3.61,3 3 | 1,660,3.67,3 4 | 1,800,4,1 5 | 1,640,3.19,4 6 | 0,520,2.93,4 7 | 1,760,3,2 8 | 1,560,2.98,1 9 | 0,400,3.08,2 10 | 1,540,3.39,3 11 | 0,700,3.92,2 12 | 0,800,4,4 13 | 0,440,3.22,1 14 | 1,760,4,1 15 | 0,700,3.08,2 16 | 1,700,4,1 17 | 0,480,3.44,3 18 | 0,780,3.87,4 19 | 0,360,2.56,3 20 | 0,800,3.75,2 21 | 1,540,3.81,1 22 | 0,500,3.17,3 23 | 1,660,3.63,2 24 | 0,600,2.82,4 25 | 0,680,3.19,4 26 | 1,760,3.35,2 27 | 1,800,3.66,1 28 | 1,620,3.61,1 29 | 1,520,3.74,4 30 | 1,780,3.22,2 31 | 0,520,3.29,1 32 | 0,540,3.78,4 33 | 0,760,3.35,3 34 | 0,600,3.4,3 35 | 1,800,4,3 36 | 0,360,3.14,1 37 | 0,400,3.05,2 38 | 0,580,3.25,1 39 | 0,520,2.9,3 40 | 1,500,3.13,2 41 | 1,520,2.68,3 42 | 0,560,2.42,2 43 | 1,580,3.32,2 44 | 1,600,3.15,2 45 | 0,500,3.31,3 46 | 0,700,2.94,2 47 | 1,460,3.45,3 48 | 1,580,3.46,2 49 | 0,500,2.97,4 50 | 0,440,2.48,4 51 | 0,400,3.35,3 52 | 0,640,3.86,3 53 | 0,440,3.13,4 54 | 0,740,3.37,4 55 | 1,680,3.27,2 56 | 0,660,3.34,3 57 | 1,740,4,3 58 | 0,560,3.19,3 59 | 0,380,2.94,3 60 | 0,400,3.65,2 61 | 0,600,2.82,4 62 | 1,620,3.18,2 63 | 0,560,3.32,4 64 | 0,640,3.67,3 65 | 1,680,3.85,3 66 | 0,580,4,3 67 | 0,600,3.59,2 68 | 0,740,3.62,4 69 | 0,620,3.3,1 70 | 0,580,3.69,1 71 | 0,800,3.73,1 72 | 0,640,4,3 73 | 0,300,2.92,4 74 | 0,480,3.39,4 75 | 0,580,4,2 76 | 0,720,3.45,4 77 | 0,720,4,3 78 | 0,560,3.36,3 79 | 1,800,4,3 80 | 0,540,3.12,1 81 | 1,620,4,1 82 | 0,700,2.9,4 83 | 0,620,3.07,2 84 | 0,500,2.71,2 85 | 0,380,2.91,4 86 | 1,500,3.6,3 87 | 0,520,2.98,2 88 | 0,600,3.32,2 89 | 0,600,3.48,2 90 | 0,700,3.28,1 91 | 1,660,4,2 92 | 0,700,3.83,2 93 | 1,720,3.64,1 94 | 0,800,3.9,2 95 | 0,580,2.93,2 96 | 1,660,3.44,2 97 | 0,660,3.33,2 98 | 0,640,3.52,4 99 | 0,480,3.57,2 100 | 0,700,2.88,2 101 | 0,400,3.31,3 102 | 0,340,3.15,3 103 | 0,580,3.57,3 104 | 0,380,3.33,4 105 | 0,540,3.94,3 106 | 1,660,3.95,2 107 | 1,740,2.97,2 108 | 1,700,3.56,1 109 | 0,480,3.13,2 110 | 0,400,2.93,3 111 | 0,480,3.45,2 112 | 0,680,3.08,4 113 | 0,420,3.41,4 114 | 0,360,3,3 115 | 0,600,3.22,1 116 | 0,720,3.84,3 117 | 0,620,3.99,3 118 | 1,440,3.45,2 119 | 0,700,3.72,2 120 | 1,800,3.7,1 121 | 0,340,2.92,3 122 | 1,520,3.74,2 123 | 1,480,2.67,2 124 | 0,520,2.85,3 125 | 0,500,2.98,3 126 | 0,720,3.88,3 127 | 0,540,3.38,4 128 | 1,600,3.54,1 129 | 0,740,3.74,4 130 | 0,540,3.19,2 131 | 0,460,3.15,4 132 | 1,620,3.17,2 133 | 0,640,2.79,2 134 | 0,580,3.4,2 135 | 0,500,3.08,3 136 | 0,560,2.95,2 137 | 0,500,3.57,3 138 | 0,560,3.33,4 139 | 0,700,4,3 140 | 0,620,3.4,2 141 | 1,600,3.58,1 142 | 0,640,3.93,2 143 | 1,700,3.52,4 144 | 0,620,3.94,4 145 | 0,580,3.4,3 146 | 0,580,3.4,4 147 | 0,380,3.43,3 148 | 0,480,3.4,2 149 | 0,560,2.71,3 150 | 1,480,2.91,1 151 | 0,740,3.31,1 152 | 1,800,3.74,1 153 | 0,400,3.38,2 154 | 1,640,3.94,2 155 | 0,580,3.46,3 156 | 0,620,3.69,3 157 | 1,580,2.86,4 158 | 0,560,2.52,2 159 | 1,480,3.58,1 160 | 0,660,3.49,2 161 | 0,700,3.82,3 162 | 0,600,3.13,2 163 | 0,640,3.5,2 164 | 1,700,3.56,2 165 | 0,520,2.73,2 166 | 0,580,3.3,2 167 | 0,700,4,1 168 | 0,440,3.24,4 169 | 0,720,3.77,3 170 | 0,500,4,3 171 | 0,600,3.62,3 172 | 0,400,3.51,3 173 | 0,540,2.81,3 174 | 0,680,3.48,3 175 | 1,800,3.43,2 176 | 0,500,3.53,4 177 | 1,620,3.37,2 178 | 0,520,2.62,2 179 | 1,620,3.23,3 180 | 0,620,3.33,3 181 | 
0,300,3.01,3 182 | 0,620,3.78,3 183 | 0,500,3.88,4 184 | 0,700,4,2 185 | 1,540,3.84,2 186 | 0,500,2.79,4 187 | 0,800,3.6,2 188 | 0,560,3.61,3 189 | 0,580,2.88,2 190 | 0,560,3.07,2 191 | 0,500,3.35,2 192 | 1,640,2.94,2 193 | 0,800,3.54,3 194 | 0,640,3.76,3 195 | 0,380,3.59,4 196 | 1,600,3.47,2 197 | 0,560,3.59,2 198 | 0,660,3.07,3 199 | 1,400,3.23,4 200 | 0,600,3.63,3 201 | 0,580,3.77,4 202 | 0,800,3.31,3 203 | 1,580,3.2,2 204 | 1,700,4,1 205 | 0,420,3.92,4 206 | 1,600,3.89,1 207 | 1,780,3.8,3 208 | 0,740,3.54,1 209 | 1,640,3.63,1 210 | 0,540,3.16,3 211 | 0,580,3.5,2 212 | 0,740,3.34,4 213 | 0,580,3.02,2 214 | 0,460,2.87,2 215 | 0,640,3.38,3 216 | 1,600,3.56,2 217 | 1,660,2.91,3 218 | 0,340,2.9,1 219 | 1,460,3.64,1 220 | 0,460,2.98,1 221 | 1,560,3.59,2 222 | 0,540,3.28,3 223 | 0,680,3.99,3 224 | 1,480,3.02,1 225 | 0,800,3.47,3 226 | 0,800,2.9,2 227 | 1,720,3.5,3 228 | 0,620,3.58,2 229 | 0,540,3.02,4 230 | 0,480,3.43,2 231 | 1,720,3.42,2 232 | 0,580,3.29,4 233 | 0,600,3.28,3 234 | 0,380,3.38,2 235 | 0,420,2.67,3 236 | 1,800,3.53,1 237 | 0,620,3.05,2 238 | 1,660,3.49,2 239 | 0,480,4,2 240 | 0,500,2.86,4 241 | 0,700,3.45,3 242 | 0,440,2.76,2 243 | 1,520,3.81,1 244 | 1,680,2.96,3 245 | 0,620,3.22,2 246 | 0,540,3.04,1 247 | 0,800,3.91,3 248 | 0,680,3.34,2 249 | 0,440,3.17,2 250 | 0,680,3.64,3 251 | 0,640,3.73,3 252 | 0,660,3.31,4 253 | 0,620,3.21,4 254 | 1,520,4,2 255 | 1,540,3.55,4 256 | 1,740,3.52,4 257 | 0,640,3.35,3 258 | 1,520,3.3,2 259 | 1,620,3.95,3 260 | 0,520,3.51,2 261 | 0,640,3.81,2 262 | 0,680,3.11,2 263 | 0,440,3.15,2 264 | 1,520,3.19,3 265 | 1,620,3.95,3 266 | 1,520,3.9,3 267 | 0,380,3.34,3 268 | 0,560,3.24,4 269 | 1,600,3.64,3 270 | 1,680,3.46,2 271 | 0,500,2.81,3 272 | 1,640,3.95,2 273 | 0,540,3.33,3 274 | 1,680,3.67,2 275 | 0,660,3.32,1 276 | 0,520,3.12,2 277 | 1,600,2.98,2 278 | 0,460,3.77,3 279 | 1,580,3.58,1 280 | 1,680,3,4 281 | 1,660,3.14,2 282 | 0,660,3.94,2 283 | 0,360,3.27,3 284 | 0,660,3.45,4 285 | 0,520,3.1,4 286 | 1,440,3.39,2 287 | 0,600,3.31,4 288 | 1,800,3.22,1 289 | 1,660,3.7,4 290 | 0,800,3.15,4 291 | 0,420,2.26,4 292 | 1,620,3.45,2 293 | 0,800,2.78,2 294 | 0,680,3.7,2 295 | 0,800,3.97,1 296 | 0,480,2.55,1 297 | 0,520,3.25,3 298 | 0,560,3.16,1 299 | 0,460,3.07,2 300 | 0,540,3.5,2 301 | 0,720,3.4,3 302 | 0,640,3.3,2 303 | 1,660,3.6,3 304 | 1,400,3.15,2 305 | 1,680,3.98,2 306 | 0,220,2.83,3 307 | 0,580,3.46,4 308 | 1,540,3.17,1 309 | 0,580,3.51,2 310 | 0,540,3.13,2 311 | 0,440,2.98,3 312 | 0,560,4,3 313 | 0,660,3.67,2 314 | 0,660,3.77,3 315 | 1,520,3.65,4 316 | 0,540,3.46,4 317 | 1,300,2.84,2 318 | 1,340,3,2 319 | 1,780,3.63,4 320 | 1,480,3.71,4 321 | 0,540,3.28,1 322 | 0,460,3.14,3 323 | 0,460,3.58,2 324 | 0,500,3.01,4 325 | 0,420,2.69,2 326 | 0,520,2.7,3 327 | 0,680,3.9,1 328 | 0,680,3.31,2 329 | 1,560,3.48,2 330 | 0,580,3.34,2 331 | 0,500,2.93,4 332 | 0,740,4,3 333 | 0,660,3.59,3 334 | 0,420,2.96,1 335 | 0,560,3.43,3 336 | 1,460,3.64,3 337 | 1,620,3.71,1 338 | 0,520,3.15,3 339 | 0,620,3.09,4 340 | 0,540,3.2,1 341 | 1,660,3.47,3 342 | 0,500,3.23,4 343 | 1,560,2.65,3 344 | 0,500,3.95,4 345 | 0,580,3.06,2 346 | 0,520,3.35,3 347 | 0,500,3.03,3 348 | 0,600,3.35,2 349 | 0,580,3.8,2 350 | 0,400,3.36,2 351 | 0,620,2.85,2 352 | 1,780,4,2 353 | 0,620,3.43,3 354 | 1,580,3.12,3 355 | 0,700,3.52,2 356 | 1,540,3.78,2 357 | 1,760,2.81,1 358 | 0,700,3.27,2 359 | 0,720,3.31,1 360 | 1,560,3.69,3 361 | 0,720,3.94,3 362 | 1,520,4,1 363 | 1,540,3.49,1 364 | 0,680,3.14,2 365 | 0,460,3.44,2 366 | 1,560,3.36,1 367 | 0,480,2.78,3 368 | 0,460,2.93,3 369 | 0,620,3.63,3 370 | 0,580,4,1 371 
| 0,800,3.89,2 372 | 1,540,3.77,2 373 | 1,680,3.76,3 374 | 1,680,2.42,1 375 | 1,620,3.37,1 376 | 0,560,3.78,2 377 | 0,560,3.49,4 378 | 0,620,3.63,2 379 | 1,800,4,2 380 | 0,640,3.12,3 381 | 0,540,2.7,2 382 | 0,700,3.65,2 383 | 1,540,3.49,2 384 | 0,540,3.51,2 385 | 0,660,4,1 386 | 1,480,2.62,2 387 | 0,420,3.02,1 388 | 1,740,3.86,2 389 | 0,580,3.36,2 390 | 0,640,3.17,2 391 | 0,640,3.51,2 392 | 1,800,3.05,2 393 | 1,660,3.88,2 394 | 1,600,3.38,3 395 | 1,620,3.75,2 396 | 1,460,3.99,3 397 | 0,620,4,2 398 | 0,560,3.04,3 399 | 0,460,2.63,2 400 | 0,700,3.65,2 401 | 0,600,3.89,3 402 | -------------------------------------------------------------------------------- /4/old/data/CHD.csv: -------------------------------------------------------------------------------- 1 | age,chd 2 | 20,0 3 | 23,0 4 | 24,0 5 | 25,0 6 | 25,1 7 | 26,0 8 | 26,0 9 | 28,0 10 | 28,0 11 | 29,0 12 | 30,0 13 | 30,0 14 | 30,0 15 | 30,0 16 | 30,0 17 | 30,1 18 | 32,0 19 | 32,0 20 | 33,0 21 | 33,0 22 | 34,0 23 | 34,0 24 | 34,1 25 | 34,0 26 | 34,0 27 | 35,0 28 | 35,0 29 | 36,0 30 | 36,1 31 | 36,0 32 | 37,0 33 | 37,1 34 | 37,0 35 | 38,0 36 | 38,0 37 | 39,0 38 | 39,1 39 | 40,0 40 | 40,1 41 | 41,0 42 | 41,0 43 | 42,0 44 | 42,0 45 | 42,0 46 | 42,1 47 | 43,0 48 | 43,0 49 | 43,1 50 | 44,0 51 | 44,0 52 | 44,1 53 | 44,1 54 | 45,0 55 | 45,1 56 | 46,0 57 | 46,1 58 | 47,0 59 | 47,0 60 | 47,1 61 | 48,0 62 | 48,1 63 | 48,1 64 | 49,0 65 | 49,0 66 | 49,1 67 | 50,0 68 | 50,1 69 | 51,0 70 | 52,0 71 | 52,1 72 | 53,1 73 | 53,1 74 | 54,1 75 | 55,0 76 | 55,1 77 | 55,1 78 | 56,1 79 | 56,1 80 | 56,1 81 | 57,0 82 | 57,0 83 | 57,1 84 | 57,1 85 | 57,1 86 | 57,1 87 | 58,0 88 | 58,1 89 | 58,1 90 | 59,1 91 | 59,1 92 | 60,0 93 | 60,1 94 | 61,1 95 | 62,1 96 | 62,1 97 | 63,1 98 | 64,0 99 | 64,1 100 | 65,1 101 | 69,1 102 | -------------------------------------------------------------------------------- /5/CH5_linear_regression_nn.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 70, 6 | "metadata": { 7 | "collapsed": false 8 | }, 9 | "outputs": [ 10 | { 11 | "data": { 12 | "text/plain": [ 13 | "" 14 | ] 15 | }, 16 | "execution_count": 70, 17 | "metadata": {}, 18 | "output_type": "execute_result" 19 | }, 20 | { 21 | "data": { 22 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAMEAAACQCAYAAACxkA/OAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAHTNJREFUeJztnXuYVNWV6H+L7qrqgn6BtIg8ulFUUK8CMyYm5o6S6MTE\nRByN42uMD3xLMDd6g5oxYHwkaDQfeqM8QgSNPBIzGk3MJeamyQxR00RaMBdQvNqNIKHLRKII/aB7\n3T/WPlR124/qrqqu6q79+77z9alzdp29T/VZZ6+19tpri6ri8eQzQ7LdAI8n23gh8OQ9Xgg8eY8X\nAk/e44XAk/d4IfDkPV4I+oiI3CYii9NdNolrtYnIEUmWnSsiT6Sj3sFMYbYbkAuIyOXAN4Ajgb8D\nzwC3qerfu/qOqn432ev3pmwyl8tEeRF5DHhHVb/d+yYNbPK+JxCRm4HvAjcDpcDJQCXwgoh0+pIQ\nkYL+a+HHq89i3YMTVc3bDSgBPgTO63B8GNAAXO4+zwV+BjwB7AGudMeeSPjOV4E6IAb8O/A28NmE\n7z/h9iuBNle+3tVze8J1TgJeBN4HdgIPA4UJ59uAI7q4nypgLdabrXHffTzh/E+BXe7aa4HJ7vjV\nQDPQCHwA/MIdnwO86Y79GTgn2/+zTGz53hN8GogATyceVNWPgOeBMxIOnw38VFXLgRVBUQARORb4\nIXARMBooAw7vUFdHteQU4CjgdODbInKMO94KfB0YAXwK+CxwQ5L3swJYD4wE7gYu63D+eUzlOxTY\nENyHqi4BngTuU9VSVZ3hyr8JnKKqpcCdwE9EZFSSbRkw5LsQjATeU9W2Ts7tcucDXlLV5wBUtbFD\n2fOAZ1X1JVU9APSkVyswT1WbVXUTsBE40V17g6rWqLEdWAyc2tONiMg44B+Bb6tqi6r+F/Bcu0pV\nl6nqPlVtAb4DnCgiJV02UvXnqrrb7f8M2AZ8oqe2DDTyXQjeA0aKSGe/w2h3PuCdbq5zeOJ5Vd0P\n/LWHuncn7O8DigFE5CgReU5EdonIHuAe2gtjd21439UdUB/siMgQEfmeiLzprvs2JoxdXltEvioi\ntSLyvoi8DxyXZFsGFPkuBC8BTcC5iQdFpBj4AvDbhMPdeVl2AWMTvh8FDuljmx4FtgBHOtXrWyRn\nDO8Chru6A8Yn7F8CfBmzU8ox+0ESrt3u/kRkPNYL3aCqw1V1OPB/k2zLgCKvhUBVP8DUgodF5PMi\nUigiVcBqYDvwkyQv9RTwZRE5WURCwLweynf3IJUAH6jqPhGZBFyfTAOc6vQn4E4RCYnIZ7CHPqAY\nE/j3RWQY5hFLfPB3A4njD8MwI/w914tcARyfTFsGGnktBACqej9wO/B9zKvyEqZGnO5052SusRn4\nGiY872LelAbsoev0K918vgW4REQ+ABYBq3r4biIXYy7evwJ3AMsTzj2OCfZOzNPzYofvLgWOE5G/\nich/qOoW4EHgZeAvmCq0rpu6ByziXGGpX8j06j8BO1T17LRcdIDi3rR7gImqWt9TeU92SWdPcBOw\nOY3XG1CIyJdEJOoE4AFgkxeAgUFahEBExgJfBH6UjusNUGZgqtAOzBd/YXab40mWtKhDIvIzzJVX\nBtyc7+qQZ2CRcgCdiJwF7FbVV0XkNLrwfIiIn9HvySiq2if3bTrUoVOAs0XkLWAlMF1EHu+sYKZj\nQObOndsvsSb9Uc9gqaO/6kmFlIVAVW9X1fGqegSmB/9OVb+a6nU9nv4i78cJPJ60TqpR1d8Dv0/n\nNXvDaaedNmjqGSx19Gc9fSVtg2U9ViSi/VWXJ/8QETSLhrHHM6DxQuDJe/xEe0/aiMVi1NXVUVVV\nBUBtbS0AU6dOpaKiIost6x4vBJ60sHLlambOvIFwuIp9+7bR1tZGa6sChxMOx1i2bBEXXXRBtpvZ\nKd4w9qRMLBajsnIS+/dXYxPyjsI07bXu8wsUFd3I9u1vZKxH8IaxJ6vU1dURDlcBJwC12Dz+CdgE\nuUnAAzQ2NrNo0ZLsNbIbfE/gSZl4TzAHm7DWBBQAIaw3OAHYRDQ6nfr6rRnpDbLaE4hIRET+6CZk\nvyYic1O9pmfgcfXVl2KzSm/DHqtGbJr1Ca7ECRQWjqeuri4r7euWNAUvDXV/C7DpeJ/opIx6BhcN\nDQ161133aChUohBSqFIoU3hS4XiF4QobFVRho0Yi5drQ0JCRtrjnq0/Pb1q8Q6q6z+1GMI+T13sG\nOStXrubKK6+jsbEJCLvtXSzBxRnAddg8/U9hxvG7LFiwICddpemaWTZERGqxCdkvqOr6dFzXk5vE\nYjFmzryBxsYfYg/9SOy99xVMEB4GDmBz+euA7xAKFXLuuedkqcXdk66eoA2YKiKlwDMicqxaBoZ2\nzJs37+D+aaedlvOBVZ7Oqa2tZciQccBULN1RK5b7qwp7+O/DvEKBPXAxRUX3U1dXl7aeYO3ataxd\nuzYt10q7d0hE7gA+UtUHOxzXdNfl6X/iatB+YCiWJmkn5gmKYJ6h0Vj2mmoCz1AkcirvvDNIxwlE\nZKSIlLn9KKYQbk31up7cI64GPY098LdiKY7CwFVABfAQphXPAaZjKVZPZsGC+3LSHoD02ASjgWoR\neRX4I7BGVZ9Pw3U9OUQsFuP555+nsLAS0/UPwcYEAMZgOYj/BnwSWADcC5QSidSxcOECrr326iy0\nOjn8YJmnRxYtWsLs2bdQUDCS/fvfxTzhB7CHXzC74CVshPh6YAThcIw77pjDtdde3S89QCrqkBcC\nT7csWrSE6677Gqb/F2LqTxHmAn0EU4sOYCpRFfAWBQXNvPbaK0yePLnf2uljhzwZYcuWLcya9T+w\nN/8V2KI+UcwTdCFwGOYZOoCNEDdQWNjKE0881q8CkCo+lNrTKStXrubSS6+ktbUMe/MvwgLjhmG+\n/4+wVZ+ew1Z62khBwTw2bfrTgBIA8OpQ3pI4AaaiouLg5+LiYt555x1mzLiAxsZWoAXrCcZiybpD\n2EpSfwcCN2kVUEco1MLOnW9lxQuUijrke4I8JJgAIzKa1tZ3uOSSC1m58ina2oppaooRDo+mubkY\n0/fbMMO3zu23YlnnHwQWAi+4c1VEo59P64BYf+F7gjwjFosxduxRNDffDszHPNxvYcsifB+Lf9yO\nhUAIJgiKPfxh4MfAL7B1Sdowr1DmQ6V7wvcEnqSpra2luXkEJgDV2AjvFZhv/2jsIX8Ee+APuPOH\nYAvdDAPOcdt9FBaeQEHBqUQiE2hpqWfp0kcGXC8AXgjyjj179mBBbkFsTwxbc3Actpbf65gh3IQF\nxz2LLX6zF3MmbnLf20Uo1Mwrr7zI3r17D9oWA5K+xmAHG2Yx/Q5b1O01YHYX5dIRNu5JgRUrVmlR\nUblCWCHqYv0bFGa6z2crjFE4UmGymxuwUWGVwjCFiMJQhSM1HC7TFStWZfuWDkIK8wlStglE5DDg\nMLXU7MXAK8AMVd3aoZymWpen77SfAnkX5tYU7O1+CCIx
AOJqdRSLDZqPjQu8yezZ13LKKZ+mvLw8\n59KoZNUmUNW/YBFTqOpeEdmCjaf7ILos0tEFumjREvbvL8Me6j9grs5/xAzfQ1HdgwlGKzY/YC+2\niP2hwDZuvfVmvvvde7JyLxmnr11IZxuBwxiKOzmXiV7Q0wkLFy7WSKRcS0qmajQ6QhcuXOzUoGEK\nJ7rpjmucarNRYbM7N1zhDqcaVbsyj2pRUeamRaYLsj29Eg4ugP0UcJOq7u2sjJ9Uk3ks1ucm4GWa\nmsx1eeONn6K1dRTm338DywBxL+Ye3YKFQZcDo7BlnccB/wKMJBJ5jx//eGFOqT6Qg5NqRKQQ+CXw\na1Vd0EUZTUddnq6JxWKMGzeRpqYJwKvu6BbgHzDNdzy2sNByzB4oxNSi24C5mFdoLeb9WUso9GU2\nbqwZEGEQuRBA92Ngc1cC4OkfLAlWJeby3AQsAU7CHHhnA/8P+BnwLUwgbsMe/Huxga9mbGL8ROCL\nPPzwgwNCAFKmr3pUsGGvllbs1VMLbADO7KRcptRBj6OhoUGj0REK852OH+j2JW4bpTDR2QBRhafd\n3/kKxW7/MC0oiOrChYuzfTu9gmy6SJPFq0P9Q/s5wMcAvwUqsbDn912pQP2xhLnx+cCWNzQcvp4d\nO97MOTugO3JBHfLkEK2tLdiS0tuwALeJ2IN+GxYVOg+bCfsUNno8DrMDKoCLCYcn5GamuAzhe4JB\nRCwWY8yYI2lpAYv7mYfNA27G3vzzMeEYArwJrMcC5WJY4Fz/ZIbIBL4nyANisRjr168nFot1eby6\nupqWllYsxHki8E3MJXo1MJdwuIRQqIFQKIYZzlXYbLFbgFMxIcjtzBAZoa/GRG83vGHcZ1asWKXR\n6AgtK5umRUXletdd92hDQ8PBWKBhw47RwsIgtudIFw80wg2EXdMu5qegoFhnzZqt0egILS2dqqFQ\nsYbDZVpcfLxGIqUDziAOwBvGg5f2C2AE2RwOoajoPZqbm2lri2Au0G3EY///CzOIb8fGA6Ikpkgv\nKjqNDRv+cDD6E2gXYjEQ8fMJBjHxlIejsWRWa4ETaGz8DOaRXouFad0MDMfGAE7GhKHNfW8kiSnS\nRcawd+9eTjrppIP1DNSHPx14myBHicVi3H33vcyYcQEffRR4eSqxh/lBzKg9HOsdLgfew4zdiLvC\nUswuaMDmCWxyxzehuvNgD+DB2wS5SDzuPwhwC+L5g8GvqMIENwBWrFCqUKRwmCt3lLMLyhQudvMH\n4jZBLs0DSBekYBOk6wFfCuwGNnVTJpO/waAhPur7pMI0F/EZGLrByO4oJwgzFQ5RGJ0gJMsShKXY\nRYZOUojqkCER3bx5c7ZvMSOkIgTpUoceAz6fpmvlNfFF8M7AotLXAs9jxm8lZsZ9gOn7j2Nx/3uw\n/KB3A1/D5gefidkDr7ty9RQXH8vevZ0G+OY1aRECVV1HfEzekwJVVVU0N9dhs1X/CfgiNrnlDcwz\n9HvM7x942kZiuv8W4B7M99/IJZecTzjcgKVLOQnYRUtLvbcFOqOvXUjHDXtNeXUoBTZv3qzLli3T\niy++VNvPA17ldP7EMYBqhdVODSrTjuuDBZNpgvGAaHTEoLQFAsiFSTXJ4CfVtCcx69ucObfz3HNr\nsOmMuzEvzxGYSvPfgZXAxZiXaDxwmruKApe6Y3E3aChUybRpU6iv3zrgxwA6I52TanxPkCWCUeBo\n9AiFwoSQ5lKFcQr/zb3xL3Y9wD3O+1OU0EPYWz8SKdVwuOxjPUGuT4lMJ+RITyBu83RB8OZvbm7m\niiuuo6npaeBL2M82GgtwWw5chBm807FIzwJsHbA2V3aeO2cjxXfffTdjxoxh5szphEKVAzoRVlbo\nq/QkbsAKLCa3Ccvhd0UnZTL7Kshx2r/5IwpHq01kjygc7v4el/DGH5ZwLKo2GeZyNwYQuE1rFCbo\nQw89pKrmXq2pqcmrHiCAbI8TJFVRHgtB3PdfnWDUjlB4VGG8e8jPcqpOqTNyH1U41pULMkDc9zFV\nCKK6evXqbN9i1klFCHzsUD9QW1uLyFgsl2cVZvg2A9/AVJyJWLz/i1ja83uwlWAS5wGMwNYFa8Vi\ng8YCOxgyRJk+fXp/3s6gw8cOZZiVK1dz1lnnsm/fNmxhi7cwf/8fsCS4Q9yxnVi+/78BkzEbAOzB\n3w8IhYUFnH/+uUQiIYqKmohEQvzkJ8u87p8iPpQ6g8RiMcaPP5rGRgFmYSu9KzbA9RKWFLcaC3u+\nAwuIu5t4L/Ar4DwS1wOORqfzyivrBn4S3DTjQ6kzRMdUhsl+p7a2lu3bt/P007+gsbEYy+3zQ8wD\n9Da2+N13MBUnGOkdA+zAVoMJVoYM1Kf2/v+OYdCeFOmrMdHbjQFmGCfO5kocbU30wHT0xqxYsUpD\noRKFkDNyw24/MGZrFE5QWOzOByO9850BPDYh+C3RgM5f/3+y4L1D6SXuzek8DKGsbJqGQiUaDpcd\nFJJ4vs8S9yA/6h70wM2pCg+4c08qTHHu0AkJD3ogJKoWKjFCLUI0qtHo8YM+9CEVvBCkmZqaGi0r\nC8KYbbM5uOUaz+kf7OvBUduhQ492Ls+jFW5SqFA4ooObc75abH/UHYtoPElu4txgVajWSKRU161b\nl7f+/2TxQpBmOusJIpFSLSmZ6j7fozZ41V5ICguL1WL4w+7hjnTysC9OOJb4N6hrvkJUS0qm+Dd/\nL8i6EGDB61uxeN85XZTJ6I+QbgKbIIjANHVnuNPTyzUetdmg8KSGw8UqUqQWBxRxakwQ6xNxak+g\nLp3oVJ9AqBa7c0dpJFKuCxcu9m/+XpJVISCeyakSc4O8CkzqpFyGf4b0E4Q2r1u3Ts8//8KEN/xR\nTmcvcarNYRoPfS5ROMY9/IHQLHE9xKwEG6C6U9VnsM78yjTZFoKTsZTswedbO+sNBpoQBD1BJDI+\nQa0JFrQodQ/xcG0fAjFWLfqzVM0YDozb452QlDmhCbxBFjJRVHScV31SJBUhSMeI8RgsF3jADnds\nwBKLxZg58wb277+RpqYY8HXi+TrXYfk8/xnz87/g/p6HTXPcAdyEjQJPxrTEf8Xi/e/HokDvBEop\nKPgr999/N//5n49RX7+Viy66oD9v0+Pwk2oSCFIZLl/+hFvfawE2yeVR7MH/NhbOUISN/L6DpT+J\nYtOsAfYBD2Dr/p5MUdERwE5aW1tpaZmDxQdZ9udQ6EYuu+xSP+rbB3JqUg2mDv3vhM8DSh1qaGjQ\nNWvW6Jw5t6lIVOOTVkrUQpwDt2ZE46lLnlSb6NLR4xMYyo9qQcHQdq7Nu+7q3KO0bNkybwCnAbJs\nExQQN4zDmGE8uZNyGf4ZjN7E1MdHeKMJen+Q6uQBd2yi0+WfVPP5T3QPenGCkfvoxx5wOFLXrFnT\nrl3t3a6BK3Twz//tD7IqBFY/Z2K5PbYBt3ZRJqM/gmpyoQ4Bmzdv1nC41Lkm71M4VONJq0Y4T854\n9/YPHvxyZ/R
uVLjKCc3FCkM0nihL3d+h7YQgsX3Fxcdrx3kBPhwiNbIuBElVlGEh6CrU4f77Hzi4\nnGmQ0dmWOC1VGKnmzy/u4AG6yqlFJU44OssEN1ThG+57RRpfAnWqwnAtLBzW6UPd0NCgy5YtSxh4\ns620dKrW1NRk9DcazHghUAt1aP9gNWgoNCrhwV7lHtIJGl+vKwhuu0NtkGuUe9tHtH2gW+IDXqYQ\n0kjkOLUBryBMYpX77tEKUb3uuuu7bGtXAut7gr6TihAMmkk1Gza8yocfbiW+auORtLTsAY7GxvCu\nw7K5rXbH1mLjfOOBh7CljD7AXJ2HY67OT7pzCjQCDQwZ0kwoVERT09vYJJk9mJdoMjZgPhdQZs/+\nWpdtraioYOnSR4hGp1NaOo1odLqfGJ9N+io9vd3IYE/Q+aqNT6oNUgWG75EHewgrE3a9QrHGjdoH\nND4qPEyDJLZQpLNn36RLlixxQXRBTtChrvcIUqZMVIjqrFmzk263D49ID+T7HOO6ujoKCyuBK7AJ\nKhOxt/Qb2Ns+gk1b3AQ8g01amQD8BbgAS2y1CTgec3aFsIUuzJ9fUHANxx57HLNm3UxT00hszu8f\niPv7r+WRR37A/v37Of3005Ne+7eiosK//XOBvkpPbzcy2BMsXLi4w9u/XOMhCkepuTyDVIYRtdie\nwP9fnmAbHK4wRuPZoG2LRie7HqBazTt0YrvzJSVTvFGbZchnmyAWi/H1r38Te+Nfg63a/m/YYPgo\nbGXGYPEKxXqJXdhI8Hxsfd9CLPPzY5iO/zaJi1q0te0kHJ6ApT78PtbDxM8fOLDdJ7odyPRVenq7\nkaGeoKamRocNO0bjaQuDVIZBFOcdTs8vdDZAmcZj+KvV8vmfkPBmX6WJC92Fw2UHZ5TFvTn2/eLi\nE/1AV45APtsEVVVVtLbuxmJ7gmVL3wV+idkH12Bv+11uKwB+gOXvOYt4AtxNWIDcZIqKoixfvojy\n8nKmTp1KRUUFpaWl7dIc/uAHC5g2bYrP+DAY6Kv0mPDxFeDP2BM1rYeyGXsLmE2Q6PPvGPpcpBby\nEFao0ngcv02IKSiIJpXC3HtzcheytYSriByDpVBbBNyiqhu6Kaup1NUd69ev53Ofu5YPP/w3LH5v\nFObh+TsW0vQ6Zg/cgkWBTsJCnIySkqk89dR8hg8f7t/sA5Ss5R1S1dddA7KajbqqqooDB+qBodgt\nfYgZyv+B5e75CJEzUf0+UELcsLWEVgcObD+o9njyjwFvE0B8BPbyy6+iuflwzBa4CtP5R1NY2MDj\njy9nypQTqKmp4e236/ne904lHK7iwIHtfrQ2z+lRHRKRFzD94uAhTLf4lqo+58pUAzf3pA7NnTv3\n4OdMTKrZsmULJ554Mi0twUDXzykouJnXXvvTxwaw+pJdzpM7dJxUc+edd/ZZHUpLLtJkhSBTNkEi\nK1euZubMGxgyZCxtbTtYuvQRTj/9s/6BH+SkYhOkc7AsJ1apueiiC6iv30p19Y+or98KQGXlJM44\n4zoqKyexcuXqLLfQk2uk6h06B0u1PBIban1VVb/QRdl+6QkSicViVFZOYv/+ahKzOtfXb/U9wiAj\naz2Bqj6jquNUNaqqo7sSgGwRXxi7fVbnurq67DXKk3MM+Nih7ogvjB2P8/ELWns6MqiFwE9e8SRD\nXqxU492hg59UbIK8EALP4CdXXKQez4DEC4En7/FC4Ml7vBB48p6UhEBE7hORLSLyqoj8XERK09Ww\nvpC2LMU5UM9gqaM/6+krqfYEvwGOU9UpWB7S21JvUt8ZTP/UwVJHf9bTV1INm/itqra5jy8DY1Nv\nksfTv6TTJrgS+HUar+fx9AvpmlTzLWyi/XndXMePlHkyStZGjEXkcuBq4LOq2pTSxTyeLJDSHGMR\nORP4n8A/eQHwDFRSnVSzDVui6a/u0MuqekM6Gubx9Bf9FkDn8eQqGRsxFpGviMifRaRVRKZ1U65O\nRDaKSK2I1GSojjNFZKuIvCEic3pTh/v+cBH5jYi8LiJrRKSsi3KtIrLB3cszSV6727aJSFhEVonI\nNhF5SUTG96H9PdVxmYg0uLZvEJEr+1DHUhHZLSKbuinzkLuPV0VkSrrrEJFTRWRPwn38e1IX7mvq\nup424BjgKOB3dJOiEVv1enim6sAEPVhdM4SlnpvUy3rmA990+3OA73VR7oNeXrfHtgHXA4+4/QuA\nVRmo4zLgoRT/358BpgCbujj/BeBXbv+TmOqc7jpOBZ7t7XUz1hOo6uuquo2es1AIfeyRkqzjE8A2\nVa1X1RZgFTCjl1XNAJa7/eXAOV2U662LLpm2Jdb9FPC5DNQBKWYLUdV1wPvdFJkBPO7K/hEoE5FR\n3ZTvSx3Qh/vIhQA6BdaIyHoRuToD1x+DLSoWsMMd6w2HqupuAFX9C5bKujMiIlIjIi+KSDKClkzb\nDpZR1VZgj4iM6EXbk73/c52a8lMRycTIf8d27OyiHalyslNHfyUixybzhVRdpD0OpCXBKaq6S0Qq\ngBdEZIuT+HTW0SPd1NOZXtmVN6HS3csE4HcisklV305XGxPalW6eBVaoaouIXIP1PL3tcXKBV7D/\nwT4R+QK2NtfRPX0p1YS8Z6TyfXeNXe5vTESexrrvdQnnU61jJ7ZEZcBYd6xjO7qsxxljo1R1t4gc\nBjR0Vi7hXt4WkbXAVGzZm1TatgMYB7wrIgVAqar+rZtr9roOVU1UMX6Epe5ONzux++iyHamiqnsT\n9n8tIo+IyIiefq/+Uoc6fXuJyFARKXb7w4B/xtY7SFsdwHpgoohUikgYuBB78/WGZ4HL3f5lwC8+\nVrlIubs+IjIS+DSwuYfrJtO251ydAOdjToDe0GMdTrADZiTR7q4Quv4/PAt81dV3MrAnUDHTVUei\njSEin8CGAHp+YaTiEejBkj8H0wH3Y0vE/NodHw380u1PwLwVtcBrwK3prsN9PhNbpGBbb+tw3x8B\n/NZd4zdAuTv+D8Bit/8pLMFRLbARuDzJa3+sbcCdwJfcfgT4qTv/MlDVh/b3VMe92MunFvg/wNF9\nqGMFtkRQE7AdW0r0WuCahDL/C/NUbaSHRV36UgdwY8J9vAh8Mpnr+sEyT96TC94hjyereCHw5D1e\nCDx5jxcCT97jhcCT93gh8OQ9Xgg8ec//B65Jgc0mFNxhAAAAAElFTkSuQmCC\n", 23 | "text/plain": [ 24 | "" 25 | ] 26 | }, 27 | "metadata": {}, 28 | "output_type": "display_data" 29 | } 30 | ], 31 | "source": [ 32 | "import tensorflow as tf\n", 33 | "import numpy as np\n", 34 | "from sklearn.utils import shuffle\n", 35 | "%matplotlib inline\n", 36 | "import matplotlib.pyplot as 
plt \n", 37 | "\n", 38 | "\n", 39 | "\n", 40 | "trainsamples = 200\n", 41 | "testsamples = 60\n", 42 | "\n", 43 | "#Here we define the model: a single input, one hidden layer with sigmoid activation\n", 44 | "def model(X, hidden_weights1, hidden_bias1, ow):\n", 45 | " hidden_layer = tf.nn.sigmoid(tf.matmul(X, hidden_weights1) + hidden_bias1) # apply the bias passed as a parameter\n", 46 | " return tf.matmul(hidden_layer, ow) \n", 47 | "\n", 48 | "dsX = np.linspace(-1, 1, trainsamples + testsamples).transpose()\n", 49 | "dsY = 0.4* pow(dsX,2) +2 * dsX + np.random.randn(*dsX.shape) * 0.22 + 0.8 \n", 50 | "\n", 51 | "plt.figure() # Create a new figure\n", 52 | "plt.title('Original data')\n", 53 | "plt.scatter(dsX,dsY) #Plot a scatter draw of the datapoints\n" 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": 68, 59 | "metadata": { 60 | "collapsed": false 61 | }, 62 | "outputs": [], 63 | "source": [ 64 | "X = tf.placeholder(\"float\")\n", 65 | "Y = tf.placeholder(\"float\")\n", 66 | "\n", 67 | "hw1 = tf.Variable(tf.random_normal([1, 10], stddev=0.01)) # Create first hidden layer\n", 68 | "ow = tf.Variable(tf.random_normal([10, 1], stddev=0.01)) # Create output connection\n", 69 | "b = tf.Variable(tf.random_normal([10], stddev=0.01)) # Create bias\n", 70 | "\n", 71 | "model_y = model(X, hw1, b, ow) # Build the model graph\n", 72 | "\n", 73 | "cost = tf.pow(model_y-Y, 2)/(2) # Cost function\n", 74 | "\n", 75 | "train_op = tf.train.AdamOptimizer(0.0001).minimize(cost) # Construct an optimizer\n" 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": 69, 81 | "metadata": { 82 | "collapsed": false, 83 | "scrolled": true 84 | }, 85 | "outputs": [ 86 | { 87 | "name": "stdout", 88 | "output_type": "stream", 89 | "text": [ 90 | "Average cost for epoch 1:[[ 0.00753353]]\n", 91 | "Average cost for epoch 2:[[ 0.00381996]]\n", 92 | "Average cost for epoch 3:[[ 0.00134867]]\n", 93 | "Average cost for epoch 4:[[ 0.01020064]]\n", 94 | "Average cost for epoch 5:[[ 0.00240157]]\n", 95 | "Average cost for epoch 6:[[ 0.01248318]]\n", 96 | "Average cost for epoch 7:[[ 0.05143405]]\n", 97 | "Average cost for epoch 8:[[ 0.00621457]]\n", 98 | "Average cost for epoch 9:[[ 0.0007379]]\n" 99 | ] 100 | } 101 | ], 102 | "source": [ 103 | "# Launch the graph in a session\n", 104 | "with tf.Session() as sess:\n", 105 | " tf.initialize_all_variables().run() #Initialize all variables\n", 106 | " \n", 107 | " for i in range(1,10):\n", 108 | " \n", 109 | " trainX, trainY =dsX[0:trainsamples], dsY[0:trainsamples]\n", 110 | " for x1,y1 in zip (trainX, trainY):\n", 111 | " sess.run(train_op, feed_dict={X: [[x1]], Y: y1})\n", 112 | " testX, testY = dsX[trainsamples:trainsamples + testsamples], dsY[trainsamples:trainsamples + testsamples] # test targets must match the test inputs\n", 113 | " \n", 114 | " cost1=0.\n", 115 | " for x1,y1 in zip (testX, testY):\n", 116 | " cost1 += sess.run(cost, feed_dict={X: [[x1]], Y: y1}) / testsamples \n", 117 | " print \"Average cost for epoch \" + str (i) + \":\" + str(cost1)\n", 118 | " dsX, dsY = shuffle (dsX, dsY) #We shuffle the samples to improve training \n", 119 | " \n" 120 | ] 121 | } 122 | ], 123 | "metadata": { 124 | "kernelspec": { 125 | "display_name": "Python 2", 126 | "language": "python", 127 | "name": "python2" 128 | }, 129 | "language_info": { 130 | "codemirror_mode": { 131 | "name": "ipython", 132 | "version": 2 133 | }, 134 | "file_extension": ".py", 135 | "mimetype": "text/x-python", 136 | "name": "python", 137 | "nbconvert_exporter": "python", 138 | "pygments_lexer": "ipython2", 139 | "version": "2.7.11+" 140 | } 141 | }, 142 | 
"nbformat": 4, 143 | "nbformat_minor": 0 144 | } 145 | -------------------------------------------------------------------------------- /5/data/mpg.csv: -------------------------------------------------------------------------------- 1 | mpg,cylinders,displacement,horsepower,weight,acceleration,model_year,origin,name 2 | 18,8,307,130,3504,12,70,1,chevrolet chevelle malibu 3 | 15,8,350,165,3693,11.5,70,1,buick skylark 320 4 | 18,8,318,150,3436,11,70,1,plymouth satellite 5 | 16,8,304,150,3433,12,70,1,amc rebel sst 6 | 17,8,302,140,3449,10.5,70,1,ford torino 7 | 15,8,429,198,4341,10,70,1,ford galaxie 500 8 | 14,8,454,220,4354,9,70,1,chevrolet impala 9 | 14,8,440,215,4312,8.5,70,1,plymouth fury iii 10 | 14,8,455,225,4425,10,70,1,pontiac catalina 11 | 15,8,390,190,3850,8.5,70,1,amc ambassador dpl 12 | 15,8,383,170,3563,10,70,1,dodge challenger se 13 | 14,8,340,160,3609,8,70,1,plymouth 'cuda 340 14 | 15,8,400,150,3761,9.5,70,1,chevrolet monte carlo 15 | 14,8,455,225,3086,10,70,1,buick estate wagon (sw) 16 | 24,4,113,95,2372,15,70,3,toyota corona mark ii 17 | 22,6,198,95,2833,15.5,70,1,plymouth duster 18 | 18,6,199,97,2774,15.5,70,1,amc hornet 19 | 21,6,200,85,2587,16,70,1,ford maverick 20 | 27,4,97,88,2130,14.5,70,3,datsun pl510 21 | 26,4,97,46,1835,20.5,70,2,volkswagen 1131 deluxe sedan 22 | 25,4,110,87,2672,17.5,70,2,peugeot 504 23 | 24,4,107,90,2430,14.5,70,2,audi 100 ls 24 | 25,4,104,95,2375,17.5,70,2,saab 99e 25 | 26,4,121,113,2234,12.5,70,2,bmw 2002 26 | 21,6,199,90,2648,15,70,1,amc gremlin 27 | 10,8,360,215,4615,14,70,1,ford f250 28 | 10,8,307,200,4376,15,70,1,chevy c20 29 | 11,8,318,210,4382,13.5,70,1,dodge d200 30 | 9,8,304,193,4732,18.5,70,1,hi 1200d 31 | 27,4,97,88,2130,14.5,71,3,datsun pl510 32 | 28,4,140,90,2264,15.5,71,1,chevrolet vega 2300 33 | 25,4,113,95,2228,14,71,3,toyota corona 34 | 25,4,98,0,2046,19,71,1,ford pinto 35 | 19,6,232,100,2634,13,71,1,amc gremlin 36 | 16,6,225,105,3439,15.5,71,1,plymouth satellite custom 37 | 17,6,250,100,3329,15.5,71,1,chevrolet chevelle malibu 38 | 19,6,250,88,3302,15.5,71,1,ford torino 500 39 | 18,6,232,100,3288,15.5,71,1,amc matador 40 | 14,8,350,165,4209,12,71,1,chevrolet impala 41 | 14,8,400,175,4464,11.5,71,1,pontiac catalina brougham 42 | 14,8,351,153,4154,13.5,71,1,ford galaxie 500 43 | 14,8,318,150,4096,13,71,1,plymouth fury iii 44 | 12,8,383,180,4955,11.5,71,1,dodge monaco (sw) 45 | 13,8,400,170,4746,12,71,1,ford country squire (sw) 46 | 13,8,400,175,5140,12,71,1,pontiac safari (sw) 47 | 18,6,258,110,2962,13.5,71,1,amc hornet sportabout (sw) 48 | 22,4,140,72,2408,19,71,1,chevrolet vega (sw) 49 | 19,6,250,100,3282,15,71,1,pontiac firebird 50 | 18,6,250,88,3139,14.5,71,1,ford mustang 51 | 23,4,122,86,2220,14,71,1,mercury capri 2000 52 | 28,4,116,90,2123,14,71,2,opel 1900 53 | 30,4,79,70,2074,19.5,71,2,peugeot 304 54 | 30,4,88,76,2065,14.5,71,2,fiat 124b 55 | 31,4,71,65,1773,19,71,3,toyota corolla 1200 56 | 35,4,72,69,1613,18,71,3,datsun 1200 57 | 27,4,97,60,1834,19,71,2,volkswagen model 111 58 | 26,4,91,70,1955,20.5,71,1,plymouth cricket 59 | 24,4,113,95,2278,15.5,72,3,toyota corona hardtop 60 | 25,4,97.5,80,2126,17,72,1,dodge colt hardtop 61 | 23,4,97,54,2254,23.5,72,2,volkswagen type 3 62 | 20,4,140,90,2408,19.5,72,1,chevrolet vega 63 | 21,4,122,86,2226,16.5,72,1,ford pinto runabout 64 | 13,8,350,165,4274,12,72,1,chevrolet impala 65 | 14,8,400,175,4385,12,72,1,pontiac catalina 66 | 15,8,318,150,4135,13.5,72,1,plymouth fury iii 67 | 14,8,351,153,4129,13,72,1,ford galaxie 500 68 | 17,8,304,150,3672,11.5,72,1,amc ambassador sst 69 | 
11,8,429,208,4633,11,72,1,mercury marquis 70 | 13,8,350,155,4502,13.5,72,1,buick lesabre custom 71 | 12,8,350,160,4456,13.5,72,1,oldsmobile delta 88 royale 72 | 13,8,400,190,4422,12.5,72,1,chrysler newport royal 73 | 19,3,70,97,2330,13.5,72,3,mazda rx2 coupe 74 | 15,8,304,150,3892,12.5,72,1,amc matador (sw) 75 | 13,8,307,130,4098,14,72,1,chevrolet chevelle concours (sw) 76 | 13,8,302,140,4294,16,72,1,ford gran torino (sw) 77 | 14,8,318,150,4077,14,72,1,plymouth satellite custom (sw) 78 | 18,4,121,112,2933,14.5,72,2,volvo 145e (sw) 79 | 22,4,121,76,2511,18,72,2,volkswagen 411 (sw) 80 | 21,4,120,87,2979,19.5,72,2,peugeot 504 (sw) 81 | 26,4,96,69,2189,18,72,2,renault 12 (sw) 82 | 22,4,122,86,2395,16,72,1,ford pinto (sw) 83 | 28,4,97,92,2288,17,72,3,datsun 510 (sw) 84 | 23,4,120,97,2506,14.5,72,3,toyouta corona mark ii (sw) 85 | 28,4,98,80,2164,15,72,1,dodge colt (sw) 86 | 27,4,97,88,2100,16.5,72,3,toyota corolla 1600 (sw) 87 | 13,8,350,175,4100,13,73,1,buick century 350 88 | 14,8,304,150,3672,11.5,73,1,amc matador 89 | 13,8,350,145,3988,13,73,1,chevrolet malibu 90 | 14,8,302,137,4042,14.5,73,1,ford gran torino 91 | 15,8,318,150,3777,12.5,73,1,dodge coronet custom 92 | 12,8,429,198,4952,11.5,73,1,mercury marquis brougham 93 | 13,8,400,150,4464,12,73,1,chevrolet caprice classic 94 | 13,8,351,158,4363,13,73,1,ford ltd 95 | 14,8,318,150,4237,14.5,73,1,plymouth fury gran sedan 96 | 13,8,440,215,4735,11,73,1,chrysler new yorker brougham 97 | 12,8,455,225,4951,11,73,1,buick electra 225 custom 98 | 13,8,360,175,3821,11,73,1,amc ambassador brougham 99 | 18,6,225,105,3121,16.5,73,1,plymouth valiant 100 | 16,6,250,100,3278,18,73,1,chevrolet nova custom 101 | 18,6,232,100,2945,16,73,1,amc hornet 102 | 18,6,250,88,3021,16.5,73,1,ford maverick 103 | 23,6,198,95,2904,16,73,1,plymouth duster 104 | 26,4,97,46,1950,21,73,2,volkswagen super beetle 105 | 11,8,400,150,4997,14,73,1,chevrolet impala 106 | 12,8,400,167,4906,12.5,73,1,ford country 107 | 13,8,360,170,4654,13,73,1,plymouth custom suburb 108 | 12,8,350,180,4499,12.5,73,1,oldsmobile vista cruiser 109 | 18,6,232,100,2789,15,73,1,amc gremlin 110 | 20,4,97,88,2279,19,73,3,toyota carina 111 | 21,4,140,72,2401,19.5,73,1,chevrolet vega 112 | 22,4,108,94,2379,16.5,73,3,datsun 610 113 | 18,3,70,90,2124,13.5,73,3,maxda rx3 114 | 19,4,122,85,2310,18.5,73,1,ford pinto 115 | 21,6,155,107,2472,14,73,1,mercury capri v6 116 | 26,4,98,90,2265,15.5,73,2,fiat 124 sport coupe 117 | 15,8,350,145,4082,13,73,1,chevrolet monte carlo s 118 | 16,8,400,230,4278,9.5,73,1,pontiac grand prix 119 | 29,4,68,49,1867,19.5,73,2,fiat 128 120 | 24,4,116,75,2158,15.5,73,2,opel manta 121 | 20,4,114,91,2582,14,73,2,audi 100ls 122 | 19,4,121,112,2868,15.5,73,2,volvo 144ea 123 | 15,8,318,150,3399,11,73,1,dodge dart custom 124 | 24,4,121,110,2660,14,73,2,saab 99le 125 | 20,6,156,122,2807,13.5,73,3,toyota mark ii 126 | 11,8,350,180,3664,11,73,1,oldsmobile omega 127 | 20,6,198,95,3102,16.5,74,1,plymouth duster 128 | 21,6,200,0,2875,17,74,1,ford maverick 129 | 19,6,232,100,2901,16,74,1,amc hornet 130 | 15,6,250,100,3336,17,74,1,chevrolet nova 131 | 31,4,79,67,1950,19,74,3,datsun b210 132 | 26,4,122,80,2451,16.5,74,1,ford pinto 133 | 32,4,71,65,1836,21,74,3,toyota corolla 1200 134 | 25,4,140,75,2542,17,74,1,chevrolet vega 135 | 16,6,250,100,3781,17,74,1,chevrolet chevelle malibu classic 136 | 16,6,258,110,3632,18,74,1,amc matador 137 | 18,6,225,105,3613,16.5,74,1,plymouth satellite sebring 138 | 16,8,302,140,4141,14,74,1,ford gran torino 139 | 13,8,350,150,4699,14.5,74,1,buick century luxus (sw) 140 
| 14,8,318,150,4457,13.5,74,1,dodge coronet custom (sw) 141 | 14,8,302,140,4638,16,74,1,ford gran torino (sw) 142 | 14,8,304,150,4257,15.5,74,1,amc matador (sw) 143 | 29,4,98,83,2219,16.5,74,2,audi fox 144 | 26,4,79,67,1963,15.5,74,2,volkswagen dasher 145 | 26,4,97,78,2300,14.5,74,2,opel manta 146 | 31,4,76,52,1649,16.5,74,3,toyota corona 147 | 32,4,83,61,2003,19,74,3,datsun 710 148 | 28,4,90,75,2125,14.5,74,1,dodge colt 149 | 24,4,90,75,2108,15.5,74,2,fiat 128 150 | 26,4,116,75,2246,14,74,2,fiat 124 tc 151 | 24,4,120,97,2489,15,74,3,honda civic 152 | 26,4,108,93,2391,15.5,74,3,subaru 153 | 31,4,79,67,2000,16,74,2,fiat x1.9 154 | 19,6,225,95,3264,16,75,1,plymouth valiant custom 155 | 18,6,250,105,3459,16,75,1,chevrolet nova 156 | 15,6,250,72,3432,21,75,1,mercury monarch 157 | 15,6,250,72,3158,19.5,75,1,ford maverick 158 | 16,8,400,170,4668,11.5,75,1,pontiac catalina 159 | 15,8,350,145,4440,14,75,1,chevrolet bel air 160 | 16,8,318,150,4498,14.5,75,1,plymouth grand fury 161 | 14,8,351,148,4657,13.5,75,1,ford ltd 162 | 17,6,231,110,3907,21,75,1,buick century 163 | 16,6,250,105,3897,18.5,75,1,chevroelt chevelle malibu 164 | 15,6,258,110,3730,19,75,1,amc matador 165 | 18,6,225,95,3785,19,75,1,plymouth fury 166 | 21,6,231,110,3039,15,75,1,buick skyhawk 167 | 20,8,262,110,3221,13.5,75,1,chevrolet monza 2+2 168 | 13,8,302,129,3169,12,75,1,ford mustang ii 169 | 29,4,97,75,2171,16,75,3,toyota corolla 170 | 23,4,140,83,2639,17,75,1,ford pinto 171 | 20,6,232,100,2914,16,75,1,amc gremlin 172 | 23,4,140,78,2592,18.5,75,1,pontiac astro 173 | 24,4,134,96,2702,13.5,75,3,toyota corona 174 | 25,4,90,71,2223,16.5,75,2,volkswagen dasher 175 | 24,4,119,97,2545,17,75,3,datsun 710 176 | 18,6,171,97,2984,14.5,75,1,ford pinto 177 | 29,4,90,70,1937,14,75,2,volkswagen rabbit 178 | 19,6,232,90,3211,17,75,1,amc pacer 179 | 23,4,115,95,2694,15,75,2,audi 100ls 180 | 23,4,120,88,2957,17,75,2,peugeot 504 181 | 22,4,121,98,2945,14.5,75,2,volvo 244dl 182 | 25,4,121,115,2671,13.5,75,2,saab 99le 183 | 33,4,91,53,1795,17.5,75,3,honda civic cvcc 184 | 28,4,107,86,2464,15.5,76,2,fiat 131 185 | 25,4,116,81,2220,16.9,76,2,opel 1900 186 | 25,4,140,92,2572,14.9,76,1,capri ii 187 | 26,4,98,79,2255,17.7,76,1,dodge colt 188 | 27,4,101,83,2202,15.3,76,2,renault 12tl 189 | 17.5,8,305,140,4215,13,76,1,chevrolet chevelle malibu classic 190 | 16,8,318,150,4190,13,76,1,dodge coronet brougham 191 | 15.5,8,304,120,3962,13.9,76,1,amc matador 192 | 14.5,8,351,152,4215,12.8,76,1,ford gran torino 193 | 22,6,225,100,3233,15.4,76,1,plymouth valiant 194 | 22,6,250,105,3353,14.5,76,1,chevrolet nova 195 | 24,6,200,81,3012,17.6,76,1,ford maverick 196 | 22.5,6,232,90,3085,17.6,76,1,amc hornet 197 | 29,4,85,52,2035,22.2,76,1,chevrolet chevette 198 | 24.5,4,98,60,2164,22.1,76,1,chevrolet woody 199 | 29,4,90,70,1937,14.2,76,2,vw rabbit 200 | 33,4,91,53,1795,17.4,76,3,honda civic 201 | 20,6,225,100,3651,17.7,76,1,dodge aspen se 202 | 18,6,250,78,3574,21,76,1,ford granada ghia 203 | 18.5,6,250,110,3645,16.2,76,1,pontiac ventura sj 204 | 17.5,6,258,95,3193,17.8,76,1,amc pacer d/l 205 | 29.5,4,97,71,1825,12.2,76,2,volkswagen rabbit 206 | 32,4,85,70,1990,17,76,3,datsun b-210 207 | 28,4,97,75,2155,16.4,76,3,toyota corolla 208 | 26.5,4,140,72,2565,13.6,76,1,ford pinto 209 | 20,4,130,102,3150,15.7,76,2,volvo 245 210 | 13,8,318,150,3940,13.2,76,1,plymouth volare premier v8 211 | 19,4,120,88,3270,21.9,76,2,peugeot 504 212 | 19,6,156,108,2930,15.5,76,3,toyota mark ii 213 | 16.5,6,168,120,3820,16.7,76,2,mercedes-benz 280s 214 | 16.5,8,350,180,4380,12.1,76,1,cadillac 
seville 215 | 13,8,350,145,4055,12,76,1,chevy c10 216 | 13,8,302,130,3870,15,76,1,ford f108 217 | 13,8,318,150,3755,14,76,1,dodge d100 218 | 31.5,4,98,68,2045,18.5,77,3,honda accord cvcc 219 | 30,4,111,80,2155,14.8,77,1,buick opel isuzu deluxe 220 | 36,4,79,58,1825,18.6,77,2,renault 5 gtl 221 | 25.5,4,122,96,2300,15.5,77,1,plymouth arrow gs 222 | 33.5,4,85,70,1945,16.8,77,3,datsun f-10 hatchback 223 | 17.5,8,305,145,3880,12.5,77,1,chevrolet caprice classic 224 | 17,8,260,110,4060,19,77,1,oldsmobile cutlass supreme 225 | 15.5,8,318,145,4140,13.7,77,1,dodge monaco brougham 226 | 15,8,302,130,4295,14.9,77,1,mercury cougar brougham 227 | 17.5,6,250,110,3520,16.4,77,1,chevrolet concours 228 | 20.5,6,231,105,3425,16.9,77,1,buick skylark 229 | 19,6,225,100,3630,17.7,77,1,plymouth volare custom 230 | 18.5,6,250,98,3525,19,77,1,ford granada 231 | 16,8,400,180,4220,11.1,77,1,pontiac grand prix lj 232 | 15.5,8,350,170,4165,11.4,77,1,chevrolet monte carlo landau 233 | 15.5,8,400,190,4325,12.2,77,1,chrysler cordoba 234 | 16,8,351,149,4335,14.5,77,1,ford thunderbird 235 | 29,4,97,78,1940,14.5,77,2,volkswagen rabbit custom 236 | 24.5,4,151,88,2740,16,77,1,pontiac sunbird coupe 237 | 26,4,97,75,2265,18.2,77,3,toyota corolla liftback 238 | 25.5,4,140,89,2755,15.8,77,1,ford mustang ii 2+2 239 | 30.5,4,98,63,2051,17,77,1,chevrolet chevette 240 | 33.5,4,98,83,2075,15.9,77,1,dodge colt m/m 241 | 30,4,97,67,1985,16.4,77,3,subaru dl 242 | 30.5,4,97,78,2190,14.1,77,2,volkswagen dasher 243 | 22,6,146,97,2815,14.5,77,3,datsun 810 244 | 21.5,4,121,110,2600,12.8,77,2,bmw 320i 245 | 21.5,3,80,110,2720,13.5,77,3,mazda rx-4 246 | 43.1,4,90,48,1985,21.5,78,2,volkswagen rabbit custom diesel 247 | 36.1,4,98,66,1800,14.4,78,1,ford fiesta 248 | 32.8,4,78,52,1985,19.4,78,3,mazda glc deluxe 249 | 39.4,4,85,70,2070,18.6,78,3,datsun b210 gx 250 | 36.1,4,91,60,1800,16.4,78,3,honda civic cvcc 251 | 19.9,8,260,110,3365,15.5,78,1,oldsmobile cutlass salon brougham 252 | 19.4,8,318,140,3735,13.2,78,1,dodge diplomat 253 | 20.2,8,302,139,3570,12.8,78,1,mercury monarch ghia 254 | 19.2,6,231,105,3535,19.2,78,1,pontiac phoenix lj 255 | 20.5,6,200,95,3155,18.2,78,1,chevrolet malibu 256 | 20.2,6,200,85,2965,15.8,78,1,ford fairmont (auto) 257 | 25.1,4,140,88,2720,15.4,78,1,ford fairmont (man) 258 | 20.5,6,225,100,3430,17.2,78,1,plymouth volare 259 | 19.4,6,232,90,3210,17.2,78,1,amc concord 260 | 20.6,6,231,105,3380,15.8,78,1,buick century special 261 | 20.8,6,200,85,3070,16.7,78,1,mercury zephyr 262 | 18.6,6,225,110,3620,18.7,78,1,dodge aspen 263 | 18.1,6,258,120,3410,15.1,78,1,amc concord d/l 264 | 19.2,8,305,145,3425,13.2,78,1,chevrolet monte carlo landau 265 | 17.7,6,231,165,3445,13.4,78,1,buick regal sport coupe (turbo) 266 | 18.1,8,302,139,3205,11.2,78,1,ford futura 267 | 17.5,8,318,140,4080,13.7,78,1,dodge magnum xe 268 | 30,4,98,68,2155,16.5,78,1,chevrolet chevette 269 | 27.5,4,134,95,2560,14.2,78,3,toyota corona 270 | 27.2,4,119,97,2300,14.7,78,3,datsun 510 271 | 30.9,4,105,75,2230,14.5,78,1,dodge omni 272 | 21.1,4,134,95,2515,14.8,78,3,toyota celica gt liftback 273 | 23.2,4,156,105,2745,16.7,78,1,plymouth sapporo 274 | 23.8,4,151,85,2855,17.6,78,1,oldsmobile starfire sx 275 | 23.9,4,119,97,2405,14.9,78,3,datsun 200-sx 276 | 20.3,5,131,103,2830,15.9,78,2,audi 5000 277 | 17,6,163,125,3140,13.6,78,2,volvo 264gl 278 | 21.6,4,121,115,2795,15.7,78,2,saab 99gle 279 | 16.2,6,163,133,3410,15.8,78,2,peugeot 604sl 280 | 31.5,4,89,71,1990,14.9,78,2,volkswagen scirocco 281 | 29.5,4,98,68,2135,16.6,78,3,honda accord lx 282 | 
21.5,6,231,115,3245,15.4,79,1,pontiac lemans v6 283 | 19.8,6,200,85,2990,18.2,79,1,mercury zephyr 6 284 | 22.3,4,140,88,2890,17.3,79,1,ford fairmont 4 285 | 20.2,6,232,90,3265,18.2,79,1,amc concord dl 6 286 | 20.6,6,225,110,3360,16.6,79,1,dodge aspen 6 287 | 17,8,305,130,3840,15.4,79,1,chevrolet caprice classic 288 | 17.6,8,302,129,3725,13.4,79,1,ford ltd landau 289 | 16.5,8,351,138,3955,13.2,79,1,mercury grand marquis 290 | 18.2,8,318,135,3830,15.2,79,1,dodge st. regis 291 | 16.9,8,350,155,4360,14.9,79,1,buick estate wagon (sw) 292 | 15.5,8,351,142,4054,14.3,79,1,ford country squire (sw) 293 | 19.2,8,267,125,3605,15,79,1,chevrolet malibu classic (sw) 294 | 18.5,8,360,150,3940,13,79,1,chrysler lebaron town @ country (sw) 295 | 31.9,4,89,71,1925,14,79,2,vw rabbit custom 296 | 34.1,4,86,65,1975,15.2,79,3,maxda glc deluxe 297 | 35.7,4,98,80,1915,14.4,79,1,dodge colt hatchback custom 298 | 27.4,4,121,80,2670,15,79,1,amc spirit dl 299 | 25.4,5,183,77,3530,20.1,79,2,mercedes benz 300d 300 | 23,8,350,125,3900,17.4,79,1,cadillac eldorado 301 | 27.2,4,141,71,3190,24.8,79,2,peugeot 504 302 | 23.9,8,260,90,3420,22.2,79,1,oldsmobile cutlass salon brougham 303 | 34.2,4,105,70,2200,13.2,79,1,plymouth horizon 304 | 34.5,4,105,70,2150,14.9,79,1,plymouth horizon tc3 305 | 31.8,4,85,65,2020,19.2,79,3,datsun 210 306 | 37.3,4,91,69,2130,14.7,79,2,fiat strada custom 307 | 28.4,4,151,90,2670,16,79,1,buick skylark limited 308 | 28.8,6,173,115,2595,11.3,79,1,chevrolet citation 309 | 26.8,6,173,115,2700,12.9,79,1,oldsmobile omega brougham 310 | 33.5,4,151,90,2556,13.2,79,1,pontiac phoenix 311 | 41.5,4,98,76,2144,14.7,80,2,vw rabbit 312 | 38.1,4,89,60,1968,18.8,80,3,toyota corolla tercel 313 | 32.1,4,98,70,2120,15.5,80,1,chevrolet chevette 314 | 37.2,4,86,65,2019,16.4,80,3,datsun 310 315 | 28,4,151,90,2678,16.5,80,1,chevrolet citation 316 | 26.4,4,140,88,2870,18.1,80,1,ford fairmont 317 | 24.3,4,151,90,3003,20.1,80,1,amc concord 318 | 19.1,6,225,90,3381,18.7,80,1,dodge aspen 319 | 34.3,4,97,78,2188,15.8,80,2,audi 4000 320 | 29.8,4,134,90,2711,15.5,80,3,toyota corona liftback 321 | 31.3,4,120,75,2542,17.5,80,3,mazda 626 322 | 37,4,119,92,2434,15,80,3,datsun 510 hatchback 323 | 32.2,4,108,75,2265,15.2,80,3,toyota corolla 324 | 46.6,4,86,65,2110,17.9,80,3,mazda glc 325 | 27.9,4,156,105,2800,14.4,80,1,dodge colt 326 | 40.8,4,85,65,2110,19.2,80,3,datsun 210 327 | 44.3,4,90,48,2085,21.7,80,2,vw rabbit c (diesel) 328 | 43.4,4,90,48,2335,23.7,80,2,vw dasher (diesel) 329 | 36.4,5,121,67,2950,19.9,80,2,audi 5000s (diesel) 330 | 30,4,146,67,3250,21.8,80,2,mercedes-benz 240d 331 | 44.6,4,91,67,1850,13.8,80,3,honda civic 1500 gl 332 | 40.9,4,85,0,1835,17.3,80,2,renault lecar deluxe 333 | 33.8,4,97,67,2145,18,80,3,subaru dl 334 | 29.8,4,89,62,1845,15.3,80,2,vokswagen rabbit 335 | 32.7,6,168,132,2910,11.4,80,3,datsun 280-zx 336 | 23.7,3,70,100,2420,12.5,80,3,mazda rx-7 gs 337 | 35,4,122,88,2500,15.1,80,2,triumph tr7 coupe 338 | 23.6,4,140,0,2905,14.3,80,1,ford mustang cobra 339 | 32.4,4,107,72,2290,17,80,3,honda accord 340 | 27.2,4,135,84,2490,15.7,81,1,plymouth reliant 341 | 26.6,4,151,84,2635,16.4,81,1,buick skylark 342 | 25.8,4,156,92,2620,14.4,81,1,dodge aries wagon (sw) 343 | 23.5,6,173,110,2725,12.6,81,1,chevrolet citation 344 | 30,4,135,84,2385,12.9,81,1,plymouth reliant 345 | 39.1,4,79,58,1755,16.9,81,3,toyota starlet 346 | 39,4,86,64,1875,16.4,81,1,plymouth champ 347 | 35.1,4,81,60,1760,16.1,81,3,honda civic 1300 348 | 32.3,4,97,67,2065,17.8,81,3,subaru 349 | 37,4,85,65,1975,19.4,81,3,datsun 210 mpg 350 | 
37.7,4,89,62,2050,17.3,81,3,toyota tercel 351 | 34.1,4,91,68,1985,16,81,3,mazda glc 4 352 | 34.7,4,105,63,2215,14.9,81,1,plymouth horizon 4 353 | 34.4,4,98,65,2045,16.2,81,1,ford escort 4w 354 | 29.9,4,98,65,2380,20.7,81,1,ford escort 2h 355 | 33,4,105,74,2190,14.2,81,2,volkswagen jetta 356 | 34.5,4,100,0,2320,15.8,81,2,renault 18i 357 | 33.7,4,107,75,2210,14.4,81,3,honda prelude 358 | 32.4,4,108,75,2350,16.8,81,3,toyota corolla 359 | 32.9,4,119,100,2615,14.8,81,3,datsun 200sx 360 | 31.6,4,120,74,2635,18.3,81,3,mazda 626 361 | 28.1,4,141,80,3230,20.4,81,2,peugeot 505s turbo diesel 362 | 30.7,6,145,76,3160,19.6,81,2,volvo diesel 363 | 25.4,6,168,116,2900,12.6,81,3,toyota cressida 364 | 24.2,6,146,120,2930,13.8,81,3,datsun 810 maxima 365 | 22.4,6,231,110,3415,15.8,81,1,buick century 366 | 26.6,8,350,105,3725,19,81,1,oldsmobile cutlass ls 367 | 20.2,6,200,88,3060,17.1,81,1,ford granada gl 368 | 17.6,6,225,85,3465,16.6,81,1,chrysler lebaron salon 369 | 28,4,112,88,2605,19.6,82,1,chevrolet cavalier 370 | 27,4,112,88,2640,18.6,82,1,chevrolet cavalier wagon 371 | 34,4,112,88,2395,18,82,1,chevrolet cavalier 2-door 372 | 31,4,112,85,2575,16.2,82,1,pontiac j2000 se hatchback 373 | 29,4,135,84,2525,16,82,1,dodge aries se 374 | 27,4,151,90,2735,18,82,1,pontiac phoenix 375 | 24,4,140,92,2865,16.4,82,1,ford fairmont futura 376 | 23,4,151,0,3035,20.5,82,1,amc concord dl 377 | 36,4,105,74,1980,15.3,82,2,volkswagen rabbit l 378 | 37,4,91,68,2025,18.2,82,3,mazda glc custom l 379 | 31,4,91,68,1970,17.6,82,3,mazda glc custom 380 | 38,4,105,63,2125,14.7,82,1,plymouth horizon miser 381 | 36,4,98,70,2125,17.3,82,1,mercury lynx l 382 | 36,4,120,88,2160,14.5,82,3,nissan stanza xe 383 | 36,4,107,75,2205,14.5,82,3,honda accord 384 | 34,4,108,70,2245,16.9,82,3,toyota corolla 385 | 38,4,91,67,1965,15,82,3,honda civic 386 | 32,4,91,67,1965,15.7,82,3,honda civic (auto) 387 | 38,4,91,67,1995,16.2,82,3,datsun 310 gx 388 | 25,6,181,110,2945,16.4,82,1,buick century limited 389 | 38,6,262,85,3015,17,82,1,oldsmobile cutlass ciera (diesel) 390 | 26,4,156,92,2585,14.5,82,1,chrysler lebaron medallion 391 | 22,6,232,112,2835,14.7,82,1,ford granada l 392 | 32,4,144,96,2665,13.9,82,3,toyota celica gt 393 | 36,4,135,84,2370,13,82,1,dodge charger 2.2 394 | 27,4,151,90,2950,17.3,82,1,chevrolet camaro 395 | 27,4,140,86,2790,15.6,82,1,ford mustang gl 396 | 44,4,97,52,2130,24.6,82,2,vw pickup 397 | 32,4,135,84,2295,11.6,82,1,dodge rampage 398 | 28,4,120,79,2625,18.6,82,1,ford ranger 399 | 31,4,119,82,2720,19.4,82,1,chevy s-10 -------------------------------------------------------------------------------- /5/data/wine.csv: -------------------------------------------------------------------------------- 1 | Wine,Alcohol,Malic.acid,Ash,Acl,Mg,Phenols,Flavanoids,Nonflavanoid.phenols,Proanth,Color.int,Hue,OD,Proline 2 | 1,14.23,1.71,2.43,15.6,127,2.8,3.06,.28,2.29,5.64,1.04,3.92,1065 3 | 1,13.2,1.78,2.14,11.2,100,2.65,2.76,.26,1.28,4.38,1.05,3.4,1050 4 | 1,13.16,2.36,2.67,18.6,101,2.8,3.24,.3,2.81,5.68,1.03,3.17,1185 5 | 1,14.37,1.95,2.5,16.8,113,3.85,3.49,.24,2.18,7.8,.86,3.45,1480 6 | 1,13.24,2.59,2.87,21,118,2.8,2.69,.39,1.82,4.32,1.04,2.93,735 7 | 1,14.2,1.76,2.45,15.2,112,3.27,3.39,.34,1.97,6.75,1.05,2.85,1450 8 | 1,14.39,1.87,2.45,14.6,96,2.5,2.52,.3,1.98,5.25,1.02,3.58,1290 9 | 1,14.06,2.15,2.61,17.6,121,2.6,2.51,.31,1.25,5.05,1.06,3.58,1295 10 | 1,14.83,1.64,2.17,14,97,2.8,2.98,.29,1.98,5.2,1.08,2.85,1045 11 | 1,13.86,1.35,2.27,16,98,2.98,3.15,.22,1.85,7.22,1.01,3.55,1045 12 | 
1,14.1,2.16,2.3,18,105,2.95,3.32,.22,2.38,5.75,1.25,3.17,1510 13 | 1,14.12,1.48,2.32,16.8,95,2.2,2.43,.26,1.57,5,1.17,2.82,1280 14 | 1,13.75,1.73,2.41,16,89,2.6,2.76,.29,1.81,5.6,1.15,2.9,1320 15 | 1,14.75,1.73,2.39,11.4,91,3.1,3.69,.43,2.81,5.4,1.25,2.73,1150 16 | 1,14.38,1.87,2.38,12,102,3.3,3.64,.29,2.96,7.5,1.2,3,1547 17 | 1,13.63,1.81,2.7,17.2,112,2.85,2.91,.3,1.46,7.3,1.28,2.88,1310 18 | 1,14.3,1.92,2.72,20,120,2.8,3.14,.33,1.97,6.2,1.07,2.65,1280 19 | 1,13.83,1.57,2.62,20,115,2.95,3.4,.4,1.72,6.6,1.13,2.57,1130 20 | 1,14.19,1.59,2.48,16.5,108,3.3,3.93,.32,1.86,8.7,1.23,2.82,1680 21 | 1,13.64,3.1,2.56,15.2,116,2.7,3.03,.17,1.66,5.1,.96,3.36,845 22 | 1,14.06,1.63,2.28,16,126,3,3.17,.24,2.1,5.65,1.09,3.71,780 23 | 1,12.93,3.8,2.65,18.6,102,2.41,2.41,.25,1.98,4.5,1.03,3.52,770 24 | 1,13.71,1.86,2.36,16.6,101,2.61,2.88,.27,1.69,3.8,1.11,4,1035 25 | 1,12.85,1.6,2.52,17.8,95,2.48,2.37,.26,1.46,3.93,1.09,3.63,1015 26 | 1,13.5,1.81,2.61,20,96,2.53,2.61,.28,1.66,3.52,1.12,3.82,845 27 | 1,13.05,2.05,3.22,25,124,2.63,2.68,.47,1.92,3.58,1.13,3.2,830 28 | 1,13.39,1.77,2.62,16.1,93,2.85,2.94,.34,1.45,4.8,.92,3.22,1195 29 | 1,13.3,1.72,2.14,17,94,2.4,2.19,.27,1.35,3.95,1.02,2.77,1285 30 | 1,13.87,1.9,2.8,19.4,107,2.95,2.97,.37,1.76,4.5,1.25,3.4,915 31 | 1,14.02,1.68,2.21,16,96,2.65,2.33,.26,1.98,4.7,1.04,3.59,1035 32 | 1,13.73,1.5,2.7,22.5,101,3,3.25,.29,2.38,5.7,1.19,2.71,1285 33 | 1,13.58,1.66,2.36,19.1,106,2.86,3.19,.22,1.95,6.9,1.09,2.88,1515 34 | 1,13.68,1.83,2.36,17.2,104,2.42,2.69,.42,1.97,3.84,1.23,2.87,990 35 | 1,13.76,1.53,2.7,19.5,132,2.95,2.74,.5,1.35,5.4,1.25,3,1235 36 | 1,13.51,1.8,2.65,19,110,2.35,2.53,.29,1.54,4.2,1.1,2.87,1095 37 | 1,13.48,1.81,2.41,20.5,100,2.7,2.98,.26,1.86,5.1,1.04,3.47,920 38 | 1,13.28,1.64,2.84,15.5,110,2.6,2.68,.34,1.36,4.6,1.09,2.78,880 39 | 1,13.05,1.65,2.55,18,98,2.45,2.43,.29,1.44,4.25,1.12,2.51,1105 40 | 1,13.07,1.5,2.1,15.5,98,2.4,2.64,.28,1.37,3.7,1.18,2.69,1020 41 | 1,14.22,3.99,2.51,13.2,128,3,3.04,.2,2.08,5.1,.89,3.53,760 42 | 1,13.56,1.71,2.31,16.2,117,3.15,3.29,.34,2.34,6.13,.95,3.38,795 43 | 1,13.41,3.84,2.12,18.8,90,2.45,2.68,.27,1.48,4.28,.91,3,1035 44 | 1,13.88,1.89,2.59,15,101,3.25,3.56,.17,1.7,5.43,.88,3.56,1095 45 | 1,13.24,3.98,2.29,17.5,103,2.64,2.63,.32,1.66,4.36,.82,3,680 46 | 1,13.05,1.77,2.1,17,107,3,3,.28,2.03,5.04,.88,3.35,885 47 | 1,14.21,4.04,2.44,18.9,111,2.85,2.65,.3,1.25,5.24,.87,3.33,1080 48 | 1,14.38,3.59,2.28,16,102,3.25,3.17,.27,2.19,4.9,1.04,3.44,1065 49 | 1,13.9,1.68,2.12,16,101,3.1,3.39,.21,2.14,6.1,.91,3.33,985 50 | 1,14.1,2.02,2.4,18.8,103,2.75,2.92,.32,2.38,6.2,1.07,2.75,1060 51 | 1,13.94,1.73,2.27,17.4,108,2.88,3.54,.32,2.08,8.90,1.12,3.1,1260 52 | 1,13.05,1.73,2.04,12.4,92,2.72,3.27,.17,2.91,7.2,1.12,2.91,1150 53 | 1,13.83,1.65,2.6,17.2,94,2.45,2.99,.22,2.29,5.6,1.24,3.37,1265 54 | 1,13.82,1.75,2.42,14,111,3.88,3.74,.32,1.87,7.05,1.01,3.26,1190 55 | 1,13.77,1.9,2.68,17.1,115,3,2.79,.39,1.68,6.3,1.13,2.93,1375 56 | 1,13.74,1.67,2.25,16.4,118,2.6,2.9,.21,1.62,5.85,.92,3.2,1060 57 | 1,13.56,1.73,2.46,20.5,116,2.96,2.78,.2,2.45,6.25,.98,3.03,1120 58 | 1,14.22,1.7,2.3,16.3,118,3.2,3,.26,2.03,6.38,.94,3.31,970 59 | 1,13.29,1.97,2.68,16.8,102,3,3.23,.31,1.66,6,1.07,2.84,1270 60 | 1,13.72,1.43,2.5,16.7,108,3.4,3.67,.19,2.04,6.8,.89,2.87,1285 61 | 2,12.37,.94,1.36,10.6,88,1.98,.57,.28,.42,1.95,1.05,1.82,520 62 | 2,12.33,1.1,2.28,16,101,2.05,1.09,.63,.41,3.27,1.25,1.67,680 63 | 2,12.64,1.36,2.02,16.8,100,2.02,1.41,.53,.62,5.75,.98,1.59,450 64 | 2,13.67,1.25,1.92,18,94,2.1,1.79,.32,.73,3.8,1.23,2.46,630 65 | 
2,12.37,1.13,2.16,19,87,3.5,3.1,.19,1.87,4.45,1.22,2.87,420 66 | 2,12.17,1.45,2.53,19,104,1.89,1.75,.45,1.03,2.95,1.45,2.23,355 67 | 2,12.37,1.21,2.56,18.1,98,2.42,2.65,.37,2.08,4.6,1.19,2.3,678 68 | 2,13.11,1.01,1.7,15,78,2.98,3.18,.26,2.28,5.3,1.12,3.18,502 69 | 2,12.37,1.17,1.92,19.6,78,2.11,2,.27,1.04,4.68,1.12,3.48,510 70 | 2,13.34,.94,2.36,17,110,2.53,1.3,.55,.42,3.17,1.02,1.93,750 71 | 2,12.21,1.19,1.75,16.8,151,1.85,1.28,.14,2.5,2.85,1.28,3.07,718 72 | 2,12.29,1.61,2.21,20.4,103,1.1,1.02,.37,1.46,3.05,.906,1.82,870 73 | 2,13.86,1.51,2.67,25,86,2.95,2.86,.21,1.87,3.38,1.36,3.16,410 74 | 2,13.49,1.66,2.24,24,87,1.88,1.84,.27,1.03,3.74,.98,2.78,472 75 | 2,12.99,1.67,2.6,30,139,3.3,2.89,.21,1.96,3.35,1.31,3.5,985 76 | 2,11.96,1.09,2.3,21,101,3.38,2.14,.13,1.65,3.21,.99,3.13,886 77 | 2,11.66,1.88,1.92,16,97,1.61,1.57,.34,1.15,3.8,1.23,2.14,428 78 | 2,13.03,.9,1.71,16,86,1.95,2.03,.24,1.46,4.6,1.19,2.48,392 79 | 2,11.84,2.89,2.23,18,112,1.72,1.32,.43,.95,2.65,.96,2.52,500 80 | 2,12.33,.99,1.95,14.8,136,1.9,1.85,.35,2.76,3.4,1.06,2.31,750 81 | 2,12.7,3.87,2.4,23,101,2.83,2.55,.43,1.95,2.57,1.19,3.13,463 82 | 2,12,.92,2,19,86,2.42,2.26,.3,1.43,2.5,1.38,3.12,278 83 | 2,12.72,1.81,2.2,18.8,86,2.2,2.53,.26,1.77,3.9,1.16,3.14,714 84 | 2,12.08,1.13,2.51,24,78,2,1.58,.4,1.4,2.2,1.31,2.72,630 85 | 2,13.05,3.86,2.32,22.5,85,1.65,1.59,.61,1.62,4.8,.84,2.01,515 86 | 2,11.84,.89,2.58,18,94,2.2,2.21,.22,2.35,3.05,.79,3.08,520 87 | 2,12.67,.98,2.24,18,99,2.2,1.94,.3,1.46,2.62,1.23,3.16,450 88 | 2,12.16,1.61,2.31,22.8,90,1.78,1.69,.43,1.56,2.45,1.33,2.26,495 89 | 2,11.65,1.67,2.62,26,88,1.92,1.61,.4,1.34,2.6,1.36,3.21,562 90 | 2,11.64,2.06,2.46,21.6,84,1.95,1.69,.48,1.35,2.8,1,2.75,680 91 | 2,12.08,1.33,2.3,23.6,70,2.2,1.59,.42,1.38,1.74,1.07,3.21,625 92 | 2,12.08,1.83,2.32,18.5,81,1.6,1.5,.52,1.64,2.4,1.08,2.27,480 93 | 2,12,1.51,2.42,22,86,1.45,1.25,.5,1.63,3.6,1.05,2.65,450 94 | 2,12.69,1.53,2.26,20.7,80,1.38,1.46,.58,1.62,3.05,.96,2.06,495 95 | 2,12.29,2.83,2.22,18,88,2.45,2.25,.25,1.99,2.15,1.15,3.3,290 96 | 2,11.62,1.99,2.28,18,98,3.02,2.26,.17,1.35,3.25,1.16,2.96,345 97 | 2,12.47,1.52,2.2,19,162,2.5,2.27,.32,3.28,2.6,1.16,2.63,937 98 | 2,11.81,2.12,2.74,21.5,134,1.6,.99,.14,1.56,2.5,.95,2.26,625 99 | 2,12.29,1.41,1.98,16,85,2.55,2.5,.29,1.77,2.9,1.23,2.74,428 100 | 2,12.37,1.07,2.1,18.5,88,3.52,3.75,.24,1.95,4.5,1.04,2.77,660 101 | 2,12.29,3.17,2.21,18,88,2.85,2.99,.45,2.81,2.3,1.42,2.83,406 102 | 2,12.08,2.08,1.7,17.5,97,2.23,2.17,.26,1.4,3.3,1.27,2.96,710 103 | 2,12.6,1.34,1.9,18.5,88,1.45,1.36,.29,1.35,2.45,1.04,2.77,562 104 | 2,12.34,2.45,2.46,21,98,2.56,2.11,.34,1.31,2.8,.8,3.38,438 105 | 2,11.82,1.72,1.88,19.5,86,2.5,1.64,.37,1.42,2.06,.94,2.44,415 106 | 2,12.51,1.73,1.98,20.5,85,2.2,1.92,.32,1.48,2.94,1.04,3.57,672 107 | 2,12.42,2.55,2.27,22,90,1.68,1.84,.66,1.42,2.7,.86,3.3,315 108 | 2,12.25,1.73,2.12,19,80,1.65,2.03,.37,1.63,3.4,1,3.17,510 109 | 2,12.72,1.75,2.28,22.5,84,1.38,1.76,.48,1.63,3.3,.88,2.42,488 110 | 2,12.22,1.29,1.94,19,92,2.36,2.04,.39,2.08,2.7,.86,3.02,312 111 | 2,11.61,1.35,2.7,20,94,2.74,2.92,.29,2.49,2.65,.96,3.26,680 112 | 2,11.46,3.74,1.82,19.5,107,3.18,2.58,.24,3.58,2.9,.75,2.81,562 113 | 2,12.52,2.43,2.17,21,88,2.55,2.27,.26,1.22,2,.9,2.78,325 114 | 2,11.76,2.68,2.92,20,103,1.75,2.03,.6,1.05,3.8,1.23,2.5,607 115 | 2,11.41,.74,2.5,21,88,2.48,2.01,.42,1.44,3.08,1.1,2.31,434 116 | 2,12.08,1.39,2.5,22.5,84,2.56,2.29,.43,1.04,2.9,.93,3.19,385 117 | 2,11.03,1.51,2.2,21.5,85,2.46,2.17,.52,2.01,1.9,1.71,2.87,407 118 | 
2,11.82,1.47,1.99,20.8,86,1.98,1.6,.3,1.53,1.95,.95,3.33,495 119 | 2,12.42,1.61,2.19,22.5,108,2,2.09,.34,1.61,2.06,1.06,2.96,345 120 | 2,12.77,3.43,1.98,16,80,1.63,1.25,.43,.83,3.4,.7,2.12,372 121 | 2,12,3.43,2,19,87,2,1.64,.37,1.87,1.28,.93,3.05,564 122 | 2,11.45,2.4,2.42,20,96,2.9,2.79,.32,1.83,3.25,.8,3.39,625 123 | 2,11.56,2.05,3.23,28.5,119,3.18,5.08,.47,1.87,6,.93,3.69,465 124 | 2,12.42,4.43,2.73,26.5,102,2.2,2.13,.43,1.71,2.08,.92,3.12,365 125 | 2,13.05,5.8,2.13,21.5,86,2.62,2.65,.3,2.01,2.6,.73,3.1,380 126 | 2,11.87,4.31,2.39,21,82,2.86,3.03,.21,2.91,2.8,.75,3.64,380 127 | 2,12.07,2.16,2.17,21,85,2.6,2.65,.37,1.35,2.76,.86,3.28,378 128 | 2,12.43,1.53,2.29,21.5,86,2.74,3.15,.39,1.77,3.94,.69,2.84,352 129 | 2,11.79,2.13,2.78,28.5,92,2.13,2.24,.58,1.76,3,.97,2.44,466 130 | 2,12.37,1.63,2.3,24.5,88,2.22,2.45,.4,1.9,2.12,.89,2.78,342 131 | 2,12.04,4.3,2.38,22,80,2.1,1.75,.42,1.35,2.6,.79,2.57,580 132 | 3,12.86,1.35,2.32,18,122,1.51,1.25,.21,.94,4.1,.76,1.29,630 133 | 3,12.88,2.99,2.4,20,104,1.3,1.22,.24,.83,5.4,.74,1.42,530 134 | 3,12.81,2.31,2.4,24,98,1.15,1.09,.27,.83,5.7,.66,1.36,560 135 | 3,12.7,3.55,2.36,21.5,106,1.7,1.2,.17,.84,5,.78,1.29,600 136 | 3,12.51,1.24,2.25,17.5,85,2,.58,.6,1.25,5.45,.75,1.51,650 137 | 3,12.6,2.46,2.2,18.5,94,1.62,.66,.63,.94,7.1,.73,1.58,695 138 | 3,12.25,4.72,2.54,21,89,1.38,.47,.53,.8,3.85,.75,1.27,720 139 | 3,12.53,5.51,2.64,25,96,1.79,.6,.63,1.1,5,.82,1.69,515 140 | 3,13.49,3.59,2.19,19.5,88,1.62,.48,.58,.88,5.7,.81,1.82,580 141 | 3,12.84,2.96,2.61,24,101,2.32,.6,.53,.81,4.92,.89,2.15,590 142 | 3,12.93,2.81,2.7,21,96,1.54,.5,.53,.75,4.6,.77,2.31,600 143 | 3,13.36,2.56,2.35,20,89,1.4,.5,.37,.64,5.6,.7,2.47,780 144 | 3,13.52,3.17,2.72,23.5,97,1.55,.52,.5,.55,4.35,.89,2.06,520 145 | 3,13.62,4.95,2.35,20,92,2,.8,.47,1.02,4.4,.91,2.05,550 146 | 3,12.25,3.88,2.2,18.5,112,1.38,.78,.29,1.14,8.21,.65,2,855 147 | 3,13.16,3.57,2.15,21,102,1.5,.55,.43,1.3,4,.6,1.68,830 148 | 3,13.88,5.04,2.23,20,80,.98,.34,.4,.68,4.9,.58,1.33,415 149 | 3,12.87,4.61,2.48,21.5,86,1.7,.65,.47,.86,7.65,.54,1.86,625 150 | 3,13.32,3.24,2.38,21.5,92,1.93,.76,.45,1.25,8.42,.55,1.62,650 151 | 3,13.08,3.9,2.36,21.5,113,1.41,1.39,.34,1.14,9.40,.57,1.33,550 152 | 3,13.5,3.12,2.62,24,123,1.4,1.57,.22,1.25,8.60,.59,1.3,500 153 | 3,12.79,2.67,2.48,22,112,1.48,1.36,.24,1.26,10.8,.48,1.47,480 154 | 3,13.11,1.9,2.75,25.5,116,2.2,1.28,.26,1.56,7.1,.61,1.33,425 155 | 3,13.23,3.3,2.28,18.5,98,1.8,.83,.61,1.87,10.52,.56,1.51,675 156 | 3,12.58,1.29,2.1,20,103,1.48,.58,.53,1.4,7.6,.58,1.55,640 157 | 3,13.17,5.19,2.32,22,93,1.74,.63,.61,1.55,7.9,.6,1.48,725 158 | 3,13.84,4.12,2.38,19.5,89,1.8,.83,.48,1.56,9.01,.57,1.64,480 159 | 3,12.45,3.03,2.64,27,97,1.9,.58,.63,1.14,7.5,.67,1.73,880 160 | 3,14.34,1.68,2.7,25,98,2.8,1.31,.53,2.7,13,.57,1.96,660 161 | 3,13.48,1.67,2.64,22.5,89,2.6,1.1,.52,2.29,11.75,.57,1.78,620 162 | 3,12.36,3.83,2.38,21,88,2.3,.92,.5,1.04,7.65,.56,1.58,520 163 | 3,13.69,3.26,2.54,20,107,1.83,.56,.5,.8,5.88,.96,1.82,680 164 | 3,12.85,3.27,2.58,22,106,1.65,.6,.6,.96,5.58,.87,2.11,570 165 | 3,12.96,3.45,2.35,18.5,106,1.39,.7,.4,.94,5.28,.68,1.75,675 166 | 3,13.78,2.76,2.3,22,90,1.35,.68,.41,1.03,9.58,.7,1.68,615 167 | 3,13.73,4.36,2.26,22.5,88,1.28,.47,.52,1.15,6.62,.78,1.75,520 168 | 3,13.45,3.7,2.6,23,111,1.7,.92,.43,1.46,10.68,.85,1.56,695 169 | 3,12.82,3.37,2.3,19.5,88,1.48,.66,.4,.97,10.26,.72,1.75,685 170 | 3,13.58,2.58,2.69,24.5,105,1.55,.84,.39,1.54,8.66,.74,1.8,750 171 | 3,13.4,4.6,2.86,25,112,1.98,.96,.27,1.11,8.5,.67,1.92,630 172 | 
3,12.2,3.03,2.32,19,96,1.25,.49,.4,.73,5.5,.66,1.83,510 173 | 3,12.77,2.39,2.28,19.5,86,1.39,.51,.48,.64,9.899999,.57,1.63,470 174 | 3,14.16,2.51,2.48,20,91,1.68,.7,.44,1.24,9.7,.62,1.71,660 175 | 3,13.71,5.65,2.45,20.5,95,1.68,.61,.52,1.06,7.7,.64,1.74,740 176 | 3,13.4,3.91,2.48,23,102,1.8,.75,.43,1.41,7.3,.7,1.56,750 177 | 3,13.27,4.28,2.26,20,120,1.59,.69,.43,1.35,10.2,.59,1.56,835 178 | 3,13.17,2.59,2.37,20,120,1.65,.68,.53,1.46,9.3,.6,1.62,840 179 | 3,14.13,4.1,2.74,24.5,96,2.05,.76,.56,1.35,9.2,.61,1.6,560 -------------------------------------------------------------------------------- /6/convolution.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 4, 6 | "metadata": { 7 | "collapsed": false 8 | }, 9 | "outputs": [], 10 | "source": [ 11 | "import tensorflow as tf\n", 12 | "\n", 13 | "#Generate the filename queue, and read the gif files contents\n", 14 | "filename_queue = tf.train.string_input_producer(tf.train.match_filenames_once(\"data/test.gif\"))\n", 15 | "reader = tf.WholeFileReader()\n", 16 | "key, value = reader.read(filename_queue)\n", 17 | "image=tf.image.decode_gif(value)\n", 18 | "\n", 19 | "#Define the kernel parameters\n", 20 | "kernel=tf.constant(\n", 21 | " [\n", 22 | "  [[[-1.]],[[-1.]],[[-1.]]],\n", 23 | "  [[[-1.]],[[8.]],[[-1.]]],\n", 24 | "  [[[-1.]],[[-1.]],[[-1.]]]\n", 25 | " ] \n", 26 | " )\n", 27 | "\n", 28 | "#Define the train coordinator\n", 29 | "coord = tf.train.Coordinator()\n", 30 | "\n", 31 | "with tf.Session() as sess:\n", 32 | " tf.initialize_all_variables().run()\n", 33 | " threads = tf.train.start_queue_runners(coord=coord)\n", 34 | " #Get first image\n", 35 | " image_tensor = tf.image.rgb_to_grayscale(sess.run([image])[0])\n", 36 | " #apply convolution, preserving the image size\n", 37 | " imagen_convoluted_tensor=tf.nn.conv2d(tf.cast(image_tensor, tf.float32),kernel,[1,1,1,1],\"SAME\")\n", 38 | " #Prepare to save the convolution output\n", 39 | " file=open (\"blur2.png\", \"wb+\")\n", 40 | " #Scale first, then cast to uint8 (0..255), because the convolution can alter the dynamic range of the final image\n", 41 | " out=tf.image.encode_png(tf.reshape(tf.cast(imagen_convoluted_tensor/tf.reduce_max(imagen_convoluted_tensor)*255.,tf.uint8), tf.shape(imagen_convoluted_tensor.eval()[0]).eval()))\n", 42 | " file.write(out.eval())\n", 43 | " file.close()\n", 44 | " coord.request_stop()\n", 45 | "coord.join(threads)" 46 | ] 47 | } 48 | ], 49 | "metadata": { 50 | "kernelspec": { 51 | "display_name": "Python 2", 52 | "language": "python", 53 | "name": "python2" 54 | }, 55 | "language_info": { 56 | "codemirror_mode": { 57 | "name": "ipython", 58 | "version": 2 59 | }, 60 | "file_extension": ".py", 61 | "mimetype": "text/x-python", 62 | "name": "python", 63 | "nbconvert_exporter": "python", 64 | "pygments_lexer": "ipython2", 65 | "version": "2.7.11+" 66 | } 67 | }, 68 | "nbformat": 4, 69 | "nbformat_minor": 0 70 | } 71 | -------------------------------------------------------------------------------- /6/data/blue_jay.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/data/blue_jay.jpg -------------------------------------------------------------------------------- /6/data/cifar-10-batches-bin/batches.meta.txt: 
-------------------------------------------------------------------------------- 1 | airplane 2 | automobile 3 | bird 4 | cat 5 | deer 6 | dog 7 | frog 8 | horse 9 | ship 10 | truck 11 | 12 | -------------------------------------------------------------------------------- /6/data/cifar-10-batches-bin/data_batch_1.bin: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/data/cifar-10-batches-bin/data_batch_1.bin -------------------------------------------------------------------------------- /6/data/cifar-10-batches-bin/readme.html: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /6/data/cifar-10-batches-bin/test_batch.bin: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/data/cifar-10-batches-bin/test_batch.bin -------------------------------------------------------------------------------- /6/data/leopard.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/data/leopard.jpg -------------------------------------------------------------------------------- /6/data/test.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/data/test.gif -------------------------------------------------------------------------------- /6/data/test2.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/data/test2.gif -------------------------------------------------------------------------------- /6/image_subsampling.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 35, 6 | "metadata": { 7 | "collapsed": false 8 | }, 9 | "outputs": [], 10 | "source": [ 11 | "import tensorflow as tf\n", 12 | "\n", 13 | "#Generate the filename queue, and read the gif files contents\n", 14 | "filename_queue = tf.train.string_input_producer(tf.train.match_filenames_once(\"data/test.gif\"))\n", 15 | "reader = tf.WholeFileReader()\n", 16 | "key, value = reader.read(filename_queue)\n", 17 | "image=tf.image.decode_gif(value)\n", 18 | "\n", 19 | "#Define the coordinator\n", 20 | "coord = tf.train.Coordinator()\n", 21 | "\n", 22 | "def normalize_and_encode (img_tensor):\n", 23 | " image_dimensions = tf.shape(img_tensor.eval()[0]).eval()\n", 24 | " return tf.image.encode_jpeg(tf.reshape(tf.cast(img_tensor, tf.uint8), image_dimensions))\n", 25 | "\n", 26 | "with tf.Session() as sess:\n", 27 | " maxfile=open (\"maxpool.jpeg\", \"wb+\")\n", 28 | " avgfile=open (\"avgpool.jpeg\", \"wb+\")\n", 29 | " tf.initialize_all_variables().run()\n", 30 | " threads = tf.train.start_queue_runners(coord=coord)\n", 31 | " \n", 32 | " image_tensor = 
tf.image.rgb_to_grayscale(sess.run([image])[0])\n", 33 | " \n", 34 | " maxed_tensor=tf.nn.max_pool(tf.cast(image_tensor, tf.float32),[1,2,2,1],[1,2,2,1],\"SAME\")\n", 35 | " averaged_tensor=tf.nn.avg_pool(tf.cast(image_tensor, tf.float32),[1,2,2,1],[1,2,2,1],\"SAME\")\n", 36 | " \n", 37 | " maxfile.write(normalize_and_encode(maxed_tensor).eval())\n", 38 | " avgfile.write(normalize_and_encode(averaged_tensor).eval())\n", 39 | " coord.request_stop()\n", 40 | " maxfile.close()\n", 41 | " avgfile.close()\n", 42 | "coord.join(threads)" 43 | ] 44 | } 45 | ], 46 | "metadata": { 47 | "kernelspec": { 48 | "display_name": "Python 2", 49 | "language": "python", 50 | "name": "python2" 51 | }, 52 | "language_info": { 53 | "codemirror_mode": { 54 | "name": "ipython", 55 | "version": 2 56 | }, 57 | "file_extension": ".py", 58 | "mimetype": "text/x-python", 59 | "name": "python", 60 | "nbconvert_exporter": "python", 61 | "pygments_lexer": "ipython2", 62 | "version": "2.7.11+" 63 | } 64 | }, 65 | "nbformat": 4, 66 | "nbformat_minor": 0 67 | } 68 | -------------------------------------------------------------------------------- /6/old/convolution.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 98, 6 | "metadata": { 7 | "collapsed": false 8 | }, 9 | "outputs": [ 10 | { 11 | "name": "stdout", 12 | "output_type": "stream", 13 | "text": [ 14 | "267.0\n", 15 | "\n" 16 | ] 17 | } 18 | ], 19 | "source": [ 20 | "import tensorflow as tf\n", 21 | "\n", 22 | "#Generate the filename queue, and read the gif files contents\n", 23 | "filename_queue = tf.train.string_input_producer(tf.train.match_filenames_once(\"data/test.gif\"))\n", 24 | "reader = tf.WholeFileReader()\n", 25 | "key, value = reader.read(filename_queue)\n", 26 | "image=tf.image.decode_gif(value)\n", 27 | "\n", 28 | "#Define the kernel parameters\n", 29 | "kernel=tf.constant(\n", 30 | " [\n", 31 | " [[[-1.]],[[-1.]],[[-1.]]],\n", 32 | " [[[-1.]],[[8.]],[[-1.]]],\n", 33 | " [[[-1.]],[[-1.]],[[-1.]]]\n", 34 | " ] \n", 35 | " )\n", 36 | "\n", 37 | "#Define the train coordinator\n", 38 | "coord = tf.train.Coordinator()\n", 39 | "\n", 40 | "with tf.Session() as sess:\n", 41 | " tf.initialize_all_variables().run()\n", 42 | " threads = tf.train.start_queue_runners(coord=coord)\n", 43 | " #Get first image\n", 44 | " image_tensor = tf.image.rgb_to_grayscale(sess.run([image])[0])\n", 45 | " #apply convolution, preserving the image size\n", 46 | " imagen_convoluted_tensor=tf.nn.conv2d(tf.cast(image_tensor, tf.float32),kernel,[1,1,1,1],\"SAME\")\n", 47 | " #Prepare to save the convolution option\n", 48 | " file=open (\"blur2.jpeg\", \"wb+\")\n", 49 | " #Cast to uint8 (0..255), previous scalation, because the convolution could alter the scale of the final image\n", 50 | " out=tf.image.encode_jpeg(tf.reshape(tf.cast(imagen_convoluted_tensor/tf.reduce_max(imagen_convoluted_tensor)*255.,tf.uint8), tf.shape(imagen_convoluted_tensor.eval()[0]).eval()))\n", 51 | " file.close()\n", 52 | " coord.request_stop()\n", 53 | "coord.join(threads)" 54 | ] 55 | } 56 | ], 57 | "metadata": { 58 | "kernelspec": { 59 | "display_name": "Python 2", 60 | "language": "python", 61 | "name": "python2" 62 | }, 63 | "language_info": { 64 | "codemirror_mode": { 65 | "name": "ipython", 66 | "version": 2 67 | }, 68 | "file_extension": ".py", 69 | "mimetype": "text/x-python", 70 | "name": "python", 71 | "nbconvert_exporter": "python", 72 | "pygments_lexer": "ipython2", 73 | "version": 
"2.7.11+" 74 | } 75 | }, 76 | "nbformat": 4, 77 | "nbformat_minor": 0 78 | } 79 | -------------------------------------------------------------------------------- /6/old/data/blue_jay.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/old/data/blue_jay.jpg -------------------------------------------------------------------------------- /6/old/data/leopard.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/old/data/leopard.jpg -------------------------------------------------------------------------------- /6/old/data/test.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/old/data/test.gif -------------------------------------------------------------------------------- /6/old/data/test2.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/6/old/data/test2.gif -------------------------------------------------------------------------------- /6/old/image_subsampling.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 35, 6 | "metadata": { 7 | "collapsed": false 8 | }, 9 | "outputs": [], 10 | "source": [ 11 | "import tensorflow as tf\n", 12 | "\n", 13 | "#Generate the filename queue, and read the gif files contents\n", 14 | "filename_queue = tf.train.string_input_producer(tf.train.match_filenames_once(\"data/test.gif\"))\n", 15 | "reader = tf.WholeFileReader()\n", 16 | "key, value = reader.read(filename_queue)\n", 17 | "image=tf.image.decode_gif(value)\n", 18 | "\n", 19 | "#Define the coordinator\n", 20 | "coord = tf.train.Coordinator()\n", 21 | "\n", 22 | "def normalize_and_encode (img_tensor):\n", 23 | " image_dimensions = tf.shape(img_tensor.eval()[0]).eval()\n", 24 | " return tf.image.encode_jpeg(tf.reshape(tf.cast(img_tensor, tf.uint8), image_dimensions))\n", 25 | "\n", 26 | "with tf.Session() as sess:\n", 27 | " maxfile=open (\"maxpool.jpeg\", \"wb+\")\n", 28 | " avgfile=open (\"avgpool.jpeg\", \"wb+\")\n", 29 | " tf.initialize_all_variables().run()\n", 30 | " threads = tf.train.start_queue_runners(coord=coord)\n", 31 | " \n", 32 | " image_tensor = tf.image.rgb_to_grayscale(sess.run([image])[0])\n", 33 | " \n", 34 | " maxed_tensor=tf.nn.avg_pool(tf.cast(image_tensor, tf.float32),[1,2,2,1],[1,2,2,1],\"SAME\")\n", 35 | " averaged_tensor=tf.nn.avg_pool(tf.cast(image_tensor, tf.float32),[1,2,2,1],[1,2,2,1],\"SAME\")\n", 36 | " \n", 37 | " maxfile.write(normalize_and_encode(maxed_tensor).eval())\n", 38 | " avgfile.write(normalize_and_encode(averaged_tensor).eval())\n", 39 | " coord.request_stop()\n", 40 | " maxfile.close()\n", 41 | " avgfile.close()\n", 42 | "coord.join(threads)" 43 | ] 44 | } 45 | ], 46 | "metadata": { 47 | "kernelspec": { 48 | "display_name": "Python 2", 49 | "language": "python", 50 | "name": "python2" 51 | }, 52 | "language_info": { 53 | "codemirror_mode": { 54 | 
"name": "ipython", 55 | "version": 2 56 | }, 57 | "file_extension": ".py", 58 | "mimetype": "text/x-python", 59 | "name": "python", 60 | "nbconvert_exporter": "python", 61 | "pygments_lexer": "ipython2", 62 | "version": "2.7.11+" 63 | } 64 | }, 65 | "nbformat": 4, 66 | "nbformat_minor": 0 67 | } 68 | -------------------------------------------------------------------------------- /7/Code/CH7_time_series.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import pandas as pd 3 | import tensorflow as tf 4 | from matplotlib import pyplot as plt 5 | 6 | 7 | from tensorflow.python.framework import dtypes 8 | from tensorflow.contrib import learn 9 | 10 | import logging 11 | logging.basicConfig(level=logging.INFO) 12 | 13 | 14 | from tensorflow.contrib import learn 15 | from sklearn.metrics import mean_squared_error 16 | 17 | LOG_DIR = './ops_logs' 18 | TIMESTEPS = 5 19 | RNN_LAYERS = [{'steps': TIMESTEPS}] 20 | DENSE_LAYERS = None 21 | TRAINING_STEPS = 10000 22 | BATCH_SIZE = 100 23 | PRINT_STEPS = TRAINING_STEPS / 100 24 | 25 | def lstm_model(time_steps, rnn_layers, dense_layers=None): 26 | def lstm_cells(layers): 27 | return [tf.nn.rnn_cell.BasicLSTMCell(layer['steps'],state_is_tuple=True) 28 | for layer in layers] 29 | 30 | def dnn_layers(input_layers, layers): 31 | return input_layers 32 | 33 | def _lstm_model(X, y): 34 | stacked_lstm = tf.nn.rnn_cell.MultiRNNCell(lstm_cells(rnn_layers), state_is_tuple=True) 35 | x_ = learn.ops.split_squeeze(1, time_steps, X) 36 | output, layers = tf.nn.rnn(stacked_lstm, x_, dtype=dtypes.float32) 37 | output = dnn_layers(output[-1], dense_layers) 38 | return learn.models.linear_regression(output, y) 39 | 40 | return _lstm_model 41 | 42 | 43 | regressor = learn.TensorFlowEstimator(model_fn=lstm_model(TIMESTEPS, RNN_LAYERS, DENSE_LAYERS), n_classes=0, 44 | verbose=2, steps=TRAINING_STEPS, optimizer='Adagrad', 45 | learning_rate=0.03, batch_size=BATCH_SIZE) 46 | 47 | df = pd.read_csv("data/elec_load.csv", error_bad_lines=False) 48 | plt.subplot() 49 | plot_test, = plt.plot(df.values[:1500], label='Load') 50 | plt.legend(handles=[plot_test]) 51 | 52 | 53 | print df.describe() 54 | array=(df.values- 147.0) /339.0 55 | plt.subplot() 56 | plot_test, = plt.plot(array[:1500], label='Normalized Load') 57 | plt.legend(handles=[plot_test]) 58 | 59 | 60 | listX = [] 61 | listy = [] 62 | X={} 63 | y={} 64 | 65 | for i in range(0,len(array)-6): 66 | listX.append(array[i:i+5].reshape([5,1])) 67 | listy.append(array[i+6]) 68 | 69 | arrayX=np.array(listX) 70 | arrayy=np.array(listy) 71 | 72 | 73 | X['train']=arrayX[0:12000] 74 | X['test']=arrayX[12000:13000] 75 | X['val']=arrayX[13000:14000] 76 | 77 | y['train']=arrayy[0:12000] 78 | y['test']=arrayy[12000:13000] 79 | y['val']=arrayy[13000:14000] 80 | 81 | 82 | # print y['test'][0] 83 | # print y2['test'][0] 84 | 85 | 86 | #X1, y2 = generate_data(np.sin, np.linspace(0, 100, 10000), TIMESTEPS, seperate=False) 87 | # create a lstm instance and validation monitor 88 | validation_monitor = learn.monitors.ValidationMonitor(X['val'], y['val'], 89 | every_n_steps=PRINT_STEPS, 90 | early_stopping_rounds=1000) 91 | 92 | regressor.fit(X['train'], y['train'], monitors=[validation_monitor], logdir=LOG_DIR) 93 | 94 | predicted = regressor.predict(X['test']) 95 | rmse = np.sqrt(((predicted - y['test']) ** 2).mean(axis=0)) 96 | score = mean_squared_error(predicted, y['test']) 97 | print ("MSE: %f" % score) 98 | 99 | #plot_predicted, = plt.plot(array[:1000], label='predicted') 100 | 
101 | plt.subplot() 102 | plot_predicted, = plt.plot(predicted, label='predicted') 103 | 104 | plot_test, = plt.plot(y['test'], label='test') 105 | plt.legend(handles=[plot_predicted, plot_test]) 106 | 107 | 108 | 109 | 110 | 111 | -------------------------------------------------------------------------------- /7/Code/model.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | from tensorflow.contrib import rnn 3 | from tensorflow.contrib import legacy_seq2seq 4 | 5 | import numpy as np 6 | 7 | 8 | class Model(): 9 | def __init__(self, args, training=True): 10 | self.args = args 11 | if not training: 12 | args.batch_size = 1 13 | args.seq_length = 1 14 | 15 | if args.model == 'rnn': 16 | cell_fn = rnn.BasicRNNCell 17 | elif args.model == 'gru': 18 | cell_fn = rnn.GRUCell 19 | elif args.model == 'lstm': 20 | cell_fn = rnn.BasicLSTMCell 21 | elif args.model == 'nas': 22 | cell_fn = rnn.NASCell 23 | else: 24 | raise Exception("model type not supported: {}".format(args.model)) 25 | 26 | cells = [] 27 | for _ in range(args.num_layers): 28 | cell = cell_fn(args.rnn_size) 29 | if training and (args.output_keep_prob < 1.0 or args.input_keep_prob < 1.0): 30 | cell = rnn.DropoutWrapper(cell, 31 | input_keep_prob=args.input_keep_prob, 32 | output_keep_prob=args.output_keep_prob) 33 | cells.append(cell) 34 | 35 | self.cell = cell = rnn.MultiRNNCell(cells, state_is_tuple=True) 36 | 37 | self.input_data = tf.placeholder( 38 | tf.int32, [args.batch_size, args.seq_length]) 39 | self.targets = tf.placeholder( 40 | tf.int32, [args.batch_size, args.seq_length]) 41 | self.initial_state = cell.zero_state(args.batch_size, tf.float32) 42 | 43 | with tf.variable_scope('rnnlm'): 44 | softmax_w = tf.get_variable("softmax_w", 45 | [args.rnn_size, args.vocab_size]) 46 | softmax_b = tf.get_variable("softmax_b", [args.vocab_size]) 47 | 48 | embedding = tf.get_variable("embedding", [args.vocab_size, args.rnn_size]) 49 | inputs = tf.nn.embedding_lookup(embedding, self.input_data) 50 | 51 | # dropout beta testing: double check which one should affect next line 52 | if training and args.output_keep_prob: 53 | inputs = tf.nn.dropout(inputs, args.output_keep_prob) 54 | 55 | inputs = tf.split(inputs, args.seq_length, 1) 56 | inputs = [tf.squeeze(input_, [1]) for input_ in inputs] 57 | 58 | def loop(prev, _): 59 | prev = tf.matmul(prev, softmax_w) + softmax_b 60 | prev_symbol = tf.stop_gradient(tf.argmax(prev, 1)) 61 | return tf.nn.embedding_lookup(embedding, prev_symbol) 62 | 63 | outputs, last_state = legacy_seq2seq.rnn_decoder(inputs, self.initial_state, cell, loop_function=loop if not training else None, scope='rnnlm') 64 | output = tf.reshape(tf.concat(outputs, 1), [-1, args.rnn_size]) 65 | 66 | 67 | self.logits = tf.matmul(output, softmax_w) + softmax_b 68 | self.probs = tf.nn.softmax(self.logits) 69 | loss = legacy_seq2seq.sequence_loss_by_example( 70 | [self.logits], 71 | [tf.reshape(self.targets, [-1])], 72 | [tf.ones([args.batch_size * args.seq_length])]) 73 | self.cost = tf.reduce_sum(loss) / args.batch_size / args.seq_length 74 | with tf.name_scope('cost'): 75 | self.cost = tf.reduce_sum(loss) / args.batch_size / args.seq_length 76 | self.final_state = last_state 77 | self.lr = tf.Variable(0.0, trainable=False) 78 | tvars = tf.trainable_variables() 79 | grads, _ = tf.clip_by_global_norm(tf.gradients(self.cost, tvars), 80 | args.grad_clip) 81 | with tf.name_scope('optimizer'): 82 | optimizer = tf.train.AdamOptimizer(self.lr) 83 | self.train_op = 
optimizer.apply_gradients(zip(grads, tvars)) 84 | 85 | # instrument tensorboard 86 | tf.summary.histogram('logits', self.logits) 87 | tf.summary.histogram('loss', loss) 88 | tf.summary.scalar('train_loss', self.cost) 89 | 90 | def sample(self, sess, chars, vocab, num=200, prime='The ', sampling_type=1): 91 | state = sess.run(self.cell.zero_state(1, tf.float32)) 92 | for char in prime[:-1]: 93 | x = np.zeros((1, 1)) 94 | x[0, 0] = vocab[char] 95 | feed = {self.input_data: x, self.initial_state: state} 96 | [state] = sess.run([self.final_state], feed) 97 | 98 | def weighted_pick(weights): 99 | t = np.cumsum(weights) 100 | s = np.sum(weights) 101 | return(int(np.searchsorted(t, np.random.rand(1)*s))) 102 | 103 | ret = prime 104 | char = prime[-1] 105 | for n in range(num): 106 | x = np.zeros((1, 1)) 107 | x[0, 0] = vocab[char] 108 | feed = {self.input_data: x, self.initial_state: state} 109 | [probs, state] = sess.run([self.probs, self.final_state], feed) 110 | p = probs[0] 111 | 112 | if sampling_type == 0: 113 | sample = np.argmax(p) 114 | elif sampling_type == 2: 115 | if char == ' ': 116 | sample = weighted_pick(p) 117 | else: 118 | sample = np.argmax(p) 119 | else: # sampling_type == 1 default: 120 | sample = weighted_pick(p) 121 | 122 | pred = chars[sample] 123 | ret += pred 124 | char = pred 125 | return ret 126 | -------------------------------------------------------------------------------- /7/Code/sample.py: -------------------------------------------------------------------------------- 1 | from __future__ import print_function 2 | import tensorflow as tf 3 | 4 | import argparse 5 | import os 6 | from six.moves import cPickle 7 | 8 | from model import Model 9 | 10 | from six import text_type 11 | 12 | 13 | def main(): 14 | parser = argparse.ArgumentParser( 15 | formatter_class=argparse.ArgumentDefaultsHelpFormatter) 16 | parser.add_argument('--save_dir', type=str, default='save', 17 | help='model directory to store checkpointed models') 18 | parser.add_argument('-n', type=int, default=500, 19 | help='number of characters to sample') 20 | parser.add_argument('--prime', type=text_type, default=u' ', 21 | help='prime text') 22 | parser.add_argument('--sample', type=int, default=1, 23 | help='0 to use max at each timestep, 1 to sample at ' 24 | 'each timestep, 2 to sample on spaces') 25 | 26 | args = parser.parse_args() 27 | sample(args) 28 | 29 | 30 | def sample(args): 31 | with open(os.path.join(args.save_dir, 'config.pkl'), 'rb') as f: 32 | saved_args = cPickle.load(f) 33 | with open(os.path.join(args.save_dir, 'chars_vocab.pkl'), 'rb') as f: 34 | chars, vocab = cPickle.load(f) 35 | model = Model(saved_args, training=False) 36 | with tf.Session() as sess: 37 | tf.global_variables_initializer().run() 38 | saver = tf.train.Saver(tf.global_variables()) 39 | ckpt = tf.train.get_checkpoint_state(args.save_dir) 40 | if ckpt and ckpt.model_checkpoint_path: 41 | saver.restore(sess, ckpt.model_checkpoint_path) 42 | print(model.sample(sess, chars, vocab, args.n, args.prime, 43 | args.sample).encode('utf-8')) 44 | 45 | if __name__ == '__main__': 46 | main() 47 | -------------------------------------------------------------------------------- /7/Code/save/.gitignore: -------------------------------------------------------------------------------- 1 | # Ignore everything in this directory 2 | * 3 | # Except this file 4 | !.gitignore 5 | -------------------------------------------------------------------------------- /7/Code/train.py: 
-------------------------------------------------------------------------------- 1 | from __future__ import print_function 2 | import tensorflow as tf 3 | 4 | import argparse 5 | import time 6 | import os 7 | from six.moves import cPickle 8 | 9 | from utils import TextLoader 10 | from model import Model 11 | 12 | 13 | def main(): 14 | parser = argparse.ArgumentParser( 15 | formatter_class=argparse.ArgumentDefaultsHelpFormatter) 16 | parser.add_argument('--data_dir', type=str, default='data/tinyshakespeare', 17 | help='data directory containing input.txt') 18 | parser.add_argument('--save_dir', type=str, default='save', 19 | help='directory to store checkpointed models') 20 | parser.add_argument('--log_dir', type=str, default='logs', 21 | help='directory to store tensorboard logs') 22 | parser.add_argument('--rnn_size', type=int, default=128, 23 | help='size of RNN hidden state') 24 | parser.add_argument('--num_layers', type=int, default=2, 25 | help='number of layers in the RNN') 26 | parser.add_argument('--model', type=str, default='lstm', 27 | help='rnn, gru, lstm, or nas') 28 | parser.add_argument('--batch_size', type=int, default=50, 29 | help='minibatch size') 30 | parser.add_argument('--seq_length', type=int, default=50, 31 | help='RNN sequence length') 32 | parser.add_argument('--num_epochs', type=int, default=50, 33 | help='number of epochs') 34 | parser.add_argument('--save_every', type=int, default=1000, 35 | help='save frequency') 36 | parser.add_argument('--grad_clip', type=float, default=5., 37 | help='clip gradients at this value') 38 | parser.add_argument('--learning_rate', type=float, default=0.002, 39 | help='learning rate') 40 | parser.add_argument('--decay_rate', type=float, default=0.97, 41 | help='decay rate for rmsprop') 42 | parser.add_argument('--output_keep_prob', type=float, default=1.0, 43 | help='probability of keeping weights in the hidden layer') 44 | parser.add_argument('--input_keep_prob', type=float, default=1.0, 45 | help='probability of keeping weights in the input layer') 46 | parser.add_argument('--init_from', type=str, default=None, 47 | help="""continue training from saved model at this path. Path must contain files saved by previous training process: 48 | 'config.pkl' : configuration; 49 | 'chars_vocab.pkl' : vocabulary definitions; 50 | 'checkpoint' : paths to model file(s) (created by tf). 
51 | Note: this file contains absolute paths, be careful when moving files around; 52 | 'model.ckpt-*' : file(s) with model definition (created by tf) 53 | """) 54 | args = parser.parse_args() 55 | train(args) 56 | 57 | 58 | def train(args): 59 | data_loader = TextLoader(args.data_dir, args.batch_size, args.seq_length) 60 | args.vocab_size = data_loader.vocab_size 61 | 62 | # check compatibility if training is continued from previously saved model 63 | if args.init_from is not None: 64 | # check if all necessary files exist 65 | assert os.path.isdir(args.init_from)," %s must be a a path" % args.init_from 66 | assert os.path.isfile(os.path.join(args.init_from,"config.pkl")),"config.pkl file does not exist in path %s"%args.init_from 67 | assert os.path.isfile(os.path.join(args.init_from,"chars_vocab.pkl")),"chars_vocab.pkl.pkl file does not exist in path %s" % args.init_from 68 | ckpt = tf.train.get_checkpoint_state(args.init_from) 69 | assert ckpt, "No checkpoint found" 70 | assert ckpt.model_checkpoint_path, "No model path found in checkpoint" 71 | 72 | # open old config and check if models are compatible 73 | with open(os.path.join(args.init_from, 'config.pkl'), 'rb') as f: 74 | saved_model_args = cPickle.load(f) 75 | need_be_same = ["model", "rnn_size", "num_layers", "seq_length"] 76 | for checkme in need_be_same: 77 | assert vars(saved_model_args)[checkme]==vars(args)[checkme],"Command line argument and saved model disagree on '%s' "%checkme 78 | 79 | # open saved vocab/dict and check if vocabs/dicts are compatible 80 | with open(os.path.join(args.init_from, 'chars_vocab.pkl'), 'rb') as f: 81 | saved_chars, saved_vocab = cPickle.load(f) 82 | assert saved_chars==data_loader.chars, "Data and loaded model disagree on character set!" 83 | assert saved_vocab==data_loader.vocab, "Data and loaded model disagree on dictionary mappings!" 
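# A minimal resume-run sketch, assuming the argparse defaults in main() above
# (save_dir='save', data_dir='data/tinyshakespeare'):
#   python train.py --init_from save
# The checks above require save/config.pkl, save/chars_vocab.pkl and a
# checkpoint written by the earlier run, and abort if the saved model type,
# rnn_size, num_layers, seq_length or vocabulary disagree with the new args.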
84 | 85 | if not os.path.isdir(args.save_dir): 86 | os.makedirs(args.save_dir) 87 | with open(os.path.join(args.save_dir, 'config.pkl'), 'wb') as f: 88 | cPickle.dump(args, f) 89 | with open(os.path.join(args.save_dir, 'chars_vocab.pkl'), 'wb') as f: 90 | cPickle.dump((data_loader.chars, data_loader.vocab), f) 91 | 92 | model = Model(args) 93 | 94 | with tf.Session() as sess: 95 | # instrument for tensorboard 96 | summaries = tf.summary.merge_all() 97 | writer = tf.summary.FileWriter( 98 | os.path.join(args.log_dir, time.strftime("%Y-%m-%d-%H-%M-%S"))) 99 | writer.add_graph(sess.graph) 100 | 101 | sess.run(tf.global_variables_initializer()) 102 | saver = tf.train.Saver(tf.global_variables()) 103 | # restore model 104 | if args.init_from is not None: 105 | saver.restore(sess, ckpt.model_checkpoint_path) 106 | for e in range(args.num_epochs): 107 | sess.run(tf.assign(model.lr, 108 | args.learning_rate * (args.decay_rate ** e))) 109 | data_loader.reset_batch_pointer() 110 | state = sess.run(model.initial_state) 111 | for b in range(data_loader.num_batches): 112 | start = time.time() 113 | x, y = data_loader.next_batch() 114 | feed = {model.input_data: x, model.targets: y} 115 | for i, (c, h) in enumerate(model.initial_state): 116 | feed[c] = state[i].c 117 | feed[h] = state[i].h 118 | 119 | 120 | # instrument for tensorboard; this single run performs the training step 121 | summ, train_loss, state, _ = sess.run([summaries, model.cost, model.final_state, model.train_op], feed) 122 | writer.add_summary(summ, e * data_loader.num_batches + b) 123 | 124 | end = time.time() 125 | print("{}/{} (epoch {}), train_loss = {:.3f}, time/batch = {:.3f}" 126 | .format(e * data_loader.num_batches + b, 127 | args.num_epochs * data_loader.num_batches, 128 | e, train_loss, end - start)) 129 | if (e * data_loader.num_batches + b) % args.save_every == 0\ 130 | or (e == args.num_epochs-1 and 131 | b == data_loader.num_batches-1): 132 | # save for the last result 133 | checkpoint_path = os.path.join(args.save_dir, 'model.ckpt') 134 | saver.save(sess, checkpoint_path, 135 | global_step=e * data_loader.num_batches + b) 136 | print("model saved to {}".format(checkpoint_path)) 137 | 138 | 139 | if __name__ == '__main__': 140 | main() 141 | -------------------------------------------------------------------------------- /7/Code/utils.py: -------------------------------------------------------------------------------- 1 | import codecs 2 | import os 3 | import collections 4 | from six.moves import cPickle 5 | import numpy as np 6 | 7 | 8 | class TextLoader(): 9 | def __init__(self, data_dir, batch_size, seq_length, encoding='utf-8'): 10 | self.data_dir = data_dir 11 | self.batch_size = batch_size 12 | self.seq_length = seq_length 13 | self.encoding = encoding 14 | 15 | input_file = os.path.join(data_dir, "input.txt") 16 | vocab_file = os.path.join(data_dir, "vocab.pkl") 17 | tensor_file = os.path.join(data_dir, "data.npy") 18 | 19 | if not (os.path.exists(vocab_file) and os.path.exists(tensor_file)): 20 | print("reading text file") 21 | self.preprocess(input_file, vocab_file, tensor_file) 22 | else: 23 | print("loading preprocessed files") 24 | self.load_preprocessed(vocab_file, tensor_file) 25 | self.create_batches() 26 | self.reset_batch_pointer() 27 | 28 | def preprocess(self, input_file, vocab_file, tensor_file): 29 | with codecs.open(input_file, "r", encoding=self.encoding) as f: 30 | data = f.read() 31 | counter = collections.Counter(data) 32 | count_pairs = 
sorted(counter.items(), key=lambda x: -x[1]) 33 | self.chars, _ = zip(*count_pairs) 34 | self.vocab_size = len(self.chars) 35 | self.vocab = dict(zip(self.chars, range(len(self.chars)))) 36 | with open(vocab_file, 'wb') as f: 37 | cPickle.dump(self.chars, f) 38 | self.tensor = np.array(list(map(self.vocab.get, data))) 39 | np.save(tensor_file, self.tensor) 40 | 41 | def load_preprocessed(self, vocab_file, tensor_file): 42 | with open(vocab_file, 'rb') as f: 43 | self.chars = cPickle.load(f) 44 | self.vocab_size = len(self.chars) 45 | self.vocab = dict(zip(self.chars, range(len(self.chars)))) 46 | self.tensor = np.load(tensor_file) 47 | self.num_batches = int(self.tensor.size / (self.batch_size * 48 | self.seq_length)) 49 | 50 | def create_batches(self): 51 | self.num_batches = int(self.tensor.size / (self.batch_size * 52 | self.seq_length)) 53 | 54 | # When the data (tensor) is too small, 55 | # give a clearer error message 56 | if self.num_batches == 0: 57 | assert False, "Not enough data. Make seq_length and batch_size smaller." 58 | 59 | self.tensor = self.tensor[:self.num_batches * self.batch_size * self.seq_length] 60 | xdata = self.tensor 61 | ydata = np.copy(self.tensor) 62 | ydata[:-1] = xdata[1:] # targets are the inputs shifted one character ahead 63 | ydata[-1] = xdata[0] # wrap the last target around to the first input 64 | self.x_batches = np.split(xdata.reshape(self.batch_size, -1), 65 | self.num_batches, 1) 66 | self.y_batches = np.split(ydata.reshape(self.batch_size, -1), 67 | self.num_batches, 1) 68 | 69 | def next_batch(self): 70 | x, y = self.x_batches[self.pointer], self.y_batches[self.pointer] 71 | self.pointer += 1 72 | return x, y 73 | 74 | def reset_batch_pointer(self): 75 | self.pointer = 0 76 | -------------------------------------------------------------------------------- /8/content.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/8/content.jpg -------------------------------------------------------------------------------- /8/neural_style.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2015-2017 Anish Athalye. Released under GPLv3. 
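# Example invocation (assuming the VGG weights file has been downloaded
# separately; see VGG_PATH below), using the sample images shipped in this
# folder and the required --content/--styles/--output flags defined below:
#   python neural_style.py --content content.jpg --styles style.jpg --output out.jpg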
2 | 3 | import os 4 | 5 | import numpy as np 6 | import scipy.misc 7 | 8 | from stylize import stylize 9 | 10 | import math 11 | from argparse import ArgumentParser 12 | 13 | from PIL import Image 14 | 15 | # default arguments 16 | CONTENT_WEIGHT = 5e0 17 | CONTENT_WEIGHT_BLEND = 1 18 | STYLE_WEIGHT = 5e2 19 | TV_WEIGHT = 1e2 20 | STYLE_LAYER_WEIGHT_EXP = 1 21 | LEARNING_RATE = 1e1 22 | BETA1 = 0.9 23 | BETA2 = 0.999 24 | EPSILON = 1e-08 25 | STYLE_SCALE = 1.0 26 | ITERATIONS = 1000 27 | VGG_PATH = 'imagenet-vgg-verydeep-19.mat' 28 | POOLING = 'max' 29 | 30 | def build_parser(): 31 | parser = ArgumentParser() 32 | parser.add_argument('--content', 33 | dest='content', help='content image', 34 | metavar='CONTENT', required=True) 35 | parser.add_argument('--styles', 36 | dest='styles', 37 | nargs='+', help='one or more style images', 38 | metavar='STYLE', required=True) 39 | parser.add_argument('--output', 40 | dest='output', help='output path', 41 | metavar='OUTPUT', required=True) 42 | parser.add_argument('--iterations', type=int, 43 | dest='iterations', help='iterations (default %(default)s)', 44 | metavar='ITERATIONS', default=ITERATIONS) 45 | parser.add_argument('--print-iterations', type=int, 46 | dest='print_iterations', help='statistics printing frequency', 47 | metavar='PRINT_ITERATIONS') 48 | parser.add_argument('--checkpoint-output', 49 | dest='checkpoint_output', help='checkpoint output format, e.g. output%%s.jpg', 50 | metavar='OUTPUT') 51 | parser.add_argument('--checkpoint-iterations', type=int, 52 | dest='checkpoint_iterations', help='checkpoint frequency', 53 | metavar='CHECKPOINT_ITERATIONS') 54 | parser.add_argument('--width', type=int, 55 | dest='width', help='output width', 56 | metavar='WIDTH') 57 | parser.add_argument('--style-scales', type=float, 58 | dest='style_scales', 59 | nargs='+', help='one or more style scales', 60 | metavar='STYLE_SCALE') 61 | parser.add_argument('--network', 62 | dest='network', help='path to network parameters (default %(default)s)', 63 | metavar='VGG_PATH', default=VGG_PATH) 64 | parser.add_argument('--content-weight-blend', type=float, 65 | dest='content_weight_blend', help='content weight blend, conv4_2 * blend + conv5_2 * (1-blend) (default %(default)s)', 66 | metavar='CONTENT_WEIGHT_BLEND', default=CONTENT_WEIGHT_BLEND) 67 | parser.add_argument('--content-weight', type=float, 68 | dest='content_weight', help='content weight (default %(default)s)', 69 | metavar='CONTENT_WEIGHT', default=CONTENT_WEIGHT) 70 | parser.add_argument('--style-weight', type=float, 71 | dest='style_weight', help='style weight (default %(default)s)', 72 | metavar='STYLE_WEIGHT', default=STYLE_WEIGHT) 73 | parser.add_argument('--style-layer-weight-exp', type=float, 74 | dest='style_layer_weight_exp', help='style layer weight exponential increase - weight(layer_n+1) = weight_exp*weight(layer_n) (default %(default)s)', 75 | metavar='STYLE_LAYER_WEIGHT_EXP', default=STYLE_LAYER_WEIGHT_EXP) 76 | parser.add_argument('--style-blend-weights', type=float, 77 | dest='style_blend_weights', help='style blending weights', 78 | nargs='+', metavar='STYLE_BLEND_WEIGHT') 79 | parser.add_argument('--tv-weight', type=float, 80 | dest='tv_weight', help='total variation regularization weight (default %(default)s)', 81 | metavar='TV_WEIGHT', default=TV_WEIGHT) 82 | parser.add_argument('--learning-rate', type=float, 83 | dest='learning_rate', help='learning rate (default %(default)s)', 84 | metavar='LEARNING_RATE', default=LEARNING_RATE) 85 | parser.add_argument('--beta1', type=float, 86 | 
dest='beta1', help='Adam: beta1 parameter (default %(default)s)', 87 | metavar='BETA1', default=BETA1) 88 | parser.add_argument('--beta2', type=float, 89 | dest='beta2', help='Adam: beta2 parameter (default %(default)s)', 90 | metavar='BETA2', default=BETA2) 91 | parser.add_argument('--eps', type=float, 92 | dest='epsilon', help='Adam: epsilon parameter (default %(default)s)', 93 | metavar='EPSILON', default=EPSILON) 94 | parser.add_argument('--initial', 95 | dest='initial', help='initial image', 96 | metavar='INITIAL') 97 | parser.add_argument('--initial-noiseblend', type=float, 98 | dest='initial_noiseblend', help='ratio of blending initial image with normalized noise (if no initial image specified, content image is used) (default %(default)s)', 99 | metavar='INITIAL_NOISEBLEND') 100 | parser.add_argument('--preserve-colors', action='store_true', 101 | dest='preserve_colors', help='style-only transfer (preserving colors) - if color transfer is not needed') 102 | parser.add_argument('--pooling', 103 | dest='pooling', help='pooling layer configuration: max or avg (default %(default)s)', 104 | metavar='POOLING', default=POOLING) 105 | return parser 106 | 107 | 108 | def main(): 109 | parser = build_parser() 110 | options = parser.parse_args() 111 | 112 | if not os.path.isfile(options.network): 113 | parser.error("Network %s does not exist. (Did you forget to download it?)" % options.network) 114 | 115 | content_image = imread(options.content) 116 | style_images = [imread(style) for style in options.styles] 117 | 118 | width = options.width 119 | if width is not None: 120 | new_shape = (int(math.floor(float(content_image.shape[0]) / 121 | content_image.shape[1] * width)), width) 122 | content_image = scipy.misc.imresize(content_image, new_shape) 123 | target_shape = content_image.shape 124 | for i in range(len(style_images)): 125 | style_scale = STYLE_SCALE 126 | if options.style_scales is not None: 127 | style_scale = options.style_scales[i] 128 | style_images[i] = scipy.misc.imresize(style_images[i], style_scale * 129 | target_shape[1] / style_images[i].shape[1]) 130 | 131 | style_blend_weights = options.style_blend_weights 132 | if style_blend_weights is None: 133 | # default is equal weights 134 | style_blend_weights = [1.0/len(style_images) for _ in style_images] 135 | else: 136 | total_blend_weight = sum(style_blend_weights) 137 | style_blend_weights = [weight/total_blend_weight 138 | for weight in style_blend_weights] 139 | 140 | initial = options.initial 141 | if initial is not None: 142 | initial = scipy.misc.imresize(imread(initial), content_image.shape[:2]) 143 | # Initial guess is specified, but not noiseblend - no noise should be blended 144 | if options.initial_noiseblend is None: 145 | options.initial_noiseblend = 0.0 146 | else: 147 | # Neither initial image nor noiseblend is provided; fall back to a randomly generated initial guess 148 | if options.initial_noiseblend is None: 149 | options.initial_noiseblend = 1.0 150 | if options.initial_noiseblend < 1.0: 151 | initial = content_image 152 | 153 | if options.checkpoint_output and "%s" not in options.checkpoint_output: 154 | parser.error("To save intermediate images, the checkpoint output " 155 | "parameter must contain `%s` (e.g. 
`foo%s.jpg`)") 156 | 157 | for iteration, image in stylize( 158 | network=options.network, 159 | initial=initial, 160 | initial_noiseblend=options.initial_noiseblend, 161 | content=content_image, 162 | styles=style_images, 163 | preserve_colors=options.preserve_colors, 164 | iterations=options.iterations, 165 | content_weight=options.content_weight, 166 | content_weight_blend=options.content_weight_blend, 167 | style_weight=options.style_weight, 168 | style_layer_weight_exp=options.style_layer_weight_exp, 169 | style_blend_weights=style_blend_weights, 170 | tv_weight=options.tv_weight, 171 | learning_rate=options.learning_rate, 172 | beta1=options.beta1, 173 | beta2=options.beta2, 174 | epsilon=options.epsilon, 175 | pooling=options.pooling, 176 | print_iterations=options.print_iterations, 177 | checkpoint_iterations=options.checkpoint_iterations 178 | ): 179 | output_file = None 180 | combined_rgb = image 181 | if iteration is not None: 182 | if options.checkpoint_output: 183 | output_file = options.checkpoint_output % iteration 184 | else: 185 | output_file = options.output 186 | if output_file: 187 | imsave(output_file, combined_rgb) 188 | 189 | 190 | def imread(path): 191 | img = scipy.misc.imread(path).astype(np.float) 192 | if len(img.shape) == 2: 193 | # grayscale 194 | img = np.dstack((img,img,img)) 195 | elif img.shape[2] == 4: 196 | # PNG with alpha channel 197 | img = img[:,:,:3] 198 | return img 199 | 200 | 201 | def imsave(path, img): 202 | img = np.clip(img, 0, 255).astype(np.uint8) 203 | Image.fromarray(img).save(path, quality=95) 204 | 205 | if __name__ == '__main__': 206 | main() 207 | -------------------------------------------------------------------------------- /8/out.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/8/out.jpg -------------------------------------------------------------------------------- /8/style.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/8/style.jpg -------------------------------------------------------------------------------- /8/stylize.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2015-2017 Anish Athalye. Released under GPLv3. 2 | 3 | import vgg 4 | 5 | import tensorflow as tf 6 | import numpy as np 7 | 8 | from sys import stderr 9 | 10 | from PIL import Image 11 | 12 | CONTENT_LAYERS = ('relu4_2', 'relu5_2') 13 | STYLE_LAYERS = ('relu1_1', 'relu2_1', 'relu3_1', 'relu4_1', 'relu5_1') 14 | 15 | try: 16 | reduce 17 | except NameError: 18 | from functools import reduce 19 | 20 | 21 | def stylize(network, initial, initial_noiseblend, content, styles, preserve_colors, iterations, 22 | content_weight, content_weight_blend, style_weight, style_layer_weight_exp, style_blend_weights, tv_weight, 23 | learning_rate, beta1, beta2, epsilon, pooling, 24 | print_iterations=None, checkpoint_iterations=None): 25 | """ 26 | Stylize images. 27 | 28 | This function yields tuples (iteration, image); `iteration` is None 29 | if this is the final image (the last iteration). Other tuples are yielded 30 | every `checkpoint_iterations` iterations. 
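    A minimal consumption sketch (mirroring the loop in neural_style.py's
    main(); `save_image` is an illustrative placeholder, not part of this file):

        for iteration, image in stylize(network=VGG_PATH, initial=None, ...):
            save_image(image)  # `iteration` is an int at checkpoints, None at the end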
31 | 32 | :rtype: iterator[tuple[int|None,image]] 33 | """ 34 | shape = (1,) + content.shape 35 | style_shapes = [(1,) + style.shape for style in styles] 36 | content_features = {} 37 | style_features = [{} for _ in styles] 38 | 39 | vgg_weights, vgg_mean_pixel = vgg.load_net(network) 40 | 41 | layer_weight = 1.0 42 | style_layers_weights = {} 43 | for style_layer in STYLE_LAYERS: 44 | style_layers_weights[style_layer] = layer_weight 45 | layer_weight *= style_layer_weight_exp 46 | 47 | # normalize style layer weights 48 | layer_weights_sum = 0 49 | for style_layer in STYLE_LAYERS: 50 | layer_weights_sum += style_layers_weights[style_layer] 51 | for style_layer in STYLE_LAYERS: 52 | style_layers_weights[style_layer] /= layer_weights_sum 53 | 54 | # compute content features in feedforward mode 55 | g = tf.Graph() 56 | with g.as_default(), g.device('/cpu:0'), tf.Session() as sess: 57 | image = tf.placeholder('float', shape=shape) 58 | net = vgg.net_preloaded(vgg_weights, image, pooling) 59 | content_pre = np.array([vgg.preprocess(content, vgg_mean_pixel)]) 60 | for layer in CONTENT_LAYERS: 61 | content_features[layer] = net[layer].eval(feed_dict={image: content_pre}) 62 | 63 | # compute style features in feedforward mode 64 | for i in range(len(styles)): 65 | g = tf.Graph() 66 | with g.as_default(), g.device('/cpu:0'), tf.Session() as sess: 67 | image = tf.placeholder('float', shape=style_shapes[i]) 68 | net = vgg.net_preloaded(vgg_weights, image, pooling) 69 | style_pre = np.array([vgg.preprocess(styles[i], vgg_mean_pixel)]) 70 | for layer in STYLE_LAYERS: 71 | features = net[layer].eval(feed_dict={image: style_pre}) 72 | features = np.reshape(features, (-1, features.shape[3])) 73 | gram = np.matmul(features.T, features) / features.size # Gram matrix of the layer activations 74 | style_features[i][layer] = gram 75 | 76 | initial_content_noise_coeff = 1.0 - initial_noiseblend 77 | 78 | # make stylized image using backpropagation 79 | with tf.Graph().as_default(): 80 | if initial is None: 81 | noise = np.random.normal(size=shape, scale=np.std(content) * 0.1) 82 | initial = tf.random_normal(shape) * 0.256 83 | else: 84 | initial = np.array([vgg.preprocess(initial, vgg_mean_pixel)]) 85 | initial = initial.astype('float32') 86 | noise = np.random.normal(size=shape, scale=np.std(content) * 0.1) 87 | initial = (initial) * initial_content_noise_coeff + (tf.random_normal(shape) * 0.256) * (1.0 - initial_content_noise_coeff) 88 | image = tf.Variable(initial) 89 | net = vgg.net_preloaded(vgg_weights, image, pooling) 90 | 91 | # content loss 92 | content_layers_weights = {} 93 | content_layers_weights['relu4_2'] = content_weight_blend 94 | content_layers_weights['relu5_2'] = 1.0 - content_weight_blend 95 | 96 | content_loss = 0 97 | content_losses = [] 98 | for content_layer in CONTENT_LAYERS: 99 | content_losses.append(content_layers_weights[content_layer] * content_weight * (2 * tf.nn.l2_loss( 100 | net[content_layer] - content_features[content_layer]) / 101 | content_features[content_layer].size)) 102 | content_loss += reduce(tf.add, content_losses) 103 | 104 | # style loss 105 | style_loss = 0 106 | for i in range(len(styles)): 107 | style_losses = [] 108 | for style_layer in STYLE_LAYERS: 109 | layer = net[style_layer] 110 | _, height, width, number = map(lambda i: i.value, layer.get_shape()) 111 | size = height * width * number 112 | feats = tf.reshape(layer, (-1, number)) 113 | gram = tf.matmul(tf.transpose(feats), feats) / size 114 | style_gram = style_features[i][style_layer] 115 | 
style_losses.append(style_layers_weights[style_layer] * 2 * tf.nn.l2_loss(gram - style_gram) / style_gram.size) 116 | style_loss += style_weight * style_blend_weights[i] * reduce(tf.add, style_losses) 117 | 118 | # total variation denoising 119 | tv_y_size = _tensor_size(image[:,1:,:,:]) 120 | tv_x_size = _tensor_size(image[:,:,1:,:]) 121 | tv_loss = tv_weight * 2 * ( 122 | (tf.nn.l2_loss(image[:,1:,:,:] - image[:,:shape[1]-1,:,:]) / 123 | tv_y_size) + 124 | (tf.nn.l2_loss(image[:,:,1:,:] - image[:,:,:shape[2]-1,:]) / 125 | tv_x_size)) 126 | # overall loss 127 | loss = content_loss + style_loss + tv_loss 128 | 129 | # optimizer setup 130 | train_step = tf.train.AdamOptimizer(learning_rate, beta1, beta2, epsilon).minimize(loss) 131 | 132 | def print_progress(): 133 | stderr.write('  content loss: %g\n' % content_loss.eval()) 134 | stderr.write('    style loss: %g\n' % style_loss.eval()) 135 | stderr.write('       tv loss: %g\n' % tv_loss.eval()) 136 | stderr.write('    total loss: %g\n' % loss.eval()) 137 | 138 | # optimization 139 | best_loss = float('inf') 140 | best = None 141 | with tf.Session() as sess: 142 | sess.run(tf.global_variables_initializer()) 143 | stderr.write('Optimization started...\n') 144 | if (print_iterations and print_iterations != 0): 145 | print_progress() 146 | for i in range(iterations): 147 | stderr.write('Iteration %4d/%4d\n' % (i + 1, iterations)) 148 | train_step.run() 149 | 150 | last_step = (i == iterations - 1) 151 | if last_step or (print_iterations and i % print_iterations == 0): 152 | print_progress() 153 | 154 | if (checkpoint_iterations and i % checkpoint_iterations == 0) or last_step: 155 | this_loss = loss.eval() 156 | if this_loss < best_loss: 157 | best_loss = this_loss 158 | best = image.eval() 159 | 160 | img_out = vgg.unprocess(best.reshape(shape[1:]), vgg_mean_pixel) 161 | 162 | if preserve_colors: 163 | original_image = np.clip(content, 0, 255) 164 | styled_image = np.clip(img_out, 0, 255) 165 | 166 | # Luminosity transfer steps: 167 | # 1. Convert stylized RGB->grayscale according to Rec.601 luma (0.299, 0.587, 0.114) 168 | # 2. Convert stylized grayscale into YUV (YCbCr) 169 | # 3. Convert original image into YUV (YCbCr) 170 | # 4. Recombine (stylizedYUV.Y, originalYUV.U, originalYUV.V) 171 | # 5. 
Convert recombined image from YUV back to RGB 172 | 173 | # 1 174 | styled_grayscale = rgb2gray(styled_image) 175 | styled_grayscale_rgb = gray2rgb(styled_grayscale) 176 | 177 | # 2 178 | styled_grayscale_yuv = np.array(Image.fromarray(styled_grayscale_rgb.astype(np.uint8)).convert('YCbCr')) 179 | 180 | # 3 181 | original_yuv = np.array(Image.fromarray(original_image.astype(np.uint8)).convert('YCbCr')) 182 | 183 | # 4 184 | w, h, _ = original_image.shape 185 | combined_yuv = np.empty((w, h, 3), dtype=np.uint8) 186 | combined_yuv[..., 0] = styled_grayscale_yuv[..., 0] 187 | combined_yuv[..., 1] = original_yuv[..., 1] 188 | combined_yuv[..., 2] = original_yuv[..., 2] 189 | 190 | # 5 191 | img_out = np.array(Image.fromarray(combined_yuv, 'YCbCr').convert('RGB')) 192 | 193 | 194 | yield ( 195 | (None if last_step else i), 196 | img_out 197 | ) 198 | 199 | 200 | def _tensor_size(tensor): 201 | from operator import mul 202 | return reduce(mul, (d.value for d in tensor.get_shape()), 1) 203 | 204 | def rgb2gray(rgb): 205 | return np.dot(rgb[...,:3], [0.299, 0.587, 0.114]) 206 | 207 | def gray2rgb(gray): 208 | w, h = gray.shape 209 | rgb = np.empty((w, h, 3), dtype=np.float32) 210 | rgb[:, :, 2] = rgb[:, :, 1] = rgb[:, :, 0] = gray 211 | return rgb 212 | -------------------------------------------------------------------------------- /8/stylize.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/8/stylize.pyc -------------------------------------------------------------------------------- /8/vgg.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2015-2017 Anish Athalye. Released under GPLv3. 
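# Helpers for the MatConvNet VGG-19 weights file ('imagenet-vgg-verydeep-19.mat'
# by default; see VGG_PATH in neural_style.py): load_net() returns the raw layer
# weights plus the mean pixel that preprocess()/unprocess() subtract and add back.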
2 | 3 | import tensorflow as tf 4 | import numpy as np 5 | import scipy.io 6 | 7 | VGG19_LAYERS = ( 8 | 'conv1_1', 'relu1_1', 'conv1_2', 'relu1_2', 'pool1', 9 | 10 | 'conv2_1', 'relu2_1', 'conv2_2', 'relu2_2', 'pool2', 11 | 12 | 'conv3_1', 'relu3_1', 'conv3_2', 'relu3_2', 'conv3_3', 13 | 'relu3_3', 'conv3_4', 'relu3_4', 'pool3', 14 | 15 | 'conv4_1', 'relu4_1', 'conv4_2', 'relu4_2', 'conv4_3', 16 | 'relu4_3', 'conv4_4', 'relu4_4', 'pool4', 17 | 18 | 'conv5_1', 'relu5_1', 'conv5_2', 'relu5_2', 'conv5_3', 19 | 'relu5_3', 'conv5_4', 'relu5_4' 20 | ) 21 | 22 | def load_net(data_path): 23 | data = scipy.io.loadmat(data_path) 24 | mean = data['normalization'][0][0][0] 25 | mean_pixel = np.mean(mean, axis=(0, 1)) 26 | weights = data['layers'][0] 27 | return weights, mean_pixel 28 | 29 | def net_preloaded(weights, input_image, pooling): 30 | net = {} 31 | current = input_image 32 | for i, name in enumerate(VGG19_LAYERS): 33 | kind = name[:4] 34 | if kind == 'conv': 35 | kernels, bias = weights[i][0][0][0][0] 36 | # matconvnet: weights are [width, height, in_channels, out_channels] 37 | # tensorflow: weights are [height, width, in_channels, out_channels] 38 | kernels = np.transpose(kernels, (1, 0, 2, 3)) 39 | bias = bias.reshape(-1) 40 | current = _conv_layer(current, kernels, bias) 41 | elif kind == 'relu': 42 | current = tf.nn.relu(current) 43 | elif kind == 'pool': 44 | current = _pool_layer(current, pooling) 45 | net[name] = current 46 | 47 | assert len(net) == len(VGG19_LAYERS) 48 | return net 49 | 50 | def _conv_layer(input, weights, bias): 51 | conv = tf.nn.conv2d(input, tf.constant(weights), strides=(1, 1, 1, 1), 52 | padding='SAME') 53 | return tf.nn.bias_add(conv, bias) 54 | 55 | 56 | def _pool_layer(input, pooling): 57 | if pooling == 'avg': 58 | return tf.nn.avg_pool(input, ksize=(1, 2, 2, 1), strides=(1, 2, 2, 1), 59 | padding='SAME') 60 | else: 61 | return tf.nn.max_pool(input, ksize=(1, 2, 2, 1), strides=(1, 2, 2, 1), 62 | padding='SAME') 63 | 64 | def preprocess(image, mean_pixel): 65 | return image - mean_pixel 66 | 67 | 68 | def unprocess(image, mean_pixel): 69 | return image + mean_pixel 70 | -------------------------------------------------------------------------------- /8/vgg.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/PacktPublishing/Building-Machine-Learning-Projects-with-TensorFlow/b9dd98c6cdf267f61a5181e8508eab66012ae242/8/vgg.pyc -------------------------------------------------------------------------------- /9/cluster_pi_final.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | 4 | tf.app.flags.DEFINE_integer("numsamples", 100, "Number of samples per server") 5 | FLAGS = tf.app.flags.FLAGS 6 | 7 | print("Sample number per server: " + str(FLAGS.numsamples)) 8 | cluster = tf.train.ClusterSpec({"local": ["ec2-52-90-57-240.compute-1.amazonaws.com:2222", "ec2-54-196-135-128.compute-1.amazonaws.com:2222"]}) 9 | 10 | c=[] 11 | 12 | def generate_sum(): 13 | i=tf.constant(np.random.uniform(size=FLAGS.numsamples*2), shape=[FLAGS.numsamples,2]) # random points in the unit square 14 | distances=tf.reduce_sum(tf.pow(i,2),1) 15 | return (tf.reduce_sum(tf.cast(tf.greater_equal(tf.cast(1.0,tf.float64),distances),tf.int32))) # count the samples inside the unit circle 16 | 17 | 18 | with tf.device("/job:local/task:0"): 19 | test1= generate_sum() 20 | 21 | with tf.device("/job:local/task:1"): 22 | test2= generate_sum() 23 | 24 | with tf.Session("grpc://ec2-52-90-57-240.compute-1.amazonaws.com:2222") as 
sess: 25 | result = sess.run(tf.cast(test1 + test2,tf.float64)/FLAGS.numsamples*2.0) # pi ~= 4*inside/(2*N), which equals 2*inside/N 26 | print(result) 27 | -------------------------------------------------------------------------------- /9/gpu_pi.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | c = [] 4 | #Distribute the work between the GPUs 5 | for d in ['/gpu:0', '/gpu:1', '/gpu:2', '/gpu:3']: 6 | #Pin this batch of samples to the current GPU 7 | with tf.device(d): 8 | #Generate the random 2D samples 9 | i = tf.constant(np.random.uniform(size=10000), shape=[5000, 2]) 10 | #Calculate the euclidean distance to the origin 11 | distances = tf.reduce_sum(tf.pow(i, 2), 1) 12 | #Sum the samples inside the circle 13 | insiders = tf.reduce_sum(tf.cast(tf.greater_equal(tf.cast(1.0, tf.float64), distances), tf.float64)) 14 | with tf.Session() as sess: 15 | #append the current result to the results array 16 | c.append(sess.run(insiders)) 17 | #Do the final ratio calculation on the CPU (20000 = 4 GPUs x 5000 samples each) 18 | with tf.device('/cpu:0'): 19 | with tf.Session() as sess: 20 | print(sess.run(tf.add_n(c)/20000.0)*4.0) -------------------------------------------------------------------------------- /9/start_server.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | tf.app.flags.DEFINE_string("index", "0", "Server index") # e.g.: python start_server.py --index=0 3 | FLAGS = tf.app.flags.FLAGS 4 | print(FLAGS.index) 5 | #cluster = tf.train.ClusterSpec({"local": ["ec2-52-90-57-240.compute-1.amazonaws.com:2222", "ec2-54-196-135-128.compute-1.amazonaws.com:2222"]}) 6 | cluster = tf.train.ClusterSpec({"local": ["localhost:2222", "localhost:2223"]}) 7 | server = tf.train.Server(cluster, job_name="local", task_index=int(FLAGS.index)) 8 | server.join() 9 | -------------------------------------------------------------------------------- /9/trainer.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | from sklearn.utils import shuffle 4 | 5 | # Here we define our cluster setup via the command line 6 | tf.app.flags.DEFINE_string("ps_hosts", "", 7 | "Comma-separated list of hostname:port pairs") 8 | tf.app.flags.DEFINE_string("worker_hosts", "", 9 | "Comma-separated list of hostname:port pairs") 10 | 11 | # Define the characteristics of the cluster node, and its task index 12 | tf.app.flags.DEFINE_string("job_name", "", "One of 'ps', 'worker'") 13 | tf.app.flags.DEFINE_integer("task_index", 0, "Index of task within the job") 14 | 15 | FLAGS = tf.app.flags.FLAGS 16 | 17 | 18 | def main(_): 19 | ps_hosts = FLAGS.ps_hosts.split(",") 20 | worker_hosts = FLAGS.worker_hosts.split(",") 21 | 22 | # Create a cluster following the command line parameters. 23 | cluster = tf.train.ClusterSpec({"ps": ps_hosts, "worker": worker_hosts}) 24 | 25 | # Create the local task. 26 | server = tf.train.Server(cluster, 27 | job_name=FLAGS.job_name, 28 | task_index=FLAGS.task_index) 29 | 30 | if FLAGS.job_name == "ps": 31 | server.join() 32 | elif FLAGS.job_name == "worker": 33 | 34 | # Assigns ops to the local worker by default. 
35 | with tf.device(tf.train.replica_device_setter( 36 | worker_device="/job:worker/task:%d" % FLAGS.task_index, 37 | cluster=cluster)): 38 | 39 | #Define the training set, and the model parameters, loss function and training operation 40 | trX = np.linspace(-1, 1, 101) 41 | trY = 2 * trX + np.random.randn(*trX.shape) * 0.4 + 0.2 # create a y value 42 | X = tf.placeholder("float", name="X") # create symbolic variables 43 | Y = tf.placeholder("float", name="Y") 44 | 45 | def model(X, w, b): 46 | return tf.mul(X, w) + b # We just define the line as X*w + b 47 | 48 | w = tf.Variable(-1.0, name="b0") # create a shared variable 49 | b = tf.Variable(-2.0, name="b1") # create a shared variable 50 | y_model = model(X, w, b) 51 | 52 | loss = (tf.pow(Y-y_model, 2)) # use squared error for cost function 53 | global_step = tf.Variable(0) 54 | 55 | train_op = tf.train.AdagradOptimizer(0.8).minimize( 56 | loss, global_step=global_step) 57 | 58 | #Create a saver, a summary op and an init operation 59 | saver = tf.train.Saver() 60 | summary_op = tf.merge_all_summaries() 61 | init_op = tf.initialize_all_variables() 62 | 63 | # Create a "supervisor", which oversees the training process. 64 | sv = tf.train.Supervisor(is_chief=(FLAGS.task_index == 0), 65 | logdir="/tmp/train_logs", 66 | init_op=init_op, 67 | summary_op=summary_op, 68 | saver=saver, 69 | global_step=global_step, 70 | save_model_secs=600) 71 | 72 | # The supervisor takes care of session initialization, restoring from 73 | # a checkpoint, and closing when done or an error occurs. 74 | with sv.managed_session(server.target) as sess: 75 | # Loop until the supervisor shuts down 76 | step = 0 77 | while not sv.should_stop(): 78 | # Run a training step asynchronously. 79 | # See `tf.train.SyncReplicasOptimizer` for additional details on how to 80 | # perform *synchronous* training. 81 | for i in range(100): 82 | trX, trY = shuffle(trX, trY, random_state=0) 83 | for (x, y) in zip(trX, trY): 84 | _, step = sess.run([train_op, global_step], feed_dict={X: x, Y: y}) 85 | #Print the partial results, and the current node doing the calculation 86 | print("Partial result from node: " + str(FLAGS.task_index) + ", w: " + str(w.eval(session=sess)) + ", b0: " + str(b.eval(session=sess))) 87 | # Ask for all the services to stop. 
88 | sv.stop() 89 | 90 | 91 | 92 | if __name__ == "__main__": 93 | tf.app.run() 94 | -------------------------------------------------------------------------------- /ERRATA_AND_UPDATES.md: -------------------------------------------------------------------------------- 1 | ## We are updating all examples to make them compatible with TensorFlow 1.0 2 | 3 | This file summarizes the errata found in the book, together with the examples updated for the newer versions of TensorFlow (starting with 0.12). 4 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2016 Packt 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Building Machine Learning Projects with TensorFlow 2 | This is the code repository for [Building Machine Learning Projects with TensorFlow](https://www.packtpub.com/big-data-and-business-intelligence/building-machine-learning-projects-tensorflow?utm_source=github&utm_medium=repository&utm_campaign=9781786466587), published by [Packt](https://www.packtpub.com). It contains all the supporting project files necessary to work through the book from start to finish. 3 | ## Instructions and Navigations 4 | All of the code is organized into folders. Each folder is named after its chapter number, for example, 2. 5 | 6 | 7 | 8 | The code will look like the following: 9 | ``` 10 | >>> import tensorflow as tf 11 | >>> tens1 = tf.constant([[[1,2],[2,3]],[[3,4],[5,6]]]) 12 | >>> sess = tf.Session() 13 | >>> print sess.run(tens1)[1,1,0] 14 | 5 15 | ``` 16 | 17 | | Software Required | Hardware Required | Operating System | 18 | | --- | --- | --- | 19 | | TensorFlow 0.10, Jupyter Notebook | Any x86 computer | Ubuntu Linux 16.04 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 
30 | 31 | ## Related Products 32 | * [Machine Learning with TensorFlow](https://www.packtpub.com/product/machine-learning-with-tensorflow-1-x/9781786462961) 33 | 34 | * [Getting Started with TensorFlow](https://www.packtpub.com/product/getting-started-with-tensorflow/9781786468574) 35 | 36 | * [Building Machine Learning Systems with Python - Second Edition](https://www.packtpub.com/product/building-machine-learning-systems-with-python-second-edition/9781784392772) 37 | 38 | 39 | --------------------------------------------------------------------------------