├── README.md ├── Result_Keras ├── Result_TensorLayer ├── keras_mnist.py └── tensorlayer_mnist.py /README.md: -------------------------------------------------------------------------------- 1 | # Comparison of TensorFlow Wrappers 2 | 3 | Run Keras, TensorLayer and Tflearn with the same model and data on the same GPU machine. 4 | 5 | The parameter initialization may differ slightly, but this should not affect the speed. 6 | 7 | Feel free to push! 8 | 9 | ## Speed of MLP 10 | 11 | GPU: GTX980 12 | 13 | TensorFlow: r0.10 14 | 15 | Data: MNIST train:50k val:10k test:10k 16 | 17 | Model: 784-800-800-10 18 | 19 | Number of epochs: 200 20 | 21 | Batch size: 500 22 | 23 | Keras: 282.475250s = 1.41 s/epoch 24 | 25 | TensorLayer: 116.670947s = 0.58 s/epoch 26 | 27 | Tflearn: 28 | 29 | ## Speed of CNN 30 | 31 | 32 | ## Speed of LSTM 33 | -------------------------------------------------------------------------------- /Result_Keras: -------------------------------------------------------------------------------- 1 | Using TensorFlow backend. 
2 | I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcublas.so locally 3 | I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcudnn.so locally 4 | I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcufft.so locally 5 | I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcuda.so.1 locally 6 | I tensorflow/stream_executor/dso_loader.cc:108] successfully opened CUDA library libcurand.so locally 7 | 50000 train samples 8 | 10000 val samples 9 | 10000 test samples 10 | (50000, 10) (10000, 10) 11 | I tensorflow/core/common_runtime/gpu/gpu_init.cc:102] Found device 0 with properties: 12 | name: GeForce GTX 980 13 | major: 5 minor: 2 memoryClockRate (GHz) 1.418 14 | pciBusID 0000:07:00.0 15 | Total memory: 4.00GiB 16 | Free memory: 3.50GiB 17 | I tensorflow/core/common_runtime/gpu/gpu_init.cc:126] DMA: 0 18 | I tensorflow/core/common_runtime/gpu/gpu_init.cc:136] 0: Y 19 | I tensorflow/core/common_runtime/gpu/gpu_device.cc:839] Creating TensorFlow device (/gpu:0) -> (device: 0, name: GeForce GTX 980, pci bus id: 0000:07:00.0) 20 | ____________________________________________________________________________________________________ 21 | Layer (type) Output Shape Param # Connected to 22 | ==================================================================================================== 23 | dropout_1 (Dropout) (None, 784) 0 dropout_input_1[0][0] 24 | ____________________________________________________________________________________________________ 25 | dense_1 (Dense) (None, 800) 628000 dropout_1[0][0] 26 | ____________________________________________________________________________________________________ 27 | activation_1 (Activation) (None, 800) 0 dense_1[0][0] 28 | ____________________________________________________________________________________________________ 29 | dropout_2 (Dropout) (None, 800) 0 activation_1[0][0] 30 | 
____________________________________________________________________________________________________ 31 | dense_2 (Dense) (None, 800) 640800 dropout_2[0][0] 32 | ____________________________________________________________________________________________________ 33 | activation_2 (Activation) (None, 800) 0 dense_2[0][0] 34 | ____________________________________________________________________________________________________ 35 | dropout_3 (Dropout) (None, 800) 0 activation_2[0][0] 36 | ____________________________________________________________________________________________________ 37 | dense_3 (Dense) (None, 10) 8010 dropout_3[0][0] 38 | ____________________________________________________________________________________________________ 39 | activation_3 (Activation) (None, 10) 0 dense_3[0][0] 40 | ==================================================================================================== 41 | Total params: 1276810 42 | ____________________________________________________________________________________________________ 43 | Train on 50000 samples, validate on 10000 samples 44 | Epoch 1/200 45 | 50000/50000 [==============================] - 1s - loss: 0.5229 - acc: 0.8385 - val_loss: 0.1678 - val_acc: 0.9521 46 | Epoch 2/200 47 | 50000/50000 [==============================] - 1s - loss: 0.2316 - acc: 0.9292 - val_loss: 0.1145 - val_acc: 0.9660 48 | Epoch 3/200 49 | 50000/50000 [==============================] - 1s - loss: 0.1721 - acc: 0.9466 - val_loss: 0.1009 - val_acc: 0.9698 50 | Epoch 4/200 51 | 50000/50000 [==============================] - 1s - loss: 0.1408 - acc: 0.9567 - val_loss: 0.0843 - val_acc: 0.9760 52 | Epoch 5/200 53 | 50000/50000 [==============================] - 1s - loss: 0.1255 - acc: 0.9599 - val_loss: 0.0800 - val_acc: 0.9765 54 | Epoch 6/200 55 | 50000/50000 [==============================] - 1s - loss: 0.1151 - acc: 0.9634 - val_loss: 0.0719 - val_acc: 0.9785 56 | Epoch 7/200 57 | 50000/50000 [==============================] - 
1s - loss: 0.1011 - acc: 0.9682 - val_loss: 0.0708 - val_acc: 0.9786 58 | Epoch 8/200 59 | 50000/50000 [==============================] - 1s - loss: 0.0940 - acc: 0.9702 - val_loss: 0.0659 - val_acc: 0.9810 60 | Epoch 9/200 61 | 50000/50000 [==============================] - 1s - loss: 0.0897 - acc: 0.9714 - val_loss: 0.0618 - val_acc: 0.9825 62 | Epoch 10/200 63 | 50000/50000 [==============================] - 1s - loss: 0.0810 - acc: 0.9736 - val_loss: 0.0668 - val_acc: 0.9801 64 | Epoch 11/200 65 | 50000/50000 [==============================] - 1s - loss: 0.0766 - acc: 0.9742 - val_loss: 0.0627 - val_acc: 0.9826 66 | Epoch 12/200 67 | 50000/50000 [==============================] - 1s - loss: 0.0777 - acc: 0.9751 - val_loss: 0.0622 - val_acc: 0.9816 68 | Epoch 13/200 69 | 50000/50000 [==============================] - 1s - loss: 0.0707 - acc: 0.9764 - val_loss: 0.0646 - val_acc: 0.9805 70 | Epoch 14/200 71 | 50000/50000 [==============================] - 1s - loss: 0.0671 - acc: 0.9780 - val_loss: 0.0622 - val_acc: 0.9819 72 | Epoch 15/200 73 | 50000/50000 [==============================] - 1s - loss: 0.0638 - acc: 0.9781 - val_loss: 0.0640 - val_acc: 0.9828 74 | Epoch 16/200 75 | 50000/50000 [==============================] - 1s - loss: 0.0611 - acc: 0.9795 - val_loss: 0.0630 - val_acc: 0.9821 76 | Epoch 17/200 77 | 50000/50000 [==============================] - 1s - loss: 0.0614 - acc: 0.9795 - val_loss: 0.0587 - val_acc: 0.9832 78 | Epoch 18/200 79 | 50000/50000 [==============================] - 1s - loss: 0.0582 - acc: 0.9816 - val_loss: 0.0641 - val_acc: 0.9834 80 | Epoch 19/200 81 | 50000/50000 [==============================] - 1s - loss: 0.0571 - acc: 0.9809 - val_loss: 0.0603 - val_acc: 0.9833 82 | Epoch 20/200 83 | 50000/50000 [==============================] - 1s - loss: 0.0519 - acc: 0.9827 - val_loss: 0.0580 - val_acc: 0.9834 84 | Epoch 21/200 85 | 50000/50000 [==============================] - 1s - loss: 0.0499 - acc: 0.9828 - val_loss: 0.0685 - 
val_acc: 0.9832 86 | Epoch 22/200 87 | 50000/50000 [==============================] - 1s - loss: 0.0545 - acc: 0.9813 - val_loss: 0.0616 - val_acc: 0.9842 88 | Epoch 23/200 89 | 50000/50000 [==============================] - 1s - loss: 0.0483 - acc: 0.9836 - val_loss: 0.0617 - val_acc: 0.9841 90 | Epoch 24/200 91 | 50000/50000 [==============================] - 1s - loss: 0.0480 - acc: 0.9840 - val_loss: 0.0622 - val_acc: 0.9825 92 | Epoch 25/200 93 | 50000/50000 [==============================] - 1s - loss: 0.0482 - acc: 0.9835 - val_loss: 0.0635 - val_acc: 0.9835 94 | Epoch 26/200 95 | 50000/50000 [==============================] - 1s - loss: 0.0455 - acc: 0.9848 - val_loss: 0.0665 - val_acc: 0.9838 96 | Epoch 27/200 97 | 50000/50000 [==============================] - 1s - loss: 0.0442 - acc: 0.9849 - val_loss: 0.0592 - val_acc: 0.9840 98 | Epoch 28/200 99 | 50000/50000 [==============================] - 1s - loss: 0.0439 - acc: 0.9854 - val_loss: 0.0622 - val_acc: 0.9849 100 | Epoch 29/200 101 | 50000/50000 [==============================] - 1s - loss: 0.0437 - acc: 0.9859 - val_loss: 0.0618 - val_acc: 0.9844 102 | Epoch 30/200 103 | 50000/50000 [==============================] - 1s - loss: 0.0421 - acc: 0.9866 - val_loss: 0.0601 - val_acc: 0.9843 104 | Epoch 31/200 105 | 50000/50000 [==============================] - 1s - loss: 0.0421 - acc: 0.9864 - val_loss: 0.0566 - val_acc: 0.9854 106 | Epoch 32/200 107 | 50000/50000 [==============================] - 1s - loss: 0.0433 - acc: 0.9855 - val_loss: 0.0605 - val_acc: 0.9840 108 | Epoch 33/200 109 | 50000/50000 [==============================] - 1s - loss: 0.0401 - acc: 0.9869 - val_loss: 0.0590 - val_acc: 0.9849 110 | Epoch 34/200 111 | 50000/50000 [==============================] - 1s - loss: 0.0404 - acc: 0.9864 - val_loss: 0.0603 - val_acc: 0.9849 112 | Epoch 35/200 113 | 50000/50000 [==============================] - 1s - loss: 0.0377 - acc: 0.9870 - val_loss: 0.0610 - val_acc: 0.9853 114 | Epoch 36/200 115 
| 50000/50000 [==============================] - 1s - loss: 0.0401 - acc: 0.9866 - val_loss: 0.0627 - val_acc: 0.9842 116 | Epoch 37/200 117 | 50000/50000 [==============================] - 1s - loss: 0.0385 - acc: 0.9869 - val_loss: 0.0598 - val_acc: 0.9843 118 | Epoch 38/200 119 | 50000/50000 [==============================] - 1s - loss: 0.0378 - acc: 0.9874 - val_loss: 0.0590 - val_acc: 0.9860 120 | Epoch 39/200 121 | 50000/50000 [==============================] - 1s - loss: 0.0373 - acc: 0.9873 - val_loss: 0.0607 - val_acc: 0.9855 122 | Epoch 40/200 123 | 50000/50000 [==============================] - 1s - loss: 0.0364 - acc: 0.9883 - val_loss: 0.0616 - val_acc: 0.9852 124 | Epoch 41/200 125 | 50000/50000 [==============================] - 1s - loss: 0.0342 - acc: 0.9888 - val_loss: 0.0560 - val_acc: 0.9858 126 | Epoch 42/200 127 | 50000/50000 [==============================] - 1s - loss: 0.0354 - acc: 0.9879 - val_loss: 0.0641 - val_acc: 0.9847 128 | Epoch 43/200 129 | 50000/50000 [==============================] - 1s - loss: 0.0359 - acc: 0.9874 - val_loss: 0.0607 - val_acc: 0.9847 130 | Epoch 44/200 131 | 50000/50000 [==============================] - 1s - loss: 0.0358 - acc: 0.9878 - val_loss: 0.0591 - val_acc: 0.9850 132 | Epoch 45/200 133 | 50000/50000 [==============================] - 1s - loss: 0.0344 - acc: 0.9880 - val_loss: 0.0597 - val_acc: 0.9851 134 | Epoch 46/200 135 | 50000/50000 [==============================] - 1s - loss: 0.0316 - acc: 0.9896 - val_loss: 0.0598 - val_acc: 0.9844 136 | Epoch 47/200 137 | 50000/50000 [==============================] - 1s - loss: 0.0362 - acc: 0.9879 - val_loss: 0.0603 - val_acc: 0.9859 138 | Epoch 48/200 139 | 50000/50000 [==============================] - 1s - loss: 0.0331 - acc: 0.9882 - val_loss: 0.0650 - val_acc: 0.9842 140 | Epoch 49/200 141 | 50000/50000 [==============================] - 1s - loss: 0.0315 - acc: 0.9895 - val_loss: 0.0576 - val_acc: 0.9858 142 | Epoch 50/200 143 | 50000/50000 
[==============================] - 1s - loss: 0.0310 - acc: 0.9897 - val_loss: 0.0590 - val_acc: 0.9861 144 | Epoch 51/200 145 | 50000/50000 [==============================] - 1s - loss: 0.0309 - acc: 0.9900 - val_loss: 0.0622 - val_acc: 0.9855 146 | Epoch 52/200 147 | 50000/50000 [==============================] - 1s - loss: 0.0298 - acc: 0.9899 - val_loss: 0.0612 - val_acc: 0.9854 148 | Epoch 53/200 149 | 50000/50000 [==============================] - 1s - loss: 0.0319 - acc: 0.9895 - val_loss: 0.0608 - val_acc: 0.9856 150 | Epoch 54/200 151 | 50000/50000 [==============================] - 1s - loss: 0.0326 - acc: 0.9887 - val_loss: 0.0654 - val_acc: 0.9852 152 | Epoch 55/200 153 | 50000/50000 [==============================] - 1s - loss: 0.0323 - acc: 0.9892 - val_loss: 0.0594 - val_acc: 0.9856 154 | Epoch 56/200 155 | 50000/50000 [==============================] - 1s - loss: 0.0303 - acc: 0.9896 - val_loss: 0.0601 - val_acc: 0.9856 156 | Epoch 57/200 157 | 50000/50000 [==============================] - 1s - loss: 0.0283 - acc: 0.9904 - val_loss: 0.0614 - val_acc: 0.9856 158 | Epoch 58/200 159 | 50000/50000 [==============================] - 1s - loss: 0.0310 - acc: 0.9902 - val_loss: 0.0611 - val_acc: 0.9845 160 | Epoch 59/200 161 | 50000/50000 [==============================] - 1s - loss: 0.0294 - acc: 0.9901 - val_loss: 0.0560 - val_acc: 0.9868 162 | Epoch 60/200 163 | 50000/50000 [==============================] - 1s - loss: 0.0285 - acc: 0.9908 - val_loss: 0.0596 - val_acc: 0.9861 164 | Epoch 61/200 165 | 50000/50000 [==============================] - 1s - loss: 0.0284 - acc: 0.9904 - val_loss: 0.0647 - val_acc: 0.9848 166 | Epoch 62/200 167 | 50000/50000 [==============================] - 1s - loss: 0.0286 - acc: 0.9903 - val_loss: 0.0586 - val_acc: 0.9860 168 | Epoch 63/200 169 | 50000/50000 [==============================] - 1s - loss: 0.0311 - acc: 0.9897 - val_loss: 0.0583 - val_acc: 0.9869 170 | Epoch 64/200 171 | 50000/50000 
[==============================] - 1s - loss: 0.0308 - acc: 0.9898 - val_loss: 0.0616 - val_acc: 0.9852 172 | Epoch 65/200 173 | 50000/50000 [==============================] - 1s - loss: 0.0266 - acc: 0.9910 - val_loss: 0.0624 - val_acc: 0.9854 174 | Epoch 66/200 175 | 50000/50000 [==============================] - 1s - loss: 0.0294 - acc: 0.9904 - val_loss: 0.0594 - val_acc: 0.9849 176 | Epoch 67/200 177 | 50000/50000 [==============================] - 1s - loss: 0.0265 - acc: 0.9908 - val_loss: 0.0612 - val_acc: 0.9860 178 | Epoch 68/200 179 | 50000/50000 [==============================] - 1s - loss: 0.0307 - acc: 0.9902 - val_loss: 0.0582 - val_acc: 0.9864 180 | Epoch 69/200 181 | 50000/50000 [==============================] - 1s - loss: 0.0278 - acc: 0.9909 - val_loss: 0.0589 - val_acc: 0.9864 182 | Epoch 70/200 183 | 50000/50000 [==============================] - 1s - loss: 0.0270 - acc: 0.9911 - val_loss: 0.0630 - val_acc: 0.9855 184 | Epoch 71/200 185 | 50000/50000 [==============================] - 1s - loss: 0.0266 - acc: 0.9915 - val_loss: 0.0637 - val_acc: 0.9850 186 | Epoch 72/200 187 | 50000/50000 [==============================] - 1s - loss: 0.0286 - acc: 0.9906 - val_loss: 0.0642 - val_acc: 0.9856 188 | Epoch 73/200 189 | 50000/50000 [==============================] - 1s - loss: 0.0300 - acc: 0.9898 - val_loss: 0.0621 - val_acc: 0.9873 190 | Epoch 74/200 191 | 50000/50000 [==============================] - 1s - loss: 0.0278 - acc: 0.9910 - val_loss: 0.0595 - val_acc: 0.9857 192 | Epoch 75/200 193 | 50000/50000 [==============================] - 1s - loss: 0.0254 - acc: 0.9921 - val_loss: 0.0578 - val_acc: 0.9866 194 | Epoch 76/200 195 | 50000/50000 [==============================] - 1s - loss: 0.0288 - acc: 0.9906 - val_loss: 0.0609 - val_acc: 0.9850 196 | Epoch 77/200 197 | 50000/50000 [==============================] - 1s - loss: 0.0266 - acc: 0.9913 - val_loss: 0.0594 - val_acc: 0.9859 198 | Epoch 78/200 199 | 50000/50000 
[==============================] - 1s - loss: 0.0252 - acc: 0.9915 - val_loss: 0.0614 - val_acc: 0.9855 200 | Epoch 79/200 201 | 50000/50000 [==============================] - 1s - loss: 0.0281 - acc: 0.9912 - val_loss: 0.0625 - val_acc: 0.9855 202 | Epoch 80/200 203 | 50000/50000 [==============================] - 1s - loss: 0.0270 - acc: 0.9907 - val_loss: 0.0612 - val_acc: 0.9857 204 | Epoch 81/200 205 | 50000/50000 [==============================] - 1s - loss: 0.0255 - acc: 0.9913 - val_loss: 0.0608 - val_acc: 0.9863 206 | Epoch 82/200 207 | 50000/50000 [==============================] - 1s - loss: 0.0256 - acc: 0.9916 - val_loss: 0.0650 - val_acc: 0.9854 208 | Epoch 83/200 209 | 50000/50000 [==============================] - 1s - loss: 0.0262 - acc: 0.9913 - val_loss: 0.0585 - val_acc: 0.9867 210 | Epoch 84/200 211 | 50000/50000 [==============================] - 1s - loss: 0.0266 - acc: 0.9912 - val_loss: 0.0631 - val_acc: 0.9852 212 | Epoch 85/200 213 | 50000/50000 [==============================] - 1s - loss: 0.0253 - acc: 0.9915 - val_loss: 0.0646 - val_acc: 0.9856 214 | Epoch 86/200 215 | 50000/50000 [==============================] - 1s - loss: 0.0262 - acc: 0.9914 - val_loss: 0.0628 - val_acc: 0.9851 216 | Epoch 87/200 217 | 50000/50000 [==============================] - 1s - loss: 0.0249 - acc: 0.9918 - val_loss: 0.0621 - val_acc: 0.9846 218 | Epoch 88/200 219 | 50000/50000 [==============================] - 1s - loss: 0.0266 - acc: 0.9914 - val_loss: 0.0595 - val_acc: 0.9862 220 | Epoch 89/200 221 | 50000/50000 [==============================] - 1s - loss: 0.0246 - acc: 0.9919 - val_loss: 0.0675 - val_acc: 0.9849 222 | Epoch 90/200 223 | 50000/50000 [==============================] - 1s - loss: 0.0234 - acc: 0.9924 - val_loss: 0.0592 - val_acc: 0.9853 224 | Epoch 91/200 225 | 50000/50000 [==============================] - 1s - loss: 0.0252 - acc: 0.9919 - val_loss: 0.0628 - val_acc: 0.9859 226 | Epoch 92/200 227 | 50000/50000 
[==============================] - 1s - loss: 0.0254 - acc: 0.9916 - val_loss: 0.0667 - val_acc: 0.9857 228 | Epoch 93/200 229 | 50000/50000 [==============================] - 1s - loss: 0.0250 - acc: 0.9916 - val_loss: 0.0627 - val_acc: 0.9862 230 | Epoch 94/200 231 | 50000/50000 [==============================] - 1s - loss: 0.0248 - acc: 0.9919 - val_loss: 0.0630 - val_acc: 0.9852 232 | Epoch 95/200 233 | 50000/50000 [==============================] - 1s - loss: 0.0261 - acc: 0.9916 - val_loss: 0.0674 - val_acc: 0.9853 234 | Epoch 96/200 235 | 50000/50000 [==============================] - 1s - loss: 0.0255 - acc: 0.9920 - val_loss: 0.0634 - val_acc: 0.9846 236 | Epoch 97/200 237 | 50000/50000 [==============================] - 1s - loss: 0.0235 - acc: 0.9920 - val_loss: 0.0629 - val_acc: 0.9855 238 | Epoch 98/200 239 | 50000/50000 [==============================] - 1s - loss: 0.0265 - acc: 0.9919 - val_loss: 0.0594 - val_acc: 0.9864 240 | Epoch 99/200 241 | 50000/50000 [==============================] - 1s - loss: 0.0232 - acc: 0.9926 - val_loss: 0.0597 - val_acc: 0.9870 242 | Epoch 100/200 243 | 50000/50000 [==============================] - 1s - loss: 0.0217 - acc: 0.9927 - val_loss: 0.0641 - val_acc: 0.9853 244 | Epoch 101/200 245 | 50000/50000 [==============================] - 1s - loss: 0.0238 - acc: 0.9922 - val_loss: 0.0609 - val_acc: 0.9864 246 | Epoch 102/200 247 | 50000/50000 [==============================] - 1s - loss: 0.0232 - acc: 0.9921 - val_loss: 0.0616 - val_acc: 0.9865 248 | Epoch 103/200 249 | 50000/50000 [==============================] - 1s - loss: 0.0260 - acc: 0.9923 - val_loss: 0.0578 - val_acc: 0.9861 250 | Epoch 104/200 251 | 50000/50000 [==============================] - 1s - loss: 0.0223 - acc: 0.9927 - val_loss: 0.0642 - val_acc: 0.9864 252 | Epoch 105/200 253 | 50000/50000 [==============================] - 1s - loss: 0.0204 - acc: 0.9931 - val_loss: 0.0608 - val_acc: 0.9870 254 | Epoch 106/200 255 | 50000/50000 
[==============================] - 1s - loss: 0.0233 - acc: 0.9925 - val_loss: 0.0599 - val_acc: 0.9856 256 | Epoch 107/200 257 | 50000/50000 [==============================] - 1s - loss: 0.0223 - acc: 0.9924 - val_loss: 0.0630 - val_acc: 0.9865 258 | Epoch 108/200 259 | 50000/50000 [==============================] - 1s - loss: 0.0229 - acc: 0.9927 - val_loss: 0.0635 - val_acc: 0.9863 260 | Epoch 109/200 261 | 50000/50000 [==============================] - 1s - loss: 0.0228 - acc: 0.9926 - val_loss: 0.0616 - val_acc: 0.9866 262 | Epoch 110/200 263 | 50000/50000 [==============================] - 1s - loss: 0.0222 - acc: 0.9931 - val_loss: 0.0605 - val_acc: 0.9859 264 | Epoch 111/200 265 | 50000/50000 [==============================] - 1s - loss: 0.0231 - acc: 0.9922 - val_loss: 0.0615 - val_acc: 0.9861 266 | Epoch 112/200 267 | 50000/50000 [==============================] - 1s - loss: 0.0202 - acc: 0.9936 - val_loss: 0.0656 - val_acc: 0.9863 268 | Epoch 113/200 269 | 50000/50000 [==============================] - 1s - loss: 0.0235 - acc: 0.9926 - val_loss: 0.0622 - val_acc: 0.9863 270 | Epoch 114/200 271 | 50000/50000 [==============================] - 1s - loss: 0.0219 - acc: 0.9929 - val_loss: 0.0621 - val_acc: 0.9864 272 | Epoch 115/200 273 | 50000/50000 [==============================] - 1s - loss: 0.0221 - acc: 0.9927 - val_loss: 0.0610 - val_acc: 0.9882 274 | Epoch 116/200 275 | 50000/50000 [==============================] - 1s - loss: 0.0232 - acc: 0.9927 - val_loss: 0.0661 - val_acc: 0.9855 276 | Epoch 117/200 277 | 50000/50000 [==============================] - 1s - loss: 0.0237 - acc: 0.9928 - val_loss: 0.0676 - val_acc: 0.9865 278 | Epoch 118/200 279 | 50000/50000 [==============================] - 1s - loss: 0.0251 - acc: 0.9921 - val_loss: 0.0637 - val_acc: 0.9867 280 | Epoch 119/200 281 | 50000/50000 [==============================] - 1s - loss: 0.0223 - acc: 0.9927 - val_loss: 0.0690 - val_acc: 0.9855 282 | Epoch 120/200 283 | 50000/50000 
[==============================] - 1s - loss: 0.0222 - acc: 0.9932 - val_loss: 0.0632 - val_acc: 0.9863 284 | Epoch 121/200 285 | 50000/50000 [==============================] - 1s - loss: 0.0223 - acc: 0.9929 - val_loss: 0.0656 - val_acc: 0.9862 286 | Epoch 122/200 287 | 50000/50000 [==============================] - 1s - loss: 0.0228 - acc: 0.9925 - val_loss: 0.0648 - val_acc: 0.9861 288 | Epoch 123/200 289 | 50000/50000 [==============================] - 1s - loss: 0.0227 - acc: 0.9928 - val_loss: 0.0640 - val_acc: 0.9862 290 | Epoch 124/200 291 | 50000/50000 [==============================] - 1s - loss: 0.0204 - acc: 0.9935 - val_loss: 0.0673 - val_acc: 0.9875 292 | Epoch 125/200 293 | 50000/50000 [==============================] - 1s - loss: 0.0222 - acc: 0.9930 - val_loss: 0.0690 - val_acc: 0.9861 294 | Epoch 126/200 295 | 50000/50000 [==============================] - 1s - loss: 0.0226 - acc: 0.9930 - val_loss: 0.0685 - val_acc: 0.9854 296 | Epoch 127/200 297 | 50000/50000 [==============================] - 1s - loss: 0.0189 - acc: 0.9940 - val_loss: 0.0690 - val_acc: 0.9849 298 | Epoch 128/200 299 | 50000/50000 [==============================] - 1s - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0664 - val_acc: 0.9857 300 | Epoch 129/200 301 | 50000/50000 [==============================] - 1s - loss: 0.0186 - acc: 0.9940 - val_loss: 0.0672 - val_acc: 0.9854 302 | Epoch 130/200 303 | 50000/50000 [==============================] - 1s - loss: 0.0207 - acc: 0.9934 - val_loss: 0.0658 - val_acc: 0.9851 304 | Epoch 131/200 305 | 50000/50000 [==============================] - 1s - loss: 0.0236 - acc: 0.9927 - val_loss: 0.0667 - val_acc: 0.9861 306 | Epoch 132/200 307 | 50000/50000 [==============================] - 1s - loss: 0.0225 - acc: 0.9932 - val_loss: 0.0636 - val_acc: 0.9857 308 | Epoch 133/200 309 | 50000/50000 [==============================] - 1s - loss: 0.0223 - acc: 0.9928 - val_loss: 0.0669 - val_acc: 0.9849 310 | Epoch 134/200 311 | 50000/50000 
[==============================] - 1s - loss: 0.0236 - acc: 0.9921 - val_loss: 0.0614 - val_acc: 0.9861 312 | Epoch 135/200 313 | 50000/50000 [==============================] - 1s - loss: 0.0215 - acc: 0.9933 - val_loss: 0.0630 - val_acc: 0.9855 314 | Epoch 136/200 315 | 50000/50000 [==============================] - 1s - loss: 0.0225 - acc: 0.9927 - val_loss: 0.0626 - val_acc: 0.9850 316 | Epoch 137/200 317 | 50000/50000 [==============================] - 1s - loss: 0.0225 - acc: 0.9928 - val_loss: 0.0640 - val_acc: 0.9852 318 | Epoch 138/200 319 | 50000/50000 [==============================] - 1s - loss: 0.0197 - acc: 0.9934 - val_loss: 0.0642 - val_acc: 0.9862 320 | Epoch 139/200 321 | 50000/50000 [==============================] - 1s - loss: 0.0193 - acc: 0.9936 - val_loss: 0.0657 - val_acc: 0.9859 322 | Epoch 140/200 323 | 50000/50000 [==============================] - 1s - loss: 0.0196 - acc: 0.9936 - val_loss: 0.0624 - val_acc: 0.9862 324 | Epoch 141/200 325 | 50000/50000 [==============================] - 1s - loss: 0.0208 - acc: 0.9935 - val_loss: 0.0653 - val_acc: 0.9850 326 | Epoch 142/200 327 | 50000/50000 [==============================] - 1s - loss: 0.0203 - acc: 0.9931 - val_loss: 0.0670 - val_acc: 0.9860 328 | Epoch 143/200 329 | 50000/50000 [==============================] - 1s - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0652 - val_acc: 0.9865 330 | Epoch 144/200 331 | 50000/50000 [==============================] - 1s - loss: 0.0211 - acc: 0.9935 - val_loss: 0.0644 - val_acc: 0.9866 332 | Epoch 145/200 333 | 50000/50000 [==============================] - 1s - loss: 0.0198 - acc: 0.9938 - val_loss: 0.0640 - val_acc: 0.9858 334 | Epoch 146/200 335 | 50000/50000 [==============================] - 1s - loss: 0.0201 - acc: 0.9939 - val_loss: 0.0653 - val_acc: 0.9866 336 | Epoch 147/200 337 | 50000/50000 [==============================] - 1s - loss: 0.0198 - acc: 0.9938 - val_loss: 0.0665 - val_acc: 0.9869 338 | Epoch 148/200 339 | 50000/50000 
[==============================] - 1s - loss: 0.0209 - acc: 0.9932 - val_loss: 0.0635 - val_acc: 0.9878 340 | Epoch 149/200 341 | 50000/50000 [==============================] - 1s - loss: 0.0201 - acc: 0.9940 - val_loss: 0.0642 - val_acc: 0.9873 342 | Epoch 150/200 343 | 50000/50000 [==============================] - 1s - loss: 0.0215 - acc: 0.9936 - val_loss: 0.0638 - val_acc: 0.9860 344 | Epoch 151/200 345 | 50000/50000 [==============================] - 1s - loss: 0.0210 - acc: 0.9934 - val_loss: 0.0612 - val_acc: 0.9868 346 | Epoch 152/200 347 | 50000/50000 [==============================] - 1s - loss: 0.0200 - acc: 0.9934 - val_loss: 0.0651 - val_acc: 0.9864 348 | Epoch 153/200 349 | 50000/50000 [==============================] - 1s - loss: 0.0200 - acc: 0.9937 - val_loss: 0.0661 - val_acc: 0.9862 350 | Epoch 154/200 351 | 50000/50000 [==============================] - 1s - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0659 - val_acc: 0.9862 352 | Epoch 155/200 353 | 50000/50000 [==============================] - 1s - loss: 0.0204 - acc: 0.9938 - val_loss: 0.0652 - val_acc: 0.9875 354 | Epoch 156/200 355 | 50000/50000 [==============================] - 1s - loss: 0.0192 - acc: 0.9941 - val_loss: 0.0650 - val_acc: 0.9866 356 | Epoch 157/200 357 | 50000/50000 [==============================] - 1s - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0694 - val_acc: 0.9860 358 | Epoch 158/200 359 | 50000/50000 [==============================] - 1s - loss: 0.0205 - acc: 0.9939 - val_loss: 0.0643 - val_acc: 0.9873 360 | Epoch 159/200 361 | 50000/50000 [==============================] - 1s - loss: 0.0187 - acc: 0.9938 - val_loss: 0.0620 - val_acc: 0.9868 362 | Epoch 160/200 363 | 50000/50000 [==============================] - 1s - loss: 0.0225 - acc: 0.9929 - val_loss: 0.0641 - val_acc: 0.9872 364 | Epoch 161/200 365 | 50000/50000 [==============================] - 1s - loss: 0.0196 - acc: 0.9939 - val_loss: 0.0653 - val_acc: 0.9870 366 | Epoch 162/200 367 | 50000/50000 
[==============================] - 1s - loss: 0.0208 - acc: 0.9933 - val_loss: 0.0652 - val_acc: 0.9868 368 | Epoch 163/200 369 | 50000/50000 [==============================] - 1s - loss: 0.0202 - acc: 0.9935 - val_loss: 0.0653 - val_acc: 0.9861 370 | Epoch 164/200 371 | 50000/50000 [==============================] - 1s - loss: 0.0201 - acc: 0.9936 - val_loss: 0.0649 - val_acc: 0.9850 372 | Epoch 165/200 373 | 50000/50000 [==============================] - 1s - loss: 0.0187 - acc: 0.9941 - val_loss: 0.0669 - val_acc: 0.9856 374 | Epoch 166/200 375 | 50000/50000 [==============================] - 1s - loss: 0.0202 - acc: 0.9935 - val_loss: 0.0640 - val_acc: 0.9865 376 | Epoch 167/200 377 | 50000/50000 [==============================] - 1s - loss: 0.0201 - acc: 0.9940 - val_loss: 0.0666 - val_acc: 0.9854 378 | Epoch 168/200 379 | 50000/50000 [==============================] - 1s - loss: 0.0208 - acc: 0.9934 - val_loss: 0.0666 - val_acc: 0.9866 380 | Epoch 169/200 381 | 50000/50000 [==============================] - 1s - loss: 0.0201 - acc: 0.9937 - val_loss: 0.0681 - val_acc: 0.9861 382 | Epoch 170/200 383 | 50000/50000 [==============================] - 1s - loss: 0.0209 - acc: 0.9939 - val_loss: 0.0669 - val_acc: 0.9863 384 | Epoch 171/200 385 | 50000/50000 [==============================] - 1s - loss: 0.0228 - acc: 0.9933 - val_loss: 0.0613 - val_acc: 0.9876 386 | Epoch 172/200 387 | 50000/50000 [==============================] - 1s - loss: 0.0213 - acc: 0.9934 - val_loss: 0.0637 - val_acc: 0.9858 388 | Epoch 173/200 389 | 50000/50000 [==============================] - 1s - loss: 0.0192 - acc: 0.9941 - val_loss: 0.0685 - val_acc: 0.9859 390 | Epoch 174/200 391 | 50000/50000 [==============================] - 1s - loss: 0.0212 - acc: 0.9935 - val_loss: 0.0648 - val_acc: 0.9876 392 | Epoch 175/200 393 | 50000/50000 [==============================] - 1s - loss: 0.0226 - acc: 0.9937 - val_loss: 0.0656 - val_acc: 0.9865 394 | Epoch 176/200 395 | 50000/50000 
[==============================] - 1s - loss: 0.0182 - acc: 0.9939 - val_loss: 0.0719 - val_acc: 0.9859
396 | Epoch 177/200
397 | 50000/50000 [==============================] - 1s - loss: 0.0173 - acc: 0.9944 - val_loss: 0.0667 - val_acc: 0.9859
398 | Epoch 178/200
399 | 50000/50000 [==============================] - 1s - loss: 0.0207 - acc: 0.9936 - val_loss: 0.0622 - val_acc: 0.9867
400 | Epoch 179/200
401 | 50000/50000 [==============================] - 1s - loss: 0.0196 - acc: 0.9938 - val_loss: 0.0675 - val_acc: 0.9870
402 | Epoch 180/200
403 | 50000/50000 [==============================] - 1s - loss: 0.0204 - acc: 0.9939 - val_loss: 0.0696 - val_acc: 0.9864
404 | Epoch 181/200
405 | 50000/50000 [==============================] - 1s - loss: 0.0199 - acc: 0.9939 - val_loss: 0.0676 - val_acc: 0.9864
406 | Epoch 182/200
407 | 50000/50000 [==============================] - 1s - loss: 0.0189 - acc: 0.9938 - val_loss: 0.0673 - val_acc: 0.9861
408 | Epoch 183/200
409 | 50000/50000 [==============================] - 1s - loss: 0.0184 - acc: 0.9941 - val_loss: 0.0686 - val_acc: 0.9856
410 | Epoch 184/200
411 | 50000/50000 [==============================] - 1s - loss: 0.0195 - acc: 0.9941 - val_loss: 0.0690 - val_acc: 0.9861
412 | Epoch 185/200
413 | 50000/50000 [==============================] - 1s - loss: 0.0216 - acc: 0.9932 - val_loss: 0.0660 - val_acc: 0.9866
414 | Epoch 186/200
415 | 50000/50000 [==============================] - 1s - loss: 0.0199 - acc: 0.9939 - val_loss: 0.0682 - val_acc: 0.9864
416 | Epoch 187/200
417 | 50000/50000 [==============================] - 1s - loss: 0.0193 - acc: 0.9941 - val_loss: 0.0650 - val_acc: 0.9865
418 | Epoch 188/200
419 | 50000/50000 [==============================] - 1s - loss: 0.0190 - acc: 0.9941 - val_loss: 0.0685 - val_acc: 0.9858
420 | Epoch 189/200
421 | 50000/50000 [==============================] - 1s - loss: 0.0206 - acc: 0.9936 - val_loss: 0.0698 - val_acc: 0.9855
422 | Epoch 190/200
423 | 50000/50000 [==============================] - 1s - loss: 0.0214 - acc: 0.9934 - val_loss: 0.0643 - val_acc: 0.9877
424 | Epoch 191/200
425 | 50000/50000 [==============================] - 1s - loss: 0.0203 - acc: 0.9937 - val_loss: 0.0662 - val_acc: 0.9875
426 | Epoch 192/200
427 | 50000/50000 [==============================] - 1s - loss: 0.0201 - acc: 0.9937 - val_loss: 0.0654 - val_acc: 0.9877
428 | Epoch 193/200
429 | 50000/50000 [==============================] - 1s - loss: 0.0186 - acc: 0.9938 - val_loss: 0.0635 - val_acc: 0.9864
430 | Epoch 194/200
431 | 50000/50000 [==============================] - 1s - loss: 0.0200 - acc: 0.9933 - val_loss: 0.0676 - val_acc: 0.9862
432 | Epoch 195/200
433 | 50000/50000 [==============================] - 1s - loss: 0.0172 - acc: 0.9945 - val_loss: 0.0666 - val_acc: 0.9863
434 | Epoch 196/200
435 | 50000/50000 [==============================] - 1s - loss: 0.0178 - acc: 0.9944 - val_loss: 0.0707 - val_acc: 0.9854
436 | Epoch 197/200
437 | 50000/50000 [==============================] - 1s - loss: 0.0204 - acc: 0.9935 - val_loss: 0.0677 - val_acc: 0.9865
438 | Epoch 198/200
439 | 50000/50000 [==============================] - 1s - loss: 0.0206 - acc: 0.9941 - val_loss: 0.0658 - val_acc: 0.9858
440 | Epoch 199/200
441 | 50000/50000 [==============================] - 1s - loss: 0.0199 - acc: 0.9939 - val_loss: 0.0653 - val_acc: 0.9861
442 | Epoch 200/200
443 | 50000/50000 [==============================] - 1s - loss: 0.0178 - acc: 0.9946 - val_loss: 0.0666 - val_acc: 0.9859
444 | Total training time: 282.475250s
445 | Test score: 0.0543896225868
446 | Test accuracy: 0.9871
447 |
--------------------------------------------------------------------------------
/Result_TensorLayer:
--------------------------------------------------------------------------------
1 | I tensorflow/core/common_runtime/gpu/gpu_device.cc:839] Creating TensorFlow device (/gpu:0) -> (device: 0, name: GeForce GTX 980, pci bus id: 0000:07:00.0)
2 | Exception AssertionError: AssertionError("Nesting violated for default stack of objects",) in > ignored
3 | tensorlayer:Instantiate InputLayer input_layer (?, 784)
4 | tensorlayer:Instantiate DropoutLayer drop1: keep: 0.800000
5 | tensorlayer:Instantiate DenseLayer relu1: 800,
6 | tensorlayer:Instantiate DropoutLayer drop2: keep: 0.500000
7 | tensorlayer:Instantiate DenseLayer relu2: 800,
8 | tensorlayer:Instantiate DropoutLayer drop3: keep: 0.500000
9 | tensorlayer:Instantiate DenseLayer output_layer: 10,
10 | param 0: (784, 800) (mean: -0.000044, median: -0.000083, std: 0.087911) relu1/W:0
11 | param 1: (800,) (mean: 0.000000, median: 0.000000, std: 0.000000) relu1/b:0
12 | param 2: (800, 800) (mean: 0.000086, median: 0.000174, std: 0.087993) relu2/W:0
13 | param 3: (800,) (mean: 0.000000, median: 0.000000, std: 0.000000) relu2/b:0
14 | param 4: (800, 10) (mean: -0.000220, median: -0.000951, std: 0.087175) output_layer/W:0
15 | param 5: (10,) (mean: 0.000000, median: 0.000000, std: 0.000000) output_layer/b:0
16 | num of params: 1276810
17 | layer 0: Tensor("dropout/mul_1:0", shape=(?, 784), dtype=float32)
18 | layer 1: Tensor("Relu:0", shape=(?, 800), dtype=float32)
19 | layer 2: Tensor("dropout_1/mul_1:0", shape=(?, 800), dtype=float32)
20 | layer 3: Tensor("Relu_1:0", shape=(?, 800), dtype=float32)
21 | layer 4: Tensor("dropout_2/mul_1:0", shape=(?, 800), dtype=float32)
22 | layer 5: Tensor("add_2:0", shape=(?, 10), dtype=float32)
23 | Start training the network ...
24 | Epoch 1 of 200 took 0.731579s
25 | val loss: 0.562035
26 | val acc: 0.825000
27 | Epoch 5 of 200 took 0.557835s
28 | val loss: 0.286251
29 | val acc: 0.916000
30 | Epoch 10 of 200 took 0.558575s
31 | val loss: 0.222106
32 | val acc: 0.937500
33 | Epoch 15 of 200 took 0.575262s
34 | val loss: 0.188165
35 | val acc: 0.948100
36 | Epoch 20 of 200 took 0.564921s
37 | val loss: 0.164434
38 | val acc: 0.955400
39 | Epoch 25 of 200 took 0.572334s
40 | val loss: 0.145848
41 | val acc: 0.960100
42 | Epoch 30 of 200 took 0.579468s
43 | val loss: 0.132819
44 | val acc: 0.963700
45 | Epoch 35 of 200 took 0.561392s
46 | val loss: 0.122657
47 | val acc: 0.966300
48 | Epoch 40 of 200 took 0.564490s
49 | val loss: 0.113542
50 | val acc: 0.968400
51 | Epoch 45 of 200 took 0.569971s
52 | val loss: 0.106994
53 | val acc: 0.969100
54 | Epoch 50 of 200 took 0.573973s
55 | val loss: 0.099668
56 | val acc: 0.971400
57 | Epoch 55 of 200 took 0.555727s
58 | val loss: 0.094177
59 | val acc: 0.972400
60 | Epoch 60 of 200 took 0.575835s
61 | val loss: 0.091001
62 | val acc: 0.973500
63 | Epoch 65 of 200 took 0.584385s
64 | val loss: 0.087226
65 | val acc: 0.973800
66 | Epoch 70 of 200 took 0.568456s
67 | val loss: 0.083294
68 | val acc: 0.975000
69 | Epoch 75 of 200 took 0.563503s
70 | val loss: 0.079863
71 | val acc: 0.976700
72 | Epoch 80 of 200 took 0.570911s
73 | val loss: 0.077562
74 | val acc: 0.977200
75 | Epoch 85 of 200 took 0.568649s
76 | val loss: 0.075622
77 | val acc: 0.978000
78 | Epoch 90 of 200 took 0.572381s
79 | val loss: 0.073526
80 | val acc: 0.977800
81 | Epoch 95 of 200 took 0.573557s
82 | val loss: 0.071414
83 | val acc: 0.979100
84 | Epoch 100 of 200 took 0.573976s
85 | val loss: 0.069486
86 | val acc: 0.980200
87 | Epoch 105 of 200 took 0.566601s
88 | val loss: 0.068012
89 | val acc: 0.980900
90 | Epoch 110 of 200 took 0.559803s
91 | val loss: 0.066410
92 | val acc: 0.980700
93 | Epoch 115 of 200 took 0.572070s
94 | val loss: 0.065095
95 | val acc: 0.980900
96 | Epoch 120 of 200 took 0.565401s
97 | val loss: 0.063958
98 | val acc: 0.981500
99 | Epoch 125 of 200 took 0.572880s
100 | val loss: 0.063724
101 | val acc: 0.982100
102 | Epoch 130 of 200 took 0.560457s
103 | val loss: 0.062261
104 | val acc: 0.982800
105 | Epoch 135 of 200 took 0.565308s
106 | val loss: 0.061522
107 | val acc: 0.982500
108 | Epoch 140 of 200 took 0.566607s
109 | val loss: 0.060634
110 | val acc: 0.983000
111 | Epoch 145 of 200 took 0.557342s
112 | val loss: 0.059505
113 | val acc: 0.982600
114 | Epoch 150 of 200 took 0.572337s
115 | val loss: 0.059506
116 | val acc: 0.983400
117 | Epoch 155 of 200 took 0.567339s
118 | val loss: 0.058481
119 | val acc: 0.983700
120 | Epoch 160 of 200 took 0.570569s
121 | val loss: 0.058500
122 | val acc: 0.983600
123 | Epoch 165 of 200 took 0.570399s
124 | val loss: 0.056760
125 | val acc: 0.983600
126 | Epoch 170 of 200 took 0.574436s
127 | val loss: 0.057231
128 | val acc: 0.984000
129 | Epoch 175 of 200 took 0.552374s
130 | val loss: 0.056794
131 | val acc: 0.984000
132 | Epoch 180 of 200 took 0.562672s
133 | val loss: 0.056608
134 | val acc: 0.984000
135 | Epoch 185 of 200 took 0.564349s
136 | val loss: 0.055433
137 | val acc: 0.984100
138 | Epoch 190 of 200 took 0.573892s
139 | val loss: 0.056283
140 | val acc: 0.983900
141 | Epoch 195 of 200 took 0.562397s
142 | val loss: 0.054887
143 | val acc: 0.984400
144 | Epoch 200 of 200 took 0.571549s
145 | val loss: 0.055525
146 | val acc: 0.984400
147 | Total training time: 116.670978s
148 | Start testing the network ...
149 | test loss: 0.049633
150 | test acc: 0.985300
151 | Model is saved to: model.npz
152 |
--------------------------------------------------------------------------------
/keras_mnist.py:
--------------------------------------------------------------------------------
1 | #! /usr/bin/python
2 | # -*- coding: utf8 -*-
3 |
4 |
5 |
6 |
7 | from __future__ import print_function
8 | import numpy as np
9 | np.random.seed(1337)  # for reproducibility
10 |
11 | from keras.datasets import mnist
12 | from keras.models import Sequential
13 | from keras.layers.core import Dense, Dropout, Activation
14 | from keras.optimizers import SGD, Adam, RMSprop
15 | from keras.utils import np_utils
16 | import time
17 |
18 | nb_classes = 10
19 |
20 | # the data, shuffled and split between train and test sets
21 | (X_train, y_train), (X_test, y_test) = mnist.load_data()
22 |
23 | X_train = X_train.reshape(60000, 784)
24 | X_test = X_test.reshape(10000, 784)
25 | X_train = X_train.astype('float32')
26 | X_test = X_test.astype('float32')
27 | X_train /= 255
28 | X_test /= 255
29 |
30 | # convert class vectors to binary class matrices
31 | Y_train = np_utils.to_categorical(y_train, nb_classes)
32 | Y_test = np_utils.to_categorical(y_test, nb_classes)
33 |
34 | X_train, X_val = X_train[:-10000], X_train[-10000:]
35 | Y_train, Y_val = Y_train[:-10000], Y_train[-10000:]
36 |
37 | print(X_train.shape[0], 'train samples')
38 | print(X_val.shape[0], 'val samples')
39 | print(X_test.shape[0], 'test samples')
40 |
41 |
42 | print(Y_train.shape, Y_test.shape)
43 |
44 | model = Sequential()
45 | model.add(Dropout(0.2, input_shape=(784,)))
46 | model.add(Dense(800))
47 | model.add(Activation('relu'))
48 | model.add(Dropout(0.5))
49 | model.add(Dense(800))
50 | model.add(Activation('relu'))
51 | model.add(Dropout(0.5))
52 | model.add(Dense(10))
53 | model.add(Activation('softmax'))
54 |
55 | model.summary()
56 |
57 | model.compile(loss='categorical_crossentropy',
58 |               optimizer=Adam(),
59 |               metrics=['accuracy'])
60 |
61 | start_time = time.time()
62 | history = model.fit(X_train, Y_train,
63 |                     batch_size=500, nb_epoch=200,
64 |                     verbose=1, validation_data=(X_val, Y_val))
65 | print("Total training time: %fs" % (time.time() - start_time))
66 | score = model.evaluate(X_test, Y_test, verbose=0)
67 | print('Test score:', score[0])
68 | print('Test accuracy:', score[1])
69 |
--------------------------------------------------------------------------------
/tensorlayer_mnist.py:
--------------------------------------------------------------------------------
1 |
2 | #! /usr/bin/python
3 | # -*- coding: utf8 -*-
4 |
5 |
6 | import tensorflow as tf
7 | import tensorlayer as tl
8 | import time
9 |
10 | sess = tf.InteractiveSession()
11 |
12 | # prepare data
13 | X_train, y_train, X_val, y_val, X_test, y_test = \
14 |     tl.files.load_mnist_dataset(shape=(-1, 784))
15 |
16 |
17 | # define placeholder
18 | x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
19 | y_ = tf.placeholder(tf.int64, shape=[None, ], name='y_')
20 |
21 | # define the network
22 | network = tl.layers.InputLayer(x, name='input_layer')
23 | network = tl.layers.DropoutLayer(network, keep=0.8, name='drop1')
24 | network = tl.layers.DenseLayer(network, n_units=800,
25 |     act=tf.nn.relu, name='relu1')
26 | network = tl.layers.DropoutLayer(network, keep=0.5, name='drop2')
27 | network = tl.layers.DenseLayer(network, n_units=800,
28 |     act=tf.nn.relu, name='relu2')
29 | network = tl.layers.DropoutLayer(network, keep=0.5, name='drop3')
30 | network = tl.layers.DenseLayer(network, n_units=10,
31 |     act=tl.activation.identity,  # the softmax is implemented in tl.cost.cross_entropy(y, y_) to speed up computation; see tf.nn.sparse_softmax_cross_entropy_with_logits
32 |     name='output_layer')
33 | # define cost function and metric.
34 | y = network.outputs
35 | cost = tl.cost.cross_entropy(y, y_)
36 | correct_prediction = tf.equal(tf.argmax(y, 1), y_)
37 | acc = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
38 | y_op = tf.argmax(tf.nn.softmax(y), 1)
39 |
40 | # define the optimizer
41 | train_params = network.all_params
42 | train_op = tf.train.AdamOptimizer(learning_rate=0.0001, beta1=0.9, beta2=0.999,
43 |     epsilon=1e-08, use_locking=False).minimize(cost, var_list=train_params)
44 |
45 | # initialize all variables
46 | sess.run(tf.initialize_all_variables())
47 |
48 | # print network information
49 | network.print_params()
50 | network.print_layers()
51 |
52 | # train the network
53 | tl.utils.fit(sess, network, train_op, cost, X_train, y_train, x, y_,
54 |     acc=acc, batch_size=500, n_epoch=200, print_freq=5,
55 |     # X_val=None, y_val=None, eval_train=False)
56 |     X_val=X_val, y_val=y_val, eval_train=False)  # use this
57 |
58 | # evaluation
59 | tl.utils.test(sess, network, acc, X_test, y_test, x, y_, batch_size=None, cost=cost)
60 |
61 | # save the network to .npz file
62 | tl.files.save_npz(network.all_params, name='model.npz')
63 | sess.close()
64 |
--------------------------------------------------------------------------------
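The README's s/epoch figures follow directly from the "Total training time" lines in the two logs above. A minimal sketch of the arithmetic (the helper `seconds_per_epoch` is ours for illustration, not part of either script):

```python
# Derive the README's seconds-per-epoch numbers from the logged totals.
N_EPOCHS = 200  # both runs train for 200 epochs

def seconds_per_epoch(total_seconds, n_epochs=N_EPOCHS):
    """Average wall-clock seconds per epoch, rounded to 2 decimals."""
    return round(total_seconds / n_epochs, 2)

if __name__ == "__main__":
    # Totals copied from Result_Keras and Result_TensorLayer.
    print("Keras:      ", seconds_per_epoch(282.475250), "s/epoch")  # 1.41
    print("TensorLayer:", seconds_per_epoch(116.670978), "s/epoch")  # 0.58
```

Note the wall-clock total includes per-epoch validation, so this is an upper bound on pure training time per epoch.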