├── 1 什么是深度学习.ipynb ├── 2 神经网络的数学基础.ipynb ├── 2.1-MINST手写数字-多分类问题(初识神经网络).ipynb ├── 3 神经网络入门.ipynb ├── 3.4-IMDB电影评论-二分类问题.ipynb ├── 3.5-reuters新闻数据-单标签多分类问题.ipynb ├── 3.6-boston_housing房价预测-回归问题.ipynb ├── 4 机器学习基础-通用工作流程.ipynb ├── 5.1-CNN简介.ipynb ├── 5.2-小型数据集上训练一个CNN.ipynb ├── 5.3-使用预训练的CNN.ipynb ├── 5.4-CNN的可视化.ipynb ├── 5.4-visualizing-what-convnets-learn.ipynb ├── 6.1.1-单词和字符的one-hot编码.ipynb ├── 6.1.2 使用词嵌入.ipynb ├── 6.2-理解RNN.ipynb ├── 6.3-RNN高级用法.ipynb ├── 6.4-使用CNN处理序列.ipynb ├── 7-高级深度学习最佳实践.ipynb ├── 8.1-使用LSTM生成文本.ipynb ├── 8.2-deep-dream.ipynb ├── 8.3-神经风格迁移.ipynb ├── 8.4-变分自编码器-VAE网络.ipynb ├── 8.5-生成式对抗网络-GAN之DCGAN.ipynb ├── SourceHanSansSC-ExtraLight.otf ├── creative_commons_elephant.jpg ├── elephant_cam.jpg ├── elephant_cam1.jpg ├── model.png ├── runoob.npz └── test.png /1 什么是深度学习.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "1. Three ingredients of machine learning\n", 10 | " (1) Input data points\n", 11 | " (2) Examples of the expected output\n", 12 | " (3) A way to measure whether the algorithm is doing a good job: compute the gap between the current output and the expected output. Adjusting the model to close this gap is what we call learning\n", 13 | "2. Terminology\n", 14 | " Shallow learning: networks with only a few layers\n", 15 | " Deep learning: networks typically with ten or more layers\n", 16 | " Note: neural networks were inspired by neurobiology, but deep-learning models are not models of the brain. There is no evidence that the brain learns the way modern deep-learning models do\n", 17 | "3. What deep learning has achieved\n", 18 | " Image classification\n", 19 | " Speech recognition\n", 20 | " Handwriting transcription\n", 21 | " Machine translation\n", 22 | " Text-to-speech conversion\n", 23 | " Digital assistants (Google Now | Amazon Alexa)\n", 24 | " Autonomous driving\n", 25 | " Targeted advertising (search | recommendation)\n", 26 | "4. A brief history of machine learning\n", 27 | "(1) Probabilistic modeling\n", 28 | " Naive Bayes: assumes the input features are all independent\n", 29 | " Logistic regression: also a probabilistic classification algorithm\n", 30 | "(2) Early neural networks\n", 31 | " Backpropagation + gradient-descent optimization\n", 32 | "(3) Kernel methods\n", 33 | " SVM: maps the data to a new high-dimensional representation\n", 34 | " Kernel function: computes the distance between points in the new space efficiently, without ever computing the new representation explicitly\n", 35 | " Drawbacks: hard to scale to large datasets; poor results on perceptual problems such as image classification; a fairly shallow method.\n", 36 | "(4) Decision trees, random forests, gradient boosting machines\n", 37 | " Techniques that ensemble weak learners\n", 38 | "(5) Back to neural networks\n", 39 | " Deep convolutional neural networks have become the go-to algorithm for computer-vision tasks.\n", 40 | " They automate feature engineering entirely\n", 41 | "(6) Machine learning today\n", 42 | " Gradient boosting machines: for shallow-learning problems on structured data; the XGBoost library\n", 43 | " Deep learning: for perceptual problems such as image classification; the Keras library\n", 
44 | "5. Algorithms\n", 45 | " Problem: as gradients propagate back through many layers, the network's feedback signal fades away\n", 46 | " Fix: methods that help gradients propagate (e.g. batch normalization, residual connections, depthwise separable convolutions), which make models with thousands of layers trainable" 47 | ] 48 | }, 49 | { 50 | "cell_type": "markdown", 51 | "metadata": {}, 52 | "source": [ 53 | "![word embeddings vs. one hot encoding](./test.png)" 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": null, 59 | "metadata": {}, 60 | "outputs": [], 61 | "source": [] 62 | } 63 | ], 64 | "metadata": { 65 | "kernelspec": { 66 | "display_name": "Python [conda env:learn36]", 67 | "language": "python", 68 | "name": "conda-env-learn36-py" 69 | }, 70 | "language_info": { 71 | "codemirror_mode": { 72 | "name": "ipython", 73 | "version": 3 74 | }, 75 | "file_extension": ".py", 76 | "mimetype": "text/x-python", 77 | "name": "python", 78 | "nbconvert_exporter": "python", 79 | "pygments_lexer": "ipython3", 80 | "version": "3.6.8" 81 | }, 82 | "toc": { 83 | "base_numbering": 1, 84 | "nav_menu": {}, 85 | "number_sections": true, 86 | "sideBar": true, 87 | "skip_h1_title": false, 88 | "title_cell": "Table of Contents", 89 | "title_sidebar": "Contents", 90 | "toc_cell": false, 91 | "toc_position": {}, 92 | "toc_section_display": true, 93 | "toc_window_display": false 94 | } 95 | }, 96 | "nbformat": 4, 97 | "nbformat_minor": 2 98 | } 99 | -------------------------------------------------------------------------------- /2 神经网络的数学基础.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "raw", 5 | "metadata": {}, 6 | "source": [ 7 | "~~~\n", 8 | " Data types: string tensors do not exist in numpy (or most other libraries), because a tensor lives in a preallocated, contiguous segment of memory, while strings have variable length and cannot be stored that way\n", 9 | "2.2.8 Real-world data tensors\n", 10 | " Vector data: 2D tensors (samples, features)\n", 11 | " Timeseries or sequence data: 3D tensors (samples, timesteps, features)\n", 12 | " Images: 4D tensors (samples, height, width, channels) or (samples, channels, height, width)\n", 13 | " Video: 5D tensors (samples, frames, height, width, channels) or (samples, frames, channels, height, width)\n", 14 | " channels-last: the TensorFlow default\n", 15 | " 
channels-first: the Theano default\n", 16 | "2.3 Tensor operations\n", 17 | " Element-wise operations\n", 18 | " Broadcasting\n", 19 | " Dot product: (a,b,c,d) . (d,e) = (a,b,c,e)\n", 20 | " Tensor reshaping; geometric interpretation: affine transformations (a linear map from one space to another, which is what a Dense layer computes), rotation, scaling\n", 21 | "2.4 Gradient-based optimization\n", 22 | " Optimizers: SGD, Adagrad, RMSProp, etc. (SGD with momentum addresses two issues: convergence speed and local minima)\n", 23 | " Backpropagation: essentially propagates the error backwards, using the derivative of each activation σ(z) with respect to z\n", 24 | "~~~" 25 | ] 26 | } 27 | ], 28 | "metadata": { 29 | "kernelspec": { 30 | "display_name": "Python [conda env:tf-py36]", 31 | "language": "python", 32 | "name": "conda-env-tf-py36-py" 33 | }, 34 | "language_info": { 35 | "codemirror_mode": { 36 | "name": "ipython", 37 | "version": 3 38 | }, 39 | "file_extension": ".py", 40 | "mimetype": "text/x-python", 41 | "name": "python", 42 | "nbconvert_exporter": "python", 43 | "pygments_lexer": "ipython3", 44 | "version": "3.6.8" 45 | }, 46 | "toc": { 47 | "base_numbering": 1, 48 | "nav_menu": {}, 49 | "number_sections": true, 50 | "sideBar": true, 51 | "skip_h1_title": false, 52 | "title_cell": "Table of Contents", 53 | "title_sidebar": "Contents", 54 | "toc_cell": false, 55 | "toc_position": {}, 56 | "toc_section_display": true, 57 | "toc_window_display": false 58 | } 59 | }, 60 | "nbformat": 4, 61 | "nbformat_minor": 2 62 | } 63 | -------------------------------------------------------------------------------- /2.1-MINST手写数字-多分类问题(初识神经网络).ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [ 8 | { 9 | "name": "stderr", 10 | "output_type": "stream", 11 | "text": [ 12 | "Using TensorFlow backend.\n" 13 | ] 14 | }, 15 | { 16 | "data": { 17 | "text/plain": [ 18 | "'2.2.4'" 19 | ] 20 | }, 21 | "execution_count": 1, 22 | "metadata": {}, 23 | "output_type": "execute_result" 24 | } 25 | ], 26 | "source": [ 27 | "import keras\n", 28 | "keras.__version__" 29 | ] 30 | }, 31 | { 32 | "cell_type": "markdown", 33 | "metadata": {}, 34 | "source": [ 35 | "# The MNIST dataset" 36 | ] 37
| }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": 16, 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "from keras.datasets import mnist # 加载MNIST 手写数字分类数据集\n", 45 | "(train_images, train_labels), (test_images, test_labels) = mnist.load_data()" 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": 5, 51 | "metadata": { 52 | "collapsed": true 53 | }, 54 | "outputs": [ 55 | { 56 | "data": { 57 | "text/plain": [ 58 | "" 59 | ] 60 | }, 61 | "execution_count": 5, 62 | "metadata": {}, 63 | "output_type": "execute_result" 64 | }, 65 | { 66 | "data": { 67 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPsAAAD4CAYAAAAq5pAIAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+17YcXAAANo0lEQVR4nO3db6hc9Z3H8c9Ht4qkDZrNjRvTsLfWPNiwsmkZzIJas5RNVJRYQTFoiBBMH0RIoeJKVBpERZdNS8VNIV1NU+0ahdY/D2RjCMXYJyGjZDXZsGuU2KYJ5kaRpuKfjX73wT1ZrvHOb27m3xn9vl9wmZnznTPny+gnZ2Z+55yfI0IAvvxOq7sBAINB2IEkCDuQBGEHkiDsQBJ/MciNzZw5M0ZHRwe5SSCVAwcO6OjRo56s1lXYbV8u6aeSTpf0bxHxQOn5o6Ojajab3WwSQEGj0WhZ6/hjvO3TJf2rpCskzZe0zPb8Tl8PQH918539Ikn7I+LNiPhY0hZJS3vTFoBe6ybscyT9YcLjg9Wyz7C9ynbTdnNsbKyLzQHoRjdhn+xHgM8dexsRGyOiERGNkZGRLjYHoBvdhP2gpLkTHn9d0qHu2gHQL92EfZekeba/YfsMSTdIeq43bQHotY6H3iLiuO1bJW3V+NDboxGxt2edAeiprsbZI+J5Sc/3qBcAfcThskAShB1IgrADSRB2IAnCDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJEHYgCcIOJEHYgSQIO5AEYQeSIOxAEoQdSIKwA0kQdiAJwg4kQdiBJAg7kARhB5Ig7EAShB1IgrADSRB2IAnCDiRB2IEkupqy2fYBScckfSLpeEQ0etEUgN7rKuyVf4iIoz14HQB9xMd4IIluwx6SXrD9su1Vkz3B9irbTdvNsbGxLjcHoFPdhv3iiPi2pCskrbb9nZOfEBEbI6IREY2RkZEuNwegU12FPSIOVbdHJD0t6aJeNAWg9zoOu+1ptr924r6kxZL29KoxAL3Vza/x50p62vaJ1/n3iPiPnnQFoOc6DntEvCnp73rYC4A+YugNSIKwA0kQdiAJwg4kQdiBJHpxIgyG2M6dO4v1xx57rFjfsWNHsb5nT+eHVqxfv75YP++884r1l156qVhfvnx5y9rChQuL634ZsWcHkiDsQBKEHUiCsANJEHYgCcIOJEHYgSQYZ/8SePLJJ1vW1qxZU1y33aXCIqJYX7RoUbF+9Gjra5HedtttxXXbaddbadtbtmzpattfROzZgSQIO5AEYQeSIOxAEoQdSIKwA0kQdiAJxtmHwPHjx4v1Xbt2Feu33HJLy9r7779fXPeyyy4r1u++++5i/ZJLLinWP/roo5a166+/vr
ju1q1bi/V2Gg0mFZ6IPTuQBGEHkiDsQBKEHUiCsANJEHYgCcIOJME4+xB4/PHHi/WVK1d2/NqLFy8u1kvnwkvS9OnTO952u9fvdhx97ty5xfqKFSu6ev0vm7Z7dtuP2j5ie8+EZTNsb7P9enV7Tn/bBNCtqXyM/4Wky09adoek7RExT9L26jGAIdY27BGxQ9K7Jy1eKmlzdX+zpGt63BeAHuv0B7pzI+KwJFW3s1o90fYq203bzXbXOwPQP33/NT4iNkZEIyIaIyMj/d4cgBY6DfvbtmdLUnV7pHctAeiHTsP+nKQT4xorJD3bm3YA9EvbcXbbT0haJGmm7YOSfiTpAUlP2V4p6feSrutnk190d911V7F+//33F+u2i/XVq1e3rN17773FdbsdR2/nvvvu69trP/TQQ8U6Xxs/q23YI2JZi9J3e9wLgD7icFkgCcIOJEHYgSQIO5AEYQeS4BTXHrjnnnuK9XZDa2eeeWaxvmTJkmL9wQcfbFk766yziuu28+GHHxbrL7zwQrH+1ltvtay1m3K53WWsly5dWqzjs9izA0kQdiAJwg4kQdiBJAg7kARhB5Ig7EASjLNP0XvvvdeytmHDhuK67U5RbTeO/swzzxTr3di/f3+xfuONNxbrzWaz421fd135zOjbb7+949fG57FnB5Ig7EAShB1IgrADSRB2IAnCDiRB2IEkGGefoo8//rhlrdtprdpdEvnIkfIcHJs2bWpZe/bZ8iX99+7dW6wfO3asWG93DMFpp7Xen9x0003FdadNm1as49SwZweSIOxAEoQdSIKwA0kQdiAJwg4kQdiBJBhnn6IzzjijZW3WrFnFdduNk4+Ojhbr7cayuzFnzpxivd2UzocOHSrWZ86c2bJ29dVXF9dFb7Xds9t+1PYR23smLFtn+4+2d1d/V/a3TQDdmsrH+F9IunyS5T+JiAXV3/O9bQtAr7UNe0TskPTuAHoB0Efd/EB3q+1Xq4/557R6ku1Vtpu2m90eQw6gc52G/WeSvilpgaTDkta3emJEbIyIRkQ0RkZGOtwcgG51FPaIeDsiPomITyX9XNJFvW0LQK91FHbbsyc8/J6kPa2eC2A4tB1nt/2EpEWSZto+KOlHkhbZXiApJB2Q9P0+9jgUzj777Ja1dtd1v+qqq4r1d955p1i/4IILivXSPOU333xzcd0ZM2YU6zfccEOx3m6cvd36GJy2YY+IZZMsfqQPvQDoIw6XBZIg7EAShB1IgrADSRB2IAlOce2BhQsXFuvDfJjwjh07ivUXX3yxWG93+u35559/yj2hP9izA0kQdiAJwg4kQdiBJAg7kARhB5Ig7EASjLMn98EHHxTr7cbR29U5xXV4sGcHkiDsQBKEHUiCsANJEHYgCcIOJEHYgSQYZ09uyZIldbeAAWHPDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJMM6e3NatW+tuAQPSds9ue67t39reZ3uv7TXV8hm2t9l+vbo9p//tAujUVD7GH5f0w4j4G0l/L2m17fmS7pC0PSLmSdpePQYwpNqGPSIOR8Qr1f1jkvZJmiNpqaTN1dM2S7qmX00C6N4p/UBne1TStyTtlHRuRByWxv9BkDSrxTqrbDdtN4d5zjPgy27KYbf9VUm/lvSDiPjTVNeLiI0R0YiIxsjISCc9AuiBKYXd9lc0HvRfRcRvqsVv255d1WdLOtKfFgH0QtuhN49fK/gRSfsi4scTSs9JWiHpger22b50iL5644036m4BAzKVcfaLJS2X9Jrt3dWytRoP+VO2V0r6vaTr+tMigF5oG/aI+J2kVjMBfLe37QDoFw6XBZIg7EAShB1IgrADSRB2IAlOcU3u0ksvLdYjYkCdoN/YswNJEHYgCcIOJEHYgSQIO5AEYQeSIOxAEoyzJ3fhhRcW6/PmzSvW250PX6pz5aLBYs8OJEHYgSQIO5AEYQeSIOxAEoQdSIKwA0kwzo6itWvXFusrV67seP2HH364uO78+f
OLdZwa9uxAEoQdSIKwA0kQdiAJwg4kQdiBJAg7kMRU5mefK+mXkv5K0qeSNkbET22vk3SLpLHqqWsj4vl+NYp6XHvttcX6li1bivVt27a1rK1bt6647qZNm4r1adOmFev4rKkcVHNc0g8j4hXbX5P0su0T/wV/EhH/0r/2APTKVOZnPyzpcHX/mO19kub0uzEAvXVK39ltj0r6lqSd1aJbbb9q+1Hb57RYZ5Xtpu3m2NjYZE8BMABTDrvtr0r6taQfRMSfJP1M0jclLdD4nn/9ZOtFxMaIaEREg2uOAfWZUthtf0XjQf9VRPxGkiLi7Yj4JCI+lfRzSRf1r00A3WobdtuW9IikfRHx4wnLZ0942vck7el9ewB6ZSq/xl8sabmk12zvrpatlbTM9gJJIemApO/3pUPUavr06cX6U089VazfeeedLWsbNmworttuaI5TYE/NVH6N/50kT1JiTB34AuEIOiAJwg4kQdiBJAg7kARhB5Ig7EASjoiBbazRaESz2RzY9oBsGo2Gms3mZEPl7NmBLAg7kARhB5Ig7EAShB1IgrADSRB2IImBjrPbHpP01oRFMyUdHVgDp2ZYexvWviR661Qve/vriJj0+m8DDfvnNm43I6JRWwMFw9rbsPYl0VunBtUbH+OBJAg7kETdYd9Y8/ZLhrW3Ye1LordODaS3Wr+zAxicuvfsAAaEsANJ1BJ225fb/m/b+23fUUcPrdg+YPs127tt13ryfTWH3hHbeyYsm2F7m+3Xq9tJ59irqbd1tv9YvXe7bV9ZU29zbf/W9j7be22vqZbX+t4V+hrI+zbw7+y2T5f0P5L+UdJBSbskLYuI/xpoIy3YPiCpERG1H4Bh+zuS/izplxHxt9Wyf5b0bkQ8UP1DeU5E/NOQ9LZO0p/rnsa7mq1o9sRpxiVdI+lm1fjeFfq6XgN43+rYs18kaX9EvBkRH0vaImlpDX0MvYjYIendkxYvlbS5ur9Z4/+zDFyL3oZCRByOiFeq+8cknZhmvNb3rtDXQNQR9jmS/jDh8UEN13zvIekF2y/bXlV3M5M4NyIOS+P/80iaVXM/J2s7jfcgnTTN+NC8d51Mf96tOsI+2fWxhmn87+KI+LakKyStrj6uYmqmNI33oEwyzfhQ6HT6827VEfaDkuZOePx1SYdq6GNSEXGouj0i6WkN31TUb5+YQbe6PVJzP/9vmKbxnmyacQ3Be1fn9Od1hH2XpHm2v2H7DEk3SHquhj4+x/a06ocT2Z4mabGGbyrq5yStqO6vkPRsjb18xrBM491qmnHV/N7VPv15RAz8T9KVGv9F/g1Jd9bRQ4u+zpf0n9Xf3rp7k/SExj/W/a/GPxGtlPSXkrZLer26nTFEvT0m6TVJr2o8WLNr6u0SjX81fFXS7urvyrrfu0JfA3nfOFwWSIIj6IAkCDuQBGEHkiDsQBKEHUiCsANJEHYgif8DQhse1aKaCAIAAAAASUVORK5CYII=\n", 68 | "text/plain": [ 69 | "
" 70 | ] 71 | }, 72 | "metadata": { 73 | "needs_background": "light" 74 | }, 75 | "output_type": "display_data" 76 | } 77 | ], 78 | "source": [ 79 | "import matplotlib.pyplot as plt\n", 80 | "plt.imshow(train_images[4],cmap=plt.cm.binary)\n", 81 | "plt.show()" 82 | ] 83 | }, 84 | { 85 | "cell_type": "code", 86 | "execution_count": 17, 87 | "metadata": {}, 88 | "outputs": [ 89 | { 90 | "data": { 91 | "text/plain": [ 92 | "(60000, 28, 28)" 93 | ] 94 | }, 95 | "execution_count": 17, 96 | "metadata": {}, 97 | "output_type": "execute_result" 98 | } 99 | ], 100 | "source": [ 101 | "train_images.shape # train_images : array(60000, 28, 28) dtype('uint8') 0-255 训练数据" 102 | ] 103 | }, 104 | { 105 | "cell_type": "code", 106 | "execution_count": 18, 107 | "metadata": {}, 108 | "outputs": [ 109 | { 110 | "data": { 111 | "text/plain": [ 112 | "(60000,)" 113 | ] 114 | }, 115 | "execution_count": 18, 116 | "metadata": {}, 117 | "output_type": "execute_result" 118 | } 119 | ], 120 | "source": [ 121 | "train_labels.shape # train_labels : array(60000,) dtype('uint8') 0-9 " 122 | ] 123 | }, 124 | { 125 | "cell_type": "code", 126 | "execution_count": 19, 127 | "metadata": {}, 128 | "outputs": [ 129 | { 130 | "data": { 131 | "text/plain": [ 132 | "(10000, 28, 28)" 133 | ] 134 | }, 135 | "execution_count": 19, 136 | "metadata": {}, 137 | "output_type": "execute_result" 138 | } 139 | ], 140 | "source": [ 141 | "test_images.shape # test_images : array(10000, 28, 28) dtype('uint8') 0-255 测试数据" 142 | ] 143 | }, 144 | { 145 | "cell_type": "code", 146 | "execution_count": 20, 147 | "metadata": {}, 148 | "outputs": [ 149 | { 150 | "data": { 151 | "text/plain": [ 152 | "(10000,)" 153 | ] 154 | }, 155 | "execution_count": 20, 156 | "metadata": {}, 157 | "output_type": "execute_result" 158 | } 159 | ], 160 | "source": [ 161 | "test_labels.shape # test_labels : array(10000,) dtype('uint8') 0-9 " 162 | ] 163 | }, 164 | { 165 | "cell_type": "code", 166 | "execution_count": 7, 167 | "metadata": {}, 
168 | "outputs": [], 169 | "source": [ 170 | "train_images = train_images.reshape((60000, 28 * 28)) # preprocess images: array(60000, 28, 28) -> array(60000, 28 * 28)\n", 171 | "train_images = train_images.astype('float32') / 255 # dtype('uint8') -> dtype('float32')\n", 172 | "test_images = test_images.reshape((10000, 28 * 28)) # values [0,255] -> [0,1] \n", 173 | "test_images = test_images.astype('float32') / 255" 174 | ] 175 | }, 176 | { 177 | "cell_type": "code", 178 | "execution_count": 12, 179 | "metadata": {}, 180 | "outputs": [], 181 | "source": [ 182 | "from keras.utils import to_categorical # preprocess labels: array(60000,) -> array(60000, 10)\n", 183 | "train_labels = to_categorical(train_labels) # dtype('uint8') -> dtype('float32')\n", 184 | "test_labels = to_categorical(test_labels) # labels [0,9] -> one-hot 0/1 vectors" 185 | ] 186 | }, 187 | { 188 | "cell_type": "markdown", 189 | "metadata": {}, 190 | "source": [ 191 | "# The network" 192 | ] 193 | }, 194 | { 195 | "cell_type": "code", 196 | "execution_count": 64, 197 | "metadata": {}, 198 | "outputs": [], 199 | "source": [ 200 | "from keras import models\n", 201 | "from keras import layers\n", 202 | "\n", 203 | "network = models.Sequential()\n", 204 | "network.add(layers.Dense(512, activation='relu', input_shape=(28 * 28,)))\n", 205 | "network.add(layers.Dense(10, activation='softmax'))\n", 206 | "\n", 207 | "network.compile(optimizer='rmsprop', # optimizer\n", 208 | " loss='categorical_crossentropy', # loss function\n", 209 | " metrics=['accuracy']) # metric to monitor\n", 210 | "\n", 211 | "network.fit(train_images, train_labels, epochs=5, batch_size=128) # train the network" 212 | ] 213 | }, 214 | { 215 | "cell_type": "code", 216 | "execution_count": 69, 217 | "metadata": {}, 218 | "outputs": [ 219 | { 220 | "name": "stdout", 221 | "output_type": "stream", 222 | "text": [ 223 | "10000/10000 [==============================] - 0s 32us/step\n" 224 | ] 225 | }, 226 | { 227 | "data": { 228 | "text/plain": [ 229 | "(0.06708235375082586, 0.979)" 230 | ] 231 | }, 232 | 
"execution_count": 69, 233 | "metadata": {}, 234 | "output_type": "execute_result" 235 | } 236 | ], 237 | "source": [ 238 | "test_loss, test_acc = network.evaluate(test_images, test_labels) # evaluate the network on the test set\n", 239 | "test_loss , test_acc" 240 | ] 241 | } 242 | ], 243 | "metadata": { 244 | "kernelspec": { 245 | "display_name": "Python [conda env:learn36]", 246 | "language": "python", 247 | "name": "conda-env-learn36-py" 248 | }, 249 | "language_info": { 250 | "codemirror_mode": { 251 | "name": "ipython", 252 | "version": 3 253 | }, 254 | "file_extension": ".py", 255 | "mimetype": "text/x-python", 256 | "name": "python", 257 | "nbconvert_exporter": "python", 258 | "pygments_lexer": "ipython3", 259 | "version": "3.6.8" 260 | }, 261 | "toc": { 262 | "base_numbering": 1, 263 | "nav_menu": {}, 264 | "number_sections": true, 265 | "sideBar": true, 266 | "skip_h1_title": false, 267 | "title_cell": "Table of Contents", 268 | "title_sidebar": "Contents", 269 | "toc_cell": false, 270 | "toc_position": {}, 271 | "toc_section_display": true, 272 | "toc_window_display": true 273 | } 274 | }, 275 | "nbformat": 4, 276 | "nbformat_minor": 2 277 | } 278 | -------------------------------------------------------------------------------- /3 神经网络入门.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "raw", 5 | "metadata": {}, 6 | "source": [ 7 | "1. Data tensors\n", 8 | "Vector data: 2D tensors (samples, features); handled by densely connected layers (Dense), relu\n", 9 | "Timeseries or sequence data: 3D tensors (samples, timesteps, features); handled by recurrent layers (LSTM)\n", 10 | "Images: 4D tensors (samples, height, width, channels) or (samples, channels, height, width); handled by 2D convolution layers (Conv2D)\n", 11 | "Video: 5D tensors (samples, frames, height, width, channels) or (samples, frames, channels, height, width)\n", 12 | "Multiple outputs: a network with several outputs may have several loss functions (one per output), but gradient descent needs a single scalar loss, so the individual losses are averaged into one scalar value\n", 13 | "\n", 14 | "2. Loss functions:\n", 15 | " Binary classification: binary crossentropy; sigmoid in the last layer\n", 16 | " Multiclass classification: categorical crossentropy; softmax in the last layer; measures the distance between two probability distributions\n", 17 | " 
Regression: mean squared error (mse); mean absolute error (mae)\n", 18 | " Sequence learning: connectionist temporal classification (CTC)\n", 19 | " Crossentropy: for models that output probabilities\n", 20 | " \n", 21 | "3. Backends:\n", 22 | " Keras backend engines: TensorFlow\\Theano\\CNTK\n", 23 | " Low-level tensor libraries for deep learning: CPU (BLAS, Eigen), GPU (CUDA, cuDNN)\n", 24 | "\n", 25 | "4. Ways to define a model:\n", 26 | " The Sequential class: only for linear stacks of layers\n", 27 | " The functional API: for directed acyclic graphs of layers; can build architectures of arbitrary shape" 28 | ] 29 | }, 30 | { 31 | "cell_type": "code", 32 | "execution_count": null, 33 | "metadata": {}, 34 | "outputs": [], 35 | "source": [ 36 | "from keras import models\n", 37 | "from keras import layers\n", 38 | "\n", 39 | "network = models.Sequential()\n", 40 | "network.add(layers.Dense(512, activation='relu', input_shape=(28 * 28,)))\n", 41 | "network.add(layers.Dense(10, activation='softmax'))" 42 | ] 43 | }, 44 | { 45 | "cell_type": "code", 46 | "execution_count": null, 47 | "metadata": {}, 48 | "outputs": [], 49 | "source": [ 50 | "from keras import models\n", 51 | "from keras import layers\n", 52 | "from keras import optimizers\n", 53 | "\n", 54 | "input_tensor = layers.Input(shape=(784,))\n", 55 | "x = layers.Dense(32,activation='relu')(input_tensor)\n", 56 | "output_tensor = layers.Dense(10,activation='softmax')(x)\n", 57 | "model = models.Model(inputs=input_tensor,outputs=output_tensor)" 58 | ] 59 | }, 60 | { 61 | "cell_type": "code", 62 | "execution_count": null, 63 | "metadata": {}, 64 | "outputs": [], 65 | "source": [ 66 | "model.compile(optimizer=optimizers.RMSprop(lr=0.001), \n", 67 | " loss='mse', \n", 68 | " metrics=['accuracy']) \n", 69 | "model.fit(input_tensor,target_tensor,batch_size=128,epochs=10)" 70 | ] 71 | } 72 | ], 73 | "metadata": { 74 | "kernelspec": { 75 | "display_name": "Python [conda env:tf-py36]", 76 | "language": "python", 77 | "name": "conda-env-tf-py36-py" 78 | }, 79 | "language_info": { 80 | "codemirror_mode": { 81 | "name": "ipython", 82 | "version": 3 83 | }, 84 | "file_extension": ".py", 85 | "mimetype": "text/x-python", 86 | "name": "python", 87 | 
"nbconvert_exporter": "python", 88 | "pygments_lexer": "ipython3", 89 | "version": "3.6.8" 90 | }, 91 | "toc": { 92 | "base_numbering": 1, 93 | "nav_menu": {}, 94 | "number_sections": true, 95 | "sideBar": true, 96 | "skip_h1_title": false, 97 | "title_cell": "Table of Contents", 98 | "title_sidebar": "Contents", 99 | "toc_cell": false, 100 | "toc_position": {}, 101 | "toc_section_display": true, 102 | "toc_window_display": false 103 | } 104 | }, 105 | "nbformat": 4, 106 | "nbformat_minor": 2 107 | } 108 | -------------------------------------------------------------------------------- /3.5-reuters新闻数据-单标签多分类问题.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [ 8 | { 9 | "name": "stderr", 10 | "output_type": "stream", 11 | "text": [ 12 | "Using TensorFlow backend.\n" 13 | ] 14 | }, 15 | { 16 | "data": { 17 | "text/plain": [ 18 | "'2.0.8'" 19 | ] 20 | }, 21 | "execution_count": 1, 22 | "metadata": {}, 23 | "output_type": "execute_result" 24 | } 25 | ], 26 | "source": [ 27 | "import keras\n", 28 | "keras.__version__" 29 | ] 30 | }, 31 | { 32 | "cell_type": "markdown", 33 | "metadata": {}, 34 | "source": [ 35 | "单标签、多分类问题\n", 36 | "路透社数据集reuters\n" 37 | ] 38 | }, 39 | { 40 | "cell_type": "markdown", 41 | "metadata": {}, 42 | "source": [ 43 | "## reuters 路透社数据集" 44 | ] 45 | }, 46 | { 47 | "cell_type": "markdown", 48 | "metadata": {}, 49 | "source": [ 50 | "~~~\n", 51 | "train_data (8982,) array([list1 ,list2 ... , list8982], dtype=object) [1,9999]\n", 52 | "train_labels (8982,) array([ 3, 4, 3, ..., 25, 3, 25], dtype=int64) [0,45]\n", 53 | "test_data (2246,) array([list1 ,list2 ... 
, list2246], dtype=object) [1,9999]\n", 54 | "test_labels (2246,) array([ 3, 10, 1, ..., 3, 3, 24], dtype=int64) [0,45]\n", 55 | "~~~" 56 | ] 57 | }, 58 | { 59 | "cell_type": "code", 60 | "execution_count": 4, 61 | "metadata": {}, 62 | "outputs": [], 63 | "source": [ 64 | "from keras.datasets import reuters\n", 65 | "import numpy as np\n", 66 | "#(train_data, train_labels), (test_data, test_labels) = reuters.load_data(num_words=10000)\n", 67 | "np_load_old = np.load # workaround: numpy >= 1.16.3 defaults to allow_pickle=False, which breaks reuters.load_data\n", 68 | "np.load = lambda *a,**k: np_load_old(*a, allow_pickle=True, **k)\n", 69 | "(train_data, train_labels), (test_data, test_labels) = reuters.load_data(num_words=10000)\n", 70 | "np.load = np_load_old # restore the original np.load" 71 | ] 72 | }, 73 | { 74 | "cell_type": "code", 75 | "execution_count": 64, 76 | "metadata": {}, 77 | "outputs": [ 78 | { 79 | "data": { 80 | "text/plain": [ 81 | "'? ? ? said as a result of its december acquisition of space co it expects earnings per share in 1987 of 1 15 to 1 30 dlrs per share up from 70 cts in 1986 the company said pretax net should rise to nine to 10 mln dlrs from six mln dlrs in 1986 and rental operation revenues to 19 to 22 mln dlrs from 12 5 mln dlrs it said cash flow per share this year should be 2 50 to three dlrs reuter 3'" 82 | ] 83 | }, 84 | "execution_count": 64, 85 | "metadata": {}, 86 | "output_type": "execute_result" 87 | } 88 | ], 89 | "source": [ 90 | "word_index = reuters.get_word_index() # word -> index dictionary\n", 91 | "reverse_word_index = dict([(value, key) for (key, value) in word_index.items()])\n", 92 | "decoded_newswire = ' '.join([reverse_word_index.get(i - 3, '?') for i in train_data[0]])\n", 93 | "decoded_newswire" 94 | ] 95 | }, 96 | { 97 | "cell_type": "markdown", 98 | "metadata": {}, 99 | "source": [ 100 | "## Data preprocessing: one-hot encoding" 101 | ] 102 | }, 103 | { 104 | "cell_type": "code", 105 | "execution_count": 86, 106 | "metadata": {}, 107 | "outputs": [], 108 | "source": [ 109 | "import numpy as np\n", 110 | "\n", 111 | "def vectorize_sequences(sequences, dimension=10000):\n", 
112 | "    results = np.zeros((len(sequences), dimension))\n", 113 | "    for i, sequence in enumerate(sequences):\n", 114 | "        results[i, sequence] = 1.\n", 115 | "    return results\n", 116 | "\n", 117 | "x_train = vectorize_sequences(train_data) # (8982, 10000)\n", 118 | "x_test = vectorize_sequences(test_data) # (2246, 10000)" 119 | ] 120 | }, 121 | { 122 | "cell_type": "markdown", 123 | "metadata": {}, 124 | "source": [ 125 | "There are two ways to vectorize the labels: \n", 126 | "1. Cast the label list to an integer tensor \n", 127 | "2. Use one-hot encoding, also called categorical encoding" 128 | ] 129 | }, 130 | { 131 | "cell_type": "code", 132 | "execution_count": 87, 133 | "metadata": {}, 134 | "outputs": [], 135 | "source": [ 136 | "def to_one_hot(labels, dimension=46):\n", 137 | "    results = np.zeros((len(labels), dimension))\n", 138 | "    for i, label in enumerate(labels):\n", 139 | "        results[i, label] = 1.\n", 140 | "    return results\n", 141 | "\n", 142 | "one_hot_train_labels = to_one_hot(train_labels) # (8982, 46)\n", 143 | "one_hot_test_labels = to_one_hot(test_labels) # (2246, 46)" 144 | ] 145 | }, 146 | { 147 | "cell_type": "code", 148 | "execution_count": 84, 149 | "metadata": {}, 150 | "outputs": [], 151 | "source": [ 152 | "from keras.utils.np_utils import to_categorical # Keras built-in one-hot helper\n", 153 | "\n", 154 | "one_hot_train_labels = to_categorical(train_labels)\n", 155 | "one_hot_test_labels = to_categorical(test_labels)" 156 | ] 157 | }, 158 | { 159 | "cell_type": "markdown", 160 | "metadata": {}, 161 | "source": [ 162 | "## Building the network" 163 | ] 164 | }, 165 | { 166 | "cell_type": "code", 167 | "execution_count": 88, 168 | "metadata": { 169 | "collapsed": true 170 | }, 171 | "outputs": [ 172 | { 173 | "name": "stdout", 174 | "output_type": "stream", 175 | "text": [ 176 | "WARNING:tensorflow:From C:\\Users\\pengfeizhang\\Anaconda3\\envs\\tf-py36\\lib\\site-packages\\tensorflow\\python\\framework\\op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.\n", 177 
| "Instructions for updating:\n", 178 | "Colocations handled automatically by placer.\n" 179 | ] 180 | } 181 | ], 182 | "source": [ 183 | "from keras import models\n", 184 | "from keras import layers\n", 185 | "\n", 186 | "model = models.Sequential()\n", 187 | "model.add(layers.Dense(64, activation='relu', input_shape=(10000,)))\n", 188 | "model.add(layers.Dense(64, activation='relu'))\n", 189 | "model.add(layers.Dense(46, activation='softmax'))\n", 190 | "\n", 191 | "model.compile(optimizer='rmsprop',\n", 192 | " loss='categorical_crossentropy',\n", 193 | " metrics=['accuracy'])" 194 | ] 195 | }, 196 | { 197 | "cell_type": "markdown", 198 | "metadata": {}, 199 | "source": [ 200 | "## 验证网络" 201 | ] 202 | }, 203 | { 204 | "cell_type": "code", 205 | "execution_count": 89, 206 | "metadata": {}, 207 | "outputs": [], 208 | "source": [ 209 | "x_val = x_train[:1000] # 前1000个数据作为验证\n", 210 | "partial_x_train = x_train[1000:]\n", 211 | "\n", 212 | "y_val = one_hot_train_labels[:1000]\n", 213 | "partial_y_train = one_hot_train_labels[1000:]" 214 | ] 215 | }, 216 | { 217 | "cell_type": "code", 218 | "execution_count": 90, 219 | "metadata": { 220 | "collapsed": true 221 | }, 222 | "outputs": [ 223 | { 224 | "name": "stdout", 225 | "output_type": "stream", 226 | "text": [ 227 | "WARNING:tensorflow:From C:\\Users\\pengfeizhang\\Anaconda3\\envs\\tf-py36\\lib\\site-packages\\tensorflow\\python\\ops\\math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n", 228 | "Instructions for updating:\n", 229 | "Use tf.cast instead.\n", 230 | "Train on 7982 samples, validate on 1000 samples\n", 231 | "Epoch 1/20\n", 232 | "7982/7982 [==============================] - 2s 256us/step - loss: 2.5326 - acc: 0.4954 - val_loss: 1.7237 - val_acc: 0.6110\n", 233 | "Epoch 2/20\n", 234 | "7982/7982 [==============================] - 1s 149us/step - loss: 1.4499 - acc: 0.6864 - val_loss: 1.3495 - val_acc: 0.7070\n", 235 | "Epoch 3/20\n", 
236 | "7982/7982 [==============================] - 1s 154us/step - loss: 1.1000 - acc: 0.7643 - val_loss: 1.1727 - val_acc: 0.7430\n", 237 | "Epoch 4/20\n", 238 | "7982/7982 [==============================] - 1s 159us/step - loss: 0.8725 - acc: 0.8141 - val_loss: 1.0798 - val_acc: 0.7590\n", 239 | "Epoch 5/20\n", 240 | "7982/7982 [==============================] - 1s 154us/step - loss: 0.7055 - acc: 0.8472 - val_loss: 0.9851 - val_acc: 0.7810\n", 241 | "Epoch 6/20\n", 242 | "7982/7982 [==============================] - 1s 152us/step - loss: 0.5683 - acc: 0.8794 - val_loss: 0.9401 - val_acc: 0.8050\n", 243 | "Epoch 7/20\n", 244 | "7982/7982 [==============================] - 1s 161us/step - loss: 0.4607 - acc: 0.9038 - val_loss: 0.9102 - val_acc: 0.7980\n", 245 | "Epoch 8/20\n", 246 | "7982/7982 [==============================] - 1s 158us/step - loss: 0.3718 - acc: 0.9223 - val_loss: 0.9390 - val_acc: 0.7900\n", 247 | "Epoch 9/20\n", 248 | "7982/7982 [==============================] - 1s 158us/step - loss: 0.3047 - acc: 0.9312 - val_loss: 0.8888 - val_acc: 0.8060\n", 249 | "Epoch 10/20\n", 250 | "7982/7982 [==============================] - 1s 158us/step - loss: 0.2551 - acc: 0.9411 - val_loss: 0.9060 - val_acc: 0.8160\n", 251 | "Epoch 11/20\n", 252 | "7982/7982 [==============================] - 1s 161us/step - loss: 0.2194 - acc: 0.9476 - val_loss: 0.9196 - val_acc: 0.8100\n", 253 | "Epoch 12/20\n", 254 | "7982/7982 [==============================] - 1s 161us/step - loss: 0.1881 - acc: 0.9508 - val_loss: 0.9043 - val_acc: 0.8130\n", 255 | "Epoch 13/20\n", 256 | "7982/7982 [==============================] - 1s 159us/step - loss: 0.1702 - acc: 0.9524 - val_loss: 0.9354 - val_acc: 0.8100\n", 257 | "Epoch 14/20\n", 258 | "7982/7982 [==============================] - 1s 153us/step - loss: 0.1538 - acc: 0.9554 - val_loss: 0.9651 - val_acc: 0.8070\n", 259 | "Epoch 15/20\n", 260 | "7982/7982 [==============================] - 1s 161us/step - loss: 0.1389 - acc: 0.9559 - 
val_loss: 0.9687 - val_acc: 0.8160\n", 261 | "Epoch 16/20\n", 262 | "7982/7982 [==============================] - 1s 149us/step - loss: 0.1315 - acc: 0.9564 - val_loss: 1.0229 - val_acc: 0.8050\n", 263 | "Epoch 17/20\n", 264 | "7982/7982 [==============================] - 1s 154us/step - loss: 0.1217 - acc: 0.9580 - val_loss: 1.0246 - val_acc: 0.7990\n", 265 | "Epoch 18/20\n", 266 | "7982/7982 [==============================] - 1s 154us/step - loss: 0.1198 - acc: 0.9578 - val_loss: 1.0442 - val_acc: 0.8070\n", 267 | "Epoch 19/20\n", 268 | "7982/7982 [==============================] - 1s 152us/step - loss: 0.1137 - acc: 0.9593 - val_loss: 1.1030 - val_acc: 0.7940\n", 269 | "Epoch 20/20\n", 270 | "7982/7982 [==============================] - 1s 152us/step - loss: 0.1108 - acc: 0.9597 - val_loss: 1.0723 - val_acc: 0.8020\n" 271 | ] 272 | } 273 | ], 274 | "source": [ 275 | "history = model.fit(partial_x_train,\n", 276 | " partial_y_train,\n", 277 | " epochs=20,\n", 278 | " batch_size=512,\n", 279 | " validation_data=(x_val, y_val))" 280 | ] 281 | }, 282 | { 283 | "cell_type": "markdown", 284 | "metadata": {}, 285 | "source": [ 286 | "## 绘图" 287 | ] 288 | }, 289 | { 290 | "cell_type": "code", 291 | "execution_count": 92, 292 | "metadata": {}, 293 | "outputs": [ 294 | { 295 | "data": { 296 | "text/plain": [ 297 | "[]" 298 | ] 299 | }, 300 | "execution_count": 92, 301 | "metadata": {}, 302 | "output_type": "execute_result" 303 | }, 304 | { 305 | "data": { 306 | "text/plain": [ 307 | "[]" 308 | ] 309 | }, 310 | "execution_count": 92, 311 | "metadata": {}, 312 | "output_type": "execute_result" 313 | }, 314 | { 315 | "data": { 316 | "text/plain": [ 317 | "Text(0.5, 1.0, 'Training and validation loss')" 318 | ] 319 | }, 320 | "execution_count": 92, 321 | "metadata": {}, 322 | "output_type": "execute_result" 323 | }, 324 | { 325 | "data": { 326 | "text/plain": [ 327 | "Text(0.5, 0, 'Epochs')" 328 | ] 329 | }, 330 | "execution_count": 92, 331 | "metadata": {}, 332 | 
"output_type": "execute_result" 333 | }, 334 | { 335 | "data": { 336 | "text/plain": [ 337 | "Text(0, 0.5, 'Loss')" 338 | ] 339 | }, 340 | "execution_count": 92, 341 | "metadata": {}, 342 | "output_type": "execute_result" 343 | }, 344 | { 345 | "data": { 346 | "text/plain": [ 347 | "" 348 | ] 349 | }, 350 | "execution_count": 92, 351 | "metadata": {}, 352 | "output_type": "execute_result" 353 | }, 354 | { 355 | "data": { 356 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAEWCAYAAABrDZDcAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+17YcXAAAgAElEQVR4nO3deZwU1bn/8c8DDLLvqAjC4BJlccBxghiMgBrjEjUaoxLc9SLeuERjrvzQqDFy4xYXjNdIEs3CRDQajRoNMZFEjQYFBBQRQQEdQRxQNkHjwPP741QPzdA908NMdfdMf9+vV726q+pU9dM1PfVUnVN1ytwdEREpXC1yHYCIiOSWEoGISIFTIhARKXBKBCIiBU6JQESkwCkRiIgUOCUCaVRm1tLMNppZ38Ysm0tmto+ZNfp11mZ2pJktSxpfZGZfzaTsTnzWL81s4s4uX8t6bzSzXzf2eiW7WuU6AMktM9uYNNoO+BzYEo1f6O7l9Vmfu28BOjR22ULg7vs1xnrM7ALgDHcflbTuCxpj3dI8KREUOHev3hFHR5wXuPvf0pU3s1buXpWN2EQkO1Q1JLWKTv0fMrMHzWwDcIaZHWJm/zaztWa20swmm1lRVL6VmbmZFUfjU6P5z5jZBjN72cz617dsNP8YM3vbzNaZ2d1m9i8zOydN3JnEeKGZLTGzT8xsctKyLc3sDjNbY2bvAEfXsn2uMbNpNabdY2a3R+8vMLOF0fd5JzpaT7euCjMbFb1vZ2a/i2JbAByU4nPfjda7wMxOiKYfAPwM+GpU7bY6adten7T8+Oi7rzGzx82sVybbpi5m9s0onrVm9pyZ7Zc0b6KZrTCz9Wb2VtJ3HW5mc6Lpq8zs1kw/TxqJu2vQgLsDLAOOrDHtRuA/wPGEA4e2wJeBgwlnlHsBbwMXR+VbAQ4UR+NTgdVAGVAEPARM3YmyuwIbgBOjeVcAXwDnpPkumcT4J6AzUAx8nPjuwMXAAqAP0B14PvyrpPycvYCNQPukdX8ElEXjx0dlDDgc2AyURPOOBJYlrasCGBW9vw34B9AV6Ae8WaPsqUCv6G/ynSiG3aJ5FwD/qBHnVOD66P1RUYxDgTbA/wHPZbJtUnz/G4FfR+8HRHEcHv2NJkbbvQgYBCwHdo/K9gf2it6/CoyJ3ncEDs71/0KhDTojkEy86O5PuvtWd9/s7q+6+0x3r3L3d4EpwMhaln/E3We5+xdAOWEHVN+y3wDmuvufonl3EJJGShnG+BN3X+fuywg73cRnnQrc4e4V7r4GuKmWz3kXeIOQoAC+Bqx191nR/Cfd/V0PngP+DqRsEK7hVOBGd//E3ZcTjvKTP/dhd18Z/U1+T0jiZRmsF2As8Et3n+vunwETgJFm1iepTLptU5vTgSfc/bnob3QT0ImQkKsISWdQVL24NNp2EBL6vmbW3d03uPvMDL+HNBIlAsnE+8kjZra/mf3ZzD40s/XADUCPWpb/MOn9JmpvIE5Xdo/kONzdCUfQKWUYY0afRTiSrc3vgTHR++8QElgijm+Y2Uw
z+9jM1hKOxmvbVgm9aovBzM4xs3lRFcxaYP8M1wvh+1Wvz93XA58AvZPK1Odvlm69Wwl/o97uvgj4PuHv8FFU1bh7VPRcYCCwyMxeMbNjM/we0kiUCCQTNS+dvI9wFLyPu3cCriVUfcRpJaGqBgAzM7bfcdXUkBhXAnsmjdd1eetDwJHREfWJhMSAmbUFHgF+Qqi26QL8NcM4PkwXg5ntBdwLXAR0j9b7VtJ667rUdQWhuimxvo6EKqgPMoirPuttQfibfQDg7lPdfQShWqglYbvg7ovc/XRC9d9PgUfNrE0DY5F6UCKQndERWAd8amYDgAuz8JlPAaVmdryZtQIuA3rGFOPDwPfMrLeZdQeuqq2wu68CXgQeABa5++Jo1i5Aa6AS2GJm3wCOqEcME82si4X7LC5OmteBsLOvJOTECwhnBAmrgD6JxvEUHgTON7MSM9uFsEN+wd3TnmHVI+YTzGxU9Nk/ILTrzDSzAWY2Ovq8zdGwhfAFzjSzHtEZxLrou21tYCxSD0oEsjO+D5xN+Ce/j3BEHKtoZ3sacDuwBtgbeI1w30Njx3gvoS7/dUJD5iMZLPN7QuPv75NiXgtcDjxGaHA9hZDQMnEd4cxkGfAM8Nuk9c4HJgOvRGX2B5Lr1Z8FFgOrzCy5iiex/F8IVTSPRcv3JbQbNIi7LyBs83sJSepo4ISovWAX4BZCu86HhDOQa6JFjwUWWrgq7TbgNHf/T0PjkcxZqGoVaVrMrCWhKuIUd38h1/GINGU6I5Amw8yONrPOUfXCDwlXoryS47BEmjwlAmlKDgXeJVQvHA18093TVQ2JSIZUNSQiUuB0RiAiUuCaXKdzPXr08OLi4lyHISLSpMyePXu1u6e85LrJJYLi4mJmzZqV6zBERJoUM0t7h7yqhkRECpwSgYhIgVMiEBEpcE2ujUBEsuuLL76goqKCzz77LNehSAbatGlDnz59KCpK19XUjpQIRKRWFRUVdOzYkeLiYkKnr5Kv3J01a9ZQUVFB//79614gUhBVQ+XlUFwMLVqE1/J6PY5dpLB99tlndO/eXUmgCTAzunfvXu+zt2Z/RlBeDuPGwaZNYXz58jAOMLbB/S2KFAYlgaZjZ/5Wzf6M4OqrtyWBhE2bwnQREYkxEZjZnmY2w8wWmtkCM7ssRZlRZrbOzOZGw7WNHcd779VvuojklzVr1jB06FCGDh3K7rvvTu/evavH//OfzB5bcO6557Jo0aJay9xzzz2UN1K98aGHHsrcuXMbZV3ZEGfVUBXwfXefEz0Kb7aZPevub9Yo94K7fyOuIPr2DdVBqaaLSOMrLw9n3O+9F/7PJk1qWDVs9+7dq3eq119/PR06dODKK6/croy74+60aJH62PaBBx6o83O++93v7nyQTVxsZwTuvtLd50TvNwALqf0Zs7GYNAnatdt+Wrt2YbqINK5Em9zy5eC+rU0ujgs0lixZwuDBgxk/fjylpaWsXLmScePGUVZWxqBBg7jhhhuqyyaO0KuqqujSpQsTJkxgyJAhHHLIIXz00UcAXHPNNdx5553V5SdMmMCwYcPYb7/9eOmllwD49NNP+da3vsWQIUMYM2YMZWVldR75T506lQMOOIDBgwczceJEAKqqqjjzzDOrp0+ePBmAO+64g4EDBzJkyBDOOOOMRt9m6WSljcDMioED2f5xegmHmNk8M3vGzAalWX6cmc0ys1mVlZX1+uyxY2HKFOjXD8zC65QpaigWiUO22+TefPNNzj//fF577TV69+7NTTfdxKxZs5g3bx7PPvssb75ZswIC1q1bx8iRI5k3bx6HHHII999/f8p1uzuvvPIKt956a3VSufvuu9l9992ZN28eEyZM4LXXXqs1voqKCq655hpmzJjBa6+9xr/+9S+eeuopZs+ezerVq3n99dd54403OOusswC45ZZbmDt3LvPmzeNnP/tZA7dO5mJPBGbWAXgU+J67r68xew7Qz92HAHcDj6dah7tPcfcydy/r2bO
255WnNnYsLFsGW7eGVyUBkXhku01u77335stf/nL1+IMPPkhpaSmlpaUsXLgwZSJo27YtxxxzDAAHHXQQy5YtS7nuk08+eYcyL774IqeffjoAQ4YMYdCglMeu1WbOnMnhhx9Ojx49KCoq4jvf+Q7PP/88++yzD4sWLeKyyy5j+vTpdO7cGYBBgwZxxhlnUF5eXq8bwhoq1kRgZkWEJFDu7n+sOd/d17v7xuj900CRmfWIMyYRiU+6tre42uTat29f/X7x4sXcddddPPfcc8yfP5+jjz465fX0rVu3rn7fsmVLqqqqUq57l1122aFMfR/kla589+7dmT9/PoceeiiTJ0/mwgsvBGD69OmMHz+eV155hbKyMrZs2VKvz9tZcV41ZMCvgIXufnuaMrtH5TCzYVE8a+KKSUTilcs2ufXr19OxY0c6derEypUrmT59eqN/xqGHHsrDDz8MwOuvv57yjCPZ8OHDmTFjBmvWrKGqqopp06YxcuRIKisrcXe+/e1v86Mf/Yg5c+awZcsWKioqOPzww7n11luprKxkU816tpjEedXQCOBM4HUzS7SmTAT6Arj7z4FTgIvMrArYDJzuenamSJOVqHZtzKuGMlVaWsrAgQMZPHgwe+21FyNGjGj0z7jkkks466yzKCkpobS0lMGDB1dX66TSp08fbrjhBkaNGoW7c/zxx3PccccxZ84czj//fNwdM+Pmm2+mqqqK73znO2zYsIGtW7dy1VVX0bFjx0b/Dqk0uWcWl5WVuR5MI5I9CxcuZMCAAbkOIy9UVVVRVVVFmzZtWLx4MUcddRSLFy+mVav86qQh1d/MzGa7e1mq8vkVvYhIHtu4cSNHHHEEVVVVuDv33Xdf3iWBndH0v4GISJZ06dKF2bNn5zqMRtfs+xoSEZHaKRGIiBQ4JQIRkQKnRCAiUuCUCEQkr40aNWqHm8PuvPNO/vu//7vW5Tp06ADAihUrOOWUU9Kuu67L0e+8887tbuw69thjWbt2bSah1+r666/ntttua/B6GoMSgYjktTFjxjBt2rTtpk2bNo0xY8ZktPwee+zBI488stOfXzMRPP3003Tp0mWn15ePlAhEJK+dcsopPPXUU3z++ecALFu2jBUrVnDooYdWX9dfWlrKAQccwJ/+9Kcdll+2bBmDBw8GYPPmzZx++umUlJRw2mmnsXnz5upyF110UXUX1tdddx0AkydPZsWKFYwePZrRo0cDUFxczOrVqwG4/fbbGTx4MIMHD67uwnrZsmUMGDCA//qv/2LQoEEcddRR231OKnPnzmX48OGUlJRw0kkn8cknn1R//sCBAykpKanu7O6f//xn9YN5DjzwQDZs2LDT2zZB9xGISMa+9z1o7AdvDR0K0T40pe7duzNs2DD+8pe/cOKJJzJt2jROO+00zIw2bdrw2GOP0alTJ1avXs3w4cM54YQT0j63995776Vdu3bMnz+f+fPnU1paWj1v0qRJdOvWjS1btnDEEUcwf/58Lr30Um6//XZmzJhBjx7b94c5e/ZsHnjgAWbOnIm7c/DBBzNy5Ei6du3K4sWLefDBB/nFL37BqaeeyqOPPlrr8wXOOuss7r77bkaOHMm1117Lj370I+68805uuukmli5dyi677FJdHXXbbbdxzz33MGLECDZu3EibNm3qsbVT0xmBiOS95Oqh5Gohd2fixImUlJRw5JFH8sEHH7Bq1aq063n++eerd8glJSWUlJRUz3v44YcpLS3lwAMPZMGCBXV2KPfiiy9y0kkn0b59ezp06MDJJ5/MCy+8AED//v0ZOnQoUHtX1xCej7B27VpGjhwJwNlnn83zzz9fHePYsWOZOnVq9R3MI0aM4IorrmDy5MmsXbu2Ue5s1hmBiGSstiP3OH3zm9/kiiuuYM6cOWzevLn6SL68vJzKykpmz55NUVERxcXFKbueTpbqbGHp0qXcdtttvPrqq3Tt2pVzzjmnzvXU1k9bogtrCN1Y11U1lM6f//xnnn/
+eZ544gl+/OMfs2DBAiZMmMBxxx3H008/zfDhw/nb3/7G/vvvv1PrT9AZgYjkvQ4dOjBq1CjOO++87RqJ161bx6677kpRUREzZsxgeaoHlCc57LDDqh9Q/8YbbzB//nwgdGHdvn17OnfuzKpVq3jmmWeql+nYsWPKevjDDjuMxx9/nE2bNvHpp5/y2GOP8dWvfrXe361z58507dq1+mzid7/7HSNHjmTr1q28//77jB49mltuuYW1a9eyceNG3nnnHQ444ACuuuoqysrKeOutt+r9mTXpjEBEmoQxY8Zw8sknb3cF0dixYzn++OMpKytj6NChdR4ZX3TRRZx77rmUlJQwdOhQhg0bBoSnjR144IEMGjRohy6sx40bxzHHHEOvXr2YMWNG9fTS0lLOOeec6nVccMEFHHjggbVWA6Xzm9/8hvHjx7Np0yb22msvHnjgAbZs2cIZZ5zBunXrcHcuv/xyunTpwg9/+ENmzJhBy5YtGThwYPXT1hpC3VCLSK3UDXXTU99uqFU1JCJS4JQIREQKnBKBiNSpqVUhF7Kd+VspEYhIrdq0acOaNWuUDJoAd2fNmjX1vslMVw2JSK369OlDRUUFlZWVuQ5FMtCmTRv69OlTr2WUCESkVkVFRfTv3z/XYUiMVDUkIlLglAhERAqcEoGISIFTIhARKXBKBCIiBU6JQESkwCkRiIgUOCUCEZECp0QgIlLglAhERAqcEoGISIGLLRGY2Z5mNsPMFprZAjO7LEUZM7PJZrbEzOabWWlc8YiISGpxdjpXBXzf3eeYWUdgtpk96+5vJpU5Btg3Gg4G7o1eRUQkS2I7I3D3le4+J3q/AVgI9K5R7ETgtx78G+hiZr3iiklERHaUlTYCMysGDgRm1pjVG3g/abyCHZMFZjbOzGaZ2Sz1iS4i0rhiTwRm1gF4FPieu6+vOTvFIjs8Bsndp7h7mbuX9ezZM44wRUQKVqyJwMyKCEmg3N3/mKJIBbBn0ngfYEWcMYmIyPbivGrIgF8BC9399jTFngDOiq4eGg6sc/eVccUkIiI7ivOqoRHAmcDrZjY3mjYR6Avg7j8HngaOBZYAm4BzY4xHRERSiC0RuPuLpG4DSC7jwHfjikFEROqmO4tFRAqcEoGISIFTIhARKXBKBCIiBU6JQESkwCkRiIgUOCUCEZECp0QgIlLglAhERAqcEoGISIFTIhARKXBKBCIiBU6JQESkwCkRiIgUOCUCEZECVzCJYPlyuOoqqKrKdSQiIvmlYBLB3Llwyy3w29/mOhIRkfxSMInghBNg2DC4/nr47LNcRyMikj8KJhGYwf/+L7z/Ptx3X66jERHJHwWTCACOOCIMkybBxo25jkZEJD8UVCKAkAQqK+Guu3IdiYhIfii4RHDwwXDiiXDrrfDxx7mORkQk9wouEQDceCOsXx+uIhIRKXQFmQgGD4axY2HyZFi5MtfRiIjkVkEmAgiXkX7xRWgzEBEpZAWbCPbeGy64AKZMgaVLcx2NiEjuFGwiALjmGmjZMpwdiIgUqoJOBL17wyWXwO9+BwsW5DoaEZHcKOhEAKEjug4d4Nprcx2JiEhuFHwi6N4drrwS/vhHePXVXEcjIpJ9BZ8IAC6/HHr0gKuvznUkIiLZF1siMLP7zewjM3sjzfxRZrbOzOZGQ84qZzp2hIkT4dlnYcaMXEUhIpIbcZ4R/Bo4uo4yL7j70Gi4IcZY6nTRRdCnTzgrcM9lJCIi2RVbInD354Em05tPmzahwfjll+Gpp3IdjYhI9uS6jeAQM5tnZs+Y2aAcx8I558A++4Szgq1bcx2NiEh25DIRzAH6ufsQ4G7g8XQFzWycmc0ys1mVlZWxBVRUBD/+Mbz+Ojz0UGwfIyKSV3KWCNx9vbtvjN4/DRSZWY80Zae4e5m7l/Xs2TPWuE49FUpK4Ic/DH0RAZSXQ3ExtGgRXsvLYw1BRCSrWuXqg81sd2CVu7uZDSMkpTW5iiehRYvQEd3xx8MDD0D79jB
uHGzaFOYvXx7GIfRgKiLS1JnHdImMmT0IjAJ6AKuA64AiAHf/uZldDFwEVAGbgSvc/aW61ltWVuazZs2KJeYEdxgxAt57LySG99/fsUy/frBsWaxhiIg0GjOb7e5lKefFlQjiko1EAPCPf8Do0ennm6lBWUSajtoSQa6vGspbo0bBUUeFM4JU+vbNajgiIrHJKBGY2d5mtkv0fpSZXWpmXeINLfcmTQpH/UVF209v104PtBGR5iPTM4JHgS1mtg/wK6A/8PvYosoTZWVw8snhmQV9+oTqoH79wsNs1FAsIs1Fpolgq7tXAScBd7r75UCv+MLKHz/+MfznP3D66eHsYNkyJQERaV4yTQRfmNkY4Gwg0QFDUS3lm42BA+HMM+FnP4MPPsh1NCIijS/TRHAucAgwyd2Xmll/YGp8YeWX666DLVvC2YGISHOTUSJw9zfd/VJ3f9DMugId3f2mmGPLG/37h5vIfvUr+Nvfch2NiEjjyvSqoX+YWScz6wbMAx4ws9vjDS2/XH897L8/HHssTJuW62hERBpPplVDnd19PXAy8IC7HwQcGV9Y+adHD3jhBTjkEBgzBu68M9cRiYg0jkwTQSsz6wWcyrbG4oLTpQtMnx4uKb388vDg+yZ2Y7aIyA4yTQQ3ANOBd9z9VTPbC1gcX1j5q00bePjh8ESzW24JzzBI9FIqItIUZdT7qLv/AfhD0vi7wLfiCirftWwJ99wDe+wRuqv+6CP4wx+gQ4dcRyYiUn+ZNhb3MbPHoofRrzKzR82sT9zB5TMzuOYa+MUv4K9/hcMPhxifmSMiEptMq4YeAJ4A9gB6A09G0wreBRfAY4+Fp5qNGAFLl+Y6IhGR+sk0EfR09wfcvSoafg3E+6iwJuSEE8L9BatXw1e+AnPn5joiEZHMZZoIVpvZGWbWMhrOIA+eJpZPRoyAF1+EVq3gsMPguedyHZGISGYyTQTnES4d/RBYCZxC6HZCkgwcCC+/HJ5VcMwx4eoiEZF8l2kXE++5+wnu3tPdd3X3bxJuLpMa+vQJN54NGxZ6LL377lxHJCJSu4Y8oeyKRouimenaNVxJdOKJcOmlMHGibjwTkfzVkERgjRZFM9S2LTzyCFx4IfzkJ3DeebrxTETyU0Y3lKWhY9w6tGwJ994LvXqFTus++gh+/nPYc89cRyYisk2tZwRmtsHM1qcYNhDuKZA6mIXnGdx3X6gu2nvvcHawaFGuIxMRCWpNBO7e0d07pRg6untDziYKzrhxsGQJjB8furEeMABOOQVmz851ZCJS6BrSRiD11K8fTJ4cnns8cWK4Ca2sDL7+dfjHP9SgLCK5oUSQBeXlUFwMLVqE12efhRtvhPfeg5tugnnzYPTocFfyE0/A1q25jlhECokSQczKy0O10PLl4Yh/+fIwXl4OnTqFZxosXQr/93/w4YfhktOSEpg6Faqqch29iBQC8yZWH1FWVuazZs3KdRgZKy4OO/+a+vULVUTJqqrgoYfCWcIbb4Rlf/ADOPfccDmqiDQt7qFDyiefDBeLtG0bHnk7YEB43X9/2HXXcFFJ3MxstruXpZynRBCvFi1S1/2bpa8C2roVnnoq3H/w73/DbruFJ6JddFE4ixCR/PX55zBjRvgffvLJUAUMcNBBYV/w1luwadO28l26bEsKycNee0FRUePFpUSQQ/U5I6jJHf75z5AQ/vpX6NwZLrkELrssPENZRPLDqlXw9NPbjvw//RTatYOvfQ2OPx6OPTbcTwThQO+DD2DhwpAUkoeVK7ets6gI9tln++Rw8MGw3347F6MSQQ4l2giSjwDatYMpU2Ds2MzXM3t2SAh//GM4vRw/Hq68ctuPS0SyJ7nK58kn4ZVXwrQ+fcKO/xvfCBeA1LdKd926cI9RzSSxZEmoOr7qqlB1vDOUCHKsvByuvjqcIvbtC5Mm1S8JJHvzzZAQfv/7cMRw3nnwP/8TzjxEJD6ffRYu837yyVDtk6jyGTYs7PiPPx6GDImnvv+
LL+Ddd0Ni6dt359ahRNAMvfMO3Hwz/PrX4UjkjDPg//0/+NKXch2ZSNNSVRWqdlauhBUr0r9+9FGo1kmu8jnuONh991x/g8zkJBGY2f3AN4CP3H1wivkG3AUcC2wCznH3OXWtV4lge++/D7fdFqqaPv8cTj013KxWUpLryEQaz+efw6uvwvr1sGVL2CEnhkzHq6rCUwRT7eBr7gbNwtU8e+wRql8Tr4ccEqp82rTJzXZoiFwlgsOAjcBv0ySCY4FLCIngYOAudz+4rvUqEaS2ahXccQfccw9s3Bgen3n11eG0VaSpcYe334bp00Pj64wZ27ez7SyzcBVe8s491etuu4WnDTYnOasaMrNi4Kk0ieA+4B/u/mA0vggY5e4ra5ZNpkRQu48/Dg/Duesu+OSTcAp79dXh8ZnZuFZZZGetXQt///u2nX/iart99gndsHzta2FH3aJFGFq23Pa+5ni6eV26NL8dfKZqSwS53CS9gfeTxiuiaTskAjMbB4wD6LuzLSUFolu30NvpFVeELrB/+lMYNSo8U3niRDjySGjdOtdRioQqm1df3bbjnzkzTOvYEY44Ilwh8/Wvh+vpJV65TASpjk9Tnp64+xRgCoQzgjiDai46dgxXE11yCfzyl3DLLaFha5dd4MtfDolhxIjQv1H37rmOVgrF+++HHf/06aHTxbVrw5lqWVm42OGoo2D48Ma9kUrqlstEUAEkP6KlD7AiR7E0W23bhmRw4YXw5z+H5ym/9BLcfnu46gjCDSqJxDBiRLjySNVI4h7ant5+O1zbvmRJuFHqiy9Cw+sXX2w/1JxWc3zTpm2XXO6xB5x0UjjiP+II3SCZa7lMBE8AF5vZNEJj8bq62gdk57VuHf7xTjopjG/eHE7L//WvkBgefxzuvz/M6949nCkkEkNZWdO8SkIy8+mnsHhx2NkndvqJ9+vXbyvXujV06BCO1lu1Cq+JoeZ469bQvv3281u3Dt0sHHUUDBqkg418ElsiMLMHgVFADzOrAK4DigDc/efA04QrhpYQLh89N65YZEdt24YG5MMOC+Nbt4Z//pdeCsnhX/8KN85A+Ec+6KCQHAYNCmcQ++2XnaO4qqrQFcfixVBRES6LPeigwm3w21lbtoSj8cROPnmnX1Gxfdm+fcNZ4ZlnhtfE33vPPUOjqzQ/uqFM0qqshJdf3pYYZs0K13MndOu2bSeRvMPYZ5/QFpGprVvDzmjx4rBzSn59990du+Pu2DEksMMPD9d0DxkSrgiRcPfr229v66Ig8bpoUZiX0Lnzjn+3L30J9t033DAlzY/uLJZGsWVLODpPPqpMDMmdZSUewJPYwSSGfv1CZ1s1d/hLlmy/k2rbNuyQEjumxGuvXqHPpeeeC8Pbb4fyXbuGK6NGjw7JYeDA7FQ7VFWFxs6PPw7DJ59se59qfN26kMR69Nhx6N59+/Fu3Wo/+v744207+uSd/tKl226OMgt/h+Quj/ffP2zPbHV9LPlDiUBit3592DEn1zEnqh9S3QhUVBQuC0ze2Sfe77FHZkf4H3wQbjSaMSMkhkRvrrvuGhJD4sj6t0EAAAzgSURBVIxh330z2+lt3hzuMk0Mq1bt+D55J79uXe3r69w5JKlu3cLQqVO42W/16m1DupukzMKyycmhU6dQvfPWWyGWhF12CdtuwIBtO/0BA3R0L9tTImjiGrPTumxLdLm7aFG4Qah377CD6tev8ev5ly7dPjGsiK5B6907JIRDDw1Xr6TbyW/YkHq9HTqEO0179gxH7okde7du2+/ok8czvXFp0yZYs2b75JAYak7/5JPQu2ViR5/Y6RcXq+5e6qZE0IQ1VjfWhcY9VD0999y25FBZGeaZhZ36rruGYbfdtn+t+V5H1dIcKBE0YQ15sI1s4x62V/v24aheR9BSaPK1iwnJQOIGnEynS2pm0L9/rqMQyU+66C7PpetaSV0uiUhjUSLIc5Mm7VhH3a5dmC4i0hiUCPLc2LGhYbhfv1C90a+
fGopFpHGpjaAJGDtWO34RiY/OCERECpwSgYhIgVMiEBEpcEoEIiIFTolARKTAKREUgPLy0FVFonvo8vJcRyQi+USXjzZzNTutW748jIMuSRWRQGcEzdzVV+/Y5/2mTWG6iAgoETR76rROROqiRNDMqdM6EamLEkEzp07rRKQuSgTNnDqtE5G66KqhAqBO60SkNjojEBEpcEoEIiIFTolARKTAKRGIiBQ4JQLJiPorEmm+dNWQ1En9FYk0bzojkDqpvyKR5k2JQOqk/opEmjclAqmT+isSad5iTQRmdrSZLTKzJWY2IcX8c8ys0szmRsMFccYjO0f9FYk0b7ElAjNrCdwDHAMMBMaY2cAURR9y96HR8Mu44pGdp/6KRJq3OK8aGgYscfd3AcxsGnAi8GaMnykxUX9FIs1XnFVDvYH3k8Yromk1fcvM5pvZI2a2Z6oVmdk4M5tlZrMqKyvjiFVipvsQRPJXnInAUkzzGuNPAsXuXgL8DfhNqhW5+xR3L3P3sp49ezZymBK3xH0Iy5eD+7b7EJQMRPJDnImgAkg+wu8DrEgu4O5r3P3zaPQXwEExxiM5ovsQRPJbnIngVWBfM+tvZq2B04EnkguYWa+k0ROAhTHGIzmi+xBE8ltsjcXuXmVmFwPTgZbA/e6+wMxuAGa5+xPApWZ2AlAFfAycE1c8kjt9+4bqoFTTRST3zL1mtX1+Kysr81mzZuU6DKmHmn0VQbgPQZegimSPmc1297JU83RnscRO9yGI5Df1PipZofsQRPKXzgikSdB9CCLx0RmB5D09D0EkXjojkLyn+xBE4qVEIHlP9yGIxEuJQPKenocgEi8lAsl7jfE8BDU2i6SnRCB5r6H3IajTO5Ha6c5iafaKi1N3cdGvHyxblu1oRHJDdxZLQVNjs0jtlAik2WuMxma1MUhzpkQgzV5DG5vVxiDNnRKBNHsNbWzWDW3S3CkRSEEYOzY0DG/dGl7r0zVFY7QxqGpJ8pkSgUgdGtrGoKolyXdKBCJ1aGgbg6qWJN8pEYjUoaFtDKpaknynRCCSgYa0MeRD1ZISidRGiUAkZrmuWlIikbooEYjELNdVS/mQSCS/KRGIZEEuq5ZynUig4WcUOiOJlxKBSJ5raNVSrhNJQ88o8qFqq9knIndvUsNBBx3kIoVm6lT3fv3czcLr1Kn1W7ZdO/ewGw1Du3aZr6Nfv+2XTQz9+jWN5Rv6/Ru6fGIdO/v3a4zl3d2BWZ5mv5rzHXt9ByUCkfrLZSIxS70jN8vO8kpEQW2JQM8jEJE6lZeHNoH33gtVSpMmZd7O0dDnQTR0+RYtwu6zJrPQZhP38rn+/gl6HoGINEhDGrsb2saR6zaSXLexZON5GkoEIhKrhl4+29DlCz0RZSRdnVG+DmojEJH6ymVjrdoIYqA2AhFpahrSxtIYy0PtbQRKBCIiBUCNxSIiklasicDMjjazRWa2xMwmpJi/i5k9FM2faWbFccYjIiI7ii0RmFlL4B7gGGAgMMbMBtYodj7wibvvA9wB3BxXPCIiklqcZwTDgCXu/q67/weYBpxYo8yJwG+i948AR5iZxRiTiIjUEGci6A28nzReEU1LWcbdq4B1QPeaKzKzcWY2y8xmVVZWxhSuiEhhahXjulMd2de8RCmTMrj7FGAKgJlVmlmKG67zQg9gda6DqEW+xwf5H6PiaxjF1zANia9fuhlxJoIKYM+k8T7AijRlKsysFdAZ+Li2lbp7z8YMsjGZ2ax0l2flg3yPD/I/RsXXMIqvYeKKL86qoVeBfc2sv5m1Bk4HnqhR5gng7Oj9KcBz3tRubBARaeJiOyNw9yozuxiYDrQE7nf3BWZ2A+FW5yeAXwG/M7MlhDOB0+OKR0REUouzagh3fxp4usa0a5PefwZ8O84YsmxKrgOoQ77HB/kfo+JrGMXXMLHE1+S6mBARkcalLiZERAqcEoGISIFTIqgnM9vTzGaY2UIzW2Bml6UoM8rM1pnZ3Gi4NtW6Yox
xmZm9Hn32Dl21WjA56uNpvpmVZjG2/ZK2y1wzW29m36tRJuvbz8zuN7OPzOyNpGndzOxZM1scvXZNs+zZUZnFZnZ2qjIxxXermb0V/Q0fM7MuaZat9fcQY3zXm9kHSX/HY9MsW2ufZDHG91BSbMvMbG6aZWPdfun2KVn9/aV7UIGG1APQCyiN3ncE3gYG1igzCngqhzEuA3rUMv9Y4BnCDX3DgZk5irMl8CHQL9fbDzgMKAXeSJp2CzAhej8BuDnFct2Ad6PXrtH7rlmK7yigVfT+5lTxZfJ7iDG+64ErM/gNvAPsBbQG5tX8f4orvhrzfwpcm4vtl26fks3fn84I6sndV7r7nOj9BmAhO3adke9OBH7rwb+BLmbWKwdxHAG84+45v1Pc3Z9nx5sZk/vC+g3wzRSLfh141t0/dvdPgGeBo7MRn7v/1UPXLAD/Jty0mRNptl8mMumTrMFqiy/q3+xU4MHG/txM1LJPydrvT4mgAaJusw8EZqaYfYiZzTOzZ8xsUFYDC910/NXMZpvZuBTzM+kHKhtOJ/0/Xy63X8Ju7r4Swj8rsGuKMvmyLc8jnOWlUtfvIU4XR1VX96ep2siH7fdVYJW7L04zP2vbr8Y+JWu/PyWCnWRmHYBHge+5+/oas+cQqjuGAHcDj2c5vBHuXkroAvy7ZnZYjfkZ9fEUp+hu8xOAP6SYnevtVx/5sC2vBqqA8jRF6vo9xOVeYG9gKLCSUP1SU863HzCG2s8GsrL96tinpF0sxbR6bz8lgp1gZkWEP1i5u/+x5nx3X+/uG6P3TwNFZtYjW/G5+4ro9SPgMcLpd7JM+oGK2zHAHHdfVXNGrrdfklWJKrPo9aMUZXK6LaPGwW8AYz2qNK4pg99DLNx9lbtvcfetwC/SfG6ut18r4GTgoXRlsrH90uxTsvb7UyKop6g+8VfAQne/PU2Z3aNymNkwwnZek6X42ptZx8R7QoPiGzWKPQGcFV09NBxYlzgFzaK0R2G53H41JPeFdTbwpxRlpgNHmVnXqOrjqGha7MzsaOAq4AR335SmTCa/h7jiS253OinN52bSJ1mcjgTecveKVDOzsf1q2adk7/cXV0t4cx2AQwmnXvOBudFwLDAeGB+VuRhYQLgC4t/AV7IY317R586LYrg6mp4cnxGeHvcO8DpQluVt2I6wY++cNC2n24+QlFYCXxCOss4nPBvj78Di6LVbVLYM+GXSsucBS6Lh3CzGt4RQP5z4Hf48KrsH8HRtv4csxfe76Pc1n7BT61Uzvmj8WMKVMu9kM75o+q8Tv7ukslndfrXsU7L2+1MXEyIiBU5VQyIiBU6JQESkwCkRiIgUOCUCEZECp0QgIlLglAhEIma2xbbvGbXResI0s+Lkni9F8kmsj6oUaWI2u/vQXAchkm06IxCpQ9Qf/c1m9ko07BNN72dmf486Vfu7mfWNpu9m4fkA86LhK9GqWprZL6I+5/9qZm2j8pea2ZvReqbl6GtKAVMiENmmbY2qodOS5q1392HAz4A7o2k/I3TnXULo8G1yNH0y8E8PneaVEu5IBdgXuMfdBwFrgW9F0ycAB0brGR/XlxNJR3cWi0TMbKO7d0gxfRlwuLu/G3UO9qG7dzez1YRuE76Ipq909x5mVgn0cffPk9ZRTOg3ft9o/CqgyN1vNLO/ABsJvaw+7lGHeyLZojMCkcx4mvfpyqTyedL7LWxrozuO0PfTQcDsqEdMkaxRIhDJzGlJry9H718i9JYJMBZ4MXr/d+AiADNraWad0q3UzFoAe7r7DOB/gC7ADmclInHSkYfINm1t+weY/8XdE5eQ7mJmMwkHT2OiaZcC95vZD4BK4Nxo+mXAFDM7n3DkfxGh58tUWgJTzawzoVfYO9x9baN9I5EMqI1ApA5RG0GZu6/OdSwicVDVkIhIgdMZgYhIgdMZgYhIgVMiEBEpcEoEIiIFTolARKTAKRGIiBS4/w8JzxKMj9TCAQAAAABJRU5ErkJggg==\n"
, 357 | "text/plain": [ 358 | "
" 359 | ] 360 | }, 361 | "metadata": { 362 | "needs_background": "light" 363 | }, 364 | "output_type": "display_data" 365 | } 366 | ], 367 | "source": [ 368 | "import matplotlib.pyplot as plt\n", 369 | "\n", 370 | "loss = history.history['loss']\n", 371 | "val_loss = history.history['val_loss']\n", 372 | "\n", 373 | "epochs = range(1, len(loss) + 1)\n", 374 | "\n", 375 | "plt.plot(epochs, loss, 'bo', label='Training loss')\n", 376 | "plt.plot(epochs, val_loss, 'b', label='Validation loss')\n", 377 | "plt.title('Training and validation loss')\n", 378 | "plt.xlabel('Epochs')\n", 379 | "plt.ylabel('Loss')\n", 380 | "plt.legend()\n", 381 | "\n", 382 | "plt.show()" 383 | ] 384 | }, 385 | { 386 | "cell_type": "code", 387 | "execution_count": 18, 388 | "metadata": {}, 389 | "outputs": [ 390 | { 391 | "data": { 392 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEWCAYAAACJ0YulAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3XmcFNW5//HPwyargAwqsg0xKuKCwgT1525c0Khckasi\nSdwI0Su4xCxEvGpUsvrzGo0/r0Qweh0lKheDiTsSlxiVQRlQUCEIOICAgCAOAgPP749T0zRNz0wP\nM9XdM/N9v1796u6qU1VP1/TU0+ecqlPm7oiIiAA0y3UAIiKSP5QUREQkQUlBREQSlBRERCRBSUFE\nRBKUFEREJEFJQXZhZs3NbKOZ9arPsrlkZt80s3o//9rMTjWzxUnvPzKz4zMpuxvbetDMbtzd5UUy\n0SLXAUjdmdnGpLdtgc3Atuj9D929uDbrc/dtQPv6LtsUuPtB9bEeMxsJfNfdT0pa98j6WLdIdZQU\nGgF3TxyUo1+iI9395arKm1kLd6/IRmwiNdH3Mb+o+agJMLM7zOzPZva4mX0JfNfMjjGzt8zsCzNb\nYWb3mFnLqHwLM3MzK4zePxrNf87MvjSzf5pZn9qWjeafaWYfm9l6M7vXzP5hZpdWEXcmMf7QzBaa\n2Tozuydp2eZm9l9mtsbMFgGDq9k/48xscsq0+8zsruj1SDObH32ef0W/4qtaV5mZnRS9bmtm/xPF\n9gEwMKXsTWa2KFrvB2Z2bjT9MOAPwPFR09znSfv21qTlr4w++xoze9rMumWyb2qznyvjMbOXzWyt\nmX1mZj9N2s5/Rvtkg5mVmNl+6ZrqzOyNyr9ztD9fi7azFrjJzA4wsxnRNj6P9lvHpOV7R59xdTT/\n92bWOor54KRy3cys3My6VPV5pQburkcjegCLgVNTpt0BbAHOIfwQaAN8CziKUFv8BvAxMDoq3wJw\noDB6/yjwOVAEtAT+DDy6G2X3Br4EhkTzfgRsBS6t4rNkEuNfgI5AIbC28rMDo4EPgB5AF+C18HVP\nu51vABuBdknrXgUURe/PicoYcAqwCTg8mncqsDhpXWXASdHrO4G/A52B3sC8lLIXAN2iv8nFUQz7\nRPNGAn9PifNR4Nbo9elRjEcArYH/B7ySyb6p5X7uCKwErgX2APYEBkXzfg6UAgdEn+EIYC/gm
6n7\nGnij8u8cfbYK4CqgOeH7eCDwbaBV9D35B3Bn0ud5P9qf7aLyx0bzJgDjk7ZzAzA11/+HDfmR8wD0\nqOc/aNVJ4ZUalvsx8GT0Ot2B/r+Typ4LvL8bZS8HXk+aZ8AKqkgKGcZ4dNL8/wV+HL1+jdCMVjnv\nrNQDVcq63wIujl6fCXxUTdm/AldHr6tLCkuT/xbAfySXTbPe94HvRK9rSgoPA79MmrcnoR+pR037\nppb7+XvAzCrK/asy3pTpmSSFRTXEMKxyu8DxwGdA8zTljgU+ASx6PxsYWt//V03poeajpuPT5Ddm\n1tfM/hY1B2wAbgMKqln+s6TX5VTfuVxV2f2S4/DwX1xW1UoyjDGjbQFLqokX4DFgePT64uh9ZRxn\nm9nbUdPGF4Rf6dXtq0rdqovBzC41s9KoCeQLoG+G64Xw+RLrc/cNwDqge1KZjP5mNeznnoSDfzrV\nzatJ6vdxXzN7wsyWRTH8KSWGxR5OatiJu/+DUOs4zswOBXoBf9vNmAT1KTQlqadjPkD4ZfpNd98T\nuJnwyz1OKwi/ZAEwM2Png1iqusS4gnAwqVTTKbNPAKeaWXdC89ZjUYxtgKeAXxGadjoBL2YYx2dV\nxWBm3wDuJzShdInW+2HSems6fXY5oUmqcn0dCM1UyzKIK1V1+/lTYP8qlqtq3ldRTG2Tpu2bUib1\n8/2GcNbcYVEMl6bE0NvMmlcRxyPAdwm1mifcfXMV5SQDSgpNVwdgPfBV1FH3wyxs86/AADM7x8xa\nENqpu8YU4xPAdWbWPep0/Fl1hd39M0ITx58ITUcLoll7ENq5VwPbzOxsQtt3pjHcaGadLFzHMTpp\nXnvCgXE1IT/+gFBTqLQS6JHc4ZviceAKMzvczPYgJK3X3b3Kmlc1qtvP04BeZjbazPYwsz3NbFA0\n70HgDjPb34IjzGwvQjL8jHBCQ3MzG0VSAqsmhq+A9WbWk9CEVemfwBrglxY679uY2bFJ8/+H0Nx0\nMSFBSB0oKTRdNwCXEDp+HyB0CMfK3VcCFwJ3Ef7J9wfeI/xCrO8Y7wemA3OBmYRf+zV5jNBHkGg6\ncvcvgOuBqYTO2mGE5JaJWwg1lsXAcyQdsNx9DnAv8E5U5iDg7aRlXwIWACvNLLkZqHL55wnNPFOj\n5XsBIzKMK1WV+9nd1wOnAecTEtXHwInR7N8BTxP28wZCp2/rqFnwB8CNhJMOvpny2dK5BRhESE7T\ngClJMVQAZwMHE2oNSwl/h8r5iwl/583u/mYtP7ukqOycEcm6qDlgOTDM3V/PdTzScJnZI4TO61tz\nHUtDp4vXJKvMbDDhTJ9NhFMatxJ+LYvslqh/ZghwWK5jaQzUfCTZdhywiNCWfgZwnjoGZXeZ2a8I\n10r80t2X5jqexkDNRyIikqCagoiIJDS4PoWCggIvLCzMdRgiIg3KrFmzPnf36k4BBxpgUigsLKSk\npCTXYYiINChmVtNV/YCaj0REJImSgoiIJCgpiIhIgpKCiIgkKCmIiEiCkoKISMyKi6GwEJo1C8/F\nxdldvjaUFESk0cvlQbm4GEaNgiVLwD08jxqV+Trqunyt5frWb7V9DBw40EUkux591L13b3ez8Pzo\now1n+UcfdW/b1j0cUsOjbdvM11HX5Xv33nnZykfv3tlZvhJQ4hkcY3N+kK/tQ0lBpPYa8kG1oR+U\nzdIvb5ad5SspKYg0Irk8qOf6oNrQD8q5/vyVMk0K6lMQyXN1bVMeNw7Ky3eeVl4epmdiaRUDUlc1\nPd+W71XF3bmrml7fy48fD23b7jytbdswPRvL15aSgkjM6trJmeuDeq4Pqg39oDxiBEyYAL17g1l4\nnjAhTM/G8rWWSXUinx5qPpJsy2XTjXvumy9y3SdQH/sw1x3l+QD1KYjUXa7b4+tjHY3hoNoYDsq5\nlmlSaHB3XisqKnINnS3ZUlgY2vBT9e4NixfXvHyzZuEwn
MoMtm/PLIbKPoXkJqS2bWvXhFBcHJqb\nli4NzS7jx8fY/CB5ycxmuXtRTeXUpyBSjVy3x0P9tCmPGBGS2Pbt4VkJQaqipCBSjVx3clbSQV2y\nRUlBGr26nP2T6zNPRLKtwd2OU6Q2UtvjK8/xh8wOzJVl6tIeP2KEkoA0HOpolkatrh3FIo2FOppF\nqHtHsUhTo6Qgea8ufQL1cfaPSFOipCB5ra7j/mR73BiRhk5JQfJaXcf90dk/IrWjjmbJa/VxRbCI\nqKNZGgn1CYhkl5KC5DX1CYhkl5KC5DX1CYhkl65olrynK4JFskc1BYldXe88JiLZo5qCxKquYw+J\nSHbFWlMws8Fm9pGZLTSzsWnm9zaz6WY2x8z+bmY94oxHsq+u1xmISHbFlhTMrDlwH3Am0A8Ybmb9\nUordCTzi7ocDtwG/iiseyQ2NPSTSsMRZUxgELHT3Re6+BZgMDEkp0w94JXo9I818aeB0nYFIwxJn\nUugOfJr0viyalqwUGBq9Pg/oYGZdUldkZqPMrMTMSlavXh1LsBIPXWcg0rDk+uyjHwMnmtl7wInA\nMmBbaiF3n+DuRe5e1LVr12zHKHWg6wxEGpY4zz5aBvRMet8jmpbg7suJagpm1h44392/iDEmyQFd\nZyDScMRZU5gJHGBmfcysFXARMC25gJkVmFllDD8HJsUYj4iI1CC2pODuFcBo4AVgPvCEu39gZreZ\n2blRsZOAj8zsY2AfQC3NIiI5pKGzRUSaAA2dLfVGw1SINB0a5kKqpWEqRJoW1RSkWhqmQqRpUVKQ\nammYCpGmRUlBqqVhKkSaFiUFqZaGqRBpWpQUpFoapkKkadHZR1IjDVMh0nSopiAiIglKCiIikqCk\nICIiCUoKIiKSoKQgIiIJSgpNgAa0E5FM6ZTURk4D2olIbaim0MhpQDsRqQ0lhUZOA9qJSG0oKTRy\nGtBORGpDSaGR04B2IlIbSgqNnAa0E5Ha0NlHTYAGtBORTKmmICIiCUoKIiKSoKQgIiIJSgoiIpKg\npCAiIglKCg2ABrQTkWzRKal5TgPaiUg2qaaQ5zSgnYhkk2oKeU4D2uXO55/DO++Ex9tvw4cfQpcu\n0L079OgRnpMfPXpAhw513+62bbBuXdj+6tXh+fPPYf16KCjYeft77ln37WXCHbZsgVatwpXx0ngp\nKeS5Xr1Ck1G66VJ/Nm2C997bkQDeeQcWLQrzmjWDQw6Bo4+GL76ATz6BN96AtWt3XU+HDrsmisrX\nXbvChg07H+jTvV67FrZvzyzu9u3Tbyt52t57Q/PmOy+3dSusWbPrtquLbfNmaNeu+m117w777LPr\n9qThUFLIc+PH79ynABrQrq62bw+/+pNrAXPmQEVFmN+zJwwaBD/8IRx1FAwcGA6+qcrLYflyWLYs\nPMrKdrxetgxeeQVWrAi//NNp3jz88u/aNTwfeuiO15XPya/33DMcnFO3U7ntGTPC9io/R/J2unUL\nyeHLL8M6vvii6v3TseOObfbsCUceGV537BgSROX2Xn01fP5029t33/TJI/l96kCNkh/M3XMdQ60U\nFRV5SUlJrsPIquLi0IewdGmoIYwfr07m2lq4EB56CN56C0pKwi92CAfab30rJIGjjgrP3brV33a3\nbYNVq8KBdPVq6NRpx4G+Y8f6b4rZvn3H9lKTx6pVYZvJiSY1+XTpAi1b1m571SWqyteV+ztZp041\n1zoKCtRcVV/MbJa7F9VYLs6kYGaDgd8DzYEH3f3XKfN7AQ8DnaIyY9392erW2RSTQr6qqAi/PL/8\nMvzTb9iw43W6aWYwfDiccEL2/tHnzoVf/Qr+/OfQDNS/fzjwVyaBgw4K0yVeGzemTxzJCWTlyl2b\nzVq1gv32S584Kh/77Qd77BFP3O7h+/v556Gfp337kKg6d87O98Y97LvKZryePUMtbHfkPCmYWXPg\nY+A0oAyYCQx393lJZ
SYA77n7/WbWD3jW3QurW6+SQm489RTceWdo76482G/alNmybdqEX+Tl5eEf\n7LDDYMyYUNuJqwnh7bfhl7+EadPCP/JVV8H119dvLUDqV0UFfPZZ9TWOZct2PRsPQi2nqtpG5aNT\npx19Ken6TqrqT9myZdftNWsWalXVNfelPrdps3NfTqYxbN68Y7v33w9XXrl7+zfTpBBnn8IgYKG7\nL4oCmgwMAeYllXGg8vyJjsDyGOOR3bBxI1x7LUyaFDpbi4pCZ+qee4ZH5evU58rXHTpAi+hbVl4O\njz8O994b+kl++lO44gr4j/+Ab3yj7rG6h3b18eNDe37nznDrrSEB7bVX3dcv8WrRIhzIe/Souox7\nOAuruuaqd94JB9RUrVqlP8BX6tx5x4G9sDB815MP9p077/jVnnrwnj8/PK9ZU/VJAq1bw9dfV739\nqvpykpNL//5VL19f4qwpDAMGu/vI6P33gKPcfXRSmW7Ai0BnoB1wqrvPSrOuUcAogF69eg1cku50\nHKl3JSVw8cWhPf7GG+GWW2rX3lwV93D2zr33wv/+b/gnOvvscPA+9dTaNy25w1//GpLB22+H6vUN\nN4SO4vo4RVQans2bdz4JYNmyUAtp3z79L/m99qqf7/b27TtOJ06tAaxdG76P9dGXszsyrSng7rE8\ngGGEfoTK998D/pBS5kfADdHrYwi1iGbVrXfgwIEu8dq2zf03v3Fv0cK9Rw/3v/89vm2VlbnfdJN7\n167u4H7QQe733uu+YUPNy1ZUuD/+uPthh4VlCwvd77/ffdOm+OIVaaiAEs/g2B1nV8kyoGfS+x7R\ntGRXAE8AuPs/gdZAQYwxSQ2WLYPTToOf/Qz+7d+gtBROPDG+7XXvDrffDp9+Co88EpqdxowJ06+5\nBj76aNdltmyBiROhb9/QcV1REZb9+OPQ3tq6dXzxijR2cSaFmcABZtbHzFoBFwHTUsosBb4NYGYH\nE5JCmtZAyYann4bDDw+nbT74IDzxRPba4vfYA773vR3XDQwZAv/93+HAf8YZoXlo40a45x7Yf38Y\nOTIkkClT4P33w7JxV79FmoK4T0k9C7ibcLrpJHcfb2a3Eaox06Izjv4ItCd0Ov/U3V+sbp06+6j+\nlZfDj34EDzwAAwbAY4+FUzVzbeVKmDAhJIfly0NHZEVFOKX1xhvh9NN1DrtIpnJ+SmpclBTq1+zZ\noTN5/nz4yU/gjjvCWRr5ZOtWmDo1XEE7fDgcd1yuIxJpePLhlFTJY9u3w+9/D2PHhjMfXnopnPmT\nj1q2hAsuCA8RiZeSQhP02Wdw6aXwwgtw7rmh07ZA3fsigu6n0OT87W+hM/nVV8PVkU8/rYQgIjso\nKTQRX38dTvE8++ww1MOsWeH0TXXUikgyJYUm4P33w0ig994L110XTvns1y/XUYlIPlJSaMTc4Q9/\nCGO4rFoFzz4L//VfurhLRKqmpJAFxcVhgK1mzcJzcXH821y9OnQijxkDp5wSbiJz5pnxb1dEGjad\nfRSz4uKd75y2ZEl4D/HdKOfFF+GSS8LAXPfcA6NHq+9ARDKjmkLMxo3bdfz38vIwvb5t3hxGBz3j\njDA8xTvvhJqCEoKIZEo1hZgtXVq76btr/vxwZfLs2eH+BHfeGW7qISJSG6opxKxXr9pNry33MGbR\nwIFhpNG//AXuu08JQUR2j5JCzMaP3/WWk23bhul1tWYNDB0arjc49tjQmXzuuXVfr4g0XUoKMRsx\nIoz02bt3aNvv3Tu8r2sn8yuvhCuT//a30FT0wgvhBuYiInWhPoUsGDGi/s402rIFbr4ZfvtbOPBA\neOaZMNy1iEh9UFJoQD7+OHQmz5oFP/hBuBCtXbtcRyUijYmSQh7YtGnXG32nez1zZrhD2ZQpoS9B\nRKS+KSlkyZIl4aygFSt2Peh/9VX6ZZo1C/c6KCiArl3D/QRuvx169Mhu7CLSdGSUFMx
sf6DM3Teb\n2UnA4cAj7v5FnME1FkuXhltIrlgRbkjftWt4HHxweK486BcU7Py6c+eQGEREsiXTmsIUoMjMvglM\nAP4CPAacFVdgjcXy5WHsofXr4a231CksIvkt09+h2929AjgPuNfdfwJ0iy+sxmHVKvj2t8MN6J9/\nXglBRPJfpjWFrWY2HLgEOCea1jKekBqHtWvhtNNCX8Lzz8PRR+c6IhGRmmVaU7gMOAYY7+6fmFkf\n4H/iC6thW78eTj8dPvoIpk0L/QkiIg1BRjUFd58HXANgZp2BDu7+mzgDa6g2boSzzgpDTkydCqee\nmuuIREQyl1FNwcz+bmZ7mtlewLvAH83srnhDa3jKy+Gcc8LtLidPhu98J9cRiYjUTqbNRx3dfQMw\nlHAq6lGAfgMn2bwZzjsPXn0VHnlEF5eJSMOUaVJoYWbdgAuAv8YYT4O0dWu4sOzFF2HixDAUhYhI\nQ5RpUrgNeAH4l7vPNLNvAAviC6vhqKgIg91NmxauWL7sslxHJCKy+zLtaH4SeDLp/SLg/LiCaii2\nbQtJ4Mkn4a67wh3PREQaskw7mnuY2VQzWxU9pphZkx6BZ/v2cHObRx8NN8y5/vpcRyQiUneZNh89\nBEwD9osez0TTmiR3uPZaePBBuOkmuPHGXEckIlI/Mk0KXd39IXeviB5/ArrGGFfecoef/Qz+8Ae4\n4Qa47bZcRyQiUn8yTQprzOy7ZtY8enwXWBNnYPnq1lvhd7+Dq68Oz2a5jkhEpP5kmhQuJ5yO+hmw\nAhgGXBpTTHnr178ONYMrroB77lFCEJHGJ6Ok4O5L3P1cd+/q7nu7+7/RxM4+euop+PnPwzUIDzyg\n+xyISONUl0Pbj2oqYGaDzewjM1toZmPTzP8vM5sdPT42s7y9ac/dd8OBB8LDD0Pz5rmORkQkHnW5\nHWe1jSdm1hy4DzgNKANmmtm0aHA9ANz9+qTyY4Aj6xBPbD78EP7xD/jNb6CFbmAqIo1YXWoKXsP8\nQcBCd1/k7luAycCQasoPBx6vQzyxmTQpJIPvfz/XkYiIxKva371m9iXpD/4GtKlh3d2BT5PelwFH\nVbGd3kAf4JUq5o8CRgH06tWrhs3Wr61bQ5PR2WfDvvtmddMiIllXbVJw9w5ZiuMi4Cl331ZFHBMI\n94amqKiophpKvfrrX8NtNa+4IptbFRHJjTjPoVkG9Ex63yOals5F5GnT0cSJsN9+MHhwriMREYlf\nnElhJnCAmfUxs1aEA/+01EJm1hfoDPwzxlh2y7Jl8NxzUFQE3/xmOA21sBCKi3MdmYhIPGI7l8bd\nK8xsNGHI7ebAJHf/wMxuA0rcvTJBXARMdvesNgtl4k9/CgPfvfgifP11mLZkCYwaFV6PGJGz0ERE\nYmF5eCyuVlFRkZeUlMS+ne3b4YADQm1h8+Zd5/fuDYsXxx6GiEi9MLNZ7l5UUzmddV+Fv/8dFi2q\nev7SpVkLRUQkazRYQxUmToROnaBnz/Tzs3xmrIhIVigppLFuHUyZEvoMfvUraNt25/lt24Yb64iI\nNDZqPkqjuDj0I1xxBRwZDbwxblxoMurVKyQEdTKLSGOkjuY0jjwynH46a1asmxERyZpMO5rVfJTi\n3Xdh9mxdwSwiTZOSQooHH4TWrcN9E0REmholhSSbNsFjj8GwYeHMIxGRpkZJIcmUKbB+vZqORKTp\nUlJI8uCDsP/+cOKJuY5ERCQ3lBQiCxbAq6+GWoJVe085EZHGS0khMmlSOA31kktyHYmISO4oKQAV\nFeHuamedFe6dICLSVCkpEO6ZsGIFjByZ60hERHJLSYHQwbzPPqGmICLSlDX5pLBiBfztb6EvoWXL\nXEcjIpJbTT4pPPIIbNumaxNERKCJJwX3cN+E44+HAw/MdTQiIrnXpJPC66+H6xNUSxARCZp0Upg4\nEfbcM4x1JCIiTTgprF8PTz4Jw4dDu3a5jkZEJD8
02aTw+ONhVFQ1HYmI7NBkk8LEiXD44VBU432I\nRESajiaZFEpLoaREg9+JiKRqkklh4kRo1QpGjMh1JCIi+aXJJYWvv4ZHH4WhQ6FLl1xHIyKSX5pc\nUpg6FdatUweziEg6TS4pTJwIhYVwyim5jkREJP80qaTwyScwfTpcfnm4oY6IiOysSR0aJ00KZxtd\nemmuIxERyU9NJils2wZ/+hOccQb07JnraERE8lOTSQovvghlZbq7mohIdZpMUli+HA44AM45J9eR\niIjkr1iTgpkNNrOPzGyhmY2toswFZjbPzD4ws8fiiuWKK+DDD8NFayIikl6LuFZsZs2B+4DTgDJg\npplNc/d5SWUOAH4OHOvu68xs77jiAZ1xJCJSkzgPk4OAhe6+yN23AJOBISllfgDc5+7rANx9VYzx\niIhIDeJMCt2BT5Pel0XTkh0IHGhm/zCzt8xscLoVmdkoMysxs5LVq1fHFK6IiOS6QaUFcABwEjAc\n+KOZdUot5O4T3L3I3Yu6du2a5RBFRJqOOJPCMiD5ioAe0bRkZcA0d9/q7p8AHxOShIiI5ECcSWEm\ncICZ9TGzVsBFwLSUMk8TagmYWQGhOWlRjDGJiEg1YksK7l4BjAZeAOYDT7j7B2Z2m5mdGxV7AVhj\nZvOAGcBP3H1NXDGJiEj1zN1zHUOtFBUVeUlJSa7DEBFpUMxslrvXeAPiXHc0i4hIHlFSEBGRBCUF\nERFJUFIQEZEEJQUREUlQUhARkQQlBRERSVBSEBGRBCUFERFJUFIQEZEEJQUREUmI7XacItK4bN26\nlbKyMr7++utchyLVaN26NT169KBly5a7tbySgohkpKysjA4dOlBYWIiZ5TocScPdWbNmDWVlZfTp\n02e31qHmIxHJyNdff02XLl2UEPKYmdGlS5c61eaUFEQkY0oI+a+ufyMlBRERSVBSEJFYFBdDYSE0\naxaei4vrtr41a9ZwxBFHcMQRR7DvvvvSvXv3xPstW7ZktI7LLruMjz76qNoy9913H8V1DbYBU0ez\niNS74mIYNQrKy8P7JUvCe4ARI3ZvnV26dGH27NkA3HrrrbRv354f//jHO5Vxd9ydZs3S/9596KGH\natzO1VdfvXsBNhKqKYhIvRs3bkdCqFReHqbXt4ULF9KvXz9GjBjBIYccwooVKxg1ahRFRUUccsgh\n3HbbbYmyxx13HLNnz6aiooJOnToxduxY+vfvzzHHHMOqVasAuOmmm7j77rsT5ceOHcugQYM46KCD\nePPNNwH46quvOP/88+nXrx/Dhg2jqKgokbCS3XLLLXzrW9/i0EMP5corr6Ty9scff/wxp5xyCv37\n92fAgAEsXrwYgF/+8pccdthh9O/fn3Fx7KwMKCmISL1burR20+vqww8/5Prrr2fevHl0796dX//6\n15SUlFBaWspLL73EvHnzdllm/fr1nHjiiZSWlnLMMccwadKktOt2d9555x1+97vfJRLMvffey777\n7su8efP4z//8T9577720y1577bXMnDmTuXPnsn79ep5//nkAhg8fzvXXX09paSlvvvkme++9N888\n8wzPPfcc77zzDqWlpdxwww31tHdqR0lBROpdr161m15X+++/P0VFO+5J//jjjzNgwAAGDBjA/Pnz\n0yaFNm3acOaZZwIwcODAxK/1VEOHDt2lzBtvvMFFF10EQP/+/TnkkEPSLjt9+nQGDRpE//79efXV\nV/nggw9Yt24dn3/+Oeeccw4QLjZr27YtL7/8Mpdffjlt2rQBYK+99qr9jqgHSgoiUu/Gj4e2bXee\n1rZtmB6Hdu3aJV4vWLCA3//+97zyyivMmTOHwYMHpz1vv1WrVonXzZs3p6KiIu2699hjjxrLpFNe\nXs7o0aOZOnUqc+bM4fLLL28QV4MrKYhIvRsxAiZMgN69wSw8T5iw+53MtbFhwwY6dOjAnnvuyYoV\nK3jhhRfqfRv
HHnssTzzxBABz585NWxPZtGkTzZo1o6CggC+//JIpU6YA0LlzZ7p27cozzzwDhIsC\ny8vLOe2005g0aRKbNm0CYO3atfUedyZ09pGIxGLEiOwkgVQDBgygX79+9O3bl969e3PsscfW+zbG\njBnD978rmfjYAAANkklEQVT/ffr165d4dOzYcacyXbp04ZJLLqFfv35069aNo446KjGvuLiYH/7w\nh4wbN45WrVoxZcoUzj77bEpLSykqKqJly5acc8453H777fUee02ssje8oSgqKvKSkpJchyHS5Myf\nP5+DDz4412HkhYqKCioqKmjdujULFizg9NNPZ8GCBbRokR+/s9P9rcxslrsXVbFIQn58AhGRBmTj\nxo18+9vfpqKiAnfngQceyJuEUFeN41OIiGRRp06dmDVrVq7DiIU6mkVEJEFJQUREEpQUREQkQUlB\nREQSlBREpEE4+eSTd7kQ7e677+aqq66qdrn27dsDsHz5coYNG5a2zEknnURNp7rffffdlCeN8nfW\nWWfxxRdfZBJ6g6KkICINwvDhw5k8efJO0yZPnszw4cMzWn6//fbjqaee2u3tpyaFZ599lk6dOu32\n+vKVTkkVkVq77jpIM1J0nRxxBEQjVqc1bNgwbrrpJrZs2UKrVq1YvHgxy5cv5/jjj2fjxo0MGTKE\ndevWsXXrVu644w6GDBmy0/KLFy/m7LPP5v3332fTpk1cdtlllJaW0rdv38TQEgBXXXUVM2fOZNOm\nTQwbNoxf/OIX3HPPPSxfvpyTTz6ZgoICZsyYQWFhISUlJRQUFHDXXXclRlkdOXIk1113HYsXL+bM\nM8/kuOOO480336R79+785S9/SQx4V+mZZ57hjjvuYMuWLXTp0oXi4mL22WcfNm7cyJgxYygpKcHM\nuOWWWzj//PN5/vnnufHGG9m2bRsFBQVMnz69/v4IxFxTMLPBZvaRmS00s7Fp5l9qZqvNbHb0GBln\nPCLScO21114MGjSI5557Dgi1hAsuuAAzo3Xr1kydOpV3332XGTNmcMMNN1DdaA33338/bdu2Zf78\n+fziF7/Y6ZqD8ePHU1JSwpw5c3j11VeZM2cO11xzDfvttx8zZsxgxowZO61r1qxZPPTQQ7z99tu8\n9dZb/PGPf0wMpb1gwQKuvvpqPvjgAzp16pQY/yjZcccdx1tvvcV7773HRRddxG9/+1sAbr/9djp2\n7MjcuXOZM2cOp5xyCqtXr+YHP/gBU6ZMobS0lCeffLLO+zVVbDUFM2sO3AecBpQBM81smrunjhz1\nZ3cfHVccIlL/qvtFH6fKJqQhQ4YwefJkJk6cCIR7Htx444289tprNGvWjGXLlrFy5Ur23XfftOt5\n7bXXuOaaawA4/PDDOfzwwxPznnjiCSZMmEBFRQUrVqxg3rx5O81P9cYbb3DeeeclRmodOnQor7/+\nOueeey59+vThiCOOAKoenrusrIwLL7yQFStWsGXLFvr06QPAyy+/vFNzWefOnXnmmWc44YQTEmXi\nGF47zprCIGChuy9y9y3AZGBIDcvEor7vFSsiuTFkyBCmT5/Ou+++S3l5OQMHDgTCAHOrV69m1qxZ\nzJ49m3322We3hqn+5JNPuPPOO5k+fTpz5szhO9/5Tp2Gu64cdhuqHnp7zJgxjB49mrlz5/LAAw/k\nfHjtOJNCd+DTpPdl0bRU55vZHDN7ysx6pluRmY0ysxIzK1m9enWtgqi8V+ySJeC+416xSgwiDU/7\n9u05+eSTufzyy3fqYF6/fj177703LVu2ZMaMGSxZsqTa9Zxwwgk89thjALz//vvMmTMHCMNut2vX\njo4dO7Jy5cpEUxVAhw4d+PLLL3dZ1/HHH8/TTz9NeXk5X331FVOnTuX444/P+DOtX7+e7t3DofHh\nhx9OTD/ttNO47777Eu/XrVvH0UcfzWuvvcYnn3wCxDO8dq7PPnoGKHT3w4GXg
IfTFXL3Ce5e5O5F\nXbt2rdUGsnmvWBGJ3/DhwyktLd0pKYwYMYKSkhIOO+wwHnnkEfr27VvtOq666io2btzIwQcfzM03\n35yocfTv358jjzySvn37cvHFF+807PaoUaMYPHgwJ5988k7rGjBgAJdeeimDBg3iqKOOYuTIkRx5\n5JEZf55bb72Vf//3f2fgwIEUFBQkpt90002sW7eOQw89lP79+zNjxgy6du3KhAkTGDp0KP379+fC\nCy/MeDuZim3obDM7BrjV3c+I3v8cwN1/VUX55sBad++Ybn6l2g6d3axZqCHsuj3Yvj3j1Yg0eRo6\nu+Goy9DZcdYUZgIHmFkfM2sFXARMSy5gZt2S3p4LzK/vILJ9r1gRkYYstqTg7hXAaOAFwsH+CXf/\nwMxuM7Nzo2LXmNkHZlYKXANcWt9xZPtesSIiDVmsF6+5+7PAsynTbk56/XPg53HGUHk7wHHjYOnS\nUEMYPz43twkUaejcHTPLdRhSjbp2CTSJK5pzda9YkcakdevWrFmzhi5duigx5Cl3Z82aNbRu3Xq3\n19EkkoKI1F2PHj0oKyujtqeFS3a1bt2aHj167PbySgoikpGWLVsmrqSVxivX1ymIiEgeUVIQEZEE\nJQUREUmI7YrmuJjZaqD6gU1ypwD4PNdBVEPx1U2+xwf5H6Piq5u6xNfb3WscJ6jBJYV8ZmYlmVxG\nniuKr27yPT7I/xgVX91kIz41H4mISIKSgoiIJCgp1K8JuQ6gBoqvbvI9Psj/GBVf3cQen/oUREQk\nQTUFERFJUFIQEZEEJYVaMrOeZjbDzOZF94K4Nk2Zk8xsvZnNjh43p1tXjDEuNrO50bZ3uU2dBfeY\n2cLo/tgDshjbQUn7ZbaZbTCz61LKZH3/mdkkM1tlZu8nTdvLzF4yswXRc+cqlr0kKrPAzC7JUmy/\nM7MPo7/fVDPrVMWy1X4XYo7xVjNblvR3PKuKZQeb2UfR93FsFuP7c1Jsi81sdhXLxroPqzqm5Oz7\n5+561OIBdAMGRK87AB8D/VLKnAT8NYcxLgYKqpl/FvAcYMDRwNs5irM58Bnhopqc7j/gBGAA8H7S\ntN8CY6PXY4HfpFluL2BR9Nw5et05C7GdDrSIXv8mXWyZfBdijvFW4McZfAf+BXwDaAWUpv4/xRVf\nyvz/C9yci31Y1TElV98/1RRqyd1XuPu70esvCXeV657bqGptCPCIB28BnVJujZot3wb+5e45v0Ld\n3V8D1qZMHgI8HL1+GPi3NIueAbzk7mvdfR3wEjA47tjc/UUPdzcEeAvY/bGS60EV+y8Tg4CF7r7I\n3bcAkwn7vV5VF5+Fm0NcADxe39vNRDXHlJx8/5QU6sDMCoEjgbfTzD7GzErN7DkzOySrgYEDL5rZ\nLDMblWZ+d+DTpPdl5CaxXUTV/4i53H+V9nH3FdHrz4B90pTJh315OaHml05N34W4jY6auCZV0fyR\nD/vveGCluy+oYn7W9mHKMSUn3z8lhd1kZu2BKcB17r4hZfa7hCaR/sC9wNNZDu84dx8AnAlcbWYn\nZHn7NTKzVsC5wJNpZud6/+3CQ109787fNrNxQAVQXEWRXH4X7gf2B44AVhCaaPLRcKqvJWRlH1Z3\nTMnm909JYTeYWUvCH6/Y3f83db67b3D3jdHrZ4GWZlaQrfjcfVn0vAqYSqiiJ1sG9Ex63yOalk1n\nAu+6+8rUGbnef0lWVjarRc+r0pTJ2b40s0uBs4ER0UFjFxl8F2Lj7ivdfZu7bwf+WMW2c/pdNLMW\nwFDgz1WVycY+rOKYkpPvn5JCLUXtjxOB+e5+VxVl9o3KYWaDCPt5TZbia2dmHSpfEzok308pNg34\nfnQW0tHA+qRqarZU+essl/svxTSg8myOS4C/pCnzAnC6mXWOmkdOj6bFyswGAz8FznX38irKZPJd\niDPG5H6q86rY9kzgADPrE9UeLyLs92w5F
fjQ3cvSzczGPqzmmJKb719cPeqN9QEcR6jGzQFmR4+z\ngCuBK6Myo4EPCGdSvAX8nyzG941ou6VRDOOi6cnxGXAf4ayPuUBRlvdhO8JBvmPStJzuP0KCWgFs\nJbTLXgF0AaYDC4CXgb2iskXAg0nLXg4sjB6XZSm2hYS25Mrv4H9HZfcDnq3uu5DF/fc/0fdrDuEA\n1y01xuj9WYQzbv4VV4zp4oum/6nye5dUNqv7sJpjSk6+fxrmQkREEtR8JCIiCUoKIiKSoKQgIiIJ\nSgoiIpKgpCAiIglKCiIRM9tmO4/gWm8jdppZYfIInSL5qkWuAxDJI5vc/YhcByGSS6opiNQgGk//\nt9GY+u+Y2Tej6YVm9ko04Nt0M+sVTd/Hwj0OSqPH/4lW1dzM/hiNmf+imbWJyl8TjaU/x8wm5+hj\nigBKCiLJ2qQ0H12YNG+9ux8G/AG4O5p2L/Cwux9OGJDunmj6PcCrHgb0G0C4EhbgAOA+dz8E+AI4\nP5o+FjgyWs+VcX04kUzoimaRiJltdPf2aaYvBk5x90XRwGWfuXsXM/ucMHTD1mj6CncvMLPVQA93\n35y0jkLCuPcHRO9/BrR09zvM7HlgI2E02Kc9GgxQJBdUUxDJjFfxujY2J73exo4+ve8QxqIaAMyM\nRu4UyQklBZHMXJj0/M/o9ZuEUT0BRgCvR6+nA1cBmFlzM+tY1UrNrBnQ091nAD8DOgK71FZEskW/\nSER2aGM737z9eXevPC21s5nNIfzaHx5NGwM8ZGY/AVYDl0XTrwUmmNkVhBrBVYQROtNpDjwaJQ4D\n7nH3L+rtE4nUkvoURGoQ9SkUufvnuY5FJG5qPhIRkQTVFEREJEE1BRERSVBSEBGRBCUFERFJUFIQ\nEZEEJQUREUn4/0ypeWoxAVOLAAAAAElFTkSuQmCC\n", 393 | "text/plain": [ 394 | "" 395 | ] 396 | }, 397 | "metadata": {}, 398 | "output_type": "display_data" 399 | } 400 | ], 401 | "source": [ 402 | "plt.clf() # clear figure\n", 403 | "\n", 404 | "acc = history.history['acc']\n", 405 | "val_acc = history.history['val_acc']\n", 406 | "\n", 407 | "plt.plot(epochs, acc, 'bo', label='Training acc')\n", 408 | "plt.plot(epochs, val_acc, 'b', label='Validation acc')\n", 409 | "plt.title('Training and validation accuracy')\n", 410 | "plt.xlabel('Epochs')\n", 411 | "plt.ylabel('Loss')\n", 412 | "plt.legend()\n", 413 | "\n", 414 | "plt.show()" 415 | ] 416 | }, 417 | { 418 | "cell_type": "markdown", 419 | "metadata": {}, 420 | "source": [ 421 | "## 从头开始训练" 422 | ] 423 | }, 424 | { 425 | "cell_type": "code", 426 | "execution_count": 98, 427 | "metadata": { 428 | "collapsed": true 429 | }, 430 | "outputs": [ 431 | { 432 | "name": "stdout", 433 | "output_type": "stream", 434 | "text": [ 435 | "Train on 7982 samples, validate on 1000 samples\n", 436 | "Epoch 1/8\n", 437 | "7982/7982 
[==============================] - 2s 225us/step - loss: 2.6785 - acc: 0.5163 - val_loss: 1.7800 - val_acc: 0.6470\n", 438 | "Epoch 2/8\n", 439 | "7982/7982 [==============================] - 1s 160us/step - loss: 1.4534 - acc: 0.7036 - val_loss: 1.3325 - val_acc: 0.7050\n", 440 | "Epoch 3/8\n", 441 | "7982/7982 [==============================] - 1s 158us/step - loss: 1.0707 - acc: 0.7729 - val_loss: 1.1677 - val_acc: 0.7420\n", 442 | "Epoch 4/8\n", 443 | "7982/7982 [==============================] - 1s 150us/step - loss: 0.8454 - acc: 0.8206 - val_loss: 1.0790 - val_acc: 0.7510\n", 444 | "Epoch 5/8\n", 445 | "7982/7982 [==============================] - 1s 160us/step - loss: 0.6767 - acc: 0.8561 - val_loss: 0.9885 - val_acc: 0.7950\n", 446 | "Epoch 6/8\n", 447 | "7982/7982 [==============================] - 1s 153us/step - loss: 0.5411 - acc: 0.8875 - val_loss: 0.9413 - val_acc: 0.8070\n", 448 | "Epoch 7/8\n", 449 | "7982/7982 [==============================] - 1s 152us/step - loss: 0.4391 - acc: 0.9080 - val_loss: 0.9254 - val_acc: 0.8070\n", 450 | "Epoch 8/8\n", 451 | "7982/7982 [==============================] - 1s 152us/step - loss: 0.3575 - acc: 0.9280 - val_loss: 0.9160 - val_acc: 0.8050\n" 452 | ] 453 | }, 454 | { 455 | "data": { 456 | "text/plain": [ 457 | "" 458 | ] 459 | }, 460 | "execution_count": 98, 461 | "metadata": {}, 462 | "output_type": "execute_result" 463 | }, 464 | { 465 | "name": "stdout", 466 | "output_type": "stream", 467 | "text": [ 468 | "2246/2246 [==============================] - 0s 176us/step\n" 469 | ] 470 | } 471 | ], 472 | "source": [ 473 | "model = models.Sequential()\n", 474 | "model.add(layers.Dense(64, activation='relu', input_shape=(10000,)))\n", 475 | "model.add(layers.Dense(64, activation='relu'))\n", 476 | "model.add(layers.Dense(46, activation='softmax'))\n", 477 | "\n", 478 | "model.compile(optimizer='rmsprop',\n", 479 | " loss='categorical_crossentropy',\n", 480 | " metrics=['accuracy'])\n", 481 | 
"model.fit(partial_x_train,\n",
482 |     "          partial_y_train,\n",
483 |     "          epochs=8,\n",
484 |     "          batch_size=512,\n",
485 |     "          validation_data=(x_val, y_val))\n",
486 |     "model.evaluate(x_test, one_hot_test_labels)"
487 |    ]
488 |   },
489 |   {
490 |    "cell_type": "code",
491 |    "execution_count": 97,
492 |    "metadata": {},
493 |    "outputs": [
494 |     {
495 |      "data": {
496 |       "text/plain": [
497 |        "0.19056099732858414"
498 |       ]
499 |      },
500 |      "execution_count": 97,
501 |      "metadata": {},
502 |      "output_type": "execute_result"
503 |     }
504 |    ],
505 |    "source": [
506 |     "import copy  # measure the accuracy of a completely random baseline\n",
507 |     "test_labels_copy = copy.copy(test_labels)\n",
508 |     "np.random.shuffle(test_labels_copy)\n",
509 |     "float(np.sum(np.array(test_labels) == np.array(test_labels_copy))) / len(test_labels)"
510 |    ]
511 |   },
512 |   {
513 |    "cell_type": "markdown",
514 |    "metadata": {},
515 |    "source": [
516 |     "## Evaluation and prediction"
517 |    ]
518 |   },
519 |   {
520 |    "cell_type": "code",
521 |    "execution_count": 109,
522 |    "metadata": {},
523 |    "outputs": [
524 |     {
525 |      "data": {
526 |       "text/plain": [
527 |        "(2246, 46)"
528 |       ]
529 |      },
530 |      "execution_count": 109,
531 |      "metadata": {},
532 |      "output_type": "execute_result"
533 |     }
534 |    ],
535 |    "source": [
536 |     "predictions = model.predict(x_test)\n",
537 |     "predictions.shape"
538 |    ]
539 |   },
540 |   {
541 |    "cell_type": "code",
542 |    "execution_count": 110,
543 |    "metadata": {},
544 |    "outputs": [
545 |     {
546 |      "data": {
547 |       "text/plain": [
548 |        "3"
549 |       ]
550 |      },
551 |      "execution_count": 110,
552 |      "metadata": {},
553 |      "output_type": "execute_result"
554 |     },
555 |     {
556 |      "data": {
557 |       "text/plain": [
558 |        "array([ 3, 10,  1, ...,  3,  4,  1], dtype=int64)"
559 |       ]
560 |      },
561 |      "execution_count": 110,
562 |      "metadata": {},
563 |      "output_type": "execute_result"
564 |     }
565 |    ],
566 |    "source": [
567 |     "np.argmax(predictions[0])\n",
568 |     "np.argmax(predictions, axis=1)"
569 |    ]
570 |   },
571 |   {
572 |    "cell_type": "markdown",
573 |    "metadata": {},
574 |    "source": [
575 
|     "## Another way to handle the labels and the loss"
576 |    ]
577 |   },
578 |   {
579 |    "cell_type": "code",
580 |    "execution_count": 113,
581 |    "metadata": {},
582 |    "outputs": [],
583 |    "source": [
584 |     "y_train = np.array(train_labels)  # cast the labels to an integer tensor (note: integers, not floats)\n",
585 |     "y_test = np.array(test_labels)"
586 |    ]
587 |   },
588 |   {
589 |    "cell_type": "code",
590 |    "execution_count": 114,
591 |    "metadata": {},
592 |    "outputs": [
593 |     {
594 |      "data": {
595 |       "text/plain": [
596 |        "array([ 3,  4,  3, ..., 25,  3, 25], dtype=int64)"
597 |       ]
598 |      },
599 |      "execution_count": 114,
600 |      "metadata": {},
601 |      "output_type": "execute_result"
602 |     },
603 |     {
604 |      "data": {
605 |       "text/plain": [
606 |        "array([ 3,  4,  3, ..., 25,  3, 25], dtype=int64)"
607 |       ]
608 |      },
609 |      "execution_count": 114,
610 |      "metadata": {},
611 |      "output_type": "execute_result"
612 |     }
613 |    ],
614 |    "source": [
615 |     "train_labels\n",
616 |     "y_train"
617 |    ]
618 |   },
619 |   {
620 |    "cell_type": "code",
621 |    "execution_count": 36,
622 |    "metadata": {
623 |     "collapsed": true
624 |    },
625 |    "outputs": [],
626 |    "source": [
627 |     "model.compile(optimizer='rmsprop', loss='sparse_categorical_crossentropy', metrics=['acc'])"
628 |    ]
629 |   },
630 |   {
631 |    "cell_type": "markdown",
632 |    "metadata": {},
633 |    "source": [
634 |     "`sparse_categorical_crossentropy` and `categorical_crossentropy` are mathematically identical; they differ only in their interface (integer labels versus one-hot labels)"
635 |    ]
636 |   }
637 |  ],
638 |  "metadata": {
639 |   "kernelspec": {
640 |    "display_name": "Python [conda env:tf-py36]",
641 |    "language": "python",
642 |    "name": "conda-env-tf-py36-py"
643 |   },
644 |   "language_info": {
645 |    "codemirror_mode": {
646 |     "name": "ipython",
647 |     "version": 3
648 |    },
649 |    "file_extension": ".py",
650 |    "mimetype": "text/x-python",
651 |    "name": "python",
652 |    "nbconvert_exporter": "python",
653 |    "pygments_lexer": "ipython3",
654 |    "version": "3.6.8"
655 |   },
656 |   "toc": {
657 |    "base_numbering": 1,
658 |    "nav_menu": {},
659 |    "number_sections": true,
660 |    "sideBar": true,
661 |    "skip_h1_title": false,
662 
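The claim in the cell above — that the sparse and one-hot cross-entropy losses compute the same quantity — can be checked with a small standalone NumPy sketch. This is not part of the notebook; the toy probability vector below is invented purely for illustration:

```python
import numpy as np

# A toy softmax output for one sample over 46 classes, and its true class.
probs = np.full(46, 0.01)
probs[3] = 0.55                  # put most of the mass on class 3
probs /= probs.sum()             # normalize to a valid distribution
label = 3

# categorical_crossentropy consumes a one-hot target vector ...
one_hot = np.zeros(46)
one_hot[label] = 1.0
cat_ce = -np.sum(one_hot * np.log(probs))

# ... sparse_categorical_crossentropy consumes the integer label directly.
sparse_ce = -np.log(probs[label])

assert np.isclose(cat_ce, sparse_ce)  # same loss, different label interface
```

The one-hot vector simply selects a single term of the sum, which is exactly what indexing by the integer label does.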
|    "title_cell": "Table of Contents",
663 |    "title_sidebar": "Contents",
664 |    "toc_cell": false,
665 |    "toc_position": {},
666 |    "toc_section_display": true,
667 |    "toc_window_display": true
668 |   }
669 |  },
670 |  "nbformat": 4,
671 |  "nbformat_minor": 2
672 | }
673 | 
--------------------------------------------------------------------------------
/5.1-CNN简介.ipynb:
--------------------------------------------------------------------------------
  1 | {
  2 |  "cells": [
  3 |   {
  4 |    "cell_type": "code",
  5 |    "execution_count": 1,
  6 |    "metadata": {},
  7 |    "outputs": [
  8 |     {
  9 |      "name": "stderr",
 10 |      "output_type": "stream",
 11 |      "text": [
 12 |       "Using TensorFlow backend.\n"
 13 |      ]
 14 |     },
 15 |     {
 16 |      "data": {
 17 |       "text/plain": [
 18 |        "'2.2.4'"
 19 |       ]
 20 |      },
 21 |      "execution_count": 1,
 22 |      "metadata": {},
 23 |      "output_type": "execute_result"
 24 |     }
 25 |    ],
 26 |    "source": [
 27 |     "import keras\n",
 28 |     "keras.__version__"
 29 |    ]
 30 |   },
 31 |   {
 32 |    "cell_type": "markdown",
 33 |    "metadata": {},
 34 |    "source": [
 35 |     "# Introduction to convnets (CNN)"
 36 |    ]
 37 |   },
 38 |   {
 39 |    "cell_type": "code",
 40 |    "execution_count": null,
 41 |    "metadata": {},
 42 |    "outputs": [],
 43 |    "source": [
 44 |     "1. A convnet takes input tensors of shape `(image_height, image_width, image_channels)`"
 45 |    ]
 46 |   },
 47 |   {
 48 |    "cell_type": "code",
 49 |    "execution_count": 3,
 50 |    "metadata": {},
 51 |    "outputs": [
 52 |     {
 53 |      "name": "stdout",
 54 |      "output_type": "stream",
 55 |      "text": [
 56 |       "WARNING:tensorflow:From C:\\Users\\pengfeizhang\\Anaconda3\\envs\\learn36\\lib\\site-packages\\tensorflow\\python\\framework\\op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.\n",
 57 |       "Instructions for updating:\n",
 58 |       "Colocations handled automatically by placer.\n"
 59 |      ]
 60 |     }
 61 |    ],
 62 |    "source": [
 63 |     "from keras import layers\n",
 64 |     "from keras import models\n",
 65 |     "\n",
 66 |     "model = models.Sequential()\n",
 67 |     "model.add(layers.Conv2D(32, (3, 3), activation='relu', 
input_shape=(28, 28, 1)))\n",
 68 |     "model.add(layers.MaxPooling2D((2, 2)))\n",
 69 |     "model.add(layers.Conv2D(64, (3, 3), activation='relu'))\n",
 70 |     "model.add(layers.MaxPooling2D((2, 2)))\n",
 71 |     "model.add(layers.Conv2D(64, (3, 3), activation='relu'))\n",
 72 |     "\n",
 73 |     "model.add(layers.Flatten())  # flatten the 3D feature maps to 1D\n",
 74 |     "model.add(layers.Dense(64, activation='relu'))\n",
 75 |     "model.add(layers.Dense(10, activation='softmax'))"
 76 |    ]
 77 |   },
 78 |   {
 79 |    "cell_type": "code",
 80 |    "execution_count": 11,
 81 |    "metadata": {
 82 |     "collapsed": true
 83 |    },
 84 |    "outputs": [
 85 |     {
 86 |      "name": "stdout",
 87 |      "output_type": "stream",
 88 |      "text": [
 89 |       "_________________________________________________________________\n",
 90 |       "Layer (type)                 Output Shape              Param #   \n",
 91 |       "=================================================================\n",
 92 |       "conv2d_7 (Conv2D)            (None, 26, 26, 32)        320       \n",
 93 |       "_________________________________________________________________\n",
 94 |       "max_pooling2d_5 (MaxPooling2 (None, 13, 13, 32)        0         \n",
 95 |       "_________________________________________________________________\n",
 96 |       "conv2d_8 (Conv2D)            (None, 11, 11, 64)        18496     \n",
 97 |       "_________________________________________________________________\n",
 98 |       "max_pooling2d_6 (MaxPooling2 (None, 5, 5, 64)          0         \n",
 99 |       "_________________________________________________________________\n",
100 |       "conv2d_9 (Conv2D)            (None, 3, 3, 64)          36928     \n",
101 |       "_________________________________________________________________\n",
102 |       "flatten_4 (Flatten)          (None, 576)               0         \n",
103 |       "_________________________________________________________________\n",
104 |       "dense_5 (Dense)              (None, 64)                36928     \n",
105 |       "_________________________________________________________________\n",
106 |       "dense_6 (Dense)              (None, 10)                650       \n",
107 |       "=================================================================\n",
108 |       "Total params: 93,322\n",
109 |       "Trainable params: 93,322\n",
110 |       "Non-trainable params: 0\n",
111 |      
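The parameter counts reported by `model.summary()` above follow a simple rule: each Conv2D layer has one weight per kernel position per input channel per filter, plus one bias per filter. A standalone plain-Python check (independent of Keras; the helper names are invented here):

```python
# Conv2D: (kernel_h * kernel_w * in_channels + 1) * out_channels  (+1 is the bias)
def conv2d_params(kh, kw, in_ch, out_ch):
    return (kh * kw * in_ch + 1) * out_ch

# Dense: (in_units + 1) * out_units
def dense_params(in_units, out_units):
    return (in_units + 1) * out_units

layers_params = [
    conv2d_params(3, 3, 1, 32),    # conv2d_7  -> 320
    conv2d_params(3, 3, 32, 64),   # conv2d_8  -> 18496
    conv2d_params(3, 3, 64, 64),   # conv2d_9  -> 36928
    dense_params(3 * 3 * 64, 64),  # dense_5   -> 36928 (576 flattened inputs)
    dense_params(64, 10),          # dense_6   -> 650
]
print(layers_params, sum(layers_params))  # pooling and Flatten layers add 0
```

The sum, 93,322, matches the "Total params" line of the summary.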
"_________________________________________________________________\n"
112 |      ]
113 |     }
114 |    ],
115 |    "source": [
116 |     "model.summary()"
117 |    ]
118 |   },
119 |   {
120 |    "cell_type": "markdown",
121 |    "metadata": {},
122 |    "source": [
123 |     "# Solving the MNIST digit-classification problem with a CNN"
124 |    ]
125 |   },
126 |   {
127 |    "cell_type": "code",
128 |    "execution_count": 4,
129 |    "metadata": {},
130 |    "outputs": [],
131 |    "source": [
132 |     "from keras.datasets import mnist\n",
133 |     "from keras.utils import to_categorical\n",
134 |     "\n",
135 |     "(train_images, train_labels), (test_images, test_labels) = mnist.load_data()\n",
136 |     "\n",
137 |     "train_images = train_images.reshape((60000, 28, 28, 1))\n",
138 |     "train_images = train_images.astype('float32') / 255\n",
139 |     "\n",
140 |     "test_images = test_images.reshape((10000, 28, 28, 1))\n",
141 |     "test_images = test_images.astype('float32') / 255\n",
142 |     "\n",
143 |     "train_labels = to_categorical(train_labels)\n",
144 |     "test_labels = to_categorical(test_labels)"
145 |    ]
146 |   },
147 |   {
148 |    "cell_type": "code",
149 |    "execution_count": 5,
150 |    "metadata": {
151 |     "collapsed": true
152 |    },
153 |    "outputs": [
154 |     {
155 |      "name": "stdout",
156 |      "output_type": "stream",
157 |      "text": [
158 |       "WARNING:tensorflow:From C:\\Users\\pengfeizhang\\Anaconda3\\envs\\learn36\\lib\\site-packages\\tensorflow\\python\\ops\\math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n",
159 |       "Instructions for updating:\n",
160 |       "Use tf.cast instead.\n",
161 |       "Epoch 1/5\n",
162 |       "60000/60000 [==============================] - 12s 194us/step - loss: 0.1776 - acc: 0.9436\n",
163 |       "Epoch 2/5\n",
164 |       "60000/60000 [==============================] - 7s 113us/step - loss: 0.0488 - acc: 0.9848\n",
165 |       "Epoch 3/5\n",
166 |       "60000/60000 [==============================] - 7s 111us/step - loss: 0.0336 - acc: 0.9893\n",
167 |       "Epoch 4/5\n",
168 |       "60000/60000 [==============================] - 7s 
112us/step - loss: 0.0250 - acc: 0.9919\n",
169 |       "Epoch 5/5\n",
170 |       "60000/60000 [==============================] - 7s 111us/step - loss: 0.0206 - acc: 0.9935\n"
171 |      ]
172 |     },
173 |     {
174 |      "data": {
175 |       "text/plain": [
176 |        ""
177 |       ]
178 |      },
179 |      "execution_count": 5,
180 |      "metadata": {},
181 |      "output_type": "execute_result"
182 |     }
183 |    ],
184 |    "source": [
185 |     "model.compile(optimizer='rmsprop',\n",
186 |     "              loss='categorical_crossentropy',\n",
187 |     "              metrics=['accuracy'])\n",
188 |     "model.fit(train_images, train_labels, epochs=5, batch_size=64)"
189 |    ]
190 |   },
191 |   {
192 |    "cell_type": "code",
193 |    "execution_count": 6,
194 |    "metadata": {},
195 |    "outputs": [
196 |     {
197 |      "name": "stdout",
198 |      "output_type": "stream",
199 |      "text": [
200 |       "10000/10000 [==============================] - 1s 83us/step\n"
201 |      ]
202 |     }
203 |    ],
204 |    "source": [
205 |     "test_loss, test_acc = model.evaluate(test_images, test_labels)"
206 |    ]
207 |   },
208 |   {
209 |    "cell_type": "code",
210 |    "execution_count": 7,
211 |    "metadata": {},
212 |    "outputs": [
213 |     {
214 |      "data": {
215 |       "text/plain": [
216 |        "0.9916"
217 |       ]
218 |      },
219 |      "execution_count": 7,
220 |      "metadata": {},
221 |      "output_type": "execute_result"
222 |     }
223 |    ],
224 |    "source": [
225 |     "test_acc"
226 |    ]
227 |   },
228 |   {
229 |    "cell_type": "markdown",
230 |    "metadata": {},
231 |    "source": [
232 |     "Chapter 2: the densely connected network reached a test accuracy of 97.8%\n",
233 |     "Here: the CNN reaches a test accuracy of 99.2%"
234 |    ]
235 |   }
236 |  ],
237 |  "metadata": {
238 |   "kernelspec": {
239 |    "display_name": "Python [conda env:learn36]",
240 |    "language": "python",
241 |    "name": "conda-env-learn36-py"
242 |   },
243 |   "language_info": {
244 |    "codemirror_mode": {
245 |     "name": "ipython",
246 |     "version": 3
247 |    },
248 |    "file_extension": ".py",
249 |    "mimetype": "text/x-python",
250 |    "name": "python",
251 |    "nbconvert_exporter": "python",
252 |    "pygments_lexer": "ipython3",
253 |    "version": "3.6.8"
254 |   },
255 |   "toc": {
256 |    "base_numbering": 1,
257 |    
"nav_menu": {},
258 |    "number_sections": true,
259 |    "sideBar": true,
260 |    "skip_h1_title": false,
261 |    "title_cell": "Table of Contents",
262 |    "title_sidebar": "Contents",
263 |    "toc_cell": false,
264 |    "toc_position": {},
265 |    "toc_section_display": true,
266 |    "toc_window_display": true
267 |   }
268 |  },
269 |  "nbformat": 4,
270 |  "nbformat_minor": 2
271 | }
272 | 
--------------------------------------------------------------------------------
/6.1.1-单词和字符的one-hot编码.ipynb:
--------------------------------------------------------------------------------
  1 | {
  2 |  "cells": [
  3 |   {
  4 |    "cell_type": "code",
  5 |    "execution_count": 1,
  6 |    "metadata": {
  7 |     "scrolled": true,
  8 |     "pycharm": {}
  9 |    },
 10 |    "outputs": [
 11 |     {
 12 |      "name": "stderr",
 13 |      "output_type": "stream",
 14 |      "text": [
 15 |       "Using TensorFlow backend.\n"
 16 |      ]
 17 |     },
 18 |     {
 19 |      "data": {
 20 |       "text/plain": [
 21 |        "\u00272.0.8\u0027"
 22 |       ]
 23 |      },
 24 |      "execution_count": 1,
 25 |      "metadata": {},
 26 |      "output_type": "execute_result"
 27 |     }
 28 |    ],
 29 |    "source": [
 30 |     "import keras\n",
 31 |     "keras.__version__"
 32 |    ]
 33 |   },
 34 |   {
 35 |    "cell_type": "code",
 36 |    "execution_count": null,
 37 |    "metadata": {
 38 |     "pycharm": {}
 39 |    },
 40 |    "outputs": [],
 41 |    "source": [
 42 |     "The units into which text is broken down are called tokens\n",
 43 |     "The process of breaking text into tokens is called tokenization\n",
 44 |     "Two approaches: one-hot encode the tokens; use word embeddings"
 45 |    ]
 46 |   },
 47 |   {
 48 |    "cell_type": "markdown",
 49 |    "metadata": {
 50 |     "pycharm": {}
 51 |    },
 52 |    "source": [
 53 |     "# Example: word-level one-hot encoding"
 54 |    ]
 55 |   },
 56 |   {
 57 |    "cell_type": "code",
 58 |    "execution_count": 5,
 59 |    "metadata": {
 60 |     "collapsed": true,
 61 |     "pycharm": {
 62 |      "is_executing": false
 63 |     }
 64 |    },
 65 |    "outputs": [
 66 |     {
 67 |      "data": {
 68 |       "text/plain": "{\u0027The\u0027: 1,\n \u0027cat\u0027: 2,\n \u0027sat\u0027: 3,\n \u0027on\u0027: 4,\n \u0027the\u0027: 5,\n \u0027mat.\u0027: 6,\n \u0027dog\u0027: 7,\n \u0027ate\u0027: 8,\n \u0027my\u0027: 9,\n \u0027homework.\u0027: 10}"
 69 |      },
 70 |      "metadata": {},
 71 |      "output_type": 
"execute_result",
 72 |      "execution_count": 5
 73 |     },
 74 |     {
 75 |      "data": {
 76 |       "text/plain": "(2, 10, 11)"
 77 |      },
 78 |      "metadata": {},
 79 |      "output_type": "execute_result",
 80 |      "execution_count": 5
 81 |     },
 82 |     {
 83 |      "data": {
 84 |       "text/plain": "array([[[0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 1., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]],\n\n       [[0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n        [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]]])"
 85 |      },
 86 |      "metadata": {},
 87 |      "output_type": "execute_result",
 88 |      "execution_count": 5
 89 |     }
 90 |    ],
 91 |    "source": [
 92 |     "import numpy as np \n",
 93 |     "samples \u003d [\u0027The cat sat on the mat.\u0027, \u0027The dog ate my homework.\u0027]\n",
 94 |     "token_index \u003d {}  # build an index of the tokens, { \u0027word\u0027: 1 }\n",
 95 |     "for sample in samples:\n",
 96 |     "    for word in sample.split():\n",
 97 |     "        if word not in token_index:\n",
 98 |     "            token_index[word] \u003d len(token_index) + 1\n",
 99 |     "token_index\n",
100 |     "\n",
101 |     "max_length \u003d 10  # one-hot encode; only the first 10 words of each sample are considered; index 0 is not used\n",
102 |     "results \u003d np.zeros((len(samples), max_length, max(token_index.values()) + 1))\n",
103 |     "for i, sample in enumerate(samples):\n",
104 |     "    for j, word in 
list(enumerate(sample.split()))[:max_length]:\n", 105 | " index \u003d token_index.get(word)\n", 106 | " results[i, j, index] \u003d 1.\n", 107 | "results.shape\n", 108 | "results" 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "metadata": { 114 | "pycharm": {} 115 | }, 116 | "source": [ 117 | "# 示例:字符级的one-hot编码" 118 | ] 119 | }, 120 | { 121 | "cell_type": "code", 122 | "execution_count": 8, 123 | "metadata": { 124 | "collapsed": true, 125 | "pycharm": { 126 | "is_executing": false 127 | } 128 | }, 129 | "outputs": [ 130 | { 131 | "data": { 132 | "text/plain": "{\u00270\u0027: 1,\n \u00271\u0027: 2,\n \u00272\u0027: 3,\n \u00273\u0027: 4,\n \u00274\u0027: 5,\n \u00275\u0027: 6,\n \u00276\u0027: 7,\n \u00277\u0027: 8,\n \u00278\u0027: 9,\n \u00279\u0027: 10,\n \u0027a\u0027: 11,\n \u0027b\u0027: 12,\n \u0027c\u0027: 13,\n \u0027d\u0027: 14,\n \u0027e\u0027: 15,\n \u0027f\u0027: 16,\n \u0027g\u0027: 17,\n \u0027h\u0027: 18,\n \u0027i\u0027: 19,\n \u0027j\u0027: 20,\n \u0027k\u0027: 21,\n \u0027l\u0027: 22,\n \u0027m\u0027: 23,\n \u0027n\u0027: 24,\n \u0027o\u0027: 25,\n \u0027p\u0027: 26,\n \u0027q\u0027: 27,\n \u0027r\u0027: 28,\n \u0027s\u0027: 29,\n \u0027t\u0027: 30,\n \u0027u\u0027: 31,\n \u0027v\u0027: 32,\n \u0027w\u0027: 33,\n \u0027x\u0027: 34,\n \u0027y\u0027: 35,\n \u0027z\u0027: 36,\n \u0027A\u0027: 37,\n \u0027B\u0027: 38,\n \u0027C\u0027: 39,\n \u0027D\u0027: 40,\n \u0027E\u0027: 41,\n \u0027F\u0027: 42,\n \u0027G\u0027: 43,\n \u0027H\u0027: 44,\n \u0027I\u0027: 45,\n \u0027J\u0027: 46,\n \u0027K\u0027: 47,\n \u0027L\u0027: 48,\n \u0027M\u0027: 49,\n \u0027N\u0027: 50,\n \u0027O\u0027: 51,\n \u0027P\u0027: 52,\n \u0027Q\u0027: 53,\n \u0027R\u0027: 54,\n \u0027S\u0027: 55,\n \u0027T\u0027: 56,\n \u0027U\u0027: 57,\n \u0027V\u0027: 58,\n \u0027W\u0027: 59,\n \u0027X\u0027: 60,\n \u0027Y\u0027: 61,\n \u0027Z\u0027: 62,\n \u0027!\u0027: 63,\n \u0027\"\u0027: 64,\n \u0027#\u0027: 65,\n \u0027$\u0027: 66,\n \u0027%\u0027: 67,\n 
\u0027\u0026\u0027: 68,\n \"\u0027\": 69,\n \u0027(\u0027: 70,\n \u0027)\u0027: 71,\n \u0027*\u0027: 72,\n \u0027+\u0027: 73,\n \u0027,\u0027: 74,\n \u0027-\u0027: 75,\n \u0027.\u0027: 76,\n \u0027/\u0027: 77,\n \u0027:\u0027: 78,\n \u0027;\u0027: 79,\n \u0027\u003c\u0027: 80,\n \u0027\u003d\u0027: 81,\n \u0027\u003e\u0027: 82,\n \u0027?\u0027: 83,\n \u0027@\u0027: 84,\n \u0027[\u0027: 85,\n \u0027\\\\\u0027: 86,\n \u0027]\u0027: 87,\n \u0027^\u0027: 88,\n \u0027_\u0027: 89,\n \u0027`\u0027: 90,\n \u0027{\u0027: 91,\n \u0027|\u0027: 92,\n \u0027}\u0027: 93,\n \u0027~\u0027: 94,\n \u0027 \u0027: 95,\n \u0027\\t\u0027: 96,\n \u0027\\n\u0027: 97,\n \u0027\\r\u0027: 98,\n \u0027\\x0b\u0027: 99,\n \u0027\\x0c\u0027: 100}" 133 | }, 134 | "metadata": {}, 135 | "output_type": "execute_result", 136 | "execution_count": 8 137 | }, 138 | { 139 | "data": { 140 | "text/plain": "(2, 50, 101)" 141 | }, 142 | "metadata": {}, 143 | "output_type": "execute_result", 144 | "execution_count": 8 145 | }, 146 | { 147 | "data": { 148 | "text/plain": "array([[[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]]])" 149 | }, 150 | "metadata": {}, 151 | "output_type": "execute_result", 152 | "execution_count": 8 153 | } 154 | ], 155 | "source": [ 156 | "import string \n", 157 | "samples \u003d [\u0027The cat sat on the mat.\u0027, \u0027The dog ate my homework.\u0027]\n", 158 | "characters \u003d string.printable # 所有可打印的ASCII字符 | characters\u003d\u002701234..tuv..NOPQt\\n\\x0b\\x0c\u0027\n", 159 | "token_index \u003d dict(zip(characters, range(1, len(characters) + 1)))\n", 160 | "token_index\n", 161 | "\n", 162 | "max_length \u003d 50 # 
one-hot encode; only the first 50 characters of each sample are considered; index 0 is not used\n",
163 |     "results \u003d np.zeros((len(samples), max_length, max(token_index.values()) + 1))\n",
164 |     "for i, sample in enumerate(samples):\n",
165 |     "    for j, character in enumerate(sample[:max_length]):\n",
166 |     "        index \u003d token_index.get(character)\n",
167 |     "        results[i, j, index] \u003d 1.\n",
168 |     "results.shape\n",
169 |     "results"
170 |    ]
171 |   },
172 |   {
173 |    "cell_type": "markdown",
174 |    "metadata": {
175 |     "pycharm": {}
176 |    },
177 |    "source": [
178 |     "# Word-level one-hot encoding with Keras"
179 |    ]
180 |   },
181 |   {
182 |    "cell_type": "code",
183 |    "execution_count": 10,
184 |    "metadata": {
185 |     "scrolled": true,
186 |     "pycharm": {
187 |      "is_executing": false
188 |     }
189 |    },
190 |    "outputs": [
191 |     {
192 |      "data": {
193 |       "text/plain": "{\u0027the\u0027: 1,\n \u0027cat\u0027: 2,\n \u0027sat\u0027: 3,\n \u0027on\u0027: 4,\n \u0027mat\u0027: 5,\n \u0027dog\u0027: 6,\n \u0027ate\u0027: 7,\n \u0027my\u0027: 8,\n \u0027homework\u0027: 9}"
194 |      },
195 |      "metadata": {},
196 |      "output_type": "execute_result",
197 |      "execution_count": 10
198 |     },
199 |     {
200 |      "data": {
201 |       "text/plain": "[[1, 2, 3, 4, 1, 5], [1, 6, 7, 8, 9]]"
202 |      },
203 |      "metadata": {},
204 |      "output_type": "execute_result",
205 |      "execution_count": 10
206 |     },
207 |     {
208 |      "data": {
209 |       "text/plain": "(2, 1000)"
210 |      },
211 |      "metadata": {},
212 |      "output_type": "execute_result",
213 |      "execution_count": 10
214 |     },
215 |     {
216 |      "data": {
217 |       "text/plain": "array([[0., 1., 1., ..., 0., 0., 0.],\n       [0., 1., 0., ..., 0., 0., 0.]])"
218 |      },
219 |      "metadata": {},
220 |      "output_type": "execute_result",
221 |      "execution_count": 10
222 |     }
223 |    ],
224 |    "source": [
225 |     "from keras.preprocessing.text import Tokenizer \n",
226 |     "samples \u003d [\u0027The cat sat on the mat.\u0027, \u0027The dog ate my homework.\u0027]\n",
227 |     "tokenizer \u003d Tokenizer(num_words\u003d1000)  # only consider the 1000 most common words\n",
228 |     "tokenizer.fit_on_texts(samples)  # builds the word index\n",
229 |     "\n",
230 |     
"word_index \u003d tokenizer.word_index  # the token index; index 0 is not used\n",
231 |     "word_index\n",
232 |     "sequences \u003d tokenizer.texts_to_sequences(samples)\n",
233 |     "sequences\n",
234 |     "one_hot_results \u003d tokenizer.texts_to_matrix(samples, mode\u003d\u0027binary\u0027)\n",
235 |     "one_hot_results.shape\n",
236 |     "one_hot_results"
237 |    ]
238 |   },
239 |   {
240 |    "cell_type": "markdown",
241 |    "metadata": {
242 |     "pycharm": {}
243 |    },
244 |    "source": [
245 |     "# Word-level one-hot encoding with the hashing trick"
246 |    ]
247 |   },
248 |   {
249 |    "cell_type": "code",
250 |    "execution_count": 12,
251 |    "metadata": {
252 |     "pycharm": {
253 |      "is_executing": false
254 |     }
255 |    },
256 |    "outputs": [
257 |     {
258 |      "data": {
259 |       "text/plain": "(2, 10, 1000)"
260 |      },
261 |      "metadata": {},
262 |      "output_type": "execute_result",
263 |      "execution_count": 12
264 |     }
265 |    ],
266 |    "source": [
267 |     "samples \u003d [\u0027The cat sat on the mat.\u0027, \u0027The dog ate my homework.\u0027]  # hash the words into vectors of fixed size\n",
268 |     "dimensionality \u003d 1000\n",
269 |     "max_length \u003d 10\n",
270 |     "results \u003d np.zeros((len(samples), max_length, dimensionality))\n",
271 |     "for i, sample in enumerate(samples):\n",
272 |     "    for j, word in list(enumerate(sample.split()))[:max_length]:\n",
273 |     "        index \u003d abs(hash(word)) % dimensionality\n",
274 |     "        results[i, j, index] \u003d 1.\n",
275 |     "results.shape"
276 |    ]
277 |   }
278 |  ],
279 |  "metadata": {
280 |   "kernelspec": {
281 |    "display_name": "Python [conda env:learn36]",
282 |    "language": "python",
283 |    "name": "conda-env-learn36-py"
284 |   },
285 |   "language_info": {
286 |    "codemirror_mode": {
287 |     "name": "ipython",
288 |     "version": 3
289 |    },
290 |    "file_extension": ".py",
291 |    "mimetype": "text/x-python",
292 |    "name": "python",
293 |    "nbconvert_exporter": "python",
294 |    "pygments_lexer": "ipython3",
295 |    "version": "3.6.8"
296 |   },
297 |   "toc": {
298 |    "base_numbering": 1,
299 |    "nav_menu": {},
300 |    "number_sections": true,
301 |    "sideBar": true,
302 |    "skip_h1_title": 
false,
303 |    "title_cell": "Table of Contents",
304 |    "title_sidebar": "Contents",
305 |    "toc_cell": false,
306 |    "toc_position": {},
307 |    "toc_section_display": true,
308 |    "toc_window_display": true
309 |   }
310 |  },
311 |  "nbformat": 4,
312 |  "nbformat_minor": 2
313 | }
--------------------------------------------------------------------------------
/6.4-使用CNN处理序列.ipynb:
--------------------------------------------------------------------------------
  1 | {
  2 |  "cells": [
  3 |   {
  4 |    "cell_type": "code",
  5 |    "execution_count": 1,
  6 |    "metadata": {},
  7 |    "outputs": [
  8 |     {
  9 |      "name": "stderr",
 10 |      "output_type": "stream",
 11 |      "text": [
 12 |       "Using TensorFlow backend.\n"
 13 |      ]
 14 |     },
 15 |     {
 16 |      "data": {
 17 |       "text/plain": [
 18 |        "'2.0.8'"
 19 |       ]
 20 |      },
 21 |      "execution_count": 1,
 22 |      "metadata": {},
 23 |      "output_type": "execute_result"
 24 |     }
 25 |    ],
 26 |    "source": [
 27 |     "import keras\n",
 28 |     "keras.__version__"
 29 |    ]
 30 |   },
 31 |   {
 32 |    "cell_type": "markdown",
 33 |    "metadata": {},
 34 |    "source": [
 35 |     "* Has been hugely successful in audio generation and machine translation\n",
 36 |     "* Input shape `(samples, time, features)`; the output shape is similar\n",
 37 |     "\n",
 38 |     "## IMDB sentiment classification with a 1D convnet"
 39 |    ]
 40 |   },
 41 |   {
 42 |    "cell_type": "code",
 43 |    "execution_count": 2,
 44 |    "metadata": {},
 45 |    "outputs": [
 46 |     {
 47 |      "name": "stdout",
 48 |      "output_type": "stream",
 49 |      "text": [
 50 |       "Loading data...\n",
 51 |       "25000 train sequences\n",
 52 |       "25000 test sequences\n",
 53 |       "Pad sequences (samples x time)\n",
 54 |       "x_train shape: (25000, 500)\n",
 55 |       "x_test shape: (25000, 500)\n"
 56 |      ]
 57 |     }
 58 |    ],
 59 |    "source": [
 60 |     "from keras.datasets import imdb\n",
 61 |     "from keras.preprocessing import sequence\n",
 62 |     "max_features = 10000\n",
 63 |     "max_len = 500\n",
 64 |     "\n",
 65 |     "print('Loading data...')\n",
 66 |     "(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)\n",
 67 |     "print(len(x_train), 'train sequences')\n",
 68 |     "print(len(x_test), 'test sequences')\n",
 69 |     "\n",
 70 |     "print('Pad sequences 
(samples x time)')\n", 71 | "x_train = sequence.pad_sequences(x_train, maxlen=max_len)\n", 72 | "x_test = sequence.pad_sequences(x_test, maxlen=max_len)\n", 73 | "print('x_train shape:', x_train.shape)\n", 74 | "print('x_test shape:', x_test.shape)" 75 | ] 76 | }, 77 | { 78 | "cell_type": "code", 79 | "execution_count": 3, 80 | "metadata": {}, 81 | "outputs": [ 82 | { 83 | "name": "stdout", 84 | "output_type": "stream", 85 | "text": [ 86 | "_________________________________________________________________\n", 87 | "Layer (type) Output Shape Param # \n", 88 | "=================================================================\n", 89 | "embedding_1 (Embedding) (None, 500, 128) 1280000 \n", 90 | "_________________________________________________________________\n", 91 | "conv1d_1 (Conv1D) (None, 494, 32) 28704 \n", 92 | "_________________________________________________________________\n", 93 | "max_pooling1d_1 (MaxPooling1 (None, 98, 32) 0 \n", 94 | "_________________________________________________________________\n", 95 | "conv1d_2 (Conv1D) (None, 92, 32) 7200 \n", 96 | "_________________________________________________________________\n", 97 | "global_max_pooling1d_1 (Glob (None, 32) 0 \n", 98 | "_________________________________________________________________\n", 99 | "dense_1 (Dense) (None, 1) 33 \n", 100 | "=================================================================\n", 101 | "Total params: 1,315,937\n", 102 | "Trainable params: 1,315,937\n", 103 | "Non-trainable params: 0\n", 104 | "_________________________________________________________________\n", 105 | "Train on 20000 samples, validate on 5000 samples\n", 106 | "Epoch 1/10\n", 107 | "20000/20000 [==============================] - 4s - loss: 0.7713 - acc: 0.5287 - val_loss: 0.6818 - val_acc: 0.5970\n", 108 | "Epoch 2/10\n", 109 | "20000/20000 [==============================] - 3s - loss: 0.6631 - acc: 0.6775 - val_loss: 0.6582 - val_acc: 0.6646\n", 110 | "Epoch 3/10\n", 111 | "20000/20000 
[==============================] - 3s - loss: 0.6142 - acc: 0.7580 - val_loss: 0.5987 - val_acc: 0.7118\n",
112 |       "Epoch 4/10\n",
113 |       "20000/20000 [==============================] - 3s - loss: 0.5156 - acc: 0.8124 - val_loss: 0.4936 - val_acc: 0.7736\n",
114 |       "Epoch 5/10\n",
115 |       "20000/20000 [==============================] - 3s - loss: 0.4029 - acc: 0.8469 - val_loss: 0.4123 - val_acc: 0.8358\n",
116 |       "Epoch 6/10\n",
117 |       "20000/20000 [==============================] - 3s - loss: 0.3455 - acc: 0.8653 - val_loss: 0.4040 - val_acc: 0.8382\n",
118 |       "Epoch 7/10\n",
119 |       "20000/20000 [==============================] - 3s - loss: 0.3078 - acc: 0.8634 - val_loss: 0.4059 - val_acc: 0.8240\n",
120 |       "Epoch 8/10\n",
121 |       "20000/20000 [==============================] - 3s - loss: 0.2812 - acc: 0.8535 - val_loss: 0.4147 - val_acc: 0.8098\n",
122 |       "Epoch 9/10\n",
123 |       "20000/20000 [==============================] - 3s - loss: 0.2554 - acc: 0.8334 - val_loss: 0.4296 - val_acc: 0.7878\n",
124 |       "Epoch 10/10\n",
125 |       "20000/20000 [==============================] - 3s - loss: 0.2356 - acc: 0.8052 - val_loss: 0.4296 - val_acc: 0.7600\n"
126 |      ]
127 |     }
128 |    ],
129 |    "source": [
130 |     "from keras.models import Sequential\n",
131 |     "from keras import layers\n",
132 |     "from keras.optimizers import RMSprop\n",
133 |     "\n",
134 |     "model = Sequential()\n",
135 |     "model.add(layers.Embedding(max_features, 128, input_length=max_len))\n",
136 |     "model.add(layers.Conv1D(32, 7, activation='relu'))\n",
137 |     "model.add(layers.MaxPooling1D(5))\n",
138 |     "model.add(layers.Conv1D(32, 7, activation='relu'))\n",
139 |     "model.add(layers.GlobalMaxPooling1D())  # global pooling (or a Flatten layer) turns the 3D output into a 2D one\n",
140 |     "model.add(layers.Dense(1))\n",
141 |     "\n",
142 |     "model.summary()\n",
143 |     "\n",
144 |     "model.compile(optimizer=RMSprop(lr=1e-4),\n",
145 |     "              loss='binary_crossentropy',\n",
146 |     "              metrics=['acc'])\n",
147 |     "history = model.fit(x_train, y_train,\n",
148 |     "                    epochs=10,\n",
149 |     "                    
batch_size=128,\n", 150 | " validation_split=0.2)" 151 | ] 152 | }, 153 | { 154 | "cell_type": "code", 155 | "execution_count": 2, 156 | "metadata": {}, 157 | "outputs": [ 158 | { 159 | "data": { 160 | "text/plain": [ 161 | "'验证精度略低于LSTM,但运行速度更快'" 162 | ] 163 | }, 164 | "execution_count": 2, 165 | "metadata": {}, 166 | "output_type": "execute_result" 167 | }, 168 | { 169 | "ename": "NameError", 170 | "evalue": "name 'history' is not defined", 171 | "output_type": "error", 172 | "traceback": [ 173 | "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", 174 | "\u001b[1;31mNameError\u001b[0m Traceback (most recent call last)", 175 | "\u001b[1;32m\u001b[0m in \u001b[0;36m\u001b[1;34m\u001b[0m\n\u001b[0;32m 1\u001b[0m \u001b[1;32mimport\u001b[0m \u001b[0mmatplotlib\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mpyplot\u001b[0m \u001b[1;32mas\u001b[0m \u001b[0mplt\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 2\u001b[0m \u001b[1;34m'''验证精度略低于LSTM,但运行速度更快'''\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 3\u001b[1;33m \u001b[0macc\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mhistory\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mhistory\u001b[0m\u001b[1;33m[\u001b[0m\u001b[1;34m'acc'\u001b[0m\u001b[1;33m]\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 4\u001b[0m \u001b[0mval_acc\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mhistory\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mhistory\u001b[0m\u001b[1;33m[\u001b[0m\u001b[1;34m'val_acc'\u001b[0m\u001b[1;33m]\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 5\u001b[0m \u001b[0mloss\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mhistory\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mhistory\u001b[0m\u001b[1;33m[\u001b[0m\u001b[1;34m'loss'\u001b[0m\u001b[1;33m]\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", 176 | "\u001b[1;31mNameError\u001b[0m: name 'history' is not defined" 177 | ] 178 | 
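The sequence lengths and parameter counts in the Conv1D `model.summary()` shown earlier can be reproduced by hand. A standalone plain-Python sketch (the helper name is invented here); a valid-padding 1D convolution shortens the sequence by `kernel - 1`, and MaxPooling1D divides the length by the pool size:

```python
# Output length of a 1D convolution with no padding: L_out = L_in - kernel + 1
def conv1d_len(length, kernel):
    return length - kernel + 1

l = 500               # padded review length
l = conv1d_len(l, 7)  # conv1d_1          -> 494
l = l // 5            # max_pooling1d_1   -> 98
l = conv1d_len(l, 7)  # conv1d_2          -> 92
print(l)

# Parameter counts from the summary (weights + one bias per filter):
embedding = 10000 * 128        # 1,280,000
conv1 = (7 * 128 + 1) * 32     # 28,704
conv2 = (7 * 32 + 1) * 32      # 7,200
dense = 32 + 1                 # 33
total = embedding + conv1 + conv2 + dense
print(total)                   # 1,315,937, matching "Total params"
```

GlobalMaxPooling1D then collapses the remaining 92 time steps to a single 32-dimensional vector, which is why the final Dense layer needs only 33 parameters.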
} 179 | ], 180 | "source": [ 181 | "import matplotlib.pyplot as plt\n", 182 | "'''验证精度略低于LSTM,但运行速度更快'''\n", 183 | "acc = history.history['acc']\n", 184 | "val_acc = history.history['val_acc']\n", 185 | "loss = history.history['loss']\n", 186 | "val_loss = history.history['val_loss']\n", 187 | "\n", 188 | "epochs = range(len(acc))\n", 189 | "\n", 190 | "plt.plot(epochs, acc, 'bo', label='Training acc')\n", 191 | "plt.plot(epochs, val_acc, 'b', label='Validation acc')\n", 192 | "plt.title('Training and validation accuracy')\n", 193 | "plt.legend()\n", 194 | "\n", 195 | "plt.figure()\n", 196 | "\n", 197 | "plt.plot(epochs, loss, 'bo', label='Training loss')\n", 198 | "plt.plot(epochs, val_loss, 'b', label='Validation loss')\n", 199 | "plt.title('Training and validation loss')\n", 200 | "plt.legend()\n", 201 | "\n", 202 | "plt.show()" 203 | ] 204 | }, 205 | { 206 | "cell_type": "markdown", 207 | "metadata": {}, 208 | "source": [ 209 | "## 结合CNN和RNN来处理长序列\n", 210 | "### 耶拿数据集:一维CNN" 211 | ] 212 | }, 213 | { 214 | "cell_type": "code", 215 | "execution_count": 2, 216 | "metadata": { 217 | "collapsed": true 218 | }, 219 | "outputs": [], 220 | "source": [ 221 | "import os\n", 222 | "import numpy as np\n", 223 | "\n", 224 | "data_dir = '/home/ubuntu/data/'\n", 225 | "fname = os.path.join(data_dir, 'jena_climate_2009_2016.csv')\n", 226 | "\n", 227 | "f = open(fname)\n", 228 | "data = f.read()\n", 229 | "f.close()\n", 230 | "\n", 231 | "lines = data.split('\\n')\n", 232 | "header = lines[0].split(',')\n", 233 | "lines = lines[1:]\n", 234 | "\n", 235 | "float_data = np.zeros((len(lines), len(header) - 1))\n", 236 | "for i, line in enumerate(lines):\n", 237 | " values = [float(x) for x in line.split(',')[1:]]\n", 238 | " float_data[i, :] = values\n", 239 | " \n", 240 | "mean = float_data[:200000].mean(axis=0)\n", 241 | "float_data -= mean\n", 242 | "std = float_data[:200000].std(axis=0)\n", 243 | "float_data /= std\n", 244 | "\n", 245 | "def generator(data, lookback, delay, 
min_index, max_index,\n", 246 | " shuffle=False, batch_size=128, step=6):\n", 247 | " if max_index is None:\n", 248 | " max_index = len(data) - delay - 1\n", 249 | " i = min_index + lookback\n", 250 | " while 1:\n", 251 | " if shuffle:\n", 252 | " rows = np.random.randint(\n", 253 | " min_index + lookback, max_index, size=batch_size)\n", 254 | " else:\n", 255 | " if i + batch_size >= max_index:\n", 256 | " i = min_index + lookback\n", 257 | " rows = np.arange(i, min(i + batch_size, max_index))\n", 258 | " i += len(rows)\n", 259 | "\n", 260 | " samples = np.zeros((len(rows),\n", 261 | " lookback // step,\n", 262 | " data.shape[-1]))\n", 263 | " targets = np.zeros((len(rows),))\n", 264 | " for j, row in enumerate(rows):\n", 265 | " indices = range(rows[j] - lookback, rows[j], step)\n", 266 | " samples[j] = data[indices]\n", 267 | " targets[j] = data[rows[j] + delay][1]\n", 268 | " yield samples, targets\n", 269 | " \n", 270 | "lookback = 1440\n", 271 | "step = 6\n", 272 | "delay = 144\n", 273 | "batch_size = 128\n", 274 | "\n", 275 | "train_gen = generator(float_data,\n", 276 | " lookback=lookback,\n", 277 | " delay=delay,\n", 278 | " min_index=0,\n", 279 | " max_index=200000,\n", 280 | " shuffle=True,\n", 281 | " step=step, \n", 282 | " batch_size=batch_size)\n", 283 | "val_gen = generator(float_data,\n", 284 | " lookback=lookback,\n", 285 | " delay=delay,\n", 286 | " min_index=200001,\n", 287 | " max_index=300000,\n", 288 | " step=step,\n", 289 | " batch_size=batch_size)\n", 290 | "test_gen = generator(float_data,\n", 291 | " lookback=lookback,\n", 292 | " delay=delay,\n", 293 | " min_index=300001,\n", 294 | " max_index=None,\n", 295 | " step=step,\n", 296 | " batch_size=batch_size)\n", 297 | "\n", 298 | "val_steps = (300000 - 200001 - lookback) // batch_size\n", 299 | "test_steps = (len(float_data) - 300001 - lookback) // batch_size" 300 | ] 301 | }, 302 | { 303 | "cell_type": "code", 304 | "execution_count": 3, 305 | "metadata": {}, 306 | "outputs": [ 307 | { 308 
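The `generator` above yields `(samples, targets)` batches of shape `(batch_size, lookback // step, features)` and `(batch_size,)`. A quick numpy-only check of those shapes, using the same generator logic (trimmed to the non-shuffle path) with random synthetic data standing in for the Jena CSV:

```python
import numpy as np

# Same generator as in the notebook, non-shuffle path only.
def generator(data, lookback, delay, min_index, max_index,
              batch_size=128, step=6):
    if max_index is None:
        max_index = len(data) - delay - 1
    i = min_index + lookback
    while True:
        if i + batch_size >= max_index:
            i = min_index + lookback
        rows = np.arange(i, min(i + batch_size, max_index))
        i += len(rows)
        samples = np.zeros((len(rows), lookback // step, data.shape[-1]))
        targets = np.zeros((len(rows),))
        for j, row in enumerate(rows):
            indices = range(rows[j] - lookback, rows[j], step)
            samples[j] = data[indices]                # lookback//step timesteps
            targets[j] = data[rows[j] + delay][1]     # column 1 = temperature
        yield samples, targets

data = np.random.randn(5000, 14)   # synthetic stand-in for float_data
gen = generator(data, lookback=1440, delay=144,
                min_index=0, max_index=4000, batch_size=128, step=6)
samples, targets = next(gen)
print(samples.shape, targets.shape)   # (128, 240, 14) (128,)
```

With `lookback=1440` and `step=6`, each sample covers the previous 10 days at one point per hour (240 timesteps), and each target is the temperature `delay=144` timesteps (24 hours) ahead.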
| "name": "stdout", 309 | "output_type": "stream", 310 | "text": [ 311 | "Epoch 1/20\n", 312 | "500/500 [==============================] - 124s - loss: 0.4189 - val_loss: 0.4521\n", 313 | "Epoch 2/20\n", 314 | "500/500 [==============================] - 11s - loss: 0.3629 - val_loss: 0.4545\n", 315 | "Epoch 3/20\n", 316 | "500/500 [==============================] - 11s - loss: 0.3399 - val_loss: 0.4527\n", 317 | "Epoch 4/20\n", 318 | "500/500 [==============================] - 11s - loss: 0.3229 - val_loss: 0.4721\n", 319 | "Epoch 5/20\n", 320 | "500/500 [==============================] - 11s - loss: 0.3122 - val_loss: 0.4712\n", 321 | "Epoch 6/20\n", 322 | "500/500 [==============================] - 11s - loss: 0.3030 - val_loss: 0.4705\n", 323 | "Epoch 7/20\n", 324 | "500/500 [==============================] - 11s - loss: 0.2935 - val_loss: 0.4870\n", 325 | "Epoch 8/20\n", 326 | "500/500 [==============================] - 11s - loss: 0.2862 - val_loss: 0.4676\n", 327 | "Epoch 9/20\n", 328 | "500/500 [==============================] - 11s - loss: 0.2817 - val_loss: 0.4738\n", 329 | "Epoch 10/20\n", 330 | "500/500 [==============================] - 11s - loss: 0.2775 - val_loss: 0.4896\n", 331 | "Epoch 11/20\n", 332 | "500/500 [==============================] - 11s - loss: 0.2715 - val_loss: 0.4765\n", 333 | "Epoch 12/20\n", 334 | "500/500 [==============================] - 11s - loss: 0.2683 - val_loss: 0.4724\n", 335 | "Epoch 13/20\n", 336 | "500/500 [==============================] - 11s - loss: 0.2644 - val_loss: 0.4842\n", 337 | "Epoch 14/20\n", 338 | "500/500 [==============================] - 11s - loss: 0.2606 - val_loss: 0.4910\n", 339 | "Epoch 15/20\n", 340 | "500/500 [==============================] - 11s - loss: 0.2558 - val_loss: 0.5000\n", 341 | "Epoch 16/20\n", 342 | "500/500 [==============================] - 11s - loss: 0.2539 - val_loss: 0.4960\n", 343 | "Epoch 17/20\n", 344 | "500/500 [==============================] - 11s - loss: 0.2516 - 
val_loss: 0.4875\n", 345 | "Epoch 18/20\n", 346 | "500/500 [==============================] - 11s - loss: 0.2501 - val_loss: 0.4884\n", 347 | "Epoch 19/20\n", 348 | "500/500 [==============================] - 11s - loss: 0.2444 - val_loss: 0.5024\n", 349 | "Epoch 20/20\n", 350 | "500/500 [==============================] - 11s - loss: 0.2444 - val_loss: 0.4821\n" 351 | ] 352 | } 353 | ], 354 | "source": [ 355 | "from keras.models import Sequential # 在耶拿数据集上评估一个简单的一维卷积网络\n", 356 | "from keras import layers\n", 357 | "from keras.optimizers import RMSprop\n", 358 | "\n", 359 | "model = Sequential()\n", 360 | "model.add(layers.Conv1D(32, 5, activation='relu',\n", 361 | " input_shape=(None, float_data.shape[-1])))\n", 362 | "model.add(layers.MaxPooling1D(3))\n", 363 | "model.add(layers.Conv1D(32, 5, activation='relu'))\n", 364 | "model.add(layers.MaxPooling1D(3))\n", 365 | "model.add(layers.Conv1D(32, 5, activation='relu'))\n", 366 | "model.add(layers.GlobalMaxPooling1D())\n", 367 | "model.add(layers.Dense(1))\n", 368 | "\n", 369 | "model.compile(optimizer=RMSprop(), loss='mae')\n", 370 | "history = model.fit_generator(train_gen,\n", 371 | " steps_per_epoch=500,\n", 372 | " epochs=20,\n", 373 | " validation_data=val_gen,\n", 374 | " validation_steps=val_steps)" 375 | ] 376 | }, 377 | { 378 | "cell_type": "code", 379 | "execution_count": 5, 380 | "metadata": {}, 381 | "outputs": [ 382 | { 383 | "data": { 384 | "image/png": 
"...(base64 PNG data trimmed; rendered output: training and validation loss plot for the 1D convnet)...
PREeA/4zxTvvy7Ra38YxVG8D2NjNGBitI8XADkpjrEpIYk3j5mXtn1IOPisBfYS2pWv\nIpwnehP4BHgDOCoqmwM8HrPuldF3cRlwRQrjW0ZoDy/+Hhb3aDsO+Ft534UUxfdM9N2aT0jkx5aM\nL5o+5P89FfFF858s/s7FlE35/kvmQ8MwiIhkkNrYvCMiIlWkpC8ikkGU9EVEMoiSvohIBlHSFxHJ\nIEr6IiIZRElfRCSD/H9HJBKfh9lz3gAAAABJRU5ErkJggg==\n", 385 | "text/plain": [ 386 | "" 387 | ] 388 | }, 389 | "metadata": {}, 390 | "output_type": "display_data" 391 | } 392 | ], 393 | "source": [ 394 | "import matplotlib.pyplot as plt\n", 395 | "'''MAE停留在0.4-0.5,甚至无法击败基于常识的方法'''\n", 396 | "loss = history.history['loss']\n", 397 | "val_loss = history.history['val_loss']\n", 398 | "\n", 399 | "epochs = range(len(loss))\n", 400 | "\n", 401 | "plt.figure()\n", 402 | "\n", 403 | "plt.plot(epochs, loss, 'bo', label='Training loss')\n", 404 | "plt.plot(epochs, val_loss, 'b', label='Validation loss')\n", 405 | "plt.title('Training and validation loss')\n", 406 | "plt.legend()\n", 407 | "\n", 408 | "plt.show()" 409 | ] 410 | }, 411 | { 412 | "cell_type": "markdown", 413 | "metadata": {}, 414 | "source": [ 415 | "### 耶拿数据集:一维CNN+RNN" 416 | ] 417 | }, 418 | { 419 | "cell_type": "code", 420 | "execution_count": 11, 421 | "metadata": { 422 | "collapsed": true 423 | }, 424 | "outputs": [], 425 | "source": [ 426 | " # 耶拿数据集:准备更高分辨率的数据生成器\n", 427 | "step = 3 # 之前为6(每小时一个数据点);现在为3(每30分钟一个数据点)\n", 428 | "lookback = 720 # 不变\n", 429 | "delay = 144 # 不变\n", 430 | "\n", 431 | "train_gen = generator(float_data,\n", 432 | " lookback=lookback,\n", 433 | " delay=delay,\n", 434 | " min_index=0,\n", 435 | " max_index=200000,\n", 436 | " shuffle=True,\n", 437 | " step=step)\n", 438 | "val_gen = generator(float_data,\n", 439 | " lookback=lookback,\n", 440 | " delay=delay,\n", 441 | " min_index=200001,\n", 442 | " max_index=300000,\n", 443 | " step=step)\n", 444 | "test_gen = generator(float_data,\n", 445 | " lookback=lookback,\n", 446 | " delay=delay,\n", 447 | " min_index=300001,\n", 448 | " max_index=None,\n", 449 | " step=step)\n", 450 | "val_steps = (300000 - 200001 - 
lookback) // 128\n", 451 | "test_steps = (len(float_data) - 300001 - lookback) // 128" 452 | ] 453 | }, 454 | { 455 | "cell_type": "code", 456 | "execution_count": 12, 457 | "metadata": {}, 458 | "outputs": [ 459 | { 460 | "name": "stdout", 461 | "output_type": "stream", 462 | "text": [ 463 | "_________________________________________________________________\n", 464 | "Layer (type) Output Shape Param # \n", 465 | "=================================================================\n", 466 | "conv1d_6 (Conv1D) (None, None, 32) 2272 \n", 467 | "_________________________________________________________________\n", 468 | "max_pooling1d_4 (MaxPooling1 (None, None, 32) 0 \n", 469 | "_________________________________________________________________\n", 470 | "conv1d_7 (Conv1D) (None, None, 32) 5152 \n", 471 | "_________________________________________________________________\n", 472 | "gru_1 (GRU) (None, 32) 6240 \n", 473 | "_________________________________________________________________\n", 474 | "dense_3 (Dense) (None, 1) 33 \n", 475 | "=================================================================\n", 476 | "Total params: 13,697\n", 477 | "Trainable params: 13,697\n", 478 | "Non-trainable params: 0\n", 479 | "_________________________________________________________________\n", 480 | "Epoch 1/20\n", 481 | "500/500 [==============================] - 60s - loss: 0.3387 - val_loss: 0.3030\n", 482 | "Epoch 2/20\n", 483 | "500/500 [==============================] - 58s - loss: 0.3055 - val_loss: 0.2864\n", 484 | "Epoch 3/20\n", 485 | "500/500 [==============================] - 58s - loss: 0.2904 - val_loss: 0.2841\n", 486 | "Epoch 4/20\n", 487 | "500/500 [==============================] - 58s - loss: 0.2830 - val_loss: 0.2730\n", 488 | "Epoch 5/20\n", 489 | "500/500 [==============================] - 58s - loss: 0.2767 - val_loss: 0.2757\n", 490 | "Epoch 6/20\n", 491 | "500/500 [==============================] - 58s - loss: 0.2696 - val_loss: 0.2819\n", 492 | "Epoch 
7/20\n", 493 | "500/500 [==============================] - 57s - loss: 0.2642 - val_loss: 0.2787\n", 494 | "Epoch 8/20\n", 495 | "500/500 [==============================] - 57s - loss: 0.2595 - val_loss: 0.2920\n", 496 | "Epoch 9/20\n", 497 | "500/500 [==============================] - 58s - loss: 0.2546 - val_loss: 0.2919\n", 498 | "Epoch 10/20\n", 499 | "500/500 [==============================] - 57s - loss: 0.2506 - val_loss: 0.2772\n", 500 | "Epoch 11/20\n", 501 | "500/500 [==============================] - 58s - loss: 0.2459 - val_loss: 0.2801\n", 502 | "Epoch 12/20\n", 503 | "500/500 [==============================] - 57s - loss: 0.2433 - val_loss: 0.2807\n", 504 | "Epoch 13/20\n", 505 | "500/500 [==============================] - 58s - loss: 0.2411 - val_loss: 0.2855\n", 506 | "Epoch 14/20\n", 507 | "500/500 [==============================] - 58s - loss: 0.2360 - val_loss: 0.2858\n", 508 | "Epoch 15/20\n", 509 | "500/500 [==============================] - 57s - loss: 0.2350 - val_loss: 0.2834\n", 510 | "Epoch 16/20\n", 511 | "500/500 [==============================] - 58s - loss: 0.2315 - val_loss: 0.2917\n", 512 | "Epoch 17/20\n", 513 | "500/500 [==============================] - 60s - loss: 0.2285 - val_loss: 0.2944\n", 514 | "Epoch 18/20\n", 515 | "500/500 [==============================] - 57s - loss: 0.2280 - val_loss: 0.2923\n", 516 | "Epoch 19/20\n", 517 | "500/500 [==============================] - 57s - loss: 0.2249 - val_loss: 0.2910\n", 518 | "Epoch 20/20\n", 519 | "500/500 [==============================] - 57s - loss: 0.2215 - val_loss: 0.2952\n" 520 | ] 521 | } 522 | ], 523 | "source": [ 524 | "model = Sequential()\n", 525 | "model.add(layers.Conv1D(32, 5, activation='relu',\n", 526 | " input_shape=(None, float_data.shape[-1])))\n", 527 | "model.add(layers.MaxPooling1D(3))\n", 528 | "model.add(layers.Conv1D(32, 5, activation='relu'))\n", 529 | "model.add(layers.GRU(32, dropout=0.1, recurrent_dropout=0.5))\n", 530 | 
"model.add(layers.Dense(1))\n", 531 | "\n", 532 | "model.summary()\n", 533 | "\n", 534 | "model.compile(optimizer=RMSprop(), loss='mae')\n", 535 | "history = model.fit_generator(train_gen,\n", 536 | " steps_per_epoch=500,\n", 537 | " epochs=20,\n", 538 | " validation_data=val_gen,\n", 539 | " validation_steps=val_steps)" 540 | ] 541 | }, 542 | { 543 | "cell_type": "code", 544 | "execution_count": 13, 545 | "metadata": {}, 546 | "outputs": [ 547 | { 548 | "data": { 549 | "text/plain": [ 550 | "" 551 | ] 552 | }, 553 | "metadata": {}, 554 | "output_type": "display_data" 555 | }, 556 | { 557 | "data": { 558 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAX0AAAEICAYAAACzliQjAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3XmYFNW5x/Hvyy6LgIhxQRhwBWQfQYMEUKO4gRiugkPc\ngxKJGq4mRNQYlBvFDTFc16txQdFoVFSUGCUxJnEBRBSQgAg6grIoII6Kw7z3j1MzNMMsPdM93dPT\nv8/z9DNd1aeq3q7ueav61KlzzN0REZHsUC/dAYiISOoo6YuIZBElfRGRLKKkLyKSRZT0RUSyiJK+\niEgWUdKXKjGz+ma21czaJ7NsOpnZgWaW9LbLZnasma2KmV5mZgPiKVuNbd1nZldWd/kK1nu9mf0x\n2euV9GmQ7gCkZpnZ1pjJpsB3wPZo+kJ3n1GV9bn7dqB5sstmA3c/JBnrMbMLgNHuPihm3RckY91S\n9ynp13HuXpJ0ozPJC9z9r+WVN7MG7l6YithEJPVUvZPlop/vj5vZY2b2FTDazI40szfMbJOZrTWz\naWbWMCrfwMzczHKi6Uei1180s6/M7N9m1rGqZaPXTzCz/5jZZjO7w8z+aWbnlBN3PDFeaGYrzOxL\nM5sWs2x9M7vNzDaa2UpgSAX7Z6KZzSw1b7qZ3Ro9v8DMlkbv58PoLLy8deWb2aDoeVMzeziKbTHQ\np1TZq8xsZbTexWY2NJrfDfgDMCCqOtsQs2+vjVn+oui9bzSzZ8xsn3j2TWXMbHgUzyYze9XMDol5\n7UozW2NmW8zsg5j3eoSZLYjmf25mN8W7PakB7q5HljyAVcCxpeZdD2wDTiGcBOwGHA70I/wS7AT8\nBxgXlW8AOJATTT8CbABygYbA48Aj1Si7F/AVMCx6bTzwPXBOOe8lnhifBVoCOcAXxe8dGAcsBtoB\nbYDXwr9CmdvpBGwFmsWsex2QG02fEpUx4GjgG6B79NqxwKqYdeUDg6LnNwN/A1oDHYAlpcqeDuwT\nfSZnRjH8IHrtAuBvpeJ8BLg2en5cFGNPoAnwv8Cr8eybMt7/9cAfo+edoziOjj6jK4Fl0fOuwGpg\n76hsR6BT9PxtYFT0vAXQL93/C9n80Jm+ALzu7s+5e5G7f+Pub7v7m+5e6O4rgXuAgRUs/6S7z3P3\n74EZhGRT1bInAwvd/dnotdsIB4gyxRnj7919s7uvIiTY4m2dDtzm7vnuvhG4oYLtrATeJxyMAH4M\nfOnu86LXn3P3lR68CrwClHmxtpTTgevd/Ut3X004e4/d7hPuvjb6TB4lHLBz41gvQB5wn7svdPdv\ngQnAQDNrF1OmvH1TkZHALHd/NfqMbiAcOPoBhYQDTNeoiv
(此处省略 base64 编码的 PNG 图像数据:即下方代码绘制的 Training and validation loss 曲线图)\n", 559 | "text/plain": [ 560 | "" 561 | ] 562 | 
}, 563 | "metadata": {}, 564 | "output_type": "display_data" 565 | } 566 | ], 567 | "source": [ 568 | "'''从验证损失上看,这种不如只用正则化的GRU,但是速度快很多'''\n", 569 | "loss = history.history['loss']\n", 570 | "val_loss = history.history['val_loss']\n", 571 | "\n", 572 | "epochs = range(len(loss))\n", 573 | "\n", 574 | "plt.figure()\n", 575 | "\n", 576 | "plt.plot(epochs, loss, 'bo', label='Training loss')\n", 577 | "plt.plot(epochs, val_loss, 'b', label='Validation loss')\n", 578 | "plt.title('Training and validation loss')\n", 579 | "plt.legend()\n", 580 | "\n", 581 | "plt.show()" 582 | ] 583 | }, 584 | { 585 | "cell_type": "markdown", 586 | "metadata": {}, 587 | "source": [ 588 | "## 小结\n", 589 | "* 一维卷积处理时间模式表现也很好。特别是自然语言处理,可以代替RNN且速度更快\n", 590 | "* Conv1D层和MaxPooling1D层堆叠,最后是一个全局池化层,或者展平操作\n", 591 | "* 一维CNN+RNN,可以使RNN处理时,序列变得更短。\n", 592 | "---\n", 593 | "* 什么是词嵌入,如何使用词嵌入\n", 594 | "* 什么是循环网络,如何使用循环网络\n", 595 | "* 堆叠RNN和使用双向RNN\n", 596 | "* 使用一维CNN处理序列\n", 597 | "* 使用一维CNN+RNN处理长序列\n", 598 | "* 如果序列数据的**整体顺序很重要,使用RNN来处理**\n", 599 | "* 如果序列数据的**整体顺序不重要,使用一维CNN可以实现同样的效果,而且计算代价更小**" 600 | ] 601 | } 602 | ], 603 | "metadata": { 604 | "kernelspec": { 605 | "display_name": "Python [conda env:learn36]", 606 | "language": "python", 607 | "name": "conda-env-learn36-py" 608 | }, 609 | "language_info": { 610 | "codemirror_mode": { 611 | "name": "ipython", 612 | "version": 3 613 | }, 614 | "file_extension": ".py", 615 | "mimetype": "text/x-python", 616 | "name": "python", 617 | "nbconvert_exporter": "python", 618 | "pygments_lexer": "ipython3", 619 | "version": "3.6.8" 620 | }, 621 | "toc": { 622 | "base_numbering": 1, 623 | "nav_menu": {}, 624 | "number_sections": true, 625 | "sideBar": true, 626 | "skip_h1_title": false, 627 | "title_cell": "Table of Contents", 628 | "title_sidebar": "Contents", 629 | "toc_cell": false, 630 | "toc_position": {}, 631 | "toc_section_display": true, 632 | "toc_window_display": true 633 | } 634 | }, 635 | "nbformat": 4, 636 | "nbformat_minor": 2 637 | } 638 | 
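上面小结提到,Conv1D+MaxPooling1D 可以在把序列交给 RNN 之前先把它变短。下面用纯 numpy 给一个示意实现(窗口大小、通道数等参数均为随意假设,并非 Keras 的实际实现):

```python
import numpy as np

def conv1d(seq, kernels, window):
    """对形状为 (steps, channels) 的序列做一维卷积(valid,无填充),
    kernels 形状为 (window*channels, filters)。"""
    steps = seq.shape[0] - window + 1
    out = np.empty((steps, kernels.shape[1]))
    for t in range(steps):
        patch = seq[t:t + window].reshape(-1)    # 取出长度为 window 的局部片段
        out[t] = np.maximum(patch @ kernels, 0)  # 线性变换 + relu
    return out

def max_pool1d(seq, size):
    """步幅等于窗口的一维最大池化,序列长度缩小为原来的 1/size。"""
    steps = seq.shape[0] // size
    return np.stack([seq[i * size:(i + 1) * size].max(axis=0) for i in range(steps)])

rng = np.random.default_rng(0)
x = rng.normal(size=(500, 8))                    # 500 步、8 个通道的输入序列
k = rng.normal(size=(5 * 8, 32))                 # 窗口为 5、输出 32 个通道
y = max_pool1d(conv1d(x, k, window=5), size=4)   # 卷积后 496 步,池化后 124 步
print(y.shape)                                   # (124, 32)
```

可以看到 500 步的序列被压缩到 124 步,这正是"一维 CNN + RNN 处理长序列"的出发点。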
-------------------------------------------------------------------------------- /7-高级深度学习最佳实践.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# keras函数式API\n", 8 | "* 多输入模型\n", 9 | "* 多输出模型:适用于不同的输出具有统计相关性\n", 10 | "* 类图模型\n", 11 | " * **Inception系列网络**:依赖于Inception模块,输入被多个并行的卷积分支处理\n", 12 | " * **残差连接**:最早出现于ResNet网络,将前面的表示重新注入下游数据流中\n", 13 | "\n", 14 | "\n", 15 | "![](https://dpzbhybb2pdcj.cloudfront.net/chollet/Figures/07fig02.jpg)\n", 16 | "![](https://dpzbhybb2pdcj.cloudfront.net/chollet/HighResolutionFigures/figure_7-3.png)\n", 17 | "![](https://dpzbhybb2pdcj.cloudfront.net/chollet/Figures/07fig04.jpg)\n", 18 | "![](https://dpzbhybb2pdcj.cloudfront.net/chollet/HighResolutionFigures/figure_7-5.png)" 19 | ] 20 | }, 21 | { 22 | "cell_type": "markdown", 23 | "metadata": {}, 24 | "source": [ 25 | "## 函数式API简介" 26 | ] 27 | }, 28 | { 29 | "cell_type": "code", 30 | "execution_count": null, 31 | "metadata": {}, 32 | "outputs": [], 33 | "source": [ 34 | "from keras import Input, layers\n", 35 | "input_tensor = Input(shape=(32,))\n", 36 | "dense = layers.Dense(32, activation='relu') # 每个层是一个函数\n", 37 | "output_tensor = dense(input_tensor) # 张量 = 层函数(张量)" 38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "execution_count": 2, 43 | "metadata": { 44 | "collapsed": true 45 | }, 46 | "outputs": [ 47 | { 48 | "name": "stdout", 49 | "output_type": "stream", 50 | "text": [ 51 | "_________________________________________________________________\n", 52 | "Layer (type) Output Shape Param # \n", 53 | "=================================================================\n", 54 | "dense_7 (Dense) (None, 32) 2080 \n", 55 | "_________________________________________________________________\n", 56 | "dense_8 (Dense) (None, 32) 1056 \n", 57 | "_________________________________________________________________\n", 58 | "dense_9 (Dense) (None, 10) 330 
\n", 59 | "=================================================================\n", 60 | "Total params: 3,466\n", 61 | "Trainable params: 3,466\n", 62 | "Non-trainable params: 0\n", 63 | "_________________________________________________________________\n", 64 | "_________________________________________________________________\n", 65 | "Layer (type) Output Shape Param # \n", 66 | "=================================================================\n", 67 | "input_2 (InputLayer) (None, 64) 0 \n", 68 | "_________________________________________________________________\n", 69 | "dense_10 (Dense) (None, 32) 2080 \n", 70 | "_________________________________________________________________\n", 71 | "dense_11 (Dense) (None, 32) 1056 \n", 72 | "_________________________________________________________________\n", 73 | "dense_12 (Dense) (None, 10) 330 \n", 74 | "=================================================================\n", 75 | "Total params: 3,466\n", 76 | "Trainable params: 3,466\n", 77 | "Non-trainable params: 0\n", 78 | "_________________________________________________________________\n", 79 | "WARNING:tensorflow:From C:\\Users\\pengfeizhang\\Anaconda3\\envs\\learn36\\lib\\site-packages\\tensorflow\\python\\ops\\math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n", 80 | "Instructions for updating:\n", 81 | "Use tf.cast instead.\n", 82 | "Epoch 1/10\n", 83 | "1000/1000 [==============================] - 3s 3ms/step - loss: 11.6495\n", 84 | "Epoch 2/10\n", 85 | "1000/1000 [==============================] - 0s 41us/step - loss: 11.5873\n", 86 | "Epoch 3/10\n", 87 | "1000/1000 [==============================] - 0s 41us/step - loss: 11.5753\n", 88 | "Epoch 4/10\n", 89 | "1000/1000 [==============================] - 0s 36us/step - loss: 11.5698\n", 90 | "Epoch 5/10\n", 91 | "1000/1000 [==============================] - 0s 36us/step - loss: 11.5653\n", 92 | "Epoch 6/10\n", 93 | "1000/1000 
[==============================] - 0s 39us/step - loss: 11.5622\n", 94 | "Epoch 7/10\n", 95 | "1000/1000 [==============================] - 0s 39us/step - loss: 11.5595\n", 96 | "Epoch 8/10\n", 97 | "1000/1000 [==============================] - 0s 37us/step - loss: 11.5575\n", 98 | "Epoch 9/10\n", 99 | "1000/1000 [==============================] - 0s 37us/step - loss: 11.5555\n", 100 | "Epoch 10/10\n", 101 | "1000/1000 [==============================] - 0s 36us/step - loss: 11.5535\n" 102 | ] 103 | }, 104 | { 105 | "data": { 106 | "text/plain": [ 107 | "" 108 | ] 109 | }, 110 | "execution_count": 2, 111 | "metadata": {}, 112 | "output_type": "execute_result" 113 | }, 114 | { 115 | "name": "stdout", 116 | "output_type": "stream", 117 | "text": [ 118 | "1000/1000 [==============================] - 0s 88us/step\n" 119 | ] 120 | }, 121 | { 122 | "data": { 123 | "text/plain": [ 124 | "11.550111495971679" 125 | ] 126 | }, 127 | "execution_count": 2, 128 | "metadata": {}, 129 | "output_type": "execute_result" 130 | } 131 | ], 132 | "source": [ 133 | "from keras.models import Sequential, Model\n", 134 | "from keras import layers\n", 135 | "from keras import Input\n", 136 | "\n", 137 | "seq_model = Sequential() # Sequential模型\n", 138 | "seq_model.add(layers.Dense(32, activation='relu', input_shape=(64,)))\n", 139 | "seq_model.add(layers.Dense(32, activation='relu'))\n", 140 | "seq_model.add(layers.Dense(10, activation='softmax'))\n", 141 | "seq_model.summary()\n", 142 | "\n", 143 | "input_tensor = Input(shape=(64,)) # 函数式API模型\n", 144 | "x = layers.Dense(32, activation='relu')(input_tensor)\n", 145 | "x = layers.Dense(32, activation='relu')(x)\n", 146 | "output_tensor = layers.Dense(10, activation='softmax')(x)\n", 147 | "api_model = Model(input_tensor, output_tensor)\n", 148 | "api_model.summary()" 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "execution_count": null, 154 | "metadata": {}, 155 | "outputs": [], 156 | "source": [ 157 | 
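上面两个 summary 的参数量可以手工复算:Dense 层的参数量 = 输入维度 × 输出维度 + 偏置数(偏置数等于输出维度)。下面几行纯 Python 验证 2080、1056、330 以及总数 3,466:

```python
def dense_params(n_in, n_out):
    """全连接层参数量:权重矩阵 n_in*n_out 个,再加 n_out 个偏置。"""
    return n_in * n_out + n_out

# 与 summary 对应的三个 Dense 层:64 -> 32 -> 32 -> 10
dims = [(64, 32), (32, 32), (32, 10)]
counts = [dense_params(i, o) for i, o in dims]
print(counts, sum(counts))   # [2080, 1056, 330] 3466
```

两种写法(Sequential 与函数式 API)搭出的是同一个网络,所以参数量完全相同。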
"api_model.compile(optimizer='rmsprop', loss='categorical_crossentropy') # 编译,相同\n", 158 | "\n", 159 | "import numpy as np\n", 160 | "x_train = np.random.random((1000, 64))\n", 161 | "y_train = np.random.random((1000, 10))\n", 162 | "\n", 163 | "api_model.fit(x_train, y_train, epochs=10, batch_size=128) # 训练相同\n", 164 | "api_model.evaluate(x_train, y_train) # 评估相同" 165 | ] 166 | }, 167 | { 168 | "cell_type": "markdown", 169 | "metadata": {}, 170 | "source": [ 171 | "如果使用不相关的输入、输出构建模型,会得到`RuntimeError` \n", 172 | "这个报错告诉我们,Keras无法从给定输出张量到达`input_1`\n", 173 | "~~~python\n", 174 | ">>> unrelated_input = Input(shape=(32,))\n", 175 | ">>> bad_model = model = Model(unrelated_input, output_tensor)\n", 176 | "RuntimeError: Graph disconnected: cannot\n", 177 | "obtain value for tensor\n", 178 | "Tensor(\"input_1:0\", shape=(?, 64), dtype=float32) at layer \"input_1\".\n", 179 | "~~~" 180 | ] 181 | }, 182 | { 183 | "cell_type": "markdown", 184 | "metadata": {}, 185 | "source": [ 186 | "## 多输入模型\n", 187 | "* 张量的组合方式:相加`keras.layers.add`,连接`keras.layers.concatenate`等\n", 188 | "* 典型的问答模型有两个输入:一个是自然语言描述的问题;另一个是文本片段(提供用于回答问题的信息)。然后模型生成一个回答:最简单的情况下,回答只包含一个单词,通过对预定义的词表做softmax得到。\n", 189 | "![](https://dpzbhybb2pdcj.cloudfront.net/chollet/HighResolutionFigures/figure_7-6.png)" 190 | ] 191 | }, 192 | { 193 | "cell_type": "code", 194 | "execution_count": 4, 195 | "metadata": { 196 | "collapsed": true 197 | }, 198 | "outputs": [ 199 | { 200 | "name": "stdout", 201 | "output_type": "stream", 202 | "text": [ 203 | "__________________________________________________________________________________________________\n", 204 | "Layer (type)                 Output Shape        Param #     Connected to                     \n", 205 | "==================================================================================================\n", 206 | "text (InputLayer)            (None, None)        0                                            \n", 207 | "__________________________________________________________________________________________________\n", 208 | "question (InputLayer)        (None, None)        0                                            \n", 209 | 
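多输入模型的关键一步是 `keras.layers.concatenate`:把两路编码向量沿最后一个轴拼接,再对词表做 softmax。下面用 numpy 示意它对形状的影响(批大小 128、编码维度 32/16、词表 500 取自下文的问答模型,数值为随机示例,仅演示形状,并非 Keras 实现):

```python
import numpy as np

rng = np.random.default_rng(0)
encoded_text = rng.normal(size=(128, 32))      # 相当于 LSTM(32) 对文本的编码
encoded_question = rng.normal(size=(128, 16))  # 相当于 LSTM(16) 对问题的编码

# keras.layers.concatenate(..., axis=-1) 在形状上等价于:
concatenated = np.concatenate([encoded_text, encoded_question], axis=-1)
print(concatenated.shape)                      # (128, 48):32 + 16

# 再经过一个输出维度为词表大小的 softmax 层
W = rng.normal(size=(48, 500)) * 0.01
logits = concatenated @ W
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
assert np.allclose(probs.sum(axis=-1), 1.0)    # 每个样本在 500 个词上的概率和为 1
```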
"__________________________________________________________________________________________________\n", 210 | "embedding_3 (Embedding) (None, None, 10000) 640000 text[0][0] \n", 211 | "__________________________________________________________________________________________________\n", 212 | "embedding_4 (Embedding) (None, None, 10000) 320000 question[0][0] \n", 213 | "__________________________________________________________________________________________________\n", 214 | "lstm_3 (LSTM) (None, 32) 1284224 embedding_3[0][0] \n", 215 | "__________________________________________________________________________________________________\n", 216 | "lstm_4 (LSTM) (None, 16) 641088 embedding_4[0][0] \n", 217 | "__________________________________________________________________________________________________\n", 218 | "concatenate_2 (Concatenate) (None, 48) 0 lstm_3[0][0] \n", 219 | " lstm_4[0][0] \n", 220 | "__________________________________________________________________________________________________\n", 221 | "dense_14 (Dense) (None, 500) 24500 concatenate_2[0][0] \n", 222 | "==================================================================================================\n", 223 | "Total params: 2,909,812\n", 224 | "Trainable params: 2,909,812\n", 225 | "Non-trainable params: 0\n", 226 | "__________________________________________________________________________________________________\n" 227 | ] 228 | } 229 | ], 230 | "source": [ 231 | "from keras.models import Model # 一个双输入的问答模型\n", 232 | "from keras import layers\n", 233 | "from keras import Input\n", 234 | "\n", 235 | "text_vocabulary_size = 10000\n", 236 | "question_vocabulary_size = 10000\n", 237 | "answer_vocabulary_size = 500\n", 238 | "\n", 239 | "text_input = Input(shape=(None,), dtype='int32', name='text')\n", 240 | "\n", 241 | "embedded_text = layers.Embedding(64, text_vocabulary_size)(text_input)\n", 242 | "encoded_text = layers.LSTM(32)(embedded_text)\n", 243 | "\n", 244 | "question_input = 
Input(shape=(None,),dtype='int32', name='question')\n", 245 | "embedded_question = layers.Embedding(32, question_vocabulary_size)(question_input)\n", 246 | "encoded_question = layers.LSTM(16)(embedded_question)\n", 247 | "\n", 248 | "concatenated = layers.concatenate([encoded_text, encoded_question],axis=-1)\n", 249 | "answer = layers.Dense(answer_vocabulary_size,activation='softmax')(concatenated)\n", 250 | "\n", 251 | "model = Model([text_input, question_input], answer) # 指定两个输入和一个输出\n", 252 | "model.compile(optimizer='rmsprop',\n", 253 | "              loss='categorical_crossentropy',\n", 254 | "              metrics=['acc'])\n", 255 | "model.summary()" 256 | ] 257 | }, 258 | { 259 | "cell_type": "code", 260 | "execution_count": 5, 261 | "metadata": { 262 | "collapsed": true 263 | }, 264 | "outputs": [ 265 | { 266 | "name": "stdout", 267 | "output_type": "stream", 268 | "text": [ 269 | "Epoch 1/10\n", 270 | "1000/1000 [==============================] - 9s 9ms/step - loss: 0.0000e+00 - acc: 0.5920\n", 271 | "Epoch 2/10\n", 272 | "1000/1000 [==============================] - 6s 6ms/step - loss: 0.0000e+00 - acc: 0.5920\n", 273 | "Epoch 3/10\n", 274 | "1000/1000 [==============================] - 6s 6ms/step - loss: 0.0000e+00 - acc: 0.5920\n", 275 | "Epoch 4/10\n", 276 | "1000/1000 [==============================] - 6s 6ms/step - loss: 0.0000e+00 - acc: 0.5920\n", 277 | "Epoch 5/10\n", 278 | "1000/1000 [==============================] - 6s 6ms/step - loss: 0.0000e+00 - acc: 0.5920\n", 279 | "Epoch 6/10\n", 280 | "1000/1000 [==============================] - 6s 6ms/step - loss: 0.0000e+00 - acc: 0.5920\n", 281 | "Epoch 7/10\n", 282 | "1000/1000 [==============================] - 6s 6ms/step - loss: 0.0000e+00 - acc: 0.5920\n", 283 | "Epoch 8/10\n", 284 | "1000/1000 [==============================] - 6s 6ms/step - loss: 0.0000e+00 - acc: 0.5920\n", 285 | "Epoch 9/10\n", 286 | "1000/1000 [==============================] - 6s 6ms/step - loss: 0.0000e+00 - acc: 0.5920\n", 287 | "Epoch 10/10\n", 
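注意:Keras 的签名是 `Embedding(input_dim, output_dim)`,第一个参数是词表大小,第二个是向量维度。上面代码写成 `Embedding(64, text_vocabulary_size)` 把两者颠倒了(这也解释了 summary 中 `(None, None, 10000)` 的输出形状),正确写法应为 `Embedding(text_vocabulary_size, 64)`。由于参数量等于两者之积,两种写法的参数量同为 640 000,所以从 summary 里不易察觉。Embedding 本质上是查表,用 numpy 示意:

```python
import numpy as np

vocab_size, embed_dim = 10000, 64
# Embedding 的权重就是一个 (词表大小, 向量维度) 的查找表
table = np.random.default_rng(0).normal(size=(vocab_size, embed_dim))

token_ids = np.array([[3, 917, 42, 9999]])  # 一个批次、长度为 4 的整数序列
embedded = table[token_ids]                 # 查表:每个整数换成一个 64 维向量
print(embedded.shape)                       # (1, 4, 64)

# 参数量 = input_dim * output_dim,对颠倒的两种写法恰好相同
assert vocab_size * embed_dim == 64 * vocab_size == 640000
```

若词表只有 64(颠倒后的 input_dim),而输入里出现 0~9999 的整数,查表就会越界,这正是参数顺序写反的隐患。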
288 | "1000/1000 [==============================] - 6s 6ms/step - loss: 0.0000e+00 - acc: 0.5920\n" 289 | ] 290 | }, 291 | { 292 | "data": { 293 | "text/plain": [ 294 | "" 295 | ] 296 | }, 297 | "execution_count": 5, 298 | "metadata": {}, 299 | "output_type": "execute_result" 300 | } 301 | ], 302 | "source": [ 303 | "import numpy as np\n", 304 | "\n", 305 | "num_samples = 1000\n", 306 | "max_length = 100\n", 307 | "\n", 308 | "text = np.random.randint(1, text_vocabulary_size,size=(num_samples, max_length))\n", 309 | "question = np.random.randint(1, question_vocabulary_size, size=(num_samples, max_length))\n", 310 | "answers = np.random.randint(0, 1,size=(num_samples, answer_vocabulary_size)) # randint(0, 1)恒为0:目标全零,所以上面输出中loss恒为0.0000e+00\n", 311 | "\n", 312 | "model.fit([text, question], answers, \n", 313 | "          epochs=10, batch_size=128) # 使用由输入组成的列表来拟合\n", 314 | "model.fit({'text': text, 'question': question}, answers,\n", 315 | "          epochs=10, batch_size=128) # 对输入命名之后才能这样做。\n" 316 | ] 317 | }, 318 | { 319 | "cell_type": "markdown", 320 | "metadata": {}, 321 | "source": [ 322 | "## 多输出模型\n", 323 | "* 为不同的输出指定不同的损失函数,使用不同的损失加权\n", 324 | "* 例子中预测:年龄(标量回归)、收入(多分类)、性别(二分类)\n", 325 | "![](https://dpzbhybb2pdcj.cloudfront.net/chollet/Figures/07fig07.jpg)" 326 | ] 327 | }, 328 | { 329 | "cell_type": "code", 330 | "execution_count": null, 331 | "metadata": {}, 332 | "outputs": [], 333 | "source": [ 334 | "from keras import layers\n", 335 | "from keras import Input\n", 336 | "from keras.models import Model\n", 337 | "vocabulary_size = 50000\n", 338 | "num_income_groups = 10\n", 339 | "\n", 340 | "posts_input = Input(shape=(None,), dtype='int32', name='posts')\n", 341 | "embedded_posts = layers.Embedding(vocabulary_size, 256)(posts_input) # Embedding(词表大小, 向量维度)\n", 342 | "x = layers.Conv1D(128, 5, activation='relu')(embedded_posts)\n", 343 | "x = layers.MaxPooling1D(5)(x)\n", 344 | "x = layers.Conv1D(256, 5, activation='relu')(x)\n", 345 | "x = layers.Conv1D(256, 5, activation='relu')(x)\n", 346 | "x = layers.MaxPooling1D(5)(x)\n", 347 | "x = 
layers.Conv1D(256, 5, activation='relu')(x)\n", 348 | "x = layers.Conv1D(256, 5, activation='relu')(x)\n", 349 | "x = layers.GlobalMaxPooling1D()(x)\n", 350 | "x = layers.Dense(128, activation='relu')(x)\n", 351 | "\n", 352 | "age_prediction = layers.Dense(1, name='age')(x)\n", 353 | "income_prediction = layers.Dense(num_income_groups,activation='softmax',name='income')(x)\n", 354 | "gender_prediction = layers.Dense(1, activation='sigmoid', name='gender')(x)\n", 355 | "\n", 356 | "model = Model(posts_input,[age_prediction, income_prediction, gender_prediction])" 357 | ] 358 | }, 359 | { 360 | "cell_type": "code", 361 | "execution_count": null, 362 | "metadata": {}, 363 | "outputs": [], 364 | "source": [ 365 | "model.compile(optimizer='rmsprop',\n", 366 | " loss=['mse', 'categorical_crossentropy', 'binary_crossentropy'],\n", 367 | " loss_weights=[0.25, 1., 10.])\n", 368 | "model.compile(optimizer='rmsprop', # 使用不同的损失加权\n", 369 | " loss={'age': 'mse', # [3,5]左右\n", 370 | " 'income': 'categorical_crossentropy', # \n", 371 | " 'gender': 'binary_crossentropy'}, # 0.1左右\n", 372 | " loss_weights={'age': 0.25,\n", 373 | " 'income': 1.,\n", 374 | " 'gender': 10.})\n" 375 | ] 376 | }, 377 | { 378 | "cell_type": "code", 379 | "execution_count": null, 380 | "metadata": {}, 381 | "outputs": [], 382 | "source": [ 383 | "model.fit(posts, [age_targets, income_targets, gender_targets],\n", 384 | " epochs=10, batch_size=64)\n", 385 | "model.fit(posts, {'age': age_targets,\n", 386 | " 'income': income_targets,\n", 387 | " 'gender': gender_targets},\n", 388 | " epochs=10, batch_size=64)" 389 | ] 390 | }, 391 | { 392 | "cell_type": "markdown", 393 | "metadata": {}, 394 | "source": [ 395 | "## 层组成的有向无环图" 396 | ] 397 | }, 398 | { 399 | "cell_type": "markdown", 400 | "metadata": {}, 401 | "source": [ 402 | "### Inception模块\n", 403 | "![](https://dpzbhybb2pdcj.cloudfront.net/chollet/Figures/07fig08.jpg)\n", 404 | "* Inception模块:分别学习空间特征和主通道特征\n", 405 | "* Inception 
V3位于:`keras.applications.inception_v3`,其中包括在ImageNet上预训练得到的权重\n", 406 | "* Xception:代表极端的Inception。空间卷积+逐点卷积" 407 | ] 408 | }, 409 | { 410 | "cell_type": "code", 411 | "execution_count": null, 412 | "metadata": {}, 413 | "outputs": [], 414 | "source": [ 415 | "from keras import layers\n", 416 | "branch_a = layers.Conv2D(128, 1,activation='relu', strides=2)(x)\n", 417 | "\n", 418 | "branch_b = layers.Conv2D(128, 1, activation='relu')(x)\n", 419 | "branch_b = layers.Conv2D(128, 3, activation='relu', strides=2)(branch_b)\n", 420 | "\n", 421 | "branch_c = layers.AveragePooling2D(3, strides=2)(x)\n", 422 | "branch_c = layers.Conv2D(128, 3, activation='relu')(branch_c)\n", 423 | "\n", 424 | "branch_d = layers.Conv2D(128, 1, activation='relu')(x)\n", 425 | "branch_d = layers.Conv2D(128, 3, activation='relu')(branch_d)\n", 426 | "branch_d = layers.Conv2D(128, 3, activation='relu', strides=2)(branch_d)\n", 427 | "\n", 428 | "output = layers.concatenate([branch_a, branch_b, branch_c, branch_d], axis=-1)" 429 | ] 430 | }, 431 | { 432 | "cell_type": "markdown", 433 | "metadata": {}, 434 | "source": [ 435 | "### 残差连接\n", 436 | "![](https://dpzbhybb2pdcj.cloudfront.net/chollet/HighResolutionFigures/figure_7-5.png)\n", 437 | "* 解决了大规模深度学习的两个共性问题:\n", 438 | " * **梯度消失(引入一个纯线性的信息携带轨道)**\n", 439 | " * **表示瓶颈(弥补信息在传播过程中的丢失)**。\n", 440 | "* 前面的层不是和后面的层连接在一起,而是**与后面层的激活相加**\n", 441 | " * 形状相同:**恒等残差连接**\n", 442 | " * 形状不同:**线性残差连接**(可以是不带激活的Dense层;对于卷积特征图,则是不带激活的1×1卷积)" 443 | ] 444 | }, 445 | { 446 | "cell_type": "code", 447 | "execution_count": null, 448 | "metadata": {}, 449 | "outputs": [], 450 | "source": [ 451 | "from keras import layers # 恒等残差连接\n", 452 | "x = ...\n", 453 | "y = layers.Conv2D(128, 3, activation='relu', padding='same')(x)\n", 454 | "y = layers.Conv2D(128, 3, activation='relu', padding='same')(y)\n", 455 | "y = layers.Conv2D(128, 3, activation='relu', padding='same')(y)\n", 456 | "\n", 457 | "y = layers.add([y, x]) " 458 | ] 459 | }, 460 | { 461 | "cell_type": "code", 
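残差连接本质上只是一次加法:形状相同直接相加,形状不同先用不带激活的线性投影对齐。下面用 numpy 在二维张量上示意这两种情形(用矩阵乘法代替卷积,仅为演示;卷积特征图上的做法同理):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(64, 128))             # 前面某层的输出:64 个样本、128 维

# 恒等残差连接:主分支输出形状与 x 相同,直接相加
f_w = rng.normal(size=(128, 128)) * 0.05
y = np.maximum(x @ f_w, 0)                 # 用一次"线性+relu"代替若干卷积层
out_same = y + x                           # 相当于 layers.add([y, x])

# 线性残差连接:主分支把维度降到 64,x 需要一个不带激活的线性投影
g_w = rng.normal(size=(128, 64)) * 0.05
y2 = np.maximum(x @ g_w, 0)
proj = rng.normal(size=(128, 64)) * 0.05   # 相当于 1×1 卷积 / 不带激活的 Dense
out_diff = y2 + x @ proj                   # 相当于 layers.add([y2, residual])

print(out_same.shape, out_diff.shape)      # (64, 128) (64, 64)
```

加法分支是纯线性的,所以梯度可以不经衰减地流回前面的层,这就是"信息携带轨道"的含义。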
462 | "execution_count": null, 463 | "metadata": {}, 464 | "outputs": [], 465 | "source": [ 466 | "from keras import layers # 线性残差连接\n", 467 | "x = ...\n", 468 | "y = layers.Conv2D(128, 3, activation='relu', padding='same')(x)\n", 469 | "y = layers.Conv2D(128, 3, activation='relu', padding='same')(y)\n", 470 | "y = layers.MaxPooling2D(2, strides=2)(y)\n", 471 | "\n", 472 | "residual = layers.Conv2D(128, 1, strides=2, padding='same')(x)\n", 473 | "\n", 474 | "y = layers.add([y, residual]) " 475 | ] 476 | }, 477 | { 478 | "cell_type": "markdown", 479 | "metadata": {}, 480 | "source": [ 481 | "## 共享层权重" 482 | ] 483 | }, 484 | { 485 | "cell_type": "code", 486 | "execution_count": null, 487 | "metadata": {}, 488 | "outputs": [], 489 | "source": [ 490 | "from keras import layers\n", 491 | "from keras import Input\n", 492 | "from keras.models import Model\n", 493 | "\n", 494 | "lstm = layers.LSTM(32) # 将一个LSTM实例化一次,多次重复使用一个层实例\n", 495 | "\n", 496 | "left_input = Input(shape=(None, 128))\n", 497 | "left_output = lstm(left_input)\n", 498 | "\n", 499 | "right_input = Input(shape=(None, 128))\n", 500 | "right_output = lstm(right_input)\n", 501 | "\n", 502 | "merged = layers.concatenate([left_output, right_output], axis=-1)\n", 503 | "predictions = layers.Dense(1, activation='sigmoid')(merged)\n", 504 | "\n", 505 | "model = Model([left_input, right_input], predictions)\n", 506 | "model.fit([left_data, right_data], targets)" 507 | ] 508 | }, 509 | { 510 | "cell_type": "markdown", 511 | "metadata": {}, 512 | "source": [ 513 | "## 模型作为层\n", 514 | "使用**双摄像头作为输入**的视觉模型可以**感知深度**。Keras中实现连体视觉模型(共享卷积基)如下cell\n", 515 | "~~~python\n", 516 | "y = model(x)\n", 517 | "y1, y2 = model([x1, x2])\n", 518 | "~~~" 519 | ] 520 | }, 521 | { 522 | "cell_type": "code", 523 | "execution_count": null, 524 | "metadata": {}, 525 | "outputs": [], 526 | "source": [ 527 | "from keras import layers\n", 528 | "from keras import applications\n", 529 | "from keras import Input\n", 530 | "\n", 531 | 
"xception_base = applications.Xception(weights=None,include_top=False)\n", 532 | "\n", 533 | "left_input = Input(shape=(250, 250, 3))\n", 534 | "right_input = Input(shape=(250, 250, 3))\n", 535 | "\n", 536 | "left_features = xception_base(left_input)\n", 537 | "right_features = xception_base(right_input)\n", 538 | "\n", 539 | "merged_features = layers.concatenate([left_features, right_features], axis=-1)" 540 | ] 541 | }, 542 | { 543 | "cell_type": "markdown", 544 | "metadata": {}, 545 | "source": [ 546 | "# Keras回调函数和TensorBoard\n", 547 | "## 训练过程中将回调函数用于模型" 548 | ] 549 | }, 550 | { 551 | "cell_type": "markdown", 552 | "metadata": {}, 553 | "source": [ 554 | "* **模型检查点**:训练中保存模型\n", 555 | "* **提前终止**:如果监控指标不再改善,则中断训练\n", 556 | "* **训练过程中动态调节参数**:例如优化器的学习率\n", 557 | "* **记录训练指标和验证指标**:将模型表示可视化。keras进度条就是一个回调函数\n", 558 | "~~~python\n", 559 | "keras.callbacks.ModelCheckpoint\n", 560 | "keras.callbacks.EarlyStopping\n", 561 | "keras.callbacks.LearningRateScheduler\n", 562 | "keras.callbacks.ReduceLROnPlateau\n", 563 | "keras.callbacks.CSVLogger\n", 564 | "~~~" 565 | ] 566 | }, 567 | { 568 | "cell_type": "code", 569 | "execution_count": null, 570 | "metadata": {}, 571 | "outputs": [], 572 | "source": [ 573 | "import keras # EarlyStopping 通常和 ModelCheckpoint 结合使用\n", 574 | "callbacks_list = [\n", 575 | " keras.callbacks.EarlyStopping( # 如果不再改善,则中断训练\n", 576 | " monitor='acc', # 监控指标为精度\n", 577 | " patience=1, # 在多于一轮的时间内不再改善则中断训练\n", 578 | " ),\n", 579 | " keras.callbacks.ModelCheckpoint( # 每轮过后,保存模型权重\n", 580 | " filepath='my_model.h5', # 保存路径\n", 581 | " monitor='val_loss', # 监控指标\n", 582 | " save_best_only=True, # 如果 val_loss 没有改善则不覆盖文件\n", 583 | " ),\n", 584 | " # ... 此处可再添加其他回调函数\n", 585 | " keras.callbacks.ReduceLROnPlateau( # 降低学习率的回调函数\n", 586 | " monitor='val_loss', # 监控指标\n", 587 | " factor=0.1, # 触发时将学习率除以10\n", 588 | " patience=10, # 10轮内没有改善则触发\n", 589 | " )\n", 590 | "]\n", 591 | "\n", 592 | "model.compile(optimizer='rmsprop',\n", 593 | " loss='binary_crossentropy',\n", 594 | " "
metrics=['acc'])\n", 595 | "model.fit(x, y,\n", 596 | " epochs=10,\n", 597 | " batch_size=32,\n", 598 | " callbacks=callbacks_list,\n", 599 | " validation_data=(x_val, y_val))# 要监控验证损失,记得传入验证数据validation_data" 600 | ] 601 | }, 602 | { 603 | "cell_type": "markdown", 604 | "metadata": {}, 605 | "source": [ 606 | "* **编写自己的回调函数**:创建`keras.callbacks.Callback`类的子类。实现下面这些方法: \n", 607 | " on_epoch_begin   每轮开始时被调用 \n", 608 | " on_epoch_end      每轮结束时被调用\n", 609 | " \n", 610 | " on_batch_begin   处理每个批次前被调用 \n", 611 | " on_batch_end      处理每个批次后被调用 \n", 612 | " \n", 613 | " on_train_begin   训练开始时被调用 \n", 614 | " on_train_end      训练结束时被调用 \n", 615 | "* **`logs`参数** :是一个字典,里面包含前一个批量、前一个轮次、或前一个训练的信息。训练指标和验证指标等\n", 616 | "* **`self.model`** :调用回调函数的模型实例\n", 617 | "* **`self.validation_data`** :传入fit的验证数据" 618 | ] 619 | }, 620 | { 621 | "cell_type": "code", 622 | "execution_count": 1, 623 | "metadata": { 624 | "collapsed": true 625 | }, 626 | "outputs": [ 627 | { 628 | "data": { 629 | "text/plain": [ 630 | "'编写自己回调函数实例:每轮结束后,将模型的每层激活保存到硬盘,激活是对验证集的第一个样本计算得到'" 631 | ] 632 | }, 633 | "execution_count": 1, 634 | "metadata": {}, 635 | "output_type": "execute_result" 636 | }, 637 | { 638 | "name": "stderr", 639 | "output_type": "stream", 640 | "text": [ 641 | "Using TensorFlow backend.\n" 642 | ] 643 | } 644 | ], 645 | "source": [ 646 | "'编写自己回调函数实例:每轮结束后,将模型的每层激活保存到硬盘,激活是对验证集的第一个样本计算得到'\n", 647 | "import keras\n", 648 | "import numpy as np\n", 649 | "\n", 650 | "class ActivationLogger(keras.callbacks.Callback):\n", 651 | "\n", 652 | " def set_model(self, model):\n", 653 | " self.model = model # 模型实例\n", 654 | " layer_outputs = [layer.output for layer in model.layers]\n", 655 | " self.activations_model = keras.models.Model(model.input,layer_outputs) # 新的模型实例\n", 656 | "\n", 657 | " def on_epoch_end(self, epoch, logs=None):\n", 658 | " if self.validation_data is None:\n", 659 | " raise RuntimeError('Requires validation_data.')\n", 660 | "\n", 661 | " validation_sample = 
self.validation_data[0][0:1] # 验证集的第一个sample\n", 662 | " activations = self.activations_model.predict(validation_sample)\n", 663 | " f = open('activations_at_epoch_' + str(epoch) + '.npz', 'wb') # np.savez 写入二进制,需以 'wb' 模式打开\n", 664 | " np.savez(f, activations)\n", 665 | " f.close()" 666 | ] 667 | }, 668 | { 669 | "cell_type": "markdown", 670 | "metadata": {}, 671 | "source": [ 672 | "## TensorBoard简介" 673 | ] 674 | }, 675 | { 676 | "cell_type": "markdown", 677 | "metadata": {}, 678 | "source": [ 679 | "`TensorBoard` 是一个内置于TensorFlow中的基于浏览器的可视化工具。 \n", 680 | "**注意**:只有当keras使用tensorflow后端时,这一方法才能用于keras模型。" 681 | ] 682 | }, 683 | { 684 | "cell_type": "code", 685 | "execution_count": 1, 686 | "metadata": {}, 687 | "outputs": [ 688 | { 689 | "name": "stderr", 690 | "output_type": "stream", 691 | "text": [ 692 | "Using TensorFlow backend.\n" 693 | ] 694 | }, 695 | { 696 | "name": "stdout", 697 | "output_type": "stream", 698 | "text": [ 699 | "WARNING:tensorflow:From C:\Users\pengfeizhang\Anaconda3\envs\learn36\lib\site-packages\tensorflow\python\framework\op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.\n", 700 | "Instructions for updating:\n", 701 | "Colocations handled automatically by placer.\n", 702 | "_________________________________________________________________\n", 703 | "Layer (type) Output Shape Param # \n", 704 | "=================================================================\n", 705 | "embed (Embedding) (None, 500, 128) 256000 \n", 706 | "_________________________________________________________________\n", 707 | "conv1d_1 (Conv1D) (None, 494, 32) 28704 \n", 708 | "_________________________________________________________________\n", 709 | "max_pooling1d_1 (MaxPooling1 (None, 98, 32) 0 \n", 710 | "_________________________________________________________________\n", 711 | "conv1d_2 (Conv1D) (None, 92, 32) 7200 \n", 712 | "_________________________________________________________________\n", 
713 | "global_max_pooling1d_1 (Glob (None, 32) 0 \n", 714 | "_________________________________________________________________\n", 715 | "dense_1 (Dense) (None, 1) 33 \n", 716 | "=================================================================\n", 717 | "Total params: 291,937\n", 718 | "Trainable params: 291,937\n", 719 | "Non-trainable params: 0\n", 720 | "_________________________________________________________________\n" 721 | ] 722 | } 723 | ], 724 | "source": [ 725 | "import keras # IMDB 文本分类,使用TensorBoard可视化\n", 726 | "import numpy as np\n", 727 | "from keras import layers\n", 728 | "from keras.datasets import imdb\n", 729 | "from keras.preprocessing import sequence\n", 730 | "\n", 731 | "max_features = 2000\n", 732 | "max_len = 500\n", 733 | "\n", 734 | "np_load_old = np.load\n", 735 | "np.load = lambda *a,**k: np_load_old(*a, allow_pickle=True, **k)\n", 736 | "(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)\n", 737 | "np.load = np_load_old\n", 738 | "\n", 739 | "x_train = sequence.pad_sequences(x_train, maxlen=max_len)\n", 740 | "x_test = sequence.pad_sequences(x_test, maxlen=max_len)\n", 741 | "\n", 742 | "model = keras.models.Sequential()\n", 743 | "model.add(layers.Embedding(max_features, 128,input_length=max_len,name='embed'))\n", 744 | "model.add(layers.Conv1D(32, 7, activation='relu'))\n", 745 | "model.add(layers.MaxPooling1D(5))\n", 746 | "model.add(layers.Conv1D(32, 7, activation='relu'))\n", 747 | "model.add(layers.GlobalMaxPooling1D())\n", 748 | "model.add(layers.Dense(1))\n", 749 | "model.summary()\n", 750 | "model.compile(optimizer='rmsprop',\n", 751 | " loss='binary_crossentropy',\n", 752 | " metrics=['acc'])" 753 | ] 754 | }, 755 | { 756 | "cell_type": "code", 757 | "execution_count": 6, 758 | "metadata": {}, 759 | "outputs": [], 760 | "source": [ 761 | "from keras.utils import plot_model # pydot pydot-ng graphviz 库\n", 762 | "plot_model(model,show_shapes=True, to_file='model.png')" 763 | ] 764 | }, 765 | { 766 
| "cell_type": "code", 767 | "execution_count": 2, 768 | "metadata": { 769 | "collapsed": true 770 | }, 771 | "outputs": [ 772 | { 773 | "name": "stdout", 774 | "output_type": "stream", 775 | "text": [ 776 | "WARNING:tensorflow:From C:\Users\pengfeizhang\Anaconda3\envs\learn36\lib\site-packages\tensorflow\python\ops\math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n", 777 | "Instructions for updating:\n", 778 | "Use tf.cast instead.\n", 779 | "WARNING:tensorflow:From C:\Users\pengfeizhang\Anaconda3\envs\learn36\lib\site-packages\tensorflow\python\ops\math_grad.py:102: div (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n", 780 | "Instructions for updating:\n", 781 | "Deprecated in favor of operator or tf.math.divide.\n", 782 | "Train on 20000 samples, validate on 5000 samples\n", 783 | "Epoch 1/3\n", 784 | "20000/20000 [==============================] - 5s 258us/step - loss: 0.6412 - acc: 0.6483 - val_loss: 0.4288 - val_acc: 0.8262\n", 785 | "Epoch 2/3\n", 786 | "20000/20000 [==============================] - 4s 190us/step - loss: 0.4514 - acc: 0.8214 - val_loss: 0.4549 - val_acc: 0.8092\n", 787 | "Epoch 3/3\n", 788 | "20000/20000 [==============================] - 4s 180us/step - loss: 0.3918 - acc: 0.8046 - val_loss: 0.9433 - val_acc: 0.6264\n" 789 | ] 790 | } 791 | ], 792 | "source": [ 793 | "import tensorflow as tf\n", 794 | "\n", 795 | "config = tf.ConfigProto(allow_soft_placement=True)\n", 796 | "config.gpu_options.allocator_type = 'BFC'\n", 797 | "config.gpu_options.per_process_gpu_memory_fraction = 0.80\n", 798 | "keras.backend.set_session(tf.Session(config=config)) # 应用上述GPU配置,否则config不会生效\n", 799 | "callbacks = [ \n", 800 | " keras.callbacks.TensorBoard(\n", 801 | " log_dir=r'my_log_dir', # TensorBoard 日志文件写入的位置\n", 802 | " histogram_freq=1, # 每一轮之后记录激活直方图\n", 803 | " embeddings_freq=1, # 每一轮之后记录嵌入数据 \n", 804 | " embeddings_data = x_train[:1], # 用于计算待可视化嵌入的输入数据\n", 805 | " )\n", 806 | "]\n", 807 | "history = model.fit(x_train, y_train,\n", 808 | " epochs=3,\n", 809 | " batch_size=128,\n", 810 | " validation_split=0.2,\n", 811 | " callbacks=callbacks)" 812 | ] 813 | }, 814 | { 815 | "cell_type": "markdown", 816 | "metadata": {}, 817 | "source": [ 818 | "# 让模型性能发挥到极致\n", 819 | "让模型从“具有不错的性能”上升到“性能卓越且能够赢得机器学习竞赛”" 820 | ] 821 | }, 822 | { 823 | "cell_type": "markdown", 824 | "metadata": {}, 825 | "source": [ 826 | "## 高级架构模式\n", 827 | "\n", 828 | "### 批标准化\n", 829 | "* 三种重要的**设计模式**:残差连接;标准化;深度可分离卷积\n", 830 | "* **标准化**:数据输入模型之前做标准化。假设数据服从正态分布(高斯分布),确保分布中心为0,同时缩放到方差为1\n", 831 | "* **批标准化**:即使在训练过程中数据的均值和方差随时间发生变化,也可以适应性地将数据标准化,有助于梯度的传播。对于有些特别深的网络,只有包含多个`BatchNormalization`层时才能进行训练。例如keras中的许多高级CNN(ResNet50\Inception V3\Xception)\n", 832 | "* **`BatchNormalization`层通常在卷积层或密集连接层之后使用** \n", 833 | " conv_model.add(layers.Conv2D(32, 3, activation='relu')) \n", 834 | " conv_model.add(layers.BatchNormalization()) \n", 835 | "\n", 836 | " dense_model.add(layers.Dense(32, activation='relu')) \n", 837 | " dense_model.add(layers.BatchNormalization())\t\t \n", 838 | "* **`BatchNormalization`层接受一个参数axis**,表示对哪个特征轴做标准化,默认是-1,即输入张量的最后一个轴。 \n", 839 | " 对于Dense,Conv1D,RNN层,以及`data_format`为`channels_last`的Conv2D层,默认值正确 \n", 840 | " (`data_format`为`channels_first`的Conv2D层,axis应设为1) \n", 841 | "* ***批再标准化***:是对普通批标准化的最新改进\n", 842 | "* ***自标准化神经网络***:使用***特殊的激活函数(selu)和特殊的初始化器(lecun_normal)***,让数据通过任何Dense层之后保持数据的标准化。这种方案虽然有趣,但目前仅限于密集连接网络,其有效性尚未得到大规模复现。\n", 843 | "### 深度可分离卷积\n", 844 | "* 可以代替Conv2D,让模型更轻量,任务性能更高\n", 845 | "* 深度可分离卷积:`SeparableConv2D`层;逐通道卷积+逐点卷积(**如下图:**)\n", 846 | "* 将空间特征学习,通道特征学习分开。**如果空间位置高度相关,不同通道相互独立,那么这样做很有意义**\n", 847 | "* 深度可分离卷积是Xception的基础\n", 848 | "![](https://dpzbhybb2pdcj.cloudfront.net/chollet/Figures/07fig16_alt.jpg)" 849 | ] 850 | }, 851 | {
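上面的批标准化小节说它把每个特征调整为均值0、方差1。下面用 NumPy 给出该前向变换的一个最小示意(仅为演示批标准化的数学计算,不是 Keras `BatchNormalization` 层的完整实现:未包含训练时维护的移动平均统计,`gamma`、`beta` 取假设的标量值):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """沿批量轴(axis=0)做批标准化:减均值、除标准差,再缩放(gamma)和平移(beta)。"""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # 每个特征:均值0、方差约1
    return gamma * x_hat + beta

x = np.random.randn(64, 32) * 3.0 + 5.0  # 一批均值约5、标准差约3的激活
y = batch_norm(x)
```

标准化后 `y` 每列均值接近0、方差接近1,这正是有利于梯度传播的分布。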
855 | "cell_type": "code", 856 | "execution_count": null, 857 | "metadata": {}, 858 | "outputs": [], 859 | "source": [ 860 | "from keras.models import Sequential # 构建一个轻量的深度可分离卷积网络,用于图像多分类任务(softmax)\n", 861 | "from keras import layers\n", 862 | "\n", 863 | "height = 64\n", 864 | "width = 64\n", 865 | "channels = 3\n", 866 | "num_classes = 10\n", 867 | "\n", 868 | "model = Sequential()\n", 869 | "model.add(layers.SeparableConv2D(32, 3,activation='relu',input_shape=(height, width, channels,)))\n", 870 | "model.add(layers.SeparableConv2D(64, 3, activation='relu'))\n", 871 | "model.add(layers.MaxPooling2D(2))\n", 872 | "\n", 873 | "model.add(layers.SeparableConv2D(64, 3, activation='relu'))\n", 874 | "model.add(layers.SeparableConv2D(128, 3, activation='relu'))\n", 875 | "model.add(layers.MaxPooling2D(2))\n", 876 | "\n", 877 | "model.add(layers.SeparableConv2D(64, 3, activation='relu'))\n", 878 | "model.add(layers.SeparableConv2D(128, 3, activation='relu'))\n", 879 | "model.add(layers.GlobalAveragePooling2D())\n", 880 | "\n", 881 | "model.add(layers.Dense(32, activation='relu'))\n", 882 | "model.add(layers.Dense(num_classes, activation='softmax'))\n", 883 | "\n", 884 | "model.compile(optimizer='rmsprop', loss='categorical_crossentropy')" 885 | ] 886 | }, 887 | { 888 | "cell_type": "markdown", 889 | "metadata": {}, 890 | "source": [ 891 | "## 超参数优化\n", 892 | "* 整天调节参数的工作,最好留给机器去做。\n", 893 | "* 制定一个原则,系统性地自动探索可能的决策空间。\n", 894 | "* 可供选择的技术:贝叶斯优化、遗传算法、简单随机搜索\n", 895 | "* 挑战性: \n", 896 | " 1、**计算反馈信号的代价可能很高**,它需要在数据集上创建一个新模型并从头开始训练 \n", 897 | " 2、**超参数空间由许多离散的决定组成,既不是连续的,也不是可微的。**通常不能在超参数空间使用梯度方法。 \n", 898 | "* 方法: \n", 899 | " 1、通常情况下,**随机搜索**(随机选择需要评估的超参数,并重复这一过程)就是最好的方案,也是最简单的方案。 \n", 900 | " 2、**`Hyperopt`**是一个用于超参数优化的Python库,其内部使用`Parzen`估计器的树来预测哪组超参数可能会得到很好的效果。另一个叫做**`Hyperas`的库将`Hyperopt`与`Keras`模型结合在一起**。" 901 | ] 902 | }, 903 | { 904 | "cell_type": "markdown", 905 | "metadata": {}, 906 | "source": [ 907 | "## 小结" 908 | ] 909 | }, 910 | { 911 | "cell_type":
"markdown", 912 | "metadata": {}, 913 | "source": [ 914 | "# 本章总结" 915 | ] 916 | }, 917 | { 918 | "cell_type": "code", 919 | "execution_count": null, 920 | "metadata": {}, 921 | "outputs": [], 922 | "source": [] 923 | } 924 | ], 925 | "metadata": { 926 | "kernelspec": { 927 | "display_name": "Python [conda env:learn36]", 928 | "language": "python", 929 | "name": "conda-env-learn36-py" 930 | }, 931 | "language_info": { 932 | "codemirror_mode": { 933 | "name": "ipython", 934 | "version": 3 935 | }, 936 | "file_extension": ".py", 937 | "mimetype": "text/x-python", 938 | "name": "python", 939 | "nbconvert_exporter": "python", 940 | "pygments_lexer": "ipython3", 941 | "version": "3.6.8" 942 | }, 943 | "toc": { 944 | "base_numbering": 1, 945 | "nav_menu": {}, 946 | "number_sections": true, 947 | "sideBar": true, 948 | "skip_h1_title": false, 949 | "title_cell": "Table of Contents", 950 | "title_sidebar": "Contents", 951 | "toc_cell": false, 952 | "toc_position": { 953 | "height": "calc(100% - 180px)", 954 | "left": "10px", 955 | "top": "150px", 956 | "width": "165px" 957 | }, 958 | "toc_section_display": true, 959 | "toc_window_display": true 960 | } 961 | }, 962 | "nbformat": 4, 963 | "nbformat_minor": 2 964 | } 965 | -------------------------------------------------------------------------------- /SourceHanSansSC-ExtraLight.otf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/flyto22c/keras-deep-learning-with-python-cn/c84b92fe8ba4dccaa840699d8407dde15574dc95/SourceHanSansSC-ExtraLight.otf -------------------------------------------------------------------------------- /creative_commons_elephant.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/flyto22c/keras-deep-learning-with-python-cn/c84b92fe8ba4dccaa840699d8407dde15574dc95/creative_commons_elephant.jpg 
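上文提到随机搜索(随机采样超参数并重复评估)通常是超参数优化最简单也最有效的基线。下面是一个纯 Python 的示意:其中搜索空间和目标函数 `fake_objective` 都是假设的演示值,真实场景中目标函数应当训练模型并返回验证集指标(这正是代价很高的那一步):

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """在给定搜索空间中随机采样超参数,返回得分最高的一组。
    space: {参数名: 候选值列表};objective: 接受超参数字典、返回分数(越大越好)。"""
    rng = random.Random(seed)
    best_params, best_score = None, float('-inf')
    for _ in range(n_trials):
        # 每个超参数独立地从候选值中随机抽取
        params = {name: rng.choice(choices) for name, choices in space.items()}
        score = objective(params)  # 每次评估都要重新训练模型,代价很高
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# 假设的目标函数:分数随远离 lr=1e-3、units=64 而降低,仅作演示
def fake_objective(p):
    return -abs(p['lr'] - 1e-3) * 1e3 - abs(p['units'] - 64) / 64

space = {'lr': [1e-4, 1e-3, 1e-2], 'units': [32, 64, 128]}
best, score = random_search(fake_objective, space)
```

贝叶斯优化(如 `Hyperopt`)与此的区别只在于:采样不是均匀随机的,而是根据历史评估结果偏向更有希望的区域。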
-------------------------------------------------------------------------------- /elephant_cam.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/flyto22c/keras-deep-learning-with-python-cn/c84b92fe8ba4dccaa840699d8407dde15574dc95/elephant_cam.jpg -------------------------------------------------------------------------------- /elephant_cam1.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/flyto22c/keras-deep-learning-with-python-cn/c84b92fe8ba4dccaa840699d8407dde15574dc95/elephant_cam1.jpg -------------------------------------------------------------------------------- /model.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/flyto22c/keras-deep-learning-with-python-cn/c84b92fe8ba4dccaa840699d8407dde15574dc95/model.png -------------------------------------------------------------------------------- /runoob.npz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/flyto22c/keras-deep-learning-with-python-cn/c84b92fe8ba4dccaa840699d8407dde15574dc95/runoob.npz -------------------------------------------------------------------------------- /test.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/flyto22c/keras-deep-learning-with-python-cn/c84b92fe8ba4dccaa840699d8407dde15574dc95/test.png --------------------------------------------------------------------------------