├── LICENSE
├── README.md
├── Section 01
│   └── Mathematics Refresher.ipynb
├── Section 02
│   ├── 1-ANN in TF 2.0 MNIST with Tensorflow.ipynb
│   ├── 2-Cifar10.ipynb
│   └── 3- MNIST.ipynb
├── Section 03
│   └── CNN.ipynb
├── Section 04
│   └── V04 - Building LSTM model for text data and getting the results.ipynb
├── Section 05
│   ├── .DS_Store
│   ├── .ipynb_checkpoints
│   │   └── Training a Model for Temperature Forecasting -checkpoint.ipynb
│   ├── Training a Model for Temperature Forecasting .ipynb
│   └── jena_climate_2009_2016.csv
├── Section 06
│   └── Section 06 - Auto-encoders.ipynb
└── Section 07
    ├── 01 - Tensorflow-keras Functional API.ipynb
    ├── 02-03 - Getting and preprocessing IMDB dataset for IMDB Movie Reviews Classification.ipynb
    ├── 04-05 - Reuters dataset for News Text Multi-label Classification.ipynb
    └── 06-07 - Boston housing prediction.ipynb
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2019 Packt
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Getting-Started-with-TensorFlow-2.0-for-Deep-Learning-Video
2 |
3 | This is the code repository for [Getting Started with TensorFlow 2.0 for Deep Learning [Video]](https://www.packtpub.com/application-development/getting-started-tensorflow-20-deep-learning-video), published by [Packt](https://www.packtpub.com/?utm_source=github). It contains all the supporting project files necessary to work through the video course from start to finish.
4 |
5 | ## About the Video Course
6 | Deep learning is the technology to master if you want to break into cutting-edge AI and solve real-world, data-driven problems. Google’s TensorFlow is a popular library for implementing deep learning algorithms because of its rapid development and commercial deployment.
7 | This course teaches you the core of deep learning with TensorFlow 2.0. You’ll learn to train deep learning networks from scratch, pre-process and split your datasets, train models for real-world applications, and validate their accuracy.
8 | By the end of the course, you’ll have a thorough understanding of how to leverage TensorFlow 2.0 to build real-world applications without much effort.
9 |
10 | ## What You Will Learn
11 |
12 | * Develop real-world deep learning applications
13 | * Classify IMDb movie reviews with a binary classification model
14 | * Build a model for multi-label news classification
15 | * Train a deep learning model to predict house prices
16 | * Understand the whole pipeline: prepare a dataset, build the deep learning model, and validate the results
17 | * Understand how recurrent neural networks and LSTMs work, with hands-on examples
18 | * Implement autoencoders and denoising autoencoders in a project to regenerate images
19 |
20 |
21 |
22 | ## Instructions and Navigation
23 | ### Assumed Knowledge
24 | This course is for developers who have a basic knowledge of Python. If you’re aware of the basics of machine learning and now want to build deep learning systems with TensorFlow 2.0 that are smarter, faster, more complex, and more practical, then this course is for you!
25 |
26 | ### Technical Requirements
27 | This course has the following requirements:
28 | * Jupyter Notebook (latest version)
29 | * Operating system: Mac/Linux
30 | * Python 3.x
31 | * Basic programming skills
32 |
33 |
34 | ## Related Products
35 | * [Learning TensorFlow 2.0 [Video]](https://www.packtpub.com/big-data-and-business-intelligence/learning-tensorflow-20-video)
36 |
37 | * [Implementing Deep Learning Algorithms with TensorFlow 2.0 [Video]](https://www.packtpub.com/big-data-and-business-intelligence/implementing-deep-learning-algorithms-tensorflow-20-video)
38 |
39 | * [Advanced NLP Projects with TensorFlow 2.0 [Video]](https://www.packtpub.com/application-development/advanced-nlp-projects-tensorflow-20-video)
40 |
--------------------------------------------------------------------------------
/Section 01/Mathematics Refresher.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Mathematics Refresher: Scalers, Vectors, Matrices and Tensor."
8 | ]
9 | },
10 | {
11 | "cell_type": "markdown",
12 | "metadata": {},
13 | "source": [
14 | "## 1) Scalars (0D tensors)"
15 | ]
16 | },
17 | {
18 | "cell_type": "markdown",
19 | "metadata": {},
20 | "source": [
21 | "A tensor that contains only one number is called a scalar (or scalar tensor, or 0-dimensional\n",
22 | "tensor, or 0D tensor). \n",
23 | "\n",
24 | "In Numpy, a float32 or float64 number is a scalar tensor (or scalar\n",
25 | "array). Here’s a Numpy scalar:"
26 | ]
27 | },
28 | {
29 | "cell_type": "code",
30 | "execution_count": 43,
31 | "metadata": {},
32 | "outputs": [
33 | {
34 | "data": {
35 | "text/plain": [
36 | "array(12)"
37 | ]
38 | },
39 | "execution_count": 43,
40 | "metadata": {},
41 | "output_type": "execute_result"
42 | }
43 | ],
44 | "source": [
45 | "import numpy as np\n",
46 | "\n",
47 | "x = np.array(12)\n",
48 | "\n",
49 | "x"
50 | ]
51 | },
52 | {
53 | "cell_type": "markdown",
54 | "metadata": {},
55 | "source": [
56 | "You can display the number of axes of a Numpy tensor via the ndim attribute; a scalar\n",
57 | "tensor has 0 axes (ndim == 0). \n",
58 | "\n",
59 | "The number of axes of a tensor is also called its rank."
60 | ]
61 | },
62 | {
63 | "cell_type": "code",
64 | "execution_count": 44,
65 | "metadata": {},
66 | "outputs": [
67 | {
68 | "data": {
69 | "text/plain": [
70 | "0"
71 | ]
72 | },
73 | "execution_count": 44,
74 | "metadata": {},
75 | "output_type": "execute_result"
76 | }
77 | ],
78 | "source": [
79 | "x.ndim"
80 | ]
81 | },
82 | {
83 | "cell_type": "markdown",
84 | "metadata": {},
85 | "source": [
86 | "## 2) Vectors (1D tensors)"
87 | ]
88 | },
89 | {
90 | "cell_type": "markdown",
91 | "metadata": {},
92 | "source": [
93 | "An array of numbers is called a vector, or 1D tensor. A 1D tensor is said to have exactly\n",
94 | "one axis. Following is a Numpy vector:"
95 | ]
96 | },
97 | {
98 | "cell_type": "code",
99 | "execution_count": 45,
100 | "metadata": {},
101 | "outputs": [
102 | {
103 | "data": {
104 | "text/plain": [
105 | "array([12, 3, 6, 14, 7])"
106 | ]
107 | },
108 | "execution_count": 45,
109 | "metadata": {},
110 | "output_type": "execute_result"
111 | }
112 | ],
113 | "source": [
114 | "x = np.array([12, 3, 6, 14, 7])\n",
115 | "\n",
116 | "x"
117 | ]
118 | },
119 | {
120 | "cell_type": "markdown",
121 | "metadata": {},
122 | "source": [
123 | "This vector has five entries and so is called a 5-dimensional vector. Don’t confuse a 5D vector with a 5D tensor! \n",
124 | "\n",
125 | "A 5D vector has only one axis and has five dimensions along its axis, whereas a 5D tensor has five axes"
126 | ]
127 | },
128 | {
129 | "cell_type": "code",
130 | "execution_count": 46,
131 | "metadata": {},
132 | "outputs": [
133 | {
134 | "data": {
135 | "text/plain": [
136 | "1"
137 | ]
138 | },
139 | "execution_count": 46,
140 | "metadata": {},
141 | "output_type": "execute_result"
142 | }
143 | ],
144 | "source": [
145 | "x.ndim"
146 | ]
147 | },
148 | {
149 | "cell_type": "markdown",
150 | "metadata": {},
151 | "source": [
152 | "## 3) Matrices (2D tensors)"
153 | ]
154 | },
155 | {
156 | "cell_type": "markdown",
157 | "metadata": {},
158 | "source": [
159 | "An array of vectors is a matrix, or 2D tensor. A matrix has two axes (often referred to rows and columns). \n",
160 | "\n",
161 | "You can visually interpret a matrix as a rectangular grid of numbers. \n",
162 | "\n",
163 | "This is a Numpy matrix:"
164 | ]
165 | },
166 | {
167 | "cell_type": "code",
168 | "execution_count": 47,
169 | "metadata": {},
170 | "outputs": [],
171 | "source": [
172 | "x = np.array([[5, 78, 2, 34, 0],\n",
173 | " [6, 79, 3, 35, 1],\n",
174 | " [7, 80, 4, 36, 2]])"
175 | ]
176 | },
177 | {
178 | "cell_type": "markdown",
179 | "metadata": {},
180 | "source": [
181 | "The entries from the first axis are called the rows, and the entries from the second axis are called the columns. \n",
182 | "\n",
183 | "In the previous example, [5, 78, 2, 34, 0] is the first row of x, and [5, 6, 7] is the first column."
184 | ]
185 | },
186 | {
187 | "cell_type": "code",
188 | "execution_count": 48,
189 | "metadata": {},
190 | "outputs": [
191 | {
192 | "data": {
193 | "text/plain": [
194 | "2"
195 | ]
196 | },
197 | "execution_count": 48,
198 | "metadata": {},
199 | "output_type": "execute_result"
200 | }
201 | ],
202 | "source": [
203 | "x.ndim"
204 | ]
205 | },
206 | {
207 | "cell_type": "markdown",
208 | "metadata": {},
209 | "source": [
210 | "## 4) 3D tensors and higher-dimensional tensors"
211 | ]
212 | },
213 | {
214 | "cell_type": "markdown",
215 | "metadata": {},
216 | "source": [
217 | "If you pack such matrices in a new array, you obtain a 3D tensor, which you can visually interpret as a cube of numbers. \n",
218 | "\n",
219 | "Following is a Numpy 3D tensor:"
220 | ]
221 | },
222 | {
223 | "cell_type": "code",
224 | "execution_count": 49,
225 | "metadata": {},
226 | "outputs": [],
227 | "source": [
228 | "x = np.array( [[[5, 78, 2, 34, 0],\n",
229 | " [6, 79, 3, 35, 1],\n",
230 | " [7, 80, 4, 36, 2]],\n",
231 | " [[5, 78, 2, 34, 0],\n",
232 | " [6, 79, 3, 35, 1],\n",
233 | " [7, 80, 4, 36, 2]],\n",
234 | " [[5, 78, 2, 34, 0],\n",
235 | " [6, 79, 3, 35, 1],\n",
236 | " [7, 80, 4, 36, 2]]])"
237 | ]
238 | },
239 | {
240 | "cell_type": "markdown",
241 | "metadata": {},
242 | "source": [
243 | "By packing 3D tensors in an array, you can create a 4D tensor, and so on. \n",
244 | "\n",
245 | "In deep learning, you’ll generally manipulate tensors that are 0D to 4D, although you may go up to 5D if you process video data."
246 | ]
247 | },
248 | {
249 | "cell_type": "code",
250 | "execution_count": null,
251 | "metadata": {},
252 | "outputs": [],
253 | "source": [
254 | "x.ndim"
255 | ]
256 | },
257 | {
258 | "cell_type": "markdown",
259 | "metadata": {},
260 | "source": [
261 | "## Summary: A tensor is defined by three key attributes:\n",
262 | "\n",
263 | "\n",
264 | "### 1) Number of axes (rank)\n",
265 | "\n",
266 | "For instance, a 3D tensor has three axes, and a matrix has two axes. This is also called the tensor’s ndim in Python libraries such as Numpy.\n",
267 | "\n",
268 | "\n",
269 | "### 2) Shape \n",
270 | "\n",
271 | "This is a tuple of integers that describes how many dimensions the tensor has along each axis. \n",
272 | "\n",
273 | "For instance, the previous matrix example has shape (3, 5), and the 3D tensor example has shape (3, 3, 5). \n",
274 | "\n",
275 | "A vector has a shape with a single element, such as (5,), whereas a scalar has an empty shape, ().\n",
276 | "\n",
277 | "\n",
278 | "### 3) Data type (usually called dtype in Python libraries)\n",
279 | "\n",
280 | "This is the type of the data contained in the tensor; for instance, a tensor’s type could be float32, uint8, float64, and so on."
281 | ]
282 | }
283 | ],
284 | "metadata": {
285 | "kernelspec": {
286 | "display_name": "PY37",
287 | "language": "python",
288 | "name": "py37"
289 | },
290 | "language_info": {
291 | "codemirror_mode": {
292 | "name": "ipython",
293 | "version": 3
294 | },
295 | "file_extension": ".py",
296 | "mimetype": "text/x-python",
297 | "name": "python",
298 | "nbconvert_exporter": "python",
299 | "pygments_lexer": "ipython3",
300 | "version": "3.7.1"
301 | }
302 | },
303 | "nbformat": 4,
304 | "nbformat_minor": 2
305 | }
306 |
--------------------------------------------------------------------------------
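
The refresher above closes by naming a tensor's three key attributes: rank (`ndim`), shape, and data type (`dtype`). A minimal NumPy sketch that prints all three for each kind of tensor covered in the notebook (the array values are taken from the cells above):

```python
import numpy as np

# The example tensors from the notebook above.
examples = {
    "scalar (0D)": np.array(12),
    "vector (1D)": np.array([12, 3, 6, 14, 7]),
    "matrix (2D)": np.array([[5, 78, 2, 34, 0],
                             [6, 79, 3, 35, 1],
                             [7, 80, 4, 36, 2]]),
    "3D tensor":   np.zeros((3, 3, 5)),
}

for name, t in examples.items():
    # rank (ndim) is the number of axes; shape gives the size along each axis
    print(f"{name}: ndim={t.ndim}, shape={t.shape}, dtype={t.dtype}")
```
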
/Section 02/1-ANN in TF 2.0 MNIST with Tensorflow.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [
8 | {
9 | "name": "stderr",
10 | "output_type": "stream",
11 | "text": [
12 | "WARNING: Logging before flag parsing goes to stderr.\n",
13 | "W0613 04:17:05.581143 140735531586432 deprecation.py:323] From /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/compat/v2_compat.py:65: disable_resource_variables (from tensorflow.python.ops.variable_scope) is deprecated and will be removed in a future version.\n",
14 | "Instructions for updating:\n",
15 | "non-resource variables are not supported in the long term\n"
16 | ]
17 | }
18 | ],
19 | "source": [
20 | "import warnings;warnings.filterwarnings('ignore')\n",
21 | "import numpy as np\n",
22 | "import tensorflow.compat.v1 as tf\n",
23 | "tf.disable_v2_behavior()"
24 | ]
25 | },
26 | {
27 | "cell_type": "code",
28 | "execution_count": 2,
29 | "metadata": {},
30 | "outputs": [
31 | {
32 | "data": {
33 | "text/plain": [
34 | "'2.0.0-beta0'"
35 | ]
36 | },
37 | "execution_count": 2,
38 | "metadata": {},
39 | "output_type": "execute_result"
40 | }
41 | ],
42 | "source": [
43 | "tf.__version__"
44 | ]
45 | },
46 | {
47 | "cell_type": "code",
48 | "execution_count": null,
49 | "metadata": {},
50 | "outputs": [],
51 | "source": [
52 | "(X_train, y_train), (X_test, y_test) = tf.keras.datasets.mnist.load_data()\n",
53 | "X_train = X_train.astype(np.float32).reshape(-1, 28*28) / 255.0\n",
54 | "X_test = X_test.astype(np.float32).reshape(-1, 28*28) / 255.0\n",
55 | "y_train = y_train.astype(np.int32)\n",
56 | "y_test = y_test.astype(np.int32)\n",
57 | "X_valid, X_train = X_train[:5000], X_train[5000:]\n",
58 | "y_valid, y_train = y_train[:5000], y_train[5000:]"
59 | ]
60 | },
61 | {
62 | "cell_type": "code",
63 | "execution_count": null,
64 | "metadata": {},
65 | "outputs": [],
66 | "source": [
67 | "n_inputs = 28*28 # MNIST\n",
68 | "n_hidden1 = 300\n",
69 | "n_hidden2 = 100\n",
70 | "n_outputs = 10"
71 | ]
72 | },
73 | {
74 | "cell_type": "code",
75 | "execution_count": null,
76 | "metadata": {},
77 | "outputs": [],
78 | "source": [
79 | "X = tf.placeholder(tf.float32, shape=(None, n_inputs), name=\"X\")\n",
80 | "y = tf.placeholder(tf.int32, shape=(None), name=\"y\")"
81 | ]
82 | },
83 | {
84 | "cell_type": "code",
85 | "execution_count": null,
86 | "metadata": {},
87 | "outputs": [],
88 | "source": [
89 | "def neuron_layer(X, n_neurons, name, activation=None):\n",
90 | " with tf.name_scope(name):\n",
91 | " n_inputs = int(X.get_shape()[1])\n",
92 | " stddev = 2 / np.sqrt(n_inputs)\n",
93 | " init = tf.truncated_normal((n_inputs, n_neurons), stddev=stddev)\n",
94 | " W = tf.Variable(init, name=\"kernel\")\n",
95 | " b = tf.Variable(tf.zeros([n_neurons]), name=\"bias\")\n",
96 | " Z = tf.matmul(X, W) + b\n",
97 | " if activation is not None:\n",
98 | " return activation(Z)\n",
99 | " else:\n",
100 | " return Z"
101 | ]
102 | },
103 | {
104 | "cell_type": "code",
105 | "execution_count": null,
106 | "metadata": {},
107 | "outputs": [],
108 | "source": [
109 | "with tf.name_scope(\"dnn\"):\n",
110 | " hidden1 = neuron_layer(X, n_hidden1, name=\"hidden1\",\n",
111 | " activation=tf.nn.relu)\n",
112 | " hidden2 = neuron_layer(hidden1, n_hidden2, name=\"hidden2\",\n",
113 | " activation=tf.nn.relu)\n",
114 | " logits = neuron_layer(hidden2, n_outputs, name=\"outputs\")"
115 | ]
116 | },
117 | {
118 | "cell_type": "code",
119 | "execution_count": null,
120 | "metadata": {},
121 | "outputs": [],
122 | "source": [
123 | "with tf.name_scope(\"loss\"):\n",
124 | " xentropy = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y,\n",
125 | " logits=logits)\n",
126 | " loss = tf.reduce_mean(xentropy, name=\"loss\")"
127 | ]
128 | },
129 | {
130 | "cell_type": "code",
131 | "execution_count": null,
132 | "metadata": {},
133 | "outputs": [],
134 | "source": [
135 | "learning_rate = 0.01\n",
136 | "\n",
137 | "with tf.name_scope(\"train\"):\n",
138 | " optimizer = tf.train.GradientDescentOptimizer(learning_rate)\n",
139 | " training_op = optimizer.minimize(loss)\n",
140 | "\n",
141 | "with tf.name_scope(\"eval\"):\n",
142 | " correct = tf.nn.in_top_k(logits, y, 1)\n",
143 | " accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))\n",
144 | "\n",
145 | "init = tf.global_variables_initializer()\n",
146 | "saver = tf.train.Saver()"
147 | ]
148 | },
149 | {
150 | "cell_type": "code",
151 | "execution_count": null,
152 | "metadata": {},
153 | "outputs": [],
154 | "source": [
155 | "n_epochs = 10\n",
156 | "batch_size = 50"
157 | ]
158 | },
159 | {
160 | "cell_type": "code",
161 | "execution_count": null,
162 | "metadata": {},
163 | "outputs": [],
164 | "source": [
165 | "def shuffle_batch(X, y, batch_size):\n",
166 | " rnd_idx = np.random.permutation(len(X))\n",
167 | " n_batches = len(X) // batch_size\n",
168 | " for batch_idx in np.array_split(rnd_idx, n_batches):\n",
169 | " X_batch, y_batch = X[batch_idx], y[batch_idx]\n",
170 | " yield X_batch, y_batch"
171 | ]
172 | },
173 | {
174 | "cell_type": "code",
175 | "execution_count": null,
176 | "metadata": {
177 | "scrolled": true
178 | },
179 | "outputs": [],
180 | "source": [
181 | "with tf.Session() as sess:\n",
182 | " init.run()\n",
183 | " for epoch in range(n_epochs):\n",
184 | " for X_batch, y_batch in shuffle_batch(X_train, y_train, batch_size):\n",
185 | " sess.run(training_op, feed_dict={X: X_batch, y: y_batch})\n",
186 | " acc_batch = accuracy.eval(feed_dict={X: X_batch, y: y_batch})\n",
187 | " acc_val = accuracy.eval(feed_dict={X: X_valid, y: y_valid})\n",
188 | " print(epoch, \"Batch accuracy:\", acc_batch, \"Val accuracy:\", acc_val)\n",
189 | "\n",
190 | " save_path = saver.save(sess, \"./my_model_final.ckpt\")"
191 | ]
192 | },
193 | {
194 | "cell_type": "code",
195 | "execution_count": null,
196 | "metadata": {},
197 | "outputs": [],
198 | "source": [
199 | "with tf.Session() as sess:\n",
200 | " saver.restore(sess, \"./my_model_final.ckpt\") # or better, use save_path\n",
201 | " X_new_scaled = X_test[:20]\n",
202 | " Z = logits.eval(feed_dict={X: X_new_scaled})\n",
203 | " y_pred = np.argmax(Z, axis=1)\n",
204 | "\n",
205 | "print(\"Predicted classes:\", y_pred)\n",
206 | "print(\"Actual classes: \", y_test[:20])"
207 | ]
208 | },
209 | {
210 | "cell_type": "code",
211 | "execution_count": null,
212 | "metadata": {},
213 | "outputs": [],
214 | "source": []
215 | }
216 | ],
217 | "metadata": {
218 | "kernelspec": {
219 | "display_name": "Python 3",
220 | "language": "python",
221 | "name": "python3"
222 | },
223 | "language_info": {
224 | "codemirror_mode": {
225 | "name": "ipython",
226 | "version": 3
227 | },
228 | "file_extension": ".py",
229 | "mimetype": "text/x-python",
230 | "name": "python",
231 | "nbconvert_exporter": "python",
232 | "pygments_lexer": "ipython3",
233 | "version": "3.7.1"
234 | },
235 | "nav_menu": {
236 | "height": "264px",
237 | "width": "369px"
238 | },
239 | "toc": {
240 | "navigate_menu": true,
241 | "number_sections": true,
242 | "sideBar": true,
243 | "threshold": 6,
244 | "toc_cell": false,
245 | "toc_section_display": "block",
246 | "toc_window_display": false
247 | }
248 | },
249 | "nbformat": 4,
250 | "nbformat_minor": 1
251 | }
252 |
--------------------------------------------------------------------------------
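
The notebook above drives training through `tf.compat.v1` placeholders, sessions, and a hand-rolled `neuron_layer`. For comparison, here is a minimal sketch of the same network (300- and 100-unit ReLU hidden layers, a 10-way softmax output, SGD with learning rate 0.01, 10 epochs of batch size 50, all taken from the notebook) in native TensorFlow 2 Keras; the Keras formulation is an equivalent rewrite, not part of the course code:

```python
import numpy as np
import tensorflow as tf

# Same data preparation as the compat.v1 notebook above.
(X_train, y_train), (X_test, y_test) = tf.keras.datasets.mnist.load_data()
X_train = X_train.astype(np.float32).reshape(-1, 28 * 28) / 255.0
X_test = X_test.astype(np.float32).reshape(-1, 28 * 28) / 255.0
X_valid, X_train = X_train[:5000], X_train[5000:]
y_valid, y_train = y_train[:5000], y_train[5000:]

# Two hidden layers (300, 100) with ReLU, softmax output over 10 classes.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(300, activation="relu", input_shape=(28 * 28,)),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(X_train, y_train, epochs=10, batch_size=50,
          validation_data=(X_valid, y_valid))
```

Keras shuffles and mini-batches internally, so the `shuffle_batch` generator from the notebook is no longer needed.
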
/Section 02/2-Cifar10.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "Train a simple deep CNN on the CIFAR10 small images dataset.\n",
8 | "\n",
9 | "It gets to 75% validation accuracy in 25 epochs, and 79% after 50 epochs.\n",
10 | "(it's still underfitting at that point, though)"
11 | ]
12 | },
13 | {
14 | "cell_type": "code",
15 | "execution_count": 1,
16 | "metadata": {},
17 | "outputs": [],
18 | "source": [
19 | "# https://gist.github.com/deep-diver\n",
20 | "import warnings;warnings.filterwarnings('ignore')\n",
21 | "\n",
22 | "from tensorflow import keras\n",
23 | "from tensorflow.keras.datasets import cifar10\n",
24 | "from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
25 | "\n",
26 | "from tensorflow.keras.models import Sequential\n",
27 | "from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten\n",
28 | "from tensorflow.keras.layers import Conv2D, MaxPooling2D\n",
29 | "from tensorflow.keras.optimizers import RMSprop\n",
30 | "\n",
31 | "\n",
32 | "import os"
33 | ]
34 | },
35 | {
36 | "cell_type": "code",
37 | "execution_count": 2,
38 | "metadata": {},
39 | "outputs": [],
40 | "source": [
41 | "batch_size = 32\n",
42 | "\n",
43 | "num_classes = 10\n",
44 | "\n",
45 | "epochs = 100\n",
46 | "\n",
47 | "num_predictions = 20\n",
48 | "\n",
49 | "save_dir = os.path.join(os.getcwd(), 'saved_models')\n",
50 | "\n",
51 | "model_name = 'keras_cifar10_trained_model.h5'"
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "execution_count": 3,
57 | "metadata": {},
58 | "outputs": [
59 | {
60 | "name": "stdout",
61 | "output_type": "stream",
62 | "text": [
63 | "x_train shape: (50000, 32, 32, 3)\n",
64 | "50000 train samples\n",
65 | "10000 test samples\n"
66 | ]
67 | }
68 | ],
69 | "source": [
70 | "# The data, split between train and test sets:\n",
71 | "(x_train, y_train), (x_test, y_test) = cifar10.load_data()\n",
72 | "print('x_train shape:', x_train.shape)\n",
73 | "print(x_train.shape[0], 'train samples')\n",
74 | "print(x_test.shape[0], 'test samples')"
75 | ]
76 | },
77 | {
78 | "cell_type": "code",
79 | "execution_count": 4,
80 | "metadata": {},
81 | "outputs": [],
82 | "source": [
83 | "# Convert class vectors to binary class matrices.\n",
84 | "y_train = keras.utils.to_categorical(y_train, num_classes)\n",
85 | "y_test = keras.utils.to_categorical(y_test, num_classes)"
86 | ]
87 | },
88 | {
89 | "cell_type": "code",
90 | "execution_count": 5,
91 | "metadata": {},
92 | "outputs": [],
93 | "source": [
94 | "\n",
95 | "model = Sequential()\n",
96 | "model.add(Conv2D(32, (3, 3), padding='same',\n",
97 | " input_shape=x_train.shape[1:]))\n",
98 | "model.add(Activation('relu'))\n",
99 | "model.add(Conv2D(32, (3, 3)))\n",
100 | "model.add(Activation('relu'))\n",
101 | "model.add(MaxPooling2D(pool_size=(2, 2)))\n",
102 | "model.add(Dropout(0.25))\n",
103 | "\n",
104 | "model.add(Conv2D(64, (3, 3), padding='same'))\n",
105 | "model.add(Activation('relu'))\n",
106 | "model.add(Conv2D(64, (3, 3)))\n",
107 | "model.add(Activation('relu'))\n",
108 | "model.add(MaxPooling2D(pool_size=(2, 2)))\n",
109 | "model.add(Dropout(0.25))\n",
110 | "\n",
111 | "model.add(Flatten())\n",
112 | "model.add(Dense(512))\n",
113 | "model.add(Activation('relu'))\n",
114 | "model.add(Dropout(0.5))\n",
115 | "model.add(Dense(num_classes))\n",
116 | "model.add(Activation('softmax'))\n"
117 | ]
118 | },
119 | {
120 | "cell_type": "code",
121 | "execution_count": 6,
122 | "metadata": {},
123 | "outputs": [],
124 | "source": [
125 | "# initiate RMSprop optimizer\n",
126 | "opt = RMSprop(lr=0.0001, decay=1e-6)"
127 | ]
128 | },
129 | {
130 | "cell_type": "code",
131 | "execution_count": 7,
132 | "metadata": {},
133 | "outputs": [],
134 | "source": [
135 | "# Let's train the model using RMSprop\n",
136 | "model.compile(loss='categorical_crossentropy',\n",
137 | " optimizer=opt,\n",
138 | " metrics=['accuracy'])"
139 | ]
140 | },
141 | {
142 | "cell_type": "code",
143 | "execution_count": 8,
144 | "metadata": {},
145 | "outputs": [],
146 | "source": [
147 | "x_train = x_train.astype('float32')\n",
148 | "x_test = x_test.astype('float32')\n",
149 | "x_train /= 255\n",
150 | "x_test /= 255"
151 | ]
152 | },
153 | {
154 | "cell_type": "code",
155 | "execution_count": 9,
156 | "metadata": {},
157 | "outputs": [
158 | {
159 | "name": "stdout",
160 | "output_type": "stream",
161 | "text": [
162 | "Train on 50000 samples, validate on 10000 samples\n",
163 | "Epoch 1/100\n",
164 | " 6336/50000 [==>...........................] - ETA: 3:47 - loss: 2.1874 - accuracy: 0.1776"
165 | ]
166 | },
167 | {
168 | "ename": "KeyboardInterrupt",
169 | "evalue": "",
170 | "output_type": "error",
171 | "traceback": [
172 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
173 | "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)",
174 | "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[1;32m 3\u001b[0m \u001b[0mepochs\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mepochs\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 4\u001b[0m \u001b[0mvalidation_data\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mx_test\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0my_test\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 5\u001b[0;31m shuffle=True)\n\u001b[0m",
175 | "\u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py\u001b[0m in \u001b[0;36mfit\u001b[0;34m(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_freq, max_queue_size, workers, use_multiprocessing, **kwargs)\u001b[0m\n\u001b[1;32m 871\u001b[0m \u001b[0mvalidation_steps\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mvalidation_steps\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 872\u001b[0m \u001b[0mvalidation_freq\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mvalidation_freq\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 873\u001b[0;31m steps_name='steps_per_epoch')\n\u001b[0m\u001b[1;32m 874\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 875\u001b[0m def evaluate(self,\n",
176 | "\u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/keras/engine/training_arrays.py\u001b[0m in \u001b[0;36mmodel_iteration\u001b[0;34m(model, inputs, targets, sample_weights, batch_size, epochs, verbose, callbacks, val_inputs, val_targets, val_sample_weights, shuffle, initial_epoch, steps_per_epoch, validation_steps, validation_freq, mode, validation_in_fit, prepared_feed_values_from_dataset, steps_name, **kwargs)\u001b[0m\n\u001b[1;32m 350\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 351\u001b[0m \u001b[0;31m# Get outputs.\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 352\u001b[0;31m \u001b[0mbatch_outs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mf\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mins_batch\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 353\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0misinstance\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mbatch_outs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlist\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 354\u001b[0m \u001b[0mbatch_outs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mbatch_outs\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
177 | "\u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/keras/backend.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, inputs)\u001b[0m\n\u001b[1;32m 3215\u001b[0m \u001b[0mvalue\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mmath_ops\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcast\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvalue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtensor\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdtype\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 3216\u001b[0m \u001b[0mconverted_inputs\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvalue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 3217\u001b[0;31m \u001b[0moutputs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_graph_fn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0mconverted_inputs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 3218\u001b[0m return nest.pack_sequence_as(self._outputs_structure,\n\u001b[1;32m 3219\u001b[0m [x.numpy() for x in outputs])\n",
178 | "\u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/eager/function.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, *args, **kwargs)\u001b[0m\n\u001b[1;32m 556\u001b[0m raise TypeError(\"Keyword arguments {} unknown. Expected {}.\".format(\n\u001b[1;32m 557\u001b[0m list(kwargs.keys()), list(self._arg_keywords)))\n\u001b[0;32m--> 558\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_call_flat\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 559\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 560\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_filtered_call\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0margs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkwargs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
179 | "\u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/eager/function.py\u001b[0m in \u001b[0;36m_call_flat\u001b[0;34m(self, args)\u001b[0m\n\u001b[1;32m 625\u001b[0m \u001b[0;31m# Only need to override the gradient in graph mode and when we have outputs.\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 626\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mcontext\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mexecuting_eagerly\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0moutputs\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 627\u001b[0;31m \u001b[0moutputs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_inference_function\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcall\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mctx\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 628\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 629\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_register_gradient\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
180 | "\u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/eager/function.py\u001b[0m in \u001b[0;36mcall\u001b[0;34m(self, ctx, args)\u001b[0m\n\u001b[1;32m 413\u001b[0m attrs=(\"executor_type\", executor_type,\n\u001b[1;32m 414\u001b[0m \"config_proto\", config),\n\u001b[0;32m--> 415\u001b[0;31m ctx=ctx)\n\u001b[0m\u001b[1;32m 416\u001b[0m \u001b[0;31m# Replace empty list with None\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 417\u001b[0m \u001b[0moutputs\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0moutputs\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
181 | "\u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/eager/execute.py\u001b[0m in \u001b[0;36mquick_execute\u001b[0;34m(op_name, num_outputs, inputs, attrs, ctx, name)\u001b[0m\n\u001b[1;32m 58\u001b[0m tensors = pywrap_tensorflow.TFE_Py_Execute(ctx._handle, device_name,\n\u001b[1;32m 59\u001b[0m \u001b[0mop_name\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minputs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mattrs\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 60\u001b[0;31m num_outputs)\n\u001b[0m\u001b[1;32m 61\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0mcore\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_NotOkStatusException\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 62\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mname\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
182 | "\u001b[0;31mKeyboardInterrupt\u001b[0m: "
183 | ]
184 | }
185 | ],
186 | "source": [
187 | "model.fit(x_train, y_train,\n",
188 | " batch_size=batch_size,\n",
189 | " epochs=epochs,\n",
190 | " validation_data=(x_test, y_test),\n",
191 | " shuffle=True)"
192 | ]
193 | },
194 | {
195 | "cell_type": "code",
196 | "execution_count": 10,
197 | "metadata": {},
198 | "outputs": [
199 | {
200 | "name": "stdout",
201 | "output_type": "stream",
202 | "text": [
203 | "Saved trained model at /Users/muhammadhamzajaved/mhj/0-Packt/Course-2-Tensorflow20/Section 02/Jupyter Notebooks/saved_models/keras_cifar10_trained_model.h5 \n"
204 | ]
205 | }
206 | ],
207 | "source": [
208 | "# Save model and weights\n",
209 | "if not os.path.isdir(save_dir):\n",
210 | " os.makedirs(save_dir)\n",
211 | "model_path = os.path.join(save_dir, model_name)\n",
212 | "model.save(model_path)\n",
213 | "print('Saved trained model at %s ' % model_path)"
214 | ]
215 | },
216 | {
217 | "cell_type": "code",
218 | "execution_count": 11,
219 | "metadata": {},
220 | "outputs": [
221 | {
222 | "name": "stdout",
223 | "output_type": "stream",
224 | "text": [
225 | "10000/10000 [==============================] - 12s 1ms/sample - loss: 2.0172 - accuracy: 0.2607\n",
226 | "Test loss: 2.0172414276123045\n",
227 | "Test accuracy: 0.2607\n"
228 | ]
229 | }
230 | ],
231 | "source": [
232 | "# Score trained model.\n",
233 | "scores = model.evaluate(x_test, y_test, verbose=1)\n",
234 | "print('Test loss:', scores[0])\n",
235 | "print('Test accuracy:', scores[1])"
236 | ]
237 | },
238 | {
239 | "cell_type": "code",
240 | "execution_count": null,
241 | "metadata": {},
242 | "outputs": [],
243 | "source": []
244 | }
245 | ],
246 | "metadata": {
247 | "kernelspec": {
248 | "display_name": "Python 3",
249 | "language": "python",
250 | "name": "python3"
251 | },
252 | "language_info": {
253 | "codemirror_mode": {
254 | "name": "ipython",
255 | "version": 3
256 | },
257 | "file_extension": ".py",
258 | "mimetype": "text/x-python",
259 | "name": "python",
260 | "nbconvert_exporter": "python",
261 | "pygments_lexer": "ipython3",
262 | "version": "3.7.1"
263 | }
264 | },
265 | "nbformat": 4,
266 | "nbformat_minor": 2
267 | }
268 |
--------------------------------------------------------------------------------
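
The training run above was interrupted, but the (partially trained) model was still saved to `saved_models/keras_cifar10_trained_model.h5`, which explains the low 0.26 test accuracy. A minimal sketch of reloading that file for evaluation or further training, assuming the notebook above has been run so the file exists:

```python
import os
import tensorflow as tf
from tensorflow.keras.datasets import cifar10

model_path = os.path.join(os.getcwd(), 'saved_models',
                          'keras_cifar10_trained_model.h5')

# load_model restores architecture, weights, and optimizer state from the .h5 file
model = tf.keras.models.load_model(model_path)

# Prepare the test set the same way as in the notebook above.
(_, _), (x_test, y_test) = cifar10.load_data()
x_test = x_test.astype('float32') / 255
y_test = tf.keras.utils.to_categorical(y_test, 10)

loss, acc = model.evaluate(x_test, y_test, verbose=1)
print('Test loss:', loss, 'Test accuracy:', acc)
```
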
/Section 02/3- MNIST.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Mnist"
8 | ]
9 | },
10 | {
11 | "cell_type": "markdown",
12 | "metadata": {},
13 | "source": [
14 | "## Get started with TensorFlow 2.0"
15 | ]
16 | },
17 | {
18 | "cell_type": "markdown",
19 | "metadata": {},
20 | "source": [
21 | "### To get started, import the TensorFlow library into your program"
22 | ]
23 | },
24 | {
25 | "cell_type": "code",
26 | "execution_count": 1,
27 | "metadata": {},
28 | "outputs": [],
29 | "source": [
30 | "import tensorflow as tf"
31 | ]
32 | },
33 | {
34 | "cell_type": "code",
35 | "execution_count": 2,
36 | "metadata": {},
37 | "outputs": [
38 | {
39 | "data": {
40 | "text/plain": [
41 | "'2.0.0-beta0'"
42 | ]
43 | },
44 | "execution_count": 2,
45 | "metadata": {},
46 | "output_type": "execute_result"
47 | }
48 | ],
49 | "source": [
50 | "tf.__version__"
51 | ]
52 | },
53 | {
54 | "cell_type": "markdown",
55 | "metadata": {},
56 | "source": [
57 | "### Load and prepare the MNIST dataset."
58 | ]
59 | },
60 | {
61 | "cell_type": "code",
62 | "execution_count": 3,
63 | "metadata": {},
64 | "outputs": [],
65 | "source": [
66 | "mnist = tf.keras.datasets.mnist\n",
67 | "\n",
68 | "(x_train, y_train), (x_test, y_test) = mnist.load_data()"
69 | ]
70 | },
71 | {
72 | "cell_type": "code",
73 | "execution_count": 4,
74 | "metadata": {},
75 | "outputs": [
76 | {
77 | "name": "stdout",
78 | "output_type": "stream",
79 | "text": [
80 | "========================================\n",
81 | "x_train details\n",
82 | "========================================\n",
83 | "Rank of x_train: 3\n",
84 | "len of x_train: 60000\n",
85 | "Shapeof 0th index of x_train: (28, 28)\n",
86 | "Shape of x_train: (60000, 28, 28)\n",
87 | "\n",
88 | "========================================\n",
89 | "y_train details\n",
90 | "========================================\n",
91 | "Rank of y_train: 1\n",
92 | "len of y_train: 60000\n",
93 | "Shape of y_train: (60000,)\n"
94 | ]
95 | }
96 | ],
97 | "source": [
98 | "print('========================================')\n",
99 | "print('x_train details')\n",
100 | "print('========================================')\n",
101 | "print('Rank of x_train: ',x_train.ndim)\n",
102 | "print('len of x_train: ', len(x_train))\n",
103 | "print('Shapeof 0th index of x_train: ',x_train[0].shape)\n",
104 | "print('Shape of x_train: ',x_train.shape)\n",
105 | "\n",
106 | "\n",
107 | "print()\n",
108 | "\n",
109 | "print('========================================')\n",
110 | "print('y_train details')\n",
111 | "print('========================================')\n",
112 | "print('Rank of y_train: ',y_train.ndim)\n",
113 | "print('len of y_train: ', len(y_train))\n",
114 | "print('Shape of y_train: ',y_train.shape)"
115 | ]
116 | },
117 | {
118 | "cell_type": "markdown",
119 | "metadata": {},
120 | "source": [
121 | "### Convert the samples from integers to floating-point numbers:"
122 | ]
123 | },
124 | {
125 | "cell_type": "code",
126 | "execution_count": 5,
127 | "metadata": {},
128 | "outputs": [],
129 | "source": [
130 | "x_train, x_test = x_train / 255.0, x_test / 255.0"
131 | ]
132 | },
133 | {
134 | "cell_type": "markdown",
135 | "metadata": {},
136 | "source": [
137 | "### Build the tensorflow.keras.Sequential model by stacking layers."
138 | ]
139 | },
140 | {
141 | "cell_type": "code",
142 | "execution_count": 6,
143 | "metadata": {},
144 | "outputs": [],
145 | "source": [
146 | "from tensorflow.keras.models import Sequential\n",
147 | "\n",
148 | "model = Sequential([\n",
149 | " tf.keras.layers.Flatten(input_shape=(28, 28)),\n",
150 | " tf.keras.layers.Dense(128, activation='relu'),\n",
151 | " tf.keras.layers.Dropout(0.2),\n",
152 | " tf.keras.layers.Dense(10, activation='softmax')\n",
153 | "])\n",
154 | "\n",
155 | "model.compile(optimizer='adam',\n",
156 | " loss='sparse_categorical_crossentropy',\n",
157 | " metrics=['accuracy'])"
158 | ]
159 | },
160 | {
161 | "cell_type": "markdown",
162 | "metadata": {},
163 | "source": [
164 | "### Train and evaluate model"
165 | ]
166 | },
167 | {
168 | "cell_type": "code",
169 | "execution_count": 6,
170 | "metadata": {},
171 | "outputs": [
172 | {
173 | "name": "stdout",
174 | "output_type": "stream",
175 | "text": [
176 | "Epoch 1/5\n",
177 | "60000/60000 [==============================] - 7s 117us/sample - loss: 0.2973 - accuracy: 0.9137\n",
178 | "Epoch 2/5\n",
179 | "60000/60000 [==============================] - 7s 110us/sample - loss: 0.1426 - accuracy: 0.9573\n",
180 | "Epoch 3/5\n",
181 | "60000/60000 [==============================] - 7s 115us/sample - loss: 0.1046 - accuracy: 0.9674\n",
182 | "Epoch 4/5\n",
183 | "60000/60000 [==============================] - 7s 114us/sample - loss: 0.0854 - accuracy: 0.9731\n",
184 | "Epoch 5/5\n",
185 | "60000/60000 [==============================] - 9s 143us/sample - loss: 0.0750 - accuracy: 0.9764\n",
186 | "10000/10000 [==============================] - 1s 70us/sample - loss: 0.0739 - accuracy: 0.9778\n"
187 | ]
188 | },
189 | {
190 | "data": {
191 | "text/plain": [
192 | "[0.07386075351759791, 0.9778]"
193 | ]
194 | },
195 | "execution_count": 6,
196 | "metadata": {},
197 | "output_type": "execute_result"
198 | }
199 | ],
200 | "source": [
201 | "model.fit(x_train, y_train, epochs=5)\n",
202 | "\n",
203 | "model.evaluate(x_test, y_test)"
204 | ]
205 | },
206 | {
207 | "cell_type": "markdown",
208 | "metadata": {},
209 | "source": [
210 | "### The image classifier is now trained to ~98% accuracy on this dataset."
211 | ]
212 | },
213 | {
214 | "cell_type": "code",
215 | "execution_count": 7,
216 | "metadata": {},
217 | "outputs": [
218 | {
219 | "data": {
220 | "text/plain": [
221 | "5"
222 | ]
223 | },
224 | "execution_count": 7,
225 | "metadata": {},
226 | "output_type": "execute_result"
227 | }
228 | ],
229 | "source": [
230 | "import numpy as np\n",
231 | "\n",
232 | "# Argmax: Returns the indices of the maximum values along an axis.\n",
233 | "np.argmax(model.predict([[x_train[0]]]))"
234 | ]
235 | },
236 | {
237 | "cell_type": "code",
238 | "execution_count": null,
239 | "metadata": {},
240 | "outputs": [],
241 | "source": []
242 | }
243 | ],
244 | "metadata": {
245 | "kernelspec": {
246 | "display_name": "Python 3",
247 | "language": "python",
248 | "name": "python3"
249 | },
250 | "language_info": {
251 | "codemirror_mode": {
252 | "name": "ipython",
253 | "version": 3
254 | },
255 | "file_extension": ".py",
256 | "mimetype": "text/x-python",
257 | "name": "python",
258 | "nbconvert_exporter": "python",
259 | "pygments_lexer": "ipython3",
260 | "version": "3.7.1"
261 | }
262 | },
263 | "nbformat": 4,
264 | "nbformat_minor": 2
265 | }
266 |
--------------------------------------------------------------------------------
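
The final cell above predicts a single digit by wrapping it in a nested list. Prediction is usually done on a whole batch at once; a minimal sketch, assuming the trained `model` and the `x_test`/`y_test` arrays from the notebook above:

```python
import numpy as np

# predict returns one row of 10 class probabilities per input image
probs = model.predict(x_test[:10])      # shape: (10, 10)
predicted = np.argmax(probs, axis=1)    # most probable class per image

print("Predicted:", predicted)
print("Actual:   ", y_test[:10])
```
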
/Section 03/CNN.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Working with CNN's for Computer Vision \n",
8 | "\n",
9 | "This section covers the convolutional neural networks or covnets widely used in computer vision applications. \n"
10 | ]
11 | },
12 | {
13 | "cell_type": "markdown",
14 | "metadata": {},
15 | "source": [
16 | "## V02 - Download the dataset and making Train and Test sets"
17 | ]
18 | },
19 | {
20 | "cell_type": "markdown",
21 | "metadata": {},
22 | "source": [
23 | "### Download the images from image-net"
24 | ]
25 | },
26 | {
27 | "cell_type": "code",
28 | "execution_count": null,
29 | "metadata": {},
30 | "outputs": [],
31 | "source": [
32 | "import urllib\n",
33 | "import cv2\n",
34 | "import imutils\n",
35 | "import numpy as np\n",
36 | "import os\n",
37 | "\n",
38 | "pic_num = 0\n",
39 | "dir_name = 'shoe'\n",
40 | "wnid = 'n02708093'\n",
41 | "\n",
42 | "neg_images_link = 'http://www.image-net.org/api/text/imagenet.synset.geturls?wnid=' + wnid\n",
43 | "neg_image_urls = urllib.request.urlopen(neg_images_link).read().decode()\n",
44 | "if not os.path.exists(dir_name):\n",
45 | " os.makedirs(dir_name)\n",
46 | "\n",
47 | "for i in neg_image_urls.split('\\n'):\n",
48 | " try:\n",
49 | " print('Downloading ', i)\n",
50 | " urllib.request.urlretrieve(i, dir_name + \"/\" + str(pic_num) + \".jpg\")\n",
51 | " img = cv2.imread(dir_name + \"/\" + str(pic_num) + \".jpg\", cv2.IMREAD_GRAYSCALE)\n",
52 | " resized_image = cv2.resize(img, (1000, 1000))\n",
53 | " cv2.imwrite(dir_name + \"/\" + str(pic_num) + \".jpg\", resized_image)\n",
54 | " pic_num += 1\n",
55 | "\n",
56 | " except Exception as e:\n",
57 | " print(str(e))"
58 | ]
59 | },
60 | {
61 | "cell_type": "markdown",
62 | "metadata": {},
63 | "source": [
64 | "## Loading the dataset, and making Train and Test sets"
65 | ]
66 | },
67 | {
68 | "cell_type": "markdown",
69 | "metadata": {},
70 | "source": [
71 | "Here's a script that would load all the downloaded images"
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "execution_count": null,
77 | "metadata": {},
78 | "outputs": [],
79 | "source": [
80 | "import os\n",
81 | "import re\n",
82 | "import numpy as np\n",
83 | "import matplotlib.pyplot as plt\n",
84 | "from scipy import ndimage, misc\n",
85 | "import warnings; warnings.filterwarnings('ignore')\n",
86 | "\n",
87 | "image_size = 80\n",
88 | "\n",
89 | "all_images = []\n",
90 | "all_labels = []\n",
91 | "\n",
92 | "# mapping = {0: '/remote', 1:'/scissor'}\n",
93 | "mapping = {0: '/shoe', 1:'/clock'}\n",
94 | "\n",
95 | "for k,v in mapping.items():\n",
96 | " for root, dirnames, filenames in os.walk(os.path.abspath('') + v):\n",
97 | " for filename in filenames:\n",
98 | " if re.search(\"\\.(jpg|jpeg|png|bmp|tiff)$\", filename):\n",
99 | " filepath = os.path.join(root, filename)\n",
100 | " image = ndimage.imread(filepath, mode=\"L\")\n",
101 | " image_resized = misc.imresize(image, (image_size, image_size))\n",
102 | " all_images.append(image_resized)\n",
103 | " all_labels.append(k)"
104 | ]
105 | },
106 | {
107 | "cell_type": "markdown",
108 | "metadata": {},
109 | "source": [
110 | "### Defining a function to shuffle the entire dataset"
111 | ]
112 | },
113 | {
114 | "cell_type": "code",
115 | "execution_count": null,
116 | "metadata": {},
117 | "outputs": [],
118 | "source": [
119 | "def shuffle_batch(X, y):\n",
120 | " rnd_idx = np.random.permutation(len(X))\n",
121 | "    # apply the same random permutation to images and labels\n",
122 | "    # so that each image keeps its label\n",
123 | "    return X[rnd_idx], y[rnd_idx]"
124 | ]
125 | },
126 | {
127 | "cell_type": "markdown",
128 | "metadata": {},
129 | "source": [
130 | "### Convert the lists to numpy array"
131 | ]
132 | },
133 | {
134 | "cell_type": "code",
135 | "execution_count": null,
136 | "metadata": {},
137 | "outputs": [],
138 | "source": [
139 | "all_images = np.array(all_images)\n",
140 | "all_labels = np.array(all_labels)\n",
141 | "all_images = np.expand_dims(all_images, axis=3)\n",
142 | "\n",
143 | "print('Images', all_images.shape)\n",
144 | "print('Labels', all_labels.shape)"
145 | ]
146 | },
147 | {
148 | "cell_type": "markdown",
149 | "metadata": {},
150 | "source": [
151 | "### Shuffle entire dataset"
152 | ]
153 | },
154 | {
155 | "cell_type": "code",
156 | "execution_count": null,
157 | "metadata": {},
158 | "outputs": [],
159 | "source": [
160 | "all_images, all_labels = shuffle_batch(all_images, all_labels)"
161 | ]
162 | },
163 | {
164 | "cell_type": "markdown",
165 | "metadata": {},
166 | "source": [
167 | "### Spliting into Train (80%) and Test (20%) sets"
168 | ]
169 | },
170 | {
171 | "cell_type": "code",
172 | "execution_count": null,
173 | "metadata": {},
174 | "outputs": [],
175 | "source": [
176 | "percent = int(len(all_images) * 0.8)\n",
177 | "\n",
178 | "train_images, test_images = all_images[:percent], all_images[percent:]\n",
179 | "train_labels, test_labels = all_labels[:percent], all_labels[percent:]\n",
180 | "\n",
181 | "print('Total', len(all_images))\n",
182 | "print('Train images', train_images.shape)\n",
183 | "print('Test images', test_images.shape)\n",
184 | "print('Train labels', train_labels.shape)\n",
185 | "print('Test labels', test_labels.shape)"
186 | ]
187 | },
188 | {
189 | "cell_type": "markdown",
190 | "metadata": {},
191 | "source": [
192 | "# V03 - Dataset Preprocessing"
193 | ]
194 | },
195 | {
196 | "cell_type": "code",
197 | "execution_count": null,
198 | "metadata": {},
199 | "outputs": [],
200 | "source": [
201 | "test_images = test_images.astype('float32') / 255\n",
202 | "train_images = train_images.astype('float32') / 255"
203 | ]
204 | },
205 | {
206 | "cell_type": "code",
207 | "execution_count": null,
208 | "metadata": {},
209 | "outputs": [],
210 | "source": [
211 | "import cv2\n",
212 | "\n",
213 | "def denoise(image):\n",
214 | " image = cv2.GaussianBlur(image, (5, 5), 0)\n",
215 | " return image\n",
216 | "\n",
217 | "def resize(image, size):\n",
218 | " image = cv2.resize(image, (size, size))\n",
219 | " return image"
220 | ]
221 | },
222 | {
223 | "cell_type": "code",
224 | "execution_count": null,
225 | "metadata": {},
226 | "outputs": [],
227 | "source": [
228 | "index = 10\n",
229 | "%matplotlib inline\n",
230 | "image = train_images[index]\n",
231 | "plt.imshow(np.squeeze(image, axis=(2,)))"
232 | ]
233 | },
234 | {
235 | "cell_type": "code",
236 | "execution_count": null,
237 | "metadata": {},
238 | "outputs": [],
239 | "source": [
240 | "image = train_images[index]\n",
241 | "image = np.squeeze(image, axis=(2,))\n",
242 | "\n",
243 | "plt.imshow(resize(image, 150))\n",
244 | "# plt.imshow(denoise(image))"
245 | ]
246 | },
247 | {
248 | "cell_type": "markdown",
249 | "metadata": {},
250 | "source": [
251 | "# V04 - Building CNN Model from scratch"
252 | ]
253 | },
254 | {
255 | "cell_type": "code",
256 | "execution_count": null,
257 | "metadata": {},
258 | "outputs": [],
259 | "source": [
260 | "from tensorflow.keras import layers\n",
261 | "from tensorflow.keras import models\n",
262 | "\n",
263 | "import tensorflow as tf\n",
264 | "\n",
265 | "print('tf version', tf.__version__)\n",
266 | "\n",
267 | "model = models.Sequential()\n",
268 | "\n",
269 | "model.add(layers.Conv2D(64, (3, 3), activation='relu', \n",
270 | " input_shape=(image_size, image_size, 1)))\n",
271 | "\n",
272 | "model.add(layers.MaxPooling2D((2, 2)))\n",
273 | "\n",
274 | "model.add(layers.Conv2D(64, (3, 3), activation='relu'))\n",
275 | "\n",
276 | "model.add(layers.MaxPooling2D((2, 2)))\n",
277 | "\n",
278 | "model.add(layers.Conv2D(64, (3, 3), activation='relu'))"
279 | ]
280 | },
281 | {
282 | "cell_type": "code",
283 | "execution_count": null,
284 | "metadata": {},
285 | "outputs": [],
286 | "source": [
287 | "model.summary()"
288 | ]
289 | },
290 | {
291 | "cell_type": "code",
292 | "execution_count": null,
293 | "metadata": {},
294 | "outputs": [],
295 | "source": [
296 | "model.add(layers.Flatten())\n",
297 | "model.add(layers.Dense(32, activation='relu'))\n",
298 | "model.add(layers.Dense(1, activation='sigmoid'))"
299 | ]
300 | },
301 | {
302 | "cell_type": "code",
303 | "execution_count": null,
304 | "metadata": {},
305 | "outputs": [],
306 | "source": [
307 | "model.summary()"
308 | ]
309 | },
310 | {
311 | "cell_type": "markdown",
312 | "metadata": {},
313 | "source": [
314 | "### Compile the model with optimizer, loss function and metrics"
315 | ]
316 | },
317 | {
318 | "cell_type": "code",
319 | "execution_count": null,
320 | "metadata": {},
321 | "outputs": [],
322 | "source": [
323 | "model.compile(optimizer='rmsprop',\n",
324 | "loss='binary_crossentropy',\n",
325 | "metrics=['accuracy'])"
326 | ]
327 | },
328 | {
329 | "cell_type": "markdown",
330 | "metadata": {},
331 | "source": [
332 | "### Training the model"
333 | ]
334 | },
335 | {
336 | "cell_type": "code",
337 | "execution_count": null,
338 | "metadata": {},
339 | "outputs": [],
340 | "source": [
341 | "model.fit(train_images, train_labels, epochs=30)"
342 | ]
343 | },
344 | {
345 | "cell_type": "code",
346 | "execution_count": null,
347 | "metadata": {},
348 | "outputs": [],
349 | "source": [
350 | "test_loss, test_acc = model.evaluate(test_images, test_labels)"
351 | ]
352 | },
353 | {
354 | "cell_type": "code",
355 | "execution_count": null,
356 | "metadata": {},
357 | "outputs": [],
358 | "source": [
359 | "test_loss, test_acc = model.evaluate(train_images, train_labels)"
360 | ]
361 | },
362 | {
363 | "cell_type": "code",
364 | "execution_count": null,
365 | "metadata": {},
366 | "outputs": [],
367 | "source": [
368 | "import time\n",
369 | "from random import randint\n",
370 | "\n",
371 | "for i in range(20):\n",
372 | " index = randint(0, len(train_images))\n",
373 | " image = train_images[index]\n",
374 | " label = train_labels[index]\n",
375 | " predicted = 0\n",
376 | " if model.predict([[image]])[0][0] >= 0.5:\n",
377 | " predicted = 1\n",
378 | " print('Actual', label, 'Predicted', predicted, label == predicted)\n"
379 | ]
380 | },
381 | {
382 | "cell_type": "code",
383 | "execution_count": null,
384 | "metadata": {},
385 | "outputs": [],
386 | "source": [
387 | "index = 60\n",
388 | "\n",
389 | "image = train_images[index]\n",
390 | "label = train_labels[index]\n",
391 | "plt.imshow(np.squeeze(image, axis=(2,)))\n",
392 | "\n",
393 | "print('Actual', label)\n",
394 | "\n",
395 | "predicted = 0\n",
396 | "if model.predict([[image]])[0][0] >= 0.5:\n",
397 | " predicted = 1\n",
398 | "\n",
399 | "print('Predicted', predicted)\n"
400 | ]
401 | },
402 | {
403 | "cell_type": "code",
404 | "execution_count": null,
405 | "metadata": {},
406 | "outputs": [],
407 | "source": [
408 | "train_images.shape"
409 | ]
410 | },
411 | {
412 | "cell_type": "code",
413 | "execution_count": null,
414 | "metadata": {},
415 | "outputs": [],
416 | "source": []
417 | },
418 | {
419 | "cell_type": "markdown",
420 | "metadata": {},
421 | "source": [
422 | "# Data Augmentation"
423 | ]
424 | },
425 | {
426 | "cell_type": "code",
427 | "execution_count": null,
428 | "metadata": {},
429 | "outputs": [],
430 | "source": [
431 | "from keras.preprocessing.image import ImageDataGenerator\n",
432 | "from matplotlib import pyplot\n",
433 | "from keras import backend as K\n",
434 | "\n",
435 | "# datagen = ImageDataGenerator(rotation_range=90)\n",
436 | "datagen = ImageDataGenerator(horizontal_flip=True, vertical_flip=True)\n",
437 | "\n",
438 | "datagen.fit(train_images)\n",
439 | "\n",
440 | "rotated_images = []\n",
441 | "rotated_images_labels = []\n",
442 | "\n",
443 | "for X_batch, y_batch in datagen.flow(train_images, train_labels, \n",
444 | " batch_size=len(train_images), \n",
445 | " shuffle=False):\n",
446 | " rotated_images = X_batch\n",
447 | " rotated_images_labels = y_batch\n",
448 | " print(len(rotated_images))\n",
449 | " break\n"
450 | ]
451 | },
452 | {
453 | "cell_type": "code",
454 | "execution_count": null,
455 | "metadata": {},
456 | "outputs": [],
457 | "source": [
458 | "index = 62\n",
459 | "%matplotlib inline\n",
460 | "image = train_images[index]\n",
461 | "image_rotated = rotated_images[index]\n",
462 | "plt.imshow(np.squeeze(image, axis=(2,)))"
463 | ]
464 | },
465 | {
466 | "cell_type": "code",
467 | "execution_count": null,
468 | "metadata": {},
469 | "outputs": [],
470 | "source": [
471 | "plt.imshow(np.squeeze(image_rotated, axis=(2,)))"
472 | ]
473 | },
474 | {
475 | "cell_type": "code",
476 | "execution_count": null,
477 | "metadata": {},
478 | "outputs": [],
479 | "source": []
480 | }
481 | ],
482 | "metadata": {
483 | "kernelspec": {
484 | "display_name": "Python 3",
485 | "language": "python",
486 | "name": "python3"
487 | },
488 | "language_info": {
489 | "codemirror_mode": {
490 | "name": "ipython",
491 | "version": 3
492 | },
493 | "file_extension": ".py",
494 | "mimetype": "text/x-python",
495 | "name": "python",
496 | "nbconvert_exporter": "python",
497 | "pygments_lexer": "ipython3",
498 | "version": "3.7.1"
499 | }
500 | },
501 | "nbformat": 4,
502 | "nbformat_minor": 2
503 | }
504 |
--------------------------------------------------------------------------------
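
The augmentation cells above materialize one flipped copy of every training image in a single `datagen.flow(...)` batch. A common alternative is to let the generator produce randomly augmented batches on the fly during training; a minimal sketch, assuming the `model`, `train_images`, and `train_labels` built in the notebook above:

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Randomly flip images at batch-creation time instead of
# building a fixed augmented copy of the dataset.
datagen = ImageDataGenerator(horizontal_flip=True, vertical_flip=True)

# fit_generator was the TF 2.0-era entry point; newer TF releases
# accept the generator directly in model.fit.
model.fit_generator(datagen.flow(train_images, train_labels, batch_size=32),
                    steps_per_epoch=len(train_images) // 32,
                    epochs=30)
```
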
/Section 04/V04 - Building LSTM model for text data and getting the results.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Building LSTM model for text data and getting the results"
8 | ]
9 | },
10 | {
11 | "cell_type": "markdown",
12 | "metadata": {},
13 | "source": [
14 | "### Start by importing the SimpleRNN layer"
15 | ]
16 | },
17 | {
18 | "cell_type": "code",
19 | "execution_count": 1,
20 | "metadata": {},
21 | "outputs": [
22 | {
23 | "name": "stdout",
24 | "output_type": "stream",
25 | "text": [
26 | "tf version 2.0.0-beta0\n"
27 | ]
28 | }
29 | ],
30 | "source": [
31 | "from tensorflow.keras.models import Sequential\n",
32 | "from tensorflow.keras.layers import Embedding, SimpleRNN\n",
33 | "import tensorflow as tf\n",
34 | "print('tf version', tf.__version__)\n",
35 | "\n",
36 | "model = Sequential()\n",
37 | "\n",
38 | "# Word embeddings are dense representation of words and their relative meanings. \n",
39 | "# They can be learned from text data and reused among projects. \n",
40 | "# They can also be learned as part of fitting a neural network on text data.\n",
41 | "\n",
42 | "model.add(Embedding(10000, 32))\n",
43 | "model.add(SimpleRNN(32))"
44 | ]
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "### Let's see how the model looks\n",
51 | "\n",
52 | "It has over 322,000 parameters"
53 | ]
54 | },
55 | {
56 | "cell_type": "code",
57 | "execution_count": 2,
58 | "metadata": {},
59 | "outputs": [
60 | {
61 | "name": "stdout",
62 | "output_type": "stream",
63 | "text": [
64 | "Model: \"sequential\"\n",
65 | "_________________________________________________________________\n",
66 | "Layer (type) Output Shape Param # \n",
67 | "=================================================================\n",
68 | "embedding (Embedding) (None, None, 32) 320000 \n",
69 | "_________________________________________________________________\n",
70 | "simple_rnn (SimpleRNN) (None, 32) 2080 \n",
71 | "=================================================================\n",
72 | "Total params: 322,080\n",
73 | "Trainable params: 322,080\n",
74 | "Non-trainable params: 0\n",
75 | "_________________________________________________________________\n"
76 | ]
77 | }
78 | ],
79 | "source": [
80 | "model.summary()"
81 | ]
82 | },
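83 | {
84 | "cell_type": "markdown",
85 | "metadata": {},
86 | "source": [
87 | "Those counts can be verified by hand: the `Embedding` layer stores $10000 \\times 32 = 320000$ weights, and the `SimpleRNN` layer stores $32 \\times 32$ input weights, $32 \\times 32$ recurrent weights, and $32$ biases, i.e. $1024 + 1024 + 32 = 2080$ parameters."
88 | ]
89 | },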
83 | {
84 | "cell_type": "code",
85 | "execution_count": null,
86 | "metadata": {},
87 | "outputs": [],
88 | "source": []
89 | },
90 | {
91 | "cell_type": "code",
92 | "execution_count": null,
93 | "metadata": {},
94 | "outputs": [],
95 | "source": [
96 | "model = Sequential()\n",
97 | "model.add(Embedding(10000, 32))\n",
98 | "model.add(SimpleRNN(32, return_sequences=True))\n",
99 | "\n",
100 | "model.summary()"
101 | ]
102 | },
103 | {
104 | "cell_type": "markdown",
105 | "metadata": {},
106 | "source": [
107 | "It is sometimes useful to stack several recurrent layers one after the other in order to increase the representational power of a network. "
108 | ]
109 | },
110 | {
111 | "cell_type": "code",
112 | "execution_count": 3,
113 | "metadata": {},
114 | "outputs": [],
115 | "source": [
116 | "model = Sequential()\n",
117 | "model.add(Embedding(10000, 32))\n",
118 | "\n",
119 | "model.add(SimpleRNN(32, return_sequences=True))\n",
120 | "model.add(SimpleRNN(32, return_sequences=True))\n",
121 | "model.add(SimpleRNN(32, return_sequences=True))\n",
122 | "# return_sequences: Boolean. Whether to return the last output\n",
123 | "# in the output sequence, or the full sequence.\n",
124 | "model.add(SimpleRNN(32))"
125 | ]
126 | },
127 | {
128 | "cell_type": "code",
129 | "execution_count": 4,
130 | "metadata": {},
131 | "outputs": [
132 | {
133 | "name": "stdout",
134 | "output_type": "stream",
135 | "text": [
136 | "Model: \"sequential_1\"\n",
137 | "_________________________________________________________________\n",
138 | "Layer (type) Output Shape Param # \n",
139 | "=================================================================\n",
140 | "embedding_1 (Embedding) (None, None, 32) 320000 \n",
141 | "_________________________________________________________________\n",
142 | "simple_rnn_1 (SimpleRNN) (None, None, 32) 2080 \n",
143 | "_________________________________________________________________\n",
144 | "simple_rnn_2 (SimpleRNN) (None, None, 32) 2080 \n",
145 | "_________________________________________________________________\n",
146 | "simple_rnn_3 (SimpleRNN) (None, None, 32) 2080 \n",
147 | "_________________________________________________________________\n",
148 | "simple_rnn_4 (SimpleRNN) (None, 32) 2080 \n",
149 | "=================================================================\n",
150 | "Total params: 328,320\n",
151 | "Trainable params: 328,320\n",
152 | "Non-trainable params: 0\n",
153 | "_________________________________________________________________\n"
154 | ]
155 | }
156 | ],
157 | "source": [
158 | "model.summary()"
159 | ]
160 | },
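161 | {
162 | "cell_type": "markdown",
163 | "metadata": {},
164 | "source": [
165 | "Note that every layer except the last uses `return_sequences=True`: each intermediate `SimpleRNN` must emit its full output sequence (the `(None, None, 32)` shapes above) so that the next recurrent layer still has a timestep axis to iterate over; only the final layer collapses to its last output."
166 | ]
167 | },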
161 | {
162 | "cell_type": "markdown",
163 | "metadata": {},
164 | "source": [
165 | "Now let's try to use such a model on the IMDB movie review classification problem. First, let's preprocess the data:"
166 | ]
167 | },
168 | {
169 | "cell_type": "code",
170 | "execution_count": 5,
171 | "metadata": {},
172 | "outputs": [
180 | {
181 | "name": "stdout",
182 | "output_type": "stream",
183 | "text": [
184 | "Loading data...\n",
185 | "25000 train sequences\n",
186 | "25000 test sequences\n",
187 | "Pad sequences (samples x time)\n",
188 | "input_train shape: (25000, 500)\n",
189 | "input_test shape: (25000, 500)\n"
190 | ]
191 | }
192 | ],
193 | "source": [
194 | "from keras.datasets import imdb\n",
195 | "from keras.preprocessing import sequence\n",
196 | "\n",
197 | "max_features = 10000 # number of words to consider as features\n",
198 | "maxlen = 500 # cut texts after 500 words\n",
199 | "batch_size = 32\n",
200 | "\n",
201 | "print('Loading data...')\n",
202 | "(input_train, y_train), (input_test, y_test) = imdb.load_data(num_words=max_features)\n",
203 | "print(len(input_train), 'train sequences')\n",
204 | "print(len(input_test), 'test sequences')\n",
205 | "\n",
206 | "print('Pad sequences (samples x time)')\n",
207 | "input_train = sequence.pad_sequences(input_train, maxlen=maxlen)\n",
208 | "input_test = sequence.pad_sequences(input_test, maxlen=maxlen)\n",
209 | "print('input_train shape:', input_train.shape)\n",
210 | "print('input_test shape:', input_test.shape)"
211 | ]
212 | },
213 | {
214 | "cell_type": "markdown",
215 | "metadata": {},
216 | "source": [
217 | "Let's train a simple recurrent network using an `Embedding` layer and a `SimpleRNN` layer:"
218 | ]
219 | },
220 | {
221 | "cell_type": "code",
222 | "execution_count": 6,
223 | "metadata": {},
224 | "outputs": [
225 | {
226 | "name": "stdout",
227 | "output_type": "stream",
228 | "text": [
229 | "Train on 20000 samples, validate on 5000 samples\n",
230 | "20000/20000 [==============================] - 49s 2ms/sample - loss: 0.6792 - acc: 0.5573 - val_loss: 0.6351 - val_acc: 0.6458\n"
231 | ]
232 | }
233 | ],
234 | "source": [
235 | "from tensorflow.keras.layers import Dense\n",
236 | "\n",
237 | "model = Sequential()\n",
238 | "model.add(Embedding(max_features, 32))\n",
239 | "model.add(SimpleRNN(32))\n",
240 | "model.add(Dense(1, activation='sigmoid'))\n",
241 | "\n",
242 | "model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['acc'])\n",
243 | "\n",
244 | "history = model.fit(input_train, y_train,\n",
245 | " epochs=1,\n",
246 | " batch_size=128,\n",
247 | " validation_split=0.2)"
248 | ]
249 | },
250 | {
251 | "cell_type": "markdown",
252 | "metadata": {},
253 | "source": [
254 | "Let's display the training and validation loss and accuracy:"
255 | ]
256 | },
257 | {
258 | "cell_type": "code",
259 | "execution_count": 7,
260 | "metadata": {},
261 | "outputs": [
262 | {
263 | "name": "stdout",
264 | "output_type": "stream",
265 | "text": [
266 | "Training set accuracy is: [0.55735]\n",
267 | "Validation set accuracy is: [0.6458]\n",
268 | "Training set Loss is: [0.6792083116531372]\n",
269 | "Validation set accuracy is: [0.6350920477867127]\n"
270 | ]
271 | }
272 | ],
273 | "source": [
274 | "import matplotlib.pyplot as plt\n",
275 | "\n",
276 | "acc = history.history['acc']\n",
277 | "val_acc = history.history['val_acc']\n",
278 | "loss = history.history['loss']\n",
279 | "val_loss = history.history['val_loss']\n",
280 | "\n",
281 | "\n",
282 | "print('Training set accuracy is: ', acc)\n",
283 | "print('Validation set accuracy is: ', val_acc)\n",
284 | "print('Training set Loss is: ', loss)\n",
285 | "print('Validation set accuracy is: ', val_loss)\n",
286 | "\n",
287 | "# Of course, you can train it for larger epochs\n",
288 | "# to improve the accuracy"
289 | ]
290 | },
291 | {
292 | "cell_type": "markdown",
293 | "metadata": {},
294 | "source": [
295 | "## 2) Same Example with LSTM - Long Short-term Memory Layer"
296 | ]
297 | },
298 | {
299 | "cell_type": "code",
300 | "execution_count": 8,
301 | "metadata": {},
302 | "outputs": [
303 | {
304 | "name": "stdout",
305 | "output_type": "stream",
306 | "text": [
307 | "Train on 20000 samples, validate on 5000 samples\n",
308 | "20000/20000 [==============================] - 100s 5ms/sample - loss: 0.5059 - acc: 0.7625 - val_loss: 0.3962 - val_acc: 0.8336\n"
309 | ]
310 | }
311 | ],
312 | "source": [
313 | "from tensorflow.keras.layers import LSTM\n",
314 | "\n",
315 | "model = Sequential()\n",
316 | "model.add(Embedding(max_features, 32))\n",
317 | "model.add(LSTM(32))\n",
318 | "model.add(Dense(1, activation='sigmoid'))\n",
319 | "\n",
320 | "model.compile(optimizer='rmsprop',\n",
321 | " loss='binary_crossentropy',\n",
322 | " metrics=['acc'])\n",
323 | "history = model.fit(input_train, y_train,\n",
324 | " epochs=1,\n",
325 | " batch_size=128,\n",
326 | " validation_split=0.2)"
327 | ]
328 | },
329 | {
330 | "cell_type": "code",
331 | "execution_count": 9,
332 | "metadata": {},
333 | "outputs": [
334 | {
335 | "name": "stdout",
336 | "output_type": "stream",
337 | "text": [
338 | "Training set accuracy is: [0.76255]\n",
339 | "Validation set accuracy is: [0.8336]\n",
340 | "Training set Loss is: [0.5059212841033935]\n",
341 | "Validation set accuracy is: [0.3961514075756073]\n"
342 | ]
343 | }
344 | ],
345 | "source": [
346 | "import matplotlib.pyplot as plt\n",
347 | "\n",
348 | "acc = history.history['acc']\n",
349 | "val_acc = history.history['val_acc']\n",
350 | "loss = history.history['loss']\n",
351 | "val_loss = history.history['val_loss']\n",
352 | "\n",
353 | "\n",
354 | "print('Training set accuracy is: ', acc)\n",
355 | "print('Validation set accuracy is: ', val_acc)\n",
356 | "print('Training set Loss is: ', loss)\n",
357 | "print('Validation set accuracy is: ', val_loss)\n",
358 | "\n",
359 | "# Of course, you can train it for larger epochs\n",
360 | "# to improve the accuracy"
361 | ]
362 | },
377 | {
378 | "cell_type": "code",
379 | "execution_count": null,
380 | "metadata": {},
381 | "outputs": [],
382 | "source": []
383 | }
384 | ],
385 | "metadata": {
386 | "kernelspec": {
387 | "display_name": "Python 3",
388 | "language": "python",
389 | "name": "python3"
390 | },
391 | "language_info": {
392 | "codemirror_mode": {
393 | "name": "ipython",
394 | "version": 3
395 | },
396 | "file_extension": ".py",
397 | "mimetype": "text/x-python",
398 | "name": "python",
399 | "nbconvert_exporter": "python",
400 | "pygments_lexer": "ipython3",
401 | "version": "3.7.1"
402 | }
403 | },
404 | "nbformat": 4,
405 | "nbformat_minor": 2
406 | }
407 |
--------------------------------------------------------------------------------
/Section 05/.DS_Store:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PacktPublishing/Getting-Started-with-TensorFlow-2.0-for-Deep-Learning-Video/1465580c5ed2dd0c3ee84db31bcc33e8ea2b969d/Section 05/.DS_Store
--------------------------------------------------------------------------------
/Section 06/Section 06 - Auto-encoders.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {
6 | "toc": "true"
7 | },
8 | "source": [
9 | "# Creating a simple Auto-encoders from scratch with Fashion-MNIST dataset."
10 | ]
11 | },
12 | {
13 | "cell_type": "markdown",
14 | "metadata": {},
15 | "source": [
16 | "## 1) Import modules"
17 | ]
18 | },
19 | {
20 | "cell_type": "code",
21 | "execution_count": null,
22 | "metadata": {},
23 | "outputs": [],
24 | "source": [
25 | "%matplotlib inline\n",
26 | "%config InlineBackend.figure_format = 'retina'\n",
27 | "\n",
28 | "import matplotlib.pyplot as plt\n",
29 | "import pandas as pd\n",
30 | "import numpy as np\n",
31 | "import seaborn as sns\n",
32 | "import warnings\n",
33 | "\n",
34 | "warnings.filterwarnings('ignore')\n",
35 | "\n",
36 | "from tensorflow.keras.models import Model\n",
37 | "from tensorflow.keras.layers import Dense, Input\n",
38 | "from tensorflow.keras.datasets import mnist\n",
39 | "from tensorflow.keras.datasets import fashion_mnist\n",
40 | "from tensorflow.keras.regularizers import l1\n",
41 | "from tensorflow.keras.optimizers import Adam"
42 | ]
43 | },
44 | {
45 | "cell_type": "code",
46 | "execution_count": null,
47 | "metadata": {},
48 | "outputs": [],
49 | "source": [
50 | "import tensorflow as tf\n",
51 | "tf.__version__"
52 | ]
53 | },
54 | {
55 | "cell_type": "markdown",
56 | "metadata": {},
57 | "source": [
58 | "## 2) Utility Function"
59 | ]
60 | },
61 | {
62 | "cell_type": "code",
63 | "execution_count": null,
64 | "metadata": {},
65 | "outputs": [],
66 | "source": [
67 | "def plot_autoencoder_outputs(autoencoder, n, dims):\n",
68 | "\n",
69 | " n = 5\n",
70 | " plt.figure(figsize=(10, 4.5))\n",
71 | " decoded_imgs = autoencoder.predict(x_test)\n",
72 | " \n",
73 | " for i in range(n):\n",
74 | " \n",
75 | " # plot original image\n",
76 | " ax = plt.subplot(2, n, i + 1)\n",
77 | " plt.imshow(x_test[i].reshape(*dims))\n",
78 | " plt.gray()\n",
79 | " ax.get_xaxis().set_visible(False)\n",
80 | " ax.get_yaxis().set_visible(False)\n",
81 | " if i == n/2:\n",
82 | " ax.set_title('Original Images')\n",
83 | "\n",
84 | " # plot reconstruction \n",
85 | " ax = plt.subplot(2, n, i + 1 + n)\n",
86 | " plt.imshow(decoded_imgs[i].reshape(*dims))\n",
87 | " plt.gray()\n",
88 | " ax.get_xaxis().set_visible(False)\n",
89 | " ax.get_yaxis().set_visible(False)\n",
90 | " if i == n/2:\n",
91 | " ax.set_title('Reconstructed Images')\n",
92 | " \n",
93 | " plt.show()"
94 | ]
95 | },
96 | {
97 | "cell_type": "markdown",
98 | "metadata": {},
99 | "source": [
100 | "## 3) Loading and preparing the dataset"
101 | ]
102 | },
103 | {
104 | "cell_type": "code",
105 | "execution_count": null,
106 | "metadata": {},
107 | "outputs": [],
108 | "source": [
109 | "(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()\n",
110 | "\n",
111 | "x_train = x_train.astype('float32') / 255.0\n",
112 | "x_test = x_test.astype('float32') / 255.0\n",
113 | "x_train = x_train.reshape((len(x_train), np.prod(x_train.shape[1:])))\n",
114 | "x_test = x_test.reshape((len(x_test), np.prod(x_test.shape[1:])))\n",
115 | "\n",
116 | "print(x_train.shape)\n",
117 | "print(x_test.shape)"
118 | ]
119 | },
120 | {
121 | "cell_type": "markdown",
122 | "metadata": {},
123 | "source": [
124 | "## 4) Building the Auto-Encoder"
125 | ]
126 | },
127 | {
128 | "cell_type": "code",
129 | "execution_count": null,
130 | "metadata": {},
131 | "outputs": [],
132 | "source": [
133 | "input_size = 784\n",
134 | "\n",
135 | "n_neurons = 64\n",
136 | "\n",
137 | "import tensorflow as tf\n",
138 | "\n",
139 | "print('tf version', tf.__version__)\n",
140 | "\n",
141 | "input_img = Input(shape=(input_size,))\n",
142 | "\n",
143 | "code = Dense(n_neurons, activation='relu')(input_img)\n",
144 | "\n",
145 | "output_img = Dense(input_size, activation='sigmoid')(code)\n",
146 | "\n",
147 | "autoencoder = Model(input_img, output_img)\n",
148 | "\n",
149 | "autoencoder.compile(optimizer='adam', loss='binary_crossentropy')\n",
150 | "\n",
151 | "autoencoder.fit(x_train, x_train, epochs=5)\n",
152 | "\n"
153 | ]
154 | },
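155 | {
156 | "cell_type": "markdown",
157 | "metadata": {},
158 | "source": [
159 | "As a quick sanity check (an extra step, not required by the rest of this notebook), `evaluate` reports the reconstruction loss on images the model has never seen:"
160 | ]
161 | },
162 | {
163 | "cell_type": "code",
164 | "execution_count": null,
165 | "metadata": {},
166 | "outputs": [],
167 | "source": [
168 | "# binary cross-entropy between x_test and its reconstruction\n",
169 | "autoencoder.evaluate(x_test, x_test)\n"
170 | ]
171 | },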
155 | {
156 | "cell_type": "markdown",
157 | "metadata": {},
158 | "source": [
159 | "## 5) Visualize the results Original vs Reconstructed Images"
160 | ]
161 | },
162 | {
163 | "cell_type": "code",
164 | "execution_count": null,
165 | "metadata": {},
166 | "outputs": [],
167 | "source": [
168 | "plot_autoencoder_outputs(autoencoder, 5, (28, 28))"
169 | ]
170 | },
171 | {
172 | "cell_type": "code",
173 | "execution_count": null,
174 | "metadata": {},
175 | "outputs": [],
176 | "source": [
177 | "weights = autoencoder.get_weights()[0].T\n",
178 | "\n",
179 | "n = 10\n",
180 | "plt.figure(figsize=(20, 5))\n",
181 | "for i in range(n):\n",
182 | " ax = plt.subplot(1, n, i + 1)\n",
183 | " plt.imshow(weights[i+0].reshape(28, 28))\n",
184 | " ax.get_xaxis().set_visible(False)\n",
185 | " ax.get_yaxis().set_visible(False)\n",
186 | " "
187 | ]
188 | },
217 | {
218 | "cell_type": "code",
219 | "execution_count": null,
220 | "metadata": {},
221 | "outputs": [],
222 | "source": []
223 | },
224 | {
225 | "cell_type": "markdown",
226 | "metadata": {},
227 | "source": [
228 | "# Deep Auto-Encoder"
229 | ]
230 | },
231 | {
232 | "cell_type": "markdown",
233 | "metadata": {},
234 | "source": [
235 | "## 4) Buidling the Deep Auto-Encoder"
236 | ]
237 | },
238 | {
239 | "cell_type": "code",
240 | "execution_count": null,
241 | "metadata": {},
242 | "outputs": [],
243 | "source": [
244 | "input_size = 784\n",
245 | "\n",
246 | "hidden_size = 128\n",
247 | "\n",
248 | "code_size = 128\n",
249 | "\n",
250 | "input_img = Input(shape=(input_size,))\n",
251 | "\n",
252 | "hidden_1 = Dense(hidden_size, activation='relu')(input_img)\n",
253 | "\n",
254 | "code = Dense(code_size, activation='relu')(hidden_1)\n",
255 | "\n",
256 | "hidden_2 = Dense(hidden_size, activation='relu')(code)\n",
257 | "\n",
258 | "output_img = Dense(input_size, activation='sigmoid')(hidden_2)\n",
259 | "\n",
260 | "autoencoder = Model(input_img, output_img)\n",
261 | "\n",
262 | "autoencoder.compile(optimizer='adam', loss='binary_crossentropy')\n",
263 | "\n",
264 | "autoencoder.fit(x_train, x_train, epochs=3)"
265 | ]
266 | },
267 | {
268 | "cell_type": "markdown",
269 | "metadata": {},
270 | "source": [
271 | "## 5) Visualize the results Original vs Reconstructed Images"
272 | ]
273 | },
274 | {
275 | "cell_type": "code",
276 | "execution_count": null,
277 | "metadata": {},
278 | "outputs": [],
279 | "source": [
280 | "plot_autoencoder_outputs(autoencoder, 5, (28, 28))"
281 | ]
282 | },
318 | {
319 | "cell_type": "code",
320 | "execution_count": null,
321 | "metadata": {},
322 | "outputs": [],
323 | "source": []
324 | },
325 | {
326 | "cell_type": "markdown",
327 | "metadata": {
328 | "collapsed": true
329 | },
330 | "source": [
331 | "# Denoising Autoencoder"
332 | ]
333 | },
334 | {
335 | "cell_type": "markdown",
336 | "metadata": {},
337 | "source": [
338 | "## 1) Generating Noisy Images"
339 | ]
340 | },
341 | {
342 | "cell_type": "code",
343 | "execution_count": null,
344 | "metadata": {},
345 | "outputs": [],
346 | "source": [
347 | "noise_factor = 0.4\n",
348 | "x_train_noisy = x_train + noise_factor * np.random.normal(size=x_train.shape) \n",
349 | "x_test_noisy = x_test + noise_factor * np.random.normal(size=x_test.shape)\n",
350 | "\n",
351 | "x_train_noisy = np.clip(x_train_noisy, 0.0, 1.0)\n",
352 | "x_test_noisy = np.clip(x_test_noisy, 0.0, 1.0)\n",
353 | "\n",
354 | "n = 5\n",
355 | "plt.figure(figsize=(10, 4.5))\n",
356 | "for i in range(n):\n",
357 | " # plot original image\n",
358 | " ax = plt.subplot(2, n, i + 1)\n",
359 | " plt.imshow(x_test[i].reshape(28, 28))\n",
360 | " plt.gray()\n",
361 | " ax.get_xaxis().set_visible(False)\n",
362 | " ax.get_yaxis().set_visible(False)\n",
363 | " if i == n/2:\n",
364 | " ax.set_title('Original Images')\n",
365 | "\n",
366 | " # plot noisy image \n",
367 | " ax = plt.subplot(2, n, i + 1 + n)\n",
368 | " plt.imshow(x_test_noisy[i].reshape(28, 28))\n",
369 | " plt.gray()\n",
370 | " ax.get_xaxis().set_visible(False)\n",
371 | " ax.get_yaxis().set_visible(False)\n",
372 | " if i == n/2:\n",
373 | " ax.set_title('Noisy Input')"
374 | ]
375 | },
376 | {
377 | "cell_type": "markdown",
378 | "metadata": {},
379 | "source": [
380 | "## 2) Buidling the Deep Auto-Encoder for Image Denoising"
381 | ]
382 | },
383 | {
384 | "cell_type": "code",
385 | "execution_count": null,
386 | "metadata": {},
387 | "outputs": [],
388 | "source": [
389 | "input_size = 784\n",
390 | "\n",
391 | "hidden_size = 128\n",
392 | "\n",
393 | "code_size = 32\n",
394 | "\n",
395 | "input_img = Input(shape=(input_size,))\n",
396 | "\n",
397 | "hidden_1 = Dense(hidden_size, activation='relu')(input_img)\n",
398 | "\n",
399 | "code = Dense(code_size, activation='relu')(hidden_1)\n",
400 | "\n",
401 | "hidden_2 = Dense(hidden_size, activation='relu')(code)\n",
402 | "\n",
403 | "output_img = Dense(input_size, activation='sigmoid')(hidden_2)\n",
404 | "\n",
405 | "autoencoder = Model(input_img, output_img)\n",
406 | "\n",
407 | "autoencoder.compile(optimizer='adam', loss='binary_crossentropy')\n",
408 | "\n",
409 | "autoencoder.fit(x_train_noisy, x_train, epochs=10)"
410 | ]
411 | },
412 | {
413 | "cell_type": "markdown",
414 | "metadata": {},
415 | "source": [
416 | "## 3) Visualize the results Original vs Reconstructed Images"
417 | ]
418 | },
419 | {
420 | "cell_type": "code",
421 | "execution_count": null,
422 | "metadata": {},
423 | "outputs": [],
424 | "source": [
425 | "n = 5\n",
426 | "plt.figure(figsize=(10, 7))\n",
427 | "\n",
428 | "images = autoencoder.predict(x_test_noisy)\n",
429 | "\n",
430 | "for i in range(n):\n",
431 | " # plot original image\n",
432 | " ax = plt.subplot(3, n, i + 1)\n",
433 | " plt.imshow(x_test[i].reshape(28, 28))\n",
434 | " plt.gray()\n",
435 | " ax.get_xaxis().set_visible(False)\n",
436 | " ax.get_yaxis().set_visible(False)\n",
437 | " if i == n/2:\n",
438 | " ax.set_title('Original Images')\n",
439 | "\n",
440 | " # plot noisy image \n",
441 | " ax = plt.subplot(3, n, i + 1 + n)\n",
442 | " plt.imshow(x_test_noisy[i].reshape(28, 28))\n",
443 | " plt.gray()\n",
444 | " ax.get_xaxis().set_visible(False)\n",
445 | " ax.get_yaxis().set_visible(False)\n",
446 | " if i == n/2:\n",
447 | " ax.set_title('Noisy Input')\n",
448 | " \n",
449 | " # plot noisy image \n",
450 | " ax = plt.subplot(3, n, i + 1 + 2*n)\n",
451 | " plt.imshow(images[i].reshape(28, 28))\n",
452 | " plt.gray()\n",
453 | " ax.get_xaxis().set_visible(False)\n",
454 | " ax.get_yaxis().set_visible(False)\n",
455 | " if i == n/2:\n",
456 | " ax.set_title('Autoencoder Output')"
457 | ]
458 | },
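459 | {
460 | "cell_type": "markdown",
461 | "metadata": {},
462 | "source": [
463 | "A hypothetical numeric check to go with the plots above: if denoising worked, the autoencoder's output should sit closer to the clean images than the noisy input does."
464 | ]
465 | },
466 | {
467 | "cell_type": "code",
468 | "execution_count": null,
469 | "metadata": {},
470 | "outputs": [],
471 | "source": [
472 | "# mean squared error against the clean test images\n",
473 | "print('noisy input vs clean:   ', np.mean((x_test_noisy - x_test) ** 2))\n",
474 | "print('reconstruction vs clean:', np.mean((images - x_test) ** 2))\n"
475 | ]
476 | },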
487 | {
488 | "cell_type": "code",
489 | "execution_count": null,
490 | "metadata": {},
491 | "outputs": [],
492 | "source": []
493 | }
494 | ],
495 | "metadata": {
496 | "kernelspec": {
497 | "display_name": "PY37",
498 | "language": "python",
499 | "name": "py37"
500 | },
501 | "language_info": {
502 | "codemirror_mode": {
503 | "name": "ipython",
504 | "version": 3
505 | },
506 | "file_extension": ".py",
507 | "mimetype": "text/x-python",
508 | "name": "python",
509 | "nbconvert_exporter": "python",
510 | "pygments_lexer": "ipython3",
511 | "version": "3.7.1"
512 | },
513 | "toc": {
514 | "colors": {
515 | "hover_highlight": "#DAA520",
516 | "navigate_num": "#000000",
517 | "navigate_text": "#333333",
518 | "running_highlight": "#FF0000",
519 | "selected_highlight": "#FFD700",
520 | "sidebar_border": "#EEEEEE",
521 | "wrapper_background": "#FFFFFF"
522 | },
523 | "moveMenuLeft": true,
524 | "nav_menu": {
525 | "height": "66px",
526 | "width": "252px"
527 | },
528 | "navigate_menu": true,
529 | "number_sections": true,
530 | "sideBar": true,
531 | "threshold": 4,
532 | "toc_cell": true,
533 | "toc_section_display": "block",
534 | "toc_window_display": false,
535 | "widenNotebook": false
536 | }
537 | },
538 | "nbformat": 4,
539 | "nbformat_minor": 2
540 | }
541 |
--------------------------------------------------------------------------------
/Section 07/01 - Tensorflow-keras Functional API.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Tensorflow-keras Functional API"
8 | ]
9 | },
10 | {
11 | "cell_type": "markdown",
12 | "metadata": {},
13 | "source": [
14 | "When building models with the functional API, layers are callable (on a tensor), and return a tensor as output. \n",
15 | "\n",
16 | "These input tensor(s) and output tensor(s) can then be used to define a model. For example:"
17 | ]
18 | },
19 | {
20 | "cell_type": "code",
21 | "execution_count": 1,
22 | "metadata": {},
23 | "outputs": [],
24 | "source": [
25 | "\n",
26 | "import tensorflow as tf\n",
27 | "\n",
28 | "from tensorflow.keras import layers, activations\n"
29 | ]
30 | },
31 | {
32 | "cell_type": "code",
33 | "execution_count": 2,
34 | "metadata": {},
35 | "outputs": [
36 | {
37 | "data": {
38 | "text/plain": [
39 | "'2.0.0-beta0'"
40 | ]
41 | },
42 | "execution_count": 2,
43 | "metadata": {},
44 | "output_type": "execute_result"
45 | }
46 | ],
47 | "source": [
48 | "tf.__version__"
49 | ]
50 | },
51 | {
52 | "cell_type": "code",
53 | "execution_count": 3,
54 | "metadata": {},
55 | "outputs": [],
56 | "source": [
57 | "\n",
58 | "inputs = tf.keras.Input(shape=(32,))\n"
59 | ]
60 | },
61 | {
62 | "cell_type": "code",
63 | "execution_count": 4,
64 | "metadata": {},
65 | "outputs": [],
66 | "source": [
67 | "\n",
68 | "# A layer instance is callable on a tensor, and returns a tensor.\n",
69 | "\n",
70 | "hidden = layers.Dense(64, activation='relu')(inputs)\n",
71 | "\n",
72 | "hidden = layers.Dense(64, activation='relu')(hidden)\n"
73 | ]
74 | },
75 | {
76 | "cell_type": "code",
77 | "execution_count": 5,
78 | "metadata": {},
79 | "outputs": [],
80 | "source": [
81 | "\n",
82 | "predictions = layers.Dense(10, activation='softmax')(hidden)\n"
83 | ]
84 | },
85 | {
86 | "cell_type": "code",
87 | "execution_count": 6,
88 | "metadata": {},
89 | "outputs": [],
90 | "source": [
91 | "\n",
92 | "# Instantiate the model given inputs and outputs.\n",
93 | "\n",
94 | "model = tf.keras.Model(inputs=inputs, outputs=predictions)\n"
95 | ]
96 | },
97 | {
98 | "cell_type": "code",
99 | "execution_count": 6,
100 | "metadata": {},
101 | "outputs": [
102 | {
103 | "name": "stdout",
104 | "output_type": "stream",
105 | "text": [
106 | "Model: \"model\"\n",
107 | "_________________________________________________________________\n",
108 | "Layer (type) Output Shape Param # \n",
109 | "=================================================================\n",
110 | "input_1 (InputLayer) [(None, 32)] 0 \n",
111 | "_________________________________________________________________\n",
112 | "dense (Dense) (None, 64) 2112 \n",
113 | "_________________________________________________________________\n",
114 | "dense_1 (Dense) (None, 64) 4160 \n",
115 | "_________________________________________________________________\n",
116 | "dense_2 (Dense) (None, 10) 650 \n",
117 | "=================================================================\n",
118 | "Total params: 6,922\n",
119 | "Trainable params: 6,922\n",
120 | "Non-trainable params: 0\n",
121 | "_________________________________________________________________\n"
122 | ]
123 | }
124 | ],
125 | "source": [
126 | "\n",
127 | "model.summary()\n"
128 | ]
129 | },
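130 | {
131 | "cell_type": "markdown",
132 | "metadata": {},
133 | "source": [
134 | "To round out the walkthrough, here is a minimal sketch of compiling and fitting this functional model, using random NumPy arrays as stand-in data (illustrative only, not a real dataset):"
135 | ]
136 | },
137 | {
138 | "cell_type": "code",
139 | "execution_count": null,
140 | "metadata": {},
141 | "outputs": [],
142 | "source": [
143 | "import numpy as np\n",
144 | "\n",
145 | "model.compile(optimizer='rmsprop',\n",
146 | "              loss='categorical_crossentropy',\n",
147 | "              metrics=['accuracy'])\n",
148 | "\n",
149 | "# 1000 random samples with 32 features and 10 soft targets each\n",
150 | "data = np.random.random((1000, 32))\n",
151 | "labels = np.random.random((1000, 10))\n",
152 | "\n",
153 | "model.fit(data, labels, epochs=1, batch_size=32)\n"
154 | ]
155 | },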
130 | {
131 | "cell_type": "markdown",
132 | "metadata": {},
133 | "source": [
134 | "Fully customizable models can be built by using the Model Subclassing API, You define your own forward pass imperatively in this style, in the body of a class method. For example:"
135 | ]
136 | },
137 | {
138 | "cell_type": "code",
139 | "execution_count": null,
140 | "metadata": {},
141 | "outputs": [],
142 | "source": [
143 | "num_classes = 9\n",
144 | "\n",
145 | "class MyModel(tf.keras.Model):\n",
146 | "\n",
147 | " def __init__(self):\n",
148 | " super(MyModel, self).__init__()\n",
149 | "\n",
150 | " # Define your layers here.\n",
151 | " self.dense_1 = layers.Dense(32, activation='relu')\n",
152 | " self.dense_2 = layers.Dense(num_classes, activation='sigmoid')\n",
153 | "\n",
154 | " def call(self, inputs):\n",
155 | "\n",
156 | " # Define your forward pass here,\n",
157 | " # using layers you previously defined in `__init__`\n",
158 | " x = self.dense_1(inputs)\n",
159 | " return self.dense_2(x)"
160 | ]
161 | }
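162 | ,
163 | {
164 | "cell_type": "markdown",
165 | "metadata": {},
166 | "source": [
167 | "A hypothetical usage sketch (not in the original notebook): a subclassed model builds its weights lazily on the first call, so it can be exercised with a random batch of inputs."
168 | ]
169 | },
170 | {
171 | "cell_type": "code",
172 | "execution_count": null,
173 | "metadata": {},
174 | "outputs": [],
175 | "source": [
176 | "import numpy as np\n",
177 | "\n",
178 | "model = MyModel()\n",
179 | "outputs = model(np.random.random((4, 32)).astype('float32'))\n",
180 | "print(outputs.shape)  # (4, 9): one sigmoid score per class\n"
181 | ]
182 | }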
162 | ],
163 | "metadata": {
164 | "kernelspec": {
165 | "display_name": "PY37",
166 | "language": "python",
167 | "name": "py37"
168 | },
169 | "language_info": {
170 | "codemirror_mode": {
171 | "name": "ipython",
172 | "version": 3
173 | },
174 | "file_extension": ".py",
175 | "mimetype": "text/x-python",
176 | "name": "python",
177 | "nbconvert_exporter": "python",
178 | "pygments_lexer": "ipython3",
179 | "version": "3.7.1"
180 | }
181 | },
182 | "nbformat": 4,
183 | "nbformat_minor": 2
184 | }
185 |
--------------------------------------------------------------------------------
/Section 07/02-03 - Getting and preprocessing IMDB dataset for IMDB Movie Reviews Classification.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Classifying movie reviews: a binary classification example\n",
8 | "\n",
9 | "\n",
10 | "Binary classification may be the most widely applied kind of machine learning problem. \n",
11 | "\n",
12 | "In this example, we will learn to classify movie reviews into \"positive\" reviews and \"negative\" reviews, just based on the text content of the reviews."
13 | ]
14 | },
15 | {
16 | "cell_type": "markdown",
17 | "metadata": {},
18 | "source": [
19 | "## The IMDB dataset\n",
20 | "\n",
21 | "We'll be working with \"IMDB dataset\", a set of 50,000 highly-polarized reviews from the Internet Movie Database. \n",
22 | "\n",
23 | "They are split into 25,000 reviews for training and 25,000 reviews for testing, each set consisting in 50% negative and 50% positive reviews.\n",
24 | "\n",
25 | "\n",
26 | "The following code will load the dataset (when you run it for the first time, about 80MB of data will be downloaded to your machine):"
27 | ]
28 | },
29 | {
30 | "cell_type": "code",
31 | "execution_count": 1,
32 | "metadata": {},
33 | "outputs": [],
34 | "source": [
35 | "from tensorflow import keras\n",
36 | "from tensorflow.keras.datasets import imdb\n",
37 | "\n",
38 | "(train_data, train_labels), (test_data, test_labels) = imdb.load_data(num_words=10000)"
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "execution_count": 2,
44 | "metadata": {},
45 | "outputs": [
46 | {
47 | "data": {
48 | "text/plain": [
49 | "'2.0.0-beta0'"
50 | ]
51 | },
52 | "execution_count": 2,
53 | "metadata": {},
54 | "output_type": "execute_result"
55 | }
56 | ],
57 | "source": [
58 | "import tensorflow as tf\n",
59 | "tf.__version__"
60 | ]
61 | },
62 | {
63 | "cell_type": "markdown",
64 | "metadata": {},
65 | "source": [
66 | "\n",
67 | "The argument `num_words=10000` means that we will only keep the top 10,000 most frequently occurring words in the training data.\n",
68 | "\n",
69 | "Rare words will be discarded. This allows us to work with vector data of manageable size.\n",
70 | "\n",
71 | "The variables `train_data` and `test_data` are lists of reviews, each review being a list of word indices (encoding a sequence of words). \n",
72 | "\n",
73 | "\n",
74 | "`train_labels` and `test_labels` are lists of 0s and 1s, where 0 stands for \"negative\" and 1 stands for \"positive\":"
75 | ]
76 | },
77 | {
78 | "cell_type": "code",
79 | "execution_count": 3,
80 | "metadata": {
81 | "scrolled": false
82 | },
83 | "outputs": [
84 | {
85 | "name": "stdout",
86 | "output_type": "stream",
87 | "text": [
88 | "[1, 14, 22, 16, 43, 530, 973, 1622, 1385, 65, 458, 4468, 66, 3941, 4, 173, 36, 256, 5, 25, 100, 43, 838, 112, 50, 670, 2, 9, 35, 480, 284, 5, 150, 4, 172, 112, 167, 2, 336, 385, 39, 4, 172, 4536, 1111, 17, 546, 38, 13, 447, 4, 192, 50, 16, 6, 147, 2025, 19, 14, 22, 4, 1920, 4613, 469, 4, 22, 71, 87, 12, 16, 43, 530, 38, 76, 15, 13, 1247, 4, 22, 17, 515, 17, 12, 16, 626, 18, 2, 5, 62, 386, 12, 8, 316, 8, 106, 5, 4, 2223, 5244, 16, 480, 66, 3785, 33, 4, 130, 12, 16, 38, 619, 5, 25, 124, 51, 36, 135, 48, 25, 1415, 33, 6, 22, 12, 215, 28, 77, 52, 5, 14, 407, 16, 82, 2, 8, 4, 107, 117, 5952, 15, 256, 4, 2, 7, 3766, 5, 723, 36, 71, 43, 530, 476, 26, 400, 317, 46, 7, 4, 2, 1029, 13, 104, 88, 4, 381, 15, 297, 98, 32, 2071, 56, 26, 141, 6, 194, 7486, 18, 4, 226, 22, 21, 134, 476, 26, 480, 5, 144, 30, 5535, 18, 51, 36, 28, 224, 92, 25, 104, 4, 226, 65, 16, 38, 1334, 88, 12, 16, 283, 5, 16, 4472, 113, 103, 32, 15, 16, 5345, 19, 178, 32]\n"
89 | ]
90 | }
91 | ],
92 | "source": [
93 | "print(train_data[0])"
94 | ]
95 | },
96 | {
97 | "cell_type": "code",
98 | "execution_count": 3,
99 | "metadata": {},
100 | "outputs": [
101 | {
102 | "data": {
103 | "text/plain": [
104 | "1"
105 | ]
106 | },
107 | "execution_count": 3,
108 | "metadata": {},
109 | "output_type": "execute_result"
110 | }
111 | ],
112 | "source": [
113 | "train_labels[0]"
114 | ]
115 | },
116 | {
117 | "cell_type": "markdown",
118 | "metadata": {},
119 | "source": [
120 | "Since we restricted ourselves to the top 10,000 most frequent words, no word index will exceed 10,000:"
121 | ]
122 | },
123 | {
124 | "cell_type": "code",
125 | "execution_count": 4,
126 | "metadata": {},
127 | "outputs": [
128 | {
129 | "data": {
130 | "text/plain": [
131 | "9999"
132 | ]
133 | },
134 | "execution_count": 4,
135 | "metadata": {},
136 | "output_type": "execute_result"
137 | }
138 | ],
139 | "source": [
140 | "max([max(sequence) for sequence in train_data])"
141 | ]
142 | },
143 | {
144 | "cell_type": "markdown",
145 | "metadata": {},
146 | "source": [
147 | "For kicks, here's how you can quickly decode one of these reviews back to English words:"
148 | ]
149 | },
150 | {
151 | "cell_type": "code",
152 | "execution_count": 5,
153 | "metadata": {},
154 | "outputs": [],
155 | "source": [
156 | "# word_index is a dictionary mapping words to an integer index\n",
157 | "word_index = imdb.get_word_index()"
158 | ]
159 | },
160 | {
161 | "cell_type": "code",
162 | "execution_count": 6,
163 | "metadata": {},
164 | "outputs": [],
165 | "source": [
166 | "# We reverse it, mapping integer indices to words\n",
167 | "reverse_word_index = dict([(value, key) for (key, value) in word_index.items()])"
168 | ]
169 | },
170 | {
171 | "cell_type": "code",
172 | "execution_count": 7,
173 | "metadata": {},
174 | "outputs": [],
175 | "source": [
176 | "# We decode the review; note that our indices were offset by 3\n",
177 | "# because 0, 1 and 2 are reserved indices for \"padding\", \"start of sequence\", and \"unknown\".\n",
178 | "decoded_review = ' '.join([reverse_word_index.get(i - 3, '?') for i in train_data[0]])"
179 | ]
180 | },
181 | {
182 | "cell_type": "code",
183 | "execution_count": 8,
184 | "metadata": {},
185 | "outputs": [
186 | {
187 | "data": {
188 | "text/plain": [
189 | "\"? this film was just brilliant casting location scenery story direction everyone's really suited the part they played and you could just imagine being there robert ? is an amazing actor and now the same being director ? father came from the same scottish island as myself so i loved the fact there was a real connection with this film the witty remarks throughout the film were great it was just brilliant so much that i bought the film as soon as it was released for ? and would recommend it to everyone to watch and the fly fishing was amazing really cried at the end it was so sad and you know what they say if you cry at a film it must have been good and this definitely was also ? to the two little boy's that played the ? of norman and paul they were just brilliant children are often left out of the ? list i think because the stars that play them all grown up are such a big profile for the whole film but these children are amazing and should be praised for what they have done don't you think the whole story was so lovely because it was true and was someone's life after all that was shared with us all\""
190 | ]
191 | },
192 | "execution_count": 8,
193 | "metadata": {},
194 | "output_type": "execute_result"
195 | }
196 | ],
197 | "source": [
198 | "decoded_review"
199 | ]
200 | },
201 | {
202 | "cell_type": "markdown",
203 | "metadata": {},
204 | "source": [
205 | "## Preparing the data\n",
206 | "\n",
207 | "\n",
208 | "We cannot feed lists of integers into a neural network. We have to turn our lists into tensors. \n",
209 | "\n",
210 | "* We'll one-hot-encode our lists to turn them into vectors of 0s and 1s. Concretely, this would mean for instance turning the sequence \n",
211 | "`[3, 5]` into a 10,000-dimensional vector that would be all-zeros except for indices 3 and 5, which would be ones. Then we could use as \n",
212 | "first layer in our network a `Dense` layer, capable of handling floating point vector data.\n",
213 | "\n",
214 | "Let's vectorize our data, which we will do manually for maximum clarity:"
215 | ]
216 | },
217 | {
218 | "cell_type": "code",
219 | "execution_count": 9,
220 | "metadata": {},
221 | "outputs": [],
222 | "source": [
223 | "import numpy as np\n",
224 | "\n",
225 | "def vectorize_sequences(sequences, dimension=10000):\n",
226 | " # Create an all-zero matrix of shape (len(sequences), dimension)\n",
227 | " results = np.zeros((len(sequences), dimension))\n",
228 | " for i, sequence in enumerate(sequences):\n",
229 | " results[i, sequence] = 1. # set specific indices of results[i] to 1s\n",
230 | " return results\n",
231 | "\n",
232 | "# Our vectorized training data\n",
233 | "x_train = vectorize_sequences(train_data)\n",
234 | "# Our vectorized test data\n",
235 | "x_test = vectorize_sequences(test_data)"
236 | ]
237 | },
238 | {
239 | "cell_type": "markdown",
240 | "metadata": {},
241 | "source": [
242 | "Here's what our samples look like now:"
243 | ]
244 | },
245 | {
246 | "cell_type": "code",
247 | "execution_count": 10,
248 | "metadata": {},
249 | "outputs": [
250 | {
251 | "data": {
252 | "text/plain": [
253 | "array([0., 1., 1., ..., 0., 0., 0.])"
254 | ]
255 | },
256 | "execution_count": 10,
257 | "metadata": {},
258 | "output_type": "execute_result"
259 | }
260 | ],
261 | "source": [
262 | "x_train[0]"
263 | ]
264 | },
265 | {
266 | "cell_type": "markdown",
267 | "metadata": {},
268 | "source": [
269 | "We should also vectorize our labels, which is straightforward:"
270 | ]
271 | },
272 | {
273 | "cell_type": "code",
274 | "execution_count": 11,
275 | "metadata": {},
276 | "outputs": [],
277 | "source": [
278 | "# Our vectorized labels\n",
279 | "y_train = np.asarray(train_labels).astype('float32')\n",
280 | "y_test = np.asarray(test_labels).astype('float32')"
281 | ]
282 | },
283 | {
284 | "cell_type": "markdown",
285 | "metadata": {},
286 | "source": [
287 | "Now our data is ready to be fed into a neural network."
288 | ]
289 | },
290 | {
291 | "cell_type": "markdown",
292 | "metadata": {},
293 | "source": [
294 | "---------------"
295 | ]
296 | },
297 | {
298 | "cell_type": "markdown",
299 | "metadata": {},
300 | "source": [
301 | "## Building our network\n",
302 | "\n",
303 | "\n",
304 | "Our input data is simply vectors, and our labels are scalars (1s and 0s): this is the easiest setup you will ever encounter. \n",
305 | "\n",
306 | "A type of network that performs well on such a problem would be a simple stack of fully-connected (`Dense`) layers with `relu` activations: \n",
307 | "\n",
308 | "`Dense(16, activation='relu')`"
309 | ]
310 | },
311 | {
312 | "cell_type": "markdown",
313 | "metadata": {},
314 | "source": [
315 | "Here's what our network looks like:\n",
316 | "\n",
317 | ""
318 | ]
319 | },
320 | {
321 | "cell_type": "markdown",
322 | "metadata": {},
323 | "source": [
324 | "And here's the Keras implementation, very similar to the MNIST example you saw previously:"
325 | ]
326 | },
327 | {
328 | "cell_type": "code",
329 | "execution_count": 12,
330 | "metadata": {},
331 | "outputs": [],
332 | "source": [
333 | "from tensorflow.keras import models\n",
334 | "from tensorflow.keras import layers\n",
335 | "\n",
336 | "model = models.Sequential()\n",
337 | "model.add(layers.Dense(16, activation='relu', input_shape=(10000,)))\n",
338 | "model.add(layers.Dense(16, activation='relu'))\n",
339 | "model.add(layers.Dense(1, activation='sigmoid'))"
340 | ]
341 | },
342 | {
343 | "cell_type": "markdown",
344 | "metadata": {},
345 | "source": [
346 | "\n",
347 | "Lastly, we need to pick a loss function and an optimizer. Since we are facing a binary classification problem and the output of our network \n",
348 | "is a probability (we end our network with a single-unit layer with a sigmoid activation), is it best to use the `binary_crossentropy` loss. "
349 | ]
350 | },
351 | {
352 | "cell_type": "code",
353 | "execution_count": 13,
354 | "metadata": {},
355 | "outputs": [],
356 | "source": [
357 | "model.compile(optimizer='rmsprop',\n",
358 | " loss='binary_crossentropy',\n",
359 | " metrics=['accuracy'])"
360 | ]
361 | },
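362 | {
363 | "cell_type": "markdown",
364 | "metadata": {},
365 | "source": [
366 | "The string shortcuts above are equivalent to passing configured objects, which is the route to take when you need to change a parameter such as the learning rate. A sketch of the same compilation written out explicitly (0.001 is RMSprop's default learning rate):"
367 | ]
368 | },
369 | {
370 | "cell_type": "code",
371 | "execution_count": null,
372 | "metadata": {},
373 | "outputs": [],
374 | "source": [
375 | "from tensorflow.keras import optimizers, losses\n",
376 | "\n",
377 | "model.compile(optimizer=optimizers.RMSprop(learning_rate=0.001),\n",
378 | "              loss=losses.binary_crossentropy,\n",
379 | "              metrics=['accuracy'])\n"
380 | ]
381 | },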
362 | {
363 | "cell_type": "markdown",
364 | "metadata": {},
365 | "source": [
366 | "## Validating our approach\n",
367 | "\n",
368 | "In order to monitor during training the accuracy of the model on data that it has never seen before, we will create a \"validation set\" by \n",
369 | "setting apart 10,000 samples from the original training data:"
370 | ]
371 | },
372 | {
373 | "cell_type": "code",
374 | "execution_count": 14,
375 | "metadata": {},
376 | "outputs": [],
377 | "source": [
378 | "x_val = x_train[:10000]\n",
379 | "partial_x_train = x_train[10000:]\n",
380 | "\n",
381 | "y_val = y_train[:10000]\n",
382 | "partial_y_train = y_train[10000:]"
383 | ]
384 | },
385 | {
386 | "cell_type": "code",
387 | "execution_count": 15,
388 | "metadata": {},
389 | "outputs": [
390 | {
391 | "name": "stdout",
392 | "output_type": "stream",
393 | "text": [
394 | "Train on 15000 samples, validate on 10000 samples\n",
395 | "Epoch 1/10\n",
396 | "15000/15000 [==============================] - 6s 413us/sample - loss: 0.5210 - accuracy: 0.7873 - val_loss: 0.3990 - val_accuracy: 0.8651\n",
397 | "Epoch 2/10\n",
398 | "15000/15000 [==============================] - 4s 246us/sample - loss: 0.3143 - accuracy: 0.9035 - val_loss: 0.3110 - val_accuracy: 0.8860\n",
399 | "Epoch 3/10\n",
400 | "15000/15000 [==============================] - 3s 195us/sample - loss: 0.2278 - accuracy: 0.9265 - val_loss: 0.2793 - val_accuracy: 0.8909\n",
401 | "Epoch 4/10\n",
402 | "15000/15000 [==============================] - 3s 174us/sample - loss: 0.1814 - accuracy: 0.9405 - val_loss: 0.2742 - val_accuracy: 0.8916\n",
403 | "Epoch 5/10\n",
404 | "15000/15000 [==============================] - 3s 181us/sample - loss: 0.1458 - accuracy: 0.9537 - val_loss: 0.2953 - val_accuracy: 0.8835\n",
405 | "Epoch 6/10\n",
406 | "15000/15000 [==============================] - 3s 178us/sample - loss: 0.1224 - accuracy: 0.9623 - val_loss: 0.2892 - val_accuracy: 0.8884\n",
407 | "Epoch 7/10\n",
408 | "15000/15000 [==============================] - 3s 178us/sample - loss: 0.1029 - accuracy: 0.9678 - val_loss: 0.3060 - val_accuracy: 0.8855\n",
409 | "Epoch 8/10\n",
410 | "15000/15000 [==============================] - 3s 209us/sample - loss: 0.0832 - accuracy: 0.9770 - val_loss: 0.3229 - val_accuracy: 0.8835\n",
411 | "Epoch 9/10\n",
412 | "15000/15000 [==============================] - 4s 254us/sample - loss: 0.0713 - accuracy: 0.9809 - val_loss: 0.3492 - val_accuracy: 0.8814\n",
413 | "Epoch 10/10\n",
414 | "15000/15000 [==============================] - 3s 183us/sample - loss: 0.0568 - accuracy: 0.9861 - val_loss: 0.3709 - val_accuracy: 0.8781\n"
415 | ]
416 | }
417 | ],
418 | "source": [
419 | "history = model.fit(partial_x_train,\n",
420 | " partial_y_train,\n",
421 | " epochs=10,\n",
422 | " batch_size=512,\n",
423 | " validation_data=(x_val, y_val))"
424 | ]
425 | },
426 | {
427 | "cell_type": "markdown",
428 | "metadata": {},
429 | "source": [
430 | "On CPU, this will take less than two seconds per epoch -- training is over in 20 seconds. At the end of every epoch, there is a slight pause \n",
431 | "as the model computes its loss and accuracy on the 10,000 samples of the validation data.\n",
432 | "\n",
433 | "Note that the call to `model.fit()` returns a `History` object. This object has a member `history`, which is a dictionary containing data \n",
434 | "about everything that happened during training. Let's take a look at it:"
435 | ]
436 | },
437 | {
438 | "cell_type": "code",
439 | "execution_count": 16,
440 | "metadata": {},
441 | "outputs": [
442 | {
443 | "data": {
444 | "text/plain": [
445 | "dict_keys(['loss', 'accuracy', 'val_loss', 'val_accuracy'])"
446 | ]
447 | },
448 | "execution_count": 16,
449 | "metadata": {},
450 | "output_type": "execute_result"
451 | }
452 | ],
453 | "source": [
454 | "history_dict = history.history\n",
455 | "history_dict.keys()"
456 | ]
457 | },
458 | {
459 | "cell_type": "markdown",
460 | "metadata": {},
461 | "source": [
462 | "It contains 4 entries: one per metric that was being monitored, during training and during validation. Let's use Matplotlib to plot the \n",
463 | "training and validation loss side by side, as well as the training and validation accuracy:"
464 | ]
465 | },
466 | {
467 | "cell_type": "code",
468 | "execution_count": 18,
469 | "metadata": {},
470 | "outputs": [
471 | {
472 | "data": {
473 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEWCAYAAACJ0YulAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzt3XmYVNW57/HvyyyDzE6ANEaPMg82oEFABg3RCCEBZFIxEtTjbHJuiENiSEjQeI3Bw8kj8caYgBDFo2IccCJBYoI0BEFEglHABlRA5kkb3vvH2l1d3fZMV+/q7t/nefrp2rt27XqroOtXe6291jZ3R0REBKBW3AWIiEj6UCiIiEiCQkFERBIUCiIikqBQEBGRBIWCiIgkKBSkQplZbTPbb2anV+S2cTKzM82sws/dNrOhZrYxaXm9mfUvzbbleK5HzOyO8j6+mP3+zMx+X9H7lfjUibsAiZeZ7U9abAgcAY5Gy9e6+9yy7M/djwKNK3rbmsDdz66I/ZjZZGCiu1+YtO/JFbFvqf4UCjWcuyc+lKNvopPd/dWitjezOu6eUxm1iUjlU/ORFCtqHviTmc0zs33ARDM738z+YWa7zWybmc00s7rR9nXMzM0sI1qeE93/opntM7O/m1mHsm4b3f91M/uXme0xs4fM7G9mNqmIuktT47Vm9r6Z7TKzmUmPrW1mvzKznWb2ATCsmPfnTjObX2DdLDN7ILo92czWRa/n39G3+KL2lW1mF0a3G5rZH6Pa1gLnFtj2LjP7INrvWjMbHq3vCvw30D9qmtuR9N7ek/T466LXvtPMnjGzU0vz3pTEzEZG9ew2s9fN7Oyk++4ws61mttfM3kt6reeZ2cpo/Sdm9svSPp+kgLvrRz+4O8BGYGiBdT8DPgcuI3yJOAHoDfQlHGmeAfwLuDHavg7gQEa0PAfYAWQCdYE/AXPKse1JwD5gRHTf7cAXwKQiXktpanwWaApkAJ/lvnbgRmAt0BZoCSwJfyqFPs8ZwH6gUdK+PwUyo+XLom0MGAwcArpF9w0FNibtKxu4MLp9P/AXoDnQHni3wLZjgFOjf5PxUQ0nR/dNBv5SoM45wD3R7YujGnsADYD/AV4vzXtTyOv/GfD76HbHqI7B0b/RHcD66HZnYBNwSrRtB+CM6PZyYFx0uwnQN+6/hZr8oyMFKY2l7v6cux9z90Puvtzdl7l7jrt/AMwGBhbz+AXunuXuXwBzCR9GZd32G8Aqd382uu9XhAApVClr/IW773H3jYQP4NznGgP8yt2z3X0nMKOY5/kAeIcQVgAXAbvcPSu6/zl3/8CD14HXgEI7kwsYA/zM3Xe5+ybCt//k533C3bdF/yaPEwI9sxT7BZgAPOLuq9z9MDAVGGhmbZO2Keq9Kc5YYKG7vx79G80gBEtfIIcQQJ2jJsgPo/cOQrifZWYt3X2fuy8r5euQFFAoSGl8lLxgZueY2fNm9rGZ7QWmAa2KefzHSbcPUnznclHbnpZch7s74Zt1oUpZY6mei/ANtziPA+Oi2+Oj5dw6vmFmy8zsMzPbTfiWXtx7levU4mows0lm9nbUTLMbOKeU+4Xw+hL7c/e9wC6gTdI2Zfk3K2q/xwj/Rm3cfT3wPcK/w6dRc+Qp0aZXA52A9Wb2lpldUsrXISmgUJDSKHg65sOEb8dnuvuJwI8IzSOptI3QnAOAmRn5P8QKOp4atwHtkpZLOmX2CWCombUhHDE8HtV4ArAA+AWhaacZ8HIp6/i4qBrM7AzgN8D1QMtov+8l7bek02e3EpqkcvfXhNBMtaUUdZVlv7UI/2ZbANx9jrv3IzQd1Sa8L7j7encfS2gi/L/AU2bW4DhrkXJSKEh5NAH2AAfMrCNwbSU855+BXmZ2mZnVAW4BWqeoxieAW82sjZm1BH5Q3Mbu/jGwFPg9sN7dN0R31QfqAduBo2b2DWBIGWq4w8yaWRjHcWPSfY0JH/zbCfn4XcKRQq5PgLa5HeuFmAdcY2bdzKw+4cP5DXcv8sirDDUPN7MLo+f+L0I/0DIz62hmg6LnOxT9HCO8gCvMrFV0ZLEnem3HjrMWKSeFgpTH94CrCH/wDxM6hFPK3T8BLgceAHYCXwH+SRhXUdE1/obQ9r+G0Am6oBSPeZzQcZxoOnL33cBtwNOEztpRhHArjR8Tjlg2Ai8Cf0ja72rgIeCtaJuzgeR2+FeADcAnZpbcDJT7+JcIzThPR48/ndDPcFzcfS3hPf8NIbCGAcOj/oX6wH2EfqCPCUcmd0YPvQRYZ+HstvuBy9398+OtR8rHQtOsSNViZrUJzRWj3P2NuOsRqS50pCBVhpkNi5pT6gN3E85aeSvmskSqFYWCVCUXAB8Qmia+Box096Kaj0SkHNR8JCIiCTpSEBGRhCo3IV6rVq08IyMj7jJERKqUFStW7HD34k7jBqpgKGRkZJCVlRV3GSIiVYqZlTQyH1DzkYiIJFEoiIhIgkJBREQSqlyfgohUri+++ILs7GwOHz4cdylSCg0aNKBt27bUrVvU1FfFUyiISLGys7Np0qQJGRkZhMlpJV25Ozt37iQ7O5sOHTqU/IBC1Ijmo7lzISMDatUKv+eW6VL0IjXb4cOHadmypQKhCjAzWrZseVxHddX+SGHuXJgyBQ4eDMubNoVlgAnHPS+kSM2gQKg6jvffqtofKdx5Z14g5Dp4MKwXEZH8qn0obN5ctvUikl527txJjx496NGjB6eccgpt2rRJLH/+eekuu3D11Vezfv36YreZNWsWcyuobfmCCy5g1apVFbKvylbtm49OPz00GRW2XkQq3ty54Uh88+bwdzZ9+vE11bZs2TLxAXvPPffQuHFjvv/97+fbxt1xd2rVKvx77qOPPlri89xwww3lL7IaqfZHCtOnQ8OG+dc1bBjWi0jFyu3D27QJ3PP68FJxcsf7779Pp06dmDBhAp07d2bbtm1MmTKFzMxMOnfuzLRp0xLb5n5zz8nJoVmzZkydOpXu3btz/vnn8+mnnwJw11138eCDDya2nzp1Kn369OHss8/mzTffBODAgQN8+9vfplOnTowaNYrMzMwSjwjmzJlD165d6dKlC3fccQcAOTk5XHHFFYn1M2fOBOBXv/oVnTp1olu3bkycOLHC37PSqPZHCrnfUCrym4uIFK64PrxU/M299957/OEPfyAzMxOAGTNm0KJFC3Jychg0aBCjRo2iU6dO+R6zZ88eBg4cyIwZM7j99tv53e9+x9SpU7+0b3fnrbfeYuHChUybNo2XXnqJhx56iFNOOYWnnnqKt99+m169ehVbX3Z2NnfddRdZWVk0bdqUoUOH8uc//5nWrVuzY8cO1qxZA8Du3bsBuO+++9i0aRP16tVLrKts1f5IAcJ/xo0b4dix8FuBIJIald2H95WvfCURCADz5s2jV69e9OrVi3Xr1vHuu+9+6TEnnHACX//61wE499xz2bhxY6H7/ta3vvWlbZYuXcrYsWMB6N69O507dy62vmXLljF48GBatWpF3bp1GT9+PEuWLOH
MM89k/fr13HzzzSxatIimTZsC0LlzZyZOnMjcuXPLPfjseNWIUBCRylFUX12q+vAaNWqUuL1hwwZ+/etf8/rrr7N69WqGDRtW6Pn69erVS9yuXbs2OTk5he67fv36JW5TXi1btmT16tX079+fWbNmce211wKwaNEirrvuOpYvX06fPn04evRohT5vaaQ0FKJr6q43s/fN7EvHZ2Y2ycy2m9mq6GdyKusRkdSKsw9v7969NGnShBNPPJFt27axaNGiCn+Ofv368cQTTwCwZs2aQo9EkvXt25fFixezc+dOcnJymD9/PgMHDmT79u24O6NHj2batGmsXLmSo0ePkp2dzeDBg7nvvvvYsWMHBwu2xVWClPUpmFltYBZwEZANLDezhe5e8F38k7vfmKo6RKTyxNmH16tXLzp16sQ555xD+/bt6devX4U/x0033cSVV15Jp06dEj+5TT+Fadu2LT/96U+58MILcXcuu+wyLr30UlauXMk111yDu2Nm3HvvveTk5DB+/Hj27dvHsWPH+P73v0+TJk0q/DWUJGXXaDaz84F73P1r0fIPAdz9F0nbTAIyyxIKmZmZrovsiFSedevW0bFjx7jLSAs5OTnk5OTQoEEDNmzYwMUXX8yGDRuoUye9ztkp7N/MzFa4e2YRD0lI5StpA3yUtJwN9C1ku2+b2QDgX8Bt7v5RwQ3MbAowBeB0DTAQkZjs37+fIUOGkJOTg7vz8MMPp10gHK+4X81zwDx3P2Jm1wKPAYMLbuTus4HZEI4UKrdEEZGgWbNmrFixIu4yUiqVHc1bgHZJy22jdQnuvtPdj0SLjwDnprAeEREpQSpDYTlwlpl1MLN6wFhgYfIGZnZq0uJwYF0K6xERkRKkrPnI3XPM7EZgEVAb+J27rzWzaUCWuy8Ebjaz4UAO8BkwKVX1iIhIyVLap+DuLwAvFFj3o6TbPwR+mMoaRESk9DSiWUTS2qBBg740EO3BBx/k+uuvL/ZxjRs3BmDr1q2MGjWq0G0uvPBCSjrF/cEHH8w3iOySSy6pkHmJ7rnnHu6///7j3k9FUyiISFobN24c8+fPz7du/vz5jBs3rlSPP+2001iwYEG5n79gKLzwwgs0a9as3PtLdwoFEUlro0aN4vnnn09cUGfjxo1s3bqV/v37J8YN9OrVi65du/Lss89+6fEbN26kS5cuABw6dIixY8fSsWNHRo4cyaFDhxLbXX/99Ylpt3/84x8DMHPmTLZu3cqgQYMYNGgQABkZGezYsQOABx54gC5dutClS5fEtNsbN26kY8eOfPe736Vz585cfPHF+Z6nMKtWreK8886jW7dujBw5kl27diWeP3cq7dyJ+P76178mLjLUs2dP9u3bV+73tjBxj1MQkSrk1luhoi8o1qMHRJ+nhWrRogV9+vThxRdfZMSIEcyfP58xY8ZgZjRo0ICnn36aE088kR07dnDeeecxfPjwIq9T/Jvf/IaGDRuybt06Vq9enW/q6+nTp9OiRQuOHj3KkCFDWL16NTfffDMPPPAAixcvplWrVvn2tWLFCh599FGWLVuGu9O3b18GDhxI8+bN2bBhA/PmzeO3v/0tY8aM4amnnir2+ghXXnklDz30EAMHDuRHP/oRP/nJT3jwwQeZMWMGH374IfXr1080Wd1///3MmjWLfv36sX//fho0aFCGd7tkOlIQkbSX3ISU3HTk7txxxx1069aNoUOHsmXLFj755JMi97NkyZLEh3O3bt3o1q1b4r4nnniCXr160bNnT9auXVviZHdLly5l5MiRNGrUiMaNG/Otb32LN954A4AOHTrQo0cPoPjpuSFc32H37t0MHDgQgKuuuoolS5YkapwwYQJz5sxJjJzu168ft99+OzNnzmT37t0VPqJaRwoiUmrFfaNPpREjRnDbbbexcuVKDh48yLnnhnGuc+fOZfv27axYsYK6deuSkZFR6HTZJfnwww+5//77Wb58Oc2bN2fSpEnl2k+u3Gm3IUy9XVLzUVGef/55lixZwnPPPcf06dNZs2YNU6dO5dJLL+WFF16gX79+LFq0iHPOOafctRakIwURSXuNGzdm0KBBfOc738nXwbxnzx5OOukk6taty+LFi9lU2AXZkwwYMIDHH38cgHfeeYfVq1cDYdrtRo0a0bRpUz755BNefPHFxGOaNGlSaLt9//79eeaZZzh48CAHDhzg6aefpn///mV+bU2bNqV58+aJo4w//vGPDBw4kGPHjvHRRx8xaNAg7r33Xvbs2cP+/fv597//TdeuXfnBD35A7969ee+998r8nMXRkYKIVAnjxo1j5MiR+c5EmjBhApdddhldu3YlMzOzxG/M119/PVdffTUdO3akY8eOiSOO7t2707NnT8455xzatWuXb9rtKVOmMGzYME477TQWL16cWN+rVy8mTZpEnz59AJg8eTI9e/YstqmoKI899hjXXXcdBw8e5IwzzuDRRx/l6NGjTJw4kT179uDu3HzzzTRr1oy7776bxYsXU6tWLTp37py4ilxFSdnU2amiqbNFKpemzq56jmfqbDUfiYhIgkJBREQSFAoiUqKq1sxckx3vv5VCQUSK1aBBA3bu3KlgqALcnZ07dx7XgDadfSQixWrbti3Z2dls37497lKkFBo0aEDbtm3L/XiFgogUq27dunTo0CHuMqSSqPlIREQSFAoiIpKgUBARkQSFgoiIJCgUREQkQaEgIiIJCgUREUlQKIiISIJCQUREEhQKIiKSoFAQEZEEhYKIiCQoFEREJEGhICIiCQoFERFJUCiIiEiCQkFERBIUCiIikqBQEBGRhJSGgpkNM7P1Zva+mU0tZrtvm5mbWWYq6xERkeKlLBTMrDYwC/g60AkYZ2adCtmuCXALsCxVtYiISOmk8kihD/C+u3/g7p8D84ERhWz3U+Be4HAKa2HNGrjhBjh6NJXPIiJStaUyFNoAHyUtZ0frEsysF9DO3Z8vbkdmNsXMsswsa/v27eUqZskS+J//gZtvBvdy7UJEpNqrE9cTm1kt4AFgUknbuvtsYDZAZmZmuT7Sb7gBNm+G++6DU0+Fu+4qz15ERKq3VIbCFqBd0nLbaF2uJkAX4C9mBnAKsNDMhrt7VioKmjEDPv4Y7r4bTjkFJk9OxbOIiFRdqQyF5cBZZtaBEAZjgfG5d7r7HqBV7rKZ/QX4fqoCITwHPPIIfPopXHstnHQSDB+eqmcTEal6Utan4O45wI3AImAd8IS7rzWzaWYW20dx3brw5JNw7rlw+eXwt7/FVYmISPoxr2K9rpmZmZ6VdfwHE9u3Q79+sGMHvPEGdO5cAcWJiKQpM1vh7iWOBauxI5pbt4ZFi6B+fRg2DLKz465IRCR+NTYUADp0gJdegr17QzDs2hV3RSIi8arRoQDQvTs88wxs2BA6nQ8dirsiEZH41PhQABg0CObMCZ3O48ZBTk7cFYmIxEOhEBk9GmbOhGefhf/8T416FpGaKbYRzenoxhth2zb4+c/htNPgnnvirkhEarqDB8MZki+/DGPHQu/eqX0+hUIBP/tZGPX8k5+EUc/XXRd3RS
JSkxw7BqtXhxB4+WVYuhSOHAlnSnbsqFCodGbw8MNh1PMNN8DJJ8PIkXFXJSLV2dat8MorIQReeSWMowLo2jW0YFx8MVxwATRsmPpaFAqFqFMH/vQnGDIkdDy//DIMGBB3VSJSXRw8GGZuzg2Bd94J6086Cb72NbjoIhg6NDRjVzaFQhEaNoQ//zmk8/DhoU2va9e4qxKRqujYMXj77bwQeOMN+Pzz0CQ0YABceWU4GujaFWrFfPqPQqEYLVuGUc/nnx8Gt735JrRvH3dVIlIVbNkSAiD3J7lJ6KabQgj07w8nnBBvnQUpFEpw+ulh1HP//iEYli4NYSEikuzAgdAklNs3sHZtWH/yyaFJ6OKLQ5PQqafGW2dJFAql0LUrLFwY/lG/8Q147bXK6fARkfR17BisWpUXAkuX5m8SmjQpr0koXDKmalAolNKAATBvHowaFabcfvrp0CEtIjVHbpNQbt/Ajh1hfbdu4VK/F12Unk1CZaGPtTIYORJmzYLrrw8X6Xnkkar1DUBEyuaLL2DxYnjxxRAE774b1p98cmhOripNQmWhUCij664Lo56nTQuD26ZPj7siEalIn38Or74KCxaEyTJ37YIGDcIRwNVXV80mobJQKJTDPfeEUc8//3kIhptuirsiETkeR46E5qAFC8L8Z7t3w4knhtPRR48OzUJVuUmoLBQK5WAWmpE++QRuuSUEw+jRcVclImVx+HAIgiefDCeS7NkDTZvCiBF5QVC/ftxVVj6FQjnVqRM6ni++GCZOhFatwhTcIpK+Dh8OY49yg2DfPmjWLPQXjh4d+gfq1Yu7yngpFI7DCSeE/1j9+8M3vxnOUe7ePe6qRCTZoUNhrNGTT8Jzz8H+/dCiRQiB0aNh8GAFQTKFwnFq3jycmfDVr+aNeu7QIe6qRGq2gwfD3+WTT4bpag4cCINOx44NQTBoENStG3eV6UmhUAHatQuHpBdcEEYu/u1v0Lp13FWJ1CwHDsALL4QgeP75EAytWsGECSEIBg5UEJSGQqGCdOoUvpEMGQKXXgqvvw6NG8ddlUj1tn9/CIAFC8LvQ4fCTKNXXhmCYMAADTItK71dFeirXw1Tbo8cCWPGhFPb9M1EpGLt2xcC4MknQxPRoUNhMNnVV4cZBwYMgNq1466y6lIoVLDhw8NFer77XZg8GX7/++o7yEWksuzdG47En3wydBofPhxGEV9zTQiCCy5QEFQUhUIKTJ4cBrfdfXcYw3DvvXFXJFL1fPZZXh/BokVhgNlpp4UvXKNHQ79+8V97oDpSKKTInXeG6TDuuy98o7n11rgrEklfR46Ei9AsW5b38/774b42bcL0MqNHh2ubKAhSS6GQImYwc2YY9XzbbaHN89ixEBabN4frNEyfHs6MEKlJ3MMH/ltv5QXAqlVhziEIRwN9+4amoYEDw20FQeVRKKRQ7dowZ06YXveKK8JZEEeOhPs2bYIpU8JtBYNUZzt25AXAW2+Fn88+C/c1agSZmeFIuk+fEABt28Zbb01n7h53DWWSmZnpWVlZcZdRJrt3h9Pkvvjiy/e1bw8bN1Z6SSIpcfhw+Naf3Az0wQfhvlq1oEuXvA//vn3DqdzqIK4cZrbC3TNL2k5HCpWgWbPCAwFCU5JIVXTsGGzYkHcEsGxZ6BfI/b/etm344L/22vD73HM1dqcqUChUkvbtQ5NRQaefXvm1iJTH9u153/5zm4F27w73NW4MvXvD974XAqBPn9A3IFWPQqGSTJ8e+hAOHsy/vn59+MMfwpkVNWW+dkl/hw7BypX5O4Nzmzlr1w4XmRkzJq8Z6Jxz1AxUXZQqFMzsK0C2ux8xswuBbsAf3H13CY8bBvwaqA084u4zCtx/HXADcBTYD0xx93fL/CqqgNzO5Nyzj9q0CbOrZmXBVVeF6zJccUUIji5d4q1VapajR+G99/K+/b/1FqxeDTk54f7TTw8f/DfeGI4AevUKHcRSPZWqo9nMVgGZQAbwAvAs0NndLynmMbWBfwEXAdnAcmBc8oe+mZ3o7nuj28OB/3T3YcXVUhU7movjDn/9K8yeDU89FU7LO//8EA5jxkDDhnFXKNWJO2Rn5w+ArKwwhxCEi8z07p3XGdynTxiAKVVfRXc0H3P3HDMbCTzk7g+Z2T9LeEwf4H13/yAqaD4wAkiEQm4gRBoBVetUqApgBhdeGH527IA//jEExNVXh9P0Jk4MIzh1nQYpj927w4d+8imhH38c7qtXD3r0gEmTwod/nz5w1lkaE1DTlTYUvjCzccBVwGXRupKmemsDfJS0nA30LbiRmd0A3A7UAwYXtiMzmwJMATi9GvfMtmoVBrrdeissXRrC4ZFHwqU/+/QJRw+XX64zOKRwuaOCk48C1q/Pu//ss8MlJnOPALp1q5mXm5Tilbb5qBNwHfB3d59nZh2AMe5e5Kw+ZjYKGObuk6PlK4C+7n5jEduPB77m7lcVV0t1az4qyWef5R09vPsuNGkS+iemTIGePeOurnrYvBn+8Y/wAdmqVfhp2TJcQCldO0+PHYN//St/AKxalXc66Cmn5H349+kTBog1axZvzRKv0jYflXnwmpk1B9q5++oStjsfuMfdvxYt/xDA3X9RxPa1gF3u3rS4/da0UMjlHq7qNns2PPFEGCSUmRnCYezYEBZSOp99BosXw2uvwauvhnPtC2MWgiE3JAr7XXBdixapmb9/69b8AbB8eZg5FPJOB80NgD59wokMmp1XklVoKJjZX4DhhOamFcCnwN/c/fZiHlOH0NE8BNhC6Gge7+5rk7Y5y903RLcvA35cUtE1NRSS7doVps+YPRveeSd8KIwfHwLi3HPjri79HD4crob36qvhZ8WKELKNG4e+nKFDwxz8x47Bzp2hb6fg74K3Dx8u+vmaNSs+OAr+btky/3U39u7N6wfI/dmyJdxXp05o9kkOAJ0OKqVR0aHwT3fvaWaTCUcJPzaz1e7erYTHXQI8SDgl9XfuPt3MpgFZ7r7QzH4NDAW+AHYBNyaHRmEUCnncQ+fh7Nkwf344t7xnzxAO48fDiSfGXWE8jh4NTSm5IbB0afgQr1MnnNk1dGj46d27/BdBOniw+OAo7PeBA0Xv78QTQ0jUrh0mi8v9szzzzPwB0KOHxrNI+VR0KKwBLgYeA+509+WlCYVUUCgUbvduePzxcIGf1avDqazjxoWA6N27ejcluIf5dXJD4PXX8yZc69o1LwQGDIi3k/7QoRAOxR2NHD4czjTr2zc0D7ZsGV+9Ur1UdCiMBu4mNBldb2ZnAL90928ff6llo1Aonntob549G+bNC99ou3cP4TBhQjgPvTr49NPw4Z/bL5A72rZdu7wQGDxY59iL5EpZR3PcFAqlt3dv3tHDqlWh2WHs2BAQfftWraOHAwfgjTfyjgbefjusb9YMBg3KC4Kzzqpar0ukslT0kUJb4CGgX7TqDeAWd88+rirLQaFQdu6hc3X27BASBw6EZpXBg8PZNbk/zZrl/928eQiSOD5kc3JCZ2tuCLz5Zjjdsl69cD3e3BDo1UudrCKlUdGh8ArwOPDHaNVEYIK7X
3RcVZaDQuH47NsXOqUfeQTWrQvLxalb98tBUTA8irqvadPSf2C7h/l3Xn01NAktXhyOdMxC53luCPTrp6k/RMqjokNhlbv3KGldZVAoVKycHNizJ5zmunt3+J18u6R1R48Wv/+mTYsPj8aN4Z//DGGwdWt4zBln5IXAoEHhrBwROT4VPffRTjObCMyLlscBO8tbnKSPOnXyzpUvK/fQFFWWENmwIW9d7imarVqFpqyhQ2HIkBAKIhKP0obCdwh9Cr8iTFr3JjApRTVJFWEWvuk3bhzO+imrzz8PTUQtWmgSNpF0Uao/RXff5O7D3b21u5/k7t8EKv10VKle6tULRwkKBJH0cTx/jkVOcSEiIlXT8YSCzgYXEalmjicUqtaoNxERKVGxHc1mto/CP/wN0LRcIiLVTLGh4O6apV9EpAbReR8iIpKgUBARkQSFgoiIJCgUREQkQaEgIiIJCgUREUlQKIh5FFE0AAAJvklEQVSISIJCQUREEhQKIiKSoFCogebOhYyMMGV1RkZYFhGB0l9kR6qJuXNhyhQ4eDAsb9oUlgEmTIivLhFJDzpSqGHuvDMvEHIdPBjWi4goFGqYzZvLtl5EahaFQg1z+ullWy8iNYtCoYaZPh0aNsy/rmHDsF5ERKFQw0yYALNnQ/v2YBZ+z56tTmYRCXT2UQ00YYJCQEQKpyMFERFJUCiIiEiCQkFERBIUCiIikpDSUDCzYWa23szeN7Ophdx/u5m9a2arzew1M2ufynpERKR4KQsFM6sNzAK+DnQCxplZpwKb/RPIdPduwALgvlTVIyIiJUvlkUIf4H13/8DdPwfmAyOSN3D3xe6eOxPPP4C2KaxHRERKkMpQaAN8lLScHa0ryjXAi4XdYWZTzCzLzLK2b99egSWKiEiytOhoNrOJQCbwy8Lud/fZ7p7p7pmtW7eu3OJERGqQVI5o3gK0S1puG63Lx8yGAncCA939SArrERGREqTySGE5cJaZdTCzesBYYGHyBmbWE3gYGO7un6awFhERKYWUhYK75wA3AouAdcAT7r7WzKaZ2fBos18CjYEnzWyVmS0sYnciIlIJUtqn4O4vuPt/uPtX3H16tO5H7r4wuj3U3U929x7Rz/Di9yjVia4VLZJ+NEuqxELXihZJT2lx9pHUPLpWtEh6UihILHStaJH0pFCQWOha0SLpSaEgsdC1okXSk0JBYqFrRYukJ519JLHRtaJF0o+OFEREJEGhICIiCQoFERFJUCiIiEiCQkFERBIUCiIikqBQEBGRBIWC1HiawlskjwavSY2mKbxF8tORgtRomsJbJD+FgtRomsJbJD+FgtRomsJbJD+FgtRomsJbJD+FgtRomsJbJD+dfSQ1nqbwFsmjIwUREUlQKIiISIJCQSQNaFS1pAv1KYjETKOqJZ3oSEEkZhpVLelEoSASM42qlnSiUBCJmUZVSzpRKIjETKOqJZ0oFERiplHVkk509pFIGtCoakkXOlIQkQSNl5CUhoKZDTOz9Wb2vplNLeT+AWa20sxyzGxUKmsRkeLljpfYtAnc88ZLKBhqlpSFgpnVBmYBXwc6AePMrFOBzTYDk4DHU1WHiJSOxksIpLZPoQ/wvrt/AGBm84ERwLu5G7j7xui+YymsQ0RKQeMlBFLbfNQG+ChpOTtaJyJpSOMlBKpIR7OZTTGzLDPL2r59e9zliFRLGi8hkNpQ2AK0S1puG60rM3ef7e6Z7p7ZunXrCilORPLTeAmB1PYpLAfOMrMOhDAYC4xP4fOJyHHSeAlJ2ZGCu+cANwKLgHXAE+6+1symmdlwADPrbWbZwGjgYTNbm6p6RESkZCntU3D3F9z9P9z9K+4+PVr3I3dfGN1e7u5t3b2Ru7d0986prEdEqgYNoouPprkQkbSiiw7Fq0qcfSQiNYcG0cVLoSAiaUWD6OKlUBCRtKJBdPFSKIhIWkmnQXQ1scNboSAiaSVdBtHV1Fljzd3jrqFMMjMzPSsrK+4yRKSay8gIQVBQ+/awcWNlV3P8zGyFu2eWtJ2OFEREClFTO7wVCiIihaipHd4KBRGRQqRTh3dlUiiIiBQiXTq8oXLPgtI0FyIiRUiHWWMre9oPHSmIiKSxyp72Q6EgIpLGKvssKIWCiEgaq+yzoBQKIiJprLLPglIoiIiksco+C0pnH4mIpLnKPAtKRwoiIpKgUBARkQSFgoiIJCgUREQkQaEgIiIJVe4iO2a2HSjk0hdVSitgR9xFpBG9H3n0XuSn9yO/43k/2rt765I2qnKhUB2YWVZproBUU+j9yKP3Ij+9H/lVxvuh5iMREUlQKIiISIJCIR6z4y4gzej9yKP3Ij+9H/ml/P1Qn4KIiCToSEFERBIUCiIikqBQqERm1s7MFpvZu2a21sxuibumuJlZbTP7p5n9Oe5a4mZmzcxsgZm9Z2brzOz8uGuKk5ndFv2dvGNm88ysQdw1VRYz+52ZfWpm7ySta2Fmr5jZhuh381Q8t0KhcuUA33P3TsB5wA1m1inmmuJ2C7Au7iLSxK+Bl9z9HKA7Nfh9MbM2wM1Aprt3AWoDY+OtqlL9HhhWYN1U4DV3Pwt4LVqucAqFSuTu29x9ZXR7H+GPvk28VcXHzNoClwKPxF1L3MysKTAA+H8A7v65u++Ot6rY1QFOMLM6QENga8z1VBp3XwJ8VmD1COCx6PZjwDdT8dwKhZiYWQbQE1gWbyWxehD4P8CxuAtJAx2A7cCjUXPaI2bWKO6i4uLuW4D7gc3ANmCPu78cb1WxO9ndt0W3PwZOTsWTKBRiYGaNgaeAW919b9z1xMHMvgF86u4r4q4lTdQBegG/cfeewAFS1DxQFUTt5SMIYXka0MjMJsZbVfrwMJYgJeMJFAqVzMzqEgJhrrv/b9z1xKgfMNzMNgLzgcFmNifekmKVDWS7e+6R4wJCSNRUQ4EP3X27u38B/C/w1ZhritsnZnYqQPT701Q8iUKhEpmZEdqM17n7A3HXEyd3/6G7t3X3DEIH4uvuXmO/Cbr7x8BHZnZ2tGoI8G6MJcVtM3CemTWM/m6GUIM73iMLgaui21cBz6biSRQKlasfcAXhW/Gq6OeSuIuStHETMNfMVgM9gJ/HXE9soiOmBcBKYA3hs6rGTHlhZvOAvwNnm1m2mV0DzAAuMrMNhCOpGSl5bk1zISIiuXSkICIiCQoFERFJUCiIiEiCQkFERBIUCiIikqBQEImY2dGkU4VXmVmFjSg2s4zkGS9F0lWduAsQSSOH3L1H3EWIxElHCiIlMLONZnafma0xs7fM7MxofYaZvW5mq83sNTM7PVp/spk9bWZvRz+50zPUNrPfRtcIeNnMToi2vzm6xsZqM5sf08sUARQKIslOKNB8dHnSfXvcvSvw34TZXQEeAh5z927AXGBmtH4m8Fd3706Yv2httP4sYJa7dwZ2A9+O1k8Fekb7uS5VL06kNDSiWSRiZvvdvXEh6zcCg939g2hCw4/dvaWZ7QBOdfcvovXb3L2VmW0H2rr7kaR9ZACvRBdIwcx+ANR195+Z2UvAfuAZ4Bl335/ilypSJB0piJSOF3G7
LI4k3T5KXp/epcAswlHF8uiiMiKxUCiIlM7lSb//Ht1+k7xLRE4A3ohuvwZcD4lrUDctaqdmVgto5+6LgR8ATYEvHa2IVBZ9IxHJc4KZrUpafsndc09LbR7NXnoEGBetu4lwpbT/Ilw17epo/S3A7Ghmy6OEgNhG4WoDc6LgMGCmLsMpcVKfgkgJoj6FTHffEXctIqmm5iMREUnQkYKIiCToSEFERBIUCiIikqBQEBGRBIWCiIgkKBRERCTh/wNMUCFthOGixgAAAABJRU5ErkJggg==\n",
474 | "text/plain": [
475 |        "<Figure size 432x288 with 1 Axes>"
476 | ]
477 | },
478 | "metadata": {
479 | "needs_background": "light"
480 | },
481 | "output_type": "display_data"
482 | }
483 | ],
484 | "source": [
485 | "import matplotlib.pyplot as plt\n",
486 | "\n",
487 | "acc = history.history['accuracy']\n",
488 | "val_acc = history.history['val_accuracy']\n",
489 | "loss = history.history['loss']\n",
490 | "val_loss = history.history['val_loss']\n",
491 | "\n",
492 | "epochs = range(1, len(acc) + 1)\n",
493 | "\n",
494 | "# \"bo\" is for \"blue dot\"\n",
495 | "plt.plot(epochs, loss, 'bo', label='Training loss')\n",
496 |     "# \"b\" is for \"solid blue line\"\n",
497 | "plt.plot(epochs, val_loss, 'b', label='Validation loss')\n",
498 | "plt.title('Training and validation loss')\n",
499 | "plt.xlabel('Epochs')\n",
500 | "plt.ylabel('Loss')\n",
501 | "plt.legend()\n",
502 | "\n",
503 | "plt.show()"
504 | ]
505 | },
506 | {
507 | "cell_type": "code",
508 | "execution_count": 19,
509 | "metadata": {},
510 | "outputs": [
511 | {
512 | "data": {
513 |       "image/png": "[base64 PNG data omitted: output figure 'Training and validation accuracy']",
514 | "text/plain": [
515 |        "<Figure size 432x288 with 1 Axes>"
516 | ]
517 | },
518 | "metadata": {
519 | "needs_background": "light"
520 | },
521 | "output_type": "display_data"
522 | }
523 | ],
524 | "source": [
525 | "plt.clf() # clear figure\n",
526 | "acc_values = history_dict['accuracy']\n",
527 | "val_acc_values = history_dict['val_accuracy']\n",
528 | "\n",
529 |     "plt.plot(epochs, acc_values, 'bo', label='Training acc')\n",
530 |     "plt.plot(epochs, val_acc_values, 'b', label='Validation acc')\n",
531 |     "plt.title('Training and validation accuracy')\n",
532 |     "plt.xlabel('Epochs')\n",
533 |     "plt.ylabel('Accuracy')\n",
534 | "plt.legend()\n",
535 | "\n",
536 | "plt.show()"
537 | ]
538 | },
539 | {
540 | "cell_type": "markdown",
541 | "metadata": {},
542 | "source": [
543 | "## Using a trained network to generate predictions on new data\n",
544 | "\n",
545 |     "Having trained a network, you will want to use it in a practical setting. \n",
546 | "\n",
547 | "You can generate the likelihood of reviews being positive by using the `predict` method:"
548 | ]
549 | },
550 | {
551 | "cell_type": "code",
552 | "execution_count": 20,
553 | "metadata": {},
554 | "outputs": [
555 | {
556 | "data": {
557 | "text/plain": [
558 | "array([[0.08146486],\n",
559 | " [0.9999759 ],\n",
560 | " [0.61180466],\n",
561 | " ...,\n",
562 | " [0.03263709],\n",
563 | " [0.02457938],\n",
564 | " [0.45536834]], dtype=float32)"
565 | ]
566 | },
567 | "execution_count": 20,
568 | "metadata": {},
569 | "output_type": "execute_result"
570 | }
571 | ],
572 | "source": [
573 | "model.predict(x_test)"
574 | ]
575 | },
576 | {
577 | "cell_type": "markdown",
578 | "metadata": {},
579 | "source": [
580 | "As you can see, the network is very confident for some samples (0.99 or more, or 0.01 or less) but less confident for others (0.6, 0.4). \n"
581 | ]
582 | },
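  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "If you want hard class labels rather than probabilities, a common approach is to threshold the predictions at 0.5. A quick sketch, reusing `model` and `x_test` from above (the name `pred_labels` is just illustrative):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Probabilities above 0.5 become class 1 (positive review), the rest class 0.\n",
    "pred_labels = (model.predict(x_test) > 0.5).astype('int32').ravel()\n",
    "pred_labels[:10]"
   ]
  },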
583 | {
584 | "cell_type": "markdown",
585 | "metadata": {},
586 | "source": [
587 | "## Conclusions\n",
588 | "\n",
589 | "\n",
590 | "Here's what you should take away from this example:\n",
591 | "\n",
592 | "* There's usually quite a bit of preprocessing you need to do on your raw data in order to be able to feed it -- as tensors -- into a neural \n",
593 | "network. \n",
594 | "\n",
595 | "* In the case of sequences of words, they can be encoded as binary vectors -- but there are other encoding options too.\n",
596 | "* Stacks of `Dense` layers with `relu` activations can solve a wide range of problems (including sentiment classification).\n",
597 |     "* In a binary classification problem (two output classes), your network should end with a `Dense` layer with 1 unit and a `sigmoid` activation, \n",
598 |     "i.e. the output of your network should be a scalar between 0 and 1, encoding a probability."
600 | ]
601 | },
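  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Here is what those takeaways look like in one compact sketch; the name `recap_model` and the layer sizes are illustrative, not prescribed:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from tensorflow.keras import models, layers\n",
    "\n",
    "# A stack of relu Dense layers ending in a single sigmoid unit,\n",
    "# compiled with binary crossentropy for a two-class problem.\n",
    "recap_model = models.Sequential()\n",
    "recap_model.add(layers.Dense(16, activation='relu', input_shape=(10000,)))\n",
    "recap_model.add(layers.Dense(16, activation='relu'))\n",
    "recap_model.add(layers.Dense(1, activation='sigmoid'))\n",
    "recap_model.compile(optimizer='rmsprop',\n",
    "                    loss='binary_crossentropy',\n",
    "                    metrics=['accuracy'])"
   ]
  },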
602 | {
603 | "cell_type": "code",
604 | "execution_count": null,
605 | "metadata": {},
606 | "outputs": [],
607 | "source": []
608 | }
609 | ],
610 | "metadata": {
611 | "kernelspec": {
612 | "display_name": "Python 3",
613 | "language": "python",
614 | "name": "python3"
615 | },
616 | "language_info": {
617 | "codemirror_mode": {
618 | "name": "ipython",
619 | "version": 3
620 | },
621 | "file_extension": ".py",
622 | "mimetype": "text/x-python",
623 | "name": "python",
624 | "nbconvert_exporter": "python",
625 | "pygments_lexer": "ipython3",
626 | "version": "3.7.1"
627 | }
628 | },
629 | "nbformat": 4,
630 | "nbformat_minor": 2
631 | }
632 |
--------------------------------------------------------------------------------
/Section 07/04-05 - Reuters dataset for News Text Multi-label Classification.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Classifying newswires: a multi-class classification example\n",
8 | "\n",
9 | "In the previous video we saw how to classify vector inputs into two mutually exclusive classes using a densely-connected neural network. \n",
10 | "\n",
11 | "\n",
12 | "But what happens when you have more than two classes? "
13 | ]
14 | },
15 | {
16 | "cell_type": "markdown",
17 | "metadata": {},
18 | "source": [
19 |     "In this video, we will build a network to classify Reuters newswires into 46 mutually exclusive topics. "
20 | ]
21 | },
22 | {
23 | "cell_type": "markdown",
24 | "metadata": {},
25 | "source": [
26 | "## The Reuters dataset\n",
27 | "\n",
28 | "\n",
29 | "We will be working with the _Reuters dataset_, a set of short newswires and their topics, published by Reuters in 1986.\n",
30 | "\n",
31 | "Like IMDB and MNIST, the Reuters dataset comes packaged as part of Keras. Let's take a look right away:"
32 | ]
33 | },
34 | {
35 | "cell_type": "code",
36 | "execution_count": 1,
37 | "metadata": {},
38 | "outputs": [],
39 | "source": [
40 | "from tensorflow import keras\n",
41 | "from tensorflow.keras.datasets import reuters\n",
42 | "\n",
43 | "(train_data, train_labels), (test_data, test_labels) = reuters.load_data(num_words=10000)"
44 | ]
45 | },
46 | {
47 | "cell_type": "code",
48 | "execution_count": 2,
49 | "metadata": {},
50 | "outputs": [
51 | {
52 | "data": {
53 | "text/plain": [
54 | "'2.0.0-beta0'"
55 | ]
56 | },
57 | "execution_count": 2,
58 | "metadata": {},
59 | "output_type": "execute_result"
60 | }
61 | ],
62 | "source": [
63 | "import tensorflow as tf\n",
64 | "tf.__version__"
65 | ]
66 | },
67 | {
68 | "cell_type": "markdown",
69 | "metadata": {},
70 | "source": [
71 | "\n",
72 |     "As with the IMDB dataset, the argument `num_words=10000` restricts the data to the 10,000 most frequently occurring words found in the \n",
73 | "data.\n",
74 | "\n",
75 | "We have 8,982 training examples and 2,246 test examples:"
76 | ]
77 | },
78 | {
79 | "cell_type": "code",
80 | "execution_count": 3,
81 | "metadata": {},
82 | "outputs": [
83 | {
84 | "data": {
85 | "text/plain": [
86 | "8982"
87 | ]
88 | },
89 | "execution_count": 3,
90 | "metadata": {},
91 | "output_type": "execute_result"
92 | }
93 | ],
94 | "source": [
95 | "len(train_data)"
96 | ]
97 | },
98 | {
99 | "cell_type": "code",
100 | "execution_count": 4,
101 | "metadata": {},
102 | "outputs": [
103 | {
104 | "data": {
105 | "text/plain": [
106 | "2246"
107 | ]
108 | },
109 | "execution_count": 4,
110 | "metadata": {},
111 | "output_type": "execute_result"
112 | }
113 | ],
114 | "source": [
115 | "len(test_data)"
116 | ]
117 | },
118 | {
119 | "cell_type": "markdown",
120 | "metadata": {},
121 | "source": [
122 | "As with the IMDB reviews, each example is a list of integers (word indices):"
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "execution_count": 7,
128 | "metadata": {},
129 | "outputs": [
130 | {
131 | "name": "stdout",
132 | "output_type": "stream",
133 | "text": [
134 | "[1, 245, 273, 207, 156, 53, 74, 160, 26, 14, 46, 296, 26, 39, 74, 2979, 3554, 14, 46, 4689, 4329, 86, 61, 3499, 4795, 14, 61, 451, 4329, 17, 12]\n"
135 | ]
136 | }
137 | ],
138 | "source": [
139 | "print(train_data[10])"
140 | ]
141 | },
142 | {
143 | "cell_type": "markdown",
144 | "metadata": {},
145 | "source": [
146 | "Here's how you can decode it back to words, in case you are curious:"
147 | ]
148 | },
149 | {
150 | "cell_type": "code",
151 | "execution_count": 8,
152 | "metadata": {},
153 | "outputs": [],
154 | "source": [
155 | "word_index = reuters.get_word_index()\n",
156 | "reverse_word_index = dict([(value, key) for (key, value) in word_index.items()])\n",
157 | "# Note that our indices were offset by 3\n",
158 | "# because 0, 1 and 2 are reserved indices for \"padding\", \"start of sequence\", and \"unknown\".\n",
159 | "decoded_newswire = ' '.join([reverse_word_index.get(i - 3, '?') for i in train_data[0]])"
160 | ]
161 | },
162 | {
163 | "cell_type": "code",
164 | "execution_count": 9,
165 | "metadata": {},
166 | "outputs": [
167 | {
168 | "data": {
169 | "text/plain": [
170 | "'? ? ? said as a result of its december acquisition of space co it expects earnings per share in 1987 of 1 15 to 1 30 dlrs per share up from 70 cts in 1986 the company said pretax net should rise to nine to 10 mln dlrs from six mln dlrs in 1986 and rental operation revenues to 19 to 22 mln dlrs from 12 5 mln dlrs it said cash flow per share this year should be 2 50 to three dlrs reuter 3'"
171 | ]
172 | },
173 | "execution_count": 9,
174 | "metadata": {},
175 | "output_type": "execute_result"
176 | }
177 | ],
178 | "source": [
179 | "decoded_newswire"
180 | ]
181 | },
182 | {
183 | "cell_type": "markdown",
184 | "metadata": {},
185 | "source": [
186 | "The label associated with an example is an integer between 0 and 45: a topic index."
187 | ]
188 | },
189 | {
190 | "cell_type": "code",
191 | "execution_count": 10,
192 | "metadata": {},
193 | "outputs": [
194 | {
195 | "data": {
196 | "text/plain": [
197 | "3"
198 | ]
199 | },
200 | "execution_count": 10,
201 | "metadata": {},
202 | "output_type": "execute_result"
203 | }
204 | ],
205 | "source": [
206 | "train_labels[10]"
207 | ]
208 | },
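  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "You can verify that range directly:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# The smallest and largest topic indices present in the training labels.\n",
    "train_labels.min(), train_labels.max()"
   ]
  },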
209 | {
210 | "cell_type": "markdown",
211 | "metadata": {},
212 | "source": [
213 | "## Preparing the data\n",
214 | "\n",
215 | "We can vectorize the data with the exact same code as in our previous example:"
216 | ]
217 | },
218 | {
219 | "cell_type": "code",
220 | "execution_count": 11,
221 | "metadata": {},
222 | "outputs": [],
223 | "source": [
224 | "import numpy as np\n",
225 | "\n",
226 | "def vectorize_sequences(sequences, dimension=10000):\n",
227 | " results = np.zeros((len(sequences), dimension))\n",
228 | " for i, sequence in enumerate(sequences):\n",
229 | " results[i, sequence] = 1.\n",
230 | " return results\n",
231 | "\n",
232 | "# Our vectorized training data\n",
233 | "x_train = vectorize_sequences(train_data)\n",
234 | "# Our vectorized test data\n",
235 | "x_test = vectorize_sequences(test_data)"
236 | ]
237 | },
238 | {
239 | "cell_type": "markdown",
240 | "metadata": {},
241 | "source": [
242 | "\n",
243 |     "To vectorize the labels, we have two options: we could cast the label list as an integer tensor, or we could use a \"one-hot\" encoding. \n",
244 | "\n",
245 | "One-hot encoding is a widely used format for categorical data, also called \"categorical encoding\". "
246 | ]
247 | },
248 | {
249 | "cell_type": "markdown",
250 | "metadata": {},
251 | "source": [
252 |     "Note that there is a built-in way to do this in Keras:"
253 | ]
254 | },
255 | {
256 | "cell_type": "code",
257 | "execution_count": 12,
258 | "metadata": {},
259 |    "outputs": [],
268 | "source": [
269 |     "from tensorflow.keras.utils import to_categorical\n",
270 | "\n",
271 | "one_hot_train_labels = to_categorical(train_labels)\n",
272 | "one_hot_test_labels = to_categorical(test_labels)"
273 | ]
274 | },
275 | {
276 | "cell_type": "markdown",
277 | "metadata": {},
278 | "source": [
279 | "## Building our network\n"
280 | ]
281 | },
282 | {
283 | "cell_type": "code",
284 | "execution_count": 13,
285 | "metadata": {},
286 | "outputs": [],
287 | "source": [
288 | "from tensorflow.keras import models\n",
289 | "from tensorflow.keras import layers\n",
290 | "\n",
291 | "model = models.Sequential()\n",
292 | "model.add(layers.Dense(64, activation='relu', input_shape=(10000,)))\n",
293 | "model.add(layers.Dense(64, activation='relu'))\n",
294 | "model.add(layers.Dense(46, activation='softmax'))"
295 | ]
296 | },
297 | {
298 | "cell_type": "markdown",
299 | "metadata": {},
300 | "source": [
301 | "We are ending the network with a `Dense` layer of size 46. \n",
302 | "\n",
303 | "This means that for each input sample, our network will output a 46-dimensional vector. \n",
304 | "\n",
305 | "Each entry in this vector (each dimension) will encode a different output class.\n"
306 | ]
307 | },
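  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "You can inspect the resulting architecture, and confirm the 46-unit output layer, with `summary`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "model.summary()"
   ]
  },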
308 | {
309 | "cell_type": "code",
310 | "execution_count": 14,
311 | "metadata": {},
312 | "outputs": [],
313 | "source": [
314 | "model.compile(optimizer='rmsprop',\n",
315 | " loss='categorical_crossentropy',\n",
316 | " metrics=['accuracy'])"
317 | ]
318 | },
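  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As an aside, if we had kept the labels as plain integers instead of one-hot vectors, the only change needed would be the loss function. A reference sketch, left commented out because the model above is compiled for one-hot labels:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# model.compile(optimizer='rmsprop',\n",
    "#               loss='sparse_categorical_crossentropy',\n",
    "#               metrics=['accuracy'])"
   ]
  },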
319 | {
320 | "cell_type": "markdown",
321 | "metadata": {},
322 | "source": [
323 | "## Validating our approach\n",
324 | "\n",
325 | "Let's set apart 1,000 samples in our training data to use as a validation set:"
326 | ]
327 | },
328 | {
329 | "cell_type": "code",
330 | "execution_count": 15,
331 | "metadata": {},
332 | "outputs": [],
333 | "source": [
334 | "x_val = x_train[:1000]\n",
335 | "partial_x_train = x_train[1000:]\n",
336 | "\n",
337 | "y_val = one_hot_train_labels[:1000]\n",
338 | "partial_y_train = one_hot_train_labels[1000:]"
339 | ]
340 | },
341 | {
342 | "cell_type": "markdown",
343 | "metadata": {},
344 | "source": [
345 | "Now let's train our network for 10 epochs:"
346 | ]
347 | },
348 | {
349 | "cell_type": "code",
350 | "execution_count": 16,
351 | "metadata": {},
352 | "outputs": [
353 | {
354 | "name": "stdout",
355 | "output_type": "stream",
356 | "text": [
357 | "Train on 7982 samples, validate on 1000 samples\n",
358 | "Epoch 1/10\n",
359 | "7982/7982 [==============================] - 2s 267us/sample - loss: 2.5931 - accuracy: 0.5510 - val_loss: 1.6887 - val_accuracy: 0.6570\n",
360 | "Epoch 2/10\n",
361 | "7982/7982 [==============================] - 1s 166us/sample - loss: 1.3786 - accuracy: 0.7179 - val_loss: 1.2672 - val_accuracy: 0.7280\n",
362 | "Epoch 3/10\n",
363 | "7982/7982 [==============================] - 1s 168us/sample - loss: 1.0071 - accuracy: 0.7885 - val_loss: 1.1052 - val_accuracy: 0.7630\n",
364 | "Epoch 4/10\n",
365 | "7982/7982 [==============================] - 1s 163us/sample - loss: 0.7874 - accuracy: 0.8385 - val_loss: 1.0230 - val_accuracy: 0.7780\n",
366 | "Epoch 5/10\n",
367 | "7982/7982 [==============================] - 1s 164us/sample - loss: 0.6268 - accuracy: 0.8710 - val_loss: 0.9599 - val_accuracy: 0.7840\n",
368 | "Epoch 6/10\n",
369 | "7982/7982 [==============================] - 1s 181us/sample - loss: 0.5039 - accuracy: 0.8960 - val_loss: 0.9170 - val_accuracy: 0.8020\n",
370 | "Epoch 7/10\n",
371 | "7982/7982 [==============================] - 1s 180us/sample - loss: 0.4073 - accuracy: 0.9137 - val_loss: 0.9205 - val_accuracy: 0.8010\n",
372 | "Epoch 8/10\n",
373 | "7982/7982 [==============================] - 1s 182us/sample - loss: 0.3370 - accuracy: 0.9285 - val_loss: 0.9135 - val_accuracy: 0.8100\n",
374 | "Epoch 9/10\n",
375 | "7982/7982 [==============================] - 2s 190us/sample - loss: 0.2768 - accuracy: 0.9409 - val_loss: 0.9342 - val_accuracy: 0.7970\n",
376 | "Epoch 10/10\n",
377 | "7982/7982 [==============================] - 1s 184us/sample - loss: 0.2368 - accuracy: 0.9450 - val_loss: 0.8982 - val_accuracy: 0.8130\n"
378 | ]
379 | }
380 | ],
381 | "source": [
382 | "history = model.fit(partial_x_train,\n",
383 | " partial_y_train,\n",
384 | " epochs=10,\n",
385 | " batch_size=512,\n",
386 | " validation_data=(x_val, y_val))"
387 | ]
388 | },
389 | {
390 | "cell_type": "code",
391 | "execution_count": 17,
392 | "metadata": {},
393 | "outputs": [
394 | {
395 | "data": {
396 | "text/plain": [
397 | "dict_keys(['loss', 'accuracy', 'val_loss', 'val_accuracy'])"
398 | ]
399 | },
400 | "execution_count": 17,
401 | "metadata": {},
402 | "output_type": "execute_result"
403 | }
404 | ],
405 | "source": [
406 | "history_dict = history.history\n",
407 | "history_dict.keys()"
408 | ]
409 | },
410 | {
411 | "cell_type": "markdown",
412 | "metadata": {},
413 | "source": [
414 | "Let's display its loss and accuracy curves:"
415 | ]
416 | },
417 | {
418 | "cell_type": "code",
419 | "execution_count": 19,
420 | "metadata": {},
421 | "outputs": [
422 | {
423 | "data": {
424 |       "image/png": "[base64 PNG data omitted: output figure 'Training and validation loss']",
425 | "text/plain": [
426 | ""
427 | ]
428 | },
429 | "metadata": {
430 | "needs_background": "light"
431 | },
432 | "output_type": "display_data"
433 | }
434 | ],
435 | "source": [
436 | "import matplotlib.pyplot as plt\n",
437 | "\n",
438 | "loss = history.history['loss']\n",
439 | "val_loss = history.history['val_loss']\n",
440 | "\n",
441 | "epochs = range(1, len(loss) + 1)\n",
442 | "\n",
443 | "plt.plot(epochs, loss, 'bo', label='Training loss')\n",
444 | "plt.plot(epochs, val_loss, 'b', label='Validation loss')\n",
445 | "plt.title('Training and validation loss')\n",
446 | "plt.xlabel('Epochs')\n",
447 | "plt.ylabel('Loss')\n",
448 | "plt.legend()\n",
449 | "\n",
450 | "plt.show()"
451 | ]
452 | },
453 | {
454 | "cell_type": "code",
455 | "execution_count": 20,
456 | "metadata": {},
457 | "outputs": [
458 | {
459 | "data": {
460 |       "image/png": "[base64 PNG data omitted: output figure 'Training and validation accuracy']",
...[base64 PNG data truncated: 'Training and validation accuracy' plot]...\n",
461 | "text/plain": [
462 | ""
463 | ]
464 | },
465 | "metadata": {
466 | "needs_background": "light"
467 | },
468 | "output_type": "display_data"
469 | }
470 | ],
471 | "source": [
472 | "plt.clf() # clear figure\n",
473 | "\n",
474 | "acc = history.history['accuracy']\n",
475 | "val_acc = history.history['val_accuracy']\n",
476 | "\n",
477 | "plt.plot(epochs, acc, 'bo', label='Training acc')\n",
478 | "plt.plot(epochs, val_acc, 'b', label='Validation acc')\n",
479 | "plt.title('Training and validation accuracy')\n",
480 | "plt.xlabel('Epochs')\n",
481 | "plt.ylabel('Loss')\n",
482 | "plt.legend()\n",
483 | "\n",
484 | "plt.show()"
485 | ]
486 | },
487 | {
488 | "cell_type": "markdown",
489 | "metadata": {},
490 | "source": [
491 | "## Generating predictions on new data\n",
492 | "\n",
493 | "We can verify that the `predict` method of our model instance returns a probability distribution over all 46 topics. \n",
494 | "\n",
495 | "Let's generate topic predictions for all of the test data:"
496 | ]
497 | },
498 | {
499 | "cell_type": "code",
500 | "execution_count": 21,
501 | "metadata": {},
502 | "outputs": [],
503 | "source": [
504 | "predictions = model.predict(x_test)"
505 | ]
506 | },
507 | {
508 | "cell_type": "markdown",
509 | "metadata": {},
510 | "source": [
511 | "Each entry in `predictions` is a vector of length 46:"
512 | ]
513 | },
514 | {
515 | "cell_type": "code",
516 | "execution_count": 22,
517 | "metadata": {},
518 | "outputs": [
519 | {
520 | "data": {
521 | "text/plain": [
522 | "(46,)"
523 | ]
524 | },
525 | "execution_count": 22,
526 | "metadata": {},
527 | "output_type": "execute_result"
528 | }
529 | ],
530 | "source": [
531 | "predictions[0].shape"
532 | ]
533 | },
534 | {
535 | "cell_type": "markdown",
536 | "metadata": {},
537 | "source": [
538 | "The coefficients in this vector sum to 1:"
539 | ]
540 | },
541 | {
542 | "cell_type": "code",
543 | "execution_count": 23,
544 | "metadata": {},
545 | "outputs": [
546 | {
547 | "data": {
548 | "text/plain": [
549 | "1.0"
550 | ]
551 | },
552 | "execution_count": 23,
553 | "metadata": {},
554 | "output_type": "execute_result"
555 | }
556 | ],
557 | "source": [
558 | "np.sum(predictions[0])"
559 | ]
560 | },
561 | {
562 | "cell_type": "markdown",
563 | "metadata": {},
564 | "source": [
565 | "The largest entry is the predicted class, i.e. the class with the highest probability:"
566 | ]
567 | },
568 | {
569 | "cell_type": "code",
570 | "execution_count": 24,
571 | "metadata": {},
572 | "outputs": [
573 | {
574 | "data": {
575 | "text/plain": [
576 | "3"
577 | ]
578 | },
579 | "execution_count": 24,
580 | "metadata": {},
581 | "output_type": "execute_result"
582 | }
583 | ],
584 | "source": [
585 | "np.argmax(predictions[0])"
586 | ]
587 | },
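  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check, we can take the `argmax` over every prediction at once and compare against the ground truth. The sketch below assumes the one-hot encoded test labels are available as `y_test` (the name is illustrative; use whichever variable holds the labels produced by `to_categorical` earlier):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Predicted topic index for every test sample at once\n",
    "predicted_labels = np.argmax(predictions, axis=1)\n",
    "\n",
    "# Assumes y_test holds one-hot labels; argmax recovers the integer topics\n",
    "true_labels = np.argmax(y_test, axis=1)\n",
    "\n",
    "# Fraction of test samples whose predicted topic matches the true one\n",
    "np.mean(predicted_labels == true_labels)"
   ]
  },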
588 | {
589 | "cell_type": "markdown",
590 | "metadata": {},
591 | "source": [
592 | "## Wrapping up\n",
593 | "\n",
594 | "\n",
595 | "Here's what you should take away from this example:\n",
596 | "\n",
597 | "* If you are trying to classify data points between N classes, your network should end with a `Dense` layer of size N.\n",
598 | "\n",
599 | "\n",
600 | "* In a single-label, multi-class classification problem, your network should end with a `softmax` activation, so that it will output a probability distribution over the N output classes.\n",
601 | "\n",
602 | "\n",
603 | "* _Categorical crossentropy_ is almost always the loss function you should use for such problems. "
604 | ]
605 | },
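  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To make these points concrete, here is a minimal sketch of a model head and compilation step that follows all three rules. It is illustrative rather than the exact architecture used above: the hidden layer width and the `input_shape` (10,000-dimensional vectorized inputs) are assumptions, not values taken from this notebook."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from tensorflow.keras import layers, models\n",
    "\n",
    "sketch = models.Sequential([\n",
    "    # Hidden layer width is illustrative\n",
    "    layers.Dense(64, activation='relu', input_shape=(10000,)),\n",
    "    # Rules 1 and 2: a Dense layer of size N=46 with a softmax activation\n",
    "    layers.Dense(46, activation='softmax')\n",
    "])\n",
    "\n",
    "# Rule 3: categorical crossentropy pairs with one-hot labels and softmax\n",
    "sketch.compile(optimizer='rmsprop',\n",
    "               loss='categorical_crossentropy',\n",
    "               metrics=['accuracy'])"
   ]
  },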
606 | {
607 | "cell_type": "code",
608 | "execution_count": null,
609 | "metadata": {},
610 | "outputs": [],
611 | "source": []
612 | }
613 | ],
614 | "metadata": {
615 | "kernelspec": {
616 | "display_name": "Python 3",
617 | "language": "python",
618 | "name": "python3"
619 | },
620 | "language_info": {
621 | "codemirror_mode": {
622 | "name": "ipython",
623 | "version": 3
624 | },
625 | "file_extension": ".py",
626 | "mimetype": "text/x-python",
627 | "name": "python",
628 | "nbconvert_exporter": "python",
629 | "pygments_lexer": "ipython3",
630 | "version": "3.7.1"
631 | }
632 | },
633 | "nbformat": 4,
634 | "nbformat_minor": 2
635 | }
636 |
--------------------------------------------------------------------------------