├── .gitignore ├── README.md ├── Sentence classification by MorphConv.ipynb ├── Tutorial of implementing Batch norm and Drop out.ipynb ├── Tutorial of implementing Batch normalization.ipynb ├── Tutorial of implementing Class activation map.ipynb ├── Tutorial of implementing Conditional GAN.ipynb ├── Tutorial of implementing DL model by OOP.ipynb ├── Tutorial of implementing Drop out.ipynb ├── Tutorial of implementing Generative Adversarial Net.ipynb ├── Tutorial of implementing Sequence classification with RNN series.ipynb ├── Tutorial of implementing Transfer learning.ipynb ├── Tutorial of implementing Variational Auto-Encoder.ipynb ├── Tutorial of implementing Word2Vec.ipynb └── mnist_sequence1_sample_5distortions5x5.npz /.gitignore: -------------------------------------------------------------------------------- 1 | # .DS_Store in mac 2 | .DS_Store 3 | 4 | # ipython notebook checkpoints 5 | .ipynb_checkpoints 6 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # TF code examples for Deep learning 2 | These examples mainly use tf.layers; versions refactored with tf.contrib.slim will be added later. 3 | - - - 4 | 5 | ### 01. Tutorial of implementing DL model by OOP 6 | - Example of building a Deep learning model in OOP style (with mnist) 7 | - [Tutorial of implementing DL model by OOP.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20DL%20model%20by%20OOP.ipynb) 8 | 9 | ### 02. Tutorial of implementing Drop out 10 | - Example of applying Drop out (with mnist) 11 | - [Tutorial of implementing Drop out.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Drop%20out.ipynb) 12 | 13 | ### 03. Tutorial of implementing Batch normalization 14 | - Example of applying Batch normalization (with mnist) 15 | - [Tutorial of implementing Batch normalization.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Batch%20normalization.ipynb) 16 | 17 | ### 04. Tutorial of implementing Batch norm and Drop out 18 | - Example of applying Batch normalization and Drop out together (with fashion mnist) 19 | - [Tutorial of implementing Batch norm and Drop out.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Batch%20norm%20and%20Drop%20out.ipynb) 20 | 21 | ### 05. Tutorial of implementing Transfer learning 22 | - Example of Transfer learning (with mnist) 23 | - [Tutorial of implementing Transfer learning.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Transfer%20learning.ipynb) 24 | 25 | ### 06. Tutorial of implementing Transfer learning with model Zoo 26 | - Vggnet, Googlenet, Resnet 27 | 28 | ### 07. Tutorial of implementing Class activation map 29 | - A simple implementation of the Class activation map proposed in the paper ["*Learning Deep Features for Discriminative Localization*"](https://www.cv-foundation.org/openaccess/content_cvpr_2016/papers/Zhou_Learning_Deep_Features_CVPR_2016_paper.pdf) (with mnist clutter) 30 | - [Tutorial of implementing Class activation map.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Class%20activation%20map.ipynb) 31 | 32 | ### 08. 
Tutorial of implementing Word2Vec 33 | - A simple implementation of Word2Vec's skip-gram 34 | - A Word2Vec cbow implementation will be added later 35 | - [Tutorial of implementing Word2Vec.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Word2Vec.ipynb) 36 | 37 | ### 09. Tutorial of implementing Sequence classification with RNN series 38 | - Trains a model that predicts the positive/negative sentiment of variable-length English words with a character-level GRU 39 | - [Tutorial of implementing Sequence classification with RNN series.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Sequence%20classification%20with%20RNN%20series.ipynb) 40 | 41 | ### 10. Tutorial of implementing Variational Auto-Encoder 42 | - A simple implementation of the Variational Auto-Encoder introduced in the paper ["*Auto-Encoding Variational Bayes*"](https://arxiv.org/pdf/1312.6114.pdf) (with mnist) 43 | - [Tutorial of implementing Variational Auto-Encoder.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Variational%20Auto-Encoder.ipynb) 44 | 45 | ### 11. Tutorial of implementing Generative Adversarial Net 46 | - A simple implementation of the Generative Adversarial Net introduced in the paper ["*Generative Adversarial Nets*"](https://papers.nips.cc/paper/5423-generative-adversarial-nets.pdf) (with mnist) 47 | - [Tutorial of implementing Generative Adversarial Net.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Generative%20Adversarial%20Net.ipynb) 48 | 49 | ### 12. Tutorial of implementing Conditional GAN 50 | - A simple implementation of the Conditional Generative Adversarial Net introduced in the paper ["*Conditional Generative Adversarial Nets*"](https://arxiv.org/pdf/1411.1784.pdf) (with mnist) 51 | - [Tutorial of implementing Conditional GAN.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Tutorial%20of%20implementing%20Conditional%20GAN.ipynb) 52 | 53 | ### 13. 
Sentence classification by MorphConv 54 | - A simple implementation of the model introduced in the paper ["Convolutional Neural Networks for Sentence Classification"](https://arxiv.org/abs/1408.5882) (with naver movie review) 55 | - [Sentence classification by MorphConv.ipynb](https://nbviewer.jupyter.org/github/aisolab/TF_code_examples_for_Deep_learning/blob/master/Sentence%20classification%20by%20MorphConv.ipynb) 56 | 57 | -------------------------------------------------------------------------------- /Sentence classification by MorphConv.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Sentence classification by MorphConv\n", 8 | "Implementation of [Convolutional Neural Networks for Sentence Classification](https://arxiv.org/abs/1408.5882) to classify sentiment of movie review\n", 9 | "\n", 10 | "### Explanation of this notebook\n", 11 | "* Dataset : [Naver sentiment movie corpus v1.0](https://github.com/e9t/nsmc)\n", 12 | " + train, validation : splitting `ratings_train.txt` (150k reviews) for train (120k reviews) and validation (30k reviews)\n", 13 | " + test : `ratings_test.txt` (50k reviews)\n", 14 | "* Preprocessing\n", 15 | " + Morphological analysis by Mecab wrapped by [konlpy](http://konlpy.org/en/latest/)\n", 16 | " + Using [FastText](https://arxiv.org/abs/1607.04606) embedding by [gluonnlp package](https://gluon-nlp.mxnet.io/)" 17 | ] 18 | }, 19 | { 20 | "cell_type": "markdown", 21 | "metadata": {}, 22 | "source": [ 23 | "### Setup" 24 | ] 25 | }, 26 | { 27 | "cell_type": "code", 28 | "execution_count": 1, 29 | "metadata": {}, 30 | "outputs": [ 31 | { 32 | "name": "stdout", 33 | "output_type": "stream", 34 | "text": [ 35 | "1.12.0\n" 36 | ] 37 | } 38 | ], 39 | "source": [ 40 | "import os, sys\n", 41 | "import konlpy\n", 42 | "import gluonnlp as nlp\n", 43 | "import numpy as np\n", 44 | "import pandas as pd\n", 45 | "import tensorflow as tf\n", 46 | "import itertools\n", 47 | "from tensorflow import keras\n", 48 | "from tensorflow.keras.preprocessing.sequence import pad_sequences\n", 49 | "from tqdm import tqdm\n", 50 | "from pprint import pprint\n", 51 | "\n", 52 | "print(tf.__version__)" 53 | ] 54 | }, 55 | { 56 | "cell_type": "markdown", 57 | "metadata": {}, 58 | "source": [ 59 | "### Loading dataset" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": 2, 65 | "metadata": {}, 66 | "outputs": [ 67 | { 68 | "name": "stdout", 69 | "output_type": "stream", 70 | "text": [ 71 | "5 3\n", 72 | "0 0\n" 73 | ] 74 | } 75 | ], 76 | "source": [ 77 | "ratings = pd.read_csv('./ratings_train.txt', sep = '\\t')[['document', 'label']]\n", 78 | "ratings_tst = pd.read_csv('./ratings_test.txt', sep = '\\t')[['document', 'label']]\n", 79 | "\n", 80 | "# the document column of ratings and ratings_tst contains nan values, so replace them with empty strings\n", 81 | "print(sum(ratings.document.isna()), sum(ratings_tst.document.isna()))\n", 82 | "\n", 83 | "ratings.document[ratings.document.isna()] = ''\n", 84 | "ratings_tst.document[ratings_tst.document.isna()] = ''\n", 85 | "\n", 86 | "print(sum(ratings.document.isna()), sum(ratings_tst.document.isna()))" 87 | ] 88 | }, 89 | { 90 | "cell_type": "code", 91 | "execution_count": 3, 92 | "metadata": {}, 93 | "outputs": [ 94 | { 95 | "name": "stdout", 96 | "output_type": "stream", 97 | "text": [ 98 | "(120000, 2) (30000, 2) (50000, 2)\n" 99 | ] 100 | } 101 | ], 102 | "source": [ 103 | "val_indices = np.random.choice(a = range(ratings.shape[0]), size = int(ratings.shape[0] * 
.2),\n", 104 | " replace = False)\n", 105 | "tr_indices = np.delete(arr = range(ratings.shape[0]), obj = val_indices, axis = 0)\n", 106 | "\n", 107 | "ratings_tr = ratings.iloc[tr_indices,:]\n", 108 | "ratings_val = ratings.iloc[val_indices,:]\n", 109 | "\n", 110 | "print(ratings_tr.shape, ratings_val.shape, ratings_tst.shape)" 111 | ] 112 | }, 113 | { 114 | "cell_type": "markdown", 115 | "metadata": {}, 116 | "source": [ 117 | "### Preprocessing dataset" 118 | ] 119 | }, 120 | { 121 | "cell_type": "code", 122 | "execution_count": 4, 123 | "metadata": {}, 124 | "outputs": [], 125 | "source": [ 126 | "mecab = konlpy.tag.Mecab() # 어떠한 분석기라도 상관이 없음" 127 | ] 128 | }, 129 | { 130 | "cell_type": "code", 131 | "execution_count": 5, 132 | "metadata": {}, 133 | "outputs": [ 134 | { 135 | "name": "stdout", 136 | "output_type": "stream", 137 | "text": [ 138 | "CPU times: user 13.6 s, sys: 64.3 ms, total: 13.6 s\n", 139 | "Wall time: 13.6 s\n" 140 | ] 141 | } 142 | ], 143 | "source": [ 144 | "%%time\n", 145 | "# train\n", 146 | "X_tr = ratings_tr.document.apply(mecab.morphs).tolist()\n", 147 | "y_tr = ratings_tr.label.tolist()\n", 148 | "\n", 149 | "# validation\n", 150 | "X_val = ratings_val.document.apply(mecab.morphs).tolist()\n", 151 | "y_val = ratings_val.label.tolist()\n", 152 | "\n", 153 | "# test\n", 154 | "X_tst = ratings_tst.document.apply(mecab.morphs).tolist()\n", 155 | "y_tst = ratings_tst.label.tolist()" 156 | ] 157 | }, 158 | { 159 | "cell_type": "markdown", 160 | "metadata": {}, 161 | "source": [ 162 | "#### Building vocabulary and connecting vocabulary with fasttext embedding\n", 163 | "https://gluon-nlp.mxnet.io/examples/word_embedding/word_embedding.html" 164 | ] 165 | }, 166 | { 167 | "cell_type": "code", 168 | "execution_count": 6, 169 | "metadata": {}, 170 | "outputs": [], 171 | "source": [ 172 | "# training dataset 기반으로 vocab 생성\n", 173 | "counter = nlp.data.count_tokens(itertools.chain.from_iterable([c for c in X_tr]))\n", 174 | "vocab = nlp.Vocab(counter,bos_token=None, eos_token=None, min_freq=15)" 175 | ] 176 | }, 177 | { 178 | "cell_type": "code", 179 | "execution_count": 7, 180 | "metadata": {}, 181 | "outputs": [], 182 | "source": [ 183 | "# Loading fasttext embedding \n", 184 | "fasttext_simple = nlp.embedding.create('fasttext', source='wiki.ko')\n", 185 | "\n", 186 | "# vocab에 embedding 연결\n", 187 | "vocab.set_embedding(fasttext_simple)" 188 | ] 189 | }, 190 | { 191 | "cell_type": "code", 192 | "execution_count": 8, 193 | "metadata": {}, 194 | "outputs": [ 195 | { 196 | "name": "stdout", 197 | "output_type": "stream", 198 | "text": [ 199 | "CPU times: user 2.83 s, sys: 45.1 ms, total: 2.88 s\n", 200 | "Wall time: 2.83 s\n" 201 | ] 202 | } 203 | ], 204 | "source": [ 205 | "%%time\n", 206 | "# final preprocessing\n", 207 | "\n", 208 | "X_tr = list(map(lambda sen : [vocab.token_to_idx[token] for token in sen], X_tr))\n", 209 | "X_tr = pad_sequences(sequences = X_tr, maxlen = 30, padding = 'post', value = 1.)\n", 210 | "\n", 211 | "X_val = list(map(lambda sen : [vocab.token_to_idx[token] for token in sen], X_val))\n", 212 | "X_val = pad_sequences(sequences = X_val, maxlen = 30, padding = 'post', value = 1.)\n", 213 | "\n", 214 | "X_tst = list(map(lambda sen : [vocab.token_to_idx[token] for token in sen], X_tst))\n", 215 | "X_tst = pad_sequences(sequences = X_tst, maxlen = 30, padding = 'post', value = 1.)" 216 | ] 217 | }, 218 | { 219 | "cell_type": "markdown", 220 | "metadata": {}, 221 | "source": [ 222 | "### Define MorphConv class" 223 | ] 224 | }, 225 | { 226 | 
"cell_type": "code", 227 | "execution_count": 9, 228 | "metadata": {}, 229 | "outputs": [], 230 | "source": [ 231 | "class MorphConv:\n", 232 | " def __init__(self, X, y, n_of_classes, embedding):\n", 233 | " \n", 234 | " with tf.variable_scope('input_layer'):\n", 235 | " self.__X = X\n", 236 | " self.__y = y\n", 237 | " self.is_training = tf.placeholder(dtype = tf.bool)\n", 238 | " \n", 239 | " with tf.variable_scope('embedding_layer'):\n", 240 | " static_embed = tf.get_variable(name = 'static', initializer = embedding,\n", 241 | " trainable = False)\n", 242 | " non_static_embed = tf.get_variable(name = 'non_static', initializer = embedding,\n", 243 | " trainable = True)\n", 244 | " static_batch = tf.nn.embedding_lookup(params = static_embed, ids = self.__X)\n", 245 | " non_static_batch = tf.nn.embedding_lookup(params = non_static_embed, ids = self.__X)\n", 246 | " \n", 247 | " with tf.variable_scope('convoluion_layer'):\n", 248 | " with tf.variable_scope('tri_gram'):\n", 249 | " \n", 250 | " tri_gram = keras.layers.Conv1D(filters = 100, kernel_size = 3,\n", 251 | " activation = keras.activations.relu,\n", 252 | " kernel_initializer = 'he_uniform', padding = 'valid')\n", 253 | " static_3 = tri_gram(static_batch)\n", 254 | " non_static_3 = tri_gram(non_static_batch)\n", 255 | " \n", 256 | " with tf.variable_scope('tetra_gram'):\n", 257 | " tetra_gram = keras.layers.Conv1D(filters = 100, kernel_size = 4,\n", 258 | " activation = keras.activations.relu,\n", 259 | " kernel_initializer = 'he_uniform', padding = 'valid')\n", 260 | " \n", 261 | " static_4 = tetra_gram(static_batch)\n", 262 | " non_static_4 = tetra_gram(non_static_batch)\n", 263 | " \n", 264 | " with tf.variable_scope('penta_gram'):\n", 265 | " penta_gram = keras.layers.Conv1D(filters = 100, kernel_size = 5,\n", 266 | " activation = keras.activations.relu,\n", 267 | " kernel_initializer = 'he_uniform', padding = 'valid')\n", 268 | " \n", 269 | " static_5 = penta_gram(static_batch)\n", 270 | " non_static_5 = penta_gram(non_static_batch)\n", 271 | "\n", 272 | " fmap_3 = tf.reduce_max(static_3 + non_static_3, axis = 1)\n", 273 | " fmap_4 = tf.reduce_max(static_4 + non_static_4, axis = 1)\n", 274 | " fmap_5 = tf.reduce_max(static_5 + non_static_5, axis = 1)\n", 275 | " \n", 276 | " with tf.variable_scope('output_layer'):\n", 277 | " flattened = tf.concat([fmap_3, fmap_4, fmap_5], axis = -1)\n", 278 | " score = keras.layers.Dense(units = n_of_classes,\n", 279 | " kernel_regularizer = keras.regularizers.l2(.7))(flattened)\n", 280 | " \n", 281 | " self.__score = keras.layers.Dropout(rate = .5)(score, training = self.is_training)\n", 282 | "\n", 283 | " with tf.variable_scope('loss'):\n", 284 | " ce_loss = tf.losses.sparse_softmax_cross_entropy(labels = self.__y, logits = self.__score)\n", 285 | " reg_term = tf.reduce_sum(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES))\n", 286 | " self.total_loss = ce_loss + reg_term\n", 287 | " \n", 288 | " with tf.variable_scope('prediction'):\n", 289 | " self.prediction = tf.argmax(self.__score, axis = -1)\n", 290 | " \n", 291 | " # predict instance method for small dataset\n", 292 | " def predict(self, sess, x_data, is_training = False):\n", 293 | " feed_prediction = {self.__X : x_data, self.is_training : is_training}\n", 294 | " return sess.run(self.prediction, feed_dict = feed_prediction)" 295 | ] 296 | }, 297 | { 298 | "cell_type": "markdown", 299 | "metadata": {}, 300 | "source": [ 301 | "### Create a model of MorphConv" 302 | ] 303 | }, 304 | { 305 | "cell_type": "code", 306 | 
"execution_count": 10, 307 | "metadata": {}, 308 | "outputs": [ 309 | { 310 | "name": "stdout", 311 | "output_type": "stream", 312 | "text": [ 313 | "1200\n" 314 | ] 315 | } 316 | ], 317 | "source": [ 318 | "# hyper-parameter\n", 319 | "lr = .003\n", 320 | "epochs = 5\n", 321 | "batch_size = 100\n", 322 | "total_step = int(X_tr.shape[0] / batch_size)\n", 323 | "print(total_step)" 324 | ] 325 | }, 326 | { 327 | "cell_type": "code", 328 | "execution_count": 11, 329 | "metadata": {}, 330 | "outputs": [], 331 | "source": [ 332 | "# train\n", 333 | "tr_dataset = tf.data.Dataset.from_tensor_slices((X_tr, y_tr))\n", 334 | "tr_dataset = tr_dataset.shuffle(buffer_size = 1000000)\n", 335 | "tr_dataset = tr_dataset.batch(batch_size = batch_size)\n", 336 | "tr_iterator = tr_dataset.make_initializable_iterator()" 337 | ] 338 | }, 339 | { 340 | "cell_type": "code", 341 | "execution_count": 12, 342 | "metadata": {}, 343 | "outputs": [], 344 | "source": [ 345 | "# val\n", 346 | "val_dataset = tf.data.Dataset.from_tensor_slices((X_val, y_val))\n", 347 | "val_dataset = val_dataset.batch(batch_size = batch_size)\n", 348 | "val_iterator = val_dataset.make_initializable_iterator()" 349 | ] 350 | }, 351 | { 352 | "cell_type": "code", 353 | "execution_count": 13, 354 | "metadata": {}, 355 | "outputs": [ 356 | { 357 | "name": "stdout", 358 | "output_type": "stream", 359 | "text": [ 360 | "Tensor(\"IteratorGetNext:0\", shape=(?, 30), dtype=int32) Tensor(\"IteratorGetNext:1\", shape=(?,), dtype=int32)\n" 361 | ] 362 | } 363 | ], 364 | "source": [ 365 | "# anonymous iterator\n", 366 | "handle = tf.placeholder(dtype = tf.string)\n", 367 | "iterator = tf.data.Iterator.from_string_handle(string_handle = handle,\n", 368 | " output_types = tr_iterator.output_types,\n", 369 | " output_shapes = tr_iterator.output_shapes)\n", 370 | "x_data, y_data = iterator.get_next()\n", 371 | "print(x_data, y_data)" 372 | ] 373 | }, 374 | { 375 | "cell_type": "code", 376 | "execution_count": 14, 377 | "metadata": {}, 378 | "outputs": [ 379 | { 380 | "name": "stdout", 381 | "output_type": "stream", 382 | "text": [ 383 | "float32\n", 384 | "(6582, 300)\n" 385 | ] 386 | } 387 | ], 388 | "source": [ 389 | "embedding = vocab.embedding.idx_to_vec.asnumpy()\n", 390 | "print(embedding.dtype)\n", 391 | "print(embedding.shape)" 392 | ] 393 | }, 394 | { 395 | "cell_type": "code", 396 | "execution_count": 15, 397 | "metadata": {}, 398 | "outputs": [], 399 | "source": [ 400 | "morph_conv = MorphConv(X = x_data, y = y_data, n_of_classes = 2,\n", 401 | " embedding = embedding)" 402 | ] 403 | }, 404 | { 405 | "cell_type": "code", 406 | "execution_count": 16, 407 | "metadata": {}, 408 | "outputs": [], 409 | "source": [ 410 | "# create training op\n", 411 | "opt = tf.train.AdamOptimizer(learning_rate = lr)\n", 412 | "training_op = opt.minimize(loss = morph_conv.total_loss)" 413 | ] 414 | }, 415 | { 416 | "cell_type": "markdown", 417 | "metadata": {}, 418 | "source": [ 419 | "### Training" 420 | ] 421 | }, 422 | { 423 | "cell_type": "code", 424 | "execution_count": 17, 425 | "metadata": {}, 426 | "outputs": [], 427 | "source": [ 428 | "sess_config = tf.ConfigProto(gpu_options=tf.GPUOptions(allow_growth=True))\n", 429 | "sess = tf.Session(config = sess_config)\n", 430 | "sess.run(tf.global_variables_initializer())\n", 431 | "tr_handle, val_handle = sess.run(fetches = [tr_iterator.string_handle(), val_iterator.string_handle()])" 432 | ] 433 | }, 434 | { 435 | "cell_type": "code", 436 | "execution_count": 18, 437 | "metadata": {}, 438 | "outputs": [ 439 | { 
440 | "name": "stderr", 441 | "output_type": "stream", 442 | "text": [ 443 | " 20%|██ | 1/5 [00:09<00:36, 9.19s/it]" 444 | ] 445 | }, 446 | { 447 | "name": "stdout", 448 | "output_type": "stream", 449 | "text": [ 450 | "epoch : 1, tr_loss : 0.654, val_loss : 0.479\n" 451 | ] 452 | }, 453 | { 454 | "name": "stderr", 455 | "output_type": "stream", 456 | "text": [ 457 | "\r", 458 | " 40%|████ | 2/5 [00:17<00:26, 8.89s/it]" 459 | ] 460 | }, 461 | { 462 | "name": "stdout", 463 | "output_type": "stream", 464 | "text": [ 465 | "epoch : 2, tr_loss : 0.506, val_loss : 0.428\n" 466 | ] 467 | }, 468 | { 469 | "name": "stderr", 470 | "output_type": "stream", 471 | "text": [ 472 | "\r", 473 | " 60%|██████ | 3/5 [00:25<00:17, 8.64s/it]" 474 | ] 475 | }, 476 | { 477 | "name": "stdout", 478 | "output_type": "stream", 479 | "text": [ 480 | "epoch : 3, tr_loss : 0.439, val_loss : 0.353\n" 481 | ] 482 | }, 483 | { 484 | "name": "stderr", 485 | "output_type": "stream", 486 | "text": [ 487 | "\r", 488 | " 80%|████████ | 4/5 [00:33<00:08, 8.58s/it]" 489 | ] 490 | }, 491 | { 492 | "name": "stdout", 493 | "output_type": "stream", 494 | "text": [ 495 | "epoch : 4, tr_loss : 0.449, val_loss : 0.607\n" 496 | ] 497 | }, 498 | { 499 | "name": "stderr", 500 | "output_type": "stream", 501 | "text": [ 502 | "\r", 503 | "100%|██████████| 5/5 [00:42<00:00, 8.45s/it]" 504 | ] 505 | }, 506 | { 507 | "name": "stdout", 508 | "output_type": "stream", 509 | "text": [ 510 | "epoch : 5, tr_loss : 0.428, val_loss : 0.380\n", 511 | "CPU times: user 45.6 s, sys: 8.23 s, total: 53.8 s\n", 512 | "Wall time: 42 s\n" 513 | ] 514 | }, 515 | { 516 | "name": "stderr", 517 | "output_type": "stream", 518 | "text": [ 519 | "\n" 520 | ] 521 | } 522 | ], 523 | "source": [ 524 | "%%time\n", 525 | "\n", 526 | "tr_loss_hist = []\n", 527 | "val_loss_hist = []\n", 528 | "\n", 529 | "for epoch in tqdm(range(epochs)):\n", 530 | "\n", 531 | " avg_tr_loss = 0\n", 532 | " avg_val_loss = 0\n", 533 | " tr_step = 0\n", 534 | " val_step = 0\n", 535 | "\n", 536 | " # for mini-batch training\n", 537 | " sess.run(tr_iterator.initializer) \n", 538 | " try:\n", 539 | " \n", 540 | " while True:\n", 541 | " _, tr_loss = sess.run(fetches = [training_op, morph_conv.total_loss],\n", 542 | " feed_dict = {handle : tr_handle, morph_conv.is_training : True})\n", 543 | " avg_tr_loss += tr_loss\n", 544 | " tr_step += 1\n", 545 | "\n", 546 | " except tf.errors.OutOfRangeError:\n", 547 | " pass\n", 548 | "\n", 549 | " # for validation\n", 550 | " sess.run(val_iterator.initializer)\n", 551 | " try:\n", 552 | " while True:\n", 553 | " val_loss = sess.run(fetches = morph_conv.total_loss,\n", 554 | " feed_dict = {handle : val_handle, morph_conv.is_training : False})\n", 555 | " avg_val_loss += val_loss\n", 556 | " val_step += 1\n", 557 | " \n", 558 | " except tf.errors.OutOfRangeError:\n", 559 | " pass\n", 560 | "\n", 561 | " avg_tr_loss /= tr_step\n", 562 | " avg_val_loss /= val_step\n", 563 | " tr_loss_hist.append(avg_tr_loss)\n", 564 | " val_loss_hist.append(avg_val_loss)\n", 565 | " \n", 566 | " print('epoch : {:3}, tr_loss : {:.3f}, val_loss : {:.3f}'.format(epoch + 1, avg_tr_loss, avg_val_loss))" 567 | ] 568 | }, 569 | { 570 | "cell_type": "markdown", 571 | "metadata": {}, 572 | "source": [ 573 | "### Saving model" 574 | ] 575 | }, 576 | { 577 | "cell_type": "code", 578 | "execution_count": 19, 579 | "metadata": {}, 580 | "outputs": [], 581 | "source": [ 582 | "var_list = tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES)" 583 | ] 584 | }, 585 | { 586 | "cell_type": "code", 
587 | "execution_count": 20, 588 | "metadata": {}, 589 | "outputs": [], 590 | "source": [ 591 | "saver = tf.train.Saver(var_list = var_list)" 592 | ] 593 | }, 594 | { 595 | "cell_type": "code", 596 | "execution_count": 21, 597 | "metadata": {}, 598 | "outputs": [ 599 | { 600 | "data": { 601 | "text/plain": [ 602 | "'./model/MorphConv/'" 603 | ] 604 | }, 605 | "execution_count": 21, 606 | "metadata": {}, 607 | "output_type": "execute_result" 608 | } 609 | ], 610 | "source": [ 611 | "saver.save(sess = sess, save_path = './model/MorphConv/')" 612 | ] 613 | }, 614 | { 615 | "cell_type": "code", 616 | "execution_count": 22, 617 | "metadata": {}, 618 | "outputs": [], 619 | "source": [ 620 | "sess.close()" 621 | ] 622 | }, 623 | { 624 | "cell_type": "code", 625 | "execution_count": 23, 626 | "metadata": {}, 627 | "outputs": [], 628 | "source": [ 629 | "tf.reset_default_graph()" 630 | ] 631 | }, 632 | { 633 | "cell_type": "markdown", 634 | "metadata": {}, 635 | "source": [ 636 | "### Restoring model" 637 | ] 638 | }, 639 | { 640 | "cell_type": "code", 641 | "execution_count": 24, 642 | "metadata": {}, 643 | "outputs": [], 644 | "source": [ 645 | "x_data_infer = tf.placeholder(dtype = tf.int32, shape = [None, 30]) # 30은 sequence의 max_length\n", 646 | "y_data_infer = tf.placeholder(dtype = tf.int32, shape = [None]) #" 647 | ] 648 | }, 649 | { 650 | "cell_type": "code", 651 | "execution_count": 27, 652 | "metadata": {}, 653 | "outputs": [ 654 | { 655 | "name": "stdout", 656 | "output_type": "stream", 657 | "text": [ 658 | "(6582, 300)\n" 659 | ] 660 | } 661 | ], 662 | "source": [ 663 | "# 내 초기 token들의 embedding\n", 664 | "print(embedding.shape)\n", 665 | "initial = np.zeros(shape = [6582,300], dtype = np.float32)" 666 | ] 667 | }, 668 | { 669 | "cell_type": "code", 670 | "execution_count": 28, 671 | "metadata": {}, 672 | "outputs": [], 673 | "source": [ 674 | "morph_conv_infer = MorphConv(X = x_data_infer, y = y_data_infer, n_of_classes = 2,\n", 675 | " embedding = initial)" 676 | ] 677 | }, 678 | { 679 | "cell_type": "code", 680 | "execution_count": 30, 681 | "metadata": {}, 682 | "outputs": [ 683 | { 684 | "name": "stdout", 685 | "output_type": "stream", 686 | "text": [ 687 | "INFO:tensorflow:Restoring parameters from ./model/MorphConv/\n" 688 | ] 689 | }, 690 | { 691 | "name": "stderr", 692 | "output_type": "stream", 693 | "text": [ 694 | "INFO:tensorflow:Restoring parameters from ./model/MorphConv/\n" 695 | ] 696 | } 697 | ], 698 | "source": [ 699 | "sess = tf.Session()\n", 700 | "saver = tf.train.Saver()\n", 701 | "saver.restore(sess = sess, save_path='./model/MorphConv/')" 702 | ] 703 | }, 704 | { 705 | "cell_type": "markdown", 706 | "metadata": {}, 707 | "source": [ 708 | "### Test" 709 | ] 710 | }, 711 | { 712 | "cell_type": "code", 713 | "execution_count": 33, 714 | "metadata": {}, 715 | "outputs": [ 716 | { 717 | "data": { 718 | "text/plain": [ 719 | "(10, 30)" 720 | ] 721 | }, 722 | "execution_count": 33, 723 | "metadata": {}, 724 | "output_type": "execute_result" 725 | } 726 | ], 727 | "source": [ 728 | "X_tst[:10].shape" 729 | ] 730 | }, 731 | { 732 | "cell_type": "code", 733 | "execution_count": 34, 734 | "metadata": {}, 735 | "outputs": [], 736 | "source": [ 737 | "y_tst_hat = morph_conv_infer.predict(sess = sess, x_data = X_tst[:10])" 738 | ] 739 | }, 740 | { 741 | "cell_type": "code", 742 | "execution_count": 35, 743 | "metadata": {}, 744 | "outputs": [ 745 | { 746 | "data": { 747 | "text/plain": [ 748 | "array([1, 0, 0, 0, 0, 1, 0, 0, 0, 1])" 749 | ] 750 | }, 751 | 
"execution_count": 35, 752 | "metadata": {}, 753 | "output_type": "execute_result" 754 | } 755 | ], 756 | "source": [ 757 | "y_tst_hat" 758 | ] 759 | }, 760 | { 761 | "cell_type": "code", 762 | "execution_count": 36, 763 | "metadata": {}, 764 | "outputs": [ 765 | { 766 | "data": { 767 | "text/plain": [ 768 | "[1, 0, 0, 0, 0, 1, 0, 0, 0, 1]" 769 | ] 770 | }, 771 | "execution_count": 36, 772 | "metadata": {}, 773 | "output_type": "execute_result" 774 | } 775 | ], 776 | "source": [ 777 | "y_tst[:10]" 778 | ] 779 | }, 780 | { 781 | "cell_type": "code", 782 | "execution_count": 37, 783 | "metadata": {}, 784 | "outputs": [], 785 | "source": [ 786 | "# tst_dataset = tf.data.Dataset.from_tensor_slices((X_tst, y_tst))\n", 787 | "# tst_dataset = tst_dataset.batch(batch_size = batch_size)\n", 788 | "# tst_iterator = tst_dataset.make_initializable_iterator()" 789 | ] 790 | }, 791 | { 792 | "cell_type": "code", 793 | "execution_count": 38, 794 | "metadata": {}, 795 | "outputs": [], 796 | "source": [ 797 | "# tst_handle = sess.run(tst_iterator.string_handle())" 798 | ] 799 | }, 800 | { 801 | "cell_type": "code", 802 | "execution_count": 39, 803 | "metadata": {}, 804 | "outputs": [], 805 | "source": [ 806 | "# y_tst_hat = np.array([])\n", 807 | "\n", 808 | "# sess.run(tst_iterator.initializer)\n", 809 | "\n", 810 | "# try:\n", 811 | "# while True:\n", 812 | "# y_tst_tmp = sess.run(morph_conv.prediction,\n", 813 | "# feed_dict = {handle : tst_handle,\n", 814 | "# morph_conv.is_training : False})\n", 815 | "# y_tst_hat= np.append(y_tst_hat,y_tst_tmp)\n", 816 | "\n", 817 | "# except tf.errors.OutOfRangeError:\n", 818 | "# pass" 819 | ] 820 | }, 821 | { 822 | "cell_type": "code", 823 | "execution_count": 40, 824 | "metadata": {}, 825 | "outputs": [], 826 | "source": [ 827 | "# print('test acc : {:.2%}'.format(np.mean(y_tst_hat == np.array(y_tst))))" 828 | ] 829 | } 830 | ], 831 | "metadata": { 832 | "kernelspec": { 833 | "display_name": "Python 3", 834 | "language": "python", 835 | "name": "python3" 836 | }, 837 | "language_info": { 838 | "codemirror_mode": { 839 | "name": "ipython", 840 | "version": 3 841 | }, 842 | "file_extension": ".py", 843 | "mimetype": "text/x-python", 844 | "name": "python", 845 | "nbconvert_exporter": "python", 846 | "pygments_lexer": "ipython3", 847 | "version": "3.6.6" 848 | } 849 | }, 850 | "nbformat": 4, 851 | "nbformat_minor": 2 852 | } 853 | -------------------------------------------------------------------------------- /Tutorial of implementing Batch normalization.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Tutorial of implementing Batch normalization\n", 8 | "mnist image를 분류하는 Convolution Neural Network에 Batch normalization을 적용하는 간단한 example\n", 9 | "\n", 10 | "Batch normalization paper : http://proceedings.mlr.press/v37/ioffe15.pdf" 11 | ] 12 | }, 13 | { 14 | "cell_type": "markdown", 15 | "metadata": {}, 16 | "source": [ 17 | "### Setup" 18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": 1, 23 | "metadata": {}, 24 | "outputs": [ 25 | { 26 | "name": "stdout", 27 | "output_type": "stream", 28 | "text": [ 29 | "Extracting ./MNIST_data\\train-images-idx3-ubyte.gz\n", 30 | "Extracting ./MNIST_data\\train-labels-idx1-ubyte.gz\n", 31 | "Extracting ./MNIST_data\\t10k-images-idx3-ubyte.gz\n", 32 | "Extracting ./MNIST_data\\t10k-labels-idx1-ubyte.gz\n" 33 | ] 34 | } 35 | ], 36 | "source": [ 37 | "import os, sys\n", 38 | 
"import shutil \n", 39 | "import tensorflow as tf\n", 40 | "import numpy as np\n", 41 | "import matplotlib.pylab as plt\n", 42 | "%matplotlib inline\n", 43 | "from tensorflow.examples.tutorials.mnist import input_data # load mnist dataset\n", 44 | "mnist = input_data.read_data_sets(train_dir = './MNIST_data', one_hot = True, reshape = True, seed = 20171104)" 45 | ] 46 | }, 47 | { 48 | "cell_type": "markdown", 49 | "metadata": {}, 50 | "source": [ 51 | "### Define MnistCNN class\n", 52 | "conv-conv-max pool-conv-conv-max pool-fc-fc" 53 | ] 54 | }, 55 | { 56 | "cell_type": "code", 57 | "execution_count": 2, 58 | "metadata": { 59 | "code_folding": [] 60 | }, 61 | "outputs": [], 62 | "source": [ 63 | "'''\n", 64 | "Batch normalization의 구현을 위해서는 tf.layers module에 있는 tf.layers.batch_normalization을 활용한다. activation 하기전에\n", 65 | "Batch normalization을 하고, 후에 activation을 한다.(전에하냐 후에하냐라는 issue가 있으나 activation 전에 하는 것이 convention이고\n", 66 | "실험결과로 activation 전에 하는 것이 좋다는 결과도 종종 보인다.)\n", 67 | "'''\n", 68 | "class MnistCNN:\n", 69 | " def __init__(self, activation_fn = tf.nn.relu, initializer = tf.contrib.layers.variance_scaling_initializer(),\n", 70 | " l2_scale = 0.02):\n", 71 | " \n", 72 | " with tf.variable_scope('input_layer'):\n", 73 | " self._x = tf.placeholder(dtype = tf.float32, shape = [None,784])\n", 74 | " self._ximg = tf.reshape(tensor = self._x, shape = [-1,28,28,1])\n", 75 | " self._y = tf.placeholder(dtype = tf.float32, shape = [None,10])\n", 76 | " self._training = tf.placeholder(dtype = tf.bool)\n", 77 | " \n", 78 | " with tf.variable_scope('conv_layer1'):\n", 79 | " _conv_pre = tf.layers.conv2d(inputs = self._ximg, filters = 64, kernel_size = [3,3],\n", 80 | " kernel_initializer = initializer,\n", 81 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale),\n", 82 | " padding = 'same')\n", 83 | " _conv_bn = tf.layers.batch_normalization(inputs = _conv_pre, momentum = .9, training = self._training)\n", 84 | " _conv_relu = activation_fn(_conv_bn)\n", 85 | " \n", 86 | " with tf.variable_scope('conv_layer2'):\n", 87 | " _conv_pre = tf.layers.conv2d(inputs = _conv_relu, filters = 64, kernel_size = [3,3],\n", 88 | " kernel_initializer = initializer,\n", 89 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale),\n", 90 | " padding = 'same')\n", 91 | " _conv_bn = tf.layers.batch_normalization(inputs = _conv_pre, momentum = .9, training = self._training)\n", 92 | " _conv_relu = activation_fn(_conv_bn)\n", 93 | " \n", 94 | " with tf.variable_scope('max_pool1'):\n", 95 | " _pooled = tf.layers.max_pooling2d(inputs = _conv_relu, pool_size = [2,2], strides = 2)\n", 96 | " \n", 97 | " with tf.variable_scope('conv_layer3'):\n", 98 | " _conv_pre = tf.layers.conv2d(inputs = _pooled, filters = 128, kernel_size = [3,3],\n", 99 | " kernel_initializer = initializer,\n", 100 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale),\n", 101 | " padding = 'same')\n", 102 | " _conv_bn = tf.layers.batch_normalization(inputs = _conv_pre, momentum = .9, training = self._training)\n", 103 | " _conv_relu = activation_fn(_conv_bn)\n", 104 | " \n", 105 | " with tf.variable_scope('conv_layer4'):\n", 106 | " _conv_pre = tf.layers.conv2d(inputs = _conv_relu, filters = 128, kernel_size = [3,3],\n", 107 | " kernel_initializer = initializer,\n", 108 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale),\n", 109 | " padding = 'same')\n", 110 | " _conv_bn = tf.layers.batch_normalization(inputs = _conv_pre, momentum = .9, training = 
self._training)\n", 111 | " _conv_relu = activation_fn(_conv_bn)\n", 112 | " \n", 113 | " with tf.variable_scope('max_pool2'):\n", 114 | " _pooled = tf.layers.max_pooling2d(inputs = _conv_relu, pool_size = [2,2], strides = 2)\n", 115 | " \n", 116 | " with tf.variable_scope('dense_layer1'):\n", 117 | " _pooled_vector = tf.reshape(tensor = _pooled, shape = [-1,np.cumprod(_pooled.get_shape().as_list()[-3:])[-1]])\n", 118 | " _fc_pre = tf.layers.dense(inputs = _pooled_vector, units = 1024, kernel_initializer = initializer,\n", 119 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale))\n", 120 | " _fc_bn = tf.layers.batch_normalization(inputs = _fc_pre, momentum = .9, training = self._training)\n", 121 | " _fc_relu = activation_fn(_fc_bn)\n", 122 | " \n", 123 | " with tf.variable_scope('output_layer'):\n", 124 | " self._score = tf.layers.dense(inputs = _fc_relu, units = 10, kernel_initializer = initializer,\n", 125 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale))\n", 126 | " \n", 127 | " with tf.variable_scope('loss'):\n", 128 | " _ce_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits = self._score, labels = self._y))\n", 129 | " _reg_term = tf.reduce_sum(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES))\n", 130 | " self._total_loss = _ce_loss + _reg_term\n", 131 | " ''' \n", 132 | " 각각 mini_batch 마다의 mean과 variance를 계산하고, 이를 Exponential Moving Average로 저장하는 과정이 필요한 데,\n", 133 | " 이를 수행하기위해 tf.get_collection(tf.GraphKeys.UPDATE_OPS)에서 뽑히는 ops를 저장해둔다. 이 op들은 후에\n", 134 | " tf.control_dependencies의 control_inputs argument에 전달된다.\n", 135 | " \n", 136 | " Note: when training, the moving_mean and moving_variance need to be updated.\n", 137 | " By default the update ops are placed in `tf.GraphKeys.UPDATE_OPS`, so they\n", 138 | " need to be added as a dependency to the `train_op`. For example:\n", 139 | "\n", 140 | " \n", 141 | " update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS) <-- 본 example에서는 model 생성 class 코드에 들어가고,\n", 142 | " 객체변수로 저장한다.\n", 143 | " \n", 144 | " with tf.control_dependencies(update_ops): <-- Solver class는 코드에 들어간다. Solver class는 model class\n", 145 | " train_op = optimizer.minimize(loss) 생성된 instance를 input으로 받으므로, 거기에서 객체변수로 \n", 146 | " 저장된 update_ops를 tf.control_dependencies의 argument에 전달한다.\n", 147 | " '''\n", 148 | " # 객체변수에 model class 코드로 생성되는 graph의 UPDATE_OPS를 저장\n", 149 | " self._update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS) \n", 150 | " \n", 151 | " with tf.variable_scope('predict'):\n", 152 | " self._prediction = tf.argmax(input = self._score, axis = 1)\n", 153 | " \n", 154 | " def predict(self, sess, x_data, training):\n", 155 | " feed_predict = {self._x : x_data, self._training : training}\n", 156 | " return sess.run(fetches = self._prediction, feed_dict = feed_predict)" 157 | ] 158 | }, 159 | { 160 | "cell_type": "markdown", 161 | "metadata": {}, 162 | "source": [ 163 | "### Define Solver class" 164 | ] 165 | }, 166 | { 167 | "cell_type": "code", 168 | "execution_count": 3, 169 | "metadata": {}, 170 | "outputs": [], 171 | "source": [ 172 | "class Solver:\n", 173 | " def __init__(self, model, optimizer = tf.train.AdamOptimizer, var_list = None):\n", 174 | " self._model = model\n", 175 | " self._lr = tf.placeholder(dtype = tf.float32)\n", 176 | " self._optimizer = optimizer(learning_rate = self._lr)\n", 177 | " \n", 178 | " # Solver class는 model class로부터 생성된 instance를 input으로 받음. 
model class에서 저장한 객체변수를 아래와 같이 활용\n", 179 | " with tf.control_dependencies(self._model._update_ops):\n", 180 | " self._training_op = self._optimizer.minimize(loss = self._model._total_loss, var_list = var_list)\n", 181 | " \n", 182 | " def train(self, sess, x_data, y_data, lr, training):\n", 183 | " feed_train = {self._model._x : x_data, self._model._y : y_data, self._lr : lr,\n", 184 | " self._model._training : training}\n", 185 | " return sess.run(fetches = [self._training_op, self._model._total_loss], feed_dict = feed_train)\n", 186 | " \n", 187 | " def evaluate(self, sess, x_data, y_data, training = False):\n", 188 | " feed_loss = {self._model._x : x_data, self._model._y : y_data, self._model._training : training}\n", 189 | " return sess.run(fetches = self._model._total_loss, feed_dict = feed_loss)" 190 | ] 191 | }, 192 | { 193 | "cell_type": "markdown", 194 | "metadata": {}, 195 | "source": [ 196 | "### Generate CNN model and Adam solver" 197 | ] 198 | }, 199 | { 200 | "cell_type": "code", 201 | "execution_count": 4, 202 | "metadata": {}, 203 | "outputs": [], 204 | "source": [ 205 | "sess = tf.Session()\n", 206 | "mnist_classifier = MnistCNN()" 207 | ] 208 | }, 209 | { 210 | "cell_type": "code", 211 | "execution_count": 5, 212 | "metadata": {}, 213 | "outputs": [], 214 | "source": [ 215 | "adam_solver = Solver(model = mnist_classifier)" 216 | ] 217 | }, 218 | { 219 | "cell_type": "markdown", 220 | "metadata": {}, 221 | "source": [ 222 | "### Training" 223 | ] 224 | }, 225 | { 226 | "cell_type": "code", 227 | "execution_count": 6, 228 | "metadata": {}, 229 | "outputs": [], 230 | "source": [ 231 | "# Hyper-parameters\n", 232 | "batch_size = 100\n", 233 | "n_epochs = 10\n", 234 | "best_loss = np.infty\n", 235 | "max_checks_without_progress = 15\n", 236 | "checks_without_progress = 0\n", 237 | "tr_loss_history = []\n", 238 | "val_loss_history = []" 239 | ] 240 | }, 241 | { 242 | "cell_type": "code", 243 | "execution_count": 7, 244 | "metadata": {}, 245 | "outputs": [], 246 | "source": [ 247 | "sess.run(tf.global_variables_initializer())" 248 | ] 249 | }, 250 | { 251 | "cell_type": "code", 252 | "execution_count": 8, 253 | "metadata": {}, 254 | "outputs": [ 255 | { 256 | "name": "stdout", 257 | "output_type": "stream", 258 | "text": [ 259 | "step : 0, tr_loss : 31.353, val_loss : 31.285\n", 260 | "step : 100, tr_loss : 3.589, val_loss : 3.813\n", 261 | "step : 200, tr_loss : 1.709, val_loss : 1.768\n", 262 | "step : 300, tr_loss : 0.953, val_loss : 1.186\n", 263 | "step : 400, tr_loss : 0.780, val_loss : 0.795\n", 264 | "step : 500, tr_loss : 0.626, val_loss : 0.742\n", 265 | "epoch : 0, tr_loss : 2.976, val_loss : 3.082\n", 266 | "step : 0, tr_loss : 0.648, val_loss : 0.786\n", 267 | "step : 100, tr_loss : 0.583, val_loss : 0.452\n", 268 | "step : 200, tr_loss : 0.502, val_loss : 0.606\n", 269 | "step : 300, tr_loss : 0.518, val_loss : 0.511\n", 270 | "step : 400, tr_loss : 0.351, val_loss : 0.486\n", 271 | "step : 500, tr_loss : 0.421, val_loss : 0.838\n", 272 | "epoch : 1, tr_loss : 0.474, val_loss : 0.632\n", 273 | "step : 0, tr_loss : 0.396, val_loss : 0.461\n", 274 | "step : 100, tr_loss : 0.344, val_loss : 0.486\n", 275 | "step : 200, tr_loss : 0.316, val_loss : 0.496\n", 276 | "step : 300, tr_loss : 0.366, val_loss : 0.355\n", 277 | "step : 400, tr_loss : 0.364, val_loss : 0.335\n", 278 | "step : 500, tr_loss : 0.376, val_loss : 0.603\n", 279 | "epoch : 2, tr_loss : 0.364, val_loss : 0.448\n", 280 | "step : 0, tr_loss : 0.592, val_loss : 0.388\n", 281 | "step : 100, tr_loss : 
0.391, val_loss : 0.604\n", 282 | "step : 200, tr_loss : 0.426, val_loss : 0.339\n", 283 | "step : 300, tr_loss : 0.389, val_loss : 0.287\n", 284 | "step : 400, tr_loss : 0.257, val_loss : 0.345\n", 285 | "step : 500, tr_loss : 0.313, val_loss : 0.366\n", 286 | "epoch : 3, tr_loss : 0.320, val_loss : 0.393\n", 287 | "step : 0, tr_loss : 0.352, val_loss : 0.254\n", 288 | "step : 100, tr_loss : 0.319, val_loss : 0.268\n", 289 | "step : 200, tr_loss : 0.358, val_loss : 0.435\n", 290 | "step : 300, tr_loss : 0.229, val_loss : 0.452\n", 291 | "step : 400, tr_loss : 0.243, val_loss : 0.381\n", 292 | "step : 500, tr_loss : 0.192, val_loss : 0.395\n", 293 | "epoch : 4, tr_loss : 0.276, val_loss : 0.332\n", 294 | "step : 0, tr_loss : 0.272, val_loss : 0.322\n", 295 | "step : 100, tr_loss : 0.446, val_loss : 0.312\n", 296 | "step : 200, tr_loss : 0.223, val_loss : 0.280\n", 297 | "step : 300, tr_loss : 0.168, val_loss : 0.239\n", 298 | "step : 400, tr_loss : 0.179, val_loss : 0.321\n", 299 | "step : 500, tr_loss : 0.243, val_loss : 0.319\n", 300 | "epoch : 5, tr_loss : 0.251, val_loss : 0.288\n", 301 | "step : 0, tr_loss : 0.334, val_loss : 0.202\n", 302 | "step : 100, tr_loss : 0.243, val_loss : 0.193\n", 303 | "step : 200, tr_loss : 0.256, val_loss : 0.221\n", 304 | "step : 300, tr_loss : 0.181, val_loss : 0.186\n", 305 | "step : 400, tr_loss : 0.252, val_loss : 0.206\n", 306 | "step : 500, tr_loss : 0.191, val_loss : 0.181\n", 307 | "epoch : 6, tr_loss : 0.227, val_loss : 0.245\n", 308 | "step : 0, tr_loss : 0.252, val_loss : 0.255\n", 309 | "step : 100, tr_loss : 0.227, val_loss : 0.261\n", 310 | "step : 200, tr_loss : 0.179, val_loss : 0.192\n", 311 | "step : 300, tr_loss : 0.169, val_loss : 0.194\n", 312 | "step : 400, tr_loss : 0.265, val_loss : 0.203\n", 313 | "step : 500, tr_loss : 0.227, val_loss : 0.205\n", 314 | "epoch : 7, tr_loss : 0.201, val_loss : 0.219\n", 315 | "step : 0, tr_loss : 0.211, val_loss : 0.216\n", 316 | "step : 100, tr_loss : 0.227, val_loss : 0.174\n", 317 | "step : 200, tr_loss : 0.217, val_loss : 0.194\n", 318 | "step : 300, tr_loss : 0.224, val_loss : 0.180\n", 319 | "step : 400, tr_loss : 0.189, val_loss : 0.241\n", 320 | "step : 500, tr_loss : 0.147, val_loss : 0.411\n", 321 | "epoch : 8, tr_loss : 0.196, val_loss : 0.216\n", 322 | "step : 0, tr_loss : 0.213, val_loss : 0.145\n", 323 | "step : 100, tr_loss : 0.155, val_loss : 0.171\n", 324 | "step : 200, tr_loss : 0.209, val_loss : 0.164\n", 325 | "step : 300, tr_loss : 0.158, val_loss : 0.226\n", 326 | "step : 400, tr_loss : 0.156, val_loss : 0.218\n", 327 | "step : 500, tr_loss : 0.229, val_loss : 0.234\n", 328 | "epoch : 9, tr_loss : 0.182, val_loss : 0.192\n" 329 | ] 330 | } 331 | ], 332 | "source": [ 333 | "for epoch in range(n_epochs):\n", 334 | " avg_tr_loss = 0\n", 335 | " avg_val_loss = 0\n", 336 | " total_batch = int(mnist.train.num_examples / batch_size)\n", 337 | " \n", 338 | " for step in range(total_batch):\n", 339 | " \n", 340 | " batch_xs, batch_ys = mnist.train.next_batch(batch_size = batch_size)\n", 341 | " val_xs, val_ys = mnist.validation.next_batch(batch_size = batch_size)\n", 342 | " _, tr_loss = adam_solver.train(sess = sess, x_data = batch_xs, y_data = batch_ys, lr = 1e-3, training = True)\n", 343 | " val_loss = adam_solver.evaluate(sess = sess, x_data = val_xs, y_data = val_ys)\n", 344 | " \n", 345 | " avg_tr_loss += tr_loss / total_batch\n", 346 | " avg_val_loss += val_loss / total_batch\n", 347 | " if step % 100 == 0:\n", 348 | " print('step : {:3}, tr_loss : {:.3f}, val_loss : 
{:.3f}'.format(step, tr_loss, val_loss))\n", 349 | " \n", 350 | " print('epoch : {:3}, tr_loss : {:.3f}, val_loss : {:.3f}'.format(epoch, avg_tr_loss, avg_val_loss))\n", 351 | " tr_loss_history.append(avg_tr_loss)\n", 352 | " val_loss_history.append(avg_val_loss)\n", 353 | " \n", 354 | " # early stopping\n", 355 | " if avg_val_loss < best_loss:\n", 356 | " best_loss = avg_val_loss\n", 357 | " checks_without_progress = 0\n", 358 | " else:\n", 359 | " checks_without_progress += 1\n", 360 | " if checks_without_progress > max_checks_without_progress:\n", 361 | " print('Early stopping')\n", 362 | " break" 363 | ] 364 | }, 365 | { 366 | "cell_type": "code", 367 | "execution_count": 9, 368 | "metadata": {}, 369 | "outputs": [ 370 | { 371 | "data": { 372 | "text/plain": [ 373 | "" 374 | ] 375 | }, 376 | "execution_count": 9, 377 | "metadata": {}, 378 | "output_type": "execute_result" 379 | }, 380 | { 381 | "data": { 382 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAD8CAYAAACMwORRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3Xt0G+d95vHvDwAJigAJSiQlkaDu\nsS2JpG5WHKVuHbtOXed+cxPl0m5ymvrUaTdxNunGm902TU9zNj3Nut7cnHWapGnXcZqVEydtbTf1\n1l5bie1YkiVZFzuWY9miSIkXiRTvJIB3/xiQBClQpCSQQwDP5xwcDGYGwE+Q9MzM+868Y845RESk\nsAT8LkBERHJP4S4iUoAU7iIiBUjhLiJSgBTuIiIFSOEuIlKAFO4iIgVI4S4iUoAU7iIiBSjk1xfX\n1NS41atX+/X1IiJ5ae/evZ3OudqZ1vMt3FevXs2ePXv8+noRkbxkZq/MZj01y4iIFCCFu4hIAVK4\ni4gUIN/a3EWksIyOjtLS0sLQ0JDfpRSEsrIyGhoaKCkpuaT3K9xFJCdaWlqoqKhg9erVmJnf5eQ1\n5xxdXV20tLSwZs2aS/oMNcuISE4MDQ1RXV2tYM8BM6O6uvqyjoIU7iKSMwr23Lnc3zL/wv30Efjp\nf4PhPr8rERFZsPIv3LtfhZ9/BU4953clIrKAdHd38/Wvf/2i3/fmN7+Z7u7uOajIX/kX7vVbvOfW\nZ/2tQ0QWlOnCPZlMXvB9Dz74IFVVVXNVlm9mDHczKzOzX5jZATM7bGafz7JO2Mz+0cyOmdnTZrZ6\nLooFoGI5VNRD2/45+woRyT933HEHL730Elu2bOG1r30tN9xwAx/4wAdobm4G4J3vfCdXX301jY2N\n3HPPPePvW716NZ2dnRw/fpwNGzbwB3/wBzQ2NnLTTTcxODjo1x/nss3mVMhh4Dedc31mVgLsNrOH\nnHNPZazz+8BZ59xrzGwn8FfA++agXk/9Vu25iyxgn/+nwxxpPZfTz9xYX8nn3tY47fIvfvGLHDp0\niP379/PYY4/xlre8hUOHDo2fSvjtb3+bJUuWMDg4yGtf+1re8573UF1dPekzXnzxRe677z6++c1v\n8t73vpf777+fD33oQzn9c8yXGffcnWes97Ik/XBTVnsH8N309C7gRpvLbvP6rdD5Igzl9h+PiBSO\na665ZtI54l/+8pfZvHkzO3bs4MSJE7z44ovnvWfNmjVs2eI1/V599dUcP358vsrNuVldxGRmQWAv\n8Brga865p6esEgdOADjnEmbWA1QDnTmsdUL9FsDBqYOw+tfn5CtE5NJdaA97vkQikfHpxx57jEce\neYQnn3yS8vJyrr/++qznkIfD4fHpYDCY180ys+pQdc4lnXNbgAbgGjNrmrJKtr30qXv3mNmtZrbH\nzPZ0dHRcfLVAZ98wj3TXeS/UNCMiaRUVFfT29mZd1tPTw+LFiykvL+f555/nqaeeyrpeIbmos2Wc\nc93AY8DNUxa1ACsAzCwExIAzWd5/j3Nuu3Nue23tjGPNZ/Xzl7r46P2vMBKNK9xFZFx1dTXXXnst\nTU1N/Mmf/MmkZTfffDOJRIJNmzbxp3/6p+zYscOnKufPjM0yZlYLjDrnus1sEfBGvA7TTD8B/gPw\nJHAL8O/OufP23HOhOR4D4HRkAytadcaMiEz43ve+l3V+OBzmoYceyrpsrF29pqaGQ4cOjc//9Kc/\nnfP65tNs9tzrgEfN7CDwDPBvzrl/NrO/MLO3p9f5FlBtZseA/wTcMTflwqol5VSEQxwNrIMzL8Fg\n4V18ICJyuWbcc3fOHQS2Zpn/ZxnTQ8Dv5La07AIBozFeye7+FdwE0HYA1r5hPr5aRCRv5N8VqkBT\nfYyHu5Z5L9TuLiJynrwM9+aGGO2JCCMVKxXuIiJZ5GW4N411qkY3KNxFRLLIy3BfUx0hGg5x1NZB\n9yswcN5ZlyIiRS0vwz0QMDbWV7J7oMGboUHEROQiRaNRAFpbW7nllluyrnP99dezZ8+eC37OXXfd\nxcDAwPjrhTKEcF6GO3idqg+pU1VELlN9fT27du265PdPDfeFMoRw3oZ7c0MlHaOLGKlcrXAXET7z\nmc9MGs/9z//8z/n85z/PjTfeyLZt22hububHP/7xee87fvw4TU3eiCqDg4Ps3LmTTZs28b73vW/S\n2DK33XYb27dvp7Gxkc997nOANxhZa2srN9xwAzfccAMwMYQwwJ133klTUxNNTU3cdddd4983H0ML\nz2rgsIVo7ErVU9ENrNSVqiILy0N35P5uacub4U1fnHbxzp07uf322/nYxz4GwA9+8AMefvhhPvnJ\nT1JZWUlnZyc7duzg7W9/+7T3J7377rspLy/n4MGDHDx4kG3bto0v+8IXvsCSJUtIJpPceOONHDx4\nkI9//OPceeedPProo9TU1Ez6rL179/Kd73yHp59+Guccr3vd63jDG97A4sWL52Vo4bzdc19TE6W8\nNOh1qvacgL5LG4hMRArD1q1baW9vp7W1lQMHDrB
48WLq6ur47Gc/y6ZNm3jjG9/IyZMnOX369LSf\n8fjjj4+H7KZNm9i0adP4sh/84Ads27aNrVu3cvjwYY4cOXLBenbv3s273vUuIpEI0WiUd7/73Tzx\nxBPA/AwtnLd77sGAsbGukt0DK/ht8DpVr/gtv8sSEbjgHvZcuuWWW9i1axenTp1i586d3HvvvXR0\ndLB3715KSkpYvXp11qF+M2Xbq3/55Zf50pe+xDPPPMPixYv58Ic/POPnXGh4rfkYWjhv99zBO9/9\noc6lOAzUNCNS9Hbu3Mn3v/99du3axS233EJPTw9Lly6lpKSERx99lFdeeeWC77/uuuu49957ATh0\n6BAHDx4E4Ny5c0QiEWKxGKdPn540CNl0Qw1fd911PPDAAwwMDNDf38+PfvQjfuM3fiOHf9oLy9s9\nd/Da3f/u52FGq9dRqk5VkaLX2NhIb28v8Xicuro6PvjBD/K2t72N7du3s2XLFtavX3/B99922218\n5CMfYdOmTWzZsoVrrrkGgM2bN7N161YaGxtZu3Yt11577fh7br31Vt70pjdRV1fHo48+Oj5/27Zt\nfPjDHx7/jI9+9KNs3bp13u7uZHM0Mu+Mtm/f7mY6f3Qmvzzdy01/8ziPv+Y+VvbshU8dzVF1InKx\njh49yoYNG/wuo6Bk+03NbK9zbvtM783rZpm1NRHKSgIcYS30tkLvKb9LEhFZEPI63EPBABvrKvnZ\n2JWqancXEQHyPNzBa3d/qHMpzgK6mEnEZ3418xaiy/0t8z7cm+IxOkdKGFl8hcJdxEdlZWV0dXUp\n4HPAOUdXVxdlZWWX/Bl5fbYMZAz/G9nAyrYnwTmY5uozEZk7DQ0NtLS00NGhCwpzoaysjIaGhkt+\nf96H+xVLo4RDAQ7bWlb2PQC9bVBZ73dZIkWnpKSENWvW+F2GpOV9s0woGGBDXSU/6x/rVFXTjIhI\n3oc7eJ2qD3fW4iyocBcRoUDCvSleSedwkJElVyrcRUQomHBPD/8bSd9TVb31IlLkCiLcr1xWQWko\n4A3/O9AFPS1+lyQi4quCCPeSYIANyyt4on+FN0NNMyJS5GYMdzNbYWaPmtlRMztsZp/Iss71ZtZj\nZvvTjz+bm3Kn1xiP8VBnNS5QonAXkaI3m/PcE8CnnHP7zKwC2Gtm/+acm3obkiecc2/NfYmz0xyP\n8b2njZEVVxFWuItIkZtxz9051+ac25ee7gWOAvG5LuxiNatTVURk3EW1uZvZamAr8HSWxa83swNm\n9pCZNeagtoty5bIKSoMBjtg6GOqGs8fnuwQRkQVj1uFuZlHgfuB259y5KYv3Aaucc5uBrwAPTPMZ\nt5rZHjPbk+vxJ0pDAa5aXsHusStV2zT8r4gUr1mFu5mV4AX7vc65H05d7pw755zrS08/CJSYWU2W\n9e5xzm13zm2vra29zNLP1xSv5OH2JbhgqTpVRaSozeZsGQO+BRx1zt05zTrL0+thZtekP7crl4XO\nRlM8RtcQjNRsVLiLSFGbzdky1wK/CzxnZmNtHZ8FVgI4574B3ALcZmYJYBDY6XwY1Hm8U7V8Pata\nH4JUCgIFcSq/iMhFmTHcnXO7gQsOkO6c+yrw1VwVdamuWl5BKGAcsXWsGu6Bsy9D9Tq/yxIRmXcF\ntVsbDgW5cllGp6qaZkSkSBVUuIPXNPPT9ipcqEzhLiJFq+DCvakhRsegY6SmEVp1OqSIFKeCC/eJ\nK1XXe+e6p1I+VyQiMv8KLtzXL68gGDAOsxZG+qDrmN8liYjMu4IL97KSIFcsjbJbw/+KSBEruHAH\nr2nmkfYYrqRc4S4iRakww70hRvtAkpHaJoW7iBSlggz3xvqMTtVTByGV9LkiEZH5VZDhvrGukoDB\nEdbB6AB0/tLvkkRE5lVBhvui0iBXLK3gcV2pKiJFqiDDHbwRIh85XYkrjSrcRaToFHC4V9LRP8po\nbbPCXUSKTsGG+9iVqm2R9XDqOUiO+lyRiMj8Kdhw31jvdaoeZh0khqDjeb9LEhGZNwUb7uWlIdbV\nRnlivFNVg4iJSPEo2HAHr1P1309HIBxTu7uIFJWCD/fTfaOMLNWVqiJSXAo63Cc6VTfA6UOQGPG5\nIhGR+VHQ4d5YX4kZHHbrIDkC7Uf8LklEZF4UdLhHwiHW1kR0paqIFJ2CDnfw2t0fO10OZVXenZlE\nRIpAwYd7czzGqd5hRpZt1p67iBSNgg/3prFO1fL1cPoIjA75XJGIyNwr+HBvrK8E8O6pmhqF9sM+\nVyQiMvdmDHczW2Fmj5rZUTM7bGafyLKOmdmXzeyYmR00s21zU+7FqygrYU1NhMf7dE9VESkeoVms\nkwA+5ZzbZ2YVwF4z+zfnXOZ5hW8Crkg/XgfcnX5eEJriMZ44noTyaoW7iBSFGffcnXNtzrl96ele\n4CgQn7LaO4C/d56ngCozq8t5tZeoOV7JyZ6hdKeqzpgRkcJ3UW3uZrYa2Ao8PWVRHDiR8bqF8zcA\nvpnoVN0A7UdhdNDnikRE5tasw93MosD9wO3OuXNTF2d5i8vyGbea2R4z29PR0XFxlV6GsRtmH2Yt\nuCScOjRv3y0i4odZhbuZleAF+73OuR9mWaUFWJHxugFonbqSc+4e59x259z22traS6n3ksQWlbCq\nujxj+F+1u4tIYZvN2TIGfAs46py7c5rVfgL8XvqsmR1Aj3OuLYd1XrameIwnTpVAZKnCXUQK3mzO\nlrkW+F3gOTMb6438LLASwDn3DeBB4M3AMWAA+EjuS708zfEY/3KwjZGNmylVuItIgZsx3J1zu8ne\npp65jgP+KFdFzYWm+ol7qq56+f/CSD+URnyuSkRkbhT8FapjmuLpK1XdOnAp76bZIiIFqmjCvaq8\nlBVLFqlTVUSKQtGEO3jt7j87HYKKOoW7iBS0ogr3xvoYr54ZYFTD/4pIgSuqcG/OHP6380UYmnot\nlohIYSjKcD/EOsDBqYP+FiQiMkeKKtwXR0qJVy3i8b70sDcaRExEClRRhTt4p0Q+dToAsRVqdxeR\nglV04d4cj3G8a4DRZZsU7iJSsIou3CfdU/XMSzDY7XNFIiK5V3ThPt6p6tZ5M9oO+FiNiMjcKLpw\nr46GqY+V8Xj/WKeqmmZEpPAUXbgDNMZj/OI0ULVK4S4iBakow705HuPlzn5Gl2+BNp0OKSKFp2jD\n3bl0p+rZ4zBwxu+SRERyqijDvWm8U3WNN0N77yJSYIoy3GsrwiyrDPNEnzpVRaQwFWW4g9c088xp\nB0vWKtxFpOAUbbg3xWO81NFHYtkWjTEjIgWnaMN9rFO1NbIeek5Af6ffJYmI5EzRhvtEp+pab4b2\n3kWkgBRtuC+rLKO2IswTffXeDLW7i0gBKdpwB69pZu+pBFRfoXAXkYJS1OHeFI9xrL2PxHLdU1VE\nCktxh3t9JSkHbZEN0NsKvaf8LklEJCdmDHcz+7aZtZvZoWmWX29mPWa2P/34s9yXOTeaG9SpKiKF\naTZ77n
8H3DzDOk8457akH39x+WXNj+WVZdRES3mitw4soGEIRKRgzBjuzrnHgYIcWcvMaIrH2Hdq\nFGquVLu7iBSMXLW5v97MDpjZQ2bWmKPPnBdN9TFebO8juXyLF+7O+V2SiMhly0W47wNWOec2A18B\nHphuRTO71cz2mNmejo6OHHz15WuKx0imnHelat9p6G3zuyQRkct22eHunDvnnOtLTz8IlJhZzTTr\n3uOc2+6c215bW3u5X50T53eqqmlGRPLfZYe7mS03M0tPX5P+zK7L/dz5Uh8rY0mklN29y8GCCncR\nKQihmVYws/uA64EaM2sBPgeUADjnvgHcAtxmZglgENjpXP40XJsZjfWV7GsbgaUbdDqkiBSEGcPd\nOff+GZZ/FfhqziryQXM8xj2P/4rEazcTevFfvU5V72BERCQvFfUVqmOa4zESKcep8vUw0Ak9LX6X\nJCJyWRTuTAz/+5w6VUWkQCjcgYbFi6gqL+FnvcsgEFK4i0jeU7iTvlK1Psb+U0OwdKPCXUTynsI9\nrSke44VTvSTrtnhjzOTPCT8iIudRuKc1x2OMJp03/O/gWeh+xe+SREQumcI9rXmsUzW1xpuhphkR\nyWMK97QVSxZRWRbi571LIViqcBeRvKZwTxsb/vdA2yAsa1S4i0heU7hnaI7HeL6tNz387wFIpfwu\nSUTkkijcMzTFY4wkU5yKbIDhHjj7st8liYhcEoV7hrErVQ+hK1VFJL8p3DOsWlJORTjEz87VQKhM\n4S4ieUvhniEQMBrjlRxoHYBlTRr+V0TylsJ9iuZ4jKNt5yauVFWnqojkIYX7FE3xGCOJdKfqSB90\nHfO7JBGRi6Zwn0LD/4pIIVC4T7GmOkI0HOKpc0ugpNxrmhERyTMK9ykCAWNjfSUHWvth+SbtuYtI\nXlK4Z9FU73Wqpuo2Q9sBSCX9LklE5KIo3LNobqhkaDTF6egGGB2Azl/6XZKIyEVRuGcxNvzvwZQ6\nVUUkPyncs1hTE6W8NMhTPYuhNKpwF5G8o3DPIhgwNtZVcrC1D+o2K9xFJO8o3KfRFI9xpPUcqbot\ncOo5SCb8LklEZNZmDHcz+7aZtZvZoWmWm5l92cyOmdlBM9uW+zLnX3M8xuBokvbIekgMQcfzfpck\nIjJrs9lz/zvg5gssfxNwRfpxK3D35Zflv+aGdKeqrlQVkTw0Y7g75x4HzlxglXcAf+88TwFVZlaX\nqwL9srYmQllJgCe7YxCuVLiLSF7JRZt7HDiR8bolPS+vhYIBNtZVclidqiKSh3IR7pZlnsu6otmt\nZrbHzPZ0dHTk4KvnVnM8xuHWHlzdVjh9CBIjfpckIjIruQj3FmBFxusGoDXbis65e5xz251z22tr\na3Pw1XOrKR6jfyTJ6YoNkByBjqN+lyQiMiu5CPefAL+XPmtmB9DjnGvLwef6brxTVVeqikiemc2p\nkPcBTwJXmVmLmf2+mf2hmf1hepUHgV8Bx4BvAh+bs2rn2Wtqo4RDAX5xtgLKqhTuIpI3QjOt4Jx7\n/wzLHfBHOatoAQkFA2yoq+S51nNQv0XhLiJ5Q1eozsDrVD2X7lQ9AqNDfpckIjIjhfsMmuMx+oYT\ntFdsgNQotB/2uyQRkRkp3GfQGK8EdKWqiOQXhfsMrlxWQWkowDNnIlBeDa26p6qILHwK9xmUBANs\nWF7BcyfPQd0WhbuI5AWF+yw0xWMcau3B1W+F9iMwOuh3SSIiF6Rwn4WmeIzeoQQdFRvAJeFU1tGP\nRUQWDIX7LOieqiKSbxTus3DlsgpKgwGeOVMGkaUKdxFZ8BTus1AaCnDV8goOtZ6D+q3Qpk5VEVnY\nFO6z1BSv5NDJc7i6zd4t90b6/S5JRGRaCvdZaorH6BkcpbOyEVzKu2m2iMgCpXCfpYlO1dXeDLW7\ni8gCpnCfpauWVxAKGHvOlEFFncJdRBY0hfsshUNBrlxWwaGTPV6nqsJdRBYwhftFaI7HOHSyB1e3\nBTpfhOFev0sSEclK4X4RmhpinB0YpSu2EXDQdtDvkkREslK4X4TxTtXkGm+GmmZEZIFSuF+E9csr\nCAaMvV0hqGxQuIvIgqVwvwhlJUGuWBrl0Mn0PVVPPA3dJ/wuS0TkPAr3izTeqbrxndBzAu5qhu++\nDfbfp6tWRWTBULhfpOaGGF39I7StfCt8fD9cfwd0vwoP/CH89RXwo9vg5cchlfK7VBEpYiG/C8g3\nTelO1edO9lDfuMYL9zd8Bl59EvZ/Dw4/AAe+B7GVsPl9sPn9UL3O56pFpNhoz/0ibVheScDg8Mme\niZlmsOrX4B1fhU//Et79t1DzGnjif8BXtsG3boI934HBbv8KF5Gioj33i7SoNMgVSyt4LjPcM5WW\nw6bf8R7nWuHgP3rt8f98Ozz0GVj/FtjyAVh7AwT184vI3FC6XIKmeIz/98sOnHOY2fQrVtbDr38S\nrr0dWvd5IX9oFxz+IUSXwab3wuYPwLKN81e8iBSFWTXLmNnNZvaCmR0zszuyLP+wmXWY2f7046O5\nL3XhaI5X0tk3zOlzw7N7gxnEr4a3fAk+9QK89x+810/dDXe/Hv7XdfDUN6C/c24LF5GiMeOeu5kF\nga8BvwW0AM+Y2U+cc0emrPqPzrk/noMaF5yxTtVDJ3tYHiu7uDeHwrDx7d6jvxOe+z9eR+zDn4Gf\n/le44rdhy/u951DpHFQvIsVgNs0y1wDHnHO/AjCz7wPvAKaGe9HYWO91qj53soc3blx26R8UqYEd\nt3mP04fhwH1w8Afwwr/AoiXQfIt3tk39Vm/vX0RklmbTLBMHMi/DbEnPm+o9ZnbQzHaZ2YpsH2Rm\nt5rZHjPb09HRcQnlLgzlpSHW1Ua94X9zZVkj3PSX8Mkj8MFdsPZ62Ptd+OYN8PUdsPsuONeWu+8T\nkYI2m3DPtsvoprz+J2C1c24T8Ajw3Wwf5Jy7xzm33Tm3vba29uIqXWCa4zF+cfwMX3v0GD9/qZP+\n4URuPjgYgit+C37nO95plW+9C8pi8Mjn4G82wj+8G57bBaODufk+ESlIs2mWaQEy98QbgNbMFZxz\nXRkvvwn81eWXtrC95+oGDrR089f/+gIAAYOrlleybWUV21YuZtuqxayuLr/w2TQzWVQF2z/iPbpe\n8pptDnwf7v99CFdC4zu9s21W7lCzjYhMYs5N3QmfsoJZCPglcCNwEngG+IBz7nDGOnXOubb09LuA\nzzjndlzoc7dv3+727NlzmeX7r3tghGdPdPPsK2fZ92o3+09005fei18SKWXriiq2pgN/84oqIuHL\nPPs0lYJXdnunVR75MYz2QzgGsbh36mVlPP2o9x6xBu85XJGDP62I+M3M9jrnts+43kzhnv6wNwN3\nAUHg2865L5jZXwB7nHM/MbP/DrwdSABngNucc89f6DMLJdynSqYcx9r72PfqWfa9cpZ9r57lpQ5v\nQLGc790P98HRf4KTe70Lps6d9J77289fN1w5EfiTNgDxiQ1DuFJ
HACILXE7DfS4Uarhnk7l3/+yJ\nbva/2k1veu9+cXkJW1cuHg/8nOzdJ4ahty0d+K3Q0zI5/M+1Qt9pzus6KY1m3/vP3ACUVWkDIOIj\nhfsCdjF791tXVrGmJnJ5bffZJEag79RE6PecnLIBOAm9pzhvA1BSPv0GoGoFxFZAWWVuaxWRcQr3\nPDPve/ezkRz19vAn7f2PbQDSG4HeNnBThjcuq4KqlenHqozpld4GoCw297WLFCiFe56bae/+ymUV\nbFu1mG0rF7N+eQU10TDV0VJKgvM80Gcy4bXx95z0bl7S/er5j8SU0zbLYtmDP7bCe15UNb9/BpE8\nonAvQN0DI+w/0c2+V7t59tWzk/bux1SVl1AdKaUmGqamIkxNxnR1pJSaijC16Q1Beek87P07BwNd\n0P1KluA/4c0fHZj8nnDs/L39zNdq95cipnAvAmN798e7+unsG6azd4Su/uHx6c7+YTp7hzk3lP0C\nq/LS4Pgef000nH6UZp0XW1SS+3Z/SIf/mcnhn3kEcPYV73TPTOHK8/f2Mx+LFiv8pWDNNtw15G8e\nCwaMq5ZXcNXyC5/DPpxIcqZ/xAv8vnT493nTXenpE2cGePbVbs70D5PKsr0vCRrVkewbgpqKUqoj\n4fHpJeWlhGbbPGQGkWrvEd92/nLnYPBsRvhPCf6Xn4CR3ik/TClEar2xeyK1EFmaMV0L0dqJ6fIa\nDdAmBUnhXgTCoSB1sUXUxRbNuG4y5Tg7MEJX3/Qbgs6+YY6199HRN8xI4vx7xZrB4vJSaqLp0K/I\n2BCMHxlMTJeVBKcvyAzKl3iP+q3nL3cOhronN/f0tUN/x8Sj4wVvXnKaIZrLqibCPlID0aVZNg7p\n12UxHRVIXlC4yyTBgI3vmV/FhY8InHP0DicmNgS9w3T2j3jPfcPj859r6aazb2T8yt2pouHQtM1B\nU/sLKsKhyc1DZl4zzKLFULf5QsXCcG868DvTz+0Z0x3Ql94QHN8Ng2em+YGmHhVke6Q3EDoqEB8p\n3OWSmRmVZSVUlpWwpiYy4/pDo8nxI4GuKUcFnX3eRuFXHf08c/wsZwdGyNYdVBoKeJ3EY4GfEf61\nFd5GoaIsRCQcIhoOUV4aJFIaIhAw7/z7ssrZ3bA8Oer1BfS3T94g9E3ZIMz2qGD8aGBsusY7Isic\nLo3oqEByRuEu86asJEjD4nIaFpfPuG4imfL6CTKah8aOBDrSG4P23mGOtJ2jq2+ERLaOggzlpcHx\nwI+EvcCPhkOUh0NE068nlk+sEwnXEw2vJFIZHF9WXhqcfPTgHIz0TWwE+tonHxWMNRO1H/Gmh6a5\nUXpoUbo/YOmUvoGlk6cjtd6RSkD3t5fpKdxlQQoFAyytLGNp5cx3ukqlHOeGRsePAHqHEvQPJ+gb\n9p77R5Lec+a84SRtPUMMjCToG/aWD44mZ1WbGURKvZDP3Bh40yVEwquIlK71NhYVIcqrMzcWIaKh\nJBWpHioSZ1k0eoay4S4CA52T+wp6Wrz77vZ3gstSVyDkNftM7SAeP0IYOyKo8QaNK41C4AJ9G1Jw\nFO6S9wIBo6q8lKryUl6z9NI/J5ly9I8kxjcE/enQ7xtO0J+xEZi0LGP9k91D9A8n0huMBEOj53c2\nZ1fFopJqIuHGjI1AkEh1iOiiFYfvAAAGEUlEQVTyADWhfmqtlxrrYYnrIebOUpnoJpI8S/lIF2U9\nXZS0v0hosINAYmj6rykpnwj6cBRKK7zX4egF5lVkf4/6EhY8hbtIWjAw0YeQC4lkioHR5KSNgbfx\nSI5vAAaGk95zeuMxMDKx8TjTP8KJsfeOlNA/XEXKVQGrpvlGR4Qhqu0cy4PnqA/1sTzUx+LQMLHg\nMJU2REVgiGhiiEhikEV9/SxyHZSmBihNDlCS6CeYvMDGYdKPVXrhjUS4MmN51NuwWCD9sPQj/ZqM\n6fHlgVmukzF9wXXSr4Nhr2+jNALB3Pw9L1QKd5E5EgoGqAwGcraxcM4xnEhl2SgkGBhJv57SDNUx\nnOR4eqMy1lzVP5ygN/08tasiSJIIg0QZImJDRBmkwoaoLhlmSckIi0MjxALDxIJDVNogETdEdHiI\nRUODLHJthDM2FKFEP3beTdsWkPGgj04E/nSvw9Esy7JMB0sXTKe4wl0kT5gZZSVB77qA6OV/nnOO\nwdFkuh8iSd/QRJ9E3/D50+eGErSljzD6hka992Ssd36ntmMRw0QZpMxGMCBAigAOSz8C44/UtPOC\n5gga6WdHyBwBMwKWIjQ2H0dg0nTG+3AELEUQRziQpMKGidgQERsmwiDlqSEWDQ2xaHCQMneGcOok\n4dQgpalBSpIDlKRmeTQDuEAIV+IFvpVGIZx+nrQxiHr3SL7ypsv/S7wAhbtIkTIzyktD3hhDl3mj\nrsyjivENwtDEEcPwaIqUc6QcpJzDOUcylfkaks6NT6dSLv3a++yUcyRTE9Mp5/WRuIzPTDlHKgWJ\nqfPGplOO0aRjJJlieDSZfk55z4kkI4kUw4nU+HMyvbEKkGIRw5Snj2YipDcK6WnveWh8efnIMJH+\nofH50UAHUWvxljNEOYO8cGqYrQp3EVnoMo8qaqJhv8vJiUTSC/7JoZ9keMpGYGx+5uvuRJL2KesN\nZ6z3xg3LyHK9dU4p3EVEsggFA4SCAcrz9MQgXQUhIlKAFO4iIgVI4S4iUoAU7iIiBUjhLiJSgBTu\nIiIFSOEuIlKAFO4iIgXIXLbb3czHF5t1AK9c4ttrgM4clpPv9HtMpt9jgn6LyQrh91jlnKudaSXf\nwv1ymNke59x2v+tYKPR7TKbfY4J+i8mK6fdQs4yISAFSuIuIFKB8Dfd7/C5ggdHvMZl+jwn6LSYr\nmt8jL9vcRUTkwvJ1z11ERC4g78LdzG42sxfM7JiZ3eF3PX4ysxVm9qiZHTWzw2b2Cb9r8puZBc3s\nWTP7Z79r8ZuZVZnZLjN7Pv1v5PV+1+QXM/tk+v/IITO7z8zK/K5pruVVuJtZEPga8CZgI/B+M9vo\nb1W+SgCfcs5tAHYAf1TkvwfAJ4CjfhexQPxP4GHn3HpgM0X6u5hZHPg4sN051wQEgZ3+VjX38irc\ngWuAY865XznnRoDvA+/wuSbfOOfanHP70tO9eP954/5W5R8zawDeAvyt37X4zcwqgeuAbwE450ac\nc93+VuWrELDIzEJAOdDqcz1zLt/CPQ6cyHjdQhGHWSYzWw1sBZ72txJf3QX8ZyDldyELwFqgA/hO\nupnqb80s4ndRfnDOnQS+BLwKtAE9zrmf+lvV3Mu3cLcs84r+dB8ziwL3A7c75875XY8fzOytQLtz\nbq/ftSwQIWAbcLdzbivQDxRlH5WZLcY7wl8D1AMRM/uQv1XNvXwL9xZgRcbrBorg8OpCzKwEL9jv\ndc790O96fHQt8HYzO47XXPebZva//S3JVy1Ai3Nu7EhuF17YF6M3Ai875zqcc6PAD4Ff87mmOZdv\n4f4McIWZrTGzUrxOkZ
/4XJNvzMzw2lSPOufu9LsePznn/otzrsE5txrv38W/O+cKfu9sOs65U8AJ\nM7sqPetG4IiPJfnpVWCHmZWn/8/cSBF0Lof8LuBiOOcSZvbHwL/i9Xh/2zl32Oey/HQt8LvAc2a2\nPz3vs865B32sSRaO/wjcm94R+hXwEZ/r8YVz7mkz2wXswzvD7FmK4EpVXaEqIlKA8q1ZRkREZkHh\nLiJSgBTuIiIFSOEuIlKAFO4iIgVI4S4iUoAU7iIiBUjhLiJSgP4/IkqTlk5IP88AAAAASUVORK5C\nYII=\n", 383 | "text/plain": [ 384 | "" 385 | ] 386 | }, 387 | "metadata": {}, 388 | "output_type": "display_data" 389 | } 390 | ], 391 | "source": [ 392 | "plt.plot(tr_loss_history, label = 'train')\n", 393 | "plt.plot(val_loss_history, label = 'validation')\n", 394 | "plt.legend()" 395 | ] 396 | }, 397 | { 398 | "cell_type": "code", 399 | "execution_count": 10, 400 | "metadata": {}, 401 | "outputs": [], 402 | "source": [ 403 | "hat = mnist_classifier.predict(sess=sess, x_data=mnist.test.images, training=False)" 404 | ] 405 | }, 406 | { 407 | "cell_type": "code", 408 | "execution_count": 11, 409 | "metadata": {}, 410 | "outputs": [ 411 | { 412 | "name": "stdout", 413 | "output_type": "stream", 414 | "text": [ 415 | "accuracy : 95.48%\n" 416 | ] 417 | } 418 | ], 419 | "source": [ 420 | "print('accuracy : {:.2%}'.format(np.mean(np.argmax(mnist.test.labels, axis = 1) == hat)))" 421 | ] 422 | } 423 | ], 424 | "metadata": { 425 | "kernelspec": { 426 | "display_name": "Python 3", 427 | "language": "python", 428 | "name": "python3" 429 | }, 430 | "language_info": { 431 | "codemirror_mode": { 432 | "name": "ipython", 433 | "version": 3 434 | }, 435 | "file_extension": ".py", 436 | "mimetype": "text/x-python", 437 | "name": "python", 438 | "nbconvert_exporter": "python", 439 | "pygments_lexer": "ipython3", 440 | "version": "3.6.3" 441 | }, 442 | "varInspector": { 443 | "cols": { 444 | "lenName": 16, 445 | "lenType": 16, 446 | "lenVar": 40 447 | }, 448 | "kernels_config": { 449 | "python": { 450 | "delete_cmd_postfix": "", 451 | "delete_cmd_prefix": "del ", 452 | "library": "var_list.py", 453 | "varRefreshCmd": "print(var_dic_list())" 454 | }, 455 | "r": { 456 | "delete_cmd_postfix": ") ", 457 | "delete_cmd_prefix": "rm(", 458 | "library": "var_list.r", 459 | "varRefreshCmd": "cat(var_dic_list()) " 460 | } 461 | }, 462 | "types_to_exclude": [ 463 | "module", 464 | "function", 465 | "builtin_function_or_method", 466 | "instance", 467 | "_Feature" 468 | ], 469 | "window_display": false 470 | } 471 | }, 472 | "nbformat": 4, 473 | "nbformat_minor": 2 474 | } 475 | -------------------------------------------------------------------------------- /Tutorial of implementing DL model by OOP.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Tutorial of implementing Deep learning model by OOP\n", 8 | "- **Ch1. Object Oriented Programming** \n", 9 | "파이썬으로 객체지향 프로그래밍을 하는 방법에 대하 더 자세한 사항을 확인하려면 아래를 링크를 참고\n", 10 | " - 점프투파이썬(Class) : https://wikidocs.net/28 \n", 11 | " \n", 12 | " \n", 13 | "- **Ch2. Example : Deep Neural Network for MNIST dataset** \n", 14 | "Class를 활용하여 5개의 hidden layer를 가지는 DNN 모형을 구성하는 예제" 15 | ] 16 | }, 17 | { 18 | "cell_type": "markdown", 19 | "metadata": {}, 20 | "source": [ 21 | "## Ch1. 
Object Oriented Programming (OOP)\n", 22 | "* 클래스(class) : 똑같은 무엇인가를 계속해서 만들어낼 수 있는 설계 도면 (과자틀)\n", 23 | "* 객체(object) : 과자틀에 의해서 만들어진 과자들" 24 | ] 25 | }, 26 | { 27 | "cell_type": "markdown", 28 | "metadata": {}, 29 | "source": [ 30 | "### Class" 31 | ] 32 | }, 33 | { 34 | "cell_type": "code", 35 | "execution_count": 1, 36 | "metadata": {}, 37 | "outputs": [], 38 | "source": [ 39 | "class Programmer:\n", 40 | " pass\n", 41 | "kim = Programmer()\n", 42 | "park = Programmer()" 43 | ] 44 | }, 45 | { 46 | "cell_type": "markdown", 47 | "metadata": {}, 48 | "source": [ 49 | "### Class variable\n", 50 | "객체(object)간 서로 공유되는 변수로 보통 클래스(class)에 의해 생성되는 객체들이 공통적으로 사용할 목적으로 쓰임" 51 | ] 52 | }, 53 | { 54 | "cell_type": "code", 55 | "execution_count": 2, 56 | "metadata": {}, 57 | "outputs": [], 58 | "source": [ 59 | "class ToBigs:\n", 60 | " name = 'ToBigs'" 61 | ] 62 | }, 63 | { 64 | "cell_type": "code", 65 | "execution_count": 3, 66 | "metadata": {}, 67 | "outputs": [ 68 | { 69 | "data": { 70 | "text/plain": [ 71 | "'ToBigs'" 72 | ] 73 | }, 74 | "execution_count": 3, 75 | "metadata": {}, 76 | "output_type": "execute_result" 77 | } 78 | ], 79 | "source": [ 80 | "tobigs = ToBigs()\n", 81 | "tobigs.name" 82 | ] 83 | }, 84 | { 85 | "cell_type": "code", 86 | "execution_count": 4, 87 | "metadata": {}, 88 | "outputs": [], 89 | "source": [ 90 | "ToBigs.name = '투빅스'" 91 | ] 92 | }, 93 | { 94 | "cell_type": "code", 95 | "execution_count": 5, 96 | "metadata": {}, 97 | "outputs": [ 98 | { 99 | "data": { 100 | "text/plain": [ 101 | "'투빅스'" 102 | ] 103 | }, 104 | "execution_count": 5, 105 | "metadata": {}, 106 | "output_type": "execute_result" 107 | } 108 | ], 109 | "source": [ 110 | "tobigs.name" 111 | ] 112 | }, 113 | { 114 | "cell_type": "markdown", 115 | "metadata": {}, 116 | "source": [ 117 | "### Method\n", 118 | "클래스(class) 내부의 함수로 \"파이썬은 객체를 통해 클래스의 함수를 호출할 때, 호출한 객체 자신이 호출한 클래스 함수의 첫번째 입력 인수로 전달된다.\"" 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "execution_count": 6, 124 | "metadata": {}, 125 | "outputs": [], 126 | "source": [ 127 | "class ToBigs:\n", 128 | " name = 'ToBigs'\n", 129 | " def add_member(self, name):\n", 130 | " member_name = name\n", 131 | " return '동아리 : {}, 회원이름 : {}'.format(self.name, member_name)" 132 | ] 133 | }, 134 | { 135 | "cell_type": "code", 136 | "execution_count": 7, 137 | "metadata": {}, 138 | "outputs": [ 139 | { 140 | "data": { 141 | "text/plain": [ 142 | "'동아리 : ToBigs, 회원이름 : 김보섭'" 143 | ] 144 | }, 145 | "execution_count": 7, 146 | "metadata": {}, 147 | "output_type": "execute_result" 148 | } 149 | ], 150 | "source": [ 151 | "# 객체에서 Method를 호출할 때\n", 152 | "tobigs = ToBigs()\n", 153 | "tobigs.add_member('김보섭')" 154 | ] 155 | }, 156 | { 157 | "cell_type": "code", 158 | "execution_count": 8, 159 | "metadata": {}, 160 | "outputs": [ 161 | { 162 | "data": { 163 | "text/plain": [ 164 | "'동아리 : ToBigs, 회원이름 : 김보섭'" 165 | ] 166 | }, 167 | "execution_count": 8, 168 | "metadata": {}, 169 | "output_type": "execute_result" 170 | } 171 | ], 172 | "source": [ 173 | "# 클래스에서 Method를 호출할 때, 잘 사용되지 않는 방법임\n", 174 | "ToBigs.add_member(tobigs, '김보섭')" 175 | ] 176 | }, 177 | { 178 | "cell_type": "markdown", 179 | "metadata": {}, 180 | "source": [ 181 | "### Object variable\n", 182 | "객체 변수(Object variable)은 객체별로 고유한 값이 저장되는 변수이다. 
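A minimal stand-alone sketch (an assumption for illustration, not a cell from the notebook) of the two ideas just introduced: calling a method through an object passes that object in as `self`, and attributes assigned on `self` belong to that object only.

```python
# Illustrative sketch only (not part of the notebook).
class Club:
    def __init__(self):
        self.member = []                 # object variable, created per instance in __init__

    def add_member(self, member_name):   # the calling object arrives here as `self`
        self.member.append(member_name)

a = Club()
b = Club()
a.add_member('kim')                      # equivalent to Club.add_member(a, 'kim')
print(a.member)                          # ['kim']
print(b.member)                          # []  -> each object keeps its own list
```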
반면에 클래스 변수(Class variable)은 객체간 서로 공유되는 변수" 183 | ] 184 | }, 185 | { 186 | "cell_type": "code", 187 | "execution_count": 9, 188 | "metadata": {}, 189 | "outputs": [], 190 | "source": [ 191 | "class ToBigs:\n", 192 | " name = 'ToBigs'\n", 193 | " def make_list(self):\n", 194 | " self.member = []\n", 195 | " def add_member(self, member_name):\n", 196 | " self.member.append(member_name)\n", 197 | " print('{}을 {}에 추가하였습니다.'.format(member_name, self.name))" 198 | ] 199 | }, 200 | { 201 | "cell_type": "code", 202 | "execution_count": 10, 203 | "metadata": {}, 204 | "outputs": [], 205 | "source": [ 206 | "first_members = ToBigs()" 207 | ] 208 | }, 209 | { 210 | "cell_type": "code", 211 | "execution_count": 11, 212 | "metadata": {}, 213 | "outputs": [], 214 | "source": [ 215 | "first_members.make_list()" 216 | ] 217 | }, 218 | { 219 | "cell_type": "code", 220 | "execution_count": 12, 221 | "metadata": {}, 222 | "outputs": [ 223 | { 224 | "name": "stdout", 225 | "output_type": "stream", 226 | "text": [ 227 | "김보섭을 ToBigs에 추가하였습니다.\n" 228 | ] 229 | } 230 | ], 231 | "source": [ 232 | "first_members.add_member('김보섭')" 233 | ] 234 | }, 235 | { 236 | "cell_type": "markdown", 237 | "metadata": {}, 238 | "source": [ 239 | "### __init__ Method\n", 240 | "객체를 만들 때, 항상 실행되는 메서드" 241 | ] 242 | }, 243 | { 244 | "cell_type": "code", 245 | "execution_count": 13, 246 | "metadata": {}, 247 | "outputs": [], 248 | "source": [ 249 | "class ToBigs:\n", 250 | " name = 'ToBigs'\n", 251 | " def __init__(self):\n", 252 | " self.make_list()\n", 253 | " \n", 254 | " def make_list(self):\n", 255 | " self.member = []\n", 256 | " \n", 257 | " def add_member(self, member_name):\n", 258 | " self.member.append(member_name)\n", 259 | " print('{}을 {}에 추가하였습니다.'.format(member_name, self.name))" 260 | ] 261 | }, 262 | { 263 | "cell_type": "code", 264 | "execution_count": 14, 265 | "metadata": {}, 266 | "outputs": [ 267 | { 268 | "name": "stdout", 269 | "output_type": "stream", 270 | "text": [ 271 | "김보섭을 ToBigs에 추가하였습니다.\n" 272 | ] 273 | } 274 | ], 275 | "source": [ 276 | "first_members = ToBigs()\n", 277 | "first_members.add_member('김보섭')" 278 | ] 279 | }, 280 | { 281 | "cell_type": "code", 282 | "execution_count": 15, 283 | "metadata": { 284 | "code_folding": [] 285 | }, 286 | "outputs": [], 287 | "source": [ 288 | "class ToBigs:\n", 289 | " name = 'ToBigs'\n", 290 | " def __init__(self):\n", 291 | " self.member = []\n", 292 | " \n", 293 | " def add_member(self, member_name):\n", 294 | " self.member.append(member_name)\n", 295 | " print('{}을 {}에 추가하였습니다.'.format(member_name, self.name))" 296 | ] 297 | }, 298 | { 299 | "cell_type": "code", 300 | "execution_count": 16, 301 | "metadata": {}, 302 | "outputs": [ 303 | { 304 | "name": "stdout", 305 | "output_type": "stream", 306 | "text": [ 307 | "이경택을 ToBigs에 추가하였습니다.\n", 308 | "김강진을 ToBigs에 추가하였습니다.\n", 309 | "김보섭을 ToBigs에 추가하였습니다.\n", 310 | "김윤진을 ToBigs에 추가하였습니다.\n", 311 | "김현주을 ToBigs에 추가하였습니다.\n" 312 | ] 313 | } 314 | ], 315 | "source": [ 316 | "first_members = ToBigs()\n", 317 | "first_members.add_member('이경택')\n", 318 | "first_members.add_member('김강진')\n", 319 | "first_members.add_member('김보섭')\n", 320 | "first_members.add_member('김윤진')\n", 321 | "first_members.add_member('김현주')" 322 | ] 323 | }, 324 | { 325 | "cell_type": "code", 326 | "execution_count": 17, 327 | "metadata": {}, 328 | "outputs": [ 329 | { 330 | "data": { 331 | "text/plain": [ 332 | "['이경택', '김강진', '김보섭', '김윤진', '김현주']" 333 | ] 334 | }, 335 | "execution_count": 17, 336 | "metadata": {}, 337 | "output_type": 
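A cautionary sketch (assumed, not a notebook cell) of why the member list is created inside `__init__` rather than declared as a class variable: a mutable class variable is shared by every object built from the class.

```python
# Assumed illustration: a mutable class variable leaks state between objects,
# which is why ToBigs builds self.member inside __init__ instead.
class SharedToBigs:
    member = []                          # class variable: one list shared by all objects

first_members = SharedToBigs()
second_members = SharedToBigs()
first_members.member.append('김보섭')
print(second_members.member)             # ['김보섭'] -> the other object sees it too
```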
"execute_result" 338 | } 339 | ], 340 | "source": [ 341 | "first_members.member" 342 | ] 343 | }, 344 | { 345 | "cell_type": "markdown", 346 | "metadata": {}, 347 | "source": [ 348 | "### Python coding convention\n", 349 | "변수명에서 *_* 은 위치에 따라 다음과 같은 의미가 있다. 더 자세한 사항은 아래의 링크를 참고 \n", 350 | " \n", 351 | "* **_single_leading_underscore** : 내부적으로 사용되는 변수\n", 352 | "* **single_trailing_underscore_** : 파이썬 기본 키워드와 충돌을 피하려고 사용\n", 353 | "\n", 354 | "링크 : https://spoqa.github.io/2012/08/03/about-python-coding-convention.html \n", 355 | "\n", 356 | "\n" 357 | ] 358 | }, 359 | { 360 | "cell_type": "code", 361 | "execution_count": 18, 362 | "metadata": {}, 363 | "outputs": [], 364 | "source": [ 365 | "class ToBigs:\n", 366 | " name = 'ToBigs'\n", 367 | " def __init__(self):\n", 368 | " self.member = []\n", 369 | " self._comment = '저는 여러분들을 잘 모릅니다. 이 멘트는 객체변수이고 접근할 순 있지만, 자동완성 기능 사용시 추천에 안뜹니다.'\n", 370 | " \n", 371 | " def add_member(self, member_name):\n", 372 | " self.member.append(member_name)\n", 373 | " print('{}을 {}에 추가하였습니다.'.format(member_name, self.name))" 374 | ] 375 | }, 376 | { 377 | "cell_type": "code", 378 | "execution_count": 19, 379 | "metadata": {}, 380 | "outputs": [], 381 | "source": [ 382 | "first_members = ToBigs()" 383 | ] 384 | }, 385 | { 386 | "cell_type": "code", 387 | "execution_count": 20, 388 | "metadata": {}, 389 | "outputs": [ 390 | { 391 | "data": { 392 | "text/plain": [ 393 | "'저는 여러분들을 잘 모릅니다. 이 멘트는 객체변수이고 접근할 순 있지만, 자동완성 기능 사용시 추천에 안뜹니다.'" 394 | ] 395 | }, 396 | "execution_count": 20, 397 | "metadata": {}, 398 | "output_type": "execute_result" 399 | } 400 | ], 401 | "source": [ 402 | "first_members._comment" 403 | ] 404 | }, 405 | { 406 | "cell_type": "markdown", 407 | "metadata": {}, 408 | "source": [ 409 | "## Ch2. Example : Deep Neural Network for MNIST\n", 410 | "모형을 생성하는 클래스는 우리가 최소로 필요로 하는 기능인 그래프 생성, 모형이 주로 하는 일인 predict만 Model Class의 Method로 구현한다. 실제 Training과 loss를 평가하는 evaluate는 Solver Class의 Method로 구현한다. 위에 언급한 내용을 반영하는 Deep Neural Network 모형을 만들어서 MNIST에 적용\n", 411 | "\n", 412 | "* **Model Class *(class DNNClassifier)*** \n", 413 | " * Model을 생성하는 Method가 필요, 기왕이면 객체를 생성할 때, Deep Learning 모형이 바로 생성되면 좋다. ***(init 메서드)***\n", 414 | " * Model을 생성할 때, 모든 layer에 대해서 공통으로 적용되는 것은 ***init 메서드***의 input으로 받는다. (eg. activation, weight initialization 방법)\n", 415 | " * 특정 데이터가 주어졌을 때, 결과를 예측하는 Method가 필요 ***(predict 메서드)***\n", 416 | " \n", 417 | " \n", 418 | "* **Solver Class *(class Solver)*** \n", 419 | " * Model Class의 instance와 어떤 Optimizer를 쓸 것인지를 input으로 받아 instance를 생성한다. ***(init 메서드)***\n", 420 | " * Optimizer가 training 할 때, learning rate를 메서드의 input으로 받아 조절할 수 있어야한다. ***(train 메서드)***\n", 421 | " * loss를 계산한다. 
***(evaluate 메서드)***" 422 | ] 423 | }, 424 | { 425 | "cell_type": "markdown", 426 | "metadata": {}, 427 | "source": [ 428 | "### Setup" 429 | ] 430 | }, 431 | { 432 | "cell_type": "code", 433 | "execution_count": 21, 434 | "metadata": {}, 435 | "outputs": [ 436 | { 437 | "name": "stdout", 438 | "output_type": "stream", 439 | "text": [ 440 | "Extracting ./MNIST_data\\train-images-idx3-ubyte.gz\n", 441 | "Extracting ./MNIST_data\\train-labels-idx1-ubyte.gz\n", 442 | "Extracting ./MNIST_data\\t10k-images-idx3-ubyte.gz\n", 443 | "Extracting ./MNIST_data\\t10k-labels-idx1-ubyte.gz\n" 444 | ] 445 | } 446 | ], 447 | "source": [ 448 | "import os, sys\n", 449 | "import tensorflow as tf\n", 450 | "import numpy as np\n", 451 | "import matplotlib.pylab as plt\n", 452 | "%matplotlib inline\n", 453 | "from tensorflow.examples.tutorials.mnist import input_data # load mnist dataset\n", 454 | "mnist = input_data.read_data_sets(train_dir = './MNIST_data', one_hot = True, reshape = True, seed = 777)" 455 | ] 456 | }, 457 | { 458 | "cell_type": "markdown", 459 | "metadata": {}, 460 | "source": [ 461 | "### Define DNNClassifier Class" 462 | ] 463 | }, 464 | { 465 | "cell_type": "code", 466 | "execution_count": 22, 467 | "metadata": {}, 468 | "outputs": [], 469 | "source": [ 470 | "class DnnClassifier:\n", 471 | " def __init__(self, sess, n_features, n_class, hidden_dims = [100, 100, 100, 100, 100],\n", 472 | " activation_fn = tf.nn.relu, initializer = tf.contrib.layers.xavier_initializer()):\n", 473 | " self._sess = sess\n", 474 | " \n", 475 | " with tf.variable_scope('input_layer'):\n", 476 | " self._x = tf.placeholder(dtype = tf.float32, shape = [None, n_features])\n", 477 | " self._y = tf.placeholder(dtype = tf.float32, shape = [None, n_class])\n", 478 | " \n", 479 | " _net = self._x\n", 480 | " \n", 481 | " for layer, h_dim in enumerate(hidden_dims):\n", 482 | " with tf.variable_scope('hidden_layer{}'.format(layer + 1)):\n", 483 | " _net = tf.layers.dense(inputs = _net, units = h_dim, activation = activation_fn, kernel_initializer = initializer)\n", 484 | " \n", 485 | " with tf.variable_scope('output_layer'):\n", 486 | " self._score = tf.layers.dense(inputs = _net, units = n_class, kernel_initializer = tf.contrib.layers.xavier_initializer())\n", 487 | " \n", 488 | " with tf.variable_scope('loss'):\n", 489 | " self._ce_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels = self._y, logits = self._score))\n", 490 | " \n", 491 | " with tf.variable_scope('predict'):\n", 492 | " self._prediction = tf.argmax(input = self._score, axis = -1)\n", 493 | " \n", 494 | " def predict(self, x_data):\n", 495 | " feed_predict = {self._x : x_data}\n", 496 | " return self._sess.run(fetches = self._prediction, feed_dict = feed_predict) " 497 | ] 498 | }, 499 | { 500 | "cell_type": "markdown", 501 | "metadata": {}, 502 | "source": [ 503 | "### Define Solver Class" 504 | ] 505 | }, 506 | { 507 | "cell_type": "code", 508 | "execution_count": 23, 509 | "metadata": {}, 510 | "outputs": [], 511 | "source": [ 512 | "class Solver:\n", 513 | " def __init__(self, sess, model, optimizer = tf.train.AdamOptimizer):\n", 514 | " self._sess = sess\n", 515 | " self._model = model # DnnClassifier의 class의 instance를 input으로 받는다\n", 516 | " self._lr = tf.placeholder(dtype = tf.float32)\n", 517 | " self._optimizer = optimizer(self._lr)\n", 518 | " self._training_op = self._optimizer.minimize(self._model._ce_loss)\n", 519 | " \n", 520 | " def train(self, x_data, y_data, lr):\n", 521 | " feed_train = {self._model._x : x_data,\n", 522 
| " self._model._y : y_data,\n", 523 | " self._lr : lr}\n", 524 | " return self._sess.run(fetches = [self._training_op, self._model._ce_loss], feed_dict = feed_train)\n", 525 | " \n", 526 | " def loss(self, x_data, y_data):\n", 527 | " feed_loss = {self._model._x : x_data, self._model._y : y_data}\n", 528 | " return self._sess.run(fetches = self._model._ce_loss, feed_dict = feed_loss) " 529 | ] 530 | }, 531 | { 532 | "cell_type": "markdown", 533 | "metadata": {}, 534 | "source": [ 535 | "### Training" 536 | ] 537 | }, 538 | { 539 | "cell_type": "code", 540 | "execution_count": 24, 541 | "metadata": {}, 542 | "outputs": [], 543 | "source": [ 544 | "# hyperparameter setting\n", 545 | "epochs = 2\n", 546 | "batch_size = 100\n", 547 | "tr_loss_history = []\n", 548 | "val_loss_history = []\n", 549 | "\n", 550 | "sess = tf.Session()\n", 551 | "dnn_model = DnnClassifier(sess = sess, n_features = 784, n_class = 10)\n", 552 | "adam_solver = Solver(sess = sess, model = dnn_model, optimizer = tf.train.AdamOptimizer)" 553 | ] 554 | }, 555 | { 556 | "cell_type": "code", 557 | "execution_count": 25, 558 | "metadata": {}, 559 | "outputs": [ 560 | { 561 | "name": "stdout", 562 | "output_type": "stream", 563 | "text": [ 564 | "step : 0, tr_loss : 2.309, val_loss : 2.290\n", 565 | "step :5000, tr_loss : 0.021, val_loss : 0.004\n", 566 | "step :10000, tr_loss : 0.003, val_loss : 0.024\n", 567 | "step :15000, tr_loss : 0.021, val_loss : 0.009\n", 568 | "step :20000, tr_loss : 0.027, val_loss : 0.000\n", 569 | "step :25000, tr_loss : 0.000, val_loss : 0.000\n", 570 | "step :30000, tr_loss : 0.002, val_loss : 0.001\n", 571 | "step :35000, tr_loss : 0.000, val_loss : 0.000\n", 572 | "step :40000, tr_loss : 0.000, val_loss : 0.000\n", 573 | "step :45000, tr_loss : 0.000, val_loss : 0.000\n", 574 | "step :50000, tr_loss : 0.001, val_loss : 0.000\n", 575 | "epoch : 0, tr_loss : 0.016, val_loss : 0.016\n", 576 | "step : 0, tr_loss : 0.000, val_loss : 0.000\n", 577 | "step :5000, tr_loss : 0.000, val_loss : 0.000\n", 578 | "step :10000, tr_loss : 0.000, val_loss : 0.023\n", 579 | "step :15000, tr_loss : 0.005, val_loss : 0.000\n", 580 | "step :20000, tr_loss : 0.000, val_loss : 0.028\n", 581 | "step :25000, tr_loss : 0.009, val_loss : 0.003\n", 582 | "step :30000, tr_loss : 0.002, val_loss : 0.000\n", 583 | "step :35000, tr_loss : 0.000, val_loss : 0.000\n", 584 | "step :40000, tr_loss : 0.000, val_loss : 0.000\n", 585 | "step :45000, tr_loss : 0.000, val_loss : 0.000\n", 586 | "step :50000, tr_loss : 0.000, val_loss : 0.000\n", 587 | "epoch : 1, tr_loss : 0.002, val_loss : 0.002\n" 588 | ] 589 | } 590 | ], 591 | "source": [ 592 | "sess.run(tf.global_variables_initializer())\n", 593 | "for epoch in range(epochs):\n", 594 | " total_batch = int(mnist.train.images.shape[0])\n", 595 | " avg_tr_loss = 0\n", 596 | " avg_val_loss = 0\n", 597 | " \n", 598 | " for step in range(total_batch):\n", 599 | " batch_xs, batch_ys = mnist.train.next_batch(batch_size = batch_size)\n", 600 | " val_xs, val_ys = mnist.train.next_batch(batch_size = batch_size)\n", 601 | " _, tr_loss = adam_solver.train(x_data = batch_xs, y_data = batch_ys, lr = 1e-3)\n", 602 | " val_loss = adam_solver.loss(x_data = val_xs, y_data = val_ys)\n", 603 | " \n", 604 | " avg_tr_loss += tr_loss / total_batch\n", 605 | " avg_val_loss += val_loss / total_batch\n", 606 | " \n", 607 | " if step % 5000 == 0:\n", 608 | " print('step :{:3}, tr_loss : {:.3f}, val_loss : {:.3f}'.format(step, tr_loss, val_loss))\n", 609 | " \n", 610 | " print('epoch :{:3}, tr_loss : {:.3f}, 
val_loss : {:.3f}'.format(epoch, avg_tr_loss, avg_val_loss))\n", 611 | " tr_loss_history.append(avg_tr_loss)\n", 612 | " val_loss_history.append(avg_val_loss)" 613 | ] 614 | }, 615 | { 616 | "cell_type": "code", 617 | "execution_count": 26, 618 | "metadata": {}, 619 | "outputs": [ 620 | { 621 | "data": { 622 | "text/plain": [ 623 | "" 624 | ] 625 | }, 626 | "execution_count": 26, 627 | "metadata": {}, 628 | "output_type": "execute_result" 629 | }, 630 | { 631 | "data": { 632 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYQAAAD8CAYAAAB3u9PLAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3XdcleX/x/HX5wxAHKiI5SpMrZw5\nkDTLNLMcpWamVJqZZu6Vlraz+pamOHLkyrTMhalkjixX5gIaJpqJI0VLceEe4PX7g/P1x5cYRxk3\nBz7Px8PH45z7vu77fC4Q3tzrusQYg1JKKWWzugCllFK5gwaCUkopQANBKaWUiwaCUkopQANBKaWU\niwaCUkopQANBKaWUiwaCUkopQANBKaWUi8PqAm5EiRIlTGBgoNVlKKWUR4mKijpujAnIqJ1HBUJg\nYCCRkZFWl6GUUh5FRP5yp52eMlJKKQVoICillHLRQFBKKQW4eQ1BRJoB4wA7MN0Y81GK9d7AbKAO\ncALoYIw5ICL+QBhQF/jcGNMn2TZewASgEXANeN0YsyjTPVJKeYSrV68SGxvLpUuXrC4lz/Dx8aFs\n2bI4nc6b2j7DQBAROzARaArEAhEiEm6M2ZmsWVfglDGmooiEACOADsAl4E2gmutfcq8Dx4wxd4qI\nDSh+Uz1QSnmk2NhYChcuTGBgICJidTkezxjDiRMniI2NpXz58je1D3dOGQUDMcaYfcaYK8A8oHWK\nNq2BWa7XYUATERFjzHljzEaSgiGlF4APXR25Zow5flM9UEp5pEuXLuHv769hkEVEBH9//0wdcbkT\nCGWAQ8nex7qWpdrGGJMAxAP+ae1QRIq6Xr4nIj+LyEIRucXtqpVSeYKGQdbK7NfTnUBI7RNSzrvp\nTpvkHEBZ4CdjTG1gMzAq1Q8X6S4ikSISGRcX50a5/7b964/ZvyX8prZVSqn8wp1AiAXKJXtfFjiS\nVhsRcQB+wMl09nkCuAAsdr1fCNROraExZqoxJsgYExQQkOGDdv+ScOUyBXd8SfmVndg+4WkuxuuZ\nKaUUnD59mkmTJt3wdi1atOD06dPZUJH13AmECKCSiJR33RkUAqT8czsc6Ox63Q5YY4xJ8wjBte4b\nku4wAmgC7EyrfWY4vLwpMXAja0o+R5W4lVwYW4c/1nyZHR+llPIgaQVCYmJiutstX76cokWLptvG\nU2UYCK5rAn2AVcAuYIExJlpEhotIK1ezGYC/iMQAg4Ch/91eRA4AocDzIhIrIlVcq14F3hGR7UAn\n4OUs6tO/+BUuzEO9PmHHY0s5IcW5e0NvdoS24kzcoYw3VkrlSUOHDmXv3r3UrFmTunXr0rhxY555\n5hmqV68OQJs2bahTpw5Vq1Zl6tSp17cLDAzk+PHjHDhwgMqVK/Piiy9StWpVHnnkES5evGhVd7KE\npPOHfK4TFBRkMjuW0cVLl9ny5Tvcd2gal8WLv4Jep3rLXqAXt5TKUbt27aJy5coAvPtNNDuPnMnS\n/VcpXYS3H6+a5voDBw7w2GOPsWPHDtatW0fLli3ZsWPH9Vs2T548SfHixbl48SJ169Zl/fr1+Pv7\nXx9T7dy5c1SsWJHIyEhq1qxJ+/btadWqFR07dszSftyo5F/X/xKRKGNMUEbb5rsnlQv4eNO424cc\neGoVB+23Uz3yNXZ93ITjsX9aXZpSykLBwcH/c//++PHjueeee6hXrx6HDh1iz549/9qmfPny1KxZ\nE4A6depw4MCBnCo3W3jUaKdZ6a5qdbh694+snzeSOnvGYZt+P79UHUjNtkMQe779sihlifT+ks8p\nBQsWvP563bp1fP/992zevBlfX18aNWqU6v393t7e11/b7XaPP2WU744QknM6HDzY8TWOP7eB3V7V\nqRX9EXtGPMDfMb9aXZpSKpsVLlyYs2fPprouPj6eYsWK4evryx9//MGWLVtyuDpr5OtA+K/ACndx\nz6ur2Vj9fUpePoj/F02InP0aiVevWF2aUiqb+Pv706BBA6pVq8aQIUP+Z12zZs1ISEigRo0avPnm\nm9SrV8+iKnNWvruonJG/Dx/k0Jw+BF9Yz377HUibCQRWb5Ctn6lUfpTaxU+VeXpROQuVKnMbdYcs\nZWvweAolnqRs2GNsm9aXyxfPWV2aUkplKw2EVIgI97bojL3PNiKKNiP48GziPq7Ln9tWWl2aUkpl\nGw2EdBQvcQv1B87l10afY7uWwJ3LO7BtQhfOn0lvVA6llPJMGghuqNnoCYq8HMmmgPYExS3mbGhd\nfl8XZnVZSimVpTQQ3FSosB/39Z7G7paLuGQrQPV1XYkc047TcX9bXZpSSmUJDYQbVDm4CbcO2cbm\nMl255/Qark0MJmr5DMy1a1aXppRSmaKBcBN8CvhS/8VQDrZbwQlHSepsG8Svo1py7PABq0tTSmWT\nQoUKAXDkyBHatWuXaptGjRqR0a3xY8eO5cKFC9ff56bhtDUQMqFC9Xsp/+pmtlYYQOXzEfhMrc+W\nsLFcS9SjBaXyqtKlSxMWdvPXEFMGQm4aTlsDIZMcTi/u7fQuJ55by2HvCtTb8TbRIxpzaG+2TO+g\nlMoir7766v/Mh/DOO+/w7rvv0qRJE2rXrk316tVZunTpv7Y7cOAA1apVA+DixYuEhIRQo0YNOnTo\n8D9jGfXs2ZOgoCCqVq3K22+/DSQNmHfkyBEaN25M48aNgf8fThsgNDSUatWqUa1aNcaOHXv983Jq\nmG0dxS2LlKlQndJD1xP59Rju3jEK2+xG/FSxD/eGDMPhdFpdnlK524qh8M/vWbvPW6tD84/SXB0S\nEsKAAQPo1asXAAsWLGDlypUMHDiQIkWKcPz4cerVq0erVq3SnKt48uTJ+Pr6sn37drZv307t2v8/\n8eMHH3xA8eLFSUxMpEmTJmzfvp1+/foRGhrK2rVrKVGixP/sKyoqipkzZ7J161aMMdx77708+OCD\nFCtWjD179jB37lymT
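The loop above runs `int(mnist.train.images.shape[0])` steps per epoch (55,000 mini-batches of 100) and draws its held-out batches from `mnist.train`; the Drop out notebook further down in this repo uses the more conventional setup, sketched here as an assumed correction rather than a cell from this notebook.

```python
# Assumed correction (mirrors the Drop out notebook later in this repo):
# one epoch makes num_examples / batch_size steps, and held-out batches
# come from the validation split instead of mnist.train.
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets(train_dir='./MNIST_data', one_hot=True, reshape=True, seed=777)
batch_size = 100

total_batch = int(mnist.train.num_examples / batch_size)          # 550 steps, not 55,000
for step in range(total_batch):
    batch_xs, batch_ys = mnist.train.next_batch(batch_size=batch_size)
    val_xs, val_ys = mnist.validation.next_batch(batch_size=batch_size)
    # feed batch_xs/batch_ys to Solver.train and val_xs/val_ys to Solver.loss
```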
ZtG+/btWbRoUbYMs61HCFlIbHaC2g3m4oubiClYkwZ7RxPzUQP2/L7N6tKU\nUinUqlWLY8eOceTIEX777TeKFStGqVKleO2116hRowYPP/wwhw8f5ujRo2nuY8OGDdd/MdeoUYMa\nNWpcX7dgwQJq165NrVq1iI6OZufO9M8abNy4kSeeeIKCBQtSqFAh2rZty48//gjk3DDbeoSQDQLK\n3EHAkFX8unwagRHD8Q1rxo9bX6Bux/fw8SlgdXlK5T7p/CWfndq1a0dYWBj//PMPISEhzJkzh7i4\nOKKionA6nQQGBqY67HVyqR097N+/n1GjRhEREUGxYsV4/vnnM9xPeuPK5dQw23qEkF1EqNmyO9Jn\nG9F+jXggdhqHR97Ljm1rrK5MKeUSEhLCvHnzCAsLo127dsTHx1OyZEmcTidr167lr7/+Snf7hg0b\nMmfOHAB27NjB9u3bAThz5gwFCxbEz8+Po0ePsmLFiuvbpDXsdsOGDVmyZAkXLlzg/PnzLF68mAce\neCALe5sxDYRs5leiNLUGfc3OB6dQxJyl8rdt+XFiD86ejbe6NKXyvapVq3L27FnKlClDqVKlePbZ\nZ4mMjCQoKIg5c+Zw9913p7t9z549OXfuHDVq1GDkyJEEBwcDcM8991CrVi2qVq3KCy+8QIMG/z9i\ncvfu3WnevPn1i8r/Vbt2bZ5//nmCg4O599576datG7Vq1cr6TqdDh7/OQRfOnGTX7AHUOb6UQ5Ti\nWKOR1GnUyuqylLKEDn+dPbJ9+GsRaSYiu0UkRkSGprLeW0Tmu9ZvFZFA13J/EVkrIudEZEIa+w4X\nkR3u1OHpfIsUp06f2cQ0n4vdBnXWdWLjmI6cOBFndWlKKZVxIIiIHZgINAeqAE+LSJUUzboCp4wx\nFYExwAjX8kvAm8DgNPbdFsh3Ew1UvLcFJYZEElWmI/VPLyPhk2A2Lf8y3YtKSimV3dw5QggGYowx\n+4wxV4B5QOsUbVoDs1yvw4AmIiLGmPPGmI0kBcP/EJFCwCDg/Zuu3oN5FShEnRcncvjJcC7ZC3Pf\ntt5s+fgJjhw+aHVpSuUY/SMoa2X26+lOIJQBDiV7H+talmobY0wCEA/4Z7Df94DRwIUM2uVpt9Vo\nSNmhEfxyRw+Czm/AZ+p9rA+bpMNfqDzPx8eHEydOaChkEWMMJ06cwMfH56b34c5zCKk9opfyO+hO\nm/9vLFITqGiMGfjf6w3ptO0OdAe47bbb0i3UU9md3tR6bgT/7OnA+YU9eXDHMCL+XEyJDhMoX+Eu\nq8tTKluULVuW2NhY4uL0GlpW8fHxoWzZsje9vTuBEAuUS/a+LHAkjTaxIuIA/ID0phWrD9QRkQOu\nGkqKyDpjTKOUDY0xU4GpkHSXkRv1eqxbK9XGvPoT27/+iGrR40iY/SA/VBrIAx1exsupzxCqvMXp\ndFK+fHmry1DJuHPKKAKoJCLlRcQLCAHCU7QJBzq7XrcD1ph0jgONMZONMaWNMYHA/cCfqYVBfiR2\nBzWeeoOL3X7kiO/dNIn5D7tGNGJn9K9Wl6aUyuMyDATXNYE+wCpgF7DAGBMtIsNF5L830c8A/EUk\nhqQLxddvTXUdBYQCz4tIbCp3KKlUFC97N3e9spadQe9TIWEv5Rc0ZfX0N7iQwePvSil1s/TBNA9w\nNu4gsV/0oPKZn9hpq8jl5uOpVbdBxhsqpRRZ/GCaslbhgNuoPPBb9jwwnlImjmrLHmf1xP7En8l3\nj3AopbKRBoKnEKFSk84UGBDJnwFNaRr3OcdD67Fp/UqrK1NK5REaCB7Gx68kVfvM58CjMyliu0i9\nNSF8P+YFjp04YXVpSikPp4HgoQLrt6Xo4Ch2lmnHw/GLuPxJfdauWKgP+SilbpoGggdz+halWvfp\nHGkThsNup/HWbqz9OISDR1I+JqKUUhnTQMgDStdsyi2vRBFdvgsPnl+Fz5T7+G7RDBJ0+Aul1A3Q\nQMgjbN6+VO08llPPrOSKV1Ee+X0Qm0a05s99+6wuTSnlITQQ8pgSd9WjzKtb+aNKP+pd2UTArAdY\nPmcsl68mWF2aUiqX00DIg8Thzd3t3+Nil3XE+95Giz1v88tHj/BbdL6Yh0gpdZM0EPIwv9urEzhk\nIzG13+CexB3cseBhlk1/j3OXrlhdmlIqF9JAyOtsdiq2GoLpuZljRarxWOwoYkY8yJaIrVZXppTK\nZTQQ8gnfWypQYdBqDjQYSUXzFzWXtSR84iucPJuv5ydSSiWjgZCfiBDY9CWc/bcR69+AVnFTODq6\nAWvX/6APtCmlNBDyI+9iZanYdwmHm35KKTnJ/WueYtnY3hw5ftrq0pRSFtJAyK9EKNPgaQq//DP7\nSzXn8fg5XPzkPlasCOfaNT1aUCo/0kDI5+yF/LmzxxziWs3Bz3GVR7c8x4pRz7Pv8FGrS1NK5TAN\nBAVAQO3H8B8Sxd7AEFpeWIJzagOWLvqSqzr8hVL5hgaCuk58ilCpy6ec6rAULy9vWv/em7UjniJ6\n719Wl6aUygEaCOpfilVuxC2vRLH3rpd46MoaAmY3ZNGXk7l4JdHq0pRS2UgDQaXO6UOFp0dysfNq\nEgoE8GTMULaOeIyIHbusrkwplU3cCgQRaSYiu0UkRkSGprLeW0Tmu9ZvFZFA13J/EVkrIudEZEKy\n9r4i8q2I/CEi0SLyUVZ1SGWtwuWDKD1kM3/VHEz9xAgqLWzCghkjib+gw18olddkGAgiYgcmAs2B\nKsDTIlIlRbOuwCljTEVgDDDCtfwS8CYwOJVdjzLG3A3UAhqISPOb64LKdnYnt7d5E9P9R84WvoP2\nhz5g58dNWb8tyurKlFJZyJ0jhGAgxhizzxhzBZgHtE7RpjUwy/U6DGgiImKMOW+M2UhSMFxnjLlg\njFnren0F+Bkom4l+qBzgU7oy5QZt4HD9d6lp/iDo2+YsmPgmcWcuWl2aUioLuBMIZYBDyd7Hupal\n2sYYkwDEA/7uFCAiRYHHgR/SWN9dRCJFJDIuLs6dXarsZLNR5tEBOPpu5UTxWrSPG09s6IOsWLdB\nh79QysO5EwiSyrKUP/nutPn3jkUcwFxgvDEm1am9jDFTjTFBxpiggICADItVOcPpH8ht/VZy9KEx\nVJTDPLS2LQvHDuRQXLzVpSmlbpI7gRALlEv2viyQchb3621cv+T9gJNu7HsqsMcYM9aNtiq3EeGW\nhi9QcGAU/9zaiPbxMzk7oSFLlq8gUYe/UMrjuBMIEUAlESkvIl5ACBCeok040Nn1uh2wxmRw/kBE\n3icpOAbcWMkqt7EVuZXbe4Zx4rEZlLHH89jWZ1j8cXf2HNZTfEp5kgwDwXVNoA+wCtgFLDDGRIvI\ncBFp5Wo2A/AXkRhgEHD91lQROQCEAs+LSKyIVBGRssDrJN219LOI/Coi3bKyYyrn+Qe1o8jgXzh8\ne2vaXVyAfeoDzF+0gMsJ+kCbUp5APOlCYFBQkImMjLS6DOWGMztWcnVJP/wTjr
LU2YLbQ0ZSs0K5\njDdUSmU5EYkyxgRl1E6fVFbZoki1ZvgP+ZlDdz7H41dXEDC7EV9+OYPzlxOsLk0plQYNBJV9vAtR\n7plPuNRpOV4+BekYM4gNI55k0+97rK5MKZUKDQSV7Xwr3EfA4G0cqdGbptd+pFLYQ8yaPo7TOvyF\nUrmKBoLKGU4fSrf9D4ld15BYqDSdY9/i549bsnrrr/pAm1K5hAaCylHe5Wpy66CfOBo8jPvNLwQv\nb87nEz/gn9M6/IVSVtNAUDnP7uCWFkOx9drEhaJ30eX4x+wf05SlazfpfM5KWUgDQVnGUfJOSvVf\nw4lGH1LTFkPTdW2YNXYo+4+dsbo0pfIlDQRlLZsN/0a98Om/jdMl69LlzKecmvgQ85avJkHnc1Yq\nR2kgqFxBit5G6V7LiG82kTvt//DE1hC+GtWP6NjjVpemVL6hgaByDxH86nWk4KCfOVGuKc9d/ALb\n1MbMClvMpas6/IVS2U0DQeU6UqgkpbvN4/wTsyjtdZ6Ov3dh8chuROxJOciuUioraSCoXKvgPW3w\ne/lnjlV8iqevfo3/F42Z9sUXnL101erSlMqTNBBU7lagKKU6TePS019TzMfGi3v78N3IZ1m3fa/V\nlSmV52ggKI/gc1cTig2O5FjVrjxx7TsqLWrK5GmTOX7ustWlKZVnaCAoz+FVkJJPhZL4/Cp8CvrR\n8/BQtox6kmWbf9fhL5TKAhoIyuM4A+/Ff9AWTtQZQDM2UW9lCyZN+JjYk+etLk0pj6aBoDyTwxv/\nx99FXlqP8StL7xMf8Oe4Vsz/YasOf6HUTdJAUB7NXqo6Af1/5PT9b3G/bTvNN7Th0zFvE3NUh79Q\n6kZpICjPZ3dQ9OGXcfbZwuUSVel1dhxxE5vx+bK1XEnQ4S+UcpcGgsozxL8CAb2/42zTUdR07KdD\nRHtmjhrMb3+dsLo0pTyCW4EgIs1EZLeIxIjI0FTWe4vIfNf6rSIS6FruLyJrReSciExIsU0dEfnd\ntc14EZGs6JDK52w2Cjd4kQL9IzlX+j5eujSDazMeYcrCb7hwRedzVio9GQaCiNiBiUBzoArwtIhU\nSdGsK3DKGFMRGAOMcC2/BLwJDE5l15OB7kAl179mN9MBpVLlV4aA7ku40GoKdzqP02VHZ+Z+3JtN\nu3X4C6XS4s4RQjAQY4zZZ4y5AswDWqdo0xqY5XodBjQRETHGnDfGbCQpGK4TkVJAEWPMZpN0A/ls\noE1mOqLUv4jgWzuEgoOiOHNHS7penUfxOY8wfvY84i/o8BdKpeROIJQBDiV7H+talmobY0wCEA/4\nZ7DP2Az2CYCIdBeRSBGJjIuLc6NcpVIoWIISnb/gylNfUdr7Er339uCbUS/w3a/7rK5MqVzFnUBI\n7dx+yhu93WlzU+2NMVONMUHGmKCAgIB0dqlU+ryqtqTIy1HEV36GjtfCufPrZoROncGxM5cy3lip\nfMCdQIgFyiV7XxZIeSL2ehsRcQB+wMkM9lk2g30qlfV8/CgeMomETuEU83Uy6Mgg1oc+y9ebonX4\nC5XvuRMIEUAlESkvIl5ACBCeok040Nn1uh2wxqTz02WM+Rs4KyL1XHcXPQcsveHqlbpJjgoP4jco\ngtM1e/Aka6i/6jFGfzKOv07o8Bcq/8owEFzXBPoAq4BdwAJjTLSIDBeRVq5mMwB/EYkBBgHXb00V\nkQNAKPC8iMQmu0OpJzAdiAH2AiuypktKucnLl6JtRkDX1RQoXJzBJ9/m93Ht+OL7SJ3PWeVL4kmH\nyUFBQSYyMtLqMlRelHCFsz+MpMDmMZwxPnxWuCctn+lL5dJ+VlemVKaJSJQxJiijdvqkslIADi8K\nP/oG9p4/IsXvYPC5j/n70zZM+WYDlxN0PmeVP2ggKJWM3FKFYn3XcaHxe9xvj+aZyPZM+vh1Ivcf\nt7o0pbKdBoJSKdns+D7YD6++W0m4tSYDL0/m6szHGbdgJecu6/AXKu/SQFAqLcXLU6zHCi43H0st\nx1+8FN2RmSMHsm6X3iGt8iYNBKXSI4L3vV3wGRDJpdsepG/ibIrNbcnIWYs4ef6K1dUplaU0EJRy\nR5HSFH0hjKttZ1DJ+xQD971I2KhefBO1Xx9oU3mGBoJS7hLBWaMdvgN/5sKdreluFnLX0pb8Z8ps\njpy+aHV1SmWaBoJSN8q3OH7PziTx6QWULpDAsL/7s3pMV+Zu3KXzOSuPpoGg1E2y3/UohQZGcr7G\nc3SWb2nw3eO8/8lk9sads7o0pW6KBoJSmeFThMJPjsc8/y3FCvvy1qlh/PxJR6at/oWrOvyF8jAa\nCEplAQm8n8IDtnK+bh/a2tbTamMbPgwdze+x8VaXppTbNBCUyirOAhRs+QH27msoUPRW3jr/Pgen\ntGfc0o1cvKLDX6jcTwNBqaxWuhZF+m3k0gOv8agjiud+7kDo6OFsjtHhL1TupoGgVHawO/Fp8iqO\nXj9hD6jE65fHcmlWWz6at5r4izqfs8qdNBCUyk4Bd1Gk1w9cafohDZy76bOrI1NGDeO7HTr8hcp9\nNBCUym42O14NeuHVdyumTBCvJE7Db8ETvDtzKXFnL1tdnVLXaSAolVOKBVL4xWUkPD6Be7wOM/RA\nV+aMHsiiiAM6/IXKFTQQlMpJIjjqdMKnfyRX73iYAczhzm9a88anczl08oLV1al8TgNBKSsUvpVC\nnedx7alZVPQ5yzv/9OHbsb2Ytf4PEnX4C2URDQSlLGSr2oYCAyK5UuVJetgW0+CHJ3ht/HT+PHrW\n6tJUPuRWIIhIMxHZLSIxIjI0lfXeIjLftX6riAQmWzfMtXy3iDyabPlAEYkWkR0iMldEfLKiQ0p5\nHN/iFOwwDfPsIkoXNHx4egibJ3Rj4qpfuZKgw1+onJNhIIiIHZgINAeqAE+LSJUUzboCp4wxFYEx\nwAjXtlWAEKAq0AyYJCJ2ESkD9AOCjDHVALurnVL5llR6GN8BEVyu9QKd7KtovelJ3g4dz88HT1ld\nmson3DlCCAZijDH7jDFXgHlA6xRtWgOzXK/DgCYiIq7l84wxl40x+4EY1/4AHEABEXEAvoDemK2U\nd2EKtA7F1mUFxYoU5sMLb7N32nOMWLyZ8zqfs8pm7gRCGeBQsvexrmWptjHGJADxgH9a2xpjDgOj\ngIPA30C8Mea71D5cRLqLSKSIRMbFxblRrlJ5wO31KdhvC5frD6Ct/Se6/BrCB6NHsOFP/RlQ2ced\nQJBUlqW8DSKtNqkuF5FiJB09lAdKAwVFpGNqH26MmWqMCTLGBAUEBLhRrlJ5hNMH70ffxf7SWgr6\nl+Y/V0Zy7otneOerNZy+oPM5q6znTiDEAuWSvS/Lv0/vXG/jOgXkB5xMZ9uHgf3GmDhjzFXga+C+\nm+mAUnleqXso2HsDVxu9ySOOXxmwuyNjRr3Lt78d0QfaVJZyJxAigEoiUl5EvEi6+Bueok040Nn1\nuh2wxiT9Tw0HQlx3IZUHKgHbSDpVV
E9EfF3XGpoAuzLfHaXyKLsTZ6PBOHr9hNetVXj32gQKh7Vn\n2Gff8k/8JaurU3lEhoHguibQB1hF0i/tBcaYaBEZLiKtXM1mAP4iEgMMAoa6to0GFgA7gZVAb2NM\nojFmK0kXn38GfnfVMTVLe6ZUXhRwJ74vfUdis5HU89rLmwdfYGboUL7ackDnc1aZJp50yBkUFGQi\nIyOtLkOp3OH0QS5+3ZcCB9cRce1O5t4yhH4dWhJYoqDVlalcRkSijDFBGbXTJ5WV8lRFb6NAlyWY\nNpOp4f0PH8X15OtxA5m69g8SdD5ndRM0EJTyZCJIzWfw7h/FtUrNGWSfT4O17Rk0fjbRR3Q+Z3Vj\nNBCUygsKlcTn2S8x7WdTwfcCofED2TC5D6HLf+XSVZ3PWblHA0GpPESqtManfwSJ1UPoaQ+n9ZYQ\nXgmdwrb9J60uTXkADQSl8poCxfB+cjJ0WkKZwnbGXxzGrhndGb5oK2cv6XzOKm0aCErlVRUa49Nv\nK1frvsRzju/puv0Z3ho9lh92HbW6MpVLaSAolZd5F8LZciTS9TuKFyvKmKvvc/qrrgz9ch0nzul8\nzup/aSAolR+UC6ZAn00k3j+YNo5NDN7TiY9Gf8Tinw/p8BfqOg0EpfILhzf2h9/E/tJ6CpUM5GMT\nSoHFXRgwfSWHT1+0ujqVC2ggKJXf3Fodnx5rufbwcB52bue92BeYHPoOs37ar8Nf5HMaCErlR3YH\ntvv74+i1Ca8yNXjf9il3rOxI70mLiTmm8znnVxoISuVnJSri020FpmUo9bz3E3q8B/M+eY0J3/+h\n8znnQxoISuV3NhtStyvOvtuwlb+fN+yzqb+hI33GfcVvh05bXZ3KQRoISqkkfmXxfm4RtJ1GdZ84\nJpztz/dTBvPhN9u5eEWHv8gPNBCUUv9PBGq0x6tfJFL5MV52LKRNxLP0G/0ZP8Uct7o6lc00EJRS\n/1YoAGeHWRDyFRUKXubTy6/w++f9eW3BNuIv6PAXeZUGglIqbXe3xKvfNkzNjvRwLKN7dCdeGT2J\nlTv+troylQ00EJRS6StQFEebT+C5cEoX8WJK4lscn9ebAbM2cOyMzuecl2ggKKXcc8eDePXdQmK9\n3jzjWMur+zrzTugYFkTo8Bd5hQaCUsp9XgWxN/sPtm6r8S9egkl8hDP8JXpM/Y6DJy5YXZ3KJLcC\nQUSaichuEYkRkaGprPcWkfmAVQs3AAAQiklEQVSu9VtFJDDZumGu5btF5NFky4uKSJiI/CEiu0Sk\nflZ0SCmVA8oG4dV7I6bhq7RybOU/R7oxZtyHTN+wl0Qd/sJjZRgIImIHJgLNgSrA0yJSJUWzrsAp\nY0xFYAwwwrVtFSAEqAo0Aya59gcwDlhpjLkbuAfYlfnuKKVyjMMbeeg17D02UPjWOxhjG8/tq1+k\n2ydL+eOfM1ZXp26CO0cIwUCMMWafMeYKMA9onaJNa2CW63UY0ERExLV8njHmsjFmPxADBItIEaAh\nMAPAGHPFGKOPRCrliW6pitdLazBN3+Mhr2g+OdWTLya8y+hVf3A5QR9o8yTuBEIZ4FCy97GuZam2\nMcYkAPGAfzrb3gHEATNF5BcRmS4iBVP7cBHpLiKRIhIZFxfnRrlKqRxnsyMN+mHvtQnvcrX4wDGd\n+hu70G3sQqL+0vmcPYU7gSCpLEt5kjCtNmktdwC1gcnGmFrAeeBf1yYAjDFTjTFBxpiggIAAN8pV\nSlnGvwLOLsvgsbEE+xxk2rm+rJj2Bu8s+Y1zlxOsrk5lwJ1AiAXKJXtfFjiSVhsRcQB+wMl0to0F\nYo0xW13Lw0gKCKWUp7PZIKgLjj7bcFRszBuOObT5uQs9Rn/B2t3HrK5OpcOdQIgAKolIeRHxIuki\ncXiKNuFAZ9frdsAak3RjcjgQ4roLqTxQCdhmjPkHOCQid7m2aQLszGRflFK5iV8ZHM/OhydnULXA\nKWZeGcyvs4cyeF4EJ89fsbo6lQpHRg2MMQki0gdYBdiBz4wx0SIyHIg0xoSTdHH4CxGJIenIIMS1\nbbSILCDpl30C0NsY89+rTH2BOa6Q2Qd0yeK+KaWsJgLV2+G8ozGJy19hYHQYu3duY8CfvXiyVWta\n3VOapPtPVG4gnvSEYVBQkImMjLS6DKXUzdq9kqvh/bGdP8aMhOb8ckdP3noyiFJ+BayuLE8TkShj\nTFBG7fRJZaVUzrmrGc6+25A6nenu+JZhf73Aa6GT+GLLXzqfcy6ggaCUylk+ftgeHwudl1GqqC8z\nZTj2ZQPo8ukP7Is7Z3V1+ZoGglLKGuUfwNl7M+a+foQ41jHyWDdGjB/LxLUxXE3U+ZytoIGglLKO\nly/yyHvYXvwB/xK3MsX+MeXW9KHT+OXsOBxvdXX5jgaCUsp6ZWrj6LEBGr/OY45IPo3vwfTJI/lw\n+U4uXdXhL3KKBoJSKndweMGDr2Dr+SOFSt3JWMcEgjf3olPoIjbvPWF1dfmCBoJSKncpWRnHi6vh\n0Q9p5PUHn1/sxzeffcBri37jzCWdzzk7aSAopXIfmx3q98LeezM+gcH8xzmDVr+9xAuj57J651Gr\nq8uzNBCUUrlX8fLYOy+FVp9Q1+cwc66+TMSct+k7J4K4s5etri7P0UBQSuVuIlD7Oex9tuG882Fe\nc86l++7u9Bw9m7CoWJ3POQtpICilPEORUtie/gqe+pzKBc8wj1c5vPgNXpixkUMndT7nrKCBoJTy\nHCJQ9QkcfSOw12hHf8diXj/UgyFjp/PZxv06n3MmaSAopTyPb3Gk7VR4NozAIoavbG9hVg7jmUlr\n+PPoWaur81gaCEopz1WpKY7eW5CgrnR1rGD08R68/8lkxn7/J1cSdPiLG6WBoJTybD5FkMdGQ5cV\nlCpWiNmOD7h13RA6jF/BLwdPWV2dR9FAUErlDbffh73XJrh/IO2dPzL1TG8mTxnH8G92cuGKzufs\nDg0EpVTe4SwAD7+D7cUfKF6yDFOdY6i9bQAdQsP5cU+c1dXlehoISqm8p3Qt7C+tg4fepIXzF768\n1I+vZ47m5fm/cvqCzuecFg0EpVTeZHdCw8HYem6kcJm7GeM1mcej+/Ps6DC+3f63PtCWCrcCQUSa\nichuEYkRkaGprPcWkfmu9VtFJDDZumGu5btF5NEU29lF5BcRWZbZjiilVKoC7sLWdRU0H0lDrz9Z\nmDiIzfNH8NLsCI6euWR1dblKhoEgInZgItAcqAI8LSJVUjTrCpwyxlQExgAjXNtWAUKAqkAzYJJr\nf//VH9iV2U4opVS6bHa49yVsvbfgc0c93nfOpPu+PnQLncvcbQf1aMHFnSOEYCDGGLPPGHMFmAe0\nTtGmNTDL9ToMaCIi4lo+zxhz2RizH4hx7Q8RKQu0BKZnvhtKKeWGYrdj67QYWk+ils8/LGIIB5d+\nwLNTf+LA8fNWV2c5dwKhDHAo2ftY17JU2xhjEoB4wD+DbccCrwD69IhSKueIQK1nsfeJwFm5Ga
86\n5/HG333oP3Y2n67fS0I+ns/ZnUCQVJalPL5Kq02qy0XkMeCYMSYqww8X6S4ikSISGRent40ppbJI\n4VuQDl9C+9nc5XuOrx2vkbD6XZ6auJboI/lzPmd3AiEWKJfsfVngSFptRMQB+AEn09m2AdBKRA6Q\ndArqIRH5MrUPN8ZMNcYEGWOCAgIC3ChXKaVuQJXW2Ptsw3ZPB/o4lhJ6si/vTJzJyJV/5Lv5nN0J\nhAigkoiUFxEvki4Sh6doEw50dr1uB6wxSVdpwoEQ111I5YFKwDZjzDBjTFljTKBrf2uMMR2zoD9K\nKXXjfIsjT3wKHRdxexEb853vELDxLdqO/Y5t+09aXV2OyTAQXNcE+gCrSLojaIExJlpEhotIK1ez\nGYC/iMQAg4Chrm2jgQXATmAl0NsYk78iVynlOSo+jK33FmzB3Xne8R0zLvTlk2lTeGPJ75zNB/M5\niyfdbhUUFGQiIyOtLkMplR8c3MK1Jb2xnYxhYWJDphfoyqtt6/PQ3bdYXdkNE5EoY0xQRu30SWWl\nlErNbfWw9fwJHniZdo6fmHu1PwtmT6Lf3F84cS5vzuesgaCUUmlx+kCTt5Du6yh6y2186jWWFrte\nocPoJSz55XCee6BNA0EppTJSqga2F9fAw+/wiPM3FjOIjWHj6DJzG4dPX7S6uiyjgaCUUu6wO+H+\ngdh6/kShstUZ5ZxCt78G83zoQmZvPsC1PDCfswaCUkrdiBKVkC7LocUo7vPayzf2IexdFkqHT38i\n5tg5q6vLFA0EpZS6UTYbBL+IrfcWvCvcz7vOWbx+bBB9x83jkx/2cNVDh7/QQFBKqZtV9Dbk2TB4\nYgo1fI4R7jWUi2tG0mb8OrbHnra6uhumgaCUUpkhAveEYOuzDWfllrziXEDomUG8PulLPvh2Jxev\neM6zuBoISimVFQqVhPazoMOXVPI9zxKvtyi2+UMeH7OaTTHHra7OLRoISimVlSo/jq3PNuy1nqGX\nI5zPLg1kzIxZvBq2nfgLuXv4Cw0EpZTKagWKQesJ0GkJZYs4WOg9nGq/DadV6EpW7vjb6urSpIGg\nlFLZpUJjbL23QL1edLR/z8LEgcz76jN6fhnFsbO5bz5nDQSllMpOXgWh2YdI1+8I8C/O514jefTP\nt3ly9DIWRBzKVcNfaCAopVROKBeM9PgRGr5Ca8dmvrG9zLrFU+k4fQsHT1ywujpAA0EppXKOwxse\neh3pvh6/WwKZ5DWeF2LfpOPYxUz/cR+JFg9/oYGglFI57dZqSLcfoOlwHnL8zkrHEP5cOYm2Ezfy\nxz9nLCtLA0Eppaxgd0CD/kivTRS47R5GOqfx2olh9By/iNDvdnM5IecfaNNAUEopK/lXQDp/Cy1D\nCfbaz0rvoZxb/wmPj1tP1F85O5+zBoJSSlnNZoO6XZHeW/Gu2JC3nF8Qeu5Vhk5ZyDvh0Zy/nJAz\nZeTIpyillMqYX1l4ZgG0nU5Vn+Os8H6dIttCaRH6A3/HZ/9EPG4Fgog0E5HdIhIjIkNTWe8tIvNd\n67eKSGCydcNcy3eLyKOuZeVEZK2I7BKRaBHpn1UdUkopjyYCNZ5C+kTgqNqKQY4w5sswbpVT2f7R\nGQaCiNiBiUBzoArwtIhUSdGsK3DKGFMRGAOMcG1bBQgBqgLNgEmu/SUALxtjKgP1gN6p7FMppfKv\ngiWg3WcQMpdbAysjhW7J9o905wghGIgxxuwzxlwB5gGtU7RpDcxyvQ4DmoiIuJbPM8ZcNsbsB2KA\nYGPM38aYnwGMMWeBXUCZzHdHKaXymLtbQMgcsNmz/aPcCYQywKFk72P59y/v622MMQlAPODvzrau\n00u1gK3ul62UUiqruRMIksqylI/TpdUm3W1FpBCwCBhgjEn1aQwR6S4ikSISGRcX50a5SimlboY7\ngRALlEv2vixwJK02IuIA/ICT6W0rIk6SwmCOMebrtD7cGDPVGBNkjAkKCAhwo1yllFI3w51AiAAq\niUh5EfEi6SJxeIo24UBn1+t2wBqTNIRfOBDiugupPFAJ2Oa6vjAD2GWMCc2KjiillMocR0YNjDEJ\nItIHWAXYgc+MMdEiMhyINMaEk/TL/QsRiSHpyCDEtW20iCwAdpJ0Z1FvY0yiiNwPdAJ+F5FfXR/1\nmjFmeVZ3UCmllHskN43FnZGgoCATGRlpdRlKKeVRRCTKGBOUUTt9UlkppRSggaCUUsrFo04ZiUgc\n8NdNbl4COJ6F5XgC7XP+kN/6nN/6C5nv8+3GmAxv0/SoQMgMEYl05xxaXqJ9zh/yW5/zW38h5/qs\np4yUUkoBGghKKaVc8lMgTLW6AAton/OH/Nbn/NZfyKE+55trCEoppdKXn44QlFJKpSPPBUJmZnfz\nRG70d5CI7BSR7SLyg4jcbkWdWSmjPidr105EjIh4/B0p7vRZRNq7vtfRIvJVTteY1dz4v32ba+bF\nX1z/v1tYUWdWEZHPROSYiOxIY72IyHjX12O7iNTO8iKMMXnmH0ljLe0F7gC8gN+AKina9AI+db0O\nAeZbXXc297cx4Ot63dOT++tun13tCgMbgC1AkNV158D3uRLwC1DM9b6k1XXnQJ+nAj1dr6sAB6yu\nO5N9bgjUBnaksb4FsIKkaQXqAVuzuoa8doSQmdndPFGG/TXGrDXGXHC93ULSEOSezJ3vMcB7wEjg\nUk4Wl03c6fOLwERjzCkAY8yxHK4xq7nTZwMUcb3249/D8nsUY8wGkgYHTUtrYLZJsgUoKiKlsrKG\nvBYImZndzRO509/kupL0F4Ync2cWvlpAOWPMspwsLBu5832+E7hTRH4SkS0i0izHqsse7vT5HaCj\niMQCy4G+OVOaZW705/2GZTj8tYfJzOxunsjtvohIRyAIeDBbK8p+Gc3CZwPGAM/nVEE5wJ3vs4Ok\n00aNSDoK/FFEqhljTmdzbdnFnT4/DXxujBktIvVJGoK/mjHmWvaXZ4ls/92V144QMjO7mydyp7+I\nyMPA60ArY8zlHKotu2TU58JANWCdiBwg6VxruIdfWHb3//VSY8xVY8x+YDdJAeGp3OlzV2ABgDFm\nM+BD0pg/eZVbP++ZkdcCITOzu3miDPvrOn0yhaQw8PTzypBBn40x8caYEsaYQGNMIEnXTVoZYzx5\nIg13/l8vIekGAkSkBEmnkPblaJVZy50+HwSaAIhIZZICIS9PvB4OPOe626geEG+M+TsrPyBPnTIy\nmZjdzRO52d+PgULAQte184PGmFaWFZ1JbvY5T3Gzz6uAR0RkJ5AIDDHGnLCu6sxxs88vA9NEZCBJ\np06e9+A/7hCRuSSd8ivhui7yNuAEMMZ8StJ1khZADHAB6JLlNXjw108ppVQWymunjJRSSt0kDQSl\nlFKABoJSSikXDQSllFKABoJSSikXDQSllFKABoJSSikXDQSllFIA/B9VjodDiLpmhgAAAABJRU5E\nrkJggg==\n", 633 | "text/plain": [ 634 | 
"" 635 | ] 636 | }, 637 | "metadata": {}, 638 | "output_type": "display_data" 639 | } 640 | ], 641 | "source": [ 642 | "plt.plot(tr_loss_history, label = 'train')\n", 643 | "plt.plot(val_loss_history, label = 'validation')\n", 644 | "plt.legend()" 645 | ] 646 | }, 647 | { 648 | "cell_type": "code", 649 | "execution_count": 27, 650 | "metadata": {}, 651 | "outputs": [ 652 | { 653 | "name": "stdout", 654 | "output_type": "stream", 655 | "text": [ 656 | "test accuracy : 98.27%\n" 657 | ] 658 | } 659 | ], 660 | "source": [ 661 | "acc = np.mean(np.argmax(mnist.test.labels, axis = 1) == dnn_model.predict(x_data = mnist.test.images))\n", 662 | "print('test accuracy : {:.2%}'.format(acc))" 663 | ] 664 | } 665 | ], 666 | "metadata": { 667 | "kernelspec": { 668 | "display_name": "Python 3", 669 | "language": "python", 670 | "name": "python3" 671 | }, 672 | "language_info": { 673 | "codemirror_mode": { 674 | "name": "ipython", 675 | "version": 3 676 | }, 677 | "file_extension": ".py", 678 | "mimetype": "text/x-python", 679 | "name": "python", 680 | "nbconvert_exporter": "python", 681 | "pygments_lexer": "ipython3", 682 | "version": "3.6.3" 683 | }, 684 | "varInspector": { 685 | "cols": { 686 | "lenName": 16, 687 | "lenType": 16, 688 | "lenVar": 40 689 | }, 690 | "kernels_config": { 691 | "python": { 692 | "delete_cmd_postfix": "", 693 | "delete_cmd_prefix": "del ", 694 | "library": "var_list.py", 695 | "varRefreshCmd": "print(var_dic_list())" 696 | }, 697 | "r": { 698 | "delete_cmd_postfix": ") ", 699 | "delete_cmd_prefix": "rm(", 700 | "library": "var_list.r", 701 | "varRefreshCmd": "cat(var_dic_list()) " 702 | } 703 | }, 704 | "types_to_exclude": [ 705 | "module", 706 | "function", 707 | "builtin_function_or_method", 708 | "instance", 709 | "_Feature" 710 | ], 711 | "window_display": false 712 | } 713 | }, 714 | "nbformat": 4, 715 | "nbformat_minor": 2 716 | } 717 | -------------------------------------------------------------------------------- /Tutorial of implementing Drop out.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Tutorial of implementing Drop out\n", 8 | "mnist image를 분류하는 Convolution Neural Network에 Drop out을 적용하는 간단한 example\n", 9 | "\n", 10 | "Drop out paper : http://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf" 11 | ] 12 | }, 13 | { 14 | "cell_type": "markdown", 15 | "metadata": {}, 16 | "source": [ 17 | "### Setup" 18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": 1, 23 | "metadata": {}, 24 | "outputs": [ 25 | { 26 | "name": "stdout", 27 | "output_type": "stream", 28 | "text": [ 29 | "Extracting ./MNIST_data\\train-images-idx3-ubyte.gz\n", 30 | "Extracting ./MNIST_data\\train-labels-idx1-ubyte.gz\n", 31 | "Extracting ./MNIST_data\\t10k-images-idx3-ubyte.gz\n", 32 | "Extracting ./MNIST_data\\t10k-labels-idx1-ubyte.gz\n" 33 | ] 34 | } 35 | ], 36 | "source": [ 37 | "import os, sys\n", 38 | "import shutil \n", 39 | "import tensorflow as tf\n", 40 | "import numpy as np\n", 41 | "import matplotlib.pylab as plt\n", 42 | "%matplotlib inline\n", 43 | "from tensorflow.examples.tutorials.mnist import input_data # load mnist dataset\n", 44 | "mnist = input_data.read_data_sets(train_dir = './MNIST_data', one_hot = True, reshape = True, seed = 777)" 45 | ] 46 | }, 47 | { 48 | "cell_type": "markdown", 49 | "metadata": {}, 50 | "source": [ 51 | "### Define MnistCNN class\n", 52 | "conv-conv-max pool-conv-conv-max pool-fc-fc" 
53 | ] 54 | }, 55 | { 56 | "cell_type": "code", 57 | "execution_count": 2, 58 | "metadata": {}, 59 | "outputs": [], 60 | "source": [ 61 | "'''\n", 62 | "Drop out의 구현을 위해서는 tf.nn module에 있는 tf.nn.dropout을 활용한다. tf.nn.dropout의 wrapper로 \n", 63 | "tf.layers.dropout이 있지만 해당 api는 default rate가 정해져있고, dropout을 걸지 안 걸지를 결정하기위해서\n", 64 | "training argument가 존재하는데, 두 가지를 다 조절하려면 두 개의 tf.placeholder가 필요하기 때문에 개인 선호에 의해서\n", 65 | "tf.nn.dropout을 활용한다.(tf.nn.dropout은 keep_prob argument 하나만 존재)\n", 66 | "일반적인 Convolution Neural Network에 적용할 때는, 마지막 output layer (label의 개수와 같은 fc layer) 전의 fc layer에\n", 67 | "Drop out을 적용한다.\n", 68 | "(Convolution Neural Network의 Filter에 적용하는 논문도 있다. 이는 drop connect 논문과 비슷한 발상에서 나온 것 같다)\n", 69 | "'''\n", 70 | "class MnistCNN:\n", 71 | " def __init__(self, activation_fn = tf.nn.relu, initializer = tf.contrib.layers.variance_scaling_initializer(),\n", 72 | " l2_scale = 0.01):\n", 73 | " \n", 74 | " with tf.variable_scope('input_layer'):\n", 75 | " self._x = tf.placeholder(dtype = tf.float32, shape = [None,784])\n", 76 | " self._ximg = tf.reshape(tensor = self._x, shape = [-1,28,28,1])\n", 77 | " self._y = tf.placeholder(dtype = tf.float32, shape = [None,10])\n", 78 | " self._keep_prob = tf.placeholder(dtype = tf.float32)\n", 79 | " \n", 80 | " with tf.variable_scope('conv_layer1'):\n", 81 | " _conv_pre = tf.layers.conv2d(inputs = self._ximg, filters = 64, kernel_size = [3,3],\n", 82 | " kernel_initializer = initializer,\n", 83 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale),\n", 84 | " padding = 'same')\n", 85 | " _conv_relu = activation_fn(_conv_pre)\n", 86 | " \n", 87 | " with tf.variable_scope('conv_layer2'):\n", 88 | " _conv_pre = tf.layers.conv2d(inputs = _conv_relu, filters = 64, kernel_size = [3,3],\n", 89 | " kernel_initializer = initializer,\n", 90 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale),\n", 91 | " padding = 'same')\n", 92 | " _conv_relu = activation_fn(_conv_pre)\n", 93 | " \n", 94 | " with tf.variable_scope('max_pool1'):\n", 95 | " _pooled = tf.layers.max_pooling2d(inputs = _conv_relu, pool_size = [2,2], strides = 2)\n", 96 | " \n", 97 | " with tf.variable_scope('conv_layer3'):\n", 98 | " _conv_pre = tf.layers.conv2d(inputs = _pooled, filters = 128, kernel_size = [3,3],\n", 99 | " kernel_initializer = initializer,\n", 100 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale),\n", 101 | " padding = 'same')\n", 102 | " _conv_relu = activation_fn(_conv_pre)\n", 103 | " \n", 104 | " with tf.variable_scope('conv_layer4'):\n", 105 | " _conv_pre = tf.layers.conv2d(inputs = _conv_relu, filters = 128, kernel_size = [3,3],\n", 106 | " kernel_initializer = initializer,\n", 107 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale),\n", 108 | " padding = 'same')\n", 109 | " _conv_relu = activation_fn(_conv_pre)\n", 110 | " \n", 111 | " with tf.variable_scope('max_pool2'):\n", 112 | " _pooled = tf.layers.max_pooling2d(inputs = _conv_relu, pool_size = [2,2], strides = 2)\n", 113 | " \n", 114 | " with tf.variable_scope('dense_layer1'):\n", 115 | " _pooled_vector = tf.reshape(tensor = _pooled, shape = [-1,np.cumprod(_pooled.get_shape().as_list()[-3:])[-1]])\n", 116 | " _fc_pre = tf.layers.dense(inputs = _pooled_vector, units = 1024, kernel_initializer = initializer,\n", 117 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale))\n", 118 | " _fc_relu = activation_fn(_fc_pre)\n", 119 | " _fc_relu = tf.nn.dropout(x = _fc_relu, keep_prob = 
self._keep_prob)\n", 120 | " \n", 121 | " with tf.variable_scope('output_layer'):\n", 122 | " self._score = tf.layers.dense(inputs = _fc_relu, units = 10, kernel_initializer = initializer,\n", 123 | " kernel_regularizer = tf.contrib.layers.l2_regularizer(scale = l2_scale))\n", 124 | " \n", 125 | " with tf.variable_scope('loss'):\n", 126 | " _ce_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits = self._score, labels = self._y))\n", 127 | " _reg_term = tf.reduce_sum(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES))\n", 128 | " self._total_loss = _ce_loss + _reg_term\n", 129 | "\n", 130 | " \n", 131 | " with tf.variable_scope('predict'):\n", 132 | " self._prediction = tf.argmax(input = self._score, axis = 1)\n", 133 | " \n", 134 | " def predict(self, sess, x_data, keep_prob = 1.):\n", 135 | " feed_predict = {self._x : x_data, self._keep_prob : keep_prob}\n", 136 | " return sess.run(fetches = self._prediction, feed_dict = feed_predict)" 137 | ] 138 | }, 139 | { 140 | "cell_type": "markdown", 141 | "metadata": {}, 142 | "source": [ 143 | "### Define Solver class" 144 | ] 145 | }, 146 | { 147 | "cell_type": "code", 148 | "execution_count": 3, 149 | "metadata": {}, 150 | "outputs": [], 151 | "source": [ 152 | "class Solver:\n", 153 | " def __init__(self, model, optimizer = tf.train.AdamOptimizer, var_list = None):\n", 154 | " self._model = model\n", 155 | " self._lr = tf.placeholder(dtype = tf.float32)\n", 156 | " self._optimizer = optimizer(learning_rate = self._lr)\n", 157 | " self._training_op = self._optimizer.minimize(loss = self._model._total_loss, var_list = var_list)\n", 158 | " \n", 159 | " def train(self, sess, x_data, y_data, lr, keep_prob = .5):\n", 160 | " feed_train = {self._model._x : x_data, self._model._y : y_data, self._lr : lr,\n", 161 | " self._model._keep_prob : keep_prob}\n", 162 | " return sess.run(fetches = [self._training_op, self._model._total_loss], feed_dict = feed_train)\n", 163 | " \n", 164 | " def evaluate(self, sess, x_data, y_data, keep_prob = 1.):\n", 165 | " feed_loss = {self._model._x : x_data, self._model._y : y_data, self._model._keep_prob : keep_prob}\n", 166 | " return sess.run(fetches = self._model._total_loss, feed_dict = feed_loss)" 167 | ] 168 | }, 169 | { 170 | "cell_type": "markdown", 171 | "metadata": {}, 172 | "source": [ 173 | "### Generate CNN model and Adam solver### " 174 | ] 175 | }, 176 | { 177 | "cell_type": "code", 178 | "execution_count": 4, 179 | "metadata": {}, 180 | "outputs": [], 181 | "source": [ 182 | "sess = tf.Session()\n", 183 | "mnist_classifier = MnistCNN()" 184 | ] 185 | }, 186 | { 187 | "cell_type": "code", 188 | "execution_count": 5, 189 | "metadata": {}, 190 | "outputs": [], 191 | "source": [ 192 | "adam_solver = Solver(model = mnist_classifier)" 193 | ] 194 | }, 195 | { 196 | "cell_type": "markdown", 197 | "metadata": {}, 198 | "source": [ 199 | "### Training" 200 | ] 201 | }, 202 | { 203 | "cell_type": "code", 204 | "execution_count": 6, 205 | "metadata": {}, 206 | "outputs": [], 207 | "source": [ 208 | "# Hypear-parameters\n", 209 | "batch_size = 100\n", 210 | "n_epochs = 10\n", 211 | "best_loss = np.infty\n", 212 | "max_checks_without_progress = 15\n", 213 | "checks_without_progress = 0\n", 214 | "tr_loss_history = []\n", 215 | "val_loss_history = []" 216 | ] 217 | }, 218 | { 219 | "cell_type": "code", 220 | "execution_count": 7, 221 | "metadata": {}, 222 | "outputs": [], 223 | "source": [ 224 | "sess.run(tf.global_variables_initializer())" 225 | ] 226 | }, 227 | { 228 | "cell_type": "code", 229 | 
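The loss above adds `tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)` to the cross-entropy; a minimal stand-alone sketch (assumed, not a notebook cell) of how that collection is populated whenever a layer is built with a `kernel_regularizer`:

```python
# Assumed illustration: each layer created with a kernel_regularizer appends
# its penalty tensor to REGULARIZATION_LOSSES, so the total loss can be formed
# as data_loss + sum(collection), exactly as MnistCNN does above.
import tensorflow as tf

x = tf.placeholder(dtype=tf.float32, shape=[None, 4])
h = tf.layers.dense(inputs=x, units=8,
                    kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.01))
reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
print(reg_losses)                        # one scalar tensor per regularized kernel
reg_term = tf.reduce_sum(reg_losses)     # the _reg_term added to the cross-entropy
```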
"execution_count": 8, 230 | "metadata": {}, 231 | "outputs": [ 232 | { 233 | "name": "stdout", 234 | "output_type": "stream", 235 | "text": [ 236 | "step : 0, tr_loss : 17.423, val_loss : 21.375\n", 237 | "step : 100, tr_loss : 3.446, val_loss : 3.483\n", 238 | "step : 200, tr_loss : 1.886, val_loss : 1.836\n", 239 | "step : 300, tr_loss : 1.265, val_loss : 1.269\n", 240 | "step : 400, tr_loss : 0.894, val_loss : 0.996\n", 241 | "step : 500, tr_loss : 0.744, val_loss : 0.723\n", 242 | "epoch : 0, tr_loss : 2.493, val_loss : 2.422\n", 243 | "step : 0, tr_loss : 0.694, val_loss : 0.737\n", 244 | "step : 100, tr_loss : 0.577, val_loss : 0.590\n", 245 | "step : 200, tr_loss : 0.454, val_loss : 0.454\n", 246 | "step : 300, tr_loss : 0.468, val_loss : 0.478\n", 247 | "step : 400, tr_loss : 0.370, val_loss : 0.361\n", 248 | "step : 500, tr_loss : 0.317, val_loss : 0.388\n", 249 | "epoch : 1, tr_loss : 0.488, val_loss : 0.461\n", 250 | "step : 0, tr_loss : 0.351, val_loss : 0.377\n", 251 | "step : 100, tr_loss : 0.323, val_loss : 0.360\n", 252 | "step : 200, tr_loss : 0.275, val_loss : 0.270\n", 253 | "step : 300, tr_loss : 0.303, val_loss : 0.257\n", 254 | "step : 400, tr_loss : 0.315, val_loss : 0.312\n", 255 | "step : 500, tr_loss : 0.263, val_loss : 0.273\n", 256 | "epoch : 2, tr_loss : 0.335, val_loss : 0.311\n", 257 | "step : 0, tr_loss : 0.295, val_loss : 0.326\n", 258 | "step : 100, tr_loss : 0.335, val_loss : 0.345\n", 259 | "step : 200, tr_loss : 0.253, val_loss : 0.306\n", 260 | "step : 300, tr_loss : 0.319, val_loss : 0.270\n", 261 | "step : 400, tr_loss : 0.246, val_loss : 0.224\n", 262 | "step : 500, tr_loss : 0.305, val_loss : 0.273\n", 263 | "epoch : 3, tr_loss : 0.303, val_loss : 0.281\n", 264 | "step : 0, tr_loss : 0.278, val_loss : 0.259\n", 265 | "step : 100, tr_loss : 0.333, val_loss : 0.243\n", 266 | "step : 200, tr_loss : 0.318, val_loss : 0.272\n", 267 | "step : 300, tr_loss : 0.282, val_loss : 0.282\n", 268 | "step : 400, tr_loss : 0.389, val_loss : 0.251\n", 269 | "step : 500, tr_loss : 0.281, val_loss : 0.403\n", 270 | "epoch : 4, tr_loss : 0.295, val_loss : 0.271\n", 271 | "step : 0, tr_loss : 0.296, val_loss : 0.351\n", 272 | "step : 100, tr_loss : 0.270, val_loss : 0.348\n", 273 | "step : 200, tr_loss : 0.346, val_loss : 0.374\n", 274 | "step : 300, tr_loss : 0.228, val_loss : 0.232\n", 275 | "step : 400, tr_loss : 0.254, val_loss : 0.274\n", 276 | "step : 500, tr_loss : 0.259, val_loss : 0.210\n", 277 | "epoch : 5, tr_loss : 0.287, val_loss : 0.263\n", 278 | "step : 0, tr_loss : 0.265, val_loss : 0.226\n", 279 | "step : 100, tr_loss : 0.439, val_loss : 0.263\n", 280 | "step : 200, tr_loss : 0.275, val_loss : 0.250\n", 281 | "step : 300, tr_loss : 0.251, val_loss : 0.316\n", 282 | "step : 400, tr_loss : 0.366, val_loss : 0.274\n", 283 | "step : 500, tr_loss : 0.457, val_loss : 0.251\n", 284 | "epoch : 6, tr_loss : 0.285, val_loss : 0.259\n", 285 | "step : 0, tr_loss : 0.259, val_loss : 0.260\n", 286 | "step : 100, tr_loss : 0.300, val_loss : 0.225\n", 287 | "step : 200, tr_loss : 0.457, val_loss : 0.301\n", 288 | "step : 300, tr_loss : 0.267, val_loss : 0.251\n", 289 | "step : 400, tr_loss : 0.278, val_loss : 0.237\n", 290 | "step : 500, tr_loss : 0.244, val_loss : 0.277\n", 291 | "epoch : 7, tr_loss : 0.281, val_loss : 0.255\n", 292 | "step : 0, tr_loss : 0.308, val_loss : 0.231\n", 293 | "step : 100, tr_loss : 0.205, val_loss : 0.281\n", 294 | "step : 200, tr_loss : 0.228, val_loss : 0.225\n", 295 | "step : 300, tr_loss : 0.296, val_loss : 0.216\n", 296 | "step : 
400, tr_loss : 0.284, val_loss : 0.275\n", 297 | "step : 500, tr_loss : 0.265, val_loss : 0.258\n", 298 | "epoch : 8, tr_loss : 0.279, val_loss : 0.252\n", 299 | "step : 0, tr_loss : 0.254, val_loss : 0.184\n", 300 | "step : 100, tr_loss : 0.291, val_loss : 0.252\n", 301 | "step : 200, tr_loss : 0.228, val_loss : 0.234\n", 302 | "step : 300, tr_loss : 0.228, val_loss : 0.241\n", 303 | "step : 400, tr_loss : 0.301, val_loss : 0.271\n", 304 | "step : 500, tr_loss : 0.239, val_loss : 0.273\n", 305 | "epoch : 9, tr_loss : 0.274, val_loss : 0.250\n" 306 | ] 307 | } 308 | ], 309 | "source": [ 310 | "for epoch in range(n_epochs):\n", 311 | " avg_tr_loss = 0\n", 312 | " avg_val_loss = 0\n", 313 | " total_batch = int(mnist.train.num_examples / batch_size)\n", 314 | " \n", 315 | " for step in range(total_batch):\n", 316 | " \n", 317 | " batch_xs, batch_ys = mnist.train.next_batch(batch_size = batch_size)\n", 318 | " val_xs, val_ys = mnist.validation.next_batch(batch_size = batch_size)\n", 319 | " _, tr_loss = adam_solver.train(sess = sess, x_data = batch_xs, y_data = batch_ys, lr = 1e-3, keep_prob = .5)\n", 320 | " val_loss = adam_solver.evaluate(sess = sess, x_data = val_xs, y_data = val_ys, keep_prob = 1)\n", 321 | " \n", 322 | " avg_tr_loss += tr_loss / total_batch\n", 323 | " avg_val_loss += val_loss / total_batch\n", 324 | " if step % 100 == 0:\n", 325 | " print('step : {:3}, tr_loss : {:.3f}, val_loss : {:.3f}'.format(step, tr_loss, val_loss))\n", 326 | " \n", 327 | " print('epoch : {:3}, tr_loss : {:.3f}, val_loss : {:.3f}'.format(epoch, avg_tr_loss, avg_val_loss))\n", 328 | " tr_loss_history.append(avg_tr_loss)\n", 329 | " val_loss_history.append(avg_val_loss)\n", 330 | " \n", 331 | " # early stopping\n", 332 | " if avg_val_loss < best_loss:\n", 333 | " best_loss = avg_val_loss\n", 334 | " checks_without_progress = 0\n", 335 | " else:\n", 336 | " checks_without_progress += 1\n", 337 | " if checks_without_progress > max_checks_without_progress:\n", 338 | " print('Early stopping')\n", 339 | " break" 340 | ] 341 | }, 342 | { 343 | "cell_type": "code", 344 | "execution_count": 9, 345 | "metadata": {}, 346 | "outputs": [ 347 | { 348 | "data": { 349 | "text/plain": [ 350 | "" 351 | ] 352 | }, 353 | "execution_count": 9, 354 | "metadata": {}, 355 | "output_type": "execute_result" 356 | }, 357 | { 358 | "data": { 359 | "image/png": 
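A hypothetical usage sketch (assumed, and not necessarily identical to the evaluation cells that follow) of the trained classifier, relying on `predict` defaulting `keep_prob` to 1.0 so that dropout is switched off at test time:

```python
# Assumed usage of the notebook's own objects (mnist_classifier, sess, mnist, np).
test_pred = mnist_classifier.predict(sess=sess, x_data=mnist.test.images)
test_acc = np.mean(np.argmax(mnist.test.labels, axis=1) == test_pred)
print('test accuracy : {:.2%}'.format(test_acc))
```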
"iVBORw0KGgoAAAANSUhEUgAAAXcAAAD8CAYAAACMwORRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3WtwXGed5/Hvv2+625YsxW5JduyE\nYGzJujiKcTATwibLJmGSQEiBM8BMqGJclYEhoWZqh+XFMkztzPCCSmUYGDJhCAyzJiHrJCRAgB12\nk4UskI1sHMe3YMeWrasty7YsS7Kk7n72Rbfslty6WG75qE//PlVd3f30c07/3Yl/5/g85zzHnHOI\niIi/BLwuQEREsk/hLiLiQwp3EREfUriLiPiQwl1ExIcU7iIiPqRwFxHxIYW7iIgPKdxFRHwo5NUX\nV1ZWulWrVnn19SIiOWnHjh0nnXNVM/XzLNxXrVpFa2urV18vIpKTzOzobPrpsIyIiA8p3EVEfEjh\nLiLiQ54dcxcRfxkbG6Ojo4Pz5897XYovFBYWUltbSzgcntPyM4a7ma0AvgcsBxLAE865f5jU51bg\nBeBIquk559zfzKkiEclJHR0dlJWVsWrVKszM63JymnOOvr4+Ojo6WL169ZzWMZs99xjwF865nWZW\nBuwws393zu2b1O9Xzrk/nFMVIpLzzp8/r2DPEjNj6dKl9Pb2znkdMx5zd851O+d2pl4PAPuBmjl/\no4j4loI9e670t7ysAVUzWwU0A69l+PhmM3vDzH5qZnVXVNU03uoZ4O9f2s+5kdh8fYWISM6bdbib\nWSnwLPCIc+7spI93Atc65xqBfwR+OMU6tppZq5m1zvWfG+2nhvjnXx7mQPfkEkQkn505c4Z/+qd/\nuuzl7rrrLs6cOTMPFXlrVuFuZmGSwb7NOffc5M+dc2edc+dSr18CwmZWmaHfE865FudcS1XVjFfP\nZlRXswiAvV0KdxG5aKpwj8fj0y730ksvsWTJkvkqyzOzOVvGgG8D+51zj07RZzlw3DnnzGwjyY1G\nX1YrTVm+qJCKkgh7u/rnY/UikqO+8IUv8Pbbb9PU1EQ4HKa0tJRoNMquXbvYt28fH/rQh2hvb+f8\n+fM8/PDDbN26Fbg4Fcq5c+e48847ee9738uvf/1rampqeOGFFygqKvL4TzY3szlbZjPwSeBNM9uV\navsisBLAOfc4cD/wkJnFgGFgi3POzUO9mBl11Yu05y6ygH35R3vZl+W/o+uqF/Glu6cezvvKV77C\nnj172LVrF6+88gof/OAH2bNnz4VTCZ988kkqKioYHh7mpptu4iMf+QhLly6dsI6DBw/y1FNP8a1v\nfYuPfvSjPPvss3ziE5/I6p/japkx3J1zrwLTDts6574OfD1bRc1kXfUinnz1CKOxBJGQLrIVkUtt\n3LhxwjniX/va13j++ecBaG9v5+DBg5eE++rVq2lqagLgxhtvpK2t7arVm225d4XqyAC3hA/w7XiA\ngycGqKte7HVFIjLJdHvYV0tJScmF16+88gq/+MUv+M1vfkNxcTG33nprxitpCwoKLrwOBoMMDw9f\nlVrnQ+7t9r71Mza/+iDXW5cOzYjIBWVlZQwMDGT8rL+/n/LycoqLizlw4AC//e1vr3J1V1/u7blH\nGwHYED6W9WN6IpK7li5dyubNm6mvr6eoqIhly5Zd+OyOO+7g8ccfp6GhgTVr1rBp0yYPK706ci/c\nl14P4RLeG+rkuzpjRkTSfP/738/YXlBQwE9/+tOMn40fV6+srGTPnj0X2v/yL/8y6/VdTbl3WCYQ\nhOXrqQ8cYV/XWRKJeTkpR0Qkp+VeuANEG6k+f5Dh0TGOnhryuhoRkQUnZ8M9HB9mtXXrYiYRkQxy\nNtwBGoJHdcaMiEgGuRnuVWsgWMB7SzoV7iIiGeRmuAfDsKyOxlAb+7r6maeZDkREclZuhjtAdRMr\nRg5y8twIJwZGvK5GRHJMaWkpAF1dXdx///0Z+9x66620trZOu57HHnuMoaGLJ3YslCmEczfco40U\nxM6xwk5oUFVE5qy6uprt27fPefnJ4b5QphDO6XAHqLc29nbquLtIvvurv/qrCfO5//Vf/zVf/vKX\nue2229iwYQPr16/nhRdeuGS5trY26uvrARgeHmbLli00NDTwsY99bMLcMg899BAtLS3U1dXxpS99\nCUhORtbV1cX73/9+3v/+9wPJKYRPnjwJwKOPPkp9fT319fU89thjF75v7dq1/Omf/il1dXV84AMf\nmJc5bHLvCtVx16yDQIjNJR28qkFVkYXlp1+Anjezu87l6+HOr0z58ZYtW3jkkUf4sz/7MwCeeeYZ\nfvazn/H5z3+eRYsWcfLkSTZt2sQ999wz5f1Jv/nNb1JcXMzu3bvZvXs3GzZsuPDZ3/7t31JRUUE8\nHue2225j9+7dfO5zn+PRRx/l5ZdfprJy4v2JduzYwXe+8x1ee+01nHO8+93v5n3vex/l5eVXZWrh\n3N1zDxXANWvZED7K3m4dlhHJd83NzZw4cYKuri7eeOMNysvLiUajfPGLX6ShoYHbb7+dzs5Ojh8/\nPuU6fvnLX14I2YaGBhoaGi589swzz7Bhwwaam5vZu3cv+/btm7aeV199lQ9/+MOUlJRQWlrKfffd\nx69+9Svg6kwtnLt77gDRRlb3/Zj2gSH6h8dYXBT2uiIRgWn3sOfT/fffz/bt2+np6WHLli1s27aN\n3t5eduzYQTgcZtWqVRmn+k2Xaa/+yJEjfPWrX+X111+nvLycBx98cMb1THcW39WYWjh399wBok0U\njZ0hyinNECkibNmyhaeffprt27dz//3309/fzzXXXEM4HObll1/m6NGj0y5/yy23sG3bNgD27NnD\n7t27ATh79iwlJSUsXryY48ePT5iEbKqphm+55RZ++MMfMjQ0xODgIM8//zx/8Ad/kMU/7fRyfs8d\noD5whL1d/dx8/dIZFhARP6urq2NgYICamhqi0Sgf//jHufvuu2lpaaGpqYl3vetd0y7/0EMP8alP\nfYqGhgaamprYuHEjAI2NjTQ3N1NXV8d1113H5s2bLyyzdetW7rzzTqLRKC+//PKF9g0bNvDggw9e\nWMenP/1pmpubr9rdncyrC4BaWlrcTOePzmh0CP6+hm/ZR9i/5rM8+rGm7BQnIpdt//79rF271usy\nfCXTb2pmO5xzLTMtm9uHZSLFULmGmwraNQ2BiEia3A53gGgj18fe5lDvOc6Pxb2uRkRkQfBFuJeN\n9VKROM1bPZnvnygiV4fmecqeK/0tfRHuAHWBNh2aEfFQYWEhfX19CvgscM7R19dHYWHhnNeR22fL\nQPKqNeDGyFHNMSPiodraWjo6Oujt7fW6FF8oLCyktrZ2zsvnfrgXLoKK63n3YDt/pz13Ec+Ew2FW\nr17tdRmSkvuHZQCijbzTHeZAz1niumG2iIh/wn3JaA+F
Y/0c7j3ndTUiIp7zTbiDBlVFRMb5Ktwb\ngxpUFREBv4R7cQUsWcmmog7tuYuI4JdwB4g2so7D7O06q/NsRSTv+SrcK0c7iA/303km+3Mji4jk\nEh+Fe3JGyDo7qkMzIpL3fBTuyUHV9cEjCncRyXv+CffSa6AsyruLOtinM2ZEJM/NGO5mtsLMXjaz\n/Wa218weztDHzOxrZnbIzHab2YZM65p30UbqTee6i4jMZs89BvyFc24tsAn4jJmtm9TnTuCG1GMr\n8M2sVjlb0UaWjR7jTP8ZTg2OelKCiMhCMGO4O+e6nXM7U68HgP1AzaRu9wLfc0m/BZaYWTTr1c4k\n2kiABGvtmC5mEpG8dlnH3M1sFdAMvDbpoxqgPe19B5duAObf+BkzmoZARPLcrMPdzEqBZ4FHnHOT\nk9MyLHLJlURmttXMWs2sdV7mfF5UDcWVbNQ9VUUkz80q3M0sTDLYtznnnsvQpQNYkfa+Fuia3Mk5\n94RzrsU511JVVTWXemcqFKKNNIY0x4yI5LfZnC1jwLeB/c65R6fo9iLwx6mzZjYB/c657izWOXvR\nRmrGjtB58gyDIzFPShAR8dps7sS0Gfgk8KaZ7Uq1fRFYCeCcexx4CbgLOAQMAZ/KfqmzFG0k6OK8\nk3YO9JzlxmsrPCtFRMQrM4a7c+5VMh9TT+/jgM9kq6grkrpStT6QvFJV4S4i+cg/V6iOK1+FK1jE\nhvAx9nZqUFVE8pP/wt0MizayIXyUvd0aVBWR/OS/cAeINnJtrI3DPWcYiye8rkZE5Krzabg3EXKj\nrEx0cPC4bpgtIvnHn+FenbxSNTmoqkMzIpJ//BnuFdfjIqWpG2ZrUFVE8o8/wz0QwJavp6XgGPsU\n7iKSh/wZ7gDRRq6PH+ZA9xkSCd0wW0Tyi6/DPZI4T9VoO8dODXldjYjIVeXrcAeoN91TVUTyj3/D\nvXINLlTI+mCbzpgRkbzj33APhrBlddykud1FJA/5N9wBoo28M3GYfZ1nvK5EROSq8nm4N1GUGKRo\nqIMTZ897XY2IyFXj83DXoKqI5Cd/h/s1a3GBMPUBDaqKSH7xd7iHCrBr1tISOaY9dxHJK/4Od4Bo\nI+s4zN5O7bmLSP7Ii3AvTZwldrqds+fHvK5GROSqyINwvzj9ryYRE5F84f9wX1aHswB1gTYddxeR\nvOH/cI8UY5VrkjfM1hkzIpIn/B/uANFGHZYRkbySH+Fe3UR5/BRnTrRzfizudTUiIvMuP8I9daXq\nuzjC748PeFyMiMj8y49wX74egHrToKqI5If8CPeCMtzSd9Ac0jQEIpIf8iPcAYs20hA8qj13EckL\neRPuRBupSpygp7uTuG6YLSI+l1fhDnB9/DBHTg56XIyIyPzKn3Bf3gCMz+2u4+4i4m/5E+7FFbgl\nK2kItuliJhHxvfwJd8CiTTSFNKgqIv6XV+FOtJHqRDdHO7twToOqIuJfeRbuyel/q8+/TXe/bpgt\nIv41Y7ib2ZNmdsLM9kzx+a1m1m9mu1KP/5r9MrMkmhpUDeiG2SLib7PZc/8ucMcMfX7lnGtKPf7m\nysuaJ6XXkCiLsj6gM2ZExN9mDHfn3C+BU1ehlqsiEG2iKaxBVRHxt2wdc7/ZzN4ws5+aWV2W1jk/\noo2sTHRyuPOE15WIiMybbIT7TuBa51wj8I/AD6fqaGZbzazVzFp7e3uz8NVzEG0kgGPx2bc4PTjq\nTQ0iIvPsisPdOXfWOXcu9folIGxmlVP0fcI51+Kca6mqqrrSr56b1DQE9YEj7OvWoRkR8acrDncz\nW25mlnq9MbXOvitd77xZVE2iuDI1t7sGVUXEn0IzdTCzp4BbgUoz6wC+BIQBnHOPA/cDD5lZDBgG\ntriFfIWQGYHqJpqGD/ENDaqKiE/NGO7OuQdm+PzrwNezVtHVEG3kukMv8/vOk15XIiIyL/LrCtVx\n0UZCxIn0HWB4VDfMFhH/ydtwB6izI+zv0aEZEfGf/Az3JdeSKFicmttd4S4i/pOf4W6GVTfSEDrK\nPp0xIyI+lJ/hTvKG2Ws4xoFO38ysICJyQd6GO9EmwowRP36AsXjC62pERLIqj8M9Oai6xr3N273n\nPC5GRCS78jfcK64nES6hztrY26lBVRHxl/wN90AAizbQEGzTGTMi4jv5G+4kb5i9zo6yT4OqIuIz\neR3uRBspZIShnrd0w2wR8ZW8D3eA60YP0X5q2ONiRESyJ7/DvfKdJIIFqRtm62ImEfGP/A73YAiW\n1VMf0KCqiPhLfoc7EKhuYn3gKPs6T3tdiohI1uR9uBNtpIQhznQd9LoSEZGsUbinBlWjQ7+nd2DE\n42JERLJD4X7NOhKBsAZVRcRXFO6hCK5qbXIaAg2qiohPKNyBYE0TDcE29nVqz11E/EHhDhBtZAkD\nnOw85HUlIiJZoXAHiDYBsKR/PwPnxzwuRkTkyincAZbVkbAgdYEj7O8e8LoaEZErpnAHCBcRr7iB\nemvTGTMi4gsK95RQbZPmdhcR31C4p1i0iUrO0NXR5nUpIiJXTOE+LnWlavHJNxmJxT0uRkTkyijc\nxy1fj8NYyxEOHtcNs0UktyncxxWUMbbkutT0vxpUFZHcpnBPE65tpiFwRIOqIpLzFO5pLNpI1Ppo\n7zjmdSkiIldE4Z4uNagaPP4m8YRumC0iuUvhni7aAMAN8cO09Q16XIyIyNwp3NMVlTNatoI6HXcX\nkRyncJ8kVNPMep0xIyI5bsZwN7MnzeyEme2Z4nMzs6+Z2SEz221mG7Jf5tUTqG7kWjtOW0eX16WI\niMzZbPbcvwvcMc3ndwI3pB5bgW9eeVkeqk5O/5vo3o1zGlQVkdw0Y7g7534JnJqmy73A91zSb4El\nZhbNVoFX3fLkGTMrRw7Sc/a8x8WIiMxNNo651wDtae87Um25qbSK0eJo8krVTg2qikhuyka4W4a2\njMczzGyrmbWaWWtvb28Wvnp+BGoaWW86Y0ZEclc2wr0DWJH2vhbIOBrpnHvCOdfinGupqqrKwlfP\nj1BNM9cFujnU0eN1KSIic5KNcH8R+OPUWTObgH7nXHcW1uudaCMBHGNdu72uRERkTkIzdTCzp4Bb\ngUoz6wC+BIQBnHOPAy8BdwGHgCHgU/NV7FWTmoZg2eBbnBkaZUlxxOOCREQuz4zh7px7YIbPHfCZ\nrFW0EJRFGS1cSn3sCPu6zvKed1R6XZGIyGXRFaqZmMHyxtTc7hpUFZHco3CfQmRFMzcEOnir44TX\npYiIXDaF+1SiTYRIMNyZcdYFEZEFTeE+ldSg6pIz+xge1Q2zRSS3KNynsmQlo+HF1NlhDvTouLuI\n5BaF+1TMSCxv0KCqiOQkhfs0ClY0865AOwc6+rwuRUTksijcp2HRRiLEONfxpteliIhcFoX7dKLJ\nud2L+/YSiyc8LkZEZPYU7tOpuI6xYDFr3GHe7tUNs0UkdyjcpxMIMHbN+tSgqu6pKiK5Q+E+g8KV\nG1hnR9nXedr
rUkREZk3hPoNAdRNFNsrpY3u9LkVEZNYU7jNJXaka6X1TN8wWkZyhcJ/J0huIBQp5\nR+xtOk4Pe12NiMisKNxnEgwxsnQt9YEjGlQVkZyhcJ+FgtSg6t7OM16XIiIyKwr3WQjVNFFmw5w8\ndsDrUkREZkXhPhupQdVgj26YLSK5QeE+G1VriVuI2pGD9J0b8boaEZEZKdxnIxRhuHwNdXZE0/+K\nSE5QuM9SpHZDchqCTp0xIyILn8J9liIrmim3c3Qd+73XpYiIzEjhPlup6X8D3W94XIiIyMwU7rO1\nbB0JglSdO8DgSMzrakREpqVwn61wEYOL30GdHWF/twZVRWRhU7hfhlBNU3IaAg2qisgCp3C/DIUr\nm6mys7QfO+x1KSIi01K4XwZLDarGOnd5XImIyPQU7pdj+XocRnn/PkZjumG2iCxcCvfLUVDKudLV\nrOUIB08MeF2NiMiUFO6XK9pIXaBN0xCIyIKmcL9MJas2UGN9tB096nUpIiJTUrhfpkB1clB1tON3\nHlciIjI1hfvlWt4AQEnfHhIJ3TBbRBamWYW7md1hZm+Z2SEz+0KGzx80s14z25V6fDr7pS4QRUs4\nV1zLO91hjp4a8roaEZGMZgx3MwsC3wDuBNYBD5jZugxdf+Cca0o9/iXLdS4o8WUN1FubbpgtIgvW\nbPbcNwKHnHOHnXOjwNPAvfNb1sJWsupGrg2c4NCxDq9LERHJaDbhXgO0p73vSLVN9hEz221m281s\nRVaqW6BCNclB1eFjulJVRBam2YS7ZWibPJL4I2CVc64B+AXwrxlXZLbVzFrNrLW3t/fyKl1IUtMQ\nhE7s5oVdnYzE4h4XJCIy0WzCvQNI3xOvBbrSOzjn+pxz43eO/hZwY6YVOeeecM61OOdaqqqq5lLv\nwlBSyWhJNU3Bozz89C42/d3/4r/9eB+HTpzzujIREWB24f46cIOZrTazCLAFeDG9g5lF097eA+zP\nXokLU6S2mdsXHeOpP67jPddX8t1ft3H7o/+Hj/7zb/jh7zo5P6a9eRHxTmimDs65mJl9Fvg5EASe\ndM7tNbO/AVqdcy8CnzOze4AYcAp4cB5rXhhu+I/YWz/h5uffw83193H6Tx7gB93LePr1dh75wS6W\n/CjMfc21PLBxBTcsK/O6WhHJM+acNxfitLS0uNbWVk++Oyucg45W2PmvsOc5GBuEqrUkmj9J6+IP\n8L03Bvj53h7G4o6bVpXzwMaV3LU+SmE46HXlIpLDzGyHc65lxn4K9ywYGYA9z8LO70HnDghGYO3d\n9K99gB+cXMVTr3dy5OQgiwpD3Lehlgc2rmTNcu3Ni8jlU7h7pWdPMuR3Pw3n+6F8Fa75k+wov4vv\n7RnhZ3t6GI0nuPHa5N78B9dHKYpob15EZkfh7rWxYdj/4+Rhm7ZfgQXhnf+JgXV/xDNn1vD91i7e\n7h2krDDEfc01bNm4krXRRV5XLSILnMJ9Iel7G373b/C7bTB4AsqiuKaPs7vqbr67H37yZjejsQRN\nK5bwRxtX8oeNUYojM451i0geUrgvRPEx+P3Pk4dtDv07uASsfh+D9R/nfww2sa21h4MnzlFWEOLe\n5moe2LiSuurFXlctIguIwn2h6++EXdtg579B/zEoKsc1bGHv8g/x5MFCfrK7m5FYgsbaxTywcSV3\nN1ZTUqC9eZF8p3DPFYkEHHkluTe//8eQGIPajQzVf5znRm7i33b28dbxAUoiQe5truGPNq6kvkZ7\n8yL5SuGeiwZPwhtPJwdhT/4eImW4+o9woPrDfPvwEn78ZjfnxxKsr0nuzd/TVE2p9uZF8orCPZc5\nB+2vJffm9zwHsWFYVs/w+k/wQmIz3915hgM9AxRHgtzTWM3ta5dRWVbA0pIIS0sjGowV8TGFu1+c\n74c3tyeDvnsXBAtw6+7l7RX38a2j1by4u5vhSfPYFIYDLC0pYGlphIqSyIXXS0tS70uTbRXaGIjk\nHIW7H3W/kRyA3f0MjPRDxXWMNHyCw+XvpTexiOOxIk4Oxjk1OELf4Ch950Y5NZh8nDw3wkgskXG1\nReFgWuhHqNDGQGTBUrj72egQ7H8xuTd/9P+mfWBQuBiKl056VOCKKhgtKOesLeI0ZfQlSjkeL6Vn\npIC+oVhWNgbFkSDhYIBw0AgFAoRDAcIBI5RqCwcDhAKWag8QSrWN94+Eks+hoBEJBggFU30CyT7B\ngGGW6fYCIvlD4Z4vTh5KHq4ZOgVDfZMep2D4VHKgNj4yxQoMisozbgzGCso5F1zMaco45crojZfS\nM1ZM90gkuUGY5cYgmyZsJIKB5OtJG4lwMLlBudjn4gZmfONxycYleHFDlP75eHv694TS3qd/3+Ta\nzCBglnqATfE83seMjMuIpJttuOvf17mu8h3Jx3Scg7GhScF/OsPGoA/OHIWundhQH5H4KBVABXB9\n+vosAEUVyQ1ByVKoqkhuEMIlxIMREhYhEYgQsxDxQJh4IELMwsQswhhhYhYiZmFGCTNGmDFCjBJm\nlBCjhBhxYUYIMerCjCQCjCUgFk8wFk8wlnCp146xeIJY6nks4RiLJYglkp+NPw+Oxi70iaUtG0uk\nLZvWfyFK3wBgXLJBmGrjMblP8l8+F/tf2IgEIGg2cbnA5D6T3l/ol6lt6vUk/wiptsCk92n1T/5z\nTa47+bukr8cwJvWZtEz67xAMGMH014FkfaFg6jlDWzCQbA9MXtaMYDD5HAhAKBBYEBtmhXs+MINI\nSfKxZOXslnEORs/NvDEYPg2nDkP7/8PGhgnFRyA+mt36gxEIFkBo0nMwcmlbJJJsD4QhOP6IQCCU\nWk8EgqGMfVwgTCK1QYoRJG5hYhYkRvL9GCFiFmLMJV+PEWTUjb8PMpIIEXPuwkbDAc45Eg4SqWfn\nHImEw8HF92l9nCPtc5exD2nru7BMqk/y9cVl4glwJL8zvX88MXGd6d+V/lk8kUi9T33mHIlE2nrS\nvnd8ufF6Ji/n0r7fTarfMfG9H4wHf3rgJzcIAf7k5mv589tumNfvV7hLZmZQUJZ8lK+6vGWdSwZ8\nLBX06a8nP1/SNpKcpmH8dWx06rbY6MV1jA5B/HSyLTGWao+lnscutiViU/+RSd6NJghE5vq7BdI2\nHIFA8l85Ex7B1LMlnwPBDH0yPCb0s7T1TO5jE78rON4eTNUTTL4PhC5ts1T7JW3j/QMT2y58NlVb\naNJngYn1YZf+GSyAM8M5w5mRIPlwBHAWSL52kCCAS7UnMOJY8r0FkhsJAiRccv0JSG2o3IUNTTxx\n8ZFIbZDHN1yxROJC2/gGcrwtnhjf4CU3bPF4gnhqgxy7ZF0X29K/L+4cNywrnev/YbOmcJfsM4NQ\nQfKx0Dg3MewvbABSwT++MZhVn/T3af0SqeVdYuIjEU9+/4W2eIY+k95P6JNaNp5hOTdp3Yn4xfbE\n+HP80ufJbQuApR6Q3NBmZ6XpGxSb5XOqmvGN0uUuO933LfsT4LPZ+tNl
pHCX/GKWPIRDBCjxupqF\nJ5FIbrAu2Rik2i9pi2foP74xiU1sw2XYKE1uS3ufsf94n6k+S192Up9E/GL7Jc9M0Z62nimXTR4q\ny9yeyLxM6bJ5/0+pcBeRiwIBCMz5oJQsIAGvCxARkexTuIuI+JDCXUTEhxTuIiI+pHAXEfEhhbuI\niA8p3EVEfEjhLiLiQ55N+WtmvcDROS5eCZzMYjm5Tr/HRPo9LtJvMZEffo9rnXNVM3XyLNyvhJm1\nzmY+43yh32Mi/R4X6beYKJ9+Dx2WERHxIYW7iIgP5Wq4P+F1AQuMfo+J9HtcpN9iorz5PXLymLuI\niEwvV/fcRURkGjkX7mZ2h5m9ZWaHzOwLXtfjJTNbYWYvm9l+M9trZg97XZPXzCxoZr8zsx97XYvX\nzGyJmW03swOp/0du9romr5jZ51N/R/aY2VNmVuh1TfMtp8LdzILAN4A7gXXAA2a2ztuqPBUD/sI5\ntxbYBHwmz38PgIeB/V4XsUD8A/Az59y7gEby9Hcxsxrgc0CLc66e5N37tnhb1fzLqXAHNgKHnHOH\nnXOjwNPAvR7X5BnnXLdzbmfq9QDJv7w13lblHTOrBT4I/IvXtXjNzBYBtwDfBnDOjTrnznhbladC\nQJGZhYBioMvjeuZdroV7DdCe9r6DPA6zdGa2CmgGXvO2Ek89BvxnIOF1IQvAdUAv8J3UYap/MbO8\nvGmsc64T+CpwDOgG+p1z/9Pl0Y5YAAABZ0lEQVTbquZfroW7ZWjL+9N9zKwUeBZ4xDl31ut6vGBm\nfwiccM7t8LqWBSIEbAC+6ZxrBgaBvByjMrNykv/CXw1UAyVm9glvq5p/uRbuHcCKtPe15ME/r6Zj\nZmGSwb7NOfec1/V4aDNwj5m1kTxc9x/M7L97W5KnOoAO59z4v+S2kwz7fHQ7cMQ51+ucGwOeA97j\ncU3zLtfC/XXgBjNbbWYRkoMiL3pck2fMzEgeU93vnHvU63q85Jz7L865WufcKpL/X/xv55zv986m\n4pzrAdrNbE2q6TZgn4cleekYsMnMilN/Z24jDwaXQ14XcDmcczEz+yzwc5Ij3k865/Z6XJaXNgOf\nBN40s12pti86517ysCZZOP4c2JbaEToMfMrjejzhnHvNzLYDO0meYfY78uBKVV2hKiLiQ7l2WEZE\nRGZB4S4i4kMKdxERH1K4i4j4kMJdRMSHFO4iIj6kcBcR8SGFu4iID/1/meyc1eBfKaAAAAAASUVO\nRK5CYII=\n", 360 | "text/plain": [ 361 | "" 362 | ] 363 | }, 364 | "metadata": {}, 365 | "output_type": "display_data" 366 | } 367 | ], 368 | "source": [ 369 | "plt.plot(tr_loss_history, label = 'train')\n", 370 | "plt.plot(val_loss_history, label = 'validation')\n", 371 | "plt.legend()" 372 | ] 373 | }, 374 | { 375 | "cell_type": "code", 376 | "execution_count": 10, 377 | "metadata": {}, 378 | "outputs": [], 379 | "source": [ 380 | "hat = mnist_classifier.predict(sess=sess, x_data=mnist.test.images, keep_prob=1.)" 381 | ] 382 | }, 383 | { 384 | "cell_type": "code", 385 | "execution_count": 11, 386 | "metadata": {}, 387 | "outputs": [ 388 | { 389 | "name": "stdout", 390 | "output_type": "stream", 391 | "text": [ 392 | "accuracy : 97.85%\n" 393 | ] 394 | } 395 | ], 396 | "source": [ 397 | "print('accuracy : {:.2%}'.format(np.mean(np.argmax(mnist.test.labels, axis = 1) == hat)))" 398 | ] 399 | } 400 | ], 401 | "metadata": { 402 | "kernelspec": { 403 | "display_name": "Python 3", 404 | "language": "python", 405 | "name": "python3" 406 | }, 407 | "language_info": { 408 | "codemirror_mode": { 409 | "name": "ipython", 410 | "version": 3 411 | }, 412 | "file_extension": ".py", 413 | "mimetype": "text/x-python", 414 | "name": "python", 415 | "nbconvert_exporter": "python", 416 | "pygments_lexer": "ipython3", 417 | "version": "3.6.3" 418 | }, 419 | "varInspector": { 420 | "cols": { 421 | "lenName": 16, 422 | "lenType": 16, 423 | "lenVar": 40 424 | }, 425 | "kernels_config": { 426 | "python": { 427 | "delete_cmd_postfix": "", 428 | "delete_cmd_prefix": "del ", 429 | "library": "var_list.py", 430 | "varRefreshCmd": "print(var_dic_list())" 431 | }, 432 | "r": { 433 | "delete_cmd_postfix": ") ", 434 | "delete_cmd_prefix": "rm(", 435 | "library": "var_list.r", 436 | "varRefreshCmd": "cat(var_dic_list()) " 437 | } 438 | }, 439 | "types_to_exclude": [ 440 | "module", 441 | "function", 442 | "builtin_function_or_method", 443 | "instance", 444 | "_Feature" 445 | ], 446 | "window_display": false 447 | } 448 | }, 449 | "nbformat": 4, 450 | "nbformat_minor": 2 451 | } 452 | -------------------------------------------------------------------------------- /Tutorial of implementing Sequence classification with RNN series.ipynb: 
-------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Tutorial of implementing Sequence classification with RNN series\n", 8 | "Sequence를 modeling 할 수 있는 RNN series에는 기본적인 Recurrent Neural Network, Long Short-Term Memory Units (LSTM), Gated Recurrent Unit (GRU) 등 여러 Cell을 활용할 수 있지만, ***이 Tutorial에서는 many to one의 예제로 영어단어의 알파벳을 하나씩보고 (영어단어의 길이는 모두 다르다.), 단어의 긍/부정을 예측하는 Character level RNN을 학습한다. cell은 GRU을 활용한다.*** 특히 알파벳을 벡터로 표현하는 방법은 아래의 논문에서 소개된 ***Character quantization*** 방법을 참고하였다. \n", 9 | "\n", 10 | "* Paper : https://papers.nips.cc/paper/5782-character-level-convolutional-networks-for-text-classification.pdf \n", 11 | "* Reference : https://github.com/golbin/TensorFlow-Tutorials/blob/master/10%20-%20RNN/02%20-%20Autocomplete.py" 12 | ] 13 | }, 14 | { 15 | "cell_type": "markdown", 16 | "metadata": {}, 17 | "source": [ 18 | "### Setup" 19 | ] 20 | }, 21 | { 22 | "cell_type": "code", 23 | "execution_count": 1, 24 | "metadata": {}, 25 | "outputs": [], 26 | "source": [ 27 | "import os, sys\n", 28 | "import numpy as np\n", 29 | "import pandas as pd\n", 30 | "import tensorflow as tf\n", 31 | "import matplotlib.pyplot as plt\n", 32 | "%matplotlib inline" 33 | ] 34 | }, 35 | { 36 | "cell_type": "markdown", 37 | "metadata": {}, 38 | "source": [ 39 | "### Charater quantization\n", 40 | "Paper에서는 영어로 쓰인 어떤 글이라 One hot vector의 모음으로 표현할 수 있도록, 아래와 같은 문자열들을 모두 one hot encoding에 포함시키지만 ***이 Tutorial에서는 영어 알파벳만 활용한다.***\n", 41 | "\n", 42 | "Paper : \"abcdefghijklmnopqrstuvwxyz0123456789-,;.!?:’’’/\\|_@#$%ˆ&*˜‘+-=<>()[]{}\"" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": 2, 48 | "metadata": {}, 49 | "outputs": [], 50 | "source": [ 51 | "char_arr = \"abcdefghijklmnopqrstuvwxyz \"" 52 | ] 53 | }, 54 | { 55 | "cell_type": "code", 56 | "execution_count": 3, 57 | "metadata": {}, 58 | "outputs": [], 59 | "source": [ 60 | "char_dic = {char : idx for idx, char in enumerate(char_arr)}\n", 61 | "dic_len = len(char_dic)" 62 | ] 63 | }, 64 | { 65 | "cell_type": "code", 66 | "execution_count": 4, 67 | "metadata": {}, 68 | "outputs": [ 69 | { 70 | "data": { 71 | "text/plain": [ 72 | "{' ': 26,\n", 73 | " 'a': 0,\n", 74 | " 'b': 1,\n", 75 | " 'c': 2,\n", 76 | " 'd': 3,\n", 77 | " 'e': 4,\n", 78 | " 'f': 5,\n", 79 | " 'g': 6,\n", 80 | " 'h': 7,\n", 81 | " 'i': 8,\n", 82 | " 'j': 9,\n", 83 | " 'k': 10,\n", 84 | " 'l': 11,\n", 85 | " 'm': 12,\n", 86 | " 'n': 13,\n", 87 | " 'o': 14,\n", 88 | " 'p': 15,\n", 89 | " 'q': 16,\n", 90 | " 'r': 17,\n", 91 | " 's': 18,\n", 92 | " 't': 19,\n", 93 | " 'u': 20,\n", 94 | " 'v': 21,\n", 95 | " 'w': 22,\n", 96 | " 'x': 23,\n", 97 | " 'y': 24,\n", 98 | " 'z': 25}" 99 | ] 100 | }, 101 | "execution_count": 4, 102 | "metadata": {}, 103 | "output_type": "execute_result" 104 | } 105 | ], 106 | "source": [ 107 | "# char_dic을 이용해서 영어알파벳을 27차원의 one hot vector로 만들 수 있다.\n", 108 | "char_dic" 109 | ] 110 | }, 111 | { 112 | "cell_type": "code", 113 | "execution_count": 5, 114 | "metadata": {}, 115 | "outputs": [ 116 | { 117 | "data": { 118 | "text/plain": [ 119 | "dict_keys(['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z', ' '])" 120 | ] 121 | }, 122 | "execution_count": 5, 123 | "metadata": {}, 124 | "output_type": "execute_result" 125 | } 126 | ], 127 | "source": [ 128 | "char_dic.keys()" 129 | ] 130 | }, 131 | { 132 | "cell_type": "markdown", 133 | "metadata": {}, 
134 | "source": [ 135 | "### Define user function\n", 136 | "***GRU의 input을 생성하는 make_batch function***을 작성한다. max_len 이라는 argument가 존재하는 이유는 RNN은 실제로 variable length sequence를 처리할 수 있지만, Tensorflow에서는 RNN의 input의 sequence는 고정된 길이로 주어야하기 때문에, 일단 max_len의 길이만큼 padding을 한다. 코드로 확인!" 137 | ] 138 | }, 139 | { 140 | "cell_type": "code", 141 | "execution_count": 6, 142 | "metadata": {}, 143 | "outputs": [], 144 | "source": [ 145 | "def make_batch(seq_data, max_len):\n", 146 | " seq_len = []\n", 147 | " seq_batch = []\n", 148 | " for seq in seq_data:\n", 149 | " seq_len.append(len(seq))\n", 150 | " seq_idx = [char_dic.get(char) for char in seq]\n", 151 | " seq_idx += (max_len - len(seq_idx)) * [0]\n", 152 | " seq_matrix = np.eye(dic_len)[seq_idx].tolist()\n", 153 | " seq_batch.append(seq_matrix) \n", 154 | " return seq_len, seq_batch" 155 | ] 156 | }, 157 | { 158 | "cell_type": "code", 159 | "execution_count": 7, 160 | "metadata": {}, 161 | "outputs": [ 162 | { 163 | "name": "stdout", 164 | "output_type": "stream", 165 | "text": [ 166 | "length : [4], \n", 167 | " batch : [[[ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 168 | " 0. 1. 0. 0. 0. 0. 0. 0. 0.]\n", 169 | " [ 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 170 | " 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 171 | " [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 172 | " 1. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 173 | " [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 174 | " 0. 1. 0. 0. 0. 0. 0. 0. 0.]]]\n" 175 | ] 176 | } 177 | ], 178 | "source": [ 179 | "# padding을 안했을 시\n", 180 | "test_len, test_batch = make_batch(['test'], len('test'))\n", 181 | "print('length : {}, \\n batch : {}'.format(test_len, np.array(test_batch)))" 182 | ] 183 | }, 184 | { 185 | "cell_type": "markdown", 186 | "metadata": {}, 187 | "source": [ 188 | "최대 길이를 10으로하면 아래와 같이 길이가 4인 \"test\"라는 단어에 대해서 우리가 생각한 것과 batch가 다르게 생성되지만, ***해당 문자열의 실제 길이를 저장한 후, tf.placehoder를 이용해서 tf.nn.dynamic_rnn에 sequence_length argument에 값을 전달하여, variable sequence length를 처리할 수 있다.***" 189 | ] 190 | }, 191 | { 192 | "cell_type": "code", 193 | "execution_count": 8, 194 | "metadata": {}, 195 | "outputs": [ 196 | { 197 | "name": "stdout", 198 | "output_type": "stream", 199 | "text": [ 200 | "length : [4], \n", 201 | " batch : [[[ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 202 | " 0. 1. 0. 0. 0. 0. 0. 0. 0.]\n", 203 | " [ 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 204 | " 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 205 | " [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 206 | " 1. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 207 | " [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 208 | " 0. 1. 0. 0. 0. 0. 0. 0. 0.]\n", 209 | " [ 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 210 | " 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 211 | " [ 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 212 | " 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 213 | " [ 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 214 | " 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 215 | " [ 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 216 | " 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 217 | " [ 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 218 | " 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 219 | " [ 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n", 220 | " 0. 0. 0. 0. 0. 0. 0. 0. 
0.]]]\n" 221 | ] 222 | } 223 | ], 224 | "source": [ 225 | "# max_len (최대길이)를 10으로 padding을 했을 시 \n", 226 | "test_len, test_batch = make_batch(['test'], 10)\n", 227 | "print('length : {}, \\n batch : {}'.format(test_len, np.array(test_batch)))" 228 | ] 229 | }, 230 | { 231 | "cell_type": "markdown", 232 | "metadata": {}, 233 | "source": [ 234 | "### Define CharRNN class\n", 235 | "가변길이의 영어단어가 긍정 단어인지 부정단어인지 예측하는 CharRNN 모형을 정의" 236 | ] 237 | }, 238 | { 239 | "cell_type": "code", 240 | "execution_count": 9, 241 | "metadata": {}, 242 | "outputs": [], 243 | "source": [ 244 | "class CharRNN():\n", 245 | " def __init__(self, n_label, max_len, input_dim, hidden_dim):\n", 246 | " with tf.variable_scope('input_layer'):\n", 247 | " self._x_len = tf.placeholder(dtype = tf.int32)\n", 248 | " self._x_batch = tf.placeholder(dtype = tf.float32, shape = [None, max_len, input_dim])\n", 249 | " self._y = tf.placeholder(dtype = tf.float32, shape = [None, n_label])\n", 250 | " \n", 251 | " with tf.variable_scope('gru_cell'):\n", 252 | " cell = tf.contrib.rnn.GRUCell(num_units = hidden_dim)\n", 253 | " _, self._hidden = tf.nn.dynamic_rnn(cell = cell, inputs = self._x_batch,\n", 254 | " sequence_length = self._x_len, dtype = tf.float32)\n", 255 | "\n", 256 | " with tf.variable_scope('output_layer'):\n", 257 | " self._score = tf.layers.dense(self._hidden, units = n_label)\n", 258 | " \n", 259 | " with tf.variable_scope('loss'):\n", 260 | " self._ce_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits = self._score, labels = self._y))\n", 261 | " \n", 262 | " with tf.variable_scope('predict'):\n", 263 | " self._prediction = tf.argmax(input = self._score, axis = 1)\n", 264 | " \n", 265 | " def predict(self, sess, seq_len, seq_batch):\n", 266 | " feed_predict = {self._x_len : seq_len, self._x_batch : seq_batch}\n", 267 | " return sess.run(fetches = self._prediction, feed_dict = feed_predict)\n", 268 | " \n", 269 | " def encode(self, sess, seq_len, seq_batch):\n", 270 | " feed_encode = {self._x_len : seq_len, self._x_batch : seq_batch}\n", 271 | " return sess.run(fetches = self._hidden, feed_dict = feed_encode) " 272 | ] 273 | }, 274 | { 275 | "cell_type": "markdown", 276 | "metadata": {}, 277 | "source": [ 278 | "### Define Solver class" 279 | ] 280 | }, 281 | { 282 | "cell_type": "code", 283 | "execution_count": 10, 284 | "metadata": {}, 285 | "outputs": [], 286 | "source": [ 287 | "class Solver:\n", 288 | " def __init__(self, model, optimizer = tf.train.AdamOptimizer, var_list = None):\n", 289 | " self._model = model\n", 290 | " self._lr = tf.placeholder(dtype = tf.float32)\n", 291 | " self._optimizer = optimizer(learning_rate = self._lr)\n", 292 | " self._training_op = self._optimizer.minimize(loss = self._model._ce_loss, var_list = var_list)\n", 293 | " \n", 294 | " def train(self, sess, seq_len, seq_batch, y_data, lr):\n", 295 | " feed_train = {self._model._x_len : seq_len, self._model._x_batch : seq_batch,\n", 296 | " self._model._y : y_data, self._lr : lr}\n", 297 | " return sess.run(fetches = [self._training_op, self._model._ce_loss], feed_dict = feed_train)\n", 298 | " \n", 299 | " def evaluate(self, sess, seq_len, seq_batch, y_data):\n", 300 | " feed_loss = {self._model._x_len : seq_len, self._model._x_batch : seq_batch, self._model._y : y_data}\n", 301 | " return sess.run(fetches = self._model._ce_loss, feed_dict = feed_loss)" 302 | ] 303 | }, 304 | { 305 | "cell_type": "markdown", 306 | "metadata": {}, 307 | "source": [ 308 | "### Example : word sentiment classification" 309 | ] 310 | }, 
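Before moving on to the word-sentiment example, the claim made above — that `tf.nn.dynamic_rnn` skips the padded time steps once the true lengths are passed via `sequence_length` — can be checked directly. The cell below is a minimal sketch, not part of the original notebook: it builds a throwaway GRU in its own graph, runs the padded "test" batch through it twice (once with its real length 4, once as if all 10 steps were valid), and confirms the final states differ. It assumes the TF 1.x API used throughout this notebook and the `make_batch`/`dic_len` definitions above; the `demo_*` and `toy_gru` names are hypothetical.

```python
import numpy as np
import tensorflow as tf

demo_graph = tf.Graph()                      # separate graph, so the notebook's default graph stays untouched
with demo_graph.as_default():
    demo_x = tf.placeholder(tf.float32, shape=[None, 10, dic_len])   # padded one-hot batch
    demo_len = tf.placeholder(tf.int32, shape=[None])                # true sequence lengths
    toy_gru = tf.contrib.rnn.GRUCell(num_units=4)
    _, demo_state = tf.nn.dynamic_rnn(cell=toy_gru, inputs=demo_x,
                                      sequence_length=demo_len, dtype=tf.float32)
    demo_init = tf.global_variables_initializer()

test_len, test_batch = make_batch(['test'], 10)    # test_len == [4], batch shape (1, 10, 27)

with tf.Session(graph=demo_graph) as demo_sess:
    demo_sess.run(demo_init)
    state_true_len = demo_sess.run(demo_state, {demo_x: test_batch, demo_len: test_len})
    state_full_len = demo_sess.run(demo_state, {demo_x: test_batch, demo_len: [10]})
    # with sequence_length=[4] the six padded steps are ignored, so the two
    # final states should (almost surely) differ
    print(np.allclose(state_true_len, state_full_len))
```

With `sequence_length` supplied, the state returned for each example is the state at its last valid step; that is exactly what the `CharRNN` class defined above relies on when it feeds `self._hidden` into the output layer.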
311 | { 312 | "cell_type": "code", 313 | "execution_count": 11, 314 | "metadata": {}, 315 | "outputs": [], 316 | "source": [ 317 | "y_data = [[1.,0.], [0.,1.], [1.,0.], [1., 0.],[0.,1.]]\n", 318 | "words = ['good', 'bad', 'amazing', 'so good', 'bull shit']" 319 | ] 320 | }, 321 | { 322 | "cell_type": "code", 323 | "execution_count": 12, 324 | "metadata": {}, 325 | "outputs": [], 326 | "source": [ 327 | "# words를 CharRNN이 받을 수 있는 데이터로 변환\n", 328 | "words_len, word_batch = make_batch(words, max_len = 10)" 329 | ] 330 | }, 331 | { 332 | "cell_type": "code", 333 | "execution_count": 13, 334 | "metadata": {}, 335 | "outputs": [ 336 | { 337 | "data": { 338 | "text/plain": [ 339 | "[4, 3, 7, 7, 9]" 340 | ] 341 | }, 342 | "execution_count": 13, 343 | "metadata": {}, 344 | "output_type": "execute_result" 345 | } 346 | ], 347 | "source": [ 348 | "words_len" 349 | ] 350 | }, 351 | { 352 | "cell_type": "code", 353 | "execution_count": 14, 354 | "metadata": {}, 355 | "outputs": [ 356 | { 357 | "data": { 358 | "text/plain": [ 359 | "(5, 10, 27)" 360 | ] 361 | }, 362 | "execution_count": 14, 363 | "metadata": {}, 364 | "output_type": "execute_result" 365 | } 366 | ], 367 | "source": [ 368 | "np.shape(word_batch)" 369 | ] 370 | }, 371 | { 372 | "cell_type": "markdown", 373 | "metadata": {}, 374 | "source": [ 375 | "### Training" 376 | ] 377 | }, 378 | { 379 | "cell_type": "code", 380 | "execution_count": 15, 381 | "metadata": {}, 382 | "outputs": [], 383 | "source": [ 384 | "sess = tf.Session()\n", 385 | "char_gru = CharRNN(n_label = 2, max_len = 10, input_dim = len(char_dic), hidden_dim = 20)\n", 386 | "adam_solver = Solver(model = char_gru)" 387 | ] 388 | }, 389 | { 390 | "cell_type": "code", 391 | "execution_count": 16, 392 | "metadata": {}, 393 | "outputs": [], 394 | "source": [ 395 | "sess.run(tf.global_variables_initializer())" 396 | ] 397 | }, 398 | { 399 | "cell_type": "code", 400 | "execution_count": 17, 401 | "metadata": {}, 402 | "outputs": [], 403 | "source": [ 404 | "# hyper parameters\n", 405 | "n_epochs = 20\n", 406 | "loss_history = []" 407 | ] 408 | }, 409 | { 410 | "cell_type": "code", 411 | "execution_count": 18, 412 | "metadata": {}, 413 | "outputs": [ 414 | { 415 | "name": "stdout", 416 | "output_type": "stream", 417 | "text": [ 418 | "epochs : 0, tr_loss : 0.648\n", 419 | "epochs : 1, tr_loss : 0.598\n", 420 | "epochs : 2, tr_loss : 0.553\n", 421 | "epochs : 3, tr_loss : 0.513\n", 422 | "epochs : 4, tr_loss : 0.475\n", 423 | "epochs : 5, tr_loss : 0.438\n", 424 | "epochs : 6, tr_loss : 0.400\n", 425 | "epochs : 7, tr_loss : 0.360\n", 426 | "epochs : 8, tr_loss : 0.317\n", 427 | "epochs : 9, tr_loss : 0.274\n", 428 | "epochs : 10, tr_loss : 0.233\n", 429 | "epochs : 11, tr_loss : 0.194\n", 430 | "epochs : 12, tr_loss : 0.159\n", 431 | "epochs : 13, tr_loss : 0.128\n", 432 | "epochs : 14, tr_loss : 0.101\n", 433 | "epochs : 15, tr_loss : 0.078\n", 434 | "epochs : 16, tr_loss : 0.058\n", 435 | "epochs : 17, tr_loss : 0.043\n", 436 | "epochs : 18, tr_loss : 0.031\n", 437 | "epochs : 19, tr_loss : 0.022\n" 438 | ] 439 | } 440 | ], 441 | "source": [ 442 | "for epoch in range(n_epochs):\n", 443 | " _, tr_loss = adam_solver.train(sess = sess, seq_len = words_len, seq_batch = word_batch, y_data = y_data, lr = 1e-2)\n", 444 | " print('epochs : {:3}, tr_loss : {:.3f}'.format(epoch, tr_loss))\n", 445 | " loss_history.append(tr_loss) " 446 | ] 447 | }, 448 | { 449 | "cell_type": "code", 450 | "execution_count": 19, 451 | "metadata": {}, 452 | "outputs": [ 453 | { 454 | "name": "stdout", 455 | 
"output_type": "stream", 456 | "text": [ 457 | "Accuracy : 100.00%\n" 458 | ] 459 | } 460 | ], 461 | "source": [ 462 | "yhat = char_gru.predict(sess = sess, seq_len = words_len, seq_batch = word_batch)\n", 463 | "print('Accuracy : {:.2%}'.format(np.mean(np.argmax(y_data, axis = 1) == yhat)))" 464 | ] 465 | }, 466 | { 467 | "cell_type": "markdown", 468 | "metadata": {}, 469 | "source": [ 470 | "### Analyze embeddings of words\n", 471 | "영어단어를 왼쪽부터 오른쪽으로 알파벳 한글자씩보고, 긍/부정을 예측하는 모형을 CharRNN으로 학습한 후, 이 때 특정 영어단어를 넣었을 때의 hidden state를 비교하면 아래와 같은 결과를 얻을 수 있음" 472 | ] 473 | }, 474 | { 475 | "cell_type": "code", 476 | "execution_count": 20, 477 | "metadata": {}, 478 | "outputs": [ 479 | { 480 | "data": { 481 | "text/plain": [ 482 | "['good', 'bad', 'amazing', 'so good', 'bull shit']" 483 | ] 484 | }, 485 | "execution_count": 20, 486 | "metadata": {}, 487 | "output_type": "execute_result" 488 | } 489 | ], 490 | "source": [ 491 | "words" 492 | ] 493 | }, 494 | { 495 | "cell_type": "code", 496 | "execution_count": 21, 497 | "metadata": {}, 498 | "outputs": [], 499 | "source": [ 500 | "embedding_words = char_gru.encode(sess = sess, seq_len = words_len, seq_batch = word_batch)\n", 501 | "embedding = pd.DataFrame(embedding_words, index = words)" 502 | ] 503 | }, 504 | { 505 | "cell_type": "code", 506 | "execution_count": 22, 507 | "metadata": {}, 508 | "outputs": [ 509 | { 510 | "data": { 511 | "text/html": [ 512 | "
\n", 513 | "\n", 526 | "\n", 527 | " \n", 528 | " \n", 529 | " \n", 530 | " \n", 531 | " \n", 532 | " \n", 533 | " \n", 534 | " \n", 535 | " \n", 536 | " \n", 537 | " \n", 538 | " \n", 539 | " \n", 540 | " \n", 541 | " \n", 542 | " \n", 543 | " \n", 544 | " \n", 545 | " \n", 546 | " \n", 547 | " \n", 548 | " \n", 549 | " \n", 550 | " \n", 551 | " \n", 552 | " \n", 553 | " \n", 554 | " \n", 555 | " \n", 556 | " \n", 557 | " \n", 558 | " \n", 559 | " \n", 560 | " \n", 561 | " \n", 562 | " \n", 563 | " \n", 564 | " \n", 565 | " \n", 566 | " \n", 567 | " \n", 568 | " \n", 569 | " \n", 570 | " \n", 571 | " \n", 572 | " \n", 573 | " \n", 574 | " \n", 575 | " \n", 576 | " \n", 577 | " \n", 578 | " \n", 579 | " \n", 580 | " \n", 581 | " \n", 582 | " \n", 583 | " \n", 584 | " \n", 585 | " \n", 586 | " \n", 587 | " \n", 588 | " \n", 589 | " \n", 590 | " \n", 591 | " \n", 592 | " \n", 593 | " \n", 594 | " \n", 595 | " \n", 596 | " \n", 597 | " \n", 598 | " \n", 599 | " \n", 600 | " \n", 601 | " \n", 602 | " \n", 603 | " \n", 604 | " \n", 605 | " \n", 606 | " \n", 607 | " \n", 608 | " \n", 609 | " \n", 610 | " \n", 611 | " \n", 612 | " \n", 613 | " \n", 614 | " \n", 615 | " \n", 616 | " \n", 617 | " \n", 618 | " \n", 619 | " \n", 620 | " \n", 621 | " \n", 622 | " \n", 623 | " \n", 624 | " \n", 625 | " \n", 626 | " \n", 627 | " \n", 628 | " \n", 629 | " \n", 630 | " \n", 631 | " \n", 632 | " \n", 633 | " \n", 634 | " \n", 635 | " \n", 636 | " \n", 637 | " \n", 638 | " \n", 639 | " \n", 640 | " \n", 641 | " \n", 642 | " \n", 643 | " \n", 644 | " \n", 645 | " \n", 646 | " \n", 647 | " \n", 648 | " \n", 649 | " \n", 650 | " \n", 651 | " \n", 652 | " \n", 653 | " \n", 654 | " \n", 655 | " \n", 656 | " \n", 657 | " \n", 658 | " \n", 659 | " \n", 660 | " \n", 661 | " \n", 662 | " \n", 663 | " \n", 664 | " \n", 665 | " \n", 666 | " \n", 667 | " \n", 668 | " \n", 669 | " \n", 670 | " \n", 671 | " \n", 672 | " \n", 673 | " \n", 674 | " \n", 675 | " \n", 676 | " \n", 677 | " \n", 678 | " \n", 679 | " \n", 680 | " \n", 681 | " \n", 682 | " \n", 683 | " \n", 684 | " \n", 685 | " \n", 686 | " \n", 687 | " \n", 688 | " \n", 689 | " \n", 690 | " \n", 691 | " \n", 692 | " \n", 693 | " \n", 694 | " \n", 695 | " \n", 696 | " \n", 697 | " \n", 698 | " \n", 699 | "
goodbadamazingso goodbull shit
0-0.0906200.3832850.357408-0.1265030.689105
1-0.1429470.140611-0.120752-0.1743740.740000
20.481865-0.2019770.5699660.677262-0.394927
3-0.4901650.012184-0.646111-0.6651530.374838
40.408214-0.0817100.6622390.600297-0.258617
50.552937-0.1567940.6720200.731237-0.695990
60.251143-0.2427710.5637000.399336-0.654192
70.167962-0.3332500.3833780.284998-0.732360
80.355938-0.1661410.6711520.543169-0.490243
9-0.2683230.256304-0.575321-0.4632830.471359
100.0019020.3064570.1308830.0841070.680491
110.572170-0.2628170.5959680.769292-0.515272
120.308290-0.1597600.4712110.472356-0.558880
130.225386-0.004079-0.0744010.241343-0.447123
14-0.1067480.2597580.296636-0.0794790.769076
150.499670-0.1623870.7520820.692628-0.520880
160.583276-0.1291840.7443280.774138-0.563597
17-0.1238470.036050-0.534177-0.2530040.099086
180.312512-0.1584620.4911210.459040-0.522646
19-0.6273920.175276-0.503636-0.7853350.522767
\n", 700 | "
" 701 | ], 702 | "text/plain": [ 703 | " good bad amazing so good bull shit\n", 704 | "0 -0.090620 0.383285 0.357408 -0.126503 0.689105\n", 705 | "1 -0.142947 0.140611 -0.120752 -0.174374 0.740000\n", 706 | "2 0.481865 -0.201977 0.569966 0.677262 -0.394927\n", 707 | "3 -0.490165 0.012184 -0.646111 -0.665153 0.374838\n", 708 | "4 0.408214 -0.081710 0.662239 0.600297 -0.258617\n", 709 | "5 0.552937 -0.156794 0.672020 0.731237 -0.695990\n", 710 | "6 0.251143 -0.242771 0.563700 0.399336 -0.654192\n", 711 | "7 0.167962 -0.333250 0.383378 0.284998 -0.732360\n", 712 | "8 0.355938 -0.166141 0.671152 0.543169 -0.490243\n", 713 | "9 -0.268323 0.256304 -0.575321 -0.463283 0.471359\n", 714 | "10 0.001902 0.306457 0.130883 0.084107 0.680491\n", 715 | "11 0.572170 -0.262817 0.595968 0.769292 -0.515272\n", 716 | "12 0.308290 -0.159760 0.471211 0.472356 -0.558880\n", 717 | "13 0.225386 -0.004079 -0.074401 0.241343 -0.447123\n", 718 | "14 -0.106748 0.259758 0.296636 -0.079479 0.769076\n", 719 | "15 0.499670 -0.162387 0.752082 0.692628 -0.520880\n", 720 | "16 0.583276 -0.129184 0.744328 0.774138 -0.563597\n", 721 | "17 -0.123847 0.036050 -0.534177 -0.253004 0.099086\n", 722 | "18 0.312512 -0.158462 0.491121 0.459040 -0.522646\n", 723 | "19 -0.627392 0.175276 -0.503636 -0.785335 0.522767" 724 | ] 725 | }, 726 | "execution_count": 22, 727 | "metadata": {}, 728 | "output_type": "execute_result" 729 | } 730 | ], 731 | "source": [ 732 | "# 각 단어의 20차원 embedding\n", 733 | "embedding.T" 734 | ] 735 | }, 736 | { 737 | "cell_type": "markdown", 738 | "metadata": {}, 739 | "source": [ 740 | "embedding된 각 단어간의 euclidean distance를 계산해보면,***같은 범주 (긍정 또는 부정)에 속한 단어들 끼리 가깝고, 서로 다른 범주에 속한 단어들 끼리는 상대적으로 먼것을 확인 가능***" 741 | ] 742 | }, 743 | { 744 | "cell_type": "code", 745 | "execution_count": 23, 746 | "metadata": {}, 747 | "outputs": [ 748 | { 749 | "data": { 750 | "text/html": [ 751 | "
\n", 752 | "\n", 765 | "\n", 766 | " \n", 767 | " \n", 768 | " \n", 769 | " \n", 770 | " \n", 771 | " \n", 772 | " \n", 773 | " \n", 774 | " \n", 775 | " \n", 776 | " \n", 777 | " \n", 778 | " \n", 779 | " \n", 780 | " \n", 781 | " \n", 782 | " \n", 783 | " \n", 784 | " \n", 785 | " \n", 786 | " \n", 787 | " \n", 788 | " \n", 789 | " \n", 790 | " \n", 791 | " \n", 792 | " \n", 793 | " \n", 794 | " \n", 795 | " \n", 796 | " \n", 797 | " \n", 798 | " \n", 799 | " \n", 800 | " \n", 801 | " \n", 802 | " \n", 803 | " \n", 804 | " \n", 805 | " \n", 806 | " \n", 807 | " \n", 808 | " \n", 809 | " \n", 810 | " \n", 811 | " \n", 812 | " \n", 813 | " \n", 814 | " \n", 815 | " \n", 816 | " \n", 817 | " \n", 818 | "
goodbadamazingso goodbull shit
good0.0000002.4177021.1186110.6773303.980407
bad2.4177020.0003452.9791473.0470301.678281
amazing1.1186112.9791470.0000000.8712894.414713
so good0.6773303.0470300.8712890.0009774.573600
bull shit3.9804071.6782814.4147134.5736000.000000
\n", 819 | "
" 820 | ], 821 | "text/plain": [ 822 | " good bad amazing so good bull shit\n", 823 | "good 0.000000 2.417702 1.118611 0.677330 3.980407\n", 824 | "bad 2.417702 0.000345 2.979147 3.047030 1.678281\n", 825 | "amazing 1.118611 2.979147 0.000000 0.871289 4.414713\n", 826 | "so good 0.677330 3.047030 0.871289 0.000977 4.573600\n", 827 | "bull shit 3.980407 1.678281 4.414713 4.573600 0.000000" 828 | ] 829 | }, 830 | "execution_count": 23, 831 | "metadata": {}, 832 | "output_type": "execute_result" 833 | } 834 | ], 835 | "source": [ 836 | "from sklearn.metrics.pairwise import pairwise_distances\n", 837 | "pd.DataFrame(pairwise_distances(X = embedding), index = words, columns= words)" 838 | ] 839 | }, 840 | { 841 | "cell_type": "markdown", 842 | "metadata": {}, 843 | "source": [ 844 | "### Encode unseen words\n", 845 | "Input을 Character 단위로 받으므로 학습에 이용되지않은 영어단어에 대해서도 embedding을 구할 수 있으며, 긍/부정을 판단할 수 있음" 846 | ] 847 | }, 848 | { 849 | "cell_type": "code", 850 | "execution_count": 24, 851 | "metadata": {}, 852 | "outputs": [], 853 | "source": [ 854 | "unseen = ['great']" 855 | ] 856 | }, 857 | { 858 | "cell_type": "code", 859 | "execution_count": 25, 860 | "metadata": {}, 861 | "outputs": [], 862 | "source": [ 863 | "unseen_len, unseen_batch = make_batch(unseen, 10)\n", 864 | "unseen_vector = char_gru.encode(sess = sess, seq_len = unseen_len, seq_batch = unseen_batch)" 865 | ] 866 | }, 867 | { 868 | "cell_type": "code", 869 | "execution_count": 26, 870 | "metadata": {}, 871 | "outputs": [ 872 | { 873 | "data": { 874 | "text/plain": [ 875 | "array([[ 0.36647743, -0.01373214, 0.10402276, -0.1385455 , -0.00311775,\n", 876 | " 0.09648833, -0.14816383, -0.11707296, 0.2829062 , -0.02193693,\n", 877 | " 0.10115166, 0.08888885, 0.00755516, -0.02079342, 0.18498814,\n", 878 | " 0.04492051, 0.04183176, -0.15769707, -0.01859909, 0.01101905]], dtype=float32)" 879 | ] 880 | }, 881 | "execution_count": 26, 882 | "metadata": {}, 883 | "output_type": "execute_result" 884 | } 885 | ], 886 | "source": [ 887 | "unseen_vector" 888 | ] 889 | }, 890 | { 891 | "cell_type": "code", 892 | "execution_count": 27, 893 | "metadata": {}, 894 | "outputs": [ 895 | { 896 | "data": { 897 | "text/plain": [ 898 | "array([0], dtype=int64)" 899 | ] 900 | }, 901 | "execution_count": 27, 902 | "metadata": {}, 903 | "output_type": "execute_result" 904 | } 905 | ], 906 | "source": [ 907 | "char_gru.predict(sess = sess, seq_len = unseen_len, seq_batch = unseen_batch)" 908 | ] 909 | } 910 | ], 911 | "metadata": { 912 | "kernelspec": { 913 | "display_name": "Python 3", 914 | "language": "python", 915 | "name": "python3" 916 | }, 917 | "language_info": { 918 | "codemirror_mode": { 919 | "name": "ipython", 920 | "version": 3 921 | }, 922 | "file_extension": ".py", 923 | "mimetype": "text/x-python", 924 | "name": "python", 925 | "nbconvert_exporter": "python", 926 | "pygments_lexer": "ipython3", 927 | "version": "3.6.3" 928 | }, 929 | "varInspector": { 930 | "cols": { 931 | "lenName": 16, 932 | "lenType": 16, 933 | "lenVar": 40 934 | }, 935 | "kernels_config": { 936 | "python": { 937 | "delete_cmd_postfix": "", 938 | "delete_cmd_prefix": "del ", 939 | "library": "var_list.py", 940 | "varRefreshCmd": "print(var_dic_list())" 941 | }, 942 | "r": { 943 | "delete_cmd_postfix": ") ", 944 | "delete_cmd_prefix": "rm(", 945 | "library": "var_list.r", 946 | "varRefreshCmd": "cat(var_dic_list()) " 947 | } 948 | }, 949 | "types_to_exclude": [ 950 | "module", 951 | "function", 952 | "builtin_function_or_method", 953 | "instance", 954 | "_Feature" 955 | ], 956 
| "window_display": false 957 | } 958 | }, 959 | "nbformat": 4, 960 | "nbformat_minor": 2 961 | } 962 | -------------------------------------------------------------------------------- /Tutorial of implementing Word2Vec.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Tutorial of implementing Word2Vec\n", 8 | "네이버 [김진중](https://github.com/golbin)님의 저서 **골빈해커의 3분 딥러닝 텐서플로맛**의 Word2Vec 예제코드를 OOP Style로 refactoring한 example\n", 9 | " \n", 10 | "* Paper \n", 11 | " * [Efficient Estimation of Word Representations in Vector Space](https://arxiv.org/pdf/1301.3781.pdf)\n", 12 | " * [Distributed Representations of Words and Phrases\n", 13 | "and their Compositionality](https://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf)\n", 14 | "* Reference\n", 15 | " * https://github.com/golbin/TensorFlow-Tutorials/blob/master/04%20-%20Neural%20Network%20Basic/03%20-%20Word2Vec.py\n", 16 | " * https://github.com/sjchoi86/Tensorflow-101/blob/master/notebooks/word2vec_simple.ipynb" 17 | ] 18 | }, 19 | { 20 | "cell_type": "markdown", 21 | "metadata": {}, 22 | "source": [ 23 | "### Setup" 24 | ] 25 | }, 26 | { 27 | "cell_type": "code", 28 | "execution_count": 1, 29 | "metadata": {}, 30 | "outputs": [ 31 | { 32 | "name": "stderr", 33 | "output_type": "stream", 34 | "text": [ 35 | "c:\\python36\\lib\\site-packages\\h5py\\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n", 36 | " from ._conv import register_converters as _register_converters\n" 37 | ] 38 | }, 39 | { 40 | "name": "stdout", 41 | "output_type": "stream", 42 | "text": [ 43 | "1.6.0\n" 44 | ] 45 | } 46 | ], 47 | "source": [ 48 | "import os, sys\n", 49 | "import tensorflow as tf\n", 50 | "import numpy as np\n", 51 | "import matplotlib\n", 52 | "import matplotlib.pyplot as plt\n", 53 | "from pprint import pprint\n", 54 | "%matplotlib inline\n", 55 | "\n", 56 | "# matplot에서 한글을 표시하기위한 설정\n", 57 | "font_name = matplotlib.font_manager.FontProperties(fname = 'C:/Windows/Fonts/malgun.ttf')\n", 58 | "matplotlib.rc('font', family = font_name.get_name())\n", 59 | "print(tf.__version__)\n", 60 | "np.random.seed(777)\n", 61 | "slim = tf.contrib.slim" 62 | ] 63 | }, 64 | { 65 | "cell_type": "markdown", 66 | "metadata": {}, 67 | "source": [ 68 | "### Prepare sentences and Preprocess" 69 | ] 70 | }, 71 | { 72 | "cell_type": "code", 73 | "execution_count": 2, 74 | "metadata": {}, 75 | "outputs": [], 76 | "source": [ 77 | "# 단어 벡터를 분석해볼 임의의 문장들\n", 78 | "sentences = [\"나 고양이 좋다\",\n", 79 | " \"나 강아지 좋다\",\n", 80 | " \"나 동물 좋다\",\n", 81 | " \"강아지 고양이 동물\",\n", 82 | " \"여자친구 고양이 강아지 좋다\",\n", 83 | " \"고양이 생선 우유 좋다\",\n", 84 | " \"강아지 생선 싫다 우유 좋다\",\n", 85 | " \"강아지 고양이 눈 좋다\",\n", 86 | " \"나 여자친구 좋다\",\n", 87 | " \"여자친구 나 싫다\",\n", 88 | " \"여자친구 나 영화 책 음악 좋다\",\n", 89 | " \"나 게임 만화 애니 좋다\",\n", 90 | " \"고양이 강아지 싫다\",\n", 91 | " \"강아지 고양이 좋다\"]" 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": 3, 97 | "metadata": {}, 98 | "outputs": [ 99 | { 100 | "name": "stdout", 101 | "output_type": "stream", 102 | "text": [ 103 | "['나', '고양이', '좋다', '나', '강아지', '좋다', '나', '동물', '좋다', '강아지', '고양이', '동물', '여자친구', '고양이', '강아지', '좋다', '고양이', '생선', '우유', '좋다', '강아지', '생선', '싫다', '우유', '좋다', '강아지', '고양이', '눈', '좋다', '나', '여자친구', 
'좋다', '여자친구', '나', '싫다', '여자친구', '나', '영화', '책', '음악', '좋다', '나', '게임', '만화', '애니', '좋다', '고양이', '강아지', '싫다', '강아지', '고양이', '좋다']\n", 104 | "['게임', '음악', '애니', '고양이', '책', '좋다', '우유', '눈', '만화', '생선', '동물', '싫다', '강아지', '영화', '나', '여자친구']\n" 105 | ] 106 | } 107 | ], 108 | "source": [ 109 | "# 문장을 전부 합친 후, 공백으로 단어들을 나누고 고유한 단어들로 리스트를 만듭니다.\n", 110 | "word_sequence = ' '.join(sentences).split()\n", 111 | "word_list = list(set(word_sequence))\n", 112 | "print(word_sequence)\n", 113 | "print(word_list)" 114 | ] 115 | }, 116 | { 117 | "cell_type": "code", 118 | "execution_count": 4, 119 | "metadata": {}, 120 | "outputs": [ 121 | { 122 | "name": "stdout", 123 | "output_type": "stream", 124 | "text": [ 125 | "{'강아지': 12,\n", 126 | " '게임': 0,\n", 127 | " '고양이': 3,\n", 128 | " '나': 14,\n", 129 | " '눈': 7,\n", 130 | " '동물': 10,\n", 131 | " '만화': 8,\n", 132 | " '생선': 9,\n", 133 | " '싫다': 11,\n", 134 | " '애니': 2,\n", 135 | " '여자친구': 15,\n", 136 | " '영화': 13,\n", 137 | " '우유': 6,\n", 138 | " '음악': 1,\n", 139 | " '좋다': 5,\n", 140 | " '책': 4}\n" 141 | ] 142 | } 143 | ], 144 | "source": [ 145 | "# 문자열로 분석하는 것 보다, 숫자로 분석하는 것이 훨씬 용이하므로\n", 146 | "# 리스트에서 문자들의 인덱스를 뽑아서 사용하기 위해,\n", 147 | "# 이를 표현하기 위한 연관 배열과, 단어 리스트에서 단어를 참조 할 수 있는 인덱스 배열을 만듭합니다.\n", 148 | "word_dic = {w: i for i, w in enumerate(word_list)}\n", 149 | "pprint(word_dic)" 150 | ] 151 | }, 152 | { 153 | "cell_type": "markdown", 154 | "metadata": {}, 155 | "source": [ 156 | "### Define preprocessor function for skip-gram of Word2Vec" 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": 5, 162 | "metadata": {}, 163 | "outputs": [], 164 | "source": [ 165 | "def preprocessor(sequences, word_dic, window_size):\n", 166 | " context = []\n", 167 | " for idx in range(window_size, len(sequences) - window_size):\n", 168 | " center_word = word_dic.get(sequences[idx])\n", 169 | " context_words = [word_dic.get(sequences[idx + _]) for _ in range(-window_size, window_size + 1) if _ != 0]\n", 170 | " \n", 171 | " for token in context_words:\n", 172 | " context.append([center_word, token])\n", 173 | " else:\n", 174 | " return context" 175 | ] 176 | }, 177 | { 178 | "cell_type": "code", 179 | "execution_count": 6, 180 | "metadata": {}, 181 | "outputs": [ 182 | { 183 | "name": "stdout", 184 | "output_type": "stream", 185 | "text": [ 186 | "[14, 3, 5, 14, 12, 5, 14, 10, 5, 12, 3, 10, 15, 3, 12, 5, 3, 9, 6, 5, 12, 9, 11, 6, 5, 12, 3, 7, 5, 14, 15, 5, 15, 14, 11, 15, 14, 13, 4, 1, 5, 14, 0, 8, 2, 5, 3, 12, 11, 12, 3, 5]\n" 187 | ] 188 | } 189 | ], 190 | "source": [ 191 | "print([word_dic.get(_) for _ in word_sequence])" 192 | ] 193 | }, 194 | { 195 | "cell_type": "code", 196 | "execution_count": 7, 197 | "metadata": {}, 198 | "outputs": [ 199 | { 200 | "name": "stdout", 201 | "output_type": "stream", 202 | "text": [ 203 | "[[3, 14], [3, 5], [5, 3], [5, 14], [14, 5], [14, 12], [12, 14], [12, 5], [5, 12], [5, 14], [14, 5], [14, 10], [10, 14], [10, 5], [5, 10], [5, 12], [12, 5], [12, 3], [3, 12], [3, 10], [10, 3], [10, 15], [15, 10], [15, 3], [3, 15], [3, 12], [12, 3], [12, 5], [5, 12], [5, 3], [3, 5], [3, 9], [9, 3], [9, 6], [6, 9], [6, 5], [5, 6], [5, 12], [12, 5], [12, 9], [9, 12], [9, 11], [11, 9], [11, 6], [6, 11], [6, 5], [5, 6], [5, 12], [12, 5], [12, 3], [3, 12], [3, 7], [7, 3], [7, 5], [5, 7], [5, 14], [14, 5], [14, 15], [15, 14], [15, 5], [5, 15], [5, 15], [15, 5], [15, 14], [14, 15], [14, 11], [11, 14], [11, 15], [15, 11], [15, 14], [14, 15], [14, 13], [13, 14], [13, 4], [4, 13], [4, 1], [1, 4], [1, 5], [5, 1], [5, 14], [14, 5], [14, 
0], [0, 14], [0, 8], [8, 0], [8, 2], [2, 8], [2, 5], [5, 2], [5, 3], [3, 5], [3, 12], [12, 3], [12, 11], [11, 12], [11, 12], [12, 11], [12, 3], [3, 12], [3, 5]]\n" 204 | ] 205 | } 206 | ], 207 | "source": [ 208 | "tmp = preprocessor(sequences=word_sequence, word_dic=word_dic, window_size=1)\n", 209 | "print(tmp)" 210 | ] 211 | }, 212 | { 213 | "cell_type": "markdown", 214 | "metadata": {}, 215 | "source": [ 216 | "### Define Word2Vec class" 217 | ] 218 | }, 219 | { 220 | "cell_type": "code", 221 | "execution_count": 8, 222 | "metadata": {}, 223 | "outputs": [], 224 | "source": [ 225 | "class Word2Vec:\n", 226 | " def __init__(self, num_classes, embedding_dim = 2, num_sampled = 10):\n", 227 | " self._inputs = tf.placeholder(dtype = tf.int32, shape = [None])\n", 228 | " self._labels = tf.placeholder(dtype = tf.int32, shape = [None,1])\n", 229 | "\n", 230 | " # Look up embeddings for inputs.\n", 231 | " self._embeddings = tf.get_variable(name = 'embeddings_input', shape = [num_classes, embedding_dim],\n", 232 | " dtype = tf.float32, initializer = tf.truncated_normal_initializer())\n", 233 | " self._selected_embed = tf.nn.embedding_lookup(params = self._embeddings, ids = self._inputs)\n", 234 | " \n", 235 | " # Construct the variables for the NCE loss\n", 236 | " nce_weights = tf.Variable(tf.random_uniform([num_classes, embedding_dim], -1.0, 1.0))\n", 237 | " nce_biases = tf.Variable(tf.zeros([num_classes]))\n", 238 | " \n", 239 | " # Compute the average NCE loss for the batch\n", 240 | " self._nce_loss = tf.reduce_mean(tf.nn.nce_loss(weights = nce_weights, biases = nce_biases,\n", 241 | " labels = self._labels, inputs = self._selected_embed,\n", 242 | " num_sampled = num_sampled, num_classes = num_classes))\n", 243 | " \n", 244 | " def get_wordvector(self, sess, word_dic, word):\n", 245 | " idx = word_dic.get(word)\n", 246 | " feed_get_wordvector = {self._inputs : [idx]}\n", 247 | " return sess.run(self._selected_embed, feed_dict = feed_get_wordvector)" 248 | ] 249 | }, 250 | { 251 | "cell_type": "code", 252 | "execution_count": 9, 253 | "metadata": {}, 254 | "outputs": [], 255 | "source": [ 256 | "class Solver:\n", 257 | " def __init__(self, model, optimizer = tf.train.AdamOptimizer, var_list = None):\n", 258 | " self._model = model\n", 259 | " self._lr = tf.placeholder(dtype = tf.float32)\n", 260 | " self._optimizer = optimizer(learning_rate = self._lr)\n", 261 | " self._training_op = self._optimizer.minimize(loss = self._model._nce_loss, var_list = var_list)\n", 262 | " \n", 263 | " def train(self, sess, x_data, y_data, lr):\n", 264 | " feed_train = {self._model._inputs : x_data, self._model._labels : y_data, self._lr : lr}\n", 265 | " return sess.run(fetches = [self._training_op, self._model._nce_loss], feed_dict = feed_train)" 266 | ] 267 | }, 268 | { 269 | "cell_type": "markdown", 270 | "metadata": {}, 271 | "source": [ 272 | "### Training" 273 | ] 274 | }, 275 | { 276 | "cell_type": "code", 277 | "execution_count": 10, 278 | "metadata": {}, 279 | "outputs": [ 280 | { 281 | "name": "stdout", 282 | "output_type": "stream", 283 | "text": [ 284 | "{'게임': 0, '음악': 1, '애니': 2, '고양이': 3, '책': 4, '좋다': 5, '우유': 6, '눈': 7, '만화': 8, '생선': 9, '동물': 10, '싫다': 11, '강아지': 12, '영화': 13, '나': 14, '여자친구': 15}\n", 285 | "['나', '고양이', '좋다', '나', '강아지', '좋다', '나', '동물', '좋다', '강아지', '고양이', '동물', '여자친구', '고양이', '강아지', '좋다', '고양이', '생선', '우유', '좋다', '강아지', '생선', '싫다', '우유', '좋다', '강아지', '고양이', '눈', '좋다', '나', '여자친구', '좋다', '여자친구', '나', '싫다', '여자친구', '나', '영화', '책', '음악', '좋다', '나', '게임', '만화', '애니', '좋다', 
'고양이', '강아지', '싫다', '강아지', '고양이', '좋다']\n" 286 | ] 287 | } 288 | ], 289 | "source": [ 290 | "print(word_dic)\n", 291 | "print(word_sequence)" 292 | ] 293 | }, 294 | { 295 | "cell_type": "code", 296 | "execution_count": 11, 297 | "metadata": {}, 298 | "outputs": [], 299 | "source": [ 300 | "batch = preprocessor(sequences = word_sequence, word_dic = word_dic, window_size = 2)\n", 301 | "x_data = np.array(batch)[:,0]\n", 302 | "y_data = np.array(batch)[:,[1]]" 303 | ] 304 | }, 305 | { 306 | "cell_type": "code", 307 | "execution_count": 12, 308 | "metadata": {}, 309 | "outputs": [], 310 | "source": [ 311 | "# generate Word2Vec instance, Solver instance\n", 312 | "sess = tf.Session()\n", 313 | "sgram = Word2Vec(num_classes = len(word_dic.keys()), embedding_dim = 2, num_sampled = 10)\n", 314 | "adam_solver = Solver(model = sgram)" 315 | ] 316 | }, 317 | { 318 | "cell_type": "code", 319 | "execution_count": 13, 320 | "metadata": {}, 321 | "outputs": [ 322 | { 323 | "name": "stdout", 324 | "output_type": "stream", 325 | "text": [ 326 | "9\n" 327 | ] 328 | } 329 | ], 330 | "source": [ 331 | "# hyper-parameter\n", 332 | "epochs = 30\n", 333 | "batch_size = 20\n", 334 | "total_batch = int(x_data.shape[0] / batch_size)\n", 335 | "print(total_batch)" 336 | ] 337 | }, 338 | { 339 | "cell_type": "code", 340 | "execution_count": 14, 341 | "metadata": {}, 342 | "outputs": [], 343 | "source": [ 344 | "sess.run(tf.global_variables_initializer())" 345 | ] 346 | }, 347 | { 348 | "cell_type": "code", 349 | "execution_count": 15, 350 | "metadata": {}, 351 | "outputs": [ 352 | { 353 | "name": "stdout", 354 | "output_type": "stream", 355 | "text": [ 356 | "epoch : 0, tr_loss : 10.306\n", 357 | "epoch : 1, tr_loss : 9.960\n", 358 | "epoch : 2, tr_loss : 9.442\n", 359 | "epoch : 3, tr_loss : 8.703\n", 360 | "epoch : 4, tr_loss : 8.402\n", 361 | "epoch : 5, tr_loss : 7.949\n", 362 | "epoch : 6, tr_loss : 7.722\n", 363 | "epoch : 7, tr_loss : 6.643\n", 364 | "epoch : 8, tr_loss : 6.268\n", 365 | "epoch : 9, tr_loss : 5.667\n", 366 | "epoch : 10, tr_loss : 5.469\n", 367 | "epoch : 11, tr_loss : 4.941\n", 368 | "epoch : 12, tr_loss : 4.483\n", 369 | "epoch : 13, tr_loss : 4.637\n", 370 | "epoch : 14, tr_loss : 3.751\n", 371 | "epoch : 15, tr_loss : 3.753\n", 372 | "epoch : 16, tr_loss : 3.419\n", 373 | "epoch : 17, tr_loss : 3.397\n", 374 | "epoch : 18, tr_loss : 3.398\n", 375 | "epoch : 19, tr_loss : 3.348\n", 376 | "epoch : 20, tr_loss : 3.177\n", 377 | "epoch : 21, tr_loss : 3.025\n", 378 | "epoch : 22, tr_loss : 3.124\n", 379 | "epoch : 23, tr_loss : 3.112\n", 380 | "epoch : 24, tr_loss : 3.073\n", 381 | "epoch : 25, tr_loss : 2.805\n", 382 | "epoch : 26, tr_loss : 2.978\n", 383 | "epoch : 27, tr_loss : 3.072\n", 384 | "epoch : 28, tr_loss : 2.961\n", 385 | "epoch : 29, tr_loss : 2.908\n" 386 | ] 387 | } 388 | ], 389 | "source": [ 390 | "#training \n", 391 | "tr_loss_hist = []\n", 392 | "\n", 393 | "for epoch in range(epochs):\n", 394 | " avg_tr_loss = 0\n", 395 | " \n", 396 | " for step in range(total_batch):\n", 397 | " tr_indices = np.random.choice(np.arange(x_data.shape[0]), size = batch_size, replace = False)\n", 398 | " batch_xs = x_data[tr_indices]\n", 399 | " batch_ys = y_data[tr_indices]\n", 400 | " \n", 401 | " _, tr_loss = adam_solver.train(sess = sess, x_data = batch_xs, y_data = batch_ys, lr = .01)\n", 402 | " avg_tr_loss += tr_loss / total_batch\n", 403 | " \n", 404 | " print('epoch : {:3}, tr_loss : {:.3f}'.format(epoch, avg_tr_loss))\n", 405 | " tr_loss_hist.append(avg_tr_loss)" 406 | ] 407 | }, 408 
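Since the skip-gram model only ever touches the embedding matrix through `tf.nn.embedding_lookup`, the whole trained matrix can be read back with a single `sess.run` on `sgram._embeddings` and inspected directly, much like the pairwise-distance check in the RNN notebook above. The cell below is a minimal sketch, not part of the original notebook: a hypothetical `most_similar` helper that ranks the vocabulary by cosine similarity to a query word, assuming the `sess`, `sgram` and `word_dic` objects created above are still in scope.

```python
import numpy as np

def most_similar(query, topn=3):
    emb = sess.run(sgram._embeddings)                        # (num_classes, embedding_dim) matrix
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)   # unit-normalize each row
    sims = emb @ emb[word_dic[query]]                        # cosine similarity to the query word
    idx_to_word = {i: w for w, i in word_dic.items()}
    ranked = [(idx_to_word[i], float(sims[i]))
              for i in np.argsort(-sims) if i != word_dic[query]]
    return ranked[:topn]

print(most_similar('고양이'))
```

Because the toy corpus is tiny and the embedding is only 2-dimensional, the exact neighbours vary between runs, but words that share contexts (e.g. '강아지', '좋다') tend to land close to '고양이', mirroring what the euclidean-distance table showed for the CharRNN embeddings earlier.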
| { 409 | "cell_type": "code", 410 | "execution_count": 16, 411 | "metadata": {}, 412 | "outputs": [ 413 | { 414 | "data": { 415 | "text/plain": [ 416 | "[]" 417 | ] 418 | }, 419 | "execution_count": 16, 420 | "metadata": {}, 421 | "output_type": "execute_result" 422 | }, 423 | { 424 | "data": { 425 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXIAAAD7CAYAAAB37B+tAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3Xl8VfWd//HXJzsJSchyE7ZA2JF9\nCSCgIFUrdcMN24Jal4pWnf5ap3TszGM641hbO+rUWqwWF7q4zChqrVq1akV2IYCAZd/3LCQkIfvy\n/f2RVNGChJubnLu8n4/HfXhvcsJ9Hw+8OXzvOd+vOecQEZHQFeV1ABERaRsVuYhIiFORi4iEOBW5\niEiIU5GLiIQ4FbmISIhTkYuIhDgVuYhIiFORi4iEuJiOeJPMzEyXm5vbEW8lIhI21qxZU+yc851u\nuw4p8tzcXPLz8zvirUREwoaZ7W3NdhpaEREJcSpyEZEQpyIXEQlxKnIRkRCnIhcRCXEqchGREKci\nFxEJcUFd5CWVdfzX65uoqW/0OoqISNAK6iJftqOYBct3880nV3L0eK3XcUREglJQF/llI7vz+Owx\nbDpUzpW/Xs6uouNeRxIRCTpBXeQA04d143/nnE1lbQNXPb6cVbtLvI4kIhJUgr7IAUb3SuOVOyaR\nnhjHdU99xOvrD3kdSUQkaIREkQP0zkjilTsmMSqnC//0wjoeX7QT55zXsUREPHfaIjczn5ndb2b3\ntbweZGbvm9kyM3uw/SN+pktiHH/49nguH9mdn7+9hX999RMaGps6MoKISNBpzRn5w0AtENvy+hHg\nFufcZCDXzCa0V7iTiY+J5pGvj+LOaf14YdU+bvldPsdrGzoygohIUDltkTvnbgAWA5hZDJDgnNvT\n8u2XgYntlu4UoqKMuRcN5oGrhrN0RzEzn1jB4bLqjo4hIhIUznSM3AccPeH1USDtZBua2Rwzyzez\n/KKiIn/zfalvjO/FMzeOY39JFVc+tpzNh8vb5X1ERILZmRb5MaDLCa/TgJO2tHNuvnMuzzmX5/Od\ndqUiv00d6OPF2ybicNy0YLVuHBKRiHNGRe6cqwbizaxHy5euAt4PeKozNKR7Ck9/axwlVXX8v//9\nmMYmXc0iIpHDn8sP7wYWmtkiYJVzbnNgI/lnWI9U/uvyoSzdUcwv39vmdRwRkQ7TqsWXnXOLgEUt\nz1fjwQecrfH1cTnk7y3l0b/uYHTvNKYNyvI6kohIuwuZG4Jaw8y4b8YwBndN5vv/9zEHSqu8jiQi\n0u7CqsgBOsVF88R1Y2lsdNzx3FpqGzQFroiEt7ArcoDczCQenDmSDQfKuO+NTV7HERFpV2FZ5ADT\nh3VlzpS+PLtyH39cd9DrOCIi7SZsixzghxcNYnxuOj96ZSPbCiq8jiMi0i7CushjoqOYN2s0SfEx\n3P7sGs3JIiJhKayLHCArJYFffXM0e4or+ZeFGzT1rYiEnbAvcoCJ/TKYe9Fg3tx4mAXL9ngdR0Qk\noCKiyAFun9qXC87K5qd/3syavVouTkTCR8QUuZnx8LUj6d6lE3c+t46yqnqvI4mIBETEFDlAaqdY\n5s0aTUFFDY9/uNPrOCIiARFRRQ4womcXrhjVgwXLdnOkrMbrOCIibRZxRQ5w94UDcQ4e0SyJIhIG\nIrLIc9ITmX12L17M38+OQt0oJCKhLSKLHOCuaf1JjIvhv9/e6nUUEZE2idgiz+gcz21T+vKXTQW6\nHFFEQlrEFjnALef2IbNzPA+8tUV3fIpIyIroIk+Mi+F7Fwxg9Z5S3t9c6HUcERG/+F3kZvbfZvah\nma0ws1GBDNWRvj4uhz6ZSfz87S1atFlEQpJfRW5m04EE59xU4Gbg4YCm6kCx0VHMvWgQ2wuP8/La\nA17HERE5Y/6ekQ8HPgBwzm0GugQskQe+NqwrI3O68It3t1FTr6XhRCS0+FvkG4ArrNkAINfMLIC5\nOpSZcc/0wRwuq+H3K/Z4HUdE5Iz4VeTOuXeA7cAi4DvASveFyz7MbI6Z5ZtZflFRUZuDtreJ/TKY\nOtDHYx/s1IRaIhJS/P6w0zn3k5Yx8veAlSf5/nznXJ5zLs/n87UlY4f5l+mDKa+p14RaIhJS/P2w\nM8PMlprZMmAm8GBgY3ljSPeUTyfUOlxW7XUcEZFW8Xdo5ahz7hzn3GTn3E3OubCZRvDTCbXe3e51\nFBGRVonoG4JOJic9kevO7s1La/azvUATaolI8FORn8RdX2mZUOsdTaglIsFPRX4S6Ulx3D61L+9u\nKiB/jybUEpHgpiI/hZvP6UNWcjy3P7uGZTuKvY4jInJKKvJTSIyL4blvT6BLYhzXPf0Rv3xvu+Zi\nEZGgpCL/EgOyk/nTXZO5clQPfvHeNm5csIri47VexxIR+RwV+WkkxsXw8LUjeeCq4azaXcIljy7h\no11HvY4lIvIpFXkrmBnfGN+LV++YTGJcDLOe+ojHF+2kSUMtIhIEVORnYEj3FP5012SmD+vKz9/e\nwi2/W01pZZ3XsUQkwqnIz1ByQizzvjma+2YMZdmOo1zy6BLW7iv1OpaIRDAVuR/MjOsn5vLydyYR\nHW1c+8QKnl66W+t+iognVORtMLxnKm/807l8ZXAW972xid8s3uV1JBGJQCryNkrtFMsT143lspHd\neeCtLbz28UGvI4lIhInxOkA4iIoyHpo5gsLyGn7w0np8yfFM6pfpdSwRiRA6Iw+Q+Jho5t+QR25G\nErf9YQ1bj2jmRBHpGCryAErtFMtvbx5Pp9hoblqwioLysJmmXUSCmIo8wHp06cSCm8ZRVl3PjQtW\nU1Gj9T9FpH2pyNvB0O6pPH7dWLYXVHDHc2upb2zyOpKIhDEVeTuZMtDHz64azpLtxdzz8kZdYy4i\n7cbvIjezu83sQzNbZmajAxkqXMzMy+H7Fwzk5bUH+MW727yOIyJhyq/LD82sC3A5cB7QD/gFcFng\nYoWP757fn0PHqnn0rzvo3qUT3xjfy+tIIhJm/L2OvJHms/k4IBMoCliiMGNm/OTKYRwpr+Hf/vgJ\n2akJTBuU5XUsEQkjfg2tOOcqgMXAZuBPNJ+Rf46ZzTGzfDPLLyqK7J6PjY7isdljGNw1mTufW8ua\nvZpkS0QCx68iN7NLgFiah1UGA4+aWeyJ2zjn5jvn
... [remaining base64-encoded PNG data omitted: figure output of plt.plot(tr_loss_hist), the training-loss curve] ...",
426 |       "text/plain": [
427 |        ""
428 |       ]
429 |      },
430 |      "metadata": {},
431 |      "output_type": "display_data"
432 |     }
433 |    ],
434 |    "source": [
435 |     "plt.plot(tr_loss_hist)"
436 |    ]
437 |   },
438 |   {
439 |    "cell_type": "code",
440 |    "execution_count": 17,
441 |    "metadata": {},
442 |    "outputs": [
443 |     {
444 |      "data": {
445 |       "image/png": "[base64-encoded PNG data omitted: 2-D scatter plot of the learned word vectors, one annotated point per word in word_list]",
446 |       "text/plain": [
447 |        ""
448 |       ]
449 |      },
450 |      "metadata": {},
451 |      "output_type": "display_data"
452 |     }
453 |    ],
454 |    "source": [
455 |     "for word in word_list:\n",
456 |     "    tmp = sgram.get_wordvector(sess = sess, word_dic = word_dic, word = word)\n",
457 |     "    x, y = tmp[0][0], tmp[0][1]\n",
458 |     "    plt.scatter(x, y)\n",
459 |     "    plt.annotate(word, xy=(x, y), xytext=(5, 2),\n",
460 |     "                 textcoords='offset points', ha='right', va='bottom')"
461 |    ]
462 |   }
463 |  ],
464 |  "metadata": {
465 |   "kernelspec": {
466 |    "display_name": "Python 3",
467 |    "language": "python",
468 |    "name": "python3"
469 |   },
470 |   "language_info": {
471 |    "codemirror_mode": {
472 |     "name": "ipython",
473 |     "version": 3
474 |    },
475 |    "file_extension": ".py",
476 |    "mimetype": "text/x-python",
477 |    "name": "python",
478 |    "nbconvert_exporter": "python",
479 |    "pygments_lexer": "ipython3",
480 |    "version": "3.6.3"
481 |   },
482 |   "varInspector": {
483 |    "cols": {
484 |     "lenName": 16,
485 |     "lenType": 16,
486 |     "lenVar": 40
487 |    },
488 |    "kernels_config": {
489 |     "python": {
490 |      "delete_cmd_postfix": "",
491 |      "delete_cmd_prefix": "del ",
492 |      "library": "var_list.py",
493 |      "varRefreshCmd": "print(var_dic_list())"
494 |     },
495 |     "r": {
496 |      "delete_cmd_postfix": ") ",
497 |      "delete_cmd_prefix": "rm(",
498 |      "library": "var_list.r",
499 |      "varRefreshCmd": "cat(var_dic_list()) "
500 |     }
501 |    },
502 |    "types_to_exclude": [
503 |     "module",
504 |     "function",
505 |     "builtin_function_or_method",
506 |     "instance",
507 |     "_Feature"
508 |    ],
509 |    "window_display": false
510 |   }
511 |  },
512 |  "nbformat": 4,
513 |  "nbformat_minor": 2
514 | }
515 | 
--------------------------------------------------------------------------------
/mnist_sequence1_sample_5distortions5x5.npz:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/seopbo/TF_code_examples_for_Deep_learning/f024749aea54d7a352aaed622775325ea9c39107/mnist_sequence1_sample_5distortions5x5.npz
--------------------------------------------------------------------------------
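The last code cell of the Word2Vec notebook above visualizes the trained skip-gram embeddings: for each word it retrieves a 2-dimensional vector via `sgram.get_wordvector(...)` and draws an annotated scatter plot. Below is a minimal, self-contained sketch of that visualization step. The vocabulary, the `word_dic` mapping, the random stand-in embedding matrix, and the assumption that `get_wordvector` amounts to a row lookup in the embedding matrix are all illustrative; they are not the notebook's actual trained model.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Hypothetical vocabulary and word -> index mapping (stand-ins for the
# notebook's word_list and word_dic).
word_list = ['king', 'queen', 'man', 'woman', 'apple', 'banana']
word_dic = {word: idx for idx, word in enumerate(word_list)}

# Stand-in for a trained skip-gram embedding matrix, shape (vocab_size, 2).
# The notebook appears to use 2-dimensional embeddings, since it plots
# tmp[0][0] against tmp[0][1] directly.
embedding_matrix = rng.normal(size=(len(word_list), 2))

def get_wordvector(word):
    """Assumed behaviour of sgram.get_wordvector: return the embedding
    row for the given word."""
    return embedding_matrix[word_dic[word]]

# Scatter each 2-D word vector and label the point with its word,
# mirroring the plt.scatter / plt.annotate loop in the notebook cell.
for word in word_list:
    x, y = get_wordvector(word)
    plt.scatter(x, y)
    plt.annotate(word, xy=(x, y), xytext=(5, 2),
                 textcoords='offset points', ha='right', va='bottom')
plt.show()
```

Because the embeddings here are 2-dimensional they can be scattered directly; for higher-dimensional Word2Vec embeddings a projection such as PCA or t-SNE would be applied before plotting.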