├── .gitignore ├── .ipynb_checkpoints └── README-checkpoint.md ├── Deep Learning Emotion Recognition PyTorch.ipynb ├── Deep Learning Emotion Recognition TensorFlow .ipynb ├── README.md ├── data └── merged_training.pkl ├── helpers ├── .ipynb_checkpoints │ └── evaluate-checkpoint.py ├── __pycache__ │ ├── evaluate.cpython-36.pyc │ └── pickle_helpers.cpython-36.pyc ├── evaluate.py └── pickle_helpers.py └── img ├── autograd.jpg ├── dl_frameworks.png ├── emotion_classifier.png ├── gru-model.png └── tensor.png /.gitignore: -------------------------------------------------------------------------------- 1 | */__pycache__ 2 | __pycache__/ 3 | .ipynb_checkpoints -------------------------------------------------------------------------------- /.ipynb_checkpoints/README-checkpoint.md: -------------------------------------------------------------------------------- 1 | ## Deep Learning Based NLP 2 | This repository contains python notebooks related to several deep learning based NLP tasks such as emotion recognition and neural machine translation. It will provide implementations based on several deep learning frameworks such as PyTorch and TensorFlow. 3 | 4 | ## Emotion Recognition with GRU 5 | - [Deep Learning Based Emotion Recognition With PyTorch](https://github.com/omarsar/nlp_pytorch_tensorflow_notebooks/blob/master/Deep%20Learning%20Emotion%20Recognition%20PyTorch.ipynb) 6 | - [Deep Learning Based Emotion Recognition With TensorFlow](https://github.com/omarsar/nlp_pytorch_tensorflow_notebooks/blob/master/Deep%20Learning%20Emotion%20Recognition%20TensorFlow%20.ipynb) 7 | 8 | --- 9 | Author: [Elvis Saravia](https://twitter.com/omarsar0) -------------------------------------------------------------------------------- /Deep Learning Emotion Recognition TensorFlow .ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Deep Learning Based Emotion Recognition with TensorFlow\n", 8 | "\n", 9 | "![alt txt](img/emotion_classifier.png)\n", 10 | "\n", 11 | "In this notebook we are going to learn how to train deep neural networks, such as recurrent neural networks (RNNs), for addressing a natural language task known as **emotion recognition**. We will cover everything you need to know to get started with NLP using deep learning frameworks such as TensorFlow. We will cover the common best practices, functionalities, and steps you need to understand the basics of TensorFlow APIs to build powerful predictive models via the computation graph. In the process of building our models, we will compare PyTorch and TensorFlow to let the learner appreciate the strenghts of each tool.\n", 12 | "\n", 13 | "by [Elvis Saravia](https://twitter.com/omarsar0)" 14 | ] 15 | }, 16 | { 17 | "cell_type": "markdown", 18 | "metadata": {}, 19 | "source": [ 20 | "---" 21 | ] 22 | }, 23 | { 24 | "cell_type": "markdown", 25 | "metadata": {}, 26 | "source": [ 27 | "## Outline\n", 28 | "1. Deep Learning Frameworks\n", 29 | " - 1.1 Eager execution\n", 30 | " - 1.2 Computation graph\n", 31 | "2. Tensors\n", 32 | " - 2.1 Basic math with tensors\n", 33 | " - 2.2 Transforming tensors\n", 34 | "3. Data\n", 35 | " - 3.1 Preprocessing data\n", 36 | " - Tokenization and Sampling\n", 37 | " - Constructing Vocabulary and Index-Word Mapping\n", 38 | " - 3.2 Converting data into tensors\n", 39 | " - 3.3 Padding data\n", 40 | " - 3.4 Binarization\n", 41 | " - 3.5 Split data\n", 42 | " - 3.6 Data Loader\n", 43 | "4. 
Model\n", 44 | "    - 4.1 Pretesting Model\n", 45 | "    - 4.2 Testing models with eager execution\n", 46 | "5. Training\n", 47 | "6. Evaluation on Testing Dataset\n", 48 | "    - 6.1 Confusion matrix\n", 49 | "- Final Words\n", 50 | "- References\n", 51 | "- *Storing models and setting checkpoints (Exercise)*\n", 52 | "- *Restoring models (Exercise)*" 53 | ] 54 | }, 55 | { 56 | "cell_type": "markdown", 57 | "metadata": {}, 58 | "source": [ 59 | "---" 60 | ] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": {}, 65 | "source": [ 66 | "## 1. Deep Learning Frameworks\n", 67 | "There are many deep learning frameworks such as Chainer, DyNet, MXNet, PyTorch, TensorFlow, and Keras. Each framework has its own strengths, which a researcher or a developer may want to consider before choosing the right framework. In my opinion, PyTorch is great for researchers and offers eager execution by default, but its high-level APIs require some understanding of deep learning concepts such as **affine layers** and **automatic differentiation**. On the other hand, TensorFlow was originally built as a low-level API that provides a robust list of functionalities to build deep learning models from the ground up. More recently, TensorFlow also offers **eager execution** and is equipped with a high-level API known as Keras.\n", 68 | "\n", 69 | "![alt txt](img/dl_frameworks.png)" 70 | ] 71 | }, 72 | { 73 | "cell_type": "markdown", 74 | "metadata": {}, 75 | "source": [ 76 | "### 1.1 Eager Execution\n", 77 | "Eager execution allows us to operate on the computation graph dynamically, also known as **imperative programming**. TensorFlow requires that you manually turn this mode on, while PyTorch comes with this mode by default. Below we import the necessary library and enable eager execution." 78 | ] 79 | }, 80 | { 81 | "cell_type": "code", 82 | "execution_count": 1, 83 | "metadata": {}, 84 | "outputs": [ 85 | { 86 | "name": "stdout", 87 | "output_type": "stream", 88 | "text": [ 89 | "1.10.0\n", 90 | "EE enabled? True\n" 91 | ] 92 | } 93 | ], 94 | "source": [ 95 | "import tensorflow as tf\n", 96 | "tf.enable_eager_execution()\n", 97 | "print(tf.__version__)\n", 98 | "print(\"EE enabled?\", tf.executing_eagerly())" 99 | ] 100 | }, 101 | { 102 | "cell_type": "markdown", 103 | "metadata": {}, 104 | "source": [ 105 | "### 1.2 Computation Graph\n", 106 | "A simplified definition of a neural network is a chain of **differentiable** functions that we can combine to build more complicated functions. An intuitive way to express this process is through computation graphs. \n", 107 | "\n", 108 | "![alt txt](http://colah.github.io/posts/2015-08-Backprop/img/tree-eval-derivs.png)\n", 109 | "\n", 110 | "Image credit: [Chris Olah](http://colah.github.io/posts/2015-08-Backprop/)" 111 | ] 112 | }, 113 | { 114 | "cell_type": "markdown", 115 | "metadata": {}, 116 | "source": [ 117 | "## 2. Tensors\n", 118 | "Tensors are the fundamental data structure used to store the data that is fed as input to a computation graph for processing and for applying transformations. Let's create two tensors, multiply them, and then output the result. The figure below shows a 4-D tensor.\n", 119 | "\n", 120 | "![alt txt](img/tensor.png)" 121 | ] 122 | }, 123 | { 124 | "cell_type": "code", 125 | "execution_count": 2, 126 | "metadata": {}, 127 | "outputs": [ 128 | { 129 | "name": "stdout", 130 | "output_type": "stream", 131 | "text": [ 132 | "tf.Tensor(\n", 133 | "[[1. 3.]\n", 134 | " [3. 7.]], shape=(2, 2), dtype=float32)\n" 135 | ] 136 | } 137 | ], 138 | "source": [ 139 | "c = tf.constant([[1.0, 2.0], [3.0, 4.0]])\n", 140 | "d = tf.constant([[1.0, 1.0], [0.0, 1.0]])\n", 141 | "e = tf.matmul(c, d)\n", 142 | "print(e)" 143 | ] 144 | }, 145 | { 146 | "cell_type": "markdown", 147 | "metadata": {}, 148 | "source": [ 149 | "### 2.1 Math with Tensors\n", 150 | "TensorFlow and other deep learning libraries like PyTorch allow you to do **automatic differentiation**. Let's try to compute the derivative of a function -- in this case the scalar output stored in the variable `out`. In TensorFlow, the `tf.GradientTape()` context manager allows tracking of operations on the input tensor. " 151 | ] 152 | }, 153 | { 154 | "cell_type": "code", 155 | "execution_count": 3, 156 | "metadata": {}, 157 | "outputs": [ 158 | { 159 | "name": "stdout", 160 | "output_type": "stream", 161 | "text": [ 162 | "tf.Tensor(\n", 163 | "[[4.5 4.5]\n", 164 | " [4.5 4.5]], shape=(2, 2), dtype=float32)\n" 165 | ] 166 | } 167 | ], 168 | "source": [ 169 | "### Automatic differentiation with TensorFlow\n", 170 | "\n", 171 | "x = tf.contrib.eager.Variable(tf.ones((2,2)))\n", 172 | "with tf.GradientTape() as tape:\n", 173 | "    y = x + 2\n", 174 | "    z = y * y * 3\n", 175 | "    out = tf.reduce_mean(z)\n", 176 | "\n", 177 | "grad = tape.gradient(out, x)  # d(out)/dx\n", 178 | "print(grad)" 179 | ] 180 | }, 181 | { 182 | "cell_type": "markdown", 183 | "metadata": {}, 184 | "source": [ 185 | "You can verify the output with the equations in the figure below:\n", 186 | "\n", 187 | "![alt txt](img/autograd.jpg)" 188 | ] 189 | }, 190 | { 191 | "cell_type": "markdown", 192 | "metadata": {}, 193 | "source": [ 194 | "### 2.2 Transforming Tensors\n", 195 | "We can also apply transformations to a tensor, such as adding a dimension or transposing it. Let's try both adding a dimension and transposing a matrix below." 196 | ] 197 | }, 198 | { 199 | "cell_type": "code", 200 | "execution_count": 7, 201 | "metadata": {}, 202 | "outputs": [ 203 | { 204 | "name": "stdout", 205 | "output_type": "stream", 206 | "text": [ 207 | "X shape: (2, 3)\n", 208 | "tf.Tensor([2 1 3], shape=(3,), dtype=int32)\n" 209 | ] 210 | }, 211 | { 212 | "data": { 213 | "text/plain": [ 214 | "TensorShape([Dimension(3), Dimension(2)])" 215 | ] 216 | }, 217 | "execution_count": 7, 218 | "metadata": {}, 219 | "output_type": "execute_result" 220 | } 221 | ], 222 | "source": [ 223 | "x = tf.constant([[1, 2, 3], [4, 5, 6]])\n", 224 | "print(\"X shape: \", x.shape)\n", 225 | "\n", 226 | "# add dimension\n", 227 | "print(tf.shape(tf.expand_dims(x, 1)))\n", 228 | "\n", 229 | "# transpose\n", 230 | "tf.transpose(x).shape" 231 | ] 232 | }, 233 | { 234 | "cell_type": "markdown", 235 | "metadata": {}, 236 | "source": [ 237 | "---" 238 | ] 239 | }, 240 | { 241 | "cell_type": "markdown", 242 | "metadata": {}, 243 | "source": [ 244 | "## 3. Emotion Dataset\n", 245 | "In this notebook we are working on an emotion classification task. We are using the public emotion dataset provided [here](https://github.com/huseinzol05/NLP-Dataset/tree/master/emotion-english). The dataset contains tweets labeled into 6 categories."
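The dataset is loaded a few cells below with the repository's `helpers.pickle_helpers` module, whose source is not included in this excerpt. As a rough sketch only (assuming `data/merged_training.pkl` simply stores a pickled pandas DataFrame with `text` and `emotions` columns, which the later cells suggest), the same load could be done with pandas directly:

```python
# Sketch, not the notebook's code -- the notebook calls helpers.pickle_helpers.load_from_pickle.
import pandas as pd

data = pd.read_pickle("data/merged_training.pkl")  # assumes a pickled DataFrame
print(data.emotions.value_counts())                # label counts, as plotted below
```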
246 | ] 247 | }, 248 | { 249 | "cell_type": "code", 250 | "execution_count": 8, 251 | "metadata": {}, 252 | "outputs": [], 253 | "source": [ 254 | "import re\n", 255 | "import numpy as np\n", 256 | "import time\n", 257 | "import helpers.pickle_helpers as ph\n", 258 | "from sklearn import preprocessing\n", 259 | "from sklearn.model_selection import train_test_split\n", 260 | "import matplotlib.pyplot as plt\n", 261 | "%matplotlib inline" 262 | ] 263 | }, 264 | { 265 | "cell_type": "code", 266 | "execution_count": 9, 267 | "metadata": {}, 268 | "outputs": [ 269 | { 270 | "data": { 271 | "text/plain": [ 272 | "" 273 | ] 274 | }, 275 | "execution_count": 9, 276 | "metadata": {}, 277 | "output_type": "execute_result" 278 | }, 279 | { 280 | "data": { 281 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY0AAAEbCAYAAAAmmNiPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAHEBJREFUeJzt3X2UHXWd5/H3x2SCoEBAelhMIomaQRGfMEJ2cXZY0BAEDaOoMGqyTiRnNIyO646AoxMPyh6fVnaYQcZAIsF1eBBnJAPBmEHR8SFAgwwxPJgmiiQLJhIEjwxCmM/+Ub+Gm04/VPredHW6P69z7umqX/3q1reS7v7cqvpVtWwTERFRx7OaLiAiIvYcCY2IiKgtoREREbUlNCIioraERkRE1JbQiIiI2hIaERFRW0IjIiJqS2hERERtCY2IiKhtYtMFdNpBBx3k6dOnN11GRMQe5dZbb/2V7a6h+o250Jg+fTrd3d1NlxERsUeRdF+dfjk9FRERtSU0IiKitoRGRETUltCIiIjaEhoREVFbQiMiImobMjQkLZe0RdJP+ln2YUmWdFCZl6QLJPVIukPSkS19F0jaUF4LWtpfI2ldWecCSSrtB0paU/qvkXRAZ3Y5IiKGq86RxqXA3L6NkqYBc4BftDSfCMwsr0XARaXvgcAS4GjgKGBJSwhcBJzRsl7vts4GbrA9E7ihzEdERIOGvLnP9vckTe9n0fnAR4BrWtrmAZfZNrBW0mRJhwDHAmtsbwOQtAaYK+lGYD/ba0v7ZcApwPXlvY4t77sCuBE4a5f2bhdMP/u63fXW/fr5p08a0e1FRHTCsK5pSJoHbLb9b30WTQHub5nfVNoGa9/UTzvAwbYfKNMPAgcPp9aIiOicXX6MiKR9gI9SnZoaEbYtyYPUtIjqdBgveMELRqqsiIhxZzhHGi8CZgD/JunnwFTgNkn/CdgMTGvpO7W0DdY+tZ92gF+WU1uUr1sGKsj2UtuzbM/q6hryeVsRETFMuxwattfZ/n3b021PpzqldKTtB4GVwPwyimo28Eg5xbQamCPpgHIBfA6wuix7VNLsMmpqPs9cI1kJ9I6yWsCO104iIqIBdYbcXg78CDhM0iZJCwfpvgrYCPQAFwPvBygXwD8J3FJe5/ZeFC99Linr3Et1ERzg08AbJG0AXl/mIyKiQXVGT50+xPLpLdMGFg/QbzmwvJ/2buCIftofAo4fqr6IiBg5uSM8IiJqS2hERERtCY2IiKgtoREREbUlNCIioraERkRE1JbQiIiI2hIaERFRW0IjIiJqS2hERERtCY2IiKgtoREREbUlNCIioraERkRE1JbQiIiI2hIaERFRW0IjIiJqS2hERERtCY2IiKgtoREREbUNGRqSlkvaIuknLW2fk3S3pDsk/ZOkyS3LzpHUI+keSSe0tM8tbT2Szm5pnyHpptJ+paRJpX2vMt9Tlk/v1E5HRMTw1DnSuBSY26dtDXCE7VcAPwXOAZB0OHAa8LKyzhclTZA0AbgQOBE4HDi99AX4DHC+7RcDDwMLS/tC4OHSfn7pFxERDZo4VAfb3+v7Kd/2t1pm1wKnlul5wBW2fwf8TFIPcFRZ1mN7I4CkK4B5ku4CjgP+pPRZAXwCuKi81ydK+9XA30mSbe/C/kWvT+w/wtt7ZGS3FxEjohPXNP4UuL5MTwHub1m2qbQN1P484Ne2t/dp3+G9yvJHSv+IiGhIW6Eh6a+A7cBXO1POsOtYJKlbUvfWrVubLCUiYkwbdmhI+u/AycA7W04ZbQamtXSbWtoGan8ImCxpYp/2Hd6rLN+/9N+J7aW2Z9me1dXVNdxdioiIIQwrNCTNBT4CvNn2Yy2LVgKnlZFPM4CZwM3ALcDMMlJqEtXF8pUlbL7DM9dEFgDXtLzXgjJ9KvDtXM+IiGjWkBfCJV0OHAscJGkTsIRqtNRewBpJAGtt/5nt9ZKuAu6kOm212PZT5X3OBFYDE4DltteXTZwFXCHpU8CPgWWlfRnwlXIxfRtV0ERERIPqjJ46vZ/mZf209fY/Dzivn/ZVwKp+2jfyzAir1vbHgbcNVV9ERIyc3BEeERG1JTQiIqK2hEZERNSW0IiIiNoSGhERUVtCIyIiaktoREREbQmNiIioLaERERG1JTQiIqK2hEZERNSW0IiIiNoSGhERUVtCIyIiaktoREREbQmNiIioLaERERG1JTQiIqK2hEZERNSW0IiIiNqGDA1JyyVtkfSTlrYDJa2RtKF8PaC0S9IFknok3SHpyJZ1FpT+GyQtaGl/jaR1ZZ0LJGmwbURERHPqHGlcCszt03Y2cIPtmcANZR7gRGBmeS0CLoIqAIAlwNHAUcCSlhC4CDijZb25Q2wjIiIaMmRo2P4esK1P8zxgRZleAZzS0n6ZK2uByZIOAU4A1tjeZvthYA0wtyzbz/Za2wYu6/Ne/W0jIiIaMtxrGgfbfqBMPwgcXKanAPe39NtU2gZr39RP+2DbiIiIhrR9IbwcIbgDtQx7G5IWSeqW1L1169bdWUpExLg23ND4ZTm1RPm6pbRvBqa19Jta2gZrn9pP+2Db2IntpbZn2Z7V1dU1zF2KiIihDDc0VgK9I6AWANe0tM8vo6hmA4+UU0yrgTmSDigXwOcAq8uyRyXNLqOm5vd5r/62ERERDZk4VAdJlwPHAgdJ2kQ1CurTwFWSFgL3AW8v3VcBbwR6gMeA9wDY3ibpk8Atpd+5tnsvrr+faoTW3sD15cUg24iIiIYMGRq2Tx9g0fH99DWw
[remainder of the base64-encoded PNG output (bar chart of counts per emotion label) omitted]\n", 282 | "text/plain": [
283 | "
" 284 | ] 285 | }, 286 | "metadata": {}, 287 | "output_type": "display_data" 288 | } 289 | ], 290 | "source": [ 291 | "# load data\n", 292 | "data = ph.load_from_pickle(directory=\"data/merged_training.pkl\")\n", 293 | "data.emotions.value_counts().plot.bar()" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": 10, 299 | "metadata": {}, 300 | "outputs": [ 301 | { 302 | "data": { 303 | "text/html": [ 304 | "
[mangled HTML render of data.head(10) omitted -- the same ten rows appear in the text/plain output just below]
" 380 | ], 381 | "text/plain": [ 382 | " text emotions\n", 383 | "27383 i feel awful about it too because it s my job ... sadness\n", 384 | "110083 im alone i feel awful sadness\n", 385 | "140764 ive probably mentioned this before but i reall... joy\n", 386 | "100071 i was feeling a little low few days back sadness\n", 387 | "2837 i beleive that i am much more sensitive to oth... love\n", 388 | "18231 i find myself frustrated with christians becau... love\n", 389 | "10714 i am one of those people who feels like going ... joy\n", 390 | "35177 i feel especially pleased about this as this h... joy\n", 391 | "122177 i was struggling with these awful feelings and... joy\n", 392 | "26723 i feel so enraged but helpless at the same time anger" 393 | ] 394 | }, 395 | "execution_count": 10, 396 | "metadata": {}, 397 | "output_type": "execute_result" 398 | } 399 | ], 400 | "source": [ 401 | "data.head(10)" 402 | ] 403 | }, 404 | { 405 | "cell_type": "markdown", 406 | "metadata": {}, 407 | "source": [ 408 | "### 3.1 Preprocessing Data\n", 409 | "In the next steps we are going to create tokenize the text, create index mapping for words, and also construct a vocabulary. " 410 | ] 411 | }, 412 | { 413 | "cell_type": "markdown", 414 | "metadata": {}, 415 | "source": [ 416 | "#### Tokenization and Sampling" 417 | ] 418 | }, 419 | { 420 | "cell_type": "code", 421 | "execution_count": 11, 422 | "metadata": {}, 423 | "outputs": [], 424 | "source": [ 425 | "# retain only text that contain less that 70 tokens to avoid too much padding\n", 426 | "data[\"token_size\"] = data[\"text\"].apply(lambda x: len(x.split(' ')))\n", 427 | "data = data.loc[data['token_size'] < 70].copy()\n", 428 | "\n", 429 | "# sampling\n", 430 | "data = data.sample(n=50000);" 431 | ] 432 | }, 433 | { 434 | "cell_type": "markdown", 435 | "metadata": {}, 436 | "source": [ 437 | "#### Constructing Vocabulary and Index-Word Mapping" 438 | ] 439 | }, 440 | { 441 | "cell_type": "code", 442 | "execution_count": 12, 443 | "metadata": {}, 444 | "outputs": [], 445 | "source": [ 446 | "# This class creates a word -> index mapping (e.g,. 
\"dad\" -> 5) and vice-versa \n", 447 | "# (e.g., 5 -> \"dad\") for the dataset\n", 448 | "class ConstructVocab():\n", 449 | " def __init__(self, sentences):\n", 450 | " self.sentences = sentences\n", 451 | " self.word2idx = {}\n", 452 | " self.idx2word = {}\n", 453 | " self.vocab = set()\n", 454 | " self.create_index()\n", 455 | " \n", 456 | " def create_index(self):\n", 457 | " for s in self.sentences:\n", 458 | " # update with individual tokens\n", 459 | " self.vocab.update(s.split(' '))\n", 460 | " \n", 461 | " # sort the vocab\n", 462 | " self.vocab = sorted(self.vocab)\n", 463 | "\n", 464 | " # add a padding token with index 0\n", 465 | " self.word2idx[''] = 0\n", 466 | " \n", 467 | " # word to index mapping\n", 468 | " for index, word in enumerate(self.vocab):\n", 469 | " self.word2idx[word] = index + 1 # +1 because of pad token\n", 470 | " \n", 471 | " # index to word mapping\n", 472 | " for word, index in self.word2idx.items():\n", 473 | " self.idx2word[index] = word " 474 | ] 475 | }, 476 | { 477 | "cell_type": "code", 478 | "execution_count": 16, 479 | "metadata": {}, 480 | "outputs": [ 481 | { 482 | "data": { 483 | "text/plain": [ 484 | "['a',\n", 485 | " 'aa',\n", 486 | " 'aaa',\n", 487 | " 'aaaaaaaaaaaaaaaaggghhhh',\n", 488 | " 'aaaaall',\n", 489 | " 'aaaand',\n", 490 | " 'aaradhya',\n", 491 | " 'aaron',\n", 492 | " 'aashiqui',\n", 493 | " 'ab']" 494 | ] 495 | }, 496 | "execution_count": 16, 497 | "metadata": {}, 498 | "output_type": "execute_result" 499 | } 500 | ], 501 | "source": [ 502 | "# construct vocab and indexing\n", 503 | "inputs = ConstructVocab(data[\"text\"].values.tolist())\n", 504 | "\n", 505 | "# examples of what is in the vocab\n", 506 | "inputs.vocab[0:10]" 507 | ] 508 | }, 509 | { 510 | "cell_type": "markdown", 511 | "metadata": {}, 512 | "source": [ 513 | "### 3.2 Converting Data into Tensors \n", 514 | "For convenience we would like to convert the data into tensors. 
" 515 | ] 516 | }, 517 | { 518 | "cell_type": "code", 519 | "execution_count": 17, 520 | "metadata": {}, 521 | "outputs": [], 522 | "source": [ 523 | "# vectorize to tensor\n", 524 | "input_tensor = [[inputs.word2idx[s] for s in es.split(' ')] for es in data[\"text\"].values.tolist()]" 525 | ] 526 | }, 527 | { 528 | "cell_type": "code", 529 | "execution_count": 18, 530 | "metadata": {}, 531 | "outputs": [ 532 | { 533 | "data": { 534 | "text/plain": [ 535 | "[[11503,\n", 536 | " 3362,\n", 537 | " 6829,\n", 538 | " 26812,\n", 539 | " 15723,\n", 540 | " 17689,\n", 541 | " 11766,\n", 542 | " 24088,\n", 543 | " 18414,\n", 544 | " 16507,\n", 545 | " 16822,\n", 546 | " 9190,\n", 547 | " 11291,\n", 548 | " 26812,\n", 549 | " 8667,\n", 550 | " 6770,\n", 551 | " 11766,\n", 552 | " 15723,\n", 553 | " 22504],\n", 554 | " [11503,\n", 555 | " 19743,\n", 556 | " 24088,\n", 557 | " 14496,\n", 558 | " 14234,\n", 559 | " 11766,\n", 560 | " 10867,\n", 561 | " 865,\n", 562 | " 16507,\n", 563 | " 667,\n", 564 | " 16507,\n", 565 | " 24088,\n", 566 | " 24380,\n", 567 | " 11503,\n", 568 | " 26325,\n", 569 | " 16609,\n", 570 | " 20769,\n", 571 | " 15723,\n", 572 | " 9410,\n", 573 | " 27106,\n", 574 | " 865,\n", 575 | " 11503,\n", 576 | " 3397,\n", 577 | " 10876,\n", 578 | " 3214,\n", 579 | " 8660,\n", 580 | " 1,\n", 581 | " 25058,\n", 582 | " 16507,\n", 583 | " 10212,\n", 584 | " 24622,\n", 585 | " 10909,\n", 586 | " 8059,\n", 587 | " 24232,\n", 588 | " 21437,\n", 589 | " 1,\n", 590 | " 10689,\n", 591 | " 2332,\n", 592 | " 865,\n", 593 | " 11503,\n", 594 | " 6249,\n", 595 | " 8081,\n", 596 | " 8768,\n", 597 | " 16507,\n", 598 | " 10909,\n", 599 | " 2079,\n", 600 | " 865,\n", 601 | " 12917,\n", 602 | " 328,\n", 603 | " 24206,\n", 604 | " 14386,\n", 605 | " 15723,\n", 606 | " 21849,\n", 607 | " 5348]]" 608 | ] 609 | }, 610 | "execution_count": 18, 611 | "metadata": {}, 612 | "output_type": "execute_result" 613 | } 614 | ], 615 | "source": [ 616 | "# examples of what is in the input tensors\n", 617 | "input_tensor[0:2]" 618 | ] 619 | }, 620 | { 621 | "cell_type": "markdown", 622 | "metadata": {}, 623 | "source": [ 624 | "### 3.3 Padding data\n", 625 | "In order to train our recurrent neural network later on in the notebook, it is required padding to generate inputs of same length." 
626 | ] 627 | }, 628 | { 629 | "cell_type": "code", 630 | "execution_count": 19, 631 | "metadata": {}, 632 | "outputs": [], 633 | "source": [ 634 | "def max_length(tensor):\n", 635 | " return max(len(t) for t in tensor)" 636 | ] 637 | }, 638 | { 639 | "cell_type": "code", 640 | "execution_count": 20, 641 | "metadata": {}, 642 | "outputs": [ 643 | { 644 | "name": "stdout", 645 | "output_type": "stream", 646 | "text": [ 647 | "68\n" 648 | ] 649 | } 650 | ], 651 | "source": [ 652 | "# calculate the max_length of input tensor\n", 653 | "max_length_inp = max_length(input_tensor)\n", 654 | "print(max_length_inp)" 655 | ] 656 | }, 657 | { 658 | "cell_type": "code", 659 | "execution_count": 21, 660 | "metadata": {}, 661 | "outputs": [], 662 | "source": [ 663 | "# Padding the input and output tensor to the maximum length\n", 664 | "input_tensor = tf.keras.preprocessing.sequence.pad_sequences(input_tensor, \n", 665 | " maxlen=max_length_inp,\n", 666 | " padding='post')" 667 | ] 668 | }, 669 | { 670 | "cell_type": "code", 671 | "execution_count": 40, 672 | "metadata": {}, 673 | "outputs": [ 674 | { 675 | "data": { 676 | "text/plain": [ 677 | "array([[11503, 3362, 6829, 26812, 15723, 17689, 11766, 24088, 18414,\n", 678 | " 16507, 16822, 9190, 11291, 26812, 8667, 6770, 11766, 15723,\n", 679 | " 22504, 0, 0, 0, 0, 0, 0, 0, 0,\n", 680 | " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", 681 | " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", 682 | " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", 683 | " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", 684 | " 0, 0, 0, 0, 0],\n", 685 | " [11503, 19743, 24088, 14496, 14234, 11766, 10867, 865, 16507,\n", 686 | " 667, 16507, 24088, 24380, 11503, 26325, 16609, 20769, 15723,\n", 687 | " 9410, 27106, 865, 11503, 3397, 10876, 3214, 8660, 1,\n", 688 | " 25058, 16507, 10212, 24622, 10909, 8059, 24232, 21437, 1,\n", 689 | " 10689, 2332, 865, 11503, 6249, 8081, 8768, 16507, 10909,\n", 690 | " 2079, 865, 12917, 328, 24206, 14386, 15723, 21849, 5348,\n", 691 | " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", 692 | " 0, 0, 0, 0, 0]], dtype=int32)" 693 | ] 694 | }, 695 | "execution_count": 40, 696 | "metadata": {}, 697 | "output_type": "execute_result" 698 | } 699 | ], 700 | "source": [ 701 | "input_tensor[0:2]" 702 | ] 703 | }, 704 | { 705 | "cell_type": "markdown", 706 | "metadata": {}, 707 | "source": [ 708 | "### 3.4 Binarization\n", 709 | "We would like to binarize our target so that we can obtain one-hot encodings as target values. These are easier and more efficient to work with and will be useful when training the models." 
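As a small, self-contained illustration (not part of the notebook) of what the binarization below produces: with the six labels in the alphabetical order used throughout this notebook, a single emotion maps to a one-hot vector as follows.

```python
# Toy example; the next cell uses sklearn's MultiLabelBinarizer to do this for the whole column.
classes = ['anger', 'fear', 'joy', 'love', 'sadness', 'surprise']
one_hot = [1 if c == 'sadness' else 0 for c in classes]
print(one_hot)  # [0, 0, 0, 0, 1, 0] -- compare with target_tensor[0] below
```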
710 | ] 711 | }, 712 | { 713 | "cell_type": "code", 714 | "execution_count": 23, 715 | "metadata": {}, 716 | "outputs": [], 717 | "source": [ 718 | "### convert targets to one-hot encoding vectors\n", 719 | "emotions = list(set(data.emotions.unique()))\n", 720 | "num_emotions = len(emotions)\n", 721 | "# binarizer\n", 722 | "mlb = preprocessing.MultiLabelBinarizer()\n", 723 | "data_labels = [set(emos) & set(emotions) for emos in data[['emotions']].values]\n", 724 | "bin_emotions = mlb.fit_transform(data_labels)\n", 725 | "target_tensor = np.array(bin_emotions.tolist())" 726 | ] 727 | }, 728 | { 729 | "cell_type": "code", 730 | "execution_count": 24, 731 | "metadata": {}, 732 | "outputs": [ 733 | { 734 | "data": { 735 | "text/plain": [ 736 | "array([[0, 0, 0, 0, 1, 0],\n", 737 | " [1, 0, 0, 0, 0, 0]])" 738 | ] 739 | }, 740 | "execution_count": 24, 741 | "metadata": {}, 742 | "output_type": "execute_result" 743 | } 744 | ], 745 | "source": [ 746 | "target_tensor[0:2] " 747 | ] 748 | }, 749 | { 750 | "cell_type": "code", 751 | "execution_count": 30, 752 | "metadata": {}, 753 | "outputs": [ 754 | { 755 | "data": { 756 | "text/html": [ 757 | "
[mangled HTML render of data[0:2] omitted -- the same two rows appear in the text/plain output just below]
" 796 | ], 797 | "text/plain": [ 798 | " text emotions token_size\n", 799 | "11677 i can do without my phone in the presence of o... sadness 19\n", 800 | "41836 i remember the many lunches in hell and of all... anger 54" 801 | ] 802 | }, 803 | "execution_count": 30, 804 | "metadata": {}, 805 | "output_type": "execute_result" 806 | } 807 | ], 808 | "source": [ 809 | "data[0:2]" 810 | ] 811 | }, 812 | { 813 | "cell_type": "code", 814 | "execution_count": 26, 815 | "metadata": {}, 816 | "outputs": [], 817 | "source": [ 818 | "get_emotion = lambda t: np.argmax(t)" 819 | ] 820 | }, 821 | { 822 | "cell_type": "code", 823 | "execution_count": 27, 824 | "metadata": {}, 825 | "outputs": [ 826 | { 827 | "data": { 828 | "text/plain": [ 829 | "4" 830 | ] 831 | }, 832 | "execution_count": 27, 833 | "metadata": {}, 834 | "output_type": "execute_result" 835 | } 836 | ], 837 | "source": [ 838 | "get_emotion(target_tensor[0])" 839 | ] 840 | }, 841 | { 842 | "cell_type": "code", 843 | "execution_count": 32, 844 | "metadata": {}, 845 | "outputs": [], 846 | "source": [ 847 | "emotion_dict = {0: 'anger', 1: 'fear', 2: 'joy', 3: 'love', 4: 'sadness', 5: 'surprise'}" 848 | ] 849 | }, 850 | { 851 | "cell_type": "code", 852 | "execution_count": 35, 853 | "metadata": {}, 854 | "outputs": [ 855 | { 856 | "data": { 857 | "text/plain": [ 858 | "'sadness'" 859 | ] 860 | }, 861 | "execution_count": 35, 862 | "metadata": {}, 863 | "output_type": "execute_result" 864 | } 865 | ], 866 | "source": [ 867 | "emotion_dict[get_emotion(target_tensor[0])]" 868 | ] 869 | }, 870 | { 871 | "cell_type": "markdown", 872 | "metadata": {}, 873 | "source": [ 874 | "### 3.5 Split data\n", 875 | "We would like to split our data into a train and validation set. In addition, we also want a holdout dataset (test set) for evaluating the models." 876 | ] 877 | }, 878 | { 879 | "cell_type": "code", 880 | "execution_count": 36, 881 | "metadata": {}, 882 | "outputs": [ 883 | { 884 | "data": { 885 | "text/plain": [ 886 | "(40000, 40000, 5000, 5000, 5000, 5000)" 887 | ] 888 | }, 889 | "execution_count": 36, 890 | "metadata": {}, 891 | "output_type": "execute_result" 892 | } 893 | ], 894 | "source": [ 895 | "# Creating training and validation sets using an 80-20 split\n", 896 | "input_tensor_train, input_tensor_val, target_tensor_train, target_tensor_val = train_test_split(input_tensor, target_tensor, test_size=0.2)\n", 897 | "\n", 898 | "# Split the validataion further to obtain a holdout dataset (for testing) -- split 50:50\n", 899 | "input_tensor_val, input_tensor_test, target_tensor_val, target_tensor_test = train_test_split(input_tensor_val, target_tensor_val, test_size=0.5)\n", 900 | "\n", 901 | "# Show length\n", 902 | "len(input_tensor_train), len(target_tensor_train), len(input_tensor_val), len(target_tensor_val), len(input_tensor_test), len(target_tensor_test)" 903 | ] 904 | }, 905 | { 906 | "cell_type": "markdown", 907 | "metadata": {}, 908 | "source": [ 909 | "### 3.6 Data Loader\n", 910 | "We can also load the data into a data loader, which makes it easy to **manipulate the data**, **create batches**, and apply further **transformations**. In TensorFlow we can use the `tf.data` function." 
911 | ] 912 | }, 913 | { 914 | "cell_type": "code", 915 | "execution_count": 37, 916 | "metadata": {}, 917 | "outputs": [], 918 | "source": [ 919 | "TRAIN_BUFFER_SIZE = len(input_tensor_train)\n", 920 | "VAL_BUFFER_SIZE = len(input_tensor_val)\n", 921 | "TEST_BUFFER_SIZE = len(input_tensor_test)\n", 922 | "BATCH_SIZE = 64\n", 923 | "TRAIN_N_BATCH = TRAIN_BUFFER_SIZE // BATCH_SIZE\n", 924 | "VAL_N_BATCH = VAL_BUFFER_SIZE // BATCH_SIZE\n", 925 | "TEST_N_BATCH = TEST_BUFFER_SIZE // BATCH_SIZE\n", 926 | "\n", 927 | "embedding_dim = 256\n", 928 | "units = 1024\n", 929 | "vocab_inp_size = len(inputs.word2idx)\n", 930 | "target_size = num_emotions\n", 931 | "\n", 932 | "train_dataset = tf.data.Dataset.from_tensor_slices((input_tensor_train, \n", 933 | " target_tensor_train)).shuffle(TRAIN_BUFFER_SIZE)\n", 934 | "train_dataset = train_dataset.batch(BATCH_SIZE, drop_remainder=True)\n", 935 | "val_dataset = tf.data.Dataset.from_tensor_slices((input_tensor_val, \n", 936 | " target_tensor_val)).shuffle(VAL_BUFFER_SIZE)\n", 937 | "val_dataset = val_dataset.batch(BATCH_SIZE, drop_remainder=True)\n", 938 | "test_dataset = tf.data.Dataset.from_tensor_slices((input_tensor_test, \n", 939 | " target_tensor_test)).shuffle(TEST_BUFFER_SIZE)\n", 940 | "test_dataset = test_dataset.batch(BATCH_SIZE, drop_remainder=True)" 941 | ] 942 | }, 943 | { 944 | "cell_type": "code", 945 | "execution_count": 38, 946 | "metadata": {}, 947 | "outputs": [ 948 | { 949 | "name": "stdout", 950 | "output_type": "stream", 951 | "text": [ 952 | "\n", 953 | "\n", 954 | "\n" 955 | ] 956 | } 957 | ], 958 | "source": [ 959 | "# checking minibatch\n", 960 | "print(train_dataset)\n", 961 | "print(val_dataset)\n", 962 | "print(test_dataset)" 963 | ] 964 | }, 965 | { 966 | "cell_type": "markdown", 967 | "metadata": {}, 968 | "source": [ 969 | "## 4. Model\n", 970 | "After the data has been preprocessed, transformed and prepared it is now time to construct the model or the so-called computation graph that will be used to train our classification models. We are going to use a gated recurrent neural network (GRU), which is considered a more efficient version of a basic RNN. The figure below shows a high-level overview of the model details. 
\n", 971 | "\n", 972 | "![alt txt](img/gru-model.png)" 973 | ] 974 | }, 975 | { 976 | "cell_type": "markdown", 977 | "metadata": {}, 978 | "source": [ 979 | "### 4.1 Constructing the Model\n", 980 | "Below we construct our model:" 981 | ] 982 | }, 983 | { 984 | "cell_type": "code", 985 | "execution_count": 41, 986 | "metadata": {}, 987 | "outputs": [], 988 | "source": [ 989 | "### define the GRU component\n", 990 | "def gru(units):\n", 991 | " # If you have a GPU, we recommend using CuDNNGRU(provides a 3x speedup than GRU)\n", 992 | " # the code automatically does that.\n", 993 | " if tf.test.is_gpu_available():\n", 994 | " return tf.keras.layers.CuDNNGRU(units, \n", 995 | " return_sequences=True, \n", 996 | " return_state=True, \n", 997 | " recurrent_initializer='glorot_uniform')\n", 998 | " else:\n", 999 | " return tf.keras.layers.GRU(units, \n", 1000 | " return_sequences=True, \n", 1001 | " return_state=True, \n", 1002 | " recurrent_activation='relu', \n", 1003 | " recurrent_initializer='glorot_uniform')\n", 1004 | "\n", 1005 | "### Build the model\n", 1006 | "class EmoGRU(tf.keras.Model):\n", 1007 | " def __init__(self, vocab_size, embedding_dim, hidden_units, batch_sz, output_size):\n", 1008 | " super(EmoGRU, self).__init__()\n", 1009 | " self.batch_sz = batch_sz\n", 1010 | " self.hidden_units = hidden_units\n", 1011 | " \n", 1012 | " # layers\n", 1013 | " self.embedding = tf.keras.layers.Embedding(vocab_size, embedding_dim)\n", 1014 | " self.dropout = tf.keras.layers.Dropout(0.5)\n", 1015 | " self.gru = gru(self.hidden_units)\n", 1016 | " self.fc = tf.keras.layers.Dense(output_size)\n", 1017 | " \n", 1018 | " def call(self, x, hidden):\n", 1019 | " x = self.embedding(x) # batch_size X max_len X embedding_dim\n", 1020 | " output, state = self.gru(x, initial_state = hidden) # batch_size X max_len X hidden_units\n", 1021 | " out = output[:,-1,:]\n", 1022 | " out = self.dropout(out)\n", 1023 | " out = self.fc(out) # batch_size X max_len X output_size\n", 1024 | " return out, state\n", 1025 | " \n", 1026 | " def initialize_hidden_state(self):\n", 1027 | " return tf.zeros((self.batch_sz, self.hidden_units))" 1028 | ] 1029 | }, 1030 | { 1031 | "cell_type": "markdown", 1032 | "metadata": {}, 1033 | "source": [ 1034 | "### 4.1 Pretesting model\n", 1035 | "Since eager execution is enabled we can print the output of the model by passing a sample of the dataset and making sure that the dimensions of the outputs are as expected." 1036 | ] 1037 | }, 1038 | { 1039 | "cell_type": "code", 1040 | "execution_count": 42, 1041 | "metadata": {}, 1042 | "outputs": [ 1043 | { 1044 | "name": "stdout", 1045 | "output_type": "stream", 1046 | "text": [ 1047 | "(64, 6)\n" 1048 | ] 1049 | } 1050 | ], 1051 | "source": [ 1052 | "model = EmoGRU(vocab_inp_size, embedding_dim, units, BATCH_SIZE, target_size)\n", 1053 | "\n", 1054 | "# initialize the hidden state of the RNN\n", 1055 | "hidden = model.initialize_hidden_state()\n", 1056 | "\n", 1057 | "# testing for the first batch only then break the for loop\n", 1058 | "# Potential bug: out is not randomized enough\n", 1059 | "for (batch, (inp, targ)) in enumerate(train_dataset):\n", 1060 | " out, state = model(inp, hidden)\n", 1061 | " print(out.shape) \n", 1062 | " break" 1063 | ] 1064 | }, 1065 | { 1066 | "cell_type": "markdown", 1067 | "metadata": {}, 1068 | "source": [ 1069 | "## 5. Training the Model\n", 1070 | "Now that we have tested the model, it is time to train it. 
We will define our optimization algorithm, learning rate, and other necessary information to train the model." 1071 | ] 1072 | }, 1073 | { 1074 | "cell_type": "code", 1075 | "execution_count": 43, 1076 | "metadata": {}, 1077 | "outputs": [], 1078 | "source": [ 1079 | "optimizer = tf.train.AdamOptimizer()\n", 1080 | "\n", 1081 | "def loss_function(y, prediction):\n", 1082 | " return tf.losses.softmax_cross_entropy(y, logits=prediction)\n", 1083 | "\n", 1084 | "def accuracy(y, yhat):\n", 1085 | " #compare the predictions to the truth\n", 1086 | " yhat = tf.argmax(yhat, 1).numpy()\n", 1087 | " y = tf.argmax(y , 1).numpy()\n", 1088 | " return np.sum(y == yhat)/len(y)" 1089 | ] 1090 | }, 1091 | { 1092 | "cell_type": "code", 1093 | "execution_count": 44, 1094 | "metadata": {}, 1095 | "outputs": [ 1096 | { 1097 | "name": "stdout", 1098 | "output_type": "stream", 1099 | "text": [ 1100 | "Epoch 1 Batch 0 Val. Loss 0.3018\n", 1101 | "Epoch 1 Batch 100 Val. Loss 0.2750\n", 1102 | "Epoch 1 Batch 200 Val. Loss 0.2609\n", 1103 | "Epoch 1 Batch 300 Val. Loss 0.1847\n", 1104 | "Epoch 1 Batch 400 Val. Loss 0.0389\n", 1105 | "Epoch 1 Batch 500 Val. Loss 0.0250\n", 1106 | "Epoch 1 Batch 600 Val. Loss 0.0175\n", 1107 | "Epoch 1 Loss 0.1530 -- Train Acc. 0.6361 -- Val Acc. 0.9319\n", 1108 | "Time taken for 1 epoch 41.33251762390137 sec\n", 1109 | "\n", 1110 | "Epoch 2 Batch 0 Val. Loss 0.0263\n", 1111 | "Epoch 2 Batch 100 Val. Loss 0.0195\n", 1112 | "Epoch 2 Batch 200 Val. Loss 0.0247\n", 1113 | "Epoch 2 Batch 300 Val. Loss 0.0213\n", 1114 | "Epoch 2 Batch 400 Val. Loss 0.0381\n", 1115 | "Epoch 2 Batch 500 Val. Loss 0.0204\n", 1116 | "Epoch 2 Batch 600 Val. Loss 0.0086\n", 1117 | "Epoch 2 Loss 0.0208 -- Train Acc. 0.9358 -- Val Acc. 0.9431\n", 1118 | "Time taken for 1 epoch 41.09313082695007 sec\n", 1119 | "\n", 1120 | "Epoch 3 Batch 0 Val. Loss 0.0228\n", 1121 | "Epoch 3 Batch 100 Val. Loss 0.0345\n", 1122 | "Epoch 3 Batch 200 Val. Loss 0.0262\n", 1123 | "Epoch 3 Batch 300 Val. Loss 0.0169\n", 1124 | "Epoch 3 Batch 400 Val. Loss 0.0142\n", 1125 | "Epoch 3 Batch 500 Val. Loss 0.0175\n", 1126 | "Epoch 3 Batch 600 Val. Loss 0.0194\n", 1127 | "Epoch 3 Loss 0.0172 -- Train Acc. 0.9425 -- Val Acc. 0.9393\n", 1128 | "Time taken for 1 epoch 41.424949169158936 sec\n", 1129 | "\n", 1130 | "Epoch 4 Batch 0 Val. Loss 0.0080\n", 1131 | "Epoch 4 Batch 100 Val. Loss 0.0179\n", 1132 | "Epoch 4 Batch 200 Val. Loss 0.0104\n", 1133 | "Epoch 4 Batch 300 Val. Loss 0.0192\n", 1134 | "Epoch 4 Batch 400 Val. Loss 0.0105\n", 1135 | "Epoch 4 Batch 500 Val. Loss 0.0223\n", 1136 | "Epoch 4 Batch 600 Val. Loss 0.0192\n", 1137 | "Epoch 4 Loss 0.0161 -- Train Acc. 0.9463 -- Val Acc. 0.9463\n", 1138 | "Time taken for 1 epoch 41.28922462463379 sec\n", 1139 | "\n", 1140 | "Epoch 5 Batch 0 Val. Loss 0.0172\n", 1141 | "Epoch 5 Batch 100 Val. Loss 0.0188\n", 1142 | "Epoch 5 Batch 200 Val. Loss 0.0138\n", 1143 | "Epoch 5 Batch 300 Val. Loss 0.0220\n", 1144 | "Epoch 5 Batch 400 Val. Loss 0.0089\n", 1145 | "Epoch 5 Batch 500 Val. Loss 0.0146\n", 1146 | "Epoch 5 Batch 600 Val. Loss 0.0107\n", 1147 | "Epoch 5 Loss 0.0153 -- Train Acc. 0.9496 -- Val Acc. 0.9425\n", 1148 | "Time taken for 1 epoch 41.406941413879395 sec\n", 1149 | "\n", 1150 | "Epoch 6 Batch 0 Val. Loss 0.0157\n", 1151 | "Epoch 6 Batch 100 Val. Loss 0.0232\n", 1152 | "Epoch 6 Batch 200 Val. Loss 0.0162\n", 1153 | "Epoch 6 Batch 300 Val. Loss 0.0166\n", 1154 | "Epoch 6 Batch 400 Val. Loss 0.0100\n", 1155 | "Epoch 6 Batch 500 Val. Loss 0.0156\n", 1156 | "Epoch 6 Batch 600 Val. 
Loss 0.0116\n", 1157 | "Epoch 6 Loss 0.0145 -- Train Acc. 0.9530 -- Val Acc. 0.9415\n", 1158 | "Time taken for 1 epoch 41.41020727157593 sec\n", 1159 | "\n", 1160 | "Epoch 7 Batch 0 Val. Loss 0.0048\n", 1161 | "Epoch 7 Batch 100 Val. Loss 0.0097\n", 1162 | "Epoch 7 Batch 200 Val. Loss 0.0168\n", 1163 | "Epoch 7 Batch 300 Val. Loss 0.0154\n", 1164 | "Epoch 7 Batch 400 Val. Loss 0.0075\n", 1165 | "Epoch 7 Batch 500 Val. Loss 0.0104\n", 1166 | "Epoch 7 Batch 600 Val. Loss 0.0105\n", 1167 | "Epoch 7 Loss 0.0128 -- Train Acc. 0.9607 -- Val Acc. 0.9387\n", 1168 | "Time taken for 1 epoch 41.614171266555786 sec\n", 1169 | "\n", 1170 | "Epoch 8 Batch 0 Val. Loss 0.0038\n", 1171 | "Epoch 8 Batch 100 Val. Loss 0.0120\n", 1172 | "Epoch 8 Batch 200 Val. Loss 0.0007\n", 1173 | "Epoch 8 Batch 300 Val. Loss 0.0105\n", 1174 | "Epoch 8 Batch 400 Val. Loss 0.0278\n", 1175 | "Epoch 8 Batch 500 Val. Loss 0.0136\n", 1176 | "Epoch 8 Batch 600 Val. Loss 0.0149\n", 1177 | "Epoch 8 Loss 0.0116 -- Train Acc. 0.9665 -- Val Acc. 0.9373\n", 1178 | "Time taken for 1 epoch 41.726645946502686 sec\n", 1179 | "\n", 1180 | "Epoch 9 Batch 0 Val. Loss 0.0066\n", 1181 | "Epoch 9 Batch 100 Val. Loss 0.0156\n", 1182 | "Epoch 9 Batch 200 Val. Loss 0.0122\n", 1183 | "Epoch 9 Batch 300 Val. Loss 0.0114\n", 1184 | "Epoch 9 Batch 400 Val. Loss 0.0040\n", 1185 | "Epoch 9 Batch 500 Val. Loss 0.0078\n", 1186 | "Epoch 9 Batch 600 Val. Loss 0.0088\n", 1187 | "Epoch 9 Loss 0.0106 -- Train Acc. 0.9690 -- Val Acc. 0.9361\n", 1188 | "Time taken for 1 epoch 41.65887784957886 sec\n", 1189 | "\n", 1190 | "Epoch 10 Batch 0 Val. Loss 0.0225\n", 1191 | "Epoch 10 Batch 100 Val. Loss 0.0042\n", 1192 | "Epoch 10 Batch 200 Val. Loss 0.0122\n", 1193 | "Epoch 10 Batch 300 Val. Loss 0.0003\n", 1194 | "Epoch 10 Batch 400 Val. Loss 0.0362\n", 1195 | "Epoch 10 Batch 500 Val. Loss 0.0187\n", 1196 | "Epoch 10 Batch 600 Val. Loss 0.0089\n", 1197 | "Epoch 10 Loss 0.0105 -- Train Acc. 0.9702 -- Val Acc. 0.9401\n", 1198 | "Time taken for 1 epoch 41.729103326797485 sec\n", 1199 | "\n" 1200 | ] 1201 | } 1202 | ], 1203 | "source": [ 1204 | "EPOCHS = 10\n", 1205 | "\n", 1206 | "for epoch in range(EPOCHS):\n", 1207 | " start = time.time()\n", 1208 | " \n", 1209 | " ### Initialize hidden state\n", 1210 | " hidden = model.initialize_hidden_state()\n", 1211 | " total_loss = 0\n", 1212 | " train_accuracy, val_accuracy = 0, 0\n", 1213 | " \n", 1214 | " ### Training\n", 1215 | " for (batch, (inp, targ)) in enumerate(train_dataset):\n", 1216 | " loss = 0\n", 1217 | " \n", 1218 | " with tf.GradientTape() as tape:\n", 1219 | " predictions,_ = model(inp, hidden)\n", 1220 | " loss += loss_function(targ, predictions)\n", 1221 | " batch_loss = (loss / int(targ.shape[1])) \n", 1222 | " total_loss += batch_loss\n", 1223 | " \n", 1224 | " batch_accuracy = accuracy(targ, predictions)\n", 1225 | " train_accuracy += batch_accuracy\n", 1226 | " \n", 1227 | " gradients = tape.gradient(loss, model.variables)\n", 1228 | " optimizer.apply_gradients(zip(gradients, model.variables))\n", 1229 | " \n", 1230 | " if batch % 100 == 0:\n", 1231 | " print('Epoch {} Batch {} Val. 
Loss {:.4f}'.format(epoch + 1,\n", 1232 | " batch,\n", 1233 | " batch_loss.numpy()))\n", 1234 | " \n", 1235 | " ### Validating\n", 1236 | " hidden = model.initialize_hidden_state()\n", 1237 | "\n", 1238 | " for (batch, (inp, targ)) in enumerate(val_dataset): \n", 1239 | " predictions,_ = model(inp, hidden) \n", 1240 | " batch_accuracy = accuracy(targ, predictions)\n", 1241 | " val_accuracy += batch_accuracy\n", 1242 | " \n", 1243 | " print('Epoch {} Loss {:.4f} -- Train Acc. {:.4f} -- Val Acc. {:.4f}'.format(epoch + 1, \n", 1244 | " total_loss / TRAIN_N_BATCH, \n", 1245 | " train_accuracy / TRAIN_N_BATCH,\n", 1246 | " val_accuracy / VAL_N_BATCH))\n", 1247 | " print('Time taken for 1 epoch {} sec\\n'.format(time.time() - start))" 1248 | ] 1249 | }, 1250 | { 1251 | "cell_type": "code", 1252 | "execution_count": 45, 1253 | "metadata": {}, 1254 | "outputs": [ 1255 | { 1256 | "name": "stdout", 1257 | "output_type": "stream", 1258 | "text": [ 1259 | "_________________________________________________________________\n", 1260 | "Layer (type) Output Shape Param # \n", 1261 | "=================================================================\n", 1262 | "embedding (Embedding) multiple 6984192 \n", 1263 | "_________________________________________________________________\n", 1264 | "dropout (Dropout) multiple 0 \n", 1265 | "_________________________________________________________________\n", 1266 | "cu_dnngru (CuDNNGRU) multiple 3938304 \n", 1267 | "_________________________________________________________________\n", 1268 | "dense (Dense) multiple 6150 \n", 1269 | "=================================================================\n", 1270 | "Total params: 10,928,646\n", 1271 | "Trainable params: 10,928,646\n", 1272 | "Non-trainable params: 0\n", 1273 | "_________________________________________________________________\n" 1274 | ] 1275 | } 1276 | ], 1277 | "source": [ 1278 | "model.summary()" 1279 | ] 1280 | }, 1281 | { 1282 | "cell_type": "markdown", 1283 | "metadata": {}, 1284 | "source": [ 1285 | "## 6. Evaluation on the Testing Data\n", 1286 | "Now we will evaluate the model with the holdout dataset." 1287 | ] 1288 | }, 1289 | { 1290 | "cell_type": "code", 1291 | "execution_count": 57, 1292 | "metadata": {}, 1293 | "outputs": [ 1294 | { 1295 | "name": "stdout", 1296 | "output_type": "stream", 1297 | "text": [ 1298 | "Test Accuracy: 0.9326923076923077\n" 1299 | ] 1300 | } 1301 | ], 1302 | "source": [ 1303 | "test_accuracy = 0\n", 1304 | "all_predictions = []\n", 1305 | "x_raw = []\n", 1306 | "y_raw = []\n", 1307 | "\n", 1308 | "hidden = model.initialize_hidden_state()\n", 1309 | "\n", 1310 | "for (batch, (inp, targ)) in enumerate(test_dataset): \n", 1311 | " predictions,_ = model(inp, hidden) \n", 1312 | " batch_accuracy = accuracy(targ, predictions)\n", 1313 | " test_accuracy += batch_accuracy\n", 1314 | " \n", 1315 | " x_raw = x_raw + [x for x in inp]\n", 1316 | " y_raw = y_raw + [y for y in targ]\n", 1317 | " \n", 1318 | " all_predictions.append(predictions)\n", 1319 | " \n", 1320 | "print(\"Test Accuracy: \", test_accuracy/TEST_N_BATCH)" 1321 | ] 1322 | }, 1323 | { 1324 | "cell_type": "markdown", 1325 | "metadata": {}, 1326 | "source": [ 1327 | "### 6.1 Confusion Matrix\n", 1328 | "The test accuracy alone is not an interesting performance metric in this case. Let's plot a confusion matrix to get a drilled down view of how the model is performing with regards to each emotion." 
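The source of the cell that prints the classification report and confusion matrix below is not visible in this excerpt (only its output is). A minimal sketch of how those numbers could be produced with scikit-learn, assuming the `all_predictions` and `y_raw` lists collected in the evaluation loop above:

```python
# Sketch only -- not necessarily the notebook's own cell.
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix

y_pred = np.argmax(np.concatenate([p.numpy() for p in all_predictions]), axis=1)
y_true = np.argmax(np.stack([y.numpy() for y in y_raw]), axis=1)

target_names = [emotion_dict[i] for i in range(len(emotion_dict))]
print(classification_report(y_true, y_pred, target_names=target_names))
print(confusion_matrix(y_true, y_pred))  # rows = expected, columns = predicted
```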
1329 | ] 1330 | }, 1331 | { 1332 | "cell_type": "code", 1333 | "execution_count": 73, 1334 | "metadata": {}, 1335 | "outputs": [ 1336 | { 1337 | "name": "stdout", 1338 | "output_type": "stream", 1339 | "text": [ 1340 | "Default Classification report\n", 1341 | " precision recall f1-score support\n", 1342 | "\n", 1343 | " anger 0.96 0.90 0.93 687\n", 1344 | " fear 0.82 0.96 0.88 532\n", 1345 | " joy 0.94 0.97 0.96 1765\n", 1346 | " love 0.89 0.79 0.84 425\n", 1347 | " sadness 0.97 0.97 0.97 1405\n", 1348 | " surprise 0.89 0.66 0.76 178\n", 1349 | "\n", 1350 | "avg / total 0.93 0.93 0.93 4992\n", 1351 | "\n", 1352 | "\n", 1353 | "Accuracy:\n", 1354 | "0.9326923076923077\n", 1355 | "Correct Predictions: 4656\n", 1356 | "precision: 0.91\n", 1357 | "recall: 0.88\n", 1358 | "f1: 0.89\n", 1359 | "\n", 1360 | "confusion matrix\n", 1361 | " [[ 621 36 3 1 26 0]\n", 1362 | " [ 6 511 1 0 5 9]\n", 1363 | " [ 5 2 1711 39 4 4]\n", 1364 | " [ 1 0 86 336 2 0]\n", 1365 | " [ 14 29 2 0 1359 1]\n", 1366 | " [ 0 46 14 0 0 118]]\n", 1367 | "(row=expected, col=predicted)\n" 1368 | ] 1369 | }, 1370 | { 1371 | "data": { 1372 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZwAAAGGCAYAAABPDDfEAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzsnXd4FFXbh+8nQOgdkhAEQUGUktADQRFCFwEVKSoQQFGR3hSUruhrBemivmKXKiCiKNJEIKGpn10UXlFMQmjR0JPz/TGzYTfZDdklJBvy3Lnm2uyZ55zzmzOz88ypI8YYFEVRFOVKE5DbAhRFUZT8gTocRVEUJUdQh6MoiqLkCOpwFEVRlBxBHY6iKIqSI6jDURRFUXIEdTiKoihKjqAOR1EURckR1OEoiqIoOYI6HEVRFCVHUIejKIqi5AgFc1uAoihKfkdEigCBl5HEOWPMmezSc6UQXbxTURQl9xCRIhQsdpoLpy4nmTigur87Ha3hKIqi5C6BXDhF4ToDoIAPlZyUc5z9/o0QrBqSOhxFURTlEhQIRHxwOHmpjUodjqIoij8ggIhv8fII6nAURVH8AQmwNl/i5RHU4SiKovgDIj7WcPJOFSfvuEZFURQlT6M1HEVRFH9Am9QURVGUHCEfNKmpw1EURfELfKzh5KGeEXU4iqIo/kA+qOHkHdeoKIqi5Gm0hqMoiuIP6KABRVEUJUfIB01q6nAURVH8gXxQw8k7ShVFUZQ8jdZwFEVR/AFtUlMURVFyhHzQpKYOR1EUxR8Q8dHh5J0aTt5xjYqiKEqeRms4iqIo/kCAWJsv8fII6nAURVH8Ae3DURRFUXKEfDBKLe+4RkVRFCVPow5HAUBEaorIZyJyUkSMiNyRzelXs9Ptn53pXg2IyEERWZzbOvICItLKvo5aOYUtFpGDOawj+69nR5OaL1seIe8ozQeIyPUi8oqI/C4iZ0QkSUS+EpERIlL0Cmf/JlAPeALoC+y+wvlddYhIbRGZKiLVclGD40ZoRKS7m/1T7X0VckOfkgmOJjVftjyC9uH4CSLSGVgGnAXeAr4DAoGbgeeBOsCDVyjvokBzYIYxZu6VyAP4H1AUOH+F0vcHagNTgM3AQS/i1QJSr4CeySKy0hhjrkDa/sQgroaHZx00oOQEIlId+ADrphxljPnbafc8EakBdL6CEiranyeuVAb2Te/MlUo/ryEiAhQxxpw2xpy9All8DdQH7gRWXoH0ARCR4saY5CuVflYwxlwdDzE6aEDJIR4FSgD3p3M2ABhj9htjXnZ8F5GCIjJJRH4TkbN2H8DTIlLYOZ4dvlZEbhaRWLuZ7ncR6edkMxXL0QE8bze3HLT3uW0bdzTLpAtrJyLbROSEiPwrIj+LyNNO+922eYtIlIh8KSLJdtzVInKTu/xEpIat6YTd1/SGiBTLvGhBRDaLyHciEiYiW0TklIjsF5G77f23ikiMiJy2dbdNF/9aEZlv7zstIkdFZJlz05l9XMvsr5ucmrVapTsXHURkN3AaeMhp32L7fxGRTSJyRESCnNIPFJH/s8958UsdM9YDzC9YtZxL3pFEpIeI7LGPL1FE3hGRyulsFtvn9noRWSci/wDv5lQZZ6Ld5Tq1tRgPW38nuzIiMktEDtm/o/0i8piIa5XBtltsX3MnRORNoMyldCkZUYfjH3QBfjfGbM+i/WvAdGAvMArYAkzAusmkpwawHPgcGAMcBxaLSB17/0o7DYD3sfpvRnoj3k5rLVAYmGznswZocYl4bYH1QBAwFXgJiAS+8nCjWQqUxDrWpUB/rCasrFDW1hiD5eDPAh+ISC+sclsHjAeKA8tFpKRT3Ca2rg+A4cBCoA2w2cnhbQVm2/8/jVWOfYEfndKphVXGnwMjsGohLtg1wYFAETsfB9OwmlUHZLFGkQI8BYRj1XI8Yt+El9pxJgCvAncB20Qk/Y21INY5SwDGAiuc9l3pMs4qM7hY/o5tvb0vwT7mYli/mz5YTdjDga+AZ7CuQ0fZCLDaTuMdYCJwDVafZ/aSDwYNaJNaLiMipYDKWBd1VuzDgWjgNWPMIDt4vogkAGNFpLUxZpNTlFpAS2PMl3b8pcAhYAAw1hjzrYgkATOBvcaYd3w4jHZY/U2djDGJXsR7HjgGNDfGHLP1rQL2Yd1go9PZ7zPG3O/4IiLlgfuBx7KQVyhwrzHmfTvu58BPwHtApDEmxg7/Eevm1B1YbMf92Biz3DkxEfkI2GHbvW2M+V1EvsS6cX1ujNnsRkMNoKMxZr2bfWkYYw6IyBjgFRG5D9gPjANeNsZszcKxOngPmIRVy/nQXV+OiBQCnsXqM2xpjDljh2/Dch6jcHXqhYFlxpgJbvK7omWc1YM2xnyeLp1IIAr4rzFmnR08GrgeaGCM+dUOe0VED
[... base64-encoded PNG data for this cell's display_data output (the normalized confusion-matrix plot) truncated for readability ...]AAAABJRU5ErkJggg==\n", 1373 | "text/plain": [ 1374 | "
" 1375 | ] 1376 | }, 1377 | "metadata": {}, 1378 | "output_type": "display_data" 1379 | } 1380 | ], 1381 | "source": [ 1382 | "import helpers.evaluate as ev\n", 1383 | "evaluator = ev.Evaluate()\n", 1384 | "import pandas as pd\n", 1385 | "\n", 1386 | "final_predictions = []\n", 1387 | "\n", 1388 | "for p in all_predictions:\n", 1389 | " for sub_p in p:\n", 1390 | " final_predictions.append(sub_p)\n", 1391 | "\n", 1392 | "predictions = [np.argmax(p).item() for p in final_predictions]\n", 1393 | "targets = [np.argmax(t).item() for t in y_raw]\n", 1394 | "correct_predictions = float(np.sum(predictions == targets))\n", 1395 | "\n", 1396 | "# predictions\n", 1397 | "predictions_human_readable = ((x_raw, predictions))\n", 1398 | "# actual targets\n", 1399 | "target_human_readable = ((x_raw, targets))\n", 1400 | "\n", 1401 | "emotion_dict = {0: 'anger', 1: 'fear', 2: 'joy', 3: 'love', 4: 'sadness', 5: 'surprise'}\n", 1402 | "\n", 1403 | "# convert results into dataframe\n", 1404 | "model_test_result = pd.DataFrame(predictions_human_readable[1],columns=[\"emotion\"])\n", 1405 | "test = pd.DataFrame(target_human_readable[1], columns=[\"emotion\"])\n", 1406 | "\n", 1407 | "model_test_result.emotion = model_test_result.emotion.map(lambda x: emotion_dict[int(float(x))])\n", 1408 | "test.emotion = test.emotion.map(lambda x: emotion_dict[int(x)])\n", 1409 | "\n", 1410 | "evaluator.evaluate_class(model_test_result.emotion, test.emotion );" 1411 | ] 1412 | }, 1413 | { 1414 | "cell_type": "markdown", 1415 | "metadata": {}, 1416 | "source": [ 1417 | "## Final Words\n", 1418 | "You have learned how to perform neural-based emotion recognition using RNNs. There are many things you can do after you have completed this tutorial. You can attempt the exercises outlined in the \"Outline\" section of this notebook. You can also try other types of neural architectures such as LSTMs, Bi-LSTMS, attentions models, and CNNs. In addition, you can also store the models and conduct transfer learning to other emotion-related tasks. 
\n" 1419 | ] 1420 | }, 1421 | { 1422 | "cell_type": "markdown", 1423 | "metadata": {}, 1424 | "source": [ 1425 | "---" 1426 | ] 1427 | }, 1428 | { 1429 | "cell_type": "markdown", 1430 | "metadata": {}, 1431 | "source": [ 1432 | "## References\n", 1433 | "\n", 1434 | "- [Introduction to what is a Tensor](https://www.youtube.com/watch?v=hCSjWCVrphc&t=1137s)\n", 1435 | "- [Deep Learning for NLP](https://docs.google.com/presentation/d/1cf2H1qMvP1rdKUF5000ifOIRv1_b0bvj0ZTVL7-RaVE/edit?usp=sharing)\n", 1436 | "- [Enable Eager Execution on TensorFlow](https://colab.research.google.com/github/zaidalyafeai/Notebooks/blob/master/Eager_Execution_Gradient_.ipynb)\n", 1437 | "- [Basic Text Classification](https://www.tensorflow.org/tutorials/keras/basic_text_classification)\n", 1438 | "- [Deep Learning for NLP: An Overview of Recent Trends](https://medium.com/dair-ai/deep-learning-for-nlp-an-overview-of-recent-trends-d0d8f40a776d)" 1439 | ] 1440 | } 1441 | ], 1442 | "metadata": { 1443 | "kernelspec": { 1444 | "display_name": "Python 3", 1445 | "language": "python", 1446 | "name": "python3" 1447 | }, 1448 | "language_info": { 1449 | "codemirror_mode": { 1450 | "name": "ipython", 1451 | "version": 3 1452 | }, 1453 | "file_extension": ".py", 1454 | "mimetype": "text/x-python", 1455 | "name": "python", 1456 | "nbconvert_exporter": "python", 1457 | "pygments_lexer": "ipython3", 1458 | "version": "3.6.1" 1459 | } 1460 | }, 1461 | "nbformat": 4, 1462 | "nbformat_minor": 2 1463 | } 1464 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ## Deep Learning Based NLP 2 | This repository contains python notebooks related to several deep learning based NLP tasks such as emotion recognition and neural machine translation. It will provide implementations based on several deep learning frameworks such as PyTorch and TensorFlow. 
3 | 4 | ## Emotion Recognition with GRU 5 | - [Deep Learning Based Emotion Recognition With PyTorch](https://github.com/omarsar/nlp_pytorch_tensorflow_notebooks/blob/master/Deep%20Learning%20Emotion%20Recognition%20PyTorch.ipynb) 6 | - [Deep Learning Based Emotion Recognition With TensorFlow](https://github.com/omarsar/nlp_pytorch_tensorflow_notebooks/blob/master/Deep%20Learning%20Emotion%20Recognition%20TensorFlow%20.ipynb) 7 | 8 | --- 9 | Author: [Elvis Saravia](https://twitter.com/omarsar0) -------------------------------------------------------------------------------- /data/merged_training.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/omarsar/nlp_pytorch_tensorflow_notebooks/2fc77b0aa2b1a4be4558f14365c429cbde892586/data/merged_training.pkl -------------------------------------------------------------------------------- /helpers/.ipynb_checkpoints/evaluate-checkpoint.py: -------------------------------------------------------------------------------- 1 | from sklearn import metrics 2 | from sklearn.preprocessing import LabelEncoder 3 | from scipy import stats 4 | import numpy as np 5 | import pandas as pd 6 | import matplotlib.pyplot as plt 7 | import itertools 8 | 9 | """ Author: Elvis Saravia (Adapted from Renaud)""" 10 | 11 | class Evaluate(): 12 | 13 | def va_dist(cls, prediction, target, va_df, binarizer, name='', silent=False): 14 | """ Computes distance between actual and prediction through cosine distance """ 15 | va_matrix = va_df.loc[binarizer.classes_][['valence','arousal']].values 16 | y_va = target.dot(va_matrix) 17 | F_va = prediction.dot(va_matrix) 18 | 19 | # dist is a one row vector with size of the test data passed(emotion) 20 | dist = metrics.pairwise.paired_cosine_distances(y_va, F_va) 21 | res = stats.describe(dist) 22 | 23 | # print by default (if silent=False) 24 | if not silent: 25 | print('%s\tmean: %f\tvariance: %f' % (name, res.mean, res.variance)) 26 | 27 | return { 28 | 'distances': dist, 29 | 'dist_stat': res 30 | } 31 | 32 | def evaluate_class(cls, predictions, target, target2=None, silent=False): 33 | """ Compute only the predicted class """ 34 | p_2_annotation = dict() 35 | 36 | precision_recall_fscore_support = [ 37 | (pair[0], pair[1].mean()) for pair in zip( 38 | ['precision', 'recall', 'f1', 'support'], 39 | metrics.precision_recall_fscore_support(target, predictions) 40 | ) 41 | ] 42 | 43 | metrics.precision_recall_fscore_support(target, predictions) 44 | 45 | # confusion matrix 46 | le = LabelEncoder() 47 | target_le = le.fit_transform(target) 48 | predictions_le = le.transform(predictions) 49 | cm = metrics.confusion_matrix(target_le, predictions_le) 50 | 51 | # prediction if two annotations are given on test data 52 | if target2: 53 | p_2_annotation = pd.DataFrame( 54 | [(pred, pred in set([t1,t2])) for pred, t1, t2 in zip(predictions, target, target2)], 55 | columns=['emo','success'] 56 | ).groupby('emo').apply(lambda emo: emo.success.sum()/ len(emo.success)).to_dict() 57 | 58 | if not silent: 59 | print("Default Classification report") 60 | print(metrics.classification_report(target, predictions)) 61 | 62 | # print if target2 was provided 63 | if len(p_2_annotation) > 0: 64 | print('\nPrecision on 2 annotations:') 65 | for emo in p_2_annotation: 66 | print("%s: %.2f" % (emo, p_2_annotation[emo])) 67 | 68 | # print accuracies, precision, recall, and f1 69 | print('\nAccuracy:') 70 | print(metrics.accuracy_score(target, predictions)) 71 | print("Correct Predictions: ", 
metrics.accuracy_score(target, predictions,normalize=False)) 72 | for to_print in precision_recall_fscore_support[:3]: 73 | print( "%s: %.2f" % to_print ) 74 | 75 | # normalizing the values of the consfusion matrix 76 | print('\nconfusion matrix\n %s' % cm) 77 | print('(row=expected, col=predicted)') 78 | cm_normalized = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis] 79 | cls.plot_confusion_matrix(cm_normalized, le.classes_, 'Confusion matrix Normalized') 80 | 81 | return { 82 | 'precision_recall_fscore_support': precision_recall_fscore_support, 83 | 'accuracy': metrics.accuracy_score(target, predictions), 84 | 'p_2_annotation': p_2_annotation, 85 | 'confusion_matrix': cm 86 | } 87 | 88 | def predict_class(cls, X_train, y_train, X_test, y_test, 89 | pipeline, silent=False, target2=None): 90 | """ Predicted class,then run some performance evaluation """ 91 | pipeline.fit(X_train, y_train) 92 | predictions = pipeline.predict(X_test) 93 | print("predictions computed....") 94 | return cls.evaluate_class(predictions, y_test, target2, silent) 95 | 96 | def evaluate_prob(cls, prediction, target_rank, target_class, binarizer, va_df, silent=False, target2=None): 97 | """ Evaluate through probability """ 98 | # Run normal class evaluator 99 | predict_class = binarizer.classes_[prediction.argmax(axis=1)] 100 | class_eval = cls.evaluate_class(predict_class, target_class, target2, silent) 101 | 102 | print("==============================================================================") 103 | print("Finished from evaluation class, now continue with the probability evaluation") 104 | 105 | if not silent: 106 | print('\n - First Emotion Classification Metrics -') 107 | print('\n - Multiple Emotion rank Metrics -') 108 | print('VA Cosine Distance') 109 | 110 | classes_dist = [ 111 | ( 112 | emo, 113 | cls.va_dist( 114 | prediction[np.array(target_class) == emo], 115 | target_rank[np.array(target_class) == emo], 116 | va_df, 117 | binarizer, 118 | emo, 119 | silent) 120 | ) for emo in binarizer.classes_ 121 | ] 122 | avg_dist = cls.va_dist(prediction, target_rank, va_df, binarizer, 'avg', silent) 123 | 124 | coverage_error = metrics.coverage_error(target_rank, prediction) 125 | average_precision_score = metrics.average_precision_score(target_rank, prediction) 126 | label_ranking_average_precision_score = metrics.label_ranking_average_precision_score(target_rank, prediction) 127 | label_ranking_loss = metrics.label_ranking_loss(target_rank, prediction) 128 | 129 | # recall at 2 130 | # obtain top two predictions 131 | top2_pred = [set([binarizer.classes_[i[0]], binarizer.classes_[i[1]]]) for i in (prediction.argsort(axis=1).T[-2:].T)] 132 | recall_at_2 = pd.DataFrame( 133 | [ 134 | t in p for t, p in zip(target_class, top2_pred) 135 | ], index=target_class, columns=['recall@2']).groupby(level=0).apply(lambda emo: emo.sum()/len(emo)) 136 | 137 | # combine target into sets 138 | if target2: 139 | union_target = [set(t) for t in zip(target_class, target2)] 140 | else: 141 | union_target = [set(t) for t in zip(target_class)] 142 | 143 | # precision at k 144 | top_k_pred = [ 145 | [set([binarizer.classes_[i] for i in i_list]) for i_list in (prediction.argsort(axis=1).T[-i:].T)] 146 | for i in range(2, len(binarizer.classes_)+1)] 147 | precision_at_k = [ 148 | ('p@' + str(k+2), np.array([len(t & p)/(k+2) for t, p in zip(union_target, top_k_pred[k])]).mean()) 149 | for k in range(len(top_k_pred))] 150 | 151 | # do this if silent= False 152 | if not silent: 153 | print('\n') 154 | print(recall_at_2) 155 | 
print('\n') 156 | print('p@k') 157 | for pk in precision_at_k: 158 | print(pk[0] + ':\t' + str(pk[1])) 159 | print('\ncoverage_error: %f' % coverage_error) 160 | print('average_precision_score: %f' % average_precision_score) 161 | print('label_ranking_average_precision_score: %f' % label_ranking_average_precision_score) 162 | print('label_ranking_loss: %f' % label_ranking_loss) 163 | 164 | return { 165 | 'class_eval': class_eval, 166 | 'recall_at_2': recall_at_2.to_dict(), 167 | 'precision_at_2': precision_at_k, 168 | 'classes_dist': classes_dist, 169 | 'avg_dist': avg_dist, 170 | 'coverage_error': coverage_error, 171 | 'average_precision_score': average_precision_score, 172 | 'label_ranking_average_precision_score': label_ranking_average_precision_score, 173 | 'label_ranking_loss': label_ranking_loss 174 | } 175 | 176 | 177 | def predict_prob(cls, X_train, y_train, X_test, y_test, label_test, pipeline, binarizer, va_df, silent=False, target2=None): 178 | """ Output predcations based on training and labels """ 179 | pipeline.fit(X_train, y_train) 180 | predictions = pipeline.predict_proba(X_test) 181 | pred_to_mlb = [np.where(pipeline.classes_ == emo)[0][0] for emo in binarizer.classes_.tolist()] 182 | return cls.evaluate_prob(predictions[:,pred_to_mlb], y_test, label_test, binarizer, va_df, silent, target2) 183 | 184 | 185 | def plot_confusion_matrix(cls, cm, my_tags, title='Confusion matrix', cmap=plt.cm.Blues): 186 | """ Plotting the confusion_matrix""" 187 | plt.rc('figure', figsize=(4, 4), dpi=100) 188 | plt.imshow(cm, interpolation='nearest', cmap=cmap) 189 | plt.title(title) 190 | plt.colorbar() 191 | tick_marks = np.arange(len(my_tags)) 192 | target_names = my_tags 193 | plt.xticks(tick_marks, target_names, rotation=45) 194 | plt.yticks(tick_marks, target_names) 195 | 196 | # add normalized valued inside the Confusion matrix 197 | fmt = '.2f' 198 | thresh = cm.max() / 2. 
199 | for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])): 200 | plt.text(j, i, format(cm[i, j], fmt), horizontalalignment="center", color="white" if cm[i, j] > thresh else "black") 201 | 202 | plt.tight_layout() 203 | plt.ylabel('True label') 204 | plt.xlabel('Predicted label') 205 | -------------------------------------------------------------------------------- /helpers/__pycache__/evaluate.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/omarsar/nlp_pytorch_tensorflow_notebooks/2fc77b0aa2b1a4be4558f14365c429cbde892586/helpers/__pycache__/evaluate.cpython-36.pyc -------------------------------------------------------------------------------- /helpers/__pycache__/pickle_helpers.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/omarsar/nlp_pytorch_tensorflow_notebooks/2fc77b0aa2b1a4be4558f14365c429cbde892586/helpers/__pycache__/pickle_helpers.cpython-36.pyc -------------------------------------------------------------------------------- /helpers/evaluate.py: -------------------------------------------------------------------------------- 1 | from sklearn import metrics 2 | from sklearn.preprocessing import LabelEncoder 3 | from scipy import stats 4 | import numpy as np 5 | import pandas as pd 6 | import matplotlib.pyplot as plt 7 | import itertools 8 | 9 | """ Author: Elvis Saravia (Adapted from Renaud)""" 10 | 11 | class Evaluate(): 12 | 13 | def va_dist(cls, prediction, target, va_df, binarizer, name='', silent=False): 14 | """ Computes distance between actual and prediction through cosine distance """ 15 | va_matrix = va_df.loc[binarizer.classes_][['valence','arousal']].values 16 | y_va = target.dot(va_matrix) 17 | F_va = prediction.dot(va_matrix) 18 | 19 | # dist is a one row vector with size of the test data passed(emotion) 20 | dist = metrics.pairwise.paired_cosine_distances(y_va, F_va) 21 | res = stats.describe(dist) 22 | 23 | # print by default (if silent=False) 24 | if not silent: 25 | print('%s\tmean: %f\tvariance: %f' % (name, res.mean, res.variance)) 26 | 27 | return { 28 | 'distances': dist, 29 | 'dist_stat': res 30 | } 31 | 32 | def evaluate_class(cls, predictions, target, target2=None, silent=False): 33 | """ Compute only the predicted class """ 34 | p_2_annotation = dict() 35 | 36 | precision_recall_fscore_support = [ 37 | (pair[0], pair[1].mean()) for pair in zip( 38 | ['precision', 'recall', 'f1', 'support'], 39 | metrics.precision_recall_fscore_support(target, predictions) 40 | ) 41 | ] 42 | 43 | metrics.precision_recall_fscore_support(target, predictions) 44 | 45 | # confusion matrix 46 | le = LabelEncoder() 47 | target_le = le.fit_transform(target) 48 | predictions_le = le.transform(predictions) 49 | cm = metrics.confusion_matrix(target_le, predictions_le) 50 | 51 | # prediction if two annotations are given on test data 52 | if target2: 53 | p_2_annotation = pd.DataFrame( 54 | [(pred, pred in set([t1,t2])) for pred, t1, t2 in zip(predictions, target, target2)], 55 | columns=['emo','success'] 56 | ).groupby('emo').apply(lambda emo: emo.success.sum()/ len(emo.success)).to_dict() 57 | 58 | if not silent: 59 | print("Default Classification report") 60 | print(metrics.classification_report(target, predictions)) 61 | 62 | # print if target2 was provided 63 | if len(p_2_annotation) > 0: 64 | print('\nPrecision on 2 annotations:') 65 | for emo in p_2_annotation: 66 | print("%s: %.2f" % (emo, 
p_2_annotation[emo]))
 67 | 
 68 |         # print accuracy, precision, recall, and F1
 69 |         print('\nAccuracy:')
 70 |         print(metrics.accuracy_score(target, predictions))
 71 |         print("Correct Predictions: ", metrics.accuracy_score(target, predictions, normalize=False))
 72 |         for to_print in precision_recall_fscore_support[:3]:
 73 |             print("%s: %.2f" % to_print)
 74 | 
 75 |         # print the confusion matrix, then normalize it row-wise and plot it
 76 |         print('\nconfusion matrix\n %s' % cm)
 77 |         print('(row=expected, col=predicted)')
 78 |         cm_normalized = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]
 79 |         cls.plot_confusion_matrix(cm_normalized, le.classes_, 'Confusion matrix Normalized')
 80 | 
 81 |         return {
 82 |             'precision_recall_fscore_support': precision_recall_fscore_support,
 83 |             'accuracy': metrics.accuracy_score(target, predictions),
 84 |             'p_2_annotation': p_2_annotation,
 85 |             'confusion_matrix': cm
 86 |         }
 87 | 
 88 |     def predict_class(cls, X_train, y_train, X_test, y_test,
 89 |                       pipeline, silent=False, target2=None):
 90 |         """ Predict the class, then run the performance evaluation """
 91 |         pipeline.fit(X_train, y_train)
 92 |         predictions = pipeline.predict(X_test)
 93 |         print("predictions computed....")
 94 |         return cls.evaluate_class(predictions, y_test, target2, silent)
 95 | 
 96 |     def evaluate_prob(cls, prediction, target_rank, target_class, binarizer, va_df, silent=False, target2=None):
 97 |         """ Evaluate predicted probabilities """
 98 |         # run the regular class evaluator first
 99 |         predict_class = binarizer.classes_[prediction.argmax(axis=1)]
100 |         class_eval = cls.evaluate_class(predict_class, target_class, target2, silent)
101 | 
102 |         print("==============================================================================")
103 |         print("Finished with the class evaluation, now continuing with the probability evaluation")
104 | 
105 |         if not silent:
106 |             print('\n - First Emotion Classification Metrics -')
107 |             print('\n - Multiple Emotion Rank Metrics -')
108 |             print('VA Cosine Distance')
109 | 
110 |         classes_dist = [
111 |             (
112 |                 emo,
113 |                 cls.va_dist(
114 |                     prediction[np.array(target_class) == emo],
115 |                     target_rank[np.array(target_class) == emo],
116 |                     va_df,
117 |                     binarizer,
118 |                     emo,
119 |                     silent)
120 |             ) for emo in binarizer.classes_
121 |         ]
122 |         avg_dist = cls.va_dist(prediction, target_rank, va_df, binarizer, 'avg', silent)
123 | 
124 |         coverage_error = metrics.coverage_error(target_rank, prediction)
125 |         average_precision_score = metrics.average_precision_score(target_rank, prediction)
126 |         label_ranking_average_precision_score = metrics.label_ranking_average_precision_score(target_rank, prediction)
127 |         label_ranking_loss = metrics.label_ranking_loss(target_rank, prediction)
128 | 
129 |         # recall at 2:
130 |         # obtain the top two predicted classes for each sample
131 |         top2_pred = [set([binarizer.classes_[i[0]], binarizer.classes_[i[1]]]) for i in (prediction.argsort(axis=1).T[-2:].T)]
132 |         recall_at_2 = pd.DataFrame(
133 |             [
134 |                 t in p for t, p in zip(target_class, top2_pred)
135 |             ], index=target_class, columns=['recall@2']).groupby(level=0).apply(lambda emo: emo.sum()/len(emo))
136 | 
137 |         # combine the targets into sets
138 |         if target2:
139 |             union_target = [set(t) for t in zip(target_class, target2)]
140 |         else:
141 |             union_target = [set(t) for t in zip(target_class)]
142 | 
143 |         # precision at k
144 |         top_k_pred = [
145 |             [set([binarizer.classes_[i] for i in i_list]) for i_list in (prediction.argsort(axis=1).T[-i:].T)]
146 |             for i in range(2, len(binarizer.classes_)+1)]
147 |         precision_at_k = [
148 |             ('p@' + str(k+2), np.array([len(t & p)/(k+2) for t, p in zip(union_target, top_k_pred[k])]).mean())
149 |             for k in range(len(top_k_pred))]
150 | 
151 |         # print the ranking metrics unless running in silent mode
152 |         if not silent:
153 |             print('\n')
154 |             print(recall_at_2)
155 |             print('\n')
156 |             print('p@k')
157 |             for pk in precision_at_k:
158 |                 print(pk[0] + ':\t' + str(pk[1]))
159 |             print('\ncoverage_error: %f' % coverage_error)
160 |             print('average_precision_score: %f' % average_precision_score)
161 |             print('label_ranking_average_precision_score: %f' % label_ranking_average_precision_score)
162 |             print('label_ranking_loss: %f' % label_ranking_loss)
163 | 
164 |         return {
165 |             'class_eval': class_eval,
166 |             'recall_at_2': recall_at_2.to_dict(),
167 |             'precision_at_2': precision_at_k,  # note: holds precision at every k, not only k=2
168 |             'classes_dist': classes_dist,
169 |             'avg_dist': avg_dist,
170 |             'coverage_error': coverage_error,
171 |             'average_precision_score': average_precision_score,
172 |             'label_ranking_average_precision_score': label_ranking_average_precision_score,
173 |             'label_ranking_loss': label_ranking_loss
174 |         }
175 | 
176 | 
177 |     def predict_prob(cls, X_train, y_train, X_test, y_test, label_test, pipeline, binarizer, va_df, silent=False, target2=None):
178 |         """ Output class probability predictions, then run the probability evaluation """
179 |         pipeline.fit(X_train, y_train)
180 |         predictions = pipeline.predict_proba(X_test)
181 |         pred_to_mlb = [np.where(pipeline.classes_ == emo)[0][0] for emo in binarizer.classes_.tolist()]
182 |         return cls.evaluate_prob(predictions[:, pred_to_mlb], y_test, label_test, binarizer, va_df, silent, target2)
183 | 
184 | 
185 |     def plot_confusion_matrix(cls, cm, my_tags, title='Confusion matrix', cmap=plt.cm.Blues):
186 |         """ Plot the confusion matrix """
187 |         plt.rc('figure', figsize=(4, 4), dpi=100)
188 |         plt.imshow(cm, interpolation='nearest', cmap=cmap)
189 |         plt.title(title)
190 |         plt.colorbar()
191 |         tick_marks = np.arange(len(my_tags))
192 |         target_names = my_tags
193 |         plt.xticks(tick_marks, target_names, rotation=45)
194 |         plt.yticks(tick_marks, target_names)
195 | 
196 |         # write the normalized values inside the confusion matrix cells
197 |         fmt = '.2f'
198 |         thresh = cm.max() / 2.
199 |         for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
200 |             plt.text(j, i, format(cm[i, j], fmt), horizontalalignment="center", color="white" if cm[i, j] > thresh else "black")
201 | 
202 |         plt.tight_layout()
203 |         plt.ylabel('True label')
204 |         plt.xlabel('Predicted label')
205 | 
--------------------------------------------------------------------------------
/helpers/pickle_helpers.py:
--------------------------------------------------------------------------------
 1 | import pickle
 2 | 
 3 | '''
 4 | Store and load any Python object to and from a pickle file
 5 | '''
 6 | 
 7 | def convert_to_pickle(item, directory):
 8 |     '''
 9 |     Usage: serialize a Python object (e.g. a dictionary) to a pickle file
10 |     pickle_helpers.convert_to_pickle(cat_list,"data/liwc_pickle/liwc_cat.p")
11 |     '''
12 |     # a context manager ensures the file handle is flushed and closed
13 |     with open(directory, "wb") as f:
14 |         pickle.dump(item, f)
15 | 
16 | def load_from_pickle(directory):
17 |     '''
18 |     Usage: load a pickle file
19 |     pickle_helpers.load_from_pickle("data/liwc_pickle/liwc_cat.p")
20 |     '''
21 |     # a context manager ensures the file handle is closed after reading
22 |     with open(directory, "rb") as f:
23 |         return pickle.load(f)
--------------------------------------------------------------------------------
/img/autograd.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/omarsar/nlp_pytorch_tensorflow_notebooks/2fc77b0aa2b1a4be4558f14365c429cbde892586/img/autograd.jpg
--------------------------------------------------------------------------------
/img/dl_frameworks.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/omarsar/nlp_pytorch_tensorflow_notebooks/2fc77b0aa2b1a4be4558f14365c429cbde892586/img/dl_frameworks.png
--------------------------------------------------------------------------------
/img/emotion_classifier.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/omarsar/nlp_pytorch_tensorflow_notebooks/2fc77b0aa2b1a4be4558f14365c429cbde892586/img/emotion_classifier.png
--------------------------------------------------------------------------------
/img/gru-model.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/omarsar/nlp_pytorch_tensorflow_notebooks/2fc77b0aa2b1a4be4558f14365c429cbde892586/img/gru-model.png
--------------------------------------------------------------------------------
/img/tensor.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/omarsar/nlp_pytorch_tensorflow_notebooks/2fc77b0aa2b1a4be4558f14365c429cbde892586/img/tensor.png
--------------------------------------------------------------------------------
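
A brief usage note (not one of the repository files above): the sketch below shows how the pickle_helpers module might be called to persist and reload an arbitrary Python object. It assumes the snippet is run from the repository root so that `helpers` is importable; the dictionary and the output path are hypothetical placeholders.

    # minimal sketch; the object and the path below are hypothetical
    from helpers import pickle_helpers

    emotion_counts = {"joy": 120, "sadness": 95}  # hypothetical object to persist
    pickle_helpers.convert_to_pickle(emotion_counts, "data/emotion_counts.p")  # hypothetical path
    restored = pickle_helpers.load_from_pickle("data/emotion_counts.p")
    assert restored == emotion_counts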