├── .gitignore
├── abstract.md
├── bio.md
├── data
│   ├── bibbia_costituzione.txt
│   └── train.csv
├── imgs
│   ├── aimldl.png
│   ├── charseq.jpeg
│   ├── conv_op.gif
│   ├── convnet.png
│   ├── deep_learning.png
│   ├── evidence3.png
│   ├── features.png
│   ├── image1.png
│   ├── learning_representations.png
│   ├── lstm.png
│   ├── lstm_detail.jpeg
│   ├── machine_learning.png
│   ├── musk.jpg
│   ├── network1.png
│   ├── network2.png
│   ├── neural_network.png
│   ├── patch.png
│   ├── program_vs_ml.png
│   ├── rnn_types.jpeg
│   ├── separate.png
│   └── what_is_ai.png
├── picture.jpg
├── presentation.css
├── presentation.ipynb
├── private.md
├── readme.md
├── requirements.txt
└── utils.py
/.gitignore:
--------------------------------------------------------------------------------
1 | .ipynb_checkpoints/
2 | __pycache__
3 | models/*
4 | .DS_Store
5 | *.h5
6 | data/pos
7 |
--------------------------------------------------------------------------------
/abstract.md:
--------------------------------------------------------------------------------
1 | # Demystifying Artificial Intelligence and Neural Networks
2 |
3 |
4 | ## Abstract Eng
5 | Over the last few years we have read news headlines such as “No, Facebook Did Not Panic and Shut Down an AI Program That Was Getting Dangerously Smart”, "Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day" or "Researchers want a 'big red button' for shutting down a rogue artificial intelligence".
6 | Thus, the goal of this talk is to demystify the idea of Artificial Intelligence that we see in movies and news, as opposed to the one intended by the research community. After a short introduction, we will discuss, using running examples, the principal types of layers and architectures used in neural networks and when to use them. Finally, we will introduce the concept of "representation learning" and discuss why it is important when talking about neural networks.
7 |
8 |
9 | ## Abstract Ita
10 | Sono ormai diversi anni che leggiamo articoli come “due robot Facebook hanno dialogato in una lingua sconosciuta”, oppure "Tweet razzisti, Microsoft chiude il chatbot Tay", o dulcis in fundo "Google, un tasto rosso per evitare che l'intelligenza artificiale diventi consapevole".
11 | L'obiettivo di questo talk è demistificare il termine Intelligenza Artificiale, differenziandone la definizione fornita dalla comunità scientifica, rispetto a quella mostrata nei film e nei giornali. Dopo una breve introduzione sull'argomento analizzeremo, con degli esempi, le principali tipologie di "layer"/architetture utilizzate nelle reti neurali, quando e in che modo usarle. Infine, introdurremo il concetto generale di "representation learning" e discuteremo su come questo sia alla base di molti approcci usati per le reti neurali.
12 |
--------------------------------------------------------------------------------
/bio.md:
--------------------------------------------------------------------------------
1 |
2 | I have always had a passion for science and technology, and it drove me toward computer science studies.
3 | It was hard at the beginning, but once past the initial difficulties I literally fell in love with the subject. I completed my bachelor's and master's degrees in five years. In 2008, I received an award as the best graduate of the Faculty of Science at the University of Bari, and I finally decided to start a PhD in Machine Learning.
4 |
5 | During my PhD, I learned what doing research really means: a mix of studying the relevant literature, innovative thinking, creativity, passion, and hard work (about 70%). I greatly enjoyed collaborating with researchers around the world, and spending several months as a visiting researcher at the Department of Computer Science of the University of Illinois under the guidance of professor Jiawei Han. During that period, I conducted research on information networks with Tim Weninger and published in several top-tier journals and conferences. In May 2011 I defended my PhD thesis, advised by professor Donato Malerba. After obtaining my PhD, I spent a year at the Department of Biomedical Sciences and Human Oncology as a contract researcher, where I collaborated on the study of genes altered by tumors in the blood. In parallel, I worked as a research assistant at the University of Bari on Data Stream Mining applied to the study of the energy produced by photovoltaic plants. In 2013, I worked as a Data Scientist at the Mer.Mec. group and founded an innovative startup with the goal of applying Machine Learning algorithms to create maps for studying the quality of city neighborhoods. In 2014, I returned to the University as a researcher and worked on Web Mining and Big Data.
6 |
7 | In 2015, I started working as a Data Scientist in UniCredit Research and Development. I stayed in this group for almost three years, working on many interesting projects related to Natural Language Processing, Graph Mining, Big Data, data encryption, and high-frequency trading. It was a great experience in which I had the pleasure of collaborating with many interesting and skilled people.
8 |
9 | During 2017, I heard about the Team per la Trasformazione Digitale and started following it with growing interest. At the end of 2017 I saw an open position on its website for a Machine Learning expert; I sent my CV and, after a few interviews, I was hired. I am very honored to work for the Team per la Trasformazione Digitale under the guidance of Extraordinary Commissioner Diego Piacentini. I am sure I can learn a lot from his experience and from that of the team, and contribute to the innovation of my country.
10 |
11 | I like to describe myself as someone passionate about using data to solve problems, discover interesting patterns, and build descriptive and predictive models. I consider myself a communicative person who likes to build "value" through collaboration with colleagues and other stakeholders, and who can explain complex topics in words that are clear to everyone. What characterizes me most is intellectual curiosity and the desire to keep improving.
12 |
--------------------------------------------------------------------------------
/data/bibbia_costituzione.txt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/data/bibbia_costituzione.txt
--------------------------------------------------------------------------------
/imgs/aimldl.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/aimldl.png
--------------------------------------------------------------------------------
/imgs/charseq.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/charseq.jpeg
--------------------------------------------------------------------------------
/imgs/conv_op.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/conv_op.gif
--------------------------------------------------------------------------------
/imgs/convnet.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/convnet.png
--------------------------------------------------------------------------------
/imgs/deep_learning.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/deep_learning.png
--------------------------------------------------------------------------------
/imgs/evidence3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/evidence3.png
--------------------------------------------------------------------------------
/imgs/features.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/features.png
--------------------------------------------------------------------------------
/imgs/image1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/image1.png
--------------------------------------------------------------------------------
/imgs/learning_representations.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/learning_representations.png
--------------------------------------------------------------------------------
/imgs/lstm.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/lstm.png
--------------------------------------------------------------------------------
/imgs/lstm_detail.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/lstm_detail.jpeg
--------------------------------------------------------------------------------
/imgs/machine_learning.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/machine_learning.png
--------------------------------------------------------------------------------
/imgs/musk.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/musk.jpg
--------------------------------------------------------------------------------
/imgs/network1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/network1.png
--------------------------------------------------------------------------------
/imgs/network2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/network2.png
--------------------------------------------------------------------------------
/imgs/neural_network.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/neural_network.png
--------------------------------------------------------------------------------
/imgs/patch.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/patch.png
--------------------------------------------------------------------------------
/imgs/program_vs_ml.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/program_vs_ml.png
--------------------------------------------------------------------------------
/imgs/rnn_types.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/rnn_types.jpeg
--------------------------------------------------------------------------------
/imgs/separate.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/separate.png
--------------------------------------------------------------------------------
/imgs/what_is_ai.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/imgs/what_is_ai.png
--------------------------------------------------------------------------------
/picture.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fabiofumarola/demystifying-ai/35e1a7f6effdf107bb41f116f4d4d99334a7b443/picture.jpg
--------------------------------------------------------------------------------
/presentation.css:
--------------------------------------------------------------------------------
1 | div.foo {
2 | background-color: red;
3 | }
4 |
--------------------------------------------------------------------------------
/presentation.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {
6 | "slideshow": {
7 | "slide_type": "slide"
8 | }
9 | },
10 | "source": [
11 | "# Why Neural Networks?\n",
12 | "what you can do, and why everybody is so enthusiastic\n",
13 | "\n",
14 | "\n",
15 | "Fabio Fumarola"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {
21 | "slideshow": {
22 | "slide_type": "slide"
23 | }
24 | },
25 | "source": [
26 | "# Outline\n",
27 | "\n",
28 | "- Why there is so much confusion about Artificial Intelligence?\n",
29 | "- Artificial Intelligence, Machine Learning and Deep Learning\n",
30 | "- Neural Networks architectures:\n",
31 | " - Linear \n",
32 | " - Convolutional\n",
33 | " - Recurrent\n",
34 | "- Conclusion"
35 | ]
36 | },
37 | {
38 | "cell_type": "markdown",
39 | "metadata": {
40 | "slideshow": {
41 | "slide_type": "slide"
42 | }
43 | },
44 | "source": [
45 | "# Evidence 1: Work\n",
46 | "\n",
47 | " "
48 | ]
49 | },
50 | {
51 | "cell_type": "markdown",
52 | "metadata": {
53 | "slideshow": {
54 | "slide_type": "slide"
55 | }
56 | },
57 | "source": [
58 | "# Evidence 2: Elon Musk Warns\n",
59 | "\n",
60 | "### \"With AI ... We're summoning the Demon\"\n",
61 | "\n",
62 | " \n"
63 | ]
64 | },
65 | {
66 | "cell_type": "markdown",
67 | "metadata": {
68 | "slideshow": {
69 | "slide_type": "slide"
70 | }
71 | },
72 | "source": [
73 | "# Evidence 3: Newspapers\n",
74 | "\n",
75 | " "
76 | ]
77 | },
78 | {
79 | "cell_type": "markdown",
80 | "metadata": {
81 | "slideshow": {
82 | "slide_type": "slide"
83 | }
84 | },
85 | "source": [
86 | "# Goal of this presentation\n",
87 | "\n",
88 | " "
89 | ]
90 | },
91 | {
92 | "cell_type": "markdown",
93 | "metadata": {
94 | "slideshow": {
95 | "slide_type": "notes"
96 | }
97 | },
98 | "source": [
99 | "- let's be clear about what AI can and cannot do"
100 | ]
101 | },
102 | {
103 | "cell_type": "markdown",
104 | "metadata": {
105 | "slideshow": {
106 | "slide_type": "slide"
107 | }
108 | },
109 | "source": [
110 | "# Terminator\n",
111 | "\n",
112 | "*\"If you're worried about the Terminator, just keep the door closed\"* - Rodney Brooks\n",
113 | "\n",
114 | " "
115 | ]
116 | },
117 | {
118 | "cell_type": "markdown",
119 | "metadata": {
120 | "slideshow": {
121 | "slide_type": "skip"
122 | }
123 | },
124 | "source": [
125 | "# Terminator Thief???\n",
126 | "\n",
127 | " "
128 | ]
129 | },
130 | {
131 | "cell_type": "markdown",
132 | "metadata": {
133 | "slideshow": {
134 | "slide_type": "slide"
135 | }
136 | },
137 | "source": [
138 | "# Artificial Intelligence, Machine Learning and Deep Learning\n",
139 | "\n",
140 | "\n",
141 | " \n",
142 | " \n",
143 | " \n",
144 | " \n",
145 | ""
146 | ]
147 | },
148 | {
149 | "cell_type": "markdown",
150 | "metadata": {
151 | "slideshow": {
152 | "slide_type": "slide"
153 | }
154 | },
155 | "source": [
156 | "# How we define Artificial Intelligence\n",
157 | "\n",
158 | "- Theory: AI is machines doing what humans do **(AGI)**\n",
159 | " - Gave rise to a lot of Hollywood scripts\n",
160 | " - AI as noun: \"the AI\" took over the world\n",
161 | " \n",
162 | "- AI in practice **(ANI)**:\n",
163 | " - **N**arrow tasks that once required human intelligence (speech recognition, playing Go, driving)"
164 | ]
165 | },
166 | {
167 | "cell_type": "markdown",
168 | "metadata": {
169 | "slideshow": {
170 | "slide_type": "slide"
171 | }
172 | },
173 | "source": [
174 | "# Example of Artificial Narrow Intelligence: AlphaGo\n",
175 | "\n",
176 | "- Go is a game with a board, discrete states, and an evaluation function\n",
177 | "- **DeepMind**'s AlphaGo beat the best human player\n",
178 | "- AlphaGo can keep playing even if the room is on fire (it is not aware of the real world)"
179 | ]
180 | },
181 | {
182 | "cell_type": "markdown",
183 | "metadata": {
184 | "slideshow": {
185 | "slide_type": "slide"
186 | }
187 | },
188 | "source": [
189 | "# Machine Learning: AI Success\n",
190 | "\n",
191 | " "
192 | ]
193 | },
194 | {
195 | "cell_type": "markdown",
196 | "metadata": {
197 | "slideshow": {
198 | "slide_type": "slide"
199 | }
200 | },
201 | "source": [
202 | "# Learning Representations from data\n",
203 | "\n",
204 | "- Given: **1. input data points**, **2. labels for the input points** and **3. a way to measure whether the algorithm is doing a good job**\n",
205 | "- the algorithm learns **meaningful representations** by searching the **Hypothesis Search Space**\n",
206 | "\n",
207 | " "
208 | ]
209 | },
210 | {
211 | "cell_type": "markdown",
212 | "metadata": {
213 | "slideshow": {
214 | "slide_type": "slide"
215 | }
216 | },
217 | "source": [
218 | "# Deep Learning and Neural Networks\n",
219 | "\n",
220 | "- The “deep” isn’t a reference to any deeper understanding,\n",
221 | "- but to the idea of learning successive layers of representations.\n",
222 | "- This is the key idea behind neural networks\n",
223 | "\n",
224 | " \n",
225 | "\n",
226 | "\n",
227 | "\n"
228 | ]
229 | },
230 | {
231 | "cell_type": "markdown",
232 | "metadata": {
233 | "slideshow": {
234 | "slide_type": "slide"
235 | }
236 | },
237 | "source": [
238 | "# Deep Learning and Neural Networks\n",
239 | "\n",
240 | "*layered representation learning* and *hierarchical representation learning*\n",
241 | "\n",
242 | " "
243 | ]
244 | },
245 | {
246 | "cell_type": "markdown",
247 | "metadata": {
248 | "slideshow": {
249 | "slide_type": "slide"
250 | }
251 | },
252 | "source": [
253 | "# Neural Network Workflow\n",
254 | "\n",
255 | " "
256 | ]
257 | },
258 | {
259 | "cell_type": "markdown",
260 | "metadata": {
261 | "slideshow": {
262 | "slide_type": "slide"
263 | }
264 | },
265 | "source": [
266 | "# First Example: MNIST\n"
267 | ]
268 | },
269 | {
270 | "cell_type": "code",
271 | "execution_count": 26,
272 | "metadata": {
273 | "slideshow": {
274 | "slide_type": "fragment"
275 | }
276 | },
277 | "outputs": [
278 | {
279 | "name": "stdout",
280 | "output_type": "stream",
281 | "text": [
282 | "shape training images (60000, 28, 28)\n",
283 | "shape test images (10000, 28, 28)\n"
284 | ]
285 | }
286 | ],
287 | "source": [
288 | "import numpy as np\n",
289 | "import tensorflow as tf\n",
290 | "import matplotlib.pyplot as plt\n",
291 | "import utils\n",
292 | "\n",
293 | "tf.logging.set_verbosity(tf.logging.INFO)\n",
294 | "\n",
295 | "(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.mnist.load_data()\n",
296 | "print('shape training images ', train_images.shape)\n",
297 | "print('shape test images ', test_images.shape)\n",
298 | "\n",
299 | "num_classes = 10"
300 | ]
301 | },
302 | {
303 | "cell_type": "code",
304 | "execution_count": 2,
305 | "metadata": {
306 | "slideshow": {
307 | "slide_type": "subslide"
308 | }
309 | },
310 | "outputs": [
311 | {
312 | "name": "stdout",
313 | "output_type": "stream",
314 | "text": [
315 | "displaying image 46598 with class 1\n"
316 | ]
317 | },
318 | {
319 | "data": {
320 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAADHFJREFUeJzt3X/oXfV9x/HnW9f6R9oEs7IQbGJqCYOiLJ1BAvNHR2dxUhL7jzRIyZguVaKuMnDq/pgwBjLWDv+qJBiajs52oMGgY20Mumw4izG4+GutmXxrE2IytVIrQqZ574/vyfatfu+5X++vc795Px9w+d573vfc8+aSVz7n3HPu/URmIqmes7puQFI3DL9UlOGXijL8UlGGXyrK8EtFGX6pKMMvFWX4paJ+Y5IbiwgvJ5TGLDNjIc8bauSPiKsi4icRcTgi7hjmtSRNVgx6bX9EnA38FLgSOAI8DWzOzBdb1nHkl8ZsEiP/JcDhzHwlM08C3wc2DfF6kiZomPCfB/x8zuMjzbJfExFbI+JARBwYYluSRmzsH/hl5nZgO7jbL02TYUb+o8CqOY8/3SyTtAgME/6ngbUR8ZmI+DjwVWDPaNqSNG4D7/Zn5nsRcTPwQ+BsYGdmvjCyziSN1cCn+gbamMf80thN5CIfSYuX4ZeKMvxSUYZfKsrwS0UZfqkowy8VZfilogy/VJThl4oy/FJRhl8qyvBLRRl+qSjDLxVl+KWiDL9UlOGXijL8UlGGXyrK8EtFTXSKbmmuO++8s7V+0003tdZXr149ynbKceSXijL8UlGGXyrK8EtFGX6pKMMvFWX4paKGmqU3ImaAt4H3gfcyc32f5ztLbzGrVq3qWdu7d2/ruhdccEFrfdu2ba31HTt2tNbPVAudpXcUF/n8fma+PoLXkTRB7vZLRQ0b/gR+FBHPRMTWUTQkaTKG3e2/NDOPRsRvAXsj4j8zc//cJzT/KfgfgzRlhhr5M/No8/cEsBu4ZJ7nbM/M9f0+DJQ0WQOHPyKWRMQnT98HvgQ8P6rGJI3XMLv9K4DdEXH6df4hM/95JF1JGruBw5+ZrwC/M8JedAbasGFDz9ratWuHeu0lS5YMtX51nuqTijL8UlGGXyrK8EtFGX6pKMMvFeVPd2tqvfHGG631Rx55ZEKdnJkc+aWiDL9UlOGXijL8UlGGXyrK8EtFGX6pKM/za6w2btw48Lrvvvtua/3w4cMDv7Yc+aWyDL9UlOGXijL8UlGGXyrK8EtFGX6pKM/zayi33XZba/26667rWWvmfOhp//79rXUNx5FfKsrwS0UZfqkowy8VZfilogy/VJThl4qKzGx/QsRO4MvAicy8sFm2HPgBsAaYAa7NzF/03VhE+8Y0ddatW9daf/zxx1vrS5cu7Vnr97v8/X4L4KmnnmqtV5WZ7RdQNBYy8n8HuOoDy+4A9mXmWmBf81jSItI3/Jm5H3jzA4s3Abua+7uAa0bcl6QxG/SYf0VmHmvuvwasGFE/kiZk6Gv7MzPbjuUjYiuwddjtSBqtQUf+4xGxEqD5e6LXEzNze2auz8z1A25L0hgMGv49wJbm/hbg4dG0I2lS+oY/Ih4A/h347Yg4EhHXA/cAV0bEy8AfNI8lLSJ9j/kzc3OP0hdH3Is6cM4557TWb7/99tb6smXLWutt15Hs3bu3dV3P44+XV/hJRRl+qSjDLxVl+KWiDL9UlOGXiur7ld6Rbsyv9E6dVatWtdZnZmZa6/1+fvudd97pWbviiita1z148GBrXfMb5Vd6JZ2BDL9UlOGXijL8UlGGXyrK8EtFGX6pKKfoLm7Tpk1Drd/vPP+jjz7as+Z5/G458ktFGX6pKMMvFWX4paIMv1SU4ZeKMvxSUZ7nP8Nt3tzrl9dn3XDDDa31s85qHx/6nau/8cYbW+vqjiO/VJThl4oy/FJRhl8qyvBLRRl+qSjDLxXV9zx/ROwEvgyc
yMwLm2V3A38C/HfztLsy85/G1aQGd8stt7TWL7rootb6qVOnWutPPPFEa/2tt95qras7Cxn5vwNcNc/yv8vMdc3N4EuLTN/wZ+Z+4M0J9CJpgoY55r85Ig5FxM6IOHdkHUmaiEHD/23gs8A64BjwzV5PjIitEXEgIg4MuC1JYzBQ+DPzeGa+n5mngB3AJS3P3Z6Z6zNz/aBNShq9gcIfESvnPPwK8Pxo2pE0KQs51fcA8AXgUxFxBPhL4AsRsQ5IYAb4+hh7lDQGfcOfmfN9Ifz+MfSiRejVV1/tugUNyCv8pKIMv1SU4ZeKMvxSUYZfKsrwS0X5091ngA0bNvSsrV27dqjX3rVrV2v93nvvHer11R1Hfqkowy8VZfilogy/VJThl4oy/FJRhl8qyvP8Z4Bt27b1rC1fvnyo1z506NBQ62t6OfJLRRl+qSjDLxVl+KWiDL9UlOGXijL8UlGRmZPbWMTkNnYGufjii1vrjz32WM/asmXLWtc9cuRIa3316tWtdU2fzIyFPM+RXyrK8EtFGX6pKMMvFWX4paIMv1SU4ZeK6vt9/ohYBXwXWAEksD0z742I5cAPgDXADHBtZv5ifK3Wdfnll7fWly5d2rPW7zqOSV7noemykJH/PeDPMvNzwAZgW0R8DrgD2JeZa4F9zWNJi0Tf8Gfmscw82Nx/G3gJOA/YBJyezmUXcM24mpQ0eh/pmD8i1gCfB34MrMjMY03pNWYPCyQtEgv+Db+I+ATwIPCNzPxlxP9fPpyZ2eu6/YjYCmwdtlFJo7WgkT8iPsZs8L+XmQ81i49HxMqmvhI4Md+6mbk9M9dn5vpRNCxpNPqGP2aH+PuBlzLzW3NKe4Atzf0twMOjb0/SuCxkt//3gK8Bz0XEs82yu4B7gH+MiOuBnwHXjqdFnX/++V23oDNQ3/Bn5r8Bvb4f/MXRtiNpUrzCTyrK8EtFGX6pKMMvFWX4paIMv1SUU3QvArfeemtrve1ruSdPnmxd97777huoJy1+jvxSUYZfKsrwS0UZfqkowy8VZfilogy/VJRTdC8Cu3fvbq1v3LixZ+3JJ59sXfeyyy4bqCdNL6foltTK8EtFGX6pKMMvFWX4paIMv1SU4ZeK8jy/dIbxPL+kVoZfKsrwS0UZfqkowy8VZfilogy/VFTf8EfEqoh4PCJejIgXIuJPm+V3R8TRiHi2uV09/nYljUrfi3wiYiWwMjMPRsQngWeAa4BrgV9l5t8ueGNe5CON3UIv8uk7Y09mHgOONfffjoiXgPOGa09S1z7SMX9ErAE+D/y4WXRzRByKiJ0RcW6PdbZGxIGIODBUp5JGasHX9kfEJ4B/Af46Mx+KiBXA60ACf8XsocEf93kNd/ulMVvobv+Cwh8RHwMeAX6Ymd+ap74GeCQzL+zzOoZfGrORfbEnIgK4H3hpbvCbDwJP+wrw/EdtUlJ3FvJp/6XAvwLPAaeaxXcBm4F1zO72zwBfbz4cbHstR35pzEa62z8qhl8aP7/PL6mV4ZeKMvxSUYZfKsrwS0UZfqkowy8VZfilogy/VJThl4oy/FJRhl8qyvBLRRl+qai+P+A5Yq8DP5vz+FPNsmk0rb1Na19gb4MaZW/nL/SJE/0+/4c2HnEgM9d31kCLae1tWvsCextUV7252y8VZfiloroO//aOt99mWnub1r7A3gbVSW+dHvNL6k7XI7+kjnQS/oi4KiJ+EhGHI+KOLnroJSJmIuK5ZubhTqcYa6ZBOxERz89Ztjwi9kbEy83feadJ66i3qZi5uWVm6U7fu2mb8Xriu/0RcTbwU+BK4AjwNLA5M1+caCM9RMQMsD4zOz8nHBGXA78Cvnt6NqSI+Bvgzcy8p/mP89zM/PMp6e1uPuLMzWPqrdfM0n9Eh+/dKGe8HoUuRv5LgMOZ+UpmngS+D2zqoI+pl5n7gTc/sHgTsKu5v4vZfzwT16O3qZCZxzLzYHP/beD0zNKdvnctfXWii/CfB/x8zuMjTNeU3wn8
KCKeiYitXTczjxVzZkZ6DVjRZTPz6Dtz8yR9YGbpqXnvBpnxetT8wO/DLs3M3wX+ENjW7N5OpZw9Zpum0zXfBj7L7DRux4BvdtlMM7P0g8A3MvOXc2tdvnfz9NXJ+9ZF+I8Cq+Y8/nSzbCpk5tHm7wlgN7OHKdPk+OlJUpu/Jzru5/9k5vHMfD8zTwE76PC9a2aWfhD4XmY+1Czu/L2br6+u3rcuwv80sDYiPhMRHwe+CuzpoI8PiYglzQcxRMQS4EtM3+zDe4Atzf0twMMd9vJrpmXm5l4zS9Pxezd1M15n5sRvwNXMfuL/X8BfdNFDj74uAP6jub3QdW/AA8zuBv4Ps5+NXA/8JrAPeBl4DFg+Rb39PbOzOR9iNmgrO+rtUmZ36Q8Bzza3q7t+71r66uR98wo/qSg/8JOKMvxSUYZfKsrwS0UZfqkowy8VZfilogy/VNT/AoAkAK53PIMKAAAAAElFTkSuQmCC\n",
321 | "text/plain": [
322 | ""
323 | ]
324 | },
325 | "metadata": {
326 | "needs_background": "light"
327 | },
328 | "output_type": "display_data"
329 | }
330 | ],
331 | "source": [
332 | "i = np.random.randint(0, len(train_images))\n",
333 | "print('displaying image {} with class {}'.format(i, train_labels[i]))\n",
334 | "plt.imshow(train_images[i], cmap='gray');"
335 | ]
336 | },
337 | {
338 | "cell_type": "markdown",
339 | "metadata": {
340 | "slideshow": {
341 | "slide_type": "subslide"
342 | }
343 | },
344 | "source": [
345 | "# Define Inputs"
346 | ]
347 | },
348 | {
349 | "cell_type": "code",
350 | "execution_count": 3,
351 | "metadata": {
352 | "slideshow": {
353 | "slide_type": "fragment"
354 | }
355 | },
356 | "outputs": [],
357 | "source": [
358 | "g = tf.Graph()\n",
359 | "\n",
360 | "flatten_shape = np.prod(train_images.shape[1:])\n",
361 | "\n",
362 | "with g.as_default():\n",
363 | " X = tf.placeholder(tf.float32, [None, flatten_shape], name='X')\n",
364 | " y = tf.placeholder(tf.float32, [None, 10], name='y')\n",
365 | " \n",
366 | "#utils.show_graph(g)"
367 | ]
368 | },
369 | {
370 | "cell_type": "markdown",
371 | "metadata": {
372 | "slideshow": {
373 | "slide_type": "subslide"
374 | }
375 | },
376 | "source": [
377 | "# Define the Dense Layer"
378 | ]
379 | },
380 | {
381 | "cell_type": "code",
382 | "execution_count": 4,
383 | "metadata": {
384 | "slideshow": {
385 | "slide_type": "fragment"
386 | }
387 | },
388 | "outputs": [],
389 | "source": [
390 | "def dense(input_tensor, output_units, scope, activation=None):\n",
391 | " with tf.name_scope(scope):\n",
392 | " #shape of the weights matrix\n",
393 | " shape = (input_tensor.shape.as_list()[1], output_units)\n",
394 | " # weights matrix\n",
395 | " weights = tf.Variable(tf.truncated_normal(shape=shape, dtype=tf.float32), name='W')\n",
396 | " # bias vector (**broadcast**)\n",
397 | " bias = tf.Variable(tf.zeros(shape=[output_units], dtype=tf.float32), name='b')\n",
398 | " # define the layers as W * x + b\n",
399 | " layer = tf.add(tf.matmul(input_tensor, weights), bias)\n",
400 | " # add the squashing function (non linearity)\n",
401 | " if activation is not None:\n",
402 | " return activation(layer)\n",
403 | " else:\n",
404 | " return layer"
405 | ]
406 | },
407 | {
408 | "cell_type": "markdown",
409 | "metadata": {
410 | "slideshow": {
411 | "slide_type": "subslide"
412 | }
413 | },
414 | "source": [
415 | "# Let us build the Model"
416 | ]
417 | },
418 | {
419 | "cell_type": "code",
420 | "execution_count": 5,
421 | "metadata": {
422 | "scrolled": false,
423 | "slideshow": {
424 | "slide_type": "fragment"
425 | }
426 | },
427 | "outputs": [
428 | {
429 | "name": "stdout",
430 | "output_type": "stream",
431 | "text": [
432 | "WARNING:tensorflow:From /Users/fumarolaf/miniconda3/envs/ai/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.\n",
433 | "Instructions for updating:\n",
434 | "Colocations handled automatically by placer.\n"
435 | ]
436 | }
437 | ],
438 | "source": [
439 | "with g.as_default():\n",
440 | " # define the model\n",
441 | " l1 = dense(X, 32, 'h1', activation=tf.nn.sigmoid)\n",
442 | " l2 = dense(l1, 64, 'h2', activation=tf.nn.relu)\n",
443 | " logits = dense(l2, 10, 'out', activation=None)\n",
444 | "\n",
445 | " # define the loss function\n",
446 | " loss_op = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(\n",
447 | " logits=logits, labels=y))\n",
448 | "\n",
449 | " # define the optimizer\n",
450 | " optimizer = tf.train.RMSPropOptimizer(learning_rate=0.01)\n",
451 | "\n",
452 | " # train operation\n",
453 | " train_op = optimizer.minimize(loss_op)\n",
454 | " \n",
455 | " #metrics\n",
456 | " correct_pred = tf.equal(tf.argmax(logits, axis=1), tf.argmax(y,1))\n",
457 | " accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32))\n"
458 | ]
459 | },
460 | {
461 | "cell_type": "code",
462 | "execution_count": 6,
463 | "metadata": {
464 | "slideshow": {
465 | "slide_type": "subslide"
466 | }
467 | },
468 | "outputs": [
469 | {
470 | "data": {
471 | "text/html": [
472 | "\n",
473 | " \n",
484 | " "
485 | ],
486 | "text/plain": [
487 | ""
488 | ]
489 | },
490 | "metadata": {},
491 | "output_type": "display_data"
492 | }
493 | ],
494 | "source": [
495 | "utils.show_graph(g)"
496 | ]
497 | },
498 | {
499 | "cell_type": "markdown",
500 | "metadata": {
501 | "slideshow": {
502 | "slide_type": "subslide"
503 | }
504 | },
505 | "source": [
506 | "# Train the Model"
507 | ]
508 | },
509 | {
510 | "cell_type": "markdown",
511 | "metadata": {
512 | "slideshow": {
513 | "slide_type": "subslide"
514 | }
515 | },
516 | "source": [
517 | "## Data Preprocessing"
518 | ]
519 | },
520 | {
521 | "cell_type": "code",
522 | "execution_count": 7,
523 | "metadata": {
524 | "slideshow": {
525 | "slide_type": "fragment"
526 | }
527 | },
528 | "outputs": [
529 | {
530 | "name": "stdout",
531 | "output_type": "stream",
532 | "text": [
533 | "shape train images (60000, 784)\n",
534 | "shape train labels with one hot encoding (60000, 10)\n"
535 | ]
536 | }
537 | ],
538 | "source": [
539 | "epochs = 5\n",
540 | "batch_size = 128\n",
541 | "val_step = int(len(train_images) * 0.9)\n",
542 | "\n",
543 | "train_images = train_images.astype('float32') / 255\n",
544 | "test_images = test_images.astype('float32') / 255\n",
545 | "\n",
546 | "#reshape the images\n",
547 | "train_images_reshaped = np.reshape(train_images, (len(train_images), -1))\n",
548 | "test_images_reshaped = np.reshape(test_images, [len(test_images), -1])\n",
549 | "\n",
550 | "print('shape train images {}'.format(train_images_reshaped.shape))\n",
551 | "\n",
552 | "\n",
553 | "def one_hot(labels, num_classes):\n",
554 | " results = np.zeros(shape=(len(labels), num_classes), dtype=np.float32)\n",
555 | " for i, values in enumerate(labels):\n",
556 | " results[i,values] = 1.\n",
557 | " return results\n",
558 | "\n",
559 | "train_labels_one_hot = one_hot(train_labels, 10)\n",
560 | "test_labels_one_hot = one_hot(test_labels, 10)\n",
561 | "print('shape train labels with one hot encoding {}'.format(train_labels_one_hot.shape))"
562 | ]
563 | },
564 | {
565 | "cell_type": "markdown",
566 | "metadata": {
567 | "slideshow": {
568 | "slide_type": "subslide"
569 | }
570 | },
571 | "source": [
572 | "## Train, Val and mini-batches"
573 | ]
574 | },
575 | {
576 | "cell_type": "code",
577 | "execution_count": 8,
578 | "metadata": {
579 | "slideshow": {
580 | "slide_type": "fragment"
581 | }
582 | },
583 | "outputs": [],
584 | "source": [
585 | "train_x, val_x = train_images_reshaped[:val_step], train_images_reshaped[val_step:]\n",
586 | "train_y, val_y = train_labels_one_hot[:val_step], train_labels_one_hot[val_step:]\n",
587 | "\n",
588 | "num_batches = len(train_x) // batch_size\n",
589 | "\n",
590 | "\n",
591 |     "def to_batch(x, y, batch_size, shuffle=True):\n",
592 |     "    idxs = np.arange(len(x))\n",
593 |     "    if shuffle:\n",
594 |     "        np.random.shuffle(idxs)\n",
595 |     "    x, y = x[idxs], y[idxs]\n",
596 |     "    num_batches = len(x) // batch_size\n",
597 |     "    for i in range(num_batches):\n",
598 |     "        start = i * batch_size\n",
599 |     "        yield x[start:start + batch_size], y[start:start + batch_size]"
604 | ]
605 | },
606 | {
607 | "cell_type": "markdown",
608 | "metadata": {
609 | "slideshow": {
610 | "slide_type": "subslide"
611 | }
612 | },
613 | "source": [
614 | "## Train the Model"
615 | ]
616 | },
617 | {
618 | "cell_type": "code",
619 | "execution_count": 9,
620 | "metadata": {
621 | "scrolled": false,
622 | "slideshow": {
623 | "slide_type": "fragment"
624 | }
625 | },
626 | "outputs": [
627 | {
628 | "name": "stdout",
629 | "output_type": "stream",
630 | "text": [
631 | "Epochs 1, val_loss=0.255, val_acc=0.925\n",
632 | "Epochs 2, val_loss=0.185, val_acc=0.947\n",
633 | "Epochs 3, val_loss=0.167, val_acc=0.949\n",
634 | "Epochs 4, val_loss=0.156, val_acc=0.954\n",
635 | "Epochs 5, val_loss=0.145, val_acc=0.956\n",
636 | "training finished\n",
637 | "Testing accuracy 0.9490000009536743\n",
638 | "model saved in path /tmp/model.ckpt\n"
639 | ]
640 | }
641 | ],
642 | "source": [
643 |     "with tf.Session(graph=g) as sess:\n",
644 |     "    # initialize variables (i.e. assign their initial values)\n",
645 |     "    sess.run(tf.global_variables_initializer())\n",
646 |     "\n",
647 |     "    for e in range(1, epochs + 1):\n",
648 |     "        gen = to_batch(train_x, train_y, batch_size)\n",
649 |     "        for _ in range(num_batches):\n",
650 |     "            x_batch, y_batch = next(gen)\n",
651 |     "            sess.run(train_op, feed_dict={X: x_batch, y: y_batch})\n",
652 |     "        loss, acc = sess.run([loss_op, accuracy], feed_dict={X: val_x, y: val_y})\n",
653 |     "        print(\"Epochs {}, val_loss={:.3f}, val_acc={:.3f}\".format(e, loss, acc))\n",
654 |     "\n",
655 |     "    print('training finished')\n",
656 |     "\n",
657 |     "    test_acc = sess.run(accuracy, feed_dict={X: test_images_reshaped, y: test_labels_one_hot})\n",
658 |     "    print('Testing accuracy {}'.format(test_acc))\n",
659 |     "\n",
660 |     "    saver = tf.train.Saver()\n",
661 |     "    save_path = saver.save(sess, '/tmp/model.ckpt')\n",
662 |     "    print('model saved in path {}'.format(save_path))"
663 | ]
664 | },
665 | {
666 | "cell_type": "markdown",
667 | "metadata": {
668 | "slideshow": {
669 | "slide_type": "subslide"
670 | }
671 | },
672 | "source": [
673 | "## Use the Model"
674 | ]
675 | },
676 | {
677 | "cell_type": "code",
678 | "execution_count": 11,
679 | "metadata": {
680 | "scrolled": false,
681 | "slideshow": {
682 | "slide_type": "fragment"
683 | }
684 | },
685 | "outputs": [
686 | {
687 | "name": "stdout",
688 | "output_type": "stream",
689 | "text": [
690 | "INFO:tensorflow:Restoring parameters from /tmp/model.ckpt\n",
691 | "class predicted 9, expected 9\n"
692 | ]
693 | },
694 | {
695 | "data": {
696 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAADY1JREFUeJzt3V+MFfUZxvHnrbQXIibQRrJRIlahpjFGcANNSgyN1Vhogkui4kVDE2C9qIkmklRpTLnTNCJpvCAuioXGQpto46KmrSVE27Ua+WMVtUWrS1yygoIJq15Q3LcXO5gt7vzmeM7MmbO830+y2XPmPXPmZdhnZ87On5+5uwDE87W6GwBQD8IPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiCoKe1cmJlxOiFQMXe3Rl7X0pbfzG4ws3+b2Ttmdncr7wWgvazZc/vN7BxJByVdJ2lI0iuSbnX3NxPzsOUHKtaOLf8CSe+4+7vuflLSDknLWng/AG3USvgvlPT+uOdD2bT/Y2a9ZrbHzPa0sCwAJav8D37u3iepT2K3H+gkrWz5D0uaNe75Rdk0AJNAK+F/RdIcM7vEzL4haYWk/nLaAlC1pnf73f2Umd0u6c+SzpG0xd3fKK0zAJVq+lBfUwvjMz9Qubac5ANg8iL8QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gqKaH6JYkMxuUNCLpc0mn3L27jKYAVK+l8Gd+4O4flfA+ANqI3X4gqFbD75L+YmZ7zay3jIYAtEeru/2L3P2wmV0g6Tkz+5e7vzD+BdkvBX4xAB3G3L2cNzJbL+kTd38g8ZpyFgYgl7tbI69rerffzKaa2bTTjyVdL+lAs+8HoL1a2e2fKemPZnb6fX7n7n8qpSsAlSttt7+hhQXd7Z8yJf07tqurK1lfsWJFsr5kyZLc2uLFi5Pzjo6OJutFHnnkkWR97dq1ubWRkZGWlo2JVb7bD2ByI/xAUIQfCIrwA0ERfiAowg8ExaG+NliwYEGy/uKLL1a27Ow8jFxV//+n/m09PT3JeY8dO1Z2OyFwqA9AEuEHgiL8QFCEHwiK8ANBEX4gKMIPBMVx/jbo7+9P1lOX5Dbi4MGDubWHHnooOe/+/fuT9VtuuSVZX716dbJ+7rnn5taef/755Lw33XRTss55ABPjOD+AJMIPBEX4gaAIPxAU4QeCIvxAUIQfCKqMUXrD27RpU7K+dOnSZL3oXIvt27cn6/fcc09ubWhoKDlvkZdeeilZP378eLK+fv363No111yTnHfOnDnJOsf5W8OWHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCKrye38y2SPqxpKPufkU2bYak30uaLWlQ0s3u/nHhwibx9fyLFi3KrT3zzDPJec8///xk/b777kvW161bl6x3shMnTuTWpk6dmpy3aL0uX748WT916lSyfrYq83r+30i64Yxpd0va5e5zJO3KngOYRArD7+4vSDrzNK5lkrZmj7dKurHkvgBUrNnP/DPdfTh7/IGkmSX1A6BNWj6339099VnezHol9ba6HADlanbLf8TMuiQp+34074Xu3ufu3e7e3eSyAFSg2fD3S1qZPV4p6aly2gHQLoXhN7Ptkv4h6TtmNmRmqyTdL+k6M3tb0g+z5wAmEe7b36CBgYHc2sKFC5PzFl3zPn/+/GS91Wvy67Rhw4bc2h133NHSe8+ePTtZn8zrrRXctx9AEuEHgiL8QFCEHwiK8ANBEX4gKG7d3QaPPfZYsh71kFSr7r333mR9586dubWnn3667HYmHbb8QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAUl/Q2qJVLeqdPn56sj4yMNNXTZHD55Zfn1g4c
OFDpskdHR3NrRbcF7+npKbudtuGSXgBJhB8IivADQRF+ICjCDwRF+IGgCD8QFNfzN8gs/9Bpqiad3cfxi3z66ae5taL11qopU/J/vC+44IJKlz0ZsOUHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaAKj/Ob2RZJP5Z01N2vyKatl7RG0ofZy9a5+7NVNdkJUvc9KLonwtVXX52s7927t6meyjB37txkfcaMGcn6mjVrkvXU8fSq7yVx8uTJ3Nq2bdsqXfZk0MiW/zeSbphg+kZ3vyr7OquDD5yNCsPv7i9IOt6GXgC0USuf+W83s9fMbIuZpe9TBaDjNBv+TZIulXSVpGFJG/JeaGa9ZrbHzPY0uSwAFWgq/O5+xN0/d/dRSZslLUi8ts/du929u9kmAZSvqfCbWde4pz2Sqr0NK4DSNXKob7ukxZK+ZWZDkn4pabGZXSXJJQ1Kuq3CHgFUgPv2Nyh1H/eiY8ap68olad++fU311Iiia+Yvu+yyZL3oOH8ry6/6Z+/w4cO5tYsvvrjSZdeJ+/YDSCL8QFCEHwiK8ANBEX4gKMIPBMWhvhIsXbo0We/v70/Wq/w/KDrUV/X///79+3Nr8+bNq3TZy5Yty60VDdE9mXGoD0AS4QeCIvxAUIQfCIrwA0ERfiAowg8ExXH+Npg2bVqyvnr16mR9/vz5TdeLLhcuqu/YsSNZHx4eTtZnzZqVWxscHEzOW2Tjxo3J+tq1a1t6/8mK4/wAkgg/EBThB4Ii/EBQhB8IivADQRF+ICiO86NSqeP87733XnLeY8eOJevLly9P1gcGBpL1sxXH+QEkEX4gKMIPBEX4gaAIPxAU4QeCIvxAUOmxoyWZ2SxJ2yTNlOSS+tz912Y2Q9LvJc2WNCjpZnf/uLpW0Ynmzp2brLdyTf3DDz+crEc9jl+WRrb8pyTd5e7flfQ9ST8zs+9KulvSLnefI2lX9hzAJFEYfncfdvd92eMRSW9JulDSMklbs5dtlXRjVU0CKN9X+sxvZrMlzZP0sqSZ7n76Hk4faOxjAYBJovAz/2lmdp6kJyTd6e4nxo8B5+6ed96+mfVK6m21UQDlamjLb2Zf11jwH3f3J7PJR8ysK6t3STo60bzu3ufu3e7eXUbDAMpRGH4b28Q/Kuktd39wXKlf0srs8UpJT5XfHoCqNLLb/31JP5H0upm9mk1bJ+l+SX8ws1WSDkm6uZoW0cmKhidftWpVbq3ocvJnn322qZ7QmMLwu/vfJeVdH3xtue0AaBfO8AOCIvxAUIQfCIrwA0ERfiAowg8E1fDpvUAzxp8GfqZDhw4l5y2qozVs+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKI7zoyVF1/OnrtnfvHlzct7h4eFkHa1hyw8ERfiBoAg/EBThB4Ii/EBQhB8IivADQXGcH0nXXpu+O/uVV16ZrH/22We5td27dzfVE8rBlh8IivADQRF+ICjCDwRF+IGgCD8QFOEHgrKiMdLNbJakbZJmSnJJfe7+azNbL2mNpA+zl65z9+SA6maWXhg6zsDAQLK+cOHCZH3nzp25tZ6enqZ6Qpq75w+WME4jJ/mcknSXu+8zs2mS9prZc1lto7s/0GyTAOpTGH53H5Y0nD0eMbO3JF1YdWMAqvWVPvOb2WxJ8yS9nE263cxeM7MtZjY9Z55eM9tjZnta6hRAqRoOv5mdJ+kJSXe6+wlJmyRdKukqje0ZbJhoPnfvc/dud+8uoV8AJWko/Gb2dY0F/3F3f1KS3P2Iu3/u7qOSNktaUF2bAMpWGH4bG2b1UUlvufuD46Z3jXtZj6QD5bcHoCqNHOpbJOlvkl6XNJpNXifpVo3t8rukQUm3ZX8cTL0Xh/qAijV6qK8w/GUi/ED1Gg0/Z/gBQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCavcQ3R9JOjTu+beyaZ2oU3vr1L4kemtWmb1d3OgL23o9/5cW
branU+/t16m9dWpfEr01q67e2O0HgiL8QFB1h7+v5uWndGpvndqXRG/NqqW3Wj/zA6hP3Vt+ADWpJfxmdoOZ/dvM3jGzu+voIY+ZDZrZ62b2at1DjGXDoB01swPjps0ws+fM7O3s+4TDpNXU23ozO5ytu1fNbElNvc0ys91m9qaZvWFmd2TTa113ib5qWW9t3+03s3MkHZR0naQhSa9IutXd32xrIznMbFBSt7vXfkzYzK6R9Imkbe5+RTbtV5KOu/v92S/O6e7+8w7pbb2kT+oeuTkbUKZr/MjSkm6U9FPVuO4Sfd2sGtZbHVv+BZLecfd33f2kpB2SltXQR8dz9xckHT9j8jJJW7PHWzX2w9N2Ob11BHcfdvd92eMRSadHlq513SX6qkUd4b9Q0vvjng+ps4b8dkl/MbO9ZtZbdzMTmDluZKQPJM2ss5kJFI7c3E5njCzdMeuumRGvy8Yf/L5skbvPl/QjST/Ldm87ko99ZuukwzUNjdzcLhOMLP2FOtddsyNel62O8B+WNGvc84uyaR3B3Q9n349K+qM6b/ThI6cHSc2+H625ny900sjNE40srQ5Yd5004nUd4X9F0hwzu8TMviFphaT+Gvr4EjObmv0hRmY2VdL16rzRh/slrcwer5T0VI29/J9OGbk5b2Rp1bzuOm7Ea3dv+5ekJRr7i/9/JP2ijh5y+vq2pH9mX2/U3Zuk7RrbDfyvxv42skrSNyXtkvS2pL9KmtFBvf1WY6M5v6axoHXV1Nsije3Svybp1exrSd3rLtFXLeuNM/yAoPiDHxAU4QeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoP4HyvSNpkSoBeEAAAAASUVORK5CYII=\n",
697 | "text/plain": [
698 | ""
699 | ]
700 | },
701 | "metadata": {
702 | "needs_background": "light"
703 | },
704 | "output_type": "display_data"
705 | }
706 | ],
707 | "source": [
708 |     "with tf.Session(graph=g) as sess:\n",
709 |     "    saver = tf.train.Saver()\n",
710 |     "    saver.restore(sess, '/tmp/model.ckpt')\n",
711 |     "\n",
712 |     "    i = np.random.randint(0, len(test_images))\n",
713 |     "    res = np.argmax(sess.run(logits, feed_dict={X: [test_images_reshaped[i]]}))\n",
714 |     "    print('class predicted {}, expected {}'.format(res, test_labels[i]))\n",
715 |     "    plt.imshow(test_images[i], cmap='gray')"
717 | ]
718 | },
719 | {
720 | "cell_type": "markdown",
721 | "metadata": {
722 | "slideshow": {
723 | "slide_type": "subslide"
724 | }
725 | },
726 | "source": [
727 |     "# Concepts Learned\n",
728 | "\n",
729 | "- computation graph\n",
730 | "- what is a layer\n",
731 |     "- how to plug in non-linearities\n",
732 | "- loss function\n",
733 | "- optimizer\n",
734 | "- how to train a neural network"
735 | ]
736 | },
737 | {
738 | "cell_type": "markdown",
739 | "metadata": {
740 | "slideshow": {
741 | "slide_type": "slide"
742 | }
743 | },
744 | "source": [
745 | "# Introduction to Convolutional Neural Networks\n",
746 | "\n",
747 | "\n",
748 |     "- Different from fully connected layers\n",
749 |     "- Extract features from small patches, capturing local patterns\n",
750 | "\n",
751 | "\n",
752 | " \n",
753 | " \n",
754 | " \n",
755 | " \n",
756 |     "\n"
757 | ]
758 | },
759 | {
760 | "cell_type": "markdown",
761 | "metadata": {
762 | "slideshow": {
763 | "slide_type": "slide"
764 | }
765 | },
766 | "source": [
767 | "# Features and Convolution\n",
768 | "\n",
769 |     "- Compose these features to learn a hierarchical representation of the input\n",
770 | "\n",
771 | " \n",
772 | " \n",
773 | " \n",
774 | " \n",
775 |     ""
776 | ]
777 | },
778 | {
779 | "cell_type": "markdown",
780 | "metadata": {
781 | "slideshow": {
782 | "slide_type": "slide"
783 | }
784 | },
785 | "source": [
786 | "# MNIST Reloaded"
787 | ]
788 | },
789 | {
790 | "cell_type": "markdown",
791 | "metadata": {
792 | "slideshow": {
793 | "slide_type": "subslide"
794 | }
795 | },
796 | "source": [
797 | "## Model"
798 | ]
799 | },
800 | {
801 | "cell_type": "code",
802 | "execution_count": 12,
803 | "metadata": {
804 | "slideshow": {
805 | "slide_type": "fragment"
806 | }
807 | },
808 | "outputs": [],
809 | "source": [
810 | "input_l = tf.keras.Input((28,28, 1))\n",
811 | "conv_1 = tf.keras.layers.Conv2D(32, (3,3), activation='relu')(input_l)\n",
812 | "maxpool_1 = tf.keras.layers.MaxPooling2D((2,2))(conv_1)\n",
813 | "conv_2 = tf.keras.layers.Conv2D(64, (3,3), activation='relu')(maxpool_1)\n",
814 | "maxpool_2 = tf.keras.layers.MaxPooling2D((2,2))(conv_2)\n",
815 | "conv_3 = tf.keras.layers.Conv2D(64, (3,3), activation='relu')(maxpool_2)\n",
816 | "flatten = tf.keras.layers.Flatten()(conv_3)\n",
817 | "dense_1 = tf.keras.layers.Dense(64, activation='relu')(flatten)\n",
818 | "dense_2 = tf.keras.layers.Dense(10, activation='softmax')(dense_1)\n",
819 | "\n",
820 | "model = tf.keras.Model(inputs=[input_l], outputs=[dense_2])"
821 | ]
822 | },
823 | {
824 | "cell_type": "code",
825 | "execution_count": 13,
826 | "metadata": {
827 | "slideshow": {
828 | "slide_type": "subslide"
829 | }
830 | },
831 | "outputs": [
832 | {
833 | "name": "stdout",
834 | "output_type": "stream",
835 | "text": [
836 | "_________________________________________________________________\n",
837 | "Layer (type) Output Shape Param # \n",
838 | "=================================================================\n",
839 | "input_1 (InputLayer) (None, 28, 28, 1) 0 \n",
840 | "_________________________________________________________________\n",
841 | "conv2d (Conv2D) (None, 26, 26, 32) 320 \n",
842 | "_________________________________________________________________\n",
843 | "max_pooling2d (MaxPooling2D) (None, 13, 13, 32) 0 \n",
844 | "_________________________________________________________________\n",
845 | "conv2d_1 (Conv2D) (None, 11, 11, 64) 18496 \n",
846 | "_________________________________________________________________\n",
847 | "max_pooling2d_1 (MaxPooling2 (None, 5, 5, 64) 0 \n",
848 | "_________________________________________________________________\n",
849 | "conv2d_2 (Conv2D) (None, 3, 3, 64) 36928 \n",
850 | "_________________________________________________________________\n",
851 | "flatten (Flatten) (None, 576) 0 \n",
852 | "_________________________________________________________________\n",
853 | "dense (Dense) (None, 64) 36928 \n",
854 | "_________________________________________________________________\n",
855 | "dense_1 (Dense) (None, 10) 650 \n",
856 | "=================================================================\n",
857 | "Total params: 93,322\n",
858 | "Trainable params: 93,322\n",
859 | "Non-trainable params: 0\n",
860 | "_________________________________________________________________\n"
861 | ]
862 | }
863 | ],
864 | "source": [
865 | "model.summary()"
866 | ]
867 | },
868 | {
869 | "cell_type": "markdown",
870 | "metadata": {
871 | "slideshow": {
872 | "slide_type": "subslide"
873 | }
874 | },
875 | "source": [
876 | "## Compile and Fit"
877 | ]
878 | },
879 | {
880 | "cell_type": "code",
881 | "execution_count": 14,
882 | "metadata": {
883 | "slideshow": {
884 | "slide_type": "fragment"
885 | }
886 | },
887 | "outputs": [
888 | {
889 | "name": "stdout",
890 | "output_type": "stream",
891 | "text": [
892 | "Train on 54000 samples, validate on 6000 samples\n",
893 | "WARNING:tensorflow:From /Users/fumarolaf/miniconda3/envs/ai/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n",
894 | "Instructions for updating:\n",
895 | "Use tf.cast instead.\n",
896 | "Epoch 1/5\n",
897 | "54000/54000 [==============================] - 19s 343us/sample - loss: 0.1977 - acc: 0.9378 - val_loss: 0.0501 - val_acc: 0.9857\n",
898 | "Epoch 2/5\n",
899 | "54000/54000 [==============================] - 18s 340us/sample - loss: 0.0504 - acc: 0.9844 - val_loss: 0.0371 - val_acc: 0.9898\n",
900 | "Epoch 3/5\n",
901 | "54000/54000 [==============================] - 18s 340us/sample - loss: 0.0342 - acc: 0.9896 - val_loss: 0.0385 - val_acc: 0.9900\n",
902 | "Epoch 4/5\n",
903 | "54000/54000 [==============================] - 18s 341us/sample - loss: 0.0263 - acc: 0.9918 - val_loss: 0.0281 - val_acc: 0.9917\n",
904 | "Epoch 5/5\n",
905 | "54000/54000 [==============================] - 18s 342us/sample - loss: 0.0192 - acc: 0.9940 - val_loss: 0.0437 - val_acc: 0.9880\n"
906 | ]
907 | }
908 | ],
909 | "source": [
910 | "train_images_reshaped = train_images.reshape((-1, 28, 28, 1))\n",
911 | "test_images_reshaped = test_images.reshape((-1, 28, 28, 1))\n",
912 | "\n",
913 | "model.compile(optimizer=tf.keras.optimizers.RMSprop(),\n",
914 | " loss=tf.keras.losses.categorical_crossentropy,\n",
915 | " metrics=['accuracy'])\n",
916 | "\n",
917 | "history = model.fit(train_images_reshaped, train_labels_one_hot, validation_split=0.1, epochs=5, batch_size=64);"
918 | ]
919 | },
920 | {
921 | "cell_type": "markdown",
922 | "metadata": {
923 | "slideshow": {
924 | "slide_type": "subslide"
925 | }
926 | },
927 | "source": [
928 | "## Plot Metrics"
929 | ]
930 | },
931 | {
932 | "cell_type": "code",
933 | "execution_count": 15,
934 | "metadata": {
935 | "scrolled": true,
936 | "slideshow": {
937 | "slide_type": "fragment"
938 | }
939 | },
940 | "outputs": [
941 | {
942 | "data": {
943 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZIAAAEWCAYAAABMoxE0AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xt4FeW9//33h7MH5GxVogYVFRAEjKiliKh1Y1ulttSCWA/VUt21dtdf+8jWHmm9ftrto1Y3T3dt6xmlbn201BPbvaW1/bUigSKIiCCiBlEDCoKgGPj+/phJshJyWMnKykrC53Vdc2Xmnntm7pkk67vu+565RxGBmZlZc3UqdAHMzKx9cyAxM7OcOJCYmVlOHEjMzCwnDiRmZpYTBxIzM8uJA4kVnKTOkrZKOqQl8xaSpCMktfi99ZJOl7Q2Y3mlpHHZ5G3GsX4r6Zrmbt/Afn8u6a6W3q8VTpdCF8DaH0lbMxb3Bj4GdqbL34yI2U3ZX0TsBPZt6bx7gog4qiX2I+lS4PyIOCVj35e2xL6t43MgsSaLiKoP8vQb76UR8d/15ZfUJSIqWqNsZtb63LRlLS5tuvi9pAckbQHOl3SSpOckbZK0XtKtkrqm+btICknF6fJ96fonJW2R9HdJg5qaN11/pqRXJG2WdJuk/yPponrKnU0ZvylptaT3Jd2asW1nSTdL2ihpDTCxgetzraQ5tdJmSbopnb9U0or0fF5Nawv17atM0inp/N6S7k3Lthw4rlbeH0hak+53uaSz0/ThwL8D49Jmww0Z1/YnGdtflp77RkmPSjowm2vTGEnnpOXZJOkZSUdlrLtG0luSPpD0csa5nihpcZr+jqR/y/Z4lgcR4clTsydgLXB6rbSfAzuAs0i+rOwFHA+cQFILPgx4Bbgizd8FCKA4Xb4P2ACUAF2B3wP3NSPv/sAWYFK67irgE+Cies4lmzL+AegFFAPvVZ47cAWwHCgC+gHPJv9edR7nMGArsE/Gvt8FStLls9I8Ak4FtgMj0nWnA2sz9lUGnJLO3wj8CegDHAq8VCvvucCB6e/kvLQMn0rXXQr8qVY57wN+ks6fkZZxJNAD+P+AZ7K5NnWc/8+Bu9L5IWk5Tk1/R9cAK9P5YcDrwAFp3kHAYen8QmBqOt8TOKHQ/wt78uQaieXLXyPijxGxKyK2R8TCiFgQERURsQa4HRjfwPYPRURpRHwCzCb5AGtq3i8ASyLiD+m6m0mCTp2yLOP/jojNEbGW5EO78ljnAjdHRFlEbASub+A4a4AXSQIcwGeB9yOiNF3/x4hYE4lngP8B6uxQr+Vc4OcR8X5EvE5Sy8g87oMRsT79ndxP8iWgJIv9AkwDfhsRSyLiI2AGMF5SUUae+q5NQ6YAcyPimfR3dD1JMDoBqCAJWsPS5tHX0msHyReCwZL6RcSWiFiQ5XlYHjiQWL68mbkg6WhJj0t6W9IHwEygfwPbv50xv42GO9jry3tQZjkiIki+wdcpyzJmdSySb9INuR+Yms6fly5XluMLkhZIek/SJpLaQEPXqtKBDZVB0kWSXkibkDYBR2e5X0jOr2p/EfEB8D4wMCNPU35n9e13F8nvaGBErAT+F8nv4d20qfSANOvFwFBgpaTnJX0uy/OwPHAgsXypfevrr0m+hR8REfsBPyJpusmn9SRNTQBIEjU/+GrLpYzrgYMzlhu7PflB4HRJA0lqJvenZdwLeAj43yTNTr2B/8qyHG/XVwZJhwG/Ai4H+qX7fTljv43dqvwWSXNZ5f56kjShrcuiXE3ZbyeS39k6gIi4LyLGkjRrdSa5LkTEyoiYQtJ8+f8CD0vqkWNZrJkcSKy19AQ2Ax9KGgJ8sxWO+RgwWtJZkroA3wEG5KmMDwL/ImmgpH7A1Q1ljoi3gb8CdwErI2JVuqo70A0oB3ZK+gJwWhPKcI2k3kqes7kiY92+JMGinCSmfoOkRlLp
HaCo8uaCOjwAXCJphKTuJB/of4mIemt4TSjz2ZJOSY/9fZJ+rQWShkiakB5vezrtIjmBr0nqn9ZgNqfntivHslgzOZBYa/lfwIUkHxK/JukUz6uIeAf4KnATsBE4HPgHyXMvLV3GX5H0ZSwj6Qh+KItt7ifpPK9q1oqITcB3gUdIOqwnkwTEbPyYpGa0FngSuCdjv0uB24Dn0zxHAZn9Ck8Dq4B3JGU2UVVu/xRJE9Mj6faHkPSb5CQilpNc81+RBLmJwNlpf0l34Bck/Vpvk9SArk03/RywQsldgTcCX42IHbmWx5pHSbOxWccnqTNJU8rkiPhLoctj1lG4RmIdmqSJaVNPd+CHJHf7PF/gYpl1KA4k1tF9BlhD0mzyT8A5EVFf05aZNYObtszMLCeukZiZWU72iEEb+/fvH8XFxYUuhplZu7Jo0aINEdHQLfPAHhJIiouLKS0tLXQxzMzaFUmNjdAAuGnLzMxy5EBiZmY5cSAxM7Oc5LWPRNJE4Jckg639NiKur7X+KpL3IFSQ3Of/9XT4ayRdCPwgzfrziLg7TT+OZHyivYAngO+E72E2a1M++eQTysrK+OijjwpdFMtCjx49KCoqomvX+oZaa1jeAkk6HMUsknctlAELJc2NiJcysv2D5GU+2yRdTjKuzlcl9SUZN6iEZDC2Rem275OMyfMNknGCniAZm+fJfJ2HmTVdWVkZPXv2pLi4mGTQZWurIoKNGzdSVlbGoEGDGt+gDvls2hoDrE5f0LMDmEP1i3wAiIj5EbEtXXyO6iG//wl4OiLeS4PH08DE9NWe+0XEc2kt5B7gi/ko/OzZUFwMnTolP2fPzsdRzDqmjz76iH79+jmItAOS6NevX061x3wGkoHUfMlOGQ2/C+ISqmsW9W07kJovJqp3n5KmSyqVVFpeXt6kgs+eDdOnw+uvQ0Tyc/p0BxOzpnAQaT9y/V21ic52SeeTNGP9W0vtMyJuj4iSiCgZMKDR52lquPZa2LatZtq2bUm6mZnVlM9Aso6ab2ureutZJkmnk7xj4OyMwfTq23YdGW+8q2+fuXrjjaalm1nbsnHjRkaOHMnIkSM54IADGDhwYNXyjh3Zvbbk4osvZuXKlQ3mmTVrFrNbqKniM5/5DEuWLGmRfbW2fAaShcBgSYMkdQOmAHMzM0gaRfICobMj4t2MVfOAMyT1kdSH5J3V8yJiPfCBpBPT16ZeAPyhpQt+SD0vSa0v3cxy09J9kv369WPJkiUsWbKEyy67jO9+97tVy926dQOSTuZdu+p/qeKdd97JUUcd1eBxvvWtbzFtWs7v92r38hZIIqKC5FWf84AVwIMRsVzSTElnp9n+jeQVoP8paYmkuem27wE/IwlGC4GZaRrAPwO/BVYDr5KHO7auuw723rtm2t57J+lm1rJas09y9erVDB06lGnTpjFs2DDWr1/P9OnTKSkpYdiwYcycObMqb2UNoaKigt69ezNjxgyOPfZYTjrpJN59N/ne+4Mf/IBbbrmlKv+MGTMYM2YMRx11FH/7298A+PDDD/nyl7/M0KFDmTx5MiUlJY3WPO677z6GDx/OMcccwzXXXANARUUFX/va16rSb731VgBuvvlmhg4dyogRIzj//PNb/JplI6/PkUTEEyS36Gam/Shj/vQGtr0DuKOO9FLgmBYs5m4qv2Bce23SnHXIIUkQ8RcPs5bXUJ9kPv7nXn75Ze655x5KSkoAuP766+nbty8VFRVMmDCByZMnM3To0BrbbN68mfHjx3P99ddz1VVXcccddzBjxozd9h0RPP/888ydO5eZM2fy1FNPcdttt3HAAQfw8MMP88ILLzB69OgGy1dWVsYPfvADSktL6dWrF6effjqPPfYYAwYMYMOGDSxbtgyATZs2AfCLX/yC119/nW7dulWltbY20dneFk2bBmvXwq5dyU8HEbP8aO0+ycMPP7wqiAA88MADjB49mtGjR7NixQpeeuml3bbZa6+9OPPMMwE47rjj
WLt2bZ37/tKXvrRbnr/+9a9MmTIFgGOPPZZhw4Y1WL4FCxZw6qmn0r9/f7p27cp5553Hs88+yxFHHMHKlSu58sormTdvHr169QJg2LBhnH/++cyePbvZDxTmyoHEzAqqtfsk99lnn6r5VatW8ctf/pJnnnmGpUuXMnHixDqfp6jsVwHo3LkzFRUVde67e/fujeZprn79+rF06VLGjRvHrFmz+OY3vwnAvHnzuOyyy1i4cCFjxoxh586dLXrcbDiQmFlBFbJP8oMPPqBnz57st99+rF+/nnnz5rX4McaOHcuDDz4IwLJly+qs8WQ64YQTmD9/Phs3bqSiooI5c+Ywfvx4ysvLiQi+8pWvMHPmTBYvXszOnTspKyvj1FNP5Re/+AUbNmxgW+12wlawR7yPxMzarkL2SY4ePZqhQ4dy9NFHc+ihhzJ27NgWP8a3v/1tLrjgAoYOHVo1VTZL1aWoqIif/exnnHLKKUQEZ511Fp///OdZvHgxl1xyCRGBJG644QYqKio477zz2LJlC7t27eJ73/sePXv2bPFzaMwe8c72kpKS8IutzFrPihUrGDJkSKGL0SZUVFRQUVFBjx49WLVqFWeccQarVq2iS5e29T2+rt+ZpEURUVLPJlXa1pmYmXUwW7du5bTTTqOiooKI4Ne//nWbCyK56lhnY2bWxvTu3ZtFixYVuhh55c52MzPLiQOJmZnlxIHEzMxy4kBiZmY5cSAxsw5nwoQJuz1ceMstt3D55Zc3uN2+++4LwFtvvcXkyZPrzHPKKafQ2OMEt9xyS40HAz/3uc+1yDhYP/nJT7jxxhtz3k9LcyAxsw5n6tSpzJkzp0banDlzmDp1albbH3TQQTz00EPNPn7tQPLEE0/Qu3fvZu+vrXMgMbMOZ/LkyTz++ONVL7Fau3Ytb731FuPGjat6rmP06NEMHz6cP/xh91carV27lmOOSQYZ3759O1OmTGHIkCGcc845bN++vSrf5ZdfXjUE/Y9//GMAbr31Vt566y0mTJjAhAkTACguLmbDhg0A3HTTTRxzzDEcc8wxVUPQr127liFDhvCNb3yDYcOGccYZZ9Q4Tl2WLFnCiSeeyIgRIzjnnHN4//33q45fOax85WCRf/7zn6te7DVq1Ci2bNnS7GtbFz9HYmZ59S//Ai394r+RIyH9DK5T3759GTNmDE8++SSTJk1izpw5nHvuuUiiR48ePPLII+y3335s2LCBE088kbPPPrve95b/6le/Yu+992bFihUsXbq0xjDw1113HX379mXnzp2cdtppLF26lCuvvJKbbrqJ+fPn079//xr7WrRoEXfeeScLFiwgIjjhhBMYP348ffr0YdWqVTzwwAP85je/4dxzz+Xhhx9u8P0iF1xwAbfddhvjx4/nRz/6ET/96U+55ZZbuP7663nttdfo3r17VXPajTfeyKxZsxg7dixbt26lR48eTbjajXONxMw6pMzmrcxmrYjgmmuuYcSIEZx++umsW7eOd955p979PPvss1Uf6CNGjGDEiBFV6x588EFGjx7NqFGjWL58eaMDMv71r3/lnHPOYZ999mHfffflS1/6En/5y18AGDRoECNHjgQaHqoekvejbNq0ifHjxwNw4YUX8uyzz1aVcdq0adx3331VT9CPHTuWq666iltvvZVNmza1+JP1rpGYWV41VHPIp0mTJvHd736XxYsXs23bNo477jgAZs+eTXl5OYsWLaJr164UFxfXOXR8Y1577TVuvPFGFi5cSJ8+fbjooouatZ9KlUPQQzIMfWNNW/V5/PHHefbZZ/njH//Iddddx7Jly5gxYwaf//zneeKJJxg7dizz5s3j6KOPbnZZa3ONxMw6pH333ZcJEybw9a9/vUYn++bNm9l///3p2rUr8+fP5/XXX29wPyeffDL3338/AC+++CJLly4FkiHo99lnH3r16sU777zDk09Wv/W7Z8+edfZDjBs3jkcffZRt27bx4Ycf8sgjjzBu3Lgmn1uv
Xr3o06dPVW3m3nvvZfz48ezatYs333yTCRMmcMMNN7B582a2bt3Kq6++yvDhw7n66qs5/vjjefnll5t8zIbktUYiaSLwS6Az8NuIuL7W+pOBW4ARwJSIeChNnwDcnJH16HT9o5LuAsYDm9N1F0VEC7fAmllHMHXqVM4555wad3BNmzaNs846i+HDh1NSUtLoN/PLL7+ciy++mCFDhjBkyJCqms2xxx7LqFGjOProozn44INrDEE/ffp0Jk6cyEEHHcT8+fOr0kePHs1FF13EmDFjALj00ksZNWpUg81Y9bn77ru57LLL2LZtG4cddhh33nknO3fu5Pzzz2fz5s1EBFdeeSW9e/fmhz/8IfPnz6dTp04MGzas6m2PLSVvw8hL6gy8AnwWKAMWAlMj4qWMPMXAfsD3gLmVgaTWfvoCq4GiiNiWBpLH6spbHw8jb9a6PIx8+9NWh5EfA6yOiDVpgeYAk4CqQBIRa9N1uxrYz2TgyYho/dd+mZlZo/LZRzIQeDNjuSxNa6opwAO10q6TtFTSzZK617WRpOmSSiWVlpeXN+OwZmaWjTbd2S7pQGA4kDnWwb+S9JkcD/QFrq5r24i4PSJKIqJkwIABeS+rmdW0J7x9taPI9XeVz0CyDjg4Y7koTWuKc4FHIuKTyoSIWB+Jj4E7SZrQzKwN6dGjBxs3bnQwaQcigo0bN+b0kGI++0gWAoMlDSIJIFOA85q4j6kkNZAqkg6MiPVKHkP9IvBiSxTWzFpOUVERZWVluFm5fejRowdFRUXN3j5vgSQiKiRdQdIs1Rm4IyKWS5oJlEbEXEnHA48AfYCzJP00IoZB1R1dBwN/rrXr2ZIGAAKWAJfl6xzMrHm6du3KoEGDCl0MayV5u/23LfHtv2ZmTZft7b9turPdzMzaPgcSMzPLiQOJmZnlxIHEzMxy4kBiZmY5cSAxM7OcOJCYmVlOHEjMzCwnDiRmZpYTBxIzM8uJA4mZmeXEgcTMzHLiQGJmZjlxIDEzs5w4kJiZWU4cSMzMLCcOJGZmlhMHEjMzy4kDiZmZ5SSvgUTSREkrJa2WNKOO9SdLWiypQtLkWut2SlqSTnMz0gdJWpDu8/eSuuXzHMzMrGF5CySSOgOzgDOBocBUSUNrZXsDuAi4v45dbI+Ikel0dkb6DcDNEXEE8D5wSYsX3szMspbPGskYYHVErImIHcAcYFJmhohYGxFLgV3Z7FCSgFOBh9Kku4EvtlyRzcysqfIZSAYCb2Ysl6Vp2eohqVTSc5Iqg0U/YFNEVDS2T0nT0+1Ly8vLm1p2MzPLUpdCF6ABh0bEOkmHAc9IWgZsznbjiLgduB2gpKQk8lRGM7M9Xj5rJOuAgzOWi9K0rETEuvTnGuBPwChgI9BbUmUAbNI+zcys5eUzkCwEBqd3WXUDpgBzG9kGAEl9JHVP5/sDY4GXIiKA+UDlHV4XAn9o8ZKbmVnW8hZI0n6MK4B5wArgwYhYLmmmpLMBJB0vqQz4CvBrScvTzYcApZJeIAkc10fES+m6q4GrJK0m6TP5Xb7OwczMGqfkS37HVlJSEqWlpYUuhplZuyJpUUSUNJbPT7abmVlOHEjMzCwnDiRmZpYTBxIzM8uJA4mZmeXEgcTMzHLiQGJmZjlxIDEzs5w4kJiZWU4cSMzMLCcOJGZmlhMHEjMzy4kDiZmZ5cSBxMzMcuJAYmZmOXEgMTOznDiQmJlZThxIzMwsJw4kZmaWk7wGEkkTJa2UtFrSjDrWnyxpsaQKSZMz0kdK+ruk5ZKWSvpqxrq7JL0maUk6jcznOZiZWcO65GvHkjoDs4DPAmXAQklzI+KljGxvABcB36u1+TbggohYJekgYJGkeRGxKV3//Yh4KF9lNzOz7OUtkABjgNURsQZA0hxgElAVSCJibbpuV+aGEfFKxvxbkt4FBgCbMDOzNiWfTVsDgTczlsvStCaRNAboBryakXxd2uR1s6Tu9Ww3XVKppNLy8vKmHtbMzLLUpjvbJR0I
3AtcHBGVtZZ/BY4Gjgf6AlfXtW1E3B4RJRFRMmDAgFYpr5nZniifgWQdcHDGclGalhVJ+wGPA9dGxHOV6RGxPhIfA3eSNKGZmVmB5DOQLAQGSxokqRswBZibzYZp/keAe2p3qqe1FCQJ+CLwYouW2szMmiRvgSQiKoArgHnACuDBiFguaaakswEkHS+pDPgK8GtJy9PNzwVOBi6q4zbf2ZKWAcuA/sDP83UOZmbWOEVEocuQdyUlJVFaWlroYpiZtSuSFkVESWP52nRnu5mZtX0OJGZmlhMHEjMzy4kDiZmZ5SSrQCLp8MonyCWdIulKSb3zWzQzM2sPsq2RPAzslHQEcDvJg4b3561UZmbWbmQbSHalz4WcA9wWEd8HDsxfsczMrL3INpB8ImkqcCHwWJrWNT9FMjOz9iTbQHIxcBJwXUS8JmkQyWCKZma2h8vqfSTpy6iuBJDUB+gZETfks2BmZtY+ZHvX1p8k7SepL7AY+I2km/JbNDMzaw+ybdrqFREfAF8iGZH3BOD0/BXLzMzai2wDSZd0+PZzqe5sNzMzyzqQzCQZDv7ViFgo6TBgVf6KZWZm7UW2ne3/CfxnxvIa4Mv5KpSZmbUf2Xa2F0l6RNK76fSwpKJ8F87MzNq+bJu27iR5Te5B6fTHNM3MzPZw2QaSARFxZ0RUpNNdwIA8lsvMzNqJbAPJRknnS+qcTucDGxvbSNJESSslrZY0o471J0taLKlC0uRa6y6UtCqdLsxIP07SsnSft0pSludgZmZ5kG0g+TrJrb9vA+uBycBFDW0gqTMwCzgTGApMlTS0VrY30v3cX2vbvsCPgROAMcCP0yfqAX4FfAMYnE4TszwHMzPLg6wCSUS8HhFnR8SAiNg/Ir5I43dtjQFWR8SaiNgBzAEm1drv2ohYCuyqte0/AU9HxHsR8T7wNDAxfZZlv4h4LiICuAf4YjbnYGZm+ZHLGxKvamT9QODNjOWyNC0b9W07MJ1vdJ+SpksqlVRaXl6e5WHNzKypcgkkbbpvIiJuj4iSiCgZMMD3BZiZ5UsugSQaWb+O5E2KlYrStGzUt+26dL45+zQzszxoMJBI2iLpgzqmLSTPkzRkITBY0iBJ3YApJM+iZGMecIakPmkn+xnAvIhYD3wg6cT0bq0LgD9kuU8zM8uDBodIiYiezd1xRFRIuoIkKHQG7oiI5ZJmAqURMVfS8cAjQB/gLEk/jYhhEfGepJ+RBCOAmRHxXjr/z8BdwF7Ak+lkZmYFouTmp46tpKQkSktLC10MM7N2RdKiiChpLF8ufSRmZmYOJGZmlhsHEjMzy4kDiZmZ5cSBxMzMcuJAYmZmOXEgMTOznDiQmJlZThxIzMwsJw4kZmaWEwcSMzPLiQOJmZnlxIHEzMxy4kBiZmY5cSAxM7OcOJCYmVlOHEjMzCwnDiRmZpYTBxIzM8tJXgOJpImSVkpaLWlGHeu7S/p9un6BpOI0fZqkJRnTLkkj03V/SvdZuW7/fJ6DmZk1LG+BRFJnYBZwJjAUmCppaK1slwDvR8QRwM3ADQARMTsiRkbESOBrwGsRsSRju2mV6yPi3Xydg5mZNS6fNZIxwOqIWBMRO4A5wKRaeSYBd6fzDwGnSVKtPFPTbc3MrA3KZyAZCLyZsVyWptWZJyIqgM1Av1p5vgo8UCvtzrRZ64d1BB4AJE2XVCqptLy8vLnnYGZmjWjTne2STgC2RcSLGcnTImI4MC6dvlbXthFxe0SURETJgAEDWqG0ZmZ7pnwGknXAwRnLRWlanXkkdQF6ARsz1k+hVm0kItalP7cA95M0oZmZWYHkM5AsBAZLGiSpG0lQmFsrz1zgwnR+MvBMRASApE7AuWT0j0jqIql/Ot8V+ALwImZmVjBd8rXjiKiQdAUwD+gM3BERyyXNBEojYi7wO+BeSauB90iCTaWTgTcjYk1GWndgXhpEOgP/DfwmX+dgZmaNU1oB6NBKSkqitLS00MUwM2tXJC2KiJLG8rXpznYz
M2v7HEjMzCwnDiRmZpYTBxIzM8uJA4mZmeXEgcTMzHLiQGJmZjlxIDEzs5w4kJiZWU4cSMzMLCcOJGZmlhMHEjMzy4kDiZmZ5cSBxMzMcuJAYmZmOXEgMTOznDiQmJlZThxIzMwsJw4k1iJmz4biYujUKfk5e3ahS2RmrSWvgUTSREkrJa2WNKOO9d0l/T5dv0BScZpeLGm7pCXp9B8Z2xwnaVm6za2SlM9zsMbNng3Tp8Prr0NE8nP6dAcTsz1F3gKJpM7ALOBMYCgwVdLQWtkuAd6PiCOAm4EbMta9GhEj0+myjPRfAd8ABqfTxHydw+bNsGtXvvbecVx7LWzbVjNt27Yk3cw6vnzWSMYAqyNiTUTsAOYAk2rlmQTcnc4/BJzWUA1D0oHAfhHxXEQEcA/wxZYveuLii6F3bxg3Dr79bfjd72DxYvj443wdsX16442mpZtZx9Ilj/seCLyZsVwGnFBfnoiokLQZ6JeuGyTpH8AHwA8i4i9p/rJa+xxY18ElTQemAxxyyCHNOoGpU+Ggg2DJErjrLti6NUnv0gWGDIFRo2DkyOqpT59mHabdO+SQpDmrrnQz6/jyGUhysR44JCI2SjoOeFTSsKbsICJuB24HKCkpieYU4itfSSZImrhefTUJKv/4R/Lz6afhnnuq8x96aBJQMgPMIYdAR+/Fue66pE8ks3lr772TdDPr+PIZSNYBB2csF6VpdeUpk9QF6AVsTJutPgaIiEWSXgWOTPMXNbLPvOjUCQYPTqbK4ALwzjtJUMkMMHPnJp3OkNRSKoNKZYA5+mjo2rU1St06pk1Lfl57bdKcdcghSRCpTDezjk0Rzfqy3viOk8DwCnAayYf9QuC8iFiekedbwPCIuEzSFOBLEXGupAHAexGxU9JhwF/SfO9Jeh64ElgAPAHcFhFPNFSWkpKSKC0tzcdp1unDD2HZsurAsmQJLF0KH32UrO/eHY45pmaAGTECevZstSKamTVK0qKIKGksX95qJGmfxxXAPKAzcEdELJc0EyiNiLnA74B7Ja0G3gOmpJufDMyU9AmwC7gsIt5L1/0zcBewF/BkOrUp++wDJ56YTJUqKuCVV2rWXB59NOnAr3TEEbv3uxx4YMdvGjOz9i1vNZK2pLUtS75nAAANu0lEQVRrJNmKgHXrdm8aW7OmOs/+++/eNDZ4MHTuXLhym1n7sHNnbp8VBa+RWOMkKCpKpi98oTp982Z44YWaAebmm+GTT5L1e++dNIVlBphjjknSzWzPtXkzLFgAf/tbMj33HLz8cnL3aT45kLRBvXrByScnU6UdO2DFipo1lwcegP9In/nv1AmOOqpm09ioUdC/f2HOwczyKyK5k7QyaPztb/Dii0l6p04wfDicf37SrJ5vbtpqxyJg7drdm8bezHh6Z+DA3W9JHjQo+UMzs/Zj+3ZYtKhm4CgvT9bttx+cdBJ8+tPJNGZMkpYrN23tAaQkKAwaBOecU52+cWN1cKkMME89lbSXQvIHduyxNQPM0KHJ3WRm1ja89VbNoLF4cXXz9uDB8LnPVQeOIUMK22/qGskeYvt2WL68Zs3lhReSW5Uhea5l6NCazWLHHpsMEWNm+VVRkTwikBk4KkeL6NEDjj++OmicdBIMGNA65cq2RuJAsgfbtQtWr969aeztt6vzFBfv3u9SVORbkhsTkXw4fPJJ9bRjR83lTz5JaoGDBkG3boUusbWm995LOsIrg8aCBdUjQxx0EIwdWx04Ro4s3N+HA0kGB5Kmefvt3ZvGVq2qflq/b9/d+12OPjoZg6wl7NzZ8Idvc9Jbe19N6eDs3DkJJkcdlUxHHlk9f8ABDtrt3a5dyTNkmbWNFSuSdZ07J/8/lUHj05+Ggw9uO79zB5IMDiS527p196f1ly2r+bT+8OHJeGO5fpC3xp+klHzL69p196m+9IbWNXdf27YlHzIrVybTqlVJM2Slnj1rBpbKQHPkkcmDr9b2fPghLFxYHTT+/vekBgLJkEmZ
QeP449v279GBJIMDSX5UVCQffpk1l7feyt+Hckt+kBf6gc7Zs+sem2zXLigrqw4smUHmjTdqBtmiot1rMEcdleyv0Oe3p4hI7pLMrG0sWVJ9Y8uQITUDx5FHtq87Jh1IMjiQWFtS+UbJ2qMl3357wwNdbt+e9GlVBpbMQLNpU3W+7t2T4Xbqairr2zd/57Un2LEjCRSZgWNdOmzs3nvDCSdUB40TT2z/19uBJIMDibUlxcV1v7/l0EOT54KaKiJ5nqB2DWblyuSBtcz+mn79dm8mO+ooOPxw3/5dl/LypGmqMmgsXFjdnHvooTVrGyNGtFw/YVvhQJLBgcTakk6d6u4Hklr+1c6ffJIEp7pqMZl353XqlHT419Ufc9BBbafzN5927YKXXqpZ21i1KlnXtSuMHl3zFtyBdb5Sr2PxA4lmbVRrvlGya9fq9+hkjucGybhMr7xSsxbzyivw5z/XbHbbd9/qAJMZaI48MlnXXm3Zsvu4VJs3J+sGDEgCxqWXJj+POw722quw5W3LHEjMWllbeaNkr17JXUPHH18zfdeupN2/dlPZ3/8Oc+bUrE0ddFDdTWXFxW2rwz8CXnutZm1j2bLkXKVk0NMpU6prHIcfvmfUwlqKm7bMCqC+u7bauo8+qr/D//33q/N165Z0+NfVVNYaA4l+9FEypEhm4HjnnWRdz55JR3hl0DjhhCSo2u7cR5LBgcQsvyJgw4bdO/tfeSUJPJVjREFyJ1Ndd5QdcUTzO/zffrtm0Fi0KLnDCpLaRWan+LBhbau21JY5kGRwIDErnIqK6g7/2oFm/frqfJ06JXdC1dVUNnBgdVNTRUUyXHpm4HjttWRd9+5QUlKzU/xTn2r1U+4wHEgyOJCYtU1btuze2V/5s3JAUUgCRERSy5Cq+2kOOKDmuFSjRvk25pbUJu7akjQR+CXJO9t/GxHX11rfHbgHOA7YCHw1ItZK+ixwPdAN2AF8PyKeSbf5E3AgUDmQxBkR8W4+z8PM8qNnz+SOqOOOq5le+RrqV16Be++F++6rfh4mIumDueEG+M533CneFuTtYX1JnYFZwJnAUGCqpKG1sl0CvB8RRwA3Azek6RuAsyJiOHAhcG+t7aZFxMh0chAx62AqX0N96qkwf/7ug2Du2AG33OIg0lbkc9SXMcDqiFgTETuAOcCkWnkmAXen8w8Bp0lSRPwjIt5K05cDe6W1FzPbw7zxRtPSrfXlM5AMBDJe+kpZmlZnnoioADYD/Wrl+TKwOCI+zki7U9ISST+U/J3ErCOr70HNfDzAac3TpsehlDSMpLnrmxnJ09Imr3Hp9LV6tp0uqVRSaXnli43NrN257rrkgc1MhXiA0+qXz0CyDjg4Y7koTaszj6QuQC+STnckFQGPABdExKuVG0TEuvTnFuB+kia03UTE7RFREhElA1rrvZRm1uKmTUtGRj700KRP5NBDGx8p2VpXPu/aWggMljSIJGBMAc6rlWcuSWf634HJwDMREZJ6A48DMyLi/1RmToNN74jYIKkr8AXgv/N4DmbWBkyb5sDRluWtRpL2eVwBzANWAA9GxHJJMyWdnWb7HdBP0mrgKmBGmn4FcATwo7QvZImk/YHuwDxJS4ElJAHqN/k6BzOz9mj27GS8s06dkp+zZ+f3eH4g0cysA2nui9Pqku0DiW26s93MzJrm2mtrBhFIlq+9Nn/HdCAxM+tACvHcjQOJmVkHUojnbhxIzMw6kEI8d+NAYmbWgRTiuRu/atfMrINp7eduXCMxM7OcOJCYmVlOHEjMzCwnDiRmZpYTBxIzM8vJHjHWlqRy4PVmbt6f5NW/bY3L1TQuV9O4XE3TUct1aEQ0+h6OPSKQ5EJSaTaDlrU2l6tpXK6mcbmaZk8vl5u2zMwsJw4kZmaWEweSxt1e6ALUw+VqGperaVyuptmjy+U+EjMzy4lrJGZmlhMHEjMzy4kDCSDpDknvSnqxnvWSdKuk1ZKWShrdRsp1iqTNkpak049aqVwH
S5ov6SVJyyV9p448rX7NsixXq18zST0kPS/phbRcP60jT3dJv0+v1wJJxW2kXBdJKs+4Xpfmu1wZx+4s6R+SHqtjXatfryzLVZDrJWmtpGXpMUvrWJ/f/8eI2OMn4GRgNPBiPes/BzwJCDgRWNBGynUK8FgBrteBwOh0vifwCjC00Ncsy3K1+jVLr8G+6XxXYAFwYq08/wz8Rzo/Bfh9GynXRcC/t/bfWHrsq4D76/p9FeJ6ZVmuglwvYC3Qv4H1ef1/dI0EiIhngfcayDIJuCcSzwG9JR3YBspVEBGxPiIWp/NbgBXAwFrZWv2aZVmuVpdeg63pYtd0qn2XyyTg7nT+IeA0SWoD5SoISUXA54Hf1pOl1a9XluVqq/L6/+hAkp2BwJsZy2W0gQ+o1Elp08STkoa19sHTJoVRJN9mMxX0mjVQLijANUubQ5YA7wJPR0S91ysiKoDNQL82UC6AL6fNIQ9JOjjfZUrdAvw/wK561hfkemVRLijM9QrgvyQtkjS9jvV5/X90IGnfFpOMhXMscBvwaGseXNK+wMPAv0TEB6157IY0Uq6CXLOI2BkRI4EiYIykY1rjuI3Jolx/BIojYgTwNNW1gLyR9AXg3YhYlO9jNUWW5Wr165X6TESMBs4EviXp5FY6LuBAkq11QOY3i6I0raAi4oPKpomIeALoKql/axxbUleSD+vZEfH/15GlINessXIV8pqlx9wEzAcm1lpVdb0kdQF6ARsLXa6I2BgRH6eLvwWOa4XijAXOlrQWmAOcKum+WnkKcb0aLVeBrhcRsS79+S7wCDCmVpa8/j86kGRnLnBBeufDicDmiFhf6EJJOqCyXVjSGJLfZ94/fNJj/g5YERE31ZOt1a9ZNuUqxDWTNEBS73R+L+CzwMu1ss0FLkznJwPPRNpLWshy1WpHP5uk3ymvIuJfI6IoIopJOtKfiYjza2Vr9euVTbkKcb0k7SOpZ+U8cAZQ+07PvP4/dmmpHbVnkh4guZunv6Qy4MckHY9ExH8AT5Dc9bAa2AZc3EbKNRm4XFIFsB2Yku9/ptRY4GvAsrR9HeAa4JCMshXimmVTrkJcswOBuyV1JglcD0bEY5JmAqURMZckAN4raTXJDRZT8lymbMt1paSzgYq0XBe1Qrnq1AauVzblKsT1+hTwSPr9qAtwf0Q8JekyaJ3/Rw+RYmZmOXHTlpmZ5cSBxMzMcuJAYmZmOXEgMTOznDiQmJlZThxIzJpJ0s6MUV6XSJrRgvsuVj2jPpu1NX6OxKz5tqfDi5jt0VwjMWth6bshfpG+H+J5SUek6cWSnkkH9PsfSYek6Z+S9Eg6kOQLkj6d7qqzpN8oeVfIf6VPnyPpSiXvXFkqaU6BTtOsigOJWfPtVatp66sZ6zZHxHDg30lGjIVkkMi70wH9ZgO3pum3An9OB5IcDSxP0wcDsyJiGLAJ+HKaPgMYle7nsnydnFm2/GS7WTNJ2hoR+9aRvhY4NSLWpINIvh0R/SRtAA6MiE/S9PUR0V9SOVCUMdhf5TD4T0fE4HT5aqBrRPxc0lPAVpKRix/NeKeIWUG4RmKWH1HPfFN8nDG/k+o+zc8Ds0hqLwvT0W/NCsaBxCw/vprx8+/p/N+oHlxwGvCXdP5/gMuh6kVTverbqaROwMERMR+4mmT49N1qRWatyd9kzJpvr4xRhgGeiojKW4D7SFpKUquYmqZ9G7hT0veBcqpHYP0OcLukS0hqHpcD9Q3x3Rm4Lw02Am5N3yViVjDuIzFrYWkfSUlEbCh0Wcxag5u2zMwsJ66RmJlZTlwjMTOznDiQmJlZThxIzMwsJw4kZmaWEwcSMzPLyf8FzHtzu66ELSEAAAAASUVORK5CYII=\n",
944 | "text/plain": [
945 | ""
946 | ]
947 | },
948 | "metadata": {
949 | "needs_background": "light"
950 | },
951 | "output_type": "display_data"
952 | }
953 | ],
954 | "source": [
955 | "import matplotlib.pyplot as plt\n",
956 | "\n",
957 | "acc = history.history['acc']\n",
958 | "val_acc = history.history['val_acc']\n",
959 | "loss = history.history['loss']\n",
960 | "val_loss = history.history['val_loss']\n",
961 | "\n",
962 | "epochs = range(1, len(acc) + 1)\n",
963 | "\n",
964 | "# \"bo\" is for \"blue dot\"\n",
965 | "plt.plot(epochs, loss, 'bo', label='Training loss')\n",
966 | "# b is for \"solid blue line\"\n",
967 | "plt.plot(epochs, val_loss, 'b', label='Validation loss')\n",
968 | "plt.title('Training and validation loss')\n",
969 | "plt.xlabel('Epochs')\n",
970 | "plt.ylabel('Loss')\n",
971 | "plt.legend()\n",
972 | "\n",
973 | "plt.show()"
974 | ]
975 | },
976 | {
977 | "cell_type": "markdown",
978 | "metadata": {
979 | "slideshow": {
980 | "slide_type": "subslide"
981 | }
982 | },
983 | "source": [
984 | "## Evaluate on Test"
985 | ]
986 | },
987 | {
988 | "cell_type": "code",
989 | "execution_count": 16,
990 | "metadata": {
991 | "slideshow": {
992 | "slide_type": "fragment"
993 | }
994 | },
995 | "outputs": [
996 | {
997 | "name": "stdout",
998 | "output_type": "stream",
999 | "text": [
1000 | "10000/10000 [==============================] - 1s 88us/sample - loss: 0.0387 - acc: 0.9876\n",
1001 | "test loss=0.03868261780568318, accuracy=0.9876000285148621\n"
1002 | ]
1003 | }
1004 | ],
1005 | "source": [
1006 | "test_loss, test_acc = model.evaluate(test_images_reshaped, test_labels_one_hot)\n",
1007 | "print('test loss={}, accuracy={}'.format(test_loss, test_acc))"
1008 | ]
1009 | },
1010 | {
1011 | "cell_type": "markdown",
1012 | "metadata": {
1013 | "slideshow": {
1014 | "slide_type": "subslide"
1015 | }
1016 | },
1017 | "source": [
1018 | "# What we don't cover\n",
1019 | "\n",
1020 | "- using data augmentation\n",
1021 |     "- using pretrained convnets\n",
1022 |     "- fine-tuning\n",
1023 |     "- visualizing convnet filters and activations"
1024 | ]
1025 | },
1026 | {
1027 | "cell_type": "markdown",
1028 | "metadata": {
1029 | "slideshow": {
1030 | "slide_type": "slide"
1031 | }
1032 | },
1033 | "source": [
1034 | "# From feed-forward to RNNs\n",
1035 | "\n",
1036 | "- language model\n",
1037 | "- image captioning\n",
1038 |     "- sentiment classification\n",
1039 | "- machine translation\n",
1040 | "- video classification on frame level\n",
1041 | "\n",
1042 | " "
1043 | ]
1044 | },
1045 | {
1046 | "cell_type": "markdown",
1047 | "metadata": {
1048 | "slideshow": {
1049 | "slide_type": "subslide"
1050 | }
1051 | },
1052 | "source": [
1053 | "## Long Short Term Memory\n",
1054 | "\n",
1055 | "- **forget gate**: whether to erase cell\n",
1056 | "- **input gate**: whether to write to cell\n",
1057 | "- **gate gate**: how much to write to cell\n",
1058 | "- **output gate**: how much to reveal cell\n",
1059 | "\n",
1060 | "\n",
1061 | "\n",
1062 | " \n",
1063 | " \n",
1064 | " \n",
1065 | " \n",
1066 | "
"
1067 | ]
1068 | },
1069 | {
1070 | "cell_type": "markdown",
1071 | "metadata": {
1072 | "slideshow": {
1073 | "slide_type": "subslide"
1074 | }
1075 | },
1076 | "source": [
1077 | "## Character-level Language Model\n",
1078 | "\n",
1079 | " "
1080 | ]
1081 | },
1082 | {
1083 | "cell_type": "markdown",
1084 | "metadata": {
1085 | "slideshow": {
1086 | "slide_type": "subslide"
1087 | }
1088 | },
1089 | "source": [
1090 | "## Review Dataset"
1091 | ]
1092 | },
1093 | {
1094 | "cell_type": "code",
1095 | "execution_count": 27,
1096 | "metadata": {
1097 | "slideshow": {
1098 | "slide_type": "fragment"
1099 | }
1100 | },
1101 | "outputs": [
1102 | {
1103 | "name": "stdout",
1104 | "output_type": "stream",
1105 | "text": [
1106 | "max sentence length 4112\n",
1107 | "min sentence length 1\n",
1108 | "num of reviews = 67633\n",
1109 | "num of reviews len [100:200] = 17577\n"
1110 | ]
1111 | }
1112 | ],
1113 | "source": [
1114 | "with open('./data/train.csv','r') as f:\n",
1115 | " reviews = f.readlines()\n",
1116 | "\n",
1117 | "reviews = [r.lower() for r in reviews]\n",
1118 | "\n",
1119 | "max_review = np.max([len(x) for x in reviews])\n",
1120 | "min_review = np.min([len(x) for x in reviews])\n",
1121 | "print('max sentence length {}'.format(max_review))\n",
1122 | "print('min sentence length {}'.format(min_review))\n",
1123 | "\n",
1124 | "# \n",
1125 | "print('num of reviews =', len(reviews))\n",
1126 | "reviews = [s for s in reviews if 200 >= len(s) >= 100]\n",
1127 | "print('num of reviews len [100:200] =', len(reviews))"
1128 | ]
1129 | },
1130 | {
1131 | "cell_type": "code",
1132 | "execution_count": 28,
1133 | "metadata": {},
1134 | "outputs": [
1135 | {
1136 | "data": {
1137 | "text/plain": [
1138 | "30"
1139 | ]
1140 | },
1141 | "execution_count": 28,
1142 | "metadata": {},
1143 | "output_type": "execute_result"
1144 | }
1145 | ],
1146 | "source": [
1147 | "import gc\n",
1148 | "gc.collect()"
1149 | ]
1150 | },
1151 | {
1152 | "cell_type": "code",
1153 | "execution_count": 29,
1154 | "metadata": {
1155 | "slideshow": {
1156 | "slide_type": "subslide"
1157 | }
1158 | },
1159 | "outputs": [
1160 | {
1161 | "name": "stdout",
1162 | "output_type": "stream",
1163 | "text": [
1164 | "Number of sequences: 172507\n",
1165 | "Unique characters: 110\n",
1166 | "Vectorization...\n"
1167 | ]
1168 | }
1169 | ],
1170 | "source": [
1171 | "# text = ' '.join(reviews[:8000])\n",
1172 | "num_reviews = 2000\n",
1173 | "\n",
1174 |     "# Length of extracted character sequences\n",
1175 | "maxlen = 60\n",
1176 | "\n",
1177 | "# We sample a new sequence every `step` characters\n",
1178 | "step = 1\n",
1179 | "\n",
1180 | "# This holds our extracted sequences\n",
1181 | "sentences = []\n",
1182 | "\n",
1183 | "# This holds the targets (the follow-up characters)\n",
1184 | "next_chars = []\n",
1185 | "\n",
1186 | "for review in reviews[:num_reviews]:\n",
1187 | " for i in range(0, len(review) - maxlen, step):\n",
1188 | " sentences.append(review[i: i + maxlen])\n",
1189 | " next_chars.append(review[i + maxlen])\n",
1190 | "print('Number of sequences:', len(sentences))\n",
1191 | "\n",
1192 | "\n",
1193 | "# List of unique characters in the corpus\n",
1194 | "chars = sorted(list(set( ' '.join(reviews[:num_reviews]))))\n",
1195 | "print('Unique characters:', len(chars))\n",
1196 | "# Dictionary mapping unique characters to their index in `chars`\n",
1197 | "char_indices = dict((char, chars.index(char)) for char in chars)\n",
1198 | "\n",
1199 | "# Next, one-hot encode the characters into binary arrays.\n",
1200 | "print('Vectorization...')\n",
1201 |     "x = np.zeros((len(sentences), maxlen, len(chars)), dtype=bool)\n",
1202 |     "y = np.zeros((len(sentences), len(chars)), dtype=bool)\n",
1203 | "for i, sentence in enumerate(sentences):\n",
1204 | " for t, char in enumerate(sentence):\n",
1205 | " x[i, t, char_indices[char]] = 1\n",
1206 | " y[i, char_indices[next_chars[i]]] = 1"
1207 | ]
1208 | },
1209 | {
1210 | "cell_type": "markdown",
1211 | "metadata": {
1212 | "slideshow": {
1213 | "slide_type": "subslide"
1214 | }
1215 | },
1216 | "source": [
1217 | "## Building the Model"
1218 | ]
1219 | },
1220 | {
1221 | "cell_type": "code",
1222 | "execution_count": 30,
1223 | "metadata": {
1224 | "slideshow": {
1225 | "slide_type": "fragment"
1226 | }
1227 | },
1228 | "outputs": [
1229 | {
1230 | "name": "stdout",
1231 | "output_type": "stream",
1232 | "text": [
1233 | "_________________________________________________________________\n",
1234 | "Layer (type) Output Shape Param # \n",
1235 | "=================================================================\n",
1236 | "lstm (LSTM) (None, 60, 64) 44800 \n",
1237 | "_________________________________________________________________\n",
1238 | "lstm_1 (LSTM) (None, 32) 12416 \n",
1239 | "_________________________________________________________________\n",
1240 | "dense_6 (Dense) (None, 32) 1056 \n",
1241 | "_________________________________________________________________\n",
1242 | "dense_7 (Dense) (None, 110) 3630 \n",
1243 | "=================================================================\n",
1244 | "Total params: 61,902\n",
1245 | "Trainable params: 61,902\n",
1246 | "Non-trainable params: 0\n",
1247 | "_________________________________________________________________\n"
1248 | ]
1249 | }
1250 | ],
1251 | "source": [
1252 | "model = tf.keras.models.Sequential()\n",
1253 | "model.add(tf.keras.layers.LSTM(64, activation='tanh',\n",
1254 | " return_sequences=True,\n",
1255 | " input_shape=(maxlen, len(chars))))\n",
1256 | "model.add(tf.keras.layers.LSTM(32, activation='tanh'))\n",
1257 | "model.add(tf.keras.layers.Dense(32, activation='relu'))\n",
1258 | "model.add(tf.keras.layers.Dense(len(chars), activation='softmax'))\n",
1259 | "\n",
1260 | "optimizer = tf.keras.optimizers.Adam(lr=0.002)\n",
1261 | "model.compile(loss='categorical_crossentropy', optimizer=optimizer)\n",
1262 | "\n",
1263 | "model.summary()"
1264 | ]
1265 | },
1266 | {
1267 | "cell_type": "code",
1268 | "execution_count": 31,
1269 | "metadata": {
1270 | "slideshow": {
1271 | "slide_type": "subslide"
1272 | }
1273 | },
1274 | "outputs": [],
1275 | "source": [
1276 | "def sample(preds, temperature=1.0):\n",
1277 | " preds = np.asarray(preds).astype('float64')\n",
1278 | " preds = np.log(preds) / temperature\n",
1279 | " exp_preds = np.exp(preds)\n",
1280 | " preds = exp_preds / np.sum(exp_preds)\n",
1281 | " probas = np.random.multinomial(1, preds, 1)\n",
1282 | " return np.argmax(probas)"
1283 | ]
1284 | },
1285 | {
1286 | "cell_type": "code",
1287 | "execution_count": 32,
1288 | "metadata": {
1289 | "slideshow": {
1290 | "slide_type": "subslide"
1291 | }
1292 | },
1293 | "outputs": [
1294 | {
1295 | "data": {
1296 | "text/plain": [
1297 | "'la tagliata é abbondante e molto buona, gli antipasti buonissimi ma mi aspettavo qualcosa di piú!proprietario molto simpatico e alla mano!\\n'"
1298 | ]
1299 | },
1300 | "execution_count": 32,
1301 | "metadata": {},
1302 | "output_type": "execute_result"
1303 | }
1304 | ],
1305 | "source": [
1306 | "reviews[120]"
1307 | ]
1308 | },
1309 | {
1310 | "cell_type": "code",
1311 | "execution_count": null,
1312 | "metadata": {
1313 | "scrolled": false,
1314 | "slideshow": {
1315 | "slide_type": "subslide"
1316 | }
1317 | },
1318 | "outputs": [],
1319 | "source": [
1320 | "import random\n",
1321 | "import sys\n",
1322 | "\n",
1323 | "for e in range(1, 21):\n",
1324 | " print('epoch', e)\n",
1325 | " model.fit(x,y, batch_size=128, epochs=1)\n",
1326 | " \n",
1327 | " # Select a text seed at random\n",
1328 | " i = random.randint(0, 6000)\n",
1329 | " #select the same review each time\n",
1330 | " if e % 2 == 0:\n",
1331 | " generated_text = reviews[120][0: 0 + maxlen]\n",
1332 | " print('--- Generating from review: \"' + reviews[120] + '\"')\n",
1333 | "\n",
1334 | " for temperature in [0.2, 0.5, 1.0]:\n",
1335 | " generated_text_copy = generated_text\n",
1336 | " print('------ temperature:', temperature)\n",
1337 | " sys.stdout.write('SENTENCE: ')\n",
1338 | " sys.stdout.write(generated_text_copy)\n",
1339 | "\n",
1340 | " # We generate 100 characters\n",
1341 | " for i in range(100):\n",
1342 | " sampled = np.zeros((1, maxlen, len(chars)))\n",
1343 | " for t, char in enumerate(generated_text_copy):\n",
1344 | " sampled[0, t, char_indices[char]] = 1.\n",
1345 | "\n",
1346 | " preds = model.predict(sampled, verbose=0)[0]\n",
1347 | " next_index = sample(preds, temperature)\n",
1348 | " next_char = chars[next_index]\n",
1349 | "\n",
1350 | " generated_text_copy += next_char\n",
1351 | " #shift left the text\n",
1352 | " generated_text_copy = generated_text_copy[1:]\n",
1353 | "\n",
1354 | " sys.stdout.write(next_char)\n",
1355 | " sys.stdout.flush()\n",
1356 | " print() "
1357 | ]
1358 | },
1359 | {
1360 | "cell_type": "code",
1361 | "execution_count": null,
1362 | "metadata": {
1363 | "slideshow": {
1364 | "slide_type": "skip"
1365 | }
1366 | },
1367 | "outputs": [],
1368 | "source": [
1369 | "model.save('language.h5')"
1370 | ]
1371 | },
1372 | {
1373 | "cell_type": "markdown",
1374 | "metadata": {
1375 | "slideshow": {
1376 | "slide_type": "slide"
1377 | }
1378 | },
1379 | "source": [
1380 | "# Other Interesting Topics\n",
1381 | "\n",
1382 |     "- use language models and autoencoders as **pretraining**\n",
1383 |     "- use autoencoders for anomaly detection\n",
1384 |     "- sequence to sequence with attention\n",
1385 |     "- siamese networks for one-shot learning\n",
1386 | "- unsupervised sentiment classification with sentiment neuron\n",
1387 | "- Generative Adversarial Networks\n"
1388 | ]
1389 | },
1390 | {
1391 | "cell_type": "code",
1392 | "execution_count": null,
1393 | "metadata": {},
1394 | "outputs": [],
1395 | "source": []
1396 | }
1397 | ],
1398 | "metadata": {
1399 | "celltoolbar": "Slideshow",
1400 | "kernelspec": {
1401 | "display_name": "Python 3",
1402 | "language": "python",
1403 | "name": "python3"
1404 | },
1405 | "language_info": {
1406 | "codemirror_mode": {
1407 | "name": "ipython",
1408 | "version": 3
1409 | },
1410 | "file_extension": ".py",
1411 | "mimetype": "text/x-python",
1412 | "name": "python",
1413 | "nbconvert_exporter": "python",
1414 | "pygments_lexer": "ipython3",
1415 | "version": "3.6.8"
1416 | },
1417 | "livereveal": {
1418 | "auto_select": "code",
1419 | "scroll": true,
1420 | "slideNumber": "c/t"
1421 | }
1422 | },
1423 | "nbformat": 4,
1424 | "nbformat_minor": 2
1425 | }
1426 |
--------------------------------------------------------------------------------
/private.md:
--------------------------------------------------------------------------------
1 | $ANSIBLE_VAULT;1.1;AES256
2 | 34363264386238633062336466393666376333653032386462323637613664353463343861653831
3 | 3031393634323230303663386536663634393737333136340a623435346662323364346433386539
4 | 36636564386435386630643664616566306363623934326431653832373035383236356137346337
5 | 3634386566633737350a656366636630313536626636633162646663356130363831353538663566
6 | 65373238383330386262663031623831386166386430323666376333663737656535666435633765
7 | 63623734663339313430663039323434373939643263363435633064666332326536666561633631
8 | 33643963326636373930643965623332613364306635666365643737336432643766646435666361
9 | 37633138356663313838653562653032333261353861613861343666363836643262623938323361
10 | 66666630643734663032343966626539383832323637313662313330306138656630363639353936
11 | 31393566633830376137363935633530346636356436666434313730376334643061386266636538
12 | 64393735646466313330323166633064326531636635386262353661376333306632623664643734
13 | 34613065626330663161
14 |
--------------------------------------------------------------------------------
/readme.md:
--------------------------------------------------------------------------------
1 | # Why Neural Networks
2 |
3 | This repository contains all the supporting material for the talk given at [Apache Spark and More Milano](https://www.meetup.com/Spark-More-Milano/).
4 | The slides are interactive and can be run during the presentation.
5 |
6 | ## Setup
7 |
8 | 1. setup [anaconda](https://www.anaconda.com/download/#linux) or [miniconda](https://conda.io/miniconda.html)
9 | 2. create a virtual environment
10 | ```
11 | # please use python 3 :)
12 | $conda create --name dl python=3.6
13 |
14 | #activate the environment
15 | source activate dl
16 | ```
17 | 3. install the needed libraries
18 | ```
19 | $pip install -r requirements.txt
20 | ```
21 |
22 | 4. Install RISE from conda forge
23 | ```
24 | $conda install -c conda-forge rise
25 | ```
26 |
27 | 5. run the [jupyter](http://jupyter.org/) notebook server
28 | ```
29 | $ jupyter notebook
30 | ```
31 |
32 | If you want to get more information about anaconda and jupyter check the [presentation + tutorials](https://github.com/fabiofumarola/anaconda_jupyter_tutorial) made for the [School of Artificial Intelligence at PiCampus](http://picampus-school.com/).
33 |
34 | ## Reference and Interesting Links
35 |
36 | ### Suggested Books
37 |
38 | 1. [Manning: Deep Learning with Python](https://www.manning.com/books/deep-learning-with-python): a really good high-level introduction to the topics and ideas
39 | 2. [Deep Learning](http://www.deeplearningbook.org/): very good for a deep dive
40 | 3. [Elements of Statistical Learning](https://web.stanford.edu/~hastie/Papers/ESLII.pdf)
41 | 4. [Machine Learning: A Probabilistic Perspective](https://doc.lagout.org/science/Artificial%20Intelligence/Machine%20learning/Machine%20Learning_%20A%20Probabilistic%20Perspective%20%5BMurphy%202012-08-24%5D.pdf)
42 |
43 | ### MOOC
44 |
45 | 1. [coursera deep learning specialization](https://www.coursera.org/specializations/deep-learning)
46 | 2. [MIT intro to deep learning](http://introtodeeplearning.com/index.html)
47 | 3. [Intro to TensorFlow for Deep Learning by TensorFlow](https://eu.udacity.com/course/intro-to-tensorflow-for-deep-learning--ud187)
48 | 4. [Intro to Deep Learning with PyTorch by Facebook AI](https://www.udacity.com/course/deep-learning-pytorch--ud188)
49 |
50 | ### Pretrained Models
51 |
52 | 1. [model zoo](https://modelzoo.co/)
53 | 2. [tensorflow-hub](https://www.tensorflow.org/hub/)
54 | 3. [tensorflow models](https://github.com/tensorflow/models)
55 | 4. [papers with code](https://paperswithcode.com/)
56 |
57 | ### Libraries
58 | 
59 | 1. [keras](https://keras.io/)
60 | 2. [tensorflow](https://www.tensorflow.org/)
61 | 3. [pytorch](https://pytorch.org/)
62 |
63 |
64 | ## Bonus
65 |
66 | In the data folder there is a dataset that you can use as an exercise ;)
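
To get started with the exercise, you can reuse the same preprocessing the talk's notebook applies to `data/train.csv`: lowercase the raw lines and keep only the reviews whose length falls in a manageable range. A minimal sketch (the function name `filter_reviews` is illustrative, not part of the repo's code):

```python
def filter_reviews(lines, min_len=100, max_len=200):
    """Lowercase raw review lines and keep those whose length is in [min_len, max_len]."""
    reviews = [line.lower() for line in lines]
    return [r for r in reviews if min_len <= len(r) <= max_len]


# tiny demo with synthetic "reviews" of different lengths;
# with the real dataset you would pass open('./data/train.csv').readlines()
sample = ['Great Food! ' * 12, 'ok', 'LOVELY place, ' * 10]
kept = filter_reviews(sample)
print(len(kept))  # the two long reviews survive, the 2-character one does not
```

Keeping the length range tight makes the character-level sequences more uniform, which is why the notebook trains only on reviews between 100 and 200 characters.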
67 |
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | tensorflow
2 | keras
3 | pandas
4 | matplotlib
5 | numpy
6 | RISE==5.6.0.dev3
7 | jupyter
8 | tensorboard
9 |
--------------------------------------------------------------------------------
/utils.py:
--------------------------------------------------------------------------------
1 | from IPython.display import clear_output, Image, display, HTML
2 | import tensorflow as tf
3 | import numpy as np
4 |
5 | def strip_consts(graph_def, max_const_size=32):
6 |     """Strip large constant values from graph_def."""
7 |     strip_def = tf.compat.v1.GraphDef()
8 |     for n0 in graph_def.node:
9 |         n = strip_def.node.add()
10 |         n.MergeFrom(n0)
11 |         if n.op == 'Const':
12 |             tensor = n.attr['value'].tensor
13 |             size = len(tensor.tensor_content)
14 |             if size > max_const_size:
15 |                 tensor.tensor_content = tf.compat.as_bytes("<stripped %d bytes>" % size)
16 |     return strip_def
17 | 
18 | def show_graph(graph_def, max_const_size=32):
19 |     """Visualize TensorFlow graph."""
20 |     if hasattr(graph_def, 'as_graph_def'):
21 |         graph_def = graph_def.as_graph_def()
22 |     strip_def = strip_consts(graph_def, max_const_size=max_const_size)
23 |     code = """
24 |         <script>
25 |           function load() {{
26 |             document.getElementById("{id}").pbtxt = {data};
27 |           }}
28 |         </script>
29 |         <link rel="import" href="https://tensorboard.appspot.com/tf-graph-basic.build.html" onload=load()>
30 |         <div style="height:600px">
31 |           <tf-graph-basic id="{id}"></tf-graph-basic>
32 |         </div>
33 |     """.format(data=repr(str(strip_def)), id='graph'+str(np.random.rand()))
34 | 
35 |     iframe = """
36 |         <iframe seamless style="width:1200px;height:620px;border:0" srcdoc="{}"></iframe>
37 |     """.format(code.replace('"', '&quot;'))
38 |     display(HTML(iframe))
--------------------------------------------------------------------------------