├── README.md
├── autoencoder_7_4_symbol.ipynb
├── autoencoder_dynamic.ipynb
├── git-photo
│   ├── cons_1.PNG
│   ├── cons_2.PNG
│   ├── cons_4.PNG
│   ├── plot_1.PNG
│   ├── plot_2.PNG
│   ├── research_paper_1.PNG
│   └── research_paper_2.PNG
└── plotting
    ├── plot1.m
    └── plot2.m
/README.md:
--------------------------------------------------------------------------------
1 | # AutoEncoder-Based-Communication-System
2 | 
3 | Implementation and results of the autoencoder-based communication system from the research paper "An Introduction to Deep Learning for the Physical Layer": http://ieeexplore.ieee.org/document/8054694/
4 | 
5 | This repo is effectively an implementation of the autoencoder-based communication system from the research paper "An Introduction to Deep Learning
6 | for the Physical Layer" by Tim O'Shea and Jakob Hoydis. During my Wireless Communication Lab course, I worked on this research paper
7 | and reproduced its results.
8 | The idea of a deep-learning-based communication system is new, and such systems offer many advantages. This paper takes a
9 | completely different approach from many other papers and introduces deep learning into the physical layer.
10 | 
11 | # Abstract of Research Paper
12 | 
13 | *We present and discuss several novel applications of deep learning for the physical layer. By interpreting a communications system as an
14 | autoencoder, we develop a fundamental new way to think about communications system design as an end-to-end reconstruction task that seeks
15 | to jointly optimize transmitter and receiver components in a single process. We show how this idea can be extended to networks of multiple
16 | transmitters and receivers and present the concept of radio transformer networks as a means to incorporate expert domain knowledge in the
17 | machine learning model.
Lastly, we demonstrate the application of convolutional neural networks on raw IQ samples for modulation
18 | classification which achieves competitive accuracy with respect to traditional schemes relying on expert features. This paper is concluded
19 | with a discussion of open challenges and areas for future investigation.*
20 | 
21 | From "An Introduction to Deep Learning for the Physical Layer" http://ieeexplore.ieee.org/document/8054694/ written by Tim O'Shea and Jakob Hoydis
22 | 
23 | # Requirements
24 | 
25 | - TensorFlow
26 | - Keras
27 | - NumPy
28 | - Matplotlib
29 | 
30 | # Note
31 | The given Jupyter notebook is dynamic enough to train any given (n,k) autoencoder, but to get optimal results one has to manually tweak the
32 | learning rate and the number of epochs.
33 | Plots were generated by a MATLAB script, which I am not providing for now. Anyone can plot the results in MATLAB by training the autoencoder,
34 | copy-pasting the BER array, and plotting it in MATLAB.
35 | All re-generated results below were produced with the autoencoder_dynamic.ipynb file.
36 | 
37 | # Result
38 | Re-generated Result | Research Paper
39 | :-----------------------------:|:-------------------------:
40 | ![BER Performance of (7,4) AutoEncoder](https://github.com/immortal3/AutoEncoder-Based-Communication-System/blob/master/git-photo/plot_1.PNG "BER Performance of (7,4) AutoEncoder") | ![Research Paper Result-1](https://github.com/immortal3/AutoEncoder-Based-Communication-System/blob/master/git-photo/research_paper_1.PNG)
41 | ![BER Performance of R=1 AutoEncoders](https://github.com/immortal3/AutoEncoder-Based-Communication-System/blob/master/git-photo/plot_2.PNG "BER Performance of R=1 AutoEncoders") | ![Research Paper Result-2](https://github.com/immortal3/AutoEncoder-Based-Communication-System/blob/master/git-photo/research_paper_2.PNG)
42 | 
43 | # Constellation diagram
44 | 
45 | ### (2,2) AutoEncoder's Constellation diagram
46 | 
47 | The following constellation diagrams were learned by the autoencoder after training.
48 | ![(2,2) Autoencoder constellation diagram](https://github.com/immortal3/AutoEncoder-Based-Communication-System/blob/master/git-photo/cons_1.PNG) 49 | 50 | 51 | ### (2,4) AutoEncoder Constellation diagram 52 | ![(2,4) Autoencoder constellation diagram](https://github.com/immortal3/AutoEncoder-Based-Communication-System/blob/master/git-photo/cons_2.PNG) 53 | 54 | 55 | 56 | 57 | 58 | 59 | -------------------------------------------------------------------------------- /autoencoder_7_4_symbol.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 34, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "# importing libs\n", 10 | "import numpy as np\n", 11 | "import tensorflow as tf\n", 12 | "from keras.layers import Input, Dense, GaussianNoise\n", 13 | "from keras.models import Model\n", 14 | "from keras import regularizers\n", 15 | "from keras.layers.normalization import BatchNormalization\n", 16 | "from keras.optimizers import SGD\n", 17 | "import random as rn" 18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": 35, 23 | "metadata": {}, 24 | "outputs": [ 25 | { 26 | "name": "stdout", 27 | "output_type": "stream", 28 | "text": [ 29 | "M: 16 k: 4\n" 30 | ] 31 | } 32 | ], 33 | "source": [ 34 | "# defining parameters\n", 35 | "M = 16 \n", 36 | "k = np.log2(M)\n", 37 | "k = int(k)\n", 38 | "print ('M:',M,'k:',k)" 39 | ] 40 | }, 41 | { 42 | "cell_type": "code", 43 | "execution_count": 36, 44 | "metadata": { 45 | "collapsed": true 46 | }, 47 | "outputs": [], 48 | "source": [ 49 | "#generating data of size N\n", 50 | "N = 10000\n", 51 | "label = np.random.randint(M,size=N)" 52 | ] 53 | }, 54 | { 55 | "cell_type": "code", 56 | "execution_count": 37, 57 | "metadata": { 58 | "collapsed": true 59 | }, 60 | "outputs": [], 61 | "source": [ 62 | "# creating one hot encoded vectors\n", 63 | "data = []\n", 64 | "for i in label:\n", 65 | " temp = 
np.zeros(M)\n", 66 | " temp[i] = 1\n", 67 | " data.append(temp)" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": 38, 73 | "metadata": {}, 74 | "outputs": [ 75 | { 76 | "name": "stdout", 77 | "output_type": "stream", 78 | "text": [ 79 | "(10000, 16)\n" 80 | ] 81 | } 82 | ], 83 | "source": [ 84 | "data = np.array(data)\n", 85 | "print (data.shape)" 86 | ] 87 | }, 88 | { 89 | "cell_type": "code", 90 | "execution_count": 39, 91 | "metadata": {}, 92 | "outputs": [ 93 | { 94 | "name": "stdout", 95 | "output_type": "stream", 96 | "text": [ 97 | "11 [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]\n", 98 | "6 [ 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 99 | "4 [ 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 100 | "5 [ 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 101 | "2 [ 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n", 102 | "12 [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]\n", 103 | "13 [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]\n", 104 | "14 [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]\n", 105 | "14 [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 
0.]\n" 106 | ] 107 | } 108 | ], 109 | "source": [ 110 | "temp_check = [17,23,45,67,89,96,72,250,350]\n", 111 | "for i in temp_check:\n", 112 | " print(label[i],data[i])" 113 | ] 114 | }, 115 | { 116 | "cell_type": "code", 117 | "execution_count": 40, 118 | "metadata": {}, 119 | "outputs": [ 120 | { 121 | "name": "stdout", 122 | "output_type": "stream", 123 | "text": [ 124 | "7\n" 125 | ] 126 | } 127 | ], 128 | "source": [ 129 | "R = 4/7\n", 130 | "n_channel = 7\n", 131 | "print (int(k/R))\n", 132 | "input_signal = Input(shape=(M,))\n", 133 | "encoded = Dense(M, activation='relu')(input_signal)\n", 134 | "encoded1 = Dense(n_channel, activation='linear')(encoded)\n", 135 | "encoded2 = BatchNormalization()(encoded1)\n", 136 | "\n", 137 | "EbNo_train = 5.01187 # coverted 7 db of EbNo\n", 138 | "encoded3 = GaussianNoise(np.sqrt(1/(2*R*EbNo_train)))(encoded2)\n", 139 | "\n", 140 | "decoded = Dense(M, activation='relu')(encoded3)\n", 141 | "decoded1 = Dense(M, activation='softmax')(decoded)\n", 142 | "\n", 143 | "autoencoder = Model(input_signal, decoded1)\n", 144 | "#sgd = SGD(lr=0.001)\n", 145 | "autoencoder.compile(optimizer='adam', loss='categorical_crossentropy')" 146 | ] 147 | }, 148 | { 149 | "cell_type": "code", 150 | "execution_count": 41, 151 | "metadata": {}, 152 | "outputs": [ 153 | { 154 | "name": "stdout", 155 | "output_type": "stream", 156 | "text": [ 157 | "_________________________________________________________________\n", 158 | "Layer (type) Output Shape Param # \n", 159 | "=================================================================\n", 160 | "input_5 (InputLayer) (None, 16) 0 \n", 161 | "_________________________________________________________________\n", 162 | "dense_9 (Dense) (None, 16) 272 \n", 163 | "_________________________________________________________________\n", 164 | "dense_10 (Dense) (None, 7) 119 \n", 165 | "_________________________________________________________________\n", 166 | "batch_normalization_3 (Batch (None, 7) 28 \n", 
167 | "_________________________________________________________________\n", 168 | "gaussian_noise_3 (GaussianNo (None, 7) 0 \n", 169 | "_________________________________________________________________\n", 170 | "dense_11 (Dense) (None, 16) 128 \n", 171 | "_________________________________________________________________\n", 172 | "dense_12 (Dense) (None, 16) 272 \n", 173 | "=================================================================\n", 174 | "Total params: 819\n", 175 | "Trainable params: 805\n", 176 | "Non-trainable params: 14\n", 177 | "_________________________________________________________________\n", 178 | "None\n" 179 | ] 180 | } 181 | ], 182 | "source": [ 183 | "print (autoencoder.summary())" 184 | ] 185 | }, 186 | { 187 | "cell_type": "code", 188 | "execution_count": 42, 189 | "metadata": { 190 | "collapsed": true 191 | }, 192 | "outputs": [], 193 | "source": [ 194 | "N_val = 1500\n", 195 | "val_label = np.random.randint(M,size=N_val)\n", 196 | "val_data = []\n", 197 | "for i in val_label:\n", 198 | " temp = np.zeros(M)\n", 199 | " temp[i] = 1\n", 200 | " val_data.append(temp)\n", 201 | "val_data = np.array(val_data)" 202 | ] 203 | }, 204 | { 205 | "cell_type": "code", 206 | "execution_count": 43, 207 | "metadata": { 208 | "scrolled": true 209 | }, 210 | "outputs": [ 211 | { 212 | "name": "stdout", 213 | "output_type": "stream", 214 | "text": [ 215 | "Train on 10000 samples, validate on 1500 samples\n", 216 | "Epoch 1/17\n", 217 | "10000/10000 [==============================] - 0s - loss: 2.4397 - val_loss: 2.6391\n", 218 | "Epoch 2/17\n", 219 | "10000/10000 [==============================] - 0s - loss: 1.9941 - val_loss: 2.4919\n", 220 | "Epoch 3/17\n", 221 | "10000/10000 [==============================] - 0s - loss: 1.6196 - val_loss: 2.2910\n", 222 | "Epoch 4/17\n", 223 | "10000/10000 [==============================] - 0s - loss: 1.2889 - val_loss: 2.0355\n", 224 | "Epoch 5/17\n", 225 | "10000/10000 [==============================] - 0s - 
loss: 0.9983 - val_loss: 1.7330\n", 226 | "Epoch 6/17\n", 227 | "10000/10000 [==============================] - 0s - loss: 0.7578 - val_loss: 1.4023\n", 228 | "Epoch 7/17\n", 229 | "10000/10000 [==============================] - 0s - loss: 0.5730 - val_loss: 1.0829\n", 230 | "Epoch 8/17\n", 231 | "10000/10000 [==============================] - 0s - loss: 0.4313 - val_loss: 0.7943\n", 232 | "Epoch 9/17\n", 233 | "10000/10000 [==============================] - 0s - loss: 0.3188 - val_loss: 0.5534\n", 234 | "Epoch 10/17\n", 235 | "10000/10000 [==============================] - 0s - loss: 0.2373 - val_loss: 0.3709\n", 236 | "Epoch 11/17\n", 237 | "10000/10000 [==============================] - 0s - loss: 0.1768 - val_loss: 0.2380\n", 238 | "Epoch 12/17\n", 239 | "10000/10000 [==============================] - 0s - loss: 0.1325 - val_loss: 0.1516\n", 240 | "Epoch 13/17\n", 241 | "10000/10000 [==============================] - 0s - loss: 0.1031 - val_loss: 0.0992\n", 242 | "Epoch 14/17\n", 243 | "10000/10000 [==============================] - 0s - loss: 0.0803 - val_loss: 0.0669\n", 244 | "Epoch 15/17\n", 245 | "10000/10000 [==============================] - 0s - loss: 0.0637 - val_loss: 0.0467\n", 246 | "Epoch 16/17\n", 247 | "10000/10000 [==============================] - 0s - loss: 0.0544 - val_loss: 0.0343\n", 248 | "Epoch 17/17\n", 249 | "10000/10000 [==============================] - 0s - loss: 0.0440 - val_loss: 0.0258\n" 250 | ] 251 | }, 252 | { 253 | "data": { 254 | "text/plain": [ 255 | "" 256 | ] 257 | }, 258 | "execution_count": 43, 259 | "metadata": {}, 260 | "output_type": "execute_result" 261 | } 262 | ], 263 | "source": [ 264 | "autoencoder.fit(data, data,\n", 265 | " epochs=17,\n", 266 | " batch_size=300,\n", 267 | " validation_data=(val_data, val_data))" 268 | ] 269 | }, 270 | { 271 | "cell_type": "code", 272 | "execution_count": 44, 273 | "metadata": { 274 | "collapsed": true 275 | }, 276 | "outputs": [], 277 | "source": [ 278 | "from keras.models 
import load_model\n", 279 | "#autoencoder.save('4_7_symbol_autoencoder_v_best.model')" 280 | ] 281 | }, 282 | { 283 | "cell_type": "code", 284 | "execution_count": 45, 285 | "metadata": { 286 | "collapsed": true 287 | }, 288 | "outputs": [], 289 | "source": [ 290 | "#autoencoder_loaded = load_model('4_7_symbol_autoencoder_v_best.model')" 291 | ] 292 | }, 293 | { 294 | "cell_type": "code", 295 | "execution_count": 46, 296 | "metadata": { 297 | "collapsed": true 298 | }, 299 | "outputs": [], 300 | "source": [ 301 | "encoder = Model(input_signal, encoded2)" 302 | ] 303 | }, 304 | { 305 | "cell_type": "code", 306 | "execution_count": 47, 307 | "metadata": { 308 | "collapsed": true 309 | }, 310 | "outputs": [], 311 | "source": [ 312 | "encoded_input = Input(shape=(n_channel,))\n", 313 | "\n", 314 | "deco = autoencoder.layers[-2](encoded_input)\n", 315 | "deco = autoencoder.layers[-1](deco)\n", 316 | "# create the decoder model\n", 317 | "decoder = Model(encoded_input, deco)" 318 | ] 319 | }, 320 | { 321 | "cell_type": "code", 322 | "execution_count": 48, 323 | "metadata": { 324 | "collapsed": true 325 | }, 326 | "outputs": [], 327 | "source": [ 328 | "N = 45000\n", 329 | "test_label = np.random.randint(M,size=N)\n", 330 | "test_data = []\n", 331 | "\n", 332 | "for i in test_label:\n", 333 | " temp = np.zeros(M)\n", 334 | " temp[i] = 1\n", 335 | " test_data.append(temp)\n", 336 | " \n", 337 | "test_data = np.array(test_data)" 338 | ] 339 | }, 340 | { 341 | "cell_type": "code", 342 | "execution_count": 49, 343 | "metadata": {}, 344 | "outputs": [ 345 | { 346 | "name": "stdout", 347 | "output_type": "stream", 348 | "text": [ 349 | "1.0 13\n" 350 | ] 351 | } 352 | ], 353 | "source": [ 354 | "temp_test = 6\n", 355 | "print (test_data[temp_test][test_label[temp_test]],test_label[temp_test])" 356 | ] 357 | }, 358 | { 359 | "cell_type": "code", 360 | "execution_count": 50, 361 | "metadata": {}, 362 | "outputs": [ 363 | { 364 | "data": { 365 | "text/plain": [ 366 | "" 367 | ] 
368 | }, 369 | "execution_count": 50, 370 | "metadata": {}, 371 | "output_type": "execute_result" 372 | } 373 | ], 374 | "source": [ 375 | "autoencoder" 376 | ] 377 | }, 378 | { 379 | "cell_type": "code", 380 | "execution_count": 51, 381 | "metadata": { 382 | "collapsed": true 383 | }, 384 | "outputs": [], 385 | "source": [ 386 | "def frange(x, y, jump):\n", 387 | " while x < y:\n", 388 | " yield x\n", 389 | " x += jump" 390 | ] 391 | }, 392 | { 393 | "cell_type": "code", 394 | "execution_count": 52, 395 | "metadata": {}, 396 | "outputs": [ 397 | { 398 | "name": "stdout", 399 | "output_type": "stream", 400 | "text": [ 401 | "SNR: -4 BER: 0.341422222222\n", 402 | "SNR: -3.5 BER: 0.306422222222\n", 403 | "SNR: -3.0 BER: 0.266755555556\n", 404 | "SNR: -2.5 BER: 0.233888888889\n", 405 | "SNR: -2.0 BER: 0.195488888889\n", 406 | "SNR: -1.5 BER: 0.168711111111\n", 407 | "SNR: -1.0 BER: 0.132488888889\n", 408 | "SNR: -0.5 BER: 0.109\n", 409 | "SNR: 0.0 BER: 0.0812444444444\n", 410 | "SNR: 0.5 BER: 0.0620222222222\n", 411 | "SNR: 1.0 BER: 0.0431555555556\n", 412 | "SNR: 1.5 BER: 0.0314444444444\n", 413 | "SNR: 2.0 BER: 0.0214222222222\n", 414 | "SNR: 2.5 BER: 0.0152888888889\n", 415 | "SNR: 3.0 BER: 0.00926666666667\n", 416 | "SNR: 3.5 BER: 0.00515555555556\n", 417 | "SNR: 4.0 BER: 0.00328888888889\n", 418 | "SNR: 4.5 BER: 0.00164444444444\n", 419 | "SNR: 5.0 BER: 0.000644444444444\n", 420 | "SNR: 5.5 BER: 0.000355555555556\n", 421 | "SNR: 6.0 BER: 0.000133333333333\n", 422 | "SNR: 6.5 BER: 0.000155555555556\n", 423 | "SNR: 7.0 BER: 4.44444444444e-05\n", 424 | "SNR: 7.5 BER: 2.22222222222e-05\n", 425 | "SNR: 8.0 BER: 0.0\n" 426 | ] 427 | } 428 | ], 429 | "source": [ 430 | "EbNodB_range = list(frange(-4,8.5,0.5))\n", 431 | "ber = [None]*len(EbNodB_range)\n", 432 | "for n in range(0,len(EbNodB_range)):\n", 433 | " EbNo=10.0**(EbNodB_range[n]/10.0)\n", 434 | " noise_std = np.sqrt(1/(2*R*EbNo))\n", 435 | " noise_mean = 0\n", 436 | " no_errors = 0\n", 437 | " nn = N\n", 438 | " 
noise = noise_std * np.random.randn(nn,n_channel)\n", 439 | " encoded_signal = encoder.predict(test_data) \n", 440 | " final_signal = encoded_signal + noise\n", 441 | " pred_final_signal = decoder.predict(final_signal)\n", 442 | " pred_output = np.argmax(pred_final_signal,axis=1)\n", 443 | " no_errors = (pred_output != test_label)\n", 444 | " no_errors = no_errors.astype(int).sum()\n", 445 | " ber[n] = no_errors / nn \n", 446 | " print ('SNR:',EbNodB_range[n],'BER:',ber[n])" 447 | ] 448 | }, 449 | { 450 | "cell_type": "code", 451 | "execution_count": 53, 452 | "metadata": {}, 453 | "outputs": [ 454 | { 455 | "data": { 456 | "text/plain": [ 457 | "" 458 | ] 459 | }, 460 | "execution_count": 53, 461 | "metadata": {}, 462 | "output_type": "execute_result" 463 | } 464 | ], 465 | "source": [ 466 | "import matplotlib.pyplot as plt\n", 467 | "plt.plot(EbNodB_range, ber, 'bo',label='Autoencoder(7,4)')\n", 468 | "#plt.plot(list(EbNodB_range), ber_theory, 'ro-',label='BPSK BER')\n", 469 | "plt.yscale('log')\n", 470 | "plt.xlabel('SNR Range')\n", 471 | "plt.ylabel('Block Error Rate')\n", 472 | "plt.grid()\n", 473 | "plt.legend(loc='upper right',ncol = 1)" 474 | ] 475 | }, 476 | { 477 | "cell_type": "code", 478 | "execution_count": 54, 479 | "metadata": {}, 480 | "outputs": [ 481 | { 482 | "data": { 483 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAY4AAAEKCAYAAAAFJbKyAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3XuYVPWd5/H3V9QAQhyjyOME6WYGr2kuSosSYqA1uCQq\nbNQY2A47iSRsfCCayyaa4Iy6idEncUxiIOPijWRlu+P9QZeJjnJRiSigRBQvIUobwAREBVoGI/Z3\n/zhVbXfTVX1OVZ+qU6c+r+epp+v86lx+v4bub//u5u6IiIiEdUC5MyAiIpVFgUNERCJR4BARkUgU\nOEREJBIFDhERiUSBQ0REIlHgEBGRSBQ4REQkEgUOERGJ5MByZyAORxxxhNfW1hZ07bvvvsshhxzS\nuxkqE5UledJSDlBZkqjYcqxdu/ZNdx/U03mpDBy1tbWsWbOmoGuXL1/OxIkTezdDZaKyJE9aygEq\nSxIVWw4zawlzXqqaqszsXDNbsHPnznJnRUQktVIVONz9AXefdeihh5Y7KyIiqZWqwCEiIvFLZR+H\niET3/vvvs3nzZvbu3Rvq/EMPPZQXX3wx5lyVRlrKErYcffv2ZciQIRx00EEFPUeBQ0QA2Lx5MwMH\nDqS2thYz6/H83bt3M3DgwBLkLH5pKUuYcrg7O3bsYPPmzQwbNqyg56SqqaqYzvFFi6C2Fs44YwK1\ntcGxSDXZu3cvhx9+eKigIZXLzDj88MND1yy7k6rAUWjn+KJFMGsWtLSAu9HSEhwreEi1UdCoDsX+\nO6cqcBRq7lzYs6dz2p49QXo+2VrKAQegWoqIVA0FDuD116OlQ9daCqqliPSS+++/HzPjpZde6vHc\nn//85+zp+ldfmS1cuJA5c+ZEvu7ZZ59l5syZAPz0pz9l9OjRjB49mrq6Ovr06cNbb72V89opU6ZQ\nV1fXfjxv3jxuu+226JkPSYEDGDo0WjoUXksRSYs77zwwlhp3U1MTn/rUp2hqaurx3CQGjqj27dsH\nwI9//GMuueQSAL773e+ybt061q1bx7XXXsuECRP42Mc+1u319957LwMGDOiUdtFFF/HLX/4ytjwr\ncADXXAP9+3dO698/SM+l0FqKmrYkDRYtgm98o2+v17hbW1t54oknuPXWW2lubgaCZTTOOeec9nPm\nzJnDwoULufHGG9m6dSsNDQ00NDQAQdAZMWIEdXV1XHbZZe3XPPzww4wbN46TTz6ZL3zhC7S2tgLB\n8kRXXnklp59+OiNGjGiv5bS2tvKVr3yFESNGMHLkSO65556897/99ts59thjGTt2LCtXrmxP3759\nO+effz6nnHIKp5xySvtnV111FTNmzGD8+PHMmDGD3bt389xzzzFq1Kj9vidNTU1Mnz495/frhhtu\n4IorruiU3r9/f2pra3n66adDfucjcvfUvIBzgQXDhw/3qO64w72mxt2szWtqguN8amrcgx+Zzq+a\nmtz379+/87n9+/f8nGIsW7YsvpuXWFrKkuRybNiwIfS5Uf//h3XHHXf4RRdd5O7u48aN8zVr1viy\nZcv87LPPbj9n9uzZfvvtt2fyUePbt293d/ctW7b40Ucf7du2bfP333/fGxoa/L777vPt27f76aef\n7q2tre7uft111/nVV1/dfv2NN97ou3bt8vnz5/vMmTPd3f173/ueX3rppe3PfOutt3Lef+vWre3p\n7733nn/yk5/02bNnu7v79OnT/fHHH3d395aWFj/++OPd3f3KK6/0k08+2ffs2ePu7kuXLvXzzjtv\nv+/Hu+++64cddpjv2LGj2+/XN7/5Tb/33nv9tdde80984hO+a9eu9s9+9KMf+fXXX5/ze93dvzew\nxkP8rk1VjcOLWHKksRE2bYKlS1ewaVNwnE/UWoqatiRNCqlxh9HU1MS0adMAmDZtWqjmqqzVq1cz\nceJEBg0axIEHHkhjYyOPPfYYq1atYsOGDYwfP57Ro0fz61//m
paWD9fyO++88wAYM2YMmzZtAuCR\nRx5h9uzZ7eccdthhOe//1FNPtacffPDBfPGLX2y/7pFHHmHOnDmMHj2aKVOmsGvXrvbazpQpU+jX\nrx8Ab7zxBoMG7b8o7QMPPMD48eO7baZat24df/rTn/j85z/f7ffjyCOPZOvWraG/f1FoAmCBsoFl\n7tzgh2Xo0CBo5Ao4hf6gLVoU/hkipTJ0aNA81V16od566y2WLl3K+vXrMTM++OADzIypU6fS1tbW\nfl7U+QfuzqRJk3IGoY985CMA9OnTp72/obe0tbWxatUq+vbtu99nHZc/79evX7flam5uztlM9eST\nT7JmzRpqa2vZt28f27Zt43Of+xyPP/44EHyfsoGpt6WqxlFq2VpKWxs91lIK6YDXyC1JqmuugX79\nvFNaT/2CPbn77ruZMWMGLS0tbNq0iT//+c8MGzaMtrY2NmzYwHvvvcc777zDo48+2n7NwIED2b17\nNwBjx45lxYoVvPnmm3zwwQc0NTUxYcIETjvtNFauXMnGjRuBYM+KV155JW9eJk2axPz589uP3377\n7Zz3P/XUU1mxYgU7duzg/fff56677mq/7qyzzurUSb1u3bpun3fCCSe05y9r586drFixgqlTp3ZK\nP/PMM9myZQsXX3wxW7duZdOmTTzxxBMce+yxLFmypP28V155pdNIq96kwFEihXTAq3lLkqqxEX75\ny73U1IAZ1NTAggXF1Yabmpr2a3Y5//zzaW5u5sILL6Suro4LL7yQk046qf3zWbNmMXnyZBoaGjjq\nqKO47rrraGhoYNSoUYwZM4apU6cyaNAgFi5cyPTp0xk5ciTjxo3rcajvFVdcwdtvv01dXR2jRo1i\n2bJlOe9/1FFHcdVVVzFu3DjGjx/PCSec0H6fG2+8kTVr1jBy5EhOPPFEbrrppm6fd/zxx7Nz5872\nIAhw3333cdZZZ3WqmbS1tbFx48acI6w6WrlyJZMmTerxvIKE6QiptNeYMWNydgj1JM7Oyw874D1U\nB7xZ9x2QZuGel+SO2KjSUpYklyNK57i7d+qIrXRJKMsNN9zgN998c95z1q9f79/61rdyfp4txzPP\nPONf+tKX8t5LneMVIkrTFhTWvAVad0ukEl188cXt/S251NXVccMNN/R4rzfffJMf/vCHvZW1/Shw\nJFghzVtad0ukMvXt25cZM2b0yr0mTZpEbW1tr9yrO6kKHGnbOraxMWg3jtKOrH4RKUbQWiFpV+y/\nc6oCh6dw69iozVvFDPvVrPbq1rdvX3bs2KHgkXLuwX4c3Q0RDkvzOFKmkPH12eatbE0l27wFmjNS\nTYYMGcLmzZvZvn17qPP37t1b1C+fJElLWcKWI7sDYKEUOFLmmms6BwEobtivAkf1OOiggyLtCLd8\n+fJOQ2MrWVrKUqpypKqpSrr2i3iofpG4lo8QkXRS4EihqOtuFTrsV0SqkwKHFDzsV53pItVJgUMi\nD/vVGloi1U2BQ4Bow341V0SkuilwSGTqTBepbgocElmxa2ipX0SksiU+cJjZP5jZrWZ2d7nzIoHi\n19BSv4hIJYs1cJjZbWa2zcye75I+2cxeNrONZnZ5vnu4+6vuPjPOfEo0WkNLpLrFPXN8ITAP+E02\nwcz6APOBScBmYLWZLQb6ANd2uf4id98Wcx6lAI2N0WaVq19EJD0s7gXNzKwWeNDd6zLH44Cr3P2/\nZI6/D+DuXYNG1/vc7e4X5Pl8FjALYPDgwWOam5sLym9raysDBgwo6NqkSVJZpk07jb/+df81dAYP\n3ktz86oer09SWYqRlnKAypJExZajoaFhrbvX93himN2einkBtcDzHY4vAG7pcDwDmJfn+sOBm4A/\nAd8P88yk7gBYakkqyx13uPfv33knw/79e94FMStJZSlGWsrhrrIkUbHlIC07ALr7Dnf/urv/o/dQ\nK5HkKqRfBLSboUgSlWN13
C3A0R2Oh2TSimZm5wLnDh8+vDduJ70sar9I5+XeTcu9iyREOWocq4Fj\nzGyYmR0MTAMW98aNPYUbOVUzjcQSSaa4h+M2AU8Cx5nZZjOb6e77gDnAQ8CLwJ3u/kIvPS9VW8dW\nO43EEkmmWJuq3H16jvQlwJIYnvcA8EB9ff3XevveUnqF7GYoIvFLfOe4VC8t9y6STKkKHGqqSpeo\nuxlqWROR0khV4FDnePpE2c1QnekipZGqwCHVTZ3pIqWRqsChpqrqpr3TRUojVYFDTVXVrZDOdBGJ\nLlWBQ6pbocuaiEg05VhyRCQ2UZc1EZHoUlXjUB+HFEJzP0SiSVXgUB+HRKW5HyLRpSpwiESluR8i\n0SlwSFXT3A+R6FIVONTHIVFp7odIdKkKHOrjkKg090MkulQFDpGoit3SViOxpBppHodUveK2tEVb\n2krVUY1DJCKNxJJqp8AhEpFGYkm1S1Xg0KgqKQWNxJJql6rAoVFVUgoaiSXVLlWBQ6QUtAqvVDuN\nqhIpgFbhlWqmGodICWTnfZxxxgTN+5CKpxqHSMw6z/swzfuQiqcah0jMNO9D0kaBQyRmmvchaZOq\nwKF5HJJEmvchaZOqwKF5HJJEmvchaRMqcJhZjZl9JvO+n5kNjDdbIunRed6Ha96HVLweA4eZfQ24\nG/jfmaQhwP1xZkokbRobYdMmWLp0BZs2hQsaWrpdkirMcNzZwFjgKQB3/6OZHRlrrkSqnJZulyQL\n01T1nrv/LXtgZgcCHl+WRERDeCXJwgSOFWb2A6CfmU0C7gIeiDdbItVNQ3glycIEjsuB7cB64H8A\nS9xdf/eIxEhDeCXJwgSOb7j7ze7+BXe/wN1vNrNLY8+ZSBXTEF5JsjCB45+6SftyL+dDRDrQ0u2S\nZDlHVZnZdOC/AcPMbHGHjwYCb8WdMZFqp6XbJanyDcf9PfAGcATwrx3SdwPPxZmprszsvwJnAx8F\nbnX3h0v5fBER+VDOwOHuLUALMK6YB5jZbcA5wDZ3r+uQPhn4BdAHuMXdr8uTl/uB+83sMOB6QIFD\nRKRMwswcP83MVptZq5n9zcw+MLNdEZ6xEJjc5Z59gPnAZ4ETgelmdqKZjTCzB7u8Ok42vCJznYh0\nQ7PNpRTCzByfB0wjmL9RD/x34NiwD3D3x8ystkvyWGCju78KYGbNwFR3v5agdtKJmRlwHfDv7v5M\n2GeLVBPNNpdSMff8k8DNbI2715vZc+4+MpP2rLufFPohQeB4MNtUZWYXAJPd/auZ4xnAqe4+J8f1\nlxCM7loNrHP3m7o5ZxYwC2Dw4MFjmpubw2avk9bWVgYMGFDQtUmjsiRPnOWYNu00/vrXvvulDx68\nl+bmVb3+vLT8m0B6ylJsORoaGta6e32PJ7p73hfwGHAw8BvgJ8C3gD/0dF2Xe9QCz3c4voCgXyN7\nPAOYF+We+V5jxozxQi1btqzga5NGZUmeOMth5g77v8zieV5a/k3c01OWYssBrPEQv2PDzOOYQdAX\nMgd4FzgaOD9CEOvOlsx9soZk0oqijZykmmm2uZRKj4HD3Vvcfa+773L3q93928DgIp+7GjjGzIaZ\n2cEEfSiLe7imR66NnKSKaba5lErOwGFmfcxsupn9TzPL9k2cY2a/J+gwD8XMmoAngePMbLOZzXT3\nfQQ1mIeAF4E73f2FokqCahxS3TTbXEol36iqWwmak54GbjSzrQSjqi73YF5FKO4+PUf6EmBJhLyG\nedYDwAP19fVf6837ilQKzTaXUsgXOOqBke7eZmZ9gb8A/+juO0qTNRERSaJ8fRx/c/c2AHffC7ya\n9KChpioRkfjlCxzHm9lzmdf6Dsfrzayka1WFpc5xEZH45WuqOqFkuRARkYqRs8aRGYab81XKTIal\npiqRaLS2lRQizATAiqGmKpHwsmtbtbQEc8yza1speEhPUhU4RCS8uXM/XBAxa8+eIF0kn7y
BIzMJ\nUH9/iKTQ669HSxfJyhs43P0DoCazLEjiqY9DJDytbSWFCtNU9Sqw0sz+2cy+nX3FnbFCqI9DJDyt\nbSWFChM4/gQ8mDl3YIeXiFQwrW0lhepxB0B3vxrAzAZkjlvjzpSIlIbWtpJChNlzvM7MngVeAF4w\ns7Vm9on4sxad+jhEROIXpqlqAfBtd69x9xrgO8DN8WarMOrjEIlfdtLgGWdM0KTBKtVjUxVwiLsv\nyx64+3IzOyTGPIlIQmUnDQbzP6x90iCoyauahBpVlRlRVZt5XUEw0kpEqowmDQqECxwXAYOAe4F7\ngCMyaSJSZTRpUKCHpioz6wPMdfdLSpQfEUmwoUODNa26S5fqEWbm+KdKlJeiaVSVSLw0aVAgXFPV\ns2a22MxmmNl52VfsOSuARlWJxKvzpEHXpMEqFWZUVV9gB3BGhzQn6PMQkSqTnTS4fPkKJk6cWO7s\nSBmE6eN4zt1/VqL8iIhIwoXp45heoryIiEgFCNNUtdLM5gG/Bd7NJrr7M7HlSkREEitM4Bid+fq/\nOqQ5nfs8RESkSoRZHbehFBkREZHKkLOPw8x+3uH9pV0+WxhjnkREJMHydY5/usP7f+ry2cgY8lI0\nTQAUEYlfvsBhOd4nliYAiiRTdin2Aw5AS7GnQL4+jgPM7DCC4JJ9nw0gfWLPmYikQuel2NFS7CmQ\nr8ZxKLAWWAN8FHgmc7wW7TkuIiFpKfb0yVnjcPfaEuZDRFJKS7GnT5hFDkVECpZryXUtxV65FDhE\nJFZaij19FDhEJFadl2JHS7GnQI+Bw8xmdpN2XTzZEZE0amyETZugrS34qqBR2cKsVXW+me1190UA\nZjafYI8OERGpQqECB7DYzNqAycA77r5fLSQuZnYCcClwBPCou/9bqZ4tIiL7y7dW1cfM7GNAP+Cr\nwPeA3cDVmfQemdltZrbNzJ7vkj7ZzF42s41mdnm+e7j7i+7+deBCYHyY54qISHzy1TjWEiyfbh2+\nnp15OfAPIe6/EJgH/CabkNlVcD4wCdgMrDazxQSz0a/tcv1F7r7NzKYAFwP/J8QzRUQkRvkmAA4r\n9ubu/piZ1XZJHgtsdPdXAcysGZjq7tcC5+S4z2KC5rL/B/zfYvMlIiKFM3fPf4LZbGCRu7+TOT4M\nmO7uvwr1gCBwPOjudZnjC4DJ7v7VzPEM4FR3n5Pj+onAecBHCPY/n5/jvFnALIDBgwePaW5uDpO9\n/bS2tjJgwICCrk0alSV50lIOUFmSqNhyNDQ0rHX3+h5PdPe8L2BdN2nP9nRdh3Nrgec7HF8A3NLh\neAYwL+z9wrzGjBnjhVq2bFnB1yaNypI8aSmHu8qSRMWWA1jjIX7HhpkA2MfM2pdVz/RRHBwlinWx\nBTi6w/GQTFrRtB+HiEj8wgSO3wG/NbMzzexMoCmTVqjVwDFmNszMDgamAYuLuF87134cIiKxCxM4\nLgOWEYxquhh4lGBobo/MrAl4EjjOzDab2Ux33wfMAR4CXgTudPcXCsl8N89TjUNEJGY9TgB09zYz\nuxV4gmAY7svu/kGYm7v79BzpS4AlUTIa8nkPAA/U19d/rbfvLSIigTBrVU0E/kgwH+NXwCtm9um8\nF4mIFEFbzSZbmCVH/hU4y91fBjCzYwn6OcbEmbFCmNm5wLnDhw8vd1ZEpEDaajb5wvRxHJQNGgDu\n/gpwUHxZKpw6x0Uqn7aaTb4wNY41ZnYLcEfmuJFgH3IRkV6nrWaTL0yN42JgA3BJ5rUhk5Y4GlUl\nUvm01Wzy9Rg43P09d7/B3c/LvH7m7u+VInNRqalKpPJpq9nky9lUZWbrCYbfdsvdR8aSIxGpatkO\n8Llzg+apoUODoKGO8eTI18fR7Uq1IiJxa2xUoEiyfMuqt3RNM7MjgB2ZxbASR8NxRUTil28HwNPM\nbLmZ3WtmJ2V28Xse+KuZTS5dFsNTH4eISPzyNVXNA34
AHAosBT7r7qvM7HiKX+hQREQqVL5RVQe6\n+8PufhfwF3dfBeDuL5UmayIikkT5Akdbh/f/2eWzRPZxiEj10vpWpZOvqWqUme0CDOiXeU/muG/s\nOSuAOsdFqpPWtyqtnDUOd+/j7h9194HufmDmffZYa1WJSGJofavSCrPkiIhIoml9q9JS4BCRiqf1\nrUpLgUNEKp7WtyotBQ4RqXiNjbBgAdTUgFnwdcECdYzHJcx+HBVDo6pEqpfWtyqdVNU4NKpKRCR+\nqQocIiISPwUOERGJRIFDREQiUeAQEZFIFDhERCQSBQ4REYkkVYHDzM41swU7d+4sd1ZERFIrVYFD\n8zhEROKXqsAhIhInbRYVSNWSIyIicdFmUR9SjUNEJARtFvUhBQ4RqVrZpqczzpjQY9OTNov6kAKH\niFSlbNNTSwu4W3vTU67goc2iPqTAISJVKWrTkzaL+pACh4hUpahNT9os6kMaVSUiVWno0KCZqrv0\nXLRZVKAiahxmdoiZrTGzc8qdFxFJBzU9FS7WwGFmt5nZNjN7vkv6ZDN72cw2mtnlIW51GXBnPLkU\nkWrUuenJq7rpKaq4m6oWAvOA32QTzKwPMB+YBGwGVpvZYqAPcG2X6y8CRgEbgL4x51VEqky26Wn5\n8hVMnDix3NmpGLEGDnd/zMxquySPBTa6+6sAZtYMTHX3a4H9mqLMbCJwCHAi8J9mtsTd2+LMt4iI\n5GbuHu8DgsDxoLvXZY4vACa7+1czxzOAU919Tg/3+TLwprs/mOPzWcAsgMGDB49pbm4uKL+tra0M\nGDCgoGuTRmVJnrSUA1SWJCq2HA0NDWvdvb6n8ypmVJW7L+zh8wXAAoD6+novtNq5fPny1FRZVZbk\nSUs5QGVJolKVoxyjqrYAR3c4HpJJK5r24xARiV85Asdq4BgzG2ZmBwPTgMW9cWPtxyEiEr+4h+M2\nAU8Cx5nZZjOb6e77gDnAQ8CLwJ3u/kKc+RARkd4T96iq6TnSlwBLevt5ZnYucO7w4cN7+9YiIpJR\nETPHw1JTlYhI/FIVONQ5LiISv1QFDtU4RETil6rAISIi8VPgEBGRSFIVONTHISISv1QFDvVxiIjE\nL1WBQ0RE4peqwKGmKhFJmkWLoLYWDjgg+LpoUblzVLxUBQ41VYlIkixaBLNmBXubuwdfZ82q/OCR\nqsAhIpIkc+fCnj2d0/bsCdIrmQKHiEhMXn89WnqlUOAQEYnJ0KHR0itFqgKHOsdFJEmuuQb69++c\n1r9/kF7JUhU41DkuIknS2AgLFkBNDZgFXxcsCNIrWcXsOS4iUokaGys/UHSVqhqHiIjET4FDREQi\nUeAQEZFIUhU4NKpKRCR+qQocGlUlIhK/VAUOERGJnwKHiEjCJH1FXc3jEBFJkOyKutnFEbMr6kJy\n5oOoxiEikiCVsKKuAoeISIJUwoq6ChwiIglSCSvqpipwaB6HiFS6SlhRN1WBQ/M4RKTSVcKKuhpV\nJSKSMElfUTdVNQ4REYmfAoeIiESiwCEiIpEocIiISCQKHCIiEom5e7nz0OvMbDvQUuDlRwBv9mJ2\nykllSZ60lANUliQqthw17j6op5NSGTiKYWZr3L2+3PnoDSpL8qSlHKCyJFGpyqGmKhERiUSBQ0RE\nIlHg2N+CcmegF6ksyZOWcoDKkkQlKYf6OEREJBLVOEREJBIFjjzM7Dtm5mZ2RLnzUigz+6mZvWRm\nz5nZfWb2d+XOUxRmNtnMXjazjWZ2ebnzUygzO9rMlpnZBjN7wcwuLXeeimFmfczsWTN7sNx5KYaZ\n/Z2Z3Z35GXnRzMaVO0+FMrNvZf5vPW9mTWbWN65nKXDkYGZHA2cBCdp3qyD/AdS5+0jgFeD7Zc5P\naGbWB5gPfBY4EZhuZieWN1cF2wd8x91PBE4DZldwWQAuBV4sdyZ6wS+A37n78cAoKrRMZvZx4BKg\n3t3rgD7AtLiep8C
R28+A7wEV3Qnk7g+7+77M4SpgSDnzE9FYYKO7v+rufwOagallzlNB3P0Nd38m\n8343wS+oj5c3V4UxsyHA2cAt5c5LMczsUODTwK0A7v43d3+nvLkqyoFAPzM7EOgPbI3rQQoc3TCz\nqcAWd/9DufPSyy4C/r3cmYjg48CfOxxvpkJ/2XZkZrXAScBT5c1JwX5O8EdVW7kzUqRhwHbg9kyz\n2y1mdki5M1UId98CXE/QQvIGsNPdH47reVUbOMzskUxbYNfXVOAHwL+UO49h9VCW7DlzCZpLFpUv\np2JmA4B7gG+6+65y5ycqMzsH2Obua8udl15wIHAy8G/ufhLwLlCR/WhmdhhBbXwY8PfAIWb2pbie\nV7U7ALr7Z7pLN7MRBN/8P5gZBE07z5jZWHf/SwmzGFqusmSZ2ZeBc4AzvbLGX28Bju5wPCSTVpHM\n7CCCoLHI3e8td34KNB6YYmafA/oCHzWzO9w9tl9SMdoMbHb3bM3vbio0cACfAV5z9+0AZnYv8Eng\njjgeVrU1jlzcfb27H+nute5eS/Cf6+SkBo2emNlkgmaFKe6+p9z5iWg1cIyZDTOzgwk6+xaXOU8F\nseCvkFuBF939hnLnp1Du/n13H5L52ZgGLK3QoEHmZ/rPZnZcJulMYEMZs1SM14HTzKx/5v/amcTY\n0V+1NY4qMg/4CPAfmRrUKnf/enmzFI677zOzOcBDBKNEbnP3F8qcrUKNB2YA681sXSbtB+6+pIx5\nEvgGsCjzh8mrwFfKnJ+CuPtTZnY38AxBk/SzxDiLXDPHRUQkEjVViYhIJAocIiISiQKHiIhEosAh\nIiKRKHCIiEgkChxS9cxsbmZV0efMbJ2ZnZpJX25mazqcV29myzPvJ5rZzsz5L5nZ9TnuHeo8kUqi\nwCFVLbOM9jkEkzxHEszA7bg+1pFm9tkclz/u7qMJ1p06x8zGF3meSEVQ4JBqdxTwpru/B+Dub7p7\nx1VFfwrMzXcDd/9PYB09LMDY9TwzG2tmT2YW2Pt9dgazmX3ZzO41s9+Z2R/N7CfZe5jZTDN7xcye\nNrObzWxeJn2Qmd1jZqszLwUniY0Ch1S7h4GjM7+Mf2VmE7p8/iTwNzNryHWDzAJzxwCP5XtQN+e9\nBJyeWWDvX4Afdzh9NPBFYATwxcxGUH8P/DPBfh7jgeM7nP8L4GfufgpwPhW+5LkkmwKHVDV3bwXG\nALMIltj+bWZRyI5+BFzRzeWnm9kfCBZefCjPema5zjsUuMvMnifY/+UTHa551N13uvtegvWTagj2\nJ1nh7m93ctbMAAABTElEQVS5+/vAXR3O/wwwL7OcyWKCxQcHhPgWiESmwCFVz90/cPfl7n4lMIfg\nL/aOny8F+hH8pd/R4+4+iuAX/kwzG53jEbnO+yGwLLNj27kEq81mvdfh/Qf0vK7cAcBp7j468/p4\nJiiK9DoFDqlqZnacmR3TIWk00NLNqT8iWGV4P+7+GnAdcFm+Z3Vz3qF8uEz8l0NkdzUwwcwOy+zy\n1jHAPUywYB8AeYKYSNEUOKTaDQB+bWYbzOw5gr3Nr+p6UmYV2+157nMT8OnM7n75dDzvJ8C1ZvYs\nIVaqzuzy9mPgaWAlsAnYmfn4EqA+M6R4A1ARKyBLZdLquCIVxMwGuHtrpsZxH8FS8/eVO19SXVTj\nEKksV2U6wJ8HXgPuL3N+pAqpxiEiIpGoxiEiIpEocIiISCQKHCIiEokCh4iIRKLAISIikShwiIhI\nJP8fv+AmRWtmIiQAAAAASUVORK5CYII=\n", 484 | "text/plain": [ 485 | "" 486 | ] 487 | }, 488 | "metadata": {}, 489 | "output_type": "display_data" 490 | } 491 | ], 492 | "source": [ 493 | 
"plt.savefig('AutoEncoder_7_4_BER_matplotlib')\n", 494 | "plt.show()" 495 | ] 496 | }, 497 | { 498 | "cell_type": "code", 499 | "execution_count": null, 500 | "metadata": { 501 | "collapsed": true 502 | }, 503 | "outputs": [], 504 | "source": [] 505 | } 506 | ], 507 | "metadata": { 508 | "anaconda-cloud": {}, 509 | "kernelspec": { 510 | "display_name": "Python 3", 511 | "language": "python", 512 | "name": "python3" 513 | }, 514 | "language_info": { 515 | "codemirror_mode": { 516 | "name": "ipython", 517 | "version": 3 518 | }, 519 | "file_extension": ".py", 520 | "mimetype": "text/x-python", 521 | "name": "python", 522 | "nbconvert_exporter": "python", 523 | "pygments_lexer": "ipython3", 524 | "version": "3.6.3" 525 | } 526 | }, 527 | "nbformat": 4, 528 | "nbformat_minor": 2 529 | } 530 | -------------------------------------------------------------------------------- /autoencoder_dynamic.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "# importing libs\n", 10 | "import numpy as np\n", 11 | "import tensorflow as tf\n", 12 | "import keras\n", 13 | "from keras.layers import Input, Dense, GaussianNoise,Lambda,Dropout\n", 14 | "from keras.models import Model\n", 15 | "from keras import regularizers\n", 16 | "from keras.layers.normalization import BatchNormalization\n", 17 | "from keras.optimizers import Adam,SGD\n", 18 | "from keras import backend as K" 19 | ] 20 | }, 21 | { 22 | "cell_type": "code", 23 | "execution_count": null, 24 | "metadata": { 25 | "collapsed": true 26 | }, 27 | "outputs": [], 28 | "source": [ 29 | "# for reproducing reslut\n", 30 | "from numpy.random import seed\n", 31 | "seed(1)\n", 32 | "from tensorflow import set_random_seed\n", 33 | "set_random_seed(3)" 34 | ] 35 | }, 36 | { 37 | "cell_type": "code", 38 | "execution_count": null, 39 | "metadata": {}, 40 | "outputs": 
[], 41 | "source": [ 42 | "# defining parameters\n", 43 | "# define (n,k) here for (n,k) autoencoder\n", 44 | "# n = n_channel \n", 45 | "# k = log2(M) ==> so for (7,4) autoencoder n_channel = 7 and M = 2^4 = 16 \n", 46 | "M = 4\n", 47 | "k = np.log2(M)\n", 48 | "k = int(k)\n", 49 | "n_channel = 2\n", 50 | "R = k/n_channel\n", 51 | "print ('M:',M,'k:',k,'n:',n_channel)" 52 | ] 53 | }, 54 | { 55 | "cell_type": "code", 56 | "execution_count": null, 57 | "metadata": { 58 | "collapsed": true 59 | }, 60 | "outputs": [], 61 | "source": [ 62 | "# generating data of size N\n", 63 | "N = 8000\n", 64 | "label = np.random.randint(M,size=N)" 65 | ] 66 | }, 67 | { 68 | "cell_type": "code", 69 | "execution_count": null, 70 | "metadata": { 71 | "collapsed": true 72 | }, 73 | "outputs": [], 74 | "source": [ 75 | "# creating one hot encoded vectors\n", 76 | "data = []\n", 77 | "for i in label:\n", 78 | "    temp = np.zeros(M)\n", 79 | "    temp[i] = 1\n", 80 | "    data.append(temp)" 81 | ] 82 | }, 83 | { 84 | "cell_type": "code", 85 | "execution_count": null, 86 | "metadata": {}, 87 | "outputs": [], 88 | "source": [ 89 | "# checking data shape\n", 90 | "data = np.array(data)\n", 91 | "print (data.shape)" 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": null, 97 | "metadata": {}, 98 | "outputs": [], 99 | "source": [ 100 | "# checking generated data against its label\n", 101 | "temp_check = [17,23,45,67,89,96,72,250,350]\n", 102 | "for i in temp_check:\n", 103 | "    print(label[i],data[i])" 104 | ] 105 | }, 106 | { 107 | "cell_type": "code", 108 | "execution_count": null, 109 | "metadata": { 110 | "collapsed": true 111 | }, 112 | "outputs": [], 113 | "source": [ 114 | "# defining the autoencoder and its layers\n", 115 | "input_signal = Input(shape=(M,))\n", 116 | "encoded = Dense(M, activation='relu')(input_signal)\n", 117 | "encoded1 = Dense(n_channel, activation='linear')(encoded)\n", 118 | "encoded2 = Lambda(lambda x:
np.sqrt(n_channel)*K.l2_normalize(x,axis=1))(encoded1)\n", 119 | "\n", 120 | "EbNo_train = 5.01187 # converted from 7 dB of Eb/No\n", 121 | "encoded3 = GaussianNoise(np.sqrt(1/(2*R*EbNo_train)))(encoded2)\n", 122 | "\n", 123 | "decoded = Dense(M, activation='relu')(encoded3)\n", 124 | "decoded1 = Dense(M, activation='softmax')(decoded)\n", 125 | "autoencoder = Model(input_signal, decoded1)\n", 126 | "adam = Adam(lr=0.01)\n", 127 | "autoencoder.compile(optimizer=adam, loss='categorical_crossentropy')" 128 | ] 129 | }, 130 | { 131 | "cell_type": "code", 132 | "execution_count": null, 133 | "metadata": {}, 134 | "outputs": [], 135 | "source": [ 136 | "# printing summary of layers and their trainable parameters \n", 137 | "print (autoencoder.summary())" 138 | ] 139 | }, 140 | { 141 | "cell_type": "code", 142 | "execution_count": null, 143 | "metadata": { 144 | "collapsed": true 145 | }, 146 | "outputs": [], 147 | "source": [ 148 | "# for TensorBoard visualization\n", 149 | "#tbCallBack = keras.callbacks.TensorBoard(log_dir='./logs', histogram_freq=0, batch_size=32, write_graph=True, write_grads=True, write_images=False, embeddings_freq=0, embeddings_layer_names=None, embeddings_metadata=None)" 150 | ] 151 | }, 152 | { 153 | "cell_type": "code", 154 | "execution_count": null, 155 | "metadata": { 156 | "scrolled": true 157 | }, 158 | "outputs": [], 159 | "source": [ 160 | "# training autoencoder\n", 161 | "autoencoder.fit(data, data,\n", 162 | "                epochs=45,\n", 163 | "                batch_size=32)" 164 | ] 165 | }, 166 | { 167 | "cell_type": "code", 168 | "execution_count": null, 169 | "metadata": { 170 | "collapsed": true 171 | }, 172 | "outputs": [], 173 | "source": [ 174 | "# saving keras model\n", 175 | "from keras.models import load_model\n", 176 | "# if you want to save the model then remove the comment below\n", 177 | "# autoencoder.save('autoencoder_v_best.model')" 178 | ] 179 | }, 180 | { 181 | "cell_type": "code", 182 | "execution_count": null, 183 | "metadata": { 184 | "collapsed": true
185 | }, 186 | "outputs": [], 187 | "source": [ 188 | "# making encoder from full autoencoder\n", 189 | "encoder = Model(input_signal, encoded2)" 190 | ] 191 | }, 192 | { 193 | "cell_type": "code", 194 | "execution_count": null, 195 | "metadata": { 196 | "collapsed": true 197 | }, 198 | "outputs": [], 199 | "source": [ 200 | "# making decoder from full autoencoder\n", 201 | "encoded_input = Input(shape=(n_channel,))\n", 202 | "\n", 203 | "deco = autoencoder.layers[-2](encoded_input)\n", 204 | "deco = autoencoder.layers[-1](deco)\n", 205 | "decoder = Model(encoded_input, deco)" 206 | ] 207 | }, 208 | { 209 | "cell_type": "code", 210 | "execution_count": null, 211 | "metadata": { 212 | "collapsed": true 213 | }, 214 | "outputs": [], 215 | "source": [ 216 | "# generating data for checking BER\n", 217 | "# if you're not using t-SNE for visualization then set N to 70,000 for better results \n", 218 | "# for t-SNE use a smaller N like N = 1500\n", 219 | "N = 50000\n", 220 | "test_label = np.random.randint(M,size=N)\n", 221 | "test_data = []\n", 222 | "\n", 223 | "for i in test_label:\n", 224 | "    temp = np.zeros(M)\n", 225 | "    temp[i] = 1\n", 226 | "    test_data.append(temp)\n", 227 | "    \n", 228 | "test_data = np.array(test_data)" 229 | ] 230 | }, 231 | { 232 | "cell_type": "code", 233 | "execution_count": null, 234 | "metadata": {}, 235 | "outputs": [], 236 | "source": [ 237 | "# checking generated data\n", 238 | "temp_test = 6\n", 239 | "print (test_data[temp_test][test_label[temp_test]],test_label[temp_test])" 240 | ] 241 | }, 242 | { 243 | "cell_type": "code", 244 | "execution_count": null, 245 | "metadata": {}, 246 | "outputs": [], 247 | "source": [ 248 | "# for plotting the learned constellation diagram\n", 249 | "\n", 250 | "scatter_plot = []\n", 251 | "for i in range(0,M):\n", 252 | "    temp = np.zeros(M)\n", 253 | "    temp[i] = 1\n", 254 | "    scatter_plot.append(encoder.predict(np.expand_dims(temp,axis=0)))\n", 255 | "scatter_plot = np.array(scatter_plot)\n", 256 | "print
(scatter_plot.shape)" 257 | ] 258 | }, 259 | { 260 | "cell_type": "code", 261 | "execution_count": null, 262 | "metadata": {}, 263 | "outputs": [], 264 | "source": [ 265 | " # use this function for plotting the constellation in higher dimensions, e.g. 7-D for the (7,4) autoencoder \n", 266 | "'''\n", 267 | "x_emb = encoder.predict(test_data)\n", 268 | "noise_std = np.sqrt(1/(2*R*EbNo_train))\n", 269 | "noise = noise_std * np.random.randn(N,n_channel)\n", 270 | "x_emb = x_emb + noise\n", 271 | "from sklearn.manifold import TSNE\n", 272 | "X_embedded = TSNE(learning_rate=700, n_components=2,n_iter=35000, random_state=0, perplexity=60).fit_transform(x_emb)\n", 273 | "print (X_embedded.shape)\n", 274 | "X_embedded = X_embedded / 7\n", 275 | "import matplotlib.pyplot as plt\n", 276 | "plt.scatter(X_embedded[:,0],X_embedded[:,1])\n", 277 | "#plt.axis((-2.5,2.5,-2.5,2.5)) \n", 278 | "plt.grid()\n", 279 | "plt.show()\n", 280 | "'''" 281 | ] 282 | }, 283 | { 284 | "cell_type": "code", 285 | "execution_count": null, 286 | "metadata": {}, 287 | "outputs": [], 288 | "source": [ 289 | "# plotting constellation diagram\n", 290 | "import matplotlib.pyplot as plt\n", 291 | "scatter_plot = scatter_plot.reshape(M,2,1)\n", 292 | "plt.scatter(scatter_plot[:,0],scatter_plot[:,1])\n", 293 | "plt.axis((-2.5,2.5,-2.5,2.5))\n", 294 | "plt.grid()\n", 295 | "plt.show()" 296 | ] 297 | }, 298 | { 299 | "cell_type": "code", 300 | "execution_count": null, 301 | "metadata": { 302 | "collapsed": true 303 | }, 304 | "outputs": [], 305 | "source": [ 306 | "def frange(x, y, jump):\n", 307 | "    while x < y:\n", 308 | "        yield x\n", 309 | "        x += jump" 310 | ] 311 | }, 312 | { 313 | "cell_type": "code", 314 | "execution_count": null, 315 | "metadata": {}, 316 | "outputs": [], 317 | "source": [ 318 | "# calculating BER\n", 319 | "# this is an optimized BER function so it can handle large values of N\n", 320 | "# the previous version had another for loop which made it slow\n", 321 | "EbNodB_range =
list(frange(-4,8.5,0.5))\n", 322 | "ber = [None]*len(EbNodB_range)\n", 323 | "for n in range(0,len(EbNodB_range)):\n", 324 | "    EbNo=10.0**(EbNodB_range[n]/10.0)\n", 325 | "    noise_std = np.sqrt(1/(2*R*EbNo))\n", 326 | "    noise_mean = 0\n", 327 | "    no_errors = 0\n", 328 | "    nn = N\n", 329 | "    noise = noise_std * np.random.randn(nn,n_channel)\n", 330 | "    encoded_signal = encoder.predict(test_data) \n", 331 | "    final_signal = encoded_signal + noise\n", 332 | "    pred_final_signal =  decoder.predict(final_signal)\n", 333 | "    pred_output = np.argmax(pred_final_signal,axis=1)\n", 334 | "    no_errors = (pred_output != test_label)\n", 335 | "    no_errors =  no_errors.astype(int).sum()\n", 336 | "    ber[n] = no_errors / nn \n", 337 | "    print ('SNR:',EbNodB_range[n],'BER:',ber[n])\n", 338 | "    # use the line below to generate a matlab-style vector which can be copy-pasted for plotting the BER graph in matlab\n", 339 | "    #print(ber[n], \" \",end='')" 340 | ] 341 | }, 342 | { 343 | "cell_type": "code", 344 | "execution_count": null, 345 | "metadata": {}, 346 | "outputs": [], 347 | "source": [ 348 | "# plotting BER curve\n", 349 | "import matplotlib.pyplot as plt\n", 350 | "from scipy import interpolate\n", 351 | "plt.plot(EbNodB_range, ber, 'bo',label='Autoencoder(2,2)')\n", 352 | "plt.yscale('log')\n", 353 | "plt.xlabel('SNR Range')\n", 354 | "plt.ylabel('Block Error Rate')\n", 355 | "plt.grid()\n", 356 | "plt.legend(loc='upper right',ncol = 1)" 357 | ] 358 | }, 359 | { 360 | "cell_type": "code", 361 | "execution_count": null, 362 | "metadata": {}, 363 | "outputs": [], 364 | "source": [ 365 | "# for saving the figure remove the comment below\n", 366 | "#plt.savefig('AutoEncoder_2_2_constrained_BER_matplotlib')\n", 367 | "plt.show()" 368 | ] 369 | }, 370 | { 371 | "cell_type": "code", 372 | "execution_count": null, 373 | "metadata": { 374 | "collapsed": true 375 | }, 376 | "outputs": [], 377 | "source": [] 378 | } 379 | ], 380 | "metadata": { 381 | "anaconda-cloud": {}, 382 | "kernelspec": { 383 |
"display_name": "Python 3", 384 | "language": "python", 385 | "name": "python3" 386 | }, 387 | "language_info": { 388 | "codemirror_mode": { 389 | "name": "ipython", 390 | "version": 3 391 | }, 392 | "file_extension": ".py", 393 | "mimetype": "text/x-python", 394 | "name": "python", 395 | "nbconvert_exporter": "python", 396 | "pygments_lexer": "ipython3", 397 | "version": "3.5.3" 398 | } 399 | }, 400 | "nbformat": 4, 401 | "nbformat_minor": 2 402 | } 403 | -------------------------------------------------------------------------------- /git-photo/cons_1.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/immortal3/AutoEncoder-Based-Communication-System/8d186ba809ae9b60995e56c6d93e6a21db56eb30/git-photo/cons_1.PNG -------------------------------------------------------------------------------- /git-photo/cons_2.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/immortal3/AutoEncoder-Based-Communication-System/8d186ba809ae9b60995e56c6d93e6a21db56eb30/git-photo/cons_2.PNG -------------------------------------------------------------------------------- /git-photo/cons_4.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/immortal3/AutoEncoder-Based-Communication-System/8d186ba809ae9b60995e56c6d93e6a21db56eb30/git-photo/cons_4.PNG -------------------------------------------------------------------------------- /git-photo/plot_1.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/immortal3/AutoEncoder-Based-Communication-System/8d186ba809ae9b60995e56c6d93e6a21db56eb30/git-photo/plot_1.PNG -------------------------------------------------------------------------------- /git-photo/plot_2.PNG: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/immortal3/AutoEncoder-Based-Communication-System/8d186ba809ae9b60995e56c6d93e6a21db56eb30/git-photo/plot_2.PNG -------------------------------------------------------------------------------- /git-photo/research_paper_1.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/immortal3/AutoEncoder-Based-Communication-System/8d186ba809ae9b60995e56c6d93e6a21db56eb30/git-photo/research_paper_1.PNG -------------------------------------------------------------------------------- /git-photo/research_paper_2.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/immortal3/AutoEncoder-Based-Communication-System/8d186ba809ae9b60995e56c6d93e6a21db56eb30/git-photo/research_paper_2.PNG -------------------------------------------------------------------------------- /plotting/plot1.m: -------------------------------------------------------------------------------- 1 | 2 | n = 7; 3 | SNR_db = -4:0.5:8; 4 | SNR = 10.^(SNR_db./10); 5 | k = 4; 6 | ber_7_4_hamming_hard = bercoding(SNR_db,'block','hard',n,k,4); 7 | M = 4; 8 | ber_4_4 = berawgn( SNR_db ,'psk',M,'nondiff'); 9 | ber_7_4_autoencoder = [ 0.37038 0.32722 0.29286 0.25436 0.21936 0.18242 0.1502 0.11876 0.09346 0.06972 0.04974 0.03586 0.02388 0.0169 0.00966 0.00574 0.00294 0.00154 0.00094 0.00034 0.00016 6e-05 3e-05 1e-05 8e-06 ]; 10 | %ber_7_4_fit = berfit(SNR_db,ber_7_4_autoencoder); 11 | semilogy(SNR_db,ber_7_4_hamming_hard.*2,'b-.','linewidth',1),grid on,hold on; 12 | semilogy(SNR_db,ber_7_4_autoencoder,'ro','linewidth',0.25,'MarkerFaceColor',[ 1 0 0]); 13 | 14 | BER_SOFT=(7.*qfunc(sqrt(3*SNR)))+(7.*qfunc(sqrt(4*SNR)))+(qfunc(sqrt(7*SNR))); 15 | semilogy(SNR_db,BER_SOFT./7,'b-','linewidth',1); 16 | semilogy(SNR_db,ber_4_4.*4,'k','linewidth',1) 17 | legend('Hamming (7,4) Hard Decision','Autoencoder(7,4)','Hamming (7,4) MLD','UnCoded BPSK(4,4)'); 18 | 
legend('Location','southwest'); 19 | ylim([1E-5 1]); 20 | -------------------------------------------------------------------------------- /plotting/plot2.m: -------------------------------------------------------------------------------- 1 | M = 4; 2 | SNR_db = -2:0.5:10; 3 | SNR = 10.^(SNR_db/10); 4 | [ber_2_2,ser_2_2 ]= berawgn(SNR_db,'psk',M,'nondiff'); 5 | M_new = 2; 6 | [ber_8_8,ser_8_8 ]= berawgn(SNR_db,'psk',M_new,'nondiff'); 7 | ber_8_8 = 8 * ber_8_8; 8 | ber_2_2_autoencoder = [0.247873333333 0.222406666667 0.198713333333 0.17418 0.152546666667 0.1319 0.113633333333 0.0926933333333 0.0758733333333 0.06042 0.0475333333333 0.0348266666667 0.0264066666667 0.0189733333333 0.0131066666667 0.00856 0.00546 0.00314 0.00174 0.000853333333333 0.000553333333333 0.000213333333333 9.33333333333e-05 3.33333333333e-05 1.33333333333e-05 ]; 9 | ber_8_8_autoencoder = [0.631973333333 0.58108 0.529746666667 0.47574 0.417266666667 0.35732 0.29806 0.244213333333 0.19182 0.14556 0.107273333333 0.0760066666667 0.0494266666667 0.0323466666667 0.0184266666667 0.0103533333333 0.00558 0.00288 0.00118666666667 0.00046 0.000193333333333 7.33333333333e-05 2.66666666667e-05 0.9e-05 0.0]; 10 | semilogy(SNR_db,ser_2_2,'k--','linewidth',2),grid on,hold on; 11 | semilogy(SNR_db,ber_8_8_autoencoder,'--r*','linewidth',0.8) 12 | semilogy(SNR_db,ber_2_2_autoencoder,'rs','linewidth',0.8) 13 | semilogy(SNR_db,ber_8_8,'k','linewidth',1) 14 | legend('Uncoded BPSK(2,2)','Autoencoder(8,8)','Autoencoder (2,2)','Uncoded BPSK(8,8)'); 15 | legend('Location','southwest'); 16 | ylim([1E-5 1]) 17 | --------------------------------------------------------------------------------
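The notebooks and MATLAB scripts above all rest on the same three numerical conventions: the `Lambda` layer rescales each encoded vector to L2 norm sqrt(n_channel) (unit average energy per channel use), the channel noise standard deviation is sqrt(1/(2*R*Eb/No)) with Eb/No converted from dB (7 dB gives the 5.01187 hard-coded as `EbNo_train`), and "BER" in the loop is actually a block error rate (fraction of misclassified messages). A minimal NumPy-only sketch of these relationships — function names are mine, not from the repo — is:

```python
import numpy as np

def ebno_db_to_noise_std(ebno_db, rate):
    # Eb/No in dB -> linear, then sigma = sqrt(1 / (2 * R * Eb/No)),
    # matching GaussianNoise(np.sqrt(1/(2*R*EbNo_train))) in the notebooks
    ebno = 10.0 ** (ebno_db / 10.0)
    return np.sqrt(1.0 / (2.0 * rate * ebno))

def normalize_energy(x, n_channel):
    # mimic the Lambda layer: sqrt(n) * x / ||x||_2 applied row-wise,
    # so every codeword carries average energy 1 per channel use
    norms = np.linalg.norm(x, axis=1, keepdims=True)
    return np.sqrt(n_channel) * x / norms

def block_error_rate(pred_labels, true_labels):
    # fraction of messages decoded incorrectly (what the BER loop computes)
    return np.mean(pred_labels != true_labels)

# example: the (7,4) autoencoder trained at 7 dB, as in the notebooks
R = 4 / 7
print(10.0 ** (7.0 / 10.0))          # ~5.01187, the hard-coded EbNo_train
print(ebno_db_to_noise_std(7.0, R))  # noise std used during training
```

These helpers reproduce the arithmetic only; the trained encoder/decoder weights still come from the Keras models in the notebooks.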