├── 0.Data_preparation
│   ├── data_exploration.ipynb
│   └── data_processing.ipynb
├── 1.Data_exploration_baseline_models
│   ├── DTW_KNN.ipynb
│   ├── Further Feature Exploration + LR + RF model.ipynb
│   └── Other_simple_models.ipynb
├── 2.Binary_classification_models
│   ├── Binary_classification_contents.md
│   ├── Model_binary_classification_model1.ipynb
│   ├── Model_binary_classification_model2.ipynb
│   ├── Model_binary_classification_model3.ipynb
│   ├── Model_binary_classification_model4.ipynb
│   ├── Model_binary_classification_model5.ipynb
│   ├── Model_binary_classification_model6.ipynb
│   ├── Model_binary_classification_model7.ipynb
│   └── fetch_data.py
├── 3.Transformer_model
│   ├── transformer_high_income.ipynb
│   └── transformer_low_income.ipynb
├── Data_Schema.md
├── Images
│   ├── Picture1.png
│   ├── Picture2.png
│   ├── Picture3.png
│   ├── Picture4.png
│   ├── Picture5.png
│   ├── Picture6.png
│   ├── Picture7.png
│   └── Picture8.png
├── Presentation.pdf
├── README.md
└── Readings
    ├── Attention is all you need.pdf
    └── OCAN.pdf
/0.Data_preparation/data_processing.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "source": [ 6 | "Data Preparation\n", 7 | "* Data loading from data_train.json\n", 8 | "* Basic features (user type, application time) processing and exploration\n", 9 | " * Application time ---- processed into day of week and time of day.\n", 10 | " * Check the relationship between fraud and user type/application time\n", 11 | "* Sequential user behavior features processing and exploration\n", 12 | " * stay time, lag time between pages, time span of an application, page transitions overview.\n", 13 | "* Processing data for model building\n", 14 | " * Sequential data padding\n", 15 | " * Standardization\n", 16 | " * Save in json format\n", 17 | "* Generate Markov Transition Field along features. \n", 18 | " * Zhang, R., Zheng, F., & Min, W. (2018). Sequential Behavioral Data Processing Using Deep Learning and the Markov Transition Field in Online Fraud Detection. arXiv preprint arXiv:1808.05329.\n", 19 | " * All features are binned and one-hot encoded into a binary vector.\n", 20 | " * matrix[i,j] represents the probability that j=1 given i=1\n", 21 | "* Generate Markov Transition Field along timesteps and Gramian Angular Field (code only)\n", 22 | " * Wang, Z., & Oates, T. (2015, April). Encoding time series as images for visual inspection and classification using tiled convolutional neural networks. In Workshops at the Twenty-Ninth AAAI Conference on Artificial Intelligence."
23 | ], 24 | "metadata": { 25 | "colab_type": "text", 26 | "id": "RG3sRuu7DzY0" 27 | } 28 | }, 29 | { 30 | "cell_type": "code", 31 | "source": [ 32 | "from datetime import datetime\n", 33 | "import numpy as np\n", 34 | "import pandas as pd\n", 35 | "import json\n", 36 | "from sklearn.preprocessing import LabelEncoder\n", 37 | "import matplotlib.pyplot as plt\n", 38 | "from scipy import stats\n", 39 | "from gensim.models import Word2Vec\n", 40 | "from sklearn.manifold import TSNE\n", 41 | "import os\n", 42 | "import pickle\n" 43 | ], 44 | "outputs": [], 45 | "execution_count": 0, 46 | "metadata": { 47 | "ExecuteTime": { 48 | "end_time": "2020-04-28T04:27:17.508554Z", 49 | "start_time": "2020-04-28T04:27:17.503568Z" 50 | }, 51 | "colab_type": "code", 52 | "id": "QwwXGSE5DzZK", 53 | "colab": {} 54 | } 55 | }, 56 | { 57 | "cell_type": "code", 58 | "source": [ 59 | "from google.colab import drive\n", 60 | "drive.mount('/content/drive', force_remount=True)\n", 61 | "os.chdir(\"drive/My Drive/Online Lending/Sequential Embedding/Spring 2020/low_income_data\") \n", 62 | "# os.chdir(\"\") # give the path to the data file.\n", 63 | "os.listdir()" 64 | ], 65 | "outputs": [], 66 | "execution_count": null, 67 | "metadata": { 68 | "id": "oXnbVSq4XiU-", 69 | "colab_type": "code", 70 | "outputId": "506a8828-d113-417a-bfda-bce5a1978167", 71 | "executionInfo": { 72 | "status": "error", 73 | "timestamp": 1590035296307, 74 | "user_tz": -480, 75 | "elapsed": 1426, 76 | "user": { 77 | "displayName": "Hanyu Wu", 78 | "photoUrl": "", 79 | "userId": "05588030028148674891" 80 | } 81 | }, 82 | "colab": { 83 | "base_uri": "https://localhost:8080/", 84 | "height": 778 85 | } 86 | } 87 | }, 88 | { 89 | "cell_type": "code", 90 | "source": [ 91 | "data_dir = 'processed/'" 92 | ], 93 | "outputs": [], 94 | "execution_count": 0, 95 | "metadata": { 96 | "id": "34vLSQwQXmSS", 97 | "colab_type": "code", 98 | "colab": {} 99 | } 100 | }, 101 | { 102 | "cell_type": "markdown", 103 | "source": [ 104 | "# Data loading" 105 | ], 106 | "metadata": { 107 | "colab_type": "text", 108 | "id": "4FKluv7lDzaj" 109 | } 110 | }, 111 | { 112 | "cell_type": "code", 113 | "source": [ 114 | "# change to raw data folder\n", 115 | "# os.chdir(os.path.abspath(os.path.join(data_dir, \"../../data/raw\")))" 116 | ], 117 | "outputs": [], 118 | "execution_count": 0, 119 | "metadata": { 120 | "ExecuteTime": { 121 | "end_time": "2020-04-28T03:30:39.204790Z", 122 | "start_time": "2020-04-28T03:30:39.202799Z" 123 | }, 124 | "id": "0eiwURUjTfgW", 125 | "colab_type": "code", 126 | "colab": {} 127 | } 128 | }, 129 | { 130 | "cell_type": "code", 131 | "source": [ 132 | "%%time\n", 133 | "sequential_data = []\n", 134 | "\n", 135 | "line_number = 0\n", 136 | "max_lines = 100\n", 137 | "# with open('raw/data_train.json', 'r') as f:\n", 138 | "with open('raw/test_new.json', 'r') as f:\n", 139 | " for line in f:\n", 140 | "# if len(sequential_data) > max_lines:\n", 141 | "# break\n", 142 | " sequential_data.append(json.loads(line))\n", 143 | "# sequential_data = sequential_data[:100]\n", 144 | "len(sequential_data)" 145 | ], 146 | "outputs": [ 147 | { 148 | "output_type": "stream", 149 | "name": "stdout", 150 | "text": [ 151 | "CPU times: user 1.49 s, sys: 420 ms, total: 1.91 s\n", 152 | "Wall time: 2.72 s\n" 153 | ] 154 | } 155 | ], 156 | "execution_count": 0, 157 | "metadata": { 158 | "ExecuteTime": { 159 | "end_time": "2020-04-28T03:30:53.959549Z", 160 | "start_time": "2020-04-28T03:30:39.205767Z" 161 | }, 162 | "colab_type": "code", 163 | "executionInfo": { 164 | 
"status": "ok", 165 | "timestamp": 1590033968517, 166 | "user_tz": -480, 167 | "elapsed": 3073, 168 | "user": { 169 | "displayName": "Hanyu Wu", 170 | "photoUrl": "", 171 | "userId": "05588030028148674891" 172 | } 173 | }, 174 | "id": "nzGeQjCDDza8", 175 | "outputId": "72775cad-e467-4cea-cd97-57f11d145e3a", 176 | "scrolled": true, 177 | "colab": { 178 | "base_uri": "https://localhost:8080/", 179 | "height": 51 180 | } 181 | } 182 | }, 183 | { 184 | "cell_type": "code", 185 | "source": [ 186 | "sequential_driver = {}\n", 187 | "\n", 188 | "sequential_behavior = {}\n", 189 | "\n", 190 | "for item in sequential_data:\n", 191 | " user_id = item[0]\n", 192 | " application_time = int(item[1]['order_info']['order_time'])\n", 193 | " sequential_driver.update({f\"{user_id}|{application_time}\": item[1]['order_info']})\n", 194 | " sub_data = [x for x in item[1]['data']\n", 195 | " if x['petime'] <= application_time-100]\n", 196 | " # we only keep data occurs before application time. \"-100\" is not neccessary for offline data cleaning.\n", 197 | " # but sometimes we use this trick for online calculation to avoid network slowdown\n", 198 | " sequential_behavior.update({f\"{user_id}|{application_time}\": sub_data})\n", 199 | "## driver saved user data, while behavior saved both user data and behavior sequence\n", 200 | "len(sequential_behavior), len(sequential_driver)" 201 | ], 202 | "outputs": [ 203 | { 204 | "output_type": "execute_result", 205 | "execution_count": 9, 206 | "data": { 207 | "text/plain": "(30672, 30672)" 208 | }, 209 | "metadata": { 210 | "tags": [] 211 | } 212 | } 213 | ], 214 | "execution_count": 0, 215 | "metadata": { 216 | "ExecuteTime": { 217 | "end_time": "2020-04-28T03:30:56.127858Z", 218 | "start_time": "2020-04-28T03:30:53.960517Z" 219 | }, 220 | "colab_type": "code", 221 | "executionInfo": { 222 | "status": "ok", 223 | "timestamp": 1590033978017, 224 | "user_tz": -480, 225 | "elapsed": 525, 226 | "user": { 227 | "displayName": "Hanyu Wu", 228 | "photoUrl": "", 229 | "userId": "05588030028148674891" 230 | } 231 | }, 232 | "id": "coI759MXDzb0", 233 | "outputId": "c27832cb-9510-47a9-8c46-01403cc89012", 234 | "colab": { 235 | "base_uri": "https://localhost:8080/", 236 | "height": 34 237 | } 238 | } 239 | }, 240 | { 241 | "cell_type": "markdown", 242 | "source": [ 243 | "# Preprocessing data for model building" 244 | ], 245 | "metadata": { 246 | "colab_type": "text", 247 | "id": "KiE148qbDzgp" 248 | } 249 | }, 250 | { 251 | "cell_type": "code", 252 | "source": [ 253 | "unique_user_session = list(set(sequential_behavior.keys()))\n", 254 | "keys_order = list()\n", 255 | "for i in sequential_behavior.keys():\n", 256 | " if sequential_behavior[i] !=[]:\n", 257 | " keys_order.append(i)\n", 258 | "keys_order[:10]\n", 259 | "# len(keys_order)" 260 | ], 261 | "outputs": [ 262 | { 263 | "output_type": "execute_result", 264 | "execution_count": 38, 265 | "data": { 266 | "text/plain": "['0ed9672fa61f4d6da241a6289000e2f2|1507102200000',\n '48b72ca5b43248d3b50dfadb76c651a2|1508810700000',\n '8e669f481a4645c89de0f29f980522e9|1508354520000',\n 'd9a86017a67140ab96d9c405d1ebb02d|1507129920000',\n '96e3ded73e6b47ba91108e5b6a923c81|1509348180000',\n '593c8f76b5164c08a52c7fc1bf99d283|1508970600000',\n 'e87ffe735fba4541a009967d25b0e714|1507058160000',\n '8e6c5ba3d14c41539b64816c4692f142|1507671840000',\n '3f8e1ae4c91147138fbdc5ea8aa17302|1508903820000',\n 'ed76629b471b4496ab9b979eda237c20|1508883240000']" 267 | }, 268 | "metadata": { 269 | "tags": [] 270 | } 271 | } 272 | ], 273 | "execution_count": 
0, 274 | "metadata": { 275 | "id": "CQBHQd3MTfgd", 276 | "colab_type": "code", 277 | "outputId": "c9409a6e-a3f7-4750-e5c0-b262ec61431b", 278 | "executionInfo": { 279 | "status": "ok", 280 | "timestamp": 1590035325103, 281 | "user_tz": -480, 282 | "elapsed": 361, 283 | "user": { 284 | "displayName": "Hanyu Wu", 285 | "photoUrl": "", 286 | "userId": "05588030028148674891" 287 | } 288 | }, 289 | "colab": { 290 | "base_uri": "https://localhost:8080/", 291 | "height": 187 292 | } 293 | } 294 | }, 295 | { 296 | "cell_type": "markdown", 297 | "source": [ 298 | "## Processing Data for Word2vec Embedding" 299 | ], 300 | "metadata": { 301 | "id": "lZS9Ojb1Tfgf", 302 | "colab_type": "text" 303 | } 304 | }, 305 | { 306 | "cell_type": "code", 307 | "source": [ 308 | "# This is for word2vec embedding. In this part, we will not do one hot endoing for page types.\n", 309 | "def data_process_for_embedding(sequence_for_a_single_application):\n", 310 | " '''\n", 311 | " Function to process signle application\n", 312 | " '''\n", 313 | " sequence_for_a_single_application.sort(key=lambda x: x['petime'])\n", 314 | " page_sequence = [x['pname'] for x in sequence_for_a_single_application]\n", 315 | " pstart = [x['pstime'] for x in sequence_for_a_single_application]\n", 316 | " pend = ([x['petime'] for x in sequence_for_a_single_application])\n", 317 | "\n", 318 | "# mark some outliers as -1 and take the logarithm of the lag time.\n", 319 | " page_stay_time = [np.log((y-x)/1000 +1) if (y-x)>0 and (y-x)//1000<800 else -1 for x,y in zip(pstart, pend)]\n", 320 | "# mark some outliers as -1 and take the logarithm of the lag time.\n", 321 | "# page_lagg_time = [np.log((x-y)/1000 +1)if (x-y)>=0 else -1 for x,y in zip(pstart[1:], pend[:-1])]\n", 322 | "\n", 323 | "# page_lagg_time_padd=[0]\n", 324 | "# page_lagg_time_padd.extend(page_lagg_time)\n", 325 | "\n", 326 | " return page_sequence ,page_stay_time #,page_lagg_time_padd\n", 327 | "\n", 328 | "\n", 329 | "def get_data_for_embedding(x):\n", 330 | " sequence_data = []\n", 331 | " stay_time_sequence = list()\n", 332 | " overdue = []\n", 333 | " for keys in keys_order:\n", 334 | " #page_sequence, page_stay_time, page_lagg_time = data_process_for_embedding(x[keys])\n", 335 | " page_sequence, stay_time = data_process_for_embedding(x[keys])\n", 336 | " # single_entry=np.vstack((page_sequence, page_stay_time, page_lagg_time)).T\n", 337 | " sequence_data.append(page_sequence)\n", 338 | " stay_time_sequence.append(stay_time)\n", 339 | " return sequence_data, stay_time_sequence\n", 340 | "\n", 341 | "\n", 342 | "sequence_data_for_embedding, sequence_stay_time = get_data_for_embedding(sequential_behavior)\n", 343 | "# sequence_data_for_embedding=np.array(sequence_data_for_embedding)\n", 344 | "# sequence_data_for_embedding.shape\n", 345 | "print(\"some information about the the data shape to help you undestand the data:\")\n", 346 | "print(f\"number of users: {len(sequence_data_for_embedding)}\")\n", 347 | "print(\n", 348 | " f\"timestamps length of behaviors of the first user: {len(sequence_data_for_embedding[0])}\")\n", 349 | "print(\n", 350 | " f\"timestamps length of behaviors of the second user: {len(sequence_data_for_embedding[1])}\")\n", 351 | "print(\"Different user may have different page viewing sequence length(i.e. 
timestaps length), but we later decide to modify all to 60 timestamps\")" 352 | ], 353 | "outputs": [ 354 | { 355 | "output_type": "stream", 356 | "name": "stdout", 357 | "text": [ 358 | "some information about the the data shape to help you undestand the data:\n", 359 | "number of users: 19617\n", 360 | "timestamps length of behaviors of the first user: 15\n", 361 | "timestamps length of behaviors of the second user: 21\n", 362 | "Different user may have different page viewing sequence length(i.e. timestaps length), but we later decide to modify all to 60 timestamps\n" 363 | ] 364 | } 365 | ], 366 | "execution_count": 0, 367 | "metadata": { 368 | "ExecuteTime": { 369 | "end_time": "2020-04-28T03:33:57.124005Z", 370 | "start_time": "2020-04-28T03:33:55.800542Z" 371 | }, 372 | "id": "kEsybNJUTfgg", 373 | "colab_type": "code", 374 | "outputId": "c4364104-b02c-465a-9a48-df4e1d705f6c", 375 | "executionInfo": { 376 | "status": "ok", 377 | "timestamp": 1590033983362, 378 | "user_tz": -480, 379 | "elapsed": 1868, 380 | "user": { 381 | "displayName": "Hanyu Wu", 382 | "photoUrl": "", 383 | "userId": "05588030028148674891" 384 | } 385 | }, 386 | "colab": { 387 | "base_uri": "https://localhost:8080/", 388 | "height": 122 389 | } 390 | } 391 | }, 392 | { 393 | "cell_type": "code", 394 | "source": [ 395 | "# This is to train our own word2vec model. NB: we do not use pre-trained word2vec model!\n", 396 | "model = Word2Vec(sequence_data_for_embedding, min_count=1,\n", 397 | " size=50, workers=3, window=5, sg=1)" 398 | ], 399 | "outputs": [], 400 | "execution_count": 0, 401 | "metadata": { 402 | "ExecuteTime": { 403 | "end_time": "2020-04-28T03:34:01.705790Z", 404 | "start_time": "2020-04-28T03:33:57.125002Z" 405 | }, 406 | "id": "ckjlDQE4Tfgi", 407 | "colab_type": "code", 408 | "colab": {} 409 | } 410 | }, 411 | { 412 | "cell_type": "code", 413 | "source": [ 414 | "# After word2vec, let's see what the data strcture looks like. 
\n", 415 | "# The change is that, now every page type has been modified into a vector of length 50\n", 416 | "# (this vector length can be changed, it is a hymperparameter of word2vec model).\n", 417 | "sequence_data_embedded = []\n", 418 | "for i in range(len(sequence_data_for_embedding)):\n", 419 | " sequence_data_embedded_for_a_single_user = []\n", 420 | " for j in range(len(sequence_data_for_embedding[i])):\n", 421 | " sequence_data_embedded_for_a_single_user.append(model.wv[sequence_data_for_embedding[i][j]])\n", 422 | " sequence_data_embedded.append(sequence_data_embedded_for_a_single_user)\n", 423 | "print(f\"number of users: {len(sequence_data_embedded)}\")\n", 424 | "print(f\"number of behaviors of first user: {len(sequence_data_embedded[0])}\")\n", 425 | "print(f\"number of behaviors of second user: {len(sequence_data_embedded[1])}\")\n", 426 | "print(f\"embedding size of every page type: {len(sequence_data_embedded[0][0])}\")\n", 427 | "print(\"the shape of sequence_data_embedded is (19617,X,50) where X is the number of behaviors of each user, it is not a fixed number\")" 428 | ], 429 | "outputs": [ 430 | { 431 | "output_type": "stream", 432 | "name": "stdout", 433 | "text": [ 434 | "number of users: 19617\n", 435 | "number of behaviors of first user: 15\n", 436 | "number of behaviors of second user: 21\n", 437 | "embedding size of every page type: 50\n", 438 | "the shape of sequence_data_embedded is (19617,X,50) where X is the number of behaviors of each user, it is not a fixed number\n" 439 | ] 440 | } 441 | ], 442 | "execution_count": 0, 443 | "metadata": { 444 | "ExecuteTime": { 445 | "end_time": "2020-04-28T03:34:08.188431Z", 446 | "start_time": "2020-04-28T03:34:01.708779Z" 447 | }, 448 | "id": "a3P_LgLLTfgk", 449 | "colab_type": "code", 450 | "outputId": "f2dcbadb-9804-428d-fa2c-7c72ec219322", 451 | "executionInfo": { 452 | "status": "ok", 453 | "timestamp": 1590033989542, 454 | "user_tz": -480, 455 | "elapsed": 1738, 456 | "user": { 457 | "displayName": "Hanyu Wu", 458 | "photoUrl": "", 459 | "userId": "05588030028148674891" 460 | } 461 | }, 462 | "colab": { 463 | "base_uri": "https://localhost:8080/", 464 | "height": 122 465 | } 466 | } 467 | }, 468 | { 469 | "cell_type": "markdown", 470 | "source": [ 471 | "## Pad the sequence into the same length" 472 | ], 473 | "metadata": { 474 | "colab_type": "text", 475 | "id": "zHW60eaiDzhF" 476 | } 477 | }, 478 | { 479 | "cell_type": "code", 480 | "source": [ 481 | "padded_sequence_data_embedded = []\n", 482 | "padding_vector = [0]*50 # 50 is the embedding size\n", 483 | "maxLen = 60\n", 484 | "for i in range(len(sequence_data_embedded)):\n", 485 | " padded_sequence_data_embedded_for_a_user = []\n", 486 | " if len(sequence_data_embedded[i]) >= maxLen:\n", 487 | " padded_sequence_data_embedded_for_a_user = sequence_data_embedded[i].copy()[:maxLen]\n", 488 | " else:\n", 489 | " padding_size = maxLen - len(sequence_data_embedded[i])\n", 490 | " padded_sequence_data_embedded_for_a_user = sequence_data_embedded[i].copy()\n", 491 | " for n in range(padding_size):\n", 492 | " padded_sequence_data_embedded_for_a_user.append(padding_vector)\n", 493 | " padded_sequence_data_embedded.append(padded_sequence_data_embedded_for_a_user)\n", 494 | "len(padded_sequence_data_embedded), len(padded_sequence_data_embedded[0]), len(padded_sequence_data_embedded[0][0])" 495 | ], 496 | "outputs": [ 497 | { 498 | "output_type": "execute_result", 499 | "execution_count": 15, 500 | "data": { 501 | "text/plain": "(19617, 60, 50)" 502 | }, 503 | "metadata": 
{ 504 | "tags": [] 505 | } 506 | } 507 | ], 508 | "execution_count": 0, 509 | "metadata": { 510 | "ExecuteTime": { 511 | "end_time": "2020-04-28T03:34:16.130893Z", 512 | "start_time": "2020-04-28T03:34:14.957907Z" 513 | }, 514 | "id": "Qi7s_8ugTfgm", 515 | "colab_type": "code", 516 | "outputId": "298cb5be-2e7b-4bf5-e63c-e87c0b5f1632", 517 | "executionInfo": { 518 | "status": "ok", 519 | "timestamp": 1590034015189, 520 | "user_tz": -480, 521 | "elapsed": 445, 522 | "user": { 523 | "displayName": "Hanyu Wu", 524 | "photoUrl": "", 525 | "userId": "05588030028148674891" 526 | } 527 | }, 528 | "colab": { 529 | "base_uri": "https://localhost:8080/", 530 | "height": 34 531 | } 532 | } 533 | }, 534 | { 535 | "cell_type": "markdown", 536 | "source": [ 537 | "## Pad Stay Time" 538 | ], 539 | "metadata": { 540 | "id": "GAL76cBWTfgo", 541 | "colab_type": "text" 542 | } 543 | }, 544 | { 545 | "cell_type": "code", 546 | "source": [ 547 | "padded_sequence_stay_time = []\n", 548 | "padding_vector = 0 # 50 is the embedding size\n", 549 | "for i in range(len(sequence_stay_time)):\n", 550 | " padded_sequence_data_embedded_for_a_user = []\n", 551 | " if len(sequence_stay_time[i]) >= maxLen:\n", 552 | " padded_sequence_data_embedded_for_a_user = sequence_stay_time[i].copy()[:maxLen]\n", 553 | " else:\n", 554 | " padding_size = maxLen - len(sequence_stay_time[i])\n", 555 | " padded_sequence_data_embedded_for_a_user = sequence_stay_time[i].copy()\n", 556 | " for n in range(padding_size):\n", 557 | " padded_sequence_data_embedded_for_a_user.append(padding_vector)\n", 558 | " padded_sequence_stay_time.append(padded_sequence_data_embedded_for_a_user)\n", 559 | "len(padded_sequence_stay_time), len(padded_sequence_stay_time[0])" 560 | ], 561 | "outputs": [ 562 | { 563 | "output_type": "execute_result", 564 | "execution_count": 16, 565 | "data": { 566 | "text/plain": "(19617, 60)" 567 | }, 568 | "metadata": { 569 | "tags": [] 570 | } 571 | } 572 | ], 573 | "execution_count": 0, 574 | "metadata": { 575 | "id": "HSQd52tRTfgp", 576 | "colab_type": "code", 577 | "outputId": "5e58bb25-106d-447d-8a7b-92569eac8765", 578 | "executionInfo": { 579 | "status": "ok", 580 | "timestamp": 1590034018330, 581 | "user_tz": -480, 582 | "elapsed": 463, 583 | "user": { 584 | "displayName": "Hanyu Wu", 585 | "photoUrl": "", 586 | "userId": "05588030028148674891" 587 | } 588 | }, 589 | "colab": { 590 | "base_uri": "https://localhost:8080/", 591 | "height": 34 592 | } 593 | } 594 | }, 595 | { 596 | "cell_type": "code", 597 | "source": [ 598 | "# check if the len has been padded to maxLen\n", 599 | "print(len(sequence_stay_time[0])) \n", 600 | "# print(\"the shape of padded_sequence_data_embedded is {0} where {1} is the max len we set\".format(padded_sequence_data_embedded.shape, len(padded_sequence_data_embedded[0])))\n", 601 | "[padded_sequence_stay_time[0], sequence_stay_time[0]]" 602 | ], 603 | "outputs": [ 604 | { 605 | "output_type": "stream", 606 | "name": "stdout", 607 | "text": [ 608 | "15\n" 609 | ] 610 | }, 611 | { 612 | "output_type": "execute_result", 613 | "execution_count": 19, 614 | "data": { 615 | "text/plain": "[[2.320916049678769,\n 0.8082599876604498,\n 1.6397731122301733,\n 1.4678743481123135,\n 1.6867692553704239,\n 1.7967470107390942,\n 1.660701206371642,\n 2.273259263612217,\n 2.595180077306471,\n 0.9439058989071285,\n 0.8666802313208206,\n 1.3724489542978375,\n 1.0511712140679732,\n 1.4156104154539437,\n 1.7112722183153684,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 
0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0,\n 0],\n [2.320916049678769,\n 0.8082599876604498,\n 1.6397731122301733,\n 1.4678743481123135,\n 1.6867692553704239,\n 1.7967470107390942,\n 1.660701206371642,\n 2.273259263612217,\n 2.595180077306471,\n 0.9439058989071285,\n 0.8666802313208206,\n 1.3724489542978375,\n 1.0511712140679732,\n 1.4156104154539437,\n 1.7112722183153684]]" 616 | }, 617 | "metadata": { 618 | "tags": [] 619 | } 620 | } 621 | ], 622 | "execution_count": 0, 623 | "metadata": { 624 | "ExecuteTime": { 625 | "end_time": "2020-04-28T03:36:39.548614Z", 626 | "start_time": "2020-04-28T03:36:39.542630Z" 627 | }, 628 | "scrolled": true, 629 | "id": "f-CbmkaFTfgr", 630 | "colab_type": "code", 631 | "outputId": "2f57c91f-93bf-4edf-cfb3-5b5ec444f93d", 632 | "executionInfo": { 633 | "status": "ok", 634 | "timestamp": 1590034073169, 635 | "user_tz": -480, 636 | "elapsed": 383, 637 | "user": { 638 | "displayName": "Hanyu Wu", 639 | "photoUrl": "", 640 | "userId": "05588030028148674891" 641 | } 642 | }, 643 | "colab": { 644 | "base_uri": "https://localhost:8080/", 645 | "height": 1000 646 | } 647 | } 648 | }, 649 | { 650 | "cell_type": "markdown", 651 | "source": [ 652 | "## Save the Processed Data in Json Format" 653 | ], 654 | "metadata": { 655 | "colab_type": "text", 656 | "id": "rib7s5HlDzhR" 657 | } 658 | }, 659 | { 660 | "cell_type": "code", 661 | "source": [ 662 | "driver_columns=['new_client', '1', '2', '3', '4', '5', '6', '7', '1_time', '2_time','3_time', '4_time', '5_time', '6_time']\n", 663 | "features_sequential={} ## sequential data\n", 664 | "features_nonsequential={} ## non sequential features -- user type, application time\n", 665 | "features_sequential_embedded={} ## embedded sequential data\n", 666 | "label={}\n", 667 | "# for i in range(len(keys)):\n", 668 | "# uid=keys[i]\n", 669 | "# driver_data=driver[driver[\"index\"]==uid]\n", 670 | "# feature_nonsequential=list(driver_data[driver_columns].values[0].astype(\"float\"))\n", 671 | "# features_sequential[uid]=padded_sequence_data[i]\n", 672 | "# features_nonsequential[uid]=feature_nonsequential\n", 673 | "# features_sequential_embedded[uid]=padded_sequence_data_embedded[i]\n", 674 | "# label[uid]=driver_data[\"label\"].values[0]\n", 675 | "# if i%1000==0:\n", 676 | "# print(i,\"/\", len(keys))" 677 | ], 678 | "outputs": [], 679 | "execution_count": 0, 680 | "metadata": { 681 | "ExecuteTime": { 682 | "end_time": "2020-04-28T03:46:16.347578Z", 683 | "start_time": "2020-04-28T03:46:16.292725Z" 684 | }, 685 | "colab_type": "code", 686 | "id": "urGe0Bj9DzhS", 687 | "colab": {} 688 | } 689 | }, 690 | { 691 | "cell_type": "code", 692 | "source": [ 693 | "# driver[[\"index\", \"label\"]]" 694 | ], 695 | "outputs": [], 696 | "execution_count": 0, 697 | "metadata": { 698 | "id": "J6AqZhHmTfgy", 699 | "colab_type": "code", 700 | "colab": {} 701 | } 702 | }, 703 | { 704 | "cell_type": "code", 705 | "source": [ 706 | "len(padded_sequence_data_embedded[0][0])" 707 | ], 708 | "outputs": [ 709 | { 710 | "output_type": "execute_result", 711 | "execution_count": 30, 712 | "data": { 713 | "text/plain": "50" 714 | }, 715 | "metadata": { 716 | "tags": [] 717 | } 718 | } 719 | ], 720 | "execution_count": 0, 721 | "metadata": { 722 | "ExecuteTime": { 723 | "end_time": "2020-04-28T04:27:07.124321Z", 724 | "start_time": "2020-04-28T04:27:07.120321Z" 725 | }, 726 | "id": "qnLTJTsfTfg0", 727 | "colab_type": "code", 728 | "outputId": 
"b3a0a584-77c8-4cf8-87cb-081893d7c9af", 729 | "executionInfo": { 730 | "status": "ok", 731 | "timestamp": 1590034560665, 732 | "user_tz": -480, 733 | "elapsed": 424, 734 | "user": { 735 | "displayName": "Hanyu Wu", 736 | "photoUrl": "", 737 | "userId": "05588030028148674891" 738 | } 739 | }, 740 | "colab": { 741 | "base_uri": "https://localhost:8080/", 742 | "height": 34 743 | } 744 | } 745 | }, 746 | { 747 | "cell_type": "code", 748 | "source": [ 749 | "# len(features_nonsequential[\"56f889ee11df4a72955147cb2f29a638|1509322980000\"])\n", 750 | "# features_sequential_embedded = dict(zip(keys, padded_sequence_data_embedded))\n", 751 | "# write embedding sequence into pickle file\n", 752 | "with open(data_dir+'embedding_sequence.p', \"wb\") as fp: # Pickling\n", 753 | " pickle.dump(padded_sequence_data_embedded, fp)\n", 754 | "with open(data_dir+'stay_time_sequence.p', \"wb\") as fp: # Pickling\n", 755 | " pickle.dump(padded_sequence_stay_time, fp)" 756 | ], 757 | "outputs": [], 758 | "execution_count": 0, 759 | "metadata": { 760 | "ExecuteTime": { 761 | "end_time": "2020-04-28T04:32:48.882930Z", 762 | "start_time": "2020-04-28T04:32:33.911483Z" 763 | }, 764 | "id": "UZJUIPphTfg2", 765 | "colab_type": "code", 766 | "colab": {} 767 | } 768 | }, 769 | { 770 | "cell_type": "code", 771 | "source": [ 772 | "class NpEncoder(json.JSONEncoder):\n", 773 | " def default(self, obj):\n", 774 | " if isinstance(obj, np.integer):\n", 775 | " return int(obj)\n", 776 | " elif isinstance(obj, np.floating):\n", 777 | " return float(obj)\n", 778 | " elif isinstance(obj, np.ndarray):\n", 779 | " return obj.tolist()\n", 780 | " else:\n", 781 | " return super(NpEncoder, self).default(obj)\n", 782 | "\n", 783 | "\n", 784 | " \n", 785 | "# with open(data_dir+'features_sequential_embedded.json', 'w') as f: # LSTM input with word2vec embedded \n", 786 | "# json.dump(features_sequential_embedded, f, cls=NpEncoder)\n", 787 | "# print(\"Done saving padded_sequential_features\")\n", 788 | "\n", 789 | "\n", 790 | "# with open(data_dir+'padded_sequential_features.json', 'w') as f:\n", 791 | "# json.dump(features_sequential, f, cls=NpEncoder)\n", 792 | "# print(\"Done saving padded_sequential_features\")\n", 793 | "\n", 794 | "# with open(data_dir+'non_sequential_features.json', 'w') as f:\n", 795 | "# json.dump(features_nonsequential, f, cls=NpEncoder)\n", 796 | "# print(\"Done saving features_nonsequential\")\n", 797 | "\n", 798 | "# with open(data_dir+'label.json', 'w') as f:\n", 799 | "# json.dump(label, f, cls=NpEncoder)\n", 800 | "# print(\"Done saving label.json\")" 801 | ], 802 | "outputs": [], 803 | "execution_count": 0, 804 | "metadata": { 805 | "ExecuteTime": { 806 | "end_time": "2020-04-28T04:23:28.977069Z", 807 | "start_time": "2020-04-28T04:08:12.908Z" 808 | }, 809 | "colab_type": "code", 810 | "id": "651A30YPDzhX", 811 | "colab": {} 812 | } 813 | }, 814 | { 815 | "cell_type": "markdown", 816 | "source": [ 817 | "# Generate Markov Transition Field along features" 818 | ], 819 | "metadata": { 820 | "colab_type": "text", 821 | "id": "rhLzwrYcDzhZ" 822 | } 823 | }, 824 | { 825 | "cell_type": "code", 826 | "source": [ 827 | "# with open(data_dir+'padded_sequential_features_3.json') as f:\n", 828 | "# sequential_features = json.load(f)\n", 829 | "sequential_features = features_sequential\n", 830 | "feature1 = np.asarray([_ for _ in sequential_features.values()])\n", 831 | "len(feature1[0])" 832 | ], 833 | "outputs": [], 834 | "execution_count": null, 835 | "metadata": { 836 | "ExecuteTime": { 837 | "end_time": 
"2020-04-28T03:36:09.578260Z", 838 | "start_time": "2020-04-28T03:30:36.563Z" 839 | }, 840 | "colab_type": "code", 841 | "id": "aCWJLn0mDzhj", 842 | "outputId": "4e98a8aa-11ee-4438-e283-10fe7fba96c4", 843 | "executionInfo": { 844 | "status": "error", 845 | "timestamp": 1590034678414, 846 | "user_tz": -480, 847 | "elapsed": 341, 848 | "user": { 849 | "displayName": "Hanyu Wu", 850 | "photoUrl": "", 851 | "userId": "05588030028148674891" 852 | } 853 | }, 854 | "colab": { 855 | "base_uri": "https://localhost:8080/", 856 | "height": 197 857 | } 858 | } 859 | }, 860 | { 861 | "cell_type": "markdown", 862 | "source": [ 863 | "- To Transform continous time features into categorical features we need to cut them into bins\n", 864 | "- We first explore the distribution of the time features" 865 | ], 866 | "metadata": { 867 | "colab_type": "text", 868 | "id": "YsVGBvUkDzhl" 869 | } 870 | }, 871 | { 872 | "cell_type": "code", 873 | "source": [ 874 | "x = feature1[:, :, 15].flatten()\n", 875 | "pd.DataFrame(x).describe()" 876 | ], 877 | "outputs": [], 878 | "execution_count": 0, 879 | "metadata": { 880 | "ExecuteTime": { 881 | "end_time": "2020-04-28T03:36:09.578260Z", 882 | "start_time": "2020-04-28T03:30:36.565Z" 883 | }, 884 | "colab_type": "code", 885 | "id": "l1fQ0tkBDzhn", 886 | "scrolled": false, 887 | "colab": {} 888 | } 889 | }, 890 | { 891 | "cell_type": "markdown", 892 | "source": [ 893 | "- Cut stay time, lag time into 8 categories" 894 | ], 895 | "metadata": { 896 | "colab_type": "text", 897 | "id": "c2ezsn3NDzhq" 898 | } 899 | }, 900 | { 901 | "cell_type": "code", 902 | "source": [ 903 | "stay_time=feature1[:,:,14].flatten()\n", 904 | "stay_time=pd.cut(x,8,labels=list(range(8)))\n", 905 | "stay_time=np.array(stay_time).reshape(-1,60)\n", 906 | "feature1[:,:,14]=stay_time\n", 907 | "\n", 908 | "lag_time=feature1[:,:,15].flatten()\n", 909 | "lag_time=pd.cut(x,8,labels=list(range(8)))\n", 910 | "lag_time=np.array(lag_time).reshape(-1,60)\n", 911 | "feature1[:,:,15]=lag_time" 912 | ], 913 | "outputs": [], 914 | "execution_count": 0, 915 | "metadata": { 916 | "ExecuteTime": { 917 | "end_time": "2020-04-28T03:36:09.580255Z", 918 | "start_time": "2020-04-28T03:30:36.567Z" 919 | }, 920 | "colab_type": "code", 921 | "id": "6vjrljt8Dzhq", 922 | "colab": {} 923 | } 924 | }, 925 | { 926 | "cell_type": "markdown", 927 | "source": [ 928 | "- Generate transition matrix\n", 929 | "- A 31 by 31 matrix \n", 930 | "- All features are bin and one-hot encoded into a binary vector.\n", 931 | "- matrix[i,j] represents the probability that j=1 given i=1" 932 | ], 933 | "metadata": { 934 | "colab_type": "text", 935 | "id": "UGvRYDM2DziM" 936 | } 937 | }, 938 | { 939 | "cell_type": "code", 940 | "source": [ 941 | "from sklearn import preprocessing\n", 942 | "#test\n", 943 | "X =list(range(8))\n", 944 | "X=np.array(X)\n", 945 | "X=X[:,np.newaxis]\n", 946 | "enc = preprocessing.OneHotEncoder(categories='auto')\n", 947 | "enc.fit(X)\n", 948 | "onehotlabels = enc.transform(X).toarray()" 949 | ], 950 | "outputs": [], 951 | "execution_count": 0, 952 | "metadata": { 953 | "ExecuteTime": { 954 | "end_time": "2020-04-28T03:36:09.581252Z", 955 | "start_time": "2020-04-28T03:30:36.570Z" 956 | }, 957 | "colab_type": "code", 958 | "id": "QjC_mVjEDziN", 959 | "colab": {} 960 | } 961 | }, 962 | { 963 | "cell_type": "code", 964 | "source": [ 965 | "arr=[]\n", 966 | "for i in range(feature1.shape[0]):\n", 967 | " seq=feature1[i]\n", 968 | " seq=[i for i in seq if i[0]!=-1] \n", 969 | " seq=np.array(seq)\n", 970 | " #print(seq)\n", 
971 | " onehot_staytime=enc.transform(seq[:,14][:,np.newaxis]).A\n", 972 | " onehot_lagtime=enc.transform(seq[:,15][:,np.newaxis]).A\n", 973 | " #print(onehot_staytime)\n", 974 | " seq=np.delete(seq, np.s_[0,14,15], 1)\n", 975 | " seq=np.concatenate([seq, onehot_staytime,onehot_lagtime],axis=1)\n", 976 | " matrix=np.zeros([seq.shape[1],seq.shape[1]])\n", 977 | " for i in range(seq.shape[1]):\n", 978 | " total=np.sum(seq[:,i])\n", 979 | " if total==0:\n", 980 | " continue\n", 981 | " sub=seq[seq[:,i]==1]\n", 982 | " for col in range(sub.shape[1]):\n", 983 | " coappear=np.sum(sub[:,col])\n", 984 | " matrix[i,col]=matrix[i,col]+coappear/total\n", 985 | " arr.append(matrix)" 986 | ], 987 | "outputs": [], 988 | "execution_count": 0, 989 | "metadata": { 990 | "ExecuteTime": { 991 | "end_time": "2020-04-28T03:36:09.581252Z", 992 | "start_time": "2020-04-28T03:30:36.572Z" 993 | }, 994 | "colab_type": "code", 995 | "id": "IJlPaYFXDziR", 996 | "scrolled": true, 997 | "colab": {} 998 | } 999 | }, 1000 | { 1001 | "cell_type": "markdown", 1002 | "source": [ 1003 | "- Save the matrix" 1004 | ], 1005 | "metadata": { 1006 | "colab_type": "text", 1007 | "id": "pOAatRqVDziT" 1008 | } 1009 | }, 1010 | { 1011 | "cell_type": "code", 1012 | "source": [ 1013 | "class NpEncoder(json.JSONEncoder):\n", 1014 | " def default(self, obj):\n", 1015 | " if isinstance(obj, np.integer):\n", 1016 | " return int(obj)\n", 1017 | " elif isinstance(obj, np.floating):\n", 1018 | " return float(obj)\n", 1019 | " elif isinstance(obj, np.ndarray):\n", 1020 | " return obj.tolist()\n", 1021 | " else:\n", 1022 | " return super(NpEncoder, self).default(obj)\n", 1023 | "with open(data_dir+'featurematrix.json', 'w') as f:\n", 1024 | " json.dump(arr, f,cls=NpEncoder)" 1025 | ], 1026 | "outputs": [], 1027 | "execution_count": 0, 1028 | "metadata": { 1029 | "ExecuteTime": { 1030 | "end_time": "2020-04-28T03:36:09.582252Z", 1031 | "start_time": "2020-04-28T03:30:36.574Z" 1032 | }, 1033 | "colab_type": "code", 1034 | "id": "am7DSRnUDzih", 1035 | "colab": {} 1036 | } 1037 | }, 1038 | { 1039 | "cell_type": "code", 1040 | "source": [], 1041 | "outputs": [], 1042 | "execution_count": 0, 1043 | "metadata": { 1044 | "colab_type": "code", 1045 | "id": "tLcoc_EvJOLT", 1046 | "colab": {} 1047 | } 1048 | } 1049 | ], 1050 | "metadata": { 1051 | "colab": { 1052 | "name": "data_processing.ipynb", 1053 | "provenance": [], 1054 | "collapsed_sections": [] 1055 | }, 1056 | "kernelspec": { 1057 | "display_name": "Python 3", 1058 | "language": "python", 1059 | "name": "python3" 1060 | }, 1061 | "language_info": { 1062 | "name": "python", 1063 | "version": "3.7.7", 1064 | "mimetype": "text/x-python", 1065 | "codemirror_mode": { 1066 | "name": "ipython", 1067 | "version": 3 1068 | }, 1069 | "pygments_lexer": "ipython3", 1070 | "nbconvert_exporter": "python", 1071 | "file_extension": ".py" 1072 | }, 1073 | "toc": { 1074 | "toc_position": {}, 1075 | "skip_h1_title": false, 1076 | "number_sections": true, 1077 | "title_cell": "Table of Contents", 1078 | "toc_window_display": false, 1079 | "base_numbering": 1, 1080 | "toc_section_display": true, 1081 | "title_sidebar": "Contents", 1082 | "toc_cell": false, 1083 | "nav_menu": {}, 1084 | "sideBar": true 1085 | }, 1086 | "varInspector": { 1087 | "cols": { 1088 | "lenName": 16, 1089 | "lenType": 16, 1090 | "lenVar": 40 1091 | }, 1092 | "kernels_config": { 1093 | "python": { 1094 | "delete_cmd_postfix": "", 1095 | "delete_cmd_prefix": "del ", 1096 | "library": "var_list.py", 1097 | "varRefreshCmd": "print(var_dic_list())" 
1098 | }, 1099 | "r": { 1100 | "delete_cmd_postfix": ") ", 1101 | "delete_cmd_prefix": "rm(", 1102 | "library": "var_list.r", 1103 | "varRefreshCmd": "cat(var_dic_list()) " 1104 | } 1105 | }, 1106 | "types_to_exclude": [ 1107 | "module", 1108 | "function", 1109 | "builtin_function_or_method", 1110 | "instance", 1111 | "_Feature" 1112 | ], 1113 | "window_display": false 1114 | }, 1115 | "nteract": { 1116 | "version": "0.26.0" 1117 | } 1118 | }, 1119 | "nbformat": 4, 1120 | "nbformat_minor": 0 1121 | } -------------------------------------------------------------------------------- /1.Data_exploration_baseline_models/Other_simple_models.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": { 7 | "ExecuteTime": { 8 | "end_time": "2020-03-31T21:22:19.810884Z", 9 | "start_time": "2020-03-31T21:22:17.377317Z" 10 | }, 11 | "colab": { 12 | "base_uri": "https://localhost:8080/", 13 | "height": 134 14 | }, 15 | "colab_type": "code", 16 | "executionInfo": { 17 | "elapsed": 2327, 18 | "status": "ok", 19 | "timestamp": 1582642950102, 20 | "user": { 21 | "displayName": "Hanyu Wu", 22 | "photoUrl": "", 23 | "userId": "05588030028148674891" 24 | }, 25 | "user_tz": 300 26 | }, 27 | "id": "7KGzer_R3VL4", 28 | "outputId": "a1fd760f-a9cd-4d2e-f36c-b8ff77126488" 29 | }, 30 | "outputs": [ 31 | { 32 | "name": "stdout", 33 | "output_type": "stream", 34 | "text": [ 35 | "WARNING:tensorflow:From C:\\Users\\Harry\\Anaconda3\\envs\\tensorflow-gpu\\lib\\site-packages\\tensorflow_core\\python\\compat\\v2_compat.py:88: disable_resource_variables (from tensorflow.python.ops.variable_scope) is deprecated and will be removed in a future version.\n", 36 | "Instructions for updating:\n", 37 | "non-resource variables are not supported in the long term\n" 38 | ] 39 | } 40 | ], 41 | "source": [ 42 | "from sklearn.metrics import roc_curve, auc\n", 43 | "from sklearn.metrics import accuracy_score\n", 44 | "import sklearn.preprocessing as pre\n", 45 | "import json\n", 46 | "import pandas as pd\n", 47 | "import numpy as np\n", 48 | "import sklearn.model_selection\n", 49 | "from sklearn.metrics import confusion_matrix\n", 50 | "from sklearn.feature_selection import f_classif\n", 51 | "from sklearn.feature_selection import SelectKBest\n", 52 | "import sklearn\n", 53 | "import tensorflow.compat.v1 as tf\n", 54 | "import tensorflow\n", 55 | "import tensorflow.keras as keras\n", 56 | "import warnings\n", 57 | "warnings.filterwarnings('ignore')\n", 58 | "tf.disable_v2_behavior()\n", 59 | "from sklearn.base import BaseEstimator,ClassifierMixin\n", 60 | "\n", 61 | "data_dir = '../../data/processed/'" 62 | ] 63 | }, 64 | { 65 | "cell_type": "markdown", 66 | "metadata": { 67 | "colab_type": "text", 68 | "id": "Au1LXjaz3VMl" 69 | }, 70 | "source": [ 71 | "baseline: \n", 72 | "\n", 73 | "\n", 74 | " 1. use random search for hyper-parameters\n", 75 | " 2. the KS statistic is the most important metric; the best threshold for the classifier is based on KS\n", 76 | " 3. Done: SVM, Lasso, logistic\n", 77 | " 4. with probability: Logistic, NN\n", 78 | " 5. without probability: Tree. \n", 79 | " XGBoost, Random Forest\n", 80 | " 6. try with bagging. \n", 81 | " 7. use an ensemble (voting) model\n", 82 | " ereg = VotingRegressor(estimators=[('gb', reg1), ('rf', reg2), ('lr', reg3)])\n", 83 | " reg1 = GradientBoostingRegressor(random_state=1, n_estimators=10)\n", 84 | " reg2 = RandomForestRegressor(random_state=1, n_estimators=10)\n", 85 | " 8. 
naive bayes\n", 86 | "\n", 87 | " \n" 88 | ] 89 | }, 90 | { 91 | "cell_type": "markdown", 92 | "metadata": { 93 | "colab_type": "text", 94 | "id": "BXw7tBkZ3VMm" 95 | }, 96 | "source": [ 97 | "# preparing the data" 98 | ] 99 | }, 100 | { 101 | "cell_type": "markdown", 102 | "metadata": { 103 | "colab_type": "text", 104 | "id": "zdfQ8g4m3VMn" 105 | }, 106 | "source": [ 107 | "## adding the crime data into it" 108 | ] 109 | }, 110 | { 111 | "cell_type": "code", 112 | "execution_count": 2, 113 | "metadata": { 114 | "ExecuteTime": { 115 | "end_time": "2020-03-31T21:22:20.791856Z", 116 | "start_time": "2020-03-31T21:22:19.811882Z" 117 | }, 118 | "colab": {}, 119 | "colab_type": "code", 120 | "id": "6cb7O8xa3VM4" 121 | }, 122 | "outputs": [], 123 | "source": [ 124 | "# X=pd.read_csv()\n", 125 | "with open(data_dir+\"non_sequential_features.json\") as obj:\n", 126 | " X = json.load(obj)\n", 127 | "with open(data_dir+\"padded_sequential_features_3.json\") as obj:\n", 128 | " X_seq = json.load(obj)\n", 129 | "with open(data_dir+\"label.json\") as obj:\n", 130 | " Y = json.load(obj)" 131 | ] 132 | }, 133 | { 134 | "cell_type": "code", 135 | "execution_count": 3, 136 | "metadata": { 137 | "ExecuteTime": { 138 | "end_time": "2020-03-31T21:22:20.962236Z", 139 | "start_time": "2020-03-31T21:22:20.792717Z" 140 | }, 141 | "colab": {}, 142 | "colab_type": "code", 143 | "id": "tom-dGYV3VM7" 144 | }, 145 | "outputs": [], 146 | "source": [ 147 | "feature2=np.array([X_seq[key] for key in X_seq.keys()])" 148 | ] 149 | }, 150 | { 151 | "cell_type": "code", 152 | "execution_count": 4, 153 | "metadata": { 154 | "ExecuteTime": { 155 | "end_time": "2020-03-31T21:22:20.972234Z", 156 | "start_time": "2020-03-31T21:22:20.964231Z" 157 | }, 158 | "colab": { 159 | "base_uri": "https://localhost:8080/", 160 | "height": 34 161 | }, 162 | "colab_type": "code", 163 | "executionInfo": { 164 | "elapsed": 6303, 165 | "status": "ok", 166 | "timestamp": 1582642954215, 167 | "user": { 168 | "displayName": "Hanyu Wu", 169 | "photoUrl": "", 170 | "userId": "05588030028148674891" 171 | }, 172 | "user_tz": 300 173 | }, 174 | "id": "UlbjzHox3VM-", 175 | "outputId": "bcf22103-60b9-47b4-a8c9-c689310d1dcf" 176 | }, 177 | "outputs": [ 178 | { 179 | "data": { 180 | "text/plain": [ 181 | "(3715, 60, 18)" 182 | ] 183 | }, 184 | "execution_count": 4, 185 | "metadata": {}, 186 | "output_type": "execute_result" 187 | } 188 | ], 189 | "source": [ 190 | "feature2.shape" 191 | ] 192 | }, 193 | { 194 | "cell_type": "code", 195 | "execution_count": 5, 196 | "metadata": { 197 | "ExecuteTime": { 198 | "end_time": "2020-03-31T21:22:20.987201Z", 199 | "start_time": "2020-03-31T21:22:20.973207Z" 200 | }, 201 | "colab": {}, 202 | "colab_type": "code", 203 | "id": "h-wv8RkQ3VNC" 204 | }, 205 | "outputs": [], 206 | "source": [ 207 | "feature2 = np.concatenate(\n", 208 | " [feature2[:, :, 0][:, :, np.newaxis], feature2[:, :, 14:]], axis=-1)" 209 | ] 210 | }, 211 | { 212 | "cell_type": "code", 213 | "execution_count": 6, 214 | "metadata": { 215 | "ExecuteTime": { 216 | "end_time": "2020-03-31T21:22:20.991173Z", 217 | "start_time": "2020-03-31T21:22:20.989165Z" 218 | }, 219 | "colab": {}, 220 | "colab_type": "code", 221 | "id": "nkOgj1Kh3VNE" 222 | }, 223 | "outputs": [], 224 | "source": [ 225 | "feature2 = feature2.reshape((-1, 60*5))" 226 | ] 227 | }, 228 | { 229 | "cell_type": "code", 230 | "execution_count": 7, 231 | "metadata": { 232 | "ExecuteTime": { 233 | "end_time": "2020-03-31T21:22:21.002155Z", 234 | "start_time": "2020-03-31T21:22:20.993154Z" 235 | 
}, 236 | "colab": {}, 237 | "colab_type": "code", 238 | "id": "8hLSh7DF3VNH" 239 | }, 240 | "outputs": [], 241 | "source": [ 242 | "X = np.array([X[key] for key in X.keys()])" 243 | ] 244 | }, 245 | { 246 | "cell_type": "code", 247 | "execution_count": 8, 248 | "metadata": { 249 | "ExecuteTime": { 250 | "end_time": "2020-03-31T21:22:21.012128Z", 251 | "start_time": "2020-03-31T21:22:21.004141Z" 252 | }, 253 | "colab": {}, 254 | "colab_type": "code", 255 | "id": "I3b_BW_-3VNK" 256 | }, 257 | "outputs": [], 258 | "source": [ 259 | "X = np.concatenate([X, feature2], axis=-1)" 260 | ] 261 | }, 262 | { 263 | "cell_type": "code", 264 | "execution_count": 9, 265 | "metadata": { 266 | "ExecuteTime": { 267 | "end_time": "2020-03-31T21:22:21.016118Z", 268 | "start_time": "2020-03-31T21:22:21.013100Z" 269 | }, 270 | "colab": {}, 271 | "colab_type": "code", 272 | "id": "O_XN4mZ-3VNM" 273 | }, 274 | "outputs": [], 275 | "source": [ 276 | "Y_ = Y.values()" 277 | ] 278 | }, 279 | { 280 | "cell_type": "code", 281 | "execution_count": 10, 282 | "metadata": { 283 | "ExecuteTime": { 284 | "end_time": "2020-03-31T21:22:21.022101Z", 285 | "start_time": "2020-03-31T21:22:21.017100Z" 286 | }, 287 | "colab": {}, 288 | "colab_type": "code", 289 | "id": "K7PNQC5i3VNP" 290 | }, 291 | "outputs": [], 292 | "source": [ 293 | "Y_ = np.array(list(Y_))" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": 11, 299 | "metadata": { 300 | "ExecuteTime": { 301 | "end_time": "2020-03-31T21:22:21.027087Z", 302 | "start_time": "2020-03-31T21:22:21.023120Z" 303 | }, 304 | "colab": {}, 305 | "colab_type": "code", 306 | "id": "_tZijjTz3VNS" 307 | }, 308 | "outputs": [], 309 | "source": [ 310 | "Y_ = Y_.astype(\"int\")" 311 | ] 312 | }, 313 | { 314 | "cell_type": "code", 315 | "execution_count": 12, 316 | "metadata": { 317 | "ExecuteTime": { 318 | "end_time": "2020-03-31T21:22:21.033078Z", 319 | "start_time": "2020-03-31T21:22:21.028060Z" 320 | }, 321 | "colab": {}, 322 | "colab_type": "code", 323 | "id": "qhDJTQkd3VNV" 324 | }, 325 | "outputs": [], 326 | "source": [ 327 | "X_ = X" 328 | ] 329 | }, 330 | { 331 | "cell_type": "code", 332 | "execution_count": 13, 333 | "metadata": { 334 | "ExecuteTime": { 335 | "end_time": "2020-03-31T21:22:21.039060Z", 336 | "start_time": "2020-03-31T21:22:21.034044Z" 337 | }, 338 | "colab": { 339 | "base_uri": "https://localhost:8080/", 340 | "height": 34 341 | }, 342 | "colab_type": "code", 343 | "executionInfo": { 344 | "elapsed": 6391, 345 | "status": "ok", 346 | "timestamp": 1582642954368, 347 | "user": { 348 | "displayName": "Hanyu Wu", 349 | "photoUrl": "", 350 | "userId": "05588030028148674891" 351 | }, 352 | "user_tz": 300 353 | }, 354 | "id": "6e7sqwRE3VNX", 355 | "outputId": "3e7323df-a64e-4271-ec07-9f6fed718264" 356 | }, 357 | "outputs": [ 358 | { 359 | "data": { 360 | "text/plain": [ 361 | "((3715, 314), (3715,))" 362 | ] 363 | }, 364 | "execution_count": 13, 365 | "metadata": {}, 366 | "output_type": "execute_result" 367 | } 368 | ], 369 | "source": [ 370 | "X_.shape, Y_.shape" 371 | ] 372 | }, 373 | { 374 | "cell_type": "code", 375 | "execution_count": 44, 376 | "metadata": { 377 | "ExecuteTime": { 378 | "end_time": "2020-03-31T21:24:36.181964Z", 379 | "start_time": "2020-03-31T21:24:36.172988Z" 380 | }, 381 | "colab": {}, 382 | "colab_type": "code", 383 | "id": "Qp_NV0Ea3VNZ" 384 | }, 385 | "outputs": [], 386 | "source": [ 387 | "index = np.arange(len(X_))\n", 388 | "X, X_test_2019, Y, Y_test_2019 = sklearn.model_selection.train_test_split(X_[:100000], 
Y_[:100000], test_size=0.25)" 389 | ] 390 | }, 391 | { 392 | "cell_type": "markdown", 393 | "metadata": { 394 | "colab_type": "text", 395 | "id": "y8zmU96y3VNb" 396 | }, 397 | "source": [ 398 | "# parameters" 399 | ] 400 | }, 401 | { 402 | "cell_type": "code", 403 | "execution_count": 15, 404 | "metadata": { 405 | "ExecuteTime": { 406 | "end_time": "2020-03-31T21:22:21.054988Z", 407 | "start_time": "2020-03-31T21:22:21.050999Z" 408 | }, 409 | "colab": {}, 410 | "colab_type": "code", 411 | "id": "qqjcBZKL3VNj" 412 | }, 413 | "outputs": [], 414 | "source": [ 415 | "def get_KS(y_prob, y_true):\n", 416 | " '''\n", 417 | "\n", 418 | " return the best threshold, maximum ks\n", 419 | "\n", 420 | " '''\n", 421 | "\n", 422 | " fpr, tpr, threshold = roc_curve(y_true, y_prob)\n", 423 | " ks = (tpr-fpr)\n", 424 | " max_ = np.argmax(ks)\n", 425 | "\n", 426 | " return threshold[max_], np.max(ks)" 427 | ] 428 | }, 429 | { 430 | "cell_type": "markdown", 431 | "metadata": { 432 | "colab_type": "text", 433 | "id": "gKm62lEe3VNl" 434 | }, 435 | "source": [ 436 | "# Test on important features" 437 | ] 438 | }, 439 | { 440 | "cell_type": "code", 441 | "execution_count": 16, 442 | "metadata": { 443 | "ExecuteTime": { 444 | "end_time": "2020-03-31T21:22:21.060973Z", 445 | "start_time": "2020-03-31T21:22:21.055998Z" 446 | }, 447 | "colab": {}, 448 | "colab_type": "code", 449 | "id": "O5zG_hPP3VNo" 450 | }, 451 | "outputs": [], 452 | "source": [ 453 | "class Get_best(BaseEstimator, ClassifierMixin):\n", 454 | " def __init__(self, model, threshold):\n", 455 | " self.model = model\n", 456 | " self.threshold = threshold\n", 457 | "\n", 458 | " def fit(self, X, Y):\n", 459 | " self.model.fit(X, Y)\n", 460 | " return self\n", 461 | "\n", 462 | " def predict_proba(self, X):\n", 463 | " self.proba = self.model.predict_proba(X)\n", 464 | " return self.proba\n", 465 | "\n", 466 | " def predict(self, X):\n", 467 | " return self.predict_proba(X)[:, 1] >= self.threshold" 468 | ] 469 | }, 470 | { 471 | "cell_type": "code", 472 | "execution_count": 17, 473 | "metadata": { 474 | "ExecuteTime": { 475 | "end_time": "2020-03-31T21:22:21.069947Z", 476 | "start_time": "2020-03-31T21:22:21.061969Z" 477 | }, 478 | "colab": {}, 479 | "colab_type": "code", 480 | "id": "WxCwCfoK3VNq" 481 | }, 482 | "outputs": [], 483 | "source": [ 484 | "class selection_and_KS:\n", 485 | " def __init__(self, model, parameter, model_name=None):\n", 486 | " self.model = model\n", 487 | " self.parameter = parameter\n", 488 | " self.scoring = {'AUC': 'roc_auc'}\n", 489 | " self.refit_first = 'AUC'\n", 490 | " self.cv = 5\n", 491 | " self.clf = RandomizedSearchCV(self.model, param_distributions=self.parameter,\n", 492 | " cv=self.cv, scoring=self.scoring, refit=self.refit_first)\n", 493 | " self.model_name = model_name\n", 494 | "\n", 495 | " def fit(self, X, Y):\n", 496 | " if self.model_name == \"SVM\":\n", 497 | " self.scoring = {'AUC': 'accuracy'}\n", 498 | " self.clf.fit(X, Y)\n", 499 | " self.best_model = self.clf.best_estimator_\n", 500 | " if self.model_name == \"SVM\":\n", 501 | " self.best_model = self.best_model.set_params(probability=True)\n", 502 | " self.best_model.fit(X, Y)\n", 503 | " self.threshold, self.ks = get_KS(\n", 504 | " self.best_model.predict_proba(X)[:, 1], Y)\n", 505 | " print(\"ks-value is: {}\".format(self.ks))\n", 506 | " self.accuracy = np.mean((self.best_model.predict(X) == Y))\n", 507 | " print(self.clf.cv_results_['params'][self.clf.best_index_])\n", 508 | " return self\n", 509 | "\n", 510 | " def return_best(self):\n", 511 
| " return Get_best(model=self.best_model, threshold=self.threshold)" 512 | ] 513 | }, 514 | { 515 | "cell_type": "code", 516 | "execution_count": 18, 517 | "metadata": { 518 | "ExecuteTime": { 519 | "end_time": "2020-03-31T21:22:21.091919Z", 520 | "start_time": "2020-03-31T21:22:21.070956Z" 521 | }, 522 | "colab": {}, 523 | "colab_type": "code", 524 | "id": "6lpOrXc33VNs" 525 | }, 526 | "outputs": [], 527 | "source": [ 528 | "from sklearn.ensemble import RandomForestClassifier, VotingClassifier" 529 | ] 530 | }, 531 | { 532 | "cell_type": "code", 533 | "execution_count": 19, 534 | "metadata": { 535 | "ExecuteTime": { 536 | "end_time": "2020-03-31T21:22:21.096876Z", 537 | "start_time": "2020-03-31T21:22:21.092901Z" 538 | }, 539 | "colab": {}, 540 | "colab_type": "code", 541 | "id": "Q9B1zVZf3VNw" 542 | }, 543 | "outputs": [], 544 | "source": [ 545 | "from sklearn.model_selection import RandomizedSearchCV" 546 | ] 547 | }, 548 | { 549 | "cell_type": "markdown", 550 | "metadata": { 551 | "colab_type": "text", 552 | "id": "9VPdpp3d3VN1" 553 | }, 554 | "source": [ 555 | "# base model" 556 | ] 557 | }, 558 | { 559 | "cell_type": "markdown", 560 | "metadata": { 561 | "colab_type": "text", 562 | "id": "b3WqAto03VN2" 563 | }, 564 | "source": [ 565 | "## for voting ensembling" 566 | ] 567 | }, 568 | { 569 | "cell_type": "markdown", 570 | "metadata": { 571 | "colab_type": "text", 572 | "id": "EyVIQ-Wf3VN3" 573 | }, 574 | "source": [ 575 | "## logistics Ridge" 576 | ] 577 | }, 578 | { 579 | "cell_type": "code", 580 | "execution_count": 20, 581 | "metadata": { 582 | "ExecuteTime": { 583 | "end_time": "2020-03-31T21:22:21.102859Z", 584 | "start_time": "2020-03-31T21:22:21.097873Z" 585 | }, 586 | "colab": {}, 587 | "colab_type": "code", 588 | "id": "XsH_Da3z3VN3" 589 | }, 590 | "outputs": [], 591 | "source": [ 592 | "model_dict = {}\n", 593 | "model_acc = {}\n", 594 | "model_ks = {}\n", 595 | "X_dict = {}\n", 596 | "Y_dict = {}" 597 | ] 598 | }, 599 | { 600 | "cell_type": "markdown", 601 | "metadata": { 602 | "colab_type": "text", 603 | "id": "RMrR2vb83VN5" 604 | }, 605 | "source": [ 606 | "## SVM" 607 | ] 608 | }, 609 | { 610 | "cell_type": "markdown", 611 | "metadata": { 612 | "colab_type": "text", 613 | "id": "uwjALGRF3VN7" 614 | }, 615 | "source": [ 616 | "our dataset is so big that SVM is not appropriate here." 
617 | ] 618 | }, 619 | { 620 | "cell_type": "code", 621 | "execution_count": 21, 622 | "metadata": { 623 | "ExecuteTime": { 624 | "end_time": "2020-03-31T21:22:21.108844Z", 625 | "start_time": "2020-03-31T21:22:21.103858Z" 626 | }, 627 | "colab": {}, 628 | "colab_type": "code", 629 | "id": "bW09yf6F3VN8" 630 | }, 631 | "outputs": [], 632 | "source": [ 633 | "from sklearn.ensemble import BaggingClassifier, RandomForestClassifier" 634 | ] 635 | }, 636 | { 637 | "cell_type": "markdown", 638 | "metadata": { 639 | "colab_type": "text", 640 | "id": "zUIO_UMD3VN-" 641 | }, 642 | "source": [ 643 | "## LDA&QDA" 644 | ] 645 | }, 646 | { 647 | "cell_type": "code", 648 | "execution_count": 22, 649 | "metadata": { 650 | "ExecuteTime": { 651 | "end_time": "2020-03-31T21:22:21.119838Z", 652 | "start_time": "2020-03-31T21:22:21.112832Z" 653 | }, 654 | "colab": {}, 655 | "colab_type": "code", 656 | "id": "XtTIVRAV3VN_" 657 | }, 658 | "outputs": [], 659 | "source": [ 660 | "from sklearn.discriminant_analysis import LinearDiscriminantAnalysis,QuadraticDiscriminantAnalysis" 661 | ] 662 | }, 663 | { 664 | "cell_type": "markdown", 665 | "metadata": { 666 | "colab_type": "text", 667 | "id": "HYYHrHGD3VOB" 668 | }, 669 | "source": [ 670 | "### LDA" 671 | ] 672 | }, 673 | { 674 | "cell_type": "code", 675 | "execution_count": 23, 676 | "metadata": { 677 | "ExecuteTime": { 678 | "end_time": "2020-03-31T21:22:23.422742Z", 679 | "start_time": "2020-03-31T21:22:21.121810Z" 680 | }, 681 | "colab": { 682 | "base_uri": "https://localhost:8080/", 683 | "height": 71 684 | }, 685 | "colab_type": "code", 686 | "executionInfo": { 687 | "elapsed": 9579, 688 | "status": "ok", 689 | "timestamp": 1582642957648, 690 | "user": { 691 | "displayName": "Hanyu Wu", 692 | "photoUrl": "", 693 | "userId": "05588030028148674891" 694 | }, 695 | "user_tz": 300 696 | }, 697 | "id": "uRZ3Q9Nj3VOC", 698 | "outputId": "64b91e10-92e2-483e-83ac-5d18ae06f06e" 699 | }, 700 | "outputs": [ 701 | { 702 | "name": "stdout", 703 | "output_type": "stream", 704 | "text": [ 705 | "ks-value is: 0.5145390978084762\n", 706 | "{'solver': 'svd'}\n" 707 | ] 708 | } 709 | ], 710 | "source": [ 711 | "LDA = LinearDiscriminantAnalysis()\n", 712 | "search_parameter = {'solver': ['svd', 'lsqr']}\n", 713 | "m_LDA = selection_and_KS(LDA, parameter=search_parameter)\n", 714 | "m_LDA.fit(X, Y)\n", 715 | "model_dict[\"LDA\"] = m_LDA.return_best()\n", 716 | "model_acc[\"LDA\"] = m_LDA.accuracy\n", 717 | "model_ks[\"LDA\"] = m_LDA.ks\n", 718 | "X_dict[\"LDA\"] = X\n", 719 | "Y_dict[\"LDA\"] = Y" 720 | ] 721 | }, 722 | { 723 | "cell_type": "markdown", 724 | "metadata": { 725 | "colab_type": "text", 726 | "id": "ybLBTPVx3VOF" 727 | }, 728 | "source": [ 729 | "### QDA" 730 | ] 731 | }, 732 | { 733 | "cell_type": "code", 734 | "execution_count": 24, 735 | "metadata": { 736 | "ExecuteTime": { 737 | "end_time": "2020-03-31T21:22:25.141141Z", 738 | "start_time": "2020-03-31T21:22:23.424731Z" 739 | }, 740 | "colab": { 741 | "base_uri": "https://localhost:8080/", 742 | "height": 71 743 | }, 744 | "colab_type": "code", 745 | "executionInfo": { 746 | "elapsed": 10967, 747 | "status": "ok", 748 | "timestamp": 1582642959041, 749 | "user": { 750 | "displayName": "Hanyu Wu", 751 | "photoUrl": "", 752 | "userId": "05588030028148674891" 753 | }, 754 | "user_tz": 300 755 | }, 756 | "id": "tNDdJ61T3VOG", 757 | "outputId": "42645cd5-580d-4231-e539-2f26ff442130" 758 | }, 759 | "outputs": [ 760 | { 761 | "name": "stdout", 762 | "output_type": "stream", 763 | "text": [ 764 | "ks-value is: 1.0\n", 765 
| "{}\n" 766 | ] 767 | } 768 | ], 769 | "source": [ 770 | "QDA = QuadraticDiscriminantAnalysis()\n", 771 | "search_parameter = {}\n", 772 | "m_QDA = selection_and_KS(QDA, parameter=search_parameter)\n", 773 | "m_QDA.fit(X, Y)\n", 774 | "model_dict[\"QDA\"] = m_QDA.return_best()\n", 775 | "model_acc[\"QDA\"] = m_QDA.accuracy\n", 776 | "model_ks[\"QDA\"] = m_QDA.ks\n", 777 | "X_dict[\"QDA\"] = X\n", 778 | "Y_dict[\"QDA\"] = Y" 779 | ] 780 | }, 781 | { 782 | "cell_type": "markdown", 783 | "metadata": { 784 | "colab_type": "text", 785 | "id": "vtRQgAF73VOJ" 786 | }, 787 | "source": [ 788 | "## Tree model: Random_forest, XGboost" 789 | ] 790 | }, 791 | { 792 | "cell_type": "markdown", 793 | "metadata": { 794 | "colab_type": "text", 795 | "id": "_TSXE7tQ3VOJ" 796 | }, 797 | "source": [ 798 | "### random_forest" 799 | ] 800 | }, 801 | { 802 | "cell_type": "code", 803 | "execution_count": 25, 804 | "metadata": { 805 | "ExecuteTime": { 806 | "end_time": "2020-03-31T21:22:43.868247Z", 807 | "start_time": "2020-03-31T21:22:25.142139Z" 808 | }, 809 | "colab": { 810 | "base_uri": "https://localhost:8080/", 811 | "height": 51 812 | }, 813 | "colab_type": "code", 814 | "executionInfo": { 815 | "elapsed": 39810, 816 | "status": "ok", 817 | "timestamp": 1582642987890, 818 | "user": { 819 | "displayName": "Hanyu Wu", 820 | "photoUrl": "", 821 | "userId": "05588030028148674891" 822 | }, 823 | "user_tz": 300 824 | }, 825 | "id": "YpusoBKa3VOK", 826 | "outputId": "0019553a-dfbb-42a8-b498-3a6c09056655" 827 | }, 828 | "outputs": [ 829 | { 830 | "name": "stdout", 831 | "output_type": "stream", 832 | "text": [ 833 | "ks-value is: 1.0\n", 834 | "{'max_features': 9, 'max_depth': 28}\n" 835 | ] 836 | } 837 | ], 838 | "source": [ 839 | "params = {\"max_depth\": [i for i in range(\n", 840 | " 2, 32)], 'max_features': range(3, 11, 2)}\n", 841 | "model = RandomForestClassifier()\n", 842 | "random_forest = selection_and_KS(model, parameter=params)\n", 843 | "random_forest.fit(X, Y)\n", 844 | "model_dict[\"tree_random_forest\"] = random_forest.return_best()\n", 845 | "model_acc[\"tree_random_forest\"] = random_forest.accuracy\n", 846 | "model_ks[\"tree_random_forest\"] = random_forest.ks\n", 847 | "X_dict[\"tree_random_forest\"] = X\n", 848 | "Y_dict[\"tree_random_forest\"] = Y" 849 | ] 850 | }, 851 | { 852 | "cell_type": "markdown", 853 | "metadata": { 854 | "colab_type": "text", 855 | "id": "frNclZkO3VOM" 856 | }, 857 | "source": [ 858 | "### XGBOOST" 859 | ] 860 | }, 861 | { 862 | "cell_type": "code", 863 | "execution_count": 26, 864 | "metadata": { 865 | "ExecuteTime": { 866 | "end_time": "2020-03-31T21:22:43.887168Z", 867 | "start_time": "2020-03-31T21:22:43.869216Z" 868 | }, 869 | "colab": {}, 870 | "colab_type": "code", 871 | "id": "UnWLvR2-3VON" 872 | }, 873 | "outputs": [], 874 | "source": [ 875 | "from xgboost.sklearn import XGBClassifier" 876 | ] 877 | }, 878 | { 879 | "cell_type": "code", 880 | "execution_count": 27, 881 | "metadata": { 882 | "ExecuteTime": { 883 | "end_time": "2020-03-31T21:23:03.122578Z", 884 | "start_time": "2020-03-31T21:22:43.888166Z" 885 | }, 886 | "colab": { 887 | "base_uri": "https://localhost:8080/", 888 | "height": 51 889 | }, 890 | "colab_type": "code", 891 | "executionInfo": { 892 | "elapsed": 153599, 893 | "status": "ok", 894 | "timestamp": 1582643101689, 895 | "user": { 896 | "displayName": "Hanyu Wu", 897 | "photoUrl": "", 898 | "userId": "05588030028148674891" 899 | }, 900 | "user_tz": 300 901 | }, 902 | "id": "66joGxBe3VOP", 903 | "outputId": 
"4fb52298-22d8-4d30-b7d4-c042d9bd9231" 904 | }, 905 | "outputs": [ 906 | { 907 | "name": "stdout", 908 | "output_type": "stream", 909 | "text": [ 910 | "ks-value is: 0.9171825701372901\n", 911 | "{'max_features': 16, 'max_depth': 32, 'learning_rate': 0.01}\n" 912 | ] 913 | } 914 | ], 915 | "source": [ 916 | "params = {\"max_depth\": [i for i in [32]],\n", 917 | " 'max_features': [16], 'learning_rate': [0.01]}\n", 918 | "model = XGBClassifier()\n", 919 | "tree_XGBOOST = selection_and_KS(model, parameter=params)\n", 920 | "tree_XGBOOST.fit(X, Y)\n", 921 | "model_dict[\"tree_XGBOOST\"] = tree_XGBOOST.return_best()\n", 922 | "model_acc[\"tree_XGBOOST\"] = tree_XGBOOST.accuracy\n", 923 | "model_ks[\"tree_XGBOOST\"] = tree_XGBOOST.ks\n", 924 | "X_dict[\"tree_XGBOOST\"] = X\n", 925 | "Y_dict[\"tree_XGBOOST\"] = Y" 926 | ] 927 | }, 928 | { 929 | "cell_type": "markdown", 930 | "metadata": { 931 | "colab_type": "text", 932 | "id": "N_2dOUyo3VOS" 933 | }, 934 | "source": [ 935 | "# soft_voting" 936 | ] 937 | }, 938 | { 939 | "cell_type": "code", 940 | "execution_count": 28, 941 | "metadata": { 942 | "ExecuteTime": { 943 | "end_time": "2020-03-31T21:23:03.150504Z", 944 | "start_time": "2020-03-31T21:23:03.124573Z" 945 | }, 946 | "colab": {}, 947 | "colab_type": "code", 948 | "id": "3mIjvEWx3VOS" 949 | }, 950 | "outputs": [], 951 | "source": [ 952 | "class soft_voting:\n", 953 | " def __init__(self,model_dict,X_dict,Y_dict):\n", 954 | " self.model_dict=model_dict\n", 955 | " self.X_dict=X_dict\n", 956 | " self.Y_dict=Y_dict\n", 957 | " def train(self):\n", 958 | " for key in self.model_dict:\n", 959 | " self.model_dict[key].fit(self.X_dict[key],Y_dict[key])\n", 960 | " y_=self.model_dict[key].predict(self.X_dict[key])\n", 961 | " tn, fp, fn, tp = confusion_matrix(Y_dict[key],y_).ravel()\n", 962 | " \n", 963 | " print(\"in sample for model: {}; fpr={}, tpr={}\".format(key,fp/(fp+tn),tp/(tp+fn)))\n", 964 | " \n", 965 | " def predict_proba(self,X_dict,weight,seperate=\"Tree\"):\n", 966 | " non_tree=np.sum([weight[key] for key in weight if \"tree\" not in key])\n", 967 | " tree=np.sum([weight[key] for key in weight if \"tree\" in key])\n", 968 | " scaler=tree/(non_tree+1e-11)\n", 969 | " if seperate=='Tree':\n", 970 | " for key in X_dict:\n", 971 | " frame=np.zeros(shape=(X_dict[key].shape[0],2))\n", 972 | "# print(key)\n", 973 | " if \"tree\" not in key:\n", 974 | " frame=frame+self.model_dict[key].predict_proba(X_dict[key])*scaler\n", 975 | " else:\n", 976 | " frame=frame+self.model_dict[key].predict_proba(X_dict[key])\n", 977 | " else:\n", 978 | " for key in X_dict:\n", 979 | " frame=np.zeros(shape=(X_dict[key].shape[0],2))\n", 980 | " frame=frame+self.model_dict[key].predict_proba(X_dict[key])\n", 981 | " return frame\n", 982 | " def predict(self,X_dict,weight,seperate=\"Tree\"):\n", 983 | " return np.argmax(self.predict_proba(X_dict,weight,seperate),axis=1)\n", 984 | " \n", 985 | " \n", 986 | " def compare(self,X_dict,Y_dict):\n", 987 | " for key in self.model_dict:\n", 988 | " y_=self.model_dict[key].predict(X_dict[key])\n", 989 | " tn, fp, fn, tp = confusion_matrix(Y_dict[key],y_).ravel()\n", 990 | " print(\"in sample for model: {}; fpr={}, tpr={}, acc={}\".format(key,fp/(fp+tn),tp/(tp+fn),np.mean(Y_dict[key]==y_)))\n", 991 | " y_proba=self.model_dict[key].predict_proba(X_dict[key])\n", 992 | " _,ks=get_KS(y_proba[:,1],Y_dict[key])\n", 993 | " print(\"ks for\"+key+\"is :\"+str(ks)) " 994 | ] 995 | }, 996 | { 997 | "cell_type": "code", 998 | "execution_count": 29, 999 | "metadata": { 1000 | 
"ExecuteTime": { 1001 | "end_time": "2020-03-31T21:23:03.156488Z", 1002 | "start_time": "2020-03-31T21:23:03.152498Z" 1003 | }, 1004 | "colab": {}, 1005 | "colab_type": "code", 1006 | "id": "wUyJPxot3VOU" 1007 | }, 1008 | "outputs": [], 1009 | "source": [ 1010 | "final_model=soft_voting(model_dict,X_dict,Y_dict)" 1011 | ] 1012 | }, 1013 | { 1014 | "cell_type": "code", 1015 | "execution_count": 30, 1016 | "metadata": { 1017 | "ExecuteTime": { 1018 | "end_time": "2020-03-31T21:23:07.660275Z", 1019 | "start_time": "2020-03-31T21:23:03.158483Z" 1020 | }, 1021 | "colab": { 1022 | "base_uri": "https://localhost:8080/", 1023 | "height": 85 1024 | }, 1025 | "colab_type": "code", 1026 | "executionInfo": { 1027 | "elapsed": 176163, 1028 | "status": "ok", 1029 | "timestamp": 1582643124277, 1030 | "user": { 1031 | "displayName": "Hanyu Wu", 1032 | "photoUrl": "", 1033 | "userId": "05588030028148674891" 1034 | }, 1035 | "user_tz": 300 1036 | }, 1037 | "id": "BJKKl10G3VOW", 1038 | "outputId": "cb7e5090-3f85-4d34-bd55-8e1b93d03ef9" 1039 | }, 1040 | "outputs": [ 1041 | { 1042 | "name": "stdout", 1043 | "output_type": "stream", 1044 | "text": [ 1045 | "in sample for model: LDA; fpr=0.2632386799693016, tpr=0.7777777777777778\n", 1046 | "in sample for model: QDA; fpr=0.0, tpr=1.0\n", 1047 | "in sample for model: tree_random_forest; fpr=0.0, tpr=0.9888888888888889\n", 1048 | "in sample for model: tree_XGBOOST; fpr=0.03837298541826554, tpr=0.9555555555555556\n" 1049 | ] 1050 | } 1051 | ], 1052 | "source": [ 1053 | "final_model.train()" 1054 | ] 1055 | }, 1056 | { 1057 | "cell_type": "code", 1058 | "execution_count": 31, 1059 | "metadata": { 1060 | "ExecuteTime": { 1061 | "end_time": "2020-03-31T21:23:07.665261Z", 1062 | "start_time": "2020-03-31T21:23:07.662270Z" 1063 | }, 1064 | "colab": {}, 1065 | "colab_type": "code", 1066 | "id": "DrszdZIy3VOZ" 1067 | }, 1068 | "outputs": [], 1069 | "source": [ 1070 | "# X_test_2019,Y_test_2019=X_test_2019[:2000],Y_test_2019[:2000]\n", 1071 | "# X_test_2019_non,Y_test_2019_non=X_test_2019_non[:2000],Y_test_2019_non[:2000]" 1072 | ] 1073 | }, 1074 | { 1075 | "cell_type": "code", 1076 | "execution_count": 32, 1077 | "metadata": { 1078 | "ExecuteTime": { 1079 | "end_time": "2020-03-31T21:23:07.672243Z", 1080 | "start_time": "2020-03-31T21:23:07.666259Z" 1081 | }, 1082 | "colab": {}, 1083 | "colab_type": "code", 1084 | "id": "K9gF4PhK3VOb" 1085 | }, 1086 | "outputs": [], 1087 | "source": [ 1088 | "Y_test_data={}\n", 1089 | "X_test_data={}\n", 1090 | "for i in X_dict:\n", 1091 | " if \"tree\" in i:\n", 1092 | " Y_test_data[i]=Y_test_2019\n", 1093 | " X_test_data[i]=X_test_2019\n", 1094 | " else:\n", 1095 | " Y_test_data[i]=Y_test_2019\n", 1096 | " X_test_data[i]=X_test_2019" 1097 | ] 1098 | }, 1099 | { 1100 | "cell_type": "markdown", 1101 | "metadata": { 1102 | "colab_type": "text", 1103 | "id": "crxwQeQN3VOd" 1104 | }, 1105 | "source": [ 1106 | "# with the model trained before, lets select" 1107 | ] 1108 | }, 1109 | { 1110 | "cell_type": "markdown", 1111 | "metadata": { 1112 | "colab_type": "text", 1113 | "id": "t4Rhqkh_3VOe" 1114 | }, 1115 | "source": [ 1116 | "## First lets see in-sample metrcs" 1117 | ] 1118 | }, 1119 | { 1120 | "cell_type": "markdown", 1121 | "metadata": { 1122 | "colab_type": "text", 1123 | "id": "HQfaobUj3VOe" 1124 | }, 1125 | "source": [ 1126 | "### imbalance weight:tree=50%, linear=50%" 1127 | ] 1128 | }, 1129 | { 1130 | "cell_type": "code", 1131 | "execution_count": 33, 1132 | "metadata": { 1133 | "ExecuteTime": { 1134 | "end_time": 
"2020-03-31T21:23:07.926563Z", 1135 | "start_time": "2020-03-31T21:23:07.674238Z" 1136 | }, 1137 | "colab": { 1138 | "base_uri": "https://localhost:8080/", 1139 | "height": 54 1140 | }, 1141 | "colab_type": "code", 1142 | "executionInfo": { 1143 | "elapsed": 176263, 1144 | "status": "ok", 1145 | "timestamp": 1582643124398, 1146 | "user": { 1147 | "displayName": "Hanyu Wu", 1148 | "photoUrl": "", 1149 | "userId": "05588030028148674891" 1150 | }, 1151 | "user_tz": 300 1152 | }, 1153 | "id": "u2AbHbG03VOf", 1154 | "outputId": "fd426f42-a44e-49a2-bce0-07e9d3e3a6d1" 1155 | }, 1156 | "outputs": [ 1157 | { 1158 | "name": "stdout", 1159 | "output_type": "stream", 1160 | "text": [ 1161 | "in sample for model: voting; fpr=0.03518518518518519, tpr=0.9883720930232558, acc=0.9655419956927495\n" 1162 | ] 1163 | } 1164 | ], 1165 | "source": [ 1166 | "proba=final_model.predict_proba(X_dict,weight=model_ks)\n", 1167 | "predict=final_model.predict(X_dict,weight=model_ks)\n", 1168 | "tn, fp, fn, tp = confusion_matrix(predict,Y).ravel()\n", 1169 | "print(\"in sample for model: {}; fpr={}, tpr={}, acc={}\".format(\"voting\",fp/(fp+tn),tp/(tp+fn),np.mean(predict==Y)))" 1170 | ] 1171 | }, 1172 | { 1173 | "cell_type": "code", 1174 | "execution_count": 34, 1175 | "metadata": { 1176 | "ExecuteTime": { 1177 | "end_time": "2020-03-31T21:23:07.935540Z", 1178 | "start_time": "2020-03-31T21:23:07.928558Z" 1179 | }, 1180 | "colab": { 1181 | "base_uri": "https://localhost:8080/", 1182 | "height": 71 1183 | }, 1184 | "colab_type": "code", 1185 | "executionInfo": { 1186 | "elapsed": 176259, 1187 | "status": "ok", 1188 | "timestamp": 1582643124399, 1189 | "user": { 1190 | "displayName": "Hanyu Wu", 1191 | "photoUrl": "", 1192 | "userId": "05588030028148674891" 1193 | }, 1194 | "user_tz": 300 1195 | }, 1196 | "id": "ql9MCQJC3VOh", 1197 | "outputId": "485b33d0-9f17-4291-db64-f17c62b6cdb3" 1198 | }, 1199 | "outputs": [ 1200 | { 1201 | "data": { 1202 | "text/plain": [ 1203 | "array([0.19244359, 0.24120206, 0.19441538, ..., 0.22799245, 0.20492069,\n", 1204 | " 0.19328833])" 1205 | ] 1206 | }, 1207 | "execution_count": 34, 1208 | "metadata": {}, 1209 | "output_type": "execute_result" 1210 | } 1211 | ], 1212 | "source": [ 1213 | "proba[:,1]" 1214 | ] 1215 | }, 1216 | { 1217 | "cell_type": "code", 1218 | "execution_count": 35, 1219 | "metadata": { 1220 | "ExecuteTime": { 1221 | "end_time": "2020-03-31T21:23:07.944516Z", 1222 | "start_time": "2020-03-31T21:23:07.937534Z" 1223 | }, 1224 | "colab": { 1225 | "base_uri": "https://localhost:8080/", 1226 | "height": 54 1227 | }, 1228 | "colab_type": "code", 1229 | "executionInfo": { 1230 | "elapsed": 176253, 1231 | "status": "ok", 1232 | "timestamp": 1582643124399, 1233 | "user": { 1234 | "displayName": "Hanyu Wu", 1235 | "photoUrl": "", 1236 | "userId": "05588030028148674891" 1237 | }, 1238 | "user_tz": 300 1239 | }, 1240 | "id": "JTEXp6HD3VOj", 1241 | "outputId": "01a7e489-b5c2-451f-a12b-7f1a230a3c8f" 1242 | }, 1243 | "outputs": [ 1244 | { 1245 | "data": { 1246 | "text/plain": [ 1247 | "(0.27197638154029846, 0.9171825701372901)" 1248 | ] 1249 | }, 1250 | "execution_count": 35, 1251 | "metadata": {}, 1252 | "output_type": "execute_result" 1253 | } 1254 | ], 1255 | "source": [ 1256 | "get_KS(proba[:,1],Y)" 1257 | ] 1258 | }, 1259 | { 1260 | "cell_type": "markdown", 1261 | "metadata": { 1262 | "colab_type": "text", 1263 | "id": "A0LOQVbN3VOl" 1264 | }, 1265 | "source": [ 1266 | "### balance weight" 1267 | ] 1268 | }, 1269 | { 1270 | "cell_type": "code", 1271 | "execution_count": 36, 1272 | 
"metadata": { 1273 | "ExecuteTime": { 1274 | "end_time": "2020-03-31T21:23:08.198835Z", 1275 | "start_time": "2020-03-31T21:23:07.946510Z" 1276 | }, 1277 | "colab": { 1278 | "base_uri": "https://localhost:8080/", 1279 | "height": 54 1280 | }, 1281 | "colab_type": "code", 1282 | "executionInfo": { 1283 | "elapsed": 176695, 1284 | "status": "ok", 1285 | "timestamp": 1582643124849, 1286 | "user": { 1287 | "displayName": "Hanyu Wu", 1288 | "photoUrl": "", 1289 | "userId": "05588030028148674891" 1290 | }, 1291 | "user_tz": 300 1292 | }, 1293 | "id": "NgioQqux3VOm", 1294 | "outputId": "d92d15ae-6779-4587-bc1f-91a50d696534" 1295 | }, 1296 | "outputs": [ 1297 | { 1298 | "name": "stdout", 1299 | "output_type": "stream", 1300 | "text": [ 1301 | "in sample for model: voting; fpr=0.03518518518518519, tpr=0.9883720930232558, acc=0.9655419956927495\n" 1302 | ] 1303 | } 1304 | ], 1305 | "source": [ 1306 | "proba = final_model.predict_proba(X_dict, weight=model_ks, seperate=\"None\")\n", 1307 | "predict = final_model.predict(X_dict, weight=model_ks, seperate=\"None\")\n", 1308 | "tn, fp, fn, tp = confusion_matrix(predict, Y).ravel()\n", 1309 | "print(\"in sample for model: {}; fpr={}, tpr={}, acc={}\".format(\n", 1310 | " \"voting\", fp/(fp+tn), tp/(tp+fn), np.mean(predict == Y)))" 1311 | ] 1312 | }, 1313 | { 1314 | "cell_type": "code", 1315 | "execution_count": 37, 1316 | "metadata": { 1317 | "ExecuteTime": { 1318 | "end_time": "2020-03-31T21:23:08.206814Z", 1319 | "start_time": "2020-03-31T21:23:08.199833Z" 1320 | }, 1321 | "colab": { 1322 | "base_uri": "https://localhost:8080/", 1323 | "height": 54 1324 | }, 1325 | "colab_type": "code", 1326 | "executionInfo": { 1327 | "elapsed": 176690, 1328 | "status": "ok", 1329 | "timestamp": 1582643124850, 1330 | "user": { 1331 | "displayName": "Hanyu Wu", 1332 | "photoUrl": "", 1333 | "userId": "05588030028148674891" 1334 | }, 1335 | "user_tz": 300 1336 | }, 1337 | "id": "Fcx7BS-93VOp", 1338 | "outputId": "5d2df000-8757-406a-fbd5-28b8c9bd284f" 1339 | }, 1340 | "outputs": [ 1341 | { 1342 | "data": { 1343 | "text/plain": [ 1344 | "(0.27197638154029846, 0.9171825701372901)" 1345 | ] 1346 | }, 1347 | "execution_count": 37, 1348 | "metadata": {}, 1349 | "output_type": "execute_result" 1350 | } 1351 | ], 1352 | "source": [ 1353 | "get_KS(proba[:,1],Y)" 1354 | ] 1355 | }, 1356 | { 1357 | "cell_type": "markdown", 1358 | "metadata": { 1359 | "colab_type": "text", 1360 | "id": "zRTJ6paL3VOs" 1361 | }, 1362 | "source": [ 1363 | "### compare" 1364 | ] 1365 | }, 1366 | { 1367 | "cell_type": "code", 1368 | "execution_count": 38, 1369 | "metadata": { 1370 | "ExecuteTime": { 1371 | "end_time": "2020-03-31T21:23:08.475097Z", 1372 | "start_time": "2020-03-31T21:23:08.208808Z" 1373 | }, 1374 | "colab": { 1375 | "base_uri": "https://localhost:8080/", 1376 | "height": 173 1377 | }, 1378 | "colab_type": "code", 1379 | "executionInfo": { 1380 | "elapsed": 177108, 1381 | "status": "ok", 1382 | "timestamp": 1582643125274, 1383 | "user": { 1384 | "displayName": "Hanyu Wu", 1385 | "photoUrl": "", 1386 | "userId": "05588030028148674891" 1387 | }, 1388 | "user_tz": 300 1389 | }, 1390 | "id": "S9a57dFK3VOs", 1391 | "outputId": "653e3c7c-9e68-40f9-e5c9-e13c42ecbc68" 1392 | }, 1393 | "outputs": [ 1394 | { 1395 | "name": "stdout", 1396 | "output_type": "stream", 1397 | "text": [ 1398 | "in sample for model: LDA; fpr=0.2632386799693016, tpr=0.7777777777777778, acc=0.7394113424264178\n", 1399 | "ks forLDAis :0.5145390978084762\n", 1400 | "in sample for model: QDA; fpr=0.0, tpr=1.0, acc=1.0\n", 
1401 | "ks forQDAis :1.0\n", 1402 | "in sample for model: tree_random_forest; fpr=0.0, tpr=0.9888888888888889, acc=0.9992821249102656\n", 1403 | "ks fortree_random_forestis :1.0\n", 1404 | "in sample for model: tree_XGBOOST; fpr=0.03837298541826554, tpr=0.9555555555555556, acc=0.9612347451543432\n", 1405 | "ks fortree_XGBOOSTis :0.9171825701372901\n" 1406 | ] 1407 | } 1408 | ], 1409 | "source": [ 1410 | "final_model.compare(X_dict,Y_dict)" 1411 | ] 1412 | }, 1413 | { 1414 | "cell_type": "markdown", 1415 | "metadata": { 1416 | "colab_type": "text", 1417 | "id": "q8dtmo5y3VOu" 1418 | }, 1419 | "source": [ 1420 | "## the test dataset metrics" 1421 | ] 1422 | }, 1423 | { 1424 | "cell_type": "markdown", 1425 | "metadata": { 1426 | "colab_type": "text", 1427 | "id": "_qwdX_P93VOv" 1428 | }, 1429 | "source": [ 1430 | "### imbalance" 1431 | ] 1432 | }, 1433 | { 1434 | "cell_type": "code", 1435 | "execution_count": 39, 1436 | "metadata": { 1437 | "ExecuteTime": { 1438 | "end_time": "2020-03-31T21:23:08.604750Z", 1439 | "start_time": "2020-03-31T21:23:08.476094Z" 1440 | }, 1441 | "colab": { 1442 | "base_uri": "https://localhost:8080/", 1443 | "height": 54 1444 | }, 1445 | "colab_type": "code", 1446 | "executionInfo": { 1447 | "elapsed": 177246, 1448 | "status": "ok", 1449 | "timestamp": 1582643125417, 1450 | "user": { 1451 | "displayName": "Hanyu Wu", 1452 | "photoUrl": "", 1453 | "userId": "05588030028148674891" 1454 | }, 1455 | "user_tz": 300 1456 | }, 1457 | "id": "Y963sycp3VOw", 1458 | "outputId": "b5dbdc28-0599-488a-f1f2-a0c676d451f9" 1459 | }, 1460 | "outputs": [ 1461 | { 1462 | "name": "stdout", 1463 | "output_type": "stream", 1464 | "text": [ 1465 | "out sample for model: voting; fpr=0.07883369330453564, tpr=0.3333333333333333, acc=0.9192680301399354\n" 1466 | ] 1467 | } 1468 | ], 1469 | "source": [ 1470 | "proba=final_model.predict_proba(X_test_data,weight=model_ks)\n", 1471 | "predict=final_model.predict(X_test_data,weight=model_ks)\n", 1472 | "tn, fp, fn, tp = confusion_matrix(predict,Y_test_2019).ravel()\n", 1473 | "print(\"out sample for model: {}; fpr={}, tpr={}, acc={}\".format(\"voting\",fp/(fp+tn),tp/(tp+fn),np.mean(predict==Y_test_2019)))" 1474 | ] 1475 | }, 1476 | { 1477 | "cell_type": "code", 1478 | "execution_count": 40, 1479 | "metadata": { 1480 | "ExecuteTime": { 1481 | "end_time": "2020-03-31T21:23:08.613726Z", 1482 | "start_time": "2020-03-31T21:23:08.605747Z" 1483 | }, 1484 | "colab": { 1485 | "base_uri": "https://localhost:8080/", 1486 | "height": 54 1487 | }, 1488 | "colab_type": "code", 1489 | "executionInfo": { 1490 | "elapsed": 177241, 1491 | "status": "ok", 1492 | "timestamp": 1582643125417, 1493 | "user": { 1494 | "displayName": "Hanyu Wu", 1495 | "photoUrl": "", 1496 | "userId": "05588030028148674891" 1497 | }, 1498 | "user_tz": 300 1499 | }, 1500 | "id": "5hWPJZsP3VOx", 1501 | "outputId": "c61b1f21-218b-4f02-8acd-734f6da30551" 1502 | }, 1503 | "outputs": [ 1504 | { 1505 | "data": { 1506 | "text/plain": [ 1507 | "(0.25037384033203125, 0.17253042516200412)" 1508 | ] 1509 | }, 1510 | "execution_count": 40, 1511 | "metadata": {}, 1512 | "output_type": "execute_result" 1513 | } 1514 | ], 1515 | "source": [ 1516 | "get_KS(proba[:,1],Y_test_2019)" 1517 | ] 1518 | }, 1519 | { 1520 | "cell_type": "markdown", 1521 | "metadata": { 1522 | "colab_type": "text", 1523 | "id": "UtzdWP8z3VO0" 1524 | }, 1525 | "source": [ 1526 | "### balance" 1527 | ] 1528 | }, 1529 | { 1530 | "cell_type": "code", 1531 | "execution_count": 41, 1532 | "metadata": { 1533 | "ExecuteTime": { 1534 | 
"end_time": "2020-03-31T21:23:08.734403Z", 1535 | "start_time": "2020-03-31T21:23:08.615721Z" 1536 | }, 1537 | "colab": { 1538 | "base_uri": "https://localhost:8080/", 1539 | "height": 54 1540 | }, 1541 | "colab_type": "code", 1542 | "executionInfo": { 1543 | "elapsed": 177546, 1544 | "status": "ok", 1545 | "timestamp": 1582643125727, 1546 | "user": { 1547 | "displayName": "Hanyu Wu", 1548 | "photoUrl": "", 1549 | "userId": "05588030028148674891" 1550 | }, 1551 | "user_tz": 300 1552 | }, 1553 | "id": "SWfG_0cU3VO0", 1554 | "outputId": "8edc657c-75f1-43a4-9d2c-f0d6df8bd7c4" 1555 | }, 1556 | "outputs": [ 1557 | { 1558 | "name": "stdout", 1559 | "output_type": "stream", 1560 | "text": [ 1561 | "out sample for model: voting; fpr=0.07883369330453564, tpr=0.3333333333333333, acc=0.9192680301399354\n" 1562 | ] 1563 | } 1564 | ], 1565 | "source": [ 1566 | "proba = final_model.predict_proba(X_test_data, weight=model_ks, seperate=\"None\")\n", 1567 | "predict = final_model.predict(X_test_data, weight=model_ks, seperate=\"None\")\n", 1568 | "tn, fp, fn, tp = confusion_matrix(predict, Y_test_2019).ravel()\n", 1569 | "print(\"out sample for model: {}; fpr={}, tpr={}, acc={}\".format(\n", 1570 | " \"voting\", fp/(fp+tn), tp/(tp+fn), np.mean(predict == Y_test_2019)))" 1571 | ] 1572 | }, 1573 | { 1574 | "cell_type": "code", 1575 | "execution_count": 42, 1576 | "metadata": { 1577 | "ExecuteTime": { 1578 | "end_time": "2020-03-31T21:23:08.743380Z", 1579 | "start_time": "2020-03-31T21:23:08.736398Z" 1580 | }, 1581 | "colab": { 1582 | "base_uri": "https://localhost:8080/", 1583 | "height": 54 1584 | }, 1585 | "colab_type": "code", 1586 | "executionInfo": { 1587 | "elapsed": 177542, 1588 | "status": "ok", 1589 | "timestamp": 1582643125728, 1590 | "user": { 1591 | "displayName": "Hanyu Wu", 1592 | "photoUrl": "", 1593 | "userId": "05588030028148674891" 1594 | }, 1595 | "user_tz": 300 1596 | }, 1597 | "id": "XNP5N0NX3VO2", 1598 | "outputId": "27f781a2-b0bb-470f-e35f-d2f180895a9b" 1599 | }, 1600 | "outputs": [ 1601 | { 1602 | "data": { 1603 | "text/plain": [ 1604 | "(0.25037384033203125, 0.17253042516200412)" 1605 | ] 1606 | }, 1607 | "execution_count": 42, 1608 | "metadata": {}, 1609 | "output_type": "execute_result" 1610 | } 1611 | ], 1612 | "source": [ 1613 | "get_KS(proba[:,1],Y_test_2019)" 1614 | ] 1615 | }, 1616 | { 1617 | "cell_type": "markdown", 1618 | "metadata": { 1619 | "colab_type": "text", 1620 | "id": "hCRPpRCv3VO4" 1621 | }, 1622 | "source": [ 1623 | "### compare" 1624 | ] 1625 | }, 1626 | { 1627 | "cell_type": "code", 1628 | "execution_count": 43, 1629 | "metadata": { 1630 | "ExecuteTime": { 1631 | "end_time": "2020-03-31T21:23:08.873033Z", 1632 | "start_time": "2020-03-31T21:23:08.745374Z" 1633 | }, 1634 | "colab": { 1635 | "base_uri": "https://localhost:8080/", 1636 | "height": 173 1637 | }, 1638 | "colab_type": "code", 1639 | "executionInfo": { 1640 | "elapsed": 177789, 1641 | "status": "ok", 1642 | "timestamp": 1582643125979, 1643 | "user": { 1644 | "displayName": "Hanyu Wu", 1645 | "photoUrl": "", 1646 | "userId": "05588030028148674891" 1647 | }, 1648 | "user_tz": 300 1649 | }, 1650 | "id": "hT9roCAO3VO4", 1651 | "outputId": "611f14e7-ab90-4866-d0c0-5b33ce4762ec" 1652 | }, 1653 | "outputs": [ 1654 | { 1655 | "name": "stdout", 1656 | "output_type": "stream", 1657 | "text": [ 1658 | "in sample for model: LDA; fpr=0.3146198830409357, tpr=0.32432432432432434, acc=0.6566200215285253\n", 1659 | "ks forLDAis :0.06867393709498976\n", 1660 | "in sample for model: QDA; fpr=0.0, tpr=0.0, 
acc=0.9203444564047363\n", 1661 | "ks forQDAis :0.0\n", 1662 | "in sample for model: tree_random_forest; fpr=0.0, tpr=0.0, acc=0.9203444564047363\n", 1663 | "ks fortree_random_forestis :0.03960802908171329\n", 1664 | "in sample for model: tree_XGBOOST; fpr=0.1368421052631579, tpr=0.22972972972972974, acc=0.8127018299246501\n", 1665 | "ks fortree_XGBOOSTis :0.17253042516200412\n" 1666 | ] 1667 | } 1668 | ], 1669 | "source": [ 1670 | "final_model.compare(X_test_data,Y_test_data)" 1671 | ] 1672 | }, 1673 | { 1674 | "cell_type": "code", 1675 | "execution_count": null, 1676 | "metadata": { 1677 | "colab": {}, 1678 | "colab_type": "code", 1679 | "id": "zAWWDWFF3VO7" 1680 | }, 1681 | "outputs": [], 1682 | "source": [] 1683 | } 1684 | ], 1685 | "metadata": { 1686 | "colab": { 1687 | "collapsed_sections": [ 1688 | "b3WqAto03VN2", 1689 | "EyVIQ-Wf3VN3", 1690 | "RMrR2vb83VN5", 1691 | "zUIO_UMD3VN-", 1692 | "HYYHrHGD3VOB", 1693 | "ybLBTPVx3VOF", 1694 | "vtRQgAF73VOJ", 1695 | "_TSXE7tQ3VOJ", 1696 | "frNclZkO3VOM", 1697 | "t4Rhqkh_3VOe", 1698 | "HQfaobUj3VOe", 1699 | "A0LOQVbN3VOl", 1700 | "zRTJ6paL3VOs", 1701 | "q8dtmo5y3VOu", 1702 | "_qwdX_P93VOv", 1703 | "UtzdWP8z3VO0", 1704 | "hCRPpRCv3VO4" 1705 | ], 1706 | "name": "2.Other_simple_models.ipynb", 1707 | "provenance": [] 1708 | }, 1709 | "kernelspec": { 1710 | "display_name": "Python 3", 1711 | "language": "python", 1712 | "name": "python3" 1713 | }, 1714 | "language_info": { 1715 | "codemirror_mode": { 1716 | "name": "ipython", 1717 | "version": 3 1718 | }, 1719 | "file_extension": ".py", 1720 | "mimetype": "text/x-python", 1721 | "name": "python", 1722 | "nbconvert_exporter": "python", 1723 | "pygments_lexer": "ipython3", 1724 | "version": "3.7.7" 1725 | }, 1726 | "toc": { 1727 | "base_numbering": 1, 1728 | "nav_menu": {}, 1729 | "number_sections": true, 1730 | "sideBar": true, 1731 | "skip_h1_title": false, 1732 | "title_cell": "Table of Contents", 1733 | "title_sidebar": "Contents", 1734 | "toc_cell": false, 1735 | "toc_position": {}, 1736 | "toc_section_display": true, 1737 | "toc_window_display": false 1738 | }, 1739 | "varInspector": { 1740 | "cols": { 1741 | "lenName": 16, 1742 | "lenType": 16, 1743 | "lenVar": 40 1744 | }, 1745 | "kernels_config": { 1746 | "python": { 1747 | "delete_cmd_postfix": "", 1748 | "delete_cmd_prefix": "del ", 1749 | "library": "var_list.py", 1750 | "varRefreshCmd": "print(var_dic_list())" 1751 | }, 1752 | "r": { 1753 | "delete_cmd_postfix": ") ", 1754 | "delete_cmd_prefix": "rm(", 1755 | "library": "var_list.r", 1756 | "varRefreshCmd": "cat(var_dic_list()) " 1757 | } 1758 | }, 1759 | "types_to_exclude": [ 1760 | "module", 1761 | "function", 1762 | "builtin_function_or_method", 1763 | "instance", 1764 | "_Feature" 1765 | ], 1766 | "window_display": false 1767 | } 1768 | }, 1769 | "nbformat": 4, 1770 | "nbformat_minor": 1 1771 | } 1772 | -------------------------------------------------------------------------------- /2.Binary_classification_models/Binary_classification_contents.md: -------------------------------------------------------------------------------- 1 | # Binary Classification 2 | 3 | After data exploration with some baseline models, we experimented with several combinations of deep learning architectures to see how these common-used methods perform. 
4 | 5 | ## Contents 6 | - Binary Classification Model1: **[Dense Layer, LSTM]**, Sequential features as imput for LSTM; 7 | - Binary Classification Model2: **[Dense, CNN]**, Images (MTF & GAF along time series) as imput for CNN; 8 | - Binary Classification Model3: **[Dense, CNN, LSTM]**, Sequential features as imput for LSTM, Images (MTF & GAF along time series) as imput for CNN; 9 | - Binary Classification Model4: **[Dense, LSTM, CNN]**, Sequential features as imput for LSTM, Images (MTF along the features) as imput for CNN; 10 | - Binary Classification Model5: **[Dense, LSTM, LSTM, CNN]**, Sequential features as imput for LSTM, Images (MTF along the features) as imput for CN; 11 | - Binary Classification Model6: **[Dense, LSTM, CNN]**, Sequential features as imput for LSTM, Sequential features as imput for CNN; 12 | - Binary Classification Model7: **[LSTM, CNN]**, Sequential features as imput for LSTM, Sequential features as imput for CNN; 13 | -------------------------------------------------------------------------------- /2.Binary_classification_models/Model_binary_classification_model4.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Model Input:\n", 8 | "- Sequential features: batchsize * 60 * 17\n", 9 | "- MTF along features: batchsize * 31 * 31\n", 10 | "- Non-sequential Features: batchsize * 14\n", 11 | "\n", 12 | "### Attention Mechanism is applied in this model" 13 | ] 14 | }, 15 | { 16 | "cell_type": "markdown", 17 | "metadata": {}, 18 | "source": [ 19 | "## Import data" 20 | ] 21 | }, 22 | { 23 | "cell_type": "code", 24 | "execution_count": 1, 25 | "metadata": { 26 | "ExecuteTime": { 27 | "end_time": "2020-04-01T00:39:08.523297Z", 28 | "start_time": "2020-04-01T00:39:06.207870Z" 29 | } 30 | }, 31 | "outputs": [ 32 | { 33 | "name": "stderr", 34 | "output_type": "stream", 35 | "text": [ 36 | "Using TensorFlow backend.\n" 37 | ] 38 | } 39 | ], 40 | "source": [ 41 | "import pandas as pd\n", 42 | "import numpy as np\n", 43 | "import json\n", 44 | "import tensorflow as tf\n", 45 | "import keras\n", 46 | "from keras.models import Sequential,Model\n", 47 | "from keras.layers import Dense, Dropout, Flatten,concatenate,LSTM,Input,Bidirectional,BatchNormalization\n", 48 | "from keras.layers import Conv2D, MaxPooling2D,AveragePooling2D\n", 49 | "from keras_self_attention import SeqSelfAttention\n", 50 | "from keras import backend as K\n", 51 | "import keras\n", 52 | "from keras import regularizers\n", 53 | "from keras.callbacks import EarlyStopping\n", 54 | "from keras import optimizers\n", 55 | "from keras import metrics\n", 56 | "from sklearn.model_selection import train_test_split\n", 57 | "from tensorflow.python.client import device_lib\n", 58 | "from fetch_data import get, get_KS\n" 59 | ] 60 | }, 61 | { 62 | "cell_type": "code", 63 | "execution_count": 2, 64 | "metadata": { 65 | "ExecuteTime": { 66 | "end_time": "2020-04-01T00:39:08.604057Z", 67 | "start_time": "2020-04-01T00:39:08.524270Z" 68 | } 69 | }, 70 | "outputs": [], 71 | "source": [ 72 | "# In order to run the model on GPU, we need to enable memory growth for only one GPU.\n", 73 | "physical_devices = tf.config.list_physical_devices('GPU')\n", 74 | "if len(physical_devices) != 0:\n", 75 | " tf.config.experimental.set_memory_growth(physical_devices[0], enable=True)" 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": 3, 81 | "metadata": { 82 | "ExecuteTime": { 83 | 
"end_time": "2020-04-01T00:39:10.642040Z", 84 | "start_time": "2020-04-01T00:39:08.605055Z" 85 | } 86 | }, 87 | "outputs": [], 88 | "source": [ 89 | "with open(data_dir+'label.json') as f:\n", 90 | " labels = json.load(f)\n", 91 | "with open(data_dir+'non_sequential_features.json') as f:\n", 92 | " non_sequential_features = json.load(f)\n", 93 | "with open(data_dir+'padded_sequential_features_3.json') as f:\n", 94 | " sequential_features = json.load(f)\n", 95 | "with open(data_dir+'featurematrix.json') as f:\n", 96 | " arr_ = json.load(f)\n", 97 | "arr_ = np.array(arr_)" 98 | ] 99 | }, 100 | { 101 | "cell_type": "markdown", 102 | "metadata": {}, 103 | "source": [ 104 | "## Preprocessing" 105 | ] 106 | }, 107 | { 108 | "cell_type": "code", 109 | "execution_count": 4, 110 | "metadata": { 111 | "ExecuteTime": { 112 | "end_time": "2020-04-01T00:39:10.819235Z", 113 | "start_time": "2020-04-01T00:39:10.643019Z" 114 | } 115 | }, 116 | "outputs": [], 117 | "source": [ 118 | "feature1 = np.array([sequential_features[key]\n", 119 | " for key in sequential_features.keys()])\n", 120 | "feature2 = np.array([non_sequential_features[key]\n", 121 | " for key in non_sequential_features.keys()])\n", 122 | "label = np.array([labels[key] for key in labels.keys()])" 123 | ] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "execution_count": 5, 128 | "metadata": { 129 | "ExecuteTime": { 130 | "end_time": "2020-04-01T00:39:10.829210Z", 131 | "start_time": "2020-04-01T00:39:10.820233Z" 132 | } 133 | }, 134 | "outputs": [ 135 | { 136 | "data": { 137 | "text/plain": [ 138 | "((3715, 31, 31), (3715, 60, 18), (3715, 14), (3715,))" 139 | ] 140 | }, 141 | "execution_count": 5, 142 | "metadata": {}, 143 | "output_type": "execute_result" 144 | } 145 | ], 146 | "source": [ 147 | "arr_.shape,feature1.shape,feature2.shape,label.shape" 148 | ] 149 | }, 150 | { 151 | "cell_type": "code", 152 | "execution_count": 6, 153 | "metadata": { 154 | "ExecuteTime": { 155 | "end_time": "2020-04-01T00:39:10.869132Z", 156 | "start_time": "2020-04-01T00:39:10.830209Z" 157 | }, 158 | "scrolled": true 159 | }, 160 | "outputs": [ 161 | { 162 | "data": { 163 | "text/plain": [ 164 | "((2972, 31, 31, 1), (2972, 60, 17), (2972, 14), (2972,))" 165 | ] 166 | }, 167 | "execution_count": 6, 168 | "metadata": {}, 169 | "output_type": "execute_result" 170 | } 171 | ], 172 | "source": [ 173 | "arr_ = arr_.reshape(-1, 31, 31, 1)\n", 174 | "feature1 = feature1[:, :, 1:]\n", 175 | "\n", 176 | "X_train_arr, X_test_arr, X_train_f1, X_test_f1, X_train_f2, X_test_f2, y_train, y_test = train_test_split(arr_, feature1, feature2, label,\n", 177 | " test_size=0.20, random_state=42)\n", 178 | "\n", 179 | "\n", 180 | "X_train_arr.shape, X_train_f1.shape, X_train_f2.shape, y_train.shape" 181 | ] 182 | }, 183 | { 184 | "cell_type": "markdown", 185 | "metadata": {}, 186 | "source": [ 187 | "## Model Building" 188 | ] 189 | }, 190 | { 191 | "cell_type": "code", 192 | "execution_count": 7, 193 | "metadata": { 194 | "ExecuteTime": { 195 | "end_time": "2020-04-01T00:39:41.779743Z", 196 | "start_time": "2020-04-01T00:39:10.871098Z" 197 | }, 198 | "scrolled": true 199 | }, 200 | "outputs": [ 201 | { 202 | "name": "stdout", 203 | "output_type": "stream", 204 | "text": [ 205 | "Model: \"model_1\"\n", 206 | "__________________________________________________________________________________________________\n", 207 | "Layer (type) Output Shape Param # Connected to \n", 208 | "==================================================================================================\n", 
209 | "cnn_input (InputLayer) (None, 31, 31, 1) 0 \n", 210 | "__________________________________________________________________________________________________\n", 211 | "batch_normalization_1 (BatchNor (None, 31, 31, 1) 4 cnn_input[0][0] \n", 212 | "__________________________________________________________________________________________________\n", 213 | "conv2d_1 (Conv2D) (None, 31, 31, 16) 160 batch_normalization_1[0][0] \n", 214 | "__________________________________________________________________________________________________\n", 215 | "batch_normalization_2 (BatchNor (None, 31, 31, 16) 64 conv2d_1[0][0] \n", 216 | "__________________________________________________________________________________________________\n", 217 | "max_pooling2d_1 (MaxPooling2D) (None, 15, 15, 16) 0 batch_normalization_2[0][0] \n", 218 | "__________________________________________________________________________________________________\n", 219 | "dropout_1 (Dropout) (None, 15, 15, 16) 0 max_pooling2d_1[0][0] \n", 220 | "__________________________________________________________________________________________________\n", 221 | "conv2d_2 (Conv2D) (None, 15, 15, 8) 1160 dropout_1[0][0] \n", 222 | "__________________________________________________________________________________________________\n", 223 | "rnn_input (InputLayer) (None, 60, 17) 0 \n", 224 | "__________________________________________________________________________________________________\n", 225 | "batch_normalization_3 (BatchNor (None, 15, 15, 8) 32 conv2d_2[0][0] \n", 226 | "__________________________________________________________________________________________________\n", 227 | "batch_normalization_4 (BatchNor (None, 60, 17) 68 rnn_input[0][0] \n", 228 | "__________________________________________________________________________________________________\n", 229 | "dense_input (InputLayer) (None, 14) 0 \n", 230 | "__________________________________________________________________________________________________\n", 231 | "average_pooling2d_1 (AveragePoo (None, 7, 7, 8) 0 batch_normalization_3[0][0] \n", 232 | "__________________________________________________________________________________________________\n", 233 | "bidirectional_1 (Bidirectional) (None, 60, 64) 12800 batch_normalization_4[0][0] \n", 234 | "__________________________________________________________________________________________________\n", 235 | "dense_1 (Dense) (None, 64) 960 dense_input[0][0] \n", 236 | "__________________________________________________________________________________________________\n", 237 | "dropout_2 (Dropout) (None, 7, 7, 8) 0 average_pooling2d_1[0][0] \n", 238 | "__________________________________________________________________________________________________\n", 239 | "seq_self_attention_1 (SeqSelfAt (None, 60, 64) 4161 bidirectional_1[0][0] \n", 240 | "__________________________________________________________________________________________________\n", 241 | "batch_normalization_6 (BatchNor (None, 64) 256 dense_1[0][0] \n", 242 | "__________________________________________________________________________________________________\n", 243 | "conv2d_3 (Conv2D) (None, 7, 7, 1) 73 dropout_2[0][0] \n", 244 | "__________________________________________________________________________________________________\n", 245 | "dropout_3 (Dropout) (None, 60, 64) 0 seq_self_attention_1[0][0] \n", 246 | "__________________________________________________________________________________________________\n", 247 | "dropout_4 (Dropout) (None, 64) 0 
batch_normalization_6[0][0] \n", 248 | "__________________________________________________________________________________________________\n", 249 | "flatten_1 (Flatten) (None, 49) 0 conv2d_3[0][0] \n", 250 | "__________________________________________________________________________________________________\n", 251 | "bidirectional_2 (Bidirectional) (None, 2) 528 dropout_3[0][0] \n", 252 | "__________________________________________________________________________________________________\n", 253 | "dense_2 (Dense) (None, 32) 2080 dropout_4[0][0] \n", 254 | "__________________________________________________________________________________________________\n", 255 | "concatenate_1 (Concatenate) (None, 83) 0 flatten_1[0][0] \n", 256 | " bidirectional_2[0][0] \n", 257 | " dense_2[0][0] \n", 258 | "__________________________________________________________________________________________________\n", 259 | "batch_normalization_7 (BatchNor (None, 83) 332 concatenate_1[0][0] \n", 260 | "__________________________________________________________________________________________________\n", 261 | "dense_3 (Dense) (None, 128) 10752 batch_normalization_7[0][0] \n", 262 | "__________________________________________________________________________________________________\n", 263 | "batch_normalization_8 (BatchNor (None, 128) 512 dense_3[0][0] \n", 264 | "__________________________________________________________________________________________________\n", 265 | "dropout_5 (Dropout) (None, 128) 0 batch_normalization_8[0][0] \n", 266 | "__________________________________________________________________________________________________\n", 267 | "dense_4 (Dense) (None, 64) 8256 dropout_5[0][0] \n", 268 | "__________________________________________________________________________________________________\n", 269 | "batch_normalization_9 (BatchNor (None, 64) 256 dense_4[0][0] \n", 270 | "__________________________________________________________________________________________________\n", 271 | "dropout_6 (Dropout) (None, 64) 0 batch_normalization_9[0][0] \n", 272 | "__________________________________________________________________________________________________\n", 273 | "output_layer (Dense) (None, 1) 65 dropout_6[0][0] \n", 274 | "==================================================================================================\n", 275 | "Total params: 42,519\n", 276 | "Trainable params: 41,757\n", 277 | "Non-trainable params: 762\n", 278 | "__________________________________________________________________________________________________\n", 279 | "Train on 2080 samples, validate on 892 samples\n", 280 | "Epoch 1/200\n", 281 | "2080/2080 [==============================] - 5s 2ms/step - loss: 1.0187 - accuracy: 0.4865 - auc_1: 0.5152 - val_loss: 0.7981 - val_accuracy: 0.7152 - val_auc_1: 0.4465\n", 282 | "Epoch 2/200\n", 283 | "2080/2080 [==============================] - 1s 349us/step - loss: 1.0532 - accuracy: 0.4899 - auc_1: 0.4820 - val_loss: 0.8034 - val_accuracy: 0.5123 - val_auc_1: 0.4834\n", 284 | "Epoch 3/200\n", 285 | "2080/2080 [==============================] - 1s 342us/step - loss: 1.0043 - accuracy: 0.4846 - auc_1: 0.5059 - val_loss: 0.8104 - val_accuracy: 0.2904 - val_auc_1: 0.4863\n", 286 | "Epoch 4/200\n", 287 | "2080/2080 [==============================] - 1s 339us/step - loss: 1.0030 - accuracy: 0.5106 - auc_1: 0.5306 - val_loss: 0.8146 - val_accuracy: 0.2085 - val_auc_1: 0.5179\n", 288 | "Epoch 5/200\n", 289 | "2080/2080 [==============================] - 1s 347us/step - loss: 0.9921 - 
accuracy: 0.4962 - auc_1: 0.5014 - val_loss: 0.8185 - val_accuracy: 0.1626 - val_auc_1: 0.5304\n", 290 | "Epoch 6/200\n", 291 | "2080/2080 [==============================] - 1s 341us/step - loss: 0.9790 - accuracy: 0.5135 - auc_1: 0.5456 - val_loss: 0.8220 - val_accuracy: 0.1390 - val_auc_1: 0.5532\n", 292 | "Epoch 7/200\n", 293 | "2080/2080 [==============================] - 1s 340us/step - loss: 0.9910 - accuracy: 0.5096 - auc_1: 0.5384 - val_loss: 0.8255 - val_accuracy: 0.1233 - val_auc_1: 0.5554\n", 294 | "Epoch 8/200\n", 295 | "2080/2080 [==============================] - 1s 346us/step - loss: 0.9933 - accuracy: 0.5101 - auc_1: 0.5120 - val_loss: 0.8276 - val_accuracy: 0.1177 - val_auc_1: 0.5786\n", 296 | "Epoch 9/200\n", 297 | "2080/2080 [==============================] - 1s 333us/step - loss: 1.0083 - accuracy: 0.4990 - auc_1: 0.4651 - val_loss: 0.8291 - val_accuracy: 0.1155 - val_auc_1: 0.5654\n", 298 | "Epoch 10/200\n", 299 | "2080/2080 [==============================] - 1s 340us/step - loss: 0.9913 - accuracy: 0.5043 - auc_1: 0.4778 - val_loss: 0.8306 - val_accuracy: 0.1200 - val_auc_1: 0.5695\n", 300 | "Epoch 11/200\n", 301 | "2080/2080 [==============================] - 1s 332us/step - loss: 0.9524 - accuracy: 0.5188 - auc_1: 0.5462 - val_loss: 0.8306 - val_accuracy: 0.1244 - val_auc_1: 0.5652\n", 302 | "Epoch 12/200\n", 303 | "2080/2080 [==============================] - 1s 339us/step - loss: 0.9583 - accuracy: 0.5207 - auc_1: 0.5190 - val_loss: 0.8310 - val_accuracy: 0.1323 - val_auc_1: 0.5668\n", 304 | "Epoch 13/200\n", 305 | "2080/2080 [==============================] - 1s 346us/step - loss: 0.9766 - accuracy: 0.5188 - auc_1: 0.4948 - val_loss: 0.8300 - val_accuracy: 0.1424 - val_auc_1: 0.5627\n", 306 | "Epoch 14/200\n", 307 | "2080/2080 [==============================] - 1s 327us/step - loss: 0.9534 - accuracy: 0.5274 - auc_1: 0.5174 - val_loss: 0.8295 - val_accuracy: 0.1558 - val_auc_1: 0.5610\n", 308 | "Epoch 15/200\n", 309 | "2080/2080 [==============================] - 1s 337us/step - loss: 0.9328 - accuracy: 0.5385 - auc_1: 0.5302 - val_loss: 0.8291 - val_accuracy: 0.1659 - val_auc_1: 0.5620\n", 310 | "Epoch 16/200\n", 311 | "2080/2080 [==============================] - 1s 339us/step - loss: 0.9696 - accuracy: 0.5216 - auc_1: 0.4930 - val_loss: 0.8282 - val_accuracy: 0.1872 - val_auc_1: 0.5627\n", 312 | "Epoch 17/200\n", 313 | "2080/2080 [==============================] - 1s 336us/step - loss: 0.9358 - accuracy: 0.5168 - auc_1: 0.5207 - val_loss: 0.8268 - val_accuracy: 0.2096 - val_auc_1: 0.5634\n", 314 | "Epoch 18/200\n", 315 | "2080/2080 [==============================] - 1s 344us/step - loss: 0.9106 - accuracy: 0.5245 - auc_1: 0.5553 - val_loss: 0.8247 - val_accuracy: 0.2433 - val_auc_1: 0.5631\n", 316 | "Epoch 19/200\n", 317 | "2080/2080 [==============================] - 1s 352us/step - loss: 0.9307 - accuracy: 0.5231 - auc_1: 0.4955 - val_loss: 0.8220 - val_accuracy: 0.2769 - val_auc_1: 0.5676\n", 318 | "Epoch 20/200\n", 319 | "2080/2080 [==============================] - 1s 359us/step - loss: 0.9066 - accuracy: 0.5394 - auc_1: 0.5185 - val_loss: 0.8200 - val_accuracy: 0.3128 - val_auc_1: 0.5643\n", 320 | "Epoch 21/200\n", 321 | "2080/2080 [==============================] - 1s 351us/step - loss: 0.9321 - accuracy: 0.5327 - auc_1: 0.4914 - val_loss: 0.8179 - val_accuracy: 0.3475 - val_auc_1: 0.5643\n", 322 | "Epoch 22/200\n", 323 | "2080/2080 [==============================] - 1s 344us/step - loss: 0.9006 - accuracy: 0.5288 - auc_1: 0.5190 - val_loss: 0.8159 - 
val_accuracy: 0.3756 - val_auc_1: 0.5678\n", 324 | "Epoch 23/200\n", 325 | "2080/2080 [==============================] - 1s 346us/step - loss: 0.8785 - accuracy: 0.5524 - auc_1: 0.5492 - val_loss: 0.8153 - val_accuracy: 0.3868 - val_auc_1: 0.5691\n", 326 | "Epoch 24/200\n", 327 | "2080/2080 [==============================] - 1s 339us/step - loss: 0.8924 - accuracy: 0.5394 - auc_1: 0.5213 - val_loss: 0.8133 - val_accuracy: 0.4036 - val_auc_1: 0.5722\n", 328 | "Epoch 25/200\n", 329 | "2080/2080 [==============================] - 1s 346us/step - loss: 0.8963 - accuracy: 0.5471 - auc_1: 0.5203 - val_loss: 0.8113 - val_accuracy: 0.4316 - val_auc_1: 0.5770\n", 330 | "Epoch 26/200\n", 331 | "2080/2080 [==============================] - 1s 346us/step - loss: 0.8997 - accuracy: 0.5490 - auc_1: 0.5163 - val_loss: 0.8104 - val_accuracy: 0.4462 - val_auc_1: 0.5799\n", 332 | "Epoch 27/200\n", 333 | "2080/2080 [==============================] - 1s 341us/step - loss: 0.8969 - accuracy: 0.5409 - auc_1: 0.5075 - val_loss: 0.8092 - val_accuracy: 0.4585 - val_auc_1: 0.5802\n", 334 | "Epoch 28/200\n", 335 | "2080/2080 [==============================] - 1s 338us/step - loss: 0.8570 - accuracy: 0.5659 - auc_1: 0.5803 - val_loss: 0.8071 - val_accuracy: 0.4776 - val_auc_1: 0.5783\n", 336 | "Epoch 29/200\n", 337 | "2080/2080 [==============================] - 1s 336us/step - loss: 0.8913 - accuracy: 0.5500 - auc_1: 0.5158 - val_loss: 0.8059 - val_accuracy: 0.4910 - val_auc_1: 0.5842\n", 338 | "Epoch 30/200\n", 339 | "2080/2080 [==============================] - 1s 333us/step - loss: 0.8706 - accuracy: 0.5654 - auc_1: 0.5464 - val_loss: 0.8031 - val_accuracy: 0.5168 - val_auc_1: 0.5848\n", 340 | "Epoch 31/200\n", 341 | "2080/2080 [==============================] - 1s 339us/step - loss: 0.9001 - accuracy: 0.5462 - auc_1: 0.4635 - val_loss: 0.8003 - val_accuracy: 0.5460 - val_auc_1: 0.5893\n", 342 | "Epoch 00031: early stopping\n" 343 | ] 344 | }, 345 | { 346 | "data": { 347 | "text/plain": [ 348 | "" 349 | ] 350 | }, 351 | "execution_count": 7, 352 | "metadata": {}, 353 | "output_type": "execute_result" 354 | } 355 | ], 356 | "source": [ 357 | "# Input1: MTF along features, shape=batchsize*31*31*1\n", 358 | "input_fm = Input(shape=(31, 31, 1), name=\"cnn_input\")\n", 359 | "fm = BatchNormalization()(input_fm)\n", 360 | "cnn1 = Conv2D(16, (3, 3), padding=\"same\", activation='relu')(fm)\n", 361 | "bn1 = BatchNormalization()(cnn1)\n", 362 | "pool1 = MaxPooling2D(pool_size=(2, 2), strides=2)(bn1)\n", 363 | "pool1 = Dropout(0.5)(pool1)\n", 364 | "cnn2 = Conv2D(8, (3, 3), padding=\"same\", activation='relu')(pool1)\n", 365 | "bn2 = BatchNormalization()(cnn2)\n", 366 | "pool2 = AveragePooling2D(pool_size=(2, 2), strides=2)(bn2)\n", 367 | "pool2 = Dropout(0.5)(pool2)\n", 368 | "cnn3 = Conv2D(1, (3, 3), padding=\"same\", activation='relu')(pool2)\n", 369 | "fm_output = Flatten()(cnn3)\n", 370 | "\n", 371 | "\n", 372 | "# Input2: sequential features, shape= batchsize*60*17\n", 373 | "input_rnn = Input(shape=(60, 17), name=\"rnn_input\")\n", 374 | "rnn = BatchNormalization()(input_rnn)\n", 375 | "lstm1 = Bidirectional(LSTM(32, activation='tanh', return_sequences=True))(rnn)\n", 376 | "lstm1 = SeqSelfAttention(attention_width=10, attention_activation='tanh')(lstm1)\n", 377 | "lstm1 = Dropout(0.25)(lstm1)\n", 378 | "rnn_output = Bidirectional(\n", 379 | " LSTM(1, activation='tanh', return_sequences=False))(lstm1)\n", 380 | "# rnn_output=Flatten()(rnn_output)\n", 381 | "\n", 382 | "# Input3: non-sequential features(user type and 
application time), shape=batchsize*14\n", 383 | "input_non_sequential = Input(shape=(14,), name=\"dense_input\")\n", 384 | "non_sequential = BatchNormalization()(input_rnn)\n", 385 | "dense1 = Dense(64, activation=\"relu\")(input_non_sequential)\n", 386 | "bn3 = BatchNormalization()(dense1)\n", 387 | "drop2 = Dropout(0.25)(bn3)\n", 388 | "dense2_output = Dense(32, activation=\"relu\")(drop2)\n", 389 | "\n", 390 | "# Merge the above models\n", 391 | "merged = concatenate([fm_output, rnn_output, dense2_output])\n", 392 | "merged = BatchNormalization()(merged)\n", 393 | "dense3 = Dense(128, activation=\"relu\")(merged)\n", 394 | "bn4 = BatchNormalization()(dense3)\n", 395 | "drop3 = Dropout(0.2)(bn4)\n", 396 | "dense4 = Dense(64, activation=\"relu\")(drop3)\n", 397 | "bn5 = BatchNormalization()(dense4)\n", 398 | "drop4 = Dropout(0.2)(bn5)\n", 399 | "out = Dense(1, activation='sigmoid', name='output_layer')(drop4)\n", 400 | "\n", 401 | "model = Model(inputs=[input_fm, input_rnn,\n", 402 | " input_non_sequential], outputs=[out])\n", 403 | "model.summary()\n", 404 | "\n", 405 | "\n", 406 | "ada = optimizers.Adam(learning_rate=0.0002)\n", 407 | "model.compile(loss='binary_crossentropy', optimizer=ada, metrics=[\n", 408 | " 'accuracy', metrics.AUC()]) # optimizer='rmsprop'\n", 409 | "early_stopping = EarlyStopping(monitor='val_loss', patience=30, verbose=2)\n", 410 | "\n", 411 | "model.fit([X_train_arr, X_train_f1, X_train_f2], [y_train],\n", 412 | " epochs=200, batch_size=1024,\n", 413 | " class_weight={0: 1., 1: 3},\n", 414 | " shuffle=True,\n", 415 | " validation_split=0.3,\n", 416 | " callbacks=[early_stopping])" 417 | ] 418 | }, 419 | { 420 | "cell_type": "code", 421 | "execution_count": 8, 422 | "metadata": { 423 | "ExecuteTime": { 424 | "end_time": "2020-04-01T00:39:41.908414Z", 425 | "start_time": "2020-04-01T00:39:41.781709Z" 426 | } 427 | }, 428 | "outputs": [], 429 | "source": [ 430 | "model.save(\"../../data/models/binary_classification_model/\"+\"multi_source_classifier_ks_point1487.h5\")" 431 | ] 432 | }, 433 | { 434 | "cell_type": "markdown", 435 | "metadata": {}, 436 | "source": [ 437 | "## Model evaluation" 438 | ] 439 | }, 440 | { 441 | "cell_type": "code", 442 | "execution_count": 9, 443 | "metadata": { 444 | "ExecuteTime": { 445 | "end_time": "2020-04-01T00:39:43.161507Z", 446 | "start_time": "2020-04-01T00:39:41.909367Z" 447 | } 448 | }, 449 | "outputs": [], 450 | "source": [ 451 | "prediction=model.predict([X_test_arr,X_test_f1,X_test_f2])" 452 | ] 453 | }, 454 | { 455 | "cell_type": "code", 456 | "execution_count": 10, 457 | "metadata": { 458 | "ExecuteTime": { 459 | "end_time": "2020-04-01T00:39:43.518471Z", 460 | "start_time": "2020-04-01T00:39:43.162476Z" 461 | } 462 | }, 463 | "outputs": [ 464 | { 465 | "data": { 466 | "image/png": 
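One line in the model definition above deserves a second look: the dense branch computes `non_sequential = BatchNormalization()(input_rnn)`, which normalizes the sequential input rather than `input_non_sequential`, and that tensor is never used because `dense1` is wired directly to `input_non_sequential`. The sketch below shows the branch with batch normalization applied to the non-sequential input itself, mirroring the other two branches; treating that as the intended wiring is an assumption, not a statement of the authors' design.

```python
# Sketch of the non-sequential branch with BatchNormalization applied to the
# dense input (an assumed correction; the notebook applies it to input_rnn
# and then leaves the result unused).
from keras.layers import Input, Dense, Dropout, BatchNormalization

input_non_sequential = Input(shape=(14,), name="dense_input")
non_sequential = BatchNormalization()(input_non_sequential)
dense1 = Dense(64, activation="relu")(non_sequential)
bn3 = BatchNormalization()(dense1)
drop2 = Dropout(0.25)(bn3)
dense2_output = Dense(32, activation="relu")(drop2)
```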
"iVBORw0KGgoAAAANSUhEUgAAAYoAAAEWCAYAAAB42tAoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nO3deZxN9f/A8dfbErIVUdYQYsYyGITIkqhvpb6ppF/f0jCINrSob0ohviR79iwRIqEsY9+XGQzGVPZd1uzbjHn//rh3pmvMcs3MnTvL+/l43MfMuedzznnfY9z3+ZzP+Xw+oqoYY4wx8cni7QCMMcakbZYojDHGJMgShTHGmARZojDGGJMgSxTGGGMSZInCGGNMgixRGOMFIqIiUtbbcRjjDksUJk0RkTdEZIeIXBGRv0TkOxG55w62PyAij3syRm9ynp+bInJJRC6IyDYReTpWmRwi8rWIHBKRqyKyW0Q+EBGJVa6ZiKwSkYsickpEVorIs6n7iUx6YInCpBki0hXoB3wA5AceAR4EFovIXd6MLY1Zr6p5gHuAEcC0WMn0J6AJ8BSQF3gNCAQGRxcQkZbOcpOA4sD9QA/gmdT4ACadUVV72cvrLyAfcAl4Kdb7eYCTwJvO5QlAL5f1DYEjzt8nA1HAVee+PozjOPcBvwLngLPAaiCLc93HwF7gIhAOPO+y3RvAWuBb57b7gLrO9w87Y3zdpfwEYCSw2Lm/lcCDLusVKOv8PQcwADgEnHBulyue8/QGsMZl+W7nvmo6l5sA14ASsbarDdwEygLiPNYH3v53t1f6eFmNwqQVdYGcwM+ub6rqJWAB0DSxHajqazi+AJ9R1Tyq+r84inUFjgCFcFxFf4LjixYcSaI+jtpMT+AHESnism1tYDtQEJgKTANq4vjy/T9gmIjkcSn/KvAVjuQUCkyJJ/R+QHnAz7mvYjiu7hMkIlmBNkAEcND5dlNgo6oedi2rqhudn7sJ8DBQApiZ2DGMAbv1ZNKO+4DTqhoZx7rjzvUpIQIoguPqPkJVV6uq4xJf9SdVPaaqUao6HdgN1HLZdr+qfq+qN4HpOL5sv1TV66oaBNzA8UUf7TdVXaWq14FPgToiUsI1GGe7QTvgfVU9q6oXgT5AqwQ+wyMicg5HzWEA8H+qetK57j4c5ysu0eexoMuyMYmyRGHSitPAfSKSLY51RZzrU0J/YA8QJCL7ROTj6BUi8h8RCRWRc84v4krcmqBOuPx+FUBVY7/nWqOIuap31ozOAkVjxVMIx+2jzS7HXeh8Pz4bVPUe4F5gLo5aULTTOM5XXKLP4xmXZWMSZYnCpBXrgevAv13fFJHcwJPAUudbl3F8sUZ7INZ+EhwOWVUvqmpXVS2Do+G2i4g0EZEHgTFAZ6Cg84s4DMf9/KSKqT04b0kVAI7FKnMaR4LxVdV7nK/86misTpAz+bwFvCYi1ZxvLwFqx1FzqeWMZxnwJ44k9kLSPpbJbCxRmDRBVc/jaBcYKiLNRSS7iJTC8WTOERwN1eC41/+UiBQQkQeA92Lt6gRQJr7jiMjTIlLWecvnAo4G3ptAbhxJ5pSzXBscNYrkeEpEHnU+sfUVcbcdROFIUN+KSGHnsYuJSDN3DqCqZ4CxONs0VHUJjqQ6S0R8RSSriDyCo33kO1Xd7bzV1gX4TETaiEg+EcnijHV0Mj+zyYAsUZg0w9n4/AmO++4XgI04rnybOO/zgyNhbAMOAEE42gpcfQ3813kbp1schymH46r7Eo5azAhVXaGq4cA3zvdOAJVxPOWUHFOBz3HccqqBo3E7Lh/huB22QUQuOON7+A6OMwhHUqriXH4BWI7jFtYl4AdgHPB29AaqOhN4GXgTRy3nBNALmHMHxzWZhDjb8YwxKUhEJuB4bPe/3o7FmOSyGoUxxpgEeSxRiMh4ETkpImHxrBcRGSIie0Rku4hU91Qsxhhjks5jt55EpAGO+6OTVPW2RkEReQrHPdOncHRkGqyqtT0SjDHGmCTzWI1CVVfhaMSLTwscSURVdQNwT6xesMYYY9KAuDo3pZZiuHRIwvEIZDHi6C0qIoE4BjUjd+7cNSpUqJAqARpjTHqx79RlrkbcJFf2rDHvqUZx9dwprl/8GxwjHyTUkTNe3kwUcXVkivM+mKqOBkYD+Pv7a0hIiCfjMsaYdOflUesBmN6+DgCLFy8mMDCQcxf/plOnTgwfPvxgQtsnxJtPPR3BpecqjqGOY/daNcYYcwf+/vtv3nzzTZ544gly5MjB6tWrGTZsWLL26c1EMRf4j/Ppp0eA86pqg5QZY0wSHdm6Eh8fHyZNmkT37t0JDQ3l0UcfTfZ+PXbrSUR+xDFXwH0icgRHD9XsAKo6EpiP44mnPcAVHMMlG2OMuUN//fUX60Z/ypEty/Hz8+O3336jevWU63HgsUShqq8ksl6BTp46vjHGZHSqyqRJk3j//fc5f/EylZ/rwKYZQ8iePXuKHsebjdnGGGOS6ODBg7Rv355FixZRt25d8j3RmXwPlErxJAE2hIcxxqQrUVFRDBs2DF9fX9asWcPQoUNZvXo1+R4o5bFjWqIwxph04s8//6RBgwa8/fbbPProo+zcuZPOnTszLfgIG/cn1L85eSxRGGNMGhcREcHXX39N1apVCQ8PZ8KECSxYsIAHH3wQgDmhRwFo4VfMI8e3NgpjjEnDtm7dSkBAAFu3bqVly5YMHTqUBx6IPbEj1C5dgNa1S3okBqtRGGNMGnTt2jU++eQTatasybFjx5g1axY//fRTnEnC06xGYYwxXjZ146GY20cAp/ZsI2Ty11w8cYhSdf+F3wtvM/1UPqY7h+mILfz4BXyK5PNYfJYojDHGy+aEHiX8+AXK3ZuVHbNHsmflLHIXLEKDd77lAZ/EZ1/wKZLPY+0TYInCGGPShAJnw9n8/SAOHz7MO++8Q+/evcmTJ4+3wwIsURhjjFedPXuWjRO+4uCGBVSoUIE1a9ZQt25db4d1C2vMNsYYL5k5cyYVK1bk0KYgKj75Olu3bk1zSQIsURhjTKob/lswxas15MUXXyQi572UaTeEyi3akzNnTm+HFie79WSMMalEVZkwYQLvd36XyOvXqPx8Rx5+/BWyZM3m0cbo5LJEYYwxqWD//v0EBgayZMkS7itblZqvdWd+j5e9HZZbLFEYY4wH3bx5k+HDh9O9e3eyZMnCiBEjWE4VJEv6ufNvicIYY1JI7I5zF44fIHhyH87sC+MB3zrUePUDVmR5gN893EEupVmiMMaYFBLdca5C4bv5Y9EPhM//nmw5clG7TQ9K1mqGiACe7yCX0ixRGGNMCnrgxjF2jRpE2PbtvPTSSwwdOpTChQt7O6xksURhjDEp4OrVq2z7eTi7Fv/IAw/cz+zZs3nuuee8HVaKsERhjDFOsdsY3HVq91aCJ/fl0snDlK73DFt+ncQ999zjgQi9wxKFMcY4RbcxuNvQHHH1Mttnj2Dvqtnkvq8oj703hMBWLTJUkgBLFMYYcwufIvmY3r5OouXmz59Phw4dOHLkCO+//z5fffUVuX
PnToUIU58lCmOMuQOnT5/m/fff54cffsDHx4d169bxyCOPeDssj0o/PT6MMcaLVJXp06fj4+PDtGnT6NGjB1u2bMnwSQKsRmGMyUCS2hgdLb72iWPHjtGxY0fmzp2Lv78/S5YsoUqVKskJNV2xGoUxJsOIboxOqtgd4VSVsWPH4uPjQ1BQEAMGDGD9+vWZKkmA1SiMMRmMu43Ridm3bx/t2rVj2bJlPPbYY4wdO5ayZcumQITpj9UojDHGxc2bN/n222+pVKkSwcHBjBo1imXLlmXaJAGWKIwxGcTUjYfYuP9ssvaxc+dO6tWrR5cuXWjcuDHh4eEEBgaSJR2N9OoJmfvTG2MyjOhG7KQMtnfjxg2+/PJLqlWrxt69e5k6dSrz5s2jePHiKR1mumRtFMaYDKN26QK0rl3yjrYJDg7mzTffJCwsjNatWzNo0CAKFSrkoQjTJ6tRGGMypStXrtCtWzceeeQR/v77b+bOncuUKVMsScTBahTGmHQlvr4SdzJG04oVK2jbti179+6lffv29OvXj/z586d0qBmG1SiMMelKfH0l3JkM6Pz587Rv355GjRoBsGzZMkaOHGlJIhFWozDGpDtJ6Ssxb948OnbsyPHjx+nWrRs9e/bk7rvv9lCEGYtHaxQi0lxE/hSRPSLycRzr84vIPBHZJiI7RaSNJ+MxxmQ+p06donXr1jz77LPce++9rF+/nv79+1uSuAMeSxQikhUYDjwJ+ACviIhPrGKdgHBVrQo0BL4Rkbs8FZMxJvNQVaZOnUrFihWZOXMmPXv2ZPPmzdSqVcvboaU7nrz1VAvYo6r7AERkGtACCHcpo0Beccw4ngc4C0R6MCZjTDoTu/HanUbrI0eO0LFjR3799Vdq167NuHHj8PX19XSoGZYnbz0VAw67LB9xvudqGFAROAbsAN5V1ajYOxKRQBEJEZGQU6dOeSpeY0waFLvxOqFG66ioKEaNGoWPjw9Lly5l4MCBrF271pJEMnmyRiFxvKexlpsBoUBj4CFgsYisVtVbHmlQ1dHAaAB/f//Y+zDGZHDuNF7v2bOHdu3asWLFCho3bsyYMWMoU6ZMKkWYsXmyRnEEKOGyXBxHzcFVG+BnddgD7AcqeDAmY0wGExkZyYABA6hcuTJbtmxhzJgxLFmyxJJECvJkjSIYKCcipYGjQCugdawyh4AmwGoRuR94GNjnwZiMMWlMYpMNJdQmsX37dgICAggJCeHZZ59lxIgRFCt252M9mYR5rEahqpFAZ2AR8DswQ1V3ikgHEengLPYVUFdEdgBLgY9U9bSnYjLGpD2JTTYUV5vE9evX+fzzz6lRowYHDx5k+vTp/PLLL5YkPMSjHe5UdT4wP9Z7I11+PwY84ckYjDFp3510oNuwYQMBAQGEh4fzf//3fwwaNIiCBQt6OMLMzYbwMMakC5cvX6ZLly7UrVuXCxcu8NtvvzF58mRLEqnAhvAwxqR5S5cupV27duzfv5+OHTvSt29f8uVzbwBAk3yWKIwxqc61ATuhxupz587xwQcfMHbsWMqVK8fKlStp0KBBaoZqsFtPxhgvcG3Ajq8D3Zw5c/Dx8WH8+PF8+OGHbNu2zZKEl1iNwhjjFfE1YJ84cYJ33nmHGTNmUKVKFebOnYu/v78XIjTRrEZhjEkTVJUffvgBHx8ffvnlF3r16kVISIgliTTAahTGmFSRULvEoUOH6NChAwsWLKBOnTqMGzeOihUreitUE4vVKIwxqSKudomoqCi+++47fH19WblyJYMHD2b16tWWJNIYq1EYY1KNa7vErl27aNiwIatXr+bxxx9n9OjRlC5d2ssRmrhYjcIYk6oiIyPp168fVapUYceOHYwfP56goCBLEmmY1SiMMR4V3TYRfvwCRW+eoHbtzmzZsoXnn3+e4cOHU6RIEW+HaBJhicIY41FzQo8Sdvg0umUWS4OmcN99BZk5cyYvvPCCt0MzbrJEYYzxqNN7d3B4ch8u/nWQ119/nYEDB1KgQAFvh2XugCUKY4xHXLp0iU8//ZRlQ4dy972FWbhwIc2aNfN2WCYJLFEYY1JcUFAQgYGBHDx4kLINW1L5ufY0a/a4t8MySWSJwhgTp8RmnovLjcsXCJ05hAPr55P3/pI06vYdp3KXJnvO3B6K0qQGtxOFiORW1cueDMYYk3ZEP6kU38iusR3ZuoItP37D9UvnqND8P/j+qw1Zs+egEMQ56J9JPxJNFCJSFxgL5AFKikhVoL2qvuXp4Iwx3uXOzHN//fUXnTt3Zt2sWfj5+TF+/GKqVauWShGa1OBOh7tvgWbAGQBV3QbYWL/GZHKqysSJE/Hx8eHXX3+lT58+bNq0yZJEBuTWrSdVPSwirm/d9Ew4xhhvid0mkdBtp4MHD9K+fXsWLVpEvXr1GDt2LBUqVEitUE0qc6dGcdh5+0lF5C4R6Qb87uG4jDGpzHXQPoh7QqGoqCiGDRuGr68va9asYejQoaxatcqSRAbnTo2iAzAYKAYcAYIAa58wJgNKqE3ijz/+oG3btqxdu5ZmzZoxatQoHnzwwVSO0HiDOzWKh1X1VVW9X1ULq+r/ATYGsDGZREREBH369KFq1aqEh4czceJEFixYYEkiE3GnRjEUqO7Ge8aYNMjd/hBxtUls3bqVN998k9DQUFq2bMmwYcO4//77PRWqSaPiTRQiUgeoCxQSkS4uq/IBWT0dmDEmZbjbH8K1TeLatWv07NmT/v37U6hQIWbNmsW///3v1AjXpEEJ1SjuwtF3IhuQ1+X9C0BLTwZljElZ7vSHiLZmzRoCAgLYtWsXbdq04ZtvvuHee+/1cIQmLYs3UajqSmCliExQ1YOpGJMxxgsuXrxI9+7dGT58OKVKlSIoKIimTZt6OyyTBrjTRnFFRPoDvkDO6DdVtbHHojLGpKqFCxfSvn17Dh8+zLvvvkuvXr3IkyePt8MyaYQ7Tz1NAf4ASgM9gQNAsAdjMsakgKkbD/HyqPW39I2I7cyZM7z++us8+eST5M6dm7Vr1zJo0CBLEuYW7iSKgqo6DohQ1ZWq+ibwiIfjMsYkk2sjduyOc6rKzJkz8fHxYerUqfz3v/9l69at1KnjXjuGyVzcufUU4fx5XET+BRwDinsuJGNMSomrEfv48eN06tSJ2bNnU6NGDYKCgqhataqXIjTpgTuJopeI5Ae64ug/kQ94z6NRGWNSnKry/fff07VrV65du0a/fv3o0qUL2bLZtDQmYYneelLVX1X1vKqGqWojVa0BnE2F2IwxSTR14yE27v/nv+n+/ft54oknCAgIoHLlymzbto0PP/zQkoRxS7yJQkSyisgrItJNRCo533taRNYBw1ItQmPMHYvuif1M5QcYPHgwlSpVYuPGjYwYMYIVK1ZQvnx5L0do0pOELifGASWATcAQETkI1AE+VtVf3Nm5iDTHMaBgVmCsqvaNo0xDYBCQHTitqo/d0ScwxsSpQo7zjOj2KuvXr+fJJ59k1KhRlChRwtthmXQooUThD1RR1SgRyQmcBsqq6l/u7FhEsgLDgaY4Rp0NFpG5qhruUuYeYATQXFUPi
UjhpH4QY4xDREQE4fO/J3z+BO7Nn48ffviB1q1bE2tOGWPcllCiuKGqUQCqek1EdrmbJJxqAXtUdR+AiEwDWgDhLmVaAz+r6iHncU7eUfTGZDKJDfB39uAfBE/qw/mjeyjh34SQ36ZSuLBdf5nkSShRVBCR7c7fBXjIuSyAqmqVRPZdDDjssnwEqB2rTHkgu4iswDGe1GBVnRR7RyISCAQClCxZMpHDGpNxxTfAX+SN6+z8dSy7Fv9IjnwFqNehL2+98YolCZMiEkoUyZ1zIq56rsZx/BpAEyAXsF5ENqjqrls2Uh0NjAbw9/ePvQ9jMpXYfSNWrlxJ27bt2LNnD23btqV///7cc889XozQZDQJDQqY3IEAj+BoDI9WHEdnvdhlTqvqZeCyiKwCqgK7MMYk6MKFC3z00UeMHDmSMmXKsGTJEpo0aeLtsEwG5M4QHkkVDJQTkdIichfQCpgbq8wcoL6IZBORu3HcmrL5uI1JxPz58/H19WX06NF06dKF7du3W5IwHuOx3jaqGikinYFFOB6PHa+qO0Wkg3P9SFX9XUQWAtuBKByP0IZ5KiZj0hJ3Z55ztX3vYa6sHMeMTUH4+Pgwc+ZMateO3fRnTMpyK1GISC6gpKr+eSc7V9X5wPxY742Mtdwf6H8n+zUmI3B35jlwDL9xePNS9k8fSOSVS3z++ed0796dHDlypEKkJrNLNFGIyDPAABwz3pUWET/gS1V91tPBGZPRuTPz3NGjR3nrrbfYMHcu/v7+jB8/nsqVK6dShMa410bxBY4+EecAVDUUKOW5kIwx4KhFjBkzBh8fH4KCghgwYADr16+3JGFSnTu3niJV9bz16jQm9ezdu5d27dqxfPlyGjZsyJgxYyhbtqy3wzKZlDs1ijARaQ1kFZFyIjIUWOfhuIzJlG7evMnAgQOpXLkymzdvZtSoUSxdutSShPEqdxLF2zjmy74OTAXOY/NRGJPiwsLCqFu3Ll27dqVJkybs3LmTwMBAsmTx5FPsxiTOnVtPD6vqp8Cnng7GmMzoxo0bfP311/Tu3Zv8+fMzdepUWrVqZYP4mTTDnUQxUESKAD8B01R1p4djMibTOLM/nBo1AgkLC6N169YMGjSIQoUKeTssY26RaKJQ1UYi8gDwEjBaRPIB01W1l8ejMyaDunLlCqEzh7J76XSKFi3CvHnzePrpp70dljFxcuvmp6r+papDgA5AKNDDo1EZk4EtX76cypUrs2vJj5R+9Fl27txpScKkaYkmChGpKCJfiEgYjilQ1+EY4M8YcwfOnz9P+/btady4MSJCw/eH4f/qh+TPn9/boRmTIHdqFN8DfwNPqOpjqvqdTTBkzJ2ZN28ePj4+jB07lm7durF9+3YKP1zd22EZ4xZ32igeSY1AjPG2pAzSl5hrF/8mdMYgDgUvJn+xh2j84WgOlfKhzeRtbo/zZIy3xZsoRGSGqr4kIju4dcIhd2e4MyZduZNB+hKjqhwKXszW6d8See0yvs+0pUKz18iaLXtMGZ8i+WjhVyzZxzLG0xKqUbzr/GmtbCbTcGeQvsQcPnyYjh07svG336hduzbjxo3D19c3hSI0JvXF20ahqsedv76lqgddX8BbqROeMelHVFQUo0aNwtfXl+XLl/Ptt9+ydu1aSxIm3XOnMbtpHO89mdKBGONNUzceYuP+s0nefvfu3TRu3JgOHTpQq1YtduzYwXvvvUfWrFlTMEpjvCPeRCEiHZ3tEw+LyHaX134cM9IZk2FEN2LfaZtBZGQkAwYMoEqVKoSGhjJ27FgWL15MmTJlPBGmMV6RUBvFVGAB8DXwscv7F1U16ZdexqRRtUsXoHXtkm6X3759OwEBAYSEhNCiRQtGjBhB0aJFPRihMd6R0K0nVdUDQCfgossLESng+dCMSZuuX79Ojx49qFGjBgcPHmT69OnMnj3bkoTJsBKrUTwNbMbxeKzrUJYKWN3apGuu/SbcfSx2w4YNBAQEEB4ezmuvvca3335LwYIFPR2qMV6V0FNPTzt/llbVMs6f0S9LEibdi+43AYn3abh8+TLvv/8+devW5eLFi8yfP59JkyZZkjCZQqI9s0WkHhCqqpdF5P+A6sAgVT3k8eiM8TB3+k0sXbqUdu3asX//ft566y2+/vpr8uWzHtUm83Dn8djvgCsiUhX4EDgITPZoVMakAefOnaNt27Y8/vjjZMuWjZUrVzJ8+HBLEibTcSdRRKqqAi2Awao6GMjr2bCM8a5ffvkFHx8fJkyYwEcffcS2bdto0KCBt8MyxivcmeHuooh0B14D6otIViB7ItsYk6bENeBfXA3YJ06c4O233+ann36iatWqzJs3jxo1aqRmqMakOe7UKF4GrgNvqupfQDGgv0ejMiaFuTZcR3NtwFZVJk+ejI+PD3PmzKFXr14EBwdbkjAG94YZ/0tEpgA1ReRpYJOqTvJ8aMakrPgarg8dOkSHDh1YsGABderUYdy4cVSsWNELERqTNrkzw91LwCbgRRzzZm8UkZaeDswYT4uKimLEiBH4+vqyatUqhgwZwurVqy1JGBOLO20UnwI1o2e1E5FCwBJgpicDMyYlJxKK3R6xa9cu2rZty+rVq2natCmjR4+mVKlSKXIsYzIad9oossSa+vSMm9sZkyxxtSskVXR7RGRkJP369aNKlSrs2LGD77//nkWLFlmSMCYB7tQoForIIuBH5/LLwHzPhWTMP1JiIqFooaGh1K79PFu2bOH5559n+PDhFClSJEX2bUxGlmjNQFU/AEYBVYCqwGhV/cjTgRmTUq5du8ann36Kv78/R48eZebMmfz888+WJIxxU0JzZpcDBgAPATuAbqqasjPPGxOP6ImEapdO3kDF69atIyAggD/++IPXX3+dgQMHUqCADX5szJ1IqEYxHvgVeAHHCLJDUyUiY0j6RELRLl26xDvvvMOjjz7KlStXWLhwIRMmTLAkYUwSJNRGkVdVxzh//1NEtqRGQMZEu9OJhKIFBQURGBjIoUOH6NSpE3369CFvXht1xpikSqhGkVNEqolIdRGpDuSKtZwoEWkuIn+KyB4R+TiBcjVF5Kb1zzDJcfbsWdq0aUOzZs3ImTMnq1atYujQoZYkjEmmhGoUx4GBLst/uSwr0DihHTvHhBoONAWOAMEiMldVw+Mo1w9YdGehG/OPWbNm0alTJ06fPk337t3p0aMHOXPm9HZYxmQI8SYKVW2UzH3XAvao6j4AEZmGYwTa8Fjl3gZmATWTeTyTQdxJQ/Zff/1F586dmTVrFtWqVWPBggVUq1YtFaI0JvPwZMe5YsBhl+UjzvdiiEgx4HlgZEI7EpFAEQkRkZBTp06leKAmbXGnIVtVmTBhAj4+Pvz66698/fXXbNy40ZKEMR7gyUQhcbynsZYHAR+p6s2EdqSqo1XVX1X9CxUqlGIBmrQroYbsAwcO0Lx5c9q0aYOvry/btm3j448/Jnt2G/3eGE9wp2d2Uh0BSrgsFweOxSrjD0wTEYD7gKdEJFJVf/FgXCadioqKYvjw4XTv3h0RYdiwYXTs2JEsWWxEGWM8yZ05swV4
FSijql+KSEngAVXdlMimwUA5ESkNHAVaAa1dC6hqaZfjTAB+tSSReUUPAhjXhEJ//PEHbdu2Ze3atTRr1oxRo0bx4IMPeilSYzIXdy7FRgB1gFecyxdxPM2UIFWNBDrjeJrpd2CGqu4UkQ4i0iGJ8ZoMzDVJRLdPRERE0KdPH6pWrUp4eDgTJ05kwYIFliSMSUXu3HqqrarVRWQrgKr+LSJ3ubNzVZ1PrAEEVTXOhmtVfcOdfZqMzXUQwC1bthAQEEBoaCgvvvgiQ4cO5f777/dyhMZkPu7UKCKcfR0UYuajiPJoVCZTu3r1Kt27d6dWrVr89ddf/Pzzz8yYMcOShDFe4k6NYggwGygsIr2BlsB/PRqVybRO7dmGn98b7Nq1izfffJMBAwZw7733ejssYzI1d+bMniIim4EmOB55fU5Vf/d4ZCbDimvmuohrl1k5ZQhng31r3x0AACAASURBVOdRqlQpFi9ezOOPP+6lCI0xrtx56qkkcAWY5/qeqh7yZGAm44r9ZNPxsPVsnvI/rpw7SfOX3+SnsYPJkyePl6M0xkRz59bTbzjaJwTICZQG/gR8PRiXyeB8iuRjRMvyvP/++6yePJmKFSsy7rfZ1KmTMrPZGWNSjju3niq7LjtHjm3vsYhMhqeqHNmyHJ8ez3H27Fk+++wzPv30U3LkyOHt0IwxcbjjntmqukVEbAA/E6+42iCiXT1/mhXf9+XiH+uoUaMGQUFBVK1aNZUjNMbcCXfaKLq4LGYBqgM2Mp+JV1y9q1WV/et+Y9vMIURF3uCVzt2Z9O2XZMvmyVFkjDEpwZ3/pa6zvkTiaLOY5ZlwTEbh2nFu//79BAYGErJkCQ0aNGDMmDGUL1/eyxEaY9yVYKJwdrTLo6ofpFI8JgO5efMmw4YN45NPPiFr1qx89913BAYG2iB+xqQz8SYKEcmmqpHuTntqjOugfsX0DI8+2oUNGzbw1FNPMXLkSEqUKJH4TowxaU5CNYpNONojQkVkLvATcDl6par+7OHYTDozJ/QoYUfOINvmsnThRPLny8sPP/xA69atcQ4lb4xJh9xpoygAnMExR3Z0fwoFLFGYW5w9+DtHJ33N+aN7aNWqFYMHD6Zw4cLeDssYk0wJJYrCzieewvgnQUSLPVOdycSuXr3K559/ztIB35AzXwHmzJnDs88+6+2wjDEpJKFEkRXIg3tTmppMauXKlbRt25Y9e/ZQ5tFnqfLvTjz77BPeDssYk4ISShTHVfXLVIvEpAvRDdYRVy+zffYI9q6aTe77ivLYe0M4k788d92dN/GdGGPSlYQShbU+mtvMCT3KxpWLOblwONfOnab8462o9Ew7suXIxf0QMzOdMSbjSChRNEm1KEy6cPr0aTaM/4JDm4Lw9fVl3IK51K5d29thGWM8LN6eT6p6NjUDMWmXqjJt2jQqVqzIkc3L8PnXm2zZssWShDGZhHWRNQk6evQozz33HK+88gpRuQtRpt1QKj3TlrvucmvadGNMBmAjspk4qSpjx46lW7duREREUPWFzkT6PIlvsXutHcKYTMYShbnN3r17adeuHcuXL6dhw4aMGTOGT5c6BgyOHujPGJN52K0nE+PmzZsMHDiQypUrs3nzZkaPHs3SpUspW7ast0MzxniR1SgMAGFhYQQEBLBp0yaeeeYZnmj3X5Ycu8mSMRsBbptfwhiTeViNIpO7ceMGPXv2pHr16uzbt48ff/yROXPmsPrYTcKPX4gp51Mkn7VNGJNJWY0iE9u0aRMBAQGEhYXRunVrBg8ezH333Rez3nXyIWNM5mU1ikzoypUrdO3alTp16vD3338zb948pkyZckuSMMaYaFajyGSWL19O27Zt2bdvH+3bt6dfv37kz5/f22EZY9IwSxQZXPQgfjeuXmL7rGHsWzOXPIWK0/D9Yfz9cHUCp4XHuZ01XhtjolmiyODmhB5lw4ogTi0YxrULZ3m46av4PhNAtrtyJridNV4bY6JZosjATp06xfqxPTgcsoTKlSszfvwC/P39vR2WMSadscbsDEhVmTp1KhUrVuTo1hX4PtOOkJAQSxLGmCSxGkU6F90GEe3K2RNs/rE/x3eso0BpX8q81BvfqlVsED9jTJJZokjn5oQeJfz4BSren4e9a+aw/efhaFQUfi++S9lGLcmSJau1NRhjksWjiUJEmgODccy/PVZV+8Za/yrwkXPxEtBRVbd5MqaMqESW85yc/jVbVq6kSZMmjB49mjJlyng7LGNMBuGxRCEiWYHhQFPgCBAsInNV1fV5zP3AY6r6t4g8CYwGbDYcN0VGRvJH0BR2zhtDnrtzMW7cONq0aYOIzWJrjEk5nqxR1AL2qOo+ABGZBrQAYhKFqq5zKb8BKO7BeDKU7du3ExAQwPaQEIpWrU/w/GkULVrU22EZYzIgTz71VAw47LJ8xPlefAKABXGtEJFAEQkRkZBTp06lYIjpz/Xr1+nRowc1atTg0KFD1GnXi3od+lqSMMZ4jCcTRVz3PzTOgiKNcCSKj+Jar6qjVdVfVf0LFSqUgiGmL+vXr6datWp89dVXvPLKK4SHh1OiRmO71WSM8ShPJoojQAmX5eLAsdiFRKQKMBZooapnPBhPunX58mXee+896tWrx6VLl5g/fz6TJk2iYMGC3g7NGJMJeDJRBAPlRKS0iNwFtALmuhYQkZLAz8BrqrrLg7GkW0uWLKFSpUoMHjyYjh07EhYWxpNPPuntsIwxmYjHGrNVNVJEOgOLcDweO15Vd4pIB+f6kUAPoCAwwnn7JFJVrfswcO7cObp27cr48eMpV64cq1aton79+jHrozva2eB9xhhP82g/ClWdD8yP9d5Il9/bAm09GUN69Msvv/DWW29x8uRJPv74Y3r06EGuXLluKeOaJKxDnTHGk6xndhpy4sQJ3n77bX766SeqVq3KvHnzqFGjRrzlbQY6Y0xqsEEB0wBVZfLkyfj4+DBnzhx69+5NcHBwgknCGGNSi9Uokin2oHx36vLZv9g85X/8tXMDBctUpvHb3dlWsBT/Nz4kwe2sbcIYk1osUSRTUhuUNSqKPatms2P2d4BS7eX3KfvYC0gW9yp51jZhjEktlihSwJ22Ffz555+0bduWrWvW0LRpU0aPHk2pUqU8F6AxxiSDtVGkosjISPr27UvVqlUJCwvj+++/Z9GiRZYkjDFpmiWKZJi68RAb9591q2xoaCi1a9eme/fu/Otf/+L333/njTfesOE3jDFpniWKZIhuxE6oreDatWt8+umn+Pv7c/ToUWbOnMmsWbN44IEHUitMY4xJFmujSKbapQvQunbJONetXbuWgIAA/vzzT9544w2++eYbChQokMoRGmNM8liNwgMuXbrEO++8Q/369bl27RqLFi3i+++/tyRhjEmXLFGksKCgICpVqsSwYcPo3LkzYWFhPPHEE94OyxhjkswSRQo5e/Ysbdq0oVmzZuTMmZPVq1czZMgQ8uTJ4+3QjDEmWSxRpIBZs2bh4+PD5MmT+eSTTwgNDaVevXreDssYY1KENWYnw9Xzp9kybSAztq6gWrVqLFy4ED8
/P2+HZYwxKcoSRRKoKhMnTmRhz3e4eeM6ffv2pUuXLmTPnt3boRljTIqzROGm6MH/Lp8+TsiUvpz4PZi7S/rS5M3/8tFHrVIlhoiICI4cOcK1a9dS5XjGmPQnZ86cFC9ePEUvXC1RuOmXLYdZM3cKp5ZPAITqrbryUIPnea56icQ2TTFHjhwhb968lCpVynp0G2Nuo6qcOXOGI0eOULp06RTbryUKN/z+++8sG9CRM/t20Lx5c0aOHMmDDz6Y6nFcu3bNkoQxJl4iQsGCBTl16lSK7teeekpAREQEffr0wc/Pj4snDlLrjc+YP3++V5JENEsSxpiEeOI7IkPXKJIzqdDfh/4keFIfzh3ZTfHqjcnTqB2lyj1oX9TGmEwnQ9cooicVuhORN66zffYIlvRty7ULZ6nb/mvqBvaiSrkHbaIgHO0kLVq0oFy5cjz00EO8++673LhxI9Ht+vTpE++6EydO8PTTT1O1alV8fHx46qmnUjLk2xw4cIBKlSrd0TZZs2bFz8+PSpUq8cwzz3Du3LmYdTt37qRx48aUL1+ecuXK8dVXX6GqMesXLFiAv78/FStWpEKFCnTr1i3OY7hbzlvatm1LeHg4cOu/Z1LOZ2Z1/fp1Xn75ZcqWLUvt2rU5cOBAnOUaNmzIww8/jJ+fH35+fpw8eRKAgQMH4uPjQ5UqVWjSpAkHDx5MncBVNV29atSooe56aeQ6fWnkOrfLr1q1SsuXL6+ABgQE6NmzZ93eNjWEh4d79fhRUVFas2ZNHT9+vKqqRkZG6ptvvqndunVLdNvcuXPHuy4wMFAHDRoUs7xt27bkB5uA/fv3q6+v7x1t4xr/f/7zH+3Vq5eqql65ckXLlCmjixYtUlXVy5cva/PmzXXYsGGqqrpjxw4tU6aM/v7776qqGhERocOHD79t/+6Wi09kZOQdfZ7kcj0fSTmfaUVERESqHm/48OHavn17VVX98ccf9aWXXoqz3GOPPabBwcG3vb9s2TK9fPmyqqqOGDEi3u3j+q4AQjSJ37sZ+taTuy5evMjHH3/MiBEjKFWqFIsXL+bxxx/3dlgJ6jlvJ+HH7qy2lBifovn4/BnfeNcvW7aMnDlz0qZNG8Bxlf3tt99SunRpevbsyYwZMwgJCWHYsGEAPP3003Tr1o2FCxdy9epV/Pz88PX1ZcqUKbfs9/jx47eMh1WlShXAMbhiixYt+Pvvv4mIiKBXr160aNGCAwcO0Lx5cx599FE2bNhA1apVadOmDZ9//jknT55kypQp1KpViy+++IK9e/dy9OhRDh8+zIcffki7du1uOfbNmzf5+OOPWbFiBdevX6dTp060b98+wfNUp04dtm/fDsDUqVOpV69eTPx33303w4YNo2HDhnTq1In//e9/fPrpp1SoUAGAbNmy8dZbb922z4TKvfHGGzz99NO0bNkSgDx58nDp0iVWrFhBz549KVKkCKGhoTzzzDM8+OCDMdt98cUX5M2bl65du9K/f39mzJjB9evXef755+nZs+ctx58xYwYbNmxg4MCBDB48mMGDB7Nv3z727t3L66+/zpo1a2jYsCEDBgxg5syZt/x79u7dm5s3b9KuXTvWrVtHsWLFmDNnDrly5UrwPHrapk2beO+997h69Sq5cuXi+++/5+GHH2bChAn89ttvXLt2jcuXLzNv3jzefvttduzYQWRkJF988UXM39lrr73G5cuXARg2bBh169ZNVkxz5szhiy++AKBly5Z07twZVXX7lnajRo1ifn/kkUf44YcfkhWPuzL0rSd3LFiwAF9fX7777jvee+89wsLC0nyS8JadO3dSo0aNW97Lly8fJUuWZM+ePfFu17dvX3LlykVoaOhtSQKgU6dOBAQE0KhRI3r37s2xY8cAx/Pgs2fPZsuWLSxfvpyuXbvG3NLZs2cP7777Ltu3b+ePP/5g6tSprFmzhgEDBtxyW2T79u389ttvrF+/ni+//DJm39HGjRtH/vz5CQ4OJjg4mDFjxrB///54P8vNmzdZunQpzz77bLzn5KGHHuLSpUtcuHCBsLCw29bHxd1ysW3atInevXsTHh5Oq1atmD59esy6GTNm8OKLLxIUFMTu3bvZtGkToaGhbN68mVWrVt2ynwYNGrB69WoAVq9eTcGCBTl69Chr1qyhfv36t5SN699z9+7ddOrUiZ07d3LPPfcwa9asO/4sKa1ChQqsWrWKrVu38uWXX/LJJ5/ErFu/fj0TJ05k2bJl9O7dm8aNGxMcHMzy5cv54IMPuHz5MoULF2bx4sVs2bKF6dOn884778R5nPr168fcInJ9LVmy5LayR48epUQJxyP12bJlI3/+/Jw5cybO/bZp0wY/P7/bbmVGGzduHE8++WRSTs0dy7Q1ijNnzvD+++8zefJkKlasyNq1a6lTx/15r70toSt/T4nvyudOroji0qxZM/bt28fChQtZsGAB1apVIywsjHvuuYdPPvmEVatWkSVLFo4ePcqJEycAKF26NJUrVwbA19eXJk2aICJUrlz5lvu+LVq0IFeuXOTKlYtGjRqxadOmW4ZZCQoKYvv27cycOROA8+fPs3v37tueQY++gj5w4AA1atSgadOmiX721HjwoVatWjGxVqtWjZMnT3Ls2DFOnTrFvffeS8mSJRkyZAhBQUFUq1YNcNTUdu/eTYMGDWL288ADD3Dp0iUuXrzI4cOHad26NatWrWL16tX8+9//TjSO0qVLx5zXGjVqxHvvPTWdP3+e119/nd27dyMiRERExKxr2rRpzLD/QUFBzJ07lwEDBgCOx9APHTpE0aJF6dy5M6GhoWTNmpVdu3bFeZzoBOuOuL7w4/o7mTJlCsWKFePixYu88MILTJ48mf/85z8x63/44QdCQkJYuXKl28dOjkxXo1BVZsyYQcWKFfnxxx/57LPP2Lp1a7pKEt7i6+tLSEjILe9duHCBw4cP89BDD5EtWzaioqJi1sXXg3z48OExV13RV/gFChSgdevWTJ48mZo1a7Jq1SqmTJnCqVOn2Lx5M6Ghodx///0x+8yRI0fM/rJkyRKznCVLFiIjI2PWxf5PGHtZVRk6dCihoaGEhoayf//+OIeFj76CPnjwIDdu3GD48OHxnpN9+/aRJ08e8ubNi6+vL5s3b47zPLhKqJzreVXVWx4eyJ079y1lW7ZsycyZM5k+fTqtWrWK2aZ79+4xn3HPnj0EBATcdpw6derE3J6pX78+q1evZv369W4NcOn675E1a9Zb/g285bPPPqNRo0aEhYUxb968W/4eXc+bqjJr1qyY83Po0CEqVqzIt99+y/3338+2bdsICQmJ96GNO6lRFC9enMOHDwMQGRnJ+fPn45ynplgxx4MzefPmpXXr1mzatClm3ZIlS+jduzdz58695bx7UqZKFMeOHePf//43L7/8MiVLlmTz5s18+eWXqXay07smTZpw5coVJk2aBDhuw3Tt2pU33niDu+++m1KlShEaGkpUVBSHDx++5Y87e/bsMVd0nTp1ivlPWbRoUZYtW8aVK1cAR3vR3r17KVmyJOfPn6dw4cJkz56d5cuXJ+
kJjzlz5nDt2jXOnDnDihUrqFmz5i3rmzVrxnfffRcT265du2LuScclf/78DBkyhAEDBhAREcGrr77KmjVrYr4Url69yjvvvMOHH34IwAcffECfPn1irkajoqIYOHDgbftNqFypUqViksicOXNuuTKOrVWrVkybNo2ZM2fGtGk0a9aM8ePHc+nSJcBx+yP6KRpXDRo0YMCAATRo0IBq1aqxfPlycuTIQf78+W8r6/rvmVadP38+5gt3woQJ8ZZr1qwZQ4cOjbna37p1a8z2RYoUIUuWLEyePJmbN2/Guf3q1atj/p5dX3Hdwn722WeZOHEiADNnzqRx48a3XbxERkZy+vRpwNGX69dff415qmzr1q20b9+euXPnUrhw4Ts4G8mTKRKFqjJu3Dh8fHxYuHAh//vf/9iwYUNMo6lxj4gwe/ZsfvrpJ8qVK0f58uXJmTNnTJtAvXr1Ym4JdevWjerVq8dsGxgYSJUqVXj11Vdv2+/mzZvx9/enSpUq1KlTh7Zt21KzZk1effVVQkJC8Pf3Z8qUKTENvXeiVq1a/Otf/+KRRx7hs88+o2jRoresb9u2LT4+PlSvXp1KlSrRvn37RK+Gq1WrRtWqVZk2bRq5cuVizpw59OrVi4cffpjKlStTs2ZNOnfuDDga5gcNGsQrr7xCxYoVqVSpEsePH79tnwmVa9euHStXrqRWrVps3LjxtlqEK19fXy5evEixYsUoUqQIAE888QStW7emTp06VK5cmZYtW3Lx4sXbtq1fvz6HDx+mQYMGZM2alRIlSvDoo4/GeZyE/j3Tig8//JDu3btTr169eL/kwVHziIiIoEqVKlSqVInPPvsMgLfeeouJEyfyyCOPsGvXrgTPu7sCAgI4c+YMZcuWZeDAgfTt2zdmXfStu+vXr9OsWTOqVKmCn58fxYoVi3kI44MPPuDSpUu8+OKL+Pn5xbSVeZrEdc8sLfP399fYVf34vDxqPZdOHeX6ipEsXbqUBg0aMHbsWMqVK+fhKD3j999/p2LFit4OI9344osvyJMnT5rrj2CMp8X1XSEim1XVPyn7y7CN2Tdv3mTX0unsmDOKu3Nk57vvviMwMJAsWTJFJcoYY1JMhkwU4eHhBAQEELphA0Uq1WHj/Okxj6SZzCP6eXVjTPJkqMvrGzdu8NVXX+Hn58fu3bup3eZzHu00IEMlifR2q9AYk7o88R2RLmsUcQ32d/ZAOMGTv+b80b2U8H+cai+9x77L2TLUIH45c+bkzJkzFCxYMEN9LmNMylDnfBQ5c+ZM0f2my0QRPdifT5F8RN64xs5549i15Edy5itAvY79KFbV0ZPUJx8ZaiC/4sWLc+TIkRQfa94Yk3FEz3CXktJlogDwKZKPtyrcoG3bQPbs2UO7du3o379/nM98ZxTZs2dP0VmrjDHGHR5toxCR5iLyp4jsEZGP41gvIjLEuX67iFSPaz+xRVy9zOap/WnYsCFRUVEsXbqU0aNHZ+gkYYwx3uKxGoWIZAWGA02BI0CwiMxV1XCXYk8C5Zyv2sB3zp/xOn/+PAu/fJVr507TpUsXvvrqK+6++27PfAhjjDEerVHUAvao6j5VvQFMA1rEKtMCmOQcLn0DcI+IFElop3v27CEya04afziKb775xpKEMcZ4mCfbKIoBh12Wj3B7bSGuMsWAW8Y4EJFAINC5eD3i9KGwpf3aIf1unVsgE7oPOO3tINIIOxf/sHPxDzsX/3g4qRt6MlHE9fxm7Ad83SmDqo4GRgOISEhSu6FnNHYu/mHn4h92Lv5h5+IfIuLe2Edx8OStpyOAa0+34sCxJJQxxhjjRZ5MFMFAOREpLSJ3Aa2AubHKzAX+43z66RHgvKrePrSmMcYYr/HYrSdVjRSRzsAiICswXlV3ikgH5/qRwHzgKWAPcAVo48auR3so5PTIzsU/7Fz8w87FP+xc/CPJ5yLdDTNujDEmdWWoQQGNMcakPEsUxhhjEpRmE4Wnhv9Ij9w4F686z8F2EVknIlW9EWdqSOxcuJSrKSI3RaRlasaXmtw5FyLSUERCRWSniKxM7RhTixv/R/KLyDwR2eY8F+60h6Y7IjJeRE6KSFg865P2vamqae6Fo/F7L1AGuAvYBvjEKvMUsABHX4xHgI3ejtuL56IucK/z9ycz87lwKbcMx8MSLb0dtxf/Lu4BwoGSzuXC3o7bi+fiE6Cf8/dCwFngLm/H7oFz0QCoDoTFsz5J35tptUbhkeE/0qlEz4WqrlPVv52LG3D0R8mI3Pm7AHgbmAWcTM3gUpk756I18LOqHgJQ1Yx6Ptw5FwrkFcdELnlwJIrI1A3T81R1FY7PFp8kfW+m1UQR39Aed1omI7jTzxmA44ohI0r0XIhIMeB5YGQqxuUN7vxdlAfuFZEVIrJZRP6TatGlLnfOxTCgIo4OvTuAd1U1KnXCS1OS9L2ZVuejSLHhPzIAtz+niDTCkSge9WhE3uPOuRgEfKSqNzP4LIDunItsQA2gCZALWC8iG1R1l6eDS2XunItmQCjQGHgIWCwiq1X1gqeDS2OS9L2ZVhOFDf/xD7c+p4hUAcYCT6rqmVSKLbW5cy78gWnOJHEf8JSIRKrqL6kTYqpx9//IaVW9DFwWkVVAVSCjJQp3zkUboK86btTvEZH9QAVgU+qEmGYk6Xszrd56suE//pHouRCRksDPwGsZ8GrRVaLnQlVLq2opVS0FzATeyoBJAtz7PzIHqC8i2UTkbhyjN/+eynGmBnfOxSEcNStE5H4cI6nuS9Uo04YkfW+myRqFem74j3THzXPRAygIjHBeSUdqBhwx081zkSm4cy5U9XcRWQhsB6KAsaoa52OT6ZmbfxdfARNEZAeO2y8fqWqGG35cRH4EGgL3icgR4HMgOyTve9OG8DDGGJOgtHrryRhjTBphicIYY0yCLFEYY4xJkCUKY4wxCbJEYYwxJkGWKEya5Bz5NdTlVSqBspdS4HgTRGS/81hbRKROEvYxVkR8nL9/EmvduuTG6NxP9HkJc46Gek8i5f1E5KmUOLbJvOzxWJMmicglVc2T0mUT2McE4FdVnSkiTwADVLVKMvaX7JgS26+ITAR2qWrvBMq/AfiraueUjsVkHlajMOmCiOQRkaXOq/0dInLbqLEiUkREVrlccdd3vv+EiKx3bvuTiCT2Bb4KKOvctotzX2Ei8p7zvdwi8ptzboMwEXnZ+f4KEfEXkb5ALmccU5zrLjl/Tne9wnfWZF4Qkawi0l9EgsUxT0B7N07LepwDuolILXHMRbLV+fNhZy/lL4GXnbG87Ix9vPM4W+M6j8bcxtvjp9vLXnG9gJs4BnELBWbjGEUgn3PdfTh6lkbXiC85f3YFPnX+nhXI6yy7CsjtfP8joEccx5uAc+4K4EVgI44B9XYAuXEMTb0TqAa8AIxx2Ta/8+cKHFfvMTG5lImO8XlgovP3u3CM5JkLCAT+63w/BxAClI4jzksun+8noLlzOR+Qzfn748As5+9vAMNctu8D/J/z93twjPuU29v/3vZK2680OYSHMcBVVfWLXhCR7EAfEWmAYziKYsD9wF8u2
wQD451lf1HVUBF5DPAB1jqHN7kLx5V4XPqLyH+BUzhG4W0CzFbHoHqIyM9AfWAhMEBE+uG4XbX6Dj7XAmCIiOQAmgOrVPWq83ZXFflnRr78QDlgf6ztc4lIKFAK2Awsdik/UUTK4RgNNHs8x38CeFZEujmXcwIlyZhjQJkUYonCpBev4piZrIaqRojIARxfcjFUdZUzkfwLmCwi/YG/gcWq+oobx/hAVWdGL4jI43EVUtVdIlIDx5g5X4tIkKp+6c6HUNVrIrICx7DXLwM/Rh8OeFtVFyWyi6uq6ici+YFfgU7AEBxjGS1X1eedDf8r4tlegBdU9U934jUGrI3CpB/5gZPOJNEIeDB2ARF50FlmDDAOx5SQG4B6IhLd5nC3iJR385irgOec2+TGcdtotYgUBa6o6g/AAOdxYotw1mziMg3HYGz1cQxkh/Nnx+htRKS885hxUtXzwDtAN+c2+YGjztVvuBS9iOMWXLRFwNvirF6JSLX4jmFMNEsUJr2YAviLSAiO2sUfcZRpCISKyFYc7QiDVfUUji/OH0VkO47EUcGdA6rqFhxtF5twtFmMVdWtQGVgk/MW0KdArzg2Hw1sj27MjiUIx9zGS9QxdSc45hIJB7aISBgwikRq/M5YtuEYVvt/OGo3a3G0X0RbDvhEN2bjqHlkd8YW5lw2JkH2eKwxxpgEWY3CGGNMgixRGGOMSZAlCmOMMQmyRGGMMSZB43rb/AAAABpJREFUliiMMcYkyBKFMcaYBFmiMMYYk6D/B8kAtz/Q3NbeAAAAAElFTkSuQmCC\n", 467 | "text/plain": [ 468 | "
" 469 | ] 470 | }, 471 | "metadata": { 472 | "needs_background": "light" 473 | }, 474 | "output_type": "display_data" 475 | } 476 | ], 477 | "source": [ 478 | "from sklearn.metrics import roc_curve, auc\n", 479 | "import matplotlib.pyplot as plt\n", 480 | "(fpr, tpr, thresholds) = roc_curve(y_test,prediction)\n", 481 | "area = auc(fpr,tpr)\n", 482 | "plt.clf() #Clear the current figure\n", 483 | "plt.plot(fpr,tpr,label=\"Out-Sample ROC Curve with \\\n", 484 | " area = %1.2f\"%area)\n", 485 | "\n", 486 | "plt.plot([0, 1], [0, 1], 'k')\n", 487 | "plt.xlim([0.0, 1.0])\n", 488 | "plt.ylim([0.0, 1.0])\n", 489 | "plt.xlabel('False Positive Rate')\n", 490 | "plt.ylabel('True Positive Rate')\n", 491 | "plt.title('Out sample ROC')\n", 492 | "plt.legend(loc=\"lower right\")\n", 493 | "plt.show()" 494 | ] 495 | }, 496 | { 497 | "cell_type": "markdown", 498 | "metadata": {}, 499 | "source": [ 500 | "- auc=0.59" 501 | ] 502 | }, 503 | { 504 | "cell_type": "code", 505 | "execution_count": 11, 506 | "metadata": { 507 | "ExecuteTime": { 508 | "end_time": "2020-04-01T00:39:43.526478Z", 509 | "start_time": "2020-04-01T00:39:43.519468Z" 510 | } 511 | }, 512 | "outputs": [ 513 | { 514 | "data": { 515 | "text/plain": [ 516 | "(0.503213, 0.0705916305916306)" 517 | ] 518 | }, 519 | "execution_count": 11, 520 | "metadata": {}, 521 | "output_type": "execute_result" 522 | } 523 | ], 524 | "source": [ 525 | "def get_KS(y_prob,y_true):\n", 526 | " fpr,tpr,threshold=roc_curve(y_true,y_prob)\n", 527 | " ks=(tpr-fpr)\n", 528 | " max_=np.argmax(ks)\n", 529 | " \n", 530 | " return threshold[max_],np.max(ks)\n", 531 | "get_KS(prediction,y_test)" 532 | ] 533 | }, 534 | { 535 | "cell_type": "markdown", 536 | "metadata": {}, 537 | "source": [ 538 | "- ks=0.1487, corresponding threshold=0.17198" 539 | ] 540 | } 541 | ], 542 | "metadata": { 543 | "kernelspec": { 544 | "display_name": "Python 3", 545 | "language": "python", 546 | "name": "python3" 547 | }, 548 | "language_info": { 549 | "codemirror_mode": { 550 | "name": "ipython", 551 | "version": 3 552 | }, 553 | "file_extension": ".py", 554 | "mimetype": "text/x-python", 555 | "name": "python", 556 | "nbconvert_exporter": "python", 557 | "pygments_lexer": "ipython3", 558 | "version": "3.7.7" 559 | }, 560 | "toc": { 561 | "base_numbering": 1, 562 | "nav_menu": {}, 563 | "number_sections": true, 564 | "sideBar": true, 565 | "skip_h1_title": false, 566 | "title_cell": "Table of Contents", 567 | "title_sidebar": "Contents", 568 | "toc_cell": false, 569 | "toc_position": {}, 570 | "toc_section_display": true, 571 | "toc_window_display": false 572 | }, 573 | "varInspector": { 574 | "cols": { 575 | "lenName": 16, 576 | "lenType": 16, 577 | "lenVar": 40 578 | }, 579 | "kernels_config": { 580 | "python": { 581 | "delete_cmd_postfix": "", 582 | "delete_cmd_prefix": "del ", 583 | "library": "var_list.py", 584 | "varRefreshCmd": "print(var_dic_list())" 585 | }, 586 | "r": { 587 | "delete_cmd_postfix": ") ", 588 | "delete_cmd_prefix": "rm(", 589 | "library": "var_list.r", 590 | "varRefreshCmd": "cat(var_dic_list()) " 591 | } 592 | }, 593 | "types_to_exclude": [ 594 | "module", 595 | "function", 596 | "builtin_function_or_method", 597 | "instance", 598 | "_Feature" 599 | ], 600 | "window_display": false 601 | } 602 | }, 603 | "nbformat": 4, 604 | "nbformat_minor": 2 605 | } 606 | -------------------------------------------------------------------------------- /2.Binary_classification_models/Model_binary_classification_model5.ipynb: 
-------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Model\n", 8 | "### Input:\n", 9 | "- sequential features with a shape of (60,17)\n", 10 | "- images feature (MTF along the features) with a shape of (31,31,1)\n", 11 | "- non-sequential features with a shape of (14,)\n", 12 | "\n", 13 | "#### Attention mechanism is applied in this model" 14 | ] 15 | }, 16 | { 17 | "cell_type": "markdown", 18 | "metadata": {}, 19 | "source": [ 20 | "## Import data" 21 | ] 22 | }, 23 | { 24 | "cell_type": "code", 25 | "execution_count": 1, 26 | "metadata": {}, 27 | "outputs": [ 28 | { 29 | "name": "stderr", 30 | "output_type": "stream", 31 | "text": [ 32 | "Using TensorFlow backend.\n" 33 | ] 34 | } 35 | ], 36 | "source": [ 37 | "import pandas as pd\n", 38 | "import numpy as np\n", 39 | "import json\n", 40 | "import keras\n", 41 | "from keras.models import Sequential,Model\n", 42 | "from keras.layers import Dense, Dropout, Flatten,concatenate,LSTM,Input,Bidirectional,BatchNormalization\n", 43 | "from keras.layers import Conv2D, MaxPooling2D,AveragePooling2D,Embedding,Masking\n", 44 | "from keras_self_attention import SeqSelfAttention\n", 45 | "from keras import backend as K\n", 46 | "import keras\n", 47 | "from keras import regularizers\n", 48 | "from tensorflow.keras import optimizers\n", 49 | "from tensorflow.keras import metrics" 50 | ] 51 | }, 52 | { 53 | "cell_type": "code", 54 | "execution_count": 2, 55 | "metadata": {}, 56 | "outputs": [], 57 | "source": [ 58 | "with open('/home/yz3698/label.json') as f:\n", 59 | " labels=json.load(f)\n", 60 | "with open('/home/yz3698/non_sequential_features.json') as f:\n", 61 | " non_sequential_features=json.load(f)\n", 62 | "with open('/home/yz3698/padded_sequential_features_3.json') as f:\n", 63 | " sequential_features=json.load(f)" 64 | ] 65 | }, 66 | { 67 | "cell_type": "code", 68 | "execution_count": 3, 69 | "metadata": {}, 70 | "outputs": [], 71 | "source": [ 72 | "with open('featurematrix.json') as f:\n", 73 | " arr_=json.load(f)\n", 74 | "arr_=np.array(arr_)" 75 | ] 76 | }, 77 | { 78 | "cell_type": "markdown", 79 | "metadata": {}, 80 | "source": [ 81 | "## Preprocessing\n" 82 | ] 83 | }, 84 | { 85 | "cell_type": "code", 86 | "execution_count": 4, 87 | "metadata": {}, 88 | "outputs": [], 89 | "source": [ 90 | "feature1=np.array([sequential_features[key] for key in sequential_features.keys()])\n", 91 | "feature2=np.array([non_sequential_features[key] for key in non_sequential_features.keys()])\n", 92 | "label=np.array([labels[key] for key in labels.keys()])" 93 | ] 94 | }, 95 | { 96 | "cell_type": "code", 97 | "execution_count": 5, 98 | "metadata": {}, 99 | "outputs": [ 100 | { 101 | "data": { 102 | "text/plain": [ 103 | "((149923, 31, 31), (149923, 60, 18), (149923, 14), (149923,))" 104 | ] 105 | }, 106 | "execution_count": 5, 107 | "metadata": {}, 108 | "output_type": "execute_result" 109 | } 110 | ], 111 | "source": [ 112 | "arr_.shape,feature1.shape,feature2.shape,label.shape" 113 | ] 114 | }, 115 | { 116 | "cell_type": "code", 117 | "execution_count": 6, 118 | "metadata": {}, 119 | "outputs": [], 120 | "source": [ 121 | "arr_=arr_.reshape(-1,31,31,1)\n", 122 | "feature_page=feature1[:,:,0]\n", 123 | "feature_page[feature_page==-1]=0\n", 124 | "feature_time_pid=feature1[:,:,[14,15,16,17]]\n", 125 | "from sklearn.model_selection import train_test_split\n", 126 | "X_train_arr, X_test_arr,X_train_page, X_test_page,X_train_f2, 
X_test_f2,X_train_time_pid, X_test_time_pid,y_train,y_test= train_test_split(\n", 127 | " arr_,feature_page,feature2,feature_time_pid,label,\n", 128 | " test_size=0.20, random_state=42)\n" 129 | ] 130 | }, 131 | { 132 | "cell_type": "markdown", 133 | "metadata": {}, 134 | "source": [ 135 | "## Model building" 136 | ] 137 | }, 138 | { 139 | "cell_type": "code", 140 | "execution_count": 7, 141 | "metadata": { 142 | "scrolled": true 143 | }, 144 | "outputs": [ 145 | { 146 | "name": "stdout", 147 | "output_type": "stream", 148 | "text": [ 149 | "Model: \"model_1\"\n", 150 | "__________________________________________________________________________________________________\n", 151 | "Layer (type) Output Shape Param # Connected to \n", 152 | "==================================================================================================\n", 153 | "cnn_input (InputLayer) (None, 31, 31, 1) 0 \n", 154 | "__________________________________________________________________________________________________\n", 155 | "batch_normalization_1 (BatchNor (None, 31, 31, 1) 4 cnn_input[0][0] \n", 156 | "__________________________________________________________________________________________________\n", 157 | "conv2d_1 (Conv2D) (None, 31, 31, 2) 20 batch_normalization_1[0][0] \n", 158 | "__________________________________________________________________________________________________\n", 159 | "batch_normalization_2 (BatchNor (None, 31, 31, 2) 8 conv2d_1[0][0] \n", 160 | "__________________________________________________________________________________________________\n", 161 | "max_pooling2d_1 (MaxPooling2D) (None, 15, 15, 2) 0 batch_normalization_2[0][0] \n", 162 | "__________________________________________________________________________________________________\n", 163 | "dropout_1 (Dropout) (None, 15, 15, 2) 0 max_pooling2d_1[0][0] \n", 164 | "__________________________________________________________________________________________________\n", 165 | "conv2d_2 (Conv2D) (None, 15, 15, 2) 38 dropout_1[0][0] \n", 166 | "__________________________________________________________________________________________________\n", 167 | "batch_normalization_3 (BatchNor (None, 15, 15, 2) 8 conv2d_2[0][0] \n", 168 | "__________________________________________________________________________________________________\n", 169 | "rnn_input_pid (InputLayer) (None, 60, 4) 0 \n", 170 | "__________________________________________________________________________________________________\n", 171 | "rnn_input_page (InputLayer) (None, 60) 0 \n", 172 | "__________________________________________________________________________________________________\n", 173 | "dense_input (InputLayer) (None, 14) 0 \n", 174 | "__________________________________________________________________________________________________\n", 175 | "average_pooling2d_1 (AveragePoo (None, 7, 7, 2) 0 batch_normalization_3[0][0] \n", 176 | "__________________________________________________________________________________________________\n", 177 | "bidirectional_1 (Bidirectional) (None, 60, 16) 832 rnn_input_pid[0][0] \n", 178 | "__________________________________________________________________________________________________\n", 179 | "embedding_1 (Embedding) (None, 60, 4) 56 rnn_input_page[0][0] \n", 180 | "__________________________________________________________________________________________________\n", 181 | "batch_normalization_4 (BatchNor (None, 14) 56 dense_input[0][0] \n", 182 | 
"__________________________________________________________________________________________________\n", 183 | "dropout_2 (Dropout) (None, 7, 7, 2) 0 average_pooling2d_1[0][0] \n", 184 | "__________________________________________________________________________________________________\n", 185 | "seq_self_attention_1 (SeqSelfAt (None, 60, 16) 1089 bidirectional_1[0][0] \n", 186 | "__________________________________________________________________________________________________\n", 187 | "bidirectional_2 (Bidirectional) (None, 60, 32) 2688 embedding_1[0][0] \n", 188 | "__________________________________________________________________________________________________\n", 189 | "dense_1 (Dense) (None, 32) 480 batch_normalization_4[0][0] \n", 190 | "__________________________________________________________________________________________________\n", 191 | "conv2d_3 (Conv2D) (None, 5, 5, 1) 19 dropout_2[0][0] \n", 192 | "__________________________________________________________________________________________________\n", 193 | "dropout_3 (Dropout) (None, 60, 16) 0 seq_self_attention_1[0][0] \n", 194 | "__________________________________________________________________________________________________\n", 195 | "seq_self_attention_2 (SeqSelfAt (None, 60, 32) 2113 bidirectional_2[0][0] \n", 196 | "__________________________________________________________________________________________________\n", 197 | "batch_normalization_5 (BatchNor (None, 32) 128 dense_1[0][0] \n", 198 | "__________________________________________________________________________________________________\n", 199 | "flatten_1 (Flatten) (None, 25) 0 conv2d_3[0][0] \n", 200 | "__________________________________________________________________________________________________\n", 201 | "lstm_2 (LSTM) (None, 64) 20736 dropout_3[0][0] \n", 202 | "__________________________________________________________________________________________________\n", 203 | "lstm_4 (LSTM) (None, 16) 3136 seq_self_attention_2[0][0] \n", 204 | "__________________________________________________________________________________________________\n", 205 | "dense_2 (Dense) (None, 16) 528 batch_normalization_5[0][0] \n", 206 | "__________________________________________________________________________________________________\n", 207 | "concatenate_1 (Concatenate) (None, 121) 0 flatten_1[0][0] \n", 208 | " lstm_2[0][0] \n", 209 | " lstm_4[0][0] \n", 210 | " dense_2[0][0] \n", 211 | "__________________________________________________________________________________________________\n", 212 | "dense_4 (Dense) (None, 64) 7808 concatenate_1[0][0] \n", 213 | "__________________________________________________________________________________________________\n", 214 | "batch_normalization_7 (BatchNor (None, 64) 256 dense_4[0][0] \n", 215 | "__________________________________________________________________________________________________\n", 216 | "dropout_5 (Dropout) (None, 64) 0 batch_normalization_7[0][0] \n", 217 | "__________________________________________________________________________________________________\n", 218 | "dense_5 (Dense) (None, 32) 2080 dropout_5[0][0] \n", 219 | "__________________________________________________________________________________________________\n", 220 | "output_layer (Dense) (None, 1) 33 dense_5[0][0] \n", 221 | "==================================================================================================\n", 222 | "Total params: 42,116\n", 223 | "Trainable params: 41,886\n", 224 | "Non-trainable params: 230\n", 225 | 
"__________________________________________________________________________________________________\n", 226 | "Train on 107944 samples, validate on 11994 samples\n", 227 | "Epoch 1/200\n", 228 | "107944/107944 [==============================] - 97s 895us/step - loss: 0.5765 - accuracy: 0.9062 - auc: 0.5053 - val_loss: 0.6163 - val_accuracy: 0.9318 - val_auc: 0.5114\n", 229 | "Epoch 2/200\n", 230 | "107944/107944 [==============================] - 93s 858us/step - loss: 0.5427 - accuracy: 0.9313 - auc: 0.5178 - val_loss: 0.5551 - val_accuracy: 0.9318 - val_auc: 0.5219\n", 231 | "Epoch 3/200\n", 232 | "107944/107944 [==============================] - 92s 852us/step - loss: 0.5377 - accuracy: 0.9320 - auc: 0.5247 - val_loss: 0.5335 - val_accuracy: 0.9318 - val_auc: 0.5278\n", 233 | "Epoch 4/200\n", 234 | "107944/107944 [==============================] - 91s 848us/step - loss: 0.5343 - accuracy: 0.9322 - auc: 0.5306 - val_loss: 0.5309 - val_accuracy: 0.9318 - val_auc: 0.5330\n", 235 | "Epoch 5/200\n", 236 | "107944/107944 [==============================] - 92s 851us/step - loss: 0.5323 - accuracy: 0.9324 - auc: 0.5354 - val_loss: 0.5303 - val_accuracy: 0.9318 - val_auc: 0.5376\n", 237 | "Epoch 6/200\n", 238 | "107944/107944 [==============================] - 92s 848us/step - loss: 0.5292 - accuracy: 0.9324 - auc: 0.5399 - val_loss: 0.5290 - val_accuracy: 0.9318 - val_auc: 0.5421\n", 239 | "Epoch 7/200\n", 240 | "107944/107944 [==============================] - 91s 846us/step - loss: 0.5291 - accuracy: 0.9326 - auc: 0.5440 - val_loss: 0.5288 - val_accuracy: 0.9318 - val_auc: 0.5456\n", 241 | "Epoch 8/200\n", 242 | "107944/107944 [==============================] - 91s 842us/step - loss: 0.5288 - accuracy: 0.9325 - auc: 0.5469 - val_loss: 0.5279 - val_accuracy: 0.9318 - val_auc: 0.5482\n", 243 | "Epoch 9/200\n", 244 | "107944/107944 [==============================] - 91s 847us/step - loss: 0.5280 - accuracy: 0.9327 - auc: 0.5491 - val_loss: 0.5285 - val_accuracy: 0.9318 - val_auc: 0.5504\n", 245 | "Epoch 10/200\n", 246 | "107944/107944 [==============================] - 91s 844us/step - loss: 0.5280 - accuracy: 0.9326 - auc: 0.5515 - val_loss: 0.5264 - val_accuracy: 0.9318 - val_auc: 0.5522\n", 247 | "Epoch 11/200\n", 248 | "107944/107944 [==============================] - 91s 841us/step - loss: 0.5258 - accuracy: 0.9327 - auc: 0.5532 - val_loss: 0.5281 - val_accuracy: 0.9318 - val_auc: 0.5544\n", 249 | "Epoch 12/200\n", 250 | "107944/107944 [==============================] - 91s 846us/step - loss: 0.5261 - accuracy: 0.9326 - auc: 0.5552 - val_loss: 0.5269 - val_accuracy: 0.9318 - val_auc: 0.5561\n", 251 | "Epoch 13/200\n", 252 | "107944/107944 [==============================] - 90s 838us/step - loss: 0.5250 - accuracy: 0.9326 - auc: 0.5569 - val_loss: 0.5276 - val_accuracy: 0.9318 - val_auc: 0.5578\n", 253 | "Epoch 14/200\n", 254 | "107944/107944 [==============================] - 90s 837us/step - loss: 0.5258 - accuracy: 0.9327 - auc: 0.5584 - val_loss: 0.5268 - val_accuracy: 0.9317 - val_auc: 0.5591\n", 255 | "Epoch 15/200\n", 256 | "107944/107944 [==============================] - 91s 841us/step - loss: 0.5254 - accuracy: 0.9327 - auc: 0.5598 - val_loss: 0.5278 - val_accuracy: 0.9317 - val_auc: 0.5603\n", 257 | "Epoch 16/200\n", 258 | "107944/107944 [==============================] - 91s 845us/step - loss: 0.5250 - accuracy: 0.9327 - auc: 0.5609 - val_loss: 0.5283 - val_accuracy: 0.9318 - val_auc: 0.5614\n", 259 | "Epoch 17/200\n", 260 | "107944/107944 [==============================] - 90s 
835us/step - loss: 0.5244 - accuracy: 0.9327 - auc: 0.5620 - val_loss: 0.5274 - val_accuracy: 0.9316 - val_auc: 0.5625\n", 261 | "Epoch 18/200\n", 262 | "107944/107944 [==============================] - 90s 835us/step - loss: 0.5239 - accuracy: 0.9327 - auc: 0.5631 - val_loss: 0.5285 - val_accuracy: 0.9315 - val_auc: 0.5636\n", 263 | "Epoch 19/200\n", 264 | "107944/107944 [==============================] - 91s 841us/step - loss: 0.5235 - accuracy: 0.9327 - auc: 0.5642 - val_loss: 0.5282 - val_accuracy: 0.9317 - val_auc: 0.5647\n", 265 | "Epoch 20/200\n", 266 | "107944/107944 [==============================] - 90s 835us/step - loss: 0.5229 - accuracy: 0.9327 - auc: 0.5652 - val_loss: 0.5272 - val_accuracy: 0.9316 - val_auc: 0.5658\n", 267 | "Epoch 00020: early stopping\n" 268 | ] 269 | }, 270 | { 271 | "data": { 272 | "text/plain": [ 273 | "" 274 | ] 275 | }, 276 | "execution_count": 7, 277 | "metadata": {}, 278 | "output_type": "execute_result" 279 | } 280 | ], 281 | "source": [ 282 | "# Input1: MTF along features, shape=batchsize*31*31*1\n", 283 | "input_fm=Input(shape=(31,31,1),name=\"cnn_input\")\n", 284 | "fm=BatchNormalization()(input_fm)\n", 285 | "cnn1=Conv2D(2, (3, 3), padding=\"same\",activation='relu')(fm)\n", 286 | "bn1=BatchNormalization()(cnn1)\n", 287 | "pool1=MaxPooling2D(pool_size=(3,3),strides=2)(bn1)\n", 288 | "pool1=Dropout(0.5)(pool1)\n", 289 | "cnn2=Conv2D(2, (3, 3), padding=\"same\",activation='relu')(pool1)\n", 290 | "bn2=BatchNormalization()(cnn2)\n", 291 | "pool2=AveragePooling2D(pool_size=(2,2),strides=2)(bn2)\n", 292 | "pool2=Dropout(0.5)(pool2)\n", 293 | "cnn3=Conv2D(1, (3, 3), padding=\"valid\",activation='relu')(pool2)\n", 294 | "fm_output=Flatten()(cnn3)\n", 295 | "\n", 296 | "\n", 297 | "# Input2: sequential features, shape= batchsize*60*4\n", 298 | "input_rnn_pid=Input(shape=(60,4),name=\"rnn_input_pid\")\n", 299 | "pid=Bidirectional(LSTM(8, activation='tanh', return_sequences=True))(input_rnn_pid)\n", 300 | "pid=SeqSelfAttention(attention_activation='tanh',attention_width=10)(pid)\n", 301 | "pid=Dropout(0.25)(pid)\n", 302 | "rnn_pid_output=LSTM(64, activation='tanh', return_sequences=False)(pid)\n", 303 | "#rnn_pid_output=Flatten()(pid)\n", 304 | "\n", 305 | "# Input2: sequential features,page type, shape= batchsize*60*1\n", 306 | "input_rnn_page=Input(shape=(60,),name=\"rnn_input_page\")\n", 307 | "page=Embedding(14,4,input_length=60)(input_rnn_page)\n", 308 | "page=Bidirectional(LSTM(16, activation='tanh', return_sequences=True))(page)\n", 309 | "page=SeqSelfAttention(attention_activation='tanh',attention_width=10)(page)\n", 310 | "rnn_page_output=LSTM(16, activation='tanh', return_sequences=False)(page)\n", 311 | "#rnn_page_output=Flatten()(page)\n", 312 | "\n", 313 | "\n", 314 | "# Input3: non-sequential features(user type and application time), shape=batchsize*14\n", 315 | "input_non_sequential=Input(shape=(14,),name=\"dense_input\")\n", 316 | "non_sequential=BatchNormalization()(input_non_sequential)\n", 317 | "non_sequential=Dense(32,activation=\"relu\")(non_sequential)\n", 318 | "non_sequential=BatchNormalization()(non_sequential)\n", 319 | "non_sequential_output=Dense(16,activation=\"relu\")(non_sequential)\n", 320 | "\n", 321 | "# Merge the above models\n", 322 | "merged = concatenate([fm_output,rnn_pid_output,rnn_page_output,non_sequential_output])\n", 323 | "combined=Dense(128,activation=\"relu\")(merged)\n", 324 | "combined=BatchNormalization()(combined)\n", 325 | "combined=Dropout(0.5)(combined)\n", 326 | 
"combined=Dense(64,activation=\"relu\")(merged)\n", 327 | "combined=BatchNormalization()(combined)\n", 328 | "combined=Dropout(0.2)(combined)\n", 329 | "combined=Dense(32,activation=\"relu\")(combined)\n", 330 | "out = Dense(1, activation='sigmoid', name='output_layer')(combined)\n", 331 | "\n", 332 | "model = Model(inputs=[input_fm,input_rnn_pid,input_rnn_page,input_non_sequential], outputs=[out])\n", 333 | "model.summary()\n", 334 | "ada=optimizers.Adam(learning_rate=0.0005)\n", 335 | "model.compile(loss='binary_crossentropy', optimizer=ada, metrics=['accuracy',metrics.AUC()]) #optimizer='rmsprop'\n", 336 | "from keras.callbacks import EarlyStopping\n", 337 | "early_stopping = EarlyStopping(monitor='val_loss', patience=10, verbose=2)\n", 338 | "\n", 339 | "model.fit([X_train_arr,X_train_time_pid,X_train_page,X_train_f2], [y_train],\n", 340 | " epochs=200, batch_size=1024,\n", 341 | " class_weight={0 : 1., 1: 3},\n", 342 | " shuffle=True,\n", 343 | " validation_split=0.1,\n", 344 | " callbacks=[early_stopping])" 345 | ] 346 | }, 347 | { 348 | "cell_type": "code", 349 | "execution_count": 8, 350 | "metadata": {}, 351 | "outputs": [], 352 | "source": [ 353 | "#model.save(\"multi_source_classifier_ks_point.h5\")" 354 | ] 355 | }, 356 | { 357 | "cell_type": "markdown", 358 | "metadata": {}, 359 | "source": [ 360 | "## Model evaluation" 361 | ] 362 | }, 363 | { 364 | "cell_type": "code", 365 | "execution_count": 10, 366 | "metadata": { 367 | "scrolled": true 368 | }, 369 | "outputs": [ 370 | { 371 | "data": { 372 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYoAAAEWCAYAAAB42tAoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nOzdd1hUx9fA8e+IvWEvsWKv2FBjokaj/jTGxJg3xZhioSkWLNiNLfbeKxhjYu8aG/Zu7CJiQbAANuwi0uf9Y5EQRVyVZRc5n+fhCbfO2cXs2Zl77xmltUYIIYR4lTTmDkAIIYRlk0QhhBAiUZIohBBCJEoShRBCiERJohBCCJEoSRRCCCESJYlCCDNQSmmlVClzxyGEMSRRCIuilGqnlDqrlApVSt1SSs1WSuV4g+OvKqUamzJGc4p9f6KVUiFKqcdKqTNKqRYv7JNBKTVaKXVdKfVMKeWrlOqtlFIv7NdUKbVPKfVEKRWslNqrlPoyeV+RSAkkUQiLoZTqBYwFegPWwIdAMWC7Uiq9OWOzMIe11lmBHMAsYNkLyXQl0AhoDmQDfgacgKnPd1BKfRO73yKgMJAfGAx8kRwvQKQwWmv5kR+z/wDZgRDguxfWZwXuAB1ilxcCI+JtbwAExv7+JxADPIs9V58E2skD/A08BO4D+4E0sdv6AX7AE8AHaBXvuHbAQWBy7LH+wEex6wNiY2wbb/+FwBxge+z59gLF4m3XQKnY3zMAE4DrwO3Y4zK94n1qBxyIt5w59lw1Y5cbAWFAkReOqw1EA6UAFdtWb3P/3eUnZfxIj0JYio+AjMCa+Cu11iHAFqDJ606gtf4ZwwfgF1rrrFrrcQns1gsIBPJi+BY9AMMHLRiSRD0MvZlhwF9KqYLxjq0NeAG5gSXAMqAmhg/fn4AZSqms8fb/EfgNQ3I6DSx+RehjgTJA1dhzFcLw7T5RSikroD0QCVyLXd0E+EdrHRB/X631P7GvuxFQFigCrHpdG0KADD0Jy5EHuKu1jkpg283Y7UkhEiiI4dt9pNZ6v9ba8BVf65Va6xta6xit9XLAF6gV79grWuvftdbRwHIMH7bDtdbhWmtPIALDB/1zm7TW+7TW4cBAoI5Sqkj8YGKvGzgCPbTW97XWT4BRQOtEXsOHSqmHGHoOE4CftNZ3YrflwfB+JeT5+5g73rIQryWJQliKu0AepVTaBLYVjN2eFMYDlwFPpZS/Uqrf8w1KqV+UUqeVUg9jP4gr8d8EdTve788AtNYvrovfo4j7Vh/bM7oPfPBCPHkxDB+diNfu1tj1r3JEa50DyAlswNALeu4uhvcrIc/fx3vxloV4LUkUwlIcBsKBr+OvVEplAT4Ddsaueorhg/W5Ai+cJ9FyyFrrJ1rrXlrrEhgu3PZUSjVSShUD5gNdgNyxH8TeGMbz31Zc7yF2SCoXcOOFfe5iSDAVtdY5Yn+steFidaJik48L8LNSqlrs6h1A7QR6LrVi49kFXMSQxP7v7V6WSG0kUQiLoLV+hOG6wHSlVDOlVDqlVHEMd+YEYrhQDYax/uZKqVxKqQJA9xdOdRso8ap2lFItlFKlYod8HmO4wBsNZMGQZIJj92uPoUfxLporperG3rH1GwlfO4jBkKAmK6XyxbZdSCnV1JgGtNb3AHdir2lorXdgSKqrlVIVlVJWSqkPMVwfma219o0dausJ/KqUaq+Uyq6UShMb67x3fM3iPSSJQliM2IvPAzCMuz8G/sHwzbdR7Dg/GBLGGeAq4InhWkF8o4FBscM4bgk0UxrDt+4QDL2YWVrrPVprH2Bi7LrbQGUMdzm9iyXAEAxDTjUwXNxOSF8Mw2FHlFKPY+Mr+wbtTMGQlGxjl/8P2I1hCCsE+AvwALo+P0BrvQr4HuiAoZdzGxgBrH+DdkUqoWKv4wkhkpBSaiGG23YHmTsWId6V9CiEEEIkymSJQim1QCl1Rynl/YrtSik1TSl1WSnlpZSqbqpYhBBC
vD2TDT0ppepjGB9dpLV+6aKgUqo5hjHT5hgeZJqqta5tkmCEEEK8NZP1KLTW+zBcxHuVlhiSiNZaHwFyvPAUrBBCCAuQ0MNNyaUQ8R5IwnALZCESeFpUKeWEoagZWbJkqVGuXLlkCVAIIVKqqBhN8KNQ7twKIir0CRgqHyT2IOcrmTNRJPQgU4LjYFrrecA8ADs7O338+HFTxiWEECmWd9Aj5uz1Y/myZdy/MBcd/owPv+nIkVVzrr3+6ISZM1EEEu/JVQyljl98alUIIYQR/INDmOB5kQ0HvbnvOZNnfscoU6kaq5cuolKlSig1563Pbc5EsQHoopRahuFi9iOttRQpE0KIN3D17lO2+9xmxKZzhJzx5OGeBaRTmkmTJtGtWzesrKzeuQ2TJQql1FIMcwXkUUoFYnhCNR2A1noOsBnDHU+XgVAM5ZKFEEIY4dT1B7itPINf8FMiH9zg3tbphF8/S4OGDXGfP5+SJUsmWVsmSxRa6x9es10DnU3VvhBCvI/uPAnDdelpDvvfQ8dEY+23nUub3cmQPj0z5s/H3t6eF2a9fWfmHHoSQghhhL2XgvnH/x6z9vjFrcvy9AZWB+Zw9vRJvvzyS2bNmkWhQoVM0r4kCiGEsFARUTH0Xe3F2lNBAGTPmJaC2dJiffFv1v4+g5w5c7Js2TK+++67JO9FxCeJQgghLIzWmpGbzuN+4AoAJfNmYeaP1Xl07Tz29vZ4njvHTz/9xOTJk8mTJ6kmf3w1SRRCCGFmkdEx/ON/n5PXH3Dg8l2OXvm3qEWraoUY0aI0v/76K1OmTKFQoUL8/ffffP7558kWnyQKIYQwA601G87cYMk/1/nnyn+rHRXOmYnG5fPTv3k5Du7bi62tLf7+/nTq1IkxY8aQPXv2ZI1VEoUQQiSzy3dC+H7uYe49jQAMQ0tfVy/MJ2XyUipfVjKms+Lhw4d06dQRd3d3SpcuzZ49e/jkk0/MEq8kCiGESCZBD58xaO1Zdl8MBqDiB9lZ2L4WebNl+M9+69evp1OnTty+fZs+ffowdOhQMmXKZI6QAUkUQghhchdvPaHr0pNcuh0St27OT9VpVum/BbPv3LlDt27dWL58Oba2tmzYsAE7O7vkDvclkiiEEMJE/vG/R+clJ7kbYhhiqlY0B871S1K/TB4yp//341drzeLFi3F1dSUkJITffvuNvn37ki5dOnOF/h+SKIQQIon943+PIRvOceHWEwDK5M/K7J9qUDJv1pf2DQgIoGPHjmzevJkPP/wQDw8PKlSokNwhJ0oShRBCJIE7T8JYczKIOXv9eBgaCRh6EIM+L0+NYrle2j8mJoa5c+fSt29foqOjmTJlCl26dEmSIn5JTRKFEEK8pZgYzaqTgUzZfokbj8Li1pfJn5Xf29eiUI6EL0BfunQJR0dH9u3bR+PGjZk3bx42NjbJFfYbk0QhhBBvQGvN3H3+zNp9mcdhUXHrKxTMTq//laFh2XykSZNwOY2oqCgmTZrEkCFDyJgxIwsWLKBdu3YmLb+RFCRRCCGEkRYevMLQjT5xyy2rfkC5Atn5vmYRcmVJn+ixZ86coUOHDpw8eZJWrVoxc+ZMChYsmOgxlkIShRBCvMa9kHCc/jzBiWsPAPihVhF+bVHhP3cuvUp4eDgjRoxgzJgx5MqVi5UrV/J///d/Ft+LiE8ShRBCvMK2c7cYvtGHoIfPAKhRLCfjvrFN8O6lhBw+fBh7e3vOnz/PL7/8wqRJk8idO7cpQzYJSRRCCBErJkZzyO8efxy+yp6Ld4iM1gDUtslFx09K0rBcPqPOExISwqBBg5g2bRpFihRhy5YtNGvWzISRm5YkCiFEqhcToxn+tw8rjgcQGhEdt75B2byM+8aWfNkyGn2u7du34+TkxNWrV+nSpQujRo0iW7Zspgg72UiiEEKkWk/Do3Dff4XJOy7FrevwsQ3tPy5OkVyZ3+hcDx48oFevXvz++++ULVuW/fv3U7du3aQO2SwkUQghUhX/4BBGb7nAEb97PAn/9/bW9h8XZ3CLCm91kXnt2rW4uLgQHBxM//79GTx4MBkzGt8LsXSSKIQQqUJEVAyz9lxmyg7fuHX/q5CfJhXy81nlgmTN8OYfh7du3aJr166sWrWKqlWrsmnTJqpXr56UYVsESRRCiPfalrM3mbT9Er53/q3cur7zx9gWtn7rW1S11vz55590796d0NBQRo0ahZubm8UU8UtqkiiEEO8drTWbz95i5CafuNIaOTKno/1HNjjUsyHLW/Qenrt27RrOzs5s27aNjz76CA8PD8qVK5dUoVskSRRCiPdGZHQMg9d743nudtzscXbFcuLRtibWmd/t235MTAyzZs2iX79+AEyfPh0XFxfSpEnzznFbOkkUQoj3wrCN5/j94NW45XYfFce1UWlyvqa0hjEuXryIvb09Bw8epGnTpsydO5dixYq983lTCkkUQogUKzpGc/TKfRz+OMbTiGjSWSkGf1GRn2oXTZISGZGRkUyYMIFhw4aROXNmFi5cyC+//JKiym8kBUkUQogUaZPXTTovORm3/Hnlgkz+virp0ybNUNCpU6ewt7fn1KlTfPPNN0yfPp0CBQokyblTGkkUQogUZd2pIEZtPs+dJ+EAONUvwRe2H1C5sHWSnD8sLIzhw4czbtw48uTJw+rVq/n666+T5NwplSQKIUSKsOpEIKM3n4+7SF0qX1Y82tpRLHeWJGvjwIEDODg4cPHiRdq3b8/EiRPJmTNnkp0/pZJEIYSwSJfvhLDz/G02e9/i6t2nPHpmmF7088oFGf+trVElvo315MkT+vfvz8yZMylevDienp40adIkyc6f0kmiEEJYlAdPI+i96gw7zt+JW/eBdUYalcvHry0qJMldTPFt27YNJycnAgIC6NatGyNHjiRrVuPKiKcWkiiEEBbhcVgkA9d6s8nrBjEaiuTKxMivKlOzeC4ypbdK8vbu379Pjx49WLRoEeXKlePAgQN89NFHSd7O+0AShRDCIjgvOsFh/3sATPquCl9XL2yytlatWkXnzp25f/8+AwcOZNCgQe9VEb+kJolCCGFWd0PC+XH+P1y8/YTyBbOzxbWeydq6efMmXbp0Yc2aNVSvXp1t27ZRtWpVk7X3vpBEIYQwi0fPIllw4ApTdxqqudYsnhOPdjVN0pbWmoULF9KzZ0/CwsIYO3YsPXv2JG1a+Qg0hknfJaVUM2AqYAW4a63HvLC9KPAHkCN2n35a682mjEkIYV6PwyLpt9qLzWdvAZApnRU9mpTGqX5Jk7R35coVnJyc2LFjB/Xq1cPd3Z0yZcqYpK33lckShVLKCpgJNAECgWNKqQ1aa594uw0CVmitZyulKgCbgeKmikkIYT5aa6bvusyk7f/OJjf5+yq0sP2AdFZJX1gvOjqamTNn0r9/f9KkScOsWbNwdnZOFUX8kpopexS1gMtaa38ApdQyoCUQP1FoIHvs79bADRPGI4Qwg+An4Uzb6cufR67FrevbrBzO9UuQJo1paiadP38ee3t7Dh8+zGeffcacOXMoWrSoSdpKDUyZKAoBAfGWA4H
aL+wzFPBUSnUFsgCNEzqRUsoJcALkjy1ECnAvJJxLt0NYdPgqW7xvxa3/zq4wA5tXeOeS368SGRnJuHHjGD58OFmzZuXPP//kxx9/THVF/JKaKRNFQn8Z/cLyD8BCrfVEpVQd4E+lVCWtdcx/DtJ6HjAPwM7O7sVzCCEshF9wCJ0Xn+TCrSdx64rkykS3T0vzVbVCJhlieu7EiRN06NABLy8vvvvuO6ZPn06+fPlM1l5qYspEEQgUibdcmJeHluyBZgBa68NKqYxAHuAOQogU41lENJO2X2T+/isA5M+ega6flqZ60ZyUK5DNZENMAM+ePWPo0KFMnDiRfPnysXbtWr766iuTtZcamTJRHANKK6VsgCCgNdDmhX2uA42AhUqp8kBGINiEMQkhklBMjOaLGQc4d+Nx3Lrf29ekYdnk+Sa/b98+HBwc8PX1xcHBgfHjx5MjR45kaTs1MVmi0FpHKaW6ANsw3Pq6QGt9Tik1HDiutd4A9ALmK6V6YBiWaqe1lqElISyc1pr5+/0ZtflC3LqBzcvzc51iZEyX9OU2XvT48WP69evH7NmzsbGxYceOHTRq1Mjk7aZWJn2OIvaZiM0vrBsc73cf4GNTxiCESFr3QsLpt+Ys231uA9DCtiDTf6iWbBeMN2/eTMeOHQkMDKRHjx789ttvZMmSdKXGxcvksUQhhFGehkfRYvoBrtx9CkDZ/NlY1/ljkxTsS8jdu3fp0aMHf/31FxUqVODQoUN8+OGHydJ2aieJQgjxWhvO3KDb0lMApLNSjP+mCs0qFUiWYSatNStXrqRLly48ePCAwYMHM2DAADJkyGDytoWBJAohRIKiYzSbzt5k/j5/zgY9AqBnkzJ0a1Q62WK4ceMGLi4urF+/Hjs7O3bs2IGtrW2ytS8MJFEIIV6y1fsmA9Z6cz922tEWtgXp3rg0pfJlS5b2tdZ4eHjg5uZGeHg4EyZMwNXVVYr4mYm860IIwPDh/M+V+wxYcxb/2OsQLat+wK8tKpAna/IN8/j7++Po6MiuXbv45JNPcHd3p1SpUsnWvniZJAohBDExml4rz7D2VBAAFQpmx6OdHQWtMyVbDNHR0UybNo2BAweSNm1a5s6di4ODgxTxswCSKIRIpbTW7Dh/h4meF+NKbpTIm4V5P9tRKl/yzhl97tw57O3t+eeff/j888+ZM2cOhQubboY78WYkUQiRymit8ThwhRGbzsety5M1Pa6Ny/BtjcLJcifTcxEREYwZM4YRI0ZgbW3NkiVLaN26tRTxszCSKIRIBcIio9nqfYsNZ26w68K/pdT+VyE/w1pWTNYhpueOHTtGhw4d8Pb2pk2bNkyZMoW8efMmexzi9SRRCPEe01qz4OBVfvv732lgqhTJwcclc9OxQUmyZzRNue/EhIaGMnjwYCZPnkzBggXZsGEDX3zxRbLHIYwniUKI95DWmh7LT7Pu9L8Fm3s3Lcs3NQqTP3tGs8W1Z88eHBwc8PPzw9nZmbFjx2JtbW22eIRxJFEI8Z6Jio6h/rjd3HgUBkD7j4vjVL+EWYaXnnv06BF9+vRh3rx5lCxZkl27dtGwYUOzxSPejCQKId4j7vv9Gbn5PFpDs4oFmPVjdZPOBWGMv//+m44dO3Lz5k3c3NwYNmwYmTNnNmtM4s1IohAihbty9ynHrt6nzyqvuHXdGpWmR+PSZr17KDg4GFdXV5YuXUqlSpVYs2YNtWrVMls84u1JohAihQoJj+KHeUfi6jAB2Ba2ZlGHWuTInN5scWmtWbp0Kd26dePx48cMGzaMfv36kT69+WIS78aoRKGUSg8U1VpfNnE8QojXeBQayeQdl1h46CoA1YvmoMunpbArnsssdzHFFxgYSKdOnfj777+pXbs2Hh4eVKxY0awxiXf32kShlPocmASkB2yUUlWBIVrrVqYOTghhoLXmb6+bzNrjx/mb/0472r1xabo3LmPGyAxiYmKYP38+vXv3JioqikmTJtGtWzesrJLv4T1hOsb0KIYDtYHdAFrr00opqdAlRDI5dPkuXZaeiqvkWtA6I50blqJNraJmv1ANcPnyZRwdHdmzZw+ffvop8+fPp0SJEuYOSyQhYxJFpNb64QsXxWReayGSwUTPi0zfZRjxrVLYmt/b1yJXFssY64+KimLKlCn8+uuvpE+fnvnz52Nvby/lN95DxiSK80qp74A0SikbwBU4YtqwhEi9IqNjcP7zRFypDaVguVMdatnkMnNk/zp79iz29vYcO3aML7/8klmzZlGoUCFzhyVMxJhE0QUYDMQAa4BtQH9TBiVEanXncRgNJuwhNCKagtYZ+enDYtjXtUnWQn2JCQ8PZ9SoUYwaNYqcOXOyfPlyvv32W+lFvOeMSRRNtdZ9gb7PVyilvsaQNIQQ7ygqOoazQY8YuuEcZwINt7p+VqkAM9pUx8oCrkE8d+TIEezt7fHx8eGnn35iypQp5M6d29xhiWRgTKIYxMtJYWAC64QQb+Bs4CP6r/XCO+jfu5hK58vKgOblaVgunxkj+6+nT5/y66+/MmXKFAoVKsSmTZto3ry5ucMSyeiViUIp1RRoBhRSSk2Ktyk7hmEoIcRbGrf1ArP2+AGG2eTqlcnDF7YfUKmQZRXI27lzJ46Ojly5coVOnToxZswYsmfPbu6wRDJLrEdxB/AGwoBz8dY/AfqZMigh3lcnrt1n6s7L7LsUTJb0Vni0q8mHJSxv+Obhw4f07t0bd3d3Spcuzd69e6lfv765wxJm8spEobU+BZxSSi3WWoclY0xCvHcO+91jyAZvLt0OAeCTMnmZ+WN1smawvCo669evp1OnTty+fZs+ffowdOhQMmUyX+VZYX7G/CstpJQaCVQA4grZa63N/zioECnA7wevMGyjYeKgxuXz0++zspTKl83MUb3s9u3bdOvWjRUrVmBra8uGDRuws7Mzd1jCAhiTKBYCI4AJwGdAe+QahRCvFR2jaff7Ufb73iVbxrSs7FiHcgUsb3xfa83ixYtxdXUlJCSEESNG0KdPH9KlM2/dKGE5jEkUmbXW25RSE7TWfsAgpdR+UwcmREoVERVD/zVnWX0yEDBUdF3i+KFFDjNdv36djh07smXLFurUqYOHhwfly5c3d1jCwhjzLzdcGZ6m8VNKdQSCAMu5d08IC+IXHML/Ju8jOkaTOb0V39kVYeiXllc9NSYmhrlz59KnTx9iYmKYOnUqnTt3liJ+IkHGJIoeQFagGzASsAY6mDIoIVKii7ee0HTKPsDQi9jQpa6ZI0rYpUuXcHBwYP/+/TRu3Jh58+ZhY2Nj7rCEBXttotBa/xP76xPgZwClVGFTBiVESuNz4zHNpxlGZGf9WJ3mlQuaOaKXPS//PWTIEDJmzMiCBQto166dlN8Qr5VoolBK1QQKAQe01neVUhUxlPL4FJBkIVK9sMho5uz1Y8oOXwDGf2NrkUnizJkzdOjQgZMnT9KqVStmzpxJwYKWF6ewTGletUEpNRpYDPwIbFVKDcQwJ8UZQG6NFQL4fu7huCQxsHl5vrUrYuaI/issLIxBgwZhZ2dHUFAQq1atYs2aNZ
IkxBtJrEfREqiitX6mlMoF3IhdvmjsyZVSzYCpgBXgrrUek8A+3wFDMcxxcUZr3eYN4hfCLGJiNP3WeHEm8BF5sqbn2MDGFjeEc+jQIezt7blw4QJt27Zl0qRJ5MplOaXKRcrxyh4FEKa1fgagtb4PXHjDJGEFzMTw7EUF4AelVIUX9imNoWT5x1rrikD3N4xfCLOYvusyK44bbn9d6/KxRSWJkJAQXF1dqVu3LqGhoWzdupWFCxdKkhBvLbEeRQml1PMKsQooHm8ZrfXXrzl3LeCy1tofQCm1DEMvxSfePo7ATK31g9hz3nnD+IVIdhvP3GDyjktkzZCWg30/xTqz5TyY5unpiZOTE9euXaNLly6MGjWKbNks7ylwkbIklij+74XlGW947kJAQLzlQAxzb8dXBkApdRDD8NRQrfXWF0+klHICnACKFi36hmEIkXR6rzzDyhOGnsRfDrUtJkk8ePCAnj17snDhQsqWLcv+/fupW9cyb88VKU9iRQF3vuO5E+qLvzjXdlqgNNAAw11U+5VSlbTWD1+IZR4wD8DOzk7m6xbJ7tjV+3T88wT3nkYAsLPXJ5TMm9XMURmsWbOGzp07ExwcTP/+/Rk8eDAZM2Z8/YFCGMmUNQUCgfi3gBTGcEH8xX2OaK0jgStKqYsYEscxE8YlhFEiomLYeOYGo7ec526IIUHULZUHj3Z2ZEhr/ieYb926RZcuXVi9ejVVq1Zl8+bNVKtWzdxhifeQKRPFMaC0UsoGQ9mP1sCLdzStA34AFiql8mAYivI3YUxCvNaDpxFM2n6JP49ci1v3vwr56dPMMqq+aq1ZtGgRPXr0IDQ0lFGjRuHm5iZF/ITJGJ0olFIZtNbhxu6vtY5SSnUBtmG4/rBAa31OKTUcOK613hC77X9KKR8gGuittb73Zi9BiKShtWbR4WsM2WCYpyt7xrT8+GExvrcrQvE8WcwcncG1a9dwdnZm27ZtfPzxx7i7u1OuXDlzhyXec0rrxIf8lVK1AA/AWmtdVClVBXDQWndNjgBfZGdnp48fP26OpsV7LCQ8inpjd/EgNBKAMV9X5vuaRSzmtteYmBhmzZpFv36GySXHjBmDi4sLadIkdoe7EP9SSp3QWr/VBCPG9CimAS0wDBOhtT6jlGr4No0JYWmWH7vOqesPWXbMcIOeTZ4srOxYhzxZM5g5sn9duHABBwcHDh48SNOmTZk7dy7FihUzd1giFTEmUaTRWl974ZtVtIniESJZBNwP5Uf3f7h+PxSAbBnS8knZvMxoU93Mkf0rMjKS8ePHM2zYMLJkycIff/zBzz//bDG9HJF6GJMoAmKHn3Ts09ZdgUumDUsI0zjge5cVxwPYcMZwA16+bBk40PdT0qe1rCGcU6dO0aFDB06fPs0333zDjBkzyJ8/v7nDEqmUMYmiE4bhp6LAbWBH7DohUozwqGi6LT3FtnO3AfiwRC5+qVPc4iq9hoWFMWzYMMaPH0/evHlZvXo1X3/9uiIIQpiWMYkiSmvd2uSRCGEiNx4+o9mUfTwOi6JOidxMbV2VfNkt74G0AwcOYG9vz6VLl2jfvj0TJ04kZ86c5g5LiESLAj53TCm1WSnVVill/pvIhXgDHgeu8NGYXTwOi+LjUrlZ4ljb4pLEkydP6NKlC/Xq1SMiIgJPT08WLFggSUJYDGNmuCuplPoIwwNzw5RSp4FlWutlJo9OiLekteaXBUfZ73sXgD861KJ+6TwWdyF469atODs7ExAQgKurKyNGjCBrVssoDSLEc0ZdwdNaH9JadwOqA48xTGgkhMXqtuw0+33vUiZ/Vvb3acgnZfJaVJK4d+8ebdu25bPPPiNLliwcPHiQKVOmSJIQFum1PQqlVFYM5cFbA+WB9cBHJo5LiLdy50kYTSfv40FoJOnTpmFb9/oWlSC01qxevZrOnTtz//59Bg0axKBBg8iQwXKe2xDiRcZczPYGNgLjtNb7TRyPEG/lUWgky45dZ8oOX55FRlOvdB4WtPMEZLsAACAASURBVKtpUUni5s2bdO7cmbVr11KjRg08PT2pUqWKucMS4rWMSRQltNYxJo9EiLfwNDyK4Rt9WH7c8GR11gxp+b19TRqWzWfmyP6ltWbhwoX07NmTsLAwxo4dS8+ePUmb1pQ1OYVIOq/8l6qUmqi17gWsVkq9VBDKiBnuhDCpo1fu86P7ESKjDf88ezQug0vDkqSzspyH565cuYKTkxM7duygXr16uLu7U6ZMGXOHJcQbSewrzfLY/77pzHZCmJTWmgFrz7L0qKEX8UOtooxqVcmihpmio6OZMWMGAwYMwMrKilmzZuHs7CxF/ESKlNgMd0djfy2vtf5PsogtH/6uM+AJ8cbCo6L5v9mH8A56DMC27vUpW8CyHu/x8fHBwcGBw4cP89lnnzF37lyKFCny+gOFsFDGfL3pkMA6+6QORIjX2XXhNmUHbcU76DGZ01vhN6q5RSWJyMhIRowYQbVq1bh06RJ//fUXmzZtkiQhUrzErlF8j+GWWBul1Jp4m7IBDxM+Soikd+Laff5v9uG45ZZVP2DK91UtaqjpxIkTdOjQAS8vL77//numTZtGvnyWc0FdiHeR2DWKo8A9DHNdz4y3/glwypRBCQGGaxEjNp3H48AVAIrkysSCtjUpnd9yehHPnj1j6NChTJgwgfz587Nu3Tpatmxp7rCESFKJXaO4AlzBUC1WiGTlFxzCVzMP8iQsiszprVjX+WPKWFCCANi7dy+Ojo74+vri4ODA+PHjyZEjh7nDEiLJJTb0tFdr/YlS6gEQ//ZYBWitdS6TRydSnZXHA5joeYlbj8MAsK9rg2vj0mTPmM7Mkf3r8ePH9O3blzlz5lCiRAl27NhBo0aNzB2WECaT2NDT8+lO8yRHICJ1Ox3wkOk7fdl54Q4ATSrkx76uDR+WyG3myP5r8+bNODs7c+PGDXr27Mnw4cPJkiWLucMSwqQSG3p6/jR2EeCG1jpCKVUXsAX+wlAcUIh38ueRa/y6zjtuuXzB7Cx2qE2uLOnNGNXL7t69S/fu3Vm8eDEVKlRg1apV1K5d29xhCZEsjKkhsA6oqZQqCSwCNgFLgBamDEy8/1YeD4hLEh0/KUmraoUs6nZXMFxQX7FiBV27duXBgwcMGTKE/v37SxE/kaoYkyhitNaRSqmvgSla62lKKbnrSby1qOgYpuzwZcbuywD83bUulQpZmzmqlwUFBeHi4sKGDRuws7Nj586dVK5c2dxhCZHsjJoKVSn1LfAz8FXsOsu5sihSjGcR0fRedYa/vW4CkCFtGvb3bUi+bJY145zWGnd3d9zc3IiIiGDChAm4urpKET+RahnzL78D4IKhzLi/UsoGWGrasMT7Zv3pIFyXnQYgZ+Z0/FCrKO0+Km5xScLPzw9HR0d2795NgwYNmD9/PqVKlTJ3WEKYlTFToXorpboBpZRS5YDLWuuRpg9NvC8mbb/EtJ2+ALg0KEmfZuXMHNHLoqOjmTp1KoMGDSJdunTMnTsXBwcHKeInBMbNcFcP+BMIwvAMRQGl1M9a64OmDk6kXDExmt0X7
zB3rz9Hr94H4HD/TyloncnMkb3M29sbe3t7jh49SosWLZg9ezaFCxc2d1hCWAxjhp4mA8211j4ASqnyGBKHnSkDEynbb5t8+P3gVQDql8nLxG+rkDebZd0pFBERwejRoxk5ciTW1tYsWbKE1q1bW1QNKSEsgTGJIv3zJAGgtT6vlLKsm9yFxbjx8BkNxu8hIjqG9GnTsLPnJxTJldncYb3k6NGj2Nvb4+3tTZs2bZgyZQp58+Y1d1hCWCRjEsVJpdRcDL0IgB+RooDiBQH3Q+mzyovD/vcAyJM1PTt7NcA6k2XdIBcaGsrgwYOZPHkyBQsWZMOGDXzxxRfmDksIi2ZMougIdAP6YLhGsQ+YbsqgRMqy6PBVBq8/B0CNYjnp1qg0n5SxvG/nu3fvxsHBAX9/f5ydnRk7dizW1pb3/IYQlibRRKGUqgyUBNZqrcclT0gipTh34xGz9/jFPRcx9+caNK1YwMxRvezRo0f06dOHefPmUbJkybhbX4UQxkmseuwADDPZncRQwmO41npBskUmLNphv3v8MP8IYLhYPejz8hZXBhxg48aNdOzYkVu3buHm5sawYcPInNnyrpkIYckS61H8CNhqrZ8qpfICmwFJFKlc4INQ3PdfYeGhqwCsdfmIakVzmjeoBAQHB+Pq6srSpUupXLky69ato2bNmuYOS4gUKbFEEa61fgqgtQ5WSsmTR6nciWsP+L/ZhwDInN6Knk3KWFyS0FqzdOlSunXrxuPHjxk2bBj9+vUjfXq5UU+It5VYoigRb65sBZSMP3e21vrr151cKdUMmApYAe5a6zGv2O8bYCVQU2t93NjgRfLxOHCF3/423CU98dsqtKpWiDRpLOt5g4CAADp16sSmTZuoXbs2Hh4eVKxY0dxhCZHiJZYo/u+F5RlvcmKllBWGubabAIHAMaXUhvjPZMTulw3DXVX/vMn5RfKIjI7BcdFx9lwMJnN6K5Y6fkiVIpY13WdMTAzz58+nd+/eREdHM3nyZLp27YqVlZW5QxPivZDYxEU73/HctTDUhfIHUEotA1oCPi/s9xswDnB7x/ZEEos/1FSlSA4Wdahlcc9F+Pr64ujoyN69e2nUqBHz5s2jRIkS5g5LiPeKKa87FAIC4i0Hxq6Lo5SqBhTRWv+d2ImUUk5KqeNKqePBwcFJH6l4yYxdvnFJwqGuDetcPrKoJBEVFcWECROwtbXl9OnTuLu7s337dkkSQpiAKQvsJzSAreM2Gi6OTwbave5EWut5wDwAOzs7/ZrdxTu4fi+UMVvPs/nsLQB2uzXAJo9lzQnt5eWFvb09x48fp2XLlsyaNYsPPvjA3GEJ8d4yOlEopTJorcPf4NyBGObbfq4wcCPecjagErAntghbAWCDUupLuaCd/LwCHzJ8ow/Hrz2IW/enfS2LShLh4eGMHDmS0aNHkzNnTpYvX863334rRfyEMDFjyozXAjwAa6CoUqoK4KC17vqaQ48BpWMnOgoCWgNtnm/UWj8C8sRrZw/gJkki+c3cfZnx2y4C0KRCftrUKkqDsnkt6gP4yJEj2Nvb4+Pjw08//cSUKVPInTu3ucMSIlUwpkcxDWgBrAPQWp9RSjV83UFa6yilVBdgG4bbYxdorc8ppYYDx7XWG94hbvGOvIMecfTKfVadCMTn5mMAJn1Xha+rW9Y8DE+fPmXQoEFMnTqVwoULs2nTJpo3b27usIRIVYxJFGm01tde+HYZbczJtdabMTzRHX/d4Ffs28CYc4p3o7VmzJYLzN3nH7cuZ+Z0LHeuY3ElOHbu3ImjoyNXrlzBxcWF0aNHkz17dnOHJUSqY0yiCIgdftKxz0Z0BS6ZNixhCuduPOLzaQfiluf/Ysen5fJhZWEPzj18+BA3Nzc8PDwoXbo0e/fupX79+uYOS4hUy5hE0QnD8FNR4DawI3adSEGu3n3KlzMMs9e2rPoBI1tVJmsGU9709nbWrVuHi4sLd+7coW/fvgwZMoRMmSxv+lQhUpPXflJore9guBAtUqg5e/2YvP0S0TGaBe3s+LRcfnOH9JLbt2/TtWtXVq5cSZUqVdi4cSM1atQwd1hCCIy762k+8Z5/eE5r7WSSiESSeBYRzZpTgaw+EcjJ6w8B+KNDLYubUEhrzV9//UX37t0JCQlhxIgR9OnTh3TpLOfhPiFSO2PGHnbE+z0j0Ir/PnEtLMxfR64xaJ133HKraoUY/XVlMqazrNpH169fp2PHjmzZsoU6derg4eFB+fLlzR2WEOIFxgw9LY+/rJT6E9husojEWwuLjGbA2rOsORlE+rRp6NqwFPb1bMic3rKuRcTExDBnzhz69u2L1ppp06bh4uIiRfyEsFBv8wliAxRL6kDEuxmw9ixL/rkOQJb0VmztXp8iuSxvJrdLly7h4ODA/v37adKkCfPmzaN48eLmDksIkQhjrlE84N9rFGmA+0A/UwYljBcaEUWfVV5x81b3+6wczvVLWNRT1WAo4jdx4sS4u5h+//132rZta3FxCiFelmiiUIb/i6tgKMEBEKO1lqJ8FmLl8QB6r/ICwDpTOrb3rE++bBnNHNXLTp8+jb29PSdPnqRVq1bMnDmTggULmjssIYSREi0zHpsU1mqto2N/JElYiPWng+KSxKDPy3NmyP8sLkmEhYUxcOBA7OzsCAoKYtWqVaxZs0aShBApjDHXKI4qpaprrU+aPBrxWnsvBTNw7VkCHzwDLPOWV4BDhw5hb2/PhQsXaNu2LZMmTSJXrlzmDksI8RZemSiUUmm11lFAXcBRKeUHPMUwz4TWWldPphhFrFl7LjNuq6HK61dVP2DA5+UtrhcREhLCgAEDmDFjBkWKFGHr1q00bdrU3GEJId5BYj2Ko0B14KtkikUk4rDfPcZtvUj+7BkY+HkFvqxieRP1eHp64uTkxPXr1+ncuTOjRo0iWzbLKjQohHhziSUKBaC19kumWMQrnLr+gB/mHwFgQ5e65M9uWb2I+/fv06tXLxYuXEjZsmXZt28fdevWNXdYQogkkliiyKuU6vmqjVrrSSaIR7xg89mbuCw2XB4a3KKCxSWJ1atX07lzZ+7evUv//v0ZPHgwGTNaVoxCiHeTWKKwArKS8NzXIhmsPx2E67LTAMz9uQZNKxYwc0T/unXrFl26dGH16tVUrVqVLVu2UK1aNXOHJYQwgcQSxU2t9fBki0TEiYnR9F7lxeqTgeTMnI6dvRqQK0t6c4cFGIr4/fHHH/Ts2ZPQ0FBGjx5Nr169pIifEO+x116jEMlr94U7tF94LG55scOHFpMkrl69irOzM56entStWxd3d3fKli1r7rCEECaWWKJolGxRCAIfhOK28gxH/O8D0KtJGRzqlSBTevMXyouJiWHmzJn0798fpRQzZsygU6dOpEmT6POaQoj3xCsThdb6fnIGklo9DY/i2zmH8bn5GIA8WTMw+6fq1CxuGQ+nXbhwAQcHBw4ePEjTpk2ZO3cuxYpJTUghUhPLqj+dyjyLiKbO6J08DosCYGH7mjQom8/MURlERkYyfvx4hg0bRpYsWfjjjz/4+eefpYifEKmQJAoz
uXbvKd/MOczjsCicPylBv2blLOZD+OTJk9jb23P69Gm+/fZbpk+fTv78ljd9qhAiecggsxlcvvOET8bvIfhJON/bFaH/Z+UtIkk8e/aM/v37U6tWLW7dusWaNWtYsWKFJAkhUjnpUZhBq1mHAJj+QzW+sJBSHAcOHMDe3p5Lly7RoUMHJkyYQM6cOc0dlhDCAkiPIpntuxTMk7AoPimT1yKSxJMnT+jSpQv16tUjIiKC7du34+HhIUlCCBFHEkUyunznCb8sOApA/+blzBwNbNmyhYoVKzJr1ixcXV05e/YsjRs3NndYQggLI4kimWzyuknjSfsAmPNTdcoVyG62WO7du8cvv/xC8+bNyZo1KwcPHmTKlClkzZrVbDEJISyXJIpkcOzqfTovMRT2m/ZDNZpVMs8Mb1prVq5cSYUKFVi6dCm//vorp06dok6dOmaJRwiRMsjF7GTQPbawn0dbOxqVN88dRDdv3sTFxYV169ZRo0YNPD09qVKlilliEUKkLNKjMLFJnhcJeviMphXzmyVJaK1ZsGAB5cuXZ+vWrYwbN44jR45IkhBCGE16FCYSFhlNlyWn2HH+NgATvk3+D2Z/f3+cnZ3ZsWMH9evXZ/78+ZQpUybZ4xBCpGySKExgv28wP3scjVte6/IR2TImXxnu6Ohopk+fzsCBA7GysmL27Nk4OTlJET8hxFuRRJHEdl+8Q/vfDWXCezQuQ9dPS5EmTfI9de3j44O9vT1Hjhzhs88+Y+7cuRQpUiTZ2hdCvH8kUSSh0wEP45LExi51qVzYOtnajoiIYOzYsYwYMYJs2bLx119/0aZNG4soDSKESNlMOhahlGqmlLqolLqslOqXwPaeSikfpZSXUmqnUirF1q/+2+sGX808CMCwLysma5I4fvw4NWvWZPDgwXz99df4+Pjw448/SpIQQiQJkyUKpZQVMBP4DKgA/KCUqvDCbqcAO621LbAKGGeqeEwp4H4oXZacAmD+L3a0/ah4srT77Nkz+vTpQ+3atbl79y7r169n6dKl5MtnGaXKhRDvB1MOPdUCLmut/QGUUsuAloDP8x201rvj7X8E+MmE8SS5sMhoOiw8xiG/ewBM+q4KTSokzy2we/fuxcHBgcuXL+Po6Mi4cePIkSNHsrQthEhdTDn0VAgIiLccGLvuVeyBLQltUEo5KaWOK6WOBwcHJ2GIby86RtN63hEO+d0jfdo0uP9ix9fVC5u83cePH9OpUycaNGhATEwMO3fuZN68eZIkhBAmY8oeRUID5DrBHZX6CbADPklou9Z6HjAPwM7OLsFzJLdB685yOuAhtYrnYrnzh8lyPWDTpk107NiRGzdu0LNnT4YPH06WLFlM3q4QInUzZY8iEIh/X2Zh4MaLOymlGgMDgS+11uEmjCfJHLp8l6VHA6hWNAfLnEyfJO7evctPP/1EixYtsLa25tChQ0ycOFGShBAiWZgyURwDSiulbJRS6YHWwIb4OyilqgFzMSSJOyaMJcmsOBZAG/d/AJj9Yw2TPiOhtWbZsmWUL1+eFStWMGTIEE6ePEnt2rVN1qYQQrzIZENPWusopVQXYBtgBSzQWp9TSg0HjmutNwDjgazAythv5de11l+aKqZ35RccQp/VXgCs7vQRBawzmqytoKAgXFxc2LBhAzVr1sTDw4PKlSubrD0hhHgVkz5wp7XeDGx+Yd3geL+nqFlyZu/xA2CFcx1qFDPNDHBaa9zd3XFzcyMyMpIJEybQvXt3rKysTNKeEEK8jjyZbaQ1JwNZdSKQCgWzU8sml0na8PPzw9HRkd27d9OgQQPmz59PqVKlTNKWEEIYS6rEGWHL2Zv0XHEGgIXtayb5+aOjo5k0aRKVK1fmxIkTzJ07l507d0qSEEJYBOlRvMaqE4G4rTQkiVk/Vidf9qS9LuHt7Y29vT1Hjx6lRYsWzJ49m8KFTf88hhBCGEsSRSJGbznP3L3+AGzqVpeKHyRd/aaIiAhGjx7NyJEjsba2ZunSpXz//fdSn0kIYXEkUbyC+37/uCSxulOdJE0SR48exd7eHm9vb9q0acPUqVPJkydPkp1fCCGSklyjSMChy3cZsek8AJ496lOjWNJcvA4NDaVXr17UqVOHBw8esHHjRhYvXixJQghh0aRH8YKn4VEM3nAOMCSJMvmzJcl5d+/ejYODQ9z0pGPHjsXaOvlKkQshxNuSHkU8/sEhVByyjct3QmhTu2iSJIlHjx7h5OTEp59+ilKK3bt3M2fOHEkSQogUQxJFrDuPw2g6ZR8AP39YjFGt3v0p6I0bN1KhQgU8PDzo3bs3Xl5eNGjQ4J3PK4QQyUmGnmK5LD5JZLRm3De2fGf3bnNMBwcH061bN5YtW0blypVZv349dnZ2SRSpEEIkr1Tfo9BaM3KTD8evPaBpxfzvlCS01ixZsoTy5cuzevVqhg8fzvHjxyVJCCFStFTfo9h7KZj5+69gnSkdU1tXe+vzBAQE0KlTJzZt2kTt2rXx8PCgYsWKSRipEEKYR6ruUQTcD8X5zxMA7HZrQMZ0b154LyYmhjlz5lCxYkV2797N5MmTOXjwoCQJIcR7I9X2KMIio2m/8BiR0TFMbV2VXFnSv/E5fH19cXR0ZO/evTRq1Ih58+ZRokQJE0QrhBDmkyp7FMeu3qfcr1u5fCcE10ZlaFk1sam8XxYVFcX48eOxtbXl9OnTuLu7s337dkkSQoj3UqrrUbjv94976rpbo9K4Ni79Rsd7eXlhb2/P8ePHadmyJbNmzeKDDz4wRagviYyMJDAwkLCwsGRpTwiR8mTMmJHChQuTLl26JDtnqkoUvVacYfXJQHJnSY9Hu5pULZLD6GPDw8MZOXIko0ePJleuXKxYsYJvvvkmWYv4BQYGki1bNooXLy7FA4UQL9Fac+/ePQIDA7GxsUmy86aaRDF0wzlWnwwkV5b07HJrgHUm47Pt4cOHsbe35/z58/z8889MnjyZ3LlzmzDahIWFhUmSEEK8klKK3LlzExwcnKTnfe+vUYSER9F/jRcLD10lW8a07OltfJJ4+vQp3bt35+OPPyYkJITNmzezaNEisySJ5yRJCCESY4rPiPe+RzFykw9LjwZQJn9WPNrWJHtG45LEjh07cHR05OrVq7i4uDB69GiyZ89u4miFEMLyvNc9ijtPwlh6NACAbd3rUyRX5tce8/DhQ+zt7WnSpAnp0qVj7969zJw5U5JErMDAQFq2bEnp0qUpWbIkrq6uREREvPa4UaNGvXLb7du3adGiBVWqVKFChQo0b948KUN+ydWrV6lUqdIbHWNlZUXVqlWpVKkSX3zxBQ8fPozbdu7cOT799FPKlClD6dKl+e2339Bax23fsmULdnZ2lC9fnnLlyuHm5pZgG8buZy4ODg74+PgA//17vs37mZqNHj2aUqVKUbZsWbZt25bgPu3atcPGxoaqVatStWpVTp8+DcCDBw9o1aoVtra21KpVC29v7+QJWmudon5q1KihjRH8JEwX6/u3Ltb3bz3R86JRx6xdu1YXLFhQW1lZ6X79+unQ0FCjjksuPj4+Zm0/JiZG16x
[... base64 "image/png" payload for the out-of-sample ROC curve figure (ROC curve with diagonal reference line) omitted ...]\n", 373 |         "text/plain": [ 374 |           "
" 375 | ] 376 | }, 377 | "metadata": { 378 | "needs_background": "light" 379 | }, 380 | "output_type": "display_data" 381 | }, 382 | { 383 | "name": "stdout", 384 | "output_type": "stream", 385 | "text": [ 386 | "0.17328835 0.12840138474769391\n" 387 | ] 388 | } 389 | ], 390 | "source": [ 391 | "prediction=model.predict([X_test_arr,X_test_time_pid,X_test_page,X_test_f2])\n", 392 | "from sklearn.metrics import roc_curve, auc\n", 393 | "import matplotlib.pyplot as plt\n", 394 | "(fpr, tpr, thresholds) = roc_curve(y_test,prediction)\n", 395 | "area = auc(fpr,tpr)\n", 396 | "plt.clf() #Clear the current figure\n", 397 | "plt.plot(fpr,tpr,label=\"Out-Sample ROC Curve with \\\n", 398 | " area = %1.2f\"%area)\n", 399 | "\n", 400 | "plt.plot([0, 1], [0, 1], 'k')\n", 401 | "plt.xlim([0.0, 1.0])\n", 402 | "plt.ylim([0.0, 1.0])\n", 403 | "plt.xlabel('False Positive Rate')\n", 404 | "plt.ylabel('True Positive Rate')\n", 405 | "plt.title('Out sample ROC')\n", 406 | "plt.legend(loc=\"lower right\")\n", 407 | "plt.show()\n", 408 | "\n", 409 | "def get_KS(y_prob,y_true):\n", 410 | " fpr,tpr,threshold=roc_curve(y_true,y_prob)\n", 411 | " ks=(tpr-fpr)\n", 412 | " max_=np.argmax(ks)\n", 413 | " \n", 414 | " return threshold[max_],np.max(ks)\n", 415 | "threshold,ks=get_KS(prediction,y_test)\n", 416 | "print(threshold,ks)" 417 | ] 418 | } 419 | ], 420 | "metadata": { 421 | "kernelspec": { 422 | "display_name": "Python 3", 423 | "language": "python", 424 | "name": "python3" 425 | }, 426 | "language_info": { 427 | "codemirror_mode": { 428 | "name": "ipython", 429 | "version": 3 430 | }, 431 | "file_extension": ".py", 432 | "mimetype": "text/x-python", 433 | "name": "python", 434 | "nbconvert_exporter": "python", 435 | "pygments_lexer": "ipython3", 436 | "version": "3.7.7" 437 | }, 438 | "toc": { 439 | "base_numbering": 1, 440 | "nav_menu": {}, 441 | "number_sections": true, 442 | "sideBar": true, 443 | "skip_h1_title": false, 444 | "title_cell": "Table of Contents", 445 | "title_sidebar": "Contents", 446 | "toc_cell": false, 447 | "toc_position": {}, 448 | "toc_section_display": true, 449 | "toc_window_display": false 450 | }, 451 | "varInspector": { 452 | "cols": { 453 | "lenName": 16, 454 | "lenType": 16, 455 | "lenVar": 40 456 | }, 457 | "kernels_config": { 458 | "python": { 459 | "delete_cmd_postfix": "", 460 | "delete_cmd_prefix": "del ", 461 | "library": "var_list.py", 462 | "varRefreshCmd": "print(var_dic_list())" 463 | }, 464 | "r": { 465 | "delete_cmd_postfix": ") ", 466 | "delete_cmd_prefix": "rm(", 467 | "library": "var_list.r", 468 | "varRefreshCmd": "cat(var_dic_list()) " 469 | } 470 | }, 471 | "types_to_exclude": [ 472 | "module", 473 | "function", 474 | "builtin_function_or_method", 475 | "instance", 476 | "_Feature" 477 | ], 478 | "window_display": false 479 | } 480 | }, 481 | "nbformat": 4, 482 | "nbformat_minor": 2 483 | } 484 | -------------------------------------------------------------------------------- /2.Binary_classification_models/fetch_data.py: -------------------------------------------------------------------------------- 1 | import json 2 | import numpy as np 3 | import pickle 4 | 5 | def get(train_size, val_size, test_size): 6 | ''' 7 | remember to adjust your file path 8 | read the data for use 9 | ''' 10 | data_dir = '../../data/processed/' 11 | with open(data_dir+'label.json') as f: 12 | label = json.load(f) 13 | with open(data_dir+'non_sequential_features.json') as f: 14 | non_sequential_features = json.load(f) 15 | # with open(data_dir+'features_sequential_embedded.json') as 
f: 16 | # sequential_features = json.load(f) 17 | 18 | with open(data_dir+'features_sequential_embedded.txt', "rb") as fp: #Unpickling 19 | feature1 = pickle.load(fp) 20 | 21 | label_size = len(label) 22 | train_size = int(train_size*label_size) 23 | val_size = int(val_size*label_size) 24 | test_size = int(test_size*label_size) 25 | 26 | # feature1 = np.array([sequential_features[key] 27 | # for key in sequential_features.keys()]) 28 | feature1 = np.array(feature1) 29 | feature2 = np.array([non_sequential_features[key] 30 | for key in non_sequential_features.keys()]) 31 | label = np.array([label[key] for key in label.keys()]) 32 | feature1_benign = feature1[label == 1] 33 | feature2_benign = feature2[label == 1] 34 | label_benign = label[label == 1] 35 | 36 | idx = np.arange(len(label)) 37 | np.random.shuffle(idx) 38 | 39 | idx_train = idx[:train_size] 40 | idx_val = idx[train_size:train_size+val_size] 41 | idx_test = idx[train_size+val_size:train_size+val_size+test_size] 42 | 43 | train_seq, val_seq, test_seq = feature1[idx_train], feature1[idx_val], feature1[idx_test] 44 | train_non, val_non, test_non = feature2[idx_train], feature2[idx_val], feature2[idx_test] 45 | train_Y, val_Y, test_Y = label[idx_train], label[idx_val], label[idx_test] 46 | 47 | return (train_seq, train_non, train_Y), (val_seq, val_non, val_Y), (test_seq, test_non, test_Y) 48 | 49 | def get_KS(y_prob,y_true): 50 | ''' 51 | y_prob: the predict_proba from the model 52 | y_true: the true label of Y 53 | this one calculates the KS value for a model 54 | return the best threshold, maximum ks 55 | ''' 56 | from sklearn.metrics import roc_curve 57 | fpr,tpr,threshold=roc_curve(y_true,y_prob) 58 | ks=(tpr-fpr) 59 | max_=np.argmax(ks) 60 | 61 | return threshold[max_],np.max(ks) -------------------------------------------------------------------------------- /Data_Schema.md: -------------------------------------------------------------------------------- 1 | # Data Schema 2 | 3 | ## sample data 4 | ``` 5 | [ 6 | "56f889ee11df4a72955147cb2f29a638", 7 | { 8 | "order_info": { 9 | "overdue": 0, 10 | "new_client": 0, 11 | "order_time": 1509322980000.0, 12 | "label": 0 13 | }, 14 | "data": [ 15 | { 16 | "pname": "loan_index", 17 | "pstime": 1508169825905, 18 | "petime": 1508169827989, 19 | "pid": "1508169825083X3005", 20 | "sid": "1508169825895" 21 | }, 22 | { 23 | "pname": "loan_submission", 24 | "pstime": 1508169828161, 25 | "petime": 1508169832016, 26 | "pid": "1508169825083X3005", 27 | "sid": "1508169825895" 28 | }, 29 | { 30 | "pname": "login", 31 | "pstime": 1509351552976, 32 | "petime": 1509351568401, 33 | "pid": "1509351523127X29603", 34 | "sid": "1509351523686" 35 | } 36 | ] 37 | } 38 | ] 39 | ``` 40 | 41 | - Data Structure 42 | - Each line of the text file contains the whole behavior for an individual application (Note: not an individual customer) 43 | - Data Type : ```[user_id, dictionary(order_info, data)] ``` 44 | - Order Info 45 | 46 | Keys | ValueDescription | ValueType 47 | ---|---|--- 48 | `overdue` | If application defaults, how many days overdued | int 49 | `new_client` | Whether this user has previous `successful` application | int (1-yes / 0-no) 50 | `order_time`| Application time | foat from Unix time(ms) 51 | `label` | Whether this application defaults | int (0-no / 1-yes) 52 | 53 | - Data 54 | - A list of page view behavior for an individual before a certain time (not neccessarily before application time) 55 | 56 | Keys | ValueDescription | ValueType 57 | ---|---|--- 58 | `pname` | Page name | String 59 | 
`pstime`| Time when page viewing starts | int from Unix time(ms) 60 | `petime`| Time when page viewing ends | int from Unix time(ms) 61 | `pid` | Whether this application defaults | String 62 | `sid` | Session ID | A string of int ID 63 | -------------------------------------------------------------------------------- /Images/Picture1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Images/Picture1.png -------------------------------------------------------------------------------- /Images/Picture2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Images/Picture2.png -------------------------------------------------------------------------------- /Images/Picture3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Images/Picture3.png -------------------------------------------------------------------------------- /Images/Picture4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Images/Picture4.png -------------------------------------------------------------------------------- /Images/Picture5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Images/Picture5.png -------------------------------------------------------------------------------- /Images/Picture6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Images/Picture6.png -------------------------------------------------------------------------------- /Images/Picture7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Images/Picture7.png -------------------------------------------------------------------------------- /Images/Picture8.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Images/Picture8.png -------------------------------------------------------------------------------- /Presentation.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Presentation.pdf -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Fraud Detection with Sequential User Behavior 2 | 3 | Online-lending fraud detection with customers' sequential behavioral data (End-to-end ML 
and NLP project). 4 | 5 | 6 | 7 | ## Project Summary 8 | 9 | The goal of this project is to use customers' behavior sequences prior to submitting a loan application to detect fraud in the online lending business. We tried a range of machine learning classification models as well as state-of-the-art methods, including a customized Transformer model with an attention mechanism, to extract features and maximize predictive performance, measured by AUC and the KS statistic. 10 | 11 | Building on the data exploration results of baseline machine learning models such as random forest and KNN, we used several neural networks for binary classification, e.g. LSTM layers directly and a bidirectional LSTM followed by CNN layers, to extract features from the sequential data and detect fraud. Among all fitted models, the best result came from the model combining an LSTM layer with a CNN layer, with a final AUC of 0.59 and KS of 0.1487. 12 | 13 | To dive deeper, we focused on both feature extraction and model optimization to improve accuracy. For feature extraction, we trained our own skip-gram model on the web page sequences to obtain a word2vec embedding layer and fed it to the LSTM layers. On the model side, although we tried several optimizations of the LSTM models, the effective memory limit of the LSTM held performance back, since it cannot learn relations across the sequence when the padding size is larger than 20. We therefore constructed a new architecture based on the Transformer model, which introduces a multi-head attention mechanism to capture the interactions between the web page sequence feature and the page stay time feature. With this design there is no longer a timestep limitation, and this Transformer variant is much more flexible for future exploration. 14 | 15 | 16 | ## Background 17 | 18 | In previous fraud detection analyses, people usually focused on users' basic information such as gender, age, income, family background, and application date. Although this kind of information can be used as features to build machine learning models for fraud detection, a lot of information is lost in the process. One kind of lost information is page view behavior. With the development of the Internet, more and more lending happens online, so we can record the page view behavior of each customer. Thanks to advances in deep learning, we can turn page view behavior into sequential feature sets and feed them into different models. 19 | 20 | 21 | ## Data Description 22 | 23 | In this project we worked with two datasets. One is the original dataset also used by the two previous groups; it contains 200,000 records of user behavior sequences collected before the lending application. We treated people in this dataset as the higher-income group, and the fraud rate there is very low. The other dataset was newly provided by Moffy and contains 30,000 records of user behavior sequences. The fraud rate is somewhat higher in this new dataset, and we considered this group the lower-income one. Both datasets contain the same types of data, shown in the following tables. 24 | 25 | ### 1.
Basic Information Features (Non-Sequential Features) 26 | 27 | Keys | ValueDescription | ValueType 28 | ---|---|--- 29 | `label` | Whether this application defaults | int (0-no / 1-yes) 30 | `overdue` | If the application defaults, how many days it is overdue | int 31 | `new_client` | Whether this user has a previous `successful` application | int (1-yes / 0-no) 32 | `order_time`| Application time | float from Unix time(ms) 33 | 34 | ### 2. Page-View Behavior Features (Sequential Features) 35 | 36 | Keys | ValueDescription | ValueType 37 | ---|---|--- 38 | `pname` | Page name | String 39 | `pstime`| Time when page viewing starts | int from Unix time(ms) 40 | `petime`| Time when page viewing ends | int from Unix time(ms) 41 | `pid` | Process ID | String 42 | `sid` | Session ID | A string of int ID 43 | 44 | 45 | ## Data Processing 46 | 47 | To feed the data into deep learning models, we did some preprocessing. 48 | 49 | **1. Basic information features (Non-sequential features):** For the basic information features, we think the last group did a good job, so we kept their processing and represented each sample as a fixed-length binary vector. 50 | 51 | **2. Page view behavior features (Sequential features):** 52 | - For pid and sid, we kept the last group's treatment and recorded whether the process id and session id changed from the previous page. 53 | - For pstime and petime, we also kept the last group's method: we subtracted the page start time from the page end time to get the page stay time, and subtracted the previous page's end time from the current page's start time to get the lag time between pages. 54 | - For pname, there are 12 different page categories. The last group encoded them in two ways, label encoding and one-hot encoding. We added one more, word2vec encoding, so we used three methods in total. 55 | - The last group found that over 95% of sequences are shorter than 60 timesteps and therefore chose a padding length of 60. We kept this padding size in our Transformer model. However, when training the LSTM models we found the 60-timestep input to be of little use, since the LSTM effectively handles at most about 20 timesteps. Therefore, we chose a padding length of 20 for the LSTM-related models. 56 | - As a result, we ended up with two main versions of the page view behavior sequence data, one with 60 timesteps and one with 20 timesteps. We also encoded the page categories into variables of three different widths: label encoding uses one column, one-hot encoding uses 12 columns, and the last one, word2vec encoding (discussed in detail below), uses 50 columns. 57 | - To determine the effect of the timestep length, we tested the original dataset on the best model provided by the last group; the results are shown in the table below. Both the AUC and KS score of the model with 20 timesteps were not much worse than those of the model with 60 timesteps. These results are likely due to the limitation of the LSTM: although the LSTM improves on the RNN in learning long-term dependencies, in practice it struggles to learn items more than about 20 timesteps back. When the sequence is longer than 20 timesteps, the LSTM behaves somewhat like a Markov chain and effectively memorizes only the last 20 timesteps. Moreover, in the last 20 timesteps of the 60-timestep input there are many ‘-1’ padding values, which can introduce errors.
Thus, the KS score of the model with 20-timestep input is even better than that of the model with 60-timestep input. 58 | 59 | Model Performance on High-Income Dataset | AUC | KS Score 60 | ---|---|--- 61 | `Model 4 using timesteps with 60` | 0.59 | 0.1487 62 | `Model 4 using timesteps with 20`| 0.58 | 0.1942 63 | 64 | 65 | ## Feature Extraction and Exploration 66 | 67 | ### Word2vec Embedding 68 | We experimented with feature extraction by RNN (LSTM), using three different inputs: 69 | - Sequence embedding 1: one-hot encoded page type, discretized (binned) page stay time and page lag time, and change of pid and sid. The input shape is (batch size, 60, 17); 70 | - Sequence embedding 2: label encoded page type only. The input shape of the LSTM is (batch size, 60, 1); 71 | - Sequence embedding 3: discretized (binned) page stay time and page lag time, and change of pid and sid only (without page type). The input shape is (batch size, 60, 4). 72 | The model performs most effectively with ‘Sequence embedding 1’. 73 | 74 | To explore this part further (constructing the LSTM input), our group applied word2vec embedding to the “page type” feature. Word2vec is a common technique in NLP. The basic idea is to train a two-layer shallow neural network on a text document (a document consists of one or more sentences) and then obtain an appropriate vector representation for each word of that document. Word2vec is an unsupervised algorithm, so we only need the document as input, without any labels. 75 | 76 |
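To make this concrete, here is a minimal, self-contained sketch of how a skip-gram embedding can be trained on page-name sequences with gensim (the library already imported in the data-processing notebook). It is illustrative only and not necessarily the exact training script used in the notebooks: the page names "loan_index", "loan_submission" and "login" come from the sample in Data_Schema.md, while the second toy sequence and the window size are placeholders; only the embedding size of 50 matches the value used in this project.

```python
from gensim.models import Word2Vec

# Each user's page-view history is treated as one "sentence" of page names.
# The first sequence follows the sample schema; the second is invented
# purely for illustration.
page_sentences = [
    ["loan_index", "loan_submission", "login"],
    ["login", "loan_index", "loan_index", "loan_submission"],
]

# sg=1 selects the skip-gram variant; vector_size=50 matches the embedding
# size used in this project. (Older gensim 3.x releases call this parameter
# `size` instead of `vector_size`.)
w2v = Word2Vec(page_sentences, vector_size=50, window=5, min_count=1, sg=1)

print(w2v.wv["loan_index"].shape)  # (50,) -- one dense vector per page type
```

Each page type then maps to a 50-dimensional vector, and stacking the vectors of a padded 60-step sequence gives the (batch size, 60, 50) input described below.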

77 | 78 |

 79 | 80 | In our project, we regard each page type as a word, the sequence of web viewing behavior of a single user as a sentence, and the web viewing sequences of all users as a document consisting of multiple sentences. Word2vec is a collection of related models that all share the same structure (a two-layer shallow neural network) but differ in their specific training objective. Our group tried the skip-gram model, and the resulting input size is (batch size, 60, 50), where 50 is the embedding size and can be changed as desired. 81 | 82 | Further note: the word2vec model trained by Google uses an embedding size of 300, but Google trained on an enormous number of documents and assigns a vector to every word in them (with so many distinct words, a 300-dimensional vector is needed to represent each one), while our project has only a little over 10 page types in total (recall that one page type corresponds to one word in our case), so we do not need that large an embedding size. 83 | 84 | The performance after replacing the “sequence embedding 1” used by the last group with our skip-gram embedding becomes slightly worse (see the table below). However, the last group's sequence embedding 1 uses not only the page type but also page stay time, lag time, and so on, while the word2vec embedding uses page type alone. So we still consider word2vec a powerful embedding method, but we need to find ways to combine the page type with the other variables to extract more information. Further work can also include tuning the hyperparameters of the word2vec model. 85 | 86 | Model Performance on Low-Income Dataset | AUC | KS Score 87 | ---|---|--- 88 | `Model 4 using sequential embedding 1` | 0.61 | 0.206 89 | `Model 4 using Word2vec embedding`| 0.60 | 0.177 90 | 91 | We compare the performance of model 4 rather than other architectures because model 4 is the best architecture we have found so far. 92 | 93 | ### Customized Transformer model with multi-head attention mechanism 94 | 95 | In order to explore the interactions between the “web page sequence” and “page stay time” features, which are both time-series features and crucial for studying customer behavior before a loan application is submitted, we constructed our own state-of-the-art architecture with reference to the traditional Transformer model. 96 | 97 | The core idea of our Transformer model is the attention mechanism: the ability to attend to different positions of the input sequence when computing a representation of that sequence. This lets us handle the input features with attention layers instead of RNNs or CNNs. The architecture has two main advantages: 98 | 99 | - Layer outputs can be calculated in parallel across multiple heads, instead of sequentially as in an RNN or CNN; 100 | - Long-range behavioral sequences can be learnt. Since we set the padding size to 60 in the preprocessing stage, this property addresses the problem that an LSTM cannot learn sequences well beyond roughly 20 timesteps. 101 | 102 |
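As a reference for the attention computation that these advantages rest on, here is a small, self-contained numpy sketch of scaled dot-product attention. It is illustrative only (the project notebooks need not implement it this way), and the 60x50 toy shapes simply mirror the padded sequence length and embedding size used here.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """softmax(Q K^T / sqrt(d_k)) V for a single (unbatched) sequence.

    Every query position attends to every key position in one matrix
    product, which is why the layer parallelizes over timesteps and can
    relate positions that are far apart in the padded sequence.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                 # (len_q, len_k)
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
    return weights @ v, weights                     # context vectors, attention map

rng = np.random.default_rng(0)
q = rng.normal(size=(60, 50))   # 60 padded timesteps, 50-dim embeddings
k = rng.normal(size=(60, 50))
v = rng.normal(size=(60, 50))
context, attn = scaled_dot_product_attention(q, k, v)
print(context.shape, attn.shape)  # (60, 50) (60, 60)
```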

103 | 104 |

 105 | 106 | Our Transformer model architecture consists of two parts, which we call the “encoder” and the “decoder” in alignment with the traditional Transformer model. 107 | 108 | #### 1. Encoder 109 | 110 | The web page sequence, after embedding with the word2vec method, is passed through N encoder layers, each consisting of a point-wise feed-forward network with a residual connection around it followed by layer normalization. 111 | 112 | The point-wise feed-forward network contains two fully-connected layers with a ReLU activation in between. The residual connection helps avoid the vanishing gradient problem in deep networks. 113 | 114 | After passing through the N encoder layers, an output for each web page behavior in the sequence is produced and is ready to enter the decoder as its input. 115 | 116 | #### 2. Decoder 117 | 118 | The decoder attends to the encoder's output and to its own input (page stay time) to predict whether the customer defaults. There are N decoder layers in our Transformer model; each consists of two point-wise feed-forward networks and a multi-head attention layer, and every sublayer has a residual connection around it followed by layer normalization. 119 | 120 | Below is the idea behind the multi-head attention layer, which consists of four parts: linear layers with a split into heads, scaled dot-product attention, concatenation of the heads, and a final linear layer. The multi-head attention layer takes three inputs, Q (query), K (key), and V (value), which are passed through dense layers and then split into multiple heads; scaled dot-product attention is applied within each head; the attention outputs of all heads are then concatenated; finally, the concatenated output is passed through a final dense layer. 121 | 122 | Multi-head attention is used because it allows the model to jointly attend to information at different positions from different representation subspaces. After the split, each head has a reduced dimensionality, so the total computation cost is about the same as single-head attention with full dimensionality. 123 | 124 |
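The four steps above are exactly what Keras' built-in multi-head attention layer packages together. The snippet below is only a compact stand-in for that computation, not code taken from the project notebooks; the head count and per-head size are illustrative, and it assumes TensorFlow 2.4 or later, where `tf.keras.layers.MultiHeadAttention` is available.

```python
import tensorflow as tf

# 5 heads of size 10 over a 50-dim page embedding: the dense projections,
# the split into heads, the per-head scaled dot-product attention, the
# concatenation and the final dense layer all happen inside this one layer.
mha = tf.keras.layers.MultiHeadAttention(num_heads=5, key_dim=10)

page_seq = tf.random.normal((32, 60, 50))   # (batch, padded timesteps, embedding)
out, scores = mha(query=page_seq, value=page_seq, key=page_seq,
                  return_attention_scores=True)

print(out.shape)     # (32, 60, 50) -- same shape as the query input
print(scores.shape)  # (32, 5, 60, 60) -- one 60x60 attention map per head
```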

125 | 126 |

 127 | 128 | In our case, V (value) and K (key) receive the encoder output of the customer's web page sequence, and Q (query) receives the output of the page stay time sequence from the decoder's first point-wise feed-forward sublayer; the attention weights represent the importance given to the decoder's input based on the encoder's output. In other words, the decoder predicts whether the customer defaults by looking at the encoder output of the “web page sequence” behavior while attending to its own representation of the “page stay time” behavior. A minimal code sketch of this wiring is given at the end of this README. 129 | 130 | Model Performance on Both Datasets | AUC | KS Score 131 | ---|---|--- 132 | `Transformer Model on High-Income Data` | 0.56 | 0.107 133 | `Transformer Model on Low-Income Data`| 0.60 | 0.163 134 | 135 | The performance of the Transformer model is shown in the table above. The result is not as good as expected; we think this is due to the lack of feature dimensions, since the sequence data by itself may not carry enough distributional information. 136 | 137 | 138 | ## Conclusions and Future Extensions 139 | 140 | In this project we applied several ways to extract features from sequential behavior data, improved the previous model, and explored the state-of-the-art Transformer model. 141 | 142 | - As a novel model, we use the Transformer for its advantages over CNNs and LSTMs on sequential data: its multi-head attention addresses some of the LSTM's problems with longer memory and efficiency. We use Word2Vec embedding as a new feature extraction method to obtain a representation of the sequential behavior data. 143 | 144 | - More generally, we can regard the Transformer model as an ensemble model: it combines two subnets with multi-head attention in order to link the sequential page data with the stay time data. The subnets need not be the vanilla multi-head attention layers; ensembling two LSTM subnets is also feasible, and we could borrow from residual networks to ensemble multiple relatively simple networks. 145 | 146 | - The Transformer model offers a new way to connect the web page sequence data with the page stay time. Without directly adding a DNN layer, the output of the Transformer model can have the same shape as the web page sequence data, but with the page stay time information folded in. We hope the next group can treat this output as the input to an LSTM model and check whether it improves on an LSTM that takes the raw web page sequence as input. 147 | 148 | - Generative models could also be used to address the imbalanced data problem; for example, the One-Class GAN model that the previous group tried is still worth attempting. 149 | 150 | - Although we tried several networks to predict fraudulent behavior, the results are not as good as expected. We have complemented the previous group's feature extraction methods, but the results are still not good enough, so more feature extraction approaches could be explored. The sequential data has 50 vector dimensions, and when other non-sequential data is concatenated to it the performance is not good. In the future, more ways could be tried to map non-sequential features, such as application time, onto the sequential features.
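To make the cross-attention wiring described in the decoder section concrete (K and V from the encoder output of the page sequence, Q from the stay-time branch), here is a minimal Keras sketch. It is a simplified stand-in rather than the architecture in the project's transformer notebooks: the layer sizes, head count, pooling, and output head are illustrative, and the real model stacks N such decoder layers with point-wise feed-forward sublayers.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Illustrative shapes: 60 padded timesteps, 50-dim word2vec page embeddings,
# and a 1-dim page stay time per timestep.
page_enc_out = layers.Input(shape=(60, 50), name="page_sequence_encoder_output")
stay_time = layers.Input(shape=(60, 1), name="page_stay_time")

# First decoder sublayer: a point-wise feed-forward network on the stay-time branch.
q = layers.Dense(50, activation="relu")(stay_time)

# Cross attention: Q from the stay-time branch, K and V from the encoder output.
attn = layers.MultiHeadAttention(num_heads=5, key_dim=10)(
    query=q, value=page_enc_out, key=page_enc_out)
x = layers.LayerNormalization()(attn + q)  # residual connection + layer norm

x = layers.GlobalAveragePooling1D()(x)
default_prob = layers.Dense(1, activation="sigmoid", name="default_probability")(x)

model = tf.keras.Model(inputs=[page_enc_out, stay_time], outputs=default_prob)
model.summary()
```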
151 | -------------------------------------------------------------------------------- /Readings/Attention is all you need.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Readings/Attention is all you need.pdf -------------------------------------------------------------------------------- /Readings/OCAN.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/avayang/Fraud_Detection_with_Sequential_User_Behavior/959d1888cb0f5fa2d58f02d82d0c84ded8f528e0/Readings/OCAN.pdf --------------------------------------------------------------------------------