├── HGP-SL ├── pre_trained ├── fig │ └── model.png ├── __pycache__ │ ├── layers.cpython-36.pyc │ ├── layers.cpython-37.pyc │ ├── models.cpython-36.pyc │ ├── models.cpython-37.pyc │ ├── sparse_softmax.cpython-36.pyc │ └── sparse_softmax.cpython-37.pyc ├── data │ ├── ADNI2 │ │ └── raw │ │ │ └── ADNI2_graph_labels.txt │ ├── ADNI │ │ └── raw │ │ │ ├── ADNI_graph_labels.txt │ │ │ └── ADNI_node_labels.txt │ └── ABIDE │ │ └── raw │ │ └── ABIDE_graph_labels.txt ├── README.md ├── models.py ├── sparse_softmax.py ├── mainTL.py ├── main.py └── layers.py ├── README.md ├── ADNI_with_Abide_TL.ipynb ├── Node2vec.ipynb ├── Hierarchical GCN.ipynb └── Baseline_GCN.ipynb /HGP-SL/pre_trained: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anoop23IISc/BrainNetworks/HEAD/HGP-SL/pre_trained -------------------------------------------------------------------------------- /HGP-SL/fig/model.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anoop23IISc/BrainNetworks/HEAD/HGP-SL/fig/model.png -------------------------------------------------------------------------------- /HGP-SL/__pycache__/layers.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anoop23IISc/BrainNetworks/HEAD/HGP-SL/__pycache__/layers.cpython-36.pyc -------------------------------------------------------------------------------- /HGP-SL/__pycache__/layers.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anoop23IISc/BrainNetworks/HEAD/HGP-SL/__pycache__/layers.cpython-37.pyc -------------------------------------------------------------------------------- /HGP-SL/__pycache__/models.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anoop23IISc/BrainNetworks/HEAD/HGP-SL/__pycache__/models.cpython-36.pyc -------------------------------------------------------------------------------- /HGP-SL/__pycache__/models.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anoop23IISc/BrainNetworks/HEAD/HGP-SL/__pycache__/models.cpython-37.pyc -------------------------------------------------------------------------------- /HGP-SL/__pycache__/sparse_softmax.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anoop23IISc/BrainNetworks/HEAD/HGP-SL/__pycache__/sparse_softmax.cpython-36.pyc -------------------------------------------------------------------------------- /HGP-SL/__pycache__/sparse_softmax.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anoop23IISc/BrainNetworks/HEAD/HGP-SL/__pycache__/sparse_softmax.cpython-37.pyc -------------------------------------------------------------------------------- /HGP-SL/data/ADNI2/raw/ADNI2_graph_labels.txt: -------------------------------------------------------------------------------- 1 | 1 2 | 1 3 | 1 4 | 2 5 | 1 6 | 1 7 | 1 8 | 1 9 | 1 10 | 1 11 | 1 12 | 1 13 | 1 14 | 1 15 | 2 16 | 1 17 | 2 18 | 1 19 | 2 20 | 1 21 | 1 22 | 1 23 | 1 24 | 1 25 | 1 26 | 1 27 | 1 28 | 1 29 | 1 30 | 2 31 | 1 32 | 2 33 | 1 34 | 1 35 | 1 36 | 2 37 | 1 38 | 1 39 | 1 40 | 1 41 | 1 42 | 1 43 | 1 44 | 1 45 | 1 46 | 1 47 | 1 48 | 1 49 
| 1 50 | 1 51 | 1 52 | 1 53 | 1 54 | 2 55 | 1 56 | 2 57 | 2 58 | 2 59 | 1 60 | 1 61 | 1 62 | 1 63 | 1 64 | 2 65 | 1 66 | 1 67 | 1 68 | 1 69 | 2 70 | 2 71 | 1 72 | 2 73 | 2 74 | 1 75 | 1 76 | 2 77 | 1 78 | 1 79 | 2 80 | 2 81 | 1 82 | 2 83 | 1 84 | 1 85 | 1 86 | 2 87 | 2 88 | 1 89 | 1 90 | 1 91 | 2 92 | 1 93 | 1 94 | 1 95 | 2 96 | 2 97 | 1 98 | 1 99 | 2 100 | 1 101 | 1 102 | 1 103 | 1 104 | 1 105 | 1 106 | 1 107 | 1 108 | 1 109 | 1 110 | 1 111 | 2 112 | 1 113 | 2 114 | 2 115 | 1 116 | 2 117 | 2 118 | -------------------------------------------------------------------------------- /HGP-SL/data/ADNI/raw/ADNI_graph_labels.txt: -------------------------------------------------------------------------------- 1 | 1 2 | 2 3 | 2 4 | 1 5 | 1 6 | 1 7 | 1 8 | 2 9 | 1 10 | 1 11 | 1 12 | 2 13 | 1 14 | 1 15 | 1 16 | 1 17 | 1 18 | 1 19 | 1 20 | 2 21 | 1 22 | 2 23 | 2 24 | 1 25 | 1 26 | 1 27 | 1 28 | 1 29 | 1 30 | 1 31 | 1 32 | 1 33 | 2 34 | 1 35 | 1 36 | 1 37 | 2 38 | 1 39 | 2 40 | 1 41 | 1 42 | 2 43 | 2 44 | 1 45 | 1 46 | 1 47 | 1 48 | 1 49 | 1 50 | 1 51 | 2 52 | 1 53 | 1 54 | 1 55 | 2 56 | 1 57 | 1 58 | 1 59 | 1 60 | 1 61 | 1 62 | 2 63 | 1 64 | 1 65 | 1 66 | 1 67 | 1 68 | 1 69 | 1 70 | 1 71 | 1 72 | 1 73 | 2 74 | 2 75 | 1 76 | 2 77 | 1 78 | 2 79 | 2 80 | 2 81 | 2 82 | 1 83 | 1 84 | 2 85 | 1 86 | 2 87 | 2 88 | 1 89 | 1 90 | 1 91 | 1 92 | 1 93 | 2 94 | 1 95 | 2 96 | 2 97 | 1 98 | 1 99 | 1 100 | 2 101 | 1 102 | 1 103 | 1 104 | 1 105 | 1 106 | 1 107 | 2 108 | 1 109 | 2 110 | 1 111 | 1 112 | 1 113 | 1 114 | 1 115 | 1 116 | 1 117 | 2 118 | 2 119 | 1 120 | 2 121 | 2 122 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Graph Convolutional Neural Networks for Alzheimer’s Classification with transfer learning and HPC methods 2 | 3 | GCNs have been increasingly used for the classification of brain functional networks to aid early prediction of neurodegenerative diseases. It is important to analyze the performance and capabilities of these GCNs for the classification and obtain insights on how and when the GCNs can be used. In this work, we perform detailed analyses of the performance of GCNs for the classification of brain functional networks of Alzheimer's data. We study the impact of the various parameters including the thresholds used for graph generation, graph sizes, data sizes or the number of subjects, and classification accuracy across different visits of the subjects. We have also developed a transfer learning approach to train using one dataset and apply the weights on another dataset to make use of larger data sizes. Finally, we have developed GPU-based acceleration methods to decrease the training time. We find that the accuracy of the models improves when taking into account all visits of the subjects and also improves with increasing graph sizes. The use of transfer learning has also improved classification accuracy. Finally, the use of CUDA streams for asynchronous computations has resulted in a reduction in execution times by up to 60\%. 4 | 5 | ### Requirements 6 | 7 | * python 8 | * pytorch 9 | * torch-scatter 10 | * torch-sparse 11 | * torch-cluster 12 | * torch-geometric 13 | * stellargraph 14 | 15 | Note: 16 | This code repository is heavily built on [pytorch_geometric](https://github.com/rusty1s/pytorch_geometric), which is a Geometric Deep Learning Extension Library for PyTorch. 
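As a quick orientation, the HGP-SL scripts read the brain-network graphs through PyTorch Geometric's `TUDataset` loader and batch them with its `DataLoader`. The snippet below is only a minimal sketch: the relative data path and the batch size are illustrative (`main.py` hard-codes an absolute data path and exposes both the dataset name and the batch size as command-line arguments).

```python
from torch_geometric.datasets import TUDataset
from torch_geometric.data import DataLoader

# The raw files live under HGP-SL/data/<NAME>/raw (NAME = ADNI, ADNI2 or ABIDE).
dataset = TUDataset("HGP-SL/data", name="ADNI")
loader = DataLoader(dataset, batch_size=64, shuffle=True)

for batch in loader:
    # Each batch is a block-diagonal collection of graphs; batch.y holds the graph labels.
    print(batch.num_graphs, batch.y[:5])
    break
```
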
Please refer [here](https://pytorch-geometric.readthedocs.io/en/latest/) for how to install and utilize the library. 17 | 18 | ### Dataset 19 | 20 | * ADNI dataset is publicly avaiable at http://adni.loni.usc.edu/ 21 | * ABIDE dataset is publicly available at http://preprocessed-connectomes-project.org/abide/ 22 | -------------------------------------------------------------------------------- /HGP-SL/README.md: -------------------------------------------------------------------------------- 1 | # HGP-SL 2 | Hierarchical Graph Pooling with Structure Learning ([arXiv](https://arxiv.org/abs/1911.05954)). 3 | 4 | This is a PyTorch implementation of the HGP-SL algorithm, which learns a low-dimensional representation for the entire graph. Specifically, the graph pooling operation utilizes node features and graph structure information to perform down-sampling on graphs. Then, a structure learning layer is stacked on the pooling operation, which aims to learn a refined graph structure that can best preserve the essential topological information. 5 | 6 | 7 | ### Data format 8 | 9 | This folder contains the following comma separated text files (replace DS by the name of the dataset): 10 | 11 | **n = total number of nodes** 12 | 13 | **m = total number of edges** 14 | 15 | **N = number of graphs** 16 | 17 | **(1) DS_A.txt (m lines)** 18 | 19 | *sparse (block diagonal) adjacency matrix for all graphs, each line corresponds to (row, col) resp. (node_id, node_id)* 20 | 21 | **(2) DS_graph_indicator.txt (n lines)** 22 | 23 | *column vector of graph identifiers for all nodes of all graphs, the value in the i-th line is the graph_id of the node with node_id i* 24 | 25 | **(3) DS_graph_labels.txt (N lines)** 26 | 27 | *class labels for all graphs in the dataset, the value in the i-th line is the class label of the graph with graph_id i* 28 | 29 | **(4) DS_node_labels.txt (n lines)** 30 | 31 | *column vector of node labels, the value in the i-th line corresponds to the node with node_id i* 32 | 33 | There are OPTIONAL files if the respective information is available: 34 | 35 | **(5) DS_edge_labels.txt (m lines; same size as DS_A_sparse.txt)** 36 | 37 | *labels for the edges in DS_A_sparse.txt* 38 | 39 | **(6) DS_edge_attributes.txt (m lines; same size as DS_A.txt)** 40 | 41 | *attributes for the edges in DS_A.txt* 42 | 43 | **(7) DS_node_attributes.txt (n lines)** 44 | 45 | *matrix of node attributes, the comma seperated values in the i-th line is the attribute vector of the node with node_id i* 46 | 47 | **(8) DS_graph_attributes.txt (N lines)** 48 | 49 | *regression values for all graphs in the dataset, the value in the i-th line is the attribute of the graph with graph_id i* 50 | -------------------------------------------------------------------------------- /HGP-SL/models.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn.functional as F 3 | from torch_geometric.nn import global_mean_pool as gap, global_max_pool as gmp 4 | from torch_geometric.nn import GCNConv 5 | 6 | from layers import GCN, HGPSLPool 7 | 8 | 9 | class Model(torch.nn.Module): 10 | def __init__(self, args): 11 | super(Model, self).__init__() 12 | self.args = args 13 | self.num_features = args.num_features 14 | self.nhid = args.nhid 15 | self.num_classes = args.num_classes 16 | self.pooling_ratio = args.pooling_ratio 17 | self.dropout_ratio = args.dropout_ratio 18 | self.sample = args.sample_neighbor 19 | self.sparse = args.sparse_attention 20 | self.sl = 
args.structure_learning 21 | self.lamb = args.lamb 22 | 23 | self.conv1 = GCNConv(self.num_features, self.nhid) 24 | self.conv2 = GCN(self.nhid, self.nhid) 25 | self.conv3 = GCN(self.nhid, self.nhid) 26 | 27 | self.pool1 = HGPSLPool(self.nhid, self.pooling_ratio, self.sample, self.sparse, self.sl, self.lamb) 28 | self.pool2 = HGPSLPool(self.nhid, self.pooling_ratio, self.sample, self.sparse, self.sl, self.lamb) 29 | 30 | self.lin1 = torch.nn.Linear(self.nhid * 2, self.nhid) 31 | self.lin2 = torch.nn.Linear(self.nhid, self.nhid // 2) 32 | self.lin3 = torch.nn.Linear(self.nhid // 2, self.num_classes) 33 | 34 | def forward(self, data): 35 | x, edge_index, batch = data.x, data.edge_index, data.batch 36 | edge_attr = None 37 | 38 | x = F.relu(self.conv1(x, edge_index, edge_attr)) 39 | x, edge_index, edge_attr, batch = self.pool1(x, edge_index, edge_attr, batch) 40 | x1 = torch.cat([gmp(x, batch), gap(x, batch)], dim=1) 41 | 42 | x = F.relu(self.conv2(x, edge_index, edge_attr)) 43 | x, edge_index, edge_attr, batch = self.pool2(x, edge_index, edge_attr, batch) 44 | x2 = torch.cat([gmp(x, batch), gap(x, batch)], dim=1) 45 | 46 | x = F.relu(self.conv3(x, edge_index, edge_attr)) 47 | x3 = torch.cat([gmp(x, batch), gap(x, batch)], dim=1) 48 | 49 | x = F.relu(x1) + F.relu(x2) + F.relu(x3) 50 | 51 | x = F.relu(self.lin1(x)) 52 | x = F.dropout(x, p=self.dropout_ratio, training=self.training) 53 | x = F.relu(self.lin2(x)) 54 | x = F.dropout(x, p=self.dropout_ratio, training=self.training) 55 | x = F.log_softmax(self.lin3(x), dim=-1) 56 | 57 | return x 58 | -------------------------------------------------------------------------------- /HGP-SL/data/ABIDE/raw/ABIDE_graph_labels.txt: -------------------------------------------------------------------------------- 1 | 1 2 | 2 3 | 1 4 | 2 5 | 1 6 | 2 7 | 2 8 | 2 9 | 1 10 | 1 11 | 2 12 | 1 13 | 2 14 | 1 15 | 2 16 | 1 17 | 2 18 | 1 19 | 2 20 | 1 21 | 2 22 | 1 23 | 1 24 | 2 25 | 1 26 | 1 27 | 2 28 | 1 29 | 1 30 | 2 31 | 2 32 | 1 33 | 2 34 | 2 35 | 1 36 | 1 37 | 1 38 | 1 39 | 2 40 | 1 41 | 2 42 | 1 43 | 2 44 | 2 45 | 1 46 | 1 47 | 1 48 | 1 49 | 2 50 | 2 51 | 2 52 | 1 53 | 1 54 | 1 55 | 2 56 | 1 57 | 1 58 | 1 59 | 2 60 | 2 61 | 1 62 | 1 63 | 2 64 | 2 65 | 1 66 | 2 67 | 1 68 | 1 69 | 2 70 | 1 71 | 1 72 | 1 73 | 1 74 | 1 75 | 2 76 | 2 77 | 2 78 | 2 79 | 2 80 | 1 81 | 2 82 | 1 83 | 2 84 | 1 85 | 1 86 | 2 87 | 1 88 | 1 89 | 2 90 | 2 91 | 2 92 | 1 93 | 1 94 | 1 95 | 2 96 | 2 97 | 2 98 | 2 99 | 2 100 | 1 101 | 1 102 | 2 103 | 1 104 | 2 105 | 1 106 | 1 107 | 1 108 | 1 109 | 2 110 | 2 111 | 2 112 | 1 113 | 1 114 | 1 115 | 1 116 | 1 117 | 2 118 | 1 119 | 1 120 | 2 121 | 2 122 | 2 123 | 1 124 | 2 125 | 1 126 | 2 127 | 2 128 | 1 129 | 2 130 | 1 131 | 2 132 | 2 133 | 1 134 | 2 135 | 2 136 | 2 137 | 1 138 | 1 139 | 1 140 | 1 141 | 2 142 | 2 143 | 1 144 | 2 145 | 1 146 | 1 147 | 1 148 | 1 149 | 2 150 | 1 151 | 1 152 | 1 153 | 1 154 | 1 155 | 2 156 | 1 157 | 2 158 | 1 159 | 2 160 | 2 161 | 1 162 | 1 163 | 1 164 | 2 165 | 2 166 | 2 167 | 2 168 | 1 169 | 2 170 | 2 171 | 2 172 | 1 173 | 1 174 | 2 175 | 2 176 | 1 177 | 2 178 | 1 179 | 2 180 | 1 181 | 1 182 | 2 183 | 1 184 | 2 185 | 2 186 | 2 187 | 1 188 | 1 189 | 2 190 | 2 191 | 1 192 | 1 193 | 2 194 | 2 195 | 2 196 | 1 197 | 2 198 | 2 199 | 1 200 | 1 201 | 1 202 | 1 203 | 2 204 | 1 205 | 1 206 | 2 207 | 2 208 | 1 209 | 2 210 | 1 211 | 2 212 | 2 213 | 2 214 | 2 215 | 2 216 | 2 217 | 1 218 | 1 219 | 2 220 | 1 221 | 1 222 | 1 223 | 1 224 | 1 225 | 1 226 | 1 227 | 2 228 | 1 229 | 2 230 | 2 231 | 1 232 | 2 233 | 1 234 | 1 235 | 1 236 | 
2 237 | 2 238 | 1 239 | 2 240 | 1 241 | 1 242 | 1 243 | 1 244 | 2 245 | 1 246 | 2 247 | 1 248 | 2 249 | 1 250 | 1 251 | 1 252 | 1 253 | 2 254 | 1 255 | 1 256 | 1 257 | 1 258 | 2 259 | 2 260 | 2 261 | 2 262 | 1 263 | 1 264 | 1 265 | 2 266 | 2 267 | 1 268 | 2 269 | 1 270 | 2 271 | 1 272 | 1 273 | 1 274 | 1 275 | 1 276 | 1 277 | 1 278 | 1 279 | 1 280 | 2 281 | 2 282 | 1 283 | 1 284 | 1 285 | 2 286 | 1 287 | 1 288 | 1 289 | 1 290 | 2 291 | 1 292 | 2 293 | 1 294 | 1 295 | 2 296 | 2 297 | 2 298 | 1 299 | 2 300 | 1 301 | 1 302 | 1 303 | 1 304 | 1 305 | 1 306 | 2 307 | 1 308 | 2 309 | 1 310 | 2 311 | 2 312 | 2 313 | 2 314 | 1 315 | 2 316 | 1 317 | 1 318 | 2 319 | 1 320 | 1 321 | 2 322 | 1 323 | 2 324 | 2 325 | 2 326 | 2 327 | 2 328 | 2 329 | 1 330 | 2 331 | 1 332 | 2 333 | 2 334 | 1 335 | 2 336 | 1 337 | 1 338 | 2 339 | 2 340 | 2 341 | 2 342 | 1 343 | 1 344 | 1 345 | 2 346 | 1 347 | 1 348 | 2 349 | 1 350 | 2 351 | 1 352 | 2 353 | 2 354 | 1 355 | 1 356 | 1 357 | 1 358 | 2 359 | 2 360 | 2 361 | 1 362 | 2 363 | 1 364 | 2 365 | 2 366 | 1 367 | 1 368 | 2 369 | 2 370 | 1 371 | 2 372 | 1 373 | 1 374 | 1 375 | 2 376 | 2 377 | 2 378 | 1 379 | 1 380 | 1 381 | 2 382 | 1 383 | 2 384 | 1 385 | 1 386 | 2 387 | 1 388 | 2 389 | 1 390 | 2 391 | 1 392 | 2 393 | 2 394 | 2 395 | 1 396 | 1 397 | 2 398 | 2 399 | 2 400 | 1 401 | 1 402 | 1 403 | 2 404 | 2 405 | 2 406 | 2 407 | 1 408 | 1 409 | 1 410 | 2 411 | 1 412 | 2 413 | 1 414 | 1 415 | 2 416 | 1 417 | 2 418 | 1 419 | 1 420 | 1 421 | 2 422 | 2 423 | 2 424 | 2 425 | 2 426 | 2 427 | 2 428 | 2 429 | 1 430 | 1 431 | 2 432 | 2 433 | 2 434 | 2 435 | 2 436 | 1 437 | 1 438 | 1 439 | 1 440 | 1 441 | 2 442 | 2 443 | 1 444 | 1 445 | 2 446 | 1 447 | 2 448 | 1 449 | 1 450 | 1 451 | 1 452 | 2 453 | 2 454 | 1 455 | 1 456 | 2 457 | 1 458 | 1 459 | 2 460 | 1 461 | 1 462 | 2 463 | 2 464 | 1 465 | 2 466 | 2 467 | 2 468 | 1 469 | 1 470 | 2 471 | 2 472 | 2 473 | 1 474 | 2 475 | 2 476 | 1 477 | 2 478 | 1 479 | 2 480 | 2 481 | 1 482 | 2 483 | 1 484 | 1 485 | 2 486 | 1 487 | 1 488 | 1 489 | 2 490 | 1 491 | 1 492 | 1 493 | 2 494 | 1 495 | 1 496 | 1 497 | 2 498 | 2 499 | 1 500 | 2 501 | 2 502 | 2 503 | 2 504 | 1 505 | 1 506 | 2 507 | 1 508 | 2 509 | 1 510 | 2 511 | 1 512 | 1 513 | 1 514 | 2 515 | 1 516 | 2 517 | 1 518 | 2 519 | 2 520 | 2 521 | 1 522 | 1 523 | 2 524 | 1 525 | 2 526 | 1 527 | 2 528 | 2 529 | 1 530 | 1 531 | 1 532 | 1 533 | 1 534 | 2 535 | 2 536 | 1 537 | 2 538 | 1 539 | 1 540 | 2 541 | 1 542 | 1 543 | 1 544 | 2 545 | 2 546 | 1 547 | 1 548 | 1 549 | 2 550 | 2 551 | 1 552 | 1 553 | 1 554 | 1 555 | 1 556 | 2 557 | 2 558 | 2 559 | 2 560 | 1 561 | 2 562 | 1 563 | 1 564 | 1 565 | 2 566 | 1 567 | 2 568 | 1 569 | 1 570 | 2 571 | 2 572 | 2 573 | 1 574 | 2 575 | 2 576 | 1 577 | 1 578 | 2 579 | 1 580 | 2 581 | 1 582 | 1 583 | 2 584 | 2 585 | 2 586 | 1 587 | 1 588 | 2 589 | 2 590 | 2 591 | 2 592 | 1 593 | 1 594 | 2 595 | 2 596 | 1 597 | 2 598 | 2 599 | 1 600 | 1 601 | 2 602 | 1 603 | 2 604 | 2 605 | 1 606 | -------------------------------------------------------------------------------- /HGP-SL/sparse_softmax.py: -------------------------------------------------------------------------------- 1 | """ 2 | An original implementation of sparsemax (Martins & Astudillo, 2016) is available at 3 | https://github.com/OpenNMT/OpenNMT-py/blob/master/onmt/modules/sparse_activations.py. 4 | See `From Softmax to Sparsemax: A Sparse Model of Attention and Multi-Label Classification, ICML 2016` 5 | for detailed description. 
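As a brief recap of that paper: for a vector of scores z, sparsemax returns the Euclidean projection of z
onto the probability simplex, which reduces to p_i = max(z_i - tau(z), 0) with the threshold tau(z) chosen
so that the non-zero entries sum to one. The helper `_threshold_and_support` below computes this threshold
(and the size of the support) separately for every group defined by the batch vector.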
6 | 7 | We make some modifications to make it work at scatter operation scenarios, e.g., calculate softmax according to batch 8 | indicators. 9 | 10 | Usage: 11 | >> x = torch.tensor([ 1.7301, 0.6792, -1.0565, 1.6614, -0.3196, -0.7790, -0.3877, -0.4943, 12 | 0.1831, -0.0061]) 13 | >> batch = torch.tensor([0, 0, 0, 0, 1, 1, 1, 1, 1, 1]) 14 | >> sparse_attention = Sparsemax() 15 | >> res = sparse_attention(x, batch) 16 | >> print(res) 17 | tensor([0.5343, 0.0000, 0.0000, 0.4657, 0.0612, 0.0000, 0.0000, 0.0000, 0.5640, 18 | 0.3748]) 19 | 20 | """ 21 | import torch 22 | import torch.nn as nn 23 | from torch.autograd import Function 24 | from torch_scatter import scatter_add, scatter_max 25 | 26 | 27 | def scatter_sort(x, batch, fill_value=-1e16): 28 | num_nodes = scatter_add(batch.new_ones(x.size(0)), batch, dim=0) 29 | batch_size, max_num_nodes = num_nodes.size(0), num_nodes.max().item() 30 | 31 | cum_num_nodes = torch.cat([num_nodes.new_zeros(1), num_nodes.cumsum(dim=0)[:-1]], dim=0) 32 | 33 | index = torch.arange(batch.size(0), dtype=torch.long, device=x.device) 34 | index = (index - cum_num_nodes[batch]) + (batch * max_num_nodes) 35 | 36 | dense_x = x.new_full((batch_size * max_num_nodes,), fill_value) 37 | dense_x[index] = x 38 | dense_x = dense_x.view(batch_size, max_num_nodes) 39 | 40 | sorted_x, _ = dense_x.sort(dim=-1, descending=True) 41 | cumsum_sorted_x = sorted_x.cumsum(dim=-1) 42 | cumsum_sorted_x = cumsum_sorted_x.view(-1) 43 | 44 | sorted_x = sorted_x.view(-1) 45 | filled_index = sorted_x != fill_value 46 | 47 | sorted_x = sorted_x[filled_index] 48 | cumsum_sorted_x = cumsum_sorted_x[filled_index] 49 | 50 | return sorted_x, cumsum_sorted_x 51 | 52 | 53 | def _make_ix_like(batch): 54 | num_nodes = scatter_add(batch.new_ones(batch.size(0)), batch, dim=0) 55 | idx = [torch.arange(1, i + 1, dtype=torch.long, device=batch.device) for i in num_nodes] 56 | idx = torch.cat(idx, dim=0) 57 | 58 | return idx 59 | 60 | 61 | def _threshold_and_support(x, batch): 62 | """Sparsemax building block: compute the threshold 63 | Args: 64 | x: input tensor to apply the sparsemax 65 | batch: group indicators 66 | Returns: 67 | the threshold value 68 | """ 69 | num_nodes = scatter_add(batch.new_ones(x.size(0)), batch, dim=0) 70 | cum_num_nodes = torch.cat([num_nodes.new_zeros(1), num_nodes.cumsum(dim=0)[:-1]], dim=0) 71 | 72 | sorted_input, input_cumsum = scatter_sort(x, batch) 73 | input_cumsum = input_cumsum - 1.0 74 | rhos = _make_ix_like(batch).to(x.dtype) 75 | support = rhos * sorted_input > input_cumsum 76 | 77 | support_size = scatter_add(support.to(batch.dtype), batch) 78 | # mask invalid index, for example, if batch is not start from 0 or not continuous, it may result in negative index 79 | idx = support_size + cum_num_nodes - 1 80 | mask = idx < 0 81 | idx[mask] = 0 82 | tau = input_cumsum.gather(0, idx) 83 | tau /= support_size.to(x.dtype) 84 | 85 | return tau, support_size 86 | 87 | 88 | class SparsemaxFunction(Function): 89 | 90 | @staticmethod 91 | def forward(ctx, x, batch): 92 | """sparsemax: normalizing sparse transform 93 | Parameters: 94 | ctx: context object 95 | x (Tensor): shape (N, ) 96 | batch: group indicator 97 | Returns: 98 | output (Tensor): same shape as input 99 | """ 100 | max_val, _ = scatter_max(x, batch) 101 | x -= max_val[batch] 102 | tau, supp_size = _threshold_and_support(x, batch) 103 | output = torch.clamp(x - tau[batch], min=0) 104 | ctx.save_for_backward(supp_size, output, batch) 105 | 106 | return output 107 | 108 | @staticmethod 109 | def backward(ctx, 
grad_output): 110 | supp_size, output, batch = ctx.saved_tensors 111 | grad_input = grad_output.clone() 112 | grad_input[output == 0] = 0 113 | 114 | v_hat = scatter_add(grad_input, batch) / supp_size.to(output.dtype) 115 | grad_input = torch.where(output != 0, grad_input - v_hat[batch], grad_input) 116 | 117 | return grad_input, None 118 | 119 | 120 | sparsemax = SparsemaxFunction.apply 121 | 122 | 123 | class Sparsemax(nn.Module): 124 | 125 | def __init__(self): 126 | super(Sparsemax, self).__init__() 127 | 128 | def forward(self, x, batch): 129 | return sparsemax(x, batch) 130 | 131 | 132 | if __name__ == '__main__': 133 | sparse_attention = Sparsemax() 134 | input_x = torch.tensor([1.7301, 0.6792, -1.0565, 1.6614, -0.3196, -0.7790, -0.3877, -0.4943, 0.1831, -0.0061]) 135 | input_batch = torch.cat([torch.zeros(4, dtype=torch.long), torch.ones(6, dtype=torch.long)], dim=0) 136 | res = sparse_attention(input_x, input_batch) 137 | print(res) 138 | -------------------------------------------------------------------------------- /HGP-SL/mainTL.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | import glob 3 | import os 4 | import time 5 | 6 | import torch 7 | import torch.nn.functional as F 8 | from models import Model 9 | from torch.utils.data import random_split 10 | from torch_geometric.data import DataLoader 11 | from torch_geometric.datasets import TUDataset 12 | 13 | parser = argparse.ArgumentParser() 14 | 15 | parser.add_argument('--seed', type=int, default=777, help='random seed') 16 | parser.add_argument('--batch_size', type=int, default=512, help='batch size') 17 | parser.add_argument('--lr', type=float, default=0.001, help='learning rate') 18 | parser.add_argument('--weight_decay', type=float, default=0.001, help='weight decay') 19 | parser.add_argument('--nhid', type=int, default=128, help='hidden size') 20 | parser.add_argument('--sample_neighbor', type=bool, default=True, help='whether sample neighbors') 21 | parser.add_argument('--sparse_attention', type=bool, default=True, help='whether use sparse attention') 22 | parser.add_argument('--structure_learning', type=bool, default=True, help='whether perform structure learning') 23 | parser.add_argument('--pooling_ratio', type=float, default=0.5, help='pooling ratio') 24 | parser.add_argument('--dropout_ratio', type=float, default=0.4, help='dropout ratio') 25 | parser.add_argument('--lamb', type=float, default=1.0, help='trade-off parameter') 26 | parser.add_argument('--dataset', type=str, default='PROTEINS', help='DD/PROTEINS/NCI1/NCI109/Mutagenicity/ENZYMES') 27 | parser.add_argument('--device', type=str, default='cuda:0', help='specify cuda devices') 28 | parser.add_argument('--epochs', type=int, default=1000, help='maximum number of epochs') 29 | parser.add_argument('--patience', type=int, default=100, help='patience for early stopping') 30 | 31 | args = parser.parse_args() 32 | torch.manual_seed(args.seed) 33 | if torch.cuda.is_available(): 34 | print("CUDA") 35 | torch.cuda.manual_seed(args.seed) 36 | 37 | 38 | dataset = TUDataset("/home/anoopkumar/Brain/HGP-SL/data", name=args.dataset, use_node_attr=False) 39 | 40 | 41 | args.num_classes = dataset.num_classes 42 | args.num_features = dataset.num_features 43 | 44 | print(args) 45 | 46 | print("len of the dataset ######## ",len(dataset)) 47 | 48 | num_training = int(len(dataset) * 0.7) 49 | num_val = int(len(dataset) * 0.1) 50 | num_test = len(dataset) - (num_training + num_val) 51 | training_set, validation_set, 
test_set = random_split(dataset, [num_training, num_val, num_test]) 52 | 53 | train_loader = DataLoader(training_set, batch_size=args.batch_size, shuffle=True) 54 | val_loader = DataLoader(validation_set, batch_size=args.batch_size, shuffle=False) 55 | test_loader = DataLoader(test_set, batch_size=args.batch_size, shuffle=False) 56 | 57 | model = Model(args).to(args.device) 58 | 59 | model.load_state_dict(torch.load("pre_trained")) 60 | print("model loaded ################") 61 | optimizer = torch.optim.Adam(model.parameters(), lr=args.lr, weight_decay=args.weight_decay) 62 | 63 | 64 | def train(): 65 | min_loss = 1e10 66 | patience_cnt = 0 67 | val_loss_values = [] 68 | best_epoch = 0 69 | 70 | t = time.time() 71 | model.train() 72 | for epoch in range(args.epochs): 73 | loss_train = 0.0 74 | correct = 0 75 | for i, data in enumerate(train_loader): 76 | optimizer.zero_grad() 77 | data = data.to(args.device) 78 | out = model(data) 79 | loss = F.nll_loss(out, data.y) 80 | loss.backward() 81 | optimizer.step() 82 | loss_train += loss.item() 83 | pred = out.max(dim=1)[1] 84 | correct += pred.eq(data.y).sum().item() 85 | acc_train = correct / len(train_loader.dataset) 86 | acc_val, loss_val = compute_test(val_loader) 87 | print('Epoch: {:04d}'.format(epoch + 1), 'loss_train: {:.6f}'.format(loss_train), 88 | 'acc_train: {:.6f}'.format(acc_train), 'loss_val: {:.6f}'.format(loss_val), 89 | 'acc_val: {:.6f}'.format(acc_val), 'time: {:.6f}s'.format(time.time() - t)) 90 | 91 | val_loss_values.append(loss_val) 92 | # torch.save(model.state_dict(), '{}.pth'.format(epoch)) 93 | torch.save(model.state_dict(), "pre_trained") 94 | 95 | if val_loss_values[-1] < min_loss: 96 | min_loss = val_loss_values[-1] 97 | best_epoch = epoch 98 | patience_cnt = 0 99 | else: 100 | patience_cnt += 1 101 | 102 | if patience_cnt == args.patience: 103 | break 104 | 105 | files = glob.glob('*.pth') 106 | for f in files: 107 | epoch_nb = int(f.split('.')[0]) 108 | if epoch_nb < best_epoch: 109 | os.remove(f) 110 | 111 | files = glob.glob('*.pth') 112 | for f in files: 113 | epoch_nb = int(f.split('.')[0]) 114 | if epoch_nb > best_epoch: 115 | os.remove(f) 116 | print('Optimization Finished! 
Total time elapsed: {:.6f}'.format(time.time() - t)) 117 | 118 | return best_epoch 119 | 120 | 121 | def compute_test(loader, which="val"): 122 | model.eval() 123 | correct = 0.0 124 | loss_test = 0.0 125 | for data in loader: 126 | data = data.to(args.device) 127 | out = model(data) 128 | pred = out.max(dim=1)[1] 129 | correct += pred.eq(data.y).sum().item() 130 | loss_test += F.nll_loss(out, data.y).item() 131 | return correct / len(loader.dataset), loss_test 132 | 133 | 134 | if __name__ == '__main__': 135 | # Model training 136 | print("here inside main") 137 | best_model = train() 138 | print("here inside main") 139 | print("done") 140 | # Restore best model for test set 141 | model.load_state_dict(torch.load('{}.pth'.format(best_model))) 142 | test_acc, test_loss = compute_test(test_loader,"test") 143 | print('Test set results, loss = {:.6f}, accuracy = {:.6f}'.format(test_loss, test_acc)) 144 | 145 | -------------------------------------------------------------------------------- /HGP-SL/main.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | import glob 3 | import os 4 | import time 5 | 6 | import torch 7 | import torch.nn.functional as F 8 | from models import Model 9 | from torch.utils.data import random_split 10 | from torch_geometric.data import DataLoader 11 | from torch_geometric.datasets import TUDataset 12 | 13 | parser = argparse.ArgumentParser() 14 | 15 | parser.add_argument('--seed', type=int, default=777, help='random seed') 16 | parser.add_argument('--batch_size', type=int, default=512, help='batch size') 17 | parser.add_argument('--lr', type=float, default=0.001, help='learning rate') 18 | parser.add_argument('--weight_decay', type=float, default=0.001, help='weight decay') 19 | parser.add_argument('--nhid', type=int, default=128, help='hidden size') 20 | parser.add_argument('--sample_neighbor', type=bool, default=True, help='whether sample neighbors') 21 | parser.add_argument('--sparse_attention', type=bool, default=False, help='whether use sparse attention') 22 | parser.add_argument('--structure_learning', type=bool, default=False, help='whether perform structure learning') 23 | parser.add_argument('--pooling_ratio', type=float, default=0.5, help='pooling ratio') 24 | parser.add_argument('--dropout_ratio', type=float, default=0.4, help='dropout ratio') 25 | parser.add_argument('--lamb', type=float, default=1.0, help='trade-off parameter') 26 | parser.add_argument('--dataset', type=str, default='PROTEINS', help='DD/PROTEINS/NCI1/NCI109/Mutagenicity/ENZYMES') 27 | parser.add_argument('--device', type=str, default='cuda:0', help='specify cuda devices') 28 | parser.add_argument('--epochs', type=int, default=1000, help='maximum number of epochs') 29 | parser.add_argument('--patience', type=int, default=100, help='patience for early stopping') 30 | 31 | args = parser.parse_args() 32 | torch.manual_seed(args.seed) 33 | if torch.cuda.is_available(): 34 | print("CUDA") 35 | torch.cuda.manual_seed(args.seed) 36 | 37 | 38 | dataset = TUDataset("/home/anoopkumar/Brain/HGP-SL/data", name=args.dataset, use_node_attr=False, use_edge_attr=True) 39 | 40 | 41 | args.num_classes = dataset.num_classes 42 | args.num_features = dataset.num_features 43 | 44 | print(args) 45 | 46 | print("len of the dataset ######## ",len(dataset)) 47 | 48 | num_training = int(len(dataset) * 0.80) 49 | num_val = int(len(dataset) * 0.1) 50 | num_test = len(dataset) - (num_training + num_val) 51 | training_set, validation_set, test_set = 
random_split(dataset, [num_training, num_val, num_test]) 52 | print(" train len", len(training_set)) 53 | print("val len", len(validation_set)) 54 | print("test len", len(test_set)) 55 | print("random split") 56 | 57 | train_loader = DataLoader(training_set, batch_size=args.batch_size, shuffle=True) 58 | val_loader = DataLoader(validation_set, batch_size=args.batch_size, shuffle=True) 59 | test_loader = DataLoader(test_set, batch_size=args.batch_size, shuffle=True) 60 | 61 | model = Model(args).to(args.device) 62 | 63 | # model.load_state_dict(torch.load("pre_trained")) 64 | # print("model loaded ################") 65 | optimizer = torch.optim.Adam(model.parameters(), lr=args.lr, weight_decay=args.weight_decay) 66 | 67 | 68 | def train(): 69 | min_loss = 1e10 70 | patience_cnt = 0 71 | val_loss_values = [] 72 | best_epoch = 0 73 | 74 | t = time.time() 75 | model.train() 76 | for epoch in range(args.epochs): 77 | loss_train = 0.0 78 | correct = 0 79 | for i, data in enumerate(train_loader): 80 | optimizer.zero_grad() 81 | data = data.to(args.device) 82 | out = model(data) 83 | loss = F.nll_loss(out, data.y) 84 | loss.backward() 85 | optimizer.step() 86 | loss_train += loss.item() 87 | pred = out.max(dim=1)[1] 88 | correct += pred.eq(data.y).sum().item() 89 | acc_train = correct / len(train_loader.dataset) 90 | acc_val, loss_val = compute_test(val_loader) 91 | print('Epoch: {:04d}'.format(epoch + 1), 'loss_train: {:.6f}'.format(loss_train), 92 | 'acc_train: {:.6f}'.format(acc_train), 'loss_val: {:.6f}'.format(loss_val), 93 | 'acc_val: {:.6f}'.format(acc_val), 'time: {:.6f}s'.format(time.time() - t)) 94 | 95 | val_loss_values.append(loss_val) 96 | torch.save(model.state_dict(), '{}.pth'.format(epoch)) 97 | # torch.save(model.state_dict(), "pre_trained") 98 | 99 | if val_loss_values[-1] < min_loss: 100 | min_loss = val_loss_values[-1] 101 | best_epoch = epoch 102 | patience_cnt = 0 103 | else: 104 | patience_cnt += 1 105 | 106 | if patience_cnt == args.patience: 107 | break 108 | 109 | files = glob.glob('*.pth') 110 | for f in files: 111 | epoch_nb = int(f.split('.')[0]) 112 | if epoch_nb < best_epoch: 113 | os.remove(f) 114 | 115 | files = glob.glob('*.pth') 116 | for f in files: 117 | epoch_nb = int(f.split('.')[0]) 118 | if epoch_nb > best_epoch: 119 | os.remove(f) 120 | print('Optimization Finished! 
Total time elapsed: {:.6f}'.format(time.time() - t)) 121 | 122 | return best_epoch 123 | 124 | 125 | def compute_test(loader, which="val"): 126 | model.eval() 127 | correct = 0.0 128 | loss_test = 0.0 129 | testResult = [] 130 | asli = [] 131 | for data in loader: 132 | data = data.to(args.device) 133 | out = model(data) 134 | pred = out.max(dim=1)[1] 135 | testResult.append(pred) 136 | asli.append(data.y) 137 | correct += pred.eq(data.y).sum().item() 138 | loss_test += F.nll_loss(out, data.y).item() 139 | # print("test results", testResult) 140 | # print("asli results", asli) 141 | return correct / len(loader.dataset), loss_test 142 | 143 | 144 | if __name__ == '__main__': 145 | best_model = train() 146 | # Restore best model for test set 147 | model.load_state_dict(torch.load('{}.pth'.format(best_model))) 148 | print("ALL DONE") 149 | test_acc, test_loss = compute_test(test_loader,"test") 150 | print('Test set results, loss = {:.6f}, accuracy = {:.6f}'.format(test_loss, test_acc)) 151 | 152 | -------------------------------------------------------------------------------- /ADNI_with_Abide_TL.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "# !pwd\n", 10 | "import os\n", 11 | "import pandas as pd\n", 12 | "import numpy as np" 13 | ] 14 | }, 15 | { 16 | "cell_type": "code", 17 | "execution_count": null, 18 | "metadata": {}, 19 | "outputs": [], 20 | "source": [ 21 | "all_sub_adjMat = np.load(\"corr_mat_abide_605.npy\")\n", 22 | "sub_names = np.load(\"filename_abide_605.npy\", allow_pickle= True)\n", 23 | "print(len(all_sub_adjMat), len(sub_names))" 24 | ] 25 | }, 26 | { 27 | "cell_type": "code", 28 | "execution_count": null, 29 | "metadata": {}, 30 | "outputs": [], 31 | "source": [ 32 | "with open('Phenotypic_V1_0b_preprocessed1.csv') as f:\n", 33 | " content = f.readlines()\n", 34 | "content = [x.strip() for x in content]\n", 35 | "content.pop(-1)\n", 36 | "print(len(content))" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "execution_count": null, 42 | "metadata": {}, 43 | "outputs": [], 44 | "source": [ 45 | "df = pd.read_csv('Phenotypic_V1_0b_preprocessed1.csv') # can also index sheet by name or fetch all sheets\n", 46 | "label = df['DX_GROUP'].tolist()\n", 47 | "subject = df['FILE_ID'].tolist()\n", 48 | "print(len(label),label[:5], subject[:5])" 49 | ] 50 | }, 51 | { 52 | "cell_type": "code", 53 | "execution_count": null, 54 | "metadata": {}, 55 | "outputs": [], 56 | "source": [ 57 | "subnames = [str(i)[:-14] for i in sub_names]\n", 58 | "print(len(subnames), subnames[:5])" 59 | ] 60 | }, 61 | { 62 | "cell_type": "code", 63 | "execution_count": null, 64 | "metadata": {}, 65 | "outputs": [], 66 | "source": [ 67 | "adni90_feature = []\n", 68 | "adni90_labels = []\n", 69 | "adni90_adjmat = []\n", 70 | "# edge_fea = []\n", 71 | "\n", 72 | "for x,i in enumerate(subnames):\n", 73 | " ind = [n for n, l in enumerate(subject) if l.startswith(i)]\n", 74 | "# ind = subject.index(i)\n", 75 | " adni90_labels.append(label[ind[0]])\n", 76 | " adni90_feature.append(all_sub_adjMat[x])\n", 77 | "\n", 78 | "\n", 79 | "for i in adni90_feature:\n", 80 | " temp = np.zeros((90, 90))\n", 81 | " for x in range(90):\n", 82 | " for y in range(90):\n", 83 | " if abs(i[x][y]) >= 0.15:\n", 84 | " temp[x][y] = 1\n", 85 | "# edge_fea.append(i[x][y])\n", 86 | "# else:\n", 87 | "# edge_fea.append(-999)\n", 88 | " 
adni90_adjmat.append(list(temp))\n", 89 | "\n", 90 | " \n", 91 | " \n", 92 | "print(len(adni90_labels), len(adni90_adjmat), len(adni90_feature))\n", 93 | "print(np.array(adni90_adjmat[0]).shape)" 94 | ] 95 | }, 96 | { 97 | "cell_type": "code", 98 | "execution_count": null, 99 | "metadata": {}, 100 | "outputs": [], 101 | "source": [ 102 | "avg = []\n", 103 | "for k in range(605):\n", 104 | " count = 0\n", 105 | " for j in adni90_adjmat[k]:\n", 106 | "# print(len(j))\n", 107 | " for i in j:\n", 108 | " if i == 1:\n", 109 | " count += 1\n", 110 | "\n", 111 | " avg.append(count)\n", 112 | "print(sum(avg)/len(avg))\n" 113 | ] 114 | }, 115 | { 116 | "cell_type": "code", 117 | "execution_count": null, 118 | "metadata": {}, 119 | "outputs": [], 120 | "source": [ 121 | "# # choose the classification type\n", 122 | "\n", 123 | "lab1 = []\n", 124 | "exp1 = []\n", 125 | "fea1 = []\n", 126 | "\n", 127 | "stop = 0\n", 128 | "\n", 129 | "for c,i in enumerate(adni90_labels):\n", 130 | " if i == 1:\n", 131 | " lab1.append(1)\n", 132 | " exp1.append(adni90_adjmat[c])\n", 133 | " \n", 134 | " else:\n", 135 | " lab1.append(2)\n", 136 | " exp1.append(adni90_adjmat[c])\n", 137 | " \n", 138 | "print(len(exp1), len(fea1), len(lab1))" 139 | ] 140 | }, 141 | { 142 | "cell_type": "code", 143 | "execution_count": null, 144 | "metadata": {}, 145 | "outputs": [], 146 | "source": [ 147 | "# for ADNI_A.txt\n", 148 | "\n", 149 | "edgeFea = []\n", 150 | "count = -1\n", 151 | "adni_a = []\n", 152 | "for i, x in enumerate(exp1):\n", 153 | " temp = []\n", 154 | " for a in range(len(x)):\n", 155 | " for b in range(len(x[0])):\n", 156 | " count += 1\n", 157 | " if a < b and x[a][b] == 1:\n", 158 | " temp.append((a+1+i*90,b+1+i*90))\n", 159 | "# edgeFea.append(edge_fea[count])\n", 160 | "\n", 161 | " adni_a.extend(temp)\n", 162 | "\n", 163 | "print(len(adni_a))\n", 164 | "print(adni_a[-10:])\n", 165 | "\n", 166 | "\n", 167 | "with open(\"/home/anoopkumar/Brain/HGP-SL/data/ABIDE/raw/ABIDE_A.txt\",\"w\") as f:\n", 168 | " for item in adni_a:\n", 169 | " f.write(\"%s, %s\\n\" % (item[0],item[1]))\n", 170 | "\n", 171 | "# print(len(edgeFea))" 172 | ] 173 | }, 174 | { 175 | "cell_type": "code", 176 | "execution_count": null, 177 | "metadata": {}, 178 | "outputs": [], 179 | "source": [ 180 | "print(len(exp1))" 181 | ] 182 | }, 183 | { 184 | "cell_type": "code", 185 | "execution_count": null, 186 | "metadata": {}, 187 | "outputs": [], 188 | "source": [ 189 | "# ADNI_graph_indicator.txt\n", 190 | "\n", 191 | "node_graph_indicator = []\n", 192 | "for i in range(1,len(exp1)+1):\n", 193 | " node_graph_indicator.extend([i for j in range(90)])\n", 194 | "\n", 195 | "with open(\"/home/anoopkumar/Brain/HGP-SL/data/ABIDE/raw/ABIDE_graph_indicator.txt\",\"w\") as f:\n", 196 | " for item in node_graph_indicator:\n", 197 | " f.write(\"%s\\n\" % item) " 198 | ] 199 | }, 200 | { 201 | "cell_type": "code", 202 | "execution_count": null, 203 | "metadata": {}, 204 | "outputs": [], 205 | "source": [ 206 | "# ADNI_graph_labels.txt\n", 207 | "\n", 208 | "with open(\"/home/anoopkumar/Brain/HGP-SL/data/ABIDE/raw/ABIDE_graph_labels.txt\",\"w\") as f:\n", 209 | " for item in lab1:\n", 210 | " f.write(\"%s\\n\" % item) " 211 | ] 212 | }, 213 | { 214 | "cell_type": "code", 215 | "execution_count": null, 216 | "metadata": {}, 217 | "outputs": [], 218 | "source": [ 219 | "# ADNI node labels.txt\n", 220 | "\n", 221 | "\n", 222 | "\n", 223 | "node_labels = []\n", 224 | "\n", 225 | "for i in range(len(exp1)):\n", 226 | " for j in range(90):\n", 227 | " 
node_labels.append(j)\n", 228 | "print(len(node_labels), len(node_graph_indicator)) \n", 229 | "with open(\"/home/anoopkumar/Brain/HGP-SL/data/ABIDE/raw/ABIDE_node_labels.txt\",\"w\") as f:\n", 230 | " for item in node_labels:\n", 231 | " f.write(\"%s\\n\" % item) " 232 | ] 233 | }, 234 | { 235 | "cell_type": "code", 236 | "execution_count": null, 237 | "metadata": {}, 238 | "outputs": [], 239 | "source": [ 240 | "import torch\n", 241 | "import numpy as np\n", 242 | "import pandas as pd\n", 243 | "import os\n", 244 | "import re\n", 245 | "import sklearn" 246 | ] 247 | }, 248 | { 249 | "cell_type": "code", 250 | "execution_count": null, 251 | "metadata": {}, 252 | "outputs": [], 253 | "source": [ 254 | "print(torch.__version__, torch.cuda.is_available())" 255 | ] 256 | }, 257 | { 258 | "cell_type": "code", 259 | "execution_count": null, 260 | "metadata": {}, 261 | "outputs": [], 262 | "source": [ 263 | "!which pip\n", 264 | "\n", 265 | "os.chdir(\"/home/anoopkumar/Brain/HGP-SL\")\n", 266 | "!pwd" 267 | ] 268 | }, 269 | { 270 | "cell_type": "code", 271 | "execution_count": null, 272 | "metadata": {}, 273 | "outputs": [], 274 | "source": [ 275 | "!rm -rf /home/anoopkumar/Brain/HGP-SL/data/ABIDE/processed\n", 276 | "!python main.py --epochs 50 --dataset ABIDE --batch_size 64 --lr 0.0001 --dropout_ratio 0.8 --nhid 128\n", 277 | "# --sparse_attention False --structure_learning False" 278 | ] 279 | }, 280 | { 281 | "cell_type": "code", 282 | "execution_count": null, 283 | "metadata": {}, 284 | "outputs": [], 285 | "source": [ 286 | "############### load learned weights and use it to train adni #######################" 287 | ] 288 | }, 289 | { 290 | "cell_type": "code", 291 | "execution_count": null, 292 | "metadata": {}, 293 | "outputs": [], 294 | "source": [ 295 | "!rm -rf /home/anoopkumar/Brain/HGP-SL/data/ADNI_612/processed\n", 296 | "!python mainTL.py --epochs 20 --dataset ADNI_612 --batch_size 8 --lr 0.0001 --dropout_ratio 0.5 --nhid 256 --sparse_attention False --structure_learning False" 297 | ] 298 | } 299 | ], 300 | "metadata": { 301 | "kernelspec": { 302 | "display_name": "Python 3", 303 | "language": "python", 304 | "name": "python3" 305 | }, 306 | "language_info": { 307 | "codemirror_mode": { 308 | "name": "ipython", 309 | "version": 3 310 | }, 311 | "file_extension": ".py", 312 | "mimetype": "text/x-python", 313 | "name": "python", 314 | "nbconvert_exporter": "python", 315 | "pygments_lexer": "ipython3", 316 | "version": "3.7.6" 317 | } 318 | }, 319 | "nbformat": 4, 320 | "nbformat_minor": 2 321 | } 322 | -------------------------------------------------------------------------------- /Node2vec.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "import pandas as pd\n", 10 | "import pickle\n", 11 | "import numpy as np\n", 12 | "from stellargraph import StellarGraph\n", 13 | "import os\n", 14 | "import networkx as nx\n", 15 | "import numpy as np\n", 16 | "import pandas as pd\n", 17 | "\n", 18 | "from stellargraph.data import BiasedRandomWalk\n", 19 | "from stellargraph import StellarGraph" 20 | ] 21 | }, 22 | { 23 | "cell_type": "code", 24 | "execution_count": null, 25 | "metadata": {}, 26 | "outputs": [], 27 | "source": [ 28 | "os.chdir('/home/anoopkumar/')\n", 29 | "all_sub_adj_mat = []\n", 30 | "all_sub_label = []\n", 31 | "\n", 32 | "with open ('all_subject_adj_mat.pkl','rb') as f1, 
open('all_subject_label.pkl','rb') as f2:\n", 33 | " all_sub_adj_mat = pickle.load(f1)\n", 34 | " all_sub_label = pickle.load(f2)" 35 | ] 36 | }, 37 | { 38 | "cell_type": "code", 39 | "execution_count": null, 40 | "metadata": {}, 41 | "outputs": [], 42 | "source": [ 43 | "newall = []\n", 44 | "for sub in all_sub_adj_mat:\n", 45 | " newsub = []\n", 46 | " for row in sub:\n", 47 | " row = np.append(row,1)\n", 48 | " newsub.append(row)\n", 49 | " newsub = np.vstack([newsub,[1 for i in range(91)]])\n", 50 | " newall.append(newsub)\n", 51 | "\n" 52 | ] 53 | }, 54 | { 55 | "cell_type": "code", 56 | "execution_count": null, 57 | "metadata": {}, 58 | "outputs": [], 59 | "source": [ 60 | "print(len(all_sub_label))\n", 61 | "print(len(all_sub_adj_mat))\n", 62 | "print(len(all_sub_adj_mat[0]))\n", 63 | "print(len(all_sub_adj_mat[0][0]))\n", 64 | "print(len(newall))\n", 65 | "print(len(newall[0]))\n", 66 | "print(len(newall[0][0]))" 67 | ] 68 | }, 69 | { 70 | "cell_type": "code", 71 | "execution_count": null, 72 | "metadata": {}, 73 | "outputs": [], 74 | "source": [ 75 | "all_sg = [[] for i in range(160)]" 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": null, 81 | "metadata": {}, 82 | "outputs": [], 83 | "source": [ 84 | "for count, sub in enumerate(newall):\n", 85 | " np.fill_diagonal(sub, np.nan)\n", 86 | " df = pd.DataFrame(sub)\n", 87 | " df = df.stack().reset_index()\n", 88 | " pairs = list(zip(df['level_0'], df['level_1']))\n", 89 | " for i in pairs:\n", 90 | " a, b = i\n", 91 | " if (b,a) not in all_sg[count]:\n", 92 | " all_sg[count].append(i)" 93 | ] 94 | }, 95 | { 96 | "cell_type": "code", 97 | "execution_count": null, 98 | "metadata": {}, 99 | "outputs": [], 100 | "source": [ 101 | "all_sg_lr = [[] for _ in range(160)]\n", 102 | "\n", 103 | "for i,x in enumerate(all_sg):\n", 104 | " all_sg_lr[i] = [[],[]]\n", 105 | " for pair in x:\n", 106 | " a, b = pair\n", 107 | " all_sg_lr[i][0].append(a)\n", 108 | " all_sg_lr[i][1].append(b)" 109 | ] 110 | }, 111 | { 112 | "cell_type": "code", 113 | "execution_count": null, 114 | "metadata": {}, 115 | "outputs": [], 116 | "source": [ 117 | "graphs = []\n", 118 | "\n", 119 | "for one_sglr in all_sg_lr:\n", 120 | " dic = {\"source\": one_sglr[0],\"target\":one_sglr[1]}\n", 121 | " one = pd.DataFrame(dic)\n", 122 | " graphs.append(StellarGraph(edges = one))" 123 | ] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "execution_count": null, 128 | "metadata": {}, 129 | "outputs": [], 130 | "source": [ 131 | "print(graphs[0].info()) \n", 132 | "G = graphs" 133 | ] 134 | }, 135 | { 136 | "cell_type": "code", 137 | "execution_count": null, 138 | "metadata": {}, 139 | "outputs": [], 140 | "source": [ 141 | "allwalk = []\n", 142 | "for sub in G:\n", 143 | " rw = BiasedRandomWalk(sub)\n", 144 | " walks = rw.run(\n", 145 | " nodes=list(sub.nodes()), # root nodes\n", 146 | " length=100, # maximum length of a random walk\n", 147 | " n=10, # number of random walks per root node\n", 148 | " p=0.5, # Defines (unormalised) probability, 1/p, of returning to source node\n", 149 | " q=2.0, # Defines (unormalised) probability, 1/q, for moving away from source node\n", 150 | " )\n", 151 | " allwalk.append(walks)\n", 152 | "# print(\"Number of random walks: {}\".format(len(walks)))\n" 153 | ] 154 | }, 155 | { 156 | "cell_type": "code", 157 | "execution_count": null, 158 | "metadata": {}, 159 | "outputs": [], 160 | "source": [ 161 | "print(len(allwalk))" 162 | ] 163 | }, 164 | { 165 | "cell_type": "code", 166 | "execution_count": null, 167 | "metadata": 
{}, 168 | "outputs": [], 169 | "source": [ 170 | "modelvecs = []\n", 171 | "from gensim.models import Word2Vec\n", 172 | "\n", 173 | "for walks in allwalk:\n", 174 | " str_walks = [[str(n) for n in walk] for walk in walks]\n", 175 | " model = Word2Vec(str_walks, size=80, window=5, min_count=0, sg=1, workers=2, iter=1)\n", 176 | " modelvecs.append(model)" 177 | ] 178 | }, 179 | { 180 | "cell_type": "code", 181 | "execution_count": null, 182 | "metadata": {}, 183 | "outputs": [], 184 | "source": [ 185 | "graphembeds = []\n", 186 | "\n", 187 | "for vecs in modelvecs:\n", 188 | " node_embeds = (vecs.wv.vectors)\n", 189 | " print(len(node_embeds))\n", 190 | " break" 191 | ] 192 | }, 193 | { 194 | "cell_type": "code", 195 | "execution_count": null, 196 | "metadata": {}, 197 | "outputs": [], 198 | "source": [ 199 | "supernode = []\n", 200 | "avgnode = []\n", 201 | "\n", 202 | "for vecs in modelvecs:\n", 203 | " node_embeds = (vecs.wv.vectors)\n", 204 | " supernode.append(node_embeds[-1])\n", 205 | " avgnode.append(node_embeds.mean(axis = 0)) " 206 | ] 207 | }, 208 | { 209 | "cell_type": "code", 210 | "execution_count": null, 211 | "metadata": {}, 212 | "outputs": [], 213 | "source": [ 214 | "print(len(supernode),len(avgnode))\n", 215 | "graph_labels = []\n", 216 | "rem_ad = []\n", 217 | "for c,i in enumerate(all_sub_label):\n", 218 | " if i == 'CN':\n", 219 | " graph_labels.append(1)\n", 220 | " elif i == 'MCI' or i == 'EMCI' or i == 'LMCI':\n", 221 | " graph_labels.append(-1)\n", 222 | " else:\n", 223 | " rem_ad.append(c)\n", 224 | "\n", 225 | "exp1 = []\n", 226 | "for i in range(160):\n", 227 | " if i not in rem_ad:\n", 228 | " exp1.append(supernode[i])\n", 229 | " \n", 230 | "print(len(exp1), len (graph_labels))" 231 | ] 232 | }, 233 | { 234 | "cell_type": "code", 235 | "execution_count": null, 236 | "metadata": {}, 237 | "outputs": [], 238 | "source": [ 239 | "############# supernode check #############################################" 240 | ] 241 | }, 242 | { 243 | "cell_type": "code", 244 | "execution_count": null, 245 | "metadata": {}, 246 | "outputs": [], 247 | "source": [ 248 | "data = exp1\n", 249 | "label = graph_labels\n", 250 | "print(len(data),len(label))" 251 | ] 252 | }, 253 | { 254 | "cell_type": "code", 255 | "execution_count": null, 256 | "metadata": {}, 257 | "outputs": [], 258 | "source": [ 259 | "pd.DataFrame(label).value_counts().to_frame()" 260 | ] 261 | }, 262 | { 263 | "cell_type": "code", 264 | "execution_count": null, 265 | "metadata": {}, 266 | "outputs": [], 267 | "source": [ 268 | "from sklearn.model_selection import cross_validate\n", 269 | "from sklearn.neural_network import MLPClassifier\n", 270 | "from sklearn.pipeline import make_pipeline\n", 271 | "from sklearn.preprocessing import StandardScaler\n", 272 | "from sklearn.ensemble import GradientBoostingClassifier\n", 273 | "from sklearn.svm import SVC\n", 274 | "clf1 = SVC(gamma='auto')\n", 275 | "# clf2 =make_pipeline(StandardScaler(), MLPClassifier(solver='lbfgs', alpha=1e-5, hidden_layer_sizes=(16, 4), random_state=1))\n", 276 | "clf3 = GradientBoostingClassifier(n_estimators=200, learning_rate=1.0,max_depth=5, random_state=0)\n", 277 | "\n", 278 | "X = np.array(data)\n", 279 | "y = np.array(label)\n" 280 | ] 281 | }, 282 | { 283 | "cell_type": "code", 284 | "execution_count": null, 285 | "metadata": {}, 286 | "outputs": [], 287 | "source": [ 288 | "cv_results = cross_validate(clf1, X, y, cv=40, scoring = ['accuracy','roc_auc','balanced_accuracy'])\n", 289 | "# cv_results2 = cross_validate(clf2, X, y, 
cv=40)\n", 290 | "cv_results3 = cross_validate(clf3, X, y, cv=40,scoring = ['accuracy','roc_auc','balanced_accuracy'])" 291 | ] 292 | }, 293 | { 294 | "cell_type": "code", 295 | "execution_count": null, 296 | "metadata": {}, 297 | "outputs": [], 298 | "source": [ 299 | "print(np.mean(cv_results['test_roc_auc']))\n", 300 | "# print((cv_results['test_score']))\n", 301 | "print(np.mean(cv_results3['test_roc_auc']))\n", 302 | "print(np.mean(cv_results3['test_balanced_accuracy']))\n" 303 | ] 304 | }, 305 | { 306 | "cell_type": "code", 307 | "execution_count": null, 308 | "metadata": {}, 309 | "outputs": [], 310 | "source": [] 311 | } 312 | ], 313 | "metadata": { 314 | "kernelspec": { 315 | "display_name": "Python 3", 316 | "language": "python", 317 | "name": "python3" 318 | }, 319 | "language_info": { 320 | "codemirror_mode": { 321 | "name": "ipython", 322 | "version": 3 323 | }, 324 | "file_extension": ".py", 325 | "mimetype": "text/x-python", 326 | "name": "python", 327 | "nbconvert_exporter": "python", 328 | "pygments_lexer": "ipython3", 329 | "version": "3.7.6" 330 | } 331 | }, 332 | "nbformat": 4, 333 | "nbformat_minor": 4 334 | } 335 | -------------------------------------------------------------------------------- /HGP-SL/layers.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn as nn 3 | import torch.nn.functional as F 4 | from sparse_softmax import Sparsemax 5 | from torch.nn import Parameter 6 | from torch_geometric.data import Data 7 | from torch_geometric.nn.conv import MessagePassing 8 | from torch_geometric.nn.pool.topk_pool import topk, filter_adj 9 | from torch_geometric.utils import softmax, dense_to_sparse, add_remaining_self_loops 10 | from torch_scatter import scatter_add 11 | from torch_sparse import spspmm, coalesce 12 | 13 | 14 | class TwoHopNeighborhood(object): 15 | def __call__(self, data): 16 | edge_index, edge_attr = data.edge_index, data.edge_attr 17 | n = data.num_nodes 18 | 19 | value = edge_index.new_ones((edge_index.size(1),), dtype=torch.float) 20 | 21 | index, value = spspmm(edge_index, value, edge_index, value, n, n, n) 22 | value.fill_(0) 23 | 24 | edge_index = torch.cat([edge_index, index], dim=1) 25 | if edge_attr is None: 26 | data.edge_index, _ = coalesce(edge_index, None, n, n) 27 | else: 28 | value = value.view(-1, *[1 for _ in range(edge_attr.dim() - 1)]) 29 | value = value.expand(-1, *list(edge_attr.size())[1:]) 30 | edge_attr = torch.cat([edge_attr, value], dim=0) 31 | data.edge_index, edge_attr = coalesce(edge_index, edge_attr, n, n) 32 | data.edge_attr = edge_attr 33 | 34 | return data 35 | 36 | def __repr__(self): 37 | return '{}()'.format(self.__class__.__name__) 38 | 39 | 40 | class GCN(MessagePassing): 41 | def __init__(self, in_channels, out_channels, cached=False, bias=True, **kwargs): 42 | super(GCN, self).__init__(aggr='add', **kwargs) 43 | 44 | self.in_channels = in_channels 45 | self.out_channels = out_channels 46 | self.cached = cached 47 | self.cached_result = None 48 | self.cached_num_edges = None 49 | 50 | self.weight = Parameter(torch.Tensor(in_channels, out_channels)) 51 | nn.init.xavier_uniform_(self.weight.data) 52 | 53 | if bias: 54 | self.bias = Parameter(torch.Tensor(out_channels)) 55 | nn.init.zeros_(self.bias.data) 56 | else: 57 | self.register_parameter('bias', None) 58 | 59 | self.reset_parameters() 60 | 61 | def reset_parameters(self): 62 | self.cached_result = None 63 | self.cached_num_edges = None 64 | 65 | @staticmethod 66 | def norm(edge_index, 
num_nodes, edge_weight, dtype=None): 67 | if edge_weight is None: 68 | edge_weight = torch.ones((edge_index.size(1),), dtype=dtype, device=edge_index.device) 69 | 70 | row, col = edge_index 71 | deg = scatter_add(edge_weight, row, dim=0, dim_size=num_nodes) 72 | deg_inv_sqrt = deg.pow(-0.5) 73 | deg_inv_sqrt[deg_inv_sqrt == float('inf')] = 0 74 | 75 | return edge_index, deg_inv_sqrt[row] * edge_weight * deg_inv_sqrt[col] 76 | 77 | def forward(self, x, edge_index, edge_weight=None): 78 | x = torch.matmul(x, self.weight) 79 | 80 | if self.cached and self.cached_result is not None: 81 | if edge_index.size(1) != self.cached_num_edges: 82 | raise RuntimeError( 83 | 'Cached {} number of edges, but found {}'.format(self.cached_num_edges, edge_index.size(1))) 84 | 85 | if not self.cached or self.cached_result is None: 86 | self.cached_num_edges = edge_index.size(1) 87 | edge_index, norm = self.norm(edge_index, x.size(0), edge_weight, x.dtype) 88 | self.cached_result = edge_index, norm 89 | 90 | edge_index, norm = self.cached_result 91 | 92 | return self.propagate(edge_index, x=x, norm=norm) 93 | 94 | def message(self, x_j, norm): 95 | return norm.view(-1, 1) * x_j 96 | 97 | def update(self, aggr_out): 98 | if self.bias is not None: 99 | aggr_out = aggr_out + self.bias 100 | return aggr_out 101 | 102 | def __repr__(self): 103 | return '{}({}, {})'.format(self.__class__.__name__, self.in_channels, self.out_channels) 104 | 105 | 106 | class NodeInformationScore(MessagePassing): 107 | def __init__(self, improved=False, cached=False, **kwargs): 108 | super(NodeInformationScore, self).__init__(aggr='add', **kwargs) 109 | 110 | self.improved = improved 111 | self.cached = cached 112 | self.cached_result = None 113 | self.cached_num_edges = None 114 | 115 | @staticmethod 116 | def norm(edge_index, num_nodes, edge_weight, dtype=None): 117 | if edge_weight is None: 118 | edge_weight = torch.ones((edge_index.size(1),), dtype=dtype, device=edge_index.device) 119 | 120 | row, col = edge_index 121 | deg = scatter_add(edge_weight, row, dim=0, dim_size=num_nodes) 122 | deg_inv_sqrt = deg.pow(-0.5) 123 | deg_inv_sqrt[deg_inv_sqrt == float('inf')] = 0 124 | 125 | edge_index, edge_weight = add_remaining_self_loops(edge_index, edge_weight, 0, num_nodes) 126 | 127 | row, col = edge_index 128 | expand_deg = torch.zeros((edge_weight.size(0),), dtype=dtype, device=edge_index.device) 129 | expand_deg[-num_nodes:] = torch.ones((num_nodes,), dtype=dtype, device=edge_index.device) 130 | 131 | return edge_index, expand_deg - deg_inv_sqrt[row] * edge_weight * deg_inv_sqrt[col] 132 | 133 | def forward(self, x, edge_index, edge_weight): 134 | if self.cached and self.cached_result is not None: 135 | if edge_index.size(1) != self.cached_num_edges: 136 | raise RuntimeError( 137 | 'Cached {} number of edges, but found {}'.format(self.cached_num_edges, edge_index.size(1))) 138 | 139 | if not self.cached or self.cached_result is None: 140 | self.cached_num_edges = edge_index.size(1) 141 | edge_index, norm = self.norm(edge_index, x.size(0), edge_weight, x.dtype) 142 | self.cached_result = edge_index, norm 143 | 144 | edge_index, norm = self.cached_result 145 | 146 | return self.propagate(edge_index, x=x, norm=norm) 147 | 148 | def message(self, x_j, norm): 149 | return norm.view(-1, 1) * x_j 150 | 151 | def update(self, aggr_out): 152 | return aggr_out 153 | 154 | 155 | class HGPSLPool(torch.nn.Module): 156 | def __init__(self, in_channels, ratio=0.8, sample=False, sparse=False, sl=True, lamb=1.0, negative_slop=0.2): 157 | 
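        # Constructor arguments, as they are used below:
        #   in_channels   - dimensionality of the node features entering the pooling layer
        #   ratio         - fraction of nodes kept by the top-k information-score pooling
        #   sample        - if True, learn edge weights only among sampled k-hop neighbours (fast mode for large graphs)
        #   sparse        - if True, normalise the learned edge weights with sparsemax instead of softmax
        #   sl            - if False, skip structure learning and return the induced subgraph as-is
        #   lamb          - trade-off weight for the original (induced) edge attributes added to the learned scores
        #   negative_slop - negative slope of the leaky ReLU applied to the attention scores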
super(HGPSLPool, self).__init__() 158 | self.in_channels = in_channels 159 | self.ratio = ratio 160 | self.sample = sample 161 | self.sparse = sparse 162 | self.sl = sl 163 | self.negative_slop = negative_slop 164 | self.lamb = lamb 165 | 166 | self.att = Parameter(torch.Tensor(1, self.in_channels * 2)) 167 | nn.init.xavier_uniform_(self.att.data) 168 | self.sparse_attention = Sparsemax() 169 | self.neighbor_augment = TwoHopNeighborhood() 170 | self.calc_information_score = NodeInformationScore() 171 | 172 | def forward(self, x, edge_index, edge_attr, batch=None): 173 | if batch is None: 174 | batch = edge_index.new_zeros(x.size(0)) 175 | 176 | x_information_score = self.calc_information_score(x, edge_index, edge_attr) 177 | score = torch.sum(torch.abs(x_information_score), dim=1) 178 | 179 | # Graph Pooling 180 | original_x = x 181 | perm = topk(score, self.ratio, batch) 182 | x = x[perm] 183 | batch = batch[perm] 184 | induced_edge_index, induced_edge_attr = filter_adj(edge_index, edge_attr, perm, num_nodes=score.size(0)) 185 | 186 | # Discard structure learning layer, directly return 187 | if self.sl is False: 188 | return x, induced_edge_index, induced_edge_attr, batch 189 | 190 | # Structure Learning 191 | if self.sample: 192 | # A fast mode for large graphs. 193 | # In large graphs, learning the possible edge weights between each pair of nodes is time consuming. 194 | # To accelerate this process, we sample it's K-Hop neighbors for each node and then learn the 195 | # edge weights between them. 196 | k_hop = 3 197 | if edge_attr is None: 198 | edge_attr = torch.ones((edge_index.size(1),), dtype=torch.float, device=edge_index.device) 199 | 200 | hop_data = Data(x=original_x, edge_index=edge_index, edge_attr=edge_attr) 201 | for _ in range(k_hop - 1): 202 | hop_data = self.neighbor_augment(hop_data) 203 | hop_edge_index = hop_data.edge_index 204 | hop_edge_attr = hop_data.edge_attr 205 | new_edge_index, new_edge_attr = filter_adj(hop_edge_index, hop_edge_attr, perm, num_nodes=score.size(0)) 206 | 207 | new_edge_index, new_edge_attr = add_remaining_self_loops(new_edge_index, new_edge_attr, 0, x.size(0)) 208 | row, col = new_edge_index 209 | weights = (torch.cat([x[row], x[col]], dim=1) * self.att).sum(dim=-1) 210 | weights = F.leaky_relu(weights, self.negative_slop) + new_edge_attr * self.lamb 211 | adj = torch.zeros((x.size(0), x.size(0)), dtype=torch.float, device=x.device) 212 | adj[row, col] = weights 213 | new_edge_index, weights = dense_to_sparse(adj) 214 | row, col = new_edge_index 215 | if self.sparse: 216 | new_edge_attr = self.sparse_attention(weights, row) 217 | else: 218 | new_edge_attr = softmax(weights, row, x.size(0)) 219 | # filter out zero weight edges 220 | adj[row, col] = new_edge_attr 221 | new_edge_index, new_edge_attr = dense_to_sparse(adj) 222 | # release gpu memory 223 | del adj 224 | torch.cuda.empty_cache() 225 | else: 226 | # Learning the possible edge weights between each pair of nodes in the pooled subgraph, relative slower. 
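            # Editor's note (explanatory comments only, not part of the original source).
            # The dense branch below scores every node pair inside each pooled graph:
            #   1. build a block-diagonal, fully connected adjacency over the batch, so pairs
            #      are only formed within a graph, never across graphs;
            #   2. score each candidate edge with the shared attention vector,
            #         w_ij = LeakyReLU(att . [x_i || x_j]),
            #      and add lamb * (original induced edge weight) where an edge already existed;
            #   3. normalise the scores per source node with softmax, or with Sparsemax when
            #      self.sparse is True, which pushes weak candidate edges exactly to zero;
            #   4. write the normalised weights back into the dense matrix and convert it to a
            #      sparse edge list again, dropping the zero-weight entries.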
227 | if edge_attr is None: 228 | induced_edge_attr = torch.ones((induced_edge_index.size(1),), dtype=x.dtype, 229 | device=induced_edge_index.device) 230 | num_nodes = scatter_add(batch.new_ones(x.size(0)), batch, dim=0) 231 | shift_cum_num_nodes = torch.cat([num_nodes.new_zeros(1), num_nodes.cumsum(dim=0)[:-1]], dim=0) 232 | cum_num_nodes = num_nodes.cumsum(dim=0) 233 | adj = torch.zeros((x.size(0), x.size(0)), dtype=torch.float, device=x.device) 234 | # Construct batch fully connected graph in block diagonal matirx format 235 | for idx_i, idx_j in zip(shift_cum_num_nodes, cum_num_nodes): 236 | adj[idx_i:idx_j, idx_i:idx_j] = 1.0 237 | new_edge_index, _ = dense_to_sparse(adj) 238 | row, col = new_edge_index 239 | 240 | weights = (torch.cat([x[row], x[col]], dim=1) * self.att).sum(dim=-1) 241 | weights = F.leaky_relu(weights, self.negative_slop) 242 | adj[row, col] = weights 243 | induced_row, induced_col = induced_edge_index 244 | 245 | adj[induced_row, induced_col] += induced_edge_attr * self.lamb 246 | weights = adj[row, col] 247 | if self.sparse: 248 | new_edge_attr = self.sparse_attention(weights, row) 249 | else: 250 | new_edge_attr = softmax(weights, row, x.size(0)) 251 | # filter out zero weight edges 252 | adj[row, col] = new_edge_attr 253 | new_edge_index, new_edge_attr = dense_to_sparse(adj) 254 | # release gpu memory 255 | del adj 256 | torch.cuda.empty_cache() 257 | 258 | return x, new_edge_index, new_edge_attr, batch 259 | -------------------------------------------------------------------------------- /Hierarchical GCN.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "# !pwd\n", 10 | "import os\n", 11 | "import pandas as pd\n", 12 | "import numpy as np" 13 | ] 14 | }, 15 | { 16 | "cell_type": "code", 17 | "execution_count": null, 18 | "metadata": { 19 | "scrolled": true 20 | }, 21 | "outputs": [], 22 | "source": [ 23 | "all_sub_adjMat = np.load(\"adni_612_corrmat.npy\")\n", 24 | "sub_names = np.load(\"adni_612_filenames.npy\", allow_pickle= True)\n", 25 | "print(len(all_sub_adjMat), len(sub_names))" 26 | ] 27 | }, 28 | { 29 | "cell_type": "code", 30 | "execution_count": null, 31 | "metadata": { 32 | "scrolled": false 33 | }, 34 | "outputs": [], 35 | "source": [ 36 | "len(all_sub_adjMat), np.shape(all_sub_adjMat), type(all_sub_adjMat)" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "execution_count": null, 42 | "metadata": {}, 43 | "outputs": [], 44 | "source": [ 45 | "all_sub_adjMat = np.delete(all_sub_adjMat, (130,460, 540), axis=0)\n", 46 | "sub_names = np.delete(sub_names, (130,460, 540))\n", 47 | "print(len(all_sub_adjMat), len(sub_names))" 48 | ] 49 | }, 50 | { 51 | "cell_type": "code", 52 | "execution_count": null, 53 | "metadata": {}, 54 | "outputs": [], 55 | "source": [ 56 | "with open('ADNI_largerdataset_labels.csv') as f:\n", 57 | " content = f.readlines()\n", 58 | "content = [x.strip() for x in content]\n", 59 | "content.pop(-1)\n", 60 | "print(content[0])\n", 61 | "print(len(content))" 62 | ] 63 | }, 64 | { 65 | "cell_type": "code", 66 | "execution_count": null, 67 | "metadata": {}, 68 | "outputs": [], 69 | "source": [ 70 | "with open('imageid_visitid.txt', 'r') as g:\n", 71 | " lines = g.readlines()\n", 72 | "lines = [x.strip() for x in lines]\n", 73 | "image_ids = [x[0:7] for x in lines]\n", 74 | "visit_ids = [x[9:] for x in lines]\n", 75 | "print(lines[0], image_ids[0], 
visit_ids[0])" 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": null, 81 | "metadata": { 82 | "scrolled": true 83 | }, 84 | "outputs": [], 85 | "source": [ 86 | "df = pd.read_csv('ADNI_largerdataset_labels.csv') # can also index sheet by name or fetch all sheets\n", 87 | "label = df['Group'].tolist()\n", 88 | "subject = df['Image Data ID'].tolist()\n", 89 | "print(len(label),label[:5], subject[:5])" 90 | ] 91 | }, 92 | { 93 | "cell_type": "code", 94 | "execution_count": null, 95 | "metadata": {}, 96 | "outputs": [], 97 | "source": [ 98 | "visitnames = [str(i)[11:] for i in sub_names]\n", 99 | "visitnames = [str(i)[0:-21] for i in visitnames]\n", 100 | "print(visitnames[0])\n", 101 | "\n", 102 | "subnames = []\n", 103 | "for i in range(len(visitnames)):\n", 104 | " index = visit_ids.index(visitnames[i])\n", 105 | " subnames.append(image_ids[index])\n", 106 | " " 107 | ] 108 | }, 109 | { 110 | "cell_type": "code", 111 | "execution_count": null, 112 | "metadata": { 113 | "scrolled": true 114 | }, 115 | "outputs": [], 116 | "source": [ 117 | "print(len(subnames), subnames[:5])" 118 | ] 119 | }, 120 | { 121 | "cell_type": "code", 122 | "execution_count": null, 123 | "metadata": {}, 124 | "outputs": [], 125 | "source": [ 126 | "subnames.index('visitid')" 127 | ] 128 | }, 129 | { 130 | "cell_type": "code", 131 | "execution_count": null, 132 | "metadata": {}, 133 | "outputs": [], 134 | "source": [ 135 | "print(len(subnames), subnames[:5])\n", 136 | "for x,i in enumerate(subnames):\n", 137 | " print(x,i)\n", 138 | " break\n", 139 | " \n", 140 | "# print(len(subject), subject[])" 141 | ] 142 | }, 143 | { 144 | "cell_type": "code", 145 | "execution_count": null, 146 | "metadata": {}, 147 | "outputs": [], 148 | "source": [ 149 | "adni90_feature = []\n", 150 | "adni90_labels = []\n", 151 | "adni90_adjmat = []\n", 152 | "edge_fea = []\n", 153 | "\n", 154 | "for x,i in enumerate(subnames):\n", 155 | " if i =='visitid':\n", 156 | " continue\n", 157 | " ind = subject.index(i)\n", 158 | " adni90_labels.append(label[ind])\n", 159 | " adni90_feature.append(all_sub_adjMat[x])\n", 160 | "\n", 161 | "for i in adni90_feature:\n", 162 | " temp = np.zeros((90,90))\n", 163 | " for x in range(90):\n", 164 | " for y in range(90):\n", 165 | " if abs(i[x][y]) > 0.15:\n", 166 | " temp[x][y] = 1\n", 167 | " edge_fea.append(i[x][y])\n", 168 | " else:\n", 169 | " edge_fea.append(-999)\n", 170 | " adni90_adjmat.append(list(temp))\n", 171 | " \n", 172 | " \n", 173 | "print(len(adni90_labels), len(adni90_adjmat), len(adni90_feature))\n", 174 | "print(np.array(adni90_adjmat[0]).shape)" 175 | ] 176 | }, 177 | { 178 | "cell_type": "code", 179 | "execution_count": null, 180 | "metadata": {}, 181 | "outputs": [], 182 | "source": [ 183 | "# # choose the classification type\n", 184 | "\n", 185 | "lab1 = []\n", 186 | "exp1 = []\n", 187 | "fea1 = []\n", 188 | "\n", 189 | "c1,c2 = 0, 0\n", 190 | "\n", 191 | "for c,i in enumerate(adni90_labels):\n", 192 | " if i == 'EMCI' or i == 'LMCI' or i == 'MCI':\n", 193 | "# if i == 'CN':\n", 194 | " c1 += 1\n", 195 | "# if c1 == 150:\n", 196 | "# c1 -= 1\n", 197 | "# continue\n", 198 | " lab1.append(1)\n", 199 | " exp1.append(adni90_adjmat[c])\n", 200 | " fea1.append(adni90_feature[c])\n", 201 | " \n", 202 | "# elif i == 'AD':\n", 203 | " elif i == 'CN':\n", 204 | " c2 += 1\n", 205 | " lab1.append(2)\n", 206 | " exp1.append(adni90_adjmat[c])\n", 207 | " fea1.append(adni90_feature[c])\n", 208 | " \n", 209 | "\n", 210 | "print(len(exp1), len(fea1), len(lab1))\n", 211 | 
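# Editor's sketch (not a cell from the original notebook): a vectorised equivalent of the
# thresholding loop in an earlier cell, which keeps an edge wherever the absolute correlation
# exceeds 0.15 and records the raw correlation (or a -999 sentinel) as the edge feature.
# Function and variable names here are illustrative only.
import numpy as np

def threshold_connectome(corr, thr=0.15):
    adj = (np.abs(corr) > thr).astype(float)      # 90x90 binary adjacency
    edge_fea = np.where(adj > 0, corr, -999.0)    # correlation where an edge exists, else sentinel
    return adj, edge_fea

# Usage, assuming `all_sub_adjMat` holds one 90x90 correlation matrix per subject:
# adj0, fea0 = threshold_connectome(all_sub_adjMat[0])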
"print(c1, c2) " 212 | ] 213 | }, 214 | { 215 | "cell_type": "code", 216 | "execution_count": null, 217 | "metadata": { 218 | "scrolled": true 219 | }, 220 | "outputs": [], 221 | "source": [ 222 | "# print(len(edge_fea), len(adni_a))\n", 223 | "print(lab1)" 224 | ] 225 | }, 226 | { 227 | "cell_type": "code", 228 | "execution_count": null, 229 | "metadata": { 230 | "scrolled": true 231 | }, 232 | "outputs": [], 233 | "source": [ 234 | "!pwd\n", 235 | "os.chdir(\"/home/anoopkumar/Brain/HGP-SL/data\")\n", 236 | "!pwd" 237 | ] 238 | }, 239 | { 240 | "cell_type": "code", 241 | "execution_count": null, 242 | "metadata": {}, 243 | "outputs": [], 244 | "source": [ 245 | "# for ADNI_A.txt\n", 246 | "\n", 247 | "edgeFea = []\n", 248 | "count = -1\n", 249 | "adni_a = []\n", 250 | "for i, x in enumerate(exp1):\n", 251 | " temp = []\n", 252 | " for a in range(len(x)):\n", 253 | " for b in range(len(x[0])):\n", 254 | " count += 1\n", 255 | " if a < b and x[a][b] == 1:\n", 256 | " temp.append((a+1+i*90,b+1+i*90))\n", 257 | " edgeFea.append(edge_fea[count])\n", 258 | "\n", 259 | " adni_a.extend(temp)\n", 260 | "\n", 261 | "print(len(adni_a))\n", 262 | "print(adni_a[-10:])\n", 263 | "\n", 264 | "\n", 265 | "with open(\"ADNI_612/raw/ADNI_612_A.txt\",\"w\") as f:\n", 266 | " for item in adni_a:\n", 267 | " f.write(\"%s, %s\\n\" % (item[0],item[1]))\n", 268 | "\n", 269 | "print(len(edgeFea))" 270 | ] 271 | }, 272 | { 273 | "cell_type": "code", 274 | "execution_count": null, 275 | "metadata": {}, 276 | "outputs": [], 277 | "source": [ 278 | "# ADNI_graph_indicator.txt\n", 279 | "\n", 280 | "node_graph_indicator = []\n", 281 | "for i in range(1,len(exp1)+1):\n", 282 | " node_graph_indicator.extend([i for j in range(90)])\n", 283 | "\n", 284 | "with open(\"ADNI_612/raw/ADNI_612_graph_indicator.txt\",\"w\") as f:\n", 285 | " for item in node_graph_indicator:\n", 286 | " f.write(\"%s\\n\" % item) " 287 | ] 288 | }, 289 | { 290 | "cell_type": "code", 291 | "execution_count": null, 292 | "metadata": {}, 293 | "outputs": [], 294 | "source": [ 295 | "# ADNI_graph_labels.txt\n", 296 | "\n", 297 | "with open(\"ADNI_612/raw/ADNI_612_graph_labels.txt\",\"w\") as f:\n", 298 | " for item in lab1:\n", 299 | " f.write(\"%s\\n\" % item) " 300 | ] 301 | }, 302 | { 303 | "cell_type": "code", 304 | "execution_count": null, 305 | "metadata": {}, 306 | "outputs": [], 307 | "source": [ 308 | "# ADNI node labels.txt\n", 309 | "\n", 310 | "\n", 311 | "\n", 312 | "node_labels = []\n", 313 | "\n", 314 | "for i in range(len(exp1)):\n", 315 | " for j in range(90):\n", 316 | " node_labels.extend([j])\n", 317 | "print(len(node_labels), len(node_graph_indicator)) \n", 318 | "with open(\"ADNI_612/raw/ADNI_612_node_labels.txt\",\"w\") as f:\n", 319 | " for item in node_labels:\n", 320 | " f.write(\"%s\\n\" % item) " 321 | ] 322 | }, 323 | { 324 | "cell_type": "code", 325 | "execution_count": null, 326 | "metadata": {}, 327 | "outputs": [], 328 | "source": [ 329 | "# ADNI edge attr\n", 330 | "\n", 331 | "\n", 332 | "with open(\"ADNI_612/raw/ADNI_612_edge_attributes.txt\",\"w\") as f:\n", 333 | " for item in edgeFea:\n", 334 | " f.write(\"%s\\n\" % item) " 335 | ] 336 | }, 337 | { 338 | "cell_type": "code", 339 | "execution_count": null, 340 | "metadata": {}, 341 | "outputs": [], 342 | "source": [ 343 | "import torch\n", 344 | "import numpy as np\n", 345 | "import pandas as pd\n", 346 | "import os\n", 347 | "import re\n", 348 | "import sklearn" 349 | ] 350 | }, 351 | { 352 | "cell_type": "code", 353 | "execution_count": null, 354 | "metadata": 
{}, 355 | "outputs": [], 356 | "source": [ 357 | "print(torch.__version__, torch.cuda.is_available())" 358 | ] 359 | }, 360 | { 361 | "cell_type": "code", 362 | "execution_count": null, 363 | "metadata": {}, 364 | "outputs": [], 365 | "source": [ 366 | "!which pip\n", 367 | "\n", 368 | "os.chdir(\"/home/anoopkumar/Brain/HGP-SL\")\n", 369 | "!pwd" 370 | ] 371 | }, 372 | { 373 | "cell_type": "code", 374 | "execution_count": null, 375 | "metadata": { 376 | "scrolled": false 377 | }, 378 | "outputs": [], 379 | "source": [ 380 | "# !python main_cudastreams2.py --epochs 10 --dataset ADNI_612 --batch_size 4 --lr 0.01 --dropout_ratio 0.5 --nhid 512 \n", 381 | "# # --sparse_attention False --structure_learning False\n", 382 | "\n", 383 | "!rm -rf /home/anoopkumar/Brain/HGP-SL/data/ADNI_612/processed\n", 384 | "!python main.py --epochs 20 --dataset ADNI_612 --batch_size 8 --lr 0.0001 --dropout_ratio 0.5 --nhid 256 --sparse_attention False --structure_learning False" 385 | ] 386 | }, 387 | { 388 | "cell_type": "code", 389 | "execution_count": null, 390 | "metadata": {}, 391 | "outputs": [], 392 | "source": [ 393 | "#plot the times taken\n", 394 | "import matplotlib.pyplot as plt" 395 | ] 396 | }, 397 | { 398 | "cell_type": "code", 399 | "execution_count": null, 400 | "metadata": {}, 401 | "outputs": [], 402 | "source": [ 403 | "fig = plt.figure()\n", 404 | "ax = fig.add_axes([0,0,1,1])\n", 405 | "ax.bar(times_labels[0:3],times_values[0:3])\n", 406 | "ax.set_ylabel('sec')\n", 407 | "ax.set_title('train and test')\n", 408 | "plt.show()" 409 | ] 410 | }, 411 | { 412 | "cell_type": "code", 413 | "execution_count": null, 414 | "metadata": { 415 | "scrolled": true 416 | }, 417 | "outputs": [], 418 | "source": [ 419 | "fig = plt.figure()\n", 420 | "ax = fig.add_axes([0,0,1,1])\n", 421 | "ax.bar(times_labels[3:],times_values[3:])\n", 422 | "ax.set_ylabel('sec')\n", 423 | "ax.set_title('train time split up')\n", 424 | "plt.show()" 425 | ] 426 | } 427 | ], 428 | "metadata": { 429 | "kernelspec": { 430 | "display_name": "Python 3", 431 | "language": "python", 432 | "name": "python3" 433 | }, 434 | "language_info": { 435 | "codemirror_mode": { 436 | "name": "ipython", 437 | "version": 3 438 | }, 439 | "file_extension": ".py", 440 | "mimetype": "text/x-python", 441 | "name": "python", 442 | "nbconvert_exporter": "python", 443 | "pygments_lexer": "ipython3", 444 | "version": "3.7.6" 445 | } 446 | }, 447 | "nbformat": 4, 448 | "nbformat_minor": 2 449 | } 450 | -------------------------------------------------------------------------------- /Baseline_GCN.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "!pip3 install stellargraph\n", 10 | "!pip3 install pandas\n", 11 | "!pip3 install sklearn" 12 | ] 13 | }, 14 | { 15 | "cell_type": "code", 16 | "execution_count": null, 17 | "metadata": {}, 18 | "outputs": [], 19 | "source": [ 20 | "import pickle\n", 21 | "import numpy as np\n", 22 | "import pandas as pd\n", 23 | "import sklearn\n", 24 | "import tensorflow as tf\n", 25 | "from copy import deepcopy\n", 26 | "from stellargraph import StellarGraph\n", 27 | "import time" 28 | ] 29 | }, 30 | { 31 | "cell_type": "code", 32 | "execution_count": null, 33 | "metadata": {}, 34 | "outputs": [], 35 | "source": [ 36 | "!pwd" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "execution_count": null, 42 | "metadata": {}, 43 | "outputs": [], 44 | 
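# Editor's note on the Hierarchical GCN notebook above (explanatory only, not a cell from the
# original repository). The files it writes under data/ADNI_612/raw/ follow the standard TU
# graph-dataset layout, the format torch_geometric's TUDataset reader consumes. For a toy
# dataset of two 3-node graphs the rows would look like:
#   ADNI_612_A.txt               "1, 2" / "2, 3" / "4, 5" / "5, 6"  (edges as 1-based global node
#                                                                    ids; here offset by 90 per subject)
#   ADNI_612_graph_indicator.txt one line per node giving its graph id: 1 1 1 2 2 2
#   ADNI_612_graph_labels.txt    one line per graph: 1 (MCI) or 2 (CN) in that experiment
#   ADNI_612_node_labels.txt     one line per node giving its ROI index 0..89 within the 90-node graph
#   ADNI_612_edge_attributes.txt one correlation value per line of the A.txt edge list
# Deleting data/ADNI_612/processed at the end of that notebook forces the cached, processed
# copy to be rebuilt from these raw files on the next main.py run.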
"source": [ 45 | "all_sub_adjMat = np.load(\"vibha_preprop_adni/correlation_matrices.npy\")\n", 46 | "sub_names = np.load(\"vibha_preprop_adni/filenamelist.npy\", allow_pickle= True)\n", 47 | "print(len(all_sub_adjMat), len(sub_names))" 48 | ] 49 | }, 50 | { 51 | "cell_type": "code", 52 | "execution_count": null, 53 | "metadata": {}, 54 | "outputs": [], 55 | "source": [ 56 | "subnames = [str(i)[5:15] for i in sub_names]\n", 57 | "print(len(subnames), subnames[:5])" 58 | ] 59 | }, 60 | { 61 | "cell_type": "code", 62 | "execution_count": null, 63 | "metadata": {}, 64 | "outputs": [], 65 | "source": [ 66 | "with open('/home/anoopkumar/adni/anoop/bck/filelist.txt') as f:\n", 67 | " content = f.readlines()\n", 68 | "content = [x.strip() for x in content]\n", 69 | "content.pop(-1)\n", 70 | "print(len(content))" 71 | ] 72 | }, 73 | { 74 | "cell_type": "code", 75 | "execution_count": null, 76 | "metadata": {}, 77 | "outputs": [], 78 | "source": [ 79 | "df = pd.read_csv('/home/anoopkumar/adni/anoop/bck/xlabel.csv') # can also index sheet by name or fetch all sheets\n", 80 | "label = df['Group'].tolist()\n", 81 | "subject = df['Subject'].tolist()\n", 82 | "print(len(label),label[:5], subject[:5])" 83 | ] 84 | }, 85 | { 86 | "cell_type": "code", 87 | "execution_count": null, 88 | "metadata": {}, 89 | "outputs": [], 90 | "source": [ 91 | "adni90_feature = []\n", 92 | "adni90_labels = []\n", 93 | "adni90_adjmat = []\n", 94 | "\n", 95 | "for x,i in enumerate(subnames):\n", 96 | " ind = subject.index(i)\n", 97 | " adni90_labels.append(label[ind])\n", 98 | " adni90_feature.append(all_sub_adjMat[x])\n", 99 | "\n", 100 | "for i in adni90_feature:\n", 101 | " temp = np.zeros((90,90))\n", 102 | " for x in range(90):\n", 103 | " for y in range(90):\n", 104 | " if i[x][y] > 0.15:\n", 105 | " temp[x][y] = 1\n", 106 | " adni90_adjmat.append(list(temp))\n", 107 | " \n", 108 | " \n", 109 | "print(len(adni90_labels), len(adni90_adjmat), len(adni90_feature))" 110 | ] 111 | }, 112 | { 113 | "cell_type": "code", 114 | "execution_count": null, 115 | "metadata": {}, 116 | "outputs": [], 117 | "source": [ 118 | "############################# gcn code ###############################################################" 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "execution_count": null, 124 | "metadata": {}, 125 | "outputs": [], 126 | "source": [ 127 | "all_sg = [[] for i in range(152)]\n", 128 | "\n", 129 | "for count, sub in enumerate(adni90_adjmat):\n", 130 | " np.fill_diagonal(np.array(sub), np.nan)\n", 131 | " df = pd.DataFrame(sub)\n", 132 | " df = df.stack().reset_index()\n", 133 | " df.columns = ['a','b','c']\n", 134 | " isedge = df['c'] == 1\n", 135 | " dfn = df[isedge]\n", 136 | " pairs = list(zip(dfn['a'], dfn['b']))\n", 137 | " for i in pairs:\n", 138 | " a, b = i\n", 139 | " if (b,a) not in all_sg[count]:\n", 140 | " all_sg[count].append(i)" 141 | ] 142 | }, 143 | { 144 | "cell_type": "code", 145 | "execution_count": null, 146 | "metadata": {}, 147 | "outputs": [], 148 | "source": [ 149 | "print(len(all_sg))\n", 150 | "print(len(all_sg[0]))" 151 | ] 152 | }, 153 | { 154 | "cell_type": "code", 155 | "execution_count": null, 156 | "metadata": {}, 157 | "outputs": [], 158 | "source": [ 159 | "all_sg_lr = [[] for _ in range(152)]\n", 160 | "\n", 161 | "for i,x in enumerate(all_sg):\n", 162 | " all_sg_lr[i] = [[],[]]\n", 163 | " for pair in x:\n", 164 | " a, b = pair\n", 165 | " all_sg_lr[i][0].append(a)\n", 166 | " all_sg_lr[i][1].append(b)\n", 167 | " " 168 | ] 169 | }, 170 | { 171 | "cell_type": 
"code", 172 | "execution_count": null, 173 | "metadata": {}, 174 | "outputs": [], 175 | "source": [ 176 | "# for featue\n", 177 | "\n", 178 | "all_sub_fea = []\n", 179 | "for i in adni90_feature:\n", 180 | " fead = pd.DataFrame(i)\n", 181 | " all_sub_fea.append(fead)" 182 | ] 183 | }, 184 | { 185 | "cell_type": "code", 186 | "execution_count": null, 187 | "metadata": {}, 188 | "outputs": [], 189 | "source": [ 190 | "graphs = []\n", 191 | "\n", 192 | "for i,one_sglr in enumerate(all_sg_lr):\n", 193 | " dic = {\"source\": one_sglr[0],\"target\":one_sglr[1]}\n", 194 | " one = pd.DataFrame(dic)\n", 195 | " graphs.append(StellarGraph(all_sub_fea[i], edge = one))" 196 | ] 197 | }, 198 | { 199 | "cell_type": "code", 200 | "execution_count": null, 201 | "metadata": {}, 202 | "outputs": [], 203 | "source": [ 204 | "print(graphs[1].info())" 205 | ] 206 | }, 207 | { 208 | "cell_type": "code", 209 | "execution_count": null, 210 | "metadata": {}, 211 | "outputs": [], 212 | "source": [ 213 | "summary = pd.DataFrame(\n", 214 | " [(g.number_of_nodes(), g.number_of_edges()) for g in graphs],\n", 215 | " columns=[\"nodes\", \"edges\"],\n", 216 | ")\n", 217 | "summary.describe().round(1)" 218 | ] 219 | }, 220 | { 221 | "cell_type": "code", 222 | "execution_count": null, 223 | "metadata": {}, 224 | "outputs": [], 225 | "source": [ 226 | "\n", 227 | "################### EMCI Vs LMCI #############################\n", 228 | "graph_labels = []\n", 229 | "rem_ad = []\n", 230 | "exp1 = []\n", 231 | "for c,i in enumerate(adni90_labels):\n", 232 | " if i == 'EMCI':\n", 233 | " graph_labels.append(1)\n", 234 | " exp1.append(graphs[c])\n", 235 | " elif i == 'LMCI':\n", 236 | " graph_labels.append(-1)\n", 237 | " exp1.append(graphs[c])\n", 238 | " else:\n", 239 | " rem_ad.append(c)\n", 240 | " \n", 241 | "\n", 242 | "################## AD Vs NC ################################\n", 243 | "# graph_labels = []\n", 244 | "# rem_ad = []\n", 245 | "# exp1 = []\n", 246 | "# for c,i in enumerate(all_sub_label):\n", 247 | "# if i == 'AD':\n", 248 | "# graph_labels.append(1)\n", 249 | "# exp1.append(graphs[c])\n", 250 | "# elif i == 'CN':\n", 251 | "# graph_labels.append(-1)\n", 252 | "# exp1.append(graphs[c])\n", 253 | "# else:\n", 254 | "# rem_ad.append(c)\n", 255 | "\n", 256 | "\n", 257 | "################ AD Vs MCI ###############################\n", 258 | "# graph_labels = []\n", 259 | "# rem_ad = []\n", 260 | "# exp1 = []\n", 261 | "# for c,i in enumerate(all_sub_label):\n", 262 | "# if i == 'EMCI' or i == 'LMCI' or i == 'MCI':\n", 263 | "# graph_labels.append(1)\n", 264 | "# exp1.append(graphs[c])\n", 265 | "# elif i == 'AD':\n", 266 | "# graph_labels.append(-1)\n", 267 | "# exp1.append(graphs[c])\n", 268 | "# else:\n", 269 | "# rem_ad.append(c)\n", 270 | " \n", 271 | " \n", 272 | "\n", 273 | "\n", 274 | "########### MCI Vs NC ##########################\n", 275 | "# graph_labels = []\n", 276 | "# rem_ad = []\n", 277 | "# exp1 = []\n", 278 | "# for c,i in enumerate(all_sub_label):\n", 279 | "# if i == 'EMCI' or i == 'LMCI' or i == 'MCI':\n", 280 | "# graph_labels.append(1)\n", 281 | "# exp1.append(graphs[c])\n", 282 | "# elif i == 'CN':\n", 283 | "# graph_labels.append(-1)\n", 284 | "# exp1.append(graphs[c])\n", 285 | "# else:\n", 286 | "# rem_ad.append(c)\n", 287 | " " 288 | ] 289 | }, 290 | { 291 | "cell_type": "code", 292 | "execution_count": null, 293 | "metadata": {}, 294 | "outputs": [], 295 | "source": [ 296 | "print(len(exp1), len (graph_labels))" 297 | ] 298 | }, 299 | { 300 | "cell_type": "code", 301 | 
"execution_count": null, 302 | "metadata": {}, 303 | "outputs": [], 304 | "source": [ 305 | "pd.Series(graph_labels).value_counts().to_frame()" 306 | ] 307 | }, 308 | { 309 | "cell_type": "code", 310 | "execution_count": null, 311 | "metadata": {}, 312 | "outputs": [], 313 | "source": [ 314 | "graph_labels = pd.get_dummies(graph_labels, drop_first=True)\n", 315 | "# print(graph_labels)" 316 | ] 317 | }, 318 | { 319 | "cell_type": "code", 320 | "execution_count": null, 321 | "metadata": {}, 322 | "outputs": [], 323 | "source": [ 324 | "\n", 325 | "##############################################################################################" 326 | ] 327 | }, 328 | { 329 | "cell_type": "code", 330 | "execution_count": null, 331 | "metadata": {}, 332 | "outputs": [], 333 | "source": [ 334 | " import pandas as pd\n", 335 | " import numpy as np\n", 336 | "\n", 337 | " import stellargraph as sg\n", 338 | " from stellargraph.mapper import PaddedGraphGenerator\n", 339 | " from stellargraph.layer import GCNSupervisedGraphClassification\n", 340 | " from stellargraph import StellarGraph\n", 341 | "\n", 342 | " from stellargraph import datasets\n", 343 | " import sklearn\n", 344 | " from sklearn import model_selection\n", 345 | " from sklearn.metrics import balanced_accuracy_score\n", 346 | " from IPython.display import display, HTML\n", 347 | "\n", 348 | " from tensorflow.keras import Model\n", 349 | " from tensorflow.keras.optimizers import Adam\n", 350 | " from tensorflow.keras.layers import Dense\n", 351 | " from tensorflow.keras.losses import binary_crossentropy\n", 352 | " from tensorflow.keras.callbacks import EarlyStopping\n", 353 | " import tensorflow.keras.metrics\n", 354 | " import tensorflow as tf\n", 355 | " import matplotlib.pyplot as plt" 356 | ] 357 | }, 358 | { 359 | "cell_type": "code", 360 | "execution_count": null, 361 | "metadata": {}, 362 | "outputs": [], 363 | "source": [ 364 | "generator = PaddedGraphGenerator(graphs=exp1)" 365 | ] 366 | }, 367 | { 368 | "cell_type": "code", 369 | "execution_count": null, 370 | "metadata": {}, 371 | "outputs": [], 372 | "source": [ 373 | "def create_graph_classification_model(generator):\n", 374 | " gc_model = GCNSupervisedGraphClassification(\n", 375 | " layer_sizes=[128, 128, 128,128],\n", 376 | " activations=[\"relu\", \"relu\",\"relu\", \"relu\"],\n", 377 | " generator=generator,\n", 378 | " dropout=0.5,\n", 379 | " )\n", 380 | " x_inp, x_out = gc_model.in_out_tensors()\n", 381 | " predictions = Dense(units=256, activation=\"relu\")(x_out)\n", 382 | " predictions = Dense(units=128, activation=\"relu\")(predictions)\n", 383 | " \n", 384 | " predictions = Dense(units=64, activation=\"relu\")(predictions)\n", 385 | " \n", 386 | " predictions = Dense(units=32, activation=\"relu\")(predictions)\n", 387 | " \n", 388 | " predictions = Dense(units=16, activation=\"relu\")(predictions)\n", 389 | " predictions = Dense(units=1, activation=\"sigmoid\")(predictions)\n", 390 | "\n", 391 | " # Let's create the Keras model and prepare it for training\n", 392 | " model = Model(inputs=x_inp, outputs=predictions)\n", 393 | "# model.compile(optimizer=Adam(0.005), loss=binary_crossentropy, metrics=[\"acc\"]) \n", 394 | " model.compile(optimizer=Adam(0.005), loss=binary_crossentropy, metrics=[\"acc\",tf.keras.metrics.AUC()])\n", 395 | "\n", 396 | " return model" 397 | ] 398 | }, 399 | { 400 | "cell_type": "code", 401 | "execution_count": null, 402 | "metadata": {}, 403 | "outputs": [], 404 | "source": [ 405 | "epochs = 200 # maximum number of training epochs\n", 
406 | "folds = 32 # the number of folds for k-fold cross validation\n", 407 | "n_repeats = 5 # the number of repeats for repeated k-fold cross validation" 408 | ] 409 | }, 410 | { 411 | "cell_type": "code", 412 | "execution_count": null, 413 | "metadata": {}, 414 | "outputs": [], 415 | "source": [ 416 | "es = EarlyStopping(\n", 417 | " monitor=\"val_loss\", min_delta=0, patience=25, restore_best_weights=True\n", 418 | ")" 419 | ] 420 | }, 421 | { 422 | "cell_type": "code", 423 | "execution_count": null, 424 | "metadata": {}, 425 | "outputs": [], 426 | "source": [ 427 | "def train_fold(model, train_gen, test_gen, es, epochs):\n", 428 | " history = model.fit(\n", 429 | " train_gen, epochs=epochs, validation_data=test_gen, verbose=0, callbacks=[es],\n", 430 | " )\n", 431 | " # calculate performance on the test data and return along with history\n", 432 | " \n", 433 | " test_metrics = model.evaluate(test_gen, verbose=0)\n", 434 | "# print(model.metrics_names)\n", 435 | " test_acc = test_metrics[model.metrics_names.index(\"acc\")]\n", 436 | "# test_bacc = test_metrics[3]\n", 437 | " test_auc = test_metrics[2]\n", 438 | " \n", 439 | "\n", 440 | " return history, test_acc, test_auc" 441 | ] 442 | }, 443 | { 444 | "cell_type": "code", 445 | "execution_count": null, 446 | "metadata": {}, 447 | "outputs": [], 448 | "source": [ 449 | "def get_generators(train_index, test_index, graph_labels, batch_size):\n", 450 | " train_gen = generator.flow(\n", 451 | " train_index, targets=graph_labels.iloc[train_index].values, batch_size=batch_size\n", 452 | " )\n", 453 | " test_gen = generator.flow(\n", 454 | " test_index, targets=graph_labels.iloc[test_index].values, batch_size=batch_size\n", 455 | " )\n", 456 | "\n", 457 | " return train_gen, test_gen" 458 | ] 459 | }, 460 | { 461 | "cell_type": "code", 462 | "execution_count": null, 463 | "metadata": {}, 464 | "outputs": [], 465 | "source": [ 466 | "test_accs = []\n", 467 | "# test_raccs = []\n", 468 | "# test_baccs = []\n", 469 | "test_aucs = []\n", 470 | "start = time.time()\n", 471 | "\n", 472 | "\n", 473 | "stratified_folds = model_selection.RepeatedStratifiedKFold(\n", 474 | " n_splits=folds, n_repeats=n_repeats\n", 475 | ").split(graph_labels, graph_labels)\n", 476 | "\n", 477 | "for i, (train_index, test_index) in enumerate(stratified_folds):\n", 478 | " print(f\"Training and evaluating on fold {i+1} out of {folds * n_repeats}...\")\n", 479 | " train_gen, test_gen = get_generators(\n", 480 | " train_index, test_index, graph_labels, batch_size=30\n", 481 | " )\n", 482 | "\n", 483 | " model = create_graph_classification_model(generator)\n", 484 | " history, acc, auc = train_fold(model, train_gen, test_gen, es, epochs)\n", 485 | " \n", 486 | "# history, acc = train_fold(model, train_gen, test_gen, es, epochs)\n", 487 | "\n", 488 | " test_accs.append(acc)\n", 489 | "# test_raccs.append(racc)\n", 490 | " test_aucs.append(auc)\n", 491 | "\n", 492 | "end = time.time()\n", 493 | "print(end - start)" 494 | ] 495 | }, 496 | { 497 | "cell_type": "code", 498 | "execution_count": null, 499 | "metadata": {}, 500 | "outputs": [], 501 | "source": [ 502 | "print(\n", 503 | " f\"Accuracy over all folds mean: {np.mean(test_accs)*100:.3}% and std: {np.std(test_accs)*100:.2}% and auc: {np.mean(test_aucs)}\"\n", 504 | ")" 505 | ] 506 | }, 507 | { 508 | "cell_type": "code", 509 | "execution_count": null, 510 | "metadata": {}, 511 | "outputs": [], 512 | "source": [ 513 | "plt.figure(figsize=(8, 6))\n", 514 | "plt.hist(test_accs)\n", 515 | "plt.xlabel(\"Accuracy\")\n", 516 | 
"plt.ylabel(\"Count\")" 517 | ] 518 | } 519 | ], 520 | "metadata": { 521 | "kernelspec": { 522 | "display_name": "Python 3", 523 | "language": "python", 524 | "name": "python3" 525 | }, 526 | "language_info": { 527 | "codemirror_mode": { 528 | "name": "ipython", 529 | "version": 3 530 | }, 531 | "file_extension": ".py", 532 | "mimetype": "text/x-python", 533 | "name": "python", 534 | "nbconvert_exporter": "python", 535 | "pygments_lexer": "ipython3", 536 | "version": "3.7.6" 537 | } 538 | }, 539 | "nbformat": 4, 540 | "nbformat_minor": 4 541 | } 542 | -------------------------------------------------------------------------------- /HGP-SL/data/ADNI/raw/ADNI_node_labels.txt: -------------------------------------------------------------------------------- 1 | 0 2 | 1 3 | 2 4 | 3 5 | 4 6 | 5 7 | 6 8 | 7 9 | 8 10 | 9 11 | 10 12 | 11 13 | 12 14 | 13 15 | 14 16 | 15 17 | 16 18 | 17 19 | 18 20 | 19 21 | 20 22 | 21 23 | 22 24 | 23 25 | 24 26 | 25 27 | 26 28 | 27 29 | 28 30 | 29 31 | 30 32 | 31 33 | 32 34 | 33 35 | 34 36 | 35 37 | 36 38 | 37 39 | 38 40 | 39 41 | 40 42 | 41 43 | 42 44 | 43 45 | 44 46 | 45 47 | 46 48 | 47 49 | 48 50 | 49 51 | 50 52 | 51 53 | 52 54 | 53 55 | 54 56 | 55 57 | 56 58 | 57 59 | 58 60 | 59 61 | 60 62 | 61 63 | 62 64 | 63 65 | 64 66 | 65 67 | 66 68 | 67 69 | 68 70 | 69 71 | 70 72 | 71 73 | 72 74 | 73 75 | 74 76 | 75 77 | 76 78 | 77 79 | 78 80 | 79 81 | 80 82 | 81 83 | 82 84 | 83 85 | 84 86 | 85 87 | 86 88 | 87 89 | 88 90 | 89 91 | 0 92 | 1 93 | 2 94 | 3 95 | 4 96 | 5 97 | 6 98 | 7 99 | 8 100 | 9 101 | 10 102 | 11 103 | 12 104 | 13 105 | 14 106 | 15 107 | 16 108 | 17 109 | 18 110 | 19 111 | 20 112 | 21 113 | 22 114 | 23 115 | 24 116 | 25 117 | 26 118 | 27 119 | 28 120 | 29 121 | 30 122 | 31 123 | 32 124 | 33 125 | 34 126 | 35 127 | 36 128 | 37 129 | 38 130 | 39 131 | 40 132 | 41 133 | 42 134 | 43 135 | 44 136 | 45 137 | 46 138 | 47 139 | 48 140 | 49 141 | 50 142 | 51 143 | 52 144 | 53 145 | 54 146 | 55 147 | 56 148 | 57 149 | 58 150 | 59 151 | 60 152 | 61 153 | 62 154 | 63 155 | 64 156 | 65 157 | 66 158 | 67 159 | 68 160 | 69 161 | 70 162 | 71 163 | 72 164 | 73 165 | 74 166 | 75 167 | 76 168 | 77 169 | 78 170 | 79 171 | 80 172 | 81 173 | 82 174 | 83 175 | 84 176 | 85 177 | 86 178 | 87 179 | 88 180 | 89 181 | 0 182 | 1 183 | 2 184 | 3 185 | 4 186 | 5 187 | 6 188 | 7 189 | 8 190 | 9 191 | 10 192 | 11 193 | 12 194 | 13 195 | 14 196 | 15 197 | 16 198 | 17 199 | 18 200 | 19 201 | 20 202 | 21 203 | 22 204 | 23 205 | 24 206 | 25 207 | 26 208 | 27 209 | 28 210 | 29 211 | 30 212 | 31 213 | 32 214 | 33 215 | 34 216 | 35 217 | 36 218 | 37 219 | 38 220 | 39 221 | 40 222 | 41 223 | 42 224 | 43 225 | 44 226 | 45 227 | 46 228 | 47 229 | 48 230 | 49 231 | 50 232 | 51 233 | 52 234 | 53 235 | 54 236 | 55 237 | 56 238 | 57 239 | 58 240 | 59 241 | 60 242 | 61 243 | 62 244 | 63 245 | 64 246 | 65 247 | 66 248 | 67 249 | 68 250 | 69 251 | 70 252 | 71 253 | 72 254 | 73 255 | 74 256 | 75 257 | 76 258 | 77 259 | 78 260 | 79 261 | 80 262 | 81 263 | 82 264 | 83 265 | 84 266 | 85 267 | 86 268 | 87 269 | 88 270 | 89 271 | 0 272 | 1 273 | 2 274 | 3 275 | 4 276 | 5 277 | 6 278 | 7 279 | 8 280 | 9 281 | 10 282 | 11 283 | 12 284 | 13 285 | 14 286 | 15 287 | 16 288 | 17 289 | 18 290 | 19 291 | 20 292 | 21 293 | 22 294 | 23 295 | 24 296 | 25 297 | 26 298 | 27 299 | 28 300 | 29 301 | 30 302 | 31 303 | 32 304 | 33 305 | 34 306 | 35 307 | 36 308 | 37 309 | 38 310 | 39 311 | 40 312 | 41 313 | 42 314 | 43 315 | 44 316 | 45 317 | 46 318 | 47 319 | 48 320 | 49 321 | 50 322 | 51 323 | 52 324 | 53 325 | 54 326 | 55 327 
| 56 328 | 57 329 | 58 330 | 59 331 | 60 332 | 61 333 | 62 334 | 63 335 | 64 336 | 65 337 | 66 338 | 67 339 | 68 340 | 69 341 | 70 342 | 71 343 | 72 344 | 73 345 | 74 346 | 75 347 | 76 348 | 77 349 | 78 350 | 79 351 | 80 352 | 81 353 | 82 354 | 83 355 | 84 356 | 85 357 | 86 358 | 87 359 | 88 360 | 89 361 | 0 362 | 1 363 | 2 364 | 3 365 | 4 366 | 5 367 | 6 368 | 7 369 | 8 370 | 9 371 | 10 372 | 11 373 | 12 374 | 13 375 | 14 376 | 15 377 | 16 378 | 17 379 | 18 380 | 19 381 | 20 382 | 21 383 | 22 384 | 23 385 | 24 386 | 25 387 | 26 388 | 27 389 | 28 390 | 29 391 | 30 392 | 31 393 | 32 394 | 33 395 | 34 396 | 35 397 | 36 398 | 37 399 | 38 400 | 39 401 | 40 402 | 41 403 | 42 404 | 43 405 | 44 406 | 45 407 | 46 408 | 47 409 | 48 410 | 49 411 | 50 412 | 51 413 | 52 414 | 53 415 | 54 416 | 55 417 | 56 418 | 57 419 | 58 420 | 59 421 | 60 422 | 61 423 | 62 424 | 63 425 | 64 426 | 65 427 | 66 428 | 67 429 | 68 430 | 69 431 | 70 432 | 71 433 | 72 434 | 73 435 | 74 436 | 75 437 | 76 438 | 77 439 | 78 440 | 79 441 | 80 442 | 81 443 | 82 444 | 83 445 | 84 446 | 85 447 | 86 448 | 87 449 | 88 450 | 89 451 | 0 452 | 1 453 | 2 454 | 3 455 | 4 456 | 5 457 | 6 458 | 7 459 | 8 460 | 9 461 | 10 462 | 11 463 | 12 464 | 13 465 | 14 466 | 15 467 | 16 468 | 17 469 | 18 470 | 19 471 | 20 472 | 21 473 | 22 474 | 23 475 | 24 476 | 25 477 | 26 478 | 27 479 | 28 480 | 29 481 | 30 482 | 31 483 | 32 484 | 33 485 | 34 486 | 35 487 | 36 488 | 37 489 | 38 490 | 39 491 | 40 492 | 41 493 | 42 494 | 43 495 | 44 496 | 45 497 | 46 498 | 47 499 | 48 500 | 49 501 | 50 502 | 51 503 | 52 504 | 53 505 | 54 506 | 55 507 | 56 508 | 57 509 | 58 510 | 59 511 | 60 512 | 61 513 | 62 514 | 63 515 | 64 516 | 65 517 | 66 518 | 67 519 | 68 520 | 69 521 | 70 522 | 71 523 | 72 524 | 73 525 | 74 526 | 75 527 | 76 528 | 77 529 | 78 530 | 79 531 | 80 532 | 81 533 | 82 534 | 83 535 | 84 536 | 85 537 | 86 538 | 87 539 | 88 540 | 89 541 | 0 542 | 1 543 | 2 544 | 3 545 | 4 546 | 5 547 | 6 548 | 7 549 | 8 550 | 9 551 | 10 552 | 11 553 | 12 554 | 13 555 | 14 556 | 15 557 | 16 558 | 17 559 | 18 560 | 19 561 | 20 562 | 21 563 | 22 564 | 23 565 | 24 566 | 25 567 | 26 568 | 27 569 | 28 570 | 29 571 | 30 572 | 31 573 | 32 574 | 33 575 | 34 576 | 35 577 | 36 578 | 37 579 | 38 580 | 39 581 | 40 582 | 41 583 | 42 584 | 43 585 | 44 586 | 45 587 | 46 588 | 47 589 | 48 590 | 49 591 | 50 592 | 51 593 | 52 594 | 53 595 | 54 596 | 55 597 | 56 598 | 57 599 | 58 600 | 59 601 | 60 602 | 61 603 | 62 604 | 63 605 | 64 606 | 65 607 | 66 608 | 67 609 | 68 610 | 69 611 | 70 612 | 71 613 | 72 614 | 73 615 | 74 616 | 75 617 | 76 618 | 77 619 | 78 620 | 79 621 | 80 622 | 81 623 | 82 624 | 83 625 | 84 626 | 85 627 | 86 628 | 87 629 | 88 630 | 89 631 | 0 632 | 1 633 | 2 634 | 3 635 | 4 636 | 5 637 | 6 638 | 7 639 | 8 640 | 9 641 | 10 642 | 11 643 | 12 644 | 13 645 | 14 646 | 15 647 | 16 648 | 17 649 | 18 650 | 19 651 | 20 652 | 21 653 | 22 654 | 23 655 | 24 656 | 25 657 | 26 658 | 27 659 | 28 660 | 29 661 | 30 662 | 31 663 | 32 664 | 33 665 | 34 666 | 35 667 | 36 668 | 37 669 | 38 670 | 39 671 | 40 672 | 41 673 | 42 674 | 43 675 | 44 676 | 45 677 | 46 678 | 47 679 | 48 680 | 49 681 | 50 682 | 51 683 | 52 684 | 53 685 | 54 686 | 55 687 | 56 688 | 57 689 | 58 690 | 59 691 | 60 692 | 61 693 | 62 694 | 63 695 | 64 696 | 65 697 | 66 698 | 67 699 | 68 700 | 69 701 | 70 702 | 71 703 | 72 704 | 73 705 | 74 706 | 75 707 | 76 708 | 77 709 | 78 710 | 79 711 | 80 712 | 81 713 | 82 714 | 83 715 | 84 716 | 85 717 | 86 718 | 87 719 | 88 720 | 89 721 | 0 722 | 1 723 | 2 724 | 3 725 | 4 726 | 5 727 
| 6 728 | 7 729 | 8 730 | 9 731 | 10 732 | 11 733 | 12 734 | 13 735 | 14 736 | 15 737 | 16 738 | 17 739 | 18 740 | 19 741 | 20 742 | 21 743 | 22 744 | 23 745 | 24 746 | 25 747 | 26 748 | 27 749 | 28 750 | 29 751 | 30 752 | 31 753 | 32 754 | 33 755 | 34 756 | 35 757 | 36 758 | 37 759 | 38 760 | 39 761 | 40 762 | 41 763 | 42 764 | 43 765 | 44 766 | 45 767 | 46 768 | 47 769 | 48 770 | 49 771 | 50 772 | 51 773 | 52 774 | 53 775 | 54 776 | 55 777 | 56 778 | 57 779 | 58 780 | 59 781 | 60 782 | 61 783 | 62 784 | 63 785 | 64 786 | 65 787 | 66 788 | 67 789 | 68 790 | 69 791 | 70 792 | 71 793 | 72 794 | 73 795 | 74 796 | 75 797 | 76 798 | 77 799 | 78 800 | 79 801 | 80 802 | 81 803 | 82 804 | 83 805 | 84 806 | 85 807 | 86 808 | 87 809 | 88 810 | 89 811 | 0 812 | 1 813 | 2 814 | 3 815 | 4 816 | 5 817 | 6 818 | 7 819 | 8 820 | 9 821 | 10 822 | 11 823 | 12 824 | 13 825 | 14 826 | 15 827 | 16 828 | 17 829 | 18 830 | 19 831 | 20 832 | 21 833 | 22 834 | 23 835 | 24 836 | 25 837 | 26 838 | 27 839 | 28 840 | 29 841 | 30 842 | 31 843 | 32 844 | 33 845 | 34 846 | 35 847 | 36 848 | 37 849 | 38 850 | 39 851 | 40 852 | 41 853 | 42 854 | 43 855 | 44 856 | 45 857 | 46 858 | 47 859 | 48 860 | 49 861 | 50 862 | 51 863 | 52 864 | 53 865 | 54 866 | 55 867 | 56 868 | 57 869 | 58 870 | 59 871 | 60 872 | 61 873 | 62 874 | 63 875 | 64 876 | 65 877 | 66 878 | 67 879 | 68 880 | 69 881 | 70 882 | 71 883 | 72 884 | 73 885 | 74 886 | 75 887 | 76 888 | 77 889 | 78 890 | 79 891 | 80 892 | 81 893 | 82 894 | 83 895 | 84 896 | 85 897 | 86 898 | 87 899 | 88 900 | 89 901 | 0 902 | 1 903 | 2 904 | 3 905 | 4 906 | 5 907 | 6 908 | 7 909 | 8 910 | 9 911 | 10 912 | 11 913 | 12 914 | 13 915 | 14 916 | 15 917 | 16 918 | 17 919 | 18 920 | 19 921 | 20 922 | 21 923 | 22 924 | 23 925 | 24 926 | 25 927 | 26 928 | 27 929 | 28 930 | 29 931 | 30 932 | 31 933 | 32 934 | 33 935 | 34 936 | 35 937 | 36 938 | 37 939 | 38 940 | 39 941 | 40 942 | 41 943 | 42 944 | 43 945 | 44 946 | 45 947 | 46 948 | 47 949 | 48 950 | 49 951 | 50 952 | 51 953 | 52 954 | 53 955 | 54 956 | 55 957 | 56 958 | 57 959 | 58 960 | 59 961 | 60 962 | 61 963 | 62 964 | 63 965 | 64 966 | 65 967 | 66 968 | 67 969 | 68 970 | 69 971 | 70 972 | 71 973 | 72 974 | 73 975 | 74 976 | 75 977 | 76 978 | 77 979 | 78 980 | 79 981 | 80 982 | 81 983 | 82 984 | 83 985 | 84 986 | 85 987 | 86 988 | 87 989 | 88 990 | 89 991 | 0 992 | 1 993 | 2 994 | 3 995 | 4 996 | 5 997 | 6 998 | 7 999 | 8 1000 | 9 1001 | 10 1002 | 11 1003 | 12 1004 | 13 1005 | 14 1006 | 15 1007 | 16 1008 | 17 1009 | 18 1010 | 19 1011 | 20 1012 | 21 1013 | 22 1014 | 23 1015 | 24 1016 | 25 1017 | 26 1018 | 27 1019 | 28 1020 | 29 1021 | 30 1022 | 31 1023 | 32 1024 | 33 1025 | 34 1026 | 35 1027 | 36 1028 | 37 1029 | 38 1030 | 39 1031 | 40 1032 | 41 1033 | 42 1034 | 43 1035 | 44 1036 | 45 1037 | 46 1038 | 47 1039 | 48 1040 | 49 1041 | 50 1042 | 51 1043 | 52 1044 | 53 1045 | 54 1046 | 55 1047 | 56 1048 | 57 1049 | 58 1050 | 59 1051 | 60 1052 | 61 1053 | 62 1054 | 63 1055 | 64 1056 | 65 1057 | 66 1058 | 67 1059 | 68 1060 | 69 1061 | 70 1062 | 71 1063 | 72 1064 | 73 1065 | 74 1066 | 75 1067 | 76 1068 | 77 1069 | 78 1070 | 79 1071 | 80 1072 | 81 1073 | 82 1074 | 83 1075 | 84 1076 | 85 1077 | 86 1078 | 87 1079 | 88 1080 | 89 1081 | 0 1082 | 1 1083 | 2 1084 | 3 1085 | 4 1086 | 5 1087 | 6 1088 | 7 1089 | 8 1090 | 9 1091 | 10 1092 | 11 1093 | 12 1094 | 13 1095 | 14 1096 | 15 1097 | 16 1098 | 17 1099 | 18 1100 | 19 1101 | 20 1102 | 21 1103 | 22 1104 | 23 1105 | 24 1106 | 25 1107 | 26 1108 | 27 1109 | 28 1110 | 29 1111 | 30 1112 | 31 1113 | 32 1114 
| 33 1115 | 34 1116 | 35 1117 | 36 1118 | 37 1119 | 38 1120 | 39 1121 | 40 1122 | 41 1123 | 42 1124 | 43 1125 | 44 1126 | 45 1127 | 46 1128 | 47 1129 | 48 1130 | 49 1131 | 50 1132 | 51 1133 | 52 1134 | 53 1135 | 54 1136 | 55 1137 | 56 1138 | 57 1139 | 58 1140 | 59 1141 | 60 1142 | 61 1143 | 62 1144 | 63 1145 | 64 1146 | 65 1147 | 66 1148 | 67 1149 | 68 1150 | 69 1151 | 70 1152 | 71 1153 | 72 1154 | 73 1155 | 74 1156 | 75 1157 | 76 1158 | 77 1159 | 78 1160 | 79 1161 | 80 1162 | 81 1163 | 82 1164 | 83 1165 | 84 1166 | 85 1167 | 86 1168 | 87 1169 | 88 1170 | 89 1171 | 0 1172 | 1 1173 | 2 1174 | 3 1175 | 4 1176 | 5 1177 | 6 1178 | 7 1179 | 8 1180 | 9 1181 | 10 1182 | 11 1183 | 12 1184 | 13 1185 | 14 1186 | 15 1187 | 16 1188 | 17 1189 | 18 1190 | 19 1191 | 20 1192 | 21 1193 | 22 1194 | 23 1195 | 24 1196 | 25 1197 | 26 1198 | 27 1199 | 28 1200 | 29 1201 | 30 1202 | 31 1203 | 32 1204 | 33 1205 | 34 1206 | 35 1207 | 36 1208 | 37 1209 | 38 1210 | 39 1211 | 40 1212 | 41 1213 | 42 1214 | 43 1215 | 44 1216 | 45 1217 | 46 1218 | 47 1219 | 48 1220 | 49 1221 | 50 1222 | 51 1223 | 52 1224 | 53 1225 | 54 1226 | 55 1227 | 56 1228 | 57 1229 | 58 1230 | 59 1231 | 60 1232 | 61 1233 | 62 1234 | 63 1235 | 64 1236 | 65 1237 | 66 1238 | 67 1239 | 68 1240 | 69 1241 | 70 1242 | 71 1243 | 72 1244 | 73 1245 | 74 1246 | 75 1247 | 76 1248 | 77 1249 | 78 1250 | 79 1251 | 80 1252 | 81 1253 | 82 1254 | 83 1255 | 84 1256 | 85 1257 | 86 1258 | 87 1259 | 88 1260 | 89 1261 | 0 1262 | 1 1263 | 2 1264 | 3 1265 | 4 1266 | 5 1267 | 6 1268 | 7 1269 | 8 1270 | 9 1271 | 10 1272 | 11 1273 | 12 1274 | 13 1275 | 14 1276 | 15 1277 | 16 1278 | 17 1279 | 18 1280 | 19 1281 | 20 1282 | 21 1283 | 22 1284 | 23 1285 | 24 1286 | 25 1287 | 26 1288 | 27 1289 | 28 1290 | 29 1291 | 30 1292 | 31 1293 | 32 1294 | 33 1295 | 34 1296 | 35 1297 | 36 1298 | 37 1299 | 38 1300 | 39 1301 | 40 1302 | 41 1303 | 42 1304 | 43 1305 | 44 1306 | 45 1307 | 46 1308 | 47 1309 | 48 1310 | 49 1311 | 50 1312 | 51 1313 | 52 1314 | 53 1315 | 54 1316 | 55 1317 | 56 1318 | 57 1319 | 58 1320 | 59 1321 | 60 1322 | 61 1323 | 62 1324 | 63 1325 | 64 1326 | 65 1327 | 66 1328 | 67 1329 | 68 1330 | 69 1331 | 70 1332 | 71 1333 | 72 1334 | 73 1335 | 74 1336 | 75 1337 | 76 1338 | 77 1339 | 78 1340 | 79 1341 | 80 1342 | 81 1343 | 82 1344 | 83 1345 | 84 1346 | 85 1347 | 86 1348 | 87 1349 | 88 1350 | 89 1351 | 0 1352 | 1 1353 | 2 1354 | 3 1355 | 4 1356 | 5 1357 | 6 1358 | 7 1359 | 8 1360 | 9 1361 | 10 1362 | 11 1363 | 12 1364 | 13 1365 | 14 1366 | 15 1367 | 16 1368 | 17 1369 | 18 1370 | 19 1371 | 20 1372 | 21 1373 | 22 1374 | 23 1375 | 24 1376 | 25 1377 | 26 1378 | 27 1379 | 28 1380 | 29 1381 | 30 1382 | 31 1383 | 32 1384 | 33 1385 | 34 1386 | 35 1387 | 36 1388 | 37 1389 | 38 1390 | 39 1391 | 40 1392 | 41 1393 | 42 1394 | 43 1395 | 44 1396 | 45 1397 | 46 1398 | 47 1399 | 48 1400 | 49 1401 | 50 1402 | 51 1403 | 52 1404 | 53 1405 | 54 1406 | 55 1407 | 56 1408 | 57 1409 | 58 1410 | 59 1411 | 60 1412 | 61 1413 | 62 1414 | 63 1415 | 64 1416 | 65 1417 | 66 1418 | 67 1419 | 68 1420 | 69 1421 | 70 1422 | 71 1423 | 72 1424 | 73 1425 | 74 1426 | 75 1427 | 76 1428 | 77 1429 | 78 1430 | 79 1431 | 80 1432 | 81 1433 | 82 1434 | 83 1435 | 84 1436 | 85 1437 | 86 1438 | 87 1439 | 88 1440 | 89 1441 | 0 1442 | 1 1443 | 2 1444 | 3 1445 | 4 1446 | 5 1447 | 6 1448 | 7 1449 | 8 1450 | 9 1451 | 10 1452 | 11 1453 | 12 1454 | 13 1455 | 14 1456 | 15 1457 | 16 1458 | 17 1459 | 18 1460 | 19 1461 | 20 1462 | 21 1463 | 22 1464 | 23 1465 | 24 1466 | 25 1467 | 26 1468 | 27 1469 | 28 1470 | 29 1471 | 30 1472 | 31 1473 | 32 
1474 | 33 1475 | 34 1476 | 35 1477 | 36 1478 | 37 1479 | 38 1480 | 39 1481 | 40 1482 | 41 1483 | 42 1484 | 43 1485 | 44 1486 | 45 1487 | 46 1488 | 47 1489 | 48 1490 | 49 1491 | 50 1492 | 51 1493 | 52 1494 | 53 1495 | 54 1496 | 55 1497 | 56 1498 | 57 1499 | 58 1500 | 59 1501 | 60 1502 | 61 1503 | 62 1504 | 63 1505 | 64 1506 | 65 1507 | 66 1508 | 67 1509 | 68 1510 | 69 1511 | 70 1512 | 71 1513 | 72 1514 | 73 1515 | 74 1516 | 75 1517 | 76 1518 | 77 1519 | 78 1520 | 79 1521 | 80 1522 | 81 1523 | 82 1524 | 83 1525 | 84 1526 | 85 1527 | 86 1528 | 87 1529 | 88 1530 | 89 1531 | 0 1532 | 1 1533 | 2 1534 | 3 1535 | 4 1536 | 5 1537 | 6 1538 | 7 1539 | 8 1540 | 9 1541 | 10 1542 | 11 1543 | 12 1544 | 13 1545 | 14 1546 | 15 1547 | 16 1548 | 17 1549 | 18 1550 | 19 1551 | 20 1552 | 21 1553 | 22 1554 | 23 1555 | 24 1556 | 25 1557 | 26 1558 | 27 1559 | 28 1560 | 29 1561 | 30 1562 | 31 1563 | 32 1564 | 33 1565 | 34 1566 | 35 1567 | 36 1568 | 37 1569 | 38 1570 | 39 1571 | 40 1572 | 41 1573 | 42 1574 | 43 1575 | 44 1576 | 45 1577 | 46 1578 | 47 1579 | 48 1580 | 49 1581 | 50 1582 | 51 1583 | 52 1584 | 53 1585 | 54 1586 | 55 1587 | 56 1588 | 57 1589 | 58 1590 | 59 1591 | 60 1592 | 61 1593 | 62 1594 | 63 1595 | 64 1596 | 65 1597 | 66 1598 | 67 1599 | 68 1600 | 69 1601 | 70 1602 | 71 1603 | 72 1604 | 73 1605 | 74 1606 | 75 1607 | 76 1608 | 77 1609 | 78 1610 | 79 1611 | 80 1612 | 81 1613 | 82 1614 | 83 1615 | 84 1616 | 85 1617 | 86 1618 | 87 1619 | 88 1620 | 89 1621 | 0 1622 | 1 1623 | 2 1624 | 3 1625 | 4 1626 | 5 1627 | 6 1628 | 7 1629 | 8 1630 | 9 1631 | 10 1632 | 11 1633 | 12 1634 | 13 1635 | 14 1636 | 15 1637 | 16 1638 | 17 1639 | 18 1640 | 19 1641 | 20 1642 | 21 1643 | 22 1644 | 23 1645 | 24 1646 | 25 1647 | 26 1648 | 27 1649 | 28 1650 | 29 1651 | 30 1652 | 31 1653 | 32 1654 | 33 1655 | 34 1656 | 35 1657 | 36 1658 | 37 1659 | 38 1660 | 39 1661 | 40 1662 | 41 1663 | 42 1664 | 43 1665 | 44 1666 | 45 1667 | 46 1668 | 47 1669 | 48 1670 | 49 1671 | 50 1672 | 51 1673 | 52 1674 | 53 1675 | 54 1676 | 55 1677 | 56 1678 | 57 1679 | 58 1680 | 59 1681 | 60 1682 | 61 1683 | 62 1684 | 63 1685 | 64 1686 | 65 1687 | 66 1688 | 67 1689 | 68 1690 | 69 1691 | 70 1692 | 71 1693 | 72 1694 | 73 1695 | 74 1696 | 75 1697 | 76 1698 | 77 1699 | 78 1700 | 79 1701 | 80 1702 | 81 1703 | 82 1704 | 83 1705 | 84 1706 | 85 1707 | 86 1708 | 87 1709 | 88 1710 | 89 1711 | 0 1712 | 1 1713 | 2 1714 | 3 1715 | 4 1716 | 5 1717 | 6 1718 | 7 1719 | 8 1720 | 9 1721 | 10 1722 | 11 1723 | 12 1724 | 13 1725 | 14 1726 | 15 1727 | 16 1728 | 17 1729 | 18 1730 | 19 1731 | 20 1732 | 21 1733 | 22 1734 | 23 1735 | 24 1736 | 25 1737 | 26 1738 | 27 1739 | 28 1740 | 29 1741 | 30 1742 | 31 1743 | 32 1744 | 33 1745 | 34 1746 | 35 1747 | 36 1748 | 37 1749 | 38 1750 | 39 1751 | 40 1752 | 41 1753 | 42 1754 | 43 1755 | 44 1756 | 45 1757 | 46 1758 | 47 1759 | 48 1760 | 49 1761 | 50 1762 | 51 1763 | 52 1764 | 53 1765 | 54 1766 | 55 1767 | 56 1768 | 57 1769 | 58 1770 | 59 1771 | 60 1772 | 61 1773 | 62 1774 | 63 1775 | 64 1776 | 65 1777 | 66 1778 | 67 1779 | 68 1780 | 69 1781 | 70 1782 | 71 1783 | 72 1784 | 73 1785 | 74 1786 | 75 1787 | 76 1788 | 77 1789 | 78 1790 | 79 1791 | 80 1792 | 81 1793 | 82 1794 | 83 1795 | 84 1796 | 85 1797 | 86 1798 | 87 1799 | 88 1800 | 89 1801 | 0 1802 | 1 1803 | 2 1804 | 3 1805 | 4 1806 | 5 1807 | 6 1808 | 7 1809 | 8 1810 | 9 1811 | 10 1812 | 11 1813 | 12 1814 | 13 1815 | 14 1816 | 15 1817 | 16 1818 | 17 1819 | 18 1820 | 19 1821 | 20 1822 | 21 1823 | 22 1824 | 23 1825 | 24 1826 | 25 1827 | 26 1828 | 27 1829 | 28 1830 | 29 1831 | 30 1832 | 31 1833 
| 32 1834 | 33 1835 | 34 1836 | 35 1837 | 36 1838 | 37 1839 | 38 1840 | 39 1841 | 40 1842 | 41 1843 | 42 1844 | 43 1845 | 44 1846 | 45 1847 | 46 1848 | 47 1849 | 48 1850 | 49 1851 | 50 1852 | 51 1853 | 52 1854 | 53 1855 | 54 1856 | 55 1857 | 56 1858 | 57 1859 | 58 1860 | 59 1861 | 60 1862 | 61 1863 | 62 1864 | 63 1865 | 64 1866 | 65 1867 | 66 1868 | 67 1869 | 68 1870 | 69 1871 | 70 1872 | 71 1873 | 72 1874 | 73 1875 | 74 1876 | 75 1877 | 76 1878 | 77 1879 | 78 1880 | 79 1881 | 80 1882 | 81 1883 | 82 1884 | 83 1885 | 84 1886 | 85 1887 | 86 1888 | 87 1889 | 88 1890 | 89 1891 | 0 1892 | 1 1893 | 2 1894 | 3 1895 | 4 1896 | 5 1897 | 6 1898 | 7 1899 | 8 1900 | 9 1901 | 10 1902 | 11 1903 | 12 1904 | 13 1905 | 14 1906 | 15 1907 | 16 1908 | 17 1909 | 18 1910 | 19 1911 | 20 1912 | 21 1913 | 22 1914 | 23 1915 | 24 1916 | 25 1917 | 26 1918 | 27 1919 | 28 1920 | 29 1921 | 30 1922 | 31 1923 | 32 1924 | 33 1925 | 34 1926 | 35 1927 | 36 1928 | 37 1929 | 38 1930 | 39 1931 | 40 1932 | 41 1933 | 42 1934 | 43 1935 | 44 1936 | 45 1937 | 46 1938 | 47 1939 | 48 1940 | 49 1941 | 50 1942 | 51 1943 | 52 1944 | 53 1945 | 54 1946 | 55 1947 | 56 1948 | 57 1949 | 58 1950 | 59 1951 | 60 1952 | 61 1953 | 62 1954 | 63 1955 | 64 1956 | 65 1957 | 66 1958 | 67 1959 | 68 1960 | 69 1961 | 70 1962 | 71 1963 | 72 1964 | 73 1965 | 74 1966 | 75 1967 | 76 1968 | 77 1969 | 78 1970 | 79 1971 | 80 1972 | 81 1973 | 82 1974 | 83 1975 | 84 1976 | 85 1977 | 86 1978 | 87 1979 | 88 1980 | 89 1981 | 0 1982 | 1 1983 | 2 1984 | 3 1985 | 4 1986 | 5 1987 | 6 1988 | 7 1989 | 8 1990 | 9 1991 | 10 1992 | 11 1993 | 12 1994 | 13 1995 | 14 1996 | 15 1997 | 16 1998 | 17 1999 | 18 2000 | 19 2001 | 20 2002 | 21 2003 | 22 2004 | 23 2005 | 24 2006 | 25 2007 | 26 2008 | 27 2009 | 28 2010 | 29 2011 | 30 2012 | 31 2013 | 32 2014 | 33 2015 | 34 2016 | 35 2017 | 36 2018 | 37 2019 | 38 2020 | 39 2021 | 40 2022 | 41 2023 | 42 2024 | 43 2025 | 44 2026 | 45 2027 | 46 2028 | 47 2029 | 48 2030 | 49 2031 | 50 2032 | 51 2033 | 52 2034 | 53 2035 | 54 2036 | 55 2037 | 56 2038 | 57 2039 | 58 2040 | 59 2041 | 60 2042 | 61 2043 | 62 2044 | 63 2045 | 64 2046 | 65 2047 | 66 2048 | 67 2049 | 68 2050 | 69 2051 | 70 2052 | 71 2053 | 72 2054 | 73 2055 | 74 2056 | 75 2057 | 76 2058 | 77 2059 | 78 2060 | 79 2061 | 80 2062 | 81 2063 | 82 2064 | 83 2065 | 84 2066 | 85 2067 | 86 2068 | 87 2069 | 88 2070 | 89 2071 | 0 2072 | 1 2073 | 2 2074 | 3 2075 | 4 2076 | 5 2077 | 6 2078 | 7 2079 | 8 2080 | 9 2081 | 10 2082 | 11 2083 | 12 2084 | 13 2085 | 14 2086 | 15 2087 | 16 2088 | 17 2089 | 18 2090 | 19 2091 | 20 2092 | 21 2093 | 22 2094 | 23 2095 | 24 2096 | 25 2097 | 26 2098 | 27 2099 | 28 2100 | 29 2101 | 30 2102 | 31 2103 | 32 2104 | 33 2105 | 34 2106 | 35 2107 | 36 2108 | 37 2109 | 38 2110 | 39 2111 | 40 2112 | 41 2113 | 42 2114 | 43 2115 | 44 2116 | 45 2117 | 46 2118 | 47 2119 | 48 2120 | 49 2121 | 50 2122 | 51 2123 | 52 2124 | 53 2125 | 54 2126 | 55 2127 | 56 2128 | 57 2129 | 58 2130 | 59 2131 | 60 2132 | 61 2133 | 62 2134 | 63 2135 | 64 2136 | 65 2137 | 66 2138 | 67 2139 | 68 2140 | 69 2141 | 70 2142 | 71 2143 | 72 2144 | 73 2145 | 74 2146 | 75 2147 | 76 2148 | 77 2149 | 78 2150 | 79 2151 | 80 2152 | 81 2153 | 82 2154 | 83 2155 | 84 2156 | 85 2157 | 86 2158 | 87 2159 | 88 2160 | 89 2161 | 0 2162 | 1 2163 | 2 2164 | 3 2165 | 4 2166 | 5 2167 | 6 2168 | 7 2169 | 8 2170 | 9 2171 | 10 2172 | 11 2173 | 12 2174 | 13 2175 | 14 2176 | 15 2177 | 16 2178 | 17 2179 | 18 2180 | 19 2181 | 20 2182 | 21 2183 | 22 2184 | 23 2185 | 24 2186 | 25 2187 | 26 2188 | 27 2189 | 28 2190 | 29 2191 | 30 2192 | 31 
2193 | 32 2194 | 33 2195 | 34 2196 | 35 2197 | 36 2198 | 37 2199 | 38 2200 | 39 2201 | 40 2202 | 41 2203 | 42 2204 | 43 2205 | 44 2206 | 45 2207 | 46 2208 | 47 2209 | 48 2210 | 49 2211 | 50 2212 | 51 2213 | 52 2214 | 53 2215 | 54 2216 | 55 2217 | 56 2218 | 57 2219 | 58 2220 | 59 2221 | 60 2222 | 61 2223 | 62 2224 | 63 2225 | 64 2226 | 65 2227 | 66 2228 | 67 2229 | 68 2230 | 69 2231 | 70 2232 | 71 2233 | 72 2234 | 73 2235 | 74 2236 | 75 2237 | 76 2238 | 77 2239 | 78 2240 | 79 2241 | 80 2242 | 81 2243 | 82 2244 | 83 2245 | 84 2246 | 85 2247 | 86 2248 | 87 2249 | 88 2250 | 89 2251 | 0 2252 | 1 2253 | 2 2254 | 3 2255 | 4 2256 | 5 2257 | 6 2258 | 7 2259 | 8 2260 | 9 2261 | 10 2262 | 11 2263 | 12 2264 | 13 2265 | 14 2266 | 15 2267 | 16 2268 | 17 2269 | 18 2270 | 19 2271 | 20 2272 | 21 2273 | 22 2274 | 23 2275 | 24 2276 | 25 2277 | 26 2278 | 27 2279 | 28 2280 | 29 2281 | 30 2282 | 31 2283 | 32 2284 | 33 2285 | 34 2286 | 35 2287 | 36 2288 | 37 2289 | 38 2290 | 39 2291 | 40 2292 | 41 2293 | 42 2294 | 43 2295 | 44 2296 | 45 2297 | 46 2298 | 47 2299 | 48 2300 | 49 2301 | 50 2302 | 51 2303 | 52 2304 | 53 2305 | 54 2306 | 55 2307 | 56 2308 | 57 2309 | 58 2310 | 59 2311 | 60 2312 | 61 2313 | 62 2314 | 63 2315 | 64 2316 | 65 2317 | 66 2318 | 67 2319 | 68 2320 | 69 2321 | 70 2322 | 71 2323 | 72 2324 | 73 2325 | 74 2326 | 75 2327 | 76 2328 | 77 2329 | 78 2330 | 79 2331 | 80 2332 | 81 2333 | 82 2334 | 83 2335 | 84 2336 | 85 2337 | 86 2338 | 87 2339 | 88 2340 | 89 2341 | 0 2342 | 1 2343 | 2 2344 | 3 2345 | 4 2346 | 5 2347 | 6 2348 | 7 2349 | 8 2350 | 9 2351 | 10 2352 | 11 2353 | 12 2354 | 13 2355 | 14 2356 | 15 2357 | 16 2358 | 17 2359 | 18 2360 | 19 2361 | 20 2362 | 21 2363 | 22 2364 | 23 2365 | 24 2366 | 25 2367 | 26 2368 | 27 2369 | 28 2370 | 29 2371 | 30 2372 | 31 2373 | 32 2374 | 33 2375 | 34 2376 | 35 2377 | 36 2378 | 37 2379 | 38 2380 | 39 2381 | 40 2382 | 41 2383 | 42 2384 | 43 2385 | 44 2386 | 45 2387 | 46 2388 | 47 2389 | 48 2390 | 49 2391 | 50 2392 | 51 2393 | 52 2394 | 53 2395 | 54 2396 | 55 2397 | 56 2398 | 57 2399 | 58 2400 | 59 2401 | 60 2402 | 61 2403 | 62 2404 | 63 2405 | 64 2406 | 65 2407 | 66 2408 | 67 2409 | 68 2410 | 69 2411 | 70 2412 | 71 2413 | 72 2414 | 73 2415 | 74 2416 | 75 2417 | 76 2418 | 77 2419 | 78 2420 | 79 2421 | 80 2422 | 81 2423 | 82 2424 | 83 2425 | 84 2426 | 85 2427 | 86 2428 | 87 2429 | 88 2430 | 89 2431 | 0 2432 | 1 2433 | 2 2434 | 3 2435 | 4 2436 | 5 2437 | 6 2438 | 7 2439 | 8 2440 | 9 2441 | 10 2442 | 11 2443 | 12 2444 | 13 2445 | 14 2446 | 15 2447 | 16 2448 | 17 2449 | 18 2450 | 19 2451 | 20 2452 | 21 2453 | 22 2454 | 23 2455 | 24 2456 | 25 2457 | 26 2458 | 27 2459 | 28 2460 | 29 2461 | 30 2462 | 31 2463 | 32 2464 | 33 2465 | 34 2466 | 35 2467 | 36 2468 | 37 2469 | 38 2470 | 39 2471 | 40 2472 | 41 2473 | 42 2474 | 43 2475 | 44 2476 | 45 2477 | 46 2478 | 47 2479 | 48 2480 | 49 2481 | 50 2482 | 51 2483 | 52 2484 | 53 2485 | 54 2486 | 55 2487 | 56 2488 | 57 2489 | 58 2490 | 59 2491 | 60 2492 | 61 2493 | 62 2494 | 63 2495 | 64 2496 | 65 2497 | 66 2498 | 67 2499 | 68 2500 | 69 2501 | 70 2502 | 71 2503 | 72 2504 | 73 2505 | 74 2506 | 75 2507 | 76 2508 | 77 2509 | 78 2510 | 79 2511 | 80 2512 | 81 2513 | 82 2514 | 83 2515 | 84 2516 | 85 2517 | 86 2518 | 87 2519 | 88 2520 | 89 2521 | 0 2522 | 1 2523 | 2 2524 | 3 2525 | 4 2526 | 5 2527 | 6 2528 | 7 2529 | 8 2530 | 9 2531 | 10 2532 | 11 2533 | 12 2534 | 13 2535 | 14 2536 | 15 2537 | 16 2538 | 17 2539 | 18 2540 | 19 2541 | 20 2542 | 21 2543 | 22 2544 | 23 2545 | 24 2546 | 25 2547 | 26 2548 | 27 2549 | 28 2550 | 29 2551 | 30 2552 
[node-label listing truncated: the raw file continues with the same pattern, repeating the block of values 0–89 (90 entries, one per node/ROI) once for every remaining graph in the dataset]
| 22 9024 | 23 9025 | 24 9026 | 25 9027 | 26 9028 | 27 9029 | 28 9030 | 29 9031 | 30 9032 | 31 9033 | 32 9034 | 33 9035 | 34 9036 | 35 9037 | 36 9038 | 37 9039 | 38 9040 | 39 9041 | 40 9042 | 41 9043 | 42 9044 | 43 9045 | 44 9046 | 45 9047 | 46 9048 | 47 9049 | 48 9050 | 49 9051 | 50 9052 | 51 9053 | 52 9054 | 53 9055 | 54 9056 | 55 9057 | 56 9058 | 57 9059 | 58 9060 | 59 9061 | 60 9062 | 61 9063 | 62 9064 | 63 9065 | 64 9066 | 65 9067 | 66 9068 | 67 9069 | 68 9070 | 69 9071 | 70 9072 | 71 9073 | 72 9074 | 73 9075 | 74 9076 | 75 9077 | 76 9078 | 77 9079 | 78 9080 | 79 9081 | 80 9082 | 81 9083 | 82 9084 | 83 9085 | 84 9086 | 85 9087 | 86 9088 | 87 9089 | 88 9090 | 89 9091 | 0 9092 | 1 9093 | 2 9094 | 3 9095 | 4 9096 | 5 9097 | 6 9098 | 7 9099 | 8 9100 | 9 9101 | 10 9102 | 11 9103 | 12 9104 | 13 9105 | 14 9106 | 15 9107 | 16 9108 | 17 9109 | 18 9110 | 19 9111 | 20 9112 | 21 9113 | 22 9114 | 23 9115 | 24 9116 | 25 9117 | 26 9118 | 27 9119 | 28 9120 | 29 9121 | 30 9122 | 31 9123 | 32 9124 | 33 9125 | 34 9126 | 35 9127 | 36 9128 | 37 9129 | 38 9130 | 39 9131 | 40 9132 | 41 9133 | 42 9134 | 43 9135 | 44 9136 | 45 9137 | 46 9138 | 47 9139 | 48 9140 | 49 9141 | 50 9142 | 51 9143 | 52 9144 | 53 9145 | 54 9146 | 55 9147 | 56 9148 | 57 9149 | 58 9150 | 59 9151 | 60 9152 | 61 9153 | 62 9154 | 63 9155 | 64 9156 | 65 9157 | 66 9158 | 67 9159 | 68 9160 | 69 9161 | 70 9162 | 71 9163 | 72 9164 | 73 9165 | 74 9166 | 75 9167 | 76 9168 | 77 9169 | 78 9170 | 79 9171 | 80 9172 | 81 9173 | 82 9174 | 83 9175 | 84 9176 | 85 9177 | 86 9178 | 87 9179 | 88 9180 | 89 9181 | 0 9182 | 1 9183 | 2 9184 | 3 9185 | 4 9186 | 5 9187 | 6 9188 | 7 9189 | 8 9190 | 9 9191 | 10 9192 | 11 9193 | 12 9194 | 13 9195 | 14 9196 | 15 9197 | 16 9198 | 17 9199 | 18 9200 | 19 9201 | 20 9202 | 21 9203 | 22 9204 | 23 9205 | 24 9206 | 25 9207 | 26 9208 | 27 9209 | 28 9210 | 29 9211 | 30 9212 | 31 9213 | 32 9214 | 33 9215 | 34 9216 | 35 9217 | 36 9218 | 37 9219 | 38 9220 | 39 9221 | 40 9222 | 41 9223 | 42 9224 | 43 9225 | 44 9226 | 45 9227 | 46 9228 | 47 9229 | 48 9230 | 49 9231 | 50 9232 | 51 9233 | 52 9234 | 53 9235 | 54 9236 | 55 9237 | 56 9238 | 57 9239 | 58 9240 | 59 9241 | 60 9242 | 61 9243 | 62 9244 | 63 9245 | 64 9246 | 65 9247 | 66 9248 | 67 9249 | 68 9250 | 69 9251 | 70 9252 | 71 9253 | 72 9254 | 73 9255 | 74 9256 | 75 9257 | 76 9258 | 77 9259 | 78 9260 | 79 9261 | 80 9262 | 81 9263 | 82 9264 | 83 9265 | 84 9266 | 85 9267 | 86 9268 | 87 9269 | 88 9270 | 89 9271 | 0 9272 | 1 9273 | 2 9274 | 3 9275 | 4 9276 | 5 9277 | 6 9278 | 7 9279 | 8 9280 | 9 9281 | 10 9282 | 11 9283 | 12 9284 | 13 9285 | 14 9286 | 15 9287 | 16 9288 | 17 9289 | 18 9290 | 19 9291 | 20 9292 | 21 9293 | 22 9294 | 23 9295 | 24 9296 | 25 9297 | 26 9298 | 27 9299 | 28 9300 | 29 9301 | 30 9302 | 31 9303 | 32 9304 | 33 9305 | 34 9306 | 35 9307 | 36 9308 | 37 9309 | 38 9310 | 39 9311 | 40 9312 | 41 9313 | 42 9314 | 43 9315 | 44 9316 | 45 9317 | 46 9318 | 47 9319 | 48 9320 | 49 9321 | 50 9322 | 51 9323 | 52 9324 | 53 9325 | 54 9326 | 55 9327 | 56 9328 | 57 9329 | 58 9330 | 59 9331 | 60 9332 | 61 9333 | 62 9334 | 63 9335 | 64 9336 | 65 9337 | 66 9338 | 67 9339 | 68 9340 | 69 9341 | 70 9342 | 71 9343 | 72 9344 | 73 9345 | 74 9346 | 75 9347 | 76 9348 | 77 9349 | 78 9350 | 79 9351 | 80 9352 | 81 9353 | 82 9354 | 83 9355 | 84 9356 | 85 9357 | 86 9358 | 87 9359 | 88 9360 | 89 9361 | 0 9362 | 1 9363 | 2 9364 | 3 9365 | 4 9366 | 5 9367 | 6 9368 | 7 9369 | 8 9370 | 9 9371 | 10 9372 | 11 9373 | 12 9374 | 13 9375 | 14 9376 | 15 9377 | 16 9378 | 17 9379 | 18 9380 | 19 9381 | 20 9382 | 21 
9383 | 22 9384 | 23 9385 | 24 9386 | 25 9387 | 26 9388 | 27 9389 | 28 9390 | 29 9391 | 30 9392 | 31 9393 | 32 9394 | 33 9395 | 34 9396 | 35 9397 | 36 9398 | 37 9399 | 38 9400 | 39 9401 | 40 9402 | 41 9403 | 42 9404 | 43 9405 | 44 9406 | 45 9407 | 46 9408 | 47 9409 | 48 9410 | 49 9411 | 50 9412 | 51 9413 | 52 9414 | 53 9415 | 54 9416 | 55 9417 | 56 9418 | 57 9419 | 58 9420 | 59 9421 | 60 9422 | 61 9423 | 62 9424 | 63 9425 | 64 9426 | 65 9427 | 66 9428 | 67 9429 | 68 9430 | 69 9431 | 70 9432 | 71 9433 | 72 9434 | 73 9435 | 74 9436 | 75 9437 | 76 9438 | 77 9439 | 78 9440 | 79 9441 | 80 9442 | 81 9443 | 82 9444 | 83 9445 | 84 9446 | 85 9447 | 86 9448 | 87 9449 | 88 9450 | 89 9451 | 0 9452 | 1 9453 | 2 9454 | 3 9455 | 4 9456 | 5 9457 | 6 9458 | 7 9459 | 8 9460 | 9 9461 | 10 9462 | 11 9463 | 12 9464 | 13 9465 | 14 9466 | 15 9467 | 16 9468 | 17 9469 | 18 9470 | 19 9471 | 20 9472 | 21 9473 | 22 9474 | 23 9475 | 24 9476 | 25 9477 | 26 9478 | 27 9479 | 28 9480 | 29 9481 | 30 9482 | 31 9483 | 32 9484 | 33 9485 | 34 9486 | 35 9487 | 36 9488 | 37 9489 | 38 9490 | 39 9491 | 40 9492 | 41 9493 | 42 9494 | 43 9495 | 44 9496 | 45 9497 | 46 9498 | 47 9499 | 48 9500 | 49 9501 | 50 9502 | 51 9503 | 52 9504 | 53 9505 | 54 9506 | 55 9507 | 56 9508 | 57 9509 | 58 9510 | 59 9511 | 60 9512 | 61 9513 | 62 9514 | 63 9515 | 64 9516 | 65 9517 | 66 9518 | 67 9519 | 68 9520 | 69 9521 | 70 9522 | 71 9523 | 72 9524 | 73 9525 | 74 9526 | 75 9527 | 76 9528 | 77 9529 | 78 9530 | 79 9531 | 80 9532 | 81 9533 | 82 9534 | 83 9535 | 84 9536 | 85 9537 | 86 9538 | 87 9539 | 88 9540 | 89 9541 | 0 9542 | 1 9543 | 2 9544 | 3 9545 | 4 9546 | 5 9547 | 6 9548 | 7 9549 | 8 9550 | 9 9551 | 10 9552 | 11 9553 | 12 9554 | 13 9555 | 14 9556 | 15 9557 | 16 9558 | 17 9559 | 18 9560 | 19 9561 | 20 9562 | 21 9563 | 22 9564 | 23 9565 | 24 9566 | 25 9567 | 26 9568 | 27 9569 | 28 9570 | 29 9571 | 30 9572 | 31 9573 | 32 9574 | 33 9575 | 34 9576 | 35 9577 | 36 9578 | 37 9579 | 38 9580 | 39 9581 | 40 9582 | 41 9583 | 42 9584 | 43 9585 | 44 9586 | 45 9587 | 46 9588 | 47 9589 | 48 9590 | 49 9591 | 50 9592 | 51 9593 | 52 9594 | 53 9595 | 54 9596 | 55 9597 | 56 9598 | 57 9599 | 58 9600 | 59 9601 | 60 9602 | 61 9603 | 62 9604 | 63 9605 | 64 9606 | 65 9607 | 66 9608 | 67 9609 | 68 9610 | 69 9611 | 70 9612 | 71 9613 | 72 9614 | 73 9615 | 74 9616 | 75 9617 | 76 9618 | 77 9619 | 78 9620 | 79 9621 | 80 9622 | 81 9623 | 82 9624 | 83 9625 | 84 9626 | 85 9627 | 86 9628 | 87 9629 | 88 9630 | 89 9631 | 0 9632 | 1 9633 | 2 9634 | 3 9635 | 4 9636 | 5 9637 | 6 9638 | 7 9639 | 8 9640 | 9 9641 | 10 9642 | 11 9643 | 12 9644 | 13 9645 | 14 9646 | 15 9647 | 16 9648 | 17 9649 | 18 9650 | 19 9651 | 20 9652 | 21 9653 | 22 9654 | 23 9655 | 24 9656 | 25 9657 | 26 9658 | 27 9659 | 28 9660 | 29 9661 | 30 9662 | 31 9663 | 32 9664 | 33 9665 | 34 9666 | 35 9667 | 36 9668 | 37 9669 | 38 9670 | 39 9671 | 40 9672 | 41 9673 | 42 9674 | 43 9675 | 44 9676 | 45 9677 | 46 9678 | 47 9679 | 48 9680 | 49 9681 | 50 9682 | 51 9683 | 52 9684 | 53 9685 | 54 9686 | 55 9687 | 56 9688 | 57 9689 | 58 9690 | 59 9691 | 60 9692 | 61 9693 | 62 9694 | 63 9695 | 64 9696 | 65 9697 | 66 9698 | 67 9699 | 68 9700 | 69 9701 | 70 9702 | 71 9703 | 72 9704 | 73 9705 | 74 9706 | 75 9707 | 76 9708 | 77 9709 | 78 9710 | 79 9711 | 80 9712 | 81 9713 | 82 9714 | 83 9715 | 84 9716 | 85 9717 | 86 9718 | 87 9719 | 88 9720 | 89 9721 | 0 9722 | 1 9723 | 2 9724 | 3 9725 | 4 9726 | 5 9727 | 6 9728 | 7 9729 | 8 9730 | 9 9731 | 10 9732 | 11 9733 | 12 9734 | 13 9735 | 14 9736 | 15 9737 | 16 9738 | 17 9739 | 18 9740 | 19 9741 | 20 9742 
| 21 9743 | 22 9744 | 23 9745 | 24 9746 | 25 9747 | 26 9748 | 27 9749 | 28 9750 | 29 9751 | 30 9752 | 31 9753 | 32 9754 | 33 9755 | 34 9756 | 35 9757 | 36 9758 | 37 9759 | 38 9760 | 39 9761 | 40 9762 | 41 9763 | 42 9764 | 43 9765 | 44 9766 | 45 9767 | 46 9768 | 47 9769 | 48 9770 | 49 9771 | 50 9772 | 51 9773 | 52 9774 | 53 9775 | 54 9776 | 55 9777 | 56 9778 | 57 9779 | 58 9780 | 59 9781 | 60 9782 | 61 9783 | 62 9784 | 63 9785 | 64 9786 | 65 9787 | 66 9788 | 67 9789 | 68 9790 | 69 9791 | 70 9792 | 71 9793 | 72 9794 | 73 9795 | 74 9796 | 75 9797 | 76 9798 | 77 9799 | 78 9800 | 79 9801 | 80 9802 | 81 9803 | 82 9804 | 83 9805 | 84 9806 | 85 9807 | 86 9808 | 87 9809 | 88 9810 | 89 9811 | 0 9812 | 1 9813 | 2 9814 | 3 9815 | 4 9816 | 5 9817 | 6 9818 | 7 9819 | 8 9820 | 9 9821 | 10 9822 | 11 9823 | 12 9824 | 13 9825 | 14 9826 | 15 9827 | 16 9828 | 17 9829 | 18 9830 | 19 9831 | 20 9832 | 21 9833 | 22 9834 | 23 9835 | 24 9836 | 25 9837 | 26 9838 | 27 9839 | 28 9840 | 29 9841 | 30 9842 | 31 9843 | 32 9844 | 33 9845 | 34 9846 | 35 9847 | 36 9848 | 37 9849 | 38 9850 | 39 9851 | 40 9852 | 41 9853 | 42 9854 | 43 9855 | 44 9856 | 45 9857 | 46 9858 | 47 9859 | 48 9860 | 49 9861 | 50 9862 | 51 9863 | 52 9864 | 53 9865 | 54 9866 | 55 9867 | 56 9868 | 57 9869 | 58 9870 | 59 9871 | 60 9872 | 61 9873 | 62 9874 | 63 9875 | 64 9876 | 65 9877 | 66 9878 | 67 9879 | 68 9880 | 69 9881 | 70 9882 | 71 9883 | 72 9884 | 73 9885 | 74 9886 | 75 9887 | 76 9888 | 77 9889 | 78 9890 | 79 9891 | 80 9892 | 81 9893 | 82 9894 | 83 9895 | 84 9896 | 85 9897 | 86 9898 | 87 9899 | 88 9900 | 89 9901 | 0 9902 | 1 9903 | 2 9904 | 3 9905 | 4 9906 | 5 9907 | 6 9908 | 7 9909 | 8 9910 | 9 9911 | 10 9912 | 11 9913 | 12 9914 | 13 9915 | 14 9916 | 15 9917 | 16 9918 | 17 9919 | 18 9920 | 19 9921 | 20 9922 | 21 9923 | 22 9924 | 23 9925 | 24 9926 | 25 9927 | 26 9928 | 27 9929 | 28 9930 | 29 9931 | 30 9932 | 31 9933 | 32 9934 | 33 9935 | 34 9936 | 35 9937 | 36 9938 | 37 9939 | 38 9940 | 39 9941 | 40 9942 | 41 9943 | 42 9944 | 43 9945 | 44 9946 | 45 9947 | 46 9948 | 47 9949 | 48 9950 | 49 9951 | 50 9952 | 51 9953 | 52 9954 | 53 9955 | 54 9956 | 55 9957 | 56 9958 | 57 9959 | 58 9960 | 59 9961 | 60 9962 | 61 9963 | 62 9964 | 63 9965 | 64 9966 | 65 9967 | 66 9968 | 67 9969 | 68 9970 | 69 9971 | 70 9972 | 71 9973 | 72 9974 | 73 9975 | 74 9976 | 75 9977 | 76 9978 | 77 9979 | 78 9980 | 79 9981 | 80 9982 | 81 9983 | 82 9984 | 83 9985 | 84 9986 | 85 9987 | 86 9988 | 87 9989 | 88 9990 | 89 9991 | 0 9992 | 1 9993 | 2 9994 | 3 9995 | 4 9996 | 5 9997 | 6 9998 | 7 9999 | 8 10000 | 9 10001 | 10 10002 | 11 10003 | 12 10004 | 13 10005 | 14 10006 | 15 10007 | 16 10008 | 17 10009 | 18 10010 | 19 10011 | 20 10012 | 21 10013 | 22 10014 | 23 10015 | 24 10016 | 25 10017 | 26 10018 | 27 10019 | 28 10020 | 29 10021 | 30 10022 | 31 10023 | 32 10024 | 33 10025 | 34 10026 | 35 10027 | 36 10028 | 37 10029 | 38 10030 | 39 10031 | 40 10032 | 41 10033 | 42 10034 | 43 10035 | 44 10036 | 45 10037 | 46 10038 | 47 10039 | 48 10040 | 49 10041 | 50 10042 | 51 10043 | 52 10044 | 53 10045 | 54 10046 | 55 10047 | 56 10048 | 57 10049 | 58 10050 | 59 10051 | 60 10052 | 61 10053 | 62 10054 | 63 10055 | 64 10056 | 65 10057 | 66 10058 | 67 10059 | 68 10060 | 69 10061 | 70 10062 | 71 10063 | 72 10064 | 73 10065 | 74 10066 | 75 10067 | 76 10068 | 77 10069 | 78 10070 | 79 10071 | 80 10072 | 81 10073 | 82 10074 | 83 10075 | 84 10076 | 85 10077 | 86 10078 | 87 10079 | 88 10080 | 89 10081 | 0 10082 | 1 10083 | 2 10084 | 3 10085 | 4 10086 | 5 10087 | 6 10088 | 7 10089 | 8 10090 | 9 10091 | 10 10092 | 
11 10093 | 12 10094 | 13 10095 | 14 10096 | 15 10097 | 16 10098 | 17 10099 | 18 10100 | 19 10101 | 20 10102 | 21 10103 | 22 10104 | 23 10105 | 24 10106 | 25 10107 | 26 10108 | 27 10109 | 28 10110 | 29 10111 | 30 10112 | 31 10113 | 32 10114 | 33 10115 | 34 10116 | 35 10117 | 36 10118 | 37 10119 | 38 10120 | 39 10121 | 40 10122 | 41 10123 | 42 10124 | 43 10125 | 44 10126 | 45 10127 | 46 10128 | 47 10129 | 48 10130 | 49 10131 | 50 10132 | 51 10133 | 52 10134 | 53 10135 | 54 10136 | 55 10137 | 56 10138 | 57 10139 | 58 10140 | 59 10141 | 60 10142 | 61 10143 | 62 10144 | 63 10145 | 64 10146 | 65 10147 | 66 10148 | 67 10149 | 68 10150 | 69 10151 | 70 10152 | 71 10153 | 72 10154 | 73 10155 | 74 10156 | 75 10157 | 76 10158 | 77 10159 | 78 10160 | 79 10161 | 80 10162 | 81 10163 | 82 10164 | 83 10165 | 84 10166 | 85 10167 | 86 10168 | 87 10169 | 88 10170 | 89 10171 | 0 10172 | 1 10173 | 2 10174 | 3 10175 | 4 10176 | 5 10177 | 6 10178 | 7 10179 | 8 10180 | 9 10181 | 10 10182 | 11 10183 | 12 10184 | 13 10185 | 14 10186 | 15 10187 | 16 10188 | 17 10189 | 18 10190 | 19 10191 | 20 10192 | 21 10193 | 22 10194 | 23 10195 | 24 10196 | 25 10197 | 26 10198 | 27 10199 | 28 10200 | 29 10201 | 30 10202 | 31 10203 | 32 10204 | 33 10205 | 34 10206 | 35 10207 | 36 10208 | 37 10209 | 38 10210 | 39 10211 | 40 10212 | 41 10213 | 42 10214 | 43 10215 | 44 10216 | 45 10217 | 46 10218 | 47 10219 | 48 10220 | 49 10221 | 50 10222 | 51 10223 | 52 10224 | 53 10225 | 54 10226 | 55 10227 | 56 10228 | 57 10229 | 58 10230 | 59 10231 | 60 10232 | 61 10233 | 62 10234 | 63 10235 | 64 10236 | 65 10237 | 66 10238 | 67 10239 | 68 10240 | 69 10241 | 70 10242 | 71 10243 | 72 10244 | 73 10245 | 74 10246 | 75 10247 | 76 10248 | 77 10249 | 78 10250 | 79 10251 | 80 10252 | 81 10253 | 82 10254 | 83 10255 | 84 10256 | 85 10257 | 86 10258 | 87 10259 | 88 10260 | 89 10261 | 0 10262 | 1 10263 | 2 10264 | 3 10265 | 4 10266 | 5 10267 | 6 10268 | 7 10269 | 8 10270 | 9 10271 | 10 10272 | 11 10273 | 12 10274 | 13 10275 | 14 10276 | 15 10277 | 16 10278 | 17 10279 | 18 10280 | 19 10281 | 20 10282 | 21 10283 | 22 10284 | 23 10285 | 24 10286 | 25 10287 | 26 10288 | 27 10289 | 28 10290 | 29 10291 | 30 10292 | 31 10293 | 32 10294 | 33 10295 | 34 10296 | 35 10297 | 36 10298 | 37 10299 | 38 10300 | 39 10301 | 40 10302 | 41 10303 | 42 10304 | 43 10305 | 44 10306 | 45 10307 | 46 10308 | 47 10309 | 48 10310 | 49 10311 | 50 10312 | 51 10313 | 52 10314 | 53 10315 | 54 10316 | 55 10317 | 56 10318 | 57 10319 | 58 10320 | 59 10321 | 60 10322 | 61 10323 | 62 10324 | 63 10325 | 64 10326 | 65 10327 | 66 10328 | 67 10329 | 68 10330 | 69 10331 | 70 10332 | 71 10333 | 72 10334 | 73 10335 | 74 10336 | 75 10337 | 76 10338 | 77 10339 | 78 10340 | 79 10341 | 80 10342 | 81 10343 | 82 10344 | 83 10345 | 84 10346 | 85 10347 | 86 10348 | 87 10349 | 88 10350 | 89 10351 | 0 10352 | 1 10353 | 2 10354 | 3 10355 | 4 10356 | 5 10357 | 6 10358 | 7 10359 | 8 10360 | 9 10361 | 10 10362 | 11 10363 | 12 10364 | 13 10365 | 14 10366 | 15 10367 | 16 10368 | 17 10369 | 18 10370 | 19 10371 | 20 10372 | 21 10373 | 22 10374 | 23 10375 | 24 10376 | 25 10377 | 26 10378 | 27 10379 | 28 10380 | 29 10381 | 30 10382 | 31 10383 | 32 10384 | 33 10385 | 34 10386 | 35 10387 | 36 10388 | 37 10389 | 38 10390 | 39 10391 | 40 10392 | 41 10393 | 42 10394 | 43 10395 | 44 10396 | 45 10397 | 46 10398 | 47 10399 | 48 10400 | 49 10401 | 50 10402 | 51 10403 | 52 10404 | 53 10405 | 54 10406 | 55 10407 | 56 10408 | 57 10409 | 58 10410 | 59 10411 | 60 10412 | 61 10413 | 62 10414 | 63 10415 | 64 10416 | 65 10417 | 66 10418 
| 67 10419 | 68 10420 | 69 10421 | 70 10422 | 71 10423 | 72 10424 | 73 10425 | 74 10426 | 75 10427 | 76 10428 | 77 10429 | 78 10430 | 79 10431 | 80 10432 | 81 10433 | 82 10434 | 83 10435 | 84 10436 | 85 10437 | 86 10438 | 87 10439 | 88 10440 | 89 10441 | 0 10442 | 1 10443 | 2 10444 | 3 10445 | 4 10446 | 5 10447 | 6 10448 | 7 10449 | 8 10450 | 9 10451 | 10 10452 | 11 10453 | 12 10454 | 13 10455 | 14 10456 | 15 10457 | 16 10458 | 17 10459 | 18 10460 | 19 10461 | 20 10462 | 21 10463 | 22 10464 | 23 10465 | 24 10466 | 25 10467 | 26 10468 | 27 10469 | 28 10470 | 29 10471 | 30 10472 | 31 10473 | 32 10474 | 33 10475 | 34 10476 | 35 10477 | 36 10478 | 37 10479 | 38 10480 | 39 10481 | 40 10482 | 41 10483 | 42 10484 | 43 10485 | 44 10486 | 45 10487 | 46 10488 | 47 10489 | 48 10490 | 49 10491 | 50 10492 | 51 10493 | 52 10494 | 53 10495 | 54 10496 | 55 10497 | 56 10498 | 57 10499 | 58 10500 | 59 10501 | 60 10502 | 61 10503 | 62 10504 | 63 10505 | 64 10506 | 65 10507 | 66 10508 | 67 10509 | 68 10510 | 69 10511 | 70 10512 | 71 10513 | 72 10514 | 73 10515 | 74 10516 | 75 10517 | 76 10518 | 77 10519 | 78 10520 | 79 10521 | 80 10522 | 81 10523 | 82 10524 | 83 10525 | 84 10526 | 85 10527 | 86 10528 | 87 10529 | 88 10530 | 89 10531 | 0 10532 | 1 10533 | 2 10534 | 3 10535 | 4 10536 | 5 10537 | 6 10538 | 7 10539 | 8 10540 | 9 10541 | 10 10542 | 11 10543 | 12 10544 | 13 10545 | 14 10546 | 15 10547 | 16 10548 | 17 10549 | 18 10550 | 19 10551 | 20 10552 | 21 10553 | 22 10554 | 23 10555 | 24 10556 | 25 10557 | 26 10558 | 27 10559 | 28 10560 | 29 10561 | 30 10562 | 31 10563 | 32 10564 | 33 10565 | 34 10566 | 35 10567 | 36 10568 | 37 10569 | 38 10570 | 39 10571 | 40 10572 | 41 10573 | 42 10574 | 43 10575 | 44 10576 | 45 10577 | 46 10578 | 47 10579 | 48 10580 | 49 10581 | 50 10582 | 51 10583 | 52 10584 | 53 10585 | 54 10586 | 55 10587 | 56 10588 | 57 10589 | 58 10590 | 59 10591 | 60 10592 | 61 10593 | 62 10594 | 63 10595 | 64 10596 | 65 10597 | 66 10598 | 67 10599 | 68 10600 | 69 10601 | 70 10602 | 71 10603 | 72 10604 | 73 10605 | 74 10606 | 75 10607 | 76 10608 | 77 10609 | 78 10610 | 79 10611 | 80 10612 | 81 10613 | 82 10614 | 83 10615 | 84 10616 | 85 10617 | 86 10618 | 87 10619 | 88 10620 | 89 10621 | 0 10622 | 1 10623 | 2 10624 | 3 10625 | 4 10626 | 5 10627 | 6 10628 | 7 10629 | 8 10630 | 9 10631 | 10 10632 | 11 10633 | 12 10634 | 13 10635 | 14 10636 | 15 10637 | 16 10638 | 17 10639 | 18 10640 | 19 10641 | 20 10642 | 21 10643 | 22 10644 | 23 10645 | 24 10646 | 25 10647 | 26 10648 | 27 10649 | 28 10650 | 29 10651 | 30 10652 | 31 10653 | 32 10654 | 33 10655 | 34 10656 | 35 10657 | 36 10658 | 37 10659 | 38 10660 | 39 10661 | 40 10662 | 41 10663 | 42 10664 | 43 10665 | 44 10666 | 45 10667 | 46 10668 | 47 10669 | 48 10670 | 49 10671 | 50 10672 | 51 10673 | 52 10674 | 53 10675 | 54 10676 | 55 10677 | 56 10678 | 57 10679 | 58 10680 | 59 10681 | 60 10682 | 61 10683 | 62 10684 | 63 10685 | 64 10686 | 65 10687 | 66 10688 | 67 10689 | 68 10690 | 69 10691 | 70 10692 | 71 10693 | 72 10694 | 73 10695 | 74 10696 | 75 10697 | 76 10698 | 77 10699 | 78 10700 | 79 10701 | 80 10702 | 81 10703 | 82 10704 | 83 10705 | 84 10706 | 85 10707 | 86 10708 | 87 10709 | 88 10710 | 89 10711 | 0 10712 | 1 10713 | 2 10714 | 3 10715 | 4 10716 | 5 10717 | 6 10718 | 7 10719 | 8 10720 | 9 10721 | 10 10722 | 11 10723 | 12 10724 | 13 10725 | 14 10726 | 15 10727 | 16 10728 | 17 10729 | 18 10730 | 19 10731 | 20 10732 | 21 10733 | 22 10734 | 23 10735 | 24 10736 | 25 10737 | 26 10738 | 27 10739 | 28 10740 | 29 10741 | 30 10742 | 31 10743 | 32 10744 | 33 
10745 | 34 10746 | 35 10747 | 36 10748 | 37 10749 | 38 10750 | 39 10751 | 40 10752 | 41 10753 | 42 10754 | 43 10755 | 44 10756 | 45 10757 | 46 10758 | 47 10759 | 48 10760 | 49 10761 | 50 10762 | 51 10763 | 52 10764 | 53 10765 | 54 10766 | 55 10767 | 56 10768 | 57 10769 | 58 10770 | 59 10771 | 60 10772 | 61 10773 | 62 10774 | 63 10775 | 64 10776 | 65 10777 | 66 10778 | 67 10779 | 68 10780 | 69 10781 | 70 10782 | 71 10783 | 72 10784 | 73 10785 | 74 10786 | 75 10787 | 76 10788 | 77 10789 | 78 10790 | 79 10791 | 80 10792 | 81 10793 | 82 10794 | 83 10795 | 84 10796 | 85 10797 | 86 10798 | 87 10799 | 88 10800 | 89 10801 | 0 10802 | 1 10803 | 2 10804 | 3 10805 | 4 10806 | 5 10807 | 6 10808 | 7 10809 | 8 10810 | 9 10811 | 10 10812 | 11 10813 | 12 10814 | 13 10815 | 14 10816 | 15 10817 | 16 10818 | 17 10819 | 18 10820 | 19 10821 | 20 10822 | 21 10823 | 22 10824 | 23 10825 | 24 10826 | 25 10827 | 26 10828 | 27 10829 | 28 10830 | 29 10831 | 30 10832 | 31 10833 | 32 10834 | 33 10835 | 34 10836 | 35 10837 | 36 10838 | 37 10839 | 38 10840 | 39 10841 | 40 10842 | 41 10843 | 42 10844 | 43 10845 | 44 10846 | 45 10847 | 46 10848 | 47 10849 | 48 10850 | 49 10851 | 50 10852 | 51 10853 | 52 10854 | 53 10855 | 54 10856 | 55 10857 | 56 10858 | 57 10859 | 58 10860 | 59 10861 | 60 10862 | 61 10863 | 62 10864 | 63 10865 | 64 10866 | 65 10867 | 66 10868 | 67 10869 | 68 10870 | 69 10871 | 70 10872 | 71 10873 | 72 10874 | 73 10875 | 74 10876 | 75 10877 | 76 10878 | 77 10879 | 78 10880 | 79 10881 | 80 10882 | 81 10883 | 82 10884 | 83 10885 | 84 10886 | 85 10887 | 86 10888 | 87 10889 | 88 10890 | 89 10891 | --------------------------------------------------------------------------------
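For illustration only, here is a minimal sketch of how a node-label file in this TU-style raw format (one integer per line) could be read and sanity-checked. The file path and the assumption of 90 nodes per graph are hypothetical and not taken from the repository.

# Minimal illustrative sketch (assumptions: the path below and 90 nodes per graph).
NODES_PER_GRAPH = 90  # assumed number of brain regions (nodes) per graph

def read_node_labels(path="HGP-SL/data/ADNI/raw/ADNI_node_labels.txt"):
    """Read one integer label per line, skipping blank lines."""
    with open(path) as f:
        return [int(line) for line in f if line.strip()]

def check_label_cycle(labels, nodes_per_graph=NODES_PER_GRAPH):
    """Verify labels cycle 0..nodes_per_graph-1 and return the graph count."""
    if len(labels) % nodes_per_graph != 0:
        raise ValueError(f"{len(labels)} labels is not a multiple of {nodes_per_graph}")
    for i, label in enumerate(labels):
        if label != i % nodes_per_graph:
            raise ValueError(f"unexpected label {label} on line {i + 1}")
    return len(labels) // nodes_per_graph

if __name__ == "__main__":
    labels = read_node_labels()
    n_graphs = check_label_cycle(labels)
    print(f"{n_graphs} graphs with {NODES_PER_GRAPH} node labels each")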