├── model.jpg ├── data └── PROTEINS │ ├── processed │ ├── data.pt │ ├── pre_filter.pt │ └── pre_transform.pt │ └── raw │ ├── README.txt │ └── PROTEINS_graph_labels.txt ├── README.md ├── main.py └── LICENSE /model.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YingtongDou/GCNN/HEAD/model.jpg -------------------------------------------------------------------------------- /data/PROTEINS/processed/data.pt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YingtongDou/GCNN/HEAD/data/PROTEINS/processed/data.pt -------------------------------------------------------------------------------- /data/PROTEINS/processed/pre_filter.pt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YingtongDou/GCNN/HEAD/data/PROTEINS/processed/pre_filter.pt -------------------------------------------------------------------------------- /data/PROTEINS/processed/pre_transform.pt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YingtongDou/GCNN/HEAD/data/PROTEINS/processed/pre_transform.pt -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # GCNN 2 | 3 | **Update:** Please check out more GNN-based fake news detection models along with two real-world datasets in [A collection of GNN-based fake news detectors with fake news propagation datasets](https://github.com/safe-graph/GNN-FakeNews). 4 | 5 | The Pytorch-Geometric Implementation of GCNN model for fake news detection, the original model is proposed in the following paper: 6 | 7 | Paper: [Fake News Detection on Social Media using Geometric Deep Learning](https://arxiv.org/pdf/1902.06673.pdf) 8 | 9 |

10 |
11 | 12 | 13 | 14 |
15 |

16 | 17 | To run the model, you need to have at least Python 3.6 and install following packages: 18 | 19 | ```bash 20 | PyTorch 1.6.0 21 | PyTorch Geometric 1.6.1 22 | tqdm 23 | sklearn 24 | ``` 25 | -------------------------------------------------------------------------------- /data/PROTEINS/raw/README.txt: -------------------------------------------------------------------------------- 1 | README for dataset PROTEINS 2 | 3 | 4 | === Usage === 5 | 6 | This folder contains the following comma separated text files 7 | (replace DS by the name of the dataset): 8 | 9 | n = total number of nodes 10 | m = total number of edges 11 | N = number of graphs 12 | 13 | (1) DS_A.txt (m lines) 14 | sparse (block diagonal) adjacency matrix for all graphs, 15 | each line corresponds to (row, col) resp. (node_id, node_id) 16 | 17 | (2) DS_graph_indicator.txt (n lines) 18 | column vector of graph identifiers for all nodes of all graphs, 19 | the value in the i-th line is the graph_id of the node with node_id i 20 | 21 | (3) DS_graph_labels.txt (N lines) 22 | class labels for all graphs in the dataset, 23 | the value in the i-th line is the class label of the graph with graph_id i 24 | 25 | (4) DS_node_labels.txt (n lines) 26 | column vector of node labels, 27 | the value in the i-th line corresponds to the node with node_id i 28 | 29 | There are OPTIONAL files if the respective information is available: 30 | 31 | (5) DS_edge_labels.txt (m lines; same size as DS_A_sparse.txt) 32 | labels for the edges in DS_A_sparse.txt 33 | 34 | (6) DS_edge_attributes.txt (m lines; same size as DS_A.txt) 35 | attributes for the edges in DS_A.txt 36 | 37 | (7) DS_node_attributes.txt (n lines) 38 | matrix of node attributes, 39 | the comma seperated values in the i-th line is the attribute vector of the node with node_id i 40 | 41 | (8) DS_graph_attributes.txt (N lines) 42 | regression values for all graphs in the dataset, 43 | the value in the i-th line is the attribute of the graph with graph_id i 44 | 45 | 46 | === Previous Use of the Dataset === 47 | 48 | Feragen, A., Kasenburg, N., Petersen, J., de Bruijne, M., Borgwardt, K.M.: Scalable 49 | kernels for graphs with continuous attributes. In: C.J.C. Burges, L. Bottou, Z. Ghahra- 50 | mani, K.Q. Weinberger (eds.) NIPS, pp. 216-224 (2013) 51 | 52 | Neumann, M., Garnett R., Bauckhage Ch., Kersting K.: Propagation Kernels: Efficient Graph 53 | Kernels from Propagated Information. Under review at MLJ. 54 | 55 | 56 | === References === 57 | 58 | K. M. Borgwardt, C. S. Ong, S. Schoenauer, S. V. N. Vishwanathan, A. J. Smola, and H. P. 59 | Kriegel. Protein function prediction via graph kernels. Bioinformatics, 21(Suppl 1):i47–i56, 60 | Jun 2005. 61 | 62 | P. D. Dobson and A. J. Doig. Distinguishing enzyme structures from non-enzymes without 63 | alignments. J. Mol. Biol., 330(4):771–783, Jul 2003. 64 | 65 | 66 | -------------------------------------------------------------------------------- /main.py: -------------------------------------------------------------------------------- 1 | """ 2 | The Pytorch-Geometric Implementation of GCNN model from the following paper 3 | "Fake News Detection on Social Media using Geometric Deep Learning" 4 | https://arxiv.org/abs/1902.06673 5 | """ 6 | 7 | import argparse 8 | import time 9 | from tqdm import tqdm 10 | 11 | import torch 12 | import torch.nn.functional as F 13 | from torch.utils.data import random_split 14 | from torch_geometric.data import DataLoader 15 | from torch.nn import Linear 16 | from torch_geometric.nn import global_mean_pool, GATConv 17 | from torch_geometric.datasets import TUDataset 18 | 19 | from sklearn.metrics import f1_score, accuracy_score, recall_score, precision_score 20 | 21 | parser = argparse.ArgumentParser() 22 | 23 | # original model parameters 24 | parser.add_argument('--seed', type=int, default=777, help='random seed') 25 | parser.add_argument('--device', type=str, default='cuda:0', help='specify cuda devices') 26 | 27 | # hyper-parameters 28 | parser.add_argument('--dataset', type=str, default='PROTEINS', help='DD/PROTEINS/NCI1/NCI109/Mutagenicity/ENZYMES') 29 | parser.add_argument('--batch_size', type=int, default=128, help='batch size') 30 | parser.add_argument('--lr', type=float, default=0.001, help='learning rate') 31 | parser.add_argument('--weight_decay', type=float, default=0.01, help='weight decay') 32 | parser.add_argument('--nhid', type=int, default=128, help='hidden size') 33 | parser.add_argument('--epochs', type=int, default=100, help='maximum number of epochs') 34 | 35 | args = parser.parse_args() 36 | torch.manual_seed(args.seed) 37 | if torch.cuda.is_available(): 38 | torch.cuda.manual_seed(args.seed) 39 | 40 | dataset = TUDataset('data/', name=args.dataset, use_node_attr=True) 41 | 42 | args.num_classes = dataset.num_classes 43 | args.num_features = dataset.num_features 44 | 45 | print(args) 46 | 47 | num_training = int(len(dataset) * 0.6) 48 | num_val = int(len(dataset) * 0.1) 49 | num_test = len(dataset) - (num_training + num_val) 50 | training_set, validation_set, test_set = random_split(dataset, [num_training, num_val, num_test]) 51 | 52 | train_loader = DataLoader(training_set, batch_size=args.batch_size, shuffle=True) 53 | val_loader = DataLoader(validation_set, batch_size=args.batch_size, shuffle=False) 54 | test_loader = DataLoader(test_set, batch_size=args.batch_size, shuffle=False) 55 | 56 | 57 | class Net(torch.nn.Module): 58 | def __init__(self): 59 | super(Net, self).__init__() 60 | 61 | self.num_features = dataset.num_features 62 | self.nhid = args.nhid 63 | 64 | self.conv1 = GATConv(self.num_features, self.nhid * 2) 65 | self.conv2 = GATConv(self.nhid * 2, self.nhid * 2) 66 | 67 | self.fc1 = Linear(self.nhid * 2, self.nhid) 68 | self.fc2 = Linear(self.nhid, dataset.num_classes) 69 | 70 | def forward(self, x, edge_index, batch): 71 | 72 | x = F.selu(self.conv1(x, edge_index)) 73 | x = F.selu(self.conv2(x, edge_index)) 74 | x = F.selu(global_mean_pool(x, batch)) 75 | x = F.selu(self.fc1(x)) 76 | x = F.dropout(x, p=0.5, training=self.training) 77 | x = self.fc2(x) 78 | return F.log_softmax(x, dim=-1) 79 | 80 | 81 | def eval(log): 82 | 83 | accuracy, f1_macro, precision, recall = 0, 0, 0, 0 84 | 85 | prob_log, label_log = [], [] 86 | 87 | for batch in log: 88 | pred_y, y = batch[0].data.cpu().numpy().argmax(axis=1), batch[1].data.cpu().numpy().tolist() 89 | prob_log.extend(batch[0].data.cpu().numpy()[:, 1].tolist()) 90 | label_log.extend(y) 91 | 92 | accuracy += accuracy_score(y, pred_y) 93 | f1_macro += f1_score(y, pred_y, average='macro') 94 | precision += precision_score(y, pred_y, zero_division=0) 95 | recall += recall_score(y, pred_y, zero_division=0) 96 | 97 | return accuracy/len(log), f1_macro/len(log), precision/len(log), recall/len(log) 98 | 99 | 100 | def compute_test(loader): 101 | model.eval() 102 | loss_test = 0.0 103 | out_log = [] 104 | with torch.no_grad(): 105 | for data in loader: 106 | data = data.to(args.device) 107 | out = model(data.x, data.edge_index, data.batch) 108 | y = data.y 109 | out_log.append([F.softmax(out, dim=1), y]) 110 | loss_test += F.nll_loss(out, y).item() 111 | return eval(out_log), loss_test 112 | 113 | 114 | model = Net().to(args.device) 115 | model = model.to(args.device) 116 | optimizer = torch.optim.Adam(model.parameters(), lr=args.lr, weight_decay=args.weight_decay) 117 | 118 | 119 | if __name__ == '__main__': 120 | # Model training 121 | out_log = [] 122 | 123 | t = time.time() 124 | model.train() 125 | for epoch in tqdm(range(args.epochs)): 126 | loss_train = 0.0 127 | correct = 0 128 | for i, data in enumerate(train_loader): 129 | optimizer.zero_grad() 130 | data = data.to(args.device) 131 | out = model(data.x, data.edge_index, data.batch) 132 | y = data.y 133 | loss = F.nll_loss(out, y) 134 | loss.backward() 135 | optimizer.step() 136 | loss_train += loss.item() 137 | out_log.append([F.softmax(out, dim=1), y]) 138 | acc_train, _, _, recall_train = eval(out_log) 139 | [acc_val, _, _, recall_val], loss_val = compute_test(val_loader) 140 | print(f'loss_train: {loss_train:.4f}, acc_train: {acc_train:.4f},' 141 | f' recall_train: {recall_train:.4f}, loss_val: {loss_val:.4f},' 142 | f' acc_val: {acc_val:.4f}, recall_val: {recall_val:.4f}') 143 | 144 | [acc, f1_macro, precision, recall], test_loss = compute_test(test_loader) 145 | print(f'Test set results: acc: {acc:.4f}, f1_macro: {f1_macro:.4f}, ' 146 | f'precision: {precision:.4f}, recall: {recall:.4f}') 147 | -------------------------------------------------------------------------------- /data/PROTEINS/raw/PROTEINS_graph_labels.txt: -------------------------------------------------------------------------------- 1 | 1 2 | 1 3 | 1 4 | 1 5 | 1 6 | 1 7 | 1 8 | 1 9 | 1 10 | 1 11 | 1 12 | 1 13 | 1 14 | 1 15 | 1 16 | 1 17 | 1 18 | 1 19 | 1 20 | 1 21 | 1 22 | 1 23 | 1 24 | 1 25 | 1 26 | 1 27 | 1 28 | 1 29 | 1 30 | 1 31 | 1 32 | 1 33 | 1 34 | 1 35 | 1 36 | 1 37 | 1 38 | 1 39 | 1 40 | 1 41 | 1 42 | 1 43 | 1 44 | 1 45 | 1 46 | 1 47 | 1 48 | 1 49 | 1 50 | 1 51 | 1 52 | 1 53 | 1 54 | 1 55 | 1 56 | 1 57 | 1 58 | 1 59 | 1 60 | 1 61 | 1 62 | 1 63 | 1 64 | 1 65 | 1 66 | 1 67 | 1 68 | 1 69 | 1 70 | 1 71 | 1 72 | 1 73 | 1 74 | 1 75 | 1 76 | 1 77 | 1 78 | 1 79 | 1 80 | 1 81 | 1 82 | 1 83 | 1 84 | 1 85 | 1 86 | 1 87 | 1 88 | 1 89 | 1 90 | 1 91 | 1 92 | 1 93 | 1 94 | 1 95 | 1 96 | 1 97 | 1 98 | 1 99 | 1 100 | 1 101 | 1 102 | 1 103 | 1 104 | 1 105 | 1 106 | 1 107 | 1 108 | 1 109 | 1 110 | 1 111 | 1 112 | 1 113 | 1 114 | 1 115 | 1 116 | 1 117 | 1 118 | 1 119 | 1 120 | 1 121 | 1 122 | 1 123 | 1 124 | 1 125 | 1 126 | 1 127 | 1 128 | 1 129 | 1 130 | 1 131 | 1 132 | 1 133 | 1 134 | 1 135 | 1 136 | 1 137 | 1 138 | 1 139 | 1 140 | 1 141 | 1 142 | 1 143 | 1 144 | 1 145 | 1 146 | 1 147 | 1 148 | 1 149 | 1 150 | 1 151 | 1 152 | 1 153 | 1 154 | 1 155 | 1 156 | 1 157 | 1 158 | 1 159 | 1 160 | 1 161 | 1 162 | 1 163 | 1 164 | 1 165 | 1 166 | 1 167 | 1 168 | 1 169 | 1 170 | 1 171 | 1 172 | 1 173 | 1 174 | 1 175 | 1 176 | 1 177 | 1 178 | 1 179 | 1 180 | 1 181 | 1 182 | 1 183 | 1 184 | 1 185 | 1 186 | 1 187 | 1 188 | 1 189 | 1 190 | 1 191 | 1 192 | 1 193 | 1 194 | 1 195 | 1 196 | 1 197 | 1 198 | 1 199 | 1 200 | 1 201 | 1 202 | 1 203 | 1 204 | 1 205 | 1 206 | 1 207 | 1 208 | 1 209 | 1 210 | 1 211 | 1 212 | 1 213 | 1 214 | 1 215 | 1 216 | 1 217 | 1 218 | 1 219 | 1 220 | 1 221 | 1 222 | 1 223 | 1 224 | 1 225 | 1 226 | 1 227 | 1 228 | 1 229 | 1 230 | 1 231 | 1 232 | 1 233 | 1 234 | 1 235 | 1 236 | 1 237 | 1 238 | 1 239 | 1 240 | 1 241 | 1 242 | 1 243 | 1 244 | 1 245 | 1 246 | 1 247 | 1 248 | 1 249 | 1 250 | 1 251 | 1 252 | 1 253 | 1 254 | 1 255 | 1 256 | 1 257 | 1 258 | 1 259 | 1 260 | 1 261 | 1 262 | 1 263 | 1 264 | 1 265 | 1 266 | 1 267 | 1 268 | 1 269 | 1 270 | 1 271 | 1 272 | 1 273 | 1 274 | 1 275 | 1 276 | 1 277 | 1 278 | 1 279 | 1 280 | 1 281 | 1 282 | 1 283 | 1 284 | 1 285 | 1 286 | 1 287 | 1 288 | 1 289 | 1 290 | 1 291 | 1 292 | 1 293 | 1 294 | 1 295 | 1 296 | 1 297 | 1 298 | 1 299 | 1 300 | 1 301 | 1 302 | 1 303 | 1 304 | 1 305 | 1 306 | 1 307 | 1 308 | 1 309 | 1 310 | 1 311 | 1 312 | 1 313 | 1 314 | 1 315 | 1 316 | 1 317 | 1 318 | 1 319 | 1 320 | 1 321 | 1 322 | 1 323 | 1 324 | 1 325 | 1 326 | 1 327 | 1 328 | 1 329 | 1 330 | 1 331 | 1 332 | 1 333 | 1 334 | 1 335 | 1 336 | 1 337 | 1 338 | 1 339 | 1 340 | 1 341 | 1 342 | 1 343 | 1 344 | 1 345 | 1 346 | 1 347 | 1 348 | 1 349 | 1 350 | 1 351 | 1 352 | 1 353 | 1 354 | 1 355 | 1 356 | 1 357 | 1 358 | 1 359 | 1 360 | 1 361 | 1 362 | 1 363 | 1 364 | 1 365 | 1 366 | 1 367 | 1 368 | 1 369 | 1 370 | 1 371 | 1 372 | 1 373 | 1 374 | 1 375 | 1 376 | 1 377 | 1 378 | 1 379 | 1 380 | 1 381 | 1 382 | 1 383 | 1 384 | 1 385 | 1 386 | 1 387 | 1 388 | 1 389 | 1 390 | 1 391 | 1 392 | 1 393 | 1 394 | 1 395 | 1 396 | 1 397 | 1 398 | 1 399 | 1 400 | 1 401 | 1 402 | 1 403 | 1 404 | 1 405 | 1 406 | 1 407 | 1 408 | 1 409 | 1 410 | 1 411 | 1 412 | 1 413 | 1 414 | 1 415 | 1 416 | 1 417 | 1 418 | 1 419 | 1 420 | 1 421 | 1 422 | 1 423 | 1 424 | 1 425 | 1 426 | 1 427 | 1 428 | 1 429 | 1 430 | 1 431 | 1 432 | 1 433 | 1 434 | 1 435 | 1 436 | 1 437 | 1 438 | 1 439 | 1 440 | 1 441 | 1 442 | 1 443 | 1 444 | 1 445 | 1 446 | 1 447 | 1 448 | 1 449 | 1 450 | 1 451 | 1 452 | 1 453 | 1 454 | 1 455 | 1 456 | 1 457 | 1 458 | 1 459 | 1 460 | 1 461 | 1 462 | 1 463 | 1 464 | 1 465 | 1 466 | 1 467 | 1 468 | 1 469 | 1 470 | 1 471 | 1 472 | 1 473 | 1 474 | 1 475 | 1 476 | 1 477 | 1 478 | 1 479 | 1 480 | 1 481 | 1 482 | 1 483 | 1 484 | 1 485 | 1 486 | 1 487 | 1 488 | 1 489 | 1 490 | 1 491 | 1 492 | 1 493 | 1 494 | 1 495 | 1 496 | 1 497 | 1 498 | 1 499 | 1 500 | 1 501 | 1 502 | 1 503 | 1 504 | 1 505 | 1 506 | 1 507 | 1 508 | 1 509 | 1 510 | 1 511 | 1 512 | 1 513 | 1 514 | 1 515 | 1 516 | 1 517 | 1 518 | 1 519 | 1 520 | 1 521 | 1 522 | 1 523 | 1 524 | 1 525 | 1 526 | 1 527 | 1 528 | 1 529 | 1 530 | 1 531 | 1 532 | 1 533 | 1 534 | 1 535 | 1 536 | 1 537 | 1 538 | 1 539 | 1 540 | 1 541 | 1 542 | 1 543 | 1 544 | 1 545 | 1 546 | 1 547 | 1 548 | 1 549 | 1 550 | 1 551 | 1 552 | 1 553 | 1 554 | 1 555 | 1 556 | 1 557 | 1 558 | 1 559 | 1 560 | 1 561 | 1 562 | 1 563 | 1 564 | 1 565 | 1 566 | 1 567 | 1 568 | 1 569 | 1 570 | 1 571 | 1 572 | 1 573 | 1 574 | 1 575 | 1 576 | 1 577 | 1 578 | 1 579 | 1 580 | 1 581 | 1 582 | 1 583 | 1 584 | 1 585 | 1 586 | 1 587 | 1 588 | 1 589 | 1 590 | 1 591 | 1 592 | 1 593 | 1 594 | 1 595 | 1 596 | 1 597 | 1 598 | 1 599 | 1 600 | 1 601 | 1 602 | 1 603 | 1 604 | 1 605 | 1 606 | 1 607 | 1 608 | 1 609 | 1 610 | 1 611 | 1 612 | 1 613 | 1 614 | 1 615 | 1 616 | 1 617 | 1 618 | 1 619 | 1 620 | 1 621 | 1 622 | 1 623 | 1 624 | 1 625 | 1 626 | 1 627 | 1 628 | 1 629 | 1 630 | 1 631 | 1 632 | 1 633 | 1 634 | 1 635 | 1 636 | 1 637 | 1 638 | 1 639 | 1 640 | 1 641 | 1 642 | 1 643 | 1 644 | 1 645 | 1 646 | 1 647 | 1 648 | 1 649 | 1 650 | 1 651 | 1 652 | 1 653 | 1 654 | 1 655 | 1 656 | 1 657 | 1 658 | 1 659 | 1 660 | 1 661 | 1 662 | 1 663 | 1 664 | 2 665 | 2 666 | 2 667 | 2 668 | 2 669 | 2 670 | 2 671 | 2 672 | 2 673 | 2 674 | 2 675 | 2 676 | 2 677 | 2 678 | 2 679 | 2 680 | 2 681 | 2 682 | 2 683 | 2 684 | 2 685 | 2 686 | 2 687 | 2 688 | 2 689 | 2 690 | 2 691 | 2 692 | 2 693 | 2 694 | 2 695 | 2 696 | 2 697 | 2 698 | 2 699 | 2 700 | 2 701 | 2 702 | 2 703 | 2 704 | 2 705 | 2 706 | 2 707 | 2 708 | 2 709 | 2 710 | 2 711 | 2 712 | 2 713 | 2 714 | 2 715 | 2 716 | 2 717 | 2 718 | 2 719 | 2 720 | 2 721 | 2 722 | 2 723 | 2 724 | 2 725 | 2 726 | 2 727 | 2 728 | 2 729 | 2 730 | 2 731 | 2 732 | 2 733 | 2 734 | 2 735 | 2 736 | 2 737 | 2 738 | 2 739 | 2 740 | 2 741 | 2 742 | 2 743 | 2 744 | 2 745 | 2 746 | 2 747 | 2 748 | 2 749 | 2 750 | 2 751 | 2 752 | 2 753 | 2 754 | 2 755 | 2 756 | 2 757 | 2 758 | 2 759 | 2 760 | 2 761 | 2 762 | 2 763 | 2 764 | 2 765 | 2 766 | 2 767 | 2 768 | 2 769 | 2 770 | 2 771 | 2 772 | 2 773 | 2 774 | 2 775 | 2 776 | 2 777 | 2 778 | 2 779 | 2 780 | 2 781 | 2 782 | 2 783 | 2 784 | 2 785 | 2 786 | 2 787 | 2 788 | 2 789 | 2 790 | 2 791 | 2 792 | 2 793 | 2 794 | 2 795 | 2 796 | 2 797 | 2 798 | 2 799 | 2 800 | 2 801 | 2 802 | 2 803 | 2 804 | 2 805 | 2 806 | 2 807 | 2 808 | 2 809 | 2 810 | 2 811 | 2 812 | 2 813 | 2 814 | 2 815 | 2 816 | 2 817 | 2 818 | 2 819 | 2 820 | 2 821 | 2 822 | 2 823 | 2 824 | 2 825 | 2 826 | 2 827 | 2 828 | 2 829 | 2 830 | 2 831 | 2 832 | 2 833 | 2 834 | 2 835 | 2 836 | 2 837 | 2 838 | 2 839 | 2 840 | 2 841 | 2 842 | 2 843 | 2 844 | 2 845 | 2 846 | 2 847 | 2 848 | 2 849 | 2 850 | 2 851 | 2 852 | 2 853 | 2 854 | 2 855 | 2 856 | 2 857 | 2 858 | 2 859 | 2 860 | 2 861 | 2 862 | 2 863 | 2 864 | 2 865 | 2 866 | 2 867 | 2 868 | 2 869 | 2 870 | 2 871 | 2 872 | 2 873 | 2 874 | 2 875 | 2 876 | 2 877 | 2 878 | 2 879 | 2 880 | 2 881 | 2 882 | 2 883 | 2 884 | 2 885 | 2 886 | 2 887 | 2 888 | 2 889 | 2 890 | 2 891 | 2 892 | 2 893 | 2 894 | 2 895 | 2 896 | 2 897 | 2 898 | 2 899 | 2 900 | 2 901 | 2 902 | 2 903 | 2 904 | 2 905 | 2 906 | 2 907 | 2 908 | 2 909 | 2 910 | 2 911 | 2 912 | 2 913 | 2 914 | 2 915 | 2 916 | 2 917 | 2 918 | 2 919 | 2 920 | 2 921 | 2 922 | 2 923 | 2 924 | 2 925 | 2 926 | 2 927 | 2 928 | 2 929 | 2 930 | 2 931 | 2 932 | 2 933 | 2 934 | 2 935 | 2 936 | 2 937 | 2 938 | 2 939 | 2 940 | 2 941 | 2 942 | 2 943 | 2 944 | 2 945 | 2 946 | 2 947 | 2 948 | 2 949 | 2 950 | 2 951 | 2 952 | 2 953 | 2 954 | 2 955 | 2 956 | 2 957 | 2 958 | 2 959 | 2 960 | 2 961 | 2 962 | 2 963 | 2 964 | 2 965 | 2 966 | 2 967 | 2 968 | 2 969 | 2 970 | 2 971 | 2 972 | 2 973 | 2 974 | 2 975 | 2 976 | 2 977 | 2 978 | 2 979 | 2 980 | 2 981 | 2 982 | 2 983 | 2 984 | 2 985 | 2 986 | 2 987 | 2 988 | 2 989 | 2 990 | 2 991 | 2 992 | 2 993 | 2 994 | 2 995 | 2 996 | 2 997 | 2 998 | 2 999 | 2 1000 | 2 1001 | 2 1002 | 2 1003 | 2 1004 | 2 1005 | 2 1006 | 2 1007 | 2 1008 | 2 1009 | 2 1010 | 2 1011 | 2 1012 | 2 1013 | 2 1014 | 2 1015 | 2 1016 | 2 1017 | 2 1018 | 2 1019 | 2 1020 | 2 1021 | 2 1022 | 2 1023 | 2 1024 | 2 1025 | 2 1026 | 2 1027 | 2 1028 | 2 1029 | 2 1030 | 2 1031 | 2 1032 | 2 1033 | 2 1034 | 2 1035 | 2 1036 | 2 1037 | 2 1038 | 2 1039 | 2 1040 | 2 1041 | 2 1042 | 2 1043 | 2 1044 | 2 1045 | 2 1046 | 2 1047 | 2 1048 | 2 1049 | 2 1050 | 2 1051 | 2 1052 | 2 1053 | 2 1054 | 2 1055 | 2 1056 | 2 1057 | 2 1058 | 2 1059 | 2 1060 | 2 1061 | 2 1062 | 2 1063 | 2 1064 | 2 1065 | 2 1066 | 2 1067 | 2 1068 | 2 1069 | 2 1070 | 2 1071 | 2 1072 | 2 1073 | 2 1074 | 2 1075 | 2 1076 | 2 1077 | 2 1078 | 2 1079 | 2 1080 | 2 1081 | 2 1082 | 2 1083 | 2 1084 | 2 1085 | 2 1086 | 2 1087 | 2 1088 | 2 1089 | 2 1090 | 2 1091 | 2 1092 | 2 1093 | 2 1094 | 2 1095 | 2 1096 | 2 1097 | 2 1098 | 2 1099 | 2 1100 | 2 1101 | 2 1102 | 2 1103 | 2 1104 | 2 1105 | 2 1106 | 2 1107 | 2 1108 | 2 1109 | 2 1110 | 2 1111 | 2 1112 | 2 1113 | 2 1114 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | --------------------------------------------------------------------------------