├── .gitignore ├── datasets ├── zachary │ ├── GraphATest.txt │ ├── GraphATrain.txt │ ├── GraphALabels_cluster.txt │ ├── GraphALabels_infection.txt │ ├── karate.edgelist │ └── GraphA.txt ├── disease │ ├── Readme.txt │ ├── GraphATest.txt │ ├── GraphATrain.txt │ ├── GraphALabels_cluster.txt │ ├── GraphALabels_infection.txt │ └── Roles.txt ├── BP-2 │ ├── LabelCA.txt │ ├── LabelCB.txt │ ├── GraphCA.txt │ └── GraphCB.txt ├── SB-4 │ ├── GraphALabels_infection.txt │ └── GraphALabels_cluster.txt ├── SB-6 │ ├── GraphALabels_infection.txt │ └── GraphALabels_cluster.txt └── email │ ├── GraphATest.txt │ ├── GraphATrain.txt │ ├── GraphALabels_infection.txt │ └── email-Eu-core-department-labels.txt ├── requirements.txt ├── LICENSE ├── code ├── graph_utils.py ├── random_walks.py ├── updateP.py ├── data_generator.py ├── load_save_data.py ├── run_gtl.py ├── create_gtl_model.py └── train_gtl_model.py └── README.md /.gitignore: -------------------------------------------------------------------------------- 1 | .idea/** -------------------------------------------------------------------------------- /datasets/zachary/GraphATest.txt: -------------------------------------------------------------------------------- 1 | 4 2 | 22 3 | 33 4 | 31 5 | 8 6 | 10 7 | 15 8 | 17 9 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | pathlib 2 | numpy==1.17.2 3 | scipy==1.3.1 4 | scikit-learn==0.21.3 5 | networkx==2.3 6 | tensorflow==2.0.0 7 | keras==2.3.1 -------------------------------------------------------------------------------- /datasets/zachary/GraphATrain.txt: -------------------------------------------------------------------------------- 1 | 30 2 | 19 3 | 32 4 | 23 5 | 5 6 | 24 7 | 7 8 | 1 9 | 2 10 | 29 11 | 16 12 | 20 13 | 3 14 | 25 15 | 6 16 | 18 17 | 28 18 | 26 19 | 12 20 | 9 21 | 21 22 | 15 23 | 13 24 | 0 25 | 11 26 | 27 27 | 
-------------------------------------------------------------------------------- /datasets/zachary/GraphALabels_cluster.txt: -------------------------------------------------------------------------------- 1 | 0 1 2 | 0 1 3 | 0 1 4 | 0 1 5 | 0 1 6 | 0 1 7 | 0 1 8 | 0 1 9 | 0 1 10 | 1 0 11 | 0 1 12 | 0 1 13 | 0 1 14 | 0 1 15 | 1 0 16 | 1 0 17 | 0 1 18 | 0 1 19 | 1 0 20 | 0 1 21 | 1 0 22 | 0 1 23 | 1 0 24 | 1 0 25 | 1 0 26 | 1 0 27 | 1 0 28 | 1 0 29 | 1 0 30 | 1 0 31 | 1 0 32 | 1 0 33 | 1 0 34 | 1 0 35 | -------------------------------------------------------------------------------- /datasets/disease/Readme.txt: -------------------------------------------------------------------------------- 1 | In assent.txt, you will find the role of the corresponding ids. 2 | 3 | weightedEdgeList.txt has the following format: 4 | 5 | id1 id2 interaction_duration 6 | 7 | Where id1 and id2 refer to a unique identifier. Multiple interactions between the same ids mean that the motes were in close proximity multiple times during the day. 
8 | 9 | Questions, comments: salathe@psu.edu 10 | 11 | 12 | 13 | 14 | 15 | -------------------------------------------------------------------------------- /datasets/zachary/GraphALabels_infection.txt: -------------------------------------------------------------------------------- 1 | 0.6502 2 | 0.7408 3 | 0.7150 4 | 0.7105 5 | 0.6160 6 | 0.6377 7 | 0.6624 8 | 0.6949 9 | 0.6591 10 | 0.2753 11 | 0.6347 12 | 0.5180 13 | 0.5908 14 | 0.7133 15 | 0.2337 16 | 0.2287 17 | 0.2782 18 | 0.6163 19 | 0.2266 20 | 0.6503 21 | 0.2191 22 | 0.6083 23 | 0.2227 24 | 0.2784 25 | 0.2592 26 | 0.2581 27 | 0.2051 28 | 0.3119 29 | 0.3435 30 | 0.2614 31 | 0.3214 32 | 0.6355 33 | 0.4125 34 | 0.4403 35 | -------------------------------------------------------------------------------- /datasets/BP-2/LabelCA.txt: -------------------------------------------------------------------------------- 1 | 0 2 | 0 3 | 0 4 | 0 5 | 0 6 | 0 7 | 0 8 | 0 9 | 0 10 | 0 11 | 0 12 | 0 13 | 0 14 | 0 15 | 0 16 | 0 17 | 0 18 | 0 19 | 0 20 | 0 21 | 0 22 | 0 23 | 0 24 | 0 25 | 0 26 | 1 27 | 1 28 | 1 29 | 1 30 | 1 31 | 1 32 | 1 33 | 1 34 | 1 35 | 1 36 | 1 37 | 1 38 | 1 39 | 1 40 | 1 41 | 1 42 | 1 43 | 1 44 | 1 45 | 1 46 | 1 47 | 1 48 | 1 49 | 1 50 | 1 51 | 2 52 | 2 53 | 2 54 | 2 55 | 2 56 | 2 57 | 2 58 | 2 59 | 2 60 | 2 61 | 2 62 | 2 63 | 2 64 | 2 65 | 2 66 | 2 67 | 2 68 | 2 69 | 2 70 | 2 71 | 2 72 | 2 73 | 2 74 | 2 75 | 2 76 | 3 77 | 3 78 | 3 79 | 3 80 | 3 81 | 3 82 | 3 83 | 3 84 | 3 85 | 3 86 | 3 87 | 3 88 | 3 89 | 3 90 | 3 91 | 3 92 | 3 93 | 3 94 | 3 95 | 3 96 | 3 97 | 3 98 | 3 99 | 3 100 | 3 101 | -------------------------------------------------------------------------------- /datasets/BP-2/LabelCB.txt: -------------------------------------------------------------------------------- 1 | 0 2 | 0 3 | 1 4 | 0 5 | 1 6 | 1 7 | 1 8 | 3 9 | 3 10 | 0 11 | 2 12 | 3 13 | 2 14 | 0 15 | 3 16 | 0 17 | 0 18 | 1 19 | 3 20 | 1 21 | 3 22 | 0 23 | 0 24 | 2 25 | 3 26 | 1 27 | 1 28 | 0 29 | 1 30 | 1 31 | 0 32 | 1 33 | 0 34 | 3 35 | 
0 36 | 1 37 | 0 38 | 1 39 | 2 40 | 2 41 | 2 42 | 2 43 | 1 44 | 2 45 | 3 46 | 2 47 | 0 48 | 1 49 | 3 50 | 1 51 | 3 52 | 0 53 | 1 54 | 0 55 | 3 56 | 1 57 | 3 58 | 3 59 | 1 60 | 1 61 | 2 62 | 2 63 | 0 64 | 3 65 | 1 66 | 2 67 | 3 68 | 2 69 | 1 70 | 2 71 | 3 72 | 3 73 | 3 74 | 2 75 | 1 76 | 1 77 | 2 78 | 1 79 | 3 80 | 2 81 | 0 82 | 3 83 | 2 84 | 3 85 | 0 86 | 2 87 | 3 88 | 0 89 | 2 90 | 3 91 | 0 92 | 0 93 | 0 94 | 2 95 | 3 96 | 2 97 | 2 98 | 2 99 | 2 100 | 0 101 | -------------------------------------------------------------------------------- /datasets/zachary/karate.edgelist: -------------------------------------------------------------------------------- 1 | 1 32 2 | 1 22 3 | 1 20 4 | 1 18 5 | 1 14 6 | 1 13 7 | 1 12 8 | 1 11 9 | 1 9 10 | 1 8 11 | 1 7 12 | 1 6 13 | 1 5 14 | 1 4 15 | 1 3 16 | 1 2 17 | 2 31 18 | 2 22 19 | 2 20 20 | 2 18 21 | 2 14 22 | 2 8 23 | 2 4 24 | 2 3 25 | 3 14 26 | 3 9 27 | 3 10 28 | 3 33 29 | 3 29 30 | 3 28 31 | 3 8 32 | 3 4 33 | 4 14 34 | 4 13 35 | 4 8 36 | 5 11 37 | 5 7 38 | 6 17 39 | 6 11 40 | 6 7 41 | 7 17 42 | 9 34 43 | 9 33 44 | 9 33 45 | 10 34 46 | 14 34 47 | 15 34 48 | 15 33 49 | 16 34 50 | 16 33 51 | 19 34 52 | 19 33 53 | 20 34 54 | 21 34 55 | 21 33 56 | 23 34 57 | 23 33 58 | 24 30 59 | 24 34 60 | 24 33 61 | 24 28 62 | 24 26 63 | 25 32 64 | 25 28 65 | 25 26 66 | 26 32 67 | 27 34 68 | 27 30 69 | 28 34 70 | 29 34 71 | 29 32 72 | 30 34 73 | 30 33 74 | 31 34 75 | 31 33 76 | 32 34 77 | 32 33 78 | 33 34 -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2021 NEU SPIRAL 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or 
sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /datasets/SB-4/GraphALabels_infection.txt: -------------------------------------------------------------------------------- 1 | 0.4618 2 | 0.4436 3 | 0.4408 4 | 0.4462 5 | 0.4027 6 | 0.4589 7 | 0.7202 8 | 0.3734 9 | 0.4577 10 | 0.4480 11 | 0.4302 12 | 0.4695 13 | 0.4715 14 | 0.4298 15 | 0.7200 16 | 0.4607 17 | 0.4568 18 | 0.4625 19 | 0.7238 20 | 0.4292 21 | 0.4293 22 | 0.4768 23 | 0.4258 24 | 0.4542 25 | 0.4475 26 | 0.4274 27 | 0.3754 28 | 0.4070 29 | 0.4047 30 | 0.4082 31 | 0.4482 32 | 0.4032 33 | 0.4304 34 | 0.4453 35 | 0.7237 36 | 0.4067 37 | 0.3728 38 | 0.4703 39 | 0.4017 40 | 0.4495 41 | 0.4305 42 | 0.4471 43 | 0.4315 44 | 0.4481 45 | 0.3748 46 | 0.4442 47 | 0.4493 48 | 0.4447 49 | 0.4577 50 | 0.4292 51 | 0.7366 52 | 0.4892 53 | 0.4987 54 | 0.7380 55 | 0.7520 56 | 0.7343 57 | 0.7493 58 | 0.7478 59 | 0.7538 60 | 0.7635 61 | 0.7467 62 | 0.7420 63 | 0.4987 64 | 0.7473 65 | 0.7417 66 | 0.7497 67 | 0.4978 68 | 0.7545 69 | 0.7563 70 | 0.4967 71 | 0.7658 72 | 0.4783 73 | 0.7435 74 | 0.7370 75 | 0.7480 76 | 0.4592 77 | 0.4578 78 | 0.4454 79 | 0.4586 80 | 0.4435 81 | 0.4039 82 | 0.7071 83 | 0.4454 84 | 0.3777 85 | 
0.4565 86 | 0.4610 87 | 0.4067 88 | 0.4500 89 | 0.6990 90 | 0.4378 91 | 0.4668 92 | 0.4474 93 | 0.6854 94 | 0.4268 95 | 0.7005 96 | 0.4415 97 | 0.4230 98 | 0.4277 99 | 0.4424 100 | 0.3992 101 | -------------------------------------------------------------------------------- /datasets/SB-4/GraphALabels_cluster.txt: -------------------------------------------------------------------------------- 1 | 1 0 0 0 2 | 1 0 0 0 3 | 1 0 0 0 4 | 1 0 0 0 5 | 1 0 0 0 6 | 1 0 0 0 7 | 1 0 0 0 8 | 1 0 0 0 9 | 1 0 0 0 10 | 1 0 0 0 11 | 1 0 0 0 12 | 1 0 0 0 13 | 1 0 0 0 14 | 1 0 0 0 15 | 1 0 0 0 16 | 1 0 0 0 17 | 1 0 0 0 18 | 1 0 0 0 19 | 1 0 0 0 20 | 1 0 0 0 21 | 1 0 0 0 22 | 1 0 0 0 23 | 1 0 0 0 24 | 1 0 0 0 25 | 1 0 0 0 26 | 0 1 0 0 27 | 0 1 0 0 28 | 0 1 0 0 29 | 0 1 0 0 30 | 0 1 0 0 31 | 0 1 0 0 32 | 0 1 0 0 33 | 0 1 0 0 34 | 0 1 0 0 35 | 0 1 0 0 36 | 0 1 0 0 37 | 0 1 0 0 38 | 0 1 0 0 39 | 0 1 0 0 40 | 0 1 0 0 41 | 0 1 0 0 42 | 0 1 0 0 43 | 0 1 0 0 44 | 0 1 0 0 45 | 0 1 0 0 46 | 0 1 0 0 47 | 0 1 0 0 48 | 0 1 0 0 49 | 0 1 0 0 50 | 0 1 0 0 51 | 0 0 1 0 52 | 0 0 1 0 53 | 0 0 1 0 54 | 0 0 1 0 55 | 0 0 1 0 56 | 0 0 1 0 57 | 0 0 1 0 58 | 0 0 1 0 59 | 0 0 1 0 60 | 0 0 1 0 61 | 0 0 1 0 62 | 0 0 1 0 63 | 0 0 1 0 64 | 0 0 1 0 65 | 0 0 1 0 66 | 0 0 1 0 67 | 0 0 1 0 68 | 0 0 1 0 69 | 0 0 1 0 70 | 0 0 1 0 71 | 0 0 1 0 72 | 0 0 1 0 73 | 0 0 1 0 74 | 0 0 1 0 75 | 0 0 1 0 76 | 0 0 0 1 77 | 0 0 0 1 78 | 0 0 0 1 79 | 0 0 0 1 80 | 0 0 0 1 81 | 0 0 0 1 82 | 0 0 0 1 83 | 0 0 0 1 84 | 0 0 0 1 85 | 0 0 0 1 86 | 0 0 0 1 87 | 0 0 0 1 88 | 0 0 0 1 89 | 0 0 0 1 90 | 0 0 0 1 91 | 0 0 0 1 92 | 0 0 0 1 93 | 0 0 0 1 94 | 0 0 0 1 95 | 0 0 0 1 96 | 0 0 0 1 97 | 0 0 0 1 98 | 0 0 0 1 99 | 0 0 0 1 100 | 0 0 0 1 101 | -------------------------------------------------------------------------------- /datasets/disease/GraphATest.txt: -------------------------------------------------------------------------------- 1 | 273 2 | 125 3 | 5 4 | 775 5 | 509 6 | 692 7 | 267 8 | 770 9 | 55 10 | 340 11 | 194 12 | 248 13 | 408 14 
| 532 15 | 20 16 | 527 17 | 320 18 | 10 19 | 720 20 | 117 21 | 726 22 | 467 23 | 663 24 | 778 25 | 513 26 | 412 27 | 570 28 | 558 29 | 227 30 | 225 31 | 711 32 | 7 33 | 280 34 | 188 35 | 378 36 | 439 37 | 611 38 | 376 39 | 233 40 | 561 41 | 219 42 | 79 43 | 303 44 | 88 45 | 708 46 | 416 47 | 406 48 | 256 49 | 3 50 | 563 51 | 600 52 | 171 53 | 473 54 | 568 55 | 392 56 | 18 57 | 269 58 | 330 59 | 765 60 | 4 61 | 399 62 | 334 63 | 249 64 | 307 65 | 567 66 | 15 67 | 506 68 | 697 69 | 441 70 | 525 71 | 155 72 | 177 73 | 86 74 | 146 75 | 94 76 | 739 77 | 562 78 | 72 79 | 308 80 | 413 81 | 234 82 | 110 83 | 624 84 | 523 85 | 47 86 | 474 87 | 471 88 | 731 89 | 442 90 | 67 91 | 126 92 | 732 93 | 774 94 | 51 95 | 755 96 | 168 97 | 247 98 | 423 99 | 375 100 | 582 101 | 299 102 | 583 103 | 74 104 | 369 105 | 93 106 | 16 107 | 293 108 | 520 109 | 152 110 | 543 111 | 649 112 | 667 113 | 46 114 | 753 115 | 76 116 | 665 117 | 475 118 | 329 119 | 201 120 | 500 121 | 122 122 | 492 123 | 136 124 | 355 125 | 580 126 | 222 127 | 572 128 | 772 129 | 359 130 | 662 131 | 414 132 | 537 133 | 601 134 | 421 135 | 302 136 | 451 137 | 515 138 | 223 139 | 180 140 | 71 141 | 466 142 | 284 143 | 630 144 | 612 145 | 85 146 | 674 147 | 538 148 | 638 149 | 784 150 | 574 151 | 35 152 | 352 153 | 190 154 | 279 155 | 266 156 | 83 157 | 571 158 | 671 159 | -------------------------------------------------------------------------------- /datasets/SB-6/GraphALabels_infection.txt: -------------------------------------------------------------------------------- 1 | 0.3101 2 | 0.3597 3 | 0.3618 4 | 0.3633 5 | 0.3194 6 | 0.3086 7 | 0.3056 8 | 0.2970 9 | 0.3089 10 | 0.3179 11 | 0.3635 12 | 0.3206 13 | 0.3046 14 | 0.3632 15 | 0.2886 16 | 0.3168 17 | 0.3586 18 | 0.3618 19 | 0.3575 20 | 0.3075 21 | 0.4040 22 | 0.4633 23 | 0.7180 24 | 0.4827 25 | 0.4622 26 | 0.7227 27 | 0.4431 28 | 0.4449 29 | 0.4068 30 | 0.4338 31 | 0.7299 32 | 0.4313 33 | 0.7365 34 | 0.4908 35 | 0.4566 36 | 0.4614 37 | 0.7164 38 | 0.4466 39 | 
0.4705 40 | 0.4462 41 | 0.7605 42 | 0.7532 43 | 0.7382 44 | 0.7510 45 | 0.7573 46 | 0.7568 47 | 0.7537 48 | 0.7405 49 | 0.7335 50 | 0.7392 51 | 0.7478 52 | 0.7497 53 | 0.7597 54 | 0.7515 55 | 0.7537 56 | 0.4975 57 | 0.4957 58 | 0.7553 59 | 0.7563 60 | 0.4953 61 | 0.6816 62 | 0.4282 63 | 0.4448 64 | 0.3729 65 | 0.4327 66 | 0.4487 67 | 0.4290 68 | 0.4317 69 | 0.4298 70 | 0.4613 71 | 0.4499 72 | 0.4018 73 | 0.4497 74 | 0.4057 75 | 0.4074 76 | 0.4616 77 | 0.3773 78 | 0.4446 79 | 0.6917 80 | 0.3784 81 | 0.4321 82 | 0.4318 83 | 0.4664 84 | 0.4310 85 | 0.4052 86 | 0.4260 87 | 0.7219 88 | 0.4300 89 | 0.4611 90 | 0.4614 91 | 0.4323 92 | 0.7169 93 | 0.4482 94 | 0.4091 95 | 0.4302 96 | 0.4502 97 | 0.4507 98 | 0.4348 99 | 0.4691 100 | 0.4118 101 | 0.3023 102 | 0.2905 103 | 0.3001 104 | 0.2989 105 | 0.2980 106 | 0.2805 107 | 0.2585 108 | 0.2981 109 | 0.3347 110 | 0.3069 111 | 0.3111 112 | 0.2845 113 | 0.2706 114 | 0.3327 115 | 0.2850 116 | 0.2988 117 | 0.2718 118 | 0.2956 119 | 0.3101 120 | 0.2570 121 | -------------------------------------------------------------------------------- /datasets/email/GraphATest.txt: -------------------------------------------------------------------------------- 1 | 71 2 | 639 3 | 979 4 | 535 5 | 283 6 | 439 7 | 854 8 | 856 9 | 88 10 | 595 11 | 191 12 | 763 13 | 953 14 | 894 15 | 696 16 | 729 17 | 364 18 | 225 19 | 778 20 | 263 21 | 456 22 | 329 23 | 42 24 | 984 25 | 884 26 | 859 27 | 203 28 | 389 29 | 727 30 | 976 31 | 578 32 | 770 33 | 607 34 | 560 35 | 143 36 | 127 37 | 244 38 | 935 39 | 294 40 | 480 41 | 732 42 | 804 43 | 881 44 | 349 45 | 354 46 | 544 47 | 472 48 | 749 49 | 950 50 | 516 51 | 738 52 | 629 53 | 502 54 | 331 55 | 533 56 | 288 57 | 340 58 | 943 59 | 682 60 | 719 61 | 33 62 | 139 63 | 179 64 | 195 65 | 568 66 | 441 67 | 622 68 | 700 69 | 956 70 | 526 71 | 291 72 | 632 73 | 934 74 | 553 75 | 1 76 | 197 77 | 481 78 | 540 79 | 672 80 | 608 81 | 338 82 | 831 83 | 867 84 | 189 85 | 339 86 | 192 87 | 408 88 | 913 89 | 234 90 | 503 91 | 
821 92 | 826 93 | 663 94 | 63 95 | 969 96 | 886 97 | 201 98 | 514 99 | 154 100 | 757 101 | 820 102 | 61 103 | 245 104 | 767 105 | 811 106 | 538 107 | 110 108 | 73 109 | 486 110 | 125 111 | 745 112 | 97 113 | 785 114 | 394 115 | 736 116 | 484 117 | 392 118 | 579 119 | 735 120 | 62 121 | 430 122 | 530 123 | 446 124 | 401 125 | 242 126 | 22 127 | 664 128 | 601 129 | 973 130 | 10 131 | 164 132 | 334 133 | 936 134 | 444 135 | 782 136 | 651 137 | 844 138 | 796 139 | 876 140 | 489 141 | 713 142 | 908 143 | 557 144 | 212 145 | 223 146 | 981 147 | 298 148 | 404 149 | 103 150 | 967 151 | 243 152 | 687 153 | 790 154 | 693 155 | 755 156 | 818 157 | 918 158 | 873 159 | 163 160 | 4 161 | 563 162 | 271 163 | 78 164 | 692 165 | 31 166 | 153 167 | 403 168 | 768 169 | 83 170 | 620 171 | 483 172 | 47 173 | 528 174 | 440 175 | 320 176 | 206 177 | 51 178 | 369 179 | 175 180 | 776 181 | 314 182 | 781 183 | 221 184 | 927 185 | 449 186 | 800 187 | 707 188 | 107 189 | 470 190 | 488 191 | 278 192 | 471 193 | 701 194 | 12 195 | 951 196 | 968 197 | 877 198 | 635 199 | 198 200 | 368 201 | 32 202 | 44 203 | 609 204 | 272 205 | 421 206 | 959 207 | -------------------------------------------------------------------------------- /datasets/SB-6/GraphALabels_cluster.txt: -------------------------------------------------------------------------------- 1 | 1 0 0 0 0 0 2 | 1 0 0 0 0 0 3 | 1 0 0 0 0 0 4 | 1 0 0 0 0 0 5 | 1 0 0 0 0 0 6 | 1 0 0 0 0 0 7 | 1 0 0 0 0 0 8 | 1 0 0 0 0 0 9 | 1 0 0 0 0 0 10 | 1 0 0 0 0 0 11 | 1 0 0 0 0 0 12 | 1 0 0 0 0 0 13 | 1 0 0 0 0 0 14 | 1 0 0 0 0 0 15 | 1 0 0 0 0 0 16 | 1 0 0 0 0 0 17 | 1 0 0 0 0 0 18 | 1 0 0 0 0 0 19 | 1 0 0 0 0 0 20 | 1 0 0 0 0 0 21 | 0 1 0 0 0 0 22 | 0 1 0 0 0 0 23 | 0 1 0 0 0 0 24 | 0 1 0 0 0 0 25 | 0 1 0 0 0 0 26 | 0 1 0 0 0 0 27 | 0 1 0 0 0 0 28 | 0 1 0 0 0 0 29 | 0 1 0 0 0 0 30 | 0 1 0 0 0 0 31 | 0 1 0 0 0 0 32 | 0 1 0 0 0 0 33 | 0 1 0 0 0 0 34 | 0 1 0 0 0 0 35 | 0 1 0 0 0 0 36 | 0 1 0 0 0 0 37 | 0 1 0 0 0 0 38 | 0 1 0 0 0 0 39 | 0 1 0 0 0 0 40 | 0 
1 0 0 0 0 41 | 0 0 1 0 0 0 42 | 0 0 1 0 0 0 43 | 0 0 1 0 0 0 44 | 0 0 1 0 0 0 45 | 0 0 1 0 0 0 46 | 0 0 1 0 0 0 47 | 0 0 1 0 0 0 48 | 0 0 1 0 0 0 49 | 0 0 1 0 0 0 50 | 0 0 1 0 0 0 51 | 0 0 1 0 0 0 52 | 0 0 1 0 0 0 53 | 0 0 1 0 0 0 54 | 0 0 1 0 0 0 55 | 0 0 1 0 0 0 56 | 0 0 1 0 0 0 57 | 0 0 1 0 0 0 58 | 0 0 1 0 0 0 59 | 0 0 1 0 0 0 60 | 0 0 1 0 0 0 61 | 0 0 0 1 0 0 62 | 0 0 0 1 0 0 63 | 0 0 0 1 0 0 64 | 0 0 0 1 0 0 65 | 0 0 0 1 0 0 66 | 0 0 0 1 0 0 67 | 0 0 0 1 0 0 68 | 0 0 0 1 0 0 69 | 0 0 0 1 0 0 70 | 0 0 0 1 0 0 71 | 0 0 0 1 0 0 72 | 0 0 0 1 0 0 73 | 0 0 0 1 0 0 74 | 0 0 0 1 0 0 75 | 0 0 0 1 0 0 76 | 0 0 0 1 0 0 77 | 0 0 0 1 0 0 78 | 0 0 0 1 0 0 79 | 0 0 0 1 0 0 80 | 0 0 0 1 0 0 81 | 0 0 0 0 1 0 82 | 0 0 0 0 1 0 83 | 0 0 0 0 1 0 84 | 0 0 0 0 1 0 85 | 0 0 0 0 1 0 86 | 0 0 0 0 1 0 87 | 0 0 0 0 1 0 88 | 0 0 0 0 1 0 89 | 0 0 0 0 1 0 90 | 0 0 0 0 1 0 91 | 0 0 0 0 1 0 92 | 0 0 0 0 1 0 93 | 0 0 0 0 1 0 94 | 0 0 0 0 1 0 95 | 0 0 0 0 1 0 96 | 0 0 0 0 1 0 97 | 0 0 0 0 1 0 98 | 0 0 0 0 1 0 99 | 0 0 0 0 1 0 100 | 0 0 0 0 1 0 101 | 0 0 0 0 0 1 102 | 0 0 0 0 0 1 103 | 0 0 0 0 0 1 104 | 0 0 0 0 0 1 105 | 0 0 0 0 0 1 106 | 0 0 0 0 0 1 107 | 0 0 0 0 0 1 108 | 0 0 0 0 0 1 109 | 0 0 0 0 0 1 110 | 0 0 0 0 0 1 111 | 0 0 0 0 0 1 112 | 0 0 0 0 0 1 113 | 0 0 0 0 0 1 114 | 0 0 0 0 0 1 115 | 0 0 0 0 0 1 116 | 0 0 0 0 0 1 117 | 0 0 0 0 0 1 118 | 0 0 0 0 0 1 119 | 0 0 0 0 0 1 120 | 0 0 0 0 0 1 121 | -------------------------------------------------------------------------------- /datasets/zachary/GraphA.txt: -------------------------------------------------------------------------------- 1 | 0 1 1 1 1 1 1 1 1 0 1 1 1 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 2 | 1 0 1 1 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 3 | 1 1 0 1 0 0 0 1 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 4 | 1 1 1 0 0 0 0 1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 | 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 | 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 7 | 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 | 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 | 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 10 | 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 11 | 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 12 | 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 13 | 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 14 | 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 15 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 16 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 17 | 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 18 | 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 19 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 20 | 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 21 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 22 | 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 23 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 24 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 1 1 25 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 26 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 0 0 27 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 28 | 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 1 29 | 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 30 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 1 31 | 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 32 | 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 1 1 33 | 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 1 0 1 0 1 1 0 0 0 0 0 1 1 1 0 1 34 | 0 0 
0 0 0 0 0 0 1 1 0 0 0 1 1 1 0 0 1 1 1 0 1 1 0 0 1 1 1 1 1 1 1 0 35 | -------------------------------------------------------------------------------- /code/graph_utils.py: -------------------------------------------------------------------------------- 1 | """ 2 | Module loads data and also creates synthetic data 3 | 4 | @author: Andrey Gritsenko 5 | SPIRAL Group 6 | Electrical & Computer Engineering 7 | Northeastern University 8 | """ 9 | 10 | import numpy as np 11 | from itertools import permutations 12 | 13 | 14 | def PermuteGraph(Adj, labels=None, P=None): 15 | """ 16 | Permute graph according to permutation matrix P 17 | 18 | Inputs: 19 | ADJ Adjacency matrix of a graph to be permuted 20 | P Permutation matrix, such that B = P^T * A * P, where A is an old adjacency matrix and B is a new one 21 | LABELS List of node labels 22 | 23 | Outputs: 24 | BSIM Adjacency matrix of a new graph after permutation 25 | BLABELS List of node labels after permutation 26 | """ 27 | 28 | if P is None: 29 | P = np.eye(len(Adj), dtype=int)[np.random.permutation(len(Adj))] 30 | 31 | new_Adj = np.matmul(np.matmul(P.T, Adj), P) 32 | 33 | new_Labels = np.matmul(P.T, labels) if labels is not None else None 34 | return new_Adj, new_Labels, P 35 | 36 | 37 | def ReverseEdges(A, pct=None): 38 | """ 39 | Invert elements of graph's adjacency matrix with probability Prob 40 | 41 | Inputs: 42 | A Adjacency matrix of a graph to be permuted 43 | PCT Percent of inverted edges 44 | 45 | Outputs: 46 | B Adjacency matrix of a new graph after edge inversion 47 | """ 48 | 49 | B = np.triu(A, 1) # get upper triangular matrix from input A 50 | 51 | # remove edges 52 | ind = np.where(B == 1) # find all edges 53 | # define number of edges to remove and add 54 | if pct is None or not (0 < pct < 1): 55 | n_edges = min(int(np.floor(len(A) / 2) + 1), len(ind[0])) 56 | else: 57 | n_edges = int(len(ind[0]) * pct) + 1 58 | randperm = np.random.permutation(len(ind[0]))[:n_edges] # randomly choose edges to delete 
59 | B[(ind[0][randperm], ind[1][randperm])] = 0 # delete selected edges 60 | 61 | # add edges 62 | B0 = np.triu(abs(B - 1), 1) 63 | ind = np.where(B0 == 1) # find all absent edges 64 | randperm = np.random.permutation(len(ind[0]))[:n_edges] # randomly choose edges to add 65 | B[(ind[0][randperm], ind[1][randperm])] = 1 # add selected edges 66 | return B + B.T 67 | 68 | 69 | def RemoveEdges(A, n_edges=0): 70 | """ 71 | Remove a certain number of edges from a graph 72 | """ 73 | n_edges = int(n_edges) 74 | if n_edges < 1 or n_edges > A.size - len(A): 75 | n_edges = int(np.floor(len(A) / 2) + 1) 76 | 77 | B = np.triu(A, 1) 78 | ind = np.where(B == 1) 79 | randperm = np.random.permutation(len(ind[0]))[:min(n_edges, len(ind[0]))] 80 | B[(ind[0][randperm], ind[1][randperm])] = 0 81 | return B + B.T 82 | 83 | 84 | def GetDoublyStochasticMatrix(Labels, P): 85 | n_labels = Labels.shape[1] 86 | if n_labels != 1: 87 | cluster_size = len(Labels) // n_labels # integer division: cluster_size is used as a slice index below 88 | else: 89 | cluster_size = 6 if len(Labels) == 120 else 4 90 | 91 | Block = np.eye(len(Labels)) 92 | for l in range(n_labels): 93 | Block[cluster_size * l:cluster_size * (l + 1), cluster_size * l:cluster_size * (l + 1)] = 1 / cluster_size 94 | Pds = np.matmul(Block, P) 95 | return Pds 96 | 97 | 98 | def getNodeSimilarity(A, mode='adjacency', n_walks=10, walk_length=10, window_size=5, p=.25, q=4, n_negative=5): 99 | """ 100 | Wrapper of function computing similarity between node pairs 101 | 102 | Inputs: 103 | A Adjacency matrix of a graph 104 | MODE Similarity metric between nodes 105 | NODES List of nodes to be used (if one graph is split in training and test sets) 106 | FEATURES Features of graph nodes (used for some similarity metrics) 107 | 108 | Outputs: 109 | NODE_PAIRS List of node pairs, for which similarity was computed 110 | SIMILARITY Measure of similarity between two nodes 111 | """ 112 | if mode == 'randomwalk': 113 | from random_walks import getRWSimilarity 114 | target, context, similarity, neg_samples = 
getRWSimilarity(A, n_walks, walk_length, window_size, p, q, 115 | n_negative) 116 | else: 117 | node_pairs = np.array(list(permutations(range(len(A)), 2))) 118 | np.random.shuffle(node_pairs) 119 | target = node_pairs[:, 0] 120 | context = node_pairs[:, 1] 121 | n_negative = 0 122 | neg_samples = None 123 | 124 | similarity = np.zeros((len(node_pairs), n_negative + 1)) 125 | similarity[:, 0] = [A[i, j] for i, j in node_pairs] 126 | 127 | return target, context, similarity, neg_samples 128 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Contents 2 | 3 | * [Acknowledgement](#acknowledgement) 4 | * [Citing This Paper](#citing-this-paper) 5 | * [Environment Setup](#environment-setup) 6 | * [Running Framework](#running-framework) 7 | * [Datasets](#datasets) 8 | 9 | 10 | ## Acknowledgement 11 | This repository contains the source code for the Graph Transfer Learning project developed by Northeastern University's SPIRAL research group. This research was generously supported by the National Science Foundation (grants IIS-1741197, CCF-1750539) 12 | and Google via GCP credit support. 13 | 14 | 15 | ## Citing This Paper 16 | Please cite the following paper if you intend to use this code for your research. 17 | > A. Gritsenko, Y. Guo, K. Shayestehfard, A. Moharrer, J. Dy, S. Ioannidis, "Graph Transfer Learning", ICDM, 2021. 18 | 19 | 20 | ## Environment Setup 21 | Please install the Python dependencies found in `requirements.txt` with: 22 | ```bash 23 | pip install -r requirements.txt 24 | ``` 25 | 26 | ## Running Framework 27 | To reflect the generic nature of the algorithm introduced in the original paper, we provide a fully customizable framework with a wide variety of parameters for node embedding, model creation and training. 
28 | 29 | The following arguments can be specified to train node embeddings: 30 | ```bash 31 | --nembedding Size of the output embedding vector 32 | --topology_similarity Similarity measure between nodes of the same graph in 33 | graph topological space 34 | --embedding_type Type of embedding function: skipgram, unified 35 | --embedding_similarity Similarity measure between nodes of the same graph in 36 | embedding space 37 | --nwalks Number of node2vec random walks 38 | --walk_length Length of random walk 39 | --window_size Width of sliding window in random walks 40 | --p Parameter p for node2vec random walks 41 | --q Parameter q for node2vec random walks 42 | --nnegative Number of negative samples used in skip-gram 43 | --scale_negative Specifies whether to scale outputs for negative 44 | samples 45 | --graph_distance Pairwise distance measure between nodes in the 46 | embedding space (matrix D) 47 | ``` 48 | The following arguments can be specified to create and train the model: 49 | ```bash 50 | --similarity_loss Loss function between similarities in topological and 51 | embedding spaces for nodes of the same graph 52 | --depth Number of hidden layers in Prediction Branch 53 | --activation_function Activation function for Prediction Branch neurons 54 | --prediction_loss Loss function for Prediction Branch 55 | --transfer_mode Specifies transfer learning mode 56 | --alpha Weight of graph matching loss 57 | --beta Specifies whether to scale parts of P-optimization 58 | loss 59 | --learning_rate Learning rate 60 | --batch_size Number of instances in each batch 61 | --epochs Number of epochs 62 | --early_stopping Number of epochs with no improvement after which 63 | training will be stopped. If <=0, no early stopping is 64 | used 65 | ``` 66 | For a full list of arguments, run the framework with `--help`. 67 | 68 | 69 | ## Datasets 70 | All datasets referenced in the original paper are presented in the folder `datasets`. 
A user can run the framework on either the provided datasets or arbitrary ones by specifying the dataset folder via the `--load_path` and `--dataset` parameters. 71 | The framework expects the following files to be present in the specified dataset directory: 72 | * a `GraphA.txt` file containing graph A's adjacency matrix, 73 | * a `GraphALabels_cluster.txt` file containing class labels for each graph A node, 74 | * a `GraphALabels_infection.txt` file containing infection labels for each graph A node. 75 | Optionally, a dataset directory can contain `GraphATrain.txt` and `GraphATest.txt` files containing node indices for train and test splits, respectively. If these files are not provided, graph nodes are split randomly into train and test subsets with an 8:2 ratio. 76 | We provide `GraphATrain.txt` and `GraphATest.txt` files for all real-world datasets for reproducibility purposes. Additionally, we provide original dataset files for each real-world graph. 77 | 78 | The following real-world datasets are presented in the folder `datasets`: 79 | * Zachary Karate Club 80 | > W. W. Zachary, “An information flow model for conflict and fission in small groups”, Journal of Anthropological Research, 1977 81 | * Email 82 | > J. Leskovec et al., “Graph evolution: Densification and shrinking diameters”, ACM TKDD, 2007 83 | * Infectious Disease Transmission Dataset 84 | > M. Salathé et al., “A high-resolution human contact network for infectious disease transmission”, PNAS, 2010 85 | 86 | For details on synthetic dataset construction, please refer to Section V.A of the original paper. 
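The dataset layout described above can be read with plain NumPy. The sketch below is illustrative only: the `load_dataset` helper is not part of the framework (whose own loading logic lives in `code/load_save_data.py`); it simply mirrors the file layout and the 8:2 fallback split described above.

```python
import os
import numpy as np

def load_dataset(path):
    """Illustrative loader for a dataset directory in the layout above."""
    # Required files: adjacency matrix and per-node labels
    A = np.loadtxt(os.path.join(path, "GraphA.txt"))                          # n x n adjacency matrix
    clusters = np.loadtxt(os.path.join(path, "GraphALabels_cluster.txt"))     # one-hot class labels, one row per node
    infection = np.loadtxt(os.path.join(path, "GraphALabels_infection.txt"))  # one scalar label per node

    # Optional train/test node indices; fall back to a random 8:2 split
    train_file = os.path.join(path, "GraphATrain.txt")
    test_file = os.path.join(path, "GraphATest.txt")
    if os.path.exists(train_file) and os.path.exists(test_file):
        train_idx = np.loadtxt(train_file, dtype=int)
        test_idx = np.loadtxt(test_file, dtype=int)
    else:
        perm = np.random.permutation(len(A))
        split = int(0.8 * len(A))
        train_idx, test_idx = perm[:split], perm[split:]
    return A, clusters, infection, train_idx, test_idx
```

Cluster labels are stored in one-hot form, one row per node (see e.g. `datasets/SB-4/GraphALabels_cluster.txt`), so `clusters` comes back as an `n x k` array for `k` classes.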
87 | -------------------------------------------------------------------------------- /datasets/disease/GraphATrain.txt: -------------------------------------------------------------------------------- 1 | 246 2 | 608 3 | 752 4 | 333 5 | 54 6 | 286 7 | 653 8 | 123 9 | 103 10 | 198 11 | 477 12 | 764 13 | 391 14 | 664 15 | 449 16 | 113 17 | 209 18 | 43 19 | 101 20 | 114 21 | 287 22 | 258 23 | 105 24 | 344 25 | 213 26 | 596 27 | 278 28 | 737 29 | 547 30 | 707 31 | 328 32 | 252 33 | 701 34 | 8 35 | 518 36 | 141 37 | 504 38 | 719 39 | 124 40 | 380 41 | 768 42 | 354 43 | 742 44 | 618 45 | 229 46 | 264 47 | 693 48 | 487 49 | 512 50 | 49 51 | 787 52 | 459 53 | 400 54 | 370 55 | 332 56 | 362 57 | 717 58 | 100 59 | 494 60 | 208 61 | 144 62 | 603 63 | 21 64 | 725 65 | 635 66 | 497 67 | 237 68 | 239 69 | 773 70 | 637 71 | 295 72 | 446 73 | 89 74 | 365 75 | 292 76 | 602 77 | 490 78 | 785 79 | 734 80 | 503 81 | 476 82 | 161 83 | 566 84 | 11 85 | 610 86 | 763 87 | 746 88 | 2 89 | 501 90 | 205 91 | 522 92 | 464 93 | 579 94 | 666 95 | 323 96 | 197 97 | 149 98 | 187 99 | 322 100 | 616 101 | 410 102 | 709 103 | 81 104 | 683 105 | 631 106 | 6 107 | 728 108 | 312 109 | 78 110 | 761 111 | 710 112 | 432 113 | 698 114 | 607 115 | 381 116 | 647 117 | 584 118 | 575 119 | 341 120 | 431 121 | 385 122 | 656 123 | 82 124 | 28 125 | 251 126 | 462 127 | 454 128 | 393 129 | 621 130 | 605 131 | 648 132 | 718 133 | 311 134 | 232 135 | 723 136 | 364 137 | 468 138 | 748 139 | 356 140 | 530 141 | 212 142 | 52 143 | 250 144 | 688 145 | 57 146 | 783 147 | 360 148 | 535 149 | 659 150 | 104 151 | 736 152 | 705 153 | 135 154 | 131 155 | 740 156 | 762 157 | 750 158 | 66 159 | 343 160 | 301 161 | 696 162 | 319 163 | 87 164 | 396 165 | 34 166 | 625 167 | 140 168 | 553 169 | 240 170 | 434 171 | 170 172 | 45 173 | 204 174 | 202 175 | 142 176 | 704 177 | 398 178 | 548 179 | 545 180 | 129 181 | 38 182 | 745 183 | 478 184 | 516 185 | 675 186 | 214 187 | 519 188 | 418 189 | 448 190 | 14 191 | 357 192 | 586 193 | 417 
194 | 150 195 | 58 196 | 686 197 | 242 198 | 440 199 | 272 200 | 526 201 | 429 202 | 27 203 | 108 204 | 353 205 | 53 206 | 521 207 | 591 208 | 241 209 | 309 210 | 203 211 | 756 212 | 84 213 | 12 214 | 97 215 | 550 216 | 390 217 | 17 218 | 257 219 | 533 220 | 564 221 | 643 222 | 657 223 | 167 224 | 281 225 | 30 226 | 640 227 | 186 228 | 488 229 | 216 230 | 577 231 | 508 232 | 151 233 | 261 234 | 644 235 | 609 236 | 457 237 | 483 238 | 321 239 | 777 240 | 361 241 | 606 242 | 642 243 | 182 244 | 106 245 | 681 246 | 268 247 | 109 248 | 29 249 | 627 250 | 428 251 | 296 252 | 595 253 | 779 254 | 348 255 | 445 256 | 91 257 | 69 258 | 70 259 | 757 260 | 405 261 | 747 262 | 31 263 | 130 264 | 347 265 | 651 266 | 184 267 | 730 268 | 160 269 | 744 270 | 593 271 | 56 272 | 191 273 | 668 274 | 486 275 | 781 276 | 703 277 | 40 278 | 118 279 | 373 280 | 597 281 | 555 282 | 551 283 | 92 284 | 236 285 | 134 286 | 289 287 | 679 288 | 377 289 | 245 290 | 560 291 | 780 292 | 196 293 | 425 294 | 316 295 | 598 296 | 382 297 | 534 298 | 238 299 | 633 300 | 620 301 | 59 302 | 716 303 | 158 304 | 115 305 | 61 306 | 350 307 | 578 308 | 569 309 | 694 310 | 741 311 | 179 312 | 189 313 | 13 314 | 327 315 | 206 316 | 771 317 | 481 318 | 452 319 | 670 320 | 73 321 | 617 322 | 235 323 | 749 324 | 102 325 | 386 326 | 22 327 | 655 328 | 374 329 | 163 330 | 639 331 | 776 332 | 435 333 | 554 334 | 641 335 | 556 336 | 514 337 | 455 338 | 498 339 | 502 340 | 672 341 | 470 342 | 96 343 | 68 344 | 721 345 | 260 346 | 712 347 | 275 348 | 594 349 | 713 350 | 153 351 | 395 352 | 300 353 | 26 354 | 315 355 | 690 356 | 754 357 | 759 358 | 682 359 | 536 360 | 133 361 | 226 362 | 41 363 | 363 364 | 680 365 | 539 366 | 676 367 | 599 368 | 271 369 | 424 370 | 685 371 | 510 372 | 549 373 | 358 374 | 587 375 | 401 376 | 409 377 | 116 378 | 673 379 | 626 380 | 585 381 | 689 382 | 138 383 | 524 384 | 529 385 | 695 386 | 24 387 | 60 388 | 37 389 | 50 390 | 345 391 | 221 392 | 397 393 | 255 394 | 786 395 | 75 396 | 411 
397 | 384 398 | 262 399 | 285 400 | 652 401 | 629 402 | 426 403 | 622 404 | 119 405 | 337 406 | 310 407 | 139 408 | 77 409 | 542 410 | 183 411 | 507 412 | 495 413 | 430 414 | 366 415 | 766 416 | 128 417 | 634 418 | 230 419 | 313 420 | 460 421 | 484 422 | 436 423 | 623 424 | 193 425 | 450 426 | 407 427 | 491 428 | 565 429 | 112 430 | 383 431 | 482 432 | 164 433 | 314 434 | 231 435 | 33 436 | 715 437 | 444 438 | 210 439 | 217 440 | 276 441 | 654 442 | 192 443 | 298 444 | 324 445 | 544 446 | 465 447 | 480 448 | 165 449 | 505 450 | 615 451 | 576 452 | 727 453 | 336 454 | 263 455 | 456 456 | 528 457 | 297 458 | 215 459 | 147 460 | 669 461 | 342 462 | 349 463 | 453 464 | 154 465 | 485 466 | 531 467 | 628 468 | 156 469 | 159 470 | 63 471 | 318 472 | 702 473 | 645 474 | 32 475 | 65 476 | 80 477 | 371 478 | 290 479 | 511 480 | 767 481 | 646 482 | 419 483 | 291 484 | 769 485 | 660 486 | 706 487 | 339 488 | 552 489 | 559 490 | 437 491 | 42 492 | 166 493 | 546 494 | 489 495 | 9 496 | 479 497 | 760 498 | 64 499 | 143 500 | 253 501 | 200 502 | 632 503 | 148 504 | 207 505 | 636 506 | 199 507 | 461 508 | 351 509 | 211 510 | 496 511 | 169 512 | 36 513 | 658 514 | 195 515 | 335 516 | 259 517 | 443 518 | 724 519 | 557 520 | 244 521 | 145 522 | 367 523 | 90 524 | 99 525 | 228 526 | 62 527 | 691 528 | 132 529 | 613 530 | 23 531 | 469 532 | 127 533 | 121 534 | 684 535 | 590 536 | 687 537 | 427 538 | 387 539 | 404 540 | 589 541 | 220 542 | 735 543 | 499 544 | 98 545 | 282 546 | 699 547 | 394 548 | 619 549 | 592 550 | 162 551 | 650 552 | 438 553 | 346 554 | 277 555 | 326 556 | 254 557 | 224 558 | 283 559 | 111 560 | 733 561 | 447 562 | 588 563 | 614 564 | 48 565 | 288 566 | 107 567 | 175 568 | 581 569 | 758 570 | 388 571 | 604 572 | 137 573 | 270 574 | 402 575 | 317 576 | 540 577 | 573 578 | 218 579 | 541 580 | 372 581 | 304 582 | 729 583 | 743 584 | 120 585 | 325 586 | 458 587 | 173 588 | 44 589 | 420 590 | 305 591 | 389 592 | 403 593 | 661 594 | 722 595 | 174 596 | 185 597 | 678 598 | 
338 599 | 368 600 | 463 601 | 178 602 | 265 603 | 274 604 | 714 605 | 25 606 | 379 607 | 493 608 | 0 609 | 95 610 | 176 611 | 1 612 | 738 613 | 157 614 | 677 615 | 19 616 | 433 617 | 243 618 | 181 619 | 751 620 | 415 621 | 331 622 | 172 623 | 422 624 | 306 625 | 294 626 | 472 627 | 39 628 | 517 629 | 700 630 | 782 631 | -------------------------------------------------------------------------------- /datasets/email/GraphATrain.txt: -------------------------------------------------------------------------------- 1 | 378 2 | 870 3 | 265 4 | 938 5 | 644 6 | 34 7 | 525 8 | 708 9 | 49 10 | 808 11 | 316 12 | 96 13 | 926 14 | 740 15 | 759 16 | 576 17 | 954 18 | 353 19 | 509 20 | 669 21 | 159 22 | 156 23 | 261 24 | 836 25 | 399 26 | 112 27 | 617 28 | 85 29 | 660 30 | 352 31 | 5 32 | 145 33 | 552 34 | 391 35 | 416 36 | 419 37 | 67 38 | 671 39 | 253 40 | 897 41 | 50 42 | 817 43 | 521 44 | 123 45 | 497 46 | 633 47 | 148 48 | 459 49 | 216 50 | 905 51 | 131 52 | 517 53 | 977 54 | 230 55 | 597 56 | 847 57 | 716 58 | 537 59 | 115 60 | 684 61 | 586 62 | 648 63 | 254 64 | 589 65 | 825 66 | 883 67 | 147 68 | 902 69 | 76 70 | 357 71 | 717 72 | 91 73 | 306 74 | 455 75 | 77 76 | 659 77 | 972 78 | 363 79 | 386 80 | 185 81 | 824 82 | 268 83 | 657 84 | 466 85 | 476 86 | 458 87 | 181 88 | 833 89 | 674 90 | 771 91 | 304 92 | 393 93 | 615 94 | 341 95 | 343 96 | 167 97 | 355 98 | 623 99 | 132 100 | 594 101 | 816 102 | 462 103 | 603 104 | 493 105 | 760 106 | 196 107 | 3 108 | 872 109 | 21 110 | 380 111 | 791 112 | 846 113 | 137 114 | 921 115 | 610 116 | 24 117 | 390 118 | 487 119 | 362 120 | 474 121 | 699 122 | 901 123 | 211 124 | 611 125 | 961 126 | 803 127 | 292 128 | 93 129 | 250 130 | 734 131 | 962 132 | 40 133 | 307 134 | 300 135 | 910 136 | 60 137 | 136 138 | 428 139 | 806 140 | 371 141 | 802 142 | 0 143 | 35 144 | 336 145 | 839 146 | 625 147 | 269 148 | 84 149 | 966 150 | 277 151 | 896 152 | 628 153 | 645 154 | 46 155 | 706 156 | 940 157 | 531 158 | 282 159 | 238 160 | 162 161 | 631 162 
| 75 163 | 434 164 | 100 165 | 281 166 | 709 167 | 106 168 | 743 169 | 27 170 | 469 171 | 562 172 | 714 173 | 360 174 | 180 175 | 758 176 | 498 177 | 451 178 | 612 179 | 232 180 | 861 181 | 577 182 | 190 183 | 14 184 | 903 185 | 946 186 | 358 187 | 114 188 | 815 189 | 116 190 | 2 191 | 38 192 | 117 193 | 906 194 | 188 195 | 348 196 | 290 197 | 247 198 | 739 199 | 41 200 | 941 201 | 819 202 | 45 203 | 890 204 | 922 205 | 387 206 | 697 207 | 305 208 | 323 209 | 891 210 | 655 211 | 857 212 | 309 213 | 515 214 | 388 215 | 947 216 | 822 217 | 619 218 | 779 219 | 985 220 | 801 221 | 328 222 | 224 223 | 296 224 | 851 225 | 849 226 | 932 227 | 814 228 | 939 229 | 929 230 | 105 231 | 834 232 | 685 233 | 166 234 | 53 235 | 527 236 | 652 237 | 381 238 | 48 239 | 413 240 | 761 241 | 312 242 | 423 243 | 95 244 | 450 245 | 838 246 | 400 247 | 794 248 | 564 249 | 219 250 | 845 251 | 600 252 | 407 253 | 813 254 | 835 255 | 542 256 | 359 257 | 141 258 | 754 259 | 752 260 | 172 261 | 356 262 | 126 263 | 299 264 | 208 265 | 361 266 | 642 267 | 55 268 | 111 269 | 686 270 | 259 271 | 29 272 | 683 273 | 194 274 | 210 275 | 878 276 | 237 277 | 793 278 | 327 279 | 239 280 | 689 281 | 454 282 | 518 283 | 952 284 | 347 285 | 395 286 | 171 287 | 668 288 | 880 289 | 284 290 | 379 291 | 30 292 | 157 293 | 460 294 | 382 295 | 809 296 | 677 297 | 983 298 | 233 299 | 964 300 | 724 301 | 649 302 | 909 303 | 209 304 | 636 305 | 370 306 | 676 307 | 150 308 | 124 309 | 712 310 | 499 311 | 864 312 | 186 313 | 414 314 | 94 315 | 504 316 | 25 317 | 366 318 | 653 319 | 892 320 | 36 321 | 948 322 | 229 323 | 264 324 | 830 325 | 957 326 | 183 327 | 930 328 | 914 329 | 885 330 | 580 331 | 917 332 | 681 333 | 937 334 | 222 335 | 19 336 | 479 337 | 468 338 | 593 339 | 512 340 | 56 341 | 218 342 | 310 343 | 485 344 | 958 345 | 65 346 | 742 347 | 529 348 | 427 349 | 889 350 | 960 351 | 893 352 | 7 353 | 971 354 | 303 355 | 911 356 | 840 357 | 373 358 | 520 359 | 273 360 | 788 361 | 634 362 | 837 363 | 178 364 | 
874 365 | 541 366 | 102 367 | 367 368 | 418 369 | 477 370 | 101 371 | 733 372 | 606 373 | 751 374 | 722 375 | 266 376 | 321 377 | 534 378 | 730 379 | 165 380 | 963 381 | 377 382 | 519 383 | 18 384 | 411 385 | 501 386 | 385 387 | 747 388 | 513 389 | 478 390 | 650 391 | 492 392 | 812 393 | 871 394 | 618 395 | 200 396 | 852 397 | 970 398 | 855 399 | 39 400 | 511 401 | 566 402 | 158 403 | 80 404 | 66 405 | 974 406 | 293 407 | 256 408 | 134 409 | 641 410 | 925 411 | 640 412 | 337 413 | 723 414 | 556 415 | 604 416 | 795 417 | 869 418 | 89 419 | 945 420 | 829 421 | 975 422 | 904 423 | 546 424 | 286 425 | 81 426 | 144 427 | 850 428 | 227 429 | 152 430 | 405 431 | 241 432 | 900 433 | 128 434 | 737 435 | 214 436 | 140 437 | 931 438 | 575 439 | 630 440 | 555 441 | 799 442 | 789 443 | 13 444 | 465 445 | 599 446 | 104 447 | 248 448 | 554 449 | 276 450 | 638 451 | 6 452 | 87 453 | 28 454 | 810 455 | 508 456 | 135 457 | 558 458 | 70 459 | 406 460 | 435 461 | 731 462 | 52 463 | 463 464 | 409 465 | 17 466 | 765 467 | 762 468 | 59 469 | 315 470 | 748 471 | 928 472 | 868 473 | 545 474 | 786 475 | 591 476 | 756 477 | 624 478 | 505 479 | 170 480 | 176 481 | 448 482 | 702 483 | 888 484 | 43 485 | 228 486 | 452 487 | 495 488 | 792 489 | 173 490 | 780 491 | 257 492 | 490 493 | 750 494 | 524 495 | 656 496 | 345 497 | 289 498 | 510 499 | 912 500 | 536 501 | 626 502 | 643 503 | 129 504 | 255 505 | 582 506 | 424 507 | 226 508 | 728 509 | 295 510 | 805 511 | 704 512 | 715 513 | 204 514 | 302 515 | 130 516 | 773 517 | 718 518 | 262 519 | 721 520 | 670 521 | 769 522 | 11 523 | 596 524 | 774 525 | 274 526 | 587 527 | 113 528 | 322 529 | 426 530 | 432 531 | 559 532 | 326 533 | 301 534 | 695 535 | 397 536 | 445 537 | 475 538 | 246 539 | 350 540 | 895 541 | 744 542 | 133 543 | 798 544 | 572 545 | 279 546 | 828 547 | 647 548 | 539 549 | 978 550 | 365 551 | 99 552 | 741 553 | 848 554 | 690 555 | 588 556 | 777 557 | 766 558 | 673 559 | 160 560 | 807 561 | 584 562 | 678 563 | 496 564 | 220 565 | 433 566 
| 842 567 | 251 568 | 523 569 | 333 570 | 875 571 | 841 572 | 319 573 | 258 574 | 887 575 | 74 576 | 23 577 | 54 578 | 561 579 | 662 580 | 213 581 | 118 582 | 287 583 | 120 584 | 473 585 | 151 586 | 898 587 | 205 588 | 853 589 | 698 590 | 169 591 | 16 592 | 862 593 | 437 594 | 330 595 | 461 596 | 772 597 | 965 598 | 616 599 | 438 600 | 746 601 | 168 602 | 317 603 | 543 604 | 916 605 | 694 606 | 187 607 | 311 608 | 453 609 | 919 610 | 574 611 | 199 612 | 654 613 | 980 614 | 182 615 | 90 616 | 375 617 | 177 618 | 581 619 | 68 620 | 9 621 | 949 622 | 843 623 | 866 624 | 691 625 | 325 626 | 522 627 | 425 628 | 240 629 | 775 630 | 236 631 | 202 632 | 865 633 | 285 634 | 92 635 | 308 636 | 431 637 | 783 638 | 98 639 | 217 640 | 571 641 | 924 642 | 57 643 | 467 644 | 550 645 | 863 646 | 920 647 | 753 648 | 20 649 | 267 650 | 637 651 | 658 652 | 710 653 | 942 654 | 602 655 | 417 656 | 711 657 | 324 658 | 37 659 | 58 660 | 787 661 | 184 662 | 567 663 | 26 664 | 506 665 | 703 666 | 402 667 | 249 668 | 565 669 | 422 670 | 482 671 | 605 672 | 138 673 | 532 674 | 396 675 | 443 676 | 415 677 | 573 678 | 613 679 | 982 680 | 879 681 | 666 682 | 614 683 | 313 684 | 174 685 | 590 686 | 69 687 | 72 688 | 720 689 | 646 690 | 551 691 | 797 692 | 231 693 | 500 694 | 374 695 | 252 696 | 764 697 | 275 698 | 412 699 | 442 700 | 260 701 | 8 702 | 207 703 | 899 704 | 621 705 | 667 706 | 675 707 | 146 708 | 688 709 | 882 710 | 549 711 | 344 712 | 547 713 | 679 714 | 398 715 | 155 716 | 784 717 | 726 718 | 583 719 | 280 720 | 436 721 | 384 722 | 383 723 | 627 724 | 372 725 | 142 726 | 860 727 | 464 728 | 161 729 | 215 730 | 548 731 | 318 732 | 832 733 | 585 734 | 410 735 | 64 736 | 570 737 | 491 738 | 665 739 | 297 740 | 858 741 | 335 742 | 270 743 | 429 744 | 725 745 | 82 746 | 420 747 | 923 748 | 79 749 | 823 750 | 569 751 | 507 752 | 15 753 | 907 754 | 149 755 | 342 756 | 592 757 | 944 758 | 109 759 | 955 760 | 933 761 | 661 762 | 457 763 | 376 764 | 598 765 | 680 766 | 827 767 | 119 768 | 
332 769 | 121 770 | 351 771 | 193 772 | 447 773 | 86 774 | 108 775 | 235 776 | 122 777 | 494 778 | 346 779 | 705 780 | 915 781 | -------------------------------------------------------------------------------- /code/random_walks.py: -------------------------------------------------------------------------------- 1 | """ 2 | @author: Yuan Guo, Andrey Gritsenko 3 | SPIRAL Group 4 | Electrical & Computer Engineering 5 | Northeastern University 6 | """ 7 | 8 | import random 9 | import numpy as np 10 | import networkx as nx 11 | 12 | 13 | def read_graph(Amatrix): 14 | """ 15 | Reads the input network in networkx 16 | """ 17 | G = nx.from_numpy_matrix(Amatrix) 18 | G = G.to_undirected() 19 | return G 20 | 21 | 22 | class Graph: 23 | def __init__(self, nx_G, is_directed, p, q): 24 | self.G = nx_G 25 | self.is_directed = is_directed 26 | self.p = p 27 | self.q = q 28 | 29 | def node2vec_walk(self, walk_length, start_node): 30 | ''' 31 | Simulate a random walk starting from start node. 32 | ''' 33 | G = self.G 34 | alias_nodes = self.alias_nodes 35 | alias_edges = self.alias_edges 36 | 37 | walk = [start_node] 38 | 39 | while len(walk) < walk_length: 40 | cur = walk[-1] 41 | cur_nbrs = sorted(G.neighbors(cur)) 42 | if len(cur_nbrs) > 0: 43 | if len(walk) == 1: 44 | walk.append(cur_nbrs[alias_draw(alias_nodes[cur][0], alias_nodes[cur][1])]) 45 | else: 46 | prev = walk[-2] 47 | next = cur_nbrs[alias_draw(alias_edges[(prev, cur)][0], alias_edges[(prev, cur)][1])] 48 | walk.append(next) 49 | else: 50 | break 51 | 52 | return walk 53 | 54 | def simulate_walks(self, num_walks, walk_length): 55 | ''' 56 | Repeatedly simulate random walks from each node. 
57 | ''' 58 | G = self.G 59 | walks = [] 60 | nodes = list(G.nodes()) 61 | for walk_iter in range(num_walks): 62 | random.shuffle(nodes) 63 | for node in nodes: 64 | walks.append(self.node2vec_walk(walk_length=walk_length, start_node=node)) 65 | 66 | return walks 67 | 68 | def get_alias_edge(self, src, dst): 69 | ''' 70 | Get the alias edge setup lists for a given edge. 71 | ''' 72 | G = self.G 73 | p = self.p 74 | q = self.q 75 | 76 | unnormalized_probs = [] 77 | for dst_nbr in sorted(G.neighbors(dst)): 78 | if dst_nbr == src: 79 | unnormalized_probs.append(G[dst][dst_nbr]['weight']/p) 80 | elif G.has_edge(dst_nbr, src): 81 | unnormalized_probs.append(G[dst][dst_nbr]['weight']) 82 | else: 83 | unnormalized_probs.append(G[dst][dst_nbr]['weight']/q) 84 | norm_const = sum(unnormalized_probs) 85 | normalized_probs = [float(u_prob)/norm_const for u_prob in unnormalized_probs] 86 | 87 | return alias_setup(normalized_probs) 88 | 89 | def preprocess_transition_probs(self): 90 | ''' 91 | Preprocessing of transition probabilities for guiding the random walks. 
92 | ''' 93 | G = self.G 94 | is_directed = self.is_directed 95 | 96 | alias_nodes = {} 97 | for node in G.nodes(): 98 | unnormalized_probs = [G[node][nbr]['weight'] for nbr in sorted(G.neighbors(node))] 99 | norm_const = sum(unnormalized_probs) 100 | normalized_probs = [float(u_prob)/norm_const for u_prob in unnormalized_probs] 101 | alias_nodes[node] = alias_setup(normalized_probs) 102 | 103 | alias_edges = {} 104 | 105 | if is_directed: 106 | for edge in G.edges(): 107 | alias_edges[edge] = self.get_alias_edge(edge[0], edge[1]) 108 | else: 109 | for edge in G.edges(): 110 | alias_edges[edge] = self.get_alias_edge(edge[0], edge[1]) 111 | alias_edges[(edge[1], edge[0])] = self.get_alias_edge(edge[1], edge[0]) 112 | 113 | self.alias_nodes = alias_nodes 114 | self.alias_edges = alias_edges 115 | 116 | return 117 | 118 | 119 | def alias_setup(probs): 120 | ''' 121 | Compute utility lists for non-uniform sampling from discrete distributions. 122 | Refer to https://hips.seas.harvard.edu/blog/2013/03/03/the-alias-method-efficient-sampling-with-many-discrete-outcomes/ 123 | for details. 124 | ''' 125 | K = len(probs) 126 | q = np.zeros(K) 127 | J = np.zeros(K, dtype=int) 128 | 129 | smaller = [] 130 | larger = [] 131 | for kk, prob in enumerate(probs): 132 | q[kk] = K*prob 133 | if q[kk] < 1.0: 134 | smaller.append(kk) 135 | else: 136 | larger.append(kk) 137 | 138 | while len(smaller) > 0 and len(larger) > 0: 139 | small = smaller.pop() 140 | large = larger.pop() 141 | 142 | J[small] = large 143 | q[large] = q[large] + q[small] - 1.0 144 | if q[large] < 1.0: 145 | smaller.append(large) 146 | else: 147 | larger.append(large) 148 | 149 | return J, q 150 | 151 | 152 | 153 | def alias_draw(J, q): 154 | ''' 155 | Draw sample from a non-uniform discrete distribution using alias sampling.
156 | ''' 157 | K = len(J) 158 | 159 | kk = int(np.floor(np.random.rand()*K)) 160 | if np.random.rand() < q[kk]: 161 | return kk 162 | else: 163 | return J[kk] 164 | 165 | 166 | def frequency(walks): 167 | """ For a set of random walks, count node occurrences and raise each count 168 | to the 3/4 power (unigram distribution for negative sampling)""" 169 | P_m = {} 170 | for walk in walks: 171 | for item in walk: 172 | try: 173 | P_m[item] += 1 174 | except KeyError: 175 | P_m[item] = 1 176 | for key, value in P_m.items(): 177 | P_m[key] = value**0.75 178 | return P_m 179 | 180 | 181 | def negative_frequency(P_m): 182 | """get the negative sampling probability for each node""" 183 | sample_num = [] 184 | sample_prob = [] 185 | for key, value in P_m.items(): 186 | sample_num.append(key) 187 | sample_prob.append(value) 188 | return sample_num, np.array(sample_prob)/sum(sample_prob) 189 | 190 | 191 | def get_negative_sample(context, num, prob, Gn): 192 | """sample negative nodes for each context node""" 193 | negative_list = [] 194 | while len(negative_list) < Gn: 195 | negative_sample = np.random.choice(num, p=prob.ravel()) 196 | if negative_sample != context: 197 | negative_list.append(negative_sample) 198 | else: 199 | pass 200 | return np.array([negative_list]) 201 | 202 | 203 | def skip_train(walks, window_size, negative_size): 204 | """ 205 | use the walks to generate negative samples for the neural network; 206 | generate training input under the skip-gram formulation 207 | """ 208 | P_m = frequency(walks) 209 | Num, Prob = negative_frequency(P_m) 210 | targets = [] 211 | contexts = [] 212 | similarity = [] 213 | negative_samples = [] 214 | for walk in walks: 215 | for source_id, source in enumerate(walk): 216 | reduced_window = np.random.randint(window_size) 217 | start = max(0, source_id - window_size + reduced_window) 218 | for target_id in range(start, source_id + window_size + 1 - reduced_window): 219 | if target_id != source_id: 220 | try: 221 | target = walk[target_id] 222 | targets.append(target) 223 | contexts.append(source) 224 |
negative_samples.append(get_negative_sample(target, Num, Prob, negative_size)) 225 | similarity.append(np.concatenate((np.ones(1), np.zeros(negative_size)))) 226 | except IndexError: 227 | pass 228 | else: 229 | pass 230 | return map(np.array, (targets, contexts, similarity, negative_samples)) 231 | 232 | 233 | def getRandomWalks(adj_mat, p, q, n_walks, walk_length): 234 | nx_G = read_graph(adj_mat) 235 | G = Graph(nx_G, False, p, q) 236 | G.preprocess_transition_probs() 237 | walks = G.simulate_walks(n_walks, walk_length) 238 | return walks 239 | 240 | 241 | def getRWSimilarity(adj_mat, n_walks, walk_length, window_size, p, q, n_negative): 242 | """ 243 | Implements random-walk similarity in analogy to node2vec 244 | 245 | Inputs: 246 | ADJ_MAT Adjacency matrix of a graph 247 | N_WALKS, WALK_LENGTH, WINDOW_SIZE, P, Q, N_NEGATIVE Random-walk, skip-gram window, and negative-sampling parameters 248 | 249 | Outputs: 250 | Target/context node pairs, their similarity labels, and negative samples 251 | """ 252 | walks = getRandomWalks(adj_mat, p, q, n_walks, walk_length) 253 | target, context, similarity, negative_samples = skip_train(walks, window_size, n_negative) 254 | return target, context, similarity, negative_samples 255 | 256 | 257 | 258 | 259 | -------------------------------------------------------------------------------- /code/updateP.py: -------------------------------------------------------------------------------- 1 | """ 2 | Module is responsible for permutation matrix P optimization 3 | 4 | @author: Yuan Guo 5 | SPIRAL Group 6 | Electrical & Computer Engineering 7 | Northeastern University 8 | """ 9 | 10 | import numpy as np 11 | from scipy.optimize import linear_sum_assignment 12 | 13 | 14 | def updateP(A, B, D, P=None, mode='opt'): 15 | """ 16 | Wrapper for updating P.
17 | 18 | Inputs: 19 | A Adjacency matrix of the first graph 20 | B Adjacency matrix of the second graph 21 | D Matrix of pairwise distances between nodes of two graphs in the embedding space 22 | P Original value of matrix P (used only in iterative methods) 23 | MODE Algorithm used to compute new value of matrix P 24 | 25 | Outputs: 26 | NEWP New value of matrix P 27 | """ 28 | 29 | mode = mode.lower() 30 | if mode.find('iter')>=0 and P is not None: # iterative P solution (projected gradient descent) 31 | gam = 0.001 32 | newP = ADMM_update(A,B,D,P,gam,norm=2,mode='norm') 33 | else: # optimal P solution 34 | prec = 100 35 | newP = Frank_optimal(A,B,D,prec,norm=2,mode='norm') 36 | 37 | return newP 38 | 39 | 40 | def ADMM_update(A,B,D,P,gam,norm=2,mode='norm'): 41 | """ 42 | one projected gradient step on ||AP-PB||_{p}+tr(P^{T}D) 43 | with step size gam 44 | """ 45 | dep = Derivative(A,B,D,P,norm,mode) 46 | V = P - gam*dep 47 | pnew = ADMMV(V) 48 | return pnew 49 | 50 | 51 | def Derivative(A,B,D,P,order,mode): 52 | C = np.dot(A,P) - np.dot(P,B) 53 | if mode == 'square': 54 | grad_P = 2*(np.dot(A.T,C) - np.dot(C,B.T)) 55 | else: 56 | Corder = np.sum(C**order) 57 | if Corder <= 10**(-9): 58 | cp=0 59 | else: 60 | cp = Corder**(1./order-1) 61 | Cp1 = C**(order-1) 62 | grad_P = cp*(np.dot(A.T,Cp1) - np.dot(Cp1,B.T)) 63 | grad_P += D 64 | return grad_P 65 | 66 | 67 | def ADMMV(V,rho=1,epsilon=0.001,itr_num=10000): 68 | obj = ADMM_OBJ(V) 69 | Pnew = obj.loop(rho,epsilon,itr_num) 70 | return Pnew 71 | 72 | 73 | class ADMM_OBJ: 74 | """ 75 | The ADMM class to solve: 76 | min ||P-V||_{2}^{2} 77 | P1=P^{T}1=1 78 | P>=0 79 | """ 80 | def __init__(self,V): 81 | self.V = V 82 | self.X = {} 83 | self.Y = {} 84 | self.Z = np.zeros(V.shape) 85 | 86 | def loop(self,rho,epsilon,itr_num): 87 | self.Y['P'] = np.zeros(self.V.shape) 88 | self.Y['row'] = np.zeros(self.V.shape) 89 | self.Y['column'] = np.zeros(self.V.shape) 90 | 91 | dual = np.inf 92 | primal = np.inf 93 | itr = 0 94 | while ((dual >=
epsilon) | (primal >= epsilon)): 95 | #t1=time.time() 96 | Z_old = self.Z.copy() 97 | self.X['P'] = (2*self.V + rho*self.Z - self.Y['P']) / (2+rho) 98 | self.X['row'] = MapZero(self.Z, self.Y['row'], rho, 'row') 99 | self.X['column'] = MapZero(self.Z, self.Y['column'], rho, 'column') 100 | self.Z = np.mean(np.array(list(self.X.values())),0) 101 | primal = 0 102 | for key in self.Y: 103 | y_d = self.X[key] - self.Z 104 | self.Y[key] = self.Y[key] + rho*y_d 105 | primal += np.linalg.norm(y_d)**2 106 | primal = np.sqrt(primal) 107 | dual = np.sqrt(3) * rho * np.linalg.norm(self.Z-Z_old) 108 | itr += 1 109 | if itr >= itr_num: 110 | break 111 | else: 112 | pass 113 | return self.Z 114 | 115 | 116 | def MapZero(Z,Y,rho,mode): 117 | V = Z - Y/rho 118 | Xtrace = V.copy() 119 | if mode == 'column': 120 | for i in range(V.shape[1]): 121 | Xtrace[:,i] = VecSimplex(V[:,i],1) 122 | else: 123 | for i in range(V.shape[0]): 124 | Xtrace[i,:] = VecSimplex(V[i,:],1) 125 | return Xtrace 126 | 127 | 128 | def projectToVA(x,A,r): 129 | Ac = set(x.keys()).difference(set(A)) 130 | offset = 1.0 / (len(Ac))*(sum([x[i] for i in Ac]) - r) 131 | y = dict([(i,0.0) for i in A] + [(i,x[i] - offset) for i in Ac]) 132 | return y 133 | 134 | 135 | def projectToPositiveSimplex(x,r): 136 | """ 137 | A function that projects a vector x to the face of the positive simplex. 138 | Given x as input, where x is a dictionary, and an r>0, the algorithm returns a dictionary y with the same keys as x such that: 139 | (1) sum( [ y[key] for key in y] ) == r, 140 | (2) y[key]>=0 for all key in y 141 | (3) y is the closest vector to x in the l2 norm that satisfies (1) and (2) 142 | The algorithm terminates in at most O(len(x)) steps, and is described in: 143 | 144 | Michelot, Christian. "A finite algorithm for finding the projection of a point onto the canonical simplex of R^n."
Journal of Optimization Theory and Applications 50.1 (1986): 195-200. 145 | and a short summary can be found in Appendix C of: 146 | 147 | http://www.ece.neu.edu/fac-ece/ioannidis/static/pdf/2010/CR-PRL-2009-07-0001.pdf 148 | """ 149 | A = [] 150 | y = projectToVA(x,A,r) 151 | B = [i for i in y.keys() if y[i]<0.0] 152 | while len(B) > 0: 153 | A += B 154 | y = projectToVA(y,A,r) 155 | B = [i for i in y.keys() if y[i]<0.0] 156 | return y 157 | 158 | 159 | def VecSimplex(x,r): 160 | """ 161 | min ||s-x||_{2}^{2} 162 | subject to <s,1>=r, s>=0 163 | """ 164 | size = len(x) 165 | xd = {} 166 | for i in range(size): 167 | xd[i] = x[i] 168 | y = projectToPositiveSimplex(xd,r) 169 | return np.array([y[i] for i in range(size)]) 170 | 171 | 172 | """ 173 | Frank-Wolfe Implementation 174 | """ 175 | def V_dot(v): 176 | """ 177 | linear minimization oracle (solved via the Hungarian algorithm): 178 | min <v,S> 179 | subject to S>=0, S1=1, 1^{T}S=1^{T} 180 | """ 181 | row_ind, col_ind = linear_sum_assignment(v) 182 | b = np.zeros(v.shape) 183 | b[row_ind,col_ind] = 1 184 | return b 185 | 186 | 187 | class Frank_P: 188 | """ 189 | The Frank-Wolfe class 190 | To solve ||AP-PB||_{2}^{2}+tr(P^{T}D) via conditional gradient (Frank-Wolfe) iterations 191 | """ 192 | def __init__(self, A, B, D, P, order, norm_mode, gamma=0.1): 193 | self.A = A 194 | self.B = B 195 | self.D = D 196 | self.P = P 197 | self.gamma = gamma 198 | self.order = order 199 | self.norm = norm_mode 200 | 201 | def initialize(self): 202 | pass 203 | 204 | def first_fun(self): 205 | nablaP = Derivative(self.A,self.B,self.D,self.P,self.order,self.norm) 206 | return nablaP 207 | 208 | def iteration(self, epsilon, itr_num): 209 | itr=0 210 | while(True): 211 | nabla_P = self.first_fun() 212 | S_optimal = V_dot(nabla_P) 213 | delta_P = S_optimal - self.P 214 | eta = 2/(itr+2) 215 | dual = -np.sum(nabla_P*delta_P) 216 | P_new = self.P + eta*delta_P 217 | self.P = P_new.copy() 218 | #print (np.linalg.norm(np.dot(self.A,self.P)-np.dot(self.P,self.B))**2+np.sum(self.P*self.D)) 219 | itr += 1 220 | if itr
>= itr_num: 221 | break 222 | else: 223 | pass 224 | if dual <= epsilon: 225 | break 226 | else: 227 | pass 228 | return P_new 229 | 230 | 231 | def Frank_optimal(A,B,D,prec=100,norm=2,mode='norm'): 232 | """ 233 | Frank-Wolfe method to solve: 234 | ||AP-PB||_{2}^{2}+tr(P^{T}D) 235 | P1=P^T1=1 236 | P>=0 237 | """ 238 | P = np.identity(A.shape[0]) 239 | obj = Frank_P(A,B,D,P,norm,mode) 240 | pnew = obj.iteration(0.001,prec) 241 | return pnew 242 | -------------------------------------------------------------------------------- /code/data_generator.py: -------------------------------------------------------------------------------- 1 | """ 2 | Module contains data generator for batch training 3 | 4 | @author: Andrey Gritsenko 5 | SPIRAL Group 6 | Electrical & Computer Engineering 7 | Northeastern University 8 | """ 9 | 10 | import numpy as np 11 | from keras.utils import Sequence 12 | 13 | 14 | class EmbeddingNodesGenerator(Sequence): 15 | """ 16 | Generates pairs of nodes for Embedding model 17 | 18 | Inputs: 19 | A Adjacency matrix of the first (labeled) graph 20 | BATCH_SIZE Number of datapoints in each batch 21 | LABELS List of node labels of the first graph in one-hot coding format 22 | B Adjacency matrix of the second (unlabeled) graph 23 | P Permutation matrix 24 | AFEATURES Matrix of the first graph nodes feature vectors 25 | BFEATURES Matrix of the second graph nodes feature vectors 26 | RANDOM_ORDER Flag that specifies whether datapoints should be shuffled between batches 27 | 28 | Returns batch data: 29 | Neural network input: 30 | ATARGET, ACONTEXT List of node pairs for the first graph 31 | BTARGET, BCONTEXT List of node pairs for the second graph 32 | Neural network output: 33 | ASIM Similarity between first graph nodes in each pair 34 | ATARGET_LABEL, ACONTEXT_LABEL Labels of first graph nodes in each pair 35 | BSIM Similarity between second graph nodes in each pair 36 | PT1T2, PT1C2, PC1T2, PC1C2 Pairwise distances between first and second graphs' 
nodes 37 | 38 | """ 39 | 40 | def __init__(self, batch_size, A, Atarget, Acontext, Asimilarity, Anegative, n_negative=0, B=None, Btarget=None, Bcontext=None, Bsimilarity=None, Bnegative=None, P=None, Afeatures=None, Bfeatures=None): 41 | self.batch_size = batch_size 42 | # self.topology_similarity = topology_similarity 43 | self.n_negative = n_negative 44 | 45 | self.Anodes = np.array(range(len(A))) 46 | if Afeatures is not None: 47 | self.Afeatures = Afeatures 48 | else: 49 | self.Afeatures = self.Anodes.reshape(len(A),1) # 'dictionary' setting 50 | 51 | self.Atarget, self.Acontext, self.Asimilarity, self.Anegative = Atarget, Acontext, Asimilarity, Anegative 52 | 53 | if B is not None: 54 | self.two_graphs = True 55 | self.Bnodes = np.array(range(len(B))) 56 | if Bfeatures is not None: 57 | self.Bfeatures = Bfeatures 58 | else: 59 | self.Bfeatures = self.Bnodes.reshape(len(B),1) 60 | 61 | self.Btarget, self.Bcontext, self.Bsimilarity, self.Bnegative = Btarget, Bcontext, Bsimilarity, Bnegative 62 | 63 | if len(self.Btarget) < len(self.Atarget): 64 | idx = np.random.permutation(len(self.Atarget))[0:len(self.Btarget)] 65 | self.Atarget = self.Atarget[idx] 66 | self.Acontext = self.Acontext[idx] 67 | self.Asimilarity = self.Asimilarity[idx,:] 68 | self.Anegative = self.Anegative[idx,:] 69 | elif len(self.Btarget) > len(self.Atarget): 70 | idx = np.random.permutation(len(self.Btarget))[0:len(self.Atarget)] # downsample B pairs to match the number of A pairs 71 | self.Btarget = self.Btarget[idx] 72 | self.Bcontext = self.Bcontext[idx] 73 | self.Bsimilarity = self.Bsimilarity[idx,:] 74 | self.Bnegative = self.Bnegative[idx,:] 75 | 76 | if P is not None: 77 | self.transfer_learning = True 78 | self.P = P 79 | else: 80 | self.transfer_learning = False 81 | else: 82 | self.two_graphs = False 83 | self.transfer_learning = False 84 | 85 | def __len__(self): 86 | return int(np.ceil(len(self.Atarget) / float(self.batch_size))) 87 | 88 | def __getitem__(self, idx): 89 | Atarget = self.Afeatures[self.Atarget[idx*self.batch_size :
(idx+1)*self.batch_size],:] 90 | Acontext = self.Afeatures[self.Acontext[idx*self.batch_size : (idx+1)*self.batch_size],:] 91 | if self.Afeatures.shape[-1]>1: 92 | Atarget = Atarget.reshape((-1,1,self.Afeatures.shape[-1])) 93 | Acontext = Acontext.reshape((-1,1,self.Afeatures.shape[-1])) 94 | inputs = [np.array(Atarget), np.array(Acontext)] 95 | 96 | if self.n_negative > 0: 97 | Anegsampl = self.Afeatures[self.Anegative[idx*self.batch_size : (idx+1)*self.batch_size,:].reshape(-1,1),:] 98 | if self.Afeatures.shape[-1]>1: 99 | Anegsampl = Anegsampl.reshape((-1,self.n_negative,self.Afeatures.shape[-1])) 100 | else: 101 | Anegsampl = Anegsampl.reshape((-1,self.n_negative)) 102 | inputs.extend([Anegsampl]) 103 | 104 | Asim = self.Asimilarity[idx*self.batch_size : (idx+1)*self.batch_size, :].reshape((-1,1,self.n_negative+1)) 105 | outputs = [np.array(Asim)] 106 | 107 | if self.two_graphs: 108 | Btarget = self.Bfeatures[self.Btarget[idx*self.batch_size : (idx+1)*self.batch_size],:] 109 | Bcontext = self.Bfeatures[self.Bcontext[idx*self.batch_size : (idx+1)*self.batch_size],:] 110 | if self.Bfeatures.shape[-1]>1: 111 | Btarget = Btarget.reshape((-1,1,self.Bfeatures.shape[-1])) 112 | Bcontext = Bcontext.reshape((-1,1,self.Bfeatures.shape[-1])) 113 | inputs.extend([np.array(Btarget), np.array(Bcontext)]) 114 | 115 | if self.n_negative > 0: 116 | Bnegsampl = self.Bfeatures[self.Bnegative[idx*self.batch_size : (idx+1)*self.batch_size,:].reshape(-1,1),:] 117 | if self.Bfeatures.shape[-1]>1: 118 | Bnegsampl = Bnegsampl.reshape((-1,self.n_negative,self.Bfeatures.shape[-1])) 119 | else: 120 | Bnegsampl = Bnegsampl.reshape((-1,self.n_negative)) 121 | inputs.extend([Bnegsampl]) 122 | 123 | Bsim = self.Bsimilarity[idx*self.batch_size : (idx+1)*self.batch_size, :].reshape((-1,1,self.n_negative+1)) 124 | outputs.extend([np.array(Bsim)]) 125 | 126 | if self.transfer_learning: 127 | PtAtB = self.P[self.Atarget[idx*self.batch_size : (idx+1)*self.batch_size], 
self.Btarget[idx*self.batch_size : (idx+1)*self.batch_size]] 128 | outputs.extend([np.array(PtAtB)]) 129 | 130 | return inputs, outputs 131 | 132 | 133 | class PredictionNodesGenerator(Sequence): 134 | """ 135 | Generates prediction batches for the Graph Transfer Learning model 136 | 137 | Inputs: 138 | BATCH_SIZE Number of datapoints in each batch 139 | X Matrix of node feature vectors 140 | Y Matrix of node labels in one-hot coding format 141 | 142 | Returns batch data: 143 | Neural network input: 144 | BATCH_DATA Feature vectors of the nodes in the batch 145 | Neural network output: 146 | BATCH_LABELS Labels of the corresponding nodes 147 | 148 | """ 149 | 150 | def __init__(self, batch_size, X, Y): 151 | self.batch_size = batch_size 152 | self.data = X 153 | self.labels = Y 154 | 155 | def __len__(self): 156 | return int(np.ceil(len(self.data) / float(self.batch_size))) 157 | 158 | def __getitem__(self, idx): 159 | batch_data = self.data[idx * self.batch_size:(idx + 1) * self.batch_size,:] 160 | if self.data.shape[-1]>1: 161 | batch_data = batch_data.reshape((-1,1,self.data.shape[-1])) 162 | inputs = [batch_data] 163 | batch_labels = self.labels[idx * self.batch_size:(idx + 1) * self.batch_size,:] 164 | outputs = [batch_labels] 165 | return inputs, outputs 166 | --------------------------------------------------------------------------------
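To make the batching contract of `PredictionNodesGenerator` concrete, here is a minimal stand-alone sketch of the same `__len__`/`__getitem__` logic. The `keras.utils.Sequence` base class is omitted so the snippet runs with NumPy alone, and the toy `X`/`Y` arrays are invented for illustration:

```python
import numpy as np

class SimplePredictionGenerator:
    """Stand-in for PredictionNodesGenerator (no Sequence base class),
    illustrating how batches are sliced and reshaped."""

    def __init__(self, batch_size, X, Y):
        self.batch_size = batch_size
        self.data = X
        self.labels = Y

    def __len__(self):
        # Number of batches, rounding up so the last partial batch is kept
        return int(np.ceil(len(self.data) / float(self.batch_size)))

    def __getitem__(self, idx):
        sl = slice(idx * self.batch_size, (idx + 1) * self.batch_size)
        batch_data = self.data[sl, :]
        if self.data.shape[-1] > 1:
            # Multi-dimensional features get an extra axis of length 1,
            # matching the (batch, 1, nfeatures) shape the model expects
            batch_data = batch_data.reshape((-1, 1, self.data.shape[-1]))
        return [batch_data], [self.labels[sl, :]]

# Hypothetical data: 10 nodes with 2 features each, one-hot labels
X = np.arange(20, dtype=float).reshape(10, 2)
Y = np.eye(2)[np.arange(10) % 2]
gen = SimplePredictionGenerator(batch_size=4, X=X, Y=Y)
```

With 10 nodes and a batch size of 4, `len(gen)` is 3, full batches have shape `(4, 1, 2)`, and the final partial batch has shape `(2, 1, 2)` — the ceiling in `__len__` is what keeps that last short batch.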
/datasets/disease/GraphALabels_cluster.txt: -------------------------------------------------------------------------------- 1 | 0 1 0 2 | 1 0 0 3 | 1 0 0 4 | 1 0 0 5 | 1 0 0 6 | 1 0 0 7 | 1 0 0 8 | 1 0 0 9 | 1 0 0 10 | 1 0 0 11 | 0 1 0 12 | 1 0 0 13 | 1 0 0 14 | 0 1 0 15 | 1 0 0 16 | 0 0 1 17 | 1 0 0 18 | 1 0 0 19 | 0 1 0 20 | 1 0 0 21 | 1 0 0 22 | 1 0 0 23 | 1 0 0 24 | 1 0 0 25 | 1 0 0 26 | 1 0 0 27 | 1 0 0 28 | 1 0 0 29 | 1 0 0 30 | 1 0 0 31 | 1 0 0 32 | 1 0 0 33 | 0 1 0 34 | 1 0 0 35 | 1 0 0 36 | 1 0 0 37 | 1 0 0 38 | 1 0 0 39 | 0 1 0 40 | 1 0 0 41 | 0 1 0 42 | 1 0 0 43 | 1 0 0 44 | 0 0 1 45 | 1 0 0 46 | 1 0 0 47 | 1 0 0 48 | 1 0 0 49 | 1 0 0 50 | 1 0 0 51 | 0 0 1 52 | 1 0 0 53 | 0 0 1 54 | 1 0 0 55 | 1 0 0 56 | 1 0 0 57 | 0 0 1 58 | 0 0 1 59 | 1 0 0 60 | 1 0 0 61 | 1 0 0 62 | 0 1 0 63 | 1 0 0 64 | 1 0 0 65 | 1 0 0 66 | 1 0 0 67 | 1 0 0 68 | 1 0 0 69 | 0 1 0 70 | 1 0 0 71 | 1 0 0 72 | 1 0 0 73 | 1 0 0 74 | 1 0 0 75 | 1 0 0 76 | 1 0 0 77 | 1 0 0 78 | 1 0 0 79 | 1 0 0 80 | 1 0 0 81 | 1 0 0 82 | 1 0 0 83 | 1 0 0 84 | 1 0 0 85 | 1 0 0 86 | 1 0 0 87 | 1 0 0 88 | 1 0 0 89 | 1 0 0 90 | 0 1 0 91 | 0 0 1 92 | 0 0 1 93 | 1 0 0 94 | 1 0 0 95 | 1 0 0 96 | 0 0 1 97 | 1 0 0 98 | 1 0 0 99 | 1 0 0 100 | 1 0 0 101 | 1 0 0 102 | 1 0 0 103 | 1 0 0 104 | 1 0 0 105 | 1 0 0 106 | 0 1 0 107 | 1 0 0 108 | 0 0 1 109 | 1 0 0 110 | 1 0 0 111 | 1 0 0 112 | 1 0 0 113 | 0 0 1 114 | 1 0 0 115 | 0 1 0 116 | 1 0 0 117 | 1 0 0 118 | 1 0 0 119 | 1 0 0 120 | 0 1 0 121 | 1 0 0 122 | 1 0 0 123 | 1 0 0 124 | 1 0 0 125 | 1 0 0 126 | 1 0 0 127 | 1 0 0 128 | 0 0 1 129 | 1 0 0 130 | 1 0 0 131 | 1 0 0 132 | 1 0 0 133 | 1 0 0 134 | 1 0 0 135 | 0 1 0 136 | 1 0 0 137 | 1 0 0 138 | 1 0 0 139 | 1 0 0 140 | 0 0 1 141 | 1 0 0 142 | 1 0 0 143 | 1 0 0 144 | 1 0 0 145 | 1 0 0 146 | 0 1 0 147 | 1 0 0 148 | 1 0 0 149 | 1 0 0 150 | 1 0 0 151 | 1 0 0 152 | 1 0 0 153 | 1 0 0 154 | 1 0 0 155 | 1 0 0 156 | 1 0 0 157 | 1 0 0 158 | 1 0 0 159 | 1 0 0 160 | 1 0 0 161 | 1 0 0 162 | 0 0 1 163 | 0 1 0 164 | 1 0 0 165 | 1 0 0 
166 | 1 0 0 167 | 1 0 0 168 | 1 0 0 169 | 1 0 0 170 | 1 0 0 171 | 1 0 0 172 | 1 0 0 173 | 1 0 0 174 | 1 0 0 175 | 0 1 0 176 | 1 0 0 177 | 1 0 0 178 | 1 0 0 179 | 1 0 0 180 | 0 0 1 181 | 0 0 1 182 | 0 0 1 183 | 1 0 0 184 | 1 0 0 185 | 1 0 0 186 | 1 0 0 187 | 1 0 0 188 | 1 0 0 189 | 0 0 1 190 | 1 0 0 191 | 1 0 0 192 | 1 0 0 193 | 1 0 0 194 | 1 0 0 195 | 1 0 0 196 | 1 0 0 197 | 1 0 0 198 | 1 0 0 199 | 1 0 0 200 | 1 0 0 201 | 0 0 1 202 | 1 0 0 203 | 1 0 0 204 | 0 0 1 205 | 1 0 0 206 | 1 0 0 207 | 1 0 0 208 | 1 0 0 209 | 1 0 0 210 | 1 0 0 211 | 1 0 0 212 | 1 0 0 213 | 1 0 0 214 | 1 0 0 215 | 1 0 0 216 | 1 0 0 217 | 1 0 0 218 | 1 0 0 219 | 1 0 0 220 | 0 0 1 221 | 1 0 0 222 | 0 1 0 223 | 1 0 0 224 | 1 0 0 225 | 1 0 0 226 | 1 0 0 227 | 1 0 0 228 | 1 0 0 229 | 1 0 0 230 | 0 1 0 231 | 1 0 0 232 | 1 0 0 233 | 1 0 0 234 | 0 0 1 235 | 1 0 0 236 | 1 0 0 237 | 1 0 0 238 | 1 0 0 239 | 1 0 0 240 | 1 0 0 241 | 1 0 0 242 | 1 0 0 243 | 1 0 0 244 | 1 0 0 245 | 1 0 0 246 | 1 0 0 247 | 1 0 0 248 | 1 0 0 249 | 1 0 0 250 | 1 0 0 251 | 1 0 0 252 | 1 0 0 253 | 1 0 0 254 | 1 0 0 255 | 1 0 0 256 | 1 0 0 257 | 0 1 0 258 | 0 1 0 259 | 1 0 0 260 | 1 0 0 261 | 1 0 0 262 | 1 0 0 263 | 1 0 0 264 | 1 0 0 265 | 1 0 0 266 | 1 0 0 267 | 0 0 1 268 | 1 0 0 269 | 1 0 0 270 | 1 0 0 271 | 1 0 0 272 | 1 0 0 273 | 0 0 1 274 | 0 0 1 275 | 1 0 0 276 | 1 0 0 277 | 1 0 0 278 | 1 0 0 279 | 1 0 0 280 | 1 0 0 281 | 1 0 0 282 | 0 1 0 283 | 1 0 0 284 | 1 0 0 285 | 1 0 0 286 | 1 0 0 287 | 1 0 0 288 | 1 0 0 289 | 0 0 1 290 | 1 0 0 291 | 1 0 0 292 | 1 0 0 293 | 1 0 0 294 | 1 0 0 295 | 1 0 0 296 | 0 1 0 297 | 1 0 0 298 | 1 0 0 299 | 1 0 0 300 | 1 0 0 301 | 1 0 0 302 | 1 0 0 303 | 1 0 0 304 | 1 0 0 305 | 0 0 1 306 | 1 0 0 307 | 1 0 0 308 | 1 0 0 309 | 1 0 0 310 | 1 0 0 311 | 1 0 0 312 | 1 0 0 313 | 1 0 0 314 | 1 0 0 315 | 1 0 0 316 | 0 0 1 317 | 1 0 0 318 | 1 0 0 319 | 1 0 0 320 | 1 0 0 321 | 1 0 0 322 | 1 0 0 323 | 0 1 0 324 | 1 0 0 325 | 1 0 0 326 | 1 0 0 327 | 0 0 1 328 | 0 1 0 329 | 1 0 0 330 | 1 0 0 331 | 1 0 0 332 | 1 
0 0 333 | 1 0 0 334 | 1 0 0 335 | 1 0 0 336 | 1 0 0 337 | 1 0 0 338 | 1 0 0 339 | 1 0 0 340 | 1 0 0 341 | 1 0 0 342 | 1 0 0 343 | 1 0 0 344 | 1 0 0 345 | 1 0 0 346 | 0 1 0 347 | 0 1 0 348 | 0 0 1 349 | 1 0 0 350 | 1 0 0 351 | 0 1 0 352 | 0 1 0 353 | 0 0 1 354 | 1 0 0 355 | 1 0 0 356 | 1 0 0 357 | 1 0 0 358 | 1 0 0 359 | 1 0 0 360 | 1 0 0 361 | 1 0 0 362 | 1 0 0 363 | 1 0 0 364 | 1 0 0 365 | 1 0 0 366 | 1 0 0 367 | 1 0 0 368 | 1 0 0 369 | 1 0 0 370 | 1 0 0 371 | 0 1 0 372 | 1 0 0 373 | 1 0 0 374 | 1 0 0 375 | 0 0 1 376 | 0 0 1 377 | 0 1 0 378 | 1 0 0 379 | 1 0 0 380 | 1 0 0 381 | 1 0 0 382 | 1 0 0 383 | 1 0 0 384 | 1 0 0 385 | 1 0 0 386 | 1 0 0 387 | 1 0 0 388 | 1 0 0 389 | 0 1 0 390 | 1 0 0 391 | 1 0 0 392 | 1 0 0 393 | 1 0 0 394 | 1 0 0 395 | 1 0 0 396 | 1 0 0 397 | 1 0 0 398 | 1 0 0 399 | 1 0 0 400 | 1 0 0 401 | 1 0 0 402 | 1 0 0 403 | 1 0 0 404 | 1 0 0 405 | 1 0 0 406 | 0 0 1 407 | 1 0 0 408 | 1 0 0 409 | 1 0 0 410 | 0 1 0 411 | 1 0 0 412 | 1 0 0 413 | 1 0 0 414 | 1 0 0 415 | 1 0 0 416 | 1 0 0 417 | 0 0 1 418 | 1 0 0 419 | 1 0 0 420 | 1 0 0 421 | 1 0 0 422 | 1 0 0 423 | 1 0 0 424 | 1 0 0 425 | 1 0 0 426 | 1 0 0 427 | 1 0 0 428 | 1 0 0 429 | 0 1 0 430 | 0 1 0 431 | 1 0 0 432 | 1 0 0 433 | 0 1 0 434 | 0 1 0 435 | 0 0 1 436 | 0 0 1 437 | 1 0 0 438 | 1 0 0 439 | 1 0 0 440 | 1 0 0 441 | 1 0 0 442 | 1 0 0 443 | 1 0 0 444 | 1 0 0 445 | 1 0 0 446 | 1 0 0 447 | 0 0 1 448 | 1 0 0 449 | 1 0 0 450 | 1 0 0 451 | 1 0 0 452 | 1 0 0 453 | 1 0 0 454 | 1 0 0 455 | 1 0 0 456 | 0 1 0 457 | 1 0 0 458 | 1 0 0 459 | 1 0 0 460 | 1 0 0 461 | 1 0 0 462 | 1 0 0 463 | 1 0 0 464 | 1 0 0 465 | 1 0 0 466 | 1 0 0 467 | 1 0 0 468 | 1 0 0 469 | 1 0 0 470 | 1 0 0 471 | 1 0 0 472 | 0 1 0 473 | 0 0 1 474 | 1 0 0 475 | 1 0 0 476 | 1 0 0 477 | 0 0 1 478 | 0 1 0 479 | 1 0 0 480 | 1 0 0 481 | 0 0 1 482 | 0 0 1 483 | 1 0 0 484 | 1 0 0 485 | 1 0 0 486 | 1 0 0 487 | 1 0 0 488 | 1 0 0 489 | 1 0 0 490 | 1 0 0 491 | 0 1 0 492 | 1 0 0 493 | 1 0 0 494 | 1 0 0 495 | 1 0 0 496 | 1 0 0 497 | 1 0 0 498 | 1 0 0 499 
| 1 0 0 500 | 1 0 0 501 | 1 0 0 502 | 1 0 0 503 | 1 0 0 504 | 1 0 0 505 | 0 0 1 506 | 1 0 0 507 | 1 0 0 508 | 1 0 0 509 | 1 0 0 510 | 1 0 0 511 | 1 0 0 512 | 1 0 0 513 | 1 0 0 514 | 1 0 0 515 | 1 0 0 516 | 1 0 0 517 | 1 0 0 518 | 1 0 0 519 | 0 1 0 520 | 1 0 0 521 | 0 0 1 522 | 1 0 0 523 | 1 0 0 524 | 0 0 1 525 | 1 0 0 526 | 0 1 0 527 | 1 0 0 528 | 1 0 0 529 | 1 0 0 530 | 1 0 0 531 | 1 0 0 532 | 1 0 0 533 | 1 0 0 534 | 1 0 0 535 | 1 0 0 536 | 1 0 0 537 | 1 0 0 538 | 1 0 0 539 | 1 0 0 540 | 0 1 0 541 | 1 0 0 542 | 1 0 0 543 | 1 0 0 544 | 1 0 0 545 | 0 0 1 546 | 1 0 0 547 | 1 0 0 548 | 1 0 0 549 | 1 0 0 550 | 1 0 0 551 | 1 0 0 552 | 1 0 0 553 | 1 0 0 554 | 1 0 0 555 | 1 0 0 556 | 1 0 0 557 | 1 0 0 558 | 1 0 0 559 | 1 0 0 560 | 1 0 0 561 | 1 0 0 562 | 1 0 0 563 | 1 0 0 564 | 1 0 0 565 | 0 1 0 566 | 1 0 0 567 | 1 0 0 568 | 1 0 0 569 | 0 1 0 570 | 1 0 0 571 | 1 0 0 572 | 1 0 0 573 | 1 0 0 574 | 1 0 0 575 | 0 1 0 576 | 1 0 0 577 | 1 0 0 578 | 1 0 0 579 | 0 1 0 580 | 1 0 0 581 | 1 0 0 582 | 1 0 0 583 | 1 0 0 584 | 1 0 0 585 | 1 0 0 586 | 1 0 0 587 | 1 0 0 588 | 1 0 0 589 | 1 0 0 590 | 1 0 0 591 | 1 0 0 592 | 1 0 0 593 | 1 0 0 594 | 1 0 0 595 | 1 0 0 596 | 1 0 0 597 | 1 0 0 598 | 0 0 1 599 | 1 0 0 600 | 1 0 0 601 | 1 0 0 602 | 1 0 0 603 | 1 0 0 604 | 1 0 0 605 | 1 0 0 606 | 1 0 0 607 | 1 0 0 608 | 1 0 0 609 | 1 0 0 610 | 1 0 0 611 | 1 0 0 612 | 0 1 0 613 | 0 1 0 614 | 0 1 0 615 | 1 0 0 616 | 1 0 0 617 | 1 0 0 618 | 1 0 0 619 | 0 1 0 620 | 0 0 1 621 | 0 0 1 622 | 0 1 0 623 | 0 1 0 624 | 1 0 0 625 | 1 0 0 626 | 1 0 0 627 | 0 0 1 628 | 1 0 0 629 | 1 0 0 630 | 1 0 0 631 | 1 0 0 632 | 1 0 0 633 | 1 0 0 634 | 0 1 0 635 | 0 1 0 636 | 1 0 0 637 | 1 0 0 638 | 1 0 0 639 | 0 1 0 640 | 1 0 0 641 | 0 0 1 642 | 0 0 1 643 | 1 0 0 644 | 1 0 0 645 | 1 0 0 646 | 1 0 0 647 | 1 0 0 648 | 0 1 0 649 | 1 0 0 650 | 1 0 0 651 | 1 0 0 652 | 1 0 0 653 | 1 0 0 654 | 1 0 0 655 | 0 1 0 656 | 1 0 0 657 | 1 0 0 658 | 1 0 0 659 | 1 0 0 660 | 1 0 0 661 | 1 0 0 662 | 1 0 0 663 | 0 1 0 664 | 1 0 0 665 | 1 0 0 
666 | 1 0 0 667 | 1 0 0 668 | 1 0 0 669 | 1 0 0 670 | 1 0 0 671 | 1 0 0 672 | 0 0 1 673 | 0 0 1 674 | 1 0 0 675 | 1 0 0 676 | 1 0 0 677 | 1 0 0 678 | 1 0 0 679 | 0 0 1 680 | 0 0 1 681 | 1 0 0 682 | 1 0 0 683 | 1 0 0 684 | 1 0 0 685 | 1 0 0 686 | 1 0 0 687 | 1 0 0 688 | 1 0 0 689 | 1 0 0 690 | 1 0 0 691 | 1 0 0 692 | 0 0 1 693 | 1 0 0 694 | 1 0 0 695 | 1 0 0 696 | 0 1 0 697 | 0 1 0 698 | 1 0 0 699 | 1 0 0 700 | 1 0 0 701 | 1 0 0 702 | 1 0 0 703 | 1 0 0 704 | 0 0 1 705 | 0 0 1 706 | 1 0 0 707 | 1 0 0 708 | 1 0 0 709 | 1 0 0 710 | 0 1 0 711 | 1 0 0 712 | 1 0 0 713 | 0 0 1 714 | 1 0 0 715 | 1 0 0 716 | 1 0 0 717 | 1 0 0 718 | 1 0 0 719 | 1 0 0 720 | 0 1 0 721 | 1 0 0 722 | 0 1 0 723 | 1 0 0 724 | 1 0 0 725 | 1 0 0 726 | 0 0 1 727 | 1 0 0 728 | 1 0 0 729 | 1 0 0 730 | 0 1 0 731 | 1 0 0 732 | 1 0 0 733 | 1 0 0 734 | 1 0 0 735 | 1 0 0 736 | 1 0 0 737 | 0 1 0 738 | 0 0 1 739 | 1 0 0 740 | 0 1 0 741 | 0 1 0 742 | 1 0 0 743 | 0 0 1 744 | 1 0 0 745 | 1 0 0 746 | 1 0 0 747 | 1 0 0 748 | 1 0 0 749 | 1 0 0 750 | 1 0 0 751 | 0 0 1 752 | 0 1 0 753 | 1 0 0 754 | 0 1 0 755 | 1 0 0 756 | 1 0 0 757 | 1 0 0 758 | 1 0 0 759 | 1 0 0 760 | 1 0 0 761 | 1 0 0 762 | 1 0 0 763 | 1 0 0 764 | 1 0 0 765 | 0 1 0 766 | 1 0 0 767 | 0 0 1 768 | 1 0 0 769 | 1 0 0 770 | 1 0 0 771 | 1 0 0 772 | 1 0 0 773 | 1 0 0 774 | 1 0 0 775 | 1 0 0 776 | 1 0 0 777 | 1 0 0 778 | 1 0 0 779 | 1 0 0 780 | 1 0 0 781 | 1 0 0 782 | 1 0 0 783 | 0 1 0 784 | 1 0 0 785 | 0 1 0 786 | 1 0 0 787 | 1 0 0 788 | 1 0 0 789 | -------------------------------------------------------------------------------- /code/load_save_data.py: -------------------------------------------------------------------------------- 1 | """ 2 | Module is responsible for results saving 3 | 4 | @author: Andrey Gritsenko 5 | SPIRAL Group 6 | Electrical & Computer Engineering 7 | Northeastern University 8 | """ 9 | 10 | import os 11 | import numpy as np 12 | import pickle 13 | 14 | from graph_utils import PermuteGraph, ReverseEdges, GetDoublyStochasticMatrix 15 
| 16 | 17 | def load_data(dataset_path, labels, transmode, B_from_A='permute', disc_pect=None): 18 | """ 19 | Loads dataset 20 | 21 | INPUTS: 22 | DATASET_PATH Path to the data directory 23 | LABELS Specifies type of labels to be used ('cluster' or 'infection') 24 | TRANSMODE Transfer learning mode; '1graph' loads only the first graph 25 | B_FROM_A Specifies how the second graph is derived from the first ('permute' or 'modify') 26 | DISC_PECT Percentage of edges to reverse when B_FROM_A is 'modify' 27 | 28 | OUTPUTS: 29 | NNODES Number of nodes in graphs 30 | NFEATURES Number of node features 31 | NLABELS Number of node labels 32 | A Adjacency matrix of the first graph 33 | AFEATURES Node features of the first graph 34 | ALABELS Node labels of the first graph 35 | ATRAIN Indices of nodes from the first graph in the train set 36 | ATEST Indices of nodes from the first graph in the test set 37 | B Adjacency matrix of the second graph, if it is used 38 | BFEATURES Node features of the second graph, if it is used 39 | BLABELS Node labels of the second graph, if it is used 40 | P Permutation matrix between the graphs, if it is used 41 | 42 | """ 43 | 44 | A = np.loadtxt(os.path.join(dataset_path, 'GraphA.txt')).astype(int) 45 | nnodes = A.shape[0] 46 | if all(os.path.exists(os.path.join(dataset_path, f"GraphA{split}.txt")) for split in ['Train', 'Test']): 47 | Atrain = np.loadtxt(os.path.join(dataset_path, 'GraphATrain.txt')).astype(int) 48 | Atest = np.loadtxt(os.path.join(dataset_path, 'GraphATest.txt')).astype(int) 49 | else: 50 | ntrain = 0.8 51 | Aindices = np.random.permutation(len(A)) 52 | Atrain = Aindices[:np.ceil(ntrain * len(A)).astype(int)] 53 | Atest = Aindices[np.ceil(ntrain * len(A)).astype(int):] 54 | 55 | if os.path.exists(os.path.join(dataset_path, 'GraphAFeatures.txt')): 56 | Afeatures = np.loadtxt(os.path.join(dataset_path, 'GraphAFeatures.txt')) 57 | else: 58 | Afeatures = np.arange(len(A)).reshape(-1, 1) 59 | if Afeatures.ndim == 1: 60 | Afeatures = Afeatures.reshape(-1, 1) 61 | nfeatures =
Afeatures.shape[1] 62 | 63 | Alabels = np.loadtxt(os.path.join(dataset_path, 'GraphALabels_' + labels + '.txt')) 64 | if Alabels.ndim == 1: 65 | Alabels = Alabels.reshape(-1, 1) 66 | nlabels = Alabels.shape[1] 67 | 68 | if transmode == '1graph': 69 | B, Bfeatures, Blabels, P = None, None, None, None 70 | else: 71 | B, Blabels = np.copy(A), np.copy(Alabels) 72 | B, Blabels, P = PermuteGraph(B) 73 | if B_from_A == 'modify': 74 | B = ReverseEdges(B, disc_pect) 75 | if transmode == 'trueP_ds': 76 | P = GetDoublyStochasticMatrix(Alabels, P) 77 | if os.path.exists(os.path.join(dataset_path, 'GraphAFeatures.txt')): 78 | Bfeatures = np.matmul(P.T, Afeatures) # reorder node rows to match the permuted graph; Afeatures is (nnodes, nfeatures), so right-multiplying by P is dimensionally invalid 79 | else: 80 | Bfeatures = np.array(range(len(B))).reshape((len(B), 1)) 81 | if Bfeatures.ndim == 1: 82 | Bfeatures = Bfeatures.reshape(-1, 1) 83 | 84 | return nnodes, nfeatures, nlabels, A, Afeatures, Alabels, Atrain, Atest, B, Bfeatures, Blabels, P 85 | 86 | 87 | def lookup_lod(lod, **kw): 88 | res = [] 89 | idx = [] 90 | for ind in range(len(lod)): 91 | row = lod[ind] 92 | for k, v in kw.items(): 93 | if row[k] != str(v): 94 | break 95 | else: 96 | res.append(row) 97 | idx.append(ind) 98 | return res, idx 99 | 100 | 101 | # save results for each iteration 102 | def save_iteration_results(save_path, suffix, Atrain, Atest, Aembedding, Alabels_output, Alabels_predicted, B, Bfeatures, Bembedding, Blabels, Blabels_output, Blabels_predicted, Ptrue, P, acc_results): 103 | output_dict = dict(zip(['GraphATrain', 'GraphATest', 'GraphAEmbedding', 'GraphALabels_outputs'], [Atrain, Atest, Aembedding, Alabels_output])) 104 | if B is not None: 105 | output_dict.update(dict(zip(['GraphB', 'GraphBFeatures', 'GraphBLabels', 'GraphBEmbedding', 'GraphBLabels_outputs', 'P_true'], [B, Bfeatures, Blabels, Bembedding, Blabels_output, Ptrue]))) 106 | if P is not None: 107 | output_dict.update(dict(zip(['P_predicted'], [P]))) 108 | 109 | if Alabels_predicted is not None: 110 |
output_dict.update(dict(zip(['GraphALabels_predicted'], [Alabels_predicted]))) 111 | if B is not None: 112 | output_dict.update(dict(zip(['GraphBLabels_predicted'], [Blabels_predicted]))) 113 | 114 | output_dict.update(dict(zip(['Accuracy'], [acc_results]))) 115 | with open(save_path + 'results' + suffix + '.pkl', 'wb') as handle: 116 | pickle.dump(output_dict, handle, protocol=pickle.HIGHEST_PROTOCOL) 117 | 118 | 119 | # add results to global results file 120 | def save_global_results(args, iter, best_results, picklename): 121 | save_path = args.save_path[0:args.save_path.find('results/')+8] 122 | 123 | labels = args.labels 124 | transmode = args.transfer_mode 125 | 126 | epochs_mean = np.mean(list(best_results['epochs'].values())) 127 | epochs_std = np.std(list(best_results['epochs'].values())) 128 | train_mean = np.mean(list(best_results['train'].values())) 129 | train_std = np.std(list(best_results['train'].values())) 130 | testA_mean = np.mean(list(best_results['testA'].values())) 131 | testA_std = np.std(list(best_results['testA'].values())) 132 | 133 | with open(args.save_path + 'Accuracy_(' + str(args.iterations) + ')_' + str(args.epochs) + '.txt', "w") as f: 134 | if transmode == '1graph': 135 | if labels == 'cluster': 136 | f.write("%4d\t%.4f (%.4f)\t%.4f (%.4f)\t%.4f (%.4f)\n" % (iter, epochs_mean, epochs_std, train_mean, train_std, testA_mean, testA_std)) 137 | else: 138 | rsquaredA_mean = np.mean(list(best_results['rsquaredA'].values())) 139 | rsquaredA_std = np.std(list(best_results['rsquaredA'].values())) 140 | f.write("%4d\t%.4f (%.4f)\t%.4f (%.4f)\t%.4f (%.4f)\t%.4f (%.4f)\n" % (iter, epochs_mean, epochs_std, train_mean, train_std, testA_mean, testA_std, rsquaredA_mean, rsquaredA_std)) 141 | else: 142 | testB_mean = np.mean(list(best_results['testB'].values())) 143 | testB_std = np.std(list(best_results['testB'].values())) 144 | if labels == 'cluster': 145 | f.write("%4d\t%.4f (%.4f)\t%.4f (%.4f)\t%.4f (%.4f)\t%.4f (%.4f)\n" % (iter, epochs_mean, 
epochs_std, train_mean, train_std, testA_mean, testA_std, testB_mean, testB_std)) 146 | else: 147 | rsquaredA_mean = np.mean(list(best_results['rsquaredA'].values())) 148 | rsquaredA_std = np.std(list(best_results['rsquaredA'].values())) 149 | rsquaredB_mean = np.mean(list(best_results['rsquaredB'].values())) 150 | rsquaredB_std = np.std(list(best_results['rsquaredB'].values())) 151 | f.write("%4d\t%.4f (%.4f)\t%.4f (%.4f)\t%.4f (%.4f)\t%.4f (%.4f)\t%.4f (%.4f)\t%.4f (%.4f)\n" % (iter, epochs_mean, epochs_std, train_mean, train_std, testA_mean, testA_std, rsquaredA_mean, rsquaredA_std, testB_mean, testB_std, rsquaredB_mean, rsquaredB_std)) 152 | 153 | new_res = {'Dataset':args.dataset, 'Cliq':args.ncliq if args.dataset=='synthetic' else 'N/A', 'Labels':args.labels, 'Features':args.features, 'Embedding':args.embedding_type, 'TopSim':args.topology_similarity, 'EmbSim':args.embedding_similarity, 'SimLoss':args.similarity_loss, 'TransMode':transmode, 'GraphDist':args.graph_distance, 'Iterations':iter, 'Epochs_mean':epochs_mean, 'Epochs_std':epochs_std, 'TrainAcc_mean':train_mean, 'TrainAcc_std':train_std, 'TestAccA_mean':testA_mean, 'TestAccA_std':testA_std, 'RsquaredA_mean':rsquaredA_mean if labels!='cluster' else 'N/A', 'RsquaredA_std':rsquaredA_std if labels!='cluster' else 'N/A', 'TestAccB_mean':testB_mean if transmode!='1graph' else 'N/A', 'TestAccB_std':testB_std if transmode!='1graph' else 'N/A', 'RsquaredB_mean':rsquaredB_mean if (labels!='cluster' and transmode!='1graph') else 'N/A', 'RsquaredB_std':rsquaredB_std if (labels!='cluster' and transmode!='1graph') else 'N/A'} 154 | try: 155 | with open(save_path + picklename + '.pkl', 'rb') as handle: 156 | results = pickle.load(handle) 157 | _, idx = lookup_lod(results, Dataset=args.dataset, Cliq=args.ncliq, Labels=args.labels, Features=args.features, Embedding=args.embedding_type, TopSim=args.topology_similarity, EmbSim=args.embedding_similarity, SimLoss=args.similarity_loss, TransMode=transmode, 
GraphDist=args.graph_distance) 158 | if len(idx)==0: 159 | results.append(new_res) 160 | else: 161 | results[idx[0]] = new_res 162 | print("SAVING: Update an existing record") 163 | except: 164 | results = [new_res] 165 | print("SAVING: Create a new record") 166 | with open(save_path + picklename + '.pkl', 'wb') as handle: 167 | pickle.dump(results, handle, protocol=pickle.HIGHEST_PROTOCOL) 168 | -------------------------------------------------------------------------------- /datasets/disease/GraphALabels_infection.txt: -------------------------------------------------------------------------------- 1 | 0.5000 2 | 0.5000 3 | 0.7615 4 | 0.5000 5 | 0.5000 6 | 0.7555 7 | 0.7510 8 | 0.5000 9 | 0.5000 10 | 0.5000 11 | 0.7465 12 | 0.7365 13 | 0.5000 14 | 0.7530 15 | 0.5000 16 | 0.7535 17 | 0.5000 18 | 0.5000 19 | 0.7500 20 | 0.5000 21 | 0.7645 22 | 0.7535 23 | 0.5000 24 | 0.7635 25 | 0.7585 26 | 0.4895 27 | 0.7490 28 | 0.7560 29 | 0.5000 30 | 0.7515 31 | 0.7630 32 | 0.5000 33 | 0.7490 34 | 0.4998 35 | 0.7495 36 | 0.7550 37 | 0.7530 38 | 0.5000 39 | 0.5000 40 | 0.5000 41 | 0.5000 42 | 0.7550 43 | 0.7500 44 | 0.5000 45 | 0.5000 46 | 0.5000 47 | 0.5000 48 | 0.7545 49 | 0.5000 50 | 0.7545 51 | 0.5000 52 | 0.5000 53 | 0.4852 54 | 0.7335 55 | 0.5000 56 | 0.7375 57 | 0.7485 58 | 0.5000 59 | 0.5000 60 | 0.7405 61 | 0.5000 62 | 0.5000 63 | 0.7390 64 | 0.5000 65 | 0.5000 66 | 0.5000 67 | 0.5000 68 | 0.5000 69 | 0.5000 70 | 0.7440 71 | 0.7630 72 | 0.5000 73 | 0.5000 74 | 0.7480 75 | 0.5000 76 | 0.7425 77 | 0.7445 78 | 0.5000 79 | 0.7630 80 | 0.7485 81 | 0.7455 82 | 0.7490 83 | 0.7580 84 | 0.5000 85 | 0.7560 86 | 0.7475 87 | 0.5000 88 | 0.7425 89 | 0.7650 90 | 0.5000 91 | 0.5000 92 | 0.7395 93 | 0.7555 94 | 0.5000 95 | 0.7385 96 | 0.5000 97 | 0.7635 98 | 0.7470 99 | 0.5000 100 | 0.5000 101 | 0.7400 102 | 0.7475 103 | 0.7530 104 | 0.5000 105 | 0.7525 106 | 0.5000 107 | 0.7450 108 | 0.4853 109 | 0.7435 110 | 0.7425 111 | 0.5000 112 | 0.7555 113 | 0.4915 114 | 0.7465 115 | 0.5000 116 
| 0.5000 117 | 0.5000 118 | 0.5000 119 | 0.7450 120 | 0.5000 121 | 0.7490 122 | 0.7690 123 | 0.5000 124 | 0.5000 125 | 0.5000 126 | 0.7435 127 | 0.7385 128 | 0.5000 129 | 0.7500 130 | 0.5000 131 | 0.5000 132 | 0.7535 133 | 0.7580 134 | 0.7440 135 | 0.5000 136 | 0.5000 137 | 0.5000 138 | 0.7615 139 | 0.5000 140 | 0.5000 141 | 0.7505 142 | 0.5000 143 | 0.7535 144 | 0.5000 145 | 0.5000 146 | 0.5000 147 | 0.7465 148 | 0.7625 149 | 0.5000 150 | 0.7460 151 | 0.5000 152 | 0.7535 153 | 0.7445 154 | 0.7460 155 | 0.7525 156 | 0.7535 157 | 0.7535 158 | 0.5000 159 | 0.7620 160 | 0.7570 161 | 0.7440 162 | 0.4968 163 | 0.5000 164 | 0.5000 165 | 0.5000 166 | 0.5000 167 | 0.7600 168 | 0.7550 169 | 0.7540 170 | 0.7490 171 | 0.7595 172 | 0.7470 173 | 0.7590 174 | 0.7420 175 | 0.7440 176 | 0.7305 177 | 0.7600 178 | 0.7545 179 | 0.5000 180 | 0.5000 181 | 0.5000 182 | 0.5000 183 | 0.7605 184 | 0.7400 185 | 0.7445 186 | 0.7465 187 | 0.5000 188 | 0.7345 189 | 0.7585 190 | 0.7395 191 | 0.7540 192 | 0.7455 193 | 0.5000 194 | 0.7470 195 | 0.7625 196 | 0.7505 197 | 0.5000 198 | 0.5000 199 | 0.7515 200 | 0.7535 201 | 0.7490 202 | 0.7535 203 | 0.7465 204 | 0.5000 205 | 0.5000 206 | 0.7680 207 | 0.5000 208 | 0.7510 209 | 0.7515 210 | 0.7590 211 | 0.7525 212 | 0.7575 213 | 0.7495 214 | 0.5000 215 | 0.5000 216 | 0.7565 217 | 0.7670 218 | 0.5000 219 | 0.7550 220 | 0.4907 221 | 0.7465 222 | 0.5000 223 | 0.7605 224 | 0.5000 225 | 0.7490 226 | 0.5000 227 | 0.5000 228 | 0.7430 229 | 0.5000 230 | 0.7570 231 | 0.7520 232 | 0.5000 233 | 0.7565 234 | 0.4950 235 | 0.7705 236 | 0.7440 237 | 0.5000 238 | 0.5000 239 | 0.7570 240 | 0.5000 241 | 0.7475 242 | 0.7690 243 | 0.5000 244 | 0.5000 245 | 0.5000 246 | 0.5000 247 | 0.7565 248 | 0.5000 249 | 0.5000 250 | 0.5000 251 | 0.7515 252 | 0.5000 253 | 0.7480 254 | 0.7485 255 | 0.7535 256 | 0.7510 257 | 0.5000 258 | 0.5000 259 | 0.7460 260 | 0.4227 261 | 0.5000 262 | 0.7485 263 | 0.5000 264 | 0.7430 265 | 0.7645 266 | 0.7580 267 | 0.3643 268 | 0.7575 269 | 0.5000 
270 | 0.7450 271 | 0.5000 272 | 0.5000 273 | 0.5000 274 | 0.5000 275 | 0.5000 276 | 0.7490 277 | 0.7520 278 | 0.7575 279 | 0.7445 280 | 0.7540 281 | 0.7535 282 | 0.5000 283 | 0.5000 284 | 0.7575 285 | 0.5000 286 | 0.7580 287 | 0.5000 288 | 0.7435 289 | 0.4628 290 | 0.7505 291 | 0.7510 292 | 0.5000 293 | 0.7525 294 | 0.7575 295 | 0.7535 296 | 0.7645 297 | 0.5000 298 | 0.5000 299 | 0.7650 300 | 0.5000 301 | 0.7520 302 | 0.5000 303 | 0.5000 304 | 0.7530 305 | 0.5000 306 | 0.5000 307 | 0.5000 308 | 0.5000 309 | 0.7465 310 | 0.7565 311 | 0.5000 312 | 0.7565 313 | 0.5000 314 | 0.7535 315 | 0.5000 316 | 0.5000 317 | 0.7555 318 | 0.7635 319 | 0.7335 320 | 0.7495 321 | 0.5000 322 | 0.7455 323 | 0.5000 324 | 0.7470 325 | 0.7705 326 | 0.7330 327 | 0.5000 328 | 0.7505 329 | 0.5000 330 | 0.5000 331 | 0.5000 332 | 0.7485 333 | 0.7490 334 | 0.7545 335 | 0.7460 336 | 0.7535 337 | 0.5000 338 | 0.5000 339 | 0.5000 340 | 0.7445 341 | 0.5000 342 | 0.5000 343 | 0.5000 344 | 0.7505 345 | 0.7555 346 | 0.7690 347 | 0.5000 348 | 0.5000 349 | 0.7525 350 | 0.5000 351 | 0.7545 352 | 0.5000 353 | 0.5000 354 | 0.7515 355 | 0.5000 356 | 0.5000 357 | 0.7730 358 | 0.7590 359 | 0.7510 360 | 0.5000 361 | 0.7440 362 | 0.5000 363 | 0.7505 364 | 0.5000 365 | 0.7480 366 | 0.7580 367 | 0.7610 368 | 0.7565 369 | 0.7425 370 | 0.5000 371 | 0.5000 372 | 0.5000 373 | 0.7565 374 | 0.7520 375 | 0.5000 376 | 0.3543 377 | 0.7475 378 | 0.5000 379 | 0.7545 380 | 0.7475 381 | 0.5000 382 | 0.5000 383 | 0.7575 384 | 0.5000 385 | 0.5000 386 | 0.7640 387 | 0.5000 388 | 0.5000 389 | 0.5000 390 | 0.5000 391 | 0.7610 392 | 0.7510 393 | 0.7585 394 | 0.7435 395 | 0.7450 396 | 0.7570 397 | 0.5000 398 | 0.5000 399 | 0.7535 400 | 0.5000 401 | 0.7425 402 | 0.7520 403 | 0.5000 404 | 0.7465 405 | 0.7440 406 | 0.5000 407 | 0.5000 408 | 0.5000 409 | 0.5000 410 | 0.5000 411 | 0.5000 412 | 0.7600 413 | 0.7475 414 | 0.7645 415 | 0.7595 416 | 0.7520 417 | 0.7560 418 | 0.5000 419 | 0.5000 420 | 0.5000 421 | 0.5000 422 | 0.7450 423 | 
0.7620 424 | 0.7545 425 | 0.7420 426 | 0.5000 427 | 0.5000 428 | 0.7350 429 | 0.7480 430 | 0.7530 431 | 0.7625 432 | 0.7435 433 | 0.5000 434 | 0.5000 435 | 0.7450 436 | 0.4882 437 | 0.5000 438 | 0.5000 439 | 0.5000 440 | 0.5000 441 | 0.7620 442 | 0.5000 443 | 0.5000 444 | 0.7480 445 | 0.7430 446 | 0.5000 447 | 0.7560 448 | 0.5000 449 | 0.7620 450 | 0.7470 451 | 0.5000 452 | 0.5000 453 | 0.7435 454 | 0.7455 455 | 0.7450 456 | 0.5000 457 | 0.7450 458 | 0.7525 459 | 0.5000 460 | 0.5000 461 | 0.7445 462 | 0.5000 463 | 0.5000 464 | 0.5000 465 | 0.5000 466 | 0.5000 467 | 0.7590 468 | 0.7560 469 | 0.5000 470 | 0.7490 471 | 0.5000 472 | 0.5000 473 | 0.5000 474 | 0.5000 475 | 0.5000 476 | 0.7585 477 | 0.4975 478 | 0.5000 479 | 0.7510 480 | 0.5000 481 | 0.7480 482 | 0.7455 483 | 0.7575 484 | 0.7445 485 | 0.7460 486 | 0.7660 487 | 0.5000 488 | 0.5000 489 | 0.7480 490 | 0.7545 491 | 0.7510 492 | 0.5000 493 | 0.7495 494 | 0.7475 495 | 0.7510 496 | 0.5000 497 | 0.7400 498 | 0.7505 499 | 0.5000 500 | 0.5000 501 | 0.5000 502 | 0.5000 503 | 0.5000 504 | 0.7455 505 | 0.5000 506 | 0.7385 507 | 0.5000 508 | 0.7565 509 | 0.7615 510 | 0.5000 511 | 0.7455 512 | 0.7525 513 | 0.7535 514 | 0.5000 515 | 0.7595 516 | 0.7565 517 | 0.7530 518 | 0.7495 519 | 0.5000 520 | 0.7620 521 | 0.7650 522 | 0.7595 523 | 0.7350 524 | 0.5000 525 | 0.5000 526 | 0.5000 527 | 0.7465 528 | 0.7585 529 | 0.5000 530 | 0.7345 531 | 0.7570 532 | 0.7420 533 | 0.7560 534 | 0.7410 535 | 0.7335 536 | 0.5000 537 | 0.7490 538 | 0.7495 539 | 0.5000 540 | 0.5000 541 | 0.5000 542 | 0.5000 543 | 0.5000 544 | 0.7545 545 | 0.4505 546 | 0.5000 547 | 0.5000 548 | 0.7550 549 | 0.5000 550 | 0.7500 551 | 0.7515 552 | 0.7540 553 | 0.5000 554 | 0.7560 555 | 0.7585 556 | 0.5000 557 | 0.7510 558 | 0.5000 559 | 0.5000 560 | 0.7380 561 | 0.7500 562 | 0.5000 563 | 0.7405 564 | 0.5000 565 | 0.5000 566 | 0.7580 567 | 0.7565 568 | 0.7555 569 | 0.5000 570 | 0.7465 571 | 0.7475 572 | 0.7465 573 | 0.7605 574 | 0.5000 575 | 0.5000 576 | 0.5000 577 
| 0.7470 578 | 0.5000 579 | 0.5000 580 | 0.5000 581 | 0.7405 582 | 0.7555 583 | 0.7585 584 | 0.5000 585 | 0.5000 586 | 0.7410 587 | 0.5000 588 | 0.7450 589 | 0.7740 590 | 0.5000 591 | 0.7335 592 | 0.7390 593 | 0.5000 594 | 0.5000 595 | 0.7315 596 | 0.7445 597 | 0.7315 598 | 0.5000 599 | 0.7455 600 | 0.5000 601 | 0.7465 602 | 0.7540 603 | 0.7485 604 | 0.5000 605 | 0.7435 606 | 0.7560 607 | 0.7575 608 | 0.7620 609 | 0.7460 610 | 0.5000 611 | 0.5000 612 | 0.5000 613 | 0.7540 614 | 0.5000 615 | 0.7495 616 | 0.5000 617 | 0.5000 618 | 0.5000 619 | 0.7500 620 | 0.5000 621 | 0.5000 622 | 0.5000 623 | 0.7500 624 | 0.7650 625 | 0.5000 626 | 0.7525 627 | 0.4997 628 | 0.7565 629 | 0.7585 630 | 0.5000 631 | 0.5000 632 | 0.7560 633 | 0.7390 634 | 0.5000 635 | 0.5000 636 | 0.7480 637 | 0.5000 638 | 0.5000 639 | 0.5000 640 | 0.5000 641 | 0.7605 642 | 0.5000 643 | 0.7385 644 | 0.5000 645 | 0.7310 646 | 0.4997 647 | 0.5000 648 | 0.5000 649 | 0.7575 650 | 0.7490 651 | 0.7500 652 | 0.7405 653 | 0.5000 654 | 0.7455 655 | 0.5000 656 | 0.7385 657 | 0.7625 658 | 0.7675 659 | 0.5000 660 | 0.5000 661 | 0.5000 662 | 0.5000 663 | 0.5000 664 | 0.7570 665 | 0.7415 666 | 0.7435 667 | 0.7520 668 | 0.7310 669 | 0.7440 670 | 0.5000 671 | 0.7585 672 | 0.5000 673 | 0.4998 674 | 0.7565 675 | 0.5000 676 | 0.5000 677 | 0.7515 678 | 0.7380 679 | 0.5000 680 | 0.7370 681 | 0.7455 682 | 0.5000 683 | 0.7565 684 | 0.5000 685 | 0.7535 686 | 0.7335 687 | 0.7360 688 | 0.7505 689 | 0.5000 690 | 0.5000 691 | 0.7400 692 | 0.5000 693 | 0.7465 694 | 0.7485 695 | 0.5000 696 | 0.5000 697 | 0.7550 698 | 0.7480 699 | 0.7455 700 | 0.5000 701 | 0.7465 702 | 0.5000 703 | 0.7490 704 | 0.7540 705 | 0.7470 706 | 0.5000 707 | 0.5000 708 | 0.7560 709 | 0.7555 710 | 0.7440 711 | 0.5000 712 | 0.5000 713 | 0.7415 714 | 0.5000 715 | 0.7380 716 | 0.7455 717 | 0.7410 718 | 0.5000 719 | 0.7615 720 | 0.5000 721 | 0.7560 722 | 0.5000 723 | 0.7500 724 | 0.7550 725 | 0.7410 726 | 0.4853 727 | 0.7505 728 | 0.7425 729 | 0.5000 730 | 0.5000 
731 | 0.7695 732 | 0.7525 733 | 0.5000 734 | 0.7495 735 | 0.7460 736 | 0.5000 737 | 0.5000 738 | 0.4982 739 | 0.7435 740 | 0.7495 741 | 0.5000 742 | 0.5000 743 | 0.5000 744 | 0.5000 745 | 0.5000 746 | 0.7490 747 | 0.5000 748 | 0.5000 749 | 0.7460 750 | 0.5000 751 | 0.5000 752 | 0.5000 753 | 0.7560 754 | 0.5000 755 | 0.7560 756 | 0.7585 757 | 0.5000 758 | 0.7410 759 | 0.5000 760 | 0.7560 761 | 0.7710 762 | 0.7475 763 | 0.5000 764 | 0.7485 765 | 0.5000 766 | 0.7475 767 | 0.5000 768 | 0.5000 769 | 0.5000 770 | 0.5000 771 | 0.5000 772 | 0.7540 773 | 0.5000 774 | 0.5000 775 | 0.7510 776 | 0.7470 777 | 0.5000 778 | 0.7515 779 | 0.7670 780 | 0.4802 781 | 0.7425 782 | 0.5000 783 | 0.5000 784 | 0.7545 785 | 0.5000 786 | 0.7465 787 | 0.7465 788 | 0.7585 789 | -------------------------------------------------------------------------------- /datasets/email/GraphALabels_infection.txt: -------------------------------------------------------------------------------- 1 | 0.7385 2 | 0.7515 3 | 0.4703 4 | 0.4277 5 | 0.4277 6 | 0.7347 7 | 0.7282 8 | 0.4475 9 | 0.3707 10 | 0.4063 11 | 0.3737 12 | 0.4463 13 | 0.4443 14 | 0.4778 15 | 0.4918 16 | 0.3333 17 | 0.4448 18 | 0.7635 19 | 0.7490 20 | 0.4462 21 | 0.4785 22 | 0.4963 23 | 0.3321 24 | 0.4312 25 | 0.4046 26 | 0.3757 27 | 0.3317 28 | 0.3760 29 | 0.4032 30 | 0.4298 31 | 0.3333 32 | 0.3333 33 | 0.3328 34 | 0.3330 35 | 0.3725 36 | 0.3697 37 | 0.3750 38 | 0.3327 39 | 0.3333 40 | 0.3777 41 | 0.3743 42 | 0.4608 43 | 0.4935 44 | 0.3669 45 | 0.4453 46 | 0.3333 47 | 0.3730 48 | 0.3757 49 | 0.4077 50 | 0.3721 51 | 0.3322 52 | 0.4605 53 | 0.4693 54 | 0.4843 55 | 0.4382 56 | 0.4250 57 | 0.3760 58 | 0.4455 59 | 0.4770 60 | 0.4044 61 | 0.4823 62 | 0.4948 63 | 0.4912 64 | 0.4455 65 | 0.7485 66 | 0.7483 67 | 0.4480 68 | 0.4051 69 | 0.3745 70 | 0.4045 71 | 0.3329 72 | 0.3329 73 | 0.3287 74 | 0.7653 75 | 0.7580 76 | 0.3791 77 | 0.3726 78 | 0.4052 79 | 0.3772 80 | 0.4572 81 | 0.4332 82 | 0.4593 83 | 0.4978 84 | 0.4797 85 | 0.4792 86 | 0.4935 87 | 
0.4997 88 | 0.4892 89 | 0.6827 90 | 0.4678 91 | 0.4093 92 | 0.3752 93 | 0.4297 94 | 0.4058 95 | 0.4702 96 | 0.4610 97 | 0.4625 98 | 0.4047 99 | 0.4099 100 | 0.4232 101 | 0.4068 102 | 0.7330 103 | 0.4462 104 | 0.7457 105 | 0.4673 106 | 0.4865 107 | 0.4970 108 | 0.4918 109 | 0.3745 110 | 0.3207 111 | 0.3258 112 | 0.3201 113 | 0.3718 114 | 0.4303 115 | 0.4435 116 | 0.4720 117 | 0.4510 118 | 0.4075 119 | 0.3333 120 | 0.3302 121 | 0.7370 122 | 0.4960 123 | 0.4089 124 | 0.3758 125 | 0.4282 126 | 0.4467 127 | 0.3718 128 | 0.4275 129 | 0.4903 130 | 0.4783 131 | 0.4068 132 | 0.4247 133 | 0.4457 134 | 0.4933 135 | 0.4483 136 | 0.4075 137 | 0.3333 138 | 0.4457 139 | 0.3333 140 | 0.4300 141 | 0.4453 142 | 0.4717 143 | 0.4957 144 | 0.4120 145 | 0.4007 146 | 0.4282 147 | 0.7408 148 | 0.4690 149 | 0.6745 150 | 0.4613 151 | 0.4760 152 | 0.3333 153 | 0.4492 154 | 0.4293 155 | 0.4033 156 | 0.4832 157 | 0.4308 158 | 0.4930 159 | 0.4058 160 | 0.4298 161 | 0.4993 162 | 0.3333 163 | 0.4325 164 | 0.4623 165 | 0.3743 166 | 0.4463 167 | 0.7502 168 | 0.4302 169 | 0.4295 170 | 0.4700 171 | 0.4312 172 | 0.4042 173 | 0.4618 174 | 0.4620 175 | 0.4463 176 | 0.3298 177 | 0.4505 178 | 0.7503 179 | 0.7347 180 | 0.4302 181 | 0.4505 182 | 0.4885 183 | 0.4783 184 | 0.4865 185 | 0.4472 186 | 0.3762 187 | 0.4057 188 | 0.4883 189 | 0.4482 190 | 0.4852 191 | 0.3787 192 | 0.4772 193 | 0.3755 194 | 0.3769 195 | 0.4473 196 | 0.4057 197 | 0.4050 198 | 0.4462 199 | 0.4023 200 | 0.4617 201 | 0.4713 202 | 0.4815 203 | 0.3259 204 | 0.3747 205 | 0.4695 206 | 0.4063 207 | 0.4068 208 | 0.4048 209 | 0.3333 210 | 0.4702 211 | 0.4450 212 | 0.4942 213 | 0.4875 214 | 0.4058 215 | 0.4648 216 | 0.7492 217 | 0.3767 218 | 0.3330 219 | 0.7622 220 | 0.4807 221 | 0.4814 222 | 0.7493 223 | 0.7550 224 | 0.7510 225 | 0.4872 226 | 0.4965 227 | 0.7415 228 | 0.4711 229 | 0.4813 230 | 0.4300 231 | 0.4287 232 | 0.4745 233 | 0.4978 234 | 0.4632 235 | 0.4705 236 | 0.4061 237 | 0.4000 238 | 0.4284 239 | 0.6947 240 | 0.3750 241 | 0.3329 
242 | 0.4068 243 | 0.4280 244 | 0.4487 245 | 0.4070 246 | 0.3333 247 | 0.4016 248 | 0.3667 249 | 0.7500 250 | 0.4960 251 | 0.7198 252 | 0.3332 253 | 0.4757 254 | 0.4727 255 | 0.4627 256 | 0.4777 257 | 0.4568 258 | 0.4065 259 | 0.3748 260 | 0.4478 261 | 0.4263 262 | 0.3728 263 | 0.4265 264 | 0.3717 265 | 0.4315 266 | 0.4041 267 | 0.6940 268 | 0.4038 269 | 0.7458 270 | 0.4513 271 | 0.3332 272 | 0.4440 273 | 0.4283 274 | 0.3329 275 | 0.3333 276 | 0.4727 277 | 0.3768 278 | 0.3762 279 | 0.3763 280 | 0.3711 281 | 0.4860 282 | 0.4313 283 | 0.4965 284 | 0.7305 285 | 0.4627 286 | 0.4613 287 | 0.4107 288 | 0.3332 289 | 0.4092 290 | 0.3281 291 | 0.4578 292 | 0.4077 293 | 0.4038 294 | 0.3742 295 | 0.4046 296 | 0.4967 297 | 0.4825 298 | 0.7393 299 | 0.3329 300 | 0.4040 301 | 0.4765 302 | 0.4893 303 | 0.4467 304 | 0.4415 305 | 0.3755 306 | 0.4297 307 | 0.4253 308 | 0.4755 309 | 0.4822 310 | 0.7420 311 | 0.4895 312 | 0.4970 313 | 0.4868 314 | 0.7428 315 | 0.4950 316 | 0.4314 317 | 0.7572 318 | 0.4957 319 | 0.4220 320 | 0.3314 321 | 0.4888 322 | 0.4863 323 | 0.3319 324 | 0.4495 325 | 0.4067 326 | 0.4335 327 | 0.3757 328 | 0.4017 329 | 0.4088 330 | 0.4440 331 | 0.4611 332 | 0.4868 333 | 0.3709 334 | 0.4620 335 | 0.3745 336 | 0.3289 337 | 0.3739 338 | 0.4007 339 | 0.3333 340 | 0.3333 341 | 0.4847 342 | 0.4917 343 | 0.4477 344 | 0.3332 345 | 0.4492 346 | 0.3705 347 | 0.4060 348 | 0.4292 349 | 0.3054 350 | 0.4507 351 | 0.4072 352 | 0.4833 353 | 0.4462 354 | 0.4112 355 | 0.4048 356 | 0.4070 357 | 0.3752 358 | 0.4277 359 | 0.4718 360 | 0.3741 361 | 0.3740 362 | 0.3750 363 | 0.4767 364 | 0.4445 365 | 0.3322 366 | 0.4703 367 | 0.4930 368 | 0.4053 369 | 0.7260 370 | 0.3692 371 | 0.3728 372 | 0.4460 373 | 0.4272 374 | 0.4052 375 | 0.4069 376 | 0.4467 377 | 0.4898 378 | 0.7508 379 | 0.4502 380 | 0.4913 381 | 0.7398 382 | 0.4492 383 | 0.3626 384 | 0.3618 385 | 0.3758 386 | 0.4335 387 | 0.4312 388 | 0.4600 389 | 0.4053 390 | 0.4585 391 | 0.4887 392 | 0.4323 393 | 0.4468 394 | 0.4852 395 | 
0.4580 396 | 0.4653 397 | 0.4137 398 | 0.4695 399 | 0.3527 400 | 0.3764 401 | 0.4728 402 | 0.4037 403 | 0.4022 404 | 0.4097 405 | 0.4290 406 | 0.4863 407 | 0.3739 408 | 0.4066 409 | 0.4291 410 | 0.4447 411 | 0.3322 412 | 0.4837 413 | 0.4283 414 | 0.4278 415 | 0.2781 416 | 0.2770 417 | 0.2532 418 | 0.4038 419 | 0.3728 420 | 0.4828 421 | 0.4862 422 | 0.4293 423 | 0.4305 424 | 0.3730 425 | 0.4785 426 | 0.3737 427 | 0.4487 428 | 0.4093 429 | 0.3327 430 | 0.4046 431 | 0.4705 432 | 0.4060 433 | 0.3163 434 | 0.3329 435 | 0.4938 436 | 0.4059 437 | 0.2722 438 | 0.3718 439 | 0.4467 440 | 0.2713 441 | 0.4942 442 | 0.4478 443 | 0.3326 444 | 0.3718 445 | 0.3333 446 | 0.4268 447 | 0.4050 448 | 0.3752 449 | 0.3333 450 | 0.1134 451 | 0.4717 452 | 0.3284 453 | 0.3755 454 | 0.4285 455 | 0.3330 456 | 0.4048 457 | 0.2883 458 | 0.4064 459 | 0.4410 460 | 0.7632 461 | 0.4062 462 | 0.3308 463 | 0.3720 464 | 0.1360 465 | 0.4355 466 | 0.3804 467 | 0.4612 468 | 0.4457 469 | 0.4708 470 | 0.3723 471 | 0.3326 472 | 0.3330 473 | 0.3257 474 | 0.4683 475 | 0.4318 476 | 0.2470 477 | 0.3209 478 | 0.3245 479 | 0.3333 480 | 0.3298 481 | 0.4510 482 | 0.4052 483 | 0.4350 484 | 0.4498 485 | 0.3739 486 | 0.4495 487 | 0.4490 488 | 0.3699 489 | 0.3726 490 | 0.4295 491 | 0.4092 492 | 0.3712 493 | 0.4453 494 | 0.4602 495 | 0.4785 496 | 0.4970 497 | 0.4726 498 | 0.3754 499 | 0.7330 500 | 0.4012 501 | 0.4068 502 | 0.3722 503 | 0.3322 504 | 0.3694 505 | 0.3333 506 | 0.3301 507 | 0.4465 508 | 0.3125 509 | 0.3333 510 | 0.3333 511 | 0.3332 512 | 0.4468 513 | 0.3090 514 | 0.4793 515 | 0.3267 516 | 0.4497 517 | 0.4070 518 | 0.4057 519 | 0.4698 520 | 0.4315 521 | 0.4335 522 | 0.3328 523 | 0.2570 524 | 0.3744 525 | 0.1235 526 | 0.4480 527 | 0.4715 528 | 0.4344 529 | 0.4266 530 | 0.4082 531 | 0.3307 532 | 0.3763 533 | 0.3787 534 | 0.4915 535 | 0.3116 536 | 0.3283 537 | 0.3746 538 | 0.4945 539 | 0.3622 540 | 0.4055 541 | 0.3300 542 | 0.3775 543 | 0.3332 544 | 0.4063 545 | 0.3745 546 | 0.3331 547 | 0.4735 548 | 0.3325 549 
| 0.4482 550 | 0.4960 551 | 0.3333 552 | 0.3332 553 | 0.3742 554 | 0.3278 555 | 0.3287 556 | 0.3782 557 | 0.3301 558 | 0.4033 559 | 0.3211 560 | 0.4007 561 | 0.7543 562 | 0.1298 563 | 0.4695 564 | 0.3762 565 | 0.3333 566 | 0.4016 567 | 0.3317 568 | 0.2845 569 | 0.4100 570 | 0.3735 571 | 0.3265 572 | 0.3732 573 | 0.4290 574 | 0.4048 575 | 0.3478 576 | 0.4077 577 | 0.3784 578 | 0.4254 579 | 0.1536 580 | 0.3301 581 | 0.7222 582 | 0.3773 583 | 0.1253 584 | 0.2971 585 | 0.3306 586 | 0.3332 587 | 0.4037 588 | 0.3333 589 | 0.4320 590 | 0.3332 591 | 0.3315 592 | 0.3321 593 | 0.4047 594 | 0.3647 595 | 0.1536 596 | 0.3600 597 | 0.3332 598 | 0.3243 599 | 0.3317 600 | 0.3738 601 | 0.4682 602 | 0.4009 603 | 0.1109 604 | 0.3299 605 | 0.2284 606 | 0.1603 607 | 0.4080 608 | 0.4112 609 | 0.2940 610 | 0.4072 611 | 0.4470 612 | 0.4492 613 | 0.3300 614 | 0.2900 615 | 0.4635 616 | 0.4012 617 | 0.3049 618 | 0.3268 619 | 0.3655 620 | 0.2407 621 | 0.3741 622 | 0.2031 623 | 0.3748 624 | 0.3750 625 | 0.3199 626 | 0.1674 627 | 0.3331 628 | 0.3269 629 | 0.4118 630 | 0.2279 631 | 0.2357 632 | 0.2790 633 | 0.1217 634 | 0.3257 635 | 0.1617 636 | 0.2842 637 | 0.3332 638 | 0.3538 639 | 0.3711 640 | 0.4048 641 | 0.3333 642 | 0.3728 643 | 0.4442 644 | 0.3060 645 | 0.2881 646 | 0.3749 647 | 0.3302 648 | 0.1935 649 | 0.3332 650 | 0.3257 651 | 0.3818 652 | 0.3742 653 | 0.4477 654 | 0.1507 655 | 0.2097 656 | 0.3810 657 | 0.3763 658 | 0.3332 659 | 0.4044 660 | 0.4042 661 | 0.3762 662 | 0.3777 663 | 0.1623 664 | 0.3588 665 | 0.3765 666 | 0.3325 667 | 0.1680 668 | 0.3312 669 | 0.2775 670 | 0.2432 671 | 0.3322 672 | 0.3330 673 | 0.1943 674 | 0.3992 675 | 0.2711 676 | 0.3292 677 | 0.4052 678 | 0.3292 679 | 0.3281 680 | 0.1458 681 | 0.2136 682 | 0.4593 683 | 0.2298 684 | 0.3622 685 | 0.3302 686 | 0.4488 687 | 0.4840 688 | 0.3551 689 | 0.3327 690 | 0.3327 691 | 0.4067 692 | 0.1370 693 | 0.2421 694 | 0.1572 695 | 0.3301 696 | 0.3298 697 | 0.4083 698 | 0.3330 699 | 0.2582 700 | 0.3765 701 | 0.2795 702 | 0.2632 
703 | 0.3743 704 | 0.3169 705 | 0.2007 706 | 0.4065 707 | 0.3605 708 | 0.3747 709 | 0.4056 710 | 0.2832 711 | 0.2659 712 | 0.3221 713 | 0.2295 714 | 0.3301 715 | 0.4575 716 | 0.3055 717 | 0.3748 718 | 0.3276 719 | 0.4024 720 | 0.3748 721 | 0.7063 722 | 0.2961 723 | 0.3218 724 | 0.1487 725 | 0.4281 726 | 0.3245 727 | 0.3030 728 | 0.4157 729 | 0.3184 730 | 0.2669 731 | 0.4334 732 | 0.4075 733 | 0.4307 734 | 0.3123 735 | 0.1822 736 | 0.3703 737 | 0.3907 738 | 0.3316 739 | 0.3769 740 | 0.1600 741 | 0.4818 742 | 0.2695 743 | 0.4572 744 | 0.2928 745 | 0.2652 746 | 0.1614 747 | 0.1450 748 | 0.2031 749 | 0.4018 750 | 0.3763 751 | 0.2707 752 | 0.3238 753 | 0.3728 754 | 0.2812 755 | 0.1420 756 | 0.3719 757 | 0.1694 758 | 0.1539 759 | 0.2145 760 | 0.4097 761 | 0.3707 762 | 0.3154 763 | 0.3328 764 | 0.1693 765 | 0.2784 766 | 0.3219 767 | 0.3290 768 | 0.1644 769 | 0.2976 770 | 0.4062 771 | 0.1657 772 | 0.1480 773 | 0.3703 774 | 0.1338 775 | 0.3623 776 | 0.1593 777 | 0.3494 778 | 0.1863 779 | 0.1403 780 | 0.3053 781 | 0.3512 782 | 0.3672 783 | 0.3222 784 | 0.2389 785 | 0.3157 786 | 0.3322 787 | 0.4025 788 | 0.2920 789 | 0.3508 790 | 0.3626 791 | 0.4045 792 | 0.4298 793 | 0.3282 794 | 0.3728 795 | 0.3332 796 | 0.3181 797 | 0.4001 798 | 0.3318 799 | 0.3039 800 | 0.4052 801 | 0.2981 802 | 0.4698 803 | 0.2492 804 | 0.3330 805 | 0.3304 806 | 0.1658 807 | 0.2610 808 | 0.3765 809 | 0.1476 810 | 0.4058 811 | 0.2919 812 | 0.1407 813 | 0.1373 814 | 0.3331 815 | 0.3673 816 | 0.1629 817 | 0.3691 818 | 0.2527 819 | 0.1664 820 | 0.2127 821 | 0.2812 822 | 0.4268 823 | 0.3689 824 | 0.2488 825 | 0.2107 826 | 0.2697 827 | 0.3763 828 | 0.0907 829 | 0.3322 830 | 0.2887 831 | 0.3247 832 | 0.2399 833 | 0.3723 834 | 0.3268 835 | 0.1720 836 | 0.3291 837 | 0.3187 838 | 0.4303 839 | 0.3528 840 | 0.1511 841 | 0.4460 842 | 0.3320 843 | 0.1505 844 | 0.1290 845 | 0.1497 846 | 0.1537 847 | 0.3272 848 | 0.2858 849 | 0.3088 850 | 0.2868 851 | 0.3230 852 | 0.3125 853 | 0.1608 854 | 0.4735 855 | 0.2108 856 | 
0.3096 857 | 0.1551 858 | 0.1247 859 | 0.3314 860 | 0.2729 861 | 0.1474 862 | 0.4040 863 | 0.1595 864 | 0.3027 865 | 0.3220 866 | 0.1526 867 | 0.3149 868 | 0.3300 869 | 0.2664 870 | 0.1466 871 | 0.3035 872 | 0.1598 873 | 0.2260 874 | 0.3728 875 | 0.2224 876 | 0.1363 877 | 0.2745 878 | 0.3787 879 | 0.1302 880 | 0.2282 881 | 0.1566 882 | 0.3738 883 | 0.1526 884 | 0.2029 885 | 0.2726 886 | 0.1643 887 | 0.4793 888 | 0.3196 889 | 0.3737 890 | 0.3332 891 | 0.3317 892 | 0.2163 893 | 0.3356 894 | 0.3044 895 | 0.3743 896 | 0.3746 897 | 0.1263 898 | 0.1045 899 | 0.3332 900 | 0.4067 901 | 0.3257 902 | 0.2255 903 | 0.3679 904 | 0.4035 905 | 0.2466 906 | 0.3732 907 | 0.3302 908 | 0.3746 909 | 0.3246 910 | 0.1651 911 | 0.3151 912 | 0.3726 913 | 0.3967 914 | 0.4058 915 | 0.3238 916 | 0.3554 917 | 0.2924 918 | 0.3332 919 | 0.3645 920 | 0.1515 921 | 0.2760 922 | 0.2367 923 | 0.1758 924 | 0.2361 925 | 0.1773 926 | 0.1447 927 | 0.2697 928 | 0.1395 929 | 0.3735 930 | 0.1519 931 | 0.3776 932 | 0.4717 933 | 0.3205 934 | 0.3025 935 | 0.3256 936 | 0.3284 937 | 0.3255 938 | 0.3732 939 | 0.3247 940 | 0.2163 941 | 0.2073 942 | 0.1923 943 | 0.1989 944 | 0.1893 945 | 0.3713 946 | 0.3043 947 | 0.1255 948 | 0.2451 949 | 0.3478 950 | 0.2315 951 | 0.2242 952 | 0.3267 953 | 0.3332 954 | 0.1999 955 | 0.1202 956 | 0.3272 957 | 0.1235 958 | 0.3321 959 | 0.2343 960 | 0.2946 961 | 0.3994 962 | 0.4003 963 | 0.3314 964 | 0.1603 965 | 0.2743 966 | 0.2745 967 | 0.1663 968 | 0.2649 969 | 0.3287 970 | 0.2313 971 | 0.2193 972 | 0.4044 973 | 0.2916 974 | 0.1621 975 | 0.1520 976 | 0.1930 977 | 0.1067 978 | 0.1440 979 | 0.2503 980 | 0.2116 981 | 0.1442 982 | 0.3649 983 | 0.3667 984 | 0.2197 985 | 0.1377 986 | 0.1562 987 | -------------------------------------------------------------------------------- /datasets/email/email-Eu-core-department-labels.txt: -------------------------------------------------------------------------------- 1 | 0 1 2 | 1 1 3 | 2 21 4 | 3 21 5 | 4 21 6 | 5 25 7 | 6 25 8 | 7 14 9 | 8 14 10 
| 9 14 11 | 10 9 12 | 11 14 13 | 12 14 14 | 13 26 15 | 14 4 16 | 15 17 17 | 16 34 18 | 17 1 19 | 18 1 20 | 19 14 21 | 20 9 22 | 21 9 23 | 22 9 24 | 23 11 25 | 24 11 26 | 25 11 27 | 26 11 28 | 27 11 29 | 28 11 30 | 29 11 31 | 30 11 32 | 31 11 33 | 32 11 34 | 33 11 35 | 34 11 36 | 35 11 37 | 36 11 38 | 37 11 39 | 38 11 40 | 39 11 41 | 40 11 42 | 41 5 43 | 42 34 44 | 43 14 45 | 44 14 46 | 45 17 47 | 46 17 48 | 47 10 49 | 48 10 50 | 49 36 51 | 50 37 52 | 51 5 53 | 52 7 54 | 53 4 55 | 54 22 56 | 55 22 57 | 56 21 58 | 57 21 59 | 58 21 60 | 59 21 61 | 60 7 62 | 61 7 63 | 62 36 64 | 63 21 65 | 64 25 66 | 65 4 67 | 66 8 68 | 67 15 69 | 68 15 70 | 69 15 71 | 70 37 72 | 71 37 73 | 72 9 74 | 73 1 75 | 74 1 76 | 75 10 77 | 76 10 78 | 77 3 79 | 78 3 80 | 79 3 81 | 80 29 82 | 81 15 83 | 82 36 84 | 83 36 85 | 84 37 86 | 85 1 87 | 86 36 88 | 87 34 89 | 88 20 90 | 89 20 91 | 90 8 92 | 91 15 93 | 92 9 94 | 93 4 95 | 94 5 96 | 95 4 97 | 96 20 98 | 97 16 99 | 98 16 100 | 99 16 101 | 100 16 102 | 101 16 103 | 102 38 104 | 103 7 105 | 104 7 106 | 105 34 107 | 106 38 108 | 107 36 109 | 108 8 110 | 109 27 111 | 110 8 112 | 111 8 113 | 112 8 114 | 113 10 115 | 114 10 116 | 115 13 117 | 116 13 118 | 117 6 119 | 118 26 120 | 119 10 121 | 120 1 122 | 121 36 123 | 122 0 124 | 123 13 125 | 124 16 126 | 125 16 127 | 126 22 128 | 127 6 129 | 128 5 130 | 129 4 131 | 130 0 132 | 131 28 133 | 132 28 134 | 133 4 135 | 134 2 136 | 135 13 137 | 136 13 138 | 137 21 139 | 138 21 140 | 139 17 141 | 140 17 142 | 141 14 143 | 142 36 144 | 143 8 145 | 144 40 146 | 145 35 147 | 146 15 148 | 147 23 149 | 148 0 150 | 149 0 151 | 150 7 152 | 151 10 153 | 152 37 154 | 153 27 155 | 154 35 156 | 155 35 157 | 156 0 158 | 157 0 159 | 158 19 160 | 159 19 161 | 160 36 162 | 161 14 163 | 162 37 164 | 163 24 165 | 164 17 166 | 165 13 167 | 166 36 168 | 167 4 169 | 168 4 170 | 169 13 171 | 170 13 172 | 171 10 173 | 172 4 174 | 173 38 175 | 174 32 176 | 175 32 177 | 176 4 178 | 177 1 179 | 178 0 180 | 179 0 181 | 180 0 182 
| 181 7 183 | 182 7 184 | 183 4 185 | 184 15 186 | 185 16 187 | 186 40 188 | 187 15 189 | 188 15 190 | 189 15 191 | 190 15 192 | 191 0 193 | 192 21 194 | 193 21 195 | 194 21 196 | 195 21 197 | 196 5 198 | 197 4 199 | 198 4 200 | 199 4 201 | 200 4 202 | 201 4 203 | 202 4 204 | 203 4 205 | 204 5 206 | 205 5 207 | 206 4 208 | 207 4 209 | 208 22 210 | 209 19 211 | 210 19 212 | 211 22 213 | 212 34 214 | 213 14 215 | 214 0 216 | 215 1 217 | 216 17 218 | 217 37 219 | 218 1 220 | 219 1 221 | 220 1 222 | 221 1 223 | 222 1 224 | 223 1 225 | 224 1 226 | 225 1 227 | 226 1 228 | 227 1 229 | 228 1 230 | 229 10 231 | 230 23 232 | 231 0 233 | 232 4 234 | 233 19 235 | 234 19 236 | 235 19 237 | 236 19 238 | 237 19 239 | 238 19 240 | 239 19 241 | 240 19 242 | 241 19 243 | 242 19 244 | 243 19 245 | 244 19 246 | 245 10 247 | 246 14 248 | 247 14 249 | 248 1 250 | 249 14 251 | 250 7 252 | 251 13 253 | 252 20 254 | 253 31 255 | 254 40 256 | 255 6 257 | 256 4 258 | 257 0 259 | 258 8 260 | 259 9 261 | 260 9 262 | 261 10 263 | 262 0 264 | 263 10 265 | 264 14 266 | 265 14 267 | 266 14 268 | 267 14 269 | 268 39 270 | 269 17 271 | 270 4 272 | 271 28 273 | 272 17 274 | 273 17 275 | 274 17 276 | 275 4 277 | 276 4 278 | 277 0 279 | 278 0 280 | 279 23 281 | 280 4 282 | 281 21 283 | 282 36 284 | 283 36 285 | 284 0 286 | 285 22 287 | 286 21 288 | 287 15 289 | 288 37 290 | 289 0 291 | 290 4 292 | 291 4 293 | 292 4 294 | 293 14 295 | 294 4 296 | 295 7 297 | 296 7 298 | 297 1 299 | 298 15 300 | 299 15 301 | 300 38 302 | 301 26 303 | 302 20 304 | 303 20 305 | 304 20 306 | 305 21 307 | 306 9 308 | 307 1 309 | 308 1 310 | 309 1 311 | 310 1 312 | 311 1 313 | 312 1 314 | 313 1 315 | 314 1 316 | 315 1 317 | 316 1 318 | 317 1 319 | 318 10 320 | 319 19 321 | 320 7 322 | 321 7 323 | 322 17 324 | 323 16 325 | 324 14 326 | 325 9 327 | 326 9 328 | 327 9 329 | 328 8 330 | 329 8 331 | 330 13 332 | 331 39 333 | 332 14 334 | 333 10 335 | 334 17 336 | 335 17 337 | 336 13 338 | 337 13 339 | 338 13 340 | 339 13 341 | 340 
2 342 | 341 1 343 | 342 0 344 | 343 0 345 | 344 0 346 | 345 0 347 | 346 0 348 | 347 0 349 | 348 0 350 | 349 0 351 | 350 0 352 | 351 0 353 | 352 0 354 | 353 16 355 | 354 16 356 | 355 27 357 | 356 8 358 | 357 8 359 | 358 14 360 | 359 14 361 | 360 14 362 | 361 10 363 | 362 14 364 | 363 35 365 | 364 37 366 | 365 14 367 | 366 36 368 | 367 10 369 | 368 7 370 | 369 20 371 | 370 10 372 | 371 16 373 | 372 36 374 | 373 36 375 | 374 14 376 | 375 8 377 | 376 7 378 | 377 7 379 | 378 7 380 | 379 7 381 | 380 7 382 | 381 7 383 | 382 7 384 | 383 7 385 | 384 7 386 | 385 7 387 | 386 7 388 | 387 7 389 | 388 7 390 | 389 7 391 | 390 7 392 | 391 7 393 | 392 7 394 | 393 7 395 | 394 7 396 | 395 7 397 | 396 7 398 | 397 7 399 | 398 7 400 | 399 4 401 | 400 9 402 | 401 4 403 | 402 0 404 | 403 4 405 | 404 16 406 | 405 38 407 | 406 14 408 | 407 14 409 | 408 21 410 | 409 26 411 | 410 27 412 | 411 28 413 | 412 21 414 | 413 4 415 | 414 1 416 | 415 1 417 | 416 9 418 | 417 10 419 | 418 15 420 | 419 4 421 | 420 26 422 | 421 14 423 | 422 35 424 | 423 10 425 | 424 34 426 | 425 4 427 | 426 4 428 | 427 12 429 | 428 17 430 | 429 17 431 | 430 14 432 | 431 37 433 | 432 37 434 | 433 37 435 | 434 34 436 | 435 6 437 | 436 13 438 | 437 13 439 | 438 13 440 | 439 13 441 | 440 4 442 | 441 14 443 | 442 10 444 | 443 10 445 | 444 10 446 | 445 3 447 | 446 17 448 | 447 17 449 | 448 17 450 | 449 1 451 | 450 4 452 | 451 14 453 | 452 14 454 | 453 6 455 | 454 27 456 | 455 22 457 | 456 21 458 | 457 4 459 | 458 4 460 | 459 1 461 | 460 34 462 | 461 17 463 | 462 30 464 | 463 30 465 | 464 4 466 | 465 23 467 | 466 14 468 | 467 15 469 | 468 1 470 | 469 22 471 | 470 12 472 | 471 31 473 | 472 6 474 | 473 15 475 | 474 15 476 | 475 8 477 | 476 15 478 | 477 8 479 | 478 8 480 | 479 1 481 | 480 15 482 | 481 22 483 | 482 2 484 | 483 3 485 | 484 4 486 | 485 10 487 | 486 4 488 | 487 14 489 | 488 14 490 | 489 25 491 | 490 6 492 | 491 6 493 | 492 40 494 | 493 4 495 | 494 36 496 | 495 23 497 | 496 14 498 | 497 3 499 | 498 14 500 | 499 14 501 | 
500 14 502 | 501 14 503 | 502 14 504 | 503 14 505 | 504 14 506 | 505 14 507 | 506 14 508 | 507 31 509 | 508 15 510 | 509 15 511 | 510 14 512 | 511 0 513 | 512 23 514 | 513 35 515 | 514 8 516 | 515 4 517 | 516 1 518 | 517 1 519 | 518 35 520 | 519 23 521 | 520 21 522 | 521 2 523 | 522 4 524 | 523 4 525 | 524 9 526 | 525 14 527 | 526 4 528 | 527 10 529 | 528 25 530 | 529 14 531 | 530 14 532 | 531 3 533 | 532 21 534 | 533 35 535 | 534 4 536 | 535 9 537 | 536 15 538 | 537 6 539 | 538 9 540 | 539 3 541 | 540 15 542 | 541 23 543 | 542 4 544 | 543 4 545 | 544 4 546 | 545 11 547 | 546 35 548 | 547 10 549 | 548 6 550 | 549 15 551 | 550 15 552 | 551 15 553 | 552 22 554 | 553 2 555 | 554 2 556 | 555 14 557 | 556 4 558 | 557 3 559 | 558 14 560 | 559 27 561 | 560 31 562 | 561 34 563 | 562 4 564 | 563 4 565 | 564 19 566 | 565 14 567 | 566 14 568 | 567 4 569 | 568 4 570 | 569 14 571 | 570 14 572 | 571 21 573 | 572 4 574 | 573 14 575 | 574 4 576 | 575 0 577 | 576 4 578 | 577 27 579 | 578 27 580 | 579 17 581 | 580 16 582 | 581 3 583 | 582 15 584 | 583 2 585 | 584 4 586 | 585 4 587 | 586 21 588 | 587 21 589 | 588 11 590 | 589 23 591 | 590 11 592 | 591 23 593 | 592 17 594 | 593 5 595 | 594 36 596 | 595 15 597 | 596 23 598 | 597 23 599 | 598 2 600 | 599 19 601 | 600 4 602 | 601 36 603 | 602 14 604 | 603 1 605 | 604 22 606 | 605 1 607 | 606 21 608 | 607 34 609 | 608 14 610 | 609 13 611 | 610 6 612 | 611 4 613 | 612 37 614 | 613 6 615 | 614 24 616 | 615 35 617 | 616 6 618 | 617 17 619 | 618 16 620 | 619 6 621 | 620 4 622 | 621 0 623 | 622 21 624 | 623 4 625 | 624 26 626 | 625 21 627 | 626 4 628 | 627 15 629 | 628 7 630 | 629 1 631 | 630 20 632 | 631 19 633 | 632 7 634 | 633 21 635 | 634 21 636 | 635 21 637 | 636 21 638 | 637 19 639 | 638 38 640 | 639 19 641 | 640 16 642 | 641 23 643 | 642 6 644 | 643 37 645 | 644 25 646 | 645 1 647 | 646 22 648 | 647 6 649 | 648 21 650 | 649 14 651 | 650 1 652 | 651 26 653 | 652 8 654 | 653 7 655 | 654 37 656 | 655 4 657 | 656 0 658 | 657 17 659 | 658 14 
660 | 659 6 661 | 660 17 662 | 661 14 663 | 662 16 664 | 663 15 665 | 664 4 666 | 665 32 667 | 666 14 668 | 667 15 669 | 668 0 670 | 669 23 671 | 670 21 672 | 671 29 673 | 672 14 674 | 673 23 675 | 674 14 676 | 675 1 677 | 676 17 678 | 677 26 679 | 678 15 680 | 679 29 681 | 680 0 682 | 681 0 683 | 682 0 684 | 683 22 685 | 684 34 686 | 685 21 687 | 686 6 688 | 687 16 689 | 688 4 690 | 689 4 691 | 690 15 692 | 691 21 693 | 692 0 694 | 693 36 695 | 694 4 696 | 695 23 697 | 696 1 698 | 697 1 699 | 698 22 700 | 699 14 701 | 700 14 702 | 701 30 703 | 702 4 704 | 703 9 705 | 704 10 706 | 705 4 707 | 706 4 708 | 707 14 709 | 708 16 710 | 709 16 711 | 710 15 712 | 711 21 713 | 712 0 714 | 713 15 715 | 714 4 716 | 715 15 717 | 716 29 718 | 717 24 719 | 718 21 720 | 719 7 721 | 720 14 722 | 721 11 723 | 722 11 724 | 723 9 725 | 724 13 726 | 725 10 727 | 726 31 728 | 727 4 729 | 728 22 730 | 729 14 731 | 730 23 732 | 731 1 733 | 732 4 734 | 733 9 735 | 734 1 736 | 735 17 737 | 736 27 738 | 737 28 739 | 738 22 740 | 739 14 741 | 740 20 742 | 741 7 743 | 742 23 744 | 743 1 745 | 744 4 746 | 745 6 747 | 746 15 748 | 747 15 749 | 748 23 750 | 749 4 751 | 750 20 752 | 751 5 753 | 752 36 754 | 753 10 755 | 754 14 756 | 755 21 757 | 756 39 758 | 757 10 759 | 758 41 760 | 759 31 761 | 760 17 762 | 761 7 763 | 762 21 764 | 763 34 765 | 764 1 766 | 765 14 767 | 766 2 768 | 767 18 769 | 768 16 770 | 769 27 771 | 770 16 772 | 771 38 773 | 772 7 774 | 773 38 775 | 774 21 776 | 775 1 777 | 776 5 778 | 777 9 779 | 778 15 780 | 779 15 781 | 780 15 782 | 781 0 783 | 782 6 784 | 783 23 785 | 784 28 786 | 785 11 787 | 786 23 788 | 787 34 789 | 788 24 790 | 789 4 791 | 790 4 792 | 791 4 793 | 792 24 794 | 793 23 795 | 794 17 796 | 795 10 797 | 796 17 798 | 797 1 799 | 798 1 800 | 799 15 801 | 800 15 802 | 801 4 803 | 802 4 804 | 803 21 805 | 804 14 806 | 805 14 807 | 806 20 808 | 807 28 809 | 808 20 810 | 809 22 811 | 810 26 812 | 811 3 813 | 812 32 814 | 813 4 815 | 814 0 816 | 815 21 817 | 816 
13 818 | 817 4 819 | 818 15 820 | 819 17 821 | 820 5 822 | 821 24 823 | 822 4 824 | 823 14 825 | 824 0 826 | 825 9 827 | 826 21 828 | 827 14 829 | 828 38 830 | 829 4 831 | 830 14 832 | 831 31 833 | 832 21 834 | 833 14 835 | 834 6 836 | 835 4 837 | 836 4 838 | 837 6 839 | 838 17 840 | 839 0 841 | 840 4 842 | 841 7 843 | 842 16 844 | 843 4 845 | 844 4 846 | 845 21 847 | 846 1 848 | 847 10 849 | 848 3 850 | 849 21 851 | 850 4 852 | 851 0 853 | 852 1 854 | 853 7 855 | 854 17 856 | 855 15 857 | 856 14 858 | 857 0 859 | 858 9 860 | 859 32 861 | 860 13 862 | 861 5 863 | 862 2 864 | 863 21 865 | 864 28 866 | 865 21 867 | 866 22 868 | 867 22 869 | 868 7 870 | 869 7 871 | 870 33 872 | 871 0 873 | 872 1 874 | 873 15 875 | 874 4 876 | 875 31 877 | 876 30 878 | 877 15 879 | 878 11 880 | 879 19 881 | 880 21 882 | 881 9 883 | 882 21 884 | 883 13 885 | 884 21 886 | 885 9 887 | 886 32 888 | 887 9 889 | 888 32 890 | 889 38 891 | 890 9 892 | 891 38 893 | 892 38 894 | 893 14 895 | 894 9 896 | 895 10 897 | 896 38 898 | 897 10 899 | 898 22 900 | 899 21 901 | 900 13 902 | 901 21 903 | 902 4 904 | 903 0 905 | 904 1 906 | 905 1 907 | 906 23 908 | 907 0 909 | 908 5 910 | 909 4 911 | 910 4 912 | 911 15 913 | 912 14 914 | 913 14 915 | 914 13 916 | 915 11 917 | 916 1 918 | 917 5 919 | 918 5 920 | 919 10 921 | 920 23 922 | 921 21 923 | 922 14 924 | 923 9 925 | 924 20 926 | 925 10 927 | 926 19 928 | 927 19 929 | 928 21 930 | 929 17 931 | 930 19 932 | 931 19 933 | 932 36 934 | 933 17 935 | 934 35 936 | 935 16 937 | 936 4 938 | 937 16 939 | 938 4 940 | 939 6 941 | 940 4 942 | 941 41 943 | 942 6 944 | 943 7 945 | 944 23 946 | 945 9 947 | 946 23 948 | 947 7 949 | 948 6 950 | 949 22 951 | 950 36 952 | 951 14 953 | 952 15 954 | 953 11 955 | 954 35 956 | 955 5 957 | 956 14 958 | 957 14 959 | 958 15 960 | 959 4 961 | 960 6 962 | 961 4 963 | 962 9 964 | 963 19 965 | 964 11 966 | 965 4 967 | 966 29 968 | 967 14 969 | 968 15 970 | 969 15 971 | 970 5 972 | 971 32 973 | 972 15 974 | 973 14 975 | 974 5 976 | 
975 9 977 | 976 10 978 | 977 19 979 | 978 13 980 | 979 23 981 | 980 12 982 | 981 10 983 | 982 21 984 | 983 10 985 | 984 35 986 | 985 7 987 | 986 22 988 | 987 22 989 | 988 22 990 | 989 8 991 | 990 21 992 | 991 32 993 | 992 4 994 | 993 21 995 | 994 21 996 | 995 6 997 | 996 14 998 | 997 11 999 | 998 14 1000 | 999 15 1001 | 1000 4 1002 | 1001 21 1003 | 1002 1 1004 | 1003 6 1005 | 1004 22 1006 | -------------------------------------------------------------------------------- /datasets/disease/Roles.txt: -------------------------------------------------------------------------------- 1 | 4 student 2 | 476 student 3 | 481 student 4 | 436 staff 5 | 590 student 6 | 484 student 7 | 615 teacher 8 | 482 staff 9 | 361 student 10 | 325 student 11 | 511 student 12 | 352 teacher 13 | 66 student 14 | 182 staff 15 | 500 student 16 | 622 staff 17 | 324 student 18 | 353 staff 19 | 92 staff 20 | 202 student 21 | 319 student 22 | 681 staff 23 | 643 staff 24 | 58 staff 25 | 504 student 26 | 670 student 27 | 761 student 28 | 515 student 29 | 785 student 30 | 728 student 31 | 364 student 32 | 274 staff 33 | 457 student 34 | 0 staff 35 | 734 student 36 | 571 student 37 | 419 student 38 | 534 student 39 | 54 student 40 | 632 student 41 | 171 student 42 | 128 staff 43 | 139 student 44 | 240 student 45 | 447 staff 46 | 90 teacher 47 | 331 student 48 | 770 student 49 | 716 student 50 | 491 teacher 51 | 524 staff 52 | 505 staff 53 | 674 staff 54 | 459 student 55 | 439 student 56 | 62 teacher 57 | 275 student 58 | 341 student 59 | 597 student 60 | 201 staff 61 | 36 student 62 | 591 student 63 | 263 student 64 | 350 student 65 | 469 student 66 | 512 student 67 | 741 teacher 68 | 705 staff 69 | 348 staff 70 | 743 student 71 | 727 staff 72 | 1 teacher 73 | 668 student 74 | 314 student 75 | 686 student 76 | 566 teacher 77 | 0 staff 78 | 714 staff 79 | 747 student 80 | 116 student 81 | 549 student 82 | 181 student 83 | 315 student 84 | 0 staff 85 | 307 student 86 | 641 student 87 | 694 student 88 | 
628 staff 89 | 380 student 90 | 677 student 91 | 648 student 92 | 762 student 93 | 698 teacher 94 | 543 student 95 | 126 student 96 | 388 student 97 | 342 student 98 | 774 student 99 | 226 student 100 | 441 student 101 | 583 student 102 | 294 student 103 | 692 student 104 | 383 student 105 | 582 student 106 | 690 student 107 | 2 student 108 | 335 student 109 | 125 student 110 | 293 student 111 | 193 student 112 | 555 student 113 | 378 student 114 | 295 student 115 | 685 student 116 | 156 student 117 | 300 student 118 | 730 student 119 | 336 student 120 | 200 student 121 | 396 student 122 | 23 student 123 | 667 student 124 | 194 student 125 | 328 teacher 126 | 486 student 127 | 639 student 128 | 721 teacher 129 | 68 other 130 | 463 student 131 | 696 student 132 | 73 student 133 | 411 student 134 | 523 student 135 | 385 student 136 | 618 student 137 | 256 student 138 | 408 student 139 | 209 student 140 | 547 student 141 | 610 student 142 | 238 student 143 | 191 student 144 | 729 student 145 | 103 student 146 | 780 student 147 | 360 student 148 | 468 student 149 | 101 student 150 | 179 student 151 | 742 teacher 152 | 0 student 153 | 609 student 154 | 349 student 155 | 277 student 156 | 112 student 157 | 538 student 158 | 703 student 159 | 452 student 160 | 758 student 161 | 487 student 162 | 168 student 163 | 607 student 164 | 493 student 165 | 333 student 166 | 498 student 167 | 702 student 168 | 435 student 169 | 570 teacher 170 | 625 student 171 | 529 student 172 | 650 student 173 | 569 student 174 | 106 teacher 175 | 264 student 176 | 258 teacher 177 | 678 student 178 | 157 student 179 | 262 student 180 | 458 student 181 | 163 teacher 182 | 20 student 183 | 710 student 184 | 389 teacher 185 | 409 student 186 | 783 student 187 | 427 student 188 | 584 student 189 | 725 student 190 | 190 student 191 | 205 student 192 | 432 student 193 | 773 student 194 | 412 student 195 | 7 student 196 | 518 student 197 | 606 student 198 | 227 student 199 | 269 student 200 | 192 
student 201 | 497 student 202 | 697 teacher 203 | 414 student 204 | 214 student 205 | 198 student 206 | 454 student 207 | 159 student 208 | 309 student 209 | 552 student 210 | 724 student 211 | 688 student 212 | 475 student 213 | 422 student 214 | 246 student 215 | 765 student 216 | 18 student 217 | 312 student 218 | 580 teacher 219 | 6 student 220 | 781 student 221 | 704 student 222 | 522 student 223 | 689 student 224 | 608 student 225 | 232 student 226 | 557 student 227 | 22 student 228 | 574 student 229 | 10 student 230 | 540 teacher 231 | 49 student 232 | 653 student 233 | 268 student 234 | 423 student 235 | 0 teacher 236 | 620 teacher 237 | 186 student 238 | 343 student 239 | 285 student 240 | 241 student 241 | 489 student 242 | 756 student 243 | 665 student 244 | 445 student 245 | 626 student 246 | 740 student 247 | 330 student 248 | 509 student 249 | 784 teacher 250 | 308 student 251 | 502 student 252 | 561 student 253 | 567 student 254 | 637 student 255 | 751 student 256 | 530 student 257 | 158 student 258 | 102 student 259 | 74 student 260 | 152 student 261 | 42 student 262 | 528 student 263 | 228 student 264 | 149 student 265 | 260 student 266 | 559 student 267 | 550 other 268 | 132 student 269 | 114 student 270 | 757 student 271 | 255 student 272 | 748 student 273 | 494 student 274 | 413 student 275 | 391 student 276 | 27 student 277 | 122 student 278 | 153 student 279 | 141 student 280 | 373 student 281 | 737 student 282 | 0 student 283 | 322 student 284 | 86 student 285 | 365 student 286 | 213 student 287 | 207 student 288 | 462 student 289 | 700 student 290 | 617 student 291 | 372 student 292 | 224 student 293 | 576 teacher 294 | 485 student 295 | 446 student 296 | 554 student 297 | 771 student 298 | 671 student 299 | 143 student 300 | 358 student 301 | 129 student 302 | 786 teacher 303 | 483 student 304 | 124 student 305 | 254 student 306 | 63 student 307 | 67 student 308 | 589 student 309 | 401 student 310 | 568 student 311 | 619 student 312 | 669 
student 313 | 564 student 314 | 510 student 315 | 595 student 316 | 245 student 317 | 746 student 318 | 320 student 319 | 81 student 320 | 170 student 321 | 11 teacher 322 | 197 student 323 | 745 student 324 | 397 student 325 | 542 student 326 | 217 student 327 | 766 teacher 328 | 657 student 329 | 659 student 330 | 283 student 331 | 70 student 332 | 506 student 333 | 456 teacher 334 | 699 student 335 | 514 student 336 | 88 student 337 | 59 student 338 | 177 student 339 | 272 student 340 | 662 student 341 | 166 student 342 | 21 student 343 | 709 student 344 | 265 student 345 | 50 student 346 | 46 student 347 | 52 student 348 | 624 teacher 349 | 286 student 350 | 14 teacher 351 | 261 student 352 | 119 student 353 | 374 student 354 | 79 student 355 | 38 student 356 | 104 student 357 | 248 student 358 | 252 student 359 | 627 student 360 | 526 teacher 361 | 169 student 362 | 337 student 363 | 425 student 364 | 0 student 365 | 173 student 366 | 19 teacher 367 | 517 student 368 | 250 student 369 | 57 student 370 | 196 student 371 | 48 student 372 | 281 student 373 | 602 student 374 | 713 student 375 | 525 student 376 | 85 student 377 | 28 student 378 | 174 student 379 | 215 student 380 | 553 student 381 | 244 student 382 | 37 student 383 | 273 student 384 | 630 student 385 | 237 student 386 | 633 student 387 | 203 student 388 | 777 student 389 | 145 student 390 | 291 student 391 | 111 student 392 | 467 student 393 | 684 student 394 | 236 student 395 | 769 student 396 | 25 student 397 | 433 teacher 398 | 270 student 399 | 586 student 400 | 474 student 401 | 501 student 402 | 161 student 403 | 105 student 404 | 243 student 405 | 621 student 406 | 148 student 407 | 45 student 408 | 588 student 409 | 775 student 410 | 437 student 411 | 565 student 412 | 466 student 413 | 109 student 414 | 115 teacher 415 | 280 student 416 | 183 student 417 | 32 student 418 | 292 student 419 | 347 teacher 420 | 603 student 421 | 318 student 422 | 465 student 423 | 362 student 424 | 453 
student 425 | 578 student 426 | 135 teacher 427 | 78 student 428 | 760 student 429 | 712 student 430 | 537 student 431 | 539 student 432 | 551 student 433 | 778 student 434 | 76 student 435 | 631 student 436 | 118 student 437 | 326 student 438 | 233 student 439 | 278 student 440 | 346 teacher 441 | 339 student 442 | 142 student 443 | 223 student 444 | 178 student 445 | 55 student 446 | 403 student 447 | 629 student 448 | 726 student 449 | 302 student 450 | 471 student 451 | 536 student 452 | 531 student 453 | 579 student 454 | 53 staff 455 | 701 student 456 | 392 student 457 | 558 student 458 | 451 student 459 | 605 student 460 | 720 student 461 | 276 student 462 | 442 student 463 | 772 student 464 | 398 student 465 | 296 teacher 466 | 495 student 467 | 415 student 468 | 64 student 469 | 354 student 470 | 75 student 471 | 593 student 472 | 407 student 473 | 147 student 474 | 499 student 475 | 40 student 476 | 548 student 477 | 652 student 478 | 160 student 479 | 719 student 480 | 299 student 481 | 654 student 482 | 316 staff 483 | 744 staff 484 | 527 student 485 | 282 teacher 486 | 30 student 487 | 110 student 488 | 93 student 489 | 313 student 490 | 323 teacher 491 | 480 student 492 | 601 student 493 | 29 student 494 | 676 student 495 | 711 teacher 496 | 779 student 497 | 43 student 498 | 386 student 499 | 416 student 500 | 573 student 501 | 572 student 502 | 635 teacher 503 | 184 student 504 | 440 student 505 | 239 student 506 | 600 student 507 | 288 student 508 | 195 student 509 | 154 student 510 | 767 student 511 | 251 student 512 | 393 student 513 | 379 student 514 | 321 student 515 | 212 student 516 | 754 student 517 | 464 student 518 | 455 student 519 | 604 student 520 | 107 student 521 | 560 student 522 | 151 student 523 | 150 student 524 | 516 student 525 | 663 student 526 | 520 student 527 | 562 student 528 | 297 student 529 | 738 teacher 530 | 83 student 531 | 400 student 532 | 345 student 533 | 490 student 534 | 359 student 535 | 329 student 536 | 355 
student 537 | 764 student 538 | 763 student 539 | 242 student 540 | 636 teacher 541 | 647 student 542 | 594 student 543 | 15 student 544 | 84 student 545 | 428 student 546 | 488 student 547 | 12 student 548 | 421 student 549 | 418 student 550 | 41 teacher 551 | 533 student 552 | 448 student 553 | 394 student 554 | 733 student 555 | 735 student 556 | 133 student 557 | 130 student 558 | 598 student 559 | 695 student 560 | 390 student 561 | 444 student 562 | 750 student 563 | 384 student 564 | 366 student 565 | 776 student 566 | 301 student 567 | 789 student 568 | 89 student 569 | 577 student 570 | 317 student 571 | 675 student 572 | 478 teacher 573 | 660 student 574 | 231 student 575 | 229 student 576 | 72 student 577 | 138 student 578 | 519 teacher 579 | 216 student 580 | 424 student 581 | 782 student 582 | 715 student 583 | 131 student 584 | 344 student 585 | 496 student 586 | 420 student 587 | 87 student 588 | 123 student 589 | 230 teacher 590 | 35 student 591 | 144 student 592 | 717 student 593 | 284 student 594 | 249 student 595 | 222 teacher 596 | 279 student 597 | 306 student 598 | 253 student 599 | 206 student 600 | 210 student 601 | 351 student 602 | 753 teacher 603 | 638 student 604 | 535 student 605 | 732 student 606 | 661 student 607 | 472 teacher 608 | 17 student 609 | 644 student 610 | 94 student 611 | 167 student 612 | 310 student 613 | 0 student 614 | 434 teacher 615 | 640 teacher 616 | 507 student 617 | 691 student 618 | 541 student 619 | 247 student 620 | 77 student 621 | 338 student 622 | 381 student 623 | 0 student 624 | 382 student 625 | 655 student 626 | 303 student 627 | 503 student 628 | 370 student 629 | 199 student 630 | 136 student 631 | 682 student 632 | 39 teacher 633 | 683 student 634 | 121 student 635 | 80 student 636 | 581 student 637 | 402 student 638 | 9 student 639 | 31 student 640 | 612 student 641 | 363 student 642 | 266 student 643 | 24 student 644 | 646 student 645 | 96 staff 646 | 259 student 647 | 672 student 648 | 369 student 
649 | 137 student 650 | 649 teacher 651 | 100 student 652 | 443 student 653 | 749 student 654 | 492 student 655 | 587 student 656 | 759 student 657 | 450 student 658 | 34 student 659 | 679 student 660 | 556 student 661 | 120 teacher 662 | 356 student 663 | 5 student 664 | 736 student 665 | 666 student 666 | 479 student 667 | 752 staff 668 | 656 teacher 669 | 460 student 670 | 3 student 671 | 470 student 672 | 613 teacher 673 | 718 student 674 | 211 student 675 | 187 student 676 | 99 student 677 | 97 student 678 | 225 student 679 | 33 teacher 680 | 304 student 681 | 117 student 682 | 60 student 683 | 449 student 684 | 532 student 685 | 61 student 686 | 208 student 687 | 127 student 688 | 687 student 689 | 788 student 690 | 731 teacher 691 | 546 student 692 | 357 student 693 | 8 student 694 | 461 student 695 | 404 student 696 | 165 student 697 | 13 student 698 | 431 student 699 | 221 student 700 | 95 student 701 | 651 student 702 | 563 student 703 | 426 student 704 | 218 student 705 | 680 teacher 706 | 634 student 707 | 257 teacher 708 | 340 student 709 | 219 student 710 | 723 teacher 711 | 155 student 712 | 544 student 713 | 98 student 714 | 616 student 715 | 287 student 716 | 134 student 717 | 438 student 718 | 334 student 719 | 175 teacher 720 | 82 student 721 | 298 student 722 | 592 student 723 | 645 student 724 | 290 student 725 | 305 staff 726 | 473 staff 727 | 755 teacher 728 | 787 other 729 | 585 student 730 | 311 student 731 | 545 staff 732 | 162 staff 733 | 180 staff 734 | 0 student 735 | 0 staff 736 | 377 teacher 737 | 596 student 738 | 267 staff 739 | 367 student 740 | 658 student 741 | 140 staff 742 | 521 staff 743 | 611 student 744 | 44 staff 745 | 614 student 746 | 410 teacher 747 | 327 staff 748 | 387 student 749 | 575 other 750 | 768 staff 751 | 220 other 752 | 417 staff 753 | 271 student 754 | 164 student 755 | 91 student 756 | 65 student 757 | 235 student 758 | 693 staff 759 | 146 teacher 760 | 508 student 761 | 664 teacher 762 | 289 staff 763 | 
371 teacher 764 | 26 student 765 | 113 staff 766 | 234 staff 767 | 204 staff 768 | 47 student 769 | 332 student 770 | 56 student 771 | 395 student 772 | 172 student 773 | 51 staff 774 | 739 staff 775 | 405 student 776 | 707 student 777 | 188 student 778 | 399 student 779 | 375 staff 780 | 0 staff 781 | 513 student 782 | 189 staff 783 | 16 staff 784 | 176 student 785 | 708 student 786 | 706 staff 787 | 722 student 788 | 368 student 789 | 429 teacher 790 | 599 staff 791 | 376 staff 792 | 71 student 793 | 642 student 794 | 623 teacher 795 | 430 teacher 796 | 108 staff 797 | 477 staff 798 | 69 teacher 799 | 406 staff 800 | 185 student 801 | 673 staff 802 | -------------------------------------------------------------------------------- /code/run_gtl.py: -------------------------------------------------------------------------------- 1 | """ 2 | Main module to run GTL model. Depending on the input parameters, 3 | it loads certain data, creates and trains a specific GTL model 4 | 5 | @author: Andrey Gritsenko 6 | SPIRAL Group 7 | Electrical & Computer Engineering 8 | Northeastern University 9 | """ 10 | 11 | import argparse 12 | import os 13 | import numpy as np 14 | from pathlib import Path 15 | 16 | from load_save_data import load_data 17 | from create_gtl_model import getGTLmodel 18 | from train_gtl_model import train 19 | from load_save_data import save_global_results 20 | 21 | 22 | def str2bool(v): 23 | if v.lower() in ('true', 't', 'yes', 'y', '1'): 24 | return True 25 | elif v.lower() in ('false', 'f', 'no', 'n', '0'): 26 | return False 27 | else: 28 | raise argparse.ArgumentTypeError('Boolean value expected.') 29 | 30 | 31 | if __name__ == '__main__': 32 | parser = argparse.ArgumentParser( 33 | description="Train Graph Transfer Learning model", 34 | formatter_class=argparse.ArgumentDefaultsHelpFormatter 35 | ) 36 | # dataset parameters 37 | parser.add_argument('-lp', '--load_path', 38 | type=str, default='../datasets/', 39 | help='Full path to folder with 
datasets') 40 | parser.add_argument('-d', '--dataset', 41 | type=str, default='sb-4', choices=['bp-2', 'sb-4', 'sb-6', 'zachary', 'disease', 'email'], 42 | help='Dataset to be used') 43 | # general graph parameters 44 | parser.add_argument('--labels', 45 | type=str, default='cluster', choices=['cluster', 'infection'], 46 | help='Node labels to be predicted') 47 | # graph embedding parameters 48 | parser.add_argument('--nembedding', 49 | type=int, default=5, 50 | help='Size of the output embedding vector') 51 | parser.add_argument('-sg', '--topology_similarity', 52 | type=str, default='randomwalk', choices=['randomwalk', 'adjacency'], 53 | help='Similarity measure between nodes of the same graph in graph topological space') 54 | parser.add_argument('-et', '--embedding_type', 55 | type=str, default='skipgram', choices=['unified', 'skipgram'], 56 | help='Type of embedding function: skipgram, unified') 57 | parser.add_argument('-se', '--embedding_similarity', 58 | type=str, default='softmax', choices=['softmax', 'innerprod', 'cossim', 'l2'], 59 | help='Similarity measures between nodes of the same graph in embedding space') 60 | parser.add_argument('-sl', '--similarity_loss', 61 | type=str, default='crossentropy', choices=['crossentropy', 'innerprod', 'l2'], 62 | help='Loss function between similarities in topological and embedding ' 63 | 'spaces for nodes of the same graph') 64 | # prediction branch parameters 65 | parser.add_argument('--depth', 66 | type=int, default=1, 67 | help='Number of hidden layers in Prediction Branch') 68 | parser.add_argument('-af', '--activation_function', 69 | type=str, default='tanh', choices=['tanh', 'sigmoid', 'relu'], 70 | help='Activation function for Prediction Branch neurons') 71 | parser.add_argument('-prl', '--prediction_loss', 72 | type=str, default='mean_squared_error', 73 | choices=['mean_squared_error', 'mean_absolute_percentage_error', 'root_mean_squared_error'], 74 | help='Loss function for Prediction Branch') 75 | # randomwalk parameters 76 | 
parser.add_argument('--nwalks', 77 | type=int, default=20, 78 | help='Number of node2vec random walks') 79 | parser.add_argument('--walk_length', 80 | type=int, default=10, 81 | help='Length of random walk') 82 | parser.add_argument('--window_size', 83 | type=int, default=4, 84 | help='Width of sliding window in random walks') 85 | parser.add_argument('--p', 86 | type=float, default=0.25, 87 | help='Parameter p for node2vec random walks') 88 | parser.add_argument('--q', 89 | type=float, default=4.0, 90 | help='Parameter q for node2vec random walks') 91 | parser.add_argument('--nnegative', 92 | type=int, default=5, 93 | help='Number of negative samples used in skip-gram') 94 | parser.add_argument('--scale_negative', 95 | type=str2bool, default=False, 96 | help='Specifies whether to scale outputs for negative samples') 97 | # second graph parameters 98 | parser.add_argument('--transfer_mode', 99 | type=str, default='1graph', choices=['1graph', 'noP', 'iterP', 'optP', 'trueP', 'trueP_DS'], 100 | help='Specifies transfer learning mode') 101 | parser.add_argument('--b_from_a', 102 | type=str, default='permute', choices=['permute', 'modify'], 103 | help='Specifies whether to permute or add/remove edges to graph A to generate graph B') 104 | parser.add_argument('-dp', '--discrepancy_percent', 105 | type=float, default=0, 106 | help='Specifies percentage of edges to be removed/added when generating second graph') 107 | parser.add_argument('-gd', '--graph_distance', 108 | type=str, default='l2', choices=['l2', 'innerprod', 'cossim'], 109 | help='Pairwise distance measure between nodes in the embedding space (matrix D)') 110 | # neural net train/test parameters 111 | parser.add_argument('--alpha', 112 | type=float, default=1.0, 113 | help='Weight of graph matching loss') 114 | parser.add_argument('--beta', 115 | type=str2bool, default=False, 116 | help='Specifies whether to scale parts of P-optimization loss') 117 | parser.add_argument('-lr', '--learning_rate', 118 | 
type=float, default=0.025, 119 | help='Learning rate') 120 | parser.add_argument('--batch_size', 121 | type=int, default=2, 122 | help='Number of instances in each batch') 123 | parser.add_argument('--epochs', 124 | type=int, default=2, 125 | help='Number of epochs') 126 | parser.add_argument('--early_stopping', 127 | type=int, default=0, 128 | help='Number of epochs with no improvement after which training will be stopped. ' 129 | 'If <=0, no early stopping is used') 130 | parser.add_argument('--iterations', 131 | type=int, default=1, 132 | help='Number of iterations for model to initialize and run. ' 133 | 'Output results are averaged across iterations') 134 | # CUDA parameters 135 | parser.add_argument('--id_gpu', 136 | default=-1, type=int, 137 | help='Specifies which gpu to use. If <0, model is run on cpu') 138 | # results parameters 139 | parser.add_argument('-sp', '--save_path', type=str, default='', 140 | help='Full path to folder where results are saved') 141 | args = parser.parse_args() 142 | 143 | if args.id_gpu >= 0: 144 | os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID" 145 | # The GPU id to use 146 | os.environ["CUDA_VISIBLE_DEVICES"] = str(args.id_gpu) 147 | 148 | # save configuration file 149 | n_iter = max(1, args.iterations) 150 | save_path = args.save_path 151 | print("*************** Configuration ***************") 152 | args_dic = vars(args) 153 | for arg, value in args_dic.items(): 154 | line = arg + ' : ' + str(value) 155 | print(line) 156 | print("*********************************************\n") 157 | 158 | # load data 159 | dataset_path = str(Path(args.load_path) / args.dataset.lower()) 160 | labels = args.labels.lower() 161 | transmode = args.transfer_mode 162 | B_from_A = args.b_from_a 163 | disc_pect = args.discrepancy_percent 164 | 165 | n_layers = max(1, args.depth) 166 | 167 | n_embedding = args.nembedding 168 | topology_similarity = args.topology_similarity 169 | embedding_type = args.embedding_type 170 | embedding_similarity = 
args.embedding_similarity 171 | if embedding_type == 'skipgram': 172 | embedding_similarity = 'softmax' 173 | if embedding_similarity == 'softmax': 174 | n_negative = args.nnegative 175 | scale_negative = args.scale_negative 176 | else: 177 | n_negative = 0 178 | scale_negative = False 179 | similarity_loss = args.similarity_loss 180 | prediction_loss = args.prediction_loss 181 | activation_function = args.activation_function 182 | 183 | graph_distance = args.graph_distance 184 | n_walks = args.nwalks 185 | walk_length = args.walk_length 186 | window_size = args.window_size 187 | p = args.p 188 | q = args.q 189 | 190 | alpha = args.alpha 191 | beta = args.beta 192 | learning_rate = args.learning_rate 193 | batch_size = args.batch_size 194 | early_stopping = args.early_stopping 195 | 196 | n_epochs = args.epochs 197 | results = {'epochs': {}, 'train': {}, 'testA': {}} 198 | if labels != 'cluster': 199 | results['rsquaredA'] = {} 200 | if transmode != '1graph': 201 | results['testB'] = {} 202 | if labels != 'cluster': 203 | results['rsquaredB'] = {} 204 | for iter in range(n_iter): 205 | save_path_iter = save_path + str(iter+1) + '/' 206 | if not os.path.exists(save_path_iter): 207 | os.makedirs(save_path_iter) 208 | # load data 209 | n_nodes, n_features, n_labels, A, Afeatures, Alabels, Atrain, Atest, B, Bfeatures, Blabels, Ptrue = \ 210 | load_data(dataset_path, labels, transmode, B_from_A, disc_pect) 211 | # create model 212 | models = getGTLmodel(n_nodes, n_features, n_embedding, labels, n_labels, n_layers, n_negative, scale_negative, 213 | embedding_type, embedding_similarity, similarity_loss, prediction_loss, 214 | activation_function, transmode, graph_distance, learning_rate, alpha, save_path) 215 | 216 | if iter == 0: 217 | print("\nEmbedding model summary:".upper()) 218 | print(models['EmbeddingModel'].summary()) 219 | print("\nEmbedding similarity branch summary:".upper()) 220 | print(models['EmbeddingModel'].get_layer('Branch_SimilarityA').summary()) 221 | 
print("\nPrediction model summary:".upper()) 222 | print(models['PredictionModel'].summary()) 223 | print("\nPrediction branch summary:".upper()) 224 | print(models['PredictionModel'].get_layer('Branch_Prediction').summary()) 225 | 226 | # train/test model 227 | print("\n ============================================== ") 228 | print("|*************** ITERATION #{:3d} ***************|".format(iter+1) if n_iter > 1 else 229 | "|************ GRAPH TRANSFER LEARNING *********|") 230 | print(" ============================================== ") 231 | iter_results = train(models, A, Afeatures, Alabels, Atrain, Atest, B, Bfeatures, Blabels, Ptrue, transmode, 232 | topology_similarity, n_walks, walk_length, window_size, p, q, n_negative, learning_rate, 233 | beta, n_epochs, early_stopping, batch_size, save_path_iter) 234 | results['epochs'][iter] = iter_results['epochs'] 235 | results['train'][iter] = iter_results['acc_train'] 236 | results['testA'][iter] = iter_results['acc_testA'] 237 | if labels != 'cluster': 238 | results['rsquaredA'][iter] = iter_results['acc_rsquaredA'] 239 | if transmode != '1graph': 240 | results['testB'][iter] = iter_results['acc_testB'] 241 | if labels != 'cluster': 242 | results['rsquaredB'][iter] = iter_results['acc_rsquaredB'] 243 | 244 | # save global results 245 | picklename = "GlobalResults" 246 | save_global_results(args, iter + 1, results, picklename) 247 | 248 | epochs_mean = np.mean(list(results['epochs'].values())) 249 | epochs_std = np.std(list(results['epochs'].values())) 250 | train_mean = np.mean(list(results['train'].values())) 251 | train_std = np.std(list(results['train'].values())) 252 | testA_mean = np.mean(list(results['testA'].values())) 253 | testA_std = np.std(list(results['testA'].values())) 254 | res_str = "" 255 | if labels != 'cluster': 256 | rsquaredA_mean = np.mean(list(results['rsquaredA'].values())) 257 | rsquaredA_std = np.std(list(results['rsquaredA'].values())) 258 | res_str += "\tR-squared (graph A) = {0:.4f} 
(\u00B1{1:.4f})\n".format(rsquaredA_mean, rsquaredA_std) 259 | if transmode != '1graph': 260 | testB_mean = np.mean(list(results['testB'].values())) 261 | testB_std = np.std(list(results['testB'].values())) 262 | res_str += "\tTest accuracy (graph B) = {0:.4f} (\u00B1{1:.4f})\n".format(testB_mean, testB_std) 263 | if labels != 'cluster': 264 | rsquaredB_mean = np.mean(list(results['rsquaredB'].values())) 265 | rsquaredB_std = np.std(list(results['rsquaredB'].values())) 266 | res_str += "\tR-squared (graph B) = {0:.4f} (\u00B1{1:.4f})\n".format(rsquaredB_mean, rsquaredB_std) 267 | print("\n\n ============================================== ") 268 | print("|*************** FINAL RESULTS ***************|") 269 | print(" ============================================== ") 270 | print(f"After {n_iter:2d} iteration(s), the average\n" 271 | f"\tConvergence rate = {epochs_mean:.4f} (\u00B1{epochs_std:.4f})\n" 272 | f"\tTrain accuracy (graph A) = {train_mean:.4f} (\u00B1{train_std:.4f})\n" 273 | f"\tTest accuracy (graph A) = {testA_mean:.4f} (\u00B1{testA_std:.4f})") 274 | print(res_str) 275 | -------------------------------------------------------------------------------- /code/create_gtl_model.py: -------------------------------------------------------------------------------- 1 | """ 2 | Main GTL module responsible for creating, training and testing GTL NN 3 | 4 | @author: Andrey Gritsenko 5 | SPIRAL Group 6 | Electrical & Computer Engineering 7 | Northeastern University 8 | """ 9 | 10 | import numpy as np 11 | 12 | from keras.models import Model 13 | from keras.layers import Input, Activation 14 | from keras.layers.core import Dense, Lambda, Reshape 15 | from keras.layers.embeddings import Embedding 16 | from keras.layers.merge import Dot, Concatenate 17 | from keras import backend as K 18 | from keras import optimizers 19 | 20 | 21 | # l2 distance between two network layers 22 | def l2_dist(vects): 23 | x, y = vects 24 | sum_square = K.sum(K.square(x - y), axis=1, 
keepdims=True) 25 | return K.sqrt(K.maximum(sum_square, K.epsilon())) 26 | # shape of l2 distance between two embeddings in similarity branch 27 | def l2_output_shape_sim(shapes): 28 | shape1, shape2 = shapes 29 | return (shape1[0], 1, 1) 30 | # shape of l2 distance between two embeddings in distance branch 31 | def l2_output_shape_dist(shapes): 32 | shape1, shape2 = shapes 33 | return (shape1[0], 1) 34 | 35 | 36 | def createNetworks(n_nodes, input_size, labels, n_layers, embedding_type, embedding_similarity, transmode, graph_distance, scale_negative, activation_function): 37 | """ 38 | Creates neural network that learns node embeddings of given graph(s) 39 | 40 | Inputs: 41 | INPUT_SIZE List [n,m,k,l] where: 42 | N Number of samples, 43 | M Number of original features (equal to n for one-hot coding) 44 | K Number of embedding features 45 | L Number of node labels 46 | EMBEDDING_TYPE Type of embedding approach, e.g. 'unified' (unified embedding for target and context nodes) or 'skipgram' (different embeddings for target and context nodes) 47 | EMBEDDING_SIMILARITY Measure of similarity between node embeddings within one graph 48 | TRANSMODE Flag to specify transfer learning mode 49 | GRAPH_DISTANCE Distance between node embeddings of different graphs 50 | 51 | Outputs: 52 | Neural network for graph node embeddings 53 | """ 54 | 55 | dict_sizeA, dict_sizeB = n_nodes if type(n_nodes) in [list, tuple] else [n_nodes]*2 56 | feature_size, embedding_size, class_size, negative_size = input_size 57 | inputsEmbedding = [] 58 | inputsEmbeddingA = [] 59 | inputsEmbeddingB = [] 60 | outputsEmbedding = [] 61 | inputsPrediction = [] 62 | outputsPrediction = [] 63 | 64 | if embedding_similarity == 'l2': 65 | from keras.constraints import UnitNorm 66 | constraints = UnitNorm(axis=1) 67 | else: 68 | constraints = None 69 | 70 | # create embedding branch for graph A 71 | if feature_size == 1: 72 | input_shape = (1,) 73 | input_type = 'int32' 74 | Embedding_targetA = 
Embedding(dict_sizeA, embedding_size, embeddings_constraint=constraints, 75 | name='Embedding_TargetA') 76 | Embedding_contextA = Embedding(dict_sizeA, embedding_size, embeddings_constraint=constraints, 77 | name='Embedding_ContextA') 78 | else: 79 | input_shape = (1, feature_size,) 80 | input_type = 'float' 81 | Embedding_targetA = Dense(embedding_size, activation='tanh', kernel_constraint=constraints, 82 | name='Embedding_TargetA') 83 | Embedding_contextA = Dense(embedding_size, activation='tanh', kernel_constraint=constraints, 84 | name='Embedding_ContextA') 85 | 86 | input_targetA = Input(shape=input_shape, dtype=input_type, 87 | name='Input_TargetA') 88 | input_contextA = Input(shape=input_shape, dtype=input_type, 89 | name='Input_ContextA') 90 | inputsEmbeddingA.extend([input_targetA, input_contextA]) 91 | 92 | # initialize graph A embedding weights from multivariate gaussian distribution 93 | embedding_targetA = Embedding_targetA(input_targetA) 94 | Embedding_targetA.set_weights([np.random.multivariate_normal(np.zeros(embedding_size), 0.1*np.identity(embedding_size), dict_sizeA)]) 95 | if embedding_type == 'skipgram': 96 | embedding_contextA = Embedding_contextA(input_contextA) 97 | Embedding_contextA.set_weights([np.random.multivariate_normal(np.zeros(embedding_size), 0.1*np.identity(embedding_size), dict_sizeA)]) 98 | elif embedding_type == 'unified': 99 | embedding_contextA = Embedding_targetA(input_contextA) 100 | 101 | # add more dense layers to embedding branch if predicting pagerank 102 | if labels == 'pagerank': 103 | embedding_targetA = Dense(embedding_size, activation='tanh')(embedding_targetA) 104 | embedding_contextA = Dense(embedding_size, activation='tanh')(embedding_contextA) 105 | 106 | # create similarity branch for graph A 107 | inputsSimilarityA = [embedding_targetA, embedding_contextA] 108 | if embedding_similarity == 'softmax': 109 | # add negative samples 110 | input_negativeA = Input(shape=(negative_size,)+input_shape[1:], 
dtype=input_type, name='Input_NegativeA') 111 | inputsEmbeddingA.extend([input_negativeA]) 112 | embedding_negativeA = Embedding_targetA(input_negativeA) 113 | # add more dense layers to embedding branch if predicting pagerank 114 | if labels == 'pagerank': 115 | embedding_negativeA = Dense(embedding_size, activation='tanh')(embedding_negativeA) 116 | inputsSimilarityA.extend([embedding_negativeA]) 117 | similarityA = createSimilarityBranch(embedding_size, mode=embedding_similarity, negative_size=negative_size, graph='A', scale_negative=scale_negative)(inputsSimilarityA) 118 | outputsEmbedding.extend([similarityA]) 119 | inputsEmbedding.extend(inputsEmbeddingA) 120 | 121 | # create prediction branch 122 | inputsPrediction.extend([input_targetA]) 123 | predictionBranch = createPredictionBranch(embedding_size, n_layers, class_size, activation_function)(embedding_targetA) 124 | predictionOutput = Reshape((class_size,), name='PredictionOutput')(predictionBranch) 125 | outputsPrediction.extend([predictionOutput]) 126 | 127 | if transmode != '1graph': 128 | input_targetB = Input(shape=input_shape, dtype=input_type, name='Input_TargetB') 129 | input_contextB = Input(shape=input_shape, dtype=input_type, name='Input_ContextB') 130 | inputsEmbeddingB.extend([input_targetB, input_contextB]) 131 | 132 | # create embedding branch for graph B 133 | if feature_size == 1: 134 | Embedding_targetB = Embedding(dict_sizeB, embedding_size, embeddings_constraint=constraints, name='Embedding_TargetB') 135 | Embedding_contextB = Embedding(dict_sizeB, embedding_size, embeddings_constraint=constraints, name='Embedding_ContextB') 136 | else: 137 | Embedding_targetB = Dense(embedding_size, activation='tanh', kernel_constraint=constraints, name='Embedding_TargetB') 138 | Embedding_contextB = Dense(embedding_size, activation='tanh', kernel_constraint=constraints, name='Embedding_ContextB') 139 | 140 | # initialize graph B embedding weights as zeros when graph embeddings are linked, or from 
multivariate gaussian distribution, otherwise 141 | embedding_targetB = Embedding_targetB(input_targetB) 142 | Embedding_targetB.set_weights([np.random.multivariate_normal(np.zeros(embedding_size), 0.1*np.identity(embedding_size), dict_sizeB)]) 143 | if embedding_type == 'skipgram': # separate embeddings for target and context nodes 144 | embedding_contextB = Embedding_contextB(input_contextB) 145 | Embedding_contextB.set_weights([np.random.multivariate_normal(np.zeros(embedding_size), 0.1*np.identity(embedding_size), dict_sizeB)]) 146 | elif embedding_type == 'unified': # unified embedding 147 | embedding_contextB = Embedding_targetB(input_contextB) 148 | 149 | # add more dense layers to embedding branch if predicting pagerank 150 | if labels == 'pagerank': 151 | embedding_targetB = Dense(embedding_size, activation='tanh')(embedding_targetB) 152 | embedding_contextB = Dense(embedding_size, activation='tanh')(embedding_contextB) 153 | 154 | # create similarity branch for graph B 155 | inputsSimilarityB = [embedding_targetB, embedding_contextB] 156 | if embedding_similarity == 'softmax': 157 | # add negative samples 158 | input_negativeB = Input(shape=(negative_size,)+input_shape[1:], dtype=input_type, name='Input_NegativeB') 159 | inputsEmbeddingB.extend([input_negativeB]) 160 | embedding_negativeB = Embedding_targetB(input_negativeB) 161 | # add more dense layers to embedding branch if predicting pagerank 162 | if labels == 'pagerank': 163 | embedding_negativeB = Dense(embedding_size, activation='tanh')(embedding_negativeB) 164 | inputsSimilarityB.extend([embedding_negativeB]) 165 | similarityB = createSimilarityBranch(embedding_size, mode=embedding_similarity, negative_size=negative_size, graph='B', scale_negative=scale_negative)(inputsSimilarityB) 166 | outputsEmbedding.extend([similarityB]) 167 | inputsEmbedding.extend(inputsEmbeddingB) 168 | 169 | # create graph distance branch 170 | if transmode != 'noP': 171 | distanceAB = 
createDistanceBranch(embedding_size, mode=graph_distance)([embedding_targetA, embedding_targetB]) 172 | outputsEmbedding.extend([distanceAB]) 173 | 174 | modelEmbedding = Model(inputs=inputsEmbedding, outputs=outputsEmbedding) 175 | branchEmbeddingA = Model(inputs=inputsEmbeddingA, outputs=outputsEmbedding[0]) 176 | branchEmbeddingB = Model(inputs=inputsEmbeddingB, outputs=outputsEmbedding[1]) if transmode != '1graph' else None 177 | modelPrediction = Model(inputs=inputsPrediction, outputs=outputsPrediction) 178 | 179 | return modelEmbedding, branchEmbeddingA, branchEmbeddingB, modelPrediction 180 | 181 | 182 | def createSimilarityBranch(embedding_size, mode='innerprod', negative_size=0, graph='A', scale_negative=False): 183 | """ 184 | Branch of global network: computes similarity between embeddings of given two nodes in a graph 185 | """ 186 | inputT = Input(shape=(1,embedding_size,), name='Embedding_Target') 187 | inputC = Input(shape=(1,embedding_size,), name='Embedding_Context') 188 | inputs = [inputT, inputC] 189 | 190 | layer_name = 'Output_Similarity' 191 | if mode=='l2': # l2 distance 192 | similarity = Lambda(l2_dist, output_shape=l2_output_shape_sim, name=layer_name)(inputs) 193 | elif mode=='cossim': # cosine similarity 194 | similarity = Dot(axes=-1, normalize=True, name=layer_name)(inputs) 195 | elif mode=='innerprod': # inner product 196 | similarity = Dot(axes=-1, name=layer_name)(inputs) 197 | else: # softmax (default) 198 | inputNS = Input(shape=(negative_size,embedding_size,), name='Embedding_NS') 199 | inputs.append(inputNS) 200 | similarityTC = Dot(axes=-1)([inputT, inputC]) 201 | similarityCNS = Dot(axes=-1)([inputC, inputNS]) 202 | similarity = Concatenate(axis=-1)([similarityTC, similarityCNS]) 203 | similarity = Activation('softmax', name=layer_name)(similarity) 204 | 205 | # normalize negative samples loss 206 | if scale_negative: 207 | def normalizeNS(x): 208 | return x/negative_size 209 | similarity = Activation(normalizeNS)(similarity) 
210 | 211 | return Model(inputs, similarity, name='Branch_Similarity'+graph) 212 | 213 | 214 | def createPredictionBranch(embedding_size, n_layers, class_size, activation): 215 | """ 216 | Branch of global network: predicts node label for a given embedding 217 | """ 218 | input = Input(shape=(embedding_size,), name='Input_Embedding') 219 | 220 | # intermediate layer(s) structure 221 | prediction = Dense(embedding_size, activation=activation)(input) 222 | for _ in range(n_layers-1): 223 | prediction = Dense(embedding_size, activation=activation)(prediction) 224 | # last layer structure 225 | layer_name = 'Output_Label' 226 | if class_size > 1: 227 | # classification task case 228 | prediction = Dense(class_size, activation='softmax', name=layer_name)(prediction) 229 | else: 230 | # regression task case 231 | prediction = Dense(class_size, activation='sigmoid', name=layer_name)(prediction) 232 | return Model(input, prediction, name='Branch_Prediction') 233 | 234 | 235 | def createDistanceBranch(embedding_size, mode='l2'): 236 | """ 237 | Branch of global network: computes all pairwise distances between node embeddings of different graphs 238 | """ 239 | input1 = Input(shape=(embedding_size,), name='Input_EmbeddingA') 240 | input2 = Input(shape=(embedding_size,), name='Input_EmbeddingB') 241 | 242 | layer_name = 'Output_DistanceAB' 243 | if mode == 'innerprod': 244 | distance12 = Dot(axes=-1, name=layer_name)([input1, input2]) 245 | elif mode == 'cossim': 246 | distance12 = Dot(axes=-1, normalize=True, name=layer_name)([input1, input2]) 247 | else: 248 | distance12 = Lambda(l2_dist, output_shape=l2_output_shape_dist, name=layer_name)([input1, input2]) 249 | return Model([input1, input2], distance12, name='Branch_Distance') 250 | 251 | 252 | # generic function that specifies loss function to be used 253 | def getLoss(loss='crossentropy'): 254 | if loss == 'l2': 255 | return L2Loss 256 | elif loss == 'innerprod': 257 | return InnerProdLoss 258 | else: 259 | return 
'binary_crossentropy' # multiple true labels are possible in output vector 260 | # return 'categorical_crossentropy' # only one true label is possible in output vector 261 | 262 | 263 | # outputs inner product loss between true and predicted labels 264 | def InnerProdLoss(yTrue, yPred): 265 | return K.sum(yTrue * yPred) 266 | 267 | 268 | # outputs l2 loss between true and predicted labels 269 | def L2Loss(yTrue, yPred): 270 | return K.sqrt(K.sum(K.square(yTrue - yPred))) 271 | 272 | 273 | # root mean squared error 274 | def RMSELoss(yTrue, yPred): 275 | return K.sqrt(K.mean(K.square(yTrue - yPred))) 276 | 277 | 278 | def getGTLmodel(n_nodes, n_features, n_embedding, labels, n_labels, n_layers, 279 | n_negative, scale_negative, embedding_type, embedding_similarity, similarity_loss, 280 | prediction_loss, activation_function, 281 | transmode, graph_distance, learning_rate, alpha, save_path): 282 | 283 | input_size = (n_features, n_embedding, n_labels, n_negative) 284 | 285 | Optimizer = optimizers.SGD(lr=learning_rate, nesterov=True) 286 | 287 | print("\n*************** Creating Model ***************".upper()) 288 | EmbeddingModel, EmbeddingABranch, EmbeddingBBranch, PredictionModel = \ 289 | createNetworks(n_nodes, input_size, labels, n_layers, 290 | embedding_type, embedding_similarity, transmode, graph_distance, scale_negative, activation_function) 291 | 292 | # Neural Net learning node embeddings 293 | EmbLoss = {"Branch_SimilarityA": getLoss(similarity_loss)} 294 | loss_weights = [1] 295 | if transmode != '1graph': 296 | EmbLoss.update({"Branch_SimilarityB": getLoss(similarity_loss)}) 297 | loss_weights.append(1) 298 | if transmode != 'noP': 299 | EmbLoss.update({"Branch_Distance": InnerProdLoss}) 300 | loss_weights.append(alpha) 301 | EmbeddingModel.compile(optimizer=Optimizer, loss=EmbLoss, loss_weights=loss_weights) 302 | with open(save_path+"EmbeddingModel.json", "w") as json_file: 303 | json_file.write(EmbeddingModel.to_json()) 304 | 305 | # Branch of 
neural net learning node embeddings only for graph A 306 | EmbeddingABranch.compile(optimizer=Optimizer, loss=getLoss(similarity_loss)) 307 | with open(save_path+"EmbeddingABranch.json", "w") as json_file: 308 | json_file.write(EmbeddingABranch.to_json()) 309 | 310 | if transmode != '1graph': 311 | # Branch of neural net learning node embeddings only for graph B 312 | EmbeddingBBranch.compile(optimizer=Optimizer, loss=getLoss(similarity_loss)) 313 | with open(save_path+"EmbeddingBBranch.json", "w") as json_file: 314 | json_file.write(EmbeddingBBranch.to_json()) 315 | 316 | # Neural Net predicting labels 317 | if n_labels > 1: 318 | prediction_loss = "categorical_crossentropy" 319 | elif prediction_loss is None: 320 | prediction_loss = "binary_crossentropy" 321 | elif prediction_loss == 'root_mean_squared_error': 322 | prediction_loss = RMSELoss 323 | PredictionModel.compile(optimizer=Optimizer, loss=prediction_loss) 324 | 325 | with open(save_path+"PredictionModel.json", "w") as json_file: 326 | json_file.write(PredictionModel.to_json()) 327 | with open(save_path+"PredictionBranch.json", "w") as json_file: 328 | json_file.write(PredictionModel.get_layer('Branch_Prediction').to_json()) 329 | 330 | model_names = ['EmbeddingModel', 'EmbeddingABranch', 'EmbeddingBBranch', 'PredictionModel'] 331 | models = dict(zip(model_names, [EmbeddingModel, EmbeddingABranch, EmbeddingBBranch, PredictionModel])) 332 | 333 | return models 334 | -------------------------------------------------------------------------------- /code/train_gtl_model.py: -------------------------------------------------------------------------------- 1 | """ 2 | @author: Andrey Gritsenko 3 | SPIRAL Group 4 | Electrical & Computer Engineering 5 | Northeastern University 6 | """ 7 | 8 | import os 9 | import numpy as np 10 | from sklearn.metrics.pairwise import euclidean_distances 11 | 12 | from keras.models import Model 13 | 14 | import data_generator as DG 15 | from load_save_data import 
save_iteration_results 16 | 17 | 18 | # manually update 19 | def updateWeigths(WA, WB, P, lr=0.025): 20 | print("WA shape = {}, WB shape = {}".format(WA.shape, WB.shape)) 21 | if WA.shape[0] != WB.shape[0]: 22 | P = P[:WA.shape[0], -WB.shape[0]:] 23 | WAnew = (1 - 2*lr)*WA + 2*lr*np.matmul(P, WB) 24 | WBnew = (1 - 2*lr)*WB + 2*lr*np.matmul(P.T, WA) 25 | return WAnew, WBnew 26 | 27 | 28 | # outputs embedding for given raw data 29 | def getEmbedding(model, graph_name, data): 30 | new_model = Model(inputs=model.get_layer('Input_Target'+graph_name).input, 31 | outputs=model.get_layer('Embedding_Target'+graph_name).get_output_at(0)) 32 | if data.shape[1] > 1: 33 | data = data.reshape((data.shape[0],1,data.shape[1])) 34 | return new_model.predict(data).squeeze() 35 | 36 | 37 | # outputs prediction for given raw data 38 | def getModelOutputs_fromRaw(model, data): 39 | if data.shape[1] > 1: 40 | data = data.reshape((data.shape[0], 1, data.shape[1])) 41 | return model.predict(data).squeeze() 42 | 43 | 44 | # outputs prediction for given embedding of data 45 | def getModelOutputs_fromEmbedding(model, data): 46 | return model.get_layer('Branch_Prediction').predict(data) 47 | 48 | 49 | # compute matrix D of node-wise distance between two graphs 50 | def computeDistanceBetweenGraphs(graphA, graphB, mode='l2'): 51 | if mode == 'innerprod': # inner product 52 | D = np.matmul(graphA, graphB.T) 53 | elif mode == 'l2sq': # squared l2 distance 54 | D = euclidean_distances(graphA, graphB, squared=True) 55 | else: #l2 distance 56 | D = euclidean_distances(graphA, graphB) 57 | if graphA.shape != graphB.shape: 58 | scale = 100 59 | D = euclidean_distances(graphA, graphB) 60 | D = np.concatenate((np.concatenate((np.ones((graphA.shape[0], graphA.shape[0]))*scale, D), axis=1), 61 | np.concatenate((np.zeros(D.T.shape), np.ones((graphB.shape[0], graphB.shape[0]))*scale), axis=1)), axis=0) 62 | 63 | return D 64 | 65 | 66 | # return predicted labels and regression values 67 | def 
getModelPredictions(prediction): 68 | return prediction.max(axis=1, keepdims=True) == prediction 69 | 70 | 71 | # compute classification accuracy or regression error 72 | def computeAccuracy(true, prediction): 73 | if prediction.ndim == 2 and prediction.shape[1] > 1: 74 | # classification accuracy 75 | prediction = getModelPredictions(prediction) 76 | acc = np.sum((np.argmax(true, axis=1) == np.argmax(prediction.astype(int), axis=1)).astype(int))/float(len(true)) if len(true) > 0 else 0 77 | else: 78 | try: 79 | prediction = np.delete(prediction, np.where(true == 1)[0][0]) 80 | true = np.delete(true, np.where(true == 1)[0][0]) 81 | except IndexError: 82 | pass 83 | # root mean squared error aka standard deviation of residuals 84 | acc = np.sqrt(np.mean(np.square(prediction - true))) 85 | return acc 86 | 87 | 88 | # compute R-squared statistics aka coefficient of determination 89 | def computeRSquared(true, prediction, avg_predictor=None): 90 | try: 91 | prediction = np.delete(prediction, np.where(true == 1)[0][0]) 92 | true = np.delete(true, np.where(true == 1)[0][0]) 93 | except IndexError: 94 | pass 95 | if avg_predictor is None: 96 | avg_predictor = np.mean(true) 97 | rsquared = 1 - np.sum(np.square(prediction - true))/np.sum(np.square(true - avg_predictor)) 98 | return rsquared 99 | 100 | 101 | def train(models, A, Afeatures, Alabels, Atrain, Atest, B, Bfeatures, Blabels, Ptrue, transmode, topology_similarity, n_walks, walk_length, window_size, p, q, n_negative, learning_rate, beta, nepochs, early_stopping, batch_size, save_path): 102 | 103 | # file to save results 104 | suffix = '_' + str(nepochs) + '_' 105 | acc_txt = open(save_path + 'Accuracy' + suffix + '.txt', "w") 106 | 107 | # choose the period of how often results are saved 108 | step = max(int(nepochs/10), 1) if nepochs<=100 else max(int(nepochs/100), 10) 109 | # variables to contain results 110 | epochs = [] 111 | acc_train = [] 112 | acc_testA = [] 113 | acc_testB = [] 114 | 115 | lr_init = 0.1 116 | 117 | patience = 
0 118 | epoch_best = -1 119 | acc_curr = -float('inf') if Alabels.shape[1]>1 else float('inf') 120 | acc_best = -float('inf') if Alabels.shape[1]>1 else float('inf') 121 | weights_embed_best = None 122 | weights_pred_best = None 123 | 124 | EmbeddingModel = models['EmbeddingModel'] 125 | EmbeddingABranch = models['EmbeddingABranch'] 126 | EmbeddingBBranch = models['EmbeddingBBranch'] 127 | PredictionModel = models['PredictionModel'] 128 | 129 | # prepare data for training 130 | print("\n=======================================\nDATA IS LOADING") 131 | from graph_utils import getNodeSimilarity 132 | Atarget, Acontext, Asimilarity, Anegative = getNodeSimilarity(A, mode=topology_similarity, n_walks=n_walks, 133 | walk_length=walk_length, window_size=window_size, p=p, 134 | q=q, n_negative=n_negative) 135 | if B is None: 136 | Btarget, Bcontext, Bsimilarity, Bnegative = None, None, None, None 137 | else: 138 | Btarget, Bcontext, Bsimilarity, Bnegative = getNodeSimilarity(B, mode=topology_similarity, n_walks=n_walks, 139 | walk_length=walk_length, window_size=window_size, 140 | p=p, q=q, n_negative=n_negative) 141 | print("\n=======================================\nDATA IS LOADED") 142 | 143 | # initialize model 144 | if transmode not in ['1graph', 'noP']: 145 | # initialize model 146 | print("\n=======================================\nMODEL INITIALIZATION") 147 | # train embedding of graph A 148 | Data = DG.EmbeddingNodesGenerator(batch_size, A, Atarget, Acontext, Asimilarity, Anegative, n_negative=n_negative, Afeatures=Afeatures) 149 | print("\nEMBEDDING BRANCH A INITIALIZATION:") 150 | EmbeddingABranch.fit_generator(Data, epochs=1, verbose=2) 151 | # adjust embedding of graph A by learning class labels 152 | print("\nPREDICTION BRANCH INITIALIZATION:") 153 | PredictionModel.fit_generator(DG.PredictionNodesGenerator(batch_size, Afeatures[Atrain,:], Alabels[Atrain,:]), epochs=1, verbose=2) 154 | # compute training and testA accuracies 155 | epochs.append(0) 156 | 
Aembedding = getEmbedding(EmbeddingModel, 'A', Afeatures[np.concatenate((Atrain, Atest)),:]) 157 | Alabels_output = getModelOutputs_fromEmbedding(PredictionModel, Aembedding) 158 | acc_train.append(computeAccuracy(Alabels[Atrain,:], Alabels_output[Atrain,:])) 159 | acc_testA.append(computeAccuracy(Alabels[Atest,:], Alabels_output[Atest,:])) 160 | # initialize P 161 | print("\nP INITIALIZATION:") 162 | if A.shape[0] != B.shape[0]: 163 | factor = 0.5 164 | Aext = np.concatenate((np.concatenate((A, np.ones((A.shape[0],B.shape[1]))*factor), axis=1), 165 | np.ones((B.shape[0], A.shape[1]+B.shape[1]))*factor), axis=0) 166 | Bext = np.concatenate((np.ones((A.shape[0], A.shape[1]+B.shape[1]))*factor, 167 | np.concatenate((np.ones((B.shape[0],A.shape[1]))*factor, B), axis=1)), axis=0) 168 | else: 169 | Aext, Bext = A, B 170 | if transmode.find('trueP') != -1: 171 | P = Ptrue 172 | else: 173 | from updateP import updateP 174 | P = updateP(Aext, Bext, np.zeros(Aext.shape), None, 'opt') 175 | # update A,B embeddings w.r.t. 
transfer loss 176 | W_TargetA = EmbeddingABranch.get_layer('Embedding_TargetA').get_weights()[0] 177 | W_TargetB = np.zeros(EmbeddingBBranch.get_layer('Embedding_TargetB').get_weights()[0].shape) 178 | W_TargetA, W_TargetB = updateWeigths(W_TargetA, W_TargetB, P, lr=lr_init) 179 | EmbeddingModel.get_layer('Embedding_TargetA').set_weights([W_TargetA]) 180 | EmbeddingModel.get_layer('Embedding_TargetB').set_weights([W_TargetB]) 181 | try: 182 | W_ContextA = EmbeddingABranch.get_layer('Embedding_ContextA').get_weights()[0] 183 | W_ContextB = np.zeros(EmbeddingBBranch.get_layer('Embedding_ContextB').get_weights()[0].shape) 184 | W_ContextA, W_ContextB = updateWeigths(W_ContextA, W_ContextB, P, lr=lr_init) 185 | EmbeddingModel.get_layer('Embedding_ContextA').set_weights([W_ContextA]) 186 | EmbeddingModel.get_layer('Embedding_ContextB').set_weights([W_ContextB]) 187 | except: 188 | pass 189 | # adjust embedding of graph B 190 | Data = DG.EmbeddingNodesGenerator(batch_size, B, Btarget, Bcontext, Bsimilarity, Bnegative, n_negative=n_negative, Afeatures=Bfeatures) 191 | print("\nEMBEDDING BRANCH B INITIALIZATION:") 192 | EmbeddingBBranch.fit_generator(Data, epochs=1, verbose=2) 193 | # compute testB accuracy 194 | Bembedding = getEmbedding(EmbeddingModel, 'B', Bfeatures) 195 | Blabels_output = getModelOutputs_fromEmbedding(PredictionModel, Bembedding) 196 | acc_testB.append(computeAccuracy(Blabels, Blabels_output)) 197 | print(f"\n---------------------------------------\n" 198 | f"After model initialization\n" 199 | f"\tTrain accuracy = {acc_train[-1]:.4f} (graph A)\n" 200 | f"\tTest accuracy = {acc_testA[-1]:.4f} (graph A)\n" 201 | f"\tTest accuracy = {acc_testB[-1]:.4f} (graph B)") 202 | acc_txt.write("{:4d}\t{:.4f}\t{:.4f}\t{:.4f}\n".format(epochs[-1], acc_train[-1], acc_testA[-1], acc_testB[-1])) 203 | else: 204 | P = None 205 | 206 | print("\n*************** Model Training ***************".upper()) 207 | # train model 208 | for epoch in range(nepochs): 209 | 
print("\n=======================================\nEPOCH #{:4d}".format(epoch+1)) 210 | 211 | # optimize permutation matrix and update A,B embeddings w.r.t. it 212 | print("\nTRANSFER TRAINING:") 213 | if transmode not in ['1graph', 'noP']: 214 | if transmode == 'trueP': 215 | print("True permutation P is used") 216 | elif transmode == 'trueP_DS': 217 | print("True doubly-stochastic P is used") 218 | else: 219 | D = computeDistanceBetweenGraphs(getEmbedding(EmbeddingModel,'A',Afeatures), getEmbedding(EmbeddingModel,'B',Bfeatures)) 220 | # scale loss w.r.t. epoch number: give more weight to the second part of the loss as training progresses 221 | if beta: 222 | scalefunc = lambda x: x/(1+abs(x)) # S-shaped function 223 | scalefactor = 5 # loss is scaled in range [1/scalefactor ... 1 ... scalefactor] 224 | scale = scalefactor**(scalefunc(epoch-nepochs/2)) 225 | D *= scale 226 | P = updateP(Aext, Bext, D, P, transmode) # optimize permutation matrix P 227 | print("P is updated with '{}' method".format(transmode)) 228 | else: 229 | print("No P is used (no transfer)") 230 | 231 | # train Embedding model for 1 epoch 232 | print("\nEMBEDDING TRAINING:") 233 | Data = DG.EmbeddingNodesGenerator(batch_size, A, Atarget, Acontext, Asimilarity, Anegative, n_negative, B, Btarget, Bcontext, Bsimilarity, Bnegative, P, Afeatures=Afeatures, Bfeatures=Bfeatures) 234 | EmbeddingModel.fit_generator(Data, epochs=1, verbose=2) 235 | 236 | # train Prediction model for 1 epoch 237 | print("\nPREDICTION TRAINING:") 238 | PredictionModel.fit_generator(DG.PredictionNodesGenerator(batch_size, Afeatures[Atrain, :], Alabels[Atrain, :]), epochs=1, verbose=2) 239 | 240 | # compute training accuracy 241 | epochs.append(epoch+1) 242 | Aembedding = getEmbedding(EmbeddingModel, 'A', Afeatures[np.concatenate((Atrain, Atest)), :]) 243 | Alabels_output = getModelOutputs_fromEmbedding(PredictionModel, Aembedding) 244 | acc_curr = computeAccuracy(Alabels[Atrain,:], Alabels_output[Atrain,:]) 245 | 
acc_train.append(acc_curr) 246 | 247 | # check for training accuracy improvement 248 | if (Alabels.shape[1]>1 and acc_curr>acc_best) or (Alabels.shape[1]==1 and acc_curr<acc_best): 249 | print(f"\nEPOCH #{epoch+1:4d} :: Train accuracy has improved from the previous best value " 250 | f"({acc_best:.4f} -> {acc_curr:.4f})") 251 | patience = 0 252 | epoch_best = epoch 253 | acc_best = acc_curr 254 | weights_embed_best = EmbeddingModel.get_weights() 255 | weights_pred_best = PredictionModel.get_weights() 256 | else: 257 | print("\nEPOCH #{:4d} :: Train accuracy ({:.4f}) has not improved from the previous best value at epoch #{:4d} ({:.4f})".format(epoch+1, acc_curr, epoch_best+1, acc_best)) 258 | patience += 1 259 | 260 | # output intermediate training and test results at given interval 261 | if epoch==0 or (epoch+1)%step==0 or epoch==nepochs-1: 262 | acc_testA.append(computeAccuracy(Alabels[Atest,:], Alabels_output[Atest,:])) # compute A test accuracy 263 | if B is None: 264 | print("\n---------------------------------------\n\tTrain accuracy = {:.4f} (graph A)\n\tTest accuracy = {:.4f} (graph A)\n---------------------------------------\n".format(acc_train[-1], acc_testA[-1])) 265 | acc_txt.write("{:4d}\t{:.4f}\t{:.4f}\n".format(epochs[-1], acc_train[-1], acc_testA[-1])) 266 | else: 267 | Bembedding = getEmbedding(EmbeddingModel, 'B', Bfeatures) 268 | Blabels_output = getModelOutputs_fromEmbedding(PredictionModel, Bembedding) 269 | acc_testB.append(computeAccuracy(Blabels, Blabels_output)) 270 | print("\n---------------------------------------\n\tTrain accuracy = {:.4f} (graph A)\n\tTest accuracy = {:.4f} (graph A)\n\tTest accuracy = {:.4f} (graph B)\n---------------------------------------\n".format(acc_train[-1], acc_testA[-1], acc_testB[-1])) 271 | acc_txt.write("{:4d}\t{:.4f}\t{:.4f}\t{:.4f}\n".format(epochs[-1], acc_train[-1], acc_testA[-1], acc_testB[-1])) 272 | 273 | # check for early stopping 274 | if (early_stopping>0 and patience>early_stopping-1): 275 | print("Early stopping.") 276 | break 277 | # end of training 278 | if epoch==nepochs-1: 279 | print("\nMaximum # of epochs (" + str(nepochs)
+ ") is reached.\n") 280 | 281 | # rename output file 282 | newsuffix = suffix + str(epoch_best+1) 283 | os.rename(save_path + 'Accuracy' + suffix + '.txt', 284 | save_path + 'Accuracy' + newsuffix + '.txt') 285 | suffix = newsuffix 286 | 287 | # restore weights for best epoch 288 | EmbeddingModel.set_weights(weights_embed_best) 289 | PredictionModel.set_weights(weights_pred_best) 290 | 291 | # save best model states (with weights) 292 | EmbeddingModel.save(save_path + 'EmbeddingModel' + suffix + '.h5') 293 | PredictionModel.save(save_path + 'PredictionModel' + suffix + '.h5') 294 | 295 | # compute, save and output best iteration results 296 | Aembedding = getEmbedding(EmbeddingModel, 'A', Afeatures[np.concatenate((Atrain, Atest)),:]) # graph A node embeddings 297 | Alabels_output = getModelOutputs_fromEmbedding(PredictionModel, Aembedding) 298 | Alabels_predict = getModelPredictions(Alabels_output) if Alabels.shape[1]>1 else None 299 | epochs.append(epoch_best+1) 300 | acc_train.append(acc_best) 301 | acc_testA.append(computeAccuracy(Alabels[Atest,:], Alabels_output[Atest,:])) 302 | acc_results = dict(zip(['epochs', 'acc_train', 'acc_testA'], [epochs, acc_train, acc_testA])) 303 | best_results = dict(zip(['epochs', 'acc_train', 'acc_testA'], [epochs[-1], acc_train[-1], acc_testA[-1]])) 304 | print("\n---------------------------------------\nRestoring best model state from epoch #{:4d}:".format(epochs[-1])) 305 | if B is None: 306 | Bembedding = None 307 | Blabels_output = None 308 | Blabels_predict = None 309 | if Alabels.shape[1]>1: 310 | print("Train accuracy = {:.4f} (graph A)\t:\tTest accuracy = {:.4f} (graph A)".format(acc_train[-1], acc_testA[-1])) 311 | acc_txt.write("-"*24 + "\n{:4d}\t{:.4f}\t{:.4f}".format(epochs[-1], acc_train[-1], acc_testA[-1])) 312 | else: 313 | rsquared = computeRSquared(Alabels[Atest,:], Alabels_output[Atest,:], np.mean(Alabels[Atrain,:])) 314 | best_results['acc_rsquaredA'] = rsquared 315 | print("\tTrain accuracy = {:.4f} (graph 
A)\n\tTest accuracy = {:.4f} (graph A)\t:\tR-squared = {:.4f} (graph A)".format(acc_train[-1], acc_testA[-1], rsquared)) 316 | acc_txt.write("-"*32 + "\n{:4d}\t{:.4f}\t{:.4f}\t{:.4f}".format(epochs[-1], acc_train[-1], acc_testA[-1], rsquared)) 317 | else: 318 | Bembedding = getEmbedding(EmbeddingModel, 'B', Bfeatures) # graph B node embeddings 319 | Blabels_output = getModelOutputs_fromEmbedding(PredictionModel, Bembedding) 320 | Blabels_predict = getModelPredictions(Blabels_output) if Alabels.shape[1]>1 else None 321 | acc_testB.append(computeAccuracy(Blabels, Blabels_output)) 322 | acc_results['acc_testB'] = acc_testB 323 | best_results['acc_testB'] = acc_testB[-1] 324 | if Alabels.shape[1]>1: 325 | print("\tTrain accuracy = {:.4f} (graph A)\n\tTest accuracy = {:.4f} (graph A)\n\tTest accuracy = {:.4f} (graph B)".format(acc_train[-1], acc_testA[-1], acc_testB[-1])) 326 | acc_txt.write("-"*32 + "\n{:4d}\t{:.4f}\t{:.4f}\t{:.4f}".format(epochs[-1], acc_train[-1], acc_testA[-1], acc_testB[-1])) 327 | else: 328 | rsquaredA = computeRSquared(Alabels[Atest,:], Alabels_output[Atest,:], np.mean(Alabels[Atrain,:])) 329 | rsquaredB = computeRSquared(Blabels, Blabels_output, np.mean(Alabels[Atrain,:])) 330 | best_results['acc_rsquaredA'] = rsquaredA 331 | best_results['acc_rsquaredB'] = rsquaredB 332 | print("\tTrain accuracy = {:.4f} (graph A)\n\tTest accuracy = {:.4f} (graph A)\t:\tR-squared = {:.4f} (graph A)\n\tTest accuracy = {:.4f} (graph B)\t:\tR-squared = {:.4f} (graph B)".format(acc_train[-1], acc_testA[-1], rsquaredA, acc_testB[-1], rsquaredB)) 333 | acc_txt.write("-"*48 + "\n{:4d}\t{:.4f}\t{:.4f}\t{:.4f}\t{:.4f}\t{:.4f}".format(epochs[-1], acc_train[-1], acc_testA[-1], rsquaredA, acc_testB[-1], rsquaredB)) 334 | 335 | acc_txt.close() 336 | 337 | save_iteration_results(save_path, suffix, Atrain, Atest, Aembedding, Alabels_output, Alabels_predict, B, Bfeatures, 338 | Bembedding, Blabels, Blabels_output, Blabels_predict, Ptrue, P, acc_results) 339 | 340 | return 
best_results 341 | 342 | 343 | 344 | 345 | 346 | 347 | 348 | 349 | 350 | 351 | 352 | 353 | 354 | 355 | 356 | -------------------------------------------------------------------------------- /datasets/BP-2/GraphCA.txt: -------------------------------------------------------------------------------- 1 | 0 1 0 1 1 1 1 0 1 1 1 1 1 0 1 0 1 1 1 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 | 1 0 0 1 0 0 0 1 1 0 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 | 0 0 0 1 1 1 1 0 1 1 1 0 0 0 1 1 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 | 1 1 1 0 1 1 1 1 1 1 1 1 0 1 1 1 0 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5 | 1 0 1 1 0 1 0 1 1 0 1 1 1 0 1 1 0 1 1 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 | 1 0 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 1 0 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 7 | 1 0 1 1 0 1 0 0 0 1 0 1 1 1 1 1 0 0 1 0 1 0 1 1 1 0 1 0 0 0 0 0 1 1 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 8 | 0 1 0 1 1 1 0 0 0 0 1 0 0 1 1 0 0 0 1 1 0 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 9 | 1 1 1 1 1 1 0 0 0 1 1 1 0 1 1 1 1 0 0 1 1 1 1 1 1 1 0 0 0 0 0 0 
0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 10 | 1 0 1 1 0 1 1 0 1 0 1 1 1 1 1 1 1 1 0 0 1 0 1 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 11 | 1 1 1 1 1 1 0 1 1 1 0 1 1 1 1 1 1 1 0 0 1 1 1 1 1 0 0 0 0 1 1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 12 | 1 1 0 1 1 1 1 0 1 1 1 0 1 1 1 0 1 0 1 1 1 1 1 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 13 | 1 1 0 0 1 1 1 0 0 1 1 1 0 1 1 0 0 1 0 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 14 | 0 1 0 1 0 0 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 1 1 1 0 0 1 1 0 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 15 | 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 0 1 1 0 1 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 16 | 0 1 1 1 1 1 1 0 1 1 1 0 0 1 1 0 1 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 17 | 1 0 1 0 0 1 0 0 1 1 1 1 0 1 1 1 0 0 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 18 | 1 1 1 1 1 1 0 0 0 1 1 0 1 1 0 0 0 0 0 1 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 19 | 1 1 1 1 1 0 1 
1 0 0 0 1 0 1 1 0 1 0 0 1 1 1 1 1 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 20 | 1 1 0 1 1 0 0 1 1 0 0 1 1 1 1 0 1 1 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 21 | 0 1 1 1 1 0 1 0 1 1 1 1 1 1 0 1 1 1 1 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 22 | 1 1 1 1 0 1 0 1 1 0 1 1 1 0 1 1 1 1 1 0 1 0 1 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 23 | 0 1 1 1 1 0 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 24 | 1 1 0 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 1 0 1 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 25 | 0 1 1 0 1 1 1 1 1 0 1 0 0 1 1 0 1 1 1 0 1 1 0 1 0 0 0 1 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 26 | 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 1 1 0 1 1 0 0 0 1 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 27 | 0 0 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 28 | 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 1 1 1 1 0 0 1 0 0 0 1 0 0 1 0 1 0 1 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 29 | 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 1 0 0 1 0 1 1 0 1 1 0 0 0 0 1 1 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 30 | 0 1 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 1 1 0 0 0 1 0 0 0 0 1 0 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 31 | 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 0 1 0 0 1 0 1 0 1 1 1 0 1 1 0 1 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 32 | 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 1 0 1 1 0 1 1 0 0 0 1 0 1 1 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 33 | 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 1 0 0 0 1 0 0 0 1 1 1 0 0 0 1 0 1 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 34 | 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 1 1 0 0 0 1 0 0 1 1 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 35 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 1 1 0 1 1 1 0 1 1 1 0 1 0 1 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 36 | 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 0 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 37 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 0 1 0 1 1 1 0 1 1 0 0 1 0 1 0 1 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 38 | 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 1 0 0 1 1 0 1 0 0 0 0 1 1 0 0 1 0 0 1 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 39 | 0 0 0 0 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 1 0 0 0 1 1 0 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 40 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 1 1 0 1 0 1 1 0 0 1 1 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 41 | 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 0 1 0 1 0 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 42 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 1 0 0 1 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 43 | 0 0 0 0 0 1 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 1 1 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 44 | 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 0 0 1 1 0 1 0 0 0 1 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 45 | 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 1 0 1 1 1 0 0 1 0 1 0 0 0 0 1 1 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 46 | 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 0 0 0 0 0 0 1 0 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 47 | 1 1 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 0 1 0 0 0 0 0 0 0 1 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 48 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 1 1 1 0 
1 0 1 1 0 0 1 1 1 0 1 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 49 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 1 1 0 0 1 0 1 1 0 1 0 1 1 0 0 0 1 0 0 1 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 50 | 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 1 1 1 0 0 0 1 1 0 1 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 51 | 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 1 0 1 1 0 1 1 1 1 0 0 1 1 0 1 1 1 0 0 0 1 0 1 0 0 0 0 1 0 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 52 | 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 1 0 1 1 1 0 0 1 0 1 1 1 1 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 53 | 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 1 1 1 1 0 1 0 1 0 0 1 0 1 1 0 1 1 1 1 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 54 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 1 1 1 1 0 1 1 1 1 0 0 0 1 1 1 1 0 1 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 55 | 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0 1 1 0 1 0 0 0 0 1 0 0 1 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 56 | 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 0 1 1 0 1 1 1 0 1 0 1 1 0 1 0 1 1 1 1 1 1 0 1 0 0 0 1 0 0 0 0 1 0 0 0 1 0 0 1 0 0 1 1 0 0 0 0 0 57 | 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 1 1 1 1 1 0 0 1 0 1 1 0 1 0 0 0 1 0 1 1 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 58 | 0 0 0 0 0 0 0 0 0 
0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 0 1 0 0 1 1 1 1 1 1 1 1 0 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 59 | 0 1 0 0 0 1 0 0 0 0 0 0 1 0 1 1 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 1 1 1 0 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 60 | 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 1 1 0 1 0 1 0 0 1 1 0 1 1 1 1 1 1 0 0 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 61 | 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 0 0 1 0 1 1 1 1 0 1 1 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 62 | 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 1 1 0 0 1 1 0 1 0 0 0 0 0 1 1 1 0 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 63 | 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 1 0 0 1 1 1 0 1 1 1 1 0 0 1 0 1 1 1 1 1 0 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 64 | 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 1 1 1 0 1 1 1 0 1 0 0 1 0 0 1 1 0 1 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 1 1 1 1 0 0 0 65 | 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 1 1 0 0 0 0 0 1 1 1 1 0 0 0 0 0 1 0 0 0 1 1 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 66 | 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 1 1 1 1 1 1 0 0 0 1 0 1 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 67 | 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 1 0 1 1 1 1 1 0 0 1 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 
1 0 0 0 0 0 0 0 1 0 0 0 0 0 68 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 1 1 1 1 1 1 0 1 1 1 0 0 1 1 0 0 0 1 1 0 0 1 0 0 0 0 1 0 0 0 0 0 1 1 0 1 0 0 1 0 0 0 0 0 0 0 0 69 | 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 1 0 1 1 0 1 0 1 1 0 0 0 1 1 0 0 1 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 70 | 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 1 0 1 1 1 1 1 1 0 1 0 0 1 0 0 1 0 0 1 1 1 0 0 1 0 0 0 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 71 | 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 1 0 1 1 1 1 1 1 0 1 1 0 0 1 1 1 1 1 0 0 0 0 0 0 0 0 1 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 72 | 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 0 1 0 0 1 0 0 1 0 1 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 1 0 73 | 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 1 0 1 1 1 1 1 1 1 1 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 74 | 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 1 1 1 0 1 1 1 1 0 0 1 1 0 0 1 0 1 0 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 75 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 0 1 0 1 1 1 1 0 1 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 76 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 0 1 0 1 0 1 1 1 1 1 0 0 0 1 1 0 1 1 77 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 
0 1 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 1 1 0 1 1 0 0 1 0 1 1 1 0 1 1 1 0 0 0 78 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1 1 1 1 1 1 0 1 1 0 0 0 1 0 0 1 1 1 1 0 79 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 1 1 0 0 1 1 0 1 1 0 1 0 1 1 0 0 0 1 0 0 0 1 1 0 80 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 1 0 0 1 1 0 1 0 1 0 1 1 1 1 1 0 1 1 1 1 0 0 0 81 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 1 1 1 0 0 1 1 0 0 0 0 1 0 0 1 0 1 1 1 0 1 1 1 82 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 0 0 0 1 0 0 0 1 1 1 0 1 0 0 0 1 1 0 1 1 83 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 1 0 0 1 0 0 0 1 1 0 1 1 1 1 1 1 1 1 0 1 1 1 0 84 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 1 0 1 1 1 1 1 0 0 1 0 1 1 1 0 0 1 1 1 1 0 0 0 1 1 85 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 1 1 0 0 0 1 1 0 1 1 0 0 0 1 0 1 0 1 0 0 1 1 1 86 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 1 1 0 1 0 0 1 0 1 0 0 0 1 0 1 1 1 1 0 1 0 0 1 1 87 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 1 0 0 1 1 0 0 0 0 1 1 0 1 1 1 1 88 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 1 0 0 0 0 0 1 0 1 0 0 0 1 0 1 0 1 0 1 1 1 0 0 1 0 1 0 1 0 1 0 0 1 0 1 0 0 89 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 0 1 1 1 0 0 1 1 1 1 1 0 1 1 0 1 90 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 1 0 1 1 0 0 0 0 0 0 0 1 0 1 1 0 1 0 1 0 0 91 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 1 0 1 1 0 1 1 1 0 1 0 0 0 1 0 0 0 1 92 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 1 0 0 1 1 1 1 1 0 1 0 0 1 0 1 0 0 1 1 1 0 1 0 0 93 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 0 0 1 1 1 1 0 1 1 1 0 0 0 1 0 1 1 0 1 0 94 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 1 0 1 1 0 1 1 0 1 1 0 1 0 1 1 0 0 95 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 1 1 1 0 1 0 1 0 0 1 0 1 0 0 1 1 1 0 96 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 1 1 0 1 1 1 0 0 0 1 0 1 0 1 1 1 1 0 0 0 0 1 1 1 97 | 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 1 1 1 0 1 0 1 1 0 0 0 1 0 1 0 0 0 1 1 1 0 0 0 0 1 98 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 1 0 1 1 1 1 0 1 0 1 1 1 0 0 1 0 99 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 1 1 0 1 1 1 1 1 1 1 0 0 0 0 0 1 0 1 1 0 1 0 1 100 | 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 1 0 1 1 1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 101 | -------------------------------------------------------------------------------- /datasets/BP-2/GraphCB.txt: -------------------------------------------------------------------------------- 1 | 0 1 0 1 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 0 1 1 1 0 0 0 0 0 0 1 2 | 1 0 1 1 1 0 0 0 0 1 0 0 0 1 0 0 1 0 0 0 0 1 1 0 0 1 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 0 0 1 0 1 0 0 0 0 0 0 1 3 | 0 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 4 | 1 1 0 0 0 1 0 0 0 1 0 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 0 1 5 | 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 1 1 1 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 
0 0 0 0 0 6 | 0 0 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 7 | 0 0 1 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 0 0 1 0 0 1 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 8 | 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 1 0 0 0 0 0 1 0 0 0 1 1 1 0 0 0 0 0 1 0 0 0 0 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 9 | 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 1 0 0 0 1 1 1 0 0 0 0 0 1 1 0 1 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 10 | 1 1 0 1 0 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 1 1 0 0 0 0 1 1 1 11 | 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 1 1 0 0 1 0 0 0 0 1 1 0 1 0 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 1 1 0 1 12 | 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 1 0 0 0 1 0 1 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 13 | 0 0 0 0 0 0 1 0 0 1 1 0 0 0 1 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 1 1 1 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 1 0 0 0 0 14 | 1 1 0 1 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 0 1 0 0 1 0 1 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 0 0 0 1 1 0 0 0 0 0 0 1 15 | 0 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 1 
0 0 0 0 0 0 0 0 0 1 1 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 16 | 1 0 0 1 0 0 0 0 0 1 1 0 0 1 0 0 0 0 0 0 0 1 1 0 0 1 0 1 1 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 1 0 0 0 0 0 0 1 17 | 1 1 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 1 0 1 0 1 0 0 0 0 0 0 1 18 | 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 19 | 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 0 1 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 1 0 20 | 0 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 21 | 0 0 0 0 0 0 0 1 1 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 22 | 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 1 1 0 1 0 1 0 0 1 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 1 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0 1 1 1 0 0 0 0 0 0 1 23 | 0 1 0 0 0 0 0 0 0 1 1 0 0 1 0 1 1 0 0 0 0 1 0 0 0 0 0 1 1 0 1 1 1 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 0 1 1 1 0 0 1 0 0 0 0 24 | 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 1 0 1 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0 25 | 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 1 0 0 1 1 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 26 | 1 1 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 1 0 0 1 0 1 0 0 0 1 0 1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 27 | 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 28 | 1 0 0 0 1 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0 1 1 0 0 0 0 0 0 0 1 29 | 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 30 | 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 31 | 1 1 0 1 0 0 1 0 0 0 0 0 0 1 0 1 1 0 0 1 0 1 1 0 0 0 0 1 0 0 0 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 32 | 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 0 0 1 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 33 | 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 0 0 1 1 0 0 0 0 1 0 0 1 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 1 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 0 0 1 1 1 0 0 0 0 0 1 0 34 | 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 35 | 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 
1 0 0 0 0 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 0 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 1 0 0 0 0 0 0 1 36 | 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 37 | 1 1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 0 0 1 0 0 0 0 1 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 1 0 0 0 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 0 0 0 1 0 0 1 1 1 0 0 0 0 0 0 1 38 | 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 1 1 1 1 0 0 0 0 0 0 1 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 39 | 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 1 0 0 0 0 0 1 0 0 1 0 0 1 0 0 1 0 0 0 0 1 0 0 1 1 0 0 40 | 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 1 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 1 0 0 0 0 1 0 0 1 0 0 1 0 0 1 1 0 1 0 0 1 0 0 0 1 1 0 1 1 1 1 0 41 | 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 0 0 0 1 0 0 0 1 1 1 0 1 1 1 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 1 0 1 1 0 0 0 0 0 1 0 0 1 0 1 1 0 0 0 0 1 0 0 0 0 1 0 42 | 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 1 0 0 0 0 1 1 1 1 0 43 | 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 1 0 0 0 1 0 1 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 44 | 1 0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 1 1 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 1 1 0 1 1 0 1 0 0 1 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 0 1 1 0 0 
0 45 | 0 0 0 0 0 0 0 1 1 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 1 0 0 0 1 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 46 | 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 0 1 0 1 1 0 0 1 47 | 1 1 0 0 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 0 0 1 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 0 0 1 1 0 0 0 0 0 0 1 1 48 | 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 49 | 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1 0 0 0 0 0 0 1 0 1 0 1 0 0 1 0 0 1 0 0 0 0 1 1 0 0 0 0 50 | 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 1 0 0 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 51 | 0 0 0 0 0 0 0 1 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 1 1 0 0 0 1 0 1 0 0 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 52 | 1 1 0 1 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 0 1 0 1 0 1 0 1 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 1 1 0 0 0 0 1 0 0 53 | 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 0 1 0 0 0 0 1 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 54 | 0 1 0 1 0 0 0 0 0 1 1 0 0 1 0 0 1 0 0 0 0 1 1 0 0 1 0 1 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 1 
0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 1 1 55 | 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 56 | 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 0 1 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 57 | 0 0 0 0 0 0 0 1 1 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 0 1 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 58 | 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 1 0 0 0 1 1 1 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 59 | 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 60 | 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 61 | 0 0 0 0 0 0 0 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 0 1 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 1 0 1 1 0 0 1 0 0 1 0 0 1 0 0 1 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 0 0 62 | 1 0 0 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 0 0 0 0 1 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 1 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 1 0 0 1 1 0 0 63 | 0 1 0 1 0 0 0 0 0 1 1 0 0 1 0 0 1 0 0 0 0 1 1 0 0 0 0 1 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0 64 | 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 
0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 65 | 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 66 | 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 1 0 0 0 0 1 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 1 1 0 0 67 | 0 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 1 1 0 0 1 0 68 | 1 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 1 1 0 0 69 | 1 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 70 | 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 0 1 1 0 0 0 0 1 1 0 0 1 0 0 1 0 0 1 0 0 0 0 1 0 1 1 1 1 0 71 | 0 0 0 0 0 0 0 1 1 0 1 1 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 72 | 0 0 0 0 0 0 0 1 1 0 1 1 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 1 0 0 1 0 0 0 0 0 0 0 1 0 1 0 0 0 73 | 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 1 1 0 0 1 0 1 0 0 0 1 0 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 74 | 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 1 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 1 0 0 1 1 0 1 0 0 1 1 0 1 0 0 0 0 1 0 0 0 1 1 0 75 | 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 76 | 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 1 0 0 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 77 | 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 1 1 0 0 1 0 1 0 0 0 0 0 1 0 0 0 0 1 0 0 0 1 0 0 1 0 0 0 1 0 0 0 1 0 1 1 0 1 0 78 | 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 79 | 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 1 0 0 1 0 0 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 80 | 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0 81 | 1 1 0 1 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 1 0 0 1 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 0 1 0 0 1 0 0 0 1 82 | 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 83 | 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 1 1 1 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 1 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 1 0 0 0 84 
| 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 1 1 0 1 0 0 85 | 0 1 0 1 0 0 0 0 0 1 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 0 1 1 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 1 86 | 1 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 87 | 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 1 0 0 0 1 1 1 1 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 88 | 1 1 0 1 0 0 1 0 0 0 0 0 1 1 0 1 1 0 0 0 0 1 1 0 0 0 0 1 0 0 0 0 1 0 1 1 1 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 1 1 0 0 0 1 0 0 1 89 | 0 0 1 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 1 0 90 | 0 0 0 0 0 0 0 1 1 0 0 0 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 91 | 1 1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 0 0 1 1 0 0 0 0 1 0 0 0 0 1 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 1 1 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 1 1 0 0 0 0 0 0 1 92 | 1 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 0 0 1 0 0 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 1 0 1 0 0 0 0 0 0 1 93 | 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 1 1 0 0 0 1 1 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 1 0 0 1 0 0 0 1 0 0 1 0 0 1 1 0 0 0 1 0 0 0 1 94 | 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 1 0 0 1 1 1 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 1 0 0 0 1 0 0 1 0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 1 1 0 0 95 | 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 96 | 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 0 0 1 0 0 0 1 0 0 0 1 1 1 0 97 | 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 1 0 1 0 0 0 0 1 0 0 1 0 0 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 1 1 0 98 | 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 1 1 0 0 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 1 0 1 1 0 0 0 99 | 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 1 1 0 0 1 0 1 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 1 0 0 0 100 | 1 1 0 1 0 0 0 0 0 1 1 0 0 1 0 1 1 0 0 0 0 1 0 0 0 0 1 1 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 0 0 1 1 1 0 0 0 0 0 0 0 101 | --------------------------------------------------------------------------------