--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | MixHop and N-GCN
2 | ===========================
3 | [](https://paperswithcode.com/sota/node-classification-on-citeseer?p=mixhop-higher-order-graph-convolution)
4 | [](https://arxiv.org/abs/1905.00067)
5 | [](https://codebeat.co/projects/github-com-benedekrozemberczki-mixhop-and-n-gcn-master)
6 | [](https://github.com/benedekrozemberczki/MixHop-and-N-GCN/archive/master.zip)⠀[](https://twitter.com/intent/follow?screen_name=benrozemberczki)
7 |
8 | A **PyTorch** implementation of "MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing" (ICML 2019) and "A Higher-Order Graph Convolutional Layer" (NeurIPS 2018).
9 |
10 |
11 |
12 |
13 | ---------------------
14 |
15 | ### Abstract
16 |
17 |
18 | Recent methods generalize convolutional layers from Euclidean domains to graph-structured data by approximating the eigenbasis of the graph Laplacian. The computationally-efficient and broadly-used Graph ConvNet of Kipf & Welling over-simplifies the approximation, effectively rendering graph convolution as a neighborhood-averaging operator. This simplification restricts the model from learning delta operators, the very premise of the graph Laplacian. In this work, we propose a new Graph Convolutional layer which mixes multiple powers of the adjacency matrix, allowing it to learn delta operators. Our layer exhibits the same memory footprint and computational complexity as a GCN. We illustrate the strength of our proposed layer on both synthetic graph datasets and on several real-world citation graphs, setting the record state-of-the-art on Pubmed.
19 |
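To make the neighborhood-mixing idea above concrete, here is a minimal NumPy sketch (not the repository's implementation) of a MixHop-style layer: each adjacency power gets its own weight matrix, and the per-power representations are concatenated column-wise.

```python
import numpy as np

def mixhop_layer(A_hat, X, weights):
    """Sketch of a MixHop-style layer.
    A_hat   : (n, n) normalized adjacency matrix.
    X       : (n, d) node feature matrix.
    weights : list of (d, d_j) weight matrices, one per adjacency power j = 1..P.
    Returns the column-wise concatenation of ReLU(A_hat^j X W_j)."""
    outputs, propagated = [], X
    for W_j in weights:
        propagated = A_hat @ propagated            # after j steps this equals A_hat^j X
        outputs.append(np.maximum(propagated @ W_j, 0.0))
    return np.concatenate(outputs, axis=1)
```

The layers in `src/layers.py` implement the same idea with sparse matrix products, dropout and a trainable bias; the number of powers used equals the length of the `--layers-1`/`--layers-2` lists.
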
20 | This repository provides a PyTorch implementation of MixHop and N-GCN as described in the papers:
21 |
22 | > MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing
23 | > Sami Abu-El-Haija, Bryan Perozzi, Amol Kapoor, Hrayr Harutyunyan, Nazanin Alipourfard, Kristina Lerman, Greg Ver Steeg, and Aram Galstyan.
24 | > ICML, 2019.
25 | > [[Paper]](https://arxiv.org/pdf/1905.00067.pdf)
26 |
27 | > A Higher-Order Graph Convolutional Layer.
28 | > Sami A Abu-El-Haija, Bryan Perozzi, Amol Kapoor, Nazanin Alipourfard, Hrayr Harutyunyan.
29 | > NeurIPS, 2018.
30 | > [[Paper]](http://sami.haija.org/papers/high-order-gc-layer.pdf)
31 |
32 | The original TensorFlow implementation of MixHop is available [[Here]](https://github.com/samihaija/mixhop).
33 |
34 | ### Requirements
35 | The codebase is implemented in Python 3.5.2. Package versions used for development are listed below.
36 | ```
37 | networkx 2.4
38 | tqdm 4.28.1
39 | numpy 1.15.4
40 | pandas 0.23.4
41 | texttable 1.5.0
42 | scipy 1.1.0
43 | argparse 1.1.0
44 | torch 1.1.0
45 | torch-sparse 0.3.0
46 | ```
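
Assuming a matching Python environment, the pinned versions can be installed with pip, for example (note that `torch-sparse` may additionally require a build matching your local PyTorch/CUDA setup):
```sh
pip install networkx==2.4 tqdm==4.28.1 numpy==1.15.4 pandas==0.23.4 texttable==1.5.0 scipy==1.1.0 torch==1.1.0 torch-sparse==0.3.0
```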
47 | ### Datasets
48 |
49 | The code takes the **edge list** of the graph in a csv file. Every row indicates an edge between two nodes, separated by a comma. The first row is a header. Nodes should be indexed starting with 0. A sample graph for `Cora` is included in the `input/` directory. In addition to the edge list, there is a JSON file with the sparse features and a csv with the target variable.
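
For illustration, the first few rows of such an edge list could look as follows (the header names and edges shown here are hypothetical):
```
id_1,id_2
0,1
0,2
1,3
```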
50 |
51 | The **feature matrix** is a sparse binary matrix stored as a JSON file. Nodes are the keys of the JSON and feature indices are the values. For each node, the feature column ids are stored as elements of a list. The feature matrix is structured as:
52 |
53 | ```javascript
54 | { 0: [0, 1, 38, 1968, 2000, 52727],
55 | 1: [10000, 20, 3],
56 | 2: [],
57 | ...
58 | n: [2018, 10000]}
59 | ```
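
As a rough illustration only (the repository ships its own reader in `src/utils.py`), this JSON layout can be loaded into a SciPy sparse matrix like this:

```python
import json

import numpy as np
from scipy.sparse import coo_matrix

def load_sparse_features(path="input/cora_features.json"):
    """Illustrative loader for the JSON feature layout shown above."""
    with open(path) as f:
        data = json.load(f)
    row_idx, col_idx = [], []
    for node, feature_ids in data.items():
        row_idx.extend([int(node)] * len(feature_ids))
        col_idx.extend(feature_ids)
    n_nodes = max(int(node) for node in data) + 1
    n_features = max(col_idx) + 1 if col_idx else 0
    values = np.ones(len(row_idx))
    return coo_matrix((values, (row_idx, col_idx)), shape=(n_nodes, n_features))
```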
60 |
61 | The **target vector** is a csv with two columns and a header; the first column contains the node identifiers, the second the targets. This csv is sorted by node identifiers, and the target column contains the class memberships indexed from zero.
62 |
63 | | **NODE ID**| **Target** |
64 | | --- | --- |
65 | | 0 | 3 |
66 | | 1 | 1 |
67 | | 2 | 0 |
68 | | 3 | 1 |
69 | | ... | ... |
70 | | n | 3 |
71 |
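Reading such a target file is a one-liner with pandas (using the default path):

```python
import pandas as pd

# Class memberships as a NumPy array of shape (number of nodes,).
target = pd.read_csv("input/cora_target.csv")["target"].values
```
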
72 | ### Options
73 |
74 | Training an N-GCN/MixHop model is handled by the `src/main.py` script which provides the following command line arguments.
75 |
76 | #### Input and output options
77 | ```
78 | --edge-path STR Edge list csv. Default is `input/cora_edges.csv`.
79 | --features-path STR Features json. Default is `input/cora_features.json`.
80 | --target-path STR Target classes csv. Default is `input/cora_target.csv`.
81 | ```
82 | #### Model options
83 | ```
84 | --model STR Model variant. Default is `mixhop`.
85 | --seed INT Random seed. Default is 42.
86 | --epochs INT Number of training epochs. Default is 2000.
87 | --early-stopping INT Early stopping rounds. Default is 10.
88 | --training-size INT Training set size. Default is 1500.
89 | --validation-size INT Validation set size. Default is 500.
90 | --learning-rate FLOAT Adam learning rate. Default is 0.01.
91 | --dropout FLOAT Dropout rate value. Default is 0.5.
92 | --lambd FLOAT Regularization coefficient. Default is 0.0005.
93 | --layers-1 LST Layer sizes (upstream). Default is [200, 200, 200].
94 | --layers-2 LST Layer sizes (bottom). Default is [200, 200, 200].
95 | --cut-off FLOAT Norm cut-off for pruning. Default is 0.1.
96 | --budget INT Architecture neuron budget. Default is 60.
97 | ```
98 | ### Examples
99 |
100 | The following commands learn a neural network and score it on the test set. Training a model on the default dataset.
101 |
102 | ```sh
103 | $ python src/main.py
104 | ```
105 |
106 |
107 |
108 |
109 | Training a MixHop model for 100 epochs.
110 | ```sh
111 | $ python src/main.py --epochs 100
112 | ```
113 | Increasing the learning rate and the dropout.
114 | ```sh
115 | $ python src/main.py --learning-rate 0.1 --dropout 0.9
116 | ```
117 | Training a model with diffusion order 2:
118 | ```sh
119 | $ python src/main.py --layers-1 64 64 --layers-2 64 64
120 | ```
121 | Training an N-GCN model:
122 | ```sh
123 | $ python src/main.py --model ngcn
124 | ```
125 |
126 |
127 | --------------------------------------------------------------------------------
128 |
129 | **License**
130 |
131 | - [GNU](https://github.com/benedekrozemberczki/MixHop-and-N-GCN/blob/master/LICENSE)
132 |
133 | --------------------------------------------------------------------------------
134 |
--------------------------------------------------------------------------------
/input/cora_target.csv:
--------------------------------------------------------------------------------
1 | id,target
2 | 0,3
3 | 1,4
4 | 2,4
5 | 3,0
6 | 4,3
7 | 5,2
8 | 6,0
9 | 7,3
10 | 8,3
11 | 9,2
12 | 10,0
13 | 11,0
14 | 12,4
15 | 13,3
16 | 14,3
17 | 15,3
18 | 16,2
19 | 17,3
20 | 18,1
21 | 19,3
22 | 20,5
23 | 21,3
24 | 22,4
25 | 23,6
26 | 24,3
27 | 25,3
28 | 26,6
29 | 27,3
30 | 28,2
31 | 29,4
32 | 30,3
33 | 31,6
34 | 32,0
35 | 33,4
36 | 34,2
37 | 35,0
38 | 36,1
39 | 37,5
40 | 38,4
41 | 39,4
42 | 40,3
43 | 41,6
44 | 42,6
45 | 43,4
46 | 44,3
47 | 45,3
48 | 46,2
49 | 47,5
50 | 48,3
51 | 49,4
52 | 50,5
53 | 51,3
54 | 52,0
55 | 53,2
56 | 54,1
57 | 55,4
58 | 56,6
59 | 57,3
60 | 58,2
61 | 59,2
62 | 60,0
63 | 61,0
64 | 62,0
65 | 63,4
66 | 64,2
67 | 65,0
68 | 66,4
69 | 67,5
70 | 68,2
71 | 69,6
72 | 70,5
73 | 71,2
74 | 72,2
75 | 73,2
76 | 74,0
77 | 75,4
78 | 76,5
79 | 77,6
80 | 78,4
81 | 79,0
82 | 80,0
83 | 81,0
84 | 82,4
85 | 83,2
86 | 84,4
87 | 85,1
88 | 86,4
89 | 87,6
90 | 88,0
91 | 89,4
92 | 90,2
93 | 91,4
94 | 92,6
95 | 93,6
96 | 94,0
97 | 95,0
98 | 96,6
99 | 97,5
100 | 98,0
101 | 99,6
102 | 100,0
103 | 101,2
104 | 102,1
105 | 103,1
106 | 104,1
107 | 105,2
108 | 106,6
109 | 107,5
110 | 108,6
111 | 109,1
112 | 110,2
113 | 111,2
114 | 112,1
115 | 113,5
116 | 114,5
117 | 115,5
118 | 116,6
119 | 117,5
120 | 118,6
121 | 119,5
122 | 120,5
123 | 121,1
124 | 122,6
125 | 123,6
126 | 124,1
127 | 125,5
128 | 126,1
129 | 127,6
130 | 128,5
131 | 129,5
132 | 130,5
133 | 131,1
134 | 132,5
135 | 133,1
136 | 134,1
137 | 135,1
138 | 136,1
139 | 137,1
140 | 138,1
141 | 139,1
142 | 140,4
143 | 141,3
144 | 142,0
145 | 143,3
146 | 144,6
147 | 145,6
148 | 146,0
149 | 147,3
150 | 148,4
151 | 149,0
152 | 150,3
153 | 151,4
154 | 152,4
155 | 153,1
156 | 154,2
157 | 155,2
158 | 156,2
159 | 157,3
160 | 158,3
161 | 159,3
162 | 160,3
163 | 161,0
164 | 162,4
165 | 163,5
166 | 164,0
167 | 165,3
168 | 166,4
169 | 167,3
170 | 168,3
171 | 169,3
172 | 170,2
173 | 171,3
174 | 172,3
175 | 173,2
176 | 174,2
177 | 175,6
178 | 176,1
179 | 177,4
180 | 178,3
181 | 179,3
182 | 180,3
183 | 181,6
184 | 182,3
185 | 183,3
186 | 184,3
187 | 185,3
188 | 186,0
189 | 187,4
190 | 188,2
191 | 189,2
192 | 190,6
193 | 191,5
194 | 192,3
195 | 193,5
196 | 194,4
197 | 195,0
198 | 196,4
199 | 197,3
200 | 198,4
201 | 199,4
202 | 200,3
203 | 201,3
204 | 202,2
205 | 203,4
206 | 204,0
207 | 205,3
208 | 206,2
209 | 207,3
210 | 208,3
211 | 209,4
212 | 210,4
213 | 211,0
214 | 212,3
215 | 213,6
216 | 214,0
217 | 215,3
218 | 216,3
219 | 217,4
220 | 218,3
221 | 219,3
222 | 220,5
223 | 221,2
224 | 222,3
225 | 223,2
226 | 224,4
227 | 225,1
228 | 226,3
229 | 227,2
230 | 228,2
231 | 229,3
232 | 230,3
233 | 231,3
234 | 232,3
235 | 233,5
236 | 234,1
237 | 235,3
238 | 236,1
239 | 237,3
240 | 238,5
241 | 239,0
242 | 240,3
243 | 241,5
244 | 242,0
245 | 243,4
246 | 244,2
247 | 245,4
248 | 246,2
249 | 247,4
250 | 248,4
251 | 249,5
252 | 250,4
253 | 251,3
254 | 252,5
255 | 253,3
256 | 254,3
257 | 255,4
258 | 256,3
259 | 257,0
260 | 258,4
261 | 259,5
262 | 260,0
263 | 261,3
264 | 262,6
265 | 263,2
266 | 264,5
267 | 265,5
268 | 266,5
269 | 267,3
270 | 268,2
271 | 269,3
272 | 270,0
273 | 271,4
274 | 272,5
275 | 273,3
276 | 274,0
277 | 275,4
278 | 276,0
279 | 277,3
280 | 278,3
281 | 279,0
282 | 280,0
283 | 281,3
284 | 282,5
285 | 283,4
286 | 284,4
287 | 285,3
288 | 286,4
289 | 287,3
290 | 288,3
291 | 289,2
292 | 290,2
293 | 291,3
294 | 292,0
295 | 293,3
296 | 294,1
297 | 295,3
298 | 296,2
299 | 297,3
300 | 298,3
301 | 299,4
302 | 300,5
303 | 301,2
304 | 302,1
305 | 303,1
306 | 304,0
307 | 305,0
308 | 306,1
309 | 307,6
310 | 308,1
311 | 309,3
312 | 310,3
313 | 311,3
314 | 312,2
315 | 313,3
316 | 314,3
317 | 315,0
318 | 316,3
319 | 317,4
320 | 318,1
321 | 319,3
322 | 320,4
323 | 321,3
324 | 322,2
325 | 323,0
326 | 324,0
327 | 325,4
328 | 326,2
329 | 327,3
330 | 328,2
331 | 329,1
332 | 330,4
333 | 331,6
334 | 332,3
335 | 333,2
336 | 334,0
337 | 335,3
338 | 336,3
339 | 337,2
340 | 338,3
341 | 339,4
342 | 340,4
343 | 341,2
344 | 342,1
345 | 343,3
346 | 344,5
347 | 345,3
348 | 346,2
349 | 347,0
350 | 348,4
351 | 349,5
352 | 350,1
353 | 351,3
354 | 352,3
355 | 353,2
356 | 354,0
357 | 355,2
358 | 356,4
359 | 357,2
360 | 358,2
361 | 359,2
362 | 360,5
363 | 361,4
364 | 362,4
365 | 363,2
366 | 364,2
367 | 365,0
368 | 366,3
369 | 367,2
370 | 368,4
371 | 369,4
372 | 370,5
373 | 371,5
374 | 372,1
375 | 373,0
376 | 374,3
377 | 375,4
378 | 376,5
379 | 377,3
380 | 378,4
381 | 379,5
382 | 380,3
383 | 381,4
384 | 382,3
385 | 383,3
386 | 384,1
387 | 385,4
388 | 386,3
389 | 387,3
390 | 388,5
391 | 389,2
392 | 390,3
393 | 391,2
394 | 392,5
395 | 393,5
396 | 394,4
397 | 395,3
398 | 396,3
399 | 397,3
400 | 398,3
401 | 399,1
402 | 400,5
403 | 401,3
404 | 402,3
405 | 403,2
406 | 404,6
407 | 405,0
408 | 406,1
409 | 407,3
410 | 408,0
411 | 409,1
412 | 410,5
413 | 411,3
414 | 412,6
415 | 413,3
416 | 414,6
417 | 415,0
418 | 416,3
419 | 417,3
420 | 418,3
421 | 419,5
422 | 420,4
423 | 421,3
424 | 422,4
425 | 423,0
426 | 424,5
427 | 425,2
428 | 426,1
429 | 427,2
430 | 428,4
431 | 429,4
432 | 430,4
433 | 431,4
434 | 432,3
435 | 433,3
436 | 434,0
437 | 435,4
438 | 436,3
439 | 437,0
440 | 438,5
441 | 439,2
442 | 440,0
443 | 441,5
444 | 442,4
445 | 443,4
446 | 444,4
447 | 445,3
448 | 446,0
449 | 447,6
450 | 448,5
451 | 449,2
452 | 450,4
453 | 451,5
454 | 452,1
455 | 453,3
456 | 454,5
457 | 455,3
458 | 456,0
459 | 457,3
460 | 458,5
461 | 459,1
462 | 460,1
463 | 461,0
464 | 462,3
465 | 463,4
466 | 464,2
467 | 465,6
468 | 466,2
469 | 467,0
470 | 468,5
471 | 469,3
472 | 470,4
473 | 471,6
474 | 472,5
475 | 473,3
476 | 474,5
477 | 475,0
478 | 476,1
479 | 477,3
480 | 478,0
481 | 479,5
482 | 480,2
483 | 481,2
484 | 482,3
485 | 483,5
486 | 484,1
487 | 485,0
488 | 486,3
489 | 487,1
490 | 488,4
491 | 489,2
492 | 490,5
493 | 491,6
494 | 492,4
495 | 493,2
496 | 494,2
497 | 495,6
498 | 496,0
499 | 497,0
500 | 498,4
501 | 499,6
502 | 500,3
503 | 501,2
504 | 502,0
505 | 503,3
506 | 504,6
507 | 505,1
508 | 506,6
509 | 507,3
510 | 508,1
511 | 509,3
512 | 510,3
513 | 511,3
514 | 512,3
515 | 513,2
516 | 514,5
517 | 515,4
518 | 516,5
519 | 517,5
520 | 518,3
521 | 519,1
522 | 520,3
523 | 521,3
524 | 522,4
525 | 523,4
526 | 524,2
527 | 525,0
528 | 526,2
529 | 527,0
530 | 528,5
531 | 529,4
532 | 530,0
533 | 531,0
534 | 532,3
535 | 533,2
536 | 534,2
537 | 535,2
538 | 536,2
539 | 537,6
540 | 538,4
541 | 539,6
542 | 540,5
543 | 541,5
544 | 542,1
545 | 543,0
546 | 544,0
547 | 545,4
548 | 546,3
549 | 547,3
550 | 548,1
551 | 549,3
552 | 550,6
553 | 551,6
554 | 552,2
555 | 553,3
556 | 554,3
557 | 555,3
558 | 556,1
559 | 557,2
560 | 558,2
561 | 559,5
562 | 560,4
563 | 561,3
564 | 562,2
565 | 563,1
566 | 564,2
567 | 565,2
568 | 566,3
569 | 567,2
570 | 568,3
571 | 569,2
572 | 570,3
573 | 571,3
574 | 572,0
575 | 573,5
576 | 574,3
577 | 575,3
578 | 576,3
579 | 577,4
580 | 578,5
581 | 579,3
582 | 580,2
583 | 581,1
584 | 582,4
585 | 583,4
586 | 584,4
587 | 585,4
588 | 586,0
589 | 587,5
590 | 588,4
591 | 589,1
592 | 590,3
593 | 591,0
594 | 592,3
595 | 593,4
596 | 594,6
597 | 595,3
598 | 596,6
599 | 597,3
600 | 598,3
601 | 599,3
602 | 600,6
603 | 601,3
604 | 602,4
605 | 603,3
606 | 604,6
607 | 605,3
608 | 606,0
609 | 607,3
610 | 608,1
611 | 609,2
612 | 610,5
613 | 611,6
614 | 612,5
615 | 613,2
616 | 614,0
617 | 615,2
618 | 616,2
619 | 617,3
620 | 618,3
621 | 619,0
622 | 620,3
623 | 621,5
624 | 622,3
625 | 623,4
626 | 624,0
627 | 625,3
628 | 626,2
629 | 627,4
630 | 628,5
631 | 629,2
632 | 630,3
633 | 631,2
634 | 632,2
635 | 633,3
636 | 634,5
637 | 635,2
638 | 636,0
639 | 637,3
640 | 638,4
641 | 639,3
642 | 640,3
643 | 641,3
644 | 642,0
645 | 643,5
646 | 644,5
647 | 645,5
648 | 646,5
649 | 647,5
650 | 648,5
651 | 649,3
652 | 650,2
653 | 651,0
654 | 652,4
655 | 653,3
656 | 654,4
657 | 655,1
658 | 656,1
659 | 657,2
660 | 658,3
661 | 659,0
662 | 660,1
663 | 661,5
664 | 662,3
665 | 663,6
666 | 664,3
667 | 665,4
668 | 666,0
669 | 667,0
670 | 668,5
671 | 669,3
672 | 670,3
673 | 671,5
674 | 672,2
675 | 673,3
676 | 674,3
677 | 675,4
678 | 676,5
679 | 677,4
680 | 678,3
681 | 679,0
682 | 680,0
683 | 681,3
684 | 682,6
685 | 683,1
686 | 684,2
687 | 685,1
688 | 686,2
689 | 687,2
690 | 688,4
691 | 689,2
692 | 690,3
693 | 691,4
694 | 692,3
695 | 693,0
696 | 694,5
697 | 695,3
698 | 696,3
699 | 697,3
700 | 698,4
701 | 699,3
702 | 700,3
703 | 701,5
704 | 702,6
705 | 703,5
706 | 704,2
707 | 705,4
708 | 706,4
709 | 707,0
710 | 708,3
711 | 709,5
712 | 710,3
713 | 711,0
714 | 712,6
715 | 713,3
716 | 714,4
717 | 715,4
718 | 716,3
719 | 717,0
720 | 718,0
721 | 719,1
722 | 720,5
723 | 721,2
724 | 722,3
725 | 723,2
726 | 724,6
727 | 725,0
728 | 726,4
729 | 727,3
730 | 728,5
731 | 729,3
732 | 730,0
733 | 731,0
734 | 732,2
735 | 733,0
736 | 734,0
737 | 735,5
738 | 736,0
739 | 737,5
740 | 738,0
741 | 739,5
742 | 740,4
743 | 741,1
744 | 742,2
745 | 743,3
746 | 744,2
747 | 745,3
748 | 746,3
749 | 747,5
750 | 748,2
751 | 749,4
752 | 750,5
753 | 751,0
754 | 752,2
755 | 753,0
756 | 754,2
757 | 755,5
758 | 756,3
759 | 757,2
760 | 758,2
761 | 759,4
762 | 760,2
763 | 761,4
764 | 762,2
765 | 763,0
766 | 764,2
767 | 765,3
768 | 766,3
769 | 767,0
770 | 768,3
771 | 769,0
772 | 770,3
773 | 771,0
774 | 772,6
775 | 773,1
776 | 774,4
777 | 775,3
778 | 776,4
779 | 777,0
780 | 778,6
781 | 779,6
782 | 780,4
783 | 781,3
784 | 782,4
785 | 783,4
786 | 784,3
787 | 785,3
788 | 786,4
789 | 787,4
790 | 788,3
791 | 789,4
792 | 790,3
793 | 791,3
794 | 792,3
795 | 793,5
796 | 794,0
797 | 795,3
798 | 796,2
799 | 797,2
800 | 798,4
801 | 799,3
802 | 800,2
803 | 801,5
804 | 802,4
805 | 803,5
806 | 804,4
807 | 805,4
808 | 806,2
809 | 807,5
810 | 808,4
811 | 809,0
812 | 810,4
813 | 811,3
814 | 812,3
815 | 813,4
816 | 814,4
817 | 815,0
818 | 816,5
819 | 817,2
820 | 818,3
821 | 819,2
822 | 820,2
823 | 821,3
824 | 822,5
825 | 823,2
826 | 824,2
827 | 825,2
828 | 826,5
829 | 827,3
830 | 828,4
831 | 829,1
832 | 830,6
833 | 831,1
834 | 832,3
835 | 833,3
836 | 834,1
837 | 835,3
838 | 836,3
839 | 837,4
840 | 838,0
841 | 839,0
842 | 840,5
843 | 841,3
844 | 842,0
845 | 843,3
846 | 844,5
847 | 845,3
848 | 846,3
849 | 847,6
850 | 848,2
851 | 849,4
852 | 850,6
853 | 851,0
854 | 852,0
855 | 853,2
856 | 854,4
857 | 855,3
858 | 856,4
859 | 857,4
860 | 858,0
861 | 859,2
862 | 860,2
863 | 861,0
864 | 862,4
865 | 863,0
866 | 864,1
867 | 865,3
868 | 866,3
869 | 867,2
870 | 868,3
871 | 869,3
872 | 870,3
873 | 871,2
874 | 872,4
875 | 873,0
876 | 874,3
877 | 875,3
878 | 876,1
879 | 877,3
880 | 878,5
881 | 879,3
882 | 880,0
883 | 881,2
884 | 882,2
885 | 883,2
886 | 884,4
887 | 885,5
888 | 886,3
889 | 887,1
890 | 888,0
891 | 889,2
892 | 890,5
893 | 891,6
894 | 892,3
895 | 893,4
896 | 894,3
897 | 895,0
898 | 896,5
899 | 897,0
900 | 898,6
901 | 899,3
902 | 900,3
903 | 901,0
904 | 902,2
905 | 903,5
906 | 904,5
907 | 905,2
908 | 906,4
909 | 907,6
910 | 908,6
911 | 909,3
912 | 910,1
913 | 911,4
914 | 912,4
915 | 913,5
916 | 914,3
917 | 915,2
918 | 916,3
919 | 917,0
920 | 918,3
921 | 919,2
922 | 920,3
923 | 921,6
924 | 922,4
925 | 923,3
926 | 924,4
927 | 925,5
928 | 926,3
929 | 927,3
930 | 928,3
931 | 929,2
932 | 930,3
933 | 931,2
934 | 932,3
935 | 933,2
936 | 934,4
937 | 935,5
938 | 936,2
939 | 937,1
940 | 938,3
941 | 939,6
942 | 940,5
943 | 941,5
944 | 942,3
945 | 943,4
946 | 944,3
947 | 945,1
948 | 946,4
949 | 947,4
950 | 948,0
951 | 949,4
952 | 950,6
953 | 951,2
954 | 952,3
955 | 953,3
956 | 954,4
957 | 955,6
958 | 956,4
959 | 957,2
960 | 958,1
961 | 959,3
962 | 960,3
963 | 961,3
964 | 962,3
965 | 963,4
966 | 964,0
967 | 965,0
968 | 966,0
969 | 967,3
970 | 968,1
971 | 969,2
972 | 970,2
973 | 971,5
974 | 972,3
975 | 973,5
976 | 974,3
977 | 975,0
978 | 976,2
979 | 977,2
980 | 978,2
981 | 979,3
982 | 980,1
983 | 981,3
984 | 982,3
985 | 983,4
986 | 984,4
987 | 985,2
988 | 986,3
989 | 987,3
990 | 988,3
991 | 989,0
992 | 990,3
993 | 991,6
994 | 992,0
995 | 993,6
996 | 994,3
997 | 995,5
998 | 996,4
999 | 997,3
1000 | 998,2
1001 | 999,2
1002 | 1000,3
1003 | 1001,4
1004 | 1002,3
1005 | 1003,2
1006 | 1004,3
1007 | 1005,3
1008 | 1006,0
1009 | 1007,2
1010 | 1008,0
1011 | 1009,1
1012 | 1010,4
1013 | 1011,1
1014 | 1012,4
1015 | 1013,0
1016 | 1014,3
1017 | 1015,4
1018 | 1016,3
1019 | 1017,3
1020 | 1018,4
1021 | 1019,3
1022 | 1020,3
1023 | 1021,4
1024 | 1022,5
1025 | 1023,3
1026 | 1024,3
1027 | 1025,0
1028 | 1026,3
1029 | 1027,6
1030 | 1028,5
1031 | 1029,5
1032 | 1030,2
1033 | 1031,3
1034 | 1032,5
1035 | 1033,2
1036 | 1034,2
1037 | 1035,2
1038 | 1036,0
1039 | 1037,2
1040 | 1038,2
1041 | 1039,5
1042 | 1040,2
1043 | 1041,2
1044 | 1042,0
1045 | 1043,5
1046 | 1044,3
1047 | 1045,1
1048 | 1046,4
1049 | 1047,0
1050 | 1048,3
1051 | 1049,3
1052 | 1050,4
1053 | 1051,4
1054 | 1052,3
1055 | 1053,3
1056 | 1054,3
1057 | 1055,3
1058 | 1056,3
1059 | 1057,3
1060 | 1058,0
1061 | 1059,3
1062 | 1060,5
1063 | 1061,4
1064 | 1062,3
1065 | 1063,4
1066 | 1064,4
1067 | 1065,3
1068 | 1066,3
1069 | 1067,2
1070 | 1068,4
1071 | 1069,0
1072 | 1070,2
1073 | 1071,4
1074 | 1072,2
1075 | 1073,3
1076 | 1074,6
1077 | 1075,3
1078 | 1076,6
1079 | 1077,5
1080 | 1078,0
1081 | 1079,0
1082 | 1080,3
1083 | 1081,4
1084 | 1082,4
1085 | 1083,0
1086 | 1084,3
1087 | 1085,6
1088 | 1086,3
1089 | 1087,4
1090 | 1088,1
1091 | 1089,1
1092 | 1090,3
1093 | 1091,3
1094 | 1092,3
1095 | 1093,3
1096 | 1094,4
1097 | 1095,3
1098 | 1096,3
1099 | 1097,4
1100 | 1098,3
1101 | 1099,3
1102 | 1100,3
1103 | 1101,3
1104 | 1102,4
1105 | 1103,2
1106 | 1104,0
1107 | 1105,5
1108 | 1106,3
1109 | 1107,3
1110 | 1108,3
1111 | 1109,4
1112 | 1110,0
1113 | 1111,4
1114 | 1112,4
1115 | 1113,5
1116 | 1114,2
1117 | 1115,4
1118 | 1116,3
1119 | 1117,0
1120 | 1118,0
1121 | 1119,3
1122 | 1120,0
1123 | 1121,3
1124 | 1122,5
1125 | 1123,2
1126 | 1124,3
1127 | 1125,0
1128 | 1126,3
1129 | 1127,3
1130 | 1128,5
1131 | 1129,4
1132 | 1130,3
1133 | 1131,3
1134 | 1132,3
1135 | 1133,5
1136 | 1134,3
1137 | 1135,4
1138 | 1136,2
1139 | 1137,0
1140 | 1138,4
1141 | 1139,0
1142 | 1140,1
1143 | 1141,4
1144 | 1142,1
1145 | 1143,4
1146 | 1144,1
1147 | 1145,2
1148 | 1146,1
1149 | 1147,3
1150 | 1148,2
1151 | 1149,2
1152 | 1150,2
1153 | 1151,3
1154 | 1152,0
1155 | 1153,4
1156 | 1154,2
1157 | 1155,2
1158 | 1156,0
1159 | 1157,4
1160 | 1158,1
1161 | 1159,3
1162 | 1160,3
1163 | 1161,2
1164 | 1162,4
1165 | 1163,6
1166 | 1164,2
1167 | 1165,6
1168 | 1166,3
1169 | 1167,5
1170 | 1168,5
1171 | 1169,2
1172 | 1170,6
1173 | 1171,3
1174 | 1172,0
1175 | 1173,2
1176 | 1174,0
1177 | 1175,3
1178 | 1176,3
1179 | 1177,3
1180 | 1178,4
1181 | 1179,5
1182 | 1180,1
1183 | 1181,5
1184 | 1182,5
1185 | 1183,5
1186 | 1184,5
1187 | 1185,3
1188 | 1186,3
1189 | 1187,0
1190 | 1188,0
1191 | 1189,2
1192 | 1190,5
1193 | 1191,3
1194 | 1192,3
1195 | 1193,1
1196 | 1194,4
1197 | 1195,0
1198 | 1196,4
1199 | 1197,1
1200 | 1198,0
1201 | 1199,2
1202 | 1200,3
1203 | 1201,3
1204 | 1202,4
1205 | 1203,0
1206 | 1204,1
1207 | 1205,2
1208 | 1206,4
1209 | 1207,4
1210 | 1208,4
1211 | 1209,2
1212 | 1210,2
1213 | 1211,3
1214 | 1212,3
1215 | 1213,3
1216 | 1214,2
1217 | 1215,6
1218 | 1216,2
1219 | 1217,3
1220 | 1218,0
1221 | 1219,3
1222 | 1220,0
1223 | 1221,3
1224 | 1222,5
1225 | 1223,3
1226 | 1224,0
1227 | 1225,3
1228 | 1226,5
1229 | 1227,5
1230 | 1228,0
1231 | 1229,2
1232 | 1230,4
1233 | 1231,3
1234 | 1232,0
1235 | 1233,2
1236 | 1234,4
1237 | 1235,4
1238 | 1236,6
1239 | 1237,5
1240 | 1238,2
1241 | 1239,3
1242 | 1240,4
1243 | 1241,3
1244 | 1242,3
1245 | 1243,2
1246 | 1244,1
1247 | 1245,1
1248 | 1246,4
1249 | 1247,3
1250 | 1248,1
1251 | 1249,2
1252 | 1250,2
1253 | 1251,1
1254 | 1252,2
1255 | 1253,1
1256 | 1254,2
1257 | 1255,4
1258 | 1256,3
1259 | 1257,4
1260 | 1258,1
1261 | 1259,0
1262 | 1260,4
1263 | 1261,4
1264 | 1262,2
1265 | 1263,2
1266 | 1264,4
1267 | 1265,4
1268 | 1266,4
1269 | 1267,5
1270 | 1268,0
1271 | 1269,5
1272 | 1270,3
1273 | 1271,3
1274 | 1272,3
1275 | 1273,3
1276 | 1274,3
1277 | 1275,0
1278 | 1276,5
1279 | 1277,3
1280 | 1278,3
1281 | 1279,0
1282 | 1280,2
1283 | 1281,2
1284 | 1282,2
1285 | 1283,1
1286 | 1284,2
1287 | 1285,0
1288 | 1286,4
1289 | 1287,2
1290 | 1288,6
1291 | 1289,3
1292 | 1290,3
1293 | 1291,6
1294 | 1292,2
1295 | 1293,0
1296 | 1294,3
1297 | 1295,3
1298 | 1296,0
1299 | 1297,3
1300 | 1298,3
1301 | 1299,3
1302 | 1300,3
1303 | 1301,3
1304 | 1302,0
1305 | 1303,3
1306 | 1304,1
1307 | 1305,2
1308 | 1306,2
1309 | 1307,4
1310 | 1308,2
1311 | 1309,5
1312 | 1310,3
1313 | 1311,5
1314 | 1312,5
1315 | 1313,5
1316 | 1314,5
1317 | 1315,3
1318 | 1316,3
1319 | 1317,2
1320 | 1318,4
1321 | 1319,3
1322 | 1320,4
1323 | 1321,3
1324 | 1322,4
1325 | 1323,3
1326 | 1324,5
1327 | 1325,3
1328 | 1326,3
1329 | 1327,6
1330 | 1328,6
1331 | 1329,3
1332 | 1330,0
1333 | 1331,3
1334 | 1332,0
1335 | 1333,6
1336 | 1334,3
1337 | 1335,1
1338 | 1336,4
1339 | 1337,1
1340 | 1338,5
1341 | 1339,2
1342 | 1340,3
1343 | 1341,0
1344 | 1342,4
1345 | 1343,4
1346 | 1344,3
1347 | 1345,2
1348 | 1346,1
1349 | 1347,3
1350 | 1348,3
1351 | 1349,4
1352 | 1350,4
1353 | 1351,6
1354 | 1352,0
1355 | 1353,5
1356 | 1354,5
1357 | 1355,3
1358 | 1356,3
1359 | 1357,0
1360 | 1358,2
1361 | 1359,6
1362 | 1360,5
1363 | 1361,2
1364 | 1362,6
1365 | 1363,3
1366 | 1364,3
1367 | 1365,3
1368 | 1366,4
1369 | 1367,1
1370 | 1368,5
1371 | 1369,4
1372 | 1370,6
1373 | 1371,3
1374 | 1372,6
1375 | 1373,2
1376 | 1374,0
1377 | 1375,5
1378 | 1376,0
1379 | 1377,5
1380 | 1378,2
1381 | 1379,4
1382 | 1380,4
1383 | 1381,4
1384 | 1382,3
1385 | 1383,2
1386 | 1384,2
1387 | 1385,4
1388 | 1386,3
1389 | 1387,6
1390 | 1388,0
1391 | 1389,2
1392 | 1390,4
1393 | 1391,0
1394 | 1392,3
1395 | 1393,3
1396 | 1394,5
1397 | 1395,0
1398 | 1396,6
1399 | 1397,0
1400 | 1398,2
1401 | 1399,6
1402 | 1400,3
1403 | 1401,4
1404 | 1402,6
1405 | 1403,3
1406 | 1404,5
1407 | 1405,3
1408 | 1406,4
1409 | 1407,2
1410 | 1408,5
1411 | 1409,5
1412 | 1410,0
1413 | 1411,3
1414 | 1412,2
1415 | 1413,3
1416 | 1414,5
1417 | 1415,5
1418 | 1416,0
1419 | 1417,4
1420 | 1418,4
1421 | 1419,4
1422 | 1420,6
1423 | 1421,6
1424 | 1422,4
1425 | 1423,3
1426 | 1424,3
1427 | 1425,4
1428 | 1426,2
1429 | 1427,2
1430 | 1428,4
1431 | 1429,4
1432 | 1430,2
1433 | 1431,2
1434 | 1432,3
1435 | 1433,2
1436 | 1434,3
1437 | 1435,0
1438 | 1436,5
1439 | 1437,4
1440 | 1438,3
1441 | 1439,3
1442 | 1440,3
1443 | 1441,5
1444 | 1442,3
1445 | 1443,4
1446 | 1444,2
1447 | 1445,3
1448 | 1446,3
1449 | 1447,3
1450 | 1448,1
1451 | 1449,4
1452 | 1450,3
1453 | 1451,4
1454 | 1452,4
1455 | 1453,3
1456 | 1454,4
1457 | 1455,5
1458 | 1456,3
1459 | 1457,3
1460 | 1458,3
1461 | 1459,1
1462 | 1460,3
1463 | 1461,4
1464 | 1462,3
1465 | 1463,3
1466 | 1464,6
1467 | 1465,3
1468 | 1466,2
1469 | 1467,0
1470 | 1468,0
1471 | 1469,3
1472 | 1470,5
1473 | 1471,2
1474 | 1472,3
1475 | 1473,3
1476 | 1474,4
1477 | 1475,0
1478 | 1476,6
1479 | 1477,3
1480 | 1478,5
1481 | 1479,3
1482 | 1480,2
1483 | 1481,4
1484 | 1482,6
1485 | 1483,2
1486 | 1484,4
1487 | 1485,6
1488 | 1486,2
1489 | 1487,6
1490 | 1488,3
1491 | 1489,2
1492 | 1490,1
1493 | 1491,4
1494 | 1492,2
1495 | 1493,4
1496 | 1494,5
1497 | 1495,6
1498 | 1496,3
1499 | 1497,3
1500 | 1498,3
1501 | 1499,2
1502 | 1500,5
1503 | 1501,6
1504 | 1502,3
1505 | 1503,3
1506 | 1504,6
1507 | 1505,1
1508 | 1506,2
1509 | 1507,0
1510 | 1508,3
1511 | 1509,2
1512 | 1510,4
1513 | 1511,3
1514 | 1512,5
1515 | 1513,2
1516 | 1514,3
1517 | 1515,0
1518 | 1516,2
1519 | 1517,0
1520 | 1518,4
1521 | 1519,4
1522 | 1520,2
1523 | 1521,0
1524 | 1522,4
1525 | 1523,0
1526 | 1524,0
1527 | 1525,6
1528 | 1526,0
1529 | 1527,0
1530 | 1528,2
1531 | 1529,4
1532 | 1530,4
1533 | 1531,4
1534 | 1532,4
1535 | 1533,4
1536 | 1534,4
1537 | 1535,0
1538 | 1536,0
1539 | 1537,5
1540 | 1538,5
1541 | 1539,6
1542 | 1540,0
1543 | 1541,3
1544 | 1542,3
1545 | 1543,5
1546 | 1544,5
1547 | 1545,4
1548 | 1546,2
1549 | 1547,1
1550 | 1548,3
1551 | 1549,5
1552 | 1550,2
1553 | 1551,1
1554 | 1552,1
1555 | 1553,5
1556 | 1554,3
1557 | 1555,5
1558 | 1556,0
1559 | 1557,2
1560 | 1558,3
1561 | 1559,4
1562 | 1560,1
1563 | 1561,1
1564 | 1562,2
1565 | 1563,3
1566 | 1564,1
1567 | 1565,2
1568 | 1566,2
1569 | 1567,3
1570 | 1568,2
1571 | 1569,4
1572 | 1570,3
1573 | 1571,1
1574 | 1572,1
1575 | 1573,3
1576 | 1574,3
1577 | 1575,3
1578 | 1576,3
1579 | 1577,3
1580 | 1578,5
1581 | 1579,5
1582 | 1580,0
1583 | 1581,3
1584 | 1582,3
1585 | 1583,0
1586 | 1584,1
1587 | 1585,4
1588 | 1586,2
1589 | 1587,6
1590 | 1588,0
1591 | 1589,2
1592 | 1590,3
1593 | 1591,3
1594 | 1592,6
1595 | 1593,6
1596 | 1594,5
1597 | 1595,3
1598 | 1596,2
1599 | 1597,3
1600 | 1598,3
1601 | 1599,2
1602 | 1600,3
1603 | 1601,2
1604 | 1602,0
1605 | 1603,3
1606 | 1604,2
1607 | 1605,3
1608 | 1606,2
1609 | 1607,3
1610 | 1608,3
1611 | 1609,1
1612 | 1610,2
1613 | 1611,3
1614 | 1612,2
1615 | 1613,3
1616 | 1614,3
1617 | 1615,3
1618 | 1616,6
1619 | 1617,2
1620 | 1618,4
1621 | 1619,5
1622 | 1620,1
1623 | 1621,3
1624 | 1622,3
1625 | 1623,1
1626 | 1624,1
1627 | 1625,2
1628 | 1626,4
1629 | 1627,2
1630 | 1628,0
1631 | 1629,2
1632 | 1630,5
1633 | 1631,0
1634 | 1632,2
1635 | 1633,3
1636 | 1634,4
1637 | 1635,4
1638 | 1636,3
1639 | 1637,0
1640 | 1638,1
1641 | 1639,4
1642 | 1640,1
1643 | 1641,3
1644 | 1642,2
1645 | 1643,4
1646 | 1644,0
1647 | 1645,3
1648 | 1646,2
1649 | 1647,6
1650 | 1648,2
1651 | 1649,4
1652 | 1650,5
1653 | 1651,1
1654 | 1652,0
1655 | 1653,4
1656 | 1654,3
1657 | 1655,0
1658 | 1656,1
1659 | 1657,3
1660 | 1658,0
1661 | 1659,2
1662 | 1660,6
1663 | 1661,5
1664 | 1662,3
1665 | 1663,3
1666 | 1664,3
1667 | 1665,3
1668 | 1666,3
1669 | 1667,4
1670 | 1668,2
1671 | 1669,0
1672 | 1670,1
1673 | 1671,4
1674 | 1672,0
1675 | 1673,3
1676 | 1674,6
1677 | 1675,6
1678 | 1676,6
1679 | 1677,5
1680 | 1678,3
1681 | 1679,3
1682 | 1680,0
1683 | 1681,3
1684 | 1682,0
1685 | 1683,6
1686 | 1684,3
1687 | 1685,2
1688 | 1686,4
1689 | 1687,2
1690 | 1688,4
1691 | 1689,2
1692 | 1690,5
1693 | 1691,3
1694 | 1692,3
1695 | 1693,0
1696 | 1694,2
1697 | 1695,0
1698 | 1696,0
1699 | 1697,3
1700 | 1698,6
1701 | 1699,1
1702 | 1700,5
1703 | 1701,3
1704 | 1702,4
1705 | 1703,4
1706 | 1704,3
1707 | 1705,1
1708 | 1706,2
1709 | 1707,5
1710 | 1708,3
1711 | 1709,2
1712 | 1710,2
1713 | 1711,2
1714 | 1712,2
1715 | 1713,0
1716 | 1714,2
1717 | 1715,2
1718 | 1716,2
1719 | 1717,2
1720 | 1718,2
1721 | 1719,2
1722 | 1720,2
1723 | 1721,2
1724 | 1722,2
1725 | 1723,2
1726 | 1724,2
1727 | 1725,2
1728 | 1726,2
1729 | 1727,2
1730 | 1728,3
1731 | 1729,2
1732 | 1730,2
1733 | 1731,2
1734 | 1732,2
1735 | 1733,2
1736 | 1734,2
1737 | 1735,1
1738 | 1736,2
1739 | 1737,2
1740 | 1738,2
1741 | 1739,2
1742 | 1740,2
1743 | 1741,3
1744 | 1742,2
1745 | 1743,2
1746 | 1744,2
1747 | 1745,2
1748 | 1746,2
1749 | 1747,2
1750 | 1748,2
1751 | 1749,2
1752 | 1750,2
1753 | 1751,2
1754 | 1752,2
1755 | 1753,2
1756 | 1754,2
1757 | 1755,2
1758 | 1756,2
1759 | 1757,2
1760 | 1758,2
1761 | 1759,2
1762 | 1760,2
1763 | 1761,2
1764 | 1762,2
1765 | 1763,2
1766 | 1764,5
1767 | 1765,2
1768 | 1766,2
1769 | 1767,1
1770 | 1768,1
1771 | 1769,1
1772 | 1770,1
1773 | 1771,1
1774 | 1772,1
1775 | 1773,1
1776 | 1774,4
1777 | 1775,1
1778 | 1776,1
1779 | 1777,1
1780 | 1778,1
1781 | 1779,1
1782 | 1780,1
1783 | 1781,1
1784 | 1782,1
1785 | 1783,1
1786 | 1784,1
1787 | 1785,4
1788 | 1786,1
1789 | 1787,1
1790 | 1788,1
1791 | 1789,1
1792 | 1790,1
1793 | 1791,1
1794 | 1792,3
1795 | 1793,4
1796 | 1794,4
1797 | 1795,4
1798 | 1796,4
1799 | 1797,1
1800 | 1798,1
1801 | 1799,3
1802 | 1800,1
1803 | 1801,0
1804 | 1802,3
1805 | 1803,0
1806 | 1804,2
1807 | 1805,1
1808 | 1806,3
1809 | 1807,3
1810 | 1808,3
1811 | 1809,3
1812 | 1810,3
1813 | 1811,3
1814 | 1812,3
1815 | 1813,3
1816 | 1814,3
1817 | 1815,3
1818 | 1816,3
1819 | 1817,3
1820 | 1818,3
1821 | 1819,3
1822 | 1820,3
1823 | 1821,3
1824 | 1822,3
1825 | 1823,3
1826 | 1824,5
1827 | 1825,5
1828 | 1826,5
1829 | 1827,5
1830 | 1828,5
1831 | 1829,5
1832 | 1830,2
1833 | 1831,2
1834 | 1832,2
1835 | 1833,2
1836 | 1834,1
1837 | 1835,6
1838 | 1836,6
1839 | 1837,3
1840 | 1838,0
1841 | 1839,0
1842 | 1840,5
1843 | 1841,0
1844 | 1842,5
1845 | 1843,0
1846 | 1844,3
1847 | 1845,5
1848 | 1846,3
1849 | 1847,0
1850 | 1848,0
1851 | 1849,6
1852 | 1850,0
1853 | 1851,6
1854 | 1852,3
1855 | 1853,3
1856 | 1854,1
1857 | 1855,3
1858 | 1856,1
1859 | 1857,3
1860 | 1858,3
1861 | 1859,3
1862 | 1860,3
1863 | 1861,3
1864 | 1862,3
1865 | 1863,3
1866 | 1864,3
1867 | 1865,3
1868 | 1866,3
1869 | 1867,3
1870 | 1868,3
1871 | 1869,3
1872 | 1870,3
1873 | 1871,3
1874 | 1872,3
1875 | 1873,3
1876 | 1874,3
1877 | 1875,3
1878 | 1876,3
1879 | 1877,3
1880 | 1878,5
1881 | 1879,5
1882 | 1880,5
1883 | 1881,5
1884 | 1882,5
1885 | 1883,5
1886 | 1884,5
1887 | 1885,5
1888 | 1886,2
1889 | 1887,2
1890 | 1888,2
1891 | 1889,4
1892 | 1890,4
1893 | 1891,4
1894 | 1892,0
1895 | 1893,3
1896 | 1894,3
1897 | 1895,2
1898 | 1896,5
1899 | 1897,5
1900 | 1898,5
1901 | 1899,5
1902 | 1900,6
1903 | 1901,5
1904 | 1902,5
1905 | 1903,5
1906 | 1904,5
1907 | 1905,0
1908 | 1906,4
1909 | 1907,4
1910 | 1908,4
1911 | 1909,0
1912 | 1910,0
1913 | 1911,5
1914 | 1912,0
1915 | 1913,0
1916 | 1914,6
1917 | 1915,6
1918 | 1916,6
1919 | 1917,6
1920 | 1918,6
1921 | 1919,6
1922 | 1920,0
1923 | 1921,0
1924 | 1922,0
1925 | 1923,0
1926 | 1924,3
1927 | 1925,0
1928 | 1926,0
1929 | 1927,0
1930 | 1928,3
1931 | 1929,3
1932 | 1930,0
1933 | 1931,3
1934 | 1932,3
1935 | 1933,3
1936 | 1934,3
1937 | 1935,3
1938 | 1936,3
1939 | 1937,3
1940 | 1938,3
1941 | 1939,3
1942 | 1940,3
1943 | 1941,3
1944 | 1942,3
1945 | 1943,3
1946 | 1944,3
1947 | 1945,3
1948 | 1946,3
1949 | 1947,3
1950 | 1948,3
1951 | 1949,3
1952 | 1950,3
1953 | 1951,3
1954 | 1952,3
1955 | 1953,5
1956 | 1954,5
1957 | 1955,5
1958 | 1956,5
1959 | 1957,3
1960 | 1958,5
1961 | 1959,5
1962 | 1960,5
1963 | 1961,5
1964 | 1962,5
1965 | 1963,5
1966 | 1964,4
1967 | 1965,4
1968 | 1966,4
1969 | 1967,4
1970 | 1968,4
1971 | 1969,4
1972 | 1970,4
1973 | 1971,4
1974 | 1972,6
1975 | 1973,6
1976 | 1974,5
1977 | 1975,6
1978 | 1976,6
1979 | 1977,3
1980 | 1978,5
1981 | 1979,5
1982 | 1980,5
1983 | 1981,0
1984 | 1982,5
1985 | 1983,0
1986 | 1984,4
1987 | 1985,4
1988 | 1986,3
1989 | 1987,3
1990 | 1988,3
1991 | 1989,2
1992 | 1990,2
1993 | 1991,1
1994 | 1992,3
1995 | 1993,3
1996 | 1994,3
1997 | 1995,3
1998 | 1996,3
1999 | 1997,3
2000 | 1998,5
2001 | 1999,3
2002 | 2000,3
2003 | 2001,4
2004 | 2002,4
2005 | 2003,3
2006 | 2004,3
2007 | 2005,3
2008 | 2006,3
2009 | 2007,3
2010 | 2008,3
2011 | 2009,3
2012 | 2010,0
2013 | 2011,3
2014 | 2012,3
2015 | 2013,6
2016 | 2014,3
2017 | 2015,6
2018 | 2016,0
2019 | 2017,5
2020 | 2018,0
2021 | 2019,0
2022 | 2020,4
2023 | 2021,0
2024 | 2022,6
2025 | 2023,5
2026 | 2024,5
2027 | 2025,0
2028 | 2026,1
2029 | 2027,3
2030 | 2028,3
2031 | 2029,5
2032 | 2030,6
2033 | 2031,5
2034 | 2032,3
2035 | 2033,3
2036 | 2034,4
2037 | 2035,3
2038 | 2036,3
2039 | 2037,3
2040 | 2038,3
2041 | 2039,3
2042 | 2040,4
2043 | 2041,3
2044 | 2042,3
2045 | 2043,4
2046 | 2044,3
2047 | 2045,1
2048 | 2046,1
2049 | 2047,0
2050 | 2048,1
2051 | 2049,0
2052 | 2050,6
2053 | 2051,0
2054 | 2052,0
2055 | 2053,0
2056 | 2054,0
2057 | 2055,0
2058 | 2056,0
2059 | 2057,0
2060 | 2058,5
2061 | 2059,0
2062 | 2060,5
2063 | 2061,5
2064 | 2062,5
2065 | 2063,3
2066 | 2064,3
2067 | 2065,3
2068 | 2066,3
2069 | 2067,3
2070 | 2068,0
2071 | 2069,0
2072 | 2070,0
2073 | 2071,2
2074 | 2072,0
2075 | 2073,0
2076 | 2074,0
2077 | 2075,3
2078 | 2076,3
2079 | 2077,3
2080 | 2078,3
2081 | 2079,1
2082 | 2080,1
2083 | 2081,1
2084 | 2082,1
2085 | 2083,2
2086 | 2084,1
2087 | 2085,1
2088 | 2086,1
2089 | 2087,1
2090 | 2088,1
2091 | 2089,0
2092 | 2090,1
2093 | 2091,3
2094 | 2092,1
2095 | 2093,1
2096 | 2094,1
2097 | 2095,1
2098 | 2096,1
2099 | 2097,0
2100 | 2098,0
2101 | 2099,0
2102 | 2100,5
2103 | 2101,5
2104 | 2102,5
2105 | 2103,5
2106 | 2104,3
2107 | 2105,5
2108 | 2106,1
2109 | 2107,1
2110 | 2108,3
2111 | 2109,6
2112 | 2110,6
2113 | 2111,5
2114 | 2112,6
2115 | 2113,2
2116 | 2114,3
2117 | 2115,3
2118 | 2116,0
2119 | 2117,3
2120 | 2118,3
2121 | 2119,3
2122 | 2120,4
2123 | 2121,4
2124 | 2122,4
2125 | 2123,4
2126 | 2124,3
2127 | 2125,3
2128 | 2126,3
2129 | 2127,4
2130 | 2128,3
2131 | 2129,3
2132 | 2130,4
2133 | 2131,0
2134 | 2132,6
2135 | 2133,0
2136 | 2134,6
2137 | 2135,6
2138 | 2136,0
2139 | 2137,0
2140 | 2138,3
2141 | 2139,3
2142 | 2140,3
2143 | 2141,3
2144 | 2142,3
2145 | 2143,1
2146 | 2144,1
2147 | 2145,1
2148 | 2146,3
2149 | 2147,3
2150 | 2148,3
2151 | 2149,3
2152 | 2150,5
2153 | 2151,6
2154 | 2152,3
2155 | 2153,4
2156 | 2154,6
2157 | 2155,0
2158 | 2156,0
2159 | 2157,6
2160 | 2158,6
2161 | 2159,6
2162 | 2160,6
2163 | 2161,6
2164 | 2162,3
2165 | 2163,3
2166 | 2164,6
2167 | 2165,6
2168 | 2166,5
2169 | 2167,2
2170 | 2168,1
2171 | 2169,2
2172 | 2170,1
2173 | 2171,0
2174 | 2172,0
2175 | 2173,6
2176 | 2174,6
2177 | 2175,2
2178 | 2176,3
2179 | 2177,3
2180 | 2178,5
2181 | 2179,0
2182 | 2180,0
2183 | 2181,0
2184 | 2182,0
2185 | 2183,0
2186 | 2184,5
2187 | 2185,5
2188 | 2186,0
2189 | 2187,3
2190 | 2188,5
2191 | 2189,0
2192 | 2190,6
2193 | 2191,3
2194 | 2192,6
2195 | 2193,0
2196 | 2194,0
2197 | 2195,0
2198 | 2196,0
2199 | 2197,0
2200 | 2198,0
2201 | 2199,0
2202 | 2200,0
2203 | 2201,0
2204 | 2202,0
2205 | 2203,0
2206 | 2204,3
2207 | 2205,3
2208 | 2206,3
2209 | 2207,3
2210 | 2208,1
2211 | 2209,6
2212 | 2210,1
2213 | 2211,0
2214 | 2212,3
2215 | 2213,3
2216 | 2214,3
2217 | 2215,3
2218 | 2216,3
2219 | 2217,6
2220 | 2218,1
2221 | 2219,0
2222 | 2220,2
2223 | 2221,2
2224 | 2222,4
2225 | 2223,4
2226 | 2224,4
2227 | 2225,4
2228 | 2226,4
2229 | 2227,5
2230 | 2228,6
2231 | 2229,3
2232 | 2230,3
2233 | 2231,0
2234 | 2232,0
2235 | 2233,0
2236 | 2234,0
2237 | 2235,5
2238 | 2236,4
2239 | 2237,4
2240 | 2238,4
2241 | 2239,4
2242 | 2240,4
2243 | 2241,3
2244 | 2242,3
2245 | 2243,3
2246 | 2244,3
2247 | 2245,3
2248 | 2246,0
2249 | 2247,3
2250 | 2248,4
2251 | 2249,4
2252 | 2250,4
2253 | 2251,1
2254 | 2252,1
2255 | 2253,3
2256 | 2254,1
2257 | 2255,1
2258 | 2256,5
2259 | 2257,1
2260 | 2258,3
2261 | 2259,4
2262 | 2260,4
2263 | 2261,4
2264 | 2262,4
2265 | 2263,4
2266 | 2264,4
2267 | 2265,4
2268 | 2266,0
2269 | 2267,0
2270 | 2268,0
2271 | 2269,5
2272 | 2270,5
2273 | 2271,5
2274 | 2272,5
2275 | 2273,5
2276 | 2274,0
2277 | 2275,5
2278 | 2276,3
2279 | 2277,0
2280 | 2278,6
2281 | 2279,2
2282 | 2280,0
2283 | 2281,5
2284 | 2282,3
2285 | 2283,3
2286 | 2284,5
2287 | 2285,5
2288 | 2286,5
2289 | 2287,5
2290 | 2288,5
2291 | 2289,4
2292 | 2290,4
2293 | 2291,0
2294 | 2292,4
2295 | 2293,0
2296 | 2294,4
2297 | 2295,0
2298 | 2296,3
2299 | 2297,4
2300 | 2298,4
2301 | 2299,4
2302 | 2300,1
2303 | 2301,3
2304 | 2302,3
2305 | 2303,3
2306 | 2304,3
2307 | 2305,3
2308 | 2306,4
2309 | 2307,2
2310 | 2308,3
2311 | 2309,3
2312 | 2310,3
2313 | 2311,0
2314 | 2312,0
2315 | 2313,2
2316 | 2314,3
2317 | 2315,3
2318 | 2316,3
2319 | 2317,3
2320 | 2318,1
2321 | 2319,1
2322 | 2320,3
2323 | 2321,0
2324 | 2322,1
2325 | 2323,4
2326 | 2324,1
2327 | 2325,1
2328 | 2326,1
2329 | 2327,1
2330 | 2328,1
2331 | 2329,1
2332 | 2330,0
2333 | 2331,1
2334 | 2332,0
2335 | 2333,0
2336 | 2334,2
2337 | 2335,4
2338 | 2336,4
2339 | 2337,4
2340 | 2338,3
2341 | 2339,3
2342 | 2340,3
2343 | 2341,4
2344 | 2342,0
2345 | 2343,3
2346 | 2344,3
2347 | 2345,3
2348 | 2346,3
2349 | 2347,0
2350 | 2348,3
2351 | 2349,3
2352 | 2350,4
2353 | 2351,4
2354 | 2352,4
2355 | 2353,4
2356 | 2354,4
2357 | 2355,4
2358 | 2356,0
2359 | 2357,4
2360 | 2358,3
2361 | 2359,2
2362 | 2360,0
2363 | 2361,3
2364 | 2362,4
2365 | 2363,5
2366 | 2364,0
2367 | 2365,2
2368 | 2366,2
2369 | 2367,3
2370 | 2368,3
2371 | 2369,3
2372 | 2370,3
2373 | 2371,3
2374 | 2372,2
2375 | 2373,3
2376 | 2374,5
2377 | 2375,5
2378 | 2376,4
2379 | 2377,1
2380 | 2378,4
2381 | 2379,4
2382 | 2380,4
2383 | 2381,3
2384 | 2382,4
2385 | 2383,4
2386 | 2384,0
2387 | 2385,4
2388 | 2386,4
2389 | 2387,4
2390 | 2388,5
2391 | 2389,2
2392 | 2390,2
2393 | 2391,2
2394 | 2392,2
2395 | 2393,4
2396 | 2394,6
2397 | 2395,6
2398 | 2396,6
2399 | 2397,6
2400 | 2398,3
2401 | 2399,4
2402 | 2400,4
2403 | 2401,4
2404 | 2402,1
2405 | 2403,3
2406 | 2404,0
2407 | 2405,3
2408 | 2406,3
2409 | 2407,5
2410 | 2408,0
2411 | 2409,2
2412 | 2410,3
2413 | 2411,3
2414 | 2412,3
2415 | 2413,3
2416 | 2414,3
2417 | 2415,2
2418 | 2416,4
2419 | 2417,4
2420 | 2418,0
2421 | 2419,0
2422 | 2420,3
2423 | 2421,2
2424 | 2422,6
2425 | 2423,6
2426 | 2424,0
2427 | 2425,3
2428 | 2426,3
2429 | 2427,3
2430 | 2428,5
2431 | 2429,1
2432 | 2430,3
2433 | 2431,4
2434 | 2432,4
2435 | 2433,2
2436 | 2434,4
2437 | 2435,4
2438 | 2436,4
2439 | 2437,3
2440 | 2438,3
2441 | 2439,2
2442 | 2440,2
2443 | 2441,2
2444 | 2442,2
2445 | 2443,2
2446 | 2444,2
2447 | 2445,2
2448 | 2446,2
2449 | 2447,2
2450 | 2448,2
2451 | 2449,0
2452 | 2450,2
2453 | 2451,2
2454 | 2452,2
2455 | 2453,0
2456 | 2454,6
2457 | 2455,6
2458 | 2456,5
2459 | 2457,6
2460 | 2458,6
2461 | 2459,3
2462 | 2460,2
2463 | 2461,6
2464 | 2462,3
2465 | 2463,4
2466 | 2464,4
2467 | 2465,4
2468 | 2466,2
2469 | 2467,6
2470 | 2468,6
2471 | 2469,0
2472 | 2470,0
2473 | 2471,3
2474 | 2472,0
2475 | 2473,4
2476 | 2474,4
2477 | 2475,3
2478 | 2476,2
2479 | 2477,3
2480 | 2478,1
2481 | 2479,6
2482 | 2480,6
2483 | 2481,5
2484 | 2482,3
2485 | 2483,4
2486 | 2484,3
2487 | 2485,5
2488 | 2486,3
2489 | 2487,1
2490 | 2488,1
2491 | 2489,3
2492 | 2490,4
2493 | 2491,5
2494 | 2492,2
2495 | 2493,3
2496 | 2494,3
2497 | 2495,3
2498 | 2496,4
2499 | 2497,5
2500 | 2498,4
2501 | 2499,0
2502 | 2500,3
2503 | 2501,3
2504 | 2502,0
2505 | 2503,2
2506 | 2504,1
2507 | 2505,1
2508 | 2506,5
2509 | 2507,2
2510 | 2508,3
2511 | 2509,3
2512 | 2510,5
2513 | 2511,0
2514 | 2512,2
2515 | 2513,3
2516 | 2514,2
2517 | 2515,2
2518 | 2516,5
2519 | 2517,5
2520 | 2518,4
2521 | 2519,3
2522 | 2520,4
2523 | 2521,3
2524 | 2522,2
2525 | 2523,2
2526 | 2524,4
2527 | 2525,2
2528 | 2526,4
2529 | 2527,5
2530 | 2528,5
2531 | 2529,3
2532 | 2530,2
2533 | 2531,3
2534 | 2532,1
2535 | 2533,0
2536 | 2534,3
2537 | 2535,3
2538 | 2536,4
2539 | 2537,5
2540 | 2538,4
2541 | 2539,3
2542 | 2540,3
2543 | 2541,3
2544 | 2542,3
2545 | 2543,3
2546 | 2544,0
2547 | 2545,1
2548 | 2546,2
2549 | 2547,4
2550 | 2548,4
2551 | 2549,4
2552 | 2550,3
2553 | 2551,3
2554 | 2552,3
2555 | 2553,5
2556 | 2554,2
2557 | 2555,3
2558 | 2556,2
2559 | 2557,2
2560 | 2558,2
2561 | 2559,3
2562 | 2560,2
2563 | 2561,2
2564 | 2562,0
2565 | 2563,4
2566 | 2564,4
2567 | 2565,3
2568 | 2566,3
2569 | 2567,3
2570 | 2568,3
2571 | 2569,3
2572 | 2570,3
2573 | 2571,3
2574 | 2572,3
2575 | 2573,3
2576 | 2574,3
2577 | 2575,0
2578 | 2576,0
2579 | 2577,3
2580 | 2578,0
2581 | 2579,3
2582 | 2580,0
2583 | 2581,2
2584 | 2582,3
2585 | 2583,4
2586 | 2584,1
2587 | 2585,2
2588 | 2586,5
2589 | 2587,4
2590 | 2588,3
2591 | 2589,3
2592 | 2590,3
2593 | 2591,1
2594 | 2592,5
2595 | 2593,3
2596 | 2594,4
2597 | 2595,3
2598 | 2596,2
2599 | 2597,2
2600 | 2598,1
2601 | 2599,3
2602 | 2600,3
2603 | 2601,3
2604 | 2602,3
2605 | 2603,3
2606 | 2604,6
2607 | 2605,3
2608 | 2606,3
2609 | 2607,3
2610 | 2608,6
2611 | 2609,3
2612 | 2610,3
2613 | 2611,3
2614 | 2612,2
2615 | 2613,3
2616 | 2614,2
2617 | 2615,4
2618 | 2616,2
2619 | 2617,4
2620 | 2618,2
2621 | 2619,2
2622 | 2620,1
2623 | 2621,5
2624 | 2622,6
2625 | 2623,4
2626 | 2624,3
2627 | 2625,3
2628 | 2626,3
2629 | 2627,2
2630 | 2628,5
2631 | 2629,3
2632 | 2630,3
2633 | 2631,4
2634 | 2632,3
2635 | 2633,3
2636 | 2634,3
2637 | 2635,3
2638 | 2636,3
2639 | 2637,4
2640 | 2638,6
2641 | 2639,0
2642 | 2640,3
2643 | 2641,2
2644 | 2642,2
2645 | 2643,2
2646 | 2644,5
2647 | 2645,4
2648 | 2646,4
2649 | 2647,4
2650 | 2648,4
2651 | 2649,6
2652 | 2650,3
2653 | 2651,2
2654 | 2652,2
2655 | 2653,0
2656 | 2654,2
2657 | 2655,2
2658 | 2656,2
2659 | 2657,2
2660 | 2658,2
2661 | 2659,3
2662 | 2660,4
2663 | 2661,4
2664 | 2662,4
2665 | 2663,3
2666 | 2664,3
2667 | 2665,4
2668 | 2666,4
2669 | 2667,3
2670 | 2668,3
2671 | 2669,3
2672 | 2670,4
2673 | 2671,4
2674 | 2672,4
2675 | 2673,4
2676 | 2674,4
2677 | 2675,4
2678 | 2676,3
2679 | 2677,4
2680 | 2678,4
2681 | 2679,4
2682 | 2680,4
2683 | 2681,4
2684 | 2682,4
2685 | 2683,4
2686 | 2684,4
2687 | 2685,2
2688 | 2686,3
2689 | 2687,3
2690 | 2688,3
2691 | 2689,2
2692 | 2690,6
2693 | 2691,2
2694 | 2692,3
2695 | 2693,3
2696 | 2694,4
2697 | 2695,4
2698 | 2696,3
2699 | 2697,3
2700 | 2698,3
2701 | 2699,3
2702 | 2700,3
2703 | 2701,3
2704 | 2702,0
2705 | 2703,3
2706 | 2704,3
2707 | 2705,3
2708 | 2706,3
2709 | 2707,3
2710 |
--------------------------------------------------------------------------------
/mixhop.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/benedekrozemberczki/MixHop-and-N-GCN/e9df5eb988f0dbe5192ac87902c55cbe6d8191fe/mixhop.gif
--------------------------------------------------------------------------------
/mixhop1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/benedekrozemberczki/MixHop-and-N-GCN/e9df5eb988f0dbe5192ac87902c55cbe6d8191fe/mixhop1.jpg
--------------------------------------------------------------------------------
/mixhop_paper.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/benedekrozemberczki/MixHop-and-N-GCN/e9df5eb988f0dbe5192ac87902c55cbe6d8191fe/mixhop_paper.pdf
--------------------------------------------------------------------------------
/paper.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/benedekrozemberczki/MixHop-and-N-GCN/e9df5eb988f0dbe5192ac87902c55cbe6d8191fe/paper.pdf
--------------------------------------------------------------------------------
/src/layers.py:
--------------------------------------------------------------------------------
1 | """NGCN and DenseNGCN layers."""
2 |
3 | import math
4 | import torch
5 | from torch_sparse import spmm
6 |
7 | def uniform(size, tensor):
8 | """
9 | Uniform weight initialization.
10 | :param size: Size of the tensor.
11 | :param tensor: Tensor initialized.
12 | """
13 | stdv = 1.0 / math.sqrt(size)
14 | if tensor is not None:
15 | tensor.data.uniform_(-stdv, stdv)
16 |
17 | class SparseNGCNLayer(torch.nn.Module):
18 | """
19 | Multi-scale Sparse Feature Matrix GCN layer.
20 | :param in_channels: Number of features.
21 | :param out_channels: Number of filters.
22 | :param iterations: Adjacency matrix power order.
23 | :param dropout_rate: Dropout value.
24 | """
25 | def __init__(self, in_channels, out_channels, iterations, dropout_rate):
26 | super(SparseNGCNLayer, self).__init__()
27 | self.in_channels = in_channels
28 | self.out_channels = out_channels
29 | self.iterations = iterations
30 | self.dropout_rate = dropout_rate
31 | self.define_parameters()
32 | self.init_parameters()
33 |
34 | def define_parameters(self):
35 | """
36 | Defining the weight matrices.
37 | """
38 | self.weight_matrix = torch.nn.Parameter(torch.Tensor(self.in_channels, self.out_channels))
39 | self.bias = torch.nn.Parameter(torch.Tensor(1, self.out_channels))
40 |
41 | def init_parameters(self):
42 | """
43 | Initializing weights.
44 | """
45 | torch.nn.init.xavier_uniform_(self.weight_matrix)
46 | torch.nn.init.xavier_uniform_(self.bias)
47 |
48 | def forward(self, normalized_adjacency_matrix, features):
49 | """
50 | Doing a forward pass.
51 | :param normalized_adjacency_matrix: Normalized adjacency matrix.
52 | :param features: Feature matrix.
53 | :return base_features: Convolved features.
54 | """
55 | feature_count, _ = torch.max(features["indices"],dim=1)
56 | feature_count = feature_count + 1
57 | base_features = spmm(features["indices"], features["values"], feature_count[0],
58 | feature_count[1], self.weight_matrix)
59 |
60 | base_features = base_features + self.bias
61 |
62 | base_features = torch.nn.functional.dropout(base_features,
63 | p=self.dropout_rate,
64 | training=self.training)
65 |
66 | base_features = torch.nn.functional.relu(base_features)
67 | for _ in range(self.iterations-1):
68 | base_features = spmm(normalized_adjacency_matrix["indices"],
69 | normalized_adjacency_matrix["values"],
70 | base_features.shape[0],
71 | base_features.shape[0],
72 | base_features)
73 | return base_features
74 |
75 | class DenseNGCNLayer(torch.nn.Module):
76 | """
77 | Multi-scale Dense Feature Matrix GCN layer.
78 | :param in_channels: Number of features.
79 | :param out_channels: Number of filters.
80 | :param iterations: Adjacency matrix power order.
81 | :param dropout_rate: Dropout value.
82 | """
83 | def __init__(self, in_channels, out_channels, iterations, dropout_rate):
84 | super(DenseNGCNLayer, self).__init__()
85 | self.in_channels = in_channels
86 | self.out_channels = out_channels
87 | self.iterations = iterations
88 | self.dropout_rate = dropout_rate
89 | self.define_parameters()
90 | self.init_parameters()
91 |
92 | def define_parameters(self):
93 | """
94 | Defining the weight matrices.
95 | """
96 | self.weight_matrix = torch.nn.Parameter(torch.Tensor(self.in_channels, self.out_channels))
97 | self.bias = torch.nn.Parameter(torch.Tensor(1, self.out_channels))
98 |
99 | def init_parameters(self):
100 | """
101 | Initializing weights.
102 | """
103 | torch.nn.init.xavier_uniform_(self.weight_matrix)
104 | torch.nn.init.xavier_uniform_(self.bias)
105 |
106 | def forward(self, normalized_adjacency_matrix, features):
107 | """
108 | Doing a forward pass.
109 | :param normalized_adjacency_matrix: Normalized adjacency matrix.
110 | :param features: Feature matrix.
111 | :return base_features: Convolved features.
112 | """
113 | base_features = torch.mm(features, self.weight_matrix)
114 | base_features = torch.nn.functional.dropout(base_features,
115 | p=self.dropout_rate,
116 | training=self.training)
117 | for _ in range(self.iterations-1):
118 | base_features = spmm(normalized_adjacency_matrix["indices"],
119 | normalized_adjacency_matrix["values"],
120 | base_features.shape[0],
121 | base_features.shape[0],
122 | base_features)
123 | base_features = base_features + self.bias
124 | return base_features
125 |
126 | class ListModule(torch.nn.Module):
127 | """
128 | Abstract list layer class.
129 | """
130 | def __init__(self, *args):
131 | """
132 | Module initializing.
133 | """
134 | super(ListModule, self).__init__()
135 | idx = 0
136 | for module in args:
137 | self.add_module(str(idx), module)
138 | idx += 1
139 |
140 | def __getitem__(self, idx):
141 | """
142 | Getting the indexed layer.
143 | """
144 | if idx < 0 or idx >= len(self._modules):
145 | raise IndexError('index {} is out of range'.format(idx))
146 | it = iter(self._modules.values())
147 | for i in range(idx):
148 | next(it)
149 | return next(it)
150 |
151 | def __iter__(self):
152 | """
153 | Iterating on the layers.
154 | """
155 | return iter(self._modules.values())
156 |
157 | def __len__(self):
158 | """
159 | Number of layers.
160 | """
161 | return len(self._modules)
162 |
--------------------------------------------------------------------------------
/src/main.py:
--------------------------------------------------------------------------------
1 | """Running MixHop or N-GCN."""
2 |
3 | import torch
4 | from param_parser import parameter_parser
5 | from trainer_and_networks import Trainer
6 | from utils import tab_printer, graph_reader, feature_reader, target_reader
7 |
8 | def main():
9 | """
10 | Parsing command line parameters, reading data.
11 | Fitting an NGCN and scoring the model.
12 | """
13 | args = parameter_parser()
14 | torch.manual_seed(args.seed)
15 | tab_printer(args)
16 | graph = graph_reader(args.edge_path)
17 | features = feature_reader(args.features_path)
18 | target = target_reader(args.target_path)
19 | trainer = Trainer(args, graph, features, target, True)
20 | trainer.fit()
21 | if args.model == "mixhop":
22 | trainer.evaluate_architecture()
23 | args = trainer.reset_architecture()
24 | trainer = Trainer(args, graph, features, target, False)
25 | trainer.fit()
26 |
27 | if __name__ == "__main__":
28 | main()
29 |
--------------------------------------------------------------------------------
/src/param_parser.py:
--------------------------------------------------------------------------------
1 | """Parameter parsing."""
2 |
3 | import argparse
4 |
5 | def parameter_parser():
6 | """
7 | A method to parse up command line parameters. By default it trains on the Cora dataset.
8 | The default hyperparameters give a good quality representation without grid search.
9 | """
10 | parser = argparse.ArgumentParser(description="Run MixHop/N-GCN.")
11 |
12 | parser.add_argument("--edge-path",
13 | nargs="?",
14 | default="./input/cora_edges.csv",
15 | help="Edge list csv.")
16 |
17 | parser.add_argument("--features-path",
18 | nargs="?",
19 | default="./input/cora_features.json",
20 | help="Features json.")
21 |
22 | parser.add_argument("--target-path",
23 | nargs="?",
24 | default="./input/cora_target.csv",
25 | help="Target classes csv.")
26 |
27 | parser.add_argument("--model",
28 | nargs="?",
29 | default="mixhop",
30 |                         help="Model type. Default is mixhop.")
31 |
32 | parser.add_argument("--epochs",
33 | type=int,
34 | default=2000,
35 | help="Number of training epochs. Default is 2000.")
36 |
37 | parser.add_argument("--seed",
38 | type=int,
39 | default=42,
40 | help="Random seed for train-test split. Default is 42.")
41 |
42 | parser.add_argument("--early-stopping",
43 | type=int,
44 | default=10,
45 | help="Number of early stopping rounds. Default is 10.")
46 |
47 | parser.add_argument("--training-size",
48 | type=int,
49 | default=1500,
50 | help="Training set size. Default is 1500.")
51 |
52 | parser.add_argument("--validation-size",
53 | type=int,
54 | default=500,
55 | help="Validation set size. Default is 500.")
56 |
57 | parser.add_argument("--dropout",
58 | type=float,
59 | default=0.5,
60 | help="Dropout parameter. Default is 0.5.")
61 |
62 | parser.add_argument("--learning-rate",
63 | type=float,
64 | default=0.01,
65 | help="Learning rate. Default is 0.01.")
66 |
67 | parser.add_argument("--cut-off",
68 | type=float,
69 | default=0.1,
70 | help="Weight cut-off. Default is 0.1.")
71 |
72 | parser.add_argument("--lambd",
73 | type=float,
74 | default=0.0005,
75 | help="L2 regularization coefficient. Default is 0.0005.")
76 |
77 | parser.add_argument("--layers-1",
78 | nargs="+",
79 | type=int,
80 | help="Layer dimensions separated by space (top). E.g. 200 20.")
81 |
82 | parser.add_argument("--layers-2",
83 | nargs="+",
84 | type=int,
85 | help="Layer dimensions separated by space (bottom). E.g. 200 200.")
86 |
87 | parser.add_argument("--budget",
88 | type=int,
89 | default=60,
90 | help="Architecture neuron allocation budget. Default is 60.")
91 |
92 | parser.set_defaults(layers_1=[200, 200, 200])
93 | parser.set_defaults(layers_2=[200, 200, 200])
94 |
95 | return parser.parse_args()
96 |
--------------------------------------------------------------------------------
/src/trainer_and_networks.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import random
3 | from tqdm import trange
4 | from utils import create_propagator_matrix
5 | from layers import SparseNGCNLayer, DenseNGCNLayer, ListModule
6 |
7 | class NGCNNetwork(torch.nn.Module):
8 | """
9 | Higher Order Graph Convolutional Model.
10 | :param args: Arguments object.
11 | :param feature_number: Feature input number.
12 | :param class_number: Target class number.
13 | """
14 | def __init__(self, args, feature_number, class_number):
15 | super(NGCNNetwork, self).__init__()
16 | self.args = args
17 | self.feature_number = feature_number
18 | self.class_number = class_number
19 | self.order = len(self.args.layers_1)
20 | self.setup_layer_structure()
21 |
22 | def setup_layer_structure(self):
23 | """
24 |         Creating the layer structure (one sparse convolutional layer per adjacency power) and the dense final layer.
25 | """
26 | self.main_layers = [SparseNGCNLayer(self.feature_number, self.args.layers_1[i-1], i, self.args.dropout) for i in range(1, self.order+1)]
27 | self.main_layers = ListModule(*self.main_layers)
28 | self.fully_connected = torch.nn.Linear(sum(self.args.layers_1), self.class_number)
29 |
30 | def forward(self, normalized_adjacency_matrix, features):
31 | """
32 | Forward pass.
33 |         :param normalized_adjacency_matrix: Target matrix as a dict with indices and values.
34 | :param features: Feature matrix.
35 | :return predictions: Label predictions.
36 | """
37 | abstract_features = [self.main_layers[i](normalized_adjacency_matrix, features) for i in range(self.order)]
38 | abstract_features = torch.cat(abstract_features, dim=1)
39 | predictions = torch.nn.functional.log_softmax(self.fully_connected(abstract_features), dim=1)
40 | return predictions
41 |
42 | class MixHopNetwork(torch.nn.Module):
43 | """
44 | MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing.
45 | :param args: Arguments object.
46 | :param feature_number: Feature input number.
47 | :param class_number: Target class number.
48 | """
49 | def __init__(self, args, feature_number, class_number):
50 | super(MixHopNetwork, self).__init__()
51 | self.args = args
52 | self.feature_number = feature_number
53 | self.class_number = class_number
54 | self.calculate_layer_sizes()
55 | self.setup_layer_structure()
56 |
57 | def calculate_layer_sizes(self):
58 | self.abstract_feature_number_1 = sum(self.args.layers_1)
59 | self.abstract_feature_number_2 = sum(self.args.layers_2)
60 | self.order_1 = len(self.args.layers_1)
61 | self.order_2 = len(self.args.layers_2)
62 |
63 | def setup_layer_structure(self):
64 | """
65 |         Creating the layer structure (sparse upper and dense bottom layers, one per adjacency power) and the dense final layer.
66 | """
67 | self.upper_layers = [SparseNGCNLayer(self.feature_number, self.args.layers_1[i-1], i, self.args.dropout) for i in range(1, self.order_1+1)]
68 | self.upper_layers = ListModule(*self.upper_layers)
69 | self.bottom_layers = [DenseNGCNLayer(self.abstract_feature_number_1, self.args.layers_2[i-1], i, self.args.dropout) for i in range(1, self.order_2+1)]
70 | self.bottom_layers = ListModule(*self.bottom_layers)
71 | self.fully_connected = torch.nn.Linear(self.abstract_feature_number_2, self.class_number)
72 |
73 | def calculate_group_loss(self):
74 | """
75 | Calculating the column losses.
76 | """
77 | weight_loss = 0
78 | for i in range(self.order_1):
79 | upper_column_loss = torch.norm(self.upper_layers[i].weight_matrix, dim=0)
80 | loss_upper = torch.sum(upper_column_loss)
81 | weight_loss = weight_loss + self.args.lambd*loss_upper
82 | for i in range(self.order_2):
83 | bottom_column_loss = torch.norm(self.bottom_layers[i].weight_matrix, dim=0)
84 | loss_bottom = torch.sum(bottom_column_loss)
85 | weight_loss = weight_loss + self.args.lambd*loss_bottom
86 | return weight_loss
87 |
88 | def calculate_loss(self):
89 | """
90 | Calculating the losses.
91 | """
92 | weight_loss = 0
93 | for i in range(self.order_1):
94 | loss_upper = torch.norm(self.upper_layers[i].weight_matrix)
95 | weight_loss = weight_loss + self.args.lambd*loss_upper
96 | for i in range(self.order_2):
97 | loss_bottom = torch.norm(self.bottom_layers[i].weight_matrix)
98 | weight_loss = weight_loss + self.args.lambd*loss_bottom
99 | return weight_loss
100 |
101 | def forward(self, normalized_adjacency_matrix, features):
102 | """
103 | Forward pass.
104 | :param normalized_adjacency_matrix: Normalized adjacency matrix as a dict with index and value tensors.
105 | :param features: Feature matrix.
106 | :return predictions: Label predictions.
107 | """
108 | abstract_features_1 = torch.cat([self.upper_layers[i](normalized_adjacency_matrix, features) for i in range(self.order_1)], dim=1)
109 | abstract_features_2 = torch.cat([self.bottom_layers[i](normalized_adjacency_matrix, abstract_features_1) for i in range(self.order_2)], dim=1)
110 | predictions = torch.nn.functional.log_softmax(self.fully_connected(abstract_features_2), dim=1)
111 | return predictions
112 |
113 | class Trainer(object):
114 | """
115 | Class for training the neural network.
116 | :param args: Arguments object.
117 | :param graph: NetworkX graph.
118 | :param features: Sparse feature matrix as a dict with index and value tensors and the matrix dimensions.
119 | :param target: Target label vector.
120 | :param base_run: Flag; if True the group lasso loss is used (architecture-search base run).
121 | """
122 | def __init__(self, args, graph, features, target, base_run):
123 | self.args = args
124 | self.graph = graph
125 | self.features = features
126 | self.target = target
127 | self.base_run = base_run
128 | self.setup_features()
129 | self.train_test_split()
130 | self.setup_model()
131 |
132 | def train_test_split(self):
133 | """
134 | Creating a train/validation/test split.
135 | """
136 | random.seed(self.args.seed)
137 | nodes = [node for node in range(self.ncount)]
138 | random.shuffle(nodes)
139 | self.train_nodes = torch.LongTensor(nodes[0:self.args.training_size])
140 | self.validation_nodes = torch.LongTensor(nodes[self.args.training_size:self.args.training_size+self.args.validation_size])
141 | self.test_nodes = torch.LongTensor(nodes[self.args.training_size+self.args.validation_size:])
142 |
143 | def setup_features(self):
144 | """
145 | Setting up the node, feature and class counts and the propagation matrix.
146 | """
147 | self.ncount = self.features["dimensions"][0]
148 | self.feature_number = self.features["dimensions"][1]
149 | self.class_number = torch.max(self.target).item()+1
150 | self.propagation_matrix = create_propagator_matrix(self.graph)
151 |
152 | def setup_model(self):
153 | """
154 | Defining a MixHop or NGCN model.
155 | """
156 | if self.args.model == "mixhop":
157 | self.model = MixHopNetwork(self.args, self.feature_number, self.class_number)
158 | else:
159 | self.model = NGCNNetwork(self.args, self.feature_number, self.class_number)
160 |
161 | def fit(self):
162 | """
163 | Fitting a neural network with early stopping.
164 | """
165 | accuracy = 0
166 | no_improvement = 0
167 | epochs = trange(self.args.epochs, desc="Accuracy")
168 | self.optimizer = torch.optim.Adam(self.model.parameters(), lr=self.args.learning_rate)
169 | for _ in epochs:
170 | self.model.train()  # score() switches the model to eval mode, so re-enable training mode each epoch
171 | self.optimizer.zero_grad()
172 | prediction = self.model(self.propagation_matrix, self.features)
173 | loss = torch.nn.functional.nll_loss(prediction[self.train_nodes], self.target[self.train_nodes])
174 | if self.args.model == "mixhop" and self.base_run:
175 | loss = loss + self.model.calculate_group_loss()
176 | elif self.args.model == "mixhop":
177 | loss = loss + self.model.calculate_loss()
178 | loss.backward()
179 | self.optimizer.step()
180 | new_accuracy = self.score(self.validation_nodes)
181 | epochs.set_description("Validation Accuracy: %g" % round(new_accuracy, 4))
182 | if new_accuracy < accuracy:
183 | no_improvement = no_improvement + 1
184 | if no_improvement == self.args.early_stopping:
185 | epochs.close()
186 | break
187 | else:
188 | no_improvement = 0
189 | accuracy = new_accuracy
190 | acc = self.score(self.test_nodes)
191 | print("\nTest accuracy: " + str(round(acc, 4)) +"\n")
192 |
193 | def score(self, indices):
194 | """
195 | Scoring a neural network.
196 | :param indices: Indices of nodes involved in accuracy calculation.
197 | :return acc: Accuracy score.
198 | """
199 | self.model.eval()
200 | _, prediction = self.model(self.propagation_matrix, self.features).max(dim=1)
201 | correct = prediction[indices].eq(self.target[indices]).sum().item()
202 | acc = correct / indices.shape[0]
203 | return acc
204 |
205 | def evaluate_architecture(self):
206 | """
207 | Selecting new layer sizes based on the column norms of the trained weight matrices.
208 | """
209 | print("The best architecture is:\n")
210 | self.layer_sizes = dict()
211 |
212 | self.layer_sizes["upper"] = []
213 |
214 | for layer in self.model.upper_layers:
215 | norms = torch.norm(layer.weight_matrix**2, dim=0)
216 | norms = norms[norms < self.args.cut_off]
217 | self.layer_sizes["upper"].append(norms.shape[0])
218 |
219 | self.layer_sizes["bottom"] = []
220 |
221 | for layer in self.model.bottom_layers:
222 | norms = torch.norm(layer.weight_matrix**2, dim=0)
223 | norms = norms[norms < self.args.cut_off]
224 | self.layer_sizes["bottom"].append(norms.shape[0])
225 |
226 | self.layer_sizes["upper"] = [int(self.args.budget*layer_size/sum(self.layer_sizes["upper"])) for layer_size in self.layer_sizes["upper"]]
227 | self.layer_sizes["bottom"] = [int(self.args.budget*layer_size/sum(self.layer_sizes["bottom"])) for layer_size in self.layer_sizes["bottom"]]
228 | print("Layer 1.: "+str(tuple(self.layer_sizes["upper"])))
229 | print("Layer 2.: "+str(tuple(self.layer_sizes["bottom"])))
230 |
231 | def reset_architecture(self):
232 | """
233 | Changing the layer sizes.
234 | """
235 | print("\nResetting the architecture.\n")
236 | self.args.layers_1 = self.layer_sizes["upper"]
237 | self.args.layers_2 = self.layer_sizes["bottom"]
238 | return self.args
239 |
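
For orientation, the minimal sketch below shows how the `Trainer` defined above can drive MixHop's two-stage architecture search: a base run regularized with the group lasso (`calculate_group_loss`), a call to `evaluate_architecture` and `reset_architecture` to pick the layer widths, and a retraining run with the plain norm penalty. This is only a sketch, not the repository's entry script: the argument names `edge_path`, `features_path` and `target_path`, and the `src.*` import paths, are assumptions about the parameter parser and package layout, which are outside this excerpt.

```python
# Sketch of the two-stage MixHop run wired from the classes above.
# Assumed (not shown in this excerpt): the argument names edge_path,
# features_path, target_path; the src.* module paths; and that `args`
# comes from the repository's argument parser.
from src.utils import graph_reader, feature_reader, target_reader
from src.models import Trainer

graph = graph_reader(args.edge_path)            # NetworkX graph from the edge-list csv
features = feature_reader(args.features_path)   # dict with "indices", "values", "dimensions"
target = target_reader(args.target_path)        # LongTensor of node labels

# Base run: the group lasso penalty shrinks entire weight columns toward zero.
trainer = Trainer(args, graph, features, target, base_run=True)
trainer.fit()

if args.model == "mixhop":
    # Count the surviving columns per layer and redistribute the width budget.
    trainer.evaluate_architecture()
    args = trainer.reset_architecture()
    # Retrain with the selected layer sizes and the plain norm penalty.
    trainer = Trainer(args, graph, features, target, base_run=False)
    trainer.fit()
```
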
--------------------------------------------------------------------------------
/src/utils.py:
--------------------------------------------------------------------------------
1 | """Data reading tools."""
2 |
3 | import json
4 | import torch
5 | import numpy as np
6 | import pandas as pd
7 | import networkx as nx
8 | from scipy import sparse
9 | from texttable import Texttable
10 |
11 | def tab_printer(args):
12 | """
13 | Function to print the model parameters in a tabular format.
14 | :param args: Parameters used for the model.
15 | """
16 | args = vars(args)
17 | keys = sorted(args.keys())
18 | t = Texttable()
19 | t.add_rows([["Parameter", "Value"]])
20 | t.add_rows([[k.replace("_", " ").capitalize(), args[k]] for k in keys])
21 | print(t.draw())
22 |
23 | def graph_reader(path):
24 | """
25 | Function to read the graph from the path.
26 | :param path: Path to the edge list.
27 | :return graph: NetworkX object returned.
28 | """
29 | graph = nx.from_edgelist(pd.read_csv(path).values.tolist())
30 | graph.remove_edges_from(list(nx.selfloop_edges(graph)))
31 | return graph
32 |
33 | def feature_reader(path):
34 | """
35 | Reading the feature matrix stored as JSON from the disk.
36 | :param path: Path to the JSON file.
37 | :return out_features: Dict with index and value tensor.
38 | """
39 | features = json.load(open(path))
40 | index_1 = [int(k) for k, v in features.items() for fet in v]
41 | index_2 = [int(fet) for k, v in features.items() for fet in v]
42 | values = [1.0]*len(index_1)
43 | nodes = [int(k) for k, v in features.items()]
44 | node_count = max(nodes)+1
45 | feature_count = max(index_2)+1
46 | features = sparse.coo_matrix((values, (index_1, index_2)),
47 | shape=(node_count, feature_count),
48 | dtype=np.float32)
49 | out_features = dict()
50 | ind = np.concatenate([features.row.reshape(-1, 1), features.col.reshape(-1, 1)], axis=1)
51 | out_features["indices"] = torch.LongTensor(ind.T)
52 | out_features["values"] = torch.FloatTensor(features.data)
53 | out_features["dimensions"] = features.shape
54 | return out_features
55 |
56 | def target_reader(path):
57 | """
58 | Reading the target vector from disk.
59 | :param path: Path to the target.
60 | :return target: Target vector.
61 | """
62 | target = torch.LongTensor(np.array(pd.read_csv(path)["target"]))
63 | return target
64 |
65 | def create_adjacency_matrix(graph):
66 | """
67 | Creating a sparse adjacency matrix.
68 | :param graph: NetworkX object.
69 | :return A: Adjacency matrix.
70 | """
71 | index_1 = [edge[0] for edge in graph.edges()] + [edge[1] for edge in graph.edges()]
72 | index_2 = [edge[1] for edge in graph.edges()] + [edge[0] for edge in graph.edges()]
73 | values = [1 for index in index_1]
74 | node_count = max(max(index_1)+1, max(index_2)+1)
75 | A = sparse.coo_matrix((values, (index_1, index_2)),
76 | shape=(node_count, node_count),
77 | dtype=np.float32)
78 | return A
79 |
80 | def normalize_adjacency_matrix(A, I):
81 | """
82 | Creating a normalized adjacency matrix with self loops.
83 | :param A: Sparse adjacency matrix.
84 | :param I: Identity matrix.
85 | :return A_tile_hat: Normalized adjacency matrix.
86 | """
87 | A_tilde = A + I
88 | degrees = A_tilde.sum(axis=0)[0].tolist()
89 | D = sparse.diags(degrees, [0])
90 | D = D.power(-0.5)
91 | A_tilde_hat = D.dot(A_tilde).dot(D)
92 | return A_tilde_hat
93 |
94 | def create_propagator_matrix(graph):
95 | """
96 | Creating a propagator matrix.
97 | :param graph: NetworkX graph.
98 | :return propagator: Dictionary of matrix indices and values.
99 | """
100 | A = create_adjacency_matrix(graph)
101 | I = sparse.eye(A.shape[0])
102 | A_tilde_hat = normalize_adjacency_matrix(A, I)
103 | propagator = dict()
104 | A_tilde_hat = sparse.coo_matrix(A_tilde_hat)
105 | ind = np.concatenate([A_tilde_hat.row.reshape(-1, 1), A_tilde_hat.col.reshape(-1, 1)], axis=1)
106 | propagator["indices"] = torch.LongTensor(ind.T)
107 | propagator["values"] = torch.FloatTensor(A_tilde_hat.data)
108 | return propagator
109 |
--------------------------------------------------------------------------------
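
As a quick illustration of the data format produced by the utilities above, the sketch below builds the propagator dictionary for a three-node path graph and applies a single propagation step with a sparse-dense multiply. The repository's layers consume the same `indices`/`values` dictionary (torch-sparse is listed in the requirements), so this only demonstrates the format and the symmetric normalization D^-1/2 (A + I) D^-1/2, not the layer implementation; the `src.utils` import path is an assumption about the package layout.

```python
# Sketch: inspect create_propagator_matrix on a toy graph.
import networkx as nx
import torch

from src.utils import create_propagator_matrix  # assumed import path

graph = nx.path_graph(3)                      # edges: 0-1, 1-2
propagator = create_propagator_matrix(graph)  # {"indices": LongTensor(2, nnz), "values": FloatTensor(nnz)}

# Rebuild the normalized adjacency matrix as a torch sparse tensor.
A_hat = torch.sparse_coo_tensor(propagator["indices"],
                                propagator["values"],
                                size=(3, 3))

features = torch.eye(3)                       # toy dense feature matrix
print(torch.sparse.mm(A_hat, features))       # one step of D^-1/2 (A + I) D^-1/2 X
```
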