├── ANRL
│   ├── README.md
│   ├── config.py
│   ├── embed
│   │   └── citeseer.embed
│   ├── evaluation.py
│   ├── graph
│   │   ├── citeseer.edgelist
│   │   ├── citeseer.feature
│   │   └── citeseer.label
│   ├── main.py
│   ├── model.py
│   ├── node2vec.py
│   ├── paper
│   │   └── ANRL.pdf
│   ├── requirements.txt
│   └── utils.py
├── DELP
│   ├── README.md
│   ├── graph
│   │   ├── cora.edgelist
│   │   ├── cora.feature
│   │   └── cora.label
│   ├── main.py
│   ├── model.py
│   ├── requirements.txt
│   └── utils.py
├── PGRR
│   ├── KDD_18_ADS_Mobile_Access_Record_Resolution_on_Large_Scale_Identifier_Linkage_Graphs.pdf
│   ├── README.md
│   ├── semi-supervised
│   │   ├── pom.xml
│   │   └── src
│   │       └── main
│   │           └── scala
│   │               └── Main.scala
│   └── unsupervised
│       ├── pom.xml
│       └── src
│           └── main
│               └── scala
│                   └── Main.scala
├── PRRE
│   ├── README.md
│   ├── __pycache__
│   │   └── classify.cpython-36.pyc
│   ├── classify.py
│   ├── classify.pyc
│   ├── graph_distance.py
│   ├── graph_distance.pyc
│   ├── prre.py
│   └── requirements.txt
└── README.md
/ANRL/README.md:
--------------------------------------------------------------------------------
1 | # ANRL
2 | ANRL: Attributed Network Representation Learning via Deep Neural Networks (IJCAI-18)
3 |
4 | This is a TensorFlow implementation of the ANRL algorithm, which learns a low-dimensional representation for each node in a network. Specifically, ANRL consists of two modules, a neighbor enhancement autoencoder and an attribute-aware skip-gram model, which jointly capture node attribute proximity and network topology proximity.
5 |
6 | ## Requirements
7 | * tensorflow
8 | * networkx
9 | * numpy
10 | * scipy
11 | * scikit-learn
12 |
13 | All required packages are listed in requirements.txt. To install them, run the following command:
14 | ```
15 | pip install -r requirements.txt
16 | ```
17 |
18 | ## Basic Usage
19 |
20 | ### Input Data
21 | For node classification, each dataset contains 3 files: an edgelist, a feature file and a label file (a parsing sketch follows the format description below).
22 | ```
23 | 1. citeseer.edgelist: each line contains two connected nodes.
24 | node_1 node_2
25 | node_2 node_3
26 | ...
27 |
28 | 2. citeseer.feature: this file has n+1 lines.
29 | The first line has the following format:
30 | node_number feature_dimension
31 | The next n lines are as follows: (each node per line ordered by node id)
32 | (for node_1) feature_1 feature_2 ... feature_n
33 | (for node_2) feature_1 feature_2 ... feature_n
34 | ...
35 |
36 | 3. citeseer.label: each line represents a node and its class label.
37 | node_1 label_1
38 | node_2 label_2
39 | ...
40 | ```
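For reference, here is a minimal, illustrative sketch of loading these three files with numpy/networkx (the repository's own loaders live in `utils.py` and may differ in details):
```
import numpy as np
import networkx as nx

def load_dataset(edge_file, feature_file, label_file):
    # edgelist: one "node_1 node_2" pair per line
    G = nx.read_edgelist(edge_file, nodetype=int)

    # feature file: header "node_number feature_dimension",
    # then one whitespace-separated feature row per node, ordered by node id
    with open(feature_file) as f:
        n, d = map(int, f.readline().split())
        X = np.loadtxt(f, dtype=np.float32)
    assert X.shape == (n, d)

    # label file: one "node_id class_label" pair per line
    y = np.zeros(n, dtype=int)
    with open(label_file) as f:
        for line in f:
            node, label = map(int, line.split())
            y[node] = label

    return G, X, y

# G, X, y = load_dataset("graph/citeseer.edgelist", "graph/citeseer.feature", "graph/citeseer.label")
```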
41 | For link prediction, each dataset contains 3 files: a training edgelist, a feature file and a test edgelist (an evaluation sketch follows the format description below).
42 | ```
43 | 1. xxx_train.edgelist: each line contains two connected nodes.
44 | node_1 node_2
45 | node_2 node_3
46 | ...
47 |
48 | 2. xxx.feature: this file has n+1 lines.
49 | The first line has the following format:
50 | node_number feature_dimension
51 | The next n lines are as follows: (each node per line ordered by node id)
52 | (for node_1) feature_1 feature_2 ... feature_n
53 | (for node_2) feature_1 feature_2 ... feature_n
54 | ...
55 |
56 | 3. xxx_test.edgelist: each line contains two connected nodes.
57 | node_1 node_2 1 (positive sample)
58 | node_2 node_3 0 (negative sample)
59 | ...
60 | ```
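The test edgelist is consumed by `link_prediction_ROC` in `evaluation.py`, which scores every candidate pair by the cosine similarity of the two node embeddings and reports ROC-AUC. A condensed sketch of that evaluation:
```
from sklearn.metrics import roc_auc_score
from sklearn.metrics.pairwise import cosine_similarity

def link_prediction_auc(test_edge_file, embeddings):
    # each line: "node_i node_j label" with label 1 (positive) / 0 (negative)
    y_true, y_score = [], []
    with open(test_edge_file) as f:
        for line in f:
            u, v, label = map(int, line.split())
            y_true.append(label)
            # score the candidate edge by the cosine similarity of its endpoint embeddings
            sim = cosine_similarity(embeddings[u].reshape(1, -1),
                                    embeddings[v].reshape(1, -1))[0, 0]
            y_score.append(sim)
    return roc_auc_score(y_true, y_score)
```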
61 |
62 | ### Output Data
63 | The output file has n+1 lines, following the same layout as the input feature file. The first line has the following format:
64 | ```
65 | node_number embedding_dimension
66 | ```
67 | The next n lines contain one node per line, its id followed by its d embedding values (a loading sketch is shown below):
68 | node_id, dim_1, dim_2, ... dim_d
69 |
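To load a saved embedding file back (e.g., for downstream evaluation), here is a minimal sketch assuming the layout above; the exact field delimiter depends on `write_embedding` in `utils.py`:
```
import numpy as np

def read_embedding(embed_file):
    with open(embed_file) as f:
        # header: "node_number embedding_dimension"
        n, d = map(int, f.readline().split())
        emb = np.zeros((n, d), dtype=np.float32)
        for line in f:
            # node id followed by its d embedding values
            parts = line.replace(",", " ").split()
            emb[int(parts[0])] = np.asarray(parts[1:], dtype=np.float32)
    return emb
```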
70 | ### Run
71 | To run ANRL on the node classification task, execute the following command (all flags can be overridden on the command line, as shown below):
72 | ```
73 | python main.py
74 | ```
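All hyper-parameters are exposed as `tf.app.flags` in `main.py` (the remaining knobs such as batch size and learning rates live in `config.py`), so other datasets can be plugged in from the command line, e.g.:
```
python main.py --inputEdgeFile graph/citeseer.edgelist \
               --inputFeatureFile graph/citeseer.feature \
               --inputLabelFile graph/citeseer.label \
               --outputEmbedFile embed/citeseer.embed \
               --dimensions 128 --feaDims 3703 \
               --num_walks 10 --walk_length 80 --window_size 10 --p 1.0 --q 1.0
```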
75 |
76 |
77 | Note:
78 | For simulating random walks, we directly use the code provided in [node2vec](https://github.com/aditya-grover/node2vec), which leverages alias sampling to facilitate the procedure (a minimal usage sketch follows).
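
A sketch of how `main.py` drives that walk generation, mirroring the upstream node2vec API (the unit edge weights are set because the alias tables are built from the `weight` attribute):
```
import networkx as nx
import node2vec

nx_G = nx.read_edgelist("graph/citeseer.edgelist", nodetype=int)
for u, v in nx_G.edges():
    nx_G[u][v]["weight"] = 1  # unweighted graph: unit edge weights

G = node2vec.Graph(nx_G, False, 1.0, 1.0)   # undirected, p = q = 1
G.preprocess_transition_probs()             # precompute alias tables for O(1) sampling
walks = G.simulate_walks(10, 80)            # 10 walks per node, each of length 80
```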
79 |
80 | ## Citing
81 | If you find ANRL useful for your research, please consider citing the following paper:
82 | ```
83 | @inproceedings{anrl-ijcai2018,
84 |   author={Zhen Zhang and Hongxia Yang and Jiajun Bu and Sheng Zhou and Pinggang Yu and Jianwei Zhang and Martin Ester and Can Wang},
85 | title={ANRL: Attributed Network Representation Learning via Deep Neural Networks},
86 | booktitle={IJCAI-18},
87 | year={2018}
88 | }
89 | ```
90 |
--------------------------------------------------------------------------------
/ANRL/config.py:
--------------------------------------------------------------------------------
1 | # Configure parameters
2 |
3 | class Config(object):
4 | def __init__(self):
5 | # hyperparameter
6 | self.struct = [None, 1000, 500, None]
7 | self.reg = 10
8 | self.alpha = 1
9 |
10 | # parameters for training
11 | self.batch_size = 512
12 | self.num_sampled = 10
13 | self.max_iters = 20000
14 | self.sg_learning_rate = 1e-4
15 | self.ae_learning_rate = 1e-4
16 |
--------------------------------------------------------------------------------
/ANRL/evaluation.py:
--------------------------------------------------------------------------------
1 | # Evaluation Metric for node classification and link prediction
2 |
3 | import numpy as np
4 | import random
5 | import math
6 | import warnings
7 | from sklearn.model_selection import train_test_split
8 | from sklearn.svm import LinearSVC
9 | from sklearn.metrics import *
10 | from sklearn.metrics.pairwise import cosine_similarity
11 |
12 |
13 | def read_label(inputFileName):
14 | f = open(inputFileName, "r")
15 | lines = f.readlines()
16 | f.close()
17 | N = len(lines)
18 | y = np.zeros(N, dtype=int)
19 | for line in lines:
20 | l = line.strip("\n\r").split(" ")
21 | y[int(l[0])] = int(l[1])
22 |
23 | return y
24 |
25 |
26 | def multiclass_node_classification_eval(X, y, ratio = 0.2, rnd = 2018):
27 | warnings.filterwarnings("ignore")
28 |
29 | X_train, X_test, y_train, y_test = train_test_split(
30 | X, y, test_size = ratio, random_state = rnd)
31 | clf = LinearSVC()
32 | clf.fit(X_train, y_train)
33 |
34 | y_pred = clf.predict(X_test)
35 |
36 | macro_f1 = f1_score(y_test, y_pred, average = "macro")
37 | micro_f1 = f1_score(y_test, y_pred, average = "micro")
38 |
39 | return macro_f1, micro_f1
40 |
41 |
42 | def link_prediction_ROC(inputFileName, Embeddings):
43 | f = open(inputFileName, "r")
44 | lines = f.readlines()
45 | f.close()
46 |
47 | X_test = []
48 |
49 | for line in lines:
50 | l = line.strip("\n\r").split(" ")
51 | X_test.append([int(l[0]), int(l[1]), int(l[2])])
52 |
53 | y_true = [X_test[i][2] for i in range(len(X_test))]
54 | y_predict = [cosine_similarity(Embeddings[X_test[i][0], :].reshape(
55 | 1, -1), Embeddings[X_test[i][1], :].reshape(1, -1))[0, 0] for i in range(len(X_test))]
56 | auc = roc_auc_score(y_true, y_predict)
57 |
58 | if auc < 0.5:
59 | auc = 1 - auc
60 |
61 | return auc
62 |
63 |
64 | def node_classification_F1(Embeddings, y):
65 | macro_f1_avg = 0
66 | micro_f1_avg = 0
67 | for i in range(10):
68 | rnd = np.random.randint(2018)
69 | macro_f1, micro_f1 = multiclass_node_classification_eval(
70 | Embeddings, y, 0.7, rnd)
71 | macro_f1_avg += macro_f1
72 | micro_f1_avg += micro_f1
73 | macro_f1_avg /= 10
74 | micro_f1_avg /= 10
75 | print "Macro_f1 average value: " + str(macro_f1_avg)
76 | print "Micro_f1 average value: " + str(micro_f1_avg)
77 |
--------------------------------------------------------------------------------
/ANRL/graph/citeseer.label:
--------------------------------------------------------------------------------
1 | 0 0
2 | 1 3
3 | 2 0
4 | 3 2
5 | 4 1
6 | 5 1
7 | 6 0
8 | 7 3
9 | 8 1
10 | 9 5
11 | 10 3
12 | 11 0
13 | 12 2
14 | 13 3
15 | 14 2
16 | 15 3
17 | 16 2
18 | 17 4
19 | 18 0
20 | 19 0
21 | 20 0
22 | 21 3
23 | 22 3
24 | 23 2
25 | 24 3
26 | 25 4
27 | 26 2
28 | 27 0
29 | 28 3
30 | 29 4
31 | 30 4
32 | 31 4
33 | 32 4
34 | 33 0
35 | 34 3
36 | 35 0
37 | 36 1
38 | 37 3
39 | 38 0
40 | 39 0
41 | 40 0
42 | 41 0
43 | 42 0
44 | 43 0
45 | 44 0
46 | 45 0
47 | 46 0
48 | 47 0
49 | 48 3
50 | 49 0
51 | 50 3
52 | 51 3
53 | 52 0
54 | 53 5
55 | 54 3
56 | 55 2
57 | 56 4
58 | 57 2
59 | 58 3
60 | 59 0
61 | 60 2
62 | 61 5
63 | 62 3
64 | 63 1
65 | 64 0
66 | 65 3
67 | 66 3
68 | 67 3
69 | 68 2
70 | 69 3
71 | 70 0
72 | 71 0
73 | 72 0
74 | 73 2
75 | 74 4
76 | 75 0
77 | 76 0
78 | 77 4
79 | 78 4
80 | 79 3
81 | 80 2
82 | 81 0
83 | 82 3
84 | 83 4
85 | 84 0
86 | 85 3
87 | 86 2
88 | 87 1
89 | 88 3
90 | 89 0
91 | 90 4
92 | 91 5
93 | 92 0
94 | 93 3
95 | 94 3
96 | 95 3
97 | 96 4
98 | 97 1
99 | 98 3
100 | 99 0
101 | 100 3
102 | 101 0
103 | 102 4
104 | 103 3
105 | 104 0
106 | 105 3
107 | 106 3
108 | 107 3
109 | 108 2
110 | 109 0
111 | 110 0
112 | 111 0
113 | 112 2
114 | 113 1
115 | 114 0
116 | 115 0
117 | 116 0
118 | 117 2
119 | 118 3
120 | 119 0
121 | 120 2
122 | 121 0
123 | 122 3
124 | 123 4
125 | 124 1
126 | 125 0
127 | 126 1
128 | 127 3
129 | 128 2
130 | 129 5
131 | 130 4
132 | 131 0
133 | 132 0
134 | 133 0
135 | 134 1
136 | 135 5
137 | 136 0
138 | 137 0
139 | 138 0
140 | 139 0
141 | 140 1
142 | 141 0
143 | 142 0
144 | 143 5
145 | 144 0
146 | 145 0
147 | 146 0
148 | 147 3
149 | 148 0
150 | 149 2
151 | 150 2
152 | 151 3
153 | 152 3
154 | 153 3
155 | 154 3
156 | 155 3
157 | 156 3
158 | 157 3
159 | 158 0
160 | 159 4
161 | 160 0
162 | 161 5
163 | 162 4
164 | 163 0
165 | 164 2
166 | 165 0
167 | 166 0
168 | 167 3
169 | 168 2
170 | 169 5
171 | 170 0
172 | 171 0
173 | 172 4
174 | 173 1
175 | 174 0
176 | 175 4
177 | 176 3
178 | 177 0
179 | 178 3
180 | 179 4
181 | 180 3
182 | 181 0
183 | 182 3
184 | 183 4
185 | 184 3
186 | 185 0
187 | 186 0
188 | 187 0
189 | 188 0
190 | 189 3
191 | 190 0
192 | 191 4
193 | 192 0
194 | 193 0
195 | 194 3
196 | 195 5
197 | 196 0
198 | 197 0
199 | 198 0
200 | 199 0
201 | 200 3
202 | 201 1
203 | 202 0
204 | 203 0
205 | 204 0
206 | 205 3
207 | 206 3
208 | 207 5
209 | 208 0
210 | 209 0
211 | 210 5
212 | 211 0
213 | 212 3
214 | 213 3
215 | 214 0
216 | 215 4
217 | 216 3
218 | 217 1
219 | 218 4
220 | 219 3
221 | 220 3
222 | 221 3
223 | 222 3
224 | 223 3
225 | 224 3
226 | 225 5
227 | 226 0
228 | 227 3
229 | 228 3
230 | 229 0
231 | 230 0
232 | 231 3
233 | 232 3
234 | 233 3
235 | 234 3
236 | 235 3
237 | 236 3
238 | 237 0
239 | 238 2
240 | 239 0
241 | 240 0
242 | 241 0
243 | 242 0
244 | 243 4
245 | 244 3
246 | 245 0
247 | 246 0
248 | 247 0
249 | 248 3
250 | 249 5
251 | 250 0
252 | 251 3
253 | 252 0
254 | 253 3
255 | 254 3
256 | 255 0
257 | 256 3
258 | 257 3
259 | 258 0
260 | 259 3
261 | 260 0
262 | 261 3
263 | 262 4
264 | 263 1
265 | 264 0
266 | 265 3
267 | 266 4
268 | 267 3
269 | 268 4
270 | 269 1
271 | 270 1
272 | 271 0
273 | 272 4
274 | 273 1
275 | 274 4
276 | 275 3
277 | 276 3
278 | 277 0
279 | 278 4
280 | 279 5
281 | 280 1
282 | 281 0
283 | 282 0
284 | 283 1
285 | 284 4
286 | 285 4
287 | 286 4
288 | 287 0
289 | 288 1
290 | 289 1
291 | 290 1
292 | 291 3
293 | 292 3
294 | 293 5
295 | 294 1
296 | 295 3
297 | 296 3
298 | 297 4
299 | 298 1
300 | 299 3
301 | 300 5
302 | 301 2
303 | 302 3
304 | 303 3
305 | 304 3
306 | 305 4
307 | 306 0
308 | 307 0
309 | 308 2
310 | 309 5
311 | 310 3
312 | 311 2
313 | 312 0
314 | 313 0
315 | 314 0
316 | 315 0
317 | 316 0
318 | 317 3
319 | 318 3
320 | 319 0
321 | 320 0
322 | 321 3
323 | 322 3
324 | 323 0
325 | 324 2
326 | 325 2
327 | 326 2
328 | 327 2
329 | 328 1
330 | 329 1
331 | 330 1
332 | 331 2
333 | 332 3
334 | 333 0
335 | 334 0
336 | 335 3
337 | 336 3
338 | 337 0
339 | 338 4
340 | 339 0
341 | 340 0
342 | 341 0
343 | 342 3
344 | 343 5
345 | 344 0
346 | 345 0
347 | 346 0
348 | 347 0
349 | 348 0
350 | 349 3
351 | 350 0
352 | 351 0
353 | 352 2
354 | 353 2
355 | 354 2
356 | 355 3
357 | 356 3
358 | 357 3
359 | 358 5
360 | 359 5
361 | 360 0
362 | 361 0
363 | 362 2
364 | 363 3
365 | 364 3
366 | 365 3
367 | 366 3
368 | 367 0
369 | 368 0
370 | 369 0
371 | 370 5
372 | 371 2
373 | 372 3
374 | 373 0
375 | 374 3
376 | 375 2
377 | 376 2
378 | 377 2
379 | 378 2
380 | 379 0
381 | 380 4
382 | 381 5
383 | 382 3
384 | 383 2
385 | 384 2
386 | 385 0
387 | 386 0
388 | 387 0
389 | 388 0
390 | 389 0
391 | 390 3
392 | 391 3
393 | 392 3
394 | 393 0
395 | 394 0
396 | 395 1
397 | 396 5
398 | 397 5
399 | 398 5
400 | 399 0
401 | 400 0
402 | 401 0
403 | 402 0
404 | 403 0
405 | 404 0
406 | 405 0
407 | 406 3
408 | 407 4
409 | 408 3
410 | 409 0
411 | 410 0
412 | 411 0
413 | 412 0
414 | 413 5
415 | 414 3
416 | 415 5
417 | 416 0
418 | 417 0
419 | 418 0
420 | 419 3
421 | 420 3
422 | 421 2
423 | 422 3
424 | 423 2
425 | 424 0
426 | 425 0
427 | 426 0
428 | 427 3
429 | 428 3
430 | 429 4
431 | 430 4
432 | 431 1
433 | 432 5
434 | 433 5
435 | 434 0
436 | 435 0
437 | 436 0
438 | 437 3
439 | 438 3
440 | 439 3
441 | 440 3
442 | 441 2
443 | 442 3
444 | 443 2
445 | 444 3
446 | 445 3
447 | 446 4
448 | 447 3
449 | 448 0
450 | 449 2
451 | 450 2
452 | 451 3
453 | 452 4
454 | 453 0
455 | 454 0
456 | 455 3
457 | 456 3
458 | 457 4
459 | 458 4
460 | 459 3
461 | 460 3
462 | 461 2
463 | 462 3
464 | 463 3
465 | 464 0
466 | 465 5
467 | 466 5
468 | 467 0
469 | 468 3
470 | 469 3
471 | 470 3
472 | 471 3
473 | 472 0
474 | 473 3
475 | 474 4
476 | 475 3
477 | 476 3
478 | 477 3
479 | 478 0
480 | 479 4
481 | 480 0
482 | 481 3
483 | 482 3
484 | 483 3
485 | 484 0
486 | 485 3
487 | 486 0
488 | 487 0
489 | 488 0
490 | 489 0
491 | 490 4
492 | 491 5
493 | 492 0
494 | 493 0
495 | 494 0
496 | 495 0
497 | 496 0
498 | 497 0
499 | 498 4
500 | 499 0
501 | 500 0
502 | 501 3
503 | 502 3
504 | 503 3
505 | 504 0
506 | 505 4
507 | 506 0
508 | 507 3
509 | 508 3
510 | 509 0
511 | 510 3
512 | 511 0
513 | 512 2
514 | 513 2
515 | 514 0
516 | 515 0
517 | 516 3
518 | 517 3
519 | 518 2
520 | 519 0
521 | 520 0
522 | 521 3
523 | 522 3
524 | 523 3
525 | 524 2
526 | 525 3
527 | 526 0
528 | 527 0
529 | 528 1
530 | 529 0
531 | 530 0
532 | 531 0
533 | 532 0
534 | 533 4
535 | 534 4
536 | 535 3
537 | 536 0
538 | 537 4
539 | 538 3
540 | 539 3
541 | 540 1
542 | 541 3
543 | 542 0
544 | 543 5
545 | 544 1
546 | 545 2
547 | 546 0
548 | 547 3
549 | 548 0
550 | 549 3
551 | 550 3
552 | 551 3
553 | 552 2
554 | 553 4
555 | 554 3
556 | 555 3
557 | 556 2
558 | 557 3
559 | 558 3
560 | 559 5
561 | 560 3
562 | 561 4
563 | 562 4
564 | 563 5
565 | 564 0
566 | 565 5
567 | 566 3
568 | 567 4
569 | 568 0
570 | 569 0
571 | 570 0
572 | 571 3
573 | 572 3
574 | 573 3
575 | 574 3
576 | 575 3
577 | 576 3
578 | 577 1
579 | 578 2
580 | 579 0
581 | 580 3
582 | 581 3
583 | 582 5
584 | 583 0
585 | 584 0
586 | 585 0
587 | 586 3
588 | 587 0
589 | 588 2
590 | 589 0
591 | 590 3
592 | 591 1
593 | 592 3
594 | 593 0
595 | 594 0
596 | 595 0
597 | 596 0
598 | 597 3
599 | 598 3
600 | 599 3
601 | 600 0
602 | 601 4
603 | 602 3
604 | 603 0
605 | 604 0
606 | 605 0
607 | 606 0
608 | 607 0
609 | 608 3
610 | 609 3
611 | 610 5
612 | 611 0
613 | 612 0
614 | 613 0
615 | 614 1
616 | 615 1
617 | 616 3
618 | 617 4
619 | 618 0
620 | 619 3
621 | 620 0
622 | 621 0
623 | 622 0
624 | 623 0
625 | 624 0
626 | 625 0
627 | 626 2
628 | 627 1
629 | 628 5
630 | 629 2
631 | 630 0
632 | 631 0
633 | 632 0
634 | 633 3
635 | 634 3
636 | 635 5
637 | 636 3
638 | 637 2
639 | 638 2
640 | 639 3
641 | 640 3
642 | 641 0
643 | 642 0
644 | 643 0
645 | 644 1
646 | 645 2
647 | 646 3
648 | 647 0
649 | 648 3
650 | 649 0
651 | 650 0
652 | 651 0
653 | 652 1
654 | 653 5
655 | 654 0
656 | 655 3
657 | 656 0
658 | 657 5
659 | 658 3
660 | 659 0
661 | 660 0
662 | 661 0
663 | 662 5
664 | 663 0
665 | 664 4
666 | 665 3
667 | 666 0
668 | 667 0
669 | 668 3
670 | 669 3
671 | 670 3
672 | 671 0
673 | 672 4
674 | 673 3
675 | 674 0
676 | 675 0
677 | 676 0
678 | 677 2
679 | 678 0
680 | 679 2
681 | 680 3
682 | 681 5
683 | 682 0
684 | 683 3
685 | 684 3
686 | 685 3
687 | 686 3
688 | 687 3
689 | 688 3
690 | 689 4
691 | 690 0
692 | 691 0
693 | 692 0
694 | 693 4
695 | 694 3
696 | 695 5
697 | 696 1
698 | 697 3
699 | 698 3
700 | 699 3
701 | 700 3
702 | 701 3
703 | 702 2
704 | 703 0
705 | 704 5
706 | 705 5
707 | 706 3
708 | 707 3
709 | 708 3
710 | 709 2
711 | 710 2
712 | 711 3
713 | 712 2
714 | 713 4
715 | 714 3
716 | 715 5
717 | 716 5
718 | 717 3
719 | 718 0
720 | 719 0
721 | 720 0
722 | 721 0
723 | 722 0
724 | 723 1
725 | 724 1
726 | 725 3
727 | 726 0
728 | 727 3
729 | 728 3
730 | 729 0
731 | 730 0
732 | 731 0
733 | 732 4
734 | 733 0
735 | 734 0
736 | 735 0
737 | 736 0
738 | 737 0
739 | 738 3
740 | 739 1
741 | 740 4
742 | 741 0
743 | 742 3
744 | 743 2
745 | 744 3
746 | 745 3
747 | 746 3
748 | 747 3
749 | 748 3
750 | 749 3
751 | 750 0
752 | 751 0
753 | 752 0
754 | 753 0
755 | 754 4
756 | 755 3
757 | 756 2
758 | 757 0
759 | 758 0
760 | 759 0
761 | 760 0
762 | 761 0
763 | 762 2
764 | 763 3
765 | 764 0
766 | 765 2
767 | 766 5
768 | 767 0
769 | 768 3
770 | 769 4
771 | 770 4
772 | 771 3
773 | 772 3
774 | 773 3
775 | 774 2
776 | 775 0
777 | 776 2
778 | 777 3
779 | 778 0
780 | 779 3
781 | 780 2
782 | 781 2
783 | 782 4
784 | 783 2
785 | 784 2
786 | 785 0
787 | 786 0
788 | 787 0
789 | 788 0
790 | 789 1
791 | 790 0
792 | 791 3
793 | 792 0
794 | 793 4
795 | 794 0
796 | 795 5
797 | 796 0
798 | 797 3
799 | 798 4
800 | 799 3
801 | 800 3
802 | 801 4
803 | 802 4
804 | 803 3
805 | 804 3
806 | 805 3
807 | 806 0
808 | 807 0
809 | 808 0
810 | 809 0
811 | 810 5
812 | 811 0
813 | 812 3
814 | 813 4
815 | 814 0
816 | 815 0
817 | 816 0
818 | 817 0
819 | 818 5
820 | 819 3
821 | 820 5
822 | 821 0
823 | 822 0
824 | 823 4
825 | 824 0
826 | 825 3
827 | 826 1
828 | 827 3
829 | 828 3
830 | 829 3
831 | 830 2
832 | 831 4
833 | 832 4
834 | 833 4
835 | 834 0
836 | 835 0
837 | 836 0
838 | 837 0
839 | 838 1
840 | 839 3
841 | 840 0
842 | 841 3
843 | 842 3
844 | 843 0
845 | 844 3
846 | 845 2
847 | 846 0
848 | 847 0
849 | 848 0
850 | 849 0
851 | 850 0
852 | 851 5
853 | 852 5
854 | 853 5
855 | 854 0
856 | 855 0
857 | 856 1
858 | 857 5
859 | 858 5
860 | 859 4
861 | 860 0
862 | 861 0
863 | 862 4
864 | 863 0
865 | 864 0
866 | 865 0
867 | 866 0
868 | 867 2
869 | 868 0
870 | 869 0
871 | 870 5
872 | 871 3
873 | 872 0
874 | 873 3
875 | 874 0
876 | 875 0
877 | 876 3
878 | 877 0
879 | 878 2
880 | 879 0
881 | 880 2
882 | 881 4
883 | 882 3
884 | 883 3
885 | 884 4
886 | 885 0
887 | 886 4
888 | 887 4
889 | 888 3
890 | 889 1
891 | 890 1
892 | 891 4
893 | 892 0
894 | 893 0
895 | 894 2
896 | 895 4
897 | 896 2
898 | 897 4
899 | 898 0
900 | 899 3
901 | 900 0
902 | 901 3
903 | 902 3
904 | 903 3
905 | 904 3
906 | 905 3
907 | 906 3
908 | 907 0
909 | 908 5
910 | 909 4
911 | 910 3
912 | 911 3
913 | 912 1
914 | 913 4
915 | 914 3
916 | 915 5
917 | 916 1
918 | 917 2
919 | 918 3
920 | 919 0
921 | 920 3
922 | 921 3
923 | 922 3
924 | 923 0
925 | 924 4
926 | 925 0
927 | 926 5
928 | 927 5
929 | 928 0
930 | 929 3
931 | 930 0
932 | 931 0
933 | 932 0
934 | 933 0
935 | 934 3
936 | 935 0
937 | 936 0
938 | 937 0
939 | 938 0
940 | 939 0
941 | 940 1
942 | 941 0
943 | 942 0
944 | 943 0
945 | 944 5
946 | 945 0
947 | 946 3
948 | 947 2
949 | 948 1
950 | 949 1
951 | 950 0
952 | 951 0
953 | 952 3
954 | 953 3
955 | 954 0
956 | 955 0
957 | 956 3
958 | 957 0
959 | 958 3
960 | 959 0
961 | 960 2
962 | 961 5
963 | 962 0
964 | 963 4
965 | 964 0
966 | 965 0
967 | 966 1
968 | 967 3
969 | 968 5
970 | 969 5
971 | 970 0
972 | 971 2
973 | 972 2
974 | 973 0
975 | 974 3
976 | 975 0
977 | 976 3
978 | 977 3
979 | 978 2
980 | 979 0
981 | 980 0
982 | 981 0
983 | 982 0
984 | 983 0
985 | 984 0
986 | 985 1
987 | 986 0
988 | 987 3
989 | 988 0
990 | 989 0
991 | 990 0
992 | 991 4
993 | 992 3
994 | 993 4
995 | 994 0
996 | 995 0
997 | 996 0
998 | 997 4
999 | 998 3
1000 | 999 3
1001 | 1000 3
1002 | 1001 2
1003 | 1002 0
1004 | 1003 0
1005 | 1004 3
1006 | 1005 3
1007 | 1006 3
1008 | 1007 3
1009 | 1008 3
1010 | 1009 0
1011 | 1010 0
1012 | 1011 4
1013 | 1012 3
1014 | 1013 2
1015 | 1014 4
1016 | 1015 2
1017 | 1016 2
1018 | 1017 1
1019 | 1018 4
1020 | 1019 4
1021 | 1020 0
1022 | 1021 2
1023 | 1022 3
1024 | 1023 2
1025 | 1024 0
1026 | 1025 1
1027 | 1026 4
1028 | 1027 4
1029 | 1028 2
1030 | 1029 2
1031 | 1030 1
1032 | 1031 5
1033 | 1032 0
1034 | 1033 4
1035 | 1034 4
1036 | 1035 2
1037 | 1036 3
1038 | 1037 2
1039 | 1038 2
1040 | 1039 4
1041 | 1040 2
1042 | 1041 2
1043 | 1042 2
1044 | 1043 2
1045 | 1044 1
1046 | 1045 5
1047 | 1046 2
1048 | 1047 2
1049 | 1048 2
1050 | 1049 2
1051 | 1050 4
1052 | 1051 5
1053 | 1052 4
1054 | 1053 4
1055 | 1054 2
1056 | 1055 4
1057 | 1056 2
1058 | 1057 4
1059 | 1058 2
1060 | 1059 4
1061 | 1060 2
1062 | 1061 3
1063 | 1062 3
1064 | 1063 4
1065 | 1064 5
1066 | 1065 0
1067 | 1066 4
1068 | 1067 2
1069 | 1068 2
1070 | 1069 0
1071 | 1070 2
1072 | 1071 4
1073 | 1072 4
1074 | 1073 3
1075 | 1074 1
1076 | 1075 3
1077 | 1076 2
1078 | 1077 0
1079 | 1078 2
1080 | 1079 0
1081 | 1080 4
1082 | 1081 2
1083 | 1082 4
1084 | 1083 2
1085 | 1084 0
1086 | 1085 4
1087 | 1086 2
1088 | 1087 0
1089 | 1088 2
1090 | 1089 2
1091 | 1090 2
1092 | 1091 4
1093 | 1092 0
1094 | 1093 4
1095 | 1094 5
1096 | 1095 4
1097 | 1096 5
1098 | 1097 5
1099 | 1098 2
1100 | 1099 4
1101 | 1100 2
1102 | 1101 2
1103 | 1102 2
1104 | 1103 0
1105 | 1104 1
1106 | 1105 2
1107 | 1106 2
1108 | 1107 2
1109 | 1108 5
1110 | 1109 2
1111 | 1110 4
1112 | 1111 4
1113 | 1112 1
1114 | 1113 5
1115 | 1114 3
1116 | 1115 3
1117 | 1116 2
1118 | 1117 2
1119 | 1118 4
1120 | 1119 2
1121 | 1120 1
1122 | 1121 5
1123 | 1122 2
1124 | 1123 4
1125 | 1124 5
1126 | 1125 2
1127 | 1126 0
1128 | 1127 2
1129 | 1128 5
1130 | 1129 4
1131 | 1130 4
1132 | 1131 3
1133 | 1132 2
1134 | 1133 2
1135 | 1134 1
1136 | 1135 2
1137 | 1136 2
1138 | 1137 4
1139 | 1138 2
1140 | 1139 4
1141 | 1140 4
1142 | 1141 2
1143 | 1142 2
1144 | 1143 4
1145 | 1144 3
1146 | 1145 4
1147 | 1146 2
1148 | 1147 4
1149 | 1148 2
1150 | 1149 5
1151 | 1150 2
1152 | 1151 5
1153 | 1152 2
1154 | 1153 4
1155 | 1154 1
1156 | 1155 3
1157 | 1156 4
1158 | 1157 4
1159 | 1158 5
1160 | 1159 2
1161 | 1160 1
1162 | 1161 2
1163 | 1162 4
1164 | 1163 2
1165 | 1164 4
1166 | 1165 3
1167 | 1166 4
1168 | 1167 3
1169 | 1168 4
1170 | 1169 3
1171 | 1170 2
1172 | 1171 4
1173 | 1172 4
1174 | 1173 2
1175 | 1174 2
1176 | 1175 4
1177 | 1176 2
1178 | 1177 3
1179 | 1178 5
1180 | 1179 4
1181 | 1180 2
1182 | 1181 1
1183 | 1182 5
1184 | 1183 5
1185 | 1184 3
1186 | 1185 5
1187 | 1186 5
1188 | 1187 4
1189 | 1188 5
1190 | 1189 4
1191 | 1190 2
1192 | 1191 3
1193 | 1192 2
1194 | 1193 1
1195 | 1194 4
1196 | 1195 4
1197 | 1196 2
1198 | 1197 3
1199 | 1198 5
1200 | 1199 3
1201 | 1200 2
1202 | 1201 5
1203 | 1202 5
1204 | 1203 3
1205 | 1204 4
1206 | 1205 0
1207 | 1206 2
1208 | 1207 3
1209 | 1208 4
1210 | 1209 3
1211 | 1210 2
1212 | 1211 0
1213 | 1212 3
1214 | 1213 1
1215 | 1214 5
1216 | 1215 5
1217 | 1216 0
1218 | 1217 5
1219 | 1218 5
1220 | 1219 3
1221 | 1220 1
1222 | 1221 5
1223 | 1222 3
1224 | 1223 5
1225 | 1224 5
1226 | 1225 2
1227 | 1226 3
1228 | 1227 4
1229 | 1228 2
1230 | 1229 3
1231 | 1230 3
1232 | 1231 3
1233 | 1232 5
1234 | 1233 0
1235 | 1234 3
1236 | 1235 5
1237 | 1236 2
1238 | 1237 1
1239 | 1238 3
1240 | 1239 3
1241 | 1240 0
1242 | 1241 3
1243 | 1242 2
1244 | 1243 5
1245 | 1244 3
1246 | 1245 4
1247 | 1246 2
1248 | 1247 3
1249 | 1248 2
1250 | 1249 3
1251 | 1250 2
1252 | 1251 2
1253 | 1252 5
1254 | 1253 2
1255 | 1254 4
1256 | 1255 2
1257 | 1256 2
1258 | 1257 3
1259 | 1258 5
1260 | 1259 5
1261 | 1260 1
1262 | 1261 2
1263 | 1262 2
1264 | 1263 5
1265 | 1264 3
1266 | 1265 5
1267 | 1266 5
1268 | 1267 3
1269 | 1268 2
1270 | 1269 4
1271 | 1270 5
1272 | 1271 4
1273 | 1272 3
1274 | 1273 3
1275 | 1274 4
1276 | 1275 2
1277 | 1276 5
1278 | 1277 4
1279 | 1278 4
1280 | 1279 3
1281 | 1280 5
1282 | 1281 3
1283 | 1282 3
1284 | 1283 3
1285 | 1284 3
1286 | 1285 5
1287 | 1286 2
1288 | 1287 0
1289 | 1288 4
1290 | 1289 2
1291 | 1290 3
1292 | 1291 5
1293 | 1292 5
1294 | 1293 5
1295 | 1294 2
1296 | 1295 2
1297 | 1296 4
1298 | 1297 1
1299 | 1298 5
1300 | 1299 5
1301 | 1300 5
1302 | 1301 5
1303 | 1302 2
1304 | 1303 5
1305 | 1304 4
1306 | 1305 3
1307 | 1306 5
1308 | 1307 2
1309 | 1308 5
1310 | 1309 2
1311 | 1310 4
1312 | 1311 2
1313 | 1312 4
1314 | 1313 3
1315 | 1314 4
1316 | 1315 4
1317 | 1316 4
1318 | 1317 5
1319 | 1318 4
1320 | 1319 4
1321 | 1320 2
1322 | 1321 2
1323 | 1322 3
1324 | 1323 4
1325 | 1324 5
1326 | 1325 5
1327 | 1326 3
1328 | 1327 4
1329 | 1328 0
1330 | 1329 4
1331 | 1330 3
1332 | 1331 3
1333 | 1332 5
1334 | 1333 5
1335 | 1334 3
1336 | 1335 3
1337 | 1336 3
1338 | 1337 5
1339 | 1338 5
1340 | 1339 3
1341 | 1340 5
1342 | 1341 5
1343 | 1342 4
1344 | 1343 0
1345 | 1344 5
1346 | 1345 3
1347 | 1346 5
1348 | 1347 2
1349 | 1348 1
1350 | 1349 3
1351 | 1350 4
1352 | 1351 3
1353 | 1352 2
1354 | 1353 1
1355 | 1354 4
1356 | 1355 1
1357 | 1356 1
1358 | 1357 4
1359 | 1358 1
1360 | 1359 4
1361 | 1360 4
1362 | 1361 5
1363 | 1362 2
1364 | 1363 4
1365 | 1364 4
1366 | 1365 4
1367 | 1366 4
1368 | 1367 4
1369 | 1368 1
1370 | 1369 4
1371 | 1370 4
1372 | 1371 2
1373 | 1372 4
1374 | 1373 2
1375 | 1374 3
1376 | 1375 4
1377 | 1376 4
1378 | 1377 5
1379 | 1378 4
1380 | 1379 1
1381 | 1380 4
1382 | 1381 3
1383 | 1382 5
1384 | 1383 3
1385 | 1384 1
1386 | 1385 4
1387 | 1386 1
1388 | 1387 3
1389 | 1388 0
1390 | 1389 2
1391 | 1390 2
1392 | 1391 2
1393 | 1392 2
1394 | 1393 2
1395 | 1394 2
1396 | 1395 3
1397 | 1396 2
1398 | 1397 3
1399 | 1398 4
1400 | 1399 4
1401 | 1400 1
1402 | 1401 2
1403 | 1402 2
1404 | 1403 2
1405 | 1404 2
1406 | 1405 2
1407 | 1406 4
1408 | 1407 4
1409 | 1408 4
1410 | 1409 4
1411 | 1410 3
1412 | 1411 2
1413 | 1412 1
1414 | 1413 0
1415 | 1414 1
1416 | 1415 1
1417 | 1416 2
1418 | 1417 2
1419 | 1418 5
1420 | 1419 3
1421 | 1420 0
1422 | 1421 5
1423 | 1422 5
1424 | 1423 5
1425 | 1424 3
1426 | 1425 2
1427 | 1426 2
1428 | 1427 2
1429 | 1428 4
1430 | 1429 4
1431 | 1430 3
1432 | 1431 1
1433 | 1432 5
1434 | 1433 5
1435 | 1434 5
1436 | 1435 5
1437 | 1436 5
1438 | 1437 0
1439 | 1438 4
1440 | 1439 2
1441 | 1440 2
1442 | 1441 4
1443 | 1442 2
1444 | 1443 1
1445 | 1444 2
1446 | 1445 2
1447 | 1446 5
1448 | 1447 5
1449 | 1448 5
1450 | 1449 2
1451 | 1450 2
1452 | 1451 4
1453 | 1452 5
1454 | 1453 5
1455 | 1454 3
1456 | 1455 5
1457 | 1456 5
1458 | 1457 5
1459 | 1458 4
1460 | 1459 1
1461 | 1460 3
1462 | 1461 5
1463 | 1462 1
1464 | 1463 5
1465 | 1464 3
1466 | 1465 5
1467 | 1466 2
1468 | 1467 3
1469 | 1468 2
1470 | 1469 5
1471 | 1470 5
1472 | 1471 3
1473 | 1472 2
1474 | 1473 2
1475 | 1474 3
1476 | 1475 2
1477 | 1476 1
1478 | 1477 0
1479 | 1478 2
1480 | 1479 2
1481 | 1480 2
1482 | 1481 2
1483 | 1482 5
1484 | 1483 4
1485 | 1484 5
1486 | 1485 2
1487 | 1486 4
1488 | 1487 5
1489 | 1488 5
1490 | 1489 1
1491 | 1490 5
1492 | 1491 2
1493 | 1492 4
1494 | 1493 4
1495 | 1494 2
1496 | 1495 4
1497 | 1496 0
1498 | 1497 2
1499 | 1498 2
1500 | 1499 2
1501 | 1500 2
1502 | 1501 5
1503 | 1502 5
1504 | 1503 5
1505 | 1504 1
1506 | 1505 5
1507 | 1506 2
1508 | 1507 2
1509 | 1508 2
1510 | 1509 2
1511 | 1510 4
1512 | 1511 3
1513 | 1512 4
1514 | 1513 4
1515 | 1514 1
1516 | 1515 3
1517 | 1516 4
1518 | 1517 3
1519 | 1518 5
1520 | 1519 5
1521 | 1520 0
1522 | 1521 3
1523 | 1522 4
1524 | 1523 4
1525 | 1524 4
1526 | 1525 2
1527 | 1526 3
1528 | 1527 0
1529 | 1528 2
1530 | 1529 2
1531 | 1530 5
1532 | 1531 2
1533 | 1532 2
1534 | 1533 2
1535 | 1534 1
1536 | 1535 1
1537 | 1536 2
1538 | 1537 2
1539 | 1538 2
1540 | 1539 2
1541 | 1540 5
1542 | 1541 5
1543 | 1542 5
1544 | 1543 3
1545 | 1544 2
1546 | 1545 2
1547 | 1546 2
1548 | 1547 2
1549 | 1548 4
1550 | 1549 5
1551 | 1550 5
1552 | 1551 2
1553 | 1552 5
1554 | 1553 3
1555 | 1554 5
1556 | 1555 2
1557 | 1556 2
1558 | 1557 4
1559 | 1558 4
1560 | 1559 1
1561 | 1560 1
1562 | 1561 4
1563 | 1562 2
1564 | 1563 4
1565 | 1564 1
1566 | 1565 4
1567 | 1566 0
1568 | 1567 0
1569 | 1568 3
1570 | 1569 4
1571 | 1570 2
1572 | 1571 5
1573 | 1572 5
1574 | 1573 2
1575 | 1574 4
1576 | 1575 5
1577 | 1576 0
1578 | 1577 1
1579 | 1578 5
1580 | 1579 2
1581 | 1580 5
1582 | 1581 4
1583 | 1582 5
1584 | 1583 4
1585 | 1584 4
1586 | 1585 4
1587 | 1586 3
1588 | 1587 3
1589 | 1588 3
1590 | 1589 5
1591 | 1590 5
1592 | 1591 4
1593 | 1592 3
1594 | 1593 3
1595 | 1594 4
1596 | 1595 4
1597 | 1596 4
1598 | 1597 3
1599 | 1598 3
1600 | 1599 2
1601 | 1600 2
1602 | 1601 2
1603 | 1602 2
1604 | 1603 5
1605 | 1604 2
1606 | 1605 2
1607 | 1606 2
1608 | 1607 2
1609 | 1608 2
1610 | 1609 5
1611 | 1610 4
1612 | 1611 2
1613 | 1612 5
1614 | 1613 2
1615 | 1614 2
1616 | 1615 2
1617 | 1616 2
1618 | 1617 5
1619 | 1618 5
1620 | 1619 4
1621 | 1620 2
1622 | 1621 0
1623 | 1622 2
1624 | 1623 2
1625 | 1624 4
1626 | 1625 4
1627 | 1626 5
1628 | 1627 4
1629 | 1628 3
1630 | 1629 4
1631 | 1630 2
1632 | 1631 4
1633 | 1632 2
1634 | 1633 2
1635 | 1634 2
1636 | 1635 3
1637 | 1636 2
1638 | 1637 4
1639 | 1638 1
1640 | 1639 2
1641 | 1640 4
1642 | 1641 4
1643 | 1642 3
1644 | 1643 4
1645 | 1644 3
1646 | 1645 5
1647 | 1646 2
1648 | 1647 2
1649 | 1648 5
1650 | 1649 3
1651 | 1650 3
1652 | 1651 4
1653 | 1652 0
1654 | 1653 4
1655 | 1654 1
1656 | 1655 5
1657 | 1656 5
1658 | 1657 4
1659 | 1658 4
1660 | 1659 2
1661 | 1660 3
1662 | 1661 2
1663 | 1662 3
1664 | 1663 4
1665 | 1664 3
1666 | 1665 4
1667 | 1666 4
1668 | 1667 3
1669 | 1668 3
1670 | 1669 5
1671 | 1670 5
1672 | 1671 5
1673 | 1672 2
1674 | 1673 2
1675 | 1674 5
1676 | 1675 0
1677 | 1676 5
1678 | 1677 3
1679 | 1678 3
1680 | 1679 2
1681 | 1680 4
1682 | 1681 4
1683 | 1682 2
1684 | 1683 5
1685 | 1684 2
1686 | 1685 2
1687 | 1686 4
1688 | 1687 2
1689 | 1688 2
1690 | 1689 4
1691 | 1690 2
1692 | 1691 4
1693 | 1692 1
1694 | 1693 2
1695 | 1694 1
1696 | 1695 2
1697 | 1696 4
1698 | 1697 4
1699 | 1698 5
1700 | 1699 3
1701 | 1700 4
1702 | 1701 2
1703 | 1702 5
1704 | 1703 5
1705 | 1704 5
1706 | 1705 5
1707 | 1706 5
1708 | 1707 2
1709 | 1708 5
1710 | 1709 5
1711 | 1710 4
1712 | 1711 5
1713 | 1712 5
1714 | 1713 5
1715 | 1714 3
1716 | 1715 4
1717 | 1716 4
1718 | 1717 4
1719 | 1718 3
1720 | 1719 5
1721 | 1720 2
1722 | 1721 4
1723 | 1722 0
1724 | 1723 2
1725 | 1724 3
1726 | 1725 2
1727 | 1726 4
1728 | 1727 4
1729 | 1728 4
1730 | 1729 1
1731 | 1730 2
1732 | 1731 4
1733 | 1732 4
1734 | 1733 4
1735 | 1734 4
1736 | 1735 4
1737 | 1736 4
1738 | 1737 4
1739 | 1738 4
1740 | 1739 5
1741 | 1740 3
1742 | 1741 3
1743 | 1742 4
1744 | 1743 4
1745 | 1744 4
1746 | 1745 4
1747 | 1746 1
1748 | 1747 2
1749 | 1748 3
1750 | 1749 5
1751 | 1750 2
1752 | 1751 2
1753 | 1752 3
1754 | 1753 3
1755 | 1754 1
1756 | 1755 5
1757 | 1756 2
1758 | 1757 1
1759 | 1758 0
1760 | 1759 4
1761 | 1760 5
1762 | 1761 3
1763 | 1762 2
1764 | 1763 2
1765 | 1764 5
1766 | 1765 5
1767 | 1766 3
1768 | 1767 4
1769 | 1768 4
1770 | 1769 2
1771 | 1770 0
1772 | 1771 0
1773 | 1772 5
1774 | 1773 2
1775 | 1774 5
1776 | 1775 2
1777 | 1776 2
1778 | 1777 5
1779 | 1778 4
1780 | 1779 3
1781 | 1780 4
1782 | 1781 3
1783 | 1782 0
1784 | 1783 3
1785 | 1784 5
1786 | 1785 5
1787 | 1786 2
1788 | 1787 4
1789 | 1788 3
1790 | 1789 4
1791 | 1790 2
1792 | 1791 1
1793 | 1792 5
1794 | 1793 5
1795 | 1794 5
1796 | 1795 4
1797 | 1796 3
1798 | 1797 4
1799 | 1798 2
1800 | 1799 3
1801 | 1800 1
1802 | 1801 2
1803 | 1802 2
1804 | 1803 4
1805 | 1804 2
1806 | 1805 5
1807 | 1806 3
1808 | 1807 3
1809 | 1808 2
1810 | 1809 2
1811 | 1810 4
1812 | 1811 5
1813 | 1812 2
1814 | 1813 2
1815 | 1814 2
1816 | 1815 4
1817 | 1816 4
1818 | 1817 2
1819 | 1818 2
1820 | 1819 5
1821 | 1820 5
1822 | 1821 3
1823 | 1822 4
1824 | 1823 4
1825 | 1824 4
1826 | 1825 5
1827 | 1826 4
1828 | 1827 5
1829 | 1828 2
1830 | 1829 3
1831 | 1830 2
1832 | 1831 5
1833 | 1832 3
1834 | 1833 2
1835 | 1834 5
1836 | 1835 5
1837 | 1836 5
1838 | 1837 5
1839 | 1838 5
1840 | 1839 5
1841 | 1840 4
1842 | 1841 5
1843 | 1842 0
1844 | 1843 0
1845 | 1844 0
1846 | 1845 5
1847 | 1846 5
1848 | 1847 2
1849 | 1848 0
1850 | 1849 0
1851 | 1850 3
1852 | 1851 2
1853 | 1852 3
1854 | 1853 5
1855 | 1854 5
1856 | 1855 4
1857 | 1856 3
1858 | 1857 1
1859 | 1858 1
1860 | 1859 1
1861 | 1860 2
1862 | 1861 5
1863 | 1862 1
1864 | 1863 3
1865 | 1864 2
1866 | 1865 4
1867 | 1866 2
1868 | 1867 2
1869 | 1868 0
1870 | 1869 4
1871 | 1870 5
1872 | 1871 5
1873 | 1872 5
1874 | 1873 4
1875 | 1874 1
1876 | 1875 2
1877 | 1876 4
1878 | 1877 5
1879 | 1878 5
1880 | 1879 0
1881 | 1880 2
1882 | 1881 0
1883 | 1882 2
1884 | 1883 3
1885 | 1884 3
1886 | 1885 2
1887 | 1886 5
1888 | 1887 0
1889 | 1888 0
1890 | 1889 5
1891 | 1890 5
1892 | 1891 3
1893 | 1892 1
1894 | 1893 2
1895 | 1894 2
1896 | 1895 3
1897 | 1896 2
1898 | 1897 3
1899 | 1898 3
1900 | 1899 3
1901 | 1900 5
1902 | 1901 5
1903 | 1902 2
1904 | 1903 0
1905 | 1904 4
1906 | 1905 4
1907 | 1906 5
1908 | 1907 3
1909 | 1908 2
1910 | 1909 2
1911 | 1910 2
1912 | 1911 2
1913 | 1912 2
1914 | 1913 1
1915 | 1914 2
1916 | 1915 2
1917 | 1916 5
1918 | 1917 5
1919 | 1918 5
1920 | 1919 4
1921 | 1920 4
1922 | 1921 3
1923 | 1922 5
1924 | 1923 4
1925 | 1924 2
1926 | 1925 5
1927 | 1926 5
1928 | 1927 2
1929 | 1928 2
1930 | 1929 5
1931 | 1930 2
1932 | 1931 4
1933 | 1932 2
1934 | 1933 0
1935 | 1934 4
1936 | 1935 0
1937 | 1936 0
1938 | 1937 3
1939 | 1938 3
1940 | 1939 2
1941 | 1940 5
1942 | 1941 4
1943 | 1942 4
1944 | 1943 2
1945 | 1944 5
1946 | 1945 5
1947 | 1946 4
1948 | 1947 4
1949 | 1948 3
1950 | 1949 3
1951 | 1950 2
1952 | 1951 3
1953 | 1952 1
1954 | 1953 5
1955 | 1954 5
1956 | 1955 5
1957 | 1956 3
1958 | 1957 4
1959 | 1958 2
1960 | 1959 2
1961 | 1960 5
1962 | 1961 4
1963 | 1962 4
1964 | 1963 4
1965 | 1964 5
1966 | 1965 1
1967 | 1966 4
1968 | 1967 2
1969 | 1968 4
1970 | 1969 3
1971 | 1970 4
1972 | 1971 4
1973 | 1972 4
1974 | 1973 4
1975 | 1974 5
1976 | 1975 4
1977 | 1976 3
1978 | 1977 1
1979 | 1978 4
1980 | 1979 3
1981 | 1980 5
1982 | 1981 1
1983 | 1982 0
1984 | 1983 0
1985 | 1984 2
1986 | 1985 4
1987 | 1986 2
1988 | 1987 2
1989 | 1988 2
1990 | 1989 3
1991 | 1990 2
1992 | 1991 3
1993 | 1992 1
1994 | 1993 1
1995 | 1994 5
1996 | 1995 4
1997 | 1996 4
1998 | 1997 4
1999 | 1998 3
2000 | 1999 3
2001 | 2000 5
2002 | 2001 5
2003 | 2002 5
2004 | 2003 5
2005 | 2004 5
2006 | 2005 5
2007 | 2006 5
2008 | 2007 4
2009 | 2008 5
2010 | 2009 5
2011 | 2010 2
2012 | 2011 2
2013 | 2012 0
2014 | 2013 1
2015 | 2014 4
2016 | 2015 4
2017 | 2016 3
2018 | 2017 3
2019 | 2018 3
2020 | 2019 4
2021 | 2020 4
2022 | 2021 2
2023 | 2022 2
2024 | 2023 5
2025 | 2024 3
2026 | 2025 5
2027 | 2026 4
2028 | 2027 4
2029 | 2028 1
2030 | 2029 1
2031 | 2030 4
2032 | 2031 3
2033 | 2032 5
2034 | 2033 5
2035 | 2034 2
2036 | 2035 3
2037 | 2036 2
2038 | 2037 2
2039 | 2038 0
2040 | 2039 5
2041 | 2040 4
2042 | 2041 2
2043 | 2042 3
2044 | 2043 4
2045 | 2044 3
2046 | 2045 2
2047 | 2046 2
2048 | 2047 2
2049 | 2048 5
2050 | 2049 5
2051 | 2050 5
2052 | 2051 2
2053 | 2052 5
2054 | 2053 2
2055 | 2054 5
2056 | 2055 5
2057 | 2056 5
2058 | 2057 2
2059 | 2058 2
2060 | 2059 2
2061 | 2060 5
2062 | 2061 4
2063 | 2062 5
2064 | 2063 5
2065 | 2064 3
2066 | 2065 3
2067 | 2066 3
2068 | 2067 0
2069 | 2068 3
2070 | 2069 3
2071 | 2070 4
2072 | 2071 5
2073 | 2072 5
2074 | 2073 3
2075 | 2074 2
2076 | 2075 5
2077 | 2076 4
2078 | 2077 4
2079 | 2078 0
2080 | 2079 2
2081 | 2080 5
2082 | 2081 4
2083 | 2082 4
2084 | 2083 1
2085 | 2084 4
2086 | 2085 3
2087 | 2086 0
2088 | 2087 0
2089 | 2088 4
2090 | 2089 0
2091 | 2090 3
2092 | 2091 3
2093 | 2092 5
2094 | 2093 2
2095 | 2094 1
2096 | 2095 5
2097 | 2096 4
2098 | 2097 3
2099 | 2098 5
2100 | 2099 5
2101 | 2100 2
2102 | 2101 2
2103 | 2102 2
2104 | 2103 4
2105 | 2104 3
2106 | 2105 2
2107 | 2106 2
2108 | 2107 2
2109 | 2108 2
2110 | 2109 5
2111 | 2110 5
2112 | 2111 5
2113 | 2112 5
2114 | 2113 5
2115 | 2114 5
2116 | 2115 2
2117 | 2116 4
2118 | 2117 0
2119 | 2118 3
2120 | 2119 2
2121 | 2120 3
2122 | 2121 5
2123 | 2122 5
2124 | 2123 5
2125 | 2124 5
2126 | 2125 2
2127 | 2126 5
2128 | 2127 5
2129 | 2128 3
2130 | 2129 1
2131 | 2130 4
2132 | 2131 4
2133 | 2132 4
2134 | 2133 4
2135 | 2134 0
2136 | 2135 4
2137 | 2136 2
2138 | 2137 0
2139 | 2138 2
2140 | 2139 2
2141 | 2140 2
2142 | 2141 4
2143 | 2142 1
2144 | 2143 1
2145 | 2144 1
2146 | 2145 5
2147 | 2146 3
2148 | 2147 0
2149 | 2148 5
2150 | 2149 3
2151 | 2150 3
2152 | 2151 5
2153 | 2152 3
2154 | 2153 2
2155 | 2154 3
2156 | 2155 2
2157 | 2156 4
2158 | 2157 3
2159 | 2158 0
2160 | 2159 4
2161 | 2160 4
2162 | 2161 3
2163 | 2162 3
2164 | 2163 2
2165 | 2164 2
2166 | 2165 2
2167 | 2166 2
2168 | 2167 2
2169 | 2168 2
2170 | 2169 2
2171 | 2170 4
2172 | 2171 4
2173 | 2172 4
2174 | 2173 4
2175 | 2174 4
2176 | 2175 5
2177 | 2176 4
2178 | 2177 4
2179 | 2178 1
2180 | 2179 0
2181 | 2180 2
2182 | 2181 3
2183 | 2182 1
2184 | 2183 4
2185 | 2184 2
2186 | 2185 0
2187 | 2186 1
2188 | 2187 0
2189 | 2188 3
2190 | 2189 3
2191 | 2190 5
2192 | 2191 1
2193 | 2192 4
2194 | 2193 2
2195 | 2194 4
2196 | 2195 4
2197 | 2196 1
2198 | 2197 1
2199 | 2198 1
2200 | 2199 4
2201 | 2200 0
2202 | 2201 2
2203 | 2202 4
2204 | 2203 2
2205 | 2204 4
2206 | 2205 2
2207 | 2206 2
2208 | 2207 4
2209 | 2208 0
2210 | 2209 4
2211 | 2210 4
2212 | 2211 4
2213 | 2212 2
2214 | 2213 2
2215 | 2214 3
2216 | 2215 4
2217 | 2216 2
2218 | 2217 2
2219 | 2218 2
2220 | 2219 0
2221 | 2220 2
2222 | 2221 2
2223 | 2222 2
2224 | 2223 2
2225 | 2224 3
2226 | 2225 4
2227 | 2226 0
2228 | 2227 4
2229 | 2228 2
2230 | 2229 0
2231 | 2230 4
2232 | 2231 2
2233 | 2232 0
2234 | 2233 1
2235 | 2234 2
2236 | 2235 2
2237 | 2236 2
2238 | 2237 4
2239 | 2238 3
2240 | 2239 0
2241 | 2240 3
2242 | 2241 4
2243 | 2242 3
2244 | 2243 5
2245 | 2244 2
2246 | 2245 2
2247 | 2246 3
2248 | 2247 4
2249 | 2248 2
2250 | 2249 1
2251 | 2250 4
2252 | 2251 2
2253 | 2252 2
2254 | 2253 2
2255 | 2254 5
2256 | 2255 5
2257 | 2256 2
2258 | 2257 2
2259 | 2258 2
2260 | 2259 3
2261 | 2260 5
2262 | 2261 2
2263 | 2262 5
2264 | 2263 2
2265 | 2264 2
2266 | 2265 4
2267 | 2266 3
2268 | 2267 4
2269 | 2268 4
2270 | 2269 4
2271 | 2270 2
2272 | 2271 2
2273 | 2272 1
2274 | 2273 5
2275 | 2274 4
2276 | 2275 4
2277 | 2276 2
2278 | 2277 4
2279 | 2278 2
2280 | 2279 5
2281 | 2280 2
2282 | 2281 1
2283 | 2282 0
2284 | 2283 2
2285 | 2284 1
2286 | 2285 0
2287 | 2286 2
2288 | 2287 5
2289 | 2288 0
2290 | 2289 4
2291 | 2290 0
2292 | 2291 2
2293 | 2292 5
2294 | 2293 3
2295 | 2294 3
2296 | 2295 2
2297 | 2296 5
2298 | 2297 4
2299 | 2298 3
2300 | 2299 2
2301 | 2300 3
2302 | 2301 4
2303 | 2302 1
2304 | 2303 4
2305 | 2304 2
2306 | 2305 2
2307 | 2306 2
2308 | 2307 2
2309 | 2308 4
2310 | 2309 2
2311 | 2310 2
2312 | 2311 4
2313 | 2312 0
2314 | 2313 4
2315 | 2314 2
2316 | 2315 4
2317 | 2316 3
2318 | 2317 1
2319 | 2318 1
2320 | 2319 2
2321 | 2320 4
2322 | 2321 5
2323 | 2322 4
2324 | 2323 2
2325 | 2324 4
2326 | 2325 2
2327 | 2326 4
2328 | 2327 4
2329 | 2328 3
2330 | 2329 4
2331 | 2330 3
2332 | 2331 1
2333 | 2332 2
2334 | 2333 2
2335 | 2334 0
2336 | 2335 2
2337 | 2336 5
2338 | 2337 5
2339 | 2338 5
2340 | 2339 2
2341 | 2340 3
2342 | 2341 0
2343 | 2342 3
2344 | 2343 2
2345 | 2344 2
2346 | 2345 1
2347 | 2346 5
2348 | 2347 5
2349 | 2348 0
2350 | 2349 2
2351 | 2350 2
2352 | 2351 0
2353 | 2352 2
2354 | 2353 1
2355 | 2354 0
2356 | 2355 3
2357 | 2356 2
2358 | 2357 0
2359 | 2358 5
2360 | 2359 3
2361 | 2360 0
2362 | 2361 3
2363 | 2362 4
2364 | 2363 3
2365 | 2364 3
2366 | 2365 1
2367 | 2366 4
2368 | 2367 4
2369 | 2368 5
2370 | 2369 5
2371 | 2370 4
2372 | 2371 2
2373 | 2372 2
2374 | 2373 3
2375 | 2374 4
2376 | 2375 4
2377 | 2376 5
2378 | 2377 2
2379 | 2378 4
2380 | 2379 0
2381 | 2380 4
2382 | 2381 5
2383 | 2382 5
2384 | 2383 0
2385 | 2384 3
2386 | 2385 3
2387 | 2386 0
2388 | 2387 5
2389 | 2388 2
2390 | 2389 4
2391 | 2390 2
2392 | 2391 0
2393 | 2392 4
2394 | 2393 5
2395 | 2394 4
2396 | 2395 1
2397 | 2396 5
2398 | 2397 4
2399 | 2398 5
2400 | 2399 3
2401 | 2400 3
2402 | 2401 3
2403 | 2402 5
2404 | 2403 3
2405 | 2404 4
2406 | 2405 5
2407 | 2406 3
2408 | 2407 2
2409 | 2408 3
2410 | 2409 2
2411 | 2410 1
2412 | 2411 5
2413 | 2412 5
2414 | 2413 5
2415 | 2414 5
2416 | 2415 2
2417 | 2416 4
2418 | 2417 2
2419 | 2418 5
2420 | 2419 4
2421 | 2420 4
2422 | 2421 0
2423 | 2422 2
2424 | 2423 2
2425 | 2424 5
2426 | 2425 5
2427 | 2426 5
2428 | 2427 3
2429 | 2428 5
2430 | 2429 0
2431 | 2430 1
2432 | 2431 4
2433 | 2432 5
2434 | 2433 4
2435 | 2434 0
2436 | 2435 0
2437 | 2436 5
2438 | 2437 4
2439 | 2438 0
2440 | 2439 5
2441 | 2440 5
2442 | 2441 0
2443 | 2442 3
2444 | 2443 0
2445 | 2444 0
2446 | 2445 0
2447 | 2446 3
2448 | 2447 2
2449 | 2448 1
2450 | 2449 0
2451 | 2450 3
2452 | 2451 2
2453 | 2452 5
2454 | 2453 3
2455 | 2454 0
2456 | 2455 0
2457 | 2456 2
2458 | 2457 4
2459 | 2458 5
2460 | 2459 4
2461 | 2460 3
2462 | 2461 5
2463 | 2462 1
2464 | 2463 3
2465 | 2464 3
2466 | 2465 0
2467 | 2466 3
2468 | 2467 4
2469 | 2468 4
2470 | 2469 0
2471 | 2470 4
2472 | 2471 1
2473 | 2472 5
2474 | 2473 3
2475 | 2474 2
2476 | 2475 5
2477 | 2476 2
2478 | 2477 4
2479 | 2478 2
2480 | 2479 1
2481 | 2480 3
2482 | 2481 4
2483 | 2482 3
2484 | 2483 2
2485 | 2484 0
2486 | 2485 2
2487 | 2486 2
2488 | 2487 0
2489 | 2488 4
2490 | 2489 2
2491 | 2490 2
2492 | 2491 2
2493 | 2492 4
2494 | 2493 3
2495 | 2494 4
2496 | 2495 4
2497 | 2496 3
2498 | 2497 4
2499 | 2498 1
2500 | 2499 1
2501 | 2500 2
2502 | 2501 1
2503 | 2502 3
2504 | 2503 2
2505 | 2504 1
2506 | 2505 2
2507 | 2506 4
2508 | 2507 3
2509 | 2508 4
2510 | 2509 3
2511 | 2510 2
2512 | 2511 2
2513 | 2512 4
2514 | 2513 3
2515 | 2514 3
2516 | 2515 5
2517 | 2516 1
2518 | 2517 1
2519 | 2518 5
2520 | 2519 5
2521 | 2520 2
2522 | 2521 3
2523 | 2522 2
2524 | 2523 2
2525 | 2524 1
2526 | 2525 3
2527 | 2526 3
2528 | 2527 4
2529 | 2528 3
2530 | 2529 2
2531 | 2530 2
2532 | 2531 0
2533 | 2532 0
2534 | 2533 0
2535 | 2534 2
2536 | 2535 2
2537 | 2536 2
2538 | 2537 5
2539 | 2538 2
2540 | 2539 1
2541 | 2540 0
2542 | 2541 3
2543 | 2542 3
2544 | 2543 0
2545 | 2544 0
2546 | 2545 1
2547 | 2546 1
2548 | 2547 2
2549 | 2548 3
2550 | 2549 5
2551 | 2550 5
2552 | 2551 1
2553 | 2552 2
2554 | 2553 5
2555 | 2554 0
2556 | 2555 5
2557 | 2556 2
2558 | 2557 2
2559 | 2558 3
2560 | 2559 5
2561 | 2560 4
2562 | 2561 2
2563 | 2562 1
2564 | 2563 1
2565 | 2564 3
2566 | 2565 1
2567 | 2566 2
2568 | 2567 2
2569 | 2568 1
2570 | 2569 5
2571 | 2570 3
2572 | 2571 3
2573 | 2572 2
2574 | 2573 5
2575 | 2574 3
2576 | 2575 0
2577 | 2576 5
2578 | 2577 3
2579 | 2578 2
2580 | 2579 2
2581 | 2580 2
2582 | 2581 4
2583 | 2582 2
2584 | 2583 2
2585 | 2584 2
2586 | 2585 2
2587 | 2586 2
2588 | 2587 2
2589 | 2588 3
2590 | 2589 2
2591 | 2590 2
2592 | 2591 1
2593 | 2592 5
2594 | 2593 4
2595 | 2594 2
2596 | 2595 2
2597 | 2596 2
2598 | 2597 3
2599 | 2598 2
2600 | 2599 4
2601 | 2600 4
2602 | 2601 2
2603 | 2602 2
2604 | 2603 2
2605 | 2604 4
2606 | 2605 0
2607 | 2606 0
2608 | 2607 4
2609 | 2608 0
2610 | 2609 2
2611 | 2610 5
2612 | 2611 2
2613 | 2612 2
2614 | 2613 1
2615 | 2614 0
2616 | 2615 0
2617 | 2616 0
2618 | 2617 0
2619 | 2618 0
2620 | 2619 0
2621 | 2620 2
2622 | 2621 2
2623 | 2622 1
2624 | 2623 4
2625 | 2624 2
2626 | 2625 2
2627 | 2626 2
2628 | 2627 5
2629 | 2628 4
2630 | 2629 4
2631 | 2630 5
2632 | 2631 4
2633 | 2632 2
2634 | 2633 3
2635 | 2634 3
2636 | 2635 1
2637 | 2636 1
2638 | 2637 1
2639 | 2638 3
2640 | 2639 4
2641 | 2640 4
2642 | 2641 1
2643 | 2642 2
2644 | 2643 4
2645 | 2644 4
2646 | 2645 4
2647 | 2646 0
2648 | 2647 4
2649 | 2648 0
2650 | 2649 2
2651 | 2650 2
2652 | 2651 1
2653 | 2652 2
2654 | 2653 1
2655 | 2654 2
2656 | 2655 4
2657 | 2656 4
2658 | 2657 2
2659 | 2658 2
2660 | 2659 2
2661 | 2660 2
2662 | 2661 2
2663 | 2662 0
2664 | 2663 3
2665 | 2664 5
2666 | 2665 2
2667 | 2666 0
2668 | 2667 4
2669 | 2668 3
2670 | 2669 4
2671 | 2670 3
2672 | 2671 5
2673 | 2672 2
2674 | 2673 2
2675 | 2674 2
2676 | 2675 4
2677 | 2676 3
2678 | 2677 3
2679 | 2678 5
2680 | 2679 3
2681 | 2680 4
2682 | 2681 2
2683 | 2682 4
2684 | 2683 4
2685 | 2684 0
2686 | 2685 4
2687 | 2686 3
2688 | 2687 2
2689 | 2688 2
2690 | 2689 2
2691 | 2690 4
2692 | 2691 4
2693 | 2692 4
2694 | 2693 3
2695 | 2694 3
2696 | 2695 4
2697 | 2696 2
2698 | 2697 1
2699 | 2698 0
2700 | 2699 0
2701 | 2700 2
2702 | 2701 2
2703 | 2702 4
2704 | 2703 0
2705 | 2704 2
2706 | 2705 0
2707 | 2706 3
2708 | 2707 5
2709 | 2708 4
2710 | 2709 2
2711 | 2710 2
2712 | 2711 4
2713 | 2712 2
2714 | 2713 1
2715 | 2714 2
2716 | 2715 5
2717 | 2716 0
2718 | 2717 0
2719 | 2718 5
2720 | 2719 5
2721 | 2720 3
2722 | 2721 0
2723 | 2722 2
2724 | 2723 1
2725 | 2724 2
2726 | 2725 1
2727 | 2726 2
2728 | 2727 0
2729 | 2728 4
2730 | 2729 2
2731 | 2730 5
2732 | 2731 1
2733 | 2732 3
2734 | 2733 2
2735 | 2734 0
2736 | 2735 5
2737 | 2736 2
2738 | 2737 2
2739 | 2738 3
2740 | 2739 5
2741 | 2740 3
2742 | 2741 5
2743 | 2742 2
2744 | 2743 4
2745 | 2744 3
2746 | 2745 3
2747 | 2746 3
2748 | 2747 5
2749 | 2748 5
2750 | 2749 2
2751 | 2750 5
2752 | 2751 4
2753 | 2752 5
2754 | 2753 5
2755 | 2754 5
2756 | 2755 5
2757 | 2756 5
2758 | 2757 1
2759 | 2758 2
2760 | 2759 2
2761 | 2760 2
2762 | 2761 4
2763 | 2762 4
2764 | 2763 3
2765 | 2764 4
2766 | 2765 2
2767 | 2766 1
2768 | 2767 3
2769 | 2768 4
2770 | 2769 4
2771 | 2770 0
2772 | 2771 0
2773 | 2772 2
2774 | 2773 3
2775 | 2774 3
2776 | 2775 5
2777 | 2776 5
2778 | 2777 4
2779 | 2778 3
2780 | 2779 3
2781 | 2780 3
2782 | 2781 4
2783 | 2782 1
2784 | 2783 3
2785 | 2784 5
2786 | 2785 5
2787 | 2786 0
2788 | 2787 4
2789 | 2788 4
2790 | 2789 2
2791 | 2790 0
2792 | 2791 3
2793 | 2792 4
2794 | 2793 3
2795 | 2794 4
2796 | 2795 3
2797 | 2796 5
2798 | 2797 1
2799 | 2798 5
2800 | 2799 4
2801 | 2800 5
2802 | 2801 5
2803 | 2802 4
2804 | 2803 2
2805 | 2804 4
2806 | 2805 1
2807 | 2806 1
2808 | 2807 1
2809 | 2808 0
2810 | 2809 3
2811 | 2810 3
2812 | 2811 1
2813 | 2812 3
2814 | 2813 0
2815 | 2814 3
2816 | 2815 3
2817 | 2816 0
2818 | 2817 3
2819 | 2818 2
2820 | 2819 2
2821 | 2820 5
2822 | 2821 2
2823 | 2822 0
2824 | 2823 5
2825 | 2824 5
2826 | 2825 1
2827 | 2826 5
2828 | 2827 5
2829 | 2828 5
2830 | 2829 5
2831 | 2830 2
2832 | 2831 2
2833 | 2832 2
2834 | 2833 3
2835 | 2834 5
2836 | 2835 4
2837 | 2836 5
2838 | 2837 5
2839 | 2838 5
2840 | 2839 0
2841 | 2840 0
2842 | 2841 4
2843 | 2842 4
2844 | 2843 1
2845 | 2844 4
2846 | 2845 3
2847 | 2846 5
2848 | 2847 4
2849 | 2848 4
2850 | 2849 4
2851 | 2850 0
2852 | 2851 0
2853 | 2852 2
2854 | 2853 2
2855 | 2854 3
2856 | 2855 1
2857 | 2856 1
2858 | 2857 5
2859 | 2858 3
2860 | 2859 5
2861 | 2860 5
2862 | 2861 2
2863 | 2862 5
2864 | 2863 5
2865 | 2864 5
2866 | 2865 3
2867 | 2866 4
2868 | 2867 4
2869 | 2868 3
2870 | 2869 0
2871 | 2870 5
2872 | 2871 3
2873 | 2872 3
2874 | 2873 5
2875 | 2874 3
2876 | 2875 4
2877 | 2876 3
2878 | 2877 2
2879 | 2878 2
2880 | 2879 1
2881 | 2880 3
2882 | 2881 3
2883 | 2882 5
2884 | 2883 0
2885 | 2884 0
2886 | 2885 5
2887 | 2886 5
2888 | 2887 5
2889 | 2888 5
2890 | 2889 5
2891 | 2890 3
2892 | 2891 2
2893 | 2892 3
2894 | 2893 3
2895 | 2894 1
2896 | 2895 5
2897 | 2896 0
2898 | 2897 5
2899 | 2898 4
2900 | 2899 2
2901 | 2900 2
2902 | 2901 2
2903 | 2902 3
2904 | 2903 3
2905 | 2904 4
2906 | 2905 2
2907 | 2906 0
2908 | 2907 3
2909 | 2908 0
2910 | 2909 0
2911 | 2910 2
2912 | 2911 2
2913 | 2912 2
2914 | 2913 2
2915 | 2914 3
2916 | 2915 4
2917 | 2916 5
2918 | 2917 4
2919 | 2918 3
2920 | 2919 4
2921 | 2920 0
2922 | 2921 0
2923 | 2922 4
2924 | 2923 4
2925 | 2924 4
2926 | 2925 1
2927 | 2926 3
2928 | 2927 4
2929 | 2928 1
2930 | 2929 1
2931 | 2930 1
2932 | 2931 4
2933 | 2932 5
2934 | 2933 5
2935 | 2934 4
2936 | 2935 2
2937 | 2936 0
2938 | 2937 2
2939 | 2938 2
2940 | 2939 2
2941 | 2940 4
2942 | 2941 4
2943 | 2942 4
2944 | 2943 2
2945 | 2944 3
2946 | 2945 3
2947 | 2946 2
2948 | 2947 3
2949 | 2948 2
2950 | 2949 2
2951 | 2950 4
2952 | 2951 2
2953 | 2952 2
2954 | 2953 4
2955 | 2954 2
2956 | 2955 2
2957 | 2956 4
2958 | 2957 2
2959 | 2958 2
2960 | 2959 2
2961 | 2960 4
2962 | 2961 5
2963 | 2962 1
2964 | 2963 5
2965 | 2964 2
2966 | 2965 5
2967 | 2966 5
2968 | 2967 3
2969 | 2968 5
2970 | 2969 1
2971 | 2970 2
2972 | 2971 2
2973 | 2972 1
2974 | 2973 1
2975 | 2974 2
2976 | 2975 4
2977 | 2976 3
2978 | 2977 5
2979 | 2978 4
2980 | 2979 4
2981 | 2980 4
2982 | 2981 5
2983 | 2982 5
2984 | 2983 3
2985 | 2984 0
2986 | 2985 0
2987 | 2986 0
2988 | 2987 0
2989 | 2988 4
2990 | 2989 1
2991 | 2990 2
2992 | 2991 1
2993 | 2992 2
2994 | 2993 4
2995 | 2994 1
2996 | 2995 2
2997 | 2996 0
2998 | 2997 1
2999 | 2998 1
3000 | 2999 5
3001 | 3000 3
3002 | 3001 2
3003 | 3002 1
3004 | 3003 0
3005 | 3004 5
3006 | 3005 5
3007 | 3006 5
3008 | 3007 4
3009 | 3008 3
3010 | 3009 3
3011 | 3010 5
3012 | 3011 5
3013 | 3012 3
3014 | 3013 0
3015 | 3014 3
3016 | 3015 3
3017 | 3016 1
3018 | 3017 3
3019 | 3018 2
3020 | 3019 4
3021 | 3020 0
3022 | 3021 2
3023 | 3022 3
3024 | 3023 3
3025 | 3024 2
3026 | 3025 1
3027 | 3026 2
3028 | 3027 2
3029 | 3028 3
3030 | 3029 4
3031 | 3030 2
3032 | 3031 5
3033 | 3032 5
3034 | 3033 0
3035 | 3034 3
3036 | 3035 3
3037 | 3036 1
3038 | 3037 2
3039 | 3038 5
3040 | 3039 5
3041 | 3040 4
3042 | 3041 4
3043 | 3042 5
3044 | 3043 1
3045 | 3044 2
3046 | 3045 5
3047 | 3046 2
3048 | 3047 5
3049 | 3048 0
3050 | 3049 0
3051 | 3050 0
3052 | 3051 2
3053 | 3052 2
3054 | 3053 2
3055 | 3054 2
3056 | 3055 5
3057 | 3056 5
3058 | 3057 0
3059 | 3058 3
3060 | 3059 5
3061 | 3060 1
3062 | 3061 1
3063 | 3062 1
3064 | 3063 1
3065 | 3064 4
3066 | 3065 1
3067 | 3066 1
3068 | 3067 4
3069 | 3068 2
3070 | 3069 2
3071 | 3070 5
3072 | 3071 4
3073 | 3072 4
3074 | 3073 5
3075 | 3074 4
3076 | 3075 0
3077 | 3076 1
3078 | 3077 2
3079 | 3078 3
3080 | 3079 2
3081 | 3080 2
3082 | 3081 5
3083 | 3082 5
3084 | 3083 5
3085 | 3084 5
3086 | 3085 4
3087 | 3086 2
3088 | 3087 1
3089 | 3088 4
3090 | 3089 2
3091 | 3090 4
3092 | 3091 2
3093 | 3092 4
3094 | 3093 4
3095 | 3094 5
3096 | 3095 5
3097 | 3096 0
3098 | 3097 5
3099 | 3098 4
3100 | 3099 4
3101 | 3100 5
3102 | 3101 5
3103 | 3102 5
3104 | 3103 4
3105 | 3104 0
3106 | 3105 1
3107 | 3106 5
3108 | 3107 0
3109 | 3108 4
3110 | 3109 4
3111 | 3110 5
3112 | 3111 1
3113 | 3112 1
3114 | 3113 1
3115 | 3114 0
3116 | 3115 3
3117 | 3116 3
3118 | 3117 1
3119 | 3118 5
3120 | 3119 5
3121 | 3120 5
3122 | 3121 3
3123 | 3122 3
3124 | 3123 5
3125 | 3124 0
3126 | 3125 4
3127 | 3126 0
3128 | 3127 5
3129 | 3128 4
3130 | 3129 3
3131 | 3130 5
3132 | 3131 5
3133 | 3132 2
3134 | 3133 5
3135 | 3134 5
3136 | 3135 3
3137 | 3136 2
3138 | 3137 2
3139 | 3138 5
3140 | 3139 1
3141 | 3140 2
3142 | 3141 2
3143 | 3142 1
3144 | 3143 2
3145 | 3144 2
3146 | 3145 3
3147 | 3146 0
3148 | 3147 0
3149 | 3148 0
3150 | 3149 0
3151 | 3150 0
3152 | 3151 0
3153 | 3152 5
3154 | 3153 3
3155 | 3154 4
3156 | 3155 0
3157 | 3156 2
3158 | 3157 1
3159 | 3158 4
3160 | 3159 2
3161 | 3160 2
3162 | 3161 2
3163 | 3162 3
3164 | 3163 4
3165 | 3164 1
3166 | 3165 4
3167 | 3166 4
3168 | 3167 0
3169 | 3168 0
3170 | 3169 3
3171 | 3170 0
3172 | 3171 2
3173 | 3172 3
3174 | 3173 2
3175 | 3174 3
3176 | 3175 2
3177 | 3176 2
3178 | 3177 5
3179 | 3178 5
3180 | 3179 2
3181 | 3180 0
3182 | 3181 3
3183 | 3182 5
3184 | 3183 2
3185 | 3184 1
3186 | 3185 0
3187 | 3186 5
3188 | 3187 3
3189 | 3188 4
3190 | 3189 4
3191 | 3190 3
3192 | 3191 3
3193 | 3192 2
3194 | 3193 3
3195 | 3194 2
3196 | 3195 2
3197 | 3196 4
3198 | 3197 4
3199 | 3198 1
3200 | 3199 4
3201 | 3200 2
3202 | 3201 2
3203 | 3202 0
3204 | 3203 0
3205 | 3204 4
3206 | 3205 4
3207 | 3206 0
3208 | 3207 2
3209 | 3208 3
3210 | 3209 5
3211 | 3210 2
3212 | 3211 4
3213 | 3212 1
3214 | 3213 4
3215 | 3214 4
3216 | 3215 2
3217 | 3216 4
3218 | 3217 3
3219 | 3218 4
3220 | 3219 5
3221 | 3220 4
3222 | 3221 4
3223 | 3222 0
3224 | 3223 2
3225 | 3224 3
3226 | 3225 5
3227 | 3226 5
3228 | 3227 5
3229 | 3228 2
3230 | 3229 4
3231 | 3230 4
3232 | 3231 2
3233 | 3232 0
3234 | 3233 0
3235 | 3234 1
3236 | 3235 4
3237 | 3236 3
3238 | 3237 2
3239 | 3238 2
3240 | 3239 2
3241 | 3240 0
3242 | 3241 1
3243 | 3242 1
3244 | 3243 5
3245 | 3244 4
3246 | 3245 4
3247 | 3246 4
3248 | 3247 2
3249 | 3248 3
3250 | 3249 5
3251 | 3250 4
3252 | 3251 4
3253 | 3252 3
3254 | 3253 4
3255 | 3254 4
3256 | 3255 3
3257 | 3256 1
3258 | 3257 5
3259 | 3258 2
3260 | 3259 5
3261 | 3260 2
3262 | 3261 2
3263 | 3262 0
3264 | 3263 2
3265 | 3264 2
3266 | 3265 5
3267 | 3266 1
3268 | 3267 2
3269 | 3268 2
3270 | 3269 0
3271 | 3270 4
3272 | 3271 4
3273 | 3272 2
3274 | 3273 4
3275 | 3274 4
3276 | 3275 1
3277 | 3276 3
3278 | 3277 0
3279 | 3278 2
3280 | 3279 2
3281 | 3280 3
3282 | 3281 0
3283 | 3282 3
3284 | 3283 4
3285 | 3284 4
3286 | 3285 1
3287 | 3286 4
3288 | 3287 2
3289 | 3288 4
3290 | 3289 3
3291 | 3290 0
3292 | 3291 4
3293 | 3292 2
3294 | 3293 2
3295 | 3294 4
3296 | 3295 0
3297 | 3296 0
3298 | 3297 4
3299 | 3298 4
3300 | 3299 2
3301 | 3300 3
3302 | 3301 3
3303 | 3302 2
3304 | 3303 3
3305 | 3304 4
3306 | 3305 2
3307 | 3306 4
3308 | 3307 2
3309 | 3308 4
3310 | 3309 3
3311 | 3310 2
3312 | 3311 4
3313 |
--------------------------------------------------------------------------------
/ANRL/main.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import networkx as nx
3 | import node2vec
4 | import random
5 | from config import *
6 | from evaluation import *
7 | from model import *
8 | from utils import *
9 | import tensorflow as tf
10 | import math
11 | import time
12 |
13 | tf.app.flags.DEFINE_string("datasets", "citeseer", "datasets descriptions")
14 | tf.app.flags.DEFINE_string(
15 | "inputEdgeFile", "graph/citeseer.edgelist", "input graph edge file")
16 | tf.app.flags.DEFINE_string(
17 | "inputFeatureFile", "graph/citeseer.feature", "input graph feature file")
18 | tf.app.flags.DEFINE_string(
19 | "inputLabelFile", "graph/citeseer.label", "input graph label file")
20 | tf.app.flags.DEFINE_string(
21 | "outputEmbedFile", "embed/citeseer.embed", "output embedding result")
22 | tf.app.flags.DEFINE_integer("dimensions", 128, "embedding dimensions")
23 | tf.app.flags.DEFINE_integer("feaDims", 3703, "feature dimensions")
24 | tf.app.flags.DEFINE_integer("walk_length", 80, "walk length")
25 | tf.app.flags.DEFINE_integer("num_walks", 10, "number of walks")
26 | tf.app.flags.DEFINE_integer("window_size", 10, "window size")
27 | tf.app.flags.DEFINE_float("p", 1.0, "p value")
28 | tf.app.flags.DEFINE_float("q", 1.0, "q value")
29 | tf.app.flags.DEFINE_boolean("weighted", False, "weighted edges")
30 | tf.app.flags.DEFINE_boolean("directed", False, "whether the graph is directed")
31 |
32 |
33 | def generate_graph_context_all_pairs(path, window_size):
34 | # generating graph context pairs
35 | all_pairs = []
36 | for k in range(len(path)):
37 | for i in range(len(path[k])):
38 | for j in range(i - window_size, i + window_size + 1):
39 | if i == j or j < 0 or j >= len(path[k]):
40 | continue
41 | else:
42 | all_pairs.append([path[k][i], path[k][j]])
43 |
44 | return np.array(all_pairs, dtype = np.int32)
45 |
46 |
47 | def graph_context_batch_iter(all_pairs, batch_size):
48 | while True:
49 | start_idx = np.random.randint(0, len(all_pairs) - batch_size)
50 | batch_idx = np.array(range(start_idx, start_idx + batch_size))
51 | batch_idx = np.random.permutation(batch_idx)
52 | batch = np.zeros((batch_size), dtype = np.int32)
53 | labels = np.zeros((batch_size, 1), dtype = np.int32)
54 | batch[:] = all_pairs[batch_idx, 0]
55 | labels[:, 0] = all_pairs[batch_idx, 1]
56 | yield batch, labels
57 |
58 |
59 | def construct_target_neighbors(nx_G, X, FLAGS, mode = "WAN"):
60 | # construct target neighbor feature matrix
61 | X_target = np.zeros(X.shape)
62 | nodes = nx_G.nodes()
63 |
64 | if mode == "OWN":
65 | # autoencoder for reconstructing itself
66 | return X
67 | elif mode == "EMN":
68 | # autoencoder for reconstructing Elementwise Median Neighbor
69 | for node in nodes:
70 | neighbors = nx_G.neighbors(node)
71 | if len(neighbors) == 0:
72 | X_target[node] = X[node]
73 | else:
74 | temp = X[node]
75 | for n in neighbors:
76 | if FLAGS.weighted:
77 | # weighted
78 | # temp = np.vstack((temp, X[n] * edgeWeight))
79 | pass
80 | else:
81 | temp = np.vstack((temp, X[n]))
82 | temp = np.median(temp, axis = 0)
83 | X_target[node] = temp
84 | return X_target
85 | elif mode == "WAN":
86 | # autoencoder for reconstructing Weighted Average Neighbor
87 | for node in nodes:
88 | neighbors = nx_G.neighbors(node)
89 | if len(neighbors) == 0:
90 | X_target[node] = X[node]
91 | else:
92 | temp = X[node]
93 | for n in neighbors:
94 | if FLAGS.weighted:
95 | # weighted sum
96 | # temp += X[n] * edgeWeight
97 | pass
98 | else:
99 | temp = np.vstack((temp, X[n]))
100 | temp = np.mean(temp, axis = 0)
101 | X_target[node] = temp
102 | return X_target
103 |
104 |
105 | def main():
106 | FLAGS = tf.app.flags.FLAGS
107 | inputEdgeFile = FLAGS.inputEdgeFile
108 | inputFeatureFile = FLAGS.inputFeatureFile
109 | inputLabelFile = FLAGS.inputLabelFile
110 | outputEmbedFile = FLAGS.outputEmbedFile
111 | window_size = FLAGS.window_size
112 |
113 | # Read graph
114 | nx_G = read_graph(FLAGS, inputEdgeFile)
115 |
116 | # Perform random walks to generate graph context
117 | G = node2vec.Graph(nx_G, FLAGS.directed, FLAGS.p, FLAGS.q)
118 | G.preprocess_transition_probs()
119 | walks = G.simulate_walks(FLAGS.num_walks, FLAGS.walk_length)
120 |
121 | # Read features
122 | print "reading features..."
123 | X = read_feature(inputFeatureFile)
124 |
125 | print "generating graph context pairs..."
126 | start_time = time.time()
127 | all_pairs = generate_graph_context_all_pairs(walks, window_size)
128 | end_time = time.time()
129 | print "time consumed for constructing graph context: %.2f" %(end_time - start_time)
130 |
131 | nodes = nx_G.nodes()
132 |     X_target = construct_target_neighbors(nx_G, X, FLAGS, mode = "WAN")
133 |
134 |     # Total number of nodes
135 | N = len(nodes)
136 | feaDims = FLAGS.feaDims
137 | dims = FLAGS.dimensions
138 |
139 | config = Config()
140 | config.struct[0] = FLAGS.feaDims
141 | config.struct[-1] = FLAGS.dimensions
142 | model = Model(config, N, dims, X_target)
143 |
144 | init = tf.global_variables_initializer()
145 | sess = tf.Session()
146 | sess.run(init)
147 |
148 | batch_size = config.batch_size
149 | max_iters = config.max_iters
150 | embedding_result = None
151 |
152 | idx = 0
153 | print_every_k_iterations = 1000
154 | start = time.time()
155 |
156 | total_loss = 0
157 | loss_sg = 0
158 | loss_ae = 0
159 |
160 | for iter_cnt in xrange(max_iters):
161 | idx += 1
162 |
163 | batch_index, batch_labels = next(
164 | graph_context_batch_iter(all_pairs, batch_size))
165 |
166 | # train for autoencoder model
167 | start_idx = np.random.randint(0, N - batch_size)
168 | batch_idx = np.array(range(start_idx, start_idx + batch_size))
169 | batch_idx = np.random.permutation(batch_idx)
170 | batch_X = X[batch_idx]
171 | feed_dict = {model.X: batch_X, model.inputs: batch_idx}
172 | _, loss_ae_value = sess.run(
173 | [model.train_opt_ae, model.loss_ae], feed_dict = feed_dict)
174 | loss_ae += loss_ae_value
175 |
176 | # train for skip-gram model
177 | batch_X = X[batch_index]
178 | feed_dict = {model.X: batch_X, model.labels: batch_labels}
179 | _, loss_sg_value = sess.run(
180 | [model.train_opt_sg, model.loss_sg], feed_dict = feed_dict)
181 | loss_sg += loss_sg_value
182 |
183 | if idx % print_every_k_iterations == 0:
184 | end = time.time()
185 | print "iterations: %d" %(idx) + ", time elapsed: %.2f," %(end - start),
186 | total_loss = loss_sg/idx + loss_ae/idx
187 | print "loss: %.2f," %(total_loss),
188 |
189 | y = read_label(inputLabelFile)
190 | embedding_result = sess.run(model.Y, feed_dict={model.X: X})
191 | macro_f1, micro_f1 = multiclass_node_classification_eval(embedding_result, y, 0.7)
192 | print "[macro_f1 = %.4f, micro_f1 = %.4f]" %(macro_f1, micro_f1)
193 |
194 | print "optimization finished..."
195 | y = read_label(inputLabelFile)
196 | embedding_result = sess.run(model.Y, feed_dict = {model.X: X})
197 | print "repeat 10 times for node classification with random split..."
198 | node_classification_F1(embedding_result, y)
199 | print "saving embedding result..."
200 | write_embedding(embedding_result, outputEmbedFile)
201 |
202 |
203 | if __name__ == "__main__":
204 | main()
205 |
--------------------------------------------------------------------------------
/ANRL/model.py:
--------------------------------------------------------------------------------
1 | # Representation Learning Class
2 |
3 | import numpy as np
4 | import tensorflow as tf
5 | import time
6 | import random
7 | import math
8 |
9 |
10 | class Model:
11 | def __init__(self, config, N, dims, X_target):
12 | self.config = config
13 | self.N = N
14 | self.dims = dims
15 |
16 | self.labels = tf.placeholder(tf.int32, shape=[None, 1])
17 | self.inputs = tf.placeholder(tf.int32, shape=[None])
18 | self.X_target = tf.constant(X_target, dtype=tf.float32)
19 | self.X_new = tf.nn.embedding_lookup(self.X_target, self.inputs)
20 |
21 | ############ define variables for autoencoder ##################
22 | self.layers = len(config.struct)
23 | self.struct = config.struct
24 | self.W = {}
25 | self.b = {}
26 | struct = self.struct
27 |
28 | # encode module
29 | for i in range(self.layers - 1):
30 | name_W = "encoder_W_" + str(i)
31 | name_b = "encoder_b_" + str(i)
32 | self.W[name_W] = tf.get_variable(
33 | name_W, [struct[i], struct[i+1]], initializer=tf.contrib.layers.xavier_initializer())
34 | self.b[name_b] = tf.get_variable(
35 | name_b, [struct[i+1]], initializer=tf.zeros_initializer())
36 |
37 |         # decode module (layer sizes are the encoder's in reverse; struct is restored to its original order below)
38 | struct.reverse()
39 | for i in range(self.layers - 1):
40 | name_W = "decoder_W_" + str(i)
41 | name_b = "decoder_b_" + str(i)
42 | self.W[name_W] = tf.get_variable(
43 | name_W, [struct[i], struct[i+1]], initializer=tf.contrib.layers.xavier_initializer())
44 | self.b[name_b] = tf.get_variable(
45 | name_b, [struct[i+1]], initializer=tf.zeros_initializer())
46 | self.struct.reverse()
47 |
48 | ############## define input ###################
49 | self.X = tf.placeholder(tf.float32, shape=[None, config.struct[0]])
50 |
51 | self.make_compute_graph()
52 | self.make_autoencoder_loss()
53 |
54 | # compute gradients for deep autoencoder
55 | self.train_opt_ae = tf.train.AdamOptimizer(config.ae_learning_rate).minimize(self.loss_ae)
56 |
57 | ############ define variables for skipgram ####################
58 | # construct variables for nce loss
59 | self.nce_weights = tf.get_variable("nce_weights", [
60 | self.N, self.dims], initializer=tf.contrib.layers.xavier_initializer())
61 | self.nce_biases = tf.get_variable(
62 | "nce_biases", [self.N], initializer=tf.zeros_initializer())
63 |
64 | self.loss_sg = self.make_skipgram_loss()
65 |
66 | # compute gradients for skipgram
67 | self.train_opt_sg = tf.train.AdamOptimizer(config.sg_learning_rate).minimize(self.loss_sg)
68 |
69 | def make_skipgram_loss(self):
70 | loss = tf.reduce_sum(tf.nn.sampled_softmax_loss(
71 | weights=self.nce_weights,
72 | biases=self.nce_biases,
73 | labels=self.labels,
74 | inputs=self.Y,
75 | num_sampled=self.config.num_sampled,
76 | num_classes=self.N))
77 |
78 | return loss
79 |
80 | def make_compute_graph(self):
81 |
82 | def encoder(X):
83 | for i in range(self.layers - 1):
84 | name_W = "encoder_W_" + str(i)
85 | name_b = "encoder_b_" + str(i)
86 | X = tf.nn.tanh(tf.matmul(X, self.W[name_W]) + self.b[name_b])
87 | return X
88 |
89 | def decoder(X):
90 | for i in range(self.layers - 1):
91 | name_W = "decoder_W_" + str(i)
92 | name_b = "decoder_b_" + str(i)
93 | X = tf.nn.tanh(tf.matmul(X, self.W[name_W]) + self.b[name_b])
94 | return X
95 |
96 | self.Y = encoder(self.X)
97 |
98 | self.X_reconstruct = decoder(self.Y)
99 |
100 | def make_autoencoder_loss(self):
101 |
102 | def get_autoencoder_loss(X, newX):
103 | return tf.reduce_sum(tf.pow((newX - X), 2))
104 |
105 | def get_reg_loss(weights, biases):
106 | reg = tf.add_n([tf.nn.l2_loss(w) for w in weights.itervalues()])
107 | reg += tf.add_n([tf.nn.l2_loss(b) for b in biases.itervalues()])
108 | return reg
109 |
110 | loss_autoencoder = get_autoencoder_loss(self.X_new, self.X_reconstruct)
111 |
112 | loss_reg = get_reg_loss(self.W, self.b)
113 |
114 | self.loss_ae = self.config.alpha * loss_autoencoder + self.config.reg * loss_reg
115 |
--------------------------------------------------------------------------------
/ANRL/node2vec.py:
--------------------------------------------------------------------------------
1 | # Leverage Node2vec for random walks
2 |
3 | import numpy as np
4 | import networkx as nx
5 | import random
6 |
7 |
8 | class Graph():
9 | def __init__(self, nx_G, is_directed, p, q):
10 | self.G = nx_G
11 | self.is_directed = is_directed
12 | self.p = p
13 | self.q = q
14 |
15 | def node2vec_walk(self, walk_length, start_node):
16 | G = self.G
17 | alias_nodes = self.alias_nodes
18 | alias_edges = self.alias_edges
19 |
20 | walk = [start_node]
21 |
22 | while len(walk) < walk_length:
23 | cur = walk[-1]
24 | cur_nbrs = sorted(G.neighbors(cur))
25 | if len(cur_nbrs) > 0:
26 | if len(walk) == 1:
27 | walk.append(
28 | cur_nbrs[alias_draw(alias_nodes[cur][0], alias_nodes[cur][1])])
29 | else:
30 | prev = walk[-2]
31 |                     next_node = cur_nbrs[alias_draw(alias_edges[(prev, cur)][0],
32 |                                                     alias_edges[(prev, cur)][1])]
33 |                     walk.append(next_node)
34 | else:
35 | break
36 |
37 | return walk
38 |
39 | def simulate_walks(self, num_walks, walk_length):
40 | '''
41 | Repeatedly simulate random walks from each node.
42 | '''
43 | G = self.G
44 | walks = []
45 | nodes = list(G.nodes())
46 | print 'simulating random walk...'
47 | for walk_iter in range(num_walks):
48 | print str(walk_iter+1), '/', str(num_walks)
49 | random.shuffle(nodes)
50 | for node in nodes:
51 | walks.append(self.node2vec_walk(
52 | walk_length=walk_length, start_node=node))
53 | return walks
54 |
55 | def get_alias_edge(self, src, dst):
56 | '''
57 |         Get the alias edge setup lists for a given edge: neighbors of dst are weighted 1/p when they return to src, 1 when they are also neighbors of src, and 1/q otherwise.
58 | '''
59 | G = self.G
60 | p = self.p
61 | q = self.q
62 |
63 | unnormalized_probs = []
64 | for dst_nbr in sorted(G.neighbors(dst)):
65 | if dst_nbr == src:
66 | unnormalized_probs.append(G[dst][dst_nbr]['weight']/p)
67 | elif G.has_edge(dst_nbr, src):
68 | unnormalized_probs.append(G[dst][dst_nbr]['weight'])
69 | else:
70 | unnormalized_probs.append(G[dst][dst_nbr]['weight']/q)
71 | norm_const = sum(unnormalized_probs)
72 | normalized_probs = [
73 | float(u_prob)/norm_const for u_prob in unnormalized_probs]
74 |
75 | return alias_setup(normalized_probs)
76 |
77 | def preprocess_transition_probs(self):
78 | '''
79 | Preprocessing of transition probabilities for guiding the random walks.
80 | '''
81 | G = self.G
82 | is_directed = self.is_directed
83 |
84 | alias_nodes = {}
85 | for node in G.nodes():
86 | unnormalized_probs = [G[node][nbr]['weight']
87 | for nbr in sorted(G.neighbors(node))]
88 | norm_const = sum(unnormalized_probs)
89 | normalized_probs = [
90 | float(u_prob)/norm_const for u_prob in unnormalized_probs]
91 | alias_nodes[node] = alias_setup(normalized_probs)
92 |
93 | alias_edges = {}
94 | if is_directed:
95 | for edge in G.edges():
96 | alias_edges[edge] = self.get_alias_edge(edge[0], edge[1])
97 | else:
98 | for edge in G.edges():
99 | alias_edges[edge] = self.get_alias_edge(edge[0], edge[1])
100 | alias_edges[(edge[1], edge[0])] = self.get_alias_edge(
101 | edge[1], edge[0])
102 |
103 | self.alias_nodes = alias_nodes
104 | self.alias_edges = alias_edges
105 |
106 |
107 | def alias_setup(probs):
108 | '''
109 | Compute utility lists for non-uniform sampling from discrete distributions.
110 | Refer to https://hips.seas.harvard.edu/blog/2013/03/03/the-alias-method-efficient-sampling-with-many-discrete-outcomes/
111 | for details
112 | '''
113 | K = len(probs)
114 | q = np.zeros(K)
115 | J = np.zeros(K, dtype=np.int)
116 |
117 | smaller = []
118 | larger = []
119 | for kk, prob in enumerate(probs):
120 | q[kk] = K*prob
121 | if q[kk] < 1.0:
122 | smaller.append(kk)
123 | else:
124 | larger.append(kk)
125 |
126 | while len(smaller) > 0 and len(larger) > 0:
127 | small = smaller.pop()
128 | large = larger.pop()
129 |
130 | J[small] = large
131 | q[large] = q[large] + q[small] - 1.0
132 | if q[large] < 1.0:
133 | smaller.append(large)
134 | else:
135 | larger.append(large)
136 |
137 | return J, q
138 |
139 |
140 | def alias_draw(J, q):
141 | '''
142 | Draw sample from a non-uniform discrete distribution using alias sampling.
143 | '''
144 | K = len(J)
145 |
146 | kk = int(np.floor(np.random.rand()*K))
147 | if np.random.rand() < q[kk]:
148 | return kk
149 | else:
150 | return J[kk]
151 |
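For illustration only (this snippet is not part of the repository): the two alias-method helpers above give O(K) preprocessing and O(1) sampling from an arbitrary discrete distribution.
```python
from collections import Counter
from node2vec import alias_setup, alias_draw

# Toy distribution over three outcomes.
probs = [0.2, 0.5, 0.3]
J, q = alias_setup(probs)                            # one-time O(K) setup
draws = [alias_draw(J, q) for _ in range(100000)]    # each draw is O(1)
print(Counter(draws))  # counts should be roughly 20000 / 50000 / 30000
```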
--------------------------------------------------------------------------------
/ANRL/paper/ANRL.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hp027/AliGraph/60bd650d6d1c66eb419399d28c88972c89472c74/ANRL/paper/ANRL.pdf
--------------------------------------------------------------------------------
/ANRL/requirements.txt:
--------------------------------------------------------------------------------
1 | tensorflow==1.4.1
2 | networkx==1.11
3 | numpy==1.14.2
4 | scipy==0.19.1
5 | scikit-learn==0.19.0
6 |
--------------------------------------------------------------------------------
/ANRL/utils.py:
--------------------------------------------------------------------------------
1 | # Useful functions
2 |
3 | import networkx as nx
4 | import numpy as np
5 | import collections
6 | import random
7 | import tensorflow as tf
8 |
9 |
10 | def read_graph(FLAGS, edgeFile):
11 | print "loading graph..."
12 |
13 | if FLAGS.weighted:
14 | G = nx.read_edgelist(edgeFile, nodetype=int, data=(
15 | ('weight', float),), create_using=nx.DiGraph())
16 | else:
17 | G = nx.read_edgelist(edgeFile, nodetype=int, create_using=nx.DiGraph())
18 | for edge in G.edges():
19 | G[edge[0]][edge[1]]['weight'] = 1
20 |
21 | if not FLAGS.directed:
22 | G = G.to_undirected()
23 |
24 | return G
25 |
26 |
27 | def read_edgelist(inputFileName):
28 | f = open(inputFileName, "r")
29 | lines = f.readlines()
30 | f.close()
31 |
32 | edgelist = []
33 | for line in lines:
34 | l = line.strip("\n\r").split(" ")
35 | edge = (int(l[0]), int(l[1]))
36 | edgelist.append(edge)
37 | return edgelist
38 |
39 |
40 | def read_feature(inputFileName):
41 | f = open(inputFileName, "r")
42 | lines = f.readlines()
43 | f.close()
44 |
45 | features = []
46 | for line in lines[1:]:
47 | l = line.strip("\n\r").split(" ")
48 | features.append(l)
49 | features = np.array(features, dtype=np.float32)
50 | features[features > 0] = 1.0 # feature binarization
51 |
52 | return features
53 |
54 |
55 | def write_embedding(embedding_result, outputFileName):
56 | f = open(outputFileName, "w")
57 | N, dims = embedding_result.shape
58 |
59 | for i in range(N):
60 | s = ""
61 | for j in range(dims):
62 | if j == 0:
63 | s = str(i) + "," + str(embedding_result[i, j])
64 | else:
65 | s = s + "," + str(embedding_result[i, j])
66 |         f.write(s + "\n")
67 | f.close()
68 |
--------------------------------------------------------------------------------
/DELP/README.md:
--------------------------------------------------------------------------------
1 | # DELP
2 |
3 | This is a Tensorflow implementation of the DELP algorithm, which jointly performs graph embedding and label propagation.
4 |
5 | ## Requirements
6 | * tensorflow
7 | * networkx
8 | * numpy
9 | * scipy
10 | * scikit-learn
11 |
12 | All required packages are defined in requirements.txt. To install all requirements, just use the following command:
13 | ```
14 | pip install -r requirements.txt
15 | ```
16 |
17 | ## Basic Usage
18 |
19 | ### Input Data
20 | Each dataset contains 3 files: edgelist, features and labels.
21 | ```
22 | 1. cora.edgelist: each line contains two connected nodes.
23 | node_1 node_2
24 | node_2 node_3
25 | ...
26 |
27 | 2. cora.feature: this file has n+1 lines.
28 | The first line has the following format:
29 | node_number feature_dimension
30 | The next n lines are as follows: (each node per line ordered by node id)
31 | (for node_1) feature_1 feature_2 ... feature_n
32 | (for node_2) feature_1 feature_2 ... feature_n
33 | ...
34 |
35 | 3. cora.label: each line represents a node and its class label.
36 | node_1 label_1
37 | node_2 label_2
38 | ...
39 | ```
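For reference, a minimal loading sketch for this layout (the helper below is illustrative only; the repository's own loading logic lives in the `data` class in `utils.py`):
```python
import networkx as nx
import numpy as np

def load_cora(edge_file="graph/cora.edgelist",
              feat_file="graph/cora.feature",
              label_file="graph/cora.label"):
    # edgelist: one "node_1 node_2" pair per line
    G = nx.read_edgelist(edge_file, nodetype=int)

    # feature file: header "node_number feature_dimension",
    # then one feature row per node, ordered by node id
    with open(feat_file) as f:
        n, d = map(int, f.readline().split())
        X = np.loadtxt(f, dtype=np.float32)
    assert X.shape == (n, d)

    # label file: one "node_id label" pair per line
    labels = np.loadtxt(label_file, dtype=np.int32)
    y = labels[np.argsort(labels[:, 0]), 1]

    return G, X, y
```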
40 |
41 | ### Run
42 | To run DELP, just execute the following command for the node classification task:
43 | ```
44 | python main.py
45 | ```
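The defaults in `main.py` target the Cora dataset; the hyper-parameters defined there can be overridden on the command line, for example:
```
python main.py --dataset cora --ratio 0.05 --embedding_size 128 --max_iter 100
```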
--------------------------------------------------------------------------------
/DELP/graph/cora.label:
--------------------------------------------------------------------------------
1 | 0 5
2 | 1 5
3 | 2 0
4 | 3 5
5 | 4 5
6 | 5 5
7 | 6 5
8 | 7 5
9 | 8 5
10 | 9 5
11 | 10 5
12 | 11 5
13 | 12 5
14 | 13 5
15 | 14 5
16 | 15 0
17 | 16 5
18 | 17 4
19 | 18 5
20 | 19 5
21 | 20 5
22 | 21 5
23 | 22 5
24 | 23 5
25 | 24 5
26 | 25 5
27 | 26 5
28 | 27 5
29 | 28 5
30 | 29 5
31 | 30 5
32 | 31 5
33 | 32 5
34 | 33 5
35 | 34 5
36 | 35 5
37 | 36 5
38 | 37 5
39 | 38 5
40 | 39 5
41 | 40 5
42 | 41 5
43 | 42 5
44 | 43 5
45 | 44 5
46 | 45 5
47 | 46 5
48 | 47 5
49 | 48 5
50 | 49 5
51 | 50 5
52 | 51 5
53 | 52 5
54 | 53 5
55 | 54 5
56 | 55 2
57 | 56 5
58 | 57 5
59 | 58 5
60 | 59 5
61 | 60 5
62 | 61 0
63 | 62 5
64 | 63 5
65 | 64 5
66 | 65 5
67 | 66 6
68 | 67 2
69 | 68 5
70 | 69 5
71 | 70 5
72 | 71 5
73 | 72 5
74 | 73 5
75 | 74 5
76 | 75 0
77 | 76 5
78 | 77 5
79 | 78 5
80 | 79 5
81 | 80 5
82 | 81 5
83 | 82 5
84 | 83 4
85 | 84 2
86 | 85 5
87 | 86 5
88 | 87 5
89 | 88 5
90 | 89 5
91 | 90 5
92 | 91 5
93 | 92 5
94 | 93 5
95 | 94 5
96 | 95 5
97 | 96 5
98 | 97 5
99 | 98 5
100 | 99 5
101 | 100 5
102 | 101 5
103 | 102 5
104 | 103 5
105 | 104 0
106 | 105 5
107 | 106 5
108 | 107 5
109 | 108 5
110 | 109 5
111 | 110 5
112 | 111 5
113 | 112 5
114 | 113 5
115 | 114 5
116 | 115 5
117 | 116 5
118 | 117 5
119 | 118 5
120 | 119 5
121 | 120 5
122 | 121 5
123 | 122 5
124 | 123 5
125 | 124 5
126 | 125 5
127 | 126 5
128 | 127 5
129 | 128 5
130 | 129 5
131 | 130 5
132 | 131 5
133 | 132 5
134 | 133 5
135 | 134 5
136 | 135 5
137 | 136 5
138 | 137 5
139 | 138 5
140 | 139 5
141 | 140 5
142 | 141 5
143 | 142 5
144 | 143 5
145 | 144 5
146 | 145 5
147 | 146 5
148 | 147 5
149 | 148 5
150 | 149 5
151 | 150 5
152 | 151 5
153 | 152 5
154 | 153 5
155 | 154 5
156 | 155 5
157 | 156 5
158 | 157 5
159 | 158 6
160 | 159 5
161 | 160 5
162 | 161 5
163 | 162 2
164 | 163 5
165 | 164 5
166 | 165 5
167 | 166 5
168 | 167 5
169 | 168 5
170 | 169 5
171 | 170 2
172 | 171 2
173 | 172 2
174 | 173 2
175 | 174 2
176 | 175 2
177 | 176 2
178 | 177 2
179 | 178 2
180 | 179 2
181 | 180 2
182 | 181 0
183 | 182 2
184 | 183 2
185 | 184 2
186 | 185 2
187 | 186 2
188 | 187 2
189 | 188 3
190 | 189 2
191 | 190 2
192 | 191 2
193 | 192 2
194 | 193 2
195 | 194 2
196 | 195 2
197 | 196 2
198 | 197 2
199 | 198 2
200 | 199 2
201 | 200 2
202 | 201 2
203 | 202 2
204 | 203 2
205 | 204 2
206 | 205 2
207 | 206 2
208 | 207 2
209 | 208 2
210 | 209 2
211 | 210 2
212 | 211 2
213 | 212 2
214 | 213 2
215 | 214 3
216 | 215 3
217 | 216 2
218 | 217 3
219 | 218 2
220 | 219 2
221 | 220 2
222 | 221 2
223 | 222 2
224 | 223 2
225 | 224 2
226 | 225 2
227 | 226 2
228 | 227 2
229 | 228 2
230 | 229 2
231 | 230 2
232 | 231 1
233 | 232 0
234 | 233 6
235 | 234 6
236 | 235 0
237 | 236 6
238 | 237 6
239 | 238 3
240 | 239 3
241 | 240 3
242 | 241 3
243 | 242 3
244 | 243 3
245 | 244 6
246 | 245 2
247 | 246 2
248 | 247 2
249 | 248 2
250 | 249 2
251 | 250 0
252 | 251 6
253 | 252 5
254 | 253 2
255 | 254 4
256 | 255 0
257 | 256 0
258 | 257 4
259 | 258 6
260 | 259 5
261 | 260 2
262 | 261 2
263 | 262 2
264 | 263 2
265 | 264 5
266 | 265 0
267 | 266 0
268 | 267 0
269 | 268 0
270 | 269 0
271 | 270 0
272 | 271 0
273 | 272 0
274 | 273 0
275 | 274 0
276 | 275 0
277 | 276 0
278 | 277 0
279 | 278 0
280 | 279 0
281 | 280 0
282 | 281 0
283 | 282 0
284 | 283 0
285 | 284 0
286 | 285 0
287 | 286 0
288 | 287 3
289 | 288 0
290 | 289 0
291 | 290 0
292 | 291 0
293 | 292 0
294 | 293 0
295 | 294 0
296 | 295 0
297 | 296 0
298 | 297 0
299 | 298 0
300 | 299 0
301 | 300 0
302 | 301 5
303 | 302 0
304 | 303 0
305 | 304 0
306 | 305 0
307 | 306 0
308 | 307 0
309 | 308 0
310 | 309 6
311 | 310 6
312 | 311 6
313 | 312 6
314 | 313 6
315 | 314 6
316 | 315 6
317 | 316 6
318 | 317 6
319 | 318 6
320 | 319 6
321 | 320 6
322 | 321 6
323 | 322 6
324 | 323 6
325 | 324 5
326 | 325 5
327 | 326 5
328 | 327 5
329 | 328 5
330 | 329 5
331 | 330 5
332 | 331 2
333 | 332 5
334 | 333 0
335 | 334 0
336 | 335 4
337 | 336 0
338 | 337 2
339 | 338 4
340 | 339 4
341 | 340 4
342 | 341 4
343 | 342 4
344 | 343 6
345 | 344 4
346 | 345 0
347 | 346 6
348 | 347 4
349 | 348 4
350 | 349 4
351 | 350 4
352 | 351 4
353 | 352 1
354 | 353 4
355 | 354 0
356 | 355 1
357 | 356 6
358 | 357 4
359 | 358 4
360 | 359 5
361 | 360 1
362 | 361 1
363 | 362 4
364 | 363 0
365 | 364 1
366 | 365 4
367 | 366 4
368 | 367 1
369 | 368 1
370 | 369 4
371 | 370 1
372 | 371 1
373 | 372 6
374 | 373 0
375 | 374 0
376 | 375 0
377 | 376 2
378 | 377 0
379 | 378 0
380 | 379 0
381 | 380 0
382 | 381 0
383 | 382 0
384 | 383 4
385 | 384 0
386 | 385 2
387 | 386 0
388 | 387 0
389 | 388 0
390 | 389 0
391 | 390 0
392 | 391 0
393 | 392 0
394 | 393 2
395 | 394 0
396 | 395 0
397 | 396 0
398 | 397 0
399 | 398 0
400 | 399 0
401 | 400 0
402 | 401 0
403 | 402 4
404 | 403 0
405 | 404 0
406 | 405 0
407 | 406 0
408 | 407 0
409 | 408 0
410 | 409 0
411 | 410 0
412 | 411 2
413 | 412 0
414 | 413 0
415 | 414 0
416 | 415 0
417 | 416 0
418 | 417 0
419 | 418 0
420 | 419 0
421 | 420 0
422 | 421 0
423 | 422 0
424 | 423 2
425 | 424 0
426 | 425 0
427 | 426 0
428 | 427 0
429 | 428 0
430 | 429 0
431 | 430 0
432 | 431 0
433 | 432 0
434 | 433 0
435 | 434 0
436 | 435 0
437 | 436 0
438 | 437 0
439 | 438 0
440 | 439 0
441 | 440 0
442 | 441 0
443 | 442 6
444 | 443 6
445 | 444 6
446 | 445 6
447 | 446 6
448 | 447 6
449 | 448 6
450 | 449 6
451 | 450 1
452 | 451 6
453 | 452 6
454 | 453 6
455 | 454 6
456 | 455 6
457 | 456 6
458 | 457 6
459 | 458 5
460 | 459 5
461 | 460 5
462 | 461 5
463 | 462 5
464 | 463 3
465 | 464 3
466 | 465 3
467 | 466 3
468 | 467 4
469 | 468 0
470 | 469 1
471 | 470 0
472 | 471 5
473 | 472 5
474 | 473 3
475 | 474 3
476 | 475 6
477 | 476 6
478 | 477 6
479 | 478 6
480 | 479 6
481 | 480 6
482 | 481 6
483 | 482 6
484 | 483 6
485 | 484 6
486 | 485 6
487 | 486 6
488 | 487 6
489 | 488 6
490 | 489 6
491 | 490 1
492 | 491 1
493 | 492 1
494 | 493 1
495 | 494 6
496 | 495 6
497 | 496 6
498 | 497 6
499 | 498 6
500 | 499 6
501 | 500 6
502 | 501 3
503 | 502 4
504 | 503 4
505 | 504 0
506 | 505 3
507 | 506 3
508 | 507 3
509 | 508 3
510 | 509 4
511 | 510 3
512 | 511 4
513 | 512 3
514 | 513 6
515 | 514 3
516 | 515 4
517 | 516 4
518 | 517 4
519 | 518 4
520 | 519 4
521 | 520 6
522 | 521 6
523 | 522 4
524 | 523 4
525 | 524 1
526 | 525 1
527 | 526 1
528 | 527 1
529 | 528 1
530 | 529 1
531 | 530 1
532 | 531 1
533 | 532 1
534 | 533 1
535 | 534 1
536 | 535 1
537 | 536 1
538 | 537 1
539 | 538 1
540 | 539 1
541 | 540 1
542 | 541 1
543 | 542 1
544 | 543 4
545 | 544 1
546 | 545 1
547 | 546 1
548 | 547 4
549 | 548 4
550 | 549 4
551 | 550 4
552 | 551 4
553 | 552 4
554 | 553 4
555 | 554 4
556 | 555 4
557 | 556 4
558 | 557 0
559 | 558 4
560 | 559 4
561 | 560 4
562 | 561 4
563 | 562 4
564 | 563 4
565 | 564 4
566 | 565 4
567 | 566 0
568 | 567 4
569 | 568 0
570 | 569 4
571 | 570 4
572 | 571 4
573 | 572 4
574 | 573 4
575 | 574 4
576 | 575 0
577 | 576 0
578 | 577 0
579 | 578 0
580 | 579 3
581 | 580 0
582 | 581 0
583 | 582 0
584 | 583 0
585 | 584 0
586 | 585 0
587 | 586 3
588 | 587 0
589 | 588 0
590 | 589 0
591 | 590 0
592 | 591 0
593 | 592 0
594 | 593 0
595 | 594 0
596 | 595 0
597 | 596 0
598 | 597 0
599 | 598 0
600 | 599 0
601 | 600 0
602 | 601 0
603 | 602 0
604 | 603 0
605 | 604 0
606 | 605 0
607 | 606 0
608 | 607 0
609 | 608 0
610 | 609 0
611 | 610 0
612 | 611 0
613 | 612 0
614 | 613 0
615 | 614 0
616 | 615 0
617 | 616 0
618 | 617 0
619 | 618 0
620 | 619 0
621 | 620 0
622 | 621 0
623 | 622 0
624 | 623 0
625 | 624 0
626 | 625 0
627 | 626 0
628 | 627 0
629 | 628 0
630 | 629 6
631 | 630 6
632 | 631 6
633 | 632 6
634 | 633 6
635 | 634 6
636 | 635 0
637 | 636 0
638 | 637 6
639 | 638 6
640 | 639 6
641 | 640 6
642 | 641 6
643 | 642 6
644 | 643 6
645 | 644 6
646 | 645 6
647 | 646 6
648 | 647 6
649 | 648 6
650 | 649 6
651 | 650 3
652 | 651 3
653 | 652 3
654 | 653 3
655 | 654 3
656 | 655 3
657 | 656 0
658 | 657 3
659 | 658 3
660 | 659 3
661 | 660 3
662 | 661 3
663 | 662 3
664 | 663 3
665 | 664 3
666 | 665 3
667 | 666 3
668 | 667 3
669 | 668 4
670 | 669 3
671 | 670 1
672 | 671 4
673 | 672 1
674 | 673 1
675 | 674 6
676 | 675 4
677 | 676 4
678 | 677 4
679 | 678 1
680 | 679 1
681 | 680 6
682 | 681 6
683 | 682 1
684 | 683 1
685 | 684 6
686 | 685 1
687 | 686 0
688 | 687 0
689 | 688 6
690 | 689 6
691 | 690 0
692 | 691 6
693 | 692 6
694 | 693 6
695 | 694 6
696 | 695 4
697 | 696 6
698 | 697 6
699 | 698 6
700 | 699 6
701 | 700 6
702 | 701 4
703 | 702 6
704 | 703 6
705 | 704 4
706 | 705 3
707 | 706 3
708 | 707 3
709 | 708 3
710 | 709 4
711 | 710 0
712 | 711 0
713 | 712 0
714 | 713 0
715 | 714 5
716 | 715 0
717 | 716 0
718 | 717 5
719 | 718 2
720 | 719 0
721 | 720 0
722 | 721 0
723 | 722 0
724 | 723 0
725 | 724 0
726 | 725 0
727 | 726 0
728 | 727 0
729 | 728 0
730 | 729 2
731 | 730 0
732 | 731 0
733 | 732 3
734 | 733 0
735 | 734 0
736 | 735 0
737 | 736 6
738 | 737 0
739 | 738 0
740 | 739 0
741 | 740 2
742 | 741 3
743 | 742 3
744 | 743 0
745 | 744 0
746 | 745 0
747 | 746 3
748 | 747 2
749 | 748 0
750 | 749 5
751 | 750 0
752 | 751 0
753 | 752 5
754 | 753 0
755 | 754 0
756 | 755 0
757 | 756 4
758 | 757 4
759 | 758 6
760 | 759 6
761 | 760 0
762 | 761 6
763 | 762 0
764 | 763 1
765 | 764 4
766 | 765 1
767 | 766 0
768 | 767 4
769 | 768 0
770 | 769 3
771 | 770 0
772 | 771 4
773 | 772 6
774 | 773 6
775 | 774 4
776 | 775 4
777 | 776 1
778 | 777 4
779 | 778 6
780 | 779 4
781 | 780 4
782 | 781 6
783 | 782 3
784 | 783 4
785 | 784 4
786 | 785 4
787 | 786 4
788 | 787 3
789 | 788 4
790 | 789 1
791 | 790 4
792 | 791 6
793 | 792 6
794 | 793 4
795 | 794 4
796 | 795 6
797 | 796 2
798 | 797 0
799 | 798 0
800 | 799 4
801 | 800 0
802 | 801 6
803 | 802 6
804 | 803 6
805 | 804 1
806 | 805 6
807 | 806 4
808 | 807 3
809 | 808 3
810 | 809 0
811 | 810 0
812 | 811 0
813 | 812 3
814 | 813 3
815 | 814 3
816 | 815 0
817 | 816 0
818 | 817 3
819 | 818 0
820 | 819 0
821 | 820 0
822 | 821 0
823 | 822 3
824 | 823 0
825 | 824 0
826 | 825 0
827 | 826 0
828 | 827 2
829 | 828 0
830 | 829 3
831 | 830 0
832 | 831 0
833 | 832 0
834 | 833 0
835 | 834 0
836 | 835 0
837 | 836 3
838 | 837 3
839 | 838 3
840 | 839 3
841 | 840 0
842 | 841 3
843 | 842 4
844 | 843 3
845 | 844 4
846 | 845 3
847 | 846 3
848 | 847 4
849 | 848 2
850 | 849 2
851 | 850 2
852 | 851 0
853 | 852 2
854 | 853 4
855 | 854 2
856 | 855 2
857 | 856 2
858 | 857 4
859 | 858 4
860 | 859 1
861 | 860 1
862 | 861 1
863 | 862 1
864 | 863 1
865 | 864 1
866 | 865 1
867 | 866 1
868 | 867 4
869 | 868 4
870 | 869 4
871 | 870 4
872 | 871 0
873 | 872 4
874 | 873 3
875 | 874 4
876 | 875 4
877 | 876 4
878 | 877 4
879 | 878 4
880 | 879 4
881 | 880 4
882 | 881 4
883 | 882 4
884 | 883 6
885 | 884 6
886 | 885 6
887 | 886 6
888 | 887 6
889 | 888 6
890 | 889 6
891 | 890 3
892 | 891 3
893 | 892 3
894 | 893 3
895 | 894 3
896 | 895 4
897 | 896 0
898 | 897 0
899 | 898 0
900 | 899 0
901 | 900 0
902 | 901 0
903 | 902 0
904 | 903 0
905 | 904 0
906 | 905 0
907 | 906 0
908 | 907 6
909 | 908 6
910 | 909 6
911 | 910 6
912 | 911 4
913 | 912 4
914 | 913 4
915 | 914 5
916 | 915 4
917 | 916 4
918 | 917 4
919 | 918 4
920 | 919 0
921 | 920 0
922 | 921 0
923 | 922 0
924 | 923 0
925 | 924 2
926 | 925 2
927 | 926 0
928 | 927 2
929 | 928 2
930 | 929 2
931 | 930 2
932 | 931 5
933 | 932 2
934 | 933 2
935 | 934 2
936 | 935 2
937 | 936 0
938 | 937 2
939 | 938 0
940 | 939 2
941 | 940 2
942 | 941 2
943 | 942 2
944 | 943 6
945 | 944 2
946 | 945 2
947 | 946 2
948 | 947 0
949 | 948 5
950 | 949 2
951 | 950 2
952 | 951 2
953 | 952 2
954 | 953 2
955 | 954 2
956 | 955 2
957 | 956 2
958 | 957 2
959 | 958 2
960 | 959 2
961 | 960 2
962 | 961 2
963 | 962 2
964 | 963 2
965 | 964 2
966 | 965 2
967 | 966 2
968 | 967 4
969 | 968 2
970 | 969 0
971 | 970 2
972 | 971 2
973 | 972 2
974 | 973 0
975 | 974 2
976 | 975 2
977 | 976 6
978 | 977 2
979 | 978 2
980 | 979 2
981 | 980 2
982 | 981 2
983 | 982 4
984 | 983 4
985 | 984 3
986 | 985 4
987 | 986 4
988 | 987 4
989 | 988 4
990 | 989 6
991 | 990 6
992 | 991 6
993 | 992 6
994 | 993 6
995 | 994 6
996 | 995 6
997 | 996 6
998 | 997 6
999 | 998 6
1000 | 999 6
1001 | 1000 4
1002 | 1001 6
1003 | 1002 0
1004 | 1003 6
1005 | 1004 6
1006 | 1005 0
1007 | 1006 2
1008 | 1007 2
1009 | 1008 2
1010 | 1009 2
1011 | 1010 2
1012 | 1011 4
1013 | 1012 0
1014 | 1013 1
1015 | 1014 1
1016 | 1015 1
1017 | 1016 4
1018 | 1017 6
1019 | 1018 4
1020 | 1019 4
1021 | 1020 1
1022 | 1021 1
1023 | 1022 5
1024 | 1023 6
1025 | 1024 6
1026 | 1025 0
1027 | 1026 0
1028 | 1027 0
1029 | 1028 0
1030 | 1029 4
1031 | 1030 0
1032 | 1031 0
1033 | 1032 0
1034 | 1033 0
1035 | 1034 3
1036 | 1035 3
1037 | 1036 3
1038 | 1037 0
1039 | 1038 3
1040 | 1039 3
1041 | 1040 3
1042 | 1041 3
1043 | 1042 3
1044 | 1043 3
1045 | 1044 3
1046 | 1045 3
1047 | 1046 3
1048 | 1047 3
1049 | 1048 3
1050 | 1049 0
1051 | 1050 0
1052 | 1051 0
1053 | 1052 0
1054 | 1053 3
1055 | 1054 0
1056 | 1055 0
1057 | 1056 0
1058 | 1057 0
1059 | 1058 0
1060 | 1059 3
1061 | 1060 0
1062 | 1061 0
1063 | 1062 3
1064 | 1063 3
1065 | 1064 3
1066 | 1065 3
1067 | 1066 3
1068 | 1067 1
1069 | 1068 1
1070 | 1069 1
1071 | 1070 1
1072 | 1071 4
1073 | 1072 4
1074 | 1073 1
1075 | 1074 4
1076 | 1075 4
1077 | 1076 1
1078 | 1077 1
1079 | 1078 1
1080 | 1079 1
1081 | 1080 1
1082 | 1081 4
1083 | 1082 4
1084 | 1083 2
1085 | 1084 0
1086 | 1085 4
1087 | 1086 4
1088 | 1087 0
1089 | 1088 0
1090 | 1089 0
1091 | 1090 0
1092 | 1091 0
1093 | 1092 2
1094 | 1093 2
1095 | 1094 2
1096 | 1095 2
1097 | 1096 0
1098 | 1097 0
1099 | 1098 0
1100 | 1099 0
1101 | 1100 0
1102 | 1101 0
1103 | 1102 0
1104 | 1103 0
1105 | 1104 0
1106 | 1105 0
1107 | 1106 4
1108 | 1107 4
1109 | 1108 4
1110 | 1109 1
1111 | 1110 4
1112 | 1111 1
1113 | 1112 0
1114 | 1113 4
1115 | 1114 3
1116 | 1115 1
1117 | 1116 1
1118 | 1117 1
1119 | 1118 0
1120 | 1119 4
1121 | 1120 1
1122 | 1121 1
1123 | 1122 6
1124 | 1123 4
1125 | 1124 0
1126 | 1125 0
1127 | 1126 0
1128 | 1127 0
1129 | 1128 0
1130 | 1129 0
1131 | 1130 0
1132 | 1131 1
1133 | 1132 4
1134 | 1133 1
1135 | 1134 4
1136 | 1135 6
1137 | 1136 3
1138 | 1137 3
1139 | 1138 1
1140 | 1139 6
1141 | 1140 4
1142 | 1141 6
1143 | 1142 6
1144 | 1143 5
1145 | 1144 5
1146 | 1145 5
1147 | 1146 5
1148 | 1147 2
1149 | 1148 5
1150 | 1149 2
1151 | 1150 5
1152 | 1151 5
1153 | 1152 5
1154 | 1153 5
1155 | 1154 5
1156 | 1155 5
1157 | 1156 2
1158 | 1157 4
1159 | 1158 4
1160 | 1159 4
1161 | 1160 4
1162 | 1161 3
1163 | 1162 3
1164 | 1163 3
1165 | 1164 5
1166 | 1165 0
1167 | 1166 0
1168 | 1167 0
1169 | 1168 4
1170 | 1169 4
1171 | 1170 4
1172 | 1171 3
1173 | 1172 4
1174 | 1173 6
1175 | 1174 0
1176 | 1175 6
1177 | 1176 4
1178 | 1177 4
1179 | 1178 4
1180 | 1179 4
1181 | 1180 4
1182 | 1181 4
1183 | 1182 4
1184 | 1183 4
1185 | 1184 4
1186 | 1185 4
1187 | 1186 4
1188 | 1187 6
1189 | 1188 6
1190 | 1189 6
1191 | 1190 6
1192 | 1191 6
1193 | 1192 6
1194 | 1193 6
1195 | 1194 6
1196 | 1195 6
1197 | 1196 6
1198 | 1197 6
1199 | 1198 6
1200 | 1199 0
1201 | 1200 0
1202 | 1201 4
1203 | 1202 0
1204 | 1203 0
1205 | 1204 0
1206 | 1205 0
1207 | 1206 0
1208 | 1207 0
1209 | 1208 0
1210 | 1209 0
1211 | 1210 6
1212 | 1211 0
1213 | 1212 2
1214 | 1213 6
1215 | 1214 6
1216 | 1215 4
1217 | 1216 6
1218 | 1217 1
1219 | 1218 0
1220 | 1219 0
1221 | 1220 4
1222 | 1221 4
1223 | 1222 1
1224 | 1223 4
1225 | 1224 1
1226 | 1225 6
1227 | 1226 4
1228 | 1227 4
1229 | 1228 4
1230 | 1229 4
1231 | 1230 4
1232 | 1231 4
1233 | 1232 4
1234 | 1233 4
1235 | 1234 4
1236 | 1235 4
1237 | 1236 4
1238 | 1237 4
1239 | 1238 4
1240 | 1239 4
1241 | 1240 4
1242 | 1241 4
1243 | 1242 4
1244 | 1243 4
1245 | 1244 4
1246 | 1245 4
1247 | 1246 4
1248 | 1247 4
1249 | 1248 3
1250 | 1249 4
1251 | 1250 4
1252 | 1251 4
1253 | 1252 4
1254 | 1253 4
1255 | 1254 4
1256 | 1255 4
1257 | 1256 4
1258 | 1257 4
1259 | 1258 4
1260 | 1259 4
1261 | 1260 4
1262 | 1261 4
1263 | 1262 0
1264 | 1263 0
1265 | 1264 0
1266 | 1265 0
1267 | 1266 0
1268 | 1267 0
1269 | 1268 0
1270 | 1269 0
1271 | 1270 0
1272 | 1271 0
1273 | 1272 0
1274 | 1273 4
1275 | 1274 4
1276 | 1275 4
1277 | 1276 4
1278 | 1277 4
1279 | 1278 0
1280 | 1279 2
1281 | 1280 5
1282 | 1281 5
1283 | 1282 5
1284 | 1283 5
1285 | 1284 5
1286 | 1285 5
1287 | 1286 5
1288 | 1287 5
1289 | 1288 5
1290 | 1289 6
1291 | 1290 5
1292 | 1291 0
1293 | 1292 0
1294 | 1293 2
1295 | 1294 2
1296 | 1295 2
1297 | 1296 2
1298 | 1297 0
1299 | 1298 0
1300 | 1299 2
1301 | 1300 4
1302 | 1301 0
1303 | 1302 2
1304 | 1303 3
1305 | 1304 0
1306 | 1305 0
1307 | 1306 0
1308 | 1307 0
1309 | 1308 0
1310 | 1309 0
1311 | 1310 0
1312 | 1311 0
1313 | 1312 0
1314 | 1313 1
1315 | 1314 1
1316 | 1315 2
1317 | 1316 2
1318 | 1317 2
1319 | 1318 4
1320 | 1319 4
1321 | 1320 5
1322 | 1321 5
1323 | 1322 5
1324 | 1323 5
1325 | 1324 5
1326 | 1325 0
1327 | 1326 0
1328 | 1327 3
1329 | 1328 3
1330 | 1329 6
1331 | 1330 6
1332 | 1331 6
1333 | 1332 6
1334 | 1333 1
1335 | 1334 6
1336 | 1335 6
1337 | 1336 1
1338 | 1337 6
1339 | 1338 0
1340 | 1339 4
1341 | 1340 4
1342 | 1341 0
1343 | 1342 0
1344 | 1343 0
1345 | 1344 4
1346 | 1345 4
1347 | 1346 4
1348 | 1347 4
1349 | 1348 4
1350 | 1349 4
1351 | 1350 4
1352 | 1351 5
1353 | 1352 6
1354 | 1353 6
1355 | 1354 6
1356 | 1355 6
1357 | 1356 6
1358 | 1357 3
1359 | 1358 3
1360 | 1359 3
1361 | 1360 3
1362 | 1361 3
1363 | 1362 3
1364 | 1363 3
1365 | 1364 3
1366 | 1365 3
1367 | 1366 3
1368 | 1367 0
1369 | 1368 0
1370 | 1369 0
1371 | 1370 0
1372 | 1371 0
1373 | 1372 0
1374 | 1373 0
1375 | 1374 0
1376 | 1375 0
1377 | 1376 4
1378 | 1377 0
1379 | 1378 0
1380 | 1379 0
1381 | 1380 0
1382 | 1381 0
1383 | 1382 0
1384 | 1383 3
1385 | 1384 3
1386 | 1385 3
1387 | 1386 3
1388 | 1387 3
1389 | 1388 3
1390 | 1389 3
1391 | 1390 3
1392 | 1391 3
1393 | 1392 3
1394 | 1393 3
1395 | 1394 3
1396 | 1395 3
1397 | 1396 3
1398 | 1397 3
1399 | 1398 2
1400 | 1399 3
1401 | 1400 2
1402 | 1401 2
1403 | 1402 3
1404 | 1403 3
1405 | 1404 0
1406 | 1405 5
1407 | 1406 2
1408 | 1407 5
1409 | 1408 5
1410 | 1409 6
1411 | 1410 2
1412 | 1411 0
1413 | 1412 0
1414 | 1413 0
1415 | 1414 0
1416 | 1415 0
1417 | 1416 1
1418 | 1417 3
1419 | 1418 3
1420 | 1419 3
1421 | 1420 3
1422 | 1421 3
1423 | 1422 3
1424 | 1423 3
1425 | 1424 3
1426 | 1425 3
1427 | 1426 3
1428 | 1427 3
1429 | 1428 3
1430 | 1429 3
1431 | 1430 3
1432 | 1431 4
1433 | 1432 3
1434 | 1433 3
1435 | 1434 4
1436 | 1435 3
1437 | 1436 3
1438 | 1437 3
1439 | 1438 6
1440 | 1439 2
1441 | 1440 2
1442 | 1441 2
1443 | 1442 2
1444 | 1443 5
1445 | 1444 5
1446 | 1445 5
1447 | 1446 4
1448 | 1447 4
1449 | 1448 4
1450 | 1449 4
1451 | 1450 4
1452 | 1451 0
1453 | 1452 0
1454 | 1453 0
1455 | 1454 0
1456 | 1455 5
1457 | 1456 0
1458 | 1457 0
1459 | 1458 0
1460 | 1459 0
1461 | 1460 0
1462 | 1461 0
1463 | 1462 0
1464 | 1463 4
1465 | 1464 6
1466 | 1465 6
1467 | 1466 6
1468 | 1467 6
1469 | 1468 6
1470 | 1469 6
1471 | 1470 6
1472 | 1471 6
1473 | 1472 6
1474 | 1473 6
1475 | 1474 6
1476 | 1475 6
1477 | 1476 6
1478 | 1477 6
1479 | 1478 6
1480 | 1479 6
1481 | 1480 6
1482 | 1481 6
1483 | 1482 6
1484 | 1483 4
1485 | 1484 6
1486 | 1485 6
1487 | 1486 6
1488 | 1487 6
1489 | 1488 6
1490 | 1489 0
1491 | 1490 0
1492 | 1491 4
1493 | 1492 0
1494 | 1493 4
1495 | 1494 4
1496 | 1495 4
1497 | 1496 1
1498 | 1497 1
1499 | 1498 3
1500 | 1499 3
1501 | 1500 3
1502 | 1501 3
1503 | 1502 3
1504 | 1503 3
1505 | 1504 4
1506 | 1505 5
1507 | 1506 5
1508 | 1507 5
1509 | 1508 5
1510 | 1509 5
1511 | 1510 5
1512 | 1511 5
1513 | 1512 5
1514 | 1513 5
1515 | 1514 6
1516 | 1515 6
1517 | 1516 6
1518 | 1517 4
1519 | 1518 4
1520 | 1519 6
1521 | 1520 6
1522 | 1521 4
1523 | 1522 0
1524 | 1523 0
1525 | 1524 0
1526 | 1525 6
1527 | 1526 6
1528 | 1527 6
1529 | 1528 6
1530 | 1529 4
1531 | 1530 6
1532 | 1531 6
1533 | 1532 3
1534 | 1533 3
1535 | 1534 3
1536 | 1535 3
1537 | 1536 3
1538 | 1537 3
1539 | 1538 3
1540 | 1539 3
1541 | 1540 4
1542 | 1541 3
1543 | 1542 3
1544 | 1543 0
1545 | 1544 1
1546 | 1545 4
1547 | 1546 4
1548 | 1547 3
1549 | 1548 4
1550 | 1549 3
1551 | 1550 0
1552 | 1551 0
1553 | 1552 0
1554 | 1553 3
1555 | 1554 3
1556 | 1555 3
1557 | 1556 2
1558 | 1557 4
1559 | 1558 0
1560 | 1559 0
1561 | 1560 0
1562 | 1561 0
1563 | 1562 0
1564 | 1563 0
1565 | 1564 0
1566 | 1565 0
1567 | 1566 0
1568 | 1567 4
1569 | 1568 0
1570 | 1569 0
1571 | 1570 3
1572 | 1571 3
1573 | 1572 0
1574 | 1573 0
1575 | 1574 0
1576 | 1575 0
1577 | 1576 3
1578 | 1577 0
1579 | 1578 0
1580 | 1579 5
1581 | 1580 0
1582 | 1581 0
1583 | 1582 3
1584 | 1583 4
1585 | 1584 6
1586 | 1585 0
1587 | 1586 0
1588 | 1587 0
1589 | 1588 0
1590 | 1589 4
1591 | 1590 4
1592 | 1591 4
1593 | 1592 4
1594 | 1593 4
1595 | 1594 4
1596 | 1595 4
1597 | 1596 0
1598 | 1597 5
1599 | 1598 0
1600 | 1599 0
1601 | 1600 0
1602 | 1601 0
1603 | 1602 1
1604 | 1603 0
1605 | 1604 0
1606 | 1605 0
1607 | 1606 0
1608 | 1607 2
1609 | 1608 2
1610 | 1609 0
1611 | 1610 0
1612 | 1611 4
1613 | 1612 4
1614 | 1613 2
1615 | 1614 3
1616 | 1615 2
1617 | 1616 6
1618 | 1617 6
1619 | 1618 2
1620 | 1619 0
1621 | 1620 5
1622 | 1621 5
1623 | 1622 2
1624 | 1623 2
1625 | 1624 2
1626 | 1625 2
1627 | 1626 2
1628 | 1627 2
1629 | 1628 2
1630 | 1629 2
1631 | 1630 2
1632 | 1631 2
1633 | 1632 2
1634 | 1633 4
1635 | 1634 0
1636 | 1635 2
1637 | 1636 2
1638 | 1637 4
1639 | 1638 4
1640 | 1639 4
1641 | 1640 5
1642 | 1641 5
1643 | 1642 3
1644 | 1643 3
1645 | 1644 3
1646 | 1645 6
1647 | 1646 0
1648 | 1647 0
1649 | 1648 0
1650 | 1649 0
1651 | 1650 0
1652 | 1651 0
1653 | 1652 0
1654 | 1653 3
1655 | 1654 0
1656 | 1655 6
1657 | 1656 0
1658 | 1657 4
1659 | 1658 4
1660 | 1659 0
1661 | 1660 0
1662 | 1661 0
1663 | 1662 0
1664 | 1663 0
1665 | 1664 4
1666 | 1665 0
1667 | 1666 0
1668 | 1667 0
1669 | 1668 0
1670 | 1669 0
1671 | 1670 0
1672 | 1671 3
1673 | 1672 3
1674 | 1673 3
1675 | 1674 3
1676 | 1675 3
1677 | 1676 3
1678 | 1677 3
1679 | 1678 3
1680 | 1679 3
1681 | 1680 3
1682 | 1681 3
1683 | 1682 0
1684 | 1683 0
1685 | 1684 3
1686 | 1685 4
1687 | 1686 3
1688 | 1687 0
1689 | 1688 0
1690 | 1689 0
1691 | 1690 0
1692 | 1691 0
1693 | 1692 5
1694 | 1693 0
1695 | 1694 4
1696 | 1695 4
1697 | 1696 4
1698 | 1697 0
1699 | 1698 3
1700 | 1699 3
1701 | 1700 6
1702 | 1701 4
1703 | 1702 4
1704 | 1703 0
1705 | 1704 0
1706 | 1705 0
1707 | 1706 0
1708 | 1707 0
1709 | 1708 0
1710 | 1709 0
1711 | 1710 5
1712 | 1711 5
1713 | 1712 0
1714 | 1713 0
1715 | 1714 0
1716 | 1715 0
1717 | 1716 0
1718 | 1717 0
1719 | 1718 0
1720 | 1719 0
1721 | 1720 0
1722 | 1721 0
1723 | 1722 0
1724 | 1723 0
1725 | 1724 0
1726 | 1725 5
1727 | 1726 5
1728 | 1727 0
1729 | 1728 0
1730 | 1729 0
1731 | 1730 6
1732 | 1731 6
1733 | 1732 3
1734 | 1733 3
1735 | 1734 3
1736 | 1735 3
1737 | 1736 5
1738 | 1737 0
1739 | 1738 0
1740 | 1739 0
1741 | 1740 2
1742 | 1741 2
1743 | 1742 4
1744 | 1743 3
1745 | 1744 3
1746 | 1745 3
1747 | 1746 0
1748 | 1747 3
1749 | 1748 3
1750 | 1749 4
1751 | 1750 4
1752 | 1751 3
1753 | 1752 3
1754 | 1753 3
1755 | 1754 3
1756 | 1755 3
1757 | 1756 3
1758 | 1757 3
1759 | 1758 6
1760 | 1759 4
1761 | 1760 4
1762 | 1761 3
1763 | 1762 5
1764 | 1763 5
1765 | 1764 5
1766 | 1765 5
1767 | 1766 5
1768 | 1767 5
1769 | 1768 5
1770 | 1769 5
1771 | 1770 5
1772 | 1771 3
1773 | 1772 3
1774 | 1773 3
1775 | 1774 3
1776 | 1775 3
1777 | 1776 3
1778 | 1777 6
1779 | 1778 1
1780 | 1779 1
1781 | 1780 1
1782 | 1781 1
1783 | 1782 1
1784 | 1783 1
1785 | 1784 1
1786 | 1785 1
1787 | 1786 1
1788 | 1787 0
1789 | 1788 0
1790 | 1789 3
1791 | 1790 3
1792 | 1791 3
1793 | 1792 3
1794 | 1793 3
1795 | 1794 3
1796 | 1795 3
1797 | 1796 3
1798 | 1797 3
1799 | 1798 3
1800 | 1799 0
1801 | 1800 2
1802 | 1801 0
1803 | 1802 4
1804 | 1803 3
1805 | 1804 0
1806 | 1805 0
1807 | 1806 6
1808 | 1807 6
1809 | 1808 4
1810 | 1809 5
1811 | 1810 5
1812 | 1811 5
1813 | 1812 0
1814 | 1813 0
1815 | 1814 0
1816 | 1815 4
1817 | 1816 4
1818 | 1817 5
1819 | 1818 0
1820 | 1819 0
1821 | 1820 0
1822 | 1821 1
1823 | 1822 1
1824 | 1823 5
1825 | 1824 0
1826 | 1825 0
1827 | 1826 5
1828 | 1827 3
1829 | 1828 3
1830 | 1829 4
1831 | 1830 0
1832 | 1831 4
1833 | 1832 4
1834 | 1833 0
1835 | 1834 0
1836 | 1835 2
1837 | 1836 0
1838 | 1837 5
1839 | 1838 5
1840 | 1839 1
1841 | 1840 4
1842 | 1841 4
1843 | 1842 1
1844 | 1843 4
1845 | 1844 4
1846 | 1845 4
1847 | 1846 4
1848 | 1847 0
1849 | 1848 0
1850 | 1849 0
1851 | 1850 6
1852 | 1851 6
1853 | 1852 0
1854 | 1853 2
1855 | 1854 2
1856 | 1855 5
1857 | 1856 5
1858 | 1857 5
1859 | 1858 5
1860 | 1859 0
1861 | 1860 0
1862 | 1861 3
1863 | 1862 0
1864 | 1863 0
1865 | 1864 0
1866 | 1865 0
1867 | 1866 3
1868 | 1867 3
1869 | 1868 3
1870 | 1869 3
1871 | 1870 0
1872 | 1871 0
1873 | 1872 3
1874 | 1873 4
1875 | 1874 5
1876 | 1875 5
1877 | 1876 5
1878 | 1877 5
1879 | 1878 5
1880 | 1879 5
1881 | 1880 5
1882 | 1881 5
1883 | 1882 5
1884 | 1883 5
1885 | 1884 5
1886 | 1885 5
1887 | 1886 5
1888 | 1887 5
1889 | 1888 5
1890 | 1889 5
1891 | 1890 5
1892 | 1891 5
1893 | 1892 5
1894 | 1893 4
1895 | 1894 6
1896 | 1895 6
1897 | 1896 5
1898 | 1897 6
1899 | 1898 6
1900 | 1899 2
1901 | 1900 5
1902 | 1901 5
1903 | 1902 4
1904 | 1903 0
1905 | 1904 6
1906 | 1905 5
1907 | 1906 5
1908 | 1907 5
1909 | 1908 0
1910 | 1909 0
1911 | 1910 5
1912 | 1911 5
1913 | 1912 5
1914 | 1913 4
1915 | 1914 5
1916 | 1915 0
1917 | 1916 0
1918 | 1917 5
1919 | 1918 5
1920 | 1919 1
1921 | 1920 1
1922 | 1921 1
1923 | 1922 5
1924 | 1923 0
1925 | 1924 5
1926 | 1925 5
1927 | 1926 5
1928 | 1927 5
1929 | 1928 0
1930 | 1929 6
1931 | 1930 3
1932 | 1931 0
1933 | 1932 0
1934 | 1933 1
1935 | 1934 6
1936 | 1935 1
1937 | 1936 6
1938 | 1937 4
1939 | 1938 3
1940 | 1939 3
1941 | 1940 3
1942 | 1941 3
1943 | 1942 3
1944 | 1943 3
1945 | 1944 3
1946 | 1945 3
1947 | 1946 4
1948 | 1947 5
1949 | 1948 5
1950 | 1949 1
1951 | 1950 1
1952 | 1951 4
1953 | 1952 0
1954 | 1953 4
1955 | 1954 4
1956 | 1955 4
1957 | 1956 4
1958 | 1957 0
1959 | 1958 0
1960 | 1959 5
1961 | 1960 5
1962 | 1961 2
1963 | 1962 0
1964 | 1963 4
1965 | 1964 3
1966 | 1965 3
1967 | 1966 3
1968 | 1967 3
1969 | 1968 6
1970 | 1969 6
1971 | 1970 0
1972 | 1971 0
1973 | 1972 0
1974 | 1973 0
1975 | 1974 6
1976 | 1975 2
1977 | 1976 5
1978 | 1977 5
1979 | 1978 0
1980 | 1979 3
1981 | 1980 0
1982 | 1981 0
1983 | 1982 0
1984 | 1983 2
1985 | 1984 1
1986 | 1985 3
1987 | 1986 3
1988 | 1987 3
1989 | 1988 3
1990 | 1989 3
1991 | 1990 6
1992 | 1991 6
1993 | 1992 0
1994 | 1993 3
1995 | 1994 3
1996 | 1995 0
1997 | 1996 6
1998 | 1997 6
1999 | 1998 6
2000 | 1999 0
2001 | 2000 0
2002 | 2001 0
2003 | 2002 2
2004 | 2003 2
2005 | 2004 1
2006 | 2005 1
2007 | 2006 0
2008 | 2007 0
2009 | 2008 4
2010 | 2009 4
2011 | 2010 3
2012 | 2011 3
2013 | 2012 3
2014 | 2013 3
2015 | 2014 0
2016 | 2015 5
2017 | 2016 0
2018 | 2017 0
2019 | 2018 0
2020 | 2019 3
2021 | 2020 3
2022 | 2021 3
2023 | 2022 0
2024 | 2023 0
2025 | 2024 6
2026 | 2025 6
2027 | 2026 5
2028 | 2027 5
2029 | 2028 5
2030 | 2029 5
2031 | 2030 5
2032 | 2031 0
2033 | 2032 0
2034 | 2033 4
2035 | 2034 5
2036 | 2035 4
2037 | 2036 0
2038 | 2037 0
2039 | 2038 5
2040 | 2039 0
2041 | 2040 4
2042 | 2041 2
2043 | 2042 0
2044 | 2043 3
2045 | 2044 6
2046 | 2045 6
2047 | 2046 3
2048 | 2047 0
2049 | 2048 4
2050 | 2049 4
2051 | 2050 3
2052 | 2051 3
2053 | 2052 3
2054 | 2053 4
2055 | 2054 0
2056 | 2055 0
2057 | 2056 0
2058 | 2057 0
2059 | 2058 0
2060 | 2059 4
2061 | 2060 0
2062 | 2061 5
2063 | 2062 5
2064 | 2063 5
2065 | 2064 6
2066 | 2065 2
2067 | 2066 2
2068 | 2067 2
2069 | 2068 6
2070 | 2069 6
2071 | 2070 5
2072 | 2071 5
2073 | 2072 5
2074 | 2073 0
2075 | 2074 0
2076 | 2075 6
2077 | 2076 6
2078 | 2077 4
2079 | 2078 5
2080 | 2079 0
2081 | 2080 0
2082 | 2081 5
2083 | 2082 5
2084 | 2083 6
2085 | 2084 6
2086 | 2085 3
2087 | 2086 3
2088 | 2087 3
2089 | 2088 0
2090 | 2089 0
2091 | 2090 3
2092 | 2091 0
2093 | 2092 5
2094 | 2093 5
2095 | 2094 3
2096 | 2095 3
2097 | 2096 3
2098 | 2097 3
2099 | 2098 3
2100 | 2099 5
2101 | 2100 6
2102 | 2101 6
2103 | 2102 6
2104 | 2103 4
2105 | 2104 5
2106 | 2105 6
2107 | 2106 6
2108 | 2107 0
2109 | 2108 0
2110 | 2109 2
2111 | 2110 0
2112 | 2111 4
2113 | 2112 5
2114 | 2113 0
2115 | 2114 0
2116 | 2115 0
2117 | 2116 0
2118 | 2117 0
2119 | 2118 3
2120 | 2119 3
2121 | 2120 3
2122 | 2121 3
2123 | 2122 3
2124 | 2123 3
2125 | 2124 3
2126 | 2125 3
2127 | 2126 0
2128 | 2127 0
2129 | 2128 0
2130 | 2129 4
2131 | 2130 4
2132 | 2131 6
2133 | 2132 6
2134 | 2133 0
2135 | 2134 0
2136 | 2135 0
2137 | 2136 0
2138 | 2137 4
2139 | 2138 0
2140 | 2139 0
2141 | 2140 2
2142 | 2141 5
2143 | 2142 5
2144 | 2143 5
2145 | 2144 5
2146 | 2145 3
2147 | 2146 3
2148 | 2147 3
2149 | 2148 0
2150 | 2149 0
2151 | 2150 6
2152 | 2151 4
2153 | 2152 1
2154 | 2153 1
2155 | 2154 4
2156 | 2155 3
2157 | 2156 3
2158 | 2157 0
2159 | 2158 0
2160 | 2159 4
2161 | 2160 0
2162 | 2161 0
2163 | 2162 5
2164 | 2163 5
2165 | 2164 0
2166 | 2165 6
2167 | 2166 3
2168 | 2167 5
2169 | 2168 0
2170 | 2169 5
2171 | 2170 5
2172 | 2171 5
2173 | 2172 2
2174 | 2173 0
2175 | 2174 0
2176 | 2175 5
2177 | 2176 5
2178 | 2177 0
2179 | 2178 4
2180 | 2179 5
2181 | 2180 3
2182 | 2181 3
2183 | 2182 3
2184 | 2183 3
2185 | 2184 3
2186 | 2185 3
2187 | 2186 3
2188 | 2187 3
2189 | 2188 0
2190 | 2189 4
2191 | 2190 1
2192 | 2191 1
2193 | 2192 3
2194 | 2193 0
2195 | 2194 0
2196 | 2195 0
2197 | 2196 1
2198 | 2197 1
2199 | 2198 0
2200 | 2199 0
2201 | 2200 0
2202 | 2201 0
2203 | 2202 0
2204 | 2203 0
2205 | 2204 0
2206 | 2205 0
2207 | 2206 4
2208 | 2207 6
2209 | 2208 4
2210 | 2209 2
2211 | 2210 0
2212 | 2211 6
2213 | 2212 0
2214 | 2213 4
2215 | 2214 4
2216 | 2215 5
2217 | 2216 0
2218 | 2217 0
2219 | 2218 4
2220 | 2219 6
2221 | 2220 5
2222 | 2221 5
2223 | 2222 5
2224 | 2223 0
2225 | 2224 0
2226 | 2225 0
2227 | 2226 0
2228 | 2227 0
2229 | 2228 2
2230 | 2229 5
2231 | 2230 5
2232 | 2231 3
2233 | 2232 3
2234 | 2233 3
2235 | 2234 3
2236 | 2235 3
2237 | 2236 3
2238 | 2237 0
2239 | 2238 0
2240 | 2239 0
2241 | 2240 0
2242 | 2241 0
2243 | 2242 0
2244 | 2243 0
2245 | 2244 0
2246 | 2245 0
2247 | 2246 2
2248 | 2247 5
2249 | 2248 6
2250 | 2249 0
2251 | 2250 0
2252 | 2251 3
2253 | 2252 3
2254 | 2253 3
2255 | 2254 0
2256 | 2255 0
2257 | 2256 0
2258 | 2257 0
2259 | 2258 0
2260 | 2259 0
2261 | 2260 0
2262 | 2261 0
2263 | 2262 5
2264 | 2263 5
2265 | 2264 5
2266 | 2265 0
2267 | 2266 1
2268 | 2267 4
2269 | 2268 4
2270 | 2269 2
2271 | 2270 0
2272 | 2271 6
2273 | 2272 5
2274 | 2273 0
2275 | 2274 0
2276 | 2275 5
2277 | 2276 0
2278 | 2277 0
2279 | 2278 0
2280 | 2279 0
2281 | 2280 0
2282 | 2281 0
2283 | 2282 0
2284 | 2283 0
2285 | 2284 4
2286 | 2285 0
2287 | 2286 0
2288 | 2287 2
2289 | 2288 5
2290 | 2289 5
2291 | 2290 6
2292 | 2291 3
2293 | 2292 0
2294 | 2293 0
2295 | 2294 0
2296 | 2295 6
2297 | 2296 0
2298 | 2297 5
2299 | 2298 5
2300 | 2299 5
2301 | 2300 5
2302 | 2301 1
2303 | 2302 0
2304 | 2303 0
2305 | 2304 0
2306 | 2305 0
2307 | 2306 0
2308 | 2307 4
2309 | 2308 0
2310 | 2309 0
2311 | 2310 0
2312 | 2311 0
2313 | 2312 5
2314 | 2313 5
2315 | 2314 0
2316 | 2315 5
2317 | 2316 5
2318 | 2317 3
2319 | 2318 5
2320 | 2319 5
2321 | 2320 5
2322 | 2321 5
2323 | 2322 3
2324 | 2323 1
2325 | 2324 2
2326 | 2325 0
2327 | 2326 0
2328 | 2327 6
2329 | 2328 3
2330 | 2329 3
2331 | 2330 4
2332 | 2331 0
2333 | 2332 5
2334 | 2333 6
2335 | 2334 0
2336 | 2335 3
2337 | 2336 0
2338 | 2337 4
2339 | 2338 0
2340 | 2339 3
2341 | 2340 3
2342 | 2341 1
2343 | 2342 5
2344 | 2343 0
2345 | 2344 5
2346 | 2345 5
2347 | 2346 5
2348 | 2347 5
2349 | 2348 5
2350 | 2349 5
2351 | 2350 5
2352 | 2351 3
2353 | 2352 1
2354 | 2353 6
2355 | 2354 6
2356 | 2355 3
2357 | 2356 3
2358 | 2357 3
2359 | 2358 0
2360 | 2359 3
2361 | 2360 3
2362 | 2361 3
2363 | 2362 3
2364 | 2363 0
2365 | 2364 0
2366 | 2365 3
2367 | 2366 3
2368 | 2367 3
2369 | 2368 1
2370 | 2369 0
2371 | 2370 4
2372 | 2371 4
2373 | 2372 5
2374 | 2373 5
2375 | 2374 5
2376 | 2375 5
2377 | 2376 5
2378 | 2377 0
2379 | 2378 5
2380 | 2379 5
2381 | 2380 5
2382 | 2381 5
2383 | 2382 5
2384 | 2383 5
2385 | 2384 5
2386 | 2385 4
2387 | 2386 0
2388 | 2387 5
2389 | 2388 5
2390 | 2389 5
2391 | 2390 5
2392 | 2391 5
2393 | 2392 5
2394 | 2393 5
2395 | 2394 5
2396 | 2395 5
2397 | 2396 5
2398 | 2397 5
2399 | 2398 5
2400 | 2399 5
2401 | 2400 5
2402 | 2401 0
2403 | 2402 0
2404 | 2403 3
2405 | 2404 0
2406 | 2405 0
2407 | 2406 3
2408 | 2407 3
2409 | 2408 3
2410 | 2409 3
2411 | 2410 3
2412 | 2411 3
2413 | 2412 3
2414 | 2413 3
2415 | 2414 3
2416 | 2415 3
2417 | 2416 3
2418 | 2417 3
2419 | 2418 3
2420 | 2419 3
2421 | 2420 3
2422 | 2421 3
2423 | 2422 3
2424 | 2423 3
2425 | 2424 3
2426 | 2425 3
2427 | 2426 3
2428 | 2427 3
2429 | 2428 3
2430 | 2429 3
2431 | 2430 3
2432 | 2431 3
2433 | 2432 3
2434 | 2433 0
2435 | 2434 0
2436 | 2435 3
2437 | 2436 3
2438 | 2437 3
2439 | 2438 3
2440 | 2439 3
2441 | 2440 3
2442 | 2441 3
2443 | 2442 3
2444 | 2443 3
2445 | 2444 3
2446 | 2445 3
2447 | 2446 5
2448 | 2447 5
2449 | 2448 5
2450 | 2449 5
2451 | 2450 5
2452 | 2451 5
2453 | 2452 6
2454 | 2453 0
2455 | 2454 0
2456 | 2455 0
2457 | 2456 0
2458 | 2457 0
2459 | 2458 0
2460 | 2459 3
2461 | 2460 3
2462 | 2461 0
2463 | 2462 0
2464 | 2463 0
2465 | 2464 0
2466 | 2465 0
2467 | 2466 0
2468 | 2467 0
2469 | 2468 0
2470 | 2469 4
2471 | 2470 4
2472 | 2471 4
2473 | 2472 4
2474 | 2473 4
2475 | 2474 0
2476 | 2475 0
2477 | 2476 0
2478 | 2477 0
2479 | 2478 0
2480 | 2479 0
2481 | 2480 0
2482 | 2481 0
2483 | 2482 0
2484 | 2483 0
2485 | 2484 0
2486 |
--------------------------------------------------------------------------------
/DELP/main.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | from scipy import sparse as sp
3 | import argparse
4 | import cPickle
5 | import numpy as np
6 | import os
7 | import sys
8 | import time
9 | import tensorflow as tf
10 | from model import Model
11 | from utils import data
12 |
13 | parser = argparse.ArgumentParser()
14 | parser.add_argument(
15 | '--dataset', help='the description of dataset', type=str, default='cora')
16 | parser.add_argument('--attr_filename', help='the attribute filename of the dataset',
17 | type=str, default='graph/cora.feature')
18 | parser.add_argument('--label_filename', help='the label filename of the dataset',
19 | type=str, default='graph/cora.label')
20 | parser.add_argument('--edge_filename', help='the edge filename of the dataset',
21 | type=str, default='graph/cora.edgelist')
22 | parser.add_argument('--labels_num', help='number of labels',
23 | type=int, default=7)
24 | parser.add_argument('--em_learning_rate',
25 | help='learning rate for embedding loss', type=float, default=1e-2)
26 | parser.add_argument('--lp_learning_rate',
27 | help='learning rate for label propagation loss', type=float, default=1e-2)
28 | parser.add_argument(
29 | '--ratio', help='ratio of labeled nodes for label propagation', type=float, default=0.05)
30 | parser.add_argument('--embedding_size',
31 | help='embedding dimensions', type=int, default=128)
32 | parser.add_argument(
33 | '--window_size', help='window size in random walk sequences', type=int, default=5)
34 | parser.add_argument(
35 | '--path_size', help='length of random walk sequences', type=int, default=80)
36 | parser.add_argument('--graph_context_batch_size',
37 | help='batch size for graph context loss', type=int, default=64)
38 | # see the comments below.
39 | parser.add_argument('--label_context_batch_size',
40 | help='batch size for label context loss', type=int, default=256)
41 | parser.add_argument(
42 | '--neg_samp', help='negative sampling rate', type=int, default=6)
43 | # reduce this number if you just want to get some quick results. Increasing this number usually leads to better results but takes more time.
44 | parser.add_argument(
45 | '--max_iter', help='max iterations of training', type=int, default=100)
46 | parser.add_argument(
47 | '--mu', help='hyper-parameter for label propagation', type=float, default=10)
48 | parser.add_argument(
49 | '--keep_prob', help='keep probability for dropout', type=float, default=1.0)
50 | # In particular, $\alpha = (label_context_batch_size)/(graph_context_batch_size + label_context_batch_size)$.
51 | # A larger value is assigned to graph_context_batch_size if graph structure is more informative than label information.
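# Worked example (added for clarity): with the defaults above, graph_context_batch_size = 64 and
# label_context_batch_size = 256, so alpha = 256 / (64 + 256) = 0.8.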
52 |
53 |
54 | args = parser.parse_args()
55 | print args
56 |
57 | data = data(args)
58 |
59 | print_every_k_iters = 1
60 | start_time = time.time()
61 |
62 | with tf.Session() as session:
63 | model = Model(args, data, session)
64 | iter_cnt = 0
65 | augument_size = int(len(model.label_x))
66 | while True:
67 | iter_cnt += 1
68 | curr_loss_label_propagation, curr_loss_u_1, curr_loss_u_2 = (
69 | 0.0, 0.0, 0.0)
70 |
71 | average_loss_u_1 = 0
72 | average_loss_u_2 = 0
73 | for i in range(200):
74 | # unsupervised training using network context.
75 | curr_loss_u_1 = model.unsupervised_train_step(
76 | is_graph_context=True)
77 | # unsupervised training using label context.
78 | curr_loss_u_2 = model.unsupervised_train_step(
79 | is_label_context=True)
80 | average_loss_u_1 += curr_loss_u_1
81 | average_loss_u_2 += curr_loss_u_2
82 |
83 | average_loss_u_1 /= 200.0
84 | average_loss_u_2 /= 200.0
85 |
86 | S = model.calc_similarity()
87 |
88 | curr_loss_label_propagation = model.label_propagation_train_step(
89 | S, 100)
90 | entropy_loss_vec = model.calc_entropy_loss()
91 | entropy_loss_vec.sort()
92 |
93 |         chosen_ranking_loss = entropy_loss_vec[int(augument_size)]
94 | if augument_size < model.vertices_num/2 and iter_cnt >= 10:
95 |             model.session.run(model.lambd.assign(chosen_ranking_loss))
96 | augument_size += model.vertices_num/(args.max_iter * 1.0)
97 |
98 | if iter_cnt != 1:
99 | V_pre = model.session.run(model.V)
100 |
101 | # Update indicator vector via closed form solution
102 | model.indicator_vector_train_step(S)
103 |
104 | # Augument label context nodes
105 | V, F = model.session.run([model.V, model.F])
106 |
107 | if np.sum(V) != 0:
108 | nonzero_idx = np.nonzero(V)[0]
109 | for i in nonzero_idx:
110 | if model.augumentDict.has_key(i):
111 | continue
112 | augument_label = np.argmax(F[i])
113 | model.label2idx[augument_label].append(i)
114 | for j in range(model.labels_num):
115 |                     if j != augument_label:
116 | model.not_label2idx[j].append(i)
117 | model.augumentDict[i] = 1
118 | model.augument_Label_x.append(i)
119 | model.augument_Label_y.append(augument_label)
120 |
121 | if iter_cnt % print_every_k_iters == 0: # for printing info.
122 | curr_loss = average_loss_u_1 + average_loss_u_2 + curr_loss_label_propagation
123 | print "iter = %d, loss = %f, time = %d s" % (iter_cnt, curr_loss, int(time.time()-start_time))
124 | print "embedding loss: " + str(average_loss_u_1 + average_loss_u_2)
125 | print "label propagation loss: " + str(curr_loss_label_propagation)
126 | model.evaluation_label_propagation()
127 |
128 | if iter_cnt == args.max_iter:
129 | acc = model.evaluation_label_propagation()
130 |             print "Reached the maximum number of iterations; stopping training."
131 | break
132 |
133 | model.store_useful_information()
134 |
135 | print "The final accuracy is: " + str(acc)
136 |
--------------------------------------------------------------------------------
/DELP/model.py:
--------------------------------------------------------------------------------
1 | from collections import defaultdict
2 | from scipy import sparse as sp
3 | from scipy.sparse import csr_matrix
4 | from scipy.sparse import vstack
5 | from sklearn.metrics import *
6 | from sklearn.preprocessing import *
7 | from sklearn.metrics.pairwise import *
8 | import argparse
9 | import copy
10 | import cPickle as cpkl
11 | import numpy as np
12 | import os
13 | import sys
14 | import tensorflow as tf
15 | from tensorflow.python import py_func
16 | import warnings
17 |
18 |
19 | class Model(object):
20 | def __init__(self, args, data, session):
21 | self.embedding_size = args.embedding_size
22 | self.em_learning_rate = args.em_learning_rate
23 | self.lp_learning_rate = args.lp_learning_rate
24 | self.neg_samp = args.neg_samp
25 | self.mu = args.mu
26 | self.keep_probability = args.keep_prob
27 |
28 | self.window_size = args.window_size
29 | self.path_size = args.path_size
30 |
31 | self.graph_context_batch_size = args.graph_context_batch_size # rnd walk
32 | self.label_context_batch_size = args.label_context_batch_size # label proximity
33 |
34 | self.data = data
35 | self.dataset = data.dataset
36 | self.all_x = data.all_x
37 | self.all_y = data.all_y
38 | self.graph = data.graph
39 | self.adj_matrix = data.adj_matrix
40 | self.labelDict = data.labelDict
41 |
42 | self.vertices_num = data.all_x.shape[0]
43 | self.attributes_num = data.all_x.shape[1]
44 | self.labels_num = args.labels_num
45 | self.ratio = args.ratio
46 |
47 | self.model_dic = dict()
48 |
49 | self.partial_label()
50 |
51 | self.unsupervised_label_contx_generator = self.unsupervised_label_context_iter()
52 | self.unsupervised_graph_contx_generator = self.unsupervised_graph_context_iter()
53 | self.session = session
54 | self.build_tf_graph()
55 |
56 | def partial_label(self):
57 | # fetch partial labels for training label propagation
58 | N = self.vertices_num
59 | K = self.labels_num
60 | ratio = self.ratio
61 | num = int(N * ratio)/K
62 | self.label_x = []
63 | self.label_y = []
64 | for k, v in self.labelDict.items():
65 | values = self.labelDict[k]
66 | np.random.shuffle(values)
67 | if len(values) <= num:
68 | for value in values:
69 | self.label_x.append(value)
70 | self.label_y.append(k)
71 | else:
72 | for i in range(num):
73 | self.label_x.append(values[i])
74 | self.label_y.append(k)
75 |
76 | def build_tf_graph(self):
77 | """Create the TensorFlow graph. """
78 | input_attr = tf.placeholder(
79 | tf.float32, shape=[None, self.attributes_num], name="input_attr")
80 | graph_context = tf.placeholder(
81 | tf.int32, shape=[None], name="graph_context")
82 | pos_or_neg = tf.placeholder(
83 | tf.float32, shape=[None], name="pos_or_neg")
84 |         epsilon = tf.constant(1e-6)
85 | keep_prob = tf.constant(self.keep_probability)
86 | input_attr_keep = tf.contrib.layers.dropout(input_attr, keep_prob)
87 |
88 | W_1 = tf.get_variable("W_1", shape=(self.attributes_num, 1000),
89 | initializer=tf.contrib.layers.xavier_initializer())
90 | b_1 = tf.Variable(tf.random_uniform([1000], -1.0, 1.0))
91 | hidden1_layer = tf.nn.softsign(tf.matmul(input_attr_keep, W_1) + b_1)
92 | hidden1_layer_keep = tf.contrib.layers.dropout(
93 | hidden1_layer, keep_prob)
94 |
95 | # can add more layers here
96 | # W_2 layer
97 | W_2 = tf.get_variable("W_2", shape=(1000, 500),
98 | initializer=tf.contrib.layers.xavier_initializer())
99 | b_2 = tf.Variable(tf.random_uniform([500], -1.0, 1.0))
100 | hidden2_layer = tf.nn.softsign(
101 | tf.matmul(hidden1_layer_keep, W_2) + b_2)
102 | hidden2_layer_keep = tf.contrib.layers.dropout(
103 | hidden2_layer, keep_prob)
104 |
105 | # W_3 layer
106 | W_3 = tf.get_variable("W_3", shape=(
107 | 500, self.embedding_size), initializer=tf.contrib.layers.xavier_initializer())
108 | b_3 = tf.Variable(tf.random_uniform([self.embedding_size], -1.0, 1.0))
109 | embed_layer = tf.nn.softsign(tf.matmul(hidden2_layer_keep, W_3) + b_3)
110 |
111 | out_embed_layer = tf.get_variable("Out_embed", shape=(self.vertices_num, self.embedding_size),
112 | initializer=tf.contrib.layers.xavier_initializer())
113 | out_embed_vecs = tf.nn.embedding_lookup(
114 | out_embed_layer, graph_context) # lookup from output
115 |
116 | loss_regularizer = tf.nn.l2_loss(W_2) + tf.nn.l2_loss(W_3)
117 | loss_regularizer += tf.nn.l2_loss(b_2) + tf.nn.l2_loss(b_3)
118 |
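# Skip-gram loss with negative sampling: pos_or_neg is +1 for observed (node, context) pairs and
# -1 for sampled negatives, so the loss maximizes sigmoid(<embed, context>) for positives and
# sigmoid(-<embed, context>) for negatives, plus an L2 penalty on the encoder weights.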
119 | ele_wise_prod = tf.multiply(embed_layer, out_embed_vecs)
120 | loss_unsupervised = - \
121 | tf.reduce_sum(tf.log(tf.sigmoid(tf.reduce_sum(
122 | ele_wise_prod, axis=1) * pos_or_neg) + episilon)) + 1.0 * loss_regularizer
123 |
124 | optimizer_unsupervised = tf.train.GradientDescentOptimizer(
125 | learning_rate=self.em_learning_rate).minimize(loss_unsupervised)
126 |
127 | self.model_dic['input'] = input_attr
128 | self.model_dic['truth_context'] = graph_context
129 | self.model_dic['pos_or_neg'] = pos_or_neg
130 | self.model_dic['loss_u'] = loss_unsupervised
131 | self.model_dic['opt_u'] = optimizer_unsupervised
132 | self.model_dic['embeddings'] = embed_layer
133 |
134 | self.F = tf.Variable(
135 | tf.ones([self.vertices_num, self.labels_num], dtype=tf.float32)/self.labels_num)
136 | self.V = tf.Variable(tf.zeros([self.vertices_num], dtype=tf.float32))
137 | self.lambd = tf.Variable(0.0, dtype=tf.float32)
138 |
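# Label-propagation operators: S holds a dense node-similarity matrix, which is masked by the
# adjacency matrix and symmetrically normalized; L = I - D^{-1/2} (S * A) D^{-1/2} is the graph
# Laplacian used in the smoothness term below.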
139 | S = tf.placeholder(tf.float32, shape=[
140 | self.vertices_num, self.vertices_num])
141 | self.adj_matrix = tf.convert_to_tensor(
142 | self.adj_matrix, dtype=tf.float32)
143 | S_adj = tf.multiply(S, self.adj_matrix)
144 | D = tf.diag(
145 | tf.sqrt(1.0/(tf.reduce_sum(S_adj, axis=1) + 1e-6 * tf.ones(self.vertices_num))))
146 | S_norm = tf.matmul(tf.matmul(D, S_adj), D)
147 | L = tf.eye(self.vertices_num) - S_norm
148 |
149 | labels = self.gen_labeled_matrix()
150 | labeled_F = tf.nn.embedding_lookup(self.F, self.label_x)
151 | labeled_Y = tf.nn.embedding_lookup(labels, self.label_x)
152 | F_Y = labeled_F - labeled_Y
153 |
154 | smooth_loss = tf.trace(
155 | tf.matmul(tf.matmul(tf.transpose(self.F), L), self.F))
156 |
157 | fitness_loss = self.mu * (tf.trace(tf.matmul(F_Y, tf.transpose(F_Y))))
158 |
159 | entropy_loss = tf.reduce_sum(
160 | tf.keras.backend.categorical_crossentropy(self.F, self.F) * self.V)
161 |
162 | regularizer_loss = - self.lambd * tf.reduce_sum(self.V)
163 |
164 | loss_labelpropagation = smooth_loss + \
165 | fitness_loss + entropy_loss + regularizer_loss
166 | optimizer_labelpropagation = tf.train.GradientDescentOptimizer(
167 | learning_rate=self.lp_learning_rate)
168 |
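# Proximal update for the indicator vector V, expressed as a surrogate gradient: after the
# descent step each entry of V becomes 1 where its gradient is negative and 0 where it is non-negative.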
169 | def prox_V(grads, params, lr):
170 | neg_index = grads < 0
171 | pos_index = grads >= 0
172 | grads[neg_index] = (params[neg_index] - 1)/lr
173 | grads[pos_index] = (params[pos_index])/lr
174 |
175 | return grads
176 |
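# Proximal update for the label-distribution matrix F: the plain descent step params - lr * grads
# is projected row-wise onto the probability simplex (sorted cumulative-sum projection), and the
# result is re-encoded as a gradient so that apply_gradients lands exactly on the projected point.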
177 | def prox_F(grads, params, lr):
178 | params_new = params - lr * grads
179 | params_new_sort = - np.sort(- params_new, axis=1)
180 |
181 | tri = np.tri(params_new_sort.shape[1], params_new_sort.shape[1]).T
182 | rangeArr = np.array(range(1, params_new_sort.shape[1] + 1))
183 | params_new_sort_mult = (
184 | 1 - np.dot(params_new_sort, tri))/rangeArr + params_new_sort
185 |
186 | params_new_sort_mult[params_new_sort_mult > 0] = 1
187 | params_new_sort_mult[params_new_sort_mult <= 0] = 0
188 |
189 | choice_index = np.argmax(params_new_sort_mult * rangeArr, axis=1)
190 | arr1 = np.zeros([params_new_sort.shape[1],
191 | params_new_sort.shape[0]])
192 | arr1[choice_index, range(0, params_new_sort.shape[0])] = 1
193 | arr2 = 1 - arr1.cumsum(axis=0)
194 | arr2[choice_index, range(0, params_new_sort.shape[0])] = 1
195 | arr3 = np.dot(params_new_sort, arr2)
196 | arr4 = np.reshape((1 - arr3[range(0, params_new_sort.shape[0]), range(
197 | 0, params_new_sort.shape[0])])/(choice_index + 1), [-1, 1])
198 |
199 | idx = (params_new + arr4) > 0
200 | temp = np.tile(arr4, (1, params_new.shape[1]))
201 | grads[idx] = grads[idx] - temp[idx]/lr
202 | grads[(params_new + arr4) <= 0] = params[(params_new + arr4) <= 0]/lr
203 |
204 | return grads
205 |
206 | def tf_prox_F(grads, params, lr):
207 | return py_func(prox_F, [grads, params, lr], tf.float32)
208 |
209 | def tf_prox_V(grads, params, lr):
210 | return py_func(prox_V, [grads, params, lr], tf.float32)
211 |
212 | grad_V = optimizer_labelpropagation.compute_gradients(
213 | loss_labelpropagation, self.V)
214 | grad_F = optimizer_labelpropagation.compute_gradients(
215 | loss_labelpropagation, self.F)
216 | apply_f_op = optimizer_labelpropagation.apply_gradients(
217 | [(tf_prox_F(gv[0], gv[1], self.lp_learning_rate), gv[1]) for gv in grad_F])
218 | apply_v_op = optimizer_labelpropagation.apply_gradients(
219 | [(tf_prox_V(gv[0], gv[1], self.lp_learning_rate), gv[1]) for gv in grad_V])
220 |
221 | self.model_dic['similarity_matrix'] = S
222 | self.model_dic['loss_lp'] = loss_labelpropagation
223 | self.model_dic['opt_lp'] = optimizer_labelpropagation
224 | self.model_dic['apply_v_op'] = apply_v_op
225 | self.model_dic['apply_f_op'] = apply_f_op
226 |
227 | init = tf.global_variables_initializer()
228 | self.session.run(init)
229 |
230 | def unsupervised_train_step(self, is_graph_context=False, is_label_context=False):
231 | """Unsupervised training step."""
232 | if is_graph_context:
233 | batch_x, batch_gy, batch_pos_or_neg = next(
234 | self.unsupervised_graph_contx_generator)
235 | elif is_label_context:
236 | batch_x, batch_gy, batch_pos_or_neg = next(
237 | self.unsupervised_label_contx_generator)
238 | _, loss = self.session.run([self.model_dic['opt_u'],
239 | self.model_dic['loss_u']],
240 | feed_dict={self.model_dic['input']: batch_x,
241 | self.model_dic['truth_context']: batch_gy,
242 | self.model_dic['pos_or_neg']: batch_pos_or_neg})
243 | return loss
244 |
245 | def store_useful_information(self):
246 | """Store useful information, such as embeddings, outlier scores, after the model finish training."""
247 | embeddings = self.session.run(self.model_dic['embeddings'],
248 | feed_dict={self.model_dic['input']: self.all_x})
249 |
250 | cpkl.dump(embeddings, open(self.data.dataset + ".embed", "wb"))
251 |
252 | def unsupervised_label_context_iter(self):
253 | self.augumentDict = dict()
254 | self.augument_Label_x = list(np.copy(self.label_x))
255 | self.augument_Label_y = list(np.copy(self.label_y))
256 | self.label2idx, self.not_label2idx = defaultdict(
257 | list), defaultdict(list)
258 | for i in range(len(self.label_x)):
259 | self.augumentDict[self.label_x[i]] = 1
260 | label = self.label_y[i]
261 | self.label2idx[label].append(self.label_x[i])
262 | for j in range(self.labels_num):
263 | if j != label:
264 | self.not_label2idx[j].append(self.label_x[i])
265 |
266 | while True:
267 | context_pairs, pos_or_neg = [], []
268 | cnt = 0
269 | while cnt < self.label_context_batch_size:
270 | input_idx = np.random.randint(0, len(self.augument_Label_x))
271 | label = self.augument_Label_y[input_idx]
272 | if len(self.label2idx) == 1:
273 | continue
274 | target_idx = self.augument_Label_x[input_idx]
275 | context_idx = np.random.choice(self.label2idx[label])
276 | context_pairs.append([target_idx, context_idx])
277 | pos_or_neg.append(1.0)
278 | for _ in range(self.neg_samp):
279 | context_pairs.append(
280 | [target_idx, np.random.choice(self.not_label2idx[label])])
281 | pos_or_neg.append(-1.0)
282 | cnt += 1
283 | context_pairs = np.array(context_pairs, dtype=np.int32)
284 | pos_or_neg = np.array(pos_or_neg, dtype=np.float32)
285 | input_idx_var = context_pairs[:, 0]
286 | output_idx_var = context_pairs[:, 1]
287 | shuffle_idx = np.random.permutation(np.arange(len(input_idx_var)))
288 | input_idx_var = input_idx_var[shuffle_idx]
289 | output_idx_var = output_idx_var[shuffle_idx]
290 | pos_or_neg = pos_or_neg[shuffle_idx]
291 | yield self.all_x[input_idx_var], output_idx_var, pos_or_neg
292 |
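# Build (node, context) training pairs from uniform random walks: each selected start node yields a
# walk of length path_size, and every pair of nodes within window_size steps of each other on the
# walk becomes a context pair.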
293 | def gen_rnd_walk_pairs(self):
294 | print "Generate random walks..."
295 | all_pairs = []
296 | permuted_idx = np.random.permutation(self.vertices_num)
297 | if (len(permuted_idx) > 10000):
298 | permuted_idx = np.random.choice(permuted_idx, 10000, replace=False)
299 | print "Randomly selected src nodes for random walk..."
300 | for start_idx in permuted_idx:
301 | if start_idx not in self.graph or len(self.graph[start_idx]) == 0:
302 | continue
303 | path = [start_idx]
304 | for _ in range(self.path_size):
305 | if path[-1] in self.graph:
306 | path.append(np.random.choice(self.graph[path[-1]]))
307 | for l in range(len(path)):
308 | for m in range(l - self.window_size, l + self.window_size + 1):
309 | if m < 0 or m >= len(path):
310 | continue
311 | all_pairs.append([path[l], path[m]])
312 | return np.random.permutation(all_pairs)
313 |
314 | def unsupervised_graph_context_iter(self):
315 | """Unsupervised graph context iterator."""
316 | rnd_walk_save_file = self.data.dataset + ".rnd_walks.npy"
317 | save_walks = np.array([], dtype=np.int32).reshape(0, 2)
318 | if os.path.exists(rnd_walk_save_file):
319 | save_walks = np.load(rnd_walk_save_file)
320 |
321 | all_pairs = save_walks
322 | new_walks = np.array([], dtype=np.int32).reshape(
323 | 0, 2) # buffer storage
324 | max_num_pairs = max(10000000, self.vertices_num * 100)
325 | while True:
326 | if len(all_pairs) == 0:
327 | # enough rnd walks, reuse them.
328 | if len(save_walks) >= max_num_pairs:
329 | all_pairs = save_walks
330 | else:
331 | all_pairs = self.gen_rnd_walk_pairs()
332 | print "newly generated rnd walks " + str(all_pairs.shape)
333 | # save the new walks to buffer
334 | new_walks = np.concatenate((new_walks, all_pairs), axis=0)
335 | if len(new_walks) >= 10000: # buffer full.
336 | save_walks = np.concatenate(
337 | (save_walks, new_walks), axis=0)
338 | np.save(rnd_walk_save_file, save_walks)
339 | print "Successfully save the walks..."
340 | new_walks = np.array([], dtype=np.int32).reshape(0, 2)
341 | i = 0
342 | j = i + self.graph_context_batch_size
343 | while j < len(all_pairs):
344 | pos_or_neg = np.array([1.0] * self.graph_context_batch_size + [-1.0]
345 | * self.graph_context_batch_size * self.neg_samp, dtype=np.float32)
346 | context_pairs = np.zeros(
347 | (self.graph_context_batch_size + self.graph_context_batch_size * self.neg_samp, 2), dtype=np.int32)
348 | context_pairs[:self.graph_context_batch_size,
349 | :] = all_pairs[i:j, :]
350 | context_pairs[self.graph_context_batch_size:, 0] = np.repeat(
351 | all_pairs[i:j, 0], self.neg_samp)
352 | context_pairs[self.graph_context_batch_size:, 1] = np.random.randint(
353 | 0, self.vertices_num, size=self.graph_context_batch_size * self.neg_samp)
354 | input_idx_var = context_pairs[:, 0]
355 | output_idx_var = context_pairs[:, 1]
356 | shuffle_idx = np.random.permutation(
357 | np.arange(len(input_idx_var)))
358 | input_idx_var = input_idx_var[shuffle_idx]
359 | output_idx_var = output_idx_var[shuffle_idx]
360 | pos_or_neg = pos_or_neg[shuffle_idx]
361 | yield self.all_x[input_idx_var], output_idx_var, pos_or_neg
362 | i = j
363 | j = i + self.graph_context_batch_size
364 | all_pairs = []
365 |
366 | def get_embeddings(self):
367 | embeddings = self.session.run(self.model_dic['embeddings'],
368 | feed_dict={self.model_dic['input']: self.all_x})
369 |
370 | return embeddings
371 |
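# Node-similarity matrix S fed into label propagation: either a heat kernel over the cosine
# distance of the embeddings or an inverse Euclidean distance.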
372 | def calc_similarity(self, model="cosine"):
373 | X = self.get_embeddings()
374 | if model == "cosine":
375 | X = cosine_similarity(X)
376 | X = np.exp(-1/0.03 * (1 - X))
377 | elif model == "distance":
378 | X = pairwise_distances(X)
379 | X = 1/(1+X)
380 |
381 | return X
382 |
383 | def gen_labeled_matrix(self):
384 | N = self.vertices_num
385 | K = self.labels_num
386 | Y = np.zeros((N, K))
387 | for i in range(len(self.label_x)):
388 | idx = self.label_x[i]
389 | label = self.label_y[i]
390 | Y[idx, label] = 1.0
391 | return tf.convert_to_tensor(Y, dtype=tf.float32)
392 |
393 | def label_propagation_train_step(self, S, num_iterations=100):
394 | average_loss = 0
395 | for i in range(num_iterations):
396 | _, loss = self.session.run([self.model_dic['apply_f_op'], self.model_dic['loss_lp']], feed_dict={
397 | self.model_dic['similarity_matrix']: S})
398 | average_loss += loss
399 | average_loss /= num_iterations * 1.0
400 |
401 | return average_loss
402 |
403 | def indicator_vector_train_step(self, S):
404 | _ = self.session.run(self.model_dic['apply_v_op'], feed_dict={
405 | self.model_dic['similarity_matrix']: S})
406 |
407 | def calc_entropy_loss(self):
408 | entropy = tf.keras.backend.categorical_crossentropy(self.F, self.F)
409 | entropy_loss_vec = self.session.run(entropy)
410 |
411 | return entropy_loss_vec
412 |
413 | def evaluation_label_propagation(self):
414 | F = self.session.run(self.F)
415 | testID = list(set(range(self.vertices_num)) - set(self.label_x))
416 |
417 | predict = np.argmax(F, axis=1)
418 |
419 | y_pred = predict[self.label_x]
420 | y_gt = self.all_y[self.label_x]
421 |
422 | accuracy = accuracy_score(y_gt, y_pred)
423 | macro_f1 = f1_score(y_gt, y_pred, average="macro")
424 | micro_f1 = f1_score(y_gt, y_pred, average="micro")
425 |
426 | print "Train Set, accuracy: %f, macro_f1: %f, micro_f1: %f" % (accuracy, macro_f1, micro_f1)
427 |
428 | y_pred = predict[testID]
429 | y_gt = self.all_y[testID]
430 |
431 | accuracy = accuracy_score(y_gt, y_pred)
432 | macro_f1 = f1_score(y_gt, y_pred, average="macro")
433 | micro_f1 = f1_score(y_gt, y_pred, average="micro")
434 |
435 | print "Test Set, accuracy: %f, macro_f1: %f, micro_f1: %f" % (accuracy, macro_f1, micro_f1)
436 |
437 | return accuracy
438 |
--------------------------------------------------------------------------------
/DELP/requirements.txt:
--------------------------------------------------------------------------------
1 | tensorflow==1.4.1
2 | networkx==1.11
3 | numpy==1.14.2
4 | scipy==0.19.1
5 | scikit-learn==0.19.0
6 |
--------------------------------------------------------------------------------
/DELP/utils.py:
--------------------------------------------------------------------------------
1 | from scipy.sparse import csr_matrix
2 | from scipy.sparse import vstack
3 | import networkx as nx
4 | import cPickle
5 | import numpy as np
6 | import sys
7 | import tensorflow as tf
8 | import scipy.io as sio
9 |
10 |
11 | class data(object):
12 | def __init__(self, args):
13 | self.dataset = args.dataset
14 | self.all_x = self.read_attributes(args.attr_filename)
15 | self.all_y = self.read_label(args.label_filename)
16 | self.graph = self.read_network(args.edge_filename)
17 | self.adj_matrix = self.gen_network_adjmatrix(args.edge_filename)
18 |
19 | def read_attributes(self, filename):
20 | f = open(filename, "r")
21 | lines = f.readlines()
22 | f.close()
23 | features = []
24 | for line in lines[1:]:
25 | l = line.strip("\n\r").split(" ")
26 | features.append(l)
27 | features = np.array(features, dtype=np.float32)
28 | features[features > 0] = 1.0 # feature binarization
29 |
30 | return features
31 |
32 | def read_attributes_mat(self, filename):
33 | mat = sio.loadmat(filename)
34 | features = mat['feature']
35 | features[features > 0] = 1.0
36 |
37 | return features
38 |
39 | def read_label(self, labelFile):
40 | # Read node labels and build the node label dict
41 | f = open(labelFile, "r")
42 | lines = f.readlines()
43 | f.close()
44 |
45 | labels = []
46 | self.labelDict = dict()
47 |
48 | for line in lines:
49 | l = line.strip("\n\r").split(" ")
50 | nodeID = int(l[0])
51 | label = int(l[1])
52 | labels.append(label)
53 | if self.labelDict.has_key(label):
54 | self.labelDict[label].append(nodeID)
55 | else:
56 | self.labelDict[label] = [nodeID]
57 | labels = np.array(labels, dtype=np.int32)
58 | return labels
59 |
60 | def read_network(self, filename):
61 | f = open(filename, "r")
62 | lines = f.readlines()
63 | f.close()
64 |
65 | graph = dict()
66 | for line in lines:
67 | l = line.strip("\n\r").split(" ")
68 | node1 = int(l[0])
69 | node2 = int(l[1])
70 | if not graph.has_key(node1):
71 | graph[node1] = [node2]
72 | else:
73 | graph[node1].append(node2)
74 | if not graph.has_key(node2):
75 | graph[node2] = [node1]
76 | else:
77 | graph[node2].append(node1)
78 | return graph
79 |
80 | def gen_network_adjmatrix(self, filename):
81 | G = nx.read_edgelist(filename, nodetype=int, create_using=nx.DiGraph())
82 | G = G.to_undirected()
83 | G_adj = nx.to_numpy_matrix(G)
84 |
85 | return G_adj
86 |
--------------------------------------------------------------------------------
/PGRR/KDD_18_ADS_Mobile_Access_Record_Resolution_on_Large_Scale_Identifier_Linkage_Graphs.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hp027/AliGraph/60bd650d6d1c66eb419399d28c88972c89472c74/PGRR/KDD_18_ADS_Mobile_Access_Record_Resolution_on_Large_Scale_Identifier_Linkage_Graphs.pdf
--------------------------------------------------------------------------------
/PGRR/README.md:
--------------------------------------------------------------------------------
1 | # PGRR
2 |
3 | Mobile Access Record Resolution on Large-Scale Identifier-Linkage Graphs (KDD-18)
4 |
5 | These are the Spark implementations of two versions (unsupervised and semi-supervised) of the Parallel Graph-based Record Resolution (PGRR) algorithm, which solves the Mobile Access Records Resolution (MARR) problem.
6 |
7 |
8 | ## Requirement
9 |
10 | * Spark
11 | * Scala
12 | * ODPS
13 | * Maven
14 |
15 | ## Basic Usage
16 |
17 | ### Input Data
18 | For the unsupervised version, each dataset contains 2 tables: features and edges.
19 | ```
20 | 1. features: each record contains n+1 fields.
21 | node_1 feature_1 feature_2 ... feature_n
22 | node_2 feature_1 feature_2 ... feature_n
23 | ...
24 |
25 | 2. edges: each record contains two connected nodes.
26 | node_1 node_2
27 | node_2 node_3
28 | ...
29 | ```
30 | For the semi-supervised version, each dataset contains 3 tables: features, edges and labels.
31 | ```
32 | 1. features: each record contains n+1 fields.
33 | node_1 feature_1 feature_2 ... feature_n
34 | node_2 feature_1 feature_2 ... feature_n
35 | ...
36 |
37 | 2. edges: each line contains two connected nodes.
38 | node_1 node_2
39 | node_2 node_3
40 | ...
41 |
42 | 3. labels: each record contains a node and its label.
43 | node_1 label_1
44 | node_2 label_2
45 | ...
46 | ```
47 |
48 | ### Run
49 | To run PGRR, we recommend using `IntelliJ IDEA` to open this project.
50 |
51 |
--------------------------------------------------------------------------------
/PGRR/semi-supervised/pom.xml:
--------------------------------------------------------------------------------
1 |
2 |
5 | 4.0.0
6 |
7 | groupId
8 | PGRR-semi-supervised
9 | 1.0-SNAPSHOT
10 |
11 |
12 | 2.0.6
13 | UTF-8
14 | UTF-8
15 | 2.11.8
16 | 2.11
17 | 512m
18 | 1024m
19 | 2.2.1-odps0.29.0
20 | 2.7.1-ali1.8.2
21 |
22 |
23 |
24 |
25 | org.apache.spark
26 | spark-streaming_${scala.binary.version}
27 | ${spark.version}
28 | provided
29 |
30 |
31 | org.apache.spark
32 | spark-mllib_${scala.binary.version}
33 | ${spark.version}
34 | provided
35 |
36 |
37 | org.apache.spark
38 | spark-sql_${scala.binary.version}
39 | ${spark.version}
40 | provided
41 |
42 |
43 | org.apache.spark
44 | spark-hive_${scala.binary.version}
45 | ${spark.version}
46 | provided
47 |
48 |
49 | org.apache.hadoop
50 | hadoop-aliyun-fs
51 | ${hadoop.aliyun.fs.version}
52 |
53 |
54 | org.apache.hadoop
55 | hadoop-common
56 |
57 |
58 |
59 |
60 | org.apache.spark
61 | spark-core_${scala.binary.version}
62 | ${spark.version}
63 | provided
64 |
65 |
66 | org.apache.spark
67 | odps-support_${scala.binary.version}
68 | ${spark.version}
69 | provided
70 |
71 |
72 |
73 | com.aliyun.odps
74 | odps-spark-datasource
75 | ${odps.spark.plugin.version}
76 |
77 |
78 |
79 | com.aliyun.odps
80 | odps-spark-client
81 | ${odps.spark.plugin.version}
82 |
83 |
84 |
85 | org.scalatest
86 | scalatest_${scala.binary.version}
87 | 2.2.6
88 | provided
89 |
90 |
91 | org.scala-lang
92 | scala-library
93 | ${scala.version}
94 | provided
95 |
96 |
97 | org.scala-lang
98 | scala-actors
99 | ${scala.version}
100 | provided
101 |
102 |
103 | org.codehaus.jackson
104 | jackson-core-asl
105 | 1.9.13
106 |
107 |
108 | com.aliyun.odps
109 | cupid-sdk
110 | 1.8.2
111 | provided
112 |
113 |
114 |
115 |
116 | tbmirror
117 | taobao mirror
118 | http://mvnrepo.alibaba-inc.com/mvn/repository
119 |
120 |
121 | tbmirror-snapshots
122 | taobao mirror snapshots
123 | http://mvnrepo.alibaba-inc.com/mvn/repository
124 |
125 |
126 |
127 |
128 |
129 | maven-compiler-plugin
130 |
131 | 1.7
132 | 1.7
133 | UTF-8
134 |
135 |
136 |
137 | org.apache.maven.plugins
138 | maven-shade-plugin
139 | 2.1
140 |
141 |
142 | package
143 |
144 | shade
145 |
146 |
147 | false
148 | true
149 |
150 |
151 |
153 | *:*
154 |
155 |
156 |
157 |
158 | *:*
159 |
160 | META-INF/*.SF
161 | META-INF/*.DSA
162 | META-INF/*.RSA
163 |
164 |
165 |
166 |
167 |
169 | reference.conf
170 |
171 |
172 |
173 |
174 |
175 |
176 |
177 | net.alchim31.maven
178 | scala-maven-plugin
179 | 3.2.2
180 |
181 |
182 | scala-compile-first
183 | process-resources
184 |
185 | compile
186 |
187 |
188 |
197 |
198 | attach-scaladocs
199 | verify
200 |
201 | doc-jar
202 |
203 |
204 |
205 |
206 |
207 |
208 |
--------------------------------------------------------------------------------
/PGRR/semi-supervised/src/main/scala/Main.scala:
--------------------------------------------------------------------------------
1 | import breeze.linalg._
2 | import breeze.numerics._
3 | import org.apache.spark.graphx._
4 | import com.aliyun.odps.TableSchema
5 | import com.aliyun.odps.data.Record
6 | import org.apache.spark.odps.OdpsOps
7 | import org.apache.spark.{SparkConf, SparkContext}
8 |
9 | object Main {
10 | val inputSemiTable = "sample_semi"
11 | val inputNodeTable = "sample_feature"
12 | val inputEdgeTable = "sample_edge"
13 | val outputClusterTable = "sample_label"
14 | val featureDimension = 100
15 | var delta = 1.0
16 | var minError = 0.0
17 | val rho = 1.0
18 | val lambda = 1e-5
19 | val lambda2 = 1e-5
20 |
21 | def sqr(x: Double): Double = {
22 | return x * x
23 | }
24 |
25 | def computeLoss(graph: Graph[
26 | (DenseVector[Double], DenseVector[Double], Long, Long, Long),
27 | (Double, DenseVector[Double], DenseVector[Double], DenseVector[Double], DenseVector[Double])
28 | ]): Double = {
29 | var loss: Double = 0
30 | loss += graph.vertices.map { case (_, (xi, ci, _, _,_)) => 0.0 }.sum
31 | loss += lambda * graph.triplets.map(triplet => {
32 | 0.0
33 | }).sum
34 | return loss
35 | }
36 |
37 |
38 | def updateDelta(graph: Graph[
39 | (DenseVector[Double], DenseVector[Double], Long, Long, Long),
40 | (Double, DenseVector[Double], DenseVector[Double], DenseVector[Double], DenseVector[Double])
41 | ]): Int = {
42 |
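// Choose the similarity threshold delta from the labeled pairs: edges are scored by wij = exp(-||ci - cj||),
// and delta is set to the value that roughly minimises the number of same-label pairs falling below it
// plus the number of different-label pairs falling above it.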
43 | val tmp1 = graph.triplets.map(triplet => {
44 | val ci = triplet.srcAttr._2
45 | val cj = triplet.dstAttr._2
46 | val wij = exp(-norm(ci - cj))
47 | val ti = triplet.srcAttr._5
48 | val tj = triplet.dstAttr._5
49 | var flag = 0
50 | if (ti == -1 || tj == -1) flag = -1
51 | else if (ti == tj) flag = 1
52 | else flag = 0
53 | (wij, flag)
54 | })
55 | var array = tmp1.filter(_._2 != -1).collect
56 | array = array.sortWith(_._1 < _._1)
57 | var cnt0 = 0
58 | var cnt1 = 0
59 | for (elem <- array) {
60 | if (elem._2 == 1) cnt1 += 1
61 | else cnt0 +=1
62 | }
63 | delta = 1.0
64 | minError = cnt0 + cnt1
65 | var cur0 = 0
66 | var cur1 = 0
67 | for (elem <- array) {
68 | if (elem._2 == 1) cur1 += 1
69 | else cur0 += 1
70 | val err = cur1 + (cnt0 - cur0)
71 | if (err <= minError) {
72 | minError = err
73 | delta = elem._1
74 | }
75 | }
76 | return 0
77 | }
78 |
79 | def updateC(oldGraph: Graph[
80 | (DenseVector[Double], DenseVector[Double], Long, Long, Long),
81 | (Double, DenseVector[Double], DenseVector[Double], DenseVector[Double], DenseVector[Double])
82 | ]): Graph[
83 | (DenseVector[Double], DenseVector[Double], Long, Long, Long),
84 | (Double, DenseVector[Double], DenseVector[Double], DenseVector[Double], DenseVector[Double])
85 | ] = {
86 |
87 | var graph = oldGraph
88 | graph = graph.mapTriplets(triplet => {
89 | val xi = triplet.srcAttr._1
90 | val xj = triplet.dstAttr._1
91 | val pi = triplet.srcAttr._4
92 | val pj = triplet.dstAttr._4
93 | var sij = 0.0
94 | if (pi == pj) sij = 1.0
95 | else sij = exp(-norm(xi-xj))
96 | (sij, triplet.attr._2, triplet.attr._3, triplet.attr._4, triplet.attr._5)
97 | })
98 |
99 | for (iteration <- 1 to 50) {
100 | val prevG = graph
101 | //c-update
102 | val tempValue = graph.aggregateMessages[DenseVector[Double]](
103 | triplet => {
104 | val zij = triplet.attr._2
105 | val zji = triplet.attr._3
106 | val uij = triplet.attr._4
107 | val uji = triplet.attr._5
108 | triplet.sendToSrc(zij - uij)
109 | triplet.sendToDst(zji - uji)
110 | },
111 | (a, b) => a + b
112 | )
113 | graph = graph.joinVertices(tempValue)((_, oldValue, extraValue) => {
114 | val xi = oldValue._1
115 | val di = oldValue._3
116 | val pi = oldValue._4
117 | val ti = oldValue._5
118 | val ci = (xi * 2.0 + extraValue * rho) / (2.0 + rho * di + lambda2 * 2)
119 | (xi, ci, di, pi, ti)
120 | })
121 |
122 | //z-update
123 | graph = graph.mapTriplets(triplets => {
124 | val sij = triplets.attr._1
125 | val uij = triplets.attr._4
126 | val uji = triplets.attr._5
127 | val ci = triplets.srcAttr._2
128 | val cj = triplets.dstAttr._2
129 | var theta = 0.0
130 | if (norm(ci + uij - (cj + uji)) == 0.0) theta = 0.5
131 | else theta = max(1 - lambda * sij / (rho * norm(ci + uij - (cj + uji))), 0.5)
132 | val zij = (ci + uij) * theta + (cj + uji) * (1 - theta)
133 | val zji = (ci + uij) * (1 - theta) + (cj + uji) * theta
134 | (sij, zij, zji, uij, uji)
135 | })
136 |
137 | //u-update
138 | graph = graph.mapTriplets(triplets => {
139 | val sij = triplets.attr._1
140 | val zij = triplets.attr._2
141 | val zji = triplets.attr._3
142 | val uij = triplets.attr._4
143 | val uji = triplets.attr._5
144 | val ci = triplets.srcAttr._2
145 | val cj = triplets.dstAttr._2
146 | (sij, zij, zji, uij + ci - zij, uji + cj - zji)
147 | })
148 |
149 | graph.cache()
150 | println(" iteration " + iteration + ":")
151 | println(" loss = " + computeLoss(graph))
152 |
153 | prevG.unpersistVertices(blocking = false)
154 | prevG.edges.unpersist(blocking = false)
155 | }
156 |
157 | return graph
158 | }
159 |
160 | def main(args: Array[String]): Unit = {
161 | val projectName = "da_intern_dev"
162 | val conf = new SparkConf().setAppName("PGRR-SEMI")
163 | val sc = new SparkContext(conf)
164 | val odpsOps = new OdpsOps(sc)
165 |
166 | //(id, (feature, clusterCenter, degree, clusterIndex, semi_label))
167 | //(id, (xi, ci, di, pi, ti))
168 | var node = odpsOps.readTable[(Long, (DenseVector[Double], DenseVector[Double], Long, Long, Long))](
169 | projectName,
170 | inputNodeTable,
171 | (r: Record, schema: TableSchema) => {
172 | val vectorBuilder = new VectorBuilder[Double](featureDimension)
173 | for (i <- 1 to featureDimension) {
174 | vectorBuilder.add(i - 1, r.getDouble(i))
175 | }
176 | (r.getBigint(0).toLong, (vectorBuilder.toDenseVector, vectorBuilder.toDenseVector, 0L, 0L, -1L))
177 | }, 0).cache()
178 |
179 | //(srcId, dstId, (sij, zij, zji, uij, uji))
180 | var edge = odpsOps.readTable[Edge[(Double, DenseVector[Double], DenseVector[Double], DenseVector[Double], DenseVector[Double])]](
181 | projectName,
182 | inputEdgeTable,
183 | (r: Record, schema: TableSchema) => {
184 | Edge(r.getBigint(0).toLong, r.getBigint(1).toLong,
185 | (0.0,
186 | DenseVector.zeros[Double](featureDimension),
187 | DenseVector.zeros[Double](featureDimension),
188 | DenseVector.zeros[Double](featureDimension),
189 | DenseVector.zeros[Double](featureDimension)
190 | )
191 | )
192 | }, 0).cache()
193 |
194 | var graph = Graph(node, edge).cache()
195 | graph = graph.joinVertices(graph.degrees)((_, oldValue, di) => {
196 | val xi = oldValue._1
197 | val ci = oldValue._2
198 | (xi, ci, di, oldValue._4, oldValue._5)
199 | })
200 |
201 | var semi = odpsOps.readTable[(Long, Long)](
202 | projectName,
203 | inputSemiTable,
204 | (r: Record, schema: TableSchema) => {
205 | (r.getBigint(0).toLong, r.getBigint(1).toLong)
206 | }
207 | )
208 | graph = graph.joinVertices(semi)((_, oldValue, ti) => {
209 | (oldValue._1, oldValue._2, oldValue._3, oldValue._4, ti)
210 | })
211 |
212 | graph = graph.mapTriplets(triplet => {
213 | val xi = triplet.srcAttr._1
214 | val xj = triplet.dstAttr._1
215 | (exp(-norm(xi-xj)), xi, xj, DenseVector.zeros[Double](featureDimension), DenseVector.zeros[Double](featureDimension))
216 | })
217 |
218 | for (iter <- 1 to 50) {
219 | updateDelta(graph)
220 | val tmpGraph = graph.mapTriplets(triplet => exp(-norm(triplet.srcAttr._2 - triplet.dstAttr._2)))
221 | val subGraph = tmpGraph.subgraph(epred = edge => edge.attr >= delta)
222 | val clusterResult = subGraph.connectedComponents().vertices
223 | graph = graph.joinVertices(clusterResult)((_, oldValue, pi) => {
224 | val xi = oldValue._1
225 | val ci = oldValue._2
226 | val di = oldValue._3
227 | val ti = oldValue._5
228 | (xi, ci, di, pi, ti)
229 | }
230 | )
231 | val loss = minError + computeLoss(graph)
232 | println("" + iter + ": " + loss)
233 | graph = updateC(graph)
234 | }
235 |
236 | val clusterResult = graph.vertices.map{case (id, (xi, ci, di, pi,ti)) => (id, pi)}
237 |
238 | odpsOps.saveToTable(projectName, outputClusterTable, clusterResult,
239 | (v: (Long, Long), r: Record, schema: TableSchema) => {
240 | r.set(0, v._1)
241 | r.set(1, v._2)
242 | },
243 | isOverWrite = true
244 | )
245 |
246 | sc.stop()
247 | }
248 | }
--------------------------------------------------------------------------------
/PGRR/unsupervised/pom.xml:
--------------------------------------------------------------------------------
1 |
2 |
5 | 4.0.0
6 |
7 | groupId
8 | PGRR-unsupervised
9 | 1.0-SNAPSHOT
10 |
11 |
12 |
13 | 1.0.4
14 | UTF-8
15 | UTF-8
16 | 2.10
17 | 512m
18 | 1024m
19 |
20 |
21 |
22 | org.apache.spark
23 | spark-streaming_2.10
24 | 1.5.1-odps0.26.0
25 | provided
26 |
27 |
28 | org.apache.spark
29 | spark-mllib_2.10
30 | 1.5.1
31 | provided
32 |
33 |
34 | org.apache.spark
35 | spark-sql_2.10
36 | 1.5.1
37 | provided
38 |
39 |
40 | org.apache.spark
41 | spark-hive_2.10
42 | 1.5.1
43 | provided
44 |
45 |
46 | org.apache.hadoop
47 | hadoop-aliyun
48 | 3.0.0-alpha2-odps
49 |
50 |
51 | org.apache.spark
52 | spark-core_2.10
53 | 1.5.1-odps0.26.0
54 | provided
55 |
56 |
57 |
58 | com.aliyun.odps
59 | odps-spark-datasource
60 | ${spark.odps.and.client.version}
61 |
62 |
63 |
64 | com.aliyun.odps
65 | odps-spark-client
66 | ${spark.odps.and.client.version}
67 |
68 |
69 |
70 | org.scalatest
71 | scalatest_2.10
72 | 2.2.1
73 |
74 |
75 | org.scala-lang
76 | scala-library
77 | 2.10.4
78 |
79 |
80 | org.scala-lang
81 | scala-actors
82 | 2.10.4
83 |
84 |
85 | org.codehaus.jackson
86 | jackson-core-asl
87 | 1.9.13
88 |
89 |
90 |
91 |
92 | tbmirror
93 | taobao mirror
94 | http://mvnrepo.alibaba-inc.com/mvn/repository
95 |
96 |
97 | tbmirror-snapshots
98 | taobao mirror snapshots
99 | http://mvnrepo.alibaba-inc.com/mvn/repository
100 |
101 |
102 |
--------------------------------------------------------------------------------
/PGRR/unsupervised/src/main/scala/Main.scala:
--------------------------------------------------------------------------------
1 | import breeze.linalg._
2 | import breeze.numerics._
3 | import org.apache.spark.graphx._
4 | import com.aliyun.odps.TableSchema
5 | import com.aliyun.odps.data.Record
6 | import org.apache.spark.odps.OdpsOps
7 | import org.apache.spark.{SparkConf, SparkContext}
8 |
9 | object Main {
10 | val inputNodeTable = "sample_feature"
11 | val inputEdgeTable = "sample_edge"
12 | val outputClusterTable = "sample_label"
13 |
14 | val featureDimension = 100
15 | val lambda = 1e-5
16 | val delta = 1.0
17 | val rho = 1.0
18 | val lambda2 = 1e5
19 |
20 | def sqr(x: Double): Double = {
21 | return x * x
22 | }
23 |
24 | def computeLoss(graph: Graph[
25 | (DenseVector[Double], DenseVector[Double], Long),
26 | (Double, DenseVector[Double], DenseVector[Double], DenseVector[Double], DenseVector[Double])
27 | ]): Double = {
28 | var loss: Double = 0
29 | loss += graph.vertices.map { case (_, (xi, ci, _)) => sqr(norm(xi - ci))+lambda2*sqr(norm(ci)) }.sum
30 | loss += lambda * graph.triplets.map(triplet => {
31 | val sij = triplet.attr._1
32 | val ci = triplet.srcAttr._2
33 | val cj = triplet.dstAttr._2
34 | norm(ci - cj) * sij
35 | }).sum
36 | return loss
37 | }
38 |
39 | def main(args: Array[String]): Unit = {
40 | val projectName = "da_intern_dev"
41 | val conf = new SparkConf().setAppName("PGRR-unsupervised")
42 | val sc = new SparkContext(conf)
43 | val odpsOps = new OdpsOps(sc)
44 |
45 | var node = odpsOps.readTable[(Long, (DenseVector[Double], DenseVector[Double], Long))](
46 | projectName,
47 | inputNodeTable,
48 | (r: Record, schema: TableSchema) => {
49 | val vectorBuilder = new VectorBuilder[Double](featureDimension)
50 | for (i <- 1 to featureDimension) {
51 | vectorBuilder.add(i - 1, r.getDouble(i))
52 | }
53 | (r.getBigint(0).toLong, (vectorBuilder.toDenseVector, vectorBuilder.toDenseVector, 0L))
54 | }, 10)
55 | node = node.repartition(10)
56 |
57 | var inputEdge = odpsOps.readTable[(Long, Long)](projectName, inputEdgeTable,
58 | (r: Record, schema: TableSchema) =>
59 | (r.getBigint(0).toLong, r.getBigint(1).toLong),10)
60 | inputEdge = inputEdge.repartition(10)
61 |
62 | val edge = inputEdge.map { case (src, dst) => Edge(src, dst,
63 | (0.0,
64 | DenseVector.zeros[Double](featureDimension),
65 | DenseVector.zeros[Double](featureDimension),
66 | DenseVector.zeros[Double](featureDimension),
67 | DenseVector.zeros[Double](featureDimension)
68 | )
69 | )
70 | }
71 |
72 | var graph = Graph(node, edge)
73 | graph = graph.joinVertices(graph.degrees)((_, oldValue, di) => {
74 | val xi = oldValue._1
75 | val ci = oldValue._2
76 | (xi, ci, di)
77 | })
78 |
79 | graph = graph.mapTriplets(triplets => {
80 | val xi = triplets.srcAttr._1
81 | val xj = triplets.dstAttr._1
82 | val di = triplets.srcAttr._3
83 | val dj = triplets.dstAttr._3
84 | var oi = 0.0
85 | if (di > 10) oi = 0.1
86 | else if (di < 5) oi = 1
87 | else oi = 0.5
88 | var oj = 0.0
89 | if (dj > 10) oj = 0.1
90 | else if (dj < 5) oj = 1
91 | else oj = 0.5
92 | val sij = oi * oj * exp(norm(xi - xj) * (-1.0))
93 | (sij, xi, xj, triplets.attr._4, triplets.attr._5)
94 | })
95 |
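// ADMM-style alternating optimisation: the c-update recomputes each vertex centre from its feature and
// incoming consensus messages, the z-update soft-merges the centres of neighbouring vertices
// (theta lies in [0.5, 1]), and the u-update accumulates the scaled dual residuals.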
96 | for (iteration <- 1 to 50) {
97 | val prevG = graph
98 |
99 | //c-update
100 | val tempValue = graph.aggregateMessages[DenseVector[Double]](
101 | triplet => {
102 | val zij = triplet.attr._2
103 | val zji = triplet.attr._3
104 | val uij = triplet.attr._4
105 | val uji = triplet.attr._5
106 | triplet.sendToSrc(zij - uij)
107 | triplet.sendToDst(zji - uji)
108 | },
109 | (a, b) => a + b
110 | )
111 | graph = graph.joinVertices(tempValue)((_, oldValue, extraValue) => {
112 | val xi = oldValue._1
113 | val di = oldValue._3
114 | val ci = (xi * 2.0 + extraValue * rho) / (2.0 + rho * di + lambda2 * 2)
115 | (xi, ci, di)
116 | })
117 |
118 | //z-update
119 | graph = graph.mapTriplets(triplets => {
120 | val sij = triplets.attr._1
121 | val uij = triplets.attr._4
122 | val uji = triplets.attr._5
123 | val ci = triplets.srcAttr._2
124 | val cj = triplets.dstAttr._2
125 | var theta = 0.0
126 | if (norm(ci + uij - (cj + uji)) == 0.0) theta = 0.5
127 | else theta = max(1 - lambda * sij / (rho * norm(ci + uij - (cj + uji))), 0.5)
128 | val zij = (ci + uij) * theta + (cj + uji) * (1 - theta)
129 | val zji = (ci + uij) * (1 - theta) + (cj + uji) * theta
130 | (sij, zij, zji, uij, uji)
131 | })
132 |
133 | //u-update
134 | graph = graph.mapTriplets(triplets => {
135 | val sij = triplets.attr._1
136 | val zij = triplets.attr._2
137 | val zji = triplets.attr._3
138 | val uij = triplets.attr._4
139 | val uji = triplets.attr._5
140 | val ci = triplets.srcAttr._2
141 | val cj = triplets.dstAttr._2
142 | (sij, zij, zji, uij + ci - zij, uji + cj - zji)
143 | })
144 |
145 | graph.cache()
146 | println("iteration " + iteration + ":")
147 | println("loss = " + computeLoss(graph))
148 | val newGraph = graph.mapTriplets(triplest => exp(-norm(triplest.srcAttr._2 - triplest.dstAttr._2)))
149 | val subGraph = newGraph.subgraph(epred = edge => edge.attr >= delta)
150 | println("cluster number = " + subGraph.connectedComponents().vertices.map {
151 | case (_, cluster) => cluster
152 | }.distinct.count)
153 |
154 | prevG.unpersistVertices(blocking = false)
155 | prevG.edges.unpersist(blocking = false)
156 | }
157 |
158 | val newGraph = graph.mapTriplets(triplest => exp(-norm(triplest.srcAttr._2 - triplest.dstAttr._2)))
159 | val subGraph = newGraph.subgraph(epred = edge => edge.attr >= delta)
160 | val clusterResult = subGraph.connectedComponents().vertices
161 |
162 | odpsOps.saveToTable(projectName, outputClusterTable, clusterResult,
163 | (v: (Long, Long), r: Record, schema: TableSchema) => {
164 | r.set(0, v._1)
165 | r.set(1, v._2)
166 | },
167 | isOverWrite = true
168 | )
169 |
170 | sc.stop()
171 | }
172 | }
173 |
174 |
--------------------------------------------------------------------------------
/PRRE/README.md:
--------------------------------------------------------------------------------
1 | # PRRE
2 |
3 | This is a python implementation of the PRRE algorithm, which considers the partial correlation between node attributes and network topology.
4 | **The paper is under submission and will be released after acceptance.**
5 |
6 | ## Requirements
7 | * networkx
8 | * numpy
9 | * scipy
10 | * scikit-learn
11 |
12 | All required packages are defined in requirements.txt. To install all requirements, just use the following command:
13 | ```
14 | pip install -r requirements.txt
15 | ```
16 |
17 | ## Basic Usage
18 |
19 | ### Input Data
20 | Each dataset contains 3 files: an edgelist, features for training and labels for evaluation.
21 | ```
22 | 1. dataset.edgelist: each line contains two connected nodes.
23 | node_1 node_2
24 | node_2 node_3
25 | ...
26 |
27 | 2. dataset.feature: this file has n lines.
28 | feature_1 feature_2 ... feature_n
29 | feature_1 feature_2 ... feature_n
30 | ...
31 |
32 | 3. dataset.label: each line represents a node and its class label.
33 | node_1 label_1
34 | node_2 label_2
35 | ...
36 | ```
37 |
38 | ### Run
39 | To run PRRE, just execute the following command; the node classification and link prediction tasks are selected via the 'task' parameter:
40 | ```
41 | python prre.py
42 | ```
43 | For the data visualization task, we recommend using Embedding Projector to explore the embedding results; a minimal export sketch is shown below.
44 |
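Embedding Projector loads a tab-separated file of vectors plus an optional metadata file. Below is a minimal export sketch, assuming an embedding file whose first line stores the node count and dimension, as `load_embeddings` in `classify.py` expects; the file names are placeholders:
```python
import numpy as np
from classify import load_embeddings, read_node_label

# Load the learned embeddings; the first line of the file holds the node count and dimension.
emb, node_num = load_embeddings("dataset.embed")        # placeholder file name
labels = read_node_label("dataset.label", node_num)     # placeholder file name

# Embedding Projector reads tab-separated vectors and a one-column metadata file.
np.savetxt("vectors.tsv", emb, delimiter="\t")
np.savetxt("metadata.tsv", labels, fmt="%d")
```
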
45 | ### TODO
46 | The parallel and distributed version of PRRE will be released later.
47 |
--------------------------------------------------------------------------------
/PRRE/__pycache__/classify.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hp027/AliGraph/60bd650d6d1c66eb419399d28c88972c89472c74/PRRE/__pycache__/classify.cpython-36.pyc
--------------------------------------------------------------------------------
/PRRE/classify.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from sklearn.multiclass import OneVsRestClassifier
3 | from sklearn.metrics import f1_score
4 | from sklearn.preprocessing import MultiLabelBinarizer
5 | from time import time
6 | from sklearn.model_selection import train_test_split
7 | from sklearn.svm import SVC
8 | from sklearn.svm import LinearSVC
9 | from sklearn.metrics import accuracy_score,f1_score
10 | from sklearn.metrics import roc_auc_score,average_precision_score
11 | import networkx as nx
12 | from sklearn.metrics.pairwise import cosine_similarity
13 |
14 |
15 |
16 | # class TopKRanker(OneVsRestClassifier):
17 | # def predict(self, X, top_k_list):
18 | # probs = numpy.asarray(super(TopKRanker, self).predict_proba(X))
19 | # all_labels = []
20 | # for i, k in enumerate(top_k_list):
21 | # probs_ = probs[i, :]
22 | # labels = self.classes_[probs_.argsort()[-k:]].tolist()
23 | # probs_[:] = 0
24 | # probs_[labels] = 1
25 | # all_labels.append(probs_)
26 | # return numpy.asarray(all_labels)
27 | #
28 | #
29 | # class Classifier(object):
30 | #
31 | # def __init__(self, vectors, clf):
32 | # self.embeddings = vectors
33 | # self.clf = TopKRanker(clf)
34 | # self.binarizer = MultiLabelBinarizer(sparse_output=True)
35 | #
36 | # def train(self, X, Y, Y_all):
37 | # self.binarizer.fit(Y_all)
38 | # X_train = [self.embeddings[x] for x in X]
39 | # Y = self.binarizer.transform(Y)
40 | # self.clf.fit(X_train, Y)
41 | #
42 | # def evaluate(self, X, Y):
43 | # top_k_list = [len(l) for l in Y]
44 | # Y_ = self.predict(X, top_k_list)
45 | # Y = self.binarizer.transform(Y)
46 | # averages = ["micro", "macro", "samples", "weighted"]
47 | # results = {}
48 | # for average in averages:
49 | # results[average] = f1_score(Y, Y_, average=average)
50 | # # print 'Results, using embeddings of dimensionality', len(self.embeddings[X[0]])
51 | # # print '-------------------'
52 | # print results
53 | # return results
54 | # # print '-------------------'
55 | #
56 | # def predict(self, X, top_k_list):
57 | # X_ = numpy.asarray([self.embeddings[x] for x in X])
58 | # Y = self.clf.predict(X_, top_k_list=top_k_list)
59 | # return Y
60 | #
61 | # def split_train_evaluate(self, X, Y, train_precent, seed=0):
62 | # state = numpy.random.get_state()
63 | #
64 | # training_size = int(train_precent * len(X))
65 | # numpy.random.seed(seed)
66 | # shuffle_indices = numpy.random.permutation(numpy.arange(len(X)))
67 | # X_train = [X[shuffle_indices[i]] for i in range(training_size)]
68 | # Y_train = [Y[shuffle_indices[i]] for i in range(training_size)]
69 | # X_test = [X[shuffle_indices[i]] for i in range(training_size, len(X))]
70 | # Y_test = [Y[shuffle_indices[i]] for i in range(training_size, len(X))]
71 | #
72 | # self.train(X_train, Y_train, Y)
73 | # numpy.random.set_state(state)
74 | # return self.evaluate(X_test, Y_test)
75 |
76 |
77 |
78 | # def load_embeddings(filename):
79 | # fin = open(filename, 'r')
80 | # node_num, size = [int(x) for x in fin.readline().strip().split()]
81 | # vectors = {}
82 | # while 1:
83 | # l = fin.readline()
84 | # if l == '':
85 | # break
86 | # vec = l.strip().split(' ')
87 | # assert len(vec) == size+1
88 | # vectors[vec[0]] = [float(x) for x in vec[1:]]
89 | # fin.close()
90 | # assert len(vectors) == node_num
91 | # return vectors
92 |
93 | # def read_node_label(filename):
94 | # fin = open(filename, 'r')
95 | # X = []
96 | # Y = []
97 | # while 1:
98 | # l = fin.readline()
99 | # if l == '':
100 | # break
101 | # vec = l.strip().split(' ')
102 | # X.append(vec[0])
103 | # Y.append(vec[1:])
104 | # fin.close()
105 | # return X, Y
106 |
107 | def load_embeddings(filename):
108 | fin = open(filename, 'r')
109 | node_num, size = [int(x) for x in fin.readline().strip().split()]
110 | print(node_num,size)
111 | X=np.zeros((node_num,size))
112 | while 1:
113 | l = fin.readline()
114 | if l == '':
115 | break
116 | node = int(l.strip('\n\r').split()[0])
117 | embedding=l.strip('\n\r').split()[1:]
118 | X[node,:]=[float(x) for x in embedding]
119 | fin.close()
120 | return X,node_num
121 |
122 | def load_embeddings2(filename):
123 | fin = open(filename, 'r')
124 | node_num, size = [int(x) for x in fin.readline().strip().split()]
125 | X=np.zeros((node_num,size*2))
126 | while 1:
127 | l = fin.readline()
128 | if l == '':
129 | break
130 | node = int(l.strip('\n\r').split()[0])
131 | embedding=l.strip('\n\r').split()[1:]
132 | X[node,:]=[float(x) for x in embedding]
133 | fin.close()
134 | return X,node_num
135 |
136 |
137 |
138 | def read_node_label(filename,node_num):
139 | Y = np.zeros(node_num)
140 | label_path = filename
141 | with open(label_path) as fp:
142 | for line in fp.readlines():
143 | node = line.strip('\n\r').split()[0]
144 | label = line.strip('\n\r').split()[1]
145 | Y[int(node)] = int(label)
146 | return Y
147 |
148 | def eval(X,Y,train_percent=0.3):
149 | X_train, X_test, y_train, y_test = train_test_split(X, Y, train_size=train_percent, test_size=1 - train_percent,random_state=666)
150 | #clf = SVC(C=20)
151 | clf=LinearSVC()
152 | clf.fit(X_train, y_train)
153 | res = clf.predict(X_test)
154 | accuracy = accuracy_score(y_test, res)
155 | macro = f1_score(y_test, res, average='macro')
156 | micro = f1_score(y_test, res, average='micro')
157 | print(micro,macro)
158 |
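# Train/test split for link prediction: only edges whose two endpoints both have degree > 1 are
# eligible to be moved into the test set, so no node loses all of its edges; any node left without
# a training edge receives a self-loop in the training graph.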
159 | def link_cut(edge_path,rate):
160 | all_edge = []
161 | node_edge_num_dict = {}
162 | node_set = set()
163 | with open(edge_path) as fp:
164 | for line in fp.readlines():
165 | node1 = int(line.strip('\n\r').split()[0])
166 | node2 = int(line.strip('\n\r').split()[1])
167 | node_set.add(node1)
168 | node_set.add(node2)
169 | all_edge.append((node1, node2))
170 | if node1 not in node_edge_num_dict:
171 | node_edge_num_dict[node1] = 1
172 | else:
173 | node_edge_num_dict[node1] += 1
174 | if node2 not in node_edge_num_dict:
175 | node_edge_num_dict[node2] = 1
176 | else:
177 | node_edge_num_dict[node2] += 1
178 | seperatable_edge = []
179 | for edge in all_edge:
180 | node1 = edge[0]
181 | node2 = edge[1]
182 | if node_edge_num_dict[node1] > 1 and node_edge_num_dict[node2] > 1:
183 | seperatable_edge.append(edge)
184 | print('Number of nodes:',len(node_set))
185 | print('Number of edges:', len(all_edge))
186 | print('Number of seperatable edges:', len(seperatable_edge))
187 | test_edges = []
188 | train_edges = []
189 | if len(all_edge) * rate > len(seperatable_edge):
190 | print('Not so many edges to be sampled!')
191 | else:
192 | np.random.shuffle(seperatable_edge)
193 | for i in range(int(len(all_edge) * rate)):
194 | test_edges.append(seperatable_edge[i])
195 | for edge in all_edge:
196 | if edge not in test_edges:
197 | train_edges.append(edge)
198 | for i in range(len(node_set)):
199 | flag=0
200 | for pair in train_edges:
201 | if i in pair:
202 | flag+=1
203 | if flag==0:
204 | train_edges.append((i,i))
205 | train_set=set()
206 | with open('training_graph.txt', 'w') as wp:
207 | for edge in train_edges:
208 | node1 = edge[0]
209 | node2 = edge[1]
210 | train_set.add(node1)
211 | train_set.add(node2)
212 | wp.write(str(node1) + '\t' + str(node2) + '\n')
213 | with open('test_graph.txt', 'w') as wp:
214 | for edge in test_edges:
215 | node1 = edge[0]
216 | node2 = edge[1]
217 | wp.write(str(node1) + '\t' + str(node2) + '\n')
218 | print('Training graph node number:',len(train_set))
219 |
220 |
221 | def link_prediction(embedding,test_path):
222 | embedding_sim =cosine_similarity(embedding)
223 | y_predict=[]
224 | y_gt=[]
225 | with open(test_path) as fp:
226 | for line in fp.readlines():
227 | node1=int(line.strip('\n\r').split()[0])
228 | node2 = int(line.strip('\n\r').split()[1])
229 | label=int(line.strip('\n\r').split()[2])
230 | y_gt.append(label)
231 | y_predict.append(embedding_sim[node1,node2])
232 | roc=roc_auc_score(y_gt,y_predict)
233 | ap=average_precision_score(y_gt,y_predict)
234 | if roc<0.5:
235 | roc=1-roc
236 | print('ROC:',roc,'AP:',ap)
--------------------------------------------------------------------------------
/PRRE/classify.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hp027/AliGraph/60bd650d6d1c66eb419399d28c88972c89472c74/PRRE/classify.pyc
--------------------------------------------------------------------------------
/PRRE/graph_distance.py:
--------------------------------------------------------------------------------
1 | import networkx as nx
2 | import numpy as np
3 | from scipy.sparse import csr_matrix
4 | from numpy.linalg import inv
5 | import random
6 |
7 | def jaccard(edge_path):
8 | neighbor_set_dict={}
9 | node_set=set()
10 | edge_num=0
11 | with open(edge_path) as fp:
12 | for line in fp.readlines():
13 | edge_num+=1
14 | node1=int(line.strip('\n\r').split()[0])
15 | node2=int(line.strip('\n\r').split()[1])
16 | node_set.add(node1)
17 | node_set.add(node2)
18 | if node1 not in neighbor_set_dict:
19 | neighbor_set_dict[node1]=set()
20 | neighbor_set_dict[node1].add(node2)
21 | else:
22 | neighbor_set_dict[node1].add(node2)
23 | if node2 not in neighbor_set_dict:
24 | neighbor_set_dict[node2]=set()
25 | neighbor_set_dict[node2].add(node1)
26 | else:
27 | neighbor_set_dict[node2].add(node1)
28 | node_num=len(node_set)
29 | print('Node number:',node_num)
30 | print('Edge number:',edge_num)
31 | num=0
32 | sim_mat=np.zeros((node_num,node_num))
33 | row=[]
34 | col=[]
35 | data=[]
36 | for i in range(node_num):
37 | for j in range(node_num):
38 | i_nbr=neighbor_set_dict[i]
39 | j_nbr=neighbor_set_dict[j]
40 | inter=len(i_nbr.intersection(j_nbr))
41 | union=len(i_nbr.union(j_nbr))
42 | score=float(inter)/union
43 | sim_mat[i,j]=score
44 | if i!=j and score>0:
45 | num+=1
46 | row.append(i)
47 | col.append(j)
48 | data.append(score)
49 | M=csr_matrix((data, (row, col)), shape=(node_num, node_num))
50 | print('Jaccard similarity finished!')
51 | print(float(num)/(node_num*node_num))
52 | return M.toarray(),node_num
53 |
54 | def katz(edge_path,beta):
55 | '''
56 | S=(I-beta*A)^-1 * beta*A
57 | :param edge_path:
58 | :param beta:
59 | :return:
60 | '''
61 | node_set = set()
62 | row=[]
63 | col=[]
64 | G=nx.Graph()
65 | with open(edge_path) as fp:
66 | for line in fp.readlines():
67 | node1 = int(line.strip('\n\r').split()[0])
68 | node2 = int(line.strip('\n\r').split()[1])
69 | G.add_edge(node1,node2)
70 | row.append(node1)
71 | row.append(node2)
72 | col.append(node2)
73 | col.append(node1)
74 | node_set.add(node1)
75 | node_set.add(node2)
76 | node_num=len(node_set)
77 | print('node num:',node_num,G.number_of_nodes())
78 | A=np.zeros((node_num,node_num))
79 | for i in range(len(col)):
80 | A[row[i],col[i]]=1.0
81 | S=np.dot(inv(np.identity(node_num)-beta*A),beta*A)
82 | return S,node_num
83 |
84 | def RPR(edge_path,alpha):
85 | '''
86 | S=(I-alpha*P)^-1 * (1-alpha)* I
87 | :param edge_path:
88 | :param alpha:
89 | :return:
90 | '''
91 | node_set = set()
92 | row = []
93 | col = []
94 | G = nx.Graph()
95 | with open(edge_path) as fp:
96 | for line in fp.readlines():
97 | node1 = int(line.strip('\n\r').split()[0])
98 | node2 = int(line.strip('\n\r').split()[1])
99 | G.add_edge(node1, node2)
100 | row.append(node1)
101 | row.append(node2)
102 | col.append(node2)
103 | col.append(node1)
104 | node_set.add(node1)
105 | node_set.add(node2)
106 | node_num = len(node_set)
107 | print('node num:', node_num, G.number_of_nodes())
108 | A = np.zeros((node_num, node_num))
109 | for i in range(len(col)):
110 | A[row[i], col[i]] = 1.0
111 | P=np.zeros((node_num,node_num))
112 | for i in range(node_num):
113 | row_sum=np.sum(A[i,:])
114 | P[i,:]=A[i,:]/row_sum
115 | S=np.dot(inv(np.identity(node_num)-alpha*P),(1-alpha)*np.identity(node_num))
116 | return S,node_num
117 |
118 | def CN(edge_path):
119 | '''
120 | S=I^-1 * A^2
121 | :param edge_path:
122 | :return: integer similarity
123 | '''
124 | node_set = set()
125 | row = []
126 | col = []
127 | G = nx.Graph()
128 | with open(edge_path) as fp:
129 | for line in fp.readlines():
130 | node1 = int(line.strip('\n\r').split()[0])
131 | node2 = int(line.strip('\n\r').split()[1])
132 | G.add_edge(node1, node2)
133 | row.append(node1)
134 | row.append(node2)
135 | col.append(node2)
136 | col.append(node1)
137 | node_set.add(node1)
138 | node_set.add(node2)
139 | node_num = len(node_set)
140 | print('node num:', node_num, G.number_of_nodes())
141 | A = np.zeros((node_num, node_num))
142 | for i in range(len(col)):
143 | A[row[i], col[i]] = 1
144 | S=np.dot(inv(np.identity(node_num)),np.dot(A,A))
145 | return S
146 |
147 | def AA(edge_path):
148 | '''
149 | S=I^-1 * (A*D*A)
150 | :param edge_path:
151 | :return: similarity matrix
152 | '''
153 | node_set = set()
154 | row = []
155 | col = []
156 | G = nx.Graph()
157 | with open(edge_path) as fp:
158 | for line in fp.readlines():
159 | node1 = int(line.strip('\n\r').split()[0])
160 | node2 = int(line.strip('\n\r').split()[1])
161 | G.add_edge(node1, node2)
162 | row.append(node1)
163 | row.append(node2)
164 | col.append(node2)
165 | col.append(node1)
166 | node_set.add(node1)
167 | node_set.add(node2)
168 | node_num = len(node_set)
169 | print('node num:', node_num, G.number_of_nodes())
170 | A = np.zeros((node_num, node_num))
171 | for i in range(len(col)):
172 | A[row[i], col[i]] = 1.0
173 | D=np.zeros((node_num,node_num))
174 | for i in range(node_num):
175 | D[i,i]=0.5/np.sum(A[i,:])
176 | S=np.dot(inv(np.identity(node_num)),np.dot(np.dot(A,D),A))
177 | return S,node_num
178 |
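# Estimate a PPMI matrix from truncated random walks: node pairs co-occurring within window_size
# steps are counted, the counts are normalised by node frequencies and the corpus size, and
# negative log-ratios are clipped to zero.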
179 | def PPMI(edge_path,window_size):
180 | G_dic = {}
181 | max_node = 0
182 | with open(edge_path) as f:
183 | lines = f.readlines()
184 | print('Edge Number:',len(lines))
185 | for line in lines:
186 | items = line.strip('\n').split()
187 | a = int(items[0])
188 | b = int(items[1])
189 | #if a == b:
190 | # continue
191 | max_node = max(max_node, a)
192 | max_node = max(max_node, b)
193 | if a in G_dic:
194 | G_dic[a].append(b)
195 | else:
196 | G_dic[a] = [b]
197 | if b in G_dic:
198 | G_dic[b].append(a)
199 | else:
200 | G_dic[b] = [a]
201 | G = [[] for _ in range(max_node + 1)]
202 | for k, v in G_dic.items():
203 | G[k] = v
204 | node_num=len(G_dic.items())
205 | print('Node num:',node_num)
206 | walk_length=80
207 | walk_num=20
208 | walks = []
209 | for cnt in range(walk_num):
210 | for node in range(node_num):
211 | path = [node]
212 | while len(path) < walk_length:
213 | cur = path[-1]
214 | if len(G[cur]) > 0:
215 | path.append(random.choice(G[cur]))
216 | else:
217 | break
218 | walks.append(path)
219 |
220 | vocab = np.zeros(node_num)
221 | for walk in walks:
222 | for node in walk:
223 | vocab[node] += 1
224 | pair_num_dict={}
225 | for walk in walks:
226 | for i in range(len(walk)):
227 | source_node = walk[i]
228 | left_window = max(i - window_size, 0)
229 | right_window = min(i + window_size, len(walk))
230 | for j in range(left_window, right_window):
231 | target_node=walk[j]
232 | if source_node!=target_node:
233 | if (source_node,target_node) not in pair_num_dict:
234 | pair_num_dict[(source_node,target_node)]=1
235 | else:
236 | pair_num_dict[(source_node, target_node)] += 1
237 | PPMI_matrix=np.zeros((node_num,node_num))
238 | len_D=node_num*walk_length*walk_num
239 | for key in pair_num_dict:
240 | node1=key[0]
241 | node2=key[1]
242 | co_occurance=pair_num_dict[key]
243 | frequency_1=vocab[node1]
244 | frequency_2=vocab[node2]
245 | res=np.log(1.0*co_occurance*len_D/(frequency_1*frequency_2))
246 | if res<0:
247 | res=0
248 | PPMI_matrix[node1,node2]=res
249 | return PPMI_matrix,node_num
250 |
251 |
--------------------------------------------------------------------------------
/PRRE/graph_distance.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hp027/AliGraph/60bd650d6d1c66eb419399d28c88972c89472c74/PRRE/graph_distance.pyc
--------------------------------------------------------------------------------
/PRRE/prre.py:
--------------------------------------------------------------------------------
1 | #__author__:zhousheng
2 |
3 | from __future__ import print_function
4 | import numpy as np
5 | from scipy.special import expit as sigmoid
6 | from copy import deepcopy
7 | from classify import read_node_label, eval, link_cut, link_prediction
8 | from graph_distance import PPMI,jaccard
9 | from sklearn.metrics.pairwise import cosine_similarity
10 | import sys
11 | from scipy.io import mmread
12 | from sklearn.preprocessing import normalize
13 |
14 |
15 | class Graph():
16 | def __init__(self, feature_path, edge_path, label_path, embedding_size, lambda_h, lambda_theta_attr,lambda_theta_net, step_size,
17 | step_size_theta_attr,step_size_theta_net,feature_sparse):
18 | self.label_path = label_path
19 | self.feature_path = feature_path
20 | self.edge_path = edge_path
21 | [sim_mat_graph, self.node_num] = jaccard(self.edge_path)
22 | self.sim_mat_graph=sim_mat_graph
23 | #self.sim_mat_graph=self.norm_sim_mat(sim_mat_graph,self.node_num)
24 | print('------Using Jaccard similarity measure------')
25 | self.feature_sim_mat(feature_sparse)
26 | self.embedding_size = embedding_size
27 | self.embedding_mat = np.random.normal(loc=0, scale=0.1, size=(self.node_num, self.embedding_size))
28 | self.context_mat = np.random.normal(loc=0, scale=0.1, size=(self.node_num, self.embedding_size))
29 | self.lambda_h = lambda_h
30 | self.lambda_theta_attr = lambda_theta_attr
31 | self.lambda_theta_net=lambda_theta_net
32 | self.step_size = step_size
33 | self.step_size_theta_attr = step_size_theta_attr
34 | self.step_size_theta_net=step_size_theta_net
35 | self.theta_graph = np.mean(self.sim_mat_graph)
36 | self.theta_attr = np.mean(self.sim_mat_attr)
37 | print('Theta graph:',self.theta_graph,'Theta attr:',self.theta_attr)
38 | self.batch_size = 256
39 | self.b = np.random.normal(loc=0, scale=0.1, size=(1, self.node_num))
40 | self.loss = 1
41 |
42 | def norm_sim_mat(self,M,node_num):
43 | MM=deepcopy(M)
44 | for i in range(node_num):
45 | MM[i,i]=0
46 | print('Using L1 norm')
47 | return normalize(MM,'l1')
48 |
49 | def feature_sim_mat(self,feature_sparse):
50 | if feature_sparse==False:
51 | with open(self.feature_path) as fp:
52 | lines = fp.readlines()
53 | node_num = len(lines)
54 | line = lines[0]
55 | attr_num = len(line.strip('\n\r').split())
56 | print('Node number:', node_num, 'Attribute dimension:', attr_num)
57 | self.node_num = node_num
58 | A = np.zeros((node_num, attr_num))
59 | with open(self.feature_path) as fp:
60 | line_num = 0
61 | for line in fp.readlines():
62 | A[line_num, :] = line.strip('\n\r').split()
63 | line_num += 1
64 |         else:
65 |             A = mmread(self.feature_path).todense()  # sparse features stored in Matrix Market format
66 |             self.node_num = A.shape[0]
67 | self.A = A
68 | A_sim = cosine_similarity(A)
69 | self.sim_mat_attr=A_sim
70 | #self.sim_mat_attr = self.norm_sim_mat(A_sim,self.node_num)
71 | print('Average Attribute Similarity:',np.mean(A_sim))
72 |
73 |
74 | def judge_pos_neg(self, M, theta):
75 | sim_mat = M
76 | pos_neg_mat = np.zeros((self.node_num, self.node_num))
77 | pos_neg_mat[sim_mat >= theta] = 1
78 | pos_neg_mat[sim_mat < theta] = 0
79 |         for i in range(self.node_num):
80 |             pos_neg_mat[i, i] = 666  # sentinel on the diagonal so a node is never paired with itself
81 |         return pos_neg_mat
82 |
83 | def final_judge(self):
84 | pos_neg_mat_graph = self.judge_pos_neg(self.sim_mat_graph, self.theta_graph)
85 | pos_neg_mat_attr = self.judge_pos_neg(self.sim_mat_attr, self.theta_attr)
86 | final_pos_neg_mat = np.zeros((self.node_num, self.node_num))
87 | final_pos_neg_mat[np.where(pos_neg_mat_graph == 1)] += 1
88 | final_pos_neg_mat[np.where(pos_neg_mat_graph == 0)] -= 1
89 | final_pos_neg_mat[np.where(pos_neg_mat_attr == 1)] += 1
90 | final_pos_neg_mat[np.where(pos_neg_mat_attr == 0)] -= 1
91 |         for i in range(self.node_num):
92 |             final_pos_neg_mat[i, i] = 666  # same sentinel: keep self-pairs out of the positive/ambiguous/negative sets
93 |         return final_pos_neg_mat
94 |
95 | def sampling(self):
96 |         u_list = list(range(self.node_num))  # list() so np.random.shuffle works under Python 3
97 |         final_pos_neg_mat = self.final_judge()
98 |         P = float(len(np.where(final_pos_neg_mat == 2)[0]))   # positive in both the graph and attribute views
99 |         A = float(len(np.where(final_pos_neg_mat == 0)[0]))   # ambiguous: positive in exactly one view
100 |         N = float(len(np.where(final_pos_neg_mat == -2)[0]))  # negative in both views
101 |         total = P + A + N
102 | sampled_list = []
103 | np.random.shuffle(u_list)
104 | for u in u_list:
105 | pos_neg_vec = final_pos_neg_mat[u, :]
106 | p_list = np.where(pos_neg_vec == 2)[0]
107 | a_list = np.where(pos_neg_vec == 0)[0]
108 | n_list = np.where(pos_neg_vec == -2)[0]
109 | # if len(p_list) > 0 and len(a_list) > 0 and len(n_list) > 0:
110 | # for i in range(100):
111 | # p = np.random.choice(p_list)
112 | # a = np.random.choice(a_list)
113 | # n = np.random.choice(n_list)
114 | # sampled_list.append([u, p, a, n])
115 | #
116 | for i in range(20):
117 | if len(p_list) > 0 and len(a_list) > 0 and len(n_list) > 0:
118 | p = np.random.choice(p_list)
119 | a = np.random.choice(a_list)
120 | n = np.random.choice(n_list)
121 | sampled_list.append([u, p, a, n])
122 |                 elif len(p_list) > 0 and len(a_list) == 0 and len(n_list) > 0:
123 |                     p = np.random.choice(p_list)
124 |                     a = p  # no ambiguous candidates: reuse the positive node in the ambiguous slot
125 |                     n = np.random.choice(n_list)
126 |                     sampled_list.append([u, p, a, n])
127 |                 elif len(p_list) == 0 and len(a_list) > 0 and len(n_list) > 0:
128 |                     p = np.random.choice(a_list)
129 |                     a = p  # no positive candidates: promote an ambiguous node to the positive slot
130 |                     n = np.random.choice(n_list)
131 |                     sampled_list.append([u, p, a, n])
132 |
133 |
134 |
135 | np.random.shuffle(sampled_list)
136 | self.sampled_list = sampled_list
137 | print('-----------New Sampling--------')
138 | print(len(sampled_list), 'triplets sampled')
139 |
140 | def mini_batch(self):
141 | sampled_list = self.sampled_list
142 | np.random.shuffle(sampled_list)
143 | batch_num = len(sampled_list) // self.batch_size
144 | if len(sampled_list) % self.batch_size == 0:
145 | for i in range(batch_num):
146 | yield sampled_list[i * self.batch_size: (i + 1) * self.batch_size]
147 | else:
148 | for i in range(batch_num + 1):
149 | if i < batch_num:
150 | yield sampled_list[i * self.batch_size: (i + 1) * self.batch_size]
151 | else:
152 | yield sampled_list[i * self.batch_size:]
153 |
154 | def g_theta(self):
155 | pos_neg_mat_graph = self.judge_pos_neg(self.sim_mat_graph, self.theta_graph)
156 | pos_neg_mat_attr = self.judge_pos_neg(self.sim_mat_attr, self.theta_attr)
157 | if len(np.where(pos_neg_mat_graph == 0)[0]) > 0:
158 | t_neg_graph = np.mean(self.sim_mat_graph[np.where(pos_neg_mat_graph == 0)])
159 | else:
160 | t_neg_graph = 0
161 |             print('No negative graph pairs: t_neg_graph set to 0')
162 | self.t_neg_graph = t_neg_graph
163 | if len(np.where(pos_neg_mat_graph == 1)[0]) > 0:
164 | t_pos_graph = np.mean(self.sim_mat_graph[np.where(pos_neg_mat_graph == 1)])
165 | else:
166 | t_pos_graph = self.theta_graph
167 |             print('No positive graph pairs: t_pos_graph falls back to theta_graph')
168 | self.t_pos_graph = t_pos_graph
169 | if len(np.where(pos_neg_mat_attr == 0)[0]) > 0:
170 | t_neg_attr = np.mean(self.sim_mat_attr[np.where(pos_neg_mat_attr == 0)])
171 | else:
172 | t_neg_attr = 0
173 |             print('No negative attribute pairs: t_neg_attr set to 0')
174 | self.t_neg_attr = t_neg_attr
175 | if len(np.where(pos_neg_mat_attr == 1)[0]) > 0:
176 | t_pos_attr = np.mean(self.sim_mat_attr[np.where(pos_neg_mat_attr == 1)])
177 | else:
178 | t_pos_attr = self.theta_attr
179 |             print('No positive attribute pairs: t_pos_attr falls back to theta_attr')
180 | self.t_pos_attr = t_pos_attr
181 | return (t_pos_graph - self.theta_graph) * (self.theta_graph - t_neg_graph) + (t_pos_attr - self.theta_attr) * (
182 | self.theta_attr - t_neg_attr)
183 |
184 |     def Estep(self):  # E-step: update the embedding matrix H by gradient ascent with the thresholds fixed
185 | g_theta = self.g_theta()
186 | H = deepcopy(self.embedding_mat)
187 | HH = sigmoid(np.dot(H, H.T))
188 | SHH = HH * (1 - HH)
189 | grad_mat = np.zeros((self.node_num, self.embedding_size))
190 | g_theta_plus_one_inv = 1.0 / (g_theta + 1)
191 | for pair in self.sampled_list:
192 | u = pair[0]
193 | p = pair[1]
194 | a = pair[2]
195 | n = pair[3]
196 | h_u = H[u, :]
197 | h_p = H[p, :]
198 | h_a = H[a, :]
199 | h_n = H[n, :]
200 | up = HH[u, p]
201 | ua = HH[u, a]
202 | un = HH[u, n]
203 | sup = SHH[u, p]
204 | sua = SHH[u, a]
205 | sun = SHH[u, n]
206 | upua = 1 + up - ua
207 | uaun = 1 + ua - un
208 | if upua != 0 and uaun != 0:
209 | grad_mat[u, :] += (sup * h_p - sua*h_a) / upua + (sua*h_a - sun * h_n) / uaun
210 | grad_mat[p, :] += (sup * h_u) / upua
211 | grad_mat[a, :] += (-sua*h_u) / upua + (sua*h_u) / uaun
212 | grad_mat[n, :] += (-sun * h_u) / uaun
213 | grad_mat *= g_theta_plus_one_inv
214 | grad_mat -= self.lambda_h * H
215 | H += self.step_size * grad_mat
216 | self.embedding_mat = H
217 |
218 |     def Mstep(self):  # M-step: update the thresholds theta_graph and theta_attr with the embeddings fixed
219 | H = deepcopy(self.embedding_mat)
220 | HH = sigmoid(np.dot(H, H.T) )
221 | grad_theta_graph = 0
222 | grad_theta_attr = 0
223 | g_theta = self.g_theta()
224 | for pair in self.sampled_list:
225 | u = pair[0]
226 | p = pair[1]
227 | a = pair[2]
228 | n = pair[3]
229 | grad_theta_graph += (self.t_pos_graph + self.t_neg_graph - 2 * self.theta_graph) * (
230 | np.log((HH[u, p] - HH[u, a] + 1) / 2) + np.log((HH[u, a] - HH[u, n] + 1) / 2)) * (
231 | -1.0 / np.power(1.0 + g_theta, 2))
232 | grad_theta_attr += (self.t_pos_attr + self.t_neg_attr - 2 * self.theta_attr) * (
233 | np.log((HH[u, p] - HH[u, a] + 1) / 2) + np.log((HH[u, a] - HH[u, n] + 1) / 2)) * (
234 | -1.0 / np.power(1.0 + g_theta, 2))
235 | grad_theta_graph /= len(self.sampled_list)
236 | grad_theta_attr /= len(self.sampled_list)
237 | grad_theta_graph -= self.lambda_theta_net * self.theta_graph
238 | grad_theta_attr -= self.lambda_theta_attr * self.theta_attr
239 | self.theta_graph += self.step_size_theta_net * grad_theta_graph
240 | self.theta_attr += self.step_size_theta_attr * grad_theta_attr
241 |         print('t_graph:', self.theta_graph, 't_attr:', self.theta_attr)
242 |
243 | def run(self, task):
244 | for i in range(200):
245 | self.sampling()
246 |             print('epoch %d: triplets generated' % (i + 1))
247 | self.Estep()
248 | self.Mstep()
249 | self.output(task)
250 | #np.savetxt('res500.tsv', self.embedding_mat[:500,:], delimiter='\t')
251 | #np.savetxt('res1000.tsv', self.embedding_mat[:1000, :], delimiter='\t')
252 | np.savetxt('resall.tsv', self.embedding_mat, delimiter='\t')
253 |
254 |
255 |
256 | def output(self, task):
257 | X = self.embedding_mat
258 | node_num = self.node_num
259 | if task == 'class':
260 | Y = read_node_label(self.label_path, node_num)
261 | eval(X, Y)
262 |         else:
263 |             link_prediction(X, test_path)  # test_path is defined at module level in the __main__ block below
264 |
265 |
266 | if __name__ == '__main__':
267 |     data = 'blogcatalog'  # sys.argv[1]
268 |     task = 'class'
269 |     split = '1'  # sys.argv[1]
270 |     print(data, task, split)
271 |     print('PRRE both')
272 | feature_sparse = True
273 | edge_path = '../../data/' + data + '/' + data + '.edgelist'
274 | label_path = '../../data/' + data + '/' + data + '.label'
275 |     if not feature_sparse:
276 | feature_path = '../../data/' + data + '/' + data + '.feature'
277 | else:
278 | feature_path='../../data/' + data + '/' + data + '_feature.mtx'
279 | train_path = '../../data/' + data + '/' + data + '.train'+split
280 | test_path = '../../data/' + data + '/' + data + '.test'+split
281 |
282 |     lambda_h = 1  # float(sys.argv[1])
283 |     lambda_theta_attr = 0
284 |     lambda_theta_net = 0
285 |     step_size = 0.1
286 |     step_size_theta_attr = 0.1
287 |     step_size_theta_net = 0.1
288 |     print(lambda_h, lambda_theta_attr, lambda_theta_net, step_size, step_size_theta_attr, step_size_theta_net)
289 | if task == 'class':
290 | G = Graph(feature_path=feature_path, edge_path=edge_path, label_path=label_path, embedding_size=128,
291 | lambda_h=lambda_h, lambda_theta_attr=lambda_theta_attr,lambda_theta_net=lambda_theta_net,
292 | step_size=step_size, step_size_theta_attr=step_size_theta_attr,step_size_theta_net=step_size_theta_net,feature_sparse=feature_sparse)
293 | G.run(task)
294 | else:
295 | G = Graph(feature_path=feature_path, edge_path=train_path, label_path=None, embedding_size=128,
296 | lambda_h=lambda_h, lambda_theta_attr=lambda_theta_attr,lambda_theta_net=lambda_theta_net,
297 | step_size=step_size, step_size_theta_attr=step_size_theta_attr,step_size_theta_net=step_size_theta_net,feature_sparse=feature_sparse)
298 | G.run(task)
299 |
--------------------------------------------------------------------------------
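The `run` method above dumps the learned embeddings to `resall.tsv`, one node per row ordered by node id. As a quick sanity check of those embeddings on node classification, a minimal sketch along the following lines can be used; it is not part of this repository, it assumes a single-label `node_id label` file in the format used by the other datasets in this project, and the paths and helper name are illustrative only:

```
# evaluate_resall.py -- hypothetical helper, not part of the repo.
# Assumes row i of resall.tsv is the embedding of node i and that the label
# file contains one "node_id label" pair per line.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split


def evaluate_embeddings(embed_path='resall.tsv',
                        label_path='../../data/blogcatalog/blogcatalog.label'):
    X = np.loadtxt(embed_path, delimiter='\t')  # shape: (node_num, embedding_size)
    labels = {}
    with open(label_path) as fp:
        for line in fp:
            node, label = line.strip().split()[:2]
            labels[int(node)] = label
    nodes = sorted(labels)                      # node ids that have a known label
    y = np.array([labels[n] for n in nodes])
    X = X[nodes]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print('micro-F1:', f1_score(y_te, pred, average='micro'))
    print('macro-F1:', f1_score(y_te, pred, average='macro'))


if __name__ == '__main__':
    evaluate_embeddings()
```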
/PRRE/requirements.txt:
--------------------------------------------------------------------------------
1 | tensorflow==1.4.1
2 | networkx==1.11
3 | numpy==1.14.2
4 | scipy==0.19.1
5 | scikit-learn==0.19.0
6 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # AliGraph: A Comprehensive Graph Neural Network System
2 | Large-scale graphical modeling plays a key role in tackling many of the challenges faced by big data companies, as many problems can be abstracted and solved with graphical models. Recently, methods that represent networks in a vector space while preserving their properties have become widely popular. The embeddings are used as input features to a model whose parameters are learned from the training data. Obtaining a vector representation of a large heterogeneous graph is inherently difficult and poses several challenges, from which the following research topics arise:
3 |
4 | 1. Heterogeneity: A “good” representation of nodes should preserve not only the structure of the graph but also the differences between entities. Existing works mainly focus on embeddings of homogeneous graphs, but the majority of real problems call for heterogeneous graphical modeling.
5 |
6 | 2. Scalability: The graph formed by buyers, sellers and commodities contains hundreds of billions of nodes and edges. Defining a scalable model is challenging, especially when the model aims to preserve global properties of the graph.
7 |
8 | 3. Reliability: Observed edges or node labels can be polluted and are therefore not reliable.
9 |
10 | To address the above challenges, we propose a principled effort to investigate a new deep graph framework that seamlessly integrates computing, learning and inference in a consistent and optimal setup. Graph neural networks (GNNs) are an effective yet efficient way to solve graph analytics problems by mapping the graph data into a low-dimensional space while preserving as much of the graph's structural and attribute information as possible. In AliGraph, we introduce a comprehensive graph neural network system that is currently deployed at Alibaba. It consists of three key components: a universal module (e.g., sampling, multi-edge, dynamic graphs), an upper-level module (e.g., mixture GNN, hierarchical GNN) and a cognitive module (e.g., GNN-to-text generation, Bayesian GNN with a knowledge graph as a prior). The methods and papers checked in so far include the following (still growing) list:
11 |
12 | 1. ANRL: Attributed Network Representation Learning via Deep Neural Networks (IJCAI-18)
13 |
14 | 2. DELP: Semi-Supervised Learning Through Dynamic Graph Embedding and Label Propagation (Under Review)
15 |
16 | 3. PGRR: Mobile Access Record Resolution on Large-Scale Identifier-Linkage Graphs (KDD-18)
17 |
18 | 4. PRRE: Personalized Relation Ranking Embedding for Attributed Network (CIKM-18)
19 |
20 |
21 |
--------------------------------------------------------------------------------