├── .gitignore
├── LICENSE
├── README.md
├── data
│   └── cora
│       ├── cora_adjlist.txt
│       ├── cora_attr.txt
│       ├── cora_doub_adjlist.txt
│       └── cora_label.txt
├── emb
│   └── init
├── requirements.txt
└── src
    ├── libnrl
    │   ├── __init__.py
    │   ├── aane.py
    │   ├── abrw.py
    │   ├── attrcomb.py
    │   ├── attrpure.py
    │   ├── classify.py
    │   ├── graph.py
    │   ├── node2vec.py
    │   ├── tadw.py
    │   ├── utils.py
    │   └── walker.py
    └── main.py
/.gitignore:
--------------------------------------------------------------------------------
1 | # Others
2 |
3 | # Byte-compiled / optimized / DLL files
4 | __pycache__/
5 | *.py[cod]
6 | *$py.class
7 |
8 | # C extensions
9 | *.so
10 |
11 | # Distribution / packaging
12 | .Python
13 | build/
14 | develop-eggs/
15 | dist/
16 | downloads/
17 | eggs/
18 | .eggs/
19 | lib/
20 | lib64/
21 | parts/
22 | sdist/
23 | var/
24 | wheels/
25 | *.egg-info/
26 | .installed.cfg
27 | *.egg
28 | MANIFEST
29 |
30 | # PyInstaller
31 | # Usually these files are written by a python script from a template
32 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
33 | *.manifest
34 | *.spec
35 |
36 | # Installer logs
37 | pip-log.txt
38 | pip-delete-this-directory.txt
39 |
40 | # Unit test / coverage reports
41 | htmlcov/
42 | .tox/
43 | .coverage
44 | .coverage.*
45 | .cache
46 | nosetests.xml
47 | coverage.xml
48 | *.cover
49 | .hypothesis/
50 | .pytest_cache/
51 |
52 | # Translations
53 | *.mo
54 | *.pot
55 |
56 | # Django stuff:
57 | *.log
58 | local_settings.py
59 | db.sqlite3
60 |
61 | # Flask stuff:
62 | instance/
63 | .webassets-cache
64 |
65 | # Scrapy stuff:
66 | .scrapy
67 |
68 | # Sphinx documentation
69 | docs/_build/
70 |
71 | # PyBuilder
72 | target/
73 |
74 | # Jupyter Notebook
75 | .ipynb_checkpoints
76 |
77 | # pyenv
78 | .python-version
79 |
80 | # celery beat schedule file
81 | celerybeat-schedule
82 |
83 | # SageMath parsed files
84 | *.sage.py
85 |
86 | # Environments
87 | .env
88 | .venv
89 | env/
90 | venv/
91 | ENV/
92 | env.bak/
93 | venv.bak/
94 |
95 | # Spyder project settings
96 | .spyderproject
97 | .spyproject
98 |
99 | # Rope project settings
100 | .ropeproject
101 |
102 | # mkdocs documentation
103 | /site
104 |
105 | # mypy
106 | .mypy_cache/
107 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2018 Chengbin HOU
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Notice
2 |
3 | ### For future updates of the ABRW method, please see https://github.com/houchengbin/OpenANE
4 |
5 |
6 | # ABRW: Attributed Biased Random Walks
7 | ABRW is an Attributed Network/Graph Embedding (ANE) method: it takes network structural information and node attribute information as input, and produces a unified low-dimensional embedding for each node in the network as output. The node embeddings can then be fed into various downstream tasks, e.g. node classification and link prediction. Off-the-shelf machine learning techniques and distance/similarity metrics can be applied directly in these downstream tasks, since the resulting node embeddings are simply low-dimensional points in Euclidean space.
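   |
   | As a quick illustration of such downstream use, here is a minimal sketch with toy data: any standard classifier can consume the embeddings directly (the framework's own evaluation utilities live in `src/libnrl/classify.py`; the toy embeddings and labels below are only illustrative).
   | ```python
   | from sklearn.linear_model import LogisticRegression
   | from sklearn.model_selection import train_test_split
   |
   | # toy stand-ins for the learned embeddings (node id -> low-dim vector) and node labels
   | emb = {0: [0.1, 0.9], 1: [0.2, 0.8], 2: [0.9, 0.1], 3: [0.8, 0.2]}
   | lab = {0: 'A', 1: 'A', 2: 'B', 3: 'B'}
   |
   | nodes = sorted(emb)            # fixed node order
   | X = [emb[n] for n in nodes]    # embedding matrix
   | y = [lab[n] for n in nodes]    # label vector
   |
   | X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, stratify=y, random_state=0)
   | clf = LogisticRegression().fit(X_tr, y_tr)
   | print('node classification accuracy:', clf.score(X_te, y_te))
   | ```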
8 |
9 | by Chengbin HOU 2018 chengbin.hou10(AT)foxmail.com
10 |
11 | For more details, please see our paper https://arxiv.org/abs/1811.11728. If you find ABRW or this framework useful for your research, please consider citing it.
12 | ```
13 | @article{hou2020RoSANE,
14 | title={RoSANE: Robust and Scalable Attributed Network Embedding for Sparse Networks},
15 | author={Hou, Chengbin and He, Shan and Tang, Ke},
16 | journal={Neurocomputing},
17 | year={2020},
18 | publisher={Elsevier},
19 | url={https://doi.org/10.1016/j.neucom.2020.05.080},
20 | doi={10.1016/j.neucom.2020.05.080},
21 | }
22 | ```
23 |
24 | # Usages
25 | ### Install necessary packages
26 | ```bash
27 | pip install -r requirements.txt
28 | ```
29 | Note: Python 3.6 or above is required because the code uses [f-strings](https://docs.python.org/3.6/reference/lexical_analysis.html#f-strings)
30 | ### Run ABRW method with default parameters
31 | ```bash
32 | python src/main.py --method abrw
33 | ```
34 | ### Try other methods
35 | abrw; aane; tadw; attrpure; attrcomb; deepwalk; node2vec
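   |
   | For example, to run one of the listed methods other than the default:
   | ```bash
   | python src/main.py --method deepwalk
   | ```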
36 | ### Change parameters in each method
37 | Please see main.py
38 |
39 | ## Datasets
40 | ### Default
41 | Cora (a citation network)
42 | ### Your own dataset?
43 | #### FILE for structural information (each row):
44 | adjlist: node_id1 node_id2 node_id3 -> the edges (id1, id2) and (id1, id3), i.e. a node followed by all of its neighbors
45 |
46 | OR edgelist: node_id1 node_id2 weight(optional) -> one edge (id1, id2)
47 | #### FILE for attribute information (each row):
48 | node_id1 attr1 attr2 ... attrM
49 |
50 | #### FILE for label (each row):
51 | node_id1 label(s)
52 |
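   | The snippet below is a minimal sketch of writing the three files in these formats for a toy graph (the file names and example data are only illustrative; the actual loading is handled inside the framework, e.g. `src/libnrl/graph.py`):
   | ```python
   | # adjacency list: "node_id neighbor1 neighbor2 ..."
   | adj = {0: [1, 2], 1: [0], 2: [0, 1]}
   | with open('my_adjlist.txt', 'w') as f:
   |     for node, neighbors in adj.items():
   |         f.write(' '.join(str(i) for i in [node] + neighbors) + '\n')
   |
   | # node attributes: "node_id attr1 attr2 ... attrM"
   | attr = {0: [0.1, 0.0, 1.0], 1: [0.0, 1.0, 0.0], 2: [1.0, 0.5, 0.0]}
   | with open('my_attr.txt', 'w') as f:
   |     for node, vec in attr.items():
   |         f.write(str(node) + ' ' + ' '.join(str(v) for v in vec) + '\n')
   |
   | # node labels: "node_id label"
   | label = {0: 'A', 1: 'B', 2: 'A'}
   | with open('my_label.txt', 'w') as f:
   |     for node, lab in label.items():
   |         f.write(f'{node} {lab}\n')
   | ```
   |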
53 | ## Acknowledgement
54 | Thanks to Zeyu DONG for helpful discussions, and to [the excellent OpenNE project](https://github.com/thunlp/OpenNE), which gave us a good starting point for this work.
55 |
--------------------------------------------------------------------------------
/data/cora/cora_adjlist.txt:
--------------------------------------------------------------------------------
1 | 0 633 1862 2582
2 | 1 2 652 654
3 | 2 332 1454 1666 1986
4 | 3 2544
5 | 4 1016 1256 1761 2175 2176
6 | 5 1629 1659 2546
7 | 6 373 1042 1416 1602
8 | 7 208
9 | 8 269 281 1996
10 | 9 723 2614
11 | 10 476 2545
12 | 11 1655 1839
13 | 12 1001 1318 2661 2662
14 | 13 1701 1810
15 | 14 158 2034 2075 2077 2668
16 | 15 1090 1093 1271 2367
17 | 16 970 1632 2444 2642
18 | 17 24 927 1315 1316 2140
19 | 18 139 1560 1786 2082 2145
20 | 19 1939
21 | 20 1072 2269 2270 2374 2375
22 | 21 1043 2310
23 | 22 39 1234 1702 1703 2238
24 | 23 2159
25 | 24 201 598 1636 1701 2139 2141
26 | 25 1301 1344 2011 2317
27 | 26 99 122 123 2454 2455
28 | 27 606 1810 2360 2578
29 | 28 1687
30 | 29 963 2645
31 | 30 697 738 1358 1416 2162 2343
32 | 31 1594
33 | 32 279 518 1850 1973
34 | 33 286 588 698 911 1051 2040 2119 2120 2121
35 | 34 1358
36 | 35 895 1296 1913
37 | 36 1146 1505 1552 1640 1781 2094 2106 2107
38 | 37 60 1190 2427
39 | 38 429 862 863 1160
40 | 39 1349 1522 1532 1634 1965 2357
41 | 40 507 866 1364
42 | 41 175 596 644 1914
43 | 42 87 1372
44 | 43 152 963 1530 1653 2399 2400
45 | 44 1582 2624 2701
46 | 45 733 1219 1986 2303 2667 2668
47 | 46 1604 2366
48 | 47 163 1579
49 | 48 598 714 1031 1662 1666 2041 2205 2206 2471
50 | 49 1666 2034
51 | 50 1441
52 | 51 457 710 1392 2213 2214 2215
53 | 52 1139 1467 2053 2172 2182
54 | 53 1103 1358 1739
55 | 54 401 767
56 | 55 60 210 323 651 771 787 815 1079 1156 1983 2020 2021
57 | 56 412 447 1616 1849
58 | 57 2418
59 | 58 1715
60 | 59 105 580 609 615 1067 1287 1358 1627 1725 2651
61 | 60 1527
62 | 61 1080 1309 1416 2162 2312
63 | 62 485
64 | 63 1322
65 | 64 1209 1737 2616
66 | 65 239 543 619 771 1156 1293 1628 2021 2418 2419
67 | 66 2631
68 | 67 282 540 2628
69 | 68 391 1358 1986
70 | 69 604 1013 1351 1914 1920 1926 2189
71 | 70 441 2184
72 | 71 206 1986 2691
73 | 72 1103 1358
74 | 73 449 558 797 876 1035 1136 1189 1214 1358 1723 1745 1751
75 | 74 544 586 1042 1118 1416 1517 2052 2155 2419
76 | 75 84 583 2222 2223 2224 2225
77 | 76 88 2018 2178
78 | 77 659 1803
79 | 78 1219 1329 1418
80 | 79 603 2097
81 | 80 257 1117 2049
82 | 81 347 423 527 2180
83 | 82 1138 1634
84 | 83 1520 2581
85 | 84 284 2223 2224 2226
86 | 85 1065 2487 2488
87 | 86 429 1336 2034 2295
88 | 87 842 2164
89 | 88 130 162 300 415 498 696 737 743 815 841 851 1174 1288 1309 1394 1494 1527 1658 1677 1713 1732 1741 1847 1882 1908 2010 2011 2012 2013 2014 2015 2016 2017 2178 2394
90 | 89 258 884 1087 1094 1157 1401 1530 1585 1653 2463
91 | 90 155 156 817 1358
92 | 91 330 1046 2001 2122 2123 2380
93 | 92 898 1836
94 | 93 550 950 1495 2151
95 | 94 195 586 675 733 934 1649 1966 2263 2355 2357 2490
96 | 95 334 456 693 734 736 861 1303 1535 1580 1602 1628 1838 2054 2074 2181 2182 2183 2197 2199 2200 2201
97 | 96 2217
98 | 97 661 1353
99 | 98 1521
100 | 99 122 123 2454 2455 2604
101 | 100 1602 2056
102 | 101 281 1358 1589
103 | 102 109 1251 1448 1561 1623 1871 1878 2256
104 | 103 139 306 484 608 1775 1790
105 | 104 401 864 1065 2210
106 | 105 1721 2476 2651
107 | 106 2461
108 | 107 541 971 1113 1650
109 | 108 1647 2157 2209
110 | 109 124 133 153 176 236 289 306 318 426 459 519 563 610 1045 1337 1346 1624 1661 1769 1772 1779 1785 1787 1789 1805 1998 2045 2092 2093 2094 2095
111 | 110 567 1161 1262 1599 2279
112 | 111 758 1169 1358 1762 2492 2643
113 | 112 306 487 1623 2080
114 | 113 540 747 1884
115 | 114 610 2288 2506
116 | 115 973
117 | 116 1076 1501
118 | 117 259 2537
119 | 118 255 388 446 454 554 581 842 1029 1343 1507 1538 1616 1690 1851 2010 2030 2112 2165 2166
120 | 119 379 646 1537 1549 1901 1959
121 | 120 483 514 816 1842 2405
122 | 121 589 802 980 1158 1910 2251 2252
123 | 122 2454 2455
124 | 123 2455 2604
125 | 124 306 1367 1622 2478
126 | 125 458 2363
127 | 126 236 306 1551 1623 2045 2079
128 | 127 2604
129 | 128 233 370 392 2270
130 | 129 441 701
131 | 130 1982 2017
132 | 131 834 2169
133 | 132 379 479 904 1022 1959
134 | 133 399 452 1623 1670 1777 1784 2045 2082
135 | 134 406
136 | 135 137 2095 2144
137 | 136 831
138 | 137 2144 2329 2504
139 | 138 236 306 1776
140 | 139 306 660 910 1623 1780 2045
141 | 140 582 623
142 | 141 740 1002 2034
143 | 142 456 525 1628 2049 2181
144 | 143 316 598 1701
145 | 144 145 213 1593 2192
146 | 145 213 537 1165 1593 2622
147 | 146 897 1468 1907 1927 2059
148 | 147 1519 1976
149 | 148 378 381 602 714 2518
150 | 149 1080 1416 1655 2025
151 | 150 310 914 1942
152 | 151 1986 2236 2238 2335
153 | 152 884 2238 2240
154 | 153 236 306 459 1623 2045
155 | 154 326 364 819 1238 1339 1358 1444 1568 1752 1753 1754 2525
156 | 155 156 817 1358 1689 1736 1763
157 | 156 817 1358 1689 1763
158 | 157 598 1701 1870 2493
159 | 158 180 1701 2034
160 | 159 180 2308
161 | 160 277 553 743 745 1986 2000 2009
162 | 161 842 2010
163 | 162 323 1174 1268 1667
164 | 163
165 | 164 210 211
166 | 165 598 1473 2706 2707
167 | 166 271 2617
168 | 167 168 1056 2437 2482
169 | 168 2437 2438
170 | 169 1994 2706
171 | 170 1067 1358
172 | 171 775 790 1548
173 | 172 240 512 756 1692 2187
174 | 173 687 1033 1358 1586
175 | 174 629 1358 1742 2334
176 | 175 596 955 1914 2135 2217 2388
177 | 176 231 973
178 | 177 726 1639 2679 2682
179 | 178 833
180 | 179 197 231 1986
181 | 180 775 790 791 1020 1807 2004 2037 2076
182 | 181 1013 1359 1464 1683
183 | 182 183 997
184 | 183 997 1837
185 | 184 520
186 | 185 297 2003
187 | 186 1228 1319 1536 1672 2580 2702
188 | 187 1208
189 | 188 1169 1727
190 | 189 564 1072 1262 1509
191 | 190 491 2135
192 | 191 382 608 1677 1791 1998 2385
193 | 192 2472
194 | 193 441 458
195 | 194 473 1050 1451 2564 2646
196 | 195 2424
197 | 196 429 442 2122
198 | 197 231 2412
199 | 198 1255
200 | 199 420 1869
201 | 200 1439 2676
202 | 201 297 570 598 1701 1986 2430
203 | 202 353 1173 1250 1345
204 | 203 854 857 963 1115 1869
205 | 204 415
206 | 205 1701 1869
207 | 206 2691
208 | 207 833
209 | 208
210 | 209 2681
211 | 210 1079 1614 1626 1671 1905 1906 1907 2309
212 | 211 356 1394 1908 1909
213 | 212 1290
214 | 213 1593 2192
215 | 214 2194
216 | 215 1701 1986 2045
217 | 216 1542 1701
218 | 217 243 2473 2474
219 | 218 482 733 781 792 1020 1348 1382 2076 2091 2119 2594
220 | 219 507 1413 1542
221 | 220 376 817 913
222 | 221 817
223 | 222 821
224 | 223 744 1154 1761
225 | 224 1810 2034
226 | 225 2255
227 | 226 1701
228 | 227 534 883
229 | 228 322 848 1358 1721 1871
230 | 229 1053 1405 1894 2118
231 | 230 549 577 1095 1217 1810
232 | 231 232 387 551 1701 1868 2217
233 | 232 869 1701 1864 1986
234 | 233
235 | 234 610 2591
236 | 235 1701
237 | 236 306 1072 1572 2045 2046 2078 2079
238 | 237 2539
239 | 238 1828
240 | 239 619 887 910 1069 1220 1274 1376 1759 1909 2021 2182 2418
241 | 240 512 756 1472 1692
242 | 241 1441
243 | 242 279 838 1131 2280
244 | 243 473 2474 2647 2648
245 | 244 565 1072 1358 1610
246 | 245 782 1162
247 | 246 936 1358 1687 2515
248 | 247 2583
249 | 248 1264 1531 1964
250 | 249 621 1884 1978
251 | 250 2429
252 | 251 253 507 812 1300 1413 1542 1933
253 | 252 711 973 1973 2485
254 | 253 507 1413
255 | 254 507 1211
256 | 255 1266 1343 2112
257 | 256 756
258 | 257 964 1388 2180
259 | 258 963 1094 1153 1240 1401 2645
260 | 259 2537
261 | 260 365
262 | 261 873 1479 1701 1894
263 | 262 1351 1464
264 | 263 364 732 1103
265 | 264 472 844 1826 1902
266 | 265 1441
267 | 266 578 809 2227 2229
268 | 267 670 2000 2373
269 | 268 1740 2451
270 | 269 321 418 2543 2551
271 | 270 279 838 1195 1215 2280
272 | 271 638 1970 2387 2498
273 | 272 2012
274 | 273 374 846
275 | 274 748 749
276 | 275 805 1967
277 | 276 1463 1695 2133
278 | 277 553 696 973 1869
279 | 278 1185
280 | 279 304 502 666 838 1195 2165 2280 2344 2423
281 | 280 1535 2155
282 | 281 746 1000 1347 1382 2244 2247
283 | 282 747
284 | 283 1559 2250
285 | 284 583 2222 2223 2224 2225 2226
286 | 285 502 1999
287 | 286 442 698
288 | 287 2705
289 | 288 618
290 | 289 1358 1722
291 | 290 328 567 687 1765 2359
292 | 291 1692
293 | 292 2562
294 | 293 2038
295 | 294 306 2047
296 | 295 1176 1177
297 | 296 732 2685
298 | 297 598 899 1542
299 | 298 607 1095 1325
300 | 299 1484 2383 2386 2387 2498
301 | 300 415 634
302 | 301 1169 2492
303 | 302 306 350 719 1158 1253
304 | 303 719 1670 2143
305 | 304 1172 1857 2423
306 | 305 1416 1927
307 | 306 308 329 350 417 426 452 476 487 519 542 554 573 581 608 655 656 719 887 910 958 973 1009 1045 1072 1158 1193 1245 1251 1346 1367 1483 1490 1551 1572 1584 1640 1651 1656 1705 1770 1771 1772 1775 1779 1781 1782 1787 1797 1798 1799 1802 1804 1805 1856 2045 2046 2048 2078 2080 2084 2085 2086 2087 2088 2089 2090 2091
308 | 307 991
309 | 308 350 1253 1670 2045
310 | 309 383 2268 2305
311 | 310 352 875 892 990 1241 1272 1331 1581 1815 1944 1945 1946 1947 1948 1949 1950
312 | 311 343 690 1132 1865
313 | 312 1610 1722 1740
314 | 313 2296
315 | 314 1623 2000
316 | 315 408 423 1602 1921 2232
317 | 316 598 766 845 920 1073 1297 1821 1868 2004
318 | 317 1624 1785
319 | 318 563 1701 2045
320 | 319 1205
321 | 320 348 1597
322 | 321 1083 1450
323 | 322
324 | 323 498 2020 2022
325 | 324 827 2156 2194
326 | 325 787 1848
327 | 326 1358 1754
328 | 327 928 1325 1996 2063 2064
329 | 328 951 1250
330 | 329 1798 2598
331 | 330 1505 2123
332 | 331 1527 2480
333 | 332 665 2003 2122 2615
334 | 333 1358 1694
335 | 334 693 1013
336 | 335 1701 1986
337 | 336 1944 1950
338 | 337 1214 1358 1499 1776
339 | 338 1413 1542 1934
340 | 339 813
341 | 340 584 2463
342 | 341 613 1358 1389 1546 2045
343 | 342 1072 1512
344 | 343 1701
345 | 344 389 441 661 854 2323
346 | 345 833
347 | 346 999 1358
348 | 347 408 423 525
349 | 348 1643 2248 2250
350 | 349 419 465 2427
351 | 350 406 656 1623 1640 1775 1778 1782 2084 2085 2089 2090
352 | 351 1810
353 | 352 1294 1944
354 | 353 1173 1345 2507
355 | 354 659 1803 2453
356 | 355 1668
357 | 356 498 1259 2018 2116
358 | 357 657 1154 1358 2652
359 | 358 2642
360 | 359 389 1483 1620
361 | 360 1409 1824
362 | 361 2681
363 | 362 1966
364 | 363 1003
365 | 364 732 819 970 1306 1358 1606 1632 1756 2642
366 | 365 2469
367 | 366 553 745 1583 1986
368 | 367 2083 2451
369 | 368 428 996 1194 1366 1545 2249 2393
370 | 369 385 2484
371 | 370 1408 1414 1415 2269
372 | 371 720 1354 1414 1415 1441 1553 2375
373 | 372 626 1710 1834 2377
374 | 373 1025 1042
375 | 374 413 846 1101 2146 2147 2149
376 | 375 963 2401 2434 2435
377 | 376 672 1583
378 | 377 678 733 1265 2291
379 | 378 602
380 | 379 904 1022 1898 1959
381 | 380 477 930 2569
382 | 381 2518 2538
383 | 382 1203 2034
384 | 383 733 1704 2268
385 | 384 1623 1624
386 | 385 2483 2484
387 | 386 2114 2115 2117
388 | 387 519 2191
389 | 388 1899 2497
390 | 389 704 905 1077 1483 1743 2450
391 | 390 1108
392 | 391 493 1986 2365
393 | 392 720 1415 1619 1825 2269
394 | 393 739
395 | 394 577 849 1063 1064 2351
396 | 395 640 1147 2367
397 | 396 1217 1988
398 | 397 665 879 1096 2329
399 | 398 730 2177
400 | 399 544 1089 1769 2107
401 | 400 1354
402 | 401 733 1065 1096 1160
403 | 402 507 776 1413 1542 1932 1936 1940
404 | 403 464 806 1037 1383 1742 2503 2654
405 | 404 1170 1476
406 | 405 738 1275 1927 2059
407 | 406 505 2505
408 | 407 409 695 1681 1682
409 | 408 525 734 1841 2056
410 | 409 1624 1681 2084
411 | 410 676 1179
412 | 411 1096 1488 1577 2609
413 | 412 1616 1849
414 | 413
415 | 414 711
416 | 415 525 593 737 818 851 966 1174 1309 1527 1644 1677 1843 1847 1850 1908 2014 2178 2182
417 | 416 519 1701
418 | 417 1072 1871 2034 2494
419 | 418
420 | 419
421 | 420 1869
422 | 421 2213 2216
423 | 422 545
424 | 423 525 527
425 | 424 436 1203
426 | 425 2466
427 | 426 487 1772 1777 1782 1789 1805
428 | 427 565 1528
429 | 428 996 1257 2248 2671
430 | 429 523 705 794 863 1493 1618 1669 1807 1889 2001 2041 2043 2044
431 | 430 2130
432 | 431 2694 2695
433 | 432 1442 1597
434 | 433 436 756
435 | 434 1117 1524 1693 2133 2408
436 | 435 486 733 1912
437 | 436 483 498 668 816 878 885 1039 1131 1332 1494 1977 1978 1979 1980 1981 1982
438 | 437 498 718 950 1495 1985
439 | 438 718 787 1667 1983 1984 1985
440 | 439 1169 1642
441 | 440 441 725 895 1296 1913 2179
442 | 441 454 671 1455 1537 2094 2184
443 | 442 588 627 698 1051 2120 2362
444 | 443 1010 1264 1531 1966
445 | 444 488 515 582 623 828 2289 2290 2464
446 | 445 909 1434
447 | 446 903 1507
448 | 447 1616 1849 2638
449 | 448 472 1824 1826 1902
450 | 449 1280 1834 2581
451 | 450 1569 2538
452 | 451 671 935 2285
453 | 452 1258 1623
454 | 453 1806 2305 2306
455 | 454 581 973 1441 1661 1917 2022 2188 2189 2190
456 | 455 1664 2147 2486
457 | 456 525 544 667 1580 2054 2199 2347 2472
458 | 457 1201 2215 2216
459 | 458 1122 1276 1555 1953
460 | 459 1623 1681
461 | 460 1986 1989 2532
462 | 461 1203 2309
463 | 462 1048
464 | 463 1702 1703 1964 1966
465 | 464 1758 2444 2445
466 | 465 896 1403 1869 2304
467 | 466 970 1305 1358 1738 2546
468 | 467 1172 1701 1708
469 | 468 647 1311 1313
470 | 469 620 2304 2412
471 | 470 652 854 857 1097 1115
472 | 471 1013 2013 2394
473 | 472 559 844 1182 1183 1825 2029
474 | 473 1575 2647
475 | 474 1181
476 | 475 2219
477 | 476 1140 1800 1986
478 | 477 930 2569
479 | 478 514 1978 2024
480 | 479 1880
481 | 480 598 687 1426 1710 2359 2372
482 | 481 823 1287 1810 2291
483 | 482 790 1807 1810 1812 1814 2230 2307 2308
484 | 483 816 2405
485 | 484 542 1367 1699 1775 2046 2081
486 | 485 839 1042 2219
487 | 486 2077 2667 2668
488 | 487 655 859 1705 1772 1773 1782 1798 2026 2080
489 | 488 2547
490 | 489 1358 2390 2391
491 | 490 826 1131 1309 1463 1879 1973 1981 2099 2105
492 | 491
493 | 492 688 956 1257 1533 2538 2673 2678
494 | 493 1244 1759
495 | 494 2614
496 | 495 1327 1328 2192
497 | 496 596 711 992
498 | 497 2419
499 | 498 1652
500 | 499 2457 2458
501 | 500 2241
502 | 501 1358 1765
503 | 502 1976
504 | 503 576 1692 2001
505 | 504 1973 2394 2396
506 | 505 573 843 1180 1251 1258 1304 1448 1560 1779 2083
507 | 506 1013 1352 1914 1918
508 | 507 812 1176 1211 1300 1363 1413 1931 1933 1936 1940 1941 2508
509 | 508 2325
510 | 509 1101 2146 2148
511 | 510 1013 1131 1171
512 | 511 549 1810
513 | 512 756 1472 1692
514 | 513 565 657 1528
515 | 514 1314 1882 1978
516 | 515 623 1102 2464 2465 2547
517 | 516 1882 2104
518 | 517 952 2105
519 | 518 666 1373 2423
520 | 519 598 836 1346 1574 1670 1986 1998 2045
521 | 520
522 | 521 598
523 | 522 2097
524 | 523 949 2406
525 | 524 552 979 1169 1358 2476
526 | 525 1540 1628 2054 2057 2172 2180 2181 2182 2183
527 | 526 565 1007 1216 1742
528 | 527
529 | 528 803
530 | 529 1417 1419 2260
531 | 530 1013 1505 2094
532 | 531 606 2308
533 | 532 1701 1863
534 | 533 1742 2466
535 | 534 609 629 933
536 | 535 2392 2641
537 | 536 567 1894
538 | 537 1165 1593 1698 2192
539 | 538 1286
540 | 539 711 1616 2155
541 | 540 747 1441 2527
542 | 541 647 904 1022 1360 1896
543 | 542 1245 1623
544 | 543 984
545 | 544 1580 1769 2052 2199 2347
546 | 545
547 | 546 1608 2127
548 | 547 598 1701
549 | 548 809 843 1523 1937 2138
550 | 549 1497 1810
551 | 550
552 | 551 2388
553 | 552 1169 1358 2201
554 | 553 745 1127 1583 1995 2009
555 | 554 685
556 | 555 1463
557 | 556 1623 1805
558 | 557 1725
559 | 558 876 1189 1706 2585
560 | 559 1182 1579
561 | 560 774 2526
562 | 561 1016
563 | 562 704 1483 1735 2450
564 | 563 1337 1701 2045
565 | 564 1262 2476
566 | 565 687 1038 1361 1398 1528 1610 1723 2279
567 | 566 1907 1986
568 | 567 631 760 1262 1513
569 | 568 1116 1223 1810
570 | 569 1759 2512
571 | 570 2430
572 | 571 696 1570
573 | 572 1416 2025 2418
574 | 573 643 1313 2407
575 | 574 576 1692 2001
576 | 575 2283
577 | 576 665 1810 2348
578 | 577 849 870 906 1063 1129 1518 2351 2352 2353 2354
579 | 578 1105 1879 1974
580 | 579 994 1134 1808
581 | 580 744 1358 1742 1758 1765 2444 2445
582 | 581 1616 1623 1792
583 | 582 2548 2549
584 | 583 2224 2225
585 | 584 623 2289 2463 2464
586 | 585 2526
587 | 586 1218 1224 2155
588 | 587 1032
589 | 588 698 2383
590 | 589 980 1158 1624
591 | 590 1441 1954
592 | 591 642 1017 1078 1423 2472
593 | 592 2669
594 | 593 1115 1474
595 | 594 663 2195
596 | 595 1016 1256 2176
597 | 596 644 992 1914
598 | 597 1104 1628
599 | 598 637 766 845 869 968 1003 1100 1107 1297 1299 1301 1473 1573 1636 1821 1823 1864 1870 1875 2138 2707
600 | 599 1093 1271
601 | 600 2394
602 | 601 2229
603 | 602
604 | 603 716 795 1248 1821 1873 2045
605 | 604 1351 1652 1914 2109
606 | 605 1107 2534
607 | 606 1666 2230 2360 2361
608 | 607 1213 1664
609 | 608 1623 2080
610 | 609 883 1041 1358 2446 2447 2448
611 | 610 826 2287 2288 2481
612 | 611 2690
613 | 612 2235
614 | 613 1358
615 | 614 888 1475 2470
616 | 615 2443
617 | 616 1358 1739 2365 2503
618 | 617 705 2034
619 | 618 938
620 | 619
621 | 620 923 2534
622 | 621 1332 1978 2018
623 | 622 886 1002 1951 1952
624 | 623 2289 2464
625 | 624 1042 1232
626 | 625 1024
627 | 626 1358 2556 2558
628 | 627 2122 2238 2335
629 | 628 1169 1725 1764 2546
630 | 629 883
631 | 630 973
632 | 631 2358
633 | 632 1430
634 | 633 1701 1866
635 | 634 2227
636 | 635 2058
637 | 636 2233
638 | 637 1641
639 | 638 783 808 893 924 1773 2387
640 | 639 900
641 | 640 765 894 1090 1598 2420
642 | 641 2704
643 | 642 2472
644 | 643 1167 1538 2407
645 | 644 711 728 1258 2177
646 | 645 891 950 1267 1358 1495 1824 1826 1895 1898 1899 1900 1901
647 | 646
648 | 647 904 1267 1360
649 | 648 1441
650 | 649 1171 1457
651 | 650 819 1250 1486 1557 1685 1756 2083
652 | 651 885 1667 1984 2023 2024 2156
653 | 652
654 | 653 1231
655 | 654
656 | 655 2080
657 | 656 2090
658 | 657 867 871 1229 1254 1729 1740 2440 2442 2522 2523
659 | 658 2147
660 | 659 1803
661 | 660 1623
662 | 661 807 830 935 1028 1045 1353 1878 1879 1880 1881 1882 1883 1884 1885
663 | 662 932
664 | 663 2195
665 | 664 1701
666 | 665 2120 2122
667 | 666 1850 1973
668 | 667 2364 2472
669 | 668
670 | 669 729 1086 1124
671 | 670 2606
672 | 671 973 2025
673 | 672 2456
674 | 673 1907 1986
675 | 674 1907
676 | 675 2356
677 | 676 1955 2235
678 | 677 954
679 | 678 733
680 | 679 1395
681 | 680 2199
682 | 681 1171 1445 1446 1986
683 | 682 973 1224
684 | 683 834 2169
685 | 684 1282 1358
686 | 685
687 | 686 1358 2366
688 | 687 1033 1171 1358 1546 1586 1610 1710 1721 1725 2359 2372
689 | 688 1257
690 | 689 1169 1292 1317 1358 1721
691 | 690
692 | 691 2034 2130
693 | 692 2629
694 | 693 763 1851 2108
695 | 694 1478 2516
696 | 695 948 1012 1336 1681 1682 1790 1998 2319
697 | 696 1570
698 | 697 1080
699 | 698 1015
700 | 699 1701 2045 2327
701 | 700 1691
702 | 701 735 739 2062
703 | 702 777 779 822 993 1368 1637 2069 2070 2101
704 | 703 801 1961 2497
705 | 704 969 1412 1483 1743 2113 2450
706 | 705
707 | 706 805 963 1240 2238 2645
708 | 707 763
709 | 708 836 1358 2212 2216
710 | 709 1896 1903
711 | 710 1392 2212 2213
712 | 711 1285 2111 2112 2113
713 | 712 850 1464 2480
714 | 713 1044
715 | 714 1385
716 | 715 2291
717 | 716 1810 1821 2409
718 | 717 1907
719 | 718 1416 2059 2261
720 | 719 1623 1810
721 | 720 1441
722 | 721 1034
723 | 722 1273 1567 1874 2500 2501
724 | 723 2614
725 | 724 1344 2394 2395
726 | 725 1104 1137 1652 2071
727 | 726 1533 2671
728 | 727 1386 2570
729 | 728 1070 1296 2388 2553
730 | 729 1159 1221 1701 2534
731 | 730
732 | 731 901 1279 2199 2203
733 | 732 1756 2685
734 | 733 759 794 862 1062 1160 1192 1265 1294 1329 1682 1817 1818 2008 2011 2035 2268 2291 2301 2302 2303 2304
735 | 734 736 751 964 1006 1388 1841 2056 2197 2202 2277
736 | 735 739 1237 1543 1881 1958 2060 2061 2062
737 | 736 751 964 1006 1388 1631 1841 2197 2202
738 | 737 1269 1630 1974 2136 2178
739 | 738 1080 1927 2162
740 | 739 1237 1543 1881 1958
741 | 740 1002 2033
742 | 741 831 1011 1088 1282 2093
743 | 742 1483 1620 1743
744 | 743 745 1239 1570 1986
745 | 744 1154 1358
746 | 745 1127 1986 1995 2009
747 | 746 1303 2006
748 | 747 1538
749 | 748 1103 1145 1169 1358 1374 1550 1568 1617 1732 2553
750 | 749 1333 2155
751 | 750 1441
752 | 751 1204 2198 2202
753 | 752 957 2083 2451
754 | 753 1269 1880
755 | 754 1358 1483 2450
756 | 755 1277 1898 2071
757 | 756 885 1469 1472 1692 2186 2187
758 | 757 1284 1358 2421
759 | 758 1358 1762 2492
760 | 759 1703
761 | 760 985 1426 1513 2459 2460
762 | 761 1401 1443
763 | 762 859 2448
764 | 763 1635
765 | 764 796 1358 1606 1612
766 | 765 1147 2367
767 | 766 845 1297 1868
768 | 767 873 1701 2314
769 | 768 1566 1723 1724
770 | 769 1927
771 | 770 1894 2117
772 | 771 1080 1156
773 | 772 908
774 | 773 1072 1505
775 | 774
776 | 775 2075
777 | 776 1413 1542
778 | 777
779 | 778 779 1224 1370 1914 1919
780 | 779 1224 1370 1592 1914 1919 1973
781 | 780 2341
782 | 781 2402
783 | 782
784 | 783 893 1061 1143 2376
785 | 784 960 1701
786 | 785 1664
787 | 786 947
788 | 787
789 | 788 1031 1952 2033 2041 2205
790 | 789 1240 1703
791 | 790 1548 1810
792 | 791 2034 2130
793 | 792 1488 1558 1986 2555
794 | 793 1896 1902 1903 2592
795 | 794 863 1160 1590
796 | 795 1645 1810
797 | 796 1103 1317 1358
798 | 797 800 1358 1833 2359
799 | 798 1381 1793 1794 1795
800 | 799 961
801 | 800 820 889 1834
802 | 801 1470 1961 2516
803 | 802 1307
804 | 803 2100
805 | 804 1307
806 | 805 963 2236
807 | 806 1383
808 | 807 1226 1955 1956
809 | 808 924 1785 2383
810 | 809 1115 2228 2229
811 | 810 1068
812 | 811 855 900
813 | 812 1300 1413 1542
814 | 813 2376
815 | 814 893 1158 2382 2383
816 | 815 1174 1344 2017 2018
817 | 816 1842 1980 2282 2405
818 | 817 1736 1763 2456
819 | 818 1999
820 | 819 1358 1606
821 | 820 889
822 | 821
823 | 822 1973
824 | 823 1169 1358 1765 2221
825 | 824 1831
826 | 825 1765
827 | 826 2287 2528
828 | 827 1039 1921 1999
829 | 828 1795 2464
830 | 829 1816
831 | 830 1914 2100
832 | 831 1011 1358 1638 1733 2093
833 | 832 2600
834 | 833 1130 1641 1678 2036 2495 2595
835 | 834 876 2168
836 | 835 1121 1134 1654 1810
837 | 836 1013 1131 1343 1572 2078 2403
838 | 837 1422 2248 2670
839 | 838 868 1131 2280
840 | 839 965 2277
841 | 840 1903 2481
842 | 841 2034
843 | 842 2016
844 | 843 2339 2542
845 | 844 1902
846 | 845
847 | 846 2146
848 | 847 2189 2422
849 | 848 985 1373
850 | 849 1993 2351
851 | 850 1464 2278
852 | 851 1309 2018
853 | 852 1800
854 | 853 1169 1358 1762
855 | 854 857 935 1097 1884
856 | 855 2493
857 | 856 1390 2248
858 | 857 1097 1246
859 | 858 2182 2502
860 | 859 1030 1199 2096
861 | 860 1427 1527 1758
862 | 861 1397 2068 2330
863 | 862
864 | 863 1160 1590
865 | 864 1065 2210
866 | 865 2653
867 | 866 1049
868 | 867 871 1229 1252 2439 2440 2442 2522 2523
869 | 868 988 1195 1341 1526 1844 2099 2276
870 | 869 1440 1701 1864
871 | 870 1119 2126
872 | 871 1229 2530 2652
873 | 872 1340 1413 1542
874 | 873 1358 1479 1708 2313
875 | 874 1701
876 | 875 914 1942 1945 1950
877 | 876 1358 2168 2584
878 | 877 1176 1177 1300 1413 1542 1935
879 | 878 885 2326
880 | 879 1558
881 | 880 973 1013
882 | 881 1155 1199
883 | 882 1358
884 | 883 933
885 | 884
886 | 885 1039 2024 2386
887 | 886 1400
888 | 887 910 1623 1624
889 | 888 1475 1556
890 | 889 1830 1831 1832
891 | 890 1269 1314
892 | 891 1674 2608
893 | 892 1950
894 | 893 911 1051 2383
895 | 894 1845 2367
896 | 895 1187 1296 1913
897 | 896
898 | 897 1152 1156 1410 1494 1907 1983 2295
899 | 898 1835 1836 2160
900 | 899 1055 1858 1986
901 | 900
902 | 901 1042 1279 2073 2186 2199
903 | 902 1284 1358 1604
904 | 903 2030 2031
905 | 904 1312 1360 1959
906 | 905 1282 1483 1620
907 | 906 2351 2353
908 | 907 1351 1916
909 | 908 1013
910 | 909
911 | 910 1623 2085
912 | 911 1051 1534 2376
913 | 912 2130
914 | 913
915 | 914
916 | 915 1035 1740 2415 2597
917 | 916 1026
918 | 917 2639
919 | 918 1098 1542 2065
920 | 919 1358
921 | 920 1893
922 | 921 1914 2134 2293
923 | 922 1307 1406
924 | 923
925 | 924 2383
926 | 925 1222
927 | 926 1862
928 | 927 1316 2140
929 | 928 1326 2063
930 | 929 1358 2627
931 | 930 2569
932 | 931 1433
933 | 932
934 | 933 1038
935 | 934 2356
936 | 935 973 1873
937 | 936 1067 2515 2530 2641
938 | 937 1180 1304 2096
939 | 938 2027 2230
940 | 939 2173
941 | 940 1829
942 | 941 2517
943 | 942 1924 1952
944 | 943 1196
945 | 944 2138
946 | 945 1072 1512 1911 2388
947 | 946 1966 2623
948 | 947
949 | 948 1395 1681 1682
950 | 949
951 | 950
952 | 951 1358
953 | 952 2546
954 | 953 2565 2566 2567
955 | 954 1112
956 | 955 1482 1914
957 | 956 2677
958 | 957 1154 1358 1557 1685 1757 2083 2167
959 | 958 1072 1798
960 | 959 2529
961 | 960 1776 2650
962 | 961 1728 2555 2599
963 | 962 1062 1665 1806 1807 1808 1809 1810
964 | 963 1094 1141 1157 1417 1443 1703 1785 2240 2399 2400 2401 2434
965 | 964 965 1388 2198
966 | 965 1693 2197
967 | 966
968 | 967 2659
969 | 968 1603 1986
970 | 969 1483 1735 1743 2450
971 | 970 1358 2365
972 | 971 1650
973 | 972 1000 2247 2579 2580
974 | 973 1072 1142 1583 2016 2019 2153
975 | 974 1496
976 | 975 2097
977 | 976 998 1431
978 | 977 1281
979 | 978 998 1358
980 | 979 1303 1503 1839 1957 2207
981 | 980 1367 1623
982 | 981 1159 1221
983 | 982 1091 1256 1458 2175 2176
984 | 983 2130
985 | 984 1068 2265
986 | 985 1073 1149 1358
987 | 986 2697
988 | 987 1175
989 | 988
990 | 989 1507 1645 2285 2436
991 | 990 1054 1542
992 | 991
993 | 992
994 | 993 2155
995 | 994 1219 1329 2309
996 | 995 1309 1912 2385
997 | 996 1135 1257 2248 2518
998 | 997 1837
999 | 998 1103 1284 1431 2365
1000 | 999 1358
1001 | 1000 1325 2543
1002 | 1001 2662
1003 | 1002 1952 2033 2034
1004 | 1003 1373 1687 2607
1005 | 1004 1489 1810
1006 | 1005 1541
1007 | 1006 1302
1008 | 1007 1154 1742 2466
1009 | 1008 1416 1922 2059 2575
1010 | 1009 2096
1011 | 1010 1966
1012 | 1011
1013 | 1012 1336 1681 1682 1797
1014 | 1013 1120 1293 1403 1464 1521 1583 1625 1644 1661 1675 1729 1840 1841 1842 1843 1844 1845 1846 1847 1848 1849 1850 1851 2045
1015 | 1014 1558
1016 | 1015 1068 1143 1385 1425 1452 1481 1519 1618 1788 1789 2262 2263
1017 | 1016 1256 2176
1018 | 1017 1475 2472
1019 | 1018 2378
1020 | 1019 1457
1021 | 1020 2075 2077 2667
1022 | 1021 2309
1023 | 1022 1168 1276 1961 1962 1963
1024 | 1023 1701 1986 2568
1025 | 1024
1026 | 1025 1042
1027 | 1026 1303 2034 2531
1028 | 1027 2394
1029 | 1028
1030 | 1029 2166
1031 | 1030
1032 | 1031 1952
1033 | 1032
1034 | 1033 2168
1035 | 1034
1036 | 1035 1433 1486 1740 1801 1833
1037 | 1036 2562
1038 | 1037 2447 2654
1039 | 1038 1358 2503
1040 | 1039
1041 | 1040 1169 1317 1358 1765
1042 | 1041 1243 1358 1466 2390 2448
1043 | 1042 1047 1118 1125 1198 1481 1517 1628 1925 1926 2051 2052 2054 2055 2073 2198 2333
1044 | 1043 2309
1045 | 1044
1046 | 1045 1571 1773
1047 | 1046 1510 2336 2387 2587
1048 | 1047 1588 2333
1049 | 1048
1050 | 1049 1432 1894 2610
1051 | 1050 1081 1230 1320 1342 1380 1569 1634 2563 2564
1052 | 1051 2122 2383
1053 | 1052 1701
1054 | 1053 1405
1055 | 1054 1099 1432 1577 1605 1950
1056 | 1055 1577
1057 | 1056 2482
1058 | 1057 2409
1059 | 1058 1195
1060 | 1059 1600
1061 | 1060 1441
1062 | 1061 1196 2376
1063 | 1062 1192 1810
1064 | 1063 1518
1065 | 1064 2351 2352
1066 | 1065 1096 1699 2211
1067 | 1066 1574
1068 | 1067 1216 1830 2307 2448
1069 | 1068
1070 | 1069
1071 | 1070 1072 1167 1358 1470 1725 1829
1072 | 1071 1135 1257 2248 2671
1073 | 1072 1262 1358 1478 1483 1505 1725 1733 1740 1784 1797 1798 1799 1800 1801 1802 1803 1804 1805
1074 | 1073 2163
1075 | 1074 1501 1675 1676
1076 | 1075 1124 1701
1077 | 1076
1078 | 1077 1483 2113 2450
1079 | 1078
1080 | 1079 1218 1914
1081 | 1080 2162 2344
1082 | 1081 2563
1083 | 1082 1534 2386
1084 | 1083 1232 1450 2114
1085 | 1084 1213
1086 | 1085 1421 2109 2110 2394
1087 | 1086 1124 2247 2579
1088 | 1087 2434
1089 | 1088 1089
1090 | 1089 1552 2086
1091 | 1090 1093 1147 1271 1598 2367
1092 | 1091 1256 2176
1093 | 1092 1106 1171 2501
1094 | 1093 1271 1598 2367
1095 | 1094 1369 1443 1653
1096 | 1095 1325 1810 1986 2613
1097 | 1096 1633
1098 | 1097 1115 1869
1099 | 1098 1477 2066 2067
1100 | 1099 1577 1607
1101 | 1100 1701
1102 | 1101 1242 2146 2148
1103 | 1102 1207 2290 2464 2465
1104 | 1103 1123 1284 1358 1480 1520 1718 1748 1760 1886 1887 1888
1105 | 1104 2264
1106 | 1105
1107 | 1106 1874 2501
1108 | 1107 1810 2243 2379
1109 | 1108
1110 | 1109 1533 2671 2674 2675
1111 | 1110 1120 1224 2025 2295
1112 | 1111 1119 1273 2125 2126 2129
1113 | 1112
1114 | 1113 1650
1115 | 1114 1717 2560
1116 | 1115 1529 2276
1117 | 1116 1223
1118 | 1117 1628 2499
1119 | 1118 1416 2155
1120 | 1119 1235 1273 1462 1663 1697 2124 2125 2126 2127 2128 2129
1121 | 1120 1583 2025 2194
1122 | 1121 1131 1410 1655 1810 2136
1123 | 1122
1124 | 1123 1358
1125 | 1124
1126 | 1125 2332
1127 | 1126 1823
1128 | 1127 1583 1986
1129 | 1128 1441 1954 2275
1130 | 1129 1518 2351 2352 2353
1131 | 1130 2495
1132 | 1131 1133 1215 1224 1500 1538 1655 1909 1930 2155 2185 2280 2281 2282 2283
1133 | 1132
1134 | 1133 1500 1538 2185
1135 | 1134 2296
1136 | 1135 1257 2248 2518 2671 2678
1137 | 1136 1189 2359 2554
1138 | 1137
1139 | 1138 2248 2250
1140 | 1139 1628 2054 2182
1141 | 1140
1142 | 1141 2294
1143 | 1142 2326
1144 | 1143 1618 2155 2262 2376
1145 | 1144 1245 1623
1146 | 1145 1169 1358
1147 | 1146 1505 1506 2254
1148 | 1147 1598 1865 2367 2371
1149 | 1148 1317 1747 2220 2221
1150 | 1149 1358 1894 1986
1151 | 1150 1243
1152 | 1151 2570 2700
1153 | 1152 1927 2155 2416
1154 | 1153 1401
1155 | 1154 1358 1714 1715 1742 2514 2515
1156 | 1155 1229
1157 | 1156
1158 | 1157 2240 2463
1159 | 1158 1635 2251 2252 2382
1160 | 1159 2534
1161 | 1160 2268
1162 | 1161
1163 | 1162 1510
1164 | 1163 1402 2132
1165 | 1164 1287
1166 | 1165 1593
1167 | 1166 1986 2582
1168 | 1167
1169 | 1168 1955
1170 | 1169 1358 1714 1719 1720 1731 1734 1737 2220 2476 2492
1171 | 1170 1476
1172 | 1171 1295 1355 1855 1909 2359
1173 | 1172
1174 | 1173 1250
1175 | 1174 1912 2394
1176 | 1175 2359
1177 | 1176 1542 1935
1178 | 1177 1542 1935
1179 | 1178 1529 1908 1984 2350
1180 | 1179 1955
1181 | 1180 1304
1182 | 1181
1183 | 1182 1183 1579
1184 | 1183 1354 1579 1824 1825
1185 | 1184 1190 2100
1186 | 1185 1460 1992 2349 2358 2589 2590
1187 | 1186 2535 2574
1188 | 1187 1188
1189 | 1188
1190 | 1189 1358 2359
1191 | 1190 1387
1192 | 1191 1558 2241 2339
1193 | 1192 1621
1194 | 1193 1705
1195 | 1194 1366 2670
1196 | 1195
1197 | 1196 1367 1624 2300
1198 | 1197 1791 1998
1199 | 1198
1200 | 1199 1214 1566 1599 1716
1201 | 1200 1502 2207 2414
1202 | 1201 2214 2533
1203 | 1202 1428 1510 2336 2376 2524
1204 | 1203 1215 1410 1842 2155
1205 | 1204 1330 1602 1838 1839
1206 | 1205 1358 1761
1207 | 1206 1257 2671
1208 | 1207 2290 2464
1209 | 1208
1210 | 1209 1358 1737
1211 | 1210 1648
1212 | 1211
1213 | 1212 1701
1214 | 1213 1290
1215 | 1214 1499 1716 1745
1216 | 1215 1894
1217 | 1216 2446 2503
1218 | 1217
1219 | 1218 1927 2116
1220 | 1219
1221 | 1220
1222 | 1221 1577 1607 1949
1223 | 1222 1224
1224 | 1223
1225 | 1224 1291 1341 1370 1525 1526 1583 1587 1660 1732 1848 1851 1919 1975 2151 2152 2153 2154 2155 2156
1226 | 1225 1450 2579
1227 | 1226 1955 1956 2528
1228 | 1227 1269
1229 | 1228 1536 2580
1230 | 1229 1254 1308 1358 1384 1506 1627 1749 2439 2440 2441 2442
1231 | 1230 1342
1232 | 1231
1233 | 1232
1234 | 1233 2433
1235 | 1234 1702 1703 1966
1236 | 1235 1462 2126 2577
1237 | 1236 2479
1238 | 1237 1958
1239 | 1238 1358
1240 | 1239 2027 2611
1241 | 1240 1443 1702 2237
1242 | 1241 1701 1950
1243 | 1242 2146 2148
1244 | 1243 1358
1245 | 1244
1246 | 1245 1770 2080
1247 | 1246
1248 | 1247 1679
1249 | 1248 1651 2045
1250 | 1249 1483 2556 2558
1251 | 1250 2507
1252 | 1251 1448 1623
1253 | 1252 2656
1254 | 1253
1255 | 1254 1308
1256 | 1255 2248
1257 | 1256 1761 2175 2176
1258 | 1257 2248 2249 2518 2538 2671 2673 2677 2678 2679
1259 | 1258
1260 | 1259 2342
1261 | 1260 1559 2250
1262 | 1261 2130
1263 | 1262 1509
1264 | 1263 1407
1265 | 1264 1531
1266 | 1265 2291
1267 | 1266 1529 1984
1268 | 1267 1824
1269 | 1268 1929 2309
1270 | 1269 1314 1630 1880 2227
1271 | 1270 1622 2034
1272 | 1271 2367
1273 | 1272 1945 1947 1950
1274 | 1273 2034 2124 2125 2126 2129
1275 | 1274
1276 | 1275 1468
1277 | 1276 1962
1278 | 1277
1279 | 1278 2268 2305 2306
1280 | 1279 2186 2203
1281 | 1280 2612
1282 | 1281 1358 1746 1748
1283 | 1282 1283 1483 1735 1776 2324
1284 | 1283 1776 2324
1285 | 1284 1358 1604 2450
1286 | 1285 2182
1287 | 1286
1288 | 1287 1358
1289 | 1288 1309
1290 | 1289 1622 1705
1291 | 1290 2258
1292 | 1291 1660 2154
1293 | 1292 1317 1358
1294 | 1293 2418
1295 | 1294
1296 | 1295 1987
1297 | 1296 1910 1911 1912 1913
1298 | 1297
1299 | 1298 2703
1300 | 1299 1701 1810
1301 | 1300 1542 1933 1940 1941 2508
1302 | 1301 1315 1316 1542 2140
1303 | 1302
1304 | 1303 2163
1305 | 1304
1306 | 1305 1358
1307 | 1306 1317
1308 | 1307 1444 2536
1309 | 1308 1627
1310 | 1309 1338 1625 1677 1741 1882 2015 2102 2103 2178
1311 | 1310 2692
1312 | 1311 1312 1313 2080
1313 | 1312 1313
1314 | 1313 2407
1315 | 1314
1316 | 1315
1317 | 1316 1679
1318 | 1317 1719 1720 1739 2220 2221
1319 | 1318 2661 2662
1320 | 1319 2034
1321 | 1320 1349 1491 2684
1322 | 1321 2501
1323 | 1322 2265
1324 | 1323 1701
1325 | 1324 2035 2265
1326 | 1325 2063 2247
1327 | 1326 2063
1328 | 1327 1328 2161
1329 | 1328 2160
1330 | 1329
1331 | 1330 1837 2068
1332 | 1331 1576 1810
1333 | 1332 2024
1334 | 1333 1973 2195 2312
1335 | 1334 1542 1577 1701 1941 1950
1336 | 1335 1856 2087 2090
1337 | 1336 1452 1797 2032 2033 2034
1338 | 1337 1701 2045
1339 | 1338 1912 2385
1340 | 1339 1358 1753
1341 | 1340 1413 1542 1943
1342 | 1341 2425
1343 | 1342
1344 | 1343 1507 1645 2165 2316
1345 | 1344 1928 2315
1346 | 1345
1347 | 1346 1670
1348 | 1347 1997
1349 | 1348 1810 2163 2605
1350 | 1349 1634 2684
1351 | 1350 1443
1352 | 1351 1362 1535 1592 1972 2134 2282
1353 | 1352
1354 | 1353
1355 | 1354 2270
1356 | 1355 1957
1357 | 1356 1613
1358 | 1357 2025 2170
1359 | 1358 1384 1389 1444 1471 1480 1483 1492 1499 1516 1546 1550 1562 1565 1568 1599 1604 1620 1646 1708 1709 1710 1711 1712 1713 1714 1715 1716 1717 1718 1719 1720 1721 1722 1723 1724 1725 1726 1727 1728 1729 1730 1731 1732 1733 1734 1735 1736 1737 1738 1739 1740 1741 1742 1743 1744 1745 1746 1747 1748 1749 1750 1751 1752 1753 1754 1755 1756 1757 1758 1759 1760 1761 1762 1763 1764 1765 1766 2597
1360 | 1359 1464 1914 2278 2480
1361 | 1360 1650 1902 1904 1960 2491
1362 | 1361 1528
1363 | 1362 1372 1914
1364 | 1363 1413
1365 | 1364
1366 | 1365 1702 1703 2509 2519
1367 | 1366 2248
1368 | 1367 1584 1656 1778 2080 2143
1369 | 1368 1911 2553
1370 | 1369 1530 1653
1371 | 1370 1525 1919 1975 2274 2485
1372 | 1371 1393
1373 | 1372
1374 | 1373
1375 | 1374
1376 | 1375 2586
1377 | 1376 1848 2232
1378 | 1377 1840 2068 2069 2070 2388
1379 | 1378 1544 2058 2150
1380 | 1379 1380 2564
1381 | 1380 2564
1382 | 1381 1794
1383 | 1382 1502 2176 2283
1384 | 1383
1385 | 1384
1386 | 1385 2335
1387 | 1386 2570 2571
1388 | 1387
1389 | 1388 2181 2197
1390 | 1389 2597
1391 | 1390 2681
1392 | 1391 1693
1393 | 1392 2212
1394 | 1393
1395 | 1394 2137
1396 | 1395 2072 2189 2266 2267
1397 | 1396 1399 1421 2109 2228 2394 2395
1398 | 1397
1399 | 1398 1528
1400 | 1399 1501 2227 2309 2394
1401 | 1400 2698
1402 | 1401 1585
1403 | 1402 1482 1616 1919 1972
1404 | 1403 2317
1405 | 1404 2375
1406 | 1405 1508 1894
1407 | 1406 2387
1408 | 1407
1409 | 1408 1553 1826 1828 1829
1410 | 1409 1553 1826 1827 1828
1411 | 1410 1655
1412 | 1411 2218
1413 | 1412 1483 1735 1743 2450
1414 | 1413 1542 1931 1932 1933 1934 1935 1936
1415 | 1414 1415 2270
1416 | 1415 2269
1417 | 1416 1468 1602 1921 1922 1923 1924 1925 1926
1418 | 1417 1452 1529 1703 2260 2350
1419 | 1418
1420 | 1419 2259 2260
1421 | 1420 1421 2394 2396
1422 | 1421 1975 2109 2110 2394
1423 | 1422
1424 | 1423 1692
1425 | 1424 1621 1813
1426 | 1425 1618
1427 | 1426 1751
1428 | 1427 1725 1742 2446
1429 | 1428 1429 1534 2336 2376 2520 2524
1430 | 1429 1534 2520
1431 | 1430 1718
1432 | 1431
1433 | 1432 1950 2663
1434 | 1433 2359
1435 | 1434 1986
1436 | 1435 1927 2059 2071
1437 | 1436 2259
1438 | 1437 1688
1439 | 1438 2664
1440 | 1439 2676
1441 | 1440 1701
1442 | 1441 1619 1700 1707 1954 1957 1958 2271 2272 2273 2274 2275
1443 | 1442 2462
1444 | 1443 2240 2399 2645
1445 | 1444 2323
1446 | 1445 1446
1447 | 1446
1448 | 1447 1928
1449 | 1448 2034 2045
1450 | 1449 2681
1451 | 1450
1452 | 1451 2564
1453 | 1452
1454 | 1453 1701 1986
1455 | 1454
1456 | 1455 1537
1457 | 1456 1457
1458 | 1457
1459 | 1458
1460 | 1459 2080 2218
1461 | 1460 2349 2358
1462 | 1461 1634 2248
1463 | 1462 1611 1663
1464 | 1463 1695 2131 2132 2133
1465 | 1464 1482 1683 1914
1466 | 1465 1488 1799
1467 | 1466
1468 | 1467 2172
1469 | 1468
1470 | 1469 1498 1692 2346 2349 2358 2590
1471 | 1470 1555 1824 1953 2553
1472 | 1471 1646
1473 | 1472 1692 2187
1474 | 1473 2706 2707
1475 | 1474 1908 2261 2335
1476 | 1475 2469 2470
1477 | 1476
1478 | 1477 1542 2066 2067
1479 | 1478 1902
1480 | 1479 1701 1894
1481 | 1480
1482 | 1481 1669 2332 2335
1483 | 1482 1973 2132
1484 | 1483 1620 1726 1735 1743 2083 2113 2208
1485 | 1484 2383 2498 2520
1486 | 1485 1914
1487 | 1486 1745
1488 | 1487 1501 1849 2017 2137
1489 | 1488 2601
1490 | 1489
1491 | 1490 2085 2327
1492 | 1491
1493 | 1492 1765
1494 | 1493
1495 | 1494
1496 | 1495
1497 | 1496
1498 | 1497 2034
1499 | 1498 2025 2349 2590
1500 | 1499 1776
1501 | 1500 1538 2185
1502 | 1501 1676 2228 2395
1503 | 1502 1503 2163
1504 | 1503 2241 2243
1505 | 1504 1647 2209
1506 | 1505 1552 1624 1699 1788 1801 2086 2107
1507 | 1506
1508 | 1507 1645 2436
1509 | 1508 1894 2117
1510 | 1509
1511 | 1510 2387
1512 | 1511 1519
1513 | 1512 2023 2145
1514 | 1513 1830
1515 | 1514 1692
1516 | 1515 2182 2234
1517 | 1516 2612
1518 | 1517
1519 | 1518
1520 | 1519 1534 2108 2335 2640
1521 | 1520 1750 1765
1522 | 1521
1523 | 1522 1697 2680
1524 | 1523 2449
1525 | 1524 1693 2499
1526 | 1525 1914 1973
1527 | 1526 1879 1977 1981 2099 2425
1528 | 1527 1668 2274 2450
1529 | 1528
1530 | 1529 1908 1984 2350
1531 | 1530 2240 2464
1532 | 1531 1889
1533 | 1532 2648
1534 | 1533 2671
1535 | 1534
1536 | 1535 1916 2155 2293
1537 | 1536 2580 2702
1538 | 1537
1539 | 1538 1842 2185
1540 | 1539 1914
1541 | 1540 1628 2054
1542 | 1541
1543 | 1542 1577 1607 1931 1935 1936 1939 1940 1941 1942 1943
1544 | 1543 1881 2029
1545 | 1544
1546 | 1545 2248
1547 | 1546 1804 2169
1548 | 1547 2169
1549 | 1548
1550 | 1549 1901 2286
1551 | 1550 2083
1552 | 1551
1553 | 1552 1778 2591
1554 | 1553 1827
1555 | 1554 1657 2686 2687 2688
1556 | 1555 1953 2363
1557 | 1556
1558 | 1557 2083 2168
1559 | 1558 1986 2338 2339 2340
1560 | 1559
1561 | 1560 1623
1562 | 1561 1623
1563 | 1562 1765
1564 | 1563 2633
1565 | 1564 1772 1786
1566 | 1565 2365
1567 | 1566 1732 1895 2071
1568 | 1567 1654 1874 2500
1569 | 1568 1729
1570 | 1569 1966
1571 | 1570
1572 | 1571
1573 | 1572 1776
1574 | 1573 2698
1575 | 1574 1986
1576 | 1575
1577 | 1576 1810
1578 | 1577
1579 | 1578 2497
1580 | 1579 2029
1581 | 1580 1892 2199 2200
1582 | 1581 1810 1819
1583 | 1582
1584 | 1583 1995 2025
1585 | 1584 1798
1586 | 1585 1653
1587 | 1586 2168
1588 | 1587 2154
1589 | 1588 2055 2333
1590 | 1589 1745 2596
1591 | 1590
1592 | 1591 2541
1593 | 1592 1914 2071 2485
1594 | 1593
1595 | 1594
1596 | 1595 2246
1597 | 1596 2476
1598 | 1597
1599 | 1598 2367
1600 | 1599 2441 2460 2614
1601 | 1600
1602 | 1601 1750
1603 | 1602 2054 2072 2073 2074
1604 | 1603
1605 | 1604
1606 | 1605 1950 2663
1607 | 1606
1608 | 1607
1609 | 1608 2124 2126 2489
1610 | 1609 1790 1791 1912
1611 | 1610 2556 2557 2558
1612 | 1611 2124
1613 | 1612
1614 | 1613
1615 | 1614 1671 2034
1616 | 1615 1665
1617 | 1616 1849 2050
1618 | 1617
1619 | 1618 1789
1620 | 1619 2269
1621 | 1620 2450
1622 | 1621
1623 | 1622 2425
1624 | 1623 1767 1768 1769 1770 1771 1772 1773 1774 1775 1776 1777 1778 1779 1780 1781 1782 1783 1784
1625 | 1624 1705 1777 1778 1779 1780 1785 1786 1787 1788
1626 | 1625 1729
1627 | 1626
1628 | 1627 1740
1629 | 1628 1635 2021 2053 2054 2055 2056 2057 2172
1630 | 1629 1659 1711
1631 | 1630 2227
1632 | 1631 2196
1633 | 1632 1756
1634 | 1633
1635 | 1634 2393
1636 | 1635 2383
1637 | 1636 1701 2139 2141
1638 | 1637 1973
1639 | 1638
1640 | 1639 1688 2679
1641 | 1640
1642 | 1641 2495
1643 | 1642 2657 2658
1644 | 1643
1645 | 1644
1646 | 1645
1647 | 1646
1648 | 1647 2160 2209
1649 | 1648
1650 | 1649
1651 | 1650
1652 | 1651 2045
1653 | 1652 1848 1972
1654 | 1653
1655 | 1654 1952 2034
1656 | 1655 1839 1842 1894 2136 2282 2295 2384
1657 | 1656 1798
1658 | 1657 2686 2688
1659 | 1658 2018
1660 | 1659
1661 | 1660 2154
1662 | 1661 1851
1663 | 1662 1666 2381 2398
1664 | 1663 2577
1665 | 1664 2147
1666 | 1665 1701 2034
1667 | 1666 2381
1668 | 1667
1669 | 1668
1670 | 1669 1892 2332
1671 | 1670
1672 | 1671 1929 1968 2136 2137
1673 | 1672
1674 | 1673 2660
1675 | 1674 1914 2608
1676 | 1675 1676 1914
1677 | 1676 1914 1975
1678 | 1677 1908
1679 | 1678 1746
1680 | 1679 2140
1681 | 1680 2056 2576
1682 | 1681 2318 2319 2320
1683 | 1682 2189 2291 2319 2320 2321
1684 | 1683 1973 2132
1685 | 1684 2486 2513
1686 | 1685 2083
1687 | 1686 2647
1688 | 1687 1721
1689 | 1688 2671
1690 | 1689
1691 | 1690 2112
1692 | 1691
1693 | 1692 2186 2187 2302 2345 2346 2347 2348 2349
1694 | 1693 2408
1695 | 1694
1696 | 1695 2133
1697 | 1696 1713
1698 | 1697 1986 2253
1699 | 1698 2160
1700 | 1699
1701 | 1700
1702 | 1701 1799 1820 1846 1852 1853 1854 1855 1856 1857 1858 1859 1860 1861 1862 1863 1864 1865 1866 1867 1868 1869 1870 1871 1872 1873 1874 1875 1876 1877
1703 | 1702 1703 1966 1970 1971 2238
1704 | 1703 1906 1966 1967 1968 1969 1970 1971 2238
1705 | 1704 1912 1986 2077
1706 | 1705 2081 2322
1707 | 1706 2585
1708 | 1707
1709 | 1708 1857 2313 2314
1710 | 1709 1738 1739 1986 2365
1711 | 1710 2444
1712 | 1711 1730 1731 1765
1713 | 1712 1725 2446
1714 | 1713 1898
1715 | 1714 1727
1716 | 1715
1717 | 1716 2446
1718 | 1717 1731 2560
1719 | 1718 2685
1720 | 1719 1765
1721 | 1720 1765
1722 | 1721 1761 2476
1723 | 1722
1724 | 1723 1724
1725 | 1724 2215 2216
1726 | 1725 1734 1740 1745 2334 2413 2596 2597
1727 | 1726 2168
1728 | 1727
1729 | 1728 2257 2555 2599
1730 | 1729
1731 | 1730
1732 | 1731 1755
1733 | 1732
1734 | 1733
1735 | 1734 2597
1736 | 1735 1776 2324
1737 | 1736 1763
1738 | 1737 1888 2616
1739 | 1738 2365 2366
1740 | 1739 1750 1766 1886 2221
1741 | 1740 1804 2389 2390 2391 2392
1742 | 1741 2593
1743 | 1742 1749 1758 2307 2443 2444 2445
1744 | 1743 2113 2450
1745 | 1744 1765
1746 | 1745 2596
1747 | 1746
1748 | 1747
1749 | 1748
1750 | 1749 2446
1751 | 1750 1765 1766
1752 | 1751 2596
1753 | 1752 1754
1754 | 1753
1755 | 1754
1756 | 1755 1765
1757 | 1756 2642
1758 | 1757 2451
1759 | 1758 2466
1760 | 1759
1761 | 1760
1762 | 1761 2175
1763 | 1762
1764 | 1763
1765 | 1764
1766 | 1765 2492
1767 | 1766
1768 | 1767 1783
1769 | 1768 1837 2325
1770 | 1769
1771 | 1770 2080
1772 | 1771 1778
1773 | 1772 1782 2045
1774 | 1773 1780 1789
1775 | 1774
1776 | 1775
1777 | 1776 2045 2078 2324
1778 | 1777 2143
1779 | 1778 2327
1780 | 1779 2318
1781 | 1780
1782 | 1781 1998 2107
1783 | 1782 1798 2326
1784 | 1783
1785 | 1784 2045 2096
1786 | 1785
1787 | 1786
1788 | 1787 1810 2045 2143
1789 | 1788
1790 | 1789
1791 | 1790
1792 | 1791 2045
1793 | 1792 2485
1794 | 1793
1795 | 1794 1795 1796
1796 | 1795 1796 2299 2464
1797 | 1796 2297
1798 | 1797 2088
1799 | 1798
1800 | 1799
1801 | 1800
1802 | 1801 2203 2451
1803 | 1802 2079
1804 | 1803
1805 | 1804 2451
1806 | 1805
1807 | 1806 1809
1808 | 1807 2034
1809 | 1808 1810
1810 | 1809 1810
1811 | 1810 1811 1812 1813 1814 1815 1816 1817 1818 1819 1820 1821 1822 1823 1869
1812 | 1811 1822
1813 | 1812 1986 2034 2550
1814 | 1813
1815 | 1814
1816 | 1815 1950
1817 | 1816
1818 | 1817
1819 | 1818 2305 2306
1820 | 1819 1948
1821 | 1820 1873
1822 | 1821
1823 | 1822
1824 | 1823
1825 | 1824 1825 1826
1826 | 1825
1827 | 1826 1827 1829
1828 | 1827 1828
1829 | 1828
1830 | 1829 1902
1831 | 1830 2045
1832 | 1831
1833 | 1832
1834 | 1833 1834
1835 | 1834 2359
1836 | 1835 2157 2160
1837 | 1836
1838 | 1837
1839 | 1838 2182 2219
1840 | 1839 2424 2453
1841 | 1840 2071
1842 | 1841 2056
1843 | 1842 1979 1980 2281 2405
1844 | 1843
1845 | 1844 2276
1846 | 1845
1847 | 1846 1867 2117 2357
1848 | 1847
1849 | 1848 1909 2419
1850 | 1849 1914
1851 | 1850
1852 | 1851 2485
1853 | 1852 2045
1854 | 1853
1855 | 1854
1856 | 1855
1857 | 1856 2045 2096
1858 | 1857
1859 | 1858
1860 | 1859 1986
1861 | 1860
1862 | 1861
1863 | 1862 2582
1864 | 1863
1865 | 1864
1866 | 1865
1867 | 1866
1868 | 1867 2117
1869 | 1868
1870 | 1869 2309
1871 | 1870 1986
1872 | 1871
1873 | 1872
1874 | 1873 1986
1875 | 1874 2500 2501
1876 | 1875 1986
1877 | 1876 1986
1878 | 1877 2573
1879 | 1878 2256
1880 | 1879
1881 | 1880
1882 | 1881 1958
1883 | 1882
1884 | 1883
1885 | 1884 1885 1958
1886 | 1885 2644
1887 | 1886
1888 | 1887 1888
1889 | 1888
1890 | 1889 1890 1891 1892 1893 2034 2039 2130
1891 | 1890 2130
1892 | 1891 2034 2130
1893 | 1892
1894 | 1893
1895 | 1894 2034 2117 2118
1896 | 1895
1897 | 1896 1897
1898 | 1897 1903
1899 | 1898
1900 | 1899 1902 2491 2510
1901 | 1900 2608
1902 | 1901 1954 2284 2286
1903 | 1902 1903 1904
1904 | 1903 1960 2269 2592
1905 | 1904 1960
1906 | 1905 1914
1907 | 1906 1907 1964 2059
1908 | 1907
1909 | 1908
1910 | 1909 2305
1911 | 1910
1912 | 1911 2553
1913 | 1912 2102 2103 2404
1914 | 1913
1915 | 1914 1915 1916 1917 1918 1919 1920 2388
1916 | 1915 2649
1917 | 1916
1918 | 1917 2457
1919 | 1918
1920 | 1919
1921 | 1920 1926 2310 2422 2653
1922 | 1921
1923 | 1922
1924 | 1923 1927
1925 | 1924 1927
1926 | 1925 2620
1927 | 1926 2051 2052 2189
1928 | 1927 1928 1929 1930
1929 | 1928 2317 2475
1930 | 1929 2034
1931 | 1930
1932 | 1931
1933 | 1932 1936
1934 | 1933
1935 | 1934
1936 | 1935
1937 | 1936
1938 | 1937 1938
1939 | 1938
1940 | 1939
1941 | 1940
1942 | 1941 2508
1943 | 1942
1944 | 1943
1945 | 1944 1945 1946 1950
1946 | 1945 1946 1947
1947 | 1946 1950
1948 | 1947 1950
1949 | 1948 1950
1950 | 1949
1951 | 1950
1952 | 1951 1952 2204 2206
1953 | 1952 2204 2205 2206
1954 | 1953
1955 | 1954 1955 1956 1957 1958 2275
1956 | 1955 1956 2235
1957 | 1956
1958 | 1957 2241 2630
1959 | 1958
1960 | 1959 2621
1961 | 1960 2491
1962 | 1961
1963 | 1962 1963
1964 | 1963
1965 | 1964 1965 1966
1966 | 1965
1967 | 1966 1971
1968 | 1967
1969 | 1968 2034
1970 | 1969 2236 2238
1971 | 1970
1972 | 1971
1973 | 1972
1974 | 1973 1974 1975 1976
1975 | 1974
1976 | 1975 2109 2110
1977 | 1976
1978 | 1977
1979 | 1978
1980 | 1979 1980 2178 2405
1981 | 1980 2405
1982 | 1981
1983 | 1982
1984 | 1983 2034
1985 | 1984
1986 | 1985
1987 | 1986 1987 1988 1989 1990 1991 1992 1993 1994 1995 1996 1997 1998 1999 2000 2001 2002 2003 2004 2005 2006 2007 2008 2009
1988 | 1987 2004 2509
1989 | 1988
1990 | 1989
1991 | 1990
1992 | 1991 2325
1993 | 1992
1994 | 1993
1995 | 1994
1996 | 1995
1997 | 1996 2063 2064
1998 | 1997 2412
1999 | 1998 2326
2000 | 1999
2001 | 2000
2002 | 2001 2002 2003 2044 2121 2123 2335 2348 2378 2379 2380 2381
2003 | 2002 2379
2004 | 2003 2123
2005 | 2004
2006 | 2005
2007 | 2006 2477
2008 | 2007
2009 | 2008 2301
2010 | 2009 2025
2011 | 2010 2016
2012 | 2011
2013 | 2012
2014 | 2013
2015 | 2014
2016 | 2015 2071
2017 | 2016 2071
2018 | 2017 2467
2019 | 2018 2019
2020 | 2019
2021 | 2020
2022 | 2021 2418
2023 | 2022 2190
2024 | 2023 2217
2025 | 2024 2156 2178
2026 | 2025 2026 2027 2028
2027 | 2026 2045 2096
2028 | 2027 2360
2029 | 2028
2030 | 2029
2031 | 2030 2112
2032 | 2031 2436
2033 | 2032
2034 | 2033
2035 | 2034 2035 2036 2037 2038 2039 2040 2041 2042 2130
2036 | 2035
2037 | 2036
2038 | 2037
2039 | 2038
2040 | 2039 2042
2041 | 2040 2120 2122
2042 | 2041
2043 | 2042
2044 | 2043 2406
2045 | 2044
2046 | 2045 2046 2047 2048
2047 | 2046 2048
2048 | 2047
2049 | 2048
2050 | 2049
2051 | 2050 2485
2052 | 2051 2364
2053 | 2052
2054 | 2053
2055 | 2054 2133
2056 | 2055 2198
2057 | 2056 2311 2576
2058 | 2057
2059 | 2058 2150
2060 | 2059
2061 | 2060
2062 | 2061
2063 | 2062
2064 | 2063
2065 | 2064
2066 | 2065 2066 2067
2067 | 2066 2122
2068 | 2067
2069 | 2068
2070 | 2069 2070
2071 | 2070
2072 | 2071
2073 | 2072 2182
2074 | 2073
2075 | 2074 2181 2277
2076 | 2075 2076 2077 2091 2667 2668
2077 | 2076 2077 2667
2078 | 2077
2079 | 2078
2080 | 2079
2081 | 2080
2082 | 2081
2083 | 2082
2084 | 2083 2168
2085 | 2084
2086 | 2085 2089
2087 | 2086 2094
2088 | 2087
2089 | 2088
2090 | 2089
2091 | 2090
2092 | 2091
2093 | 2092
2094 | 2093
2095 | 2094 2106
2096 | 2095 2145 2329 2504
2097 | 2096
2098 | 2097 2098
2099 | 2098
2100 | 2099 2425
2101 | 2100 2101
2102 | 2101
2103 | 2102 2385
2104 | 2103 2385
2105 | 2104
2106 | 2105
2107 | 2106
2108 | 2107
2109 | 2108
2110 | 2109 2110
2111 | 2110
2112 | 2111
2113 | 2112 2436
2114 | 2113 2450
2115 | 2114 2509 2519 2588
2116 | 2115 2117 2519
2117 | 2116
2118 | 2117 2118
2119 | 2118
2120 | 2119
2121 | 2120 2637
2122 | 2121 2122
2123 | 2122 2123
2124 | 2123 2380
2125 | 2124 2125 2126 2127
2126 | 2125 2126
2127 | 2126
2128 | 2127 2130
2129 | 2128 2129
2130 | 2129
2131 | 2130 2249
2132 | 2131 2133 2182
2133 | 2132
2134 | 2133 2182
2135 | 2134
2136 | 2135 2282
2137 | 2136
2138 | 2137
2139 | 2138
2140 | 2139 2140 2141
2141 | 2140
2142 | 2141 2142
2143 | 2142
2144 | 2143
2145 | 2144 2145 2329 2331 2504
2146 | 2145
2147 | 2146 2147 2148
2148 | 2147 2486
2149 | 2148
2150 | 2149
2151 | 2150
2152 | 2151 2423
2153 | 2152 2385
2154 | 2153 2155
2155 | 2154
2156 | 2155 2293 2294 2295
2157 | 2156
2158 | 2157 2158 2159 2160
2159 | 2158
2160 | 2159 2160
2161 | 2160 2161
2162 | 2161
2163 | 2162 2163
2164 | 2163
2165 | 2164 2217 2282
2166 | 2165
2167 | 2166
2168 | 2167
2169 | 2168 2169
2170 | 2169
2171 | 2170 2171
2172 | 2171
2173 | 2172 2182
2174 | 2173 2174
2175 | 2174
2176 | 2175 2176
2177 | 2176
2178 | 2177 2367 2371
2179 | 2178
2180 | 2179
2181 | 2180 2182 2183
2182 | 2181 2196 2197 2202
2183 | 2182 2198 2231 2232 2233
2184 | 2183
2185 | 2184
2186 | 2185
2187 | 2186 2199
2188 | 2187
2189 | 2188
2190 | 2189
2191 | 2190
2192 | 2191
2193 | 2192
2194 | 2193 2194
2195 | 2194 2195
2196 | 2195
2197 | 2196
2198 | 2197
2199 | 2198
2200 | 2199 2200 2201
2201 | 2200
2202 | 2201
2203 | 2202
2204 | 2203 2330 2511
2205 | 2204 2521
2206 | 2205
2207 | 2206
2208 | 2207 2241
2209 | 2208
2210 | 2209
2211 | 2210
2212 | 2211 2356
2213 | 2212 2216
2214 | 2213
2215 | 2214 2216
2216 | 2215 2216
2217 | 2216
2218 | 2217
2219 | 2218
2220 | 2219
2221 | 2220 2221 2525
2222 | 2221 2507
2223 | 2222 2224 2225
2224 | 2223 2224
2225 | 2224 2225 2226
2226 | 2225
2227 | 2226
2228 | 2227 2228
2229 | 2228 2394 2395
2230 | 2229
2231 | 2230 2296
2232 | 2231 2292
2233 | 2232
2234 | 2233 2291
2235 | 2234
2236 | 2235
2237 | 2236 2237
2238 | 2237 2238
2239 | 2238 2239 2240
2240 | 2239
2241 | 2240 2399
2242 | 2241 2242
2243 | 2242 2539
2244 | 2243 2244 2245 2246
2245 | 2244
2246 | 2245 2246
2247 | 2246 2247
2248 | 2247 2509 2543
2249 | 2248 2249
2250 | 2249 2518 2671
2251 | 2250
2252 | 2251 2252
2253 | 2252
2254 | 2253
2255 | 2254
2256 | 2255
2257 | 2256
2258 | 2257
2259 | 2258
2260 | 2259 2335 2337
2261 | 2260
2262 | 2261
2263 | 2262
2264 | 2263 2490
2265 | 2264
2266 | 2265
2267 | 2266 2267 2321 2364
2268 | 2267
2269 | 2268 2291
2270 | 2269 2270
2271 | 2270 2375
2272 | 2271
2273 | 2272
2274 | 2273
2275 | 2274
2276 | 2275
2277 | 2276
2278 | 2277
2279 | 2278
2280 | 2279 2466
2281 | 2280
2282 | 2281 2405
2283 | 2282
2284 | 2283
2285 | 2284 2285
2286 | 2285 2286
2287 | 2286
2288 | 2287 2288
2289 | 2288
2290 | 2289 2464
2291 | 2290 2464 2496
2292 | 2291 2292
2293 | 2292
2294 | 2293
2295 | 2294
2296 | 2295
2297 | 2296
2298 | 2297 2298 2299
2299 | 2298
2300 | 2299
2301 | 2300
2302 | 2301
2303 | 2302
2304 | 2303 2541
2305 | 2304 2385
2306 | 2305 2306
2307 | 2306 2552
2308 | 2307
2309 | 2308 2559
2310 | 2309
2311 | 2310
2312 | 2311 2576
2313 | 2312
2314 | 2313
2315 | 2314
2316 | 2315
2317 | 2316
2318 | 2317
2319 | 2318
2320 | 2319
2321 | 2320
2322 | 2321
2323 | 2322
2324 | 2323
2325 | 2324
2326 | 2325
2327 | 2326
2328 | 2327 2328
2329 | 2328
2330 | 2329 2330 2331
2331 | 2330
2332 | 2331
2333 | 2332
2334 | 2333
2335 | 2334
2336 | 2335 2336
2337 | 2336 2524
2338 | 2337
2339 | 2338 2555
2340 | 2339 2340
2341 | 2340
2342 | 2341
2343 | 2342
2344 | 2343 2344
2345 | 2344 2475
2346 | 2345
2347 | 2346 2349
2348 | 2347
2349 | 2348
2350 | 2349 2358
2351 | 2350
2352 | 2351
2353 | 2352
2354 | 2353
2355 | 2354
2356 | 2355 2356 2357
2357 | 2356 2357
2358 | 2357
2359 | 2358
2360 | 2359 2372
2361 | 2360 2578
2362 | 2361
2363 | 2362
2364 | 2363
2365 | 2364
2366 | 2365
2367 | 2366
2368 | 2367 2368 2369 2370 2371
2369 | 2368 2370
2370 | 2369 2370
2371 | 2370
2372 | 2371
2373 | 2372
2374 | 2373
2375 | 2374
2376 | 2375
2377 | 2376
2378 | 2377
2379 | 2378
2380 | 2379
2381 | 2380
2382 | 2381
2383 | 2382 2383
2384 | 2383
2385 | 2384
2386 | 2385
2387 | 2386
2388 | 2387
2389 | 2388
2390 | 2389 2451
2391 | 2390 2391 2451
2392 | 2391 2451
2393 | 2392 2451
2394 | 2393
2395 | 2394 2395 2396 2397
2396 | 2395
2397 | 2396
2398 | 2397
2399 | 2398 2493
2400 | 2399 2434
2401 | 2400
2402 | 2401 2434 2435
2403 | 2402 2403
2404 | 2403
2405 | 2404
2406 | 2405
2407 | 2406
2408 | 2407
2409 | 2408
2410 | 2409
2411 | 2410 2411
2412 | 2411
2413 | 2412
2414 | 2413 2414 2415
2415 | 2414
2416 | 2415 2597
2417 | 2416 2417
2418 | 2417
2419 | 2418
2420 | 2419
2421 | 2420
2422 | 2421
2423 | 2422
2424 | 2423
2425 | 2424
2426 | 2425 2426
2427 | 2426
2428 | 2427 2428
2429 | 2428
2430 | 2429
2431 | 2430
2432 | 2431 2432
2433 | 2432
2434 | 2433
2435 | 2434
2436 | 2435
2437 | 2436
2438 | 2437 2438
2439 | 2438
2440 | 2439 2530 2652
2441 | 2440 2530
2442 | 2441
2443 | 2442 2530 2614
2444 | 2443
2445 | 2444 2503
2446 | 2445 2503
2447 | 2446 2447 2448
2448 | 2447
2449 | 2448
2450 | 2449
2451 | 2450
2452 | 2451 2452
2453 | 2452 2685
2454 | 2453
2455 | 2454
2456 | 2455
2457 | 2456
2458 | 2457 2458 2649
2459 | 2458
2460 | 2459 2689
2461 | 2460
2462 | 2461
2463 | 2462
2464 | 2463
2465 | 2464 2465
2466 | 2465
2467 | 2466
2468 | 2467 2468
2469 | 2468
2470 | 2469
2471 | 2470
2472 | 2471 2493
2473 | 2472
2474 | 2473 2648
2475 | 2474
2476 | 2475
2477 | 2476 2651
2478 | 2477
2479 | 2478
2480 | 2479
2481 | 2480
2482 | 2481
2483 | 2482
2484 | 2483 2484
2485 | 2484
2486 | 2485
2487 | 2486
2488 | 2487
2489 | 2488
2490 | 2489
2491 | 2490
2492 | 2491
2493 | 2492
2494 | 2493
2495 | 2494 2495
2496 | 2495
2497 | 2496
2498 | 2497
2499 | 2498
2500 | 2499
2501 | 2500
2502 | 2501
2503 | 2502
2504 | 2503
2505 | 2504
2506 | 2505
2507 | 2506
2508 | 2507
2509 | 2508
2510 | 2509 2519
2511 | 2510
2512 | 2511
2513 | 2512
2514 | 2513
2515 | 2514 2515
2516 | 2515 2530
2517 | 2516 2517
2518 | 2517
2519 | 2518 2538
2520 | 2519
2521 | 2520
2522 | 2521
2523 | 2522 2656
2524 | 2523
2525 | 2524
2526 | 2525
2527 | 2526
2528 | 2527
2529 | 2528
2530 | 2529
2531 | 2530
2532 | 2531
2533 | 2532
2534 | 2533
2535 | 2534 2535
2536 | 2535
2537 | 2536
2538 | 2537
2539 | 2538
2540 | 2539 2540 2632
2541 | 2540
2542 | 2541
2543 | 2542
2544 | 2543
2545 | 2544
2546 | 2545
2547 | 2546
2548 | 2547
2549 | 2548
2550 | 2549
2551 | 2550
2552 | 2551
2553 | 2552
2554 | 2553
2555 | 2554 2689
2556 | 2555 2599
2557 | 2556
2558 | 2557
2559 | 2558
2560 | 2559
2561 | 2560 2561
2562 | 2561
2563 | 2562
2564 | 2563
2565 | 2564
2566 | 2565 2566 2567
2567 | 2566
2568 | 2567
2569 | 2568
2570 | 2569
2571 | 2570 2572 2573
2572 | 2571
2573 | 2572
2574 | 2573
2575 | 2574
2576 | 2575
2577 | 2576
2578 | 2577
2579 | 2578
2580 | 2579
2581 | 2580
2582 | 2581
2583 | 2582
2584 | 2583
2585 | 2584
2586 | 2585
2587 | 2586
2588 | 2587
2589 | 2588
2590 | 2589
2591 | 2590
2592 | 2591
2593 | 2592
2594 | 2593
2595 | 2594 2680
2596 | 2595
2597 | 2596
2598 | 2597
2599 | 2598
2600 | 2599
2601 | 2600
2602 | 2601 2609
2603 | 2602 2603
2604 | 2603
2605 | 2604
2606 | 2605
2607 | 2606
2608 | 2607
2609 | 2608
2610 | 2609
2611 | 2610
2612 | 2611
2613 | 2612
2614 | 2613
2615 | 2614
2616 | 2615 2696
2617 | 2616
2618 | 2617
2619 | 2618 2619
2620 | 2619
2621 | 2620
2622 | 2621
2623 | 2622
2624 | 2623
2625 | 2624 2701
2626 | 2625 2626
2627 | 2626
2628 | 2627
2629 | 2628
2630 | 2629
2631 | 2630 2699
2632 | 2631
2633 | 2632
2634 | 2633
2635 | 2634 2635 2636
2636 | 2635
2637 | 2636 2693
2638 | 2637
2639 | 2638
2640 | 2639
2641 | 2640
2642 | 2641
2643 | 2642
2644 | 2643
2645 | 2644
2646 | 2645
2647 | 2646
2648 | 2647
2649 | 2648
2650 | 2649
2651 | 2650
2652 | 2651
2653 | 2652
2654 | 2653
2655 | 2654 2655
2656 | 2655
2657 | 2656
2658 | 2657
2659 | 2658
2660 | 2659
2661 | 2660
2662 | 2661
2663 | 2662
2664 | 2663
2665 | 2664
2666 | 2665 2666
2667 | 2666
2668 | 2667 2668
2669 | 2668
2670 | 2669
2671 | 2670 2679
2672 | 2671 2672 2673 2674 2675
2673 | 2672
2674 | 2673
2675 | 2674 2675
2676 | 2675
2677 | 2676
2678 | 2677
2679 | 2678
2680 | 2679 2682
2681 | 2680
2682 | 2681 2683
2683 | 2682
2684 | 2683
2685 | 2684
2686 | 2685
2687 | 2686
2688 | 2687 2688
2689 | 2688
2690 | 2689
2691 | 2690
2692 | 2691
2693 | 2692
2694 | 2693
2695 | 2694 2695
2696 | 2695
2697 | 2696
2698 | 2697
2699 | 2698
2700 | 2699
2701 | 2700
2702 | 2701
2703 | 2702
2704 | 2703
2705 | 2704
2706 | 2705
2707 | 2706 2707
2708 | 2707
2709 |
--------------------------------------------------------------------------------
/data/cora/cora_label.txt:
--------------------------------------------------------------------------------
1 | 0 3
2 | 1 4
3 | 2 4
4 | 3 0
5 | 4 3
6 | 5 2
7 | 6 0
8 | 7 3
9 | 8 3
10 | 9 2
11 | 10 0
12 | 11 0
13 | 12 4
14 | 13 3
15 | 14 3
16 | 15 3
17 | 16 2
18 | 17 3
19 | 18 1
20 | 19 3
21 | 20 5
22 | 21 3
23 | 22 4
24 | 23 6
25 | 24 3
26 | 25 3
27 | 26 6
28 | 27 3
29 | 28 2
30 | 29 4
31 | 30 3
32 | 31 6
33 | 32 0
34 | 33 4
35 | 34 2
36 | 35 0
37 | 36 1
38 | 37 5
39 | 38 4
40 | 39 4
41 | 40 3
42 | 41 6
43 | 42 6
44 | 43 4
45 | 44 3
46 | 45 3
47 | 46 2
48 | 47 5
49 | 48 3
50 | 49 4
51 | 50 5
52 | 51 3
53 | 52 0
54 | 53 2
55 | 54 1
56 | 55 4
57 | 56 6
58 | 57 3
59 | 58 2
60 | 59 2
61 | 60 0
62 | 61 0
63 | 62 0
64 | 63 4
65 | 64 2
66 | 65 0
67 | 66 4
68 | 67 5
69 | 68 2
70 | 69 6
71 | 70 5
72 | 71 2
73 | 72 2
74 | 73 2
75 | 74 0
76 | 75 4
77 | 76 5
78 | 77 6
79 | 78 4
80 | 79 0
81 | 80 0
82 | 81 0
83 | 82 4
84 | 83 2
85 | 84 4
86 | 85 1
87 | 86 4
88 | 87 6
89 | 88 0
90 | 89 4
91 | 90 2
92 | 91 4
93 | 92 6
94 | 93 6
95 | 94 0
96 | 95 0
97 | 96 6
98 | 97 5
99 | 98 0
100 | 99 6
101 | 100 0
102 | 101 2
103 | 102 1
104 | 103 1
105 | 104 1
106 | 105 2
107 | 106 6
108 | 107 5
109 | 108 6
110 | 109 1
111 | 110 2
112 | 111 2
113 | 112 1
114 | 113 5
115 | 114 5
116 | 115 5
117 | 116 6
118 | 117 5
119 | 118 6
120 | 119 5
121 | 120 5
122 | 121 1
123 | 122 6
124 | 123 6
125 | 124 1
126 | 125 5
127 | 126 1
128 | 127 6
129 | 128 5
130 | 129 5
131 | 130 5
132 | 131 1
133 | 132 5
134 | 133 1
135 | 134 1
136 | 135 1
137 | 136 1
138 | 137 1
139 | 138 1
140 | 139 1
141 | 140 4
142 | 141 3
143 | 142 0
144 | 143 3
145 | 144 6
146 | 145 6
147 | 146 0
148 | 147 3
149 | 148 4
150 | 149 0
151 | 150 3
152 | 151 4
153 | 152 4
154 | 153 1
155 | 154 2
156 | 155 2
157 | 156 2
158 | 157 3
159 | 158 3
160 | 159 3
161 | 160 3
162 | 161 0
163 | 162 4
164 | 163 5
165 | 164 0
166 | 165 3
167 | 166 4
168 | 167 3
169 | 168 3
170 | 169 3
171 | 170 2
172 | 171 3
173 | 172 3
174 | 173 2
175 | 174 2
176 | 175 6
177 | 176 1
178 | 177 4
179 | 178 3
180 | 179 3
181 | 180 3
182 | 181 6
183 | 182 3
184 | 183 3
185 | 184 3
186 | 185 3
187 | 186 0
188 | 187 4
189 | 188 2
190 | 189 2
191 | 190 6
192 | 191 5
193 | 192 3
194 | 193 5
195 | 194 4
196 | 195 0
197 | 196 4
198 | 197 3
199 | 198 4
200 | 199 4
201 | 200 3
202 | 201 3
203 | 202 2
204 | 203 4
205 | 204 0
206 | 205 3
207 | 206 2
208 | 207 3
209 | 208 3
210 | 209 4
211 | 210 4
212 | 211 0
213 | 212 3
214 | 213 6
215 | 214 0
216 | 215 3
217 | 216 3
218 | 217 4
219 | 218 3
220 | 219 3
221 | 220 5
222 | 221 2
223 | 222 3
224 | 223 2
225 | 224 4
226 | 225 1
227 | 226 3
228 | 227 2
229 | 228 2
230 | 229 3
231 | 230 3
232 | 231 3
233 | 232 3
234 | 233 5
235 | 234 1
236 | 235 3
237 | 236 1
238 | 237 3
239 | 238 5
240 | 239 0
241 | 240 3
242 | 241 5
243 | 242 0
244 | 243 4
245 | 244 2
246 | 245 4
247 | 246 2
248 | 247 4
249 | 248 4
250 | 249 5
251 | 250 4
252 | 251 3
253 | 252 5
254 | 253 3
255 | 254 3
256 | 255 4
257 | 256 3
258 | 257 0
259 | 258 4
260 | 259 5
261 | 260 0
262 | 261 3
263 | 262 6
264 | 263 2
265 | 264 5
266 | 265 5
267 | 266 5
268 | 267 3
269 | 268 2
270 | 269 3
271 | 270 0
272 | 271 4
273 | 272 5
274 | 273 3
275 | 274 0
276 | 275 4
277 | 276 0
278 | 277 3
279 | 278 3
280 | 279 0
281 | 280 0
282 | 281 3
283 | 282 5
284 | 283 4
285 | 284 4
286 | 285 3
287 | 286 4
288 | 287 3
289 | 288 3
290 | 289 2
291 | 290 2
292 | 291 3
293 | 292 0
294 | 293 3
295 | 294 1
296 | 295 3
297 | 296 2
298 | 297 3
299 | 298 3
300 | 299 4
301 | 300 5
302 | 301 2
303 | 302 1
304 | 303 1
305 | 304 0
306 | 305 0
307 | 306 1
308 | 307 6
309 | 308 1
310 | 309 3
311 | 310 3
312 | 311 3
313 | 312 2
314 | 313 3
315 | 314 3
316 | 315 0
317 | 316 3
318 | 317 4
319 | 318 1
320 | 319 3
321 | 320 4
322 | 321 3
323 | 322 2
324 | 323 0
325 | 324 0
326 | 325 4
327 | 326 2
328 | 327 3
329 | 328 2
330 | 329 1
331 | 330 4
332 | 331 6
333 | 332 3
334 | 333 2
335 | 334 0
336 | 335 3
337 | 336 3
338 | 337 2
339 | 338 3
340 | 339 4
341 | 340 4
342 | 341 2
343 | 342 1
344 | 343 3
345 | 344 5
346 | 345 3
347 | 346 2
348 | 347 0
349 | 348 4
350 | 349 5
351 | 350 1
352 | 351 3
353 | 352 3
354 | 353 2
355 | 354 0
356 | 355 2
357 | 356 4
358 | 357 2
359 | 358 2
360 | 359 2
361 | 360 5
362 | 361 4
363 | 362 4
364 | 363 2
365 | 364 2
366 | 365 0
367 | 366 3
368 | 367 2
369 | 368 4
370 | 369 4
371 | 370 5
372 | 371 5
373 | 372 1
374 | 373 0
375 | 374 3
376 | 375 4
377 | 376 5
378 | 377 3
379 | 378 4
380 | 379 5
381 | 380 3
382 | 381 4
383 | 382 3
384 | 383 3
385 | 384 1
386 | 385 4
387 | 386 3
388 | 387 3
389 | 388 5
390 | 389 2
391 | 390 3
392 | 391 2
393 | 392 5
394 | 393 5
395 | 394 4
396 | 395 3
397 | 396 3
398 | 397 3
399 | 398 3
400 | 399 1
401 | 400 5
402 | 401 3
403 | 402 3
404 | 403 2
405 | 404 6
406 | 405 0
407 | 406 1
408 | 407 3
409 | 408 0
410 | 409 1
411 | 410 5
412 | 411 3
413 | 412 6
414 | 413 3
415 | 414 6
416 | 415 0
417 | 416 3
418 | 417 3
419 | 418 3
420 | 419 5
421 | 420 4
422 | 421 3
423 | 422 4
424 | 423 0
425 | 424 5
426 | 425 2
427 | 426 1
428 | 427 2
429 | 428 4
430 | 429 4
431 | 430 4
432 | 431 4
433 | 432 3
434 | 433 3
435 | 434 0
436 | 435 4
437 | 436 3
438 | 437 0
439 | 438 5
440 | 439 2
441 | 440 0
442 | 441 5
443 | 442 4
444 | 443 4
445 | 444 4
446 | 445 3
447 | 446 0
448 | 447 6
449 | 448 5
450 | 449 2
451 | 450 4
452 | 451 5
453 | 452 1
454 | 453 3
455 | 454 5
456 | 455 3
457 | 456 0
458 | 457 3
459 | 458 5
460 | 459 1
461 | 460 1
462 | 461 0
463 | 462 3
464 | 463 4
465 | 464 2
466 | 465 6
467 | 466 2
468 | 467 0
469 | 468 5
470 | 469 3
471 | 470 4
472 | 471 6
473 | 472 5
474 | 473 3
475 | 474 5
476 | 475 0
477 | 476 1
478 | 477 3
479 | 478 0
480 | 479 5
481 | 480 2
482 | 481 2
483 | 482 3
484 | 483 5
485 | 484 1
486 | 485 0
487 | 486 3
488 | 487 1
489 | 488 4
490 | 489 2
491 | 490 5
492 | 491 6
493 | 492 4
494 | 493 2
495 | 494 2
496 | 495 6
497 | 496 0
498 | 497 0
499 | 498 4
500 | 499 6
501 | 500 3
502 | 501 2
503 | 502 0
504 | 503 3
505 | 504 6
506 | 505 1
507 | 506 6
508 | 507 3
509 | 508 1
510 | 509 3
511 | 510 3
512 | 511 3
513 | 512 3
514 | 513 2
515 | 514 5
516 | 515 4
517 | 516 5
518 | 517 5
519 | 518 3
520 | 519 1
521 | 520 3
522 | 521 3
523 | 522 4
524 | 523 4
525 | 524 2
526 | 525 0
527 | 526 2
528 | 527 0
529 | 528 5
530 | 529 4
531 | 530 0
532 | 531 0
533 | 532 3
534 | 533 2
535 | 534 2
536 | 535 2
537 | 536 2
538 | 537 6
539 | 538 4
540 | 539 6
541 | 540 5
542 | 541 5
543 | 542 1
544 | 543 0
545 | 544 0
546 | 545 4
547 | 546 3
548 | 547 3
549 | 548 1
550 | 549 3
551 | 550 6
552 | 551 6
553 | 552 2
554 | 553 3
555 | 554 3
556 | 555 3
557 | 556 1
558 | 557 2
559 | 558 2
560 | 559 5
561 | 560 4
562 | 561 3
563 | 562 2
564 | 563 1
565 | 564 2
566 | 565 2
567 | 566 3
568 | 567 2
569 | 568 3
570 | 569 2
571 | 570 3
572 | 571 3
573 | 572 0
574 | 573 5
575 | 574 3
576 | 575 3
577 | 576 3
578 | 577 4
579 | 578 5
580 | 579 3
581 | 580 2
582 | 581 1
583 | 582 4
584 | 583 4
585 | 584 4
586 | 585 4
587 | 586 0
588 | 587 5
589 | 588 4
590 | 589 1
591 | 590 3
592 | 591 0
593 | 592 3
594 | 593 4
595 | 594 6
596 | 595 3
597 | 596 6
598 | 597 3
599 | 598 3
600 | 599 3
601 | 600 6
602 | 601 3
603 | 602 4
604 | 603 3
605 | 604 6
606 | 605 3
607 | 606 0
608 | 607 3
609 | 608 1
610 | 609 2
611 | 610 5
612 | 611 6
613 | 612 5
614 | 613 2
615 | 614 0
616 | 615 2
617 | 616 2
618 | 617 3
619 | 618 3
620 | 619 0
621 | 620 3
622 | 621 5
623 | 622 3
624 | 623 4
625 | 624 0
626 | 625 3
627 | 626 2
628 | 627 4
629 | 628 5
630 | 629 2
631 | 630 3
632 | 631 2
633 | 632 2
634 | 633 3
635 | 634 5
636 | 635 2
637 | 636 0
638 | 637 3
639 | 638 4
640 | 639 3
641 | 640 3
642 | 641 3
643 | 642 0
644 | 643 5
645 | 644 5
646 | 645 5
647 | 646 5
648 | 647 5
649 | 648 5
650 | 649 3
651 | 650 2
652 | 651 0
653 | 652 4
654 | 653 3
655 | 654 4
656 | 655 1
657 | 656 1
658 | 657 2
659 | 658 3
660 | 659 0
661 | 660 1
662 | 661 5
663 | 662 3
664 | 663 6
665 | 664 3
666 | 665 4
667 | 666 0
668 | 667 0
669 | 668 5
670 | 669 3
671 | 670 3
672 | 671 5
673 | 672 2
674 | 673 3
675 | 674 3
676 | 675 4
677 | 676 5
678 | 677 4
679 | 678 3
680 | 679 0
681 | 680 0
682 | 681 3
683 | 682 6
684 | 683 1
685 | 684 2
686 | 685 1
687 | 686 2
688 | 687 2
689 | 688 4
690 | 689 2
691 | 690 3
692 | 691 4
693 | 692 3
694 | 693 0
695 | 694 5
696 | 695 3
697 | 696 3
698 | 697 3
699 | 698 4
700 | 699 3
701 | 700 3
702 | 701 5
703 | 702 6
704 | 703 5
705 | 704 2
706 | 705 4
707 | 706 4
708 | 707 0
709 | 708 3
710 | 709 5
711 | 710 3
712 | 711 0
713 | 712 6
714 | 713 3
715 | 714 4
716 | 715 4
717 | 716 3
718 | 717 0
719 | 718 0
720 | 719 1
721 | 720 5
722 | 721 2
723 | 722 3
724 | 723 2
725 | 724 6
726 | 725 0
727 | 726 4
728 | 727 3
729 | 728 5
730 | 729 3
731 | 730 0
732 | 731 0
733 | 732 2
734 | 733 0
735 | 734 0
736 | 735 5
737 | 736 0
738 | 737 5
739 | 738 0
740 | 739 5
741 | 740 4
742 | 741 1
743 | 742 2
744 | 743 3
745 | 744 2
746 | 745 3
747 | 746 3
748 | 747 5
749 | 748 2
750 | 749 4
751 | 750 5
752 | 751 0
753 | 752 2
754 | 753 0
755 | 754 2
756 | 755 5
757 | 756 3
758 | 757 2
759 | 758 2
760 | 759 4
761 | 760 2
762 | 761 4
763 | 762 2
764 | 763 0
765 | 764 2
766 | 765 3
767 | 766 3
768 | 767 0
769 | 768 3
770 | 769 0
771 | 770 3
772 | 771 0
773 | 772 6
774 | 773 1
775 | 774 4
776 | 775 3
777 | 776 4
778 | 777 0
779 | 778 6
780 | 779 6
781 | 780 4
782 | 781 3
783 | 782 4
784 | 783 4
785 | 784 3
786 | 785 3
787 | 786 4
788 | 787 4
789 | 788 3
790 | 789 4
791 | 790 3
792 | 791 3
793 | 792 3
794 | 793 5
795 | 794 0
796 | 795 3
797 | 796 2
798 | 797 2
799 | 798 4
800 | 799 3
801 | 800 2
802 | 801 5
803 | 802 4
804 | 803 5
805 | 804 4
806 | 805 4
807 | 806 2
808 | 807 5
809 | 808 4
810 | 809 0
811 | 810 4
812 | 811 3
813 | 812 3
814 | 813 4
815 | 814 4
816 | 815 0
817 | 816 5
818 | 817 2
819 | 818 3
820 | 819 2
821 | 820 2
822 | 821 3
823 | 822 5
824 | 823 2
825 | 824 2
826 | 825 2
827 | 826 5
828 | 827 3
829 | 828 4
830 | 829 1
831 | 830 6
832 | 831 1
833 | 832 3
834 | 833 3
835 | 834 1
836 | 835 3
837 | 836 3
838 | 837 4
839 | 838 0
840 | 839 0
841 | 840 5
842 | 841 3
843 | 842 0
844 | 843 3
845 | 844 5
846 | 845 3
847 | 846 3
848 | 847 6
849 | 848 2
850 | 849 4
851 | 850 6
852 | 851 0
853 | 852 0
854 | 853 2
855 | 854 4
856 | 855 3
857 | 856 4
858 | 857 4
859 | 858 0
860 | 859 2
861 | 860 2
862 | 861 0
863 | 862 4
864 | 863 0
865 | 864 1
866 | 865 3
867 | 866 3
868 | 867 2
869 | 868 3
870 | 869 3
871 | 870 3
872 | 871 2
873 | 872 4
874 | 873 0
875 | 874 3
876 | 875 3
877 | 876 1
878 | 877 3
879 | 878 5
880 | 879 3
881 | 880 0
882 | 881 2
883 | 882 2
884 | 883 2
885 | 884 4
886 | 885 5
887 | 886 3
888 | 887 1
889 | 888 0
890 | 889 2
891 | 890 5
892 | 891 6
893 | 892 3
894 | 893 4
895 | 894 3
896 | 895 0
897 | 896 5
898 | 897 0
899 | 898 6
900 | 899 3
901 | 900 3
902 | 901 0
903 | 902 2
904 | 903 5
905 | 904 5
906 | 905 2
907 | 906 4
908 | 907 6
909 | 908 6
910 | 909 3
911 | 910 1
912 | 911 4
913 | 912 4
914 | 913 5
915 | 914 3
916 | 915 2
917 | 916 3
918 | 917 0
919 | 918 3
920 | 919 2
921 | 920 3
922 | 921 6
923 | 922 4
924 | 923 3
925 | 924 4
926 | 925 5
927 | 926 3
928 | 927 3
929 | 928 3
930 | 929 2
931 | 930 3
932 | 931 2
933 | 932 3
934 | 933 2
935 | 934 4
936 | 935 5
937 | 936 2
938 | 937 1
939 | 938 3
940 | 939 6
941 | 940 5
942 | 941 5
943 | 942 3
944 | 943 4
945 | 944 3
946 | 945 1
947 | 946 4
948 | 947 4
949 | 948 0
950 | 949 4
951 | 950 6
952 | 951 2
953 | 952 3
954 | 953 3
955 | 954 4
956 | 955 6
957 | 956 4
958 | 957 2
959 | 958 1
960 | 959 3
961 | 960 3
962 | 961 3
963 | 962 3
964 | 963 4
965 | 964 0
966 | 965 0
967 | 966 0
968 | 967 3
969 | 968 1
970 | 969 2
971 | 970 2
972 | 971 5
973 | 972 3
974 | 973 5
975 | 974 3
976 | 975 0
977 | 976 2
978 | 977 2
979 | 978 2
980 | 979 3
981 | 980 1
982 | 981 3
983 | 982 3
984 | 983 4
985 | 984 4
986 | 985 2
987 | 986 3
988 | 987 3
989 | 988 3
990 | 989 0
991 | 990 3
992 | 991 6
993 | 992 0
994 | 993 6
995 | 994 3
996 | 995 5
997 | 996 4
998 | 997 3
999 | 998 2
1000 | 999 2
1001 | 1000 3
1002 | 1001 4
1003 | 1002 3
1004 | 1003 2
1005 | 1004 3
1006 | 1005 3
1007 | 1006 0
1008 | 1007 2
1009 | 1008 0
1010 | 1009 1
1011 | 1010 4
1012 | 1011 1
1013 | 1012 4
1014 | 1013 0
1015 | 1014 3
1016 | 1015 4
1017 | 1016 3
1018 | 1017 3
1019 | 1018 4
1020 | 1019 3
1021 | 1020 3
1022 | 1021 4
1023 | 1022 5
1024 | 1023 3
1025 | 1024 3
1026 | 1025 0
1027 | 1026 3
1028 | 1027 6
1029 | 1028 5
1030 | 1029 5
1031 | 1030 2
1032 | 1031 3
1033 | 1032 5
1034 | 1033 2
1035 | 1034 2
1036 | 1035 2
1037 | 1036 0
1038 | 1037 2
1039 | 1038 2
1040 | 1039 5
1041 | 1040 2
1042 | 1041 2
1043 | 1042 0
1044 | 1043 5
1045 | 1044 3
1046 | 1045 1
1047 | 1046 4
1048 | 1047 0
1049 | 1048 3
1050 | 1049 3
1051 | 1050 4
1052 | 1051 4
1053 | 1052 3
1054 | 1053 3
1055 | 1054 3
1056 | 1055 3
1057 | 1056 3
1058 | 1057 3
1059 | 1058 0
1060 | 1059 3
1061 | 1060 5
1062 | 1061 4
1063 | 1062 3
1064 | 1063 4
1065 | 1064 4
1066 | 1065 3
1067 | 1066 3
1068 | 1067 2
1069 | 1068 4
1070 | 1069 0
1071 | 1070 2
1072 | 1071 4
1073 | 1072 2
1074 | 1073 3
1075 | 1074 6
1076 | 1075 3
1077 | 1076 6
1078 | 1077 5
1079 | 1078 0
1080 | 1079 0
1081 | 1080 3
1082 | 1081 4
1083 | 1082 4
1084 | 1083 0
1085 | 1084 3
1086 | 1085 6
1087 | 1086 3
1088 | 1087 4
1089 | 1088 1
1090 | 1089 1
1091 | 1090 3
1092 | 1091 3
1093 | 1092 3
1094 | 1093 3
1095 | 1094 4
1096 | 1095 3
1097 | 1096 3
1098 | 1097 4
1099 | 1098 3
1100 | 1099 3
1101 | 1100 3
1102 | 1101 3
1103 | 1102 4
1104 | 1103 2
1105 | 1104 0
1106 | 1105 5
1107 | 1106 3
1108 | 1107 3
1109 | 1108 3
1110 | 1109 4
1111 | 1110 0
1112 | 1111 4
1113 | 1112 4
1114 | 1113 5
1115 | 1114 2
1116 | 1115 4
1117 | 1116 3
1118 | 1117 0
1119 | 1118 0
1120 | 1119 3
1121 | 1120 0
1122 | 1121 3
1123 | 1122 5
1124 | 1123 2
1125 | 1124 3
1126 | 1125 0
1127 | 1126 3
1128 | 1127 3
1129 | 1128 5
1130 | 1129 4
1131 | 1130 3
1132 | 1131 3
1133 | 1132 3
1134 | 1133 5
1135 | 1134 3
1136 | 1135 4
1137 | 1136 2
1138 | 1137 0
1139 | 1138 4
1140 | 1139 0
1141 | 1140 1
1142 | 1141 4
1143 | 1142 1
1144 | 1143 4
1145 | 1144 1
1146 | 1145 2
1147 | 1146 1
1148 | 1147 3
1149 | 1148 2
1150 | 1149 2
1151 | 1150 2
1152 | 1151 3
1153 | 1152 0
1154 | 1153 4
1155 | 1154 2
1156 | 1155 2
1157 | 1156 0
1158 | 1157 4
1159 | 1158 1
1160 | 1159 3
1161 | 1160 3
1162 | 1161 2
1163 | 1162 4
1164 | 1163 6
1165 | 1164 2
1166 | 1165 6
1167 | 1166 3
1168 | 1167 5
1169 | 1168 5
1170 | 1169 2
1171 | 1170 6
1172 | 1171 3
1173 | 1172 0
1174 | 1173 2
1175 | 1174 0
1176 | 1175 3
1177 | 1176 3
1178 | 1177 3
1179 | 1178 4
1180 | 1179 5
1181 | 1180 1
1182 | 1181 5
1183 | 1182 5
1184 | 1183 5
1185 | 1184 5
1186 | 1185 3
1187 | 1186 3
1188 | 1187 0
1189 | 1188 0
1190 | 1189 2
1191 | 1190 5
1192 | 1191 3
1193 | 1192 3
1194 | 1193 1
1195 | 1194 4
1196 | 1195 0
1197 | 1196 4
1198 | 1197 1
1199 | 1198 0
1200 | 1199 2
1201 | 1200 3
1202 | 1201 3
1203 | 1202 4
1204 | 1203 0
1205 | 1204 1
1206 | 1205 2
1207 | 1206 4
1208 | 1207 4
1209 | 1208 4
1210 | 1209 2
1211 | 1210 2
1212 | 1211 3
1213 | 1212 3
1214 | 1213 3
1215 | 1214 2
1216 | 1215 6
1217 | 1216 2
1218 | 1217 3
1219 | 1218 0
1220 | 1219 3
1221 | 1220 0
1222 | 1221 3
1223 | 1222 5
1224 | 1223 3
1225 | 1224 0
1226 | 1225 3
1227 | 1226 5
1228 | 1227 5
1229 | 1228 0
1230 | 1229 2
1231 | 1230 4
1232 | 1231 3
1233 | 1232 0
1234 | 1233 2
1235 | 1234 4
1236 | 1235 4
1237 | 1236 6
1238 | 1237 5
1239 | 1238 2
1240 | 1239 3
1241 | 1240 4
1242 | 1241 3
1243 | 1242 3
1244 | 1243 2
1245 | 1244 1
1246 | 1245 1
1247 | 1246 4
1248 | 1247 3
1249 | 1248 1
1250 | 1249 2
1251 | 1250 2
1252 | 1251 1
1253 | 1252 2
1254 | 1253 1
1255 | 1254 2
1256 | 1255 4
1257 | 1256 3
1258 | 1257 4
1259 | 1258 1
1260 | 1259 0
1261 | 1260 4
1262 | 1261 4
1263 | 1262 2
1264 | 1263 2
1265 | 1264 4
1266 | 1265 4
1267 | 1266 4
1268 | 1267 5
1269 | 1268 0
1270 | 1269 5
1271 | 1270 3
1272 | 1271 3
1273 | 1272 3
1274 | 1273 3
1275 | 1274 3
1276 | 1275 0
1277 | 1276 5
1278 | 1277 3
1279 | 1278 3
1280 | 1279 0
1281 | 1280 2
1282 | 1281 2
1283 | 1282 2
1284 | 1283 1
1285 | 1284 2
1286 | 1285 0
1287 | 1286 4
1288 | 1287 2
1289 | 1288 6
1290 | 1289 3
1291 | 1290 3
1292 | 1291 6
1293 | 1292 2
1294 | 1293 0
1295 | 1294 3
1296 | 1295 3
1297 | 1296 0
1298 | 1297 3
1299 | 1298 3
1300 | 1299 3
1301 | 1300 3
1302 | 1301 3
1303 | 1302 0
1304 | 1303 3
1305 | 1304 1
1306 | 1305 2
1307 | 1306 2
1308 | 1307 4
1309 | 1308 2
1310 | 1309 5
1311 | 1310 3
1312 | 1311 5
1313 | 1312 5
1314 | 1313 5
1315 | 1314 5
1316 | 1315 3
1317 | 1316 3
1318 | 1317 2
1319 | 1318 4
1320 | 1319 3
1321 | 1320 4
1322 | 1321 3
1323 | 1322 4
1324 | 1323 3
1325 | 1324 5
1326 | 1325 3
1327 | 1326 3
1328 | 1327 6
1329 | 1328 6
1330 | 1329 3
1331 | 1330 0
1332 | 1331 3
1333 | 1332 0
1334 | 1333 6
1335 | 1334 3
1336 | 1335 1
1337 | 1336 4
1338 | 1337 1
1339 | 1338 5
1340 | 1339 2
1341 | 1340 3
1342 | 1341 0
1343 | 1342 4
1344 | 1343 4
1345 | 1344 3
1346 | 1345 2
1347 | 1346 1
1348 | 1347 3
1349 | 1348 3
1350 | 1349 4
1351 | 1350 4
1352 | 1351 6
1353 | 1352 0
1354 | 1353 5
1355 | 1354 5
1356 | 1355 3
1357 | 1356 3
1358 | 1357 0
1359 | 1358 2
1360 | 1359 6
1361 | 1360 5
1362 | 1361 2
1363 | 1362 6
1364 | 1363 3
1365 | 1364 3
1366 | 1365 3
1367 | 1366 4
1368 | 1367 1
1369 | 1368 5
1370 | 1369 4
1371 | 1370 6
1372 | 1371 3
1373 | 1372 6
1374 | 1373 2
1375 | 1374 0
1376 | 1375 5
1377 | 1376 0
1378 | 1377 5
1379 | 1378 2
1380 | 1379 4
1381 | 1380 4
1382 | 1381 4
1383 | 1382 3
1384 | 1383 2
1385 | 1384 2
1386 | 1385 4
1387 | 1386 3
1388 | 1387 6
1389 | 1388 0
1390 | 1389 2
1391 | 1390 4
1392 | 1391 0
1393 | 1392 3
1394 | 1393 3
1395 | 1394 5
1396 | 1395 0
1397 | 1396 6
1398 | 1397 0
1399 | 1398 2
1400 | 1399 6
1401 | 1400 3
1402 | 1401 4
1403 | 1402 6
1404 | 1403 3
1405 | 1404 5
1406 | 1405 3
1407 | 1406 4
1408 | 1407 2
1409 | 1408 5
1410 | 1409 5
1411 | 1410 0
1412 | 1411 3
1413 | 1412 2
1414 | 1413 3
1415 | 1414 5
1416 | 1415 5
1417 | 1416 0
1418 | 1417 4
1419 | 1418 4
1420 | 1419 4
1421 | 1420 6
1422 | 1421 6
1423 | 1422 4
1424 | 1423 3
1425 | 1424 3
1426 | 1425 4
1427 | 1426 2
1428 | 1427 2
1429 | 1428 4
1430 | 1429 4
1431 | 1430 2
1432 | 1431 2
1433 | 1432 3
1434 | 1433 2
1435 | 1434 3
1436 | 1435 0
1437 | 1436 5
1438 | 1437 4
1439 | 1438 3
1440 | 1439 3
1441 | 1440 3
1442 | 1441 5
1443 | 1442 3
1444 | 1443 4
1445 | 1444 2
1446 | 1445 3
1447 | 1446 3
1448 | 1447 3
1449 | 1448 1
1450 | 1449 4
1451 | 1450 3
1452 | 1451 4
1453 | 1452 4
1454 | 1453 3
1455 | 1454 4
1456 | 1455 5
1457 | 1456 3
1458 | 1457 3
1459 | 1458 3
1460 | 1459 1
1461 | 1460 3
1462 | 1461 4
1463 | 1462 3
1464 | 1463 3
1465 | 1464 6
1466 | 1465 3
1467 | 1466 2
1468 | 1467 0
1469 | 1468 0
1470 | 1469 3
1471 | 1470 5
1472 | 1471 2
1473 | 1472 3
1474 | 1473 3
1475 | 1474 4
1476 | 1475 0
1477 | 1476 6
1478 | 1477 3
1479 | 1478 5
1480 | 1479 3
1481 | 1480 2
1482 | 1481 4
1483 | 1482 6
1484 | 1483 2
1485 | 1484 4
1486 | 1485 6
1487 | 1486 2
1488 | 1487 6
1489 | 1488 3
1490 | 1489 2
1491 | 1490 1
1492 | 1491 4
1493 | 1492 2
1494 | 1493 4
1495 | 1494 5
1496 | 1495 6
1497 | 1496 3
1498 | 1497 3
1499 | 1498 3
1500 | 1499 2
1501 | 1500 5
1502 | 1501 6
1503 | 1502 3
1504 | 1503 3
1505 | 1504 6
1506 | 1505 1
1507 | 1506 2
1508 | 1507 0
1509 | 1508 3
1510 | 1509 2
1511 | 1510 4
1512 | 1511 3
1513 | 1512 5
1514 | 1513 2
1515 | 1514 3
1516 | 1515 0
1517 | 1516 2
1518 | 1517 0
1519 | 1518 4
1520 | 1519 4
1521 | 1520 2
1522 | 1521 0
1523 | 1522 4
1524 | 1523 0
1525 | 1524 0
1526 | 1525 6
1527 | 1526 0
1528 | 1527 0
1529 | 1528 2
1530 | 1529 4
1531 | 1530 4
1532 | 1531 4
1533 | 1532 4
1534 | 1533 4
1535 | 1534 4
1536 | 1535 0
1537 | 1536 0
1538 | 1537 5
1539 | 1538 5
1540 | 1539 6
1541 | 1540 0
1542 | 1541 3
1543 | 1542 3
1544 | 1543 5
1545 | 1544 5
1546 | 1545 4
1547 | 1546 2
1548 | 1547 1
1549 | 1548 3
1550 | 1549 5
1551 | 1550 2
1552 | 1551 1
1553 | 1552 1
1554 | 1553 5
1555 | 1554 3
1556 | 1555 5
1557 | 1556 0
1558 | 1557 2
1559 | 1558 3
1560 | 1559 4
1561 | 1560 1
1562 | 1561 1
1563 | 1562 2
1564 | 1563 3
1565 | 1564 1
1566 | 1565 2
1567 | 1566 2
1568 | 1567 3
1569 | 1568 2
1570 | 1569 4
1571 | 1570 3
1572 | 1571 1
1573 | 1572 1
1574 | 1573 3
1575 | 1574 3
1576 | 1575 3
1577 | 1576 3
1578 | 1577 3
1579 | 1578 5
1580 | 1579 5
1581 | 1580 0
1582 | 1581 3
1583 | 1582 3
1584 | 1583 0
1585 | 1584 1
1586 | 1585 4
1587 | 1586 2
1588 | 1587 6
1589 | 1588 0
1590 | 1589 2
1591 | 1590 3
1592 | 1591 3
1593 | 1592 6
1594 | 1593 6
1595 | 1594 5
1596 | 1595 3
1597 | 1596 2
1598 | 1597 3
1599 | 1598 3
1600 | 1599 2
1601 | 1600 3
1602 | 1601 2
1603 | 1602 0
1604 | 1603 3
1605 | 1604 2
1606 | 1605 3
1607 | 1606 2
1608 | 1607 3
1609 | 1608 3
1610 | 1609 1
1611 | 1610 2
1612 | 1611 3
1613 | 1612 2
1614 | 1613 3
1615 | 1614 3
1616 | 1615 3
1617 | 1616 6
1618 | 1617 2
1619 | 1618 4
1620 | 1619 5
1621 | 1620 1
1622 | 1621 3
1623 | 1622 3
1624 | 1623 1
1625 | 1624 1
1626 | 1625 2
1627 | 1626 4
1628 | 1627 2
1629 | 1628 0
1630 | 1629 2
1631 | 1630 5
1632 | 1631 0
1633 | 1632 2
1634 | 1633 3
1635 | 1634 4
1636 | 1635 4
1637 | 1636 3
1638 | 1637 0
1639 | 1638 1
1640 | 1639 4
1641 | 1640 1
1642 | 1641 3
1643 | 1642 2
1644 | 1643 4
1645 | 1644 0
1646 | 1645 3
1647 | 1646 2
1648 | 1647 6
1649 | 1648 2
1650 | 1649 4
1651 | 1650 5
1652 | 1651 1
1653 | 1652 0
1654 | 1653 4
1655 | 1654 3
1656 | 1655 0
1657 | 1656 1
1658 | 1657 3
1659 | 1658 0
1660 | 1659 2
1661 | 1660 6
1662 | 1661 5
1663 | 1662 3
1664 | 1663 3
1665 | 1664 3
1666 | 1665 3
1667 | 1666 3
1668 | 1667 4
1669 | 1668 2
1670 | 1669 0
1671 | 1670 1
1672 | 1671 4
1673 | 1672 0
1674 | 1673 3
1675 | 1674 6
1676 | 1675 6
1677 | 1676 6
1678 | 1677 5
1679 | 1678 3
1680 | 1679 3
1681 | 1680 0
1682 | 1681 3
1683 | 1682 0
1684 | 1683 6
1685 | 1684 3
1686 | 1685 2
1687 | 1686 4
1688 | 1687 2
1689 | 1688 4
1690 | 1689 2
1691 | 1690 5
1692 | 1691 3
1693 | 1692 3
1694 | 1693 0
1695 | 1694 2
1696 | 1695 0
1697 | 1696 0
1698 | 1697 3
1699 | 1698 6
1700 | 1699 1
1701 | 1700 5
1702 | 1701 3
1703 | 1702 4
1704 | 1703 4
1705 | 1704 3
1706 | 1705 1
1707 | 1706 2
1708 | 1707 5
1709 | 1708 3
1710 | 1709 2
1711 | 1710 2
1712 | 1711 2
1713 | 1712 2
1714 | 1713 0
1715 | 1714 2
1716 | 1715 2
1717 | 1716 2
1718 | 1717 2
1719 | 1718 2
1720 | 1719 2
1721 | 1720 2
1722 | 1721 2
1723 | 1722 2
1724 | 1723 2
1725 | 1724 2
1726 | 1725 2
1727 | 1726 2
1728 | 1727 2
1729 | 1728 3
1730 | 1729 2
1731 | 1730 2
1732 | 1731 2
1733 | 1732 2
1734 | 1733 2
1735 | 1734 2
1736 | 1735 1
1737 | 1736 2
1738 | 1737 2
1739 | 1738 2
1740 | 1739 2
1741 | 1740 2
1742 | 1741 3
1743 | 1742 2
1744 | 1743 2
1745 | 1744 2
1746 | 1745 2
1747 | 1746 2
1748 | 1747 2
1749 | 1748 2
1750 | 1749 2
1751 | 1750 2
1752 | 1751 2
1753 | 1752 2
1754 | 1753 2
1755 | 1754 2
1756 | 1755 2
1757 | 1756 2
1758 | 1757 2
1759 | 1758 2
1760 | 1759 2
1761 | 1760 2
1762 | 1761 2
1763 | 1762 2
1764 | 1763 2
1765 | 1764 5
1766 | 1765 2
1767 | 1766 2
1768 | 1767 1
1769 | 1768 1
1770 | 1769 1
1771 | 1770 1
1772 | 1771 1
1773 | 1772 1
1774 | 1773 1
1775 | 1774 4
1776 | 1775 1
1777 | 1776 1
1778 | 1777 1
1779 | 1778 1
1780 | 1779 1
1781 | 1780 1
1782 | 1781 1
1783 | 1782 1
1784 | 1783 1
1785 | 1784 1
1786 | 1785 4
1787 | 1786 1
1788 | 1787 1
1789 | 1788 1
1790 | 1789 1
1791 | 1790 1
1792 | 1791 1
1793 | 1792 3
1794 | 1793 4
1795 | 1794 4
1796 | 1795 4
1797 | 1796 4
1798 | 1797 1
1799 | 1798 1
1800 | 1799 3
1801 | 1800 1
1802 | 1801 0
1803 | 1802 3
1804 | 1803 0
1805 | 1804 2
1806 | 1805 1
1807 | 1806 3
1808 | 1807 3
1809 | 1808 3
1810 | 1809 3
1811 | 1810 3
1812 | 1811 3
1813 | 1812 3
1814 | 1813 3
1815 | 1814 3
1816 | 1815 3
1817 | 1816 3
1818 | 1817 3
1819 | 1818 3
1820 | 1819 3
1821 | 1820 3
1822 | 1821 3
1823 | 1822 3
1824 | 1823 3
1825 | 1824 5
1826 | 1825 5
1827 | 1826 5
1828 | 1827 5
1829 | 1828 5
1830 | 1829 5
1831 | 1830 2
1832 | 1831 2
1833 | 1832 2
1834 | 1833 2
1835 | 1834 1
1836 | 1835 6
1837 | 1836 6
1838 | 1837 3
1839 | 1838 0
1840 | 1839 0
1841 | 1840 5
1842 | 1841 0
1843 | 1842 5
1844 | 1843 0
1845 | 1844 3
1846 | 1845 5
1847 | 1846 3
1848 | 1847 0
1849 | 1848 0
1850 | 1849 6
1851 | 1850 0
1852 | 1851 6
1853 | 1852 3
1854 | 1853 3
1855 | 1854 1
1856 | 1855 3
1857 | 1856 1
1858 | 1857 3
1859 | 1858 3
1860 | 1859 3
1861 | 1860 3
1862 | 1861 3
1863 | 1862 3
1864 | 1863 3
1865 | 1864 3
1866 | 1865 3
1867 | 1866 3
1868 | 1867 3
1869 | 1868 3
1870 | 1869 3
1871 | 1870 3
1872 | 1871 3
1873 | 1872 3
1874 | 1873 3
1875 | 1874 3
1876 | 1875 3
1877 | 1876 3
1878 | 1877 3
1879 | 1878 5
1880 | 1879 5
1881 | 1880 5
1882 | 1881 5
1883 | 1882 5
1884 | 1883 5
1885 | 1884 5
1886 | 1885 5
1887 | 1886 2
1888 | 1887 2
1889 | 1888 2
1890 | 1889 4
1891 | 1890 4
1892 | 1891 4
1893 | 1892 0
1894 | 1893 3
1895 | 1894 3
1896 | 1895 2
1897 | 1896 5
1898 | 1897 5
1899 | 1898 5
1900 | 1899 5
1901 | 1900 6
1902 | 1901 5
1903 | 1902 5
1904 | 1903 5
1905 | 1904 5
1906 | 1905 0
1907 | 1906 4
1908 | 1907 4
1909 | 1908 4
1910 | 1909 0
1911 | 1910 0
1912 | 1911 5
1913 | 1912 0
1914 | 1913 0
1915 | 1914 6
1916 | 1915 6
1917 | 1916 6
1918 | 1917 6
1919 | 1918 6
1920 | 1919 6
1921 | 1920 0
1922 | 1921 0
1923 | 1922 0
1924 | 1923 0
1925 | 1924 3
1926 | 1925 0
1927 | 1926 0
1928 | 1927 0
1929 | 1928 3
1930 | 1929 3
1931 | 1930 0
1932 | 1931 3
1933 | 1932 3
1934 | 1933 3
1935 | 1934 3
1936 | 1935 3
1937 | 1936 3
1938 | 1937 3
1939 | 1938 3
1940 | 1939 3
1941 | 1940 3
1942 | 1941 3
1943 | 1942 3
1944 | 1943 3
1945 | 1944 3
1946 | 1945 3
1947 | 1946 3
1948 | 1947 3
1949 | 1948 3
1950 | 1949 3
1951 | 1950 3
1952 | 1951 3
1953 | 1952 3
1954 | 1953 5
1955 | 1954 5
1956 | 1955 5
1957 | 1956 5
1958 | 1957 3
1959 | 1958 5
1960 | 1959 5
1961 | 1960 5
1962 | 1961 5
1963 | 1962 5
1964 | 1963 5
1965 | 1964 4
1966 | 1965 4
1967 | 1966 4
1968 | 1967 4
1969 | 1968 4
1970 | 1969 4
1971 | 1970 4
1972 | 1971 4
1973 | 1972 6
1974 | 1973 6
1975 | 1974 5
1976 | 1975 6
1977 | 1976 6
1978 | 1977 3
1979 | 1978 5
1980 | 1979 5
1981 | 1980 5
1982 | 1981 0
1983 | 1982 5
1984 | 1983 0
1985 | 1984 4
1986 | 1985 4
1987 | 1986 3
1988 | 1987 3
1989 | 1988 3
1990 | 1989 2
1991 | 1990 2
1992 | 1991 1
1993 | 1992 3
1994 | 1993 3
1995 | 1994 3
1996 | 1995 3
1997 | 1996 3
1998 | 1997 3
1999 | 1998 5
2000 | 1999 3
2001 | 2000 3
2002 | 2001 4
2003 | 2002 4
2004 | 2003 3
2005 | 2004 3
2006 | 2005 3
2007 | 2006 3
2008 | 2007 3
2009 | 2008 3
2010 | 2009 3
2011 | 2010 0
2012 | 2011 3
2013 | 2012 3
2014 | 2013 6
2015 | 2014 3
2016 | 2015 6
2017 | 2016 0
2018 | 2017 5
2019 | 2018 0
2020 | 2019 0
2021 | 2020 4
2022 | 2021 0
2023 | 2022 6
2024 | 2023 5
2025 | 2024 5
2026 | 2025 0
2027 | 2026 1
2028 | 2027 3
2029 | 2028 3
2030 | 2029 5
2031 | 2030 6
2032 | 2031 5
2033 | 2032 3
2034 | 2033 3
2035 | 2034 4
2036 | 2035 3
2037 | 2036 3
2038 | 2037 3
2039 | 2038 3
2040 | 2039 3
2041 | 2040 4
2042 | 2041 3
2043 | 2042 3
2044 | 2043 4
2045 | 2044 3
2046 | 2045 1
2047 | 2046 1
2048 | 2047 0
2049 | 2048 1
2050 | 2049 0
2051 | 2050 6
2052 | 2051 0
2053 | 2052 0
2054 | 2053 0
2055 | 2054 0
2056 | 2055 0
2057 | 2056 0
2058 | 2057 0
2059 | 2058 5
2060 | 2059 0
2061 | 2060 5
2062 | 2061 5
2063 | 2062 5
2064 | 2063 3
2065 | 2064 3
2066 | 2065 3
2067 | 2066 3
2068 | 2067 3
2069 | 2068 0
2070 | 2069 0
2071 | 2070 0
2072 | 2071 2
2073 | 2072 0
2074 | 2073 0
2075 | 2074 0
2076 | 2075 3
2077 | 2076 3
2078 | 2077 3
2079 | 2078 3
2080 | 2079 1
2081 | 2080 1
2082 | 2081 1
2083 | 2082 1
2084 | 2083 2
2085 | 2084 1
2086 | 2085 1
2087 | 2086 1
2088 | 2087 1
2089 | 2088 1
2090 | 2089 0
2091 | 2090 1
2092 | 2091 3
2093 | 2092 1
2094 | 2093 1
2095 | 2094 1
2096 | 2095 1
2097 | 2096 1
2098 | 2097 0
2099 | 2098 0
2100 | 2099 0
2101 | 2100 5
2102 | 2101 5
2103 | 2102 5
2104 | 2103 5
2105 | 2104 3
2106 | 2105 5
2107 | 2106 1
2108 | 2107 1
2109 | 2108 3
2110 | 2109 6
2111 | 2110 6
2112 | 2111 5
2113 | 2112 6
2114 | 2113 2
2115 | 2114 3
2116 | 2115 3
2117 | 2116 0
2118 | 2117 3
2119 | 2118 3
2120 | 2119 3
2121 | 2120 4
2122 | 2121 4
2123 | 2122 4
2124 | 2123 4
2125 | 2124 3
2126 | 2125 3
2127 | 2126 3
2128 | 2127 4
2129 | 2128 3
2130 | 2129 3
2131 | 2130 4
2132 | 2131 0
2133 | 2132 6
2134 | 2133 0
2135 | 2134 6
2136 | 2135 6
2137 | 2136 0
2138 | 2137 0
2139 | 2138 3
2140 | 2139 3
2141 | 2140 3
2142 | 2141 3
2143 | 2142 3
2144 | 2143 1
2145 | 2144 1
2146 | 2145 1
2147 | 2146 3
2148 | 2147 3
2149 | 2148 3
2150 | 2149 3
2151 | 2150 5
2152 | 2151 6
2153 | 2152 3
2154 | 2153 4
2155 | 2154 6
2156 | 2155 0
2157 | 2156 0
2158 | 2157 6
2159 | 2158 6
2160 | 2159 6
2161 | 2160 6
2162 | 2161 6
2163 | 2162 3
2164 | 2163 3
2165 | 2164 6
2166 | 2165 6
2167 | 2166 5
2168 | 2167 2
2169 | 2168 1
2170 | 2169 2
2171 | 2170 1
2172 | 2171 0
2173 | 2172 0
2174 | 2173 6
2175 | 2174 6
2176 | 2175 2
2177 | 2176 3
2178 | 2177 3
2179 | 2178 5
2180 | 2179 0
2181 | 2180 0
2182 | 2181 0
2183 | 2182 0
2184 | 2183 0
2185 | 2184 5
2186 | 2185 5
2187 | 2186 0
2188 | 2187 3
2189 | 2188 5
2190 | 2189 0
2191 | 2190 6
2192 | 2191 3
2193 | 2192 6
2194 | 2193 0
2195 | 2194 0
2196 | 2195 0
2197 | 2196 0
2198 | 2197 0
2199 | 2198 0
2200 | 2199 0
2201 | 2200 0
2202 | 2201 0
2203 | 2202 0
2204 | 2203 0
2205 | 2204 3
2206 | 2205 3
2207 | 2206 3
2208 | 2207 3
2209 | 2208 1
2210 | 2209 6
2211 | 2210 1
2212 | 2211 0
2213 | 2212 3
2214 | 2213 3
2215 | 2214 3
2216 | 2215 3
2217 | 2216 3
2218 | 2217 6
2219 | 2218 1
2220 | 2219 0
2221 | 2220 2
2222 | 2221 2
2223 | 2222 4
2224 | 2223 4
2225 | 2224 4
2226 | 2225 4
2227 | 2226 4
2228 | 2227 5
2229 | 2228 6
2230 | 2229 3
2231 | 2230 3
2232 | 2231 0
2233 | 2232 0
2234 | 2233 0
2235 | 2234 0
2236 | 2235 5
2237 | 2236 4
2238 | 2237 4
2239 | 2238 4
2240 | 2239 4
2241 | 2240 4
2242 | 2241 3
2243 | 2242 3
2244 | 2243 3
2245 | 2244 3
2246 | 2245 3
2247 | 2246 0
2248 | 2247 3
2249 | 2248 4
2250 | 2249 4
2251 | 2250 4
2252 | 2251 1
2253 | 2252 1
2254 | 2253 3
2255 | 2254 1
2256 | 2255 1
2257 | 2256 5
2258 | 2257 1
2259 | 2258 3
2260 | 2259 4
2261 | 2260 4
2262 | 2261 4
2263 | 2262 4
2264 | 2263 4
2265 | 2264 4
2266 | 2265 4
2267 | 2266 0
2268 | 2267 0
2269 | 2268 0
2270 | 2269 5
2271 | 2270 5
2272 | 2271 5
2273 | 2272 5
2274 | 2273 5
2275 | 2274 0
2276 | 2275 5
2277 | 2276 3
2278 | 2277 0
2279 | 2278 6
2280 | 2279 2
2281 | 2280 0
2282 | 2281 5
2283 | 2282 3
2284 | 2283 3
2285 | 2284 5
2286 | 2285 5
2287 | 2286 5
2288 | 2287 5
2289 | 2288 5
2290 | 2289 4
2291 | 2290 4
2292 | 2291 0
2293 | 2292 4
2294 | 2293 0
2295 | 2294 4
2296 | 2295 0
2297 | 2296 3
2298 | 2297 4
2299 | 2298 4
2300 | 2299 4
2301 | 2300 1
2302 | 2301 3
2303 | 2302 3
2304 | 2303 3
2305 | 2304 3
2306 | 2305 3
2307 | 2306 4
2308 | 2307 2
2309 | 2308 3
2310 | 2309 3
2311 | 2310 3
2312 | 2311 0
2313 | 2312 0
2314 | 2313 2
2315 | 2314 3
2316 | 2315 3
2317 | 2316 3
2318 | 2317 3
2319 | 2318 1
2320 | 2319 1
2321 | 2320 3
2322 | 2321 0
2323 | 2322 1
2324 | 2323 4
2325 | 2324 1
2326 | 2325 1
2327 | 2326 1
2328 | 2327 1
2329 | 2328 1
2330 | 2329 1
2331 | 2330 0
2332 | 2331 1
2333 | 2332 0
2334 | 2333 0
2335 | 2334 2
2336 | 2335 4
2337 | 2336 4
2338 | 2337 4
2339 | 2338 3
2340 | 2339 3
2341 | 2340 3
2342 | 2341 4
2343 | 2342 0
2344 | 2343 3
2345 | 2344 3
2346 | 2345 3
2347 | 2346 3
2348 | 2347 0
2349 | 2348 3
2350 | 2349 3
2351 | 2350 4
2352 | 2351 4
2353 | 2352 4
2354 | 2353 4
2355 | 2354 4
2356 | 2355 4
2357 | 2356 0
2358 | 2357 4
2359 | 2358 3
2360 | 2359 2
2361 | 2360 0
2362 | 2361 3
2363 | 2362 4
2364 | 2363 5
2365 | 2364 0
2366 | 2365 2
2367 | 2366 2
2368 | 2367 3
2369 | 2368 3
2370 | 2369 3
2371 | 2370 3
2372 | 2371 3
2373 | 2372 2
2374 | 2373 3
2375 | 2374 5
2376 | 2375 5
2377 | 2376 4
2378 | 2377 1
2379 | 2378 4
2380 | 2379 4
2381 | 2380 4
2382 | 2381 3
2383 | 2382 4
2384 | 2383 4
2385 | 2384 0
2386 | 2385 4
2387 | 2386 4
2388 | 2387 4
2389 | 2388 5
2390 | 2389 2
2391 | 2390 2
2392 | 2391 2
2393 | 2392 2
2394 | 2393 4
2395 | 2394 6
2396 | 2395 6
2397 | 2396 6
2398 | 2397 6
2399 | 2398 3
2400 | 2399 4
2401 | 2400 4
2402 | 2401 4
2403 | 2402 1
2404 | 2403 3
2405 | 2404 0
2406 | 2405 3
2407 | 2406 3
2408 | 2407 5
2409 | 2408 0
2410 | 2409 2
2411 | 2410 3
2412 | 2411 3
2413 | 2412 3
2414 | 2413 3
2415 | 2414 3
2416 | 2415 2
2417 | 2416 4
2418 | 2417 4
2419 | 2418 0
2420 | 2419 0
2421 | 2420 3
2422 | 2421 2
2423 | 2422 6
2424 | 2423 6
2425 | 2424 0
2426 | 2425 3
2427 | 2426 3
2428 | 2427 3
2429 | 2428 5
2430 | 2429 1
2431 | 2430 3
2432 | 2431 4
2433 | 2432 4
2434 | 2433 2
2435 | 2434 4
2436 | 2435 4
2437 | 2436 4
2438 | 2437 3
2439 | 2438 3
2440 | 2439 2
2441 | 2440 2
2442 | 2441 2
2443 | 2442 2
2444 | 2443 2
2445 | 2444 2
2446 | 2445 2
2447 | 2446 2
2448 | 2447 2
2449 | 2448 2
2450 | 2449 0
2451 | 2450 2
2452 | 2451 2
2453 | 2452 2
2454 | 2453 0
2455 | 2454 6
2456 | 2455 6
2457 | 2456 5
2458 | 2457 6
2459 | 2458 6
2460 | 2459 3
2461 | 2460 2
2462 | 2461 6
2463 | 2462 3
2464 | 2463 4
2465 | 2464 4
2466 | 2465 4
2467 | 2466 2
2468 | 2467 6
2469 | 2468 6
2470 | 2469 0
2471 | 2470 0
2472 | 2471 3
2473 | 2472 0
2474 | 2473 4
2475 | 2474 4
2476 | 2475 3
2477 | 2476 2
2478 | 2477 3
2479 | 2478 1
2480 | 2479 6
2481 | 2480 6
2482 | 2481 5
2483 | 2482 3
2484 | 2483 4
2485 | 2484 3
2486 | 2485 5
2487 | 2486 3
2488 | 2487 1
2489 | 2488 1
2490 | 2489 3
2491 | 2490 4
2492 | 2491 5
2493 | 2492 2
2494 | 2493 3
2495 | 2494 3
2496 | 2495 3
2497 | 2496 4
2498 | 2497 5
2499 | 2498 4
2500 | 2499 0
2501 | 2500 3
2502 | 2501 3
2503 | 2502 0
2504 | 2503 2
2505 | 2504 1
2506 | 2505 1
2507 | 2506 5
2508 | 2507 2
2509 | 2508 3
2510 | 2509 3
2511 | 2510 5
2512 | 2511 0
2513 | 2512 2
2514 | 2513 3
2515 | 2514 2
2516 | 2515 2
2517 | 2516 5
2518 | 2517 5
2519 | 2518 4
2520 | 2519 3
2521 | 2520 4
2522 | 2521 3
2523 | 2522 2
2524 | 2523 2
2525 | 2524 4
2526 | 2525 2
2527 | 2526 4
2528 | 2527 5
2529 | 2528 5
2530 | 2529 3
2531 | 2530 2
2532 | 2531 3
2533 | 2532 1
2534 | 2533 0
2535 | 2534 3
2536 | 2535 3
2537 | 2536 4
2538 | 2537 5
2539 | 2538 4
2540 | 2539 3
2541 | 2540 3
2542 | 2541 3
2543 | 2542 3
2544 | 2543 3
2545 | 2544 0
2546 | 2545 1
2547 | 2546 2
2548 | 2547 4
2549 | 2548 4
2550 | 2549 4
2551 | 2550 3
2552 | 2551 3
2553 | 2552 3
2554 | 2553 5
2555 | 2554 2
2556 | 2555 3
2557 | 2556 2
2558 | 2557 2
2559 | 2558 2
2560 | 2559 3
2561 | 2560 2
2562 | 2561 2
2563 | 2562 0
2564 | 2563 4
2565 | 2564 4
2566 | 2565 3
2567 | 2566 3
2568 | 2567 3
2569 | 2568 3
2570 | 2569 3
2571 | 2570 3
2572 | 2571 3
2573 | 2572 3
2574 | 2573 3
2575 | 2574 3
2576 | 2575 0
2577 | 2576 0
2578 | 2577 3
2579 | 2578 0
2580 | 2579 3
2581 | 2580 0
2582 | 2581 2
2583 | 2582 3
2584 | 2583 4
2585 | 2584 1
2586 | 2585 2
2587 | 2586 5
2588 | 2587 4
2589 | 2588 3
2590 | 2589 3
2591 | 2590 3
2592 | 2591 1
2593 | 2592 5
2594 | 2593 3
2595 | 2594 4
2596 | 2595 3
2597 | 2596 2
2598 | 2597 2
2599 | 2598 1
2600 | 2599 3
2601 | 2600 3
2602 | 2601 3
2603 | 2602 3
2604 | 2603 3
2605 | 2604 6
2606 | 2605 3
2607 | 2606 3
2608 | 2607 3
2609 | 2608 6
2610 | 2609 3
2611 | 2610 3
2612 | 2611 3
2613 | 2612 2
2614 | 2613 3
2615 | 2614 2
2616 | 2615 4
2617 | 2616 2
2618 | 2617 4
2619 | 2618 2
2620 | 2619 2
2621 | 2620 1
2622 | 2621 5
2623 | 2622 6
2624 | 2623 4
2625 | 2624 3
2626 | 2625 3
2627 | 2626 3
2628 | 2627 2
2629 | 2628 5
2630 | 2629 3
2631 | 2630 3
2632 | 2631 4
2633 | 2632 3
2634 | 2633 3
2635 | 2634 3
2636 | 2635 3
2637 | 2636 3
2638 | 2637 4
2639 | 2638 6
2640 | 2639 0
2641 | 2640 3
2642 | 2641 2
2643 | 2642 2
2644 | 2643 2
2645 | 2644 5
2646 | 2645 4
2647 | 2646 4
2648 | 2647 4
2649 | 2648 4
2650 | 2649 6
2651 | 2650 3
2652 | 2651 2
2653 | 2652 2
2654 | 2653 0
2655 | 2654 2
2656 | 2655 2
2657 | 2656 2
2658 | 2657 2
2659 | 2658 2
2660 | 2659 3
2661 | 2660 4
2662 | 2661 4
2663 | 2662 4
2664 | 2663 3
2665 | 2664 3
2666 | 2665 4
2667 | 2666 4
2668 | 2667 3
2669 | 2668 3
2670 | 2669 3
2671 | 2670 4
2672 | 2671 4
2673 | 2672 4
2674 | 2673 4
2675 | 2674 4
2676 | 2675 4
2677 | 2676 3
2678 | 2677 4
2679 | 2678 4
2680 | 2679 4
2681 | 2680 4
2682 | 2681 4
2683 | 2682 4
2684 | 2683 4
2685 | 2684 4
2686 | 2685 2
2687 | 2686 3
2688 | 2687 3
2689 | 2688 3
2690 | 2689 2
2691 | 2690 6
2692 | 2691 2
2693 | 2692 3
2694 | 2693 3
2695 | 2694 4
2696 | 2695 4
2697 | 2696 3
2698 | 2697 3
2699 | 2698 3
2700 | 2699 3
2701 | 2700 3
2702 | 2701 3
2703 | 2702 0
2704 | 2703 3
2705 | 2704 3
2706 | 2705 3
2707 | 2706 3
2708 | 2707 3
2709 |
--------------------------------------------------------------------------------
/emb/init:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/houchengbin/ABRW/47b21dae84e624e3f9338c1a65f8dd9ef949b139/emb/init
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | setuptools==39.1.0
2 | absl-py==0.2.2
3 | astor==0.6.2
4 | backports.weakref==1.0.post1
5 | bleach==1.5.0
6 | decorator==4.3.0
7 | funcsigs==1.0.2
8 | gast==0.2.0
9 | grpcio==1.12.1
10 | html5lib==0.9999999
11 | Markdown==2.6.11
12 | mock==2.0.0
13 | numpy==1.14.5
14 | pbr==4.0.4
15 | protobuf==3.6.0
16 | scipy==1.1.0
17 | six==1.11.0
18 | termcolor==1.1.0
19 | Werkzeug==0.15.3
20 | networkx==2.2
21 | tensorflow==1.15.0
22 | tensorboard==1.10.0
23 | gensim==3.0.1
24 | scikit-learn==0.19.0 #0.20.0 is OK but may get some warnings
25 |
--------------------------------------------------------------------------------
/src/libnrl/__init__.py:
--------------------------------------------------------------------------------
1 | from __future__ import print_function
2 | from __future__ import division
--------------------------------------------------------------------------------
/src/libnrl/aane.py:
--------------------------------------------------------------------------------
1 | """
2 | ANE method: Accelerated Attributed Network Embedding (AANE)
3 |
4 | modified by Chengbin Hou 2018
5 |
6 | originally from https://github.com/xhuang31/AANE_Python
7 | """
8 |
9 | import numpy as np
10 | from scipy import sparse
11 | from scipy.sparse import csc_matrix
12 | from scipy.sparse.linalg import svds
13 | from math import ceil
14 |
15 | class AANE:
16 | """Jointly embed Net and Attri into embedding representation H
17 | H = AANE(Net,Attri,d).function()
18 | H = AANE(Net,Attri,d,lambd,rho).function()
19 | H = AANE(Net,Attri,d,lambd,rho,maxiter).function()
20 | H = AANE(Net,Attri,d,lambd,rho,maxiter,'Att').function()
21 | H = AANE(Net,Attri,d,lambd,rho,maxiter,'Att',splitnum).function()
22 | :param Net: the weighted adjacency matrix
23 | :param Attri: the attribute information matrix, with each row denoting a node
24 | :param d: the dimension of the embedding representation
25 | :param lambd: the regularization parameter
26 | :param rho: the penalty parameter
27 | :param maxiter: the maximum number of iterations
28 | :param 'Att': perform initialization from the SVD of Attri
29 | :param splitnum: the number of pieces SA is split into when cache is limited
30 | :return: the embedding representation H
31 | Copyright 2017 & 2018, Xiao Huang and Jundong Li.
32 | $Revision: 1.0.2 $ $Date: 2018/02/19 00:00:00 $
33 | """
34 | def __init__(self, graph, dim, lambd=0.05, rho=5, maxiter=5, mode='comb', *varargs):
35 | self.dim = dim
36 | self.look_back_list = graph.look_back_list #look back node id for Net and Attr
37 | self.lambd = lambd # Initial regularization parameter
38 | self.rho = rho # Initial penalty parameter
39 | self.maxiter = maxiter # Max num of iteration
40 | splitnum = 1 # number of pieces we split the SA for limited cache
41 | if mode == 'comb':
42 | print('==============AANE-comb mode: jointly learn emb from both structure and attribute info========')
43 | Net = graph.get_adj_mat()
44 | Attri = graph.get_attr_mat()
45 | elif mode == 'pure':
46 | print('======================AANE-pure mode: learn emb purely from structure info====================')
47 | Net = graph.get_adj_mat()
48 | Attri = Net
49 | else:
50 | exit(0)
51 |
52 | [self.n, m] = Attri.shape # n = Total num of nodes, m = attribute category num
53 | Net = sparse.lil_matrix(Net)
54 | Net.setdiag(np.zeros(self.n))
55 | Net = csc_matrix(Net)
56 | Attri = csc_matrix(Attri)
57 | if len(varargs) >= 4 and varargs[3] == 'Att':
58 | sumcol = np.arange(m)
59 | np.random.shuffle(sumcol)
60 | self.H = svds(Attri[:, sumcol[0:min(10 * self.dim, m)]], self.dim)[0]
61 | else:
62 | sumcol = Net.sum(0)
63 | self.H = svds(Net[:, sorted(range(self.n), key=lambda k: sumcol[0, k], reverse=True)[0:min(10 * self.dim, self.n)]], self.dim)[0]
64 |
65 | if len(varargs) > 0:
66 | self.lambd = varargs[0]
67 | self.rho = varargs[1]
68 | if len(varargs) >= 3:
69 | self.maxiter = varargs[2]
70 | if len(varargs) >= 5:
71 | splitnum = varargs[4]
72 | self.block = min(int(ceil(float(self.n) / splitnum)), 7575) # each block holds at most 7575 nodes
73 | self.splitnum = int(ceil(float(self.n) / self.block))
74 | with np.errstate(divide='ignore'): # inf will be ignored
75 | self.Attri = Attri.transpose() * sparse.diags(np.ravel(np.power(Attri.power(2).sum(1), -0.5)))
76 | self.Z = self.H.copy()
77 | self.affi = -1 # Index for affinity matrix sa
78 | self.U = np.zeros((self.n, self.dim))
79 | self.nexidx = np.split(Net.indices, Net.indptr[1:-1])
80 | self.Net = np.split(Net.data, Net.indptr[1:-1])
81 |
82 | self.vectors = {}
83 | self.function() #run aane----------------------------
84 |
85 |
86 | '''################# Update functions #################'''
87 | def updateH(self):
88 | xtx = np.dot(self.Z.transpose(), self.Z) * 2 + self.rho * np.eye(self.dim)
89 | for blocki in range(self.splitnum): # Split nodes into different Blocks
90 | indexblock = self.block * blocki # Index for splitting blocks
91 | if self.affi != blocki:
92 | self.sa = self.Attri[:, range(indexblock, indexblock + min(self.n - indexblock, self.block))].transpose() * self.Attri
93 | self.affi = blocki
94 | sums = self.sa.dot(self.Z) * 2
95 | for i in range(indexblock, indexblock + min(self.n - indexblock, self.block)):
96 | neighbor = self.Z[self.nexidx[i], :] # the set of adjacent nodes of node i
97 | for j in range(1):
98 | normi_j = np.linalg.norm(neighbor - self.H[i, :], axis=1) # norm of h_i^k-z_j^k
99 | nzidx = normi_j != 0 # Non-equal Index
100 | if np.any(nzidx):
101 | normi_j = (self.lambd * self.Net[i][nzidx]) / normi_j[nzidx]
102 | self.H[i, :] = np.linalg.solve(xtx + normi_j.sum() * np.eye(self.dim), sums[i - indexblock, :] + (
103 | neighbor[nzidx, :] * normi_j.reshape((-1, 1))).sum(0) + self.rho * (
104 | self.Z[i, :] - self.U[i, :]))
105 | else:
106 | self.H[i, :] = np.linalg.solve(xtx, sums[i - indexblock, :] + self.rho * (
107 | self.Z[i, :] - self.U[i, :]))
108 | def updateZ(self):
109 | xtx = np.dot(self.H.transpose(), self.H) * 2 + self.rho * np.eye(self.dim)
110 | for blocki in range(self.splitnum): # Split nodes into different Blocks
111 | indexblock = self.block * blocki # Index for splitting blocks
112 | if self.affi != blocki:
113 | self.sa = self.Attri[:, range(indexblock, indexblock + min(self.n - indexblock, self.block))].transpose() * self.Attri
114 | self.affi = blocki
115 | sums = self.sa.dot(self.H) * 2
116 | for i in range(indexblock, indexblock + min(self.n - indexblock, self.block)):
117 | neighbor = self.H[self.nexidx[i], :] # the set of adjacent nodes of node i
118 | for j in range(1):
119 | normi_j = np.linalg.norm(neighbor - self.Z[i, :], axis=1) # norm of h_i^k-z_j^k
120 | nzidx = normi_j != 0 # Non-equal Index
121 | if np.any(nzidx):
122 | normi_j = (self.lambd * self.Net[i][nzidx]) / normi_j[nzidx]
123 | self.Z[i, :] = np.linalg.solve(xtx + normi_j.sum() * np.eye(self.dim), sums[i - indexblock, :] + (
124 | neighbor[nzidx, :] * normi_j.reshape((-1, 1))).sum(0) + self.rho * (
125 | self.H[i, :] + self.U[i, :]))
126 | else:
127 | self.Z[i, :] = np.linalg.solve(xtx, sums[i - indexblock, :] + self.rho * (
128 | self.H[i, :] + self.U[i, :]))
129 |
130 | def function(self):
131 | self.updateH()
132 | '''################# Iterations #################'''
133 | for i in range(self.maxiter):
134 | import time
135 | t1=time.time()
136 | self.updateZ()
137 | self.U = self.U + self.H - self.Z
138 | self.updateH()
139 | t2=time.time()
140 | print(f'iter: {i+1}/{self.maxiter}; time cost {t2-t1:0.2f}s')
141 |
142 | #-------save emb to self.vectors and return
143 | ind = 0
144 | for id in self.look_back_list:
145 | self.vectors[id] = self.H[ind]
146 | ind += 1
147 | return self.vectors
148 |
149 | def save_embeddings(self, filename):
150 | '''
151 | save embeddings to file
152 | '''
153 | fout = open(filename, 'w')
154 | node_num = len(self.vectors.keys())
155 | fout.write("{} {}\n".format(node_num, self.dim))
156 | for node, vec in self.vectors.items():
157 | fout.write("{} {}\n".format(node,' '.join([str(x) for x in vec])))
158 | fout.close()
159 |
--------------------------------------------------------------------------------
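A minimal usage sketch for the AANE class above (not a file from the repo). The class docstring keeps the original AANE call signature (Net, Attri, d, ...), but in this repo the constructor takes a graph object exposing get_adj_mat(), get_attr_mat() and look_back_list, so the sketch fakes one with a hypothetical ToyGraph stub and toy matrices; run from src/ so that libnrl is importable, and treat the output path as illustrative.

import numpy as np
from scipy.sparse import csr_matrix
from libnrl.aane import AANE

class ToyGraph:  # hypothetical stub exposing only the three members AANE actually uses
    def __init__(self, A, X, ids):
        self._A, self._X, self.look_back_list = A, X, ids
    def get_adj_mat(self):
        return self._A
    def get_attr_mat(self):
        return self._X

A = csr_matrix(np.array([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]]))  # toy 3-node adjacency
X = csr_matrix(np.array([[1., 0., 1.], [1., 0., 0.], [0., 1., 1.]]))  # toy binary attributes
g = ToyGraph(A, X, ids=['n0', 'n1', 'n2'])

model = AANE(graph=g, dim=2, lambd=0.05, rho=5, maxiter=2, mode='comb')  # ADMM runs inside __init__
print(model.vectors['n0'])                 # 2-d embedding of node 'n0'
model.save_embeddings('emb/toy_aane.emb')  # one "id dim_1 ... dim_d" line per node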
/src/libnrl/abrw.py:
--------------------------------------------------------------------------------
1 | """
2 | ANE method: Attributed Biased Random Walks;
3 |
4 | by Chengbin Hou 2018
5 | """
6 |
7 | import time
8 | import warnings
9 | warnings.filterwarnings(action='ignore', category=UserWarning, module='gensim')
10 |
11 | import numpy as np
12 | from gensim.models import Word2Vec
13 | from scipy import sparse
14 |
15 | from . import walker
16 | from .utils import pairwise_similarity, row_as_probdist
17 |
18 |
19 | class ABRW(object):
20 | def __init__(self, graph, dim, alpha, topk, number_walks, walk_length, **kwargs):
21 | self.g = graph
22 | self.dim = dim
23 | self.alpha = float(alpha)
24 | self.topk = int(topk)
25 | self.number_walks = number_walks
26 | self.walk_length = walk_length
27 |
28 | # obtain biased transition mat -----------
29 | self.T = self.get_biased_transition_mat(A=self.g.get_adj_mat(), X=self.g.get_attr_mat())
30 |
31 | # aim to generate a sequence of walks/sentences
32 | # apply weighted random walks on the reconstructed network based on biased transition mat
33 | kwargs["workers"] = kwargs.get("workers", 8)
34 | weighted_walker = walker.WeightedWalker(node_id_map=self.g.look_back_list, transition_mat=self.T, workers=kwargs["workers"]) # instance weighted walker
35 | sentences = weighted_walker.simulate_walks(num_walks=self.number_walks, walk_length=self.walk_length)
36 |
37 | # feed the walks/sentences into the Word2Vec Skip-Gram model for training node embeddings
38 | kwargs["sentences"] = sentences
39 | kwargs["size"] = self.dim
40 | kwargs["sg"] = 1 # use skip-gram; but see deepwalk which uses 'hs' = 1
41 | kwargs["window"] = kwargs.get("window", 10)
42 | kwargs["min_count"] = kwargs.get("min_count", 0) # drop words/nodes if below the min_count freq; set to 0 to get all node embs
43 | print("Learning node embeddings......")
44 | word2vec = Word2Vec(**kwargs)
45 |
46 | # save emb as a dict
47 | self.vectors = {}
48 | for word in self.g.G.nodes():
49 | self.vectors[word] = word2vec.wv[word]
50 | del word2vec
51 |
52 | def get_biased_transition_mat(self, A, X):
53 | '''
54 | given: A and X --> T_A and T_X
55 | research question: how to combine A and X in a more principled way
56 | general idea: Attribute Biased Random Walk
57 | i.e. a walker based on a mixed transition matrix P = alpha*T_A + (1-alpha)*T_X
58 | result: the ABRW transition matrix T
59 | *** open questions: 1) what if there are isolated nodes, i.e. some rows of T_A are all zeros
60 | 2) the similarity/distance metric used to obtain T_X
61 | 3) alias sampling as used in node2vec for speed-up, but this only helps
62 | if each row of P contains many 0s
63 | --> how to make each row of P a pdf while keeping it sparse
64 | '''
65 | print("obtaining biased transition matrix where each row sums up to 1.0...")
66 |
67 | preserve_zeros = False # compare them: 1) accuracy; 2) efficiency
68 | T_A = row_as_probdist(A, preserve_zeros) # norm adj/struc info mat; for isolated node, return all-zeros row or all-1/m row
69 | print('Preserve zero rows of the adj matrix: ', preserve_zeros)
70 |
71 | t1 = time.time()
72 | X_sim = pairwise_similarity(X) # attr similarity mat; X_sim is a square mat, but X is not
73 |
74 | t2 = time.time()
75 | print(f'keep the top {self.topk} attribute similar nodes w.r.t. a node')
76 | cutoff = np.partition(X_sim, -self.topk, axis=1)[:, -self.topk:].min(axis=1)
77 | X_sim[(X_sim < cutoff)] = 0 # improve both accuracy and efficiency
78 | X_sim = sparse.csr_matrix(X_sim)
79 |
80 | t3 = time.time()
81 | T_X = row_as_probdist(X_sim)
82 |
83 | t4 = time.time()
84 | print(f'attr sim cal time: {(t2-t1):.2f}s; topk sparse ops time: {(t3-t2):.2f}s; row norm time: {(t4-t3):.2f}s')
85 | del A, X, X_sim
86 |
87 | # =====================================information fusion via transition matrices========================================
88 | print('------alpha for P = alpha * T_A + (1-alpha) * T_X------: ', self.alpha)
89 | n = self.g.get_num_nodes()
90 | alp = np.array(n * [self.alpha]) # for vectorized computation
91 | alp[~np.asarray(T_A.sum(axis=1) != 0).ravel()] = 0
92 | T = sparse.diags(alp).dot(T_A) + sparse.diags(1 - alp).dot(T_X) # sparse version
93 | t5 = time.time()
94 | print(f'ABRW biased transition matrix processing time: {(t5-t4):.2f}s')
95 | return T
96 |
97 | def save_embeddings(self, filename): #to do... put it to utils;
98 | fout = open(filename, 'w') #call it while __init__ (abrw class) with flag --save-emb=True (from main.py)
99 | node_num = len(self.vectors.keys())
100 | fout.write("{} {}\n".format(node_num, self.dim))
101 | for node, vec in self.vectors.items():
102 | fout.write("{} {}\n".format(node, ' '.join([str(x) for x in vec])))
103 | fout.close()
104 |
--------------------------------------------------------------------------------
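A small dense-numpy illustration of the fusion step P = alpha*T_A + (1-alpha)*T_X used in get_biased_transition_mat above. The repo's row_as_probdist and pairwise_similarity live in utils.py, which is not shown in this section, so the sketch re-derives naive row normalization and cosine similarity for a toy 3-node graph; the top-k sparsification and the per-row alpha reset for isolated nodes are omitted.

import numpy as np

A = np.array([[0., 1., 1.],
              [1., 0., 0.],
              [1., 0., 0.]])          # toy adjacency matrix
X = np.array([[1., 0., 1.],
              [1., 0., 0.],
              [0., 1., 1.]])          # toy node-attribute matrix

def row_norm(M):                      # naive stand-in for row_as_probdist
    s = M.sum(axis=1, keepdims=True)
    s[s == 0] = 1.0                   # leave all-zero rows as zeros
    return M / s

T_A = row_norm(A)                                  # structure transition matrix
Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
T_X = row_norm(Xn @ Xn.T)                          # attribute transition matrix (cosine sim, row-normalized)
alpha = 0.8
P = alpha * T_A + (1 - alpha) * T_X                # biased transition matrix
print(np.allclose(P.sum(axis=1), 1.0))             # True: every row is a pdf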
/src/libnrl/attrcomb.py:
--------------------------------------------------------------------------------
1 | """
2 | NE method: naively combine AttrPure and DeepWalk (AttrComb)
3 |
4 | by Chengbin Hou 2018
5 | """
6 |
7 | import time
8 |
9 | import networkx as nx
10 | import numpy as np
11 |
12 | from . import node2vec
13 | from .utils import dim_reduction
14 |
15 | class ATTRCOMB(object):
16 | def __init__(self, graph, dim, comb_method='concat', comb_with='deepWalk', number_walks=10, walk_length=80, window=10, workers=8):
17 | self.g = graph
18 | self.dim = dim
19 | self.number_walks= number_walks
20 | self.walk_length = walk_length
21 | self.window = window
22 | self.workers = workers
23 |
24 | print("Learning representation...")
25 | self.vectors = {}
26 |
27 | print('attr naively combined method ', comb_method, '=====================')
28 | if comb_method == 'concat':
29 | print('comb_method == concat by default; dim/2 from attr and dim/2 from nrl.............')
30 | attr_embeddings = self.train_attr(dim=int(self.dim/2))
31 | nrl_embeddings = self.train_nrl(dim=int(self.dim/2), comb_with='deepWalk')
32 | embeddings = np.concatenate((attr_embeddings, nrl_embeddings), axis=1)
33 | print('shape of embeddings', embeddings.shape)
34 |
35 | elif comb_method == 'elementwise-mean':
36 | print('comb_method == elementwise-mean.............')
37 | attr_embeddings = self.train_attr(dim=self.dim)
38 | nrl_embeddings = self.train_nrl(dim=self.dim, comb_with='deepWalk') #we may try deepWalk, node2vec, line and etc...
39 | embeddings = np.add(attr_embeddings, nrl_embeddings)/2.0
40 | print('shape of embeddings', embeddings.shape)
41 |
42 | elif comb_method == 'elementwise-max':
43 | print('comb_method == elementwise-max.............')
44 | attr_embeddings = self.train_attr(dim=self.dim)
45 | nrl_embeddings = self.train_nrl(dim=self.dim, comb_with='deepWalk') #we may try deepWalk, node2vec, line and etc...
46 | embeddings = np.zeros(shape=(attr_embeddings.shape[0],attr_embeddings.shape[1]))
47 | for i in range(attr_embeddings.shape[0]): #size(attr_embeddings) = size(nrl_embeddings)
48 | for j in range(attr_embeddings.shape[1]):
49 | if attr_embeddings[i][j] > nrl_embeddings[i][j]:
50 | embeddings[i][j] = attr_embeddings[i][j]
51 | else:
52 | embeddings[i][j] = nrl_embeddings[i][j]
53 | print('shape of embeddings', embeddings.shape)
54 |
55 | else:
56 | print('error, no comb_method was found....')
57 | exit(0)
58 |
59 | for key, ind in self.g.look_up_dict.items():
60 | self.vectors[key] = embeddings[ind]
61 |
62 |
63 | def train_attr(self, dim):
64 | X = self.g.get_attr_mat()
65 | X_compressed = dim_reduction(X, dim=dim, method='svd') #svd or pca for dim reduction
66 | print('X_compressed shape: ', X_compressed.shape)
67 | return np.array(X_compressed) #n*dim matrix, each row corresponding to node ID stored in graph.look_back_list
68 |
69 |
70 | def train_nrl(self, dim, comb_with):
71 | print('attr naively combined with ', comb_with, '=====================')
72 | if comb_with == 'deepWalk':
73 | model = node2vec.Node2vec(graph=self.g, dim=dim, path_length=self.walk_length, #do not use self.dim here
74 | num_paths=self.number_walks, workers=self.workers, window=self.window, dw=True)
75 | nrl_embeddings = []
76 | for key in self.g.look_back_list:
77 | nrl_embeddings.append(model.vectors[key])
78 | return np.array(nrl_embeddings)
79 |
80 | elif comb_with == 'node2vec': #to do... the parameters
81 | model = node2vec.Node2vec(graph=self.g, path_length=80, num_paths=self.number_walks,
82 | dim=dim, workers=4, p=0.8, q=0.8, window=10)
83 | nrl_embeddings = []
84 | for key in self.g.look_back_list:
85 | nrl_embeddings.append(model.vectors[key])
86 | return np.array(nrl_embeddings)
87 |
88 | else:
89 | print('error, no comb_with was found....')
90 | print('to do.... line, grarep, and etc...')
91 | exit(0)
92 |
93 | def save_embeddings(self, filename):
94 | fout = open(filename, 'w')
95 | node_num = len(self.vectors.keys())
96 | fout.write("{} {}\n".format(node_num, self.dim))
97 | for node, vec in self.vectors.items():
98 | fout.write("{} {}\n".format(node,
99 | ' '.join([str(x) for x in vec])))
100 | fout.close()
101 |
--------------------------------------------------------------------------------
/src/libnrl/attrpure.py:
--------------------------------------------------------------------------------
1 | """
2 | NE method: use only attribute information (AttrPure)
3 |
4 | by Chengbin Hou 2018
5 | """
6 |
7 | import time
8 |
9 | import networkx as nx
10 | import numpy as np
11 |
12 | from .utils import dim_reduction
13 |
14 | class ATTRPURE(object):
15 |
16 | def __init__(self, graph, dim, mode):
17 | self.g = graph
18 | self.dim = dim
19 | self.mode = mode
20 |
21 | print("Learning representation...")
22 | self.vectors = {}
23 | embeddings = self.train()
24 | for key, ind in self.g.look_up_dict.items():
25 | self.vectors[key] = embeddings[ind]
26 |
27 | def train(self):
28 | X = self.g.get_attr_mat().todense()
29 | X_compressed = None
30 | if self.mode == 'pca':
31 | X_compressed = dim_reduction(X, dim=self.dim, method='pca')
32 | elif self.mode == 'svd':
33 | X_compressed = dim_reduction(X, dim=self.dim, method='svd')
34 | else:
35 | print('unknown dim reduction technique...')
36 | return X_compressed
37 |
38 |
39 | def save_embeddings(self, filename):
40 | fout = open(filename, 'w')
41 | node_num = len(self.vectors.keys())
42 | fout.write("{} {}\n".format(node_num, self.dim))
43 | for node, vec in self.vectors.items():
44 | fout.write("{} {}\n".format(node,
45 | ' '.join([str(x) for x in vec])))
46 | fout.close()
47 |
--------------------------------------------------------------------------------
/src/libnrl/classify.py:
--------------------------------------------------------------------------------
1 | """
2 | modified by Chengbin Hou 2018
3 |
4 | originally from https://github.com/thunlp/OpenNE
5 | """
6 |
7 | import numpy as np
8 | import math
9 | import random
10 | import networkx as nx
11 | import warnings
12 | warnings.filterwarnings(action='ignore', category=UserWarning, module='sklearn')
13 | from sklearn.multiclass import OneVsRestClassifier
14 | from sklearn.metrics import f1_score, accuracy_score, roc_auc_score, classification_report, roc_curve, auc
15 | from sklearn.preprocessing import MultiLabelBinarizer
16 |
17 | # node classification classifier
18 | class ncClassifier(object):
19 |
20 | def __init__(self, vectors, clf):
21 | self.embeddings = vectors
22 | self.clf = TopKRanker(clf) #here clf is LR
23 | self.binarizer = MultiLabelBinarizer(sparse_output=True)
24 |
25 |     def split_train_evaluate(self, X, Y, train_percent, seed=0):
26 | state = np.random.get_state()
27 |         training_size = int(train_percent * len(X))
28 | #np.random.seed(seed)
29 | shuffle_indices = np.random.permutation(np.arange(len(X)))
30 | X_train = [X[shuffle_indices[i]] for i in range(training_size)]
31 | Y_train = [Y[shuffle_indices[i]] for i in range(training_size)]
32 | X_test = [X[shuffle_indices[i]] for i in range(training_size, len(X))]
33 | Y_test = [Y[shuffle_indices[i]] for i in range(training_size, len(X))]
34 |
35 | self.train(X_train, Y_train, Y)
36 |         np.random.set_state(state)  #restore the RNG state saved above, so the shuffle here does not affect later randomness
37 | return self.evaluate(X_test, Y_test)
38 |
39 | def train(self, X, Y, Y_all):
40 | self.binarizer.fit(Y_all) #to support multi-labels, fit means dict mapping {orig cat: binarized vec}
41 | X_train = [self.embeddings[x] for x in X]
42 | Y = self.binarizer.transform(Y) #since we have use Y_all fitted, then we simply transform
43 | self.clf.fit(X_train, Y)
44 |
45 | def predict(self, X, top_k_list):
46 | X_ = np.asarray([self.embeddings[x] for x in X])
47 | # see TopKRanker(OneVsRestClassifier)
48 | Y = self.clf.predict(X_, top_k_list=top_k_list) # the top k probs to be output...
49 | return Y
50 |
51 | def evaluate(self, X, Y):
52 | top_k_list = [len(l) for l in Y] #multi-labels, diff len of labels of each node
53 | Y_ = self.predict(X, top_k_list) #pred val of X_test i.e. Y_pred
54 | Y = self.binarizer.transform(Y) #true val i.e. Y_test
55 | averages = ["micro", "macro", "samples", "weighted"]
56 | results = {}
57 | for average in averages:
58 | results[average] = f1_score(Y, Y_, average=average)
59 | # print('Results, using embeddings of dimensionality', len(self.embeddings[X[0]]))
60 | print(results)
61 | return results
62 |
63 | class TopKRanker(OneVsRestClassifier): #original LR or SVM is for binary clf; OvR extends it to multi-label
64 | def predict(self, X, top_k_list): #re-define predict func of OneVsRestClassifier
65 | probs = np.asarray(super(TopKRanker, self).predict_proba(X))
66 | all_labels = []
67 | for i, k in enumerate(top_k_list):
68 | probs_ = probs[i, :]
69 | labels = self.classes_[probs_.argsort()[-k:]].tolist() #denote labels
70 | probs_[:] = 0 #reset probs_ to all 0
71 | probs_[labels] = 1 #reset probs_ to 1 if labels denoted...
72 | all_labels.append(probs_)
73 | return np.asarray(all_labels)
74 |
75 | '''
76 | #note: UndefinedMetricWarning: F-score is ill-defined and being set to 0.0 in samples with no true labels
77 | #see: https://stackoverflow.com/questions/43162506/undefinedmetricwarning-f-score-is-ill-defined-and-being-set-to-0-0-in-labels-wi
78 | '''
79 |
80 | '''
81 | import matplotlib.pyplot as plt
82 | def plt_roc(y_test, y_score):
83 | """
84 | calculate AUC value and plot the ROC curve
85 | """
86 | fpr, tpr, threshold = roc_curve(y_test, y_score)
87 | roc_auc = auc(fpr, tpr)
88 | plt.figure()
89 | plt.stackplot(fpr, tpr, color='steelblue', alpha = 0.5, edgecolor = 'black')
90 | plt.plot(fpr, tpr, color='black', lw = 1)
91 | plt.plot([0,1],[0,1], color = 'red', linestyle = '--')
92 | plt.text(0.5,0.3,'ROC curve (area = %0.3f)' % roc_auc)
93 | plt.xlabel('False Positive Rate')
94 | plt.ylabel('True Positive Rate')
95 | plt.show()
96 | return roc_auc
97 | '''
98 |
99 | # link prediction binary classifier
100 | class lpClassifier(object):
101 |
102 | def __init__(self, vectors):
103 | self.embeddings = vectors
104 |
105 | def evaluate(self, X_test, Y_test, seed=0): #clf here is simply a similarity/distance metric
106 | state = np.random.get_state()
107 | #np.random.seed(seed)
108 | test_size = len(X_test)
109 | #shuffle_indices = np.random.permutation(np.arange(test_size))
110 | #X_test = [X_test[shuffle_indices[i]] for i in range(test_size)]
111 | #Y_test = [Y_test[shuffle_indices[i]] for i in range(test_size)]
112 |
113 | Y_true = [int(i) for i in Y_test]
114 | Y_probs = []
115 | for i in range(test_size):
116 | start_node_emb = np.array(self.embeddings[X_test[i][0]]).reshape(-1,1)
117 | end_node_emb = np.array(self.embeddings[X_test[i][1]]).reshape(-1,1)
118 | score = cosine_similarity(start_node_emb, end_node_emb) #ranging from [-1, +1]
119 |             Y_probs.append( (score+1)/2.0 ) #map score from [-1,+1] to a probability; we may also use y_score = score directly
120 |                                             #in sklearn's roc_auc_score, which yields the same result
121 | roc = roc_auc_score(y_true = Y_true, y_score = Y_probs)
122 | if roc < 0.5:
123 | roc = 1.0 - roc #since lp is binary clf task, just predict the opposite if<0.5
124 | print("roc=", "{:.9f}".format(roc))
125 | #plt_roc(Y_true, Y_probs) #enable to plot roc curve and return auc value
126 |
127 | def norm(a):
128 | sum = 0.0
129 | for i in range(len(a)):
130 | sum = sum + a[i] * a[i]
131 | return math.sqrt(sum)
132 |
133 | def cosine_similarity(a, b):
134 | sum = 0.0
135 | for i in range(len(a)):
136 | sum = sum + a[i] * b[i]
137 | #return sum/(norm(a) * norm(b))
138 | return sum/(norm(a) * norm(b) + 1e-20) #fix numerical issue 1e-20 almost = 0!
139 |
140 | '''
141 | #cosine_similarity implemented by ourselves above...
142 | #or try sklearn....
143 | from sklearn.metrics.pairwise import linear_kernel, cosine_similarity, cosine_distances, euclidean_distances # we may try diff metrics
144 | #ref http://scikit-learn.org/stable/modules/classes.html#module-sklearn.metrics.pairwise
145 | '''
146 |
147 | def lp_train_test_split(graph, ratio=0.5, neg_pos_link_ratio=1.0, test_pos_links_ratio=0.1):
148 | #randomly split links/edges into training set and testing set
149 | #*** note: we do not assume every node must be connected after removing links
150 |     #*** hence, the resulting graph might have a few isolated nodes --> a more realistic scenario
151 |     #*** e.g. a user who has just signed up on a website has no links to others yet
152 |
153 |     #graph: OpenANE graph data structure
154 |     #ratio: percentage of links reserved for training; ranging [0, 1]
155 |     #neg_pos_link_ratio: 1.0 means neg-links/pos-links = 1.0 i.e. the balanced case; ranging [0, +inf)
156 | g = graph
157 | test_pos_links = int(nx.number_of_edges(g.G) * test_pos_links_ratio)
158 |
159 | print("test_pos_links_ratio {:.2f}, test_pos_links {:.2f}, neg_pos_link_ratio is {:.2f}, links for training {:.2f}%,".format(test_pos_links_ratio, test_pos_links, neg_pos_link_ratio, ratio*100))
160 | test_pos_sample = []
161 | test_neg_sample = []
162 |
163 | #random.seed(2018) #generate testing set that contains both pos and neg samples
164 | test_pos_sample = random.sample(g.G.edges(), test_pos_links)
165 | #test_neg_sample = random.sample(list(nx.classes.function.non_edges(g.G)), int(test_size * neg_pos_link_ratio)) #using nx build-in func, not efficient, to do...
166 | #more efficient way:
167 | test_neg_sample = []
168 | num_neg_sample = int(test_pos_links * neg_pos_link_ratio)
169 | num = 0
170 | while num < num_neg_sample:
171 | pair_nodes = np.random.choice(g.look_back_list, size=2, replace=False)
172 | if pair_nodes not in g.G.edges():
173 | num += 1
174 | test_neg_sample.append(list(pair_nodes))
175 |
176 | test_edge_pair = test_pos_sample + test_neg_sample
177 | test_edge_label = list(np.ones(len(test_pos_sample))) + list(np.zeros(len(test_neg_sample)))
178 |
179 | print('before removing, the # of links: ', nx.number_of_edges(g.G), '; the # of single nodes: ', g.numSingleNodes())
180 | g.G.remove_edges_from(test_pos_sample) #training set should NOT contain testing set i.e. delete testing pos samples
181 | g.simulate_sparsely_linked_net(link_reserved = ratio) #simulate sparse net
182 | print('after removing, the # of links: ', nx.number_of_edges(g.G), '; the # of single nodes: ', g.numSingleNodes())
183 | print("# training links {0}; # positive testing links {1}; # negative testing links {2},".format(nx.number_of_edges(g.G), len(test_pos_sample), len(test_neg_sample)))
184 | return g.G, test_edge_pair, test_edge_label
185 |
186 | #---------------------------------utils for downstream tasks--------------------------------
187 | def load_embeddings(filename):
188 | fin = open(filename, 'r')
189 | node_num, size = [int(x) for x in fin.readline().strip().split()]
190 | vectors = {}
191 | while 1:
192 | l = fin.readline()
193 | if l == '':
194 | break
195 | vec = l.strip().split(' ')
196 | assert len(vec) == size+1
197 | vectors[vec[0]] = [float(x) for x in vec[1:]]
198 | fin.close()
199 | assert len(vectors) == node_num
200 | return vectors
201 |
202 | def read_node_label(filename):
203 | fin = open(filename, 'r')
204 | X = []
205 | Y = []
206 | while 1:
207 | l = fin.readline()
208 | if l == '':
209 | break
210 | vec = l.strip().split(' ')
211 | X.append(vec[0])
212 | Y.append(vec[1:])
213 | fin.close()
214 | return X, Y
215 |
216 |
217 | def read_edge_label(filename):
218 | fin = open(filename, 'r')
219 | X = []
220 | Y = []
221 | while 1:
222 | l = fin.readline()
223 | if l == '':
224 | break
225 | vec = l.strip().split(' ')
226 | X.append(vec[:2])
227 | Y.append(vec[2])
228 | fin.close()
229 | return X, Y
230 |
--------------------------------------------------------------------------------
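ncClassifier above follows the usual NE evaluation protocol: a one-vs-rest classifier is trained on a fraction of the nodes, and for each test node the top-k predicted labels are kept, where k is that node's true label count. A small self-contained sketch of how it can be driven, with toy embeddings and labels made up for illustration (assumes running from src/ so that libnrl is importable):

import numpy as np
from sklearn.linear_model import LogisticRegression
from libnrl.classify import ncClassifier

# toy data: 20 nodes with 2-d embeddings that are roughly separable by label
vectors, X, Y = {}, [], []
for i in range(20):
    label = '0' if i < 10 else '1'
    center = np.array([1.0, 1.0]) if label == '0' else np.array([-1.0, -1.0])
    vectors[str(i)] = center + 0.1 * np.random.randn(2)  # embedding of node str(i)
    X.append(str(i))           # node ids
    Y.append([label])          # labels are lists, matching the multi-label format

clf = ncClassifier(vectors=vectors, clf=LogisticRegression())
clf.split_train_evaluate(X, Y, 0.7)   # 70% train / 30% test; prints micro/macro/samples/weighted F1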
/src/libnrl/graph.py:
--------------------------------------------------------------------------------
1 | """
2 | commonly used graph APIs based on NetworkX;
3 | use g.xxx to access the commonly used APIs offered by us;
4 | use g.G.xxx to access NetworkX APIs;
5 |
6 | by Chengbin Hou 2018
7 | """
8 |
9 | import time
10 | import random
11 | import numpy as np
12 | import scipy.sparse as sp
13 | import networkx as nx
14 |
15 | class Graph(object):
16 | def __init__(self):
17 | self.G = None #to access NetworkX graph data structure
18 | self.look_up_dict = {} #use node ID to find index via g.look_up_dict['0']
19 | self.look_back_list = [] #use index to find node ID via g.look_back_list[0]
20 |
21 | #--------------------------------------------------------------------------------------
22 | #--------------------commonly used APIs that will modify graph-------------------------
23 | #--------------------------------------------------------------------------------------
24 | def node_mapping(self):
25 | """ node id and index mapping; \n
26 | based on the order given by networkx G.nodes(); \n
27 | NB: updating is needed if any node is added/removed; \n
28 | """
29 | i = 0 #node index
30 | self.look_up_dict = {} #init
31 | self.look_back_list = [] #init
32 | for node_id in self.G.nodes(): #node id
33 | self.look_up_dict[node_id] = i
34 | self.look_back_list.append(node_id)
35 | i += 1
36 |
37 | def read_adjlist(self, path, directed=False):
38 | """ read adjacency list format graph; \n
39 | support unweighted and (un)directed graph; \n
40 | format: see https://networkx.github.io/documentation/stable/reference/readwrite/adjlist.html \n
41 |         NB: does not support weighted graphs \n
42 | """
43 | if directed:
44 | self.G = nx.read_adjlist(path, create_using=nx.DiGraph())
45 | else:
46 | self.G = nx.read_adjlist(path, create_using=nx.Graph())
47 | self.node_mapping() #update node id index mapping
48 |
49 | def read_edgelist(self, path, weighted=False, directed=False):
50 | """ read edge list format graph; \n
51 | support (un)weighted and (un)directed graph; \n
52 | format: see https://networkx.github.io/documentation/stable/reference/readwrite/edgelist.html \n
53 | """
54 | if directed:
55 | self.G = nx.read_edgelist(path, create_using=nx.DiGraph())
56 | else:
57 | self.G = nx.read_edgelist(path, create_using=nx.Graph())
58 | self.node_mapping() #update node id index mapping
59 |
60 | def add_edge_weight(self, equal_weight=1.0):
61 | ''' add weights to networkx graph; \n
62 | currently only support adding 1.0 to all existing edges; \n
63 |         some NE methods may require the 'weight' attribute specified in the networkx graph; \n
64 | to do... support user-specified weights e.g. from file (similar to read_node_attr): node_id1 node_id2 weight \n
65 | https://networkx.github.io/documentation/stable/reference/generated/networkx.classes.function.set_edge_attributes.html#networkx.classes.function.set_edge_attributes
66 | '''
67 | nx.set_edge_attributes(self.G, equal_weight, 'weight') #check the url and use dict to assign diff weights to diff edges
68 |
69 | def read_node_attr(self, path):
70 | """ read node attributes and store as NetworkX graph {'node_id': {'attr': values}} \n
71 | input file format: node_id1 attr1 attr2 ... attrM \n
72 | node_id2 attr1 attr2 ... attrM \n
73 | """
74 | with open(path, 'r') as fin:
75 | for l in fin.readlines():
76 | vec = l.split()
77 | self.G.nodes[vec[0]]['attr'] = np.array([float(x) for x in vec[1:]])
78 |
79 | def read_node_label(self, path):
80 | """ todo... read node labels and store as NetworkX graph {'node_id': {'label': values}} \n
81 | input file format: node_id1 labels \n
82 | node_id2 labels \n
83 | with open(path, 'r') as fin: \n
84 | for l in fin.readlines(): \n
85 | vec = l.split() \n
86 | self.G.nodes[vec[0]]['label'] = np.array([float(x) for x in vec[1:]]) \n
87 | """
88 | pass #to do...
89 |
90 | def remove_edge(self, ratio=0.0):
91 | """ randomly remove edges/links \n
92 | ratio: the percentage of edges to be removed \n
93 | edges_removed: return removed edges, each of which is a pair of nodes \n
94 | """
95 | num_edges_removed = int( ratio * self.G.number_of_edges() )
96 | #random.seed(2018)
97 | edges_removed = random.sample(self.G.edges(), int(num_edges_removed))
98 | print('before removing, the # of edges: ', self.G.number_of_edges())
99 | self.G.remove_edges_from(edges_removed)
100 | print('after removing, the # of edges: ', self.G.number_of_edges())
101 | return edges_removed
102 |
103 | def remove_node_attr(self, ratio):
104 | """ todo... randomly remove node attributes; \n
105 | """
106 | pass #to do...
107 |
108 | def remove_node(self, ratio):
109 | """ todo... randomly remove nodes; \n
110 | #self.node_mapping() #update node id index mapping is needed \n
111 | """
112 | pass #to do...
113 |
114 | #------------------------------------------------------------------------------------------
115 | #--------------------commonly used APIs that will not modify graph-------------------------
116 | #------------------------------------------------------------------------------------------
117 | def get_adj_mat(self, is_sparse=True):
118 | """ return adjacency matrix; \n
119 | use 'csr' format for sparse matrix \n
120 | """
121 | if is_sparse:
122 | return nx.to_scipy_sparse_matrix(self.G, nodelist=self.look_back_list, format='csr', dtype='float64')
123 | else:
124 | return nx.to_numpy_matrix(self.G, nodelist=self.look_back_list, dtype='float64')
125 |
126 | def get_attr_mat(self, is_sparse=True):
127 | """ return attribute matrix; \n
128 | use 'csr' format for sparse matrix \n
129 | """
130 | attr_dense_narray = np.vstack([self.G.nodes[self.look_back_list[i]]['attr'] for i in range(self.get_num_nodes())])
131 | if is_sparse:
132 | return sp.csr_matrix(attr_dense_narray, dtype='float64')
133 | else:
134 | return np.matrix(attr_dense_narray, dtype='float64')
135 |
136 | def get_num_nodes(self):
137 | """ return the number of nodes """
138 | return nx.number_of_nodes(self.G)
139 |
140 | def get_num_edges(self):
141 | """ return the number of edges """
142 | return nx.number_of_edges(self.G)
143 |
144 | def get_density(self):
145 | """ return the density of a graph """
146 | return nx.density(self.G)
147 |
148 | def get_num_isolates(self):
149 | """ return the number of isolated nodes """
150 | return len(list(nx.isolates(self.G)))
151 |
152 | def get_isdirected(self):
153 | """ return True if it is directed graph """
154 | return nx.is_directed(self.G)
155 |
156 | def get_isweighted(self):
157 | """ return True if it is weighted graph """
158 | return nx.is_weighted(self.G)
159 |
160 | def get_neighbors(self, node):
161 | """ return neighbors connected to a node """
162 | return list(nx.neighbors(self.G, node))
163 |
164 | def get_common_neighbors(self, node1, node2):
165 | """ return common neighbors of two nodes """
166 | return list(nx.common_neighbors(self.G, node1, node2))
167 |
168 | def get_centrality(self, centrality_type='degree'):
169 | """ todo... return specified type of centrality \n
170 | see https://networkx.github.io/documentation/stable/reference/algorithms/centrality.html \n
171 | """
172 | pass #to do...
173 |
174 |
--------------------------------------------------------------------------------
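graph.py above is a thin wrapper: the helper APIs are reached as g.xxx and the underlying NetworkX graph as g.G. A short sketch of the intended loading sequence, assuming the bundled Cora files and the same layout main.py relies on (run from the repository root with src/ on the path):

from libnrl.graph import Graph

g = Graph()
g.read_adjlist(path='data/cora/cora_adjlist.txt', directed=False)  # structure
g.read_node_attr('data/cora/cora_attr.txt')                        # node attributes
print(g.get_num_nodes(), g.get_num_edges(), round(g.get_density(), 4))

A = g.get_adj_mat()    # scipy csr adjacency matrix, rows ordered by g.look_back_list
X = g.get_attr_mat()   # scipy csr attribute matrix, same row order
nbrs = g.get_neighbors(g.look_back_list[0])   # helper on top of nx.neighbors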
/src/libnrl/node2vec.py:
--------------------------------------------------------------------------------
1 | """
2 | NE method: DeepWalk and Node2Vec
3 |
4 | modified by Chengbin Hou 2018
5 |
6 | originally from https://github.com/thunlp/OpenNE/blob/master/src/openne/node2vec.py
7 | """
8 |
9 | import time
10 | import warnings
11 | warnings.filterwarnings(action='ignore', category=UserWarning, module='gensim')
12 | from gensim.models import Word2Vec
13 | from . import walker
14 |
15 |
16 | class Node2vec(object):
17 | def __init__(self, graph, path_length, num_paths, dim, p=1.0, q=1.0, dw=False, **kwargs):
18 | kwargs["workers"] = kwargs.get("workers", 1)
19 | if dw:
20 | kwargs["hs"] = 1
21 | p = 1.0
22 | q = 1.0
23 |
24 | self.graph = graph
25 | if dw:
26 | self.walker = walker.BasicWalker(graph, workers=kwargs["workers"]) #walker for deepwalk
27 | else:
28 | self.walker = walker.Walker(graph, p=p, q=q, workers=kwargs["workers"]) #walker for node2vec
29 | print("Preprocess transition probs...")
30 | self.walker.preprocess_transition_probs()
31 | sentences = self.walker.simulate_walks(num_walks=num_paths, walk_length=path_length)
32 | kwargs["sentences"] = sentences
33 | kwargs["min_count"] = kwargs.get("min_count", 0)
34 | kwargs["size"] = kwargs.get("size", dim)
35 | kwargs["sg"] = 1
36 |
37 | self.size = kwargs["size"]
38 | print("Learning representation...")
39 | word2vec = Word2Vec(**kwargs)
40 | self.vectors = {}
41 | for word in graph.G.nodes():
42 | self.vectors[word] = word2vec.wv[word]
43 | del word2vec
44 |
45 | def save_embeddings(self, filename):
46 | fout = open(filename, 'w')
47 | node_num = len(self.vectors.keys())
48 | fout.write("{} {}\n".format(node_num, self.size))
49 | for node, vec in self.vectors.items():
50 | fout.write("{} {}\n".format(node,
51 | ' '.join([str(x) for x in vec])))
52 | fout.close()
53 |
--------------------------------------------------------------------------------
/src/libnrl/tadw.py:
--------------------------------------------------------------------------------
1 | """
2 | ANE method: Text Associated DeepWalk (TADW)
3 |
4 | modified by Chengbin Hou 2018
5 |
6 | originally from https://github.com/thunlp/OpenNE/blob/master/src/openne/tadw.py
7 | the main diff: adapt to our graph.py APIs
8 | to do... sparse computation and remove unnecessary self vars;
9 | otherwise, not scalable to large network;
10 | """
11 |
12 | import math
13 |
14 | import numpy as np
15 | from numpy import linalg as la
16 | from sklearn.preprocessing import normalize
17 |
18 | from .utils import row_as_probdist
19 |
20 | class TADW(object):
21 |
22 | def __init__(self, graph, dim, lamb=0.2, maxiter=10):
23 | self.g = graph
24 | self.lamb = lamb
25 | self.dim = dim
26 | self.maxiter = maxiter
27 | self.train()
28 |
29 | def getAdj(self):
30 | '''
31 | graph = self.g.G
32 | node_size = self.g.node_size
33 | look_up = self.g.look_up_dict
34 | adj = np.zeros((node_size, node_size))
35 | for edge in self.g.G.edges():
36 | adj[look_up[edge[0]]][look_up[edge[1]]] = 1.0
37 | adj[look_up[edge[1]]][look_up[edge[0]]] = 1.0
38 | # ScaleSimMat
39 | return adj/np.sum(adj, axis=1) #original may get numerical error sometimes...
40 | '''
41 |         A = self.g.get_adj_mat()  #by default, returns a sparse matrix
42 | return np.array(row_as_probdist(A, dense_output=True, preserve_zeros=True)) #only support np.array, otherwise dim error...
43 |
44 |
45 | def getT(self):
46 | g = self.g.G
47 | look_back = self.g.look_back_list
48 | self.features = np.vstack([g.nodes[look_back[i]]['attr']
49 | for i in range(g.number_of_nodes())])
50 | #self.features = self.g.get_attr_mat().todense()
51 | self.preprocessFeature()
52 | return self.features.T
53 |
54 | def preprocessFeature(self):
55 | if self.features.shape[1] > 200:
56 | U, S, VT = la.svd(self.features)
57 | Ud = U[:, 0:200]
58 | Sd = S[0:200]
59 | self.features = np.array(Ud)*Sd.reshape(200)
60 | #from .utils import dim_reduction
61 | #self.features = dim_reduction(self.features, dim=200, method='svd')
62 |
63 | def train(self):
64 | self.adj = self.getAdj()
65 | # M=(A+A^2)/2 where A is the row-normalized adjacency matrix
66 | self.M = (self.adj + np.dot(self.adj, self.adj))/2
67 | # T is feature_size*node_num, text features
68 | self.T = self.getT() #transpose of self.features
69 | self.node_size = self.adj.shape[0]
70 | self.feature_size = self.features.shape[1]
71 | self.W = np.random.randn(self.dim, self.node_size)
72 | self.H = np.random.randn(self.dim, self.feature_size)
73 | # Update
74 |
75 | import time
76 | for i in range(self.maxiter):
77 | t1=time.time()
78 | # Update W
79 | B = np.dot(self.H, self.T)
80 | drv = 2 * np.dot(np.dot(B, B.T), self.W) - \
81 | 2*np.dot(B, self.M.T) + self.lamb*self.W
82 | Hess = 2*np.dot(B, B.T) + self.lamb*np.eye(self.dim)
83 | drv = np.reshape(drv, [self.dim*self.node_size, 1])
84 | rt = -drv
85 | dt = rt
86 | vecW = np.reshape(self.W, [self.dim*self.node_size, 1])
87 | while np.linalg.norm(rt, 2) > 1e-4:
88 | dtS = np.reshape(dt, (self.dim, self.node_size))
89 | Hdt = np.reshape(np.dot(Hess, dtS), [self.dim*self.node_size, 1])
90 |
91 | at = np.dot(rt.T, rt)/np.dot(dt.T, Hdt)
92 | vecW = vecW + at*dt
93 | rtmp = rt
94 | rt = rt - at*Hdt
95 | bt = np.dot(rt.T, rt)/np.dot(rtmp.T, rtmp)
96 | dt = rt + bt * dt
97 | self.W = np.reshape(vecW, (self.dim, self.node_size))
98 |
99 | # Update H
100 | drv = np.dot((np.dot(np.dot(np.dot(self.W, self.W.T),self.H),self.T)
101 | - np.dot(self.W, self.M.T)), self.T.T) + self.lamb*self.H
102 | drv = np.reshape(drv, (self.dim*self.feature_size, 1))
103 | rt = -drv
104 | dt = rt
105 | vecH = np.reshape(self.H, (self.dim*self.feature_size, 1))
106 | while np.linalg.norm(rt, 2) > 1e-4:
107 | dtS = np.reshape(dt, (self.dim, self.feature_size))
108 | Hdt = np.reshape(np.dot(np.dot(np.dot(self.W, self.W.T), dtS), np.dot(self.T, self.T.T))
109 | + self.lamb*dtS, (self.dim*self.feature_size, 1))
110 | at = np.dot(rt.T, rt)/np.dot(dt.T, Hdt)
111 | vecH = vecH + at*dt
112 | rtmp = rt
113 | rt = rt - at*Hdt
114 | bt = np.dot(rt.T, rt)/np.dot(rtmp.T, rtmp)
115 | dt = rt + bt * dt
116 | self.H = np.reshape(vecH, (self.dim, self.feature_size))
117 | t2=time.time()
118 | print(f'iter: {i+1}/{self.maxiter}; time cost {t2-t1:0.2f}s')
119 |
120 | self.Vecs = np.hstack((normalize(self.W.T), normalize(np.dot(self.T.T, self.H.T))))
121 | # get embeddings
122 | self.vectors = {}
123 | look_back = self.g.look_back_list
124 | for i, embedding in enumerate(self.Vecs):
125 | self.vectors[look_back[i]] = embedding
126 |
127 | def save_embeddings(self, filename):
128 | fout = open(filename, 'w')
129 | node_num = len(self.vectors.keys())
130 | fout.write("{} {}\n".format(node_num, self.dim))
131 | for node, vec in self.vectors.items():
132 | fout.write("{} {}\n".format(node,' '.join([str(x) for x in vec])))
133 | fout.close()
134 |
--------------------------------------------------------------------------------
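tadw.py above alternates conjugate-gradient updates of W and H for the TADW factorization M ≈ W^T H T, where M = (A + A^2)/2 with A the row-normalized adjacency matrix and T the transposed, SVD-compressed attribute matrix; the final embedding concatenates the L2-normalized rows of W^T and (H T)^T. A rough sketch of training and inspecting the reconstruction, assuming the bundled Cora files and src/ on the path:

import numpy as np
from libnrl.graph import Graph
from libnrl.tadw import TADW

g = Graph()
g.read_adjlist(path='data/cora/cora_adjlist.txt', directed=False)
g.read_node_attr('data/cora/cora_attr.txt')

model = TADW(graph=g, dim=80, lamb=0.2, maxiter=10)   # training happens inside __init__
M_hat = model.W.T.dot(model.H).dot(model.T)           # reconstruction of M = (A + A^2)/2
print('Frobenius reconstruction error:', np.linalg.norm(model.M - M_hat, 'fro'))
emb = model.vectors[g.look_back_list[0]]              # 2*dim-dimensional embedding of the first node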
/src/libnrl/utils.py:
--------------------------------------------------------------------------------
1 | """
2 | commonly used utilities for NE methods:
3 | row normalization (transition matrix), pairwise similarity, dimensionality reduction;
4 | I/O helpers for embeddings, labels, and link-prediction samples; etc.
5 |
6 | by Chengbin Hou & Zeyu Dong 2018
7 | """
8 |
9 | import time
10 |
11 | import numpy as np
12 | from scipy import sparse
13 |
14 |
15 | # ---------------------------------utils for calculation--------------------------------
16 | def row_as_probdist(mat, dense_output=False, preserve_zeros=False):
17 |     """Make each row of the matrix sum up to 1.0, i.e., a probability distribution.
18 |     Supports both dense and sparse matrices.
19 |
20 |     Parameters
21 |     ----------
22 |     mat : scipy sparse matrix or dense matrix or numpy array
23 |         the matrix to be normalized
24 |     dense_output : bool
25 |         whether to force dense output
26 |     preserve_zeros : bool
27 |         if False, a row with all entries 0 is normalized to a vector with all entries 1/n;
28 |         left as all zeros otherwise
29 | Returns
30 | -------
31 | dense or sparse matrix:
32 | return dense matrix if input is dense matrix or numpy array
33 | return sparse matrix for sparse matrix input
34 | (note: np.array & np.matrix are diff; and may cause some dim issues...)
35 | """
36 | row_sum = np.array(mat.sum(axis=1)).ravel() # type: np.array
37 | zero_rows = row_sum == 0
38 | row_sum[zero_rows] = 1
39 | diag = sparse.dia_matrix((1 / row_sum, 0), (mat.shape[0], mat.shape[0]))
40 | mat = diag.dot(mat)
41 | if not preserve_zeros:
42 | mat += sparse.csr_matrix(zero_rows.astype(int)).T.dot(sparse.csr_matrix(np.repeat(1 / mat.shape[1], mat.shape[1])))
43 |
44 | if dense_output and sparse.issparse(mat):
45 | return mat.todense()
46 | return mat
47 |
48 |
49 | def pairwise_similarity(mat, type='cosine'):
50 | # XXX: possible to integrate pairwise_similarity with top_k to enhance performance?
51 | # we'll use it elsewhere. if really needed, write a new method for this purpose
52 |     if type == 'cosine':  # supports sparse and dense mat
53 | from sklearn.metrics.pairwise import cosine_similarity
54 | result = cosine_similarity(mat, dense_output=True)
55 | elif type == 'jaccard':
56 | from sklearn.metrics import jaccard_similarity_score
57 | from sklearn.metrics.pairwise import pairwise_distances
58 | # n_jobs=-1 means using all CPU for parallel computing
59 | result = pairwise_distances(mat.todense(), metric=jaccard_similarity_score, n_jobs=-1)
60 | elif type == 'euclidean':
61 | from sklearn.metrics.pairwise import euclidean_distances
62 | # note: similarity = - distance
63 | # other version: similarity = 1 - 2 / pi * arctan(distance)
64 | result = euclidean_distances(mat)
65 | result = -result
66 | # result = 1 - 2 / np.pi * np.arctan(result)
67 | elif type == 'manhattan':
68 | from sklearn.metrics.pairwise import manhattan_distances
69 | # note: similarity = - distance
70 | # other version: similarity = 1 - 2 / pi * arctan(distance)
71 | result = manhattan_distances(mat)
72 | result = -result
73 | # result = 1 - 2 / np.pi * np.arctan(result)
74 | else:
75 | print('Please choose from: cosine, jaccard, euclidean or manhattan')
76 | return 'Not found!'
77 | return result
78 |
79 |
80 | # ---------------------------------utils for preprocessing--------------------------------
81 | def node_auxi_to_attr(fin, fout):
82 | """ TODO...
83 | -> read auxi info associated with each node;
84 | -> preprocessing auxi via:
85 | 1) NLP for sentences; or 2) one-hot for discrete features;
86 | -> then becomes node attr with m dim, and store them into attr file
87 | """
88 | pass
89 |
90 |
91 | def simulate_incomplete_stru():
92 | pass
93 |
94 |
95 | def simulate_incomplete_attr():
96 | pass
97 |
98 |
99 | def simulate_noisy_world():
100 | pass
101 |
102 | # ---------------------------------utils for downstream tasks--------------------------------
103 | # XXX: read and save using panda or numpy
104 |
105 |
106 | def read_edge_label_downstream(filename):
107 | fin = open(filename, 'r')
108 | X = []
109 | Y = []
110 | while 1:
111 | line = fin.readline()
112 | if line == '':
113 | break
114 | vec = line.strip().split(' ')
115 | X.append(vec[:2])
116 | Y.append(vec[2])
117 | fin.close()
118 | return X, Y
119 |
120 |
121 | def read_node_label_downstream(filename):
122 | """ may be used in node classification task;
123 | part of labels for training clf and
124 | the result served as ground truth;
125 | note: similar method can be found in graph.py -> read_node_label
126 | """
127 | fin = open(filename, 'r')
128 | X = []
129 | Y = []
130 | while 1:
131 | line = fin.readline()
132 | if line == '':
133 | break
134 | vec = line.strip().split(' ')
135 | X.append(vec[0])
136 | Y.append(vec[1:])
137 | fin.close()
138 | return X, Y
139 |
140 |
141 | def store_embedddings(vectors, filename, dim):
142 | """ store embeddings to file
143 | """
144 | fout = open(filename, 'w')
145 | num_nodes = len(vectors.keys())
146 | fout.write("{} {}\n".format(num_nodes, dim))
147 | for node, vec in vectors.items():
148 | fout.write("{} {}\n".format(node, ' '.join([str(x) for x in vec])))
149 | fout.close()
150 | print('store the resulting embeddings in file: ', filename)
151 |
152 |
153 | def load_embeddings(filename):
154 | """ load embeddings from file
155 | """
156 | fin = open(filename, 'r')
157 | num_nodes, size = [int(x) for x in fin.readline().strip().split()]
158 | vectors = {}
159 | while 1:
160 | line = fin.readline()
161 | if line == '':
162 | break
163 | vec = line.strip().split(' ')
164 | assert len(vec) == size + 1
165 | vectors[vec[0]] = [float(x) for x in vec[1:]]
166 | fin.close()
167 | assert len(vectors) == num_nodes
168 | return vectors
169 |
170 |
171 |
172 | def generate_edges_for_linkpred(graph, edges_removed, balance_ratio=1.0):
173 | ''' given a graph and edges_removed;
174 | generate non_edges not in [both graph and edges_removed];
175 | return all_test_samples including [edges_removed (pos samples), non_edges (neg samples)];
176 | return format X=[[1,2],[2,4],...] Y=[1,0,...] where Y tells where corresponding element has a edge
177 | '''
178 | g = graph
179 | num_edges_removed = len(edges_removed)
180 | num_non_edges = int(balance_ratio * num_edges_removed)
181 | num = 0
182 | #np.random.seed(2018)
183 | non_edges = []
184 | exist_edges = list(g.G.edges())+list(edges_removed)
185 | while num < num_non_edges:
186 | non_edge = list(np.random.choice(g.look_back_list, size=2, replace=False))
187 | if non_edge not in exist_edges:
188 | num += 1
189 | non_edges.append(non_edge)
190 |
191 | test_node_pairs = edges_removed + non_edges
192 | test_edge_labels = list(np.ones(num_edges_removed)) + list(np.zeros(num_non_edges))
193 | return test_node_pairs, test_edge_labels
194 |
195 |
196 | def dim_reduction(mat, dim=128, method='pca'):
197 | import time
198 | ''' dimensionality reduction: PCA, SVD, etc...
199 | dim = # of columns
200 | '''
201 | print('START dimensionality reduction using ' + method + ' ......')
202 | t1 = time.time()
203 | if method == 'pca':
204 | from sklearn.decomposition import PCA
205 | pca = PCA(n_components=dim, svd_solver='auto', random_state=None)
206 | mat_reduced = pca.fit_transform(mat) #sklearn pca auto remove mean, no need to preprocess
207 | elif method == 'svd':
208 | from sklearn.decomposition import TruncatedSVD
209 | svd = TruncatedSVD(n_components=dim, n_iter=5, random_state=None)
210 | mat_reduced = svd.fit_transform(mat)
211 | else: #to do... more methods... e.g. random projection, ica, t-sne...
212 | print('dimensionality reduction method not found......')
213 | t2 = time.time()
214 | print('END dimensionality reduction: {:.2f}s'.format(t2-t1))
215 | return mat_reduced
216 |
217 |
218 | def row_normalized(mat, is_transition_matrix=False):
219 | ''' to do...
220 | '''
221 | p = 1.0/mat.shape[0] #probability = 1/num of rows
222 | norms = np.asarray(mat.sum(axis=1)).ravel()
223 | for i, norm in enumerate(norms):
224 | if norm != 0:
225 | mat[i, :] /= norm
226 | else:
227 | if is_transition_matrix:
228 | mat[i, :] = p #every row of transition matrix should sum up to 1
229 | else:
230 | pass #do nothing; keep all-zero row
231 | return mat
232 |
233 | def rowAsPDF(mat):  #make each row sum up to 1, i.e., a probability distribution
234 | mat = np.array(mat)
235 | for i in range(mat.shape[0]):
236 | sum_row = mat[i,:].sum()
237 | if sum_row !=0:
238 | mat[i,:] = mat[i,:]/sum_row #if a row [0, 1, 1, 1] -> [0, 1/3, 1/3, 1/3] -> may have some small issue...
239 | else:
240 | # to do...
241 | # for node without any link... remain row as [0, 0, 0, 0] OR set to [1/n, 1/n, 1/n...]??
242 | pass
243 |         if mat[i,:].sum() != 1.00:  #small trick to make sure each row sums exactly to 1 (crude workaround)
244 | error = 1.00 - mat[i,:].sum()
245 | mat[i,-1] += error
246 | return mat
--------------------------------------------------------------------------------
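row_as_probdist above rescales every row to sum to 1 and, unless preserve_zeros=True, replaces all-zero rows with a uniform 1/n row, so the output can be used directly as a transition matrix. A quick sketch on a toy matrix (illustrative values only; assumes src/ on the path):

import numpy as np
from scipy import sparse
from libnrl.utils import row_as_probdist

mat = sparse.csr_matrix(np.array([[2.0, 2.0, 0.0],
                                  [0.0, 0.0, 0.0],   # all-zero row
                                  [1.0, 1.0, 2.0]]))

print(row_as_probdist(mat).toarray())
# [[0.5   0.5   0.   ]
#  [0.333 0.333 0.333]   <- zero row becomes uniform 1/n
#  [0.25  0.25  0.5  ]]
print(row_as_probdist(mat, preserve_zeros=True).toarray())  # the zero row stays all zeros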
/src/libnrl/walker.py:
--------------------------------------------------------------------------------
1 | """
2 | (weighted) random walks for walk-based NE methods:
3 | DeepWalk, Node2Vec, TriDNR, and ABRW;
4 | alias sampling; walks by multiprocessors; etc.
5 |
6 | by Chengbin Hou 2018
7 | """
8 |
9 | import multiprocessing
10 | import random
11 | import time
12 | from itertools import chain
13 | import numpy as np
14 | import networkx as nx
15 |
16 |
17 | # ===========================================ABRW-weighted-walker============================================
18 | class WeightedWalker:
19 |     ''' Weighted Walker for Attributed Biased Random Walks (ABRW) method
20 | '''
21 |
22 | def __init__(self, node_id_map, transition_mat, workers):
23 | self.look_back_list = node_id_map
24 | self.T = transition_mat
25 | self.workers = workers
26 |         # self.rec_G = nx.to_networkx_graph(self.T, create_using=nx.Graph()) # wrong... would return a symmetric transition mat
27 | self.rec_G = nx.to_networkx_graph(self.T, create_using=nx.DiGraph()) # reconstructed "directed" "weighted" graph based on transition matrix
28 | # print(nx.adjacency_matrix(self.rec_G).todense()[0:6, 0:6])
29 | # print(transition_mat[0:6, 0:6])
30 | # print(nx.adjacency_matrix(self.rec_G).todense()==transition_mat)
31 |
32 | # alias sampling for ABRW-------------------------
33 | def simulate_walks(self, num_walks, walk_length):
34 | t1 = time.time()
35 | self.preprocess_transition_probs(weighted_G=self.rec_G) # construct alias table; adapted from node2vec
36 | t2 = time.time()
37 | print(f'Time for construct alias table: {(t2-t1):.2f}')
38 |
39 | walks = []
40 | nodes = list(self.rec_G.nodes())
41 | for walk_iter in range(num_walks):
42 | t1 = time.time()
43 | # random.seed(2018)
44 | random.shuffle(nodes)
45 | for node in nodes:
46 | walks.append(self.weighted_walk(weighted_G=self.rec_G, walk_length=walk_length, start_node=node))
47 | t2 = time.time()
48 | print(f'Walk iteration: {walk_iter+1}/{num_walks}; time cost: {(t2-t1):.2f}')
49 |
50 |         for i in range(len(walks)):  # use index to retrieve the original node ID
51 | for j in range(len(walks[0])):
52 | walks[i][j] = self.look_back_list[int(walks[i][j])]
53 | return walks
54 |
55 | def weighted_walk(self, weighted_G, walk_length, start_node): # more efficient way instead of copy from node2vec
56 | G = weighted_G
57 | walk = [start_node]
58 | while len(walk) < walk_length:
59 | cur = walk[-1]
60 | cur_nbrs = list(G.neighbors(cur))
61 | if len(cur_nbrs) > 0: # if non-isolated node
62 |                 walk.append(cur_nbrs[alias_draw(self.alias_nodes[cur][0], self.alias_nodes[cur][1])])  # alias sampling in O(1): 1) randomly pick a nbr index; 2) keep it or take its alias
63 |             else:  # isolated node -> terminate the walk
64 | break
65 | return walk
66 |
67 | def preprocess_transition_probs(self, weighted_G):
68 |         ''' the reconstructed G must be weighted; \n
69 | return a dict of alias table for each node
70 | '''
71 | G = weighted_G
72 |         alias_nodes = {}  # unlike node2vec, the reconstructed graph is based on the transition matrix
73 | for node in G.nodes(): # no need to normalize again
74 | probs = [G[node][nbr]['weight'] for nbr in G.neighbors(node)] #pick prob of neighbors with non-zero weight --> sum up to 1.0
75 | #print(f'sum of prob: {np.sum(probs)}')
76 | alias_nodes[node] = alias_setup(probs) #alias table format {node_id: (array1, array2)}
77 | self.alias_nodes = alias_nodes #where array1 gives alias node indexes; array2 gives its prob
78 | #print(self.alias_nodes)
79 |
80 |
81 | '''
82 | #naive sampling for ABRW-------------------------------------------------------------------
83 | def weighted_walk(self, start_node):
84 | #
85 | #Simulate a weighted walk starting from start node.
86 | #
87 | G = self.G
88 | look_up_dict = self.look_up_dict
89 | look_back_list = self.look_back_list
90 | node_size = self.node_size
91 | walk = [start_node]
92 |
93 | while len(walk) < self.walk_length:
94 | cur_node = walk[-1] #the last one entry/node
95 | cur_ind = look_up_dict[cur_node] #key -> index
96 | pdf = self.T[cur_ind,:] #the pdf of node with ind
97 | #pdf = np.random.randn(18163)+10 #......test multiprocessor
98 | #pdf = pdf / pdf.sum() #......test multiprocessor
99 | #next_ind = int( np.array( nx.utils.random_sequence.discrete_sequence(n=1,distribution=pdf) ) )
100 | next_ind = np.random.choice(len(pdf), 1, p=pdf)[0] #faster than nx
101 | #next_ind = 0 #......test multiprocessor
102 | next_node = look_back_list[next_ind] #index -> key
103 | walk.append(next_node)
104 | return walk
105 |
106 | def simulate_walks(self, num_walks, walk_length):
107 | #
108 | #Repeatedly simulate weighted walks from each node.
109 | #
110 | G = self.G
111 | self.num_walks = num_walks
112 | self.walk_length = walk_length
113 | self.walks = [] #what we all need later as input to skip-gram
114 | nodes = list(G.nodes())
115 |
116 | print('Walk iteration:')
117 | for walk_iter in range(num_walks):
118 | t1 = time.time()
119 | random.shuffle(nodes)
120 | for node in nodes: #for single cpu, if # of nodes < 2000 (speed up) or nodes > 20000 (avoid memory error)
121 | self.walks.append(self.weighted_walk(node)) #for single cpu, if # of nodes < 2000 (speed up) or nodes > 20000 (avoid memory error)
122 | #pool = multiprocessing.Pool(processes=3) #use all cpu by defalut or specify processes = xx
123 | #self.walks.append(pool.map(self.weighted_walk, nodes)) #ref: https://stackoverflow.com/questions/8533318/multiprocessing-pool-when-to-use-apply-apply-async-or-map
124 | #pool.close()
125 | #pool.join()
126 | t2 = time.time()
127 | print(str(walk_iter+1), '/', str(num_walks), ' each itr last for: {:.2f}s'.format(t2-t1))
128 | #self.walks = list(chain.from_iterable(self.walks)) #unlist...[[[x,x],[x,x]]] -> [x,x], [x,x]
129 | return self.walks
130 | '''
131 |
132 |
133 |
134 | def deepwalk_walk_wrapper(class_instance, walk_length, start_node):
135 | class_instance.deepwalk_walk(walk_length, start_node)
136 |
137 | # ===========================================deepWalk-walker============================================
138 | class BasicWalker:
139 | def __init__(self, g, workers):
140 | self.g = g
141 | self.node_size = g.get_num_nodes()
142 | self.look_up_dict = g.look_up_dict
143 |
144 | def deepwalk_walk(self, walk_length, start_node):
145 | '''
146 | Simulate a random walk starting from start node.
147 | '''
148 | G = self.g.G
149 | look_up_dict = self.look_up_dict
150 | node_size = self.node_size
151 |
152 | walk = [start_node]
153 |
154 | while len(walk) < walk_length:
155 | cur = walk[-1]
156 | cur_nbrs = list(G.neighbors(cur))
157 | if len(cur_nbrs) > 0:
158 | walk.append(random.choice(cur_nbrs))
159 | else:
160 | break
161 | return walk
162 |
163 | def simulate_walks(self, num_walks, walk_length):
164 | '''
165 | Repeatedly simulate random walks from each node.
166 | '''
167 | G = self.g.G
168 | walks = []
169 | nodes = list(G.nodes())
170 | for walk_iter in range(num_walks):
171 | t1 = time.time()
172 | # pool = multiprocessing.Pool(processes = 4)
173 | random.shuffle(nodes)
174 | for node in nodes:
175 | # walks.append(pool.apply_async(deepwalk_walk_wrapper, (self, walk_length, node, )))
176 | walks.append(self.deepwalk_walk(walk_length=walk_length, start_node=node))
177 | # pool.close()
178 | # pool.join()
179 | t2 = time.time()
180 | print(f'Walk iteration: {walk_iter+1}/{num_walks}; time cost: {(t2-t1):.2f}')
181 | # print(len(walks))
182 | return walks
183 |
184 |
185 | # ===========================================node2vec-walker============================================
186 | class Walker:
187 | def __init__(self, g, p, q, workers):
188 | self.g = g
189 | self.p = p
190 | self.q = q
191 |
192 | if self.g.get_isweighted():
193 | #print('is weighted graph: ', self.g.get_isweighted())
194 | #print(self.g.get_adj_mat(is_sparse=False)[0:6,0:6])
195 | pass
196 | else: #otherwise, add equal weights 1.0 to all existing edges
197 | #print('is weighted graph: ', self.g.get_isweighted())
198 | self.g.add_edge_weight(equal_weight=1.0) #add 'weight' to networkx graph
199 | #print(self.g.get_adj_mat(is_sparse=False)[0:6,0:6])
200 |
201 | self.node_size = g.get_num_nodes()
202 | self.look_up_dict = g.look_up_dict
203 |
204 | def node2vec_walk(self, walk_length, start_node):
205 | '''
206 | Simulate a random walk starting from start node.
207 | '''
208 | G = self.g.G
209 | alias_nodes = self.alias_nodes
210 | alias_edges = self.alias_edges
211 | look_up_dict = self.look_up_dict
212 | node_size = self.node_size
213 |
214 | walk = [start_node]
215 |
216 | while len(walk) < walk_length:
217 | cur = walk[-1]
218 | cur_nbrs = list(G.neighbors(cur))
219 | if len(cur_nbrs) > 0:
220 | if len(walk) == 1:
221 | walk.append(cur_nbrs[alias_draw(alias_nodes[cur][0], alias_nodes[cur][1])])
222 | else:
223 | prev = walk[-2]
224 | next = cur_nbrs[alias_draw(alias_edges[(prev, cur)][0], alias_edges[(prev, cur)][1])]
225 | walk.append(next)
226 | else:
227 | break
228 | return walk
229 |
230 | def simulate_walks(self, num_walks, walk_length):
231 | '''
232 | Repeatedly simulate random walks from each node.
233 | '''
234 | G = self.g.G
235 | walks = []
236 | nodes = list(G.nodes())
237 | for walk_iter in range(num_walks):
238 | t1 = time.time()
239 | random.shuffle(nodes)
240 | for node in nodes:
241 | walks.append(self.node2vec_walk(walk_length=walk_length, start_node=node))
242 | t2 = time.time()
243 | print(f'Walk iteration: {walk_iter+1}/{num_walks}; time cost: {(t2-t1):.2f}')
244 | return walks
245 |
246 | def get_alias_edge(self, src, dst):
247 | '''
248 | Get the alias edge setup lists for a given edge.
249 | '''
250 | G = self.g.G
251 | p = self.p
252 | q = self.q
253 |
254 | unnormalized_probs = []
255 | for dst_nbr in G.neighbors(dst):
256 | if dst_nbr == src:
257 | unnormalized_probs.append(G[dst][dst_nbr]['weight']/p)
258 | elif G.has_edge(dst_nbr, src):
259 | unnormalized_probs.append(G[dst][dst_nbr]['weight'])
260 | else:
261 | unnormalized_probs.append(G[dst][dst_nbr]['weight']/q)
262 | norm_const = sum(unnormalized_probs)
263 | normalized_probs = [float(u_prob)/norm_const for u_prob in unnormalized_probs]
264 | return alias_setup(normalized_probs)
265 |
266 | def preprocess_transition_probs(self):
267 | '''
268 | Preprocessing of transition probabilities for guiding the random walks.
269 | '''
270 | G = self.g.G
271 | alias_nodes = {}
272 | for node in G.nodes():
273 | unnormalized_probs = [G[node][nbr]['weight'] for nbr in G.neighbors(node)] #pick prob of neighbors with non-zero weight
274 | norm_const = sum(unnormalized_probs)
275 | normalized_probs = [float(u_prob)/norm_const for u_prob in unnormalized_probs]
276 | alias_nodes[node] = alias_setup(normalized_probs)
277 |
278 | alias_edges = {}
279 | triads = {}
280 |
281 | look_up_dict = self.look_up_dict
282 | node_size = self.node_size
283 | if self.g.get_isdirected():
284 | for edge in G.edges():
285 | alias_edges[edge] = self.get_alias_edge(edge[0], edge[1])
286 | else: #if undirected, duplicate the reverse direction; otherwise may get key error
287 | for edge in G.edges():
288 | alias_edges[edge] = self.get_alias_edge(edge[0], edge[1])
289 | alias_edges[(edge[1], edge[0])] = self.get_alias_edge(edge[1], edge[0])
290 |
291 | self.alias_nodes = alias_nodes
292 | self.alias_edges = alias_edges
293 |
294 |
295 | #========================================= utils: alias sampling method ====================================================
296 | def alias_setup(probs):
297 | '''
298 | Compute utility lists for non-uniform sampling from discrete distributions.
299 | Refer to https://hips.seas.harvard.edu/blog/2013/03/03/the-alias-method-efficient-sampling-with-many-discrete-outcomes/
300 | for details
301 | '''
302 | K = len(probs)
303 | q = np.zeros(K, dtype=np.float32)
304 | J = np.zeros(K, dtype=np.int32)
305 |
306 | smaller = []
307 | larger = []
308 | for kk, prob in enumerate(probs):
309 | q[kk] = K*prob
310 | if q[kk] < 1.0:
311 | smaller.append(kk)
312 | else:
313 | larger.append(kk)
314 |
315 |     while len(smaller) > 0 and len(larger) > 0:  #use large probs to compensate small probs until all reach the average
316 | small = smaller.pop()
317 | large = larger.pop()
318 |
319 | J[small] = large
320 | q[large] = q[large] + q[small] - 1.0
321 | if q[large] < 1.0:
322 | smaller.append(large)
323 | else:
324 | larger.append(large)
325 |
326 |     return J, q  #the values in J are indexes; an index may appear repeatedly if it has a large prob and compensates several small ones
327 |
328 |
329 | def alias_draw(J, q):
330 | '''
331 | Draw sample from a non-uniform discrete distribution using alias sampling.
332 | '''
333 | K = len(J)
334 |
335 | kk = int(np.floor(np.random.rand()*K)) #randomly choose a nbr (an index)
336 | if np.random.rand() < q[kk]: #use alias table to choose
337 | return kk #either that nbr node (an index)
338 | else:
339 | return J[kk] #or the nbr's alias node (an index)
--------------------------------------------------------------------------------
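alias_setup and alias_draw above implement the alias method referenced in the docstring: O(K) table construction followed by O(1) sampling from a K-outcome discrete distribution. A short sketch that checks the empirical draw frequencies against the input probabilities (toy distribution; assumes src/ on the path so libnrl.walker is importable):

from collections import Counter
from libnrl.walker import alias_setup, alias_draw

probs = [0.1, 0.2, 0.3, 0.4]      # any discrete distribution that sums to 1
J, q = alias_setup(probs)         # J: alias indexes, q: acceptance probabilities

draws = Counter(alias_draw(J, q) for _ in range(100000))
freqs = [draws[k] / 100000 for k in range(len(probs))]
print(freqs)                      # should be close to [0.1, 0.2, 0.3, 0.4]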
/src/main.py:
--------------------------------------------------------------------------------
1 | '''
2 | STEP1: load data -->
3 | STEP2: prepare data -->
4 | STEP3: learn node embeddings -->
5 | STEP4: downstream evaluations
6 |
7 | python src/main.py --method abrw
8 |
9 | by Chengbin HOU 2018
10 | '''
11 |
12 | import time
13 | import random
14 | import numpy as np
15 | from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter
16 | from sklearn.linear_model import LogisticRegression
17 | from libnrl.classify import ncClassifier, lpClassifier, read_node_label
18 | from libnrl.graph import *
19 | from libnrl.utils import *
20 | from libnrl import abrw #ANE method; Attributed Biased Random Walk
21 | from libnrl import tadw #ANE method
22 | from libnrl import aane #ANE method
23 | from libnrl import attrcomb #ANE method
24 | from libnrl import attrpure #NE method that simply uses svd or pca for dim reduction
25 | from libnrl import node2vec #PNE method; including deepwalk and node2vec
26 |
27 | def parse_args():
28 | parser = ArgumentParser(formatter_class=ArgumentDefaultsHelpFormatter, conflict_handler='resolve')
29 | #-----------------------------------------------general settings--------------------------------------------------
30 | parser.add_argument('--graph-format', default='adjlist', choices=['adjlist', 'edgelist'],
31 | help='graph/network format')
32 | parser.add_argument('--graph-file', default='data/cora/cora_adjlist.txt',
33 | help='graph/network file')
34 | parser.add_argument('--attribute-file', default='data/cora/cora_attr.txt',
35 | help='node attribute/feature file')
36 | parser.add_argument('--label-file', default='data/cora/cora_label.txt',
37 | help='node label file')
38 | parser.add_argument('--dim', default=128, type=int,
39 | help='node embeddings dimensions')
40 | parser.add_argument('--task', default='lp_and_nc', choices=['none', 'lp', 'nc', 'lp_and_nc'],
41 | help='choices of downstream tasks: none, lp, nc, lp_and_nc')
42 | parser.add_argument('--link-remove', default=0.1, type=float,
43 | help='simulate randomly missing links if necessary; a ratio ranging [0.0, 1.0]')
44 | parser.add_argument('--label-reserved', default=0.7, type=float,
45 | help='for nc task, train/test split, a ratio ranging [0.0, 1.0]')
46 | parser.add_argument('--directed', default=False, action='store_true',
47 | help='directed or undirected graph')
48 | parser.add_argument('--weighted', default=False, action='store_true',
49 | help='weighted or unweighted graph')
50 | parser.add_argument('--save-emb', default=False, action='store_true',
51 | help='save emb to disk if True')
52 | parser.add_argument('--emb-file', default='emb/unnamed_node_embs.txt',
53 | help='node embeddings file; suggest: data_method_dim_embs.txt')
54 | #-------------------------------------------------method settings-----------------------------------------------------------
55 | parser.add_argument('--method', default='abrw', choices=['deepwalk', 'node2vec', 'abrw', 'attrpure', 'attrcomb', 'tadw', 'aane'],
56 | help='choices of Network Embedding methods')
57 | parser.add_argument('--ABRW-topk', default=30, type=int,
58 |                         help='for each node, keep its top-k most attribute-similar nodes; ranging [0, # of nodes]')
59 | parser.add_argument('--ABRW-alpha', default=0.8, type=float,
60 | help='balance struc and attr info; ranging [0, 1]')
61 | parser.add_argument('--AANE-lamb', default=0.05, type=float,
62 | help='balance struc and attr info; ranging [0, inf]')
63 | parser.add_argument('--AANE-rho', default=5, type=float,
64 | help='penalty parameter; ranging [0, inf]')
65 | parser.add_argument('--AANE-maxiter', default=10, type=int,
66 | help='max iter')
67 | parser.add_argument('--TADW-lamb', default=0.2, type=float,
68 | help='balance struc and attr info; ranging [0, inf]')
69 | parser.add_argument('--TADW-maxiter', default=10, type=int,
70 | help='max iter')
71 | parser.add_argument('--AttrComb-mode', default='concat', type=str,
72 | help='choices of mode: concat, elementwise-mean, elementwise-max')
73 | parser.add_argument('--Node2Vec-p', default=0.5, type=float, #if p=q=1.0 node2vec = deepwalk
74 |                         help='trade-off BFS and DFS; grid search [0.25; 0.50; 1; 2; 4]')
75 | parser.add_argument('--Node2Vec-q', default=0.5, type=float,
76 |                         help='trade-off BFS and DFS; grid search [0.25; 0.50; 1; 2; 4]')
77 | #for walk based methods; some Word2Vec SkipGram parameters are not specified here
78 | parser.add_argument('--number-walks', default=10, type=int,
79 | help='# of random walks of each node')
80 | parser.add_argument('--walk-length', default=80, type=int,
81 | help='length of each random walk')
82 | parser.add_argument('--window-size', default=10, type=int,
83 | help='window size of skipgram model')
84 | parser.add_argument('--workers', default=24, type=int,
85 | help='# of parallel processes.')
86 | args = parser.parse_args()
87 | return args
88 |
89 |
90 | def main(args):
91 | g = Graph() #see graph.py for commonly-used APIs and use g.G to access NetworkX APIs
92 | print(f'Summary of all settings: {args}')
93 |
94 |
95 | #---------------------------------------STEP1: load data-----------------------------------------------------
96 | print('\nSTEP1: start loading data......')
97 | t1 = time.time()
98 | #load graph structure info------
99 | if args.graph_format == 'adjlist':
100 | g.read_adjlist(path=args.graph_file, directed=args.directed)
101 | elif args.graph_format == 'edgelist':
102 | g.read_edgelist(path=args.graph_file, weighted=args.weighted, directed=args.directed)
103 | #load node attribute info------
104 | is_ane = (args.method == 'abrw' or args.method == 'tadw' or args.method == 'attrpure' or args.method == 'attrcomb' or args.method == 'aane')
105 | if is_ane:
106 | assert args.attribute_file != ''
107 | g.read_node_attr(args.attribute_file)
108 | #load node label info------
109 | t2 = time.time()
110 | print(f'STEP1: end loading data; time cost: {(t2-t1):.2f}s')
111 |
112 |
113 | #---------------------------------------STEP2: prepare data----------------------------------------------------
114 | print('\nSTEP2: start preparing data for link pred task......')
115 | t1 = time.time()
116 | test_node_pairs=[]
117 | test_edge_labels=[]
118 | if args.task == 'lp' or args.task == 'lp_and_nc':
119 | edges_removed = g.remove_edge(ratio=args.link_remove)
120 | test_node_pairs, test_edge_labels = generate_edges_for_linkpred(graph=g, edges_removed=edges_removed, balance_ratio=1.0)
121 | t2 = time.time()
122 | print(f'STEP2: end preparing data; time cost: {(t2-t1):.2f}s')
123 |
124 |
125 | #-----------------------------------STEP3: upstream embedding task-------------------------------------------------
126 | print('\nSTEP3: start learning embeddings......')
127 | print(f'the graph: {args.graph_file}; \nthe model used: {args.method}; \
128 | \nthe # of edges used during embedding (edges maybe removed if lp task): {g.get_num_edges()}; \
129 | \nthe # of nodes: {g.get_num_nodes()}; \nthe # of isolated nodes: {g.get_num_isolates()}; \nis directed graph: {g.get_isdirected()}')
130 | t1 = time.time()
131 | model = None
132 | if args.method == 'abrw':
133 | model = abrw.ABRW(graph=g, dim=args.dim, alpha=args.ABRW_alpha, topk=args.ABRW_topk, number_walks=args.number_walks,
134 | walk_length=args.walk_length, window=args.window_size, workers=args.workers)
135 | elif args.method == 'aane':
136 | model = aane.AANE(graph=g, dim=args.dim, lambd=args.AANE_lamb, rho=args.AANE_rho, maxiter=args.AANE_maxiter,
137 | mode='comb') #mode: 'comb' struc and attri or 'pure' struc
138 | elif args.method == 'tadw':
139 | model = tadw.TADW(graph=g, dim=args.dim, lamb=args.TADW_lamb, maxiter=args.TADW_maxiter)
140 | elif args.method == 'attrpure':
141 | model = attrpure.ATTRPURE(graph=g, dim=args.dim, mode='pca') #mode: pca or svd
142 | elif args.method == 'attrcomb':
143 | model = attrcomb.ATTRCOMB(graph=g, dim=args.dim, comb_with='deepwalk', number_walks=args.number_walks, walk_length=args.walk_length,
144 | window=args.window_size, workers=args.workers, comb_method=args.AttrComb_mode) #comb_method: concat, elementwise-mean, elementwise-max
145 | elif args.method == 'deepwalk':
146 | model = node2vec.Node2vec(graph=g, path_length=args.walk_length, num_paths=args.number_walks, dim=args.dim,
147 | workers=args.workers, window=args.window_size, dw=True)
148 | elif args.method == 'node2vec':
149 | model = node2vec.Node2vec(graph=g, path_length=args.walk_length, num_paths=args.number_walks, dim=args.dim,
150 | workers=args.workers, window=args.window_size, p=args.Node2Vec_p, q=args.Node2Vec_q)
151 | else:
152 | print('method not found...')
153 | exit(0)
154 | t2 = time.time()
155 | print(f'STEP3: end learning embeddings; time cost: {(t2-t1):.2f}s')
156 |
157 | if args.save_emb:
158 | model.save_embeddings(args.emb_file + time.strftime(' %Y%m%d-%H%M%S', time.localtime()))
159 | print(f'Save node embeddings in file: {args.emb_file}')
160 |
161 |
162 | #---------------------------------------STEP4: downstream task-----------------------------------------------
163 | print('\nSTEP4: start evaluating ......: ')
164 | t1 = time.time()
165 | vectors = model.vectors
166 | del model, g
167 | #------lp task
168 | if args.task == 'lp' or args.task == 'lp_and_nc':
169 | #X_test_lp, Y_test_lp = read_edge_label(args.label_file) #if you want to load your own lp testing data
170 | print(f'Link Prediction task; the percentage of positive links for testing: {(args.link_remove*100):.2f}%'
171 | + ' (by default, also generate equal negative links for testing)')
172 |         clf = lpClassifier(vectors=vectors) #similarity/distance metric as clf; basically, lp is a binary clf problem
173 | clf.evaluate(test_node_pairs, test_edge_labels)
174 | #------nc task
175 | if args.task == 'nc' or args.task == 'lp_and_nc':
176 | X, Y = read_node_label(args.label_file)
177 | print(f'Node Classification task; the percentage of labels for testing: {((1-args.label_reserved)*100):.2f}%')
178 | clf = ncClassifier(vectors=vectors, clf=LogisticRegression()) #use Logistic Regression as clf; we may choose SVM or more advanced ones
179 | clf.split_train_evaluate(X, Y, args.label_reserved)
180 | t2 = time.time()
181 | print(f'STEP4: end evaluating; time cost: {(t2-t1):.2f}s')
182 |
183 |
184 | if __name__ == '__main__':
185 | print(f'------ START @ {time.strftime("%Y-%m-%d %H:%M:%S %Z", time.localtime())} ------')
186 | #random.seed(2018)
187 | #np.random.seed(2018)
188 | main(parse_args())
189 | print(f'------ END @ {time.strftime("%Y-%m-%d %H:%M:%S %Z", time.localtime())} ------')
190 |
191 |
--------------------------------------------------------------------------------