├── __init__.py ├── data ├── ind.cora.x ├── ._ind.cora.tx ├── ._ind.cora.x ├── ind.cora.allx ├── ind.cora.tx ├── ._ind.cora.allx ├── ind.citeseer.tx ├── ind.citeseer.x ├── ind.cora.graph ├── ._ind.citeseer.tx ├── ._ind.citeseer.x ├── ._ind.cora.graph ├── ind.citeseer.allx ├── ind.citeseer.graph ├── ._ind.citeseer.allx ├── ._ind.citeseer.graph ├── ._ind.cora.test.index ├── ._ind.citeseer.test.index ├── ind.cora.test.index └── ind.citeseer.test.index ├── README.md ├── layers.py ├── optimizer.py ├── train.py ├── model.py └── utils.py /__init__.py: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | -------------------------------------------------------------------------------- /data/ind.cora.x: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/ind.cora.x -------------------------------------------------------------------------------- /data/._ind.cora.tx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.cora.tx -------------------------------------------------------------------------------- /data/._ind.cora.x: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.cora.x -------------------------------------------------------------------------------- /data/ind.cora.allx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/ind.cora.allx -------------------------------------------------------------------------------- /data/ind.cora.tx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/ind.cora.tx -------------------------------------------------------------------------------- /data/._ind.cora.allx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.cora.allx -------------------------------------------------------------------------------- /data/ind.citeseer.tx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/ind.citeseer.tx -------------------------------------------------------------------------------- /data/ind.citeseer.x: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/ind.citeseer.x -------------------------------------------------------------------------------- /data/ind.cora.graph: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/ind.cora.graph -------------------------------------------------------------------------------- /data/._ind.citeseer.tx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.citeseer.tx -------------------------------------------------------------------------------- /data/._ind.citeseer.x: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.citeseer.x 
-------------------------------------------------------------------------------- /data/._ind.cora.graph: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.cora.graph -------------------------------------------------------------------------------- /data/ind.citeseer.allx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/ind.citeseer.allx -------------------------------------------------------------------------------- /data/ind.citeseer.graph: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/ind.citeseer.graph -------------------------------------------------------------------------------- /data/._ind.citeseer.allx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.citeseer.allx -------------------------------------------------------------------------------- /data/._ind.citeseer.graph: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.citeseer.graph -------------------------------------------------------------------------------- /data/._ind.cora.test.index: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.cora.test.index -------------------------------------------------------------------------------- /data/._ind.citeseer.test.index: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/YH-UtMSB/sigvae-torch/HEAD/data/._ind.citeseer.test.index -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # sigvae-torch 2 | A PyTorch implementation of [*Semi-Implicit Graph Variational Auto-Encoders*](http://papers.nips.cc/paper/9255-semi-implicit-graph-variational-auto-encoders). 3 | 4 | The work is programmed with Python 3.6.3 and torch 1.4.0. 5 | 6 | # Updates 7 | ### (w.r.t. the [authors' release](https://github.com/sigvae/SIGraphVAE)) 8 | We made a minor adjustment to the encoder structure: instead of using individual network branches to produce mu and sigma, we let them share the first hidden layer. This update to the encoder cuts down redundant network weights and improves the model performance. The encoder structure is controlled by the argument "encsto", the encoder stochasticity. Set it to 'full' to inject randomness into both mu and sigma, producing a different sigma for each of the (K+J) outputs. Set it to 'semi' so that sigma is produced deterministically from the node features. 9 | 10 | # Usage 11 | For example, run sigvae-torch on the Cora dataset with the Bernoulli-Poisson decoder and the semi-stochastic encoder with the following command: 12 | ``` 13 | >>>python train.py --dataset-str cora --gdc bp --encsto semi 14 | ``` 15 | The argument defaults are set to values that yield the best results. 16 | 17 | # Acknowledgements 18 | This work is developed from https://github.com/zfjsail/gae-pytorch; thank you [@zfjsail](https://github.com/zfjsail) for sharing! 
19 | We also appreciate the technical support from [@Chaojie](https://chaojiewang94.github.io/). 20 | -------------------------------------------------------------------------------- /layers.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn.functional as F 3 | from torch.nn.modules.module import Module 4 | from torch.nn.parameter import Parameter 5 | 6 | 7 | class GraphConvolution(Module): 8 | """ 9 | GCN layer, based on https://arxiv.org/abs/1609.02907, 10 | that allows MIMO (multiple-input, multiple-output) propagation with shared weights 11 | """ 12 | 13 | def __init__(self, in_features, out_features, dropout=0., act=F.relu): 14 | super(GraphConvolution, self).__init__() 15 | self.in_features = in_features 16 | self.out_features = out_features 17 | self.dropout = dropout 18 | self.act = act 19 | self.weight = Parameter(torch.FloatTensor(in_features, out_features)) 20 | self.reset_parameters() 21 | 22 | def reset_parameters(self): 23 | torch.nn.init.xavier_uniform_(self.weight) 24 | 25 | def forward(self, input, adj): 26 | input = F.dropout(input, self.dropout, self.training) 27 | """ 28 | if the input features are a matrix -- execute a regular GCN, 29 | if the input features are of shape [K, N, Z] -- execute a MIMO GCN with shared weights. 30 | """ 31 | # An alternative way to derive XW (lines 32 to 35) 32 | # W = self.weight.view( 33 | # [1, self.in_features, self.out_features] 34 | # ).expand([input.shape[0], -1, -1]) 35 | # support = torch.bmm(input, W) 36 | 37 | support = torch.stack( 38 | [torch.mm(inp, self.weight) for inp in torch.unbind(input, dim=0)], 39 | dim=0) 40 | output = torch.stack( 41 | [torch.spmm(adj, sup) for sup in torch.unbind(support, dim=0)], 42 | dim=0) 43 | output = self.act(output) 44 | return output 45 | 46 | def __repr__(self): 47 | return self.__class__.__name__ + ' (' \ 48 | + str(self.in_features) + ' -> ' \ 49 | + str(self.out_features) + ')' 50 | -------------------------------------------------------------------------------- /optimizer.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn.modules.loss 3 | import torch.nn.functional as F 4 | import numpy as np 5 | 6 | 7 | def loss_function(preds, labels, mu, logvar, emb, eps, n_nodes, norm, pos_weight): 8 | """ 9 | Computing the negative ELBO for SIGVAE: 10 | loss = - \E_{h(z)} \log \frac{p(x|z)p(z)}{h(z)}. 11 | 12 | Parameters 13 | ---------- 14 | preds : torch.Tensor of shape [J, N, N], 15 | Reconstructed graph probability with J samples drawn from h(z). 16 | labels : torch.Tensor of shape [N, N], 17 | the ground truth connectivity between nodes in the adjacency matrix. 18 | mu : torch.Tensor of shape [K+J, N, zdim], 19 | the Gaussian mean of q(z|psi). 20 | logvar : torch.Tensor of shape [K+J, N, zdim], 21 | the Gaussian logvar of q(z|psi). 22 | emb: torch.Tensor of shape [J, N, zdim], 23 | the node embeddings that generate preds. 24 | eps: torch.Tensor of shape [J, N, zdim], 25 | the random noise drawn from N(0,1) to construct emb. 26 | n_nodes : int, 27 | the number of nodes in the dataset. 28 | norm : float, 29 | normalizing constant for the re-balanced dataset. 30 | pos_weight : torch.Tensor of shape [1], 31 | stands for "positive weight", used for re-balancing +/- training samples. 32 | 33 | Returns 34 | ------- 35 | rec_cost, log_prior_ker, log_posterior_ker : torch.Tensor, 36 | the reconstruction loss and the log-prior / log-posterior kernels 37 | that form the KL regularizer.
38 | 39 | """ 40 | def get_rec(pred): 41 | # pred = torch.sigmoid(pred) 42 | log_lik = norm * (pos_weight * labels * torch.log(pred) + (1 - labels) * torch.log(1 - pred)) # N * N 43 | rec = -log_lik.mean() 44 | return rec 45 | 46 | 47 | # There are some problems with the bce function when running bp models. Causes are under investigation. 48 | # def get_rec(pred): 49 | # return norm * F.binary_cross_entropy_with_logits(pred, labels, pos_weight=pos_weight) 50 | 51 | 52 | # The objective is made up of 3 components, 53 | # loss = rec_cost + beta * (log_posterior - log_prior), where 54 | # rec_cost = -mean(log p(A|Z[1]), log p(A|Z[2]), ... log p(A|Z[J])), 55 | # log_prior = mean(log p(Z[1]), log p(Z[2]), ..., log p(Z[J])), 56 | # log_posterior = mean(log post[1], log post[2], ..., log post[J]), where 57 | # log post[j] = 1/(K+1) {q(Z[j]|psi[j]) + [q(Z[j]|psi^[1]) + ... + q(Z[j]|psi^[K])]}. 58 | # In practice, the loss is computed as 59 | # loss = rec_cost + log_posterior_ker - log_prior_ker. 60 | 61 | 62 | SMALL = 1e-6 63 | std = torch.exp(0.5 * logvar) 64 | J, N, zdim = emb.shape 65 | K = mu.shape[0] - J 66 | 67 | mu_mix, mu_emb = mu[:K, :], mu[K:, :] 68 | std_mix, std_emb = std[:K, :], std[K:, :] 69 | 70 | preds = torch.clamp(preds, min=SMALL, max=1-SMALL) 71 | 72 | # compute rec_cost 73 | rec_costs = torch.stack( 74 | [get_rec(pred) for pred in torch.unbind(preds, dim=0)], 75 | dim=0) 76 | # average over J * N * N items 77 | rec_cost = rec_costs.mean() 78 | 79 | 80 | 81 | # compute log_prior_ker, the constant 1/sqrt(2*pi) is cancelled out. 82 | # average over J items 83 | log_prior_ker = torch.sum(- 0.5 * emb.pow(2), dim=[1,2]).mean() 84 | 85 | 86 | # compute log_posterior 87 | # Z.shape = [J, 1, N, zdim] 88 | Z = emb.view(J, 1, N, zdim) 89 | 90 | # mu_mix.shape = std_mix.shape = [1, K, N, zdim] 91 | mu_mix = mu_mix.view(1, K, N, zdim) 92 | std_mix = std_mix.view(1, K, N, zdim) 93 | 94 | # compute -log std[k] - (Z[j] - mu[k])^2 / 2*std[k]^2 for all (j,k) 95 | # the shape of the result tensor log_post_ker_JK is [J,K] 96 | log_post_ker_JK = - torch.sum( 97 | 0.5 * ((Z - mu_mix) / (std_mix + SMALL)).pow(2), dim=[-2,-1] 98 | ) 99 | 100 | log_post_ker_JK += - torch.sum( 101 | (std_mix + SMALL).log(), dim=[-2,-1] 102 | ) 103 | 104 | # compute -log std[j] - (Z[j] - mu[j])^2 / 2*std[j]^2 for j = 1,2,...,J 105 | # the shape of the result tensor log_post_ker_J is [J, 1] 106 | log_post_ker_J = - torch.sum( 107 | 0.5 * eps.pow(2), dim=[-2,-1] 108 | ) 109 | log_post_ker_J += - torch.sum( 110 | (std_emb + SMALL).log(), dim = [-2,-1] 111 | ) 112 | log_post_ker_J = log_post_ker_J.view(-1,1) 113 | 114 | 115 | # concatenate log_post_ker_JK and log_post_ker_J into log_post_ker; the shape of the result tensor is [J, K+1]. 116 | log_post_ker = torch.cat([log_post_ker_JK, log_post_ker_J], dim=-1) 117 | 118 | # apply "log-mean-exp" to the above tensor 119 | log_post_ker -= np.log(K + 1.) / J 120 | # average over J items. 
121 | log_posterior_ker = torch.logsumexp(log_post_ker, dim=-1).mean() 122 | 123 | 124 | 125 | 126 | return rec_cost, log_prior_ker, log_posterior_ker 127 | 128 | 129 | 130 | -------------------------------------------------------------------------------- /train.py: -------------------------------------------------------------------------------- 1 | from __future__ import division 2 | from __future__ import print_function 3 | 4 | import argparse 5 | import time 6 | 7 | import numpy as np 8 | import scipy.sparse as sp 9 | import torch 10 | from torch import optim 11 | 12 | from model import GCNModelSIGVAE 13 | from optimizer import loss_function 14 | from utils import load_data, mask_test_edges, preprocess_graph, get_roc_score 15 | 16 | parser = argparse.ArgumentParser() 17 | parser.add_argument('--no-cuda', action='store_true', default=False, help='Disables CUDA training.') 18 | parser.add_argument('--model', type=str, default='gcn_vae', help="models used.") 19 | parser.add_argument('--seed', type=int, default=42, help='Random seed.') 20 | parser.add_argument('--epochs', type=int, default=200, help='Number of epochs to train.') 21 | parser.add_argument('--monit', type=int, default=50, help='Number of epochs to train before a test.') 22 | parser.add_argument('--edim', type=int, default=32, help='Number of units in noise epsilon.') 23 | parser.add_argument('--hidden1', type=int, default=32, help='Number of units in hidden layer 1.') 24 | parser.add_argument('--hidden2', type=int, default=16, help='Number of units in hidden layer 2.') 25 | parser.add_argument('--lr', type=float, default=0.01, help='Initial learning rate.') 26 | parser.add_argument('--dropout', type=float, default=0., help='Dropout rate (1 - keep probability).') 27 | parser.add_argument('--dataset-str', type=str, default='cora', help='type of dataset.') 28 | parser.add_argument('--encsto', type=str, default='semi', help='encoder stochasticity.') 29 | parser.add_argument('--gdc', type=str, default='ip', help='type of graph decoder.') 30 | parser.add_argument('--noise-dist', type=str, default='Bernoulli', 31 | help='Distribution of random noise in generating psi.') 32 | parser.add_argument('--K', type=int, default=15, 33 | help='Number of samples to draw for MC estimation of h(psi).') 34 | parser.add_argument('--J', type=int, default=20, 35 | help='Number of samples to draw for MC estimation of log-likelihood.') 36 | 37 | 38 | args = parser.parse_args() 39 | args.cuda = not args.no_cuda and torch.cuda.is_available() 40 | args.device = 'cuda' if args.cuda else 'cpu' 41 | 42 | np.random.seed(args.seed) 43 | torch.manual_seed(args.seed) 44 | if args.cuda: 45 | torch.cuda.manual_seed(args.seed) 46 | 47 | 48 | 49 | def gae_for(args): 50 | print("Using {} dataset".format(args.dataset_str)) 51 | # Set tensor dtype to float16 52 | # torch.set_default_tensor_type(torch.HalfTensor) 53 | 54 | adj, features = load_data(args.dataset_str) 55 | _, n_nodes, feat_dim = features.shape 56 | 57 | # Store original adjacency matrix (without diagonal entries) for later 58 | adj_orig = adj 59 | adj_orig = adj_orig - sp.dia_matrix((adj_orig.diagonal()[np.newaxis, :], [0]), shape=adj_orig.shape) 60 | adj_orig.eliminate_zeros() 61 | 62 | adj_train, train_edges, val_edges, val_edges_false, test_edges, test_edges_false = mask_test_edges(adj) 63 | adj = adj_train 64 | 65 | # Some preprocessing 66 | adj_norm = preprocess_graph(adj) 67 | adj_label = adj_train + sp.eye(adj_train.shape[0]) 68 | # adj_label = sparse_to_tuple(adj_label) 69 | adj_label = 
torch.FloatTensor(adj_label.toarray()) 70 | 71 | pos_weight = torch.tensor([float(adj.shape[0] * adj.shape[0] - adj.sum()) / adj.sum()]) 72 | norm = adj.shape[0] * adj.shape[0] / float((adj.shape[0] * adj.shape[0] - adj.sum()) * 2) 73 | 74 | model = GCNModelSIGVAE( 75 | args.edim, feat_dim, args.hidden1, args.hidden2, args.dropout, 76 | encsto=args.encsto, 77 | gdc=args.gdc, 78 | ndist=args.noise_dist, 79 | copyK=args.K, 80 | copyJ = args.J, 81 | device=args.device 82 | ) 83 | 84 | optimizer = optim.Adam(model.parameters(), lr=args.lr) 85 | 86 | hidden_emb = None 87 | 88 | model.to(args.device) 89 | features = features.to(args.device) 90 | adj_norm = adj_norm.to(args.device) 91 | adj_label = adj_label.to(args.device) 92 | pos_weight = pos_weight.to(args.device) 93 | 94 | 95 | 96 | 97 | for epoch in range(args.epochs): 98 | t = time.time() 99 | model.train() 100 | optimizer.zero_grad() 101 | recovered, mu, logvar, z, z_scaled, eps, rk, snr = model(features, adj_norm) 102 | loss_rec, loss_prior, loss_post = loss_function( 103 | preds=recovered, 104 | labels=adj_label, 105 | mu=mu, 106 | logvar=logvar, 107 | emb=z, 108 | eps=eps, 109 | n_nodes=n_nodes, 110 | norm=norm, 111 | pos_weight=pos_weight 112 | ) 113 | 114 | WU = np.min([epoch/300., 1.]) 115 | reg = (loss_post - loss_prior) * WU / (n_nodes**2) 116 | 117 | loss_train = loss_rec + WU * reg 118 | # loss_train = loss_rec 119 | loss_train.backward() 120 | 121 | cur_loss = loss_train.item() 122 | cur_rec = loss_rec.item() 123 | # cur_rec_bce = loss_rec1.item() 124 | optimizer.step() 125 | 126 | hidden_emb = z_scaled.detach().cpu().numpy() 127 | roc_curr, ap_curr = get_roc_score(hidden_emb, val_edges, val_edges_false, args.gdc) 128 | 129 | 130 | print("Epoch:", '%04d' % (epoch + 1), "train_loss=", "{:.5f}".format(cur_loss), 131 | "rec_loss=", "{:.5f}".format(cur_rec), 132 | "val_ap=", "{:.5f}".format(ap_curr), 133 | "time=", "{:.5f}".format(time.time() - t) 134 | ) 135 | # print(rk.detach().cpu().numpy()) 136 | 137 | cur_snr = snr.detach().cpu().numpy() 138 | print("SNR: ", cur_snr) 139 | 140 | 141 | if((epoch+1) % args.monit == 0): 142 | model.eval() 143 | recovered, mu, logvar, z, z_scaled, eps, rk, _ = model(features, adj_norm) 144 | hidden_emb = z_scaled.detach().cpu().numpy() 145 | roc_score, ap_score = get_roc_score(hidden_emb, test_edges, test_edges_false, args.gdc) 146 | rslt = "Test ROC score: {:.4f}, Test AP score: {:.4f}\n".format(roc_score, ap_score) 147 | print("\n", rslt, "\n") 148 | with open("results.txt", "a+") as f: 149 | f.write(rslt) 150 | 151 | print("Optimization Finished!") 152 | 153 | 154 | 155 | if __name__ == '__main__': 156 | gae_for(args) 157 | -------------------------------------------------------------------------------- /model.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn as nn 3 | import torch.nn.functional as F 4 | import torch.distributions as tdist 5 | 6 | import numpy as np 7 | 8 | from layers import GraphConvolution 9 | from torch.nn.parameter import Parameter 10 | 11 | 12 | 13 | class GCNModelSIGVAE(nn.Module): 14 | def __init__(self, ndim, input_feat_dim, hidden_dim1, hidden_dim2, dropout, encsto='semi', gdc='ip', ndist = 'Bernoulli', copyK=1, copyJ=1, device='cuda'): 15 | super(GCNModelSIGVAE, self).__init__() 16 | 17 | self.gce = GraphConvolution(ndim, hidden_dim1, dropout, act=F.relu) 18 | # self.gc0 = GraphConvolution(input_feat_dim, hidden_dim1, dropout, act=F.relu) 19 | self.gc1 = GraphConvolution(input_feat_dim, 
hidden_dim1, dropout, act=F.relu) 20 | self.gc2 = GraphConvolution(hidden_dim1, hidden_dim2, dropout, act=lambda x: x) 21 | self.gc3 = GraphConvolution(hidden_dim1, hidden_dim2, dropout, act=lambda x: x) 22 | self.encsto = encsto 23 | self.dc = GraphDecoder(hidden_dim2, dropout, gdc=gdc) 24 | self.device = device 25 | 26 | if ndist == 'Bernoulli': 27 | self.ndist = tdist.Bernoulli(torch.tensor([.5], device=self.device)) 28 | elif ndist == 'Normal': 29 | self.ndist = tdist.Normal( 30 | torch.tensor([0.], device=self.device), 31 | torch.tensor([1.], device=self.device)) 32 | elif ndist == 'Exponential': 33 | self.ndist = tdist.Exponential(torch.tensor([1.], device=self.device)) 34 | 35 | # K and J are defined in http://proceedings.mlr.press/v80/yin18b/yin18b-supp.pdf 36 | # Algorithm 1. 37 | self.K = copyK 38 | self.J = copyJ 39 | self.ndim = ndim 40 | 41 | # parameters in networks gc1 and gce are NOT identically distributed, so we need to reweight the output 42 | # of gce() so that the effect of hiddenx + hiddene is equivalent to gc(x || e). 43 | self.reweight = ((self.ndim + hidden_dim1) / (input_feat_dim + hidden_dim1))**(.5) 44 | 45 | 46 | def encode(self, x, adj): 47 | assert len(x.shape) == 3, 'The input tensor dimension is not 3!' 48 | # Without torch.Size(), an error would occur while resampling. 49 | hiddenx = self.gc1(x, adj) 50 | 51 | if self.ndim >= 1: 52 | e = self.ndist.sample(torch.Size([self.K+self.J, x.shape[1], self.ndim])) 53 | e = torch.squeeze(e, -1) 54 | e = e.mul(self.reweight) 55 | hiddene = self.gce(e, adj) 56 | else: 57 | print("no randomness.") 58 | hiddene = torch.zeros(self.K+self.J, hiddenx.shape[1], hiddenx.shape[2], device=self.device) 59 | 60 | 61 | 62 | hidden1 = hiddenx + hiddene 63 | 64 | # hiddens = self.gc0(x, adj) 65 | 66 | p_signal = hiddenx.pow(2.).mean() 67 | p_noise = hiddene.pow(2.).mean([-2,-1]) 68 | snr = (p_signal / p_noise) 69 | 70 | 71 | # below are 3 options for producing logvar 72 | # 1. stochastic logvar (more intuitive), 73 | # where logvar = self.gc3(hidden1, adj); 74 | # set args.encsto to 'full'. 75 | # 2. deterministic logvar, shared by all K+J samples, sharing a previous hidden layer with mu, 76 | # where logvar = self.gc3(hiddenx, adj); 77 | # set args.encsto to 'semi'. 78 | # 3. deterministic logvar, shared by all K+J samples, and produced by another branch of the network 79 | # (the one applied by A. Hasanzadeh et al.) 80 | 81 | 82 | mu = self.gc2(hidden1, adj) 83 | 84 | EncSto = (self.encsto == 'full') 85 | hidden_sd = EncSto * hidden1 + (1 - EncSto) * hiddenx 86 | 87 | logvar = self.gc3(hidden_sd, adj) 88 | 89 | return mu, logvar, snr 90 | 91 | def reparameterize(self, mu, logvar): 92 | std = torch.exp(logvar / 2.) 93 | eps = torch.randn_like(std) 94 | return eps.mul(std).add(mu), eps 95 | # return mu, eps 96 | 97 | def forward(self, x, adj): 98 | mu, logvar, snr = self.encode(x, adj) 99 | 100 | emb_mu = mu[self.K:, :] 101 | emb_logvar = logvar[self.K:, :] 102 | 103 | # check tensor size compatibility 104 | assert len(emb_mu.shape) == len(emb_logvar.shape), 'mu and logvar are not equi-dimension.' 
105 | 106 | z, eps = self.reparameterize(emb_mu, emb_logvar) 107 | 108 | adj_, z_scaled, rk = self.dc(z) 109 | 110 | return adj_, mu, logvar, z, z_scaled, eps, rk, snr 111 | 112 | 113 | class GraphDecoder(nn.Module): 114 | """Decoder for using inner product for prediction.""" 115 | 116 | def __init__(self, zdim, dropout, gdc='ip'): 117 | super(GraphDecoder, self).__init__() 118 | self.dropout = dropout 119 | self.gdc = gdc 120 | self.zdim = zdim 121 | self.rk_lgt = Parameter(torch.FloatTensor(torch.Size([1, zdim]))) 122 | self.reset_parameters() 123 | self.SMALL = 1e-16 124 | 125 | def reset_parameters(self): 126 | torch.nn.init.uniform_(self.rk_lgt, a=-6., b=0.) 127 | 128 | def forward(self, z): 129 | z = F.dropout(z, self.dropout, training=self.training) 130 | assert self.zdim == z.shape[2], 'zdim not compatible!' 131 | 132 | # The variable 'rk' in the code is the square root of the same notation in 133 | # http://proceedings.mlr.press/v80/yin18b/yin18b-supp.pdf 134 | # i.e., instead of do Z*diag(rk)*Z', we perform [Z*diag(rk)] * [Z*diag(rk)]'. 135 | rk = torch.sigmoid(self.rk_lgt).pow(.5) 136 | 137 | # Z shape: [J, N, zdim] 138 | # Z' shape: [J, zdim, N] 139 | if self.gdc == 'bp': 140 | z = z.mul(rk.view(1, 1, self.zdim)) 141 | adj_lgt = torch.bmm(z, torch.transpose(z, 1, 2)) 142 | 143 | if self.gdc == 'ip': 144 | adj = torch.sigmoid(adj_lgt) 145 | elif self.gdc == 'bp': 146 | # 1 - exp( - exp(ZZ')) 147 | adj_lgt = torch.clamp(adj_lgt, min=-np.Inf, max=25) 148 | adj = 1 - torch.exp(-adj_lgt.exp()) 149 | 150 | 151 | # if self.training: 152 | # adj_lgt = - torch.log(1 / (adj + self.SMALL) - 1 + self.SMALL) 153 | # else: 154 | # adj_mean = torch.mean(adj, dim=0, keepdim=True) 155 | # adj_lgt = - torch.log(1 / (adj_mean + self.SMALL) - 1 + self.SMALL) 156 | 157 | if not self.training: 158 | adj = torch.mean(adj, dim=0, keepdim=True) 159 | 160 | return adj, z, rk.pow(2) 161 | 162 | 163 | 164 | 165 | 166 | # class GCNModelVAE(nn.Module): 167 | # def __init__(self, input_feat_dim, hidden_dim1, hidden_dim2, dropout): 168 | # super(GCNModelVAE, self).__init__() 169 | # self.gc1 = GraphConvolution(input_feat_dim, hidden_dim1, dropout, act=F.relu) 170 | # self.gc2 = GraphConvolution(hidden_dim1, hidden_dim2, dropout, act=lambda x: x) 171 | # self.gc3 = GraphConvolution(hidden_dim1, hidden_dim2, dropout, act=lambda x: x) 172 | # self.dc = GraphDecoder(dropout, act=lambda x: x) 173 | 174 | # def encode(self, x, adj): 175 | # hidden1 = self.gc1(x, adj) 176 | # return self.gc2(hidden1, adj), self.gc3(hidden1, adj) 177 | 178 | # def reparameterize(self, mu, logvar): 179 | # if self.training: 180 | # std = torch.exp(logvar) 181 | # eps = torch.randn_like(std) 182 | # return eps.mul(std).add_(mu) 183 | # else: 184 | # return mu 185 | 186 | # def forward(self, x, adj): 187 | # mu, logvar = self.encode(x, adj) 188 | # mu = torch.mean(mu, dim=0, keepdim=False) 189 | # logvar = torch.mean(logvar, dim=0, keepdim=False) 190 | # z = self.reparameterize(mu, logvar) 191 | # return self.dc(z), mu, logvar 192 | 193 | 194 | # class GraphDecoder(nn.Module): 195 | # """Decoder for using inner product for prediction.""" 196 | 197 | # def __init__(self, dropout, act=torch.sigmoid): 198 | # super(GraphDecoder, self).__init__() 199 | # self.dropout = dropout 200 | # self.act = act 201 | 202 | 203 | # def forward(self, z): 204 | # z = F.dropout(z, self.dropout, training=self.training) 205 | # adj = self.act(torch.mm(z, z.t())) 206 | # return adj 
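The following is a minimal, standalone sketch (not part of the repository) of the two decoder link functions implemented in `GraphDecoder.forward` above; the toy tensor sizes are made up for illustration, and the learned `diag(rk)` scaling applied before the 'bp' decoder is omitted for brevity.
```
import torch

# toy batch of J=2 embedding samples for N=4 nodes with zdim=16 (hypothetical sizes)
z = torch.randn(2, 4, 16)

# pairwise inner products z_i . z_j, shape [2, 4, 4]
logits = torch.bmm(z, z.transpose(1, 2))

# 'ip' decoder: edge probability = sigmoid(z_i . z_j)
adj_ip = torch.sigmoid(logits)

# 'bp' decoder: edge probability = 1 - exp(-exp(z_i . z_j)), with the same clamp as in the model
adj_bp = 1 - torch.exp(-torch.clamp(logits, max=25).exp())

print(adj_ip.shape, adj_bp.shape)  # both [2, 4, 4], all entries in (0, 1)
```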
-------------------------------------------------------------------------------- /utils.py: -------------------------------------------------------------------------------- 1 | import pickle as pkl 2 | import sys 3 | import networkx as nx 4 | import numpy as np 5 | import scipy.sparse as sp 6 | import torch 7 | from sklearn.metrics import roc_auc_score, average_precision_score 8 | 9 | 10 | def load_data(dataset): 11 | # load the data: x, tx, allx, graph 12 | names = ['x', 'tx', 'allx', 'graph'] 13 | objects = [] 14 | for i in range(len(names)): 15 | ''' 16 | fix Pickle incompatibility of numpy arrays between Python 2 and 3 17 | https://stackoverflow.com/questions/11305790/pickle-incompatibility-of-numpy-arrays-between-python-2-and-3 18 | ''' 19 | # with open("data/ind.{}.{}".format(dataset, names[i]), 'rb') as rf: 20 | # u = pkl._Unpickler(rf) 21 | # u.encoding = 'latin1' 22 | # cur_data = u.load() 23 | # objects.append(cur_data) 24 | 25 | with open("data/ind.{}.{}".format(dataset, names[i]), 'rb') as f: 26 | if sys.version_info > (3, 0): 27 | objects.append(pkl.load(f, encoding='latin1')) 28 | else: 29 | objects.append(pkl.load(f)) 30 | 31 | x, tx, allx, graph = tuple(objects) 32 | test_idx_reorder = parse_index_file( 33 | "data/ind.{}.test.index".format(dataset)) 34 | test_idx_range = np.sort(test_idx_reorder) 35 | 36 | if dataset == 'citeseer': 37 | # Fix citeseer dataset (there are some isolated nodes in the graph) 38 | # Find isolated nodes, add them as zero-vecs into the right position 39 | test_idx_range_full = range( 40 | min(test_idx_reorder), max(test_idx_reorder) + 1) 41 | tx_extended = sp.lil_matrix((len(test_idx_range_full), x.shape[1])) 42 | tx_extended[test_idx_range - min(test_idx_range), :] = tx 43 | tx = tx_extended 44 | 45 | features = sp.vstack((allx, tx)).tolil() 46 | features[test_idx_reorder, :] = features[test_idx_range, :] 47 | features = torch.FloatTensor(np.array(features.todense())) 48 | # features = features / features.sum(-1, keepdim=True) 49 | # adding a dimension to features for future expansion 50 | if len(features.shape) == 2: 51 | features = features.view([1,features.shape[0], features.shape[1]]) 52 | adj = nx.adjacency_matrix(nx.from_dict_of_lists(graph)) 53 | 54 | return adj, features 55 | 56 | 57 | def parse_index_file(filename): 58 | index = [] 59 | for line in open(filename): 60 | index.append(int(line.strip())) 61 | return index 62 | 63 | 64 | def sparse_to_tuple(sparse_mx): 65 | if not sp.isspmatrix_coo(sparse_mx): 66 | sparse_mx = sparse_mx.tocoo() 67 | coords = np.vstack((sparse_mx.row, sparse_mx.col)).transpose() 68 | values = sparse_mx.data 69 | shape = sparse_mx.shape 70 | return coords, values, shape 71 | 72 | 73 | def mask_test_edges(adj): 74 | # Function to build test set with 10% positive links 75 | # NOTE: Splits are randomized and results might slightly deviate from reported numbers in the paper. 76 | # TODO: Clean up. 
77 | 78 | # Remove diagonal elements 79 | adj = adj - sp.dia_matrix((adj.diagonal()[np.newaxis, :], [0]), shape=adj.shape) 80 | adj.eliminate_zeros() 81 | # Check that diag is zero: 82 | assert np.diag(adj.todense()).sum() == 0 83 | 84 | adj_triu = sp.triu(adj) 85 | adj_tuple = sparse_to_tuple(adj_triu) 86 | edges = adj_tuple[0] 87 | edges_all = sparse_to_tuple(adj)[0] 88 | num_test = int(np.floor(edges.shape[0] / 10.)) 89 | num_val = int(np.floor(edges.shape[0] / 20.)) 90 | 91 | all_edge_idx = list(range(edges.shape[0])) 92 | np.random.shuffle(all_edge_idx) 93 | val_edge_idx = all_edge_idx[:num_val] 94 | test_edge_idx = all_edge_idx[num_val:(num_val + num_test)] 95 | test_edges = edges[test_edge_idx] 96 | val_edges = edges[val_edge_idx] 97 | train_edges = np.delete(edges, np.hstack([test_edge_idx, val_edge_idx]), axis=0) 98 | 99 | def ismember(a, b, tol=5): 100 | rows_close = np.all(np.round(a - b[:, None], tol) == 0, axis=-1) 101 | return np.any(rows_close) 102 | 103 | test_edges_false = [] 104 | while len(test_edges_false) < len(test_edges): 105 | idx_i = np.random.randint(0, adj.shape[0]) 106 | idx_j = np.random.randint(0, adj.shape[0]) 107 | if idx_i == idx_j: 108 | continue 109 | if ismember([idx_i, idx_j], edges_all): 110 | continue 111 | if test_edges_false: 112 | if ismember([idx_j, idx_i], np.array(test_edges_false)): 113 | continue 114 | if ismember([idx_i, idx_j], np.array(test_edges_false)): 115 | continue 116 | test_edges_false.append([idx_i, idx_j]) 117 | 118 | val_edges_false = [] 119 | while len(val_edges_false) < len(val_edges): 120 | idx_i = np.random.randint(0, adj.shape[0]) 121 | idx_j = np.random.randint(0, adj.shape[0]) 122 | if idx_i == idx_j: 123 | continue 124 | if ismember([idx_i, idx_j], train_edges): 125 | continue 126 | if ismember([idx_j, idx_i], train_edges): 127 | continue 128 | if ismember([idx_i, idx_j], val_edges): 129 | continue 130 | if ismember([idx_j, idx_i], val_edges): 131 | continue 132 | if val_edges_false: 133 | if ismember([idx_j, idx_i], np.array(val_edges_false)): 134 | continue 135 | if ismember([idx_i, idx_j], np.array(val_edges_false)): 136 | continue 137 | val_edges_false.append([idx_i, idx_j]) 138 | 139 | assert ~ismember(test_edges_false, edges_all) 140 | assert ~ismember(val_edges_false, edges_all) 141 | assert ~ismember(val_edges, train_edges) 142 | assert ~ismember(test_edges, train_edges) 143 | assert ~ismember(val_edges, test_edges) 144 | 145 | data = np.ones(train_edges.shape[0]) 146 | 147 | # Re-build adj matrix 148 | adj_train = sp.csr_matrix((data, (train_edges[:, 0], train_edges[:, 1])), shape=adj.shape) 149 | adj_train = adj_train + adj_train.T 150 | 151 | # NOTE: these edge lists only contain single direction of edge! 
152 | return adj_train, train_edges, val_edges, val_edges_false, test_edges, test_edges_false 153 | 154 | 155 | def preprocess_graph(adj): 156 | adj = sp.coo_matrix(adj) 157 | adj_ = adj + sp.eye(adj.shape[0]) 158 | rowsum = np.array(adj_.sum(1)) 159 | degree_mat_inv_sqrt = sp.diags(np.power(rowsum, -0.5).flatten()) 160 | adj_normalized = adj_.dot(degree_mat_inv_sqrt).transpose().dot(degree_mat_inv_sqrt).tocoo() 161 | # return sparse_to_tuple(adj_normalized) 162 | return sparse_mx_to_torch_sparse_tensor(adj_normalized) 163 | 164 | 165 | def sparse_mx_to_torch_sparse_tensor(sparse_mx): 166 | """Convert a scipy sparse matrix to a torch sparse tensor.""" 167 | sparse_mx = sparse_mx.tocoo().astype(np.float32) 168 | indices = torch.from_numpy( 169 | np.vstack((sparse_mx.row, sparse_mx.col)).astype(np.int64)) 170 | values = torch.from_numpy(sparse_mx.data) 171 | shape = torch.Size(sparse_mx.shape) 172 | return torch.sparse.FloatTensor(indices, values, shape) 173 | 174 | 175 | def get_roc_score(emb, edges_pos, edges_neg, gdc): 176 | def GraphDC(x): 177 | if gdc == 'ip': 178 | return 1 / (1 + np.exp(-x)) 179 | elif gdc == 'bp': 180 | return 1 - np.exp( - np.exp(x)) 181 | 182 | J = emb.shape[0] 183 | 184 | # Predict on test set of edges 185 | edges_pos = np.array(edges_pos).transpose((1,0)) 186 | emb_pos_sp = emb[:, edges_pos[0], :] 187 | emb_pos_ep = emb[:, edges_pos[1], :] 188 | 189 | # preds_pos is torch.Tensor with shape [J, #pos_edges] 190 | preds_pos = GraphDC( 191 | np.einsum('ijk,ijk->ij', emb_pos_sp, emb_pos_ep) 192 | ) 193 | 194 | edges_neg = np.array(edges_neg).transpose((1,0)) 195 | emb_neg_sp = emb[:, edges_neg[0], :] 196 | emb_neg_ep = emb[:, edges_neg[1], :] 197 | 198 | preds_neg = GraphDC( 199 | np.einsum('ijk,ijk->ij', emb_neg_sp, emb_neg_ep) 200 | ) 201 | 202 | preds_all = np.hstack([preds_pos, preds_neg]) 203 | labels_all = np.hstack([np.ones(preds_pos.shape[-1]), np.zeros(preds_neg.shape[-1])]) 204 | 205 | roc_score = np.array( 206 | [roc_auc_score(labels_all, pred_all.flatten()) \ 207 | for pred_all in np.vsplit(preds_all, J)] 208 | ).mean() 209 | 210 | ap_score = np.array( 211 | [average_precision_score(labels_all, pred_all.flatten()) \ 212 | for pred_all in np.vsplit(preds_all, J)] 213 | ).mean() 214 | 215 | return roc_score, ap_score 216 | -------------------------------------------------------------------------------- /data/ind.cora.test.index: -------------------------------------------------------------------------------- 1 | 2692 2 | 2532 3 | 2050 4 | 1715 5 | 2362 6 | 2609 7 | 2622 8 | 1975 9 | 2081 10 | 1767 11 | 2263 12 | 1725 13 | 2588 14 | 2259 15 | 2357 16 | 1998 17 | 2574 18 | 2179 19 | 2291 20 | 2382 21 | 1812 22 | 1751 23 | 2422 24 | 1937 25 | 2631 26 | 2510 27 | 2378 28 | 2589 29 | 2345 30 | 1943 31 | 1850 32 | 2298 33 | 1825 34 | 2035 35 | 2507 36 | 2313 37 | 1906 38 | 1797 39 | 2023 40 | 2159 41 | 2495 42 | 1886 43 | 2122 44 | 2369 45 | 2461 46 | 1925 47 | 2565 48 | 1858 49 | 2234 50 | 2000 51 | 1846 52 | 2318 53 | 1723 54 | 2559 55 | 2258 56 | 1763 57 | 1991 58 | 1922 59 | 2003 60 | 2662 61 | 2250 62 | 2064 63 | 2529 64 | 1888 65 | 2499 66 | 2454 67 | 2320 68 | 2287 69 | 2203 70 | 2018 71 | 2002 72 | 2632 73 | 2554 74 | 2314 75 | 2537 76 | 1760 77 | 2088 78 | 2086 79 | 2218 80 | 2605 81 | 1953 82 | 2403 83 | 1920 84 | 2015 85 | 2335 86 | 2535 87 | 1837 88 | 2009 89 | 1905 90 | 2636 91 | 1942 92 | 2193 93 | 2576 94 | 2373 95 | 1873 96 | 2463 97 | 2509 98 | 1954 99 | 2656 100 | 2455 101 | 2494 102 | 2295 103 | 2114 104 | 2561 105 | 2176 106 | 2275 107 | 2635 
108 | 2442 109 | 2704 110 | 2127 111 | 2085 112 | 2214 113 | 2487 114 | 1739 115 | 2543 116 | 1783 117 | 2485 118 | 2262 119 | 2472 120 | 2326 121 | 1738 122 | 2170 123 | 2100 124 | 2384 125 | 2152 126 | 2647 127 | 2693 128 | 2376 129 | 1775 130 | 1726 131 | 2476 132 | 2195 133 | 1773 134 | 1793 135 | 2194 136 | 2581 137 | 1854 138 | 2524 139 | 1945 140 | 1781 141 | 1987 142 | 2599 143 | 1744 144 | 2225 145 | 2300 146 | 1928 147 | 2042 148 | 2202 149 | 1958 150 | 1816 151 | 1916 152 | 2679 153 | 2190 154 | 1733 155 | 2034 156 | 2643 157 | 2177 158 | 1883 159 | 1917 160 | 1996 161 | 2491 162 | 2268 163 | 2231 164 | 2471 165 | 1919 166 | 1909 167 | 2012 168 | 2522 169 | 1865 170 | 2466 171 | 2469 172 | 2087 173 | 2584 174 | 2563 175 | 1924 176 | 2143 177 | 1736 178 | 1966 179 | 2533 180 | 2490 181 | 2630 182 | 1973 183 | 2568 184 | 1978 185 | 2664 186 | 2633 187 | 2312 188 | 2178 189 | 1754 190 | 2307 191 | 2480 192 | 1960 193 | 1742 194 | 1962 195 | 2160 196 | 2070 197 | 2553 198 | 2433 199 | 1768 200 | 2659 201 | 2379 202 | 2271 203 | 1776 204 | 2153 205 | 1877 206 | 2027 207 | 2028 208 | 2155 209 | 2196 210 | 2483 211 | 2026 212 | 2158 213 | 2407 214 | 1821 215 | 2131 216 | 2676 217 | 2277 218 | 2489 219 | 2424 220 | 1963 221 | 1808 222 | 1859 223 | 2597 224 | 2548 225 | 2368 226 | 1817 227 | 2405 228 | 2413 229 | 2603 230 | 2350 231 | 2118 232 | 2329 233 | 1969 234 | 2577 235 | 2475 236 | 2467 237 | 2425 238 | 1769 239 | 2092 240 | 2044 241 | 2586 242 | 2608 243 | 1983 244 | 2109 245 | 2649 246 | 1964 247 | 2144 248 | 1902 249 | 2411 250 | 2508 251 | 2360 252 | 1721 253 | 2005 254 | 2014 255 | 2308 256 | 2646 257 | 1949 258 | 1830 259 | 2212 260 | 2596 261 | 1832 262 | 1735 263 | 1866 264 | 2695 265 | 1941 266 | 2546 267 | 2498 268 | 2686 269 | 2665 270 | 1784 271 | 2613 272 | 1970 273 | 2021 274 | 2211 275 | 2516 276 | 2185 277 | 2479 278 | 2699 279 | 2150 280 | 1990 281 | 2063 282 | 2075 283 | 1979 284 | 2094 285 | 1787 286 | 2571 287 | 2690 288 | 1926 289 | 2341 290 | 2566 291 | 1957 292 | 1709 293 | 1955 294 | 2570 295 | 2387 296 | 1811 297 | 2025 298 | 2447 299 | 2696 300 | 2052 301 | 2366 302 | 1857 303 | 2273 304 | 2245 305 | 2672 306 | 2133 307 | 2421 308 | 1929 309 | 2125 310 | 2319 311 | 2641 312 | 2167 313 | 2418 314 | 1765 315 | 1761 316 | 1828 317 | 2188 318 | 1972 319 | 1997 320 | 2419 321 | 2289 322 | 2296 323 | 2587 324 | 2051 325 | 2440 326 | 2053 327 | 2191 328 | 1923 329 | 2164 330 | 1861 331 | 2339 332 | 2333 333 | 2523 334 | 2670 335 | 2121 336 | 1921 337 | 1724 338 | 2253 339 | 2374 340 | 1940 341 | 2545 342 | 2301 343 | 2244 344 | 2156 345 | 1849 346 | 2551 347 | 2011 348 | 2279 349 | 2572 350 | 1757 351 | 2400 352 | 2569 353 | 2072 354 | 2526 355 | 2173 356 | 2069 357 | 2036 358 | 1819 359 | 1734 360 | 1880 361 | 2137 362 | 2408 363 | 2226 364 | 2604 365 | 1771 366 | 2698 367 | 2187 368 | 2060 369 | 1756 370 | 2201 371 | 2066 372 | 2439 373 | 1844 374 | 1772 375 | 2383 376 | 2398 377 | 1708 378 | 1992 379 | 1959 380 | 1794 381 | 2426 382 | 2702 383 | 2444 384 | 1944 385 | 1829 386 | 2660 387 | 2497 388 | 2607 389 | 2343 390 | 1730 391 | 2624 392 | 1790 393 | 1935 394 | 1967 395 | 2401 396 | 2255 397 | 2355 398 | 2348 399 | 1931 400 | 2183 401 | 2161 402 | 2701 403 | 1948 404 | 2501 405 | 2192 406 | 2404 407 | 2209 408 | 2331 409 | 1810 410 | 2363 411 | 2334 412 | 1887 413 | 2393 414 | 2557 415 | 1719 416 | 1732 417 | 1986 418 | 2037 419 | 2056 420 | 1867 421 | 2126 422 | 1932 423 | 2117 424 | 1807 425 | 1801 426 | 1743 427 | 2041 428 | 1843 429 | 2388 430 | 2221 
431 | 1833 432 | 2677 433 | 1778 434 | 2661 435 | 2306 436 | 2394 437 | 2106 438 | 2430 439 | 2371 440 | 2606 441 | 2353 442 | 2269 443 | 2317 444 | 2645 445 | 2372 446 | 2550 447 | 2043 448 | 1968 449 | 2165 450 | 2310 451 | 1985 452 | 2446 453 | 1982 454 | 2377 455 | 2207 456 | 1818 457 | 1913 458 | 1766 459 | 1722 460 | 1894 461 | 2020 462 | 1881 463 | 2621 464 | 2409 465 | 2261 466 | 2458 467 | 2096 468 | 1712 469 | 2594 470 | 2293 471 | 2048 472 | 2359 473 | 1839 474 | 2392 475 | 2254 476 | 1911 477 | 2101 478 | 2367 479 | 1889 480 | 1753 481 | 2555 482 | 2246 483 | 2264 484 | 2010 485 | 2336 486 | 2651 487 | 2017 488 | 2140 489 | 1842 490 | 2019 491 | 1890 492 | 2525 493 | 2134 494 | 2492 495 | 2652 496 | 2040 497 | 2145 498 | 2575 499 | 2166 500 | 1999 501 | 2434 502 | 1711 503 | 2276 504 | 2450 505 | 2389 506 | 2669 507 | 2595 508 | 1814 509 | 2039 510 | 2502 511 | 1896 512 | 2168 513 | 2344 514 | 2637 515 | 2031 516 | 1977 517 | 2380 518 | 1936 519 | 2047 520 | 2460 521 | 2102 522 | 1745 523 | 2650 524 | 2046 525 | 2514 526 | 1980 527 | 2352 528 | 2113 529 | 1713 530 | 2058 531 | 2558 532 | 1718 533 | 1864 534 | 1876 535 | 2338 536 | 1879 537 | 1891 538 | 2186 539 | 2451 540 | 2181 541 | 2638 542 | 2644 543 | 2103 544 | 2591 545 | 2266 546 | 2468 547 | 1869 548 | 2582 549 | 2674 550 | 2361 551 | 2462 552 | 1748 553 | 2215 554 | 2615 555 | 2236 556 | 2248 557 | 2493 558 | 2342 559 | 2449 560 | 2274 561 | 1824 562 | 1852 563 | 1870 564 | 2441 565 | 2356 566 | 1835 567 | 2694 568 | 2602 569 | 2685 570 | 1893 571 | 2544 572 | 2536 573 | 1994 574 | 1853 575 | 1838 576 | 1786 577 | 1930 578 | 2539 579 | 1892 580 | 2265 581 | 2618 582 | 2486 583 | 2583 584 | 2061 585 | 1796 586 | 1806 587 | 2084 588 | 1933 589 | 2095 590 | 2136 591 | 2078 592 | 1884 593 | 2438 594 | 2286 595 | 2138 596 | 1750 597 | 2184 598 | 1799 599 | 2278 600 | 2410 601 | 2642 602 | 2435 603 | 1956 604 | 2399 605 | 1774 606 | 2129 607 | 1898 608 | 1823 609 | 1938 610 | 2299 611 | 1862 612 | 2420 613 | 2673 614 | 1984 615 | 2204 616 | 1717 617 | 2074 618 | 2213 619 | 2436 620 | 2297 621 | 2592 622 | 2667 623 | 2703 624 | 2511 625 | 1779 626 | 1782 627 | 2625 628 | 2365 629 | 2315 630 | 2381 631 | 1788 632 | 1714 633 | 2302 634 | 1927 635 | 2325 636 | 2506 637 | 2169 638 | 2328 639 | 2629 640 | 2128 641 | 2655 642 | 2282 643 | 2073 644 | 2395 645 | 2247 646 | 2521 647 | 2260 648 | 1868 649 | 1988 650 | 2324 651 | 2705 652 | 2541 653 | 1731 654 | 2681 655 | 2707 656 | 2465 657 | 1785 658 | 2149 659 | 2045 660 | 2505 661 | 2611 662 | 2217 663 | 2180 664 | 1904 665 | 2453 666 | 2484 667 | 1871 668 | 2309 669 | 2349 670 | 2482 671 | 2004 672 | 1965 673 | 2406 674 | 2162 675 | 1805 676 | 2654 677 | 2007 678 | 1947 679 | 1981 680 | 2112 681 | 2141 682 | 1720 683 | 1758 684 | 2080 685 | 2330 686 | 2030 687 | 2432 688 | 2089 689 | 2547 690 | 1820 691 | 1815 692 | 2675 693 | 1840 694 | 2658 695 | 2370 696 | 2251 697 | 1908 698 | 2029 699 | 2068 700 | 2513 701 | 2549 702 | 2267 703 | 2580 704 | 2327 705 | 2351 706 | 2111 707 | 2022 708 | 2321 709 | 2614 710 | 2252 711 | 2104 712 | 1822 713 | 2552 714 | 2243 715 | 1798 716 | 2396 717 | 2663 718 | 2564 719 | 2148 720 | 2562 721 | 2684 722 | 2001 723 | 2151 724 | 2706 725 | 2240 726 | 2474 727 | 2303 728 | 2634 729 | 2680 730 | 2055 731 | 2090 732 | 2503 733 | 2347 734 | 2402 735 | 2238 736 | 1950 737 | 2054 738 | 2016 739 | 1872 740 | 2233 741 | 1710 742 | 2032 743 | 2540 744 | 2628 745 | 1795 746 | 2616 747 | 1903 748 | 2531 749 | 2567 750 | 1946 751 | 1897 752 | 2222 753 | 2227 
754 | 2627 755 | 1856 756 | 2464 757 | 2241 758 | 2481 759 | 2130 760 | 2311 761 | 2083 762 | 2223 763 | 2284 764 | 2235 765 | 2097 766 | 1752 767 | 2515 768 | 2527 769 | 2385 770 | 2189 771 | 2283 772 | 2182 773 | 2079 774 | 2375 775 | 2174 776 | 2437 777 | 1993 778 | 2517 779 | 2443 780 | 2224 781 | 2648 782 | 2171 783 | 2290 784 | 2542 785 | 2038 786 | 1855 787 | 1831 788 | 1759 789 | 1848 790 | 2445 791 | 1827 792 | 2429 793 | 2205 794 | 2598 795 | 2657 796 | 1728 797 | 2065 798 | 1918 799 | 2427 800 | 2573 801 | 2620 802 | 2292 803 | 1777 804 | 2008 805 | 1875 806 | 2288 807 | 2256 808 | 2033 809 | 2470 810 | 2585 811 | 2610 812 | 2082 813 | 2230 814 | 1915 815 | 1847 816 | 2337 817 | 2512 818 | 2386 819 | 2006 820 | 2653 821 | 2346 822 | 1951 823 | 2110 824 | 2639 825 | 2520 826 | 1939 827 | 2683 828 | 2139 829 | 2220 830 | 1910 831 | 2237 832 | 1900 833 | 1836 834 | 2197 835 | 1716 836 | 1860 837 | 2077 838 | 2519 839 | 2538 840 | 2323 841 | 1914 842 | 1971 843 | 1845 844 | 2132 845 | 1802 846 | 1907 847 | 2640 848 | 2496 849 | 2281 850 | 2198 851 | 2416 852 | 2285 853 | 1755 854 | 2431 855 | 2071 856 | 2249 857 | 2123 858 | 1727 859 | 2459 860 | 2304 861 | 2199 862 | 1791 863 | 1809 864 | 1780 865 | 2210 866 | 2417 867 | 1874 868 | 1878 869 | 2116 870 | 1961 871 | 1863 872 | 2579 873 | 2477 874 | 2228 875 | 2332 876 | 2578 877 | 2457 878 | 2024 879 | 1934 880 | 2316 881 | 1841 882 | 1764 883 | 1737 884 | 2322 885 | 2239 886 | 2294 887 | 1729 888 | 2488 889 | 1974 890 | 2473 891 | 2098 892 | 2612 893 | 1834 894 | 2340 895 | 2423 896 | 2175 897 | 2280 898 | 2617 899 | 2208 900 | 2560 901 | 1741 902 | 2600 903 | 2059 904 | 1747 905 | 2242 906 | 2700 907 | 2232 908 | 2057 909 | 2147 910 | 2682 911 | 1792 912 | 1826 913 | 2120 914 | 1895 915 | 2364 916 | 2163 917 | 1851 918 | 2391 919 | 2414 920 | 2452 921 | 1803 922 | 1989 923 | 2623 924 | 2200 925 | 2528 926 | 2415 927 | 1804 928 | 2146 929 | 2619 930 | 2687 931 | 1762 932 | 2172 933 | 2270 934 | 2678 935 | 2593 936 | 2448 937 | 1882 938 | 2257 939 | 2500 940 | 1899 941 | 2478 942 | 2412 943 | 2107 944 | 1746 945 | 2428 946 | 2115 947 | 1800 948 | 1901 949 | 2397 950 | 2530 951 | 1912 952 | 2108 953 | 2206 954 | 2091 955 | 1740 956 | 2219 957 | 1976 958 | 2099 959 | 2142 960 | 2671 961 | 2668 962 | 2216 963 | 2272 964 | 2229 965 | 2666 966 | 2456 967 | 2534 968 | 2697 969 | 2688 970 | 2062 971 | 2691 972 | 2689 973 | 2154 974 | 2590 975 | 2626 976 | 2390 977 | 1813 978 | 2067 979 | 1952 980 | 2518 981 | 2358 982 | 1789 983 | 2076 984 | 2049 985 | 2119 986 | 2013 987 | 2124 988 | 2556 989 | 2105 990 | 2093 991 | 1885 992 | 2305 993 | 2354 994 | 2135 995 | 2601 996 | 1770 997 | 1995 998 | 2504 999 | 1749 1000 | 2157 1001 | -------------------------------------------------------------------------------- /data/ind.citeseer.test.index: -------------------------------------------------------------------------------- 1 | 2488 2 | 2644 3 | 3261 4 | 2804 5 | 3176 6 | 2432 7 | 3310 8 | 2410 9 | 2812 10 | 2520 11 | 2994 12 | 3282 13 | 2680 14 | 2848 15 | 2670 16 | 3005 17 | 2977 18 | 2592 19 | 2967 20 | 2461 21 | 3184 22 | 2852 23 | 2768 24 | 2905 25 | 2851 26 | 3129 27 | 3164 28 | 2438 29 | 2793 30 | 2763 31 | 2528 32 | 2954 33 | 2347 34 | 2640 35 | 3265 36 | 2874 37 | 2446 38 | 2856 39 | 3149 40 | 2374 41 | 3097 42 | 3301 43 | 2664 44 | 2418 45 | 2655 46 | 2464 47 | 2596 48 | 3262 49 | 3278 50 | 2320 51 | 2612 52 | 2614 53 | 2550 54 | 2626 55 | 2772 56 | 3007 57 | 2733 58 | 2516 59 | 2476 60 | 2798 61 | 2561 62 | 2839 63 | 2685 64 | 2391 65 | 
2705 66 | 3098 67 | 2754 68 | 3251 69 | 2767 70 | 2630 71 | 2727 72 | 2513 73 | 2701 74 | 3264 75 | 2792 76 | 2821 77 | 3260 78 | 2462 79 | 3307 80 | 2639 81 | 2900 82 | 3060 83 | 2672 84 | 3116 85 | 2731 86 | 3316 87 | 2386 88 | 2425 89 | 2518 90 | 3151 91 | 2586 92 | 2797 93 | 2479 94 | 3117 95 | 2580 96 | 3182 97 | 2459 98 | 2508 99 | 3052 100 | 3230 101 | 3215 102 | 2803 103 | 2969 104 | 2562 105 | 2398 106 | 3325 107 | 2343 108 | 3030 109 | 2414 110 | 2776 111 | 2383 112 | 3173 113 | 2850 114 | 2499 115 | 3312 116 | 2648 117 | 2784 118 | 2898 119 | 3056 120 | 2484 121 | 3179 122 | 3132 123 | 2577 124 | 2563 125 | 2867 126 | 3317 127 | 2355 128 | 3207 129 | 3178 130 | 2968 131 | 3319 132 | 2358 133 | 2764 134 | 3001 135 | 2683 136 | 3271 137 | 2321 138 | 2567 139 | 2502 140 | 3246 141 | 2715 142 | 3066 143 | 2390 144 | 2381 145 | 3162 146 | 2741 147 | 2498 148 | 2790 149 | 3038 150 | 3321 151 | 2481 152 | 3050 153 | 3161 154 | 3122 155 | 2801 156 | 2957 157 | 3177 158 | 2965 159 | 2621 160 | 3208 161 | 2921 162 | 2802 163 | 2357 164 | 2677 165 | 2519 166 | 2860 167 | 2696 168 | 2368 169 | 3241 170 | 2858 171 | 2419 172 | 2762 173 | 2875 174 | 3222 175 | 3064 176 | 2827 177 | 3044 178 | 2471 179 | 3062 180 | 2982 181 | 2736 182 | 2322 183 | 2709 184 | 2766 185 | 2424 186 | 2602 187 | 2970 188 | 2675 189 | 3299 190 | 2554 191 | 2964 192 | 2597 193 | 2753 194 | 2979 195 | 2523 196 | 2912 197 | 2896 198 | 2317 199 | 3167 200 | 2813 201 | 2482 202 | 2557 203 | 3043 204 | 3244 205 | 2985 206 | 2460 207 | 2363 208 | 3272 209 | 3045 210 | 3192 211 | 2453 212 | 2656 213 | 2834 214 | 2443 215 | 3202 216 | 2926 217 | 2711 218 | 2633 219 | 2384 220 | 2752 221 | 3285 222 | 2817 223 | 2483 224 | 2919 225 | 2924 226 | 2661 227 | 2698 228 | 2361 229 | 2662 230 | 2819 231 | 3143 232 | 2316 233 | 3196 234 | 2739 235 | 2345 236 | 2578 237 | 2822 238 | 3229 239 | 2908 240 | 2917 241 | 2692 242 | 3200 243 | 2324 244 | 2522 245 | 3322 246 | 2697 247 | 3163 248 | 3093 249 | 3233 250 | 2774 251 | 2371 252 | 2835 253 | 2652 254 | 2539 255 | 2843 256 | 3231 257 | 2976 258 | 2429 259 | 2367 260 | 3144 261 | 2564 262 | 3283 263 | 3217 264 | 3035 265 | 2962 266 | 2433 267 | 2415 268 | 2387 269 | 3021 270 | 2595 271 | 2517 272 | 2468 273 | 3061 274 | 2673 275 | 2348 276 | 3027 277 | 2467 278 | 3318 279 | 2959 280 | 3273 281 | 2392 282 | 2779 283 | 2678 284 | 3004 285 | 2634 286 | 2974 287 | 3198 288 | 2342 289 | 2376 290 | 3249 291 | 2868 292 | 2952 293 | 2710 294 | 2838 295 | 2335 296 | 2524 297 | 2650 298 | 3186 299 | 2743 300 | 2545 301 | 2841 302 | 2515 303 | 2505 304 | 3181 305 | 2945 306 | 2738 307 | 2933 308 | 3303 309 | 2611 310 | 3090 311 | 2328 312 | 3010 313 | 3016 314 | 2504 315 | 2936 316 | 3266 317 | 3253 318 | 2840 319 | 3034 320 | 2581 321 | 2344 322 | 2452 323 | 2654 324 | 3199 325 | 3137 326 | 2514 327 | 2394 328 | 2544 329 | 2641 330 | 2613 331 | 2618 332 | 2558 333 | 2593 334 | 2532 335 | 2512 336 | 2975 337 | 3267 338 | 2566 339 | 2951 340 | 3300 341 | 2869 342 | 2629 343 | 2747 344 | 3055 345 | 2831 346 | 3105 347 | 3168 348 | 3100 349 | 2431 350 | 2828 351 | 2684 352 | 3269 353 | 2910 354 | 2865 355 | 2693 356 | 2884 357 | 3228 358 | 2783 359 | 3247 360 | 2770 361 | 3157 362 | 2421 363 | 2382 364 | 2331 365 | 3203 366 | 3240 367 | 2351 368 | 3114 369 | 2986 370 | 2688 371 | 2439 372 | 2996 373 | 3079 374 | 3103 375 | 3296 376 | 2349 377 | 2372 378 | 3096 379 | 2422 380 | 2551 381 | 3069 382 | 2737 383 | 3084 384 | 3304 385 | 3022 386 | 2542 387 | 3204 388 | 2949 389 | 2318 390 | 2450 391 | 
3140 392 | 2734 393 | 2881 394 | 2576 395 | 3054 396 | 3089 397 | 3125 398 | 2761 399 | 3136 400 | 3111 401 | 2427 402 | 2466 403 | 3101 404 | 3104 405 | 3259 406 | 2534 407 | 2961 408 | 3191 409 | 3000 410 | 3036 411 | 2356 412 | 2800 413 | 3155 414 | 3224 415 | 2646 416 | 2735 417 | 3020 418 | 2866 419 | 2426 420 | 2448 421 | 3226 422 | 3219 423 | 2749 424 | 3183 425 | 2906 426 | 2360 427 | 2440 428 | 2946 429 | 2313 430 | 2859 431 | 2340 432 | 3008 433 | 2719 434 | 3058 435 | 2653 436 | 3023 437 | 2888 438 | 3243 439 | 2913 440 | 3242 441 | 3067 442 | 2409 443 | 3227 444 | 2380 445 | 2353 446 | 2686 447 | 2971 448 | 2847 449 | 2947 450 | 2857 451 | 3263 452 | 3218 453 | 2861 454 | 3323 455 | 2635 456 | 2966 457 | 2604 458 | 2456 459 | 2832 460 | 2694 461 | 3245 462 | 3119 463 | 2942 464 | 3153 465 | 2894 466 | 2555 467 | 3128 468 | 2703 469 | 2323 470 | 2631 471 | 2732 472 | 2699 473 | 2314 474 | 2590 475 | 3127 476 | 2891 477 | 2873 478 | 2814 479 | 2326 480 | 3026 481 | 3288 482 | 3095 483 | 2706 484 | 2457 485 | 2377 486 | 2620 487 | 2526 488 | 2674 489 | 3190 490 | 2923 491 | 3032 492 | 2334 493 | 3254 494 | 2991 495 | 3277 496 | 2973 497 | 2599 498 | 2658 499 | 2636 500 | 2826 501 | 3148 502 | 2958 503 | 3258 504 | 2990 505 | 3180 506 | 2538 507 | 2748 508 | 2625 509 | 2565 510 | 3011 511 | 3057 512 | 2354 513 | 3158 514 | 2622 515 | 3308 516 | 2983 517 | 2560 518 | 3169 519 | 3059 520 | 2480 521 | 3194 522 | 3291 523 | 3216 524 | 2643 525 | 3172 526 | 2352 527 | 2724 528 | 2485 529 | 2411 530 | 2948 531 | 2445 532 | 2362 533 | 2668 534 | 3275 535 | 3107 536 | 2496 537 | 2529 538 | 2700 539 | 2541 540 | 3028 541 | 2879 542 | 2660 543 | 3324 544 | 2755 545 | 2436 546 | 3048 547 | 2623 548 | 2920 549 | 3040 550 | 2568 551 | 3221 552 | 3003 553 | 3295 554 | 2473 555 | 3232 556 | 3213 557 | 2823 558 | 2897 559 | 2573 560 | 2645 561 | 3018 562 | 3326 563 | 2795 564 | 2915 565 | 3109 566 | 3086 567 | 2463 568 | 3118 569 | 2671 570 | 2909 571 | 2393 572 | 2325 573 | 3029 574 | 2972 575 | 3110 576 | 2870 577 | 3284 578 | 2816 579 | 2647 580 | 2667 581 | 2955 582 | 2333 583 | 2960 584 | 2864 585 | 2893 586 | 2458 587 | 2441 588 | 2359 589 | 2327 590 | 3256 591 | 3099 592 | 3073 593 | 3138 594 | 2511 595 | 2666 596 | 2548 597 | 2364 598 | 2451 599 | 2911 600 | 3237 601 | 3206 602 | 3080 603 | 3279 604 | 2934 605 | 2981 606 | 2878 607 | 3130 608 | 2830 609 | 3091 610 | 2659 611 | 2449 612 | 3152 613 | 2413 614 | 2722 615 | 2796 616 | 3220 617 | 2751 618 | 2935 619 | 3238 620 | 2491 621 | 2730 622 | 2842 623 | 3223 624 | 2492 625 | 3074 626 | 3094 627 | 2833 628 | 2521 629 | 2883 630 | 3315 631 | 2845 632 | 2907 633 | 3083 634 | 2572 635 | 3092 636 | 2903 637 | 2918 638 | 3039 639 | 3286 640 | 2587 641 | 3068 642 | 2338 643 | 3166 644 | 3134 645 | 2455 646 | 2497 647 | 2992 648 | 2775 649 | 2681 650 | 2430 651 | 2932 652 | 2931 653 | 2434 654 | 3154 655 | 3046 656 | 2598 657 | 2366 658 | 3015 659 | 3147 660 | 2944 661 | 2582 662 | 3274 663 | 2987 664 | 2642 665 | 2547 666 | 2420 667 | 2930 668 | 2750 669 | 2417 670 | 2808 671 | 3141 672 | 2997 673 | 2995 674 | 2584 675 | 2312 676 | 3033 677 | 3070 678 | 3065 679 | 2509 680 | 3314 681 | 2396 682 | 2543 683 | 2423 684 | 3170 685 | 2389 686 | 3289 687 | 2728 688 | 2540 689 | 2437 690 | 2486 691 | 2895 692 | 3017 693 | 2853 694 | 2406 695 | 2346 696 | 2877 697 | 2472 698 | 3210 699 | 2637 700 | 2927 701 | 2789 702 | 2330 703 | 3088 704 | 3102 705 | 2616 706 | 3081 707 | 2902 708 | 3205 709 | 3320 710 | 3165 711 | 2984 712 | 3185 713 | 2707 714 | 
3255 715 | 2583 716 | 2773 717 | 2742 718 | 3024 719 | 2402 720 | 2718 721 | 2882 722 | 2575 723 | 3281 724 | 2786 725 | 2855 726 | 3014 727 | 2401 728 | 2535 729 | 2687 730 | 2495 731 | 3113 732 | 2609 733 | 2559 734 | 2665 735 | 2530 736 | 3293 737 | 2399 738 | 2605 739 | 2690 740 | 3133 741 | 2799 742 | 2533 743 | 2695 744 | 2713 745 | 2886 746 | 2691 747 | 2549 748 | 3077 749 | 3002 750 | 3049 751 | 3051 752 | 3087 753 | 2444 754 | 3085 755 | 3135 756 | 2702 757 | 3211 758 | 3108 759 | 2501 760 | 2769 761 | 3290 762 | 2465 763 | 3025 764 | 3019 765 | 2385 766 | 2940 767 | 2657 768 | 2610 769 | 2525 770 | 2941 771 | 3078 772 | 2341 773 | 2916 774 | 2956 775 | 2375 776 | 2880 777 | 3009 778 | 2780 779 | 2370 780 | 2925 781 | 2332 782 | 3146 783 | 2315 784 | 2809 785 | 3145 786 | 3106 787 | 2782 788 | 2760 789 | 2493 790 | 2765 791 | 2556 792 | 2890 793 | 2400 794 | 2339 795 | 3201 796 | 2818 797 | 3248 798 | 3280 799 | 2570 800 | 2569 801 | 2937 802 | 3174 803 | 2836 804 | 2708 805 | 2820 806 | 3195 807 | 2617 808 | 3197 809 | 2319 810 | 2744 811 | 2615 812 | 2825 813 | 2603 814 | 2914 815 | 2531 816 | 3193 817 | 2624 818 | 2365 819 | 2810 820 | 3239 821 | 3159 822 | 2537 823 | 2844 824 | 2758 825 | 2938 826 | 3037 827 | 2503 828 | 3297 829 | 2885 830 | 2608 831 | 2494 832 | 2712 833 | 2408 834 | 2901 835 | 2704 836 | 2536 837 | 2373 838 | 2478 839 | 2723 840 | 3076 841 | 2627 842 | 2369 843 | 2669 844 | 3006 845 | 2628 846 | 2788 847 | 3276 848 | 2435 849 | 3139 850 | 3235 851 | 2527 852 | 2571 853 | 2815 854 | 2442 855 | 2892 856 | 2978 857 | 2746 858 | 3150 859 | 2574 860 | 2725 861 | 3188 862 | 2601 863 | 2378 864 | 3075 865 | 2632 866 | 2794 867 | 3270 868 | 3071 869 | 2506 870 | 3126 871 | 3236 872 | 3257 873 | 2824 874 | 2989 875 | 2950 876 | 2428 877 | 2405 878 | 3156 879 | 2447 880 | 2787 881 | 2805 882 | 2720 883 | 2403 884 | 2811 885 | 2329 886 | 2474 887 | 2785 888 | 2350 889 | 2507 890 | 2416 891 | 3112 892 | 2475 893 | 2876 894 | 2585 895 | 2487 896 | 3072 897 | 3082 898 | 2943 899 | 2757 900 | 2388 901 | 2600 902 | 3294 903 | 2756 904 | 3142 905 | 3041 906 | 2594 907 | 2998 908 | 3047 909 | 2379 910 | 2980 911 | 2454 912 | 2862 913 | 3175 914 | 2588 915 | 3031 916 | 3012 917 | 2889 918 | 2500 919 | 2791 920 | 2854 921 | 2619 922 | 2395 923 | 2807 924 | 2740 925 | 2412 926 | 3131 927 | 3013 928 | 2939 929 | 2651 930 | 2490 931 | 2988 932 | 2863 933 | 3225 934 | 2745 935 | 2714 936 | 3160 937 | 3124 938 | 2849 939 | 2676 940 | 2872 941 | 3287 942 | 3189 943 | 2716 944 | 3115 945 | 2928 946 | 2871 947 | 2591 948 | 2717 949 | 2546 950 | 2777 951 | 3298 952 | 2397 953 | 3187 954 | 2726 955 | 2336 956 | 3268 957 | 2477 958 | 2904 959 | 2846 960 | 3121 961 | 2899 962 | 2510 963 | 2806 964 | 2963 965 | 3313 966 | 2679 967 | 3302 968 | 2663 969 | 3053 970 | 2469 971 | 2999 972 | 3311 973 | 2470 974 | 2638 975 | 3120 976 | 3171 977 | 2689 978 | 2922 979 | 2607 980 | 2721 981 | 2993 982 | 2887 983 | 2837 984 | 2929 985 | 2829 986 | 3234 987 | 2649 988 | 2337 989 | 2759 990 | 2778 991 | 2771 992 | 2404 993 | 2589 994 | 3123 995 | 3209 996 | 2729 997 | 3252 998 | 2606 999 | 2579 1000 | 2552 1001 | --------------------------------------------------------------------------------