├── AccVSNum_degrees_layers ├── ACCvsNum_layers.m ├── AccVsNum_degrees.m ├── README.md ├── data_degree.mat ├── data_five_layers.mat ├── data_four_layers.mat ├── data_one_layer.mat ├── data_three_layers.mat └── data_two_layers.mat ├── Collaborative_Reasoning ├── Collaborative_Reasoning_NumDevice.ipynb ├── Collaborative_Reasoning_p.ipynb ├── README.md ├── data │ ├── ind.citeseer.allx │ ├── ind.citeseer.ally │ ├── ind.citeseer.graph │ ├── ind.citeseer.test.index │ ├── ind.citeseer.tx │ ├── ind.citeseer.ty │ ├── ind.citeseer.x │ ├── ind.citeseer.y │ ├── ind.cora.allx │ ├── ind.cora.ally │ ├── ind.cora.graph │ ├── ind.cora.test.index │ ├── ind.cora.tx │ ├── ind.cora.ty │ ├── ind.cora.x │ ├── ind.cora.y │ ├── ind.pubmed.allx │ ├── ind.pubmed.ally │ ├── ind.pubmed.graph │ ├── ind.pubmed.test.index │ ├── ind.pubmed.tx │ ├── ind.pubmed.ty │ ├── ind.pubmed.x │ └── ind.pubmed.y ├── data_process.py ├── layers.py ├── models.py ├── train_func.py └── utils.py ├── README.md └── SER ├── DecodingSimulation.m ├── README.md ├── data_ent.mat └── data_tri.mat /AccVSNum_degrees_layers/ACCvsNum_layers.m: -------------------------------------------------------------------------------- 1 | clear 2 | clc 3 | % Load the data of different numbers of layers. Choose from 4 | % "data_one_layer", "data_two_layers", "data_three_layers", "data_four_layers", "data_five_layers". 5 | load("data_five_layers.mat"); 6 | 7 | data = data_layer_5_1; % Choose the data from a specific layer to see the numerical results. 8 | 9 | emb = data(:,1:10:100); % keep every 10th dimension of the pretrained embeddings 10 | n = 30000; % number of Monte Carlo trials 11 | index = 1:1:size(emb,1); 12 | random_index = index(randi(numel(index),1,n)); % embeddings to "transmit" 13 | snrs = [2,3,8,9]; % SNRs in dB 14 | acc_list = zeros(1,length(snrs)); 15 | for s = 1:1:length(snrs) 16 | err = 0; 17 | for i = 1 :1: n 18 | signal = emb(random_index(i),:); 19 | noised_emb = awgn(signal, snrs(s),'measured'); % add AWGN at the given SNR 20 | temp = emb - noised_emb; 21 | dists = zeros(1,size(emb,1)); % renamed from "mod", which shadows the built-in function 22 | for j = 1:1:size(emb,1) 23 | dists(j) = norm(temp(j,:)); 24 | end 25 | [minvalue, min_index] = min(dists); % nearest-neighbor decoding 26 | if min_index ~= random_index(i) 27 | err = err+1; 28 | end 29 | end 30 | 31 | 32 | ser = err/n; % symbol error rate 33 | acc = 1-ser; 34 | acc_list(s) = acc; 35 | end 36 | -------------------------------------------------------------------------------- /AccVSNum_degrees_layers/AccVsNum_degrees.m: -------------------------------------------------------------------------------- 1 | clear 2 | clc 3 | load("data_degree.mat"); 4 | data = data_10; % Change data blocks of different degrees to see the corresponding results. 5 | emb = data(:,1:10:100); % keep every 10th dimension of the pretrained embeddings 6 | n = 30000; % number of Monte Carlo trials 7 | index = 1:1:size(emb,1); 8 | random_index = index(randi(numel(index),1,n)); % embeddings to "transmit" 9 | snrs = [2,3,8,9]; % SNRs in dB 10 | acc_list = zeros(1,length(snrs)); 11 | for s = 1:1:length(snrs) 12 | err = 0; 13 | for i = 1 :1: n 14 | signal = emb(random_index(i),:); 15 | noised_emb = awgn(signal, snrs(s),'measured'); % add AWGN at the given SNR 16 | temp = emb - noised_emb; 17 | dists = zeros(1,size(emb,1)); % renamed from "mod", which shadows the built-in function 18 | for j = 1:1:size(emb,1) 19 | dists(j) = norm(temp(j,:)); 20 | end 21 | [minvalue, min_index] = min(dists); % nearest-neighbor decoding 22 | if min_index ~= random_index(i) 23 | err = err+1; 24 | end 25 | end 26 | acc = 1-err/n; 27 | acc_list(s) = acc; 28 | end 29 | -------------------------------------------------------------------------------- /AccVSNum_degrees_layers/README.md: -------------------------------------------------------------------------------- 1 | # The relationship between accuracy and the number of degrees/layers. 2 | This simulation shows the numerical results for accuracy versus the number of degrees/layers; a minimal sketch of the decoding loop is given below. 
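Both MATLAB scripts run the same experiment: pick a random pretrained embedding, corrupt it with AWGN at a target SNR, decode it to the nearest stored embedding, and report accuracy as one minus the symbol error rate. Below is a minimal NumPy sketch of that loop, assuming an `emb` array of embeddings (the names `decoding_accuracy`, `snr_db`, and `n_trials` are ours, not part of this repo); MATLAB's `awgn(...,'measured')` is approximated by scaling the noise variance to the measured signal power.

```python
import numpy as np

def decoding_accuracy(emb, snr_db, n_trials=30000, seed=None):
    """Nearest-neighbor decoding accuracy of AWGN-corrupted embeddings."""
    rng = np.random.default_rng(seed)
    picks = rng.integers(0, emb.shape[0], size=n_trials)  # embeddings to "transmit"
    errors = 0
    for i in picks:
        signal = emb[i]
        # noise power chosen so that signal power / noise power = 10^(snr_db/10),
        # mirroring MATLAB's awgn(signal, snr_db, 'measured')
        noise_power = np.mean(signal ** 2) / (10 ** (snr_db / 10))
        noisy = signal + rng.normal(0.0, np.sqrt(noise_power), size=signal.shape)
        # decode to the nearest stored embedding (Euclidean distance)
        nearest = int(np.argmin(np.linalg.norm(emb - noisy, axis=1)))
        errors += (nearest != i)
    return 1.0 - errors / n_trials
```

For example, `[decoding_accuracy(emb, s) for s in (2, 3, 8, 9)]` reproduces the `acc_list` computed by the scripts, up to Monte Carlo noise.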
3 | Data corresponding to different degrees/layers has been pretrained and processed. 4 | # How to use 5 | Download all the files under this folder and save them to the same path. Simply run the `AccVsNum_degrees.m` or the `ACCvsNum_layers.m` file to see the numerical results. 
 -------------------------------------------------------------------------------- /AccVSNum_degrees_layers/data_degree.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/AccVSNum_degrees_layers/data_degree.mat -------------------------------------------------------------------------------- /AccVSNum_degrees_layers/data_five_layers.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/AccVSNum_degrees_layers/data_five_layers.mat -------------------------------------------------------------------------------- /AccVSNum_degrees_layers/data_four_layers.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/AccVSNum_degrees_layers/data_four_layers.mat -------------------------------------------------------------------------------- /AccVSNum_degrees_layers/data_one_layer.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/AccVSNum_degrees_layers/data_one_layer.mat -------------------------------------------------------------------------------- /AccVSNum_degrees_layers/data_three_layers.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/AccVSNum_degrees_layers/data_three_layers.mat -------------------------------------------------------------------------------- /AccVSNum_degrees_layers/data_two_layers.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/AccVSNum_degrees_layers/data_two_layers.mat -------------------------------------------------------------------------------- /Collaborative_Reasoning/Collaborative_Reasoning_NumDevice.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "id": "d5QApBmLhOGP" 8 | }, 9 | "outputs": [], 10 | "source": [ 11 | "from __future__ import division\n", 12 | "from __future__ import print_function\n", 13 | "import time\n", 14 | "import argparse\n", 15 | "import numpy as np\n", 16 | "import torch\n", 17 | "import torch.nn.functional as F\n", 18 | "import torch.optim as optim\n", 19 | "import random\n", 20 | "import matplotlib.pyplot as plt\n", 21 | "from sklearn import metrics\n", 22 | "\n", 23 | "from utils import get_plot\n", 24 | "from models import GCN\n", 25 | "from data_process import load_data\n", 26 | "from train_func import test, Block_matrix_train, Block_matrix_train_batch" 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": null, 32 | "metadata": { 33 | "id": "9n0WIOFZPqz2" 34 | }, 35 | "outputs": [], 36 | "source": [ 37 | "def get_K_hop_neighbors(adj_matrix, index, K):\n", 38 | "    adj_matrix = adj_matrix + 
torch.eye(adj_matrix.shape[0],adj_matrix.shape[1]) #make sure the diagonal part >= 1\n", 39 | "    hop_neighbor_index=index\n", 40 | "    for i in range(K):\n", 41 | "        hop_neighbor_index=torch.unique(torch.nonzero(adj_matrix[hop_neighbor_index])[:,1]) #fixed: use the adj_matrix argument, not the global adj\n", 42 | "    return hop_neighbor_index" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": null, 48 | "metadata": { 49 | "id": "OyL-gXSZPqz2" 50 | }, 51 | "outputs": [], 52 | "source": [ 53 | "def get_K_hop_neighbors_BDS(adj_matrix, index, K):\n", 54 | "    adj_matrix = adj_matrix + torch.eye(adj_matrix.shape[0],adj_matrix.shape[1]) #make sure the diagonal part >= 1\n", 55 | "    \n", 56 | "    onehop_neighbor_index=torch.unique(torch.nonzero(adj_matrix[index])[:,1]) #fixed: use the adj_matrix argument, not the global adj\n", 57 | "    np.setdiff1d(index, onehop_neighbor_index) #note: this result is discarded\n", 58 | "    \n", 59 | "    return onehop_neighbor_index" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "metadata": {}, 66 | "outputs": [], 67 | "source": [ 68 | "import scipy.sparse as sp\n", 69 | "def normalize(mx):\n", 70 | "    \"\"\"Row-normalize sparse matrix\"\"\"\n", 71 | "    \n", 72 | "    mx = mx + torch.eye(mx.shape[0],mx.shape[1])\n", 73 | "    \n", 74 | "    rowsum = np.array(mx.sum(1))\n", 75 | "    r_inv = np.power(rowsum, -1).flatten()\n", 76 | "    r_inv[np.isinf(r_inv)] = 0.\n", 77 | "    r_mat_inv = sp.diags(r_inv)\n", 78 | "    mx = r_mat_inv.dot(mx)\n", 79 | "    return torch.tensor(mx)" 80 | ] 81 | }, 82 | { 83 | "cell_type": "markdown", 84 | "metadata": { 85 | "id": "Z5CMAPT6gafH" 86 | }, 87 | "source": [ 88 | "# Model" 89 | ] 90 | }, 91 | { 92 | "cell_type": "code", 93 | "execution_count": null, 94 | "metadata": { 95 | "id": "UMHZprmsOTdl" 96 | }, 97 | "outputs": [], 98 | "source": [ 99 | "\n", 100 | "def Collaborative_Reasoning(K, features, adj, labels, idx_train, idx_val, idx_test, iid_percent):\n", 101 | "    # K: number of models\n", 102 | "    #choose adj matrix\n", 103 | "    #GCN:n*n\n", 104 | "    #no connection between agents\n", 105 | "\n", 106 | "    #define model\n", 107 | "\n", 108 | "    global_model = GCN(nfeat=features.shape[1],\n", 109 | "            nhid=args_hidden,\n", 110 | "            nclass=labels.max().item() + 1,\n", 111 | "            dropout=args_dropout)\n", 112 | "    \n", 113 | "    \n", 114 | "    \n", 115 | "    models=[]\n", 116 | "    for i in range(K):\n", 117 | "        models.append(GCN(nfeat=features.shape[1],\n", 118 | "            nhid=args_hidden,\n", 119 | "            nclass=labels.max().item() + 1,\n", 120 | "            dropout=args_dropout))\n", 121 | "    if args_cuda:\n", 122 | "        for i in range(K):\n", 123 | "            models[i]=models[i].to(torch.device('cuda:0'))#.cuda()\n", 124 | "        global_model=global_model.to(torch.device('cuda:0'))\n", 125 | "        features = features.cuda()\n", 126 | "        adj = adj.to(torch.device('cuda:0'))\n", 127 | "        labels = labels.cuda()\n", 128 | "        idx_train = idx_train.cuda()\n", 129 | "        idx_val = idx_val.cuda()\n", 130 | "        idx_test = idx_test.cuda()\n", 131 | "    #optimizer and train\n", 132 | "    optimizers=[]\n", 133 | "    for i in range(K):\n", 134 | "        optimizers.append(optim.SGD(models[i].parameters(),\n", 135 | "                          lr=args_lr, weight_decay=args_weight_decay))\n", 136 | "    # split data into K devices\n", 137 | "    \n", 138 | "    n=len(adj)\n", 139 | "    \n", 140 | "    split_data_indexes=[]\n", 141 | "    \n", 142 | "    nclass=labels.max().item() + 1\n", 143 | "    split_data_indexes = []\n", 144 | "    non_iid_percent = 1 - float(iid_percent)\n", 145 | "    iid_indexes = [] #random assign\n", 146 | "    shuffle_labels = [] #make train data points split into different devices\n", 147 | "    for i in range(K):\n", 148 | "        current = torch.nonzero(labels == i).reshape(-1)\n", 149 | "
current = current[np.random.permutation(len(current))] #shuffle\n", 150 | "        shuffle_labels.append(current)\n", 151 | "    \n", 152 | "    average_device_of_class = K // nclass\n", 153 | "    if K % nclass != 0: #for non-iid\n", 154 | "        average_device_of_class += 1\n", 155 | "    for i in range(K): \n", 156 | "        label_i= i // average_device_of_class \n", 157 | "        labels_class = shuffle_labels[label_i]\n", 158 | "\n", 159 | "        average_num= int(len(labels_class)//average_device_of_class * non_iid_percent)\n", 160 | "        split_data_indexes.append(np.array(labels_class[average_num * (i % average_device_of_class):average_num * (i % average_device_of_class + 1)]))\n", 161 | "    \n", 162 | "    L = []\n", 163 | "    for i in split_data_indexes:\n", 164 | "        L += list(i)\n", 165 | "    L.sort()\n", 166 | "    iid_indexes = np.setdiff1d(range(len(labels)), L)\n", 167 | "    \n", 168 | "    for i in range(K): #for iid\n", 169 | "        label_i= i // average_device_of_class\n", 170 | "        labels_class = shuffle_labels[label_i]\n", 171 | "\n", 172 | "        average_num= int(len(labels_class)//average_device_of_class * (1 - non_iid_percent))\n", 173 | "        split_data_indexes[i] = list(split_data_indexes[i]) + list(iid_indexes[:average_num])\n", 174 | "    \n", 175 | "        iid_indexes = iid_indexes[average_num:]\n", 176 | "    \n", 177 | "    \n", 178 | "    #get train indexes in each device, only part of nodes in each device have labels in the train process\n", 179 | "    split_train_ids = []\n", 180 | "    for i in range(K):\n", 181 | "        split_data_indexes[i].sort()\n", 182 | "        inter = np.intersect1d(split_data_indexes[i], idx_train)\n", 183 | "        \n", 184 | "        split_train_ids.append(np.searchsorted(split_data_indexes[i], inter)) #local id in block matrix\n", 185 | "    \n", 186 | "    \n", 187 | "    \n", 188 | "    #assign global model weights to local models at initial step\n", 189 | "    for i in range(K):\n", 190 | "        models[i].load_state_dict(global_model.state_dict())\n", 191 | "    \n", 192 | "    \n", 193 | "    #start training\n", 194 | "    for t in range(iterations):\n", 195 | "        acc_trains=[]\n", 196 | "        for i in range(K):\n", 197 | "            for epoch in range(args_epochs):\n", 198 | "                if len(split_train_ids[i]) == 0:\n", 199 | "                    continue\n", 200 | "                acc_train=Block_matrix_train(epoch, models[i], optimizers[i], features, adj, labels,\n", 201 | "                          split_data_indexes[i], split_train_ids[i])\n", 202 | "            \n", 203 | "            acc_trains.append(acc_train)\n", 204 | "            #print(model.Lambda)\n", 205 | "        states=[]\n", 206 | "        global_state=dict()\n", 207 | "        for i in range(K):\n", 208 | "            states.append(models[i].state_dict())\n", 209 | "        # Average all parameters, weighting each device by its number of labeled training nodes\n", 210 | "        \n", 211 | "        \n", 212 | "        for key in global_model.state_dict():\n", 213 | "            global_state[key] = split_train_ids[0].shape[0] * states[0][key]\n", 214 | "            count_D=split_train_ids[0].shape[0]\n", 215 | "            for i in range(1,K):\n", 216 | "                global_state[key] += split_train_ids[i].shape[0] * states[i][key]\n", 217 | "                count_D += split_train_ids[i].shape[0]\n", 218 | "            global_state[key] /= count_D\n", 219 | "            \n", 220 | "\n", 221 | "        global_model.load_state_dict(global_state)\n", 222 | "        \n", 223 | "        \n", 224 | "        loss_train, acc_train = test(global_model, features, adj, labels, idx_train)\n", 225 | "        #print(t,'\\t',\"train\",'\\t',loss_train,'\\t',acc_train)\n", 226 | "        \n", 227 | "        loss_val, acc_val = test(global_model, features, adj, labels, idx_val) #validation\n", 228 | "        #print(t,'\\t',\"val\",'\\t',loss_val,'\\t',acc_val)\n", 229 | "        \n", 230 | "\n", 231 | "        a = 
open(mode+'_'+dataset_name+'_IID_'+str(iid_percent)+'_Collaborative_Reasoning_iter_'+str(iterations)+'_epoch_'+str(args_epochs)+'_device_num_'+str(K),'a+')\n", 232 | "        a.write(str(t)+'\\t'+\"train\"+'\\t'+str(loss_train)+'\\t'+str(acc_train)+'\\n')\n", 233 | "        a.write(str(t)+'\\t'+\"val\"+'\\t'+str(loss_val)+'\\t'+str(acc_val)+'\\n')\n", 234 | "        a.close()\n", 235 | "        for i in range(K):\n", 236 | "            models[i].load_state_dict(global_state)\n", 237 | "        #test \n", 238 | "        loss_test, acc_test= test(global_model, features, adj, labels, idx_test)\n", 239 | "        #print(t,'\\t',\"test\",'\\t',loss_test,'\\t',acc_test)\n", 240 | "        a = open(mode+'_'+dataset_name+'_IID_'+str(iid_percent)+'_Collaborative_Reasoning_iter_'+str(iterations)+'_epoch_'+str(args_epochs)+'_device_num_'+str(K),'a+')\n", 241 | "        a.write(str(t)+'\\t'+\"test\"+'\\t'+str(loss_test)+'\\t'+str(acc_test)+'\\n')\n", 242 | "        a.close()\n", 243 | "        #print(\"save file as\",mode+'_'+dataset_name+'_IID_'+str(iid_percent)+'_Collaborative_Reasoning_iter_'+str(iterations)+'_epoch_'+str(args_epochs)+'_device_num_'+str(K))\n", 244 | "\n", 245 | "\n", 246 | "    return loss_test, acc_test\n", 247 | "\n" 248 | ] 249 | }, 250 | { 251 | "cell_type": "code", 252 | "execution_count": null, 253 | "metadata": { 254 | "id": "ubM8c3SqXwA3", 255 | "scrolled": true 256 | }, 257 | "outputs": [], 258 | "source": [ 259 | "np.random.seed(42)\n", 260 | "torch.manual_seed(42)\n", 261 | "mode=\"real\"\n", 262 | "dataset_name='cora'\n", 263 | "features, adj, labels, idx_train, idx_val, idx_test = load_data(dataset_name)\n", 264 | "class_num = labels.max().item() + 1\n", 265 | "\n", 266 | "\n" 267 | ] 268 | }, 269 | { 270 | "cell_type": "code", 271 | "execution_count": null, 272 | "metadata": { 273 | "colab": { 274 | "base_uri": "https://localhost:8080/" 275 | }, 276 | "id": "3jDXPAhGUu16", 277 | "outputId": "2bbdd38b-f48d-4338-dd07-41bcce391781", 278 | "scrolled": true 279 | }, 280 | "outputs": [], 281 | "source": [ 282 | "# for a fixed seed, rerun both the data and the model cells\n", 283 | "\n", 284 | "args_normalize = True\n", 285 | "\n", 286 | "model_type = 'GCN' #GCN\n", 287 | "args_hidden = 16\n", 288 | "args_dropout = 0.5\n", 289 | "args_lr = 0.5\n", 290 | "args_weight_decay = 5e-4 #L2 penalty\n", 291 | "args_epochs = 3\n", 292 | "args_no_cuda = False\n", 293 | "args_cuda = not args_no_cuda and torch.cuda.is_available()\n", 294 | "\n", 295 | "args_device_num = class_num #split data into args_device_num parts\n", 296 | "#iterations = 100\n", 297 | "\n", 298 | "\n", 299 | "\n", 300 | "if args_normalize==True: \n", 301 | "    adj = normalize(adj)\n", 302 | "    '''\n", 303 | "    adj = adj + torch.eye(adj.shape[0],adj.shape[1])\n", 304 | "    d=torch.sum(adj,axis=1)\n", 305 | "    D_minus_one_over_2=torch.zeros(adj.shape[0],adj.shape[0])\n", 306 | "    D_minus_one_over_2[range(len(D_minus_one_over_2)), range(len(D_minus_one_over_2))] = d**(-0.5)\n", 307 | "    adj = torch.mm(torch.mm(D_minus_one_over_2,adj),D_minus_one_over_2)\n", 308 | "    '''\n", 309 | "    \n", 310 | "\n", 311 | "\n" 312 | ] 313 | }, 314 | { 315 | "cell_type": "code", 316 | "execution_count": null, 317 | "metadata": {}, 318 | "outputs": [], 319 | "source": [ 320 | "for args_epochs in [3]:\n", 321 | "    for args_random_assign in [0.0, 0.5, 1]:\n", 322 | "        for args_device_num in [2,3,4,5,6]:\n", 323 | "            for iterations in [10,20,40,80,100,200,300,400,500,600]:\n", 324 | "            #    for i in range(3):\n", 325 | "                Collaborative_Reasoning(args_device_num, features, adj, labels, idx_train, idx_val, idx_test, args_random_assign)" 326 | ] 327 | } 
328 | ], 329 | "metadata": { 330 | "accelerator": "GPU", 331 | "colab": { 332 | "collapsed_sections": [], 333 | "machine_shape": "hm", 334 | "name": "main.ipynb", 335 | "provenance": [], 336 | "toc_visible": true 337 | }, 338 | "kernelspec": { 339 | "display_name": "Python 3.9.7 ('base')", 340 | "language": "python", 341 | "name": "python3" 342 | }, 343 | "language_info": { 344 | "codemirror_mode": { 345 | "name": "ipython", 346 | "version": 3 347 | }, 348 | "file_extension": ".py", 349 | "mimetype": "text/x-python", 350 | "name": "python", 351 | "nbconvert_exporter": "python", 352 | "pygments_lexer": "ipython3", 353 | "version": "3.9.7" 354 | }, 355 | "vscode": { 356 | "interpreter": { 357 | "hash": "5a410216a586027e41e6693bdeb1563026181c5d1fabea1354baf3177ace6ae8" 358 | } 359 | } 360 | }, 361 | "nbformat": 4, 362 | "nbformat_minor": 1 363 | } 364 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/Collaborative_Reasoning_p.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "id": "d5QApBmLhOGP" 8 | }, 9 | "outputs": [], 10 | "source": [ 11 | "from __future__ import division\n", 12 | "from __future__ import print_function\n", 13 | "import time\n", 14 | "import argparse\n", 15 | "import numpy as np\n", 16 | "import torch\n", 17 | "import torch.nn.functional as F\n", 18 | "import torch.optim as optim\n", 19 | "import random\n", 20 | "import matplotlib.pyplot as plt\n", 21 | "from sklearn import metrics\n", 22 | "\n", 23 | "from utils import get_plot\n", 24 | "from models import GCN\n", 25 | "from data_process import load_data\n", 26 | "from train_func import test, Block_matrix_train, Block_matrix_train_batch" 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": null, 32 | "metadata": { 33 | "id": "9n0WIOFZPqz2" 34 | }, 35 | "outputs": [], 36 | "source": [ 37 | "def get_K_hop_neighbors(adj_matrix, index, K):\n", 38 | "    adj_matrix = adj_matrix + torch.eye(adj_matrix.shape[0],adj_matrix.shape[1]) #make sure the diagonal part >= 1\n", 39 | "    hop_neighbor_index=index\n", 40 | "    for i in range(K):\n", 41 | "        hop_neighbor_index=torch.unique(torch.nonzero(adj_matrix[hop_neighbor_index])[:,1]) #fixed: use the adj_matrix argument, not the global adj\n", 42 | "    return hop_neighbor_index" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": null, 48 | "metadata": { 49 | "id": "OyL-gXSZPqz2" 50 | }, 51 | "outputs": [], 52 | "source": [ 53 | "def get_K_hop_neighbors_BDS(adj_matrix, index, K):\n", 54 | "    adj_matrix = adj_matrix + torch.eye(adj_matrix.shape[0],adj_matrix.shape[1]) #make sure the diagonal part >= 1\n", 55 | "    \n", 56 | "    onehop_neighbor_index=torch.unique(torch.nonzero(adj_matrix[index])[:,1]) #fixed: use the adj_matrix argument, not the global adj\n", 57 | "    np.setdiff1d(index, onehop_neighbor_index) #note: this result is discarded\n", 58 | "    \n", 59 | "    return onehop_neighbor_index" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "metadata": {}, 66 | "outputs": [], 67 | "source": [ 68 | "import scipy.sparse as sp\n", 69 | "def normalize(mx):\n", 70 | "    \"\"\"Row-normalize sparse matrix\"\"\"\n", 71 | "    \n", 72 | "    mx = mx + torch.eye(mx.shape[0],mx.shape[1])\n", 73 | "    \n", 74 | "    rowsum = np.array(mx.sum(1))\n", 75 | "    r_inv = np.power(rowsum, -1).flatten()\n", 76 | "    r_inv[np.isinf(r_inv)] = 0.\n", 77 | "    r_mat_inv = sp.diags(r_inv)\n", 78 | "    mx = r_mat_inv.dot(mx)\n", 79 | "    return torch.tensor(mx)" 80 | ] 81 | }, 82 | { 83 | "cell_type": "markdown", 84 | "metadata": { 
85 | "id": "Z5CMAPT6gafH" 86 | }, 87 | "source": [ 88 | "# Model" 89 | ] 90 | }, 91 | { 92 | "cell_type": "code", 93 | "execution_count": null, 94 | "metadata": { 95 | "id": "UMHZprmsOTdl" 96 | }, 97 | "outputs": [], 98 | "source": [ 99 | "\n", 100 | "def Collaborative_Reasoning(K, features, adj, labels, idx_train, idx_val, idx_test, iid_percent):\n", 101 | " # K: number of models\n", 102 | " #choose adj matrix\n", 103 | " #GCN:n*n\n", 104 | " #no connection between agents\n", 105 | "\n", 106 | " #define model\n", 107 | "\n", 108 | " global_model = GCN(nfeat=features.shape[1],\n", 109 | " nhid=args_hidden,\n", 110 | " nclass=labels.max().item() + 1,\n", 111 | " dropout=args_dropout)\n", 112 | " \n", 113 | " \n", 114 | " \n", 115 | " models=[]\n", 116 | " for i in range(K):\n", 117 | " models.append(GCN(nfeat=features.shape[1],\n", 118 | " nhid=args_hidden,\n", 119 | " nclass=labels.max().item() + 1,\n", 120 | " dropout=args_dropout))\n", 121 | " if args_cuda:\n", 122 | " for i in range(K):\n", 123 | " models[i]=models[i].to(torch.device('cuda:0'))#.cuda()\n", 124 | " global_model=global_model.to(torch.device('cuda:0'))\n", 125 | " features = features.cuda()\n", 126 | " adj = adj.to(torch.device('cuda:0'))\n", 127 | " labels = labels.cuda()\n", 128 | " idx_train = idx_train.cuda()\n", 129 | " idx_val = idx_val.cuda()\n", 130 | " idx_test = idx_test.cuda()\n", 131 | " #optimizer and train\n", 132 | " optimizers=[]\n", 133 | " for i in range(K):\n", 134 | " optimizers.append(optim.SGD(models[i].parameters(),\n", 135 | " lr=args_lr, weight_decay=args_weight_decay))\n", 136 | " # split data into K devices\n", 137 | " \n", 138 | " n=len(adj)\n", 139 | " \n", 140 | " split_data_indexes=[]\n", 141 | " \n", 142 | " nclass=labels.max().item() + 1\n", 143 | " split_data_indexes = []\n", 144 | " non_iid_percent = 1 - float(iid_percent)\n", 145 | " iid_indexes = [] #random assign\n", 146 | " shuffle_labels = [] #make train data points split into different devices\n", 147 | " for i in range(K):\n", 148 | " current = torch.nonzero(labels == i).reshape(-1)\n", 149 | " current = current[np.random.permutation(len(current))] #shuffle\n", 150 | " shuffle_labels.append(current)\n", 151 | " \n", 152 | " average_device_of_class = K // nclass\n", 153 | " if K % nclass != 0: #for non-iid\n", 154 | " average_device_of_class += 1\n", 155 | " for i in range(K): \n", 156 | " label_i= i // average_device_of_class \n", 157 | " labels_class = shuffle_labels[label_i]\n", 158 | "\n", 159 | " average_num= int(len(labels_class)//average_device_of_class * non_iid_percent)\n", 160 | " split_data_indexes.append(np.array(labels_class[average_num * (i % average_device_of_class):average_num * (i % average_device_of_class + 1)]))\n", 161 | " \n", 162 | " L = []\n", 163 | " for i in split_data_indexes:\n", 164 | " L += list(i)\n", 165 | " L.sort()\n", 166 | " iid_indexes = np.setdiff1d(range(len(labels)), L)\n", 167 | " \n", 168 | " for i in range(K): #for iid\n", 169 | " label_i= i // average_device_of_class\n", 170 | " labels_class = shuffle_labels[label_i]\n", 171 | "\n", 172 | " average_num= int(len(labels_class)//average_device_of_class * (1 - non_iid_percent))\n", 173 | " split_data_indexes[i] = list(split_data_indexes[i]) + list(iid_indexes[:average_num])\n", 174 | " \n", 175 | " iid_indexes = iid_indexes[average_num:]\n", 176 | " \n", 177 | " \n", 178 | " #get train indexes in each device, only part of nodes in each device have labels in the train process\n", 179 | " split_train_ids = []\n", 180 | " for i in range(K):\n", 
181 | "        split_data_indexes[i].sort()\n", 182 | "        inter = np.intersect1d(split_data_indexes[i], idx_train)\n", 183 | "        \n", 184 | "        split_train_ids.append(np.searchsorted(split_data_indexes[i], inter)) #local id in block matrix\n", 185 | "    \n", 186 | "    \n", 187 | "    \n", 188 | "    #assign global model weights to local models at initial step\n", 189 | "    for i in range(K):\n", 190 | "        models[i].load_state_dict(global_model.state_dict())\n", 191 | "    \n", 192 | "    \n", 193 | "    #start training\n", 194 | "    for t in range(iterations):\n", 195 | "        acc_trains=[]\n", 196 | "        for i in range(K):\n", 197 | "            for epoch in range(args_epochs):\n", 198 | "                if len(split_train_ids[i]) == 0:\n", 199 | "                    continue\n", 200 | "                acc_train=Block_matrix_train(epoch, models[i], optimizers[i], features, adj, labels,\n", 201 | "                          split_data_indexes[i], split_train_ids[i])\n", 202 | "            \n", 203 | "            acc_trains.append(acc_train)\n", 204 | "            #print(model.Lambda)\n", 205 | "        states=[]\n", 206 | "        global_state=dict()\n", 207 | "        for i in range(K):\n", 208 | "            states.append(models[i].state_dict())\n", 209 | "        # Average all parameters, weighting each device by its number of labeled training nodes\n", 210 | "        \n", 211 | "        \n", 212 | "        for key in global_model.state_dict():\n", 213 | "            global_state[key] = split_train_ids[0].shape[0] * states[0][key]\n", 214 | "            count_D=split_train_ids[0].shape[0]\n", 215 | "            for i in range(1,K):\n", 216 | "                global_state[key] += split_train_ids[i].shape[0] * states[i][key]\n", 217 | "                count_D += split_train_ids[i].shape[0]\n", 218 | "            global_state[key] /= count_D\n", 219 | "            \n", 220 | "\n", 221 | "        global_model.load_state_dict(global_state)\n", 222 | "        \n", 223 | "        \n", 224 | "        loss_train, acc_train = test(global_model, features, adj, labels, idx_train)\n", 225 | "        #print(t,'\\t',\"train\",'\\t',loss_train,'\\t',acc_train)\n", 226 | "        \n", 227 | "        loss_val, acc_val = test(global_model, features, adj, labels, idx_val) #validation\n", 228 | "        #print(t,'\\t',\"val\",'\\t',loss_val,'\\t',acc_val)\n", 229 | "        \n", 230 | "\n", 231 | "        a = open(mode+'_'+dataset_name+'_IID_'+str(iid_percent)+'_Collaborative_Reasoning_iter_'+str(iterations)+'_epoch_'+str(args_epochs)+'_device_num_'+str(K),'a+')\n", 232 | "        a.write(str(t)+'\\t'+\"train\"+'\\t'+str(loss_train)+'\\t'+str(acc_train)+'\\n')\n", 233 | "        a.write(str(t)+'\\t'+\"val\"+'\\t'+str(loss_val)+'\\t'+str(acc_val)+'\\n')\n", 234 | "        a.close()\n", 235 | "        for i in range(K):\n", 236 | "            models[i].load_state_dict(global_state)\n", 237 | "        #test \n", 238 | "        loss_test, acc_test= test(global_model, features, adj, labels, idx_test)\n", 239 | "        #print(t,'\\t',\"test\",'\\t',loss_test,'\\t',acc_test)\n", 240 | "        a = open(mode+'_'+dataset_name+'_IID_'+str(iid_percent)+'_Collaborative_Reasoning_iter_'+str(iterations)+'_epoch_'+str(args_epochs)+'_device_num_'+str(K),'a+')\n", 241 | "        a.write(str(t)+'\\t'+\"test\"+'\\t'+str(loss_test)+'\\t'+str(acc_test)+'\\n')\n", 242 | "        a.close()\n", 243 | "        #print(\"save file as\",mode+'_'+dataset_name+'_IID_'+str(iid_percent)+'_Collaborative_Reasoning_iter_'+str(iterations)+'_epoch_'+str(args_epochs)+'_device_num_'+str(K))\n", 244 | "\n", 245 | "\n", 246 | "    return loss_test, acc_test\n", 247 | "\n" 248 | ] 249 | }, 250 | { 251 | "cell_type": "code", 252 | "execution_count": null, 253 | "metadata": { 254 | "id": "ubM8c3SqXwA3", 255 | "scrolled": true 256 | }, 257 | "outputs": [], 258 | "source": [ 259 | "np.random.seed(42)\n", 260 | "torch.manual_seed(42)\n", 261 | "mode=\"real\"\n", 262 | "dataset_name='cora'\n", 263 | "features, adj, labels, idx_train, idx_val, idx_test = 
load_data(dataset_name)\n", 264 | "class_num = labels.max().item() + 1\n", 265 | "\n", 266 | "\n" 267 | ] 268 | }, 269 | { 270 | "cell_type": "code", 271 | "execution_count": null, 272 | "metadata": { 273 | "colab": { 274 | "base_uri": "https://localhost:8080/" 275 | }, 276 | "id": "3jDXPAhGUu16", 277 | "outputId": "2bbdd38b-f48d-4338-dd07-41bcce391781", 278 | "scrolled": true 279 | }, 280 | "outputs": [], 281 | "source": [ 282 | "# for a fixed seed, rerun both the data and the model cells\n", 283 | "\n", 284 | "args_normalize = True\n", 285 | "\n", 286 | "model_type = 'GCN' #GCN\n", 287 | "args_hidden = 16\n", 288 | "args_dropout = 0.5\n", 289 | "args_lr = 0.5\n", 290 | "args_weight_decay = 5e-4 #L2 penalty\n", 291 | "args_epochs = 3\n", 292 | "args_no_cuda = False\n", 293 | "args_cuda = not args_no_cuda and torch.cuda.is_available()\n", 294 | "\n", 295 | "args_device_num = class_num #split data into args_device_num parts\n", 296 | "#iterations = 100\n", 297 | "\n", 298 | "\n", 299 | "\n", 300 | "if args_normalize==True: \n", 301 | "    adj = normalize(adj)\n", 302 | "    '''\n", 303 | "    adj = adj + torch.eye(adj.shape[0],adj.shape[1])\n", 304 | "    d=torch.sum(adj,axis=1)\n", 305 | "    D_minus_one_over_2=torch.zeros(adj.shape[0],adj.shape[0])\n", 306 | "    D_minus_one_over_2[range(len(D_minus_one_over_2)), range(len(D_minus_one_over_2))] = d**(-0.5)\n", 307 | "    adj = torch.mm(torch.mm(D_minus_one_over_2,adj),D_minus_one_over_2)\n", 308 | "    '''\n", 309 | "    \n", 310 | "\n", 311 | "\n" 312 | ] 313 | }, 314 | { 315 | "cell_type": "code", 316 | "execution_count": null, 317 | "metadata": {}, 318 | "outputs": [], 319 | "source": [ 320 | "for args_epochs in [3]:\n", 321 | "    for args_random_assign in [0.0,0.2,0.4,0.6,0.8]:\n", 322 | "        for iterations in [10,20,40,80,100,200,300,400,500,600]:\n", 323 | "        #    for i in range(3):\n", 324 | "            Collaborative_Reasoning(args_device_num, features, adj, labels, idx_train, idx_val, idx_test, args_random_assign)" 325 | ] 326 | } 327 | ], 328 | "metadata": { 329 | "accelerator": "GPU", 330 | "colab": { 331 | "collapsed_sections": [], 332 | "machine_shape": "hm", 333 | "name": "main.ipynb", 334 | "provenance": [], 335 | "toc_visible": true 336 | }, 337 | "kernelspec": { 338 | "display_name": "Python 3.9.7 ('base')", 339 | "language": "python", 340 | "name": "python3" 341 | }, 342 | "language_info": { 343 | "codemirror_mode": { 344 | "name": "ipython", 345 | "version": 3 346 | }, 347 | "file_extension": ".py", 348 | "mimetype": "text/x-python", 349 | "name": "python", 350 | "nbconvert_exporter": "python", 351 | "pygments_lexer": "ipython3", 352 | "version": "3.9.7" 353 | }, 354 | "vscode": { 355 | "interpreter": { 356 | "hash": "5a410216a586027e41e6693bdeb1563026181c5d1fabea1354baf3177ace6ae8" 357 | } 358 | } 359 | }, 360 | "nbformat": 4, 361 | "nbformat_minor": 1 362 | } 363 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/README.md: -------------------------------------------------------------------------------- 1 | # Collaborative Reasoning 2 | 3 | The implementation of collaborative reasoning is based on the code released at https://github.com/yh-yao/FedGCN. 4 | 5 | ## Data 6 | 7 | Two datasets were used in the paper: 8 | 9 | - Cora 10 | - Citeseer 11 | 12 | The Pubmed dataset is also included, but the results are not presented in the paper due to the page limit. The aggregation step shared by both notebooks is sketched below. 
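Both notebooks implement the same federated training loop: every device runs a few local epochs of `Block_matrix_train` on its own partition of the graph, and the server then replaces every local model with a weighted average of the local parameters, where each device's weight is its number of labeled training nodes (the `global_state` loop in the notebooks). A condensed sketch of just that aggregation step, with a helper name (`weighted_average_state_dicts`) that is ours rather than part of this repo:

```python
def weighted_average_state_dicts(states, weights):
    """FedAvg-style aggregation of PyTorch state_dicts: average each
    parameter across devices, weighting each device's contribution
    (here: by its number of labeled training nodes)."""
    total = float(sum(weights))
    return {key: sum(w * s[key] for w, s in zip(weights, states)) / total
            for key in states[0]}

# Usage mirroring the notebooks:
#   global_state = weighted_average_state_dicts(
#       [m.state_dict() for m in models],
#       [ids.shape[0] for ids in split_train_ids])
#   global_model.load_state_dict(global_state)
```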
13 | 14 | ## Requirements 15 | * Python 3 16 | * PyTorch 17 | * networkx 18 | * numpy 19 | 20 | ## How to Use 21 | * Run the `Collaborative_Reasoning_p.ipynb` file to see the numerical results for different values of $p$. 22 | * Run the `Collaborative_Reasoning_NumDevice.ipynb` file to see the numerical results for different numbers of devices. 23 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.citeseer.allx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.citeseer.allx -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.citeseer.ally: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.citeseer.ally -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.citeseer.graph: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.citeseer.graph -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.citeseer.test.index: -------------------------------------------------------------------------------- 1 | 2488 2 | 2644 3 | 3261 4 | 2804 5 | 3176 6 | 2432 7 | 3310 8 | 2410 9 | 2812 10 | 2520 11 | 2994 12 | 3282 13 | 2680 14 | 2848 15 | 2670 16 | 3005 17 | 2977 18 | 2592 19 | 2967 20 | 2461 21 | 3184 22 | 2852 23 | 2768 24 | 2905 25 | 2851 26 | 3129 27 | 3164 28 | 2438 29 | 2793 30 | 2763 31 | 2528 32 | 2954 33 | 2347 34 | 2640 35 | 3265 36 | 2874 37 | 2446 38 | 2856 39 | 3149 40 | 2374 41 | 3097 42 | 3301 43 | 2664 44 | 2418 45 | 2655 46 | 2464 47 | 2596 48 | 3262 49 | 3278 50 | 2320 51 | 2612 52 | 2614 53 | 2550 54 | 2626 55 | 2772 56 | 3007 57 | 2733 58 | 2516 59 | 2476 60 | 2798 61 | 2561 62 | 2839 63 | 2685 64 | 2391 65 | 2705 66 | 3098 67 | 2754 68 | 3251 69 | 2767 70 | 2630 71 | 2727 72 | 2513 73 | 2701 74 | 3264 75 | 2792 76 | 2821 77 | 3260 78 | 2462 79 | 3307 80 | 2639 81 | 2900 82 | 3060 83 | 2672 84 | 3116 85 | 2731 86 | 3316 87 | 2386 88 | 2425 89 | 2518 90 | 3151 91 | 2586 92 | 2797 93 | 2479 94 | 3117 95 | 2580 96 | 3182 97 | 2459 98 | 2508 99 | 3052 100 | 3230 101 | 3215 102 | 2803 103 | 2969 104 | 2562 105 | 2398 106 | 3325 107 | 2343 108 | 3030 109 | 2414 110 | 2776 111 | 2383 112 | 3173 113 | 2850 114 | 2499 115 | 3312 116 | 2648 117 | 2784 118 | 2898 119 | 3056 120 | 2484 121 | 3179 122 | 3132 123 | 2577 124 | 2563 125 | 2867 126 | 3317 127 | 2355 128 | 3207 129 | 3178 130 | 2968 131 | 3319 132 | 2358 133 | 2764 134 | 3001 135 | 2683 136 | 3271 137 | 2321 138 | 2567 139 | 2502 140 | 3246 141 | 2715 142 | 3066 143 | 2390 144 | 2381 145 | 3162 146 | 2741 147 | 2498 148 | 2790 149 | 3038 150 | 3321 151 | 2481 152 | 3050 153 | 3161 154 | 3122 155 | 2801 156 | 2957 157 | 3177 158 | 2965 159 | 2621 160 | 3208 161 | 2921 162 | 2802 163 | 2357 164 | 2677 165 | 2519 166 | 2860 167 | 2696 168 | 2368 169 | 3241 170 | 2858 171 | 2419 172 | 2762 173 | 2875 174 | 3222 175 | 3064 176 | 2827 177 | 3044 178 | 2471 179 | 3062 180 | 2982 181 | 2736 182 | 2322 183 | 2709 184 | 2766 185 | 2424 186 | 2602 187 | 2970 
188 | 2675 189 | 3299 190 | 2554 191 | 2964 192 | 2597 193 | 2753 194 | 2979 195 | 2523 196 | 2912 197 | 2896 198 | 2317 199 | 3167 200 | 2813 201 | 2482 202 | 2557 203 | 3043 204 | 3244 205 | 2985 206 | 2460 207 | 2363 208 | 3272 209 | 3045 210 | 3192 211 | 2453 212 | 2656 213 | 2834 214 | 2443 215 | 3202 216 | 2926 217 | 2711 218 | 2633 219 | 2384 220 | 2752 221 | 3285 222 | 2817 223 | 2483 224 | 2919 225 | 2924 226 | 2661 227 | 2698 228 | 2361 229 | 2662 230 | 2819 231 | 3143 232 | 2316 233 | 3196 234 | 2739 235 | 2345 236 | 2578 237 | 2822 238 | 3229 239 | 2908 240 | 2917 241 | 2692 242 | 3200 243 | 2324 244 | 2522 245 | 3322 246 | 2697 247 | 3163 248 | 3093 249 | 3233 250 | 2774 251 | 2371 252 | 2835 253 | 2652 254 | 2539 255 | 2843 256 | 3231 257 | 2976 258 | 2429 259 | 2367 260 | 3144 261 | 2564 262 | 3283 263 | 3217 264 | 3035 265 | 2962 266 | 2433 267 | 2415 268 | 2387 269 | 3021 270 | 2595 271 | 2517 272 | 2468 273 | 3061 274 | 2673 275 | 2348 276 | 3027 277 | 2467 278 | 3318 279 | 2959 280 | 3273 281 | 2392 282 | 2779 283 | 2678 284 | 3004 285 | 2634 286 | 2974 287 | 3198 288 | 2342 289 | 2376 290 | 3249 291 | 2868 292 | 2952 293 | 2710 294 | 2838 295 | 2335 296 | 2524 297 | 2650 298 | 3186 299 | 2743 300 | 2545 301 | 2841 302 | 2515 303 | 2505 304 | 3181 305 | 2945 306 | 2738 307 | 2933 308 | 3303 309 | 2611 310 | 3090 311 | 2328 312 | 3010 313 | 3016 314 | 2504 315 | 2936 316 | 3266 317 | 3253 318 | 2840 319 | 3034 320 | 2581 321 | 2344 322 | 2452 323 | 2654 324 | 3199 325 | 3137 326 | 2514 327 | 2394 328 | 2544 329 | 2641 330 | 2613 331 | 2618 332 | 2558 333 | 2593 334 | 2532 335 | 2512 336 | 2975 337 | 3267 338 | 2566 339 | 2951 340 | 3300 341 | 2869 342 | 2629 343 | 2747 344 | 3055 345 | 2831 346 | 3105 347 | 3168 348 | 3100 349 | 2431 350 | 2828 351 | 2684 352 | 3269 353 | 2910 354 | 2865 355 | 2693 356 | 2884 357 | 3228 358 | 2783 359 | 3247 360 | 2770 361 | 3157 362 | 2421 363 | 2382 364 | 2331 365 | 3203 366 | 3240 367 | 2351 368 | 3114 369 | 2986 370 | 2688 371 | 2439 372 | 2996 373 | 3079 374 | 3103 375 | 3296 376 | 2349 377 | 2372 378 | 3096 379 | 2422 380 | 2551 381 | 3069 382 | 2737 383 | 3084 384 | 3304 385 | 3022 386 | 2542 387 | 3204 388 | 2949 389 | 2318 390 | 2450 391 | 3140 392 | 2734 393 | 2881 394 | 2576 395 | 3054 396 | 3089 397 | 3125 398 | 2761 399 | 3136 400 | 3111 401 | 2427 402 | 2466 403 | 3101 404 | 3104 405 | 3259 406 | 2534 407 | 2961 408 | 3191 409 | 3000 410 | 3036 411 | 2356 412 | 2800 413 | 3155 414 | 3224 415 | 2646 416 | 2735 417 | 3020 418 | 2866 419 | 2426 420 | 2448 421 | 3226 422 | 3219 423 | 2749 424 | 3183 425 | 2906 426 | 2360 427 | 2440 428 | 2946 429 | 2313 430 | 2859 431 | 2340 432 | 3008 433 | 2719 434 | 3058 435 | 2653 436 | 3023 437 | 2888 438 | 3243 439 | 2913 440 | 3242 441 | 3067 442 | 2409 443 | 3227 444 | 2380 445 | 2353 446 | 2686 447 | 2971 448 | 2847 449 | 2947 450 | 2857 451 | 3263 452 | 3218 453 | 2861 454 | 3323 455 | 2635 456 | 2966 457 | 2604 458 | 2456 459 | 2832 460 | 2694 461 | 3245 462 | 3119 463 | 2942 464 | 3153 465 | 2894 466 | 2555 467 | 3128 468 | 2703 469 | 2323 470 | 2631 471 | 2732 472 | 2699 473 | 2314 474 | 2590 475 | 3127 476 | 2891 477 | 2873 478 | 2814 479 | 2326 480 | 3026 481 | 3288 482 | 3095 483 | 2706 484 | 2457 485 | 2377 486 | 2620 487 | 2526 488 | 2674 489 | 3190 490 | 2923 491 | 3032 492 | 2334 493 | 3254 494 | 2991 495 | 3277 496 | 2973 497 | 2599 498 | 2658 499 | 2636 500 | 2826 501 | 3148 502 | 2958 503 | 3258 504 | 2990 505 | 3180 506 | 2538 507 | 2748 508 | 2625 509 | 2565 510 | 3011 
511 | 3057 512 | 2354 513 | 3158 514 | 2622 515 | 3308 516 | 2983 517 | 2560 518 | 3169 519 | 3059 520 | 2480 521 | 3194 522 | 3291 523 | 3216 524 | 2643 525 | 3172 526 | 2352 527 | 2724 528 | 2485 529 | 2411 530 | 2948 531 | 2445 532 | 2362 533 | 2668 534 | 3275 535 | 3107 536 | 2496 537 | 2529 538 | 2700 539 | 2541 540 | 3028 541 | 2879 542 | 2660 543 | 3324 544 | 2755 545 | 2436 546 | 3048 547 | 2623 548 | 2920 549 | 3040 550 | 2568 551 | 3221 552 | 3003 553 | 3295 554 | 2473 555 | 3232 556 | 3213 557 | 2823 558 | 2897 559 | 2573 560 | 2645 561 | 3018 562 | 3326 563 | 2795 564 | 2915 565 | 3109 566 | 3086 567 | 2463 568 | 3118 569 | 2671 570 | 2909 571 | 2393 572 | 2325 573 | 3029 574 | 2972 575 | 3110 576 | 2870 577 | 3284 578 | 2816 579 | 2647 580 | 2667 581 | 2955 582 | 2333 583 | 2960 584 | 2864 585 | 2893 586 | 2458 587 | 2441 588 | 2359 589 | 2327 590 | 3256 591 | 3099 592 | 3073 593 | 3138 594 | 2511 595 | 2666 596 | 2548 597 | 2364 598 | 2451 599 | 2911 600 | 3237 601 | 3206 602 | 3080 603 | 3279 604 | 2934 605 | 2981 606 | 2878 607 | 3130 608 | 2830 609 | 3091 610 | 2659 611 | 2449 612 | 3152 613 | 2413 614 | 2722 615 | 2796 616 | 3220 617 | 2751 618 | 2935 619 | 3238 620 | 2491 621 | 2730 622 | 2842 623 | 3223 624 | 2492 625 | 3074 626 | 3094 627 | 2833 628 | 2521 629 | 2883 630 | 3315 631 | 2845 632 | 2907 633 | 3083 634 | 2572 635 | 3092 636 | 2903 637 | 2918 638 | 3039 639 | 3286 640 | 2587 641 | 3068 642 | 2338 643 | 3166 644 | 3134 645 | 2455 646 | 2497 647 | 2992 648 | 2775 649 | 2681 650 | 2430 651 | 2932 652 | 2931 653 | 2434 654 | 3154 655 | 3046 656 | 2598 657 | 2366 658 | 3015 659 | 3147 660 | 2944 661 | 2582 662 | 3274 663 | 2987 664 | 2642 665 | 2547 666 | 2420 667 | 2930 668 | 2750 669 | 2417 670 | 2808 671 | 3141 672 | 2997 673 | 2995 674 | 2584 675 | 2312 676 | 3033 677 | 3070 678 | 3065 679 | 2509 680 | 3314 681 | 2396 682 | 2543 683 | 2423 684 | 3170 685 | 2389 686 | 3289 687 | 2728 688 | 2540 689 | 2437 690 | 2486 691 | 2895 692 | 3017 693 | 2853 694 | 2406 695 | 2346 696 | 2877 697 | 2472 698 | 3210 699 | 2637 700 | 2927 701 | 2789 702 | 2330 703 | 3088 704 | 3102 705 | 2616 706 | 3081 707 | 2902 708 | 3205 709 | 3320 710 | 3165 711 | 2984 712 | 3185 713 | 2707 714 | 3255 715 | 2583 716 | 2773 717 | 2742 718 | 3024 719 | 2402 720 | 2718 721 | 2882 722 | 2575 723 | 3281 724 | 2786 725 | 2855 726 | 3014 727 | 2401 728 | 2535 729 | 2687 730 | 2495 731 | 3113 732 | 2609 733 | 2559 734 | 2665 735 | 2530 736 | 3293 737 | 2399 738 | 2605 739 | 2690 740 | 3133 741 | 2799 742 | 2533 743 | 2695 744 | 2713 745 | 2886 746 | 2691 747 | 2549 748 | 3077 749 | 3002 750 | 3049 751 | 3051 752 | 3087 753 | 2444 754 | 3085 755 | 3135 756 | 2702 757 | 3211 758 | 3108 759 | 2501 760 | 2769 761 | 3290 762 | 2465 763 | 3025 764 | 3019 765 | 2385 766 | 2940 767 | 2657 768 | 2610 769 | 2525 770 | 2941 771 | 3078 772 | 2341 773 | 2916 774 | 2956 775 | 2375 776 | 2880 777 | 3009 778 | 2780 779 | 2370 780 | 2925 781 | 2332 782 | 3146 783 | 2315 784 | 2809 785 | 3145 786 | 3106 787 | 2782 788 | 2760 789 | 2493 790 | 2765 791 | 2556 792 | 2890 793 | 2400 794 | 2339 795 | 3201 796 | 2818 797 | 3248 798 | 3280 799 | 2570 800 | 2569 801 | 2937 802 | 3174 803 | 2836 804 | 2708 805 | 2820 806 | 3195 807 | 2617 808 | 3197 809 | 2319 810 | 2744 811 | 2615 812 | 2825 813 | 2603 814 | 2914 815 | 2531 816 | 3193 817 | 2624 818 | 2365 819 | 2810 820 | 3239 821 | 3159 822 | 2537 823 | 2844 824 | 2758 825 | 2938 826 | 3037 827 | 2503 828 | 3297 829 | 2885 830 | 2608 831 | 2494 832 | 2712 833 | 2408 
834 | 2901 835 | 2704 836 | 2536 837 | 2373 838 | 2478 839 | 2723 840 | 3076 841 | 2627 842 | 2369 843 | 2669 844 | 3006 845 | 2628 846 | 2788 847 | 3276 848 | 2435 849 | 3139 850 | 3235 851 | 2527 852 | 2571 853 | 2815 854 | 2442 855 | 2892 856 | 2978 857 | 2746 858 | 3150 859 | 2574 860 | 2725 861 | 3188 862 | 2601 863 | 2378 864 | 3075 865 | 2632 866 | 2794 867 | 3270 868 | 3071 869 | 2506 870 | 3126 871 | 3236 872 | 3257 873 | 2824 874 | 2989 875 | 2950 876 | 2428 877 | 2405 878 | 3156 879 | 2447 880 | 2787 881 | 2805 882 | 2720 883 | 2403 884 | 2811 885 | 2329 886 | 2474 887 | 2785 888 | 2350 889 | 2507 890 | 2416 891 | 3112 892 | 2475 893 | 2876 894 | 2585 895 | 2487 896 | 3072 897 | 3082 898 | 2943 899 | 2757 900 | 2388 901 | 2600 902 | 3294 903 | 2756 904 | 3142 905 | 3041 906 | 2594 907 | 2998 908 | 3047 909 | 2379 910 | 2980 911 | 2454 912 | 2862 913 | 3175 914 | 2588 915 | 3031 916 | 3012 917 | 2889 918 | 2500 919 | 2791 920 | 2854 921 | 2619 922 | 2395 923 | 2807 924 | 2740 925 | 2412 926 | 3131 927 | 3013 928 | 2939 929 | 2651 930 | 2490 931 | 2988 932 | 2863 933 | 3225 934 | 2745 935 | 2714 936 | 3160 937 | 3124 938 | 2849 939 | 2676 940 | 2872 941 | 3287 942 | 3189 943 | 2716 944 | 3115 945 | 2928 946 | 2871 947 | 2591 948 | 2717 949 | 2546 950 | 2777 951 | 3298 952 | 2397 953 | 3187 954 | 2726 955 | 2336 956 | 3268 957 | 2477 958 | 2904 959 | 2846 960 | 3121 961 | 2899 962 | 2510 963 | 2806 964 | 2963 965 | 3313 966 | 2679 967 | 3302 968 | 2663 969 | 3053 970 | 2469 971 | 2999 972 | 3311 973 | 2470 974 | 2638 975 | 3120 976 | 3171 977 | 2689 978 | 2922 979 | 2607 980 | 2721 981 | 2993 982 | 2887 983 | 2837 984 | 2929 985 | 2829 986 | 3234 987 | 2649 988 | 2337 989 | 2759 990 | 2778 991 | 2771 992 | 2404 993 | 2589 994 | 3123 995 | 3209 996 | 2729 997 | 3252 998 | 2606 999 | 2579 1000 | 2552 1001 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.citeseer.tx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.citeseer.tx -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.citeseer.ty: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.citeseer.ty -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.citeseer.x: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.citeseer.x -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.citeseer.y: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.citeseer.y -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.cora.allx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.cora.allx 
-------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.cora.ally: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.cora.ally -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.cora.graph: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.cora.graph -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.cora.test.index: -------------------------------------------------------------------------------- 1 | 2692 2 | 2532 3 | 2050 4 | 1715 5 | 2362 6 | 2609 7 | 2622 8 | 1975 9 | 2081 10 | 1767 11 | 2263 12 | 1725 13 | 2588 14 | 2259 15 | 2357 16 | 1998 17 | 2574 18 | 2179 19 | 2291 20 | 2382 21 | 1812 22 | 1751 23 | 2422 24 | 1937 25 | 2631 26 | 2510 27 | 2378 28 | 2589 29 | 2345 30 | 1943 31 | 1850 32 | 2298 33 | 1825 34 | 2035 35 | 2507 36 | 2313 37 | 1906 38 | 1797 39 | 2023 40 | 2159 41 | 2495 42 | 1886 43 | 2122 44 | 2369 45 | 2461 46 | 1925 47 | 2565 48 | 1858 49 | 2234 50 | 2000 51 | 1846 52 | 2318 53 | 1723 54 | 2559 55 | 2258 56 | 1763 57 | 1991 58 | 1922 59 | 2003 60 | 2662 61 | 2250 62 | 2064 63 | 2529 64 | 1888 65 | 2499 66 | 2454 67 | 2320 68 | 2287 69 | 2203 70 | 2018 71 | 2002 72 | 2632 73 | 2554 74 | 2314 75 | 2537 76 | 1760 77 | 2088 78 | 2086 79 | 2218 80 | 2605 81 | 1953 82 | 2403 83 | 1920 84 | 2015 85 | 2335 86 | 2535 87 | 1837 88 | 2009 89 | 1905 90 | 2636 91 | 1942 92 | 2193 93 | 2576 94 | 2373 95 | 1873 96 | 2463 97 | 2509 98 | 1954 99 | 2656 100 | 2455 101 | 2494 102 | 2295 103 | 2114 104 | 2561 105 | 2176 106 | 2275 107 | 2635 108 | 2442 109 | 2704 110 | 2127 111 | 2085 112 | 2214 113 | 2487 114 | 1739 115 | 2543 116 | 1783 117 | 2485 118 | 2262 119 | 2472 120 | 2326 121 | 1738 122 | 2170 123 | 2100 124 | 2384 125 | 2152 126 | 2647 127 | 2693 128 | 2376 129 | 1775 130 | 1726 131 | 2476 132 | 2195 133 | 1773 134 | 1793 135 | 2194 136 | 2581 137 | 1854 138 | 2524 139 | 1945 140 | 1781 141 | 1987 142 | 2599 143 | 1744 144 | 2225 145 | 2300 146 | 1928 147 | 2042 148 | 2202 149 | 1958 150 | 1816 151 | 1916 152 | 2679 153 | 2190 154 | 1733 155 | 2034 156 | 2643 157 | 2177 158 | 1883 159 | 1917 160 | 1996 161 | 2491 162 | 2268 163 | 2231 164 | 2471 165 | 1919 166 | 1909 167 | 2012 168 | 2522 169 | 1865 170 | 2466 171 | 2469 172 | 2087 173 | 2584 174 | 2563 175 | 1924 176 | 2143 177 | 1736 178 | 1966 179 | 2533 180 | 2490 181 | 2630 182 | 1973 183 | 2568 184 | 1978 185 | 2664 186 | 2633 187 | 2312 188 | 2178 189 | 1754 190 | 2307 191 | 2480 192 | 1960 193 | 1742 194 | 1962 195 | 2160 196 | 2070 197 | 2553 198 | 2433 199 | 1768 200 | 2659 201 | 2379 202 | 2271 203 | 1776 204 | 2153 205 | 1877 206 | 2027 207 | 2028 208 | 2155 209 | 2196 210 | 2483 211 | 2026 212 | 2158 213 | 2407 214 | 1821 215 | 2131 216 | 2676 217 | 2277 218 | 2489 219 | 2424 220 | 1963 221 | 1808 222 | 1859 223 | 2597 224 | 2548 225 | 2368 226 | 1817 227 | 2405 228 | 2413 229 | 2603 230 | 2350 231 | 2118 232 | 2329 233 | 1969 234 | 2577 235 | 2475 236 | 2467 237 | 2425 238 | 1769 239 | 2092 240 | 2044 241 | 2586 242 | 2608 243 | 1983 244 | 2109 245 | 2649 246 | 1964 247 | 2144 248 | 1902 249 | 2411 250 | 2508 251 | 2360 252 | 1721 
253 | 2005 254 | 2014 255 | 2308 256 | 2646 257 | 1949 258 | 1830 259 | 2212 260 | 2596 261 | 1832 262 | 1735 263 | 1866 264 | 2695 265 | 1941 266 | 2546 267 | 2498 268 | 2686 269 | 2665 270 | 1784 271 | 2613 272 | 1970 273 | 2021 274 | 2211 275 | 2516 276 | 2185 277 | 2479 278 | 2699 279 | 2150 280 | 1990 281 | 2063 282 | 2075 283 | 1979 284 | 2094 285 | 1787 286 | 2571 287 | 2690 288 | 1926 289 | 2341 290 | 2566 291 | 1957 292 | 1709 293 | 1955 294 | 2570 295 | 2387 296 | 1811 297 | 2025 298 | 2447 299 | 2696 300 | 2052 301 | 2366 302 | 1857 303 | 2273 304 | 2245 305 | 2672 306 | 2133 307 | 2421 308 | 1929 309 | 2125 310 | 2319 311 | 2641 312 | 2167 313 | 2418 314 | 1765 315 | 1761 316 | 1828 317 | 2188 318 | 1972 319 | 1997 320 | 2419 321 | 2289 322 | 2296 323 | 2587 324 | 2051 325 | 2440 326 | 2053 327 | 2191 328 | 1923 329 | 2164 330 | 1861 331 | 2339 332 | 2333 333 | 2523 334 | 2670 335 | 2121 336 | 1921 337 | 1724 338 | 2253 339 | 2374 340 | 1940 341 | 2545 342 | 2301 343 | 2244 344 | 2156 345 | 1849 346 | 2551 347 | 2011 348 | 2279 349 | 2572 350 | 1757 351 | 2400 352 | 2569 353 | 2072 354 | 2526 355 | 2173 356 | 2069 357 | 2036 358 | 1819 359 | 1734 360 | 1880 361 | 2137 362 | 2408 363 | 2226 364 | 2604 365 | 1771 366 | 2698 367 | 2187 368 | 2060 369 | 1756 370 | 2201 371 | 2066 372 | 2439 373 | 1844 374 | 1772 375 | 2383 376 | 2398 377 | 1708 378 | 1992 379 | 1959 380 | 1794 381 | 2426 382 | 2702 383 | 2444 384 | 1944 385 | 1829 386 | 2660 387 | 2497 388 | 2607 389 | 2343 390 | 1730 391 | 2624 392 | 1790 393 | 1935 394 | 1967 395 | 2401 396 | 2255 397 | 2355 398 | 2348 399 | 1931 400 | 2183 401 | 2161 402 | 2701 403 | 1948 404 | 2501 405 | 2192 406 | 2404 407 | 2209 408 | 2331 409 | 1810 410 | 2363 411 | 2334 412 | 1887 413 | 2393 414 | 2557 415 | 1719 416 | 1732 417 | 1986 418 | 2037 419 | 2056 420 | 1867 421 | 2126 422 | 1932 423 | 2117 424 | 1807 425 | 1801 426 | 1743 427 | 2041 428 | 1843 429 | 2388 430 | 2221 431 | 1833 432 | 2677 433 | 1778 434 | 2661 435 | 2306 436 | 2394 437 | 2106 438 | 2430 439 | 2371 440 | 2606 441 | 2353 442 | 2269 443 | 2317 444 | 2645 445 | 2372 446 | 2550 447 | 2043 448 | 1968 449 | 2165 450 | 2310 451 | 1985 452 | 2446 453 | 1982 454 | 2377 455 | 2207 456 | 1818 457 | 1913 458 | 1766 459 | 1722 460 | 1894 461 | 2020 462 | 1881 463 | 2621 464 | 2409 465 | 2261 466 | 2458 467 | 2096 468 | 1712 469 | 2594 470 | 2293 471 | 2048 472 | 2359 473 | 1839 474 | 2392 475 | 2254 476 | 1911 477 | 2101 478 | 2367 479 | 1889 480 | 1753 481 | 2555 482 | 2246 483 | 2264 484 | 2010 485 | 2336 486 | 2651 487 | 2017 488 | 2140 489 | 1842 490 | 2019 491 | 1890 492 | 2525 493 | 2134 494 | 2492 495 | 2652 496 | 2040 497 | 2145 498 | 2575 499 | 2166 500 | 1999 501 | 2434 502 | 1711 503 | 2276 504 | 2450 505 | 2389 506 | 2669 507 | 2595 508 | 1814 509 | 2039 510 | 2502 511 | 1896 512 | 2168 513 | 2344 514 | 2637 515 | 2031 516 | 1977 517 | 2380 518 | 1936 519 | 2047 520 | 2460 521 | 2102 522 | 1745 523 | 2650 524 | 2046 525 | 2514 526 | 1980 527 | 2352 528 | 2113 529 | 1713 530 | 2058 531 | 2558 532 | 1718 533 | 1864 534 | 1876 535 | 2338 536 | 1879 537 | 1891 538 | 2186 539 | 2451 540 | 2181 541 | 2638 542 | 2644 543 | 2103 544 | 2591 545 | 2266 546 | 2468 547 | 1869 548 | 2582 549 | 2674 550 | 2361 551 | 2462 552 | 1748 553 | 2215 554 | 2615 555 | 2236 556 | 2248 557 | 2493 558 | 2342 559 | 2449 560 | 2274 561 | 1824 562 | 1852 563 | 1870 564 | 2441 565 | 2356 566 | 1835 567 | 2694 568 | 2602 569 | 2685 570 | 1893 571 | 2544 572 | 2536 573 | 1994 574 | 1853 575 | 1838 
576 | 1786 577 | 1930 578 | 2539 579 | 1892 580 | 2265 581 | 2618 582 | 2486 583 | 2583 584 | 2061 585 | 1796 586 | 1806 587 | 2084 588 | 1933 589 | 2095 590 | 2136 591 | 2078 592 | 1884 593 | 2438 594 | 2286 595 | 2138 596 | 1750 597 | 2184 598 | 1799 599 | 2278 600 | 2410 601 | 2642 602 | 2435 603 | 1956 604 | 2399 605 | 1774 606 | 2129 607 | 1898 608 | 1823 609 | 1938 610 | 2299 611 | 1862 612 | 2420 613 | 2673 614 | 1984 615 | 2204 616 | 1717 617 | 2074 618 | 2213 619 | 2436 620 | 2297 621 | 2592 622 | 2667 623 | 2703 624 | 2511 625 | 1779 626 | 1782 627 | 2625 628 | 2365 629 | 2315 630 | 2381 631 | 1788 632 | 1714 633 | 2302 634 | 1927 635 | 2325 636 | 2506 637 | 2169 638 | 2328 639 | 2629 640 | 2128 641 | 2655 642 | 2282 643 | 2073 644 | 2395 645 | 2247 646 | 2521 647 | 2260 648 | 1868 649 | 1988 650 | 2324 651 | 2705 652 | 2541 653 | 1731 654 | 2681 655 | 2707 656 | 2465 657 | 1785 658 | 2149 659 | 2045 660 | 2505 661 | 2611 662 | 2217 663 | 2180 664 | 1904 665 | 2453 666 | 2484 667 | 1871 668 | 2309 669 | 2349 670 | 2482 671 | 2004 672 | 1965 673 | 2406 674 | 2162 675 | 1805 676 | 2654 677 | 2007 678 | 1947 679 | 1981 680 | 2112 681 | 2141 682 | 1720 683 | 1758 684 | 2080 685 | 2330 686 | 2030 687 | 2432 688 | 2089 689 | 2547 690 | 1820 691 | 1815 692 | 2675 693 | 1840 694 | 2658 695 | 2370 696 | 2251 697 | 1908 698 | 2029 699 | 2068 700 | 2513 701 | 2549 702 | 2267 703 | 2580 704 | 2327 705 | 2351 706 | 2111 707 | 2022 708 | 2321 709 | 2614 710 | 2252 711 | 2104 712 | 1822 713 | 2552 714 | 2243 715 | 1798 716 | 2396 717 | 2663 718 | 2564 719 | 2148 720 | 2562 721 | 2684 722 | 2001 723 | 2151 724 | 2706 725 | 2240 726 | 2474 727 | 2303 728 | 2634 729 | 2680 730 | 2055 731 | 2090 732 | 2503 733 | 2347 734 | 2402 735 | 2238 736 | 1950 737 | 2054 738 | 2016 739 | 1872 740 | 2233 741 | 1710 742 | 2032 743 | 2540 744 | 2628 745 | 1795 746 | 2616 747 | 1903 748 | 2531 749 | 2567 750 | 1946 751 | 1897 752 | 2222 753 | 2227 754 | 2627 755 | 1856 756 | 2464 757 | 2241 758 | 2481 759 | 2130 760 | 2311 761 | 2083 762 | 2223 763 | 2284 764 | 2235 765 | 2097 766 | 1752 767 | 2515 768 | 2527 769 | 2385 770 | 2189 771 | 2283 772 | 2182 773 | 2079 774 | 2375 775 | 2174 776 | 2437 777 | 1993 778 | 2517 779 | 2443 780 | 2224 781 | 2648 782 | 2171 783 | 2290 784 | 2542 785 | 2038 786 | 1855 787 | 1831 788 | 1759 789 | 1848 790 | 2445 791 | 1827 792 | 2429 793 | 2205 794 | 2598 795 | 2657 796 | 1728 797 | 2065 798 | 1918 799 | 2427 800 | 2573 801 | 2620 802 | 2292 803 | 1777 804 | 2008 805 | 1875 806 | 2288 807 | 2256 808 | 2033 809 | 2470 810 | 2585 811 | 2610 812 | 2082 813 | 2230 814 | 1915 815 | 1847 816 | 2337 817 | 2512 818 | 2386 819 | 2006 820 | 2653 821 | 2346 822 | 1951 823 | 2110 824 | 2639 825 | 2520 826 | 1939 827 | 2683 828 | 2139 829 | 2220 830 | 1910 831 | 2237 832 | 1900 833 | 1836 834 | 2197 835 | 1716 836 | 1860 837 | 2077 838 | 2519 839 | 2538 840 | 2323 841 | 1914 842 | 1971 843 | 1845 844 | 2132 845 | 1802 846 | 1907 847 | 2640 848 | 2496 849 | 2281 850 | 2198 851 | 2416 852 | 2285 853 | 1755 854 | 2431 855 | 2071 856 | 2249 857 | 2123 858 | 1727 859 | 2459 860 | 2304 861 | 2199 862 | 1791 863 | 1809 864 | 1780 865 | 2210 866 | 2417 867 | 1874 868 | 1878 869 | 2116 870 | 1961 871 | 1863 872 | 2579 873 | 2477 874 | 2228 875 | 2332 876 | 2578 877 | 2457 878 | 2024 879 | 1934 880 | 2316 881 | 1841 882 | 1764 883 | 1737 884 | 2322 885 | 2239 886 | 2294 887 | 1729 888 | 2488 889 | 1974 890 | 2473 891 | 2098 892 | 2612 893 | 1834 894 | 2340 895 | 2423 896 | 2175 897 | 2280 898 | 2617 
899 | 2208 900 | 2560 901 | 1741 902 | 2600 903 | 2059 904 | 1747 905 | 2242 906 | 2700 907 | 2232 908 | 2057 909 | 2147 910 | 2682 911 | 1792 912 | 1826 913 | 2120 914 | 1895 915 | 2364 916 | 2163 917 | 1851 918 | 2391 919 | 2414 920 | 2452 921 | 1803 922 | 1989 923 | 2623 924 | 2200 925 | 2528 926 | 2415 927 | 1804 928 | 2146 929 | 2619 930 | 2687 931 | 1762 932 | 2172 933 | 2270 934 | 2678 935 | 2593 936 | 2448 937 | 1882 938 | 2257 939 | 2500 940 | 1899 941 | 2478 942 | 2412 943 | 2107 944 | 1746 945 | 2428 946 | 2115 947 | 1800 948 | 1901 949 | 2397 950 | 2530 951 | 1912 952 | 2108 953 | 2206 954 | 2091 955 | 1740 956 | 2219 957 | 1976 958 | 2099 959 | 2142 960 | 2671 961 | 2668 962 | 2216 963 | 2272 964 | 2229 965 | 2666 966 | 2456 967 | 2534 968 | 2697 969 | 2688 970 | 2062 971 | 2691 972 | 2689 973 | 2154 974 | 2590 975 | 2626 976 | 2390 977 | 1813 978 | 2067 979 | 1952 980 | 2518 981 | 2358 982 | 1789 983 | 2076 984 | 2049 985 | 2119 986 | 2013 987 | 2124 988 | 2556 989 | 2105 990 | 2093 991 | 1885 992 | 2305 993 | 2354 994 | 2135 995 | 2601 996 | 1770 997 | 1995 998 | 2504 999 | 1749 1000 | 2157 1001 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.cora.tx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.cora.tx -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.cora.ty: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.cora.ty -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.cora.x: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.cora.x -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.cora.y: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.cora.y -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.pubmed.allx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.pubmed.allx -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.pubmed.ally: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.pubmed.ally -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.pubmed.graph: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.pubmed.graph 
-------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.pubmed.test.index: -------------------------------------------------------------------------------- 1 | 18747 2 | 19392 3 | 19181 4 | 18843 5 | 19221 6 | 18962 7 | 19560 8 | 19097 9 | 18966 10 | 19014 11 | 18756 12 | 19313 13 | 19000 14 | 19569 15 | 19359 16 | 18854 17 | 18970 18 | 19073 19 | 19661 20 | 19180 21 | 19377 22 | 18750 23 | 19401 24 | 18788 25 | 19224 26 | 19447 27 | 19017 28 | 19241 29 | 18890 30 | 18908 31 | 18965 32 | 19001 33 | 18849 34 | 19641 35 | 18852 36 | 19222 37 | 19172 38 | 18762 39 | 19156 40 | 19162 41 | 18856 42 | 18763 43 | 19318 44 | 18826 45 | 19712 46 | 19192 47 | 19695 48 | 19030 49 | 19523 50 | 19249 51 | 19079 52 | 19232 53 | 19455 54 | 18743 55 | 18800 56 | 19071 57 | 18885 58 | 19593 59 | 19394 60 | 19390 61 | 18832 62 | 19445 63 | 18838 64 | 19632 65 | 19548 66 | 19546 67 | 18825 68 | 19498 69 | 19266 70 | 19117 71 | 19595 72 | 19252 73 | 18730 74 | 18913 75 | 18809 76 | 19452 77 | 19520 78 | 19274 79 | 19555 80 | 19388 81 | 18919 82 | 19099 83 | 19637 84 | 19403 85 | 18720 86 | 19526 87 | 18905 88 | 19451 89 | 19408 90 | 18923 91 | 18794 92 | 19322 93 | 19431 94 | 18912 95 | 18841 96 | 19239 97 | 19125 98 | 19258 99 | 19565 100 | 18898 101 | 19482 102 | 19029 103 | 18778 104 | 19096 105 | 19684 106 | 19552 107 | 18765 108 | 19361 109 | 19171 110 | 19367 111 | 19623 112 | 19402 113 | 19327 114 | 19118 115 | 18888 116 | 18726 117 | 19510 118 | 18831 119 | 19490 120 | 19576 121 | 19050 122 | 18729 123 | 18896 124 | 19246 125 | 19012 126 | 18862 127 | 18873 128 | 19193 129 | 19693 130 | 19474 131 | 18953 132 | 19115 133 | 19182 134 | 19269 135 | 19116 136 | 18837 137 | 18872 138 | 19007 139 | 19212 140 | 18798 141 | 19102 142 | 18772 143 | 19660 144 | 19511 145 | 18914 146 | 18886 147 | 19672 148 | 19360 149 | 19213 150 | 18810 151 | 19420 152 | 19512 153 | 18719 154 | 19432 155 | 19350 156 | 19127 157 | 18782 158 | 19587 159 | 18924 160 | 19488 161 | 18781 162 | 19340 163 | 19190 164 | 19383 165 | 19094 166 | 18835 167 | 19487 168 | 19230 169 | 18791 170 | 18882 171 | 18937 172 | 18928 173 | 18755 174 | 18802 175 | 19516 176 | 18795 177 | 18786 178 | 19273 179 | 19349 180 | 19398 181 | 19626 182 | 19130 183 | 19351 184 | 19489 185 | 19446 186 | 18959 187 | 19025 188 | 18792 189 | 18878 190 | 19304 191 | 19629 192 | 19061 193 | 18785 194 | 19194 195 | 19179 196 | 19210 197 | 19417 198 | 19583 199 | 19415 200 | 19443 201 | 18739 202 | 19662 203 | 18904 204 | 18910 205 | 18901 206 | 18960 207 | 18722 208 | 18827 209 | 19290 210 | 18842 211 | 19389 212 | 19344 213 | 18961 214 | 19098 215 | 19147 216 | 19334 217 | 19358 218 | 18829 219 | 18984 220 | 18931 221 | 18742 222 | 19320 223 | 19111 224 | 19196 225 | 18887 226 | 18991 227 | 19469 228 | 18990 229 | 18876 230 | 19261 231 | 19270 232 | 19522 233 | 19088 234 | 19284 235 | 19646 236 | 19493 237 | 19225 238 | 19615 239 | 19449 240 | 19043 241 | 19674 242 | 19391 243 | 18918 244 | 19155 245 | 19110 246 | 18815 247 | 19131 248 | 18834 249 | 19715 250 | 19603 251 | 19688 252 | 19133 253 | 19053 254 | 19166 255 | 19066 256 | 18893 257 | 18757 258 | 19582 259 | 19282 260 | 19257 261 | 18869 262 | 19467 263 | 18954 264 | 19371 265 | 19151 266 | 19462 267 | 19598 268 | 19653 269 | 19187 270 | 19624 271 | 19564 272 | 19534 273 | 19581 274 | 19478 275 | 18985 276 | 18746 277 | 19342 278 | 18777 279 | 19696 280 | 18824 281 | 19138 282 | 18728 283 | 19643 284 | 19199 285 | 18731 286 | 19168 287 | 18948 288 
| 19216 289 | 19697 290 | 19347 291 | 18808 292 | 18725 293 | 19134 294 | 18847 295 | 18828 296 | 18996 297 | 19106 298 | 19485 299 | 18917 300 | 18911 301 | 18776 302 | 19203 303 | 19158 304 | 18895 305 | 19165 306 | 19382 307 | 18780 308 | 18836 309 | 19373 310 | 19659 311 | 18947 312 | 19375 313 | 19299 314 | 18761 315 | 19366 316 | 18754 317 | 19248 318 | 19416 319 | 19658 320 | 19638 321 | 19034 322 | 19281 323 | 18844 324 | 18922 325 | 19491 326 | 19272 327 | 19341 328 | 19068 329 | 19332 330 | 19559 331 | 19293 332 | 18804 333 | 18933 334 | 18935 335 | 19405 336 | 18936 337 | 18945 338 | 18943 339 | 18818 340 | 18797 341 | 19570 342 | 19464 343 | 19428 344 | 19093 345 | 19433 346 | 18986 347 | 19161 348 | 19255 349 | 19157 350 | 19046 351 | 19292 352 | 19434 353 | 19298 354 | 18724 355 | 19410 356 | 19694 357 | 19214 358 | 19640 359 | 19189 360 | 18963 361 | 19218 362 | 19585 363 | 19041 364 | 19550 365 | 19123 366 | 19620 367 | 19376 368 | 19561 369 | 18944 370 | 19706 371 | 19056 372 | 19283 373 | 18741 374 | 19319 375 | 19144 376 | 19542 377 | 18821 378 | 19404 379 | 19080 380 | 19303 381 | 18793 382 | 19306 383 | 19678 384 | 19435 385 | 19519 386 | 19566 387 | 19278 388 | 18946 389 | 19536 390 | 19020 391 | 19057 392 | 19198 393 | 19333 394 | 19649 395 | 19699 396 | 19399 397 | 19654 398 | 19136 399 | 19465 400 | 19321 401 | 19577 402 | 18907 403 | 19665 404 | 19386 405 | 19596 406 | 19247 407 | 19473 408 | 19568 409 | 19355 410 | 18925 411 | 19586 412 | 18982 413 | 19616 414 | 19495 415 | 19612 416 | 19023 417 | 19438 418 | 18817 419 | 19692 420 | 19295 421 | 19414 422 | 19676 423 | 19472 424 | 19107 425 | 19062 426 | 19035 427 | 18883 428 | 19409 429 | 19052 430 | 19606 431 | 19091 432 | 19651 433 | 19475 434 | 19413 435 | 18796 436 | 19369 437 | 19639 438 | 19701 439 | 19461 440 | 19645 441 | 19251 442 | 19063 443 | 19679 444 | 19545 445 | 19081 446 | 19363 447 | 18995 448 | 19549 449 | 18790 450 | 18855 451 | 18833 452 | 18899 453 | 19395 454 | 18717 455 | 19647 456 | 18768 457 | 19103 458 | 19245 459 | 18819 460 | 18779 461 | 19656 462 | 19076 463 | 18745 464 | 18971 465 | 19197 466 | 19711 467 | 19074 468 | 19128 469 | 19466 470 | 19139 471 | 19309 472 | 19324 473 | 18814 474 | 19092 475 | 19627 476 | 19060 477 | 18806 478 | 18929 479 | 18737 480 | 18942 481 | 18906 482 | 18858 483 | 19456 484 | 19253 485 | 19716 486 | 19104 487 | 19667 488 | 19574 489 | 18903 490 | 19237 491 | 18864 492 | 19556 493 | 19364 494 | 18952 495 | 19008 496 | 19323 497 | 19700 498 | 19170 499 | 19267 500 | 19345 501 | 19238 502 | 18909 503 | 18892 504 | 19109 505 | 19704 506 | 18902 507 | 19275 508 | 19680 509 | 18723 510 | 19242 511 | 19112 512 | 19169 513 | 18956 514 | 19343 515 | 19650 516 | 19541 517 | 19698 518 | 19521 519 | 19087 520 | 18976 521 | 19038 522 | 18775 523 | 18968 524 | 19671 525 | 19412 526 | 19407 527 | 19573 528 | 19027 529 | 18813 530 | 19357 531 | 19460 532 | 19673 533 | 19481 534 | 19036 535 | 19614 536 | 18787 537 | 19195 538 | 18732 539 | 18884 540 | 19613 541 | 19657 542 | 19575 543 | 19226 544 | 19589 545 | 19234 546 | 19617 547 | 19707 548 | 19484 549 | 18740 550 | 19424 551 | 18784 552 | 19419 553 | 19159 554 | 18865 555 | 19105 556 | 19315 557 | 19480 558 | 19664 559 | 19378 560 | 18803 561 | 19605 562 | 18870 563 | 19042 564 | 19426 565 | 18848 566 | 19223 567 | 19509 568 | 19532 569 | 18752 570 | 19691 571 | 18718 572 | 19209 573 | 19362 574 | 19090 575 | 19492 576 | 19567 577 | 19687 578 | 19018 579 | 18830 580 | 19530 581 | 19554 582 | 19119 583 | 19442 584 | 
19558 585 | 19527 586 | 19427 587 | 19291 588 | 19543 589 | 19422 590 | 19142 591 | 18897 592 | 18950 593 | 19425 594 | 19002 595 | 19588 596 | 18978 597 | 19551 598 | 18930 599 | 18736 600 | 19101 601 | 19215 602 | 19150 603 | 19263 604 | 18949 605 | 18974 606 | 18759 607 | 19335 608 | 19200 609 | 19129 610 | 19328 611 | 19437 612 | 18988 613 | 19429 614 | 19368 615 | 19406 616 | 19049 617 | 18811 618 | 19296 619 | 19256 620 | 19385 621 | 19602 622 | 18770 623 | 19337 624 | 19580 625 | 19476 626 | 19045 627 | 19132 628 | 19089 629 | 19120 630 | 19265 631 | 19483 632 | 18767 633 | 19227 634 | 18934 635 | 19069 636 | 18820 637 | 19006 638 | 19459 639 | 18927 640 | 19037 641 | 19280 642 | 19441 643 | 18823 644 | 19015 645 | 19114 646 | 19618 647 | 18957 648 | 19176 649 | 18853 650 | 19648 651 | 19201 652 | 19444 653 | 19279 654 | 18751 655 | 19302 656 | 19505 657 | 18733 658 | 19601 659 | 19533 660 | 18863 661 | 19708 662 | 19387 663 | 19346 664 | 19152 665 | 19206 666 | 18851 667 | 19338 668 | 19681 669 | 19380 670 | 19055 671 | 18766 672 | 19085 673 | 19591 674 | 19547 675 | 18958 676 | 19146 677 | 18840 678 | 19051 679 | 19021 680 | 19207 681 | 19235 682 | 19086 683 | 18979 684 | 19300 685 | 18939 686 | 19100 687 | 19619 688 | 19287 689 | 18980 690 | 19277 691 | 19326 692 | 19108 693 | 18920 694 | 19625 695 | 19374 696 | 19078 697 | 18734 698 | 19634 699 | 19339 700 | 18877 701 | 19423 702 | 19652 703 | 19683 704 | 19044 705 | 18983 706 | 19330 707 | 19529 708 | 19714 709 | 19468 710 | 19075 711 | 19540 712 | 18839 713 | 19022 714 | 19286 715 | 19537 716 | 19175 717 | 19463 718 | 19167 719 | 19705 720 | 19562 721 | 19244 722 | 19486 723 | 19611 724 | 18801 725 | 19178 726 | 19590 727 | 18846 728 | 19450 729 | 19205 730 | 19381 731 | 18941 732 | 19670 733 | 19185 734 | 19504 735 | 19633 736 | 18997 737 | 19113 738 | 19397 739 | 19636 740 | 19709 741 | 19289 742 | 19264 743 | 19353 744 | 19584 745 | 19126 746 | 18938 747 | 19669 748 | 18964 749 | 19276 750 | 18774 751 | 19173 752 | 19231 753 | 18973 754 | 18769 755 | 19064 756 | 19040 757 | 19668 758 | 18738 759 | 19082 760 | 19655 761 | 19236 762 | 19352 763 | 19609 764 | 19628 765 | 18951 766 | 19384 767 | 19122 768 | 18875 769 | 18992 770 | 18753 771 | 19379 772 | 19254 773 | 19301 774 | 19506 775 | 19135 776 | 19010 777 | 19682 778 | 19400 779 | 19579 780 | 19316 781 | 19553 782 | 19208 783 | 19635 784 | 19644 785 | 18891 786 | 19024 787 | 18989 788 | 19250 789 | 18850 790 | 19317 791 | 18915 792 | 19607 793 | 18799 794 | 18881 795 | 19479 796 | 19031 797 | 19365 798 | 19164 799 | 18744 800 | 18760 801 | 19502 802 | 19058 803 | 19517 804 | 18735 805 | 19448 806 | 19243 807 | 19453 808 | 19285 809 | 18857 810 | 19439 811 | 19016 812 | 18975 813 | 19503 814 | 18998 815 | 18981 816 | 19186 817 | 18994 818 | 19240 819 | 19631 820 | 19070 821 | 19174 822 | 18900 823 | 19065 824 | 19220 825 | 19229 826 | 18880 827 | 19308 828 | 19372 829 | 19496 830 | 18771 831 | 19325 832 | 19538 833 | 19033 834 | 18874 835 | 19077 836 | 19211 837 | 18764 838 | 19458 839 | 19571 840 | 19121 841 | 19019 842 | 19059 843 | 19497 844 | 18969 845 | 19666 846 | 19297 847 | 19219 848 | 19622 849 | 19184 850 | 18977 851 | 19702 852 | 19539 853 | 19329 854 | 19095 855 | 19675 856 | 18972 857 | 19514 858 | 19703 859 | 19188 860 | 18866 861 | 18812 862 | 19314 863 | 18822 864 | 18845 865 | 19494 866 | 19411 867 | 18916 868 | 19686 869 | 18967 870 | 19294 871 | 19143 872 | 19204 873 | 18805 874 | 19689 875 | 19233 876 | 18758 877 | 18748 878 | 19011 879 | 19685 880 | 
19336 881 | 19608 882 | 19454 883 | 19124 884 | 18868 885 | 18807 886 | 19544 887 | 19621 888 | 19228 889 | 19154 890 | 19141 891 | 19145 892 | 19153 893 | 18860 894 | 19163 895 | 19393 896 | 19268 897 | 19160 898 | 19305 899 | 19259 900 | 19471 901 | 19524 902 | 18783 903 | 19396 904 | 18894 905 | 19430 906 | 19690 907 | 19348 908 | 19597 909 | 19592 910 | 19677 911 | 18889 912 | 19331 913 | 18773 914 | 19137 915 | 19009 916 | 18932 917 | 19599 918 | 18816 919 | 19054 920 | 19067 921 | 19477 922 | 19191 923 | 18921 924 | 18940 925 | 19578 926 | 19183 927 | 19004 928 | 19072 929 | 19710 930 | 19005 931 | 19610 932 | 18955 933 | 19457 934 | 19148 935 | 18859 936 | 18993 937 | 19642 938 | 19047 939 | 19418 940 | 19535 941 | 19600 942 | 19312 943 | 19039 944 | 19028 945 | 18879 946 | 19003 947 | 19026 948 | 19013 949 | 19149 950 | 19177 951 | 19217 952 | 18987 953 | 19354 954 | 19525 955 | 19202 956 | 19084 957 | 19032 958 | 18749 959 | 18867 960 | 19048 961 | 18999 962 | 19260 963 | 19630 964 | 18727 965 | 19356 966 | 19083 967 | 18926 968 | 18789 969 | 19370 970 | 18861 971 | 19311 972 | 19557 973 | 19531 974 | 19436 975 | 19140 976 | 19310 977 | 19501 978 | 18721 979 | 19604 980 | 19713 981 | 19262 982 | 19563 983 | 19507 984 | 19440 985 | 19572 986 | 19513 987 | 19515 988 | 19518 989 | 19421 990 | 19470 991 | 19499 992 | 19663 993 | 19508 994 | 18871 995 | 19528 996 | 19500 997 | 19307 998 | 19288 999 | 19594 1000 | 19271 1001 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.pubmed.tx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.pubmed.tx -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.pubmed.ty: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.pubmed.ty -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.pubmed.x: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.pubmed.x -------------------------------------------------------------------------------- /Collaborative_Reasoning/data/ind.pubmed.y: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/Collaborative_Reasoning/data/ind.pubmed.y -------------------------------------------------------------------------------- /Collaborative_Reasoning/data_process.py: -------------------------------------------------------------------------------- 1 | #setting of data generation 2 | 3 | import torch 4 | import random 5 | import sys 6 | import pickle as pkl 7 | import numpy as np 8 | import scipy.sparse as sp 9 | from scipy.sparse.linalg.eigen.arpack import eigsh 10 | import networkx as nx 11 | 12 | def parse_index_file(filename): 13 | """Parse index file.""" 14 | index = [] 15 | for line in open(filename): 16 | index.append(int(line.strip())) 17 | return index 18 | 19 | 20 | 21 | def normalize(mx): 22 | """Row-normalize sparse matrix""" 23 | rowsum = np.array(mx.sum(1)) 24 | r_inv = 
np.power(rowsum, -1).flatten() 25 | r_inv[np.isinf(r_inv)] = 0. 26 | r_mat_inv = sp.diags(r_inv) 27 | mx = r_mat_inv.dot(mx) 28 | return mx 29 | 30 | 31 | 32 | def load_data(dataset_str): 33 | """ 34 | Loads input data from gcn/data directory 35 | 36 | ind.dataset_str.x => the feature vectors of the training instances as scipy.sparse.csr.csr_matrix object; 37 | ind.dataset_str.tx => the feature vectors of the test instances as scipy.sparse.csr.csr_matrix object; 38 | ind.dataset_str.allx => the feature vectors of both labeled and unlabeled training instances 39 | (a superset of ind.dataset_str.x) as scipy.sparse.csr.csr_matrix object; 40 | ind.dataset_str.y => the one-hot labels of the labeled training instances as numpy.ndarray object; 41 | ind.dataset_str.ty => the one-hot labels of the test instances as numpy.ndarray object; 42 | ind.dataset_str.ally => the labels for instances in ind.dataset_str.allx as numpy.ndarray object; 43 | ind.dataset_str.graph => a dict in the format {index: [index_of_neighbor_nodes]} as collections.defaultdict 44 | object; 45 | ind.dataset_str.test.index => the indices of test instances in graph, for the inductive setting as list object. 46 | 47 | All objects above must be saved using python pickle module. 48 | 49 | :param dataset_str: Dataset name 50 | :return: All data input files loaded (as well as the training/test data). 51 | """ 52 | names = ['x', 'y', 'tx', 'ty', 'allx', 'ally', 'graph'] 53 | objects = [] 54 | for i in range(len(names)): 55 | with open("data/ind.{}.{}".format(dataset_str, names[i]), 'rb') as f: 56 | if sys.version_info > (3, 0): 57 | objects.append(pkl.load(f, encoding='latin1')) 58 | else: 59 | objects.append(pkl.load(f)) 60 | 61 | x, y, tx, ty, allx, ally, graph = tuple(objects) 62 | test_idx_reorder = parse_index_file("data/ind.{}.test.index".format(dataset_str)) 63 | test_idx_range = np.sort(test_idx_reorder) 64 | 65 | if dataset_str == 'citeseer': 66 | # Fix citeseer dataset (there are some isolated nodes in the graph) 67 | # Find isolated nodes, add them as zero-vecs into the right position 68 | test_idx_range_full = range(min(test_idx_reorder), max(test_idx_reorder)+1) 69 | tx_extended = sp.lil_matrix((len(test_idx_range_full), x.shape[1])) 70 | tx_extended[test_idx_range-min(test_idx_range), :] = tx 71 | tx = tx_extended 72 | ty_extended = np.zeros((len(test_idx_range_full), y.shape[1])) 73 | ty_extended[test_idx_range-min(test_idx_range), :] = ty 74 | ty = ty_extended 75 | 76 | features = sp.vstack((allx, tx)).tolil() 77 | features[test_idx_reorder, :] = features[test_idx_range, :] 78 | adj = nx.adjacency_matrix(nx.from_dict_of_lists(graph)) 79 | 80 | labels = np.vstack((ally, ty)) 81 | labels[test_idx_reorder, :] = labels[test_idx_range, :] 82 | 83 | number_of_nodes=adj.shape[0] 84 | 85 | 86 | idx_test = test_idx_range.tolist() 87 | idx_train = range(len(y)) 88 | idx_val = range(len(y), len(y)+500) 89 | 90 | idx_train = torch.LongTensor(idx_train) 91 | idx_val = torch.LongTensor(idx_val) 92 | idx_test = torch.LongTensor(idx_test) 93 | 94 | #features = normalize(features) # disabled: training fails to converge with SGD when features are row-normalized (cause unclear)
95 | #adj = normalize(adj) # do not normalize adj here; it is normalized during training 96 | 97 | 98 | features=torch.tensor(features.toarray()).float() 99 | adj=torch.tensor(adj.toarray()).float() 100 | labels=torch.tensor(labels) 101 | labels=torch.argmax(labels,dim=1) 102 | 103 | 104 | return features.float(), adj.float(), labels, idx_train, idx_val, idx_test 105 | 106 | 107 | 108 | 109 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/layers.py: -------------------------------------------------------------------------------- 1 | import math 2 | 3 | import torch 4 | 5 | from torch.nn.parameter import Parameter 6 | from torch.nn.modules.module import Module 7 | 8 | class GraphConvolution(Module): 9 | """ 10 | Simple GCN layer, similar to https://arxiv.org/abs/1609.02907 11 | """ 12 | 13 | def __init__(self, in_features, out_features, bias=True): 14 | super(GraphConvolution, self).__init__() 15 | self.in_features = in_features 16 | self.out_features = out_features 17 | self.weight = Parameter(torch.FloatTensor(in_features, out_features)) 18 | if bias: 19 | self.bias = Parameter(torch.FloatTensor(out_features)) 20 | else: 21 | self.register_parameter('bias', None) 22 | self.reset_parameters() 23 | 24 | def reset_parameters(self): 25 | stdv = 1. / math.sqrt(self.weight.size(1)) 26 | self.weight.data.uniform_(-stdv, stdv) 27 | if self.bias is not None: 28 | self.bias.data.uniform_(-stdv, stdv) 29 | 30 | def forward(self, input, adj): 31 | support = torch.mm(input, self.weight) 32 | output = torch.spmm(adj, support) 33 | if self.bias is not None: 34 | return output + self.bias 35 | else: 36 | return output 37 | 38 | def __repr__(self): 39 | return self.__class__.__name__ + ' (' \ 40 | + str(self.in_features) + ' -> ' \ 41 | + str(self.out_features) + ')' 42 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/models.py: -------------------------------------------------------------------------------- 1 | import torch.nn as nn 2 | import torch.nn.functional as F 3 | from layers import GraphConvolution 4 | 5 | 6 | class GCN(nn.Module): 7 | def __init__(self, nfeat, nhid, nclass, dropout): 8 | super(GCN, self).__init__() 9 | 10 | self.gc1 = GraphConvolution(nfeat, nhid) 11 | self.gc2 = GraphConvolution(nhid, nclass) 12 | self.dropout = dropout 13 | ''' 14 | def forward(self, x, adj): 15 | x = F.relu(self.gc1(x, adj)) 16 | x = F.dropout(x, self.dropout, training=self.training) 17 | x = self.gc2(x, adj) 18 | return F.log_softmax(x, dim=1) 19 | ''' 20 | 21 | def forward(self, x, adj, split_data_indexs = None): 22 | if split_data_indexs is not None: 23 | x = F.relu(self.gc1(x, adj[split_data_indexs[0]])) 24 | x = F.dropout(x, self.dropout, training=self.training) 25 | x = self.gc2(x, adj[split_data_indexs[1]][:,split_data_indexs[0]]) 26 | else: 27 | x = F.relu(self.gc1(x, adj)) 28 | x = F.dropout(x, self.dropout, training=self.training) 29 | x = self.gc2(x, adj) 30 | return F.log_softmax(x, dim=1) 31 | 32 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/train_func.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn.functional as F 3 | from sklearn import metrics 4 | 5 | def accuracy(output, labels): 6 | preds = output.max(1)[1].type_as(labels) 7 | correct = preds.eq(labels).double() 8 | correct = correct.sum() 9 | return correct / len(labels) 10 | 11 | def test(model, features, adj, labels, idx_test):
12 | model.eval() 13 | output = model(features, adj) 14 | pred_labels=torch.argmax(output,dim=1) 15 | loss_test = F.nll_loss(output[idx_test], labels[idx_test]) 16 | acc_test = accuracy(output[idx_test], labels[idx_test]) 17 | 18 | #acc_test = metrics.accuracy_score(labels[idx_test].cpu().detach().numpy(), pred_labels[idx_test].cpu().detach().numpy()) 19 | #f1_test=metrics.f1_score(labels[idx_test].cpu().detach().numpy(), pred_labels[idx_test].cpu().detach().numpy(),average='weighted') 20 | #auc_test=metrics.roc_auc_score(one_hot(labels[idx_test].cpu().detach().numpy()), output[idx_test].cpu().detach().numpy(),multi_class='ovr',average='weighted') 21 | 22 | return loss_test.item(), acc_test.item()#, f1_test, auc_test 23 | 24 | def Block_matrix_train(epoch, model, optimizer, features, adj, labels, split_data_index, idx_train): 25 | model.train() 26 | optimizer.zero_grad() 27 | 28 | output = model(features[split_data_index], adj[split_data_index][:, split_data_index]) #adj, keep block matrix 29 | 30 | loss_train = F.nll_loss(output[idx_train], labels[split_data_index][idx_train]) 31 | 32 | 33 | acc_train = accuracy(output[idx_train], labels[split_data_index][idx_train]) 34 | 35 | 36 | loss_train.backward() 37 | optimizer.step() 38 | optimizer.zero_grad() 39 | 40 | #print("epoch", epoch, 41 | # "train", loss_train.item(), acc_train.item()) 42 | return loss_train.item(), acc_train.item() 43 | 44 | 45 | def Block_matrix_train_batch(epoch, model, optimizer, features, adj, labels, split_data_index_list, idx_train_list ): 46 | model.train() 47 | optimizer.zero_grad() 48 | 49 | loss_train = None 50 | count = 0 51 | for i in range(len(split_data_index_list)): 52 | split_data_index = split_data_index_list[i] 53 | idx_train = idx_train_list[i] 54 | output = model(features[split_data_index], adj[split_data_index][:, split_data_index]) #adj, keep block matrix 55 | if loss_train is None: 56 | loss_train = len(idx_train) * F.nll_loss(output[idx_train], labels[split_data_index][idx_train]) 57 | else: 58 | loss_train += len(idx_train) * F.nll_loss(output[idx_train], labels[split_data_index][idx_train]) 59 | acc_train = accuracy(output[idx_train], labels[split_data_index][idx_train]) # overwritten each pass; only the last block's accuracy is returned 60 | 61 | count += len(idx_train) 62 | 63 | 64 | loss_train /= count 65 | 66 | loss_train.backward() 67 | optimizer.step() 68 | optimizer.zero_grad() 69 | 70 | #print("epoch", epoch, 71 | # "train", loss_train.item(), acc_train.item()) 72 | return loss_train.item(), acc_train.item() 73 | -------------------------------------------------------------------------------- /Collaborative_Reasoning/utils.py: -------------------------------------------------------------------------------- 1 | import collections 2 | import matplotlib.pyplot as plt 3 | 4 | def get_data(file_name): 5 | a = open(file_name, 'r') 6 | train_loss = collections.defaultdict(float) 7 | train_acc = collections.defaultdict(float) 8 | val_loss = collections.defaultdict(float) 9 | val_acc = collections.defaultdict(float) 10 | test_loss = 0 11 | test_acc = 0 12 | count = 0 13 | for line in a: 14 | line = line.split() 15 | if line[1] == 'train': 16 | train_loss[int(line[0])] += float(line[2]) 17 | train_acc[int(line[0])] += float(line[3]) 18 | elif line[1] == 'val': 19 | val_loss[int(line[0])] += float(line[2]) 20 | val_acc[int(line[0])] += float(line[3]) 21 | elif line[1] == 'test': 22 | test_loss += float(line[2]) 23 | test_acc += float(line[3]) 24 | count += 1 25 | else: 26 | print("error") 27 | a.close() 28 | for key in train_loss.keys():
29 | train_loss[key] /= count 30 | train_acc[key] /= count 31 | val_loss[key] /= count 32 | val_acc[key] /= count 33 | test_loss /= count 34 | test_acc /= count 35 | 36 | return train_loss, train_acc, val_loss, val_acc, test_loss, test_acc 37 | 38 | 39 | def get_plot(file_name): 40 | train_loss, train_acc, val_loss, val_acc, test_loss, test_acc = get_data(file_name) 41 | 42 | #plt.plot(train_loss.keys(), train_loss.values(), label = 'train_loss') 43 | #print(train_acc.values()) 44 | plt.plot(train_acc.keys(), train_acc.values(), label = 'train_acc') 45 | #plt.plot(val_loss.keys(), val_loss.values(), label = 'val_loss') 46 | plt.plot(val_acc.keys(), val_acc.values(), label = 'val_acc') 47 | plt.ylim(0, 1) 48 | plt.xlim(0, 300) 49 | plt.legend() 50 | plt.show() -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # iRML Semantic-aware Communication Networks 2 | Code for the paper 3 | **Imitation Learning-based Implicit Semantic-aware Communication Networks: Multi-layer Representation and Collaborative Reasoning** 4 | 5 | by Yong Xiao, Zijian Sun, Guangming Shi, and Dusit Niyato, 6 | 7 | accepted by *IEEE Journal on Selected Areas in Communications (JSAC)*. 8 | 9 | The paper is available at: https://ieeexplore.ieee.org/abstract/document/10000405. 10 | 11 | ## How to Use 12 | Follow the instructions in each folder to reproduce the numerical results for the corresponding module of the proposed architecture. 13 | 14 | ## If you use our code, please cite the paper 15 | ``` 16 | @ARTICLE{10000405, 17 | author={Xiao, Yong and Sun, Zijian and Shi, Guangming and Niyato, Dusit}, 18 | journal={IEEE Journal on Selected Areas in Communications}, 19 | title={Imitation Learning-Based Implicit Semantic-Aware Communication Networks: Multi-Layer Representation and Collaborative Reasoning}, 20 | year={2023}, 21 | volume={41}, 22 | number={3}, 23 | pages={639-658}, 24 | doi={10.1109/JSAC.2022.3229419}} 25 | 26 | ``` 27 | -------------------------------------------------------------------------------- /SER/DecodingSimulation.m: -------------------------------------------------------------------------------- 1 | %% Data Loading 2 | load("data_ent.mat"); 3 | load("data_tri.mat"); 4 | %% BPSK Baseline BER 5 | SignalLength=10000000; 6 | SignalSequence=rand(1,SignalLength); 7 | SignalSequence(SignalSequence>0.5)=1; 8 | SignalSequence(SignalSequence<=0.5)=0; 9 | SignalBpskBit=1-2*SignalSequence; 10 | Bit_tx=SignalBpskBit; 11 | noise_real=normrnd(0,sqrt(0.5),[1 SignalLength]); 12 | noise_imag=1j*normrnd(0,sqrt(0.5),[1 SignalLength]); 13 | SNR=(-12:2:12); 14 | BER=zeros(1,length(SNR)); 15 | for i=1:length(SNR) 16 | ratio=10^(SNR(i)/10); 17 | Message_rx=sqrt(ratio)*Bit_tx+(noise_real+noise_imag); 18 | Bit_rx=zeros(1,SignalLength); 19 | Bit_rx(Message_rx>0)=1; % MATLAB compares complex values by real part, i.e. a real-axis BPSK decision 20 | Bit_rx(Message_rx<0)=-1; 21 | SignalBit_rx=0.5*(1-Bit_rx); 22 | error=xor(SignalSequence,SignalBit_rx); 23 | BER(i)=(sum(error))/SignalLength; 24 | end 25 | %% Layered Semantics BER Calculation 26 | n = 50000; 27 | index_low = 1:1:length(ent_low(:,1)); 28 | random_index_low = index_low(randi(numel(index_low),1,n)); 29 | index_mid = 1:1:length(ent_mid(:,1)); 30 | random_index_mid = index_mid(randi(numel(index_mid),1,n)); 31 | index_high = 1:1:length(ent_high(:,1)); 32 | random_index_high = index_high(randi(numel(index_high),1,n)); 33 | snrs = -12:2:12; 34 | ser_list_low = zeros(1,length(snrs)); 35 | ber_list_low = zeros(1,length(snrs));
36 | ser_list_mid = zeros(1,length(snrs)); 37 | ber_list_mid = zeros(1,length(snrs)); 38 | ser_list_high = zeros(1,length(snrs)); 39 | ber_list_high = zeros(1,length(snrs)); 40 | % BER of Low-layer Semantics 41 | for s = 1:1:length(snrs) 42 | acc = 0; 43 | for i = 1 :1: n 44 | signal = ent_low(random_index_low(i),:); 45 | noised = awgn(signal, snrs(s),'measured'); 46 | temp = ent_low - noised; 47 | dist = []; % distances from the received vector to every candidate embedding 48 | for j = 1:1:length(ent_low(:,1)) 49 | dist(j) = norm(temp(j,:)); 50 | end 51 | [minvalue, min_idx] = min(dist); 52 | if ismember(random_index_low(i),find(dist==min(dist))) 53 | acc = acc+1; 54 | end 55 | end 56 | ser = 1-acc/n; 57 | ser_list_low(s) = ser; 58 | end 59 | ber_list_low = 1-nthroot(1-ser_list_low,8); % convert SER to BER by inverting SER = 1-(1-BER)^8 (assuming 8 bits per symbol) 60 | % BER of Mid-layer Semantics 61 | for s = 1:1:length(snrs) 62 | acc = 0; 63 | for i = 1 :1: n 64 | signal = ent_mid(random_index_mid(i),:); 65 | noised = awgn(signal, snrs(s),'measured'); 66 | temp = ent_mid - noised; 67 | dist = []; 68 | for j = 1:1:length(ent_mid(:,1)) 69 | dist(j) = norm(temp(j,:)); 70 | end 71 | [minvalue, min_idx] = min(dist); 72 | if ismember(random_index_mid(i),find(dist==min(dist))) 73 | acc = acc+1; 74 | end 75 | end 76 | ser = 1-acc/n; 77 | ser_list_mid(s) = ser; 78 | end 79 | ber_list_mid = 1-nthroot(1-ser_list_mid,8); 80 | % BER of High-layer Semantics 81 | for s = 1:1:length(snrs) 82 | acc = 0; 83 | for i = 1 :1: n 84 | signal = ent_high(random_index_high(i),:); 85 | noised = awgn(signal, snrs(s),'measured'); 86 | temp = ent_high - noised; 87 | dist = []; 88 | for j = 1:1:length(ent_high(:,1)) 89 | dist(j) = norm(temp(j,:)); 90 | end 91 | [minvalue, min_idx] = min(dist); 92 | if ismember(random_index_high(i),find(dist==min(dist))) 93 | acc = acc+1; 94 | end 95 | end 96 | ser = 1-acc/n; 97 | ser_list_high(s) = ser; 98 | end 99 | ber_list_high = 1-nthroot(1-ser_list_high,8); 100 | %% Semantic Recovery Accuracy with noisy signals 101 | n = 50000; 102 | recovery_ratio = 0.3; 103 | index_low = 1:1:length(head_low); 104 | random_index_low = index_low(randi(numel(index_low),1,n)); 105 | index_mid = 1:1:length(head_mid); 106 | random_index_mid = index_mid(randi(numel(index_mid),1,n)); 107 | index_high = 1:1:length(head_high); 108 | random_index_high = index_high(randi(numel(index_high),1,n)); 109 | snrs = [-12:2:12,1000]; % the final 1000 dB point serves as a near-noiseless reference 110 | err_list_low = zeros(1,length(snrs)); 111 | err_list_mid = zeros(1,length(snrs)); 112 | err_list_high = zeros(1,length(snrs)); 113 | % Recovery Acc. of Low-layer triplets 114 | for s = 1:1:length(snrs) 115 | acc = 0; 116 | for i = 1 :1: n 117 | signal_h = head_low(random_index_low(i),:); 118 | signal_t = tail_low(random_index_low(i),:); 119 | noised_h = awgn(signal_h, snrs(s),'measured'); 120 | noised_t = awgn(signal_t, snrs(s),'measured'); 121 | temp = relation_low-(noised_t-noised_h); 122 | norm_temp = norm(relation_low(random_index_low(i),:)-(noised_t-noised_h)); 123 | dist = []; 124 | for j = 1:1:length(head_low) 125 | dist(j) = norm(temp(j,:)); 126 | end 127 | sorted_dist = sortrows(dist'); 128 | if norm_temp <= sorted_dist(10) % count as recovered if the true relation ranks within the 10 nearest 129 | acc = acc+1; 130 | end 131 | end 132 | err = 1-acc/n; 133 | err_list_low(s) = err; 134 | end 135 | % Recovery Acc. of Mid-layer triplets
136 | for s = 1:1:length(snrs) 137 | acc = 0; 138 | for i = 1 :1: n 139 | signal_h = head_mid(random_index_mid(i),:); 140 | signal_t = tail_mid(random_index_mid(i),:); 141 | noised_h = awgn(signal_h, snrs(s),'measured'); 142 | noised_t = awgn(signal_t, snrs(s),'measured'); 143 | temp = relation_mid-(noised_t-noised_h); 144 | norm_temp = norm(relation_mid(random_index_mid(i),:)-(noised_t-noised_h)); 145 | dist = []; 146 | for j = 1:1:length(head_mid) 147 | dist(j) = norm(temp(j,:)); 148 | end 149 | sorted_dist = sortrows(dist'); 150 | if norm_temp <= sorted_dist(10) 151 | acc = acc+1; 152 | end 153 | end 154 | err = 1-acc/n; 155 | err_list_mid(s) = err; 156 | end 157 | % Recovery Acc. of High-layer triplets 158 | for s = 1:1:length(snrs) 159 | acc = 0; 160 | for i = 1 :1: n 161 | signal_h = head_high(random_index_high(i),:); 162 | signal_t = tail_high(random_index_high(i),:); 163 | noised_h = awgn(signal_h, snrs(s),'measured'); 164 | noised_t = awgn(signal_t, snrs(s),'measured'); 165 | temp = relation_high-(noised_t-noised_h); 166 | norm_temp = norm(relation_high(random_index_high(i),:)-(noised_t-noised_h)); 167 | dist = []; 168 | for j = 1:1:length(head_high) 169 | dist(j) = norm(temp(j,:)); 170 | end 171 | sorted_dist = sortrows(dist'); 172 | if norm_temp <= sorted_dist(10) 173 | acc = acc+1; 174 | end 175 | end 176 | err = 1-acc/n; 177 | err_list_high(s) = err; 178 | end 179 | %% Hard Decoding 180 | hard_improvedBER_low = ber_list_low-recovery_ratio.*(1-ber_list_low).*(1-err_list_low(end)).*ber_list_low; 181 | hard_improvedBER_mid = ber_list_mid-recovery_ratio.*(1-ber_list_mid).*(1-err_list_mid(end)).*ber_list_mid; 182 | hard_improvedBER_high = ber_list_high-recovery_ratio.*(1-ber_list_high).*(1-err_list_high(end)).*ber_list_high; 183 | %% Soft Decoding 184 | soft_improvedBER_low = ber_list_low-ber_list_low.*(1-err_list_low(1:end-1)).*ber_list_low; 185 | soft_improvedBER_mid = ber_list_mid-ber_list_mid.*(1-err_list_mid(1:end-1)).*ber_list_mid; 186 | soft_improvedBER_high = ber_list_high-ber_list_high.*(1-err_list_high(1:end-1)).*ber_list_high; -------------------------------------------------------------------------------- /SER/README.md: -------------------------------------------------------------------------------- 1 | # Decoding 2 | This simulation evaluates hard and soft decoding, using entities from FB15k-237 as an example. The entities are converted into low-dimensional representations using TransE. 3 | 4 | Simply run the `DecodingSimulation.m` file to see the numerical results. 5 | -------------------------------------------------------------------------------- /SER/data_ent.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/SER/data_ent.mat -------------------------------------------------------------------------------- /SER/data_tri.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zjs919/iRML/cb0d68aaeb9c8b244054d647de33c0ec5fc38c9e/SER/data_tri.mat --------------------------------------------------------------------------------
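For reference, the nearest-neighbor decision rule used throughout `DecodingSimulation.m` amounts to picking the stored embedding closest to the received noisy vector. Below is a minimal NumPy sketch of that rule; it is not part of the repository: `embeddings` stands for a hypothetical matrix with one TransE entity vector per row (as in `data_ent.mat`), and the noise power is derived from the measured signal power in the same spirit as MATLAB's `awgn(..., 'measured')`.

```python
import numpy as np

def nn_symbol_error_rate(embeddings, snr_db, n_trials=30000, rng=None):
    """Estimate SER: send a random embedding row over AWGN and
    decode by the nearest Euclidean neighbor among all rows."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = embeddings.shape
    errors = 0
    for _ in range(n_trials):
        k = rng.integers(n)
        signal = embeddings[k]
        p_signal = np.mean(signal ** 2)              # measured signal power
        p_noise = p_signal / (10 ** (snr_db / 10))   # noise power for the target SNR
        received = signal + rng.normal(0.0, np.sqrt(p_noise), size=d)
        decoded = np.argmin(np.linalg.norm(embeddings - received, axis=1))
        errors += decoded != k
    return errors / n_trials
```

Accuracy is then 1 - SER, matching the bookkeeping in the MATLAB scripts, and the `1-nthroot(1-SER,8)` step further converts SER to an approximate BER under the assumed 8-bits-per-symbol mapping.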