├── BreastCancer_noclass_noNA.tsv ├── README.md ├── ascii_autoencoder.py ├── col.py ├── dependencies.txt └── wine_noclass.txt /BreastCancer_noclass_noNA.tsv: -------------------------------------------------------------------------------- 1 | Id Cl.thickness Cell.size Cell.shape Marg.adhesion Epith.c.size Bare.nuclei Bl.cromatin Normal.nucleoli Mitoses 2 | 1000025 5 1 1 1 2 1 3 1 1 3 | 1002945 5 4 4 5 7 10 3 2 1 4 | 1015425 3 1 1 1 2 2 3 1 1 5 | 1016277 6 8 8 1 3 4 3 7 1 6 | 1017023 4 1 1 3 2 1 3 1 1 7 | 1017122 8 10 10 8 7 10 9 7 1 8 | 1018099 1 1 1 1 2 10 3 1 1 9 | 1018561 2 1 2 1 2 1 3 1 1 10 | 1033078 2 1 1 1 2 1 1 1 5 11 | 1033078 4 2 1 1 2 1 2 1 1 12 | 1035283 1 1 1 1 1 1 3 1 1 13 | 1036172 2 1 1 1 2 1 2 1 1 14 | 1041801 5 3 3 3 2 3 4 4 1 15 | 1043999 1 1 1 1 2 3 3 1 1 16 | 1044572 8 7 5 10 7 9 5 5 4 17 | 1047630 7 4 6 4 6 1 4 3 1 18 | 1048672 4 1 1 1 2 1 2 1 1 19 | 1049815 4 1 1 1 2 1 3 1 1 20 | 1050670 10 7 7 6 4 10 4 1 2 21 | 1050718 6 1 1 1 2 1 3 1 1 22 | 1054590 7 3 2 10 5 10 5 4 4 23 | 1054593 10 5 5 3 6 7 7 10 1 24 | 1056784 3 1 1 1 2 1 2 1 1 25 | 1059552 1 1 1 1 2 1 3 1 1 26 | 1065726 5 2 3 4 2 7 3 6 1 27 | 1066373 3 2 1 1 1 1 2 1 1 28 | 1066979 5 1 1 1 2 1 2 1 1 29 | 1067444 2 1 1 1 2 1 2 1 1 30 | 1070935 1 1 3 1 2 1 1 1 1 31 | 1070935 3 1 1 1 1 1 2 1 1 32 | 1071760 2 1 1 1 2 1 3 1 1 33 | 1072179 10 7 7 3 8 5 7 4 3 34 | 1074610 2 1 1 2 2 1 3 1 1 35 | 1075123 3 1 2 1 2 1 2 1 1 36 | 1079304 2 1 1 1 2 1 2 1 1 37 | 1080185 10 10 10 8 6 1 8 9 1 38 | 1081791 6 2 1 1 1 1 7 1 1 39 | 1084584 5 4 4 9 2 10 5 6 1 40 | 1091262 2 5 3 3 6 7 7 5 1 41 | 1099510 10 4 3 1 3 3 6 5 2 42 | 1100524 6 10 10 2 8 10 7 3 3 43 | 1102573 5 6 5 6 10 1 3 1 1 44 | 1103608 10 10 10 4 8 1 8 10 1 45 | 1103722 1 1 1 1 2 1 2 1 2 46 | 1105257 3 7 7 4 4 9 4 8 1 47 | 1105524 1 1 1 1 2 1 2 1 1 48 | 1106095 4 1 1 3 2 1 3 1 1 49 | 1106829 7 8 7 2 4 8 3 8 2 50 | 1108370 9 5 8 1 2 3 2 1 5 51 | 1108449 5 3 3 4 2 4 3 4 1 52 | 1110102 10 3 6 2 3 5 4 10 2 53 | 1110503 5 5 5 8 10 8 7 3 7 54 | 1110524 10 5 5 6 8 8 7 1 1 55 | 1111249 10 6 6 3 4 5 3 6 1 56 | 1112209 8 10 10 1 3 6 3 9 1 57 | 1113038 8 2 4 1 5 1 5 4 4 58 | 1113483 5 2 3 1 6 10 5 1 1 59 | 1113906 9 5 5 2 2 2 5 1 1 60 | 1115282 5 3 5 5 3 3 4 10 1 61 | 1115293 1 1 1 1 2 2 2 1 1 62 | 1116116 9 10 10 1 10 8 3 3 1 63 | 1116132 6 3 4 1 5 2 3 9 1 64 | 1116192 1 1 1 1 2 1 2 1 1 65 | 1116998 10 4 2 1 3 2 4 3 10 66 | 1117152 4 1 1 1 2 1 3 1 1 67 | 1118039 5 3 4 1 8 10 4 9 1 68 | 1120559 8 3 8 3 4 9 8 9 8 69 | 1121732 1 1 1 1 2 1 3 2 1 70 | 1121919 5 1 3 1 2 1 2 1 1 71 | 1123061 6 10 2 8 10 2 7 8 10 72 | 1124651 1 3 3 2 2 1 7 2 1 73 | 1125035 9 4 5 10 6 10 4 8 1 74 | 1126417 10 6 4 1 3 4 3 2 3 75 | 1131294 1 1 2 1 2 2 4 2 1 76 | 1132347 1 1 4 1 2 1 2 1 1 77 | 1133041 5 3 1 2 2 1 2 1 1 78 | 1133136 3 1 1 1 2 3 3 1 1 79 | 1136142 2 1 1 1 3 1 2 1 1 80 | 1137156 2 2 2 1 1 1 7 1 1 81 | 1143978 4 1 1 2 2 1 2 1 1 82 | 1143978 5 2 1 1 2 1 3 1 1 83 | 1147044 3 1 1 1 2 2 7 1 1 84 | 1147699 3 5 7 8 8 9 7 10 7 85 | 1147748 5 10 6 1 10 4 4 10 10 86 | 1148278 3 3 6 4 5 8 4 4 1 87 | 1148873 3 6 6 6 5 10 6 8 3 88 | 1152331 4 1 1 1 2 1 3 1 1 89 | 1155546 2 1 1 2 3 1 2 1 1 90 | 1156272 1 1 1 1 2 1 3 1 1 91 | 1156948 3 1 1 2 2 1 1 1 1 92 | 1157734 4 1 1 1 2 1 3 1 1 93 | 1158247 1 1 1 1 2 1 2 1 1 94 | 1160476 2 1 1 1 2 1 3 1 1 95 | 1164066 1 1 1 1 2 1 3 1 1 96 | 1165297 2 1 1 2 2 1 1 1 1 97 | 1165790 5 1 1 1 2 1 3 1 1 98 | 1165926 9 6 9 2 10 6 2 9 10 99 | 1166630 7 5 6 10 5 10 7 9 4 100 | 1166654 10 3 5 1 10 5 3 10 2 101 | 1167439 2 3 4 4 2 5 2 5 1 102 | 1167471 4 1 2 1 2 1 3 1 1 103 | 1168359 
8 2 3 1 6 3 7 1 1 104 | 1168736 10 10 10 10 10 1 8 8 8 105 | 1169049 7 3 4 4 3 3 3 2 7 106 | 1170419 10 10 10 8 2 10 4 1 1 107 | 1170420 1 6 8 10 8 10 5 7 1 108 | 1171710 1 1 1 1 2 1 2 3 1 109 | 1171710 6 5 4 4 3 9 7 8 3 110 | 1171795 1 3 1 2 2 2 5 3 2 111 | 1171845 8 6 4 3 5 9 3 1 1 112 | 1172152 10 3 3 10 2 10 7 3 3 113 | 1173216 10 10 10 3 10 8 8 1 1 114 | 1173235 3 3 2 1 2 3 3 1 1 115 | 1173347 1 1 1 1 2 5 1 1 1 116 | 1173347 8 3 3 1 2 2 3 2 1 117 | 1173509 4 5 5 10 4 10 7 5 8 118 | 1173514 1 1 1 1 4 3 1 1 1 119 | 1173681 3 2 1 1 2 2 3 1 1 120 | 1174057 1 1 2 2 2 1 3 1 1 121 | 1174057 4 2 1 1 2 2 3 1 1 122 | 1174131 10 10 10 2 10 10 5 3 3 123 | 1174428 5 3 5 1 8 10 5 3 1 124 | 1175937 5 4 6 7 9 7 8 10 1 125 | 1176406 1 1 1 1 2 1 2 1 1 126 | 1176881 7 5 3 7 4 10 7 5 5 127 | 1177027 3 1 1 1 2 1 3 1 1 128 | 1177399 8 3 5 4 5 10 1 6 2 129 | 1177512 1 1 1 1 10 1 1 1 1 130 | 1178580 5 1 3 1 2 1 2 1 1 131 | 1179818 2 1 1 1 2 1 3 1 1 132 | 1180194 5 10 8 10 8 10 3 6 3 133 | 1180523 3 1 1 1 2 1 2 2 1 134 | 1180831 3 1 1 1 3 1 2 1 1 135 | 1181356 5 1 1 1 2 2 3 3 1 136 | 1182404 4 1 1 1 2 1 2 1 1 137 | 1182410 3 1 1 1 2 1 1 1 1 138 | 1183240 4 1 2 1 2 1 2 1 1 139 | 1183516 3 1 1 1 2 1 1 1 1 140 | 1183911 2 1 1 1 2 1 1 1 1 141 | 1183983 9 5 5 4 4 5 4 3 3 142 | 1184184 1 1 1 1 2 5 1 1 1 143 | 1184241 2 1 1 1 2 1 2 1 1 144 | 1185609 3 4 5 2 6 8 4 1 1 145 | 1185610 1 1 1 1 3 2 2 1 1 146 | 1187457 3 1 1 3 8 1 5 8 1 147 | 1187805 8 8 7 4 10 10 7 8 7 148 | 1188472 1 1 1 1 1 1 3 1 1 149 | 1189266 7 2 4 1 6 10 5 4 3 150 | 1189286 10 10 8 6 4 5 8 10 1 151 | 1190394 4 1 1 1 2 3 1 1 1 152 | 1190485 1 1 1 1 2 1 1 1 1 153 | 1192325 5 5 5 6 3 10 3 1 1 154 | 1193091 1 2 2 1 2 1 2 1 1 155 | 1193210 2 1 1 1 2 1 3 1 1 156 | 1196295 9 9 10 3 6 10 7 10 6 157 | 1196915 10 7 7 4 5 10 5 7 2 158 | 1197080 4 1 1 1 2 1 3 2 1 159 | 1197270 3 1 1 1 2 1 3 1 1 160 | 1197440 1 1 1 2 1 3 1 1 7 161 | 1197979 4 1 1 1 2 2 3 2 1 162 | 1197993 5 6 7 8 8 10 3 10 3 163 | 1198128 10 8 10 10 6 1 3 1 10 164 | 1198641 3 1 1 1 2 1 3 1 1 165 | 1199219 1 1 1 2 1 1 1 1 1 166 | 1199731 3 1 1 1 2 1 1 1 1 167 | 1199983 1 1 1 1 2 1 3 1 1 168 | 1200772 1 1 1 1 2 1 2 1 1 169 | 1200847 6 10 10 10 8 10 10 10 7 170 | 1200892 8 6 5 4 3 10 6 1 1 171 | 1200952 5 8 7 7 10 10 5 7 1 172 | 1201834 2 1 1 1 2 1 3 1 1 173 | 1201936 5 10 10 3 8 1 5 10 3 174 | 1202125 4 1 1 1 2 1 3 1 1 175 | 1202812 5 3 3 3 6 10 3 1 1 176 | 1203096 1 1 1 1 1 1 3 1 1 177 | 1204242 1 1 1 1 2 1 1 1 1 178 | 1204898 6 1 1 1 2 1 3 1 1 179 | 1205138 5 8 8 8 5 10 7 8 1 180 | 1205579 8 7 6 4 4 10 5 1 1 181 | 1206089 2 1 1 1 1 1 3 1 1 182 | 1206695 1 5 8 6 5 8 7 10 1 183 | 1206841 10 5 6 10 6 10 7 7 10 184 | 1207986 5 8 4 10 5 8 9 10 1 185 | 1208301 1 2 3 1 2 1 3 1 1 186 | 1210963 10 10 10 8 6 8 7 10 1 187 | 1211202 7 5 10 10 10 10 4 10 3 188 | 1212232 5 1 1 1 2 1 2 1 1 189 | 1212251 1 1 1 1 2 1 3 1 1 190 | 1212422 3 1 1 1 2 1 3 1 1 191 | 1212422 4 1 1 1 2 1 3 1 1 192 | 1213375 8 4 4 5 4 7 7 8 2 193 | 1213383 5 1 1 4 2 1 3 1 1 194 | 1214092 1 1 1 1 2 1 1 1 1 195 | 1214556 3 1 1 1 2 1 2 1 1 196 | 1214966 9 7 7 5 5 10 7 8 3 197 | 1216694 10 8 8 4 10 10 8 1 1 198 | 1216947 1 1 1 1 2 1 3 1 1 199 | 1217051 5 1 1 1 2 1 3 1 1 200 | 1217264 1 1 1 1 2 1 3 1 1 201 | 1218105 5 10 10 9 6 10 7 10 5 202 | 1218741 10 10 9 3 7 5 3 5 1 203 | 1218860 1 1 1 1 1 1 3 1 1 204 | 1218860 1 1 1 1 1 1 3 1 1 205 | 1219406 5 1 1 1 1 1 3 1 1 206 | 1219525 8 10 10 10 5 10 8 10 6 207 | 1219859 8 10 8 8 4 8 7 7 1 208 | 1220330 1 1 1 1 2 1 3 1 1 209 | 1221863 10 10 10 10 7 10 7 10 4 210 | 1222047 10 10 10 10 3 10 10 6 1 
211 | 1222936 8 7 8 7 5 5 5 10 2 212 | 1223282 1 1 1 1 2 1 2 1 1 213 | 1223426 1 1 1 1 2 1 3 1 1 214 | 1223793 6 10 7 7 6 4 8 10 2 215 | 1223967 6 1 3 1 2 1 3 1 1 216 | 1224329 1 1 1 2 2 1 3 1 1 217 | 1225799 10 6 4 3 10 10 9 10 1 218 | 1226012 4 1 1 3 1 5 2 1 1 219 | 1226612 7 5 6 3 3 8 7 4 1 220 | 1227210 10 5 5 6 3 10 7 9 2 221 | 1227244 1 1 1 1 2 1 2 1 1 222 | 1227481 10 5 7 4 4 10 8 9 1 223 | 1228152 8 9 9 5 3 5 7 7 1 224 | 1228311 1 1 1 1 1 1 3 1 1 225 | 1230175 10 10 10 3 10 10 9 10 1 226 | 1230688 7 4 7 4 3 7 7 6 1 227 | 1231387 6 8 7 5 6 8 8 9 2 228 | 1231706 8 4 6 3 3 1 4 3 1 229 | 1232225 10 4 5 5 5 10 4 1 1 230 | 1236043 3 3 2 1 3 1 3 6 1 231 | 1241559 10 8 8 2 8 10 4 8 10 232 | 1241679 9 8 8 5 6 2 4 10 4 233 | 1242364 8 10 10 8 6 9 3 10 10 234 | 1243256 10 4 3 2 3 10 5 3 2 235 | 1270479 5 1 3 3 2 2 2 3 1 236 | 1276091 3 1 1 3 1 1 3 1 1 237 | 1277018 2 1 1 1 2 1 3 1 1 238 | 128059 1 1 1 1 2 5 5 1 1 239 | 1285531 1 1 1 1 2 1 3 1 1 240 | 1287775 5 1 1 2 2 2 3 1 1 241 | 144888 8 10 10 8 5 10 7 8 1 242 | 145447 8 4 4 1 2 9 3 3 1 243 | 167528 4 1 1 1 2 1 3 6 1 244 | 183913 1 2 2 1 2 1 1 1 1 245 | 191250 10 4 4 10 2 10 5 3 3 246 | 1017023 6 3 3 5 3 10 3 5 3 247 | 1100524 6 10 10 2 8 10 7 3 3 248 | 1116116 9 10 10 1 10 8 3 3 1 249 | 1168736 5 6 6 2 4 10 3 6 1 250 | 1182404 3 1 1 1 2 1 1 1 1 251 | 1182404 3 1 1 1 2 1 2 1 1 252 | 1198641 3 1 1 1 2 1 3 1 1 253 | 242970 5 7 7 1 5 8 3 4 1 254 | 255644 10 5 8 10 3 10 5 1 3 255 | 263538 5 10 10 6 10 10 10 6 5 256 | 274137 8 8 9 4 5 10 7 8 1 257 | 303213 10 4 4 10 6 10 5 5 1 258 | 314428 7 9 4 10 10 3 5 3 3 259 | 1182404 5 1 4 1 2 1 3 2 1 260 | 1198641 10 10 6 3 3 10 4 3 2 261 | 320675 3 3 5 2 3 10 7 1 1 262 | 324427 10 8 8 2 3 4 8 7 8 263 | 385103 1 1 1 1 2 1 3 1 1 264 | 390840 8 4 7 1 3 10 3 9 2 265 | 411453 5 1 1 1 2 1 3 1 1 266 | 320675 3 3 5 2 3 10 7 1 1 267 | 428903 7 2 4 1 3 4 3 3 1 268 | 431495 3 1 1 1 2 1 3 2 1 269 | 434518 3 1 1 1 2 1 2 1 1 270 | 452264 1 1 1 1 2 1 2 1 1 271 | 456282 1 1 1 1 2 1 3 1 1 272 | 476903 10 5 7 3 3 7 3 3 8 273 | 486283 3 1 1 1 2 1 3 1 1 274 | 486662 2 1 1 2 2 1 3 1 1 275 | 488173 1 4 3 10 4 10 5 6 1 276 | 492268 10 4 6 1 2 10 5 3 1 277 | 508234 7 4 5 10 2 10 3 8 2 278 | 527363 8 10 10 10 8 10 10 7 3 279 | 529329 10 10 10 10 10 10 4 10 10 280 | 535331 3 1 1 1 3 1 2 1 1 281 | 543558 6 1 3 1 4 5 5 10 1 282 | 555977 5 6 6 8 6 10 4 10 4 283 | 560680 1 1 1 1 2 1 1 1 1 284 | 561477 1 1 1 1 2 1 3 1 1 285 | 601265 10 4 4 6 2 10 2 3 1 286 | 606722 5 5 7 8 6 10 7 4 1 287 | 616240 5 3 4 3 4 5 4 7 1 288 | 625201 8 2 1 1 5 1 1 1 1 289 | 63375 9 1 2 6 4 10 7 7 2 290 | 635844 8 4 10 5 4 4 7 10 1 291 | 636130 1 1 1 1 2 1 3 1 1 292 | 640744 10 10 10 7 9 10 7 10 10 293 | 646904 1 1 1 1 2 1 3 1 1 294 | 653777 8 3 4 9 3 10 3 3 1 295 | 659642 10 8 4 4 4 10 3 10 4 296 | 666090 1 1 1 1 2 1 3 1 1 297 | 666942 1 1 1 1 2 1 3 1 1 298 | 667204 7 8 7 6 4 3 8 8 4 299 | 673637 3 1 1 1 2 5 5 1 1 300 | 684955 2 1 1 1 3 1 2 1 1 301 | 688033 1 1 1 1 2 1 1 1 1 302 | 691628 8 6 4 10 10 1 3 5 1 303 | 693702 1 1 1 1 2 1 1 1 1 304 | 704097 1 1 1 1 1 1 2 1 1 305 | 706426 5 5 5 2 5 10 4 3 1 306 | 709287 6 8 7 8 6 8 8 9 1 307 | 718641 1 1 1 1 5 1 3 1 1 308 | 721482 4 4 4 4 6 5 7 3 1 309 | 730881 7 6 3 2 5 10 7 4 6 310 | 733639 3 1 1 1 2 1 3 1 1 311 | 733823 5 4 6 10 2 10 4 1 1 312 | 740492 1 1 1 1 2 1 3 1 1 313 | 743348 3 2 2 1 2 1 2 3 1 314 | 752904 10 1 1 1 2 10 5 4 1 315 | 756136 1 1 1 1 2 1 2 1 1 316 | 760001 8 10 3 2 6 4 3 10 1 317 | 760239 10 4 6 4 5 10 7 1 1 318 | 76389 10 4 7 2 2 8 6 1 1 319 | 764974 5 1 1 1 2 1 3 1 2 320 | 770066 5 2 2 2 2 
1 2 2 1 321 | 785208 5 4 6 6 4 10 4 3 1 322 | 785615 8 6 7 3 3 10 3 4 2 323 | 792744 1 1 1 1 2 1 1 1 1 324 | 797327 6 5 5 8 4 10 3 4 1 325 | 798429 1 1 1 1 2 1 3 1 1 326 | 704097 1 1 1 1 1 1 2 1 1 327 | 806423 8 5 5 5 2 10 4 3 1 328 | 809912 10 3 3 1 2 10 7 6 1 329 | 810104 1 1 1 1 2 1 3 1 1 330 | 814265 2 1 1 1 2 1 1 1 1 331 | 814911 1 1 1 1 2 1 1 1 1 332 | 822829 7 6 4 8 10 10 9 5 3 333 | 826923 1 1 1 1 2 1 1 1 1 334 | 830690 5 2 2 2 3 1 1 3 1 335 | 831268 1 1 1 1 1 1 1 3 1 336 | 832226 3 4 4 10 5 1 3 3 1 337 | 832567 4 2 3 5 3 8 7 6 1 338 | 836433 5 1 1 3 2 1 1 1 1 339 | 837082 2 1 1 1 2 1 3 1 1 340 | 846832 3 4 5 3 7 3 4 6 1 341 | 850831 2 7 10 10 7 10 4 9 4 342 | 855524 1 1 1 1 2 1 2 1 1 343 | 857774 4 1 1 1 3 1 2 2 1 344 | 859164 5 3 3 1 3 3 3 3 3 345 | 859350 8 10 10 7 10 10 7 3 8 346 | 866325 8 10 5 3 8 4 4 10 3 347 | 873549 10 3 5 4 3 7 3 5 3 348 | 877291 6 10 10 10 10 10 8 10 10 349 | 877943 3 10 3 10 6 10 5 1 4 350 | 888169 3 2 2 1 4 3 2 1 1 351 | 888523 4 4 4 2 2 3 2 1 1 352 | 896404 2 1 1 1 2 1 3 1 1 353 | 897172 2 1 1 1 2 1 2 1 1 354 | 95719 6 10 10 10 8 10 7 10 7 355 | 160296 5 8 8 10 5 10 8 10 3 356 | 342245 1 1 3 1 2 1 1 1 1 357 | 428598 1 1 3 1 1 1 2 1 1 358 | 492561 4 3 2 1 3 1 2 1 1 359 | 493452 1 1 3 1 2 1 1 1 1 360 | 493452 4 1 2 1 2 1 2 1 1 361 | 521441 5 1 1 2 2 1 2 1 1 362 | 560680 3 1 2 1 2 1 2 1 1 363 | 636437 1 1 1 1 2 1 1 1 1 364 | 640712 1 1 1 1 2 1 2 1 1 365 | 654244 1 1 1 1 1 1 2 1 1 366 | 657753 3 1 1 4 3 1 2 2 1 367 | 685977 5 3 4 1 4 1 3 1 1 368 | 805448 1 1 1 1 2 1 1 1 1 369 | 846423 10 6 3 6 4 10 7 8 4 370 | 1002504 3 2 2 2 2 1 3 2 1 371 | 1022257 2 1 1 1 2 1 1 1 1 372 | 1026122 2 1 1 1 2 1 1 1 1 373 | 1071084 3 3 2 2 3 1 1 2 3 374 | 1080233 7 6 6 3 2 10 7 1 1 375 | 1114570 5 3 3 2 3 1 3 1 1 376 | 1114570 2 1 1 1 2 1 2 2 1 377 | 1116715 5 1 1 1 3 2 2 2 1 378 | 1131411 1 1 1 2 2 1 2 1 1 379 | 1151734 10 8 7 4 3 10 7 9 1 380 | 1156017 3 1 1 1 2 1 2 1 1 381 | 1158247 1 1 1 1 1 1 1 1 1 382 | 1158405 1 2 3 1 2 1 2 1 1 383 | 1168278 3 1 1 1 2 1 2 1 1 384 | 1176187 3 1 1 1 2 1 3 1 1 385 | 1196263 4 1 1 1 2 1 1 1 1 386 | 1196475 3 2 1 1 2 1 2 2 1 387 | 1206314 1 2 3 1 2 1 1 1 1 388 | 1211265 3 10 8 7 6 9 9 3 8 389 | 1213784 3 1 1 1 2 1 1 1 1 390 | 1223003 5 3 3 1 2 1 2 1 1 391 | 1223306 3 1 1 1 2 4 1 1 1 392 | 1223543 1 2 1 3 2 1 1 2 1 393 | 1229929 1 1 1 1 2 1 2 1 1 394 | 1231853 4 2 2 1 2 1 2 1 1 395 | 1234554 1 1 1 1 2 1 2 1 1 396 | 1236837 2 3 2 2 2 2 3 1 1 397 | 1237674 3 1 2 1 2 1 2 1 1 398 | 1238021 1 1 1 1 2 1 2 1 1 399 | 1238633 10 10 10 6 8 4 8 5 1 400 | 1238915 5 1 2 1 2 1 3 1 1 401 | 1238948 8 5 6 2 3 10 6 6 1 402 | 1239232 3 3 2 6 3 3 3 5 1 403 | 1239347 8 7 8 5 10 10 7 2 1 404 | 1239967 1 1 1 1 2 1 2 1 1 405 | 1240337 5 2 2 2 2 2 3 2 2 406 | 1253505 2 3 1 1 5 1 1 1 1 407 | 1255384 3 2 2 3 2 3 3 1 1 408 | 1257200 10 10 10 7 10 10 8 2 1 409 | 1257648 4 3 3 1 2 1 3 3 1 410 | 1257815 5 1 3 1 2 1 2 1 1 411 | 1257938 3 1 1 1 2 1 1 1 1 412 | 1258549 9 10 10 10 10 10 10 10 1 413 | 1258556 5 3 6 1 2 1 1 1 1 414 | 1266154 8 7 8 2 4 2 5 10 1 415 | 1272039 1 1 1 1 2 1 2 1 1 416 | 1276091 2 1 1 1 2 1 2 1 1 417 | 1276091 1 3 1 1 2 1 2 2 1 418 | 1276091 5 1 1 3 4 1 3 2 1 419 | 1277629 5 1 1 1 2 1 2 2 1 420 | 1293439 3 2 2 3 2 1 1 1 1 421 | 1293439 6 9 7 5 5 8 4 2 1 422 | 1294562 10 8 10 1 3 10 5 1 1 423 | 1295186 10 10 10 1 6 1 2 8 1 424 | 527337 4 1 1 1 2 1 1 1 1 425 | 558538 4 1 3 3 2 1 1 1 1 426 | 566509 5 1 1 1 2 1 1 1 1 427 | 608157 10 4 3 10 4 10 10 1 1 428 | 677910 5 2 2 4 2 4 1 1 1 429 | 734111 1 1 1 3 2 3 1 1 1 430 | 734111 1 1 1 1 2 2 1 1 1 431 | 
780555 5 1 1 6 3 1 2 1 1 432 | 827627 2 1 1 1 2 1 1 1 1 433 | 1049837 1 1 1 1 2 1 1 1 1 434 | 1058849 5 1 1 1 2 1 1 1 1 435 | 1182404 1 1 1 1 1 1 1 1 1 436 | 1193544 5 7 9 8 6 10 8 10 1 437 | 1201870 4 1 1 3 1 1 2 1 1 438 | 1202253 5 1 1 1 2 1 1 1 1 439 | 1227081 3 1 1 3 2 1 1 1 1 440 | 1230994 4 5 5 8 6 10 10 7 1 441 | 1238410 2 3 1 1 3 1 1 1 1 442 | 1246562 10 2 2 1 2 6 1 1 2 443 | 1257470 10 6 5 8 5 10 8 6 1 444 | 1259008 8 8 9 6 6 3 10 10 1 445 | 1266124 5 1 2 1 2 1 1 1 1 446 | 1267898 5 1 3 1 2 1 1 1 1 447 | 1268313 5 1 1 3 2 1 1 1 1 448 | 1268804 3 1 1 1 2 5 1 1 1 449 | 1276091 6 1 1 3 2 1 1 1 1 450 | 1280258 4 1 1 1 2 1 1 2 1 451 | 1293966 4 1 1 1 2 1 1 1 1 452 | 1296572 10 9 8 7 6 4 7 10 3 453 | 1298416 10 6 6 2 4 10 9 7 1 454 | 1299596 6 6 6 5 4 10 7 6 2 455 | 1105524 4 1 1 1 2 1 1 1 1 456 | 1181685 1 1 2 1 2 1 2 1 1 457 | 1211594 3 1 1 1 1 1 2 1 1 458 | 1238777 6 1 1 3 2 1 1 1 1 459 | 1257608 6 1 1 1 1 1 1 1 1 460 | 1269574 4 1 1 1 2 1 1 1 1 461 | 1277145 5 1 1 1 2 1 1 1 1 462 | 1287282 3 1 1 1 2 1 1 1 1 463 | 1296025 4 1 2 1 2 1 1 1 1 464 | 1296263 4 1 1 1 2 1 1 1 1 465 | 1296593 5 2 1 1 2 1 1 1 1 466 | 1299161 4 8 7 10 4 10 7 5 1 467 | 1301945 5 1 1 1 1 1 1 1 1 468 | 1302428 5 3 2 4 2 1 1 1 1 469 | 1318169 9 10 10 10 10 5 10 10 10 470 | 474162 8 7 8 5 5 10 9 10 1 471 | 787451 5 1 2 1 2 1 1 1 1 472 | 1002025 1 1 1 3 1 3 1 1 1 473 | 1070522 3 1 1 1 1 1 2 1 1 474 | 1073960 10 10 10 10 6 10 8 1 5 475 | 1076352 3 6 4 10 3 3 3 4 1 476 | 1084139 6 3 2 1 3 4 4 1 1 477 | 1115293 1 1 1 1 2 1 1 1 1 478 | 1119189 5 8 9 4 3 10 7 1 1 479 | 1133991 4 1 1 1 1 1 2 1 1 480 | 1142706 5 10 10 10 6 10 6 5 2 481 | 1155967 5 1 2 10 4 5 2 1 1 482 | 1170945 3 1 1 1 1 1 2 1 1 483 | 1181567 1 1 1 1 1 1 1 1 1 484 | 1182404 4 2 1 1 2 1 1 1 1 485 | 1204558 4 1 1 1 2 1 2 1 1 486 | 1217952 4 1 1 1 2 1 2 1 1 487 | 1224565 6 1 1 1 2 1 3 1 1 488 | 1238186 4 1 1 1 2 1 2 1 1 489 | 1253917 4 1 1 2 2 1 2 1 1 490 | 1265899 4 1 1 1 2 1 3 1 1 491 | 1268766 1 1 1 1 2 1 1 1 1 492 | 1277268 3 3 1 1 2 1 1 1 1 493 | 1286943 8 10 10 10 7 5 4 8 7 494 | 1295508 1 1 1 1 2 4 1 1 1 495 | 1297327 5 1 1 1 2 1 1 1 1 496 | 1297522 2 1 1 1 2 1 1 1 1 497 | 1298360 1 1 1 1 2 1 1 1 1 498 | 1299924 5 1 1 1 2 1 2 1 1 499 | 1299994 5 1 1 1 2 1 1 1 1 500 | 1304595 3 1 1 1 1 1 2 1 1 501 | 1306282 6 6 7 10 3 10 8 10 2 502 | 1313325 4 10 4 7 3 10 9 10 1 503 | 1320077 1 1 1 1 1 1 1 1 1 504 | 1320077 1 1 1 1 1 1 2 1 1 505 | 1320304 3 1 2 2 2 1 1 1 1 506 | 1330439 4 7 8 3 4 10 9 1 1 507 | 333093 1 1 1 1 3 1 1 1 1 508 | 369565 4 1 1 1 3 1 1 1 1 509 | 412300 10 4 5 4 3 5 7 3 1 510 | 672113 7 5 6 10 4 10 5 3 1 511 | 749653 3 1 1 1 2 1 2 1 1 512 | 769612 3 1 1 2 2 1 1 1 1 513 | 769612 4 1 1 1 2 1 1 1 1 514 | 798429 4 1 1 1 2 1 3 1 1 515 | 807657 6 1 3 2 2 1 1 1 1 516 | 8233704 4 1 1 1 1 1 2 1 1 517 | 837480 7 4 4 3 4 10 6 9 1 518 | 867392 4 2 2 1 2 1 2 1 1 519 | 869828 1 1 1 1 1 1 3 1 1 520 | 1043068 3 1 1 1 2 1 2 1 1 521 | 1056171 2 1 1 1 2 1 2 1 1 522 | 1061990 1 1 3 2 2 1 3 1 1 523 | 1113061 5 1 1 1 2 1 3 1 1 524 | 1116192 5 1 2 1 2 1 3 1 1 525 | 1135090 4 1 1 1 2 1 2 1 1 526 | 1145420 6 1 1 1 2 1 2 1 1 527 | 1158157 5 1 1 1 2 2 2 1 1 528 | 1171578 3 1 1 1 2 1 1 1 1 529 | 1174841 5 3 1 1 2 1 1 1 1 530 | 1184586 4 1 1 1 2 1 2 1 1 531 | 1186936 2 1 3 2 2 1 2 1 1 532 | 1197527 5 1 1 1 2 1 2 1 1 533 | 1222464 6 10 10 10 4 10 7 10 1 534 | 1240603 2 1 1 1 1 1 1 1 1 535 | 1240603 3 1 1 1 1 1 1 1 1 536 | 1241035 7 8 3 7 4 5 7 8 2 537 | 1287971 3 1 1 1 2 1 2 1 1 538 | 1289391 1 1 1 1 2 1 3 1 1 539 | 1299924 3 2 2 2 2 1 4 2 1 540 | 1306339 4 4 2 1 2 5 2 1 2 
541 | 1313658 3 1 1 1 2 1 1 1 1 542 | 1313982 4 3 1 1 2 1 4 8 1 543 | 1321264 5 2 2 2 1 1 2 1 1 544 | 1321321 5 1 1 3 2 1 1 1 1 545 | 1321348 2 1 1 1 2 1 2 1 1 546 | 1321931 5 1 1 1 2 1 2 1 1 547 | 1321942 5 1 1 1 2 1 3 1 1 548 | 1321942 5 1 1 1 2 1 3 1 1 549 | 1328331 1 1 1 1 2 1 3 1 1 550 | 1328755 3 1 1 1 2 1 2 1 1 551 | 1331405 4 1 1 1 2 1 3 2 1 552 | 1331412 5 7 10 10 5 10 10 10 1 553 | 1333104 3 1 2 1 2 1 3 1 1 554 | 1334071 4 1 1 1 2 3 2 1 1 555 | 1343068 8 4 4 1 6 10 2 5 2 556 | 1343374 10 10 8 10 6 5 10 3 1 557 | 1344121 8 10 4 4 8 10 8 2 1 558 | 142932 7 6 10 5 3 10 9 10 2 559 | 183936 3 1 1 1 2 1 2 1 1 560 | 324382 1 1 1 1 2 1 2 1 1 561 | 378275 10 9 7 3 4 2 7 7 1 562 | 385103 5 1 2 1 2 1 3 1 1 563 | 690557 5 1 1 1 2 1 2 1 1 564 | 695091 1 1 1 1 2 1 2 1 1 565 | 695219 1 1 1 1 2 1 2 1 1 566 | 824249 1 1 1 1 2 1 3 1 1 567 | 871549 5 1 2 1 2 1 2 1 1 568 | 878358 5 7 10 6 5 10 7 5 1 569 | 1107684 6 10 5 5 4 10 6 10 1 570 | 1115762 3 1 1 1 2 1 1 1 1 571 | 1217717 5 1 1 6 3 1 1 1 1 572 | 1239420 1 1 1 1 2 1 1 1 1 573 | 1254538 8 10 10 10 6 10 10 10 1 574 | 1261751 5 1 1 1 2 1 2 2 1 575 | 1268275 9 8 8 9 6 3 4 1 1 576 | 1272166 5 1 1 1 2 1 1 1 1 577 | 1294261 4 10 8 5 4 1 10 1 1 578 | 1295529 2 5 7 6 4 10 7 6 1 579 | 1298484 10 3 4 5 3 10 4 1 1 580 | 1311875 5 1 2 1 2 1 1 1 1 581 | 1315506 4 8 6 3 4 10 7 1 1 582 | 1320141 5 1 1 1 2 1 2 1 1 583 | 1325309 4 1 2 1 2 1 2 1 1 584 | 1333063 5 1 3 1 2 1 3 1 1 585 | 1333495 3 1 1 1 2 1 2 1 1 586 | 1334659 5 2 4 1 1 1 1 1 1 587 | 1336798 3 1 1 1 2 1 2 1 1 588 | 1344449 1 1 1 1 1 1 2 1 1 589 | 1350568 4 1 1 1 2 1 2 1 1 590 | 1352663 5 4 6 8 4 1 8 10 1 591 | 188336 5 3 2 8 5 10 8 1 2 592 | 352431 10 5 10 3 5 8 7 8 3 593 | 353098 4 1 1 2 2 1 1 1 1 594 | 411453 1 1 1 1 2 1 1 1 1 595 | 557583 5 10 10 10 10 10 10 1 1 596 | 636375 5 1 1 1 2 1 1 1 1 597 | 736150 10 4 3 10 3 10 7 1 2 598 | 803531 5 10 10 10 5 2 8 5 1 599 | 822829 8 10 10 10 6 10 10 10 10 600 | 1016634 2 3 1 1 2 1 2 1 1 601 | 1031608 2 1 1 1 1 1 2 1 1 602 | 1041043 4 1 3 1 2 1 2 1 1 603 | 1042252 3 1 1 1 2 1 2 1 1 604 | 1061990 4 1 1 1 2 1 2 1 1 605 | 1073836 5 1 1 1 2 1 2 1 1 606 | 1083817 3 1 1 1 2 1 2 1 1 607 | 1096352 6 3 3 3 3 2 6 1 1 608 | 1140597 7 1 2 3 2 1 2 1 1 609 | 1149548 1 1 1 1 2 1 1 1 1 610 | 1174009 5 1 1 2 1 1 2 1 1 611 | 1183596 3 1 3 1 3 4 1 1 1 612 | 1190386 4 6 6 5 7 6 7 7 3 613 | 1190546 2 1 1 1 2 5 1 1 1 614 | 1213273 2 1 1 1 2 1 1 1 1 615 | 1218982 4 1 1 1 2 1 1 1 1 616 | 1225382 6 2 3 1 2 1 1 1 1 617 | 1235807 5 1 1 1 2 1 2 1 1 618 | 1238777 1 1 1 1 2 1 1 1 1 619 | 1253955 8 7 4 4 5 3 5 10 1 620 | 1257366 3 1 1 1 2 1 1 1 1 621 | 1260659 3 1 4 1 2 1 1 1 1 622 | 1268952 10 10 7 8 7 1 10 10 3 623 | 1275807 4 2 4 3 2 2 2 1 1 624 | 1277792 4 1 1 1 2 1 1 1 1 625 | 1277792 5 1 1 3 2 1 1 1 1 626 | 1285722 4 1 1 3 2 1 1 1 1 627 | 1288608 3 1 1 1 2 1 2 1 1 628 | 1290203 3 1 1 1 2 1 2 1 1 629 | 1294413 1 1 1 1 2 1 1 1 1 630 | 1299596 2 1 1 1 2 1 1 1 1 631 | 1303489 3 1 1 1 2 1 2 1 1 632 | 1311033 1 2 2 1 2 1 1 1 1 633 | 1311108 1 1 1 3 2 1 1 1 1 634 | 1315807 5 10 10 10 10 2 10 10 10 635 | 1318671 3 1 1 1 2 1 2 1 1 636 | 1319609 3 1 1 2 3 4 1 1 1 637 | 1323477 1 2 1 3 2 1 2 1 1 638 | 1324572 5 1 1 1 2 1 2 2 1 639 | 1324681 4 1 1 1 2 1 2 1 1 640 | 1325159 3 1 1 1 2 1 3 1 1 641 | 1326892 3 1 1 1 2 1 2 1 1 642 | 1330361 5 1 1 1 2 1 2 1 1 643 | 1333877 5 4 5 1 8 1 3 6 1 644 | 1334015 7 8 8 7 3 10 7 2 3 645 | 1334667 1 1 1 1 2 1 1 1 1 646 | 1339781 1 1 1 1 2 1 2 1 1 647 | 1339781 4 1 1 1 2 1 3 1 1 648 | 13454352 1 1 3 1 2 1 2 1 1 649 | 1345452 1 1 3 1 2 1 2 1 1 650 | 1345593 3 
1 1 3 2 1 2 1 1 651 | 1347749 1 1 1 1 2 1 1 1 1 652 | 1347943 5 2 2 2 2 1 1 1 2 653 | 1348851 3 1 1 1 2 1 3 1 1 654 | 1350319 5 7 4 1 6 1 7 10 3 655 | 1350423 5 10 10 8 5 5 7 10 1 656 | 1352848 3 10 7 8 5 8 7 4 1 657 | 1353092 3 2 1 2 2 1 3 1 1 658 | 1354840 2 1 1 1 2 1 3 1 1 659 | 1354840 5 3 2 1 3 1 1 1 1 660 | 1355260 1 1 1 1 2 1 2 1 1 661 | 1365075 4 1 4 1 2 1 1 1 1 662 | 1365328 1 1 2 1 2 1 2 1 1 663 | 1368267 5 1 1 1 2 1 1 1 1 664 | 1368273 1 1 1 1 2 1 1 1 1 665 | 1368882 2 1 1 1 2 1 1 1 1 666 | 1369821 10 10 10 10 5 10 10 10 7 667 | 1371026 5 10 10 10 4 10 5 6 3 668 | 1371920 5 1 1 1 2 1 3 2 1 669 | 466906 1 1 1 1 2 1 1 1 1 670 | 466906 1 1 1 1 2 1 1 1 1 671 | 534555 1 1 1 1 2 1 1 1 1 672 | 536708 1 1 1 1 2 1 1 1 1 673 | 566346 3 1 1 1 2 1 2 3 1 674 | 603148 4 1 1 1 2 1 1 1 1 675 | 654546 1 1 1 1 2 1 1 1 8 676 | 654546 1 1 1 3 2 1 1 1 1 677 | 695091 5 10 10 5 4 5 4 4 1 678 | 714039 3 1 1 1 2 1 1 1 1 679 | 763235 3 1 1 1 2 1 2 1 2 680 | 776715 3 1 1 1 3 2 1 1 1 681 | 841769 2 1 1 1 2 1 1 1 1 682 | 888820 5 10 10 3 7 3 8 10 2 683 | 897471 4 8 6 4 3 4 10 6 1 684 | 897471 4 8 8 5 4 5 10 4 1 685 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 |

### Short description

Autoencoder implementation in TensorFlow. Visualizes the reconstructed values (up to the first 75x75 entries) every 1000 training cycles. The background color corresponds to the absolute difference between the reconstructed value and the actual (target) value. The target values are typically in [0, 1] by design (scaling is applied by default, but can be turned off), even though tanh is used as the activation function, so values in [-1, 1] are possible in the reconstruction. The errors are discretized into bins, with bin 0 (dark green) being the best (smallest error) and red being the worst; the binning is sketched below. A colored digit in the foreground indicates that the entry in question has moved to another bin (improved or worsened) since the last visualization; the digit shows which bin the entry is in now.

Todo:
- make layerwise training optional
- add weight decay
- make the weight initialization configurable
- allow specifying a random seed
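
As a small sketch of the binning (the bin edges are the ones hard-coded in `ascii_autoencoder.py`; the error values here are made up):

```python
import numpy as np

# Bin edges used for the absolute reconstruction error
bins = np.array([0.01, 0.05, 0.1, 0.15, 0.20, 0.30, 0.50, 1.0])

errors = np.array([0.004, 0.08, 0.6])  # |reconstructed - target| per entry
print(np.digitize(errors, bins))       # [0 2 7]; bin 0 (dark green) is best
```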

### Dependencies

- tensorflow
- pandas
- numpy
- colored (used by col.py for the terminal colors)

### Usage
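
For example (illustrative parameter choices), to train a two-layer autoencoder on the bundled breast cancer matrix, a tab-separated file whose first column is taken as the row index:

```
python ascii_autoencoder.py -n 4 2 -c 10000 BreastCancer_noclass_noNA.tsv
```

The full option list: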

```
usage: ascii_autoencoder.py [-h] [-T] [-n N_HIDDEN_NODES [N_HIDDEN_NODES ...]]
                            [-l LEARNING_RATE] [-d DROPOUT_PROB] [-c N_CYCLES]
                            [-b BATCH_SIZE] [--no_scaling] [--no_output]
                            [--decoded_file DECODED_FILE]
                            [--encoded_file ENCODED_FILE]
                            [--scaled_input_file SCALED_INPUT_FILE]
                            fpath

Autoencoder implemented in tensorflow

positional arguments:
  fpath                 Path to file containing matrix with input examples for
                        the autoencoder

optional arguments:
  -h, --help            show this help message and exit
  -T, --transpose       Transpose input
  -n N_HIDDEN_NODES [N_HIDDEN_NODES ...], --n_hidden_nodes N_HIDDEN_NODES [N_HIDDEN_NODES ...]
                        Number of hidden nodes in each hidden layer
  -l LEARNING_RATE, --learning_rate LEARNING_RATE
                        Learning rate
  -d DROPOUT_PROB, --dropout_prob DROPOUT_PROB
                        Dropout probability
  -c N_CYCLES, --n_cycles N_CYCLES
                        Number of training cycles
  -b BATCH_SIZE, --batch_size BATCH_SIZE
                        Number of training examples per cycle
  --no_scaling          Do not scale input (use this if you have already
                        scaled it)
  --no_output           Skip writing encoded and decoded values to file
  --decoded_file DECODED_FILE
                        Name of file to write decoded (reconstructed) values
                        to
  --encoded_file ENCODED_FILE
                        Name of file to write encoded values to
  --scaled_input_file SCALED_INPUT_FILE
                        Name of file to write scaled input values to
```
--------------------------------------------------------------------------------
/ascii_autoencoder.py:
--------------------------------------------------------------------------------
import tensorflow as tf
import numpy as np
import math
import pandas as pd
import sys
from col import print_row, print_row2
import argparse

def scale01(vec):
    """
    Scale the input values of a vector to the interval [0, 1].
    """
    # If all the values are the same the feature is uninformative, so do not
    # scale (this also avoids division by zero)
    if vec.min() == vec.max(): return vec
    return( (vec-vec.min()) / float(vec.max()-vec.min()) )
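
# Illustrative example (not part of the original script): scale01 rescales one
# feature vector linearly, e.g. scale01(np.array([2., 4., 6.])) returns
# array([0. , 0.5, 1. ]), and a constant vector is returned unchanged.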

def corrupt(array, corr):
    """
    Set a fraction ('corr') of the entries in a matrix to zero. May or may not
    be equivalent to a dropout function in tensorflow.
    """
    array = np.array(array)
    for row in array:
        # Note: indices are sampled with replacement, so up to corr*len(row)
        # entries per row are zeroed
        row[np.random.choice(len(row), int(corr*len(row)))] = 0.0
    return array

def dump_file(tens, fname):
    """
    Write the contents of a tensor (for instance, the encoded or decoded
    inputs) to file.
    """
    outfile = open(fname, 'w')
    for row in tens:
        for i in range(len(row)):
            outfile.write(str(row[i]))
            if i == len(row)-1: outfile.write('\n')
            else: outfile.write('\t')
    outfile.close()

def create(x, layer_sizes):
    """
    Build a tied-weights autoencoder.
    Borrowed from Salik Syed https://gist.github.com/saliksyed/593c950ba1a3b9dd08d5
    """
    # Build the encoding layers
    next_layer_input = x
    encoding_matrices = []
    for dim in layer_sizes:
        input_dim = int(next_layer_input.get_shape()[1])
        # Initialize W using random values in interval [-1/sqrt(n), 1/sqrt(n)]
        W = tf.Variable(tf.random_uniform([input_dim, dim], -1.0 / math.sqrt(input_dim), 1.0 / math.sqrt(input_dim)))
        # Initialize b to zero
        b = tf.Variable(tf.zeros([dim]))
        # We are going to use tied weights, so store the W matrix for later reference
        encoding_matrices.append(W)
        output = tf.nn.tanh(tf.matmul(next_layer_input, W) + b)
        # The input into the next layer is the output of this layer
        next_layer_input = output
    # The fully encoded x value is now stored in next_layer_input
    encoded_x = next_layer_input
    # Build the reconstruction layers by reversing the reductions
    layer_sizes.reverse()
    encoding_matrices.reverse()
    for i, dim in enumerate(layer_sizes[1:] + [int(x.get_shape()[1])]):
        # We are using tied weights, so just look up the encoding matrix for
        # this step and transpose it
        W = tf.transpose(encoding_matrices[i])
        b = tf.Variable(tf.zeros([dim]))
        output = tf.nn.tanh(tf.matmul(next_layer_input, W) + b)
        next_layer_input = output
    # The fully encoded and reconstructed value of x is here:
    reconstructed_x = next_layer_input
    return {
        'encoded': encoded_x,
        'decoded': reconstructed_x,
        # Cost is the root mean squared reconstruction error
        'cost': tf.sqrt(tf.reduce_mean(tf.square(x - reconstructed_x)))
    }
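
# Illustrative note (an added example, not from the original comments): with,
# say, 9 input features and --n_hidden_nodes 4 2, the encoder creates W1 (9x4)
# and W2 (4x2); the decoder reuses tf.transpose(W2) (2x4) and tf.transpose(W1)
# (4x9) with fresh biases, so encoder and decoder share ("tie") their weights.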

parser = argparse.ArgumentParser(description='Autoencoder implemented in tensorflow')
parser.add_argument('fpath', help='Path to file containing matrix with input examples for the autoencoder')
parser.add_argument('-T', '--transpose', action='store_true', help='Transpose input')
parser.add_argument('-n', '--n_hidden_nodes', default=[2], nargs='+', help='Number of hidden nodes in each hidden layer', type=int)
parser.add_argument('-l', '--learning_rate', default=0.1, help='Learning rate', type=float)
parser.add_argument('-d', '--dropout_prob', default=0.2, help='Dropout probability', type=float)
parser.add_argument('-c', '--n_cycles', default=50000, help='Number of training cycles', type=int)
parser.add_argument('-b', '--batch_size', default=500, help='Number of training examples per cycle', type=int)
parser.add_argument('--no_scaling', action='store_true', help='Do not scale input (use this if you have already scaled it)')
parser.add_argument('--no_output', action='store_true', help='Skip writing encoded and decoded values to file')
parser.add_argument('--decoded_file', default='decoded.tsv', help='Name of file to write decoded (reconstructed) values to')
parser.add_argument('--encoded_file', default='encoded.tsv', help='Name of file to write encoded values to')
parser.add_argument('--scaled_input_file', default='scaled_input.tsv', help='Name of file to write scaled input values to')

# Parse input arguments
args = parser.parse_args()
data = pd.read_csv(args.fpath, sep="\t", index_col=0)
n_hidden = args.n_hidden_nodes
learning_rate = args.learning_rate
corrupt_level = args.dropout_prob

# We want to scale the data per feature, so the scaling differs depending on
# the orientation of the input: examples x features or features x examples
if args.transpose: # in the input file, examples are in the columns and features are in the rows
    if args.no_scaling:
        input_data = np.array(data.T)
    else:
        input_data = np.array(list(map(scale01, np.array(data)))).T

else: # examples are in rows, features are in columns
    if args.no_scaling:
        input_data = np.array(data)
    # Scale columnwise, ie transpose, scale, transpose back
    else:
        input_data = np.array(list(map(scale01, np.array(data.T)))).T
        #input_data = np.array(list(map(scale01, np.array(data.T))))

print(input_data.shape)
print(input_data[0])

n_samp, n_input = input_data.shape
print("No of samples: " + str(n_samp))
print("No of inputs: " + str(n_input))
n_rounds = args.n_cycles
batch_size = min(args.batch_size, n_samp)

x = tf.placeholder("float", [None, n_input])
ae = create(x, n_hidden)

init = tf.initialize_all_variables()
sess = tf.Session()
sess.run(init)

# Bin edges for the absolute reconstruction error (bin 0 = smallest error)
bins = np.array([0.01, 0.05, 0.1, 0.15, 0.20, 0.30, 0.50, 1.0])
vis_input = min(n_input, 75)
vis_samples = min(n_samp, 75)
prev_diff = None

train_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(ae['cost'])

for i in range(n_rounds):
    # Train on a randomly drawn, corrupted batch
    sample = np.random.randint(n_samp, size=batch_size)
    batch_xs = input_data[sample][:]
    batch_xs = corrupt(batch_xs, corrupt_level)
    sess.run(train_step, feed_dict={x: np.array(batch_xs)})

    # Visualize the binned reconstruction errors every 1000 cycles
    if i % 1000 == 0:
        diff = abs(sess.run(ae['decoded'], feed_dict={x: np.array(input_data)})[0:vis_samples, 0:vis_input] - input_data[0:vis_samples, 0:vis_input])
        for j in range(0, len(diff)):
            discr = np.digitize(diff[j], bins)
            if prev_diff is not None:
                discr_prev = np.digitize(prev_diff[j], bins)
            else:
                discr_prev = None
            print_row2(discr, discr_prev)
        print('')
        prev_diff = diff
        print(i, sess.run(ae['cost'], feed_dict={x: np.array(input_data)}))

reconstr = sess.run(ae['decoded'], feed_dict={x: input_data})
encoded = sess.run(ae['encoded'], feed_dict={x: input_data})

if not args.no_output:
    dump_file(reconstr, args.decoded_file)
    dump_file(encoded, args.encoded_file)
    dump_file(input_data, args.scaled_input_file)
--------------------------------------------------------------------------------
/col.py:
--------------------------------------------------------------------------------
import numpy as np
import colored
import sys
import math

# 256-color terminal codes, ordered from red (largest error) to green
# (smallest error); indexed via lookup[9 - bin]
lookup = [ 1,196,197,198,199,208,209,172,149, 2]
def print_row(a):
    for x in a:
        foo = np.around(math.floor(x * 999))
        index = int(foo / 100)
        color = colored.bg(lookup[9-index])
        res = colored.attr('reset')
        sys.stdout.write(color + str(int(foo)) + res)
        #sys.stdout.write(" ")
    sys.stdout.write("\n")


def print_row2(a, a2):
    # The background color shows the current bin; if a previous row is given,
    # the foreground color of the digit shows the bin the entry was in before
    for i in range(0,len(a)):
        x = a[i]
        color = colored.bg(lookup[9-x])
        if a2 is not None:
            color += colored.fg(lookup[9-a2[i]])
        res = colored.attr('reset')
        sys.stdout.write(color + str(x) + res)
        #sys.stdout.write(" ")
    sys.stdout.write("\n")

#print_row([ 1,196,197,198,199,208,209,172,149, 2])
--------------------------------------------------------------------------------
/dependencies.txt:
--------------------------------------------------------------------------------
1 | tensorflow 2 | numpy 3
| pandas 4 | -------------------------------------------------------------------------------- /wine_noclass.txt: -------------------------------------------------------------------------------- 1 | 14.23 1.71 2.43 15.6 127 2.8 3.06 .28 2.29 5.64 1.04 3.92 2 | 13.2 1.78 2.14 11.2 100 2.65 2.76 .26 1.28 4.38 1.05 3.4 3 | 13.16 2.36 2.67 18.6 101 2.8 3.24 .3 2.81 5.68 1.03 3.17 4 | 14.37 1.95 2.5 16.8 113 3.85 3.49 .24 2.18 7.8 .86 3.45 5 | 13.24 2.59 2.87 21 118 2.8 2.69 .39 1.82 4.32 1.04 2.93 6 | 14.2 1.76 2.45 15.2 112 3.27 3.39 .34 1.97 6.75 1.05 2.85 7 | 14.39 1.87 2.45 14.6 96 2.5 2.52 .3 1.98 5.25 1.02 3.58 8 | 14.06 2.15 2.61 17.6 121 2.6 2.51 .31 1.25 5.05 1.06 3.58 9 | 14.83 1.64 2.17 14 97 2.8 2.98 .29 1.98 5.2 1.08 2.85 10 | 13.86 1.35 2.27 16 98 2.98 3.15 .22 1.85 7.22 1.01 3.55 11 | 14.1 2.16 2.3 18 105 2.95 3.32 .22 2.38 5.75 1.25 3.17 12 | 14.12 1.48 2.32 16.8 95 2.2 2.43 .26 1.57 5 1.17 2.82 13 | 13.75 1.73 2.41 16 89 2.6 2.76 .29 1.81 5.6 1.15 2.9 14 | 14.75 1.73 2.39 11.4 91 3.1 3.69 .43 2.81 5.4 1.25 2.73 15 | 14.38 1.87 2.38 12 102 3.3 3.64 .29 2.96 7.5 1.2 3 16 | 13.63 1.81 2.7 17.2 112 2.85 2.91 .3 1.46 7.3 1.28 2.88 17 | 14.3 1.92 2.72 20 120 2.8 3.14 .33 1.97 6.2 1.07 2.65 18 | 13.83 1.57 2.62 20 115 2.95 3.4 .4 1.72 6.6 1.13 2.57 19 | 14.19 1.59 2.48 16.5 108 3.3 3.93 .32 1.86 8.7 1.23 2.82 20 | 13.64 3.1 2.56 15.2 116 2.7 3.03 .17 1.66 5.1 .96 3.36 21 | 14.06 1.63 2.28 16 126 3 3.17 .24 2.1 5.65 1.09 3.71 22 | 12.93 3.8 2.65 18.6 102 2.41 2.41 .25 1.98 4.5 1.03 3.52 23 | 13.71 1.86 2.36 16.6 101 2.61 2.88 .27 1.69 3.8 1.11 4 24 | 12.85 1.6 2.52 17.8 95 2.48 2.37 .26 1.46 3.93 1.09 3.63 25 | 13.5 1.81 2.61 20 96 2.53 2.61 .28 1.66 3.52 1.12 3.82 26 | 13.05 2.05 3.22 25 124 2.63 2.68 .47 1.92 3.58 1.13 3.2 27 | 13.39 1.77 2.62 16.1 93 2.85 2.94 .34 1.45 4.8 .92 3.22 28 | 13.3 1.72 2.14 17 94 2.4 2.19 .27 1.35 3.95 1.02 2.77 29 | 13.87 1.9 2.8 19.4 107 2.95 2.97 .37 1.76 4.5 1.25 3.4 30 | 14.02 1.68 2.21 16 96 2.65 2.33 .26 1.98 4.7 1.04 3.59 31 | 13.73 1.5 2.7 22.5 101 3 3.25 .29 2.38 5.7 1.19 2.71 32 | 13.58 1.66 2.36 19.1 106 2.86 3.19 .22 1.95 6.9 1.09 2.88 33 | 13.68 1.83 2.36 17.2 104 2.42 2.69 .42 1.97 3.84 1.23 2.87 34 | 13.76 1.53 2.7 19.5 132 2.95 2.74 .5 1.35 5.4 1.25 3 35 | 13.51 1.8 2.65 19 110 2.35 2.53 .29 1.54 4.2 1.1 2.87 36 | 13.48 1.81 2.41 20.5 100 2.7 2.98 .26 1.86 5.1 1.04 3.47 37 | 13.28 1.64 2.84 15.5 110 2.6 2.68 .34 1.36 4.6 1.09 2.78 38 | 13.05 1.65 2.55 18 98 2.45 2.43 .29 1.44 4.25 1.12 2.51 39 | 13.07 1.5 2.1 15.5 98 2.4 2.64 .28 1.37 3.7 1.18 2.69 40 | 14.22 3.99 2.51 13.2 128 3 3.04 .2 2.08 5.1 .89 3.53 41 | 13.56 1.71 2.31 16.2 117 3.15 3.29 .34 2.34 6.13 .95 3.38 42 | 13.41 3.84 2.12 18.8 90 2.45 2.68 .27 1.48 4.28 .91 3 43 | 13.88 1.89 2.59 15 101 3.25 3.56 .17 1.7 5.43 .88 3.56 44 | 13.24 3.98 2.29 17.5 103 2.64 2.63 .32 1.66 4.36 .82 3 45 | 13.05 1.77 2.1 17 107 3 3 .28 2.03 5.04 .88 3.35 46 | 14.21 4.04 2.44 18.9 111 2.85 2.65 .3 1.25 5.24 .87 3.33 47 | 14.38 3.59 2.28 16 102 3.25 3.17 .27 2.19 4.9 1.04 3.44 48 | 13.9 1.68 2.12 16 101 3.1 3.39 .21 2.14 6.1 .91 3.33 49 | 14.1 2.02 2.4 18.8 103 2.75 2.92 .32 2.38 6.2 1.07 2.75 50 | 13.94 1.73 2.27 17.4 108 2.88 3.54 .32 2.08 8.90 1.12 3.1 51 | 13.05 1.73 2.04 12.4 92 2.72 3.27 .17 2.91 7.2 1.12 2.91 52 | 13.83 1.65 2.6 17.2 94 2.45 2.99 .22 2.29 5.6 1.24 3.37 53 | 13.82 1.75 2.42 14 111 3.88 3.74 .32 1.87 7.05 1.01 3.26 54 | 13.77 1.9 2.68 17.1 115 3 2.79 .39 1.68 6.3 1.13 2.93 55 | 13.74 1.67 2.25 16.4 118 2.6 2.9 .21 1.62 5.85 .92 3.2 56 | 13.56 1.73 2.46 20.5 116 2.96 2.78 .2 
2.45 6.25 .98 3.03 57 | 14.22 1.7 2.3 16.3 118 3.2 3 .26 2.03 6.38 .94 3.31 58 | 13.29 1.97 2.68 16.8 102 3 3.23 .31 1.66 6 1.07 2.84 59 | 13.72 1.43 2.5 16.7 108 3.4 3.67 .19 2.04 6.8 .89 2.87 60 | 12.37 .94 1.36 10.6 88 1.98 .57 .28 .42 1.95 1.05 1.82 61 | 12.33 1.1 2.28 16 101 2.05 1.09 .63 .41 3.27 1.25 1.67 62 | 12.64 1.36 2.02 16.8 100 2.02 1.41 .53 .62 5.75 .98 1.59 63 | 13.67 1.25 1.92 18 94 2.1 1.79 .32 .73 3.8 1.23 2.46 64 | 12.37 1.13 2.16 19 87 3.5 3.1 .19 1.87 4.45 1.22 2.87 65 | 12.17 1.45 2.53 19 104 1.89 1.75 .45 1.03 2.95 1.45 2.23 66 | 12.37 1.21 2.56 18.1 98 2.42 2.65 .37 2.08 4.6 1.19 2.3 67 | 13.11 1.01 1.7 15 78 2.98 3.18 .26 2.28 5.3 1.12 3.18 68 | 12.37 1.17 1.92 19.6 78 2.11 2 .27 1.04 4.68 1.12 3.48 69 | 13.34 .94 2.36 17 110 2.53 1.3 .55 .42 3.17 1.02 1.93 70 | 12.21 1.19 1.75 16.8 151 1.85 1.28 .14 2.5 2.85 1.28 3.07 71 | 12.29 1.61 2.21 20.4 103 1.1 1.02 .37 1.46 3.05 .906 1.82 72 | 13.86 1.51 2.67 25 86 2.95 2.86 .21 1.87 3.38 1.36 3.16 73 | 13.49 1.66 2.24 24 87 1.88 1.84 .27 1.03 3.74 .98 2.78 74 | 12.99 1.67 2.6 30 139 3.3 2.89 .21 1.96 3.35 1.31 3.5 75 | 11.96 1.09 2.3 21 101 3.38 2.14 .13 1.65 3.21 .99 3.13 76 | 11.66 1.88 1.92 16 97 1.61 1.57 .34 1.15 3.8 1.23 2.14 77 | 13.03 .9 1.71 16 86 1.95 2.03 .24 1.46 4.6 1.19 2.48 78 | 11.84 2.89 2.23 18 112 1.72 1.32 .43 .95 2.65 .96 2.52 79 | 12.33 .99 1.95 14.8 136 1.9 1.85 .35 2.76 3.4 1.06 2.31 80 | 12.7 3.87 2.4 23 101 2.83 2.55 .43 1.95 2.57 1.19 3.13 81 | 12 .92 2 19 86 2.42 2.26 .3 1.43 2.5 1.38 3.12 82 | 12.72 1.81 2.2 18.8 86 2.2 2.53 .26 1.77 3.9 1.16 3.14 83 | 12.08 1.13 2.51 24 78 2 1.58 .4 1.4 2.2 1.31 2.72 84 | 13.05 3.86 2.32 22.5 85 1.65 1.59 .61 1.62 4.8 .84 2.01 85 | 11.84 .89 2.58 18 94 2.2 2.21 .22 2.35 3.05 .79 3.08 86 | 12.67 .98 2.24 18 99 2.2 1.94 .3 1.46 2.62 1.23 3.16 87 | 12.16 1.61 2.31 22.8 90 1.78 1.69 .43 1.56 2.45 1.33 2.26 88 | 11.65 1.67 2.62 26 88 1.92 1.61 .4 1.34 2.6 1.36 3.21 89 | 11.64 2.06 2.46 21.6 84 1.95 1.69 .48 1.35 2.8 1 2.75 90 | 12.08 1.33 2.3 23.6 70 2.2 1.59 .42 1.38 1.74 1.07 3.21 91 | 12.08 1.83 2.32 18.5 81 1.6 1.5 .52 1.64 2.4 1.08 2.27 92 | 12 1.51 2.42 22 86 1.45 1.25 .5 1.63 3.6 1.05 2.65 93 | 12.69 1.53 2.26 20.7 80 1.38 1.46 .58 1.62 3.05 .96 2.06 94 | 12.29 2.83 2.22 18 88 2.45 2.25 .25 1.99 2.15 1.15 3.3 95 | 11.62 1.99 2.28 18 98 3.02 2.26 .17 1.35 3.25 1.16 2.96 96 | 12.47 1.52 2.2 19 162 2.5 2.27 .32 3.28 2.6 1.16 2.63 97 | 11.81 2.12 2.74 21.5 134 1.6 .99 .14 1.56 2.5 .95 2.26 98 | 12.29 1.41 1.98 16 85 2.55 2.5 .29 1.77 2.9 1.23 2.74 99 | 12.37 1.07 2.1 18.5 88 3.52 3.75 .24 1.95 4.5 1.04 2.77 100 | 12.29 3.17 2.21 18 88 2.85 2.99 .45 2.81 2.3 1.42 2.83 101 | 12.08 2.08 1.7 17.5 97 2.23 2.17 .26 1.4 3.3 1.27 2.96 102 | 12.6 1.34 1.9 18.5 88 1.45 1.36 .29 1.35 2.45 1.04 2.77 103 | 12.34 2.45 2.46 21 98 2.56 2.11 .34 1.31 2.8 .8 3.38 104 | 11.82 1.72 1.88 19.5 86 2.5 1.64 .37 1.42 2.06 .94 2.44 105 | 12.51 1.73 1.98 20.5 85 2.2 1.92 .32 1.48 2.94 1.04 3.57 106 | 12.42 2.55 2.27 22 90 1.68 1.84 .66 1.42 2.7 .86 3.3 107 | 12.25 1.73 2.12 19 80 1.65 2.03 .37 1.63 3.4 1 3.17 108 | 12.72 1.75 2.28 22.5 84 1.38 1.76 .48 1.63 3.3 .88 2.42 109 | 12.22 1.29 1.94 19 92 2.36 2.04 .39 2.08 2.7 .86 3.02 110 | 11.61 1.35 2.7 20 94 2.74 2.92 .29 2.49 2.65 .96 3.26 111 | 11.46 3.74 1.82 19.5 107 3.18 2.58 .24 3.58 2.9 .75 2.81 112 | 12.52 2.43 2.17 21 88 2.55 2.27 .26 1.22 2 .9 2.78 113 | 11.76 2.68 2.92 20 103 1.75 2.03 .6 1.05 3.8 1.23 2.5 114 | 11.41 .74 2.5 21 88 2.48 2.01 .42 1.44 3.08 1.1 2.31 115 | 12.08 1.39 2.5 22.5 84 2.56 2.29 .43 1.04 2.9 .93 3.19 116 
| 11.03 1.51 2.2 21.5 85 2.46 2.17 .52 2.01 1.9 1.71 2.87 117 | 11.82 1.47 1.99 20.8 86 1.98 1.6 .3 1.53 1.95 .95 3.33 118 | 12.42 1.61 2.19 22.5 108 2 2.09 .34 1.61 2.06 1.06 2.96 119 | 12.77 3.43 1.98 16 80 1.63 1.25 .43 .83 3.4 .7 2.12 120 | 12 3.43 2 19 87 2 1.64 .37 1.87 1.28 .93 3.05 121 | 11.45 2.4 2.42 20 96 2.9 2.79 .32 1.83 3.25 .8 3.39 122 | 11.56 2.05 3.23 28.5 119 3.18 5.08 .47 1.87 6 .93 3.69 123 | 12.42 4.43 2.73 26.5 102 2.2 2.13 .43 1.71 2.08 .92 3.12 124 | 13.05 5.8 2.13 21.5 86 2.62 2.65 .3 2.01 2.6 .73 3.1 125 | 11.87 4.31 2.39 21 82 2.86 3.03 .21 2.91 2.8 .75 3.64 126 | 12.07 2.16 2.17 21 85 2.6 2.65 .37 1.35 2.76 .86 3.28 127 | 12.43 1.53 2.29 21.5 86 2.74 3.15 .39 1.77 3.94 .69 2.84 128 | 11.79 2.13 2.78 28.5 92 2.13 2.24 .58 1.76 3 .97 2.44 129 | 12.37 1.63 2.3 24.5 88 2.22 2.45 .4 1.9 2.12 .89 2.78 130 | 12.04 4.3 2.38 22 80 2.1 1.75 .42 1.35 2.6 .79 2.57 131 | 12.86 1.35 2.32 18 122 1.51 1.25 .21 .94 4.1 .76 1.29 132 | 12.88 2.99 2.4 20 104 1.3 1.22 .24 .83 5.4 .74 1.42 133 | 12.81 2.31 2.4 24 98 1.15 1.09 .27 .83 5.7 .66 1.36 134 | 12.7 3.55 2.36 21.5 106 1.7 1.2 .17 .84 5 .78 1.29 135 | 12.51 1.24 2.25 17.5 85 2 .58 .6 1.25 5.45 .75 1.51 136 | 12.6 2.46 2.2 18.5 94 1.62 .66 .63 .94 7.1 .73 1.58 137 | 12.25 4.72 2.54 21 89 1.38 .47 .53 .8 3.85 .75 1.27 138 | 12.53 5.51 2.64 25 96 1.79 .6 .63 1.1 5 .82 1.69 139 | 13.49 3.59 2.19 19.5 88 1.62 .48 .58 .88 5.7 .81 1.82 140 | 12.84 2.96 2.61 24 101 2.32 .6 .53 .81 4.92 .89 2.15 141 | 12.93 2.81 2.7 21 96 1.54 .5 .53 .75 4.6 .77 2.31 142 | 13.36 2.56 2.35 20 89 1.4 .5 .37 .64 5.6 .7 2.47 143 | 13.52 3.17 2.72 23.5 97 1.55 .52 .5 .55 4.35 .89 2.06 144 | 13.62 4.95 2.35 20 92 2 .8 .47 1.02 4.4 .91 2.05 145 | 12.25 3.88 2.2 18.5 112 1.38 .78 .29 1.14 8.21 .65 2 146 | 13.16 3.57 2.15 21 102 1.5 .55 .43 1.3 4 .6 1.68 147 | 13.88 5.04 2.23 20 80 .98 .34 .4 .68 4.9 .58 1.33 148 | 12.87 4.61 2.48 21.5 86 1.7 .65 .47 .86 7.65 .54 1.86 149 | 13.32 3.24 2.38 21.5 92 1.93 .76 .45 1.25 8.42 .55 1.62 150 | 13.08 3.9 2.36 21.5 113 1.41 1.39 .34 1.14 9.40 .57 1.33 151 | 13.5 3.12 2.62 24 123 1.4 1.57 .22 1.25 8.60 .59 1.3 152 | 12.79 2.67 2.48 22 112 1.48 1.36 .24 1.26 10.8 .48 1.47 153 | 13.11 1.9 2.75 25.5 116 2.2 1.28 .26 1.56 7.1 .61 1.33 154 | 13.23 3.3 2.28 18.5 98 1.8 .83 .61 1.87 10.52 .56 1.51 155 | 12.58 1.29 2.1 20 103 1.48 .58 .53 1.4 7.6 .58 1.55 156 | 13.17 5.19 2.32 22 93 1.74 .63 .61 1.55 7.9 .6 1.48 157 | 13.84 4.12 2.38 19.5 89 1.8 .83 .48 1.56 9.01 .57 1.64 158 | 12.45 3.03 2.64 27 97 1.9 .58 .63 1.14 7.5 .67 1.73 159 | 14.34 1.68 2.7 25 98 2.8 1.31 .53 2.7 13 .57 1.96 160 | 13.48 1.67 2.64 22.5 89 2.6 1.1 .52 2.29 11.75 .57 1.78 161 | 12.36 3.83 2.38 21 88 2.3 .92 .5 1.04 7.65 .56 1.58 162 | 13.69 3.26 2.54 20 107 1.83 .56 .5 .8 5.88 .96 1.82 163 | 12.85 3.27 2.58 22 106 1.65 .6 .6 .96 5.58 .87 2.11 164 | 12.96 3.45 2.35 18.5 106 1.39 .7 .4 .94 5.28 .68 1.75 165 | 13.78 2.76 2.3 22 90 1.35 .68 .41 1.03 9.58 .7 1.68 166 | 13.73 4.36 2.26 22.5 88 1.28 .47 .52 1.15 6.62 .78 1.75 167 | 13.45 3.7 2.6 23 111 1.7 .92 .43 1.46 10.68 .85 1.56 168 | 12.82 3.37 2.3 19.5 88 1.48 .66 .4 .97 10.26 .72 1.75 169 | 13.58 2.58 2.69 24.5 105 1.55 .84 .39 1.54 8.66 .74 1.8 170 | 13.4 4.6 2.86 25 112 1.98 .96 .27 1.11 8.5 .67 1.92 171 | 12.2 3.03 2.32 19 96 1.25 .49 .4 .73 5.5 .66 1.83 172 | 12.77 2.39 2.28 19.5 86 1.39 .51 .48 .64 9.899999 .57 1.63 173 | 14.16 2.51 2.48 20 91 1.68 .7 .44 1.24 9.7 .62 1.71 174 | 13.71 5.65 2.45 20.5 95 1.68 .61 .52 1.06 7.7 .64 1.74 175 | 13.4 3.91 2.48 23 102 1.8 .75 .43 1.41 7.3 .7 1.56 176 | 13.27 
4.28 2.26 20 120 1.59 .69 .43 1.35 10.2 .59 1.56 177 | 13.17 2.59 2.37 20 120 1.65 .68 .53 1.46 9.3 .6 1.62 178 | 14.13 4.1 2.74 24.5 96 2.05 .76 .56 1.35 9.2 .61 1.6 179 | --------------------------------------------------------------------------------