├── LICENSE
├── README.md
├── dataset
│   ├── Input_Yo20op1084_ana.csv
│   ├── Input_Yo20op1084_ana_50.csv
│   ├── Input_Yo20op1084_ana_bw20.csv
│   ├── Input_Yo20op1084_ana_bw20_50.csv
│   ├── Input_Yo20op272_ana.csv
│   ├── Input_Yo20op40_ana.csv
│   ├── Input_Yo20op40_ana_50.csv
│   ├── Input_Yo20op40_ana_bw20.csv
│   ├── Input_Yo20op40_ana_bw20_50.csv
│   ├── Output_Yo20op1084_ana.csv
│   ├── Output_Yo20op1084_ana_50.csv
│   ├── Output_Yo20op1084_ana_bw20.csv
│   ├── Output_Yo20op1084_ana_bw20_50.csv
│   ├── Output_Yo20op272_ana.csv
│   ├── Output_Yo20op40_ana.csv
│   ├── Output_Yo20op40_ana_50.csv
│   ├── Output_Yo20op40_ana_bw20.csv
│   └── Output_Yo20op40_ana_bw20_50.csv
├── doc
│   ├── .gitkeep
│   └── Overall---.png
├── optuna
│   ├── .gitkeep
│   ├── Yo_ADAM_Optuna_1.ipynb
│   └── Yo_ADAM_Optuna_2.ipynb
├── simulation examples
│   ├── .gitkeep
│   └── ML_2L_GFL_ADMITTANCE_RL_PLL_2023.plecs
├── training
│   └── Yo_FNN_ADAM.ipynb
└── transfer learning
    └── Yo_transfer learning_cycle.ipynb

/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 | 
3 | Copyright (c) 2023 Yufei Li/ Princeton University/ yl5385@princeton.edu
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | ## InvNet
2 | InvNet: a machine learning framework for grid-edge inverter impedance modeling.
3 | 
4 | ### Brief introduction
5 | The future electric grid will be shaped by a vast number of smart inverters distributed at the grid edge. The dynamics of these inverters are commonly characterized as impedances under small-signal perturbation and are critical to grid stability and resiliency. However, their operating conditions vary widely, producing diverse impedance patterns and complicating grid-inverter interactions. Existing analytical impedance models require precise knowledge of system parameters and rely on numerous simplifying assumptions; they rarely capture the complete electrical behavior of physical systems when inverters run sophisticated control algorithms or perform complex functions. Acquiring real-world impedance data across many operating points, whether by simulation or measurement, is expensive and often impractical.
Leveraging recent advances in artificial intelligence and machine learning, we present InvNet, a few-shot machine learning framework that characterizes inverter impedance patterns across a wide operating range, even with limited impedance data per inverter. InvNet can extrapolate from physics-based models to real-world models, and from one inverter to another. Our work showcases machine learning and neural networks as powerful tools for modeling the black-box characteristics of sophisticated grid-edge energy systems and for analyzing the behavior of larger-scale systems that traditional analytical methods cannot describe.
6 | 
7 | ![](https://github.com/superrabbit2023/InvNet/blob/master/doc/Overall---.png)
8 | 
9 | ### How to cite
10 | If you use InvNet, please cite the following:
11 | 
12 | [1] Yufei Li, Yicheng Liao, Liang Zhao, Minjie Chen, Xiongfei Wang, Lars Nordström, Prateek Mittal, and H. Vincent Poor, "Machine Learning At the Grid Edge: Data-Driven Impedance Models for Model-Free Inverters," IEEE Transactions on Power Electronics, 2024, doi: 10.1109/TPEL.2024.3399776 (https://ieeexplore.ieee.org/document/10529635).
13 | 
14 | [2] Yufei Li, Yicheng Liao, Xiongfei Wang, Lars Nordström, Prateek Mittal, Minjie Chen, and H. Vincent Poor, "Neural Network Models and Transfer Learning for Impedance Modeling of Grid-Tied Inverters," 2022 IEEE 13th International Symposium on Power Electronics for Distributed Generation Systems (PEDG), Kiel, Germany, 2022, pp. 1-6, doi: 10.1109/PEDG54999.2022.9923064.
15 | 
16 | And more to come...
17 | 
--------------------------------------------------------------------------------
/dataset/Input_Yo20op40_ana.csv:
--------------------------------------------------------------------------------
1 | 1,0.9,-0.5,-0.5 2 | 1.31578947368421,0.9,-0.5,-0.5 3 | 1.75438596491228,0.9,-0.5,-0.5 4 | 2.32558139534884,0.9,-0.5,-0.5 5 | 3.03030303030303,0.9,-0.5,-0.5 6 | 4,0.9,-0.5,-0.5 7 | 5.35714285714286,0.9,-0.5,-0.5 8 | 6.97674418604651,0.9,-0.5,-0.5 9 | 9.375,0.9,-0.5,-0.5 10 | 12.5,0.9,-0.5,-0.5 11 | 16.6666666666667,0.9,-0.5,-0.5 12 | 21.4285714285714,0.9,-0.5,-0.5 13 | 28.5714285714286,0.9,-0.5,-0.5 14 | 37.5,0.9,-0.5,-0.5 15 | 50,0.9,-0.5,-0.5 16 | 66.6666666666667,0.9,-0.5,-0.5 17 | 87.5,0.9,-0.5,-0.5 18 | 116.666666666667,0.9,-0.5,-0.5 19 | 150,0.9,-0.5,-0.5 20 | 200,0.9,-0.5,-0.5 21 | 1,0.9,-0.5,0 22 | 1.31578947368421,0.9,-0.5,0 23 | 1.75438596491228,0.9,-0.5,0 24 | 2.32558139534884,0.9,-0.5,0 25 | 3.03030303030303,0.9,-0.5,0 26 | 4,0.9,-0.5,0 27 | 5.35714285714286,0.9,-0.5,0 28 | 6.97674418604651,0.9,-0.5,0 29 | 9.375,0.9,-0.5,0 30 | 12.5,0.9,-0.5,0 31 | 16.6666666666667,0.9,-0.5,0 32 | 21.4285714285714,0.9,-0.5,0 33 | 28.5714285714286,0.9,-0.5,0 34 | 37.5,0.9,-0.5,0 35 | 50,0.9,-0.5,0 36 | 66.6666666666667,0.9,-0.5,0 37 | 87.5,0.9,-0.5,0 38 | 116.666666666667,0.9,-0.5,0 39 | 150,0.9,-0.5,0 40 | 200,0.9,-0.5,0 41 | 1,0.9,-0.5,0.5 42 | 1.31578947368421,0.9,-0.5,0.5 43 | 1.75438596491228,0.9,-0.5,0.5 44 | 2.32558139534884,0.9,-0.5,0.5 45 | 3.03030303030303,0.9,-0.5,0.5 46 | 4,0.9,-0.5,0.5 47 | 5.35714285714286,0.9,-0.5,0.5 48 | 6.97674418604651,0.9,-0.5,0.5 49 | 9.375,0.9,-0.5,0.5 50 | 12.5,0.9,-0.5,0.5 51 | 16.6666666666667,0.9,-0.5,0.5 52 | 21.4285714285714,0.9,-0.5,0.5 53 | 28.5714285714286,0.9,-0.5,0.5 54 | 37.5,0.9,-0.5,0.5 55 | 50,0.9,-0.5,0.5 56 | 66.6666666666667,0.9,-0.5,0.5 57 | 87.5,0.9,-0.5,0.5 58 | 116.666666666667,0.9,-0.5,0.5 59 | 150,0.9,-0.5,0.5 60 | 200,0.9,-0.5,0.5 61 | 1,0.9,0,-0.5 62 |
1.31578947368421,0.9,0,-0.5 63 | 1.75438596491228,0.9,0,-0.5 64 | 2.32558139534884,0.9,0,-0.5 65 | 3.03030303030303,0.9,0,-0.5 66 | 4,0.9,0,-0.5 67 | 5.35714285714286,0.9,0,-0.5 68 | 6.97674418604651,0.9,0,-0.5 69 | 9.375,0.9,0,-0.5 70 | 12.5,0.9,0,-0.5 71 | 16.6666666666667,0.9,0,-0.5 72 | 21.4285714285714,0.9,0,-0.5 73 | 28.5714285714286,0.9,0,-0.5 74 | 37.5,0.9,0,-0.5 75 | 50,0.9,0,-0.5 76 | 66.6666666666667,0.9,0,-0.5 77 | 87.5,0.9,0,-0.5 78 | 116.666666666667,0.9,0,-0.5 79 | 150,0.9,0,-0.5 80 | 200,0.9,0,-0.5 81 | 1,0.9,0,0.5 82 | 1.31578947368421,0.9,0,0.5 83 | 1.75438596491228,0.9,0,0.5 84 | 2.32558139534884,0.9,0,0.5 85 | 3.03030303030303,0.9,0,0.5 86 | 4,0.9,0,0.5 87 | 5.35714285714286,0.9,0,0.5 88 | 6.97674418604651,0.9,0,0.5 89 | 9.375,0.9,0,0.5 90 | 12.5,0.9,0,0.5 91 | 16.6666666666667,0.9,0,0.5 92 | 21.4285714285714,0.9,0,0.5 93 | 28.5714285714286,0.9,0,0.5 94 | 37.5,0.9,0,0.5 95 | 50,0.9,0,0.5 96 | 66.6666666666667,0.9,0,0.5 97 | 87.5,0.9,0,0.5 98 | 116.666666666667,0.9,0,0.5 99 | 150,0.9,0,0.5 100 | 200,0.9,0,0.5 101 | 1,0.9,0.5,-0.5 102 | 1.31578947368421,0.9,0.5,-0.5 103 | 1.75438596491228,0.9,0.5,-0.5 104 | 2.32558139534884,0.9,0.5,-0.5 105 | 3.03030303030303,0.9,0.5,-0.5 106 | 4,0.9,0.5,-0.5 107 | 5.35714285714286,0.9,0.5,-0.5 108 | 6.97674418604651,0.9,0.5,-0.5 109 | 9.375,0.9,0.5,-0.5 110 | 12.5,0.9,0.5,-0.5 111 | 16.6666666666667,0.9,0.5,-0.5 112 | 21.4285714285714,0.9,0.5,-0.5 113 | 28.5714285714286,0.9,0.5,-0.5 114 | 37.5,0.9,0.5,-0.5 115 | 50,0.9,0.5,-0.5 116 | 66.6666666666667,0.9,0.5,-0.5 117 | 87.5,0.9,0.5,-0.5 118 | 116.666666666667,0.9,0.5,-0.5 119 | 150,0.9,0.5,-0.5 120 | 200,0.9,0.5,-0.5 121 | 1,0.9,0.5,0 122 | 1.31578947368421,0.9,0.5,0 123 | 1.75438596491228,0.9,0.5,0 124 | 2.32558139534884,0.9,0.5,0 125 | 3.03030303030303,0.9,0.5,0 126 | 4,0.9,0.5,0 127 | 5.35714285714286,0.9,0.5,0 128 | 6.97674418604651,0.9,0.5,0 129 | 9.375,0.9,0.5,0 130 | 12.5,0.9,0.5,0 131 | 16.6666666666667,0.9,0.5,0 132 | 21.4285714285714,0.9,0.5,0 133 | 28.5714285714286,0.9,0.5,0 134 | 37.5,0.9,0.5,0 135 | 50,0.9,0.5,0 136 | 66.6666666666667,0.9,0.5,0 137 | 87.5,0.9,0.5,0 138 | 116.666666666667,0.9,0.5,0 139 | 150,0.9,0.5,0 140 | 200,0.9,0.5,0 141 | 1,0.9,0.5,0.5 142 | 1.31578947368421,0.9,0.5,0.5 143 | 1.75438596491228,0.9,0.5,0.5 144 | 2.32558139534884,0.9,0.5,0.5 145 | 3.03030303030303,0.9,0.5,0.5 146 | 4,0.9,0.5,0.5 147 | 5.35714285714286,0.9,0.5,0.5 148 | 6.97674418604651,0.9,0.5,0.5 149 | 9.375,0.9,0.5,0.5 150 | 12.5,0.9,0.5,0.5 151 | 16.6666666666667,0.9,0.5,0.5 152 | 21.4285714285714,0.9,0.5,0.5 153 | 28.5714285714286,0.9,0.5,0.5 154 | 37.5,0.9,0.5,0.5 155 | 50,0.9,0.5,0.5 156 | 66.6666666666667,0.9,0.5,0.5 157 | 87.5,0.9,0.5,0.5 158 | 116.666666666667,0.9,0.5,0.5 159 | 150,0.9,0.5,0.5 160 | 200,0.9,0.5,0.5 161 | 1,1,-1,0 162 | 1.31578947368421,1,-1,0 163 | 1.75438596491228,1,-1,0 164 | 2.32558139534884,1,-1,0 165 | 3.03030303030303,1,-1,0 166 | 4,1,-1,0 167 | 5.35714285714286,1,-1,0 168 | 6.97674418604651,1,-1,0 169 | 9.375,1,-1,0 170 | 12.5,1,-1,0 171 | 16.6666666666667,1,-1,0 172 | 21.4285714285714,1,-1,0 173 | 28.5714285714286,1,-1,0 174 | 37.5,1,-1,0 175 | 50,1,-1,0 176 | 66.6666666666667,1,-1,0 177 | 87.5,1,-1,0 178 | 116.666666666667,1,-1,0 179 | 150,1,-1,0 180 | 200,1,-1,0 181 | 1,1,-0.5,-0.5 182 | 1.31578947368421,1,-0.5,-0.5 183 | 1.75438596491228,1,-0.5,-0.5 184 | 2.32558139534884,1,-0.5,-0.5 185 | 3.03030303030303,1,-0.5,-0.5 186 | 4,1,-0.5,-0.5 187 | 5.35714285714286,1,-0.5,-0.5 188 | 6.97674418604651,1,-0.5,-0.5 189 | 9.375,1,-0.5,-0.5 190 | 12.5,1,-0.5,-0.5 
191 | 16.6666666666667,1,-0.5,-0.5 192 | 21.4285714285714,1,-0.5,-0.5 193 | 28.5714285714286,1,-0.5,-0.5 194 | 37.5,1,-0.5,-0.5 195 | 50,1,-0.5,-0.5 196 | 66.6666666666667,1,-0.5,-0.5 197 | 87.5,1,-0.5,-0.5 198 | 116.666666666667,1,-0.5,-0.5 199 | 150,1,-0.5,-0.5 200 | 200,1,-0.5,-0.5 201 | 1,1,-0.5,0 202 | 1.31578947368421,1,-0.5,0 203 | 1.75438596491228,1,-0.5,0 204 | 2.32558139534884,1,-0.5,0 205 | 3.03030303030303,1,-0.5,0 206 | 4,1,-0.5,0 207 | 5.35714285714286,1,-0.5,0 208 | 6.97674418604651,1,-0.5,0 209 | 9.375,1,-0.5,0 210 | 12.5,1,-0.5,0 211 | 16.6666666666667,1,-0.5,0 212 | 21.4285714285714,1,-0.5,0 213 | 28.5714285714286,1,-0.5,0 214 | 37.5,1,-0.5,0 215 | 50,1,-0.5,0 216 | 66.6666666666667,1,-0.5,0 217 | 87.5,1,-0.5,0 218 | 116.666666666667,1,-0.5,0 219 | 150,1,-0.5,0 220 | 200,1,-0.5,0 221 | 1,1,-0.5,0.5 222 | 1.31578947368421,1,-0.5,0.5 223 | 1.75438596491228,1,-0.5,0.5 224 | 2.32558139534884,1,-0.5,0.5 225 | 3.03030303030303,1,-0.5,0.5 226 | 4,1,-0.5,0.5 227 | 5.35714285714286,1,-0.5,0.5 228 | 6.97674418604651,1,-0.5,0.5 229 | 9.375,1,-0.5,0.5 230 | 12.5,1,-0.5,0.5 231 | 16.6666666666667,1,-0.5,0.5 232 | 21.4285714285714,1,-0.5,0.5 233 | 28.5714285714286,1,-0.5,0.5 234 | 37.5,1,-0.5,0.5 235 | 50,1,-0.5,0.5 236 | 66.6666666666667,1,-0.5,0.5 237 | 87.5,1,-0.5,0.5 238 | 116.666666666667,1,-0.5,0.5 239 | 150,1,-0.5,0.5 240 | 200,1,-0.5,0.5 241 | 1,1,0,-1 242 | 1.31578947368421,1,0,-1 243 | 1.75438596491228,1,0,-1 244 | 2.32558139534884,1,0,-1 245 | 3.03030303030303,1,0,-1 246 | 4,1,0,-1 247 | 5.35714285714286,1,0,-1 248 | 6.97674418604651,1,0,-1 249 | 9.375,1,0,-1 250 | 12.5,1,0,-1 251 | 16.6666666666667,1,0,-1 252 | 21.4285714285714,1,0,-1 253 | 28.5714285714286,1,0,-1 254 | 37.5,1,0,-1 255 | 50,1,0,-1 256 | 66.6666666666667,1,0,-1 257 | 87.5,1,0,-1 258 | 116.666666666667,1,0,-1 259 | 150,1,0,-1 260 | 200,1,0,-1 261 | 1,1,0,-0.5 262 | 1.31578947368421,1,0,-0.5 263 | 1.75438596491228,1,0,-0.5 264 | 2.32558139534884,1,0,-0.5 265 | 3.03030303030303,1,0,-0.5 266 | 4,1,0,-0.5 267 | 5.35714285714286,1,0,-0.5 268 | 6.97674418604651,1,0,-0.5 269 | 9.375,1,0,-0.5 270 | 12.5,1,0,-0.5 271 | 16.6666666666667,1,0,-0.5 272 | 21.4285714285714,1,0,-0.5 273 | 28.5714285714286,1,0,-0.5 274 | 37.5,1,0,-0.5 275 | 50,1,0,-0.5 276 | 66.6666666666667,1,0,-0.5 277 | 87.5,1,0,-0.5 278 | 116.666666666667,1,0,-0.5 279 | 150,1,0,-0.5 280 | 200,1,0,-0.5 281 | 1,1,0,0.5 282 | 1.31578947368421,1,0,0.5 283 | 1.75438596491228,1,0,0.5 284 | 2.32558139534884,1,0,0.5 285 | 3.03030303030303,1,0,0.5 286 | 4,1,0,0.5 287 | 5.35714285714286,1,0,0.5 288 | 6.97674418604651,1,0,0.5 289 | 9.375,1,0,0.5 290 | 12.5,1,0,0.5 291 | 16.6666666666667,1,0,0.5 292 | 21.4285714285714,1,0,0.5 293 | 28.5714285714286,1,0,0.5 294 | 37.5,1,0,0.5 295 | 50,1,0,0.5 296 | 66.6666666666667,1,0,0.5 297 | 87.5,1,0,0.5 298 | 116.666666666667,1,0,0.5 299 | 150,1,0,0.5 300 | 200,1,0,0.5 301 | 1,1,0,1 302 | 1.31578947368421,1,0,1 303 | 1.75438596491228,1,0,1 304 | 2.32558139534884,1,0,1 305 | 3.03030303030303,1,0,1 306 | 4,1,0,1 307 | 5.35714285714286,1,0,1 308 | 6.97674418604651,1,0,1 309 | 9.375,1,0,1 310 | 12.5,1,0,1 311 | 16.6666666666667,1,0,1 312 | 21.4285714285714,1,0,1 313 | 28.5714285714286,1,0,1 314 | 37.5,1,0,1 315 | 50,1,0,1 316 | 66.6666666666667,1,0,1 317 | 87.5,1,0,1 318 | 116.666666666667,1,0,1 319 | 150,1,0,1 320 | 200,1,0,1 321 | 1,1,0.5,-0.5 322 | 1.31578947368421,1,0.5,-0.5 323 | 1.75438596491228,1,0.5,-0.5 324 | 2.32558139534884,1,0.5,-0.5 325 | 3.03030303030303,1,0.5,-0.5 326 | 4,1,0.5,-0.5 327 | 5.35714285714286,1,0.5,-0.5 
328 | 6.97674418604651,1,0.5,-0.5 329 | 9.375,1,0.5,-0.5 330 | 12.5,1,0.5,-0.5 331 | 16.6666666666667,1,0.5,-0.5 332 | 21.4285714285714,1,0.5,-0.5 333 | 28.5714285714286,1,0.5,-0.5 334 | 37.5,1,0.5,-0.5 335 | 50,1,0.5,-0.5 336 | 66.6666666666667,1,0.5,-0.5 337 | 87.5,1,0.5,-0.5 338 | 116.666666666667,1,0.5,-0.5 339 | 150,1,0.5,-0.5 340 | 200,1,0.5,-0.5 341 | 1,1,0.5,0 342 | 1.31578947368421,1,0.5,0 343 | 1.75438596491228,1,0.5,0 344 | 2.32558139534884,1,0.5,0 345 | 3.03030303030303,1,0.5,0 346 | 4,1,0.5,0 347 | 5.35714285714286,1,0.5,0 348 | 6.97674418604651,1,0.5,0 349 | 9.375,1,0.5,0 350 | 12.5,1,0.5,0 351 | 16.6666666666667,1,0.5,0 352 | 21.4285714285714,1,0.5,0 353 | 28.5714285714286,1,0.5,0 354 | 37.5,1,0.5,0 355 | 50,1,0.5,0 356 | 66.6666666666667,1,0.5,0 357 | 87.5,1,0.5,0 358 | 116.666666666667,1,0.5,0 359 | 150,1,0.5,0 360 | 200,1,0.5,0 361 | 1,1,0.5,0.5 362 | 1.31578947368421,1,0.5,0.5 363 | 1.75438596491228,1,0.5,0.5 364 | 2.32558139534884,1,0.5,0.5 365 | 3.03030303030303,1,0.5,0.5 366 | 4,1,0.5,0.5 367 | 5.35714285714286,1,0.5,0.5 368 | 6.97674418604651,1,0.5,0.5 369 | 9.375,1,0.5,0.5 370 | 12.5,1,0.5,0.5 371 | 16.6666666666667,1,0.5,0.5 372 | 21.4285714285714,1,0.5,0.5 373 | 28.5714285714286,1,0.5,0.5 374 | 37.5,1,0.5,0.5 375 | 50,1,0.5,0.5 376 | 66.6666666666667,1,0.5,0.5 377 | 87.5,1,0.5,0.5 378 | 116.666666666667,1,0.5,0.5 379 | 150,1,0.5,0.5 380 | 200,1,0.5,0.5 381 | 1,1,1,0 382 | 1.31578947368421,1,1,0 383 | 1.75438596491228,1,1,0 384 | 2.32558139534884,1,1,0 385 | 3.03030303030303,1,1,0 386 | 4,1,1,0 387 | 5.35714285714286,1,1,0 388 | 6.97674418604651,1,1,0 389 | 9.375,1,1,0 390 | 12.5,1,1,0 391 | 16.6666666666667,1,1,0 392 | 21.4285714285714,1,1,0 393 | 28.5714285714286,1,1,0 394 | 37.5,1,1,0 395 | 50,1,1,0 396 | 66.6666666666667,1,1,0 397 | 87.5,1,1,0 398 | 116.666666666667,1,1,0 399 | 150,1,1,0 400 | 200,1,1,0 401 | 1,1.1,-1,-0.5 402 | 1.31578947368421,1.1,-1,-0.5 403 | 1.75438596491228,1.1,-1,-0.5 404 | 2.32558139534884,1.1,-1,-0.5 405 | 3.03030303030303,1.1,-1,-0.5 406 | 4,1.1,-1,-0.5 407 | 5.35714285714286,1.1,-1,-0.5 408 | 6.97674418604651,1.1,-1,-0.5 409 | 9.375,1.1,-1,-0.5 410 | 12.5,1.1,-1,-0.5 411 | 16.6666666666667,1.1,-1,-0.5 412 | 21.4285714285714,1.1,-1,-0.5 413 | 28.5714285714286,1.1,-1,-0.5 414 | 37.5,1.1,-1,-0.5 415 | 50,1.1,-1,-0.5 416 | 66.6666666666667,1.1,-1,-0.5 417 | 87.5,1.1,-1,-0.5 418 | 116.666666666667,1.1,-1,-0.5 419 | 150,1.1,-1,-0.5 420 | 200,1.1,-1,-0.5 421 | 1,1.1,-1,0 422 | 1.31578947368421,1.1,-1,0 423 | 1.75438596491228,1.1,-1,0 424 | 2.32558139534884,1.1,-1,0 425 | 3.03030303030303,1.1,-1,0 426 | 4,1.1,-1,0 427 | 5.35714285714286,1.1,-1,0 428 | 6.97674418604651,1.1,-1,0 429 | 9.375,1.1,-1,0 430 | 12.5,1.1,-1,0 431 | 16.6666666666667,1.1,-1,0 432 | 21.4285714285714,1.1,-1,0 433 | 28.5714285714286,1.1,-1,0 434 | 37.5,1.1,-1,0 435 | 50,1.1,-1,0 436 | 66.6666666666667,1.1,-1,0 437 | 87.5,1.1,-1,0 438 | 116.666666666667,1.1,-1,0 439 | 150,1.1,-1,0 440 | 200,1.1,-1,0 441 | 1,1.1,-1,0.5 442 | 1.31578947368421,1.1,-1,0.5 443 | 1.75438596491228,1.1,-1,0.5 444 | 2.32558139534884,1.1,-1,0.5 445 | 3.03030303030303,1.1,-1,0.5 446 | 4,1.1,-1,0.5 447 | 5.35714285714286,1.1,-1,0.5 448 | 6.97674418604651,1.1,-1,0.5 449 | 9.375,1.1,-1,0.5 450 | 12.5,1.1,-1,0.5 451 | 16.6666666666667,1.1,-1,0.5 452 | 21.4285714285714,1.1,-1,0.5 453 | 28.5714285714286,1.1,-1,0.5 454 | 37.5,1.1,-1,0.5 455 | 50,1.1,-1,0.5 456 | 66.6666666666667,1.1,-1,0.5 457 | 87.5,1.1,-1,0.5 458 | 116.666666666667,1.1,-1,0.5 459 | 150,1.1,-1,0.5 460 | 200,1.1,-1,0.5 461 | 1,1.1,-0.5,-1 
462 | 1.31578947368421,1.1,-0.5,-1 463 | 1.75438596491228,1.1,-0.5,-1 464 | 2.32558139534884,1.1,-0.5,-1 465 | 3.03030303030303,1.1,-0.5,-1 466 | 4,1.1,-0.5,-1 467 | 5.35714285714286,1.1,-0.5,-1 468 | 6.97674418604651,1.1,-0.5,-1 469 | 9.375,1.1,-0.5,-1 470 | 12.5,1.1,-0.5,-1 471 | 16.6666666666667,1.1,-0.5,-1 472 | 21.4285714285714,1.1,-0.5,-1 473 | 28.5714285714286,1.1,-0.5,-1 474 | 37.5,1.1,-0.5,-1 475 | 50,1.1,-0.5,-1 476 | 66.6666666666667,1.1,-0.5,-1 477 | 87.5,1.1,-0.5,-1 478 | 116.666666666667,1.1,-0.5,-1 479 | 150,1.1,-0.5,-1 480 | 200,1.1,-0.5,-1 481 | 1,1.1,-0.5,-0.5 482 | 1.31578947368421,1.1,-0.5,-0.5 483 | 1.75438596491228,1.1,-0.5,-0.5 484 | 2.32558139534884,1.1,-0.5,-0.5 485 | 3.03030303030303,1.1,-0.5,-0.5 486 | 4,1.1,-0.5,-0.5 487 | 5.35714285714286,1.1,-0.5,-0.5 488 | 6.97674418604651,1.1,-0.5,-0.5 489 | 9.375,1.1,-0.5,-0.5 490 | 12.5,1.1,-0.5,-0.5 491 | 16.6666666666667,1.1,-0.5,-0.5 492 | 21.4285714285714,1.1,-0.5,-0.5 493 | 28.5714285714286,1.1,-0.5,-0.5 494 | 37.5,1.1,-0.5,-0.5 495 | 50,1.1,-0.5,-0.5 496 | 66.6666666666667,1.1,-0.5,-0.5 497 | 87.5,1.1,-0.5,-0.5 498 | 116.666666666667,1.1,-0.5,-0.5 499 | 150,1.1,-0.5,-0.5 500 | 200,1.1,-0.5,-0.5 501 | 1,1.1,-0.5,0 502 | 1.31578947368421,1.1,-0.5,0 503 | 1.75438596491228,1.1,-0.5,0 504 | 2.32558139534884,1.1,-0.5,0 505 | 3.03030303030303,1.1,-0.5,0 506 | 4,1.1,-0.5,0 507 | 5.35714285714286,1.1,-0.5,0 508 | 6.97674418604651,1.1,-0.5,0 509 | 9.375,1.1,-0.5,0 510 | 12.5,1.1,-0.5,0 511 | 16.6666666666667,1.1,-0.5,0 512 | 21.4285714285714,1.1,-0.5,0 513 | 28.5714285714286,1.1,-0.5,0 514 | 37.5,1.1,-0.5,0 515 | 50,1.1,-0.5,0 516 | 66.6666666666667,1.1,-0.5,0 517 | 87.5,1.1,-0.5,0 518 | 116.666666666667,1.1,-0.5,0 519 | 150,1.1,-0.5,0 520 | 200,1.1,-0.5,0 521 | 1,1.1,-0.5,0.5 522 | 1.31578947368421,1.1,-0.5,0.5 523 | 1.75438596491228,1.1,-0.5,0.5 524 | 2.32558139534884,1.1,-0.5,0.5 525 | 3.03030303030303,1.1,-0.5,0.5 526 | 4,1.1,-0.5,0.5 527 | 5.35714285714286,1.1,-0.5,0.5 528 | 6.97674418604651,1.1,-0.5,0.5 529 | 9.375,1.1,-0.5,0.5 530 | 12.5,1.1,-0.5,0.5 531 | 16.6666666666667,1.1,-0.5,0.5 532 | 21.4285714285714,1.1,-0.5,0.5 533 | 28.5714285714286,1.1,-0.5,0.5 534 | 37.5,1.1,-0.5,0.5 535 | 50,1.1,-0.5,0.5 536 | 66.6666666666667,1.1,-0.5,0.5 537 | 87.5,1.1,-0.5,0.5 538 | 116.666666666667,1.1,-0.5,0.5 539 | 150,1.1,-0.5,0.5 540 | 200,1.1,-0.5,0.5 541 | 1,1.1,-0.5,1 542 | 1.31578947368421,1.1,-0.5,1 543 | 1.75438596491228,1.1,-0.5,1 544 | 2.32558139534884,1.1,-0.5,1 545 | 3.03030303030303,1.1,-0.5,1 546 | 4,1.1,-0.5,1 547 | 5.35714285714286,1.1,-0.5,1 548 | 6.97674418604651,1.1,-0.5,1 549 | 9.375,1.1,-0.5,1 550 | 12.5,1.1,-0.5,1 551 | 16.6666666666667,1.1,-0.5,1 552 | 21.4285714285714,1.1,-0.5,1 553 | 28.5714285714286,1.1,-0.5,1 554 | 37.5,1.1,-0.5,1 555 | 50,1.1,-0.5,1 556 | 66.6666666666667,1.1,-0.5,1 557 | 87.5,1.1,-0.5,1 558 | 116.666666666667,1.1,-0.5,1 559 | 150,1.1,-0.5,1 560 | 200,1.1,-0.5,1 561 | 1,1.1,0,-1 562 | 1.31578947368421,1.1,0,-1 563 | 1.75438596491228,1.1,0,-1 564 | 2.32558139534884,1.1,0,-1 565 | 3.03030303030303,1.1,0,-1 566 | 4,1.1,0,-1 567 | 5.35714285714286,1.1,0,-1 568 | 6.97674418604651,1.1,0,-1 569 | 9.375,1.1,0,-1 570 | 12.5,1.1,0,-1 571 | 16.6666666666667,1.1,0,-1 572 | 21.4285714285714,1.1,0,-1 573 | 28.5714285714286,1.1,0,-1 574 | 37.5,1.1,0,-1 575 | 50,1.1,0,-1 576 | 66.6666666666667,1.1,0,-1 577 | 87.5,1.1,0,-1 578 | 116.666666666667,1.1,0,-1 579 | 150,1.1,0,-1 580 | 200,1.1,0,-1 581 | 1,1.1,0,-0.5 582 | 1.31578947368421,1.1,0,-0.5 583 | 1.75438596491228,1.1,0,-0.5 584 | 
2.32558139534884,1.1,0,-0.5 585 | 3.03030303030303,1.1,0,-0.5 586 | 4,1.1,0,-0.5 587 | 5.35714285714286,1.1,0,-0.5 588 | 6.97674418604651,1.1,0,-0.5 589 | 9.375,1.1,0,-0.5 590 | 12.5,1.1,0,-0.5 591 | 16.6666666666667,1.1,0,-0.5 592 | 21.4285714285714,1.1,0,-0.5 593 | 28.5714285714286,1.1,0,-0.5 594 | 37.5,1.1,0,-0.5 595 | 50,1.1,0,-0.5 596 | 66.6666666666667,1.1,0,-0.5 597 | 87.5,1.1,0,-0.5 598 | 116.666666666667,1.1,0,-0.5 599 | 150,1.1,0,-0.5 600 | 200,1.1,0,-0.5 601 | 1,1.1,0,0.5 602 | 1.31578947368421,1.1,0,0.5 603 | 1.75438596491228,1.1,0,0.5 604 | 2.32558139534884,1.1,0,0.5 605 | 3.03030303030303,1.1,0,0.5 606 | 4,1.1,0,0.5 607 | 5.35714285714286,1.1,0,0.5 608 | 6.97674418604651,1.1,0,0.5 609 | 9.375,1.1,0,0.5 610 | 12.5,1.1,0,0.5 611 | 16.6666666666667,1.1,0,0.5 612 | 21.4285714285714,1.1,0,0.5 613 | 28.5714285714286,1.1,0,0.5 614 | 37.5,1.1,0,0.5 615 | 50,1.1,0,0.5 616 | 66.6666666666667,1.1,0,0.5 617 | 87.5,1.1,0,0.5 618 | 116.666666666667,1.1,0,0.5 619 | 150,1.1,0,0.5 620 | 200,1.1,0,0.5 621 | 1,1.1,0,1 622 | 1.31578947368421,1.1,0,1 623 | 1.75438596491228,1.1,0,1 624 | 2.32558139534884,1.1,0,1 625 | 3.03030303030303,1.1,0,1 626 | 4,1.1,0,1 627 | 5.35714285714286,1.1,0,1 628 | 6.97674418604651,1.1,0,1 629 | 9.375,1.1,0,1 630 | 12.5,1.1,0,1 631 | 16.6666666666667,1.1,0,1 632 | 21.4285714285714,1.1,0,1 633 | 28.5714285714286,1.1,0,1 634 | 37.5,1.1,0,1 635 | 50,1.1,0,1 636 | 66.6666666666667,1.1,0,1 637 | 87.5,1.1,0,1 638 | 116.666666666667,1.1,0,1 639 | 150,1.1,0,1 640 | 200,1.1,0,1 641 | 1,1.1,0.5,-1 642 | 1.31578947368421,1.1,0.5,-1 643 | 1.75438596491228,1.1,0.5,-1 644 | 2.32558139534884,1.1,0.5,-1 645 | 3.03030303030303,1.1,0.5,-1 646 | 4,1.1,0.5,-1 647 | 5.35714285714286,1.1,0.5,-1 648 | 6.97674418604651,1.1,0.5,-1 649 | 9.375,1.1,0.5,-1 650 | 12.5,1.1,0.5,-1 651 | 16.6666666666667,1.1,0.5,-1 652 | 21.4285714285714,1.1,0.5,-1 653 | 28.5714285714286,1.1,0.5,-1 654 | 37.5,1.1,0.5,-1 655 | 50,1.1,0.5,-1 656 | 66.6666666666667,1.1,0.5,-1 657 | 87.5,1.1,0.5,-1 658 | 116.666666666667,1.1,0.5,-1 659 | 150,1.1,0.5,-1 660 | 200,1.1,0.5,-1 661 | 1,1.1,0.5,-0.5 662 | 1.31578947368421,1.1,0.5,-0.5 663 | 1.75438596491228,1.1,0.5,-0.5 664 | 2.32558139534884,1.1,0.5,-0.5 665 | 3.03030303030303,1.1,0.5,-0.5 666 | 4,1.1,0.5,-0.5 667 | 5.35714285714286,1.1,0.5,-0.5 668 | 6.97674418604651,1.1,0.5,-0.5 669 | 9.375,1.1,0.5,-0.5 670 | 12.5,1.1,0.5,-0.5 671 | 16.6666666666667,1.1,0.5,-0.5 672 | 21.4285714285714,1.1,0.5,-0.5 673 | 28.5714285714286,1.1,0.5,-0.5 674 | 37.5,1.1,0.5,-0.5 675 | 50,1.1,0.5,-0.5 676 | 66.6666666666667,1.1,0.5,-0.5 677 | 87.5,1.1,0.5,-0.5 678 | 116.666666666667,1.1,0.5,-0.5 679 | 150,1.1,0.5,-0.5 680 | 200,1.1,0.5,-0.5 681 | 1,1.1,0.5,0 682 | 1.31578947368421,1.1,0.5,0 683 | 1.75438596491228,1.1,0.5,0 684 | 2.32558139534884,1.1,0.5,0 685 | 3.03030303030303,1.1,0.5,0 686 | 4,1.1,0.5,0 687 | 5.35714285714286,1.1,0.5,0 688 | 6.97674418604651,1.1,0.5,0 689 | 9.375,1.1,0.5,0 690 | 12.5,1.1,0.5,0 691 | 16.6666666666667,1.1,0.5,0 692 | 21.4285714285714,1.1,0.5,0 693 | 28.5714285714286,1.1,0.5,0 694 | 37.5,1.1,0.5,0 695 | 50,1.1,0.5,0 696 | 66.6666666666667,1.1,0.5,0 697 | 87.5,1.1,0.5,0 698 | 116.666666666667,1.1,0.5,0 699 | 150,1.1,0.5,0 700 | 200,1.1,0.5,0 701 | 1,1.1,0.5,0.5 702 | 1.31578947368421,1.1,0.5,0.5 703 | 1.75438596491228,1.1,0.5,0.5 704 | 2.32558139534884,1.1,0.5,0.5 705 | 3.03030303030303,1.1,0.5,0.5 706 | 4,1.1,0.5,0.5 707 | 5.35714285714286,1.1,0.5,0.5 708 | 6.97674418604651,1.1,0.5,0.5 709 | 9.375,1.1,0.5,0.5 710 | 12.5,1.1,0.5,0.5 711 | 
16.6666666666667,1.1,0.5,0.5 712 | 21.4285714285714,1.1,0.5,0.5 713 | 28.5714285714286,1.1,0.5,0.5 714 | 37.5,1.1,0.5,0.5 715 | 50,1.1,0.5,0.5 716 | 66.6666666666667,1.1,0.5,0.5 717 | 87.5,1.1,0.5,0.5 718 | 116.666666666667,1.1,0.5,0.5 719 | 150,1.1,0.5,0.5 720 | 200,1.1,0.5,0.5 721 | 1,1.1,0.5,1 722 | 1.31578947368421,1.1,0.5,1 723 | 1.75438596491228,1.1,0.5,1 724 | 2.32558139534884,1.1,0.5,1 725 | 3.03030303030303,1.1,0.5,1 726 | 4,1.1,0.5,1 727 | 5.35714285714286,1.1,0.5,1 728 | 6.97674418604651,1.1,0.5,1 729 | 9.375,1.1,0.5,1 730 | 12.5,1.1,0.5,1 731 | 16.6666666666667,1.1,0.5,1 732 | 21.4285714285714,1.1,0.5,1 733 | 28.5714285714286,1.1,0.5,1 734 | 37.5,1.1,0.5,1 735 | 50,1.1,0.5,1 736 | 66.6666666666667,1.1,0.5,1 737 | 87.5,1.1,0.5,1 738 | 116.666666666667,1.1,0.5,1 739 | 150,1.1,0.5,1 740 | 200,1.1,0.5,1 741 | 1,1.1,1,-0.5 742 | 1.31578947368421,1.1,1,-0.5 743 | 1.75438596491228,1.1,1,-0.5 744 | 2.32558139534884,1.1,1,-0.5 745 | 3.03030303030303,1.1,1,-0.5 746 | 4,1.1,1,-0.5 747 | 5.35714285714286,1.1,1,-0.5 748 | 6.97674418604651,1.1,1,-0.5 749 | 9.375,1.1,1,-0.5 750 | 12.5,1.1,1,-0.5 751 | 16.6666666666667,1.1,1,-0.5 752 | 21.4285714285714,1.1,1,-0.5 753 | 28.5714285714286,1.1,1,-0.5 754 | 37.5,1.1,1,-0.5 755 | 50,1.1,1,-0.5 756 | 66.6666666666667,1.1,1,-0.5 757 | 87.5,1.1,1,-0.5 758 | 116.666666666667,1.1,1,-0.5 759 | 150,1.1,1,-0.5 760 | 200,1.1,1,-0.5 761 | 1,1.1,1,0 762 | 1.31578947368421,1.1,1,0 763 | 1.75438596491228,1.1,1,0 764 | 2.32558139534884,1.1,1,0 765 | 3.03030303030303,1.1,1,0 766 | 4,1.1,1,0 767 | 5.35714285714286,1.1,1,0 768 | 6.97674418604651,1.1,1,0 769 | 9.375,1.1,1,0 770 | 12.5,1.1,1,0 771 | 16.6666666666667,1.1,1,0 772 | 21.4285714285714,1.1,1,0 773 | 28.5714285714286,1.1,1,0 774 | 37.5,1.1,1,0 775 | 50,1.1,1,0 776 | 66.6666666666667,1.1,1,0 777 | 87.5,1.1,1,0 778 | 116.666666666667,1.1,1,0 779 | 150,1.1,1,0 780 | 200,1.1,1,0 781 | 1,1.1,1,0.5 782 | 1.31578947368421,1.1,1,0.5 783 | 1.75438596491228,1.1,1,0.5 784 | 2.32558139534884,1.1,1,0.5 785 | 3.03030303030303,1.1,1,0.5 786 | 4,1.1,1,0.5 787 | 5.35714285714286,1.1,1,0.5 788 | 6.97674418604651,1.1,1,0.5 789 | 9.375,1.1,1,0.5 790 | 12.5,1.1,1,0.5 791 | 16.6666666666667,1.1,1,0.5 792 | 21.4285714285714,1.1,1,0.5 793 | 28.5714285714286,1.1,1,0.5 794 | 37.5,1.1,1,0.5 795 | 50,1.1,1,0.5 796 | 66.6666666666667,1.1,1,0.5 797 | 87.5,1.1,1,0.5 798 | 116.666666666667,1.1,1,0.5 799 | 150,1.1,1,0.5 800 | 200,1.1,1,0.5 801 | -------------------------------------------------------------------------------- /dataset/Input_Yo20op40_ana_50.csv: -------------------------------------------------------------------------------- 1 | 1,0.9,-0.5,-0.5 2 | 1.31578947368421,0.9,-0.5,-0.5 3 | 1.75438596491228,0.9,-0.5,-0.5 4 | 2.32558139534884,0.9,-0.5,-0.5 5 | 3.03030303030303,0.9,-0.5,-0.5 6 | 4,0.9,-0.5,-0.5 7 | 5.35714285714286,0.9,-0.5,-0.5 8 | 6.97674418604651,0.9,-0.5,-0.5 9 | 9.375,0.9,-0.5,-0.5 10 | 12.5,0.9,-0.5,-0.5 11 | 16.6666666666667,0.9,-0.5,-0.5 12 | 21.4285714285714,0.9,-0.5,-0.5 13 | 28.5714285714286,0.9,-0.5,-0.5 14 | 37.5,0.9,-0.5,-0.5 15 | 50,0.9,-0.5,-0.5 16 | 66.6666666666667,0.9,-0.5,-0.5 17 | 87.5,0.9,-0.5,-0.5 18 | 116.666666666667,0.9,-0.5,-0.5 19 | 150,0.9,-0.5,-0.5 20 | 200,0.9,-0.5,-0.5 21 | 1,0.9,-0.5,0 22 | 1.31578947368421,0.9,-0.5,0 23 | 1.75438596491228,0.9,-0.5,0 24 | 2.32558139534884,0.9,-0.5,0 25 | 3.03030303030303,0.9,-0.5,0 26 | 4,0.9,-0.5,0 27 | 5.35714285714286,0.9,-0.5,0 28 | 6.97674418604651,0.9,-0.5,0 29 | 9.375,0.9,-0.5,0 30 | 12.5,0.9,-0.5,0 31 | 16.6666666666667,0.9,-0.5,0 32 | 
21.4285714285714,0.9,-0.5,0 33 | 28.5714285714286,0.9,-0.5,0 34 | 37.5,0.9,-0.5,0 35 | 50,0.9,-0.5,0 36 | 66.6666666666667,0.9,-0.5,0 37 | 87.5,0.9,-0.5,0 38 | 116.666666666667,0.9,-0.5,0 39 | 150,0.9,-0.5,0 40 | 200,0.9,-0.5,0 41 | 1,0.9,-0.5,0.5 42 | 1.31578947368421,0.9,-0.5,0.5 43 | 1.75438596491228,0.9,-0.5,0.5 44 | 2.32558139534884,0.9,-0.5,0.5 45 | 3.03030303030303,0.9,-0.5,0.5 46 | 4,0.9,-0.5,0.5 47 | 5.35714285714286,0.9,-0.5,0.5 48 | 6.97674418604651,0.9,-0.5,0.5 49 | 9.375,0.9,-0.5,0.5 50 | 12.5,0.9,-0.5,0.5 51 | 16.6666666666667,0.9,-0.5,0.5 52 | 21.4285714285714,0.9,-0.5,0.5 53 | 28.5714285714286,0.9,-0.5,0.5 54 | 37.5,0.9,-0.5,0.5 55 | 50,0.9,-0.5,0.5 56 | 66.6666666666667,0.9,-0.5,0.5 57 | 87.5,0.9,-0.5,0.5 58 | 116.666666666667,0.9,-0.5,0.5 59 | 150,0.9,-0.5,0.5 60 | 200,0.9,-0.5,0.5 61 | 1,0.9,0,-0.5 62 | 1.31578947368421,0.9,0,-0.5 63 | 1.75438596491228,0.9,0,-0.5 64 | 2.32558139534884,0.9,0,-0.5 65 | 3.03030303030303,0.9,0,-0.5 66 | 4,0.9,0,-0.5 67 | 5.35714285714286,0.9,0,-0.5 68 | 6.97674418604651,0.9,0,-0.5 69 | 9.375,0.9,0,-0.5 70 | 12.5,0.9,0,-0.5 71 | 16.6666666666667,0.9,0,-0.5 72 | 21.4285714285714,0.9,0,-0.5 73 | 28.5714285714286,0.9,0,-0.5 74 | 37.5,0.9,0,-0.5 75 | 50,0.9,0,-0.5 76 | 66.6666666666667,0.9,0,-0.5 77 | 87.5,0.9,0,-0.5 78 | 116.666666666667,0.9,0,-0.5 79 | 150,0.9,0,-0.5 80 | 200,0.9,0,-0.5 81 | 1,0.9,0,0.5 82 | 1.31578947368421,0.9,0,0.5 83 | 1.75438596491228,0.9,0,0.5 84 | 2.32558139534884,0.9,0,0.5 85 | 3.03030303030303,0.9,0,0.5 86 | 4,0.9,0,0.5 87 | 5.35714285714286,0.9,0,0.5 88 | 6.97674418604651,0.9,0,0.5 89 | 9.375,0.9,0,0.5 90 | 12.5,0.9,0,0.5 91 | 16.6666666666667,0.9,0,0.5 92 | 21.4285714285714,0.9,0,0.5 93 | 28.5714285714286,0.9,0,0.5 94 | 37.5,0.9,0,0.5 95 | 50,0.9,0,0.5 96 | 66.6666666666667,0.9,0,0.5 97 | 87.5,0.9,0,0.5 98 | 116.666666666667,0.9,0,0.5 99 | 150,0.9,0,0.5 100 | 200,0.9,0,0.5 101 | 1,0.9,0.5,-0.5 102 | 1.31578947368421,0.9,0.5,-0.5 103 | 1.75438596491228,0.9,0.5,-0.5 104 | 2.32558139534884,0.9,0.5,-0.5 105 | 3.03030303030303,0.9,0.5,-0.5 106 | 4,0.9,0.5,-0.5 107 | 5.35714285714286,0.9,0.5,-0.5 108 | 6.97674418604651,0.9,0.5,-0.5 109 | 9.375,0.9,0.5,-0.5 110 | 12.5,0.9,0.5,-0.5 111 | 16.6666666666667,0.9,0.5,-0.5 112 | 21.4285714285714,0.9,0.5,-0.5 113 | 28.5714285714286,0.9,0.5,-0.5 114 | 37.5,0.9,0.5,-0.5 115 | 50,0.9,0.5,-0.5 116 | 66.6666666666667,0.9,0.5,-0.5 117 | 87.5,0.9,0.5,-0.5 118 | 116.666666666667,0.9,0.5,-0.5 119 | 150,0.9,0.5,-0.5 120 | 200,0.9,0.5,-0.5 121 | 1,0.9,0.5,0 122 | 1.31578947368421,0.9,0.5,0 123 | 1.75438596491228,0.9,0.5,0 124 | 2.32558139534884,0.9,0.5,0 125 | 3.03030303030303,0.9,0.5,0 126 | 4,0.9,0.5,0 127 | 5.35714285714286,0.9,0.5,0 128 | 6.97674418604651,0.9,0.5,0 129 | 9.375,0.9,0.5,0 130 | 12.5,0.9,0.5,0 131 | 16.6666666666667,0.9,0.5,0 132 | 21.4285714285714,0.9,0.5,0 133 | 28.5714285714286,0.9,0.5,0 134 | 37.5,0.9,0.5,0 135 | 50,0.9,0.5,0 136 | 66.6666666666667,0.9,0.5,0 137 | 87.5,0.9,0.5,0 138 | 116.666666666667,0.9,0.5,0 139 | 150,0.9,0.5,0 140 | 200,0.9,0.5,0 141 | 1,0.9,0.5,0.5 142 | 1.31578947368421,0.9,0.5,0.5 143 | 1.75438596491228,0.9,0.5,0.5 144 | 2.32558139534884,0.9,0.5,0.5 145 | 3.03030303030303,0.9,0.5,0.5 146 | 4,0.9,0.5,0.5 147 | 5.35714285714286,0.9,0.5,0.5 148 | 6.97674418604651,0.9,0.5,0.5 149 | 9.375,0.9,0.5,0.5 150 | 12.5,0.9,0.5,0.5 151 | 16.6666666666667,0.9,0.5,0.5 152 | 21.4285714285714,0.9,0.5,0.5 153 | 28.5714285714286,0.9,0.5,0.5 154 | 37.5,0.9,0.5,0.5 155 | 50,0.9,0.5,0.5 156 | 66.6666666666667,0.9,0.5,0.5 157 | 87.5,0.9,0.5,0.5 158 | 
116.666666666667,0.9,0.5,0.5 159 | 150,0.9,0.5,0.5 160 | 200,0.9,0.5,0.5 161 | 1,1,-1,0 162 | 1.31578947368421,1,-1,0 163 | 1.75438596491228,1,-1,0 164 | 2.32558139534884,1,-1,0 165 | 3.03030303030303,1,-1,0 166 | 4,1,-1,0 167 | 5.35714285714286,1,-1,0 168 | 6.97674418604651,1,-1,0 169 | 9.375,1,-1,0 170 | 12.5,1,-1,0 171 | 16.6666666666667,1,-1,0 172 | 21.4285714285714,1,-1,0 173 | 28.5714285714286,1,-1,0 174 | 37.5,1,-1,0 175 | 50,1,-1,0 176 | 66.6666666666667,1,-1,0 177 | 87.5,1,-1,0 178 | 116.666666666667,1,-1,0 179 | 150,1,-1,0 180 | 200,1,-1,0 181 | 1,1,-0.5,-0.5 182 | 1.31578947368421,1,-0.5,-0.5 183 | 1.75438596491228,1,-0.5,-0.5 184 | 2.32558139534884,1,-0.5,-0.5 185 | 3.03030303030303,1,-0.5,-0.5 186 | 4,1,-0.5,-0.5 187 | 5.35714285714286,1,-0.5,-0.5 188 | 6.97674418604651,1,-0.5,-0.5 189 | 9.375,1,-0.5,-0.5 190 | 12.5,1,-0.5,-0.5 191 | 16.6666666666667,1,-0.5,-0.5 192 | 21.4285714285714,1,-0.5,-0.5 193 | 28.5714285714286,1,-0.5,-0.5 194 | 37.5,1,-0.5,-0.5 195 | 50,1,-0.5,-0.5 196 | 66.6666666666667,1,-0.5,-0.5 197 | 87.5,1,-0.5,-0.5 198 | 116.666666666667,1,-0.5,-0.5 199 | 150,1,-0.5,-0.5 200 | 200,1,-0.5,-0.5 201 | 1,1,-0.5,0 202 | 1.31578947368421,1,-0.5,0 203 | 1.75438596491228,1,-0.5,0 204 | 2.32558139534884,1,-0.5,0 205 | 3.03030303030303,1,-0.5,0 206 | 4,1,-0.5,0 207 | 5.35714285714286,1,-0.5,0 208 | 6.97674418604651,1,-0.5,0 209 | 9.375,1,-0.5,0 210 | 12.5,1,-0.5,0 211 | 16.6666666666667,1,-0.5,0 212 | 21.4285714285714,1,-0.5,0 213 | 28.5714285714286,1,-0.5,0 214 | 37.5,1,-0.5,0 215 | 50,1,-0.5,0 216 | 66.6666666666667,1,-0.5,0 217 | 87.5,1,-0.5,0 218 | 116.666666666667,1,-0.5,0 219 | 150,1,-0.5,0 220 | 200,1,-0.5,0 221 | 1,1,-0.5,0.5 222 | 1.31578947368421,1,-0.5,0.5 223 | 1.75438596491228,1,-0.5,0.5 224 | 2.32558139534884,1,-0.5,0.5 225 | 3.03030303030303,1,-0.5,0.5 226 | 4,1,-0.5,0.5 227 | 5.35714285714286,1,-0.5,0.5 228 | 6.97674418604651,1,-0.5,0.5 229 | 9.375,1,-0.5,0.5 230 | 12.5,1,-0.5,0.5 231 | 16.6666666666667,1,-0.5,0.5 232 | 21.4285714285714,1,-0.5,0.5 233 | 28.5714285714286,1,-0.5,0.5 234 | 37.5,1,-0.5,0.5 235 | 50,1,-0.5,0.5 236 | 66.6666666666667,1,-0.5,0.5 237 | 87.5,1,-0.5,0.5 238 | 116.666666666667,1,-0.5,0.5 239 | 150,1,-0.5,0.5 240 | 200,1,-0.5,0.5 241 | 1,1,0,-1 242 | 1.31578947368421,1,0,-1 243 | 1.75438596491228,1,0,-1 244 | 2.32558139534884,1,0,-1 245 | 3.03030303030303,1,0,-1 246 | 4,1,0,-1 247 | 5.35714285714286,1,0,-1 248 | 6.97674418604651,1,0,-1 249 | 9.375,1,0,-1 250 | 12.5,1,0,-1 251 | 16.6666666666667,1,0,-1 252 | 21.4285714285714,1,0,-1 253 | 28.5714285714286,1,0,-1 254 | 37.5,1,0,-1 255 | 50,1,0,-1 256 | 66.6666666666667,1,0,-1 257 | 87.5,1,0,-1 258 | 116.666666666667,1,0,-1 259 | 150,1,0,-1 260 | 200,1,0,-1 261 | 1,1,0,-0.5 262 | 1.31578947368421,1,0,-0.5 263 | 1.75438596491228,1,0,-0.5 264 | 2.32558139534884,1,0,-0.5 265 | 3.03030303030303,1,0,-0.5 266 | 4,1,0,-0.5 267 | 5.35714285714286,1,0,-0.5 268 | 6.97674418604651,1,0,-0.5 269 | 9.375,1,0,-0.5 270 | 12.5,1,0,-0.5 271 | 16.6666666666667,1,0,-0.5 272 | 21.4285714285714,1,0,-0.5 273 | 28.5714285714286,1,0,-0.5 274 | 37.5,1,0,-0.5 275 | 50,1,0,-0.5 276 | 66.6666666666667,1,0,-0.5 277 | 87.5,1,0,-0.5 278 | 116.666666666667,1,0,-0.5 279 | 150,1,0,-0.5 280 | 200,1,0,-0.5 281 | 1,1,0,0.5 282 | 1.31578947368421,1,0,0.5 283 | 1.75438596491228,1,0,0.5 284 | 2.32558139534884,1,0,0.5 285 | 3.03030303030303,1,0,0.5 286 | 4,1,0,0.5 287 | 5.35714285714286,1,0,0.5 288 | 6.97674418604651,1,0,0.5 289 | 9.375,1,0,0.5 290 | 12.5,1,0,0.5 291 | 16.6666666666667,1,0,0.5 292 | 21.4285714285714,1,0,0.5 293 
| 28.5714285714286,1,0,0.5 294 | 37.5,1,0,0.5 295 | 50,1,0,0.5 296 | 66.6666666666667,1,0,0.5 297 | 87.5,1,0,0.5 298 | 116.666666666667,1,0,0.5 299 | 150,1,0,0.5 300 | 200,1,0,0.5 301 | 1,1,0,1 302 | 1.31578947368421,1,0,1 303 | 1.75438596491228,1,0,1 304 | 2.32558139534884,1,0,1 305 | 3.03030303030303,1,0,1 306 | 4,1,0,1 307 | 5.35714285714286,1,0,1 308 | 6.97674418604651,1,0,1 309 | 9.375,1,0,1 310 | 12.5,1,0,1 311 | 16.6666666666667,1,0,1 312 | 21.4285714285714,1,0,1 313 | 28.5714285714286,1,0,1 314 | 37.5,1,0,1 315 | 50,1,0,1 316 | 66.6666666666667,1,0,1 317 | 87.5,1,0,1 318 | 116.666666666667,1,0,1 319 | 150,1,0,1 320 | 200,1,0,1 321 | 1,1,0.5,-0.5 322 | 1.31578947368421,1,0.5,-0.5 323 | 1.75438596491228,1,0.5,-0.5 324 | 2.32558139534884,1,0.5,-0.5 325 | 3.03030303030303,1,0.5,-0.5 326 | 4,1,0.5,-0.5 327 | 5.35714285714286,1,0.5,-0.5 328 | 6.97674418604651,1,0.5,-0.5 329 | 9.375,1,0.5,-0.5 330 | 12.5,1,0.5,-0.5 331 | 16.6666666666667,1,0.5,-0.5 332 | 21.4285714285714,1,0.5,-0.5 333 | 28.5714285714286,1,0.5,-0.5 334 | 37.5,1,0.5,-0.5 335 | 50,1,0.5,-0.5 336 | 66.6666666666667,1,0.5,-0.5 337 | 87.5,1,0.5,-0.5 338 | 116.666666666667,1,0.5,-0.5 339 | 150,1,0.5,-0.5 340 | 200,1,0.5,-0.5 341 | 1,1,0.5,0 342 | 1.31578947368421,1,0.5,0 343 | 1.75438596491228,1,0.5,0 344 | 2.32558139534884,1,0.5,0 345 | 3.03030303030303,1,0.5,0 346 | 4,1,0.5,0 347 | 5.35714285714286,1,0.5,0 348 | 6.97674418604651,1,0.5,0 349 | 9.375,1,0.5,0 350 | 12.5,1,0.5,0 351 | 16.6666666666667,1,0.5,0 352 | 21.4285714285714,1,0.5,0 353 | 28.5714285714286,1,0.5,0 354 | 37.5,1,0.5,0 355 | 50,1,0.5,0 356 | 66.6666666666667,1,0.5,0 357 | 87.5,1,0.5,0 358 | 116.666666666667,1,0.5,0 359 | 150,1,0.5,0 360 | 200,1,0.5,0 361 | 1,1,0.5,0.5 362 | 1.31578947368421,1,0.5,0.5 363 | 1.75438596491228,1,0.5,0.5 364 | 2.32558139534884,1,0.5,0.5 365 | 3.03030303030303,1,0.5,0.5 366 | 4,1,0.5,0.5 367 | 5.35714285714286,1,0.5,0.5 368 | 6.97674418604651,1,0.5,0.5 369 | 9.375,1,0.5,0.5 370 | 12.5,1,0.5,0.5 371 | 16.6666666666667,1,0.5,0.5 372 | 21.4285714285714,1,0.5,0.5 373 | 28.5714285714286,1,0.5,0.5 374 | 37.5,1,0.5,0.5 375 | 50,1,0.5,0.5 376 | 66.6666666666667,1,0.5,0.5 377 | 87.5,1,0.5,0.5 378 | 116.666666666667,1,0.5,0.5 379 | 150,1,0.5,0.5 380 | 200,1,0.5,0.5 381 | 1,1,1,0 382 | 1.31578947368421,1,1,0 383 | 1.75438596491228,1,1,0 384 | 2.32558139534884,1,1,0 385 | 3.03030303030303,1,1,0 386 | 4,1,1,0 387 | 5.35714285714286,1,1,0 388 | 6.97674418604651,1,1,0 389 | 9.375,1,1,0 390 | 12.5,1,1,0 391 | 16.6666666666667,1,1,0 392 | 21.4285714285714,1,1,0 393 | 28.5714285714286,1,1,0 394 | 37.5,1,1,0 395 | 50,1,1,0 396 | 66.6666666666667,1,1,0 397 | 87.5,1,1,0 398 | 116.666666666667,1,1,0 399 | 150,1,1,0 400 | 200,1,1,0 401 | 1,1.1,-1,-0.5 402 | 1.31578947368421,1.1,-1,-0.5 403 | 1.75438596491228,1.1,-1,-0.5 404 | 2.32558139534884,1.1,-1,-0.5 405 | 3.03030303030303,1.1,-1,-0.5 406 | 4,1.1,-1,-0.5 407 | 5.35714285714286,1.1,-1,-0.5 408 | 6.97674418604651,1.1,-1,-0.5 409 | 9.375,1.1,-1,-0.5 410 | 12.5,1.1,-1,-0.5 411 | 16.6666666666667,1.1,-1,-0.5 412 | 21.4285714285714,1.1,-1,-0.5 413 | 28.5714285714286,1.1,-1,-0.5 414 | 37.5,1.1,-1,-0.5 415 | 50,1.1,-1,-0.5 416 | 66.6666666666667,1.1,-1,-0.5 417 | 87.5,1.1,-1,-0.5 418 | 116.666666666667,1.1,-1,-0.5 419 | 150,1.1,-1,-0.5 420 | 200,1.1,-1,-0.5 421 | 1,1.1,-1,0 422 | 1.31578947368421,1.1,-1,0 423 | 1.75438596491228,1.1,-1,0 424 | 2.32558139534884,1.1,-1,0 425 | 3.03030303030303,1.1,-1,0 426 | 4,1.1,-1,0 427 | 5.35714285714286,1.1,-1,0 428 | 6.97674418604651,1.1,-1,0 429 | 9.375,1.1,-1,0 430 | 
12.5,1.1,-1,0 431 | 16.6666666666667,1.1,-1,0 432 | 21.4285714285714,1.1,-1,0 433 | 28.5714285714286,1.1,-1,0 434 | 37.5,1.1,-1,0 435 | 50,1.1,-1,0 436 | 66.6666666666667,1.1,-1,0 437 | 87.5,1.1,-1,0 438 | 116.666666666667,1.1,-1,0 439 | 150,1.1,-1,0 440 | 200,1.1,-1,0 441 | 1,1.1,-1,0.5 442 | 1.31578947368421,1.1,-1,0.5 443 | 1.75438596491228,1.1,-1,0.5 444 | 2.32558139534884,1.1,-1,0.5 445 | 3.03030303030303,1.1,-1,0.5 446 | 4,1.1,-1,0.5 447 | 5.35714285714286,1.1,-1,0.5 448 | 6.97674418604651,1.1,-1,0.5 449 | 9.375,1.1,-1,0.5 450 | 12.5,1.1,-1,0.5 451 | 16.6666666666667,1.1,-1,0.5 452 | 21.4285714285714,1.1,-1,0.5 453 | 28.5714285714286,1.1,-1,0.5 454 | 37.5,1.1,-1,0.5 455 | 50,1.1,-1,0.5 456 | 66.6666666666667,1.1,-1,0.5 457 | 87.5,1.1,-1,0.5 458 | 116.666666666667,1.1,-1,0.5 459 | 150,1.1,-1,0.5 460 | 200,1.1,-1,0.5 461 | 1,1.1,-0.5,-1 462 | 1.31578947368421,1.1,-0.5,-1 463 | 1.75438596491228,1.1,-0.5,-1 464 | 2.32558139534884,1.1,-0.5,-1 465 | 3.03030303030303,1.1,-0.5,-1 466 | 4,1.1,-0.5,-1 467 | 5.35714285714286,1.1,-0.5,-1 468 | 6.97674418604651,1.1,-0.5,-1 469 | 9.375,1.1,-0.5,-1 470 | 12.5,1.1,-0.5,-1 471 | 16.6666666666667,1.1,-0.5,-1 472 | 21.4285714285714,1.1,-0.5,-1 473 | 28.5714285714286,1.1,-0.5,-1 474 | 37.5,1.1,-0.5,-1 475 | 50,1.1,-0.5,-1 476 | 66.6666666666667,1.1,-0.5,-1 477 | 87.5,1.1,-0.5,-1 478 | 116.666666666667,1.1,-0.5,-1 479 | 150,1.1,-0.5,-1 480 | 200,1.1,-0.5,-1 481 | 1,1.1,-0.5,-0.5 482 | 1.31578947368421,1.1,-0.5,-0.5 483 | 1.75438596491228,1.1,-0.5,-0.5 484 | 2.32558139534884,1.1,-0.5,-0.5 485 | 3.03030303030303,1.1,-0.5,-0.5 486 | 4,1.1,-0.5,-0.5 487 | 5.35714285714286,1.1,-0.5,-0.5 488 | 6.97674418604651,1.1,-0.5,-0.5 489 | 9.375,1.1,-0.5,-0.5 490 | 12.5,1.1,-0.5,-0.5 491 | 16.6666666666667,1.1,-0.5,-0.5 492 | 21.4285714285714,1.1,-0.5,-0.5 493 | 28.5714285714286,1.1,-0.5,-0.5 494 | 37.5,1.1,-0.5,-0.5 495 | 50,1.1,-0.5,-0.5 496 | 66.6666666666667,1.1,-0.5,-0.5 497 | 87.5,1.1,-0.5,-0.5 498 | 116.666666666667,1.1,-0.5,-0.5 499 | 150,1.1,-0.5,-0.5 500 | 200,1.1,-0.5,-0.5 501 | 1,1.1,-0.5,0 502 | 1.31578947368421,1.1,-0.5,0 503 | 1.75438596491228,1.1,-0.5,0 504 | 2.32558139534884,1.1,-0.5,0 505 | 3.03030303030303,1.1,-0.5,0 506 | 4,1.1,-0.5,0 507 | 5.35714285714286,1.1,-0.5,0 508 | 6.97674418604651,1.1,-0.5,0 509 | 9.375,1.1,-0.5,0 510 | 12.5,1.1,-0.5,0 511 | 16.6666666666667,1.1,-0.5,0 512 | 21.4285714285714,1.1,-0.5,0 513 | 28.5714285714286,1.1,-0.5,0 514 | 37.5,1.1,-0.5,0 515 | 50,1.1,-0.5,0 516 | 66.6666666666667,1.1,-0.5,0 517 | 87.5,1.1,-0.5,0 518 | 116.666666666667,1.1,-0.5,0 519 | 150,1.1,-0.5,0 520 | 200,1.1,-0.5,0 521 | 1,1.1,-0.5,0.5 522 | 1.31578947368421,1.1,-0.5,0.5 523 | 1.75438596491228,1.1,-0.5,0.5 524 | 2.32558139534884,1.1,-0.5,0.5 525 | 3.03030303030303,1.1,-0.5,0.5 526 | 4,1.1,-0.5,0.5 527 | 5.35714285714286,1.1,-0.5,0.5 528 | 6.97674418604651,1.1,-0.5,0.5 529 | 9.375,1.1,-0.5,0.5 530 | 12.5,1.1,-0.5,0.5 531 | 16.6666666666667,1.1,-0.5,0.5 532 | 21.4285714285714,1.1,-0.5,0.5 533 | 28.5714285714286,1.1,-0.5,0.5 534 | 37.5,1.1,-0.5,0.5 535 | 50,1.1,-0.5,0.5 536 | 66.6666666666667,1.1,-0.5,0.5 537 | 87.5,1.1,-0.5,0.5 538 | 116.666666666667,1.1,-0.5,0.5 539 | 150,1.1,-0.5,0.5 540 | 200,1.1,-0.5,0.5 541 | 1,1.1,-0.5,1 542 | 1.31578947368421,1.1,-0.5,1 543 | 1.75438596491228,1.1,-0.5,1 544 | 2.32558139534884,1.1,-0.5,1 545 | 3.03030303030303,1.1,-0.5,1 546 | 4,1.1,-0.5,1 547 | 5.35714285714286,1.1,-0.5,1 548 | 6.97674418604651,1.1,-0.5,1 549 | 9.375,1.1,-0.5,1 550 | 12.5,1.1,-0.5,1 551 | 16.6666666666667,1.1,-0.5,1 552 | 
21.4285714285714,1.1,-0.5,1 553 | 28.5714285714286,1.1,-0.5,1 554 | 37.5,1.1,-0.5,1 555 | 50,1.1,-0.5,1 556 | 66.6666666666667,1.1,-0.5,1 557 | 87.5,1.1,-0.5,1 558 | 116.666666666667,1.1,-0.5,1 559 | 150,1.1,-0.5,1 560 | 200,1.1,-0.5,1 561 | 1,1.1,0,-1 562 | 1.31578947368421,1.1,0,-1 563 | 1.75438596491228,1.1,0,-1 564 | 2.32558139534884,1.1,0,-1 565 | 3.03030303030303,1.1,0,-1 566 | 4,1.1,0,-1 567 | 5.35714285714286,1.1,0,-1 568 | 6.97674418604651,1.1,0,-1 569 | 9.375,1.1,0,-1 570 | 12.5,1.1,0,-1 571 | 16.6666666666667,1.1,0,-1 572 | 21.4285714285714,1.1,0,-1 573 | 28.5714285714286,1.1,0,-1 574 | 37.5,1.1,0,-1 575 | 50,1.1,0,-1 576 | 66.6666666666667,1.1,0,-1 577 | 87.5,1.1,0,-1 578 | 116.666666666667,1.1,0,-1 579 | 150,1.1,0,-1 580 | 200,1.1,0,-1 581 | 1,1.1,0,-0.5 582 | 1.31578947368421,1.1,0,-0.5 583 | 1.75438596491228,1.1,0,-0.5 584 | 2.32558139534884,1.1,0,-0.5 585 | 3.03030303030303,1.1,0,-0.5 586 | 4,1.1,0,-0.5 587 | 5.35714285714286,1.1,0,-0.5 588 | 6.97674418604651,1.1,0,-0.5 589 | 9.375,1.1,0,-0.5 590 | 12.5,1.1,0,-0.5 591 | 16.6666666666667,1.1,0,-0.5 592 | 21.4285714285714,1.1,0,-0.5 593 | 28.5714285714286,1.1,0,-0.5 594 | 37.5,1.1,0,-0.5 595 | 50,1.1,0,-0.5 596 | 66.6666666666667,1.1,0,-0.5 597 | 87.5,1.1,0,-0.5 598 | 116.666666666667,1.1,0,-0.5 599 | 150,1.1,0,-0.5 600 | 200,1.1,0,-0.5 601 | 1,1.1,0,0.5 602 | 1.31578947368421,1.1,0,0.5 603 | 1.75438596491228,1.1,0,0.5 604 | 2.32558139534884,1.1,0,0.5 605 | 3.03030303030303,1.1,0,0.5 606 | 4,1.1,0,0.5 607 | 5.35714285714286,1.1,0,0.5 608 | 6.97674418604651,1.1,0,0.5 609 | 9.375,1.1,0,0.5 610 | 12.5,1.1,0,0.5 611 | 16.6666666666667,1.1,0,0.5 612 | 21.4285714285714,1.1,0,0.5 613 | 28.5714285714286,1.1,0,0.5 614 | 37.5,1.1,0,0.5 615 | 50,1.1,0,0.5 616 | 66.6666666666667,1.1,0,0.5 617 | 87.5,1.1,0,0.5 618 | 116.666666666667,1.1,0,0.5 619 | 150,1.1,0,0.5 620 | 200,1.1,0,0.5 621 | 1,1.1,0,1 622 | 1.31578947368421,1.1,0,1 623 | 1.75438596491228,1.1,0,1 624 | 2.32558139534884,1.1,0,1 625 | 3.03030303030303,1.1,0,1 626 | 4,1.1,0,1 627 | 5.35714285714286,1.1,0,1 628 | 6.97674418604651,1.1,0,1 629 | 9.375,1.1,0,1 630 | 12.5,1.1,0,1 631 | 16.6666666666667,1.1,0,1 632 | 21.4285714285714,1.1,0,1 633 | 28.5714285714286,1.1,0,1 634 | 37.5,1.1,0,1 635 | 50,1.1,0,1 636 | 66.6666666666667,1.1,0,1 637 | 87.5,1.1,0,1 638 | 116.666666666667,1.1,0,1 639 | 150,1.1,0,1 640 | 200,1.1,0,1 641 | 1,1.1,0.5,-1 642 | 1.31578947368421,1.1,0.5,-1 643 | 1.75438596491228,1.1,0.5,-1 644 | 2.32558139534884,1.1,0.5,-1 645 | 3.03030303030303,1.1,0.5,-1 646 | 4,1.1,0.5,-1 647 | 5.35714285714286,1.1,0.5,-1 648 | 6.97674418604651,1.1,0.5,-1 649 | 9.375,1.1,0.5,-1 650 | 12.5,1.1,0.5,-1 651 | 16.6666666666667,1.1,0.5,-1 652 | 21.4285714285714,1.1,0.5,-1 653 | 28.5714285714286,1.1,0.5,-1 654 | 37.5,1.1,0.5,-1 655 | 50,1.1,0.5,-1 656 | 66.6666666666667,1.1,0.5,-1 657 | 87.5,1.1,0.5,-1 658 | 116.666666666667,1.1,0.5,-1 659 | 150,1.1,0.5,-1 660 | 200,1.1,0.5,-1 661 | 1,1.1,0.5,-0.5 662 | 1.31578947368421,1.1,0.5,-0.5 663 | 1.75438596491228,1.1,0.5,-0.5 664 | 2.32558139534884,1.1,0.5,-0.5 665 | 3.03030303030303,1.1,0.5,-0.5 666 | 4,1.1,0.5,-0.5 667 | 5.35714285714286,1.1,0.5,-0.5 668 | 6.97674418604651,1.1,0.5,-0.5 669 | 9.375,1.1,0.5,-0.5 670 | 12.5,1.1,0.5,-0.5 671 | 16.6666666666667,1.1,0.5,-0.5 672 | 21.4285714285714,1.1,0.5,-0.5 673 | 28.5714285714286,1.1,0.5,-0.5 674 | 37.5,1.1,0.5,-0.5 675 | 50,1.1,0.5,-0.5 676 | 66.6666666666667,1.1,0.5,-0.5 677 | 87.5,1.1,0.5,-0.5 678 | 116.666666666667,1.1,0.5,-0.5 679 | 150,1.1,0.5,-0.5 680 | 200,1.1,0.5,-0.5 681 | 1,1.1,0.5,0 
682 | 1.31578947368421,1.1,0.5,0 683 | 1.75438596491228,1.1,0.5,0 684 | 2.32558139534884,1.1,0.5,0 685 | 3.03030303030303,1.1,0.5,0 686 | 4,1.1,0.5,0 687 | 5.35714285714286,1.1,0.5,0 688 | 6.97674418604651,1.1,0.5,0 689 | 9.375,1.1,0.5,0 690 | 12.5,1.1,0.5,0 691 | 16.6666666666667,1.1,0.5,0 692 | 21.4285714285714,1.1,0.5,0 693 | 28.5714285714286,1.1,0.5,0 694 | 37.5,1.1,0.5,0 695 | 50,1.1,0.5,0 696 | 66.6666666666667,1.1,0.5,0 697 | 87.5,1.1,0.5,0 698 | 116.666666666667,1.1,0.5,0 699 | 150,1.1,0.5,0 700 | 200,1.1,0.5,0 701 | 1,1.1,0.5,0.5 702 | 1.31578947368421,1.1,0.5,0.5 703 | 1.75438596491228,1.1,0.5,0.5 704 | 2.32558139534884,1.1,0.5,0.5 705 | 3.03030303030303,1.1,0.5,0.5 706 | 4,1.1,0.5,0.5 707 | 5.35714285714286,1.1,0.5,0.5 708 | 6.97674418604651,1.1,0.5,0.5 709 | 9.375,1.1,0.5,0.5 710 | 12.5,1.1,0.5,0.5 711 | 16.6666666666667,1.1,0.5,0.5 712 | 21.4285714285714,1.1,0.5,0.5 713 | 28.5714285714286,1.1,0.5,0.5 714 | 37.5,1.1,0.5,0.5 715 | 50,1.1,0.5,0.5 716 | 66.6666666666667,1.1,0.5,0.5 717 | 87.5,1.1,0.5,0.5 718 | 116.666666666667,1.1,0.5,0.5 719 | 150,1.1,0.5,0.5 720 | 200,1.1,0.5,0.5 721 | 1,1.1,0.5,1 722 | 1.31578947368421,1.1,0.5,1 723 | 1.75438596491228,1.1,0.5,1 724 | 2.32558139534884,1.1,0.5,1 725 | 3.03030303030303,1.1,0.5,1 726 | 4,1.1,0.5,1 727 | 5.35714285714286,1.1,0.5,1 728 | 6.97674418604651,1.1,0.5,1 729 | 9.375,1.1,0.5,1 730 | 12.5,1.1,0.5,1 731 | 16.6666666666667,1.1,0.5,1 732 | 21.4285714285714,1.1,0.5,1 733 | 28.5714285714286,1.1,0.5,1 734 | 37.5,1.1,0.5,1 735 | 50,1.1,0.5,1 736 | 66.6666666666667,1.1,0.5,1 737 | 87.5,1.1,0.5,1 738 | 116.666666666667,1.1,0.5,1 739 | 150,1.1,0.5,1 740 | 200,1.1,0.5,1 741 | 1,1.1,1,-0.5 742 | 1.31578947368421,1.1,1,-0.5 743 | 1.75438596491228,1.1,1,-0.5 744 | 2.32558139534884,1.1,1,-0.5 745 | 3.03030303030303,1.1,1,-0.5 746 | 4,1.1,1,-0.5 747 | 5.35714285714286,1.1,1,-0.5 748 | 6.97674418604651,1.1,1,-0.5 749 | 9.375,1.1,1,-0.5 750 | 12.5,1.1,1,-0.5 751 | 16.6666666666667,1.1,1,-0.5 752 | 21.4285714285714,1.1,1,-0.5 753 | 28.5714285714286,1.1,1,-0.5 754 | 37.5,1.1,1,-0.5 755 | 50,1.1,1,-0.5 756 | 66.6666666666667,1.1,1,-0.5 757 | 87.5,1.1,1,-0.5 758 | 116.666666666667,1.1,1,-0.5 759 | 150,1.1,1,-0.5 760 | 200,1.1,1,-0.5 761 | 1,1.1,1,0 762 | 1.31578947368421,1.1,1,0 763 | 1.75438596491228,1.1,1,0 764 | 2.32558139534884,1.1,1,0 765 | 3.03030303030303,1.1,1,0 766 | 4,1.1,1,0 767 | 5.35714285714286,1.1,1,0 768 | 6.97674418604651,1.1,1,0 769 | 9.375,1.1,1,0 770 | 12.5,1.1,1,0 771 | 16.6666666666667,1.1,1,0 772 | 21.4285714285714,1.1,1,0 773 | 28.5714285714286,1.1,1,0 774 | 37.5,1.1,1,0 775 | 50,1.1,1,0 776 | 66.6666666666667,1.1,1,0 777 | 87.5,1.1,1,0 778 | 116.666666666667,1.1,1,0 779 | 150,1.1,1,0 780 | 200,1.1,1,0 781 | 1,1.1,1,0.5 782 | 1.31578947368421,1.1,1,0.5 783 | 1.75438596491228,1.1,1,0.5 784 | 2.32558139534884,1.1,1,0.5 785 | 3.03030303030303,1.1,1,0.5 786 | 4,1.1,1,0.5 787 | 5.35714285714286,1.1,1,0.5 788 | 6.97674418604651,1.1,1,0.5 789 | 9.375,1.1,1,0.5 790 | 12.5,1.1,1,0.5 791 | 16.6666666666667,1.1,1,0.5 792 | 21.4285714285714,1.1,1,0.5 793 | 28.5714285714286,1.1,1,0.5 794 | 37.5,1.1,1,0.5 795 | 50,1.1,1,0.5 796 | 66.6666666666667,1.1,1,0.5 797 | 87.5,1.1,1,0.5 798 | 116.666666666667,1.1,1,0.5 799 | 150,1.1,1,0.5 800 | 200,1.1,1,0.5 801 | -------------------------------------------------------------------------------- /dataset/Input_Yo20op40_ana_bw20.csv: -------------------------------------------------------------------------------- 1 | 1,0.9,-0.5,-0.5 2 | 1.31578947368421,0.9,-0.5,-0.5 3 | 
1.75438596491228,0.9,-0.5,-0.5 4 | 2.32558139534884,0.9,-0.5,-0.5 5 | 3.03030303030303,0.9,-0.5,-0.5 6 | 4,0.9,-0.5,-0.5 7 | 5.35714285714286,0.9,-0.5,-0.5 8 | 6.97674418604651,0.9,-0.5,-0.5 9 | 9.375,0.9,-0.5,-0.5 10 | 12.5,0.9,-0.5,-0.5 11 | 16.6666666666667,0.9,-0.5,-0.5 12 | 21.4285714285714,0.9,-0.5,-0.5 13 | 28.5714285714286,0.9,-0.5,-0.5 14 | 37.5,0.9,-0.5,-0.5 15 | 50,0.9,-0.5,-0.5 16 | 66.6666666666667,0.9,-0.5,-0.5 17 | 87.5,0.9,-0.5,-0.5 18 | 116.666666666667,0.9,-0.5,-0.5 19 | 150,0.9,-0.5,-0.5 20 | 200,0.9,-0.5,-0.5 21 | 1,0.9,-0.5,0 22 | 1.31578947368421,0.9,-0.5,0 23 | 1.75438596491228,0.9,-0.5,0 24 | 2.32558139534884,0.9,-0.5,0 25 | 3.03030303030303,0.9,-0.5,0 26 | 4,0.9,-0.5,0 27 | 5.35714285714286,0.9,-0.5,0 28 | 6.97674418604651,0.9,-0.5,0 29 | 9.375,0.9,-0.5,0 30 | 12.5,0.9,-0.5,0 31 | 16.6666666666667,0.9,-0.5,0 32 | 21.4285714285714,0.9,-0.5,0 33 | 28.5714285714286,0.9,-0.5,0 34 | 37.5,0.9,-0.5,0 35 | 50,0.9,-0.5,0 36 | 66.6666666666667,0.9,-0.5,0 37 | 87.5,0.9,-0.5,0 38 | 116.666666666667,0.9,-0.5,0 39 | 150,0.9,-0.5,0 40 | 200,0.9,-0.5,0 41 | 1,0.9,-0.5,0.5 42 | 1.31578947368421,0.9,-0.5,0.5 43 | 1.75438596491228,0.9,-0.5,0.5 44 | 2.32558139534884,0.9,-0.5,0.5 45 | 3.03030303030303,0.9,-0.5,0.5 46 | 4,0.9,-0.5,0.5 47 | 5.35714285714286,0.9,-0.5,0.5 48 | 6.97674418604651,0.9,-0.5,0.5 49 | 9.375,0.9,-0.5,0.5 50 | 12.5,0.9,-0.5,0.5 51 | 16.6666666666667,0.9,-0.5,0.5 52 | 21.4285714285714,0.9,-0.5,0.5 53 | 28.5714285714286,0.9,-0.5,0.5 54 | 37.5,0.9,-0.5,0.5 55 | 50,0.9,-0.5,0.5 56 | 66.6666666666667,0.9,-0.5,0.5 57 | 87.5,0.9,-0.5,0.5 58 | 116.666666666667,0.9,-0.5,0.5 59 | 150,0.9,-0.5,0.5 60 | 200,0.9,-0.5,0.5 61 | 1,0.9,0,-0.5 62 | 1.31578947368421,0.9,0,-0.5 63 | 1.75438596491228,0.9,0,-0.5 64 | 2.32558139534884,0.9,0,-0.5 65 | 3.03030303030303,0.9,0,-0.5 66 | 4,0.9,0,-0.5 67 | 5.35714285714286,0.9,0,-0.5 68 | 6.97674418604651,0.9,0,-0.5 69 | 9.375,0.9,0,-0.5 70 | 12.5,0.9,0,-0.5 71 | 16.6666666666667,0.9,0,-0.5 72 | 21.4285714285714,0.9,0,-0.5 73 | 28.5714285714286,0.9,0,-0.5 74 | 37.5,0.9,0,-0.5 75 | 50,0.9,0,-0.5 76 | 66.6666666666667,0.9,0,-0.5 77 | 87.5,0.9,0,-0.5 78 | 116.666666666667,0.9,0,-0.5 79 | 150,0.9,0,-0.5 80 | 200,0.9,0,-0.5 81 | 1,0.9,0,0.5 82 | 1.31578947368421,0.9,0,0.5 83 | 1.75438596491228,0.9,0,0.5 84 | 2.32558139534884,0.9,0,0.5 85 | 3.03030303030303,0.9,0,0.5 86 | 4,0.9,0,0.5 87 | 5.35714285714286,0.9,0,0.5 88 | 6.97674418604651,0.9,0,0.5 89 | 9.375,0.9,0,0.5 90 | 12.5,0.9,0,0.5 91 | 16.6666666666667,0.9,0,0.5 92 | 21.4285714285714,0.9,0,0.5 93 | 28.5714285714286,0.9,0,0.5 94 | 37.5,0.9,0,0.5 95 | 50,0.9,0,0.5 96 | 66.6666666666667,0.9,0,0.5 97 | 87.5,0.9,0,0.5 98 | 116.666666666667,0.9,0,0.5 99 | 150,0.9,0,0.5 100 | 200,0.9,0,0.5 101 | 1,0.9,0.5,-0.5 102 | 1.31578947368421,0.9,0.5,-0.5 103 | 1.75438596491228,0.9,0.5,-0.5 104 | 2.32558139534884,0.9,0.5,-0.5 105 | 3.03030303030303,0.9,0.5,-0.5 106 | 4,0.9,0.5,-0.5 107 | 5.35714285714286,0.9,0.5,-0.5 108 | 6.97674418604651,0.9,0.5,-0.5 109 | 9.375,0.9,0.5,-0.5 110 | 12.5,0.9,0.5,-0.5 111 | 16.6666666666667,0.9,0.5,-0.5 112 | 21.4285714285714,0.9,0.5,-0.5 113 | 28.5714285714286,0.9,0.5,-0.5 114 | 37.5,0.9,0.5,-0.5 115 | 50,0.9,0.5,-0.5 116 | 66.6666666666667,0.9,0.5,-0.5 117 | 87.5,0.9,0.5,-0.5 118 | 116.666666666667,0.9,0.5,-0.5 119 | 150,0.9,0.5,-0.5 120 | 200,0.9,0.5,-0.5 121 | 1,0.9,0.5,0 122 | 1.31578947368421,0.9,0.5,0 123 | 1.75438596491228,0.9,0.5,0 124 | 2.32558139534884,0.9,0.5,0 125 | 3.03030303030303,0.9,0.5,0 126 | 4,0.9,0.5,0 127 | 5.35714285714286,0.9,0.5,0 128 | 
6.97674418604651,0.9,0.5,0 129 | 9.375,0.9,0.5,0 130 | 12.5,0.9,0.5,0 131 | 16.6666666666667,0.9,0.5,0 132 | 21.4285714285714,0.9,0.5,0 133 | 28.5714285714286,0.9,0.5,0 134 | 37.5,0.9,0.5,0 135 | 50,0.9,0.5,0 136 | 66.6666666666667,0.9,0.5,0 137 | 87.5,0.9,0.5,0 138 | 116.666666666667,0.9,0.5,0 139 | 150,0.9,0.5,0 140 | 200,0.9,0.5,0 141 | 1,0.9,0.5,0.5 142 | 1.31578947368421,0.9,0.5,0.5 143 | 1.75438596491228,0.9,0.5,0.5 144 | 2.32558139534884,0.9,0.5,0.5 145 | 3.03030303030303,0.9,0.5,0.5 146 | 4,0.9,0.5,0.5 147 | 5.35714285714286,0.9,0.5,0.5 148 | 6.97674418604651,0.9,0.5,0.5 149 | 9.375,0.9,0.5,0.5 150 | 12.5,0.9,0.5,0.5 151 | 16.6666666666667,0.9,0.5,0.5 152 | 21.4285714285714,0.9,0.5,0.5 153 | 28.5714285714286,0.9,0.5,0.5 154 | 37.5,0.9,0.5,0.5 155 | 50,0.9,0.5,0.5 156 | 66.6666666666667,0.9,0.5,0.5 157 | 87.5,0.9,0.5,0.5 158 | 116.666666666667,0.9,0.5,0.5 159 | 150,0.9,0.5,0.5 160 | 200,0.9,0.5,0.5 161 | 1,1,-1,0 162 | 1.31578947368421,1,-1,0 163 | 1.75438596491228,1,-1,0 164 | 2.32558139534884,1,-1,0 165 | 3.03030303030303,1,-1,0 166 | 4,1,-1,0 167 | 5.35714285714286,1,-1,0 168 | 6.97674418604651,1,-1,0 169 | 9.375,1,-1,0 170 | 12.5,1,-1,0 171 | 16.6666666666667,1,-1,0 172 | 21.4285714285714,1,-1,0 173 | 28.5714285714286,1,-1,0 174 | 37.5,1,-1,0 175 | 50,1,-1,0 176 | 66.6666666666667,1,-1,0 177 | 87.5,1,-1,0 178 | 116.666666666667,1,-1,0 179 | 150,1,-1,0 180 | 200,1,-1,0 181 | 1,1,-0.5,-0.5 182 | 1.31578947368421,1,-0.5,-0.5 183 | 1.75438596491228,1,-0.5,-0.5 184 | 2.32558139534884,1,-0.5,-0.5 185 | 3.03030303030303,1,-0.5,-0.5 186 | 4,1,-0.5,-0.5 187 | 5.35714285714286,1,-0.5,-0.5 188 | 6.97674418604651,1,-0.5,-0.5 189 | 9.375,1,-0.5,-0.5 190 | 12.5,1,-0.5,-0.5 191 | 16.6666666666667,1,-0.5,-0.5 192 | 21.4285714285714,1,-0.5,-0.5 193 | 28.5714285714286,1,-0.5,-0.5 194 | 37.5,1,-0.5,-0.5 195 | 50,1,-0.5,-0.5 196 | 66.6666666666667,1,-0.5,-0.5 197 | 87.5,1,-0.5,-0.5 198 | 116.666666666667,1,-0.5,-0.5 199 | 150,1,-0.5,-0.5 200 | 200,1,-0.5,-0.5 201 | 1,1,-0.5,0 202 | 1.31578947368421,1,-0.5,0 203 | 1.75438596491228,1,-0.5,0 204 | 2.32558139534884,1,-0.5,0 205 | 3.03030303030303,1,-0.5,0 206 | 4,1,-0.5,0 207 | 5.35714285714286,1,-0.5,0 208 | 6.97674418604651,1,-0.5,0 209 | 9.375,1,-0.5,0 210 | 12.5,1,-0.5,0 211 | 16.6666666666667,1,-0.5,0 212 | 21.4285714285714,1,-0.5,0 213 | 28.5714285714286,1,-0.5,0 214 | 37.5,1,-0.5,0 215 | 50,1,-0.5,0 216 | 66.6666666666667,1,-0.5,0 217 | 87.5,1,-0.5,0 218 | 116.666666666667,1,-0.5,0 219 | 150,1,-0.5,0 220 | 200,1,-0.5,0 221 | 1,1,-0.5,0.5 222 | 1.31578947368421,1,-0.5,0.5 223 | 1.75438596491228,1,-0.5,0.5 224 | 2.32558139534884,1,-0.5,0.5 225 | 3.03030303030303,1,-0.5,0.5 226 | 4,1,-0.5,0.5 227 | 5.35714285714286,1,-0.5,0.5 228 | 6.97674418604651,1,-0.5,0.5 229 | 9.375,1,-0.5,0.5 230 | 12.5,1,-0.5,0.5 231 | 16.6666666666667,1,-0.5,0.5 232 | 21.4285714285714,1,-0.5,0.5 233 | 28.5714285714286,1,-0.5,0.5 234 | 37.5,1,-0.5,0.5 235 | 50,1,-0.5,0.5 236 | 66.6666666666667,1,-0.5,0.5 237 | 87.5,1,-0.5,0.5 238 | 116.666666666667,1,-0.5,0.5 239 | 150,1,-0.5,0.5 240 | 200,1,-0.5,0.5 241 | 1,1,0,-1 242 | 1.31578947368421,1,0,-1 243 | 1.75438596491228,1,0,-1 244 | 2.32558139534884,1,0,-1 245 | 3.03030303030303,1,0,-1 246 | 4,1,0,-1 247 | 5.35714285714286,1,0,-1 248 | 6.97674418604651,1,0,-1 249 | 9.375,1,0,-1 250 | 12.5,1,0,-1 251 | 16.6666666666667,1,0,-1 252 | 21.4285714285714,1,0,-1 253 | 28.5714285714286,1,0,-1 254 | 37.5,1,0,-1 255 | 50,1,0,-1 256 | 66.6666666666667,1,0,-1 257 | 87.5,1,0,-1 258 | 116.666666666667,1,0,-1 259 | 150,1,0,-1 260 | 200,1,0,-1 
261 | 1,1,0,-0.5 262 | 1.31578947368421,1,0,-0.5 263 | 1.75438596491228,1,0,-0.5 264 | 2.32558139534884,1,0,-0.5 265 | 3.03030303030303,1,0,-0.5 266 | 4,1,0,-0.5 267 | 5.35714285714286,1,0,-0.5 268 | 6.97674418604651,1,0,-0.5 269 | 9.375,1,0,-0.5 270 | 12.5,1,0,-0.5 271 | 16.6666666666667,1,0,-0.5 272 | 21.4285714285714,1,0,-0.5 273 | 28.5714285714286,1,0,-0.5 274 | 37.5,1,0,-0.5 275 | 50,1,0,-0.5 276 | 66.6666666666667,1,0,-0.5 277 | 87.5,1,0,-0.5 278 | 116.666666666667,1,0,-0.5 279 | 150,1,0,-0.5 280 | 200,1,0,-0.5 281 | 1,1,0,0.5 282 | 1.31578947368421,1,0,0.5 283 | 1.75438596491228,1,0,0.5 284 | 2.32558139534884,1,0,0.5 285 | 3.03030303030303,1,0,0.5 286 | 4,1,0,0.5 287 | 5.35714285714286,1,0,0.5 288 | 6.97674418604651,1,0,0.5 289 | 9.375,1,0,0.5 290 | 12.5,1,0,0.5 291 | 16.6666666666667,1,0,0.5 292 | 21.4285714285714,1,0,0.5 293 | 28.5714285714286,1,0,0.5 294 | 37.5,1,0,0.5 295 | 50,1,0,0.5 296 | 66.6666666666667,1,0,0.5 297 | 87.5,1,0,0.5 298 | 116.666666666667,1,0,0.5 299 | 150,1,0,0.5 300 | 200,1,0,0.5 301 | 1,1,0,1 302 | 1.31578947368421,1,0,1 303 | 1.75438596491228,1,0,1 304 | 2.32558139534884,1,0,1 305 | 3.03030303030303,1,0,1 306 | 4,1,0,1 307 | 5.35714285714286,1,0,1 308 | 6.97674418604651,1,0,1 309 | 9.375,1,0,1 310 | 12.5,1,0,1 311 | 16.6666666666667,1,0,1 312 | 21.4285714285714,1,0,1 313 | 28.5714285714286,1,0,1 314 | 37.5,1,0,1 315 | 50,1,0,1 316 | 66.6666666666667,1,0,1 317 | 87.5,1,0,1 318 | 116.666666666667,1,0,1 319 | 150,1,0,1 320 | 200,1,0,1 321 | 1,1,0.5,-0.5 322 | 1.31578947368421,1,0.5,-0.5 323 | 1.75438596491228,1,0.5,-0.5 324 | 2.32558139534884,1,0.5,-0.5 325 | 3.03030303030303,1,0.5,-0.5 326 | 4,1,0.5,-0.5 327 | 5.35714285714286,1,0.5,-0.5 328 | 6.97674418604651,1,0.5,-0.5 329 | 9.375,1,0.5,-0.5 330 | 12.5,1,0.5,-0.5 331 | 16.6666666666667,1,0.5,-0.5 332 | 21.4285714285714,1,0.5,-0.5 333 | 28.5714285714286,1,0.5,-0.5 334 | 37.5,1,0.5,-0.5 335 | 50,1,0.5,-0.5 336 | 66.6666666666667,1,0.5,-0.5 337 | 87.5,1,0.5,-0.5 338 | 116.666666666667,1,0.5,-0.5 339 | 150,1,0.5,-0.5 340 | 200,1,0.5,-0.5 341 | 1,1,0.5,0 342 | 1.31578947368421,1,0.5,0 343 | 1.75438596491228,1,0.5,0 344 | 2.32558139534884,1,0.5,0 345 | 3.03030303030303,1,0.5,0 346 | 4,1,0.5,0 347 | 5.35714285714286,1,0.5,0 348 | 6.97674418604651,1,0.5,0 349 | 9.375,1,0.5,0 350 | 12.5,1,0.5,0 351 | 16.6666666666667,1,0.5,0 352 | 21.4285714285714,1,0.5,0 353 | 28.5714285714286,1,0.5,0 354 | 37.5,1,0.5,0 355 | 50,1,0.5,0 356 | 66.6666666666667,1,0.5,0 357 | 87.5,1,0.5,0 358 | 116.666666666667,1,0.5,0 359 | 150,1,0.5,0 360 | 200,1,0.5,0 361 | 1,1,0.5,0.5 362 | 1.31578947368421,1,0.5,0.5 363 | 1.75438596491228,1,0.5,0.5 364 | 2.32558139534884,1,0.5,0.5 365 | 3.03030303030303,1,0.5,0.5 366 | 4,1,0.5,0.5 367 | 5.35714285714286,1,0.5,0.5 368 | 6.97674418604651,1,0.5,0.5 369 | 9.375,1,0.5,0.5 370 | 12.5,1,0.5,0.5 371 | 16.6666666666667,1,0.5,0.5 372 | 21.4285714285714,1,0.5,0.5 373 | 28.5714285714286,1,0.5,0.5 374 | 37.5,1,0.5,0.5 375 | 50,1,0.5,0.5 376 | 66.6666666666667,1,0.5,0.5 377 | 87.5,1,0.5,0.5 378 | 116.666666666667,1,0.5,0.5 379 | 150,1,0.5,0.5 380 | 200,1,0.5,0.5 381 | 1,1,1,0 382 | 1.31578947368421,1,1,0 383 | 1.75438596491228,1,1,0 384 | 2.32558139534884,1,1,0 385 | 3.03030303030303,1,1,0 386 | 4,1,1,0 387 | 5.35714285714286,1,1,0 388 | 6.97674418604651,1,1,0 389 | 9.375,1,1,0 390 | 12.5,1,1,0 391 | 16.6666666666667,1,1,0 392 | 21.4285714285714,1,1,0 393 | 28.5714285714286,1,1,0 394 | 37.5,1,1,0 395 | 50,1,1,0 396 | 66.6666666666667,1,1,0 397 | 87.5,1,1,0 398 | 116.666666666667,1,1,0 399 | 150,1,1,0 400 | 
200,1,1,0 401 | 1,1.1,-1,-0.5 402 | 1.31578947368421,1.1,-1,-0.5 403 | 1.75438596491228,1.1,-1,-0.5 404 | 2.32558139534884,1.1,-1,-0.5 405 | 3.03030303030303,1.1,-1,-0.5 406 | 4,1.1,-1,-0.5 407 | 5.35714285714286,1.1,-1,-0.5 408 | 6.97674418604651,1.1,-1,-0.5 409 | 9.375,1.1,-1,-0.5 410 | 12.5,1.1,-1,-0.5 411 | 16.6666666666667,1.1,-1,-0.5 412 | 21.4285714285714,1.1,-1,-0.5 413 | 28.5714285714286,1.1,-1,-0.5 414 | 37.5,1.1,-1,-0.5 415 | 50,1.1,-1,-0.5 416 | 66.6666666666667,1.1,-1,-0.5 417 | 87.5,1.1,-1,-0.5 418 | 116.666666666667,1.1,-1,-0.5 419 | 150,1.1,-1,-0.5 420 | 200,1.1,-1,-0.5 421 | 1,1.1,-1,0 422 | 1.31578947368421,1.1,-1,0 423 | 1.75438596491228,1.1,-1,0 424 | 2.32558139534884,1.1,-1,0 425 | 3.03030303030303,1.1,-1,0 426 | 4,1.1,-1,0 427 | 5.35714285714286,1.1,-1,0 428 | 6.97674418604651,1.1,-1,0 429 | 9.375,1.1,-1,0 430 | 12.5,1.1,-1,0 431 | 16.6666666666667,1.1,-1,0 432 | 21.4285714285714,1.1,-1,0 433 | 28.5714285714286,1.1,-1,0 434 | 37.5,1.1,-1,0 435 | 50,1.1,-1,0 436 | 66.6666666666667,1.1,-1,0 437 | 87.5,1.1,-1,0 438 | 116.666666666667,1.1,-1,0 439 | 150,1.1,-1,0 440 | 200,1.1,-1,0 441 | 1,1.1,-1,0.5 442 | 1.31578947368421,1.1,-1,0.5 443 | 1.75438596491228,1.1,-1,0.5 444 | 2.32558139534884,1.1,-1,0.5 445 | 3.03030303030303,1.1,-1,0.5 446 | 4,1.1,-1,0.5 447 | 5.35714285714286,1.1,-1,0.5 448 | 6.97674418604651,1.1,-1,0.5 449 | 9.375,1.1,-1,0.5 450 | 12.5,1.1,-1,0.5 451 | 16.6666666666667,1.1,-1,0.5 452 | 21.4285714285714,1.1,-1,0.5 453 | 28.5714285714286,1.1,-1,0.5 454 | 37.5,1.1,-1,0.5 455 | 50,1.1,-1,0.5 456 | 66.6666666666667,1.1,-1,0.5 457 | 87.5,1.1,-1,0.5 458 | 116.666666666667,1.1,-1,0.5 459 | 150,1.1,-1,0.5 460 | 200,1.1,-1,0.5 461 | 1,1.1,-0.5,-1 462 | 1.31578947368421,1.1,-0.5,-1 463 | 1.75438596491228,1.1,-0.5,-1 464 | 2.32558139534884,1.1,-0.5,-1 465 | 3.03030303030303,1.1,-0.5,-1 466 | 4,1.1,-0.5,-1 467 | 5.35714285714286,1.1,-0.5,-1 468 | 6.97674418604651,1.1,-0.5,-1 469 | 9.375,1.1,-0.5,-1 470 | 12.5,1.1,-0.5,-1 471 | 16.6666666666667,1.1,-0.5,-1 472 | 21.4285714285714,1.1,-0.5,-1 473 | 28.5714285714286,1.1,-0.5,-1 474 | 37.5,1.1,-0.5,-1 475 | 50,1.1,-0.5,-1 476 | 66.6666666666667,1.1,-0.5,-1 477 | 87.5,1.1,-0.5,-1 478 | 116.666666666667,1.1,-0.5,-1 479 | 150,1.1,-0.5,-1 480 | 200,1.1,-0.5,-1 481 | 1,1.1,-0.5,-0.5 482 | 1.31578947368421,1.1,-0.5,-0.5 483 | 1.75438596491228,1.1,-0.5,-0.5 484 | 2.32558139534884,1.1,-0.5,-0.5 485 | 3.03030303030303,1.1,-0.5,-0.5 486 | 4,1.1,-0.5,-0.5 487 | 5.35714285714286,1.1,-0.5,-0.5 488 | 6.97674418604651,1.1,-0.5,-0.5 489 | 9.375,1.1,-0.5,-0.5 490 | 12.5,1.1,-0.5,-0.5 491 | 16.6666666666667,1.1,-0.5,-0.5 492 | 21.4285714285714,1.1,-0.5,-0.5 493 | 28.5714285714286,1.1,-0.5,-0.5 494 | 37.5,1.1,-0.5,-0.5 495 | 50,1.1,-0.5,-0.5 496 | 66.6666666666667,1.1,-0.5,-0.5 497 | 87.5,1.1,-0.5,-0.5 498 | 116.666666666667,1.1,-0.5,-0.5 499 | 150,1.1,-0.5,-0.5 500 | 200,1.1,-0.5,-0.5 501 | 1,1.1,-0.5,0 502 | 1.31578947368421,1.1,-0.5,0 503 | 1.75438596491228,1.1,-0.5,0 504 | 2.32558139534884,1.1,-0.5,0 505 | 3.03030303030303,1.1,-0.5,0 506 | 4,1.1,-0.5,0 507 | 5.35714285714286,1.1,-0.5,0 508 | 6.97674418604651,1.1,-0.5,0 509 | 9.375,1.1,-0.5,0 510 | 12.5,1.1,-0.5,0 511 | 16.6666666666667,1.1,-0.5,0 512 | 21.4285714285714,1.1,-0.5,0 513 | 28.5714285714286,1.1,-0.5,0 514 | 37.5,1.1,-0.5,0 515 | 50,1.1,-0.5,0 516 | 66.6666666666667,1.1,-0.5,0 517 | 87.5,1.1,-0.5,0 518 | 116.666666666667,1.1,-0.5,0 519 | 150,1.1,-0.5,0 520 | 200,1.1,-0.5,0 521 | 1,1.1,-0.5,0.5 522 | 1.31578947368421,1.1,-0.5,0.5 523 | 1.75438596491228,1.1,-0.5,0.5 524 | 
2.32558139534884,1.1,-0.5,0.5 525 | 3.03030303030303,1.1,-0.5,0.5 526 | 4,1.1,-0.5,0.5 527 | 5.35714285714286,1.1,-0.5,0.5 528 | 6.97674418604651,1.1,-0.5,0.5 529 | 9.375,1.1,-0.5,0.5 530 | 12.5,1.1,-0.5,0.5 531 | 16.6666666666667,1.1,-0.5,0.5 532 | 21.4285714285714,1.1,-0.5,0.5 533 | 28.5714285714286,1.1,-0.5,0.5 534 | 37.5,1.1,-0.5,0.5 535 | 50,1.1,-0.5,0.5 536 | 66.6666666666667,1.1,-0.5,0.5 537 | 87.5,1.1,-0.5,0.5 538 | 116.666666666667,1.1,-0.5,0.5 539 | 150,1.1,-0.5,0.5 540 | 200,1.1,-0.5,0.5 541 | 1,1.1,-0.5,1 542 | 1.31578947368421,1.1,-0.5,1 543 | 1.75438596491228,1.1,-0.5,1 544 | 2.32558139534884,1.1,-0.5,1 545 | 3.03030303030303,1.1,-0.5,1 546 | 4,1.1,-0.5,1 547 | 5.35714285714286,1.1,-0.5,1 548 | 6.97674418604651,1.1,-0.5,1 549 | 9.375,1.1,-0.5,1 550 | 12.5,1.1,-0.5,1 551 | 16.6666666666667,1.1,-0.5,1 552 | 21.4285714285714,1.1,-0.5,1 553 | 28.5714285714286,1.1,-0.5,1 554 | 37.5,1.1,-0.5,1 555 | 50,1.1,-0.5,1 556 | 66.6666666666667,1.1,-0.5,1 557 | 87.5,1.1,-0.5,1 558 | 116.666666666667,1.1,-0.5,1 559 | 150,1.1,-0.5,1 560 | 200,1.1,-0.5,1 561 | 1,1.1,0,-1 562 | 1.31578947368421,1.1,0,-1 563 | 1.75438596491228,1.1,0,-1 564 | 2.32558139534884,1.1,0,-1 565 | 3.03030303030303,1.1,0,-1 566 | 4,1.1,0,-1 567 | 5.35714285714286,1.1,0,-1 568 | 6.97674418604651,1.1,0,-1 569 | 9.375,1.1,0,-1 570 | 12.5,1.1,0,-1 571 | 16.6666666666667,1.1,0,-1 572 | 21.4285714285714,1.1,0,-1 573 | 28.5714285714286,1.1,0,-1 574 | 37.5,1.1,0,-1 575 | 50,1.1,0,-1 576 | 66.6666666666667,1.1,0,-1 577 | 87.5,1.1,0,-1 578 | 116.666666666667,1.1,0,-1 579 | 150,1.1,0,-1 580 | 200,1.1,0,-1 581 | 1,1.1,0,-0.5 582 | 1.31578947368421,1.1,0,-0.5 583 | 1.75438596491228,1.1,0,-0.5 584 | 2.32558139534884,1.1,0,-0.5 585 | 3.03030303030303,1.1,0,-0.5 586 | 4,1.1,0,-0.5 587 | 5.35714285714286,1.1,0,-0.5 588 | 6.97674418604651,1.1,0,-0.5 589 | 9.375,1.1,0,-0.5 590 | 12.5,1.1,0,-0.5 591 | 16.6666666666667,1.1,0,-0.5 592 | 21.4285714285714,1.1,0,-0.5 593 | 28.5714285714286,1.1,0,-0.5 594 | 37.5,1.1,0,-0.5 595 | 50,1.1,0,-0.5 596 | 66.6666666666667,1.1,0,-0.5 597 | 87.5,1.1,0,-0.5 598 | 116.666666666667,1.1,0,-0.5 599 | 150,1.1,0,-0.5 600 | 200,1.1,0,-0.5 601 | 1,1.1,0,0.5 602 | 1.31578947368421,1.1,0,0.5 603 | 1.75438596491228,1.1,0,0.5 604 | 2.32558139534884,1.1,0,0.5 605 | 3.03030303030303,1.1,0,0.5 606 | 4,1.1,0,0.5 607 | 5.35714285714286,1.1,0,0.5 608 | 6.97674418604651,1.1,0,0.5 609 | 9.375,1.1,0,0.5 610 | 12.5,1.1,0,0.5 611 | 16.6666666666667,1.1,0,0.5 612 | 21.4285714285714,1.1,0,0.5 613 | 28.5714285714286,1.1,0,0.5 614 | 37.5,1.1,0,0.5 615 | 50,1.1,0,0.5 616 | 66.6666666666667,1.1,0,0.5 617 | 87.5,1.1,0,0.5 618 | 116.666666666667,1.1,0,0.5 619 | 150,1.1,0,0.5 620 | 200,1.1,0,0.5 621 | 1,1.1,0,1 622 | 1.31578947368421,1.1,0,1 623 | 1.75438596491228,1.1,0,1 624 | 2.32558139534884,1.1,0,1 625 | 3.03030303030303,1.1,0,1 626 | 4,1.1,0,1 627 | 5.35714285714286,1.1,0,1 628 | 6.97674418604651,1.1,0,1 629 | 9.375,1.1,0,1 630 | 12.5,1.1,0,1 631 | 16.6666666666667,1.1,0,1 632 | 21.4285714285714,1.1,0,1 633 | 28.5714285714286,1.1,0,1 634 | 37.5,1.1,0,1 635 | 50,1.1,0,1 636 | 66.6666666666667,1.1,0,1 637 | 87.5,1.1,0,1 638 | 116.666666666667,1.1,0,1 639 | 150,1.1,0,1 640 | 200,1.1,0,1 641 | 1,1.1,0.5,-1 642 | 1.31578947368421,1.1,0.5,-1 643 | 1.75438596491228,1.1,0.5,-1 644 | 2.32558139534884,1.1,0.5,-1 645 | 3.03030303030303,1.1,0.5,-1 646 | 4,1.1,0.5,-1 647 | 5.35714285714286,1.1,0.5,-1 648 | 6.97674418604651,1.1,0.5,-1 649 | 9.375,1.1,0.5,-1 650 | 12.5,1.1,0.5,-1 651 | 16.6666666666667,1.1,0.5,-1 652 | 21.4285714285714,1.1,0.5,-1 
653 | 28.5714285714286,1.1,0.5,-1 654 | 37.5,1.1,0.5,-1 655 | 50,1.1,0.5,-1 656 | 66.6666666666667,1.1,0.5,-1 657 | 87.5,1.1,0.5,-1 658 | 116.666666666667,1.1,0.5,-1 659 | 150,1.1,0.5,-1 660 | 200,1.1,0.5,-1 661 | 1,1.1,0.5,-0.5 662 | 1.31578947368421,1.1,0.5,-0.5 663 | 1.75438596491228,1.1,0.5,-0.5 664 | 2.32558139534884,1.1,0.5,-0.5 665 | 3.03030303030303,1.1,0.5,-0.5 666 | 4,1.1,0.5,-0.5 667 | 5.35714285714286,1.1,0.5,-0.5 668 | 6.97674418604651,1.1,0.5,-0.5 669 | 9.375,1.1,0.5,-0.5 670 | 12.5,1.1,0.5,-0.5 671 | 16.6666666666667,1.1,0.5,-0.5 672 | 21.4285714285714,1.1,0.5,-0.5 673 | 28.5714285714286,1.1,0.5,-0.5 674 | 37.5,1.1,0.5,-0.5 675 | 50,1.1,0.5,-0.5 676 | 66.6666666666667,1.1,0.5,-0.5 677 | 87.5,1.1,0.5,-0.5 678 | 116.666666666667,1.1,0.5,-0.5 679 | 150,1.1,0.5,-0.5 680 | 200,1.1,0.5,-0.5 681 | 1,1.1,0.5,0 682 | 1.31578947368421,1.1,0.5,0 683 | 1.75438596491228,1.1,0.5,0 684 | 2.32558139534884,1.1,0.5,0 685 | 3.03030303030303,1.1,0.5,0 686 | 4,1.1,0.5,0 687 | 5.35714285714286,1.1,0.5,0 688 | 6.97674418604651,1.1,0.5,0 689 | 9.375,1.1,0.5,0 690 | 12.5,1.1,0.5,0 691 | 16.6666666666667,1.1,0.5,0 692 | 21.4285714285714,1.1,0.5,0 693 | 28.5714285714286,1.1,0.5,0 694 | 37.5,1.1,0.5,0 695 | 50,1.1,0.5,0 696 | 66.6666666666667,1.1,0.5,0 697 | 87.5,1.1,0.5,0 698 | 116.666666666667,1.1,0.5,0 699 | 150,1.1,0.5,0 700 | 200,1.1,0.5,0 701 | 1,1.1,0.5,0.5 702 | 1.31578947368421,1.1,0.5,0.5 703 | 1.75438596491228,1.1,0.5,0.5 704 | 2.32558139534884,1.1,0.5,0.5 705 | 3.03030303030303,1.1,0.5,0.5 706 | 4,1.1,0.5,0.5 707 | 5.35714285714286,1.1,0.5,0.5 708 | 6.97674418604651,1.1,0.5,0.5 709 | 9.375,1.1,0.5,0.5 710 | 12.5,1.1,0.5,0.5 711 | 16.6666666666667,1.1,0.5,0.5 712 | 21.4285714285714,1.1,0.5,0.5 713 | 28.5714285714286,1.1,0.5,0.5 714 | 37.5,1.1,0.5,0.5 715 | 50,1.1,0.5,0.5 716 | 66.6666666666667,1.1,0.5,0.5 717 | 87.5,1.1,0.5,0.5 718 | 116.666666666667,1.1,0.5,0.5 719 | 150,1.1,0.5,0.5 720 | 200,1.1,0.5,0.5 721 | 1,1.1,0.5,1 722 | 1.31578947368421,1.1,0.5,1 723 | 1.75438596491228,1.1,0.5,1 724 | 2.32558139534884,1.1,0.5,1 725 | 3.03030303030303,1.1,0.5,1 726 | 4,1.1,0.5,1 727 | 5.35714285714286,1.1,0.5,1 728 | 6.97674418604651,1.1,0.5,1 729 | 9.375,1.1,0.5,1 730 | 12.5,1.1,0.5,1 731 | 16.6666666666667,1.1,0.5,1 732 | 21.4285714285714,1.1,0.5,1 733 | 28.5714285714286,1.1,0.5,1 734 | 37.5,1.1,0.5,1 735 | 50,1.1,0.5,1 736 | 66.6666666666667,1.1,0.5,1 737 | 87.5,1.1,0.5,1 738 | 116.666666666667,1.1,0.5,1 739 | 150,1.1,0.5,1 740 | 200,1.1,0.5,1 741 | 1,1.1,1,-0.5 742 | 1.31578947368421,1.1,1,-0.5 743 | 1.75438596491228,1.1,1,-0.5 744 | 2.32558139534884,1.1,1,-0.5 745 | 3.03030303030303,1.1,1,-0.5 746 | 4,1.1,1,-0.5 747 | 5.35714285714286,1.1,1,-0.5 748 | 6.97674418604651,1.1,1,-0.5 749 | 9.375,1.1,1,-0.5 750 | 12.5,1.1,1,-0.5 751 | 16.6666666666667,1.1,1,-0.5 752 | 21.4285714285714,1.1,1,-0.5 753 | 28.5714285714286,1.1,1,-0.5 754 | 37.5,1.1,1,-0.5 755 | 50,1.1,1,-0.5 756 | 66.6666666666667,1.1,1,-0.5 757 | 87.5,1.1,1,-0.5 758 | 116.666666666667,1.1,1,-0.5 759 | 150,1.1,1,-0.5 760 | 200,1.1,1,-0.5 761 | 1,1.1,1,0 762 | 1.31578947368421,1.1,1,0 763 | 1.75438596491228,1.1,1,0 764 | 2.32558139534884,1.1,1,0 765 | 3.03030303030303,1.1,1,0 766 | 4,1.1,1,0 767 | 5.35714285714286,1.1,1,0 768 | 6.97674418604651,1.1,1,0 769 | 9.375,1.1,1,0 770 | 12.5,1.1,1,0 771 | 16.6666666666667,1.1,1,0 772 | 21.4285714285714,1.1,1,0 773 | 28.5714285714286,1.1,1,0 774 | 37.5,1.1,1,0 775 | 50,1.1,1,0 776 | 66.6666666666667,1.1,1,0 777 | 87.5,1.1,1,0 778 | 116.666666666667,1.1,1,0 779 | 150,1.1,1,0 780 | 200,1.1,1,0 781 | 
1,1.1,1,0.5 782 | 1.31578947368421,1.1,1,0.5 783 | 1.75438596491228,1.1,1,0.5 784 | 2.32558139534884,1.1,1,0.5 785 | 3.03030303030303,1.1,1,0.5 786 | 4,1.1,1,0.5 787 | 5.35714285714286,1.1,1,0.5 788 | 6.97674418604651,1.1,1,0.5 789 | 9.375,1.1,1,0.5 790 | 12.5,1.1,1,0.5 791 | 16.6666666666667,1.1,1,0.5 792 | 21.4285714285714,1.1,1,0.5 793 | 28.5714285714286,1.1,1,0.5 794 | 37.5,1.1,1,0.5 795 | 50,1.1,1,0.5 796 | 66.6666666666667,1.1,1,0.5 797 | 87.5,1.1,1,0.5 798 | 116.666666666667,1.1,1,0.5 799 | 150,1.1,1,0.5 800 | 200,1.1,1,0.5 801 | -------------------------------------------------------------------------------- /dataset/Input_Yo20op40_ana_bw20_50.csv: -------------------------------------------------------------------------------- 1 | 1,0.9,-0.5,-0.5 2 | 1.31578947368421,0.9,-0.5,-0.5 3 | 1.75438596491228,0.9,-0.5,-0.5 4 | 2.32558139534884,0.9,-0.5,-0.5 5 | 3.03030303030303,0.9,-0.5,-0.5 6 | 4,0.9,-0.5,-0.5 7 | 5.35714285714286,0.9,-0.5,-0.5 8 | 6.97674418604651,0.9,-0.5,-0.5 9 | 9.375,0.9,-0.5,-0.5 10 | 12.5,0.9,-0.5,-0.5 11 | 16.6666666666667,0.9,-0.5,-0.5 12 | 21.4285714285714,0.9,-0.5,-0.5 13 | 28.5714285714286,0.9,-0.5,-0.5 14 | 37.5,0.9,-0.5,-0.5 15 | 50,0.9,-0.5,-0.5 16 | 66.6666666666667,0.9,-0.5,-0.5 17 | 87.5,0.9,-0.5,-0.5 18 | 116.666666666667,0.9,-0.5,-0.5 19 | 150,0.9,-0.5,-0.5 20 | 200,0.9,-0.5,-0.5 21 | 1,0.9,-0.5,0 22 | 1.31578947368421,0.9,-0.5,0 23 | 1.75438596491228,0.9,-0.5,0 24 | 2.32558139534884,0.9,-0.5,0 25 | 3.03030303030303,0.9,-0.5,0 26 | 4,0.9,-0.5,0 27 | 5.35714285714286,0.9,-0.5,0 28 | 6.97674418604651,0.9,-0.5,0 29 | 9.375,0.9,-0.5,0 30 | 12.5,0.9,-0.5,0 31 | 16.6666666666667,0.9,-0.5,0 32 | 21.4285714285714,0.9,-0.5,0 33 | 28.5714285714286,0.9,-0.5,0 34 | 37.5,0.9,-0.5,0 35 | 50,0.9,-0.5,0 36 | 66.6666666666667,0.9,-0.5,0 37 | 87.5,0.9,-0.5,0 38 | 116.666666666667,0.9,-0.5,0 39 | 150,0.9,-0.5,0 40 | 200,0.9,-0.5,0 41 | 1,0.9,-0.5,0.5 42 | 1.31578947368421,0.9,-0.5,0.5 43 | 1.75438596491228,0.9,-0.5,0.5 44 | 2.32558139534884,0.9,-0.5,0.5 45 | 3.03030303030303,0.9,-0.5,0.5 46 | 4,0.9,-0.5,0.5 47 | 5.35714285714286,0.9,-0.5,0.5 48 | 6.97674418604651,0.9,-0.5,0.5 49 | 9.375,0.9,-0.5,0.5 50 | 12.5,0.9,-0.5,0.5 51 | 16.6666666666667,0.9,-0.5,0.5 52 | 21.4285714285714,0.9,-0.5,0.5 53 | 28.5714285714286,0.9,-0.5,0.5 54 | 37.5,0.9,-0.5,0.5 55 | 50,0.9,-0.5,0.5 56 | 66.6666666666667,0.9,-0.5,0.5 57 | 87.5,0.9,-0.5,0.5 58 | 116.666666666667,0.9,-0.5,0.5 59 | 150,0.9,-0.5,0.5 60 | 200,0.9,-0.5,0.5 61 | 1,0.9,0,-0.5 62 | 1.31578947368421,0.9,0,-0.5 63 | 1.75438596491228,0.9,0,-0.5 64 | 2.32558139534884,0.9,0,-0.5 65 | 3.03030303030303,0.9,0,-0.5 66 | 4,0.9,0,-0.5 67 | 5.35714285714286,0.9,0,-0.5 68 | 6.97674418604651,0.9,0,-0.5 69 | 9.375,0.9,0,-0.5 70 | 12.5,0.9,0,-0.5 71 | 16.6666666666667,0.9,0,-0.5 72 | 21.4285714285714,0.9,0,-0.5 73 | 28.5714285714286,0.9,0,-0.5 74 | 37.5,0.9,0,-0.5 75 | 50,0.9,0,-0.5 76 | 66.6666666666667,0.9,0,-0.5 77 | 87.5,0.9,0,-0.5 78 | 116.666666666667,0.9,0,-0.5 79 | 150,0.9,0,-0.5 80 | 200,0.9,0,-0.5 81 | 1,0.9,0,0.5 82 | 1.31578947368421,0.9,0,0.5 83 | 1.75438596491228,0.9,0,0.5 84 | 2.32558139534884,0.9,0,0.5 85 | 3.03030303030303,0.9,0,0.5 86 | 4,0.9,0,0.5 87 | 5.35714285714286,0.9,0,0.5 88 | 6.97674418604651,0.9,0,0.5 89 | 9.375,0.9,0,0.5 90 | 12.5,0.9,0,0.5 91 | 16.6666666666667,0.9,0,0.5 92 | 21.4285714285714,0.9,0,0.5 93 | 28.5714285714286,0.9,0,0.5 94 | 37.5,0.9,0,0.5 95 | 50,0.9,0,0.5 96 | 66.6666666666667,0.9,0,0.5 97 | 87.5,0.9,0,0.5 98 | 116.666666666667,0.9,0,0.5 99 | 150,0.9,0,0.5 100 | 200,0.9,0,0.5 101 | 1,0.9,0.5,-0.5 
102 | 1.31578947368421,0.9,0.5,-0.5 103 | 1.75438596491228,0.9,0.5,-0.5 104 | 2.32558139534884,0.9,0.5,-0.5 105 | 3.03030303030303,0.9,0.5,-0.5 106 | 4,0.9,0.5,-0.5 107 | 5.35714285714286,0.9,0.5,-0.5 108 | 6.97674418604651,0.9,0.5,-0.5 109 | 9.375,0.9,0.5,-0.5 110 | 12.5,0.9,0.5,-0.5 111 | 16.6666666666667,0.9,0.5,-0.5 112 | 21.4285714285714,0.9,0.5,-0.5 113 | 28.5714285714286,0.9,0.5,-0.5 114 | 37.5,0.9,0.5,-0.5 115 | 50,0.9,0.5,-0.5 116 | 66.6666666666667,0.9,0.5,-0.5 117 | 87.5,0.9,0.5,-0.5 118 | 116.666666666667,0.9,0.5,-0.5 119 | 150,0.9,0.5,-0.5 120 | 200,0.9,0.5,-0.5 121 | 1,0.9,0.5,0 122 | 1.31578947368421,0.9,0.5,0 123 | 1.75438596491228,0.9,0.5,0 124 | 2.32558139534884,0.9,0.5,0 125 | 3.03030303030303,0.9,0.5,0 126 | 4,0.9,0.5,0 127 | 5.35714285714286,0.9,0.5,0 128 | 6.97674418604651,0.9,0.5,0 129 | 9.375,0.9,0.5,0 130 | 12.5,0.9,0.5,0 131 | 16.6666666666667,0.9,0.5,0 132 | 21.4285714285714,0.9,0.5,0 133 | 28.5714285714286,0.9,0.5,0 134 | 37.5,0.9,0.5,0 135 | 50,0.9,0.5,0 136 | 66.6666666666667,0.9,0.5,0 137 | 87.5,0.9,0.5,0 138 | 116.666666666667,0.9,0.5,0 139 | 150,0.9,0.5,0 140 | 200,0.9,0.5,0 141 | 1,0.9,0.5,0.5 142 | 1.31578947368421,0.9,0.5,0.5 143 | 1.75438596491228,0.9,0.5,0.5 144 | 2.32558139534884,0.9,0.5,0.5 145 | 3.03030303030303,0.9,0.5,0.5 146 | 4,0.9,0.5,0.5 147 | 5.35714285714286,0.9,0.5,0.5 148 | 6.97674418604651,0.9,0.5,0.5 149 | 9.375,0.9,0.5,0.5 150 | 12.5,0.9,0.5,0.5 151 | 16.6666666666667,0.9,0.5,0.5 152 | 21.4285714285714,0.9,0.5,0.5 153 | 28.5714285714286,0.9,0.5,0.5 154 | 37.5,0.9,0.5,0.5 155 | 50,0.9,0.5,0.5 156 | 66.6666666666667,0.9,0.5,0.5 157 | 87.5,0.9,0.5,0.5 158 | 116.666666666667,0.9,0.5,0.5 159 | 150,0.9,0.5,0.5 160 | 200,0.9,0.5,0.5 161 | 1,1,-1,0 162 | 1.31578947368421,1,-1,0 163 | 1.75438596491228,1,-1,0 164 | 2.32558139534884,1,-1,0 165 | 3.03030303030303,1,-1,0 166 | 4,1,-1,0 167 | 5.35714285714286,1,-1,0 168 | 6.97674418604651,1,-1,0 169 | 9.375,1,-1,0 170 | 12.5,1,-1,0 171 | 16.6666666666667,1,-1,0 172 | 21.4285714285714,1,-1,0 173 | 28.5714285714286,1,-1,0 174 | 37.5,1,-1,0 175 | 50,1,-1,0 176 | 66.6666666666667,1,-1,0 177 | 87.5,1,-1,0 178 | 116.666666666667,1,-1,0 179 | 150,1,-1,0 180 | 200,1,-1,0 181 | 1,1,-0.5,-0.5 182 | 1.31578947368421,1,-0.5,-0.5 183 | 1.75438596491228,1,-0.5,-0.5 184 | 2.32558139534884,1,-0.5,-0.5 185 | 3.03030303030303,1,-0.5,-0.5 186 | 4,1,-0.5,-0.5 187 | 5.35714285714286,1,-0.5,-0.5 188 | 6.97674418604651,1,-0.5,-0.5 189 | 9.375,1,-0.5,-0.5 190 | 12.5,1,-0.5,-0.5 191 | 16.6666666666667,1,-0.5,-0.5 192 | 21.4285714285714,1,-0.5,-0.5 193 | 28.5714285714286,1,-0.5,-0.5 194 | 37.5,1,-0.5,-0.5 195 | 50,1,-0.5,-0.5 196 | 66.6666666666667,1,-0.5,-0.5 197 | 87.5,1,-0.5,-0.5 198 | 116.666666666667,1,-0.5,-0.5 199 | 150,1,-0.5,-0.5 200 | 200,1,-0.5,-0.5 201 | 1,1,-0.5,0 202 | 1.31578947368421,1,-0.5,0 203 | 1.75438596491228,1,-0.5,0 204 | 2.32558139534884,1,-0.5,0 205 | 3.03030303030303,1,-0.5,0 206 | 4,1,-0.5,0 207 | 5.35714285714286,1,-0.5,0 208 | 6.97674418604651,1,-0.5,0 209 | 9.375,1,-0.5,0 210 | 12.5,1,-0.5,0 211 | 16.6666666666667,1,-0.5,0 212 | 21.4285714285714,1,-0.5,0 213 | 28.5714285714286,1,-0.5,0 214 | 37.5,1,-0.5,0 215 | 50,1,-0.5,0 216 | 66.6666666666667,1,-0.5,0 217 | 87.5,1,-0.5,0 218 | 116.666666666667,1,-0.5,0 219 | 150,1,-0.5,0 220 | 200,1,-0.5,0 221 | 1,1,-0.5,0.5 222 | 1.31578947368421,1,-0.5,0.5 223 | 1.75438596491228,1,-0.5,0.5 224 | 2.32558139534884,1,-0.5,0.5 225 | 3.03030303030303,1,-0.5,0.5 226 | 4,1,-0.5,0.5 227 | 5.35714285714286,1,-0.5,0.5 228 | 6.97674418604651,1,-0.5,0.5 229 | 
9.375,1,-0.5,0.5 230 | 12.5,1,-0.5,0.5 231 | 16.6666666666667,1,-0.5,0.5 232 | 21.4285714285714,1,-0.5,0.5 233 | 28.5714285714286,1,-0.5,0.5 234 | 37.5,1,-0.5,0.5 235 | 50,1,-0.5,0.5 236 | 66.6666666666667,1,-0.5,0.5 237 | 87.5,1,-0.5,0.5 238 | 116.666666666667,1,-0.5,0.5 239 | 150,1,-0.5,0.5 240 | 200,1,-0.5,0.5 241 | 1,1,0,-1 242 | 1.31578947368421,1,0,-1 243 | 1.75438596491228,1,0,-1 244 | 2.32558139534884,1,0,-1 245 | 3.03030303030303,1,0,-1 246 | 4,1,0,-1 247 | 5.35714285714286,1,0,-1 248 | 6.97674418604651,1,0,-1 249 | 9.375,1,0,-1 250 | 12.5,1,0,-1 251 | 16.6666666666667,1,0,-1 252 | 21.4285714285714,1,0,-1 253 | 28.5714285714286,1,0,-1 254 | 37.5,1,0,-1 255 | 50,1,0,-1 256 | 66.6666666666667,1,0,-1 257 | 87.5,1,0,-1 258 | 116.666666666667,1,0,-1 259 | 150,1,0,-1 260 | 200,1,0,-1 261 | 1,1,0,-0.5 262 | 1.31578947368421,1,0,-0.5 263 | 1.75438596491228,1,0,-0.5 264 | 2.32558139534884,1,0,-0.5 265 | 3.03030303030303,1,0,-0.5 266 | 4,1,0,-0.5 267 | 5.35714285714286,1,0,-0.5 268 | 6.97674418604651,1,0,-0.5 269 | 9.375,1,0,-0.5 270 | 12.5,1,0,-0.5 271 | 16.6666666666667,1,0,-0.5 272 | 21.4285714285714,1,0,-0.5 273 | 28.5714285714286,1,0,-0.5 274 | 37.5,1,0,-0.5 275 | 50,1,0,-0.5 276 | 66.6666666666667,1,0,-0.5 277 | 87.5,1,0,-0.5 278 | 116.666666666667,1,0,-0.5 279 | 150,1,0,-0.5 280 | 200,1,0,-0.5 281 | 1,1,0,0.5 282 | 1.31578947368421,1,0,0.5 283 | 1.75438596491228,1,0,0.5 284 | 2.32558139534884,1,0,0.5 285 | 3.03030303030303,1,0,0.5 286 | 4,1,0,0.5 287 | 5.35714285714286,1,0,0.5 288 | 6.97674418604651,1,0,0.5 289 | 9.375,1,0,0.5 290 | 12.5,1,0,0.5 291 | 16.6666666666667,1,0,0.5 292 | 21.4285714285714,1,0,0.5 293 | 28.5714285714286,1,0,0.5 294 | 37.5,1,0,0.5 295 | 50,1,0,0.5 296 | 66.6666666666667,1,0,0.5 297 | 87.5,1,0,0.5 298 | 116.666666666667,1,0,0.5 299 | 150,1,0,0.5 300 | 200,1,0,0.5 301 | 1,1,0,1 302 | 1.31578947368421,1,0,1 303 | 1.75438596491228,1,0,1 304 | 2.32558139534884,1,0,1 305 | 3.03030303030303,1,0,1 306 | 4,1,0,1 307 | 5.35714285714286,1,0,1 308 | 6.97674418604651,1,0,1 309 | 9.375,1,0,1 310 | 12.5,1,0,1 311 | 16.6666666666667,1,0,1 312 | 21.4285714285714,1,0,1 313 | 28.5714285714286,1,0,1 314 | 37.5,1,0,1 315 | 50,1,0,1 316 | 66.6666666666667,1,0,1 317 | 87.5,1,0,1 318 | 116.666666666667,1,0,1 319 | 150,1,0,1 320 | 200,1,0,1 321 | 1,1,0.5,-0.5 322 | 1.31578947368421,1,0.5,-0.5 323 | 1.75438596491228,1,0.5,-0.5 324 | 2.32558139534884,1,0.5,-0.5 325 | 3.03030303030303,1,0.5,-0.5 326 | 4,1,0.5,-0.5 327 | 5.35714285714286,1,0.5,-0.5 328 | 6.97674418604651,1,0.5,-0.5 329 | 9.375,1,0.5,-0.5 330 | 12.5,1,0.5,-0.5 331 | 16.6666666666667,1,0.5,-0.5 332 | 21.4285714285714,1,0.5,-0.5 333 | 28.5714285714286,1,0.5,-0.5 334 | 37.5,1,0.5,-0.5 335 | 50,1,0.5,-0.5 336 | 66.6666666666667,1,0.5,-0.5 337 | 87.5,1,0.5,-0.5 338 | 116.666666666667,1,0.5,-0.5 339 | 150,1,0.5,-0.5 340 | 200,1,0.5,-0.5 341 | 1,1,0.5,0 342 | 1.31578947368421,1,0.5,0 343 | 1.75438596491228,1,0.5,0 344 | 2.32558139534884,1,0.5,0 345 | 3.03030303030303,1,0.5,0 346 | 4,1,0.5,0 347 | 5.35714285714286,1,0.5,0 348 | 6.97674418604651,1,0.5,0 349 | 9.375,1,0.5,0 350 | 12.5,1,0.5,0 351 | 16.6666666666667,1,0.5,0 352 | 21.4285714285714,1,0.5,0 353 | 28.5714285714286,1,0.5,0 354 | 37.5,1,0.5,0 355 | 50,1,0.5,0 356 | 66.6666666666667,1,0.5,0 357 | 87.5,1,0.5,0 358 | 116.666666666667,1,0.5,0 359 | 150,1,0.5,0 360 | 200,1,0.5,0 361 | 1,1,0.5,0.5 362 | 1.31578947368421,1,0.5,0.5 363 | 1.75438596491228,1,0.5,0.5 364 | 2.32558139534884,1,0.5,0.5 365 | 3.03030303030303,1,0.5,0.5 366 | 4,1,0.5,0.5 367 | 5.35714285714286,1,0.5,0.5 
368 | 6.97674418604651,1,0.5,0.5 369 | 9.375,1,0.5,0.5 370 | 12.5,1,0.5,0.5 371 | 16.6666666666667,1,0.5,0.5 372 | 21.4285714285714,1,0.5,0.5 373 | 28.5714285714286,1,0.5,0.5 374 | 37.5,1,0.5,0.5 375 | 50,1,0.5,0.5 376 | 66.6666666666667,1,0.5,0.5 377 | 87.5,1,0.5,0.5 378 | 116.666666666667,1,0.5,0.5 379 | 150,1,0.5,0.5 380 | 200,1,0.5,0.5 381 | 1,1,1,0 382 | 1.31578947368421,1,1,0 383 | 1.75438596491228,1,1,0 384 | 2.32558139534884,1,1,0 385 | 3.03030303030303,1,1,0 386 | 4,1,1,0 387 | 5.35714285714286,1,1,0 388 | 6.97674418604651,1,1,0 389 | 9.375,1,1,0 390 | 12.5,1,1,0 391 | 16.6666666666667,1,1,0 392 | 21.4285714285714,1,1,0 393 | 28.5714285714286,1,1,0 394 | 37.5,1,1,0 395 | 50,1,1,0 396 | 66.6666666666667,1,1,0 397 | 87.5,1,1,0 398 | 116.666666666667,1,1,0 399 | 150,1,1,0 400 | 200,1,1,0 401 | 1,1.1,-1,-0.5 402 | 1.31578947368421,1.1,-1,-0.5 403 | 1.75438596491228,1.1,-1,-0.5 404 | 2.32558139534884,1.1,-1,-0.5 405 | 3.03030303030303,1.1,-1,-0.5 406 | 4,1.1,-1,-0.5 407 | 5.35714285714286,1.1,-1,-0.5 408 | 6.97674418604651,1.1,-1,-0.5 409 | 9.375,1.1,-1,-0.5 410 | 12.5,1.1,-1,-0.5 411 | 16.6666666666667,1.1,-1,-0.5 412 | 21.4285714285714,1.1,-1,-0.5 413 | 28.5714285714286,1.1,-1,-0.5 414 | 37.5,1.1,-1,-0.5 415 | 50,1.1,-1,-0.5 416 | 66.6666666666667,1.1,-1,-0.5 417 | 87.5,1.1,-1,-0.5 418 | 116.666666666667,1.1,-1,-0.5 419 | 150,1.1,-1,-0.5 420 | 200,1.1,-1,-0.5 421 | 1,1.1,-1,0 422 | 1.31578947368421,1.1,-1,0 423 | 1.75438596491228,1.1,-1,0 424 | 2.32558139534884,1.1,-1,0 425 | 3.03030303030303,1.1,-1,0 426 | 4,1.1,-1,0 427 | 5.35714285714286,1.1,-1,0 428 | 6.97674418604651,1.1,-1,0 429 | 9.375,1.1,-1,0 430 | 12.5,1.1,-1,0 431 | 16.6666666666667,1.1,-1,0 432 | 21.4285714285714,1.1,-1,0 433 | 28.5714285714286,1.1,-1,0 434 | 37.5,1.1,-1,0 435 | 50,1.1,-1,0 436 | 66.6666666666667,1.1,-1,0 437 | 87.5,1.1,-1,0 438 | 116.666666666667,1.1,-1,0 439 | 150,1.1,-1,0 440 | 200,1.1,-1,0 441 | 1,1.1,-1,0.5 442 | 1.31578947368421,1.1,-1,0.5 443 | 1.75438596491228,1.1,-1,0.5 444 | 2.32558139534884,1.1,-1,0.5 445 | 3.03030303030303,1.1,-1,0.5 446 | 4,1.1,-1,0.5 447 | 5.35714285714286,1.1,-1,0.5 448 | 6.97674418604651,1.1,-1,0.5 449 | 9.375,1.1,-1,0.5 450 | 12.5,1.1,-1,0.5 451 | 16.6666666666667,1.1,-1,0.5 452 | 21.4285714285714,1.1,-1,0.5 453 | 28.5714285714286,1.1,-1,0.5 454 | 37.5,1.1,-1,0.5 455 | 50,1.1,-1,0.5 456 | 66.6666666666667,1.1,-1,0.5 457 | 87.5,1.1,-1,0.5 458 | 116.666666666667,1.1,-1,0.5 459 | 150,1.1,-1,0.5 460 | 200,1.1,-1,0.5 461 | 1,1.1,-0.5,-1 462 | 1.31578947368421,1.1,-0.5,-1 463 | 1.75438596491228,1.1,-0.5,-1 464 | 2.32558139534884,1.1,-0.5,-1 465 | 3.03030303030303,1.1,-0.5,-1 466 | 4,1.1,-0.5,-1 467 | 5.35714285714286,1.1,-0.5,-1 468 | 6.97674418604651,1.1,-0.5,-1 469 | 9.375,1.1,-0.5,-1 470 | 12.5,1.1,-0.5,-1 471 | 16.6666666666667,1.1,-0.5,-1 472 | 21.4285714285714,1.1,-0.5,-1 473 | 28.5714285714286,1.1,-0.5,-1 474 | 37.5,1.1,-0.5,-1 475 | 50,1.1,-0.5,-1 476 | 66.6666666666667,1.1,-0.5,-1 477 | 87.5,1.1,-0.5,-1 478 | 116.666666666667,1.1,-0.5,-1 479 | 150,1.1,-0.5,-1 480 | 200,1.1,-0.5,-1 481 | 1,1.1,-0.5,-0.5 482 | 1.31578947368421,1.1,-0.5,-0.5 483 | 1.75438596491228,1.1,-0.5,-0.5 484 | 2.32558139534884,1.1,-0.5,-0.5 485 | 3.03030303030303,1.1,-0.5,-0.5 486 | 4,1.1,-0.5,-0.5 487 | 5.35714285714286,1.1,-0.5,-0.5 488 | 6.97674418604651,1.1,-0.5,-0.5 489 | 9.375,1.1,-0.5,-0.5 490 | 12.5,1.1,-0.5,-0.5 491 | 16.6666666666667,1.1,-0.5,-0.5 492 | 21.4285714285714,1.1,-0.5,-0.5 493 | 28.5714285714286,1.1,-0.5,-0.5 494 | 37.5,1.1,-0.5,-0.5 495 | 50,1.1,-0.5,-0.5 496 | 
66.6666666666667,1.1,-0.5,-0.5 497 | 87.5,1.1,-0.5,-0.5 498 | 116.666666666667,1.1,-0.5,-0.5 499 | 150,1.1,-0.5,-0.5 500 | 200,1.1,-0.5,-0.5 501 | 1,1.1,-0.5,0 502 | 1.31578947368421,1.1,-0.5,0 503 | 1.75438596491228,1.1,-0.5,0 504 | 2.32558139534884,1.1,-0.5,0 505 | 3.03030303030303,1.1,-0.5,0 506 | 4,1.1,-0.5,0 507 | 5.35714285714286,1.1,-0.5,0 508 | 6.97674418604651,1.1,-0.5,0 509 | 9.375,1.1,-0.5,0 510 | 12.5,1.1,-0.5,0 511 | 16.6666666666667,1.1,-0.5,0 512 | 21.4285714285714,1.1,-0.5,0 513 | 28.5714285714286,1.1,-0.5,0 514 | 37.5,1.1,-0.5,0 515 | 50,1.1,-0.5,0 516 | 66.6666666666667,1.1,-0.5,0 517 | 87.5,1.1,-0.5,0 518 | 116.666666666667,1.1,-0.5,0 519 | 150,1.1,-0.5,0 520 | 200,1.1,-0.5,0 521 | 1,1.1,-0.5,0.5 522 | 1.31578947368421,1.1,-0.5,0.5 523 | 1.75438596491228,1.1,-0.5,0.5 524 | 2.32558139534884,1.1,-0.5,0.5 525 | 3.03030303030303,1.1,-0.5,0.5 526 | 4,1.1,-0.5,0.5 527 | 5.35714285714286,1.1,-0.5,0.5 528 | 6.97674418604651,1.1,-0.5,0.5 529 | 9.375,1.1,-0.5,0.5 530 | 12.5,1.1,-0.5,0.5 531 | 16.6666666666667,1.1,-0.5,0.5 532 | 21.4285714285714,1.1,-0.5,0.5 533 | 28.5714285714286,1.1,-0.5,0.5 534 | 37.5,1.1,-0.5,0.5 535 | 50,1.1,-0.5,0.5 536 | 66.6666666666667,1.1,-0.5,0.5 537 | 87.5,1.1,-0.5,0.5 538 | 116.666666666667,1.1,-0.5,0.5 539 | 150,1.1,-0.5,0.5 540 | 200,1.1,-0.5,0.5 541 | 1,1.1,-0.5,1 542 | 1.31578947368421,1.1,-0.5,1 543 | 1.75438596491228,1.1,-0.5,1 544 | 2.32558139534884,1.1,-0.5,1 545 | 3.03030303030303,1.1,-0.5,1 546 | 4,1.1,-0.5,1 547 | 5.35714285714286,1.1,-0.5,1 548 | 6.97674418604651,1.1,-0.5,1 549 | 9.375,1.1,-0.5,1 550 | 12.5,1.1,-0.5,1 551 | 16.6666666666667,1.1,-0.5,1 552 | 21.4285714285714,1.1,-0.5,1 553 | 28.5714285714286,1.1,-0.5,1 554 | 37.5,1.1,-0.5,1 555 | 50,1.1,-0.5,1 556 | 66.6666666666667,1.1,-0.5,1 557 | 87.5,1.1,-0.5,1 558 | 116.666666666667,1.1,-0.5,1 559 | 150,1.1,-0.5,1 560 | 200,1.1,-0.5,1 561 | 1,1.1,0,-1 562 | 1.31578947368421,1.1,0,-1 563 | 1.75438596491228,1.1,0,-1 564 | 2.32558139534884,1.1,0,-1 565 | 3.03030303030303,1.1,0,-1 566 | 4,1.1,0,-1 567 | 5.35714285714286,1.1,0,-1 568 | 6.97674418604651,1.1,0,-1 569 | 9.375,1.1,0,-1 570 | 12.5,1.1,0,-1 571 | 16.6666666666667,1.1,0,-1 572 | 21.4285714285714,1.1,0,-1 573 | 28.5714285714286,1.1,0,-1 574 | 37.5,1.1,0,-1 575 | 50,1.1,0,-1 576 | 66.6666666666667,1.1,0,-1 577 | 87.5,1.1,0,-1 578 | 116.666666666667,1.1,0,-1 579 | 150,1.1,0,-1 580 | 200,1.1,0,-1 581 | 1,1.1,0,-0.5 582 | 1.31578947368421,1.1,0,-0.5 583 | 1.75438596491228,1.1,0,-0.5 584 | 2.32558139534884,1.1,0,-0.5 585 | 3.03030303030303,1.1,0,-0.5 586 | 4,1.1,0,-0.5 587 | 5.35714285714286,1.1,0,-0.5 588 | 6.97674418604651,1.1,0,-0.5 589 | 9.375,1.1,0,-0.5 590 | 12.5,1.1,0,-0.5 591 | 16.6666666666667,1.1,0,-0.5 592 | 21.4285714285714,1.1,0,-0.5 593 | 28.5714285714286,1.1,0,-0.5 594 | 37.5,1.1,0,-0.5 595 | 50,1.1,0,-0.5 596 | 66.6666666666667,1.1,0,-0.5 597 | 87.5,1.1,0,-0.5 598 | 116.666666666667,1.1,0,-0.5 599 | 150,1.1,0,-0.5 600 | 200,1.1,0,-0.5 601 | 1,1.1,0,0.5 602 | 1.31578947368421,1.1,0,0.5 603 | 1.75438596491228,1.1,0,0.5 604 | 2.32558139534884,1.1,0,0.5 605 | 3.03030303030303,1.1,0,0.5 606 | 4,1.1,0,0.5 607 | 5.35714285714286,1.1,0,0.5 608 | 6.97674418604651,1.1,0,0.5 609 | 9.375,1.1,0,0.5 610 | 12.5,1.1,0,0.5 611 | 16.6666666666667,1.1,0,0.5 612 | 21.4285714285714,1.1,0,0.5 613 | 28.5714285714286,1.1,0,0.5 614 | 37.5,1.1,0,0.5 615 | 50,1.1,0,0.5 616 | 66.6666666666667,1.1,0,0.5 617 | 87.5,1.1,0,0.5 618 | 116.666666666667,1.1,0,0.5 619 | 150,1.1,0,0.5 620 | 200,1.1,0,0.5 621 | 1,1.1,0,1 622 | 1.31578947368421,1.1,0,1 623 | 
1.75438596491228,1.1,0,1 624 | 2.32558139534884,1.1,0,1 625 | 3.03030303030303,1.1,0,1 626 | 4,1.1,0,1 627 | 5.35714285714286,1.1,0,1 628 | 6.97674418604651,1.1,0,1 629 | 9.375,1.1,0,1 630 | 12.5,1.1,0,1 631 | 16.6666666666667,1.1,0,1 632 | 21.4285714285714,1.1,0,1 633 | 28.5714285714286,1.1,0,1 634 | 37.5,1.1,0,1 635 | 50,1.1,0,1 636 | 66.6666666666667,1.1,0,1 637 | 87.5,1.1,0,1 638 | 116.666666666667,1.1,0,1 639 | 150,1.1,0,1 640 | 200,1.1,0,1 641 | 1,1.1,0.5,-1 642 | 1.31578947368421,1.1,0.5,-1 643 | 1.75438596491228,1.1,0.5,-1 644 | 2.32558139534884,1.1,0.5,-1 645 | 3.03030303030303,1.1,0.5,-1 646 | 4,1.1,0.5,-1 647 | 5.35714285714286,1.1,0.5,-1 648 | 6.97674418604651,1.1,0.5,-1 649 | 9.375,1.1,0.5,-1 650 | 12.5,1.1,0.5,-1 651 | 16.6666666666667,1.1,0.5,-1 652 | 21.4285714285714,1.1,0.5,-1 653 | 28.5714285714286,1.1,0.5,-1 654 | 37.5,1.1,0.5,-1 655 | 50,1.1,0.5,-1 656 | 66.6666666666667,1.1,0.5,-1 657 | 87.5,1.1,0.5,-1 658 | 116.666666666667,1.1,0.5,-1 659 | 150,1.1,0.5,-1 660 | 200,1.1,0.5,-1 661 | 1,1.1,0.5,-0.5 662 | 1.31578947368421,1.1,0.5,-0.5 663 | 1.75438596491228,1.1,0.5,-0.5 664 | 2.32558139534884,1.1,0.5,-0.5 665 | 3.03030303030303,1.1,0.5,-0.5 666 | 4,1.1,0.5,-0.5 667 | 5.35714285714286,1.1,0.5,-0.5 668 | 6.97674418604651,1.1,0.5,-0.5 669 | 9.375,1.1,0.5,-0.5 670 | 12.5,1.1,0.5,-0.5 671 | 16.6666666666667,1.1,0.5,-0.5 672 | 21.4285714285714,1.1,0.5,-0.5 673 | 28.5714285714286,1.1,0.5,-0.5 674 | 37.5,1.1,0.5,-0.5 675 | 50,1.1,0.5,-0.5 676 | 66.6666666666667,1.1,0.5,-0.5 677 | 87.5,1.1,0.5,-0.5 678 | 116.666666666667,1.1,0.5,-0.5 679 | 150,1.1,0.5,-0.5 680 | 200,1.1,0.5,-0.5 681 | 1,1.1,0.5,0 682 | 1.31578947368421,1.1,0.5,0 683 | 1.75438596491228,1.1,0.5,0 684 | 2.32558139534884,1.1,0.5,0 685 | 3.03030303030303,1.1,0.5,0 686 | 4,1.1,0.5,0 687 | 5.35714285714286,1.1,0.5,0 688 | 6.97674418604651,1.1,0.5,0 689 | 9.375,1.1,0.5,0 690 | 12.5,1.1,0.5,0 691 | 16.6666666666667,1.1,0.5,0 692 | 21.4285714285714,1.1,0.5,0 693 | 28.5714285714286,1.1,0.5,0 694 | 37.5,1.1,0.5,0 695 | 50,1.1,0.5,0 696 | 66.6666666666667,1.1,0.5,0 697 | 87.5,1.1,0.5,0 698 | 116.666666666667,1.1,0.5,0 699 | 150,1.1,0.5,0 700 | 200,1.1,0.5,0 701 | 1,1.1,0.5,0.5 702 | 1.31578947368421,1.1,0.5,0.5 703 | 1.75438596491228,1.1,0.5,0.5 704 | 2.32558139534884,1.1,0.5,0.5 705 | 3.03030303030303,1.1,0.5,0.5 706 | 4,1.1,0.5,0.5 707 | 5.35714285714286,1.1,0.5,0.5 708 | 6.97674418604651,1.1,0.5,0.5 709 | 9.375,1.1,0.5,0.5 710 | 12.5,1.1,0.5,0.5 711 | 16.6666666666667,1.1,0.5,0.5 712 | 21.4285714285714,1.1,0.5,0.5 713 | 28.5714285714286,1.1,0.5,0.5 714 | 37.5,1.1,0.5,0.5 715 | 50,1.1,0.5,0.5 716 | 66.6666666666667,1.1,0.5,0.5 717 | 87.5,1.1,0.5,0.5 718 | 116.666666666667,1.1,0.5,0.5 719 | 150,1.1,0.5,0.5 720 | 200,1.1,0.5,0.5 721 | 1,1.1,0.5,1 722 | 1.31578947368421,1.1,0.5,1 723 | 1.75438596491228,1.1,0.5,1 724 | 2.32558139534884,1.1,0.5,1 725 | 3.03030303030303,1.1,0.5,1 726 | 4,1.1,0.5,1 727 | 5.35714285714286,1.1,0.5,1 728 | 6.97674418604651,1.1,0.5,1 729 | 9.375,1.1,0.5,1 730 | 12.5,1.1,0.5,1 731 | 16.6666666666667,1.1,0.5,1 732 | 21.4285714285714,1.1,0.5,1 733 | 28.5714285714286,1.1,0.5,1 734 | 37.5,1.1,0.5,1 735 | 50,1.1,0.5,1 736 | 66.6666666666667,1.1,0.5,1 737 | 87.5,1.1,0.5,1 738 | 116.666666666667,1.1,0.5,1 739 | 150,1.1,0.5,1 740 | 200,1.1,0.5,1 741 | 1,1.1,1,-0.5 742 | 1.31578947368421,1.1,1,-0.5 743 | 1.75438596491228,1.1,1,-0.5 744 | 2.32558139534884,1.1,1,-0.5 745 | 3.03030303030303,1.1,1,-0.5 746 | 4,1.1,1,-0.5 747 | 5.35714285714286,1.1,1,-0.5 748 | 6.97674418604651,1.1,1,-0.5 749 | 9.375,1.1,1,-0.5 750 
| 12.5,1.1,1,-0.5 751 | 16.6666666666667,1.1,1,-0.5 752 | 21.4285714285714,1.1,1,-0.5 753 | 28.5714285714286,1.1,1,-0.5 754 | 37.5,1.1,1,-0.5 755 | 50,1.1,1,-0.5 756 | 66.6666666666667,1.1,1,-0.5 757 | 87.5,1.1,1,-0.5 758 | 116.666666666667,1.1,1,-0.5 759 | 150,1.1,1,-0.5 760 | 200,1.1,1,-0.5 761 | 1,1.1,1,0 762 | 1.31578947368421,1.1,1,0 763 | 1.75438596491228,1.1,1,0 764 | 2.32558139534884,1.1,1,0 765 | 3.03030303030303,1.1,1,0 766 | 4,1.1,1,0 767 | 5.35714285714286,1.1,1,0 768 | 6.97674418604651,1.1,1,0 769 | 9.375,1.1,1,0 770 | 12.5,1.1,1,0 771 | 16.6666666666667,1.1,1,0 772 | 21.4285714285714,1.1,1,0 773 | 28.5714285714286,1.1,1,0 774 | 37.5,1.1,1,0 775 | 50,1.1,1,0 776 | 66.6666666666667,1.1,1,0 777 | 87.5,1.1,1,0 778 | 116.666666666667,1.1,1,0 779 | 150,1.1,1,0 780 | 200,1.1,1,0 781 | 1,1.1,1,0.5 782 | 1.31578947368421,1.1,1,0.5 783 | 1.75438596491228,1.1,1,0.5 784 | 2.32558139534884,1.1,1,0.5 785 | 3.03030303030303,1.1,1,0.5 786 | 4,1.1,1,0.5 787 | 5.35714285714286,1.1,1,0.5 788 | 6.97674418604651,1.1,1,0.5 789 | 9.375,1.1,1,0.5 790 | 12.5,1.1,1,0.5 791 | 16.6666666666667,1.1,1,0.5 792 | 21.4285714285714,1.1,1,0.5 793 | 28.5714285714286,1.1,1,0.5 794 | 37.5,1.1,1,0.5 795 | 50,1.1,1,0.5 796 | 66.6666666666667,1.1,1,0.5 797 | 87.5,1.1,1,0.5 798 | 116.666666666667,1.1,1,0.5 799 | 150,1.1,1,0.5 800 | 200,1.1,1,0.5 801 | -------------------------------------------------------------------------------- /doc/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /doc/Overall---.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/superrabbit2023/InvNet/d2263d6b7e1b72805256040480c1c39c78fb765b/doc/Overall---.png -------------------------------------------------------------------------------- /optuna/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /optuna/Yo_ADAM_Optuna_1.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "colab": { 8 | "base_uri": "https://localhost:8080/" 9 | }, 10 | "id": "PCPZZtpVBNxh", 11 | "outputId": "e02e9cb4-b782-4d32-d3f2-04ab32f6411c", 12 | "scrolled": true 13 | }, 14 | "outputs": [], 15 | "source": [ 16 | "# ==================================================================================================\n", 17 | "# Yufei Li\n", 18 | "# Princeton University\n", 19 | "# yl5385@princeton.edu\n", 20 | "\n", 21 | "# February 2023\n", 22 | "\n", 23 | "# Note: In this demo, the neural network is synthesized using the TensorFlow (version: 2.11.0) framework. 
\n", 24 | "# Please install TensorFlow according to the official guidance, then import TensorFlow and other dependent modules.\n", 25 | "# ==================================================================================================\n", 26 | "\n", 27 | "!pip install pandas numpy matplotlib\n", 28 | "!pip install tensorflow\n", 29 | "!pip install optuna" 30 | ] 31 | }, 32 | { 33 | "cell_type": "code", 34 | "execution_count": null, 35 | "metadata": { 36 | "colab": { 37 | "base_uri": "https://localhost:8080/" 38 | }, 39 | "id": "ugbgmewAO313", 40 | "outputId": "d3ee69ac-b8d9-4046-a6a2-c272c8172bc8", 41 | "scrolled": true 42 | }, 43 | "outputs": [], 44 | "source": [ 45 | "import tensorflow as tf\n", 46 | "\n", 47 | "import matplotlib.pyplot as plt\n", 48 | "import numpy as np\n", 49 | "import pandas as pd\n", 50 | "import random\n", 51 | "import copy\n", 52 | "import csv\n", 53 | "import math\n", 54 | "import cmath\n", 55 | "import time\n", 56 | "\n", 57 | "import optuna\n", 58 | "\n", 59 | "inFilename = \"Input_Yo20op272_ana.csv\"\n", 60 | "outFilename = \"Output_Yo20op272_ana.csv\"\n", 61 | "\n", 62 | "Input = pd.read_csv(inFilename,header=None)\n", 63 | "Output = pd.read_csv(outFilename,header=None)\n", 64 | "\n", 65 | "print(Input.dtypes)\n", 66 | "print(Output.dtypes)\n", 67 | "\n", 68 | "inputs = []\n", 69 | "outputs = []\n", 70 | "\n", 71 | "inputs = np.array(Input)\n", 72 | "outputs = np.array(Output)\n", 73 | "print(inputs)\n", 74 | "print(outputs)" 75 | ] 76 | }, 77 | { 78 | "cell_type": "code", 79 | "execution_count": null, 80 | "metadata": { 81 | "colab": { 82 | "base_uri": "https://localhost:8080/" 83 | }, 84 | "id": "Jzhcz0jETg-j", 85 | "outputId": "6e1e0b75-20e5-4b31-949a-1022b36d4b07" 86 | }, 87 | "outputs": [], 88 | "source": [ 89 | "# Randomize the order of the inputs, so they can be evenly distributed for training, testing, and validation\n", 90 | "\n", 91 | "num_inputs = len(inputs)\n", 92 | "print(\"Total Number of Dataset is:\",num_inputs)\n", 93 | "randomize = np.arange(num_inputs)\n", 94 | "print(randomize)\n", 95 | "\n", 96 | "inputs_origin = copy.deepcopy(inputs[randomize])\n", 97 | "outputs_origin = copy.deepcopy(outputs[randomize])\n", 98 | "print(inputs_origin)\n", 99 | "\n", 100 | "random.Random(5).shuffle(randomize)\n", 101 | "print(randomize)\n", 102 | "# Swap the consecutive indexes (0, 1, 2, etc) with the randomized indexes\n", 103 | "inputs_real = copy.deepcopy(inputs_origin[randomize])\n", 104 | "outputs_real = copy.deepcopy(outputs_origin[randomize])\n", 105 | "print(inputs_real)\n", 106 | "print(outputs_real)\n", 107 | "\n", 108 | "# Split the recordings (group of samples) into two sets: training and testing\n", 109 | "TRAIN_SPLIT = int(0.7 * num_inputs)\n", 110 | "inputs_train, inputs_test = np.split(inputs_real, [TRAIN_SPLIT])\n", 111 | "outputs_train, outputs_test = np.split(outputs_real, [TRAIN_SPLIT])\n", 112 | "\n", 113 | "num_inputs_train = len(inputs_train)\n", 114 | "print(\"Total Number of training Dataset is:\",num_inputs_train)\n", 115 | "print(\"Dataset randomization and separation complete!\")" 116 | ] 117 | }, 118 | { 119 | "cell_type": "code", 120 | "execution_count": null, 121 | "metadata": { 122 | "colab": { 123 | "base_uri": "https://localhost:8080/" 124 | }, 125 | "id": "29MhrBmsO4xm", 126 | "outputId": "3befb570-ed17-4c3b-8d58-245e0066289c" 127 | }, 128 | "outputs": [], 129 | "source": [ 130 | "EPOCHS = 500\n", 131 | "\n", 132 | "def objective(trial):\n", 133 | " #Clear clutter from previous tf.keras session graphs.\n", 134 | " 
#tf.keras.backend.clear_session()\n", 135 | " \n", 136 | " #Define normalization layer\n", 137 | " Normlayer1=tf.keras.layers.Normalization()\n", 138 | " Normlayer1.adapt(inputs_real)\n", 139 | " \n", 140 | " #Model construction\n", 141 | " #Optimize the numbers of layers and their units.\n", 142 | " n_layers = trial.suggest_int(\"n_layers\", 2, 6)\n", 143 | " model = tf.keras.Sequential() \n", 144 | " model.add(Normlayer1)\n", 145 | " for i in range(n_layers):\n", 146 | " neurons_hidden = trial.suggest_int(\"n_units_l{}\".format(i), 32, 128, log=True)\n", 147 | " model.add(tf.keras.layers.Dense(neurons_hidden, activation=\"sigmoid\"))\n", 148 | " model.add(tf.keras.layers.Dense(8))\n", 149 | "\n", 150 | " #Defining learning rate schedule\n", 151 | " LR_INIT = trial.suggest_float(\"lr_init\", 1e-3, 5e-1, log=True)\n", 152 | " DECAY_EPOCHS = trial.suggest_int(\"decay_epochs\", 10, 200, log=True)\n", 153 | " DECAY_RATE = trial.suggest_float(\"decay_rate\", 0.4, 0.9, log=True)\n", 154 | " BATCH_SIZE = trial.suggest_categorical(\"batch_size\", [16, 32, 64, 128, 256, 512, 1024])\n", 155 | "\n", 156 | " lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(\n", 157 | " LR_INIT,\n", 158 | " decay_steps=math.ceil(num_inputs_train/BATCH_SIZE)*DECAY_EPOCHS,\n", 159 | " decay_rate=DECAY_RATE,\n", 160 | " staircase=True)\n", 161 | " \n", 162 | " #Compile model\n", 163 | " opt = tf.keras.optimizers.Adam(learning_rate=lr_schedule)\n", 164 | " model.compile(optimizer=opt, loss='mse', metrics=['mse'])\n", 165 | " \n", 166 | " history = model.fit(inputs_train, outputs_train, epochs=EPOCHS, batch_size=BATCH_SIZE, validation_data=(inputs_test, outputs_test), verbose=0)\n", 167 | "\n", 168 | " # Evaluate the model accuracy on the validation set.\n", 169 | " score = model.evaluate(inputs_test, outputs_test, verbose=0)\n", 170 | " return score[1]\n", 171 | "print(\"Optimization object construction complete!\")" 172 | ] 173 | }, 174 | { 175 | "cell_type": "code", 176 | "execution_count": null, 177 | "metadata": { 178 | "colab": { 179 | "base_uri": "https://localhost:8080/" 180 | }, 181 | "id": "nrebGnmp6tnl", 182 | "outputId": "b3ac9602-7151-4d5c-9b3b-7c737515e609", 183 | "scrolled": true 184 | }, 185 | "outputs": [], 186 | "source": [ 187 | "if __name__ == \"__main__\":\n", 188 | " study = optuna.create_study(direction=\"minimize\")\n", 189 | " study.optimize(objective, n_trials=100)\n", 190 | " \n", 191 | " print(\"Number of finished trials: \", len(study.trials)) \n", 192 | " \n", 193 | " print(\"Best trial:\")\n", 194 | " trial = study.best_trial\n", 195 | "\n", 196 | " print(\" Value: \", trial.value)\n", 197 | "\n", 198 | " print(\" Params: \")\n", 199 | " for key, value in trial.params.items():\n", 200 | " print(\" {}: {}\".format(key, value))" 201 | ] 202 | }, 203 | { 204 | "cell_type": "code", 205 | "execution_count": null, 206 | "metadata": { 207 | "colab": { 208 | "base_uri": "https://localhost:8080/", 209 | "height": 165 210 | }, 211 | "id": "1gEAmmoB3Vi5", 212 | "outputId": "0533cc2d-7b87-4f9e-e369-4e6d6c9969eb" 213 | }, 214 | "outputs": [], 215 | "source": [ 216 | "optuna.visualization.plot_optimization_history(study)#Plotting the optimization history of the study." 
217 | ] 218 | }, 219 | { 220 | "cell_type": "code", 221 | "execution_count": null, 222 | "metadata": { 223 | "id": "YO7EXI19Bbpo" 224 | }, 225 | "outputs": [], 226 | "source": [ 227 | "optuna.visualization.plot_intermediate_values(study)#Visualizing the Learning Curves of the Trials" 228 | ] 229 | }, 230 | { 231 | "cell_type": "code", 232 | "execution_count": null, 233 | "metadata": { 234 | "id": "hondL6qaBzAG" 235 | }, 236 | "outputs": [], 237 | "source": [ 238 | "optuna.visualization.plot_parallel_coordinate(study)#Visualizing High-dimensional Parameter Relationships" 239 | ] 240 | }, 241 | { 242 | "cell_type": "code", 243 | "execution_count": null, 244 | "metadata": { 245 | "colab": { 246 | "base_uri": "https://localhost:8080/", 247 | "height": 542 248 | }, 249 | "id": "VBBQ6xP3B95v", 250 | "outputId": "07fe6926-0f1d-4170-f896-20d973b967ce" 251 | }, 252 | "outputs": [], 253 | "source": [ 254 | "optuna.visualization.plot_parallel_coordinate(study, params=['lr_init', 'n_layers', 'n_units_l0', 'n_units_l1', 'n_units_l2', 'n_units_l3', 'decay_epochs', 'decay_rate'])#Selecting Parameters to Visualize" 255 | ] 256 | }, 257 | { 258 | "cell_type": "code", 259 | "execution_count": null, 260 | "metadata": { 261 | "id": "G656q4EkCKA7" 262 | }, 263 | "outputs": [], 264 | "source": [ 265 | "optuna.visualization.plot_contour(study)#Visualizing Parameter Relationships" 266 | ] 267 | }, 268 | { 269 | "cell_type": "code", 270 | "execution_count": null, 271 | "metadata": { 272 | "colab": { 273 | "base_uri": "https://localhost:8080/", 274 | "height": 542 275 | }, 276 | "id": "EPgkkZlpCTk9", 277 | "outputId": "d06837b6-e6a6-4e7b-a385-c11dd08e443c" 278 | }, 279 | "outputs": [], 280 | "source": [ 281 | "optuna.visualization.plot_contour(study, params=['n_units_l0', 'n_units_l1'])#Selecting Parameters to Visualize" 282 | ] 283 | }, 284 | { 285 | "cell_type": "code", 286 | "execution_count": null, 287 | "metadata": { 288 | "colab": { 289 | "base_uri": "https://localhost:8080/", 290 | "height": 562 291 | }, 292 | "id": "D2XDYwAA4eZz", 293 | "outputId": "7c188b5e-4d2a-4772-b44b-add423e7f1ca" 294 | }, 295 | "outputs": [], 296 | "source": [ 297 | "optuna.visualization.plot_slice(study)#Plotting the accuracies for each hyperparameter for each trial. 
Visualizing Individual Parameters" 298 | ] 299 | }, 300 | { 301 | "cell_type": "code", 302 | "execution_count": null, 303 | "metadata": { 304 | "colab": { 305 | "base_uri": "https://localhost:8080/", 306 | "height": 542 307 | }, 308 | "id": "mIBJIVoICgKV", 309 | "outputId": "7dd344dc-f0dc-47f4-982e-7228e758e514" 310 | }, 311 | "outputs": [], 312 | "source": [ 313 | "optuna.visualization.plot_slice(study, params=['n_units_l0', 'n_units_l1'])#Selecting Parameters to Visualize" 314 | ] 315 | }, 316 | { 317 | "cell_type": "code", 318 | "execution_count": null, 319 | "metadata": { 320 | "id": "At1CM3POCmQY" 321 | }, 322 | "outputs": [], 323 | "source": [ 324 | "optuna.visualization.plot_param_importances(study)#Visualizing Parameter Importances" 325 | ] 326 | }, 327 | { 328 | "cell_type": "code", 329 | "execution_count": null, 330 | "metadata": { 331 | "colab": { 332 | "base_uri": "https://localhost:8080/", 333 | "height": 542 334 | }, 335 | "id": "w2CL1_ixgvnB", 336 | "outputId": "80aabf56-fbac-422a-d34c-5513650eb91c" 337 | }, 338 | "outputs": [], 339 | "source": [ 340 | "optuna.visualization.plot_edf(study)" 341 | ] 342 | } 343 | ], 344 | "metadata": { 345 | "colab": { 346 | "collapsed_sections": [], 347 | "provenance": [] 348 | }, 349 | "kernelspec": { 350 | "display_name": "Python 3 (ipykernel)", 351 | "language": "python", 352 | "name": "python3" 353 | }, 354 | "language_info": { 355 | "codemirror_mode": { 356 | "name": "ipython", 357 | "version": 3 358 | }, 359 | "file_extension": ".py", 360 | "mimetype": "text/x-python", 361 | "name": "python", 362 | "nbconvert_exporter": "python", 363 | "pygments_lexer": "ipython3", 364 | "version": "3.9.13" 365 | } 366 | }, 367 | "nbformat": 4, 368 | "nbformat_minor": 1 369 | } 370 | -------------------------------------------------------------------------------- /optuna/Yo_ADAM_Optuna_2.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "colab": { 8 | "base_uri": "https://localhost:8080/" 9 | }, 10 | "id": "PCPZZtpVBNxh", 11 | "outputId": "c341b3a8-9f41-44b5-8bc7-79287dc20759", 12 | "scrolled": true, 13 | "tags": [] 14 | }, 15 | "outputs": [], 16 | "source": [ 17 | "# ==================================================================================================\n", 18 | "# Yufei Li\n", 19 | "# Princeton University\n", 20 | "# yl5385@princeton.edu\n", 21 | "\n", 22 | "# Feburay 2023\n", 23 | "\n", 24 | "# Note:In this demo, the neural network is synthesized using the TensorFlow (verion: 2.11.0) framework. 
\n", 25 | "# Please install TensorFlow according to the official guidance, then import TensorFlow and other dependent modules.\n", 26 | "# ==================================================================================================\n", 27 | "\n", 28 | "!pip install pandas numpy matplotlib\n", 29 | "!pip install tensorflow\n", 30 | "!pip install optuna" 31 | ] 32 | }, 33 | { 34 | "cell_type": "code", 35 | "execution_count": null, 36 | "metadata": { 37 | "colab": { 38 | "base_uri": "https://localhost:8080/" 39 | }, 40 | "id": "ugbgmewAO313", 41 | "outputId": "c8f00111-7aff-490f-a7d9-ed971423c947", 42 | "scrolled": true, 43 | "tags": [] 44 | }, 45 | "outputs": [], 46 | "source": [ 47 | "import tensorflow as tf\n", 48 | "\n", 49 | "import matplotlib.pyplot as plt\n", 50 | "import numpy as np\n", 51 | "import pandas as pd\n", 52 | "import random\n", 53 | "import copy\n", 54 | "import csv\n", 55 | "import math\n", 56 | "import cmath\n", 57 | "import time\n", 58 | "\n", 59 | "import optuna\n", 60 | "\n", 61 | "inFilename = \"Input_Yo20op272_ana.csv\"\n", 62 | "outFilename = \"Output_Yo20op272_ana.csv\"\n", 63 | "\n", 64 | "Input = pd.read_csv(inFilename,header=None)\n", 65 | "Output = pd.read_csv(outFilename,header=None)\n", 66 | "\n", 67 | "print(Input.dtypes)\n", 68 | "print(Output.dtypes)\n", 69 | "\n", 70 | "inputs = []\n", 71 | "outputs = []\n", 72 | "\n", 73 | "inputs = np.array(Input)\n", 74 | "outputs = np.array(Output)\n", 75 | "print(inputs)\n", 76 | "print(outputs)" 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": null, 82 | "metadata": { 83 | "colab": { 84 | "base_uri": "https://localhost:8080/" 85 | }, 86 | "id": "Jzhcz0jETg-j", 87 | "outputId": "e769f805-e7fe-463c-ee53-fc730028c983" 88 | }, 89 | "outputs": [], 90 | "source": [ 91 | "# Randomize the order of the inputs, so they can be evenly distributed for training, testing, and validation\n", 92 | "\n", 93 | "num_inputs = len(inputs)\n", 94 | "print(\"Total Number of Dataset is:\",num_inputs)\n", 95 | "randomize = np.arange(num_inputs)\n", 96 | "print(randomize)\n", 97 | "\n", 98 | "inputs_origin = copy.deepcopy(inputs[randomize])\n", 99 | "outputs_origin = copy.deepcopy(outputs[randomize])\n", 100 | "print(inputs_origin)\n", 101 | "\n", 102 | "random.Random(5).shuffle(randomize)\n", 103 | "print(randomize)\n", 104 | "# Swap the consecutive indexes (0, 1, 2, etc) with the randomized indexes\n", 105 | "inputs_real = copy.deepcopy(inputs_origin[randomize])\n", 106 | "outputs_real = copy.deepcopy(outputs_origin[randomize])\n", 107 | "print(inputs_real)\n", 108 | "print(outputs_real)\n", 109 | "\n", 110 | "# Split the recordings (group of samples) into two sets: training and testing\n", 111 | "TRAIN_SPLIT = int(0.7 * num_inputs)\n", 112 | "inputs_train, inputs_test = np.split(inputs_real, [TRAIN_SPLIT])\n", 113 | "outputs_train, outputs_test = np.split(outputs_real, [TRAIN_SPLIT])\n", 114 | "\n", 115 | "num_inputs_train = len(inputs_train)\n", 116 | "print(\"Total Number of training Dataset is:\",num_inputs_train)\n", 117 | "print(\"Dataset randomization and separation complete!\")" 118 | ] 119 | }, 120 | { 121 | "cell_type": "code", 122 | "execution_count": null, 123 | "metadata": { 124 | "colab": { 125 | "base_uri": "https://localhost:8080/" 126 | }, 127 | "id": "29MhrBmsO4xm", 128 | "outputId": "8ae5f5ae-e1d4-419a-8df7-9c993e9d29e2" 129 | }, 130 | "outputs": [], 131 | "source": [ 132 | "EPOCHS = 500\n", 133 | "\n", 134 | "def objective(trial):\n", 135 | " #Clear clutter from previous tf.keras session 
graphs.\n", 136 | " #tf.keras.backend.clear_session()\n", 137 | " \n", 138 | " #Define normalization layer\n", 139 | " Normlayer1=tf.keras.layers.Normalization()\n", 140 | " Normlayer1.adapt(inputs_train)\n", 141 | " \n", 142 | " #Model construction (2140 dataset)\n", 143 | " #Optimize the numbers of layers and their units.\n", 144 | " model = tf.keras.Sequential()\n", 145 | " model.add(Normlayer1) \n", 146 | " model.add(tf.keras.layers.Dense(43, activation='sigmoid'))\n", 147 | " model.add(tf.keras.layers.Dense(56, activation='sigmoid'))\n", 148 | " model.add(tf.keras.layers.Dense(43, activation='sigmoid'))\n", 149 | " model.add(tf.keras.layers.Dense(8))\n", 150 | "\n", 151 | " #Defining learning rate schedule\n", 152 | " LR_INIT = trial.suggest_float(\"lr_init\", 1e-3, 5e-1, log=True)\n", 153 | " DECAY_EPOCHS = trial.suggest_int(\"decay_epochs\", 10, 200, log=True)\n", 154 | " DECAY_RATE = trial.suggest_float(\"decay_rate\", 0.4, 0.9, log=True)\n", 155 | " BATCH_SIZE = trial.suggest_categorical(\"batch_size\", [16, 32, 64, 128, 256, 512, 1024])\n", 156 | " \n", 157 | " lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(\n", 158 | " LR_INIT,\n", 159 | " decay_steps=math.ceil(num_inputs_train/BATCH_SIZE)*DECAY_EPOCHS,\n", 160 | " decay_rate=DECAY_RATE,\n", 161 | " staircase=True)\n", 162 | " \n", 163 | " #Compile model\n", 164 | " opt = tf.keras.optimizers.Adam(learning_rate=lr_schedule)\n", 165 | " model.compile(optimizer=opt, loss='mse', metrics=['mse'])\n", 166 | " \n", 167 | " history = model.fit(inputs_train, outputs_train, epochs=EPOCHS, batch_size=BATCH_SIZE, validation_data=(inputs_test, outputs_test), verbose=0)\n", 168 | "\n", 169 | " # Evaluate the model accuracy on the validation set.\n", 170 | " score = model.evaluate(inputs_test, outputs_test, verbose=0)\n", 171 | " return score[1]\n", 172 | "print(\"Optimization object construction complete!\")" 173 | ] 174 | }, 175 | { 176 | "cell_type": "code", 177 | "execution_count": null, 178 | "metadata": { 179 | "colab": { 180 | "base_uri": "https://localhost:8080/" 181 | }, 182 | "id": "nrebGnmp6tnl", 183 | "outputId": "ab2e2662-e941-4e49-92d0-593c5528277b", 184 | "scrolled": true, 185 | "tags": [] 186 | }, 187 | "outputs": [], 188 | "source": [ 189 | "if __name__ == \"__main__\":\n", 190 | " study = optuna.create_study(direction=\"minimize\")\n", 191 | " study.optimize(objective, n_trials=200)\n", 192 | " \n", 193 | " print(\"Number of finished trials: \", len(study.trials)) \n", 194 | " \n", 195 | " print(\"Best trial:\")\n", 196 | " trial = study.best_trial\n", 197 | "\n", 198 | " print(\" Value: \", trial.value)\n", 199 | "\n", 200 | " print(\" Params: \")\n", 201 | " for key, value in trial.params.items():\n", 202 | " print(\" {}: {}\".format(key, value))" 203 | ] 204 | }, 205 | { 206 | "cell_type": "code", 207 | "execution_count": null, 208 | "metadata": { 209 | "colab": { 210 | "base_uri": "https://localhost:8080/", 211 | "height": 542 212 | }, 213 | "id": "1gEAmmoB3Vi5", 214 | "outputId": "e9969be2-51bc-4468-c1bd-ac99f58d963f" 215 | }, 216 | "outputs": [], 217 | "source": [ 218 | "optuna.visualization.plot_optimization_history(study)#Plotting the optimization history of the study." 
219 | ] 220 | }, 221 | { 222 | "cell_type": "code", 223 | "execution_count": null, 224 | "metadata": { 225 | "colab": { 226 | "base_uri": "https://localhost:8080/", 227 | "height": 559 228 | }, 229 | "id": "YO7EXI19Bbpo", 230 | "outputId": "2afda139-b37b-48c3-9b51-258e8c147c6a" 231 | }, 232 | "outputs": [], 233 | "source": [ 234 | "optuna.visualization.plot_intermediate_values(study)#Visualizing the Learning Curves of the Trials" 235 | ] 236 | }, 237 | { 238 | "cell_type": "code", 239 | "execution_count": null, 240 | "metadata": { 241 | "colab": { 242 | "base_uri": "https://localhost:8080/", 243 | "height": 542 244 | }, 245 | "id": "hondL6qaBzAG", 246 | "outputId": "26f6682a-17d7-44c2-dbd1-c2d2c117a521" 247 | }, 248 | "outputs": [], 249 | "source": [ 250 | "optuna.visualization.plot_parallel_coordinate(study)#Visualizing High-dimensional Parameter Relationships" 251 | ] 252 | }, 253 | { 254 | "cell_type": "code", 255 | "execution_count": null, 256 | "metadata": { 257 | "colab": { 258 | "base_uri": "https://localhost:8080/", 259 | "height": 542 260 | }, 261 | "id": "VBBQ6xP3B95v", 262 | "outputId": "bcc9b275-1758-4299-ff29-4b3dd6a664ce" 263 | }, 264 | "outputs": [], 265 | "source": [ 266 | "optuna.visualization.plot_parallel_coordinate(study, params=['lr_init', 'decay_epochs', 'decay_rate'])#Selecting Parameters to Visualize" 267 | ] 268 | }, 269 | { 270 | "cell_type": "code", 271 | "execution_count": null, 272 | "metadata": { 273 | "colab": { 274 | "base_uri": "https://localhost:8080/", 275 | "height": 542 276 | }, 277 | "id": "G656q4EkCKA7", 278 | "outputId": "09c609c5-baaf-4ae0-e51d-5e591e0a9e13" 279 | }, 280 | "outputs": [], 281 | "source": [ 282 | "optuna.visualization.plot_contour(study)#Visualizing Parameter Relationships" 283 | ] 284 | }, 285 | { 286 | "cell_type": "code", 287 | "execution_count": null, 288 | "metadata": { 289 | "id": "EPgkkZlpCTk9" 290 | }, 291 | "outputs": [], 292 | "source": [ 293 | "optuna.visualization.plot_contour(study, params=['lr_init', 'decay_rate'])#Selecting Parameters to Visualize (this study tunes only lr_init, decay_epochs, decay_rate, and batch_size; the n_units_lX parameters of Optuna_1 do not exist here and would raise an error)" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": null, 299 | "metadata": { 300 | "colab": { 301 | "base_uri": "https://localhost:8080/", 302 | "height": 542 303 | }, 304 | "id": "D2XDYwAA4eZz", 305 | "outputId": "3864c336-697c-4a97-aa8b-9eafe142afc7" 306 | }, 307 | "outputs": [], 308 | "source": [ 309 | "optuna.visualization.plot_slice(study)#Plotting the accuracies for each hyperparameter for each trial. 
Visualizing Individual Parameters" 310 | ] 311 | }, 312 | { 313 | "cell_type": "code", 314 | "execution_count": null, 315 | "metadata": { 316 | "id": "mIBJIVoICgKV" 317 | }, 318 | "outputs": [], 319 | "source": [ 320 | "optuna.visualization.plot_slice(study, params=['n_units_l0', 'n_units_l1'])#Selecting Parameters to Visualize" 321 | ] 322 | }, 323 | { 324 | "cell_type": "code", 325 | "execution_count": null, 326 | "metadata": { 327 | "colab": { 328 | "base_uri": "https://localhost:8080/", 329 | "height": 542 330 | }, 331 | "id": "At1CM3POCmQY", 332 | "outputId": "b33caf77-b8a9-4627-a482-ed130f653df1" 333 | }, 334 | "outputs": [], 335 | "source": [ 336 | "optuna.visualization.plot_param_importances(study)#Visualizing Parameter Importances" 337 | ] 338 | }, 339 | { 340 | "cell_type": "code", 341 | "execution_count": null, 342 | "metadata": { 343 | "colab": { 344 | "base_uri": "https://localhost:8080/", 345 | "height": 542 346 | }, 347 | "id": "w2CL1_ixgvnB", 348 | "outputId": "e611da81-20d6-494f-8a83-fc660c9449c9" 349 | }, 350 | "outputs": [], 351 | "source": [ 352 | "optuna.visualization.plot_edf(study)" 353 | ] 354 | } 355 | ], 356 | "metadata": { 357 | "colab": { 358 | "collapsed_sections": [], 359 | "name": "YdqFNN_ADAM_2_Optuna_Ydq50op24Ydq20op47.ipynb", 360 | "provenance": [] 361 | }, 362 | "kernelspec": { 363 | "display_name": "Python 3 (ipykernel)", 364 | "language": "python", 365 | "name": "python3" 366 | }, 367 | "language_info": { 368 | "codemirror_mode": { 369 | "name": "ipython", 370 | "version": 3 371 | }, 372 | "file_extension": ".py", 373 | "mimetype": "text/x-python", 374 | "name": "python", 375 | "nbconvert_exporter": "python", 376 | "pygments_lexer": "ipython3", 377 | "version": "3.9.13" 378 | } 379 | }, 380 | "nbformat": 4, 381 | "nbformat_minor": 4 382 | } 383 | -------------------------------------------------------------------------------- /simulation examples/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /training/Yo_FNN_ADAM.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "colab": { 8 | "base_uri": "https://localhost:8080/" 9 | }, 10 | "executionInfo": { 11 | "elapsed": 29031, 12 | "status": "ok", 13 | "timestamp": 1648514570883, 14 | "user": { 15 | "displayName": "Ramsfield Lee", 16 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GiHPffa6hxu7mszm5ZYTfN7ZGsy0zsXFpnUK65SWA=s64", 17 | "userId": "05718072999768408940" 18 | }, 19 | "user_tz": 240 20 | }, 21 | "id": "ExLEi_TVowna", 22 | "outputId": "f3634a36-87a8-4b32-f899-fe91ba7510c3", 23 | "scrolled": true, 24 | "tags": [] 25 | }, 26 | "outputs": [], 27 | "source": [ 28 | "# ==================================================================================================\n", 29 | "# Yufei Li\n", 30 | "# Princeton University\n", 31 | "# yl5385@princeton.edu\n", 32 | "\n", 33 | "# Feburay 2023\n", 34 | "\n", 35 | "# Note:In this demo, the neural network is synthesized using the TensorFlow (verion: 2.11.0) framework. 
\n", 36 | "# Please install TensorFlow according to the official guidance, then import TensorFlow and other dependent modules.\n", 37 | "# ==================================================================================================\n", 38 | "\n", 39 | "!pip install pandas numpy matplotlib\n", 40 | "!pip install tensorflow" 41 | ] 42 | }, 43 | { 44 | "cell_type": "code", 45 | "execution_count": null, 46 | "metadata": { 47 | "colab": { 48 | "base_uri": "https://localhost:8080/" 49 | }, 50 | "executionInfo": { 51 | "elapsed": 337, 52 | "status": "ok", 53 | "timestamp": 1648518191561, 54 | "user": { 55 | "displayName": "Ramsfield Lee", 56 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GiHPffa6hxu7mszm5ZYTfN7ZGsy0zsXFpnUK65SWA=s64", 57 | "userId": "05718072999768408940" 58 | }, 59 | "user_tz": 240 60 | }, 61 | "id": "LAYr_kNioymd", 62 | "outputId": "e7354939-02a1-43ad-8e93-eeda6669f337", 63 | "scrolled": true, 64 | "tags": [] 65 | }, 66 | "outputs": [], 67 | "source": [ 68 | "import tensorflow as tf\n", 69 | "\n", 70 | "import matplotlib.pyplot as plt\n", 71 | "import numpy as np\n", 72 | "import pandas as pd\n", 73 | "import random\n", 74 | "import copy\n", 75 | "import csv\n", 76 | "import math\n", 77 | "import cmath\n", 78 | "import time\n", 79 | "\n", 80 | "inFilename = \"Input_Yo20op1084_ana.csv\"\n", 81 | "outFilename = \"Output_Yo20op1084_ana.csv\"\n", 82 | "\n", 83 | "Input = pd.read_csv(inFilename,header=None)\n", 84 | "Output = pd.read_csv(outFilename,header=None)\n", 85 | "\n", 86 | "print(Input.dtypes)\n", 87 | "print(Output.dtypes)\n", 88 | "\n", 89 | "inputs = []\n", 90 | "outputs = []\n", 91 | "\n", 92 | "inputs = np.array(Input)\n", 93 | "outputs = np.array(Output)\n", 94 | "print(inputs)\n", 95 | "print(outputs)" 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": null, 101 | "metadata": { 102 | "colab": { 103 | "base_uri": "https://localhost:8080/" 104 | }, 105 | "executionInfo": { 106 | "elapsed": 547, 107 | "status": "ok", 108 | "timestamp": 1648518194874, 109 | "user": { 110 | "displayName": "Ramsfield Lee", 111 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GiHPffa6hxu7mszm5ZYTfN7ZGsy0zsXFpnUK65SWA=s64", 112 | "userId": "05718072999768408940" 113 | }, 114 | "user_tz": 240 115 | }, 116 | "id": "8EBPxpFuo9xb", 117 | "outputId": "0d991ff1-3c78-48ad-eba9-b69f542cee3c", 118 | "scrolled": true, 119 | "tags": [] 120 | }, 121 | "outputs": [], 122 | "source": [ 123 | "# Randomize the order of the inputs, so they can be evenly distributed for training, testing, and validation\n", 124 | "\n", 125 | "num_inputs = len(inputs)\n", 126 | "print(\"Total Number of Dataset is:\",num_inputs)\n", 127 | "randomize = np.arange(num_inputs)\n", 128 | "print(randomize)\n", 129 | "\n", 130 | "inputs_origin = copy.deepcopy(inputs[randomize])\n", 131 | "outputs_origin = copy.deepcopy(outputs[randomize])\n", 132 | "print(inputs_origin)\n", 133 | "\n", 134 | "random.Random(5).shuffle(randomize)\n", 135 | "print(randomize)\n", 136 | "# Swap the consecutive indexes (0, 1, 2, etc) with the randomized indexes\n", 137 | "inputs_real = copy.deepcopy(inputs_origin[randomize])\n", 138 | "outputs_real = copy.deepcopy(outputs_origin[randomize])\n", 139 | "print(inputs_real)\n", 140 | "print(outputs_real)\n", 141 | "\n", 142 | "# Split the recordings (group of samples) into two sets: training and testing\n", 143 | "TRAIN_SPLIT = int(0.7 * num_inputs)\n", 144 | "inputs_train, inputs_test = np.split(inputs_real, [TRAIN_SPLIT])\n", 145 | "outputs_train, outputs_test = 
np.split(outputs_real, [TRAIN_SPLIT])\n", 146 | "\n", 147 | "num_inputs_train = len(inputs_train)\n", 148 | "print(\"Total Number of training Dataset is:\",num_inputs_train)\n", 149 | "print(\"Dataset randomization and separation complete!\")" 150 | ] 151 | }, 152 | { 153 | "cell_type": "code", 154 | "execution_count": null, 155 | "metadata": { 156 | "scrolled": true, 157 | "tags": [] 158 | }, 159 | "outputs": [], 160 | "source": [ 161 | "# build the model and train it\n", 162 | "initial_learning_rate = 0.020669824682365133\n", 163 | "lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(\n", 164 | " initial_learning_rate,\n", 165 | " decay_steps=375*65,\n", 166 | " decay_rate=0.575401007701697,\n", 167 | " staircase=True)\n", 168 | "\n", 169 | "Normlayer1=tf.keras.layers.Normalization()\n", 170 | "Normlayer1.adapt(inputs_real)\n", 171 | "\n", 172 | "model = tf.keras.Sequential()\n", 173 | "model.add(Normlayer1)\n", 174 | "model.add(tf.keras.layers.Dense(43, activation='sigmoid'))\n", 175 | "model.add(tf.keras.layers.Dense(56, activation='sigmoid'))\n", 176 | "model.add(tf.keras.layers.Dense(43, activation='sigmoid'))\n", 177 | "model.add(tf.keras.layers.Dense(8))\n", 178 | "\n", 179 | "#Train using Adam\n", 180 | "opt = tf.keras.optimizers.Adam(learning_rate=lr_schedule)\n", 181 | "model.compile(optimizer=opt, loss='mse', metrics=['mse'])\n", 182 | "\n", 183 | "t2_start = time.perf_counter()\n", 184 | "history = model.fit(inputs_train, outputs_train, epochs=600, batch_size=16, validation_data=(inputs_test, outputs_test))\n", 185 | "t2_stop = time.perf_counter()\n", 186 | "print(\"Elapsed time: \", t2_stop - t2_start)\n", 187 | "\n", 188 | "print(model.evaluate(inputs_test, outputs_test, verbose=0))" 189 | ] 190 | }, 191 | { 192 | "cell_type": "code", 193 | "execution_count": null, 194 | "metadata": { 195 | "colab": { 196 | "base_uri": "https://localhost:8080/", 197 | "height": 421 198 | }, 199 | "executionInfo": { 200 | "elapsed": 761, 201 | "status": "ok", 202 | "timestamp": 1648525798950, 203 | "user": { 204 | "displayName": "Ramsfield Lee", 205 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GiHPffa6hxu7mszm5ZYTfN7ZGsy0zsXFpnUK65SWA=s64", 206 | "userId": "05718072999768408940" 207 | }, 208 | "user_tz": 240 209 | }, 210 | "id": "qaBsCVo7pBIw", 211 | "outputId": "ce0c9fb6-19cc-49a6-81a5-b48a4f7f93a6", 212 | "tags": [] 213 | }, 214 | "outputs": [], 215 | "source": [ 216 | "# # increase the size of the graphs. 
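(Setting plt.rcParams[\"figure.figsize\"] changes the default size for every figure created afterwards.) 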
The default size is (6,4).\n", 217 | "plt.rcParams[\"figure.figsize\"] = (9,6)\n", 218 | "\n", 219 | "# summarize history for loss\n", 220 | "plt.plot(history.history['loss'], '-r.', label='Training Loss')\n", 221 | "plt.plot(history.history['val_loss'], '--b.', label='Validation Loss')\n", 222 | "plt.title('Training and Validation Loss')\n", 223 | "plt.xlabel('Epochs')\n", 224 | "plt.ylabel('Loss')\n", 225 | "plt.legend()\n", 226 | "plt.semilogy()\n", 227 | "plt.grid()\n", 228 | "plt.show()\n", 229 | "\n", 230 | "print(plt.rcParams[\"figure.figsize\"])\n", 231 | "#print(history.history)" 232 | ] 233 | }, 234 | { 235 | "cell_type": "code", 236 | "execution_count": null, 237 | "metadata": {}, 238 | "outputs": [], 239 | "source": [ 240 | "# Define the arrays to save (an assumed mapping, since these names are not set elsewhere in this notebook)\nY_input = inputs_test # test inputs\nY_pred = model.predict(inputs_test) # FNN predictions on the test inputs\nY_output = outputs_test # ground-truth test outputs\n\n# Save inputs index\n", 241 | "with open('Yo20op1084_ana_inputs.csv','w', newline=\"\") as output_file:\n", 242 | " writer = csv.writer(output_file)\n", 243 | " writer.writerows(Y_input)\n", 244 | "# Save prediction from FNN\n", 245 | "with open('Yo20op1084_ana_prediction.csv','w', newline=\"\") as output_file:\n", 246 | " writer = csv.writer(output_file)\n", 247 | " writer.writerows(Y_pred)\n", 248 | "# Save original data\n", 249 | "with open('Yo20op1084_ana_outputs.csv','w', newline=\"\") as output_file:\n", 250 | " writer = csv.writer(output_file)\n", 251 | " writer.writerows(Y_output)" 252 | ] 253 | }, 254 | { 255 | "cell_type": "code", 256 | "execution_count": null, 257 | "metadata": { 258 | "colab": { 259 | "base_uri": "https://localhost:8080/" 260 | }, 261 | "executionInfo": { 262 | "elapsed": 473, 263 | "status": "ok", 264 | "timestamp": 1648264764413, 265 | "user": { 266 | "displayName": "Ramsfield Lee", 267 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14GiHPffa6hxu7mszm5ZYTfN7ZGsy0zsXFpnUK65SWA=s64", 268 | "userId": "05718072999768408940" 269 | }, 270 | "user_tz": 240 271 | }, 272 | "id": "DblOFeB3pDsb", 273 | "outputId": "28f2f4b8-0442-47d7-daa6-4d0132d1e4a1", 274 | "scrolled": true, 275 | "tags": [] 276 | }, 277 | "outputs": [], 278 | "source": [ 279 | "# Use the model to predict test data\n", 280 | "output_pred = model.predict(inputs_test)\n", 281 | "#print(output_pred)\n", 282 | "output_diff = output_pred - outputs_test\n", 283 | "MSE = np.square(np.subtract(output_pred,outputs_test)).mean()\n", 284 | "print(output_diff)\n", 285 | "print(MSE)" 286 | ] 287 | } 288 | ], 289 | "metadata": { 290 | "colab": { 291 | "collapsed_sections": [], 292 | "name": "YdqFNN.ipynb", 293 | "provenance": [] 294 | }, 295 | "kernelspec": { 296 | "display_name": "Python 3 (ipykernel)", 297 | "language": "python", 298 | "name": "python3" 299 | }, 300 | "language_info": { 301 | "codemirror_mode": { 302 | "name": "ipython", 303 | "version": 3 304 | }, 305 | "file_extension": ".py", 306 | "mimetype": "text/x-python", 307 | "name": "python", 308 | "nbconvert_exporter": "python", 309 | "pygments_lexer": "ipython3", 310 | "version": "3.9.13" 311 | } 312 | }, 313 | "nbformat": 4, 314 | "nbformat_minor": 4 315 | } 316 | -------------------------------------------------------------------------------- /transfer learning/Yo_transfer learning_cycle.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "scrolled": true, 8 | "tags": [] 9 | }, 10 | "outputs": [], 11 | "source": [ 12 | "# ==================================================================================================\n", 13 | "# Yufei Li\n", 14 | "# Princeton 
University\n", 15 | "# yl5385@princeton.edu\n", 16 | "\n", 17 | "# February 2023\n", 18 | "\n", 19 | "# Note: In this demo, the neural network is synthesized using the TensorFlow (version: 2.11.0) framework. \n", 20 | "# Please install TensorFlow according to the official guidance, then import TensorFlow and other dependent modules.\n", 21 | "# ==================================================================================================\n", 22 | "\n", 23 | "# Setup environment\n", 24 | "!pip install pandas numpy matplotlib\n", 25 | "!pip install tensorflow" 26 | ] 27 | }, 28 | { 29 | "cell_type": "code", 30 | "execution_count": null, 31 | "metadata": {}, 32 | "outputs": [], 33 | "source": [ 34 | "import tensorflow as tf\n", 35 | "\n", 36 | "import matplotlib.pyplot as plt\n", 37 | "import numpy as np\n", 38 | "import pandas as pd\n", 39 | "import random\n", 40 | "import copy\n", 41 | "import csv\n", 42 | "import math\n", 43 | "import cmath\n", 44 | "import time\n", 45 | "import random as python_random\n", 46 | "\n", 47 | "# reproducible results\n", 48 | "def reset_seeds():\n", 49 | " np.random.seed(7) \n", 50 | " python_random.seed(7)\n", 51 | " tf.random.set_seed(7) \n", 52 | "\n", 53 | "inFilename = \"Input_Yo20op1084_ana_bw20_50.csv\"\n", 54 | "outFilename = \"Output_Yo20op1084_ana_bw20_50.csv\"\n", 55 | "input = pd.read_csv(inFilename,header=None)\n", 56 | "output = pd.read_csv(outFilename,header=None)\n", 57 | "inputRawTrain = []\n", 58 | "outputRawTrain = []\n", 59 | "inputRawTrain = np.array(input)\n", 60 | "outputRawTrain = np.array(output)\n", 61 | "\n", 62 | "inFilename = \"Input_Yo20op40_ana_50.csv\"\n", 63 | "outFilename = \"Output_Yo20op40_ana_50.csv\"\n", 64 | "input = pd.read_csv(inFilename,header=None)\n", 65 | "output = pd.read_csv(outFilename,header=None)\n", 66 | "inputTrain_1 = []\n", 67 | "outputTrain_1 = []\n", 68 | "inputTrain_1 = np.array(input)\n", 69 | "outputTrain_1 = np.array(output)\n", 70 | "\n", 71 | "num_inputs_train = len(inputTrain_1)\n", 72 | "print(\"Total Number of Training Dataset is:\",num_inputs_train)\n", 73 | "\n", 74 | "# Shuffle the dataset\n", 75 | "np.random.seed(1)\n", 76 | "np.random.shuffle(inputTrain_1)\n", 77 | "np.random.seed(1)\n", 78 | "np.random.shuffle(outputTrain_1)\n", 79 | "\n", 80 | "# Data separation\n", 81 | "TRAIN_SPLIT_1 = int(0.7 * num_inputs_train)\n", 82 | "TRAIN_SPLIT_2 = int(0.85 * num_inputs_train)\n", 83 | "inputTrain, inputTest, inputs_test_2 = np.split(inputTrain_1, [TRAIN_SPLIT_1, TRAIN_SPLIT_2])\n", 84 | "outputTrain, outputTest, outputs_test_2 = np.split(outputTrain_1, [TRAIN_SPLIT_1, TRAIN_SPLIT_2])\n", 85 | "\n", 86 | "print(\"Dataset preparation complete!\")" 87 | ] 88 | }, 89 | { 90 | "cell_type": "code", 91 | "execution_count": null, 92 | "metadata": { 93 | "tags": [] 94 | }, 95 | "outputs": [], 96 | "source": [ 97 | "CYCLE = 10\n", 98 | "# Randomize the order of the inputs, so they can be evenly distributed for training, testing, and validation\n", 99 | "\n", 100 | "num_inputRawTrain = len(inputRawTrain)\n", 101 | "num_inputTrain = len(inputTrain)\n", 102 | "num_inputTest = len(inputTest)\n", 103 | "inputs_test = inputTest\n", 104 | "outputs_test = outputTest\n", 105 | "\n", 106 | "print(\"Total Number of Raw Training Dataset is:\",num_inputRawTrain)\n", 107 | "print(\"Total Number of Real Training Dataset is:\",num_inputTrain)\n", 108 | "print(\"Total Number of Test Dataset is:\",num_inputTest)\n", 109 | "\n", 110 | "\n", 111 | "inputs_train_raw = []\n", 112 | "outputs_train_raw = []\n", 113 | 
"inputs_train_real = []\n", 114 | "outputs_train_real = []\n", 115 | "\n", 116 | "\n", 117 | "for x in range(CYCLE):\n", 118 | " tempInput_raw = copy.deepcopy(inputRawTrain)\n", 119 | " tempOutput_raw = copy.deepcopy(outputRawTrain)\n", 120 | " tempInput_real = copy.deepcopy(inputTrain)\n", 121 | " tempOutput_real = copy.deepcopy(outputTrain)\n", 122 | "\n", 123 | " np.random.seed(x+1)\n", 124 | " np.random.shuffle(tempInput_raw)\n", 125 | " np.random.seed(x+1)\n", 126 | " np.random.shuffle(tempOutput_raw)\n", 127 | " inputs_train_raw.append(tempInput_raw)\n", 128 | " outputs_train_raw.append(tempOutput_raw)\n", 129 | " \n", 130 | " np.random.seed(x+1)\n", 131 | " np.random.shuffle(tempInput_real)\n", 132 | " np.random.seed(x+1)\n", 133 | " np.random.shuffle(tempOutput_real)\n", 134 | " inputs_train_real.append(tempInput_real)\n", 135 | " outputs_train_real.append(tempOutput_real)\n", 136 | "\n", 137 | "print(\"Dataset randomization and separation complete!\")" 138 | ] 139 | }, 140 | { 141 | "cell_type": "code", 142 | "execution_count": null, 143 | "metadata": {}, 144 | "outputs": [], 145 | "source": [ 146 | "# Set learning rate\n", 147 | "initial_learning_rate = 0.020669824682365133\n", 148 | "lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(\n", 149 | " initial_learning_rate,\n", 150 | " decay_steps=375*65,\n", 151 | " decay_rate=0.575401007701697,\n", 152 | " staircase=True)\n", 153 | "\n", 154 | "# Normalization layer definition\n", 155 | "Normlayer1=tf.keras.layers.Normalization()\n", 156 | "Normlayer1.adapt(inputRawTrain)\n", 157 | "\n", 158 | "# build and then train the model\n", 159 | "def train_model(model,inputs_train,outputs_train,num_epoch):\n", 160 | " model.add(Normlayer1)\n", 161 | " model.add(tf.keras.layers.Dense(43, activation='sigmoid'))\n", 162 | " model.add(tf.keras.layers.Dense(56, activation='sigmoid'))\n", 163 | " model.add(tf.keras.layers.Dense(43, activation='sigmoid'))\n", 164 | " model.add(tf.keras.layers.Dense(8))\n", 165 | " opt = tf.keras.optimizers.Adam(learning_rate=lr_schedule)\n", 166 | " model.compile(optimizer=opt, loss='mse', metrics=['mse'])\n", 167 | "\n", 168 | " model_checkpoint_callback = tf.keras.callbacks.ModelCheckpoint(\n", 169 | " filepath=checkpoint_filepath,\n", 170 | "# save_weights_only=True,\n", 171 | " verbose = 1,\n", 172 | " monitor='val_mse',\n", 173 | " mode='auto',\n", 174 | " save_best_only=True)\n", 175 | "\n", 176 | " history = model.fit(inputs_train, outputs_train, epochs=num_epoch, batch_size=16, validation_data=(inputs_test, outputs_test), callbacks=[model_checkpoint_callback])\n", 177 | " return history\n", 178 | "\n", 179 | "# Use the model to predict test data\n", 180 | "def predict_with_FNN(model,inputs_test,outputs_test):\n", 181 | " output_pred = model.predict(inputs_test)\n", 182 | " output_diff = output_pred - outputs_test\n", 183 | " AE = output_diff\n", 184 | " MAE = np.mean(abs(output_diff))\n", 185 | " MAE_1 = np.mean(abs(output_pred[:,[0,2,4,6]] - outputs_test[:,[0,2,4,6]]))\n", 186 | " MAE_2 = np.mean(abs(output_pred[:,[1,3,5,7]] - outputs_test[:,[1,3,5,7]]))\n", 187 | " MSE = np.square(np.subtract(output_pred,outputs_test)).mean()\n", 188 | " MSE_1 = np.square(np.subtract(output_pred[:,[0,2,4,6]],outputs_test[:,[0,2,4,6]])).mean()\n", 189 | " MSE_2 = np.square(np.subtract(output_pred[:,[1,3,5,7]],outputs_test[:,[1,3,5,7]])).mean()\n", 190 | " \n", 191 | " pct_95th_error = np.percentile(abs(output_diff),95)\n", 192 | " std_error = np.std(output_diff)\n", 193 | " print(\"Mean Absolute Error(MAE): 
%.9f\" % MAE)\n", 194 | " print(\"MAE conductance: %.9f\" % MAE_1)\n", 195 | " print(\"MAE susceptance: %.9f\" % MAE_2)\n", 196 | " print(\"95th Error: %.9f\" % pct_95th_error)\n", 197 | " print(\"Standard Deviation: %.9f\" % std_error)\n", 198 | " print(\"MSE: %.9f\" % MSE)\n", 199 | " print(\"MSE conductance: %.9f\" % MSE_1)\n", 200 | " print(\"MSE susceptance: %.9f\" % MSE_2)\n", 201 | " print(\"Absolute Error(first element, conductance):\", np.around(AE[[0],[0,2,4,6]],9))\n", 202 | "\n", 203 | " return MAE,MAE_1,MAE_2,pct_95th_error,std_error,MSE,MSE_1,MSE_2,AE\n", 204 | "\n", 205 | "print(\"Training and prediction function definition complete!\")" 206 | ] 207 | }, 208 | { 209 | "cell_type": "code", 210 | "execution_count": null, 211 | "metadata": { 212 | "scrolled": true, 213 | "tags": [] 214 | }, 215 | "outputs": [], 216 | "source": [ 217 | "EPOCHS = 500\n", 218 | "\n", 219 | "# initialization/getting reproducible results\n", 220 | "reset_seeds()\n", 221 | "\n", 222 | "# Direct Train\n", 223 | "mae_direct = []\n", 224 | "mae_direct_iteration = []\n", 225 | "mae_1_direct = []\n", 226 | "mae_1_direct_iteration = []\n", 227 | "mae_2_direct = []\n", 228 | "mae_2_direct_iteration = []\n", 229 | "pct95th_direct = []\n", 230 | "pct95th_direct_iteration = []\n", 231 | "std_direct = []\n", 232 | "std_direct_iteration = []\n", 233 | "mse_direct = []\n", 234 | "mse_direct_iteration = []\n", 235 | "mse_1_direct = []\n", 236 | "mse_1_direct_iteration = []\n", 237 | "mse_2_direct = []\n", 238 | "mse_2_direct_iteration = []\n", 239 | "ae_direct = []\n", 240 | "ae_direct_iteration = []\n", 241 | "\n", 242 | "num_data_array = [5,10,30,50,100,150,200,300,400,550]\n", 243 | "\n", 244 | "for x in range(len(num_data_array)):\n", 245 | " # Set learning rate\n", 246 | " initial_learning_rate = 0.020669824682365133\n", 247 | " lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(\n", 248 | " initial_learning_rate,\n", 249 | " decay_steps=math.ceil(num_data_array[x]/16)*65,\n", 250 | " decay_rate=0.575401007701697,\n", 251 | " staircase=True)\n", 252 | " \n", 253 | " mae_direct_iteration.append(num_data_array[x])\n", 254 | " mae_1_direct_iteration.append(num_data_array[x])\n", 255 | " mae_2_direct_iteration.append(num_data_array[x])\n", 256 | " pct95th_direct_iteration.append(num_data_array[x])\n", 257 | " std_direct_iteration.append(num_data_array[x])\n", 258 | " mse_direct_iteration.append(num_data_array[x])\n", 259 | " mse_1_direct_iteration.append(num_data_array[x])\n", 260 | " mse_2_direct_iteration.append(num_data_array[x])\n", 261 | " ae_direct_iteration.append(num_data_array[x])\n", 262 | " for num_train_iteration in range(10):\n", 263 | " inputs_train_real_cut = inputs_train_real[num_train_iteration][0:0+num_data_array[x]]\n", 264 | " outputs_train_real_cut = outputs_train_real[num_train_iteration][0:0+num_data_array[x]]\n", 265 | "# inputs_train_real_cut = inputs_train_real[num_train_iteration]\n", 266 | "# outputs_train_real_cut = outputs_train_real[num_train_iteration]\n", 267 | " \n", 268 | " model_direct = tf.keras.Sequential()\n", 269 | " checkpoint_filepath = './checkpoint/direct/direct_weights.hdf5' #\n", 270 | " history = train_model(model_direct,inputs_train_real_cut,outputs_train_real_cut,EPOCHS)\n", 271 | " model_direct.load_weights(checkpoint_filepath) # load the best model\n", 272 | " predict_results = predict_with_FNN(model_direct,inputs_test,outputs_test)\n", 273 | " print(\"Number of Data Trained: %.1f\" % num_data_array[x])\n", 274 | " print(\"Iteration Times: %.1f\" % 
num_train_iteration)\n", 275 | " \n", 276 | " mae_direct_iteration.append(predict_results[0])\n", 277 | " mae_1_direct_iteration.append(predict_results[1])\n", 278 | " mae_2_direct_iteration.append(predict_results[2])\n", 279 | " pct95th_direct_iteration.append(predict_results[3])\n", 280 | " std_direct_iteration.append(predict_results[4])\n", 281 | " mse_direct_iteration.append(predict_results[5])\n", 282 | " mse_1_direct_iteration.append(predict_results[6])\n", 283 | " mse_2_direct_iteration.append(predict_results[7])\n", 284 | " for i in range(num_inputTest):\n", 285 | " ae_direct_iteration.extend(predict_results[8][i])\n", 286 | " print()\n", 287 | " \n", 288 | " mae_direct.append(mae_direct_iteration)\n", 289 | " mae_direct_iteration = []\n", 290 | " mae_1_direct.append(mae_1_direct_iteration)\n", 291 | " mae_1_direct_iteration = []\n", 292 | " mae_2_direct.append(mae_2_direct_iteration)\n", 293 | " mae_2_direct_iteration = []\n", 294 | " pct95th_direct.append(pct95th_direct_iteration)\n", 295 | " pct95th_direct_iteration = []\n", 296 | " std_direct.append(std_direct_iteration)\n", 297 | " std_direct_iteration = []\n", 298 | " mse_direct.append(mse_direct_iteration)\n", 299 | " mse_direct_iteration = []\n", 300 | " mse_1_direct.append(mse_1_direct_iteration)\n", 301 | " mse_1_direct_iteration = []\n", 302 | " mse_2_direct.append(mse_2_direct_iteration)\n", 303 | " mse_2_direct_iteration = []\n", 304 | " ae_direct.append(ae_direct_iteration)\n", 305 | " ae_direct_iteration = []" 306 | ] 307 | }, 308 | { 309 | "cell_type": "code", 310 | "execution_count": null, 311 | "metadata": {}, 312 | "outputs": [], 313 | "source": [ 314 | "# increase the size of the graphs. The default size is (6,4).\n", 315 | "plt.rcParams[\"figure.figsize\"] = (9,6)\n", 316 | "\n", 317 | "# list all data in history\n", 318 | "print(history.history.keys())\n", 319 | "\n", 320 | "# summarize history for loss\n", 321 | "plt.plot(history.history['loss'], '-r.', label='Training Loss')\n", 322 | "plt.plot(history.history['val_loss'], '--b.', label='Validation Loss')\n", 323 | "plt.title('Training and Validation Loss')\n", 324 | "plt.xlabel('Epochs')\n", 325 | "plt.ylabel('Loss')\n", 326 | "plt.legend()\n", 327 | "plt.semilogy()\n", 328 | "plt.grid()\n", 329 | "plt.show()\n", 330 | "\n", 331 | "print(plt.rcParams[\"figure.figsize\"])\n", 332 | "#print(history.history)" 333 | ] 334 | }, 335 | { 336 | "cell_type": "code", 337 | "execution_count": null, 338 | "metadata": { 339 | "scrolled": true 340 | }, 341 | "outputs": [], 342 | "source": [ 343 | "EPOCHS_1 = 500\n", 344 | "EPOCHS_2 = 500\n", 345 | "\n", 346 | "# initialization/getting reproducible results\n", 347 | "reset_seeds()\n", 348 | "\n", 349 | "# Pre-train\n", 350 | "initial_learning_rate = 0.020669824682365133\n", 351 | "lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(\n", 352 | " initial_learning_rate,\n", 353 | " decay_steps=math.ceil(num_inputRawTrain/16)*65,\n", 354 | " decay_rate=0.575401007701697,\n", 355 | " staircase=True)\n", 356 | "model_pre = tf.keras.Sequential()\n", 357 | "checkpoint_filepath = './checkpoint/pre_train/pre_train_weights.hdf5' \n", 358 | "history_pre = train_model(model_pre,inputs_train_raw[0],outputs_train_raw[0],EPOCHS_1)\n", 359 | "model_pre.load_weights(checkpoint_filepath) # load the best model\n", 360 | "predict_with_FNN(model_pre,inputs_test,outputs_test)\n", 361 | "print()\n", 362 | "\n", 363 | "# Train with experimental data\n", 364 | "mae_pretrain = []\n", 365 | "mae_pretrain_iteration = []\n", 366 | 
"mae_1_pretrain = []\n", 367 | "mae_1_pretrain_iteration = []\n", 368 | "mae_2_pretrain = []\n", 369 | "mae_2_pretrain_iteration = []\n", 370 | "pct95th_pretrain = []\n", 371 | "pct95th_pretrain_iteration = []\n", 372 | "std_pretrain = []\n", 373 | "std_pretrain_iteration = []\n", 374 | "mse_pretrain = []\n", 375 | "mse_pretrain_iteration = []\n", 376 | "mse_1_pretrain = []\n", 377 | "mse_1_pretrain_iteration = []\n", 378 | "mse_2_pretrain = []\n", 379 | "mse_2_pretrain_iteration = []\n", 380 | "ae_pretrain = []\n", 381 | "ae_pretrain_iteration = []\n", 382 | "\n", 383 | "num_data_array = [5,10,30,50,100,150,200,300,400,550]\n", 384 | "\n", 385 | "for x in range(len(num_data_array)):\n", 386 | " # Set learning rate\n", 387 | " initial_learning_rate = 0.020669824682365133\n", 388 | " lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(\n", 389 | " initial_learning_rate,\n", 390 | " decay_steps=math.ceil(num_data_array[x]/16)*65,\n", 391 | " decay_rate=0.575401007701697,\n", 392 | " staircase=True)\n", 393 | " \n", 394 | " mae_pretrain_iteration.append(num_data_array[x])\n", 395 | " mae_1_pretrain_iteration.append(num_data_array[x])\n", 396 | " mae_2_pretrain_iteration.append(num_data_array[x])\n", 397 | " pct95th_pretrain_iteration.append(num_data_array[x])\n", 398 | " std_pretrain_iteration.append(num_data_array[x])\n", 399 | " mse_pretrain_iteration.append(num_data_array[x])\n", 400 | " mse_1_pretrain_iteration.append(num_data_array[x])\n", 401 | " mse_2_pretrain_iteration.append(num_data_array[x])\n", 402 | " ae_pretrain_iteration.append(num_data_array[x])\n", 403 | " for num_train_iteration in range(10):\n", 404 | " # load pre-trained model\n", 405 | " pre_train_model_path = './checkpoint/pre_train/pre_train_weights.hdf5' \n", 406 | " model_pre.load_weights(pre_train_model_path) # load the best model\n", 407 | " \n", 408 | " # prepare experimental data\n", 409 | " inputs_train_real_cut = inputs_train_real[num_train_iteration][0:0+num_data_array[x]]\n", 410 | " outputs_train_real_cut = outputs_train_real[num_train_iteration][0:0+num_data_array[x]]\n", 411 | "\n", 412 | " opt = tf.keras.optimizers.Adam(learning_rate=lr_schedule)\n", 413 | "\n", 414 | " model_pre.compile(optimizer=opt, loss='mse', metrics=['mse']) \n", 415 | " checkpoint_filepath = './checkpoint/pre_train/final_weights.hdf5' \n", 416 | " model_checkpoint_callback = tf.keras.callbacks.ModelCheckpoint(\n", 417 | " filepath=checkpoint_filepath,\n", 418 | " verbose = 1,\n", 419 | " monitor='val_mse',\n", 420 | " mode='auto',\n", 421 | " save_best_only=True)\n", 422 | " \n", 423 | " history = model_pre.fit(inputs_train_real_cut, outputs_train_real_cut, epochs=EPOCHS_2, batch_size=16, validation_data=(inputs_test, outputs_test), callbacks=[model_checkpoint_callback])\n", 424 | " model_pre.load_weights(checkpoint_filepath) # load the best model\n", 425 | " predict_results = predict_with_FNN(model_pre,inputs_test,outputs_test)\n", 426 | " print(\"Number of Data Trained: %.1f\" % num_data_array[x])\n", 427 | " print(\"Iteration Times: %.1f\" % num_train_iteration)\n", 428 | " \n", 429 | " mae_pretrain_iteration.append(predict_results[0])\n", 430 | " mae_1_pretrain_iteration.append(predict_results[1])\n", 431 | " mae_2_pretrain_iteration.append(predict_results[2])\n", 432 | " pct95th_pretrain_iteration.append(predict_results[3])\n", 433 | " std_pretrain_iteration.append(predict_results[4])\n", 434 | " mse_pretrain_iteration.append(predict_results[5])\n", 435 | " mse_1_pretrain_iteration.append(predict_results[6])\n", 436 | 
" mse_2_pretrain_iteration.append(predict_results[7])\n", 437 | " for i in range(num_inputTest):\n", 438 | " ae_pretrain_iteration.extend(predict_results[8][i])\n", 439 | " print()\n", 440 | " mae_pretrain.append(mae_pretrain_iteration)\n", 441 | " mae_pretrain_iteration = []\n", 442 | " mae_1_pretrain.append(mae_1_pretrain_iteration)\n", 443 | " mae_1_pretrain_iteration = []\n", 444 | " mae_2_pretrain.append(mae_2_pretrain_iteration)\n", 445 | " mae_2_pretrain_iteration = []\n", 446 | " pct95th_pretrain.append(pct95th_pretrain_iteration)\n", 447 | " pct95th_pretrain_iteration = []\n", 448 | " std_pretrain.append(std_pretrain_iteration)\n", 449 | " std_pretrain_iteration = []\n", 450 | " mse_pretrain.append(mse_pretrain_iteration)\n", 451 | " mse_pretrain_iteration = []\n", 452 | " mse_1_pretrain.append(mse_1_pretrain_iteration)\n", 453 | " mse_1_pretrain_iteration = []\n", 454 | " mse_2_pretrain.append(mse_2_pretrain_iteration)\n", 455 | " mse_2_pretrain_iteration = []\n", 456 | " ae_pretrain.append(ae_pretrain_iteration)\n", 457 | " ae_pretrain_iteration = []" 458 | ] 459 | }, 460 | { 461 | "cell_type": "code", 462 | "execution_count": null, 463 | "metadata": {}, 464 | "outputs": [], 465 | "source": [ 466 | "# increase the size of the graphs. The default size is (6,4).\n", 467 | "plt.rcParams[\"figure.figsize\"] = (9,6)\n", 468 | "\n", 469 | "# list all data in history\n", 470 | "print(history.history.keys())\n", 471 | "\n", 472 | "# summarize history for loss\n", 473 | "plt.plot(history_pre.history['loss'], '-r.', label='Training Loss')\n", 474 | "plt.plot(history_pre.history['val_loss'], '--b.', label='Validation Loss')\n", 475 | "plt.title('Training and Validation Loss')\n", 476 | "plt.xlabel('Epochs')\n", 477 | "plt.ylabel('Loss')\n", 478 | "plt.legend()\n", 479 | "plt.semilogy()\n", 480 | "plt.grid()\n", 481 | "plt.show()\n", 482 | "\n", 483 | "print(plt.rcParams[\"figure.figsize\"])\n", 484 | "#print(history.history)" 485 | ] 486 | }, 487 | { 488 | "cell_type": "code", 489 | "execution_count": null, 490 | "metadata": {}, 491 | "outputs": [], 492 | "source": [ 493 | "# increase the size of the graphs. 
The default size is (6,4).\n", 494 | "plt.rcParams[\"figure.figsize\"] = (9,6)\n", 495 | "\n", 496 | "# list all data in history\n", 497 | "print(history.history.keys())\n", 498 | "\n", 499 | "# summarize history for loss\n", 500 | "plt.plot(history.history['loss'], '-r.', label='Training Loss')\n", 501 | "plt.plot(history.history['val_loss'], '--b.', label='Validation Loss')\n", 502 | "plt.title('Training and Validation Loss')\n", 503 | "plt.xlabel('Epochs')\n", 504 | "plt.ylabel('Loss')\n", 505 | "plt.legend()\n", 506 | "plt.semilogy()\n", 507 | "plt.grid()\n", 508 | "plt.show()\n", 509 | "\n", 510 | "print(plt.rcParams[\"figure.figsize\"])\n", 511 | "#print(history.history)" 512 | ] 513 | }, 514 | { 515 | "cell_type": "code", 516 | "execution_count": null, 517 | "metadata": {}, 518 | "outputs": [], 519 | "source": [ 520 | "# # Create new directory\n", 521 | "# import os\n", 522 | "# os.mkdir('results')\n", 523 | "import os\n", 524 | "if not os.path.exists('results'):\n", 525 | " os.makedirs('results')\n", 526 | "# Save Mean Squared Error of pretrained results\n", 527 | "with open('./results/mse_results_pretrain.csv','w', newline=\"\") as output_file:\n", 528 | " writer = csv.writer(output_file)\n", 529 | " writer.writerows(mse_pretrain)\n", 530 | "\n", 531 | "# Save Mean Squared Error of non-pretrained results\n", 532 | "with open('./results/mse_results_nopre.csv','w', newline=\"\") as output_file:\n", 533 | " writer = csv.writer(output_file)\n", 534 | " writer.writerows(mse_direct)\n", 535 | " \n", 536 | "# Save Mean Squared Error (conductance) of pretrained results\n", 537 | "with open('./results/mse_1_results_pretrain.csv','w', newline=\"\") as output_file:\n", 538 | " writer = csv.writer(output_file)\n", 539 | " writer.writerows(mse_1_pretrain)\n", 540 | "\n", 541 | "# Save Mean Squared Error (conductance) of non-pretrained results\n", 542 | "with open('./results/mse_1_results_nopre.csv','w', newline=\"\") as output_file:\n", 543 | " writer = csv.writer(output_file)\n", 544 | " writer.writerows(mse_1_direct)\n", 545 | "\n", 546 | "# Save Mean Squared Error (susceptance) of pretrained results\n", 547 | "with open('./results/mse_2_results_pretrain.csv','w', newline=\"\") as output_file:\n", 548 | " writer = csv.writer(output_file)\n", 549 | " writer.writerows(mse_2_pretrain)\n", 550 | "\n", 551 | "# Save Mean Squared Error (susceptance) of non-pretrained results\n", 552 | "with open('./results/mse_2_results_nopre.csv','w', newline=\"\") as output_file:\n", 553 | " writer = csv.writer(output_file)\n", 554 | " writer.writerows(mse_2_direct) \n", 555 | " \n", 556 | "# Save Mean Absolute Error of pretrained results\n", 557 | "with open('./results/mae_results_pretrain.csv','w', newline=\"\") as output_file:\n", 558 | " writer = csv.writer(output_file)\n", 559 | " writer.writerows(mae_pretrain)\n", 560 | "\n", 561 | "# Save Mean Absolute Error of non-pretrained results\n", 562 | "with open('./results/mae_results_nopre.csv','w', newline=\"\") as output_file:\n", 563 | " writer = csv.writer(output_file)\n", 564 | " writer.writerows(mae_direct)\n", 565 | " \n", 566 | "# Save Mean Absolute Error (conductance) of pretrained results\n", 567 | "with open('./results/mae_1_results_pretrain.csv','w', newline=\"\") as output_file:\n", 568 | " writer = csv.writer(output_file)\n", 569 | " writer.writerows(mae_1_pretrain)\n", 570 | "\n", 571 | "# Save Mean Absolute Error (conductance) of non-pretrained results\n", 572 | "with open('./results/mae_1_results_nopre.csv','w', newline=\"\") as output_file:\n", 573 | " writer = csv.writer(output_file)\n", 574 | 
writer.writerows(mae_1_direct)\n", 575 | " \n", 576 | "# Save Mean Absolute Error (susceptance) of pretrained results\n", 577 | "with open('./results/mae_2_results_pretrain.csv','w', newline=\"\") as output_file:\n", 578 | " writer = csv.writer(output_file)\n", 579 | " writer.writerows(mae_2_pretrain)\n", 580 | "\n", 581 | "# Save Mean Absolute Error (susceptance) of non-pretrained results\n", 582 | "with open('./results/mae_2_results_nopre.csv','w', newline=\"\") as output_file:\n", 583 | " writer = csv.writer(output_file)\n", 584 | " writer.writerows(mae_2_direct)\n", 585 | " \n", 586 | "# Save Absolute Error of pretrained results\n", 587 | "with open('./results/ae_results_pretrain.csv','w', newline=\"\") as output_file:\n", 588 | " writer = csv.writer(output_file)\n", 589 | " writer.writerows(ae_pretrain)\n", 590 | "\n", 591 | "# Save Absolute Error of non-pretrained results\n", 592 | "with open('./results/ae_results_nopre.csv','w', newline=\"\") as output_file:\n", 593 | " writer = csv.writer(output_file)\n", 594 | " writer.writerows(ae_direct)\n", 595 | " \n", 596 | "# Save 95th Percentile Error of pretrained results\n", 597 | "with open('./results/pct95th_results_pretrain.csv','w', newline=\"\") as output_file:\n", 598 | " writer = csv.writer(output_file)\n", 599 | " writer.writerows(pct95th_pretrain)\n", 600 | "\n", 601 | "# Save 95th Percentile Error of non-pretrained results\n", 602 | "with open('./results/pct95th_results_nopre.csv','w', newline=\"\") as output_file:\n", 603 | " writer = csv.writer(output_file)\n", 604 | " writer.writerows(pct95th_direct)\n", 605 | " \n", 606 | "# Save Standard Deviation of pretrained results\n", 607 | "with open('./results/std_results_pretrain.csv','w', newline=\"\") as output_file:\n", 608 | " writer = csv.writer(output_file)\n", 609 | " writer.writerows(std_pretrain)\n", 610 | "\n", 611 | "# Save Standard Deviation of non-pretrained results\n", 612 | "with open('./results/std_results_nopre.csv','w', newline=\"\") as output_file:\n", 613 | " writer = csv.writer(output_file)\n", 614 | " writer.writerows(std_direct)\n", 615 | "\n", 616 | "print(\"File generation complete!\") " 617 | ] 618 | } 619 | ], 620 | "metadata": { 621 | "colab": { 622 | "collapsed_sections": [], 623 | "name": "preprocessing.ipynb", 624 | "provenance": [ 625 | { 626 | "file_id": "1kNYLYMj2vowrtHEcjI6ebAbwCWdHIR4_", 627 | "timestamp": 1601231066833 628 | } 629 | ] 630 | }, 631 | "kernelspec": { 632 | "display_name": "Python 3 (ipykernel)", 633 | "language": "python", 634 | "name": "python3" 635 | }, 636 | "language_info": { 637 | "codemirror_mode": { 638 | "name": "ipython", 639 | "version": 3 640 | }, 641 | "file_extension": ".py", 642 | "mimetype": "text/x-python", 643 | "name": "python", 644 | "nbconvert_exporter": "python", 645 | "pygments_lexer": "ipython3", 646 | "version": "3.9.13" 647 | }, 648 | "toc-autonumbering": true 649 | }, 650 | "nbformat": 4, 651 | "nbformat_minor": 4 652 | } 653 | --------------------------------------------------------------------------------