├── .gitignore
├── Lectures
│   ├── 00- Intreview_Questions
│   │   └── Intreview.py
│   ├── 01-CheatSheet
│   │   ├── 01-ML Cheat Sheet
│   │   │   ├── ML Cheat sheet.pdf
│   │   │   └── Scikit_Learn_Cheat_Sheet_Python.pdf
│   │   ├── 02-Math Cheat Sheet
│   │   │   ├── Essentail Math.pdf
│   │   │   ├── machine-learning-cheat-sheet.pdf
│   │   │   └── probability_cheatsheet.pdf
│   │   └── 03-Python Cheat Sheet
│   │       ├── Base Python.pdf
│   │       ├── Bokeh.pdf
│   │       ├── Classes.pdf
│   │       ├── Code Debug.pdf
│   │       ├── Data Structure 1.pdf
│   │       ├── Data Structure 2.pdf
│   │       ├── Data Structure 3.pdf
│   │       ├── Dictionaries.pdf
│   │       ├── Django.pdf
│   │       ├── Functions.pdf
│   │       ├── If While.pdf
│   │       ├── Jupyter Notebook.pdf
│   │       ├── List.pdf
│   │       ├── Matplotlib.pdf
│   │       ├── Matplotlib2.pdf
│   │       ├── Numpy.pdf
│   │       ├── Open-WriteFiles .pdf
│   │       ├── Pands1.pdf
│   │       ├── Pands2.pdf
│   │       ├── Python Cheat Sheet.pdf
│   │       └── Scipy-Linalgebra.pdf
│   ├── 02-Math Review Codes
│   │   ├── 01-Linear-Algebra
│   │   │   ├── Vectors.py
│   │   │   ├── eigen_vector.py
│   │   │   └── sympy_code.py
│   │   ├── 02-Statistics
│   │   │   └── Generate_Random_number.py
│   │   └── Linear_affine.md
│   ├── 03-Perceptron
│   │   ├── Main_loop.py
│   │   ├── Simple_Perceptron.py
│   │   ├── perceptron.py
│   │   └── plot_decision_regions.py
│   ├── 04-Adaline
│   │   ├── AdalineGD.py
│   │   ├── AdalineSGD.py
│   │   ├── Main_AdalineGD.py
│   │   ├── Main_AdalineSGD.py
│   │   └── plot_decision_regions.py
│   ├── 05-Sklearn NN
│   │   ├── Exercise.py
│   │   ├── Sample_NN_Sklearn_Classifier.py
│   │   └── Sample_NN_Sklearn_Regressor..py
│   └── Clustering_SOM
│       └── SOM
│           ├── Readme.md
│           └── Tabular
│               └── Iris
│                   ├── notebook
│                   │   ├── 1-iris_training.ipynb
│                   │   ├── 2-iris_plot.ipynb
│                   │   ├── 3-iris_post_training_analysis.ipynb
│                   │   ├── 4-SOMPlots_Cupy_Test.ipynb
│                   │   └── README.rst
│                   └── py
│                       ├── 1-iris_training.py
│                       ├── 2-iris_training_gpu.py
│                       ├── 3-iris_plot.py
│                       ├── 4-iris_post_training_analysis.py
│                       ├── 5-iris_interactive.py
│                       └── README.rst
├── Mini_Project
│   ├── Backpropagation
│   │   ├── BackProp_Lab.md
│   │   ├── models.py
│   │   └── utils.py
│   ├── LMS
│   │   └── LMS_lab.md
│   ├── Perceptron
│   │   ├── perceptron_lab.md
│   │   └── utils.py
│   └── SimpleNet_Python
│       ├── PythonLab1.ipynb
│       ├── PythonLab1.py
│       ├── SampleDF.csv
│       └── python_lab.md
└── Readme.md

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
/.idea/inspectionProfiles/profiles_settings.xml
/.idea/deployment.xml
/.idea/Machine-Learning.iml
/.idea/misc.xml
/.idea/modules.xml
/.idea/vcs.xml
/.idea/workspace.xml
*_sol.py
/Temp
*_sol.ipynb

--------------------------------------------------------------------------------
/Lectures/00- Intreview_Questions/Intreview.py:
--------------------------------------------------------------------------------
# ----------------------------------------------------------------------
# 2. Write a Python program to get the Python version you are using.

# ----------------------------------------------------------------------
# 3. Write a Python program to display the current date and time.

# ----------------------------------------------------------------------
# 4. Write a Python program which accepts the radius of a circle from the user and
# computes the area.

# ----------------------------------------------------------------------
# 5. Write a Python program which accepts the user's first and last name and
# prints them in reverse order with a space between them.

# ----------------------------------------------------------------------
# 6. Write a Python program which accepts a sequence of comma-separated
# numbers from the user and generates a list and a tuple with those numbers.

# ----------------------------------------------------------------------
# 7. Write a Python program to accept a filename from the user and print
# its extension.

# ----------------------------------------------------------------------
# 8. Write a Python program to display the first and last colors from the following list.
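The questions in this file are left blank for practice. As a reference, here is one possible set of solution sketches for a few of the opening questions (2, 4, 6, 7); the helper names are mine, not part of the repository.

```python
import math
import sys

# Q2: the Python version in use.
print(sys.version)

# Q4: area of a circle from its radius.
def circle_area(radius):
    return math.pi * radius ** 2

# Q6: a comma-separated string of numbers -> a list and a tuple.
def split_numbers(text):
    values = text.split(",")
    return list(values), tuple(values)

# Q7: the extension of a filename (text after the last dot).
def extension(filename):
    return filename.rsplit(".", 1)[-1]
```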
# ----------------------------------------------------------------------
# 9. Write a Python program to display the examination schedule.

# ----------------------------------------------------------------------
# 10. Write a Python program that accepts an integer (n) and computes
# the value of n+nn+nnn.

# ----------------------------------------------------------------------
# 11. Write a Python program to print the documentation (syntax, description, etc.)
# of Python built-in function(s).

# ----------------------------------------------------------------------
# 12. Write a Python program to print the calendar of a given month and year.

# ----------------------------------------------------------------------
# 14. Write a Python program to calculate the number of days between two dates.

# ----------------------------------------------------------------------
# 15. Write a Python program to get the volume of a sphere with radius 6.

# ----------------------------------------------------------------------
# 16. Write a Python program to get the difference between a given number
# and 17; if the number is greater than 17, return double the absolute difference.

# ----------------------------------------------------------------------
# 17. Write a Python program to test whether a number is within 100 of 1000 or 2000.

# ----------------------------------------------------------------------
# 18. Write a Python program to calculate the sum of three given numbers;
# if the values are equal, return three times their sum.

# ----------------------------------------------------------------------
# 19. Write a Python program to get a new string from a given string
# where "Is" has been added to the front.
# If the given string already begins with "Is", return the string unchanged.

# ----------------------------------------------------------------------
# 20. Write a Python program to get a string which is n
# (non-negative integer) copies of a given string.

# ----------------------------------------------------------------------
# 21. Write a Python program to find whether a given number (accepted from
# the user) is even or odd, and print an appropriate message to the user.

# ----------------------------------------------------------------------
# 22. Write a Python program to count the number 4 in a given list.

# ----------------------------------------------------------------------
# 23. Write a Python program to get n (non-negative integer) copies
# of the first 2 characters of a given string. Return n copies of
# the whole string if its length is less than 2.

# ----------------------------------------------------------------------
# 24. Write a Python program to test whether a passed letter is a vowel.

# ----------------------------------------------------------------------
# 25. Write a Python program to check whether a specified value is contained
# in a group of values.
# Test Data:
# 3 -> [1, 5, 8, 3] : True
# -1 -> [1, 5, 8, 3] : False

# ----------------------------------------------------------------------
# 26. Write a Python program to create a histogram from a given list of integers.

# ----------------------------------------------------------------------
# 27. Write a Python program to concatenate all elements in a list into
# a string and return it.

# ----------------------------------------------------------------------
# 28.
# Write a Python program to print all even numbers from a given list,
# in the same order, and stop printing once the number 237 is encountered.

# ----------------------------------------------------------------------
# 29. Write a Python program to print a set containing all the colors
# from color_list_1 which are not present in color_list_2.

# ----------------------------------------------------------------------
# 30. Write a Python program that accepts the base and height
# of a triangle and computes the area.

# ----------------------------------------------------------------------
# 31. Write a Python program to compute the greatest common divisor (GCD)
# of two positive integers.

# ----------------------------------------------------------------------
# 32. Write a Python program to get the least common multiple (LCM) of two positive integers.

# ----------------------------------------------------------------------
# 33. Write a Python program to compute the sum of three given integers.
# However, if two of the values are equal, the sum is zero.

# ----------------------------------------------------------------------
# 34. Write a Python program to compute the sum of two given integers.
# However, if the sum is between 15 and 20, return 20.

# ----------------------------------------------------------------------
# 35. Write a Python program that returns true if the two given integer
# values are equal, or their sum or difference is 5.

# ----------------------------------------------------------------------
# 36. Write a Python program to add two objects if both objects are of integer type.
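A few of questions 10 through 35 above admit short, self-contained answers. The sketch below is one possible take (all function names are mine, not from the file):

```python
import math

# Q10: n + nn + nnn, e.g. 5 -> 5 + 55 + 555 = 615.
def n_nn_nnn(n):
    s = str(n)
    return int(s) + int(s * 2) + int(s * 3)

# Q16: difference from 17, doubled when the number exceeds 17.
def diff_17(n):
    return 2 * (n - 17) if n > 17 else 17 - n

# Q19: prepend "Is" unless the string already starts with it.
def add_is(s):
    return s if s.startswith("Is") else "Is" + s

# Q23: n copies of the first two characters; slicing never overruns,
# so a shorter string is simply repeated whole.
def front_copies(s, n):
    return s[:2] * n

# Q28: even numbers from a list, stopping once 237 appears.
def evens_until_237(nums):
    out = []
    for n in nums:
        if n == 237:
            break
        if n % 2 == 0:
            out.append(n)
    return out

# Q31 / Q32: the stdlib supplies the GCD; the LCM is built on top of it.
def lcm(a, b):
    return a * b // math.gcd(a, b)

# Q35: equal values, or a sum or absolute difference of 5.
def equal_or_five(a, b):
    return a == b or a + b == 5 or abs(a - b) == 5
```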
# ----------------------------------------------------------------------
# 37. Write a Python program to display your details (name, age,
# address) on three different lines.

# ----------------------------------------------------------------------
# 38. Write a Python program to solve (x + y) * (x + y).

# ----------------------------------------------------------------------
# 39. Write a Python program to compute the future value of a specified
# principal amount, rate of interest, and number of years.

# ----------------------------------------------------------------------
# 40. Write a Python program to compute the distance between the points (x1, y1) and (x2, y2).

# ----------------------------------------------------------------------
# 41. Write a Python program to check whether a file exists.

# ----------------------------------------------------------------------
# 42. Write a Python program to determine whether the Python shell is executing
# in 32-bit or 64-bit mode.
# For 32-bit it will return 32, and for 64-bit it will return 64.

# ----------------------------------------------------------------------
# 43. Write a Python program to get the OS name, platform, and release information.

# ----------------------------------------------------------------------
# 44. Write a Python program to locate the Python site-packages directory.

# ----------------------------------------------------------------------
# 45. Write a Python program to call an external command.

# ----------------------------------------------------------------------
# 46. Write a Python program to get the path and name of the file that is currently executing.
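One possible set of sketches for questions 39 through 43 above, using only the standard library (names are mine):

```python
import math
import os
import platform
import struct

# Q39: future value of principal p at interest rate r after n years.
def future_value(p, r, n):
    return p * (1 + r) ** n

# Q40: Euclidean distance between (x1, y1) and (x2, y2).
def distance(x1, y1, x2, y2):
    return math.hypot(x2 - x1, y2 - y1)

# Q41: does a file exist at this path?
def file_exists(path):
    return os.path.isfile(path)

# Q42: 32- or 64-bit interpreter, from the pointer size in bits.
bits = struct.calcsize("P") * 8

# Q43: OS name, platform, and release.
os_info = (os.name, platform.system(), platform.release())
```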
# ----------------------------------------------------------------------
# 47. Write a Python program to find the number of CPUs available.

# ----------------------------------------------------------------------
# 48. Write a Python program to parse a string to a float or an integer.

# ----------------------------------------------------------------------
# 49. Write a Python program to list all files in a directory.

# ----------------------------------------------------------------------
# 50. Write a Python program to print without a newline or space.

# ----------------------------------------------------------------------
# 51. Write a Python program to profile a Python program.

# ----------------------------------------------------------------------
# 52. Write a Python program to print to stderr.

# ----------------------------------------------------------------------
# 53. Write a Python program to access environment variables.

# ----------------------------------------------------------------------
# 54. Write a Python program to get the current username.

# ----------------------------------------------------------------------
# 55. Write a Python program to find local IP addresses using Python's stdlib.

# ----------------------------------------------------------------------
# 56. Write a Python program to get the height and width of the console window.

# ----------------------------------------------------------------------
# 57. Write a program to get the execution time of a Python method.

# ----------------------------------------------------------------------
# 58. Write a Python program to compute the sum of the first n positive integers.
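Sketches for a few of questions 47 through 58 above; the `timed` helper is my own convention, and the closed form in `sum_first_n` is one of several valid answers.

```python
import os
import sys
import time

# Q47: number of CPUs available to the interpreter's host.
cpus = os.cpu_count()

# Q48: parse a string to an int when possible, otherwise a float.
def parse_number(s):
    try:
        return int(s)
    except ValueError:
        return float(s)

# Q52: print to stderr instead of stdout.
print("something went wrong", file=sys.stderr)

# Q57: execution time of a function call.
def timed(fn, *args):
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

# Q58: sum of the first n positive integers (closed form).
def sum_first_n(n):
    return n * (n + 1) // 2
```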
# ----------------------------------------------------------------------
# 59. Write a Python program to convert height (in feet and inches) to centimeters.

# ----------------------------------------------------------------------
# 60. Write a Python program to calculate the hypotenuse of a right-angled triangle.

# ----------------------------------------------------------------------
# 61. Write a Python program to convert a distance (in feet) to inches, yards, and miles.

# ----------------------------------------------------------------------
# 62. Write a Python program to convert all units of time into seconds.

# ----------------------------------------------------------------------
# 63. Write a Python program to get an absolute file path.

# ----------------------------------------------------------------------
# 64. Write a Python program to get file creation and modification date/times.

# ----------------------------------------------------------------------
# 65. Write a Python program to convert seconds to days, hours, minutes, and seconds.

# ----------------------------------------------------------------------
# 66. Write a Python program to calculate body mass index.

# ----------------------------------------------------------------------
# 67. Write a Python program to convert pressure in kilopascals to
# pounds per square inch, millimeters of mercury (mmHg), and atmospheres.

# ----------------------------------------------------------------------
# 68. Write a Python program to calculate the sum of the digits in an integer.

# ----------------------------------------------------------------------
# 69.
# Write a Python program to sort three integers without using conditional
# statements or loops.

# ----------------------------------------------------------------------
# 70. Write a Python program to sort files by date.

# ----------------------------------------------------------------------
# 71. Write a Python program to get a directory listing, sorted by creation date.

# ----------------------------------------------------------------------
# 72. Write a Python program to get the details of the math module.

# ----------------------------------------------------------------------
# 73. Write a Python program to calculate the midpoint of a line.

# ----------------------------------------------------------------------
# 74. Write a Python program to hash a word.

# ----------------------------------------------------------------------
# 75. Write a Python program to get the copyright information.

# ----------------------------------------------------------------------
# 76. Write a Python program to get the command-line arguments
# (name of the script, the number of arguments, the arguments) passed to a script.

# ----------------------------------------------------------------------
# 77. Write a Python program to test whether the system is a big-endian
# or little-endian platform.

# ----------------------------------------------------------------------
# 78. Write a Python program to find the available built-in modules.

# ----------------------------------------------------------------------
# 79. Write a Python program to get the size of an object in bytes.

# ----------------------------------------------------------------------
# 80.
# Write a Python program to get the current value of the recursion limit.

# ----------------------------------------------------------------------
# 81. Write a Python program to concatenate N strings.

# ----------------------------------------------------------------------
# 82. Write a Python program to calculate the sum over a container.

# ----------------------------------------------------------------------
# 83. Write a Python program to test whether all numbers in a list are greater
# than a certain number.

# ----------------------------------------------------------------------
# 84. Write a Python program to count the number of occurrences of a specific character in a string.

# ----------------------------------------------------------------------
# 85. Write a Python program to check whether a path is a file or a directory.

# ----------------------------------------------------------------------
# 86. Write a Python program to get the ASCII value of a character.

# ----------------------------------------------------------------------
# 87. Write a Python program to get the size of a file.
# import os
# file_size = os.path.getsize("abc.txt")
# print("The size of abc.txt is:", file_size, "bytes")

# ----------------------------------------------------------------------
# 88. Given variables x=30 and y=20, write a Python program to print "30+20=50".

# ----------------------------------------------------------------------
# 89. Write a Python program to perform an action if a condition is true.

# ----------------------------------------------------------------------
# 90. Write a Python program to create a copy of its own source code.
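Possible sketches for a few of questions 65 through 88 above. The question for Q74 names no hash algorithm, so SHA-256 is an assumption; the rest follow directly from the standard library.

```python
import hashlib

# Q65: seconds -> (days, hours, minutes, seconds).
def split_seconds(total):
    days, rest = divmod(total, 86400)
    hours, rest = divmod(rest, 3600)
    minutes, seconds = divmod(rest, 60)
    return days, hours, minutes, seconds

# Q68: sum of the digits of an integer.
def digit_sum(n):
    return sum(int(d) for d in str(abs(n)))

# Q69: sort three integers without conditionals or loops:
# min and max give the ends, and the total identifies the middle.
def sort3(a, b, c):
    lo, hi = min(a, b, c), max(a, b, c)
    return lo, a + b + c - lo - hi, hi

# Q74: hash a word (SHA-256 chosen here as an example digest).
def hash_word(word):
    return hashlib.sha256(word.encode()).hexdigest()

# Q83: are all numbers in the list greater than the threshold?
def all_greater(nums, threshold):
    return all(n > threshold for n in nums)

# Q88: build "30+20=50" from x = 30 and y = 20.
x, y = 30, 20
line = f"{x}+{y}={x + y}"
```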
# ----------------------------------------------------------------------
# 91. Write a Python program to swap two variables.

# ----------------------------------------------------------------------
# 92. Write a Python program to define a string containing special characters
# in various forms.

# ----------------------------------------------------------------------
# 93. Write a Python program to get the identity of an object.

# ----------------------------------------------------------------------
# 94. Write a Python program to convert a byte string to a list of integers.

# ----------------------------------------------------------------------
# 95. Write a Python program to check if a string is numeric.

# ----------------------------------------------------------------------
# 96. Write a Python program to print the current call stack.

# ----------------------------------------------------------------------
# 97. Write a Python program to list the special variables used within the language.

# ----------------------------------------------------------------------
# 98. Write a Python program to get the system time.

# ----------------------------------------------------------------------
# 99. Write a Python program to clear the screen or terminal.

# ----------------------------------------------------------------------
# 100. Write a Python program to get the name of the host on which the routine is running.

# ----------------------------------------------------------------------
# 101. Write a Python program to access and print a URL's content to the console.

# ----------------------------------------------------------------------
# 102.
# Write a Python program to get system command output.

# ----------------------------------------------------------------------
# 103. Write a Python program to extract the filename from a given path.

# ----------------------------------------------------------------------
# 104. Write a Python program to get the effective group id, effective
# user id, real group id, and the list of supplemental group ids associated
# with the current process.

# ----------------------------------------------------------------------
# 105. Write a Python program to get the user's environment.

# ----------------------------------------------------------------------
# 106. Write a Python program to split a path on the extension separator.

# ----------------------------------------------------------------------
# 107. Write a Python program to retrieve file properties.

# ----------------------------------------------------------------------
# 108. Write a Python program to determine whether a given path name refers
# to a file or a directory.

# ----------------------------------------------------------------------
# 109. Write a Python program to check if a number is positive, negative,
# or zero.

# ----------------------------------------------------------------------
# 110. Write a Python program to get the numbers divisible by fifteen from a
# list using an anonymous function.

# ----------------------------------------------------------------------
# 111. Write a Python program to build file lists from the current directory
# using a wildcard.

# ----------------------------------------------------------------------
# 112. Write a Python program to remove the first item from a specified list.
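Sketches for a few of questions 91 through 112 above. For Q102 a POSIX `echo` command is assumed purely as an example; any command list works with `subprocess.run`.

```python
import os
import subprocess

# Q91: swap two variables with tuple unpacking.
a, b = 1, 2
a, b = b, a

# Q94: byte string -> list of integers.
def bytes_to_ints(bs):
    return list(bs)

# Q95: is the string numeric (parseable as a float)?
def is_numeric(s):
    try:
        float(s)
        return True
    except ValueError:
        return False

# Q102: capture a system command's output (POSIX `echo` assumed).
echo_out = subprocess.run(["echo", "hi"], capture_output=True, text=True).stdout

# Q103 / Q106: filename from a path, and a split on the extension separator.
name = os.path.basename("/tmp/abc.txt")
root, ext = os.path.splitext("/tmp/abc.txt")

# Q110: numbers divisible by fifteen, via an anonymous function.
div_15 = list(filter(lambda n: n % 15 == 0, [15, 7, 30, 22, 45]))

# Q112: remove the first item from a list.
items = [1, 2, 3]
del items[0]
```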
# ----------------------------------------------------------------------
# 113. Write a Python program to read a number from input; if it is not a
# number, generate an error message.

# ----------------------------------------------------------------------
# 114. Write a Python program to filter the positive numbers from a list.

# ----------------------------------------------------------------------
# 115. Write a Python program to compute the product of a list of integers
# (without using a for loop).

# ----------------------------------------------------------------------
# 116. Write a Python program to print Unicode characters.

# ----------------------------------------------------------------------
# 117. Write a Python program to show that two string variables with the same
# value can point to the same memory location.

# ----------------------------------------------------------------------
# 118. Write a Python program to create a bytearray from a list.

# ----------------------------------------------------------------------
# 119. Write a Python program to display a floating-point number with a
# specified number of decimal places.

# ----------------------------------------------------------------------
# 120. Write a Python program to format a specified string, limiting
# the number of characters to 6.

# ----------------------------------------------------------------------
# 121. Write a Python program to determine whether a variable is defined.

# ----------------------------------------------------------------------
# 122. Write a Python program to empty a variable without destroying it.

# ----------------------------------------------------------------------
# 123.
# Write a Python program to determine the largest and smallest integers,
# longs, and floats.

# ----------------------------------------------------------------------
# 124. Write a Python program to check if multiple variables have the same value.

# ----------------------------------------------------------------------
# 125. Write a Python program to compute the sum of all counts in a collections.Counter.

# ----------------------------------------------------------------------
# 126. Write a Python program to get the actual module object for a given object.

# ----------------------------------------------------------------------
# 127. Write a Python program to check if an integer fits in 64 bits.

# ----------------------------------------------------------------------
# 128. Write a Python program to check if lowercase letters exist in a string.

# ----------------------------------------------------------------------
# 129. Write a Python program to add leading zeroes to a string.

# ----------------------------------------------------------------------
# 130. Write a Python program to use double quotes to display strings.

# ----------------------------------------------------------------------
# 131. Write a Python program to split a variable-length string into variables.

# ----------------------------------------------------------------------
# 132. Write a Python program to list the home directory without the absolute path.

# ----------------------------------------------------------------------
# 133. Write a Python program to calculate the running time
# (difference between start and current time) of a program.

# ----------------------------------------------------------------------
# 134. Write a Python program to input two integers on a single line.
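Sketches for a few of questions 114 through 129 above. Note that `math.prod` (used for Q115) requires Python 3.8+; on older interpreters `functools.reduce` with `operator.mul` is the usual substitute.

```python
import math
from collections import Counter

# Q114: positive numbers from a list.
def positives(nums):
    return [n for n in nums if n > 0]

# Q115: product of a list without an explicit for loop (Python 3.8+).
def product(nums):
    return math.prod(nums)

# Q119: a float rendered with two decimal places.
two_places = f"{3.14159:.2f}"

# Q125: sum of all counts in a Counter.
total = sum(Counter("banana").values())

# Q127: does an integer fit in a signed 64-bit word?
def fits_in_64_bits(n):
    return -2**63 <= n < 2**63

# Q129: pad a numeric string with leading zeroes.
padded = "42".zfill(5)
```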
# ----------------------------------------------------------------------
# 135. Write a Python program to print variables without spaces between values.

# ----------------------------------------------------------------------
# 136. Write a Python program to find the files in a given directory,
# skipping its subdirectories.

# ----------------------------------------------------------------------
# 137. Write a Python program to extract a single key-value pair
# of a dictionary into variables.

# ----------------------------------------------------------------------
# 138. Write a Python program to convert True to 1 and False to 0.

# ----------------------------------------------------------------------
# 139. Write a Python program to validate an IP address.

# ----------------------------------------------------------------------
# 140. Write a Python program to convert an integer to binary, keeping
# leading zeros.

# ----------------------------------------------------------------------
# 141. Write a Python program to convert decimal to hexadecimal.

# ----------------------------------------------------------------------
# 142. Write a Python program to find the operating system name,
# platform, and platform release date.

# ----------------------------------------------------------------------
# 143. Write a Python program to determine whether the Python shell is
# executing in 32-bit or 64-bit mode on the operating system.

# ----------------------------------------------------------------------
# 144. Write a Python program to check whether a variable is an integer or a string.

# ----------------------------------------------------------------------
# 146. Write a Python program to find the location of Python module sources.
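One possible set of sketches for questions 138 through 144 above, using the stdlib `ipaddress` module for IP validation (function names are mine):

```python
import ipaddress

# Q138: True -> 1, False -> 0.
def bool_to_int(flag):
    return int(flag)

# Q139: validate an IPv4/IPv6 address.
def valid_ip(text):
    try:
        ipaddress.ip_address(text)
        return True
    except ValueError:
        return False

# Q140: integer to binary, keeping leading zeros up to a fixed width.
def to_binary(n, width=8):
    return format(n, f"0{width}b")

# Q141: decimal to hexadecimal.
def to_hex(n):
    return hex(n)

# Q144: report whether a value is an int or a str.
def kind(value):
    return type(value).__name__
```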
# ----------------------------------------------------------------------
# 147. Write a Python function to check whether a number is divisible
# by another number. Accept two integer values from the user.

# ----------------------------------------------------------------------
# 148. Write a Python function to find the maximum and minimum numbers
# in a sequence of numbers.

# ----------------------------------------------------------------------
# 149. Write a Python function that takes a positive integer and returns
# the sum of the cubes of all the positive integers smaller than the
# specified number.

# ----------------------------------------------------------------------
# 150. Write a Python function to find a distinct pair of numbers whose
# product is odd from a sequence of integer values.

--------------------------------------------------------------------------------
/Lectures/01-CheatSheet/01-ML Cheat Sheet/ML Cheat sheet.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/01-ML Cheat Sheet/ML Cheat sheet.pdf
--------------------------------------------------------------------------------
/Lectures/01-CheatSheet/01-ML Cheat Sheet/Scikit_Learn_Cheat_Sheet_Python.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/01-ML Cheat Sheet/Scikit_Learn_Cheat_Sheet_Python.pdf
--------------------------------------------------------------------------------
/Lectures/01-CheatSheet/02-Math Cheat Sheet/Essentail Math.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/02-Math Cheat Sheet/Essentail Math.pdf
--------------------------------------------------------------------------------
/Lectures/01-CheatSheet/02-Math Cheat Sheet/machine-learning-cheat-sheet.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/02-Math Cheat Sheet/machine-learning-cheat-sheet.pdf
--------------------------------------------------------------------------------
/Lectures/01-CheatSheet/02-Math Cheat Sheet/probability_cheatsheet.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/02-Math Cheat Sheet/probability_cheatsheet.pdf
--------------------------------------------------------------------------------
/Lectures/01-CheatSheet/03-Python Cheat Sheet/Base Python.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Base Python.pdf
--------------------------------------------------------------------------------
/Lectures/01-CheatSheet/03-Python Cheat Sheet/Bokeh.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Bokeh.pdf
--------------------------------------------------------------------------------
/Lectures/01-CheatSheet/03-Python Cheat Sheet/Classes.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Classes.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Code Debug.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Code Debug.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Data Structure 1.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Data Structure 1.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Data Structure 2.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Data Structure 2.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Data Structure 3.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Data Structure 3.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Dictionaries.pdf: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Dictionaries.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Django.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Django.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Functions.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Functions.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/If While.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/If While.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Jupyter Notebook.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Jupyter Notebook.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/List.pdf: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/List.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Matplotlib.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Matplotlib.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Matplotlib2.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Matplotlib2.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Numpy.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Numpy.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Open-WriteFiles .pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Open-WriteFiles .pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Pands1.pdf: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Pands1.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Pands2.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Pands2.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Python Cheat Sheet.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Python Cheat Sheet.pdf -------------------------------------------------------------------------------- /Lectures/01-CheatSheet/03-Python Cheat Sheet/Scipy-Linalgebra.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amir-jafari/Machine-Learning/4a2eb045da08cca16bf0a3dcf488d926da7529f9/Lectures/01-CheatSheet/03-Python Cheat Sheet/Scipy-Linalgebra.pdf -------------------------------------------------------------------------------- /Lectures/02-Math Review Codes/01-Linear-Algebra/Vectors.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | 3 | # Constants 4 | VECTOR_1 = np.array([1, 2]) 5 | SAME_VECTOR = np.array([1, 2]) 6 | RECTANGULAR_ARRAY = np.arange(15).reshape(3, 5) 7 | REPEATED_VALUES_ARRAY = np.array([[1, 2, 3, 4, 5], 8 | [1, 2, 3, 4, 5], 9 | [1, 2, 3, 4, 5]]) 10 | 11 | 12 | def print_vector_info(vector_name, vector): 13 | print(f"{vector_name} = {vector}") 14 | print(f"{vector_name}.shape = {vector.shape}") 15 | 16 | 17 | def 
perform_operations_on_vectors(vector1, vector2): 18 | print(f"Dot product of vectors: {np.dot(vector1, vector2)}") 19 | print(f"Element-wise multiplication of vectors: {np.multiply(vector1, vector2)}") 20 | 21 | 22 | def perform_operations_on_arrays(array1, array2): 23 | print(f"Dot product of arrays: {np.dot(array1, array2.T)}") 24 | print(f"Matrix multiplication of arrays: {np.matmul(array1, array2.T)}") 25 | 26 | 27 | def main(): 28 | print_vector_info("Vector 1", VECTOR_1) 29 | print_vector_info("Vector 2", SAME_VECTOR) 30 | perform_operations_on_vectors(VECTOR_1, SAME_VECTOR) 31 | 32 | print(f"Rectangular Shape Array = {RECTANGULAR_ARRAY}, ndim = {RECTANGULAR_ARRAY.ndim}") 33 | 34 | perform_operations_on_arrays(RECTANGULAR_ARRAY, REPEATED_VALUES_ARRAY) 35 | 36 | single_dimensional_vector = np.array([1, 2, 3, 4]) 37 | reshaped_vector = single_dimensional_vector.reshape((-1, 1)) 38 | print(f"reshaped_vector = {reshaped_vector}") 39 | 40 | 41 | if __name__ == "__main__": 42 | main() -------------------------------------------------------------------------------- /Lectures/02-Math Review Codes/01-Linear-Algebra/eigen_vector.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | from scipy import linalg 3 | import matplotlib.pyplot as plt 4 | 5 | # ---------------------------------------------------------------------------------------------------------------- 6 | def calculate_inverse(matrix): 7 | """Calculates the inverse, determinant and eigenvalues of the matrix.""" 8 | inv = linalg.inv(matrix) 9 | det = linalg.det(matrix) 10 | eig = linalg.eig(matrix) 11 | print(inv) 12 | print(det) 13 | print(eig) 14 | 15 | 16 | def calculate_svd(matrix): 17 | """Calculates the singular value decomposition of the matrix.""" 18 | U, s, Vh = linalg.svd(matrix) 19 | print(U) 20 | print(s) 21 | print(Vh) 22 | 23 | 24 | def solve_equations(A, b): 25 | """Solves the system of equations given the matrix A and vector b.""" 26 | 
ans = linalg.solve(A, b) 27 | print(ans) 28 | 29 | 30 | def least_square_method(x, y): 31 | """Solves the least squares problem, plots the original data and the least squares fit.""" 32 | M = x[:, np.newaxis] ** [0, 2] 33 | p, res, rnk, s = linalg.lstsq(M, y) 34 | plt.figure(1) 35 | plt.plot(x, y, 'o', label='data') 36 | xx = np.linspace(0, 9, 101) 37 | yy = p[0] + p[1] * xx ** 2 38 | plt.plot(xx, yy, label='least squares fit, $y = a + bx^2$') 39 | plt.xlabel('x') 40 | plt.ylabel('y') 41 | plt.legend(framealpha=1, shadow=True) 42 | plt.grid(alpha=0.25) 43 | plt.show() 44 | 45 | # ---------------------------------------------------------------------------------------------------------------- 46 | 47 | # Usage 48 | MATRIX_1 = np.array([[1, 0], [0, 4]]) 49 | MATRIX_2 = np.array([[3, 2, 0], [1, -1, 0], [0, 5, 1]]) 50 | VECTOR = np.array([2, 4, -1]) 51 | X_VALUES = np.array([1, 2.5, 3.5, 4, 5, 7, 8.5]) 52 | Y_VALUES = np.array([0.3, 1.1, 1.5, 2.0, 3.2, 6.6, 8.6]) 53 | 54 | calculate_inverse(MATRIX_1) 55 | calculate_svd(MATRIX_1) 56 | solve_equations(MATRIX_2, VECTOR) 57 | least_square_method(X_VALUES, Y_VALUES) -------------------------------------------------------------------------------- /Lectures/02-Math Review Codes/01-Linear-Algebra/sympy_code.py: -------------------------------------------------------------------------------- 1 | from sympy import * 2 | from sympy import Matrix 3 | # ------------------------------------------------------------------------------------------------------- 4 | x = symbols('x') 5 | y = symbols('y') 6 | 7 | f1 = (x+y)**2 8 | print(f1) 9 | 10 | f2 = expand((x + y) ** 3) 11 | print(f2) 12 | 13 | f3 = simplify((x + x * y) / x) 14 | print(f3) 15 | # ------------------------------------------------------------------------------------------------------- 16 | 17 | l1 = limit(x, x, oo) 18 | print(l1) 19 | 20 | l2 = limit(1/x, x, oo) 21 | print(l2) 22 | 23 | l3 = limit(x**x, x, 0) 24 | print(l3) 25 | 26 | l4 = limit((tan(x+y)-tan(x))/y, y, 0)
27 | print(l4) 28 | # ------------------------------------------------------------------------------------------------------- 29 | 30 | d1 = diff(sin(x), x) 31 | print(d1) 32 | 33 | d2 = diff(sin(2*x), x) 34 | print(d2) 35 | 36 | # ------------------------------------------------------------------------------------------------------- 37 | 38 | s1 = series(cos(x), x) 39 | print(s1) 40 | 41 | s2 = series(1/cos(x), x) 42 | print(s2) 43 | # ------------------------------------------------------------------------------------------------------- 44 | 45 | i1 = integrate(6*x**5, x) 46 | print(i1) 47 | 48 | i2 = integrate(sin(x), x) 49 | print(i2) 50 | 51 | i3 = integrate(2*x + sinh(x), x) 52 | print(i3) 53 | 54 | i4 = integrate(x**3, (x, -1, 1)) 55 | print(i4) 56 | 57 | i5= integrate(sin(x), (x, 0, pi/2)) 58 | print(i5) 59 | 60 | 61 | i6 = integrate(cos(x), (x, -pi/2, pi/2)) 62 | print(i6) 63 | # ------------------------------------------------------------------------------------------------------- 64 | 65 | e1 = solve(x**4 - 1, x) 66 | print(e1) 67 | 68 | e2 = solve([x + 5*y - 2, -3*x + 6*y - 15], [x, y]) 69 | print(e2) 70 | # ------------------------------------------------------------------------------------------------------- 71 | 72 | f = x**4 - 3*x**2 + 1 73 | factor(f) 74 | # ------------------------------------------------------------------------------------------------------- 75 | 76 | M1 = Matrix([[1,0], [0,1]]) 77 | print(M1) 78 | 79 | M2 = Matrix([[1,x], [y,1]]) 80 | print(M2) 81 | -------------------------------------------------------------------------------- /Lectures/02-Math Review Codes/02-Statistics/Generate_Random_number.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | import random 3 | import numpy as np 4 | # ------------------------------------------------------------ 5 | random_number = random.random() 6 | print(random_number) 7 | 8 | outcome = random.randint(1,6) 9 | 
print(outcome) 10 | 11 | a1 = [random.randint(1, 6) for _ in range(10)] 12 | print(a1) 13 | 14 | print(random.sample(range(1, 50), 6)) 15 | 16 | random_number = random.random() 17 | print(random_number) 18 | 19 | outcome = random.randint(1, 6) 20 | print(outcome) 21 | # ------------------------------------------------------------ 22 | 23 | a = np.random.random(1000) 24 | 25 | 26 | plt.figure(1) 27 | plt.hist(a, bins=10) 28 | 29 | mu, sigma = 0, 0.1 30 | s = np.random.normal(mu, sigma, 1000) 31 | 32 | plt.figure(2) 33 | plt.hist(s, bins=10, histtype='bar') 34 | 35 | s1 = np.random.randn(1000) 36 | 37 | plt.figure(3) 38 | plt.hist(s1, bins=10, histtype='bar') 39 | 40 | a = np.random.random(1000) 41 | print(a) 42 | 43 | plt.figure(4) 44 | plt.plot(a) 45 | 46 | s1 = np.random.randn(1000) 47 | plt.figure(5) 48 | plt.hist(s1) 49 | plt.show() -------------------------------------------------------------------------------- /Lectures/02-Math Review Codes/Linear_affine.md: -------------------------------------------------------------------------------- 1 | # Understanding Why $y = 2x + 1$ is Affine but Not Linear 2 | 3 | ## **Mathematical Explanation** 4 | A function is **linear** if it satisfies both: 5 | 6 | 1. **Additivity**: 7 | $f(x_{1} + x_2) = f(x_1) + f(x_2)$ 8 | 2. **Homogeneity** (or scalar multiplication): 9 | $f(\lambda x) = \lambda f(x)$ for any scalar $\lambda$. 10 | 11 | ### **Step 1: Checking Additivity** 12 | 13 | 14 | $f(x_1 + x_2) = 2(x_1 + x_2) + 1 = 2x_1 + 2x_2 + 1$ 15 | 16 | 17 | On the other hand, 18 | 19 | 20 | $f(x_1) + f(x_2) = (2x_1 + 1) + (2x_2 + 1) = 2x_1 + 2x_2 + 2$ 21 | 22 | 23 | Since $f(x_1 + x_2) \neq f(x_1) + f(x_2)$ (there is an extra $+1$ term), additivity does not hold.
24 | 25 | ### **Step 2: Checking Homogeneity** 26 | For a scalar $( \lambda )$: 27 | 28 | 29 | $f(\lambda x) = 2(\lambda x) + 1 = 2\lambda x + 1$ 30 | 31 | 32 | On the other hand, 33 | 34 | 35 | $\lambda f(x) = \lambda (2x + 1) = 2\lambda x + \lambda$ 36 | 37 | 38 | Since $f(\lambda x) \neq \lambda f(x)$ (there is an extra $+1$ term), homogeneity does not hold. 39 | 40 | Since neither condition is satisfied, $f(x) = 2x + 1$ is **not a linear function**. 41 | 42 | --- 43 | 44 | ## **Affine Functions** 45 | A function is **affine** if it is of the form: 46 | 47 | $f(x) = Ax + b$ 48 | 49 | where $(A)$ is a linear transformation (matrix or scalar) and $(b)$ is a constant vector (or scalar). 50 | In our case, 51 | 52 | $f(x) = 2x + 1$ 53 | 54 | which fits the affine form where $A = 2$ and $b = 1$. Therefore, $y = 2x + 1$ is **affine but not linear**. 55 | 56 | --- 57 | 58 | ## **Key Takeaway** 59 | - A **linear function** must pass through the origin and satisfy additivity and homogeneity. 60 | - An **affine function** is a shifted linear function, meaning it is linear plus a constant term. 61 | - Since $y = 2x + 1$ has a constant term $+1$ , it **does not satisfy linearity but is affine**. 
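The two failing properties can also be checked numerically. A minimal sketch (the test values `x1`, `x2`, `lam` are arbitrary illustrations, not special cases):

```python
def f(x):
    return 2 * x + 1   # affine: A = 2, b = 1

x1, x2, lam = 3.0, 5.0, 4.0

# Additivity fails: f(x1 + x2) = 17 but f(x1) + f(x2) = 18
print(f(x1 + x2), f(x1) + f(x2))

# Homogeneity fails: f(lam * x1) = 25 but lam * f(x1) = 28
print(f(lam * x1), lam * f(x1))

# Dropping the constant term (b = 0) recovers both properties
g = lambda x: 2 * x
assert g(x1 + x2) == g(x1) + g(x2)
assert g(lam * x1) == lam * g(x1)
```

Any fixed shift $b \neq 0$ produces the same mismatch, which is why the constant term alone decides linearity.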
62 | 63 | -------------------------------------------------------------------------------- /Lectures/03-Perceptron/Main_loop.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | from perceptron import Perceptron 3 | import pandas as pd 4 | # ------------------------------------------------------------------------------------------------------- 5 | df = pd.read_csv('https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data', header=None) 6 | # ------------------------------------------------------------------------------------------------------- 7 | import matplotlib.pyplot as plt 8 | y = df.iloc[0:100, 4].values 9 | y = np.where(y == 'Iris-setosa', -1, 1) 10 | X = df.iloc[0:100, [0, 2]].values 11 | plt.figure(1) 12 | plt.scatter(X[:50, 0], X[:50, 1], color='red', marker='o', label='setosa') 13 | plt.scatter(X[50:100, 0], X[50:100, 1], color='blue', marker='x', label='versicolor') 14 | plt.xlabel('sepal length') 15 | plt.ylabel('petal length') 16 | plt.legend(loc='upper left') 17 | ppn = Perceptron(eta=0.1, n_iter=10) 18 | ppn.fit(X, y) 19 | plt.figure(2) 20 | plt.plot(range(1, len(ppn.errors_) + 1), ppn.errors_, marker='o') 21 | plt.xlabel('Epochs') 22 | plt.ylabel('Number of misclassifications') 23 | 24 | # ------------------------------------------------------------------------------------------------------- 25 | from plot_decision_regions import plot_decision_regions 26 | plt.figure(3) 27 | plot_decision_regions(X, y, classifier=ppn) 28 | plt.xlabel('sepal length [cm]') 29 | plt.ylabel('petal length [cm]') 30 | plt.legend(loc='upper left') 31 | plt.show() 32 | 33 | 34 | 35 | 36 | -------------------------------------------------------------------------------- /Lectures/03-Perceptron/Simple_Perceptron.py: -------------------------------------------------------------------------------- 1 | from sklearn import datasets 2 | from sklearn.preprocessing import StandardScaler 3 | from
sklearn.model_selection import train_test_split 4 | from sklearn.linear_model import Perceptron 5 | from sklearn.metrics import accuracy_score 6 | #----------------------------------------------------------------------------- 7 | 8 | iris = datasets.load_iris() 9 | X = iris.data[:, [2, 3]] 10 | y = iris.target 11 | X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0) 12 | sc = StandardScaler() 13 | sc.fit(X_train) 14 | X_train_std = sc.transform(X_train) 15 | X_test_std = sc.transform(X_test) 16 | #----------------------------------------------------------------------------- 17 | ppn = Perceptron(max_iter=40, eta0=0.1, random_state=0) 18 | ppn.fit(X_train_std, y_train) 19 | y_pred = ppn.predict(X_test_std) 20 | print('Misclassified samples: %d' % (y_test != y_pred).sum()) 21 | print('Accuracy: %.2f' % accuracy_score(y_test, y_pred)) 22 | -------------------------------------------------------------------------------- /Lectures/03-Perceptron/perceptron.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | class Perceptron(object): 3 | # initialize the 2 parameters 4 | def __init__(self, eta=0.01, n_iter=10): 5 | self.eta = eta 6 | self.n_iter = n_iter 7 | ''' 8 | define the fit function in the class, weights initialize with 0, errors with [ ] 9 | 10 | X is the training vectors. X.shape = [n_samples, n_features] 11 | n_samples is the number of samples 12 | n_features is the number of features 13 | 14 | y is the target.
y.shape = [n_samples] 15 | ''' 16 | def fit(self, X, y): 17 | self.w_ = np.zeros(1 + X.shape[1]) 18 | self.errors_ = [] 19 | # for loop iter times -> 'n_iter' and learn rate -> eta 20 | for _ in range(self.n_iter): 21 | errors = 0 22 | # map all the rows in zip(X,y), learn the weights 23 | for xi, target in zip(X, y): 24 | # this update -> if the predict is correct, update will equal to 0, and weights remain unchanged 25 | update = self.eta * (target - self.predict(xi)) 26 | self.w_[1:] += update * xi 27 | self.w_[0] += update 28 | # figure out right or wrong. True = 1, False = 0 29 | errors += int(update != 0.0) 30 | # store every iter errors 31 | self.errors_.append(errors) 32 | return self 33 | # calculate net input 34 | def net_input(self, X): 35 | return np.dot(X, self.w_[1:]) + self.w_[0] 36 | # Return class label after unit step 37 | def predict(self, X): 38 | return np.where(self.net_input(X) >= 0.0, 1, -1) 39 | -------------------------------------------------------------------------------- /Lectures/03-Perceptron/plot_decision_regions.py: -------------------------------------------------------------------------------- 1 | from matplotlib.colors import ListedColormap 2 | import matplotlib.pyplot as plt 3 | import numpy as np 4 | def plot_decision_regions(X, y, classifier, test_idx=None, resolution=0.02): 5 | # setup marker generator and color map 6 | markers = ('s', 'x', 'o', '^', 'v') 7 | colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan') 8 | cmap = ListedColormap(colors[:len(np.unique(y))]) 9 | # plot the decision surface 10 | x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1 11 | x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1 12 | xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution), 13 | np.arange(x2_min, x2_max, resolution)) 14 | Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T) 15 | Z = Z.reshape(xx1.shape) 16 | plt.contourf(xx1, xx2, Z, alpha=0.4, cmap=cmap) 17 | plt.xlim(xx1.min(), xx1.max()) 18 | 
plt.ylim(xx2.min(), xx2.max()) 19 | # plot all samples 20 | for idx, cl in enumerate(np.unique(y)): 21 | plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], 22 | alpha=0.8, c=[cmap(idx)], 23 | marker=markers[idx], label=cl) 24 | # highlight test samples 25 | if test_idx: 26 | X_test, y_test = X[test_idx, :], y[test_idx] 27 | plt.scatter(X_test[:, 0], X_test[:, 1], facecolors='none', edgecolors='black', 28 | alpha=1.0, linewidth=1, marker='o', 29 | s=55, label='test set') -------------------------------------------------------------------------------- /Lectures/04-Adaline/AdalineGD.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | class AdalineGD(object): 3 | """ 4 | 5 | The `AdalineGD` class implements the Adaline (Adaptive Linear Neuron) algorithm using Gradient Descent for training. 6 | 7 | Attributes: 8 | - `eta` (float): The learning rate, determines the step size at each iteration (default=0.01). 9 | - `n_iter` (int): The number of iterations to train the model (default=50). 10 | - `w_` (ndarray): An array of weights, including the bias term. 11 | - `cost_` (list): A list that stores the sum of squared errors for each epoch during training. 12 | 13 | Methods: 14 | - `__init__(self, eta=0.01, n_iter=50)`: Constructs a new `AdalineGD` object with the given values of `eta` and `n_iter`. 15 | - `fit(self, X, y)`: Fits the linear model to the training data using the Adaline GD algorithm. 16 | - `X` (ndarray): The input training data of shape [n_samples, n_features]. 17 | - `y` (ndarray): The target values of shape [n_samples]. 18 | - Returns: The fitted `AdalineGD` object. 19 | - `net_input(self, X)`: Computes the net input calculated as the dot product of weights and input features. 20 | - `X` (ndarray): The input data for which to calculate the net input of shape [n_samples, n_features]. 21 | - Returns: The net input as a 1D array.
22 | - `activation(self, X)`: Computes the linear activation function by calling `net_input`. 23 | - `X` (ndarray): The input data for which to calculate the activation of shape [n_samples, n_features]. 24 | - Returns: The activation as a 1D array. 25 | - `predict(self, X)`: Predicts the class labels for the given input data. 26 | - `X` (ndarray): The input data for which to make predictions of shape [n_samples, n_features]. 27 | - Returns: The predicted class labels as a 1D array. 28 | 29 | """ 30 | 31 | def __init__(self, eta: float = 0.01, n_iter: int = 50) -> None: 32 | """ 33 | Initializes an instance of the class. 34 | 35 | Args: 36 | eta (float, optional): The learning rate. Defaults to 0.01. 37 | n_iter (int, optional): The number of iterations. Defaults to 50. 38 | """ 39 | self.eta = eta 40 | self.n_iter = n_iter 41 | def fit(self, X, y): 42 | """ 43 | 44 | Fit the training data. 45 | 46 | Parameters: 47 | X (array-like): The input training data of shape (n_samples, n_features). 48 | y (array-like): The target values of shape (n_samples,). 49 | 50 | Returns: 51 | self: Returns an instance of the current object. 52 | 53 | """ 54 | self.w_ = np.zeros(1 + X.shape[1]) 55 | self.cost_ = [] 56 | for i in range(self.n_iter): 57 | output = self.net_input(X) 58 | errors = (y - output) 59 | self.w_[1:] += self.eta * X.T.dot(errors) 60 | self.w_[0] += self.eta * errors.sum() 61 | cost = (errors ** 2).sum() / 2.0 62 | self.cost_.append(cost) 63 | return self 64 | def net_input(self, X): 65 | """ 66 | Calculate the net input for the given input data. 67 | 68 | Parameters: 69 | X : array-like, shape = [n_samples, n_features] 70 | The input data. 71 | 72 | Returns: 73 | net_input : array, shape = [n_samples] 74 | The net input calculated as the dot product of the input data (X) and the weights (self.w_[1:]) plus the bias term (self.w_[0]).
75 | """ 76 | return np.dot(X, self.w_[1:]) + self.w_[0] 77 | def activation(self, X): 78 | """ 79 | Activate the neural network by passing the input data through the net_input method. 80 | 81 | Parameters: 82 | X (array-like): The input data. 83 | 84 | Returns: 85 | The result of passing the input data through the net_input method. 86 | """ 87 | return self.net_input(X) 88 | def predict(self, X): 89 | """ 90 | Predicts the labels for input samples. 91 | 92 | Parameters: 93 | ---------- 94 | X : array-like, shape (n_samples, n_features) 95 | The input samples. 96 | 97 | Returns: 98 | ------- 99 | predictions : array, shape (n_samples,) 100 | The predicted labels for the input samples. Each predicted label is either 1 or -1. 101 | """ 102 | return np.where(self.activation(X) >= 0.0, 1, -1) 103 | 104 | 105 | 106 | -------------------------------------------------------------------------------- /Lectures/04-Adaline/AdalineSGD.py: -------------------------------------------------------------------------------- 1 | from numpy.random import seed 2 | import numpy as np 3 | class AdalineSGD(object): 4 | """ 5 | AdalineSGD 6 | 7 | This class implements the Adaptive Linear Neuron (Adaline) algorithm using the Stochastic Gradient Descent (SGD) optimization technique. Adaline is a single-layer neural network that 8 | * can be used for binary classification tasks. The SGD optimization updates the model parameters by randomly selecting samples from the training set and adjusting the weights based on 9 | * the differences between the predicted and actual outputs. 10 | 11 | Attributes: 12 | - eta (float): The learning rate (default=0.01). 13 | - n_iter (int): The number of iterations (epochs) to train the model (default=10). 14 | - shuffle (bool): Whether to shuffle the training data before each epoch (default=True). 15 | - random_state (int): The seed value for random number generation (default=None). 
16 | 17 | Methods: 18 | - __init__(self, eta=0.01, n_iter=10, shuffle=True, random_state=None) 19 | Initializes the AdalineSGD object with the given parameters. 20 | 21 | - fit(self, X, y) 22 | Trains the AdalineSGD model using the input features (X) and target values (y). 23 | 24 | - partial_fit(self, X, y) 25 | Updates the weights of an already trained AdalineSGD model using additional input features (X) and target values (y). 26 | 27 | - _shuffle(self, X, y) 28 | Shuffles the input features (X) and target values (y) randomly. 29 | 30 | - _initialize_weights(self, m) 31 | Initializes the model weights to zeros. 32 | 33 | - _update_weights(self, xi, target) 34 | Updates the model weights using the input sample (xi) and corresponding target value (target). 35 | 36 | - net_input(self, X) 37 | Calculates the net input (weighted sum) of the input features (X) using the model weights. 38 | 39 | - activation(self, X) 40 | Calculates the output (activation) of the AdalineSGD model for the input features (X). 41 | 42 | - predict(self, X) 43 | Predicts the class labels for the input features (X) based on the AdalineSGD model. 44 | 45 | Note: This class requires the NumPy library to work properly. 
46 | """ 47 | def __init__(self, eta=0.01, n_iter=10,shuffle=True, random_state=None): 48 | self.eta = eta 49 | self.n_iter = n_iter 50 | self.w_initialized = False 51 | self.shuffle = shuffle 52 | if random_state: 53 | seed(random_state) 54 | #learn the fit 55 | def fit(self, X, y): 56 | self._initialize_weights(X.shape[1]) 57 | self.cost_ = [] 58 | for i in range(self.n_iter): 59 | if self.shuffle: 60 | X, y = self._shuffle(X, y) 61 | cost = [] 62 | for xi, target in zip(X, y): 63 | cost.append(self._update_weights(xi, target)) 64 | avg_cost = sum(cost) / len(y) 65 | self.cost_.append(avg_cost) 66 | return self 67 | 68 | def partial_fit(self, X, y): 69 | 70 | if not self.w_initialized: 71 | self._initialize_weights(X.shape[1]) 72 | if y.ravel().shape[0] > 1: 73 | for xi, target in zip(X, y): 74 | self._update_weights(xi, target) 75 | else: 76 | self._update_weights(X, y) 77 | return self 78 | 79 | def _shuffle(self, X, y): 80 | r = np.random.permutation(len(y)) 81 | return X[r], y[r] 82 | 83 | def _initialize_weights(self, m): 84 | self.w_ = np.zeros(1 + m) 85 | self.w_initialized = True 86 | 87 | def _update_weights(self, xi, target): 88 | output = self.net_input(xi) 89 | error = (target - output) 90 | self.w_[1:] += self.eta * xi.dot(error) 91 | self.w_[0] += self.eta * error 92 | cost = 0.5 * error ** 2 93 | return cost 94 | def net_input(self, X): 95 | return np.dot(X, self.w_[1:]) + self.w_[0] 96 | def activation(self, X): 97 | return self.net_input(X) 98 | def predict(self, X): 99 | 100 | return np.where(self.activation(X) >= 0.0, 1, -1) 101 | -------------------------------------------------------------------------------- /Lectures/04-Adaline/Main_AdalineGD.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import pandas as pd 3 | import matplotlib.pyplot as plt 4 | from AdalineGD import AdalineGD 5 | from plot_decision_regions import plot_decision_regions 6 | # 
------------------------------------------------------------------------------------------------------- 7 | df = pd.read_csv('https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data', header=None) 8 | y = df.iloc[0:100, 4].values 9 | y= np.where(y == 'Iris-setosa', -1, 1) 10 | X = df.iloc[0:100, [0, 2]].values 11 | X_std = np.copy(X) 12 | X_std[:, 0] = (X[:, 0] - X[:, 0].mean()) / X[:, 0].std() 13 | X_std[:, 1] = (X[:, 1] - X[:, 1].mean()) / X[:, 1].std() 14 | ada = AdalineGD(n_iter=15, eta=0.01) 15 | ada.fit(X_std, y) 16 | # ------------------------------------------------------------------------------------------------------- 17 | # 18 | # ------------------------------------------------------------------------------------------------------- 19 | plt.figure(1) 20 | plot_decision_regions(X_std, y, classifier=ada) 21 | plt.title('Adaline - Gradient Descent') 22 | plt.xlabel('sepal length [standardized]') 23 | plt.ylabel('petal length [standardized]') 24 | plt.legend(loc='upper left') 25 | plt.show() 26 | 27 | plt.figure(2) 28 | plt.plot(range(1, len(ada.cost_) + 1), ada.cost_, marker='o') 29 | plt.xlabel('Epochs') 30 | plt.ylabel('Sum-squared-error') 31 | plt.show() 32 | # ------------------------------------------------------------------------------------------------------- 33 | # 34 | # ------------------------------------------------------------------------------------------------------- 35 | ada1 = AdalineGD(n_iter=10, eta=0.01).fit(X, y) 36 | 37 | fig, ax = plt.subplots(nrows=1, ncols=2, figsize=(8, 4)) 38 | ax[0].plot(range(1,len(ada1.cost_)+1), np.log10(ada1.cost_), marker='o') 39 | ax[0].set_xlabel('Epochs') 40 | ax[0].set_ylabel('log(Sum-squared-error)') 41 | ax[0].set_title('Adaline - Learning rate 0.01') 42 | ada2 = AdalineGD(n_iter=10, eta=0.0001).fit(X,y) 43 | ax[1].plot(range(1, len(ada2.cost_)+1), ada2.cost_, marker='o') 44 | ax[1].set_xlabel('Epochs') 45 | ax[1].set_title('Adaline - Learning rate 0.0001') 46 | plt.show() 47 | 48 | 
49 | 50 | -------------------------------------------------------------------------------- /Lectures/04-Adaline/Main_AdalineSGD.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import pandas as pd 3 | import matplotlib.pyplot as plt 4 | from AdalineSGD import AdalineSGD 5 | from plot_decision_regions import plot_decision_regions 6 | # ------------------------------------------------------------------------------------------------------- 7 | df = pd.read_csv('https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data', header=None) 8 | 9 | y = df.iloc[0:100, 4].values 10 | y = np.where(y == 'Iris-setosa', -1, 1) 11 | X = df.iloc[0:100, [0, 2]].values 12 | X_std = np.copy(X) 13 | X_std[:, 0] = (X[:, 0] - X[:, 0].mean()) / X[:, 0].std() 14 | X_std[:, 1] = (X[:, 1] - X[:, 1].mean()) / X[:, 1].std() 15 | # ------------------------------------------------------------------------------------------------------- 16 | # 17 | # ------------------------------------------------------------------------------------------------------- 18 | ada = AdalineSGD(n_iter=15, eta=0.01, random_state=1) 19 | ada.fit(X_std, y) 20 | 21 | 22 | 23 | plt.figure(1) 24 | plot_decision_regions(X_std, y, classifier=ada) 25 | plt.title('Adaline - Stochastic Gradient Descent') 26 | plt.xlabel('sepal length [standardized]') 27 | plt.ylabel('petal length [standardized]') 28 | plt.legend(loc='upper left') 29 | plt.show() 30 | 31 | plt.figure(2) 32 | plt.plot(range(1, len(ada.cost_) + 1), ada.cost_, marker='o') 33 | plt.xlabel('Epochs') 34 | plt.ylabel('Average Cost') 35 | plt.show() 36 | -------------------------------------------------------------------------------- /Lectures/04-Adaline/plot_decision_regions.py: -------------------------------------------------------------------------------- 1 | from matplotlib.colors import ListedColormap 2 | import matplotlib.pyplot as plt 3 | import numpy as np 4 | def plot_decision_regions(X, y, classifier, test_idx=None, resolution=0.02): 5 | # setup marker generator and color map 6 | markers = ('s', 'x', 'o', '^', 'v') 7 | colors = ('red', 'blue', 'lightgreen', 'gray', 'cyan') 8 | cmap = ListedColormap(colors[:len(np.unique(y))]) 9 | # plot the decision surface 10 | x1_min, x1_max = X[:, 0].min() - 1, X[:, 0].max() + 1 11 | x2_min, x2_max = X[:, 1].min() - 1, X[:, 1].max() + 1 12 | xx1, xx2 = np.meshgrid(np.arange(x1_min, x1_max, resolution), 13 | np.arange(x2_min, x2_max, resolution)) 14 | Z = classifier.predict(np.array([xx1.ravel(), xx2.ravel()]).T) 15 | Z = Z.reshape(xx1.shape) 16 | plt.contourf(xx1, xx2, Z, alpha=0.4, cmap=cmap) 17 | plt.xlim(xx1.min(), xx1.max()) 18 | plt.ylim(xx2.min(), xx2.max()) 19 | # plot all samples 20 | 21 | for idx, cl in enumerate(np.unique(y)): 22 | plt.scatter(x=X[y == cl, 0], y=X[y == cl, 1], 23 | alpha=0.8, c=cmap(idx), 24 | marker=markers[idx], label=cl) 25 | # highlight test samples 26 | if test_idx: 27 | X_test, y_test = X[test_idx, :], y[test_idx] 28 | plt.scatter(X_test[:, 0], X_test[:, 1], 29 | alpha=1.0, linewidth=1, marker='o', 30 | s=55, label='test set') -------------------------------------------------------------------------------- /Lectures/05-Sklearn NN/Exercise.py: -------------------------------------------------------------------------------- 1 | #%%----------------------------------------------------------------------- 2 | # Exercise 3 | #%%----------------------------------------------------------------------- 4 | # 1- Generate data from the following function np.sin(np.exp(p)) 5 | # 2- Write a script to fit the function 6 | # 3- Plot the trained network vs the original function 7 | # 4- Use different layers, neurons, optimizers, etc. 8 | # 5- Explain your findings and write a paragraph explaining all the results. 9 | # 6- Explain your performance index results and figures.
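One possible starting point for items 1-3 is sketched below. This is hedged: the layer sizes, input range, and solver are illustrative choices, not the expected solution, and you are meant to vary them for item 4.

```python
# Illustrative sketch for exercise items 1-3 (assumed hyperparameters):
# generate data from np.sin(np.exp(p)), fit an MLP, and compare the fit
# against the original function.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neural_network import MLPRegressor

p = np.linspace(-2, 2, 400).reshape(-1, 1)   # network inputs
t = np.sin(np.exp(p)).ravel()                # targets from the given function

mlp = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=5000, random_state=1)
mlp.fit(p, t)
t_hat = mlp.predict(p)

plt.plot(p, t, label='np.sin(np.exp(p))')
plt.plot(p, t_hat, '--', label='MLP fit')
plt.legend()
plt.show()
```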
10 | #%%----------------------------------------------------------------------- 11 | # 1- 12 | 13 | #%%----------------------------------------------------------------------- 14 | # 2- 15 | 16 | 17 | #%%----------------------------------------------------------------------- 18 | # 3- 19 | 20 | 21 | #%%----------------------------------------------------------------------- 22 | # 4- 23 | 24 | #%%----------------------------------------------------------------------- 25 | # 5- 26 | 27 | #%%----------------------------------------------------------------------- 28 | # 6- -------------------------------------------------------------------------------- /Lectures/05-Sklearn NN/Sample_NN_Sklearn_Classifier.py: -------------------------------------------------------------------------------- 1 | import pandas as pd 2 | from sklearn import preprocessing 3 | from sklearn.model_selection import train_test_split 4 | from sklearn.preprocessing import StandardScaler 5 | from sklearn.neural_network import MLPClassifier 6 | from sklearn.metrics import classification_report, confusion_matrix 7 | # ------------------------------------------------------------------------------------------------------- 8 | url = "https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data" 9 | names = ['sepal-length', 'sepal-width', 'petal-length', 'petal-width', 'Class'] 10 | 11 | irisdata = pd.read_csv(url, names=names) 12 | print(irisdata.head()) 13 | 14 | X = irisdata.iloc[:, 0:4] 15 | y = irisdata.select_dtypes(include=[object]) 16 | print(y.Class.unique()) 17 | # ------------------------------------------------------------------------------------------------------- 18 | le = preprocessing.LabelEncoder() 19 | y = y.apply(le.fit_transform) 20 | 21 | X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.20) 22 | 23 | scaler = StandardScaler() 24 | scaler.fit(X_train) 25 | 26 | X_train = scaler.transform(X_train) 27 | X_test = scaler.transform(X_test) 28 | # 
------------------------------------------------------------------------------------------------------- 29 | 30 | mlp = MLPClassifier(hidden_layer_sizes=(10, 10, 10), max_iter=1000) 31 | mlp.fit(X_train, y_train.values.ravel()) 32 | 33 | predictions = mlp.predict(X_test) 34 | 35 | print(confusion_matrix(y_test, predictions)) 36 | print(classification_report(y_test, predictions)) -------------------------------------------------------------------------------- /Lectures/05-Sklearn NN/Sample_NN_Sklearn_Regressor..py: -------------------------------------------------------------------------------- 1 | from sklearn.neural_network import MLPRegressor 2 | from sklearn.datasets import make_regression 3 | from sklearn.model_selection import train_test_split 4 | 5 | X, y = make_regression(n_samples=200, random_state=1) 6 | 7 | X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1) 8 | regr = MLPRegressor(random_state=1, max_iter=500) 9 | regr.fit(X_train, y_train) 10 | 11 | print(regr.predict(X_test[:2])) -------------------------------------------------------------------------------- /Lectures/Clustering_SOM/SOM/Readme.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Lectures/Clustering_SOM/SOM/Tabular/Iris/notebook/README.rst: -------------------------------------------------------------------------------- 1 | Examples 2 | ======== 3 | These are example scripts. -------------------------------------------------------------------------------- /Lectures/Clustering_SOM/SOM/Tabular/Iris/py/1-iris_training.py: -------------------------------------------------------------------------------- 1 | #%%----------------------------------------------------------------------------------------------------------- 2 | """ 3 | This script demonstrates training a SOM on the Iris dataset with the NNSOM library and visualizing it with matplotlib.
4 | """ 5 | # Importing Library 6 | from NNSOM.plots import SOMPlots 7 | import numpy as np 8 | import matplotlib.pyplot as plt 9 | from numpy.random import default_rng 10 | from sklearn.datasets import load_iris 11 | from sklearn.preprocessing import MinMaxScaler 12 | import os 13 | #%%----------------------------------------------------------------------------------------------------------- 14 | 15 | # SOM Parameters 16 | SOM_Row_Num = 4 # The number of rows used for the SOM grid. 17 | Dimensions = (SOM_Row_Num, SOM_Row_Num) # The dimensions of the SOM grid. 18 | 19 | # Training Parameters 20 | Epochs = 500 21 | Steps = 100 22 | Init_neighborhood = 3 23 | 24 | # Random State 25 | SEED = 1234567 26 | rng = default_rng(SEED) 27 | 28 | # Data Preparation 29 | iris = load_iris() 30 | X = iris.data 31 | y = iris.target 32 | 33 | # Preprocessing data 34 | perm = rng.permutation(len(X)) # one shared permutation keeps X and y aligned 35 | X, y = X[perm], y[perm] 36 | 37 | # Define the normalize function 38 | scaler = MinMaxScaler(feature_range=(-1, 1)) 39 | #%%----------------------------------------------------------------------------------------------------------- 40 | 41 | # Training 42 | som = SOMPlots(Dimensions) 43 | som.init_w(X, norm_func=scaler.fit_transform) 44 | som.train(X, Init_neighborhood, Epochs, Steps, norm_func=scaler.fit_transform) 45 | #%%----------------------------------------------------------------------------------------------------------- 46 | path = os.getcwd() 47 | Trained_SOM_File = "SOM_Model_iris_Epoch_" + str(Epochs) + '_Seed_' + str(SEED) + '_Size_' + str(SOM_Row_Num) + '.pkl' 48 | 49 | # Save the model 50 | som.save_pickle(Trained_SOM_File, path + os.sep) 51 | #%%----------------------------------------------------------------------------------------------------------- 52 | 53 | # Extract Cluster details 54 | clust, dist, mdist, clustSize = som.cluster_data(X) 55 | 56 | # Find quantization error 57 | quant_err = som.quantization_error(dist) 58 | print('Quantization error: ' +
str(quant_err)) 59 | #%%----------------------------------------------------------------------------------------------------------- 60 | 61 | # Find topological error 62 | top_error_1, top_error_1_2 = som.topological_error(X) 63 | print('Topological Error (1st neighbor) = ' + str(top_error_1) + '%') 64 | print('Topological Error (1st and 2nd neighbor) = ' + str(top_error_1_2) + '%') 65 | #%%----------------------------------------------------------------------------------------------------------- 66 | 67 | # Find Distortion Error 68 | som.distortion_error(X) 69 | 70 | # Visualization 71 | # Data Preparation 72 | data_dict = { 73 | "data": X, 74 | "target": y, 75 | "clust": clust, 76 | } 77 | #%%----------------------------------------------------------------------------------------------------------- 78 | 79 | # SOM Topology 80 | fig1, ax1, patches1 = som.plot('top') 81 | plt.show() 82 | #%%----------------------------------------------------------------------------------------------------------- 83 | 84 | # SOM Topology with neuron numbers 85 | fig2, ax2, patches2, text2 = som.plot('top_num') 86 | plt.show() 87 | #%%----------------------------------------------------------------------------------------------------------- 88 | 89 | # Hit Histogram 90 | fig3, ax3, patches3, text3 = som.plot('hit_hist', data_dict) 91 | plt.show() 92 | #%%----------------------------------------------------------------------------------------------------------- 93 | 94 | # Neighborhood Connection Map 95 | fig4, ax4, patches4 = som.plot('neuron_connection') 96 | plt.show() 97 | #%%----------------------------------------------------------------------------------------------------------- 98 | 99 | # Distance Map 100 | fig5, ax5, patches5 = som.plot('neuron_dist') 101 | plt.show() 102 | #%%----------------------------------------------------------------------------------------------------------- 103 | 104 | # Weight Position Plot 105 | som.plot('component_positions', data_dict) 106 |
#%%----------------------------------------------------------------------------------------------------------- 107 | 108 | # Weight as Line 109 | fig6, ax6, h_axes6 = som.plot('wgts') 110 | plt.show() 111 | #%%----------------------------------------------------------------------------------------------------------- 112 | 113 | # Weight as Image 114 | fig7, ax7, patches7 = som.weight_as_image() 115 | plt.show() 116 | -------------------------------------------------------------------------------- /Lectures/Clustering_SOM/SOM/Tabular/Iris/py/2-iris_training_gpu.py: -------------------------------------------------------------------------------- 1 | """ 2 | This script demonstrates training a SOM on the Iris dataset, leveraging GPU acceleration. 3 | 4 | Requirements: 5 | - This script is specifically designed for execution in environments with GPU support. 6 | - The SOMGpu class from the NNSOM library is used for SOM operations on GPU. 7 | 8 | Key Operations: 9 | - Data loading and preprocessing using sklearn. 10 | - SOM initialization and training on GPU using SOMGpu. 11 | - Saving the trained SOM model externally. 12 | 13 | Note: 14 | - Ensure a compatible GPU and necessary driver support are available in the execution environment. 15 | - The script uses absolute paths and assumes a specific directory structure for saving the model. 16 | 17 | To execute this script: 18 | - A Python environment with the necessary libraries (NNSOM, numpy, cupy) installed is required. 19 | - The script assumes the presence of a CUDA-compatible GPU and an appropriate setup for CuPy.
20 | """ 21 | #%%----------------------------------------------------------------------------------------------------------- 22 | 23 | # Importing Library 24 | from NNSOM.som_gpu import SOMGpu 25 | 26 | import numpy as np 27 | import matplotlib.pyplot as plt 28 | from numpy.random import default_rng 29 | from sklearn.datasets import load_iris 30 | from sklearn.preprocessing import MinMaxScaler 31 | 32 | import os 33 | #%%----------------------------------------------------------------------------------------------------------- 34 | 35 | # SOM Parameters 36 | SOM_Row_Num = 4 # The number of rows used for the SOM grid. 37 | Dimensions = (SOM_Row_Num, SOM_Row_Num) # The dimensions of the SOM grid. 38 | 39 | # Training Parameters 40 | Epochs = 500 41 | Steps = 100 42 | Init_neighborhood = 3 43 | 44 | # Random State 45 | SEED = 1234567 46 | rng = default_rng(SEED) 47 | 48 | # Data Preparation 49 | iris = load_iris() 50 | X = iris.data 51 | y = iris.target 52 | 53 | # Preprocessing data 54 | perm = rng.permutation(len(X)) # one shared permutation keeps X and y aligned 55 | X, y = X[perm], y[perm] 56 | 57 | # Define the normalize function 58 | scaler = MinMaxScaler(feature_range=(-1, 1)) 59 | #%%----------------------------------------------------------------------------------------------------------- 60 | 61 | # Training 62 | som = SOMGpu(Dimensions) 63 | som.init_w(X, norm_func=scaler.fit_transform) 64 | som.train(X, Init_neighborhood, Epochs, Steps, norm_func=scaler.fit_transform) 65 | #%%----------------------------------------------------------------------------------------------------------- 66 | 67 | # Define the directory path for saving the model outside the repository 68 | model_dir = os.path.abspath(os.path.join(os.getcwd(), "..", "..", "..", "..", "Model")) 69 | # Create the directory if it doesn't exist 70 | if not os.path.exists(model_dir): 71 | os.makedirs(model_dir) 72 | Trained_SOM_File = "SOM_Model_iris_Epoch_" + str(Epochs) + '_Seed_' + str(SEED) + '_Size_' + str(SOM_Row_Num) + '.pkl' 73 |
#%%----------------------------------------------------------------------------------------------------------- 74 | 75 | # Save the model 76 | som.save_pickle(Trained_SOM_File, model_dir + os.sep) 77 | #%%----------------------------------------------------------------------------------------------------------- 78 | 79 | # Extract Cluster details 80 | clust, dist, mdist, clustSize = som.cluster_data(X) 81 | #%%----------------------------------------------------------------------------------------------------------- 82 | 83 | # Find quantization error 84 | quant_err = som.quantization_error(dist) 85 | print('Quantization error: ' + str(quant_err)) 86 | #%%----------------------------------------------------------------------------------------------------------- 87 | 88 | # Find topological error 89 | top_error_1, top_error_1_2 = som.topological_error(X) 90 | print('Topological Error (1st neighbor) = ' + str(top_error_1) + '%') 91 | print('Topological Error (1st and 2nd neighbor) = ' + str(top_error_1_2) + '%') 92 | #%%----------------------------------------------------------------------------------------------------------- 93 | 94 | # Find Distortion Error 95 | som.distortion_error(X) 96 | 97 | """ 98 | Note: 99 | The 'SOMGpu' class is used exclusively for training the Self-Organizing Maps using GPU acceleration. 100 | For visualization purposes, including the display of SOM topology, histograms, and other graphical 101 | representations of the SOM structure, use the 'SOMPlots' class. The 'SOMPlots' class is designed 102 | to automatically detect and utilize GPU capabilities via CuPy if available; otherwise, it falls 103 | back to CPU processing. This ensures optimal performance in environments with GPU support while 104 | still maintaining functionality in CPU-only environments. 
105 | """ 106 | -------------------------------------------------------------------------------- /Lectures/Clustering_SOM/SOM/Tabular/Iris/py/3-iris_plot.py: -------------------------------------------------------------------------------- 1 | """ 2 | This script demonstrates plotting the Iris dataset using matplotlib. 3 | """ 4 | 5 | from NNSOM.plots import SOMPlots 6 | 7 | from numpy.random import default_rng 8 | import matplotlib.pyplot as plt 9 | 10 | from sklearn.datasets import load_iris 11 | from sklearn.preprocessing import MinMaxScaler 12 | 13 | import os 14 | #%%----------------------------------------------------------------------------------------------------------- 15 | 16 | # Random State 17 | SEED = 1234567 18 | rng = default_rng(SEED) 19 | 20 | # Data Preprocessing 21 | iris = load_iris() 22 | X = iris.data 23 | y = iris.target 24 | perm = rng.permutation(len(X)) # one shared permutation keeps X and y aligned 25 | X, y = X[perm], y[perm] 26 | scaler = MinMaxScaler(feature_range=(-1, 1)) 27 | #%%----------------------------------------------------------------------------------------------------------- 28 | 29 | # Define the directory path where the trained model was saved 30 | model_dir = os.getcwd() + os.sep 31 | trained_file_name = "SOM_Model_iris_Epoch_500_Seed_1234567_Size_4.pkl" 32 | #%%----------------------------------------------------------------------------------------------------------- 33 | 34 | # SOM Parameters 35 | SOM_Row_Num = 4 # The number of rows used for the SOM grid. 36 | Dimensions = (SOM_Row_Num, SOM_Row_Num) # The dimensions of the SOM grid.
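The note closing the GPU script above says that `SOMPlots` uses CuPy when a GPU is available and falls back to CPU otherwise. That precondition can be probed with a small helper; this is a hedged sketch (`gpu_available` is an illustrative name, not part of the NNSOM API):

```python
# Sketch: check whether CuPy can see a CUDA device before running the
# GPU training script; falls back safely when CuPy or CUDA is absent.
def gpu_available():
    """Return True if CuPy can see at least one CUDA device, else False."""
    try:
        import cupy
        return cupy.cuda.runtime.getDeviceCount() > 0
    except Exception:
        # CuPy not installed, or no usable CUDA runtime: use the CPU path
        return False

print("GPU available:", gpu_available())
```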
37 | 38 | som = SOMPlots(Dimensions) 39 | som = som.load_pickle(trained_file_name, model_dir + os.sep) 40 | #%%----------------------------------------------------------------------------------------------------------- 41 | 42 | # Data Preparation for Visualization 43 | clust, dist, mdist, clustSize = som.cluster_data(X) 44 | 45 | data_dict = { 46 | "data": X, 47 | "target": y, 48 | "clust": clust 49 | } 50 | #%%----------------------------------------------------------------------------------------------------------- 51 | 52 | # Visualization 53 | # Grey Hist 54 | fig, ax, patches, text = som.plot('gray_hist', data_dict, ind=0) 55 | plt.suptitle("Gray Hist with Sepal Length", fontsize=16) 56 | plt.show() 57 | #%%----------------------------------------------------------------------------------------------------------- 58 | 59 | fig, ax, patches, text = som.plot('gray_hist', data_dict, target_class=0) 60 | plt.suptitle("Gray Hist with Setosa Distribution") 61 | plt.show() 62 | #%%----------------------------------------------------------------------------------------------------------- 63 | 64 | # Color Hist 65 | fig, ax, patches, text, cbar = som.plot('color_hist', data_dict, ind=0) 66 | plt.suptitle("Color Hist with Sepal Length", fontsize=16) 67 | plt.show() 68 | #%%----------------------------------------------------------------------------------------------------------- 69 | 70 | fig, ax, patches, text, cbar = som.plot('color_hist', data_dict, target_class=0) 71 | plt.suptitle("Color Hist with Setosa Distribution", fontsize=16) 72 | plt.show() 73 | #%%----------------------------------------------------------------------------------------------------------- 74 | 75 | # Pie Chart 76 | fig, ax, h_axes = som.plot("pie", data_dict) 77 | plt.suptitle("Iris Class Distribution", fontsize=16) 78 | plt.show() 79 | #%%----------------------------------------------------------------------------------------------------------- 80 | 81 | # Stem Plot 82 | fig, ax, h_axes =
som.plot('stem', data_dict) 83 | plt.show() 84 | #%%----------------------------------------------------------------------------------------------------------- 85 | 86 | # Histogram 87 | fig, ax, h_axes = som.plot('hist', data_dict, ind=0) 88 | plt.suptitle("Sepal Length", fontsize=16) 89 | plt.show() 90 | #%%----------------------------------------------------------------------------------------------------------- 91 | 92 | fig, ax, h_axes = som.plot('hist', data_dict, ind=1) 93 | plt.suptitle("Sepal Width", fontsize=16) 94 | plt.show() 95 | #%%----------------------------------------------------------------------------------------------------------- 96 | 97 | # Boxplot 98 | fig, ax, h_axes = som.plot("box", data_dict) 99 | plt.suptitle("Iris Feature Distribution", fontsize=16) 100 | plt.show() 101 | #%%----------------------------------------------------------------------------------------------------------- 102 | 103 | fig, ax, h_axes = som.plot("box", data_dict, ind=[0, 1]) 104 | plt.suptitle("Box Plot with Sepal Length and Width", fontsize=16) 105 | plt.show() 106 | #%%----------------------------------------------------------------------------------------------------------- 107 | 108 | fig, ax, h_axes = som.plot("box", data_dict, ind=0) 109 | plt.suptitle("Box Plot with Sepal Length", fontsize=16) 110 | plt.show() 111 | #%%----------------------------------------------------------------------------------------------------------- 112 | 113 | # Violin plot 114 | fig, ax, h_axes = som.plot("violin", data_dict) 115 | plt.suptitle("Violin Plot with all features in Iris", fontsize=16) 116 | plt.show() 117 | #%%----------------------------------------------------------------------------------------------------------- 118 | 119 | fig, ax, h_axes = som.plot("violin", data_dict, ind=[0, 1]) 120 | plt.suptitle("Violin Plot with Sepal Length and Width", fontsize=16) 121 | plt.show() 122 |
#%%----------------------------------------------------------------------------------------------------------- 123 | 124 | fig, ax, h_axes = som.plot("violin", data_dict, ind=0) 125 | plt.suptitle("Violin Plot with Sepal Length", fontsize=16) 126 | plt.show() 127 | #%%----------------------------------------------------------------------------------------------------------- 128 | 129 | # Scatter Plot 130 | fig, ax, h_axes = som.plot("scatter", data_dict, ind=[0, 1]) 131 | plt.suptitle("Scatter Plot with Sepal Length and Width", fontsize=16) 132 | plt.show() 133 | #%%----------------------------------------------------------------------------------------------------------- 134 | 135 | fig, ax, h_axes = som.plot("scatter", data_dict, ind=[2, 3]) 136 | plt.suptitle("Scatter Plot with Petal Length and Width", fontsize=16) 137 | plt.show() 138 | #%%----------------------------------------------------------------------------------------------------------- 139 | 140 | # Component Planes 141 | som.plot('component_planes', data_dict) 142 | -------------------------------------------------------------------------------- /Lectures/Clustering_SOM/SOM/Tabular/Iris/py/4-iris_post_training_analysis.py: -------------------------------------------------------------------------------- 1 | """ 2 | This script demonstrates post-training analysis of a SOM trained on the Iris dataset.
3 | """ 4 | from NNSOM.plots import SOMPlots 5 | from NNSOM.utils import * 6 | 7 | import numpy as np 8 | from numpy.random import default_rng 9 | import matplotlib.pyplot as plt 10 | 11 | from sklearn.datasets import load_iris 12 | from sklearn.linear_model import LogisticRegression 13 | 14 | import os 15 | #%%----------------------------------------------------------------------------------------------------------- 16 | 17 | # Random State 18 | SEED = 1234567 19 | rng = default_rng(SEED) 20 | 21 | # Data Preprocessing 22 | iris = load_iris() 23 | X = iris.data 24 | y = iris.target 25 | perm = rng.permutation(len(X)) # one shared permutation keeps X and y aligned 26 | X, y = X[perm], y[perm] 27 | 28 | # Define the directory path where the trained model was saved 29 | model_dir = os.getcwd() + os.sep 30 | trained_file_name = "SOM_Model_iris_Epoch_500_Seed_1234567_Size_4.pkl" 31 | #%%----------------------------------------------------------------------------------------------------------- 32 | 33 | # SOM Parameters 34 | SOM_Row_Num = 4 # The number of rows used for the SOM grid. 35 | Dimensions = (SOM_Row_Num, SOM_Row_Num) # The dimensions of the SOM grid.
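This analysis script relies on `som.cluster_data` to assign each sample to a SOM neuron. As a hedged plain-NumPy sketch, the conventional best-matching-unit (BMU) assignment looks like the following (NNSOM's actual implementation may differ in detail):

```python
# Sketch of BMU-based clustering, assuming the conventional definition:
# each sample belongs to the neuron whose weight vector is nearest to it.
import numpy as np

def cluster_by_bmu(X, W):
    """Return (clust, dist): per-neuron lists of sample indices, and each
    sample's Euclidean distance to its best-matching unit."""
    # pairwise distances between every sample and every neuron prototype
    d = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)
    bmu = d.argmin(axis=1)                 # index of nearest neuron per sample
    dist = d.min(axis=1)                   # distance to that neuron
    clust = [np.where(bmu == j)[0] for j in range(W.shape[0])]
    return clust, dist

# Tiny demonstration with two obvious prototypes
W = np.array([[0.0, 0.0], [10.0, 10.0]])
X = np.array([[1.0, 0.0], [9.0, 10.0]])
clust, dist = cluster_by_bmu(X, W)
```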
36 | 37 | som = SOMPlots(Dimensions) 38 | som = som.load_pickle(trained_file_name, model_dir + os.sep) 39 | #%%----------------------------------------------------------------------------------------------------------- 40 | 41 | # Data post-processing 42 | clust, dist, mdist, clustSizes = som.cluster_data(X) 43 | 44 | # Train Logistic Regression on Iris 45 | logit = LogisticRegression(random_state=SEED) 46 | logit.fit(X, y) 47 | results = logit.predict(X) 48 | 49 | perc_misclassified = get_perc_misclassified(y, results, clust) 50 | #%%----------------------------------------------------------------------------------------------------------- 51 | 52 | # For Pie chart and Stem Plot 53 | sent_tp, sent_tn, sent_fp, sent_fn = get_conf_indices(y, results, 0) # Confusion matrix for Setosa 54 | sentosa_conf = cal_class_cluster_intersect(clust, sent_tp, sent_tn, sent_fp, sent_fn) 55 | vers_tp, vers_tn, vers_fp, vers_fn = get_conf_indices(y, results, 1) # Confusion matrix for versicolor 56 | versicolor_conf = cal_class_cluster_intersect(clust, vers_tp, vers_tn, vers_fp, vers_fn) 57 | virg_tp, virg_tn, virg_fp, virg_fn = get_conf_indices(y, results, 2) # Confusion matrix for virginica 58 | virginica_conf = cal_class_cluster_intersect(clust, virg_tp, virg_tn, virg_fp, virg_fn) 59 | conf_align = [0, 1, 2, 3] 60 | #%%----------------------------------------------------------------------------------------------------------- 61 | 62 | # Complex Hit Histogram 63 | # Get the list with the dominant class in each cluster 64 | dominant_classes = majority_class_cluster(y, clust) 65 | #%%----------------------------------------------------------------------------------------------------------- 66 | 67 | # Get the majority error type (0: type 1 error, 1: type 2 error) for the corresponding dominant class 68 | sent_error = get_color_labels(clust, sent_tn, sent_fp) # Get the majority error type in
versicolor 70 | virg_error = get_color_labels(clust, virg_tn, virg_fp) # Get the majority error type in virginica 71 | iris_error_types = [sent_error, vers_error, virg_error] 72 | error_types = get_dominant_class_error_types(dominant_classes, iris_error_types) 73 | #%%----------------------------------------------------------------------------------------------------------- 74 | 75 | # Get the edge width based on the perc of misclassified 76 | ind_misclassified = get_ind_misclassified(y, results) 77 | edge_width = get_edge_widths(ind_misclassified, clust) 78 | #%%----------------------------------------------------------------------------------------------------------- 79 | 80 | # Make an additional 2-D array 81 | comp_2d_array = np.transpose(np.array([dominant_classes, error_types, edge_width])) 82 | #%%----------------------------------------------------------------------------------------------------------- 83 | 84 | # Simple Grid 85 | perc_sentosa = get_perc_cluster(y, 0, clust) 86 | simple_2d_array = np.transpose(np.array([perc_sentosa, perc_sentosa])) 87 | 88 | data_dict = { 89 | "data": X, 90 | "target": y, 91 | "clust": clust, 92 | "add_1d_array": perc_misclassified, 93 | "add_2d_array": [] 94 | } 95 | #%%----------------------------------------------------------------------------------------------------------- 96 | 97 | # Visualization 98 | # Gray Hist (Brighter: more, Darker: less) 99 | fig1, ax1, patches1, text1 = som.plot('gray_hist', data_dict, use_add_array=True) 100 | plt.suptitle("Gray Hist with the percentage of misclassified items", fontsize=16) 101 | plt.show() 102 | #%%----------------------------------------------------------------------------------------------------------- 103 | 104 | # Color Hist 105 | fig2, ax2, patches2, text2, cbar2 = som.plot('color_hist', data_dict, use_add_array=True) 106 | plt.suptitle("Color Hist with the percentage of misclassified items", fontsize=16) 107 | plt.show() 108 | 
#%%----------------------------------------------------------------------------------------------------------- 109 | 110 | # Complex Hit Histogram 111 | # Setosa: Blue, versicolor: Green, virginica: Red (inner color) 112 | # type 1 error (tn): Pink, type 2 error (fn): blue (edge color) for the corresponding dominant classes 113 | # Edge width: percentage of misclassified items (edge width) 114 | data_dict['add_2d_array'] = comp_2d_array # Update an additional 2-D array 115 | fig3, ax3, patches3, text3 = som.plot('complex_hist', data_dict, use_add_array=True) 116 | plt.suptitle("Complex Hit Histogram - Error Analysis", fontsize=16) 117 | plt.show() 118 | #%%----------------------------------------------------------------------------------------------------------- 119 | 120 | # Simple Grid 121 | # color: perc misclassified 122 | # sizes: perc Setosa 123 | data_dict['add_2d_array'] = simple_2d_array # Update an additional 2-D array 124 | fig4, ax4, patches4, cbar4 = som.plot('simple_grid', data_dict, use_add_array=True) 125 | plt.suptitle("Simple Grid", fontsize=16) 126 | plt.show() 127 | #%%----------------------------------------------------------------------------------------------------------- 128 | 129 | # Pie chart 130 | # tp: Blue, tn: Purple, fp: Orange, and fn: Yellow 131 | data_dict['add_2d_array'] = sentosa_conf # Update an additional 2-D array 132 | fig5, ax5, h_axes5 = som.plot('pie', data_dict, use_add_array=True) 133 | plt.suptitle("Pie Chart with tp, tn, fp, and fn of Setosa", fontsize=16) 134 | plt.show() 135 | #%%----------------------------------------------------------------------------------------------------------- 136 | 137 | # tp: Blue, tn: Purple, fp: Orange, and fn: Yellow 138 | data_dict['add_2d_array'] = versicolor_conf # Update an additional 2-D array 139 | fig6, ax6, h_axes6 = som.plot('pie', data_dict, use_add_array=True) 140 | plt.suptitle("Pie Chart with tp, tn, fp, and fn of versicolor", fontsize=16) 141 | plt.show() 142 |
#%%----------------------------------------------------------------------------------------------------------- 143 | 144 | # tp: Blue, tn: Purple, fp: Orange, and fn: Yellow 145 | data_dict['add_2d_array'] = virginica_conf # Update an additional 2-D array 146 | fig7, ax7, h_axes7 = som.plot('pie', data_dict, use_add_array=True) 147 | plt.suptitle("Pie Chart with tp, tn, fp, and fn of virginica", fontsize=16) 148 | plt.show() 149 | #%%----------------------------------------------------------------------------------------------------------- 150 | 151 | # Stem Plot 152 | data_dict['add_2d_array'] = sentosa_conf # Update an additional 2-D array 153 | fig8, ax8, h_axes8 = som.plot("stem", data_dict, use_add_array=True) 154 | plt.suptitle("Stem Plot with tp, tn, fp, fn of Setosa", fontsize=16) 155 | plt.show() 156 | #%%----------------------------------------------------------------------------------------------------------- 157 | 158 | data_dict['add_2d_array'] = versicolor_conf # Update an additional 2-D array 159 | fig9, ax9, h_axes9 = som.plot("stem", data_dict, use_add_array=True) 160 | plt.suptitle("Stem Plot with tp, tn, fp, fn of Versicolor", fontsize=16) 161 | plt.show() 162 | #%%----------------------------------------------------------------------------------------------------------- 163 | 164 | data_dict['add_2d_array'] = virginica_conf # Update an additional 2-D array 165 | fig10, ax10, h_axes10 = som.plot("stem", data_dict, use_add_array=True) 166 | plt.suptitle("Stem Plot with tp, tn, fp, fn of Virginica", fontsize=16) 167 | plt.show() 168 | -------------------------------------------------------------------------------- /Lectures/Clustering_SOM/SOM/Tabular/Iris/py/5-iris_interactive.py: -------------------------------------------------------------------------------- 1 | 2 | from NNSOM.plots import SOMPlots 3 | from NNSOM.utils import * 4 | 5 | import numpy as np 6 | from numpy.random import default_rng 7 | import matplotlib.pyplot as plt 8 | 9 | from
sklearn.datasets import load_iris 10 | from sklearn.linear_model import LogisticRegression 11 | 12 | import os 13 | #%%----------------------------------------------------------------------------------------------------------- 14 | 15 | # SOM Parameters 16 | SOM_Row_Num = 4 # The number of rows used for the SOM grid. 17 | Dimensions = (SOM_Row_Num, SOM_Row_Num) # The dimensions of the SOM grid. 18 | #%%----------------------------------------------------------------------------------------------------------- 19 | 20 | # Training Parameters 21 | Epochs = 500 22 | Steps = 100 23 | Init_neighborhood = 3 24 | #%%----------------------------------------------------------------------------------------------------------- 25 | 26 | # Random State 27 | SEED = 1234567 28 | rng = default_rng(SEED) 29 | 30 | # Data Preparation 31 | iris = load_iris() 32 | X = iris.data 33 | y = iris.target 34 | 35 | # Preprocessing data: use a single permutation for X and y so the pairs stay aligned 36 | perm = rng.permutation(len(X)) 37 | X = X[perm] 38 | y = y[perm] 39 | #%%----------------------------------------------------------------------------------------------------------- 40 | 41 | # Determine model dir and file name 42 | model_dir = os.getcwd() + os.sep 43 | Trained_SOM_File = "SOM_Model_iris_Epoch_" + str(Epochs) + '_Seed_' + str(SEED) + '_Size_' + str(SOM_Row_Num) + '.pkl' 44 | #%%----------------------------------------------------------------------------------------------------------- 45 | 46 | # Load som instance 47 | som = SOMPlots(Dimensions) 48 | som = som.load_pickle(Trained_SOM_File, model_dir) 49 | #%%----------------------------------------------------------------------------------------------------------- 50 | 51 | # Data Processing 52 | clust, dist, mdist, clustSize = som.cluster_data(X) 53 | #%%----------------------------------------------------------------------------------------------------------- 54 | 55 | # Items for int_dict 56 | num1 = get_cluster_array(X[:, 0], clust) 57 | num2 = get_cluster_array(X[:, 1], clust) 
58 | cat = count_classes_in_cluster(y, clust) 59 | #%%----------------------------------------------------------------------------------------------------------- 60 | 61 | # Items for plots 62 | perc_sentosa = get_perc_cluster(y, 0, clust) 63 | iris_class_counts_cluster_array = count_classes_in_cluster(y, clust) 64 | align = np.arange(len(iris_class_counts_cluster_array[0])) 65 | num_classes = count_classes_in_cluster(y, clust) 66 | num_sentosa = num_classes[:, 0] 67 | 68 | int_dict = { 69 | 'data': X, 70 | 'target': y, 71 | 'clust': clust, 72 | 'num1': num1, 73 | 'num2': num2, 74 | 'cat': cat, 75 | 'topn': 5, 76 | } 77 | #%%----------------------------------------------------------------------------------------------------------- 78 | 79 | # Interactive hit hist 80 | fig, ax, patches, text = som.hit_hist(X, mouse_click=True, **int_dict) 81 | plt.show() 82 | #%%----------------------------------------------------------------------------------------------------------- 83 | 84 | # Interactive neuron dist 85 | fig, ax, patches = som.neuron_dist_plot(True, **int_dict) 86 | plt.show() 87 | #%%----------------------------------------------------------------------------------------------------------- 88 | 89 | # Interactive weight_as_image 90 | fig, ax, patches = som.weight_as_image(mouse_click=True, **int_dict) 91 | plt.show() 92 | #%%----------------------------------------------------------------------------------------------------------- 93 | 94 | # Interactive pie plot 95 | fig, ax, h_axes = som.plt_pie(iris_class_counts_cluster_array, perc_sentosa, mouse_click=True, **int_dict) 96 | plt.show() 97 | #%%----------------------------------------------------------------------------------------------------------- 98 | 99 | # Interactive plt_top 100 | fig, ax, patches = som.plt_top(True, **int_dict) 101 | plt.show() 102 | #%%----------------------------------------------------------------------------------------------------------- 103 | 104 | # Interactive plt_top_num 105 | 
fig, ax, patches, text = som.plt_top_num(True, **int_dict) 105 | plt.show() 106 | #%%----------------------------------------------------------------------------------------------------------- 107 | 108 | # Interactive gray plot 109 | fig, ax, patches, text = som.gray_hist(X, perc_sentosa, True, **int_dict) 110 | plt.show() 111 | #%%----------------------------------------------------------------------------------------------------------- 112 | 113 | # Interactive color plot 114 | fig, ax, patches, text, cbar = som.color_hist(X, perc_sentosa, True, **int_dict) 115 | plt.show() 116 | #%%----------------------------------------------------------------------------------------------------------- 117 | 118 | # Interactive plt_nc 119 | fig, ax, patches = som.plt_nc(True, **int_dict) 120 | plt.show() 121 | #%%----------------------------------------------------------------------------------------------------------- 122 | 123 | # interactive simple grid 124 | fig, ax, patches, cbar = som.simple_grid(perc_sentosa, num_sentosa, True, **int_dict) 125 | plt.show() 126 | #%%----------------------------------------------------------------------------------------------------------- 127 | 128 | # interactive stem plot 129 | fig, ax, h_axes = som.plt_stem(align, iris_class_counts_cluster_array, True, **int_dict) 130 | plt.show() 131 | #%%----------------------------------------------------------------------------------------------------------- 132 | 133 | # interactive line plot 134 | fig, ax, h_axes = som.plt_wgts(True, **int_dict) 135 | plt.show() 136 | #%%----------------------------------------------------------------------------------------------------------- 137 | 138 | # Interactive Histogram 139 | fig, ax, h_axes = som.plt_histogram(num1, True, **int_dict) 140 | plt.show() 141 | #%%----------------------------------------------------------------------------------------------------------- 142 | 143 | # Interactive Boxplot 144 | fig, ax, h_axes = som.plt_boxplot(num1, True, 
**int_dict) 145 | plt.show() 146 | #%%----------------------------------------------------------------------------------------------------------- 147 | 148 | # Interactive Violin Plot 149 | fig, ax, h_axes = som.plt_violin_plot(num1, True, **int_dict) 150 | plt.show() 151 | #%%----------------------------------------------------------------------------------------------------------- 152 | 153 | # Interactive Scatter plot 154 | fig, ax, h_axes = som.plt_scatter(num1, num2, True, True, **int_dict) 155 | plt.show() 156 | #%%----------------------------------------------------------------------------------------------------------- 157 | 158 | # Train Logistic Regression on Iris 159 | print('start training') 160 | logit = LogisticRegression(random_state=SEED) 161 | logit.fit(X, y) 162 | results = logit.predict(X) 163 | print('end training') 164 | 165 | ind_missClass = get_ind_misclassified(y, results) 166 | tp, ind21, fp, ind12 = get_conf_indices(y, results, 0) 167 | 168 | if 'clust' in int_dict: 169 | del int_dict['clust'] 170 | 171 | fig, ax, patches, text = som.cmplx_hit_hist(X, clust, perc_sentosa, ind_missClass, ind21, ind12, mouse_click=True, 172 | **int_dict) 173 | plt.show() 174 | -------------------------------------------------------------------------------- /Lectures/Clustering_SOM/SOM/Tabular/Iris/py/README.rst: -------------------------------------------------------------------------------- 1 | Examples 2 | ======== 3 | This is example script -------------------------------------------------------------------------------- /Mini_Project/Backpropagation/BackProp_Lab.md: -------------------------------------------------------------------------------- 1 | # Backpropagation 2 | 3 | ## Objective 4 | The main goal is to build, train, and assess a simple feed-forward neural network to approximate a given 5 | function 'g'. This aim will be achieved in the following steps: 6 | 7 | - Generating input data within a specified range. 
8 | - Using the provided function 'g' to generate the corresponding targets. 9 | - Building a 2-layer feed-forward neural network with a sigmoid activation function. The number of neurons 10 | is variable and can be adjusted. 11 | - Implementing a simple stochastic gradient descent (SGD) learning algorithm to train the model over a 12 | specified number of epochs. This involves calculating the gradient of the error with respect to the model 13 | parameters and updating the parameters using a learning rate. 14 | 15 | After training, the performance of the neural network will be assessed by comparing the approximated 16 | function to the true function via a plot. 17 | 18 | Finally, the mean squared error (MSE) during training will be plotted over epochs, in both linear and 19 | log scales, to provide insight into how the model performance improves over time. 20 | 21 | This process will be repeated for different configurations of model parameters, such as the number of 22 | neurons and the learning rate, to better understand their impact on model performance. 23 | 24 | The backpropagation algorithm itself is encapsulated within the 'SGD' function in 'models.py', which you 25 | will complete as part of this lab. 26 | 27 | This exercise will provide you with practical experience in implementing neural networks, 28 | understanding the backpropagation algorithm, and learning about stochastic gradient descent optimization. 29 | It will also give you insight into how changing model parameters can impact performance and learning. 30 | 31 | 32 | ```angular2html 33 | g(p) = exp(−|p|) × sin(π p) 34 | ``` 35 | 36 | 37 | ## Instructions 38 | 39 | - Write a Python script to implement the backpropagation algorithm for a 1−S1−1 network. 40 | - Write the program using matrix operations, as in Eq. (11.41) to Eq. (11.47). 
Choose the initial 41 | weights and biases to be random numbers uniformly distributed between -0.5 and 0.5 (using the 42 | function rand), and train the network to approximate the function. 43 | 44 | # Task 45 | 46 | - Use S1 = 2 and S1 = 10. Experiment with several different values for the learning rate α, and 47 | use several different initial conditions. Discuss the convergence properties of the algorithm as 48 | the learning rate changes. 49 | - Plot the trained networks together with the network outputs, and compare them. 50 | - Plot the squared error for each epoch. 51 | - Implement the stochastic gradient approach. 52 | - Implement the batch approach (true gradient). 53 | - Write your code so that it accepts any number of neurons in the hidden layer. 54 | -------------------------------------------------------------------------------- /Mini_Project/Backpropagation/models.py: -------------------------------------------------------------------------------- 1 | from utils import * 2 | # ----------------------------------------------------------------------------------------------- 3 | def Batch(p, t, n_neurons, alpha, epochs): 4 | W1 = ''# TO DO 5 | b1 = '' # TO DO 6 | W2 = '' # TO DO 7 | b2 = '' # TO DO 8 | e = '' # TO DO 9 | mse = '' # TO DO 10 | return np.array(W1), np.array(b1), np.array(W2), np.array(b2), mse 11 | 12 | # ----------------------------------------------------------------------------------------------- 13 | def SGD(p, t, n_neurons, alpha, epochs): 14 | W1 = ''# TO DO 15 | b1 = '' # TO DO 16 | W2 = '' # TO DO 17 | b2 = '' # TO DO 18 | e = '' # TO DO 19 | mse = '' # TO DO 20 | return np.array(W1), np.array(b1), np.array(W2), np.array(b2), e, mse 21 | 22 | 23 | def sim(p, W1, b1, W2, b2): 24 | out = ''# TO DO 25 | return out 26 | -------------------------------------------------------------------------------- /Mini_Project/Backpropagation/utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | 3 | 4 
| def g(p): 5 | return np.exp(-abs(p))*np.sin(np.pi*p) 6 | 7 | 8 | def logsigmoid(x): 9 | return 1.0/(1.0 + np.exp(-x)) 10 | 11 | 12 | # linear 13 | def linear(x): 14 | return x 15 | 16 | def F1(x): 17 | return (1-x)*x 18 | 19 | 20 | F2 = 1 21 | -------------------------------------------------------------------------------- /Mini_Project/LMS/LMS_lab.md: -------------------------------------------------------------------------------- 1 | # Lab: Exploring ADALINE for Decision Boundary Identification 2 | 3 | ## Objective 4 | 5 | In this lab activity, you will learn how to implement and understand the Adaptive Linear Neuron (ADALINE) 6 | algorithm for decision boundary identification. 7 | 8 | 9 | ADALINE is a single-layer neural network that uses a continuous-valued linear function as the activation function. The aim is to correctly update the weight and bias parameters of ADALINE such that it can accurately identify the decision boundary in a given dataset. 10 | In the context of machine learning, decision boundaries are surfaces that partition the input space into regions where all points in the same region are classified under the same label. 11 | Through the course of this lab, you will work on a Python program utilizing ADALINE for identifying such decision boundaries. 12 | 13 | 14 | ## Instructions 15 | The data that your ADALINE script needs to use is depicted in the following code block: 16 | 17 | ```angular2html 18 | p = np.array([[1, 1, 2, 2, -1, -2, -1, -2 ], 19 | [1, 2, -1, 0, 2, 1, -1, -2]]) 20 | 21 | t = np.array([[-1, -1, -1, -1, 1, 1, 1 , 1], 22 | [-1, -1, 1 ,1, -1, -1, 1, 1]]) 23 | ``` 24 | ## Tasks 25 | 26 | Your script should perform the following steps: 27 | 28 | - It starts by defining the input and target data. 29 | - It initializes the weight, bias, and learning rate. 30 | - Then, it initializes the error e and the sum of squared errors SSE to store errors for each epoch. 
31 | - Using nested loops, it performs forward propagation, calculates the error, adjusts the weights and biases, 32 | and computes the SSE. 33 | - In the end, it plots the SSE for each epoch and identifies the decision boundary. 34 | 35 | Your task is to read, understand, and implement the ADALINE neural network model using the provided code, modifying it as needed to suit your requirements. The code also includes data visualization, making it easier for you to understand the process. 36 | Try to experiment with the weight initialization, learning rate, and number of epochs and see how they affect the algorithm's performance. -------------------------------------------------------------------------------- /Mini_Project/Perceptron/perceptron_lab.md: -------------------------------------------------------------------------------- 1 | # Perceptron Learning Algorithm Implementation 2 | 3 | ## Objective 4 | Write a Python code snippet that implements a simple Perceptron Learning algorithm. 5 | The purpose is to demonstrate how the algorithm is used to classify two types of inputs, represented by 6 | 'Rabbit' and 'Bear' in a 2-dimensional space. 7 | 8 | ## Instructions 9 | 10 | In the given code, the `numpy` and `matplotlib.pyplot` libraries are used for general mathematical operations and for 11 | plotting the decision boundary, respectively. The `utils` library is used for any additional utility functions. 12 | 13 | The initial weights and bias for the Perceptron are initialized randomly. Training examples are defined in 2D space 14 | and their respective target outputs are defined. 15 | 16 | ```python 17 | p = np.array([[1, 1, 2, 2, 3, 3, 4, 4], [4, 5, 4, 5, 1, 2, 1, 2]]) 18 | t = np.array([0, 0, 0, 0, 1, 1, 1, 1]) 19 | ``` 20 | 21 | The perceptron is trained over N epochs, adjusting the weights and bias according to the perceptron learning 22 | rule in each epoch. The error for each point is calculated and the sum of errors is stored. 
23 | When the sum of errors reaches zero, the algorithm terminates, since the decision boundary then correctly 24 | classifies all the training examples. 25 | 26 | A scatter plot is then drawn to visualize the classification. 27 | The decision boundary separates the 'Rabbit' and 'Bear' inputs. 28 | 29 | ## Extra Tasks 30 | - Find the best decision boundary. 31 | - Plot the errors and find the best stopping criterion. 32 | 33 | ## Commitment 34 | 35 | We believe this lab will be an exciting opportunity to learn your first learning rule. -------------------------------------------------------------------------------- /Mini_Project/Perceptron/utils.py: -------------------------------------------------------------------------------- 1 | 2 | 3 | def purelin(n): 4 | """ 5 | Calculate the purelin value for a given number. 6 | 7 | Parameters: 8 | n (float): The number for which the purelin value needs to be calculated. 9 | 10 | Returns: 11 | float: The purelin value of the input number. 12 | 13 | """ 14 | a = n 15 | return a 16 | 17 | ## poslin 18 | def poslin(n): 19 | """ 20 | --- poslin --- 21 | 22 | Parameters: 23 | - n: The number to be processed. 24 | 25 | Description: 26 | This function takes a number as input and returns the value of the number if it's greater than or equal to 0, otherwise it returns 0. 27 | 28 | Example Usage: 29 | n = 5 30 | result = poslin(n) 31 | print(result) 32 | # Output: 5 33 | 34 | n = -5 35 | result = poslin(n) 36 | print(result) 37 | # Output: 0 38 | 39 | """ 40 | if n < 0: 41 | a = 0 42 | else: 43 | a = n 44 | return a 45 | 46 | ## hardlims 47 | def hardlims(n): 48 | """ 49 | Applies the symmetric hard-limit activation function: returns -1 for inputs less than 0, and 1 otherwise. 50 | """ 51 | if n < 0: 52 | a = -1 53 | else: 54 | a = 1 55 | return a 56 | 57 | ## hardlim 58 | def hardlim(n): 59 | """ 60 | Applies the hardlimit activation function to the input. 61 | 62 | Parameters: 63 | n (float): The input value. 64 | 65 | Returns: 66 | float: The result of applying the hardlimit activation function to the input. 
67 | """ 68 | if n<0: 69 | a=0 70 | else: 71 | a=1 72 | return a 73 | 74 | ## satlins 75 | def satlins(n): 76 | """ 77 | Implements the Satlins activation function. 78 | 79 | Parameters: 80 | n (float): The input value. 81 | 82 | Returns: 83 | float: The output value after applying the Satlins function. 84 | 85 | Notes: 86 | - The Satlins function returns -1 for input values less than -1, 1 for input values greater than 1, and the input value itself for values between -1 and 1. 87 | 88 | Examples: 89 | >>> satlins(-5) 90 | -1.0 91 | >>> satlins(3) 92 | 1.0 93 | >>> satlins(0.5) 94 | 0.5 95 | """ 96 | if n<-1: 97 | a=-1 98 | elif n>1: 99 | a=1 100 | else: 101 | a=n 102 | return a 103 | def bound(x,b,w): 104 | """ 105 | Calculate the boundary for a given point. 106 | 107 | Parameters: 108 | - x: The x-coordinate of the point. 109 | 110 | Returns: 111 | - The y-coordinate of the boundary line, calculated as (-b - w[0]*x) / w[1]. 112 | 113 | """ 114 | return (-b-w[0]*x)/w[1] -------------------------------------------------------------------------------- /Mini_Project/SimpleNet_Python/PythonLab1.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/NNDesignDeepLearning/NNDesignDeepLearning/blob/master/05.PythonChapter/Code/Labs/PythonLab1.ipynb)" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "# Python Lab 1 Objective\n", 15 | "\n", 16 | "This objective of this first Python lab is to help you become familiar with common Python operations that are often used in deep learning workflows. 
If you haven't already done so, run the cells in the `PythonChapter.ipynb` Jupyter Notebook to prepare for this lab.\n", 17 | "\n", 18 | "Some of the cells in this notebook are prefilled with working code. In addition, there will be cells with missing code (labeled `# TODO`), which you will need to complete. If you need additional cells, you can use the `Insert` menu at the top of the page.\n", 19 | "\n", 20 | "## Loading Modules\n", 21 | "\n", 22 | "We begin by loading some useful modules. In addition to NumPy and Pandas, we use the popular Python package, [Matplotlib](https://matplotlib.org/), to perform plotting. " 23 | ] 24 | }, 25 | { 26 | "cell_type": "code", 27 | "execution_count": 3, 28 | "metadata": { 29 | "ExecuteTime": { 30 | "end_time": "2024-03-13T14:17:03.176010Z", 31 | "start_time": "2024-03-13T14:17:02.454359Z" 32 | } 33 | }, 34 | "outputs": [], 35 | "source": [ 36 | "%matplotlib inline \n", 37 | "import matplotlib.pyplot as plt\n", 38 | "import numpy as np\n", 39 | "import pandas as pd" 40 | ] 41 | }, 42 | { 43 | "cell_type": "markdown", 44 | "metadata": {}, 45 | "source": [ 46 | "# Learning Rate Scheduler\n", 47 | "\n", 48 | "When training neural networks with gradient descent, it is important to set the learning rate appropriately. If it is too small, the training will be slow. If it is too large, the training can become unstable. In addition, the best learning rate can change during the training. In the following cells we will define functions that can compute different learning rates at different epochs of training.\n", 49 | "\n", 50 | "To begin, we define a simple algorithm that starts with a large learning rate and then reduces the rate during the training process by a certain factor at each epoch." 
51 | ] 52 | }, 53 | { 54 | "cell_type": "code", 55 | "execution_count": 1, 56 | "metadata": { 57 | "ExecuteTime": { 58 | "end_time": "2024-03-13T14:16:45.555116Z", 59 | "start_time": "2024-03-13T14:16:45.549279Z" 60 | } 61 | }, 62 | "outputs": [], 63 | "source": [ 64 | "def lr_reduce(epoch):\n", 65 | " lr_decay = 0.8\n", 66 | " lr_max = 0.005\n", 67 | " lr_min = 0.001\n", 68 | " lr = (lr_max - lr_min)*lr_decay**epoch + lr_min\n", 69 | " return lr" 70 | ] 71 | }, 72 | { 73 | "cell_type": "markdown", 74 | "metadata": {}, 75 | "source": [ 76 | "To test this function, we calculate the learning rate up to an epoch of 20." 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": 4, 82 | "metadata": { 83 | "ExecuteTime": { 84 | "end_time": "2024-03-13T14:17:06.524985Z", 85 | "start_time": "2024-03-13T14:17:06.388138Z" 86 | } 87 | }, 88 | "outputs": [ 89 | { 90 | "data": { 91 | "text/plain": "[]" 92 | }, 93 | "execution_count": 4, 94 | "metadata": {}, 95 | "output_type": "execute_result" 96 | }, 97 | { 98 | "data": { 99 | "text/plain": "
", 100 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAj4AAAGdCAYAAAASUnlxAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAABYEUlEQVR4nO3de1yUdd4//tcwR0FAAWFABfGIiJqAIRZSmii2ptUmHb6EHVxpa5Xc7vCw/XJPobW5bbeK1arV1q2uIeWmlliCB9CEEFHxUKKgggTqDIIMDHx+fyCzjgzIoHAxzOv5eFwP9bre12feF1c2L6+jTAghQERERGQHHKRugIiIiKizMPgQERGR3WDwISIiIrvB4ENERER2g8GHiIiI7AaDDxEREdkNBh8iIiKyGww+REREZDcUUjfQlTQ0NODixYtwdnaGTCaTuh0iIiJqAyEEKisr4ePjAweH1o/pMPjc5OLFi+jfv7/UbRAREVE7FBcXo1+/fq3WMPjcxNnZGUDjD87FxUXiboiIiKgt9Ho9+vfvb/oebw2Dz02aTm+5uLgw+BAREdmYtlymwoubiYiIyG4w+BAREZHdYPAhIiIiu8HgQ0RERHaDwYeIiIjsBoMPERER2Q0GHyIiIrIbDD5ERERkNxh8iIiIyG60K/isXr0a/v7+0Gg0CAkJwd69e1utz8jIQEhICDQaDQYOHIg1a9Y0q0lJSUFgYCDUajUCAwORmppqtnzp0qWQyWRmk1arNasRQmDp0qXw8fFBjx498MADD+DYsWPt2UQiIiLqhqwOPps2bUJCQgKWLFmC3NxcREREIDo6GkVFRRbrCwsLMW3aNERERCA3NxeLFy/GvHnzkJKSYqrJyspCTEwMYmNjkZeXh9jYWMyaNQsHDx40G2vEiBEoKSkxTfn5+WbL3377baxYsQIrV67EoUOHoNVqMXnyZFRWVlq7mURERNQNyYQQwpoVwsLCEBwcjOTkZNO84cOHY+bMmUhKSmpWn5iYiK1bt6KgoMA0Lz4+Hnl5ecjKygIAxMTEQK/XY8eOHaaaqVOnonfv3tiwYQOAxiM+X375JQ4fPmyxLyEEfHx8kJCQgMTERACAwWCAl5cXli9fjrlz59522/R6PVxdXaHT6fiuLiIiIhthzfe3VUd8amtrkZOTg6ioKLP5UVFRyMzMtLhOVlZWs/opU6YgOzsbdXV1rdbcOubp06fh4+MDf39/PPnkkzhz5oxpWWFhIUpLS83GUavViIyMbLE3g8EAvV5vNnUEXXUdPtpzBolfHOmQ8YmIiKhtrAo+5eXlqK+vh5eXl9l8Ly8vlJaWWlyntLTUYr3RaER5eXmrNTePGRYWhk8//RTffvstPvroI5SWlmL8+PGoqKgwjdG0Xlt7S0pKgqurq2nq37//7X4E7WKor0fSjgJsyi5GUUV1h3wGERER3V67Lm6+9bXvQohWXwVvqf7W+bcbMzo6Go8//jhGjhyJhx56CNu2bQMAfPLJJ+3ubdGiRdDpdKapuLi4xW24E57OGoQPcgcA/OfIxQ75DCIiIro9q4KPh4cH5HJ5syMoZWVlzY60NNFqtRbrFQoF3N3dW61paUwAcHJywsiRI3H69GnTGACsGketVsPFxcVs6iiPjPYBAGw9zOBDREQkFauCj0qlQkhICNLS0szmp6WlYfz48RbXCQ8Pb1a/c+dOhIaGQqlUtlrT0phA4/U5BQUF8Pb2BgD4+/tDq9WajVNbW4uMjIxWx+ksU0d4QyV3wMlLlThZyrvMiIiIpGD1qa4FCxbgn//8J9atW4eCggK8+uqrKCoqQnx8PIDG00fPPvusqT4+Ph7nzp3DggULUFBQgHXr1mHt2rV47bXXTDXz58/Hzp07sXz5cpw4cQLLly/Hrl27kJCQYKp57bXXkJGRgcLCQhw8eBC//vWvodfrERcXB6DxFFdCQgLeeustpKam4ujRo5g9ezYcHR3x9NNPt
/fnc9e4OioROawPAGBr3gWJuyEiIrJToh1WrVol/Pz8hEqlEsHBwSIjI8O0LC4uTkRGRprVp6enizFjxgiVSiUGDBggkpOTm425efNmMWzYMKFUKkVAQIBISUkxWx4TEyO8vb2FUqkUPj4+4rHHHhPHjh0zq2loaBBvvvmm0Gq1Qq1WiwkTJoj8/Pw2b5dOpxMAhE6na/M61th6+ILwS/xa3L/8O9HQ0NAhn0FERGRvrPn+tvo5Pt1ZRz/H53ptPUL+kobq2nps+e14BPv2vuufQUREZG867Dk+dGd6qOSYHNh4oTUvciYiIup8DD6drOnurm35Jahv4ME2IiKizsTg08kihvSBaw8lfqk04MCZCqnbISIisisMPp1MpXDAtJGNzxzi6S4iIqLOxeAjgek3TnftOFoCg7Fe4m6IiIjsB4OPBML83eHprIa+xog9p8qlboeIiMhuMPhIQO4gMx312ZrH011ERESdhcFHIk13d+06fgnVtUaJuyEiIrIPDD4SGdXPFX7ujrheV4+045ekboeIiMguMPhIRCaT8Y3tREREnYzBR0JNwWfP6V9wtbpW4m6IiIi6PwYfCQ3xckaA1hl19QI7jpZK3Q4REVG3x+AjsUfu4ekuIiKizsLgI7HpoxqDz4HCClzS10jcDRERUffG4COx/m6OCPbtBSGAr4+USN0OERFRt8bg0wU8wocZEhERdQoGny7g4VE+cJABecVXca6iSup2iIiIui0Gny6gj7Ma9w32AAD8h0d9iIiIOgyDTxfBd3cRERF1PAafLmLKCC1UcgecunQNJ0r1UrdDRETULTH4dBGuPZR4YFgfAMBXfKYPERFRh2Dw6UKaHmb4n7yLEEJI3A0REVH3w+DThUwK8IKTSo7zV67jx6KrUrdDRETU7TD4dCE9VHJMDvQCwLu7iIiIOgKDTxfTdLrr6yMlMNY3SNwNERFR98Lg08XcP7gPejkqUX7NgANnLkvdDhERUbfC4NPFqBQOmDbSGwCwNe+CxN0QERF1Lww+XVDTu7t2HC2FwVgvcTdERETdB4NPF3TvADdoXTSorDEi4+QvUrdDRETUbTD4dEEODjL8alTT6S7e3UVERHS3MPh0UU13d+0quIQqg1HiboiIiLoHBp8uamRfVwxwd0RNXQPSjl+Suh0iIqJuoV3BZ/Xq1fD394dGo0FISAj27t3ban1GRgZCQkKg0WgwcOBArFmzpllNSkoKAgMDoVarERgYiNTU1BbHS0pKgkwmQ0JCgtn82bNnQyaTmU3jxo1rzyZKTiaTmS5y5ukuIiKiu8Pq4LNp0yYkJCRgyZIlyM3NRUREBKKjo1FUVGSxvrCwENOmTUNERARyc3OxePFizJs3DykpKaaarKwsxMTEIDY2Fnl5eYiNjcWsWbNw8ODBZuMdOnQIH374IUaNGmXx86ZOnYqSkhLTtH37dms3sctoOt2159QvuFJVK3E3REREts/q4LNixQq88MILePHFFzF8+HC899576N+/P5KTky3Wr1mzBr6+vnjvvfcwfPhwvPjii3j++efxt7/9zVTz3nvvYfLkyVi0aBECAgKwaNEiTJo0Ce+9957ZWNeuXcMzzzyDjz76CL1797b4eWq1Glqt1jS5ublZu4ldxmBPZwz3doGxQWDH0VKp2yEiIrJ5VgWf2tpa5OTkICoqymx+VFQUMjMzLa6TlZXVrH7KlCnIzs5GXV1dqzW3jvnyyy/j4YcfxkMPPdRij+np6fD09MTQoUMxZ84clJWVtVhrMBig1+vNpq7mv6e7+DBDIiKiO2VV8CkvL0d9fT28vLzM5nt5eaG01PIRidLSUov1RqMR5eXlrdbcPObGjRvx448/IikpqcX+oqOj8fnnn+P777/Hu+++i0OHDmHixIkwGAwW65OSkuDq6mqa+vfv3/LGS2T66Mbb2g8WXkaprkbiboiIiGxbuy5ulslkZn8WQjSbd7v6W+e3NmZxcTHmz5+Pzz77DBqNpsXPi
YmJwcMPP4ygoCBMnz4dO3bswKlTp7Bt2zaL9YsWLYJOpzNNxcXFLY4tlX69HRHq1xtCAF8f4UXOREREd8Kq4OPh4QG5XN7s6E5ZWVmzIzZNtFqtxXqFQgF3d/dWa5rGzMnJQVlZGUJCQqBQKKBQKJCRkYH3338fCoUC9fWWX+vg7e0NPz8/nD592uJytVoNFxcXs6krarrI+T+8u4uIiOiOWBV8VCoVQkJCkJaWZjY/LS0N48ePt7hOeHh4s/qdO3ciNDQUSqWy1ZqmMSdNmoT8/HwcPnzYNIWGhuKZZ57B4cOHIZfLLX52RUUFiouL4e3tbc1mdjnTRnpD7iBD3nkdzpZXSd0OERGRzbL6VNeCBQvwz3/+E+vWrUNBQQFeffVVFBUVIT4+HkDj6aNnn33WVB8fH49z585hwYIFKCgowLp167B27Vq89tprppr58+dj586dWL58OU6cOIHly5dj165dpuf0ODs7IygoyGxycnKCu7s7goKCADTe8fXaa68hKysLZ8+eRXp6OqZPnw4PDw88+uijd/IzkpxHTzXGD2o8OsajPkRERO1ndfCJiYnBe++9hz/96U+45557sGfPHmzfvh1+fn4AgJKSErNn+vj7+2P79u1IT0/HPffcgz//+c94//338fjjj5tqxo8fj40bN2L9+vUYNWoUPv74Y2zatAlhYWFt7ksulyM/Px8zZszA0KFDERcXh6FDhyIrKwvOzs7WbmaX03R311d5F03XSBEREZF1ZILfoiZ6vR6urq7Q6XRd7noffU0dQv+yC7XGBmyfF4FAn67VHxERkVSs+f7mu7pshItGiQeH9QHAV1gQERG1F4OPDXlkdF8Ajdf58EAdERGR9Rh8bMik4Z5wUslx4ep1/Fh0Rep2iIiIbA6Djw3RKOWYMkILANh6mKe7iIiIrMXgY2Om33iY4bb8EhjrGyTuhoiIyLYw+NiY+wd7oLejEuXXapF1pkLqdoiIiGwKg4+NUcodMG1k45OoebqLiIjIOgw+NqjpYYbfHCuFwWj5PWVERETUHIOPDRo7wA3erhpU1hiRfvIXqdshIiKyGQw+NsjBQYZfjeLpLiIiImsx+NiopocZ7iq4hGsGo8TdEBER2QYGHxsV1NcF/h5OMBgbkHa8VOp2iIiIbAKDj42SyWSYfuMiZ57uIiIiahsGHxvWdHfX3tPluFJVK3E3REREXR+Djw0b7NkTI3xcYGwQ2H60ROp2iIiIujwGHxv3CE93ERERtRmDj4371Y3g88PZyyjV1UjcDRERUdfG4GPj+vbqgbEDekMI4OsjPOpDRETUGgafbqDpdNcXOechhJC4GyIioq6LwacbeGR0X/RQynGitBIHzlyWuh0iIqIui8GnG3B1VOKx4MYnOa/fXyhxN0RERF0Xg0838dx9AwAAaQWXUFRRLW0zREREXRSDTzcx2NMZE4b2gRDAJ1lnpW6HiIioS2Lw6Uaajvr8+1AxX1xKRERkAYNPNxI5pA8Gejih0mBESs55qdshIiLqchh8uhEHB5npqM/HmWfR0MBb24mIiG7G4NPNPBbcD84aBQrLq5B+qkzqdoiIiLoUBp9uxkmtwFP3+gIA1u07K20zREREXQyDTzf0bLgfHGTAvp/KcepSpdTtEBERdRkMPt1Qv96OiArUAgDW7z8rbTNERERdCINPN/X8/f4AgC0/nseVqlqJuyEiIuoaGHy6qbEDemOEjwsMxgZsOFQkdTtERERdQruCz+rVq+Hv7w+NRoOQkBDs3bu31fqMjAyEhIRAo9Fg4MCBWLNmTbOalJQUBAYGQq1WIzAwEKmpqS2Ol5SUBJlMhoSEBLP5QggsXboUPj4+6NGjBx544AEcO3asPZto82QyGZ67r/Goz6eZ51BX3yBxR0RERNKzOvhs2rQJCQkJWLJkCXJzcxEREYHo6GgUFVk+qlBYWIhp06YhIiICubm5WLx4MebNm4eUlBRTTVZWFmJiYhAbG4u8vDzExsZi1
qxZOHjwYLPxDh06hA8//BCjRo1qtuztt9/GihUrsHLlShw6dAharRaTJ09GZaV9XuA7fbQ3PHqqUKqvwTdHS6Vuh4iISHrCSvfee6+Ij483mxcQECAWLlxosf71118XAQEBZvPmzp0rxo0bZ/rzrFmzxNSpU81qpkyZIp588kmzeZWVlWLIkCEiLS1NREZGivnz55uWNTQ0CK1WK5YtW2aaV1NTI1xdXcWaNWvatG06nU4AEDqdrk31tmDFzpPCL/Fr8eiqfVK3QkRE1CGs+f626ohPbW0tcnJyEBUVZTY/KioKmZmZFtfJyspqVj9lyhRkZ2ejrq6u1Zpbx3z55Zfx8MMP46GHHmr2OYWFhSgtLTUbR61WIzIyssXeDAYD9Hq92dTdPDPOFyq5A34suorDxVelboeIiEhSVgWf8vJy1NfXw8vLy2y+l5cXSkstn0opLS21WG80GlFeXt5qzc1jbty4ET/++COSkpJa/Jym9draW1JSElxdXU1T//79LdbZMk9nDX412hsAsH5/ocTdEBERSatdFzfLZDKzPwshms27Xf2t81sbs7i4GPPnz8dnn30GjUZz13pbtGgRdDqdaSouLm51bFv1/I2LnLcdKcElfY3E3RAREUnHquDj4eEBuVze7AhKWVlZsyMtTbRarcV6hUIBd3f3VmuaxszJyUFZWRlCQkKgUCigUCiQkZGB999/HwqFAvX19dBqGx/YZ01varUaLi4uZlN3FNTXFfcOcIOxQeBfWeekboeIiEgyVgUflUqFkJAQpKWlmc1PS0vD+PHjLa4THh7erH7nzp0IDQ2FUqlstaZpzEmTJiE/Px+HDx82TaGhoXjmmWdw+PBhyOVy+Pv7Q6vVmo1TW1uLjIyMFnuzJ01vbf+/H4pQU1cvbTNEREQSUVi7woIFCxAbG4vQ0FCEh4fjww8/RFFREeLj4wE0nj66cOECPv30UwBAfHw8Vq5ciQULFmDOnDnIysrC2rVrsWHDBtOY8+fPx4QJE7B8+XLMmDEDX331FXbt2oV9+/YBAJydnREUFGTWh5OTE9zd3U3zm57r89Zbb2HIkCEYMmQI3nrrLTg6OuLpp59u30+nG5kc6IW+vXrgwtXr+OrwBcSM9ZW6JSIiok5ndfCJiYlBRUUF/vSnP6GkpARBQUHYvn07/Pz8AAAlJSVmz/Tx9/fH9u3b8eqrr2LVqlXw8fHB+++/j8cff9xUM378eGzcuBF/+MMf8MYbb2DQoEHYtGkTwsLCrOrt9ddfx/Xr1/Hb3/4WV65cQVhYGHbu3AlnZ2drN7PbUcgdEDfeD29tP4H1+89iVmj/Vq/LIiIi6o5koulKY4Jer4erqyt0Ol23vN5HV12HcUnf4XpdPf5vThjGD/KQuiUiIqI7Zs33N9/VZUdcHZX4dUg/AMC6fWelbYaIiEgCDD52ZvaNi5y/O3EJ5yqqpG2GiIiokzH42JlBfXoicmgfCAF8nHlW6naIiIg6FYOPHXr+/sYHGm7OPo/KmjqJuyEiIuo8DD52aMIQDwzq44RrBiO+yDkvdTtERESdhsHHDslkMsy+8RqLjzPPor6BN/YREZF9YPCxU48H94WLRoFzFdXYfaJM6naIiIg6BYOPnXJUKfDUvY1Pb16fybe2ExGRfWDwsWPPjh8AuYMM+3+qwIlSvdTtEBERdTgGHzvWt1cPTBnR+Ob6j/eflbYZIiKiTsDgY+eeu3GRc2ruBVyuqpW4GyIioo7F4GPnQv16Y2RfVxiMDdjwQ9HtVyAiIrJhDD52TiaT4bkbr7H4NOss6uobpG2IiIioAzH4EB4e5Q2Pnmpc0huwPb9E6naIiIg6DIMPQa2QI3acHwBgPS9yJiKibozBhwAAT4f5QiV3wOHiq/ix6IrU7RAREXUIBh8CAPRxVuORe3wA8KgPERF1Xww+ZNJ0kfOO/BKU6K5L2wwREVEHYPAhkxE+rrjX3w3GBoF/ZZ2Tuh0iIqK7jsGHzDx/44GGG34oQ
…base64 PNG image data (inline plot output) truncated…" 101 | }, 102 | "metadata": {}, 103 | "output_type": "display_data" 104 | } 105 | ], 106 | "source": [ 107 | "plt.plot(np.arange(20), [lr_reduce(ep) for ep in range(20)])" 108 | ] 109 | }, 110 | { 111 | "cell_type": "markdown", 112 | "metadata": {}, 113 | "source": [ 114 | "Using the idea of the `lr_reduce` function, create a function that starts at a low learning rate `lr_0`, increases linearly for `lr_0_steps` to a maximum of `lr_max`, holds at `lr_max` for `lr_max_steps` and then decays geometrically at a rate of `lr_decay` each epoch toward a minimum of `lr_min`." 115 | ] 116 | }, 117 | { 118 | "cell_type": "code", 119 | "execution_count": null, 120 | "metadata": {}, 121 | "outputs": [], 122 | "source": [ 123 | "def lr_increase_decrease(epoch):\n", 124 | " lr_0 = 0.001\n", 125 | " lr_0_steps = 5\n", 126 | " lr_max = 0.005\n", 127 | " lr_max_steps = 5\n", 128 | " lr_min = 0.0005\n", 129 | " lr_decay = 0.8\n", 130 | "\n", 131 | " if epoch < lr_0_steps:\n", 132 | " lr = #TODO\n", 133 | " elif epoch < lr_0_steps + lr_max_steps:\n", 134 | " lr = #TODO\n", 135 | " else:\n", 136 | " lr = #TODO\n", 137 | " \n", 138 | " return lr" 139 | ] 140 | }, 141 | { 142 | "cell_type": "markdown", 143 | "metadata": {}, 144 | "source": [ 145 | "Test your function."
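For reference, the shape being asked for can be sketched as a standalone function, using the same illustrative constants as the cell above. This is a sketch only: the name `lr_warm_decay` and the exact handling of the first decay epoch are choices made here, not the lab's reference solution.

```python
def lr_warm_decay(epoch, lr_0=0.001, lr_0_steps=5, lr_max=0.005,
                  lr_max_steps=5, lr_min=0.0005, lr_decay=0.8):
    """Linear warm-up to lr_max, hold, then geometric decay toward lr_min."""
    if epoch < lr_0_steps:
        # ramp linearly from lr_0 at epoch 0 toward lr_max
        return lr_0 + (lr_max - lr_0) * epoch / lr_0_steps
    if epoch < lr_0_steps + lr_max_steps:
        return lr_max                      # hold at the maximum
    # geometric decay, floored at lr_min
    return max(lr_min, lr_max * lr_decay ** (epoch - lr_0_steps - lr_max_steps + 1))

print([round(lr_warm_decay(ep), 6) for ep in (0, 2, 5, 9, 10, 30)])
# → [0.001, 0.0026, 0.005, 0.005, 0.004, 0.0005]
```

The `max(lr_min, ...)` clamp is what keeps the geometric decay from shrinking the rate toward zero late in training.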
146 | ] 147 | }, 148 | { 149 | "cell_type": "code", 150 | "execution_count": null, 151 | "metadata": {}, 152 | "outputs": [], 153 | "source": [ 154 | "plt.plot(np.arange(30), [lr_increase_decrease(ep) for ep in range(30)])" 155 | ] 156 | }, 157 | { 158 | "cell_type": "markdown", 159 | "metadata": {}, 160 | "source": [ 161 | "Sometimes it is useful to create learning rate functions with different sets of parameters and to pass these functions to a training function. Using the ideas above, create a class of *increase-then-reduce* learning rate functions. It should have `__init__` and `__call__` methods. The input to the `__call__` method should be the epoch number, and the returned value should be the learning rate for that epoch." 162 | ] 163 | }, 164 | { 165 | "cell_type": "code", 166 | "execution_count": null, 167 | "metadata": {}, 168 | "outputs": [], 169 | "source": [ 170 | "class lr_inc_dec:\n", 171 | " def __init__(self, lr_0=0.001, lr_0_steps=5, lr_max = 0.005, lr_max_steps = 5, lr_min=0.0005, lr_decay=0.9):\n", 172 | " self.lr_0 = lr_0\n", 173 | " self.lr_max = lr_max\n", 174 | " self.lr_0_steps = lr_0_steps\n", 175 | " self.lr_max_steps = lr_max_steps\n", 176 | " self.lr_min = lr_min\n", 177 | " self.lr_decay = lr_decay\n", 178 | " \n", 179 | " def __call__(self, epoch):\n", 180 | " #TODO\n", 181 | " \n", 182 | " return self.lr" 183 | ] 184 | }, 185 | { 186 | "cell_type": "markdown", 187 | "metadata": {}, 188 | "source": [ 189 | "Test your function." 190 | ] 191 | }, 192 | { 193 | "cell_type": "code", 194 | "execution_count": null, 195 | "metadata": {}, 196 | "outputs": [], 197 | "source": [ 198 | "lr_scheduler = lr_inc_dec(lr_0=0.002, lr_0_steps=10, lr_max = 0.01, lr_max_steps = 5, lr_min=0.001, lr_decay=0.85)\n", 199 | "plt.plot(np.arange(50), [lr_scheduler(ep) for ep in range(50)])" 200 | ] 201 | }, 202 | { 203 | "cell_type": "markdown", 204 | "metadata": {}, 205 | "source": [ 206 | "# Training a Simple Network\n", 207 | "\n", 208 | "Consider the network 
and training data from Exercise E3.1. You are going to write some code to create the network and the training data, and then to train the network using steepest descent.\n", 209 | "\n", 210 | "## Training Data\n", 211 | "\n", 212 | "First, create NumPy arrays to hold the inputs and targets." 213 | ] 214 | }, 215 | { 216 | "cell_type": "code", 217 | "execution_count": null, 218 | "metadata": {}, 219 | "outputs": [], 220 | "source": [ 221 | "P = #TODO\n", 222 | "T = #TODO\n", 223 | "print(P)\n", 224 | "print(T)" 225 | ] 226 | }, 227 | { 228 | "cell_type": "markdown", 229 | "metadata": {}, 230 | "source": [ 231 | "## Define the Network\n", 232 | "\n", 233 | "The next step is to define the network. Define a network class, `simplenet`, to implement the network from Exercise E3.1. The `simplenet` object should have two attributes: the weight `.w` and the bias `.b`, which should be initialized in the `__init__` method. It should have a method called `sim`, which should return the network output for a given input (and should work for a single input, or a batch of inputs). It should also have a method `deriv`, which should return the derivative of the network output with respect to the weight and bias. The derivative will have two rows (the first for w and the second for b), and as many columns as the batch size. (Hint: `np.ones` and `np.vstack` could be useful in forming the derivative.)" 234 | ] 235 | }, 236 | { 237 | "cell_type": "code", 238 | "execution_count": null, 239 | "metadata": {}, 240 | "outputs": [], 241 | "source": [ 242 | "class simplenet:\n", 243 | " def __init__(self, weight, bias):\n", 244 | " self.w = weight\n", 245 | " self.b = bias\n", 246 | " \n", 247 | " def sim(self, p):\n", 248 | " return #TODO\n", 249 | " \n", 250 | " def deriv(self, p):\n", 251 | " return #TODO" 252 | ] 253 | }, 254 | { 255 | "cell_type": "markdown", 256 | "metadata": {}, 257 | "source": [ 258 | "Test the network using an initial weight of 1 and a bias of 0. 
Use the training data as input. The network output should be [-1, 0, 1]. The derivative should be [[-1., 0., 1.], [ 1., 1., 1.]]." 259 | ] 260 | }, 261 | { 262 | "cell_type": "code", 263 | "execution_count": null, 264 | "metadata": {}, 265 | "outputs": [], 266 | "source": [ 267 | "net = simplenet(1, 0)\n", 268 | "A = net.sim(P)\n", 269 | "print(A)\n", 270 | "dA_dwb = net.deriv(P)\n", 271 | "print(dA_dwb)" 272 | ] 273 | }, 274 | { 275 | "cell_type": "markdown", 276 | "metadata": {}, 277 | "source": [ 278 | "## Performance Function\n", 279 | "\n", 280 | "Now we define the performance (or loss) function. Define a mean square error performance class. The class should have two methods: `value`, which returns the value of the MSE, and `deriv`, which returns the derivative of the MSE with respect to the network output. The input to both methods should be the network output and the target output. The function should work for single examples, or a batch of any size. " 281 | ] 282 | }, 283 | { 284 | "cell_type": "code", 285 | "execution_count": null, 286 | "metadata": {}, 287 | "outputs": [], 288 | "source": [ 289 | "class mse:\n", 290 | " def value(self, a, t):\n", 291 | " perf = #TODO\n", 292 | " return perf\n", 293 | " \n", 294 | " def deriv(self, a, t):\n", 295 | " df_da = #TODO\n", 296 | " return df_da" 297 | ] 298 | }, 299 | { 300 | "cell_type": "markdown", 301 | "metadata": {}, 302 | "source": [ 303 | "Test the function using the training targets and the network output you computed above. The MSE value should be 0.9166666, and the derivative should be [-1., 1., 3.]." 
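Those two expected numbers jointly imply a convention: the value is the mean of the squared errors, and the reported derivative is the elementwise quantity 2(a − t). A quick plain-NumPy cross-check (the targets below are back-solved from the expected derivative for illustration, not quoted from the exercise):

```python
import numpy as np

a = np.array([-1.0, 0.0, 1.0])            # network output from the previous test
t = a - np.array([-1.0, 1.0, 3.0]) / 2    # targets implied by deriv = 2*(a - t)
print(t)                                  # → [-0.5 -0.5 -0.5]
print(np.mean((t - a) ** 2))              # → 0.9166666666666666
print(2 * (a - t))                        # → [-1.  1.  3.]
```

Both printed values match the expected MSE of 0.9166666 and the expected derivative [-1., 1., 3.] above, which is how the convention was inferred.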
304 | ] 305 | }, 306 | { 307 | "cell_type": "code", 308 | "execution_count": null, 309 | "metadata": {}, 310 | "outputs": [], 311 | "source": [ 312 | "test_mse = mse()\n", 313 | "performance = test_mse.value(A, T)\n", 314 | "print(performance)\n", 315 | "dF_dA = test_mse.deriv(A, T)\n", 316 | "print(dF_dA)" 317 | ] 318 | }, 319 | { 320 | "cell_type": "markdown", 321 | "metadata": {}, 322 | "source": [ 323 | "## Training Function\n", 324 | "\n", 325 | "Now define a training function. There should be five inputs to the training function: 1) a network, 2) a performance function, 3) a learning rate scheduler, 4) the maximum number of epochs to train, and 5) the data set. The data set should be provided as a dictionary with two keys: `Input` and `Target`. The values in the dictionary should be NumPy arrays containing the training inputs and training targets. \n", 326 | "\n", 327 | "To compute the gradient, use the `perf.deriv` and `net.deriv` methods and then multiply their outputs together ($\partial F/\partial \mathbf{x}=\partial F/\partial a\times \partial a/\partial \mathbf{x}$). Get the learning rate at each epoch from the learning rate scheduler. This will be a batch training function, which will use the entire data set for each weight update, so each iteration is one epoch. (See Chapter 3 of the text for a discussion of batch training.) You can access the weight and bias attributes of the network with `net.w` and `net.b`.\n", 328 | "\n", 329 | "Compute the performance at each epoch and store it in a list (use the `append` method). The training function should return two things: 1) the final trained network, and 2) the list of performance values."
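The loop being described (chain rule per epoch, full-batch update) can be sketched end-to-end with plain NumPy, independently of the lab's classes. Every array and constant below is invented for illustration:

```python
import numpy as np

p = np.array([-1.0, 0.0, 1.0])      # invented inputs
t = np.array([-1.5, 0.5, 1.0])      # invented targets
w, b = 0.0, 0.0                     # network: a = w*p + b
lr = 0.1
perf = []
for epoch in range(200):            # batch training: one update per epoch
    a = w * p + b
    perf.append(np.mean((t - a) ** 2))
    df_da = 2 * (a - t) / p.size    # dF/da for F = mean((t - a)**2)
    grad_w = np.sum(df_da * p)      # chain rule with da/dw = p
    grad_b = np.sum(df_da)          # chain rule with da/db = 1
    w, b = w - lr * grad_w, b - lr * grad_b

print(round(w, 4), round(b, 4))     # → 1.25 0.0 (the least-squares fit)
```

Because the update uses the whole data set, the stored `perf` list decreases smoothly, which is the behavior the lab asks you to plot.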
330 | ] 331 | }, 332 | { 333 | "cell_type": "code", 334 | "execution_count": null, 335 | "metadata": {}, 336 | "outputs": [], 337 | "source": [ 338 | "def train_gd(net, perf, lr_sched, max_epoch, data):\n", 339 | " pp = data['Input']\n", 340 | " tt = data['Target']\n", 341 | " ff = []\n", 342 | " for ep in range(max_epoch):\n", 343 | " #TODO\n", 344 | " return net, ff" 345 | ] 346 | }, 347 | { 348 | "cell_type": "markdown", 349 | "metadata": {}, 350 | "source": [ 351 | "Test the training function." 352 | ] 353 | }, 354 | { 355 | "cell_type": "code", 356 | "execution_count": null, 357 | "metadata": {}, 358 | "outputs": [], 359 | "source": [ 360 | "lr_scheduler = lr_inc_dec(lr_0=0.01, lr_0_steps=10, lr_max = 0.03, lr_max_steps = 5, lr_min=0.02, lr_decay=0.85)\n", 361 | "DATA = {'Input': P, 'Target': T}\n", 362 | "PERF = mse()\n", 363 | "MAX_EPOCH = 30\n", 364 | "NET = simplenet(0.1, 0.1)" 365 | ] 366 | }, 367 | { 368 | "cell_type": "code", 369 | "execution_count": null, 370 | "metadata": {}, 371 | "outputs": [], 372 | "source": [ 373 | "NET2, FF = train_gd(NET, PERF, lr_scheduler, MAX_EPOCH, DATA)" 374 | ] 375 | }, 376 | { 377 | "cell_type": "markdown", 378 | "metadata": {}, 379 | "source": [ 380 | "Plot the progress of the performance function during training. The performance should decay toward 0. You could try changing the learning rate scheduler to see how it affects the convergence of the network." 
381 | ] 382 | }, 383 | { 384 | "cell_type": "code", 385 | "execution_count": null, 386 | "metadata": {}, 387 | "outputs": [], 388 | "source": [ 389 | "plt.plot(np.arange(MAX_EPOCH), FF)" 390 | ] 391 | }, 392 | { 393 | "cell_type": "markdown", 394 | "metadata": {}, 395 | "source": [ 396 | "# Using Pandas and a Data Generator\n", 397 | "\n", 398 | "Now consider a slightly more complex problem, in which we load some data from a file into a Pandas DataFrame, transform the data, and then set up a data generator that serves the data in mini-batches for training the network.\n", 399 | "\n", 400 | "To begin, we read a CSV file into a DataFrame. This is the same file used in the Python Chapter Jupyter Notebook. For this task we will be trying to predict the **Percent** value from the **FVC** value." 401 | ] 402 | }, 403 | { 404 | "cell_type": "code", 405 | "execution_count": null, 406 | "metadata": {}, 407 | "outputs": [], 408 | "source": [ 409 | "sample_df = pd.read_csv('https://raw.githubusercontent.com/NNDesignDeepLearning/NNDesignDeepLearning/master/05.PythonChapter/Code/ChapterNotebook/SampleDF.csv?token=AUZTTDQALCLLAYUY2ZZGDKDA5YTEK')" 410 | ] 411 | }, 412 | { 413 | "cell_type": "code", 414 | "execution_count": null, 415 | "metadata": {}, 416 | "outputs": [], 417 | "source": [ 418 | "print(sample_df.head())" 419 | ] 420 | }, 421 | { 422 | "cell_type": "markdown", 423 | "metadata": {}, 424 | "source": [ 425 | "When training neural networks, we commonly normalize the input and target data before training the network, as described in Chapter 4 of the textbook. For this problem let's normalize the inputs and targets to have a mean of 0 and a standard deviation of 1. In order to do that, compute the mean and standard deviation of the **FVC** and **Percent** columns in the DataFrame."
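The z-score transform described here can be rehearsed on a throwaway DataFrame first (the numbers below are invented; only the column names match the real file):

```python
import pandas as pd

df = pd.DataFrame({'FVC': [2.0, 3.0, 4.0], 'Percent': [50.0, 70.0, 90.0]})
for col in ('FVC', 'Percent'):
    m, s = df[col].mean(), df[col].std()           # note: pandas std() uses ddof=1
    df[col] = df[col].apply(lambda x: (x - m) / s)

print(df['FVC'].tolist())       # → [-1.0, 0.0, 1.0]
print(df['Percent'].tolist())   # → [-1.0, 0.0, 1.0]
```

One detail worth knowing: `Series.std()` defaults to the sample standard deviation (ddof=1), so the transformed column has sample standard deviation exactly 1; NumPy's `np.std` defaults to ddof=0 and would give a slightly different scaling.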
426 | ] 427 | }, 428 | { 429 | "cell_type": "code", 430 | "execution_count": null, 431 | "metadata": {}, 432 | "outputs": [], 433 | "source": [ 434 | "mean_fvc = #TODO\n", 435 | "std_fvc = #TODO\n", 436 | "mean_percent = #TODO\n", 437 | "std_percent = #TODO" 438 | ] 439 | }, 440 | { 441 | "cell_type": "markdown", 442 | "metadata": {}, 443 | "source": [ 444 | "Now transform the **FVC** and **Percent** columns so that they have a mean of 0 and a standard deviation of 1. Use the `apply` method and a `lambda` function that subtracts the mean and divides by the standard deviation." 445 | ] 446 | }, 447 | { 448 | "cell_type": "code", 449 | "execution_count": null, 450 | "metadata": {}, 451 | "outputs": [], 452 | "source": [ 453 | "sample_df['FVC'] = #TODO\n", 454 | "sample_df['Percent'] = #TODO" 455 | ] 456 | }, 457 | { 458 | "cell_type": "markdown", 459 | "metadata": {}, 460 | "source": [ 461 | "Assign the input variable `P` to the **FVC** column of the DataFrame and the target variable `T` to the **Percent** column of the DataFrame, after converting them to NumPy arrays." 462 | ] 463 | }, 464 | { 465 | "cell_type": "code", 466 | "execution_count": null, 467 | "metadata": {}, 468 | "outputs": [], 469 | "source": [ 470 | "P = #TODO\n", 471 | "T = #TODO" 472 | ] 473 | }, 474 | { 475 | "cell_type": "markdown", 476 | "metadata": {}, 477 | "source": [ 478 | "As earlier, put the inputs and targets into a dictionary." 479 | ] 480 | }, 481 | { 482 | "cell_type": "code", 483 | "execution_count": null, 484 | "metadata": {}, 485 | "outputs": [], 486 | "source": [ 487 | "Data = {'Input': P, 'Target': T}" 488 | ] 489 | }, 490 | { 491 | "cell_type": "markdown", 492 | "metadata": {}, 493 | "source": [ 494 | "## Data Generator\n", 495 | "\n", 496 | "We covered generators in the Python Chapter Jupyter Notebook. Let's make a generator that will be passed into the training function, which will use it to return mini-batches of the training data. 
The generator should continue to pass through the data set an indefinite number of times. It will shuffle the data set after each epoch. If the length of the data set is not an integer multiple of the batch size, some remainder of data points will be left out of each epoch, but because of shuffling, all data will eventually be included in training.\n", 497 | "\n", 498 | "There will be two inputs to the generator: 1) the dictionary containing the data set and 2) the mini-batch size `bsize`. The generator should then extract the input and target from the dictionary and iterate through the data, extracting `bsize` columns from the input and target arrays at each iteration. \n", 499 | "\n", 500 | "On each iteration, the data generator should yield three things: 1) the epoch number, 2) `bsize` columns of the input array and 3) `bsize` columns of the target array. At the completion of each epoch, it should randomly reorder the columns of the input and target arrays, so that they will be presented in a different order on each epoch. (Hint: The function `np.random.permutation` can be useful in reordering the arrays. Be sure to use the same indexing for both inputs and targets.) The generator should be written so that it can be used for an arbitrary number of epochs. The training function will stop accessing the generator once the epoch number reaches the desired number of epochs."
501 | ] 502 | }, 503 | { 504 | "cell_type": "code", 505 | "execution_count": null, 506 | "metadata": {}, 507 | "outputs": [], 508 | "source": [ 509 | "def data_gen(data, bsize):\n", 510 | "    In = data['Input']\n", 511 | "    Tar = data['Target']\n", 512 | "    num = len(In)\n", 513 | "    steps = num//bsize\n", 514 | "    epoch = 0\n", 515 | "    while True:\n", 516 | "        for ii in range(steps):\n", 517 | "            yield epoch, #TODO\n", 518 | "        #TODO\n", 519 | "        epoch += 1" 520 | ] 521 | }, 522 | { 523 | "cell_type": "markdown", 524 | "metadata": {}, 525 | "source": [ 526 | "## Modified Training Function\n", 527 | "\n", 528 | "Modify the earlier training function, `train_gd`, so that its fifth input is a data generator, rather than the data set itself. It should still run for `max_epoch` epochs, but there will be multiple iterations for each epoch if the mini-batch size is smaller than the full data set size. \n", 529 | "\n", 530 | "The form of the `train_gd_gen` function will be almost identical to the `train_gd` function. However, instead of incrementing the epoch number, it will call the data generator with the `next` command, which will return the current epoch number. This epoch number will be used to access the correct learning rate.\n", 531 | "\n", 532 | "Except for the data generator, the inputs and outputs of `train_gd_gen` will be the same as those of `train_gd`. However, the returned list of performance values will contain one entry per iteration, and there may be multiple iterations per epoch, depending on the mini-batch size."
533 | ] 534 | }, 535 | { 536 | "cell_type": "code", 537 | "execution_count": null, 538 | "metadata": {}, 539 | "outputs": [], 540 | "source": [ 541 | "def train_gd_gen(net, perf, lr_sched, max_epoch, datagen):\n", 542 | "    ff = []\n", 543 | "    ep = 0\n", 544 | "    while ep < max_epoch:\n", 545 | "        #TODO\n", 546 | "    return net, ff" 547 | ] 548 | }, 549 | { 550 | "cell_type": "markdown", 551 | "metadata": {}, 552 | "source": [ 553 | "Now test the data generator and the new training function. In the values set below, the mini-batch size is 10, which means that there will be 10 iterations per epoch, since there are 100 data points in the training set." 554 | ] 555 | }, 556 | { 557 | "cell_type": "code", 558 | "execution_count": null, 559 | "metadata": {}, 560 | "outputs": [], 561 | "source": [ 562 | "lr_scheduler = lr_inc_dec(lr_0=0.001, lr_0_steps=10, lr_max = 0.01, lr_max_steps = 10, lr_min=0.000001, lr_decay=0.85)\n", 563 | "DATA = {'Input': P, 'Target': T}\n", 564 | "BSIZE = 10\n", 565 | "PERF = mse()\n", 566 | "MAX_EPOCH = 50\n", 567 | "NET = simplenet(0.1, 0.1)\n", 568 | "gendata = data_gen(DATA, BSIZE)" 569 | ] 570 | }, 571 | { 572 | "cell_type": "code", 573 | "execution_count": null, 574 | "metadata": {}, 575 | "outputs": [], 576 | "source": [ 577 | "NET2, FF = train_gd_gen(NET, PERF, lr_scheduler, MAX_EPOCH, gendata)" 578 | ] 579 | }, 580 | { 581 | "cell_type": "markdown", 582 | "metadata": {}, 583 | "source": [ 584 | "Let's plot the performance during training. You will notice that the plot is not as smooth as the earlier training example, which used full batch training. Since at each iteration the performance is computed over only a subset of the data, the convergence can be noisy. You can experiment with larger mini-batch sizes to see how this affects the convergence."
585 | ] 586 | }, 587 | { 588 | "cell_type": "code", 589 | "execution_count": null, 590 | "metadata": {}, 591 | "outputs": [], 592 | "source": [ 593 | "plt.plot(FF)" 594 | ] 595 | }, 596 | { 597 | "cell_type": "markdown", 598 | "metadata": {}, 599 | "source": [ 600 | "Finally, let's see how well the final trained network fits the training data. Here we find the network response to inputs over the range -3 to 3 (which is the approximate range of the normalized **FVC** values) and compare it to the training data. Since we use only a one-layer network, the response of the network is linear. " 601 | ] 602 | }, 603 | { 604 | "cell_type": "code", 605 | "execution_count": null, 606 | "metadata": {}, 607 | "outputs": [], 608 | "source": [ 609 | "pp = np.arange(-3,3,0.1)\n", 610 | "aa = NET2.sim(pp)\n", 611 | "plt.plot(pp, aa, 'r', P, T, 'b.')" 612 | ] 613 | }, 614 | { 615 | "cell_type": "markdown", 616 | "metadata": {}, 617 | "source": [ 618 | "## Explore Further\n", 619 | "\n", 620 | "Experiment with different learning rate schedules and mini-batch sizes to see how they affect training. \n", 621 | "\n", 622 | "Implement other training algorithms, such as Adam. \n", 623 | "\n", 624 | "Implement a two-layer network. \n", 625 | "\n", 626 | "Load your own data set."
627 | ] 628 | } 629 | ], 630 | "metadata": { 631 | "kernelspec": { 632 | "display_name": "Python 3", 633 | "language": "python", 634 | "name": "python3" 635 | }, 636 | "language_info": { 637 | "codemirror_mode": { 638 | "name": "ipython", 639 | "version": 3 640 | }, 641 | "file_extension": ".py", 642 | "mimetype": "text/x-python", 643 | "name": "python", 644 | "nbconvert_exporter": "python", 645 | "pygments_lexer": "ipython3", 646 | "version": "3.6.9" 647 | } 648 | }, 649 | "nbformat": 4, 650 | "nbformat_minor": 4 651 | } 652 | -------------------------------------------------------------------------------- /Mini_Project/SimpleNet_Python/PythonLab1.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | import numpy as np 3 | import pandas as pd 4 | 5 | 6 | # ---------------------------------------------------------------------------------------------------- 7 | def lr_reduce(epoch): 8 | lr_decay = 0.8 9 | lr_max = 0.005 10 | lr_min = 0.001 11 | lr = (lr_max - lr_min) * lr_decay ** epoch + lr_min 12 | return lr 13 | 14 | 15 | # ---------------------------------------------------------------------------------------------------- 16 | plt.figure() 17 | plt.plot(np.arange(20), [lr_reduce(ep) for ep in range(20)]) 18 | plt.show() 19 | 20 | 21 | # ---------------------------------------------------------------------------------------------------- 22 | def lr_increase_decrease(epoch): 23 | lr_0 = 0.001 24 | lr_0_steps = 5 25 | lr_max = 0.005 26 | lr_max_steps = 5 27 | lr_min = 0.0005 28 | lr_decay = 0.8 29 | 30 | if epoch < lr_0_steps: 31 | lr = ' ' # TODO 32 | elif epoch < lr_0_steps + lr_max_steps: 33 | lr = ' ' # TODO 34 | else: 35 | lr = ' ' # TODO 36 | 37 | return lr 38 | 39 | 40 | # ---------------------------------------------------------------------------------------------------- 41 | plt.figure() 42 | plt.plot(np.arange(30), [lr_increase_decrease(ep) for ep in range(30)]) 43 | plt.show() 44 | 45 
| 46 | # ---------------------------------------------------------------------------------------------------- 47 | class lr_inc_dec: 48 | def __init__(self, lr_0=0.001, lr_0_steps=5, lr_max=0.005, lr_max_steps=5, lr_min=0.0005, lr_decay=0.9): 49 | self.lr_0 = lr_0 50 | self.lr_max = lr_max 51 | self.lr_0_steps = lr_0_steps 52 | self.lr_max_steps = lr_max_steps 53 | self.lr_min = lr_min 54 | self.lr_decay = lr_decay 55 | 56 | def __call__(self, epoch): 57 | # TODO 58 | 59 | return self.lr 60 | 61 | 62 | # ---------------------------------------------------------------------------------------------------- 63 | lr_scheduler = lr_inc_dec(lr_0=0.002, lr_0_steps=10, lr_max=0.01, lr_max_steps=5, lr_min=0.001, lr_decay=0.85) 64 | 65 | plt.figure() 66 | plt.plot(np.arange(50), [lr_scheduler(ep) for ep in range(50)]) 67 | plt.show() 68 | # ---------------------------------------------------------------------------------------------------- 69 | 70 | P = ' ' # TODO 71 | T = ' ' # TODO 72 | print(P) 73 | print(T) 74 | 75 | 76 | # ---------------------------------------------------------------------------------------------------- 77 | 78 | class simplenet: 79 | def __init__(self, weight, bias): 80 | self.w = weight 81 | self.b = bias 82 | 83 | def sim(self, p): 84 | return # TODO 85 | 86 | def deriv(self, p): 87 | return # TODO 88 | 89 | 90 | # ---------------------------------------------------------------------------------------------------- 91 | 92 | net = simplenet(1, 0) 93 | A = net.sim(P) 94 | print(A) 95 | dA_dwb = net.deriv(P) 96 | print(dA_dwb) 97 | 98 | 99 | # ---------------------------------------------------------------------------------------------------- 100 | 101 | class mse: 102 | def value(self, a, t): 103 | perf = ' ' # TODO 104 | return perf 105 | 106 | def deriv(self, a, t): 107 | df_da = ' ' # TODO 108 | return df_da 109 | 110 | 111 | # ---------------------------------------------------------------------------------------------------- 112 | 113 | test_mse 
= mse() 114 | performance = test_mse.value(A, T) 115 | print(performance) 116 | dF_dA = test_mse.deriv(A, T) 117 | print(dF_dA) 118 | 119 | 120 | # ---------------------------------------------------------------------------------------------------- 121 | 122 | 123 | def train_gd(net, perf, lr_sched, max_epoch, data): 124 | pp = data['Input'] 125 | tt = data['Target'] 126 | ff = [] 127 | for ep in range(max_epoch): 128 | ' ' # TODO 129 | return net, ff 130 | 131 | 132 | # ---------------------------------------------------------------------------------------------------- 133 | 134 | lr_scheduler = lr_inc_dec(lr_0=0.01, lr_0_steps=10, lr_max=0.03, lr_max_steps=5, lr_min=0.02, lr_decay=0.85) 135 | DATA = {'Input': P, 'Target': T} 136 | PERF = mse() 137 | MAX_EPOCH = 30 138 | NET = simplenet(0.1, 0.1) 139 | 140 | # ---------------------------------------------------------------------------------------------------- 141 | NET2, FF = train_gd(NET, PERF, lr_scheduler, MAX_EPOCH, DATA) 142 | # ---------------------------------------------------------------------------------------------------- 143 | plt.figure() 144 | plt.plot(np.arange(MAX_EPOCH), FF) 145 | plt.show() 146 | 147 | # ---------------------------------------------------------------------------------------------------- 148 | sample_df = pd.read_csv( 149 | 'https://raw.githubusercontent.com/NNDesignDeepLearning/NNDesignDeepLearning/master/05.PythonChapter/Code/ChapterNotebook/SampleDF.csv?token=AUZTTDQALCLLAYUY2ZZGDKDA5YTEK') 150 | # ---------------------------------------------------------------------------------------------------- 151 | print(sample_df.head()) 152 | 153 | mean_fvc = ' ' # TODO 154 | std_fvc = ' ' # TODO 155 | mean_percent = ' ' # TODO 156 | std_percent = ' ' # TODO 157 | # ---------------------------------------------------------------------------------------------------- 158 | sample_df['FVC'] = ' ' # TODO 159 | sample_df['Percent'] = ' ' # TODO 160 | 161 | # 
---------------------------------------------------------------------------------------------------- 162 | P = ' ' # TODO 163 | T = ' ' # TODO 164 | 165 | # ---------------------------------------------------------------------------------------------------- 166 | Data = {'Input': P, 'Target': T} 167 | 168 | 169 | # ---------------------------------------------------------------------------------------------------- 170 | 171 | 172 | def data_gen(data, bsize): 173 | In = data['Input'] 174 | Tar = data['Target'] 175 | num = len(In) 176 | steps = num // bsize 177 | epoch = 0 178 | while True: 179 | for ii in range(steps): 180 | yield epoch, # TODO 181 | # TODO 182 | epoch += 1 183 | 184 | 185 | # ---------------------------------------------------------------------------------------------------- 186 | 187 | def train_gd_gen(net, perf, lr_sched, max_epoch, datagen): 188 | ff = [] 189 | ep = 0 190 | while ep < max_epoch: 191 | ' ' # TODO 192 | return net, ff 193 | 194 | 195 | # ---------------------------------------------------------------------------------------------------- 196 | lr_scheduler = lr_inc_dec(lr_0=0.001, lr_0_steps=10, lr_max=0.01, lr_max_steps=10, lr_min=0.000001, lr_decay=0.85) 197 | DATA = {'Input': P, 'Target': T} 198 | BSIZE = 10 199 | PERF = mse() 200 | MAX_EPOCH = 50 201 | NET = simplenet(0.1, 0.1) 202 | gendata = data_gen(DATA, BSIZE) 203 | 204 | # ---------------------------------------------------------------------------------------------------- 205 | 206 | NET2, FF = train_gd_gen(NET, PERF, lr_scheduler, MAX_EPOCH, gendata) 207 | # ---------------------------------------------------------------------------------------------------- 208 | plt.figure() 209 | plt.plot(FF) 210 | plt.show() 211 | # ---------------------------------------------------------------------------------------------------- 212 | 213 | pp = np.arange(-3, 3, 0.1) 214 | aa = NET2.sim(pp) 215 | plt.plot(pp, aa, 'r', P, T, 'b.') 216 | 217 | # 
---------------------------------------------------------------------------------------------------- 218 | -------------------------------------------------------------------------------- /Mini_Project/SimpleNet_Python/SampleDF.csv: -------------------------------------------------------------------------------- 1 | Patient,Weeks,FVC,Percent,Age,Sex,SmokingStatus 2 | ID00213637202257692916109,32,2972,81.8281938325991,70,Male,Currently smokes 3 | ID00129637202219868188000,0,2253,59.622102254684,71,Male,Never smoked 4 | ID00130637202220059448013,12,1648,68.1160618335125,65,Female,Never smoked 5 | ID00225637202259339837603,23,969,49.07571537097999,77,Female,Never smoked 6 | ID00082637202201836229724,33,2885,98.66621067031471,49,Female,Currently smokes 7 | ID00388637202301028491611,27,3045,76.9172476508033,53,Male,Ex-smoker 8 | ID00355637202295106567614,65,4791,153.145377828922,65,Male,Currently smokes 9 | ID00133637202223847701934,6,3171,92.1588002790049,83,Male,Never smoked 10 | ID00124637202217596410344,84,3350,83.59952086244759,60,Male,Ex-smoker 11 | ID00172637202238316925179,43,2833,77.2102910716232,73,Male,Ex-smoker 12 | ID00202637202249376026949,27,4029,100.26378658172399,64,Male,Never smoked 13 | ID00011637202177653955184,13,3410,88.15925542916241,72,Male,Ex-smoker 14 | ID00011637202177653955184,43,3346,86.50465356773529,72,Male,Ex-smoker 15 | ID00376637202297677828573,77,4251,118.74301675977699,72,Male,Never smoked 16 | ID00343637202287577133798,48,1383,60.206347133342,68,Female,Never smoked 17 | ID00077637202199102000916,12,3255,84.2740265120133,70,Male,Ex-smoker 18 | ID00128637202219474716089,34,2220,96.9263010827803,87,Female,Never smoked 19 | ID00401637202305320178010,51,1845,67.90577843209421,74,Female,Ex-smoker 20 | ID00421637202311550012437,17,2756,82.55451713395641,68,Male,Ex-smoker 21 | ID00367637202296290303449,36,1389,56.6892498571545,57,Female,Never smoked 22 | ID00365637202296085035729,42,2416,71.5724611920844,71,Male,Ex-smoker 23 | 
ID00119637202215426335765,2,2917,66.7017287112412,57,Male,Ex-smoker 24 | ID00086637202203494931510,6,3303,115.392677473449,65,Female,Never smoked 25 | ID00109637202210454292264,79,2327,60.567412805830294,73,Male,Ex-smoker 26 | ID00094637202205333947361,42,4574,109.133422408857,64,Male,Ex-smoker 27 | ID00102637202206574119190,29,2196,59.0449559044956,60,Male,Ex-smoker 28 | ID00312637202282607344793,10,1765,91.5788927515177,72,Female,Ex-smoker 29 | ID00219637202258203123958,56,5768,137.92443806791002,71,Male,Ex-smoker 30 | ID00184637202242062969203,43,3175,68.27956989247309,52,Male,Ex-smoker 31 | ID00343637202287577133798,76,1699,73.962822689478,68,Female,Never smoked 32 | ID00165637202237320314458,87,2067,52.9077505887171,54,Male,Ex-smoker 33 | ID00296637202279895784347,10,2258,67.04275534441811,58,Male,Ex-smoker 34 | ID00317637202283194142136,26,2375,57.45597058254311,64,Male,Ex-smoker 35 | ID00290637202279304677843,4,2283,67.4087634345104,75,Male,Never smoked 36 | ID00279637202272164826258,63,3796,83.36261419536191,70,Male,Ex-smoker 37 | ID00423637202312137826377,18,2777,66.8190567853705,72,Male,Ex-smoker 38 | ID00367637202296290303449,38,1467,59.8726634560444,57,Female,Never smoked 39 | ID00130637202220059448013,18,1537,63.5281474745805,65,Female,Never smoked 40 | ID00216637202257988213445,23,2195,58.346624136097795,65,Male,Never smoked 41 | ID00136637202224951350618,6,3390,97.5820379965458,64,Male,Ex-smoker 42 | ID00027637202179689871102,12,2472,64.3414888079125,73,Male,Ex-smoker 43 | ID00288637202279148973731,34,2365,81.8792411023404,63,Female,Ex-smoker 44 | ID00094637202205333947361,9,4738,113.0463828975,64,Male,Ex-smoker 45 | ID00131637202220424084844,116,2990,62.453003592614294,61,Male,Never smoked 46 | ID00169637202238024117706,28,2527,106.741573033708,66,Female,Never smoked 47 | ID00105637202208831864134,49,2547,68.2768603903067,64,Male,Ex-smoker 48 | ID00229637202260254240583,38,3603,80.60402684563759,71,Male,Ex-smoker 49 | 
ID00202637202249376026949,23,4316,107.405932709536,64,Male,Never smoked 50 | ID00111637202210956877205,10,2419,69.8164396213346,72,Male,Ex-smoker 51 | ID00109637202210454292264,91,2112,54.9713690786049,73,Male,Ex-smoker 52 | ID00068637202190879923934,13,2784,82.3376316100793,73,Male,Ex-smoker 53 | ID00411637202309374271828,45,1746,48.64593781344029,65,Male,Ex-smoker 54 | ID00411637202309374271828,7,1556,43.352279059400395,65,Male,Ex-smoker 55 | ID00340637202287399835821,30,2160,54.380664652567994,68,Male,Ex-smoker 56 | ID00381637202299644114027,9,2335,55.0292232277526,62,Male,Ex-smoker 57 | ID00169637202238024117706,20,2486,105.009715299485,66,Female,Never smoked 58 | ID00398637202303897337979,19,2592,76.1994355597366,70,Male,Ex-smoker 59 | ID00035637202182204917484,12,2603,71.15910333515579,69,Male,Ex-smoker 60 | ID00035637202182204917484,18,2684,73.37342810278841,69,Male,Ex-smoker 61 | ID00323637202285211956970,29,1550,64.9296246648793,77,Male,Ex-smoker 62 | ID00180637202240177410333,18,3054,86.9788106630212,68,Male,Ex-smoker 63 | ID00317637202283194142136,85,1861,45.021288949100104,64,Male,Ex-smoker 64 | ID00240637202264138860065,5,2991,78.4216046145779,63,Male,Ex-smoker 65 | ID00117637202212360228007,44,2912,79.0445168295331,68,Male,Currently smokes 66 | ID00015637202177877247924,69,2585,66.8304033092037,71,Male,Ex-smoker 67 | ID00131637202220424084844,76,3226,67.38240454507479,61,Male,Never smoked 68 | ID00190637202244450116191,12,4490,117.207893912499,69,Male,Ex-smoker 69 | ID00235637202261451839085,42,2120,55.4219387221583,67,Male,Ex-smoker 70 | ID00305637202281772703145,46,2714,71.75338409475471,62,Male,Ex-smoker 71 | ID00319637202283897208687,40,4238,124.383658135713,72,Male,Ex-smoker 72 | ID00094637202205333947361,30,4753,113.404275625119,64,Male,Ex-smoker 73 | ID00020637202178344345685,70,1861,95.4163248564397,66,Female,Never smoked 74 | ID00401637202305320178010,49,1793,65.9919028340081,74,Female,Ex-smoker 75 | 
ID00108637202209619669361,65,2611,74.00793650793649,73,Male,Ex-smoker 76 | ID00367637202296290303449,35,1366,55.750550975430606,57,Female,Never smoked 77 | ID00138637202231603868088,22,4324,104.46463084653999,66,Male,Ex-smoker 78 | ID00011637202177653955184,58,3193,82.5491209927611,72,Male,Ex-smoker 79 | ID00030637202181211009029,54,2328,54.9834671705243,69,Male,Ex-smoker 80 | ID00102637202206574119190,41,2336,62.8092062809206,60,Male,Ex-smoker 81 | ID00400637202305055099402,32,3895,86.8876594985277,55,Male,Ex-smoker 82 | ID00202637202249376026949,33,4092,101.831574756122,64,Male,Never smoked 83 | ID00062637202188654068490,98,2071,68.8131313131313,74,Male,Never smoked 84 | ID00323637202285211956970,33,1818,76.1561662198391,77,Male,Ex-smoker 85 | ID00422637202311677017371,35,1862,73.9710789766407,73,Male,Ex-smoker 86 | ID00078637202199415319443,17,1780,65.3522781510445,55,Female,Ex-smoker 87 | ID00093637202205278167493,54,3668,84.33734939759029,69,Male,Ex-smoker 88 | ID00240637202264138860065,41,2762,72.41740954378609,63,Male,Ex-smoker 89 | ID00229637202260254240583,26,3531,78.9932885906041,71,Male,Ex-smoker 90 | ID00011637202177653955184,9,3541,91.54601861427089,72,Male,Ex-smoker 91 | ID00099637202206203080121,51,3089,86.55570499887921,68,Male,Ex-smoker 92 | ID00381637202299644114027,31,1996,47.03996983408749,62,Male,Ex-smoker 93 | ID00192637202245493238298,12,1929,62.825690463783204,56,Female,Never smoked 94 | ID00139637202231703564336,27,4050,107.59829968119,76,Male,Ex-smoker 95 | ID00126637202218610655908,50,2316,57.9,78,Male,Ex-smoker 96 | ID00331637202286306023714,82,3221,97.24067141649559,69,Male,Currently smokes 97 | ID00344637202287684217717,6,2032,53.0714584203928,58,Male,Never smoked 98 | ID00392637202302319160044,9,2214,55.019880715705796,66,Male,Ex-smoker 99 | ID00038637202182690843176,15,3787,93.1198977082719,71,Male,Ex-smoker 100 | ID00299637202280383305867,24,2182,79.84484777517571,78,Male,Ex-smoker 101 | 
ID00421637202311550012437,70,2719,81.4462017733046,68,Male,Ex-smoker 102 | -------------------------------------------------------------------------------- /Mini_Project/SimpleNet_Python/python_lab.md: -------------------------------------------------------------------------------- 1 | # Python Lab 1 2 | 3 | ## Objective 4 | 5 | The objective of this first Python Lab is to familiarize participants with common Python operations that are used extensively in deep learning workflows. 6 | 7 | ### Prerequisites 8 | Before starting this lab, please run all the cells in the PythonChapter.ipynb Jupyter Notebook as preparation. 9 | 10 | ## Instructions 11 | Throughout the lab, you will encounter predesigned cells with working code. However, certain cells have missing code, indicated with a `# TODO` comment. 12 | 13 | For each of these, you are required to: 14 | 15 | 1. Study the problem described. 16 | 2. Fill in the missing parts of the code to make it functional. 17 | 18 | Each `# TODO` task is accompanied by sufficient information and guiding comments to make clear what it entails. 19 | 20 | If you need to add cells, you can use the `Insert` menu at the top of the page. 21 | 22 | ## Commitment 23 | We believe this lab is an exciting opportunity to learn and practice Python interactively in the context of machine learning, so we encourage asking questions, experimenting with code, and exploring in depth! 24 | 25 | Please complete all the `# TODO` tasks in the notebook to get the full benefit of the lab.
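To illustrate the style of completion the `# TODO` tasks call for, here is a minimal sketch of the mini-batch data generator described in the notebook. It assumes `data['Input']` and `data['Target']` are 1-D NumPy arrays of equal length; the shuffling strategy shown is one reasonable choice, not the official solution.

```python
import numpy as np

def data_gen(data, bsize):
    """Yield (epoch, input_batch, target_batch) indefinitely, reshuffling each epoch."""
    In = data['Input']
    Tar = data['Target']
    num = len(In)
    steps = num // bsize  # any leftover points are skipped within an epoch
    epoch = 0
    while True:
        for ii in range(steps):
            sl = slice(ii * bsize, (ii + 1) * bsize)
            yield epoch, In[sl], Tar[sl]
        # one shared permutation keeps each input paired with its target
        idx = np.random.permutation(num)
        In, Tar = In[idx], Tar[idx]
        epoch += 1
```

A training loop would create the generator once, e.g. `gen = data_gen({'Input': P, 'Target': T}, 10)`, and pull batches with `next(gen)` until the yielded epoch number reaches the desired number of epochs.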
-------------------------------------------------------------------------------- /Readme.md: -------------------------------------------------------------------------------- 1 | ## 💫 Basic Codes to Learn Machine Learning 2 | 3 | 4 | ## Guide 5 | 6 | 7 | ![Amir Jafari - ML](https://img.shields.io/static/v1?label=Amir+Jafari&message=ML&color=blue&logo=github) [![License](https://img.shields.io/badge/License-MIT-blue)](#license) 8 | 9 | [//]: # ([![dependency - Python](https://img.shields.io/badge/dependency-Python-blue)](https://pypi.org/project/Python)) 10 | 11 | 12 | * Supervised Learning 13 | 14 | 15 | * Unsupervised Learning 16 | 17 | 18 | * Packages 19 | 20 | 21 | * Neural Network Design 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | This repo contains the most up-to-date fundamentals of Machine Learning needed to build practical solutions. 30 | 31 | 32 | # 💻 Tech Stack: 33 | ![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54) 34 | ![Anaconda](https://img.shields.io/badge/Anaconda-%2344A833.svg?style=for-the-badge&logo=anaconda&logoColor=white) 35 | ![PyTorch](https://img.shields.io/badge/PyTorch-%23EE4C2C.svg?style=for-the-badge&logo=PyTorch&logoColor=white) 36 | ![Plotly](https://img.shields.io/badge/Plotly-%233F4F75.svg?style=for-the-badge&logo=plotly&logoColor=white) 37 | ![Pandas](https://img.shields.io/badge/pandas-%23150458.svg?style=for-the-badge&logo=pandas&logoColor=white) 38 | ![NumPy](https://img.shields.io/badge/numpy-%23013243.svg?style=for-the-badge&logo=numpy&logoColor=white) 39 | ![Scipy](https://img.shields.io/badge/SciPy-%230C55A5.svg?style=for-the-badge&logo=scipy&logoColor=white) 40 | ![scikit-learn](https://img.shields.io/badge/scikit--learn-%23F7931E.svg?style=for-the-badge&logo=scikit-learn&logoColor=white) 41 | ![TensorFlow](https://img.shields.io/badge/TensorFlow-%23FF6F00.svg?style=for-the-badge&logo=TensorFlow&logoColor=white) 42 | 
![Keras](https://img.shields.io/badge/Keras-%23D00000.svg?style=for-the-badge&logo=Keras&logoColor=white) 43 | ![Matplotlib](https://img.shields.io/badge/Matplotlib-%23ffffff.svg?style=for-the-badge&logo=Matplotlib&logoColor=black) 44 | --------------------------------------------------------------------------------