├── Week 01 - Introduction to Python
│   ├── HRDataset.zip
│   └── text_data.txt
├── Week 05 - Deep Learning Algorithms
│   └── model.png
├── Week 06 - Deep Learning for Computer Vision
│   └── model.png
├── requirements.txt
├── .github
│   └── workflows
│       └── colab_badge_workflow.yml
├── README.md
├── Week 02 - Data Science Libraries
│   └── test.csv
├── Week 03 - Machine Learning Algorithms
│   ├── 2- Evaluation_Metrics.ipynb
│   └── 1- Scikit_Learn.ipynb
├── Week 04 - Advance Machine Learning
│   ├── 1 - Feature Selection.ipynb
│   └── 2- Cross_Validation.ipynb
└── LICENSE
/Week 01 - Introduction to Python/HRDataset.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/TheAIDojo/Machine_Learning_Bootcamp/HEAD/Week 01 - Introduction to Python/HRDataset.zip
--------------------------------------------------------------------------------
/Week 05 - Deep Learning Algorithms/model.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/TheAIDojo/Machine_Learning_Bootcamp/HEAD/Week 05 - Deep Learning Algorithms/model.png
--------------------------------------------------------------------------------
/Week 06 - Deep Learning for Computer Vision/model.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/TheAIDojo/Machine_Learning_Bootcamp/HEAD/Week 06 - Deep Learning for Computer Vision/model.png
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | jupyter==4.6
2 | matplotlib==3.5.1
3 | numpy==1.21.5
4 | pandas==1.3.5
5 | scikit-learn==1.0.1
6 | tensorflow==2.5.0
7 | seaborn==0.11.2
8 | tqdm==4.32.2
9 | tensorflow-hub==0.4.0
10 | tensorflow-datasets==2.2.0
11 |
12 |
--------------------------------------------------------------------------------
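The pins in requirements.txt all use pip's exact-version `name==version` form. As a small illustration of how such lines decompose into package/version pairs, here is a minimal sketch; `parse_pins` is a hypothetical helper written for this example, not part of the repository:

```python
# Illustrative sketch (not part of the repo): parse pinned "name==version"
# requirement lines, like the ones in requirements.txt, into a dict so a
# setup script could inspect or report on them.
def parse_pins(text):
    """Map each non-empty, non-comment 'name==version' line to a dict entry."""
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            name, _, version = line.partition("==")
            pins[name.strip()] = version.strip()
    return pins


sample = "numpy==1.21.5\npandas==1.3.5\n"
print(parse_pins(sample))  # {'numpy': '1.21.5', 'pandas': '1.3.5'}
```

In practice these pins are installed in one step with `pip install -r requirements.txt`, as the README's setup instructions describe.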
/.github/workflows/colab_badge_workflow.yml:
--------------------------------------------------------------------------------
1 | name: colab-badge-workflow
2 | on: [push]
3 | jobs:
4 |   add-colab-badge:
5 |     runs-on: ubuntu-latest
6 |     steps:
7 |       - name: Checkout first
8 |         id: checkout
9 |         uses: actions/checkout@v2
10 |       - name: Add/Update badges
11 |         id: badges
12 |         uses: trsvchn/colab-badge-action@v3
13 |         with:
14 |           check: 'all'
15 |           update: true
16 |           target_branch: main
17 |           target_repository: TheAIDojo/Machine_Learning_Bootcamp
18 |       - name: Commit changes
19 |         uses: EndBug/add-and-commit@v7
20 |         with:
21 |           author_name: ${{ github.repository_owner }}
22 |           author_email: mentzerhussein@gmail.com
23 |           message: 'Added Colab Badges'
24 |           add: '*.ipynb'
25 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 | # AI DOJO Machine Learning Bootcamp
3 |
4 | This is the repository for The Machine Learning Bootcamp published by [AI DOJO](https://github.com/TheAIDojo). It contains all the Bootcamp code, with the supporting project files necessary to work through it.
5 |
6 | **Note**: to access our previous Bootcamp resources, please refer to the link below:
7 | * [**Previous Bootcamp**](https://github.com/TheAIDojo/Machine_Learning_Bootcamp/tree/Chapter_01)
8 |
9 | ## Requirements and Setup
10 |
11 | We recommend using [Colab](https://colab.research.google.com/?utm_source=scs-index); you will find a Colab icon at the top of all AI DOJO notebooks.
12 | If you want to use your own PC, please follow the instructions below:
13 | 1. Install Python on [Windows/Mac](https://www.python.org/downloads/release/python-387/).
14 | 2. Install pip for [Windows/Mac/Linux](https://pip.pypa.io/en/stable/installation/).
15 | 3.
Make sure to install the necessary Python packages for the Bootcamp from the [requirements.txt](https://github.com/TheAIDojo/Workshops/blob/main/requirements.txt) file.
16 | 4. Download a code editor; we recommend [VS Code](https://code.visualstudio.com/download).
17 |
18 | ## About The AI DOJO Machine Learning Bootcamp
19 |
20 | With expert guidance and real-world examples, the AI DOJO Machine Learning Bootcamp will walk you through building, training, and evaluating your Machine Learning and Deep Learning models by showing you how to leverage TensorFlow's flexibility. The AI DOJO Machine Learning Bootcamp will teach you all the skills you need to use Machine Learning & Deep Learning the right way.
21 |
22 |
23 | ## What You Will Learn
24 |
25 |
26 | * [**WEEK 01 – INTRODUCTION TO PYTHON**](https://github.com/TheAIDojo/Machine_Learning_Bootcamp/tree/main/Week%2001%20-%20Introduction%20to%20Python)
27 |
28 | Review of the Python programming language that covers basic syntax, variables, arithmetic operations, control flow, functions, classes, and built-in Python modules, all applied using Google Colab.
29 |
30 | * [**WEEK 02 – DATA SCIENCE LIBRARIES**](https://github.com/TheAIDojo/Machine_Learning_Bootcamp/tree/main/Week%2002%20-%20Data%20Science%20Libraries)
31 |
32 | Introduction to 3rd-party Python libraries that are used for data science tasks, like NumPy for working with numerical tensors and matrices, Pandas for working with tabular data, and Matplotlib for data visualization.
33 |
34 | * [**WEEK 03 – MACHINE LEARNING ALGORITHMS**](https://github.com/TheAIDojo/Machine_Learning_Bootcamp/tree/main/Week%2003%20-%20Machine%20Learning%20Algorithms)
35 |
36 | Introduction to Scikit-Learn, a machine learning library that covers a wide range of ML tasks, from preprocessing and preparing data, to defining, training, and tuning different machine learning algorithms, to evaluating their performance and measuring their accuracy.
37 |
38 | * [**WEEK 04 – DEEP LEARNING ALGORITHMS**](https://github.com/TheAIDojo/Machine_Learning_Bootcamp/tree/main/Week%2004_%20Advance%20Machine%20Learning)
39 |
40 | Introduction to Deep Learning using the TensorFlow framework and the building blocks that go into training deep learning models, including neural networks and their components like activation functions, loss functions, optimizers, evaluation metrics, and data pipelines.
41 |
42 | * [**WEEK 05 – COMPUTER VISION**]()
43 |
44 | Introduction to Convolutional Neural Networks (CNNs) using TensorFlow, image preprocessing and data pipelines, using pretrained models for Transfer Learning, and exploring different CNN architectures like ResNet, Inception, and more.
45 |
46 | * [**WEEK 06 – COMPUTER VISION APPLICATIONS**]()
47 |
48 | Build advanced computer-vision-powered applications beyond basic image classification, including object detection, semantic segmentation, and more advanced CNN training techniques.
49 |
50 | * [**WEEK 07 – SEQUENCE MODELLING**]()
51 |
52 | Introduction to Recurrent Neural Networks (RNNs) and their variations like GRUs and LSTMs, which are used for sequential data like text and time series. We’ll use these neural networks to perform tasks like sentiment analysis and text generation.
53 |
54 | * [**WEEK 08 – ADVANCED SEQUENCE APPLICATIONS**]()
55 |
56 | Introduction to more advanced RNN-based architectures like Autoencoders. We’ll also build more sophisticated applications like Neural Machine Translation and forecasting using RNN-based models.
57 |
58 | * [**WEEK 09 – MORE ADVANCED DL APPLICATIONS**]()
59 |
60 | We’ll explore more Deep Learning applications like Recommendation Systems. We’ll also introduce platforms like Gradio that help us quickly build interactive demos for our Deep Learning models.
61 | 62 | * [**WEEK 10 – MODEL DEPLOYMENT with TENSORFLOW**]() 63 | 64 | We’ll learn how to deploy and serve our trained models across different platforms including via REST APIs and mobile applications using TensorFlow Lite. 65 | 66 | -------------------------------------------------------------------------------- /Week 02 - Data Science Libraries/test.csv: -------------------------------------------------------------------------------- 1 | a,b 2 | 0,1 3 | 2,3 4 | 4,5 5 | 6,7 6 | 8,9 7 | 10,11 8 | 12,13 9 | 14,15 10 | 16,17 11 | 18,19 12 | 20,21 13 | 22,23 14 | 24,25 15 | 26,27 16 | 28,29 17 | 30,31 18 | 32,33 19 | 34,35 20 | 36,37 21 | 38,39 22 | 40,41 23 | 42,43 24 | 44,45 25 | 46,47 26 | 48,49 27 | 50,51 28 | 52,53 29 | 54,55 30 | 56,57 31 | 58,59 32 | 60,61 33 | 62,63 34 | 64,65 35 | 66,67 36 | 68,69 37 | 70,71 38 | 72,73 39 | 74,75 40 | 76,77 41 | 78,79 42 | 80,81 43 | 82,83 44 | 84,85 45 | 86,87 46 | 88,89 47 | 90,91 48 | 92,93 49 | 94,95 50 | 96,97 51 | 98,99 52 | 100,101 53 | 102,103 54 | 104,105 55 | 106,107 56 | 108,109 57 | 110,111 58 | 112,113 59 | 114,115 60 | 116,117 61 | 118,119 62 | 120,121 63 | 122,123 64 | 124,125 65 | 126,127 66 | 128,129 67 | 130,131 68 | 132,133 69 | 134,135 70 | 136,137 71 | 138,139 72 | 140,141 73 | 142,143 74 | 144,145 75 | 146,147 76 | 148,149 77 | 150,151 78 | 152,153 79 | 154,155 80 | 156,157 81 | 158,159 82 | 160,161 83 | 162,163 84 | 164,165 85 | 166,167 86 | 168,169 87 | 170,171 88 | 172,173 89 | 174,175 90 | 176,177 91 | 178,179 92 | 180,181 93 | 182,183 94 | 184,185 95 | 186,187 96 | 188,189 97 | 190,191 98 | 192,193 99 | 194,195 100 | 196,197 101 | 198,199 102 | 200,201 103 | 202,203 104 | 204,205 105 | 206,207 106 | 208,209 107 | 210,211 108 | 212,213 109 | 214,215 110 | 216,217 111 | 218,219 112 | 220,221 113 | 222,223 114 | 224,225 115 | 226,227 116 | 228,229 117 | 230,231 118 | 232,233 119 | 234,235 120 | 236,237 121 | 238,239 122 | 240,241 123 | 242,243 124 | 244,245 125 | 246,247 126 | 248,249 127 | 250,251 
128 | 252,253 129 | 254,255 130 | 256,257 131 | 258,259 132 | 260,261 133 | 262,263 134 | 264,265 135 | 266,267 136 | 268,269 137 | 270,271 138 | 272,273 139 | 274,275 140 | 276,277 141 | 278,279 142 | 280,281 143 | 282,283 144 | 284,285 145 | 286,287 146 | 288,289 147 | 290,291 148 | 292,293 149 | 294,295 150 | 296,297 151 | 298,299 152 | 300,301 153 | 302,303 154 | 304,305 155 | 306,307 156 | 308,309 157 | 310,311 158 | 312,313 159 | 314,315 160 | 316,317 161 | 318,319 162 | 320,321 163 | 322,323 164 | 324,325 165 | 326,327 166 | 328,329 167 | 330,331 168 | 332,333 169 | 334,335 170 | 336,337 171 | 338,339 172 | 340,341 173 | 342,343 174 | 344,345 175 | 346,347 176 | 348,349 177 | 350,351 178 | 352,353 179 | 354,355 180 | 356,357 181 | 358,359 182 | 360,361 183 | 362,363 184 | 364,365 185 | 366,367 186 | 368,369 187 | 370,371 188 | 372,373 189 | 374,375 190 | 376,377 191 | 378,379 192 | 380,381 193 | 382,383 194 | 384,385 195 | 386,387 196 | 388,389 197 | 390,391 198 | 392,393 199 | 394,395 200 | 396,397 201 | 398,399 202 | 400,401 203 | 402,403 204 | 404,405 205 | 406,407 206 | 408,409 207 | 410,411 208 | 412,413 209 | 414,415 210 | 416,417 211 | 418,419 212 | 420,421 213 | 422,423 214 | 424,425 215 | 426,427 216 | 428,429 217 | 430,431 218 | 432,433 219 | 434,435 220 | 436,437 221 | 438,439 222 | 440,441 223 | 442,443 224 | 444,445 225 | 446,447 226 | 448,449 227 | 450,451 228 | 452,453 229 | 454,455 230 | 456,457 231 | 458,459 232 | 460,461 233 | 462,463 234 | 464,465 235 | 466,467 236 | 468,469 237 | 470,471 238 | 472,473 239 | 474,475 240 | 476,477 241 | 478,479 242 | 480,481 243 | 482,483 244 | 484,485 245 | 486,487 246 | 488,489 247 | 490,491 248 | 492,493 249 | 494,495 250 | 496,497 251 | 498,499 252 | 500,501 253 | 502,503 254 | 504,505 255 | 506,507 256 | 508,509 257 | 510,511 258 | 512,513 259 | 514,515 260 | 516,517 261 | 518,519 262 | 520,521 263 | 522,523 264 | 524,525 265 | 526,527 266 | 528,529 267 | 530,531 268 | 532,533 269 | 534,535 270 | 
536,537 271 | 538,539 272 | 540,541 273 | 542,543 274 | 544,545 275 | 546,547 276 | 548,549 277 | 550,551 278 | 552,553 279 | 554,555 280 | 556,557 281 | 558,559 282 | 560,561 283 | 562,563 284 | 564,565 285 | 566,567 286 | 568,569 287 | 570,571 288 | 572,573 289 | 574,575 290 | 576,577 291 | 578,579 292 | 580,581 293 | 582,583 294 | 584,585 295 | 586,587 296 | 588,589 297 | 590,591 298 | 592,593 299 | 594,595 300 | 596,597 301 | 598,599 302 | 600,601 303 | 602,603 304 | 604,605 305 | 606,607 306 | 608,609 307 | 610,611 308 | 612,613 309 | 614,615 310 | 616,617 311 | 618,619 312 | 620,621 313 | 622,623 314 | 624,625 315 | 626,627 316 | 628,629 317 | 630,631 318 | 632,633 319 | 634,635 320 | 636,637 321 | 638,639 322 | 640,641 323 | 642,643 324 | 644,645 325 | 646,647 326 | 648,649 327 | 650,651 328 | 652,653 329 | 654,655 330 | 656,657 331 | 658,659 332 | 660,661 333 | 662,663 334 | 664,665 335 | 666,667 336 | 668,669 337 | 670,671 338 | 672,673 339 | 674,675 340 | 676,677 341 | 678,679 342 | 680,681 343 | 682,683 344 | 684,685 345 | 686,687 346 | 688,689 347 | 690,691 348 | 692,693 349 | 694,695 350 | 696,697 351 | 698,699 352 | 700,701 353 | 702,703 354 | 704,705 355 | 706,707 356 | 708,709 357 | 710,711 358 | 712,713 359 | 714,715 360 | 716,717 361 | 718,719 362 | 720,721 363 | 722,723 364 | 724,725 365 | 726,727 366 | 728,729 367 | 730,731 368 | 732,733 369 | 734,735 370 | 736,737 371 | 738,739 372 | 740,741 373 | 742,743 374 | 744,745 375 | 746,747 376 | 748,749 377 | 750,751 378 | 752,753 379 | 754,755 380 | 756,757 381 | 758,759 382 | 760,761 383 | 762,763 384 | 764,765 385 | 766,767 386 | 768,769 387 | 770,771 388 | 772,773 389 | 774,775 390 | 776,777 391 | 778,779 392 | 780,781 393 | 782,783 394 | 784,785 395 | 786,787 396 | 788,789 397 | 790,791 398 | 792,793 399 | 794,795 400 | 796,797 401 | 798,799 402 | 800,801 403 | 802,803 404 | 804,805 405 | 806,807 406 | 808,809 407 | 810,811 408 | 812,813 409 | 814,815 410 | 816,817 411 | 818,819 412 | 820,821 413 
| 822,823 414 | 824,825 415 | 826,827 416 | 828,829 417 | 830,831 418 | 832,833 419 | 834,835 420 | 836,837 421 | 838,839 422 | 840,841 423 | 842,843 424 | 844,845 425 | 846,847 426 | 848,849 427 | 850,851 428 | 852,853 429 | 854,855 430 | 856,857 431 | 858,859 432 | 860,861 433 | 862,863 434 | 864,865 435 | 866,867 436 | 868,869 437 | 870,871 438 | 872,873 439 | 874,875 440 | 876,877 441 | 878,879 442 | 880,881 443 | 882,883 444 | 884,885 445 | 886,887 446 | 888,889 447 | 890,891 448 | 892,893 449 | 894,895 450 | 896,897 451 | 898,899 452 | 900,901 453 | 902,903 454 | 904,905 455 | 906,907 456 | 908,909 457 | 910,911 458 | 912,913 459 | 914,915 460 | 916,917 461 | 918,919 462 | 920,921 463 | 922,923 464 | 924,925 465 | 926,927 466 | 928,929 467 | 930,931 468 | 932,933 469 | 934,935 470 | 936,937 471 | 938,939 472 | 940,941 473 | 942,943 474 | 944,945 475 | 946,947 476 | 948,949 477 | 950,951 478 | 952,953 479 | 954,955 480 | 956,957 481 | 958,959 482 | 960,961 483 | 962,963 484 | 964,965 485 | 966,967 486 | 968,969 487 | 970,971 488 | 972,973 489 | 974,975 490 | 976,977 491 | 978,979 492 | 980,981 493 | 982,983 494 | 984,985 495 | 986,987 496 | 988,989 497 | 990,991 498 | 992,993 499 | 994,995 500 | 996,997 501 | 998,999 502 | -------------------------------------------------------------------------------- /Week 01 - Introduction to Python/text_data.txt: -------------------------------------------------------------------------------- 1 | First Citizen: 2 | Before we proceed any further, hear me speak. 3 | 4 | All: 5 | Speak, speak. 6 | 7 | First Citizen: 8 | You are all resolved rather to die than to famish? 9 | 10 | All: 11 | Resolved. resolved. 12 | 13 | First Citizen: 14 | First, you know Caius Marcius is chief enemy to the people. 15 | 16 | All: 17 | We know't, we know't. 18 | 19 | First Citizen: 20 | Let us kill him, and we'll have corn at our own price. 21 | Is't a verdict? 22 | 23 | All: 24 | No more talking on't; let it be done: away, away! 
25 | 26 | Second Citizen: 27 | One word, good citizens. 28 | 29 | First Citizen: 30 | We are accounted poor citizens, the patricians good. 31 | What authority surfeits on would relieve us: if they 32 | would yield us but the superfluity, while it were 33 | wholesome, we might guess they relieved us humanely; 34 | but they think we are too dear: the leanness that 35 | afflicts us, the object of our misery, is as an 36 | inventory to particularise their abundance; our 37 | sufferance is a gain to them Let us revenge this with 38 | our pikes, ere we become rakes: for the gods know I 39 | speak this in hunger for bread, not in thirst for revenge. 40 | 41 | Second Citizen: 42 | Would you proceed especially against Caius Marcius? 43 | 44 | All: 45 | Against him first: he's a very dog to the commonalty. 46 | 47 | Second Citizen: 48 | Consider you what services he has done for his country? 49 | 50 | First Citizen: 51 | Very well; and could be content to give him good 52 | report fort, but that he pays himself with being proud. 53 | 54 | Second Citizen: 55 | Nay, but speak not maliciously. 56 | 57 | First Citizen: 58 | I say unto you, what he hath done famously, he did 59 | it to that end: though soft-conscienced men can be 60 | content to say it was for his country he did it to 61 | please his mother and to be partly proud; which he 62 | is, even till the altitude of his virtue. 63 | 64 | Second Citizen: 65 | What he cannot help in his nature, you account a 66 | vice in him. You must in no way say he is covetous. 67 | 68 | First Citizen: 69 | If I must not, I need not be barren of accusations; 70 | he hath faults, with surplus, to tire in repetition. 71 | What shouts are these? The other side o' the city 72 | is risen: why stay we prating here? to the Capitol! 73 | 74 | All: 75 | Come, come. 76 | 77 | First Citizen: 78 | Soft! who comes here? 79 | 80 | Second Citizen: 81 | Worthy Menenius Agrippa; one that hath always loved 82 | the people. 
83 | 84 | First Citizen: 85 | He's one honest enough: would all the rest were so! 86 | 87 | MENENIUS: 88 | What work's, my countrymen, in hand? where go you 89 | With bats and clubs? The matter? speak, I pray you. 90 | 91 | First Citizen: 92 | Our business is not unknown to the senate; they have 93 | had inkling this fortnight what we intend to do, 94 | which now we'll show 'em in deeds. They say poor 95 | suitors have strong breaths: they shall know we 96 | have strong arms too. 97 | 98 | MENENIUS: 99 | Why, masters, my good friends, mine honest neighbours, 100 | Will you undo yourselves? 101 | 102 | First Citizen: 103 | We cannot, sir, we are undone already. 104 | 105 | MENENIUS: 106 | I tell you, friends, most charitable care 107 | Have the patricians of you. For your wants, 108 | Your suffering in this dearth, you may as well 109 | Strike at the heaven with your staves as lift them 110 | Against the Roman state, whose course will on 111 | The way it takes, cracking ten thousand curbs 112 | Of more strong link asunder than can ever 113 | Appear in your impediment. For the dearth, 114 | The gods, not the patricians, make it, and 115 | Your knees to them, not arms, must help. Alack, 116 | You are transported by calamity 117 | Thither where more attends you, and you slander 118 | The helms o' the state, who care for you like fathers, 119 | When you curse them as enemies. 120 | 121 | First Citizen: 122 | Care for us! True, indeed! They ne'er cared for us 123 | yet: suffer us to famish, and their store-houses 124 | crammed with grain; make edicts for usury, to 125 | support usurers; repeal daily any wholesome act 126 | established against the rich, and provide more 127 | piercing statutes daily, to chain up and restrain 128 | the poor. If the wars eat us not up, they will; and 129 | there's all the love they bear us. 130 | 131 | MENENIUS: 132 | Either you must 133 | Confess yourselves wondrous malicious, 134 | Or be accused of folly. 
I shall tell you 135 | A pretty tale: it may be you have heard it; 136 | But, since it serves my purpose, I will venture 137 | To stale 't a little more. 138 | 139 | First Citizen: 140 | Well, I'll hear it, sir: yet you must not think to 141 | fob off our disgrace with a tale: but, an 't please 142 | you, deliver. 143 | 144 | MENENIUS: 145 | There was a time when all the body's members 146 | Rebell'd against the belly, thus accused it: 147 | That only like a gulf it did remain 148 | I' the midst o' the body, idle and unactive, 149 | Still cupboarding the viand, never bearing 150 | Like labour with the rest, where the other instruments 151 | Did see and hear, devise, instruct, walk, feel, 152 | And, mutually participate, did minister 153 | Unto the appetite and affection common 154 | Of the whole body. The belly answer'd-- 155 | 156 | First Citizen: 157 | Well, sir, what answer made the belly? 158 | 159 | MENENIUS: 160 | Sir, I shall tell you. With a kind of smile, 161 | Which ne'er came from the lungs, but even thus-- 162 | For, look you, I may make the belly smile 163 | As well as speak--it tauntingly replied 164 | To the discontented members, the mutinous parts 165 | That envied his receipt; even so most fitly 166 | As you malign our senators for that 167 | They are not such as you. 168 | 169 | First Citizen: 170 | Your belly's answer? What! 171 | The kingly-crowned head, the vigilant eye, 172 | The counsellor heart, the arm our soldier, 173 | Our steed the leg, the tongue our trumpeter. 174 | With other muniments and petty helps 175 | In this our fabric, if that they-- 176 | 177 | MENENIUS: 178 | What then? 179 | 'Fore me, this fellow speaks! What then? what then? 180 | 181 | First Citizen: 182 | Should by the cormorant belly be restrain'd, 183 | Who is the sink o' the body,-- 184 | 185 | MENENIUS: 186 | Well, what then? 187 | 188 | First Citizen: 189 | The former agents, if they did complain, 190 | What could the belly answer? 
191 | 192 | MENENIUS: 193 | I will tell you 194 | If you'll bestow a small--of what you have little-- 195 | Patience awhile, you'll hear the belly's answer. 196 | 197 | First Citizen: 198 | Ye're long about it. 199 | 200 | MENENIUS: 201 | Note me this, good friend; 202 | Your most grave belly was deliberate, 203 | Not rash like his accusers, and thus answer'd: 204 | 'True is it, my incorporate friends,' quoth he, 205 | 'That I receive the general food at first, 206 | Which you do live upon; and fit it is, 207 | Because I am the store-house and the shop 208 | Of the whole body: but, if you do remember, 209 | I send it through the rivers of your blood, 210 | Even to the court, the heart, to the seat o' the brain; 211 | And, through the cranks and offices of man, 212 | The strongest nerves and small inferior veins 213 | From me receive that natural competency 214 | Whereby they live: and though that all at once, 215 | You, my good friends,'--this says the belly, mark me,-- 216 | 217 | First Citizen: 218 | Ay, sir; well, well. 219 | 220 | MENENIUS: 221 | 'Though all at once cannot 222 | See what I do deliver out to each, 223 | Yet I can make my audit up, that all 224 | From me do back receive the flour of all, 225 | And leave me but the bran.' What say you to't? 226 | 227 | First Citizen: 228 | It was an answer: how apply you this? 229 | 230 | MENENIUS: 231 | The senators of Rome are this good belly, 232 | And you the mutinous members; for examine 233 | Their counsels and their cares, digest things rightly 234 | Touching the weal o' the common, you shall find 235 | No public benefit which you receive 236 | But it proceeds or comes from them to you 237 | And no way from yourselves. What do you think, 238 | You, the great toe of this assembly? 239 | 240 | First Citizen: 241 | I the great toe! why the great toe? 
242 | 243 | MENENIUS: 244 | For that, being one o' the lowest, basest, poorest, 245 | Of this most wise rebellion, thou go'st foremost: 246 | Thou rascal, that art worst in blood to run, 247 | Lead'st first to win some vantage. 248 | But make you ready your stiff bats and clubs: 249 | Rome and her rats are at the point of battle; 250 | The one side must have bale. 251 | Hail, noble Marcius! 252 | 253 | MARCIUS: 254 | Thanks. What's the matter, you dissentious rogues, 255 | That, rubbing the poor itch of your opinion, 256 | Make yourselves scabs? 257 | 258 | First Citizen: 259 | We have ever your good word. 260 | 261 | MARCIUS: 262 | He that will give good words to thee will flatter 263 | Beneath abhorring. What would you have, you curs, 264 | That like nor peace nor war? the one affrights you, 265 | The other makes you proud. He that trusts to you, 266 | Where he should find you lions, finds you hares; 267 | Where foxes, geese: you are no surer, no, 268 | Than is the coal of fire upon the ice, 269 | Or hailstone in the sun. Your virtue is 270 | To make him worthy whose offence subdues him 271 | And curse that justice did it. 272 | Who deserves greatness 273 | Deserves your hate; and your affections are 274 | A sick man's appetite, who desires most that 275 | Which would increase his evil. He that depends 276 | Upon your favours swims with fins of lead 277 | And hews down oaks with rushes. Hang ye! Trust Ye? 278 | With every minute you do change a mind, 279 | And call him noble that was now your hate, 280 | Him vile that was your garland. What's the matter, 281 | That in these several places of the city 282 | You cry against the noble senate, who, 283 | Under the gods, keep you in awe, which else 284 | Would feed on one another? What's their seeking? 285 | 286 | MENENIUS: 287 | For corn at their own rates; whereof, they say, 288 | The city is well stored. 289 | 290 | MARCIUS: 291 | Hang 'em! They say! 
292 | They'll sit by the fire, and presume to know 293 | What's done i' the Capitol; who's like to rise, 294 | Who thrives and who declines; side factions 295 | and give out 296 | Conjectural marriages; making parties strong 297 | And feebling such as stand not in their liking 298 | Below their cobbled shoes. They say there's 299 | grain enough! 300 | Would the nobility lay aside their ruth, 301 | And let me use my sword, I'll make a quarry 302 | With thousands of these quarter'd slaves, as high 303 | As I could pick my lance. 304 | 305 | MENENIUS: 306 | Nay, these are almost thoroughly persuaded; 307 | For though abundantly they lack discretion, 308 | Yet are they passing cowardly. But, I beseech you, 309 | What says the other troop? 310 | 311 | MARCIUS: 312 | They are dissolved: hang 'em! 313 | They said they were an-hungry; sigh'd forth proverbs, 314 | That hunger broke stone walls, that dogs must eat, 315 | That meat was made for mouths, that the gods sent not 316 | Corn for the rich men only: with these shreds 317 | They vented their complainings; which being answer'd, 318 | And a petition granted them, a strange one-- 319 | To break the heart of generosity, 320 | And make bold power look pale--they threw their caps 321 | As they would hang them on the horns o' the moon, 322 | Shouting their emulation. 323 | 324 | MENENIUS: 325 | What is granted them? 326 | 327 | MARCIUS: 328 | Five tribunes to defend their vulgar wisdoms, 329 | Of their own choice: one's Junius Brutus, 330 | Sicinius Velutus, and I know not--'Sdeath! 331 | The rabble should have first unroof'd the city, 332 | Ere so prevail'd with me: it will in time 333 | Win upon power and throw forth greater themes 334 | For insurrection's arguing. 335 | 336 | MENENIUS: 337 | This is strange. 338 | 339 | MARCIUS: 340 | Go, get you home, you fragments! 341 | 342 | Messenger: 343 | Where's Caius Marcius? 344 | 345 | MARCIUS: 346 | Here: what's the matter? 
347 | 348 | Messenger: 349 | The news is, sir, the Volsces are in arms. 350 | 351 | MARCIUS: 352 | I am glad on 't: then we shall ha' means to vent 353 | Our musty superfluity. See, our best elders. 354 | 355 | First Senator: 356 | Marcius, 'tis true that you have lately told us; 357 | The Volsces are in arms. 358 | 359 | MARCIUS: 360 | They have a leader, 361 | Tullus Aufidius, that will put you to 't. 362 | I sin in envying his nobility, 363 | And were I any thing but what I am, 364 | I would wish me only he. 365 | 366 | COMINIUS: 367 | You have fought together. 368 | 369 | MARCIUS: 370 | Were half to half the world by the ears and he. 371 | Upon my party, I'ld revolt to make 372 | Only my wars with him: he is a lion 373 | That I am proud to hunt. 374 | 375 | First Senator: 376 | Then, worthy Marcius, 377 | Attend upon Cominius to these wars. 378 | 379 | COMINIUS: 380 | It is your former promise. 381 | 382 | MARCIUS: 383 | Sir, it is; 384 | And I am constant. Titus Lartius, thou 385 | Shalt see me once more strike at Tullus' face. 386 | What, art thou stiff? stand'st out? 387 | 388 | TITUS: 389 | No, Caius Marcius; 390 | I'll lean upon one crutch and fight with t'other, 391 | Ere stay behind this business. 392 | 393 | MENENIUS: 394 | O, true-bred! 395 | 396 | First Senator: 397 | Your company to the Capitol; where, I know, 398 | Our greatest friends attend us. 399 | 400 | TITUS: 401 | 402 | COMINIUS: 403 | Noble Marcius! 404 | 405 | First Senator: 406 | 407 | MARCIUS: 408 | Nay, let them follow: 409 | The Volsces have much corn; take these rats thither 410 | To gnaw their garners. Worshipful mutiners, 411 | Your valour puts well forth: pray, follow. 412 | 413 | SICINIUS: 414 | Was ever man so proud as is this Marcius? 415 | 416 | BRUTUS: 417 | He has no equal. 418 | 419 | SICINIUS: 420 | When we were chosen tribunes for the people,-- 421 | 422 | BRUTUS: 423 | Mark'd you his lip and eyes? 424 | 425 | SICINIUS: 426 | Nay. but his taunts. 
427 |
428 | BRUTUS:
429 | Being moved, he will not spare to gird the gods.
430 |
431 | SICINIUS:
432 | Be-mock the modest moon.
433 |
434 | BRUTUS:
435 | The present wars devour him: he is grown
436 | Too proud to be so valiant.
--------------------------------------------------------------------------------
/Week 03 - Machine Learning Algorithms/2- Evaluation_Metrics.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {
6 | "id": "-mh4fYe9vvuk"
7 | },
8 | "source": [
9 | "# Model Evaluation\n",
10 | "\n",
11 | "\"Open\n",
12 | "\n",
13 | "\n"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "metadata": {
19 | "id": "NrP6gOUsaVsy"
20 | },
21 | "source": [
22 | "## Classifier Evaluation\n",
23 | "\n",
24 | "### True Positive (TP) \n",
25 | "\n",
26 | "The predicted value matches the actual value.\n",
27 | "The actual value was positive and the model predicted a positive value.\n",
28 | "\n",
29 | "### True Negative (TN) \n",
30 | "\n",
31 | "The predicted value matches the actual value.\n",
32 | "The actual value was negative and the model predicted a negative value.\n",
33 | "\n",
34 | "### False Positive (FP) – Type 1 error\n",
35 | "\n",
36 | "The predicted value does not match the actual value.\n",
37 | "The actual value was negative but the model predicted a positive value.\n",
38 | "Also known as a Type 1 error.\n",
39 | "\n",
40 | "### False Negative (FN) – Type 2 error\n",
41 | "\n",
42 | "The predicted value does not match the actual value.\n",
43 | "The actual value was positive but the model predicted a negative value.\n",
44 | "Also known as a Type 2 error.\n",
45 | "\n",
46 | "### Precision\n",
47 | "Precision is a good measure when the cost of a False Positive is high, for instance in email spam detection, where a false positive means that a non-spam email (actual negative) has been identified as spam (predicted spam).
The email user might lose important emails if the precision is not high for the spam detection model.\n",
48 | "\n",
49 | "$$Precision = \dfrac{TP}{TP + FP}$$\n",
50 | "\n",
51 | "### Recall\n",
52 | "Recall should be the metric we use to select our best model when there is a high cost associated with a False Negative,\n",
53 | "for instance in sick-patient detection. If a sick patient (Actual Positive) goes through the test and is predicted as not sick (Predicted Negative), the cost of that False Negative will be extremely high if the sickness is contagious.\n",
54 | "\n",
55 | "$$Recall = \dfrac{TP}{TP + FN}$$\n",
56 | "\n",
57 | "### F1 score\n",
58 | "The F1 Score might be a better measure to use if we need to seek a balance between Precision and Recall and there is an uneven class distribution (a large number of Actual Negatives).\n",
59 | "\n",
60 | "$$F1 = 2 \cdot \dfrac{Precision \cdot Recall}{Precision + Recall}$$\n",
61 | "\n",
62 | "### Confusion Matrix\n",
63 | "A confusion matrix is an N x N matrix used for evaluating the performance of a classification model, where N is the number of target classes. The matrix compares the actual target values with those predicted by the machine learning model. This gives us a holistic view of how well our classification model is performing and what kinds of errors it is making."
64 | ] 65 | }, 66 | { 67 | "cell_type": "code", 68 | "execution_count": 6, 69 | "metadata": { 70 | "id": "WVfD0etJidZq", 71 | "vscode": { 72 | "languageId": "python" 73 | } 74 | }, 75 | "outputs": [], 76 | "source": [ 77 | "from sklearn import (\n", 78 | " linear_model,\n", 79 | " metrics,\n", 80 | " datasets,\n", 81 | " model_selection,\n", 82 | " pipeline,\n", 83 | " preprocessing,\n", 84 | ")" 85 | ] 86 | }, 87 | { 88 | "cell_type": "code", 89 | "execution_count": 7, 90 | "metadata": { 91 | "id": "1LqNVHlaikt4", 92 | "vscode": { 93 | "languageId": "python" 94 | } 95 | }, 96 | "outputs": [], 97 | "source": [ 98 | "x, y = datasets.load_breast_cancer(return_X_y=True)\n", 99 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 100 | " x, y, test_size=0.1, random_state=42, stratify=y\n", 101 | ")" 102 | ] 103 | }, 104 | { 105 | "cell_type": "code", 106 | "execution_count": 8, 107 | "metadata": { 108 | "id": "510UXIxaB5zV", 109 | "vscode": { 110 | "languageId": "python" 111 | } 112 | }, 113 | "outputs": [], 114 | "source": [ 115 | "# Create the pipeline from the scaler and the estimator\n", 116 | "# Scaler\n", 117 | "scaler = preprocessing.StandardScaler()\n", 118 | "# Estimator\n", 119 | "estimator = linear_model.LogisticRegression(solver=\"liblinear\", random_state=42)\n", 120 | "# Create the pipeline\n", 121 | "pipe = pipeline.Pipeline([(\"scaler\", scaler), (\"estimator\", estimator)])" 122 | ] 123 | }, 124 | { 125 | "cell_type": "code", 126 | "execution_count": 9, 127 | "metadata": { 128 | "colab": { 129 | "base_uri": "https://localhost:8080/" 130 | }, 131 | "id": "ti3WlBgD40JL", 132 | "outputId": "b20ca8c4-4c5a-467d-acc4-e4644e663b42", 133 | "vscode": { 134 | "languageId": "python" 135 | } 136 | }, 137 | "outputs": [ 138 | { 139 | "data": { 140 | "text/plain": [ 141 | "Pipeline(steps=[('scaler', StandardScaler()),\n", 142 | " ('estimator',\n", 143 | " LogisticRegression(random_state=42, solver='liblinear'))])" 144 | ] 145 | }, 146 | 
"execution_count": 9, 147 | "metadata": {}, 148 | "output_type": "execute_result" 149 | } 150 | ], 151 | "source": [ 152 | "# fit the pipeline of the x_train and y_train\n", 153 | "pipe.fit(x_train, y_train)" 154 | ] 155 | }, 156 | { 157 | "cell_type": "code", 158 | "execution_count": 10, 159 | "metadata": { 160 | "colab": { 161 | "base_uri": "https://localhost:8080/", 162 | "height": 511 163 | }, 164 | "id": "nevs86s-i-t0", 165 | "outputId": "a13d6cb1-56b2-4eec-e9d3-7fe3df8ec492", 166 | "vscode": { 167 | "languageId": "python" 168 | } 169 | }, 170 | "outputs": [ 171 | { 172 | "name": "stdout", 173 | "output_type": "stream", 174 | "text": [ 175 | "Accuracy Score: 0.9649122807017544\n", 176 | "True Positive: 35\n", 177 | "True Negative: 20\n", 178 | "False Positive: 1\n", 179 | "False Negative: 1\n", 180 | "Precision Score: 0.9722222222222222\n", 181 | "Precision Score Manual: 0.9722222222222222\n", 182 | "Recall Score: 0.9722222222222222\n", 183 | "Recall Score Manual: 0.9722222222222222\n" 184 | ] 185 | }, 186 | { 187 | "name": "stderr", 188 | "output_type": "stream", 189 | "text": [ 190 | "/usr/local/lib/python3.7/dist-packages/sklearn/utils/deprecation.py:87: FutureWarning: Function plot_confusion_matrix is deprecated; Function `plot_confusion_matrix` is deprecated in 1.0 and will be removed in 1.2. 
Use one of the class methods: ConfusionMatrixDisplay.from_predictions or ConfusionMatrixDisplay.from_estimator.\n", 191 | " warnings.warn(msg, category=FutureWarning)\n" 192 | ] 193 | }, 194 | { 195 | "data": { 196 | "text/plain": [ 197 | "" 198 | ] 199 | }, 200 | "execution_count": 10, 201 | "metadata": {}, 202 | "output_type": "execute_result" 203 | }, 204 | { 205 | "data": { 206 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAATIAAAEKCAYAAACR79kFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAWlklEQVR4nO3debQcZZ3G8e+Te0MSspIQQthXgYgQmMg6ciCALDoH8KiAjnKUGUBEVBaJzhlBcGbQYXEcAQ2LQUEEBAZEZBHhBEYGSCBgFhCQLRAI2UhCyHLv/c0fVReamHRXJd23q/o+H06d21Xd/dYv9x6e89Zbb1UpIjAzK7M+zS7AzGx9OcjMrPQcZGZWeg4yMys9B5mZlZ6DzMxKz0FmZk0hqb+kxyQ9JWmGpO+l2ydJelHStHQZW6ut9saXa2a2RiuA8RGxVFJf4GFJv0/fOzsifpO1IQeZmTVFJLPxl6arfdNlnWboq0gz+9sGD4z2ERs1uwzLod/Ly5pdguWwnHdYGSu0Pm0cdtDAmL+gM9Nnpz69YgawvGLTxIiY2L0iqQ2YCuwAXBYR50iaBOxL0mO7H5gQESuq7adQPbL2ERux6blfa3YZlsOHTpzS7BIsh0fj/vVuY/6CTh67Z6tMn20b/dzyiBi3tvcjohMYK2kYcJukXYFvA28AGwATgXOA86vtx4P9ZpZLAF0Z/8vcZsQi4AHg8IiYE4kVwM+BvWp930FmZrkEwarozLRUI2lk2hND0gDgUOAZSaPTbQKOBqbXqqlQh5ZmVg55eltVjAauTcfJ+gA3RcSdkv4oaSQgYBpwSq2GHGRmlksQdNbhJGFEPA3ssYbt4/O25SAzs9y61m2WRMM4yMwslwA6HWRmVnbukZlZqQWwqkAT6cFBZmY5BeFDSzMruYDOYuWYg8zM8klm9heLg8zMchKdrNd153XnIDOzXJLBfgeZmZVYMo/MQWZmJdflHpmZlZl7ZGZWeoHoLNgdwBxkZpabDy3NrNQCsTLaml3GBzjIzCyXZEKsDy3NrOQ82G9mpRYhOsM9MjMruS73yMyszJLB/mJFR7GqMbPC82C/mbWETs8jM7MyK+LM/mJVY2al0BV9Mi3VSOov6TFJT0maIel76fZtJT0q6XlJN0raoFY9DjIzyyW5aLxPpqWGFcD4iNgdGAscLmkf4AfApRGxA7AQOLFWQw4yM8slEKuiLdNStZ3E0nS1b7oEMB74Tbr9WuDoWjV5jMzMcokgz4TYjSVNqVifGBETu1cktQFTgR2Ay4AXgEUR0ZF+ZDawea2dOMjMLCflmRA7LyLGre3NiOgExkoaBtwG7LwuFTnIzCyXIFePLFubEYskPQDsCwyT1J72yrYAXqv1fY+RmVlu9RjslzQy7YkhaQBwKDALeAD4dPqxE4Dba9XjHpmZ5RKoXjdWHA1cm46T9QFuiog7Jc0Efi3p+8CTwNW1GnKQmVkuyePg1j86IuJpYI81bP8rsFeethxkZpaTH9BrZiUXUHPWfk9zkJlZbu6RmVmpRcg9MjMrt2Sw309RMrNS8z37zazkksF+j5GZWckV7caKDjIzy6W
OM/vrxkFmZrn54SNmVmoRsKrLQWZmJZYcWjrIzKzkPLO/hbUvWMmmV71I2+JVIHj7gJEsOnQUfZZ2MPpnL9B33kpWbbwBc07Znq6B/tUX0RmXvMLehyxh0bx2Th6/U7PLKaQiTr9oaP9Q0uGSnk0f6zShkfsqgugDbx27BS9/f1de+c4uDHtgLhu8/i7Dfz+HZbsM4aX/+AjLdhnC8LveaHapthb33jicf/n8ts0uo+BUl8fB1VPD9pTeLO0y4AhgDHC8pDGN2l8RdA7bgBVbDwQgBrSxcvQA2heuZNCTi1i83wgAFu83gkFPLmxmmVbF9EcHsWShe8u1dKX37a+19JRG/sX2Ap5Pb5KGpF8DRwEzG7jPwmift4J+ryxj+XaDaFvcQeew5BmjnUP70ra4o8a3zYorOWvZe6613Bx4tWJ9NrD36h+SdBJwEkDbiGENLKfnaHknm13+Am8dtyVdA1b7g0sUbJzULJciToht+jnUiJgYEeMiYlzboIHNLmf9dXSx2eUvsHjv4Sz9u40A6BzSTtuilQC0LVpJ52Afuli5Fe3QspFB9hqwZcV6psc6lVoEm056mZWj+7PosE3f27x07DCG/Gk+AEP+NJ+le7RGz9N6p+6zllmWntLIrsHjwI6StiUJsOOAzzVwf03X//mlDHlkPiu2GMBW580AYP6nNmfBkaPZ7IoXGPrQPFaNSKZfWDFNuPxldtt3KUOHd3DdlJn88uJR3HPDiGaXVTi9ZkJsRHRIOg24B2gDromIGY3aXxEs33Ewf7l6zQ9Vnn225ySVwYWnbt3sEgovQnT0liADiIi7gLsauQ8z63ke7DezUqvXGJmkLSU9IGmmpBmSvp5uP0/Sa5KmpcuRtWry6TMzy61OPbIO4MyIeELSYGCqpPvS9y6NiIuyNuQgM7Nc6jWPLCLmAHPS10skzSKZf5qbDy3NLLcc88g2ljSlYjlpTe1J2gbYA3g03XSapKclXSNpo1r1uEdmZrlEQEf2GyvOi4g1n8pPSRoE3AJ8IyIWS7oCuIBkOO4C4GLgy9XacJCZWW71OmspqS9JiF0fEbcCRMSbFe9fCdxZqx0HmZnlUq8xMkkCrgZmRcQlFdtHp+NnAMcA02u15SAzs9yiPj2y/YEvAH+WNC3d9h2SW36NJTm0fAk4uVZDDjIzy60eF4RHxMOs+V4wuSfRO8jMLJeI4s3sd5CZWU6i04+DM7Oyq9MYWd04yMwslyI+RclBZmb5RDJOViQOMjPLrSdvY52Fg8zMcgkP9ptZK/ChpZmVns9amlmpRTjIzKwFePqFmZWex8jMrNQC0eWzlmZWdgXrkDnIzCwnD/abWUsoWJfMQWZmuZWmRybpv6mSuxFxekMqMrNCC6CrqyRBBkzpsSrMrDwCKEuPLCKurVyXtGFELGt8SWZWdEWbR1ZzMoikfSXNBJ5J13eXdHnDKzOz4oqMSw/JMqvtR8BhwHyAiHgKOKCRRZlZkYmIbEtPyXTWMiJeTZ6l+Z7OxpRjZqVQtkNL4FVJ+wEhqa+ks4BZDa7LzIoqILqUaalG0paSHpA0U9IMSV9Ptw+XdJ+k59KfG9UqKUuQnQJ8FdgceB0Ym66bWa+ljEtVHcCZETEG2Af4qqQxwATg/ojYEbg/Xa+q5qFlRMwDPl/rc2bWi9Th0DIi5gBz0tdLJM0i6TAdBRyYfuxa4EHgnGptZTlruZ2k30p6S9JcSbdL2m496jezsst+1nJjSVMqlpPW1JykbYA9gEeBUWnIAbwBjKpVTpbB/l8BlwHHpOvHATcAe2f4rpm1mnwTYudFxLhqH5A0CLgF+EZELK48sRgRIalm/y/LGNmGEfHLiOhIl+uA/hm+Z2YtKiLbUoukviQhdn1E3JpuflPS6PT90cDcWu2sNcjSMwfDgd9LmiBpG0lbS/oWcFftEs2sZXUp21KFkq7X1cCsiLik4q07gBPS1ycAt9cqp9qh5VSSTmR3NSdXvBfAt2s1bmatqfbBXib
7A18A/ixpWrrtO8CFwE2STgReBj5bq6Fq11puW4dCzazV1Onyo4h4mLXP0Tg4T1uZZvZL2hUYQ8XYWET8Is+OzKxVqDx3v+gm6VySOR1jSMbGjgAeBhxkZr1VCS9R+jRJN++NiPgSsDswtKFVmVmxdWVcekiWQ8t3I6JLUoekISSnQrdscF1mVlRlurFihSmShgFXkpzJXAo80tCqzKzQ6nTWsm6yXGt5avryp5LuBoZExNONLcvMCq0sQSZpz2rvRcQTjSnJzCyfaj2yi6u8F8D4OtdCv5eX8aET/cyTMrnn9Wm1P2SFsddh9XnsRmkOLSPioJ4sxMxKIqh5+VFP8wN6zSy/svTIzMzWpjSHlmZma1WwIMtyh1hJ+kdJ303Xt5K0V+NLM7PCKuFzLS8H9gWOT9eXkNwx1sx6IUX2padkObTcOyL2lPQkQEQslLRBg+sysyIr4VnLVZLaSDuKkkbSo5eDmlnRFG2wP8uh5Y+B24BNJP0byS18/r2hVZlZsRVsjCzLtZbXS5pKcisfAUdHhJ80btZb9fD4VxZZbqy4FbAM+G3ltoh4pZGFmVmBlS3IgN/x/kNI+gPbAs8CH25gXWZWYCrYKHmWQ8uPVK6nd8U4dS0fNzPrcbln9kfEE5L8lHGz3qxsh5aSzqhY7QPsCbzesIrMrNgKONifZfrF4IqlH8mY2VGNLMrMCq5O0y8kXSNprqTpFdvOk/SapGnpcmStdqr2yNKJsIMj4qzaJZlZr1G/Htkk4Cf87eMlL42Ii7I2Uu1W1+0R0SFp/3Wrz8xakajfWcuImCxpm/Vtp1qP7DGS8bBpku4AbgbeqSjg1vXduZmVUL4xso0lVd6/fmJETMzwvdMkfRGYApwZEQurfTjLWcv+wHySe/R3zycLwEFm1ltlD7J5ETEuZ+tXABeke7mA5PkhX672hWpBtkl6xnI67wdYt4KdszCzHtXABIiIN7tfS7oSuLPWd6oFWRswiA8G2Hv7yl2dmbWMRk6/kDQ6Iuakq8eQdKaqqhZkcyLi/LpUZmatpU5BJukG4ECSsbTZwLnAgZLGpnt5CTi5VjvVgqxYd04zs2KIup61PH4Nm6/O2061IDs4b2Nm1ksUbHCp2gN6F/RkIWZWHkW7RMmPgzOz/BxkZlZqPXwb6ywcZGaWi/ChpZm1AAeZmZWfg8zMSs9BZmalVsA7xDrIzCw/B5mZlV3pHgdnZrY6H1qaWbl5QqyZtQQHmZmVmWf2m1lLUFexksxBZmb5eIzMzFqBDy3NrPwcZGZWdu6RmVn5OcjMrNTq+BSlenGQmVkuRZxH1qfZBZhZCUVkW2qQdI2kuZKmV2wbLuk+Sc+lPzeq1Y6DzMxyU2RbMpgEHL7atgnA/RGxI3B/ul6VDy0b6IxLXmHvQ5awaF47J4/fqdnl2BqsXC7O/NQOrFrZh84O+Ngn3uaLZ7/BRd/YiqcfGcjAwclg0Fk/eoXtd323ydUWRB0nxEbEZEnbrLb5KODA9PW1wIPAOdXaaViQSboG+CQwNyJ2bdR+iuzeG4dzx8835uz/erXZpdha9O0X/PDmFxgwsIuOVXDG0Tvy0fGLAfjnf32dj33y7SZXWEw5Bvs3ljSlYn1iREys8Z1RETEnff0GMKrWThrZI5sE/AT4RQP3UWjTHx3EqC1WNrsMq0KCAQOT/ys7VonOVUJqclElkCPI5kXEuHXdT0SEVPsgtWFjZBExGVjQqPbN6qWzE75yyE4cu9uu7HHAEnbecxkAky4czSkH78RPz92MlSucbu8J6jbYvxZvShoNkP6cW+sLTR/sl3SSpCmSpqxiRbPLsV6orQ2u+MOzXD91Js9O25CXnunPl779Olc99Aw/vusvLFnUzk2XbdLsMguljoP9a3IHcEL6+gTg9lpfaHqQRcTEiBgXEeP60q/Z5VgvNmhoJ7vvt5THHxjMiFEdSLBBv+Djxy7g2WkbNru8YomMSw2SbgAeAXaSNFvSicCFwKGSngMOSde
r8llL69UWzW+jvT0JsRXviicmD+azX53L/DfbGTGqgwj4091D2Wan5c0utTDqOSE2Io5fy1sH52nHQdZAEy5/md32XcrQ4R1cN2Umv7x4FPfcMKLZZVmFBW/25aKvb0VXl+jqggP+YRH7HLqYb31me96e304EbP/hdzn9B3NqN9ZbRPSeGyumXcYDSU6/zgbOjYirG7W/Irrw1K2bXYLVsN2Y5Vx+31/+ZvsPb36hCdWUSLFyrHFBVqXLaGYlV7RrLX1oaWb5BNBbDi3NrIUVK8ccZGaWnw8tzaz0es1ZSzNrUX4cnJmVXTIhtlhJ5iAzs/x8z34zKzv3yMys3DxGZmbl14uutTSzFuZDSzMrNT+g18xagntkZlZ6xcoxB5mZ5aeuYh1bOsjMLJ/AE2LNrNxEeEKsmbUAB5mZlZ6DzMxKzWNkZtYK6nXWUtJLwBKgE+iIiHHr0o6DzMxyinofWh4UEfPWpwEHmZnlExRujKxPswswsxLqyrgkD+ieUrGctFpLAdwraeoa3svMPTIzyy3HPLJ5Nca9/j4iXpO0CXCfpGciYnLeetwjM7P8IrItNZuJ19Kfc4HbgL3WpRwHmZnlEwGdXdmWKiQNlDS4+zXwcWD6upTkQ0szy68+g/2jgNskQZJFv4qIu9elIQeZmeVXhyCLiL8Cu69/MQ4yM8srAN+z38zKLSCKdY2Sg8zM8glqDuT3NAeZmeVXsJn9DjIzy89BZmblVveLxtebg8zM8gnADx8xs9Jzj8zMyi181tLMSi4gPI/MzErPM/vNrPQ8RmZmpRbhs5Zm1gLcIzOzcguis7PZRXyAg8zM8vFtfMysJXj6hZmVWQDhHpmZlVr4xopm1gKKNtivKNBpVElvAS83u44G2BiY1+wiLJdW/ZttHREj16cBSXeT/H6ymBcRh6/P/rIoVJC1KklTajxt2QrGf7Ny8QN6zaz0HGRmVnoOsp4xsdkFWG7+m5WIx8jMrPTcIzOz0nOQmVnpOcgaSNLhkp6V9LykCc2ux2qTdI2kuZKmN7sWy85B1iCS2oDLgCOAMcDxksY0tyrLYBLQ8AmcVl8OssbZC3g+Iv4aESuBXwNHNbkmqyEiJgMLml2H5eMga5zNgVcr1men28yszhxkZlZ6DrLGeQ3YsmJ9i3SbmdWZg6xxHgd2lLStpA2A44A7mlyTWUtykDVIRHQApwH3ALOAmyJiRnOrslok3QA8AuwkabakE5tdk9XmS5TMrPTcIzOz0nOQmVnpOcjMrPQcZGZWeg4yMys9B1mJSOqUNE3SdEk3S9pwPdqaJOnT6eurql3QLulASfutwz5ekvQ3T9tZ2/bVPrM0577Ok3RW3hqtNTjIyuXdiBgbEbsCK4FTKt+UtE7PKY2If4qImVU+ciCQO8jMeoqDrLweAnZIe0sPSboDmCmpTdJ/Snpc0tOSTgZQ4ifp/dH+AGzS3ZCkByWNS18fLukJSU9Jul/SNiSB+c20N/gxSSMl3ZLu43FJ+6ffHSHpXkkzJF0FqNY/QtL/SJqafuek1d67NN1+v6SR6bbtJd2dfuchSTvX45dp5eYnjZdQ2vM6Arg73bQnsGtEvJiGwdsR8VFJ/YD/lXQvsAewE8m90UYBM4FrVmt3JHAlcEDa1vCIWCDpp8DSiLgo/dyvgEsj4mFJW5FcvbALcC7wcEScL+kTQJZZ8V9O9zEAeFzSLRExHxgITImIb0r6btr2aSQPBTklIp6TtDdwOTB+HX6N1kIcZOUyQNK09PVDwNUkh3yPRcSL6faPA7t1j38BQ4EdgQOAGyKiE3hd0h/X0P4+wOTutiJibfflOgQYI73X4RoiaVC6j0+l3/2dpIUZ/k2nSzomfb1lWut8oAu4Md1+HXBruo/9gJsr9t0vwz6sxTnIyuXdiBhbuSH9H/qdyk3A1yLintU+d2Qd6+gD7BMRy9dQS2aSDiQJxX0jYpmkB4H+a/l4pPtdtPrvwMxjZK3nHuArkvoCSPqQpIHAZOD
YdAxtNHDQGr77f8ABkrZNvzs83b4EGFzxuXuBr3WvSOoOlsnA59JtRwAb1ah1KLAwDbGdSXqE3foA3b3Kz5Ecsi4GXpT0mXQfkrR7jX1YL+Agaz1XkYx/PZE+QONnJD3v24Dn0vd+QXKHhw+IiLeAk0gO457i/UO73wLHdA/2A6cD49KTCTN5/+zp90iCcAbJIeYrNWq9G2iXNAu4kCRIu70D7JX+G8YD56fbPw+cmNY3A98+3PDdL8ysBbhHZmal5yAzs9JzkJlZ6TnIzKz0HGRmVnoOMjMrPQeZmZXe/wPNGIyYX5nerAAAAABJRU5ErkJggg==", 207 | "text/plain": [ 208 | "
" 209 | ] 210 | }, 211 | "metadata": { 212 | "needs_background": "light" 213 | }, 214 | "output_type": "display_data" 215 | } 216 | ], 217 | "source": [ 218 | "y_pred = pipe.predict(x_test)\n", 219 | "print(\"Accuracy Score:\", metrics.accuracy_score(y_test, y_pred))\n", 220 | "\n", 221 | "cf_matrix = metrics.confusion_matrix(y_test, y_pred)\n", 222 | "tp, tn, fp, fn = cf_matrix[1, 1], cf_matrix[0, 0], cf_matrix[0, 1], cf_matrix[1, 0]\n", 223 | "print(\"True Positive:\", tp)\n", 224 | "print(\"True Negative:\", tn)\n", 225 | "print(\"False Positive:\", fp)\n", 226 | "print(\"False Negative:\", fn)\n", 227 | "\n", 228 | "precision1 = metrics.precision_score(y_test, y_pred)\n", 229 | "precision2 = tp / (tp + fp)\n", 230 | "\n", 231 | "print(\"Precision Score:\", precision1)\n", 232 | "print(\"Precision Score Manual:\", precision2)\n", 233 | "\n", 234 | "recall1 = metrics.recall_score(y_test, y_pred)\n", 235 | "recall2 = tp / (tp + fn)\n", 236 | "print(\"Recall Score:\", recall1)\n", 237 | "print(\"Recall Score Manual:\", recall2)\n", 238 | "\n", 239 | "\n", 240 | "metrics.plot_confusion_matrix(pipe, x_test, y_test)" 241 | ] 242 | }, 243 | { 244 | "cell_type": "markdown", 245 | "metadata": { 246 | "id": "49Zi7e-89fwb" 247 | }, 248 | "source": [ 249 | "## Regression Metrics\n", 250 | "\n", 251 | "### Mean Squared Error\n", 252 | "\n", 253 | "MSE is the average of the squared error that is used as the loss function for least squares regression: It is the sum, over all the data points, of the square of the difference between the predicted and actual target variables, divided by the number of data points.\n", 254 | "\n", 255 | "$$mse = 1/m \\sum_{i=0}^{m} (\\hat{y_i} - y_i)^2$$\n", 256 | "\n", 257 | "Where m is the total number of examples being tested, and $\\hat{y}$ is the predicted label while $y$ is the actual label\n", 258 | "\n", 259 | "### Mean Absolute Error\n", 260 | "\n", 261 | "MAE is the average of the absolue errors that is used as the loss function for 
regression: it is the sum, over all the data points, of the absolute value of the difference between the predicted and actual target variables, divided by the number of data points.\n", 262 | "\n", 263 | "$$mae = \\dfrac{1}{m} \\sum_{i=1}^{m} |\\hat{y_i} - y_i|$$\n", 264 | "\n", 265 | "Where $m$ is the total number of examples being tested, $\\hat{y}$ is the predicted label, and $y$ is the actual label.\n", 266 | "\n", 267 | "\n", 268 | "### $R^2$ score\n", 269 | "The $R^2$ score typically varies between 0 and 100% (it can even be negative for models that fit worse than simply predicting the mean). It is closely related to the MSE.\n", 270 | "\n", 271 | "If it is 100%, the model explains all of the variance in the target, i.e., the predictions match the actual values exactly. A low value indicates the model explains little of the variance, which usually (though not in all cases) means the regression model is a poor fit.\n" 272 | ] 273 | }, 274 | { 275 | "cell_type": "code", 276 | "execution_count": 12, 277 | "metadata": { 278 | "colab": { 279 | "base_uri": "https://localhost:8080/" 280 | }, 281 | "id": "92KHMKxH6n72", 282 | "outputId": "dc6c4066-f8b2-4dd0-bfba-659d4590ecd9", 283 | "vscode": { 284 | "languageId": "python" 285 | } 286 | }, 287 | "outputs": [ 288 | { 289 | "name": "stderr", 290 | "output_type": "stream", 291 | "text": [ 292 | "/usr/local/lib/python3.7/dist-packages/sklearn/utils/deprecation.py:87: FutureWarning: Function load_boston is deprecated; `load_boston` is deprecated in 1.0 and will be removed in 1.2.\n", 293 | "\n", 294 | " The Boston housing prices dataset has an ethical problem.
You can refer to\n", 295 | " the documentation of this function for further details.\n", 296 | "\n", 297 | " The scikit-learn maintainers therefore strongly discourage the use of this\n", 298 | " dataset unless the purpose of the code is to study and educate about\n", 299 | " ethical issues in data science and machine learning.\n", 300 | "\n", 301 | " In this special case, you can fetch the dataset from the original\n", 302 | " source::\n", 303 | "\n", 304 | " import pandas as pd\n", 305 | " import numpy as np\n", 306 | "\n", 307 | "\n", 308 | " data_url = \"http://lib.stat.cmu.edu/datasets/boston\"\n", 309 | " raw_df = pd.read_csv(data_url, sep=\"\\s+\", skiprows=22, header=None)\n", 310 | " data = np.hstack([raw_df.values[::2, :], raw_df.values[1::2, :2]])\n", 311 | " target = raw_df.values[1::2, 2]\n", 312 | "\n", 313 | " Alternative datasets include the California housing dataset (i.e.\n", 314 | " :func:`~sklearn.datasets.fetch_california_housing`) and the Ames housing\n", 315 | " dataset. 
You can load the datasets as follows::\n", 316 | "\n", 317 | " from sklearn.datasets import fetch_california_housing\n", 318 | " housing = fetch_california_housing()\n", 319 | "\n", 320 | " for the California housing dataset and::\n", 321 | "\n", 322 | " from sklearn.datasets import fetch_openml\n", 323 | " housing = fetch_openml(name=\"house_prices\", as_frame=True)\n", 324 | "\n", 325 | " for the Ames housing dataset.\n", 326 | " \n", 327 | " warnings.warn(msg, category=FutureWarning)\n" 328 | ] 329 | } 330 | ], 331 | "source": [ 332 | "x, y = datasets.load_boston(return_X_y=True)\n", 333 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 334 | " x, y, test_size=0.1, random_state=42\n", 335 | ")" 336 | ] 337 | }, 338 | { 339 | "cell_type": "code", 340 | "execution_count": 16, 341 | "metadata": { 342 | "id": "BdT4cQBsDd-x", 343 | "vscode": { 344 | "languageId": "python" 345 | } 346 | }, 347 | "outputs": [], 348 | "source": [ 349 | "# Create the pipeline from the scaler and the estimator\n", 350 | "# Scaler\n", 351 | "scaler = preprocessing.MinMaxScaler()\n", 352 | "# Estimator\n", 353 | "estimator = linear_model.LinearRegression()\n", 354 | "# Create the pipeline\n", 355 | "pipe = pipeline.Pipeline([(\"scaler\", scaler), (\"estimator\", estimator)])" 356 | ] 357 | }, 358 | { 359 | "cell_type": "code", 360 | "execution_count": 17, 361 | "metadata": { 362 | "colab": { 363 | "base_uri": "https://localhost:8080/" 364 | }, 365 | "id": "GOOfCGA3nH_e", 366 | "outputId": "d15d42f7-9ad1-4d83-85bf-7946b8dd580f", 367 | "vscode": { 368 | "languageId": "python" 369 | } 370 | }, 371 | "outputs": [ 372 | { 373 | "data": { 374 | "text/plain": [ 375 | "Pipeline(steps=[('scaler', MinMaxScaler()), ('estimator', LinearRegression())])" 376 | ] 377 | }, 378 | "execution_count": 17, 379 | "metadata": {}, 380 | "output_type": "execute_result" 381 | } 382 | ], 383 | "source": [ 384 | "# Fit the pipeline on x_train and y_train\n", 385 | "pipe.fit(x_train,
y_train)" 386 | ] 387 | }, 388 | { 389 | "cell_type": "code", 390 | "execution_count": 18, 391 | "metadata": { 392 | "colab": { 393 | "base_uri": "https://localhost:8080/" 394 | }, 395 | "id": "PxpRftWOnMoV", 396 | "outputId": "7e71e2d9-ca8f-42d1-adfd-fa49481505fc", 397 | "vscode": { 398 | "languageId": "python" 399 | } 400 | }, 401 | "outputs": [ 402 | { 403 | "name": "stdout", 404 | "output_type": "stream", 405 | "text": [ 406 | "Mean Squared Error: 14.995852876582648\n", 407 | "Mean Absolute Error: 2.8342104578589598\n", 408 | "R2 Score: 0.7598135533532474\n" 409 | ] 410 | } 411 | ], 412 | "source": [ 413 | "y_pred = pipe.predict(x_test)\n", 414 | "print(\"Mean Squared Error:\", metrics.mean_squared_error(y_test, y_pred))\n", 415 | "print(\"Mean Absolute Error:\", metrics.mean_absolute_error(y_test, y_pred))\n", 416 | "print(\"R2 Score:\", metrics.r2_score(y_test, y_pred))" 417 | ] 418 | }, 419 | { 420 | "cell_type": "code", 421 | "execution_count": null, 422 | "metadata": { 423 | "id": "wM1uMdwJDwLF", 424 | "vscode": { 425 | "languageId": "python" 426 | } 427 | }, 428 | "outputs": [], 429 | "source": [] 430 | } 431 | ], 432 | "metadata": { 433 | "colab": { 434 | "name": "2- Evaluation_Metrics.ipynb", 435 | "provenance": [] 436 | }, 437 | "kernelspec": { 438 | "display_name": "Python 3", 439 | "name": "python3" 440 | } 441 | }, 442 | "nbformat": 4, 443 | "nbformat_minor": 0 444 | } -------------------------------------------------------------------------------- /Week 04 - Advance Machine Learning/1 - Feature Selection.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "id": "Hd4Lz8wkfcdj" 7 | }, 8 | "source": [ 9 | "# Feature Selection\n", 10 | "\n", 11 | "\n", 12 | "\"Open\n", 13 | "\n", 14 | "Feature selection is a process where you automatically or manually select those features in your data that contribute most to the prediction variable or 
output in which you are interested.\n", 15 | "\n", 16 | "There are many ways to perform feature selection. Some methods include:\n", 17 | "\n", 18 | "- Using a correlation matrix to select features that are highly correlated with the output variable\n", 19 | "- Using statistical tests (e.g., $\\chi^2$) to measure the statistical significance of each feature in relation to the label. \n", 20 | "- Using a recursive feature elimination algorithm to automatically select features that are most relevant to the output variable\n", 21 | "\n", 22 | "Feature selection is important in machine learning because it can help you reduce the amount of data you need to work with, which in turn can reduce the amount of time and resources required to train and tune your machine learning model. Additionally, by selecting only the most relevant features, you can improve the interpretability of your machine learning model and make overfitting less likely to occur.\n", 23 | "\n", 24 | "There are many feature selection techniques supported by Scikit Learn. We will go through some of them; if you wish to learn more, visit [Scikit Learn's Documentation](https://scikit-learn.org/stable/modules/feature_selection.html)."
25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": 8, 30 | "metadata": { 31 | "id": "3euJPnU0fcCU" 32 | }, 33 | "outputs": [], 34 | "source": [ 35 | "from sklearn import (\n", 36 | " datasets,\n", 37 | " model_selection,\n", 38 | " feature_selection,\n", 39 | " svm,\n", 40 | " metrics,\n", 41 | " pipeline,\n", 42 | " preprocessing,\n", 43 | ")" 44 | ] 45 | }, 46 | { 47 | "cell_type": "code", 48 | "execution_count": 9, 49 | "metadata": { 50 | "colab": { 51 | "base_uri": "https://localhost:8080/" 52 | }, 53 | "id": "4zwq1k1zfae4", 54 | "outputId": "74223840-9527-44f7-c474-f40dcb1b865f" 55 | }, 56 | "outputs": [ 57 | { 58 | "data": { 59 | "text/plain": [ 60 | "(569, 30)" 61 | ] 62 | }, 63 | "execution_count": 9, 64 | "metadata": {}, 65 | "output_type": "execute_result" 66 | } 67 | ], 68 | "source": [ 69 | "# Load the breast cancer dataset\n", 70 | "x, y = datasets.load_breast_cancer(return_X_y=True)\n", 71 | "\n", 72 | "x.shape" 73 | ] 74 | }, 75 | { 76 | "cell_type": "markdown", 77 | "metadata": { 78 | "id": "U5FBvhd5-EBE" 79 | }, 80 | "source": [ 81 | "## Feature Selection Methods" 82 | ] 83 | }, 84 | { 85 | "cell_type": "markdown", 86 | "metadata": { 87 | "id": "iZStJ23HkfrE" 88 | }, 89 | "source": [ 90 | "### Univariate Feature Selection\n", 91 | "\n", 92 | "Univariate feature selection works by selecting the best features based on univariate statistical tests.
It can be seen as a preprocessing step to an estimator.\n", 93 | "\n", 94 | "We will be using the `SelectKBest` class from Scikit Learn, which returns the best $k$ features using some scoring method based on a statistical significance test like $\\chi^2$ (chi-square).\n", 95 | "\n", 96 | "Scikit Learn includes additional scoring functions like:\n", 97 | "\n", 98 | "- For classification: `chi2`, `f_classif`, `mutual_info_classif`\n", 99 | "- For regression: `f_regression`, `mutual_info_regression`\n", 100 | "\n" 101 | ] 102 | }, 103 | { 104 | "cell_type": "code", 105 | "execution_count": 10, 106 | "metadata": { 107 | "colab": { 108 | "base_uri": "https://localhost:8080/" 109 | }, 110 | "id": "zdIpHV6nwaEb", 111 | "outputId": "da8ea4e4-d5fb-4f37-eea6-9461b7a9b4d9" 112 | }, 113 | "outputs": [ 114 | { 115 | "data": { 116 | "text/plain": [ 117 | "(569, 10)" 118 | ] 119 | }, 120 | "execution_count": 10, 121 | "metadata": {}, 122 | "output_type": "execute_result" 123 | } 124 | ], 125 | "source": [ 126 | "# Define the feature selector by selecting the scoring function and the number of features to select\n", 127 | "feature_selector = feature_selection.SelectKBest(feature_selection.chi2, k=10)\n", 128 | "\n", 129 | "# Use fit_transform to train the feature selector and return the best 10 features\n", 130 | "x_new = feature_selector.fit_transform(x, y)\n", 131 | "\n", 132 | "# New X will have 10 features instead of 30\n", 133 | "x_new.shape" 134 | ] 135 | }, 136 | { 137 | "cell_type": "markdown", 138 | "metadata": { 139 | "id": "ei4QcwSF0v6w" 140 | }, 141 | "source": [ 142 | "## Training & Evaluating" 143 | ] 144 | }, 145 | { 146 | "cell_type": "code", 147 | "execution_count": 11, 148 | "metadata": { 149 | "id": "G6JJ70wZ0v6x" 150 | }, 151 | "outputs": [], 152 | "source": [ 153 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 154 | " x_new, y, test_size=0.3, stratify=y, random_state=42\n", 155 | ")" 156 | ] 157 | }, 158 | { 159 | "cell_type": "code",
160 | "execution_count": 12, 161 | "metadata": { 162 | "colab": { 163 | "base_uri": "https://localhost:8080/" 164 | }, 165 | "id": "szO8ej9U0v6y", 166 | "outputId": "b497ea9b-99ba-4fd0-a69e-fe6eaad73ce0" 167 | }, 168 | "outputs": [ 169 | { 170 | "name": "stdout", 171 | "output_type": "stream", 172 | "text": [ 173 | "Training model on 10 features\n" 174 | ] 175 | }, 176 | { 177 | "name": "stderr", 178 | "output_type": "stream", 179 | "text": [ 180 | "/usr/local/lib/python3.7/dist-packages/sklearn/svm/_base.py:289: ConvergenceWarning: Solver terminated early (max_iter=10). Consider pre-processing your data with StandardScaler or MinMaxScaler.\n", 181 | " ConvergenceWarning,\n" 182 | ] 183 | }, 184 | { 185 | "data": { 186 | "text/plain": [ 187 | "Pipeline(steps=[('scaler', StandardScaler()),\n", 188 | " ('model', SVC(max_iter=10, random_state=10))])" 189 | ] 190 | }, 191 | "execution_count": 12, 192 | "metadata": {}, 193 | "output_type": "execute_result" 194 | } 195 | ], 196 | "source": [ 197 | "print(f\"Training model on {x_train.shape[1]} features\")\n", 198 | "\n", 199 | "# Define a Support Vector Machine classifier with default configuration\n", 200 | "model = pipeline.Pipeline(\n", 201 | " [\n", 202 | " (\"scaler\", preprocessing.StandardScaler()),\n", 203 | " (\"model\", svm.SVC(kernel=\"rbf\", random_state=10, max_iter=10)),\n", 204 | " ]\n", 205 | ")\n", 206 | "\n", 207 | "# Train model using the training dataset\n", 208 | "model.fit(x_train, y_train)" 209 | ] 210 | }, 211 | { 212 | "cell_type": "code", 213 | "execution_count": 13, 214 | "metadata": { 215 | "colab": { 216 | "base_uri": "https://localhost:8080/" 217 | }, 218 | "id": "oq-XRDaP0v6z", 219 | "outputId": "2c0d747a-6e0e-47ae-e877-5285ebdaf2d9" 220 | }, 221 | "outputs": [ 222 | { 223 | "name": "stdout", 224 | "output_type": "stream", 225 | "text": [ 226 | "Accuracy: 0.7777777777777778\n", 227 | "Precision: 0.8876404494382022\n", 228 | "Recall: 0.7383177570093458\n" 229 | ] 230 | } 231 | ], 232 | 
"source": [ 233 | "# Use the model to predict the testing set to prepare for calculating the metrics\n", 234 | "y_pred = model.predict(x_test)\n", 235 | "\n", 236 | "# Calculate and print relevant scores\n", 237 | "accuracy = metrics.accuracy_score(y_test, y_pred)\n", 238 | "precision = metrics.precision_score(y_test, y_pred)\n", 239 | "recall = metrics.recall_score(y_test, y_pred)\n", 240 | "\n", 241 | "print(\"Accuracy:\", accuracy)\n", 242 | "print(\"Precision:\", precision)\n", 243 | "print(\"Recall:\", recall)" 244 | ] 245 | }, 246 | { 247 | "cell_type": "markdown", 248 | "metadata": { 249 | "id": "gkav1r_6QLHz" 250 | }, 251 | "source": [ 252 | "# Feature selection using Select From Model" 253 | ] 254 | }, 255 | { 256 | "cell_type": "markdown", 257 | "metadata": { 258 | "id": "eF3dd6CnQVuS" 259 | }, 260 | "source": [ 261 | "[SelectFromModel](https://scikit-learn.org/stable/modules/generated/sklearn.feature_selection.SelectFromModel.html#sklearn.feature_selection.SelectFromModel) is a meta-transformer that can be used alongside any estimator that assigns importance to each feature through a specific attribute (such as coef_, feature_importances_) or via an importance_getter callable after fitting. \n", 262 | "\n", 263 | "The features are considered unimportant and removed if the corresponding importance of the feature values are below the provided threshold parameter. \n", 264 | "\n", 265 | "Apart from specifying the threshold numerically, there are built-in heuristics for finding a threshold using a string argument. Available heuristics are “mean”, “median” and float multiples of these like “0.1*mean”. \n", 266 | "\n", 267 | "In combination with the threshold criteria, one can use the max_features parameter to set a limit on the number of features to select." 
268 | ] 269 | }, 270 | { 271 | "cell_type": "markdown", 272 | "metadata": { 273 | "id": "jQ7k6NsdV216" 274 | }, 275 | "source": [ 276 | "## Define and Train the Linear Model" 277 | ] 278 | }, 279 | { 280 | "cell_type": "code", 281 | "execution_count": 14, 282 | "metadata": { 283 | "colab": { 284 | "base_uri": "https://localhost:8080/" 285 | }, 286 | "id": "2ks-dw3vT7tx", 287 | "outputId": "4c423421-a540-4cc8-8614-ba96ce3c6b58" 288 | }, 289 | "outputs": [ 290 | { 291 | "name": "stderr", 292 | "output_type": "stream", 293 | "text": [ 294 | "/usr/local/lib/python3.7/dist-packages/sklearn/svm/_base.py:1208: ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.\n", 295 | " ConvergenceWarning,\n" 296 | ] 297 | }, 298 | { 299 | "data": { 300 | "text/plain": [ 301 | "LinearSVC()" 302 | ] 303 | }, 304 | "execution_count": 14, 305 | "metadata": {}, 306 | "output_type": "execute_result" 307 | } 308 | ], 309 | "source": [ 310 | "lsvc = svm.LinearSVC()\n", 311 | "lsvc.fit(x, y)" 312 | ] 313 | }, 314 | { 315 | "cell_type": "markdown", 316 | "metadata": { 317 | "id": "MzkHBxM1V_eN" 318 | }, 319 | "source": [ 320 | "## Use the Trained Linear Model to Get the Best Features" 321 | ] 322 | }, 323 | { 324 | "cell_type": "code", 325 | "execution_count": 16, 326 | "metadata": { 327 | "colab": { 328 | "base_uri": "https://localhost:8080/" 329 | }, 330 | "id": "a7YXLQWLUBup", 331 | "outputId": "870be180-5c1b-4c4d-b713-7f3f9135147b" 332 | }, 333 | "outputs": [ 334 | { 335 | "data": { 336 | "text/plain": [ 337 | "(569, 8)" 338 | ] 339 | }, 340 | "execution_count": 16, 341 | "metadata": {}, 342 | "output_type": "execute_result" 343 | } 344 | ], 345 | "source": [ 346 | "feature_selector = feature_selection.SelectFromModel(lsvc, prefit=True)\n", 347 | "x_new = feature_selector.transform(\n", 348 | " x\n", 349 | ") # get the best features using the pretrained model\n", 350 | "x_new.shape" 351 | ] 352 | }, 353 | { 354 | "cell_type": "markdown", 355 | "metadata": { 
356 | "id": "3dugS7ARWgKE" 357 | }, 358 | "source": [ 359 | "## Split the Datase to Train and Test " 360 | ] 361 | }, 362 | { 363 | "cell_type": "code", 364 | "execution_count": 17, 365 | "metadata": { 366 | "id": "0VRTE-r8UPtb" 367 | }, 368 | "outputs": [], 369 | "source": [ 370 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 371 | " x_new, y, test_size=0.3, stratify=y, random_state=42\n", 372 | ")" 373 | ] 374 | }, 375 | { 376 | "cell_type": "markdown", 377 | "metadata": { 378 | "id": "nN_DtPG8Vrc9" 379 | }, 380 | "source": [ 381 | "## Training & Evaluating" 382 | ] 383 | }, 384 | { 385 | "cell_type": "code", 386 | "execution_count": 20, 387 | "metadata": { 388 | "colab": { 389 | "base_uri": "https://localhost:8080/" 390 | }, 391 | "id": "T9O9CXCCU9ll", 392 | "outputId": "d0dec91b-ff98-4be6-d71a-5f29778877da" 393 | }, 394 | "outputs": [ 395 | { 396 | "name": "stdout", 397 | "output_type": "stream", 398 | "text": [ 399 | "Training model on 8 features\n" 400 | ] 401 | }, 402 | { 403 | "name": "stderr", 404 | "output_type": "stream", 405 | "text": [ 406 | "/usr/local/lib/python3.7/dist-packages/sklearn/svm/_base.py:289: ConvergenceWarning: Solver terminated early (max_iter=10). 
Consider pre-processing your data with StandardScaler or MinMaxScaler.\n", 407 | " ConvergenceWarning,\n" 408 | ] 409 | }, 410 | { 411 | "data": { 412 | "text/plain": [ 413 | "Pipeline(steps=[('scaler', StandardScaler()),\n", 414 | " ('model', SVC(max_iter=10, random_state=10))])" 415 | ] 416 | }, 417 | "execution_count": 20, 418 | "metadata": {}, 419 | "output_type": "execute_result" 420 | } 421 | ], 422 | "source": [ 423 | "print(f\"Training model on {x_train.shape[1]} features\")\n", 424 | "\n", 425 | "# Define a pipeline of a standard scaler and a Support Vector Machine classifier\n", 426 | "model = pipeline.Pipeline(\n", 427 | " [\n", 428 | " (\"scaler\", preprocessing.StandardScaler()),\n", 429 | " (\"model\", svm.SVC(kernel=\"rbf\", random_state=10, max_iter=10)),\n", 430 | " ]\n", 431 | ")\n", 432 | "\n", 433 | "# Train model using the training dataset\n", 434 | "model.fit(x_train, y_train)" 435 | ] 436 | }, 437 | { 438 | "cell_type": "code", 439 | "execution_count": 21, 440 | "metadata": { 441 | "colab": { 442 | "base_uri": "https://localhost:8080/" 443 | }, 444 | "id": "EH4fPo7eVCF5", 445 | "outputId": "471791b6-c918-4963-aed8-0dd046ab5d6c" 446 | }, 447 | "outputs": [ 448 | { 449 | "name": "stdout", 450 | "output_type": "stream", 451 | "text": [ 452 | "Accuracy: 0.9298245614035088\n", 453 | "Precision: 0.9279279279279279\n", 454 | "Recall: 0.9626168224299065\n" 455 | ] 456 | } 457 | ], 458 | "source": [ 459 | "# Use the model to predict the testing set to prepare for calculating the metrics\n", 460 | "y_pred = model.predict(x_test)\n", 461 | "\n", 462 | "# Calculate and print relevant scores\n", 463 | "accuracy = metrics.accuracy_score(y_test, y_pred)\n", 464 | "precision = metrics.precision_score(y_test, y_pred)\n", 465 | "recall = metrics.recall_score(y_test, y_pred)\n", 466 | "\n", 467 | "print(\"Accuracy:\", accuracy)\n", 468 | "print(\"Precision:\", precision)\n", 469 | "print(\"Recall:\", recall)" 470 | ] 471 | }, 472 | { 473 | "cell_type": "markdown", 474 |
"metadata": { 475 | "id": "OCF8uyvr5DKS" 476 | }, 477 | "source": [ 478 | "# Recursive Feature Elimination\n", 479 | "\n", 480 | "Given an external estimator that assigns weights to features (e.g., the coefficients/weights of a linear model), the goal of recursive feature elimination (RFE) is to select features by recursively considering smaller and smaller sets of features. First, the estimator is trained on the initial set of features, and the importance of each feature is obtained through a specific attribute (such as `coef_` or `feature_importances_`). Then, the least important features are pruned from the current set of features. This procedure is repeated recursively on the pruned set until the desired number of features is reached.\n", 481 | "\n" 482 | ] 483 | }, 484 | { 485 | "cell_type": "code", 486 | "execution_count": 26, 487 | "metadata": { 488 | "colab": { 489 | "base_uri": "https://localhost:8080/" 490 | }, 491 | "id": "l6AXNXt0805y", 492 | "outputId": "f5ae1bdf-858c-4542-f403-e88bdef48659" 493 | }, 494 | "outputs": [ 495 | { 496 | "data": { 497 | "text/plain": [ 498 | "(569, 12)" 499 | ] 500 | }, 501 | "execution_count": 26, 502 | "metadata": {}, 503 | "output_type": "execute_result" 504 | } 505 | ], 506 | "source": [ 507 | "# We use a linear kernel because RFE requires a model with a coef_ or feature_importances_ attribute\n", 508 | "svc = svm.SVC(kernel=\"linear\")\n", 509 | "\n", 510 | "# Define the feature selector by selecting the estimator, the number of features to keep, and the number of features to remove in each iteration (step)\n", 511 | "rfe = feature_selection.RFE(estimator=svc, n_features_to_select=12, step=1)\n", 512 | "\n", 513 | "# Use fit_transform to train the feature selector and return the best 12 features\n", 514 | "x_new = rfe.fit_transform(x, y)\n", 515 | "\n", 516 | "# New X will have 12 features instead of 30\n", 517 | "x_new.shape" 518 | ] 519 | }, 520 | { 521 | "cell_type": "markdown", 522 | 
"metadata": { 523 | "id": "emX3sGCwyc92" 524 | }, 525 | "source": [ 526 | "## Training & Evaluating" 527 | ] 528 | }, 529 | { 530 | "cell_type": "code", 531 | "execution_count": 27, 532 | "metadata": { 533 | "id": "UoVEg8slyZTo" 534 | }, 535 | "outputs": [], 536 | "source": [ 537 | "use_pruned_features = True  # @param {type:\"boolean\"}\n", 538 | "x_final = x_new if use_pruned_features else x\n", 539 | "\n", 540 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 541 | "    x_final, y, test_size=0.3, stratify=y, random_state=42\n", 542 | ")" 543 | ] 544 | }, 545 | { 546 | "cell_type": "code", 547 | "execution_count": 28, 548 | "metadata": { 549 | "colab": { 550 | "base_uri": "https://localhost:8080/" 551 | }, 552 | "id": "fhKL3ElPyuBu", 553 | "outputId": "860155e7-2d7a-4003-cc80-46b9003dc61c" 554 | }, 555 | "outputs": [ 556 | { 557 | "name": "stdout", 558 | "output_type": "stream", 559 | "text": [ 560 | "Training model on 12 features\n" 561 | ] 562 | }, 563 | { 564 | "name": "stderr", 565 | "output_type": "stream", 566 | "text": [ 567 | "/usr/local/lib/python3.7/dist-packages/sklearn/svm/_base.py:289: ConvergenceWarning: Solver terminated early (max_iter=10). 
Consider pre-processing your data with StandardScaler or MinMaxScaler.\n", 568 | "  ConvergenceWarning,\n" 569 | ] 570 | }, 571 | { 572 | "data": { 573 | "text/plain": [ 574 | "Pipeline(steps=[('scaler', StandardScaler()),\n", 575 | "                ('model', SVC(max_iter=10, random_state=10))])" 576 | ] 577 | }, 578 | "execution_count": 28, 579 | "metadata": {}, 580 | "output_type": "execute_result" 581 | } 582 | ], 583 | "source": [ 584 | "print(f\"Training model on {x_train.shape[1]} features\")\n", 585 | "\n", 586 | "# Define a pipeline that scales the features, then trains an SVM classifier (RBF kernel, max_iter capped at 10)\n", 587 | "model = pipeline.Pipeline(\n", 588 | "    [\n", 589 | "        (\"scaler\", preprocessing.StandardScaler()),\n", 590 | "        (\"model\", svm.SVC(kernel=\"rbf\", random_state=10, max_iter=10)),\n", 591 | "    ]\n", 592 | ")\n", 593 | "\n", 594 | "# Train model using the training dataset\n", 595 | "model.fit(x_train, y_train)" 596 | ] 597 | }, 598 | { 599 | "cell_type": "code", 600 | "execution_count": 29, 601 | "metadata": { 602 | "colab": { 603 | "base_uri": "https://localhost:8080/" 604 | }, 605 | "id": "9dusZZULy_Jq", 606 | "outputId": "41ea1080-e6a6-4ea7-b12a-bf99cf65981e" 607 | }, 608 | "outputs": [ 609 | { 610 | "name": "stdout", 611 | "output_type": "stream", 612 | "text": [ 613 | "Accuracy: 0.935672514619883\n", 614 | "Precision: 0.9363636363636364\n", 615 | "Recall: 0.9626168224299065\n" 616 | ] 617 | } 618 | ], 619 | "source": [ 620 | "# Predict on the test set to compute the evaluation metrics\n", 621 | "y_pred = model.predict(x_test)\n", 622 | "\n", 623 | "# Calculate and print relevant scores\n", 624 | "accuracy = metrics.accuracy_score(y_test, y_pred)\n", 625 | "precision = metrics.precision_score(y_test, y_pred)\n", 626 | "recall = metrics.recall_score(y_test, y_pred)\n", 627 | "\n", 628 | "print(\"Accuracy:\", accuracy)\n", 629 | "print(\"Precision:\", precision)\n", 630 | "print(\"Recall:\", recall)" 631 | ] 632 | }, 633 | { 634 | "cell_type": "markdown", 635 | 
"metadata": { 636 | "id": "iav2Q9et3SV6" 637 | }, 638 | "source": [ 639 | "# Recursive Feature Elimination CV" 640 | ] 641 | }, 642 | { 643 | "cell_type": "markdown", 644 | "metadata": { 645 | "id": "l4WwtQqk3oDs" 646 | }, 647 | "source": [ 648 | "Recursive feature elimination with cross-validation (RFECV) works like RFE but uses cross-validation scores to select the number of features automatically, instead of requiring it to be specified in advance.\n", 649 | "\n" 650 | ] 651 | }, 652 | { 653 | "cell_type": "code", 654 | "execution_count": 35, 655 | "metadata": { 656 | "colab": { 657 | "base_uri": "https://localhost:8080/" 658 | }, 659 | "id": "A-nAsnPC3UWe", 660 | "outputId": "667840db-f73b-475a-e585-7f8aa0d9159a" 661 | }, 662 | "outputs": [ 663 | { 664 | "data": { 665 | "text/plain": [ 666 | "(569, 9)" 667 | ] 668 | }, 669 | "execution_count": 35, 670 | "metadata": {}, 671 | "output_type": "execute_result" 672 | } 673 | ], 674 | "source": [ 675 | "# We use a linear kernel because RFECV requires a model with a coef_ or feature_importances_ attribute\n", 676 | "svc = svm.SVC(kernel=\"linear\")\n", 677 | "\n", 678 | "# Define the feature selector by passing the estimator and the metric used to score candidate feature subsets; RFECV picks the number of features automatically\n", 679 | "feature_selector = feature_selection.RFECV(\n", 680 | "    svc, scoring=metrics.make_scorer(metrics.precision_score)\n", 681 | ")\n", 682 | "# Fit the feature selector, then transform x to keep only the selected features\n", 683 | "feature_selector.fit(x, y)\n", 684 | "\n", 685 | "new_x = feature_selector.transform(x)\n", 686 | "\n", 687 | "# New X will have 9 features instead of 30\n", 688 | "new_x.shape" 689 | ] 690 | }, 691 | { 692 | "cell_type": "code", 693 | "execution_count": 36, 694 | "metadata": { 695 | "id": "uBFe87Ur3XVK" 696 | }, 697 | "outputs": [], 698 | "source": [ 699 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 700 | "    new_x, y, test_size=0.2, random_state=42, stratify=y\n", 701 | ")" 702 | ] 703 | }, 704 | { 705 | "cell_type": "markdown", 706 | "metadata": { 707 | "id": "j18PyPYG4LiY" 708 | }, 709 | 
"source": [ 710 | "## Training & Evaluating" 711 | ] 712 | }, 713 | { 714 | "cell_type": "code", 715 | "execution_count": 39, 716 | "metadata": { 717 | "colab": { 718 | "base_uri": "https://localhost:8080/" 719 | }, 720 | "id": "AqianVCP3Y0s", 721 | "outputId": "b2319989-1fc6-431d-8eb0-204cf9b8cc2b" 722 | }, 723 | "outputs": [ 724 | { 725 | "name": "stderr", 726 | "output_type": "stream", 727 | "text": [ 728 | "/usr/local/lib/python3.7/dist-packages/sklearn/svm/_base.py:289: ConvergenceWarning: Solver terminated early (max_iter=10). Consider pre-processing your data with StandardScaler or MinMaxScaler.\n", 729 | "  ConvergenceWarning,\n" 730 | ] 731 | }, 732 | { 733 | "data": { 734 | "text/plain": [ 735 | "Pipeline(steps=[('scaler', StandardScaler()),\n", 736 | "                ('model', SVC(max_iter=10, random_state=10))])" 737 | ] 738 | }, 739 | "execution_count": 39, 740 | "metadata": {}, 741 | "output_type": "execute_result" 742 | } 743 | ], 744 | "source": [ 745 | "# Define a pipeline that scales the features, then trains an SVM classifier (RBF kernel, max_iter capped at 10)\n", 746 | "model = pipeline.Pipeline(\n", 747 | "    [\n", 748 | "        (\"scaler\", preprocessing.StandardScaler()),\n", 749 | "        (\"model\", svm.SVC(kernel=\"rbf\", random_state=10, max_iter=10)),\n", 750 | "    ]\n", 751 | ")\n", 752 | "# Train model using the training dataset\n", 753 | "model.fit(x_train, y_train)" 754 | ] 755 | }, 756 | { 757 | "cell_type": "code", 758 | "execution_count": 40, 759 | "metadata": { 760 | "colab": { 761 | "base_uri": "https://localhost:8080/" 762 | }, 763 | "id": "qEF83uCX3dOQ", 764 | "outputId": "0b999914-1ad7-4cf5-811f-54deace61ebd" 765 | }, 766 | "outputs": [ 767 | { 768 | "name": "stdout", 769 | "output_type": "stream", 770 | "text": [ 771 | "Accuracy: 0.9210526315789473\n", 772 | "Precision: 0.9565217391304348\n", 773 | "Recall: 0.9166666666666666\n" 774 | ] 775 | } 776 | ], 777 | "source": [ 778 | "# Predict on the test set to compute the evaluation metrics\n", 779 | "y_pred = 
model.predict(x_test)\n", 780 | "\n", 781 | "# Calculate and print relevant scores\n", 782 | "accuracy = metrics.accuracy_score(y_test, y_pred)\n", 783 | "precision = metrics.precision_score(y_test, y_pred)\n", 784 | "recall = metrics.recall_score(y_test, y_pred)\n", 785 | "\n", 786 | "print(\"Accuracy:\", accuracy)\n", 787 | "print(\"Precision:\", precision)\n", 788 | "print(\"Recall:\", recall)" 789 | ] 790 | }, 791 | { 792 | "cell_type": "code", 793 | "execution_count": null, 794 | "metadata": { 795 | "id": "lA9m72tq3eol" 796 | }, 797 | "outputs": [], 798 | "source": [] 799 | } 800 | ], 801 | "metadata": { 802 | "colab": { 803 | "collapsed_sections": [ 804 | "Hd4Lz8wkfcdj", 805 | "gkav1r_6QLHz" 806 | ], 807 | "name": "1 - Feature Selection.ipynb", 808 | "provenance": [], 809 | "toc_visible": true 810 | }, 811 | "kernelspec": { 812 | "display_name": "Python 3.9.13 64-bit", 813 | "language": "python", 814 | "name": "python3" 815 | }, 816 | "language_info": { 817 | "codemirror_mode": { 818 | "name": "ipython", 819 | "version": 3 820 | }, 821 | "file_extension": ".py", 822 | "mimetype": "text/x-python", 823 | "name": "python", 824 | "nbconvert_exporter": "python", 825 | "pygments_lexer": "ipython3", 826 | "version": "3.9.13" 827 | }, 828 | "vscode": { 829 | "interpreter": { 830 | "hash": "b667cebad148e7b094a58ee81f940c685de1dd70a003a9ccdca4a5792431bee5" 831 | } 832 | } 833 | }, 834 | "nbformat": 4, 835 | "nbformat_minor": 0 836 | } -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | GNU GENERAL PUBLIC LICENSE 2 | Version 3, 29 June 2007 3 | 4 | Copyright (C) 2007 Free Software Foundation, Inc. 5 | Everyone is permitted to copy and distribute verbatim copies 6 | of this license document, but changing it is not allowed. 7 | 8 | Preamble 9 | 10 | The GNU General Public License is a free, copyleft license for 11 | software and other kinds of works. 
12 | 13 | The licenses for most software and other practical works are designed 14 | to take away your freedom to share and change the works. By contrast, 15 | the GNU General Public License is intended to guarantee your freedom to 16 | share and change all versions of a program--to make sure it remains free 17 | software for all its users. We, the Free Software Foundation, use the 18 | GNU General Public License for most of our software; it applies also to 19 | any other work released this way by its authors. You can apply it to 20 | your programs, too. 21 | 22 | When we speak of free software, we are referring to freedom, not 23 | price. Our General Public Licenses are designed to make sure that you 24 | have the freedom to distribute copies of free software (and charge for 25 | them if you wish), that you receive source code or can get it if you 26 | want it, that you can change the software or use pieces of it in new 27 | free programs, and that you know you can do these things. 28 | 29 | To protect your rights, we need to prevent others from denying you 30 | these rights or asking you to surrender the rights. Therefore, you have 31 | certain responsibilities if you distribute copies of the software, or if 32 | you modify it: responsibilities to respect the freedom of others. 33 | 34 | For example, if you distribute copies of such a program, whether 35 | gratis or for a fee, you must pass on to the recipients the same 36 | freedoms that you received. You must make sure that they, too, receive 37 | or can get the source code. And you must show them these terms so they 38 | know their rights. 39 | 40 | Developers that use the GNU GPL protect your rights with two steps: 41 | (1) assert copyright on the software, and (2) offer you this License 42 | giving you legal permission to copy, distribute and/or modify it. 43 | 44 | For the developers' and authors' protection, the GPL clearly explains 45 | that there is no warranty for this free software. 
For both users' and 46 | authors' sake, the GPL requires that modified versions be marked as 47 | changed, so that their problems will not be attributed erroneously to 48 | authors of previous versions. 49 | 50 | Some devices are designed to deny users access to install or run 51 | modified versions of the software inside them, although the manufacturer 52 | can do so. This is fundamentally incompatible with the aim of 53 | protecting users' freedom to change the software. The systematic 54 | pattern of such abuse occurs in the area of products for individuals to 55 | use, which is precisely where it is most unacceptable. Therefore, we 56 | have designed this version of the GPL to prohibit the practice for those 57 | products. If such problems arise substantially in other domains, we 58 | stand ready to extend this provision to those domains in future versions 59 | of the GPL, as needed to protect the freedom of users. 60 | 61 | Finally, every program is threatened constantly by software patents. 62 | States should not allow patents to restrict development and use of 63 | software on general-purpose computers, but in those that do, we wish to 64 | avoid the special danger that patents applied to a free program could 65 | make it effectively proprietary. To prevent this, the GPL assures that 66 | patents cannot be used to render the program non-free. 67 | 68 | The precise terms and conditions for copying, distribution and 69 | modification follow. 70 | 71 | TERMS AND CONDITIONS 72 | 73 | 0. Definitions. 74 | 75 | "This License" refers to version 3 of the GNU General Public License. 76 | 77 | "Copyright" also means copyright-like laws that apply to other kinds of 78 | works, such as semiconductor masks. 79 | 80 | "The Program" refers to any copyrightable work licensed under this 81 | License. Each licensee is addressed as "you". "Licensees" and 82 | "recipients" may be individuals or organizations. 
83 | 84 | To "modify" a work means to copy from or adapt all or part of the work 85 | in a fashion requiring copyright permission, other than the making of an 86 | exact copy. The resulting work is called a "modified version" of the 87 | earlier work or a work "based on" the earlier work. 88 | 89 | A "covered work" means either the unmodified Program or a work based 90 | on the Program. 91 | 92 | To "propagate" a work means to do anything with it that, without 93 | permission, would make you directly or secondarily liable for 94 | infringement under applicable copyright law, except executing it on a 95 | computer or modifying a private copy. Propagation includes copying, 96 | distribution (with or without modification), making available to the 97 | public, and in some countries other activities as well. 98 | 99 | To "convey" a work means any kind of propagation that enables other 100 | parties to make or receive copies. Mere interaction with a user through 101 | a computer network, with no transfer of a copy, is not conveying. 102 | 103 | An interactive user interface displays "Appropriate Legal Notices" 104 | to the extent that it includes a convenient and prominently visible 105 | feature that (1) displays an appropriate copyright notice, and (2) 106 | tells the user that there is no warranty for the work (except to the 107 | extent that warranties are provided), that licensees may convey the 108 | work under this License, and how to view a copy of this License. If 109 | the interface presents a list of user commands or options, such as a 110 | menu, a prominent item in the list meets this criterion. 111 | 112 | 1. Source Code. 113 | 114 | The "source code" for a work means the preferred form of the work 115 | for making modifications to it. "Object code" means any non-source 116 | form of a work. 
117 | 118 | A "Standard Interface" means an interface that either is an official 119 | standard defined by a recognized standards body, or, in the case of 120 | interfaces specified for a particular programming language, one that 121 | is widely used among developers working in that language. 122 | 123 | The "System Libraries" of an executable work include anything, other 124 | than the work as a whole, that (a) is included in the normal form of 125 | packaging a Major Component, but which is not part of that Major 126 | Component, and (b) serves only to enable use of the work with that 127 | Major Component, or to implement a Standard Interface for which an 128 | implementation is available to the public in source code form. A 129 | "Major Component", in this context, means a major essential component 130 | (kernel, window system, and so on) of the specific operating system 131 | (if any) on which the executable work runs, or a compiler used to 132 | produce the work, or an object code interpreter used to run it. 133 | 134 | The "Corresponding Source" for a work in object code form means all 135 | the source code needed to generate, install, and (for an executable 136 | work) run the object code and to modify the work, including scripts to 137 | control those activities. However, it does not include the work's 138 | System Libraries, or general-purpose tools or generally available free 139 | programs which are used unmodified in performing those activities but 140 | which are not part of the work. For example, Corresponding Source 141 | includes interface definition files associated with source files for 142 | the work, and the source code for shared libraries and dynamically 143 | linked subprograms that the work is specifically designed to require, 144 | such as by intimate data communication or control flow between those 145 | subprograms and other parts of the work. 
146 | 147 | The Corresponding Source need not include anything that users 148 | can regenerate automatically from other parts of the Corresponding 149 | Source. 150 | 151 | The Corresponding Source for a work in source code form is that 152 | same work. 153 | 154 | 2. Basic Permissions. 155 | 156 | All rights granted under this License are granted for the term of 157 | copyright on the Program, and are irrevocable provided the stated 158 | conditions are met. This License explicitly affirms your unlimited 159 | permission to run the unmodified Program. The output from running a 160 | covered work is covered by this License only if the output, given its 161 | content, constitutes a covered work. This License acknowledges your 162 | rights of fair use or other equivalent, as provided by copyright law. 163 | 164 | You may make, run and propagate covered works that you do not 165 | convey, without conditions so long as your license otherwise remains 166 | in force. You may convey covered works to others for the sole purpose 167 | of having them make modifications exclusively for you, or provide you 168 | with facilities for running those works, provided that you comply with 169 | the terms of this License in conveying all material for which you do 170 | not control copyright. Those thus making or running the covered works 171 | for you must do so exclusively on your behalf, under your direction 172 | and control, on terms that prohibit them from making any copies of 173 | your copyrighted material outside their relationship with you. 174 | 175 | Conveying under any other circumstances is permitted solely under 176 | the conditions stated below. Sublicensing is not allowed; section 10 177 | makes it unnecessary. 178 | 179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law. 
180 | 181 | No covered work shall be deemed part of an effective technological 182 | measure under any applicable law fulfilling obligations under article 183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or 184 | similar laws prohibiting or restricting circumvention of such 185 | measures. 186 | 187 | When you convey a covered work, you waive any legal power to forbid 188 | circumvention of technological measures to the extent such circumvention 189 | is effected by exercising rights under this License with respect to 190 | the covered work, and you disclaim any intention to limit operation or 191 | modification of the work as a means of enforcing, against the work's 192 | users, your or third parties' legal rights to forbid circumvention of 193 | technological measures. 194 | 195 | 4. Conveying Verbatim Copies. 196 | 197 | You may convey verbatim copies of the Program's source code as you 198 | receive it, in any medium, provided that you conspicuously and 199 | appropriately publish on each copy an appropriate copyright notice; 200 | keep intact all notices stating that this License and any 201 | non-permissive terms added in accord with section 7 apply to the code; 202 | keep intact all notices of the absence of any warranty; and give all 203 | recipients a copy of this License along with the Program. 204 | 205 | You may charge any price or no price for each copy that you convey, 206 | and you may offer support or warranty protection for a fee. 207 | 208 | 5. Conveying Modified Source Versions. 209 | 210 | You may convey a work based on the Program, or the modifications to 211 | produce it from the Program, in the form of source code under the 212 | terms of section 4, provided that you also meet all of these conditions: 213 | 214 | a) The work must carry prominent notices stating that you modified 215 | it, and giving a relevant date. 
216 | 217 | b) The work must carry prominent notices stating that it is 218 | released under this License and any conditions added under section 219 | 7. This requirement modifies the requirement in section 4 to 220 | "keep intact all notices". 221 | 222 | c) You must license the entire work, as a whole, under this 223 | License to anyone who comes into possession of a copy. This 224 | License will therefore apply, along with any applicable section 7 225 | additional terms, to the whole of the work, and all its parts, 226 | regardless of how they are packaged. This License gives no 227 | permission to license the work in any other way, but it does not 228 | invalidate such permission if you have separately received it. 229 | 230 | d) If the work has interactive user interfaces, each must display 231 | Appropriate Legal Notices; however, if the Program has interactive 232 | interfaces that do not display Appropriate Legal Notices, your 233 | work need not make them do so. 234 | 235 | A compilation of a covered work with other separate and independent 236 | works, which are not by their nature extensions of the covered work, 237 | and which are not combined with it such as to form a larger program, 238 | in or on a volume of a storage or distribution medium, is called an 239 | "aggregate" if the compilation and its resulting copyright are not 240 | used to limit the access or legal rights of the compilation's users 241 | beyond what the individual works permit. Inclusion of a covered work 242 | in an aggregate does not cause this License to apply to the other 243 | parts of the aggregate. 244 | 245 | 6. Conveying Non-Source Forms. 
246 | 247 | You may convey a covered work in object code form under the terms 248 | of sections 4 and 5, provided that you also convey the 249 | machine-readable Corresponding Source under the terms of this License, 250 | in one of these ways: 251 | 252 | a) Convey the object code in, or embodied in, a physical product 253 | (including a physical distribution medium), accompanied by the 254 | Corresponding Source fixed on a durable physical medium 255 | customarily used for software interchange. 256 | 257 | b) Convey the object code in, or embodied in, a physical product 258 | (including a physical distribution medium), accompanied by a 259 | written offer, valid for at least three years and valid for as 260 | long as you offer spare parts or customer support for that product 261 | model, to give anyone who possesses the object code either (1) a 262 | copy of the Corresponding Source for all the software in the 263 | product that is covered by this License, on a durable physical 264 | medium customarily used for software interchange, for a price no 265 | more than your reasonable cost of physically performing this 266 | conveying of source, or (2) access to copy the 267 | Corresponding Source from a network server at no charge. 268 | 269 | c) Convey individual copies of the object code with a copy of the 270 | written offer to provide the Corresponding Source. This 271 | alternative is allowed only occasionally and noncommercially, and 272 | only if you received the object code with such an offer, in accord 273 | with subsection 6b. 274 | 275 | d) Convey the object code by offering access from a designated 276 | place (gratis or for a charge), and offer equivalent access to the 277 | Corresponding Source in the same way through the same place at no 278 | further charge. You need not require recipients to copy the 279 | Corresponding Source along with the object code. 
If the place to 280 | copy the object code is a network server, the Corresponding Source 281 | may be on a different server (operated by you or a third party) 282 | that supports equivalent copying facilities, provided you maintain 283 | clear directions next to the object code saying where to find the 284 | Corresponding Source. Regardless of what server hosts the 285 | Corresponding Source, you remain obligated to ensure that it is 286 | available for as long as needed to satisfy these requirements. 287 | 288 | e) Convey the object code using peer-to-peer transmission, provided 289 | you inform other peers where the object code and Corresponding 290 | Source of the work are being offered to the general public at no 291 | charge under subsection 6d. 292 | 293 | A separable portion of the object code, whose source code is excluded 294 | from the Corresponding Source as a System Library, need not be 295 | included in conveying the object code work. 296 | 297 | A "User Product" is either (1) a "consumer product", which means any 298 | tangible personal property which is normally used for personal, family, 299 | or household purposes, or (2) anything designed or sold for incorporation 300 | into a dwelling. In determining whether a product is a consumer product, 301 | doubtful cases shall be resolved in favor of coverage. For a particular 302 | product received by a particular user, "normally used" refers to a 303 | typical or common use of that class of product, regardless of the status 304 | of the particular user or of the way in which the particular user 305 | actually uses, or expects or is expected to use, the product. A product 306 | is a consumer product regardless of whether the product has substantial 307 | commercial, industrial or non-consumer uses, unless such uses represent 308 | the only significant mode of use of the product. 
309 | 310 | "Installation Information" for a User Product means any methods, 311 | procedures, authorization keys, or other information required to install 312 | and execute modified versions of a covered work in that User Product from 313 | a modified version of its Corresponding Source. The information must 314 | suffice to ensure that the continued functioning of the modified object 315 | code is in no case prevented or interfered with solely because 316 | modification has been made. 317 | 318 | If you convey an object code work under this section in, or with, or 319 | specifically for use in, a User Product, and the conveying occurs as 320 | part of a transaction in which the right of possession and use of the 321 | User Product is transferred to the recipient in perpetuity or for a 322 | fixed term (regardless of how the transaction is characterized), the 323 | Corresponding Source conveyed under this section must be accompanied 324 | by the Installation Information. But this requirement does not apply 325 | if neither you nor any third party retains the ability to install 326 | modified object code on the User Product (for example, the work has 327 | been installed in ROM). 328 | 329 | The requirement to provide Installation Information does not include a 330 | requirement to continue to provide support service, warranty, or updates 331 | for a work that has been modified or installed by the recipient, or for 332 | the User Product in which it has been modified or installed. Access to a 333 | network may be denied when the modification itself materially and 334 | adversely affects the operation of the network or violates the rules and 335 | protocols for communication across the network. 
336 | 337 | Corresponding Source conveyed, and Installation Information provided, 338 | in accord with this section must be in a format that is publicly 339 | documented (and with an implementation available to the public in 340 | source code form), and must require no special password or key for 341 | unpacking, reading or copying. 342 | 343 | 7. Additional Terms. 344 | 345 | "Additional permissions" are terms that supplement the terms of this 346 | License by making exceptions from one or more of its conditions. 347 | Additional permissions that are applicable to the entire Program shall 348 | be treated as though they were included in this License, to the extent 349 | that they are valid under applicable law. If additional permissions 350 | apply only to part of the Program, that part may be used separately 351 | under those permissions, but the entire Program remains governed by 352 | this License without regard to the additional permissions. 353 | 354 | When you convey a copy of a covered work, you may at your option 355 | remove any additional permissions from that copy, or from any part of 356 | it. (Additional permissions may be written to require their own 357 | removal in certain cases when you modify the work.) You may place 358 | additional permissions on material, added by you to a covered work, 359 | for which you have or can give appropriate copyright permission. 
360 | 361 | Notwithstanding any other provision of this License, for material you 362 | add to a covered work, you may (if authorized by the copyright holders of 363 | that material) supplement the terms of this License with terms: 364 | 365 | a) Disclaiming warranty or limiting liability differently from the 366 | terms of sections 15 and 16 of this License; or 367 | 368 | b) Requiring preservation of specified reasonable legal notices or 369 | author attributions in that material or in the Appropriate Legal 370 | Notices displayed by works containing it; or 371 | 372 | c) Prohibiting misrepresentation of the origin of that material, or 373 | requiring that modified versions of such material be marked in 374 | reasonable ways as different from the original version; or 375 | 376 | d) Limiting the use for publicity purposes of names of licensors or 377 | authors of the material; or 378 | 379 | e) Declining to grant rights under trademark law for use of some 380 | trade names, trademarks, or service marks; or 381 | 382 | f) Requiring indemnification of licensors and authors of that 383 | material by anyone who conveys the material (or modified versions of 384 | it) with contractual assumptions of liability to the recipient, for 385 | any liability that these contractual assumptions directly impose on 386 | those licensors and authors. 387 | 388 | All other non-permissive additional terms are considered "further 389 | restrictions" within the meaning of section 10. If the Program as you 390 | received it, or any part of it, contains a notice stating that it is 391 | governed by this License along with a term that is a further 392 | restriction, you may remove that term. 
If a license document contains 393 | a further restriction but permits relicensing or conveying under this 394 | License, you may add to a covered work material governed by the terms 395 | of that license document, provided that the further restriction does 396 | not survive such relicensing or conveying. 397 | 398 | If you add terms to a covered work in accord with this section, you 399 | must place, in the relevant source files, a statement of the 400 | additional terms that apply to those files, or a notice indicating 401 | where to find the applicable terms. 402 | 403 | Additional terms, permissive or non-permissive, may be stated in the 404 | form of a separately written license, or stated as exceptions; 405 | the above requirements apply either way. 406 | 407 | 8. Termination. 408 | 409 | You may not propagate or modify a covered work except as expressly 410 | provided under this License. Any attempt otherwise to propagate or 411 | modify it is void, and will automatically terminate your rights under 412 | this License (including any patent licenses granted under the third 413 | paragraph of section 11). 414 | 415 | However, if you cease all violation of this License, then your 416 | license from a particular copyright holder is reinstated (a) 417 | provisionally, unless and until the copyright holder explicitly and 418 | finally terminates your license, and (b) permanently, if the copyright 419 | holder fails to notify you of the violation by some reasonable means 420 | prior to 60 days after the cessation. 421 | 422 | Moreover, your license from a particular copyright holder is 423 | reinstated permanently if the copyright holder notifies you of the 424 | violation by some reasonable means, this is the first time you have 425 | received notice of violation of this License (for any work) from that 426 | copyright holder, and you cure the violation prior to 30 days after 427 | your receipt of the notice. 
428 | 429 | Termination of your rights under this section does not terminate the 430 | licenses of parties who have received copies or rights from you under 431 | this License. If your rights have been terminated and not permanently 432 | reinstated, you do not qualify to receive new licenses for the same 433 | material under section 10. 434 | 435 | 9. Acceptance Not Required for Having Copies. 436 | 437 | You are not required to accept this License in order to receive or 438 | run a copy of the Program. Ancillary propagation of a covered work 439 | occurring solely as a consequence of using peer-to-peer transmission 440 | to receive a copy likewise does not require acceptance. However, 441 | nothing other than this License grants you permission to propagate or 442 | modify any covered work. These actions infringe copyright if you do 443 | not accept this License. Therefore, by modifying or propagating a 444 | covered work, you indicate your acceptance of this License to do so. 445 | 446 | 10. Automatic Licensing of Downstream Recipients. 447 | 448 | Each time you convey a covered work, the recipient automatically 449 | receives a license from the original licensors, to run, modify and 450 | propagate that work, subject to this License. You are not responsible 451 | for enforcing compliance by third parties with this License. 452 | 453 | An "entity transaction" is a transaction transferring control of an 454 | organization, or substantially all assets of one, or subdividing an 455 | organization, or merging organizations. 
If propagation of a covered 456 | work results from an entity transaction, each party to that 457 | transaction who receives a copy of the work also receives whatever 458 | licenses to the work the party's predecessor in interest had or could 459 | give under the previous paragraph, plus a right to possession of the 460 | Corresponding Source of the work from the predecessor in interest, if 461 | the predecessor has it or can get it with reasonable efforts. 462 | 463 | You may not impose any further restrictions on the exercise of the 464 | rights granted or affirmed under this License. For example, you may 465 | not impose a license fee, royalty, or other charge for exercise of 466 | rights granted under this License, and you may not initiate litigation 467 | (including a cross-claim or counterclaim in a lawsuit) alleging that 468 | any patent claim is infringed by making, using, selling, offering for 469 | sale, or importing the Program or any portion of it. 470 | 471 | 11. Patents. 472 | 473 | A "contributor" is a copyright holder who authorizes use under this 474 | License of the Program or a work on which the Program is based. The 475 | work thus licensed is called the contributor's "contributor version". 476 | 477 | A contributor's "essential patent claims" are all patent claims 478 | owned or controlled by the contributor, whether already acquired or 479 | hereafter acquired, that would be infringed by some manner, permitted 480 | by this License, of making, using, or selling its contributor version, 481 | but do not include claims that would be infringed only as a 482 | consequence of further modification of the contributor version. For 483 | purposes of this definition, "control" includes the right to grant 484 | patent sublicenses in a manner consistent with the requirements of 485 | this License. 
486 | 487 | Each contributor grants you a non-exclusive, worldwide, royalty-free 488 | patent license under the contributor's essential patent claims, to 489 | make, use, sell, offer for sale, import and otherwise run, modify and 490 | propagate the contents of its contributor version. 491 | 492 | In the following three paragraphs, a "patent license" is any express 493 | agreement or commitment, however denominated, not to enforce a patent 494 | (such as an express permission to practice a patent or covenant not to 495 | sue for patent infringement). To "grant" such a patent license to a 496 | party means to make such an agreement or commitment not to enforce a 497 | patent against the party. 498 | 499 | If you convey a covered work, knowingly relying on a patent license, 500 | and the Corresponding Source of the work is not available for anyone 501 | to copy, free of charge and under the terms of this License, through a 502 | publicly available network server or other readily accessible means, 503 | then you must either (1) cause the Corresponding Source to be so 504 | available, or (2) arrange to deprive yourself of the benefit of the 505 | patent license for this particular work, or (3) arrange, in a manner 506 | consistent with the requirements of this License, to extend the patent 507 | license to downstream recipients. "Knowingly relying" means you have 508 | actual knowledge that, but for the patent license, your conveying the 509 | covered work in a country, or your recipient's use of the covered work 510 | in a country, would infringe one or more identifiable patents in that 511 | country that you have reason to believe are valid. 
512 | 513 | If, pursuant to or in connection with a single transaction or 514 | arrangement, you convey, or propagate by procuring conveyance of, a 515 | covered work, and grant a patent license to some of the parties 516 | receiving the covered work authorizing them to use, propagate, modify 517 | or convey a specific copy of the covered work, then the patent license 518 | you grant is automatically extended to all recipients of the covered 519 | work and works based on it. 520 | 521 | A patent license is "discriminatory" if it does not include within 522 | the scope of its coverage, prohibits the exercise of, or is 523 | conditioned on the non-exercise of one or more of the rights that are 524 | specifically granted under this License. You may not convey a covered 525 | work if you are a party to an arrangement with a third party that is 526 | in the business of distributing software, under which you make payment 527 | to the third party based on the extent of your activity of conveying 528 | the work, and under which the third party grants, to any of the 529 | parties who would receive the covered work from you, a discriminatory 530 | patent license (a) in connection with copies of the covered work 531 | conveyed by you (or copies made from those copies), or (b) primarily 532 | for and in connection with specific products or compilations that 533 | contain the covered work, unless you entered into that arrangement, 534 | or that patent license was granted, prior to 28 March 2007. 535 | 536 | Nothing in this License shall be construed as excluding or limiting 537 | any implied license or other defenses to infringement that may 538 | otherwise be available to you under applicable patent law. 539 | 540 | 12. No Surrender of Others' Freedom. 541 | 542 | If conditions are imposed on you (whether by court order, agreement or 543 | otherwise) that contradict the conditions of this License, they do not 544 | excuse you from the conditions of this License. 
If you cannot convey a 545 | covered work so as to satisfy simultaneously your obligations under this 546 | License and any other pertinent obligations, then as a consequence you may 547 | not convey it at all. For example, if you agree to terms that obligate you 548 | to collect a royalty for further conveying from those to whom you convey 549 | the Program, the only way you could satisfy both those terms and this 550 | License would be to refrain entirely from conveying the Program. 551 | 552 | 13. Use with the GNU Affero General Public License. 553 | 554 | Notwithstanding any other provision of this License, you have 555 | permission to link or combine any covered work with a work licensed 556 | under version 3 of the GNU Affero General Public License into a single 557 | combined work, and to convey the resulting work. The terms of this 558 | License will continue to apply to the part which is the covered work, 559 | but the special requirements of the GNU Affero General Public License, 560 | section 13, concerning interaction through a network will apply to the 561 | combination as such. 562 | 563 | 14. Revised Versions of this License. 564 | 565 | The Free Software Foundation may publish revised and/or new versions of 566 | the GNU General Public License from time to time. Such new versions will 567 | be similar in spirit to the present version, but may differ in detail to 568 | address new problems or concerns. 569 | 570 | Each version is given a distinguishing version number. If the 571 | Program specifies that a certain numbered version of the GNU General 572 | Public License "or any later version" applies to it, you have the 573 | option of following the terms and conditions either of that numbered 574 | version or of any later version published by the Free Software 575 | Foundation. If the Program does not specify a version number of the 576 | GNU General Public License, you may choose any version ever published 577 | by the Free Software Foundation. 
578 | 579 | If the Program specifies that a proxy can decide which future 580 | versions of the GNU General Public License can be used, that proxy's 581 | public statement of acceptance of a version permanently authorizes you 582 | to choose that version for the Program. 583 | 584 | Later license versions may give you additional or different 585 | permissions. However, no additional obligations are imposed on any 586 | author or copyright holder as a result of your choosing to follow a 587 | later version. 588 | 589 | 15. Disclaimer of Warranty. 590 | 591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY 592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT 593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY 594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, 595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR 596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM 597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF 598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 599 | 600 | 16. Limitation of Liability. 601 | 602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING 603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS 604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY 605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE 606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF 607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD 608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), 609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF 610 | SUCH DAMAGES. 611 | 612 | 17. Interpretation of Sections 15 and 16. 
613 | 614 | If the disclaimer of warranty and limitation of liability provided 615 | above cannot be given local legal effect according to their terms, 616 | reviewing courts shall apply local law that most closely approximates 617 | an absolute waiver of all civil liability in connection with the 618 | Program, unless a warranty or assumption of liability accompanies a 619 | copy of the Program in return for a fee. 620 | 621 | END OF TERMS AND CONDITIONS 622 | 623 | How to Apply These Terms to Your New Programs 624 | 625 | If you develop a new program, and you want it to be of the greatest 626 | possible use to the public, the best way to achieve this is to make it 627 | free software which everyone can redistribute and change under these terms. 628 | 629 | To do so, attach the following notices to the program. It is safest 630 | to attach them to the start of each source file to most effectively 631 | state the exclusion of warranty; and each file should have at least 632 | the "copyright" line and a pointer to where the full notice is found. 633 | 634 | <one line to give the program's name and a brief idea of what it does.> 635 | Copyright (C) <year> <name of author> 636 | 637 | This program is free software: you can redistribute it and/or modify 638 | it under the terms of the GNU General Public License as published by 639 | the Free Software Foundation, either version 3 of the License, or 640 | (at your option) any later version. 641 | 642 | This program is distributed in the hope that it will be useful, 643 | but WITHOUT ANY WARRANTY; without even the implied warranty of 644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the 645 | GNU General Public License for more details. 646 | 647 | You should have received a copy of the GNU General Public License 648 | along with this program. If not, see <https://www.gnu.org/licenses/>. 649 | 650 | Also add information on how to contact you by electronic and paper mail. 
651 | 652 | If the program does terminal interaction, make it output a short 653 | notice like this when it starts in an interactive mode: 654 | 655 | <program> Copyright (C) <year> <name of author> 656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. 657 | This is free software, and you are welcome to redistribute it 658 | under certain conditions; type `show c' for details. 659 | 660 | The hypothetical commands `show w' and `show c' should show the appropriate 661 | parts of the General Public License. Of course, your program's commands 662 | might be different; for a GUI interface, you would use an "about box". 663 | 664 | You should also get your employer (if you work as a programmer) or school, 665 | if any, to sign a "copyright disclaimer" for the program, if necessary. 666 | For more information on this, and how to apply and follow the GNU GPL, see 667 | <https://www.gnu.org/licenses/>. 668 | 669 | The GNU General Public License does not permit incorporating your program 670 | into proprietary programs. If your program is a subroutine library, you 671 | may consider it more useful to permit linking proprietary applications with 672 | the library. If this is what you want to do, use the GNU Lesser General 673 | Public License instead of this License. But first, please read 674 | <https://www.gnu.org/licenses/why-not-lgpl.html>. -------------------------------------------------------------------------------- /Week 04 - Advance Machine Learning/2- Cross_Validation.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "id": "1TKyqk7qoR8Z" 7 | }, 8 | "source": [ 9 | "# Cross Validation\n", 10 | "\n", 11 | "\"Open\n", 12 | "\n", 13 | "\n", 14 | "\n", 15 | "Splitting our datasets into train/test sets allows us to test our model on unseen examples. However, it might be the case that we got a lucky (or unlucky) split that doesn't represent the model's actual performance. 
To solve this problem, we'll use a technique called cross-validation, where we use the entire dataset for training and for testing and evaluate the model accordingly.\n", 16 | "\n", 17 | "There are several ways of performing cross-validation, and there are several corresponding iterators defined in scikit-learn. Each defines a `split` method, which will generate arrays of indices from the data set, each array indicating the instances to go into the training or testing set. \n" 18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": null, 23 | "metadata": { 24 | "id": "U-L8w5RMoR8c", 25 | "scrolled": true 26 | }, 27 | "outputs": [], 28 | "source": [ 29 | "import pandas as pd\n", 30 | "import numpy as np\n", 31 | "from sklearn import datasets, svm, metrics, model_selection" 32 | ] 33 | }, 34 | { 35 | "cell_type": "code", 36 | "execution_count": null, 37 | "metadata": { 38 | "id": "3H2-Cv3uqIG8" 39 | }, 40 | "outputs": [], 41 | "source": [ 42 | "x, y = datasets.load_breast_cancer(return_X_y=True)" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": null, 48 | "metadata": { 49 | "id": "jAHUC3xpqkJX" 50 | }, 51 | "outputs": [], 52 | "source": [ 53 | "# Define a function to split our dataset into train/test splits using indices\n", 54 | "\n", 55 | "\n", 56 | "def kfold_train_test_split(x, y, train_indices, test_indices):\n", 57 | " return x[train_indices], x[test_indices], y[train_indices], y[test_indices]" 58 | ] 59 | }, 60 | { 61 | "cell_type": "markdown", 62 | "metadata": { 63 | "id": "TNI8exLPp61j" 64 | }, 65 | "source": [ 66 | "### `KFold`\n", 67 | "\n", 68 | "`KFold` is arguably the simplest. It partitions the data into $k$ folds. It does not attempt to keep the proportions of classes. 
\n" 69 | ] 70 | }, 71 | { 72 | "cell_type": "code", 73 | "execution_count": null, 74 | "metadata": { 75 | "id": "QqNihPfzoR8f" 76 | }, 77 | "outputs": [], 78 | "source": [ 79 | "k_fold = model_selection.KFold(\n", 80 | " n_splits=10\n", 81 | ") # splits the data into 10 splits, using 9 for training and 1 for testing in each iteration\n", 82 | "\n", 83 | "# Empty array to store the scores\n", 84 | "scores = []\n", 85 | "\n", 86 | "for train_indices, test_indices in k_fold.split(x):\n", 87 | " # Split data using our predefined function\n", 88 | " x_train, x_test, y_train, y_test = kfold_train_test_split(\n", 89 | " x, y, train_indices, test_indices\n", 90 | " )\n", 91 | "\n", 92 | " # Train model\n", 93 | " svc = svm.SVC()\n", 94 | " svc.fit(x_train, y_train)\n", 95 | "\n", 96 | " # Predict using test set\n", 97 | " y_pred = svc.predict(x_test)\n", 98 | "\n", 99 | " # Calculate scores\n", 100 | " accuracy = metrics.accuracy_score(y_test, y_pred)\n", 101 | " precision = metrics.precision_score(y_test, y_pred)\n", 102 | " recall = metrics.recall_score(y_test, y_pred)\n", 103 | "\n", 104 | " # Create scores dictionary\n", 105 | " scores_dict = {\"accuracy\": accuracy, \"precision\": precision, \"recall\": recall}\n", 106 | "\n", 107 | " # Append to scores array\n", 108 | " scores.append(scores_dict)" 109 | ] 110 | }, 111 | { 112 | "cell_type": "code", 113 | "execution_count": null, 114 | "metadata": { 115 | "colab": { 116 | "base_uri": "https://localhost:8080/", 117 | "height": 357 118 | }, 119 | "executionInfo": { 120 | "elapsed": 793, 121 | "status": "ok", 122 | "timestamp": 1631624726269, 123 | "user": { 124 | "displayName": "muntadher alkaabi", 125 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gj1adaCWpR-cnwQQK4viQZvo0gyRf1A0y9AkvnPMt0=s64", 126 | "userId": "00221067971932357642" 127 | }, 128 | "user_tz": -180 129 | }, 130 | "id": "ILHzsRYEr-N3", 131 | "outputId": "fa0cb867-e76c-48d2-cba1-0336886ab8d0" 132 | }, 133 | "outputs": [ 134 | { 135 | "data": { 
136 | "text/html": [ 137 | "
\n", 138 | "\n", 151 | "\n", 152 | " \n", 153 | " \n", 154 | " \n", 155 | " \n", 156 | " \n", 157 | " \n", 158 | " \n", 159 | " \n", 160 | " \n", 161 | " \n", 162 | " \n", 163 | " \n", 164 | " \n", 165 | " \n", 166 | " \n", 167 | " \n", 168 | " \n", 169 | " \n", 170 | " \n", 171 | " \n", 172 | " \n", 173 | " \n", 174 | " \n", 175 | " \n", 176 | " \n", 177 | " \n", 178 | " \n", 179 | " \n", 180 | " \n", 181 | " \n", 182 | " \n", 183 | " \n", 184 | " \n", 185 | " \n", 186 | " \n", 187 | " \n", 188 | " \n", 189 | " \n", 190 | " \n", 191 | " \n", 192 | " \n", 193 | " \n", 194 | " \n", 195 | " \n", 196 | " \n", 197 | " \n", 198 | " \n", 199 | " \n", 200 | " \n", 201 | " \n", 202 | " \n", 203 | " \n", 204 | " \n", 205 | " \n", 206 | " \n", 207 | " \n", 208 | " \n", 209 | " \n", 210 | " \n", 211 | " \n", 212 | " \n", 213 | " \n", 214 | " \n", 215 | " \n", 216 | " \n", 217 | " \n", 218 | " \n", 219 | " \n", 220 | " \n", 221 | " \n", 222 | "
accuracyprecisionrecall
00.7017540.3928571.000000
10.9122810.8750001.000000
20.9122810.9189190.944444
30.8947370.8285711.000000
40.9649120.9354841.000000
50.9824560.9782611.000000
60.9473680.9523810.975610
70.9473680.9555560.977273
80.9122810.9534880.931818
90.9821430.9772731.000000
\n", 223 | "
" 224 | ], 225 | "text/plain": [ 226 | " accuracy precision recall\n", 227 | "0 0.701754 0.392857 1.000000\n", 228 | "1 0.912281 0.875000 1.000000\n", 229 | "2 0.912281 0.918919 0.944444\n", 230 | "3 0.894737 0.828571 1.000000\n", 231 | "4 0.964912 0.935484 1.000000\n", 232 | "5 0.982456 0.978261 1.000000\n", 233 | "6 0.947368 0.952381 0.975610\n", 234 | "7 0.947368 0.955556 0.977273\n", 235 | "8 0.912281 0.953488 0.931818\n", 236 | "9 0.982143 0.977273 1.000000" 237 | ] 238 | }, 239 | "execution_count": 9, 240 | "metadata": {}, 241 | "output_type": "execute_result" 242 | } 243 | ], 244 | "source": [ 245 | "# Convert scores array to dataframe\n", 246 | "scores_df = pd.DataFrame(scores)\n", 247 | "scores_df" 248 | ] 249 | }, 250 | { 251 | "cell_type": "code", 252 | "execution_count": null, 253 | "metadata": { 254 | "colab": { 255 | "base_uri": "https://localhost:8080/" 256 | }, 257 | "executionInfo": { 258 | "elapsed": 412, 259 | "status": "ok", 260 | "timestamp": 1631624737680, 261 | "user": { 262 | "displayName": "muntadher alkaabi", 263 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gj1adaCWpR-cnwQQK4viQZvo0gyRf1A0y9AkvnPMt0=s64", 264 | "userId": "00221067971932357642" 265 | }, 266 | "user_tz": -180 267 | }, 268 | "id": "YSI7d7KhsEM5", 269 | "outputId": "2552c148-c99f-4e06-a569-1c6e51b1fb9f" 270 | }, 271 | "outputs": [ 272 | { 273 | "data": { 274 | "text/plain": [ 275 | "accuracy 0.915758\n", 276 | "precision 0.876779\n", 277 | "recall 0.982915\n", 278 | "dtype: float64" 279 | ] 280 | }, 281 | "execution_count": 11, 282 | "metadata": {}, 283 | "output_type": "execute_result" 284 | } 285 | ], 286 | "source": [ 287 | "# Calculate the mean of the scores\n", 288 | "scores_df.mean()" 289 | ] 290 | }, 291 | { 292 | "cell_type": "markdown", 293 | "metadata": { 294 | "id": "cPGvwCGMoR8f" 295 | }, 296 | "source": [ 297 | "### `StratifiedKFold`\n", 298 | "\n", 299 | "`StratifiedKFold` ensures that the proportions of classes are preserved in each training/testing 
set. " 300 | ] 301 | }, 302 | { 303 | "cell_type": "code", 304 | "execution_count": null, 305 | "metadata": { 306 | "id": "BzN78eZvsTiG" 307 | }, 308 | "outputs": [], 309 | "source": [ 310 | "stratified_k_fold = model_selection.StratifiedKFold(\n", 311 | " n_splits=10\n", 312 | ") # splits the data into 10 splits, using 9 for training and 1 for testing in each iteration\n", 313 | "\n", 314 | "# Empty array to store the scores\n", 315 | "scores = []\n", 316 | "\n", 317 | "for train_indices, test_indices in stratified_k_fold.split(\n", 318 | " x, y\n", 319 | "): # y is needed here for stratification, similar to stratify = y.\n", 320 | " # Split data using our predefined function\n", 321 | " x_train, x_test, y_train, y_test = kfold_train_test_split(\n", 322 | " x, y, train_indices, test_indices\n", 323 | " )\n", 324 | "\n", 325 | " # Train model\n", 326 | " svc = svm.SVC()\n", 327 | " svc.fit(x_train, y_train)\n", 328 | "\n", 329 | " # Predict using test set\n", 330 | " y_pred = svc.predict(x_test)\n", 331 | "\n", 332 | " # Calculate scores\n", 333 | " accuracy = metrics.accuracy_score(y_test, y_pred)\n", 334 | " precision = metrics.precision_score(y_test, y_pred)\n", 335 | " recall = metrics.recall_score(y_test, y_pred)\n", 336 | "\n", 337 | " # Create scores dictionary\n", 338 | " scores_dict = {\"accuracy\": accuracy, \"precision\": precision, \"recall\": recall}\n", 339 | "\n", 340 | " # Append to scores array\n", 341 | " scores.append(scores_dict)" 342 | ] 343 | }, 344 | { 345 | "cell_type": "code", 346 | "execution_count": null, 347 | "metadata": { 348 | "colab": { 349 | "base_uri": "https://localhost:8080/", 350 | "height": 357 351 | }, 352 | "executionInfo": { 353 | "elapsed": 5, 354 | "status": "ok", 355 | "timestamp": 1631624768035, 356 | "user": { 357 | "displayName": "muntadher alkaabi", 358 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gj1adaCWpR-cnwQQK4viQZvo0gyRf1A0y9AkvnPMt0=s64", 359 | "userId": "00221067971932357642" 360 | }, 361 | 
"user_tz": -180 362 | }, 363 | "id": "EWRz5B5wsTiI", 364 | "outputId": "9aa1b4e9-8194-41d7-8996-3c713d60827f" 365 | }, 366 | "outputs": [ 367 | { 368 | "data": { 369 | "text/html": [ 370 | "
\n", 371 | "\n", 384 | "\n", 385 | " \n", 386 | " \n", 387 | " \n", 388 | " \n", 389 | " \n", 390 | " \n", 391 | " \n", 392 | " \n", 393 | " \n", 394 | " \n", 395 | " \n", 396 | " \n", 397 | " \n", 398 | " \n", 399 | " \n", 400 | " \n", 401 | " \n", 402 | " \n", 403 | " \n", 404 | " \n", 405 | " \n", 406 | " \n", 407 | " \n", 408 | " \n", 409 | " \n", 410 | " \n", 411 | " \n", 412 | " \n", 413 | " \n", 414 | " \n", 415 | " \n", 416 | " \n", 417 | " \n", 418 | " \n", 419 | " \n", 420 | " \n", 421 | " \n", 422 | " \n", 423 | " \n", 424 | " \n", 425 | " \n", 426 | " \n", 427 | " \n", 428 | " \n", 429 | " \n", 430 | " \n", 431 | " \n", 432 | " \n", 433 | " \n", 434 | " \n", 435 | " \n", 436 | " \n", 437 | " \n", 438 | " \n", 439 | " \n", 440 | " \n", 441 | " \n", 442 | " \n", 443 | " \n", 444 | " \n", 445 | " \n", 446 | " \n", 447 | " \n", 448 | " \n", 449 | " \n", 450 | " \n", 451 | " \n", 452 | " \n", 453 | " \n", 454 | " \n", 455 | "
accuracyprecisionrecall
00.8947370.8536591.000000
10.8421050.8095240.971429
20.8947370.8947370.944444
30.9298250.9000001.000000
40.9298250.9000001.000000
50.9298250.9210530.972222
60.9473680.9459460.972222
70.9298250.9210530.972222
80.9298250.9444440.944444
90.9107140.8750001.000000
\n", 456 | "
" 457 | ], 458 | "text/plain": [ 459 | " accuracy precision recall\n", 460 | "0 0.894737 0.853659 1.000000\n", 461 | "1 0.842105 0.809524 0.971429\n", 462 | "2 0.894737 0.894737 0.944444\n", 463 | "3 0.929825 0.900000 1.000000\n", 464 | "4 0.929825 0.900000 1.000000\n", 465 | "5 0.929825 0.921053 0.972222\n", 466 | "6 0.947368 0.945946 0.972222\n", 467 | "7 0.929825 0.921053 0.972222\n", 468 | "8 0.929825 0.944444 0.944444\n", 469 | "9 0.910714 0.875000 1.000000" 470 | ] 471 | }, 472 | "execution_count": 13, 473 | "metadata": {}, 474 | "output_type": "execute_result" 475 | } 476 | ], 477 | "source": [ 478 | "# Convert scores array to dataframe\n", 479 | "scores_df = pd.DataFrame(scores)\n", 480 | "scores_df" 481 | ] 482 | }, 483 | { 484 | "cell_type": "code", 485 | "execution_count": null, 486 | "metadata": { 487 | "colab": { 488 | "base_uri": "https://localhost:8080/" 489 | }, 490 | "executionInfo": { 491 | "elapsed": 336, 492 | "status": "ok", 493 | "timestamp": 1631624786970, 494 | "user": { 495 | "displayName": "muntadher alkaabi", 496 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gj1adaCWpR-cnwQQK4viQZvo0gyRf1A0y9AkvnPMt0=s64", 497 | "userId": "00221067971932357642" 498 | }, 499 | "user_tz": -180 500 | }, 501 | "id": "7kdBMEeMsTiK", 502 | "outputId": "f6feecb3-e4b0-4254-dbeb-4d85d923b060" 503 | }, 504 | "outputs": [ 505 | { 506 | "data": { 507 | "text/plain": [ 508 | "accuracy 0.913878\n", 509 | "precision 0.896541\n", 510 | "recall 0.977698\n", 511 | "dtype: float64" 512 | ] 513 | }, 514 | "execution_count": 14, 515 | "metadata": {}, 516 | "output_type": "execute_result" 517 | } 518 | ], 519 | "source": [ 520 | "# Calculate the mean of the scores\n", 521 | "scores_df.mean()" 522 | ] 523 | }, 524 | { 525 | "cell_type": "markdown", 526 | "metadata": { 527 | "id": "QIUkAfuioR8g" 528 | }, 529 | "source": [ 530 | "### `ShuffleSplit`\n", 531 | "\n", 532 | "`ShuffleSplit` will generate independent pairs of randomly shuffled training and testing sets. 
" 533 | ] 534 | }, 535 | { 536 | "cell_type": "code", 537 | "execution_count": null, 538 | "metadata": { 539 | "id": "eKEl76iBsoBW" 540 | }, 541 | "outputs": [], 542 | "source": [ 543 | "shuffle_k_fold = model_selection.ShuffleSplit(\n", 544 | " n_splits=10, random_state=42\n", 545 | ") # generates 10 independent random train/test splits; by default, 10% of the data is held out for testing in each split\n", 546 | "\n", 547 | "# Empty array to store the scores\n", 548 | "scores = []\n", 549 | "\n", 550 | "for train_indices, test_indices in shuffle_k_fold.split(x):\n", 551 | " # Split data using our predefined function\n", 552 | " x_train, x_test, y_train, y_test = kfold_train_test_split(\n", 553 | " x, y, train_indices, test_indices\n", 554 | " )\n", 555 | "\n", 556 | " # Train model\n", 557 | " svc = svm.SVC()\n", 558 | " svc.fit(x_train, y_train)\n", 559 | "\n", 560 | " # Predict using test set\n", 561 | " y_pred = svc.predict(x_test)\n", 562 | "\n", 563 | " # Calculate scores\n", 564 | " accuracy = metrics.accuracy_score(y_test, y_pred)\n", 565 | " precision = metrics.precision_score(y_test, y_pred)\n", 566 | " recall = metrics.recall_score(y_test, y_pred)\n", 567 | "\n", 568 | " # Create scores dictionary\n", 569 | " scores_dict = {\"accuracy\": accuracy, \"precision\": precision, \"recall\": recall}\n", 570 | "\n", 571 | " # Append to scores array\n", 572 | " scores.append(scores_dict)" 573 | ] 574 | }, 575 | { 576 | "cell_type": "code", 577 | "execution_count": null, 578 | "metadata": { 579 | "colab": { 580 | "base_uri": "https://localhost:8080/", 581 | "height": 357 582 | }, 583 | "executionInfo": { 584 | "elapsed": 3, 585 | "status": "ok", 586 | "timestamp": 1631624798202, 587 | "user": { 588 | "displayName": "muntadher alkaabi", 589 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gj1adaCWpR-cnwQQK4viQZvo0gyRf1A0y9AkvnPMt0=s64", 590 | "userId": "00221067971932357642" 591 | }, 592 | "user_tz": -180 593 | }, 594 | "id": "0RI8O92esoBX", 595 | "outputId": 
"1b35d3d8-bfe7-43f1-80f8-d8acfa8567a7" 596 | }, 597 | "outputs": [ 598 | { 599 | "data": { 600 | "text/html": [ 601 | "
\n", 602 | "\n", 615 | "\n", 616 | " \n", 617 | " \n", 618 | " \n", 619 | " \n", 620 | " \n", 621 | " \n", 622 | " \n", 623 | " \n", 624 | " \n", 625 | " \n", 626 | " \n", 627 | " \n", 628 | " \n", 629 | " \n", 630 | " \n", 631 | " \n", 632 | " \n", 633 | " \n", 634 | " \n", 635 | " \n", 636 | " \n", 637 | " \n", 638 | " \n", 639 | " \n", 640 | " \n", 641 | " \n", 642 | " \n", 643 | " \n", 644 | " \n", 645 | " \n", 646 | " \n", 647 | " \n", 648 | " \n", 649 | " \n", 650 | " \n", 651 | " \n", 652 | " \n", 653 | " \n", 654 | " \n", 655 | " \n", 656 | " \n", 657 | " \n", 658 | " \n", 659 | " \n", 660 | " \n", 661 | " \n", 662 | " \n", 663 | " \n", 664 | " \n", 665 | " \n", 666 | " \n", 667 | " \n", 668 | " \n", 669 | " \n", 670 | " \n", 671 | " \n", 672 | " \n", 673 | " \n", 674 | " \n", 675 | " \n", 676 | " \n", 677 | " \n", 678 | " \n", 679 | " \n", 680 | " \n", 681 | " \n", 682 | " \n", 683 | " \n", 684 | " \n", 685 | " \n", 686 | "
   accuracy  precision    recall
0  0.982456   0.975610  1.000000
1  0.894737   0.857143  1.000000
2  0.912281   0.868421  1.000000
3  0.842105   0.857143  0.923077
4  0.894737   0.860465  1.000000
5  0.947368   0.918919  1.000000
6  0.912281   0.894737  0.971429
7  0.947368   0.928571  1.000000
8  0.947368   0.944444  0.971429
9  0.894737   0.853659  1.000000
\n", 687 | "
" 688 | ], 689 | "text/plain": [ 690 | " accuracy precision recall\n", 691 | "0 0.982456 0.975610 1.000000\n", 692 | "1 0.894737 0.857143 1.000000\n", 693 | "2 0.912281 0.868421 1.000000\n", 694 | "3 0.842105 0.857143 0.923077\n", 695 | "4 0.894737 0.860465 1.000000\n", 696 | "5 0.947368 0.918919 1.000000\n", 697 | "6 0.912281 0.894737 0.971429\n", 698 | "7 0.947368 0.928571 1.000000\n", 699 | "8 0.947368 0.944444 0.971429\n", 700 | "9 0.894737 0.853659 1.000000" 701 | ] 702 | }, 703 | "execution_count": 16, 704 | "metadata": {}, 705 | "output_type": "execute_result" 706 | } 707 | ], 708 | "source": [ 709 | "# Conver scores array to dataframe\n", 710 | "scores_df = pd.DataFrame(scores)\n", 711 | "scores_df" 712 | ] 713 | }, 714 | { 715 | "cell_type": "code", 716 | "execution_count": null, 717 | "metadata": { 718 | "colab": { 719 | "base_uri": "https://localhost:8080/" 720 | }, 721 | "executionInfo": { 722 | "elapsed": 4, 723 | "status": "ok", 724 | "timestamp": 1631624799400, 725 | "user": { 726 | "displayName": "muntadher alkaabi", 727 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gj1adaCWpR-cnwQQK4viQZvo0gyRf1A0y9AkvnPMt0=s64", 728 | "userId": "00221067971932357642" 729 | }, 730 | "user_tz": -180 731 | }, 732 | "id": "SsLzjCddsoBY", 733 | "outputId": "ed481898-2aa6-4db3-b40c-e77969b4d6c4" 734 | }, 735 | "outputs": [ 736 | { 737 | "data": { 738 | "text/plain": [ 739 | "accuracy 0.917544\n", 740 | "precision 0.895911\n", 741 | "recall 0.986593\n", 742 | "dtype: float64" 743 | ] 744 | }, 745 | "execution_count": 17, 746 | "metadata": {}, 747 | "output_type": "execute_result" 748 | } 749 | ], 750 | "source": [ 751 | "# Calculate the mean of the scores\n", 752 | "scores_df.mean()" 753 | ] 754 | }, 755 | { 756 | "cell_type": "markdown", 757 | "metadata": { 758 | "id": "B8XrAKF-oR8h" 759 | }, 760 | "source": [ 761 | "### `StratifiedShuffleSplit`\n", 762 | "\n", 763 | "`StratifiedShuffleSplit` will generate indepedent pairs of shuffled training and testing 
sets. Here, however, it will ensure the training and test sets are stratified. " 764 | ] 765 | }, 766 | { 767 | "cell_type": "code", 768 | "execution_count": null, 769 | "metadata": { 770 | "id": "piVPY6RXs521" 771 | }, 772 | "outputs": [], 773 | "source": [ 774 | "stratified_shuffled_k_fold = model_selection.StratifiedShuffleSplit(\n", 775 | " n_splits=10\n", 776 | ") # generates 10 independent, randomly shuffled train/test splits, preserving the class proportions in each split\n", 777 | "\n", 778 | "# Empty array to store the scores\n", 779 | "scores = []\n", 780 | "\n", 781 | "for train_indices, test_indices in stratified_shuffled_k_fold.split(\n", 782 | " x, y\n", 783 | "): # y is needed here for stratification, similar to stratify = y.\n", 784 | " # Split data using our predefined function\n", 785 | " x_train, x_test, y_train, y_test = kfold_train_test_split(\n", 786 | " x, y, train_indices, test_indices\n", 787 | " )\n", 788 | "\n", 789 | " # Train model\n", 790 | " svc = svm.SVC()\n", 791 | " svc.fit(x_train, y_train)\n", 792 | "\n", 793 | " # Predict using test set\n", 794 | " y_pred = svc.predict(x_test)\n", 795 | "\n", 796 | " # Calculate scores\n", 797 | " accuracy = metrics.accuracy_score(y_test, y_pred)\n", 798 | " precision = metrics.precision_score(y_test, y_pred)\n", 799 | " recall = metrics.recall_score(y_test, y_pred)\n", 800 | "\n", 801 | " # Create scores dictionary\n", 802 | " scores_dict = {\"accuracy\": accuracy, \"precision\": precision, \"recall\": recall}\n", 803 | "\n", 804 | " # Append to scores array\n", 805 | " scores.append(scores_dict)" 806 | ] 807 | }, 808 | { 809 | "cell_type": "code", 810 | "execution_count": null, 811 | "metadata": { 812 | "colab": { 813 | "base_uri": "https://localhost:8080/", 814 | "height": 357 815 | }, 816 | "executionInfo": { 817 | "elapsed": 297, 818 | "status": "ok", 819 | "timestamp": 1631624817060, 820 | "user": { 821 | "displayName": "muntadher alkaabi", 822 | "photoUrl": 
"https://lh3.googleusercontent.com/a-/AOh14Gj1adaCWpR-cnwQQK4viQZvo0gyRf1A0y9AkvnPMt0=s64", 823 | "userId": "00221067971932357642" 824 | }, 825 | "user_tz": -180 826 | }, 827 | "id": "EWCm4_pSs526", 828 | "outputId": "abf05b16-a5ba-4532-aaa2-c0f48df5c573" 829 | }, 830 | "outputs": [ 831 | { 832 | "data": { 833 | "text/html": [ 834 | "
\n", 835 | "\n", 848 | "\n", 849 | " \n", 850 | " \n", 851 | " \n", 852 | " \n", 853 | " \n", 854 | " \n", 855 | " \n", 856 | " \n", 857 | " \n", 858 | " \n", 859 | " \n", 860 | " \n", 861 | " \n", 862 | " \n", 863 | " \n", 864 | " \n", 865 | " \n", 866 | " \n", 867 | " \n", 868 | " \n", 869 | " \n", 870 | " \n", 871 | " \n", 872 | " \n", 873 | " \n", 874 | " \n", 875 | " \n", 876 | " \n", 877 | " \n", 878 | " \n", 879 | " \n", 880 | " \n", 881 | " \n", 882 | " \n", 883 | " \n", 884 | " \n", 885 | " \n", 886 | " \n", 887 | " \n", 888 | " \n", 889 | " \n", 890 | " \n", 891 | " \n", 892 | " \n", 893 | " \n", 894 | " \n", 895 | " \n", 896 | " \n", 897 | " \n", 898 | " \n", 899 | " \n", 900 | " \n", 901 | " \n", 902 | " \n", 903 | " \n", 904 | " \n", 905 | " \n", 906 | " \n", 907 | " \n", 908 | " \n", 909 | " \n", 910 | " \n", 911 | " \n", 912 | " \n", 913 | " \n", 914 | " \n", 915 | " \n", 916 | " \n", 917 | " \n", 918 | " \n", 919 | "
   accuracy  precision    recall
0  0.947368   0.945946  0.972222
1  0.964912   0.972222  0.972222
2  0.947368   0.971429  0.944444
3  0.894737   0.894737  0.944444
4  0.929825   0.900000  1.000000
5  0.947368   0.923077  1.000000
6  1.000000   1.000000  1.000000
7  0.947368   0.945946  0.972222
8  0.912281   0.878049  1.000000
9  0.929825   0.900000  1.000000
\n", 920 | "
" 921 | ], 922 | "text/plain": [ 923 | " accuracy precision recall\n", 924 | "0 0.947368 0.945946 0.972222\n", 925 | "1 0.964912 0.972222 0.972222\n", 926 | "2 0.947368 0.971429 0.944444\n", 927 | "3 0.894737 0.894737 0.944444\n", 928 | "4 0.929825 0.900000 1.000000\n", 929 | "5 0.947368 0.923077 1.000000\n", 930 | "6 1.000000 1.000000 1.000000\n", 931 | "7 0.947368 0.945946 0.972222\n", 932 | "8 0.912281 0.878049 1.000000\n", 933 | "9 0.929825 0.900000 1.000000" 934 | ] 935 | }, 936 | "execution_count": 19, 937 | "metadata": {}, 938 | "output_type": "execute_result" 939 | } 940 | ], 941 | "source": [ 942 | "# Conver scores array to dataframe\n", 943 | "scores_df = pd.DataFrame(scores)\n", 944 | "scores_df" 945 | ] 946 | }, 947 | { 948 | "cell_type": "code", 949 | "execution_count": null, 950 | "metadata": { 951 | "colab": { 952 | "base_uri": "https://localhost:8080/" 953 | }, 954 | "executionInfo": { 955 | "elapsed": 319, 956 | "status": "ok", 957 | "timestamp": 1631624818825, 958 | "user": { 959 | "displayName": "muntadher alkaabi", 960 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gj1adaCWpR-cnwQQK4viQZvo0gyRf1A0y9AkvnPMt0=s64", 961 | "userId": "00221067971932357642" 962 | }, 963 | "user_tz": -180 964 | }, 965 | "id": "29NqkTl4s527", 966 | "outputId": "d5bf37a0-f8bc-4297-c180-9fac8140af6f" 967 | }, 968 | "outputs": [ 969 | { 970 | "data": { 971 | "text/plain": [ 972 | "accuracy 0.942105\n", 973 | "precision 0.933141\n", 974 | "recall 0.980556\n", 975 | "dtype: float64" 976 | ] 977 | }, 978 | "execution_count": 20, 979 | "metadata": {}, 980 | "output_type": "execute_result" 981 | } 982 | ], 983 | "source": [ 984 | "# Calculate the mean of the scores\n", 985 | "scores_df.mean()" 986 | ] 987 | }, 988 | { 989 | "cell_type": "code", 990 | "execution_count": null, 991 | "metadata": { 992 | "id": "wgvpAk5yau22" 993 | }, 994 | "outputs": [], 995 | "source": [] 996 | } 997 | ], 998 | "metadata": { 999 | "anaconda-cloud": {}, 1000 | "colab": { 1001 | "name": 
"Cross_Validation.ipynb", 1002 | "provenance": [] 1003 | }, 1004 | "kernelspec": { 1005 | "display_name": "Python 3", 1006 | "language": "python", 1007 | "name": "python3" 1008 | }, 1009 | "language_info": { 1010 | "codemirror_mode": { 1011 | "name": "ipython", 1012 | "version": 3 1013 | }, 1014 | "file_extension": ".py", 1015 | "mimetype": "text/x-python", 1016 | "name": "python", 1017 | "nbconvert_exporter": "python", 1018 | "pygments_lexer": "ipython3", 1019 | "version": "3.7.3" 1020 | }, 1021 | "toc": { 1022 | "nav_menu": {}, 1023 | "number_sections": true, 1024 | "sideBar": true, 1025 | "skip_h1_title": false, 1026 | "title_cell": "Table of Contents", 1027 | "title_sidebar": "Contents", 1028 | "toc_cell": false, 1029 | "toc_position": {}, 1030 | "toc_section_display": true, 1031 | "toc_window_display": false 1032 | } 1033 | }, 1034 | "nbformat": 4, 1035 | "nbformat_minor": 0 1036 | } -------------------------------------------------------------------------------- /Week 03 - Machine Learning Algorithms/1- Scikit_Learn.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "id": "cLHAiAzQEqU2" 7 | }, 8 | "source": [ 9 | "# Scikit-Learn\n", 10 | "\n", 11 | "\"Open\n", 12 | "\n", 13 | "\n", 14 | "[Scikit-learn](http://scikit-learn.org/stable/) is a python-based machine learning library providing implementations of a great many algorithms for supervised and unsupervised learning. In large part, it builds upon the cabilities of NumPy, SciPy, matplotlib, and Pandas.\n", 15 | "\n", 16 | "In the context of supervised learning, the primary objects scikit-learn defines are called **estimators**. Each of these defines a `fit` method, which develops a model from provided training data, and a `predict` method, which uses the model to map a new instance to a suitable target value. 
Scikit-learn also defines multiple utilities for partitioning and manipulating data sets as well as evaluating models.\n", 17 | "\n", 18 | "Below, we cover some of the basic steps needed to create a model in scikit-learn. These notes are based on material appearing in the *scikit-learn tutorials*.\n", 19 | "\n", 20 | "* [Tutorial](http://scikit-learn.org/stable/tutorial/index.html)\n", 21 | "* [Cheatsheet](https://s3.amazonaws.com/assets.datacamp.com/blog_assets/Scikit_Learn_Cheat_Sheet_Python.pdf)" 22 | ] 23 | }, 24 | { 25 | "cell_type": "markdown", 26 | "metadata": { 27 | "id": "GvzKfYgXEqVC" 28 | }, 29 | "source": [ 30 | "## Datasets\n", 31 | "\n", 32 | "Scikit-learn comes bundled with several pre-defined (typically small) `datasets` that users can explore.\n", 33 | "\n", 34 | " load_boston()\tLoad and return the boston house-prices dataset (regression).\n", 35 | " load_iris()\tLoad and return the iris dataset (classification).\n", 36 | " load_diabetes()\tLoad and return the diabetes dataset (regression).\n", 37 | " load_digits()\tLoad and return the digits dataset (classification).\n", 38 | " load_linnerud()\tLoad and return the linnerud dataset (multivariate regression).\n", 39 | " load_wine()\tLoad and return the wine dataset (classification).\n", 40 | " load_breast_cancer()\tLoad and return the breast cancer wisconsin dataset (classification).\n", 41 | "\n", 42 | "The iris dataset is loaded below, and a description of it is printed." 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": 1, 48 | "metadata": { 49 | "id": "tPAFLH3AEqVD" 50 | }, 51 | "outputs": [], 52 | "source": [ 53 | "import numpy as np\n", 54 | "import pandas as pd\n", 55 | "\n", 56 | "# using 'from * import ...' 
allows us to import submodules directly\n", 57 | "from sklearn import (\n", 58 | " datasets,\n", 59 | " model_selection,\n", 60 | " linear_model,\n", 61 | " metrics,\n", 62 | " neighbors,\n", 63 | " tree,\n", 64 | " ensemble,\n", 65 | " preprocessing,\n", 66 | ")\n", 67 | "\n", 68 | "# alternatively, we can import the whole package as such\n", 69 | "import sklearn" 70 | ] 71 | }, 72 | { 73 | "cell_type": "code", 74 | "execution_count": null, 75 | "metadata": { 76 | "colab": { 77 | "base_uri": "https://localhost:8080/" 78 | }, 79 | "executionInfo": { 80 | "elapsed": 850, 81 | "status": "ok", 82 | "timestamp": 1655312921916, 83 | "user": { 84 | "displayName": "Muntadher Al-kaabi", 85 | "userId": "00221067971932357642" 86 | }, 87 | "user_tz": -180 88 | }, 89 | "id": "FARFibuOuHaw", 90 | "outputId": "bb6e52cd-d142-4fa0-d857-7efb43aa353a" 91 | }, 92 | "outputs": [ 93 | { 94 | "name": "stdout", 95 | "output_type": "stream", 96 | "text": [ 97 | ".. _iris_dataset:\n", 98 | "\n", 99 | "Iris plants dataset\n", 100 | "--------------------\n", 101 | "\n", 102 | "**Data Set Characteristics:**\n", 103 | "\n", 104 | " :Number of Instances: 150 (50 in each of three classes)\n", 105 | " :Number of Attributes: 4 numeric, predictive attributes and the class\n", 106 | " :Attribute Information:\n", 107 | " - sepal length in cm\n", 108 | " - sepal width in cm\n", 109 | " - petal length in cm\n", 110 | " - petal width in cm\n", 111 | " - class:\n", 112 | " - Iris-Setosa\n", 113 | " - Iris-Versicolour\n", 114 | " - Iris-Virginica\n", 115 | " \n", 116 | " :Summary Statistics:\n", 117 | "\n", 118 | " ============== ==== ==== ======= ===== ====================\n", 119 | " Min Max Mean SD Class Correlation\n", 120 | " ============== ==== ==== ======= ===== ====================\n", 121 | " sepal length: 4.3 7.9 5.84 0.83 0.7826\n", 122 | " sepal width: 2.0 4.4 3.05 0.43 -0.4194\n", 123 | " petal length: 1.0 6.9 3.76 1.76 0.9490 (high!)\n", 124 | " petal width: 0.1 2.5 1.20 0.76 0.9565 
(high!)\n", 125 | " ============== ==== ==== ======= ===== ====================\n", 126 | "\n", 127 | " :Missing Attribute Values: None\n", 128 | " :Class Distribution: 33.3% for each of 3 classes.\n", 129 | " :Creator: R.A. Fisher\n", 130 | " :Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov)\n", 131 | " :Date: July, 1988\n", 132 | "\n", 133 | "The famous Iris database, first used by Sir R.A. Fisher. The dataset is taken\n", 134 | "from Fisher's paper. Note that it's the same as in R, but not as in the UCI\n", 135 | "Machine Learning Repository, which has two wrong data points.\n", 136 | "\n", 137 | "This is perhaps the best known database to be found in the\n", 138 | "pattern recognition literature. Fisher's paper is a classic in the field and\n", 139 | "is referenced frequently to this day. (See Duda & Hart, for example.) The\n", 140 | "data set contains 3 classes of 50 instances each, where each class refers to a\n", 141 | "type of iris plant. One class is linearly separable from the other 2; the\n", 142 | "latter are NOT linearly separable from each other.\n", 143 | "\n", 144 | ".. topic:: References\n", 145 | "\n", 146 | " - Fisher, R.A. \"The use of multiple measurements in taxonomic problems\"\n", 147 | " Annual Eugenics, 7, Part II, 179-188 (1936); also in \"Contributions to\n", 148 | " Mathematical Statistics\" (John Wiley, NY, 1950).\n", 149 | " - Duda, R.O., & Hart, P.E. (1973) Pattern Classification and Scene Analysis.\n", 150 | " (Q327.D83) John Wiley & Sons. ISBN 0-471-22361-1. See page 218.\n", 151 | " - Dasarathy, B.V. (1980) \"Nosing Around the Neighborhood: A New System\n", 152 | " Structure and Classification Rule for Recognition in Partially Exposed\n", 153 | " Environments\". IEEE Transactions on Pattern Analysis and Machine\n", 154 | " Intelligence, Vol. PAMI-2, No. 1, 67-71.\n", 155 | " - Gates, G.W. (1972) \"The Reduced Nearest Neighbor Rule\". 
IEEE Transactions\n", 156 | " on Information Theory, May 1972, 431-433.\n", 157 | " - See also: 1988 MLC Proceedings, 54-64. Cheeseman et al's AUTOCLASS II\n", 158 | " conceptual clustering system finds 3 classes in the data.\n", 159 | " - Many, many more ...\n" 160 | ] 161 | } 162 | ], 163 | "source": [ 164 | "iris_dataset = (\n", 165 | " datasets.load_iris()\n", 166 | ") # sklearn.datasets.load_iris() works exactly the same\n", 167 | "\n", 168 | "print(iris_dataset.DESCR)" 169 | ] 170 | }, 171 | { 172 | "cell_type": "markdown", 173 | "metadata": { 174 | "id": "idq1o-aou6WR" 175 | }, 176 | "source": [ 177 | "We can also use `iris_dataset.data` and `iris_dataset.target` to create our x & y (inputs & outputs) pairs that will be used for training and testing" 178 | ] 179 | }, 180 | { 181 | "cell_type": "code", 182 | "execution_count": null, 183 | "metadata": { 184 | "colab": { 185 | "base_uri": "https://localhost:8080/", 186 | "height": 424 187 | }, 188 | "executionInfo": { 189 | "elapsed": 32, 190 | "status": "ok", 191 | "timestamp": 1655312925923, 192 | "user": { 193 | "displayName": "Muntadher Al-kaabi", 194 | "userId": "00221067971932357642" 195 | }, 196 | "user_tz": -180 197 | }, 198 | "id": "JbXkcYblu6DT", 199 | "outputId": "0ead5436-2f61-4296-84f7-c5325c204a2c" 200 | }, 201 | "outputs": [ 202 | { 203 | "data": { 204 | "text/html": [ 205 | "\n", 206 | 
\n", 207 | "
\n", 208 | "
\n", 209 | "\n", 222 | "\n", 223 | " \n", 224 | " \n", 225 | " \n", 226 | " \n", 227 | " \n", 228 | " \n", 229 | " \n", 230 | " \n", 231 | " \n", 232 | " \n", 233 | " \n", 234 | " \n", 235 | " \n", 236 | " \n", 237 | " \n", 238 | " \n", 239 | " \n", 240 | " \n", 241 | " \n", 242 | " \n", 243 | " \n", 244 | " \n", 245 | " \n", 246 | " \n", 247 | " \n", 248 | " \n", 249 | " \n", 250 | " \n", 251 | " \n", 252 | " \n", 253 | " \n", 254 | " \n", 255 | " \n", 256 | " \n", 257 | " \n", 258 | " \n", 259 | " \n", 260 | " \n", 261 | " \n", 262 | " \n", 263 | " \n", 264 | " \n", 265 | " \n", 266 | " \n", 267 | " \n", 268 | " \n", 269 | " \n", 270 | " \n", 271 | " \n", 272 | " \n", 273 | " \n", 274 | " \n", 275 | " \n", 276 | " \n", 277 | " \n", 278 | " \n", 279 | " \n", 280 | " \n", 281 | " \n", 282 | " \n", 283 | " \n", 284 | " \n", 285 | " \n", 286 | " \n", 287 | " \n", 288 | " \n", 289 | " \n", 290 | " \n", 291 | " \n", 292 | " \n", 293 | " \n", 294 | " \n", 295 | " \n", 296 | " \n", 297 | " \n", 298 | " \n", 299 | " \n", 300 | " \n", 301 | " \n", 302 | " \n", 303 | " \n", 304 | " \n", 305 | " \n", 306 | " \n", 307 | " \n", 308 | " \n", 309 | " \n", 310 | " \n", 311 | "
     sepal length (cm)  sepal width (cm)  petal length (cm)  petal width (cm)
0                  5.1               3.5                1.4               0.2
1                  4.9               3.0                1.4               0.2
2                  4.7               3.2                1.3               0.2
3                  4.6               3.1                1.5               0.2
4                  5.0               3.6                1.4               0.2
..                 ...               ...                ...               ...
145                6.7               3.0                5.2               2.3
146                6.3               2.5                5.0               1.9
147                6.5               3.0                5.2               2.0
148                6.2               3.4                5.4               2.3
149                5.9               3.0                5.1               1.8
\n", 312 | "

150 rows × 4 columns

\n", 313 | "
\n", 314 | " \n", 324 | " \n", 325 | " \n", 362 | "\n", 363 | " \n", 387 | "
\n", 388 | "
\n", 389 | " " 390 | ], 391 | "text/plain": [ 392 | " sepal length (cm) sepal width (cm) petal length (cm) petal width (cm)\n", 393 | "0 5.1 3.5 1.4 0.2\n", 394 | "1 4.9 3.0 1.4 0.2\n", 395 | "2 4.7 3.2 1.3 0.2\n", 396 | "3 4.6 3.1 1.5 0.2\n", 397 | "4 5.0 3.6 1.4 0.2\n", 398 | ".. ... ... ... ...\n", 399 | "145 6.7 3.0 5.2 2.3\n", 400 | "146 6.3 2.5 5.0 1.9\n", 401 | "147 6.5 3.0 5.2 2.0\n", 402 | "148 6.2 3.4 5.4 2.3\n", 403 | "149 5.9 3.0 5.1 1.8\n", 404 | "\n", 405 | "[150 rows x 4 columns]" 406 | ] 407 | }, 408 | "execution_count": 6, 409 | "metadata": {}, 410 | "output_type": "execute_result" 411 | } 412 | ], 413 | "source": [ 414 | "x = pd.DataFrame(iris_dataset.data, columns=iris_dataset.feature_names)\n", 415 | "y = pd.DataFrame(iris_dataset.target, columns=[\"Labels\"])\n", 416 | "\n", 417 | "x" 418 | ] 419 | }, 420 | { 421 | "cell_type": "markdown", 422 | "metadata": { 423 | "id": "5S3W8bkDuvH4" 424 | }, 425 | "source": [ 426 | "Alternatively, can load a dataset into x & y directly (i.e. 
into input/output pairs) by setting the `return_X_y` parameter to `True`" 427 | ] 428 | }, 429 | { 430 | "cell_type": "code", 431 | "execution_count": null, 432 | "metadata": { 433 | "colab": { 434 | "base_uri": "https://localhost:8080/" 435 | }, 436 | "executionInfo": { 437 | "elapsed": 6, 438 | "status": "ok", 439 | "timestamp": 1655312926631, 440 | "user": { 441 | "displayName": "Muntadher Al-kaabi", 442 | "userId": "00221067971932357642" 443 | }, 444 | "user_tz": -180 445 | }, 446 | "id": "IqmQMvHbuuZq", 447 | "outputId": "5a789103-3af9-431b-ab9e-c66fccd07d73" 448 | }, 449 | "outputs": [ 450 | { 451 | "data": { 452 | "text/plain": [ 453 | "((150, 4), (150,))" 454 | ] 455 | }, 456 | "execution_count": 7, 457 | "metadata": {}, 458 | "output_type": "execute_result" 459 | } 460 | ], 461 | "source": [ 462 | "x, y = datasets.load_iris(return_X_y=True)\n", 463 | "\n", 464 | "x.shape, y.shape" 465 | ] 466 | }, 467 | { 468 | "cell_type": "markdown", 469 | "metadata": { 470 | "id": "sVzD_4buvw8d" 471 | }, 472 | "source": [ 473 | "## Train/Test Split\n", 474 | "\n", 475 | "In order to validate that our model can generalize to data that it wasn't trained on, it's necessary to create a separate **testing dataset** that will not be used in training.\n", 476 | "\n", 477 | "Within the `model_selection` submodule of Scikit Learn, there's the `train_test_split` function that we can use to automatically split the data into training and testing pairs.\n", 478 | "\n", 479 | "Here's an explanation of the different parameters, taken directly from the function's docstring\n", 480 | "\n", 481 | "#### **Parameters**\n", 482 | "\n", 483 | "**arrays** : sequence of indexables with same length / shape[0]\n", 484 | " Allowed inputs are lists, numpy arrays, scipy-sparse\n", 485 | " matrices or pandas dataframes.\n", 486 | "\n", 487 | "**test_size** : float, int or None, optional (default=None)\n", 488 | " If float, should be between 0.0 and 1.0 and represent the proportion\n", 489 | " of the dataset to 
include in the test split. If int, represents the\n", 490 | " absolute number of test samples. If None, the value is set to the\n", 491 | " complement of the train size. If train_size is also None, it will\n", 492 | " be set to 0.25.\n", 493 | "\n", 494 | "**train_size** : float, int, or None, (default=None)\n", 495 | " If float, should be between 0.0 and 1.0 and represent the\n", 496 | " proportion of the dataset to include in the train split. If\n", 497 | " int, represents the absolute number of train samples. If None,\n", 498 | " the value is automatically set to the complement of the test size.\n", 499 | "\n", 500 | "**random_state** : int, RandomState instance or None, optional (default=None)\n", 501 | " If int, random_state is the seed used by the random number generator;\n", 502 | " If RandomState instance, random_state is the random number generator;\n", 503 | " If None, the random number generator is the RandomState instance used\n", 504 | " by np.random.\n", 505 | "\n", 506 | "**shuffle** : boolean, optional (default=True)\n", 507 | " Whether or not to shuffle the data before splitting. 
If shuffle=False\n", 508 | " then stratify must be None.\n", 509 | "\n", 510 | "**stratify** : array-like or None (default=None)\n", 511 | " If not None, data is split in a stratified fashion, using this as\n", 512 | " the class labels.\n", 513 | "\n", 514 | "\n", 515 | "\n", 516 | "\n" 517 | ] 518 | }, 519 | { 520 | "cell_type": "code", 521 | "execution_count": null, 522 | "metadata": { 523 | "id": "aEKUby5MwOUc" 524 | }, 525 | "outputs": [], 526 | "source": [ 527 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 528 | " x, y, test_size=0.1, random_state=42, stratify=y\n", 529 | ")" 530 | ] 531 | }, 532 | { 533 | "cell_type": "markdown", 534 | "metadata": { 535 | "id": "R9chq57lximz" 536 | }, 537 | "source": [ 538 | "Please note that the `stratify` parameter works only in the context of classification tasks where there are a fixed amount of possible outputs/targets" 539 | ] 540 | }, 541 | { 542 | "cell_type": "markdown", 543 | "metadata": { 544 | "id": "QohA1Fh6blL9" 545 | }, 546 | "source": [ 547 | "# Fitting and predicting: estimator basics\n", 548 | "\n", 549 | "Scikit-learn provides dozens of built-in machine learning algorithms and models, called estimators. Each estimator can be fitted to some data using its fit method.\n", 550 | "\n", 551 | "Here is a simple example where we fit a Linear Regression to some very basic data:" 552 | ] 553 | }, 554 | { 555 | "cell_type": "code", 556 | "execution_count": 3, 557 | "metadata": { 558 | "colab": { 559 | "base_uri": "https://localhost:8080/" 560 | }, 561 | "executionInfo": { 562 | "elapsed": 6, 563 | "status": "ok", 564 | "timestamp": 1655312940742, 565 | "user": { 566 | "displayName": "Muntadher Al-kaabi", 567 | "userId": "00221067971932357642" 568 | }, 569 | "user_tz": -180 570 | }, 571 | "id": "7IrGMK3_bf3W", 572 | "outputId": "e7f9f450-86a1-410f-9b09-d122fab5f037" 573 | }, 574 | "outputs": [ 575 | { 576 | "data": { 577 | "text/html": [ 578 | "
LinearRegression()
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
" 579 | ], 580 | "text/plain": [ 581 | "LinearRegression()" 582 | ] 583 | }, 584 | "execution_count": 3, 585 | "metadata": {}, 586 | "output_type": "execute_result" 587 | } 588 | ], 589 | "source": [ 590 | "x = [[1, 2, 3], [11, 12, 13]] # 2 samples, 3 features\n", 591 | "y = [0.5, 0.1] # classes of each sample\n", 592 | "\n", 593 | "model = linear_model.LinearRegression()\n", 594 | "\n", 595 | "model.fit(x, y)" 596 | ] 597 | }, 598 | { 599 | "cell_type": "code", 600 | "execution_count": 5, 601 | "metadata": { 602 | "colab": { 603 | "base_uri": "https://localhost:8080/" 604 | }, 605 | "executionInfo": { 606 | "elapsed": 399, 607 | "status": "ok", 608 | "timestamp": 1655312943365, 609 | "user": { 610 | "displayName": "Muntadher Al-kaabi", 611 | "userId": "00221067971932357642" 612 | }, 613 | "user_tz": -180 614 | }, 615 | "id": "xzxhbQ0PdWcq", 616 | "outputId": "7e299cee-0312-4cf8-ee02-8926e6b883a5" 617 | }, 618 | "outputs": [ 619 | { 620 | "name": "stdout", 621 | "output_type": "stream", 622 | "text": [ 623 | "[0.5 0.1]\n", 624 | "[ 0.38 -0.02]\n" 625 | ] 626 | } 627 | ], 628 | "source": [ 629 | "pred = model.predict(x) # predict classes of the training data\n", 630 | "print(pred)\n", 631 | "pred = model.predict([[4, 5, 6], [14, 15, 16]]) # predict classes of new data\n", 632 | "print(pred)" 633 | ] 634 | }, 635 | { 636 | "cell_type": "markdown", 637 | "metadata": { 638 | "id": "jb4V_h7rc-Ls" 639 | }, 640 | "source": [ 641 | "The `fit` method generally accepts 2 inputs:\n", 642 | "\n", 643 | "1. The samples matrix (or design matrix) X. The size of X is typically (n_samples, n_features), which means that samples are represented as rows and features are represented as columns.\n", 644 | "\n", 645 | "2. The target values y which are real numbers for regression tasks, or integers for classification (or any other discrete set of values). For unsupervized learning tasks, y does not need to be specified. 
y is usually a 1d array where the i-th entry corresponds to the target of the i-th sample (row) of X.\n", 646 | "\n", 647 | "Both X and y are usually expected to be numpy arrays or equivalent array-like data types, though some estimators work with other formats such as sparse matrices.\n", 648 | "\n", 649 | "Once the estimator is fitted, it can be used for predicting target values of new data. You don’t need to re-train the estimator:" 650 | ] 651 | }, 652 | { 653 | "cell_type": "markdown", 654 | "metadata": { 655 | "id": "mN9b1m-YEqVP" 656 | }, 657 | "source": [ 658 | "# Linear Regression\n", 659 | "\n", 660 | "In statistics, linear regression is a linear approach to modelling the relationship between a set of features and a desired output. The case of one input feature is called simple linear regression; for more than one, the process is called multiple linear regression.\n", 661 | "\n", 662 | "Scikit Learn defines this algorithm in the `LinearRegression` class as a part of the `linear_model` module.\n" 663 | ] 664 | }, 665 | { 666 | "cell_type": "markdown", 667 | "metadata": { 668 | "id": "euPwTAlozl-z" 669 | }, 670 | "source": [ 671 | "First, we load the data" 672 | ] 673 | }, 674 | { 675 | "cell_type": "code", 676 | "execution_count": null, 677 | "metadata": { 678 | "colab": { 679 | "base_uri": "https://localhost:8080/" 680 | }, 681 | "executionInfo": { 682 | "elapsed": 6, 683 | "status": "ok", 684 | "timestamp": 1655312958796, 685 | "user": { 686 | "displayName": "Muntadher Al-kaabi", 687 | "userId": "00221067971932357642" 688 | }, 689 | "user_tz": -180 690 | }, 691 | "id": "urCbsGeMEqVP", 692 | "outputId": "a1a3beed-4b8e-4045-98a6-fa542405fb29" 693 | }, 694 | "outputs": [ 695 | { 696 | "name": "stdout", 697 | "output_type": "stream", 698 | "text": [ 699 | "Diabetes features/input shape: (442, 10)\n", 700 | "Diabetes target/output shape: (442,)\n" 701 | ] 702 | } 703 | ], 704 | "source": [ 705 | "x, y = datasets.load_diabetes(return_X_y=True)\n", 706 | "# 
normalize the values of x and y\n", 707 | "y_normalize = preprocessing.MinMaxScaler()\n", 708 | "y_norm = y_normalize.fit_transform(y.reshape(-1, 1)) # normalize the y\n", 709 | "\n", 710 | "x_normalize = preprocessing.StandardScaler()\n", 711 | "x_norm = x_normalize.fit_transform(x) # normalize the x\n", 712 | "\n", 713 | "print(\"Diabetes features/input shape:\", x.shape)\n", 714 | "print(\"Diabetes target/output shape:\", y.shape)" 715 | ] 716 | }, 717 | { 718 | "cell_type": "markdown", 719 | "metadata": { 720 | "id": "jsXgJi2uzh96" 721 | }, 722 | "source": [ 723 | "Second, we split the data into a 90/10 training/testing split (90% of the data will be used for training while 10% will be used for testing)" 724 | ] 725 | }, 726 | { 727 | "cell_type": "code", 728 | "execution_count": null, 729 | "metadata": { 730 | "colab": { 731 | "base_uri": "https://localhost:8080/" 732 | }, 733 | "executionInfo": { 734 | "elapsed": 493, 735 | "status": "ok", 736 | "timestamp": 1655312962127, 737 | "user": { 738 | "displayName": "Muntadher Al-kaabi", 739 | "userId": "00221067971932357642" 740 | }, 741 | "user_tz": -180 742 | }, 743 | "id": "tgO8Mj4jlXdZ", 744 | "outputId": "36c8e198-0f5e-41a2-d88d-3e40831b64c6" 745 | }, 746 | "outputs": [ 747 | { 748 | "data": { 749 | "text/plain": [ 750 | "((397, 10), (45, 10), (397,), (45,))" 751 | ] 752 | }, 753 | "execution_count": 12, 754 | "metadata": {}, 755 | "output_type": "execute_result" 756 | } 757 | ], 758 | "source": [ 759 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 760 | " x_norm, y_norm.reshape(-1), test_size=0.1, random_state=42\n", 761 | ")\n", 762 | "\n", 763 | "x_train.shape, x_test.shape, y_train.shape, y_test.shape" 764 | ] 765 | }, 766 | { 767 | "cell_type": "markdown", 768 | "metadata": { 769 | "id": "D3L2rnqyzXzV" 770 | }, 771 | "source": [ 772 | "Third, we train (i.e. 
`fit`) the model using the training dataset (`x_train` as inputs, `y_train` as targets)" 773 | ] 774 | }, 775 | { 776 | "cell_type": "code", 777 | "execution_count": null, 778 | "metadata": { 779 | "colab": { 780 | "base_uri": "https://localhost:8080/" 781 | }, 782 | "executionInfo": { 783 | "elapsed": 6, 784 | "status": "ok", 785 | "timestamp": 1655312963309, 786 | "user": { 787 | "displayName": "Muntadher Al-kaabi", 788 | "userId": "00221067971932357642" 789 | }, 790 | "user_tz": -180 791 | }, 792 | "id": "Xy0-GOCqEqVQ", 793 | "outputId": "4d495b7d-5598-40f7-b372-dfb9a879c65b" 794 | }, 795 | "outputs": [ 796 | { 797 | "name": "stdout", 798 | "output_type": "stream", 799 | "text": [ 800 | "Weights:\n", 801 | " [ 0.00295256 -0.03890481 0.07545094 0.04980218 -0.12584691 0.07115817\n", 802 | " 0.01788275 0.03507649 0.10618591 0.01043328]\n", 803 | "Bias:\n", 804 | " 0.39477478088542883\n" 805 | ] 806 | } 807 | ], 808 | "source": [ 809 | "regressor = (\n", 810 | " linear_model.LinearRegression()\n", 811 | ") # initialize the parameter of linear regression model\n", 812 | "regressor.fit(x_train, y_train) # training the model on the train data\n", 813 | "\n", 814 | "# we can preview the learned coefficients (i.e. weights) and intercept (i.e. 
bias)\n", 815 | "\n", 816 | "print(\"Weights:\\n\", regressor.coef_)\n", 817 | "print(\"Bias:\\n\", regressor.intercept_)" 818 | ] 819 | }, 820 | { 821 | "cell_type": "markdown", 822 | "metadata": { 823 | "id": "PwY6r_g5zTMN" 824 | }, 825 | "source": [ 826 | "Fourth, we'll feed the test set into the trained model" 827 | ] 828 | }, 829 | { 830 | "cell_type": "code", 831 | "execution_count": null, 832 | "metadata": { 833 | "id": "SC2kNUdaAe7_" 834 | }, 835 | "outputs": [], 836 | "source": [ 837 | "y_pred = regressor.predict(x_test)" 838 | ] 839 | }, 840 | { 841 | "cell_type": "markdown", 842 | "metadata": { 843 | "id": "edEogEf-zRCl" 844 | }, 845 | "source": [ 846 | "Finally, we'll evaluate the predicted output against the ground-truth values in `y_test` using Scikit Learn's `metrics` module\n", 847 | "\n", 848 | "One of the most used metrics to evaluate regression models is `mean_squared_error` which has the following formula: $$\\frac{1}{n}\\sum_{i=1}^{n}(\\hat y_i - y_i)^2$$\n", 849 | "\n", 850 | "Where `n` is the total number of examples evaluated (in this case 45), $\\hat y$ is the predicted value (here `y_pred`) and $y$ is the ground-truth value (here `y_test`)\n" 851 | ] 852 | }, 853 | { 854 | "cell_type": "code", 855 | "execution_count": null, 856 | "metadata": { 857 | "colab": { 858 | "base_uri": "https://localhost:8080/" 859 | }, 860 | "executionInfo": { 861 | "elapsed": 6, 862 | "status": "ok", 863 | "timestamp": 1655312966852, 864 | "user": { 865 | "displayName": "Muntadher Al-kaabi", 866 | "userId": "00221067971932357642" 867 | }, 868 | "user_tz": -180 869 | }, 870 | "id": "_2ma5hdDA6tX", 871 | "outputId": "c05c3168-13b2-4fdb-aaa0-14c3c3b9c0c3" 872 | }, 873 | "outputs": [ 874 | { 875 | "data": { 876 | "text/plain": [ 877 | "0.02662901220648913" 878 | ] 879 | }, 880 | "execution_count": 15, 881 | "metadata": {}, 882 | "output_type": "execute_result" 883 | } 884 | ], 885 | "source": [ 886 | "metrics.mean_squared_error(y_test, y_pred)" 887 | ] 888 | }, 889 
| { 890 | "cell_type": "markdown", 891 | "metadata": { 892 | "id": "zZSazzIi1xar" 893 | }, 894 | "source": [ 895 | "# Logistic Regression\n", 896 | "\n", 897 | "In statistics, the logistic model (or logit model) is used to model the probability of a certain class or event existing, such as pass/fail, win/lose, alive/dead, or healthy/sick. This can be extended to model several classes of events, such as determining whether an image contains a cat, dog, lion, etc. Each possible class would be assigned a probability between 0 and 1, with the probabilities summing to one.\n", 898 | "\n", 899 | "\n", 900 | "Scikit Learn defines this algorithm in the `LogisticRegression` class, part of the `linear_model` module.\n" 901 | ] 902 | }, 903 | { 904 | "cell_type": "markdown", 905 | "metadata": { 906 | "id": "Lq8iQ7Sl1xas" 907 | }, 908 | "source": [ 909 | "First, we load the data" 910 | ] 911 | }, 912 | { 913 | "cell_type": "code", 914 | "execution_count": null, 915 | "metadata": { 916 | "colab": { 917 | "base_uri": "https://localhost:8080/" 918 | }, 919 | "executionInfo": { 920 | "elapsed": 424, 921 | "status": "ok", 922 | "timestamp": 1655312969046, 923 | "user": { 924 | "displayName": "Muntadher Al-kaabi", 925 | "userId": "00221067971932357642" 926 | }, 927 | "user_tz": -180 928 | }, 929 | "id": "YTS2cFdD1xas", 930 | "outputId": "906679fb-ad5c-4539-ac68-a421d1eee4ef" 931 | }, 932 | "outputs": [ 933 | { 934 | "name": "stdout", 935 | "output_type": "stream", 936 | "text": [ 937 | "Breast Cancer features/input shape: (569, 30)\n", 938 | "Breast Cancer target/output shape: (569,)\n" 939 | ] 940 | } 941 | ], 942 | "source": [ 943 | "x, y = datasets.load_breast_cancer(return_X_y=True)\n", 944 | "# standardize the values of x (zero mean, unit variance)\n", 945 | "x_normalize = preprocessing.StandardScaler()\n", 946 | "x_norm = x_normalize.fit_transform(x)\n", 947 | "print(\"Breast Cancer features/input shape:\", x_norm.shape)\n", 948 | "print(\"Breast Cancer target/output shape:\", y.shape)" 949 | ] 950 | }, 951
| { 952 | "cell_type": "markdown", 953 | "metadata": { 954 | "id": "0ZIGpVe_1xat" 955 | }, 956 | "source": [ 957 | "Second, we split the data into a 90/10 training/testing split (90% of the data will be used for training while 10% will be used for testing)\n", 958 | "\n", 959 | "Since this is a classification problem (there are only two possible outputs, 1 or 0), we can use the `stratify` parameter to ensure that the two classes are distributed proportionally between the training and testing sets, preserving the data's original class distribution." 960 | ] 961 | }, 962 | { 963 | "cell_type": "code", 964 | "execution_count": null, 965 | "metadata": { 966 | "colab": { 967 | "base_uri": "https://localhost:8080/" 968 | }, 969 | "executionInfo": { 970 | "elapsed": 549, 971 | "status": "ok", 972 | "timestamp": 1655312971415, 973 | "user": { 974 | "displayName": "Muntadher Al-kaabi", 975 | "userId": "00221067971932357642" 976 | }, 977 | "user_tz": -180 978 | }, 979 | "id": "Vh9i-abM1xat", 980 | "outputId": "01f1ab71-6442-414b-cb0a-56f6008250d7" 981 | }, 982 | "outputs": [ 983 | { 984 | "data": { 985 | "text/plain": [ 986 | "((512, 30), (57, 30), (512,), (57,))" 987 | ] 988 | }, 989 | "execution_count": 17, 990 | "metadata": {}, 991 | "output_type": "execute_result" 992 | } 993 | ], 994 | "source": [ 995 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 996 | " x_norm, y, test_size=0.1, random_state=42, stratify=y\n", 997 | ")\n", 998 | "\n", 999 | "x_train.shape, x_test.shape, y_train.shape, y_test.shape" 1000 | ] 1001 | }, 1002 | { 1003 | "cell_type": "markdown", 1004 | "metadata": { 1005 | "id": "MQBL3xRW1xau" 1006 | }, 1007 | "source": [ 1008 | "Third, we train (i.e.
`fit`) the model using the training dataset (`x_train` as inputs, `y_train` as targets)" 1009 | ] 1010 | }, 1011 | { 1012 | "cell_type": "code", 1013 | "execution_count": null, 1014 | "metadata": { 1015 | "colab": { 1016 | "base_uri": "https://localhost:8080/" 1017 | }, 1018 | "executionInfo": { 1019 | "elapsed": 7, 1020 | "status": "ok", 1021 | "timestamp": 1655312971847, 1022 | "user": { 1023 | "displayName": "Muntadher Al-kaabi", 1024 | "userId": "00221067971932357642" 1025 | }, 1026 | "user_tz": -180 1027 | }, 1028 | "id": "C_rv1UZf1xau", 1029 | "outputId": "ac8b6fe5-b0d1-4dee-952e-911938ae2f4d" 1030 | }, 1031 | "outputs": [ 1032 | { 1033 | "name": "stdout", 1034 | "output_type": "stream", 1035 | "text": [ 1036 | "Weights:\n", 1037 | " [ 0.00295256 -0.03890481 0.07545094 0.04980218 -0.12584691 0.07115817\n", 1038 | " 0.01788275 0.03507649 0.10618591 0.01043328]\n", 1039 | "Bias:\n", 1040 | " 0.39477478088542883\n" 1041 | ] 1042 | } 1043 | ], 1044 | "source": [ 1045 | "classifier = linear_model.LogisticRegression()\n", 1046 | "classifier.fit(x_train, y_train)\n", 1047 | "\n", 1048 | "# we can preview the learned coefficients (i.e. weights) and intercept (i.e. 
bias)\n", 1049 | "\n", 1050 | "print(\"Weights:\\n\", classifier.coef_)\n", 1051 | "print(\"Bias:\\n\", classifier.intercept_)" 1052 | ] 1053 | }, 1054 | { 1055 | "cell_type": "markdown", 1056 | "metadata": { 1057 | "id": "xE2N6-SR1xau" 1058 | }, 1059 | "source": [ 1060 | "Fourth, we'll feed the test set into the trained model" 1061 | ] 1062 | }, 1063 | { 1064 | "cell_type": "code", 1065 | "execution_count": null, 1066 | "metadata": { 1067 | "id": "hV_LmBvP1xav" 1068 | }, 1069 | "outputs": [], 1070 | "source": [ 1071 | "y_pred = classifier.predict(x_test)" 1072 | ] 1073 | }, 1074 | { 1075 | "cell_type": "markdown", 1076 | "metadata": { 1077 | "id": "ReSoDvxc1xav" 1078 | }, 1079 | "source": [ 1080 | "Finally, we'll evaluate the predicted output against the ground-truth values in `y_test` using Scikit Learn's `metrics` module\n", 1081 | "\n", 1082 | "One of the most used metrics for evaluating classification models is `accuracy_score`, which calculates the percentage of examples that the trained classifier predicted correctly\n" 1083 | ] 1084 | }, 1085 | { 1086 | "cell_type": "code", 1087 | "execution_count": null, 1088 | "metadata": { 1089 | "colab": { 1090 | "base_uri": "https://localhost:8080/" 1091 | }, 1092 | "executionInfo": { 1093 | "elapsed": 6, 1094 | "status": "ok", 1095 | "timestamp": 1655312976677, 1096 | "user": { 1097 | "displayName": "Muntadher Al-kaabi", 1098 | "userId": "00221067971932357642" 1099 | }, 1100 | "user_tz": -180 1101 | }, 1102 | "id": "6WjQ-aSI1xav", 1103 | "outputId": "0ec58b7f-04cf-4232-b8ea-64d526723350" 1104 | }, 1105 | "outputs": [ 1106 | { 1107 | "data": { 1108 | "text/plain": [ 1109 | "0.9649122807017544" 1110 | ] 1111 | }, 1112 | "execution_count": 20, 1113 | "metadata": {}, 1114 | "output_type": "execute_result" 1115 | } 1116 | ], 1117 | "source": [ 1118 | "metrics.accuracy_score(y_test, y_pred)" 1119 | ] 1120 | }, 1121 | { 1122 | "cell_type": "markdown", 1123 | "metadata": { 1124 | "id": "h0HWW8lfVFfc" 1125 | }, 1126 | "source": [
1127 | "# Pipeline\n", 1128 | "Scikit-learn's `Pipeline` class is a useful tool for encapsulating multiple transformers alongside an estimator in a single object, so that you only need to call the important methods (`fit()`, `predict()`, etc.) once." 1129 | ] 1130 | }, 1131 | { 1132 | "cell_type": "code", 1133 | "execution_count": null, 1134 | "metadata": { 1135 | "id": "P-UnRk1OVbqP" 1136 | }, 1137 | "outputs": [], 1138 | "source": [ 1139 | "# Import the sklearn pipeline\n", 1140 | "from sklearn.pipeline import Pipeline" 1141 | ] 1142 | }, 1143 | { 1144 | "cell_type": "code", 1145 | "execution_count": null, 1146 | "metadata": { 1147 | "id": "A9Lv2nMmWez0" 1148 | }, 1149 | "outputs": [], 1150 | "source": [ 1151 | "# Load the dataset\n", 1152 | "x, y = datasets.load_breast_cancer(return_X_y=True)\n", 1153 | "# Split the dataset into train and test sets\n", 1154 | "x_train, x_test, y_train, y_test = model_selection.train_test_split(\n", 1155 | " x, y, test_size=0.1, random_state=42, stratify=y\n", 1156 | ")" 1157 | ] 1158 | }, 1159 | { 1160 | "cell_type": "markdown", 1161 | "metadata": { 1162 | "id": "OP786qxEWXoS" 1163 | }, 1164 | "source": [ 1165 | "Each step in the pipeline is defined as a (name, transformer/estimator) tuple, and the steps are applied in order. In the code below, the pipeline first scales the features with `StandardScaler` and then fits a `LogisticRegression` classifier. Calling `fit` on the pipeline fits the scaler on the training data, transforms that data, and then fits the classifier, while `predict` and `score` apply the already-fitted scaler to the test data first, keeping the preprocessing consistent between training and testing."
1166 | ] 1167 | }, 1168 | { 1169 | "cell_type": "code", 1170 | "execution_count": null, 1171 | "metadata": { 1172 | "colab": { 1173 | "base_uri": "https://localhost:8080/" 1174 | }, 1175 | "executionInfo": { 1176 | "elapsed": 370, 1177 | "status": "ok", 1178 | "timestamp": 1655314427549, 1179 | "user": { 1180 | "displayName": "Muntadher Al-kaabi", 1181 | "userId": "00221067971932357642" 1182 | }, 1183 | "user_tz": -180 1184 | }, 1185 | "id": "_VQDmA6NVhbM", 1186 | "outputId": "1f123744-f6a7-482c-bb59-4f11c85dc14f" 1187 | }, 1188 | "outputs": [ 1189 | { 1190 | "data": { 1191 | "text/plain": [ 1192 | "Pipeline(steps=[('scaler', StandardScaler()),\n", 1193 | " ('Logistic_R', LogisticRegression())])" 1194 | ] 1195 | }, 1196 | "execution_count": 26, 1197 | "metadata": {}, 1198 | "output_type": "execute_result" 1199 | } 1200 | ], 1201 | "source": [ 1202 | "# Create the sklearn pipeline\n", 1203 | "pipe = Pipeline(\n", 1204 | " [\n", 1205 | " (\"scaler\", preprocessing.StandardScaler()),\n", 1206 | " (\"Logistic_R\", linear_model.LogisticRegression()),\n", 1207 | " ]\n", 1208 | ")\n", 1209 | "\n", 1210 | "# fit the pipeline\n", 1211 | "pipe.fit(x_train, y_train)" 1212 | ] 1213 | }, 1214 | { 1215 | "cell_type": "code", 1216 | "execution_count": null, 1217 | "metadata": { 1218 | "colab": { 1219 | "base_uri": "https://localhost:8080/" 1220 | }, 1221 | "executionInfo": { 1222 | "elapsed": 597, 1223 | "status": "ok", 1224 | "timestamp": 1655314479639, 1225 | "user": { 1226 | "displayName": "Muntadher Al-kaabi", 1227 | "userId": "00221067971932357642" 1228 | }, 1229 | "user_tz": -180 1230 | }, 1231 | "id": "mKjjiGTbbSTl", 1232 | "outputId": "aa586a9c-471f-496e-da5e-6014846717c3" 1233 | }, 1234 | "outputs": [ 1235 | { 1236 | "data": { 1237 | "text/plain": [ 1238 | "0.9649122807017544" 1239 | ] 1240 | }, 1241 | "execution_count": 28, 1242 | "metadata": {}, 1243 | "output_type": "execute_result" 1244 | } 1245 | ], 1246 | "source": [ 1247 | "# Calculate the Accuracy of the 
model\n", 1248 | "pipe.score(x_test, y_test)" 1249 | ] 1250 | } 1251 | ], 1252 | "metadata": { 1253 | "colab": { 1254 | "collapsed_sections": [], 1255 | "name": "1- Scikit_Learn.ipynb", 1256 | "provenance": [] 1257 | }, 1258 | "kernelspec": { 1259 | "display_name": "Python 3.9.13 64-bit", 1260 | "language": "python", 1261 | "name": "python3" 1262 | }, 1263 | "language_info": { 1264 | "codemirror_mode": { 1265 | "name": "ipython", 1266 | "version": 3 1267 | }, 1268 | "file_extension": ".py", 1269 | "mimetype": "text/x-python", 1270 | "name": "python", 1271 | "nbconvert_exporter": "python", 1272 | "pygments_lexer": "ipython3", 1273 | "version": "3.9.13" 1274 | }, 1275 | "vscode": { 1276 | "interpreter": { 1277 | "hash": "b667cebad148e7b094a58ee81f940c685de1dd70a003a9ccdca4a5792431bee5" 1278 | } 1279 | } 1280 | }, 1281 | "nbformat": 4, 1282 | "nbformat_minor": 0 1283 | } --------------------------------------------------------------------------------