├── .github
│   └── ISSUE_TEMPLATE
│       ├── bug_report.md
│       ├── clarification-request.md
│       └── feature-or-improvement-request.md
├── .gitignore
├── 01_the_machine_learning_landscape.ipynb
├── 02_end_to_end_machine_learning_project.ipynb
├── 03_classification.ipynb
├── 04_training_linear_models.ipynb
├── 05_support_vector_machines.ipynb
├── 06_decision_trees.ipynb
├── 07_ensemble_learning_and_random_forests.ipynb
├── 08_dimensionality_reduction.ipynb
├── 09_unsupervised_learning.ipynb
├── 10_neural_nets_with_keras.ipynb
├── 11_training_deep_neural_networks.ipynb
├── 12_custom_models_and_training_with_tensorflow.ipynb
├── 13_loading_and_preprocessing_data.ipynb
├── 14_deep_computer_vision_with_cnns.ipynb
├── 15_processing_sequences_using_rnns_and_cnns.ipynb
├── 16_nlp_with_rnns_and_attention.ipynb
├── 17_autoencoders_and_gans.ipynb
├── 18_reinforcement_learning.ipynb
├── 19_training_and_deploying_at_scale.ipynb
├── INSTALL.md
├── LICENSE
├── README.md
├── apt.txt
├── book_equations.pdf
├── changes_in_2nd_edition.md
├── datasets
│   ├── housing
│   │   ├── README.md
│   │   ├── housing.csv
│   │   └── housing.tgz
│   ├── inception
│   │   └── imagenet_class_names.txt
│   ├── jsb_chorales
│   │   ├── README.md
│   │   └── jsb_chorales.tgz
│   ├── lifesat
│   │   ├── README.md
│   │   ├── gdp_per_capita.csv
│   │   └── oecd_bli_2015.csv
│   └── titanic
│       ├── test.csv
│       └── train.csv
├── docker
│   ├── .env
│   ├── Dockerfile
│   ├── Dockerfile.gpu
│   ├── Makefile
│   ├── README.md
│   ├── bashrc.bash
│   ├── bin
│   │   ├── nbclean_checkpoints
│   │   ├── nbdiff_checkpoint
│   │   ├── rm_empty_subdirs
│   │   └── tensorboard
│   ├── docker-compose.yml
│   └── jupyter_notebook_config.py
├── environment.yml
├── extra_autodiff.ipynb
├── extra_gradient_descent_comparison.ipynb
├── images
│   ├── ann
│   │   └── README
│   ├── autoencoders
│   │   └── README
│   ├── classification
│   │   └── README
│   ├── cnn
│   │   ├── README
│   │   └── test_image.png
│   ├── decision_trees
│   │   └── README
│   ├── deep
│   │   └── README
│   ├── distributed
│   │   └── README
│   ├── end_to_end_project
│   │   ├── README
│   │   └── california.png
│   ├── ensembles
│   │   └── README
│   ├── fundamentals
│   │   └── README
│   ├── nlp
│   │   └── README
│   ├── rl
│   │   ├── README
│   │   └── breakout.gif
│   ├── rnn
│   │   └── README
│   ├── svm
│   │   └── README
│   ├── tensorflow
│   │   └── README
│   ├── training_linear_models
│   │   └── README
│   └── unsupervised_learning
│       ├── README
│       └── ladybug.png
├── index.ipynb
├── math_differential_calculus.ipynb
├── math_linear_algebra.ipynb
├── ml-project-checklist.md
├── requirements.txt
├── tools_matplotlib.ipynb
├── tools_numpy.ipynb
└── tools_pandas.ipynb
/.github/ISSUE_TEMPLATE/bug_report.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Bug report 3 | about: Create a report to help us improve 4 | title: "[BUG]" 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | Thanks for helping us improve this project! 11 | 12 | **Before you create this issue** 13 | Please make sure you are using the latest updated code and libraries: see https://github.com/ageron/handson-ml2/blob/master/INSTALL.md#update-this-project-and-its-libraries 14 | 15 | Also please make sure to read the FAQ (https://github.com/ageron/handson-ml2#faq) and search for existing issues (both open and closed), as your question may already have been answered: https://github.com/ageron/handson-ml2/issues 16 | 17 | **Describe the bug** 18 | Please provide a clear and concise description of what the bug is, and specify the notebook name and the cell number at which the problem occurs (or the chapter and page in the book). 19 | 20 | **To Reproduce** 21 | Please copy the code that fails here, using code blocks like this: 22 | 23 | ```python 24 | def inverse(x): 25 | return 1 / x 26 | 27 | result = inverse(0) 28 | ``` 29 | 30 | And if you get an exception, please copy the full stacktrace here: 31 | 32 | ```stacktrace 33 | Traceback (most recent call last): 34 | File "<stdin>", line 1, in <module> 35 | File "<stdin>", line 2, in inverse 36 | ZeroDivisionError: division by zero 37 | ``` 38 | 39 | **Expected behavior** 40 | A clear and concise description of what you expected to happen.
41 | 42 | **Screenshots** 43 | If applicable, add screenshots to help explain your problem. 44 | 45 | **Versions (please complete the following information):** 46 | - OS: [e.g. MacOSX 10.15.7] 47 | - Python: [e.g. 3.7] 48 | - TensorFlow: [e.g., 2.4.1] 49 | - Scikit-Learn: [e.g., 0.24.1] 50 | - Other libraries that may be connected with the issue: [e.g., gym 0.18.0] 51 | 52 | **Additional context** 53 | Add any other context about the problem here. 54 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/clarification-request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Clarification request 3 | about: Not a bug, but something is not clear 4 | title: "[QUESTION]" 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | Thanks for helping us improve this project! 11 | 12 | **Before you create this issue** 13 | Please make sure you are using the latest updated code and libraries: see https://github.com/ageron/handson-ml2/blob/master/INSTALL.md#update-this-project-and-its-libraries 14 | 15 | Also please make sure to read the FAQ (https://github.com/ageron/handson-ml2#faq) and search for existing issues (both open and closed), as your question may already have been answered: https://github.com/ageron/handson-ml2/issues 16 | 17 | **Describe what is unclear to you** 18 | Please provide a clear and concise description of what the problem is, and specify the notebook name and the cell number at which the problem occurs (or the chapter and page in the book). 
19 | 20 | **To Reproduce** 21 | If the question relates to a specific piece of code, please copy the code that fails here, using code blocks like this: 22 | 23 | ```python 24 | def inverse(x): 25 | return 1 / x 26 | 27 | result = inverse(0) 28 | ``` 29 | 30 | And if you get an exception, please copy the full stacktrace here: 31 | 32 | ```stacktrace 33 | Traceback (most recent call last): 34 | File "<stdin>", line 1, in <module> 35 | File "<stdin>", line 2, in inverse 36 | ZeroDivisionError: division by zero 37 | ``` 38 | 39 | **Expected behavior** 40 | If applicable, a clear and concise description of what you expected to happen. 41 | 42 | **Screenshots** 43 | If applicable, add screenshots to help explain your problem. 44 | 45 | **Versions (please complete the following information):** 46 | - OS: [e.g. MacOSX 10.15.7] 47 | - Python: [e.g. 3.7] 48 | - TensorFlow: [e.g., 2.4.1] 49 | - Scikit-Learn: [e.g., 0.24.1] 50 | - Other libraries that may be connected with the issue: [e.g., gym 0.18.0] 51 | 52 | **Additional context** 53 | Add any other context about the problem here. 54 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/feature-or-improvement-request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Feature or Improvement request 3 | about: Suggest an idea for this project 4 | title: "[IDEA]" 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | Thanks for helping us improve this project!
11 | 12 | **Before you create this issue** 13 | Please make sure you are using the latest updated code and libraries: see https://github.com/ageron/handson-ml2/blob/master/INSTALL.md#update-this-project-and-its-libraries 14 | 15 | Also please make sure to read the FAQ (https://github.com/ageron/handson-ml2#faq) and search for existing issues (both open and closed), as your question may already have been answered: https://github.com/ageron/handson-ml2/issues 16 | 17 | **Is your feature request related to a problem? Please describe.** 18 | Please indicate the notebook name and cell number where the problem occurs (or the chapter and page number in the book), and provide a clear and concise description of what the problem is. Ex. In chapter 1, cells 200-220, I think the code could be clearer [...] 19 | 20 | **Describe the solution you'd like** 21 | A clear and concise description of what you want to happen. 22 | 23 | **Describe alternatives you've considered** 24 | A clear and concise description of any alternative solutions or features you've considered. 25 | 26 | **Additional context** 27 | Add any other context or screenshots about the feature request here. 
28 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | *.bak 2 | *.bak.* 3 | *.ckpt 4 | *.old 5 | *.pyc 6 | .DS_Store 7 | .ipynb_checkpoints 8 | .vscode/ 9 | checkpoint 10 | logs/* 11 | tf_logs/* 12 | images/**/*.png 13 | images/**/*.dot 14 | my_* 15 | person.proto 16 | person.desc 17 | person_pb2.py 18 | datasets/flowers 19 | datasets/lifesat/lifesat.csv 20 | datasets/spam 21 | datasets/words 22 | datasets/jsb_chorales 23 | 24 | -------------------------------------------------------------------------------- /INSTALL.md: -------------------------------------------------------------------------------- 1 | # Installation 2 | 3 | ## Download this repository 4 | To install this repository and run the Jupyter notebooks on your machine, you will first need git, which you may already have. Open a terminal and type `git` to check. If you do not have git, you can download it from [git-scm.com](https://git-scm.com/). 5 | 6 | Next, clone this repository by opening a terminal and typing the following commands (do not type the first `$` on each line, it's just a convention to show that this is a terminal prompt, not something else like Python code): 7 | 8 | $ cd $HOME # or any other development directory you prefer 9 | $ git clone https://github.com/ageron/handson-ml2.git 10 | $ cd handson-ml2 11 | 12 | If you do not want to install git, you can instead download [master.zip](https://github.com/ageron/handson-ml2/archive/master.zip), unzip it, rename the resulting directory to `handson-ml2` and move it to your development directory. 13 | 14 | ## Install Anaconda 15 | Next, you will need Python 3 and a bunch of Python libraries. The simplest way to install these is to [download and install Anaconda](https://www.anaconda.com/distribution/), which is a great cross-platform Python distribution for scientific computing. 
It comes bundled with many scientific libraries, including NumPy, Pandas, Matplotlib, Scikit-Learn and much more, so it's quite a large installation. If you prefer a lighter weight Anaconda distribution, you can [install Miniconda](https://docs.conda.io/en/latest/miniconda.html), which contains the bare minimum to run the `conda` packaging tool. You should install the latest version of Anaconda (or Miniconda) available. 16 | 17 | During the installation on MacOSX and Linux, you will be asked whether to initialize Anaconda by running `conda init`: you should accept, as it will update your shell script to ensure that `conda` is available whenever you open a terminal. After the installation, you must close your terminal and open a new one for the changes to take effect. 18 | 19 | During the installation on Windows, you will be asked whether you want the installer to update the `PATH` environment variable. This is not recommended as it may interfere with other software. Instead, after the installation you should open the Start Menu and launch an Anaconda Shell whenever you want to use Anaconda. 20 | 21 | Once Anaconda (or Miniconda) is installed, run the following command to update the `conda` packaging tool to the latest version: 22 | 23 | $ conda update -n base -c defaults conda 24 | 25 | > **Note**: if you don't like Anaconda for some reason, then you can install Python 3 and use pip to install the required libraries manually (this is not recommended, unless you really know what you are doing). I recommend using Python 3.8, since some libs don't support Python 3.9 or 3.10 yet. 26 | 27 | 28 | ## Install the GPU Driver and Libraries 29 | If you have a TensorFlow-compatible GPU card (NVidia card with Compute Capability ≥ 3.5), and you want TensorFlow to use it, then you should download the latest driver for your card from [nvidia.com](https://www.nvidia.com/Download/index.aspx?lang=en-us) and install it. 
You will also need NVidia's CUDA and cuDNN libraries, but the good news is that they will be installed automatically when you install the tensorflow-gpu package from Anaconda. However, if you don't use Anaconda, you will have to install them manually. If you hit any roadblock, see TensorFlow's [GPU installation instructions](https://tensorflow.org/install/gpu) for more details. 30 | 31 | ## Create the `tf2` Environment 32 | Next, make sure you're in the `handson-ml2` directory and run the following command. It will create a new `conda` environment containing every library you will need to run all the notebooks (by default, the environment will be named `tf2`, but you can choose another name using the `-n` option): 33 | 34 | $ conda env create -f environment.yml 35 | 36 | Next, activate the new environment: 37 | 38 | $ conda activate tf2 39 | 40 | 41 | ## Start Jupyter 42 | You're almost there! You just need to register the `tf2` conda environment to Jupyter. The notebooks in this project will default to the environment named `python3`, so it's best to register this environment using the name `python3` (if you prefer to use another name, you will have to select it in the "Kernel > Change kernel..." menu in Jupyter every time you open a notebook): 43 | 44 | $ python3 -m ipykernel install --user --name=python3 45 | 46 | And that's it! You can now start Jupyter like this: 47 | 48 | $ jupyter notebook 49 | 50 | This should open up your browser, and you should see Jupyter's tree view, with the contents of the current directory. If your browser does not open automatically, visit [localhost:8888](http://localhost:8888/tree). Click on `index.ipynb` to get started. 51 | 52 | Congrats! You are ready to learn Machine Learning, hands on! 53 | 54 | When you're done with Jupyter, you can close it by typing Ctrl-C in the Terminal window where you started it. 
Every time you want to work on this project, you will need to open a Terminal, and run: 55 | 56 | $ cd $HOME # or whatever development directory you chose earlier 57 | $ cd handson-ml2 58 | $ conda activate tf2 59 | $ jupyter notebook 60 | 61 | ## Update This Project and its Libraries 62 | I regularly update the notebooks to fix issues and add support for new libraries. So make sure you update this project regularly. 63 | 64 | For this, open a terminal, and run: 65 | 66 | $ cd $HOME # or whatever development directory you chose earlier 67 | $ cd handson-ml2 # go to this project's directory 68 | $ git pull 69 | 70 | If you get an error, it's probably because you modified a notebook. In this case, before running `git pull` you will first need to commit your changes. I recommend doing this in your own branch, or else you may get conflicts: 71 | 72 | $ git checkout -b my_branch # you can use another branch name if you want 73 | $ git add -u 74 | $ git commit -m "describe your changes here" 75 | $ git checkout master 76 | $ git pull 77 | 78 | Next, let's update the libraries. First, let's update `conda` itself: 79 | 80 | $ conda update -c defaults -n base conda 81 | 82 | Then we'll delete this project's `tf2` environment: 83 | 84 | $ conda activate base 85 | $ conda env remove -n tf2 86 | 87 | And recreate the environment: 88 | 89 | $ conda env create -f environment.yml 90 | 91 | Lastly, we reactivate the environment and start Jupyter: 92 | 93 | $ conda activate tf2 94 | $ jupyter notebook 95 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 
8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. 
For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. 
Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. 
In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | 179 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | Machine Learning Notebooks 2 | ========================== 3 | 4 | # ⚠ The 3rd edition of my book will be released in October 2022. The notebooks are available at [ageron/handson-ml3](https://github.com/ageron/handson-ml3) and contain more up-to-date code. 
5 | 6 | This project aims at teaching you the fundamentals of Machine Learning in 7 | Python. It contains the example code and solutions to the exercises in the second edition of my O'Reilly book [Hands-on Machine Learning with Scikit-Learn, Keras and TensorFlow](https://www.oreilly.com/library/view/hands-on-machine-learning/9781492032632/): 8 | 9 | 10 | 11 | **Note**: If you are looking for the first edition notebooks, check out [ageron/handson-ml](https://github.com/ageron/handson-ml). For the third edition, check out [ageron/handson-ml3](https://github.com/ageron/handson-ml3). 12 | 13 | ## Quick Start 14 | 15 | ### Want to play with these notebooks online without having to install anything? 16 | Use any of the following services (I recommend Colab or Kaggle, since they offer free GPUs and TPUs). 17 | 18 | **WARNING**: _Please be aware that these services provide temporary environments: anything you do will be deleted after a while, so make sure you download any data you care about._ 19 | 20 | * Open In Colab 21 | 22 | * Open in Kaggle 23 | 24 | * Launch binder 25 | 26 | * Launch in Deepnote 27 | 28 | ### Just want to quickly look at some notebooks, without executing any code? 29 | 30 | * Render nbviewer 31 | 32 | * [github.com's notebook viewer](https://github.com/ageron/handson-ml2/blob/master/index.ipynb) also works but it's not ideal: it's slower, the math equations are not always displayed correctly, and large notebooks often fail to open. 33 | 34 | ### Want to run this project using a Docker image? 35 | Read the [Docker instructions](https://github.com/ageron/handson-ml2/tree/master/docker). 36 | 37 | ### Want to install this project on your own machine?
38 | 39 | Start by installing [Anaconda](https://www.anaconda.com/distribution/) (or [Miniconda](https://docs.conda.io/en/latest/miniconda.html)), [git](https://git-scm.com/downloads), and if you have a TensorFlow-compatible GPU, install the [GPU driver](https://www.nvidia.com/Download/index.aspx), as well as the appropriate version of CUDA and cuDNN (see TensorFlow's documentation for more details). 40 | 41 | Next, clone this project by opening a terminal and typing the following commands (do not type the first `$` signs on each line, they just indicate that these are terminal commands): 42 | 43 | $ git clone https://github.com/ageron/handson-ml2.git 44 | $ cd handson-ml2 45 | 46 | Next, run the following commands: 47 | 48 | $ conda env create -f environment.yml 49 | $ conda activate tf2 50 | $ python -m ipykernel install --user --name=python3 51 | 52 | Finally, start Jupyter: 53 | 54 | $ jupyter notebook 55 | 56 | If you need further instructions, read the [detailed installation instructions](INSTALL.md). 57 | 58 | # FAQ 59 | 60 | **Which Python version should I use?** 61 | 62 | I recommend Python 3.8. If you follow the installation instructions above, that's the version you will get. Most code will work with other versions of Python 3, but some libraries do not support Python 3.9 or 3.10 yet, which is why I recommend Python 3.8. 63 | 64 | **I'm getting an error when I call `load_housing_data()`** 65 | 66 | Make sure you call `fetch_housing_data()` *before* you call `load_housing_data()`. If you're getting an HTTP error, make sure you're running the exact same code as in the notebook (copy/paste it if needed). If the problem persists, please check your network configuration. 67 | 68 | **I'm getting an SSL error on MacOSX** 69 | 70 | You probably need to install the SSL certificates (see this [StackOverflow question](https://stackoverflow.com/questions/27835619/urllib-and-ssl-certificate-verify-failed-error)). 
If you downloaded Python from the official website, then run `/Applications/Python\ 3.8/Install\ Certificates.command` in a terminal (change `3.8` to whatever version you installed). If you installed Python using MacPorts, run `sudo port install curl-ca-bundle` in a terminal. 71 | 72 | **I've installed this project locally. How do I update it to the latest version?** 73 | 74 | See [INSTALL.md](INSTALL.md) 75 | 76 | **How do I update my Python libraries to the latest versions, when using Anaconda?** 77 | 78 | See [INSTALL.md](INSTALL.md) 79 | 80 | ## Contributors 81 | I would like to thank everyone [who contributed to this project](https://github.com/ageron/handson-ml2/graphs/contributors), either by providing useful feedback, filing issues or submitting Pull Requests. Special thanks go to Haesun Park and Ian Beauregard who reviewed every notebook and submitted many PRs, including help on some of the exercise solutions. Thanks as well to Steven Bunkley and Ziembla who created the `docker` directory, and to github user SuperYorio who helped on some exercise solutions. 
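The `load_housing_data()` FAQ entry above is about call order: chapter 2's notebook defines a download step and a loading step, and the first must run before the second. Here is a minimal sketch of that pattern — the function bodies and the URL below are reconstructed for illustration, and the authoritative versions live in `02_end_to_end_machine_learning_project.ipynb`:

```python
import os
import tarfile
import urllib.request

import pandas as pd

# Illustrative paths/URL mirroring the chapter 2 notebook (the repository
# does ship datasets/housing/housing.tgz), reproduced here as an assumption.
DOWNLOAD_ROOT = "https://raw.githubusercontent.com/ageron/handson-ml2/master/"
HOUSING_PATH = os.path.join("datasets", "housing")
HOUSING_URL = DOWNLOAD_ROOT + "datasets/housing/housing.tgz"

def fetch_housing_data(housing_url=HOUSING_URL, housing_path=HOUSING_PATH):
    """Download housing.tgz and extract housing.csv into datasets/housing."""
    os.makedirs(housing_path, exist_ok=True)
    tgz_path = os.path.join(housing_path, "housing.tgz")
    urllib.request.urlretrieve(housing_url, tgz_path)
    with tarfile.open(tgz_path) as housing_tgz:
        housing_tgz.extractall(path=housing_path)

def load_housing_data(housing_path=HOUSING_PATH):
    """Read the extracted CSV; fails if fetch_housing_data() never ran."""
    return pd.read_csv(os.path.join(housing_path, "housing.csv"))

# Correct order — fetch first, then load:
#   fetch_housing_data()
#   housing = load_housing_data()
```

Calling `load_housing_data()` before fetching raises a `FileNotFoundError`, which is exactly the situation the FAQ entry describes.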
82 | -------------------------------------------------------------------------------- /apt.txt: -------------------------------------------------------------------------------- 1 | build-essential 2 | cmake 3 | ffmpeg 4 | git 5 | libboost-all-dev 6 | libjpeg-dev 7 | libpq-dev 8 | libsdl2-dev 9 | sudo 10 | swig 11 | unzip 12 | xorg-dev 13 | xvfb 14 | zip 15 | zlib1g-dev 16 | -------------------------------------------------------------------------------- /book_equations.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ageron/handson-ml2/8958d538bdcdf29d329d9950bfc79034c29db724/book_equations.pdf -------------------------------------------------------------------------------- /changes_in_2nd_edition.md: -------------------------------------------------------------------------------- 1 | # Changes in the Second Edition 2 | The second edition has six main objectives: 3 | 4 | 1. Cover additional ML topics: additional unsupervised learning techniques (including clustering, anomaly detection, density estimation and mixture models), additional techniques for training deep nets (including self-normalized networks), additional computer vision techniques (including Xception, SENet, object detection with YOLO, and semantic segmentation using R-CNN), handling sequences using CNNs (including WaveNet), natural language processing using RNNs, CNNs and Transformers, generative adversarial networks 5 | 2. Cover additional libraries and APIs: Keras, the Data API, TF-Agents for Reinforcement Learning, training and deploying TF models at scale using the Distribution Strategies API, TF-Serving and Google Cloud AI Platform. Also briefly introduce TF Transform, TFLite, TF Addons/Seq2Seq, TensorFlow.js and more. 6 | 3. Mention some of the latest important results from Deep Learning research. 7 | 4. 
Migrate all TensorFlow chapters to TensorFlow 2, and use TensorFlow's implementation of the Keras API (called tf.keras) whenever possible, to simplify the code examples. 8 | 5. Update the code examples to use the latest version of Scikit-Learn, NumPy, Pandas, Matplotlib and other libraries. 9 | 6. Clarify some sections and fix some errors, thanks to plenty of great feedback from readers. 10 | 11 | Some chapters were added, others were rewritten and a few were reordered. The following table shows the mapping between the 1st edition chapters and the 2nd edition chapters: 12 | 13 | |1st ed. chapter | 2nd ed. chapter | %Changes | 2nd ed. title 14 | |--|--|--|--| 15 | |1|1|<10%|The Machine Learning Landscape 16 | |2|2|<10%|End-to-End Machine Learning Project 17 | |3|3|<10%|Classification 18 | |4|4|<10%|Training Models 19 | |5|5|<10%|Support Vector Machines 20 | |6|6|<10%|Decision Trees 21 | |7|7|<10%|Ensemble Learning and Random Forests 22 | |8|8|<10%|Dimensionality Reduction 23 | |N/A|9|100% new|Unsupervised Learning Techniques 24 | |10|10|~75%|Introduction to Artificial Neural Networks with Keras 25 | |11|11|~50%|Training Deep Neural Networks 26 | |9|12|100% rewritten|Custom Models and Training with TensorFlow 27 | |Part of 12|13|100% rewritten|Loading and Preprocessing Data with TensorFlow 28 | |13|14|~50%|Deep Computer Vision Using Convolutional Neural Networks 29 | |Part of 14|15|~75%|Processing Sequences Using RNNs and CNNs 30 | |Part of 14|16|~90%|Natural Language Processing with RNNs and Attention 31 | |15|17|~75%|Autoencoders and GANs 32 | |16|18|~75%|Reinforcement Learning 33 | |Part of 12|19|~75% new|Productionizing TensorFlow Models 34 | 35 | More specifically, here are the main changes for the 2nd edition (other than clarifications, corrections and code updates): 36 | 37 | * Chapter 1 – The Machine Learning Landscape 38 | * Added more examples of ML applications and the corresponding algorithms. 
39 | * Added a section on handling mismatch between the training set and the validation & test sets. 40 | * Chapter 2 – End-to-End Machine Learning Project 41 | * Added how to compute a confidence interval. 42 | * Improved the installation instructions (e.g., for Windows). 43 | * Introduced the upgraded `OneHotEncoder` and the new `ColumnTransformer`. 44 | * Added more details on deployment, monitoring and maintenance. 45 | * Chapter 4 – Training Models 46 | * Explained the need for training instances to be Independent and Identically Distributed (IID). 47 | * Chapter 7 – Ensemble Learning and Random Forests 48 | * Added a short section about XGBoost. 49 | * Chapter 9 – Unsupervised Learning Techniques (new chapter) 50 | * Clustering with K-Means, how to choose the number of clusters, how to use it for dimensionality reduction, semi-supervised learning, image segmentation, and more. 51 | * The DBSCAN clustering algorithm and an overview of other clustering algorithms available in Scikit-Learn. 52 | * Gaussian mixture models, the Expectation-Maximization (EM) algorithm, Bayesian variational inference, and how mixture models can be used for clustering, density estimation, anomaly detection and novelty detection. 53 | * Overview of other anomaly detection and novelty detection algorithms. 54 | * Chapter 10 – Introduction to Artificial Neural Networks with Keras (mostly new) 55 | * Added an introduction to the Keras API, including all its APIs (Sequential, Functional and Subclassing), persistence and callbacks (including the `TensorBoard` callback). 56 | * Chapter 11 – Training Deep Neural Networks (many changes) 57 | * Introduced self-normalizing nets, the SELU activation function and Alpha Dropout. 58 | * Introduced self-supervised learning. 59 | * Added Nadam optimization. 60 | * Added Monte-Carlo Dropout. 61 | * Added a note about the risks of adaptive optimization methods. 62 | * Updated the practical guidelines. 
63 | * Chapter 12 – Custom Models and Training with TensorFlow (completely rewritten) 64 | * A tour of TensorFlow 2. 65 | * TensorFlow's lower-level Python API. 66 | * Writing custom loss functions, metrics, layers, models. 67 | * Using auto-differentiation and creating custom training algorithms. 68 | * TensorFlow Functions and graphs (including tracing and autograph). 69 | * Chapter 13 – Loading and Preprocessing Data with TensorFlow (new chapter) 70 | * The Data API. 71 | * Loading/storing data efficiently using TFRecords. 72 | * Writing custom preprocessing layers, using Keras preprocessing layers, encoding categorical features and text, using one-hot vectors, bag-of-words, TF-IDF or embeddings. 73 | * An overview of TF Transform and TF Datasets. 74 | * Moved the low-level implementation of the neural network to the exercises. 75 | * Removed details about queues and readers that are now superseded by the Data API. 76 | * Chapter 14 – Deep Computer Vision Using Convolutional Neural Networks 77 | * Added Xception and SENet architectures. 78 | * Added a Keras implementation of ResNet-34. 79 | * Showed how to use pretrained models using Keras. 80 | * Added an end-to-end transfer learning example. 81 | * Added classification and localization. 82 | * Introduced Fully Convolutional Networks (FCNs). 83 | * Introduced object detection using the YOLO architecture. 84 | * Introduced semantic segmentation using R-CNN. 85 | * Chapter 15 – Processing Sequences Using RNNs and CNNs 86 | * Added an introduction to WaveNet. 87 | * Moved the Encoder–Decoder architecture and Bidirectional RNNs to Chapter 16. 88 | * Chapter 16 – Natural Language Processing with RNNs and Attention (new chapter) 89 | * Explained how to use the Data API to handle sequential data. 90 | * Showed an end-to-end example of text generation using a Character RNN, using both a stateless and a stateful RNN. 91 | * Showed an end-to-end example of sentiment analysis using an LSTM. 
92 | * Explained masking in Keras. 93 | * Showed how to reuse pretrained embeddings using TF Hub. 94 | * Showed how to build an Encoder–Decoder for Neural Machine Translation using TensorFlow Addons/seq2seq. 95 | * Introduced beam search. 96 | * Explained attention mechanisms. 97 | * Added a short overview of visual attention and a note on explainability. 98 | * Introduced the fully attention-based Transformer architecture, including positional embeddings and multi-head attention. 99 | * Added an overview of recent language models (2018). 100 | * Chapter 17 – Representation Learning and Generative Learning Using Autoencoders and GANs 101 | * Added convolutional autoencoders and recurrent autoencoders. 102 | * Added Generative Adversarial Networks (GANs), including basic GANs, deep convolutional GANs (DCGANs), Progressively Growing GANs (ProGANs) and StyleGANs. 103 | * Chapter 18 – Reinforcement Learning 104 | * Added Double DQN, Dueling DQN and Prioritized Experience Replay. 105 | * Introduced TF Agents. 106 | * Chapter 19 – Training and Deploying TensorFlow Models at Scale (mostly new chapter) 107 | * Serving a TensorFlow model using TF Serving and Google Cloud AI Platform. 108 | * Deploying a Model to a Mobile or Embedded Device using TFLite. 109 | * Using GPUs to Speed Up Computations. 110 | * Training models across multiple devices using the Distribution Strategies API. 111 | 112 | ## Migrating from TF 1 to TF 2 113 | Migrating from TensorFlow 1.x to 2.0 is similar to migrating from Python 2 to 3. The first thing to do is... breathe. Don't rush: TensorFlow 1.x will be around for a while, so you have time. 114 | 115 | * You should start by upgrading to the latest TensorFlow 1.x version (it will probably be 1.15 by the time you read this). 116 | * Migrate as much of your code as possible to using high-level APIs, either tf.keras or the Estimators API. 
The Estimators API will still work in TF 2.0, but you should prefer Keras from now on: the TF team has announced that Keras is the preferred API, so it is likely to receive most of the improvement effort. Also prefer Keras preprocessing layers (see chapter 13) over `tf.feature_columns`. 117 | * If your code uses only high-level APIs, then it will be easy to migrate, as it should work the same way in the latest TF 1.x release and in TF 2.0. 118 | * Get rid of any usage of `tf.contrib`, as it won't exist in TF 2.0. Some parts of it were moved to the core API, others were moved to separate projects, and others were no longer maintained and were simply deleted. If needed, install the appropriate libraries, or copy some legacy `tf.contrib` code into your own project (as a last resort). 119 | * Write as many tests as you can; it will make the migration easier and safer. 120 | * You can run TF 1.x code in TF 2.0 by starting your program with `import tensorflow.compat.v1 as tf` and `tf.disable_v2_behavior()`. 121 | * Once you are ready to make the jump, you can run the `tf_upgrade_v2` [upgrade script](https://www.tensorflow.org/beta/guide/upgrade). 122 | 123 | For more details on migration, check out TensorFlow's [Migration Guide](https://www.tensorflow.org/beta/guide/migration_guide). 124 | -------------------------------------------------------------------------------- /datasets/housing/README.md: -------------------------------------------------------------------------------- 1 | # California Housing 2 | 3 | ## Source 4 | This dataset is a modified version of the California Housing dataset available from [Luís Torgo's page](http://www.dcc.fc.up.pt/~ltorgo/Regression/cal_housing.html) (University of Porto). Luís Torgo obtained it from the StatLib repository (which is closed now). The dataset may also be downloaded from StatLib mirrors. 5 | 6 | This dataset appeared in a 1997 paper titled *Sparse Spatial Autoregressions* by Pace, R. 
Kelley and Ronald Barry, published in the *Statistics and Probability Letters* journal. They built it using the 1990 California census data. It contains one row per census block group. A block group is the smallest geographical unit for which the U.S. Census Bureau publishes sample data (a block group typically has a population of 600 to 3,000 people). 7 | 8 | ## Tweaks 9 | The dataset in this directory is almost identical to the original, with two differences: 10 | 11 | * 207 values were randomly removed from the `total_bedrooms` column, so we can discuss what to do with missing data. 12 | * An additional categorical attribute called `ocean_proximity` was added, indicating (very roughly) whether each block group is near the ocean, near the Bay area, inland or on an island. This allows discussing what to do with categorical data. 13 | 14 | Note that the block groups are called "districts" in the Jupyter notebooks, simply because in some contexts the name "block group" was confusing. 15 | 16 | ## Data description 17 | 18 | >>> housing.info() 19 | 20 | RangeIndex: 20640 entries, 0 to 20639 21 | Data columns (total 10 columns): 22 | longitude 20640 non-null float64 23 | latitude 20640 non-null float64 24 | housing_median_age 20640 non-null float64 25 | total_rooms 20640 non-null float64 26 | total_bedrooms 20433 non-null float64 27 | population 20640 non-null float64 28 | households 20640 non-null float64 29 | median_income 20640 non-null float64 30 | median_house_value 20640 non-null float64 31 | ocean_proximity 20640 non-null object 32 | dtypes: float64(9), object(1) 33 | memory usage: 1.6+ MB 34 | 35 | >>> housing["ocean_proximity"].value_counts() 36 | <1H OCEAN 9136 37 | INLAND 6551 38 | NEAR OCEAN 2658 39 | NEAR BAY 2290 40 | ISLAND 5 41 | Name: ocean_proximity, dtype: int64 42 | 43 | >>> housing.describe() 44 | longitude latitude housing_median_age total_rooms \ 45 | count 16513.000000 16513.000000 16513.000000 16513.000000 46 | mean -119.575972 35.639693 
28.652335 2622.347605 47 | std 2.002048 2.138279 12.576306 2138.559393 48 | min -124.350000 32.540000 1.000000 6.000000 49 | 25% -121.800000 33.940000 18.000000 1442.000000 50 | 50% -118.510000 34.260000 29.000000 2119.000000 51 | 75% -118.010000 37.720000 37.000000 3141.000000 52 | max -114.310000 41.950000 52.000000 39320.000000 53 | 54 | total_bedrooms population households median_income 55 | count 16355.000000 16513.000000 16513.000000 16513.000000 56 | mean 534.885112 1419.525465 496.975050 3.875651 57 | std 412.716467 1115.715084 375.737945 1.905088 58 | min 2.000000 3.000000 2.000000 0.499900 59 | 25% 295.000000 784.000000 278.000000 2.566800 60 | 50% 433.000000 1164.000000 408.000000 3.541400 61 | 75% 644.000000 1718.000000 602.000000 4.745000 62 | max 6210.000000 35682.000000 5358.000000 15.000100 63 | 64 | -------------------------------------------------------------------------------- /datasets/housing/housing.tgz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ageron/handson-ml2/8958d538bdcdf29d329d9950bfc79034c29db724/datasets/housing/housing.tgz -------------------------------------------------------------------------------- /datasets/inception/imagenet_class_names.txt: -------------------------------------------------------------------------------- 1 | n01440764 tench, Tinca tinca 2 | n01443537 goldfish, Carassius auratus 3 | n01484850 great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias 4 | n01491361 tiger shark, Galeocerdo cuvieri 5 | n01494475 hammerhead, hammerhead shark 6 | n01496331 electric ray, crampfish, numbfish, torpedo 7 | n01498041 stingray 8 | n01514668 cock 9 | n01514859 hen 10 | n01518878 ostrich, Struthio camelus 11 | n01530575 brambling, Fringilla montifringilla 12 | n01531178 goldfinch, Carduelis carduelis 13 | n01532829 house finch, linnet, Carpodacus mexicanus 14 | n01534433 junco, snowbird 15 | n01537544 indigo bunting, indigo 
finch, indigo bird, Passerina cyanea 16 | n01558993 robin, American robin, Turdus migratorius 17 | n01560419 bulbul 18 | n01580077 jay 19 | n01582220 magpie 20 | n01592084 chickadee 21 | n01601694 water ouzel, dipper 22 | n01608432 kite 23 | n01614925 bald eagle, American eagle, Haliaeetus leucocephalus 24 | n01616318 vulture 25 | n01622779 great grey owl, great gray owl, Strix nebulosa 26 | n01629819 European fire salamander, Salamandra salamandra 27 | n01630670 common newt, Triturus vulgaris 28 | n01631663 eft 29 | n01632458 spotted salamander, Ambystoma maculatum 30 | n01632777 axolotl, mud puppy, Ambystoma mexicanum 31 | n01641577 bullfrog, Rana catesbeiana 32 | n01644373 tree frog, tree-frog 33 | n01644900 tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui 34 | n01664065 loggerhead, loggerhead turtle, Caretta caretta 35 | n01665541 leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea 36 | n01667114 mud turtle 37 | n01667778 terrapin 38 | n01669191 box turtle, box tortoise 39 | n01675722 banded gecko 40 | n01677366 common iguana, iguana, Iguana iguana 41 | n01682714 American chameleon, anole, Anolis carolinensis 42 | n01685808 whiptail, whiptail lizard 43 | n01687978 agama 44 | n01688243 frilled lizard, Chlamydosaurus kingi 45 | n01689811 alligator lizard 46 | n01692333 Gila monster, Heloderma suspectum 47 | n01693334 green lizard, Lacerta viridis 48 | n01694178 African chameleon, Chamaeleo chamaeleon 49 | n01695060 Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis 50 | n01697457 African crocodile, Nile crocodile, Crocodylus niloticus 51 | n01698640 American alligator, Alligator mississipiensis 52 | n01704323 triceratops 53 | n01728572 thunder snake, worm snake, Carphophis amoenus 54 | n01728920 ringneck snake, ring-necked snake, ring snake 55 | n01729322 hognose snake, puff adder, sand viper 56 | n01729977 green snake, grass snake 57 | n01734418 king snake, kingsnake 58 | n01735189 garter snake, grass 
snake 59 | n01737021 water snake 60 | n01739381 vine snake 61 | n01740131 night snake, Hypsiglena torquata 62 | n01742172 boa constrictor, Constrictor constrictor 63 | n01744401 rock python, rock snake, Python sebae 64 | n01748264 Indian cobra, Naja naja 65 | n01749939 green mamba 66 | n01751748 sea snake 67 | n01753488 horned viper, cerastes, sand viper, horned asp, Cerastes cornutus 68 | n01755581 diamondback, diamondback rattlesnake, Crotalus adamanteus 69 | n01756291 sidewinder, horned rattlesnake, Crotalus cerastes 70 | n01768244 trilobite 71 | n01770081 harvestman, daddy longlegs, Phalangium opilio 72 | n01770393 scorpion 73 | n01773157 black and gold garden spider, Argiope aurantia 74 | n01773549 barn spider, Araneus cavaticus 75 | n01773797 garden spider, Aranea diademata 76 | n01774384 black widow, Latrodectus mactans 77 | n01774750 tarantula 78 | n01775062 wolf spider, hunting spider 79 | n01776313 tick 80 | n01784675 centipede 81 | n01795545 black grouse 82 | n01796340 ptarmigan 83 | n01797886 ruffed grouse, partridge, Bonasa umbellus 84 | n01798484 prairie chicken, prairie grouse, prairie fowl 85 | n01806143 peacock 86 | n01806567 quail 87 | n01807496 partridge 88 | n01817953 African grey, African gray, Psittacus erithacus 89 | n01818515 macaw 90 | n01819313 sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita 91 | n01820546 lorikeet 92 | n01824575 coucal 93 | n01828970 bee eater 94 | n01829413 hornbill 95 | n01833805 hummingbird 96 | n01843065 jacamar 97 | n01843383 toucan 98 | n01847000 drake 99 | n01855032 red-breasted merganser, Mergus serrator 100 | n01855672 goose 101 | n01860187 black swan, Cygnus atratus 102 | n01871265 tusker 103 | n01872401 echidna, spiny anteater, anteater 104 | n01873310 platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus 105 | n01877812 wallaby, brush kangaroo 106 | n01882714 koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus 107 | n01883070 wombat 108 | 
n01910747 jellyfish 109 | n01914609 sea anemone, anemone 110 | n01917289 brain coral 111 | n01924916 flatworm, platyhelminth 112 | n01930112 nematode, nematode worm, roundworm 113 | n01943899 conch 114 | n01944390 snail 115 | n01945685 slug 116 | n01950731 sea slug, nudibranch 117 | n01955084 chiton, coat-of-mail shell, sea cradle, polyplacophore 118 | n01968897 chambered nautilus, pearly nautilus, nautilus 119 | n01978287 Dungeness crab, Cancer magister 120 | n01978455 rock crab, Cancer irroratus 121 | n01980166 fiddler crab 122 | n01981276 king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica 123 | n01983481 American lobster, Northern lobster, Maine lobster, Homarus americanus 124 | n01984695 spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish 125 | n01985128 crayfish, crawfish, crawdad, crawdaddy 126 | n01986214 hermit crab 127 | n01990800 isopod 128 | n02002556 white stork, Ciconia ciconia 129 | n02002724 black stork, Ciconia nigra 130 | n02006656 spoonbill 131 | n02007558 flamingo 132 | n02009229 little blue heron, Egretta caerulea 133 | n02009912 American egret, great white heron, Egretta albus 134 | n02011460 bittern 135 | n02012849 crane 136 | n02013706 limpkin, Aramus pictus 137 | n02017213 European gallinule, Porphyrio porphyrio 138 | n02018207 American coot, marsh hen, mud hen, water hen, Fulica americana 139 | n02018795 bustard 140 | n02025239 ruddy turnstone, Arenaria interpres 141 | n02027492 red-backed sandpiper, dunlin, Erolia alpina 142 | n02028035 redshank, Tringa totanus 143 | n02033041 dowitcher 144 | n02037110 oystercatcher, oyster catcher 145 | n02051845 pelican 146 | n02056570 king penguin, Aptenodytes patagonica 147 | n02058221 albatross, mollymawk 148 | n02066245 grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus 149 | n02071294 killer whale, killer, orca, grampus, sea wolf, Orcinus orca 150 | n02074367 dugong, Dugong dugon 151 | n02077923 sea lion 152 | 
n02085620 Chihuahua 153 | n02085782 Japanese spaniel 154 | n02085936 Maltese dog, Maltese terrier, Maltese 155 | n02086079 Pekinese, Pekingese, Peke 156 | n02086240 Shih-Tzu 157 | n02086646 Blenheim spaniel 158 | n02086910 papillon 159 | n02087046 toy terrier 160 | n02087394 Rhodesian ridgeback 161 | n02088094 Afghan hound, Afghan 162 | n02088238 basset, basset hound 163 | n02088364 beagle 164 | n02088466 bloodhound, sleuthhound 165 | n02088632 bluetick 166 | n02089078 black-and-tan coonhound 167 | n02089867 Walker hound, Walker foxhound 168 | n02089973 English foxhound 169 | n02090379 redbone 170 | n02090622 borzoi, Russian wolfhound 171 | n02090721 Irish wolfhound 172 | n02091032 Italian greyhound 173 | n02091134 whippet 174 | n02091244 Ibizan hound, Ibizan Podenco 175 | n02091467 Norwegian elkhound, elkhound 176 | n02091635 otterhound, otter hound 177 | n02091831 Saluki, gazelle hound 178 | n02092002 Scottish deerhound, deerhound 179 | n02092339 Weimaraner 180 | n02093256 Staffordshire bullterrier, Staffordshire bull terrier 181 | n02093428 American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier 182 | n02093647 Bedlington terrier 183 | n02093754 Border terrier 184 | n02093859 Kerry blue terrier 185 | n02093991 Irish terrier 186 | n02094114 Norfolk terrier 187 | n02094258 Norwich terrier 188 | n02094433 Yorkshire terrier 189 | n02095314 wire-haired fox terrier 190 | n02095570 Lakeland terrier 191 | n02095889 Sealyham terrier, Sealyham 192 | n02096051 Airedale, Airedale terrier 193 | n02096177 cairn, cairn terrier 194 | n02096294 Australian terrier 195 | n02096437 Dandie Dinmont, Dandie Dinmont terrier 196 | n02096585 Boston bull, Boston terrier 197 | n02097047 miniature schnauzer 198 | n02097130 giant schnauzer 199 | n02097209 standard schnauzer 200 | n02097298 Scotch terrier, Scottish terrier, Scottie 201 | n02097474 Tibetan terrier, chrysanthemum dog 202 | n02097658 silky terrier, Sydney silky 203 | n02098105 
soft-coated wheaten terrier 204 | n02098286 West Highland white terrier 205 | n02098413 Lhasa, Lhasa apso 206 | n02099267 flat-coated retriever 207 | n02099429 curly-coated retriever 208 | n02099601 golden retriever 209 | n02099712 Labrador retriever 210 | n02099849 Chesapeake Bay retriever 211 | n02100236 German short-haired pointer 212 | n02100583 vizsla, Hungarian pointer 213 | n02100735 English setter 214 | n02100877 Irish setter, red setter 215 | n02101006 Gordon setter 216 | n02101388 Brittany spaniel 217 | n02101556 clumber, clumber spaniel 218 | n02102040 English springer, English springer spaniel 219 | n02102177 Welsh springer spaniel 220 | n02102318 cocker spaniel, English cocker spaniel, cocker 221 | n02102480 Sussex spaniel 222 | n02102973 Irish water spaniel 223 | n02104029 kuvasz 224 | n02104365 schipperke 225 | n02105056 groenendael 226 | n02105162 malinois 227 | n02105251 briard 228 | n02105412 kelpie 229 | n02105505 komondor 230 | n02105641 Old English sheepdog, bobtail 231 | n02105855 Shetland sheepdog, Shetland sheep dog, Shetland 232 | n02106030 collie 233 | n02106166 Border collie 234 | n02106382 Bouvier des Flandres, Bouviers des Flandres 235 | n02106550 Rottweiler 236 | n02106662 German shepherd, German shepherd dog, German police dog, alsatian 237 | n02107142 Doberman, Doberman pinscher 238 | n02107312 miniature pinscher 239 | n02107574 Greater Swiss Mountain dog 240 | n02107683 Bernese mountain dog 241 | n02107908 Appenzeller 242 | n02108000 EntleBucher 243 | n02108089 boxer 244 | n02108422 bull mastiff 245 | n02108551 Tibetan mastiff 246 | n02108915 French bulldog 247 | n02109047 Great Dane 248 | n02109525 Saint Bernard, St Bernard 249 | n02109961 Eskimo dog, husky 250 | n02110063 malamute, malemute, Alaskan malamute 251 | n02110185 Siberian husky 252 | n02110341 dalmatian, coach dog, carriage dog 253 | n02110627 affenpinscher, monkey pinscher, monkey dog 254 | n02110806 basenji 255 | n02110958 pug, pug-dog 256 | n02111129 Leonberg 257 | 
n02111277 Newfoundland, Newfoundland dog 258 | n02111500 Great Pyrenees 259 | n02111889 Samoyed, Samoyede 260 | n02112018 Pomeranian 261 | n02112137 chow, chow chow 262 | n02112350 keeshond 263 | n02112706 Brabancon griffon 264 | n02113023 Pembroke, Pembroke Welsh corgi 265 | n02113186 Cardigan, Cardigan Welsh corgi 266 | n02113624 toy poodle 267 | n02113712 miniature poodle 268 | n02113799 standard poodle 269 | n02113978 Mexican hairless 270 | n02114367 timber wolf, grey wolf, gray wolf, Canis lupus 271 | n02114548 white wolf, Arctic wolf, Canis lupus tundrarum 272 | n02114712 red wolf, maned wolf, Canis rufus, Canis niger 273 | n02114855 coyote, prairie wolf, brush wolf, Canis latrans 274 | n02115641 dingo, warrigal, warragal, Canis dingo 275 | n02115913 dhole, Cuon alpinus 276 | n02116738 African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus 277 | n02117135 hyena, hyaena 278 | n02119022 red fox, Vulpes vulpes 279 | n02119789 kit fox, Vulpes macrotis 280 | n02120079 Arctic fox, white fox, Alopex lagopus 281 | n02120505 grey fox, gray fox, Urocyon cinereoargenteus 282 | n02123045 tabby, tabby cat 283 | n02123159 tiger cat 284 | n02123394 Persian cat 285 | n02123597 Siamese cat, Siamese 286 | n02124075 Egyptian cat 287 | n02125311 cougar, puma, catamount, mountain lion, painter, panther, Felis concolor 288 | n02127052 lynx, catamount 289 | n02128385 leopard, Panthera pardus 290 | n02128757 snow leopard, ounce, Panthera uncia 291 | n02128925 jaguar, panther, Panthera onca, Felis onca 292 | n02129165 lion, king of beasts, Panthera leo 293 | n02129604 tiger, Panthera tigris 294 | n02130308 cheetah, chetah, Acinonyx jubatus 295 | n02132136 brown bear, bruin, Ursus arctos 296 | n02133161 American black bear, black bear, Ursus americanus, Euarctos americanus 297 | n02134084 ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus 298 | n02134418 sloth bear, Melursus ursinus, Ursus ursinus 299 | n02137549 mongoose 300 | n02138441 meerkat, mierkat 301 | 
n02165105 tiger beetle 302 | n02165456 ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle 303 | n02167151 ground beetle, carabid beetle 304 | n02168699 long-horned beetle, longicorn, longicorn beetle 305 | n02169497 leaf beetle, chrysomelid 306 | n02172182 dung beetle 307 | n02174001 rhinoceros beetle 308 | n02177972 weevil 309 | n02190166 fly 310 | n02206856 bee 311 | n02219486 ant, emmet, pismire 312 | n02226429 grasshopper, hopper 313 | n02229544 cricket 314 | n02231487 walking stick, walkingstick, stick insect 315 | n02233338 cockroach, roach 316 | n02236044 mantis, mantid 317 | n02256656 cicada, cicala 318 | n02259212 leafhopper 319 | n02264363 lacewing, lacewing fly 320 | n02268443 dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk 321 | n02268853 damselfly 322 | n02276258 admiral 323 | n02277742 ringlet, ringlet butterfly 324 | n02279972 monarch, monarch butterfly, milkweed butterfly, Danaus plexippus 325 | n02280649 cabbage butterfly 326 | n02281406 sulphur butterfly, sulfur butterfly 327 | n02281787 lycaenid, lycaenid butterfly 328 | n02317335 starfish, sea star 329 | n02319095 sea urchin 330 | n02321529 sea cucumber, holothurian 331 | n02325366 wood rabbit, cottontail, cottontail rabbit 332 | n02326432 hare 333 | n02328150 Angora, Angora rabbit 334 | n02342885 hamster 335 | n02346627 porcupine, hedgehog 336 | n02356798 fox squirrel, eastern fox squirrel, Sciurus niger 337 | n02361337 marmot 338 | n02363005 beaver 339 | n02364673 guinea pig, Cavia cobaya 340 | n02389026 sorrel 341 | n02391049 zebra 342 | n02395406 hog, pig, grunter, squealer, Sus scrofa 343 | n02396427 wild boar, boar, Sus scrofa 344 | n02397096 warthog 345 | n02398521 hippopotamus, hippo, river horse, Hippopotamus amphibius 346 | n02403003 ox 347 | n02408429 water buffalo, water ox, Asiatic buffalo, Bubalus bubalis 348 | n02410509 bison 349 | n02412080 ram, tup 350 | n02415577 bighorn, bighorn sheep, cimarron, 
Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis 351 | n02417914 ibex, Capra ibex 352 | n02422106 hartebeest 353 | n02422699 impala, Aepyceros melampus 354 | n02423022 gazelle 355 | n02437312 Arabian camel, dromedary, Camelus dromedarius 356 | n02437616 llama 357 | n02441942 weasel 358 | n02442845 mink 359 | n02443114 polecat, fitch, foulmart, foumart, Mustela putorius 360 | n02443484 black-footed ferret, ferret, Mustela nigripes 361 | n02444819 otter 362 | n02445715 skunk, polecat, wood pussy 363 | n02447366 badger 364 | n02454379 armadillo 365 | n02457408 three-toed sloth, ai, Bradypus tridactylus 366 | n02480495 orangutan, orang, orangutang, Pongo pygmaeus 367 | n02480855 gorilla, Gorilla gorilla 368 | n02481823 chimpanzee, chimp, Pan troglodytes 369 | n02483362 gibbon, Hylobates lar 370 | n02483708 siamang, Hylobates syndactylus, Symphalangus syndactylus 371 | n02484975 guenon, guenon monkey 372 | n02486261 patas, hussar monkey, Erythrocebus patas 373 | n02486410 baboon 374 | n02487347 macaque 375 | n02488291 langur 376 | n02488702 colobus, colobus monkey 377 | n02489166 proboscis monkey, Nasalis larvatus 378 | n02490219 marmoset 379 | n02492035 capuchin, ringtail, Cebus capucinus 380 | n02492660 howler monkey, howler 381 | n02493509 titi, titi monkey 382 | n02493793 spider monkey, Ateles geoffroyi 383 | n02494079 squirrel monkey, Saimiri sciureus 384 | n02497673 Madagascar cat, ring-tailed lemur, Lemur catta 385 | n02500267 indri, indris, Indri indri, Indri brevicaudatus 386 | n02504013 Indian elephant, Elephas maximus 387 | n02504458 African elephant, Loxodonta africana 388 | n02509815 lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens 389 | n02510455 giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca 390 | n02514041 barracouta, snoek 391 | n02526121 eel 392 | n02536864 coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch 393 | n02606052 rock beauty, Holocanthus tricolor 394 | n02607072 anemone 
fish 395 | n02640242 sturgeon 396 | n02641379 gar, garfish, garpike, billfish, Lepisosteus osseus 397 | n02643566 lionfish 398 | n02655020 puffer, pufferfish, blowfish, globefish 399 | n02666196 abacus 400 | n02667093 abaya 401 | n02669723 academic gown, academic robe, judge's robe 402 | n02672831 accordion, piano accordion, squeeze box 403 | n02676566 acoustic guitar 404 | n02687172 aircraft carrier, carrier, flattop, attack aircraft carrier 405 | n02690373 airliner 406 | n02692877 airship, dirigible 407 | n02699494 altar 408 | n02701002 ambulance 409 | n02704792 amphibian, amphibious vehicle 410 | n02708093 analog clock 411 | n02727426 apiary, bee house 412 | n02730930 apron 413 | n02747177 ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin 414 | n02749479 assault rifle, assault gun 415 | n02769748 backpack, back pack, knapsack, packsack, rucksack, haversack 416 | n02776631 bakery, bakeshop, bakehouse 417 | n02777292 balance beam, beam 418 | n02782093 balloon 419 | n02783161 ballpoint, ballpoint pen, ballpen, Biro 420 | n02786058 Band Aid 421 | n02787622 banjo 422 | n02788148 bannister, banister, balustrade, balusters, handrail 423 | n02790996 barbell 424 | n02791124 barber chair 425 | n02791270 barbershop 426 | n02793495 barn 427 | n02794156 barometer 428 | n02795169 barrel, cask 429 | n02797295 barrow, garden cart, lawn cart, wheelbarrow 430 | n02799071 baseball 431 | n02802426 basketball 432 | n02804414 bassinet 433 | n02804610 bassoon 434 | n02807133 bathing cap, swimming cap 435 | n02808304 bath towel 436 | n02808440 bathtub, bathing tub, bath, tub 437 | n02814533 beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon 438 | n02814860 beacon, lighthouse, beacon light, pharos 439 | n02815834 beaker 440 | n02817516 bearskin, busby, shako 441 | n02823428 beer bottle 442 | n02823750 beer glass 443 | n02825657 bell cote, bell cot 444 | n02834397 bib 445 | n02835271 
bicycle-built-for-two, tandem bicycle, tandem 446 | n02837789 bikini, two-piece 447 | n02840245 binder, ring-binder 448 | n02841315 binoculars, field glasses, opera glasses 449 | n02843684 birdhouse 450 | n02859443 boathouse 451 | n02860847 bobsled, bobsleigh, bob 452 | n02865351 bolo tie, bolo, bola tie, bola 453 | n02869837 bonnet, poke bonnet 454 | n02870880 bookcase 455 | n02871525 bookshop, bookstore, bookstall 456 | n02877765 bottlecap 457 | n02879718 bow 458 | n02883205 bow tie, bow-tie, bowtie 459 | n02892201 brass, memorial tablet, plaque 460 | n02892767 brassiere, bra, bandeau 461 | n02894605 breakwater, groin, groyne, mole, bulwark, seawall, jetty 462 | n02895154 breastplate, aegis, egis 463 | n02906734 broom 464 | n02909870 bucket, pail 465 | n02910353 buckle 466 | n02916936 bulletproof vest 467 | n02917067 bullet train, bullet 468 | n02927161 butcher shop, meat market 469 | n02930766 cab, hack, taxi, taxicab 470 | n02939185 caldron, cauldron 471 | n02948072 candle, taper, wax light 472 | n02950826 cannon 473 | n02951358 canoe 474 | n02951585 can opener, tin opener 475 | n02963159 cardigan 476 | n02965783 car mirror 477 | n02966193 carousel, carrousel, merry-go-round, roundabout, whirligig 478 | n02966687 carpenter's kit, tool kit 479 | n02971356 carton 480 | n02974003 car wheel 481 | n02977058 cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM 482 | n02978881 cassette 483 | n02979186 cassette player 484 | n02980441 castle 485 | n02981792 catamaran 486 | n02988304 CD player 487 | n02992211 cello, violoncello 488 | n02992529 cellular telephone, cellular phone, cellphone, cell, mobile phone 489 | n02999410 chain 490 | n03000134 chainlink fence 491 | n03000247 chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour 492 | n03000684 chain saw, chainsaw 493 | n03014705 chest 494 | n03016953 chiffonier, commode 495 | n03017168 chime, bell, gong 496 | n03018349 china 
cabinet, china closet 497 | n03026506 Christmas stocking 498 | n03028079 church, church building 499 | n03032252 cinema, movie theater, movie theatre, movie house, picture palace 500 | n03041632 cleaver, meat cleaver, chopper 501 | n03042490 cliff dwelling 502 | n03045698 cloak 503 | n03047690 clog, geta, patten, sabot 504 | n03062245 cocktail shaker 505 | n03063599 coffee mug 506 | n03063689 coffeepot 507 | n03065424 coil, spiral, volute, whorl, helix 508 | n03075370 combination lock 509 | n03085013 computer keyboard, keypad 510 | n03089624 confectionery, confectionary, candy store 511 | n03095699 container ship, containership, container vessel 512 | n03100240 convertible 513 | n03109150 corkscrew, bottle screw 514 | n03110669 cornet, horn, trumpet, trump 515 | n03124043 cowboy boot 516 | n03124170 cowboy hat, ten-gallon hat 517 | n03125729 cradle 518 | n03126707 crane 519 | n03127747 crash helmet 520 | n03127925 crate 521 | n03131574 crib, cot 522 | n03133878 Crock Pot 523 | n03134739 croquet ball 524 | n03141823 crutch 525 | n03146219 cuirass 526 | n03160309 dam, dike, dyke 527 | n03179701 desk 528 | n03180011 desktop computer 529 | n03187595 dial telephone, dial phone 530 | n03188531 diaper, nappy, napkin 531 | n03196217 digital clock 532 | n03197337 digital watch 533 | n03201208 dining table, board 534 | n03207743 dishrag, dishcloth 535 | n03207941 dishwasher, dish washer, dishwashing machine 536 | n03208938 disk brake, disc brake 537 | n03216828 dock, dockage, docking facility 538 | n03218198 dogsled, dog sled, dog sleigh 539 | n03220513 dome 540 | n03223299 doormat, welcome mat 541 | n03240683 drilling platform, offshore rig 542 | n03249569 drum, membranophone, tympan 543 | n03250847 drumstick 544 | n03255030 dumbbell 545 | n03259280 Dutch oven 546 | n03271574 electric fan, blower 547 | n03272010 electric guitar 548 | n03272562 electric locomotive 549 | n03290653 entertainment center 550 | n03291819 envelope 551 | n03297495 espresso maker 552 | n03314780 
face powder 553 | n03325584 feather boa, boa 554 | n03337140 file, file cabinet, filing cabinet 555 | n03344393 fireboat 556 | n03345487 fire engine, fire truck 557 | n03347037 fire screen, fireguard 558 | n03355925 flagpole, flagstaff 559 | n03372029 flute, transverse flute 560 | n03376595 folding chair 561 | n03379051 football helmet 562 | n03384352 forklift 563 | n03388043 fountain 564 | n03388183 fountain pen 565 | n03388549 four-poster 566 | n03393912 freight car 567 | n03394916 French horn, horn 568 | n03400231 frying pan, frypan, skillet 569 | n03404251 fur coat 570 | n03417042 garbage truck, dustcart 571 | n03424325 gasmask, respirator, gas helmet 572 | n03425413 gas pump, gasoline pump, petrol pump, island dispenser 573 | n03443371 goblet 574 | n03444034 go-kart 575 | n03445777 golf ball 576 | n03445924 golfcart, golf cart 577 | n03447447 gondola 578 | n03447721 gong, tam-tam 579 | n03450230 gown 580 | n03452741 grand piano, grand 581 | n03457902 greenhouse, nursery, glasshouse 582 | n03459775 grille, radiator grille 583 | n03461385 grocery store, grocery, food market, market 584 | n03467068 guillotine 585 | n03476684 hair slide 586 | n03476991 hair spray 587 | n03478589 half track 588 | n03481172 hammer 589 | n03482405 hamper 590 | n03483316 hand blower, blow dryer, blow drier, hair dryer, hair drier 591 | n03485407 hand-held computer, hand-held microcomputer 592 | n03485794 handkerchief, hankie, hanky, hankey 593 | n03492542 hard disc, hard disk, fixed disk 594 | n03494278 harmonica, mouth organ, harp, mouth harp 595 | n03495258 harp 596 | n03496892 harvester, reaper 597 | n03498962 hatchet 598 | n03527444 holster 599 | n03529860 home theater, home theatre 600 | n03530642 honeycomb 601 | n03532672 hook, claw 602 | n03534580 hoopskirt, crinoline 603 | n03535780 horizontal bar, high bar 604 | n03538406 horse cart, horse-cart 605 | n03544143 hourglass 606 | n03584254 iPod 607 | n03584829 iron, smoothing iron 608 | n03590841 jack-o'-lantern 609 | n03594734 
jean, blue jean, denim 610 | n03594945 jeep, landrover 611 | n03595614 jersey, T-shirt, tee shirt 612 | n03598930 jigsaw puzzle 613 | n03599486 jinrikisha, ricksha, rickshaw 614 | n03602883 joystick 615 | n03617480 kimono 616 | n03623198 knee pad 617 | n03627232 knot 618 | n03630383 lab coat, laboratory coat 619 | n03633091 ladle 620 | n03637318 lampshade, lamp shade 621 | n03642806 laptop, laptop computer 622 | n03649909 lawn mower, mower 623 | n03657121 lens cap, lens cover 624 | n03658185 letter opener, paper knife, paperknife 625 | n03661043 library 626 | n03662601 lifeboat 627 | n03666591 lighter, light, igniter, ignitor 628 | n03670208 limousine, limo 629 | n03673027 liner, ocean liner 630 | n03676483 lipstick, lip rouge 631 | n03680355 Loafer 632 | n03690938 lotion 633 | n03691459 loudspeaker, speaker, speaker unit, loudspeaker system, speaker system 634 | n03692522 loupe, jeweler's loupe 635 | n03697007 lumbermill, sawmill 636 | n03706229 magnetic compass 637 | n03709823 mailbag, postbag 638 | n03710193 mailbox, letter box 639 | n03710637 maillot 640 | n03710721 maillot, tank suit 641 | n03717622 manhole cover 642 | n03720891 maraca 643 | n03721384 marimba, xylophone 644 | n03724870 mask 645 | n03729826 matchstick 646 | n03733131 maypole 647 | n03733281 maze, labyrinth 648 | n03733805 measuring cup 649 | n03742115 medicine chest, medicine cabinet 650 | n03743016 megalith, megalithic structure 651 | n03759954 microphone, mike 652 | n03761084 microwave, microwave oven 653 | n03763968 military uniform 654 | n03764736 milk can 655 | n03769881 minibus 656 | n03770439 miniskirt, mini 657 | n03770679 minivan 658 | n03773504 missile 659 | n03775071 mitten 660 | n03775546 mixing bowl 661 | n03776460 mobile home, manufactured home 662 | n03777568 Model T 663 | n03777754 modem 664 | n03781244 monastery 665 | n03782006 monitor 666 | n03785016 moped 667 | n03786901 mortar 668 | n03787032 mortarboard 669 | n03788195 mosque 670 | n03788365 mosquito net 671 | n03791053 
motor scooter, scooter 672 | n03792782 mountain bike, all-terrain bike, off-roader 673 | n03792972 mountain tent 674 | n03793489 mouse, computer mouse 675 | n03794056 mousetrap 676 | n03796401 moving van 677 | n03803284 muzzle 678 | n03804744 nail 679 | n03814639 neck brace 680 | n03814906 necklace 681 | n03825788 nipple 682 | n03832673 notebook, notebook computer 683 | n03837869 obelisk 684 | n03838899 oboe, hautboy, hautbois 685 | n03840681 ocarina, sweet potato 686 | n03841143 odometer, hodometer, mileometer, milometer 687 | n03843555 oil filter 688 | n03854065 organ, pipe organ 689 | n03857828 oscilloscope, scope, cathode-ray oscilloscope, CRO 690 | n03866082 overskirt 691 | n03868242 oxcart 692 | n03868863 oxygen mask 693 | n03871628 packet 694 | n03873416 paddle, boat paddle 695 | n03874293 paddlewheel, paddle wheel 696 | n03874599 padlock 697 | n03876231 paintbrush 698 | n03877472 pajama, pyjama, pj's, jammies 699 | n03877845 palace 700 | n03884397 panpipe, pandean pipe, syrinx 701 | n03887697 paper towel 702 | n03888257 parachute, chute 703 | n03888605 parallel bars, bars 704 | n03891251 park bench 705 | n03891332 parking meter 706 | n03895866 passenger car, coach, carriage 707 | n03899768 patio, terrace 708 | n03902125 pay-phone, pay-station 709 | n03903868 pedestal, plinth, footstall 710 | n03908618 pencil box, pencil case 711 | n03908714 pencil sharpener 712 | n03916031 perfume, essence 713 | n03920288 Petri dish 714 | n03924679 photocopier 715 | n03929660 pick, plectrum, plectron 716 | n03929855 pickelhaube 717 | n03930313 picket fence, paling 718 | n03930630 pickup, pickup truck 719 | n03933933 pier 720 | n03935335 piggy bank, penny bank 721 | n03937543 pill bottle 722 | n03938244 pillow 723 | n03942813 ping-pong ball 724 | n03944341 pinwheel 725 | n03947888 pirate, pirate ship 726 | n03950228 pitcher, ewer 727 | n03954731 plane, carpenter's plane, woodworking plane 728 | n03956157 planetarium 729 | n03958227 plastic bag 730 | n03961711 plate rack 731 
| n03967562 plow, plough 732 | n03970156 plunger, plumber's helper 733 | n03976467 Polaroid camera, Polaroid Land camera 734 | n03976657 pole 735 | n03977966 police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria 736 | n03980874 poncho 737 | n03982430 pool table, billiard table, snooker table 738 | n03983396 pop bottle, soda bottle 739 | n03991062 pot, flowerpot 740 | n03992509 potter's wheel 741 | n03995372 power drill 742 | n03998194 prayer rug, prayer mat 743 | n04004767 printer 744 | n04005630 prison, prison house 745 | n04008634 projectile, missile 746 | n04009552 projector 747 | n04019541 puck, hockey puck 748 | n04023962 punching bag, punch bag, punching ball, punchball 749 | n04026417 purse 750 | n04033901 quill, quill pen 751 | n04033995 quilt, comforter, comfort, puff 752 | n04037443 racer, race car, racing car 753 | n04039381 racket, racquet 754 | n04040759 radiator 755 | n04041544 radio, wireless 756 | n04044716 radio telescope, radio reflector 757 | n04049303 rain barrel 758 | n04065272 recreational vehicle, RV, R.V. 
759 | n04067472 reel 760 | n04069434 reflex camera 761 | n04070727 refrigerator, icebox 762 | n04074963 remote control, remote 763 | n04081281 restaurant, eating house, eating place, eatery 764 | n04086273 revolver, six-gun, six-shooter 765 | n04090263 rifle 766 | n04099969 rocking chair, rocker 767 | n04111531 rotisserie 768 | n04116512 rubber eraser, rubber, pencil eraser 769 | n04118538 rugby ball 770 | n04118776 rule, ruler 771 | n04120489 running shoe 772 | n04125021 safe 773 | n04127249 safety pin 774 | n04131690 saltshaker, salt shaker 775 | n04133789 sandal 776 | n04136333 sarong 777 | n04141076 sax, saxophone 778 | n04141327 scabbard 779 | n04141975 scale, weighing machine 780 | n04146614 school bus 781 | n04147183 schooner 782 | n04149813 scoreboard 783 | n04152593 screen, CRT screen 784 | n04153751 screw 785 | n04154565 screwdriver 786 | n04162706 seat belt, seatbelt 787 | n04179913 sewing machine 788 | n04192698 shield, buckler 789 | n04200800 shoe shop, shoe-shop, shoe store 790 | n04201297 shoji 791 | n04204238 shopping basket 792 | n04204347 shopping cart 793 | n04208210 shovel 794 | n04209133 shower cap 795 | n04209239 shower curtain 796 | n04228054 ski 797 | n04229816 ski mask 798 | n04235860 sleeping bag 799 | n04238763 slide rule, slipstick 800 | n04239074 sliding door 801 | n04243546 slot, one-armed bandit 802 | n04251144 snorkel 803 | n04252077 snowmobile 804 | n04252225 snowplow, snowplough 805 | n04254120 soap dispenser 806 | n04254680 soccer ball 807 | n04254777 sock 808 | n04258138 solar dish, solar collector, solar furnace 809 | n04259630 sombrero 810 | n04263257 soup bowl 811 | n04264628 space bar 812 | n04265275 space heater 813 | n04266014 space shuttle 814 | n04270147 spatula 815 | n04273569 speedboat 816 | n04275548 spider web, spider's web 817 | n04277352 spindle 818 | n04285008 sports car, sport car 819 | n04286575 spotlight, spot 820 | n04296562 stage 821 | n04310018 steam locomotive 822 | n04311004 steel arch bridge 823 | 
n04311174 steel drum 824 | n04317175 stethoscope 825 | n04325704 stole 826 | n04326547 stone wall 827 | n04328186 stopwatch, stop watch 828 | n04330267 stove 829 | n04332243 strainer 830 | n04335435 streetcar, tram, tramcar, trolley, trolley car 831 | n04336792 stretcher 832 | n04344873 studio couch, day bed 833 | n04346328 stupa, tope 834 | n04347754 submarine, pigboat, sub, U-boat 835 | n04350905 suit, suit of clothes 836 | n04355338 sundial 837 | n04355933 sunglass 838 | n04356056 sunglasses, dark glasses, shades 839 | n04357314 sunscreen, sunblock, sun blocker 840 | n04366367 suspension bridge 841 | n04367480 swab, swob, mop 842 | n04370456 sweatshirt 843 | n04371430 swimming trunks, bathing trunks 844 | n04371774 swing 845 | n04372370 switch, electric switch, electrical switch 846 | n04376876 syringe 847 | n04380533 table lamp 848 | n04389033 tank, army tank, armored combat vehicle, armoured combat vehicle 849 | n04392985 tape player 850 | n04398044 teapot 851 | n04399382 teddy, teddy bear 852 | n04404412 television, television system 853 | n04409515 tennis ball 854 | n04417672 thatch, thatched roof 855 | n04418357 theater curtain, theatre curtain 856 | n04423845 thimble 857 | n04428191 thresher, thrasher, threshing machine 858 | n04429376 throne 859 | n04435653 tile roof 860 | n04442312 toaster 861 | n04443257 tobacco shop, tobacconist shop, tobacconist 862 | n04447861 toilet seat 863 | n04456115 torch 864 | n04458633 totem pole 865 | n04461696 tow truck, tow car, wrecker 866 | n04462240 toyshop 867 | n04465501 tractor 868 | n04467665 trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi 869 | n04476259 tray 870 | n04479046 trench coat 871 | n04482393 tricycle, trike, velocipede 872 | n04483307 trimaran 873 | n04485082 tripod 874 | n04486054 triumphal arch 875 | n04487081 trolleybus, trolley coach, trackless trolley 876 | n04487394 trombone 877 | n04493381 tub, vat 878 | n04501370 turnstile 879 | n04505470 typewriter keyboard 880 | 
n04507155 umbrella 881 | n04509417 unicycle, monocycle 882 | n04515003 upright, upright piano 883 | n04517823 vacuum, vacuum cleaner 884 | n04522168 vase 885 | n04523525 vault 886 | n04525038 velvet 887 | n04525305 vending machine 888 | n04532106 vestment 889 | n04532670 viaduct 890 | n04536866 violin, fiddle 891 | n04540053 volleyball 892 | n04542943 waffle iron 893 | n04548280 wall clock 894 | n04548362 wallet, billfold, notecase, pocketbook 895 | n04550184 wardrobe, closet, press 896 | n04552348 warplane, military plane 897 | n04553703 washbasin, handbasin, washbowl, lavabo, wash-hand basin 898 | n04554684 washer, automatic washer, washing machine 899 | n04557648 water bottle 900 | n04560804 water jug 901 | n04562935 water tower 902 | n04579145 whiskey jug 903 | n04579432 whistle 904 | n04584207 wig 905 | n04589890 window screen 906 | n04590129 window shade 907 | n04591157 Windsor tie 908 | n04591713 wine bottle 909 | n04592741 wing 910 | n04596742 wok 911 | n04597913 wooden spoon 912 | n04599235 wool, woolen, woollen 913 | n04604644 worm fence, snake fence, snake-rail fence, Virginia fence 914 | n04606251 wreck 915 | n04612504 yawl 916 | n04613696 yurt 917 | n06359193 web site, website, internet site, site 918 | n06596364 comic book 919 | n06785654 crossword puzzle, crossword 920 | n06794110 street sign 921 | n06874185 traffic light, traffic signal, stoplight 922 | n07248320 book jacket, dust cover, dust jacket, dust wrapper 923 | n07565083 menu 924 | n07579787 plate 925 | n07583066 guacamole 926 | n07584110 consomme 927 | n07590611 hot pot, hotpot 928 | n07613480 trifle 929 | n07614500 ice cream, icecream 930 | n07615774 ice lolly, lolly, lollipop, popsicle 931 | n07684084 French loaf 932 | n07693725 bagel, beigel 933 | n07695742 pretzel 934 | n07697313 cheeseburger 935 | n07697537 hotdog, hot dog, red hot 936 | n07711569 mashed potato 937 | n07714571 head cabbage 938 | n07714990 broccoli 939 | n07715103 cauliflower 940 | n07716358 zucchini, courgette 941 | 
n07716906 spaghetti squash 942 | n07717410 acorn squash 943 | n07717556 butternut squash 944 | n07718472 cucumber, cuke 945 | n07718747 artichoke, globe artichoke 946 | n07720875 bell pepper 947 | n07730033 cardoon 948 | n07734744 mushroom 949 | n07742313 Granny Smith 950 | n07745940 strawberry 951 | n07747607 orange 952 | n07749582 lemon 953 | n07753113 fig 954 | n07753275 pineapple, ananas 955 | n07753592 banana 956 | n07754684 jackfruit, jak, jack 957 | n07760859 custard apple 958 | n07768694 pomegranate 959 | n07802026 hay 960 | n07831146 carbonara 961 | n07836838 chocolate sauce, chocolate syrup 962 | n07860988 dough 963 | n07871810 meat loaf, meatloaf 964 | n07873807 pizza, pizza pie 965 | n07875152 potpie 966 | n07880968 burrito 967 | n07892512 red wine 968 | n07920052 espresso 969 | n07930864 cup 970 | n07932039 eggnog 971 | n09193705 alp 972 | n09229709 bubble 973 | n09246464 cliff, drop, drop-off 974 | n09256479 coral reef 975 | n09288635 geyser 976 | n09332890 lakeside, lakeshore 977 | n09399592 promontory, headland, head, foreland 978 | n09421951 sandbar, sand bar 979 | n09428293 seashore, coast, seacoast, sea-coast 980 | n09468604 valley, vale 981 | n09472597 volcano 982 | n09835506 ballplayer, baseball player 983 | n10148035 groom, bridegroom 984 | n10565667 scuba diver 985 | n11879895 rapeseed 986 | n11939491 daisy 987 | n12057211 yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum 988 | n12144580 corn 989 | n12267677 acorn 990 | n12620546 hip, rose hip, rosehip 991 | n12768682 buckeye, horse chestnut, conker 992 | n12985857 coral fungus 993 | n12998815 agaric 994 | n13037406 gyromitra 995 | n13040303 stinkhorn, carrion fungus 996 | n13044778 earthstar 997 | n13052670 hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa 998 | n13054560 bolete 999 | n13133613 ear, spike, capitulum 1000 | n15075141 toilet tissue, toilet paper, bathroom tissue 
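The list above reproduces `datasets/inception/imagenet_class_names.txt`: each entry maps a WordNet synset ID (such as `n02930766`) to a comma-separated list of human-readable synonyms. A minimal sketch of how entries in this format could be parsed — assuming one `nXXXXXXXX name1, name2, ...` entry per line; the `parse_class_names` helper and the inline sample lines are illustrative, not part of the repository:

```python
def parse_class_names(lines):
    """Map each WordNet synset ID to its list of class-name synonyms."""
    classes = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        # Each entry is "<synset id><space><comma-separated names>"
        wnid, _, names = line.partition(" ")
        classes[wnid] = [name.strip() for name in names.split(",")]
    return classes

# Sample entries copied from the list above; in practice you would pass
# open("datasets/inception/imagenet_class_names.txt") instead.
sample = [
    "n02837789 bikini, two-piece",
    "n02930766 cab, hack, taxi, taxicab",
    "n15075141 toilet tissue, toilet paper, bathroom tissue",
]
class_names = parse_class_names(sample)
print(class_names["n02930766"][0])  # -> cab
```

Since the 1000 entries are listed in class-index order, a mapping like this can turn a model's predicted class index or synset ID back into a readable label.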
-------------------------------------------------------------------------------- /datasets/jsb_chorales/README.md: -------------------------------------------------------------------------------- 1 | # Johann Sebastian Bach Chorales Dataset 2 | 3 | ## Source 4 | This dataset contains 382 chorales by Johann Sebastian Bach (in the public domain). Each chorale is composed of 100 to 640 chords, with a temporal resolution of one sixteenth note. Each chord is composed of 4 integers, each indicating the index of a note on a piano, except for the value 0, which means "no note played". 5 | 6 | This dataset is based on [czhuang's JSB-Chorales-dataset](https://github.com/czhuang/JSB-Chorales-dataset/blob/master/README.md) (`Jsb16thSeparated.npz`), which uses the train/validation/test split from Boulanger-Lewandowski (2012). 7 | 8 | Motivation: I thought it would be nice to have a version of this dataset in CSV format. 9 | 10 | ## Reference 11 | Boulanger-Lewandowski, N., Vincent, P., & Bengio, Y. (2012). Modeling Temporal Dependencies in High-Dimensional Sequences: Application to Polyphonic Music Generation and Transcription. Proceedings of the 29th International Conference on Machine Learning (ICML-12), 1159–1166. 12 | 13 | ## Usage 14 | Download `jsb_chorales.tgz` and untar it: 15 | 16 | ```bash 17 | $ tar xvzf jsb_chorales.tgz 18 | ``` 19 | 20 | ## Data structure 21 | The dataset is split into three subsets (train, valid, test), with a total of 382 CSV files: 22 | 23 | ``` 24 | $ tree 25 | . 26 | ├── train 27 | │   ├── chorale_000.csv 28 | │   ├── chorale_001.csv 29 | │   ├── chorale_002.csv 30 | │ │ ... 31 | │   ├── chorale_227.csv 32 | │   └── chorale_228.csv 33 | ├── valid 34 | │ ├── chorale_229.csv 35 | │ ├── chorale_230.csv 36 | │ ├── chorale_231.csv 37 | │ │ ... 38 | │   ├── chorale_303.csv 39 | │   └── chorale_304.csv 40 | └── test 41 |    ├── chorale_305.csv 42 |    ├── chorale_306.csv 43 |    ├── chorale_307.csv 44 | │ ... 
45 |    ├── chorale_380.csv 46 |    └── chorale_381.csv 47 | ``` 48 | 49 | ## Data sample 50 | 51 | ``` 52 | $ head train/chorale_000.csv 53 | note0,note1,note2,note3 54 | 74,70,65,58 55 | 74,70,65,58 56 | 74,70,65,58 57 | 74,70,65,58 58 | 75,70,58,55 59 | 75,70,58,55 60 | 75,70,60,55 61 | 75,70,60,55 62 | 77,69,62,50 63 | ``` 64 | 65 | Enjoy! 66 | -------------------------------------------------------------------------------- /datasets/jsb_chorales/jsb_chorales.tgz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ageron/handson-ml2/8958d538bdcdf29d329d9950bfc79034c29db724/datasets/jsb_chorales/jsb_chorales.tgz -------------------------------------------------------------------------------- /datasets/lifesat/README.md: -------------------------------------------------------------------------------- 1 | # Life satisfaction and GDP per capita 2 | ## Life satisfaction 3 | ### Source 4 | This dataset was obtained from the OECD's website at: http://stats.oecd.org/index.aspx?DataSetCode=BLI 5 | 6 | ### Data description 7 | 8 | Int64Index: 3292 entries, 0 to 3291 9 | Data columns (total 17 columns): 10 | "LOCATION" 3292 non-null object 11 | Country 3292 non-null object 12 | INDICATOR 3292 non-null object 13 | Indicator 3292 non-null object 14 | MEASURE 3292 non-null object 15 | Measure 3292 non-null object 16 | INEQUALITY 3292 non-null object 17 | Inequality 3292 non-null object 18 | Unit Code 3292 non-null object 19 | Unit 3292 non-null object 20 | PowerCode Code 3292 non-null int64 21 | PowerCode 3292 non-null object 22 | Reference Period Code 0 non-null float64 23 | Reference Period 0 non-null float64 24 | Value 3292 non-null float64 25 | Flag Codes 1120 non-null object 26 | Flags 1120 non-null object 27 | dtypes: float64(3), int64(1), object(13) 28 | memory usage: 462.9+ KB 29 | 30 | ### Example usage using python Pandas 31 | 32 | >>> life_sat = pd.read_csv("oecd_bli_2015.csv", thousands=',') 33 | 34 
| >>> life_sat_total = life_sat[life_sat["INEQUALITY"]=="TOT"] 35 | 36 | >>> life_sat_total = life_sat_total.pivot(index="Country", columns="Indicator", values="Value") 37 | 38 | >>> life_sat_total.info() 39 | 40 | Index: 37 entries, Australia to United States 41 | Data columns (total 24 columns): 42 | Air pollution 37 non-null float64 43 | Assault rate 37 non-null float64 44 | Consultation on rule-making 37 non-null float64 45 | Dwellings without basic facilities 37 non-null float64 46 | Educational attainment 37 non-null float64 47 | Employees working very long hours 37 non-null float64 48 | Employment rate 37 non-null float64 49 | Homicide rate 37 non-null float64 50 | Household net adjusted disposable income 37 non-null float64 51 | Household net financial wealth 37 non-null float64 52 | Housing expenditure 37 non-null float64 53 | Job security 37 non-null float64 54 | Life expectancy 37 non-null float64 55 | Life satisfaction 37 non-null float64 56 | Long-term unemployment rate 37 non-null float64 57 | Personal earnings 37 non-null float64 58 | Quality of support network 37 non-null float64 59 | Rooms per person 37 non-null float64 60 | Self-reported health 37 non-null float64 61 | Student skills 37 non-null float64 62 | Time devoted to leisure and personal care 37 non-null float64 63 | Voter turnout 37 non-null float64 64 | Water quality 37 non-null float64 65 | Years in education 37 non-null float64 66 | dtypes: float64(24) 67 | memory usage: 7.2+ KB 68 | 69 | ## GDP per capita 70 | ### Source 71 | Dataset obtained from the IMF's website at: http://goo.gl/j1MSKe 72 | 73 | ### Data description 74 | 75 | Int64Index: 190 entries, 0 to 189 76 | Data columns (total 7 columns): 77 | Country 190 non-null object 78 | Subject Descriptor 189 non-null object 79 | Units 189 non-null object 80 | Scale 189 non-null object 81 | Country/Series-specific Notes 188 non-null object 82 | 2015 187 non-null float64 83 | Estimates Start After 188 non-null float64 84 | dtypes: 
float64(2), object(5) 85 | memory usage: 11.9+ KB 86 | 87 | ### Example usage using python Pandas 88 | 89 | >>> gdp_per_capita = pd.read_csv( 90 | ... datapath+"gdp_per_capita.csv", thousands=',', delimiter='\t', 91 | ... encoding='latin1', na_values="n/a", index_col="Country") 92 | ... 93 | >>> gdp_per_capita.rename(columns={"2015": "GDP per capita"}, inplace=True) 94 | 95 | -------------------------------------------------------------------------------- /datasets/lifesat/gdp_per_capita.csv: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ageron/handson-ml2/8958d538bdcdf29d329d9950bfc79034c29db724/datasets/lifesat/gdp_per_capita.csv -------------------------------------------------------------------------------- /datasets/titanic/test.csv: -------------------------------------------------------------------------------- 1 | PassengerId,Pclass,Name,Sex,Age,SibSp,Parch,Ticket,Fare,Cabin,Embarked 2 | 892,3,"Kelly, Mr. James",male,34.5,0,0,330911,7.8292,,Q 3 | 893,3,"Wilkes, Mrs. James (Ellen Needs)",female,47.0,1,0,363272,7.0,,S 4 | 894,2,"Myles, Mr. Thomas Francis",male,62.0,0,0,240276,9.6875,,Q 5 | 895,3,"Wirz, Mr. Albert",male,27.0,0,0,315154,8.6625,,S 6 | 896,3,"Hirvonen, Mrs. Alexander (Helga E Lindqvist)",female,22.0,1,1,3101298,12.2875,,S 7 | 897,3,"Svensson, Mr. Johan Cervin",male,14.0,0,0,7538,9.225,,S 8 | 898,3,"Connolly, Miss. Kate",female,30.0,0,0,330972,7.6292,,Q 9 | 899,2,"Caldwell, Mr. Albert Francis",male,26.0,1,1,248738,29.0,,S 10 | 900,3,"Abrahim, Mrs. Joseph (Sophie Halaut Easu)",female,18.0,0,0,2657,7.2292,,C 11 | 901,3,"Davies, Mr. John Samuel",male,21.0,2,0,A/4 48871,24.15,,S 12 | 902,3,"Ilieff, Mr. Ylio",male,,0,0,349220,7.8958,,S 13 | 903,1,"Jones, Mr. Charles Cresson",male,46.0,0,0,694,26.0,,S 14 | 904,1,"Snyder, Mrs. John Pillsbury (Nelle Stevenson)",female,23.0,1,0,21228,82.2667,B45,S 15 | 905,2,"Howard, Mr. Benjamin",male,63.0,1,0,24065,26.0,,S 16 | 906,1,"Chaffee, Mrs. 
Herbert Fuller (Carrie Constance Toogood)",female,47.0,1,0,W.E.P. 5734,61.175,E31,S 17 | 907,2,"del Carlo, Mrs. Sebastiano (Argenia Genovesi)",female,24.0,1,0,SC/PARIS 2167,27.7208,,C 18 | 908,2,"Keane, Mr. Daniel",male,35.0,0,0,233734,12.35,,Q 19 | 909,3,"Assaf, Mr. Gerios",male,21.0,0,0,2692,7.225,,C 20 | 910,3,"Ilmakangas, Miss. Ida Livija",female,27.0,1,0,STON/O2. 3101270,7.925,,S 21 | 911,3,"Assaf Khalil, Mrs. Mariana ('Miriam')",female,45.0,0,0,2696,7.225,,C 22 | 912,1,"Rothschild, Mr. Martin",male,55.0,1,0,PC 17603,59.4,,C 23 | 913,3,"Olsen, Master. Artur Karl",male,9.0,0,1,C 17368,3.1708,,S 24 | 914,1,"Flegenheim, Mrs. Alfred (Antoinette)",female,,0,0,PC 17598,31.6833,,S 25 | 915,1,"Williams, Mr. Richard Norris II",male,21.0,0,1,PC 17597,61.3792,,C 26 | 916,1,"Ryerson, Mrs. Arthur Larned (Emily Maria Borie)",female,48.0,1,3,PC 17608,262.375,B57 B59 B63 B66,C 27 | 917,3,"Robins, Mr. Alexander A",male,50.0,1,0,A/5. 3337,14.5,,S 28 | 918,1,"Ostby, Miss. Helene Ragnhild",female,22.0,0,1,113509,61.9792,B36,C 29 | 919,3,"Daher, Mr. Shedid",male,22.5,0,0,2698,7.225,,C 30 | 920,1,"Brady, Mr. John Bertram",male,41.0,0,0,113054,30.5,A21,S 31 | 921,3,"Samaan, Mr. Elias",male,,2,0,2662,21.6792,,C 32 | 922,2,"Louch, Mr. Charles Alexander",male,50.0,1,0,SC/AH 3085,26.0,,S 33 | 923,2,"Jefferys, Mr. Clifford Thomas",male,24.0,2,0,C.A. 31029,31.5,,S 34 | 924,3,"Dean, Mrs. Bertram (Eva Georgetta Light)",female,33.0,1,2,C.A. 2315,20.575,,S 35 | 925,3,"Johnston, Mrs. Andrew G (Elizabeth 'Lily' Watson)",female,,1,2,W./C. 6607,23.45,,S 36 | 926,1,"Mock, Mr. Philipp Edmund",male,30.0,1,0,13236,57.75,C78,C 37 | 927,3,"Katavelas, Mr. Vassilios ('Catavelas Vassilios')",male,18.5,0,0,2682,7.2292,,C 38 | 928,3,"Roth, Miss. Sarah A",female,,0,0,342712,8.05,,S 39 | 929,3,"Cacic, Miss. Manda",female,21.0,0,0,315087,8.6625,,S 40 | 930,3,"Sap, Mr. Julius",male,25.0,0,0,345768,9.5,,S 41 | 931,3,"Hee, Mr. Ling",male,,0,0,1601,56.4958,,S 42 | 932,3,"Karun, Mr. 
Franz",male,39.0,0,1,349256,13.4167,,C 43 | 933,1,"Franklin, Mr. Thomas Parham",male,,0,0,113778,26.55,D34,S 44 | 934,3,"Goldsmith, Mr. Nathan",male,41.0,0,0,SOTON/O.Q. 3101263,7.85,,S 45 | 935,2,"Corbett, Mrs. Walter H (Irene Colvin)",female,30.0,0,0,237249,13.0,,S 46 | 936,1,"Kimball, Mrs. Edwin Nelson Jr (Gertrude Parsons)",female,45.0,1,0,11753,52.5542,D19,S 47 | 937,3,"Peltomaki, Mr. Nikolai Johannes",male,25.0,0,0,STON/O 2. 3101291,7.925,,S 48 | 938,1,"Chevre, Mr. Paul Romaine",male,45.0,0,0,PC 17594,29.7,A9,C 49 | 939,3,"Shaughnessy, Mr. Patrick",male,,0,0,370374,7.75,,Q 50 | 940,1,"Bucknell, Mrs. William Robert (Emma Eliza Ward)",female,60.0,0,0,11813,76.2917,D15,C 51 | 941,3,"Coutts, Mrs. William (Winnie 'Minnie' Treanor)",female,36.0,0,2,C.A. 37671,15.9,,S 52 | 942,1,"Smith, Mr. Lucien Philip",male,24.0,1,0,13695,60.0,C31,S 53 | 943,2,"Pulbaum, Mr. Franz",male,27.0,0,0,SC/PARIS 2168,15.0333,,C 54 | 944,2,"Hocking, Miss. Ellen 'Nellie'",female,20.0,2,1,29105,23.0,,S 55 | 945,1,"Fortune, Miss. Ethel Flora",female,28.0,3,2,19950,263.0,C23 C25 C27,S 56 | 946,2,"Mangiavacchi, Mr. Serafino Emilio",male,,0,0,SC/A.3 2861,15.5792,,C 57 | 947,3,"Rice, Master. Albert",male,10.0,4,1,382652,29.125,,Q 58 | 948,3,"Cor, Mr. Bartol",male,35.0,0,0,349230,7.8958,,S 59 | 949,3,"Abelseth, Mr. Olaus Jorgensen",male,25.0,0,0,348122,7.65,F G63,S 60 | 950,3,"Davison, Mr. Thomas Henry",male,,1,0,386525,16.1,,S 61 | 951,1,"Chaudanson, Miss. Victorine",female,36.0,0,0,PC 17608,262.375,B61,C 62 | 952,3,"Dika, Mr. Mirko",male,17.0,0,0,349232,7.8958,,S 63 | 953,2,"McCrae, Mr. Arthur Gordon",male,32.0,0,0,237216,13.5,,S 64 | 954,3,"Bjorklund, Mr. Ernst Herbert",male,18.0,0,0,347090,7.75,,S 65 | 955,3,"Bradley, Miss. Bridget Delia",female,22.0,0,0,334914,7.725,,Q 66 | 956,1,"Ryerson, Master. John Borie",male,13.0,2,2,PC 17608,262.375,B57 B59 B63 B66,C 67 | 957,2,"Corey, Mrs. Percy C (Mary Phyllis Elizabeth Miller)",female,,0,0,F.C.C. 13534,21.0,,S 68 | 958,3,"Burns, Miss. 
Mary Delia",female,18.0,0,0,330963,7.8792,,Q 69 | 959,1,"Moore, Mr. Clarence Bloomfield",male,47.0,0,0,113796,42.4,,S 70 | 960,1,"Tucker, Mr. Gilbert Milligan Jr",male,31.0,0,0,2543,28.5375,C53,C 71 | 961,1,"Fortune, Mrs. Mark (Mary McDougald)",female,60.0,1,4,19950,263.0,C23 C25 C27,S 72 | 962,3,"Mulvihill, Miss. Bertha E",female,24.0,0,0,382653,7.75,,Q 73 | 963,3,"Minkoff, Mr. Lazar",male,21.0,0,0,349211,7.8958,,S 74 | 964,3,"Nieminen, Miss. Manta Josefina",female,29.0,0,0,3101297,7.925,,S 75 | 965,1,"Ovies y Rodriguez, Mr. Servando",male,28.5,0,0,PC 17562,27.7208,D43,C 76 | 966,1,"Geiger, Miss. Amalie",female,35.0,0,0,113503,211.5,C130,C 77 | 967,1,"Keeping, Mr. Edwin",male,32.5,0,0,113503,211.5,C132,C 78 | 968,3,"Miles, Mr. Frank",male,,0,0,359306,8.05,,S 79 | 969,1,"Cornell, Mrs. Robert Clifford (Malvina Helen Lamson)",female,55.0,2,0,11770,25.7,C101,S 80 | 970,2,"Aldworth, Mr. Charles Augustus",male,30.0,0,0,248744,13.0,,S 81 | 971,3,"Doyle, Miss. Elizabeth",female,24.0,0,0,368702,7.75,,Q 82 | 972,3,"Boulos, Master. Akar",male,6.0,1,1,2678,15.2458,,C 83 | 973,1,"Straus, Mr. Isidor",male,67.0,1,0,PC 17483,221.7792,C55 C57,S 84 | 974,1,"Case, Mr. Howard Brown",male,49.0,0,0,19924,26.0,,S 85 | 975,3,"Demetri, Mr. Marinko",male,,0,0,349238,7.8958,,S 86 | 976,2,"Lamb, Mr. John Joseph",male,,0,0,240261,10.7083,,Q 87 | 977,3,"Khalil, Mr. Betros",male,,1,0,2660,14.4542,,C 88 | 978,3,"Barry, Miss. Julia",female,27.0,0,0,330844,7.8792,,Q 89 | 979,3,"Badman, Miss. Emily Louisa",female,18.0,0,0,A/4 31416,8.05,,S 90 | 980,3,"O'Donoghue, Ms. Bridget",female,,0,0,364856,7.75,,Q 91 | 981,2,"Wells, Master. Ralph Lester",male,2.0,1,1,29103,23.0,,S 92 | 982,3,"Dyker, Mrs. Adolf Fredrik (Anna Elisabeth Judith Andersson)",female,22.0,1,0,347072,13.9,,S 93 | 983,3,"Pedersen, Mr. Olaf",male,,0,0,345498,7.775,,S 94 | 984,1,"Davidson, Mrs. Thornton (Orian Hays)",female,27.0,1,2,F.C. 12750,52.0,B71,S 95 | 985,3,"Guest, Mr. Robert",male,,0,0,376563,8.05,,S 96 | 986,1,"Birnbaum, Mr. 
Jakob",male,25.0,0,0,13905,26.0,,C 97 | 987,3,"Tenglin, Mr. Gunnar Isidor",male,25.0,0,0,350033,7.7958,,S 98 | 988,1,"Cavendish, Mrs. Tyrell William (Julia Florence Siegel)",female,76.0,1,0,19877,78.85,C46,S 99 | 989,3,"Makinen, Mr. Kalle Edvard",male,29.0,0,0,STON/O 2. 3101268,7.925,,S 100 | 990,3,"Braf, Miss. Elin Ester Maria",female,20.0,0,0,347471,7.8542,,S 101 | 991,3,"Nancarrow, Mr. William Henry",male,33.0,0,0,A./5. 3338,8.05,,S 102 | 992,1,"Stengel, Mrs. Charles Emil Henry (Annie May Morris)",female,43.0,1,0,11778,55.4417,C116,C 103 | 993,2,"Weisz, Mr. Leopold",male,27.0,1,0,228414,26.0,,S 104 | 994,3,"Foley, Mr. William",male,,0,0,365235,7.75,,Q 105 | 995,3,"Johansson Palmquist, Mr. Oskar Leander",male,26.0,0,0,347070,7.775,,S 106 | 996,3,"Thomas, Mrs. Alexander (Thamine 'Thelma')",female,16.0,1,1,2625,8.5167,,C 107 | 997,3,"Holthen, Mr. Johan Martin",male,28.0,0,0,C 4001,22.525,,S 108 | 998,3,"Buckley, Mr. Daniel",male,21.0,0,0,330920,7.8208,,Q 109 | 999,3,"Ryan, Mr. Edward",male,,0,0,383162,7.75,,Q 110 | 1000,3,"Willer, Mr. Aaron ('Abi Weller')",male,,0,0,3410,8.7125,,S 111 | 1001,2,"Swane, Mr. George",male,18.5,0,0,248734,13.0,F,S 112 | 1002,2,"Stanton, Mr. Samuel Ward",male,41.0,0,0,237734,15.0458,,C 113 | 1003,3,"Shine, Miss. Ellen Natalia",female,,0,0,330968,7.7792,,Q 114 | 1004,1,"Evans, Miss. Edith Corse",female,36.0,0,0,PC 17531,31.6792,A29,C 115 | 1005,3,"Buckley, Miss. Katherine",female,18.5,0,0,329944,7.2833,,Q 116 | 1006,1,"Straus, Mrs. Isidor (Rosalie Ida Blun)",female,63.0,1,0,PC 17483,221.7792,C55 C57,S 117 | 1007,3,"Chronopoulos, Mr. Demetrios",male,18.0,1,0,2680,14.4542,,C 118 | 1008,3,"Thomas, Mr. John",male,,0,0,2681,6.4375,,C 119 | 1009,3,"Sandstrom, Miss. Beatrice Irene",female,1.0,1,1,PP 9549,16.7,G6,S 120 | 1010,1,"Beattie, Mr. Thomson",male,36.0,0,0,13050,75.2417,C6,C 121 | 1011,2,"Chapman, Mrs. John Henry (Sara Elizabeth Lawry)",female,29.0,1,0,SC/AH 29037,26.0,,S 122 | 1012,2,"Watt, Miss. Bertha J",female,12.0,0,0,C.A. 
33595,15.75,,S 123 | 1013,3,"Kiernan, Mr. John",male,,1,0,367227,7.75,,Q 124 | 1014,1,"Schabert, Mrs. Paul (Emma Mock)",female,35.0,1,0,13236,57.75,C28,C 125 | 1015,3,"Carver, Mr. Alfred John",male,28.0,0,0,392095,7.25,,S 126 | 1016,3,"Kennedy, Mr. John",male,,0,0,368783,7.75,,Q 127 | 1017,3,"Cribb, Miss. Laura Alice",female,17.0,0,1,371362,16.1,,S 128 | 1018,3,"Brobeck, Mr. Karl Rudolf",male,22.0,0,0,350045,7.7958,,S 129 | 1019,3,"McCoy, Miss. Alicia",female,,2,0,367226,23.25,,Q 130 | 1020,2,"Bowenur, Mr. Solomon",male,42.0,0,0,211535,13.0,,S 131 | 1021,3,"Petersen, Mr. Marius",male,24.0,0,0,342441,8.05,,S 132 | 1022,3,"Spinner, Mr. Henry John",male,32.0,0,0,STON/OQ. 369943,8.05,,S 133 | 1023,1,"Gracie, Col. Archibald IV",male,53.0,0,0,113780,28.5,C51,C 134 | 1024,3,"Lefebre, Mrs. Frank (Frances)",female,,0,4,4133,25.4667,,S 135 | 1025,3,"Thomas, Mr. Charles P",male,,1,0,2621,6.4375,,C 136 | 1026,3,"Dintcheff, Mr. Valtcho",male,43.0,0,0,349226,7.8958,,S 137 | 1027,3,"Carlsson, Mr. Carl Robert",male,24.0,0,0,350409,7.8542,,S 138 | 1028,3,"Zakarian, Mr. Mapriededer",male,26.5,0,0,2656,7.225,,C 139 | 1029,2,"Schmidt, Mr. August",male,26.0,0,0,248659,13.0,,S 140 | 1030,3,"Drapkin, Miss. Jennie",female,23.0,0,0,SOTON/OQ 392083,8.05,,S 141 | 1031,3,"Goodwin, Mr. Charles Frederick",male,40.0,1,6,CA 2144,46.9,,S 142 | 1032,3,"Goodwin, Miss. Jessie Allis",female,10.0,5,2,CA 2144,46.9,,S 143 | 1033,1,"Daniels, Miss. Sarah",female,33.0,0,0,113781,151.55,,S 144 | 1034,1,"Ryerson, Mr. Arthur Larned",male,61.0,1,3,PC 17608,262.375,B57 B59 B63 B66,C 145 | 1035,2,"Beauchamp, Mr. Henry James",male,28.0,0,0,244358,26.0,,S 146 | 1036,1,"Lindeberg-Lind, Mr. Erik Gustaf ('Mr Edward Lingrey')",male,42.0,0,0,17475,26.55,,S 147 | 1037,3,"Vander Planke, Mr. Julius",male,31.0,3,0,345763,18.0,,S 148 | 1038,1,"Hilliard, Mr. Herbert Henry",male,,0,0,17463,51.8625,E46,S 149 | 1039,3,"Davies, Mr. Evan",male,22.0,0,0,SC/A4 23568,8.05,,S 150 | 1040,1,"Crafton, Mr. 
John Bertram",male,,0,0,113791,26.55,,S 151 | 1041,2,"Lahtinen, Rev. William",male,30.0,1,1,250651,26.0,,S 152 | 1042,1,"Earnshaw, Mrs. Boulton (Olive Potter)",female,23.0,0,1,11767,83.1583,C54,C 153 | 1043,3,"Matinoff, Mr. Nicola",male,,0,0,349255,7.8958,,C 154 | 1044,3,"Storey, Mr. Thomas",male,60.5,0,0,3701,,,S 155 | 1045,3,"Klasen, Mrs. (Hulda Kristina Eugenia Lofqvist)",female,36.0,0,2,350405,12.1833,,S 156 | 1046,3,"Asplund, Master. Filip Oscar",male,13.0,4,2,347077,31.3875,,S 157 | 1047,3,"Duquemin, Mr. Joseph",male,24.0,0,0,S.O./P.P. 752,7.55,,S 158 | 1048,1,"Bird, Miss. Ellen",female,29.0,0,0,PC 17483,221.7792,C97,S 159 | 1049,3,"Lundin, Miss. Olga Elida",female,23.0,0,0,347469,7.8542,,S 160 | 1050,1,"Borebank, Mr. John James",male,42.0,0,0,110489,26.55,D22,S 161 | 1051,3,"Peacock, Mrs. Benjamin (Edith Nile)",female,26.0,0,2,SOTON/O.Q. 3101315,13.775,,S 162 | 1052,3,"Smyth, Miss. Julia",female,,0,0,335432,7.7333,,Q 163 | 1053,3,"Touma, Master. Georges Youssef",male,7.0,1,1,2650,15.2458,,C 164 | 1054,2,"Wright, Miss. Marion",female,26.0,0,0,220844,13.5,,S 165 | 1055,3,"Pearce, Mr. Ernest",male,,0,0,343271,7.0,,S 166 | 1056,2,"Peruschitz, Rev. Joseph Maria",male,41.0,0,0,237393,13.0,,S 167 | 1057,3,"Kink-Heilmann, Mrs. Anton (Luise Heilmann)",female,26.0,1,1,315153,22.025,,S 168 | 1058,1,"Brandeis, Mr. Emil",male,48.0,0,0,PC 17591,50.4958,B10,C 169 | 1059,3,"Ford, Mr. Edward Watson",male,18.0,2,2,W./C. 6608,34.375,,S 170 | 1060,1,"Cassebeer, Mrs. Henry Arthur Jr (Eleanor Genevieve Fosdick)",female,,0,0,17770,27.7208,,C 171 | 1061,3,"Hellstrom, Miss. Hilda Maria",female,22.0,0,0,7548,8.9625,,S 172 | 1062,3,"Lithman, Mr. Simon",male,,0,0,S.O./P.P. 251,7.55,,S 173 | 1063,3,"Zakarian, Mr. Ortin",male,27.0,0,0,2670,7.225,,C 174 | 1064,3,"Dyker, Mr. Adolf Fredrik",male,23.0,1,0,347072,13.9,,S 175 | 1065,3,"Torfa, Mr. Assad",male,,0,0,2673,7.2292,,C 176 | 1066,3,"Asplund, Mr. Carl Oscar Vilhelm Gustafsson",male,40.0,1,5,347077,31.3875,,S 177 | 1067,2,"Brown, Miss. 
Edith Eileen",female,15.0,0,2,29750,39.0,,S 178 | 1068,2,"Sincock, Miss. Maude",female,20.0,0,0,C.A. 33112,36.75,,S 179 | 1069,1,"Stengel, Mr. Charles Emil Henry",male,54.0,1,0,11778,55.4417,C116,C 180 | 1070,2,"Becker, Mrs. Allen Oliver (Nellie E Baumgardner)",female,36.0,0,3,230136,39.0,F4,S 181 | 1071,1,"Compton, Mrs. Alexander Taylor (Mary Eliza Ingersoll)",female,64.0,0,2,PC 17756,83.1583,E45,C 182 | 1072,2,"McCrie, Mr. James Matthew",male,30.0,0,0,233478,13.0,,S 183 | 1073,1,"Compton, Mr. Alexander Taylor Jr",male,37.0,1,1,PC 17756,83.1583,E52,C 184 | 1074,1,"Marvin, Mrs. Daniel Warner (Mary Graham Carmichael Farquarson)",female,18.0,1,0,113773,53.1,D30,S 185 | 1075,3,"Lane, Mr. Patrick",male,,0,0,7935,7.75,,Q 186 | 1076,1,"Douglas, Mrs. Frederick Charles (Mary Helene Baxter)",female,27.0,1,1,PC 17558,247.5208,B58 B60,C 187 | 1077,2,"Maybery, Mr. Frank Hubert",male,40.0,0,0,239059,16.0,,S 188 | 1078,2,"Phillips, Miss. Alice Frances Louisa",female,21.0,0,1,S.O./P.P. 2,21.0,,S 189 | 1079,3,"Davies, Mr. Joseph",male,17.0,2,0,A/4 48873,8.05,,S 190 | 1080,3,"Sage, Miss. Ada",female,,8,2,CA. 2343,69.55,,S 191 | 1081,2,"Veal, Mr. James",male,40.0,0,0,28221,13.0,,S 192 | 1082,2,"Angle, Mr. William A",male,34.0,1,0,226875,26.0,,S 193 | 1083,1,"Salomon, Mr. Abraham L",male,,0,0,111163,26.0,,S 194 | 1084,3,"van Billiard, Master. Walter John",male,11.5,1,1,A/5. 851,14.5,,S 195 | 1085,2,"Lingane, Mr. John",male,61.0,0,0,235509,12.35,,Q 196 | 1086,2,"Drew, Master. Marshall Brines",male,8.0,0,2,28220,32.5,,S 197 | 1087,3,"Karlsson, Mr. Julius Konrad Eugen",male,33.0,0,0,347465,7.8542,,S 198 | 1088,1,"Spedden, Master. Robert Douglas",male,6.0,0,2,16966,134.5,E34,C 199 | 1089,3,"Nilsson, Miss. Berta Olivia",female,18.0,0,0,347066,7.775,,S 200 | 1090,2,"Baimbrigge, Mr. Charles Robert",male,23.0,0,0,C.A. 31030,10.5,,S 201 | 1091,3,"Rasmussen, Mrs. (Lena Jacobsen Solvang)",female,,0,0,65305,8.1125,,S 202 | 1092,3,"Murphy, Miss. 
Nora",female,,0,0,36568,15.5,,Q 203 | 1093,3,"Danbom, Master. Gilbert Sigvard Emanuel",male,0.3333,0,2,347080,14.4,,S 204 | 1094,1,"Astor, Col. John Jacob",male,47.0,1,0,PC 17757,227.525,C62 C64,C 205 | 1095,2,"Quick, Miss. Winifred Vera",female,8.0,1,1,26360,26.0,,S 206 | 1096,2,"Andrew, Mr. Frank Thomas",male,25.0,0,0,C.A. 34050,10.5,,S 207 | 1097,1,"Omont, Mr. Alfred Fernand",male,,0,0,F.C. 12998,25.7417,,C 208 | 1098,3,"McGowan, Miss. Katherine",female,35.0,0,0,9232,7.75,,Q 209 | 1099,2,"Collett, Mr. Sidney C Stuart",male,24.0,0,0,28034,10.5,,S 210 | 1100,1,"Rosenbaum, Miss. Edith Louise",female,33.0,0,0,PC 17613,27.7208,A11,C 211 | 1101,3,"Delalic, Mr. Redjo",male,25.0,0,0,349250,7.8958,,S 212 | 1102,3,"Andersen, Mr. Albert Karvin",male,32.0,0,0,C 4001,22.525,,S 213 | 1103,3,"Finoli, Mr. Luigi",male,,0,0,SOTON/O.Q. 3101308,7.05,,S 214 | 1104,2,"Deacon, Mr. Percy William",male,17.0,0,0,S.O.C. 14879,73.5,,S 215 | 1105,2,"Howard, Mrs. Benjamin (Ellen Truelove Arman)",female,60.0,1,0,24065,26.0,,S 216 | 1106,3,"Andersson, Miss. Ida Augusta Margareta",female,38.0,4,2,347091,7.775,,S 217 | 1107,1,"Head, Mr. Christopher",male,42.0,0,0,113038,42.5,B11,S 218 | 1108,3,"Mahon, Miss. Bridget Delia",female,,0,0,330924,7.8792,,Q 219 | 1109,1,"Wick, Mr. George Dennick",male,57.0,1,1,36928,164.8667,,S 220 | 1110,1,"Widener, Mrs. George Dunton (Eleanor Elkins)",female,50.0,1,1,113503,211.5,C80,C 221 | 1111,3,"Thomson, Mr. Alexander Morrison",male,,0,0,32302,8.05,,S 222 | 1112,2,"Duran y More, Miss. Florentina",female,30.0,1,0,SC/PARIS 2148,13.8583,,C 223 | 1113,3,"Reynolds, Mr. Harold J",male,21.0,0,0,342684,8.05,,S 224 | 1114,2,"Cook, Mrs. (Selena Rogers)",female,22.0,0,0,W./C. 14266,10.5,F33,S 225 | 1115,3,"Karlsson, Mr. Einar Gervasius",male,21.0,0,0,350053,7.7958,,S 226 | 1116,1,"Candee, Mrs. Edward (Helen Churchill Hungerford)",female,53.0,0,0,PC 17606,27.4458,,C 227 | 1117,3,"Moubarek, Mrs. 
George (Omine 'Amenia' Alexander)",female,,0,2,2661,15.2458,,C 228 | 1118,3,"Asplund, Mr. Johan Charles",male,23.0,0,0,350054,7.7958,,S 229 | 1119,3,"McNeill, Miss. Bridget",female,,0,0,370368,7.75,,Q 230 | 1120,3,"Everett, Mr. Thomas James",male,40.5,0,0,C.A. 6212,15.1,,S 231 | 1121,2,"Hocking, Mr. Samuel James Metcalfe",male,36.0,0,0,242963,13.0,,S 232 | 1122,2,"Sweet, Mr. George Frederick",male,14.0,0,0,220845,65.0,,S 233 | 1123,1,"Willard, Miss. Constance",female,21.0,0,0,113795,26.55,,S 234 | 1124,3,"Wiklund, Mr. Karl Johan",male,21.0,1,0,3101266,6.4958,,S 235 | 1125,3,"Linehan, Mr. Michael",male,,0,0,330971,7.8792,,Q 236 | 1126,1,"Cumings, Mr. John Bradley",male,39.0,1,0,PC 17599,71.2833,C85,C 237 | 1127,3,"Vendel, Mr. Olof Edvin",male,20.0,0,0,350416,7.8542,,S 238 | 1128,1,"Warren, Mr. Frank Manley",male,64.0,1,0,110813,75.25,D37,C 239 | 1129,3,"Baccos, Mr. Raffull",male,20.0,0,0,2679,7.225,,C 240 | 1130,2,"Hiltunen, Miss. Marta",female,18.0,1,1,250650,13.0,,S 241 | 1131,1,"Douglas, Mrs. Walter Donald (Mahala Dutton)",female,48.0,1,0,PC 17761,106.425,C86,C 242 | 1132,1,"Lindstrom, Mrs. Carl Johan (Sigrid Posse)",female,55.0,0,0,112377,27.7208,,C 243 | 1133,2,"Christy, Mrs. (Alice Frances)",female,45.0,0,2,237789,30.0,,S 244 | 1134,1,"Spedden, Mr. Frederic Oakley",male,45.0,1,1,16966,134.5,E34,C 245 | 1135,3,"Hyman, Mr. Abraham",male,,0,0,3470,7.8875,,S 246 | 1136,3,"Johnston, Master. William Arthur 'Willie'",male,,1,2,W./C. 6607,23.45,,S 247 | 1137,1,"Kenyon, Mr. Frederick R",male,41.0,1,0,17464,51.8625,D21,S 248 | 1138,2,"Karnes, Mrs. J Frank (Claire Bennett)",female,22.0,0,0,F.C.C. 13534,21.0,,S 249 | 1139,2,"Drew, Mr. James Vivian",male,42.0,1,1,28220,32.5,,S 250 | 1140,2,"Hold, Mrs. Stephen (Annie Margaret Hill)",female,29.0,1,0,26707,26.0,,S 251 | 1141,3,"Khalil, Mrs. Betros (Zahie 'Maria' Elias)",female,,1,0,2660,14.4542,,C 252 | 1142,2,"West, Miss. Barbara J",female,0.9167,1,2,C.A. 34651,27.75,,S 253 | 1143,3,"Abrahamsson, Mr. 
Abraham August Johannes",male,20.0,0,0,SOTON/O2 3101284,7.925,,S 254 | 1144,1,"Clark, Mr. Walter Miller",male,27.0,1,0,13508,136.7792,C89,C 255 | 1145,3,"Salander, Mr. Karl Johan",male,24.0,0,0,7266,9.325,,S 256 | 1146,3,"Wenzel, Mr. Linhart",male,32.5,0,0,345775,9.5,,S 257 | 1147,3,"MacKay, Mr. George William",male,,0,0,C.A. 42795,7.55,,S 258 | 1148,3,"Mahon, Mr. John",male,,0,0,AQ/4 3130,7.75,,Q 259 | 1149,3,"Niklasson, Mr. Samuel",male,28.0,0,0,363611,8.05,,S 260 | 1150,2,"Bentham, Miss. Lilian W",female,19.0,0,0,28404,13.0,,S 261 | 1151,3,"Midtsjo, Mr. Karl Albert",male,21.0,0,0,345501,7.775,,S 262 | 1152,3,"de Messemaeker, Mr. Guillaume Joseph",male,36.5,1,0,345572,17.4,,S 263 | 1153,3,"Nilsson, Mr. August Ferdinand",male,21.0,0,0,350410,7.8542,,S 264 | 1154,2,"Wells, Mrs. Arthur Henry ('Addie' Dart Trevaskis)",female,29.0,0,2,29103,23.0,,S 265 | 1155,3,"Klasen, Miss. Gertrud Emilia",female,1.0,1,1,350405,12.1833,,S 266 | 1156,2,"Portaluppi, Mr. Emilio Ilario Giuseppe",male,30.0,0,0,C.A. 34644,12.7375,,C 267 | 1157,3,"Lyntakoff, Mr. Stanko",male,,0,0,349235,7.8958,,S 268 | 1158,1,"Chisholm, Mr. Roderick Robert Crispin",male,,0,0,112051,0.0,,S 269 | 1159,3,"Warren, Mr. Charles William",male,,0,0,C.A. 49867,7.55,,S 270 | 1160,3,"Howard, Miss. May Elizabeth",female,,0,0,A. 2. 39186,8.05,,S 271 | 1161,3,"Pokrnic, Mr. Mate",male,17.0,0,0,315095,8.6625,,S 272 | 1162,1,"McCaffry, Mr. Thomas Francis",male,46.0,0,0,13050,75.2417,C6,C 273 | 1163,3,"Fox, Mr. Patrick",male,,0,0,368573,7.75,,Q 274 | 1164,1,"Clark, Mrs. Walter Miller (Virginia McDowell)",female,26.0,1,0,13508,136.7792,C89,C 275 | 1165,3,"Lennon, Miss. Mary",female,,1,0,370371,15.5,,Q 276 | 1166,3,"Saade, Mr. Jean Nassr",male,,0,0,2676,7.225,,C 277 | 1167,2,"Bryhl, Miss. Dagmar Jenny Ingeborg",female,20.0,1,0,236853,26.0,,S 278 | 1168,2,"Parker, Mr. Clifford Richard",male,28.0,0,0,SC 14888,10.5,,S 279 | 1169,2,"Faunthorpe, Mr. Harry",male,40.0,1,0,2926,26.0,,S 280 | 1170,2,"Ware, Mr. 
John James",male,30.0,1,0,CA 31352,21.0,,S 281 | 1171,2,"Oxenham, Mr. Percy Thomas",male,22.0,0,0,W./C. 14260,10.5,,S 282 | 1172,3,"Oreskovic, Miss. Jelka",female,23.0,0,0,315085,8.6625,,S 283 | 1173,3,"Peacock, Master. Alfred Edward",male,0.75,1,1,SOTON/O.Q. 3101315,13.775,,S 284 | 1174,3,"Fleming, Miss. Honora",female,,0,0,364859,7.75,,Q 285 | 1175,3,"Touma, Miss. Maria Youssef",female,9.0,1,1,2650,15.2458,,C 286 | 1176,3,"Rosblom, Miss. Salli Helena",female,2.0,1,1,370129,20.2125,,S 287 | 1177,3,"Dennis, Mr. William",male,36.0,0,0,A/5 21175,7.25,,S 288 | 1178,3,"Franklin, Mr. Charles (Charles Fardon)",male,,0,0,SOTON/O.Q. 3101314,7.25,,S 289 | 1179,1,"Snyder, Mr. John Pillsbury",male,24.0,1,0,21228,82.2667,B45,S 290 | 1180,3,"Mardirosian, Mr. Sarkis",male,,0,0,2655,7.2292,F E46,C 291 | 1181,3,"Ford, Mr. Arthur",male,,0,0,A/5 1478,8.05,,S 292 | 1182,1,"Rheims, Mr. George Alexander Lucien",male,,0,0,PC 17607,39.6,,S 293 | 1183,3,"Daly, Miss. Margaret Marcella 'Maggie'",female,30.0,0,0,382650,6.95,,Q 294 | 1184,3,"Nasr, Mr. Mustafa",male,,0,0,2652,7.2292,,C 295 | 1185,1,"Dodge, Dr. Washington",male,53.0,1,1,33638,81.8583,A34,S 296 | 1186,3,"Wittevrongel, Mr. Camille",male,36.0,0,0,345771,9.5,,S 297 | 1187,3,"Angheloff, Mr. Minko",male,26.0,0,0,349202,7.8958,,S 298 | 1188,2,"Laroche, Miss. Louise",female,1.0,1,2,SC/Paris 2123,41.5792,,C 299 | 1189,3,"Samaan, Mr. Hanna",male,,2,0,2662,21.6792,,C 300 | 1190,1,"Loring, Mr. Joseph Holland",male,30.0,0,0,113801,45.5,,S 301 | 1191,3,"Johansson, Mr. Nils",male,29.0,0,0,347467,7.8542,,S 302 | 1192,3,"Olsson, Mr. Oscar Wilhelm",male,32.0,0,0,347079,7.775,,S 303 | 1193,2,"Malachard, Mr. Noel",male,,0,0,237735,15.0458,D,C 304 | 1194,2,"Phillips, Mr. Escott Robert",male,43.0,0,1,S.O./P.P. 2,21.0,,S 305 | 1195,3,"Pokrnic, Mr. Tome",male,24.0,0,0,315092,8.6625,,S 306 | 1196,3,"McCarthy, Miss. Catherine 'Katie'",female,,0,0,383123,7.75,,Q 307 | 1197,1,"Crosby, Mrs. 
Edward Gifford (Catherine Elizabeth Halstead)",female,64.0,1,1,112901,26.55,B26,S 308 | 1198,1,"Allison, Mr. Hudson Joshua Creighton",male,30.0,1,2,113781,151.55,C22 C26,S 309 | 1199,3,"Aks, Master. Philip Frank",male,0.8333,0,1,392091,9.35,,S 310 | 1200,1,"Hays, Mr. Charles Melville",male,55.0,1,1,12749,93.5,B69,S 311 | 1201,3,"Hansen, Mrs. Claus Peter (Jennie L Howard)",female,45.0,1,0,350026,14.1083,,S 312 | 1202,3,"Cacic, Mr. Jego Grga",male,18.0,0,0,315091,8.6625,,S 313 | 1203,3,"Vartanian, Mr. David",male,22.0,0,0,2658,7.225,,C 314 | 1204,3,"Sadowitz, Mr. Harry",male,,0,0,LP 1588,7.575,,S 315 | 1205,3,"Carr, Miss. Jeannie",female,37.0,0,0,368364,7.75,,Q 316 | 1206,1,"White, Mrs. John Stuart (Ella Holmes)",female,55.0,0,0,PC 17760,135.6333,C32,C 317 | 1207,3,"Hagardon, Miss. Kate",female,17.0,0,0,AQ/3. 30631,7.7333,,Q 318 | 1208,1,"Spencer, Mr. William Augustus",male,57.0,1,0,PC 17569,146.5208,B78,C 319 | 1209,2,"Rogers, Mr. Reginald Harry",male,19.0,0,0,28004,10.5,,S 320 | 1210,3,"Jonsson, Mr. Nils Hilding",male,27.0,0,0,350408,7.8542,,S 321 | 1211,2,"Jefferys, Mr. Ernest Wilfred",male,22.0,2,0,C.A. 31029,31.5,,S 322 | 1212,3,"Andersson, Mr. Johan Samuel",male,26.0,0,0,347075,7.775,,S 323 | 1213,3,"Krekorian, Mr. Neshan",male,25.0,0,0,2654,7.2292,F E57,C 324 | 1214,2,"Nesson, Mr. Israel",male,26.0,0,0,244368,13.0,F2,S 325 | 1215,1,"Rowe, Mr. Alfred G",male,33.0,0,0,113790,26.55,,S 326 | 1216,1,"Kreuchen, Miss. Emilie",female,39.0,0,0,24160,211.3375,,S 327 | 1217,3,"Assam, Mr. Ali",male,23.0,0,0,SOTON/O.Q. 3101309,7.05,,S 328 | 1218,2,"Becker, Miss. Ruth Elizabeth",female,12.0,2,1,230136,39.0,F4,S 329 | 1219,1,"Rosenshine, Mr. George ('Mr George Thorne')",male,46.0,0,0,PC 17585,79.2,,C 330 | 1220,2,"Clarke, Mr. Charles Valentine",male,29.0,1,0,2003,26.0,,S 331 | 1221,2,"Enander, Mr. Ingvar",male,21.0,0,0,236854,13.0,,S 332 | 1222,2,"Davies, Mrs. John Morgan (Elizabeth Agnes Mary White)",female,48.0,0,2,C.A. 33112,36.75,,S 333 | 1223,1,"Dulles, Mr. 
William Crothers",male,39.0,0,0,PC 17580,29.7,A18,C 334 | 1224,3,"Thomas, Mr. Tannous",male,,0,0,2684,7.225,,C 335 | 1225,3,"Nakid, Mrs. Said (Waika 'Mary' Mowad)",female,19.0,1,1,2653,15.7417,,C 336 | 1226,3,"Cor, Mr. Ivan",male,27.0,0,0,349229,7.8958,,S 337 | 1227,1,"Maguire, Mr. John Edward",male,30.0,0,0,110469,26.0,C106,S 338 | 1228,2,"de Brito, Mr. Jose Joaquim",male,32.0,0,0,244360,13.0,,S 339 | 1229,3,"Elias, Mr. Joseph",male,39.0,0,2,2675,7.2292,,C 340 | 1230,2,"Denbury, Mr. Herbert",male,25.0,0,0,C.A. 31029,31.5,,S 341 | 1231,3,"Betros, Master. Seman",male,,0,0,2622,7.2292,,C 342 | 1232,2,"Fillbrook, Mr. Joseph Charles",male,18.0,0,0,C.A. 15185,10.5,,S 343 | 1233,3,"Lundstrom, Mr. Thure Edvin",male,32.0,0,0,350403,7.5792,,S 344 | 1234,3,"Sage, Mr. John George",male,,1,9,CA. 2343,69.55,,S 345 | 1235,1,"Cardeza, Mrs. James Warburton Martinez (Charlotte Wardle Drake)",female,58.0,0,1,PC 17755,512.3292,B51 B53 B55,C 346 | 1236,3,"van Billiard, Master. James William",male,,1,1,A/5. 851,14.5,,S 347 | 1237,3,"Abelseth, Miss. Karen Marie",female,16.0,0,0,348125,7.65,,S 348 | 1238,2,"Botsford, Mr. William Hull",male,26.0,0,0,237670,13.0,,S 349 | 1239,3,"Whabee, Mrs. George Joseph (Shawneene Abi-Saab)",female,38.0,0,0,2688,7.2292,,C 350 | 1240,2,"Giles, Mr. Ralph",male,24.0,0,0,248726,13.5,,S 351 | 1241,2,"Walcroft, Miss. Nellie",female,31.0,0,0,F.C.C. 13528,21.0,,S 352 | 1242,1,"Greenfield, Mrs. Leo David (Blanche Strouse)",female,45.0,0,1,PC 17759,63.3583,D10 D12,C 353 | 1243,2,"Stokes, Mr. Philip Joseph",male,25.0,0,0,F.C.C. 13540,10.5,,S 354 | 1244,2,"Dibden, Mr. William",male,18.0,0,0,S.O.C. 14879,73.5,,S 355 | 1245,2,"Herman, Mr. Samuel",male,49.0,1,2,220845,65.0,,S 356 | 1246,3,"Dean, Miss. Elizabeth Gladys 'Millvina'",female,0.1667,1,2,C.A. 2315,20.575,,S 357 | 1247,1,"Julian, Mr. Henry Forbes",male,50.0,0,0,113044,26.0,E60,S 358 | 1248,1,"Brown, Mrs. John Murray (Caroline Lane Lamson)",female,59.0,2,0,11769,51.4792,C101,S 359 | 1249,3,"Lockyer, Mr. 
Edward",male,,0,0,1222,7.8792,,S 360 | 1250,3,"O'Keefe, Mr. Patrick",male,,0,0,368402,7.75,,Q 361 | 1251,3,"Lindell, Mrs. Edvard Bengtsson (Elin Gerda Persson)",female,30.0,1,0,349910,15.55,,S 362 | 1252,3,"Sage, Master. William Henry",male,14.5,8,2,CA. 2343,69.55,,S 363 | 1253,2,"Mallet, Mrs. Albert (Antoinette Magnin)",female,24.0,1,1,S.C./PARIS 2079,37.0042,,C 364 | 1254,2,"Ware, Mrs. John James (Florence Louise Long)",female,31.0,0,0,CA 31352,21.0,,S 365 | 1255,3,"Strilic, Mr. Ivan",male,27.0,0,0,315083,8.6625,,S 366 | 1256,1,"Harder, Mrs. George Achilles (Dorothy Annan)",female,25.0,1,0,11765,55.4417,E50,C 367 | 1257,3,"Sage, Mrs. John (Annie Bullen)",female,,1,9,CA. 2343,69.55,,S 368 | 1258,3,"Caram, Mr. Joseph",male,,1,0,2689,14.4583,,C 369 | 1259,3,"Riihivouri, Miss. Susanna Juhantytar 'Sanni'",female,22.0,0,0,3101295,39.6875,,S 370 | 1260,1,"Gibson, Mrs. Leonard (Pauline C Boeson)",female,45.0,0,1,112378,59.4,,C 371 | 1261,2,"Pallas y Castello, Mr. Emilio",male,29.0,0,0,SC/PARIS 2147,13.8583,,C 372 | 1262,2,"Giles, Mr. Edgar",male,21.0,1,0,28133,11.5,,S 373 | 1263,1,"Wilson, Miss. Helen Alice",female,31.0,0,0,16966,134.5,E39 E41,C 374 | 1264,1,"Ismay, Mr. Joseph Bruce",male,49.0,0,0,112058,0.0,B52 B54 B56,S 375 | 1265,2,"Harbeck, Mr. William H",male,44.0,0,0,248746,13.0,,S 376 | 1266,1,"Dodge, Mrs. Washington (Ruth Vidaver)",female,54.0,1,1,33638,81.8583,A34,S 377 | 1267,1,"Bowen, Miss. Grace Scott",female,45.0,0,0,PC 17608,262.375,,C 378 | 1268,3,"Kink, Miss. Maria",female,22.0,2,0,315152,8.6625,,S 379 | 1269,2,"Cotterill, Mr. Henry 'Harry'",male,21.0,0,0,29107,11.5,,S 380 | 1270,1,"Hipkins, Mr. William Edward",male,55.0,0,0,680,50.0,C39,S 381 | 1271,3,"Asplund, Master. Carl Edgar",male,5.0,4,2,347077,31.3875,,S 382 | 1272,3,"O'Connor, Mr. Patrick",male,,0,0,366713,7.75,,Q 383 | 1273,3,"Foley, Mr. Joseph",male,26.0,0,0,330910,7.8792,,Q 384 | 1274,3,"Risien, Mrs. Samuel (Emma)",female,,0,0,364498,14.5,,S 385 | 1275,3,"McNamee, Mrs. 
Neal (Eileen O'Leary)",female,19.0,1,0,376566,16.1,,S 386 | 1276,2,"Wheeler, Mr. Edwin 'Frederick'",male,,0,0,SC/PARIS 2159,12.875,,S 387 | 1277,2,"Herman, Miss. Kate",female,24.0,1,2,220845,65.0,,S 388 | 1278,3,"Aronsson, Mr. Ernst Axel Algot",male,24.0,0,0,349911,7.775,,S 389 | 1279,2,"Ashby, Mr. John",male,57.0,0,0,244346,13.0,,S 390 | 1280,3,"Canavan, Mr. Patrick",male,21.0,0,0,364858,7.75,,Q 391 | 1281,3,"Palsson, Master. Paul Folke",male,6.0,3,1,349909,21.075,,S 392 | 1282,1,"Payne, Mr. Vivian Ponsonby",male,23.0,0,0,12749,93.5,B24,S 393 | 1283,1,"Lines, Mrs. Ernest H (Elizabeth Lindsey James)",female,51.0,0,1,PC 17592,39.4,D28,S 394 | 1284,3,"Abbott, Master. Eugene Joseph",male,13.0,0,2,C.A. 2673,20.25,,S 395 | 1285,2,"Gilbert, Mr. William",male,47.0,0,0,C.A. 30769,10.5,,S 396 | 1286,3,"Kink-Heilmann, Mr. Anton",male,29.0,3,1,315153,22.025,,S 397 | 1287,1,"Smith, Mrs. Lucien Philip (Mary Eloise Hughes)",female,18.0,1,0,13695,60.0,C31,S 398 | 1288,3,"Colbert, Mr. Patrick",male,24.0,0,0,371109,7.25,,Q 399 | 1289,1,"Frolicher-Stehli, Mrs. Maxmillian (Margaretha Emerentia Stehli)",female,48.0,1,1,13567,79.2,B41,C 400 | 1290,3,"Larsson-Rondberg, Mr. Edvard A",male,22.0,0,0,347065,7.775,,S 401 | 1291,3,"Conlon, Mr. Thomas Henry",male,31.0,0,0,21332,7.7333,,Q 402 | 1292,1,"Bonnell, Miss. Caroline",female,30.0,0,0,36928,164.8667,C7,S 403 | 1293,2,"Gale, Mr. Harry",male,38.0,1,0,28664,21.0,,S 404 | 1294,1,"Gibson, Miss. Dorothy Winifred",female,22.0,0,1,112378,59.4,,C 405 | 1295,1,"Carrau, Mr. Jose Pedro",male,17.0,0,0,113059,47.1,,S 406 | 1296,1,"Frauenthal, Mr. Isaac Gerald",male,43.0,1,0,17765,27.7208,D40,C 407 | 1297,2,"Nourney, Mr. Alfred ('Baron von Drachstedt')",male,20.0,0,0,SC/PARIS 2166,13.8625,D38,C 408 | 1298,2,"Ware, Mr. William Jeffery",male,23.0,1,0,28666,10.5,,S 409 | 1299,1,"Widener, Mr. George Dunton",male,50.0,1,1,113503,211.5,C80,C 410 | 1300,3,"Riordan, Miss. Johanna 'Hannah'",female,,0,0,334915,7.7208,,Q 411 | 1301,3,"Peacock, Miss. 
Treasteall",female,3.0,1,1,SOTON/O.Q. 3101315,13.775,,S 412 | 1302,3,"Naughton, Miss. Hannah",female,,0,0,365237,7.75,,Q 413 | 1303,1,"Minahan, Mrs. William Edward (Lillian E Thorpe)",female,37.0,1,0,19928,90.0,C78,Q 414 | 1304,3,"Henriksson, Miss. Jenny Lovisa",female,28.0,0,0,347086,7.775,,S 415 | 1305,3,"Spector, Mr. Woolf",male,,0,0,A.5. 3236,8.05,,S 416 | 1306,1,"Oliva y Ocana, Dona. Fermina",female,39.0,0,0,PC 17758,108.9,C105,C 417 | 1307,3,"Saether, Mr. Simon Sivertsen",male,38.5,0,0,SOTON/O.Q. 3101262,7.25,,S 418 | 1308,3,"Ware, Mr. Frederick",male,,0,0,359309,8.05,,S 419 | 1309,3,"Peter, Master. Michael J",male,,1,1,2668,22.3583,,C 420 | -------------------------------------------------------------------------------- /docker/.env: -------------------------------------------------------------------------------- 1 | COMPOSE_PROJECT_NAME=handson-ml2 2 | -------------------------------------------------------------------------------- /docker/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM continuumio/miniconda3:latest 2 | 3 | RUN apt-get update && apt-get install -y \ 4 | build-essential \ 5 | cmake \ 6 | ffmpeg \ 7 | git \ 8 | libboost-all-dev \ 9 | libjpeg-dev \ 10 | libpq-dev \ 11 | libsdl2-dev swig \ 12 | sudo \ 13 | unzip \ 14 | xorg-dev \ 15 | xvfb \ 16 | zip \ 17 | zlib1g-dev \ 18 | && apt clean \ 19 | && rm -rf /var/lib/apt/lists/* 20 | 21 | COPY environment.yml /tmp/ 22 | RUN printf "\n - pyvirtualdisplay" >> /tmp/environment.yml \ 23 | && conda env create -f /tmp/environment.yml \ 24 | && conda clean -afy \ 25 | && find /opt/conda/ -follow -type f -name '*.a' -delete \ 26 | && find /opt/conda/ -follow -type f -name '*.pyc' -delete \ 27 | && find /opt/conda/ -follow -type f -name '*.js.map' -delete \ 28 | && rm /tmp/environment.yml 29 | 30 | ARG username 31 | ARG userid 32 | 33 | ARG home=/home/${username} 34 | ARG workdir=${home}/handson-ml2 35 | 36 | RUN adduser ${username} --uid ${userid} 
--gecos '' --disabled-password \ 37 | && echo "${username} ALL=(root) NOPASSWD:ALL" > /etc/sudoers.d/${username} \ 38 | && chmod 0440 /etc/sudoers.d/${username} 39 | 40 | WORKDIR ${workdir} 41 | RUN chown ${username}:${username} ${workdir} 42 | 43 | USER ${username} 44 | WORKDIR ${workdir} 45 | 46 | ENV PATH /opt/conda/envs/tf2/bin:$PATH 47 | 48 | # The config below enables diffing notebooks with nbdiff (and nbdiff support 49 | # in git diff command) after connecting to the container by "make exec" (or 50 | # "docker-compose exec handson-ml2 bash") 51 | # You may also try running: 52 | # nbdiff NOTEBOOK_NAME.ipynb 53 | # to get nbdiff between checkpointed version and current version of the 54 | # given notebook. 55 | 56 | RUN git-nbdiffdriver config --enable --global 57 | 58 | # INFO: Optionally uncomment any (one) of the following RUN commands below to ignore either 59 | # metadata or details in nbdiff within git diff 60 | #RUN git config --global diff.jupyternotebook.command 'git-nbdiffdriver diff --ignore-metadata' 61 | RUN git config --global diff.jupyternotebook.command 'git-nbdiffdriver diff --ignore-details' 62 | 63 | 64 | COPY docker/bashrc.bash /tmp/ 65 | RUN cat /tmp/bashrc.bash >> ${home}/.bashrc \ 66 | && echo "export PATH=\"${workdir}/docker/bin:$PATH\"" >> ${home}/.bashrc \ 67 | && sudo rm /tmp/bashrc.bash 68 | 69 | 70 | # INFO: Uncomment lines below to enable automatic save of python-only and html-only 71 | # exports alongside the notebook 72 | #COPY docker/jupyter_notebook_config.py /tmp/ 73 | #RUN cat /tmp/jupyter_notebook_config.py >> ${home}/.jupyter/jupyter_notebook_config.py 74 | #RUN sudo rm /tmp/jupyter_notebook_config.py 75 | 76 | 77 | # INFO: Uncomment the RUN command below to disable git diff paging 78 | #RUN git config --global core.pager '' 79 | 80 | 81 | # INFO: Uncomment the RUN command below for easy and constant notebook URL (just localhost:8888) 82 | # That will switch Jupyter to using empty password instead of a token. 
83 | # To avoid making a security hole you SHOULD in fact not only uncomment but 84 | # regenerate the hash for your own non-empty password and replace the hash below. 85 | # You can compute a password hash in any notebook, just run the code: 86 | # from notebook.auth import passwd 87 | # passwd() 88 | # and take the hash from the output 89 | #RUN mkdir -p ${home}/.jupyter && \ 90 | # echo 'c.NotebookApp.password = u"sha1:c6bbcba2d04b:f969e403db876dcfbe26f47affe41909bd53392e"' \ 91 | # >> ${home}/.jupyter/jupyter_notebook_config.py 92 | -------------------------------------------------------------------------------- /docker/Dockerfile.gpu: -------------------------------------------------------------------------------- 1 | # This Dockerfile includes sections from tensorflow/tensorflow:latest-gpu's Dockerfile: 2 | # https://github.com/tensorflow/tensorflow/blob/master/tensorflow/tools/dockerfiles/dockerfiles/gpu.Dockerfile 3 | # and sections from continuumio/miniconda3:latest's Dockerfile: 4 | # https://github.com/ContinuumIO/docker-images/blob/master/miniconda3/debian/Dockerfile 5 | 6 | 7 | # First we need CUDA and everything else needed to support GPUs 8 | 9 | ############################################### 10 | #### FROM tensorflow/tensorflow:latest-gpu #### 11 | ############################################### 12 | ARG UBUNTU_VERSION=20.04 13 | 14 | ARG ARCH= 15 | ARG CUDA=11.2 16 | FROM nvidia/cuda${ARCH:+-$ARCH}:${CUDA}.1-base-ubuntu${UBUNTU_VERSION} as base 17 | # ARCH and CUDA are specified again because the FROM directive resets ARGs 18 | # (but their default value is retained if set previously) 19 | ARG ARCH 20 | ARG CUDA 21 | ARG CUDNN=8.1.0.77-1 22 | ARG CUDNN_MAJOR_VERSION=8 23 | ARG LIB_DIR_PREFIX=x86_64 24 | ARG LIBNVINFER=8.0.0-1 25 | ARG LIBNVINFER_MAJOR_VERSION=8 26 | 27 | # Let us install tzdata painlessly 28 | ENV DEBIAN_FRONTEND=noninteractive 29 | 30 | # Needed for string substitution 31 | SHELL ["/bin/bash", "-c"] 32 | # Pick up some TF 
dependencies 33 | # [HOML2] Tweaked for handson-ml2: added all the libs before build-essentials 34 | # and call apt clean + remove apt cache. 35 | RUN apt-get -qq update && apt-get install -qq -y --no-install-recommends \ 36 | bzip2 \ 37 | ca-certificates \ 38 | cmake \ 39 | ffmpeg \ 40 | git \ 41 | libboost-all-dev \ 42 | libglib2.0-0 \ 43 | libjpeg-dev \ 44 | libpq-dev \ 45 | libsdl2-dev \ 46 | libsm6 \ 47 | libxext6 \ 48 | libxrender1 \ 49 | mercurial \ 50 | subversion \ 51 | sudo \ 52 | swig \ 53 | wget \ 54 | xorg-dev \ 55 | xvfb \ 56 | zip \ 57 | zlib1g-dev \ 58 | build-essential \ 59 | cuda-command-line-tools-${CUDA/./-} \ 60 | libcublas-${CUDA/./-} \ 61 | cuda-nvrtc-${CUDA/./-} \ 62 | libcufft-${CUDA/./-} \ 63 | libcurand-${CUDA/./-} \ 64 | libcusolver-${CUDA/./-} \ 65 | libcusparse-${CUDA/./-} \ 66 | curl \ 67 | libcudnn8=${CUDNN}+cuda${CUDA} \ 68 | libfreetype6-dev \ 69 | libhdf5-serial-dev \ 70 | libzmq3-dev \ 71 | pkg-config \ 72 | software-properties-common \ 73 | unzip \ 74 | && apt-get clean \ 75 | && rm -rf /var/lib/apt/lists/* 76 | 77 | # Install TensorRT if not building for PowerPC 78 | # NOTE: libnvinfer uses cuda11.1 versions 79 | RUN [[ "${ARCH}" = "ppc64le" ]] || { apt-get update && \ 80 | apt-get install -y --no-install-recommends libnvinfer${LIBNVINFER_MAJOR_VERSION}=${LIBNVINFER}+cuda11.0 \ 81 | libnvinfer-plugin${LIBNVINFER_MAJOR_VERSION}=${LIBNVINFER}+cuda11.0 \ 82 | && apt-get clean \ 83 | && rm -rf /var/lib/apt/lists/*; } 84 | 85 | # For CUDA profiling, TensorFlow requires CUPTI. 
86 | ENV LD_LIBRARY_PATH /usr/local/cuda/extras/CUPTI/lib64:/usr/local/cuda/lib64:$LD_LIBRARY_PATH 87 | 88 | # Link the libcuda stub to the location where tensorflow is searching for it and reconfigure 89 | # dynamic linker run-time bindings 90 | RUN ln -s /usr/local/cuda/lib64/stubs/libcuda.so /usr/local/cuda/lib64/stubs/libcuda.so.1 \ 91 | && echo "/usr/local/cuda/lib64/stubs" > /etc/ld.so.conf.d/z-cuda-stubs.conf \ 92 | && ldconfig 93 | 94 | # [HOML2] Tweaked for handson-ml2: removed Python3 & TensorFlow installation using pip 95 | 96 | ################################################# 97 | #### End of tensorflow/tensorflow:latest-gpu #### 98 | ################################################# 99 | 100 | ENV LANG=C.UTF-8 LC_ALL=C.UTF-8 101 | ENV PATH /opt/conda/bin:/opt/conda/envs/tf2/bin:$PATH 102 | 103 | # Next we need to install miniconda 104 | 105 | ############################################ 106 | #### FROM continuumio/miniconda3:latest #### 107 | ############################################ 108 | 109 | # [HOML2] Tweaked for handson-ml2: removed the beginning of the Dockerfile 110 | CMD [ "/bin/bash" ] 111 | 112 | # Leave these args here to better use the Docker build cache 113 | ARG CONDA_VERSION=py39_4.10.3 114 | 115 | RUN set -x && \ 116 | UNAME_M="$(uname -m)" && \ 117 | if [ "${UNAME_M}" = "x86_64" ]; then \ 118 | MINICONDA_URL="https://repo.anaconda.com/miniconda/Miniconda3-${CONDA_VERSION}-Linux-x86_64.sh"; \ 119 | SHA256SUM="1ea2f885b4dbc3098662845560bc64271eb17085387a70c2ba3f29fff6f8d52f"; \ 120 | elif [ "${UNAME_M}" = "s390x" ]; then \ 121 | MINICONDA_URL="https://repo.anaconda.com/miniconda/Miniconda3-${CONDA_VERSION}-Linux-s390x.sh"; \ 122 | SHA256SUM="1faed9abecf4a4ddd4e0d8891fc2cdaa3394c51e877af14ad6b9d4aadb4e90d8"; \ 123 | elif [ "${UNAME_M}" = "aarch64" ]; then \ 124 | MINICONDA_URL="https://repo.anaconda.com/miniconda/Miniconda3-${CONDA_VERSION}-Linux-aarch64.sh"; \ 125 | 
SHA256SUM="4879820a10718743f945d88ef142c3a4b30dfc8e448d1ca08e019586374b773f"; \ 126 | elif [ "${UNAME_M}" = "ppc64le" ]; then \ 127 | MINICONDA_URL="https://repo.anaconda.com/miniconda/Miniconda3-${CONDA_VERSION}-Linux-ppc64le.sh"; \ 128 | SHA256SUM="fa92ee4773611f58ed9333f977d32bbb64769292f605d518732183be1f3321fa"; \ 129 | fi && \ 130 | wget "${MINICONDA_URL}" -O miniconda.sh -q && \ 131 | echo "${SHA256SUM} miniconda.sh" > shasum && \ 132 | if [ "${CONDA_VERSION}" != "latest" ]; then sha256sum --check --status shasum; fi && \ 133 | mkdir -p /opt && \ 134 | sh miniconda.sh -b -p /opt/conda && \ 135 | rm miniconda.sh shasum && \ 136 | ln -s /opt/conda/etc/profile.d/conda.sh /etc/profile.d/conda.sh && \ 137 | echo ". /opt/conda/etc/profile.d/conda.sh" >> ~/.bashrc && \ 138 | echo "conda activate base" >> ~/.bashrc && \ 139 | find /opt/conda/ -follow -type f -name '*.a' -delete && \ 140 | find /opt/conda/ -follow -type f -name '*.js.map' -delete && \ 141 | /opt/conda/bin/conda clean -afy 142 | 143 | ############################################## 144 | #### End of continuumio/miniconda3:latest #### 145 | ############################################## 146 | 147 | # Now we're ready to create our conda environment 148 | 149 | COPY environment.yml /tmp/ 150 | RUN printf "\n - pyvirtualdisplay" >> /tmp/environment.yml \ 151 | && conda env create -f /tmp/environment.yml \ 152 | && conda clean -afy \ 153 | && find /opt/conda/ -follow -type f -name '*.a' -delete \ 154 | && find /opt/conda/ -follow -type f -name '*.pyc' -delete \ 155 | && find /opt/conda/ -follow -type f -name '*.js.map' -delete \ 156 | && rm /tmp/environment.yml 157 | 158 | ARG username 159 | ARG userid 160 | 161 | ARG home=/home/${username} 162 | ARG workdir=${home}/handson-ml2 163 | 164 | RUN adduser ${username} --uid ${userid} --gecos '' --disabled-password \ 165 | && echo "${username} ALL=(root) NOPASSWD:ALL" > /etc/sudoers.d/${username} \ 166 | && chmod 0440 /etc/sudoers.d/${username} 167 | 168 | WORKDIR 
${workdir} 169 | RUN chown ${username}:${username} ${workdir} 170 | 171 | USER ${username} 172 | WORKDIR ${workdir} 173 | 174 | 175 | # The config below enables diffing notebooks with nbdiff (and nbdiff support 176 | # in git diff command) after connecting to the container by "make exec" (or 177 | # "docker-compose exec handson-ml2 bash") 178 | # You may also try running: 179 | # nbdiff NOTEBOOK_NAME.ipynb 180 | # to get nbdiff between checkpointed version and current version of the 181 | # given notebook. 182 | 183 | RUN git-nbdiffdriver config --enable --global 184 | 185 | # INFO: Optionally uncomment any (one) of the following RUN commands below to ignore either 186 | # metadata or details in nbdiff within git diff 187 | #RUN git config --global diff.jupyternotebook.command 'git-nbdiffdriver diff --ignore-metadata' 188 | RUN git config --global diff.jupyternotebook.command 'git-nbdiffdriver diff --ignore-details' 189 | 190 | 191 | COPY docker/bashrc.bash /tmp/ 192 | RUN cat /tmp/bashrc.bash >> ${home}/.bashrc \ 193 | && echo "export PATH=\"${workdir}/docker/bin:$PATH\"" >> ${home}/.bashrc \ 194 | && sudo rm /tmp/bashrc.bash 195 | 196 | 197 | # INFO: Uncomment lines below to enable automatic save of python-only and html-only 198 | # exports alongside the notebook 199 | #COPY docker/jupyter_notebook_config.py /tmp/ 200 | #RUN cat /tmp/jupyter_notebook_config.py >> ${home}/.jupyter/jupyter_notebook_config.py 201 | #RUN sudo rm /tmp/jupyter_notebook_config.py 202 | 203 | 204 | # INFO: Uncomment the RUN command below to disable git diff paging 205 | #RUN git config --global core.pager '' 206 | 207 | 208 | # INFO: Uncomment the RUN command below for easy and constant notebook URL (just localhost:8888) 209 | # That will switch Jupyter to using empty password instead of a token. 210 | # To avoid making a security hole you SHOULD in fact not only uncomment but 211 | # regenerate the hash for your own non-empty password and replace the hash below. 
212 | # You can compute a password hash in any notebook, just run the code: 213 | # from notebook.auth import passwd 214 | # passwd() 215 | # and take the hash from the output 216 | #RUN mkdir -p ${home}/.jupyter && \ 217 | # echo 'c.NotebookApp.password = u"sha1:c6bbcba2d04b:f969e403db876dcfbe26f47affe41909bd53392e"' \ 218 | # >> ${home}/.jupyter/jupyter_notebook_config.py 219 | -------------------------------------------------------------------------------- /docker/Makefile: -------------------------------------------------------------------------------- 1 | 2 | help: 3 | cat Makefile 4 | run: 5 | docker-compose up 6 | exec: 7 | docker-compose exec handson-ml2 bash 8 | build: stop .FORCE 9 | docker-compose build 10 | rebuild: stop .FORCE 11 | docker-compose build --no-cache 12 | stop: 13 | docker stop handson-ml2 || true; docker rm handson-ml2 || true; 14 | .FORCE: 15 | -------------------------------------------------------------------------------- /docker/README.md: -------------------------------------------------------------------------------- 1 | 2 | # Hands-on Machine Learning in Docker 3 | 4 | This is the Docker configuration which allows you to run and tweak the book's notebooks without installing any dependencies on your machine!
OK, any except `docker` and `docker-compose`.
And optionally `make`.
And a few more things if you want GPU support (see below for details). 5 | 6 | ## Prerequisites 7 | 8 | Follow the instructions on [Install Docker](https://docs.docker.com/engine/installation/) and [Install Docker Compose](https://docs.docker.com/compose/install/) for your environment if you haven't got `docker` and `docker-compose` already. 9 | 10 | Some general knowledge about `docker` infrastructure might be useful (that's an interesting topic on its own) but is not strictly *required* to just run the notebooks. 11 | 12 | ## Usage 13 | 14 | ### Prepare the image (once) 15 | 16 | The first option is to pull the image from Docker Hub (this will download about 1.9 GB of compressed data): 17 | 18 | ```bash 19 | $ docker pull ageron/handson-ml2 20 | ``` 21 | 22 | **Note**: this is the CPU-only image. For GPU support, read the GPU section below. 23 | 24 | Alternatively, you can build the image yourself. This will be slower, but it will ensure the image is up to date, with the latest libraries. For this, assuming you already downloaded this project into the directory `/path/to/project/handson-ml2`: 25 | 26 | ```bash 27 | $ cd /path/to/project/handson-ml2/docker 28 | $ docker-compose build 29 | ``` 30 | 31 | This will take quite a while, but is only required once. 32 | 33 | After the process finishes, you have an `ageron/handson-ml2:latest` image, which will be the base for your experiments.
You can confirm that by running the following command: 34 | 35 | ```bash 36 | $ docker images 37 | REPOSITORY TAG IMAGE ID CREATED SIZE 38 | ageron/handson-ml2 latest 3ebafebc604a 2 minutes ago 4.87GB 39 | ``` 40 | 41 | ### Run the notebooks 42 | 43 | Still assuming you already downloaded this project into the directory `/path/to/project/handson-ml2`, run the following commands to start the Jupyter server inside the container, which is named `handson-ml2`: 44 | 45 | ```bash 46 | $ cd /path/to/project/handson-ml2/docker 47 | $ docker-compose up 48 | ``` 49 | 50 | Next, just point your browser to the URL printed on the screen (or go to http://localhost:8888 if you enabled password authentication inside the `jupyter_notebook_config.py` file, before building the image) and you're ready to play with the book's code! 51 | 52 | The server runs in the directory containing the notebooks, and the changes you make from the browser will be persisted there. 53 | 54 | You can close the server just by pressing `Ctrl-C` in the terminal window. 55 | 56 | ### Using `make` (optional) 57 | 58 | If you have `make` installed on your computer, you can use it as a thin layer to run `docker-compose` commands. For example, executing `make rebuild` will actually run `docker-compose build --no-cache`, which will rebuild the image without using the cache. This ensures that your image is based on the latest version of the `continuumio/miniconda3` base image on which `ageron/handson-ml2` is built. 59 | 60 | If you don't have `make` (and you don't want to install it), just examine the contents of `Makefile` to see which `docker-compose` commands you can run instead. 61 | 62 | ### Run additional commands in the container 63 | 64 | Run `make exec` (or `docker-compose exec handson-ml2 bash`) while the server is running to open an additional `bash` shell inside the `handson-ml2` container. Now you're inside the environment prepared within the image.
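For instance, you can quickly verify from a `python` prompt in that shell that you are in the image's `tf2` conda environment. This is just a minimal sketch, not part of the image; the module names are examples taken from `environment.yml`:

```python
# Sanity-check the container's Python environment (sketch, not part of the image).
import importlib.util
import sys

def check_environment(modules=("numpy", "pandas", "sklearn")):
    """Print the interpreter path and report which modules are importable."""
    # Inside the container this should point into /opt/conda/envs/tf2.
    print("interpreter:", sys.executable)
    status = {name: importlib.util.find_spec(name) is not None
              for name in modules}
    for name, available in status.items():
        print("{:<12} {}".format(name, "OK" if available else "MISSING"))
    return status
```

If a module shows up as MISSING, the environment probably wasn't activated, or the image needs rebuilding after a change to `environment.yml`.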
65 | 66 | One useful thing to do there is start TensorBoard (for example with the simple `tb` command; see the bashrc file). 67 | 68 | Another is comparing versions of the notebooks using the `nbdiff` command, in case you haven't got `nbdime` installed locally (it is **way** better than plain `diff` for notebooks). See [Tools for diffing and merging of Jupyter notebooks](https://github.com/jupyter/nbdime) for more details. 69 | 70 | You can see the changes you made relative to the version in git using `git diff`, which is integrated with `nbdiff`. 71 | 72 | You may also try the `nbd NOTEBOOK_NAME.ipynb` command (custom, see the bashrc file) to compare one of your notebooks with its `checkpointed` version.
73 | To be precise, the output will tell you *what modifications should be re-played on the **manually saved** version of the notebook (located in the `.ipynb_checkpoints` subdirectory) to update it to the **current**, i.e. **auto-saved**, version (given as the command's argument and located in the working directory)*. 74 | 75 | ## GPU Support on Linux (experimental) 76 | 77 | ### Prerequisites 78 | 79 | If you're running on Linux, and you have a TensorFlow-compatible GPU card (NVidia card with Compute Capability ≥ 3.5) that you would like TensorFlow to use inside the Docker container, then you should download and install the latest driver for your card from [nvidia.com](https://www.nvidia.com/Download/index.aspx?lang=en-us). You will also need to install [NVidia Docker support](https://github.com/NVIDIA/nvidia-docker): if you are using Docker 19.03 or above, you must install the `nvidia-container-toolkit` package, and for earlier versions, you must install `nvidia-docker2`. 80 | 81 | Next, edit the `docker-compose.yml` file: 82 | 83 | ```bash 84 | $ cd /path/to/project/handson-ml2/docker 85 | $ edit docker-compose.yml # use your favorite editor 86 | ``` 87 | 88 | * Replace `dockerfile: ./docker/Dockerfile` with `dockerfile: ./docker/Dockerfile.gpu` 89 | * Replace `image: ageron/handson-ml2:latest` with `image: ageron/handson-ml2:latest-gpu` 90 | * If you want to use `docker-compose`, you will need version 1.28 or above for GPU support, and you must uncomment the whole `deploy` section in `docker-compose.yml`.
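If you prefer, the three edits above can also be scripted. Below is a rough sketch (the `apply_gpu_edits` helper is hypothetical, not part of this repo) that rewrites the text of `docker-compose.yml` accordingly; editing the file by hand works just as well:

```python
# Hypothetical helper: apply the three GPU edits described above to the
# contents of docker-compose.yml. Sketch only -- run it at most once.

def apply_gpu_edits(text):
    # 1) Point the build at the GPU Dockerfile.
    text = text.replace("dockerfile: ./docker/Dockerfile",
                        "dockerfile: ./docker/Dockerfile.gpu")
    # 2) Use the GPU image tag.
    text = text.replace("image: ageron/handson-ml2:latest",
                        "image: ageron/handson-ml2:latest-gpu")
    # 3) Uncomment the deploy section: it is the last block in the file,
    #    so we can keep stripping one leading "#" until the end.
    lines, in_deploy = [], False
    for line in text.splitlines():
        if line.lstrip().startswith("#deploy:"):
            in_deploy = True
        if in_deploy and line.lstrip().startswith("#"):
            line = line.replace("#", "", 1)
        lines.append(line)
    return "\n".join(lines)
```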
91 | 92 | ### Prepare the image (once) 93 | 94 | If you want to pull the prebuilt image from Docker Hub (this will download over 3.5 GB of compressed data): 95 | 96 | ```bash 97 | $ docker pull ageron/handson-ml2:latest-gpu 98 | ``` 99 | 100 | If you prefer to build the image yourself: 101 | 102 | ```bash 103 | $ cd /path/to/project/handson-ml2/docker 104 | $ docker-compose build 105 | ``` 106 | 107 | ### Run the notebooks with `docker-compose` (version 1.28 or above) 108 | 109 | If you have `docker-compose` version 1.28 or above, that's great! You can simply run: 110 | 111 | ```bash 112 | $ cd /path/to/project/handson-ml2/docker 113 | $ docker-compose up 114 | [...] 115 | or http://127.0.0.1:8888/?token=[...] 116 | ``` 117 | 118 | Then point your browser to the URL and Jupyter should appear. If you then open or create a notebook and execute the following code, a list containing your GPU device(s) should be displayed (success!): 119 | 120 | ```python 121 | import tensorflow as tf 122 | 123 | tf.config.list_physical_devices("GPU") 124 | ``` 125 | 126 | To stop the server, just press Ctrl-C. 127 | 128 | ### Run the notebooks without `docker-compose` 129 | 130 | If you have a version of `docker-compose` earlier than 1.28, you will have to use `docker run` directly. 131 | 132 | If you are using Docker 19.03 or above, you can run: 133 | 134 | ```bash 135 | $ cd /path/to/project/handson-ml2 136 | $ docker run --name handson-ml2 --gpus all -p 8888:8888 -p 6006:6006 --log-opt mode=non-blocking --log-opt max-buffer-size=50m -v `pwd`:/home/devel/handson-ml2 ageron/handson-ml2:latest-gpu /opt/conda/envs/tf2/bin/jupyter notebook --ip='0.0.0.0' --port=8888 --no-browser 137 | ``` 138 | 139 | If you are using an older version of Docker, then replace `--gpus all` with `--runtime=nvidia`. 
140 | 141 | Now point your browser to the displayed URL: Jupyter should appear, and you can open a notebook and run `import tensorflow as tf` and `tf.config.list_physical_devices("GPU")` as above to confirm that TensorFlow does indeed see your GPU device(s). 142 | 143 | Lastly, to interrupt the server, press Ctrl-C, then run: 144 | 145 | ```bash 146 | $ docker rm handson-ml2 147 | ``` 148 | 149 | This will remove the container so you can start a new one later (but it will not remove the image or the notebooks, don't worry!). 150 | 151 | Have fun! 152 | -------------------------------------------------------------------------------- /docker/bashrc.bash: -------------------------------------------------------------------------------- 1 | alias ll="ls -alF" 2 | alias nbd="nbdiff_checkpoint" 3 | alias tb="tensorboard --logdir=tf_logs" 4 | -------------------------------------------------------------------------------- /docker/bin/nbclean_checkpoints: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | 3 | import collections 4 | import glob 5 | import hashlib 6 | import os 7 | import subprocess 8 | 9 | 10 | class NotebookAnalyser: 11 | 12 | def __init__(self, dry_run=False, verbose=False, colorful=False): 13 | self._dry_run = dry_run 14 | self._verbose = verbose 15 | self._colors = collections.defaultdict(lambda: "") 16 | if colorful: 17 | for color in [ 18 | NotebookAnalyser.COLOR_WHITE, 19 | NotebookAnalyser.COLOR_RED, 20 | NotebookAnalyser.COLOR_GREEN, 21 | NotebookAnalyser.COLOR_YELLOW, 22 | ]: 23 | self._colors[color] = "\033[{}m".format(color) 24 | 25 | NOTEBOOK_SUFFIX = ".ipynb" 26 | CHECKPOINT_DIR = NOTEBOOK_SUFFIX + "_checkpoints" 27 | CHECKPOINT_MASK = "*-checkpoint" + NOTEBOOK_SUFFIX 28 | CHECKPOINT_MASK_LEN = len(CHECKPOINT_MASK) - 1 29 | 30 | @staticmethod 31 | def get_hash(file_path): 32 | with open(file_path, "rb") as input: 33 | hash = hashlib.md5() 34 | for chunk in iter(lambda: input.read(4096),
b""): 35 | hash.update(chunk) 36 | return hash.hexdigest() 37 | 38 | MESSAGE_ORPHANED = "missing " 39 | MESSAGE_MODIFIED = "modified" 40 | MESSAGE_DELETED = "DELETING" 41 | 42 | COLOR_WHITE = "0" 43 | COLOR_RED = "31" 44 | COLOR_GREEN = "32" 45 | COLOR_YELLOW = "33" 46 | 47 | def log(self, message, file, color=COLOR_WHITE): 48 | color_on = self._colors[color] 49 | color_off = self._colors[NotebookAnalyser.COLOR_WHITE] 50 | print("{}{}{}: {}".format(color_on, message, color_off, file)) 51 | 52 | def clean_checkpoints(self, directory): 53 | for checkpoint_path in sorted(glob.glob(os.path.join(directory, NotebookAnalyser.CHECKPOINT_MASK))): 54 | 55 | workfile_dir = os.path.dirname(os.path.dirname(checkpoint_path)) 56 | workfile_name = os.path.basename(checkpoint_path)[:-NotebookAnalyser.CHECKPOINT_MASK_LEN] + NotebookAnalyser.NOTEBOOK_SUFFIX 57 | workfile_path = os.path.join(workfile_dir, workfile_name) 58 | 59 | status = "" 60 | if not os.path.isfile(workfile_path): 61 | if self._verbose: 62 | self.log(NotebookAnalyser.MESSAGE_ORPHANED, workfile_path, NotebookAnalyser.COLOR_RED) 63 | else: 64 | checkpoint_stat = os.stat(checkpoint_path) 65 | workfile_stat = os.stat(workfile_path) 66 | 67 | modified = workfile_stat.st_size != checkpoint_stat.st_size 68 | 69 | if not modified: 70 | checkpoint_hash = NotebookAnalyser.get_hash(checkpoint_path) 71 | workfile_hash = NotebookAnalyser.get_hash(workfile_path) 72 | modified = checkpoint_hash != workfile_hash 73 | 74 | if modified: 75 | if self._verbose: 76 | self.log(NotebookAnalyser.MESSAGE_MODIFIED, workfile_path, NotebookAnalyser.COLOR_YELLOW) 77 | else: 78 | self.log(NotebookAnalyser.MESSAGE_DELETED, checkpoint_path, NotebookAnalyser.COLOR_GREEN) 79 | if not self._dry_run: 80 | os.remove(checkpoint_path) 81 | 82 | if not self._dry_run and not os.listdir(directory): 83 | self.log(NotebookAnalyser.MESSAGE_DELETED, directory, NotebookAnalyser.COLOR_GREEN) 84 | os.rmdir(directory) 85 | 86 | def 
clean_checkpoints_recursively(self, directory): 87 | for (root, subdirs, files) in os.walk(directory): 88 | subdirs.sort() # INFO: traverse alphabetically 89 | if NotebookAnalyser.CHECKPOINT_DIR in subdirs: 90 | subdirs.remove(NotebookAnalyser.CHECKPOINT_DIR) # INFO: don't recurse there 91 | self.clean_checkpoints(os.path.join(root, NotebookAnalyser.CHECKPOINT_DIR)) 92 | 93 | 94 | def main(): 95 | import argparse 96 | parser = argparse.ArgumentParser(description="Remove checkpointed versions of those jupyter notebooks that are identical to their working copies.", 97 | epilog="""Notebooks will be reported as either 98 | "DELETED" if the working copy and checkpointed version are identical 99 | (checkpoint will be deleted), 100 | "missing" if there is a checkpoint but no corresponding working file can be found 101 | or "modified" if notebook and the checkpoint are not byte-to-byte identical. 102 | If removal of checkpoints results in empty ".ipynb_checkpoints" directory 103 | that directory is also deleted. 
104 | """) #, formatter_class=argparse.RawDescriptionHelpFormatter) 105 | parser.add_argument("dirs", metavar="DIR", type=str, nargs="*", default=["."], help="directories to search") 106 | parser.add_argument("-d", "--dry-run", action="store_true", help="only print messages, don't perform any removals") 107 | parser.add_argument("-v", "--verbose", action="store_true", help="verbose mode") 108 | parser.add_argument("-c", "--color", action="store_true", help="colorful mode") 109 | args = parser.parse_args() 110 | 111 | analyser = NotebookAnalyser(args.dry_run, args.verbose, args.color) 112 | for directory in args.dirs: 113 | analyser.clean_checkpoints_recursively(directory) 114 | 115 | if __name__ == "__main__": 116 | main() 117 | -------------------------------------------------------------------------------- /docker/bin/nbdiff_checkpoint: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | if [[ "$#" -lt 1 || "$1" =~ ^((-h)|(--help))$ ]] ; then 3 | echo "usage: nbdiff_checkpoint NOTEBOOK.ipynb" 4 | echo 5 | echo "Show differences between given jupyter notebook and its checkpointed version (in .ipynb_checkpoints subdirectory)" 6 | exit 7 | fi 8 | 9 | DIRNAME=$(dirname "$1") 10 | BASENAME=$(basename "$1" .ipynb) 11 | shift 12 | 13 | WORKING_COPY=$DIRNAME/$BASENAME.ipynb 14 | CHECKPOINT_COPY=$DIRNAME/.ipynb_checkpoints/$BASENAME-checkpoint.ipynb 15 | 16 | echo "----- Analysing how to change $CHECKPOINT_COPY into $WORKING_COPY -----" 17 | nbdiff "$CHECKPOINT_COPY" "$WORKING_COPY" --ignore-details "$@" 18 | -------------------------------------------------------------------------------- /docker/bin/rm_empty_subdirs: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | 3 | import os 4 | 5 | def remove_empty_directories(initial_dir, 6 | allow_initial_delete=False, ignore_nonexistant_initial=False, 7 | dry_run=False, quiet=False): 8 | 9 | FORBIDDEN_SUBDIRS =
set([".git"]) 10 | 11 | if not os.path.isdir(initial_dir) and not ignore_nonexistant_initial: 12 | raise RuntimeError("Initial directory '{}' not found!".format(initial_dir)) 13 | 14 | message = "removed" 15 | if dry_run: 16 | message = "to be " + message 17 | 18 | deleted = set() 19 | 20 | for (directory, subdirs, files) in os.walk(initial_dir, topdown=False): 21 | forbidden = False 22 | parent = directory 23 | while parent: 24 | parent, dirname = os.path.split(parent) 25 | if dirname in FORBIDDEN_SUBDIRS: 26 | forbidden = True 27 | break 28 | if forbidden: 29 | continue 30 | 31 | is_empty = len(files) < 1 and len(set([os.path.join(directory, s) for s in subdirs]) - deleted) < 1 32 | 33 | if is_empty and (initial_dir != directory or allow_initial_delete): 34 | if not quiet: 35 | print("{}: {}".format(message, directory)) 36 | deleted.add(directory) 37 | if not dry_run: 38 | os.rmdir(directory) 39 | 40 | def main(): 41 | import argparse 42 | parser = argparse.ArgumentParser(description="Remove empty directories recursively in subtree.") 43 | parser.add_argument("dir", metavar="DIR", type=str, nargs="+", help="directory to be searched") 44 | parser.add_argument("-r", "--allow-dir-removal", action="store_true", help="allow deletion of DIR itself") 45 | parser.add_argument("-i", "--ignore-nonexistent-dir", action="store_true", help="don't throw an error if DIR doesn't exist") 46 | parser.add_argument("-d", "--dry-run", action="store_true", help="only print messages, don't perform any removals") 47 | parser.add_argument("-q", "--quiet", action="store_true", help="don't print names of directories being removed") 48 | args = parser.parse_args() 49 | for directory in args.dir: 50 | remove_empty_directories(directory, args.allow_dir_removal, args.ignore_nonexistent_dir, 51 | args.dry_run, args.quiet) 52 | 53 | if __name__ == "__main__": 54 | main() 55 | -------------------------------------------------------------------------------- /docker/bin/tensorboard: 
-------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | python -m tensorboard.main "$@" 3 | -------------------------------------------------------------------------------- /docker/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: "3" 2 | services: 3 | handson-ml2: 4 | build: 5 | context: ../ 6 | dockerfile: ./docker/Dockerfile #Dockerfile.gpu 7 | args: 8 | - username=devel 9 | - userid=1000 10 | container_name: handson-ml2 11 | image: ageron/handson-ml2:latest #latest-gpu 12 | restart: unless-stopped 13 | logging: 14 | driver: json-file 15 | options: 16 | max-size: 50m 17 | ports: 18 | - "8888:8888" 19 | - "6006:6006" 20 | volumes: 21 | - ../:/home/devel/handson-ml2 22 | command: /opt/conda/envs/tf2/bin/jupyter notebook --ip='0.0.0.0' --port=8888 --no-browser 23 | #deploy: 24 | # resources: 25 | # reservations: 26 | # devices: 27 | # - capabilities: [gpu] -------------------------------------------------------------------------------- /docker/jupyter_notebook_config.py: -------------------------------------------------------------------------------- 1 | import os 2 | import subprocess 3 | 4 | def export_script_and_view(model, os_path, contents_manager): 5 | if model["type"] != "notebook": 6 | return 7 | dir_name, file_name = os.path.split(os_path) 8 | file_base, file_ext = os.path.splitext(file_name) 9 | if file_base.startswith("Untitled"): 10 | return 11 | export_name = file_base if file_ext == ".ipynb" else file_name 12 | subprocess.check_call(["jupyter", "nbconvert", "--to", "script", file_name, "--output", export_name + "_script"], cwd=dir_name) 13 | subprocess.check_call(["jupyter", "nbconvert", "--to", "html", file_name, "--output", export_name + "_view"], cwd=dir_name) 14 | 15 | c.FileContentsManager.post_save_hook = export_script_and_view 16 | -------------------------------------------------------------------------------- /environment.yml: 
-------------------------------------------------------------------------------- 1 | name: tf2 2 | channels: 3 | - conda-forge 4 | - defaults 5 | dependencies: 6 | - box2d-py # used only in chapter 18, exercise 8 7 | - ftfy=6.0 # used only in chapter 16 by the transformers library 8 | - graphviz # used only in chapter 6 for dot files 9 | - ipython=7.28 # a powerful Python shell 10 | - ipywidgets=7.6 # optionally used only in chapter 12 for tqdm in Jupyter 11 | - joblib=0.14 # used only in chapter 2 to save/load Scikit-Learn models 12 | - jupyter=1.0 # to edit and run Jupyter notebooks 13 | - matplotlib=3.4 # beautiful plots. See tutorial tools_matplotlib.ipynb 14 | - nbdime=3.1 # optional tool to diff Jupyter notebooks 15 | - nltk=3.6 # optionally used in chapter 3, exercise 4 16 | - numexpr=2.7 # used only in the Pandas tutorial for numerical expressions 17 | - numpy=1.19 # Powerful n-dimensional arrays and numerical computing tools 18 | - opencv=4.5 # used only in chapter 18 by TF Agents for image preprocessing 19 | - pandas=1.3 # data analysis and manipulation tool 20 | - pillow=8.3 # image manipulation library (used by matplotlib.image.imread) 21 | - pip # Python's package-management system 22 | - py-xgboost=1.4 # used only in chapter 7 for optimized Gradient Boosting 23 | - pyglet=1.5 # used only in chapter 18 to render environments 24 | - pyopengl=3.1 # used only in chapter 18 to render environments 25 | - python=3.8 # Python!
Not using latest version as some libs lack support 26 | - python-graphviz # used only in chapter 6 for dot files 27 | #- pyvirtualdisplay=2.2 # used only in chapter 18 if on headless server 28 | - requests=2.26 # used only in chapter 19 for REST API queries 29 | - scikit-learn=1.0 # machine learning library 30 | - scipy=1.7 # scientific/technical computing library 31 | - tqdm=4.62 # a progress bar library 32 | - wheel # built-package format for pip 33 | - widgetsnbextension=3.5 # interactive HTML widgets for Jupyter notebooks 34 | - pip: 35 | - tensorboard-plugin-profile~=2.5.0 # profiling plugin for TensorBoard 36 | - tensorboard~=2.7.0 # TensorFlow's visualization toolkit 37 | - tensorflow-addons~=0.14.0 # used only in chapter 16 for a seq2seq impl. 38 | - tensorflow-datasets~=4.4.0 # datasets repository, ready to use 39 | - tensorflow-hub~=0.12.0 # trained ML models repository, ready to use 40 | - tensorflow-probability~=0.14.1 # Optional. Probability/Stats lib. 41 | - tensorflow-serving-api~=2.6.0 # or tensorflow-serving-api-gpu if gpu 42 | - tensorflow~=2.6.0 # Deep Learning library 43 | - tf-agents~=0.10.0 # Reinforcement Learning lib based on TensorFlow 44 | - tfx~=1.3.0 # platform to deploy production ML pipelines 45 | - transformers~=4.11.3 # Natural Language Processing lib for TF or PyTorch 46 | - urlextract~=1.4.0 # optionally used in chapter 3, exercise 4 47 | - gym[atari,accept-rom-license]~=0.21.0 # used only in chapter 18 48 | 49 | # Specific lib versions to avoid conflicts 50 | - attrs=20.3 51 | - click=7.1 52 | - packaging=20.9 53 | - six=1.15 54 | - typing-extensions=3.7 55 | -------------------------------------------------------------------------------- /extra_autodiff.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "**Appendix D – Autodiff**" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": 
{}, 13 | "source": [ 14 | "_This notebook contains toy implementations of various autodiff techniques, to explain how they work._" 15 | ] 16 | }, 17 | { 18 | "cell_type": "markdown", 19 | "metadata": {}, 20 | "source": [ 21 | "\n", 22 | " \n", 25 | " \n", 28 | "
\n", 23 | " \"Open\n", 24 | " \n", 26 | " \n", 27 | "
" 29 | ] 30 | }, 31 | { 32 | "cell_type": "markdown", 33 | "metadata": {}, 34 | "source": [ 35 | "# Setup" 36 | ] 37 | }, 38 | { 39 | "cell_type": "markdown", 40 | "metadata": {}, 41 | "source": [ 42 | "# Introduction" 43 | ] 44 | }, 45 | { 46 | "cell_type": "markdown", 47 | "metadata": {}, 48 | "source": [ 49 | "Suppose we want to compute the gradients of the function $f(x,y)=x^2y + y + 2$ with regards to the parameters x and y:" 50 | ] 51 | }, 52 | { 53 | "cell_type": "code", 54 | "execution_count": 1, 55 | "metadata": {}, 56 | "outputs": [], 57 | "source": [ 58 | "def f(x,y):\n", 59 | " return x*x*y + y + 2" 60 | ] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": {}, 65 | "source": [ 66 | "One approach is to solve this analytically:\n", 67 | "\n", 68 | "$\\dfrac{\\partial f}{\\partial x} = 2xy$\n", 69 | "\n", 70 | "$\\dfrac{\\partial f}{\\partial y} = x^2 + 1$" 71 | ] 72 | }, 73 | { 74 | "cell_type": "code", 75 | "execution_count": 2, 76 | "metadata": {}, 77 | "outputs": [], 78 | "source": [ 79 | "def df(x,y):\n", 80 | " return 2*x*y, x*x + 1" 81 | ] 82 | }, 83 | { 84 | "cell_type": "markdown", 85 | "metadata": {}, 86 | "source": [ 87 | "So for example $\\dfrac{\\partial f}{\\partial x}(3,4) = 24$ and $\\dfrac{\\partial f}{\\partial y}(3,4) = 10$." 88 | ] 89 | }, 90 | { 91 | "cell_type": "code", 92 | "execution_count": 3, 93 | "metadata": {}, 94 | "outputs": [ 95 | { 96 | "data": { 97 | "text/plain": [ 98 | "(24, 10)" 99 | ] 100 | }, 101 | "execution_count": 3, 102 | "metadata": {}, 103 | "output_type": "execute_result" 104 | } 105 | ], 106 | "source": [ 107 | "df(3, 4)" 108 | ] 109 | }, 110 | { 111 | "cell_type": "markdown", 112 | "metadata": {}, 113 | "source": [ 114 | "Perfect! 
We can also find the equations for the second order derivatives (which together form the Hessian matrix):\n", 115 | "\n", 116 | "$\dfrac{\partial^2 f}{\partial x \partial x} = \dfrac{\partial (2xy)}{\partial x} = 2y$\n", 117 | "\n", 118 | "$\dfrac{\partial^2 f}{\partial x \partial y} = \dfrac{\partial (2xy)}{\partial y} = 2x$\n", 119 | "\n", 120 | "$\dfrac{\partial^2 f}{\partial y \partial x} = \dfrac{\partial (x^2 + 1)}{\partial x} = 2x$\n", 121 | "\n", 122 | "$\dfrac{\partial^2 f}{\partial y \partial y} = \dfrac{\partial (x^2 + 1)}{\partial y} = 0$" 123 | ] 124 | }, 125 | { 126 | "cell_type": "markdown", 127 | "metadata": {}, 128 | "source": [ 129 | "At x=3 and y=4, these second order derivatives are respectively 8, 6, 6, 0. Let's use the equations above to compute them:" 130 | ] 131 | }, 132 | { 133 | "cell_type": "code", 134 | "execution_count": 4, 135 | "metadata": {}, 136 | "outputs": [], 137 | "source": [ 138 | "def d2f(x, y):\n", 139 | " return [2*y, 2*x], [2*x, 0]" 140 | ] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "execution_count": 5, 145 | "metadata": {}, 146 | "outputs": [ 147 | { 148 | "data": { 149 | "text/plain": [ 150 | "([8, 6], [6, 0])" 151 | ] 152 | }, 153 | "execution_count": 5, 154 | "metadata": {}, 155 | "output_type": "execute_result" 156 | } 157 | ], 158 | "source": [ 159 | "d2f(3, 4)" 160 | ] 161 | }, 162 | { 163 | "cell_type": "markdown", 164 | "metadata": {}, 165 | "source": [ 166 | "Perfect, but this requires some mathematical work. It is not too hard in this case, but for a deep neural network, it is practically impossible to compute the derivatives this way. So let's look at various ways to automate this!"
167 | ] 168 | }, 169 | { 170 | "cell_type": "markdown", 171 | "metadata": {}, 172 | "source": [ 173 | "# Numeric differentiation" 174 | ] 175 | }, 176 | { 177 | "cell_type": "markdown", 178 | "metadata": {}, 179 | "source": [ 180 | "Here, we compute an approximation of the gradients using the equation: $\dfrac{\partial f}{\partial x} = \displaystyle{\lim_{\epsilon \to 0}}\dfrac{f(x+\epsilon, y) - f(x, y)}{\epsilon}$ (and there is a similar definition for $\dfrac{\partial f}{\partial y}$)." 181 | ] 182 | }, 183 | { 184 | "cell_type": "code", 185 | "execution_count": 6, 186 | "metadata": {}, 187 | "outputs": [], 188 | "source": [ 189 | "def gradients(func, vars_list, eps=0.0001):\n", 190 | " partial_derivatives = []\n", 191 | " base_func_eval = func(*vars_list)\n", 192 | " for idx in range(len(vars_list)):\n", 193 | " tweaked_vars = vars_list[:]\n", 194 | " tweaked_vars[idx] += eps\n", 195 | " tweaked_func_eval = func(*tweaked_vars)\n", 196 | " derivative = (tweaked_func_eval - base_func_eval) / eps\n", 197 | " partial_derivatives.append(derivative)\n", 198 | " return partial_derivatives" 199 | ] 200 | }, 201 | { 202 | "cell_type": "code", 203 | "execution_count": 7, 204 | "metadata": {}, 205 | "outputs": [], 206 | "source": [ 207 | "def df(x, y):\n", 208 | " return gradients(f, [x, y])" 209 | ] 210 | }, 211 | { 212 | "cell_type": "code", 213 | "execution_count": 8, 214 | "metadata": {}, 215 | "outputs": [ 216 | { 217 | "data": { 218 | "text/plain": [ 219 | "[24.000400000048216, 10.000000000047748]" 220 | ] 221 | }, 222 | "execution_count": 8, 223 | "metadata": {}, 224 | "output_type": "execute_result" 225 | } 226 | ], 227 | "source": [ 228 | "df(3, 4)" 229 | ] 230 | }, 231 | { 232 | "cell_type": "markdown", 233 | "metadata": {}, 234 | "source": [ 235 | "It works well!" 236 | ] 237 | }, 238 | { 239 | "cell_type": "markdown", 240 | "metadata": {}, 241 | "source": [ 242 | "The good news is that it is pretty easy to compute the Hessians.
First let's create functions that compute the first order partial derivatives (which together form the Jacobian):" 243 | ] 244 | }, 245 | { 246 | "cell_type": "code", 247 | "execution_count": 9, 248 | "metadata": {}, 249 | "outputs": [ 250 | { 251 | "data": { 252 | "text/plain": [ 253 | "(24.000400000048216, 10.000000000047748)" 254 | ] 255 | }, 256 | "execution_count": 9, 257 | "metadata": {}, 258 | "output_type": "execute_result" 259 | } 260 | ], 261 | "source": [ 262 | "def dfdx(x, y):\n", 263 | " return gradients(f, [x,y])[0]\n", 264 | "\n", 265 | "def dfdy(x, y):\n", 266 | " return gradients(f, [x,y])[1]\n", 267 | "\n", 268 | "dfdx(3., 4.), dfdy(3., 4.)" 269 | ] 270 | }, 271 | { 272 | "cell_type": "markdown", 273 | "metadata": {}, 274 | "source": [ 275 | "Now we can simply apply the `gradients()` function to these functions:" 276 | ] 277 | }, 278 | { 279 | "cell_type": "code", 280 | "execution_count": 10, 281 | "metadata": {}, 282 | "outputs": [], 283 | "source": [ 284 | "def d2f(x, y):\n", 285 | " return [gradients(dfdx, [x, y]), gradients(dfdy, [x, y])]" 286 | ] 287 | }, 288 | { 289 | "cell_type": "code", 290 | "execution_count": 11, 291 | "metadata": {}, 292 | "outputs": [ 293 | { 294 | "data": { 295 | "text/plain": [ 296 | "[[7.999999951380232, 6.000099261882497],\n", 297 | " [6.000099261882497, -1.4210854715202004e-06]]" 298 | ] 299 | }, 300 | "execution_count": 11, 301 | "metadata": {}, 302 | "output_type": "execute_result" 303 | } 304 | ], 305 | "source": [ 306 | "d2f(3, 4)" 307 | ] 308 | }, 309 | { 310 | "cell_type": "markdown", 311 | "metadata": {}, 312 | "source": [ 313 | "So everything works well, but the result is approximate, and computing the gradients of a function with regards to $n$ variables requires calling that function $n$ times.
In deep neural nets, there are often thousands of parameters to tweak using gradient descent (which requires computing the gradients of the loss function with regards to each of these parameters), so this approach would be much too slow." 314 | ] 315 | }, 316 | { 317 | "cell_type": "markdown", 318 | "metadata": {}, 319 | "source": [ 320 | "## Implementing a Toy Computation Graph" 321 | ] 322 | }, 323 | { 324 | "cell_type": "markdown", 325 | "metadata": {}, 326 | "source": [ 327 | "Rather than this numerical approach, let's implement some symbolic autodiff techniques. For this, we will need to define classes to represent constants, variables and operations." 328 | ] 329 | }, 330 | { 331 | "cell_type": "code", 332 | "execution_count": 12, 333 | "metadata": {}, 334 | "outputs": [], 335 | "source": [ 336 | "class Const(object):\n", 337 | " def __init__(self, value):\n", 338 | " self.value = value\n", 339 | " def evaluate(self):\n", 340 | " return self.value\n", 341 | " def __str__(self):\n", 342 | " return str(self.value)\n", 343 | "\n", 344 | "class Var(object):\n", 345 | " def __init__(self, name, init_value=0):\n", 346 | " self.value = init_value\n", 347 | " self.name = name\n", 348 | " def evaluate(self):\n", 349 | " return self.value\n", 350 | " def __str__(self):\n", 351 | " return self.name\n", 352 | "\n", 353 | "class BinaryOperator(object):\n", 354 | " def __init__(self, a, b):\n", 355 | " self.a = a\n", 356 | " self.b = b\n", 357 | "\n", 358 | "class Add(BinaryOperator):\n", 359 | " def evaluate(self):\n", 360 | " return self.a.evaluate() + self.b.evaluate()\n", 361 | " def __str__(self):\n", 362 | " return \"{} + {}\".format(self.a, self.b)\n", 363 | "\n", 364 | "class Mul(BinaryOperator):\n", 365 | " def evaluate(self):\n", 366 | " return self.a.evaluate() * self.b.evaluate()\n", 367 | " def __str__(self):\n", 368 | " return \"({}) * ({})\".format(self.a, self.b)" 369 | ] 370 | }, 371 | { 372 | "cell_type": "markdown", 373 | "metadata": {}, 374 | "source": 
[ 375 | "Good, now we can build a computation graph to represent the function $f$:" 376 | ] 377 | }, 378 | { 379 | "cell_type": "code", 380 | "execution_count": 13, 381 | "metadata": {}, 382 | "outputs": [], 383 | "source": [ 384 | "x = Var(\"x\")\n", 385 | "y = Var(\"y\")\n", 386 | "f = Add(Mul(Mul(x, x), y), Add(y, Const(2))) # f(x,y) = x²y + y + 2" 387 | ] 388 | }, 389 | { 390 | "cell_type": "markdown", 391 | "metadata": {}, 392 | "source": [ 393 | "And we can run this graph to compute $f$ at any point, for example $f(3, 4)$." 394 | ] 395 | }, 396 | { 397 | "cell_type": "code", 398 | "execution_count": 14, 399 | "metadata": {}, 400 | "outputs": [ 401 | { 402 | "data": { 403 | "text/plain": [ 404 | "42" 405 | ] 406 | }, 407 | "execution_count": 14, 408 | "metadata": {}, 409 | "output_type": "execute_result" 410 | } 411 | ], 412 | "source": [ 413 | "x.value = 3\n", 414 | "y.value = 4\n", 415 | "f.evaluate()" 416 | ] 417 | }, 418 | { 419 | "cell_type": "markdown", 420 | "metadata": {}, 421 | "source": [ 422 | "Perfect, it found the ultimate answer." 423 | ] 424 | }, 425 | { 426 | "cell_type": "markdown", 427 | "metadata": {}, 428 | "source": [ 429 | "## Computing gradients" 430 | ] 431 | }, 432 | { 433 | "cell_type": "markdown", 434 | "metadata": {}, 435 | "source": [ 436 | "The autodiff methods we will present below are all based on the *chain rule*." 437 | ] 438 | }, 439 | { 440 | "cell_type": "markdown", 441 | "metadata": {}, 442 | "source": [ 443 | "Suppose we have two functions $u$ and $v$, and we apply them sequentially to some input $x$, and we get the result $z$. So we have $z = v(u(x))$, which we can rewrite as $z = v(s)$ and $s = u(x)$. 
Now we can apply the chain rule to get the partial derivative of the output $z$ with regards to the input $x$:\n", 444 | "\n", 445 | "$ \\dfrac{\\partial z}{\\partial x} = \\dfrac{\\partial s}{\\partial x} \\cdot \\dfrac{\\partial z}{\\partial s}$" 446 | ] 447 | }, 448 | { 449 | "cell_type": "markdown", 450 | "metadata": {}, 451 | "source": [ 452 | "Now if $z$ is the output of a sequence of functions which have intermediate outputs $s_1, s_2, ..., s_n$, the chain rule still applies:\n", 453 | "\n", 454 | "$ \\dfrac{\\partial z}{\\partial x} = \\dfrac{\\partial s_1}{\\partial x} \\cdot \\dfrac{\\partial s_2}{\\partial s_1} \\cdot \\dfrac{\\partial s_3}{\\partial s_2} \\cdot \\dots \\cdot \\dfrac{\\partial s_{n-1}}{\\partial s_{n-2}} \\cdot \\dfrac{\\partial s_n}{\\partial s_{n-1}} \\cdot \\dfrac{\\partial z}{\\partial s_n}$" 455 | ] 456 | }, 457 | { 458 | "cell_type": "markdown", 459 | "metadata": {}, 460 | "source": [ 461 | "In forward mode autodiff, the algorithm computes these terms \"forward\" (i.e., in the same order as the computations required to compute the output $z$), that is from left to right: first $\\dfrac{\\partial s_1}{\\partial x}$, then $\\dfrac{\\partial s_2}{\\partial s_1}$, and so on. In reverse mode autodiff, the algorithm computes these terms \"backwards\", from right to left: first $\\dfrac{\\partial z}{\\partial s_n}$, then $\\dfrac{\\partial s_n}{\\partial s_{n-1}}$, and so on.\n", 462 | "\n", 463 | "For example, suppose you want to compute the derivative of the function $z(x)=\\sin(x^2)$ at x=3, using forward mode autodiff. The algorithm would first compute the partial derivative $\\dfrac{\\partial s_1}{\\partial x}=\\dfrac{\\partial x^2}{\\partial x}=2x=6$. Next, it would compute $\\dfrac{\\partial z}{\\partial x}=\\dfrac{\\partial s_1}{\\partial x}\\cdot\\dfrac{\\partial z}{\\partial s_1}= 6 \\cdot \\dfrac{\\partial \\sin(s_1)}{\\partial s_1}=6 \\cdot \\cos(s_1)=6 \\cdot \\cos(3^2)\\approx-5.46$." 
464 | ] 465 | }, 466 | { 467 | "cell_type": "markdown", 468 | "metadata": {}, 469 | "source": [ 470 | "Let's verify this result using the `gradients()` function defined earlier:" 471 | ] 472 | }, 473 | { 474 | "cell_type": "code", 475 | "execution_count": 15, 476 | "metadata": {}, 477 | "outputs": [ 478 | { 479 | "data": { 480 | "text/plain": [ 481 | "[-5.46761419430053]" 482 | ] 483 | }, 484 | "execution_count": 15, 485 | "metadata": {}, 486 | "output_type": "execute_result" 487 | } 488 | ], 489 | "source": [ 490 | "from math import sin\n", 491 | "\n", 492 | "def z(x):\n", 493 | " return sin(x**2)\n", 494 | "\n", 495 | "gradients(z, [3])" 496 | ] 497 | }, 498 | { 499 | "cell_type": "markdown", 500 | "metadata": {}, 501 | "source": [ 502 | "Looks good. Now let's do the same thing using reverse mode autodiff. This time the algorithm would start from the right hand side so it would compute $\\dfrac{\\partial z}{\\partial s_1} = \\dfrac{\\partial \\sin(s_1)}{\\partial s_1}=\\cos(s_1)=\\cos(3^2)\\approx -0.91$. Next it would compute $\\dfrac{\\partial z}{\\partial x}=\\dfrac{\\partial s_1}{\\partial x}\\cdot\\dfrac{\\partial z}{\\partial s_1} \\approx \\dfrac{\\partial s_1}{\\partial x} \\cdot -0.91 = \\dfrac{\\partial x^2}{\\partial x} \\cdot -0.91=2x \\cdot -0.91 = 6\\cdot-0.91=-5.46$." 503 | ] 504 | }, 505 | { 506 | "cell_type": "markdown", 507 | "metadata": {}, 508 | "source": [ 509 | "Of course both approaches give the same result (except for rounding errors), and with a single input and output they involve the same number of computations. But when there are several inputs or outputs, they can have very different performance. Indeed, if there are many inputs, the right-most terms will be needed to compute the partial derivatives with regards to each input, so it is a good idea to compute these right-most terms first. That means using reverse-mode autodiff. This way, the right-most terms can be computed just once and used to compute all the partial derivatives.
Conversely, if there are many outputs, forward-mode is generally preferable because the left-most terms can be computed just once to compute the partial derivatives of the different outputs. In Deep Learning, there are typically thousands of model parameters, meaning there are lots of inputs, but few outputs. In fact, there is generally just one output during training: the loss. This is why reverse mode autodiff is used in TensorFlow and all major Deep Learning libraries." 510 | ] 511 | }, 512 | { 513 | "cell_type": "markdown", 514 | "metadata": {}, 515 | "source": [ 516 | "There's one additional complexity in reverse mode autodiff: the value of $s_i$ is generally required when computing $\\dfrac{\\partial s_{i+1}}{\\partial s_i}$, and computing $s_i$ requires first computing $s_{i-1}$, which requires computing $s_{i-2}$, and so on. So basically, a first pass forward through the network is required to compute $s_1$, $s_2$, $s_3$, $\\dots$, $s_{n-1}$ and $s_n$, and then the algorithm can compute the partial derivatives from right to left. Storing all the intermediate values $s_i$ in RAM is sometimes a problem, especially when handling images, and when using GPUs which often have limited RAM: to limit this problem, one can reduce the number of layers in the neural network, or configure TensorFlow to make it swap these values from GPU RAM to CPU RAM. Another approach is to only cache every other intermediate value, $s_1$, $s_3$, $s_5$, $\\dots$, $s_{n-4}$, $s_{n-2}$ and $s_n$. This means that when the algorithm computes the partial derivatives, if an intermediate value $s_i$ is missing, it will need to recompute it based on the previous intermediate value $s_{i-1}$. This trades off CPU for RAM (if you are interested, check out [this paper](https://pdfs.semanticscholar.org/f61e/9fd5a4878e1493f7a6b03774a61c17b7e9a4.pdf))." 
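To make the every-other-value caching idea concrete, here is a toy sketch (not from the notebook; the helper names `forward_with_checkpoints` and `get_intermediate` are made up, and this has nothing to do with any real TensorFlow API): the forward pass stores only half of the intermediate values, and any missing $s_i$ is recomputed on demand from the closest cached predecessor.

```python
# Toy sketch of gradient checkpointing: cache only every other intermediate
# value on the forward pass, recompute the missing ones on demand.
def forward_with_checkpoints(funcs, x):
    cache = {-1: x}            # key -1 holds the input
    s = x
    for i, fn in enumerate(funcs):
        s = fn(s)
        if i % 2 == 0:         # keep s_0, s_2, s_4, ... only
            cache[i] = s
    return s, cache

def get_intermediate(funcs, cache, i):
    if i in cache:
        return cache[i]
    # recompute from the cached predecessor: trades CPU for RAM
    return funcs[i](get_intermediate(funcs, cache, i - 1))

funcs = [lambda s: s + 1, lambda s: 2 * s, lambda s: s ** 2]
out, cache = forward_with_checkpoints(funcs, 3)   # ((3 + 1) * 2) ** 2 = 64
missing = get_intermediate(funcs, cache, 1)       # s_1 = 8, recomputed
```

Here `cache` holds only $s_0$ and $s_2$, so asking for $s_1$ triggers one extra forward step; a real implementation would apply the same idea to large activation tensors, where the memory savings matter.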
517 | ] 518 | }, 519 | { 520 | "cell_type": "markdown", 521 | "metadata": {}, 522 | "source": [ 523 | "### Forward mode autodiff" 524 | ] 525 | }, 526 | { 527 | "cell_type": "code", 528 | "execution_count": 16, 529 | "metadata": {}, 530 | "outputs": [], 531 | "source": [ 532 | "Const.gradient = lambda self, var: Const(0)\n", 533 | "Var.gradient = lambda self, var: Const(1) if self is var else Const(0)\n", 534 | "Add.gradient = lambda self, var: Add(self.a.gradient(var), self.b.gradient(var))\n", 535 | "Mul.gradient = lambda self, var: Add(Mul(self.a, self.b.gradient(var)), Mul(self.a.gradient(var), self.b))\n", 536 | "\n", 537 | "x = Var(name=\"x\", init_value=3.)\n", 538 | "y = Var(name=\"y\", init_value=4.)\n", 539 | "f = Add(Mul(Mul(x, x), y), Add(y, Const(2))) # f(x,y) = x²y + y + 2\n", 540 | "\n", 541 | "dfdx = f.gradient(x) # 2xy\n", 542 | "dfdy = f.gradient(y) # x² + 1" 543 | ] 544 | }, 545 | { 546 | "cell_type": "code", 547 | "execution_count": 17, 548 | "metadata": {}, 549 | "outputs": [ 550 | { 551 | "data": { 552 | "text/plain": [ 553 | "(24.0, 10.0)" 554 | ] 555 | }, 556 | "execution_count": 17, 557 | "metadata": {}, 558 | "output_type": "execute_result" 559 | } 560 | ], 561 | "source": [ 562 | "dfdx.evaluate(), dfdy.evaluate()" 563 | ] 564 | }, 565 | { 566 | "cell_type": "markdown", 567 | "metadata": {}, 568 | "source": [ 569 | "Since the output of the `gradient()` method is fully symbolic, we are not limited to the first order derivatives, we can also compute second order derivatives, and so on:" 570 | ] 571 | }, 572 | { 573 | "cell_type": "code", 574 | "execution_count": 18, 575 | "metadata": {}, 576 | "outputs": [], 577 | "source": [ 578 | "d2fdxdx = dfdx.gradient(x) # 2y\n", 579 | "d2fdxdy = dfdx.gradient(y) # 2x\n", 580 | "d2fdydx = dfdy.gradient(x) # 2x\n", 581 | "d2fdydy = dfdy.gradient(y) # 0" 582 | ] 583 | }, 584 | { 585 | "cell_type": "code", 586 | "execution_count": 19, 587 | "metadata": {}, 588 | "outputs": [ 589 | { 590 | "data": { 591 | 
"text/plain": [ 592 | "[[8.0, 6.0], [6.0, 0.0]]" 593 | ] 594 | }, 595 | "execution_count": 19, 596 | "metadata": {}, 597 | "output_type": "execute_result" 598 | } 599 | ], 600 | "source": [ 601 | "[[d2fdxdx.evaluate(), d2fdxdy.evaluate()],\n", 602 | " [d2fdydx.evaluate(), d2fdydy.evaluate()]]" 603 | ] 604 | }, 605 | { 606 | "cell_type": "markdown", 607 | "metadata": {}, 608 | "source": [ 609 | "Note that the result is now exact, not an approximation (up to the limit of the machine's float precision, of course)." 610 | ] 611 | }, 612 | { 613 | "cell_type": "markdown", 614 | "metadata": {}, 615 | "source": [ 616 | "### Forward mode autodiff using dual numbers" 617 | ] 618 | }, 619 | { 620 | "cell_type": "markdown", 621 | "metadata": {}, 622 | "source": [ 623 | "A nice way to apply forward mode autodiff is to use [dual numbers](https://en.wikipedia.org/wiki/Dual_number). In short, a dual number $z$ has the form $z = a + b\\epsilon$, where $a$ and $b$ are real numbers, and $\\epsilon$ is an infinitesimal number, positive but smaller than any positive real number, and such that $\\epsilon^2=0$.\n", 624 | "It can be shown that $f(x + \\epsilon) = f(x) + \\dfrac{\\partial f}{\\partial x}\\epsilon$, so simply by computing $f(x + \\epsilon)$ we get both the value of $f(x)$ and the partial derivative of $f$ with regards to $x$. " 625 | ] 626 | }, 627 | { 628 | "cell_type": "markdown", 629 | "metadata": {}, 630 | "source": [ 631 | "Dual numbers have their own arithmetic rules, which are generally quite natural.
For example:\n", 632 | "\n", 633 | "**Addition**\n", 634 | "\n", 635 | "$(a_1 + b_1\\epsilon) + (a_2 + b_2\\epsilon) = (a_1 + a_2) + (b_1 + b_2)\\epsilon$\n", 636 | "\n", 637 | "**Subtraction**\n", 638 | "\n", 639 | "$(a_1 + b_1\\epsilon) - (a_2 + b_2\\epsilon) = (a_1 - a_2) + (b_1 - b_2)\\epsilon$\n", 640 | "\n", 641 | "**Multiplication**\n", 642 | "\n", 643 | "$(a_1 + b_1\\epsilon) \\times (a_2 + b_2\\epsilon) = (a_1 a_2) + (a_1 b_2 + a_2 b_1)\\epsilon + b_1 b_2\\epsilon^2 = (a_1 a_2) + (a_1b_2 + a_2b_1)\\epsilon$\n", 644 | "\n", 645 | "**Division**\n", 646 | "\n", 647 | "$\\dfrac{a_1 + b_1\\epsilon}{a_2 + b_2\\epsilon} = \\dfrac{a_1 + b_1\\epsilon}{a_2 + b_2\\epsilon} \\cdot \\dfrac{a_2 - b_2\\epsilon}{a_2 - b_2\\epsilon} = \\dfrac{a_1 a_2 + (b_1 a_2 - a_1 b_2)\\epsilon - b_1 b_2\\epsilon^2}{{a_2}^2 + (a_2 b_2 - a_2 b_2)\\epsilon - {b_2}^2\\epsilon^2} = \\dfrac{a_1}{a_2} + \\dfrac{b_1 a_2 - a_1 b_2}{{a_2}^2}\\epsilon$\n", 648 | "\n", 649 | "**Power**\n", 650 | "\n", 651 | "$(a + b\\epsilon)^n = a^n + (n a^{n-1}b)\\epsilon$\n", 652 | "\n", 653 | "etc." 654 | ] 655 | }, 656 | { 657 | "cell_type": "markdown", 658 | "metadata": {}, 659 | "source": [ 660 | "Let's create a class to represent dual numbers, and implement a few operations (addition and multiplication). You can try adding some more if you want."
661 | ] 662 | }, 663 | { 664 | "cell_type": "code", 665 | "execution_count": 20, 666 | "metadata": {}, 667 | "outputs": [], 668 | "source": [ 669 | "class DualNumber(object):\n", 670 | " def __init__(self, value=0.0, eps=0.0):\n", 671 | " self.value = value\n", 672 | " self.eps = eps\n", 673 | " def __add__(self, b):\n", 674 | " return DualNumber(self.value + self.to_dual(b).value,\n", 675 | " self.eps + self.to_dual(b).eps)\n", 676 | " def __radd__(self, a):\n", 677 | " return self.to_dual(a).__add__(self)\n", 678 | " def __mul__(self, b):\n", 679 | " return DualNumber(self.value * self.to_dual(b).value,\n", 680 | " self.eps * self.to_dual(b).value + self.value * self.to_dual(b).eps)\n", 681 | " def __rmul__(self, a):\n", 682 | " return self.to_dual(a).__mul__(self)\n", 683 | " def __str__(self):\n", 684 | " if self.eps:\n", 685 | " return \"{:.1f} + {:.1f}ε\".format(self.value, self.eps)\n", 686 | " else:\n", 687 | " return \"{:.1f}\".format(self.value)\n", 688 | " def __repr__(self):\n", 689 | " return str(self)\n", 690 | " @classmethod\n", 691 | " def to_dual(cls, n):\n", 692 | " if hasattr(n, \"value\"):\n", 693 | " return n\n", 694 | " else:\n", 695 | " return cls(n)" 696 | ] 697 | }, 698 | { 699 | "cell_type": "markdown", 700 | "metadata": {}, 701 | "source": [ 702 | "$3 + (3 + 4 \\epsilon) = 6 + 4\\epsilon$" 703 | ] 704 | }, 705 | { 706 | "cell_type": "code", 707 | "execution_count": 21, 708 | "metadata": {}, 709 | "outputs": [ 710 | { 711 | "data": { 712 | "text/plain": [ 713 | "6.0 + 4.0ε" 714 | ] 715 | }, 716 | "execution_count": 21, 717 | "metadata": {}, 718 | "output_type": "execute_result" 719 | } 720 | ], 721 | "source": [ 722 | "3 + DualNumber(3, 4)" 723 | ] 724 | }, 725 | { 726 | "cell_type": "markdown", 727 | "metadata": {}, 728 | "source": [ 729 | "$(3 + 4ε)\\times(5 + 7ε)$ = $3 \\times 5 + 3 \\times 7ε + 4ε \\times 5 + 4ε \\times 7ε$ = $15 + 21ε + 20ε + 28ε^2$ = $15 + 41ε + 28 \\times 0$ = $15 + 41ε$" 730 | ] 731 | }, 732 | { 733 | "cell_type": 
"code", 734 | "execution_count": 22, 735 | "metadata": {}, 736 | "outputs": [ 737 | { 738 | "data": { 739 | "text/plain": [ 740 | "15.0 + 41.0ε" 741 | ] 742 | }, 743 | "execution_count": 22, 744 | "metadata": {}, 745 | "output_type": "execute_result" 746 | } 747 | ], 748 | "source": [ 749 | "DualNumber(3, 4) * DualNumber(5, 7)" 750 | ] 751 | }, 752 | { 753 | "cell_type": "markdown", 754 | "metadata": {}, 755 | "source": [ 756 | "Now let's see if the dual numbers work with our toy computation framework:" 757 | ] 758 | }, 759 | { 760 | "cell_type": "code", 761 | "execution_count": 23, 762 | "metadata": {}, 763 | "outputs": [ 764 | { 765 | "data": { 766 | "text/plain": [ 767 | "42.0" 768 | ] 769 | }, 770 | "execution_count": 23, 771 | "metadata": {}, 772 | "output_type": "execute_result" 773 | } 774 | ], 775 | "source": [ 776 | "x.value = DualNumber(3.0)\n", 777 | "y.value = DualNumber(4.0)\n", 778 | "\n", 779 | "f.evaluate()" 780 | ] 781 | }, 782 | { 783 | "cell_type": "markdown", 784 | "metadata": {}, 785 | "source": [ 786 | "Yep, sure works. 
Now let's use this to compute the partial derivatives of $f$ with regards to $x$ and $y$ at x=3 and y=4:" 787 | ] 788 | }, 789 | { 790 | "cell_type": "code", 791 | "execution_count": 24, 792 | "metadata": {}, 793 | "outputs": [], 794 | "source": [ 795 | "x.value = DualNumber(3.0, 1.0) # 3 + ε\n", 796 | "y.value = DualNumber(4.0) # 4\n", 797 | "\n", 798 | "dfdx = f.evaluate().eps\n", 799 | "\n", 800 | "x.value = DualNumber(3.0) # 3\n", 801 | "y.value = DualNumber(4.0, 1.0) # 4 + ε\n", 802 | "\n", 803 | "dfdy = f.evaluate().eps" 804 | ] 805 | }, 806 | { 807 | "cell_type": "code", 808 | "execution_count": 25, 809 | "metadata": {}, 810 | "outputs": [ 811 | { 812 | "data": { 813 | "text/plain": [ 814 | "24.0" 815 | ] 816 | }, 817 | "execution_count": 25, 818 | "metadata": {}, 819 | "output_type": "execute_result" 820 | } 821 | ], 822 | "source": [ 823 | "dfdx" 824 | ] 825 | }, 826 | { 827 | "cell_type": "code", 828 | "execution_count": 26, 829 | "metadata": {}, 830 | "outputs": [ 831 | { 832 | "data": { 833 | "text/plain": [ 834 | "10.0" 835 | ] 836 | }, 837 | "execution_count": 26, 838 | "metadata": {}, 839 | "output_type": "execute_result" 840 | } 841 | ], 842 | "source": [ 843 | "dfdy" 844 | ] 845 | }, 846 | { 847 | "cell_type": "markdown", 848 | "metadata": {}, 849 | "source": [ 850 | "Great! However, in this implementation we are limited to first order derivatives.\n", 851 | "Now let's look at reverse mode." 
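Before switching to reverse mode, a quick aside (this code is not in the notebook): the `DualNumber` class above could also be extended with transcendental functions, using rules such as $\sin(a + b\epsilon) = \sin(a) + b\cos(a)\,\epsilon$. A self-contained sketch with plain `(value, eps)` tuples (the helper names `dual_mul` and `dual_sin` are made up) reproduces the earlier forward-mode result for $z(x)=\sin(x^2)$ at $x=3$:

```python
import math

# Forward-mode rules on (value, eps) pairs -- illustrative only.
def dual_mul(p, q):
    return (p[0] * q[0], p[0] * q[1] + p[1] * q[0])

def dual_sin(p):  # sin(a + b*eps) = sin(a) + b*cos(a)*eps
    return (math.sin(p[0]), math.cos(p[0]) * p[1])

x = (3.0, 1.0)                # 3 + eps
z = dual_sin(dual_mul(x, x))  # z(x) = sin(x**2)
print(z)  # the eps part is dz/dx at x=3, about -5.467
```

The eps component matches the $\approx -5.46$ value computed by hand in the forward-mode walkthrough, this time exactly (up to float precision) rather than via a finite-difference approximation.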
852 | ] 853 | }, 854 | { 855 | "cell_type": "markdown", 856 | "metadata": {}, 857 | "source": [ 858 | "### Reverse mode autodiff" 859 | ] 860 | }, 861 | { 862 | "cell_type": "markdown", 863 | "metadata": {}, 864 | "source": [ 865 | "Let's rewrite our toy framework to add reverse mode autodiff:" 866 | ] 867 | }, 868 | { 869 | "cell_type": "code", 870 | "execution_count": 27, 871 | "metadata": {}, 872 | "outputs": [], 873 | "source": [ 874 | "class Const(object):\n", 875 | " def __init__(self, value):\n", 876 | " self.value = value\n", 877 | " def evaluate(self):\n", 878 | " return self.value\n", 879 | " def backpropagate(self, gradient):\n", 880 | " pass\n", 881 | " def __str__(self):\n", 882 | " return str(self.value)\n", 883 | "\n", 884 | "class Var(object):\n", 885 | " def __init__(self, name, init_value=0):\n", 886 | " self.value = init_value\n", 887 | " self.name = name\n", 888 | " self.gradient = 0\n", 889 | " def evaluate(self):\n", 890 | " return self.value\n", 891 | " def backpropagate(self, gradient):\n", 892 | " self.gradient += gradient\n", 893 | " def __str__(self):\n", 894 | " return self.name\n", 895 | "\n", 896 | "class BinaryOperator(object):\n", 897 | " def __init__(self, a, b):\n", 898 | " self.a = a\n", 899 | " self.b = b\n", 900 | "\n", 901 | "class Add(BinaryOperator):\n", 902 | " def evaluate(self):\n", 903 | " self.value = self.a.evaluate() + self.b.evaluate()\n", 904 | " return self.value\n", 905 | " def backpropagate(self, gradient):\n", 906 | " self.a.backpropagate(gradient)\n", 907 | " self.b.backpropagate(gradient)\n", 908 | " def __str__(self):\n", 909 | " return \"{} + {}\".format(self.a, self.b)\n", 910 | "\n", 911 | "class Mul(BinaryOperator):\n", 912 | " def evaluate(self):\n", 913 | " self.value = self.a.evaluate() * self.b.evaluate()\n", 914 | " return self.value\n", 915 | " def backpropagate(self, gradient):\n", 916 | " self.a.backpropagate(gradient * self.b.value)\n", 917 | " self.b.backpropagate(gradient * self.a.value)\n", 918 
| " def __str__(self):\n", 919 | " return \"({}) * ({})\".format(self.a, self.b)" 920 | ] 921 | }, 922 | { 923 | "cell_type": "code", 924 | "execution_count": 28, 925 | "metadata": {}, 926 | "outputs": [], 927 | "source": [ 928 | "x = Var(\"x\", init_value=3)\n", 929 | "y = Var(\"y\", init_value=4)\n", 930 | "f = Add(Mul(Mul(x, x), y), Add(y, Const(2))) # f(x,y) = x²y + y + 2\n", 931 | "\n", 932 | "result = f.evaluate()\n", 933 | "f.backpropagate(1.0)" 934 | ] 935 | }, 936 | { 937 | "cell_type": "code", 938 | "execution_count": 29, 939 | "metadata": {}, 940 | "outputs": [ 941 | { 942 | "name": "stdout", 943 | "output_type": "stream", 944 | "text": [ 945 | "((x) * (x)) * (y) + y + 2\n" 946 | ] 947 | } 948 | ], 949 | "source": [ 950 | "print(f)" 951 | ] 952 | }, 953 | { 954 | "cell_type": "code", 955 | "execution_count": 30, 956 | "metadata": {}, 957 | "outputs": [ 958 | { 959 | "data": { 960 | "text/plain": [ 961 | "42" 962 | ] 963 | }, 964 | "execution_count": 30, 965 | "metadata": {}, 966 | "output_type": "execute_result" 967 | } 968 | ], 969 | "source": [ 970 | "result" 971 | ] 972 | }, 973 | { 974 | "cell_type": "code", 975 | "execution_count": 31, 976 | "metadata": {}, 977 | "outputs": [ 978 | { 979 | "data": { 980 | "text/plain": [ 981 | "24.0" 982 | ] 983 | }, 984 | "execution_count": 31, 985 | "metadata": {}, 986 | "output_type": "execute_result" 987 | } 988 | ], 989 | "source": [ 990 | "x.gradient" 991 | ] 992 | }, 993 | { 994 | "cell_type": "code", 995 | "execution_count": 32, 996 | "metadata": {}, 997 | "outputs": [ 998 | { 999 | "data": { 1000 | "text/plain": [ 1001 | "10.0" 1002 | ] 1003 | }, 1004 | "execution_count": 32, 1005 | "metadata": {}, 1006 | "output_type": "execute_result" 1007 | } 1008 | ], 1009 | "source": [ 1010 | "y.gradient" 1011 | ] 1012 | }, 1013 | { 1014 | "cell_type": "markdown", 1015 | "metadata": {}, 1016 | "source": [ 1017 | "Again, in this implementation the outputs are just numbers, not symbolic expressions, so we are limited to 
first order derivatives. However, we could have made the `backpropagate()` methods return symbolic expressions rather than values (e.g., return `Add(2,3)` rather than 5). This would make it possible to compute second order gradients (and beyond). This is what TensorFlow does, as do all the major libraries that implement autodiff." 1018 | ] 1019 | }, 1020 | { 1021 | "cell_type": "markdown", 1022 | "metadata": {}, 1023 | "source": [ 1024 | "### Reverse mode autodiff using TensorFlow" 1025 | ] 1026 | }, 1027 | { 1028 | "cell_type": "code", 1029 | "execution_count": 33, 1030 | "metadata": {}, 1031 | "outputs": [], 1032 | "source": [ 1033 | "import tensorflow as tf" 1034 | ] 1035 | }, 1036 | { 1037 | "cell_type": "code", 1038 | "execution_count": 34, 1039 | "metadata": {}, 1040 | "outputs": [ 1041 | { 1042 | "data": { 1043 | "text/plain": [ 1044 | "[<tf.Tensor: shape=(), dtype=float32, numpy=24.0>,\n", 1045 | " <tf.Tensor: shape=(), dtype=float32, numpy=10.0>]" 1046 | ] 1047 | }, 1048 | "execution_count": 34, 1049 | "metadata": {}, 1050 | "output_type": "execute_result" 1051 | } 1052 | ], 1053 | "source": [ 1054 | "x = tf.Variable(3.)\n", 1055 | "y = tf.Variable(4.)\n", 1056 | "\n", 1057 | "with tf.GradientTape() as tape:\n", 1058 | " f = x*x*y + y + 2\n", 1059 | "\n", 1060 | "jacobians = tape.gradient(f, [x, y])\n", 1061 | "jacobians" 1062 | ] 1063 | }, 1064 | { 1065 | "cell_type": "markdown", 1066 | "metadata": {}, 1067 | "source": [ 1068 | "Since everything is symbolic, we can compute second order derivatives, and beyond:" 1069 | ] 1070 | }, 1071 | { 1072 | "cell_type": "code", 1073 | "execution_count": 35, 1074 | "metadata": {}, 1075 | "outputs": [ 1076 | { 1077 | "name": "stdout", 1078 | "output_type": "stream", 1079 | "text": [ 1080 | "WARNING:tensorflow:Calling GradientTape.gradient on a persistent tape inside its context is significantly less efficient than calling it outside the context (it causes the gradient ops to be recorded on the tape, leading to increased CPU and memory usage).
Only call GradientTape.gradient inside the context if you actually want to trace the gradient in order to compute higher order derivatives.\n" 1081 | ] 1082 | }, 1083 | { 1084 | "data": { 1085 | "text/plain": [ 1086 | "[[<tf.Tensor: shape=(), dtype=float32, numpy=8.0>,\n", 1087 | " <tf.Tensor: shape=(), dtype=float32, numpy=6.0>],\n", 1088 | " [<tf.Tensor: shape=(), dtype=float32, numpy=6.0>, None]]" 1089 | ] 1090 | }, 1091 | "execution_count": 35, 1092 | "metadata": {}, 1093 | "output_type": "execute_result" 1094 | } 1095 | ], 1096 | "source": [ 1097 | "x = tf.Variable(3.)\n", 1098 | "y = tf.Variable(4.)\n", 1099 | "\n", 1100 | "with tf.GradientTape(persistent=True) as tape:\n", 1101 | " f = x*x*y + y + 2\n", 1102 | " df_dx, df_dy = tape.gradient(f, [x, y])\n", 1103 | "\n", 1104 | "d2f_d2x, d2f_dydx = tape.gradient(df_dx, [x, y])\n", 1105 | "d2f_dxdy, d2f_d2y = tape.gradient(df_dy, [x, y])\n", 1106 | "del tape\n", 1107 | "\n", 1108 | "hessians = [[d2f_d2x, d2f_dydx], [d2f_dxdy, d2f_d2y]]\n", 1109 | "hessians" 1110 | ] 1111 | }, 1112 | { 1113 | "cell_type": "markdown", 1114 | "metadata": {}, 1115 | "source": [ 1116 | "Note that when we compute the derivative of a tensor with regards to a variable that it does not depend on, instead of returning 0.0, the `gradient()` function returns `None`." 1117 | ] 1118 | }, 1119 | { 1120 | "cell_type": "markdown", 1121 | "metadata": {}, 1122 | "source": [ 1123 | "And that's all folks! Hope you enjoyed this notebook."
1124 | ] 1125 | } 1126 | ], 1127 | "metadata": { 1128 | "kernelspec": { 1129 | "display_name": "Python 3", 1130 | "language": "python", 1131 | "name": "python3" 1132 | }, 1133 | "language_info": { 1134 | "codemirror_mode": { 1135 | "name": "ipython", 1136 | "version": 3 1137 | }, 1138 | "file_extension": ".py", 1139 | "mimetype": "text/x-python", 1140 | "name": "python", 1141 | "nbconvert_exporter": "python", 1142 | "pygments_lexer": "ipython3", 1143 | "version": "3.7.10" 1144 | }, 1145 | "nav_menu": { 1146 | "height": "603px", 1147 | "width": "616px" 1148 | }, 1149 | "toc": { 1150 | "navigate_menu": true, 1151 | "number_sections": true, 1152 | "sideBar": true, 1153 | "threshold": 6, 1154 | "toc_cell": false, 1155 | "toc_section_display": "block", 1156 | "toc_window_display": true 1157 | } 1158 | }, 1159 | "nbformat": 4, 1160 | "nbformat_minor": 1 1161 | } 1162 | -------------------------------------------------------------------------------- /images/ann/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/autoencoders/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/classification/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/cnn/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/cnn/test_image.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/ageron/handson-ml2/8958d538bdcdf29d329d9950bfc79034c29db724/images/cnn/test_image.png -------------------------------------------------------------------------------- /images/decision_trees/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/deep/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/distributed/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/end_to_end_project/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/end_to_end_project/california.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ageron/handson-ml2/8958d538bdcdf29d329d9950bfc79034c29db724/images/end_to_end_project/california.png -------------------------------------------------------------------------------- /images/ensembles/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/fundamentals/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/nlp/README: 
-------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/rl/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/rl/breakout.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ageron/handson-ml2/8958d538bdcdf29d329d9950bfc79034c29db724/images/rl/breakout.gif -------------------------------------------------------------------------------- /images/rnn/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/svm/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/tensorflow/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/training_linear_models/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/unsupervised_learning/README: -------------------------------------------------------------------------------- 1 | Images generated by the notebooks 2 | -------------------------------------------------------------------------------- /images/unsupervised_learning/ladybug.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/ageron/handson-ml2/8958d538bdcdf29d329d9950bfc79034c29db724/images/unsupervised_learning/ladybug.png -------------------------------------------------------------------------------- /index.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Machine Learning Notebooks\n", 8 | "\n", 9 | "*Welcome to the Machine Learning Notebooks!*\n", 10 | "\n", 11 | "[Prerequisites](#Prerequisites) (see below)\n", 12 | "\n", 13 | "\n", 14 | " \n", 17 | " \n", 20 | "
\n", 15 | " \"Open\n", 16 | " \n", 18 | " \n", 19 | "
" 21 | ] 22 | }, 23 | { 24 | "cell_type": "markdown", 25 | "metadata": {}, 26 | "source": [ 27 | "## Notebooks\n", 28 | "1. [The Machine Learning landscape](01_the_machine_learning_landscape.ipynb)\n", 29 | "2. [End-to-end Machine Learning project](02_end_to_end_machine_learning_project.ipynb)\n", 30 | "3. [Classification](03_classification.ipynb)\n", 31 | "4. [Training Models](04_training_linear_models.ipynb)\n", 32 | "5. [Support Vector Machines](05_support_vector_machines.ipynb)\n", 33 | "6. [Decision Trees](06_decision_trees.ipynb)\n", 34 | "7. [Ensemble Learning and Random Forests](07_ensemble_learning_and_random_forests.ipynb)\n", 35 | "8. [Dimensionality Reduction](08_dimensionality_reduction.ipynb)\n", 36 | "9. [Unsupervised Learning Techniques](09_unsupervised_learning.ipynb)\n", 37 | "10. [Artificial Neural Nets with Keras](10_neural_nets_with_keras.ipynb)\n", 38 | "11. [Training Deep Neural Networks](11_training_deep_neural_networks.ipynb)\n", 39 | "12. [Custom Models and Training with TensorFlow](12_custom_models_and_training_with_tensorflow.ipynb)\n", 40 | "13. [Loading and Preprocessing Data](13_loading_and_preprocessing_data.ipynb)\n", 41 | "14. [Deep Computer Vision Using Convolutional Neural Networks](14_deep_computer_vision_with_cnns.ipynb)\n", 42 | "15. [Processing Sequences Using RNNs and CNNs](15_processing_sequences_using_rnns_and_cnns.ipynb)\n", 43 | "16. [Natural Language Processing with RNNs and Attention](16_nlp_with_rnns_and_attention.ipynb)\n", 44 | "17. [Representation Learning and Generative Learning Using Autencoders and GANs](17_autoencoders_and_gans.ipynb)\n", 45 | "18. [Reinforcement Learning](18_reinforcement_learning.ipynb)\n", 46 | "19. 
[Training and Deploying TensorFlow Models at Scale](19_training_and_deploying_at_scale.ipynb)" 47 | ] 48 | }, 49 | { 50 | "cell_type": "markdown", 51 | "metadata": {}, 52 | "source": [ 53 | "## Scientific Python tutorials\n", 54 | "* [NumPy](tools_numpy.ipynb)\n", 55 | "* [Matplotlib](tools_matplotlib.ipynb)\n", 56 | "* [Pandas](tools_pandas.ipynb)" 57 | ] 58 | }, 59 | { 60 | "cell_type": "markdown", 61 | "metadata": {}, 62 | "source": [ 63 | "## Math Tutorials\n", 64 | "* [Linear Algebra](math_linear_algebra.ipynb)\n", 65 | "* [Differential Calculus](math_differential_calculus.ipynb)" 66 | ] 67 | }, 68 | { 69 | "cell_type": "markdown", 70 | "metadata": {}, 71 | "source": [ 72 | "## Extra Material\n", 73 | "* [Auto-differentiation](extra_autodiff.ipynb)" 74 | ] 75 | }, 76 | { 77 | "cell_type": "markdown", 78 | "metadata": {}, 79 | "source": [ 80 | "## Misc.\n", 81 | "* [Equations](book_equations.pdf) (list of equations in the book)" 82 | ] 83 | }, 84 | { 85 | "cell_type": "markdown", 86 | "metadata": {}, 87 | "source": [ 88 | "## Prerequisites" 89 | ] 90 | }, 91 | { 92 | "cell_type": "markdown", 93 | "metadata": {}, 94 | "source": [ 95 | "### To understand\n", 96 | "* **Python** – you don't need to be an expert Python programmer, but you do need to know the basics. If you don't, the official [Python tutorial](https://docs.python.org/3/tutorial/) is a good place to start.\n", 97 | "* **Scientific Python** – We will be using a few popular Python libraries, in particular NumPy, matplotlib and pandas. If you are not familiar with these libraries, you should probably start by going through the tutorials in the Tools section (especially NumPy).\n", 98 | "* **Math** – We will also use some notions of Linear Algebra, Calculus, Statistics and Probability theory. 
You should be able to follow along if you learned these in the past, as it won't be very advanced; if you don't know these topics or need a refresher, go through the appropriate introduction in the Math section." 99 | ] 100 | }, 101 | { 102 | "cell_type": "markdown", 103 | "metadata": {}, 104 | "source": [ 105 | "### To run the examples\n", 106 | "* **Jupyter** – These notebooks are based on Jupyter. You can run them in just one click using a hosted platform such as Binder, Deepnote or Colaboratory (no installation required), view them using Jupyter.org's viewer, or install everything on your machine, as you prefer. Check out the [home page](https://github.com/ageron/handson-ml2/) for more details." 107 | ] 108 | }, 109 | { 110 | "cell_type": "code", 111 | "execution_count": null, 112 | "metadata": {}, 113 | "outputs": [], 114 | "source": [] 115 | } 116 | ], 117 | "metadata": { 118 | "kernelspec": { 119 | "display_name": "Python 3", 120 | "language": "python", 121 | "name": "python3" 122 | }, 123 | "language_info": { 124 | "codemirror_mode": { 125 | "name": "ipython", 126 | "version": 3 127 | }, 128 | "file_extension": ".py", 129 | "mimetype": "text/x-python", 130 | "name": "python", 131 | "nbconvert_exporter": "python", 132 | "pygments_lexer": "ipython3", 133 | "version": "3.7.10" 134 | }, 135 | "nav_menu": {}, 136 | "toc": { 137 | "navigate_menu": true, 138 | "number_sections": true, 139 | "sideBar": true, 140 | "threshold": 6, 141 | "toc_cell": false, 142 | "toc_section_display": "block", 143 | "toc_window_display": false 144 | } 145 | }, 146 | "nbformat": 4, 147 | "nbformat_minor": 4 148 | } 149 | -------------------------------------------------------------------------------- /ml-project-checklist.md: -------------------------------------------------------------------------------- 1 | This checklist can guide you through your Machine Learning projects. There are eight main steps: 2 | 3 | 1. 
Frame the problem and look at the big picture. 4 | 2. Get the data. 5 | 3. Explore the data to gain insights. 6 | 4. Prepare the data to better expose the underlying data patterns to Machine Learning algorithms. 7 | 5. Explore many different models and short-list the best ones. 8 | 6. Fine-tune your models and combine them into a great solution. 9 | 7. Present your solution. 10 | 8. Launch, monitor, and maintain your system. 11 | 12 | Obviously, you should feel free to adapt this checklist to your needs. 13 | 14 | # Frame the problem and look at the big picture 15 | 1. Define the objective in business terms. 16 | 2. How will your solution be used? 17 | 3. What are the current solutions/workarounds (if any)? 18 | 4. How should you frame this problem (supervised/unsupervised, online/offline, etc.)? 19 | 5. How should performance be measured? 20 | 6. Is the performance measure aligned with the business objective? 21 | 7. What would be the minimum performance needed to reach the business objective? 22 | 8. What are comparable problems? Can you reuse experience or tools? 23 | 9. Is human expertise available? 24 | 10. How would you solve the problem manually? 25 | 11. List the assumptions you or others have made so far. 26 | 12. Verify assumptions if possible. 27 | 28 | # Get the data 29 | Note: automate as much as possible so you can easily get fresh data. 30 | 31 | 1. List the data you need and how much you need. 32 | 2. Find and document where you can get that data. 33 | 3. Check how much space it will take. 34 | 4. Check legal obligations, and get authorization if necessary. 35 | 5. Get access authorizations. 36 | 6. Create a workspace (with enough storage space). 37 | 7. Get the data. 38 | 8. Convert the data to a format you can easily manipulate (without changing the data itself). 39 | 9. Ensure sensitive information is deleted or protected (e.g., anonymized). 40 | 10. Check the size and type of data (time series, sample, geographical, etc.). 41 | 11. 
Sample a test set, put it aside, and never look at it (no data snooping!). 42 | 43 | # Explore the data 44 | Note: try to get insights from a field expert for these steps. 45 | 46 | 1. Create a copy of the data for exploration (sampling it down to a manageable size if necessary). 47 | 2. Create a Jupyter notebook to keep a record of your data exploration. 48 | 3. Study each attribute and its characteristics: 49 | - Name 50 | - Type (categorical, int/float, bounded/unbounded, text, structured, etc.) 51 | - % of missing values 52 | - Noisiness and type of noise (stochastic, outliers, rounding errors, etc.) 53 | - Possibly useful for the task? 54 | - Type of distribution (Gaussian, uniform, logarithmic, etc.) 55 | 4. For supervised learning tasks, identify the target attribute(s). 56 | 5. Visualize the data. 57 | 6. Study the correlations between attributes. 58 | 7. Study how you would solve the problem manually. 59 | 8. Identify the promising transformations you may want to apply. 60 | 9. Identify extra data that would be useful (go back to "Get the data"). 61 | 10. Document what you have learned. 62 | 63 | # Prepare the data 64 | Notes: 65 | - Work on copies of the data (keep the original dataset intact). 66 | - Write functions for all data transformations you apply, for five reasons: 67 | - So you can easily prepare the data the next time you get a fresh dataset 68 | - So you can apply these transformations in future projects 69 | - To clean and prepare the test set 70 | - To clean and prepare new data instances 71 | - To make it easy to treat your preparation choices as hyperparameters 72 | 73 | 1. Data cleaning: 74 | - Fix or remove outliers (optional). 75 | - Fill in missing values (e.g., with zero, mean, median...) or drop their rows (or columns). 76 | 2. Feature selection (optional): 77 | - Drop the attributes that provide no useful information for the task. 78 | 3. Feature engineering, where appropriate: 79 | - Discretize continuous features. 
80 | - Decompose features (e.g., categorical, date/time, etc.). 81 | - Add promising transformations of features (e.g., log(x), sqrt(x), x^2, etc.). 82 | - Aggregate features into promising new features. 83 | 4. Feature scaling: standardize or normalize features. 84 | 85 | # Short-list promising models 86 | Notes: 87 | - If the data is huge, you may want to sample smaller training sets so you can train many different models in a reasonable time (be aware that this penalizes complex models such as large neural nets or Random Forests). 88 | - Once again, try to automate these steps as much as possible. 89 | 90 | 1. Train many quick and dirty models from different categories (e.g., linear, naive Bayes, SVM, Random Forests, neural net, etc.) using standard parameters. 91 | 2. Measure and compare their performance. 92 | - For each model, use N-fold cross-validation and compute the mean and standard deviation of their performance. 93 | 3. Analyze the most significant variables for each algorithm. 94 | 4. Analyze the types of errors the models make. 95 | - What data would a human have used to avoid these errors? 96 | 5. Have a quick round of feature selection and engineering. 97 | 6. Have one or two more quick iterations of the five previous steps. 98 | 7. Short-list the top three to five most promising models, preferring models that make different types of errors. 99 | 100 | # Fine-Tune the System 101 | Notes: 102 | - You will want to use as much data as possible for this step, especially as you move toward the end of fine-tuning. 103 | - As always, automate what you can. 104 | 105 | 1. Fine-tune the hyperparameters using cross-validation. 106 | - Treat your data transformation choices as hyperparameters, especially when you are not sure about them (e.g., should I replace missing values with zero or the median value? Or just drop the rows?). 107 | - Unless there are very few hyperparameter values to explore, prefer random search over grid search. 
If training is very long, you may prefer a Bayesian optimization approach (e.g., using Gaussian process priors, as described by Jasper Snoek, Hugo Larochelle, and Ryan Adams ([https://goo.gl/PEFfGr](https://goo.gl/PEFfGr))). 108 | 2. Try Ensemble methods. Combining your best models will often perform better than running them individually. 109 | 3. Once you are confident about your final model, measure its performance on the test set to estimate the generalization error. 110 | 111 | > Don't tweak your model after measuring the generalization error: you would just start overfitting the test set. 112 | 113 | # Present your solution 114 | 1. Document what you have done. 115 | 2. Create a nice presentation. 116 | - Make sure you highlight the big picture first. 117 | 3. Explain why your solution achieves the business objective. 118 | 4. Don't forget to present interesting points you noticed along the way. 119 | - Describe what worked and what did not. 120 | - List your assumptions and your system's limitations. 121 | 5. Ensure your key findings are communicated through beautiful visualizations or easy-to-remember statements (e.g., "the median income is the number-one predictor of housing prices"). 122 | 123 | # Launch! 124 | 1. Get your solution ready for production (plug into production data inputs, write unit tests, etc.). 125 | 2. Write monitoring code to check your system's live performance at regular intervals and trigger alerts when it drops. 126 | - Beware of slow degradation too: models tend to "rot" as data evolves. 127 | - Measuring performance may require a human pipeline (e.g., via a crowdsourcing service). 128 | - Also monitor your inputs' quality (e.g., a malfunctioning sensor sending random values, or another team's output becoming stale). This is particularly important for online learning systems. 129 | 3. Retrain your models on a regular basis on fresh data (automate as much as possible). 
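To make the "Prepare the data" and "Fine-Tune the System" steps concrete, here is a minimal sketch using scikit-learn (pinned in this repository's requirements.txt). The synthetic dataset, step names, and hyperparameter ranges are illustrative assumptions, not part of the checklist: wrap the transformations in a pipeline so preparation choices (like the imputation strategy) become hyperparameters, then tune with random search and cross-validation.

```python
# Illustrative sketch only: synthetic data and made-up hyperparameter ranges.
from scipy.stats import randint
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Sample a test set first and put it aside (no data snooping).
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Write all data transformations as reusable pipeline steps, so they can be
# reapplied to the test set, to fresh data, and to future projects.
pipeline = Pipeline([
    ("imputer", SimpleImputer(strategy="median")),
    ("scaler", StandardScaler()),
    ("model", RandomForestRegressor(random_state=42)),
])

# Prefer random search over grid search unless there are very few
# hyperparameter values to explore; note the preparation choice
# ("imputer__strategy") is tuned alongside the model's hyperparameters.
param_distributions = {
    "imputer__strategy": ["median", "mean"],
    "model__n_estimators": randint(10, 100),
    "model__max_depth": randint(2, 10),
}
search = RandomizedSearchCV(pipeline, param_distributions,
                            n_iter=5, cv=3, random_state=42)
search.fit(X_train, y_train)
print(search.best_params_)

# Only once the model is final: estimate the generalization error on the
# held-out test set (and don't tweak the model afterwards).
print(search.score(X_test, y_test))
```

Because the preparation steps live inside the pipeline, cross-validation re-fits the imputer and scaler on each training fold, which avoids leaking information from the validation folds.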
130 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | # WARNING: Using Anaconda instead of pip is highly recommended, especially on 2 | # Windows or when using a GPU. Please see the installation instructions in 3 | # INSTALL.md 4 | 5 | 6 | ##### Core scientific packages 7 | jupyter~=1.0.0 8 | matplotlib~=3.4.3 9 | numpy~=1.19.5 10 | pandas~=1.3.3 11 | scipy~=1.7.1 12 | 13 | ##### Machine Learning packages 14 | scikit-learn~=1.0 15 | 16 | # Optional: the XGBoost library is only used in chapter 7 17 | xgboost~=1.4.2 18 | 19 | # Optional: the transformers library is only used in chapter 16 20 | transformers~=4.11.3 21 | 22 | ##### TensorFlow-related packages 23 | 24 | # If you have a TF-compatible GPU and you want to enable GPU support, then 25 | # replace tensorflow-serving-api with tensorflow-serving-api-gpu. 26 | # Your GPU must have CUDA Compute Capability 3.5 or higher support, and 27 | # you must install CUDA, cuDNN and more: see tensorflow.org for the detailed 28 | # installation instructions. 29 | 30 | tensorflow~=2.6.0 31 | # Optional: the TF Serving API library is just needed for chapter 19. 32 | tensorflow-serving-api~=2.6.0 # or tensorflow-serving-api-gpu if gpu 33 | 34 | tensorboard~=2.7.0 35 | tensorboard-plugin-profile~=2.5.0 36 | tensorflow-datasets~=4.4.0 37 | tensorflow-hub~=0.12.0 38 | tensorflow-probability~=0.14.1 39 | 40 | # Optional: only used in chapter 13. 41 | tfx~=1.3.0 42 | 43 | # Optional: only used in chapter 16. 
44 | tensorflow-addons~=0.14.0 45 | 46 | ##### Reinforcement Learning library (chapter 18) 47 | 48 | # There are a few dependencies you need to install first, check out: 49 | # https://github.com/openai/gym#installing-everything 50 | gym[Box2D,atari,accept-rom-license]~=0.21.0 51 | 52 | # WARNING: on Windows, installing Box2D this way requires: 53 | # * Swig: http://www.swig.org/download.html 54 | # * Microsoft C++ Build Tools: https://visualstudio.microsoft.com/visual-cpp-build-tools/ 55 | # It's much easier to use Anaconda instead. 56 | 57 | tf-agents~=0.10.0 58 | 59 | ##### Image manipulation 60 | Pillow~=8.4.0 61 | graphviz~=0.17 62 | opencv-python~=4.5.3.56 63 | pyglet~=1.5.21 64 | 65 | #pyvirtualdisplay # needed in chapter 18, if on a headless server 66 | # (i.e., without screen, e.g., Colab or VM) 67 | 68 | 69 | ##### Additional utilities 70 | 71 | # Efficient jobs (caching, parallelism, persistence) 72 | joblib~=0.14.1 73 | 74 | # Easy http requests 75 | requests~=2.26.0 76 | 77 | # Nice utility to diff Jupyter Notebooks. 78 | nbdime~=3.1.0 79 | 80 | # May be useful with Pandas for complex "where" clauses (e.g., Pandas 81 | # tutorial). 82 | numexpr~=2.7.3 83 | 84 | # Optional: these libraries can be useful in chapter 3, exercise 4. 85 | nltk~=3.6.5 86 | urlextract~=1.4.0 87 | 88 | # Optional: this library is only used in chapter 16 89 | ftfy~=6.0.3 90 | 91 | # Optional: tqdm displays nice progress bars, ipywidgets for tqdm's notebook support 92 | tqdm~=4.62.3 93 | ipywidgets~=7.6.5 94 | --------------------------------------------------------------------------------