├── .gitignore
├── featured
│   └── images
│       ├── ml.png
│       ├── layout.png
│       ├── road.jpg
│       ├── turing.jpg
│       └── vispy.PNG
├── references
│   ├── chapter15_symbolic.md
│   ├── chapter13_stochastic.md
│   ├── chapter08_ml.md
│   ├── chapter05_hpc.md
│   ├── chapter06_viz.md
│   ├── chapter03_notebook.md
│   ├── chapter14_graphs.md
│   ├── chapter01_intro.md
│   ├── chapter10_signal.md
│   ├── chapter12_deterministic.md
│   ├── chapter09_numopt.md
│   ├── chapter11_image.md
│   ├── chapter07_stats.md
│   ├── chapter04_optimization.md
│   └── chapter02_best_practices.md
├── tools
│   ├── util.py
│   ├── genfeatured.py
│   ├── gentoc.py
│   └── featured.tpl
├── notebooks
│   ├── chapter05_hpc
│   │   ├── _launch_notebook.bat
│   │   ├── 02_numexpr.ipynb
│   │   ├── 11_mpi.ipynb
│   │   ├── 09_ipyparallel.ipynb
│   │   └── 05_ray_1.ipynb
│   ├── chapter04_optimization
│   │   ├── 01_timeit.ipynb
│   │   ├── 03_linebyline.ipynb
│   │   ├── 09_memmap.ipynb
│   │   ├── 04_memprof.ipynb
│   │   ├── 06_stride_tricks.ipynb
│   │   ├── 02_profile.ipynb
│   │   └── 11_hdf5_table.ipynb
│   ├── chapter06_viz
│   │   ├── 03_bokeh.ipynb
│   │   └── 02_seaborn.ipynb
│   ├── chapter13_stochastic
│   │   └── 03_brownian.ipynb
│   ├── chapter07_stats
│   │   ├── 02_z_test.ipynb
│   │   └── 03_bayesian.ipynb
│   ├── chapter15_symbolic
│   │   ├── 03_function.ipynb
│   │   └── 02_solvers.ipynb
│   ├── chapter09_numoptim
│   │   ├── 01_root.ipynb
│   │   └── 03_curvefitting.ipynb
│   ├── chapter11_image
│   │   ├── 05_faces.ipynb
│   │   ├── 07_synth.ipynb
│   │   ├── 01_exposure.ipynb
│   │   ├── 02_filters.ipynb
│   │   └── 04_interest.ipynb
│   ├── chapter08_ml
│   │   ├── 07_pca.ipynb
│   │   └── 03_digits.ipynb
│   ├── chapter12_deterministic
│   │   ├── 02_cellular.ipynb
│   │   └── 03_ode.ipynb
│   └── chapter03_notebook
│       ├── 03_controls.ipynb
│       └── 04_css.ipynb
├── errata.md
├── LICENSE.txt
├── README.md
└── installation.md
/.gitignore:
--------------------------------------------------------------------------------
1 | data
2 | _old
3 | word
4 | .ipynb_checkpoints
5 | matplotlibrc
6 | *.bat
7 | *.json
8 | *.txt
--------------------------------------------------------------------------------
/featured/images/ml.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kunalj101/cookbook-code/master/featured/images/ml.png
--------------------------------------------------------------------------------
/featured/images/layout.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kunalj101/cookbook-code/master/featured/images/layout.png
--------------------------------------------------------------------------------
/featured/images/road.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kunalj101/cookbook-code/master/featured/images/road.jpg
--------------------------------------------------------------------------------
/featured/images/turing.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kunalj101/cookbook-code/master/featured/images/turing.jpg
--------------------------------------------------------------------------------
/featured/images/vispy.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kunalj101/cookbook-code/master/featured/images/vispy.PNG
--------------------------------------------------------------------------------
/references/chapter15_symbolic.md:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kunalj101/cookbook-code/master/references/chapter15_symbolic.md
--------------------------------------------------------------------------------
/references/chapter13_stochastic.md:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kunalj101/cookbook-code/master/references/chapter13_stochastic.md
--------------------------------------------------------------------------------
/references/chapter08_ml.md:
--------------------------------------------------------------------------------
1 | # 8. Machine Learning
2 |
3 | Full list of references in Chapter 8 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 | ## Recipe 1
5 |
6 |
7 |
8 | ## Recipe 2
9 |
10 |
11 |
12 | ## Recipe 3
13 |
14 |
15 |
16 | ## Recipe 4
17 |
18 |
19 |
20 | ## Recipe 5
21 |
22 |
23 |
24 | ## Recipe 6
25 |
26 |
27 |
28 | ## Recipe 7
29 |
30 |
31 |
32 |
--------------------------------------------------------------------------------
/tools/util.py:
--------------------------------------------------------------------------------
1 | import json
7 |
8 | def get_recipe_number(file):
9 | return int(file[:2])
10 |
11 | def get_recipe_name(file):
12 | # Load notebook.
13 | with open(file, 'r') as f:
14 | contents = json.load(f)
15 | cells = contents['worksheets'][0]['cells']
16 | for cell in cells:
17 | if cell.get('cell_type', None) == 'markdown':
18 | source = cell.get('source', [])
19 | for _ in source:
20 | if _.startswith('# '):
21 | return _[2:].strip()
22 |
--------------------------------------------------------------------------------
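The nbformat-3 layout that `get_recipe_name` above parses can be sketched end-to-end. The notebook content below is a made-up minimal example, and the extraction logic is repeated inline so the sketch runs on its own:

```python
import json
import os
import tempfile

# Hypothetical minimal nbformat-3 notebook: the recipe title is the first
# markdown cell whose source line starts with "# " (the layout util.py expects).
nb = {
    "nbformat": 3,
    "worksheets": [{"cells": [
        {"cell_type": "markdown",
         "source": ["# 4.1. Evaluating the time taken by a statement"]},
    ]}],
}

def get_recipe_name(file):
    # Same extraction logic as tools/util.py, repeated here for self-containment.
    with open(file, 'r') as f:
        contents = json.load(f)
    for cell in contents['worksheets'][0]['cells']:
        if cell.get('cell_type') == 'markdown':
            for line in cell.get('source', []):
                if line.startswith('# '):
                    return line[2:].strip()

fd, path = tempfile.mkstemp(suffix='.ipynb')
os.close(fd)
with open(path, 'w') as f:
    json.dump(nb, f)
title = get_recipe_name(path)
os.remove(path)
print(title)
```

Note that nbformat 4 dropped the `worksheets` wrapper in favor of a top-level `cells` list, so this helper only works on the legacy notebooks shipped with this repository.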
/notebooks/chapter05_hpc/_launch_notebook.bat:
--------------------------------------------------------------------------------
1 | REM ## This is needed for setenv.cmd later
2 | setlocal EnableDelayedExpansion
3 | REM ## STORE OLD DIR
4 | set olddir=%cd%
5 |
6 | REM ## DETERMINE SDK DIR
7 | SET SDKREG=HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v7.0
8 | SET SDKDIRQUERY=reg query "%SDKREG%" /v InstallationFolder
9 | FOR /F "tokens=2* delims= " %%A IN ('%SDKDIRQUERY%') DO SET SDKDIR=%%B
10 | REM Tab followed by Space ^^^^^^
11 |
12 | REM ## STARTUP COMPILE ENVIRONMENT
13 | call "%SDKDIR%\bin\setenv.cmd" /x64 /release
14 |
15 | REM ## RESET DIRECTORY
16 | chdir /d %olddir%
17 | set DISTUTILS_USE_SDK=1
18 |
19 | REM ## YOUR COMMANDS GO HERE
20 | ipython notebook --profile=cookbook
21 |
--------------------------------------------------------------------------------
/references/chapter05_hpc.md:
--------------------------------------------------------------------------------
1 | # 5. High-Performance Computing
2 |
3 | Full list of references in Chapter 5 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 | ## Recipe 1
5 |
6 |
7 |
8 | ## Recipe 2
9 |
10 |
11 |
12 | ## Recipe 3
13 |
14 |
15 |
16 | ## Recipe 4
17 |
18 |
19 |
20 | ## Recipe 5
21 |
22 |
23 |
24 | ## Recipe 6
25 |
26 |
27 |
28 | ## Recipe 7
29 |
30 |
31 |
32 | ## Recipe 8
33 |
34 |
35 |
36 | ## Recipe 9
37 |
38 |
39 |
40 | ## Recipe 10
41 |
42 |
43 |
44 | ## Recipe 11
45 |
46 |
47 |
48 | ## Recipe 12
49 |
50 |
51 |
52 | ## Recipe 13
53 |
54 |
55 |
56 | ## Recipe 14
57 |
58 |
59 |
60 |
--------------------------------------------------------------------------------
/references/chapter06_viz.md:
--------------------------------------------------------------------------------
1 | # 6. Advanced Visualization
2 |
3 | Full list of references in Chapter 6 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 |
5 |
6 | * [Vega](http://trifacta.github.io/vega/).
7 | * [Vincent](http://vincent.readthedocs.org/en/latest/).
8 | * [ggplot2](http://ggplot2.org/).
9 | * [ggplot for Python](http://blog.yhathq.com/posts/ggplot-for-python.html).
10 | * [VizQL](http://www.tableausoftware.com/fr-fr/products/technology).
11 | * [prettyplotlib](http://github.com/olgabot/prettyplotlib).
12 | * [Seaborn](http://github.com/mwaskom/seaborn).
13 | * [Bokeh](http://bokeh.pydata.org/docs/gallery.html).
14 | * [plotly](http://plot.ly).
15 | * [d3js gallery](http://github.com/mbostock/d3/wiki/Gallery).
16 | * [mpld3](http://github.com/mpld3/mpld3).
17 |
--------------------------------------------------------------------------------
/references/chapter03_notebook.md:
--------------------------------------------------------------------------------
1 | # 3. Master the Notebook
2 |
3 | Full list of references in Chapter 3 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 |
5 |
6 | ## Recipe 2
7 |
8 | * [Documentation of nbconvert](http://ipython.org/ipython-doc/dev/interactive/nbconvert.html).
9 | * [A list of conversion examples with nbconvert](https://github.com/ipython/nbconvert-examples).
10 | * [JSON on Wikipedia](http://en.wikipedia.org/wiki/JSON).
11 |
12 |
13 | ## Recipe 6
14 |
15 | * [Official example about custom widgets](http://nbviewer.ipython.org/urls/raw.githubusercontent.com/ipython/ipython/master/examples/widgets/Part%206%20-%20Custom%20Widget.ipynb).
16 | * [MVC pattern in Wikipedia](http://en.wikipedia.org/wiki/Model%E2%80%93view%E2%80%93controller).
17 | * [Backbone.js](http://backbonejs.org/).
18 | * [Course on Backbone.js](https://www.codeschool.com/courses/anatomy-of-backbonejs).
19 | * [IPEP 21: widget messages (comms)](https://github.com/ipython/ipython/wiki/IPEP-21%3A-Widget-Messages).
20 | * [IPEP 23: IPython widgets](https://github.com/ipython/ipython/wiki/IPEP-23%3A-Backbone.js-Widgets).
21 |
22 |
23 |
--------------------------------------------------------------------------------
/errata.md:
--------------------------------------------------------------------------------
1 | # Errata
2 |
3 | You can report inaccuracies or errors in the [GitHub issue tracker](https://github.com/ipython-books/cookbook-code/issues). Even better, propose your own corrections by submitting a pull request!
4 |
5 | * [Erratum #1](https://github.com/ipython-books/cookbook-code/pull/1) in [Featured Recipe 1](http://ipython-books.github.io/featured-01.html): *Correction about strides and copying*, by [Chris Beaumont](https://github.com/ChrisBeaumont).
6 | * [Erratum #2](https://github.com/ipython-books/cookbook-code/issues/2) in [Featured Recipe 1](http://ipython-books.github.io/featured-01.html): *Precision on explanation of shared memory*, by [Michael Droettboom](https://github.com/mdboom).
7 | * [Erratum #3](https://github.com/ipython-books/cookbook-code/issues/3) in [recipe 4.9](http://nbviewer.ipython.org/github/ipython-books/cookbook-code/blob/master/notebooks/chapter04_optimization/09_memmap.ipynb): *Added file mode in `memmap`*, by [Renaud Blanch](http://iihm.imag.fr/blanch/).
8 | * [Erratum #9](https://github.com/ipython-books/cookbook-code/pull/9) in [recipe 7.8](http://nbviewer.ipython.org/github/ipython-books/cookbook-code/blob/master/notebooks/chapter07_stats/08_r.ipynb): *Replaced old `rmagic` IPython extension by new `rpy2.ipython` for the `%R` magic command*, by [@mpelko](https://github.com/mpelko).
9 |
--------------------------------------------------------------------------------
/references/chapter14_graphs.md:
--------------------------------------------------------------------------------
1 | # 14. Graphs and Geometry
2 |
3 | Full list of references in Chapter 14 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 |
5 |
6 | ## Recipe 3
7 |
8 | * [Topological sort](http://en.wikipedia.org/wiki/Topological_sorting).
9 | * [DAG](http://en.wikipedia.org/wiki/Directed_acyclic_graph).
10 |
11 | ## Recipe 4
12 |
13 | * [Connected component labeling](http://en.wikipedia.org/wiki/Connected-component_labeling).
14 | * [Flood fill algorithm](http://en.wikipedia.org/wiki/Flood_fill).
15 | * [Connected components](http://en.wikipedia.org/wiki/Connected_component_%28graph_theory%29).
16 |
17 | ## Recipe 5
18 |
19 | * [Voronoi diagram](http://en.wikipedia.org/wiki/Voronoi_diagram).
20 | * [Delaunay triangulation](http://en.wikipedia.org/wiki/Delaunay_triangulation).
21 |
22 |
23 | ## Recipe 7
24 |
25 | * [Shortest path](http://en.wikipedia.org/wiki/Shortest_path_problem).
26 | * [Dijkstra's algorithm](http://en.wikipedia.org/wiki/Dijkstra%27s_algorithm).
27 | * [Geographical distance](http://en.wikipedia.org/wiki/Geographical_distance).
28 | * [Great circle](http://en.wikipedia.org/wiki/Great_Circle).
29 | * [Great-circle distance](http://en.wikipedia.org/wiki/Great-circle_distance).
30 |
31 |
--------------------------------------------------------------------------------
/references/chapter01_intro.md:
--------------------------------------------------------------------------------
1 | # 1. A Tour of Interactive Computing with IPython
2 |
3 | Full list of references in Chapter 1 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 | ## Recipe 1
5 |
6 | * [Official page about the notebook](http://ipython.org/notebook).
7 | * [Documentation of the notebook](http://ipython.org/ipython-doc/dev/interactive/notebook.html).
8 | * [Official notebook examples](https://github.com/ipython/ipython/tree/master/examples/notebooks).
9 | * [User-curated gallery of interesting notebooks](https://github.com/ipython/ipython/wiki/A-gallery-of-interesting-IPython-Notebooks).
10 | * [Official tutorial on the interactive widgets](http://nbviewer.ipython.org/github/ipython/ipython/tree/master/examples/widgets/).
11 |
12 | ## Recipe 2
13 |
14 |
15 |
16 | ## Recipe 3
17 |
18 | * [Introduction to the ndarray on NumPy's documentation](http://docs.scipy.org/doc/numpy/reference/arrays.ndarray.html).
19 | * [Tutorial on the NumPy array](http://wiki.scipy.org/Tentative_NumPy_Tutorial).
20 | * [The NumPy array in the SciPy lectures notes](http://scipy-lectures.github.io/intro/numpy/array_object.html).
21 |
22 | ## Recipe 4
23 |
24 | * [Documentation of IPython's extension system](http://ipython.org/ipython-doc/dev/config/extensions/).
25 | * [Defining new magic commands](http://ipython.org/ipython-doc/dev/interactive/reference.html#defining-magics).
26 | * [Index of IPython extensions](https://github.com/ipython/ipython/wiki/Extensions-Index).
27 |
28 |
--------------------------------------------------------------------------------
/LICENSE.txt:
--------------------------------------------------------------------------------
1 | The code is free and can be used for any purpose.
2 | Packt Publishing and the author Dr. Cyrille Rossant cannot be held liable for
3 | any use or result of the book's text or code.
4 | Further copyright & license info is found in the book's Copyright page.
5 | The book can be obtained on "http://ipython-books.github.io".
6 |
7 | Redistribution and use in source and binary forms, with or without modification,
8 | are permitted provided that the following conditions are met:
9 |
10 | * Redistributions of source code must retain the above copyright notice, this
11 | list of conditions and the following disclaimer.
12 | * Redistributions in binary form must reproduce the above copyright notice, this
13 | list of conditions and the following disclaimer in the documentation and/or
14 | other materials provided with the distribution.
15 |
16 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
17 | ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
18 | WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
19 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
20 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
21 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
22 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
23 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
24 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
25 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
26 |
--------------------------------------------------------------------------------
/references/chapter10_signal.md:
--------------------------------------------------------------------------------
1 | # 10. Signal Processing
2 |
3 | Full list of references in Chapter 10 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 |
5 |
6 | ## Recipe 1
7 |
8 | * [Introduction to the FFT with SciPy](http://scipy-lectures.github.io/intro/scipy.html#fast-fourier-transforms-scipy-fftpack).
9 | * [Reference documentation for the fftpack in SciPy](http://docs.scipy.org/doc/scipy/reference/fftpack.html).
10 | * [Fourier Transform](http://en.wikipedia.org/wiki/Fourier_transform).
11 | * [Discrete Fourier Transform](http://en.wikipedia.org/wiki/Discrete_Fourier_transform).
12 | * [Fast Fourier Transform](http://en.wikipedia.org/wiki/Fast_Fourier_transform).
13 |
14 |
15 | ## Recipe 2
16 |
17 | * [Digital signal processing](http://en.wikipedia.org/wiki/Digital_signal_processing).
18 | * [Linear filter](http://en.wikipedia.org/wiki/Linear_filter).
19 | * [LTI filters](http://en.wikipedia.org/wiki/LTI_system_theory).
20 | * [Impulse response](http://en.wikipedia.org/wiki/Impulse_response).
21 | * [Convolution](http://en.wikipedia.org/wiki/Convolution).
22 | * [FIR filters](http://en.wikipedia.org/wiki/Finite_impulse_response).
23 | * [IIR filters](http://en.wikipedia.org/wiki/Infinite_impulse_response).
24 | * [Low-pass filters](http://en.wikipedia.org/wiki/Low-pass_filter).
25 | * [High-pass filters](http://en.wikipedia.org/wiki/High-pass_filter).
26 | * [Band-pass filters](http://en.wikipedia.org/wiki/Band-pass_filter).
27 |
28 |
29 | ## Recipe 3
30 |
31 |
32 |
33 |
--------------------------------------------------------------------------------
/references/chapter12_deterministic.md:
--------------------------------------------------------------------------------
1 | # 12. Deterministic Dynamical Systems
2 |
3 | Full list of references in Chapter 12 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 |
5 |
6 | ## Recipe 1
7 |
8 | * [Chaos theory](http://en.wikipedia.org/wiki/Chaos_theory).
9 | * [Complex systems](http://en.wikipedia.org/wiki/Complex_system), which can exhibit chaotic dynamics.
10 | * The [logistic map](http://en.wikipedia.org/wiki/Logistic_map).
11 | * [Iterated functions (discrete dynamical systems)](http://en.wikipedia.org/wiki/Iterated_function).
12 | * [Bifurcation diagrams](http://en.wikipedia.org/wiki/Bifurcation_diagram).
13 | * [Lyapunov exponent](http://en.wikipedia.org/wiki/Lyapunov_exponent).
14 |
15 | ## Recipe 2
16 |
17 | * [Cellular automata](http://en.wikipedia.org/wiki/Cellular_automaton).
18 | * [Elementary cellular automata](http://en.wikipedia.org/wiki/Elementary_cellular_automaton).
19 | * [Rule 110](http://en.wikipedia.org/wiki/Rule_110), a Turing-complete automaton.
20 | * The [Wolfram code](http://en.wikipedia.org/wiki/Wolfram_code) assigns a 1D elementary cellular automaton to any number between 0 and 255.
21 |
22 | ## Recipe 3
23 |
24 |
25 | ## Recipe 4
26 |
27 | * [Partial differential equations](http://en.wikipedia.org/wiki/Partial_differential_equation).
28 | * [Reaction-diffusion systems](http://en.wikipedia.org/wiki/Reaction%E2%80%93diffusion_system).
29 | * [FitzHugh–Nagumo system](http://en.wikipedia.org/wiki/FitzHugh%E2%80%93Nagumo_equation).
30 | * [Neumann boundary conditions](http://en.wikipedia.org/wiki/Neumann_boundary_condition).
31 | * [Von Neumann stability analysis](http://en.wikipedia.org/wiki/Von_Neumann_stability_analysis).
32 |
33 |
--------------------------------------------------------------------------------
/references/chapter09_numopt.md:
--------------------------------------------------------------------------------
1 | # 9. Numerical Optimization
2 |
3 | Full list of references in Chapter 9 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 |
5 |
6 | ## Recipe 1
7 |
8 | * [Documentation of scipy.optimize](http://docs.scipy.org/doc/scipy/reference/optimize.html#root-finding).
9 | * [A course on root finding with SciPy](http://quant-econ.net/scipy.html#roots-and-fixed-points).
10 | * The [Bisection method](http://en.wikipedia.org/wiki/Bisection_method).
11 | * The [intermediate value theorem](http://en.wikipedia.org/wiki/Intermediate_value_theorem).
12 | * [Brent's method](http://en.wikipedia.org/wiki/Brent%27s_method).
13 | * [Newton's method](http://en.wikipedia.org/wiki/Newton%27s_method).
14 |
15 | ## Recipe 2
16 |
17 | * [SciPy.optimize reference documentation](http://docs.scipy.org/doc/scipy/reference/optimize.html).
18 | * [An excellent lecture on mathematical optimization with SciPy](http://scipy-lectures.github.io/advanced/mathematical_optimization/).
19 | * Definition of the [gradient](http://en.wikipedia.org/wiki/Gradient).
20 | * [Newton's method](http://en.wikipedia.org/wiki/Newton%27s_method_in_optimization).
21 | * [Quasi-Newton methods](http://en.wikipedia.org/wiki/Quasi-Newton_method).
22 | * [Simulated annealing](http://en.wikipedia.org/wiki/Simulated_annealing).
23 | * [Metaheuristics for function minimization](http://en.wikipedia.org/wiki/Metaheuristic).
24 | * [The CMA-ES algorithm](http://en.wikipedia.org/wiki/CMA-ES).
25 | * [A Python implementation of CMA-ES](http://www.lri.fr/~hansen/cmaes_inmatlab.html#python).
26 |
27 | ## Recipe 3
28 |
29 | * [Reference documentation of `curve_fit`](http://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.curve_fit.html).
30 | * [Non-linear least squares](http://en.wikipedia.org/wiki/Non-linear_least_squares).
31 | * [Levenberg-Marquardt algorithm](http://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm).
32 |
33 | ## Recipe 4
34 |
35 | * [Potential energy](http://en.wikipedia.org/wiki/Potential_energy).
36 | * [Elastic potential energy](http://en.wikipedia.org/wiki/Elastic_potential_energy).
37 | * [Hooke's law, which is the linear approximation of the spring's response](http://en.wikipedia.org/wiki/Hooke%27s_law).
38 | * [Principle of minimum energy](http://en.wikipedia.org/wiki/Minimum_total_potential_energy_principle).
39 |
40 |
--------------------------------------------------------------------------------
/notebooks/chapter04_optimization/01_timeit.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 3,
3 | "nbformat_minor": 0,
4 | "worksheets": [
5 | {
6 | "cells": [
7 | {
8 | "source": [
9 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
10 | ],
11 | "cell_type": "markdown",
12 | "metadata": []
13 | },
14 | {
15 | "source": [
16 | "# 4.1. Evaluating the time taken by a statement in IPython"
17 | ],
18 | "cell_type": "markdown",
19 | "metadata": {}
20 | },
21 | {
22 | "cell_type": "code",
23 | "language": "python",
24 | "outputs": [],
25 | "collapsed": false,
26 | "input": [
27 | "n = 100000"
28 | ],
29 | "metadata": {}
30 | },
31 | {
32 | "cell_type": "code",
33 | "language": "python",
34 | "outputs": [],
35 | "collapsed": false,
36 | "input": [
37 | "%timeit sum([1. / i**2 for i in range(1, n)])"
38 | ],
39 | "metadata": {}
40 | },
41 | {
42 | "cell_type": "code",
43 | "language": "python",
44 | "outputs": [],
45 | "collapsed": false,
46 | "input": [
47 | "%%timeit s = 0.\n",
48 | "for i in range(1, n):\n",
49 | " s += 1. / i**2"
50 | ],
51 | "metadata": {}
52 | },
53 | {
54 | "cell_type": "code",
55 | "language": "python",
56 | "outputs": [],
57 | "collapsed": false,
58 | "input": [
59 | "import numpy as np"
60 | ],
61 | "metadata": {}
62 | },
63 | {
64 | "cell_type": "code",
65 | "language": "python",
66 | "outputs": [],
67 | "collapsed": false,
68 | "input": [
69 | "%timeit np.sum(1. / np.arange(1., n) ** 2)"
70 | ],
71 | "metadata": {}
72 | },
73 | {
74 | "source": [
75 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n\n> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
76 | ],
77 | "cell_type": "markdown",
78 | "metadata": {}
79 | }
80 | ],
81 | "metadata": {}
82 | }
83 | ],
84 | "metadata": {
85 | "name": "",
86 | "signature": "sha256:a6393788ee72dcb8749be659262e5ab934e2505c54a2779bed8fe9896fe5b0e6"
87 | }
88 | }
--------------------------------------------------------------------------------
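The notebook above relies on IPython's `%timeit` and `%%timeit` magics. Outside IPython, the same comparison of the pure-Python variants can be sketched with the standard-library `timeit` module (the NumPy variant is omitted so the sketch has no third-party dependency):

```python
import math
import timeit

n = 100000

def comprehension_sum():
    # Partial sum of the Basel series sum(1/i^2), which converges to pi^2 / 6.
    return sum([1. / i ** 2 for i in range(1, n)])

def loop_sum():
    # Same sum accumulated with an explicit for loop.
    s = 0.
    for i in range(1, n):
        s += 1. / i ** 2
    return s

# timeit.timeit runs the callable `number` times and returns the total seconds.
t_comp = timeit.timeit(comprehension_sum, number=10)
t_loop = timeit.timeit(loop_sum, number=10)
print("comprehension: %.4f s   loop: %.4f s" % (t_comp, t_loop))
print("value: %.6f (pi^2/6 = %.6f)" % (comprehension_sum(), math.pi ** 2 / 6))
```

Unlike `%timeit`, `timeit.timeit` does not pick the number of repetitions adaptively, so choose `number` large enough to dominate timer overhead.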
/tools/genfeatured.py:
--------------------------------------------------------------------------------
1 | import sys
2 | import os
3 | import os.path as op
4 | import re
5 |
6 | from util import get_recipe_number, get_recipe_name
7 |
8 | current_dir = op.dirname(os.path.abspath(__file__))
9 | code_dir = op.join(current_dir, '../')
10 | featured_dir = op.join(code_dir, 'featured/')
11 | site_dir = op.join(current_dir, '../../ipython-books.github.io')
12 |
13 | index_path = op.join(site_dir, 'index.html')
14 |
15 | def get_title(notebook_filename):
16 | return get_recipe_name(op.join(featured_dir, notebook_filename))
17 |
18 | def get_snippet(name):
19 | with open(index_path, 'r') as f:
20 | contents = f.read()
21 |         start = contents.index('<!-- {0} -->'.format(name))  # assumed marker; original lost in this dump
22 |         end = contents.index('<!-- /{0} -->'.format(name))  # assumed marker; original lost in this dump
23 | return contents[start:end]
24 |
25 | def get_navbar():
26 | return get_snippet('NAVBAR')
27 |
28 | def get_footer():
29 | return get_snippet('FOOTER')
30 |
31 | NAVBAR = get_navbar()
32 | FOOTER = get_footer()
33 |
34 | def transform_featured(notebook_filename):
35 | notebook_filename = op.basename(notebook_filename)
36 | number = int(notebook_filename[:2])
38 | input_path = op.realpath(op.join(featured_dir, notebook_filename))
39 | output_path = op.realpath(op.join(site_dir,
40 | 'featured-{0:02d}'.format(number)))
41 |
42 | # Get the recipe's title.
43 | title = get_title(input_path)
44 |
45 | # Generate the nbconvert command.
46 | command = ('ipython nbconvert {f} --to html '
47 | '--template featured.tpl --output {of}').format(
48 | f=input_path,
49 | of=output_path)
50 | os.system(command)
51 |
52 | output_path += '.html'
53 | # Replace the templates: title, navbar, footer.
54 | with open(output_path, 'r') as f:
55 | contents = f.read()
56 | contents = contents.replace('%TITLE%', title)
57 | contents = contents.replace('%NAVBAR%', NAVBAR)
58 | contents = contents.replace('%FOOTER%', FOOTER)
59 | with open(output_path, 'w') as f:
60 | f.write(contents)
61 |
62 | def main():
63 | if len(sys.argv) == 1:
64 | for nb in sorted([x for x in os.listdir(featured_dir)
65 | if x.endswith('.ipynb')]):
66 | print(nb)
67 | transform_featured(nb)
68 | else:
69 | transform_featured(sys.argv[1])
70 |
71 | if __name__ == '__main__':
72 | main()
73 |
--------------------------------------------------------------------------------
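The delimiter strings inside `get_snippet` did not survive this dump (the `index` calls are left searching for an empty string). Assuming a hypothetical HTML-comment convention of the form `<!-- NAME --> ... <!-- /NAME -->` in the site's `index.html`, the extraction can be sketched as:

```python
def get_snippet(contents, name):
    # Assumed convention (hypothetical): each reusable snippet in index.html
    # is bracketed by <!-- NAME --> ... <!-- /NAME --> comments.
    start_marker = '<!-- {0} -->'.format(name)
    end_marker = '<!-- /{0} -->'.format(name)
    start = contents.index(start_marker)
    end = contents.index(end_marker) + len(end_marker)
    return contents[start:end]

html = '<body><!-- NAVBAR --><nav>...</nav><!-- /NAVBAR --><p>content</p></body>'
snippet = get_snippet(html, 'NAVBAR')
print(snippet)
```

`str.index` raises `ValueError` when a marker is missing, which is a reasonable failure mode here: a silent fallback would publish pages with an empty navbar or footer.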
/tools/gentoc.py:
--------------------------------------------------------------------------------
1 | """Generate table of contents with links to nbviewer."""
2 |
3 | import re
4 | import urlparse
5 | import json
6 | import os
7 | import sys
8 | import os.path as op
9 |
10 | from util import get_recipe_number, get_recipe_name
11 |
12 | CHAPTER_NAMES = [
13 | 'A Tour of Interactive Computing with IPython',
14 | 'Best practices in Interactive Computing',
15 | 'Master the Notebook',
16 | 'Profiling and Optimization',
17 | 'High-Performance Computing',
18 | 'Advanced Visualization',
19 | 'Statistical Data Analysis',
20 | 'Machine Learning',
21 | 'Numerical Optimization',
22 | 'Signal Processing',
23 | 'Image and Audio Processing',
24 | 'Deterministic Dynamical Systems',
25 | 'Stochastic Dynamical Systems',
26 | 'Graphs and Geometry',
27 | 'Symbolic and Numerical Mathematics',
28 | ]
29 |
30 | def get_chapter_number(dir):
31 | return int(op.basename(dir)[7:9])
32 |
33 | def get_chapter_name(number):
34 | return CHAPTER_NAMES[number-1]
35 |
36 | def iter_chapters(root):
37 | for chapter in sorted([_ for _ in os.listdir(root)
38 | if _.startswith('chapter')]):
39 | yield chapter
40 |
41 | def iter_recipes(root, dir):
42 | files = sorted([_ for _ in os.listdir(op.join(root, dir))
43 | if re.match(r'\d{2}\_[^.]+\.ipynb', _)])
44 | for file in files:
45 | yield file
46 |
47 | def yield_chapter_toc(root):
48 | for chapter in iter_chapters(root):
49 | number = get_chapter_number(chapter)
50 | chapter_name = get_chapter_name(number)
51 | yield chapter, '### Chapter {number}: {name}\n\n'.format(number=number,
52 | name=chapter_name)
53 |
54 | def yield_recipe_toc(root, chapter):
55 | for recipe in iter_recipes(root, chapter):
56 | file = op.join(root, chapter, recipe)
57 | recipe_name = get_recipe_name(file)
58 | url = urlparse.urljoin(urlbase, chapter + '/' + recipe)
59 | yield '* [{recipe_name}]({url})\n'.format(
60 | recipe_name=recipe_name,
61 | url=url,
62 | )
63 |
64 | curdir = op.realpath(op.dirname(os.path.abspath(__file__)))
65 | root = op.realpath(op.join(curdir, '../notebooks'))
66 | urlbase = "http://nbviewer.ipython.org/github/ipython-books/cookbook-code/blob/master/notebooks/"
67 |
68 | def write_toc(root, f=sys.stdout):
69 |     # No `with f:` here: that would also close sys.stdout when no file is given.
70 |     for chapter, chapter_toc in yield_chapter_toc(root):
71 |         f.write(chapter_toc)
72 |         for recipe_toc in yield_recipe_toc(root, chapter):
73 |             f.write(recipe_toc)
74 |         f.write('\n\n')
75 |
76 | if __name__ == '__main__':
77 | write_toc(root, open('toc.md', 'w'))
78 |
79 | # for chapter, chapter_toc in yield_chapter_toc(root):
80 | # print(chapter_toc)
81 |
--------------------------------------------------------------------------------
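`gentoc.py` targets Python 2, where `urljoin` lives in the `urlparse` module. Under Python 3 the same nbviewer-link construction uses `urllib.parse.urljoin`; with a base URL ending in `/` and a relative path, it simply appends:

```python
from urllib.parse import urljoin  # Python 3 home of Python 2's urlparse.urljoin

urlbase = ("http://nbviewer.ipython.org/github/ipython-books/"
           "cookbook-code/blob/master/notebooks/")

# Joining a relative path onto a base that ends in '/' appends it verbatim.
url = urljoin(urlbase, "chapter04_optimization/01_timeit.ipynb")
print(url)
```

Note that `urljoin` resolves rather than concatenates: a leading `/` or a base without a trailing `/` would drop path components, so the trailing slash on `urlbase` matters.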
/tools/featured.tpl:
--------------------------------------------------------------------------------
1 | {%- extends 'basic.tpl' -%}
2 | {% from 'mathjax.tpl' import mathjax %}
3 |
4 |
5 | {%- block header -%}
6 | {# The HTML head markup of this template (meta tags, stylesheet and script #}
7 | {# tags, and the page title "IPython Cookbook %TITLE%") did not survive #}
8 | {# this dump; only the MathJax include below is recoverable. #}
9 | {{ mathjax() }}
83 | {%- endblock header -%}
84 |
85 |
86 | {% block body %}
87 |
88 |
89 | %NAVBAR%
90 |
91 |
92 |
93 | {{ super() }}
94 |
95 |
96 |
97 | %FOOTER%
98 |
99 |
100 | {%- endblock body %}
101 |
102 | {% block footer %}
103 |
104 | {% endblock footer %}
105 |
--------------------------------------------------------------------------------
/references/chapter11_image.md:
--------------------------------------------------------------------------------
1 | # 11. Image Processing
2 |
3 | Full list of references in Chapter 11 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 |
5 |
6 | ## Recipe 2
7 |
8 | * [API reference of skimage.filter](http://scikit-image.org/docs/dev/api/skimage.filter.html).
9 | * [Noise reduction](http://en.wikipedia.org/wiki/Noise_reduction).
10 | * [Gaussian filter](http://en.wikipedia.org/wiki/Gaussian_filter).
11 | * [Sobel filter](http://en.wikipedia.org/wiki/Sobel_operator).
12 | * [Image denoising](http://en.wikipedia.org/wiki/Noise_reduction).
13 | * [Total variation denoising](http://en.wikipedia.org/wiki/Total_variation_denoising).
14 | * [Split Bregman algorithm](http://www.ece.rice.edu/~tag7/Tom_Goldstein/Split_Bregman.html).
15 |
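To make the filtering links above concrete, here is a minimal sketch of Gaussian smoothing; `scipy.ndimage` is used here for portability, but scikit-image offers an equivalent Gaussian filter (the image data is made up for illustration):

```python
import numpy as np
from scipy import ndimage

# A noisy test image (hypothetical data, just for illustration).
rng = np.random.RandomState(0)
img = rng.rand(64, 64)

# Gaussian smoothing with a standard deviation of 2 pixels.
smoothed = ndimage.gaussian_filter(img, sigma=2)
```

Smoothing white noise reduces its variance, which is an easy property to check on the result.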
16 | ## Recipe 3
17 |
18 | * [SciPy lectures note about image processing](http://scipy-lectures.github.io/packages/scikit-image/).
19 | * [Image segmentation](http://en.wikipedia.org/wiki/Image_segmentation).
20 | * [Otsu's method to find a threshold](http://en.wikipedia.org/wiki/Otsu's_method).
21 | * [Segmentation tutorial with scikit-image (that inspired this recipe)](http://scikit-image.org/docs/dev/user_guide/tutorial_segmentation.html).
22 | * [Mathematical morphology](http://en.wikipedia.org/wiki/Mathematical_morphology).
23 | * [API reference of skimage.morphology](http://scikit-image.org/docs/dev/api/skimage.morphology.html).
24 |
25 |
26 | ## Recipe 4
27 |
28 | * [Corner detection example with scikit-image](http://scikit-image.org/docs/dev/auto_examples/plot_corner.html).
29 | * [Image processing tutorial with scikit-image](http://blog.yhathq.com/posts/image-processing-with-scikit-image.html).
30 | * [Corner detection](http://en.wikipedia.org/wiki/Corner_detection).
31 | * [Structure tensor](http://en.wikipedia.org/wiki/Structure_tensor).
32 | * [Interest point detection](http://en.wikipedia.org/wiki/Interest_point_detection).
33 | * [API reference of `skimage.feature`](http://scikit-image.org/docs/dev/api/skimage.feature.html).
34 |
35 |
36 | ## Recipe 5
37 |
38 | * [Cascade tutorial with OpenCV (C++)](http://docs.opencv.org/doc/tutorials/objdetect/cascade_classifier/cascade_classifier.html).
39 | * [Train your own cascade](http://docs.opencv.org/doc/user_guide/ug_traincascade.html).
40 | * [OpenCV's cascade classification API reference](http://docs.opencv.org/modules/objdetect/doc/cascade_classification.html).
41 | * [Viola–Jones object detection framework](http://en.wikipedia.org/wiki/Viola%E2%80%93Jones_object_detection_framework).
42 | * [Boosting](http://en.wikipedia.org/wiki/Boosting_(machine_learning)), or how to create one strong classifier from many weak classifiers.
43 |
44 |
45 | ## Recipe 6
46 |
47 | * [Audio signal processing](http://en.wikipedia.org/wiki/Audio_signal_processing).
48 | * [Audio filters](http://en.wikipedia.org/wiki/Audio_filter).
49 | * [Voice frequency](http://en.wikipedia.org/wiki/Voice_frequency).
50 |
51 |
52 | ## Recipe 7
53 |
54 | * [Synthesizer](http://en.wikipedia.org/wiki/Synthesizer).
55 | * [Equal temperament](http://en.wikipedia.org/wiki/Equal_temperament).
56 | * [Chromatic scale](http://en.wikipedia.org/wiki/Chromatic_scale).
57 | * [Pure tone](http://en.wikipedia.org/wiki/Pure_tone).
59 | * [Timbre](http://en.wikipedia.org/wiki/Timbre).
60 |
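As a tiny sketch of the pure-tone idea referenced above (the frequency, duration, and sample rate are arbitrary choices):

```python
import numpy as np

# Synthesize a 440 Hz pure tone: one second sampled at 44100 Hz.
rate = 44100
t = np.linspace(0., 1., rate)
tone = np.sin(2 * np.pi * 440 * t)
```

In the equal-temperament scale, each semitone above this tone multiplies the frequency by `2 ** (1 / 12.)`.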
61 |
--------------------------------------------------------------------------------
/references/chapter07_stats.md:
--------------------------------------------------------------------------------
1 | # 7. Statistical Data Analysis
2 |
3 | Full list of references in Chapter 7 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 |
5 |
6 | ## Books
7 |
8 | * *Python for Data Analysis*, by Wes McKinney
9 |
10 |
11 | ## Packages
12 |
13 | * [statsmodels](http://statsmodels.sourceforge.net).
14 |
15 |
16 | ## Frequentism and Bayesianism
17 |
18 | * [Must-read: a series of posts by Jake Vanderplas](http://jakevdp.github.io/blog/2014/03/11/frequentism-and-bayesianism-a-practical-intro/).
19 | * [Misuses of Frequentist methods](http://www.refsmmat.com/statistics/).
20 |
21 |
22 | ## Hypothesis testing
23 |
24 | * [Statistical hypothesis testing](http://en.wikipedia.org/wiki/Statistical_hypothesis_testing).
25 | * [Credible interval](http://en.wikipedia.org/wiki/Credible_interval).
26 | * [MAP estimation](http://en.wikipedia.org/wiki/Maximum_a_posteriori_estimation).
27 | * [Conjugate prior](http://en.wikipedia.org/wiki/Conjugate_prior).
28 | * [Prior probability](http://en.wikipedia.org/wiki/Prior_probability#Uninformative_priors).
29 | * [Jeffreys prior](http://en.wikipedia.org/wiki/Jeffreys_prior).
30 |
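The conjugate-prior and MAP links above can be illustrated with the classic beta-binomial case (a sketch; the counts are made up):

```python
from scipy.stats import beta

# Beta-binomial conjugacy: with a uniform Beta(1, 1) prior and
# k successes out of n trials, the posterior is Beta(1 + k, 1 + n - k).
n, k = 10, 7                      # hypothetical data
posterior = beta(1 + k, 1 + n - k)

# The MAP estimate is the posterior mode: (a - 1) / (a + b - 2),
# which reduces to k / n with this uniform prior.
map_estimate = (1 + k - 1.) / (1 + k + 1 + n - k - 2.)
```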
31 |
32 | ## Correlations
33 |
34 | * [Correlation does not imply causation](http://en.wikipedia.org/wiki/Correlation_does_not_imply_causation).
35 | * [Pearson correlation](http://en.wikipedia.org/wiki/Pearson_product-moment_correlation_coefficient).
36 |
37 |
38 | ## Statistical tests
39 |
40 | * [Chi2 test in SciPy](http://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.chi2_contingency.html).
41 | * [Contingency table](http://en.wikipedia.org/wiki/Contingency_table).
42 | * [Chi square test](http://en.wikipedia.org/wiki/Pearson's_chi-squared_test).
43 |
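As a minimal sketch of the SciPy chi-squared test cited above (the contingency-table counts are made up for illustration):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table (rows: groups, columns: outcomes).
table = np.array([[30, 10],
                  [20, 40]])

# Test the null hypothesis that rows and columns are independent.
chi2, p, dof, expected = chi2_contingency(table)
```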
44 |
45 | ## Maximum likelihood
46 |
47 | * [Maximum likelihood](http://en.wikipedia.org/wiki/Maximum_likelihood).
48 | * [Kolmogorov-Smirnov test](http://en.wikipedia.org/wiki/Kolmogorov-Smirnov_test).
49 | * [Goodness of fit](http://en.wikipedia.org/wiki/Goodness_of_fit).
50 |
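The Kolmogorov-Smirnov test linked above is available directly in SciPy; a minimal sketch on synthetic data:

```python
import numpy as np
from scipy.stats import kstest

# Hypothetical sample; kstest compares its empirical distribution
# to a reference distribution (here, the standard normal).
rng = np.random.RandomState(42)
data = rng.randn(500)
stat, p = kstest(data, 'norm')
```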
51 |
52 | ## Kernel Density Estimation
53 |
54 | * [Kernel Density Estimation](http://en.wikipedia.org/wiki/Kernel_density_estimation).
55 | * [Bias-variance dilemma](http://en.wikipedia.org/wiki/Bias-variance_dilemma).
56 | * [A post about KDE by Jake Vanderplas](http://jakevdp.github.io/blog/2013/12/01/kernel-density-estimation/).
57 |
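A minimal KDE sketch with SciPy (the sample is synthetic; the grid bounds are arbitrary):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical one-dimensional sample.
rng = np.random.RandomState(0)
samples = rng.randn(1000)

# Fit a Gaussian kernel density estimate and evaluate it on a grid.
kde = gaussian_kde(samples)
grid = np.linspace(-3, 3, 100)
density = kde(grid)
```

The bandwidth choice (here, SciPy's default rule) is exactly where the bias-variance dilemma linked above shows up.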
58 |
59 | ## Bayesian programming
60 |
61 | * [PyMC model fitting](http://pymc-devs.github.io/pymc/modelfitting.html).
62 | * [A great PyMC tutorial which we largely took inspiration from](http://pymc-devs.github.io/pymc/tutorial.html).
63 | * [A must-read free ebook on the subject, by Cameron Davidson-Pilon, entirely written in the IPython notebook!](http://camdavidsonpilon.github.io/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/).
64 | * [The Markov chain Monte Carlo (MCMC) method](http://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo).
65 | * [The Metropolis-Hastings algorithm](http://en.wikipedia.org/wiki/Metropolis-Hastings_algorithm).
66 |
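The Metropolis-Hastings idea linked above fits in a few lines of NumPy; this is a sketch of the algorithm on a standard normal target, not PyMC code (all names and tuning choices here are arbitrary):

```python
import numpy as np

rng = np.random.RandomState(0)

def log_target(x):
    return -0.5 * x * x  # log-density of N(0, 1), up to a constant

x = 0.0
samples = []
for _ in range(5000):
    proposal = x + rng.normal(scale=1.0)   # symmetric random-walk proposal
    # Accept with probability min(1, target(proposal) / target(x)).
    if np.log(rng.rand()) < log_target(proposal) - log_target(x):
        x = proposal
    samples.append(x)
samples = np.array(samples)
```

With a symmetric proposal, the Hastings correction term cancels, which is why only the target log-densities appear in the acceptance test.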
67 |
68 | ## Linear regression
69 |
70 | * [Regression analysis](http://en.wikipedia.org/wiki/Regression_analysis).
71 | * [Least squares method](http://en.wikipedia.org/wiki/Linear_least_squares_(mathematics)).
72 |
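A minimal least-squares sketch with NumPy (the data is synthetic, generated from a known line plus noise):

```python
import numpy as np

# Synthetic data from y = 2*x + 1 with a little Gaussian noise.
rng = np.random.RandomState(1)
x = np.linspace(0., 1., 50)
y = 2. * x + 1. + .01 * rng.randn(50)

# Design matrix: one column for the slope, one for the intercept.
A = np.column_stack([x, np.ones_like(x)])
coef, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
```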
73 |
74 | ## R
75 |
76 | * [The IPython R magic](http://ipython.org/ipython-doc/dev/config/extensions/rmagic.html).
77 | * [Rpy2](http://rpy.sourceforge.net/rpy2.html).
78 | * [Official introduction to R](http://cran.r-project.org/doc/manuals/R-intro.html).
79 | * [R tutorial](http://www.cyclismo.org/tutorial/R/).
80 | * [CRAN: Comprehensive R Archive Network, containing many packages for R](http://cran.r-project.org).
81 | * [IPython and R tutorial](http://nbviewer.ipython.org/github/ipython/ipython/blob/master/examples/notebooks/R%20Magics.ipynb).
82 |
83 |
84 |
85 |
86 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | IPython Cookbook
2 | ================
3 |
4 | This repository contains the recipes of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to **high-performance scientific computing** and **data science** in Python, by [Dr. Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 500 pages, September 2014.
5 |
6 | [Follow me on Twitter to get all updates](https://twitter.com/cyrillerossant)
7 |
8 |
9 | ## Featured recipes
10 |
11 | A selection of free recipes from the book:
12 |
13 | 1. [Getting the best performance out of NumPy](http://ipython-books.github.io/featured-01/)
14 | 2. [Simulating a physical system by minimizing an energy](http://ipython-books.github.io/featured-02/)
15 | 3. [Creating a route planner for a road network](http://ipython-books.github.io/featured-03/)
16 | 4. [Introduction to machine learning in Python with scikit-learn](http://ipython-books.github.io/featured-04/)
17 | 5. [Simulating a partial differential equation: reaction-diffusion systems and Turing patterns](http://ipython-books.github.io/featured-05/)
18 | 6. [Getting started with Vispy](http://ipython-books.github.io/featured-06/)
19 | * more coming soon...
20 |
21 |
22 | ## Table of contents
23 |
24 | * [See the full table of contents here](toc.md)
25 |
26 |
27 | ## Errata
28 |
29 | You can report inaccuracies or errors in the [GitHub issue tracker](https://github.com/ipython-books/cookbook-code/issues). Even better, propose your own corrections by submitting a pull request!
30 |
31 | * [See the list of errata here](errata.md)
32 |
33 |
34 | ## Example data
35 |
36 | [You will find the data used in the recipes here](https://github.com/ipython-books/cookbook-data).
37 |
38 |
39 | ## Structure of the repository
40 |
41 | The repository is organized as follows:
42 |
43 | ```
44 | notebooks/ all notebooks with the code of all examples
45 | chapter01_tour/
46 | chapter02_best_practices/
47 | ...
48 | extra/ extra code examples that didn't make it into the book
49 | guests/ guest recipes
50 | featured/ a selection of complete recipes with all text, figures and code
51 | references/ a curated list of references about scientific Python programming
52 | tools/ various Python build scripts
53 | ```
54 |
55 |
56 | ## Installation
57 |
58 | You need Python 3 (or 2) and several scientific modules for the code examples, mainly IPython 2.0+, NumPy, SciPy, Pandas, and matplotlib. Many recipes that require other modules come with the appropriate installation instructions.
59 |
60 | We highly recommend that you use an all-in-one Python distribution like [Anaconda](http://continuum.io/downloads). This distribution comes with an excellent package manager named *conda*. It lets you easily install many modules on most platforms (Windows, Linux, Mac OS X), in 64-bit (recommended if you have a 64-bit OS) or 32-bit.
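For example, once Anaconda is installed, the core dependencies can typically be installed with a single conda command (exact package names may differ depending on your conda version):

```
conda install numpy scipy pandas matplotlib ipython
```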
61 |
62 | The recipes are written for Python 3 first, but they also work with Python 2. Please favor Python 3 over Python 2 if you can.
63 |
64 |
65 | ## Cloning the repository
66 |
67 | You need [git](http://git-scm.com/), a distributed versioning system, to download a local copy of this repository. Open a terminal and type:
68 |
69 | ```
70 | git clone https://github.com/ipython-books/cookbook-code.git
71 | ```
72 |
73 | This will copy the repository into a local folder named `cookbook-code`.
74 |
75 |
76 | ## Running the examples
77 |
78 | Launch the IPython notebook server with:
79 |
80 | ```
81 | ipython notebook
82 | ```
83 |
84 | In your browser, go to `127.0.0.1:8888`. You can then browse the repository and open the notebooks.
85 |
86 |
87 | ## Contribute
88 |
89 | You are welcome to contribute to this repository. You can use the issue tracker to report any problem. You can also submit a pull request (PR) to fix an error, add some information, or even contribute a brand new recipe in the `guests/` folder!
90 |
91 |
92 |
--------------------------------------------------------------------------------
/references/chapter04_optimization.md:
--------------------------------------------------------------------------------
1 | # 4. Profiling and Optimization
2 |
3 | Full list of references in Chapter 4 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 400 pages, 2014.
4 |
5 | ## Profiling
6 |
7 | * [Program optimization on Wikipedia](http://en.wikipedia.org/wiki/Program_optimization).
8 | * [Python's `timeit` module](http://docs.python.org/3/library/timeit.html).
9 | * [Python's `profile` module](http://docs.python.org/3/library/profile.html).
10 | * [`easy_profiler` that simplifies the use of Python profiling tools](http://github.com/rossant/easy_profiler).
11 | * [`line_profiler`, a line-by-line profiler, for Python 2 only](http://pythonhosted.org/line_profiler/).
12 | * [How to deal with `@profile` in profiled code](http://stackoverflow.com/questions/18229628/python-profiling-using-line-profiler-clever-way-to-remove-profile-statements).
13 | * [RunSnakeRun, a viewer for profiling results](http://www.vrplumber.com/programming/runsnakerun/).
14 |
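The stdlib `timeit` module linked above is what IPython's `%timeit` magic wraps; a minimal sketch (the timed expression is arbitrary):

```python
import timeit

# Total time, in seconds, for 10000 runs of the statement.
t = timeit.timeit('sum(range(100))', number=10000)
```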
15 |
16 | ## Memory profiling
17 |
18 | * [`memory_profiler` package](http://pypi.python.org/pypi/memory_profiler).
19 | * [`psutil` package, required by `memory_profiler`](http://pypi.python.org/pypi/psutil).
20 | * [Guppy-PE, a memory profiling package](http://guppy-pe.sourceforge.net).
21 | * [PySizer, a memory profiler for Python](http://pysizer.8325.org).
22 | * [Pympler, another memory profiler](http://code.google.com/p/pympler/).
23 |
24 |
25 | ## Code tracing
26 |
27 | * [Python's `trace` module](https://docs.python.org/3/library/trace.html).
28 | * [Online Python Tutor, a great educational interactive tool to visualize the step-by-step execution of Python code](http://pythontutor.com/).
29 | * [Some Python profiling tools](http://blog.ionelmc.ro/2013/06/08/python-profiling-tools/).
30 |
31 |
32 | ## NumPy code optimization
33 |
34 | * [Broadcasting rules and examples](http://docs.scipy.org/doc/numpy/user/basics.broadcasting.html).
35 | * [Array interface in NumPy](http://docs.scipy.org/doc/numpy/reference/arrays.interface.html).
36 | * [Locality of reference](http://en.wikipedia.org/wiki/Locality_of_reference).
37 | * [Internals of NumPy in the SciPy lectures notes](http://scipy-lectures.github.io/advanced/advanced_numpy/).
38 | * [The complete list of NumPy routines is available on the NumPy Reference Guide](http://docs.scipy.org/doc/numpy/reference/).
39 | * [List of indexing routines](http://docs.scipy.org/doc/numpy/reference/routines.indexing.html).
40 | * [NumPy Performance Tricks](http://cyrille.rossant.net/numpy-performance-tricks/).
41 | * [100 NumPy exercises](http://www.loria.fr/~rougier/teaching/numpy.100/index.html).
42 |
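As a tiny illustration of the broadcasting rules linked above (array shapes chosen arbitrarily):

```python
import numpy as np

# Broadcasting: a (3, 1) column and a (1, 4) row combine elementwise
# into a (3, 4) array, with no explicit loops and no np.tile copies.
a = np.arange(3).reshape(3, 1)
b = np.arange(4).reshape(1, 4)
c = a + b
```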
43 |
44 | ## Memory mapping
45 |
46 | * [Documentation of `memmap`](http://docs.scipy.org/doc/numpy/reference/generated/numpy.memmap.html).
47 |
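The `memmap` interface documented above can be sketched in a few lines (the file path and array size here are arbitrary):

```python
import os
import tempfile
import numpy as np

# Write a small array to a memory-mapped file.
path = os.path.join(tempfile.mkdtemp(), 'buffer.dat')
m = np.memmap(path, dtype='float32', mode='w+', shape=(100,))
m[:] = np.arange(100, dtype='float32')
m.flush()

# Reopen read-only: pages are loaded lazily from disk on access,
# so arrays larger than RAM can be processed chunk by chunk.
r = np.memmap(path, dtype='float32', mode='r', shape=(100,))
```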
48 |
49 | ## Sparse matrices
50 |
51 | * [SciPy lecture notes about sparse matrices](http://scipy-lectures.github.io/advanced/scipy_sparse/index.html).
52 | * [Reference documentation on sparse matrices](http://docs.scipy.org/doc/scipy/reference/sparse.html).
53 |
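A minimal sketch of the sparse-matrix machinery documented above (the matrix is a made-up example):

```python
import numpy as np
from scipy import sparse

# A mostly-zero matrix stored in CSR format keeps only the
# nonzero entries, plus index arrays describing their positions.
dense = np.eye(4)
dense[0, 3] = 2.
csr = sparse.csr_matrix(dense)
```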
54 |
55 | ## HDF5
56 |
57 | * [PyTables](http://pytables.github.io)
58 | * [h5py](http://www.h5py.org)
59 | * [Partial list of HDF5 users](http://www.hdfgroup.org/users.html)
60 | * [HDF5 chunking](http://www.hdfgroup.org/HDF5/doc/Advanced/Chunking/).
61 | * [PyTables optimization guide, a must-read for PyTables users](http://pytables.github.io/usersguide/optimization.html).
62 | * [In-kernel queries in PyTables](http://pytables.github.io/usersguide/condition_syntax.html).
63 | * [An alternative to PyTables and HDF5 might come from the Blaze project, still in early development at the time of writing](http://blaze.pydata.org).
64 | * [Difference between PyTables and h5py, from the perspective of h5py](http://github.com/h5py/h5py/wiki/FAQ#whats-the-difference-between-h5py-and-pytables).
65 | * [Difference between PyTables and h5py, from the perspective of PyTables](http://www.pytables.org/moin/FAQ#HowdoesPyTablescomparewiththeh5pyproject.3F).
66 |
67 |
--------------------------------------------------------------------------------
/notebooks/chapter05_hpc/02_numexpr.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:9f3072f113fc338fc8166eb9ca37743161b7df7e8b298ed4179c78cf25168c0c"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 5.2. Accelerating array computations with Numexpr"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "Let's import NumPy and Numexpr."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import numexpr as ne"
38 | ],
39 | "language": "python",
40 | "metadata": {},
41 | "outputs": []
42 | },
43 | {
44 | "cell_type": "markdown",
45 | "metadata": {},
46 | "source": [
47 | "We generate three large vectors."
48 | ]
49 | },
50 | {
51 | "cell_type": "code",
52 | "collapsed": false,
53 | "input": [
54 | "x, y, z = np.random.rand(3, 1000000)"
55 | ],
56 | "language": "python",
57 | "metadata": {},
58 | "outputs": []
59 | },
60 | {
61 | "cell_type": "markdown",
62 | "metadata": {},
63 | "source": [
64 | "Now, we evaluate the time taken by NumPy to calculate a complex algebraic expression involving our vectors."
65 | ]
66 | },
67 | {
68 | "cell_type": "code",
69 | "collapsed": false,
70 | "input": [
71 | "%timeit x + (y**2 + (z*x + 1)*3)"
72 | ],
73 | "language": "python",
74 | "metadata": {},
75 | "outputs": []
76 | },
77 | {
78 | "cell_type": "markdown",
79 | "metadata": {},
80 | "source": [
81 | "And now, the same calculation performed by Numexpr. We need to pass the formula as a string, since Numexpr parses and compiles it."
82 | ]
83 | },
84 | {
85 | "cell_type": "code",
86 | "collapsed": false,
87 | "input": [
88 | "%timeit ne.evaluate('x + (y**2 + (z*x + 1)*3)')"
89 | ],
90 | "language": "python",
91 | "metadata": {},
92 | "outputs": []
93 | },
94 | {
95 | "cell_type": "markdown",
96 | "metadata": {},
97 | "source": [
98 | "Numexpr also makes use of multicore processors. Here, we have 4 physical cores and 8 virtual threads with hyperthreading. We can specify how many threads we want Numexpr to use."
99 | ]
100 | },
101 | {
102 | "cell_type": "code",
103 | "collapsed": false,
104 | "input": [
105 | "ne.ncores"
106 | ],
107 | "language": "python",
108 | "metadata": {},
109 | "outputs": []
110 | },
111 | {
112 | "cell_type": "code",
113 | "collapsed": false,
114 | "input": [
115 | "for i in range(1, 5):\n",
116 | " ne.set_num_threads(i)\n",
117 | " %timeit ne.evaluate('x + (y**2 + (z*x + 1)*3)')"
118 | ],
119 | "language": "python",
120 | "metadata": {},
121 | "outputs": []
122 | },
123 | {
124 | "cell_type": "markdown",
125 | "metadata": {},
126 | "source": [
127 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
128 | "\n",
129 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
130 | ]
131 | }
132 | ],
133 | "metadata": {}
134 | }
135 | ]
136 | }
--------------------------------------------------------------------------------
/notebooks/chapter06_viz/03_bokeh.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:0f5e6b9d089bf8dbe38e3159ade307fe1743171ed6aa17b1ae0c588fab091915"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 6.3. Creating interactive Web visualizations with Bokeh"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. Let's import NumPy and Bokeh. We need to call `output_notebook` in order to tell Bokeh to render plots in the IPython notebook."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import bokeh.plotting as bkh\n",
38 | "bkh.output_notebook()"
39 | ],
40 | "language": "python",
41 | "metadata": {},
42 | "outputs": []
43 | },
44 | {
45 | "cell_type": "markdown",
46 | "metadata": {},
47 | "source": [
48 | "2. Let's create some random data."
49 | ]
50 | },
51 | {
52 | "cell_type": "code",
53 | "collapsed": false,
54 | "input": [
55 | "x = np.linspace(0., 1., 100)\n",
56 | "y = np.cumsum(np.random.randn(100))"
57 | ],
58 | "language": "python",
59 | "metadata": {},
60 | "outputs": []
61 | },
62 | {
63 | "cell_type": "markdown",
64 | "metadata": {},
65 | "source": [
66 | "3. Let's draw a curve."
67 | ]
68 | },
69 | {
70 | "cell_type": "code",
71 | "collapsed": false,
72 | "input": [
73 | "bkh.line(x, y, line_width=5)\n",
74 | "bkh.show()"
75 | ],
76 | "language": "python",
77 | "metadata": {},
78 | "outputs": []
79 | },
80 | {
81 | "cell_type": "markdown",
82 | "metadata": {},
83 | "source": [
84 | "An interactive plot is rendered in the notebook. We can pan and zoom by clicking on the buttons above the plot."
85 | ]
86 | },
87 | {
88 | "cell_type": "markdown",
89 | "metadata": {},
90 | "source": [
91 | "4. Let's move to another example. We first load a sample dataset (*Iris flowers*). We also generate some colors based on the species of the flowers."
92 | ]
93 | },
94 | {
95 | "cell_type": "code",
96 | "collapsed": false,
97 | "input": [
98 | "from bokeh.sampledata.iris import flowers\n",
99 | "colormap = {'setosa': 'red',\n",
100 | " 'versicolor': 'green',\n",
101 | " 'virginica': 'blue'}\n",
102 | "flowers['color'] = flowers['species'].map(lambda x: colormap[x])"
103 | ],
104 | "language": "python",
105 | "metadata": {},
106 | "outputs": []
107 | },
108 | {
109 | "cell_type": "markdown",
110 | "metadata": {},
111 | "source": [
112 | "5. Now, we render an interactive scatter plot."
113 | ]
114 | },
115 | {
116 | "cell_type": "code",
117 | "collapsed": false,
118 | "input": [
119 | "bkh.scatter(flowers[\"petal_length\"], \n",
120 | " flowers[\"petal_width\"],\n",
121 | " color=flowers[\"color\"], \n",
122 | " fill_alpha=0.25, size=10,)\n",
123 | "bkh.show()"
124 | ],
125 | "language": "python",
126 | "metadata": {},
127 | "outputs": []
128 | },
129 | {
130 | "cell_type": "markdown",
131 | "metadata": {},
132 | "source": [
133 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
134 | "\n",
135 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
136 | ]
137 | }
138 | ],
139 | "metadata": {}
140 | }
141 | ]
142 | }
--------------------------------------------------------------------------------
/notebooks/chapter06_viz/02_seaborn.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:c1be480eb545f8c0b2a50692d2b01b0e7c2b06522f3a2653cb377665ff7df7aa"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 6.2. Creating beautiful statistical plots with seaborn"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. Let's import NumPy, matplotlib, and seaborn."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import pandas as pd\n",
38 | "import matplotlib.pyplot as plt\n",
39 | "import seaborn as sns\n",
40 | "%matplotlib inline"
41 | ],
42 | "language": "python",
43 | "metadata": {},
44 | "outputs": []
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "2. We generate a random dataset (following this example on seaborn's website: http://nbviewer.ipython.org/github/mwaskom/seaborn/blob/master/examples/linear_models.ipynb)"
51 | ]
52 | },
53 | {
54 | "cell_type": "code",
55 | "collapsed": false,
56 | "input": [
57 | "x1 = np.random.randn(80)\n",
58 | "x2 = np.random.randn(80)\n",
59 | "x3 = x1 * x2\n",
60 | "y1 = .5 + 2 * x1 - x2 + 2.5 * x3 + 3 * np.random.randn(80)\n",
61 | "y2 = .5 + 2 * x1 - x2 + 2.5 * np.random.randn(80)\n",
62 | "y3 = y2 + np.random.randn(80)"
63 | ],
64 | "language": "python",
65 | "metadata": {},
66 | "outputs": []
67 | },
68 | {
69 | "cell_type": "markdown",
70 | "metadata": {},
71 | "source": [
72 | "3. Seaborn implements many easy-to-use statistical plotting functions. For example, here is how to create a violin plot (showing the distribution of several sets of points)."
73 | ]
74 | },
75 | {
76 | "cell_type": "code",
77 | "collapsed": false,
78 | "input": [
79 | "plt.figure(figsize=(4,3));\n",
80 | "sns.violinplot([x1,x2, x3]);"
81 | ],
82 | "language": "python",
83 | "metadata": {},
84 | "outputs": []
85 | },
86 | {
87 | "cell_type": "markdown",
88 | "metadata": {},
89 | "source": [
90 | "4. Seaborn also implements all-in-one statistical visualization functions. For example, one can use a single function (`regplot`) to perform *and* display a linear regression between two variables."
91 | ]
92 | },
93 | {
94 | "cell_type": "code",
95 | "collapsed": false,
96 | "input": [
97 | "plt.figure(figsize=(4,3));\n",
98 | "sns.regplot(x2, y2);"
99 | ],
100 | "language": "python",
101 | "metadata": {},
102 | "outputs": []
103 | },
104 | {
105 | "cell_type": "markdown",
106 | "metadata": {},
107 | "source": [
108 | "5. Seaborn has built-in support for Pandas data structures. Here, we display the pairwise correlations between all variables defined in a `DataFrame`."
109 | ]
110 | },
111 | {
112 | "cell_type": "code",
113 | "collapsed": false,
114 | "input": [
115 | "df = pd.DataFrame(dict(x1=x1, x2=x2, x3=x3, \n",
116 | " y1=y1, y2=y2, y3=y3))\n",
117 | "sns.corrplot(df);"
118 | ],
119 | "language": "python",
120 | "metadata": {},
121 | "outputs": []
122 | },
123 | {
124 | "cell_type": "markdown",
125 | "metadata": {},
126 | "source": [
127 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
128 | "\n",
129 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
130 | ]
131 | }
132 | ],
133 | "metadata": {}
134 | }
135 | ]
136 | }
--------------------------------------------------------------------------------
/notebooks/chapter13_stochastic/03_brownian.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:0a8c03cbe7f461cac57d5143962241ef3d5928dd09ecb1053b6a0bbfe523fe50"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 13.3. Simulating a Brownian motion"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. Let's import NumPy and matplotlib."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import matplotlib as mpl\n",
38 | "import matplotlib.pyplot as plt\n",
39 | "%matplotlib inline"
40 | ],
41 | "language": "python",
42 | "metadata": {},
43 | "outputs": []
44 | },
45 | {
46 | "cell_type": "markdown",
47 | "metadata": {},
48 | "source": [
49 | "2. We will simulate Brownian motions with 5000 time steps."
50 | ]
51 | },
52 | {
53 | "cell_type": "code",
54 | "collapsed": false,
55 | "input": [
56 | "n = 5000"
57 | ],
58 | "language": "python",
59 | "metadata": {},
60 | "outputs": []
61 | },
62 | {
63 | "cell_type": "markdown",
64 | "metadata": {},
65 | "source": [
66 | "3. We simulate two independent one-dimensional Brownian processes to form a single two-dimensional Brownian process. The (discrete) Brownian motion makes independent Gaussian jumps at each time step (like a random walk). Therefore, we just have to compute the cumulative sum of independent normal random variables (one for each time step)."
67 | ]
68 | },
69 | {
70 | "cell_type": "code",
71 | "collapsed": false,
72 | "input": [
73 | "x = np.cumsum(np.random.randn(n))\n",
74 | "y = np.cumsum(np.random.randn(n))"
75 | ],
76 | "language": "python",
77 | "metadata": {},
78 | "outputs": []
79 | },
80 | {
81 | "cell_type": "markdown",
82 | "metadata": {},
83 | "source": [
84 | "4. Now, to display the Brownian motion, we could just do `plot(x, y)`. However, the result would be monochromatic and a bit boring. We would like to use a gradient of color to illustrate the progression of the motion in time. Matplotlib forces us to use a small hack based on `scatter`. This function lets us assign a different color to each point, at the expense of dropping the line segments between points. To work around this limitation, we linearly interpolate the process to give the illusion of a continuous line."
85 | ]
86 | },
87 | {
88 | "cell_type": "code",
89 | "collapsed": false,
90 | "input": [
91 | "k = 10 # We add 10 intermediary points between two \n",
92 | " # successive points.\n",
93 | "# We interpolate x and y.\n",
94 | "x2 = np.interp(np.arange(n*k), np.arange(n)*k, x)\n",
95 | "y2 = np.interp(np.arange(n*k), np.arange(n)*k, y)"
96 | ],
97 | "language": "python",
98 | "metadata": {},
99 | "outputs": []
100 | },
101 | {
102 | "cell_type": "code",
103 | "collapsed": false,
104 | "input": [
105 | "# Now, we draw our points with a gradient of colors.\n",
106 | "plt.scatter(x2, y2, c=range(n*k), linewidths=0,\n",
107 | " marker='o', s=3, cmap=plt.cm.jet,)\n",
108 | "plt.axis('equal');\n",
109 | "plt.xticks([]); plt.yticks([]);"
110 | ],
111 | "language": "python",
112 | "metadata": {},
113 | "outputs": []
114 | },
115 | {
116 | "cell_type": "markdown",
117 | "metadata": {},
118 | "source": [
119 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
120 | "\n",
121 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
122 | ]
123 | }
124 | ],
125 | "metadata": {}
126 | }
127 | ]
128 | }
--------------------------------------------------------------------------------
/notebooks/chapter05_hpc/11_mpi.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:311eded5d086bea3ca60a085bf3d6026103c5df55ca5b0c1a3119615bbf8811e"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python."
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {
21 | "word_id": "4818_05_mpi"
22 | },
23 | "source": [
24 | "# 5.11. Using MPI with IPython"
25 | ]
26 | },
27 | {
28 | "cell_type": "markdown",
29 | "metadata": {},
30 | "source": [
31 | "For this recipe, you need an MPI installation and the mpi4py package."
32 | ]
33 | },
34 | {
35 | "cell_type": "markdown",
36 | "metadata": {},
37 | "source": [
38 | "1. We first need to create an MPI profile with:"
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "collapsed": false,
44 | "input": [
45 | "!ipython profile create --parallel --profile=mpi"
46 | ],
47 | "language": "python",
48 | "metadata": {},
49 | "outputs": []
50 | },
51 | {
52 | "cell_type": "markdown",
53 | "metadata": {},
54 | "source": [
55 | "2. Then, we need to open `~/.ipython/profile_mpi/ipcluster_config.py` and add the line `c.IPClusterEngines.engine_launcher_class = 'MPI'`."
56 | ]
57 | },
58 | {
59 | "cell_type": "markdown",
60 | "metadata": {},
61 | "source": [
62 | "3. Once the MPI profile has been created and configured, we can launch the engines with: `ipcluster start -n 4 --engines MPI --profile=mpi` in a terminal."
63 | ]
64 | },
65 | {
66 | "cell_type": "markdown",
67 | "metadata": {},
68 | "source": [
69 |     "4. Now, to actually use the engines, we create an MPI client in the notebook."
70 | ]
71 | },
72 | {
73 | "cell_type": "code",
74 | "collapsed": false,
75 | "input": [
76 | "import numpy as np\n",
77 | "from IPython.parallel import Client"
78 | ],
79 | "language": "python",
80 | "metadata": {},
81 | "outputs": []
82 | },
83 | {
84 | "cell_type": "code",
85 | "collapsed": false,
86 | "input": [
87 | "c = Client(profile='mpi')"
88 | ],
89 | "language": "python",
90 | "metadata": {},
91 | "outputs": []
92 | },
93 | {
94 | "cell_type": "markdown",
95 | "metadata": {},
96 | "source": [
97 | "5. Let's create a view on all engines."
98 | ]
99 | },
100 | {
101 | "cell_type": "code",
102 | "collapsed": false,
103 | "input": [
104 | "view = c[:]"
105 | ],
106 | "language": "python",
107 | "metadata": {},
108 | "outputs": []
109 | },
110 | {
111 | "cell_type": "markdown",
112 | "metadata": {},
113 | "source": [
114 |     "6. In this example, we compute the sum of all integers between 0 and 15 in parallel over the four engines. We first distribute the array of 16 values across the engines (each engine gets a subarray)."
115 | ]
116 | },
117 | {
118 | "cell_type": "code",
119 | "collapsed": false,
120 | "input": [
121 | "view.scatter('a', np.arange(16., dtype='float'))"
122 | ],
123 | "language": "python",
124 | "metadata": {},
125 | "outputs": []
126 | },
127 | {
128 | "cell_type": "markdown",
129 | "metadata": {},
130 | "source": [
131 | "7. We compute the total sum in parallel using MPI's `allreduce` function. Every node makes the same computation and returns the same result."
132 | ]
133 | },
134 | {
135 | "cell_type": "code",
136 | "collapsed": false,
137 | "input": [
138 | "%%px\n",
139 | "from mpi4py import MPI\n",
140 | "import numpy as np\n",
141 | "print(MPI.COMM_WORLD.allreduce(np.sum(a), op=MPI.SUM))"
142 | ],
143 | "language": "python",
144 | "metadata": {},
145 | "outputs": []
146 | },
147 | {
148 | "cell_type": "markdown",
149 | "metadata": {},
150 | "source": [
151 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
152 | "\n",
153 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
154 | ]
155 | }
156 | ],
157 | "metadata": {}
158 | }
159 | ]
160 | }
--------------------------------------------------------------------------------
/notebooks/chapter07_stats/02_z_test.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:c10d033cecdfe07745ffec6cc2559349d9325a1f916c642166d8d96ec8aa382f"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": [],
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {
21 | "word_id": "4818_07_ztest"
22 | },
23 | "source": [
24 | "# 7.2. Getting started with statistical hypothesis testing: a simple z-test"
25 | ]
26 | },
27 | {
28 | "cell_type": "markdown",
29 | "metadata": {},
30 | "source": [
31 | "Many frequentist methods for hypothesis testing roughly involve the following steps:\n",
32 | "\n",
33 | "1. Writing down the hypotheses, notably the **null hypothesis** which is the *opposite* of the hypothesis you want to prove (with a certain degree of confidence).\n",
34 |     "2. Computing a **test statistic**, a number derived from a formula that depends on the test type, the model, the hypotheses, and the data.\n",
35 |     "3. Using the computed value to reject the null hypothesis, or fail to reject it.\n",
36 | " \n",
37 | "Here, we flip a coin $n$ times and we observe $h$ heads. We want to know whether the coin is fair (null hypothesis). This example is extremely simple yet quite good for pedagogical purposes. Besides, it is the basis of many more complex methods."
38 | ]
39 | },
40 | {
41 | "cell_type": "markdown",
42 | "metadata": {},
43 | "source": [
44 | "We denote by $\\mathcal B(q)$ the Bernoulli distribution with unknown parameter $q$ (http://en.wikipedia.org/wiki/Bernoulli_distribution). A Bernoulli variable:\n",
45 | "\n",
46 | "* is 0 (tail) with probability $1-q$,\n",
47 | "* is 1 (head) with probability $q$."
48 | ]
49 | },
50 | {
51 | "cell_type": "markdown",
52 | "metadata": {},
53 | "source": [
54 | "1. Let's suppose that, after $n=100$ flips, we get $h=61$ heads. We choose a significance level of 0.05: is the coin fair or not? Our null hypothesis is: *the coin is fair* ($q = 1/2$)."
55 | ]
56 | },
57 | {
58 | "cell_type": "code",
59 | "collapsed": false,
60 | "input": [
61 | "import numpy as np\n",
62 | "import scipy.stats as st\n",
63 | "import scipy.special as sp"
64 | ],
65 | "language": "python",
66 | "metadata": {},
67 | "outputs": []
68 | },
69 | {
70 | "cell_type": "code",
71 | "collapsed": false,
72 | "input": [
73 | "n = 100 # number of coin flips\n",
74 | "h = 61 # number of heads\n",
75 | "q = .5 # null-hypothesis of fair coin"
76 | ],
77 | "language": "python",
78 | "metadata": {},
79 | "outputs": []
80 | },
81 | {
82 | "cell_type": "markdown",
83 | "metadata": {},
84 | "source": [
85 | "2. Let's compute the **z-score**, which is defined by the following formula (`xbar` is the estimated average of the distribution). We will explain this formula in the next section *How it works...*"
86 | ]
87 | },
88 | {
89 | "cell_type": "code",
90 | "collapsed": false,
91 | "input": [
92 | "xbar = float(h)/n\n",
93 | "z = (xbar - q) * np.sqrt(n / (q*(1-q))); z"
94 | ],
95 | "language": "python",
96 | "metadata": {},
97 | "outputs": []
98 | },
99 | {
100 | "cell_type": "markdown",
101 | "metadata": {},
102 | "source": [
103 | "3. Now, from the z-score, we can compute the p-value as follows:"
104 | ]
105 | },
106 | {
107 | "cell_type": "code",
108 | "collapsed": false,
109 | "input": [
110 | "pval = 2 * (1 - st.norm.cdf(z)); pval"
111 | ],
112 | "language": "python",
113 | "metadata": {},
114 | "outputs": []
115 | },
116 | {
117 | "cell_type": "markdown",
118 | "metadata": {},
119 | "source": [
120 | "4. This p-value is less than 0.05, so we reject the null hypothesis and conclude that *the coin is probably not fair*."
121 | ]
122 | },
123 | {
124 | "cell_type": "markdown",
125 | "metadata": {},
126 | "source": [
127 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
128 | "\n",
129 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
130 | ]
131 | }
132 | ],
133 | "metadata": {}
134 | }
135 | ]
136 | }
--------------------------------------------------------------------------------
/notebooks/chapter15_symbolic/03_function.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:6fe4293941bd8b2ea9b5fda970d90c07e2cc5f1b77555dedf3b3f5bf29961714"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": [],
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 15.3. Analyzing real-valued functions"
23 | ]
24 | },
25 | {
26 | "cell_type": "code",
27 | "collapsed": false,
28 | "input": [
29 | "from sympy import *\n",
30 | "init_printing()"
31 | ],
32 | "language": "python",
33 | "metadata": {},
34 | "outputs": []
35 | },
36 | {
37 | "cell_type": "code",
38 | "collapsed": false,
39 | "input": [
40 | "var('x z')"
41 | ],
42 | "language": "python",
43 | "metadata": {},
44 | "outputs": []
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 |     "We define a new function depending on $x$."
51 | ]
52 | },
53 | {
54 | "cell_type": "code",
55 | "collapsed": false,
56 | "input": [
57 | "f = 1/(1+x**2)"
58 | ],
59 | "language": "python",
60 | "metadata": {},
61 | "outputs": []
62 | },
63 | {
64 | "cell_type": "markdown",
65 | "metadata": {},
66 | "source": [
67 |     "Let's evaluate this function at $x=1$."
68 | ]
69 | },
70 | {
71 | "cell_type": "code",
72 | "collapsed": false,
73 | "input": [
74 | "f.subs(x, 1)"
75 | ],
76 | "language": "python",
77 | "metadata": {},
78 | "outputs": []
79 | },
80 | {
81 | "cell_type": "markdown",
82 | "metadata": {},
83 | "source": [
84 | "We can compute the derivative of this function..."
85 | ]
86 | },
87 | {
88 | "cell_type": "code",
89 | "collapsed": false,
90 | "input": [
91 | "diff(f, x)"
92 | ],
93 | "language": "python",
94 | "metadata": {},
95 | "outputs": []
96 | },
97 | {
98 | "cell_type": "markdown",
99 | "metadata": {},
100 | "source": [
101 | "limits..."
102 | ]
103 | },
104 | {
105 | "cell_type": "code",
106 | "collapsed": false,
107 | "input": [
108 | "limit(f, x, oo)"
109 | ],
110 | "language": "python",
111 | "metadata": {},
112 | "outputs": []
113 | },
114 | {
115 | "cell_type": "markdown",
116 | "metadata": {},
117 | "source": [
118 | "Taylor series..."
119 | ]
120 | },
121 | {
122 | "cell_type": "code",
123 | "collapsed": false,
124 | "input": [
125 | "series(f, x0=0, n=9)"
126 | ],
127 | "language": "python",
128 | "metadata": {},
129 | "outputs": []
130 | },
131 | {
132 | "cell_type": "markdown",
133 | "metadata": {},
134 | "source": [
135 | "Definite integrals..."
136 | ]
137 | },
138 | {
139 | "cell_type": "code",
140 | "collapsed": false,
141 | "input": [
142 | "integrate(f, (x, -oo, oo))"
143 | ],
144 | "language": "python",
145 | "metadata": {},
146 | "outputs": []
147 | },
148 | {
149 | "cell_type": "markdown",
150 | "metadata": {},
151 | "source": [
152 | "indefinite integrals..."
153 | ]
154 | },
155 | {
156 | "cell_type": "code",
157 | "collapsed": false,
158 | "input": [
159 | "integrate(f, x)"
160 | ],
161 | "language": "python",
162 | "metadata": {},
163 | "outputs": []
164 | },
165 | {
166 | "cell_type": "markdown",
167 | "metadata": {},
168 | "source": [
169 | "and even Fourier transforms!"
170 | ]
171 | },
172 | {
173 | "cell_type": "code",
174 | "collapsed": false,
175 | "input": [
176 | "fourier_transform(f, x, z)"
177 | ],
178 | "language": "python",
179 | "metadata": {},
180 | "outputs": []
181 | },
182 | {
183 | "cell_type": "markdown",
184 | "metadata": {},
185 | "source": [
186 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
187 | "\n",
188 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
189 | ]
190 | }
191 | ],
192 | "metadata": {}
193 | }
194 | ]
195 | }
--------------------------------------------------------------------------------
/notebooks/chapter04_optimization/03_linebyline.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:1c79cf627e3e56fbb004a7c9a265a12aab13c938a743ac0498b3f0dac24f1a6c"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": [],
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 4.3. Profiling your code line by line with line_profiler"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "Standard imports."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np"
37 | ],
38 | "language": "python",
39 | "metadata": {},
40 | "outputs": []
41 | },
42 | {
43 | "cell_type": "markdown",
44 | "metadata": {},
45 | "source": [
46 |     "After installing `line_profiler`, we can load its IPython extension."
47 | ]
48 | },
49 | {
50 | "cell_type": "code",
51 | "collapsed": false,
52 | "input": [
53 | "%load_ext line_profiler"
54 | ],
55 | "language": "python",
56 | "metadata": {},
57 | "outputs": []
58 | },
59 | {
60 | "cell_type": "markdown",
61 | "metadata": {},
62 | "source": [
63 |     "For `%lprun` to work, we need to encapsulate the code in a function and save it in a Python script."
64 | ]
65 | },
66 | {
67 | "cell_type": "code",
68 | "collapsed": false,
69 | "input": [
70 | "%%writefile simulation.py\n",
71 | "import numpy as np\n",
72 | "\n",
73 | "def step(*shape):\n",
74 | " # Create a random n-vector with +1 or -1 values.\n",
75 | " return 2 * (np.random.random_sample(shape) < .5) - 1\n",
76 | "\n",
77 | "def simulate(iterations, n=10000):\n",
78 | " s = step(iterations, n)\n",
79 | " x = np.cumsum(s, axis=0)\n",
80 | " bins = np.arange(-30, 30, 1)\n",
81 | " y = np.vstack([np.histogram(x[i,:], bins)[0] for i in range(iterations)])\n",
82 | " return y"
83 | ],
84 | "language": "python",
85 | "metadata": {},
86 | "outputs": []
87 | },
88 | {
89 | "cell_type": "markdown",
90 | "metadata": {},
91 | "source": [
92 |     "Now, we need to import this script to make its functions available in the interactive namespace."
93 | ]
94 | },
95 | {
96 | "cell_type": "code",
97 | "collapsed": false,
98 | "input": [
99 | "import simulation"
100 | ],
101 | "language": "python",
102 | "metadata": {},
103 | "outputs": []
104 | },
105 | {
106 | "cell_type": "markdown",
107 | "metadata": {},
108 | "source": [
109 | "Let's execute the function under the control of the line profiler."
110 | ]
111 | },
112 | {
113 | "cell_type": "code",
114 | "collapsed": false,
115 | "input": [
116 | "%lprun -T lprof0 -f simulation.simulate simulation.simulate(50)"
117 | ],
118 | "language": "python",
119 | "metadata": {},
120 | "outputs": []
121 | },
122 | {
123 | "cell_type": "code",
124 | "collapsed": false,
125 | "input": [
126 | "print(open('lprof0', 'r').read())"
127 | ],
128 | "language": "python",
129 | "metadata": {},
130 | "outputs": []
131 | },
132 | {
133 | "cell_type": "markdown",
134 | "metadata": {},
135 | "source": [
136 | "Let's run the simulation with 10 times more iterations."
137 | ]
138 | },
139 | {
140 | "cell_type": "code",
141 | "collapsed": false,
142 | "input": [
143 | "%lprun -T lprof1 -f simulation.simulate simulation.simulate(iterations=500)"
144 | ],
145 | "language": "python",
146 | "metadata": {},
147 | "outputs": []
148 | },
149 | {
150 | "cell_type": "code",
151 | "collapsed": false,
152 | "input": [
153 | "print(open('lprof1', 'r').read())"
154 | ],
155 | "language": "python",
156 | "metadata": {},
157 | "outputs": []
158 | },
159 | {
160 | "cell_type": "markdown",
161 | "metadata": {},
162 | "source": [
163 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
164 | "\n",
165 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
166 | ]
167 | }
168 | ],
169 | "metadata": {}
170 | }
171 | ]
172 | }
--------------------------------------------------------------------------------
/notebooks/chapter04_optimization/09_memmap.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 3,
3 | "nbformat_minor": 0,
4 | "worksheets": [
5 | {
6 | "cells": [
7 | {
8 | "source": [
9 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
10 | ],
11 | "cell_type": "markdown",
12 | "metadata": []
13 | },
14 | {
15 | "source": [
16 | "# 4.9. Processing huge NumPy arrays with memory mapping"
17 | ],
18 | "cell_type": "markdown",
19 | "metadata": {}
20 | },
21 | {
22 | "cell_type": "code",
23 | "language": "python",
24 | "outputs": [],
25 | "collapsed": false,
26 | "input": [
27 | "import numpy as np"
28 | ],
29 | "metadata": {}
30 | },
31 | {
32 | "source": [
33 | "## Writing a memory-mapped array"
34 | ],
35 | "cell_type": "markdown",
36 | "metadata": {}
37 | },
38 | {
39 | "source": [
40 | "We create a memory-mapped array with a specific shape."
41 | ],
42 | "cell_type": "markdown",
43 | "metadata": {}
44 | },
45 | {
46 | "cell_type": "code",
47 | "language": "python",
48 | "outputs": [],
49 | "collapsed": false,
50 | "input": [
51 | "nrows, ncols = 1000000, 100"
52 | ],
53 | "metadata": {}
54 | },
55 | {
56 | "cell_type": "code",
57 | "language": "python",
58 | "outputs": [],
59 | "collapsed": false,
60 | "input": [
61 | "f = np.memmap('memmapped.dat', dtype=np.float32, \n",
62 | " mode='w+', shape=(nrows, ncols))"
63 | ],
64 | "metadata": {}
65 | },
66 | {
67 | "source": [
68 | "Let's feed the array with random values, one column at a time because our system memory is limited!"
69 | ],
70 | "cell_type": "markdown",
71 | "metadata": {}
72 | },
73 | {
74 | "cell_type": "code",
75 | "language": "python",
76 | "outputs": [],
77 | "collapsed": false,
78 | "input": [
79 | "for i in range(ncols):\n",
80 | " f[:,i] = np.random.rand(nrows)"
81 | ],
82 | "metadata": {}
83 | },
84 | {
85 | "source": [
86 | "We save the last column of the array."
87 | ],
88 | "cell_type": "markdown",
89 | "metadata": {}
90 | },
91 | {
92 | "cell_type": "code",
93 | "language": "python",
94 | "outputs": [],
95 | "collapsed": false,
96 | "input": [
97 | "x = f[:,-1]"
98 | ],
99 | "metadata": {}
100 | },
101 | {
102 | "source": [
103 | "Now, we flush memory changes to disk by removing the object."
104 | ],
105 | "cell_type": "markdown",
106 | "metadata": {}
107 | },
108 | {
109 | "cell_type": "code",
110 | "language": "python",
111 | "outputs": [],
112 | "collapsed": false,
113 | "input": [
114 | "del f"
115 | ],
116 | "metadata": {}
117 | },
118 | {
119 | "source": [
120 | "## Reading a memory-mapped file"
121 | ],
122 | "cell_type": "markdown",
123 | "metadata": {}
124 | },
125 | {
126 | "source": [
127 |      "Reading a memory-mapped array from disk involves the same `np.memmap` function, but with a different file mode. The data type and the shape need to be specified again, as this information is not stored in the file."
128 | ],
129 | "cell_type": "markdown",
130 | "metadata": {}
131 | },
132 | {
133 | "cell_type": "code",
134 | "language": "python",
135 | "outputs": [],
136 | "collapsed": false,
137 | "input": [
138 | "f = np.memmap('memmapped.dat', dtype=np.float32, shape=(nrows, ncols))"
139 | ],
140 | "metadata": {}
141 | },
142 | {
143 | "cell_type": "code",
144 | "language": "python",
145 | "outputs": [],
146 | "collapsed": false,
147 | "input": [
148 | "np.array_equal(f[:,-1], x)"
149 | ],
150 | "metadata": {}
151 | },
152 | {
153 | "cell_type": "code",
154 | "language": "python",
155 | "outputs": [],
156 | "collapsed": false,
157 | "input": [
158 | "del f"
159 | ],
160 | "metadata": {}
161 | },
162 | {
163 | "source": [
164 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n\n> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
165 | ],
166 | "cell_type": "markdown",
167 | "metadata": {}
168 | }
169 | ],
170 | "metadata": {}
171 | }
172 | ],
173 | "metadata": {
174 | "name": "",
175 | "signature": "sha256:6c2f964d6e248336692081c284f18e3e7aa2a75206ce92e8de6e42f49d88fe6d"
176 | }
177 | }
--------------------------------------------------------------------------------
/notebooks/chapter04_optimization/04_memprof.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 3,
3 | "nbformat_minor": 0,
4 | "worksheets": [
5 | {
6 | "cells": [
7 | {
8 | "source": [
9 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
10 | ],
11 | "cell_type": "markdown",
12 | "metadata": []
13 | },
14 | {
15 | "source": [
16 | "# 4.4. Profiling the memory usage of your code with memory_profiler"
17 | ],
18 | "cell_type": "markdown",
19 | "metadata": {}
20 | },
21 | {
22 | "source": [
23 | "Standard imports."
24 | ],
25 | "cell_type": "markdown",
26 | "metadata": {}
27 | },
28 | {
29 | "cell_type": "code",
30 | "language": "python",
31 | "outputs": [],
32 | "collapsed": false,
33 | "input": [
34 | "import numpy as np"
35 | ],
36 | "metadata": {}
37 | },
38 | {
39 | "source": [
40 |      "After installing `memory_profiler`, we can load its IPython extension."
41 | ],
42 | "cell_type": "markdown",
43 | "metadata": {}
44 | },
45 | {
46 | "cell_type": "code",
47 | "language": "python",
48 | "outputs": [],
49 | "collapsed": false,
50 | "input": [
51 | "%load_ext memory_profiler"
52 | ],
53 | "metadata": {}
54 | },
55 | {
56 | "source": [
57 |      "For `%mprun` to work, we need to encapsulate the code in a function and save it in a Python script."
58 | ],
59 | "cell_type": "markdown",
60 | "metadata": {}
61 | },
62 | {
63 | "cell_type": "code",
64 | "language": "python",
65 | "outputs": [],
66 | "collapsed": false,
67 | "input": [
68 | "%%writefile simulation.py\n",
69 | "import numpy as np\n",
70 | "\n",
71 | "def step(*shape):\n",
72 | " # Create a random n-vector with +1 or -1 values.\n",
73 | " return 2 * (np.random.random_sample(shape) < .5) - 1\n",
74 | "\n",
75 | "def simulate(iterations, n=10000):\n",
76 | " s = step(iterations, n)\n",
77 | " x = np.cumsum(s, axis=0)\n",
78 | " bins = np.arange(-30, 30, 1)\n",
79 | " y = np.vstack([np.histogram(x[i,:], bins)[0] for i in range(iterations)])\n",
80 | " return y"
81 | ],
82 | "metadata": {}
83 | },
84 | {
85 | "source": [
86 |      "Now, we need to import this script to make its functions available in the interactive namespace."
87 | ],
88 | "cell_type": "markdown",
89 | "metadata": {}
90 | },
91 | {
92 | "cell_type": "code",
93 | "language": "python",
94 | "outputs": [],
95 | "collapsed": false,
96 | "input": [
97 | "import simulation"
98 | ],
99 | "metadata": {}
100 | },
101 | {
102 | "source": [
103 |      "Let's execute the function under the control of the memory profiler."
104 | ],
105 | "cell_type": "markdown",
106 | "metadata": {}
107 | },
108 | {
109 | "cell_type": "code",
110 | "language": "python",
111 | "outputs": [],
112 | "collapsed": false,
113 | "input": [
114 | "%mprun -T mprof0 -f simulation.simulate simulation.simulate(50)"
115 | ],
116 | "metadata": {}
117 | },
118 | {
119 | "cell_type": "code",
120 | "language": "python",
121 | "outputs": [],
122 | "collapsed": false,
123 | "input": [
124 | "print(open('mprof0', 'r').read())"
125 | ],
126 | "metadata": {}
127 | },
128 | {
129 | "source": [
130 | "Let's run the simulation with 10 times more iterations."
131 | ],
132 | "cell_type": "markdown",
133 | "metadata": {}
134 | },
135 | {
136 | "cell_type": "code",
137 | "language": "python",
138 | "outputs": [],
139 | "collapsed": false,
140 | "input": [
141 | "%mprun -T mprof1 -f simulation.simulate simulation.simulate(iterations=500)"
142 | ],
143 | "metadata": {}
144 | },
145 | {
146 | "cell_type": "code",
147 | "language": "python",
148 | "outputs": [],
149 | "collapsed": false,
150 | "input": [
151 | "print(open('mprof1', 'r').read())"
152 | ],
153 | "metadata": {}
154 | },
155 | {
156 | "source": [
157 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n\n> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
158 | ],
159 | "cell_type": "markdown",
160 | "metadata": {}
161 | }
162 | ],
163 | "metadata": {}
164 | }
165 | ],
166 | "metadata": {
167 | "name": "",
168 | "signature": "sha256:97f336324ca2d7003df984bf72324eb2c2b678885be5da31226d6fa36e7031c2"
169 | }
170 | }
--------------------------------------------------------------------------------
/installation.md:
--------------------------------------------------------------------------------
1 | # Beginner's instructions for installing Python
2 |
3 | **We only support the Anaconda distribution, although other Python distributions should work too.**
4 |
5 | In this *work-in-progress* document, we give detailed instructions on how to install a full scientific Python distribution. This document targets beginners, and covers topics such as using the command-line interface, installing Anaconda, etc. These instructions are sometimes hard to find, scattered across many websites, blog posts, and support mailing lists. Here, you'll find the bare minimum you need to know.
6 |
7 | This document is meant to be maintained by the community: please open a Pull Request if any information is missing.
8 |
9 |
10 | ## Overview
11 |
12 | Anaconda is a Python distribution created by Continuum Analytics. It comes with many scientific packages such as NumPy, SciPy, IPython, and matplotlib. Anaconda also includes a packaging tool named conda, which lets you install, update, and maintain your Python packages. conda is actually more general than that: it can manage any binary package, Python-related or not.
13 |
14 |
15 | ## Things to know
16 |
17 | You need to know or learn several things before you start:
18 |
19 | * The system terminal, or command-line interface.
20 | * Git, a version control system.
21 |
22 |
23 | ### The terminal
24 |
25 | #### OS-specific instructions
26 |
27 | ##### Windows
28 |
29 | Use PowerShell.
30 |
31 |
32 | ##### Linux
33 |
34 | TODO
35 |
36 | ##### Mac OS X
37 |
38 | TODO
39 |
40 |
41 | #### Home directory
42 |
43 | * On Windows: `c:\users\yourname\`
44 | * On Linux: `/home/yourname/`
45 |
46 |
47 | #### Current directory
48 |
49 | * `cd subfolder`: go into a subfolder
50 | * `cd ~`: go to your home directory
51 | * `ls`: list the contents of the current directory (on Windows, outside PowerShell, use `dir` instead)
52 |
53 |
54 |
55 | ### Git
56 |
57 | Not strictly required, but highly recommended if you want to use obscure packages, or if you start to write a significant amount of code.
58 |
59 | TODO:
60 |
61 | * Installing git...
62 | * git bash...
63 | * git GUI...
64 |
65 |
66 | ### Environment variables and system paths
67 |
68 | TODO
69 |
70 |
71 | ## Installing Anaconda
72 |
73 | Once you're relatively comfortable with the terminal and Git, you can install Anaconda, a Python distribution.
74 |
75 | **Don't execute `conda` or any other Python-related commands in the Anaconda directory: go somewhere else.**
76 |
77 | ### Windows
78 |
79 | * Install Anaconda for Windows with Python 3.x (the latest is 3.4 in Sept. 2014).
80 | * Don't install Anaconda with admin rights: check "Just for me" in the installer.
81 | * Install in a path like `c:\anaconda`.
82 | * You'll be able to install Python 2.x in a separate conda environment.
83 | * **If you use PowerShell**, you need to [use customized activate/deactivate scripts](https://github.com/Liquidmantis/PSCondaEnvs.git). Clone the repository, and copy the scripts into `c:\anaconda\scripts` or similar.
84 |
85 |
86 | ### Linux
87 |
88 | TODO
89 |
90 |
91 | ### Mac OS X
92 |
93 | TODO
94 |
95 |
96 |
97 | ## Using Anaconda
98 |
99 | ### The basics
100 |
101 | ### Updating the packages
102 |
103 | * Update the conda tool itself: `conda update conda`
* Update a specific package: `conda update mypackage`
104 |
105 |
106 | ## Environments
107 |
108 | Anaconda lets you maintain several isolated environments. You can switch to any environment at any time from the command line.
109 |
110 | ### Get the list of installed environments
111 |
112 | `conda info -e`
113 |
114 |
115 | ### Switch to another environment
116 |
117 | * On Windows: `activate myenv`
118 | * On another OS: `source activate myenv`
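After activating an environment, you can check from Python which interpreter is actually running (a quick sanity check; this works with any Python, not just conda's):

```python
import sys

# sys.executable is the full path of the running interpreter.
# After `activate myenv`, it should point inside that environment's
# directory (for example, somewhere under c:\anaconda\envs\myenv on Windows).
print(sys.executable)
```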
119 |
120 |
121 | ### Installing Python 2.x side-by-side with the default Python 3
122 |
123 | * Run the following command:
124 |
125 | `conda create -n py27 python=2.7.8`
126 | (press `f` if you get a WARNING)
127 |
128 | * This new environment starts with no packages. Make sure to install **pip**!
129 |
130 | `conda install pip`
131 | `conda update pip`
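To confirm which Python version a given environment provides, you can ask the interpreter itself (run this after activating the environment):

```python
import sys

# The major.minor version identifies the environment: 2.7 in the
# py27 environment created above, 3.x in the default environment.
print("%d.%d" % sys.version_info[:2])
```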
132 |
133 |
134 | ## Running Python, IPython, and the IPython notebook
135 |
136 | ### IPython profile
137 |
138 | * A profile is a set of parameters and configuration files specific to a project of yours.
139 | * You can use different profiles for different use-cases: one profile when running code from the Cookbook, another one for your work, etc.
140 | * Using IPython profiles is not strictly necessary, and you can do everything in the same default profile. But it is good practice to use different profiles.
141 |
142 |
143 | #### Creating a profile for the cookbook
144 |
145 | `ipython profile create cookbook`
146 |
147 | #### Running IPython with this profile
148 |
149 | * Command-line interface: `ipython --profile=cookbook`
150 | * Notebook: `ipython notebook --profile=cookbook`
151 |
152 |
153 | ## Using a code editor
154 |
155 |
156 |
157 | ## Installing new packages
158 |
159 | * `conda install mypackage`
160 | * If that doesn't work, it means that there is no conda package for this Python module. There are alternatives:
161 | * **Important**: make sure pip is installed in the currently activated environment with `conda install pip`.
162 | * Then you can try to use pip: `pip install mypackage`.
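Whichever tool installed the package, you can verify that it imports correctly. The helper below is our own sketch (`check_package` is not part of conda or pip):

```python
import importlib

def check_package(name):
    """Return the package's version if it is importable, else None."""
    try:
        mod = importlib.import_module(name)
    except ImportError:
        return None
    # Most packages expose their version in the `__version__` attribute.
    return getattr(mod, '__version__', 'unknown')

print(check_package('numpy'))        # a version string, or None if missing
print(check_package('no_such_pkg'))  # None
```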
163 |
164 |
165 |
--------------------------------------------------------------------------------
/notebooks/chapter07_stats/03_bayesian.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:79fe40d2e09c9eed21f52aabad47475915c2452f20d75ae93865266133380287"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": [],
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {
21 | "word_id": "4818_07_bayesian"
22 | },
23 | "source": [
24 | "# 7.3. Getting started with Bayesian methods"
25 | ]
26 | },
27 | {
28 | "cell_type": "markdown",
29 | "metadata": {},
30 | "source": [
31 | "Let $q$ be the probability of obtaining a head. Whereas $q$ was just a fixed number in the previous recipe, we consider here that it is a *random variable*. Initially, this variable follows a distribution called the **prior distribution**. It represents our knowledge about $q$ *before* we start flipping the coin. We will update this distribution after each trial (**posterior distribution**)."
32 | ]
33 | },
34 | {
35 | "cell_type": "markdown",
36 | "metadata": {},
37 | "source": [
38 | "1. First, we assume that $q$ is a *uniform* random variable on the interval $[0, 1]$. That's our prior distribution: for all $q$, $P(q)=1$.\n",
39 | "2. Then, we flip our coin $n$ times. We note $x_i$ the outcome of the $i$-th flip ($0$ for tail, $1$ for head).\n",
40 | "3. What is the probability distribution of $q$ knowing the observations $x_i$? **Bayes' formula** allows us to compute the *posterior distribution* analytically (see the next section for the mathematical details):\n",
41 | "\n",
42 | "$$P(q | \\{x_i\\}) = \\frac{P(\\{x_i\\} | q) P(q)}{\\displaystyle\\int_0^1 P(\\{x_i\\} | q) P(q) dq} = (n+1)\\binom n h q^h (1-q)^{n-h}$$"
43 | ]
44 | },
45 | {
46 | "cell_type": "markdown",
47 | "metadata": {},
48 | "source": [
49 |      "We define the posterior distribution according to the mathematical formula above. We note that this expression is $(n+1)$ times the *probability mass function* (PMF) of the binomial distribution, which is directly available in `scipy.stats`. (http://en.wikipedia.org/wiki/Binomial_distribution)"
50 | ]
51 | },
52 | {
53 | "cell_type": "code",
54 | "collapsed": false,
55 | "input": [
56 | "import numpy as np\n",
57 | "import scipy.stats as st\n",
58 | "import matplotlib.pyplot as plt\n",
59 | "%matplotlib inline"
60 | ],
61 | "language": "python",
62 | "metadata": {},
63 | "outputs": []
64 | },
65 | {
66 | "cell_type": "code",
67 | "collapsed": false,
68 | "input": [
69 | "posterior = lambda n, h, q: (n+1) * st.binom(n, q).pmf(h)"
70 | ],
71 | "language": "python",
72 | "metadata": {},
73 | "outputs": []
74 | },
75 | {
76 | "cell_type": "markdown",
77 | "metadata": {},
78 | "source": [
79 | "Let's plot this distribution for an observation of $h=61$ heads and $n=100$ total flips."
80 | ]
81 | },
82 | {
83 | "cell_type": "code",
84 | "collapsed": false,
85 | "input": [
86 | "n = 100\n",
87 | "h = 61\n",
88 | "q = np.linspace(0., 1., 1000)\n",
89 | "d = posterior(n, h, q)"
90 | ],
91 | "language": "python",
92 | "metadata": {},
93 | "outputs": []
94 | },
95 | {
96 | "cell_type": "code",
97 | "collapsed": false,
98 | "input": [
99 | "plt.figure(figsize=(5,3));\n",
100 | "plt.plot(q, d, '-k');\n",
101 | "plt.xlabel('q parameter');\n",
102 | "plt.ylabel('Posterior distribution');\n",
103 | "plt.ylim(0, d.max()+1);"
104 | ],
105 | "language": "python",
106 | "metadata": {},
107 | "outputs": []
108 | },
109 | {
110 | "cell_type": "markdown",
111 | "metadata": {},
112 | "source": [
113 | "4. This distribution indicates the plausible values for $q$ given the observations. We could use it to derive a **credible interval**, likely to contain the actual value. (http://en.wikipedia.org/wiki/Credible_interval)\n",
114 | "\n",
115 |         "We can also derive a point estimate. For example, the **maximum a posteriori (MAP) estimation** consists of taking the *maximum* of this distribution as an estimate for $q$. We can find this maximum analytically or numerically. Here, the analytic result is $\\hat q = h/n$, which looks quite sensible. (http://en.wikipedia.org/wiki/Maximum_a_posteriori_estimation)"
116 | ]
117 | },
118 | {
119 | "cell_type": "markdown",
120 | "metadata": {},
121 | "source": [
122 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
123 | "\n",
124 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
125 | ]
126 | }
127 | ],
128 | "metadata": {}
129 | }
130 | ]
131 | }
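The closed-form posterior in this recipe is easy to check without SciPy. A minimal sketch using only the standard library (the grid resolution of 1000 is arbitrary):

```python
import math

def posterior(n, h, q):
    # (n + 1) * C(n, h) * q**h * (1 - q)**(n - h):
    # the posterior density for a uniform prior on q.
    return (n + 1) * math.comb(n, h) * q ** h * (1 - q) ** (n - h)

n, h = 100, 61
grid = [i / 1000 for i in range(1001)]
density = [posterior(n, h, q) for q in grid]

# The grid maximum recovers the MAP estimate q_hat = h / n.
q_map = max(grid, key=lambda q: posterior(n, h, q))
# The density integrates to ~1 (rectangle rule).
total = sum(density) / 1000
```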
--------------------------------------------------------------------------------
/notebooks/chapter09_numoptim/01_root.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:ebdc09f2bf4edb98a5219f3c142f07a09ffe11eda7f5673574e9160ba49be18c"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": [],
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 9.1. Finding the root of a mathematical function"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. Let's import NumPy, SciPy, scipy.optimize, and matplotlib."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import scipy as sp\n",
38 | "import scipy.optimize as opt\n",
39 | "import matplotlib.pyplot as plt\n",
40 | "%matplotlib inline"
41 | ],
42 | "language": "python",
43 | "metadata": {},
44 | "outputs": []
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "2. We define the mathematical function $f(x)=\\cos(x)-x$ in Python. We will try to find a root of this function numerically, which corresponds to a fixed point of the cosine function."
51 | ]
52 | },
53 | {
54 | "cell_type": "code",
55 | "collapsed": false,
56 | "input": [
57 | "f = lambda x: np.cos(x) - x"
58 | ],
59 | "language": "python",
60 | "metadata": {},
61 | "outputs": []
62 | },
63 | {
64 | "cell_type": "markdown",
65 | "metadata": {},
66 | "source": [
67 | "3. Let's plot this function on the interval $[-5, 5]$."
68 | ]
69 | },
70 | {
71 | "cell_type": "code",
72 | "collapsed": false,
73 | "input": [
74 | "x = np.linspace(-5, 5, 1000)\n",
75 | "y = f(x)\n",
76 | "plt.figure(figsize=(5,3));\n",
77 | "plt.plot(x, y);\n",
78 | "plt.axhline(0, color='k');\n",
79 | "plt.xlim(-5,5);"
80 | ],
81 | "language": "python",
82 | "metadata": {},
83 | "outputs": []
84 | },
85 | {
86 | "cell_type": "markdown",
87 | "metadata": {},
88 | "source": [
89 |      "4. We see that this function has a unique root on this interval (because its sign changes over the interval). The scipy.optimize module contains several root-finding functions that are suitable here. For example, the `bisect` function implements the **bisection method** (also called the **dichotomy method**). It takes the function and the search interval as input."
90 | ]
91 | },
92 | {
93 | "cell_type": "code",
94 | "collapsed": false,
95 | "input": [
96 | "opt.bisect(f, -5, 5)"
97 | ],
98 | "language": "python",
99 | "metadata": {},
100 | "outputs": []
101 | },
102 | {
103 | "cell_type": "markdown",
104 | "metadata": {},
105 | "source": [
106 | "Let's visualize the root on the plot."
107 | ]
108 | },
109 | {
110 | "cell_type": "code",
111 | "collapsed": false,
112 | "input": [
113 | "plt.figure(figsize=(5,3));\n",
114 | "plt.plot(x, y);\n",
115 | "plt.axhline(0, color='k');\n",
116 | "plt.scatter([_], [0], c='r', s=100);\n",
117 | "plt.xlim(-5,5);"
118 | ],
119 | "language": "python",
120 | "metadata": {},
121 | "outputs": []
122 | },
123 | {
124 | "cell_type": "markdown",
125 | "metadata": {},
126 | "source": [
127 | "5. A faster and more powerful method is `brentq` (Brent's method). This algorithm also requires that $f$ is continuous and that $f(a)$ and $f(b)$ have different signs."
128 | ]
129 | },
130 | {
131 | "cell_type": "code",
132 | "collapsed": false,
133 | "input": [
134 | "opt.brentq(f, -5, 5)"
135 | ],
136 | "language": "python",
137 | "metadata": {},
138 | "outputs": []
139 | },
140 | {
141 | "cell_type": "markdown",
142 | "metadata": {},
143 | "source": [
144 | "The `brentq` method is faster than `bisect`. If the conditions are satisfied, it is a good idea to try Brent's method first."
145 | ]
146 | },
147 | {
148 | "cell_type": "code",
149 | "collapsed": false,
150 | "input": [
151 | "%timeit opt.bisect(f, -5, 5)\n",
152 | "%timeit opt.brentq(f, -5, 5)"
153 | ],
154 | "language": "python",
155 | "metadata": {},
156 | "outputs": []
157 | },
158 | {
159 | "cell_type": "markdown",
160 | "metadata": {},
161 | "source": [
162 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
163 | "\n",
164 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
165 | ]
166 | }
167 | ],
168 | "metadata": {}
169 | }
170 | ]
171 | }
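The bisection method used by `opt.bisect` is simple enough to sketch by hand. A pure-Python version (not SciPy's implementation) applied to the same $f(x)=\cos(x)-x$:

```python
import math

def bisect(f, a, b, tol=1e-12):
    # Bisection: repeatedly halve [a, b] while keeping a sign change inside.
    fa = f(a)
    if fa * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b = m            # the root is in [a, m]
        else:
            a, fa = m, f(m)  # the root is in [m, b]
    return (a + b) / 2

root = bisect(lambda x: math.cos(x) - x, -5, 5)
# At the root, cos(root) == root: the fixed point of cosine.
```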
--------------------------------------------------------------------------------
/notebooks/chapter11_image/05_faces.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:03475f5dd3c04558baa18d8b5fd02587f785a26b7b114f6db399151f0deb9a2e"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": [],
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 11.5. Detecting faces in an image with OpenCV"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "You need OpenCV and the Python wrapper. You can find installation instructions on [OpenCV's website](http://docs.opencv.org/trunk/doc/py_tutorials/py_tutorials.html)."
30 | ]
31 | },
32 | {
33 | "cell_type": "markdown",
34 | "metadata": {},
35 | "source": [
36 | "On Windows, you can install [Chris Gohlke's package](http://www.lfd.uci.edu/~gohlke/pythonlibs/#opencv)."
37 | ]
38 | },
39 | {
40 | "cell_type": "markdown",
41 | "metadata": {},
42 | "source": [
43 | "You also need to download the *Family* dataset on the book's website. (http://ipython-books.github.io)."
44 | ]
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "1. First, we import the packages."
51 | ]
52 | },
53 | {
54 | "cell_type": "code",
55 | "collapsed": false,
56 | "input": [
57 | "import numpy as np\n",
58 | "import cv2\n",
59 | "import matplotlib.pyplot as plt\n",
60 | "import matplotlib as mpl\n",
61 | "%matplotlib inline\n"
62 | ],
63 | "language": "python",
64 | "metadata": {},
65 | "outputs": []
66 | },
67 | {
68 | "cell_type": "markdown",
69 | "metadata": {},
70 | "source": [
71 | "2. We open the JPG image with OpenCV."
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "collapsed": false,
77 | "input": [
78 | "img = cv2.imread('data/pic3.jpg')"
79 | ],
80 | "language": "python",
81 | "metadata": {},
82 | "outputs": []
83 | },
84 | {
85 | "cell_type": "markdown",
86 | "metadata": {},
87 | "source": [
88 | "3. Then, we convert it to a grayscale image using OpenCV's `cvtColor` function."
89 | ]
90 | },
91 | {
92 | "cell_type": "code",
93 | "collapsed": false,
94 | "input": [
95 | "gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)"
96 | ],
97 | "language": "python",
98 | "metadata": {},
99 | "outputs": []
100 | },
101 | {
102 | "cell_type": "markdown",
103 | "metadata": {},
104 | "source": [
105 |      "4. To detect faces, we will use the **Viola\u2013Jones object detection framework**. A cascade of Haar-like classifiers has been trained to detect faces. The result of the training is stored in an XML file (part of the *Family* dataset available on the book's website). We load this cascade from the XML file with OpenCV's `CascadeClassifier` class."
106 | ]
107 | },
108 | {
109 | "cell_type": "code",
110 | "collapsed": false,
111 | "input": [
112 | "face_cascade = cv2.CascadeClassifier('data/haarcascade_frontalface_default.xml')"
113 | ],
114 | "language": "python",
115 | "metadata": {},
116 | "outputs": []
117 | },
118 | {
119 | "cell_type": "markdown",
120 | "metadata": {},
121 | "source": [
122 |      "5. Finally, the `detectMultiScale` method of the classifier detects objects in a grayscale image, and returns a list of bounding rectangles around them."
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "collapsed": false,
128 | "input": [
129 | "for x, y, w, h in face_cascade.detectMultiScale(gray, 1.3):\n",
130 | " cv2.rectangle(gray, (x,y), (x+w,y+h), (255,0,0), 2)\n",
131 | "plt.figure(figsize=(6,4));\n",
132 | "plt.imshow(gray, cmap=plt.cm.gray);\n",
133 | "plt.axis('off');"
134 | ],
135 | "language": "python",
136 | "metadata": {},
137 | "outputs": []
138 | },
139 | {
140 | "cell_type": "markdown",
141 | "metadata": {},
142 | "source": [
143 |      "We see that, although all detected objects are indeed faces, one face out of four is not detected. This is probably because that face is not directly facing the camera, whereas the faces in the training set were. This shows that the method's efficacy is limited by the quality and generality of the training set."
144 | ]
145 | },
146 | {
147 | "cell_type": "markdown",
148 | "metadata": {},
149 | "source": [
150 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
151 | "\n",
152 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
153 | ]
154 | }
155 | ],
156 | "metadata": {}
157 | }
158 | ]
159 | }
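Setting OpenCV aside, the multi-scale sliding-window idea behind `detectMultiScale` can be sketched with a toy stand-in: slide a window over the image at several scales and keep the windows a classifier accepts. The "classifier" below is a fake brightness test, not a trained Haar cascade; every name and value is illustrative.

```python
def detect_multiscale(image, window=2, scales=(1, 2), threshold=3):
    # Slide a window of size window*s over the image for each scale s,
    # accepting windows whose mean intensity reaches the threshold.
    h, w = len(image), len(image[0])
    hits = []
    for s in scales:
        size = window * s
        for y in range(0, h - size + 1):
            for x in range(0, w - size + 1):
                patch_sum = sum(image[y + dy][x + dx]
                                for dy in range(size) for dx in range(size))
                if patch_sum >= threshold * size * size:
                    hits.append((x, y, size, size))  # like OpenCV's (x, y, w, h)
    return hits

img = [[0, 0, 0, 0],
       [0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
boxes = detect_multiscale(img)
```

As in the real detector, overlapping windows around a single object are common; OpenCV merges them with a grouping step.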
--------------------------------------------------------------------------------
/notebooks/chapter08_ml/07_pca.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:8c43cbb07ca71966152c95e40d34835d917bf1d5217c45f54c30814c9ac5512d"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": [],
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python."
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 |      "# 8.7. Reducing the dimensionality of a dataset with a Principal Component Analysis"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. We import NumPy, matplotlib, and scikit-learn."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import sklearn\n",
38 | "import sklearn.decomposition as dec\n",
39 | "import sklearn.datasets as ds\n",
40 | "import matplotlib.pyplot as plt\n",
41 | "%matplotlib inline\n"
42 | ],
43 | "language": "python",
44 | "metadata": {},
45 | "outputs": []
46 | },
47 | {
48 | "cell_type": "markdown",
49 | "metadata": {},
50 | "source": [
51 | "2. The Iris flower dataset is available in the *datasets* module of scikit-learn."
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "collapsed": false,
57 | "input": [
58 | "iris = ds.load_iris()\n",
59 | "X = iris.data\n",
60 | "y = iris.target\n",
61 | "print(X.shape)"
62 | ],
63 | "language": "python",
64 | "metadata": {},
65 | "outputs": []
66 | },
67 | {
68 | "cell_type": "markdown",
69 | "metadata": {},
70 | "source": [
71 |      "3. Each row contains four parameters related to the morphology of the flower. Let's display the first two features in two dimensions. The color reflects the iris variety of the flower (the label, between 0 and 2)."
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "collapsed": false,
77 | "input": [
78 | "plt.figure(figsize=(6,3));\n",
79 | "plt.scatter(X[:,0], X[:,1], c=y,\n",
80 | " s=30, cmap=plt.cm.rainbow);"
81 | ],
82 | "language": "python",
83 | "metadata": {},
84 | "outputs": []
85 | },
86 | {
87 | "cell_type": "markdown",
88 | "metadata": {},
89 | "source": [
90 |      "4. We now apply PCA on the dataset to get the transformed matrix. This operation can be done in a single line with scikit-learn: we instantiate a `PCA` model, and call the `fit_transform` method. This method first computes the principal components, and then projects the data onto them."
91 | ]
92 | },
93 | {
94 | "cell_type": "code",
95 | "collapsed": false,
96 | "input": [
97 | "X_bis = dec.PCA().fit_transform(X)"
98 | ],
99 | "language": "python",
100 | "metadata": {},
101 | "outputs": []
102 | },
103 | {
104 | "cell_type": "markdown",
105 | "metadata": {},
106 | "source": [
107 | "5. We now display the same dataset, but in a new coordinate system (or equivalently, a linearly transformed version of the initial dataset)."
108 | ]
109 | },
110 | {
111 | "cell_type": "code",
112 | "collapsed": false,
113 | "input": [
114 | "plt.figure(figsize=(6,3));\n",
115 | "plt.scatter(X_bis[:,0], X_bis[:,1], c=y,\n",
116 | " s=30, cmap=plt.cm.rainbow);"
117 | ],
118 | "language": "python",
119 | "metadata": {},
120 | "outputs": []
121 | },
122 | {
123 | "cell_type": "markdown",
124 | "metadata": {},
125 | "source": [
126 |      "Points belonging to the same classes are now grouped together, even though the `PCA` estimator did *not* use the labels. The PCA was able to find a projection maximizing the variance, which corresponds here to a projection where the classes are well separated."
127 | ]
128 | },
129 | {
130 | "cell_type": "markdown",
131 | "metadata": {},
132 | "source": [
133 |      "6. The `sklearn.decomposition` module contains several variants of the classic `PCA` estimator: `ProbabilisticPCA`, `SparsePCA`, `RandomizedPCA`, `KernelPCA`... As an example, let's take a look at `KernelPCA`, a non-linear version of PCA."
134 | ]
135 | },
136 | {
137 | "cell_type": "code",
138 | "collapsed": false,
139 | "input": [
140 | "X_ter = dec.KernelPCA(kernel='rbf').fit_transform(X)\n",
141 | "plt.figure(figsize=(6,3));\n",
142 | "plt.scatter(X_ter[:,0], X_ter[:,1], c=y, s=30, cmap=plt.cm.rainbow);"
143 | ],
144 | "language": "python",
145 | "metadata": {},
146 | "outputs": []
147 | },
148 | {
149 | "cell_type": "markdown",
150 | "metadata": {},
151 | "source": [
152 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
153 | "\n",
154 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
155 | ]
156 | }
157 | ],
158 | "metadata": {}
159 | }
160 | ]
161 | }
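Under the hood, `fit_transform` centers the data and projects it onto the leading singular vectors. A bare-bones NumPy equivalent on random data (components may differ from scikit-learn's by a sign):

```python
import numpy as np

def pca_transform(X, k=2):
    # Center the columns, then project onto the top-k right singular
    # vectors of the centered matrix: the principal components.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.RandomState(0)
X = rng.randn(50, 4)
Z = pca_transform(X)
# The first component carries at least as much variance as the second,
# and the projected components are mutually uncorrelated.
```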
--------------------------------------------------------------------------------
/notebooks/chapter11_image/07_synth.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:b13b82ab2b72800124dcad04c4a86dec0eacb33450bbbe804429e02c958a32dc"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": [],
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 11.7. Creating a sound synthesizer in the notebook"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. We import NumPy, matplotlib, and various IPython packages and objects."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import matplotlib.pyplot as plt\n",
38 | "from IPython.display import Audio, display, clear_output\n",
39 | "from IPython.html import widgets\n",
40 | "from functools import partial\n",
41 | "import matplotlib as mpl\n",
42 | "%matplotlib inline\n"
43 | ],
44 | "language": "python",
45 | "metadata": {},
46 | "outputs": []
47 | },
48 | {
49 | "cell_type": "markdown",
50 | "metadata": {},
51 | "source": [
52 | "2. We define the sampling rate and the duration of the notes."
53 | ]
54 | },
55 | {
56 | "cell_type": "code",
57 | "collapsed": false,
58 | "input": [
59 | "rate = 16000.\n",
60 | "duration = .5\n",
61 |      "t = np.linspace(0., duration, int(rate * duration))"
62 | ],
63 | "language": "python",
64 | "metadata": {},
65 | "outputs": []
66 | },
67 | {
68 | "cell_type": "markdown",
69 | "metadata": {},
70 | "source": [
71 | "3. We create a function that generates and plays the sound of a note (sine function) at a given frequency, using NumPy and IPython's `Audio` class."
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "collapsed": false,
77 | "input": [
78 | "def synth(f):\n",
79 | " x = np.sin(f * 2. * np.pi * t)\n",
80 | " display(Audio(x, rate=rate, autoplay=True))"
81 | ],
82 | "language": "python",
83 | "metadata": {},
84 | "outputs": []
85 | },
86 | {
87 | "cell_type": "markdown",
88 | "metadata": {},
89 | "source": [
90 | "4. Here is the fundamental 440 Hz note."
91 | ]
92 | },
93 | {
94 | "cell_type": "code",
95 | "collapsed": false,
96 | "input": [
97 | "synth(440)"
98 | ],
99 | "language": "python",
100 | "metadata": {},
101 | "outputs": []
102 | },
103 | {
104 | "cell_type": "markdown",
105 | "metadata": {},
106 | "source": [
107 | "5. Now, we generate the note frequencies of our piano. The chromatic scale is obtained by a geometric progression with common ratio $2^{1/12}$."
108 | ]
109 | },
110 | {
111 | "cell_type": "code",
112 | "collapsed": false,
113 | "input": [
114 | "notes = zip('C,C#,D,D#,E,F,F#,G,G#,A,A#,B,C'.split(','),\n",
115 |      "        440. * 2 ** (np.arange(3, 16) / 12.))"
116 | ],
117 | "language": "python",
118 | "metadata": {},
119 | "outputs": []
120 | },
121 | {
122 | "cell_type": "markdown",
123 | "metadata": {},
124 | "source": [
125 |      "6. Finally, we create the piano with the notebook widgets. Each note is a button, and all buttons are contained in a horizontal box container. Clicking a note plays a sound at the corresponding frequency."
126 | ]
127 | },
128 | {
129 | "cell_type": "code",
130 | "collapsed": false,
131 | "input": [
132 | "container = widgets.ContainerWidget()\n",
133 | "buttons = []\n",
134 | "for note, f in notes:\n",
135 | " button = widgets.ButtonWidget(description=note)\n",
136 | " def on_button_clicked(f, b):\n",
137 | " clear_output()\n",
138 | " synth(f)\n",
139 | " button.on_click(partial(on_button_clicked, f))\n",
140 | " button.set_css({'width': '30px', \n",
141 | " 'height': '60px',\n",
142 | " 'padding': '0',\n",
143 | " 'color': ('black', 'white')['#' in note],\n",
144 | " 'background': ('white', 'black')['#' in note],\n",
145 | " 'border': '1px solid black',\n",
146 | " 'float': 'left'})\n",
147 | " buttons.append(button)\n",
148 | "container.children = buttons\n",
149 | "display(container)\n",
150 | "container.remove_class('vbox')\n",
151 | "container.add_class('hbox')"
152 | ],
153 | "language": "python",
154 | "metadata": {},
155 | "outputs": []
156 | },
157 | {
158 | "cell_type": "markdown",
159 | "metadata": {},
160 | "source": [
161 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
162 | "\n",
163 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
164 | ]
165 | }
166 | ],
167 | "metadata": {}
168 | }
169 | ]
170 | }
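The two building blocks of this recipe, the sampled sine wave and the geometric progression of note frequencies, can be sketched without NumPy. A standard-library version (values match the recipe's parameters):

```python
import math

rate = 16000
duration = 0.5

def note(freq):
    # Samples of a sine wave at the given frequency: the same signal
    # that the Audio widget plays in the recipe above.
    n_samples = int(rate * duration)
    return [math.sin(2 * math.pi * freq * i / rate)
            for i in range(n_samples)]

a440 = note(440)

# Chromatic scale: 13 frequencies in geometric progression with common
# ratio 2**(1/12); the last note is one octave above the first.
freqs = [440 * 2 ** (k / 12) for k in range(3, 16)]
```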
--------------------------------------------------------------------------------
/notebooks/chapter04_optimization/06_stride_tricks.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 3,
3 | "nbformat_minor": 0,
4 | "worksheets": [
5 | {
6 | "cells": [
7 | {
8 | "source": [
9 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
10 | ],
11 | "cell_type": "markdown",
12 | "metadata": []
13 | },
14 | {
15 | "source": [
16 | "# 4.6. Using stride tricks with NumPy"
17 | ],
18 | "cell_type": "markdown",
19 | "metadata": {}
20 | },
21 | {
22 | "source": [
23 |      "Every array has a number of dimensions, a shape, a data type, and strides. Strides are integers giving, for each dimension, the number of bytes to step in the contiguous memory block to move to the next item along that dimension. The address of an item in the array is a linear combination of its indices: the coefficients are the strides."
24 | ],
25 | "cell_type": "markdown",
26 | "metadata": {}
27 | },
28 | {
29 | "cell_type": "code",
30 | "language": "python",
31 | "outputs": [],
32 | "collapsed": false,
33 | "input": [
34 | "import numpy as np"
35 | ],
36 | "metadata": {}
37 | },
38 | {
39 | "cell_type": "code",
40 | "language": "python",
41 | "outputs": [],
42 | "collapsed": false,
43 | "input": [
44 | "id = lambda x: x.__array_interface__['data'][0]"
45 | ],
46 | "metadata": {}
47 | },
48 | {
49 | "cell_type": "code",
50 | "language": "python",
51 | "outputs": [],
52 | "collapsed": false,
53 | "input": [
54 | "x = np.zeros(10); x.strides"
55 | ],
56 | "metadata": {}
57 | },
58 | {
59 | "source": [
60 | "This vector contains float64 (8 bytes) items: one needs to go 8 bytes forward to go from one item to the next."
61 | ],
62 | "cell_type": "markdown",
63 | "metadata": {}
64 | },
65 | {
66 | "cell_type": "code",
67 | "language": "python",
68 | "outputs": [],
69 | "collapsed": false,
70 | "input": [
71 | "y = np.zeros((10, 10)); y.strides"
72 | ],
73 | "metadata": {}
74 | },
75 | {
76 | "source": [
77 | "In the first dimension (vertical), one needs to go 80 bytes (10 float64 items) forward to go from one item to the next, because the items are internally stored in row-major order. In the second dimension (horizontal), one needs to go 8 bytes forward to go from one item to the next."
78 | ],
79 | "cell_type": "markdown",
80 | "metadata": {}
81 | },
82 | {
83 | "source": [
84 | "### Broadcasting revisited"
85 | ],
86 | "cell_type": "markdown",
87 | "metadata": {}
88 | },
89 | {
90 | "source": [
91 | "We create a new array pointing to the same memory block as `a`, but with a different shape. The strides are such that this array looks like it is a vertically tiled version of `a`. NumPy is *tricked*: it thinks `b` is a 2D `n * n` array with `n^2` elements, whereas the data buffer really contains only `n` elements."
92 | ],
93 | "cell_type": "markdown",
94 | "metadata": {}
95 | },
96 | {
97 | "cell_type": "code",
98 | "language": "python",
99 | "outputs": [],
100 | "collapsed": false,
101 | "input": [
102 | "n = 1000; a = np.arange(n)"
103 | ],
104 | "metadata": {}
105 | },
106 | {
107 | "cell_type": "code",
108 | "language": "python",
109 | "outputs": [],
110 | "collapsed": false,
111 | "input": [
112 |      "b = np.lib.stride_tricks.as_strided(a, (n, n), (0, a.strides[0]))"
113 | ],
114 | "metadata": {}
115 | },
116 | {
117 | "cell_type": "code",
118 | "language": "python",
119 | "outputs": [],
120 | "collapsed": false,
121 | "input": [
122 | "b"
123 | ],
124 | "metadata": {}
125 | },
126 | {
127 | "cell_type": "code",
128 | "language": "python",
129 | "outputs": [],
130 | "collapsed": false,
131 | "input": [
132 | "b.size, b.shape, b.nbytes"
133 | ],
134 | "metadata": {}
135 | },
136 | {
137 | "cell_type": "code",
138 | "language": "python",
139 | "outputs": [],
140 | "collapsed": false,
141 | "input": [
142 | "%timeit b * b.T"
143 | ],
144 | "metadata": {}
145 | },
146 | {
147 | "source": [
148 | "This first version does not involve any copy, as `b` and `b.T` are arrays pointing to the same data buffer in memory, but with different strides."
149 | ],
150 | "cell_type": "markdown",
151 | "metadata": {}
152 | },
153 | {
154 | "cell_type": "code",
155 | "language": "python",
156 | "outputs": [],
157 | "collapsed": false,
158 | "input": [
159 | "%timeit np.tile(a, (n, 1)) * np.tile(a[:, np.newaxis], (1, n))"
160 | ],
161 | "metadata": {}
162 | },
163 | {
164 | "source": [
165 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n\n> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
166 | ],
167 | "cell_type": "markdown",
168 | "metadata": {}
169 | }
170 | ],
171 | "metadata": {}
172 | }
173 | ],
174 | "metadata": {
175 | "name": "",
176 | "signature": "sha256:b24dc9fc21de16e5b73784c0a8696b89f36dfff0b3cd8e0e1b61bb7de8a62251"
177 | }
178 | }
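The zero-stride trick above can be verified directly: the fake 2D array equals a tiled copy of the original vector while sharing its buffer. A small self-contained check (a length of 5 is illustrative):

```python
import numpy as np
from numpy.lib.stride_tricks import as_strided

n = 5
a = np.arange(n)
# A first stride of 0 makes every row re-read the same n items, so b
# looks like a vertically tiled copy of a while sharing a's buffer.
# Using a.strides[0] instead of a hard-coded byte count keeps this
# correct whatever the platform's integer item size is.
b = as_strided(a, shape=(n, n), strides=(0, a.strides[0]))
```

With `b` and `b.T` sharing the same buffer, `b * b.T` computes the outer product of `a` with itself without any copy of the data.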
--------------------------------------------------------------------------------
/notebooks/chapter04_optimization/02_profile.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 3,
3 | "nbformat_minor": 0,
4 | "worksheets": [
5 | {
6 | "cells": [
7 | {
8 | "source": [
9 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
10 | ],
11 | "cell_type": "markdown",
12 | "metadata": []
13 | },
14 | {
15 | "source": [
16 | "# 4.2. Profiling your code easily with cProfile and IPython"
17 | ],
18 | "cell_type": "markdown",
19 | "metadata": {}
20 | },
21 | {
22 | "source": [
23 | "Standard imports."
24 | ],
25 | "cell_type": "markdown",
26 | "metadata": {}
27 | },
28 | {
29 | "cell_type": "code",
30 | "language": "python",
31 | "outputs": [],
32 | "collapsed": false,
33 | "input": [
34 | "import numpy as np\n",
35 | "import matplotlib.pyplot as plt"
36 | ],
37 | "metadata": {}
38 | },
39 | {
40 | "cell_type": "code",
41 | "language": "python",
42 | "outputs": [],
43 | "collapsed": false,
44 | "input": [
45 | "%matplotlib inline"
46 | ],
47 | "metadata": {}
48 | },
49 | {
50 | "source": [
51 |      "This function generates an array whose elements are +1 or -1 with equal probability."
52 | ],
53 | "cell_type": "markdown",
54 | "metadata": {}
55 | },
56 | {
57 | "cell_type": "code",
58 | "language": "python",
59 | "outputs": [],
60 | "collapsed": false,
61 | "input": [
62 | "def step(*shape):\n",
63 | " # Create a random n-vector with +1 or -1 values.\n",
64 | " return 2 * (np.random.random_sample(shape) < .5) - 1"
65 | ],
66 | "metadata": {}
67 | },
68 | {
69 | "source": [
70 | "We simulate $n$ random walks, and look at the histogram of the walks over time."
71 | ],
72 | "cell_type": "markdown",
73 | "metadata": {}
74 | },
75 | {
76 | "cell_type": "code",
77 | "language": "python",
78 | "outputs": [],
79 | "collapsed": false,
80 | "input": [
81 | "%%prun -s cumulative -q -l 10 -T prun0\n",
82 | "# We profile the cell, sort the report by \"cumulative time\",\n",
83 | "# limit it to 10 lines, and save it to a file \"prun0\".\n",
84 | "n = 10000\n",
85 | "iterations = 50\n",
86 | "x = np.cumsum(step(iterations, n), axis=0)\n",
87 | "bins = np.arange(-30, 30, 1)\n",
88 | "y = np.vstack([np.histogram(x[i,:], bins)[0] for i in range(iterations)])"
89 | ],
90 | "metadata": {}
91 | },
92 | {
93 | "cell_type": "code",
94 | "language": "python",
95 | "outputs": [],
96 | "collapsed": false,
97 | "input": [
98 | "print(open('prun0', 'r').read())"
99 | ],
100 | "metadata": {}
101 | },
102 | {
103 | "source": [
104 | "The most expensive functions are respectively `histogram` (37 ms), `rand` (19 ms), and `cumsum` (5 ms)."
105 | ],
106 | "cell_type": "markdown",
107 | "metadata": {}
108 | },
109 | {
110 | "source": [
111 | "We plot the array `y`, representing the distribution of the particles over time."
112 | ],
113 | "cell_type": "markdown",
114 | "metadata": {}
115 | },
116 | {
117 | "cell_type": "code",
118 | "language": "python",
119 | "outputs": [],
120 | "collapsed": false,
121 | "input": [
122 | "plt.figure(figsize=(6,6));\n",
123 | "plt.imshow(y, cmap='hot');"
124 | ],
125 | "metadata": {}
126 | },
127 | {
128 | "source": [
129 | "We now run the same code with 10 times more iterations."
130 | ],
131 | "cell_type": "markdown",
132 | "metadata": {}
133 | },
134 | {
135 | "cell_type": "code",
136 | "language": "python",
137 | "outputs": [],
138 | "collapsed": false,
139 | "input": [
140 | "%%prun -s cumulative -q -l 10 -T prun1\n",
141 | "n = 10000\n",
142 | "iterations = 500\n",
143 | "x = np.cumsum(step(iterations, n), axis=0)\n",
144 | "bins = np.arange(-30, 30, 1)\n",
145 | "y = np.vstack([np.histogram(x[i,:], bins)[0] for i in range(iterations)])"
146 | ],
147 | "metadata": {}
148 | },
149 | {
150 | "cell_type": "code",
151 | "language": "python",
152 | "outputs": [],
153 | "collapsed": false,
154 | "input": [
155 | "print(open('prun1', 'r').read())"
156 | ],
157 | "metadata": {}
158 | },
159 | {
160 | "source": [
161 | "This time, the most expensive functions are `histogram` (566 ms), `cumsum` (388 ms), and `rand` (241 ms). `cumsum`'s execution time was negligible in the first run, whereas it no longer is here, due to the tenfold increase in the number of iterations."
162 | ],
163 | "cell_type": "markdown",
164 | "metadata": {}
165 | },
166 | {
167 | "source": [
168 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n\n> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
169 | ],
170 | "cell_type": "markdown",
171 | "metadata": {}
172 | }
173 | ],
174 | "metadata": {}
175 | }
176 | ],
177 | "metadata": {
178 | "name": "",
179 | "signature": "sha256:320dd795b811d1d848832e4ec8520c7490c57b91d8fbebba99047bb053375cbd"
180 | }
181 | }
--------------------------------------------------------------------------------
/notebooks/chapter09_numoptim/03_curvefitting.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:9a1bac1f4c50385b9d1752085560b5ee3d5f460936cdeb686cc9334b866f6598"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 9.3. Fitting a function to data with nonlinear least squares"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. Let's import the usual libraries."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import scipy.optimize as opt\n",
38 | "import matplotlib.pyplot as plt\n",
39 | "%matplotlib inline\n",
40 | "np.random.seed(3)"
41 | ],
42 | "language": "python",
43 | "metadata": {},
44 | "outputs": []
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "2. We define a logistic function with four parameters."
51 | ]
52 | },
53 | {
54 | "cell_type": "markdown",
55 | "metadata": {},
56 | "source": [
57 | "$$f_{a,b,c,d}(x) = \\frac{a}{1 + \\exp\\left(-c (x-d)\\right)} + b$$"
58 | ]
59 | },
60 | {
61 | "cell_type": "code",
62 | "collapsed": false,
63 | "input": [
64 | "def f(x, a, b, c, d):\n",
65 | " return a/(1. + np.exp(-c * (x-d))) + b"
66 | ],
67 | "language": "python",
68 | "metadata": {},
69 | "outputs": []
70 | },
71 | {
72 | "cell_type": "markdown",
73 | "metadata": {},
74 | "source": [
75 | "3. Let's define four random parameters."
76 | ]
77 | },
78 | {
79 | "cell_type": "code",
80 | "collapsed": false,
81 | "input": [
82 | "a, c = np.random.exponential(size=2)\n",
83 | "b, d = np.random.randn(2)"
84 | ],
85 | "language": "python",
86 | "metadata": {},
87 | "outputs": []
88 | },
89 | {
90 | "cell_type": "markdown",
91 | "metadata": {},
92 | "source": [
93 | "4. Now, we generate random data points by evaluating the sigmoid function and adding a bit of noise."
94 | ]
95 | },
96 | {
97 | "cell_type": "code",
98 | "collapsed": false,
99 | "input": [
100 | "n = 100\n",
101 | "x = np.linspace(-10., 10., n)\n",
102 | "y_model = f(x, a, b, c, d)\n",
103 | "y = y_model + a * .2 * np.random.randn(n)"
104 | ],
105 | "language": "python",
106 | "metadata": {},
107 | "outputs": []
108 | },
109 | {
110 | "cell_type": "markdown",
111 | "metadata": {},
112 | "source": [
113 | "5. Here is a plot of the data points, together with the particular sigmoid used to generate them."
114 | ]
115 | },
116 | {
117 | "cell_type": "code",
118 | "collapsed": false,
119 | "input": [
120 | "plt.figure(figsize=(6,4));\n",
121 | "plt.plot(x, y_model, '--k');\n",
122 | "plt.plot(x, y, 'o');"
123 | ],
124 | "language": "python",
125 | "metadata": {},
126 | "outputs": []
127 | },
128 | {
129 | "cell_type": "markdown",
130 | "metadata": {},
131 | "source": [
132 | "6. We now assume that we only have access to the data points. These points could have been obtained during an experiment. The points appear to follow a sigmoid approximately, so we may want to fit such a curve to them. That's what **curve fitting** is about. SciPy's `curve_fit` function allows us to fit a curve defined by an arbitrary Python function to the data."
133 | ]
134 | },
135 | {
136 | "cell_type": "code",
137 | "collapsed": false,
138 | "input": [
139 | "(a_, b_, c_, d_), _ = opt.curve_fit(f, x, y, (a, b, c, d))"
140 | ],
141 | "language": "python",
142 | "metadata": {},
143 | "outputs": []
144 | },
145 | {
146 | "cell_type": "markdown",
147 | "metadata": {},
148 | "source": [
149 | "7. Now, let's take a look at the fitted sigmoid curve."
150 | ]
151 | },
152 | {
153 | "cell_type": "code",
154 | "collapsed": false,
155 | "input": [
156 | "y_fit = f(x, a_, b_, c_, d_)"
157 | ],
158 | "language": "python",
159 | "metadata": {},
160 | "outputs": []
161 | },
162 | {
163 | "cell_type": "code",
164 | "collapsed": false,
165 | "input": [
166 | "plt.figure(figsize=(6,4));\n",
167 | "plt.plot(x, y_model, '--k');\n",
168 | "plt.plot(x, y, 'o');\n",
169 | "plt.plot(x, y_fit, '-');"
170 | ],
171 | "language": "python",
172 | "metadata": {},
173 | "outputs": []
174 | },
175 | {
176 | "cell_type": "markdown",
177 | "metadata": {},
178 | "source": [
179 | "The fitted sigmoid appears to be quite close to the original sigmoid used to generate the data."
180 | ]
181 | },
182 | {
183 | "cell_type": "markdown",
184 | "metadata": {},
185 | "source": [
186 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
187 | "\n",
188 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
189 | ]
190 | }
191 | ],
192 | "metadata": {}
193 | }
194 | ]
195 | }
--------------------------------------------------------------------------------
/notebooks/chapter12_deterministic/02_cellular.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:421e3554d86e59aa983d04ba26bc8f23411a521a95e56dc7ec37bd1b3d4c8165"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 12.2. Simulating an elementary cellular automaton"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. We import NumPy and matplotlib."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import matplotlib.pyplot as plt\n",
38 | "%matplotlib inline"
39 | ],
40 | "language": "python",
41 | "metadata": {},
42 | "outputs": []
43 | },
44 | {
45 | "cell_type": "markdown",
46 | "metadata": {},
47 | "source": [
48 | "2. We will use the following vector to convert triplets of binary digits into numbers between 0 and 7."
49 | ]
50 | },
51 | {
52 | "cell_type": "code",
53 | "collapsed": false,
54 | "input": [
55 | "u = np.array([[4], [2], [1]])"
56 | ],
57 | "language": "python",
58 | "metadata": {},
59 | "outputs": []
60 | },
61 | {
62 | "cell_type": "markdown",
63 | "metadata": {},
64 | "source": [
65 | "3. We write a function that performs one iteration on the grid, updating all cells at once according to the given rule in binary representation. The first step consists of stacking circularly shifted versions of the grid to get the LCR triplets of each cell (`y`). Then, we convert these triplets into 3-bit numbers (`z`). Finally, we compute the next state of every cell using the specified rule."
66 | ]
67 | },
68 | {
69 | "cell_type": "code",
70 | "collapsed": false,
71 | "input": [
72 | "def step(x, rule_binary):\n",
73 |     "    \"\"\"Compute a single step of an elementary cellular\n",
74 | " automaton.\"\"\"\n",
75 |     "    # The columns contain the L, C, R values\n",
76 | " # of all cells.\n",
77 | " y = np.vstack((np.roll(x, 1), x,\n",
78 | " np.roll(x, -1))).astype(np.int8)\n",
79 | " # We get the LCR pattern numbers between 0 and 7.\n",
80 | " z = np.sum(y * u, axis=0).astype(np.int8)\n",
81 | " # We get the patterns given by the rule.\n",
82 | " return rule_binary[7-z]"
83 | ],
84 | "language": "python",
85 | "metadata": {},
86 | "outputs": []
87 | },
88 | {
89 | "cell_type": "markdown",
90 | "metadata": {},
91 | "source": [
92 | "4. We now write a function that simulates any elementary cellular automaton. First, we compute the binary representation of the rule (**Wolfram's code**). Then, we initialize the first row of the grid to random values. Finally, we apply the function `step` iteratively on the grid."
93 | ]
94 | },
95 | {
96 | "cell_type": "code",
97 | "collapsed": false,
98 | "input": [
99 | "def generate(rule, size=80, steps=80):\n",
100 | " \"\"\"Simulate an elementary cellular automaton given its rule\n",
101 | " (number between 0 and 255).\"\"\"\n",
102 | " # Compute the binary representation of the rule.\n",
103 | " rule_binary = np.array([int(_) \n",
104 | " for _ in np.binary_repr(rule, 8)],\n",
105 | " dtype=np.int8)\n",
106 | " x = np.zeros((steps, size), dtype=np.int8)\n",
107 | " # Random initial state.\n",
108 | " x[0,:] = np.random.rand(size) < .5\n",
109 | " # Apply the step function iteratively.\n",
110 | " for i in range(steps-1):\n",
111 | " x[i+1,:] = step(x[i,:], rule_binary)\n",
112 | " return x"
113 | ],
114 | "language": "python",
115 | "metadata": {},
116 | "outputs": []
117 | },
118 | {
119 | "cell_type": "markdown",
120 | "metadata": {},
121 | "source": [
122 | "5. Now, we simulate and display 9 different automata."
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "collapsed": false,
128 | "input": [
129 | "plt.figure(figsize=(6, 6));\n",
130 | "rules = [3, 18, 30, \n",
131 | " 90, 106, 110, \n",
132 | " 158, 154, 184]\n",
133 | "for i, rule in enumerate(rules):\n",
134 | " x = generate(rule)\n",
135 | " plt.subplot(331+i)\n",
136 | " plt.imshow(x, interpolation='none', cmap=plt.cm.binary);\n",
137 | " plt.xticks([]); plt.yticks([]);\n",
138 | " plt.title(str(rule))"
139 | ],
140 | "language": "python",
141 | "metadata": {},
142 | "outputs": []
143 | },
144 | {
145 | "cell_type": "markdown",
146 | "metadata": {},
147 | "source": [
148 | "It has been shown that Rule 110 is **Turing complete** (or **universal**): in principle, this automaton can simulate any computer program."
149 | ]
150 | },
151 | {
152 | "cell_type": "markdown",
153 | "metadata": {},
154 | "source": [
155 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
156 | "\n",
157 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
158 | ]
159 | }
160 | ],
161 | "metadata": {}
162 | }
163 | ]
164 | }
--------------------------------------------------------------------------------
/notebooks/chapter11_image/01_exposure.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:b560c8fc7f938b6be6e74cb1859be2cd7f1eee1b1bbca25d89019754a481662b"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 11.1. Manipulating the exposure of an image"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "You need scikit-image for this recipe. You will find the installation instructions [here](http://scikit-image.org/download.html).\n",
30 | "\n",
31 | "You also need to download the *Beach* dataset. (http://ipython-books.github.io)"
32 | ]
33 | },
34 | {
35 | "cell_type": "markdown",
36 | "metadata": {},
37 | "source": [
38 | "1. Let's import the packages."
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "collapsed": false,
44 | "input": [
45 | "import numpy as np\n",
46 | "import matplotlib as mpl\n",
47 | "import matplotlib.pyplot as plt\n",
48 | "import skimage.exposure as skie\n",
49 | "%matplotlib inline\n"
50 | ],
51 | "language": "python",
52 | "metadata": {},
53 | "outputs": []
54 | },
55 | {
56 | "cell_type": "markdown",
57 | "metadata": {},
58 | "source": [
59 | "2. We open an image with matplotlib. We keep only one of the RGB channels to obtain a grayscale image."
60 | ]
61 | },
62 | {
63 | "cell_type": "code",
64 | "collapsed": false,
65 | "input": [
66 | "img = plt.imread('data/pic1.jpg')[...,0]"
67 | ],
68 | "language": "python",
69 | "metadata": {},
70 | "outputs": []
71 | },
72 | {
73 | "cell_type": "markdown",
74 | "metadata": {},
75 | "source": [
76 | "3. We create a function that displays the image along with its **histogram**."
77 | ]
78 | },
79 | {
80 | "cell_type": "code",
81 | "collapsed": false,
82 | "input": [
83 | "def show(img):\n",
84 | " # Display the image.\n",
85 | " plt.figure(figsize=(8,2));\n",
86 | " plt.subplot(121);\n",
87 | " plt.imshow(img, cmap=plt.cm.gray);\n",
88 | " plt.axis('off');\n",
89 | " # Display the histogram.\n",
90 | " plt.subplot(122);\n",
91 | " plt.hist(img.ravel(), lw=0, bins=256);\n",
92 | " plt.xlim(0, img.max());\n",
93 | " plt.yticks([]);\n",
94 | " plt.show()"
95 | ],
96 | "language": "python",
97 | "metadata": {},
98 | "outputs": []
99 | },
100 | {
101 | "cell_type": "markdown",
102 | "metadata": {},
103 | "source": [
104 | "4. Let's display the image along with its histogram."
105 | ]
106 | },
107 | {
108 | "cell_type": "code",
109 | "collapsed": false,
110 | "input": [
111 | "show(img)"
112 | ],
113 | "language": "python",
114 | "metadata": {},
115 | "outputs": []
116 | },
117 | {
118 | "cell_type": "markdown",
119 | "metadata": {},
120 | "source": [
121 | "The histogram is unbalanced and the image appears slightly over-exposed."
122 | ]
123 | },
124 | {
125 | "cell_type": "markdown",
126 | "metadata": {},
127 | "source": [
128 | "5. Now, we rescale the intensity of the image using scikit-image's `rescale_intensity` function. The `in_range` and `out_range` parameters define a linear mapping from the original image to the modified image. The pixels that are outside `in_range` are clipped to the extremal values of `out_range`. Here, the darkest pixels (intensity less than 100) become completely black (0), whereas the brightest pixels (>240) become completely white (255)."
129 | ]
130 | },
131 | {
132 | "cell_type": "code",
133 | "collapsed": false,
134 | "input": [
135 | "show(skie.rescale_intensity(img,\n",
136 | " in_range=(100, 240), out_range=(0, 255)))"
137 | ],
138 | "language": "python",
139 | "metadata": {},
140 | "outputs": []
141 | },
142 | {
143 | "cell_type": "markdown",
144 | "metadata": {},
145 | "source": [
146 | "Many intensity values seem to be missing from the histogram, which reflects the poor quality of this exposure correction technique."
147 | ]
148 | },
149 | {
150 | "cell_type": "markdown",
151 | "metadata": {},
152 | "source": [
153 | "6. We now use a more advanced exposure correction technique called **Contrast Limited Adaptive Histogram Equalization**."
154 | ]
155 | },
156 | {
157 | "cell_type": "code",
158 | "collapsed": false,
159 | "input": [
160 | "show(skie.equalize_adapthist(img))"
161 | ],
162 | "language": "python",
163 | "metadata": {},
164 | "outputs": []
165 | },
166 | {
167 | "cell_type": "markdown",
168 | "metadata": {},
169 | "source": [
170 | "The histogram seems more balanced, and the image now appears more contrasted."
171 | ]
172 | },
173 | {
174 | "cell_type": "markdown",
175 | "metadata": {},
176 | "source": [
177 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
178 | "\n",
179 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
180 | ]
181 | }
182 | ],
183 | "metadata": {}
184 | }
185 | ]
186 | }
--------------------------------------------------------------------------------
/notebooks/chapter15_symbolic/02_solvers.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:df1c115c86837c0f0c3e0207e7fab7b16aa871306b22ac8b0f0e3d8789926413"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 15.2. Solving equations and inequalities"
23 | ]
24 | },
25 | {
26 | "cell_type": "code",
27 | "collapsed": false,
28 | "input": [
29 | "from sympy import *\n",
30 | "init_printing()"
31 | ],
32 | "language": "python",
33 | "metadata": {},
34 | "outputs": []
35 | },
36 | {
37 | "cell_type": "code",
38 | "collapsed": false,
39 | "input": [
40 | "var('x y z a')"
41 | ],
42 | "language": "python",
43 | "metadata": {},
44 | "outputs": []
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "Use the `solve` function to solve equations (the right-hand side is assumed to be 0)."
51 | ]
52 | },
53 | {
54 | "cell_type": "code",
55 | "collapsed": false,
56 | "input": [
57 | "solve(x**2 - a, x)"
58 | ],
59 | "language": "python",
60 | "metadata": {},
61 | "outputs": []
62 | },
63 | {
64 | "cell_type": "markdown",
65 | "metadata": {},
66 | "source": [
67 | "You can also solve inequalities. You may need to specify the domain of your variables. Here, we tell SymPy that `x` is a real variable."
68 | ]
69 | },
70 | {
71 | "cell_type": "code",
72 | "collapsed": false,
73 | "input": [
74 | "x = Symbol('x', real=True)\n",
75 | "solve_univariate_inequality(x**2 > 4, x)"
76 | ],
77 | "language": "python",
78 | "metadata": {},
79 | "outputs": []
80 | },
81 | {
82 | "cell_type": "markdown",
83 | "metadata": {},
84 | "source": [
85 | "## Systems of equations"
86 | ]
87 | },
88 | {
89 | "cell_type": "markdown",
90 | "metadata": {},
91 | "source": [
92 | "This function also accepts systems of equations (here a linear system)."
93 | ]
94 | },
95 | {
96 | "cell_type": "code",
97 | "collapsed": false,
98 | "input": [
99 | "solve([x + 2*y + 1, x - 3*y - 2], x, y)"
100 | ],
101 | "language": "python",
102 | "metadata": {},
103 | "outputs": []
104 | },
105 | {
106 | "cell_type": "markdown",
107 | "metadata": {},
108 | "source": [
109 | "Non-linear systems are also supported."
110 | ]
111 | },
112 | {
113 | "cell_type": "code",
114 | "collapsed": false,
115 | "input": [
116 | "solve([x**2 + y**2 - 1, x**2 - y**2 - S(1)/2], x, y)"
117 | ],
118 | "language": "python",
119 | "metadata": {},
120 | "outputs": []
121 | },
122 | {
123 | "cell_type": "markdown",
124 | "metadata": {},
125 | "source": [
126 | "Singular linear systems can also be solved (here, there are infinitely many solutions because the two equations are collinear)."
127 | ]
128 | },
129 | {
130 | "cell_type": "code",
131 | "collapsed": false,
132 | "input": [
133 | "solve([x + 2*y + 1, -x - 2*y - 1], x, y)"
134 | ],
135 | "language": "python",
136 | "metadata": {},
137 | "outputs": []
138 | },
139 | {
140 | "cell_type": "markdown",
141 | "metadata": {},
142 | "source": [
143 | "Now, let's solve a linear system using matrices with symbolic variables."
144 | ]
145 | },
146 | {
147 | "cell_type": "code",
148 | "collapsed": false,
149 | "input": [
150 | "var('a b c d u v')"
151 | ],
152 | "language": "python",
153 | "metadata": {},
154 | "outputs": []
155 | },
156 | {
157 | "cell_type": "markdown",
158 | "metadata": {},
159 | "source": [
160 | "We create the augmented matrix, which is the horizontal concatenation of the matrix of linear coefficients and the right-hand side vector."
161 | ]
162 | },
163 | {
164 | "cell_type": "code",
165 | "collapsed": false,
166 | "input": [
167 | "M = Matrix([[a, b, u], [c, d, v]]); M"
168 | ],
169 | "language": "python",
170 | "metadata": {},
171 | "outputs": []
172 | },
173 | {
174 | "cell_type": "code",
175 | "collapsed": false,
176 | "input": [
177 | "solve_linear_system(M, x, y)"
178 | ],
179 | "language": "python",
180 | "metadata": {},
181 | "outputs": []
182 | },
183 | {
184 | "cell_type": "markdown",
185 | "metadata": {},
186 | "source": [
187 | "This system needs to be non-singular to have a unique solution, which is equivalent to saying that the determinant of the system's matrix needs to be non-zero (otherwise the denominators in the fractions above are equal to zero)."
188 | ]
189 | },
190 | {
191 | "cell_type": "code",
192 | "collapsed": false,
193 | "input": [
194 | "det(M[:2,:2])"
195 | ],
196 | "language": "python",
197 | "metadata": {},
198 | "outputs": []
199 | },
200 | {
201 | "cell_type": "markdown",
202 | "metadata": {},
203 | "source": [
204 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
205 | "\n",
206 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
207 | ]
208 | }
209 | ],
210 | "metadata": {}
211 | }
212 | ]
213 | }
--------------------------------------------------------------------------------
/notebooks/chapter12_deterministic/03_ode.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:b28923943f64e7224cc7bcac23500952e7aadb1f72c70ddb576a999bb2b150e1"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 12.3. Simulating an Ordinary Differential Equation with SciPy"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. Let's import NumPy, SciPy (`integrate` package), and matplotlib."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import scipy.integrate as spi\n",
38 | "import matplotlib.pyplot as plt\n",
39 | "%matplotlib inline"
40 | ],
41 | "language": "python",
42 | "metadata": {},
43 | "outputs": []
44 | },
45 | {
46 | "cell_type": "markdown",
47 | "metadata": {},
48 | "source": [
49 | "2. We define a few parameters appearing in our model."
50 | ]
51 | },
52 | {
53 | "cell_type": "code",
54 | "collapsed": false,
55 | "input": [
56 | "m = 1. # particle's mass\n",
57 | "k = 1. # drag coefficient\n",
58 | "g = 9.81 # gravity acceleration"
59 | ],
60 | "language": "python",
61 | "metadata": {},
62 | "outputs": []
63 | },
64 | {
65 | "cell_type": "markdown",
66 | "metadata": {},
67 | "source": [
68 | "3. We have two variables: `x` and `y` (two dimensions). We write $\\mathbf{u}=(x,y)$. The ODE we are going to simulate is:\n",
69 | "\n",
70 | "$$\\ddot{\\mathbf{u}} = -\\frac{k}{m} \\dot{\\mathbf{u}} + \\mathbf{g}$$\n",
71 | "\n",
72 | "where $\\mathbf{g}$ is the gravity acceleration vector. In order to simulate this second-order ODE with SciPy, we can convert it to a first-order ODE (another option would be to solve for $\\dot{\\mathbf{u}}$ first, before integrating the solution). To do this, we consider two 2D variables: $\\mathbf{u}$ and $\\dot{\\mathbf{u}}$. We write $\\mathbf{v} = (\\mathbf{u}, \\dot{\\mathbf{u}})$, and we can express $\\dot{\\mathbf{v}}$ as a function of $\\mathbf{v}$. Now, we create the initial vector $\\mathbf{v}_0$ at time $t=0$: it has four components."
73 | ]
74 | },
75 | {
76 | "cell_type": "code",
77 | "collapsed": false,
78 | "input": [
79 | "# The initial position is (0, 0).\n",
80 | "v0 = np.zeros(4)\n",
81 | "# The initial speed vector is oriented\n",
82 | "# to the top right.\n",
83 | "v0[2] = 4.\n",
84 | "v0[3] = 10."
85 | ],
86 | "language": "python",
87 | "metadata": {},
88 | "outputs": []
89 | },
90 | {
91 | "cell_type": "markdown",
92 | "metadata": {},
93 | "source": [
94 | "4. We need to create a Python function $f$ that takes the current vector $\\mathbf{v}(t_0)$ and a time $t_0$ as arguments (with optional parameters), and that returns the derivative $\\dot{\\mathbf{v}}(t_0)$."
95 | ]
96 | },
97 | {
98 | "cell_type": "code",
99 | "collapsed": false,
100 | "input": [
101 | "def f(v, t0, k):\n",
102 | " # v has four components: v=[u, u'].\n",
103 | " u, udot = v[:2], v[2:]\n",
104 | " # We compute the second derivative u'' of u.\n",
105 | " udotdot = -k/m * udot\n",
106 | " udotdot[1] -= g\n",
107 | " # We return v'=[u', u''].\n",
108 | " return np.r_[udot, udotdot]"
109 | ],
110 | "language": "python",
111 | "metadata": {},
112 | "outputs": []
113 | },
114 | {
115 | "cell_type": "markdown",
116 | "metadata": {},
117 | "source": [
118 | "5. Now, we simulate the system for different values of $k$. We use the SciPy function `odeint`, defined in the `scipy.integrate` package."
119 | ]
120 | },
121 | {
122 | "cell_type": "code",
123 | "collapsed": false,
124 | "input": [
125 | "plt.figure(figsize=(6,3));\n",
126 | "# We want to evaluate the system on 30 linearly\n",
127 | "# spaced times between t=0 and t=3.\n",
128 | "t = np.linspace(0., 3., 30)\n",
129 | "# We simulate the system for different values of k.\n",
130 | "for k in np.linspace(0., 1., 5):\n",
131 | " # We simulate the system and evaluate $v$ on the \n",
132 | " # given times.\n",
133 | " v = spi.odeint(f, v0, t, args=(k,))\n",
134 | " # We plot the particle's trajectory.\n",
135 | " plt.plot(v[:,0], v[:,1], 'o-', mew=1, ms=8, mec='w',\n",
136 | " label='k={0:.1f}'.format(k));\n",
137 | "plt.legend();\n",
138 | "plt.xlim(0, 12);\n",
139 | "plt.ylim(0, 6);"
140 | ],
141 | "language": "python",
142 | "metadata": {},
143 | "outputs": []
144 | },
145 | {
146 | "cell_type": "markdown",
147 | "metadata": {},
148 | "source": [
149 | "The outermost trajectory (blue) corresponds to drag-free motion (without air resistance); it is a parabola. In the other trajectories, we can observe the increasing effect of air resistance, parameterized by $k$."
150 | ]
151 | },
152 | {
153 | "cell_type": "markdown",
154 | "metadata": {},
155 | "source": [
156 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
157 | "\n",
158 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
159 | ]
160 | }
161 | ],
162 | "metadata": {}
163 | }
164 | ]
165 | }
--------------------------------------------------------------------------------
/notebooks/chapter05_hpc/09_ipyparallel.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:777507e061a5a6b38bcb93a0a7a91cc1bf0ac7c02c425ffcad10c3e6cc70c7d7"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 5.9. Distributing Python code across multiple cores with IPython"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "First, we launch 4 IPython engines with `ipcluster start -n 4` in a console."
30 | ]
31 | },
32 | {
33 | "cell_type": "markdown",
34 | "metadata": {},
35 | "source": [
36 | "Then, we create a client that will act as a proxy to the IPython engines. The client automatically detects the running engines."
37 | ]
38 | },
39 | {
40 | "cell_type": "code",
41 | "collapsed": false,
42 | "input": [
43 | "from IPython.parallel import Client\n",
44 | "rc = Client()"
45 | ],
46 | "language": "python",
47 | "metadata": {},
48 | "outputs": []
49 | },
50 | {
51 | "cell_type": "markdown",
52 | "metadata": {},
53 | "source": [
54 | "Let's check the number of running engines."
55 | ]
56 | },
57 | {
58 | "cell_type": "code",
59 | "collapsed": false,
60 | "input": [
61 | "rc.ids"
62 | ],
63 | "language": "python",
64 | "metadata": {},
65 | "outputs": []
66 | },
67 | {
68 | "cell_type": "markdown",
69 | "metadata": {},
70 | "source": [
71 | "To run commands in parallel over the engines, we can use the %px magic or the %%px cell magic."
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "collapsed": false,
77 | "input": [
78 | "%%px\n",
79 | "import os\n",
80 | "print(\"Process {0:d}.\".format(os.getpid()))"
81 | ],
82 | "language": "python",
83 | "metadata": {},
84 | "outputs": []
85 | },
86 | {
87 | "cell_type": "markdown",
88 | "metadata": {},
89 | "source": [
90 | "We can specify which engines to run the commands on using the --targets or -t option."
91 | ]
92 | },
93 | {
94 | "cell_type": "code",
95 | "collapsed": false,
96 | "input": [
97 | "%%px -t 1,2\n",
98 | "# The os module has already been imported in the previous cell.\n",
99 | "print(\"Process {0:d}.\".format(os.getpid()))"
100 | ],
101 | "language": "python",
102 | "metadata": {},
103 | "outputs": []
104 | },
105 | {
106 | "cell_type": "markdown",
107 | "metadata": {},
108 | "source": [
109 | "By default, the %px magic executes commands in blocking mode: the cell returns when the commands have completed on all engines. It is possible to run non-blocking commands with the --noblock or -a option. In this case, the cell returns immediately, and the task's status and the results can be polled asynchronously from the IPython interactive session."
110 | ]
111 | },
112 | {
113 | "cell_type": "code",
114 | "collapsed": false,
115 | "input": [
116 | "%%px -a\n",
117 | "import time\n",
118 | "time.sleep(5)"
119 | ],
120 | "language": "python",
121 | "metadata": {},
122 | "outputs": []
123 | },
124 | {
125 | "cell_type": "markdown",
126 | "metadata": {},
127 | "source": [
128 | "The previous command returned an AsyncResult instance that we can use to poll the task's status."
129 | ]
130 | },
131 | {
132 | "cell_type": "code",
133 | "collapsed": false,
134 | "input": [
135 | "print(_.elapsed, _.ready())"
136 | ],
137 | "language": "python",
138 | "metadata": {},
139 | "outputs": []
140 | },
141 | {
142 | "cell_type": "markdown",
143 | "metadata": {},
144 | "source": [
145 |     "The %pxresult magic blocks until the task finishes."
146 | ]
147 | },
148 | {
149 | "cell_type": "code",
150 | "collapsed": false,
151 | "input": [
152 | "%pxresult"
153 | ],
154 | "language": "python",
155 | "metadata": {},
156 | "outputs": []
157 | },
158 | {
159 | "cell_type": "code",
160 | "collapsed": false,
161 | "input": [
162 | "print(_.elapsed, _.ready())"
163 | ],
164 | "language": "python",
165 | "metadata": {},
166 | "outputs": []
167 | },
168 | {
169 | "cell_type": "markdown",
170 | "metadata": {},
171 | "source": [
172 |     "IPython provides convenient functions for the most common use cases, such as a parallel map function."
173 | ]
174 | },
175 | {
176 | "cell_type": "code",
177 | "collapsed": false,
178 | "input": [
179 | "v = rc[:]\n",
180 | "res = v.map(lambda x: x*x, range(10))"
181 | ],
182 | "language": "python",
183 | "metadata": {},
184 | "outputs": []
185 | },
186 | {
187 | "cell_type": "code",
188 | "collapsed": false,
189 | "input": [
190 | "print(res.get())"
191 | ],
192 | "language": "python",
193 | "metadata": {},
194 | "outputs": []
195 | },
196 | {
197 | "cell_type": "markdown",
198 | "metadata": {},
199 | "source": [
200 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
201 | "\n",
202 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
203 | ]
204 | }
205 | ],
206 | "metadata": {}
207 | }
208 | ]
209 | }
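The blocking and non-blocking patterns above are specific to IPython.parallel, but the same ideas exist in the standard library. As a rough analogy (not IPython's API), here is a `concurrent.futures` sketch of a parallel map and an asynchronously polled task:

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

with ThreadPoolExecutor(max_workers=4) as ex:
    # Parallel map, analogous to v.map(lambda x: x*x, range(10)).
    res = list(ex.map(square, range(10)))

    # Non-blocking submission, analogous to %%px -a: submit() returns
    # a Future whose status can be polled with done().
    fut = ex.submit(square, 7)
    value = fut.result()  # blocks until the task finishes, like %pxresult

print(res)
print(value)
```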
--------------------------------------------------------------------------------
/notebooks/chapter03_notebook/03_controls.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:3ac8512f1eab399da758b33b61abb8225f8d6b129cf9456e316cb5e44f2bbb2c"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 |    "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 3.3. Adding custom controls in the notebook toolbar"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 |    "The CSS and Javascript of the HTML notebook can be customized through the files in `~/.ipython/profile_default/static/custom`, where `~` is your `HOME` directory, and `default` is your IPython profile. In this short recipe, we will use this feature to add a new button to the notebook toolbar at the top of every notebook. Specifically, this button linearly renumbers all code cells."
30 | ]
31 | },
32 | {
33 | "cell_type": "markdown",
34 | "metadata": {},
35 | "source": [
36 |    "1. First, we are going to inject Javascript code directly into the notebook. This is useful for testing purposes, or if you don't want your changes to be permanent. The Javascript code will be loaded with that notebook only. To do this, we can just use the `%%javascript` cell magic."
37 | ]
38 | },
39 | {
40 | "cell_type": "code",
41 | "collapsed": false,
42 | "input": [
43 | "%%javascript\n",
44 | "// This function allows us to add buttons \n",
45 | "// to the notebook toolbar.\n",
46 | "IPython.toolbar.add_buttons_group([\n",
47 | "{\n",
48 | " // The button's label.\n",
49 | " 'label': 'renumber all code cells',\n",
50 | " \n",
51 | " // The button's icon.\n",
52 | " // See a list of Font-Awesome icons here:\n",
53 | " // http://fortawesome.github.io/Font-Awesome/icons/\n",
54 | " 'icon': 'icon-list-ol',\n",
55 | " \n",
56 | " // The callback function.\n",
57 | " 'callback': function () {\n",
58 | " \n",
59 | " // We retrieve the lists of all cells.\n",
60 | " var cells = IPython.notebook.get_cells();\n",
61 | " \n",
62 | " // We only keep the code cells.\n",
63 | " cells = cells.filter(function(c)\n",
64 | " {\n",
65 | " return c instanceof IPython.CodeCell; \n",
66 | " })\n",
67 | " \n",
68 | " // We set the input prompt of all code cells.\n",
69 | " for (var i = 0; i < cells.length; i++) {\n",
70 | " cells[i].set_input_prompt(i + 1);\n",
71 | " }\n",
72 | " }\n",
73 | "}]);"
74 | ],
75 | "language": "python",
76 | "metadata": {},
77 | "outputs": []
78 | },
79 | {
80 | "cell_type": "markdown",
81 | "metadata": {},
82 | "source": [
83 | "Running this code cell adds a button in the toolbar. Clicking on this button automatically updates the prompt numbers of all code cells."
84 | ]
85 | },
86 | {
87 | "cell_type": "markdown",
88 | "metadata": {},
89 | "source": [
90 |    "2. To make these changes permanent, i.e., to add this button to every notebook opened within the current profile, we can open the file `~/.ipython/profile_default/static/custom/custom.js` and add the following code:"
91 | ]
92 | },
93 | {
94 | "cell_type": "markdown",
95 | "metadata": {},
96 | "source": [
97 | "```javascript\n",
98 | "$([IPython.events]).on('app_initialized.NotebookApp',\n",
99 | " function(){\n",
100 | "\n",
101 | " // Copy of the Javascript code above (step 1).\n",
102 | " IPython.toolbar.add_buttons_group([\n",
103 | " {\n",
104 | " // The button's label.\n",
105 | " 'label': 'renumber all code cells',\n",
106 | "\n",
107 | " // The button's icon.\n",
108 | " // See a list of Font-Awesome icons here:\n",
109 | " // http://fortawesome.github.io/Font-Awesome/icons/\n",
110 | " 'icon': 'icon-list-ol',\n",
111 | "\n",
112 | " // The callback function.\n",
113 | " 'callback': function () {\n",
114 | "\n",
115 | " // We retrieve the lists of all cells.\n",
116 | " var cells = IPython.notebook.get_cells();\n",
117 | "\n",
118 | " // We only keep the code cells.\n",
119 | " cells = cells.filter(function(c)\n",
120 | " {\n",
121 | " return c instanceof IPython.CodeCell; \n",
122 | " })\n",
123 | "\n",
124 | " // We set the input prompt of all code cells.\n",
125 | " for (var i = 0; i < cells.length; i++) {\n",
126 | " cells[i].set_input_prompt(i + 1);\n",
127 | " }\n",
128 | " }\n",
129 | " }]);\n",
130 | "});\n",
131 | "```"
132 | ]
133 | },
134 | {
135 | "cell_type": "markdown",
136 | "metadata": {},
137 | "source": [
138 |    "The code placed here is automatically loaded whenever a notebook page is opened."
139 | ]
140 | },
141 | {
142 | "cell_type": "markdown",
143 | "metadata": {},
144 | "source": [
145 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
146 | "\n",
147 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
148 | ]
149 | }
150 | ],
151 | "metadata": {}
152 | }
153 | ]
154 | }
--------------------------------------------------------------------------------
/notebooks/chapter03_notebook/04_css.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:10412c7481e58d28fa14c7bcea8016ef24bb344450bd9b362e81063251c06830"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 |    "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 3.4. Customizing the CSS style in the notebook"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 |    "You will need the *CSS* dataset on the book's website. (http://ipython-books.github.io)\n",
30 | "\n",
31 | "You are expected to know a bit of CSS3 for this recipe. You can find many tutorials online (see references at the end of this recipe)."
32 | ]
33 | },
34 | {
35 | "cell_type": "markdown",
36 | "metadata": {},
37 | "source": [
38 | "1. First, we create a new IPython profile to avoid messing with our regular profile."
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "collapsed": false,
44 | "input": [
45 | "!ipython profile create custom_css"
46 | ],
47 | "language": "python",
48 | "metadata": {},
49 | "outputs": []
50 | },
51 | {
52 | "cell_type": "markdown",
53 | "metadata": {},
54 | "source": [
55 |    "2. Now, we retrieve in Python the path to this profile (this should be `~/.ipython/profile_custom_css`) and to the `custom.css` file (empty by default)."
56 | ]
57 | },
58 | {
59 | "cell_type": "code",
60 | "collapsed": false,
61 | "input": [
62 | "dir = !ipython locate profile custom_css\n",
63 | "dir = dir[0]"
64 | ],
65 | "language": "python",
66 | "metadata": {},
67 | "outputs": []
68 | },
69 | {
70 | "cell_type": "code",
71 | "collapsed": false,
72 | "input": [
73 | "import os\n",
74 | "csspath = os.path.realpath(os.path.join(\n",
75 | " dir, 'static/custom/custom.css'))"
76 | ],
77 | "language": "python",
78 | "metadata": {},
79 | "outputs": []
80 | },
81 | {
82 | "cell_type": "code",
83 | "collapsed": false,
84 | "input": [
85 | "csspath"
86 | ],
87 | "language": "python",
88 | "metadata": {},
89 | "outputs": []
90 | },
91 | {
92 | "cell_type": "markdown",
93 | "metadata": {},
94 | "source": [
95 |    "3. We can now edit this file. We change the background color, the font size of code cells, and the border of some cells, and we highlight selected cells in edit mode."
96 | ]
97 | },
98 | {
99 | "cell_type": "code",
100 | "collapsed": false,
101 | "input": [
102 | "%%writefile {csspath}\n",
103 | "\n",
104 | "body {\n",
105 | " /* Background color for the whole notebook. */\n",
106 | " background-color: #f0f0f0;\n",
107 | "}\n",
108 | "\n",
109 | "/* Level 1 headers. */\n",
110 | "h1 {\n",
111 | " text-align: right;\n",
112 | " color: red;\n",
113 | "}\n",
114 | "\n",
115 | "/* Code cells. */\n",
116 | "div.input_area > div.highlight > pre {\n",
117 | " font-size: 10px;\n",
118 | "}\n",
119 | "\n",
120 | "/* Output images. */\n",
121 | "div.output_area img {\n",
122 | " border: 3px #ababab solid;\n",
123 | " border-radius: 8px;\n",
124 | "}\n",
125 | "\n",
126 | "/* Selected cells. */\n",
127 | "div.cell.selected {\n",
128 | " border: 3px #ababab solid;\n",
129 | " background-color: #ddd;\n",
130 | "}\n",
131 | "\n",
132 | "/* Code cells in edit mode. */\n",
133 | "div.cell.edit_mode {\n",
134 | " border: 3px red solid;\n",
135 | " background-color: #faa;\n",
136 | "}"
137 | ],
138 | "language": "python",
139 | "metadata": {},
140 | "outputs": []
141 | },
142 | {
143 | "cell_type": "markdown",
144 | "metadata": {},
145 | "source": [
146 | "4. Opening a notebook with the `custom_css` profile leads to the following style:"
147 | ]
148 | },
149 | {
150 | "cell_type": "markdown",
151 | "metadata": {},
152 | "source": [
153 |    "5. We can also use this stylesheet with nbconvert. We just have to convert a notebook to a static HTML document, and copy the `custom.css` file into the same folder. Here, we use a test notebook that has been downloaded from this book's website (see *Getting started*)."
154 | ]
155 | },
156 | {
157 | "cell_type": "code",
158 | "collapsed": false,
159 | "input": [
160 | "!cp {csspath} custom.css\n",
161 | "!ipython nbconvert --to html data/test.ipynb"
162 | ],
163 | "language": "python",
164 | "metadata": {},
165 | "outputs": []
166 | },
167 | {
168 | "cell_type": "markdown",
169 | "metadata": {},
170 | "source": [
171 |    "Here is what this HTML document looks like:"
172 | ]
173 | },
174 | {
175 | "cell_type": "code",
176 | "collapsed": false,
177 | "input": [
178 | "from IPython.display import IFrame\n",
179 | "IFrame('test.html', 600, 650)"
180 | ],
181 | "language": "python",
182 | "metadata": {},
183 | "outputs": []
184 | },
185 | {
186 | "cell_type": "markdown",
187 | "metadata": {},
188 | "source": [
189 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
190 | "\n",
191 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
192 | ]
193 | }
194 | ],
195 | "metadata": {}
196 | }
197 | ]
198 | }
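Outside of IPython, the same path construction and file writing done above with `!ipython locate` and `%%writefile` can be sketched in plain Python. This is a minimal illustration, with a temporary directory standing in for the real profile directory:

```python
import os
import tempfile

# Stand-in for the directory returned by `ipython locate profile custom_css`
# (a temporary directory here, NOT the real profile location).
profile_dir = tempfile.mkdtemp()

csspath = os.path.realpath(
    os.path.join(profile_dir, 'static', 'custom', 'custom.css'))
os.makedirs(os.path.dirname(csspath), exist_ok=True)

# Write a fragment of the stylesheet from the recipe.
css = """\
body {
    /* Background color for the whole notebook. */
    background-color: #f0f0f0;
}
"""
with open(csspath, 'w') as f:
    f.write(css)

print(csspath)
```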
--------------------------------------------------------------------------------
/notebooks/chapter11_image/02_filters.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:ab88678633bd1daf9651bb1d7dd3e9c6f78ddd85499b0fcafc55e2c99b053ade"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 |    "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 11.2. Applying filters on an image"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. Let's import the packages."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import matplotlib.pyplot as plt\n",
38 | "import skimage\n",
39 | "import skimage.filter as skif\n",
40 | "import skimage.data as skid\n",
41 | "import matplotlib as mpl\n",
42 | "%matplotlib inline\n"
43 | ],
44 | "language": "python",
45 | "metadata": {},
46 | "outputs": []
47 | },
48 | {
49 | "cell_type": "markdown",
50 | "metadata": {},
51 | "source": [
52 | "2. We create a function that displays a grayscale image."
53 | ]
54 | },
55 | {
56 | "cell_type": "code",
57 | "collapsed": false,
58 | "input": [
59 | "def show(img):\n",
60 | " plt.figure(figsize=(4,2));\n",
61 | " plt.imshow(img, cmap=plt.cm.gray);\n",
62 | " plt.axis('off')\n",
63 | " plt.show();"
64 | ],
65 | "language": "python",
66 | "metadata": {},
67 | "outputs": []
68 | },
69 | {
70 | "cell_type": "markdown",
71 | "metadata": {},
72 | "source": [
73 | "3. Now, we load the Lena image (bundled in scikit-image). We select a single RGB component to get a grayscale image."
74 | ]
75 | },
76 | {
77 | "cell_type": "code",
78 | "collapsed": false,
79 | "input": [
80 | "img = skimage.img_as_float(skid.lena())[...,0]"
81 | ],
82 | "language": "python",
83 | "metadata": {},
84 | "outputs": []
85 | },
86 | {
87 | "cell_type": "code",
88 | "collapsed": false,
89 | "input": [
90 | "show(img)"
91 | ],
92 | "language": "python",
93 | "metadata": {},
94 | "outputs": []
95 | },
96 | {
97 | "cell_type": "markdown",
98 | "metadata": {},
99 | "source": [
100 | "4. Let's apply a blurring **Gaussian filter** to the image."
101 | ]
102 | },
103 | {
104 | "cell_type": "code",
105 | "collapsed": false,
106 | "input": [
107 | "show(skif.gaussian_filter(img, 5.))"
108 | ],
109 | "language": "python",
110 | "metadata": {},
111 | "outputs": []
112 | },
113 | {
114 | "cell_type": "markdown",
115 | "metadata": {},
116 | "source": [
117 | "5. We now apply a **Sobel filter** that enhances the edges in the image."
118 | ]
119 | },
120 | {
121 | "cell_type": "code",
122 | "collapsed": false,
123 | "input": [
124 | "sobimg = skif.sobel(img)\n",
125 | "show(sobimg)"
126 | ],
127 | "language": "python",
128 | "metadata": {},
129 | "outputs": []
130 | },
131 | {
132 | "cell_type": "markdown",
133 | "metadata": {},
134 | "source": [
135 | "6. We can threshold the filtered image to get a *sketch effect*. We obtain a binary image that only contains the edges. We use a notebook widget to find an adequate thresholding value."
136 | ]
137 | },
138 | {
139 | "cell_type": "code",
140 | "collapsed": false,
141 | "input": [
142 | "from IPython.html import widgets\n",
143 | "@widgets.interact(x=(0.01, .4, .005))\n",
144 | "def edge(x):\n",
145 |     "    show(sobimg < x)"
146 |    ],
147 |    "language": "python",
148 |    "metadata": {},
149 |    "outputs": []
150 |   },
151 |   {
152 |    "cell_type": "markdown",
153 |    "metadata": {},
154 |    "source": [
203 |     "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
204 | "\n",
205 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
206 | ]
207 | }
208 | ],
209 | "metadata": {}
210 | }
211 | ]
212 | }
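For readers without scikit-image installed, the Sobel filter used above can be sketched in pure Python. This is a didactic simplification (3x3 kernels on interior pixels only, no normalization), not scikit-image's implementation:

```python
import math

def sobel(img):
    """Gradient magnitude via 3x3 Sobel kernels (interior pixels only)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical gradients.
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = math.hypot(gx, gy)
    return out

# Toy image: a vertical step edge between 0 and 1.
img = [[0, 0, 1, 1]] * 4
edges = sobel(img)
# Thresholding yields a binary "sketch": True only near the edge.
binary = [[v > 0.5 for v in row] for row in edges]
print(binary)
```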
--------------------------------------------------------------------------------
/notebooks/chapter08_ml/03_digits.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:7cbe378ea205083519bce986e5d90ae8ddd877b6860d7ce91314ab07a859fd49"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 |    "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 8.3. Learning to recognize handwritten digits with a K-nearest neighbors classifier"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "1. Let's do the traditional imports."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "collapsed": false,
35 | "input": [
36 | "import numpy as np\n",
37 | "import sklearn\n",
38 | "import sklearn.datasets as ds\n",
39 | "import sklearn.cross_validation as cv\n",
40 | "import sklearn.neighbors as nb\n",
41 | "import matplotlib.pyplot as plt\n",
42 | "%matplotlib inline\n"
43 | ],
44 | "language": "python",
45 | "metadata": {},
46 | "outputs": []
47 | },
48 | {
49 | "cell_type": "markdown",
50 | "metadata": {},
51 | "source": [
52 | "2. Let's load the digits dataset, part of the `datasets` module of scikit-learn. This dataset contains hand-written digits that have been manually labeled."
53 | ]
54 | },
55 | {
56 | "cell_type": "code",
57 | "collapsed": false,
58 | "input": [
59 | "digits = ds.load_digits()\n",
60 | "X = digits.data\n",
61 | "y = digits.target\n",
62 | "print((X.min(), X.max()))\n",
63 | "print(X.shape)"
64 | ],
65 | "language": "python",
66 | "metadata": {},
67 | "outputs": []
68 | },
69 | {
70 | "cell_type": "markdown",
71 | "metadata": {},
72 | "source": [
73 |    "In the matrix `X`, each row contains the $8 \\times 8=64$ pixels (in grayscale, with values between 0 and 16). The pixels are ordered in row-major order."
74 | ]
75 | },
76 | {
77 | "cell_type": "markdown",
78 | "metadata": {},
79 | "source": [
80 | "3. Let's display some of the images."
81 | ]
82 | },
83 | {
84 | "cell_type": "code",
85 | "collapsed": false,
86 | "input": [
87 | "nrows, ncols = 2, 5\n",
88 | "plt.figure(figsize=(6,3));\n",
89 | "plt.gray()\n",
90 | "for i in range(ncols * nrows):\n",
91 | " ax = plt.subplot(nrows, ncols, i + 1)\n",
92 | " ax.matshow(digits.images[i,...])\n",
93 | " plt.xticks([]); plt.yticks([]);\n",
94 | " plt.title(digits.target[i]);"
95 | ],
96 | "language": "python",
97 | "metadata": {},
98 | "outputs": []
99 | },
100 | {
101 | "cell_type": "markdown",
102 | "metadata": {},
103 | "source": [
104 | "4. Now, let's fit a K-nearest neighbors classifier on the data."
105 | ]
106 | },
107 | {
108 | "cell_type": "code",
109 | "collapsed": false,
110 | "input": [
111 | "(X_train, X_test, \n",
112 | " y_train, y_test) = cv.train_test_split(X, y, test_size=.25)"
113 | ],
114 | "language": "python",
115 | "metadata": {},
116 | "outputs": []
117 | },
118 | {
119 | "cell_type": "code",
120 | "collapsed": false,
121 | "input": [
122 | "knc = nb.KNeighborsClassifier()"
123 | ],
124 | "language": "python",
125 | "metadata": {},
126 | "outputs": []
127 | },
128 | {
129 | "cell_type": "code",
130 | "collapsed": false,
131 | "input": [
132 | "knc.fit(X_train, y_train);"
133 | ],
134 | "language": "python",
135 | "metadata": {},
136 | "outputs": []
137 | },
138 | {
139 | "cell_type": "markdown",
140 | "metadata": {},
141 | "source": [
142 | "5. Let's evaluate the score of the trained classifier on the test dataset."
143 | ]
144 | },
145 | {
146 | "cell_type": "code",
147 | "collapsed": false,
148 | "input": [
149 | "knc.score(X_test, y_test)"
150 | ],
151 | "language": "python",
152 | "metadata": {},
153 | "outputs": []
154 | },
155 | {
156 | "cell_type": "markdown",
157 | "metadata": {},
158 | "source": [
159 | "6. Now, let's see if our classifier can recognize a \"hand-written\" digit!"
160 | ]
161 | },
162 | {
163 | "cell_type": "code",
164 | "collapsed": false,
165 | "input": [
166 | "# Let's draw a 1.\n",
167 | "one = np.zeros((8, 8))\n",
168 | "one[1:-1, 4] = 16 # The image values are in [0, 16].\n",
169 | "one[2, 3] = 16"
170 | ],
171 | "language": "python",
172 | "metadata": {},
173 | "outputs": []
174 | },
175 | {
176 | "cell_type": "code",
177 | "collapsed": false,
178 | "input": [
179 | "plt.figure(figsize=(2,2));\n",
180 | "plt.imshow(one, interpolation='none');\n",
181 | "plt.grid(False);\n",
182 | "plt.xticks(); plt.yticks();\n",
183 | "plt.title(\"One\");"
184 | ],
185 | "language": "python",
186 | "metadata": {},
187 | "outputs": []
188 | },
189 | {
190 | "cell_type": "code",
191 | "collapsed": false,
192 | "input": [
193 | "knc.predict(one.ravel())"
194 | ],
195 | "language": "python",
196 | "metadata": {},
197 | "outputs": []
198 | },
199 | {
200 | "cell_type": "markdown",
201 | "metadata": {},
202 | "source": [
203 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
204 | "\n",
205 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
206 | ]
207 | }
208 | ],
209 | "metadata": {}
210 | }
211 | ]
212 | }
--------------------------------------------------------------------------------
/notebooks/chapter11_image/04_interest.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "",
4 | "signature": "sha256:a479b8ae3be7bcbbe4e766c1d91526548e38aa69daade24b2b31aa72bd0f1f05"
5 | },
6 | "nbformat": 3,
7 | "nbformat_minor": 0,
8 | "worksheets": [
9 | {
10 | "cells": [
11 | {
12 | "cell_type": "markdown",
13 |    "metadata": {},
14 | "source": [
15 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
16 | ]
17 | },
18 | {
19 | "cell_type": "markdown",
20 | "metadata": {},
21 | "source": [
22 | "# 11.4. Finding points of interest in an image"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {},
28 | "source": [
29 | "You need to download the *Child* dataset on the book's website. (http://ipython-books.github.io)"
30 | ]
31 | },
32 | {
33 | "cell_type": "markdown",
34 | "metadata": {},
35 | "source": [
36 | "1. Let's import the packages."
37 | ]
38 | },
39 | {
40 | "cell_type": "code",
41 | "collapsed": false,
42 | "input": [
43 | "import numpy as np\n",
44 | "import matplotlib.pyplot as plt\n",
45 | "import skimage\n",
46 | "import skimage.feature as sf\n",
47 | "import matplotlib as mpl\n",
48 | "%matplotlib inline\n"
49 | ],
50 | "language": "python",
51 | "metadata": {},
52 | "outputs": []
53 | },
54 | {
55 | "cell_type": "markdown",
56 | "metadata": {},
57 | "source": [
58 | "2. We create a function to display a colored or grayscale image."
59 | ]
60 | },
61 | {
62 | "cell_type": "code",
63 | "collapsed": false,
64 | "input": [
65 | "def show(img, cmap=None):\n",
66 | " cmap = cmap or plt.cm.gray\n",
67 | " plt.figure(figsize=(4,2));\n",
68 | " plt.imshow(img, cmap=cmap);\n",
69 | " plt.axis('off');"
70 | ],
71 | "language": "python",
72 | "metadata": {},
73 | "outputs": []
74 | },
75 | {
76 | "cell_type": "markdown",
77 | "metadata": {},
78 | "source": [
79 | "3. We load an image."
80 | ]
81 | },
82 | {
83 | "cell_type": "code",
84 | "collapsed": false,
85 | "input": [
86 | "img = plt.imread('data/pic2.jpg')"
87 | ],
88 | "language": "python",
89 | "metadata": {},
90 | "outputs": []
91 | },
92 | {
93 | "cell_type": "code",
94 | "collapsed": false,
95 | "input": [
96 | "show(img)"
97 | ],
98 | "language": "python",
99 | "metadata": {},
100 | "outputs": []
101 | },
102 | {
103 | "cell_type": "markdown",
104 | "metadata": {},
105 | "source": [
106 | "4. We find salient points in the image with the Harris corner method. The first step consists in computing the **Harris corner measure response image** with the `corner_harris` function (we will explain this measure in *How it works...*)."
107 | ]
108 | },
109 | {
110 | "cell_type": "code",
111 | "collapsed": false,
112 | "input": [
113 | "corners = sf.corner_harris(img[:,:,0])"
114 | ],
115 | "language": "python",
116 | "metadata": {},
117 | "outputs": []
118 | },
119 | {
120 | "cell_type": "code",
121 | "collapsed": false,
122 | "input": [
123 | "show(corners)"
124 | ],
125 | "language": "python",
126 | "metadata": {},
127 | "outputs": []
128 | },
129 | {
130 | "cell_type": "markdown",
131 | "metadata": {},
132 | "source": [
133 | "We see that the patterns in the child's coat are particularly well detected by this algorithm."
134 | ]
135 | },
136 | {
137 | "cell_type": "markdown",
138 | "metadata": {},
139 | "source": [
140 | "5. The next step consists in detecting corners from this measure image, using the `corner_peaks` function."
141 | ]
142 | },
143 | {
144 | "cell_type": "code",
145 | "collapsed": false,
146 | "input": [
147 | "peaks = sf.corner_peaks(corners)"
148 | ],
149 | "language": "python",
150 | "metadata": {},
151 | "outputs": []
152 | },
153 | {
154 | "cell_type": "code",
155 | "collapsed": false,
156 | "input": [
157 | "show(img)\n",
158 | "plt.plot(peaks[:,1], peaks[:,0], 'or', ms=4);"
159 | ],
160 | "language": "python",
161 | "metadata": {},
162 | "outputs": []
163 | },
164 | {
165 | "cell_type": "markdown",
166 | "metadata": {},
167 | "source": [
168 | "6. Finally, we create a box around the corner points to define our region of interest."
169 | ]
170 | },
171 | {
172 | "cell_type": "code",
173 | "collapsed": false,
174 | "input": [
175 | "ymin, xmin = peaks.min(axis=0)\n",
176 | "ymax, xmax = peaks.max(axis=0)\n",
177 | "w, h = xmax-xmin, ymax-ymin"
178 | ],
179 | "language": "python",
180 | "metadata": {},
181 | "outputs": []
182 | },
183 | {
184 | "cell_type": "code",
185 | "collapsed": false,
186 | "input": [
187 | "k = .25\n",
188 | "xmin -= k*w\n",
189 | "xmax += k*w\n",
190 | "ymin -= k*h\n",
191 | "ymax += k*h"
192 | ],
193 | "language": "python",
194 | "metadata": {},
195 | "outputs": []
196 | },
197 | {
198 | "cell_type": "code",
199 | "collapsed": false,
200 | "input": [
201 | "show(img[ymin:ymax,xmin:xmax])"
202 | ],
203 | "language": "python",
204 | "metadata": {},
205 | "outputs": []
206 | },
207 | {
208 | "cell_type": "markdown",
209 | "metadata": {},
210 | "source": [
211 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n",
212 | "\n",
213 | "> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
214 | ]
215 | }
216 | ],
217 | "metadata": {}
218 | }
219 | ]
220 | }
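The box computation in step 6 can be wrapped in a small standalone function. A sketch using plain Python lists of `(y, x)` points, mirroring the `peaks` array above:

```python
def bounding_box(points, k=0.25):
    """Axis-aligned box around (y, x) points, expanded by a fraction k."""
    ys = [p[0] for p in points]
    xs = [p[1] for p in points]
    ymin, ymax = min(ys), max(ys)
    xmin, xmax = min(xs), max(xs)
    w, h = xmax - xmin, ymax - ymin
    # Grow each side by k times the box's width/height.
    return (ymin - k*h, ymax + k*h, xmin - k*w, xmax + k*w)

# Hypothetical corner locations in (row, column) order.
peaks = [(10, 20), (30, 60), (20, 40)]
print(bounding_box(peaks))
```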
--------------------------------------------------------------------------------
/notebooks/chapter04_optimization/11_hdf5_table.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 3,
3 | "nbformat_minor": 0,
4 | "worksheets": [
5 | {
6 | "cells": [
7 | {
8 | "source": [
9 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
10 | ],
11 | "cell_type": "markdown",
12 |    "metadata": {}
13 | },
14 | {
15 | "source": [
16 | "# 4.11. Manipulating large heterogeneous tables with HDF5 and PyTables"
17 | ],
18 | "cell_type": "markdown",
19 | "metadata": {}
20 | },
21 | {
22 | "cell_type": "code",
23 | "language": "python",
24 | "outputs": [],
25 | "collapsed": false,
26 | "input": [
27 | "import numpy as np\n",
28 | "import tables as tb"
29 | ],
30 | "metadata": {}
31 | },
32 | {
33 | "source": [
34 | "We create a new HDF5 file."
35 | ],
36 | "cell_type": "markdown",
37 | "metadata": {}
38 | },
39 | {
40 | "cell_type": "code",
41 | "language": "python",
42 | "outputs": [],
43 | "collapsed": false,
44 | "input": [
45 | "f = tb.open_file('myfile.h5', 'w')"
46 | ],
47 | "metadata": {}
48 | },
49 | {
50 | "source": [
51 |    "We will create an HDF5 table with two columns: the name of a city (a string of at most 64 characters), and its population (a 32-bit integer)."
52 | ],
53 | "cell_type": "markdown",
54 | "metadata": {}
55 | },
56 | {
57 | "cell_type": "code",
58 | "language": "python",
59 | "outputs": [],
60 | "collapsed": false,
61 | "input": [
62 | "dtype = np.dtype([('city', 'S64'), ('population', 'i4')])"
63 | ],
64 | "metadata": {}
65 | },
66 | {
67 | "source": [
68 |    "Now, we create the table at `/table1`."
69 | ],
70 | "cell_type": "markdown",
71 | "metadata": {}
72 | },
73 | {
74 | "cell_type": "code",
75 | "language": "python",
76 | "outputs": [],
77 | "collapsed": false,
78 | "input": [
79 | "table = f.create_table('/', 'table1', dtype)"
80 | ],
81 | "metadata": {}
82 | },
83 | {
84 | "source": [
85 | "Let's add a few rows."
86 | ],
87 | "cell_type": "markdown",
88 | "metadata": {}
89 | },
90 | {
91 | "cell_type": "code",
92 | "language": "python",
93 | "outputs": [],
94 | "collapsed": false,
95 | "input": [
96 | "table.append([('Brussels', 1138854),\n",
97 | " ('London', 8308369),\n",
98 | " ('Paris', 2243833)])"
99 | ],
100 | "metadata": {}
101 | },
102 | {
103 | "source": [
104 |    "After adding rows, we need to flush the table to commit the changes to disk."
105 | ],
106 | "cell_type": "markdown",
107 | "metadata": {}
108 | },
109 | {
110 | "cell_type": "code",
111 | "language": "python",
112 | "outputs": [],
113 | "collapsed": false,
114 | "input": [
115 | "table.flush()"
116 | ],
117 | "metadata": {}
118 | },
119 | {
120 | "source": [
121 |    "Data can be retrieved from the table in many different ways in PyTables. The easiest, but least efficient, way is to load the entire table into memory, which returns a NumPy array."
122 | ],
123 | "cell_type": "markdown",
124 | "metadata": {}
125 | },
126 | {
127 | "cell_type": "code",
128 | "language": "python",
129 | "outputs": [],
130 | "collapsed": false,
131 | "input": [
132 | "table[:]"
133 | ],
134 | "metadata": {}
135 | },
136 | {
137 | "source": [
138 | "It is also possible to load a particular column (and all rows)."
139 | ],
140 | "cell_type": "markdown",
141 | "metadata": {}
142 | },
143 | {
144 | "cell_type": "code",
145 | "language": "python",
146 | "outputs": [],
147 | "collapsed": false,
148 | "input": [
149 | "table.col('city')"
150 | ],
151 | "metadata": {}
152 | },
153 | {
154 | "source": [
155 |    "When dealing with a large number of rows, we can perform a SQL-like query on the table to load all rows that satisfy particular conditions."
156 | ],
157 | "cell_type": "markdown",
158 | "metadata": {}
159 | },
160 | {
161 | "cell_type": "code",
162 | "language": "python",
163 | "outputs": [],
164 | "collapsed": false,
165 | "input": [
166 | "[row['city'] for row in table.where('population>2e6')]"
167 | ],
168 | "metadata": {}
169 | },
170 | {
171 | "source": [
172 | "Finally, we can access particular rows knowing their indices."
173 | ],
174 | "cell_type": "markdown",
175 | "metadata": {}
176 | },
177 | {
178 | "cell_type": "code",
179 | "language": "python",
180 | "outputs": [],
181 | "collapsed": false,
182 | "input": [
183 | "table[1]"
184 | ],
185 | "metadata": {}
186 | },
187 | {
188 | "source": [
189 | "Clean-up."
190 | ],
191 | "cell_type": "markdown",
192 | "metadata": {}
193 | },
194 | {
195 | "cell_type": "code",
196 | "language": "python",
197 | "outputs": [],
198 | "collapsed": false,
199 | "input": [
200 | "f.close()\n",
201 | "import os\n",
202 | "os.remove('myfile.h5')"
203 | ],
204 | "metadata": {}
205 | },
206 | {
207 | "source": [
208 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n\n> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
209 | ],
210 | "cell_type": "markdown",
211 | "metadata": {}
212 | }
213 | ],
214 | "metadata": {}
215 | }
216 | ],
217 | "metadata": {
218 | "name": "",
219 | "signature": "sha256:3825624d6bf4f8e38eb125e50bfd8c7a90334b8b4a72b373a07c7ac2c38b8421"
220 | }
221 | }
--------------------------------------------------------------------------------
/notebooks/chapter05_hpc/05_ray_1.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 3,
3 | "nbformat_minor": 0,
4 | "worksheets": [
5 | {
6 | "cells": [
7 | {
8 | "source": [
9 | "> This is one of the 100 recipes of the [IPython Cookbook](http://ipython-books.github.io/), the definitive guide to high-performance scientific computing and data science in Python.\n"
10 | ],
11 | "cell_type": "markdown",
12 |    "metadata": {}
13 | },
14 | {
15 | "source": [
16 | "# 5.5. Ray tracing: pure Python"
17 | ],
18 | "cell_type": "markdown",
19 | "metadata": {}
20 | },
21 | {
22 | "source": [
23 | "In this example, we will render a sphere with a diffuse and specular material. The principle is to model a scene with a light source and a camera, and use the physical properties of light propagation to calculate the light intensity and color of every pixel of the screen."
24 | ],
25 | "cell_type": "markdown",
26 | "metadata": {}
27 | },
28 | {
29 | "cell_type": "code",
30 | "language": "python",
31 | "outputs": [],
32 | "collapsed": false,
33 | "input": [
34 | "import numpy as np\n",
35 | "import matplotlib.pyplot as plt"
36 | ],
37 | "metadata": {}
38 | },
39 | {
40 | "cell_type": "code",
41 | "language": "python",
42 | "outputs": [],
43 | "collapsed": false,
44 | "input": [
45 | "%matplotlib inline"
46 | ],
47 | "metadata": {}
48 | },
49 | {
50 | "cell_type": "code",
51 | "language": "python",
52 | "outputs": [],
53 | "collapsed": false,
54 | "input": [
55 | "w, h = 200, 200 # Size of the screen in pixels.\n",
56 | "\n",
57 | "def normalize(x):\n",
58 | " # This function normalizes a vector.\n",
59 | " x /= np.linalg.norm(x)\n",
60 | " return x\n",
61 | "\n",
62 | "def intersect_sphere(O, D, S, R):\n",
63 | " # Return the distance from O to the intersection \n",
64 | " # of the ray (O, D) with the sphere (S, R), or \n",
65 | " # +inf if there is no intersection.\n",
66 | " # O and S are 3D points, D (direction) is a \n",
67 | " # normalized vector, R is a scalar.\n",
68 | " a = np.dot(D, D)\n",
69 | " OS = O - S\n",
70 | " b = 2 * np.dot(D, OS)\n",
71 | " c = np.dot(OS, OS) - R*R\n",
72 | " disc = b*b - 4*a*c\n",
73 | " if disc > 0:\n",
74 | " distSqrt = np.sqrt(disc)\n",
75 | " q = (-b - distSqrt) / 2.0 if b < 0 \\\n",
76 | " else (-b + distSqrt) / 2.0\n",
77 | " t0 = q / a\n",
78 | " t1 = c / q\n",
79 | " t0, t1 = min(t0, t1), max(t0, t1)\n",
80 | " if t1 >= 0:\n",
81 | " return t1 if t0 < 0 else t0\n",
82 | " return np.inf\n",
83 | "\n",
84 | "def trace_ray(O, D):\n",
85 | " # Find first point of intersection with the scene.\n",
86 | " t = intersect_sphere(O, D, position, radius)\n",
87 | " # No intersection?\n",
88 | " if t == np.inf:\n",
89 | " return\n",
90 | " # Find the point of intersection on the object.\n",
91 | " M = O + D * t\n",
92 | " N = normalize(M - position)\n",
93 | " toL = normalize(L - M)\n",
94 | " toO = normalize(O - M)\n",
95 | " # Ambient light.\n",
96 | " col = ambient\n",
97 | " # Lambert shading (diffuse).\n",
98 | " col += diffuse * max(np.dot(N, toL), 0) * color\n",
99 | " # Blinn-Phong shading (specular).\n",
100 | " col += specular_c * color_light * \\\n",
101 | " max(np.dot(N, normalize(toL + toO)), 0) \\\n",
102 | " ** specular_k\n",
103 | " return col\n",
104 | "\n",
105 | "def run():\n",
106 | " img = np.zeros((h, w, 3))\n",
107 | " # Loop through all pixels.\n",
108 | " for i, x in enumerate(np.linspace(-1., 1., w)):\n",
109 | " for j, y in enumerate(np.linspace(-1., 1., h)):\n",
110 | " # Position of the pixel.\n",
111 | " Q[0], Q[1] = x, y\n",
112 | " # Direction of the ray going through the optical center.\n",
113 | " D = normalize(Q - O)\n",
114 | " # Launch the ray and get the color of the pixel.\n",
115 | " col = trace_ray(O, D)\n",
116 | " if col is None:\n",
117 | " continue\n",
118 | " img[h - j - 1, i, :] = np.clip(col, 0, 1)\n",
119 | " return img\n",
120 | "\n",
121 | "# Sphere properties.\n",
122 | "position = np.array([0., 0., 1.])\n",
123 | "radius = 1.\n",
124 | "color = np.array([0., 0., 1.])\n",
125 | "diffuse = 1.\n",
126 | "specular_c = 1.\n",
127 | "specular_k = 50\n",
128 | "\n",
129 | "# Light position and color.\n",
130 | "L = np.array([5., 5., -10.])\n",
131 | "color_light = np.ones(3)\n",
132 | "ambient = .05\n",
133 | "\n",
134 | "# Camera.\n",
135 | "O = np.array([0., 0., -1.]) # Position.\n",
136 | "Q = np.array([0., 0., 0.]) # Pointing to.\n",
137 | "\n",
138 | "img = run()\n",
139 | "plt.imshow(img);\n",
140 | "plt.xticks([]); plt.yticks([]);"
141 | ],
142 | "metadata": {}
143 | },
144 | {
145 | "cell_type": "code",
146 | "language": "python",
147 | "outputs": [],
148 | "collapsed": false,
149 | "input": [
150 | "%timeit run()"
151 | ],
152 | "metadata": {}
153 | },
154 | {
155 | "source": [
156 | "> You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).\n\n> [IPython Cookbook](http://ipython-books.github.io/), by [Cyrille Rossant](http://cyrille.rossant.net), Packt Publishing, 2014 (500 pages)."
157 | ],
158 | "cell_type": "markdown",
159 | "metadata": {}
160 | }
161 | ],
162 | "metadata": {}
163 | }
164 | ],
165 | "metadata": {
166 | "name": "",
167 | "signature": "sha256:1b529fac36f07d48a1babb0651018ae732a5122f9e19890163b9ca0713bb8295"
168 | }
169 | }
--------------------------------------------------------------------------------
/references/chapter02_best_practices.md:
--------------------------------------------------------------------------------
1 | # 2. Best practices in Interactive Computing
2 |
3 | Full list of references in Chapter 2 of the [IPython Cookbook](http://ipython-books.github.io), the definitive guide to high-performance scientific computing and data science in Python, by Dr. Cyrille Rossant, Packt Publishing, 500 pages, 2014.
4 |
5 | ## Python 2/Python 3
6 |
7 | * [What's new in Python 3?](http://docs.python.org/3.3/whatsnew/3.0.html).
8 | * [An excellent free book about porting code to Python 3, by Lennart Regebro](http://python3porting.com).
9 | * [Official recommendations about Python 2/Python 3 compatibility](http://docs.python.org/3/howto/pyporting.html).
10 | * [`2to3` module to convert Python 2 code to Python 3](http://docs.python.org/2/library/2to3.html).
11 | * [`six` compatibility module](http://pythonhosted.org/six/).
12 | * [Official wiki page about the Python 2/Python 3 question](http://wiki.python.org/moin/Python2orPython3).
13 | * [Python 3 questions and answers, by Nick Coghlan](http://python-notes.curiousefficiency.org/en/latest/python3/questions_and_answers.html).
14 | * [*"Ten awesome features of Python that you can't use because you refuse to upgrade to Python 3"*, a presentation by Aaron Meurer](http://asmeurer.github.io/python3-presentation/slides.html).
15 | * [Using the `__future__` module when writing compatibility code](http://docs.python.org/2/library/__future__.html).
16 | * [Key differences between Python 2 and Python 3](http://sebastianraschka.com/Articles/2014_python_2_3_key_diff.html).
17 |
18 |
19 | ## Integrated Development Environments for Python
20 |
21 | * [PyDev for Eclipse](http://pydev.org).
22 | * [Spyder, an open source IDE](http://code.google.com/p/spyderlib/).
23 | * [PyCharm](http://www.jetbrains.com/pycharm/).
24 | * [PyTools for Microsoft Visual Studio on Windows](http://pytools.codeplex.com).
25 | * [PyScripter](http://code.google.com/p/pyscripter/).
26 | * [IEP, the Interactive Editor for Python](http://www.iep-project.org).
27 |
28 |
29 | ## Git
30 |
31 | * [msysgit: Git for Windows](http://msysgit.github.io).
32 | * [TortoiseGit](http://code.google.com/p/tortoisegit/).
33 |
34 |
35 | ## Hosted Git Services
36 |
37 | * [GitHub](http://github.com).
38 | * [BitBucket](http://bitbucket.org).
39 | * [Google Code](http://code.google.com).
40 | * [Gitorious](http://gitorious.org).
41 | * [Sourceforge](http://sourceforge.net).
42 |
43 |
44 | ## Learning Git
45 |
46 | * [GitHub help](http://help.github.com).
47 | * [Pro Git book](http://git-scm.com).
48 | * [Hands-on tutorial](http://try.github.io).
49 | * [Git Guided Tour](http://gitimmersion.com).
50 | * [GitHub Git tutorial](http://git-lectures.github.io).
51 | * [Atlassian Git tutorial](http://www.atlassian.com/git).
52 | * [CodeSchool's online course](http://www.codeschool.com/courses/try-git).
53 | * [Git tutorial by Lars Vogel](http://www.vogella.com/tutorials/Git/article.html).
54 | * [Git tutorial for scientists](http://nyuccl.org/pages/GitTutorial/).
55 |
56 | ## Git workflows
57 |
58 | * [A popular but complex Git flow](http://nvie.com/posts/a-successful-git-branching-model/).
59 | * [A simpler workflow, used by GitHub](http://scottchacon.com/2011/08/31/github-flow.html).
60 | * [Different Git workflows](http://www.atlassian.com/git/workflows).
61 | * [Learn Git branching](http://pcottle.github.io/learnGitBranching/).
62 | * [The Git workflow recommended on the NumPy project (and others)](http://docs.scipy.org/doc/numpy/dev/gitwash/development_workflow.html).
63 | * [A post on the IPython mailing list about an efficient Git workflow, by Fernando Perez](http://mail.scipy.org/pipermail/ipython-dev/2010-October/006746.html).
64 |
65 |
66 | ## Good practices
67 |
68 | * [The Python Cookbook, by David Beazley, a must-read with many advanced recipes for Python 3](http://shop.oreilly.com/product/0636920027072.do).
69 | * [The Hitchhiker's Guide to Python](http://docs.python-guide.org/en/latest/).
70 | * [PEP8 rules](http://www.python.org/dev/peps/pep-0008/).
71 | * [PyLint, a static analysis tool](http://www.pylint.org).
72 | * [Design patterns in Python](http://github.com/faif/python-patterns).
73 | * [Design patterns on Wikipedia](http://en.wikipedia.org/wiki/Software_design_pattern).
74 | * [Coding standards of Tahoe-LAFS](http://tahoe-lafs.org/trac/tahoe-lafs/wiki/CodingStandards).
75 | * [*"How to be a great software developer"*, by Peter Nixey](http://peternixey.com/post/83510597580/how-to-be-a-great-software-developer).
76 | * [*"Why you should write buggy software with as few features as possible"*, a talk by Brian Granger](http://www.youtube.com/watch?v=OrpPDkZef5I).
77 |
78 |
79 | ## Packaging
80 |
81 | * [The Hitchhiker’s Guide to Packaging](http://guide.python-distribute.org).
82 | * [Python Packaging User Guide](http://python-packaging-user-guide.readthedocs.org).
83 | * [Conda](http://conda.pydata.org).
84 |
85 |
86 | ## Reproducibility
87 |
88 | * [*"An efficient workflow for reproducible science"*, a talk by Trevor Bekolay](http://bekolay.org/scipy2013-workflow/).
89 | * [*"Ten Simple Rules for Reproducible Computational Research"*, Sandve et al., PLoS Computational Biology, 2013](http://dx.doi.org/10.1371/journal.pcbi.1003285).
90 | * [Konrad Hinsen's blog](http://khinsen.wordpress.com).
91 |
92 | ### Tools
93 |
94 | * [Markdown, a simple markup language](http://daringfireball.net/projects/markdown/).
95 | * [Sphinx](http://sphinx-doc.org).
96 | * [Figshare for storing binary research data online](http://figshare.com).
97 | * [Datadryad for storing binary research data online](http://datadryad.org).
98 | * [joblib, a must-have tool for interactive computing](http://pythonhosted.org/joblib/).
99 | * [ipycache, providing a `%%cache` magic in IPython](http://github.com/rossant/ipycache).
100 | * [AutoIt to automate GUI actions](http://www.autoitscript.com/site/autoit/).
101 | * [AutoHotKey to create automation scripts on Windows](http://www.autohotkey.com).
102 |
103 |
104 | ## Unit testing
105 |
106 | * [Nose, a unit testing package for Python](http://nose.readthedocs.org/en/latest/testing.html).
107 | * [Test-driven development](http://en.wikipedia.org/wiki/Test-driven_development).
108 | * [*"Untested code is broken code"*, by Martin Aspeli](http://www.deloittedigital.com/eu/blog/untested-code-is-broken-code-test-automation-in-enterprise-software-deliver).
109 |
110 |
111 | ## Test coverage
112 |
113 | * [coverage.py, a Python module by Ned Batchelder](http://nedbatchelder.com/code/coverage/).
114 | * [coveralls.io, a test coverage tool](http://coveralls.io).
115 |
116 |
117 | ## Continuous integration
118 |
119 | * [Documentation of Travis CI in Python](http://about.travis-ci.org/docs/user/languages/python/).
120 |
121 |
122 | ## Debugging
123 |
124 | * [WinPdb, a graphical debugger](http://winpdb.org).
125 |
126 |
127 |
--------------------------------------------------------------------------------