├── .github └── FUNDING.yml ├── .gitignore ├── 00.Motivation-Classifying-fruit.ipynb ├── 0300.Representing-data-with-models.ipynb ├── 0500.Building-models.ipynb ├── 0700.Model-complexity.ipynb ├── 0900.What-is-learning.ipynb ├── 1100.Intro-to-neurons.ipynb ├── LICENSE.md ├── Project.toml ├── README.md ├── data ├── 104_100.jpg ├── 107_100.jpg ├── 10_100.jpg └── 8_100.jpg └── scripts └── draw_neural_net.jl /.github/FUNDING.yml: -------------------------------------------------------------------------------- 1 | # These are supported funding model platforms 2 | 3 | github: [JuliaLang] 4 | open_collective: julialang 5 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Files generated by invoking Julia with --code-coverage 2 | *.jl.cov 3 | *.jl.*.cov 4 | 5 | # Files generated by invoking Julia with --track-allocation 6 | *.jl.mem 7 | 8 | # System-specific files and directories generated by the BinaryProvider and BinDeps packages 9 | # They contain absolute paths specific to the host computer, and so should not be committed 10 | deps/deps.jl 11 | deps/build.log 12 | deps/downloads/ 13 | deps/usr/ 14 | deps/src/ 15 | 16 | # Build artifacts for creating documentation generated by the Documenter package 17 | docs/build/ 18 | docs/site/ 19 | 20 | # File generated by Pkg, the package manager, based on a corresponding Project.toml 21 | # It records a fixed state of all packages used by the project. As such, it should not be 22 | # committed for packages, but should be committed for applications that require a static 23 | # environment. 
24 | Manifest.toml 25 | .ipynb_checkpoints/ -------------------------------------------------------------------------------- /00.Motivation-Classifying-fruit.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Motivation\n", 8 | "\n", 9 | "Hello, and welcome! We're excited to be your gateway into machine learning. ML is a rapidly growing field that's buzzing with opportunity. Why? Because the tools and skills employed by ML specialists are extremely powerful and allow them to draw conclusions from large data sets quickly and with relative ease.\n", 10 | "\n", 11 | "Take the Celeste project, for example. This is a project that took 178 **tera**bytes of data on the visible sky and used it to catalogue 188 million stars and galaxies. \"Cataloguing\" these stars meant identifying characteristics like their locations, colors, sizes, and morphologies. This is an amazing feat, *especially* because this entire calculation took under 15 minutes.\n", 12 | "\n", 13 | "\"Drawing\"\n", 14 | "\n", 15 | "How are Celeste's calculations so fast? To achieve performance on this scale, the Celeste team writes its software in the Julia programming language and runs it on supercomputers at Lawrence Berkeley National Lab's NERSC. In this course, we unfortunately won't be able to give you access to a top 10 supercomputer, but we will teach you how to use Julia!\n", 16 | "\n", 17 | "We're confident that this course will put you on your way to understanding many of the important concepts and \"buzz words\" in ML. To get you started, we'll teach you how to teach a machine to tell the difference between images of apples and bananas, i.e. to **classify** images as being one or the other type of fruit.\n", 18 | "\n", 19 | "Like Project Celeste, we'll use the [Julia programming language](https://julialang.org/) to do this.
In particular, we'll be working in [Jupyter notebooks](http://jupyter.org/) like this one! (Perhaps you already know that the \"ju\" in Jupyter comes from Julia.)" 20 | ] 21 | }, 22 | { 23 | "cell_type": "markdown", 24 | "metadata": {}, 25 | "source": [ 26 | "## What do the images we want to classify look like?\n", 27 | "\n", 28 | "We can use the `Images.jl` package in Julia to load sample images from this dataset. Most of the data we will use live in the `data` folder in this repository." 29 | ] 30 | }, 31 | { 32 | "cell_type": "code", 33 | "execution_count": 4, 34 | "metadata": {}, 35 | "outputs": [], 36 | "source": [ 37 | "using Images # To execute hit + enter\n", 38 | "using FileIO" 39 | ] 40 | }, 41 | { 42 | "cell_type": "code", 43 | "execution_count": 5, 44 | "metadata": {}, 45 | "outputs": [ 46 | { 47 | "data": { 48 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAIAAAD/gAIDAAAABGdBTUEAALGPC/xhBQAAAAFzUkdCAK7OHOkAAAAgY0hSTQAAeiYAAICEAAD6AAAAgOgAAHUwAADqYAAAOpgAABdwnLpRPAAAIABJREFUeAF0wevPrul5FvbjOM/rvp/n3azxjBNvgghKipOQxCElYCL8KY1QkWpAalGRkIDIlBYhlA8gpH5o+weAKpUKqVIDLRVSQ6NsBElc4Th2SIhxLPBmvM/E9gz1bGzPbq13rfd9nvu+rvM4er/PmjUzxuH3a3InACU2JgImCjBAIA0aMEAYAEEKkGHZIhIMEIAF2RHNBEEAHhYcEQxsShURBGUFQxZJnAivCSAskHCUHEkANkgQkMQIeYAhSGaAjQHf4gkAQ4aFW6Vq0cLuvc/zrKrwBALErcDGuCXcCpzYAGjcCuKRVmCCxH+IuGUABAETIDa2ySAQROA13sCZE4A+eoskiXCLsLEsfWqRmZYYTEZVZSYAn5DgBiRgURITmTHKmSRhSXYGAFUfrbUgkkEQNk8ASALAWwiwquacDQOcp71GRTQQrwm8WeANJL0hNsQbmhEFJEEDBgQmwhBtQEQAJoQNsWEkSIAGjI0JAwyU5dLcJkByl6URrc373QRjDAWIWw5Qo/iagAEDhiFmkGCEgaBr9Nbab/zGh/7Nv/7tw/HmT/zYH/8v/6u/AAbgsY42TTIl0yAZEdgYJdmYWhtDkCOCAQ9jpokNidcRvw+S+Dat4ATxZgYJbwjCwiYAGAYQIDaCDQoObBgQQES2gDFGj8bMVirBNUwyW9goOZM0CcC3YFPEhmTEUAGCyqWWbI0Yx6c+9+kP/NLPv/zyy1/7vafecnH+E//Zn479WWbWUE4BEIYlFUmTkZEAJEiap4aTmBoIAyISt4g32AZAEifGa4g3NCAAkIABGgRAQKAI4iQgPBIgHCAoOEAChAEaVTKZwWwNlIBiDDAau0UgSSQLIDYynMFbSWwMEBtZE4MkYCzL5//tx1985qtvSY45v/yFz/3M//YP/8SffM9jZ3tEmmWbxoYJmqAfAiB4mttxVGsJwIGyMgJAwQACoLGhQdwyTNL4/bUEiEcIkKAAEMRJADSI19giDaTTAETYh
pGMyMDGIgiZNXbzZA0jQqNl2qZh6eruvVGrR7l0a9QGBVGGWgv3sSMx1qz6R3//773yzW9ejG7y8OD+lz/5qW/83ldffPaFPDvPTMiw/BCQm2nKzPls/8Tb3tHX436eyzIJojGMb2GCxutI4j+upXHLeA2xMZxIAAQg4yEZAGEkxBIw4IBIhATnuLlp04whSGhT9MJhZSbXHtMEgMcbnJ1fPf30P/u/f/bB1dXNzc3V3Vevrq6ur+5fX1/XGJAnsvoCebl+gNEhveWxy4Du7Hat1x22A+sv/eSfns4vp8vHomWwWmsgxxiCzy4vvvNt73j8O976/X/kB3767/zdKQPqZCADBBgZs4AACOIhAsSbEb+/FsZDpvEmhmnAAAgDBkCgkA23iq4GNRsyrK9/7jM//8/+n6ef+jJr1FITYyzrWFZASW/meYassdq+f+8KMiRbNM7sfZXtAGfm6CZdOUeb4EIvQGO5f0EWYxcR+70j+vWDaTdLsst2WkSM68OLr957eWpPP/mZj37w1w7rmuf773zHO//Mn/0v/sJf+cuwoBGRQODEuGW8gfiPahAQALERTIggQYIwNhIiYYGJqoBhu0XCxhgf/Kf/9KlPflJXV8e791567rmrl17m0BTTWkIvb9B7cDPgiGgRgO8YsBPcBCgpect22Zk5egkOowzbY9RErNWjxX63e/yJJ+4fjjdr1/GQBmQTZVeUR/eyCFiA44sv9ggR95//xj9/+eWP/fpH5ouLd3739/yln/qrb/sDfxAwMlQVORW8CYYsV81tcokRsBHEIw0PEYaI1xBRXRHBQCRuJZZ1zHMjgML955/76P/7L+/++2eefvJTz3/pKV5f7SO1HC/HoBxgY0smCQN2hbkJO1xBJmGYcIIbh2nDKKurVGiERQSBKAnZumpilKBlEDe5HucSNwNAGBi0BMFmDLhFDlXSnV7v3nvucHjuq8+s9lve+V3feOHr7/9v/tq7fvRHMXpEygNMkl1jisZGACRhQwITxEMNARBvRgBytgCwDjuYiaoRM4D+wlee+ebTz3z9C1/67V/+1Ze+9Lt3XI8t61kjq681yoqMAlSDySkbbQA0SGZEtmgM0gBsE7dsQ7YriMZcx8icJMC0uY4RGVFOUnCv8rJMo8NCpBGCFQgbYNmGCdQYkszIFgyglwqGX33+hX/xcz+H0d/zp977Pd//rv/0x/9kRhMkIGg84k1VTBPepIEAIRi3FCCcgC2sveZ9SoDHLgHry5958lMf+Vdf/Njv3H3638e9q7f2Ma3HHX3mqc3TzVg6nREyFSbYSIB2EGZERiQzAxFpF0CfgDRhc1MYuwZCmNLKIU9TVhlAAwUSKukMbMYolYNBGiBKpmEXGQ0go6x1qYGBKXNqLdu6LI/t9x/6lV/58K//+h/78fe8f/0bT7z9bd/7fd/PlgkathSRjCBZ65rzjEcaiNcFiEeYmCPXdezm8Lr2mweHV1/92X/wvzz/mc8+Dp/fLP3q3jRqsjLQ+9LLAGZOIUBItsmTBhQhIgAwAgwzjEQUlJk+YVi3AGhiiynWKjBGRpSneddvDraxsZOglGRDLqURHhaIkmjQAgxbEhgZebFLRS41+uhhT9mu7t7dne2vD4eP/Mtf++hvf+zP/Ln3/fTf+dvf/b3/CSPKjsihcmmapmgNxOsaTggaj1AGLURgNzeo+/rBz/3vP/ML/+Sf/OA73/m9Z2frSy+vV1dT74+fXzZwrEsfy9z2kGjQgJGMRmwKFMAgAySMEQgau4y2m2xXlakq64QwSUnZ2hTMqe3mM5IPHjyQvEnQckYzqoEwTUM0acg0gYKzRR+CObcdp0lLrUu3vS7Lvk3r4ZBT46jl+sEHf+UDL7300n//P/xPP/DuH8poZWUkIg04Sbyh4YQADBCPCBFleyz3Xnjhl/7xP/53H/zQD3/H29qrd+vBdRxuzhm52/feHfnY+Z15evzm5rgeD2W11thI0OnGUB+NzCQyAAsiH
ek2Tfs5C7UsZQLEACjs2xlAkTDXtZcXr6NfX892kSLBsF22GZGZMOEVBlx0AyoAEEBrUa51LBFwIKZEMMlRPQiNgSoQh/tXT37ik//wH/yvf+1v/Ld//D0/HhGyTOKEAPGahkdIAsStICErqWee+vK//sCvvPilp94xz7sHD3TvCofDRWvL0pHZIkIevY/lqK4WkUyESUxzkNQo0hHMRmYYIkAikoAcBckUyZxz32hTBUmtNQBD5bKqBy2jwY50cCVlgxGECyICLinCJiAgVMPzbu6KY61aytlsj96JIJ2RVaMBh+MBu7Pre1cf/a3fPD8/r6r3/Kn3BmNY3IB4k9ZrtGzYOCAwICMCYX3td5/67G/95ouf+8J89+60HOv6AfvaCJb2LSVFuUVqXcZYp5wJR0ZmkJRBoLUoTrYFNDKzmcFNIlozwYz5bAbCtiQERxcF25LmKdxidLllL1VZUskRYTMswSIA22IgGSAAQ85pUskyDW1QYQZbWWQyYhezxpico9ax6OrV/hsf/rXc5eXl5R959w+3aAXjW7Up21AFkwQTEkhApvH05z/3lU8/OV69uz8eb+7exXK8s9/psDYQsoAkp0hALXYJOhiBjNyYMErgFCkqAAciEG3KJDN6FUlQDQHHqgEgM0mO4SrBYoCONoURZthddjCaCVWBsEkLIgg4QcMGTbhkghswkALIJEwk4DAUTDBsCqV1mnfffOH5D//ahy7v3Pkf/+iPAAoQBonXtbH2Nk+SwBgqkkGgxvVzzz94/gVe3W+Hw7i+yaqJHIfDedtrFOBktBaNBFLS1EIwgACSRgY5mXBpalOACGQy5ykCZbkGVZIKJhJSRuzn3VBxY9AQoAKJiMjkGIMEaQYCLMmqYIRhgIYJbgD4BG8IY8AgJJV1K2ibJODNsiw5T9944esf/53f+dynn/zhH/2jwZAFBh5pbZpgRESvgUSCUOHm+kO/+Itf/cQn9PIrjyEKvKlKsLWJRpKMjIhGbAhEZgRJkIxbCAYySHZ12gyCzMyIAGS7tYagbdgAaGwSRGRFZSbJMSQNvcYbnJDEI34IMHwCG8QtbwDaggouuIiIRGlsYIEmGAzG2oeCaPzkv/1373//+z/ym/9qvzufdjPepAFQFcIto7uTxnL42X/0My999gvT1XXcHK9eemm9euUseLHf91G0I7NFcqMikMnMLKu1lkkHfUKCxO5sZxuAww6eRAtGNBMRQZVtEXb1sSDCNWwDCIPy60hHEggJvAUGIAGwDcK3ABAnBHwiS7AAB+dpGrAk+BbIAAFn5tp7MKZ5d3X33v/89/7+3/xbP/3O7/ouEK9rEDYRUR627l3d/eivfuAbTz/d7j+YD8ccA9Zu2iVGwtNuriIAA0FO0xQRLcmMRmYm0lWlKpAMIiMyGcFwnDACFOxslAQaXaPEzCAFuB4yAJ+QzAy70JrAIdnOTBGCZZGiAjAAGrRN0zRsw3aYAwYFpG0a3IBBmtgQqLFO0wywr8txXX7x53/hr//1/84SM/BIgxGZvfc2xY6+vnnwzBe+4KureV15vPH1Ta6dVsjRMM/z0r0BRQAR0VoEBMz7nYNCOcAWmRnBjYPzbtem0EOEDUmR4WQECIPOzNaaRq3HxVUUNwWRToKZkpkooUwSEUjmUJEMM4wAZDsAECe2w7wFhikDrmU5DEsCiQ3JINOcIvuo3qud7ZLx4jde+L/+z//jr/zUX/2eP/x9IB5qACxM0ySvV6++8rufevLmpZf54EEbA+tQ71alkZkkj33A0eapJSlr1BgDLXJqFWCQ2aaYMjMiAHSNi8uLs7OzzDwcrg/LkQCRrl4AW+x3M6eGZclMAGMM2wBIRoQknAREAiRtowBEtABI4hHbIAF4A9vY2CZe41uQJQsIbwAbIAFUFZJT21UfzJYR//wXf+knfuInvucPfx8eaQZI2A7GS88+94nf+jd4cDOtY7l+gOMNawUQEfPcChhjRHCaJkLHw83Zbl9VA
jNbtha7aUhtnqIlaQK76WI+2+8vznrvqDmBMVZvGAAic3e2b/tdWZuIYN5ylSTDJDMTKAkRITgbE1ljjL6UCSAiKONbeQOREcGyAHAD2LKMgG0QNGBXleBkjDEyHJsWo+pr/98zn//CZ3/g3T/09ne8A0BVNQZAVFWj2loPvvniZQHrGqN6daiSCAYjGNzPU4vJpZa8vr7eTTOA1ppsMaZpkl1hwRlJoAIPjssIaEPkbmZLSet6pAygl6SRJwCSLCAicEKCZEQAgmWhSqgBCobtqg6nbcA4sQ2QRmTKUlXJhRIDvAV4Q9MwQG8Aw5IjQhp2IjRPU2f90s//wjv/4Hf/uT//5yOCZDNQpZaBXrms54IeXON4rLHaJhCtzVOLiG5FtDGGS33RlHM4cp6yNQWzzW2/r9LwYGZMU5A5xcXZhaj1cBzqtmmDbPNUfZjuvdtOBsnee1WRGQQo22BnOBA+kSwJQGYy6HJaoxuP2AZhG0BVCcYJT2zLAgjANoiHwig7IoeLhkNjDMIgP/GJT3zlK18ZY+x2O5LNRMuA9ImPfORjv/qBXNbUBiwjwIxARGskLSOIYCLX3s8vL4YFCzYyMWXbnwFOmhGttYyY5vlmOWayXVzseG579KXWXpVEwiWJqMjsvY91rSoYD+kEj8RGo2BuwBYEUBWDA8YtCk7fAghJCJ6AJ0EGADOtgdeQhE0SgO0gvZHWGhHRpvbxj3/sj/3Yj733ve/d7/etAKiHsJPH/Zt9WYc1S7TCNEhXWS3blFObppzQl6XtdzYyY97vRwRbwzRjmqez+fxsHy25MeZ5PoPHWLWprj6CYLTQiKlDQ+uQCI2h6lWZ2ddeKo/SxiIJEEBEkARgW75VgiRusDEA2yA2tqdpUnCUoSHJwcicMtdeBEmELSAMkbCrKlvaLhXJCEpjyt1TX/zSk5/+9E/+5E8CaIAi8plPfer3Pv/5SRo3h6bSGDkqpCCIsA0gTtiiHw/ZsmVenJ2//Q981/PffHFBXVycYT/Nl+dvedt3XFxcmhzrWlUAjsdDVam6R2n0dVnY15iaex84YnCsAsCMZPRleFiSbZ4A3FivseRCwRZtA7CNE5KAAZggmZkgGrR2gbJDMr6NbZyQHKNyStsRQXJd1xdffPGFF16AbzWhAH75S1/86peeOlfV6EGOGtXXWHtmzK0ZBBCblpt2ttOoab/Ps7OYdzG3iDx/4vH943fOn3jLW97+ndO0KyvX3vsSEdy1qhp96csalQiORo0oGEpJTqJNYWEoIgD4hIQ3kjejNjqxCcJ2AMOPwAC4AUhUFYNkkoyIgqqqlxiNZAQB2wSMkymzl7iJqJPWmowxxrPPPvu5z33u3T/yI42klgWjZ/Xl/v02OqDGUGmsS06TMw3Yzszdbid73u2Kvata1fPf/Pp0eX75lsvp4uzOd771iXe8bX955/71g+NyTEbu9nTFPDeCa8pURSDmqS0316jimJgFxTRN1liWlRG2hwUgpLIpdhVObOOEsmTJNrxBAQQFBEnAJbHK8OgqyQwAYQCimTQYhmUSCGD41jTPfQxZzBhjINsY44tf/OKHP/zhd7/73c2FmOasMdUojRorBNgu7efZJ9laRFNhvzsHtR6Ots/Ozthw/tid6fI8zs/my8t2fn7++OP3r2+YLaIR7ssSgY2MNk28YO9LZNZKr4fGWS7UmspiPQSVUNEoaV1HtGSiSmOUgYgYYwDcJDE0GEEiM0uW7UCvQpDMKssCQCbJhMsgCQhAkDBNCq7RAQY5ekcwIoZNpmxL9155+TOf+TToNmfD4bpJ682NDjdc+81x2UskLLcWOScM89ayLPev7o4xznf7eZ5zajkFMhwseB39lVfuFgEpAmMdESEN2xFBRrGMsClw3p/XcgyuREosi8hpmo59MGO3myPicDiMtQvOTDdaBYsZLA6rbGYQBAsOQGTaZcMymDix6Q02QUIS6Ih0QAVpuPCQbZDeACBsg9xP88319b1XX
gXQqPrQL//qVz/z2TQAztGmffBwEAkgT2o4G7Nx7cdpmnZtmue5926Cve/liJgiaYwx9udngJKxGJEgZ0k4IWl7YnhuY21H1RpkBoKbaZrm1uZ5vr7/QFK0Nu122hQijXLJZRdR8Kgq0iQEkgYdhGBbMoIkHuJGJmBiYxtkZiJDtmuINGnDgG2Q3mBjG9fX12eXF7vdDkaD8PXnnn1w995ObtmAZawLl3XixlUV64JsczZmwpimKQySY/ScmkaNMXJd1+Uw9bMpY8qY5/1xuYnEQzqxXb2P3j3KGgTqlqskAYhoEy27NsuytNboWzWqpCq9xhuIkAygbIsgiARsEzBPLNkAHAlhIxgkDdgmghRJGCQtgdjYBuENTGGaptYaSdgNo++ipdkPh1kjQYDZGmsNsPoANLcpk4IASBCQVagBlcbajwdMOY7HfjzUclwThPqy6lY1pkfBhm3VZqxrjVWjH4/H3vsYQ1UEJI11OT64vx6PqiobYnVplCWUUxAgY8NbqCrJZRhhwzYAkhGRrdVwDRk2wIgwAjJZKvUiOAzBJksSULBBEbJBPFRV67o+++yzH/rgBxui1XJ0X1tEHTqlubXlwfUet4yCGQZsl0SQAERQ0rquDUZb5tqHSut6fe/ervry4IbJiIA9XGMMlUf1Wntfj6P3sS7V+1hWqCKICFRKfVmWMRSgQQxJiLINgt4gkzRKBEHACNphjVKVYRB8TUSQBGAbpF1kEgQsu1zsFgOATwDixDbeZIyx8+7e3bsf+chHGpbjRLQAVKSzhZfRWuOosiKitVZV67oCEjyEiGgMhl2dTow+luPyoA1pPd5ovbOMmvfTPM9xMjZdaz+u66rRIamPcVxG75BoQDaKJ1OkGaPKJduQIapk0TDJiKAMIzNbThzoqzRKgHgLvtV7ryrSmZQh0ABoBCnCsF1W0QBtgwQgGK+jgJznufd+PB7vvvpqw5TLsqzH40wlIwjBbAmF7WyZyd7XsbC1ZjiZpbFKAEi2FiZ8c93HyHW3v7zTgsdlXe/j7PIiM03Yrqp1XZdlcQ2UagyXajm6ytWrry5tWkQHSEYEDNtlo0zSlmGQyAANKLO13YzuRWMUCJEpArCkPoYIRmSmDFllwbBNMkj5BPQtgHjIeBNKQkSSPBwODa6q0VrrvcNqYEQs68qqaZpIr+sa0dbDEfM81DHPbZ5ba305ElyXY5OyMTKzp28ON2NhTkXdv7lxEMFpnmuMqg5g9O5SEh7Fqn48aAyXqq9VRd8SKNiwDIGD7hpdkpHzji3Zq4+hsaarr97ExhBoiyQQmbRVG5ciTdiQLGJje1g2DTgYEWWYIEjCNk5sRwaJdV2kajAILMsyR2RiPS5e+7ybY5XGgA3bpTBcnUDLtpvnueWrx+uGUB/OXK9vZiAzI4cKqw4DI52YiClHP6JE0jaqp8OuWjuq0pBqrGv1Xn3Yrt4lRYQA2LJLkgkGAiCZEUZWqsbGDtu4FbhFg4ILNuBgBJkJwjJUAAoGCARAEg95Axs23kCyqiJCUu+9/ea/+OVXXnllmpNHLqPPJFuOQ09bVVKFBXuaZldN86RRfVnpNkUmQx4ePXJCHzoe1xoDFswWETZoS122SUIudYlVXaNooAqjs4ZrQEN91JBLBGhYsgTRdkTIqiovGuUNDfl1AIg3c9gyYBMIE7JM2AYDJmmQIDYmIOP3ExGZSbL33p781Kfu3723W3vTCCNaugRSAAHbVWWbMz1Ew9JYjxgxT5NRgTZ6h63rAZvcA64aWqDWmKhAm7OqJAUoySXbtFUaY3VVALtsa2mUNAolAJY8amMjQNkoFWxgGLDCKBnfxnbRAEVItqwxKiDBNhwgHooIE7QB41vZBmkbMICqOh6PzTWmoPvAKJKQbbTMqIiYbNQo25KqKtZeSWa4Abal1qKsMdZoeTzeCMWWkphBUwUyd9O8jOPoKyLGGLX2AG1D6r1rVGxAFyRwqKpsS3KZMgUbEmwCwWCqAMqmD
BuPmIBxy2GLQTKLHlaVCxS+hTeAN7CINyNpvKGqDodDq2VNsGUjki6SAMYQqmhLqqqIHGN41CCJrDEa5uFjJPpi2Sb6oXMaCE6xk0S7SBHnZ/OUTcluoKSlr8uSpCQAGjXGcAmABI/SiV8nFjxUROLEJ7JsB1iUZVtGGhAshO2hMkgmCdgAvMG3sC1btiCAxn+IpHFLUu+9TdkOa68+duDUmntpFKTeu8fQ6LAjovceUsqNUZs+MpoNklLvfQCx2+/u3LmYz/Y3x8PxeESZLW9ubtZ1rb7WyVhXV/VSVUUERBeqLMm2LAYtyBxwGUUPacARGGXRAAouCYBgG34INuBHgDBoWLYRogEQtvGQN4DhDYgNmbZJ4pYDNGBbJ2OMVsuqUbvW3FdVqco2XRarjNLEyMx1rCQRXNc1g65uZyIef+KJ6+v7d+9ftWz9cHz15VfOLs5F2IattYM83BwJZabWPtYOoPduW1IgAcTGkEXQBElbFqWSjZOSCg4yMxlhoNcYhvwQDUswaFtwRJgwfAsAQRKEyjZM2hABg7dAMEARBIxbtkHGI5nZAtli6qXzNhMrYiQhwEEZO7ZQH33p1ZVZY5kQJSVDS19WvVwFYEJDjX1rqNEPN8goAkraLrmPbpE0gkLBU86Ce+/SiIgER4lAX1bbgsmoGhFZQ65KpqQEEk5CgG7hJAwJBIOwZRtgALANIiJoSFUEI9tuGuWlhmAZBQvwBjALCAC2QYBpAHbv/eKxO0888UTLTAQR2fvq6lSFRkQ4AqRtSRQB+ESpKTJNSVVdEjIaIwiWYBOUXaoCUIJsyUBEIBJAEEFqM0pSGM6sKgA8kayCJCIkeQNEhKxbvfdSWSLs6FUGMzMiAdSo6pZl0rABAzY2tiUZNYyNCMsFGzCxIQkQMB4haRtARJydnTVPuUK7DAEgpt3sMTxkwoSAjEAgkRGRjLA21StaRiAiYBAIEr1ERAQIjIJvjTJPJEBlgiRkj6GlCyxv2G1JBjxqA8eGkUHToROfDHiUJQABmCSMsmQOQLCDgRiSAyJEGJAtQLCqxDDDRqFkiCAJ49vRANlaOzs7u3PnTvveH/z+z959xWRrrY9jHwNVGZREAMFgOhwwDcNOzrsdJm0KBcs2Av24huEgpwRhCaWuApOkTyTR2JioMTQGohkuGCR4S6EqVw0BEHrvmTksBukgkKCsSMN0FUm4JPbSAASSDSRgEoYFFyDAgM0CbRa9wSMFAzRuGW8gCWJD8uLior3v/T/14nPPPveNb6QcmbSLUA0CLcOIglzlGhxysJ3taLiGpJjaNO/W9dj7MokiDdWyuIIRLbMF115l0bANmTYc2MgQCZdrwIKF8GZIJoIEbBQcTAiGbJMpy8RDPhFBMkjAMkoq2QERQsiSbVKgATAl2xAJBCGYNr4dw6BAlIr0Y4/faaCXqsipgWMYdrTU0kkiOORbKkphA2TEOvpYl2mazvf7y8fuLId28+BBLX3OCUlEFKCq4VtkyFLJdsgGIBuQBHtoFFQqZsz7fe+9goEoWVVraYyRMVVVNNoerlE15IJlDssEQQCC8YhggLIN+CEYZIG2RRi3TJi0YBtB26I3AEjipPeOiLIANCCm3ezgzfXRyzKTU0wVCwCTJkQ62JgkMnM5HrkRPXxz/0FVQcOlFqnNENOxy/3+7PHHHz8/P3/6a8+qV0kAYdqWBKDKAsqOTJBtnu/ceez6+vrQy1CVJBB5frZ7/K1P2H7llVfssgoPObgxSMM2YFu2QAEIlqmg7YJFgAnTkGgbBZoEDBsQSQPewHgT2y3nnKeLizuPP/7WhqCDgqeInPfoy1ARKZD2gCcCQZrwrQRVggF6rW47IuCiGRG5mZro4/F49+7dw2EBQupVxVsh28StDNoeZamGTNTaj9c3NQaAABEhoaqOx+OyLHUSa8UsAAAX60lEQVQiy6aFgizIFmwTAG8lDRi2AeKECJKAsXFILtqRJmQIFmHCuOUNQBInJizv9
/t3vetd73vf+xoQjBZtHroOMxBSZUuRzObqx9HPo2UQFG0KY/QWYVsCya7atUmjB0CyYJmSrx7cxM0Rkb33yBxjMHKa94fDAUBEjOpkAKStZbz64qvYFMuyKYBM21d370sa6gBkCgK48S3KLpVARAAYVhkKGChJQFlCFC3ApEkjbJckwDAIklUGGBkl2c5sJsoieXl55+1ve8cP/OAPNdiOHNDZflf3F621a7vj8iAjR5UIkkNKg7IkSK21ZEiVmZdvebyqL8tiBoJKgjChgOSqEYbtqoqINk9oGZoiYvQSIJs4kaESbACyAcO0BeMR34JhGwbKFATADJhlDGNjYlMwgLLxkAOgLRECbAswIcCEpYi0Dd7CiW04Mts0z0888QQjGsjv/kN/6P5Xnzk89yzMueVaA0FFcBAMMlVjtdKwofKdOxct4urqisT+4ryqrg83U+YqGZqn3bxrvap7qTFcRUdZrWVMzUCb5sw01lBp0ABJWwBoQDAow6AhEyZMkAkYlgmTQdKGUabBgsoeshgCByy7LAEFCxZQhokNaYAEQBAwjFsCbNM2SW9ggPv9fpp2QADRgPzP/+u/2F9++ePPfW2fbV1X2xEJ2kAiyLR7VZEN1rSb79273wLzPDvjueeek7Sb5lHD9jTtLi8vd2fz/evr5XAYXTQyc/RRG3jtvUskLy8vHRRukTBBgxmuMgzQNgDZJh4SYIIkGAYQsFOWyCKrXOYIFym5rIILMKJMECYesg0QJyRhA5AMwDYAEZQERybJt771rZeXlwAaQIxRu5bn+1h61jTGyDnHKPNWgmIaKjMQGrVrU4DVhxUIwu7rOjNhjONy95VXmbGua/WRESanaRo6KVdJFoDj0seQwI1g2ZICFAM24A0QBgyVXZJgwQLKNlG24EEbKLAYIoyQPegyRNo0YWDAAv7/5uAuVvf0Luv49/rd9/951rPW3rPnbbczbee9M0NnWja1MNNSXrUFSiwoaFEICCQWqC+gIYrVI0o8QiQ1xANN1BMSo0cmYmIQD5RaiLZiBFQKNlqgMyDMy56913r+931dPmt3drun0AObSPh8jOwZAZPiIAkIkEjQASDlgFTVsiyttZOTk7GuHcF2GUs/FbJ7UF/2+7NqVb3XHFKJhuwcKDaZcoDeKudwPJLee5XmfnWT7STruk4jyTcUNNXBiM/Ozjio1nqTdBYPW4lUgTjAJE5mHGESYXDkeKAZpnGY0gxTCpqKE5NJApEmsRTKwWSSBEn8HinEy5IgWmsnt138krd88Td90zf1ZemJsJ/6si+bz//u//yZn82cp6enElVqrUkND0lIIy5IwRRyVakv9jqhkEsmc2QSgSRukDSncQAfJEYOxpJKUmtqxRzrmAUtIYkVQBwkcSIJUbQpe9rODBMsZjyDpUgmM0ww5ywSnAQMFgcqhCDnACEpCZCEm0p9s9nsdrs777r91a+5h7H2FKrlda9/5J4HHvhY+3BU1fs8O7WIFDFQKyXCWFRYtpsKY+ytilqJMfZVbSSttc3SgLP99TFcVZLWOQBJ3JBEiQBVbijJVJLJuRCLwCQhUTlDtEiIWDMayYgHsmq11zCVSSYxiUgwSmJVkIkhB0IQcZAQcIL4lIBFSWpVrfXt5o477rjt0iUoWvqALpTWl6M6OTp97rmyy6Ew2ex2rS/r9auZ1Xv3um/qpY6wxtK3p2dzjNEoR9Va9aVKYwxPQFINkhQHIYmBlOe0kJTJ6eleZ+ucgUqmq2ZmJKiEOWPHqtaa45HMKKqJpzXRPsMqpJBpuzKJkylZFTCaZDomTiJGIA5MzkkEEiFV1TqGWiFttv3k5OThxx5//IknY6jqExpC3PXqe+64557f+O3f8X6vEqR6AzmeKkMjVX0dY8aFSI0x5siBqtmZgjHG1PRqzs04FKWDHDhCCRZzZulCyhgDARVGsvd+u91W9ev7s7PTAagUGJ7DXsMMe2dMDxSRVDiXxJADYSh1kyCTRCniRFgkf
FrEy1rZrqQtTdI61qPj42W7eetb3/r0W79UrYC+iP06NvJjb35zu/rSP/7Yx5b9dn/9dMnEnrbmoIleXj2ZEY4rVNXZul/nSCY32IzEdhJJTTiZmUUdJDEFRDkA2yOUDwgQImlZllo6lBMTwHOe7tdl2cx4oGlPmMRgsDCZMOKgkECgSuQGkoCAgoAlJjeET0sCJAElWZbl6Ohot9ttt9tlWdZ17QfA0dLxYIw6OfbR7vqL1za7Xa5fMxm2nFYHbWqKqtacYSdkHWtIwbQleU7hA6C1ZkGpUbnBkAMREkFpncMTWkmac5qotUuXLl1bz65fOxvDwEycVK/hOROrUi0HmjOxQzWBHYOdFAiQ7ZAEkwkJBkP4DEkh4WWSLJL03o+Od8uyvOlNb7r33nv3+/2yLJK6DIVoKPc8/Mi3fPd3/+QHf8JG61zapOF9fDZspGqt4pEUxI7j1lqpzznBSSpAIZtIrVXbbnan6942AuzYsYlKdlJChMz4oKQXr13fr+sYTolUPKNStTlGhIVJRKiAlerLjMPIxMIQSmLaEVKRODExFXFgISoHQogbqmraklpvvW2OtscnJxff8cfeeeXKlc1mww29CoLt6svurjvve/wL1mVZya63sY6QkKAonRqkSRE0eZ1QgMmMCyW2zkGbsSaUJpnx9KQUYoiUVk42R1uodV3HGEBVJXnxxReRUo2DEipgeqqVosC0Z2QRSWiKmZhYmiTCwoklwIJwMCHEIPF7WZBIQkpSvR0fHz/00EOPPPLIyckJcHZ2ttlsekhJqMVW6/14945vePeHfupfzWc+OVZVdOBEaql4HVVNxUFqVtUYU2La0UEFCAeSRsg6hmMbiJNS9S5pnXOOYSrJsGeCpJxblmXaqz2HqVatVy/F6xjG0zGxZAcpqtVzxAMZRwRyTlFFOEwwASwigjgXxE0F2K7WVL1a2+12d9555zvf+TWXL1/uywL03oEOmFQT1PC8ePnyu97zno/+3M+98Nzv9MxlwF77szHnVBWtnBHSW9Nsrff9nBJVfc7ZGwe2gaqa2NPTbq1V1YwRVKmqhMd6uj8bYyTRQZi2yXq2hjJSK6Ppac9pj9gkISjJRDazPMmIDYaAVYkmNpmJw4wDEb+v8LJqzaKk4+Pj2w7uuP0d73jH6x54iNQ61mVZbHcRIMF2q0ag190PvPbsd35bz3E0NmfraujLxvu1tUo6ZsxYtT/dK5EQSWnMfFprqipHCaDpOCDNMZPhxNGcVvWSnOFpSoUqNTynTUQJFZhzlRiQ0IETJ2GGkQzHMKUk1kHHMZkR1YASIY6rak5XCXAiCZAUzm02m5OTk7vuuvttb/vSixcukSCWZQGqqocJlJbWGuAxa7u89/u//5/86N/51Z//j1evPbdo2V64cPrii7GPj0/2p9dNIJAIUziK55wi3NBaU282k8SZc0gCChGSWNBKQp9ipYamJoknraqIGtVyABYBh4hPCVAKASJF4UCKKlFESkXNICFwcVApQKKqnJBYHEganscnJyfHJ5SOL174/h/4gaPjC46rGjd1IW4YY2S6qtaxLkdHu7vu0Mnx/oUXz8b1bm9OTqyz51+6urRmo+AwkTOJPWahTW9AElVzcFgT20rqoLdSWUybkGQmJIDt3GA7UqlUNXxDYlUgnkkMFpOXWURSdGBhcDKEgylzkJlEJCAOnCAhkUhCAh30vtludn2zeeptb/uRD/zto5OLGa6+cIsONT3x6L0jU1WqOfZf9bXv2r/40i+9+LNMz7N9xn5W2GzWsTrBTjACJQo5GDRJ03MdE7ANJEhOwBpxHNuUEA5JfAOloBkLtWozmfGYoZVaK2rYETMJJFg2kkpSAAkEiuIwSYhJyIETC0pAEkl8SqtW5Ui9HS3bZbu58/Ld991//733vQ6i1qZna42belCrZXokUdW63/fNUm151cMPf/nXft24tv/FD30Yz03btNJ6/ZoExCM2aZKKEmKsI04DmwNJqCELR
ZphDmuuHJQalQOUYBSVA9SM4kxFKkfWQakOxJBUmnMkBh2gA25IYjBMxVKCiSEQYc4pROiGJDqoklSq1vuFCxd2JydXrlx55zvfiZimSq0at+jc0FobYyx9WTYbxyBVe+DKlSd/49lf/ZX/cTrHevoSjVEsreJuAyGkkJNybRePaVK9OJAAG9tL6QBI4nOxPckBJUMiE3tMRxJEiUVg2pqDKMIhIrHBqUgRgZhEIUhQkhA2BxEESdxUUniZbqhqR0dHxycnb3jiiXe9611v/4ovR60ac8b2Zmnc1IXG8NKr9+74QGrVtM6xtLrz/te+9vHHf+35FxozQ306WbGTypCNAGFVxSNuram3dT8TSxpzldRorRoHCShzjmFa2SHQysEQqhoHMVMYzdiEYUqGCAtKhaCiMhJIghxYCQpIgkSQgoD5tBRCQlK1VlWb7e7S7bc//Mgj3/O+7/uyr/wqpCTrHEtfqhq36GO497INVJWqBNOo9eH5yBvfuEH/9Nlnnv34x6/+7ku92nYhtlAAVeZAgYyRKZbN0pblbJ6u62jVUEOysKdtSRy0Igk4McFxlINSVbedxLEhAp2jKp6RinIUUWozEiqoJDVtckCQoGBCQSQ1muUKM4ncqlTVDpal2nJ0cnzbpTv+xt98/xc9/XRi0YClLzZV3KqrC6gqPiVEVOG4qhHf9/qH3/uDP/jD739/k+bVa/urL25bX8/2RtGwXT1GoZbtScK1s71D6wuQSlNNkwRUQtKIp2cSSp7ohtWTie3hGRUHJZAQYU47cQ5QNYRFSsA8yDSJiEQV4aCqJ0LYY87Zlt6KzFEiREpVHR3v+ubo4u13PPTYo7uLFxNU3aZKtluVbVVxUxfipoRPKxXnwtH24uVX/dAPf+Dv/ejffebjv9bmvPrC88e7o4haC1jPrrfe+1IQzxlqTmMXan2ZY6pAAqaEnFTKVQvgDIs4JkSeMyXEQRLABDChqoRNiJVJpj0kJ+acbuBc5Rz2lFoEpWRaVcV2uz0b67Is1fpmObp05933P/jg973vfa+9/wFVJZI4sF1VkrhFL16WRFHEpxiIJSrR0ebuRx761u/6jp/65//sf/3ify3Ns2un1SorSZZlEQbmtCX1XjTWuVTbbvoLV18slSQgByaJVVLNeICTIFVT9bnfE4gOJnFizkWqKoPF6ukwpRnWOWawQwqEKp9CWqsIiEqynKSoakOh2tHuZLM7Prl02/0PPvAX/vJfunLlSj/aJUpSVUBVEXBo4qbOKykgAo5LNRlE1ZecXn/8LW/xWP9N8bH/8gvLshlXX5r7tW+2WfdkHgyiEpLAyXT2TtSmAsIzUc4J0nrLiFWB6VSvVm0oFQIkTiwCamUwmWQkq4chvZsYDduRJQSSACUR4sCyEKUEVagKdeHihb7dXrzttsfe8OSf/pb3PP32t9uBsmdrDbBdVYTP0rlJAcI5AZJMoKUC6GjLGG946i2/+cn//evP/PqLn/xtb5funa+fzXXtamrVohyAmSark+unVQ05llCiICjEup9z2kAJlVqrvqmxekzQJIZAdFBUzdiwJqdJoILRgIEsGVlYIClU4cwqQiR6K0pUqfXeNv1od+muOx997LGv+/qv/5p3vxtTVUC1FnAszsVWFbfofA5Cw6NXdzLnirPZbMi87/HH3vjMU7/80V+4+uxvTa6qqqvm2VnmEC1xMtNKbdMmc045thJHDQI9idTXMUKMiFKKasampgIONTQiAiVQJgyzzxzBwJyG4cwwIcQQSLAEVBUQEtFaq97UWt9sN0cX7rzrroceevg9f+ZbvubdfzzDtJIUzoUAVQXYblXcoocJiBahcJAECViqh9jufVswPHEefdOVR5948hO/8isf+KH3H91+yVdb70dnv/t/mvAkUYQOWnnjhb6/tmeClESVTCMh9c0SMVZP4mjdD5N1XVtrRAOMDFTNQBjJcCYyMplhxsNMVRBKwCIHRBKEc2qt9b5pvW92R7vjC
9uTi48+8Ybv/Z73fdGXfDFEywYIhABJSgUkqarYasVNnZskkXCDoMDQkFoXmT639B5Q02sefv2P/Njf/Ykf+/Grzz6zPvfi8fKqs+eeZ+7ZD3u1PedqKFytAqmQeEwTQJAEaSgzSWkmM3FJBGRilUUIlM1wpjMDailiDAFULiRFkBBIgKoyatJmu9seHbXN9uTCbZfuvOPHf+Lv3/2qy7vdLrGqJ0FyrAN0AMw5gVaNhFt0bkgiiRsUQoCSBAVCVY1q4dxqL9vd5Qce+M7ve59Pr/3bf/lTH/kPH+rt9tMXXtj7pU0dMfYyGWNM+rLMrH0pj6lWfbOcnp62Vvv9aNQaq9V+roEBKRLOVZM425+1TavW1nW1Y3E2rOrrOqxymOA4UVVNz6paPVtrfVlEtdJud7LZ7mqzvXzPq596+q1/4k9+88OPvh41DkocSECpuEVrLQkJr9S5QRIHEgmgEEGCJF6WBAjqrc+ZVpsHn3yCuJbejo8++uEP7y4c7Z951oSzmtevtb45m2fEm5MdiVlx5KT3vb3PXFSrUiStphljXZZNVU+yjjEz1WpAs9WbV6/DodaxWgVSb3OsQgfTOTDaHu2qarvdtr5ZDo52r7vvgaff/vY3PPHG+x968IkvvBJxIAkIn5MkEl6pQwmFcxJISQCFcwnnBAhFYNQ4mHFTQ3nwySe/+ujo8gP3P/fsb/77n/6ZXL9++tzzmVOGMYcQBPr2qKuuX7/OslEitCauptaUgxnJ5Nrp9aPdCds2x9mMJnFY51S103mGWttsA2NOO1SnImlZltWTaicXLlrsdrtXvfrer/jqP3q0O77n3td+4Zu/6LEn34gwEQLC70/cosQrdSFeSVISfj8KFAdVkpptkPpy/+OPPvD4o8/91rNT9fwzz378l/7bb33iE9eev7ocn+z3+7X1hs7svVO7Yy37ay+9tMcFtFo954yktH7d3l64cDbHfh0R1fqYezIdT7sv2xH29jpGbbbTUW+b3lbP5Wi3W7bHly4+/Mgjd9x9eXt0dN999//Fv/pXqA5JhHCw1Pj8dRJJ3BBukhAHSTgXQBLgdVQVYJGkWtvPoWJRu3T5Vd/x3u/F/hc/+ZM//+8+dHr1pesvvvDsJ5956fT05Ggn6Ww9a6pJsjvuR7urz79Q5/rULKC1OcZLc85EvZs5yahqTcPTkfHqpErbbVov2O6Oe6/777335LaLY+ae1732z33Xd77tK74SAS1zSBWiUmDioriF+H/TFUIkcYsQIUASr9RaQwLsSSmktx4y7aZi2RJ/w5/9tm/4tm9nzE987Ff/0T/8B//9l375+kvXNpvNVhck1XrWk9Nr1y+enCS5fvUlxtyf7c/OzmpZjKWSNIZpGFUvaJWa01XQW212EccXbmtLf9W99zz19Fv/1gd+eOzPWt+kRGtYY4y+2SSx3Vpz3FXTk2p8vjqgECIJMITJTaLxSs5QCOhcc1woUal5XdVaKB0dz5gNdz/2+vf99b92eu16oQNnSqo4ifAHP/jB//yfPrLZbcd+PVLdXu25556bXjPm2dkpve66+451Xec65pw2x72P5GyOa2frlbf8kW/99u/4krc+nda2xycj6bvjGVBNsN03C2C7tZZEwV57a7yC+WzFTeGc+IzOTUmQQopzSSTxe7TWSJCSjDmW1udMK2V19QWjImDVmsF2c+drXiOJ6QipwLws3/bn3/uN73kh0yRzzoYUPPbJ/OhHPvLTP/2vr127xpxtzNPTU6/rxTvu/MIrV77xm//UiO+48+4HH339HZcvT6dag9p7VrWEOWfvLcGhtbbf7zd9kQqFBPF560gI8bKGRONAfA6FOJBYGgetCdBSHDQQgg5dHRA3VEkQEJ/2xJU3cxA+I5yTX/P4F9z35JNXn3+htTbnHGPY3my2j73hC97y1FNICM8Etda5YanGgei9ARKIg81mw8san6343MRn64hbic+XuJV4JXFOQPFZxGeIJFK//5HXv+6hh
1trseecvfckQICq/X6/LIta5xbi/7vOHzK5AdJaAyRVFRKJqgS2l2WRZAII8Qel84dMVXHDGEPQWqvWSCTFds713uecUbVq/AH6v9/fEfl/slkLAAAAAElFTkSuQmCC", 49 | "text/plain": [ 50 | "100×100 Array{RGB{N0f8},2} with eltype RGB{N0f8}:\n", 51 | " RGB{N0f8}(0.996,1.0,0.984) … RGB{N0f8}(1.0,1.0,1.0)\n", 52 | " RGB{N0f8}(0.984,1.0,0.988) RGB{N0f8}(1.0,1.0,1.0)\n", 53 | " RGB{N0f8}(0.98,1.0,0.996) RGB{N0f8}(1.0,1.0,1.0)\n", 54 | " RGB{N0f8}(0.969,1.0,0.992) RGB{N0f8}(1.0,1.0,1.0)\n", 55 | " RGB{N0f8}(0.969,1.0,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 56 | " RGB{N0f8}(0.98,1.0,1.0) … RGB{N0f8}(1.0,1.0,1.0)\n", 57 | " RGB{N0f8}(0.992,0.996,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 58 | " RGB{N0f8}(1.0,0.992,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 59 | " RGB{N0f8}(1.0,0.984,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 60 | " RGB{N0f8}(1.0,0.984,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 61 | " RGB{N0f8}(1.0,0.992,1.0) … RGB{N0f8}(1.0,1.0,1.0)\n", 62 | " RGB{N0f8}(1.0,0.996,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 63 | " RGB{N0f8}(1.0,1.0,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 64 | " ⋮ ⋱ \n", 65 | " RGB{N0f8}(0.992,1.0,0.988) RGB{N0f8}(1.0,1.0,1.0)\n", 66 | " RGB{N0f8}(0.992,1.0,0.988) RGB{N0f8}(1.0,1.0,1.0)\n", 67 | " RGB{N0f8}(0.992,1.0,0.996) … RGB{N0f8}(1.0,1.0,1.0)\n", 68 | " RGB{N0f8}(0.996,1.0,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 69 | " RGB{N0f8}(0.996,0.996,0.996) RGB{N0f8}(1.0,1.0,1.0)\n", 70 | " RGB{N0f8}(1.0,0.992,0.996) RGB{N0f8}(1.0,1.0,1.0)\n", 71 | " RGB{N0f8}(1.0,0.992,0.996) RGB{N0f8}(1.0,1.0,1.0)\n", 72 | " RGB{N0f8}(1.0,0.992,0.996) … RGB{N0f8}(1.0,1.0,1.0)\n", 73 | " RGB{N0f8}(1.0,0.996,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 74 | " RGB{N0f8}(1.0,1.0,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 75 | " RGB{N0f8}(1.0,1.0,1.0) RGB{N0f8}(1.0,1.0,1.0)\n", 76 | " RGB{N0f8}(1.0,1.0,1.0) RGB{N0f8}(1.0,1.0,1.0)" 77 | ] 78 | }, 79 | "execution_count": 5, 80 | "metadata": {}, 81 | "output_type": "execute_result" 82 | } 83 | ], 84 | "source": [ 85 | "apple = load(\"data/10_100.jpg\")" 86 | ] 87 | }, 88 | { 89 | "cell_type": "code", 90 | "execution_count": null, 91 | "metadata": {}, 92 
| "outputs": [], 93 | "source": [ 94 | "banana = load(\"data/104_100.jpg\")" 95 | ] 96 | }, 97 | { 98 | "cell_type": "markdown", 99 | "metadata": {}, 100 | "source": [ 101 | "The dataset consists of many images of different fruits, viewed from different positions.\n", 102 | "The dataset is [available on GitHub here](https://github.com/Horea94/Fruit-Images-Dataset)." 103 | ] 104 | }, 105 | { 106 | "cell_type": "markdown", 107 | "metadata": {}, 108 | "source": [ 109 | "## What is the goal?" 110 | ] 111 | }, 112 | { 113 | "cell_type": "markdown", 114 | "metadata": {}, 115 | "source": [ 116 | "The ultimate goal is to feed one of these images to the computer and have it identify whether the image represents an apple or a banana! To do so, we will **train** the computer to learn **for itself** how to\n", 117 | "distinguish the two types of fruit.\n", 118 | "\n", 119 | "The following notebooks will walk you step by step through the underlying math and machine learning concepts you need to know in order to accomplish this classification.\n", 120 | "\n", 121 | "They alternate between two different types of notebooks: those labelled **ML** (Machine Learning), which are designed to give a high-level overview of the concepts we need for machine learning, but which gloss over some of the technical details; and those labelled **Tools**, which dive into the details of coding in Julia that will be key to actually implementing the machine learning algorithms ourselves.\n", 122 | "\n", 123 | "The notebooks contain many **Exercises**. By doing these exercises in Julia, you will learn the basics of machine learning!"
124 | ] 125 | } 126 | ], 127 | "metadata": { 128 | "kernelspec": { 129 | "display_name": "Julia 1.6.0", 130 | "language": "julia", 131 | "name": "julia-1.6" 132 | }, 133 | "language_info": { 134 | "file_extension": ".jl", 135 | "mimetype": "application/julia", 136 | "name": "julia", 137 | "version": "1.6.0" 138 | } 139 | }, 140 | "nbformat": 4, 141 | "nbformat_minor": 4 142 | } 143 | -------------------------------------------------------------------------------- /0700.Model-complexity.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Model complexity\n", 8 | "\n", 9 | "In the last notebook, we saw that we could customize a model by adding a parameter. Doing so, we were able to fit that model to a data point. This fit was as close to perfect as numerical precision would allow.\n", 10 | "\n", 11 | "In the next notebook, we'll see that as we add more data to our data set, fitting a model to our data usually becomes more challenging and the result will be less perfect.\n", 12 | "\n", 13 | "For one thing, we will find that we can add complexity to our model to capture added complexity in the data. We can do this by adding more parameters to our model.
We'll see that for a data set with two data points, we can again get a \"perfect\" fit by adding a second parameter to our model.\n", 14 | "\n", 15 | "However, we can't simply add a parameter to our model every time we add a data point to our data set, since this will lead to a phenomenon called **overfitting**.\n", 16 | "\n", 17 | "In the image below, we depict a data set that is close to linear, and models that exhibit underfitting, fitting well, and overfitting, from left to right:\n", 18 | "\n", 19 | "\"Drawing\"\n", 20 | "\n", 21 | "\n", 22 | "In the first image, the model accounts for the slope along which the data falls, but not the offset.\n", 23 | "\n", 24 | "In the second image, the model accounts for both the slope and the offset of the data. Adding this second parameter (the offset) to the model creates a much better fit.\n", 25 | "\n", 26 | "However, we can imagine that a model can have too many parameters, so that we begin to fit not only the high-level features of the data, but also the noise. This overfitting is depicted in the third image." 27 | ] 28 | }, 29 | { 30 | "cell_type": "markdown", 31 | "metadata": {}, 32 | "source": [ 33 | "Our aim will be to fit the data well, while avoiding *over*fitting the data!"
34 | ] 35 | } 36 | ], 37 | "metadata": { 38 | "kernelspec": { 39 | "display_name": "Julia 1.6.0", 40 | "language": "julia", 41 | "name": "julia-1.6" 42 | }, 43 | "language_info": { 44 | "file_extension": ".jl", 45 | "mimetype": "application/julia", 46 | "name": "julia", 47 | "version": "1.6.0" 48 | } 49 | }, 50 | "nbformat": 4, 51 | "nbformat_minor": 4 52 | } 53 | -------------------------------------------------------------------------------- /0900.What-is-learning.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## What is learning?\n", 8 | "\n", 9 | "Computers read data, as we saw in notebooks 1 and 2. We can then build functions that model that data to make decisions, as we saw in notebooks 3 and 5.\n", 10 | "\n", 11 | "But how do you make sure that the model actually fits the data well? In the last notebook, we saw that we can fiddle with the parameters of our function defining the model to reduce the loss function. However, we don't want to have to pick the model parameters ourselves. Choosing parameters ourselves works *well enough* when we have a simple model and only a few data points, but can quickly become extremely complex for more detailed models and larger data sets.\n", 12 | "\n", 13 | "Instead, we want our machine to *learn* the parameters that fit the model to our data, without needing us to fiddle with the parameters ourselves. In this notebook, we'll talk about the \"learning\" in machine learning." 14 | ] 15 | }, 16 | { 17 | "cell_type": "markdown", 18 | "metadata": {}, 19 | "source": [ 20 | "### Motivation: Fitting parameters by hand\n", 21 | "\n", 22 | "Let's go back to our example of fitting parameters from notebook 3. 
Recall that we looked at whether the amount of green in the pictures could distinguish between an apple and a banana, and used a sigmoid function to model our choice of \"apple or banana\" using the amount of green in an image." 23 | ] 24 | }, 25 | { 26 | "cell_type": "code", 27 | "execution_count": 2, 28 | "metadata": {}, 29 | "outputs": [], 149 | "source": [ 150 | "using Plots; gr()\n", 151 | "using Images, Statistics\n", 152 | "\n", 153 | "σ(x,w,b) = 1 / (1 + exp(-w*x+b))\n", 154 | "\n", 155 | "apple = load(\"data/10_100.jpg\")\n", 156 | "banana = load(\"data/104_100.jpg\")\n", 157 | "apple_green_amount = mean(Float64.(green.(apple)))\n", 158 | "banana_green_amount = mean(Float64.(green.(banana)));\n", 159 | "\n", 160 | "w = 10.0 # Try manipulating w between 0 and 30 to see how the plot changes\n", 161 | "b = 15.0 # Try manipulating b between 0 and 30\n", 162 | "\n", 163 | "plot(x->σ(x,w,b), 0, 1, label=\"Model\", legend = :topleft, lw=3)\n", 164 | "scatter!([apple_green_amount], [0.0], label=\"Apple\")\n", 165 | "scatter!([banana_green_amount], [1.0], label=\"Banana\")" 166 | ] 167 | }, 168 | { 169 | "cell_type": "markdown", 170 | "metadata": {}, 171 | "source": [ 172 | "Intuitively, how did you tweak the sliders so that the model
sends apples to 0 and bananas to 1? Most likely, you did the following:\n", 173 | "\n", 174 | "#### Move the sliders a bit, see whether the curve moves in the right direction, and if it did, keep doing it.\n", 175 | "\n", 176 | "For a machine, \"learning\" is that same process, translated into math!" 177 | ] 178 | }, 179 | { 180 | "cell_type": "markdown", 181 | "metadata": {}, 182 | "source": [ 183 | "## \"Learning by nudging\": The process of descent\n", 184 | "\n", 185 | "Let's start to formalize this idea. In order to push the curve in the \"right direction\", we need some measurement of \"how right\" and \"how wrong\" the model is. When we translate the idea of a \"right direction\" into math, we end up with a **loss function**, `L(w, b)`, as we saw in notebook 5. We say that the loss function is lowest when the model `σ(x, w, b)` performs the best.\n", 186 | "\n", 187 | "Now we want to create a loss function that is the lowest when the apple is at `0` and the banana is at `1`. If the data (the amount of green) for our apple is $x_1$, then our model will output $σ(x_1,w, b)$ for our apple. So, we want the difference $0 - σ(x_1, w, b)$ to be small. Similarly, if our data for our banana (the banana's amount of green) is $x_2$, we want the difference $1 - σ(x_2, w, b)$ to be small.\n", 188 | "\n", 189 | "To create our loss function, let's add together the squares of the difference of the model's output from the desired output for the apple and the banana. We get\n", 190 | "\n", 191 | "$$ L(w,b) = (0 - σ(x_1, w, b))^2 + (1 - σ(x_2, w, b))^2. 
$$\n", 192 | "\n", 193 | "$L(w, b)$ is lowest when it outputs `0` for the apple and `1` for the banana, and thus the cost is lowest when the model \"is correct\".\n", 194 | "\n", 195 | "We can visualize this function by plotting it in 3D with the `surface` function or in 2D with contour lines" 196 | ] 197 | }, 198 | { 199 | "cell_type": "code", 200 | "execution_count": 3, 201 | "metadata": {}, 202 | "outputs": [ 203 | { 204 | "data": { 205 | "text/plain": [ 206 | "Plots.GRBackend()" 207 | ] 208 | }, 209 | "execution_count": 3, 210 | "metadata": {}, 211 | "output_type": "execute_result" 212 | } 213 | ], 214 | "source": [ 215 | "# plotly() # The plotly backend is nice for 3d surface plots\n", 216 | "gr() # The GR backend is good for faster interactive plots" 217 | ] 218 | }, 219 | { 220 | "cell_type": "code", 221 | "execution_count": 4, 222 | "metadata": {}, 223 | "outputs": [ 224 | { 225 | "data": { 226 | "image/svg+xml": [ 227 | "\n", 228 | "\n", 229 | "\n", 230 | " \n", 231 | " \n", 232 | " \n", 233 | "\n", 234 | "\n", 237 | "\n", 238 | " \n", 239 | " \n", 240 | " \n", 241 | "\n", 242 | "\n", 245 | "\n", 246 | " \n", 247 | " \n", 248 | " \n", 249 | "\n", 250 | "\n", 253 | "\n", 256 | "\n", 259 | "\n", 262 | "\n", 265 | "\n", 268 | "\n", 271 | "\n", 274 | "\n", 277 | "\n", 280 | "\n", 283 | "\n", 286 | "\n", 289 | "\n", 292 | "\n", 295 | "\n", 298 | "\n", 301 | "\n", 304 | "\n", 307 | "\n", 310 | "\n", 316 | "\n", 322 | "\n", 328 | "\n", 334 | "\n", 340 | "\n", 346 | "\n", 352 | "\n", 358 | "\n", 364 | "\n", 370 | "\n", 376 | "\n", 382 | "\n", 388 | "\n", 394 | "\n", 400 | "\n", 406 | "\n", 412 | "\n", 418 | "\n", 424 | "\n", 429 | "\n", 430 | "\n", 433 | "\n", 434 | " \n", 435 | " \n", 436 | " \n", 437 | "\n", 438 | "\n", 441 | "\n", 444 | "\n", 447 | "\n", 450 | "\n", 453 | "\n", 456 | "\n", 459 | "\n", 462 | "\n", 465 | "\n", 468 | "\n", 471 | "\n", 474 | "\n", 477 | "\n", 480 | "\n", 483 | "\n", 486 | "\n", 489 | "\n", 492 | "\n", 495 | "\n", 498 | "\n", 501 | 
"\n", 504 | "\n", 507 | "\n", 510 | "\n", 513 | "\n", 516 | "\n", 522 | "\n", 523 | "\n", 524 | "\n", 527 | "\n", 530 | "\n", 533 | "\n", 534 | "\n", 535 | "\n" 536 | ] 537 | }, 538 | "execution_count": 4, 539 | "metadata": {}, 540 | "output_type": "execute_result" 541 | } 542 | ], 543 | "source": [ 544 | "L(w, b) = (0 - σ(apple_green_amount,w,b))^2 + (1 - σ(banana_green_amount,w,b))^2\n", 545 | "\n", 546 | "w_range = 10:0.1:13\n", 547 | "b_range = 0:1:20\n", 548 | "\n", 549 | "L_values = [L(w,b) for b in b_range, w in w_range]\n", 550 | "\n", 551 | "\n", 552 | "w = 11.5 # Try manipulating w with values from w_range (between 10 and 13)\n", 553 | "b = 10 # Try manipulating b with values from b_range (between 0 and 20)\n", 554 | "# p1 = surface(w_range, b_range, L_values, xlabel=\"w\", ylabel=\"b\", cam=(70,40), cbar=false, leg=false)\n", 555 | "# scatter!(p1, [w], [b], [L(w,b)+1e-2], markersize=5, color = :blue)\n", 556 | "p1 = contour(w_range, b_range, L_values, levels=0.05:0.1:1, xlabel=\"w\", ylabel=\"b\", cam=(70,40), cbar=false, leg=false)\n", 557 | "scatter!(p1, [w], [b], markersize=5, color = :blue)\n", 558 | "\n", 559 | "p2 = plot(x->σ(x,w,b), 0, 1, label=\"Model\", legend = :topleft, lw=3)\n", 560 | "scatter!(p2, [apple_green_amount], [0.0], label=\"Apple\", markersize=10)\n", 561 | "scatter!(p2, [banana_green_amount], [1.0], label=\"Banana\", markersize=10, xlim=(0,1), ylim=(0,1))\n", 562 | "plot(p1, p2, layout=(2,1))" 563 | ] 564 | }, 565 | { 566 | "cell_type": "markdown", 567 | "metadata": {}, 568 | "source": [ 569 | "The blue ball on the 3D plot shows the current parameter choices, plotted as `(w,b)`. Shown below the 3D plot is a 2D plot of the corresponding model with those parameters. Notice that as the blue ball rolls down the hill, the model becomes a better fit. Our loss function gives us a mathematical notion of a \"hill\", and the process of \"learning by nudging\" is simply rolling the ball down that hill." 
570 | ] 571 | }, 572 | { 573 | "cell_type": "markdown", 574 | "metadata": {}, 575 | "source": [ 576 | "To do this mathematically, we need to know which direction is \"downhill\". Recall from calculus that the derivative of `L` with respect to `b` tells you how `L` changes when `b` changes. Thus to roll downhill, we should go in the direction where the derivative is negative (the function goes down) for each parameter. This direction is the negative of what's called the **gradient**, $\\nabla L$. This means that the \"learning by nudging\" method can be rephrased in mathematical terms as:\n", 577 | "\n", 578 | "1. Calculate the gradient\n", 579 | "2. Move a little bit in the direction of the negative gradient\n", 580 | "3. Repeat\n", 581 | "\n", 582 | "This process of rolling the ball in the direction of the negative gradient is called **gradient descent**; written mathematically, it is\n", 583 | "\n", 584 | "$$p_{n+1} = p_n - \\eta \\nabla L(p_n).$$\n", 585 | "\n", 586 | "Here, $p_n$ represents the vector of current parameters $(w, b)$; $\\nabla L(p_n)$ is the gradient of the loss function, given those parameters. We start from $p_n$ and change it by subtracting $\\eta \\nabla L(p_n)$, where $\\eta$ is a small step size that determines how far we move the parameters in the direction of the negative gradient; notice that if you step too far, you'll overshoot the minimum! The result is $p_{n+1}$, the new vector of parameters.\n", 587 | "\n", 588 | "[Picture of Gradient Descent Vectors]\n", 589 | "\n", 590 | "If we repeat this process, then we will end up at parameters where the model correctly labels apples as `0` and bananas as `1`. When this happens, the model has learned from the data and can then read pictures and tell you whether they are apples or bananas!" 591 | ] 592 | }, 593 | { 594 | "cell_type": "markdown", 595 | "metadata": {}, 596 | "source": [ 597 | "#### Exercise 1\n", 598 | "\n", 599 | "Use the following terms to fill in the sentences below.
Terms may be used more than once or not at all:\n", 600 | "> gradient, loss function, derivative, gradient descent, learning.\n", 601 | "\n", 602 | "* We can think of a _(A)_ as a 1D version of a _(B)_.\n", 603 | "* We can visualize a _(C)_ as a hill.\n", 604 | "* In the explanation above, rolling downhill is called _(D)_ and means traveling along the _(E)_.\n", 605 | "* To quantify the correctness of a model we use a _(F)_.\n", 606 | "* When our program can minimize a _(G)_ on its own, we say it is _(H)_.\n", 607 | "\n", 608 | "

\n", 609 | "\n", 610 | "A)
\n", 611 | "B)
\n", 612 | "C)
\n", 613 | "D)
\n", 614 | "E)
\n", 615 | "F)
\n", 616 | "G)
\n", 617 | "H)
" 618 | ] 619 | }, 620 | { 621 | "cell_type": "markdown", 622 | "metadata": {}, 623 | "source": [ 624 | "#### Solution\n", 625 | "\n", 626 | "A) derivative
\n", 627 | "B) gradient
\n", 628 | "C) loss function
\n", 629 | "D) gradient descent
\n", 630 | "E) gradient
\n", 631 | "F) loss function
\n", 632 | "G) loss function
\n", 633 | "H) learning
" 634 | ] 635 | } 636 | ], 637 | "metadata": { 638 | "kernelspec": { 639 | "display_name": "Julia 1.6.0", 640 | "language": "julia", 641 | "name": "julia-1.6" 642 | }, 643 | "language_info": { 644 | "file_extension": ".jl", 645 | "mimetype": "application/julia", 646 | "name": "julia", 647 | "version": "1.6.0" 648 | } 649 | }, 650 | "nbformat": 4, 651 | "nbformat_minor": 4 652 | } 653 | -------------------------------------------------------------------------------- /1100.Intro-to-neurons.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Intro to neurons\n", 8 | "\n", 9 | "At this point, we know how to build models and have a computer automatically learn how to match the model to data. This is the core of how any machine learning method works.\n", 10 | "\n", 11 | "Now, let's narrow our focus and look at **neural networks**. Neural networks (or \"neural nets\", for short) are a specific choice of a **model**. It's a network made up of **neurons**; this, in turn, leads to the question, \"what is a neuron?\"" 12 | ] 13 | }, 14 | { 15 | "cell_type": "markdown", 16 | "metadata": {}, 17 | "source": [ 18 | "### Models with multiple inputs\n", 19 | "\n", 20 | "So far, we have been using the sigmoid function as our model. One of the forms of the sigmoid function we've used is\n", 21 | "\n", 22 | "$$\\sigma_{w, b}(x) = \\frac{1}{1 + \\exp(-wx + b)},$$\n", 23 | "\n", 24 | "where `x` and `w` are both single numbers. 
We have been using this function to model how the amount of the color green in an image (`x`) can be used to determine if an image shows an apple or a banana.\n", 25 | "\n", 26 | "But what if we had multiple data features we wanted to fit?\n", 27 | "\n", 28 | "We could then extend our model to include multiple features like\n", 29 | "\n", 30 | "$$\\sigma_{\\mathbf{w},b}(\\mathbf{x}) = \\frac{1}{1 + \\exp(-w_1 x_1 - w_2 x_2 - \\cdots - w_n x_n + b)}$$\n", 31 | "\n", 32 | "Note that now $\\mathbf{x}$ and $\\mathbf{w}$ are vectors with many components, rather than a single number.\n", 33 | "\n", 34 | "For example, $x_1$ might be the amount of the color green in an image, $x_2$ could be the height of the object in the picture, $x_3$ could be the width, and so forth. We can add information for as many features as we have! Our model now has more parameters, but the same idea of gradient descent (\"rolling the ball downhill\") will still work to train our model.\n", 35 | "\n", 36 | "This version of the sigmoid model that takes multiple inputs is an example of a **neuron**." 37 | ] 38 | }, 39 | { 40 | "cell_type": "markdown", 41 | "metadata": {}, 42 | "source": [ 43 | "In the video, we see that one huge class of learning techniques is based around neurons, that is, *artificial neurons*. These are caricatures of real, biological neurons. Both *artificial* and *biological* neurons have several inputs $x_1, \\ldots, x_n$, and a single output, $y$. 
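A quick sketch of this multiple-input model in Julia, following the formula above (note its exponent is $-\mathbf{w} \cdot \mathbf{x} + b$, so the argument passed to $\sigma$ is $\mathbf{w} \cdot \mathbf{x} - b$); the feature values, weights, and bias here are invented purely for illustration:

```julia
using LinearAlgebra: dot

# σ_{w,b}(x) = 1 / (1 + exp(-w⋅x + b))
σ(z) = 1 / (1 + exp(-z))
model(x, w, b) = σ(dot(w, x) - b)   # exp(-w⋅x + b) ⇒ argument is w⋅x - b

x = [0.9, 0.4, 0.2]    # hypothetical features: green amount, height, width
w = [3.0, -1.0, 0.5]   # one weight per feature
b = 0.7

model(x, w, b)         # a single number between 0 and 1
```

Adding a feature just means appending one entry to each of `x` and `w`; the model itself does not change shape.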
Schematically they look like this:" 44 | ] 45 | }, 46 | { 47 | "cell_type": "code", 48 | "execution_count": 1, 49 | "metadata": {}, 50 | "outputs": [ 51 | { 52 | "data": { 53 | "image/svg+xml": [ 54 | "\n", 55 | "\n", 56 | "\n", 57 | " \n", 58 | " \n", 59 | " \n", 60 | "\n", 61 | "\n", 64 | "\n", 65 | " \n", 66 | " \n", 67 | " \n", 68 | "\n", 69 | "\n", 72 | "\n", 73 | " \n", 74 | " \n", 75 | " \n", 76 | "\n", 77 | "\n", 80 | "\n", 83 | "\n", 86 | "\n", 89 | "\n", 98 | "\n", 107 | "\n", 116 | "\n", 125 | "\n", 134 | "\n", 143 | "\n", 152 | "\n", 161 | "\n", 170 | "\n", 179 | "\n", 182 | "\n", 185 | "\n", 188 | "\n", 191 | "\n", 194 | "\n" 195 | ] 196 | }, 197 | "execution_count": 1, 198 | "metadata": {}, 199 | "output_type": "execute_result" 200 | } 201 | ], 202 | "source": [ 203 | "include(\"scripts/draw_neural_net.jl\")\n", 204 | "\n", 205 | "number_inputs, number_neurons = 4, 1\n", 206 | "\n", 207 | "draw_network([number_inputs, number_neurons])" 208 | ] 209 | }, 210 | { 211 | "cell_type": "markdown", 212 | "metadata": {}, 213 | "source": [ 214 | "We should read this as showing how information flows from left to right:\n", 215 | "- 4 pieces of input information arrive (shown in green on the left);\n", 216 | "\n", 217 | "- the neuron (shown in red on the right) receives all the inputs, processes them, and outputs a single number to the right." 
218 | ] 219 | }, 220 | { 221 | "cell_type": "markdown", 222 | "metadata": {}, 223 | "source": [ 224 | "In other words, a neuron is just a type of function that takes multiple inputs and returns a single output.\n", 225 | "\n", 226 | "The simplest interesting case that we will look at in this notebook is when there are just two pieces of input data:" 227 | ] 228 | }, 229 | { 230 | "cell_type": "code", 231 | "execution_count": 2, 232 | "metadata": {}, 233 | "outputs": [ 234 | { 235 | "data": { 236 | "image/svg+xml": [ 237 | "\n", 238 | "\n", 239 | "\n", 240 | " \n", 241 | " \n", 242 | " \n", 243 | "\n", 244 | "\n", 247 | "\n", 248 | " \n", 249 | " \n", 250 | " \n", 251 | "\n", 252 | "\n", 255 | "\n", 256 | " \n", 257 | " \n", 258 | " \n", 259 | "\n", 260 | "\n", 263 | "\n", 266 | "\n", 275 | "\n", 284 | "\n", 293 | "\n", 302 | "\n", 311 | "\n", 320 | "\n", 323 | "\n", 326 | "\n", 329 | "\n" 330 | ] 331 | }, 332 | "execution_count": 2, 333 | "metadata": {}, 334 | "output_type": "execute_result" 335 | } 336 | ], 337 | "source": [ 338 | "draw_network([2, 1])" 339 | ] 340 | }, 341 | { 342 | "cell_type": "markdown", 343 | "metadata": {}, 344 | "source": [ 345 | "Each link between circles above represents a **weight** $w$ that can be modified to allow the neuron to learn, so in this case the neuron has two weights, $w_1$ and $w_2$.\n", 346 | "\n", 347 | "The neuron also has a single bias $b$, and an **activation function**, which we will take to be the $\\sigma$ sigmoidal function that we have been using. 
(Note that other activation functions can be used!)\n", 348 | "\n", 349 | "Let's call our neuron $f_{w_1,w_2, b}(x_1, x_2)$, where\n", 350 | "\n", 351 | "$$f_{w_1,w_2, b}(x_1, x_2) := \\sigma(w_1 x_1 + w_2 x_2 + b).$$\n", 352 | "\n", 353 | "Note that $f_{w_1,w_2, b}(x_1, x_2)$ has 3 parameters: two weights and a bias.\n", 354 | "\n", 355 | "To simplify the notation, and to prepare for more complicated scenarios later, we put the two weights into a vector, and the two data values into another vector:\n", 356 | "\n", 357 | "$$\n", 358 | "\\mathbf{w} = \\begin{pmatrix} w_1 \\\\ w_2 \\end{pmatrix};\n", 359 | "\\qquad\n", 360 | "\\mathbf{x} = \\begin{pmatrix} x_1 \\\\ x_2 \\end{pmatrix}.\n", 361 | "$$\n", 362 | "\n", 363 | "We thus have\n", 364 | "\n", 365 | "$$f_{\\mathbf{w}, b}(\\mathbf{x}) = \\sigma(\\mathbf{w} \\cdot \\mathbf{x} + b),$$\n", 366 | "\n", 367 | "where the dot product $\\mathbf{w} \\cdot \\mathbf{x}$ is an abbreviated syntax for $w_1 x_1 + w_2 x_2$." 368 | ] 369 | }, 370 | { 371 | "cell_type": "markdown", 372 | "metadata": {}, 373 | "source": [ 374 | "#### Exercise 1\n", 375 | "\n", 376 | "Declare the function `f(x, w, b)` in Julia. `f` should take `x` and `w` as vectors and `b` as a scalar. Furthermore, `f` should call\n", 377 | "\n", 378 | "```julia\n", 379 | "σ(x) = 1 / (1 + exp(-x))\n", 380 | "```\n", 381 | "\n", 382 | "What output do you get for\n", 383 | "\n", 384 | "```julia\n", 385 | "f(3, 4, 5)\n", 386 | "```\n", 387 | "?"
388 | ] 389 | }, 390 | { 391 | "cell_type": "markdown", 392 | "metadata": {}, 393 | "source": [ 394 | "#### Solution" 395 | ] 396 | }, 397 | { 398 | "cell_type": "code", 399 | "execution_count": 10, 400 | "metadata": {}, 401 | "outputs": [ 402 | { 403 | "data": { 404 | "text/plain": [ 405 | "f (generic function with 1 method)" 406 | ] 407 | }, 408 | "execution_count": 10, 409 | "metadata": {}, 410 | "output_type": "execute_result" 411 | } 412 | ], 413 | "source": [ 414 | "σ(x) = 1 / (1 + exp(-x))\n", 415 | "\n", 416 | "f(x, w, b) = σ(w'x + b)" 417 | ] 418 | }, 419 | { 420 | "cell_type": "markdown", 421 | "metadata": {}, 422 | "source": [ 423 | "**Tests**" 424 | ] 425 | }, 426 | { 427 | "cell_type": "code", 428 | "execution_count": 11, 429 | "metadata": {}, 430 | "outputs": [], 431 | "source": [ 432 | "@assert f(3, 4, 5) ≈ 0.99999996" 433 | ] 434 | } 435 | ], 436 | "metadata": { 437 | "kernelspec": { 438 | "display_name": "Julia 1.6.0", 439 | "language": "julia", 440 | "name": "julia-1.6" 441 | }, 442 | "language_info": { 443 | "file_extension": ".jl", 444 | "mimetype": "application/julia", 445 | "name": "julia", 446 | "version": "1.6.0" 447 | } 448 | }, 449 | "nbformat": 4, 450 | "nbformat_minor": 4 451 | } 452 | -------------------------------------------------------------------------------- /LICENSE.md: -------------------------------------------------------------------------------- 1 | Copyright (c) 2018-2020: Julia Computing, Inc. 
2 | 3 | > Permission is hereby granted, free of charge, to any person obtaining 4 | > a copy of this software and associated documentation files (the 5 | > "Software"), to deal in the Software without restriction, including 6 | > without limitation the rights to use, copy, modify, merge, publish, 7 | > distribute, sublicense, and/or sell copies of the Software, and to 8 | > permit persons to whom the Software is furnished to do so, subject to 9 | > the following conditions: 10 | > 11 | > The above copyright notice and this permission notice shall be 12 | > included in all copies or substantial portions of the Software. 13 | > 14 | > THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, 15 | > EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF 16 | > MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND 17 | > NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE 18 | > LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION 19 | > OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION 20 | > WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 21 | -------------------------------------------------------------------------------- /Project.toml: -------------------------------------------------------------------------------- 1 | [deps] 2 | FileIO = "5789e2e9-d7fb-5bc7-8068-2c6fae9b9549" 3 | Flux = "587475ba-b771-5e3f-ad9e-33799f191a9c" 4 | Images = "916415d5-f1e6-5110-898d-aaa5f9f070e0" 5 | Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80" 6 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Foundations-of-Machine-Learning 2 | We're excited to be your gateway into machine learning. ML is a rapidly growing field that's buzzing with opportunity. 
3 | -------------------------------------------------------------------------------- /data/104_100.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/JuliaAcademy/Foundations-of-Machine-Learning/e3b16dd422d5ca169adedf72166aee461f66033a/data/104_100.jpg -------------------------------------------------------------------------------- /data/107_100.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/JuliaAcademy/Foundations-of-Machine-Learning/e3b16dd422d5ca169adedf72166aee461f66033a/data/107_100.jpg -------------------------------------------------------------------------------- /data/10_100.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/JuliaAcademy/Foundations-of-Machine-Learning/e3b16dd422d5ca169adedf72166aee461f66033a/data/10_100.jpg -------------------------------------------------------------------------------- /data/8_100.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/JuliaAcademy/Foundations-of-Machine-Learning/e3b16dd422d5ca169adedf72166aee461f66033a/data/8_100.jpg -------------------------------------------------------------------------------- /scripts/draw_neural_net.jl: -------------------------------------------------------------------------------- 1 | using Plots; gr() 2 | import LinearAlgebra: normalize! 
3 | 4 | function draw_neuron(x, y, r; c=:blue) 5 | 6 | θs = 0:0.1:2pi 7 | xs = x .+ r.*cos.(θs) 8 | ys = y .+ r.*sin.(θs) 9 | 10 | 11 | plot!(xs, ys, seriestype=:shape, c=c, alpha=0.5, aspect_ratio=1, leg=false) 12 | 13 | end 14 | 15 | 16 | #neuron_coords(x, N, spacing) = range(-(N - 1)/2 * spacing, spacing, N) 17 | 18 | """ 19 | Vertical position of neuron in layer i, position j, with a total of N neurons 20 | """ 21 | neuron_coords(j, N, y_spacing) = (-(N - 1)/2 + j) * y_spacing 22 | 23 | function draw_neurons(x, N, spacing, r; c=:blue) 24 | 25 | ys = neuron_coords(x, N, spacing) 26 | 27 | draw_neuron.(x, ys, r; c=c) 28 | 29 | end 30 | 31 | 32 | function draw_layer(x, spacing, N1, N2, r) 33 | 34 | plot!(framestyle=:none, grid=:none) 35 | 36 | first_x = x 37 | second_x = x + 1 38 | 39 | first = neuron_coords(x + 1, N1, spacing) 40 | second = neuron_coords(x, N2, spacing) 41 | 42 | draw_neurons(x, N1, 1, r; c=:blue) 43 | draw_neurons(x+1, N2, 1, r; c=:red) 44 | 45 | for i in 1:N1 46 | for j in 1:N2 47 | 48 | vec = [second_x - first_x, second[j] - first[i]] 49 | normalize!(vec) 50 | 51 | start = [first_x, first[i]] + 1.2*r*vec 52 | finish = [second_x, second[j]] - 1.2*r*vec 53 | 54 | 55 | plot!([start[1], finish[1]], [start[2], finish[2]], c=:black, alpha=0.5) 56 | end 57 | end 58 | 59 | end 60 | 61 | #draw_layer(1, 1, 3, 4, 0.2) 62 | 63 | function draw_link(x1, y1, x2, y2, r) 64 | vec = [x2 - x1, y2 - y1] 65 | normalize!(vec) 66 | 67 | start = [x1, y1] + 1.2 * r * vec 68 | finish = [x2, y2] - 1.2 * r * vec 69 | 70 | plot!([start[1], finish[1]], [start[2], finish[2]], c=:black, alpha=0.5) 71 | end 72 | 73 | """ 74 | Takes a vector of neurons per layer 75 | """ 76 | function draw_network(neurons_per_layer) 77 | 78 | x_spacing = 1 79 | y_spacing = 1 80 | r = 0.2 81 | 82 | num_layers = length(neurons_per_layer) 83 | 84 | plot(framestyle=:none, grid=:none) 85 | 86 | # draw input links 87 | N1 = neurons_per_layer[1] 88 | 89 | for j in 1:N1 90 | y = neuron_coords(j, N1, 
y_spacing) 91 | draw_link(0, y, 1, y, r) 92 | end 93 | 94 | # draw neurons 95 | for layer in 1:length(neurons_per_layer) 96 | N = neurons_per_layer[layer] 97 | 98 | if layer == 1 99 | c = :green 100 | elseif layer == num_layers 101 | c = :red 102 | else 103 | c = :blue 104 | end 105 | 106 | for j in 1:N 107 | 108 | draw_neuron(layer, neuron_coords(j, N, y_spacing), r, c=c) 109 | 110 | end 111 | end 112 | 113 | # draw links 114 | for layer in 1:length(neurons_per_layer)-1 115 | N1 = neurons_per_layer[layer] 116 | N2 = neurons_per_layer[layer+1] 117 | 118 | for j1 in 1:N1 119 | for j2 in 1:N2 120 | 121 | draw_link(layer, neuron_coords(j1, N1, y_spacing), layer+1, 122 | neuron_coords(j2, N2, y_spacing), r) 123 | end 124 | 125 | end 126 | end 127 | 128 | # draw output links 129 | N_last = neurons_per_layer[end] 130 | 131 | for j in 1:N_last 132 | y = neuron_coords(j, N_last, y_spacing) 133 | draw_link(num_layers, y, num_layers+1, y, r) 134 | end 135 | 136 | plot!() 137 | 138 | end 139 | 140 | draw_network([3, 2, 2]) --------------------------------------------------------------------------------
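A quick standalone sanity check of the `neuron_coords` helper from `scripts/draw_neural_net.jl`; the definition is copied here so that it runs without loading Plots:

```julia
# Standalone copy of the neuron_coords helper from
# scripts/draw_neural_net.jl, so its spacing logic can be checked
# without a plotting backend.
neuron_coords(j, N, y_spacing) = (-(N - 1)/2 + j) * y_spacing

# Vertical positions of 5 neurons in one layer, spaced 1.0 apart
ys = [neuron_coords(j, 5, 1.0) for j in 1:5]
```

Neighbouring neurons come out exactly `y_spacing` apart, which is the property the link-drawing code relies on.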