├── .gitattributes
├── .gitignore
├── LICENSE
├── README.md
├── datasets
│   ├── README.md
│   └── hot-dog-v-not-hot-dog-zip-moved.txt
└── notebooks
    ├── Try_Keras.ipynb
    └── Try_Keras_Start_Here.ipynb

/.gitattributes:
--------------------------------------------------------------------------------
1 | *.zip filter=lfs diff=lfs merge=lfs -text
2 | 
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | .DS_Store
2 | # Byte-compiled / optimized / DLL files
3 | __pycache__/
4 | *.py[cod]
5 | *$py.class
6 | 
7 | # C extensions
8 | *.so
9 | 
10 | # Distribution / packaging
11 | .Python
12 | build/
13 | develop-eggs/
14 | dist/
15 | downloads/
16 | eggs/
17 | .eggs/
18 | lib/
19 | lib64/
20 | parts/
21 | sdist/
22 | var/
23 | wheels/
24 | pip-wheel-metadata/
25 | share/python-wheels/
26 | *.egg-info/
27 | .installed.cfg
28 | *.egg
29 | MANIFEST
30 | 
31 | # PyInstaller
32 | # Usually these files are written by a python script from a template
33 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
34 | *.manifest
35 | *.spec
36 | 
37 | # Installer logs
38 | pip-log.txt
39 | pip-delete-this-directory.txt
40 | 
41 | # Unit test / coverage reports
42 | htmlcov/
43 | .tox/
44 | .nox/
45 | .coverage
46 | .coverage.*
47 | .cache
48 | nosetests.xml
49 | coverage.xml
50 | *.cover
51 | *.py,cover
52 | .hypothesis/
53 | .pytest_cache/
54 | 
55 | # Translations
56 | *.mo
57 | *.pot
58 | 
59 | # Django stuff:
60 | *.log
61 | local_settings.py
62 | db.sqlite3
63 | db.sqlite3-journal
64 | 
65 | # Flask stuff:
66 | instance/
67 | .webassets-cache
68 | 
69 | # Scrapy stuff:
70 | .scrapy
71 | 
72 | # Sphinx documentation
73 | docs/_build/
74 | 
75 | # PyBuilder
76 | target/
77 | 
78 | # Jupyter Notebook
79 | .ipynb_checkpoints
80 | 
81 | # IPython
82 | profile_default/
83 | ipython_config.py
84 | 
85 | # pyenv
86 | .python-version
87 | 
88 | # pipenv
89 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
90 | # However, in case of collaboration, if having platform-specific dependencies or dependencies
91 | # having no cross-platform support, pipenv may install dependencies that don't work, or not
92 | # install all needed dependencies.
93 | #Pipfile.lock
94 | 
95 | # PEP 582; used by e.g. github.com/David-OConnor/pyflow
96 | __pypackages__/
97 | 
98 | # Celery stuff
99 | celerybeat-schedule
100 | celerybeat.pid
101 | 
102 | # SageMath parsed files
103 | *.sage.py
104 | 
105 | # Environments
106 | .env
107 | .venv
108 | env/
109 | venv/
110 | ENV/
111 | env.bak/
112 | venv.bak/
113 | 
114 | # Spyder project settings
115 | .spyderproject
116 | .spyproject
117 | 
118 | # Rope project settings
119 | .ropeproject
120 | 
121 | # mkdocs documentation
122 | /site
123 | 
124 | # mypy
125 | .mypy_cache/
126 | .dmypy.json
127 | dmypy.json
128 | 
129 | # Pyre type checker
130 | .pyre/
131 | 
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 | 
3 | Copyright (c) 2020 Coding For Entrepreneurs
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | [![Try Keras Logo](https://static.codingforentrepreneurs.com/media/projects/try-keras/images/share/Try_Keras_-_Share.jpg)](https://www.codingforentrepreneurs.com/projects/try-keras)
2 | 
3 | # Try Keras
4 | Learn how to build your first neural network using Keras and TensorFlow to do Deep Learning!
5 | 
6 | 
7 | > Datasets created by the app [Tight.ai Desktop](https://tight.ai) (coming soon).
8 | 
9 | 
10 | ### Getting Started
11 | 
12 | Open the [Try Keras Start Here.ipynb](./notebooks/Try_Keras_Start_Here.ipynb) notebook on [Colab](https://colab.research.google.com/github/codingforentrepreneurs/Try-Keras/blob/master/notebooks/Try_Keras_Start_Here.ipynb) or locally.
13 | 
14 | Using Colab provides a free and easy way for nearly anyone to get started.
15 | 
16 | 
17 | ### Demo
18 | - [Google Drive Link](https://drive.google.com/drive/folders/1-4_a546Oix6cI345RzbH03mcRJV4bc5i?usp=sharing)
19 | 
--------------------------------------------------------------------------------
/datasets/README.md:
--------------------------------------------------------------------------------
1 | ## Try Keras Datasets
2 | 
3 | We'll be adding new datasets from time to time. This README is here to ensure you know exactly where you can download them.
4 | 
5 | As you probably know, datasets can be large (100 MB+), so storing all datasets on GitHub isn't always practical. Come back to this README for any updates. Naturally, if you download a dataset locally, you may have to re-upload it to use it on Colab or another cloud-based Jupyter notebook.
6 | 
7 | Available Datasets:
8 | 
9 | - [Try Keras Google Drive Folder](https://drive.google.com/drive/folders/1YzioLo90Th2RpwmTCtv-YrO79Hjlyz2k?usp=sharing)
10 | 
11 | We'll do our best to keep the notebook [Try Keras - Start Here.ipynb](https://github.com/codingforentrepreneurs/Try-Keras/blob/master/notebooks/Try_Keras_Start_Here.ipynb) up to date with the latest instructions for downloading our datasets.
--------------------------------------------------------------------------------
/datasets/hot-dog-v-not-hot-dog-zip-moved.txt:
--------------------------------------------------------------------------------
1 | Check the README in this directory.
2 | 
--------------------------------------------------------------------------------
/notebooks/Try_Keras_Start_Here.ipynb:
--------------------------------------------------------------------------------
1 | {
2 |   "nbformat": 4,
3 |   "nbformat_minor": 0,
4 |   "metadata": {
5 |     "colab": {
6 |       "name": "Try Keras - Start Here.ipynb",
7 |       "provenance": [],
8 |       "collapsed_sections": []
9 |     },
10 |     "kernelspec": {
11 |       "name": "python3",
12 |       "display_name": "Python 3"
13 |     }
14 |   },
15 |   "cells": [
16 |     {
17 |       "cell_type": "markdown",
18 |       "metadata": {
19 |         "id": "sXpRAifOCWe5",
20 |         "colab_type": "text"
21 |       },
22 |       "source": [
23 |         "# Try Keras\n",
24 |         "This is the starter notebook for the [Try Keras](https://www.codingforentrepreneurs.com/projects/try-keras) tutorial series.", "\n", "[Run on Google Colab](https://colab.research.google.com/github/codingforentrepreneurs/Try-Keras/blob/master/notebooks/Try_Keras_Start_Here.ipynb)"
25 |       ]
26 |     },
27 |     {
28 |       "cell_type": "markdown",
29 |       "metadata": {
30 |         "id": "YutY7pbQEKMa",
31 |         "colab_type": "text"
32 |       },
33 |       "source": [
34 |         "## Download and Unzip our Dataset\n",
35 |         "\n",
36 |         "We have provided the starter dataset for this series. You can always use your own dataset once you get the basics, and you can use many of the methods below to download and unzip your own dataset too.\n",
37 |         "\n",
38 |         "All datasets are available in [this Google Drive folder](https://drive.google.com/drive/folders/1YzioLo90Th2RpwmTCtv-YrO79Hjlyz2k?usp=sharing). Datasets were generated using [Tight.ai Desktop](https://tight.ai) _coming soon_."
39 |       ]
40 |     },
41 |     {
42 |       "cell_type": "code",
43 |       "metadata": {
44 |         "id": "54fYnBQQANBF",
45 |         "colab_type": "code",
46 |         "colab": {}
47 |       },
48 |       "source": [
49 |         "DATASET_NAME = \"hot-dog-v-not-hot-dog\"\n",
50 |         "DATASET_ZIP = f\"{DATASET_NAME}.zip\"\n",
51 |         "DATASET_URL = f\"https://s3.amazonaws.com/datasets.tight.ai/tutorials/try-keras/{DATASET_ZIP}\""
52 |       ],
53 |       "execution_count": 33,
54 |       "outputs": []
55 |     },
56 |     {
57 |       "cell_type": "markdown",
58 |       "metadata": {
59 |         "id": "ciyYmAb6CGD4",
60 |         "colab_type": "text"
61 |       },
62 |       "source": [
63 |         "Let's run a shell command (`!`) to `echo` the Python variable `DATASET_URL` from above by referencing it as `$DATASET_URL`."
64 |       ]
65 |     },
66 |     {
67 |       "cell_type": "code",
68 |       "metadata": {
69 |         "id": "az3pcK0xBL09",
70 |         "colab_type": "code",
71 |         "colab": {}
72 |       },
73 |       "source": [
74 |         "!echo $DATASET_URL"
75 |       ],
76 |       "execution_count": null,
77 |       "outputs": []
78 |     },
79 |     {
80 |       "cell_type": "markdown",
81 |       "metadata": {
82 |         "id": "lejAgZ46BnJH",
83 |         "colab_type": "text"
84 |       },
85 |       "source": [
86 |         "Here we'll use a shell command (`!`) to download the file with `wget`, fetching it only if there is a newer version (`-N`), using the Python variable from above via `$DATASET_URL`."
87 |       ]
88 |     },
89 |     {
90 |       "cell_type": "code",
91 |       "metadata": {
92 |         "id": "E9E5v-E8Af1Y",
93 |         "colab_type": "code",
94 |         "colab": {}
95 |       },
96 |       "source": [
97 |         "!wget -N $DATASET_URL"
98 |       ],
99 |       "execution_count": null,
100 |       "outputs": []
101 |     },
102 |     {
103 |       "cell_type": "code",
104 |       "metadata": {
105 |         "id": "8sDJcmQ0DCel",
106 |         "colab_type": "code",
107 |         "colab": {}
108 |       },
109 |       "source": [
110 |         "# !unzip -h"
111 |       ],
112 |       "execution_count": 52,
113 |       "outputs": []
114 |     },
115 |     {
116 |       "cell_type": "markdown",
117 |       "metadata": {
118 |         "id": "y5YX0KbrCzWB",
119 |         "colab_type": "text"
120 |       },
121 |       "source": [
122 |         "Now we'll run a shell command (`!`) to unzip the archive with `!unzip`, updating files (if necessary) quietly (`-uq`), using the Python variable `DATASET_ZIP` as `$DATASET_ZIP`, with our destination (`-d`) being the current directory (`.`)."
123 |       ]
124 |     },
125 |     {
126 |       "cell_type": "code",
127 |       "metadata": {
128 |         "id": "wTEbYQslAe-z",
129 |         "colab_type": "code",
130 |         "colab": {}
131 |       },
132 |       "source": [
133 |         "!unzip -uq $DATASET_ZIP -d \".\""
134 |       ],
135 |       "execution_count": 24,
136 |       "outputs": []
137 |     },
138 |     {
139 |       "cell_type": "markdown",
140 |       "metadata": {
141 |         "id": "KZBSqC3LD0Et",
142 |         "colab_type": "text"
143 |       },
144 |       "source": [
145 |         "Now let's review our current directory with `ls` to ensure our zip file `$DATASET_ZIP` was unpacked and our `DATASET_NAME` is visible as a directory:"
146 |       ]
147 |     },
148 |     {
149 |       "cell_type": "code",
150 |       "metadata": {
151 |         "id": "LhQQoev-DkyB",
152 |         "colab_type": "code",
153 |         "colab": {}
154 |       },
155 |       "source": [
156 |         "!ls"
157 |       ],
158 |       "execution_count": null,
159 |       "outputs": []
160 |     }
161 |   ]
162 | }
163 | 
--------------------------------------------------------------------------------
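The notebook's shell steps (`wget -N` to download, then `unzip -uq` to extract) can also be done in pure Python, which helps in environments where `wget` or `unzip` aren't available. The following is a minimal sketch, not part of the repo: the `download_and_unzip` helper is hypothetical, and it only skips the download when the zip already exists locally (a rougher check than `wget -N`'s timestamp comparison).

```python
import urllib.request
import zipfile
from pathlib import Path

# Same variables the notebook defines in its first code cell.
DATASET_NAME = "hot-dog-v-not-hot-dog"
DATASET_ZIP = f"{DATASET_NAME}.zip"
DATASET_URL = f"https://s3.amazonaws.com/datasets.tight.ai/tutorials/try-keras/{DATASET_ZIP}"


def download_and_unzip(url: str, zip_path: str, dest: str = ".") -> Path:
    """Download url to zip_path (skipped if the file already exists) and extract it into dest."""
    zip_file = Path(zip_path)
    if not zip_file.exists():  # rough stand-in for wget -N's "only fetch if newer"
        urllib.request.urlretrieve(url, zip_file)
    with zipfile.ZipFile(zip_file) as zf:
        zf.extractall(dest)  # stand-in for: unzip -uq $DATASET_ZIP -d .
    return Path(dest) / DATASET_NAME


# Usage (requires network access):
# data_dir = download_and_unzip(DATASET_URL, DATASET_ZIP)
```

One design note: returning the extracted directory as a `Path` makes it easy to feed straight into a Keras loader later, e.g. `keras.preprocessing.image_dataset_from_directory(data_dir)`.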