├── README.md
├── assignments
│   ├── assignment1.md
│   ├── assignment1_colab.zip
│   ├── assignment1_jupyter.zip
│   ├── assignment2.md
│   ├── assignment2_colab.zip
│   ├── assignment2_jupyter.zip
│   ├── assignment3.md
│   ├── assignment3_colab.zip
│   └── assignment3_jupyter.zip
└── slides
    ├── lecture_10.pdf
    ├── lecture_11.pdf
    ├── lecture_12.pdf
    ├── lecture_13.pdf
    ├── lecture_14.pdf
    ├── lecture_16_Hao.pdf
    ├── lecture_17.pdf
    ├── lecture_18.pdf
    ├── lecture_1_ranjay.pdf
    ├── lecture_2.pdf
    ├── lecture_3.pdf
    ├── lecture_4.pdf
    ├── lecture_5.pdf
    ├── lecture_6.pdf
    ├── lecture_7.pdf
    ├── lecture_8.pdf
    ├── lecture_9.pdf
    ├── section_2_annotated.pdf
    ├── section_2_backprop.pdf
    ├── section_3_project.pdf
    ├── section_5_midterm.pdf
    ├── section_7_detection.pdf
    └── section_8_video.pdf

/README.md:
--------------------------------------------------------------------------------
1 | # Stanford University - CS231n 2020 (Spring) class
2 | 
3 | All notes, slides, and assignments for Stanford's [CS231n: Convolutional Neural Networks for Visual Recognition](http://vision.stanford.edu/teaching/cs231n/) class.
4 | 
5 | Useful links:
6 | - [Official CS231n 2018 course online](http://cs231n.stanford.edu/2018/)
7 | - [Syllabus online](http://cs231n.stanford.edu/2018/syllabus.html)
8 | - [Notes online](https://cs231n.github.io/)
9 | - [YouTube video lectures, Spring 2017](https://www.youtube.com/playlist?list=PLC1qU-LWwrF64f4QKQT-Vg5Wr4qEE1Zxk)
10 | - [Stanford HAI 2019 - Introduction to Stanford HAI: Fei-Fei Li](https://www.youtube.com/watch?v=XnhfeNDc0eI)
11 | - [CS231n Google Colab Assignment Workflow Tutorial](https://www.youtube.com/watch?v=IZUz4pRYlus&t=1s)
--------------------------------------------------------------------------------
/assignments/assignment1.md:
--------------------------------------------------------------------------------
1 | ---
2 | layout: page
3 | title: Assignment 1
4 | mathjax: true
5 | permalink: /assignments2020/assignment1/
6 | ---
7 | 
8 | This assignment is due on **Wednesday, April 22 2020** at 11:59pm PDT.
9 | 
10 | <details>
11 |   <summary>Handy Download Links</summary>
12 | 
13 |   <ul>
14 |     <li><a href="{{site.hw_1_colab}}">Option A: Colab starter code</a></li>
15 |     <li><a href="{{site.hw_1_jupyter}}">Option B: Jupyter starter code</a></li>
16 |   </ul>
17 | </details>
18 | 
19 | - [Goals](#goals)
20 | - [Setup](#setup)
21 |   - [Option A: Google Colaboratory (Recommended)](#option-a-google-colaboratory-recommended)
22 |   - [Option B: Local Development](#option-b-local-development)
23 | - [Q1: k-Nearest Neighbor classifier (20 points)](#q1-k-nearest-neighbor-classifier-20-points)
24 | - [Q2: Training a Support Vector Machine (25 points)](#q2-training-a-support-vector-machine-25-points)
25 | - [Q3: Implement a Softmax classifier (20 points)](#q3-implement-a-softmax-classifier-20-points)
26 | - [Q4: Two-Layer Neural Network (25 points)](#q4-two-layer-neural-network-25-points)
27 | - [Q5: Higher Level Representations: Image Features (10 points)](#q5-higher-level-representations-image-features-10-points)
28 | - [Submitting your work](#submitting-your-work)
29 | 
30 | ### Goals
31 | 
32 | In this assignment you will practice putting together a simple image classification pipeline based on the k-Nearest Neighbor or the SVM/Softmax classifier. The goals of this assignment are as follows:
33 | 
34 | - Understand the basic **Image Classification pipeline** and the data-driven approach (train/predict stages).
35 | - Understand the train/val/test **splits** and the use of validation data for **hyperparameter tuning**.
36 | - Develop proficiency in writing efficient **vectorized** code with numpy.
37 | - Implement and apply a k-Nearest Neighbor (**kNN**) classifier.
38 | - Implement and apply a Multiclass Support Vector Machine (**SVM**) classifier.
39 | - Implement and apply a **Softmax** classifier.
40 | - Implement and apply a **Two layer neural network** classifier.
41 | - Understand the differences and tradeoffs between these classifiers.
42 | - Get a basic understanding of the performance improvements available from **higher-level representations**, such as color histograms and Histogram of Oriented Gradients (HOG) features, as opposed to raw pixels.
43 | 
44 | ### Setup
45 | 
46 | You can work on the assignment in one of two ways: **remotely** on Google Colaboratory or **locally** on your own machine.
47 | 
48 | **Regardless of the method chosen, ensure you have followed the [setup instructions](/setup-instructions) before proceeding.**
49 | 
50 | #### Option A: Google Colaboratory (Recommended)
51 | 
52 | **Download.** Starter code containing Colab notebooks can be downloaded [here]({{site.hw_1_colab}}).
53 | 
54 | [CS231n Google Colab Assignment Workflow Tutorial](https://www.youtube.com/watch?v=IZUz4pRYlus)
55 | 
56 | If you choose to work with Google Colab, please watch the workflow tutorial above or read the instructions below.
57 | 
58 | 1. Unzip the starter code zip file. You should see an `assignment1` folder.
59 | 2. Create a folder in your personal Google Drive and upload the `assignment1/` folder to it. We recommend that you call the Google Drive folder `cs231n/assignments/` so that the final uploaded folder has the path `cs231n/assignments/assignment1/`.
60 | 3. Each Colab notebook (i.e. each file ending in `.ipynb`) corresponds to an assignment question. In Google Drive, double click on the notebook and select the option to open with `Colab`.
61 | 4. You will be connected to a Colab VM. You can mount your Google Drive and access your uploaded
62 | files by executing the first cell in the notebook. It will prompt you for an authorization code, which you can obtain
63 | from a popup window. The code cell will also automatically download the CIFAR-10 dataset for you.
64 | 5. Once you have completed the assignment question (i.e. reached the end of the notebook), you can save your edited files back to your Drive and move on to the next question. For your convenience, we also provide a code cell (the very last one) that automatically saves the modified files for that question back to your Drive.
65 | 6. Repeat steps 3-5 for each remaining notebook.
66 | 
67 | **Note 1**. Please make sure that you work on the Colab notebooks in the order of the questions (see below). Specifically, you should work on kNN first, then SVM, then Softmax, then Two-layer Net, and finally on Image Features. The reason is that the code cells that get executed *at the end* of the notebooks save the modified files back to your drive, and some notebooks may require code from previous notebooks.
68 | 
69 | **Note 2**. Related to the above, ensure you are periodically saving your notebook (`File -> Save`), and any edited `.py` files relevant to that notebook (i.e. **by executing the last code cell**), so that you don't lose your progress if you step away from the assignment and the Colab VM disconnects.
70 | 
71 | Once you have completed all Colab notebooks **except `collect_submission.ipynb`**, proceed to the [submission instructions](#submitting-your-work).
72 | 
73 | #### Option B: Local Development
74 | 
75 | **Download.** Starter code containing jupyter notebooks can be downloaded [here]({{site.hw_1_jupyter}}).
76 | 
77 | **Install Packages**. Once you have the starter code, activate your environment (the one you installed in the [Software Setup]({{site.baseurl}}/setup-instructions/) page) and run `pip install -r requirements.txt`.
78 | 
79 | **Download CIFAR-10**. Next, you will need to download the CIFAR-10 dataset. Run the following from the `assignment1` directory:
80 | 
81 | ```bash
82 | cd cs231n/datasets
83 | ./get_datasets.sh
84 | ```
85 | **Start Jupyter Server**. After you have the CIFAR-10 data, you should start the Jupyter server from the
86 | `assignment1` directory by executing `jupyter notebook` in your terminal.
87 | 
88 | Complete each notebook, then once you are done, go to the [submission instructions](#submitting-your-work).
89 | 
90 | ### Q1: k-Nearest Neighbor classifier (20 points)
91 | 
92 | The notebook **knn.ipynb** will walk you through implementing the kNN classifier.
93 | 
94 | ### Q2: Training a Support Vector Machine (25 points)
95 | 
96 | The notebook **svm.ipynb** will walk you through implementing the SVM classifier.
97 | 
98 | ### Q3: Implement a Softmax classifier (20 points)
99 | 
100 | The notebook **softmax.ipynb** will walk you through implementing the Softmax classifier.
101 | 
102 | ### Q4: Two-Layer Neural Network (25 points)
103 | 
104 | The notebook **two\_layer\_net.ipynb** will walk you through the implementation of a two-layer neural network classifier.
105 | 
106 | ### Q5: Higher Level Representations: Image Features (10 points)
107 | 
108 | The notebook **features.ipynb** will examine the improvements gained by using higher-level representations
109 | as opposed to using raw pixel values.
110 | 
111 | ### Submitting your work
112 | 
113 | **Important**. Please make sure that the submitted notebooks have been run and the cell outputs are visible.
114 | 
115 | Once you have completed all notebooks and filled out the necessary code, there are **_two_** steps you must follow to submit your assignment:
116 | 
117 | **1.** If you selected Option A and worked on the assignment in Colab, open `collect_submission.ipynb` in Colab and execute the notebook cells. If you selected Option B and worked on the assignment locally, run the bash script in `assignment1` by executing `bash collectSubmission.sh`.
118 | 
119 | This notebook/script will:
120 | 
121 | * Generate a zip file of your code (`.py` and `.ipynb`) called `a1.zip`.
122 | * Convert all notebooks into a single PDF file.
123 | 
124 | **Note for Option B users**. You must have (a) `nbconvert` installed with Pandoc and TeX support and (b) `PyPDF2` installed to successfully convert your notebooks to a PDF file. Please follow these [installation instructions](https://nbconvert.readthedocs.io/en/latest/install.html#installing-nbconvert) to install (a) and run `pip install PyPDF2` to install (b). If you are, for some inexplicable reason, unable to successfully install the above dependencies, you can manually convert each jupyter notebook to HTML (`File -> Download as -> HTML (.html)`), save each HTML page as a PDF, then concatenate all the PDFs into a single PDF submission, either in your favorite PDF viewer or with a short script like the sketch below.
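
If you would rather script that concatenation step, here is a minimal sketch using `PyPDF2` (which the note above already has you install). The input filenames are placeholders for whatever PDFs you exported:

```python
# Minimal sketch: merge per-notebook PDFs into one submission PDF.
# The input filenames below are placeholders -- use whatever you exported.
from PyPDF2 import PdfFileMerger

notebook_pdfs = ["knn.pdf", "svm.pdf", "softmax.pdf",
                 "two_layer_net.pdf", "features.pdf"]

merger = PdfFileMerger()
for pdf in notebook_pdfs:
    merger.append(pdf)          # appends all pages of each PDF, in order
merger.write("assignment.pdf")  # the single PDF you upload to Gradescope
merger.close()
```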
125 | 
126 | If your submission for this step was successful, you should see the following display message:
127 | 
128 | `### Done! Please submit a1.zip and the pdfs to Gradescope. ###`
129 | 
130 | **2.** Submit the PDF and the zip file to [Gradescope](https://www.gradescope.com/courses/103764).
131 | 
132 | **Note for Option A users**. Remember to download `a1.zip` and `assignment.pdf` locally before submitting to Gradescope.
133 | 
--------------------------------------------------------------------------------
/assignments/assignment1_colab.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/assignments/assignment1_colab.zip
--------------------------------------------------------------------------------
/assignments/assignment1_jupyter.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/assignments/assignment1_jupyter.zip
--------------------------------------------------------------------------------
/assignments/assignment2.md:
--------------------------------------------------------------------------------
1 | ---
2 | layout: page
3 | title: Assignment 2
4 | mathjax: true
5 | permalink: /assignments2020/assignment2/
6 | ---
7 | 
8 | This assignment is due on **Wednesday, May 6 2020** at 11:59pm PDT.
9 | 
10 | <details>
11 |   <summary>Handy Download Links</summary>
12 | 
13 |   <ul>
14 |     <li><a href="{{site.hw_2_colab}}">Option A: Colab starter code</a></li>
15 |     <li><a href="{{site.hw_2_jupyter}}">Option B: Jupyter starter code</a></li>
16 |   </ul>
17 | </details>
18 | 
19 | - [Goals](#goals)
20 | - [Setup](#setup)
21 |   - [Option A: Google Colaboratory (Recommended)](#option-a-google-colaboratory-recommended)
22 |   - [Option B: Local Development](#option-b-local-development)
23 | - [Q1: Fully-connected Neural Network (20 points)](#q1-fully-connected-neural-network-20-points)
24 | - [Q2: Batch Normalization (30 points)](#q2-batch-normalization-30-points)
25 | - [Q3: Dropout (10 points)](#q3-dropout-10-points)
26 | - [Q4: Convolutional Networks (30 points)](#q4-convolutional-networks-30-points)
27 | - [Q5: PyTorch / TensorFlow on CIFAR-10 (10 points)](#q5-pytorch--tensorflow-on-cifar-10-10-points)
28 | - [Submitting your work](#submitting-your-work)
29 | 
30 | ### Goals
31 | 
32 | In this assignment you will practice writing backpropagation code and training Neural Networks and Convolutional Neural Networks. The goals of this assignment are as follows:
33 | 
34 | - Understand **Neural Networks** and how they are arranged in layered architectures.
35 | - Understand and be able to implement (vectorized) **backpropagation**.
36 | - Implement various **update rules** used to optimize Neural Networks.
37 | - Implement **Batch Normalization** and **Layer Normalization** for training deep networks.
38 | - Implement **Dropout** to regularize networks.
39 | - Understand the architecture of **Convolutional Neural Networks** and get practice with training them.
40 | - Gain experience with a major deep learning framework, such as **TensorFlow** or **PyTorch**.
41 | 
42 | ### Setup
43 | 
44 | You can work on the assignment in one of two ways: **remotely** on Google Colaboratory or **locally** on your own machine.
45 | 
46 | **Regardless of the method chosen, ensure you have followed the [setup instructions](/setup-instructions) before proceeding.**
47 | 
48 | #### Option A: Google Colaboratory (Recommended)
49 | 
50 | **Download.** Starter code containing Colab notebooks can be downloaded [here]({{site.hw_2_colab}}).
51 | 
52 | If you choose to work with Google Colab, please familiarize yourself with the [recommended workflow]({{site.baseurl}}/setup-instructions/#working-remotely-on-google-colaboratory).
53 | 
54 | 
55 | 
56 | **Note**. Ensure you are periodically saving your notebook (`File -> Save`) so that you don't lose your progress if you step away from the assignment and the Colab VM disconnects.
57 | 
58 | Once you have completed all Colab notebooks **except `collect_submission.ipynb`**, proceed to the [submission instructions](#submitting-your-work).
59 | 
60 | #### Option B: Local Development
61 | 
62 | **Download.** Starter code containing jupyter notebooks can be downloaded [here]({{site.hw_2_jupyter}}).
63 | 
64 | **Install Packages**. Once you have the starter code, activate your environment (the one you installed in the [Software Setup]({{site.baseurl}}/setup-instructions/) page) and run `pip install -r requirements.txt`.
65 | 
66 | **Download CIFAR-10**. Next, you will need to download the CIFAR-10 dataset. Run the following from the `assignment2` directory:
67 | 
68 | ```bash
69 | cd cs231n/datasets
70 | ./get_datasets.sh
71 | ```
72 | **Start Jupyter Server**. After you have the CIFAR-10 data, you should start the Jupyter server from the
73 | `assignment2` directory by executing `jupyter notebook` in your terminal.
74 | 
75 | Complete each notebook, then once you are done, go to the [submission instructions](#submitting-your-work).
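
Before opening the notebooks, you can optionally sanity-check the download with a short script run from the `assignment2` directory. This is only a sketch: it assumes the starter code ships the `load_CIFAR10` helper in `cs231n/data_utils.py` and that the data landed in `cs231n/datasets/cifar-10-batches-py`:

```python
# Optional sanity check for the CIFAR-10 download (a sketch; the path and
# the load_CIFAR10 helper are assumed from the starter code).
from cs231n.data_utils import load_CIFAR10

X_train, y_train, X_test, y_test = load_CIFAR10("cs231n/datasets/cifar-10-batches-py")

# CIFAR-10 has 50,000 training and 10,000 test images of shape 32x32x3.
print("train data:  ", X_train.shape)  # expect (50000, 32, 32, 3)
print("train labels:", y_train.shape)  # expect (50000,)
print("test data:   ", X_test.shape)   # expect (10000, 32, 32, 3)
```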
76 | 
77 | ### Q1: Fully-connected Neural Network (20 points)
78 | 
79 | The notebook `FullyConnectedNets.ipynb` will introduce you to our
80 | modular layer design, and then use those layers to implement fully-connected
81 | networks of arbitrary depth. To optimize these models you will implement several
82 | popular update rules (for a small taste, see the sketch at the end of this page).
83 | 
84 | ### Q2: Batch Normalization (30 points)
85 | 
86 | In the notebook `BatchNormalization.ipynb` you will implement batch normalization, and use it to train deep fully-connected networks.
87 | 
88 | ### Q3: Dropout (10 points)
89 | 
90 | The notebook `Dropout.ipynb` will help you implement Dropout and explore its effects on model generalization.
91 | 
92 | ### Q4: Convolutional Networks (30 points)
93 | In the IPython notebook `ConvolutionalNetworks.ipynb` you will implement several new layers that are commonly used in convolutional networks.
94 | 
95 | ### Q5: PyTorch / TensorFlow on CIFAR-10 (10 points)
96 | For this last part, you will be working in either TensorFlow or PyTorch, two popular and powerful deep learning frameworks. **You only need to complete ONE of these two notebooks.** You do NOT need to do both, and we will _not_ be awarding extra credit to those who do.
97 | 
98 | Open up either `PyTorch.ipynb` or `TensorFlow.ipynb`. There, you will learn how the framework works, culminating in training a convolutional network of your own design on CIFAR-10 to get the best performance you can.
99 | 
100 | ### Submitting your work
101 | 
102 | **Important**. Please make sure that the submitted notebooks have been run and the cell outputs are visible.
103 | 
104 | Once you have completed all notebooks and filled out the necessary code, there are **_two_** steps you must follow to submit your assignment:
105 | 
106 | **1.** If you selected Option A and worked on the assignment in Colab, open `collect_submission.ipynb` in Colab and execute the notebook cells. If you selected Option B and worked on the assignment locally, run the bash script in `assignment2` by executing `bash collectSubmission.sh`.
107 | 
108 | This notebook/script will:
109 | 
110 | * Generate a zip file of your code (`.py` and `.ipynb`) called `a2.zip`.
111 | * Convert all notebooks into a single PDF file.
112 | 
113 | **Note for Option B users**. You must have (a) `nbconvert` installed with Pandoc and TeX support and (b) `PyPDF2` installed to successfully convert your notebooks to a PDF file. Please follow these [installation instructions](https://nbconvert.readthedocs.io/en/latest/install.html#installing-nbconvert) to install (a) and run `pip install PyPDF2` to install (b). If you are, for some inexplicable reason, unable to successfully install the above dependencies, you can manually convert each jupyter notebook to HTML (`File -> Download as -> HTML (.html)`), save each HTML page as a PDF, then concatenate all the PDFs into a single PDF submission, either in your favorite PDF viewer or with the PyPDF2 sketch from Assignment 1.
114 | 
115 | If your submission for this step was successful, you should see the following display message:
116 | 
117 | `### Done! Please submit a2.zip and the pdfs to Gradescope. ###`
118 | 
119 | **2.** Submit the PDF and the zip file to [Gradescope](https://www.gradescope.com/courses/103764).
120 | 
121 | **Note for Option A users**. Remember to download `a2.zip` and `assignment.pdf` locally before submitting to Gradescope.
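
As promised in Q1 above, here is a small taste of what an update rule looks like: a self-contained numpy sketch of SGD with momentum. The function name and interface here are illustrative only; in the assignment you will implement the rules against the starter code's own interface:

```python
import numpy as np

def sgd_momentum(w, dw, v, learning_rate=1e-2, momentum=0.9):
    """One SGD+momentum step: v is a decaying running sum of past gradients,
    and the weights move along v rather than along the raw gradient."""
    v = momentum * v - learning_rate * dw  # update the velocity
    w = w + v                              # take the step
    return w, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([2.0, -3.0])
v = np.zeros_like(w)
for _ in range(200):
    w, v = sgd_momentum(w, 2.0 * w, v)
print(w)  # both entries end up near 0
```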
122 | 
--------------------------------------------------------------------------------
/assignments/assignment2_colab.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/assignments/assignment2_colab.zip
--------------------------------------------------------------------------------
/assignments/assignment2_jupyter.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/assignments/assignment2_jupyter.zip
--------------------------------------------------------------------------------
/assignments/assignment3.md:
--------------------------------------------------------------------------------
1 | ---
2 | layout: page
3 | title: Assignment 3
4 | mathjax: true
5 | permalink: /assignments2020/assignment3/
6 | ---
7 | 
8 | This assignment is due on **Wednesday, May 27 2020** at 11:59pm PDT.
9 | 
10 | <details>
11 |   <summary>Handy Download Links</summary>
12 | 
13 |   <ul>
14 |     <li><a href="{{site.hw_3_colab}}">Option A: Colab starter code</a></li>
15 |     <li><a href="{{site.hw_3_jupyter}}">Option B: Jupyter starter code</a></li>
16 |   </ul>
17 | </details>
18 | 
19 | - [Goals](#goals)
20 | - [Setup](#setup)
21 |   - [Option A: Google Colaboratory (Recommended)](#option-a-google-colaboratory-recommended)
22 |   - [Option B: Local Development](#option-b-local-development)
23 | - [Q1: Image Captioning with Vanilla RNNs (29 points)](#q1-image-captioning-with-vanilla-rnns-29-points)
24 | - [Q2: Image Captioning with LSTMs (23 points)](#q2-image-captioning-with-lstms-23-points)
25 | - [Q3: Network Visualization: Saliency maps, Class Visualization, and Fooling Images (15 points)](#q3-network-visualization-saliency-maps-class-visualization-and-fooling-images-15-points)
26 | - [Q4: Style Transfer (15 points)](#q4-style-transfer-15-points)
27 | - [Q5: Generative Adversarial Networks (15 points)](#q5-generative-adversarial-networks-15-points)
28 | - [Submitting your work](#submitting-your-work)
29 | 
30 | ### Goals
31 | 
32 | In this assignment, you will implement recurrent neural networks and apply them to image captioning on the Microsoft COCO data. You will also explore methods for visualizing the features of a pretrained model on ImageNet, and use this model to implement Style Transfer. Finally, you will train a Generative Adversarial Network to generate images that look like a training dataset!
33 | 
34 | The goals of this assignment are as follows:
35 | 
36 | - Understand the architecture of recurrent neural networks (RNNs) and how they operate on sequences by sharing weights over time.
37 | - Understand and implement both Vanilla RNNs and Long Short-Term Memory (LSTM) networks.
38 | - Understand how to combine convolutional neural nets and recurrent nets to implement an image captioning system.
39 | - Explore various applications of image gradients, including saliency maps, fooling images, and class visualizations.
40 | - Understand and implement techniques for image style transfer.
41 | - Understand how to train and implement a Generative Adversarial Network (GAN) to produce images that resemble samples from a dataset.
42 | 
43 | ### Setup
44 | 
45 | You should be able to use your setup from assignments 1 and 2.
46 | 
47 | You can work on the assignment in one of two ways: **remotely** on Google Colaboratory or **locally** on your own machine.
48 | 
49 | **Regardless of the method chosen, ensure you have followed the [setup instructions](/setup-instructions) before proceeding.**
50 | 
51 | #### Option A: Google Colaboratory (Recommended)
52 | 
53 | **Download.** Starter code containing Colab notebooks can be downloaded [here]({{site.hw_3_colab}}).
54 | 
55 | If you choose to work with Google Colab, please familiarize yourself with the [recommended workflow]({{site.baseurl}}/setup-instructions/#working-remotely-on-google-colaboratory).
56 | 
57 | 
58 | 
59 | **Note**. Ensure you are periodically saving your notebook (`File -> Save`) so that you don't lose your progress if you step away from the assignment and the Colab VM disconnects.
60 | 
61 | Once you have completed all Colab notebooks **except `collect_submission.ipynb`**, proceed to the [submission instructions](#submitting-your-work).
62 | 
63 | #### Option B: Local Development
64 | 
65 | **Download.** Starter code containing jupyter notebooks can be downloaded [here]({{site.hw_3_jupyter}}).
66 | 
67 | **Install Packages**. Once you have the starter code, activate your environment (the one you installed in the [Software Setup]({{site.baseurl}}/setup-instructions/) page) and run `pip install -r requirements.txt`.
68 | 
69 | **Download data**. Next, you will need to download the COCO captioning data, a pretrained SqueezeNet model (for TensorFlow), and a few ImageNet validation images. Run the following from the `assignment3` directory:
70 | 
71 | ```bash
72 | cd cs231n/datasets
73 | ./get_datasets.sh
74 | ```
75 | **Start Jupyter Server**. After you've downloaded the data, you can start the Jupyter server from the `assignment3` directory by executing `jupyter notebook` in your terminal.
76 | 
77 | Complete each notebook, then once you are done, go to the [submission instructions](#submitting-your-work).
78 | 
79 | **You can do Questions 3, 4, and 5 in TensorFlow or PyTorch. There are two versions of each of these notebooks, one for TensorFlow and one for PyTorch. No extra credit will be awarded if you do a question in both TensorFlow and PyTorch.**
80 | 
81 | ### Q1: Image Captioning with Vanilla RNNs (29 points)
82 | 
83 | The notebook `RNN_Captioning.ipynb` will walk you through the implementation of an image captioning system on MS-COCO using vanilla recurrent networks; the toy sketch below shows the core recurrence you will implement.
84 | 
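To make "sharing weights over time" concrete, here is a toy numpy sketch of a vanilla RNN stepping over a sequence. The names and shapes are illustrative only, not the starter code's exact interface:

```python
import numpy as np

def rnn_step(x, h_prev, Wx, Wh, b):
    """One vanilla RNN step: the new hidden state mixes the current input
    with the previous hidden state through weights shared across time."""
    return np.tanh(x @ Wx + h_prev @ Wh + b)

# Toy dimensions: batch 2, input size 4, hidden size 3.
rng = np.random.default_rng(0)
Wx, Wh, b = rng.normal(size=(4, 3)), rng.normal(size=(3, 3)), np.zeros(3)

h = np.zeros((2, 3))               # initial hidden state
for t in range(5):                 # the same Wx, Wh, b are reused at every step
    x_t = rng.normal(size=(2, 4))  # stand-in for the t-th sequence element
    h = rnn_step(x_t, h, Wx, Wh, b)
print(h.shape)  # (2, 3)
```
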
85 | ### Q2: Image Captioning with LSTMs (23 points)
86 | 
87 | The notebook `LSTM_Captioning.ipynb` will walk you through the implementation of Long Short-Term Memory (LSTM) RNNs, and apply them to image captioning on MS-COCO.
88 | 
89 | ### Q3: Network Visualization: Saliency maps, Class Visualization, and Fooling Images (15 points)
90 | 
91 | The notebooks `NetworkVisualization-TensorFlow.ipynb` and `NetworkVisualization-PyTorch.ipynb` will introduce the pretrained SqueezeNet model, compute gradients with respect to images, and use them to produce saliency maps and fooling images. Please complete only one of the notebooks (TensorFlow or PyTorch). No extra credit will be awarded if you complete both notebooks.
92 | 
93 | ### Q4: Style Transfer (15 points)
94 | 
95 | In the notebooks `StyleTransfer-TensorFlow.ipynb` or `StyleTransfer-PyTorch.ipynb` you will learn how to create images with the content of one image but the style of another. Please complete only one of the notebooks (TensorFlow or PyTorch). No extra credit will be awarded if you complete both notebooks.
96 | 
97 | ### Q5: Generative Adversarial Networks (15 points)
98 | 
99 | In the notebooks `GANS-TensorFlow.ipynb` or `GANS-PyTorch.ipynb` you will learn how to generate images that match a training dataset, and use these models to improve classifier performance when training on a large amount of unlabeled data and a small amount of labeled data. Please complete only one of the notebooks (TensorFlow or PyTorch). No extra credit will be awarded if you complete both notebooks.
100 | 
101 | ### Submitting your work
102 | 
103 | **Important**. Please make sure that the submitted notebooks have been run and the cell outputs are visible.
104 | 
105 | Once you have completed all notebooks and filled out the necessary code, there are **_two_** steps you must follow to submit your assignment:
106 | 
107 | **1.** If you selected Option A and worked on the assignment in Colab, open `collect_submission.ipynb` in Colab and execute the notebook cells. If you selected Option B and worked on the assignment locally, run the bash script in `assignment3` by executing `bash collectSubmission.sh`.
108 | 
109 | This notebook/script will:
110 | 
111 | * Generate a zip file of your code (`.py` and `.ipynb`) called `a3.zip`.
112 | * Convert all notebooks into a single PDF file.
113 | 
114 | **Note for Option B users**. You must have (a) `nbconvert` installed with Pandoc and TeX support and (b) `PyPDF2` installed to successfully convert your notebooks to a PDF file. Please follow these [installation instructions](https://nbconvert.readthedocs.io/en/latest/install.html#installing-nbconvert) to install (a) and run `pip install PyPDF2` to install (b). If you are, for some inexplicable reason, unable to successfully install the above dependencies, you can manually convert each jupyter notebook to HTML (`File -> Download as -> HTML (.html)`), save each HTML page as a PDF, then concatenate all the PDFs into a single PDF submission, either in your favorite PDF viewer or with the PyPDF2 sketch from Assignment 1.
115 | 
116 | If your submission for this step was successful, you should see the following display message:
117 | 
118 | `### Done! Please submit a3.zip and the pdfs to Gradescope. ###`
119 | 
120 | **2.** Submit the PDF and the zip file to [Gradescope](https://www.gradescope.com/courses/103764).
121 | 
122 | **Note for Option A users**. Remember to download `a3.zip` and `assignment.pdf` locally before submitting to Gradescope.
123 | 
--------------------------------------------------------------------------------
/assignments/assignment3_colab.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/assignments/assignment3_colab.zip
--------------------------------------------------------------------------------
/assignments/assignment3_jupyter.zip:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/assignments/assignment3_jupyter.zip
--------------------------------------------------------------------------------
/slides/lecture_10.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_10.pdf
--------------------------------------------------------------------------------
/slides/lecture_11.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_11.pdf
--------------------------------------------------------------------------------
/slides/lecture_12.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_12.pdf
--------------------------------------------------------------------------------
/slides/lecture_13.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_13.pdf
--------------------------------------------------------------------------------
/slides/lecture_14.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_14.pdf
--------------------------------------------------------------------------------
/slides/lecture_16_Hao.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_16_Hao.pdf
--------------------------------------------------------------------------------
/slides/lecture_17.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_17.pdf
--------------------------------------------------------------------------------
/slides/lecture_18.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_18.pdf
--------------------------------------------------------------------------------
/slides/lecture_1_ranjay.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_1_ranjay.pdf
--------------------------------------------------------------------------------
/slides/lecture_2.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_2.pdf
--------------------------------------------------------------------------------
/slides/lecture_3.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_3.pdf
--------------------------------------------------------------------------------
/slides/lecture_4.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_4.pdf
--------------------------------------------------------------------------------
/slides/lecture_5.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_5.pdf
--------------------------------------------------------------------------------
/slides/lecture_6.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_6.pdf
--------------------------------------------------------------------------------
/slides/lecture_7.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_7.pdf
--------------------------------------------------------------------------------
/slides/lecture_8.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_8.pdf
--------------------------------------------------------------------------------
/slides/lecture_9.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/lecture_9.pdf
--------------------------------------------------------------------------------
/slides/section_2_annotated.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/section_2_annotated.pdf
--------------------------------------------------------------------------------
/slides/section_2_backprop.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/section_2_backprop.pdf
--------------------------------------------------------------------------------
/slides/section_3_project.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/section_3_project.pdf
--------------------------------------------------------------------------------
/slides/section_5_midterm.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/section_5_midterm.pdf
--------------------------------------------------------------------------------
/slides/section_7_detection.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/section_7_detection.pdf
--------------------------------------------------------------------------------
/slides/section_8_video.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/maxim5/cs231n-2020-spring/de14d936477fb4a1bb45da2a7b08ae6daf7e4ac8/slides/section_8_video.pdf
--------------------------------------------------------------------------------