├── (KDT)TMDB 인기영화 추천.ipynb
├── (KDT)TMDB컨텐츠 기반 영화 추천.ipynb
├── (KDT)_TF_IDF_이해와_유사도_계산.ipynb
├── (KDT)_더미_데이터를_이용한_사용자_기반_추천_실습.ipynb
├── (KDT)_더미_데이터를_이용한_아이템_기반_추천_실습.ipynb
├── (KDT)_사용자_기반_협업_필터링_추천.ipynb
├── (KDT)_숫자이미지_오토인코더.ipynb
├── (KDT)_아이템_기반_협업_필터링_추천.ipynb
├── 0. Jupyter Notebook 시작하기.ipynb
├── 1. 파이썬의 컴퓨팅 라이브러리, numpy.ipynb
├── 11주차_PySpark_기본_2일차.ipynb
├── 11주차_PySpark_기본_3일차_2.ipynb
├── 11주차_PySpark_기본_4일차_1.ipynb
├── 11주차_PySpark_기본_4일차_2.ipynb
├── 11주차_PySpark_기본_5일차_1.ipynb
├── 11주차_SQL_실습_3일차_1.ipynb
├── 2. 파이썬으로 데이터 주무르기, pandas.ipynb
├── 3. python으로 멋드러진 그래프 그리기, Matplotlib.ipynb
├── 4. EDA.ipynb
├── Large_Linear_Regression.ipynb
├── Linear Algebra, Matrix Calculus.ipynb
├── Linear Algebra, Matrix Calculus.pdf
├── ML_Basics (Linear Regression).ipynb
├── ML_Basics (Linear Regression).pdf
├── ML_E2E.ipynb
├── ML_Linear_Classification.ipynb
├── ML_Linear_Models_Practice.ipynb
├── ML_Probability_1.ipynb
├── [KDT] 10주차 실습
│   ├── BatchNormalization.ipynb
│   ├── ConvolutionalNetworks.ipynb
│   ├── FullyConnectedNets.ipynb
│   └── RNN_Captioning.ipynb
├── [KDT] 인공지능 11주차 실습.
│   ├── 2일차_SQL_실습.ipynb
│   ├── 3일차_SQL_실습.ipynb
│   ├── 4일차_실습.ipynb
│   └── 5일차_실습.ipynb
├── [KDT] 인공지능 14주차 실습
│   └── [Day 5]실습6-2-OpenCV로 웹캠접근.ipynb
├── [KDT] 인공지능 12주차 실습
│   ├── PySpark_타이베이시_주택가격_예측.ipynb
│   ├── text classification (CNN with pre-training).ipynb
│   ├── text classification (CNN without pre-training).ipynb
│   ├── text classification (bag of words).ipynb
│   └── text classification (embedding).ipynb
├── [KDT] 인공지능 13주차 실습
│   ├── lecture_material_전체.pdf
│   ├── 실습 12 pix2pix.ipynb
│   ├── 실습 13 cyclegan.ipynb
│   ├── 실습 14 style_transfer.ipynb
│   ├── 실습 yolov4_colab에서의 실시간 webcam.ipynb
│   ├── 실습1-폐렴검출.ipynb
│   ├── 실습10-dcgan.ipynb
│   ├── 실습2-전이학습.ipynb
│   ├── 실습3-object localization.ipynb
│   ├── 실습4 -OpenCV이미지와 영상처리 개요.ipynb
│   ├── 실습5 - OpenCV로 Face Detection.ipynb
│   ├── 실습6 - Tensorflow Hub로 objection detection.ipynb
│   ├── 실습6-2-OpenCV로 웹캠접근.ipynb
│   ├── 실습7-YOLOv3.ipynb
│   ├── 실습8-커스텀 YOLOv3.ipynb
│   └── 실습9 image segmentation.ipynb
├── [KDT] 인공지능 16차 실습
│   └── 아마존_뷰티_상품_평점_정보를_이용한_추천_엔진_만들기.ipynb
├── [KDT] 인공지능 2주차 실습
│   ├── [1-1. 선형시스템]-2.ipynb
│   ├── [1-3. LU 분해]-2.ipynb
│   ├── [1-4. 행렬연산과 선형조합].ipynb
│   ├── [1-8. SVD, PCA]-1.ipynb
│   └── [1-8. 벡터공간과 최소제곱법]-2.ipynb
├── [KDT] 인공지능 6,7주차 강의자료
│   ├── Large_Linear_Regression.ipynb
│   ├── Linear Algebra, Matrix Calculus.ipynb
│   ├── Linear Algebra, Matrix Calculus.pdf
│   ├── ML_Basics (Linear Regression).ipynb
│   ├── ML_Basics (Linear Regression).pdf
│   ├── ML_Linear_Classification.ipynb
│   ├── ML_Linear_Models_Practice.ipynb
│   ├── ML_Probability_1.ipynb
│   ├── ml_basics_1.pptx
│   ├── ml_basics_2.pptx
│   ├── ml_basics_3.pptx
│   ├── ml_linear_classification.pptx
│   ├── ml_linear_regression.pptx
│   ├── ml_probability_1.pptx
│   └── ml_probability_2.pptx
├── [KDT] 인공지능 8주차 실습
│   ├── [실습]02_PyTorch_기본연산.ipynb
│   ├── [실습]02_TensorFlow_기본연산.ipynb
│   ├── [실습]03_Pytorch_자동미분연산_MLP.ipynb.ipynb
│   ├── [실습]03_TensorFlow_자동미분연산_MLP.ipynb
│   ├── [실습]04_PyTorch_CNN(AlexNet).ipynb
│   └── [실습]04_TensorFlow_CNN(AlexNet).ipynb
├── [KDT] 인공지능 9주차 실습
│   ├── Tensorflow Tutorial(InceptionV3 Model).ipynb
│   ├── [실습]05_PyTorch_CNN(VGGNet).ipynb
│   ├── [실습]05_TensorFlow_CNN(VGGNet).ipynb
│   ├── [실습]06_PyTorch_CNN(VGGNet_훈련).ipynb
│   ├── [실습]06_TensorFlow_CNN(VGGNet_훈련).ipynb
│   ├── [실습]07_PyTorch_CNN(GoogLeNet).ipynb
│   ├── [실습]07_TensorFlow_CNN(GoogLeNet_InceptionV3_훈련).ipynb
│   ├── [실습]08_PyTorch_CNN(ResNet).ipynb
│   ├── [실습]08_TensorFlow_CNN(ResNet).ipynb
│   ├── [실습]08_TensorFlow_CNN(ResNet_훈련).ipynb
│   ├── [실습]09_PyTorch_CNN(DenseNet).ipynb
│   ├── [실습]09_PyTorch_CNN(DenseNet_훈련).ipynb
│   ├── [실습]09_TensorFlow_CNN(DenseNet).ipynb
│   ├── [실습]09_TensorFlow_CNN(DenseNet_훈련).ipynb
│   ├── [실습]10_PyTorch_RNN.ipynb
│   └── [실습]10_TensorFlow_RNN.ipynb
├── country_wise_latest.csv
├── day_wise.csv
├── images
│   ├── 05.06-gaussian-basis.png
│   ├── ann
│   │   ├── README
│   │   ├── activation_functions_plot.png
│   │   ├── fashion_mnist_images_plot.png
│   │   ├── fashion_mnist_plot.png
│   │   ├── keras_learning_curves_plot.png
│   │   └── perceptron_iris_plot.png
│   ├── autoencoder.png
│   ├── autoencoders
│   │   └── README
│   ├── classification
│   │   ├── README
│   │   ├── confusion_matrix_errors_plot.png
│   │   ├── confusion_matrix_plot.png
│   │   ├── more_digits_plot.png
│   │   ├── precision_recall_vs_threshold_plot.png
│   │   ├── precision_vs_recall_plot.png
│   │   └── some_digit_plot.png
│   ├── cnn
│   │   ├── README
│   │   ├── china_horizontal.png
│   │   ├── china_max_pooling.png
│   │   ├── china_original.png
│   │   ├── china_vertical.png
│   │   └── test_image.png
│   ├── decision_tree.png
│   ├── decision_trees
│   │   └── README
│   ├── deep
│   │   ├── README
│   │   ├── elu_plot.png
│   │   ├── leaky_relu_plot.png
│   │   ├── selu_plot.png
│   │   └── sigmoid_saturation_plot.png
│   ├── distributed
│   │   └── README
│   ├── end_to_end_project
│   │   ├── README
│   │   ├── attribute_histogram_plots.png
│   │   ├── bad_visualization_plot.png
│   │   ├── better_visualization_plot.png
│   │   ├── california.png
│   │   ├── california_housing_prices_plot.png
│   │   ├── housing_prices_scatterplot.png
│   │   ├── income_vs_house_value_scatterplot.png
│   │   └── scatter_matrix_plot.png
│   ├── ensembles
│   │   └── README
│   ├── fig2-1.png
│   ├── fig2-14.png
│   ├── fig2-17.png
│   ├── fig2-2.png
│   ├── fig2-3.png
│   ├── fig2-4.png
│   ├── fig3-2.png
│   ├── fig3-3.png
│   ├── fig_det.png
│   ├── fundamentals
│   │   └── README
│   ├── kfolds.png
│   ├── nlp
│   │   └── README
│   ├── random_forest.png
│   ├── rl
│   │   ├── README
│   │   └── breakout.gif
│   ├── rnn
│   │   └── README
│   ├── svm
│   │   └── README
│   ├── tensorflow
│   │   └── README
│   ├── training_linear_models
│   │   ├── README
│   │   ├── generated_data_plot.png
│   │   ├── gradient_descent_paths_plot.png
│   │   ├── gradient_descent_plot.png
│   │   ├── high_degree_polynomials_plot.png
│   │   ├── learning_curves_plot.png
│   │   ├── linear_model_predictions_plot.png
│   │   ├── logistic_function_plot.png
│   │   ├── quadratic_data_plot.png
│   │   ├── quadratic_predictions_plot.png
│   │   ├── ridge_regression_plot.png
│   │   ├── sgd_plot.png
│   │   └── underfitting_learning_curves_plot.png
│   └── unsupervised_learning
│       ├── README
│       └── ladybug.png
└── train.csv
/(KDT)_더미_데이터를_이용한_사용자_기반_추천_실습.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 4,
3 | "nbformat_minor": 0,
4 | "metadata": {
5 | "colab": {
6 | "name": "(KDT) 더미 데이터를 이용한 사용자 기반 추천 실습",
7 | "provenance": [],
8 | "collapsed_sections": [],
9 | "include_colab_link": true
10 | },
11 | "kernelspec": {
12 | "name": "python3",
13 | "display_name": "Python 3"
14 | }
15 | },
16 | "cells": [
17 | {
18 | "cell_type": "markdown",
19 | "metadata": {
20 | "id": "view-in-github",
21 | "colab_type": "text"
22 | },
23 | "source": [
24 |         ""
25 | ]
26 | },
27 | {
28 | "cell_type": "code",
29 | "metadata": {
30 | "id": "hNYMxHoxCIUj"
31 | },
32 | "source": [
33 | "import pandas as pd\n",
34 | "from sklearn.metrics.pairwise import cosine_similarity"
35 | ],
36 | "execution_count": null,
37 | "outputs": []
38 | },
39 | {
40 | "cell_type": "code",
41 | "metadata": {
42 | "id": "yMSjSA8QDLpA"
43 | },
44 | "source": [
45 | "dummy_rating = pd.read_csv(\"https://grepp-reco-test.s3.ap-northeast-2.amazonaws.com/dummy_rating.csv\", index_col=0)"
46 | ],
47 | "execution_count": null,
48 | "outputs": []
49 | },
50 | {
51 | "cell_type": "code",
52 | "metadata": {
53 | "id": "ND5mf6sVsoxs"
54 | },
55 | "source": [
56 | "dummy_rating.shape"
57 | ],
58 | "execution_count": null,
59 | "outputs": []
60 | },
61 | {
62 | "cell_type": "code",
63 | "metadata": {
64 | "id": "t4eKv_FrD0bz"
65 | },
66 | "source": [
67 | "dummy_rating.head()"
68 | ],
69 | "execution_count": null,
70 | "outputs": []
71 | },
72 | {
73 | "cell_type": "code",
74 | "metadata": {
75 | "id": "Arr4hz_iD4ip"
76 | },
77 | "source": [
78 | "dummy_rating.fillna(0, inplace=True)\n",
79 | "dummy_rating"
80 | ],
81 | "execution_count": null,
82 | "outputs": []
83 | },
84 | {
85 | "cell_type": "code",
86 | "metadata": {
87 | "id": "PDB6vSY3EDJN"
88 | },
89 | "source": [
90 | "# 평점 정보를 보정. 이후에 코사인 유사도를 사용하면 이는 피어슨 유사도에 해당\n",
91 | "def standardize(row):\n",
92 | " new_row = (row - row.mean())/(row.max()-row.min())\n",
93 | " return new_row\n",
94 | "\n",
95 | "dummy_rating_std = dummy_rating.apply(standardize)\n",
96 | "dummy_rating_std.head()"
97 | ],
98 | "execution_count": null,
99 | "outputs": []
100 | },
101 | {
102 | "cell_type": "code",
103 | "metadata": {
104 | "id": "H-mvAftFJLBF"
105 | },
106 | "source": [
107 |         "# 정규화 없이 사용자간의 유사도 측정 행렬 만들기\n",
108 | "corrMatrix_wo_std = pd.DataFrame(cosine_similarity(dummy_rating), index=dummy_rating.index, columns=dummy_rating.index)\n",
109 | "corrMatrix_wo_std"
110 | ],
111 | "execution_count": null,
112 | "outputs": []
113 | },
114 | {
115 | "cell_type": "code",
116 | "metadata": {
117 | "id": "hhEsGTdpO5Gx"
118 | },
119 | "source": [
120 |         "# 정규화 기반 사용자간의 유사도 측정 행렬 만들기\n",
121 | "corrMatrix = pd.DataFrame(cosine_similarity(dummy_rating_std), index=dummy_rating.index, columns=dummy_rating.index)\n",
122 | "corrMatrix"
123 | ],
124 | "execution_count": null,
125 | "outputs": []
126 | },
127 | {
128 | "cell_type": "code",
129 | "metadata": {
130 | "id": "MgS2PZ48Jn-H"
131 | },
132 | "source": [
133 | "def get_similar(userId):\n",
134 | " similar_score = corrMatrix[userId]\n",
135 |         "    # 앞서 계산한 유사도 점수를 내림차순으로 정렬\n",
136 | " similar_score = similar_score.sort_values(ascending=False)\n",
137 | " return similar_score"
138 | ],
139 | "execution_count": null,
140 | "outputs": []
141 | },
142 | {
143 | "cell_type": "code",
144 | "metadata": {
145 | "id": "Cn00uRC3KABb"
146 | },
147 | "source": [
148 | "scifi_lover = \"user1\"\n",
149 | "\n",
150 | "similar_users = get_similar(scifi_lover)\n",
151 | "similar_users.head(10)"
152 | ],
153 | "execution_count": null,
154 | "outputs": []
155 | },
156 | {
157 | "cell_type": "code",
158 | "metadata": {
159 | "id": "Nt_OWrWgL9fk"
160 | },
161 | "source": [
162 | ""
163 | ],
164 | "execution_count": null,
165 | "outputs": []
166 | }
167 | ]
168 | }
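
The notebook above stops after ranking similar users with `get_similar`. Below is a minimal sketch of how those similarity scores could be turned into item recommendations, assuming the `dummy_rating` and `corrMatrix` DataFrames built above (user IDs on the index, one column per item, unrated items filled with 0). The helper name `recommend_items` and the top-k weighted-average scheme are illustrative additions, not part of the original notebook.

```python
import numpy as np

def recommend_items(user_id, ratings, corr_matrix, top_k=3, n_items=5):
    """Score unrated items for user_id by a similarity-weighted average of
    the ratings given by the top_k most similar users."""
    # Most similar users, excluding the target user; negative similarities are
    # kept as-is for simplicity.
    neighbors = corr_matrix[user_id].drop(user_id).sort_values(ascending=False).head(top_k)
    # Items the target user has not rated yet (0 = missing after fillna(0)).
    unseen = ratings.columns[ratings.loc[user_id] == 0]
    scores = {}
    for item in unseen:
        neighbor_ratings = ratings.loc[neighbors.index, item]
        if neighbors.values.sum() > 0:
            scores[item] = np.dot(neighbor_ratings, neighbors.values) / neighbors.values.sum()
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:n_items]

# Example usage with the objects defined in the notebook:
# recommend_items("user1", dummy_rating, corrMatrix)
```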
--------------------------------------------------------------------------------
/0. Jupyter Notebook 시작하기.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# 0. Jupyter Notebook 시작하기\n",
8 | "**Jupyter Notebook을 사용해 Python 프로그래밍을 진행해봅시다.**"
9 | ]
10 | },
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "1. Jupyter Notebook의 단축키 익히기 \n",
16 | " - Esc : 명령 모드\n",
17 | " - **Y** : Code Cell\n",
18 | " - **M** : Markdown Cell\n",
19 | " - **A** : 현재 셀 위에 셀 만들기\n",
20 | " - **B** : 현재 셀 아래에 셀 만들기\n",
21 | " - **dd** : 현재 셀 삭제하기\n",
22 | " - Enter : 편집 모드\n",
23 | " - ctrl(cmd) + Enter : 셀 실행하기(Code) / 형식 반영하기(Markdown) \n",
24 | "2. Markdown\n",
25 | " - Header : #, ##, ### \n",
26 | " - *italic* : _, * \n",
27 | " - **bold** : __, **\n",
28 |     "    - ~~strikethrough~~ : ~~\n",
29 | " - unordered list : -, ...\n",
30 | " - ordered list : 1. "
31 | ]
32 | },
33 | {
34 | "cell_type": "markdown",
35 | "metadata": {},
36 | "source": [
37 | "----"
38 | ]
39 | },
40 | {
41 | "cell_type": "markdown",
42 | "metadata": {},
43 | "source": [
44 | "## Jupyter Notebook Tutorial\n",
45 | "\n",
46 | "**마우스 없이 키보드만으로 아래 미션들을 해결해봅시다!**"
47 | ]
48 | },
49 | {
50 | "cell_type": "markdown",
51 | "metadata": {},
52 | "source": [
53 | "### Mission 1\n",
54 | "Code Mode로 이 셀 **아래**에 Python Code로 Hello World!를 출력해봅시다."
55 | ]
56 | },
57 | {
58 | "cell_type": "markdown",
59 | "metadata": {},
60 | "source": [
61 | "----"
62 | ]
63 | },
64 | {
65 | "cell_type": "markdown",
66 | "metadata": {},
67 | "source": [
68 | "### Mission 2\n",
69 | "Edit Mode로 이 셀 **위**에 자기소개 글을 적어봅시다."
70 | ]
71 | },
72 | {
73 | "cell_type": "markdown",
74 | "metadata": {},
75 | "source": [
76 | "----"
77 | ]
78 | },
79 | {
80 | "cell_type": "markdown",
81 | "metadata": {},
82 | "source": [
83 | "### Mission 3\n",
84 | "아래 Cell의 속성을 바꿔서 파이썬 코드가 잘 실행되게 해주세요!"
85 | ]
86 | },
87 | {
88 | "cell_type": "markdown",
89 | "metadata": {},
90 | "source": [
91 | "import random as rand \n",
92 | "print(rand.randint(0, 10))"
93 | ]
94 | },
95 | {
96 | "cell_type": "markdown",
97 | "metadata": {},
98 | "source": [
99 | "----"
100 | ]
101 | },
102 | {
103 | "cell_type": "markdown",
104 | "metadata": {},
105 | "source": [
106 | "## Mission 4\n",
107 | "\n",
108 | "출출한 야심한 밤, 라면을 끓여먹으려고 합니다.\n",
109 | "이를 함께 준비해볼까요?\n",
110 | "\n",
111 | "### 4.1\n",
112 | "라면 먹고 싶은 마음이 너무 큰 우리, 아래에 셀을 추가하고, **헤더 1 크기**의 큰 글씨로 \"라면먹고싶다!\"를 적어봅시다."
113 | ]
114 | },
115 | {
116 | "cell_type": "markdown",
117 | "metadata": {},
118 | "source": [
119 | "### 4.2\n",
120 | "어떤 라면을 먹고 싶으신가요? 여러분이 먹고 싶은 라면을 아래 셀을 추가하고 다음과 같이 *italic* 체와 함께 적어봅시다. \n",
121 | "ex) 나는 *불닭볶음면* 이 땡긴다..."
122 | ]
123 | },
124 | {
125 | "cell_type": "markdown",
126 | "metadata": {},
127 | "source": [
128 | "### 4.3\n",
129 | "아래 셀에는 라면을 만들 때 쓸 재료들이 있습니다. \n",
130 | "1. 재료들을 unordered list로 만들어봅시다.\n",
131 | "2. 꼭 넣어야 하는 재료는 **bold** 처리를 해줍시다.\n",
132 | "3. 넣지 말아야하는 재료는 ~strikethrough~ 처리를 해줍시다."
133 | ]
134 | },
135 | {
136 | "cell_type": "markdown",
137 | "metadata": {},
138 | "source": [
139 | "물 \n",
140 | "라면 스프 \n",
141 | "라면 \n",
142 | "계란 \n",
143 | "치즈 \n",
144 | "홍삼 \n",
145 | "아메리카노 "
146 | ]
147 | },
148 | {
149 | "cell_type": "markdown",
150 | "metadata": {},
151 | "source": [
152 | "### 4.4 \n",
153 | "아래 셀에는 라면을 끓이는 순서가 뒤죽박죽 나열되어있습니다. \n",
154 | "순서에 맞게 이를 ordered list로 만들어주세요. \n",
155 | "(이 Mission에서는 마우스를 사용해도 됩니다!)"
156 | ]
157 | },
158 | {
159 | "cell_type": "markdown",
160 | "metadata": {},
161 | "source": [
162 | "면이 다 익었을 때 즈음 추가 재료들을 넣어준다. \n",
163 | "냄비에 물을 올리고 불을 켠다. \n",
164 | "불을 끄고, 맛있게 먹는다! \n",
165 | "끓는 물에 라면과 라면 스프를 넣는다. "
166 | ]
167 | },
168 | {
169 | "cell_type": "markdown",
170 | "metadata": {},
171 | "source": [
172 | "----"
173 | ]
174 | }
175 | ],
176 | "metadata": {
177 | "kernelspec": {
178 | "display_name": "Python 3",
179 | "language": "python",
180 | "name": "python3"
181 | },
182 | "language_info": {
183 | "codemirror_mode": {
184 | "name": "ipython",
185 | "version": 3
186 | },
187 | "file_extension": ".py",
188 | "mimetype": "text/x-python",
189 | "name": "python",
190 | "nbconvert_exporter": "python",
191 | "pygments_lexer": "ipython3",
192 | "version": "3.9.0"
193 | }
194 | },
195 | "nbformat": 4,
196 | "nbformat_minor": 4
197 | }
198 |
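
For reference, a possible sketch of the Python the missions above expect once the relevant cells are switched to code cells; the exact output text is up to the learner.

```python
# Mission 1: print a greeting from a code cell.
print("Hello World!")

# Mission 3: once the markdown cell is converted to a code cell, this runs as Python.
import random as rand
print(rand.randint(0, 10))
```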
--------------------------------------------------------------------------------
/1. 파이썬의 컴퓨팅 라이브러리, numpy.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# 1. 파이썬의 컴퓨팅 라이브러리, numpy\n",
8 | "**numpy를 이용해서 데이터를 다뤄봅시다!**"
9 | ]
10 | },
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "### Our Goal\n",
16 | "1. Numpy 시작하기\n",
17 | " - prerequisite : Python의 List\n",
18 | " - numpy import하기\n",
19 | " - numpy.array\n",
20 | "\n",
21 | "2. Numpy로 연산하기\n",
22 | " - Vector - Scalar : elementwise! (+, -, *, /)\n",
23 | " - Vector - Vector : elementwise / broadcasting (+, -, *, /)\n",
24 | " - Indexing & Slicing\n",
25 | "3. Example : Linear Algebra with Numpy\n",
26 | " 1. basics\n",
27 | " - 영벡터 : `.zeros()`\n",
28 | " - 일벡터 : `.ones()`\n",
29 | " - 대각행렬 : `.diag()`\n",
30 | " - 항등행렬 : `.eye()`\n",
31 | " - 행렬곱 : `@` / `.dot()`\n",
32 | " \n",
33 | " 2. furthermore\n",
34 | " - 트레이스 : `.trace()`\n",
35 | " - 행렬식 : `.linalg.det()`\n",
36 | " - 역행렬 : `.linalg.inv()`\n",
37 | " - 고유값 : `.linalg.eig()`\n"
38 | ]
39 | },
40 | {
41 | "cell_type": "markdown",
42 | "metadata": {},
43 | "source": [
44 | "## I. Numpy 시작하기"
45 | ]
46 | },
47 | {
48 | "cell_type": "markdown",
49 | "metadata": {},
50 | "source": [
51 | "## II. Numpy로 연산하기"
52 | ]
53 | },
54 | {
55 | "cell_type": "markdown",
56 | "metadata": {},
57 | "source": [
58 | "## III. Numpy로 선형대수 지식 끼얹기"
59 | ]
60 | },
61 | {
62 | "cell_type": "markdown",
63 | "metadata": {},
64 | "source": [
65 | "## IV. Exercises"
66 | ]
67 | },
68 | {
69 | "cell_type": "markdown",
70 | "metadata": {},
71 | "source": [
72 | "### 1. 어떤 벡터가 주어졌을 때 L2 norm을 구하는 함수 `get_L2_norm()`을 작성하세요\n",
73 | "\n",
74 | "- **매개변수** : 1차원 벡터 (`np.array`)\n",
75 | "- **반환값** : 인자로 주어진 벡터의 L2 Norm값 (`number`)"
76 | ]
77 | },
78 | {
79 | "cell_type": "code",
80 | "execution_count": null,
81 | "metadata": {},
82 | "outputs": [],
83 | "source": []
84 | },
85 | {
86 | "cell_type": "markdown",
87 | "metadata": {},
88 | "source": [
89 | "### 2. 어떤 행렬이 singular matrix인지 확인하는 함수 `is_singular()` 를 작성하세요\n",
90 | "\n",
91 |     "- 매개변수 : 2차원 배열(`np.array`)\n",
92 |     "- 반환값 : 인자로 주어진 행렬이 singular하면 True, non-singular하면 False를 반환 "
93 | ]
94 | },
95 | {
96 | "cell_type": "code",
97 | "execution_count": null,
98 | "metadata": {},
99 | "outputs": [],
100 | "source": []
101 | }
102 | ],
103 | "metadata": {
104 | "kernelspec": {
105 | "display_name": "Python 3",
106 | "language": "python",
107 | "name": "python3"
108 | },
109 | "language_info": {
110 | "codemirror_mode": {
111 | "name": "ipython",
112 | "version": 3
113 | },
114 | "file_extension": ".py",
115 | "mimetype": "text/x-python",
116 | "name": "python",
117 | "nbconvert_exporter": "python",
118 | "pygments_lexer": "ipython3",
119 | "version": "3.9.0"
120 | }
121 | },
122 | "nbformat": 4,
123 | "nbformat_minor": 4
124 | }
125 |
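
A possible sketch of the two exercise functions above, using the `np.linalg` routines introduced in section III; treating a determinant close to zero as singular is a simplification (a rank check via `np.linalg.matrix_rank` would be more robust for large or badly scaled matrices).

```python
import numpy as np

def get_L2_norm(v):
    """Return the L2 (Euclidean) norm of a 1-D vector."""
    return np.linalg.norm(v)          # same as np.sqrt(np.sum(v ** 2))

def is_singular(A):
    """Return True if the square matrix A is singular (determinant ~ 0)."""
    return bool(np.isclose(np.linalg.det(A), 0.0))

print(get_L2_norm(np.array([3, 4])))            # 5.0
print(is_singular(np.array([[1, 2], [2, 4]])))  # True: rows are linearly dependent
print(is_singular(np.eye(3)))                   # False
```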
--------------------------------------------------------------------------------
/2. 파이썬으로 데이터 주무르기, pandas.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# 2. 파이썬으로 데이터 주무르기, pandas\n",
8 | "**pandas를 활용해서 데이터프레임을 다뤄봅시다.**"
9 | ]
10 | },
11 | {
12 | "cell_type": "markdown",
13 | "metadata": {},
14 | "source": [
15 | "1. Pandas 시작하기\n",
16 | " - prerequisite : Table\n",
17 | " - pandas import하기\n",
18 | " \n",
19 | "2. Pandas로 1차원 데이터 다루기 - Series \n",
20 | " - Series 선언하기\n",
21 | " - Series vs ndarray\n",
22 | " - Series vs dict\n",
23 | " - Series에 이름 붙이기\n",
24 | "3. Pandas로 2차원 데이터 다루기 - dataframe\n",
25 | " - dataframe 선언하기\n",
26 | " - from csv to dataframe\n",
27 | " - dataframe 자료 접근하기\n",
28 | "\n",
29 | "[수업에 사용된 covid 데이터](https://www.kaggle.com/imdevskp/corona-virus-report)"
30 | ]
31 | },
32 | {
33 | "cell_type": "markdown",
34 | "metadata": {},
35 | "source": [
36 | "## I. pandas 시작하기"
37 | ]
38 | },
39 | {
40 | "cell_type": "markdown",
41 | "metadata": {},
42 | "source": [
43 | "## II. pandas로 1차원 데이터 다루기 - Series"
44 | ]
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "## III. Pandas로 2차원 데이터 다루기 - dataframe"
51 | ]
52 | },
53 | {
54 | "cell_type": "markdown",
55 | "metadata": {},
56 | "source": [
57 | "## Mission:\n",
58 | "### 1. covid 데이터에서 100 case 대비 사망률(`Deaths / 100 Cases`)이 가장 높은 국가는?"
59 | ]
60 | },
61 | {
62 | "cell_type": "code",
63 | "execution_count": null,
64 | "metadata": {},
65 | "outputs": [],
66 | "source": []
67 | },
68 | {
69 | "cell_type": "markdown",
70 | "metadata": {},
71 | "source": [
72 |     "### 2. covid 데이터에서 신규 확진자가 없는 나라 중 WHO Region이 'Europe'인 나라를 모두 출력하면? \n",
73 | "Hint : 한 줄에 동시에 두가지 조건을 Apply하는 경우 Warning이 발생할 수 있습니다."
74 | ]
75 | },
76 | {
77 | "cell_type": "code",
78 | "execution_count": null,
79 | "metadata": {},
80 | "outputs": [],
81 | "source": []
82 | },
83 | {
84 | "cell_type": "markdown",
85 | "metadata": {},
86 | "source": [
87 | "### 3. 다음 [데이터](https://www.kaggle.com/neuromusic/avocado-prices)를 이용해 각 Region별로 아보카도가 가장 비싼 평균가격(AveragePrice)을 출력하면?"
88 | ]
89 | },
90 | {
91 | "cell_type": "code",
92 | "execution_count": null,
93 | "metadata": {},
94 | "outputs": [],
95 | "source": []
96 | }
97 | ],
98 | "metadata": {
99 | "kernelspec": {
100 | "display_name": "Python 3",
101 | "language": "python",
102 | "name": "python3"
103 | },
104 | "language_info": {
105 | "codemirror_mode": {
106 | "name": "ipython",
107 | "version": 3
108 | },
109 | "file_extension": ".py",
110 | "mimetype": "text/x-python",
111 | "name": "python",
112 | "nbconvert_exporter": "python",
113 | "pygments_lexer": "ipython3",
114 | "version": "3.9.0"
115 | }
116 | },
117 | "nbformat": 4,
118 | "nbformat_minor": 4
119 | }
120 |
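
A sketch of one way to approach the three missions, assuming the covid file is the `country_wise_latest.csv` included in the repository with the Kaggle column names referenced in the prompts (`Deaths / 100 Cases`, `New cases`, `WHO Region`, `Country/Region`), and that the avocado dataset has been downloaded as `avocado.csv`; file paths and column names may need adjusting.

```python
import pandas as pd

covid = pd.read_csv("./country_wise_latest.csv")

# 1. Country with the highest deaths per 100 cases
print(covid.loc[covid["Deaths / 100 Cases"].idxmax(), "Country/Region"])

# 2. Countries with no new cases whose WHO Region is Europe
#    (two separate filters, matching the hint about chained conditions)
no_new_cases = covid[covid["New cases"] == 0]
print(no_new_cases[no_new_cases["WHO Region"] == "Europe"]["Country/Region"])

# 3. Highest average avocado price per region
avocado = pd.read_csv("./avocado.csv")
print(avocado.groupby("region")["AveragePrice"].max())
```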
--------------------------------------------------------------------------------
/3. python으로 멋드러진 그래프 그리기, Matplotlib.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 |     "# Matplotlib으로 데이터 시각화하기\n",
8 | "\n",
9 | "**데이터를 보기좋게 표현해봅시다.**"
10 | ]
11 | },
12 | {
13 | "cell_type": "markdown",
14 | "metadata": {},
15 | "source": [
16 | "#### 1. Matplotlib 시작하기\n",
17 | " \n",
18 | "#### 2. 자주 사용되는 Plotting의 Options\n",
19 | "- 크기 : `figsize`\n",
20 | "- 제목 : `title`\n",
21 | "- 라벨 : `_label`\n",
22 |     "- 눈금 : `_ticks`\n",
23 | "- 범례 : `legend`\n",
24 | " \n",
25 | "#### 3. Matplotlib Case Study\n",
26 | "- 꺾은선 그래프 (Plot)\n",
27 | "- 산점도 (Scatter Plot)\n",
28 | "- 박스그림 (Box Plot)\n",
29 | "- 막대그래프 (Bar Chart)\n",
30 | "- 원형그래프 (Pie Chart)\n",
31 | " \n",
32 | "#### 4. The 멋진 그래프, seaborn Case Study\n",
33 | "- 커널밀도그림 (Kernel Density Plot)\n",
34 | "- 카운트그림 (Count Plot)\n",
35 | "- 캣그림 (Cat Plot)\n",
36 | "- 스트립그림 (Strip Plot)\n",
37 | "- 히트맵 (Heatmap)\n",
38 | " "
39 | ]
40 | },
41 | {
42 | "cell_type": "markdown",
43 | "metadata": {},
44 | "source": [
45 | "## I. Matplotlib 시작하기"
46 | ]
47 | },
48 | {
49 | "cell_type": "markdown",
50 | "metadata": {},
51 | "source": [
52 |     "## II. 자주 사용되는 Plotting의 Options"
53 | ]
54 | },
55 | {
56 | "cell_type": "markdown",
57 | "metadata": {},
58 | "source": [
59 | "## III. Matplotlib Case Study"
60 | ]
61 | },
62 | {
63 | "cell_type": "markdown",
64 | "metadata": {},
65 | "source": [
66 | "## IV. The 멋진 그래프, Seaborn Case Study"
67 | ]
68 | },
69 | {
70 | "cell_type": "markdown",
71 | "metadata": {},
72 | "source": [
73 | "## Mission:"
74 | ]
75 | },
76 | {
77 | "cell_type": "markdown",
78 | "metadata": {},
79 | "source": [
80 | "### "
81 | ]
82 | }
83 | ],
84 | "metadata": {
85 | "kernelspec": {
86 | "display_name": "Python 3",
87 | "language": "python",
88 | "name": "python3"
89 | },
90 | "language_info": {
91 | "codemirror_mode": {
92 | "name": "ipython",
93 | "version": 3
94 | },
95 | "file_extension": ".py",
96 | "mimetype": "text/x-python",
97 | "name": "python",
98 | "nbconvert_exporter": "python",
99 | "pygments_lexer": "ipython3",
100 | "version": "3.9.0"
101 | }
102 | },
103 | "nbformat": 4,
104 | "nbformat_minor": 4
105 | }
106 |
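
A small sketch that puts the options listed in the outline (figsize, title, labels, ticks, legend) together on one line plot; the data is arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.arange(0, 10, 0.5)

plt.figure(figsize=(8, 4))                     # 크기
plt.plot(x, x ** 2, label="y = x^2")           # 꺾은선 그래프
plt.plot(x, 20 * np.sin(x), label="y = 20sin(x)")
plt.title("Matplotlib options example")        # 제목
plt.xlabel("x")                                # 라벨
plt.ylabel("y")
plt.xticks(np.arange(0, 11, 2))                # 눈금
plt.legend()                                   # 범례
plt.show()
```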
--------------------------------------------------------------------------------
/4. EDA.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# 4. Exploratory Data Analysis\n",
8 | "\n",
9 | "**탐색적 데이터 분석을 통해 데이터를 통달해봅시다.** with [Titanic Data](https://www.kaggle.com/c/titanic)\n",
10 | "\n",
11 | "0. 라이브러리 준비\n",
12 | "1. 분석의 목적과 변수 확인\n",
13 | "2. 데이터 전체적으로 살펴보기\n",
14 | "3. 데이터의 개별 속성 파악하기"
15 | ]
16 | },
17 | {
18 | "cell_type": "markdown",
19 | "metadata": {},
20 | "source": [
21 | "## 0. 라이브러리 준비"
22 | ]
23 | },
24 | {
25 | "cell_type": "markdown",
26 | "metadata": {},
27 | "source": [
28 | "## 1. 분석의 목적과 변수 확인"
29 | ]
30 | },
31 | {
32 | "cell_type": "markdown",
33 | "metadata": {},
34 | "source": [
35 | "## 2. 데이터 전체적으로 살펴보기"
36 | ]
37 | },
38 | {
39 | "cell_type": "markdown",
40 | "metadata": {},
41 | "source": [
42 | "## 3. 데이터의 개별 속성 파악하기"
43 | ]
44 | },
45 | {
46 | "cell_type": "markdown",
47 | "metadata": {},
48 | "source": [
49 | "## Mission : It's Your Turn!\n",
50 | "\n",
51 | "### 1. 본문에서 언급된 Feature를 제외하고 유의미한 Feature를 1개 이상 찾아봅시다.\n",
52 | "\n",
53 |     "- Hint : Fare? SibSp? Parch?\n",
54 | "\n",
55 | "### 2. [Kaggle](https://www.kaggle.com/datasets)에서 Dataset을 찾고, 이 Dataset에서 유의미한 Feature를 3개 이상 찾고 이를 시각화해봅시다.\n",
56 | "\n",
57 | "함께 보면 좋은 라이브러리 document\n",
58 |     "- [numpy](https://numpy.org/doc/)\n",
59 |     "- [pandas](https://pandas.pydata.org/docs/)\n",
60 |     "- [seaborn](https://seaborn.pydata.org/)\n",
61 |     "- [matplotlib](https://matplotlib.org/stable/)"
62 | ]
63 | }
64 | ],
65 | "metadata": {
66 | "kernelspec": {
67 | "display_name": "Python 3",
68 | "language": "python",
69 | "name": "python3"
70 | },
71 | "language_info": {
72 | "codemirror_mode": {
73 | "name": "ipython",
74 | "version": 3
75 | },
76 | "file_extension": ".py",
77 | "mimetype": "text/x-python",
78 | "name": "python",
79 | "nbconvert_exporter": "python",
80 | "pygments_lexer": "ipython3",
81 | "version": "3.9.0"
82 | }
83 | },
84 | "nbformat": 4,
85 | "nbformat_minor": 4
86 | }
87 |
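
A sketch of how Mission 1 could start, assuming the `train.csv` in the repository root is the Kaggle Titanic training set with its usual columns (`Survived`, `Fare`, `SibSp`) and that seaborn 0.11+ is installed; the choice of plots is only illustrative.

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

titanic = pd.read_csv("./train.csv")

# Fare: compare the fare distribution of survivors vs. non-survivors
sns.kdeplot(data=titanic, x="Fare", hue="Survived", common_norm=False)
plt.xlim(0, 200)
plt.show()

# SibSp: survival rate by number of siblings/spouses aboard
print(titanic.groupby("SibSp")["Survived"].mean())
```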
--------------------------------------------------------------------------------
/Large_Linear_Regression.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "%matplotlib inline\n",
10 | "import matplotlib.pyplot as plt\n",
11 | "import seaborn as sns; sns.set()\n",
12 | "import numpy as np\n",
13 | "\n",
14 | "from sklearn.linear_model import LinearRegression\n",
15 | "model = LinearRegression(fit_intercept=True)"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": null,
21 | "metadata": {},
22 | "outputs": [],
23 | "source": [
24 | "N = 1000\n",
25 | "M = 3\n",
26 | "rng = np.random.RandomState(1)\n",
27 | "rng.rand(N, M).shape\n",
28 | "X = 10 * rng.rand(N, M)\n",
29 | "np.dot(X, [1.5, -2., 1.]).shape"
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "execution_count": null,
35 | "metadata": {},
36 | "outputs": [],
37 | "source": [
38 | "rng = np.random.RandomState(1)\n",
39 | "X = 10 * rng.rand(N, M)\n",
40 | "y = 0.5 + np.dot(X, [1.5, -2., 1.])\n",
41 | "\n",
42 | "model.fit(X, y)\n",
43 | "print(model.intercept_)\n",
44 | "print(model.coef_)"
45 | ]
46 | },
47 | {
48 | "cell_type": "markdown",
49 | "metadata": {},
50 | "source": [
51 | "### Normal Equations"
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "execution_count": null,
57 | "metadata": {},
58 | "outputs": [],
59 | "source": [
60 | "import numpy.linalg as LA\n",
61 | "normal_equation_solution = LA.inv(X.T@X)@X.T@y"
62 | ]
63 | },
64 | {
65 | "cell_type": "code",
66 | "execution_count": null,
67 | "metadata": {
68 | "scrolled": true
69 | },
70 | "outputs": [],
71 | "source": [
72 | "normal_equation_solution"
73 | ]
74 | },
75 | {
76 | "cell_type": "markdown",
77 | "metadata": {},
78 | "source": [
79 | "### Small Memory Normal Equations (Online)"
80 | ]
81 | },
82 | {
83 | "cell_type": "code",
84 | "execution_count": null,
85 | "metadata": {},
86 | "outputs": [],
87 | "source": [
88 | "A = np.zeros([M, M])\n",
89 | "b = np.zeros([M, 1])"
90 | ]
91 | },
92 | {
93 | "cell_type": "code",
94 | "execution_count": null,
95 | "metadata": {},
96 | "outputs": [],
97 | "source": [
98 | "for i in range(N):\n",
99 | " A = A + X[i, np.newaxis].T@X[i, np.newaxis]\n",
100 | " b = b + X[i, np.newaxis].T*y[i]\n",
101 | "solution = LA.inv(A)@b"
102 | ]
103 | },
104 | {
105 | "cell_type": "code",
106 | "execution_count": null,
107 | "metadata": {},
108 | "outputs": [],
109 | "source": [
110 | "solution"
111 | ]
112 | },
113 | {
114 | "cell_type": "markdown",
115 | "metadata": {},
116 | "source": [
117 | "### SGD"
118 | ]
119 | },
120 | {
121 | "cell_type": "code",
122 | "execution_count": null,
123 | "metadata": {},
124 | "outputs": [],
125 | "source": [
126 | "w = np.zeros([M, 1])"
127 | ]
128 | },
129 | {
130 | "cell_type": "code",
131 | "execution_count": null,
132 | "metadata": {},
133 | "outputs": [],
134 | "source": [
135 | "eta = 0.001\n",
136 | "n_iter = 1000"
137 | ]
138 | },
139 | {
140 | "cell_type": "code",
141 | "execution_count": null,
142 | "metadata": {},
143 | "outputs": [],
144 | "source": [
145 | "for i in range(n_iter):\n",
146 | " i = i % N\n",
147 | " neg_gradient = (y[i] - w.T@X[i, np.newaxis].T) * X[i, np.newaxis].T\n",
148 | " w = w + eta * neg_gradient\n",
149 | "print(w)"
150 | ]
151 | },
152 | {
153 | "cell_type": "code",
154 | "execution_count": null,
155 | "metadata": {},
156 | "outputs": [],
157 | "source": []
158 | }
159 | ],
160 | "metadata": {
161 | "kernelspec": {
162 | "display_name": "Python 3",
163 | "language": "python",
164 | "name": "python3"
165 | },
166 | "language_info": {
167 | "codemirror_mode": {
168 | "name": "ipython",
169 | "version": 3
170 | },
171 | "file_extension": ".py",
172 | "mimetype": "text/x-python",
173 | "name": "python",
174 | "nbconvert_exporter": "python",
175 | "pygments_lexer": "ipython3",
176 | "version": "3.7.8"
177 | }
178 | },
179 | "nbformat": 4,
180 | "nbformat_minor": 4
181 | }
182 |
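
One detail worth noting: unlike `LinearRegression(fit_intercept=True)`, the normal-equation and SGD cells above fit no intercept term, so they cannot recover the 0.5 bias used to generate `y`. A minimal sketch of adding a bias column so the closed-form solution matches the sklearn fit; variable names follow the notebook.

```python
import numpy as np
import numpy.linalg as LA

rng = np.random.RandomState(1)
N, M = 1000, 3
X = 10 * rng.rand(N, M)
y = 0.5 + np.dot(X, [1.5, -2., 1.])

# Prepend a column of ones so the first weight plays the role of the intercept.
X_b = np.hstack([np.ones((N, 1)), X])
theta = LA.inv(X_b.T @ X_b) @ X_b.T @ y
print(theta)   # approximately [0.5, 1.5, -2.0, 1.0]
```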
--------------------------------------------------------------------------------
/Linear Algebra, Matrix Calculus.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/Linear Algebra, Matrix Calculus.pdf
--------------------------------------------------------------------------------
/ML_Basics (Linear Regression).ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "%matplotlib inline\n",
10 | "import matplotlib.pyplot as plt\n",
11 | "import seaborn as sns; sns.set()\n",
12 | "import numpy as np"
13 | ]
14 | },
15 | {
16 | "cell_type": "markdown",
17 | "metadata": {},
18 | "source": [
19 | "## 선형회귀 (Linear Regression)\n",
20 | "\n",
21 | "먼저, 주어진 데이터를 직선을 사용해 모델링하는 방법을 살펴본다. 직선함수는 다음과 같은 형태를 가진다.\n",
22 | "$$\n",
23 | "y = ax + b\n",
24 | "$$\n",
25 | "여기서 $a$는 기울기 (*slope*)이고 $b$는 $y$절편 (*intercept*)라고 불린다.\n",
26 | "\n",
27 | "아래 그래프는 기울기가 2이고 $y$절편이 -5인 직선으로부터 생성된 데이터를 보여준다"
28 | ]
29 | },
30 | {
31 | "cell_type": "code",
32 | "execution_count": null,
33 | "metadata": {},
34 | "outputs": [],
35 | "source": [
36 | "rng = np.random.RandomState(1)\n",
37 | "x = 10 * rng.rand(50)\n",
38 | "y = 2 * x - 5 + rng.randn(50)\n",
39 | "plt.scatter(x, y);"
40 | ]
41 | },
42 | {
43 | "cell_type": "markdown",
44 | "metadata": {},
45 | "source": [
46 | "Scikit-Learn의 ``LinearRegression`` estimator를 사용해서 위 데이터를 가장 잘 표현하는 직선을 찾을 수 있다."
47 | ]
48 | },
49 | {
50 | "cell_type": "code",
51 | "execution_count": null,
52 | "metadata": {},
53 | "outputs": [],
54 | "source": [
55 | "x.shape"
56 | ]
57 | },
58 | {
59 | "cell_type": "code",
60 | "execution_count": null,
61 | "metadata": {},
62 | "outputs": [],
63 | "source": [
64 | "x[:, np.newaxis].shape"
65 | ]
66 | },
67 | {
68 | "cell_type": "code",
69 | "execution_count": null,
70 | "metadata": {},
71 | "outputs": [],
72 | "source": [
73 | "from sklearn.linear_model import LinearRegression\n",
74 | "model = LinearRegression(fit_intercept=True)\n",
75 | "\n",
76 | "model.fit(x[:, np.newaxis], y)\n",
77 | "\n",
78 | "xfit = np.linspace(0, 10, 1000)\n",
79 | "yfit = model.predict(xfit[:, np.newaxis])\n",
80 | "\n",
81 | "plt.scatter(x, y)\n",
82 | "plt.plot(xfit, yfit);"
83 | ]
84 | },
85 | {
86 | "cell_type": "markdown",
87 | "metadata": {},
88 | "source": [
89 | "모델 학습이 끝난 후 학습된 파라미터들은 model.\"파라미터이름\"_ 의 형태로 저장된다. 기울기와 y절편은 아래와 같이 출력할 수 있다."
90 | ]
91 | },
92 | {
93 | "cell_type": "code",
94 | "execution_count": null,
95 | "metadata": {},
96 | "outputs": [],
97 | "source": [
98 | "print(\"Model slope: \", model.coef_[0])\n",
99 | "print(\"Model intercept:\", model.intercept_)"
100 | ]
101 | },
102 | {
103 | "cell_type": "markdown",
104 | "metadata": {},
105 | "source": [
106 | "``LinearRegression`` estimator는 위의 예제와 같은 1차원 입력뿐만 아니라 다차원 입력을 사용한 선형모델을 다룰 수 있다. 다차원 선형모델은 다음과 같은 형태를 가진다.\n",
107 | "$$\n",
108 | "y = a_0 + a_1 x_1 + a_2 x_2 + \\cdots\n",
109 | "$$\n",
110 | "기하학적으로 이것은 hyper-plane으로 데이터를 표현하는 것이라고 말할 수 있다."
111 | ]
112 | },
113 | {
114 | "cell_type": "code",
115 | "execution_count": null,
116 | "metadata": {},
117 | "outputs": [],
118 | "source": [
119 | "rng = np.random.RandomState(1)\n",
120 | "rng.rand(100, 3).shape\n",
121 | "X = 10 * rng.rand(100, 3)\n",
122 | "np.dot(X, [1.5, -2., 1.]).shape"
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "execution_count": null,
128 | "metadata": {},
129 | "outputs": [],
130 | "source": [
131 | "rng = np.random.RandomState(1)\n",
132 | "X = 10 * rng.rand(100, 3)\n",
133 | "y = 0.5 + np.dot(X, [1.5, -2., 1.])\n",
134 | "\n",
135 | "model.fit(X, y)\n",
136 | "print(model.intercept_)\n",
137 | "print(model.coef_)"
138 | ]
139 | },
140 | {
141 | "cell_type": "markdown",
142 | "metadata": {},
143 | "source": [
144 | "$y$값들은 랜덤하게 생성된 3차원의 $x$값과 계수들([1.5, -2., 1.])을 곱함으로써 생성되었는데, linear regression을 통해서 이 계수들을 계산해낼 수 있다는 것을 알 수 있다.\n",
145 | "\n",
146 | "만약 데이터가 선형적인 관계를 가지고 있지 않다면?"
147 | ]
148 | },
149 | {
150 | "cell_type": "markdown",
151 | "metadata": {},
152 | "source": [
153 | "## 선형 기저함수 모델 (Linear Basis Function Models)\n",
154 | "\n",
155 | "비선형데이터를 선형함수로 모델링하는 한가지 방법은 기저함수(basis function)을 사용하는 것이다.\n",
156 | "\n",
157 | "예를 들어, 다음과 같은 선형함수를 사용한다고 하자.\n",
158 | "$$\n",
159 | "y = a_0 + a_1 x_1 + a_2 x_2 + a_3 x_3 + \\cdots\n",
160 | "$$\n",
161 | "여기서 $x_1, x_2, x_3,$등을 1차원 $x$로부터 생성할 수 있다($x_n = f_n(x)$). $f_n$을 기저함수(basis function)라고 부른다.\n",
162 | "\n",
163 | "만약 $f_n(x) = x^n$라는 기저함수를 사용하면 최종적인 모델은 다음과 같을 것이다.\n",
164 | "\n",
165 | "$$\n",
166 | "y = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \\cdots\n",
167 | "$$\n",
168 | "\n",
169 | "이 모델은 여전히 계수($a_n$)에 관해서는 선형함수임을 기억하자. 따라서 1차원 변수인 $x$를 기저함수를 통해 다차원으로 확장시킴으로써 우리는 여전히 선형모델(linear regression)을 사용할 수 있게 된다."
170 | ]
171 | },
172 | {
173 | "cell_type": "markdown",
174 | "metadata": {},
175 | "source": [
176 | "### 다항 기저함수 (Polynomial Basis Functions)\n",
177 | "\n",
178 | "$f_n(x) = x^n$ 형태의 함수를 다항 기저함수 (polynomial basis functions)라고 부른다. Scikit-Learn은 ``PolynomialFeatures``이라는 transformer를 이미 포함하고 있다."
179 | ]
180 | },
181 | {
182 | "cell_type": "code",
183 | "execution_count": null,
184 | "metadata": {},
185 | "outputs": [],
186 | "source": [
187 | "from sklearn.preprocessing import PolynomialFeatures\n",
188 | "x = np.array([2, 3, 4])\n",
189 | "poly = PolynomialFeatures(3, include_bias=False)\n",
190 | "poly.fit_transform(x[:, None])"
191 | ]
192 | },
193 | {
194 | "cell_type": "markdown",
195 | "metadata": {},
196 | "source": [
197 | "PolynomialFeatures가 1차원 array를 3차원 array로 변환한 것을 볼 수 있다. 이렇게 변환된 데이터를 선형모델에 적용할 수 있다.\n",
198 | "\n",
199 |     "7차 다항식 변환을 적용해보자."
200 | ]
201 | },
202 | {
203 | "cell_type": "code",
204 | "execution_count": null,
205 | "metadata": {},
206 | "outputs": [],
207 | "source": [
208 | "from sklearn.pipeline import make_pipeline\n",
209 | "poly_model = make_pipeline(PolynomialFeatures(7),\n",
210 | " LinearRegression())"
211 | ]
212 | },
213 | {
214 | "cell_type": "markdown",
215 | "metadata": {},
216 | "source": [
217 | "다차원 변환을 사용하면 복잡한 데이터를 모델링할 수 있게 된다. 예를 들어 sine 함수를 사용해서 데이터를 생성하고 모델링해보자."
218 | ]
219 | },
220 | {
221 | "cell_type": "code",
222 | "execution_count": null,
223 | "metadata": {},
224 | "outputs": [],
225 | "source": [
226 | "rng = np.random.RandomState(1)\n",
227 | "x = 10 * rng.rand(50)\n",
228 | "y = np.sin(x) + 0.1 * rng.randn(50)\n",
229 | "plt.scatter(x, y)"
230 | ]
231 | },
232 | {
233 | "cell_type": "code",
234 | "execution_count": null,
235 | "metadata": {},
236 | "outputs": [],
237 | "source": [
238 | "poly_model.fit(x[:, np.newaxis], y)\n",
239 | "yfit = poly_model.predict(xfit[:, np.newaxis])\n",
240 | "\n",
241 | "plt.scatter(x, y)\n",
242 | "plt.plot(xfit, yfit);"
243 | ]
244 | },
245 | {
246 | "cell_type": "markdown",
247 | "metadata": {},
248 | "source": [
249 | "### 가우시안 기저함수 (Gaussian Basis Functions)\n",
250 | "\n",
251 | "다항 기저함수 외에 다른 기저함수를 사용해보자. 가우시안 기저함수는 다음과 같이 정의된다.\n",
252 | "\n",
253 | "$$\\exp \\{-\\frac{(x-u_j)^2}{2s^2}\\}$$\n",
254 | "\n",
255 | "$u_j$는 함수의 위치, $s$는 폭을 결정한다. 주어진 데이터를 여러 개의 가우시안 기저함수들의 합으로 표현하려고 시도할 수 있다."
256 | ]
257 | },
258 | {
259 | "cell_type": "markdown",
260 | "metadata": {},
261 | "source": [
262 |         "![](images/05.06-gaussian-basis.png)"
263 | ]
264 | },
265 | {
266 | "cell_type": "markdown",
267 | "metadata": {},
268 | "source": [
269 | "그래프 아래 회색부분은 다수의 가우시안 기저함수들을 보여주고 있다. 이 기저함수들의 합은 데이터를 가로지르는 매끈한 곡선이 된다.\n",
270 | "\n",
271 | "가우시안 기저함수는 Scikit-Learn에 포함되어 있지 않지만 어렵지 않게 구현할 수 있다."
272 | ]
273 | },
274 | {
275 | "cell_type": "code",
276 | "execution_count": null,
277 | "metadata": {},
278 | "outputs": [],
279 | "source": [
280 | "from sklearn.base import BaseEstimator, TransformerMixin\n",
281 | "\n",
282 | "class GaussianFeatures(BaseEstimator, TransformerMixin):\n",
283 | " \"\"\"Uniformly spaced Gaussian features for one-dimensional input\"\"\"\n",
284 | " \n",
285 | " def __init__(self, N, width_factor=2.0):\n",
286 | " self.N = N\n",
287 | " self.width_factor = width_factor\n",
288 | " \n",
289 | " @staticmethod\n",
290 | " def _gauss_basis(x, y, width, axis=None):\n",
291 | " arg = (x - y) / width\n",
292 | " return np.exp(-0.5 * np.sum(arg ** 2, axis))\n",
293 | " \n",
294 | " def fit(self, X, y=None):\n",
295 | " # create N centers spread along the data range\n",
296 | " self.centers_ = np.linspace(X.min(), X.max(), self.N)\n",
297 | " self.width_ = self.width_factor * (self.centers_[1] - self.centers_[0])\n",
298 | " return self\n",
299 | " \n",
300 | " def transform(self, X):\n",
301 | " return self._gauss_basis(X[:, :, np.newaxis], self.centers_,\n",
302 | " self.width_, axis=1)\n",
303 | " \n",
304 | "gauss_model = make_pipeline(GaussianFeatures(20),\n",
305 | " LinearRegression())\n",
306 | "gauss_model.fit(x[:, np.newaxis], y)\n",
307 | "yfit = gauss_model.predict(xfit[:, np.newaxis])\n",
308 | "\n",
309 | "plt.scatter(x, y)\n",
310 | "plt.plot(xfit, yfit)\n",
311 | "plt.xlim(0, 10);"
312 | ]
313 | },
314 | {
315 | "cell_type": "markdown",
316 | "metadata": {},
317 | "source": [
318 | "## 규제화 (Regularization)\n",
319 | "\n",
320 | "기저함수를 사용함으로써 복잡한 데이터를 모델링할 수 있게 되었지만 조심하지 않는다면 over-fitting이라는 다른 심각한 문제를 만날 수 있다! 예를 들어, 너무 많은 개수의 가우시안 기저함수를 사용하게 되면 다음과 같이 될 수 있다."
321 | ]
322 | },
323 | {
324 | "cell_type": "code",
325 | "execution_count": null,
326 | "metadata": {},
327 | "outputs": [],
328 | "source": [
329 | "model = make_pipeline(GaussianFeatures(30),\n",
330 | " LinearRegression())\n",
331 | "model.fit(x[:, np.newaxis], y)\n",
332 | "\n",
333 | "plt.scatter(x, y)\n",
334 | "plt.plot(xfit, model.predict(xfit[:, np.newaxis]))\n",
335 | "\n",
336 | "plt.xlim(0, 10)\n",
337 | "plt.ylim(-1.5, 1.5);"
338 | ]
339 | },
340 | {
341 | "cell_type": "markdown",
342 | "metadata": {},
343 | "source": [
344 | "이 예제에서는 30개의 기저함수가 사용되었는데 모델이 필요이상으로 flexible해져서 데이터가 없는 곳에서는 극단적인 값을 가지는 것을 볼 수 있다. 기저함수의 계수들은 다음과 같이 확인할 수 있다."
345 | ]
346 | },
347 | {
348 | "cell_type": "code",
349 | "execution_count": null,
350 | "metadata": {
351 | "scrolled": true
352 | },
353 | "outputs": [],
354 | "source": [
355 | "def basis_plot(model, title=None):\n",
356 | " fig, ax = plt.subplots(2, sharex=True)\n",
357 | " model.fit(x[:, np.newaxis], y)\n",
358 | " ax[0].scatter(x, y)\n",
359 | " ax[0].plot(xfit, model.predict(xfit[:, np.newaxis]))\n",
360 | " ax[0].set(xlabel='x', ylabel='y', ylim=(-1.5, 1.5))\n",
361 | " \n",
362 | " if title:\n",
363 | " ax[0].set_title(title)\n",
364 | "\n",
365 | " ax[1].plot(model.steps[0][1].centers_,\n",
366 | " model.steps[1][1].coef_)\n",
367 | " ax[1].set(xlabel='basis location',\n",
368 | " ylabel='coefficient',\n",
369 | " xlim=(0, 10))\n",
370 | " \n",
371 | "model = make_pipeline(GaussianFeatures(30), LinearRegression())\n",
372 | "basis_plot(model)"
373 | ]
374 | },
375 | {
376 | "cell_type": "markdown",
377 | "metadata": {},
378 | "source": [
379 | "위 두번째 그래프는 각각의 가우시안 기저함수의 크기(계수값)을 보여주고 있다. Over-fitting이 일어나는 영역에서는 인접한 기저함수들의 값이 극단으로 가면서 서로 상쇄하는 현상이 일어난다. 따라서 큰 계수값에 대해 penalty를 부여해서 over-fitting을 어느 정도 극복할 수 있을 것이다. 이러한 penalty를 *regularization*이라 부른다."
380 | ]
381 | },
382 | {
383 | "cell_type": "markdown",
384 | "metadata": {},
385 | "source": [
386 | "### Ridge regression ($L_2$ Regularization)\n",
387 | "\n",
388 | "가장 자주 쓰이는 형태의 regularization은 *ridge regression* ($L_2$ *regularization*)이고 다음과 같이 정의된다.\n",
389 | "$$\n",
390 | "P = \\alpha\\sum_{n=1}^N \\theta_n^2\n",
391 | "$$\n",
392 | "여기서 $\\alpha$는 regularization의 강도를 조절하는 파라미터이다. 이 형태의 regularization은 Scikit-Learn의 ``Ridge`` estimator에서 사용된다."
393 | ]
394 | },
395 | {
396 | "cell_type": "code",
397 | "execution_count": null,
398 | "metadata": {},
399 | "outputs": [],
400 | "source": [
401 | "from sklearn.linear_model import Ridge\n",
402 | "model = make_pipeline(GaussianFeatures(30), Ridge(alpha=0.1))\n",
403 | "basis_plot(model, title='Ridge Regression')"
404 | ]
405 | },
406 | {
407 | "cell_type": "markdown",
408 | "metadata": {},
409 | "source": [
410 | "$\\alpha$값이 0에 가까워질수록 일반적인 선형회귀모델이 되고 $\\alpha$값이 무한대로 증가하면 데이터는 모델에 영향을 주지 않게 된다."
411 | ]
412 | },
413 | {
414 | "cell_type": "markdown",
415 | "metadata": {},
416 | "source": [
417 | "### Lasso regression ($L_1$ regularization)\n",
418 | "\n",
419 | "또 하나의 자주 쓰이는 regularization방법은 계수들의 절대값의 합을 제한하는 것이다.\n",
420 | "$$\n",
421 | "P = \\alpha\\sum_{n=1}^N |\\theta_n|\n",
422 | "$$\n",
423 | "뒤에서 자세히 다루겠지만 이 방법은 *sparse*한 모델을 생성하게 된다(많은 계수들이 0이 됨)."
424 | ]
425 | },
426 | {
427 | "cell_type": "code",
428 | "execution_count": null,
429 | "metadata": {
430 | "scrolled": true
431 | },
432 | "outputs": [],
433 | "source": [
434 | "from sklearn.linear_model import Lasso\n",
435 | "model = make_pipeline(GaussianFeatures(30), Lasso(alpha=0.001))\n",
436 | "basis_plot(model, title='Lasso Regression')"
437 | ]
438 | },
439 | {
440 | "cell_type": "markdown",
441 | "metadata": {},
442 | "source": [
443 | "위에서 볼 수 있듯 대부분의 계수값들이 0이 된다. Ridge regression과 마찬가지로 $\\alpha$값으로 regularization의 강도를 조절할 수 있다."
444 | ]
445 | },
446 | {
447 | "cell_type": "markdown",
448 | "metadata": {},
449 | "source": [
450 | "### SGD"
451 | ]
452 | },
453 | {
454 | "cell_type": "code",
455 | "execution_count": null,
456 | "metadata": {},
457 | "outputs": [],
458 | "source": [
459 | "from sklearn.linear_model import SGDRegressor\n",
460 | "model = make_pipeline(GaussianFeatures(30),\n",
461 | " SGDRegressor(max_iter=1000, tol=1e-8, alpha=0))\n",
462 | "basis_plot(model)"
463 | ]
464 | },
465 | {
466 | "cell_type": "code",
467 | "execution_count": null,
468 | "metadata": {},
469 | "outputs": [],
470 | "source": []
471 | }
472 | ],
473 | "metadata": {
474 | "kernelspec": {
475 | "display_name": "Python 3",
476 | "language": "python",
477 | "name": "python3"
478 | },
479 | "language_info": {
480 | "codemirror_mode": {
481 | "name": "ipython",
482 | "version": 3
483 | },
484 | "file_extension": ".py",
485 | "mimetype": "text/x-python",
486 | "name": "python",
487 | "nbconvert_exporter": "python",
488 | "pygments_lexer": "ipython3",
489 | "version": "3.7.8"
490 | }
491 | },
492 | "nbformat": 4,
493 | "nbformat_minor": 4
494 | }
495 |
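
The Ridge and Lasso cells above fix the regularization strength by hand (α = 0.1 and 0.001). A short sketch of choosing α by cross-validation with `RidgeCV` instead, reusing the `GaussianFeatures` transformer and the `x`, `y`, `basis_plot` objects defined in the notebook; the α grid is arbitrary.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.pipeline import make_pipeline

# Assumes GaussianFeatures, x, y and basis_plot are already defined above.
alphas = np.logspace(-4, 1, 20)                    # candidate regularization strengths
model = make_pipeline(GaussianFeatures(30), RidgeCV(alphas=alphas))
model.fit(x[:, np.newaxis], y)

print("chosen alpha:", model.steps[1][1].alpha_)   # value selected by cross-validation
basis_plot(model, title='Ridge Regression (alpha chosen by CV)')
```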
--------------------------------------------------------------------------------
/ML_Basics (Linear Regression).pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/ML_Basics (Linear Regression).pdf
--------------------------------------------------------------------------------
/[KDT] 인공지능 14주차 실습/[Day 5]실습6-2-OpenCV로 웹캠접근.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "## OpenCV로 웹캠 접근"
8 | ]
9 | },
10 | {
11 | "cell_type": "markdown",
12 | "metadata": {},
13 | "source": [
14 | "밑의 예제는 다음 사이트에서 가져왔습니다:\n",
15 | "https://hoony-gunputer.tistory.com/"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": 1,
21 | "metadata": {
22 | "collapsed": false
23 | },
24 | "outputs": [],
25 | "source": [
26 |     "import cv2\n",
27 |     "cap = cv2.VideoCapture(0)  # 0이면 노트북 내장 웹캠, 숫자를 올리면 추가된 웹캠을 이용할 수 있다. \n",
28 | "# 3은 가로 4는 세로 길이 \n",
29 | "cap.set(3, 720) \n",
30 | "cap.set(4, 1080) \n",
31 | "while True: \n",
32 | " ret, frame = cap.read()\n",
33 | " cv2.imshow('test', frame) \n",
34 | " \n",
35 | " # frame 에 대한 영상처리\n",
36 | " # frame에 대해 object detection하고\n",
37 | " # frame에다가 box를 그려주기 \n",
38 | " \n",
39 | " k = cv2.waitKey(1) \n",
40 | " if k == 27: \n",
41 | " break \n",
42 | "cap.release() \n",
43 | "cv2.destroyAllWindows()"
44 | ]
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "## OpenCV로 동영상 저장"
51 | ]
52 | },
53 | {
54 | "cell_type": "code",
55 | "execution_count": 1,
56 | "metadata": {
57 | "collapsed": true
58 | },
59 | "outputs": [],
60 | "source": [
61 | "import cv2 \n",
62 | "cap = cv2.VideoCapture(0) \n",
63 | "cap.set(3, 720) \n",
64 | "cap.set(4, 1080) \n",
65 | "fc = 20.0 \n",
66 | "codec = cv2.VideoWriter_fourcc('D', 'I', 'V', 'X') \n",
67 | "out = cv2.VideoWriter('mycam.avi', codec, fc, (int(cap.get(3)), int(cap.get(4)))) \n",
68 | "while True: \n",
69 | " ret, frame = cap.read()\n",
70 | " cv2.imshow('test', frame)\n",
71 | " # frame 에 대한 영상처리\n",
72 | " # frame에 대해 object detection하고\n",
73 | " # frame에다가 box를 그려주기 \n",
74 | " \n",
75 | " out.write(frame) \n",
76 | " k = cv2.waitKey(1) \n",
77 | " if k == 27: break \n",
78 | "cap.release() \n",
79 | "cv2.destroyAllWindows()"
80 | ]
81 | },
82 | {
83 | "cell_type": "code",
84 | "execution_count": null,
85 | "metadata": {
86 | "collapsed": true
87 | },
88 | "outputs": [],
89 | "source": []
90 | }
91 | ],
92 | "metadata": {
93 | "anaconda-cloud": {},
94 | "kernelspec": {
95 | "display_name": "Python [default]",
96 | "language": "python",
97 | "name": "python3"
98 | },
99 | "language_info": {
100 | "codemirror_mode": {
101 | "name": "ipython",
102 | "version": 3
103 | },
104 | "file_extension": ".py",
105 | "mimetype": "text/x-python",
106 | "name": "python",
107 | "nbconvert_exporter": "python",
108 | "pygments_lexer": "ipython3",
109 | "version": "3.5.5"
110 | }
111 | },
112 | "nbformat": 4,
113 | "nbformat_minor": 1
114 | }
115 |
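
Both loops above leave a placeholder comment for per-frame processing ("frame에 대해 object detection하고 box를 그려주기"). A minimal sketch of what drawing a detection box and label on a frame could look like with OpenCV drawing primitives; the hard-coded box coordinates and the helper name `draw_box` are stand-ins for whatever a real detector would return.

```python
import cv2

def draw_box(frame, box=(100, 100, 300, 300), label="object"):
    """Draw a rectangle and a text label on a BGR frame (modifies it in place)."""
    x1, y1, x2, y2 = box
    cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.putText(frame, label, (x1, y1 - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame

# Inside the capture loop, between cap.read() and cv2.imshow():
#     frame = draw_box(frame, box_from_detector, predicted_label)
```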
--------------------------------------------------------------------------------
/[KDT] 인공지능 12주차 실습/PySpark_타이베이시_주택가격_예측.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 4,
3 | "nbformat_minor": 0,
4 | "metadata": {
5 | "colab": {
6 | "name": "PySpark 타이베이시 주택가격 예측",
7 | "provenance": []
8 | },
9 | "kernelspec": {
10 | "name": "python3",
11 | "display_name": "Python 3"
12 | }
13 | },
14 | "cells": [
15 | {
16 | "cell_type": "markdown",
17 | "metadata": {
18 | "id": "xAqhTDfuWrcM"
19 | },
20 | "source": [
21 | "PySpark을 로컬머신에 설치하고 노트북을 사용하기 보다는 머신러닝 관련 다양한 라이브러리가 이미 설치되었고 좋은 하드웨어를 제공해주는 Google Colab을 통해 실습을 진행한다."
22 | ]
23 | },
24 | {
25 | "cell_type": "markdown",
26 | "metadata": {
27 | "id": "XIA23YgbXKJd"
28 | },
29 | "source": [
30 | "이를 위해 pyspark과 Py4J 패키지를 설치한다. Py4J 패키지는 파이썬 프로그램이 자바가상머신상의 오브젝트들을 접근할 수 있게 해준다. Local Standalone Spark을 사용한다."
31 | ]
32 | },
33 | {
34 | "cell_type": "code",
35 | "metadata": {
36 | "id": "NbT0rpGfVdiq"
37 | },
38 | "source": [
39 | "!pip install pyspark==3.0.1 py4j==0.10.9 "
40 | ],
41 | "execution_count": null,
42 | "outputs": []
43 | },
44 | {
45 | "cell_type": "code",
46 | "metadata": {
47 | "id": "3vm6tgcPXdnR"
48 | },
49 | "source": [
50 | "from pyspark.sql import SparkSession\n",
51 | "\n",
52 | "spark = SparkSession \\\n",
53 | " .builder \\\n",
54 | " .appName(\"Taipei Housing Price Prediction\") \\\n",
55 | " .getOrCreate()"
56 | ],
57 | "execution_count": null,
58 | "outputs": []
59 | },
60 | {
61 | "cell_type": "markdown",
62 | "metadata": {
63 | "id": "pyl0gES0KhkF"
64 | },
65 | "source": [
66 | "# 타이베이 주택 가격 예측 모델 만들기\n",
67 | "\n",
68 | "\n"
69 | ]
70 | },
71 | {
72 | "cell_type": "markdown",
73 | "metadata": {
74 | "id": "BSkgIWglmw3-"
75 | },
76 | "source": [
77 | "데이터셋 설명\n",
78 | "\n",
79 | "이번 문제는 대만 타이베이 시의 신단 지역에서 수집된 주택 거래 관련 정보를 바탕으로 주택 가격(정확히는 주택의 평당 가격)을 예측하는 Regression 모델을 만들어보는 것이다. 총 6개의 피쳐와 주택의 평당 가격에 해당하는 레이블 정보가 훈련 데이터로 제공된다. 레이블의 경우에는 주택의 최종 가격이 아니라 평당 가격이란 점을 다시 한번 강조한다.\n",
80 | "\n",
81 | "각 컬럼에 대한 설명은 아래와 같으며 모든 필드는 X4를 제외하고는 실수 타입이다.\n",
82 | "\n",
83 | "* X1: 주택 거래 날짜를 실수로 제공한다. 소수점 부분은 달을 나타낸다. 예를 들어 2013.250이라면 2013년 3월임을 나타낸다 (0.250 = 3/12)\n",
84 | "* X2: 주택 나이 (년수)\n",
85 | "* X3: 가장 가까운 지하철역까지의 거리 (미터)\n",
86 | "* X4: 주택 근방 걸어갈 수 있는 거리내 편의점 수\n",
87 | "* X5: 주택 위치의 위도 (latitude)\n",
88 | "* X6: 주택 위치의 경도 (longitude)\n",
89 | "* Y: 주택 평당 가격\n",
90 | "\n"
91 | ]
92 | },
93 | {
94 | "cell_type": "code",
95 | "metadata": {
96 | "id": "LSs_1PYaYWxI"
97 | },
98 | "source": [
99 | "spark"
100 | ],
101 | "execution_count": null,
102 | "outputs": []
103 | },
104 | {
105 | "cell_type": "code",
106 | "metadata": {
107 | "id": "vE8iL4vy6705"
108 | },
109 | "source": [
110 | "!wget https://grepp-reco-test.s3.ap-northeast-2.amazonaws.com/Taipei_sindan_housing.csv"
111 | ],
112 | "execution_count": null,
113 | "outputs": []
114 | },
115 | {
116 | "cell_type": "code",
117 | "metadata": {
118 | "id": "GfSzc03fOC6e"
119 | },
120 | "source": [
121 | "!ls -tl"
122 | ],
123 | "execution_count": null,
124 | "outputs": []
125 | },
126 | {
127 | "cell_type": "code",
128 | "metadata": {
129 | "id": "-mH3roiKIOix"
130 | },
131 | "source": [
132 | "data = spark.read.csv('./Taipei_sindan_housing.csv', header=True, inferSchema=True)"
133 | ],
134 | "execution_count": null,
135 | "outputs": []
136 | },
137 | {
138 | "cell_type": "code",
139 | "metadata": {
140 | "id": "SCty0kw6ITVi"
141 | },
142 | "source": [
143 | "data.printSchema()"
144 | ],
145 | "execution_count": null,
146 | "outputs": []
147 | },
148 | {
149 | "cell_type": "code",
150 | "metadata": {
151 | "id": "nrlYYyVfIV1e"
152 | },
153 | "source": [
154 | "data.show()"
155 | ],
156 | "execution_count": null,
157 | "outputs": []
158 | }
159 | ]
160 | }
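
The section shown here stops after loading and inspecting the data. Below is a sketch of how the regression model itself might be assembled with Spark MLlib, using the `data` DataFrame and the X1–X6 / Y columns described above; the 80/20 split and the plain `LinearRegression` estimator are illustrative choices.

```python
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression
from pyspark.ml.evaluation import RegressionEvaluator

# Collect the six feature columns into a single vector column.
assembler = VectorAssembler(inputCols=["X1", "X2", "X3", "X4", "X5", "X6"],
                            outputCol="features")
assembled = assembler.transform(data)

train, test = assembled.randomSplit([0.8, 0.2], seed=42)

lr = LinearRegression(featuresCol="features", labelCol="Y")
lr_model = lr.fit(train)

predictions = lr_model.transform(test)
evaluator = RegressionEvaluator(labelCol="Y", predictionCol="prediction", metricName="rmse")
print("RMSE:", evaluator.evaluate(predictions))
```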
--------------------------------------------------------------------------------
/[KDT] 인공지능 12주차 실습/text classification (CNN with pre-training).ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import numpy as np\n",
10 | "\n",
11 | "import tensorflow as tf\n",
12 | "\n",
13 | "!pip install -q tensorflow-hub\n",
14 | "!pip install -q tensorflow-datasets\n",
15 | "import tensorflow_hub as hub\n",
16 | "import tensorflow_datasets as tfds\n",
17 | "\n",
18 | "print(\"Version: \", tf.__version__)\n",
19 | "print(\"Eager mode: \", tf.executing_eagerly())\n",
20 | "print(\"Hub version: \", hub.__version__)\n",
21 | "print(\"GPU is\", \"available\" if tf.config.experimental.list_physical_devices(\"GPU\") else \"NOT AVAILABLE\")"
22 | ]
23 | },
24 | {
25 | "cell_type": "code",
26 | "execution_count": null,
27 | "metadata": {},
28 | "outputs": [],
29 | "source": [
30 | "# Split the training set into 60% and 40%, so we'll end up with 15,000 examples\n",
31 | "# for training, 10,000 examples for validation and 25,000 examples for testing.\n",
32 | "train_data, validation_data, test_data = tfds.load(\n",
33 | " name=\"imdb_reviews\", \n",
34 | " split=('train[:60%]', 'train[60%:]', 'test'),\n",
35 | " as_supervised=True)"
36 | ]
37 | },
38 | {
39 | "cell_type": "code",
40 | "execution_count": null,
41 | "metadata": {},
42 | "outputs": [],
43 | "source": [
44 | "#embedding = \"https://tfhub.dev/google/nnlm-en-dim50/2\"\n",
45 | "embedding = hub.load(\"https://tfhub.dev/google/nnlm-en-dim128-with-normalization/2\")\n",
46 | "hub_layer = hub.KerasLayer(embedding, input_shape=[],\n",
47 | " dtype=tf.string, trainable=False)"
48 | ]
49 | },
50 | {
51 | "cell_type": "code",
52 | "execution_count": null,
53 | "metadata": {
54 | "scrolled": true
55 | },
56 | "outputs": [],
57 | "source": [
58 | "input_length = 1000\n",
59 | "embed_size = 128\n",
60 | "\n",
61 | "class Encode_Layer(tf.keras.layers.Layer):\n",
62 | " def __init__(self, dtype=tf.string, **kwargs):\n",
63 | " super().__init__(dtype=dtype, **kwargs)\n",
64 | " self.embedding = hub.load(\"https://tfhub.dev/google/nnlm-en-dim128-with-normalization/2\")\n",
65 | " self.hub_layer = hub.KerasLayer(self.embedding, input_shape=[], dtype=tf.string, trainable=False)\n",
66 | " def call(self, inputs):\n",
67 | " words = tf.strings.split(inputs)\n",
68 | " A = tf.ragged.map_flat_values(self.hub_layer, words)\n",
69 | " A = A * 1\n",
70 | " B = A.to_tensor(shape=[None, input_length, embed_size], default_value=0)\n",
71 | " return B\n",
72 | " \n",
73 | "encode_layer = Encode_Layer()\n",
74 | "\n",
75 | "filter_sizes = '1,2,3'\n",
76 | "num_filters = 1000\n",
77 | "\n",
78 | "input = tf.keras.layers.Input(shape=(), dtype=tf.string)\n",
79 | "embed = encode_layer(input)\n",
80 | "embed = tf.expand_dims(embed, -1)\n",
81 | "pool_outputs = []\n",
82 | "for filter_size in list(map(int, filter_sizes.split(','))):\n",
83 | " filter_shape = (filter_size, embed_size)\n",
84 | " conv = tf.keras.layers.Conv2D(num_filters, filter_shape, strides=(1, 1), padding='valid',\n",
85 | " data_format='channels_last', activation='relu',\n",
86 | " kernel_initializer='glorot_normal',\n",
87 | " bias_initializer=tf.keras.initializers.constant(0.1),\n",
88 | " name='convolution_{:d}'.format(filter_size))(embed)\n",
89 | " max_pool_shape = (input_length - filter_size + 1, 1)\n",
90 | " pool = tf.keras.layers.MaxPool2D(pool_size=max_pool_shape,\n",
91 | " strides=(1, 1), padding='valid',\n",
92 | " data_format='channels_last',\n",
93 | " name='max_pooling_{:d}'.format(filter_size))(conv)\n",
94 | " pool_outputs.append(pool)\n",
95 | "pool_outputs = tf.keras.layers.concatenate(pool_outputs, axis=-1, name='concatenate')\n",
96 | "pool_outputs = tf.keras.layers.Flatten(data_format='channels_last', name='flatten')(pool_outputs)\n",
97 | "pool_outputs = tf.keras.layers.Dropout(0.4, name='dropout1')(pool_outputs)\n",
98 | "dense = tf.keras.layers.Dense(256)(pool_outputs)\n",
99 | "dense = tf.keras.layers.Dropout(0.4, name='dropout2')(dense)\n",
100 | "outputs = tf.keras.layers.Dense(1)(dense)\n",
101 | "model = tf.keras.models.Model(inputs=[input],outputs=[outputs])\n",
102 | "model.summary()"
103 | ]
104 | },
105 | {
106 | "cell_type": "code",
107 | "execution_count": null,
108 | "metadata": {},
109 | "outputs": [],
110 | "source": [
111 | "model.compile(optimizer='adam',\n",
112 | " loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),\n",
113 | " metrics=['accuracy'])\n",
114 | "checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(\"text_cnn.h5\", save_best_only=True)"
115 | ]
116 | },
117 | {
118 | "cell_type": "code",
119 | "execution_count": null,
120 | "metadata": {
121 | "scrolled": true
122 | },
123 | "outputs": [],
124 | "source": [
125 | "history = model.fit(train_data.shuffle(10000).batch(64),\n",
126 | " epochs=10,\n",
127 | " validation_data=validation_data.batch(64),\n",
128 | " callbacks=[checkpoint_cb],\n",
129 | " verbose=1)"
130 | ]
131 | },
132 | {
133 | "cell_type": "code",
134 | "execution_count": null,
135 | "metadata": {},
136 | "outputs": [],
137 | "source": [
138 | "model = tf.keras.models.load_model(\"text_cnn.h5\", custom_objects={\"Encode_Layer\": Encode_Layer})"
139 | ]
140 | },
141 | {
142 | "cell_type": "code",
143 | "execution_count": null,
144 | "metadata": {},
145 | "outputs": [],
146 | "source": [
147 | "results = model.evaluate(test_data.batch(32), verbose=2)\n",
148 | "\n",
149 | "for name, value in zip(model.metrics_names, results):\n",
150 | " print(\"%s: %.3f\" % (name, value))"
151 | ]
152 | },
153 | {
154 | "cell_type": "code",
155 | "execution_count": null,
156 | "metadata": {},
157 | "outputs": [],
158 | "source": [
159 | "tf.saved_model.save(model, \"text_cnn\")"
160 | ]
161 | },
162 | {
163 | "cell_type": "code",
164 | "execution_count": null,
165 | "metadata": {},
166 | "outputs": [],
167 | "source": [
168 | "saved_model = tf.saved_model.load(\"text_cnn\")\n",
169 | "y_pred = saved_model(tf.constant([\"this is a terrible movie.\",\"this is a good movie.\",\"very interesting movie\",\"i wouldn't watch this movie.\",\"i recommend this movie.\"]))\n",
170 | "y_pred"
171 | ]
172 | }
173 | ],
174 | "metadata": {
175 | "kernelspec": {
176 | "display_name": "Python 3",
177 | "language": "python",
178 | "name": "python3"
179 | },
180 | "language_info": {
181 | "codemirror_mode": {
182 | "name": "ipython",
183 | "version": 3
184 | },
185 | "file_extension": ".py",
186 | "mimetype": "text/x-python",
187 | "name": "python",
188 | "nbconvert_exporter": "python",
189 | "pygments_lexer": "ipython3",
190 | "version": "3.7.8"
191 | }
192 | },
193 | "nbformat": 4,
194 | "nbformat_minor": 4
195 | }
196 |
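
Because the final `Dense(1)` layer has no activation and the loss is built with `from_logits=True`, the `y_pred` values returned at the end are raw logits. A small sketch of turning them into probabilities and class labels; the 0.5 threshold is the usual default rather than anything fixed by the notebook.

```python
import tensorflow as tf

# y_pred holds raw logits from the saved model, shape (n_examples, 1).
probs = tf.sigmoid(y_pred)               # probability of the "positive review" class
labels = tf.cast(probs > 0.5, tf.int32)  # 1 = positive, 0 = negative

for p, l in zip(probs.numpy().ravel(), labels.numpy().ravel()):
    print(f"positive probability = {p:.3f} -> predicted label = {l}")
```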
--------------------------------------------------------------------------------
/[KDT] 인공지능 12주차 실습/text classification (CNN without pre-training).ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import numpy as np\n",
10 | "\n",
11 | "import tensorflow as tf\n",
12 | "\n",
13 | "!pip install -q tensorflow-hub\n",
14 | "!pip install -q tensorflow-datasets\n",
15 | "import tensorflow_hub as hub\n",
16 | "import tensorflow_datasets as tfds\n",
17 | "\n",
18 | "print(\"Version: \", tf.__version__)\n",
19 | "print(\"Eager mode: \", tf.executing_eagerly())\n",
20 | "print(\"Hub version: \", hub.__version__)\n",
21 | "print(\"GPU is\", \"available\" if tf.config.experimental.list_physical_devices(\"GPU\") else \"NOT AVAILABLE\")"
22 | ]
23 | },
24 | {
25 | "cell_type": "code",
26 | "execution_count": null,
27 | "metadata": {},
28 | "outputs": [],
29 | "source": [
30 | "# Split the training set into 60% and 40%, so we'll end up with 15,000 examples\n",
31 | "# for training, 10,000 examples for validation and 25,000 examples for testing.\n",
32 | "train_data, validation_data, test_data = tfds.load(\n",
33 | " name=\"imdb_reviews\", \n",
34 | " split=('train[:60%]', 'train[60%:]', 'test'),\n",
35 | " as_supervised=True)"
36 | ]
37 | },
38 | {
39 | "cell_type": "code",
40 | "execution_count": null,
41 | "metadata": {},
42 | "outputs": [],
43 | "source": [
44 | "def preprocess(X_batch, n_words=500):\n",
45 | " shape = tf.shape(X_batch) * tf.constant([1, 0]) + tf.constant([0, n_words])\n",
46 | " Z = tf.strings.substr(X_batch, 0, n_words)\n",
47 | " Z = tf.strings.lower(Z)\n",
48 | " Z = tf.strings.regex_replace(Z, b\"
\", b\" \")\n",
49 | " Z = tf.strings.regex_replace(Z, b\"[^a-z]\", b\" \")\n",
50 | " Z = tf.strings.split(Z)\n",
51 | " return Z.to_tensor(shape=shape, default_value=b\"\")\n",
52 | "\n",
53 | "X_example = tf.constant([\"It's a great, great movie! I loved it.\", \"It was terrible, run away!!!\"])"
54 | ]
55 | },
56 | {
57 | "cell_type": "code",
58 | "execution_count": null,
59 | "metadata": {},
60 | "outputs": [],
61 | "source": [
62 | "from collections import Counter\n",
63 | "\n",
64 | "def get_vocabulary(data_sample, max_size=1000):\n",
65 | " preprocessed_reviews = preprocess(data_sample).numpy()\n",
66 | " counter = Counter()\n",
67 | " for words in preprocessed_reviews:\n",
68 | " for word in words:\n",
69 | " if word != b\"\":\n",
70 | " counter[word] += 1\n",
71 | " return [b\"\"] + [word for word, count in counter.most_common(max_size)]\n",
72 | "\n",
73 | "get_vocabulary(X_example)"
74 | ]
75 | },
76 | {
77 | "cell_type": "code",
78 | "execution_count": null,
79 | "metadata": {},
80 | "outputs": [],
81 | "source": [
82 | "class TextVectorization(tf.keras.layers.Layer):\n",
83 | " def __init__(self, max_vocabulary_size=1000, n_oov_buckets=100, dtype=tf.string, **kwargs):\n",
84 | " super().__init__(dtype=dtype, **kwargs)\n",
85 | " self.max_vocabulary_size = max_vocabulary_size\n",
86 | " self.n_oov_buckets = n_oov_buckets\n",
87 | "\n",
88 | " def adapt(self, data_sample):\n",
89 | " self.vocab = get_vocabulary(data_sample, self.max_vocabulary_size)\n",
90 | " words = tf.constant(self.vocab)\n",
91 | " word_ids = tf.range(len(self.vocab), dtype=tf.int64)\n",
92 | " vocab_init = tf.lookup.KeyValueTensorInitializer(words, word_ids)\n",
93 | " self.table = tf.lookup.StaticVocabularyTable(vocab_init, self.n_oov_buckets)\n",
94 | " \n",
95 | " def call(self, inputs, input_length):\n",
96 | " preprocessed_inputs = preprocess(inputs, n_words=input_length)\n",
97 | " return self.table.lookup(preprocessed_inputs)\n",
98 | " \n",
99 | " def get_config(self):\n",
100 | " config = super(TextVectorization, self).get_config()\n",
101 | " config.update({\n",
102 | " 'max_vocabulary_size': self.max_vocabulary_size,\n",
103 | " 'n_oov_buckets': self.n_oov_buckets\n",
104 | " })\n",
105 | " return config"
106 | ]
107 | },
108 | {
109 | "cell_type": "code",
110 | "execution_count": null,
111 | "metadata": {},
112 | "outputs": [],
113 | "source": [
114 | "max_vocabulary_size = 100000\n",
115 | "n_oov_buckets = 10\n",
116 | "text_vectorization = TextVectorization(max_vocabulary_size, n_oov_buckets,\n",
117 | " input_shape=[])\n",
118 | "\n",
119 | "train_examples_batch, train_labels_batch = next(iter(train_data.batch(15000)))\n",
120 | "text_vectorization.adapt(train_examples_batch)"
121 | ]
122 | },
123 | {
124 | "cell_type": "code",
125 | "execution_count": null,
126 | "metadata": {},
127 | "outputs": [],
128 | "source": [
129 | "text_vectorization.vocab"
130 | ]
131 | },
132 | {
133 | "cell_type": "code",
134 | "execution_count": null,
135 | "metadata": {},
136 | "outputs": [],
137 | "source": [
138 | "input_length = 600\n",
139 | "embed_size = 128\n",
140 | "filter_sizes = '1,2,3'\n",
141 | "num_filters = 1500\n",
142 | "vocab_size = len(text_vectorization.vocab) + n_oov_buckets\n",
143 | "\n",
144 | "input = tf.keras.layers.Input(shape=(), dtype=tf.string)\n",
145 | "vectorized = text_vectorization(input, input_length)\n",
146 | "\n",
147 | "embed_initer = tf.keras.initializers.RandomUniform(minval=-1, maxval=1)\n",
148 | "embed = tf.keras.layers.Embedding(vocab_size, embed_size,\n",
149 | " embeddings_initializer=embed_initer,\n",
150 | " input_length=input_length,\n",
151 | " name='embedding')(vectorized)\n",
152 | "# single channel. If using real embedding, you can set one static\n",
153 | "embed = tf.keras.layers.Reshape((input_length, embed_size, 1), name='add_channel')(embed)\n",
154 | "#embed = tf.expand_dims(embed, -1)\n",
155 | "pool_outputs = []\n",
156 | "for filter_size in list(map(int, filter_sizes.split(','))):\n",
157 | " filter_shape = (filter_size, embed_size)\n",
158 | " conv = tf.keras.layers.Conv2D(num_filters, filter_shape, strides=(1, 1), padding='valid',\n",
159 | " data_format='channels_last', activation='relu',\n",
160 | " kernel_initializer='glorot_normal',\n",
161 | " bias_initializer=tf.keras.initializers.constant(0.1),\n",
162 | " name='convolution_{:d}'.format(filter_size))(embed)\n",
163 | " max_pool_shape = (input_length - filter_size + 1, 1)\n",
164 | " pool = tf.keras.layers.MaxPool2D(pool_size=max_pool_shape,\n",
165 | " strides=(1, 1), padding='valid',\n",
166 | " data_format='channels_last',\n",
167 | " name='max_pooling_{:d}'.format(filter_size))(conv)\n",
168 | " pool_outputs.append(pool)\n",
169 | "pool_outputs = tf.keras.layers.concatenate(pool_outputs, axis=-1, name='concatenate')\n",
170 | "pool_outputs = tf.keras.layers.Flatten(data_format='channels_last', name='flatten')(pool_outputs)\n",
171 | "pool_outputs = tf.keras.layers.Dropout(0.4, name='dropout1')(pool_outputs)\n",
172 | "dense = tf.keras.layers.Dense(256, name='dense1')(pool_outputs)\n",
173 | "dense = tf.keras.layers.Dropout(0.4, name='dropout2')(dense)\n",
174 | "outputs = tf.keras.layers.Dense(1, name='dense2')(dense)\n",
175 | "model = tf.keras.models.Model(inputs=[input],outputs=[outputs])\n",
176 | "model.summary()"
177 | ]
178 | },
179 | {
180 | "cell_type": "code",
181 | "execution_count": null,
182 | "metadata": {},
183 | "outputs": [],
184 | "source": [
185 | "opt = tf.keras.optimizers.Adam(learning_rate=0.001)\n",
186 | "model.compile(optimizer=opt,\n",
187 | " loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),\n",
188 | " metrics=['accuracy'])\n",
189 | "checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(\"text_cnn_no_pretraining\", save_weights_only=True, save_best_only=True)"
190 | ]
191 | },
192 | {
193 | "cell_type": "code",
194 | "execution_count": null,
195 | "metadata": {},
196 | "outputs": [],
197 | "source": [
198 | "history = model.fit(train_data.shuffle(10000).batch(128),\n",
199 | " epochs=30,\n",
200 | " validation_data=validation_data.batch(128),\n",
201 | " callbacks=[checkpoint_cb],\n",
202 | " verbose=1)"
203 | ]
204 | },
205 | {
206 | "cell_type": "code",
207 | "execution_count": null,
208 | "metadata": {},
209 | "outputs": [],
210 | "source": [
211 | "model.load_weights(\"text_cnn_no_pretraining\")"
212 | ]
213 | },
214 | {
215 | "cell_type": "code",
216 | "execution_count": null,
217 | "metadata": {},
218 | "outputs": [],
219 | "source": [
220 | "results = model.evaluate(test_data.batch(32), verbose=2)\n",
221 | "\n",
222 | "for name, value in zip(model.metrics_names, results):\n",
223 | " print(\"%s: %.3f\" % (name, value))"
224 | ]
225 | },
226 | {
227 | "cell_type": "code",
228 | "execution_count": null,
229 | "metadata": {},
230 | "outputs": [],
231 | "source": [
232 | "tf.saved_model.save(model, \"text_cnn_no_pretraining\")"
233 | ]
234 | },
235 | {
236 | "cell_type": "code",
237 | "execution_count": null,
238 | "metadata": {},
239 | "outputs": [],
240 | "source": [
241 | "saved_model = tf.saved_model.load(\"text_cnn_no_pretraining\")\n",
242 | "y_pred = saved_model(tf.constant([\"this is a terrible movie.\",\"this is a good movie.\",\"very interesting movie\",\"i wouldn't watch this movie.\",\"i recommend this movie.\"]))\n",
243 | "y_pred"
244 | ]
245 | }
246 | ],
247 | "metadata": {
248 | "kernelspec": {
249 | "display_name": "Python 3",
250 | "language": "python",
251 | "name": "python3"
252 | },
253 | "language_info": {
254 | "codemirror_mode": {
255 | "name": "ipython",
256 | "version": 3
257 | },
258 | "file_extension": ".py",
259 | "mimetype": "text/x-python",
260 | "name": "python",
261 | "nbconvert_exporter": "python",
262 | "pygments_lexer": "ipython3",
263 | "version": "3.7.8"
264 | }
265 | },
266 | "nbformat": 4,
267 | "nbformat_minor": 4
268 | }
269 |
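For reference, a quick arithmetic sketch of the tensor shapes flowing through the convolution/pooling branches above, using this notebook's hyperparameters (plain Python, not a rebuild of the model):

input_length, embed_size, num_filters = 600, 128, 1500
filter_sizes = [1, 2, 3]

for k in filter_sizes:
    conv_h = input_length - k + 1   # 'valid' Conv2D over a (k, embed_size) kernel
    print("filter %d: conv -> (batch, %d, 1, %d), max-pool -> (batch, 1, 1, %d)"
          % (k, conv_h, num_filters, num_filters))

total = len(filter_sizes) * num_filters
print("concatenate -> (batch, 1, 1, %d), flatten -> (batch, %d)" % (total, total))

With 'valid' padding each Conv2D shrinks the length axis to input_length - filter_size + 1, the max-pool collapses it to 1, and concatenation simply stacks the three 1500-filter branches.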
--------------------------------------------------------------------------------
/[KDT] 인공지능 12주차 실습/text classification (bag of words).ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import numpy as np\n",
10 | "\n",
11 | "import tensorflow as tf\n",
12 | "\n",
13 | "!pip install -q tensorflow-hub\n",
14 | "!pip install -q tensorflow-datasets\n",
15 | "import tensorflow_hub as hub\n",
16 | "import tensorflow_datasets as tfds\n",
17 | "\n",
18 | "print(\"Version: \", tf.__version__)\n",
19 | "print(\"Eager mode: \", tf.executing_eagerly())\n",
20 | "print(\"Hub version: \", hub.__version__)\n",
21 | "print(\"GPU is\", \"available\" if tf.config.experimental.list_physical_devices(\"GPU\") else \"NOT AVAILABLE\")"
22 | ]
23 | },
24 | {
25 | "cell_type": "code",
26 | "execution_count": null,
27 | "metadata": {},
28 | "outputs": [],
29 | "source": [
30 | "# Split the training set into 60% and 40%, so we'll end up with 15,000 examples\n",
31 | "# for training, 10,000 examples for validation and 25,000 examples for testing.\n",
32 | "train_data, validation_data, test_data = tfds.load(\n",
33 | " name=\"imdb_reviews\", \n",
34 | " split=('train[:60%]', 'train[60%:]', 'test'),\n",
35 | " as_supervised=True)"
36 | ]
37 | },
38 | {
39 | "cell_type": "code",
40 | "execution_count": null,
41 | "metadata": {},
42 | "outputs": [],
43 | "source": [
44 | "def preprocess(X_batch, n_words=300):\n",
45 | " shape = tf.shape(X_batch) * tf.constant([1, 0]) + tf.constant([0, n_words]) # [num docs, n_words]\n",
46 | " #Z = tf.strings.substr(X_batch, 0, 300)\n",
47 | " Z = tf.strings.lower(X_batch)\n",
48 | " Z = tf.strings.regex_replace(Z, b\"
\", b\" \")\n",
49 | " Z = tf.strings.regex_replace(Z, b\"[^a-z]\", b\" \")\n",
50 | " Z = tf.strings.split(Z)\n",
51 | " return Z.to_tensor(shape=shape, default_value=b\"\")\n",
52 | "\n",
53 | "X_example = tf.constant([\"It's a great, great movie! I loved it.\", \"It was terrible, run away!!!\"])\n",
54 | "preprocess(X_example)"
55 | ]
56 | },
57 | {
58 | "cell_type": "code",
59 | "execution_count": null,
60 | "metadata": {},
61 | "outputs": [],
62 | "source": [
63 | "from collections import Counter\n",
64 | "\n",
65 | "def get_vocabulary(data_sample, max_size=1000):\n",
66 | " preprocessed_reviews = preprocess(data_sample).numpy()\n",
67 | " counter = Counter()\n",
68 | " for words in preprocessed_reviews:\n",
69 | " for word in words:\n",
70 | " if word != b\"\":\n",
71 | " counter[word] += 1\n",
72 | " return [b\"\"] + [word for word, count in counter.most_common(max_size)]\n",
73 | "\n",
74 | "get_vocabulary(X_example)"
75 | ]
76 | },
77 | {
78 | "cell_type": "code",
79 | "execution_count": null,
80 | "metadata": {},
81 | "outputs": [],
82 | "source": [
83 | "class TextVectorization(tf.keras.layers.Layer):\n",
84 | " def __init__(self, max_vocabulary_size=1000, n_oov_buckets=100, dtype=tf.string, **kwargs):\n",
85 | " super().__init__(dtype=dtype, **kwargs)\n",
86 | " self.max_vocabulary_size = max_vocabulary_size\n",
87 | " self.n_oov_buckets = n_oov_buckets\n",
88 | "\n",
89 | " def adapt(self, data_sample):\n",
90 | " self.vocab = get_vocabulary(data_sample, self.max_vocabulary_size)\n",
91 | " words = tf.constant(self.vocab)\n",
92 | " word_ids = tf.range(len(self.vocab), dtype=tf.int64)\n",
93 | " vocab_init = tf.lookup.KeyValueTensorInitializer(words, word_ids)\n",
94 | " self.table = tf.lookup.StaticVocabularyTable(vocab_init, self.n_oov_buckets)\n",
95 | " \n",
96 | " def call(self, inputs):\n",
97 | " preprocessed_inputs = preprocess(inputs)\n",
98 | " return self.table.lookup(preprocessed_inputs)\n",
99 | " \n",
100 | " def get_config(self):\n",
101 | " config = super(TextVectorization, self).get_config()\n",
102 | " config.update({\n",
103 | " 'max_vocabulary_size': self.max_vocabulary_size,\n",
104 | " 'n_oov_buckets': self.n_oov_buckets\n",
105 | " })\n",
106 | " return config"
107 | ]
108 | },
109 | {
110 | "cell_type": "code",
111 | "execution_count": null,
112 | "metadata": {},
113 | "outputs": [],
114 | "source": [
115 | "max_vocabulary_size = 100000\n",
116 | "n_oov_buckets = 100\n",
117 | "\n",
118 | "text_vectorization = TextVectorization(max_vocabulary_size, n_oov_buckets,\n",
119 | " input_shape=[])\n",
120 | "\n",
121 | "train_examples_batch, train_labels_batch = next(iter(train_data.batch(15000)))\n",
122 | "text_vectorization.adapt(train_examples_batch)"
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "execution_count": null,
128 | "metadata": {},
129 | "outputs": [],
130 | "source": [
131 | "class BagOfWords(tf.keras.layers.Layer):\n",
132 | " def __init__(self, n_tokens, dtype=tf.int32, **kwargs):\n",
133 | " super().__init__(dtype=tf.int32, **kwargs)\n",
134 | " self.n_tokens = n_tokens\n",
135 | " \n",
136 | " def call(self, inputs):\n",
137 | " one_hot = tf.one_hot(inputs, self.n_tokens)\n",
138 | " return tf.reduce_sum(one_hot, axis=1)[:, 1:]\n",
139 | " \n",
140 | " def get_config(self):\n",
141 | " config = super(BagOfWords, self).get_config()\n",
142 | " config.update({\n",
143 | " 'n_tokens': self.n_tokens\n",
144 | " })\n",
145 | " return config"
146 | ]
147 | },
148 | {
149 | "cell_type": "code",
150 | "execution_count": null,
151 | "metadata": {},
152 | "outputs": [],
153 | "source": [
154 | "n_tokens = len(text_vectorization.vocab) + n_oov_buckets\n",
155 | "bag_of_words = BagOfWords(n_tokens)"
156 | ]
157 | },
158 | {
159 | "cell_type": "code",
160 | "execution_count": null,
161 | "metadata": {
162 | "scrolled": false
163 | },
164 | "outputs": [],
165 | "source": [
166 | "input = tf.keras.layers.Input(shape=(), dtype=tf.string)\n",
167 | "vectorized = text_vectorization(input)\n",
168 | "bow = bag_of_words(vectorized)\n",
169 | "outputs = tf.keras.layers.Dense(1)(bow)\n",
170 | "model = tf.keras.models.Model(inputs=[input],outputs=[outputs])\n",
171 | "model.summary()"
172 | ]
173 | },
174 | {
175 | "cell_type": "code",
176 | "execution_count": null,
177 | "metadata": {},
178 | "outputs": [],
179 | "source": [
180 | "opt = tf.keras.optimizers.Adam(learning_rate=0.001)\n",
181 | "model.compile(optimizer=opt,\n",
182 | " loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),\n",
183 | " metrics=['accuracy'])\n",
184 | "checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(\"text_bagofwords\", save_weights_only=True, save_best_only=True)"
185 | ]
186 | },
187 | {
188 | "cell_type": "code",
189 | "execution_count": null,
190 | "metadata": {},
191 | "outputs": [],
192 | "source": [
193 | "history = model.fit(train_data.shuffle(10000).batch(128),\n",
194 | " epochs=30,\n",
195 | " validation_data=validation_data.batch(128),\n",
196 | " callbacks=[checkpoint_cb],\n",
197 | " verbose=1)"
198 | ]
199 | },
200 | {
201 | "cell_type": "code",
202 | "execution_count": null,
203 | "metadata": {},
204 | "outputs": [],
205 | "source": [
206 | "model.load_weights(\"text_bagofwords\")"
207 | ]
208 | },
209 | {
210 | "cell_type": "code",
211 | "execution_count": null,
212 | "metadata": {
213 | "scrolled": true
214 | },
215 | "outputs": [],
216 | "source": [
217 | "results = model.evaluate(test_data.batch(32), verbose=2)\n",
218 | "\n",
219 | "for name, value in zip(model.metrics_names, results):\n",
220 | " print(\"%s: %.3f\" % (name, value))"
221 | ]
222 | }
223 | ],
224 | "metadata": {
225 | "kernelspec": {
226 | "display_name": "Python 3",
227 | "language": "python",
228 | "name": "python3"
229 | },
230 | "language_info": {
231 | "codemirror_mode": {
232 | "name": "ipython",
233 | "version": 3
234 | },
235 | "file_extension": ".py",
236 | "mimetype": "text/x-python",
237 | "name": "python",
238 | "nbconvert_exporter": "python",
239 | "pygments_lexer": "ipython3",
240 | "version": "3.7.8"
241 | }
242 | },
243 | "nbformat": 4,
244 | "nbformat_minor": 4
245 | }
246 |
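To make the BagOfWords layer concrete, here is a tiny self-contained illustration (toy token ids and depth=4, not the real vocabulary) of what its call method computes: one-hot encode the ids, sum over the word axis, and drop column 0, which only counts the padding token:

import tensorflow as tf

token_ids = tf.constant([[1, 2, 2, 0],
                         [3, 0, 0, 0]])                # two "documents", id 0 = padding
one_hot = tf.one_hot(token_ids, depth=4)               # shape (2, 4, 4)
counts = tf.reduce_sum(one_hot, axis=1)[:, 1:]         # shape (2, 3): counts of ids 1..3
print(counts.numpy())
# [[1. 2. 0.]
#  [0. 0. 1.]]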
--------------------------------------------------------------------------------
/[KDT] 인공지능 12주차 실습/text classification (embedding).ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import numpy as np\n",
10 | "\n",
11 | "import tensorflow as tf\n",
12 | "\n",
13 | "!pip install -q tensorflow-hub\n",
14 | "!pip install -q tensorflow-datasets\n",
15 | "import tensorflow_hub as hub\n",
16 | "import tensorflow_datasets as tfds\n",
17 | "\n",
18 | "print(\"Version: \", tf.__version__)\n",
19 | "print(\"Eager mode: \", tf.executing_eagerly())\n",
20 | "print(\"Hub version: \", hub.__version__)\n",
21 | "print(\"GPU is\", \"available\" if tf.config.experimental.list_physical_devices(\"GPU\") else \"NOT AVAILABLE\")"
22 | ]
23 | },
24 | {
25 | "cell_type": "code",
26 | "execution_count": null,
27 | "metadata": {},
28 | "outputs": [],
29 | "source": [
30 | "# Split the training set into 60% and 40%, so we'll end up with 15,000 examples\n",
31 | "# for training, 10,000 examples for validation and 25,000 examples for testing.\n",
32 | "train_data, validation_data, test_data = tfds.load(\n",
33 | " name=\"imdb_reviews\", \n",
34 | " split=('train[:60%]', 'train[60%:]', 'test'),\n",
35 | " as_supervised=True)"
36 | ]
37 | },
38 | {
39 | "cell_type": "code",
40 | "execution_count": null,
41 | "metadata": {},
42 | "outputs": [],
43 | "source": [
44 | "#embedding = \"https://tfhub.dev/google/nnlm-en-dim50/2\"\n",
45 | "embedding = hub.load(\"https://tfhub.dev/google/nnlm-en-dim128-with-normalization/2\")\n",
46 | "hub_layer = hub.KerasLayer(embedding, input_shape=[],\n",
47 | " dtype=tf.string, trainable=True)"
48 | ]
49 | },
50 | {
51 | "cell_type": "code",
52 | "execution_count": null,
53 | "metadata": {},
54 | "outputs": [],
55 | "source": [
56 | "train_examples_batch, train_labels_batch = next(iter(train_data.batch(10)))"
57 | ]
58 | },
59 | {
60 | "cell_type": "code",
61 | "execution_count": null,
62 | "metadata": {},
63 | "outputs": [],
64 | "source": [
65 | "hub_layer(train_examples_batch)[0]"
66 | ]
67 | },
68 | {
69 | "cell_type": "code",
70 | "execution_count": null,
71 | "metadata": {},
72 | "outputs": [],
73 | "source": [
74 | "model = tf.keras.Sequential()\n",
75 | "model.add(hub_layer)\n",
76 | "model.add(tf.keras.layers.Dense(32, activation='relu'))\n",
77 | "model.add(tf.keras.layers.Dropout(rate=0.2))\n",
78 | "model.add(tf.keras.layers.Dense(1))\n",
79 | "\n",
80 | "model.summary()"
81 | ]
82 | },
83 | {
84 | "cell_type": "code",
85 | "execution_count": null,
86 | "metadata": {},
87 | "outputs": [],
88 | "source": [
89 | "model.compile(optimizer='adam',\n",
90 | " loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),\n",
91 | " metrics=['accuracy'])"
92 | ]
93 | },
94 | {
95 | "cell_type": "code",
96 | "execution_count": null,
97 | "metadata": {},
98 | "outputs": [],
99 | "source": [
100 | "history = model.fit(train_data.shuffle(10000).batch(512),\n",
101 | " epochs=20,\n",
102 | " validation_data=validation_data.batch(512),\n",
103 | " verbose=1)"
104 | ]
105 | },
106 | {
107 | "cell_type": "code",
108 | "execution_count": null,
109 | "metadata": {},
110 | "outputs": [],
111 | "source": [
112 | "results = model.evaluate(test_data.batch(32), verbose=2)\n",
113 | "\n",
114 | "for name, value in zip(model.metrics_names, results):\n",
115 | " print(\"%s: %.3f\" % (name, value))"
116 | ]
117 | }
118 | ],
119 | "metadata": {
120 | "kernelspec": {
121 | "display_name": "Python 3",
122 | "language": "python",
123 | "name": "python3"
124 | },
125 | "language_info": {
126 | "codemirror_mode": {
127 | "name": "ipython",
128 | "version": 3
129 | },
130 | "file_extension": ".py",
131 | "mimetype": "text/x-python",
132 | "name": "python",
133 | "nbconvert_exporter": "python",
134 | "pygments_lexer": "ipython3",
135 | "version": "3.7.8"
136 | }
137 | },
138 | "nbformat": 4,
139 | "nbformat_minor": 4
140 | }
141 |
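Unlike the other notebooks in this folder, this one trains the (trainable) hub embedding for 20 epochs without keeping the best weights. One optional guard, not part of the original notebook, is an EarlyStopping callback that rolls back to the best epoch (reusing model, train_data and validation_data defined above):

import tensorflow as tf

# stop when validation loss stops improving and restore the best weights seen so far
early_stop_cb = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=3,
                                                 restore_best_weights=True)

history = model.fit(train_data.shuffle(10000).batch(512),
                    epochs=20,
                    validation_data=validation_data.batch(512),
                    callbacks=[early_stop_cb],
                    verbose=1)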
--------------------------------------------------------------------------------
/[KDT] 인공지능 13주차 실습/lecture_material_전체.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 13주차 실습/lecture_material_전체.pdf
--------------------------------------------------------------------------------
/[KDT] 인공지능 13주차 실습/실습6-2-OpenCV로 웹캠접근.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "## OpenCV로 웹캠 접근"
8 | ]
9 | },
10 | {
11 | "cell_type": "markdown",
12 | "metadata": {},
13 | "source": [
14 | "밑의 예제는 다음 사이트에서 가져왔습니다:\n",
15 | "https://hoony-gunputer.tistory.com/"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": 1,
21 | "metadata": {
22 | "collapsed": false
23 | },
24 | "outputs": [],
25 | "source": [
26 | "import cv2 #0이면 노트북 내장 웹캠 숫자를 올리면 추가된 웹캠을 이용할 수 있다. \n",
27 | "cap = cv2.VideoCapture(0) \n",
28 | "# 3은 가로 4는 세로 길이 \n",
29 | "cap.set(3, 720) \n",
30 | "cap.set(4, 1080) \n",
31 | "while True: \n",
32 | " ret, frame = cap.read()\n",
33 | " cv2.imshow('test', frame) \n",
34 | " \n",
35 | " # frame 에 대한 영상처리\n",
36 | " # frame에 대해 object detection하고\n",
37 | " # frame에다가 box를 그려주기 \n",
38 | " \n",
39 | " k = cv2.waitKey(1) \n",
40 | " if k == 27: \n",
41 | " break \n",
42 | "cap.release() \n",
43 | "cv2.destroyAllWindows()"
44 | ]
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "## OpenCV로 동영상 저장"
51 | ]
52 | },
53 | {
54 | "cell_type": "code",
55 | "execution_count": 1,
56 | "metadata": {
57 | "collapsed": true
58 | },
59 | "outputs": [],
60 | "source": [
61 | "import cv2 \n",
62 | "cap = cv2.VideoCapture(0) \n",
63 | "cap.set(3, 720) \n",
64 | "cap.set(4, 1080) \n",
65 | "fc = 20.0 \n",
66 | "codec = cv2.VideoWriter_fourcc('D', 'I', 'V', 'X') \n",
67 | "out = cv2.VideoWriter('mycam.avi', codec, fc, (int(cap.get(3)), int(cap.get(4)))) \n",
68 | "while True: \n",
69 | " ret, frame = cap.read()\n",
70 | " cv2.imshow('test', frame)\n",
71 | " # frame 에 대한 영상처리\n",
72 | " # frame에 대해 object detection하고\n",
73 | " # frame에다가 box를 그려주기 \n",
74 | " \n",
75 | " out.write(frame) \n",
76 | " k = cv2.waitKey(1) \n",
77 | " if k == 27: break \n",
78 | "cap.release() \n",
79 | "cv2.destroyAllWindows()"
80 | ]
81 | },
82 | {
83 | "cell_type": "code",
84 | "execution_count": null,
85 | "metadata": {
86 | "collapsed": true
87 | },
88 | "outputs": [],
89 | "source": []
90 | }
91 | ],
92 | "metadata": {
93 | "anaconda-cloud": {},
94 | "kernelspec": {
95 | "display_name": "Python [default]",
96 | "language": "python",
97 | "name": "python3"
98 | },
99 | "language_info": {
100 | "codemirror_mode": {
101 | "name": "ipython",
102 | "version": 3
103 | },
104 | "file_extension": ".py",
105 | "mimetype": "text/x-python",
106 | "name": "python",
107 | "nbconvert_exporter": "python",
108 | "pygments_lexer": "ipython3",
109 | "version": "3.5.5"
110 | }
111 | },
112 | "nbformat": 4,
113 | "nbformat_minor": 1
114 | }
115 |
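The loops above assume that the camera opens and that every read succeeds. A slightly more defensive sketch of the same capture loop (same structure, with the two checks added):

import cv2

cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("Could not open the webcam")

while True:
    ret, frame = cap.read()
    if not ret:                    # no frame returned (camera unplugged, etc.)
        break
    cv2.imshow("test", frame)
    if cv2.waitKey(1) == 27:       # Esc key
        break

cap.release()
cv2.destroyAllWindows()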
--------------------------------------------------------------------------------
/[KDT] 인공지능 2주차 실습/[1-1. 선형시스템]-2.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 4,
3 | "nbformat_minor": 0,
4 | "metadata": {
5 | "colab": {
6 | "name": "[1-1. 선형시스템]-2",
7 | "provenance": [],
8 | "collapsed_sections": []
9 | },
10 | "kernelspec": {
11 | "name": "python3",
12 | "display_name": "Python 3"
13 | }
14 | },
15 | "cells": [
16 | {
17 | "cell_type": "markdown",
18 | "metadata": {
19 | "id": "Fn6NoDwPmhNs"
20 | },
21 | "source": [
22 | "# 인공지능 수학 - 선형대수 (NumPy 실습)\n",
23 | "## 행렬과 벡터의 코딩 및 연산"
24 | ]
25 | },
26 | {
27 | "cell_type": "code",
28 | "metadata": {
29 | "id": "kvcJK1GfTqxA"
30 | },
31 | "source": [
32 | "import numpy as np"
33 | ],
34 | "execution_count": null,
35 | "outputs": []
36 | },
37 | {
38 | "cell_type": "code",
39 | "metadata": {
40 | "id": "SqWHFBWsT3K7",
41 | "colab": {
42 | "base_uri": "https://localhost:8080/"
43 | },
44 | "outputId": "14340910-36bb-49f8-aa9f-b21317db6156"
45 | },
46 | "source": [
47 | "# 행렬 코딩\n",
48 | "A = np.array([[3, 1, 1], [1, -2, -1], [1, 1, 1]])\n",
49 | "\n",
50 | "print(A)\n",
51 | "print(np.shape(A))"
52 | ],
53 | "execution_count": null,
54 | "outputs": [
55 | {
56 | "output_type": "stream",
57 | "text": [
58 | "[[ 3 1 1]\n",
59 | " [ 1 -2 -1]\n",
60 | " [ 1 1 1]]\n",
61 | "(3, 3)\n"
62 | ],
63 | "name": "stdout"
64 | }
65 | ]
66 | },
67 | {
68 | "cell_type": "code",
69 | "metadata": {
70 | "colab": {
71 | "base_uri": "https://localhost:8080/"
72 | },
73 | "id": "YFVDrsxxh6gc",
74 | "outputId": "d210a227-4c4b-4d34-97e9-012349046822"
75 | },
76 | "source": [
77 | "# 벡터 코딩\n",
78 | "b = np.array([4, 1, 2])\n",
79 | "\n",
80 | "print(b)\n",
81 | "print(np.shape(b))"
82 | ],
83 | "execution_count": null,
84 | "outputs": [
85 | {
86 | "output_type": "stream",
87 | "text": [
88 | "[4 1 2]\n",
89 | "(3,)\n"
90 | ],
91 | "name": "stdout"
92 | }
93 | ]
94 | },
95 | {
96 | "cell_type": "code",
97 | "metadata": {
98 | "colab": {
99 | "base_uri": "https://localhost:8080/"
100 | },
101 | "id": "vE5nVm9uiAC_",
102 | "outputId": "54996a00-3e20-416b-ec66-b09664b30cc0"
103 | },
104 | "source": [
105 | "# 역행렬 구하기\n",
106 | "A_inv = np.linalg.inv(A) \n",
107 | "\n",
108 | "print(A_inv)\n",
109 | "print(np.shape(A_inv))"
110 | ],
111 | "execution_count": null,
112 | "outputs": [
113 | {
114 | "output_type": "stream",
115 | "text": [
116 | "[[ 5.00000000e-01 -7.40148683e-17 -5.00000000e-01]\n",
117 | " [ 1.00000000e+00 -1.00000000e+00 -2.00000000e+00]\n",
118 | " [-1.50000000e+00 1.00000000e+00 3.50000000e+00]]\n",
119 | "(3, 3)\n"
120 | ],
121 | "name": "stdout"
122 | }
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "metadata": {
128 | "colab": {
129 | "base_uri": "https://localhost:8080/"
130 | },
131 | "id": "BcrQWuJ3iaFr",
132 | "outputId": "efc84757-893c-4988-e533-a7aed872de93"
133 | },
134 | "source": [
135 | "# 역행렬을 이용한 선형시스템 Ax=b의 해 구하기\n",
136 | "# x = np.matmul(A_inv, b)\n",
137 | "x = A_inv @ b\n",
138 | "\n",
139 | "print(x)\n",
140 | "print(np.shape(x))"
141 | ],
142 | "execution_count": null,
143 | "outputs": [
144 | {
145 | "output_type": "stream",
146 | "text": [
147 | "[ 1. -1. 2.]\n",
148 | "(3,)\n"
149 | ],
150 | "name": "stdout"
151 | }
152 | ]
153 | },
154 | {
155 | "cell_type": "code",
156 | "metadata": {
157 | "colab": {
158 | "base_uri": "https://localhost:8080/"
159 | },
160 | "id": "fbHOxX5Ei1Bz",
161 | "outputId": "c99bf2a4-f89a-4776-a234-b9b732a04440"
162 | },
163 | "source": [
164 | "## 결과 검증\n",
165 | "# bb = np.matmul(A, x)\n",
166 | "bb = A @ x\n",
167 | "\n",
168 | "print(np.shape(bb))\n",
169 | "print(bb)\n",
170 | "\n",
171 | "if np.linalg.norm(b - bb) < 1e-3:\n",
172 | " print(\"Ok\")\n",
173 | "else:\n",
174 | " print(\"something wrong\")"
175 | ],
176 | "execution_count": null,
177 | "outputs": [
178 | {
179 | "output_type": "stream",
180 | "text": [
181 | "(3,)\n",
182 | "[4. 1. 2.]\n",
183 | "Ok\n"
184 | ],
185 | "name": "stdout"
186 | }
187 | ]
188 | }
189 | ]
190 | }
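The same system can also be solved without forming the inverse explicitly; np.linalg.solve(A, b) is the usual, numerically safer route. A short sketch with the same A and b:

import numpy as np

A = np.array([[3, 1, 1], [1, -2, -1], [1, 1, 1]])
b = np.array([4, 1, 2])

x = np.linalg.solve(A, b)   # solves Ax = b without computing A_inv
print(x)                    # [ 1. -1.  2.]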
--------------------------------------------------------------------------------
/[KDT] 인공지능 2주차 실습/[1-3. LU 분해]-2.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 4,
3 | "nbformat_minor": 0,
4 | "metadata": {
5 | "colab": {
6 | "name": "[1-3. LU 분해]-2",
7 | "provenance": []
8 | },
9 | "kernelspec": {
10 | "name": "python3",
11 | "display_name": "Python 3"
12 | }
13 | },
14 | "cells": [
15 | {
16 | "cell_type": "markdown",
17 | "metadata": {
18 | "id": "Fn6NoDwPmhNs"
19 | },
20 | "source": [
21 | "# 인공지능 수학 - 선형대수 (NumPy 실습)\n",
22 | "## LU 분해 (가우스 소거법), 행렬의 rank"
23 | ]
24 | },
25 | {
26 | "cell_type": "code",
27 | "metadata": {
28 | "id": "kvcJK1GfTqxA"
29 | },
30 | "source": [
31 | "import numpy as np\n",
32 | "import scipy \n",
33 | "import scipy.linalg # LU 분해를 사용하기 위한 import "
34 | ],
35 | "execution_count": null,
36 | "outputs": []
37 | },
38 | {
39 | "cell_type": "code",
40 | "metadata": {
41 | "id": "SqWHFBWsT3K7",
42 | "colab": {
43 | "base_uri": "https://localhost:8080/"
44 | },
45 | "outputId": "33abcc6f-db20-4ee1-be79-10810f7f535d"
46 | },
47 | "source": [
48 | "# 행렬 A, 벡터 b 코딩\n",
49 | "A = np.array([[3, 1, 1], [1, -2, -1], [1, 1, 1]])\n",
50 | "b = np.array([4, 1, 2])\n",
51 | "\n",
52 | "print(\"A:\", A)\n",
53 | "print(np.shape(A))\n",
54 | "\n",
55 | "print(\"b:\", b)\n",
56 | "print(np.shape(b))"
57 | ],
58 | "execution_count": null,
59 | "outputs": [
60 | {
61 | "output_type": "stream",
62 | "text": [
63 | "A: [[ 3 1 1]\n",
64 | " [ 1 -2 -1]\n",
65 | " [ 1 1 1]]\n",
66 | "(3, 3)\n",
67 | "b: [4 1 2]\n",
68 | "(3,)\n"
69 | ],
70 | "name": "stdout"
71 | }
72 | ]
73 | },
74 | {
75 | "cell_type": "markdown",
76 | "metadata": {
77 | "id": "Qx_Um_GH55rF"
78 | },
79 | "source": [
80 | "LU 분해 결과 확인하기"
81 | ]
82 | },
83 | {
84 | "cell_type": "code",
85 | "metadata": {
86 | "colab": {
87 | "base_uri": "https://localhost:8080/"
88 | },
89 | "id": "YYNV8UN-puA2",
90 | "outputId": "67f4bc7c-523e-4a31-e129-72ba1be261da"
91 | },
92 | "source": [
93 | "# LU 분해의 결과를 각각 행렬로서 확인하기\n",
94 | "P, L, U = scipy.linalg.lu(A)\n",
95 | "\n",
96 | "print(\"P:\", P)\n",
97 | "print(\"L:\", L)\n",
98 | "print(\"U:\", U)\n",
99 | "\n",
100 | "AA = P @ L @ U\n",
101 | "print(\"AA:\", AA)"
102 | ],
103 | "execution_count": null,
104 | "outputs": [
105 | {
106 | "output_type": "stream",
107 | "text": [
108 | "P: [[1. 0. 0.]\n",
109 | " [0. 1. 0.]\n",
110 | " [0. 0. 1.]]\n",
111 | "L: [[ 1. 0. 0. ]\n",
112 | " [ 0.33333333 1. 0. ]\n",
113 | " [ 0.33333333 -0.28571429 1. ]]\n",
114 | "U: [[ 3. 1. 1. ]\n",
115 | " [ 0. -2.33333333 -1.33333333]\n",
116 | " [ 0. 0. 0.28571429]]\n",
117 | "AA: [[ 3. 1. 1.]\n",
118 | " [ 1. -2. -1.]\n",
119 | " [ 1. 1. 1.]]\n"
120 | ],
121 | "name": "stdout"
122 | }
123 | ]
124 | },
125 | {
126 | "cell_type": "markdown",
127 | "metadata": {
128 | "id": "7Qa-vPFD7QVw"
129 | },
130 | "source": [
131 | "LU 분해를 이용한 선형시스템 Ax = b 풀기\n"
132 | ]
133 | },
134 | {
135 | "cell_type": "code",
136 | "metadata": {
137 | "colab": {
138 | "base_uri": "https://localhost:8080/"
139 | },
140 | "id": "YFVDrsxxh6gc",
141 | "outputId": "7f0e5d9e-2e87-4a34-9b45-62ed407f99be"
142 | },
143 | "source": [
144 | "# LU 분해\n",
145 | "lu, piv = scipy.linalg.lu_factor(A)\n",
146 | "x = scipy.linalg.lu_solve((lu, piv), b)\n",
147 | "\n",
148 | "print(\"x:\", x)\n",
149 | "print(np.shape(x))\n",
150 | "\n",
151 | "bb = A@x\n",
152 | "print(\"bb:\", bb)\n"
153 | ],
154 | "execution_count": null,
155 | "outputs": [
156 | {
157 | "output_type": "stream",
158 | "text": [
159 | "x: [ 1. -1. 2.]\n",
160 | "(3,)\n",
161 | "bb: [4. 1. 2.]\n"
162 | ],
163 | "name": "stdout"
164 | }
165 | ]
166 | },
167 | {
168 | "cell_type": "markdown",
169 | "metadata": {
170 | "id": "3Bl3v__7_uqi"
171 | },
172 | "source": [
173 | "## 행렬의 rank 계산하기"
174 | ]
175 | },
176 | {
177 | "cell_type": "markdown",
178 | "metadata": {
179 | "id": "sbGGkQxzBI4T"
180 | },
181 | "source": [
182 | "'rank = 2'인 2x2 행렬 A\n"
183 | ]
184 | },
185 | {
186 | "cell_type": "code",
187 | "metadata": {
188 | "colab": {
189 | "base_uri": "https://localhost:8080/"
190 | },
191 | "id": "U2Eq5NVt_-wH",
192 | "outputId": "bf1093ab-eff6-4820-b1d9-e6d59ff2ddcd"
193 | },
194 | "source": [
195 | "A = np.array([[1, 3], [-2, 1]])\n",
196 | "print(\"A:\", A)"
197 | ],
198 | "execution_count": null,
199 | "outputs": [
200 | {
201 | "output_type": "stream",
202 | "text": [
203 | "A: [[ 1 3]\n",
204 | " [-2 1]]\n"
205 | ],
206 | "name": "stdout"
207 | }
208 | ]
209 | },
210 | {
211 | "cell_type": "code",
212 | "metadata": {
213 | "colab": {
214 | "base_uri": "https://localhost:8080/"
215 | },
216 | "id": "0pE3s59nBHO_",
217 | "outputId": "2a8e1478-87e9-4179-b8e1-bf6d1f1b0ea1"
218 | },
219 | "source": [
220 | "print(\"rank:\", np.linalg.matrix_rank(A))\n",
221 | "A_inv = np.linalg.inv(A) \n",
222 | "\n",
223 | "print(A_inv)"
224 | ],
225 | "execution_count": null,
226 | "outputs": [
227 | {
228 | "output_type": "stream",
229 | "text": [
230 | "rank: 2\n",
231 | "[[ 0.14285714 -0.42857143]\n",
232 | " [ 0.28571429 0.14285714]]\n"
233 | ],
234 | "name": "stdout"
235 | }
236 | ]
237 | },
238 | {
239 | "cell_type": "markdown",
240 | "metadata": {
241 | "id": "Ef5Y7F2sBPjr"
242 | },
243 | "source": [
244 | "'rank = 1'인 2x2 행렬 A"
245 | ]
246 | },
247 | {
248 | "cell_type": "code",
249 | "metadata": {
250 | "colab": {
251 | "base_uri": "https://localhost:8080/"
252 | },
253 | "id": "jJe0Im38ACVz",
254 | "outputId": "4036c661-f792-4753-c2ba-254bc72e967b"
255 | },
256 | "source": [
257 | "A = np.array([[1, 3], [2, 6]])\n",
258 | "\n",
259 | "print(\"A:\", A)"
260 | ],
261 | "execution_count": null,
262 | "outputs": [
263 | {
264 | "output_type": "stream",
265 | "text": [
266 | "A: [[1 3]\n",
267 | " [2 6]]\n"
268 | ],
269 | "name": "stdout"
270 | }
271 | ]
272 | },
273 | {
274 | "cell_type": "code",
275 | "metadata": {
276 | "colab": {
277 | "base_uri": "https://localhost:8080/",
278 | "height": 396
279 | },
280 | "id": "Fg-9YGgUAgLc",
281 | "outputId": "cb506086-b349-46f9-9d86-a150094ef523"
282 | },
283 | "source": [
284 | "print(\"rank:\", np.linalg.matrix_rank(A))\n",
285 | "A_inv = np.linalg.inv(A) "
286 | ],
287 | "execution_count": null,
288 | "outputs": [
289 | {
290 | "output_type": "stream",
291 | "text": [
292 | "rank: 1\n"
293 | ],
294 | "name": "stdout"
295 | },
296 | {
297 | "output_type": "error",
298 | "ename": "LinAlgError",
299 | "evalue": "ignored",
300 | "traceback": [
301 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
302 | "\u001b[0;31mLinAlgError\u001b[0m Traceback (most recent call last)",
303 | "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0mprint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"rank:\"\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlinalg\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmatrix_rank\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mA\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mA_inv\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlinalg\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minv\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mA\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m",
304 | "\u001b[0;32m<__array_function__ internals>\u001b[0m in \u001b[0;36minv\u001b[0;34m(*args, **kwargs)\u001b[0m\n",
305 | "\u001b[0;32m/usr/local/lib/python3.6/dist-packages/numpy/linalg/linalg.py\u001b[0m in \u001b[0;36minv\u001b[0;34m(a)\u001b[0m\n\u001b[1;32m 545\u001b[0m \u001b[0msignature\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m'D->D'\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0misComplexType\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mt\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0;34m'd->d'\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 546\u001b[0m \u001b[0mextobj\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mget_linalg_error_extobj\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0m_raise_linalgerror_singular\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 547\u001b[0;31m \u001b[0mainv\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0m_umath_linalg\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minv\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0ma\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msignature\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0msignature\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mextobj\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mextobj\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 548\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mwrap\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mainv\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mastype\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mresult_t\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mcopy\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mFalse\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 549\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
306 | "\u001b[0;32m/usr/local/lib/python3.6/dist-packages/numpy/linalg/linalg.py\u001b[0m in \u001b[0;36m_raise_linalgerror_singular\u001b[0;34m(err, flag)\u001b[0m\n\u001b[1;32m 95\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 96\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_raise_linalgerror_singular\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0merr\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mflag\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 97\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mLinAlgError\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"Singular matrix\"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 98\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 99\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_raise_linalgerror_nonposdef\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0merr\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mflag\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
307 | "\u001b[0;31mLinAlgError\u001b[0m: Singular matrix"
308 | ]
309 | }
310 | ]
311 | },
312 | {
313 | "cell_type": "code",
314 | "metadata": {
315 | "colab": {
316 | "base_uri": "https://localhost:8080/"
317 | },
318 | "id": "Z4DEh2rMAy46",
319 | "outputId": "61ea62c6-b54a-41ef-effb-355dab917634"
320 | },
321 | "source": [
322 | "# LU 분해의 결과를 각각 행렬로서 확인하기\n",
323 | "P, L, U = scipy.linalg.lu(A)\n",
324 | "\n",
325 | "print(\"P:\", P)\n",
326 | "print(\"L:\", L)\n",
327 | "print(\"U:\", U)\n",
328 | "\n",
329 | "AA = P @ L @ U\n",
330 | "print(\"AA:\", AA)"
331 | ],
332 | "execution_count": null,
333 | "outputs": [
334 | {
335 | "output_type": "stream",
336 | "text": [
337 | "P: [[0. 1.]\n",
338 | " [1. 0.]]\n",
339 | "L: [[1. 0. ]\n",
340 | " [0.5 1. ]]\n",
341 | "U: [[2. 6.]\n",
342 | " [0. 0.]]\n",
343 | "AA: [[1. 3.]\n",
344 | " [2. 6.]]\n"
345 | ],
346 | "name": "stdout"
347 | }
348 | ]
349 | },
350 | {
351 | "cell_type": "code",
352 | "metadata": {
353 | "colab": {
354 | "base_uri": "https://localhost:8080/"
355 | },
356 | "id": "5pUnbBz4B5N0",
357 | "outputId": "e34dd9e3-10ab-47c8-b1b4-8458701ede18"
358 | },
359 | "source": [
360 | "b = np.array([2, 4])\n",
361 | "# LU 분해\n",
362 | "lu, piv = scipy.linalg.lu_factor(A)\n",
363 | "x = scipy.linalg.lu_solve((lu, piv), b)\n",
364 | "\n",
365 | "print(\"x:\", x)\n",
366 | "print(np.shape(x))"
367 | ],
368 | "execution_count": null,
369 | "outputs": [
370 | {
371 | "output_type": "stream",
372 | "text": [
373 | "x: [nan nan]\n",
374 | "(2,)\n"
375 | ],
376 | "name": "stdout"
377 | },
378 | {
379 | "output_type": "stream",
380 | "text": [
381 | "/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:3: LinAlgWarning: Diagonal number 2 is exactly zero. Singular matrix.\n",
382 | " This is separate from the ipykernel package so we can avoid doing imports until\n"
383 | ],
384 | "name": "stderr"
385 | }
386 | ]
387 | }
388 | ]
389 | }
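One practical benefit of the LU factorization is that A is factored once and the factors can be reused for many right-hand sides. A short sketch (the extra right-hand sides are illustrative):

import numpy as np
import scipy.linalg

A = np.array([[3, 1, 1], [1, -2, -1], [1, 1, 1]])
lu, piv = scipy.linalg.lu_factor(A)   # factor once

for b in (np.array([4, 1, 2]), np.array([1, 0, 0]), np.array([0, 1, 0])):
    x = scipy.linalg.lu_solve((lu, piv), b)   # reuse the factorization for each b
    print(b, "->", x)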
--------------------------------------------------------------------------------
/[KDT] 인공지능 2주차 실습/[1-8. SVD, PCA]-1.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 4,
3 | "nbformat_minor": 0,
4 | "metadata": {
5 | "colab": {
6 | "name": "[1-8. SVD, PCA]-1",
7 | "provenance": [],
8 | "collapsed_sections": []
9 | },
10 | "kernelspec": {
11 | "name": "python3",
12 | "display_name": "Python 3"
13 | }
14 | },
15 | "cells": [
16 | {
17 | "cell_type": "markdown",
18 | "metadata": {
19 | "id": "avi-EXcIa6gS"
20 | },
21 | "source": [
22 | "# SVD"
23 | ]
24 | },
25 | {
26 | "cell_type": "code",
27 | "metadata": {
28 | "id": "dQa0lGJnbH9u"
29 | },
30 | "source": [
31 | "import numpy as np"
32 | ],
33 | "execution_count": null,
34 | "outputs": []
35 | },
36 | {
37 | "cell_type": "code",
38 | "metadata": {
39 | "id": "2SOcUB5Qa6Xw"
40 | },
41 | "source": [
42 | "A = np.array([[1, 1], [-2, 2], [-1, -1]])\n",
43 | "\n",
44 | "print(\"A:\")\n",
45 | "print(A)"
46 | ],
47 | "execution_count": null,
48 | "outputs": []
49 | },
50 | {
51 | "cell_type": "code",
52 | "metadata": {
53 | "id": "XfghmJpIa6UJ"
54 | },
55 | "source": [
56 | "U = np.array([[1/1.414, 0, 1/1.414], [0, 1, 0], [-1/1.414, 0, 1/1.414]])\n",
57 | "\n",
58 | "UU = U[:,:1]\n",
59 | "\n",
60 | "print(\"U:\")\n",
61 | "print(U)\n",
62 | "print(\"UU:\")\n",
63 | "print(UU)"
64 | ],
65 | "execution_count": null,
66 | "outputs": []
67 | },
68 | {
69 | "cell_type": "code",
70 | "metadata": {
71 | "id": "KNN45vGga6Q4"
72 | },
73 | "source": [
74 | "D = np.array([[4, 0], [0, 1/1.414], [0, 0]])\n",
75 | "\n",
76 | "DD = D[:1,:1]\n",
77 | "\n",
78 | "print(\"D\")\n",
79 | "print(D)\n",
80 | "\n",
81 | "print(\"DD\")\n",
82 | "print(DD)"
83 | ],
84 | "execution_count": null,
85 | "outputs": []
86 | },
87 | {
88 | "cell_type": "code",
89 | "metadata": {
90 | "id": "yCnrzLTMa6N-"
91 | },
92 | "source": [
93 | "Vt = np.array([[1/1.414, 1/1.414], [-1/1.414, 1/1.414]])\n",
94 | "\n",
95 | "VVt = Vt[:1,:]\n",
96 | "\n",
97 | "print(\"Vt\")\n",
98 | "print(Vt)\n",
99 | "\n",
100 | "print(\"VVt\")\n",
101 | "print(VVt)"
102 | ],
103 | "execution_count": null,
104 | "outputs": []
105 | },
106 | {
107 | "cell_type": "code",
108 | "metadata": {
109 | "id": "j3arPWcNa6LG"
110 | },
111 | "source": [
112 | "AA = U @ D @ Vt\n",
113 | "\n",
114 | "print(\"AA\")\n",
115 | "print(AA)"
116 | ],
117 | "execution_count": null,
118 | "outputs": []
119 | },
120 | {
121 | "cell_type": "code",
122 | "metadata": {
123 | "id": "MO8h6wSya6IG"
124 | },
125 | "source": [
126 | "UU @ DD @ VVt"
127 | ],
128 | "execution_count": null,
129 | "outputs": []
130 | },
131 | {
132 | "cell_type": "markdown",
133 | "metadata": {
134 | "id": "YuJNNAftauIV"
135 | },
136 | "source": [
137 | "# PCA"
138 | ]
139 | },
140 | {
141 | "cell_type": "code",
142 | "metadata": {
143 | "colab": {
144 | "base_uri": "https://localhost:8080/"
145 | },
146 | "id": "dMkS6DLfbRik",
147 | "outputId": "10cb534d-08b9-4611-93ac-34af63207f34"
148 | },
149 | "source": [
150 | "import numpy as np\n",
151 | "from sklearn.decomposition import PCA\n",
152 | "\n",
153 | "X = np.array([[-1, -1], [-2, -1], [-3, -2], [1, 1], [2, 1], [3, 2]])\n",
154 | "pca = PCA(n_components=2)\n",
155 | "pca.fit(X)\n"
156 | ],
157 | "execution_count": null,
158 | "outputs": [
159 | {
160 | "output_type": "execute_result",
161 | "data": {
162 | "text/plain": [
163 | "PCA(copy=True, iterated_power='auto', n_components=2, random_state=None,\n",
164 | " svd_solver='auto', tol=0.0, whiten=False)"
165 | ]
166 | },
167 | "metadata": {
168 | "tags": []
169 | },
170 | "execution_count": 1
171 | }
172 | ]
173 | },
174 | {
175 | "cell_type": "code",
176 | "metadata": {
177 | "colab": {
178 | "base_uri": "https://localhost:8080/"
179 | },
180 | "id": "urQ_9bcZb0WO",
181 | "outputId": "50e52323-d40a-4836-a0b4-1ccb00d5d45d"
182 | },
183 | "source": [
184 | "print(pca.explained_variance_ratio_)\n"
185 | ],
186 | "execution_count": null,
187 | "outputs": [
188 | {
189 | "output_type": "stream",
190 | "text": [
191 | "[0.99244289 0.00755711]\n"
192 | ],
193 | "name": "stdout"
194 | }
195 | ]
196 | },
197 | {
198 | "cell_type": "code",
199 | "metadata": {
200 | "colab": {
201 | "base_uri": "https://localhost:8080/"
202 | },
203 | "id": "s2AbqvSKcBnl",
204 | "outputId": "72e59550-f93f-464f-c37b-0d2f598bc2f8"
205 | },
206 | "source": [
207 | "print(pca.singular_values_)"
208 | ],
209 | "execution_count": null,
210 | "outputs": [
211 | {
212 | "output_type": "stream",
213 | "text": [
214 | "[6.30061232 0.54980396]\n"
215 | ],
216 | "name": "stdout"
217 | }
218 | ]
219 | },
220 | {
221 | "cell_type": "code",
222 | "metadata": {
223 | "colab": {
224 | "base_uri": "https://localhost:8080/"
225 | },
226 | "id": "dSvdHJtvcEI1",
227 | "outputId": "2386806a-f1dc-41fa-d55c-67257175c4eb"
228 | },
229 | "source": [
230 | "print(pca.components_)\n"
231 | ],
232 | "execution_count": null,
233 | "outputs": [
234 | {
235 | "output_type": "stream",
236 | "text": [
237 | "[[-0.83849224 -0.54491354]\n",
238 | " [ 0.54491354 -0.83849224]]\n"
239 | ],
240 | "name": "stdout"
241 | }
242 | ]
243 | },
244 | {
245 | "cell_type": "code",
246 | "metadata": {
247 | "id": "2PAvKSbIdQE-"
248 | },
249 | "source": [
250 | ""
251 | ],
252 | "execution_count": null,
253 | "outputs": []
254 | }
255 | ]
256 | }
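The factors hand-coded above (with 1.414 standing in for sqrt(2)) can also be computed directly with np.linalg.svd; note that the computed U and Vt may differ from the hand-coded ones in sign and in the ordering of the singular values:

import numpy as np

A = np.array([[1, 1], [-2, 2], [-1, -1]])

U, s, Vt = np.linalg.svd(A, full_matrices=True)   # U: 3x3, s: (2,), Vt: 2x2
D = np.zeros(A.shape)
D[:len(s), :len(s)] = np.diag(s)                  # place singular values on the diagonal

print(s)                          # singular values, largest first
print(np.round(U @ D @ Vt, 6))    # reconstructs A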
--------------------------------------------------------------------------------
/[KDT] 인공지능 2주차 실습/[1-8. 벡터공간과 최소제곱법]-2.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 4,
3 | "nbformat_minor": 0,
4 | "metadata": {
5 | "colab": {
6 | "name": "[1-8. 벡터공간과 최소제곱법]-2",
7 | "provenance": [],
8 | "collapsed_sections": []
9 | },
10 | "kernelspec": {
11 | "name": "python3",
12 | "display_name": "Python 3"
13 | }
14 | },
15 | "cells": [
16 | {
17 | "cell_type": "code",
18 | "metadata": {
19 | "id": "7yDXK0vVggAM"
20 | },
21 | "source": [
22 | "import numpy as np"
23 | ],
24 | "execution_count": null,
25 | "outputs": []
26 | },
27 | {
28 | "cell_type": "code",
29 | "metadata": {
30 | "colab": {
31 | "base_uri": "https://localhost:8080/"
32 | },
33 | "id": "X4Ybua2-gk_U",
34 | "outputId": "8db76106-b702-4e37-90a4-00199219cba4"
35 | },
36 | "source": [
37 | "A = np.array([[-3, 1], [-1, 1], [1, 1], [3, 1]])\n",
38 | "print(A)"
39 | ],
40 | "execution_count": null,
41 | "outputs": [
42 | {
43 | "output_type": "stream",
44 | "text": [
45 | "[[-3 1]\n",
46 | " [-1 1]\n",
47 | " [ 1 1]\n",
48 | " [ 3 1]]\n"
49 | ],
50 | "name": "stdout"
51 | }
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "metadata": {
57 | "id": "EyvvgCFXg-kE"
58 | },
59 | "source": [
60 | "b = np.array([-1, -1, 3, 3])"
61 | ],
62 | "execution_count": null,
63 | "outputs": []
64 | },
65 | {
66 | "cell_type": "code",
67 | "metadata": {
68 | "colab": {
69 | "base_uri": "https://localhost:8080/"
70 | },
71 | "id": "nAztKs5ThIgs",
72 | "outputId": "1724d4c8-0dcb-46db-d878-0a769e8ee217"
73 | },
74 | "source": [
75 | "At = A.transpose()\n",
76 | "print(At)"
77 | ],
78 | "execution_count": null,
79 | "outputs": [
80 | {
81 | "output_type": "stream",
82 | "text": [
83 | "[[-3 -1 1 3]\n",
84 | " [ 1 1 1 1]]\n"
85 | ],
86 | "name": "stdout"
87 | }
88 | ]
89 | },
90 | {
91 | "cell_type": "code",
92 | "metadata": {
93 | "colab": {
94 | "base_uri": "https://localhost:8080/"
95 | },
96 | "id": "3pTQGHpphLnz",
97 | "outputId": "7c95cf82-6a8e-4eb4-af46-7bc4d626a3b3"
98 | },
99 | "source": [
100 | "AtA = At @ A\n",
101 | "print(AtA)\n",
102 | "\n",
103 | "Atb = At @ b\n",
104 | "print(Atb)\n",
105 | "\n",
106 | "AtA_inv = np.linalg.inv(AtA)\n",
107 | "print(AtA_inv)"
108 | ],
109 | "execution_count": null,
110 | "outputs": [
111 | {
112 | "output_type": "stream",
113 | "text": [
114 | "[[20 0]\n",
115 | " [ 0 4]]\n",
116 | "[16 4]\n",
117 | "[[0.05 0. ]\n",
118 | " [0. 0.25]]\n"
119 | ],
120 | "name": "stdout"
121 | }
122 | ]
123 | },
124 | {
125 | "cell_type": "code",
126 | "metadata": {
127 | "colab": {
128 | "base_uri": "https://localhost:8080/"
129 | },
130 | "id": "VTT7AAT0hPdk",
131 | "outputId": "922e3888-4895-4b24-d422-52d3a1c7286a"
132 | },
133 | "source": [
134 | "xx = AtA_inv @ Atb\n",
135 | "print(xx)"
136 | ],
137 | "execution_count": null,
138 | "outputs": [
139 | {
140 | "output_type": "stream",
141 | "text": [
142 | "[0.8 1. ]\n"
143 | ],
144 | "name": "stdout"
145 | }
146 | ]
147 | },
148 | {
149 | "cell_type": "code",
150 | "metadata": {
151 | "colab": {
152 | "base_uri": "https://localhost:8080/"
153 | },
154 | "id": "QpQ6DoZnhTt0",
155 | "outputId": "d51eb2ad-194f-4777-8dbf-3c0e5b5067a6"
156 | },
157 | "source": [
158 | "bb = A @ x\n",
159 | "print(bb)"
160 | ],
161 | "execution_count": null,
162 | "outputs": [
163 | {
164 | "output_type": "stream",
165 | "text": [
166 | "[-1.4 0.2 1.8 3.4]\n"
167 | ],
168 | "name": "stdout"
169 | }
170 | ]
171 | }
172 | ]
173 | }
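The normal-equations solution above can be cross-checked with NumPy's least-squares solver, which solves the same problem without forming A^T A explicitly. A short sketch with the same A and b:

import numpy as np

A = np.array([[-3, 1], [-1, 1], [1, 1], [3, 1]])
b = np.array([-1, -1, 3, 3])

x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x)   # [0.8 1. ], the same least-squares solution as (A^T A)^{-1} A^T b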
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/Large_Linear_Regression.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "%matplotlib inline\n",
10 | "import matplotlib.pyplot as plt\n",
11 | "import seaborn as sns; sns.set()\n",
12 | "import numpy as np\n",
13 | "\n",
14 | "from sklearn.linear_model import LinearRegression\n",
15 | "model = LinearRegression(fit_intercept=True)"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": null,
21 | "metadata": {},
22 | "outputs": [],
23 | "source": [
24 | "N = 1000\n",
25 | "M = 3\n",
26 | "rng = np.random.RandomState(1)\n",
27 | "rng.rand(N, M).shape\n",
28 | "X = 10 * rng.rand(N, M)\n",
29 | "np.dot(X, [1.5, -2., 1.]).shape"
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "execution_count": null,
35 | "metadata": {},
36 | "outputs": [],
37 | "source": [
38 | "rng = np.random.RandomState(1)\n",
39 | "X = 10 * rng.rand(N, M)\n",
40 | "y = 0.5 + np.dot(X, [1.5, -2., 1.])\n",
41 | "\n",
42 | "model.fit(X, y)\n",
43 | "print(model.intercept_)\n",
44 | "print(model.coef_)"
45 | ]
46 | },
47 | {
48 | "cell_type": "markdown",
49 | "metadata": {},
50 | "source": [
51 | "### Normal Equations"
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "execution_count": null,
57 | "metadata": {},
58 | "outputs": [],
59 | "source": [
60 | "import numpy.linalg as LA\n",
61 | "normal_equation_solution = LA.inv(X.T@X)@X.T@y"
62 | ]
63 | },
64 | {
65 | "cell_type": "code",
66 | "execution_count": null,
67 | "metadata": {
68 | "scrolled": true
69 | },
70 | "outputs": [],
71 | "source": [
72 | "normal_equation_solution"
73 | ]
74 | },
75 | {
76 | "cell_type": "markdown",
77 | "metadata": {},
78 | "source": [
79 | "### Small Memory Normal Equations (Online)"
80 | ]
81 | },
82 | {
83 | "cell_type": "code",
84 | "execution_count": null,
85 | "metadata": {},
86 | "outputs": [],
87 | "source": [
88 | "A = np.zeros([M, M])\n",
89 | "b = np.zeros([M, 1])"
90 | ]
91 | },
92 | {
93 | "cell_type": "code",
94 | "execution_count": null,
95 | "metadata": {},
96 | "outputs": [],
97 | "source": [
98 | "for i in range(N):\n",
99 | " A = A + X[i, np.newaxis].T@X[i, np.newaxis]\n",
100 | " b = b + X[i, np.newaxis].T*y[i]\n",
101 | "solution = LA.inv(A)@b"
102 | ]
103 | },
104 | {
105 | "cell_type": "code",
106 | "execution_count": null,
107 | "metadata": {},
108 | "outputs": [],
109 | "source": [
110 | "solution"
111 | ]
112 | },
113 | {
114 | "cell_type": "markdown",
115 | "metadata": {},
116 | "source": [
117 | "### SGD"
118 | ]
119 | },
120 | {
121 | "cell_type": "code",
122 | "execution_count": null,
123 | "metadata": {},
124 | "outputs": [],
125 | "source": [
126 | "w = np.zeros([M, 1])"
127 | ]
128 | },
129 | {
130 | "cell_type": "code",
131 | "execution_count": null,
132 | "metadata": {},
133 | "outputs": [],
134 | "source": [
135 | "eta = 0.001\n",
136 | "n_iter = 1000"
137 | ]
138 | },
139 | {
140 | "cell_type": "code",
141 | "execution_count": null,
142 | "metadata": {},
143 | "outputs": [],
144 | "source": [
145 | "for i in range(n_iter):\n",
146 | " i = i % N\n",
147 | " neg_gradient = (y[i] - w.T@X[i, np.newaxis].T) * X[i, np.newaxis].T\n",
148 | " w = w + eta * neg_gradient\n",
149 | "print(w)"
150 | ]
151 | },
152 | {
153 | "cell_type": "code",
154 | "execution_count": null,
155 | "metadata": {},
156 | "outputs": [],
157 | "source": []
158 | }
159 | ],
160 | "metadata": {
161 | "kernelspec": {
162 | "display_name": "Python 3",
163 | "language": "python",
164 | "name": "python3"
165 | },
166 | "language_info": {
167 | "codemirror_mode": {
168 | "name": "ipython",
169 | "version": 3
170 | },
171 | "file_extension": ".py",
172 | "mimetype": "text/x-python",
173 | "name": "python",
174 | "nbconvert_exporter": "python",
175 | "pygments_lexer": "ipython3",
176 | "version": "3.7.8"
177 | }
178 | },
179 | "nbformat": 4,
180 | "nbformat_minor": 4
181 | }
182 |
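The SGD cell above makes a single in-order pass over the data (n_iter = N and i = i % N). A common variant, sketched below with an illustrative epoch count and learning rate and reusing X, y, N and M from this notebook, runs several shuffled passes:

# multi-epoch SGD with per-epoch shuffling (illustrative hyperparameters)
rng = np.random.RandomState(2)
w = np.zeros([M, 1])
eta = 0.001
for epoch in range(20):
    for i in rng.permutation(N):
        x_i = X[i, np.newaxis].T                 # column vector, shape (M, 1)
        neg_gradient = (y[i] - w.T @ x_i) * x_i
        w = w + eta * neg_gradient
print(w.ravel())   # approaches [1.5, -2., 1.] (the 0.5 intercept is not modeled here)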
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/Linear Algebra, Matrix Calculus.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 6,7주차 강의자료/Linear Algebra, Matrix Calculus.pdf
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/ML_Basics (Linear Regression).ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "%matplotlib inline\n",
10 | "import matplotlib.pyplot as plt\n",
11 | "import seaborn as sns; sns.set()\n",
12 | "import numpy as np"
13 | ]
14 | },
15 | {
16 | "cell_type": "markdown",
17 | "metadata": {},
18 | "source": [
19 | "## 선형회귀 (Linear Regression)\n",
20 | "\n",
21 | "먼저, 주어진 데이터를 직선을 사용해 모델링하는 방법을 살펴본다. 직선함수는 다음과 같은 형태를 가진다.\n",
22 | "$$\n",
23 | "y = ax + b\n",
24 | "$$\n",
25 | "여기서 $a$는 기울기 (*slope*)이고 $b$는 $y$절편 (*intercept*)라고 불린다.\n",
26 | "\n",
27 | "아래 그래프는 기울기가 2이고 $y$절편이 -5인 직선으로부터 생성된 데이터를 보여준다"
28 | ]
29 | },
30 | {
31 | "cell_type": "code",
32 | "execution_count": null,
33 | "metadata": {},
34 | "outputs": [],
35 | "source": [
36 | "rng = np.random.RandomState(1)\n",
37 | "x = 10 * rng.rand(50)\n",
38 | "y = 2 * x - 5 + rng.randn(50)\n",
39 | "plt.scatter(x, y);"
40 | ]
41 | },
42 | {
43 | "cell_type": "markdown",
44 | "metadata": {},
45 | "source": [
46 | "Scikit-Learn의 ``LinearRegression`` estimator를 사용해서 위 데이터를 가장 잘 표현하는 직선을 찾을 수 있다."
47 | ]
48 | },
49 | {
50 | "cell_type": "code",
51 | "execution_count": null,
52 | "metadata": {},
53 | "outputs": [],
54 | "source": [
55 | "x.shape"
56 | ]
57 | },
58 | {
59 | "cell_type": "code",
60 | "execution_count": null,
61 | "metadata": {},
62 | "outputs": [],
63 | "source": [
64 | "x[:, np.newaxis].shape"
65 | ]
66 | },
67 | {
68 | "cell_type": "code",
69 | "execution_count": null,
70 | "metadata": {},
71 | "outputs": [],
72 | "source": [
73 | "from sklearn.linear_model import LinearRegression\n",
74 | "model = LinearRegression(fit_intercept=True)\n",
75 | "\n",
76 | "model.fit(x[:, np.newaxis], y)\n",
77 | "\n",
78 | "xfit = np.linspace(0, 10, 1000)\n",
79 | "yfit = model.predict(xfit[:, np.newaxis])\n",
80 | "\n",
81 | "plt.scatter(x, y)\n",
82 | "plt.plot(xfit, yfit);"
83 | ]
84 | },
85 | {
86 | "cell_type": "markdown",
87 | "metadata": {},
88 | "source": [
89 | "모델 학습이 끝난 후 학습된 파라미터들은 model.\"파라미터이름\"_ 의 형태로 저장된다. 기울기와 y절편은 아래와 같이 출력할 수 있다."
90 | ]
91 | },
92 | {
93 | "cell_type": "code",
94 | "execution_count": null,
95 | "metadata": {},
96 | "outputs": [],
97 | "source": [
98 | "print(\"Model slope: \", model.coef_[0])\n",
99 | "print(\"Model intercept:\", model.intercept_)"
100 | ]
101 | },
102 | {
103 | "cell_type": "markdown",
104 | "metadata": {},
105 | "source": [
106 | "``LinearRegression`` estimator는 위의 예제와 같은 1차원 입력뿐만 아니라 다차원 입력을 사용한 선형모델을 다룰 수 있다. 다차원 선형모델은 다음과 같은 형태를 가진다.\n",
107 | "$$\n",
108 | "y = a_0 + a_1 x_1 + a_2 x_2 + \\cdots\n",
109 | "$$\n",
110 | "기하학적으로 이것은 hyper-plane으로 데이터를 표현하는 것이라고 말할 수 있다."
111 | ]
112 | },
113 | {
114 | "cell_type": "code",
115 | "execution_count": null,
116 | "metadata": {},
117 | "outputs": [],
118 | "source": [
119 | "rng = np.random.RandomState(1)\n",
120 | "rng.rand(100, 3).shape\n",
121 | "X = 10 * rng.rand(100, 3)\n",
122 | "np.dot(X, [1.5, -2., 1.]).shape"
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "execution_count": null,
128 | "metadata": {},
129 | "outputs": [],
130 | "source": [
131 | "rng = np.random.RandomState(1)\n",
132 | "X = 10 * rng.rand(100, 3)\n",
133 | "y = 0.5 + np.dot(X, [1.5, -2., 1.])\n",
134 | "\n",
135 | "model.fit(X, y)\n",
136 | "print(model.intercept_)\n",
137 | "print(model.coef_)"
138 | ]
139 | },
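The same coefficients can also be recovered directly with ordinary least squares on a design matrix that contains a column of ones for the intercept. A sketch, assuming X and y from the cell above (np.linalg.lstsq is NumPy's standard least-squares solver):

# Design matrix with a leading bias column
X_design = np.hstack([np.ones((X.shape[0], 1)), X])

# Least-squares solution of X_design @ theta ~= y
theta, *_ = np.linalg.lstsq(X_design, y, rcond=None)
print(theta)   # approximately [0.5, 1.5, -2.0, 1.0]: intercept followed by the coefficients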
140 | {
141 | "cell_type": "markdown",
142 | "metadata": {},
143 | "source": [
144 | "$y$값들은 랜덤하게 생성된 3차원의 $x$값과 계수들([1.5, -2., 1.])을 곱함으로써 생성되었는데, linear regression을 통해서 이 계수들을 계산해낼 수 있다는 것을 알 수 있다.\n",
145 | "\n",
146 | "만약 데이터가 선형적인 관계를 가지고 있지 않다면?"
147 | ]
148 | },
149 | {
150 | "cell_type": "markdown",
151 | "metadata": {},
152 | "source": [
153 | "## 선형 기저함수 모델 (Linear Basis Function Models)\n",
154 | "\n",
155 | "비선형데이터를 선형함수로 모델링하는 한가지 방법은 기저함수(basis function)을 사용하는 것이다.\n",
156 | "\n",
157 | "예를 들어, 다음과 같은 선형함수를 사용한다고 하자.\n",
158 | "$$\n",
159 | "y = a_0 + a_1 x_1 + a_2 x_2 + a_3 x_3 + \\cdots\n",
160 | "$$\n",
161 | "여기서 $x_1, x_2, x_3,$등을 1차원 $x$로부터 생성할 수 있다($x_n = f_n(x)$). $f_n$을 기저함수(basis function)라고 부른다.\n",
162 | "\n",
163 | "만약 $f_n(x) = x^n$라는 기저함수를 사용하면 최종적인 모델은 다음과 같을 것이다.\n",
164 | "\n",
165 | "$$\n",
166 | "y = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \\cdots\n",
167 | "$$\n",
168 | "\n",
169 | "이 모델은 여전히 계수($a_n$)에 관해서는 선형함수임을 기억하자. 따라서 1차원 변수인 $x$를 기저함수를 통해 다차원으로 확장시킴으로써 우리는 여전히 선형모델(linear regression)을 사용할 수 있게 된다."
170 | ]
171 | },
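To make the last point concrete, here is a minimal sketch (variable names are illustrative) in which the basis features x, x^2, x^3 are built by hand and fed to the ordinary LinearRegression already imported above; the model is nonlinear in x but still linear in its coefficients:

x_demo = np.linspace(0, 2, 20)
X_basis = np.column_stack([x_demo, x_demo**2, x_demo**3])   # columns: x, x^2, x^3
y_demo = 1 + 2*x_demo - 0.5*x_demo**3                       # a cubic target, no noise
fit = LinearRegression().fit(X_basis, y_demo)
print(fit.intercept_, fit.coef_)   # approximately 1.0 and [2.0, 0.0, -0.5]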
172 | {
173 | "cell_type": "markdown",
174 | "metadata": {},
175 | "source": [
176 | "### 다항 기저함수 (Polynomial Basis Functions)\n",
177 | "\n",
178 | "$f_n(x) = x^n$ 형태의 함수를 다항 기저함수 (polynomial basis functions)라고 부른다. Scikit-Learn은 ``PolynomialFeatures``이라는 transformer를 이미 포함하고 있다."
179 | ]
180 | },
181 | {
182 | "cell_type": "code",
183 | "execution_count": null,
184 | "metadata": {},
185 | "outputs": [],
186 | "source": [
187 | "from sklearn.preprocessing import PolynomialFeatures\n",
188 | "x = np.array([2, 3, 4])\n",
189 | "poly = PolynomialFeatures(3, include_bias=False)\n",
190 | "poly.fit_transform(x[:, None])"
191 | ]
192 | },
193 | {
194 | "cell_type": "markdown",
195 | "metadata": {},
196 | "source": [
197 | "PolynomialFeatures가 1차원 array를 3차원 array로 변환한 것을 볼 수 있다. 이렇게 변환된 데이터를 선형모델에 적용할 수 있다.\n",
198 | "\n",
199 | "7차원 변환을 적용해보자."
200 | ]
201 | },
202 | {
203 | "cell_type": "code",
204 | "execution_count": null,
205 | "metadata": {},
206 | "outputs": [],
207 | "source": [
208 | "from sklearn.pipeline import make_pipeline\n",
209 | "poly_model = make_pipeline(PolynomialFeatures(7),\n",
210 | " LinearRegression())"
211 | ]
212 | },
213 | {
214 | "cell_type": "markdown",
215 | "metadata": {},
216 | "source": [
217 | "다차원 변환을 사용하면 복잡한 데이터를 모델링할 수 있게 된다. 예를 들어 sine 함수를 사용해서 데이터를 생성하고 모델링해보자."
218 | ]
219 | },
220 | {
221 | "cell_type": "code",
222 | "execution_count": null,
223 | "metadata": {},
224 | "outputs": [],
225 | "source": [
226 | "rng = np.random.RandomState(1)\n",
227 | "x = 10 * rng.rand(50)\n",
228 | "y = np.sin(x) + 0.1 * rng.randn(50)\n",
229 | "plt.scatter(x, y)"
230 | ]
231 | },
232 | {
233 | "cell_type": "code",
234 | "execution_count": null,
235 | "metadata": {},
236 | "outputs": [],
237 | "source": [
238 | "poly_model.fit(x[:, np.newaxis], y)\n",
239 | "yfit = poly_model.predict(xfit[:, np.newaxis])\n",
240 | "\n",
241 | "plt.scatter(x, y)\n",
242 | "plt.plot(xfit, yfit);"
243 | ]
244 | },
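As a rough, illustrative check of how the degree of the transform affects the fit, one can compare the training-set R^2 for a few degrees (this sketch assumes x, y, PolynomialFeatures, LinearRegression, and make_pipeline from the cells above; it is not a substitute for proper validation on held-out data):

for degree in [1, 3, 7, 15]:
    m = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    m.fit(x[:, np.newaxis], y)
    # score() returns the coefficient of determination R^2 on the data passed in
    print(degree, round(m.score(x[:, np.newaxis], y), 3))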
245 | {
246 | "cell_type": "markdown",
247 | "metadata": {},
248 | "source": [
249 | "### 가우시안 기저함수 (Gaussian Basis Functions)\n",
250 | "\n",
251 | "다항 기저함수 외에 다른 기저함수를 사용해보자. 가우시안 기저함수는 다음과 같이 정의된다.\n",
252 | "\n",
253 | "$$\\exp \\{-\\frac{(x-u_j)^2}{2s^2}\\}$$\n",
254 | "\n",
255 | "$u_j$는 함수의 위치, $s$는 폭을 결정한다. 주어진 데이터를 여러 개의 가우시안 기저함수들의 합으로 표현하려고 시도할 수 있다."
256 | ]
257 | },
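A tiny numerical sketch of the formula above: one Gaussian basis function centered at u_j = 5 with width s = 1, evaluated on a coarse grid (the values are chosen only for illustration):

u_j, s = 5.0, 1.0
x_grid = np.linspace(0, 10, 5)
phi = np.exp(-(x_grid - u_j) ** 2 / (2 * s ** 2))
print(phi)   # peaks at x = u_j and decays with distance on a scale set by s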
258 | {
259 | "cell_type": "markdown",
260 | "metadata": {},
261 | "source": [
262 | ""
263 | ]
264 | },
265 | {
266 | "cell_type": "markdown",
267 | "metadata": {},
268 | "source": [
269 | "그래프 아래 회색부분은 다수의 가우시안 기저함수들을 보여주고 있다. 이 기저함수들의 합은 데이터를 가로지르는 매끈한 곡선이 된다.\n",
270 | "\n",
271 | "가우시안 기저함수는 Scikit-Learn에 포함되어 있지 않지만 어렵지 않게 구현할 수 있다."
272 | ]
273 | },
274 | {
275 | "cell_type": "code",
276 | "execution_count": null,
277 | "metadata": {},
278 | "outputs": [],
279 | "source": [
280 | "from sklearn.base import BaseEstimator, TransformerMixin\n",
281 | "\n",
282 | "class GaussianFeatures(BaseEstimator, TransformerMixin):\n",
283 | " \"\"\"Uniformly spaced Gaussian features for one-dimensional input\"\"\"\n",
284 | " \n",
285 | " def __init__(self, N, width_factor=2.0):\n",
286 | " self.N = N\n",
287 | " self.width_factor = width_factor\n",
288 | " \n",
289 | " @staticmethod\n",
290 | " def _gauss_basis(x, y, width, axis=None):\n",
291 | " arg = (x - y) / width\n",
292 | " return np.exp(-0.5 * np.sum(arg ** 2, axis))\n",
293 | " \n",
294 | " def fit(self, X, y=None):\n",
295 | " # create N centers spread along the data range\n",
296 | " self.centers_ = np.linspace(X.min(), X.max(), self.N)\n",
297 | " self.width_ = self.width_factor * (self.centers_[1] - self.centers_[0])\n",
298 | " return self\n",
299 | " \n",
300 | " def transform(self, X):\n",
301 | " return self._gauss_basis(X[:, :, np.newaxis], self.centers_,\n",
302 | " self.width_, axis=1)\n",
303 | " \n",
304 | "gauss_model = make_pipeline(GaussianFeatures(20),\n",
305 | " LinearRegression())\n",
306 | "gauss_model.fit(x[:, np.newaxis], y)\n",
307 | "yfit = gauss_model.predict(xfit[:, np.newaxis])\n",
308 | "\n",
309 | "plt.scatter(x, y)\n",
310 | "plt.plot(xfit, yfit)\n",
311 | "plt.xlim(0, 10);"
312 | ]
313 | },
314 | {
315 | "cell_type": "markdown",
316 | "metadata": {},
317 | "source": [
318 | "## 규제화 (Regularization)\n",
319 | "\n",
320 | "기저함수를 사용함으로써 복잡한 데이터를 모델링할 수 있게 되었지만 조심하지 않는다면 over-fitting이라는 다른 심각한 문제를 만날 수 있다! 예를 들어, 너무 많은 개수의 가우시안 기저함수를 사용하게 되면 다음과 같이 될 수 있다."
321 | ]
322 | },
323 | {
324 | "cell_type": "code",
325 | "execution_count": null,
326 | "metadata": {},
327 | "outputs": [],
328 | "source": [
329 | "model = make_pipeline(GaussianFeatures(30),\n",
330 | " LinearRegression())\n",
331 | "model.fit(x[:, np.newaxis], y)\n",
332 | "\n",
333 | "plt.scatter(x, y)\n",
334 | "plt.plot(xfit, model.predict(xfit[:, np.newaxis]))\n",
335 | "\n",
336 | "plt.xlim(0, 10)\n",
337 | "plt.ylim(-1.5, 1.5);"
338 | ]
339 | },
340 | {
341 | "cell_type": "markdown",
342 | "metadata": {},
343 | "source": [
344 | "이 예제에서는 30개의 기저함수가 사용되었는데 모델이 필요이상으로 flexible해져서 데이터가 없는 곳에서는 극단적인 값을 가지는 것을 볼 수 있다. 기저함수의 계수들은 다음과 같이 확인할 수 있다."
345 | ]
346 | },
347 | {
348 | "cell_type": "code",
349 | "execution_count": null,
350 | "metadata": {
351 | "scrolled": true
352 | },
353 | "outputs": [],
354 | "source": [
355 | "def basis_plot(model, title=None):\n",
356 | " fig, ax = plt.subplots(2, sharex=True)\n",
357 | " model.fit(x[:, np.newaxis], y)\n",
358 | " ax[0].scatter(x, y)\n",
359 | " ax[0].plot(xfit, model.predict(xfit[:, np.newaxis]))\n",
360 | " ax[0].set(xlabel='x', ylabel='y', ylim=(-1.5, 1.5))\n",
361 | " \n",
362 | " if title:\n",
363 | " ax[0].set_title(title)\n",
364 | "\n",
365 | " ax[1].plot(model.steps[0][1].centers_,\n",
366 | " model.steps[1][1].coef_)\n",
367 | " ax[1].set(xlabel='basis location',\n",
368 | " ylabel='coefficient',\n",
369 | " xlim=(0, 10))\n",
370 | " \n",
371 | "model = make_pipeline(GaussianFeatures(30), LinearRegression())\n",
372 | "basis_plot(model)"
373 | ]
374 | },
375 | {
376 | "cell_type": "markdown",
377 | "metadata": {},
378 | "source": [
379 | "위 두번째 그래프는 각각의 가우시안 기저함수의 크기(계수값)을 보여주고 있다. Over-fitting이 일어나는 영역에서는 인접한 기저함수들의 값이 극단으로 가면서 서로 상쇄하는 현상이 일어난다. 따라서 큰 계수값에 대해 penalty를 부여해서 over-fitting을 어느 정도 극복할 수 있을 것이다. 이러한 penalty를 *regularization*이라 부른다."
380 | ]
381 | },
382 | {
383 | "cell_type": "markdown",
384 | "metadata": {},
385 | "source": [
386 | "### Ridge regression ($L_2$ Regularization)\n",
387 | "\n",
388 | "가장 자주 쓰이는 형태의 regularization은 *ridge regression* ($L_2$ *regularization*)이고 다음과 같이 정의된다.\n",
389 | "$$\n",
390 | "P = \\alpha\\sum_{n=1}^N \\theta_n^2\n",
391 | "$$\n",
392 | "여기서 $\\alpha$는 regularization의 강도를 조절하는 파라미터이다. 이 형태의 regularization은 Scikit-Learn의 ``Ridge`` estimator에서 사용된다."
393 | ]
394 | },
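For this penalty added to the squared-error loss, the minimizer has the closed form theta = (X^T X + alpha * I)^{-1} X^T y. A sketch with illustrative data (and no intercept, matching fit_intercept=False) checking this against Scikit-Learn's Ridge:

from sklearn.linear_model import Ridge

rng_demo = np.random.RandomState(0)
X_demo = rng_demo.randn(40, 3)
y_demo = X_demo @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng_demo.randn(40)

alpha = 0.1
# Closed-form ridge solution: (X^T X + alpha * I)^{-1} X^T y
theta_closed = np.linalg.solve(X_demo.T @ X_demo + alpha * np.eye(3), X_demo.T @ y_demo)

ridge = Ridge(alpha=alpha, fit_intercept=False).fit(X_demo, y_demo)
print(np.allclose(theta_closed, ridge.coef_))   # expected: True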
395 | {
396 | "cell_type": "code",
397 | "execution_count": null,
398 | "metadata": {},
399 | "outputs": [],
400 | "source": [
401 | "from sklearn.linear_model import Ridge\n",
402 | "model = make_pipeline(GaussianFeatures(30), Ridge(alpha=0.1))\n",
403 | "basis_plot(model, title='Ridge Regression')"
404 | ]
405 | },
406 | {
407 | "cell_type": "markdown",
408 | "metadata": {},
409 | "source": [
410 | "$\\alpha$값이 0에 가까워질수록 일반적인 선형회귀모델이 되고 $\\alpha$값이 무한대로 증가하면 데이터는 모델에 영향을 주지 않게 된다."
411 | ]
412 | },
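The two limits can be illustrated by refitting the 30-basis model with very small and very large alpha and looking at the largest coefficient magnitude (a sketch assuming GaussianFeatures, Ridge, make_pipeline, x, and y from the cells above):

for alpha in [1e-6, 0.1, 10.0, 1e6]:
    m = make_pipeline(GaussianFeatures(30), Ridge(alpha=alpha))
    m.fit(x[:, np.newaxis], y)
    coefs = m.steps[1][1].coef_
    print(alpha, np.abs(coefs).max())   # coefficients shrink toward 0 as alpha grows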
413 | {
414 | "cell_type": "markdown",
415 | "metadata": {},
416 | "source": [
417 | "### Lasso regression ($L_1$ regularization)\n",
418 | "\n",
419 | "또 하나의 자주 쓰이는 regularization방법은 계수들의 절대값의 합을 제한하는 것이다.\n",
420 | "$$\n",
421 | "P = \\alpha\\sum_{n=1}^N |\\theta_n|\n",
422 | "$$\n",
423 | "뒤에서 자세히 다루겠지만 이 방법은 *sparse*한 모델을 생성하게 된다(많은 계수들이 0이 됨)."
424 | ]
425 | },
426 | {
427 | "cell_type": "code",
428 | "execution_count": null,
429 | "metadata": {
430 | "scrolled": true
431 | },
432 | "outputs": [],
433 | "source": [
434 | "from sklearn.linear_model import Lasso\n",
435 | "model = make_pipeline(GaussianFeatures(30), Lasso(alpha=0.001))\n",
436 | "basis_plot(model, title='Lasso Regression')"
437 | ]
438 | },
439 | {
440 | "cell_type": "markdown",
441 | "metadata": {},
442 | "source": [
443 | "위에서 볼 수 있듯 대부분의 계수값들이 0이 된다. Ridge regression과 마찬가지로 $\\alpha$값으로 regularization의 강도를 조절할 수 있다."
444 | ]
445 | },
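The sparsity claim can be checked directly by counting the coefficients that are numerically nonzero in the lasso pipeline just fitted by basis_plot (a short sketch assuming model from the cell above):

lasso_coef = model.steps[1][1].coef_           # coefficients of the Lasso step
n_nonzero = np.sum(np.abs(lasso_coef) > 1e-10)
print(n_nonzero, "of", lasso_coef.size, "coefficients are nonzero")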
446 | {
447 | "cell_type": "markdown",
448 | "metadata": {},
449 | "source": [
450 | "### SGD"
451 | ]
452 | },
453 | {
454 | "cell_type": "code",
455 | "execution_count": null,
456 | "metadata": {},
457 | "outputs": [],
458 | "source": [
459 | "from sklearn.linear_model import SGDRegressor\n",
460 | "model = make_pipeline(GaussianFeatures(30),\n",
461 | " SGDRegressor(max_iter=1000, tol=1e-8, alpha=0))\n",
462 | "basis_plot(model)"
463 | ]
464 | },
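SGDRegressor fits the same kind of linear model by iterative gradient updates instead of a closed-form solve. The following is a deliberately simplified sketch of that idea: full-batch gradient descent on the mean squared error for a single slope and intercept, without the per-sample updates, shuffling, or learning-rate schedule that SGDRegressor actually uses (it assumes x and y, the sine data, from the cells above):

# Fit y ~= w*x + b by gradient descent on the MSE (a straight line is a poor model
# for sine data; the point here is only the update rule itself)
w, b = 0.0, 0.0
lr = 0.01
for _ in range(2000):
    err = w * x + b - y
    grad_w = 2 * np.mean(err * x)   # d(MSE)/dw
    grad_b = 2 * np.mean(err)       # d(MSE)/db
    w -= lr * grad_w
    b -= lr * grad_b
print(w, b)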
465 | {
466 | "cell_type": "code",
467 | "execution_count": null,
468 | "metadata": {},
469 | "outputs": [],
470 | "source": []
471 | }
472 | ],
473 | "metadata": {
474 | "kernelspec": {
475 | "display_name": "Python 3",
476 | "language": "python",
477 | "name": "python3"
478 | },
479 | "language_info": {
480 | "codemirror_mode": {
481 | "name": "ipython",
482 | "version": 3
483 | },
484 | "file_extension": ".py",
485 | "mimetype": "text/x-python",
486 | "name": "python",
487 | "nbconvert_exporter": "python",
488 | "pygments_lexer": "ipython3",
489 | "version": "3.7.8"
490 | }
491 | },
492 | "nbformat": 4,
493 | "nbformat_minor": 4
494 | }
495 |
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/ML_Basics (Linear Regression).pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 6,7주차 강의자료/ML_Basics (Linear Regression).pdf
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/ml_basics_1.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 6,7주차 강의자료/ml_basics_1.pptx
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/ml_basics_2.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 6,7주차 강의자료/ml_basics_2.pptx
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/ml_basics_3.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 6,7주차 강의자료/ml_basics_3.pptx
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/ml_linear_classification.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 6,7주차 강의자료/ml_linear_classification.pptx
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/ml_linear_regression.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 6,7주차 강의자료/ml_linear_regression.pptx
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/ml_probability_1.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 6,7주차 강의자료/ml_probability_1.pptx
--------------------------------------------------------------------------------
/[KDT] 인공지능 6,7주차 강의자료/ml_probability_2.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/[KDT] 인공지능 6,7주차 강의자료/ml_probability_2.pptx
--------------------------------------------------------------------------------
/[KDT] 인공지능 8주차 실습/[실습]04_PyTorch_CNN(AlexNet).ipynb:
--------------------------------------------------------------------------------
1 | {"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"name":"AlexNet_model(Pytorch).ipynb","provenance":[],"collapsed_sections":[],"authorship_tag":"ABX9TyM2fGN8fNeTZyzrGvTdFFV5"},"kernelspec":{"name":"python3","display_name":"Python 3"}},"cells":[{"cell_type":"code","metadata":{"id":"s3G6N4VQkDkD"},"source":["import torch\n","import torch.nn as nn\n","from .utils import load_state_dict_from_url\n","from typing import Any\n","\n","\n","__all__ = ['AlexNet', 'alexnet']\n","\n","\n","model_urls = {\n"," 'alexnet': 'https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth',\n","}\n","\n","\n","class AlexNet(nn.Module):\n","\n"," def __init__(self, num_classes: int = 1000) -> None:\n"," super(AlexNet, self).__init__()\n"," self.features = nn.Sequential(\n"," #Conv1\n"," #input channel : 3 , output channel : 64, kernel_size : 11, stride : 4, padding : 2\n"," nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2),\n"," nn.ReLU(inplace=True), # inplace=True 하면, inplace 연산을 수행함, inplace 연산은 결과값을 새로운 변수에 값을 저장하는 대신 기존의 데이터를 대체하는것을 의미\n"," #Max Pool1\n"," nn.MaxPool2d(kernel_size=3, stride=2),\n"," #Conv2\n"," nn.Conv2d(64, 192, kernel_size=5, padding=2),\n"," nn.ReLU(inplace=True),\n"," #Max Pool2\n"," nn.MaxPool2d(kernel_size=3, stride=2),\n"," #Conv3\n"," nn.Conv2d(192, 384, kernel_size=3, padding=1),\n"," nn.ReLU(inplace=True),\n"," ##Conv4\n"," nn.Conv2d(384, 256, kernel_size=3, padding=1),\n"," nn.ReLU(inplace=True),\n"," #Conv5\n"," nn.Conv2d(256, 256, kernel_size=3, padding=1),\n"," nn.ReLU(inplace=True),\n"," #Max Pool3\n"," nn.MaxPool2d(kernel_size=3, stride=2),\n"," )\n"," self.avgpool = nn.AdaptiveAvgPool2d((6, 6))\n"," self.classifier = nn.Sequential(\n"," #드롭아웃 \n"," nn.Dropout(),\n"," nn.Linear(256 * 6 * 6, 4096),\n"," nn.ReLU(inplace=True),\n"," nn.Dropout(),\n"," nn.Linear(4096, 4096),\n"," nn.ReLU(inplace=True),\n"," nn.Linear(4096, num_classes),\n"," )\n","\n"," def forward(self, x: torch.Tensor) -> torch.Tensor:\n"," #특징 추출 부분\n"," x = self.features(x)\n"," \n"," x = self.avgpool(x)\n"," #output shape : (batch size * 256(channel), 6, 6)\n"," #Flatten\n"," x = torch.flatten(x, 1)\n"," #output shape (batch_size, 256 * 6* 6)\n"," #분류 분류 \n"," x = self.classifier(x)\n"," return x\n","\n","\n","def alexnet(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> AlexNet:\n"," r\"\"\"AlexNet model architecture from the\n"," `\"One weird trick...\" `_ paper.\n"," Args:\n"," pretrained (bool): If True, returns a model pre-trained on ImageNet\n"," progress (bool): If True, displays a progress bar of the download to stderr\n"," \"\"\"\n"," model = AlexNet(**kwargs)\n"," if pretrained:\n"," state_dict = load_state_dict_from_url(model_urls['alexnet'],\n"," progress=progress)\n"," model.load_state_dict(state_dict)\n"," return model"],"execution_count":null,"outputs":[]}]}
--------------------------------------------------------------------------------
/[KDT] 인공지능 8주차 실습/[실습]04_TensorFlow_CNN(AlexNet).ipynb:
--------------------------------------------------------------------------------
1 | {"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"name":"AlexNet(Tensorflow).ipynb","provenance":[],"collapsed_sections":[],"authorship_tag":"ABX9TyOJ1DFP7OOJAUkYAMiWCiAM"},"kernelspec":{"name":"python3","display_name":"Python 3"}},"cells":[{"cell_type":"code","metadata":{"id":"TgQpetUTM7Ik"},"source":["import tensorflow as tf\r\n","\r\n","def AlexNet(\r\n"," input_shape=None,\r\n"," weights=None,\r\n"," classes=1000,\r\n"," classifier_activation='softmax'):\r\n"," \r\n"," model = tf.keras.Sequential([\r\n"," #특징 추출 부분 \r\n"," #Conv 1\r\n"," tf.keras.layers.Conv2D(filters=96,\r\n"," kernel_size=(11, 11),\r\n"," strides=4,\r\n"," padding=\"valid\",\r\n"," activation=tf.keras.activations.relu,\r\n"," input_shape=input_shape),\r\n"," #Max Pool 1\r\n"," tf.keras.layers.MaxPool2D(pool_size=(3, 3),\r\n"," strides=2,\r\n"," padding=\"valid\"),\r\n"," tf.keras.layers.BatchNormalization(),\r\n"," #Conv 2\r\n"," tf.keras.layers.Conv2D(filters=256,\r\n"," kernel_size=(5, 5),\r\n"," strides=1,\r\n"," padding=\"same\",\r\n"," activation=tf.keras.activations.relu),\r\n"," #Max Pool 2\r\n"," tf.keras.layers.MaxPool2D(pool_size=(3, 3),\r\n"," strides=2,\r\n"," padding=\"same\"),\r\n"," tf.keras.layers.BatchNormalization(),\r\n"," #Conv 3\r\n"," tf.keras.layers.Conv2D(filters=384,\r\n"," kernel_size=(3, 3),\r\n"," strides=1,\r\n"," padding=\"same\",\r\n"," activation=tf.keras.activations.relu),\r\n"," #Conv 4\r\n"," tf.keras.layers.Conv2D(filters=384,\r\n"," kernel_size=(3, 3),\r\n"," strides=1,\r\n"," padding=\"same\",\r\n"," activation=tf.keras.activations.relu),\r\n"," #Conv 5\r\n"," tf.keras.layers.Conv2D(filters=256,\r\n"," kernel_size=(3, 3),\r\n"," strides=1,\r\n"," padding=\"same\",\r\n"," activation=tf.keras.activations.relu),\r\n"," #Max Pool 3\r\n"," tf.keras.layers.MaxPool2D(pool_size=(3, 3),\r\n"," strides=2,\r\n"," padding=\"same\"),\r\n"," tf.keras.layers.BatchNormalization(),\r\n"," \r\n"," tf.keras.layers.Flatten(),\r\n"," \r\n"," #분류 층 부분\r\n"," #Fully connected layer 1 \r\n"," tf.keras.layers.Dense(units=4096,\r\n"," activation=tf.keras.activations.relu),\r\n"," tf.keras.layers.Dropout(rate=0.2),\r\n"," #Fully connected layer 2\r\n"," tf.keras.layers.Dense(units=4096,\r\n"," activation=tf.keras.activations.relu),\r\n"," tf.keras.layers.Dropout(rate=0.2),\r\n"," \r\n"," #Fully connected layer 3\r\n"," tf.keras.layers.Dense(units=classes,\r\n"," activation=tf.keras.activations.softmax)\r\n"," ])\r\n","\r\n"," return model\r\n","\r\n","\r\n"],"execution_count":null,"outputs":[]}]}
--------------------------------------------------------------------------------
/[KDT] 인공지능 9주차 실습/[실습]05_PyTorch_CNN(VGGNet).ipynb:
--------------------------------------------------------------------------------
1 | {"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"name":"VGGNet_model.ipynb","provenance":[],"collapsed_sections":[],"authorship_tag":"ABX9TyPx3n3Fs3TuEv1JQaXFo7jw"},"kernelspec":{"name":"python3","display_name":"Python 3"}},"cells":[{"cell_type":"code","metadata":{"id":"iXw570s7pQ9q"},"source":["import torch\n","import torch.nn as nn\n","from .utils import load_state_dict_from_url\n","from typing import Union, List, Dict, Any, cast\n","\n","\n","__all__ = [\n"," 'VGG', 'vgg11', 'vgg11_bn', 'vgg13', 'vgg13_bn', 'vgg16', 'vgg16_bn',\n"," 'vgg19_bn', 'vgg19',\n","]\n","\n","\n","model_urls = {\n"," 'vgg11': 'https://download.pytorch.org/models/vgg11-bbd30ac9.pth',\n"," 'vgg13': 'https://download.pytorch.org/models/vgg13-c768596a.pth',\n"," 'vgg16': 'https://download.pytorch.org/models/vgg16-397923af.pth',\n"," 'vgg19': 'https://download.pytorch.org/models/vgg19-dcbb9e9d.pth',\n"," 'vgg11_bn': 'https://download.pytorch.org/models/vgg11_bn-6002323d.pth',\n"," 'vgg13_bn': 'https://download.pytorch.org/models/vgg13_bn-abd245e5.pth',\n"," 'vgg16_bn': 'https://download.pytorch.org/models/vgg16_bn-6c64b313.pth',\n"," 'vgg19_bn': 'https://download.pytorch.org/models/vgg19_bn-c79401a0.pth',\n","}\n","\n","\n","class VGG(nn.Module):\n","\n"," def __init__(\n"," self,\n"," features: nn.Module,\n"," num_classes: int = 1000,\n"," init_weights: bool = True\n"," ) -> None:\n"," super(VGG, self).__init__()\n"," self.features = features\n"," self.avgpool = nn.AdaptiveAvgPool2d((7, 7))\n"," self.classifier = nn.Sequential(\n"," nn.Linear(512 * 7 * 7, 4096),\n"," nn.ReLU(True),\n"," nn.Dropout(),\n"," nn.Linear(4096, 4096),\n"," nn.ReLU(True),\n"," nn.Dropout(),\n"," nn.Linear(4096, num_classes),\n"," )\n"," if init_weights:\n"," self._initialize_weights()\n","\n"," def forward(self, x: torch.Tensor) -> torch.Tensor:\n"," x = self.features(x)\n"," x = self.avgpool(x)\n"," # batch size * channel * height * width\n"," # 128 , (512 * 7 * 7) \n"," x = torch.flatten(x, 1)\n"," x = self.classifier(x)\n"," return x\n","\n"," def _initialize_weights(self) -> None:\n"," for m in self.modules():\n"," if isinstance(m, nn.Conv2d):\n"," nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')\n"," if m.bias is not None:\n"," nn.init.constant_(m.bias, 0)\n"," elif isinstance(m, nn.BatchNorm2d):\n"," nn.init.constant_(m.weight, 1)\n"," nn.init.constant_(m.bias, 0)\n"," elif isinstance(m, nn.Linear):\n"," nn.init.normal_(m.weight, 0, 0.01)\n"," nn.init.constant_(m.bias, 0)\n","\n","#'D': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M', 512, 512, 512, 'M', 512, 512, 512, 'M'], \n","def make_layers(cfg: List[Union[str, int]], batch_norm: bool = False) -> nn.Sequential:\n"," layers: List[nn.Module] = []\n"," in_channels = 3\n"," for v in cfg:\n"," #'M'이면 maxpool 층 추가 \n"," if v == 'M':\n"," layers += [nn.MaxPool2d(kernel_size=2, stride=2)]\n"," #그 이외에는 \n"," else:\n"," v = cast(int, v)\n"," conv2d = nn.Conv2d(in_channels, v, kernel_size=3, padding=1)\n"," #batch norm을 사용한다면 relu전에 batchnorm layer 추가 \n"," if batch_norm:\n"," layers += [conv2d, nn.BatchNorm2d(v), nn.ReLU(inplace=True)]\n"," else:\n"," #batch norm을 사용하지 않는다면 relu만 추가 \n"," layers += [conv2d, nn.ReLU(inplace=True)]\n"," #in_channels를 현재 channel 값으로 바꿔줌\n"," in_channels = v\n"," return nn.Sequential(*layers)\n","\n","\n","cfgs: Dict[str, List[Union[str, int]]] = {\n"," 'A': [64, 'M', 128, 'M', 256, 256, 'M', 512, 512, 'M', 512, 512, 'M'],\n"," 'B': [64, 64, 'M', 128, 128, 'M', 256, 256, 'M', 512, 512, 'M', 512, 512, 'M'],\n"," 'D': [64, 64, 
'M', 128, 128, 'M', 256, 256, 256, 'M', 512, 512, 512, 'M', 512, 512, 512, 'M'],\n"," 'E': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 256, 'M', 512, 512, 512, 512, 'M', 512, 512, 512, 512, 'M'],\n","}\n","\n","\n","def _vgg(arch: str, cfg: str, batch_norm: bool, pretrained: bool, progress: bool, **kwargs: Any) -> VGG:\n"," if pretrained:\n"," kwargs['init_weights'] = False\n"," # _vgg('vgg16', 'D', False, pretrained, progress, **kwargs)\n"," #'D': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M', 512, 512, 512, 'M', 512, 512, 512, 'M'],\n"," model = VGG(make_layers(cfgs[cfg], batch_norm=batch_norm), **kwargs)\n"," if pretrained:\n"," state_dict = load_state_dict_from_url(model_urls[arch],\n"," progress=progress)\n"," model.load_state_dict(state_dict)\n"," return model\n","\n","\n","def vgg11(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> VGG:\n"," r\"\"\"VGG 11-layer model (configuration \"A\") from\n"," `\"Very Deep Convolutional Networks For Large-Scale Image Recognition\" `._\n"," Args:\n"," pretrained (bool): If True, returns a model pre-trained on ImageNet\n"," progress (bool): If True, displays a progress bar of the download to stderr\n"," \"\"\"\n"," return _vgg('vgg11', 'A', False, pretrained, progress, **kwargs)\n","\n","\n","def vgg11_bn(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> VGG:\n"," r\"\"\"VGG 11-layer model (configuration \"A\") with batch normalization\n"," `\"Very Deep Convolutional Networks For Large-Scale Image Recognition\" `._\n"," Args:\n"," pretrained (bool): If True, returns a model pre-trained on ImageNet\n"," progress (bool): If True, displays a progress bar of the download to stderr\n"," \"\"\"\n"," return _vgg('vgg11_bn', 'A', True, pretrained, progress, **kwargs)\n","\n","\n","def vgg13(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> VGG:\n"," r\"\"\"VGG 13-layer model (configuration \"B\")\n"," `\"Very Deep Convolutional Networks For Large-Scale Image Recognition\" `._\n"," Args:\n"," pretrained (bool): If True, returns a model pre-trained on ImageNet\n"," progress (bool): If True, displays a progress bar of the download to stderr\n"," \"\"\"\n"," return _vgg('vgg13', 'B', False, pretrained, progress, **kwargs)\n","\n","\n","def vgg13_bn(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> VGG:\n"," r\"\"\"VGG 13-layer model (configuration \"B\") with batch normalization\n"," `\"Very Deep Convolutional Networks For Large-Scale Image Recognition\" `._\n"," Args:\n"," pretrained (bool): If True, returns a model pre-trained on ImageNet\n"," progress (bool): If True, displays a progress bar of the download to stderr\n"," \"\"\"\n"," return _vgg('vgg13_bn', 'B', True, pretrained, progress, **kwargs)\n","\n","#실제로 models에서 호출하는 함수\n","def vgg16(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> VGG:\n"," r\"\"\"VGG 16-layer model (configuration \"D\")\n"," `\"Very Deep Convolutional Networks For Large-Scale Image Recognition\" `._\n"," Args:\n"," pretrained (bool): If True, returns a model pre-trained on ImageNet\n"," progress (bool): If True, displays a progress bar of the download to stderr\n"," \"\"\"\n"," #여기서 시작 \n"," return _vgg('vgg16', 'D', False, pretrained, progress, **kwargs)\n","\n","\n","def vgg16_bn(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> VGG:\n"," r\"\"\"VGG 16-layer model (configuration \"D\") with batch normalization\n"," `\"Very Deep Convolutional Networks For Large-Scale Image Recognition\" `._\n"," Args:\n"," pretrained 
(bool): If True, returns a model pre-trained on ImageNet\n"," progress (bool): If True, displays a progress bar of the download to stderr\n"," \"\"\"\n"," return _vgg('vgg16_bn', 'D', True, pretrained, progress, **kwargs)\n","\n","\n","def vgg19(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> VGG:\n"," r\"\"\"VGG 19-layer model (configuration \"E\")\n"," `\"Very Deep Convolutional Networks For Large-Scale Image Recognition\" `._\n"," Args:\n"," pretrained (bool): If True, returns a model pre-trained on ImageNet\n"," progress (bool): If True, displays a progress bar of the download to stderr\n"," \"\"\"\n"," return _vgg('vgg19', 'E', False, pretrained, progress, **kwargs)\n","\n","\n","def vgg19_bn(pretrained: bool = False, progress: bool = True, **kwargs: Any) -> VGG:\n"," r\"\"\"VGG 19-layer model (configuration 'E') with batch normalization\n"," `\"Very Deep Convolutional Networks For Large-Scale Image Recognition\" `._\n"," Args:\n"," pretrained (bool): If True, returns a model pre-trained on ImageNet\n"," progress (bool): If True, displays a progress bar of the download to stderr\n"," \"\"\"\n"," return _vgg('vgg19_bn', 'E', True, pretrained, progress, **kwargs)"],"execution_count":null,"outputs":[]}]}
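A small sketch of how the cfgs table above drives make_layers: configuration 'D' produces the 13 convolutional layers of VGG16 (13 conv + 3 fully connected = 16 weight layers), with each 'M' entry becoming a max-pooling layer. It assumes the cell above has been run with the relative import removed:

import torch.nn as nn

features = make_layers(cfgs['D'], batch_norm=False)
n_conv = sum(isinstance(m, nn.Conv2d) for m in features)
n_pool = sum(isinstance(m, nn.MaxPool2d) for m in features)
print(n_conv, n_pool)   # 13 convolutions, 5 max-pool layers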
--------------------------------------------------------------------------------
/[KDT] 인공지능 9주차 실습/[실습]05_TensorFlow_CNN(VGGNet).ipynb:
--------------------------------------------------------------------------------
1 | {"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"name":"VGGNet16_models(Tensorflow).ipynb","provenance":[],"collapsed_sections":[],"authorship_tag":"ABX9TyO7UGr66n0suzmPEXw4hR0V"},"kernelspec":{"name":"python3","display_name":"Python 3"}},"cells":[{"cell_type":"code","metadata":{"id":"PDARYlagDFWQ"},"source":["# Copyright 2015 The TensorFlow Authors. All Rights Reserved.\r\n","#\r\n","# Licensed under the Apache License, Version 2.0 (the \"License\");\r\n","# you may not use this file except in compliance with the License.\r\n","# You may obtain a copy of the License at\r\n","#\r\n","# http://www.apache.org/licenses/LICENSE-2.0\r\n","#\r\n","# Unless required by applicable law or agreed to in writing, software\r\n","# distributed under the License is distributed on an \"AS IS\" BASIS,\r\n","# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\r\n","# See the License for the specific language governing permissions and\r\n","# limitations under the License.\r\n","# ==============================================================================\r\n","# pylint: disable=invalid-name\r\n","\"\"\"VGG16 model for Keras.\r\n","Reference:\r\n"," - [Very Deep Convolutional Networks for Large-Scale Image Recognition]\r\n"," (https://arxiv.org/abs/1409.1556) (ICLR 2015)\r\n","\"\"\"\r\n","from __future__ import absolute_import\r\n","from __future__ import division\r\n","from __future__ import print_function\r\n","\r\n","from tensorflow.python.keras import backend\r\n","from tensorflow.python.keras.applications import imagenet_utils\r\n","from tensorflow.python.keras.engine import training\r\n","from tensorflow.python.keras.layers import VersionAwareLayers\r\n","from tensorflow.python.keras.utils import data_utils\r\n","from tensorflow.python.keras.utils import layer_utils\r\n","from tensorflow.python.lib.io import file_io\r\n","from tensorflow.python.util.tf_export import keras_export\r\n","\r\n","\r\n","WEIGHTS_PATH = ('https://storage.googleapis.com/tensorflow/keras-applications/'\r\n"," 'vgg16/vgg16_weights_tf_dim_ordering_tf_kernels.h5')\r\n","WEIGHTS_PATH_NO_TOP = ('https://storage.googleapis.com/tensorflow/'\r\n"," 'keras-applications/vgg16/'\r\n"," 'vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5')\r\n","\r\n","layers = VersionAwareLayers()\r\n","\r\n","#keras.applications.VGG16를 사용하면 호출되는 함수\r\n","@keras_export('keras.applications.vgg16.VGG16', 'keras.applications.VGG16')\r\n","def VGG16(\r\n"," include_top=True, #분류 층 포함할지 여부 \r\n"," weights='imagenet',\r\n"," input_tensor=None,\r\n"," input_shape=None,\r\n"," pooling=None,\r\n"," classes=1000,\r\n"," classifier_activation='softmax'):\r\n"," \"\"\"Instantiates the VGG16 model.\r\n"," Reference:\r\n"," - [Very Deep Convolutional Networks for Large-Scale Image Recognition](\r\n"," https://arxiv.org/abs/1409.1556) (ICLR 2015)\r\n"," By default, it loads weights pre-trained on ImageNet. 
Check 'weights' for\r\n"," other options.\r\n"," This model can be built both with 'channels_first' data format\r\n"," (channels, height, width) or 'channels_last' data format\r\n"," (height, width, channels).\r\n"," The default input size for this model is 224x224.\r\n"," Note: each Keras Application expects a specific kind of input preprocessing.\r\n"," For VGG16, call `tf.keras.applications.vgg16.preprocess_input` on your\r\n"," inputs before passing them to the model.\r\n"," Arguments:\r\n"," include_top: whether to include the 3 fully-connected\r\n"," layers at the top of the network.\r\n"," weights: one of `None` (random initialization),\r\n"," 'imagenet' (pre-training on ImageNet),\r\n"," or the path to the weights file to be loaded.\r\n"," input_tensor: optional Keras tensor\r\n"," (i.e. output of `layers.Input()`)\r\n"," to use as image input for the model.\r\n"," input_shape: optional shape tuple, only to be specified\r\n"," if `include_top` is False (otherwise the input shape\r\n"," has to be `(224, 224, 3)`\r\n"," (with `channels_last` data format)\r\n"," or `(3, 224, 224)` (with `channels_first` data format).\r\n"," It should have exactly 3 input channels,\r\n"," and width and height should be no smaller than 32.\r\n"," E.g. `(200, 200, 3)` would be one valid value.\r\n"," pooling: Optional pooling mode for feature extraction\r\n"," when `include_top` is `False`.\r\n"," - `None` means that the output of the model will be\r\n"," the 4D tensor output of the\r\n"," last convolutional block.\r\n"," - `avg` means that global average pooling\r\n"," will be applied to the output of the\r\n"," last convolutional block, and thus\r\n"," the output of the model will be a 2D tensor.\r\n"," - `max` means that global max pooling will\r\n"," be applied.\r\n"," classes: optional number of classes to classify images\r\n"," into, only to be specified if `include_top` is True, and\r\n"," if no `weights` argument is specified.\r\n"," classifier_activation: A `str` or callable. The activation function to use\r\n"," on the \"top\" layer. Ignored unless `include_top=True`. 
Set\r\n"," `classifier_activation=None` to return the logits of the \"top\" layer.\r\n"," Returns:\r\n"," A `keras.Model` instance.\r\n"," Raises:\r\n"," ValueError: in case of invalid argument for `weights`,\r\n"," or invalid input shape.\r\n"," ValueError: if `classifier_activation` is not `softmax` or `None` when\r\n"," using a pretrained top layer.\r\n"," \"\"\"\r\n"," if not (weights in {'imagenet', None} or file_io.file_exists_v2(weights)):\r\n"," raise ValueError('The `weights` argument should be either '\r\n"," '`None` (random initialization), `imagenet` '\r\n"," '(pre-training on ImageNet), '\r\n"," 'or the path to the weights file to be loaded.')\r\n","\r\n"," if weights == 'imagenet' and include_top and classes != 1000:\r\n"," raise ValueError('If using `weights` as `\"imagenet\"` with `include_top`'\r\n"," ' as true, `classes` should be 1000')\r\n"," # Determine proper input shape\r\n"," input_shape = imagenet_utils.obtain_input_shape(\r\n"," input_shape,\r\n"," default_size=224,\r\n"," min_size=32,\r\n"," data_format=backend.image_data_format(),\r\n"," require_flatten=include_top,\r\n"," weights=weights)\r\n","\r\n"," if input_tensor is None:\r\n"," img_input = layers.Input(shape=input_shape)\r\n"," else:\r\n"," if not backend.is_keras_tensor(input_tensor):\r\n"," img_input = layers.Input(tensor=input_tensor, shape=input_shape)\r\n"," else:\r\n"," img_input = input_tensor\r\n","\r\n"," # Block 1\r\n"," #layers.Conv2D(filters,kernel_size)\r\n"," #channels : 64, kernel_size : (3,3), activation function = ReLU, padding='same'(padding 사용), name : 해당 layer의 이름 설정\r\n"," x = layers.Conv2D(\r\n"," 64, (3, 3), activation='relu', padding='same', name='block1_conv1')(\r\n"," img_input)\r\n"," x = layers.Conv2D(\r\n"," 64, (3, 3), activation='relu', padding='same', name='block1_conv2')(x)\r\n"," #pool_size : (2,2)\r\n"," x = layers.MaxPooling2D((2, 2), strides=(2, 2), name='block1_pool')(x)\r\n","\r\n"," # Block 2\r\n"," x = layers.Conv2D(\r\n"," 128, (3, 3), activation='relu', padding='same', name='block2_conv1')(x)\r\n"," x = layers.Conv2D(\r\n"," 128, (3, 3), activation='relu', padding='same', name='block2_conv2')(x)\r\n"," x = layers.MaxPooling2D((2, 2), strides=(2, 2), name='block2_pool')(x)\r\n","\r\n"," # Block 3\r\n"," x = layers.Conv2D(\r\n"," 256, (3, 3), activation='relu', padding='same', name='block3_conv1')(x)\r\n"," x = layers.Conv2D(\r\n"," 256, (3, 3), activation='relu', padding='same', name='block3_conv2')(x)\r\n"," x = layers.Conv2D(\r\n"," 256, (3, 3), activation='relu', padding='same', name='block3_conv3')(x)\r\n"," x = layers.MaxPooling2D((2, 2), strides=(2, 2), name='block3_pool')(x)\r\n","\r\n"," # Block 4\r\n"," x = layers.Conv2D(\r\n"," 512, (3, 3), activation='relu', padding='same', name='block4_conv1')(x)\r\n"," x = layers.Conv2D(\r\n"," 512, (3, 3), activation='relu', padding='same', name='block4_conv2')(x)\r\n"," x = layers.Conv2D(\r\n"," 512, (3, 3), activation='relu', padding='same', name='block4_conv3')(x)\r\n"," x = layers.MaxPooling2D((2, 2), strides=(2, 2), name='block4_pool')(x)\r\n","\r\n"," # Block 5\r\n"," x = layers.Conv2D(\r\n"," 512, (3, 3), activation='relu', padding='same', name='block5_conv1')(x)\r\n"," x = layers.Conv2D(\r\n"," 512, (3, 3), activation='relu', padding='same', name='block5_conv2')(x)\r\n"," x = layers.Conv2D(\r\n"," 512, (3, 3), activation='relu', padding='same', name='block5_conv3')(x)\r\n"," x = layers.MaxPooling2D((2, 2), strides=(2, 2), name='block5_pool')(x)\r\n","\r\n"," #분류 층을 포함한다면\r\n"," if include_top:\r\n"," # 
Classification block\r\n"," #Flatten\r\n"," x = layers.Flatten(name='flatten')(x)\r\n"," #Fully connected layer 1\r\n"," x = layers.Dense(4096, activation='relu', name='fc1')(x)\r\n"," #Fully connected layer 2\r\n"," x = layers.Dense(4096, activation='relu', name='fc2')(x)\r\n","\r\n"," imagenet_utils.validate_activation(classifier_activation, weights)\r\n"," #최종 분류층 (activation 보통 softmax로 사용)\r\n"," x = layers.Dense(classes, activation=classifier_activation,\r\n"," name='predictions')(x)\r\n"," else:\r\n"," if pooling == 'avg':\r\n"," x = layers.GlobalAveragePooling2D()(x)\r\n"," elif pooling == 'max':\r\n"," x = layers.GlobalMaxPooling2D()(x)\r\n","\r\n"," # Ensure that the model takes into account\r\n"," # any potential predecessors of `input_tensor`.\r\n"," if input_tensor is not None:\r\n"," inputs = layer_utils.get_source_inputs(input_tensor)\r\n"," else:\r\n"," inputs = img_input\r\n"," # Create model.\r\n"," model = training.Model(inputs, x, name='vgg16')\r\n","\r\n"," # Load weights.\r\n"," #ImageNet weight 사용할 경우 \r\n"," if weights == 'imagenet':\r\n"," #분류 층 포함 인 경우 \r\n"," if include_top:\r\n"," weights_path = data_utils.get_file(\r\n"," 'vgg16_weights_tf_dim_ordering_tf_kernels.h5',\r\n"," WEIGHTS_PATH,\r\n"," cache_subdir='models',\r\n"," file_hash='64373286793e3c8b2b4e3219cbf3544b')\r\n"," #특징 분류 부분만 \r\n"," else:\r\n"," weights_path = data_utils.get_file(\r\n"," 'vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5',\r\n"," WEIGHTS_PATH_NO_TOP,\r\n"," cache_subdir='models',\r\n"," file_hash='6d6bbae143d832006294945121d1f1fc')\r\n"," #모델에 가중치 로드\r\n"," model.load_weights(weights_path)\r\n"," elif weights is not None:\r\n"," model.load_weights(weights)\r\n","\r\n"," return model\r\n","\r\n","\r\n","@keras_export('keras.applications.vgg16.preprocess_input')\r\n","def preprocess_input(x, data_format=None):\r\n"," return imagenet_utils.preprocess_input(\r\n"," x, data_format=data_format, mode='caffe')\r\n","\r\n","\r\n","@keras_export('keras.applications.vgg16.decode_predictions')\r\n","def decode_predictions(preds, top=5):\r\n"," return imagenet_utils.decode_predictions(preds, top=top)\r\n","\r\n","\r\n","preprocess_input.__doc__ = imagenet_utils.PREPROCESS_INPUT_DOC.format(\r\n"," mode='',\r\n"," ret=imagenet_utils.PREPROCESS_INPUT_RET_DOC_CAFFE,\r\n"," error=imagenet_utils.PREPROCESS_INPUT_ERROR_DOC)\r\n","decode_predictions.__doc__ = imagenet_utils.decode_predictions.__doc__"],"execution_count":null,"outputs":[]}]}
--------------------------------------------------------------------------------
/[KDT] 인공지능 9주차 실습/[실습]08_TensorFlow_CNN(ResNet).ipynb:
--------------------------------------------------------------------------------
1 | {"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"name":"Tensorflow Tutorial(ResNet Model).ipynb","provenance":[],"collapsed_sections":[],"authorship_tag":"ABX9TyPb/Z7emNcjAZNz5nTtv2LR"},"kernelspec":{"name":"python3","display_name":"Python 3"}},"cells":[{"cell_type":"code","metadata":{"id":"MxA9bYBh18nW"},"source":["# Copyright 2015 The TensorFlow Authors. All Rights Reserved.\n","#\n","# Licensed under the Apache License, Version 2.0 (the \"License\");\n","# you may not use this file except in compliance with the License.\n","# You may obtain a copy of the License at\n","#\n","# http://www.apache.org/licenses/LICENSE-2.0\n","#\n","# Unless required by applicable law or agreed to in writing, software\n","# distributed under the License is distributed on an \"AS IS\" BASIS,\n","# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n","# See the License for the specific language governing permissions and\n","# limitations under the License.\n","# ==============================================================================\n","# pylint: disable=invalid-name\n","\"\"\"ResNet models for Keras.\n","Reference:\n"," - [Deep Residual Learning for Image Recognition](\n"," https://arxiv.org/abs/1512.03385) (CVPR 2015)\n","\"\"\"\n","from __future__ import absolute_import\n","from __future__ import division\n","from __future__ import print_function\n","\n","from tensorflow.python.keras import backend\n","from tensorflow.python.keras.applications import imagenet_utils\n","from tensorflow.python.keras.engine import training\n","from tensorflow.python.keras.layers import VersionAwareLayers\n","from tensorflow.python.keras.utils import data_utils\n","from tensorflow.python.keras.utils import layer_utils\n","from tensorflow.python.lib.io import file_io\n","from tensorflow.python.util.tf_export import keras_export\n","\n","\n","BASE_WEIGHTS_PATH = (\n"," 'https://storage.googleapis.com/tensorflow/keras-applications/resnet/')\n","WEIGHTS_HASHES = {\n"," 'resnet50': ('2cb95161c43110f7111970584f804107',\n"," '4d473c1dd8becc155b73f8504c6f6626')\n","}\n","\n","layers = None\n","\n","\n","def ResNet(stack_fn,\n"," preact,\n"," use_bias,\n"," model_name='resnet',\n"," include_top=True,\n"," weights='imagenet',\n"," input_tensor=None,\n"," input_shape=None,\n"," pooling=None,\n"," classes=1000,\n"," classifier_activation='softmax',\n"," **kwargs):\n"," \"\"\"Instantiates the ResNet, ResNetV2, and ResNeXt architecture.\n"," Reference:\n"," - [Deep Residual Learning for Image Recognition](\n"," https://arxiv.org/abs/1512.03385) (CVPR 2015)\n"," Optionally loads weights pre-trained on ImageNet.\n"," Note that the data format convention used by the model is\n"," the one specified in your Keras config at `~/.keras/keras.json`.\n"," Arguments:\n"," stack_fn: a function that returns output tensor for the\n"," stacked residual blocks.\n"," preact: whether to use pre-activation or not\n"," (True for ResNetV2, False for ResNet and ResNeXt).\n"," use_bias: whether to use biases for convolutional layers or not\n"," (True for ResNet and ResNetV2, False for ResNeXt).\n"," model_name: string, model name.\n"," include_top: whether to include the fully-connected\n"," layer at the top of the network.\n"," weights: one of `None` (random initialization),\n"," 'imagenet' (pre-training on ImageNet),\n"," or the path to the weights file to be loaded.\n"," input_tensor: optional Keras tensor\n"," (i.e. 
output of `layers.Input()`)\n"," to use as image input for the model.\n"," input_shape: optional shape tuple, only to be specified\n"," if `include_top` is False (otherwise the input shape\n"," has to be `(224, 224, 3)` (with `channels_last` data format)\n"," or `(3, 224, 224)` (with `channels_first` data format).\n"," It should have exactly 3 inputs channels.\n"," pooling: optional pooling mode for feature extraction\n"," when `include_top` is `False`.\n"," - `None` means that the output of the model will be\n"," the 4D tensor output of the\n"," last convolutional layer.\n"," - `avg` means that global average pooling\n"," will be applied to the output of the\n"," last convolutional layer, and thus\n"," the output of the model will be a 2D tensor.\n"," - `max` means that global max pooling will\n"," be applied.\n"," classes: optional number of classes to classify images\n"," into, only to be specified if `include_top` is True, and\n"," if no `weights` argument is specified.\n"," classifier_activation: A `str` or callable. The activation function to use\n"," on the \"top\" layer. Ignored unless `include_top=True`. Set\n"," `classifier_activation=None` to return the logits of the \"top\" layer.\n"," **kwargs: For backwards compatibility only.\n"," Returns:\n"," A `keras.Model` instance.\n"," Raises:\n"," ValueError: in case of invalid argument for `weights`,\n"," or invalid input shape.\n"," ValueError: if `classifier_activation` is not `softmax` or `None` when\n"," using a pretrained top layer.\n"," \"\"\"\n"," global layers\n"," if 'layers' in kwargs:\n"," layers = kwargs.pop('layers')\n"," else:\n"," layers = VersionAwareLayers()\n"," if kwargs:\n"," raise ValueError('Unknown argument(s): %s' % (kwargs,))\n"," if not (weights in {'imagenet', None} or file_io.file_exists_v2(weights)):\n"," raise ValueError('The `weights` argument should be either '\n"," '`None` (random initialization), `imagenet` '\n"," '(pre-training on ImageNet), '\n"," 'or the path to the weights file to be loaded.')\n","\n"," if weights == 'imagenet' and include_top and classes != 1000:\n"," raise ValueError('If using `weights` as `\"imagenet\"` with `include_top`'\n"," ' as true, `classes` should be 1000')\n","\n"," # Determine proper input shape\n"," input_shape = imagenet_utils.obtain_input_shape(\n"," input_shape,\n"," default_size=224,\n"," min_size=32,\n"," data_format=backend.image_data_format(),\n"," require_flatten=include_top,\n"," weights=weights)\n","\n"," if input_tensor is None:\n"," img_input = layers.Input(shape=input_shape)\n"," else:\n"," if not backend.is_keras_tensor(input_tensor):\n"," img_input = layers.Input(tensor=input_tensor, shape=input_shape)\n"," else:\n"," img_input = input_tensor\n","\n"," bn_axis = 3 if backend.image_data_format() == 'channels_last' else 1\n","\n"," x = layers.ZeroPadding2D(\n"," padding=((3, 3), (3, 3)), name='conv1_pad')(img_input)\n"," x = layers.Conv2D(64, 7, strides=2, use_bias=use_bias, name='conv1_conv')(x)\n","\n"," if not preact:\n"," x = layers.BatchNormalization(\n"," axis=bn_axis, epsilon=1.001e-5, name='conv1_bn')(x)\n"," x = layers.Activation('relu', name='conv1_relu')(x)\n","\n"," x = layers.ZeroPadding2D(padding=((1, 1), (1, 1)), name='pool1_pad')(x)\n"," x = layers.MaxPooling2D(3, strides=2, name='pool1_pool')(x)\n","\n"," x = stack_fn(x)\n","\n"," if preact:\n"," x = layers.BatchNormalization(\n"," axis=bn_axis, epsilon=1.001e-5, name='post_bn')(x)\n"," x = layers.Activation('relu', name='post_relu')(x)\n","\n"," if include_top:\n"," x = 
layers.GlobalAveragePooling2D(name='avg_pool')(x)\n"," imagenet_utils.validate_activation(classifier_activation, weights)\n"," x = layers.Dense(classes, activation=classifier_activation,\n"," name='predictions')(x)\n"," else:\n"," if pooling == 'avg':\n"," x = layers.GlobalAveragePooling2D(name='avg_pool')(x)\n"," elif pooling == 'max':\n"," x = layers.GlobalMaxPooling2D(name='max_pool')(x)\n","\n"," # Ensure that the model takes into account\n"," # any potential predecessors of `input_tensor`.\n"," if input_tensor is not None:\n"," inputs = layer_utils.get_source_inputs(input_tensor)\n"," else:\n"," inputs = img_input\n","\n"," # Create model.\n"," model = training.Model(inputs, x, name=model_name)\n","\n"," # Load weights.\n"," if (weights == 'imagenet') and (model_name in WEIGHTS_HASHES):\n"," if include_top:\n"," file_name = model_name + '_weights_tf_dim_ordering_tf_kernels.h5'\n"," file_hash = WEIGHTS_HASHES[model_name][0]\n"," else:\n"," file_name = model_name + '_weights_tf_dim_ordering_tf_kernels_notop.h5'\n"," file_hash = WEIGHTS_HASHES[model_name][1]\n"," weights_path = data_utils.get_file(\n"," file_name,\n"," BASE_WEIGHTS_PATH + file_name,\n"," cache_subdir='models',\n"," file_hash=file_hash)\n"," model.load_weights(weights_path)\n"," elif weights is not None:\n"," model.load_weights(weights)\n","\n"," return model\n","\n","\n","def block1(x, filters, kernel_size=3, stride=1, conv_shortcut=True, name=None):\n"," \"\"\"A residual block.\n"," Arguments:\n"," x: input tensor.\n"," filters: integer, filters of the bottleneck layer.\n"," kernel_size: default 3, kernel size of the bottleneck layer.\n"," stride: default 1, stride of the first layer.\n"," conv_shortcut: default True, use convolution shortcut if True,\n"," otherwise identity shortcut.\n"," name: string, block label.\n"," Returns:\n"," Output tensor for the residual block.\n"," \"\"\"\n"," bn_axis = 3 if backend.image_data_format() == 'channels_last' else 1\n","\n"," if conv_shortcut:\n"," shortcut = layers.Conv2D(\n"," 4 * filters, 1, strides=stride, name=name + '_0_conv')(x)\n"," shortcut = layers.BatchNormalization(\n"," axis=bn_axis, epsilon=1.001e-5, name=name + '_0_bn')(shortcut)\n"," else:\n"," shortcut = x\n","\n"," x = layers.Conv2D(filters, 1, strides=stride, name=name + '_1_conv')(x)\n"," x = layers.BatchNormalization(\n"," axis=bn_axis, epsilon=1.001e-5, name=name + '_1_bn')(x)\n"," x = layers.Activation('relu', name=name + '_1_relu')(x)\n","\n"," x = layers.Conv2D(\n"," filters, kernel_size, padding='SAME', name=name + '_2_conv')(x)\n"," x = layers.BatchNormalization(\n"," axis=bn_axis, epsilon=1.001e-5, name=name + '_2_bn')(x)\n"," x = layers.Activation('relu', name=name + '_2_relu')(x)\n","\n"," x = layers.Conv2D(4 * filters, 1, name=name + '_3_conv')(x)\n"," x = layers.BatchNormalization(\n"," axis=bn_axis, epsilon=1.001e-5, name=name + '_3_bn')(x)\n","\n"," x = layers.Add(name=name + '_add')([shortcut, x])\n"," x = layers.Activation('relu', name=name + '_out')(x)\n"," return x\n","\n","\n","def stack1(x, filters, blocks, stride1=2, name=None):\n"," \"\"\"A set of stacked residual blocks.\n"," Arguments:\n"," x: input tensor.\n"," filters: integer, filters of the bottleneck layer in a block.\n"," blocks: integer, blocks in the stacked blocks.\n"," stride1: default 2, stride of the first layer in the first block.\n"," name: string, stack label.\n"," Returns:\n"," Output tensor for the stacked blocks.\n"," \"\"\"\n"," x = block1(x, filters, stride=stride1, name=name + '_block1')\n"," for i in range(2, 
blocks + 1):\n"," x = block1(x, filters, conv_shortcut=False, name=name + '_block' + str(i))\n"," return x\n","\n","@keras_export('keras.applications.resnet50.ResNet50',\n"," 'keras.applications.resnet.ResNet50',\n"," 'keras.applications.ResNet50')\n","def ResNet50(include_top=True,\n"," weights='imagenet',\n"," input_tensor=None,\n"," input_shape=None,\n"," pooling=None,\n"," classes=1000,\n"," **kwargs):\n"," \"\"\"Instantiates the ResNet50 architecture.\"\"\"\n","\n"," def stack_fn(x):\n"," x = stack1(x, 64, 3, stride1=1, name='conv2')\n"," x = stack1(x, 128, 4, name='conv3')\n"," x = stack1(x, 256, 6, name='conv4')\n"," return stack1(x, 512, 3, name='conv5')\n","\n"," return ResNet(stack_fn, False, True, 'resnet50', include_top, weights,\n"," input_tensor, input_shape, pooling, classes, **kwargs)\n","\n","@keras_export('keras.applications.resnet50.preprocess_input',\n"," 'keras.applications.resnet.preprocess_input')\n","def preprocess_input(x, data_format=None):\n"," return imagenet_utils.preprocess_input(\n"," x, data_format=data_format, mode='caffe')\n","\n","\n","@keras_export('keras.applications.resnet50.decode_predictions',\n"," 'keras.applications.resnet.decode_predictions')\n","def decode_predictions(preds, top=5):\n"," return imagenet_utils.decode_predictions(preds, top=top)\n","\n","\n","preprocess_input.__doc__ = imagenet_utils.PREPROCESS_INPUT_DOC.format(\n"," mode='',\n"," ret=imagenet_utils.PREPROCESS_INPUT_RET_DOC_CAFFE,\n"," error=imagenet_utils.PREPROCESS_INPUT_ERROR_DOC)\n","decode_predictions.__doc__ = imagenet_utils.decode_predictions.__doc__\n","\n","DOC = \"\"\"\n"," Reference:\n"," - [Deep Residual Learning for Image Recognition](\n"," https://arxiv.org/abs/1512.03385) (CVPR 2015)\n"," Optionally loads weights pre-trained on ImageNet.\n"," Note that the data format convention used by the model is\n"," the one specified in your Keras config at `~/.keras/keras.json`.\n"," Note: each Keras Application expects a specific kind of input preprocessing.\n"," For ResNet, call `tf.keras.applications.resnet.preprocess_input` on your\n"," inputs before passing them to the model.\n"," Arguments:\n"," include_top: whether to include the fully-connected\n"," layer at the top of the network.\n"," weights: one of `None` (random initialization),\n"," 'imagenet' (pre-training on ImageNet),\n"," or the path to the weights file to be loaded.\n"," input_tensor: optional Keras tensor (i.e. output of `layers.Input()`)\n"," to use as image input for the model.\n"," input_shape: optional shape tuple, only to be specified\n"," if `include_top` is False (otherwise the input shape\n"," has to be `(224, 224, 3)` (with `'channels_last'` data format)\n"," or `(3, 224, 224)` (with `'channels_first'` data format).\n"," It should have exactly 3 inputs channels,\n"," and width and height should be no smaller than 32.\n"," E.g. 
`(200, 200, 3)` would be one valid value.\n"," pooling: Optional pooling mode for feature extraction\n"," when `include_top` is `False`.\n"," - `None` means that the output of the model will be\n"," the 4D tensor output of the\n"," last convolutional block.\n"," - `avg` means that global average pooling\n"," will be applied to the output of the\n"," last convolutional block, and thus\n"," the output of the model will be a 2D tensor.\n"," - `max` means that global max pooling will\n"," be applied.\n"," classes: optional number of classes to classify images\n"," into, only to be specified if `include_top` is True, and\n"," if no `weights` argument is specified.\n"," Returns:\n"," A Keras model instance.\n","\"\"\"\n","\n","setattr(ResNet50, '__doc__', ResNet50.__doc__ + DOC)"],"execution_count":null,"outputs":[]},{"cell_type":"code","metadata":{"id":"i_utNCX2vefp"},"source":[""],"execution_count":null,"outputs":[]}]}
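A sketch of using the ResNet50 function defined above as a feature extractor rather than a classifier: include_top=False drops the final dense layer and pooling='avg' returns one global-average-pooled vector per image, while weights=None skips the ImageNet download. The same call also works with the public tf.keras.applications.ResNet50; here it assumes the cell above (which relies on TensorFlow-internal keras modules) has been run:

feature_extractor = ResNet50(include_top=False,
                             weights=None,
                             input_shape=(224, 224, 3),
                             pooling='avg')
print(feature_extractor.output_shape)   # (None, 2048): one 2048-d feature vector per image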
--------------------------------------------------------------------------------
/country_wise_latest.csv:
--------------------------------------------------------------------------------
1 | Country/Region,Confirmed,Deaths,Recovered,Active,New cases,New deaths,New recovered,Deaths / 100 Cases,Recovered / 100 Cases,Deaths / 100 Recovered,Confirmed last week,1 week change,1 week % increase,WHO Region
2 | Afghanistan,36263,1269,25198,9796,106,10,18,3.5,69.49,5.04,35526,737,2.07,Eastern Mediterranean
3 | Albania,4880,144,2745,1991,117,6,63,2.95,56.25,5.25,4171,709,17.0,Europe
4 | Algeria,27973,1163,18837,7973,616,8,749,4.16,67.34,6.17,23691,4282,18.07,Africa
5 | Andorra,907,52,803,52,10,0,0,5.73,88.53,6.48,884,23,2.6,Europe
6 | Angola,950,41,242,667,18,1,0,4.32,25.47,16.94,749,201,26.84,Africa
7 | Antigua and Barbuda,86,3,65,18,4,0,5,3.49,75.58,4.62,76,10,13.16,Americas
8 | Argentina,167416,3059,72575,91782,4890,120,2057,1.83,43.35,4.21,130774,36642,28.02,Americas
9 | Armenia,37390,711,26665,10014,73,6,187,1.9,71.32,2.67,34981,2409,6.89,Europe
10 | Australia,15303,167,9311,5825,368,6,137,1.09,60.84,1.79,12428,2875,23.13,Western Pacific
11 | Austria,20558,713,18246,1599,86,1,37,3.47,88.75,3.91,19743,815,4.13,Europe
12 | Azerbaijan,30446,423,23242,6781,396,6,558,1.39,76.34,1.82,27890,2556,9.16,Europe
13 | Bahamas,382,11,91,280,40,0,0,2.88,23.82,12.09,174,208,119.54,Americas
14 | Bahrain,39482,141,36110,3231,351,1,421,0.36,91.46,0.39,36936,2546,6.89,Eastern Mediterranean
15 | Bangladesh,226225,2965,125683,97577,2772,37,1801,1.31,55.56,2.36,207453,18772,9.05,South-East Asia
16 | Barbados,110,7,94,9,0,0,0,6.36,85.45,7.45,106,4,3.77,Americas
17 | Belarus,67251,538,60492,6221,119,4,67,0.8,89.95,0.89,66213,1038,1.57,Europe
18 | Belgium,66428,9822,17452,39154,402,1,14,14.79,26.27,56.28,64094,2334,3.64,Europe
19 | Belize,48,2,26,20,0,0,0,4.17,54.17,7.69,40,8,20.0,Americas
20 | Benin,1770,35,1036,699,0,0,0,1.98,58.53,3.38,1602,168,10.49,Africa
21 | Bhutan,99,0,86,13,4,0,1,0.0,86.87,0.0,90,9,10.0,South-East Asia
22 | Bolivia,71181,2647,21478,47056,1752,64,309,3.72,30.17,12.32,60991,10190,16.71,Americas
23 | Bosnia and Herzegovina,10498,294,4930,5274,731,14,375,2.8,46.96,5.96,8479,2019,23.81,Europe
24 | Botswana,739,2,63,674,53,1,11,0.27,8.53,3.17,522,217,41.57,Africa
25 | Brazil,2442375,87618,1846641,508116,23284,614,33728,3.59,75.61,4.74,2118646,323729,15.28,Americas
26 | Brunei,141,3,138,0,0,0,0,2.13,97.87,2.17,141,0,0.0,Western Pacific
27 | Bulgaria,10621,347,5585,4689,194,7,230,3.27,52.58,6.21,8929,1692,18.95,Europe
28 | Burkina Faso,1100,53,926,121,14,0,6,4.82,84.18,5.72,1065,35,3.29,Africa
29 | Burma,350,6,292,52,0,0,2,1.71,83.43,2.05,341,9,2.64,South-East Asia
30 | Burundi,378,1,301,76,17,0,22,0.26,79.63,0.33,322,56,17.39,Africa
31 | Cabo Verde,2328,22,1550,756,21,0,103,0.95,66.58,1.42,2071,257,12.41,Africa
32 | Cambodia,226,0,147,79,1,0,4,0.0,65.04,0.0,171,55,32.16,Western Pacific
33 | Cameroon,17110,391,14539,2180,402,6,0,2.29,84.97,2.69,16157,953,5.9,Africa
34 | Canada,116458,8944,0,107514,682,11,0,7.68,0.0,inf,112925,3533,3.13,Americas
35 | Central African Republic,4599,59,1546,2994,0,0,0,1.28,33.62,3.82,4548,51,1.12,Africa
36 | Chad,922,75,810,37,7,0,0,8.13,87.85,9.26,889,33,3.71,Africa
37 | Chile,347923,9187,319954,18782,2133,75,1859,2.64,91.96,2.87,333029,14894,4.47,Americas
38 | China,86783,4656,78869,3258,213,4,7,5.37,90.88,5.9,85622,1161,1.36,Western Pacific
39 | Colombia,257101,8777,131161,117163,16306,508,11494,3.41,51.02,6.69,204005,53096,26.03,Americas
40 | Comoros,354,7,328,19,0,0,0,1.98,92.66,2.13,334,20,5.99,Africa
41 | Congo (Brazzaville),3200,54,829,2317,162,3,73,1.69,25.91,6.51,2851,349,12.24,Africa
42 | Congo (Kinshasa),8844,208,5700,2936,13,4,190,2.35,64.45,3.65,8443,401,4.75,Africa
43 | Costa Rica,15841,115,3824,11902,612,11,88,0.73,24.14,3.01,11534,4307,37.34,Americas
44 | Cote d'Ivoire,15655,96,10361,5198,59,0,183,0.61,66.18,0.93,14312,1343,9.38,Africa
45 | Croatia,4881,139,3936,806,24,3,70,2.85,80.64,3.53,4370,511,11.69,Europe
46 | Cuba,2532,87,2351,94,37,0,2,3.44,92.85,3.7,2446,86,3.52,Americas
47 | Cyprus,1060,19,852,189,3,0,0,1.79,80.38,2.23,1038,22,2.12,Europe
48 | Czechia,15516,373,11428,3715,192,2,0,2.4,73.65,3.26,14098,1418,10.06,Europe
49 | Denmark,13761,613,12605,543,109,0,77,4.45,91.6,4.86,13453,308,2.29,Europe
50 | Djibouti,5059,58,4977,24,9,0,11,1.15,98.38,1.17,5020,39,0.78,Eastern Mediterranean
51 | Dominica,18,0,18,0,0,0,0,0.0,100.0,0.0,18,0,0.0,Americas
52 | Dominican Republic,64156,1083,30204,32869,1248,20,1601,1.69,47.08,3.59,53956,10200,18.9,Americas
53 | Ecuador,81161,5532,34896,40733,467,17,0,6.82,43.0,15.85,74620,6541,8.77,Americas
54 | Egypt,92482,4652,34838,52992,420,46,1007,5.03,37.67,13.35,88402,4080,4.62,Eastern Mediterranean
55 | El Salvador,15035,408,7778,6849,405,8,130,2.71,51.73,5.25,12207,2828,23.17,Americas
56 | Equatorial Guinea,3071,51,842,2178,0,0,0,1.66,27.42,6.06,3071,0,0.0,Africa
57 | Eritrea,265,0,191,74,2,0,2,0.0,72.08,0.0,251,14,5.58,Africa
58 | Estonia,2034,69,1923,42,0,0,1,3.39,94.54,3.59,2021,13,0.64,Europe
59 | Eswatini,2316,34,1025,1257,109,2,39,1.47,44.26,3.32,1826,490,26.83,Africa
60 | Ethiopia,14547,228,6386,7933,579,5,170,1.57,43.9,3.57,10207,4340,42.52,Africa
61 | Fiji,27,0,18,9,0,0,0,0.0,66.67,0.0,27,0,0.0,Western Pacific
62 | Finland,7398,329,6920,149,5,0,0,4.45,93.54,4.75,7340,58,0.79,Europe
63 | France,220352,30212,81212,108928,2551,17,267,13.71,36.86,37.2,214023,6329,2.96,Europe
64 | Gabon,7189,49,4682,2458,205,0,219,0.68,65.13,1.05,6433,756,11.75,Africa
65 | Gambia,326,8,66,252,49,2,6,2.45,20.25,12.12,112,214,191.07,Africa
66 | Georgia,1137,16,922,199,6,0,2,1.41,81.09,1.74,1039,98,9.43,Europe
67 | Germany,207112,9125,190314,7673,445,1,259,4.41,91.89,4.79,203325,3787,1.86,Europe
68 | Ghana,33624,168,29801,3655,655,0,307,0.5,88.63,0.56,28430,5194,18.27,Africa
69 | Greece,4227,202,1374,2651,34,0,0,4.78,32.51,14.7,4012,215,5.36,Europe
70 | Greenland,14,0,13,1,1,0,0,0.0,92.86,0.0,13,1,7.69,Europe
71 | Grenada,23,0,23,0,0,0,0,0.0,100.0,0.0,23,0,0.0,Americas
72 | Guatemala,45309,1761,32455,11093,256,27,843,3.89,71.63,5.43,39039,6270,16.06,Americas
73 | Guinea,7055,45,6257,753,47,2,105,0.64,88.69,0.72,6590,465,7.06,Africa
74 | Guinea-Bissau,1954,26,803,1125,0,0,0,1.33,41.1,3.24,1949,5,0.26,Africa
75 | Guyana,389,20,181,188,19,0,0,5.14,46.53,11.05,337,52,15.43,Americas
76 | Haiti,7340,158,4365,2817,25,1,0,2.15,59.47,3.62,7053,287,4.07,Americas
77 | Holy See,12,0,12,0,0,0,0,0.0,100.0,0.0,12,0,0.0,Europe
78 | Honduras,39741,1166,5039,33536,465,50,117,2.93,12.68,23.14,34611,5130,14.82,Americas
79 | Hungary,4448,596,3329,523,13,0,0,13.4,74.84,17.9,4339,109,2.51,Europe
80 | Iceland,1854,10,1823,21,7,0,0,0.54,98.33,0.55,1839,15,0.82,Europe
81 | India,1480073,33408,951166,495499,44457,637,33598,2.26,64.26,3.51,1155338,324735,28.11,South-East Asia
82 | Indonesia,100303,4838,58173,37292,1525,57,1518,4.82,58.0,8.32,88214,12089,13.7,South-East Asia
83 | Iran,293606,15912,255144,22550,2434,212,1931,5.42,86.9,6.24,276202,17404,6.3,Eastern Mediterranean
84 | Iraq,112585,4458,77144,30983,2553,96,1927,3.96,68.52,5.78,94693,17892,18.89,Eastern Mediterranean
85 | Ireland,25892,1764,23364,764,11,0,0,6.81,90.24,7.55,25766,126,0.49,Europe
86 | Israel,63985,474,27133,36378,2029,4,108,0.74,42.41,1.75,52003,11982,23.04,Europe
87 | Italy,246286,35112,198593,12581,168,5,147,14.26,80.64,17.68,244624,1662,0.68,Europe
88 | Jamaica,853,10,714,129,11,0,0,1.17,83.7,1.4,809,44,5.44,Americas
89 | Japan,31142,998,21970,8174,594,0,364,3.2,70.55,4.54,25706,5436,21.15,Western Pacific
90 | Jordan,1176,11,1041,124,8,0,0,0.94,88.52,1.06,1223,-47,-3.84,Eastern Mediterranean
91 | Kazakhstan,84648,585,54404,29659,1526,0,1833,0.69,64.27,1.08,73468,11180,15.22,Europe
92 | Kenya,17975,285,7833,9857,372,5,90,1.59,43.58,3.64,13771,4204,30.53,Africa
93 | Kosovo,7413,185,4027,3201,496,16,274,2.5,54.32,4.59,5877,1536,26.14,Europe
94 | Kuwait,64379,438,55057,8884,606,5,684,0.68,85.52,0.8,59763,4616,7.72,Eastern Mediterranean
95 | Kyrgyzstan,33296,1301,21205,10790,483,24,817,3.91,63.69,6.14,27143,6153,22.67,Europe
96 | Laos,20,0,19,1,0,0,0,0.0,95.0,0.0,19,1,5.26,Western Pacific
97 | Latvia,1219,31,1045,143,0,0,0,2.54,85.73,2.97,1192,27,2.27,Europe
98 | Lebanon,3882,51,1709,2122,132,0,17,1.31,44.02,2.98,2905,977,33.63,Eastern Mediterranean
99 | Lesotho,505,12,128,365,0,0,0,2.38,25.35,9.38,359,146,40.67,Africa
100 | Liberia,1167,72,646,449,5,0,5,6.17,55.36,11.15,1107,60,5.42,Africa
101 | Libya,2827,64,577,2186,158,4,24,2.26,20.41,11.09,1980,847,42.78,Eastern Mediterranean
102 | Liechtenstein,86,1,81,4,0,0,0,1.16,94.19,1.23,86,0,0.0,Europe
103 | Lithuania,2019,80,1620,319,11,0,4,3.96,80.24,4.94,1947,72,3.7,Europe
104 | Luxembourg,6321,112,4825,1384,49,0,178,1.77,76.33,2.32,5639,682,12.09,Europe
105 | Madagascar,9690,91,6260,3339,395,6,681,0.94,64.6,1.45,7153,2537,35.47,Africa
106 | Malawi,3664,99,1645,1920,24,0,6,2.7,44.9,6.02,2992,672,22.46,Africa
107 | Malaysia,8904,124,8601,179,7,0,1,1.39,96.6,1.44,8800,104,1.18,Western Pacific
108 | Maldives,3369,15,2547,807,67,0,19,0.45,75.6,0.59,2999,370,12.34,South-East Asia
109 | Mali,2513,124,1913,476,3,1,2,4.93,76.12,6.48,2475,38,1.54,Africa
110 | Malta,701,9,665,27,1,0,0,1.28,94.86,1.35,677,24,3.55,Europe
111 | Mauritania,6208,156,4653,1399,37,0,223,2.51,74.95,3.35,5923,285,4.81,Africa
112 | Mauritius,344,10,332,2,0,0,0,2.91,96.51,3.01,343,1,0.29,Africa
113 | Mexico,395489,44022,303810,47657,4973,342,8588,11.13,76.82,14.49,349396,46093,13.19,Americas
114 | Moldova,23154,748,16154,6252,120,13,245,3.23,69.77,4.63,21115,2039,9.66,Europe
115 | Monaco,116,4,104,8,0,0,0,3.45,89.66,3.85,109,7,6.42,Europe
116 | Mongolia,289,0,222,67,1,0,4,0.0,76.82,0.0,287,2,0.7,Western Pacific
117 | Montenegro,2893,45,809,2039,94,2,70,1.56,27.96,5.56,2188,705,32.22,Europe
118 | Morocco,20887,316,16553,4018,609,3,115,1.51,79.25,1.91,17562,3325,18.93,Eastern Mediterranean
119 | Mozambique,1701,11,0,1690,32,0,0,0.65,0.0,inf,1507,194,12.87,Africa
120 | Namibia,1843,8,101,1734,68,0,26,0.43,5.48,7.92,1344,499,37.13,Africa
121 | Nepal,18752,48,13754,4950,139,3,626,0.26,73.35,0.35,17844,908,5.09,South-East Asia
122 | Netherlands,53413,6160,189,47064,419,1,0,11.53,0.35,3259.26,52132,1281,2.46,Europe
123 | New Zealand,1557,22,1514,21,1,0,1,1.41,97.24,1.45,1555,2,0.13,Western Pacific
124 | Nicaragua,3439,108,2492,839,0,0,0,3.14,72.46,4.33,3147,292,9.28,Americas
125 | Niger,1132,69,1027,36,0,0,0,6.1,90.72,6.72,1105,27,2.44,Africa
126 | Nigeria,41180,860,18203,22117,648,2,829,2.09,44.2,4.72,37225,3955,10.62,Africa
127 | North Macedonia,10213,466,5564,4183,127,6,137,4.56,54.48,8.38,9249,964,10.42,Europe
128 | Norway,9132,255,8752,125,15,0,0,2.79,95.84,2.91,9034,98,1.08,Europe
129 | Oman,77058,393,57028,19637,1053,9,1729,0.51,74.01,0.69,68400,8658,12.66,Eastern Mediterranean
130 | Pakistan,274289,5842,241026,27421,1176,20,3592,2.13,87.87,2.42,266096,8193,3.08,Eastern Mediterranean
131 | Panama,61442,1322,35086,25034,1146,28,955,2.15,57.1,3.77,54426,7016,12.89,Americas
132 | Papua New Guinea,62,0,11,51,0,0,0,0.0,17.74,0.0,19,43,226.32,Western Pacific
133 | Paraguay,4548,43,2905,1600,104,2,111,0.95,63.87,1.48,3748,800,21.34,Americas
134 | Peru,389717,18418,272547,98752,13756,575,4697,4.73,69.93,6.76,357681,32036,8.96,Americas
135 | Philippines,82040,1945,26446,53649,1592,13,336,2.37,32.24,7.35,68898,13142,19.07,Western Pacific
136 | Poland,43402,1676,32856,8870,337,5,103,3.86,75.7,5.1,40383,3019,7.48,Europe
137 | Portugal,50299,1719,35375,13205,135,2,158,3.42,70.33,4.86,48771,1528,3.13,Europe
138 | Qatar,109597,165,106328,3104,292,0,304,0.15,97.02,0.16,107037,2560,2.39,Eastern Mediterranean
139 | Romania,45902,2206,25794,17902,1104,19,151,4.81,56.19,8.55,38139,7763,20.35,Europe
140 | Russia,816680,13334,602249,201097,5607,85,3077,1.63,73.74,2.21,776212,40468,5.21,Europe
141 | Rwanda,1879,5,975,899,58,0,57,0.27,51.89,0.51,1629,250,15.35,Africa
142 | Saint Kitts and Nevis,17,0,15,2,0,0,0,0.0,88.24,0.0,17,0,0.0,Americas
143 | Saint Lucia,24,0,22,2,0,0,0,0.0,91.67,0.0,23,1,4.35,Americas
144 | Saint Vincent and the Grenadines,52,0,39,13,0,0,0,0.0,75.0,0.0,50,2,4.0,Americas
145 | San Marino,699,42,657,0,0,0,0,6.01,93.99,6.39,699,0,0.0,Europe
146 | Sao Tome and Principe,865,14,734,117,2,0,38,1.62,84.86,1.91,746,119,15.95,Africa
147 | Saudi Arabia,268934,2760,222936,43238,1993,27,2613,1.03,82.9,1.24,253349,15585,6.15,Eastern Mediterranean
148 | Senegal,9764,194,6477,3093,83,3,68,1.99,66.34,3.0,8948,816,9.12,Africa
149 | Serbia,24141,543,0,23598,411,9,0,2.25,0.0,inf,21253,2888,13.59,Europe
150 | Seychelles,114,0,39,75,0,0,0,0.0,34.21,0.0,108,6,5.56,Africa
151 | Sierra Leone,1783,66,1317,400,0,0,4,3.7,73.86,5.01,1711,72,4.21,Africa
152 | Singapore,50838,27,45692,5119,469,0,171,0.05,89.88,0.06,48035,2803,5.84,Western Pacific
153 | Slovakia,2181,28,1616,537,2,0,39,1.28,74.09,1.73,1980,201,10.15,Europe
154 | Slovenia,2087,116,1733,238,5,0,55,5.56,83.04,6.69,1953,134,6.86,Europe
155 | Somalia,3196,93,1543,1560,18,0,22,2.91,48.28,6.03,3130,66,2.11,Eastern Mediterranean
156 | South Africa,452529,7067,274925,170537,7096,298,9848,1.56,60.75,2.57,373628,78901,21.12,Africa
157 | South Korea,14203,300,13007,896,28,1,102,2.11,91.58,2.31,13816,387,2.8,Western Pacific
158 | South Sudan,2305,46,1175,1084,43,1,0,2.0,50.98,3.91,2211,94,4.25,Africa
159 | Spain,272421,28432,150376,93613,0,0,0,10.44,55.2,18.91,264836,7585,2.86,Europe
160 | Sri Lanka,2805,11,2121,673,23,0,15,0.39,75.61,0.52,2730,75,2.75,South-East Asia
161 | Sudan,11424,720,5939,4765,39,3,49,6.3,51.99,12.12,10992,432,3.93,Eastern Mediterranean
162 | Suriname,1483,24,925,534,44,1,35,1.62,62.37,2.59,1079,404,37.44,Americas
163 | Sweden,79395,5700,0,73695,398,3,0,7.18,0.0,inf,78048,1347,1.73,Europe
164 | Switzerland,34477,1978,30900,1599,65,1,200,5.74,89.62,6.4,33634,843,2.51,Europe
165 | Syria,674,40,0,634,24,2,0,5.93,0.0,inf,522,152,29.12,Eastern Mediterranean
166 | Taiwan*,462,7,440,15,4,0,0,1.52,95.24,1.59,451,11,2.44,Western Pacific
167 | Tajikistan,7235,60,6028,1147,43,1,58,0.83,83.32,1.0,6921,314,4.54,Europe
168 | Tanzania,509,21,183,305,0,0,0,4.13,35.95,11.48,509,0,0.0,Africa
169 | Thailand,3297,58,3111,128,6,0,2,1.76,94.36,1.86,3250,47,1.45,South-East Asia
170 | Timor-Leste,24,0,0,24,0,0,0,0.0,0.0,0.0,24,0,0.0,South-East Asia
171 | Togo,874,18,607,249,6,0,8,2.06,69.45,2.97,783,91,11.62,Africa
172 | Trinidad and Tobago,148,8,128,12,1,0,0,5.41,86.49,6.25,137,11,8.03,Americas
173 | Tunisia,1455,50,1157,248,3,0,15,3.44,79.52,4.32,1381,74,5.36,Eastern Mediterranean
174 | Turkey,227019,5630,210469,10920,919,17,982,2.48,92.71,2.67,220572,6447,2.92,Europe
175 | US,4290259,148011,1325804,2816444,56336,1076,27941,3.45,30.9,11.16,3834677,455582,11.88,Americas
176 | Uganda,1128,2,986,140,13,0,4,0.18,87.41,0.2,1069,59,5.52,Africa
177 | Ukraine,67096,1636,37202,28258,835,11,317,2.44,55.45,4.4,60767,6329,10.42,Europe
178 | United Arab Emirates,59177,345,52510,6322,264,1,328,0.58,88.73,0.66,57193,1984,3.47,Eastern Mediterranean
179 | United Kingdom,301708,45844,1437,254427,688,7,3,15.19,0.48,3190.26,296944,4764,1.6,Europe
180 | Uruguay,1202,35,951,216,10,1,3,2.91,79.12,3.68,1064,138,12.97,Americas
181 | Uzbekistan,21209,121,11674,9414,678,5,569,0.57,55.04,1.04,17149,4060,23.67,Europe
182 | Venezuela,15988,146,9959,5883,525,4,213,0.91,62.29,1.47,12334,3654,29.63,Americas
183 | Vietnam,431,0,365,66,11,0,0,0.0,84.69,0.0,384,47,12.24,Western Pacific
184 | West Bank and Gaza,10621,78,3752,6791,152,2,0,0.73,35.33,2.08,8916,1705,19.12,Eastern Mediterranean
185 | Western Sahara,10,1,8,1,0,0,0,10.0,80.0,12.5,10,0,0.0,Africa
186 | Yemen,1691,483,833,375,10,4,36,28.56,49.26,57.98,1619,72,4.45,Eastern Mediterranean
187 | Zambia,4552,140,2815,1597,71,1,465,3.08,61.84,4.97,3326,1226,36.86,Africa
188 | Zimbabwe,2704,36,542,2126,192,2,24,1.33,20.04,6.64,1713,991,57.85,Africa
189 |
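A minimal sketch of how this per-country snapshot might be loaded and summarized with pandas, assuming the file is available locally as country_wise_latest.csv; the column names are taken from the header row above, and the WHO Region grouping is just one illustrative aggregation:

import pandas as pd

# Load the per-country snapshot shown above.
df = pd.read_csv("country_wise_latest.csv")

# One illustrative aggregation: confirmed cases and deaths totalled per WHO Region.
summary = (
    df.groupby("WHO Region")[["Confirmed", "Deaths"]]
      .sum()
      .sort_values("Confirmed", ascending=False)
)
print(summary)

Note that a few rows (e.g. Canada, Sweden) report "inf" in the Deaths / 100 Recovered column because Recovered is 0; that column should be inspected before any numeric analysis.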
--------------------------------------------------------------------------------
/day_wise.csv:
--------------------------------------------------------------------------------
1 | Date,Confirmed,Deaths,Recovered,Active,New cases,New deaths,New recovered,Deaths / 100 Cases,Recovered / 100 Cases,Deaths / 100 Recovered,No. of countries
2 | 2020-01-22,555,17,28,510,0,0,0,3.06,5.05,60.71,6
3 | 2020-01-23,654,18,30,606,99,1,2,2.75,4.59,60.0,8
4 | 2020-01-24,941,26,36,879,287,8,6,2.76,3.83,72.22,9
5 | 2020-01-25,1434,42,39,1353,493,16,3,2.93,2.72,107.69,11
6 | 2020-01-26,2118,56,52,2010,684,14,13,2.64,2.46,107.69,13
7 | 2020-01-27,2927,82,61,2784,809,26,9,2.8,2.08,134.43,16
8 | 2020-01-28,5578,131,107,5340,2651,49,46,2.35,1.92,122.43,16
9 | 2020-01-29,6166,133,125,5908,588,2,18,2.16,2.03,106.4,18
10 | 2020-01-30,8234,171,141,7922,2068,38,16,2.08,1.71,121.28,20
11 | 2020-01-31,9927,213,219,9495,1693,42,78,2.15,2.21,97.26,24
12 | 2020-02-01,12038,259,281,11498,2111,46,62,2.15,2.33,92.17,25
13 | 2020-02-02,16787,362,459,15966,4749,103,178,2.16,2.73,78.87,25
14 | 2020-02-03,19887,426,604,18857,3100,64,145,2.14,3.04,70.53,25
15 | 2020-02-04,23898,492,821,22585,4011,66,217,2.06,3.44,59.93,26
16 | 2020-02-05,27643,564,1071,26008,3745,72,250,2.04,3.87,52.66,26
17 | 2020-02-06,30802,634,1418,28750,3159,70,347,2.06,4.6,44.71,26
18 | 2020-02-07,34334,719,1903,31712,3532,85,485,2.09,5.54,37.78,26
19 | 2020-02-08,37068,806,2470,33792,2734,87,567,2.17,6.66,32.63,26
20 | 2020-02-09,40095,906,3057,36132,3027,100,587,2.26,7.62,29.64,26
21 | 2020-02-10,42633,1013,3714,37906,2538,107,657,2.38,8.71,27.28,26
22 | 2020-02-11,44675,1113,4417,39145,2042,100,703,2.49,9.89,25.2,26
23 | 2020-02-12,46561,1118,4849,40594,1886,5,432,2.4,10.41,23.06,26
24 | 2020-02-13,60206,1371,5930,52905,13645,253,1081,2.28,9.85,23.12,26
25 | 2020-02-14,66690,1523,7613,57554,6484,152,1683,2.28,11.42,20.01,27
26 | 2020-02-15,68765,1666,8902,58197,2075,143,1289,2.42,12.95,18.71,27
27 | 2020-02-16,70879,1770,10319,58790,2114,104,1417,2.5,14.56,17.15,27
28 | 2020-02-17,72815,1868,11951,58996,1936,98,1632,2.57,16.41,15.63,27
29 | 2020-02-18,74609,2008,13693,58908,1794,140,1742,2.69,18.35,14.66,27
30 | 2020-02-19,75030,2123,15394,57513,421,115,1701,2.83,20.52,13.79,28
31 | 2020-02-20,75577,2246,17369,55962,547,123,1975,2.97,22.98,12.93,28
32 | 2020-02-21,76206,2250,17966,55990,629,4,597,2.95,23.58,12.52,30
33 | 2020-02-22,77967,2457,21849,53661,1761,207,3883,3.15,28.02,11.25,30
34 | 2020-02-23,78290,2467,22304,53519,323,10,455,3.15,28.49,11.06,31
35 | 2020-02-24,78854,2627,24047,52180,564,160,1743,3.33,30.5,10.92,36
36 | 2020-02-25,79707,2707,26652,50348,853,80,2605,3.4,33.44,10.16,41
37 | 2020-02-26,80670,2767,29077,48826,963,60,2425,3.43,36.04,9.52,47
38 | 2020-02-27,82034,2810,31919,47305,1364,43,2842,3.43,38.91,8.8,51
39 | 2020-02-28,83411,2867,35306,45238,1377,57,3387,3.44,42.33,8.12,57
40 | 2020-02-29,85306,2936,38314,44056,1895,69,3008,3.44,44.91,7.66,61
41 | 2020-03-01,87690,2990,41208,43492,2384,54,2894,3.41,46.99,7.26,66
42 | 2020-03-02,89664,3079,44085,42500,1974,89,2877,3.43,49.17,6.98,73
43 | 2020-03-03,92241,3154,46681,42406,2577,75,2596,3.42,50.61,6.76,76
44 | 2020-03-04,94540,3249,49619,41672,2299,95,2938,3.44,52.48,6.55,80
45 | 2020-03-05,97331,3342,52237,41752,2791,93,2618,3.43,53.67,6.4,84
46 | 2020-03-06,101274,3454,54270,43550,3943,112,2033,3.41,53.59,6.36,93
47 | 2020-03-07,105312,3553,56760,44999,4038,99,2490,3.37,53.9,6.26,94
48 | 2020-03-08,109266,3797,59092,46377,3954,244,2332,3.48,54.08,6.43,99
49 | 2020-03-09,113166,3981,60891,48294,3900,184,1799,3.52,53.81,6.54,102
50 | 2020-03-10,118190,4260,62802,51128,5024,279,1911,3.6,53.14,6.78,105
51 | 2020-03-11,125853,4604,65113,56136,7663,344,2311,3.66,51.74,7.07,111
52 | 2020-03-12,131603,4909,66434,60260,5750,305,1321,3.73,50.48,7.39,113
53 | 2020-03-13,146008,5406,68359,72243,14405,497,1925,3.7,46.82,7.91,120
54 | 2020-03-14,157114,5823,70729,80562,11106,417,2370,3.71,45.02,8.23,133
55 | 2020-03-15,168260,6464,74139,87657,11146,641,3410,3.84,44.06,8.72,137
56 | 2020-03-16,182919,7144,76192,99583,14659,680,2053,3.91,41.65,9.38,143
57 | 2020-03-17,198757,7948,78944,111865,15838,804,2752,4.0,39.72,10.07,146
58 | 2020-03-18,218343,8845,81427,128071,19586,897,2483,4.05,37.29,10.86,150
59 | 2020-03-19,246261,9951,83064,153246,27918,1106,1637,4.04,33.73,11.98,154
60 | 2020-03-20,275869,11429,85509,178931,29608,1478,2445,4.14,31.0,13.37,161
61 | 2020-03-21,308175,13134,89775,205266,32306,1705,4266,4.26,29.13,14.63,163
62 | 2020-03-22,341585,14831,95990,230764,33410,1697,6215,4.34,28.1,15.45,168
63 | 2020-03-23,383750,16748,96456,270546,42165,1917,466,4.36,25.14,17.36,169
64 | 2020-03-24,424889,19016,105997,299876,41154,2268,9541,4.48,24.95,17.94,171
65 | 2020-03-25,475706,21793,111445,342468,50817,2777,5448,4.58,23.43,19.55,174
66 | 2020-03-26,538666,24800,119804,394062,62960,3007,8359,4.6,22.24,20.7,175
67 | 2020-03-27,603066,28318,128508,446240,64400,3518,8704,4.7,21.31,22.04,176
68 | 2020-03-28,670723,31997,136800,501926,67657,3679,8292,4.77,20.4,23.39,176
69 | 2020-03-29,730300,35470,146261,548569,59577,3473,9461,4.86,20.03,24.25,176
70 | 2020-03-30,794939,39634,161707,593598,64639,4164,15446,4.99,20.34,24.51,177
71 | 2020-03-31,871355,44478,174074,652803,76416,4844,12367,5.1,19.98,25.55,179
72 | 2020-04-01,947569,50029,189434,708106,76214,5551,15360,5.28,19.99,26.41,179
73 | 2020-04-02,1028968,56334,206052,766582,81399,6305,16618,5.47,20.03,27.34,180
74 | 2020-04-03,1112123,62319,221060,828744,83155,5985,15008,5.6,19.88,28.19,180
75 | 2020-04-04,1192586,68160,241072,883354,80463,5841,20012,5.72,20.21,28.27,180
76 | 2020-04-05,1264304,73181,254477,936646,71718,5021,13405,5.79,20.13,28.76,182
77 | 2020-04-06,1336976,79013,270812,987151,72672,5832,16335,5.91,20.26,29.18,183
78 | 2020-04-07,1413849,86915,293665,1033269,76873,7902,22853,6.15,20.77,29.6,183
79 | 2020-04-08,1497624,93650,322017,1081957,83775,6735,28352,6.25,21.5,29.08,183
80 | 2020-04-09,1584249,101279,346349,1136621,86625,7629,24332,6.39,21.86,29.24,183
81 | 2020-04-10,1671907,108551,367477,1195879,87658,7272,21128,6.49,21.98,29.54,184
82 | 2020-04-11,1748872,114620,392991,1241261,76965,6069,25514,6.55,22.47,29.17,184
83 | 2020-04-12,1845653,120351,411864,1313438,96802,5731,18873,6.52,22.32,29.22,184
84 | 2020-04-13,1915247,126098,438395,1350754,69594,5747,26531,6.58,22.89,28.76,184
85 | 2020-04-14,1985174,132996,463014,1389164,69927,6898,24619,6.7,23.32,28.72,184
86 | 2020-04-15,2066003,141308,498925,1425770,80829,8312,35911,6.84,24.15,28.32,184
87 | 2020-04-16,2162715,148591,529015,1485109,96712,7283,30090,6.87,24.46,28.09,184
88 | 2020-04-17,2250439,157481,554287,1538671,87724,8890,25272,7.0,24.63,28.41,184
89 | 2020-04-18,2324396,163952,577789,1582655,73958,6471,23502,7.05,24.86,28.38,184
90 | 2020-04-19,2404919,168522,608557,1627840,80523,4570,30768,7.01,25.3,27.69,184
91 | 2020-04-20,2478258,173965,629862,1674431,73339,5443,21305,7.02,25.42,27.62,184
92 | 2020-04-21,2553508,181122,664043,1708343,75250,7157,34181,7.09,26.01,27.28,184
93 | 2020-04-22,2630314,187877,693207,1749230,78994,6755,29164,7.14,26.35,27.1,184
94 | 2020-04-23,2719327,194727,721689,1802911,89013,6850,28482,7.16,26.54,26.98,184
95 | 2020-04-24,2806267,201401,771329,1833537,96974,6674,49640,7.18,27.49,26.11,184
96 | 2020-04-25,2891199,206979,798239,1885981,84932,5578,26910,7.16,27.61,25.93,184
97 | 2020-04-26,2964146,210862,825969,1927315,72948,3883,27730,7.11,27.87,25.53,184
98 | 2020-04-27,3032850,215511,852382,1964957,68704,4649,26413,7.11,28.1,25.28,184
99 | 2020-04-28,3108149,221974,884680,2001495,75404,6463,32298,7.14,28.46,25.09,184
100 | 2020-04-29,3185195,228742,925752,2030701,79558,6768,41072,7.18,29.06,24.71,184
101 | 2020-04-30,3268876,234704,989616,2044556,83681,5962,63864,7.18,30.27,23.72,186
102 | 2020-05-01,3355922,239881,1026501,2089540,87046,5177,36885,7.15,30.59,23.37,186
103 | 2020-05-02,3437608,245206,1066362,2126040,81853,5325,39861,7.13,31.02,22.99,186
104 | 2020-05-03,3515244,248659,1097577,2169008,77636,3453,31215,7.07,31.22,22.66,186
105 | 2020-05-04,3591321,252787,1130526,2208008,76078,4128,32949,7.04,31.48,22.36,186
106 | 2020-05-05,3671310,258658,1166155,2246497,79989,5871,35629,7.05,31.76,22.18,186
107 | 2020-05-06,3761332,265327,1210894,2285111,90022,6669,44739,7.05,32.19,21.91,186
108 | 2020-05-07,3850418,270736,1249311,2330371,90669,5409,38417,7.03,32.45,21.67,186
109 | 2020-05-08,3941935,276304,1284849,2380782,92997,5568,35538,7.01,32.59,21.5,186
110 | 2020-05-09,4027781,280569,1337367,2409845,85846,4265,52518,6.97,33.2,20.98,186
111 | 2020-05-10,4104027,284135,1370108,2449784,76255,3566,32741,6.92,33.38,20.74,186
112 | 2020-05-11,4180268,287608,1416204,2476456,76298,3473,46096,6.88,33.88,20.31,186
113 | 2020-05-12,4263867,293155,1452191,2518521,83619,5547,35987,6.88,34.06,20.19,186
114 | 2020-05-13,4348619,298383,1506905,2543331,84917,5228,54714,6.86,34.65,19.8,187
115 | 2020-05-14,4445724,303651,1545712,2596361,97106,5268,38807,6.83,34.77,19.64,187
116 | 2020-05-15,4542073,308866,1592880,2640327,96349,5215,47168,6.8,35.07,19.39,187
117 | 2020-05-16,4637485,313037,1648546,2675902,95412,4171,55666,6.75,35.55,18.99,187
118 | 2020-05-17,4715994,316366,1688699,2710929,78509,3329,40153,6.71,35.81,18.73,187
119 | 2020-05-18,4804278,319657,1740909,2743712,88284,3291,52210,6.65,36.24,18.36,187
120 | 2020-05-19,4900702,324441,1792256,2784005,96633,4784,51347,6.62,36.57,18.1,187
121 | 2020-05-20,5003730,329326,1850441,2823963,103028,4885,58185,6.58,36.98,17.8,187
122 | 2020-05-21,5110064,334112,1900768,2875184,106438,4786,50327,6.54,37.2,17.58,187
123 | 2020-05-22,5216964,339396,2008541,2869027,106900,5284,107773,6.51,38.5,16.9,187
124 | 2020-05-23,5322253,343385,2062802,2916066,105289,3989,54261,6.45,38.76,16.65,187
125 | 2020-05-24,5417579,346525,2117555,2953499,95326,3140,54753,6.4,39.09,16.36,187
126 | 2020-05-25,5504542,347703,2180605,2976234,87335,1178,63050,6.32,39.61,15.95,187
127 | 2020-05-26,5597064,351906,2235118,3010040,92742,4203,54513,6.29,39.93,15.74,187
128 | 2020-05-27,5699664,357119,2297613,3044932,102600,5213,62495,6.27,40.31,15.54,187
129 | 2020-05-28,5818978,361820,2363746,3093412,119314,4701,66133,6.22,40.62,15.31,187
130 | 2020-05-29,5940145,366562,2440127,3133456,121167,4742,76381,6.17,41.08,15.02,187
131 | 2020-05-30,6077978,370718,2509981,3197279,137833,4156,69854,6.1,41.3,14.77,187
132 | 2020-05-31,6185530,373606,2585589,3226335,107552,2888,75608,6.04,41.8,14.45,187
133 | 2020-06-01,6280725,376674,2639599,3264452,95195,3068,54010,6.0,42.03,14.27,187
134 | 2020-06-02,6401536,381497,2743083,3276956,121577,4823,103484,5.96,42.85,13.91,187
135 | 2020-06-03,6520924,387069,2821430,3312425,119389,5572,78347,5.94,43.27,13.72,187
136 | 2020-06-04,6647861,392218,2890776,3364867,126937,5149,69346,5.9,43.48,13.57,187
137 | 2020-06-05,6778724,396994,2959037,3422693,130863,4776,68261,5.86,43.65,13.42,187
138 | 2020-06-06,6914666,400875,3030214,3483577,135942,3881,71177,5.8,43.82,13.23,187
139 | 2020-06-07,7026925,403617,3084718,3538590,112259,2742,54504,5.74,43.9,13.08,187
140 | 2020-06-08,7129150,407314,3235640,3486196,102225,3697,150922,5.71,45.39,12.59,187
141 | 2020-06-09,7253492,412236,3317121,3524135,124342,4922,81481,5.68,45.73,12.43,187
142 | 2020-06-10,7387517,417441,3395154,3574922,134025,5205,78033,5.65,45.96,12.3,187
143 | 2020-06-11,7525631,422215,3480121,3623295,138114,4774,84967,5.61,46.24,12.13,187
144 | 2020-06-12,7654725,426512,3558933,3669280,129094,4297,78812,5.57,46.49,11.98,187
145 | 2020-06-13,7790735,430750,3644048,3715937,136010,4238,85115,5.53,46.77,11.82,187
146 | 2020-06-14,7924156,434124,3714006,3776026,133421,3374,69958,5.48,46.87,11.69,187
147 | 2020-06-15,8043794,437549,3793406,3812839,119638,3425,79400,5.44,47.16,11.53,187
148 | 2020-06-16,8185197,444416,3890800,3849981,141403,6867,97394,5.43,47.53,11.42,187
149 | 2020-06-17,8327050,449671,4008201,3869178,141853,5255,117401,5.4,48.13,11.22,187
150 | 2020-06-18,8466978,454700,4088826,3923452,139928,5029,80625,5.37,48.29,11.12,187
151 | 2020-06-19,8647784,460973,4183298,4003513,180954,6273,94472,5.33,48.37,11.02,187
152 | 2020-06-20,8805336,465222,4298603,4041511,157552,4249,115305,5.28,48.82,10.82,187
153 | 2020-06-21,8933875,469185,4366875,4097815,128539,3963,68272,5.25,48.88,10.74,187
154 | 2020-06-22,9071733,472756,4458093,4140884,137858,3571,91218,5.21,49.14,10.6,187
155 | 2020-06-23,9237071,478067,4561696,4197308,165338,5311,103603,5.18,49.38,10.48,187
156 | 2020-06-24,9408254,483328,4677005,4247921,171183,5261,115309,5.14,49.71,10.33,187
157 | 2020-06-25,9586141,489955,4769458,4326728,177887,6627,92453,5.11,49.75,10.27,187
158 | 2020-06-26,9777487,494782,4875774,4406931,191346,4827,106316,5.06,49.87,10.15,187
159 | 2020-06-27,9955597,499268,4981808,4474521,178110,4486,106034,5.01,50.04,10.02,187
160 | 2020-06-28,10117227,502357,5070592,4544278,162349,3089,88784,4.97,50.12,9.91,187
161 | 2020-06-29,10275799,506078,5164494,4605227,158572,3721,93902,4.92,50.26,9.8,187
162 | 2020-06-30,10449697,511210,5281459,4657028,173898,5132,116965,4.89,50.54,9.68,187
163 | 2020-07-01,10667386,516221,5397083,4754082,217689,5011,115624,4.84,50.59,9.56,187
164 | 2020-07-02,10875091,521341,5681477,4672273,207705,5120,284394,4.79,52.24,9.18,187
165 | 2020-07-03,11078585,526336,5790942,4761307,203495,4995,109465,4.75,52.27,9.09,187
166 | 2020-07-04,11272152,530705,5986375,4755072,193567,4369,195433,4.71,53.11,8.87,187
167 | 2020-07-05,11454847,534150,6105546,4815151,182695,3445,119171,4.66,53.3,8.75,187
168 | 2020-07-06,11622190,537947,6228768,4855475,167343,3797,123222,4.63,53.59,8.64,187
169 | 2020-07-07,11833034,544054,6373513,4915467,210844,6107,144745,4.6,53.86,8.54,187
170 | 2020-07-08,12044836,549373,6531016,4964447,211802,5319,157503,4.56,54.22,8.41,187
171 | 2020-07-09,12273063,554831,6665237,5052995,228227,5458,134221,4.52,54.31,8.32,187
172 | 2020-07-10,12505640,560142,6804254,5141244,232577,5311,139017,4.48,54.41,8.23,187
173 | 2020-07-11,12721968,565039,6929711,5227218,216328,4897,125457,4.44,54.47,8.15,187
174 | 2020-07-12,12914636,568993,7041174,5304469,192668,3954,111463,4.41,54.52,8.08,187
175 | 2020-07-13,13107415,572808,7181139,5353468,192779,3815,139965,4.37,54.79,7.98,187
176 | 2020-07-14,13328867,578468,7322897,5427502,221452,5660,141758,4.34,54.94,7.9,187
177 | 2020-07-15,13559984,583961,7482320,5493703,231122,5493,159423,4.31,55.18,7.8,187
178 | 2020-07-16,13812525,589760,7634241,5588524,252544,5799,151921,4.27,55.27,7.73,187
179 | 2020-07-17,14054563,596503,7793760,5664300,242038,6743,159519,4.24,55.45,7.65,187
180 | 2020-07-18,14292198,602130,7944550,5745518,237635,5627,150790,4.21,55.59,7.58,187
181 | 2020-07-19,14506845,606159,8032235,5868451,214647,4029,87685,4.18,55.37,7.55,187
182 | 2020-07-20,14713623,610319,8190777,5912527,206778,4160,158542,4.15,55.67,7.45,187
183 | 2020-07-21,14947078,616557,8364986,5965535,233565,6238,174209,4.12,55.96,7.37,187
184 | 2020-07-22,15227725,623540,8541255,6062930,280647,6983,176269,4.09,56.09,7.3,187
185 | 2020-07-23,15510481,633506,8710969,6166006,282756,9966,169714,4.08,56.16,7.27,187
186 | 2020-07-24,15791645,639650,8939705,6212290,281164,6144,228736,4.05,56.61,7.16,187
187 | 2020-07-25,16047190,644517,9158743,6243930,255545,4867,219038,4.02,57.07,7.04,187
188 | 2020-07-26,16251796,648621,9293464,6309711,204606,4104,134721,3.99,57.18,6.98,187
189 | 2020-07-27,16480485,654036,9468087,6358362,228693,5415,174623,3.97,57.45,6.91,187
190 |
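A minimal sketch for reading this daily global time series with pandas, assuming the file is available locally as day_wise.csv; the Date and New cases column names are taken from the header row above:

import pandas as pd

# Read the daily global time series, indexing on the Date column.
day_wise = pd.read_csv("day_wise.csv", parse_dates=["Date"], index_col="Date")

# Raw daily new-case counts plus a 7-day rolling mean to smooth day-to-day noise.
new_cases = day_wise["New cases"]
print(new_cases.rolling(7).mean().tail())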
--------------------------------------------------------------------------------
/images/05.06-gaussian-basis.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/05.06-gaussian-basis.png
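The image files in this repository are represented here only by their raw GitHub URLs. A minimal sketch for fetching and inspecting one of them, assuming network access and that Pillow is installed (the URL is the gaussian-basis entry above):

from io import BytesIO
from urllib.request import urlopen

from PIL import Image

URL = ("https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/"
       "9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/05.06-gaussian-basis.png")

with urlopen(URL) as resp:
    img = Image.open(BytesIO(resp.read()))

print(img.size, img.mode)  # pixel dimensions and colour mode of the PNG

The same pattern applies to any of the image URLs listed below.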
--------------------------------------------------------------------------------
/images/ann/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/ann/activation_functions_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/ann/activation_functions_plot.png
--------------------------------------------------------------------------------
/images/ann/fashion_mnist_images_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/ann/fashion_mnist_images_plot.png
--------------------------------------------------------------------------------
/images/ann/fashion_mnist_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/ann/fashion_mnist_plot.png
--------------------------------------------------------------------------------
/images/ann/keras_learning_curves_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/ann/keras_learning_curves_plot.png
--------------------------------------------------------------------------------
/images/ann/perceptron_iris_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/ann/perceptron_iris_plot.png
--------------------------------------------------------------------------------
/images/autoencoder.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/autoencoder.png
--------------------------------------------------------------------------------
/images/autoencoders/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/classification/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/classification/confusion_matrix_errors_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/classification/confusion_matrix_errors_plot.png
--------------------------------------------------------------------------------
/images/classification/confusion_matrix_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/classification/confusion_matrix_plot.png
--------------------------------------------------------------------------------
/images/classification/more_digits_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/classification/more_digits_plot.png
--------------------------------------------------------------------------------
/images/classification/precision_recall_vs_threshold_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/classification/precision_recall_vs_threshold_plot.png
--------------------------------------------------------------------------------
/images/classification/precision_vs_recall_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/classification/precision_vs_recall_plot.png
--------------------------------------------------------------------------------
/images/classification/some_digit_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/classification/some_digit_plot.png
--------------------------------------------------------------------------------
/images/cnn/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/cnn/china_horizontal.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/cnn/china_horizontal.png
--------------------------------------------------------------------------------
/images/cnn/china_max_pooling.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/cnn/china_max_pooling.png
--------------------------------------------------------------------------------
/images/cnn/china_original.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/cnn/china_original.png
--------------------------------------------------------------------------------
/images/cnn/china_vertical.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/cnn/china_vertical.png
--------------------------------------------------------------------------------
/images/cnn/test_image.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/cnn/test_image.png
--------------------------------------------------------------------------------
/images/decision_tree.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/decision_tree.png
--------------------------------------------------------------------------------
/images/decision_trees/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/deep/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/deep/elu_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/deep/elu_plot.png
--------------------------------------------------------------------------------
/images/deep/leaky_relu_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/deep/leaky_relu_plot.png
--------------------------------------------------------------------------------
/images/deep/selu_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/deep/selu_plot.png
--------------------------------------------------------------------------------
/images/deep/sigmoid_saturation_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/deep/sigmoid_saturation_plot.png
--------------------------------------------------------------------------------
/images/distributed/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/end_to_end_project/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/end_to_end_project/attribute_histogram_plots.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/end_to_end_project/attribute_histogram_plots.png
--------------------------------------------------------------------------------
/images/end_to_end_project/bad_visualization_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/end_to_end_project/bad_visualization_plot.png
--------------------------------------------------------------------------------
/images/end_to_end_project/better_visualization_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/end_to_end_project/better_visualization_plot.png
--------------------------------------------------------------------------------
/images/end_to_end_project/california.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/end_to_end_project/california.png
--------------------------------------------------------------------------------
/images/end_to_end_project/california_housing_prices_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/end_to_end_project/california_housing_prices_plot.png
--------------------------------------------------------------------------------
/images/end_to_end_project/housing_prices_scatterplot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/end_to_end_project/housing_prices_scatterplot.png
--------------------------------------------------------------------------------
/images/end_to_end_project/income_vs_house_value_scatterplot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/end_to_end_project/income_vs_house_value_scatterplot.png
--------------------------------------------------------------------------------
/images/end_to_end_project/scatter_matrix_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/end_to_end_project/scatter_matrix_plot.png
--------------------------------------------------------------------------------
/images/ensembles/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/fig2-1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/fig2-1.png
--------------------------------------------------------------------------------
/images/fig2-14.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/fig2-14.png
--------------------------------------------------------------------------------
/images/fig2-17.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/fig2-17.png
--------------------------------------------------------------------------------
/images/fig2-2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/fig2-2.png
--------------------------------------------------------------------------------
/images/fig2-3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/fig2-3.png
--------------------------------------------------------------------------------
/images/fig2-4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/fig2-4.png
--------------------------------------------------------------------------------
/images/fig3-2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/fig3-2.png
--------------------------------------------------------------------------------
/images/fig3-3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/fig3-3.png
--------------------------------------------------------------------------------
/images/fig_det.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/fig_det.png
--------------------------------------------------------------------------------
/images/fundamentals/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/kfolds.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/kfolds.png
--------------------------------------------------------------------------------
/images/nlp/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/random_forest.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/random_forest.png
--------------------------------------------------------------------------------
/images/rl/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/rl/breakout.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/rl/breakout.gif
--------------------------------------------------------------------------------
/images/rnn/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/svm/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/tensorflow/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/training_linear_models/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/training_linear_models/generated_data_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/generated_data_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/gradient_descent_paths_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/gradient_descent_paths_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/gradient_descent_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/gradient_descent_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/high_degree_polynomials_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/high_degree_polynomials_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/learning_curves_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/learning_curves_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/linear_model_predictions_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/linear_model_predictions_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/logistic_function_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/logistic_function_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/quadratic_data_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/quadratic_data_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/quadratic_predictions_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/quadratic_predictions_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/ridge_regression_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/ridge_regression_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/sgd_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/sgd_plot.png
--------------------------------------------------------------------------------
/images/training_linear_models/underfitting_learning_curves_plot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/training_linear_models/underfitting_learning_curves_plot.png
--------------------------------------------------------------------------------
/images/unsupervised_learning/README:
--------------------------------------------------------------------------------
1 | Images generated by the notebooks
2 |
--------------------------------------------------------------------------------
/images/unsupervised_learning/ladybug.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/learn-programmers/programmers_kdt_II/9947a6d88a2d5ba44b73f622d72dc0f9ab9168d9/images/unsupervised_learning/ladybug.png
--------------------------------------------------------------------------------