├── .gitignore
├── README.md
├── cover.png
├── notebooks
│   ├── 01_normal.ipynb
│   ├── 02_mle.ipynb
│   ├── 03_multi_normal.ipynb
│   ├── 04_gmm.ipynb
│   ├── 05_em.ipynb
│   ├── 06_pytorch.ipynb
│   ├── 07_vae.ipynb
│   ├── 08_hvae.ipynb
│   ├── 09_diffusion.ipynb
│   ├── 10_diffusion2.ipynb
│   ├── flower.png
│   └── old_faithful.txt
├── posters
│   ├── 1부. 시야를 찾아서.png
│   ├── 2부. 상어공주.png
│   ├── 3부. DeZero의 창조자.png
│   ├── 4부. 제발, 가즈아!.png
│   ├── 5부. 피쉬카소와 천재의 초상.png
│   ├── README.md
│   └── 바닷속 딥러닝 어드벤처.png
├── requirements.txt
├── step01
│   ├── norm_dist.py
│   ├── norm_param.py
│   ├── sample_avg.py
│   └── sample_sum.py
├── step02
│   ├── fit.py
│   ├── generate.py
│   ├── height.txt
│   ├── hist.py
│   └── prob.py
├── step03
│   ├── height_weight.txt
│   ├── mle.py
│   ├── numpy_basis.py
│   ├── numpy_matrix.py
│   ├── plot_3d.py
│   ├── plot_dataset.py
│   └── plot_norm.py
├── step04
│   ├── gmm.py
│   ├── gmm_sampling.py
│   ├── old_faithful.py
│   └── old_faithful.txt
├── step05
│   ├── em.py
│   ├── generate.py
│   └── old_faithful.txt
├── step06
│   ├── gradient.py
│   ├── neuralnet.py
│   ├── regression.py
│   ├── tensor.py
│   └── vision.py
├── step07
│   └── vae.py
├── step08
│   └── hvae.py
├── step09
│   ├── diffusion_model.py
│   ├── flower.png
│   ├── gaussian_noise.py
│   └── simple_unet.py
└── step10
    ├── classifier_free_guidance.py
    └── conditional.py
/.gitignore:
--------------------------------------------------------------------------------
1 | .DS_Store
2 | *~
3 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# 『밑바닥부터 시작하는 딥러닝 ❺』 (Deep Learning from Scratch ❺): From the Normal Distribution to Generative Models!



**The principles of image generation models, explained in 10 steps!**

Starting from basic concepts such as the normal distribution and maximum likelihood estimation, this book explains a range of generative models, from Gaussian mixture models and variational autoencoders (VAEs) to hierarchical VAEs and diffusion models. It works through the equations and algorithms carefully and shows how to implement them in practice, combining mathematical theory with Python programming, so you can learn generative models clearly through hands-on practice as well as theory. In particular, the ten steps leading up to diffusion models are woven into a single story in which the key techniques connect to and build on one another. Start generative models from scratch with this book.

[Preview](https://www.yes24.com/Product/Viewer/Preview/134648807) | [Known errata](https://docs.google.com/document/d/1SU7b_emm3Lqha4BfVLTr4Ae6eTg32BkKFWMEXl6N_vA) | [Collected images of the book's figures and equations](https://drive.google.com/file/d/1bMxCjB_SJzc7oJ913QT6Yn9sn3fjsymn/view?usp=drive_link)

## File Structure

|Folder |Description |
|:-- |:-- |
|`step01` |Code used in Chapter 1 |
|`step02` |Code used in Chapter 2 |
|... |... |
|`step10` |Code used in Chapter 10 |
|`notebooks` |Code for Chapters 1-10 (Jupyter Notebook format)|


## Jupyter Notebooks

The book's code is also available as Jupyter notebooks. Click the links in the table below to run each notebook on the corresponding cloud service.

| Step | Colab | Kaggle | Studio Lab |
| :--- | :--- | :--- | :--- |
| 1. Normal Distribution | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/01_normal.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/01_normal.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/01_normal.ipynb) |
| 2. Maximum Likelihood Estimation | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/02_mle.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/02_mle.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/02_mle.ipynb) |
| 3. Multivariate Normal Distribution | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/03_multi_normal.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/03_multi_normal.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/03_multi_normal.ipynb) |
| 4. Gaussian Mixture Models | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/04_gmm.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/04_gmm.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/04_gmm.ipynb) |
| 5. EM Algorithm | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/05_em.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/05_em.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/05_em.ipynb) |
| 6. Neural Networks | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/06_pytorch.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/06_pytorch.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/06_pytorch.ipynb) |
| 7. Variational Autoencoder | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/07_vae.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/07_vae.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/07_vae.ipynb) |
| 8. Diffusion Model Theory | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/08_hvae.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/08_hvae.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/08_hvae.ipynb) |
| 9. Diffusion Model Implementation | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/09_diffusion.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/09_diffusion.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/09_diffusion.ipynb) |
| 10. Diffusion Model Applications | [Open](https://colab.research.google.com/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/10_diffusion2.ipynb) | [Open](https://kaggle.com/kernels/welcome?src=https://github.com/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/10_diffusion2.ipynb) | [Open](https://studiolab.sagemaker.aws/import/github/oreilly-japan/deep-learning-from-scratch-5/blob/master/notebooks/10_diffusion2.ipynb) |


## Python and External Libraries

The following libraries are required to run the source code:

* NumPy
* Matplotlib
* PyTorch (version 2.x)
* torchvision
* tqdm

※ Python version 3.x is used.
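
As a quick sanity check, the short script below prints the installed versions; it is a minimal sketch that assumes the packages are installed under their usual import names (`torch` for PyTorch).

```
# Print the versions of the libraries used by the book's code.
import matplotlib
import numpy
import torch
import torchvision
import tqdm

for module in (numpy, matplotlib, torch, torchvision, tqdm):
    print(f"{module.__name__}: {module.__version__}")
```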


## How to Run

Move into the folder for each chapter and run the scripts with the `python` command:

```
$ cd step01
$ python norm_dist.py

$ cd ../step02
$ python generate.py
```
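
For a sense of what these scripts do, here is a minimal illustrative sketch in the spirit of Step 1 (the normal distribution); it is an assumption for illustration only, not the repository's actual `step01/norm_dist.py`:

```
# Illustrative sketch: plot the density of the standard normal distribution.
# The real norm_dist.py in this repository may differ.
import numpy as np
import matplotlib.pyplot as plt

def normal(x, mu=0.0, sigma=1.0):
    # Gaussian probability density function
    return 1 / (np.sqrt(2 * np.pi) * sigma) * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

x = np.linspace(-5, 5, 100)
plt.plot(x, normal(x))
plt.xlabel('x')
plt.ylabel('p(x)')
plt.show()
```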

---

## Fan Fiction: 바닷속 딥러닝 어드벤처 (Deep Learning Adventures Under the Sea, a 5-part series)



"How are the fish protagonists of <밑바닥부터 시작하는 딥러닝> (Deep Learning from Scratch) transforming the undersea ecosystem with deep learning? Set off on an exciting adventure with the series' fish, who lead the cutting edge of 'fish-tificial intelligence.'"

Set in an undersea world, this serial story follows marine creatures as they solve problems by developing deep learning techniques suited to their own traits and needs. Deep learning references are tucked in throughout, so readers of the book series will get extra enjoyment out of it.

The protagonist and theme of each part are as follows.

1. **시야를 찾아서 (In Search of Sight)**: A scorpionfish (쏨) develops **image processing** technology to see its surroundings clearly
1. **상어공주 (The Shark Princess)**: The bullhead shark princess (꽹) wins the love of the dolphin prince with **natural language processing**
1. **DeZero의 창조자 (The Creator of DeZero)**: A leafy seadragon (잎룡) builds a **deep learning framework** to spread the technology and foster collaboration
1. **제발, 가즈아! (Please, Let's Go!)**: A flounder (가즈아) pioneers new, food-rich waters with **deep reinforcement learning**
1. **피쉬카소와 천재의 초상 (Fishcasso and the Portrait of a Genius)**: A ghost pipefish (피쉬카소) revolutionizes the undersea art world with **image generation models**

Read the stories

---

## License

The source code in this repository is released under the [MIT License](http://www.opensource.org/licenses/MIT). You are free to use it for both commercial and non-commercial purposes.

--------------------------------------------------------------------------------
/cover.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/WegraLee/deep-learning-from-scratch-5/86be5ee971cf3cfe3795262f1ee07ce2322e9c9a/cover.png
--------------------------------------------------------------------------------
/notebooks/06_pytorch.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Step 6: Neural Network & PyTorch"
8 | ]
9 | },
10 | {
11 | "cell_type": "markdown",
12 | "metadata": {},
13 | "source": [
14 | "## tensor.py"
15 | ]
16 | },
17 | {
18 | "cell_type": "code",
19 | "execution_count": 1,
20 | "metadata": {},
21 | "outputs": [
22 | {
23 | "name": "stdout",
24 | "output_type": "stream",
25 | "text": [
26 | "tensor(75., grad_fn=)\n",
27 | "tensor(30.)\n"
28 | ]
29 | }
30 | ],
31 | "source": [
32 | "import torch\n",
33 | "\n",
34 | "x = torch.tensor(5.0, requires_grad=True)\n",
35 | "y = 3 * x ** 2\n",
36 | "print(y)\n",
37 | "\n",
38 | "y.backward()\n",
39 | "print(x.grad)"
40 | ]
41 | },
42 | {
43 | "cell_type": "markdown",
44 | "metadata": {},
45 | "source": [
46 | "## gradient.py"
47 | ]
48 | },
49 | {
50 | "cell_type": "code",
51 | "execution_count": 2,
52 | "metadata": {},
53 | "outputs": [
54 | {
55 | "name": "stdout",
56 | "output_type": "stream",
57 | "text": [
58 | "tensor(-2.) tensor(400.)\n",
59 | "0.0 2.0\n",
60 | "0.6815015077590942 0.46292299032211304\n",
61 | "0.8253857493400574 0.6804871559143066\n",
62 | "0.8942827582359314 0.7992911338806152\n",
63 | "0.9331904053688049 0.8705660700798035\n",
64 | "0.9568046927452087 0.9152978658676147\n",
65 | "0.9716982245445251 0.9440822601318359\n",
66 | "0.9813036918640137 0.9628812670707703\n",
67 | "0.98758465051651 0.9752733111381531\n",
68 | "0.9917276501655579 0.983490526676178\n",
69 | "0.9944759607315063 0.9889602065086365\n"
70 | ]
71 | }
72 | ],
73 | "source": [
74 | "import torch\n",
75 | "\n",
76 | "def rosenbrock(x0, x1):\n",
77 | " y = 100 * (x1 - x0 ** 2) ** 2 + (x0 - 1) ** 2\n",
78 | " return y\n",
79 | "\n",
80 | "x0 = torch.tensor(0.0, requires_grad=True)\n",
81 | "x1 = torch.tensor(2.0, requires_grad=True)\n",
82 | "\n",
83 | "y = rosenbrock(x0, x1)\n",
84 | "y.backward()\n",
85 | "print(x0.grad, x1.grad)\n",
86 | "\n",
87 | "lr = 0.001 # learning rate\n",
88 | "iters = 10000 # iteration count\n",
89 | "\n",
90 | "for i in range(iters):\n",
91 | " if i % 1000 == 0:\n",
92 | " print(x0.item(), x1.item())\n",
93 | "\n",
94 | " y = rosenbrock(x0, x1)\n",
95 | "\n",
96 | " y.backward()\n",
97 | "\n",
98 | " x0.data -= lr * x0.grad.data\n",
99 | " x1.data -= lr * x1.grad.data\n",
100 | "\n",
101 | " x0.grad.zero_()\n",
102 | " x1.grad.zero_()\n",
103 | "\n",
104 | "print(x0.item(), x1.item())"
105 | ]
106 | },
107 | {
108 | "cell_type": "markdown",
109 | "metadata": {},
110 | "source": [
111 | "## regression.py"
112 | ]
113 | },
114 | {
115 | "cell_type": "code",
116 | "execution_count": 3,
117 | "metadata": {},
118 | "outputs": [
119 | {
120 | "name": "stdout",
121 | "output_type": "stream",
122 | "text": [
123 | "41.89796447753906\n",
124 | "0.22483204305171967\n",
125 | "0.0925208106637001\n",
126 | "0.0888015553355217\n",
127 | "0.08627457916736603\n",
128 | "0.08435674756765366\n",
129 | "0.0829005315899849\n",
130 | "0.0817948430776596\n",
131 | "0.08095530420541763\n",
132 | "0.08031783998012543\n",
133 | "0.07987643033266068\n",
134 | "====\n",
135 | "W = 2.2863590717315674\n",
136 | "b = 5.3144850730896\n"
137 | ]
138 | },
139 | {
140 | "data": {
141 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAGwCAYAAABVdURTAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAABGQElEQVR4nO3deZzNdd/H8ffYhsqMCWNGxi5kX1KW0kJuXC1aL0rWylJCFFp0UZYWtCGytSprriJbaUORISRLZKgoNWYQM8z87j++l8kwM+bMnHN+y3k9H4953Nf3zO+c8z3nmuv+vX2+W5hlWZYAAAA8ooDdHQAAAPAnwg0AAPAUwg0AAPAUwg0AAPAUwg0AAPAUwg0AAPAUwg0AAPCUQnZ3INjS09P166+/qnjx4goLC7O7OwAAIBcsy9KRI0dUtmxZFSiQc20m5MLNr7/+qri4OLu7AQAA8mDfvn0qV65cjteEXLgpXry4JPPlRERE2NwbAACQG8nJyYqLi8u4j+ck5MLN6aGoiIgIwg0AAC6TmyklTCgGAACeQrgBAACeQrgBAACeQrgBAACeQrgBAACeQrgBAACeQrgBAACeQrgBAACeQrgBAACeQrgBAACeQrgBAACeEnJnSwEAEGriExK159AxVSp1oRqUj7K7OwFHuAEAwMPGLNmmyZ/vzmj3allZQ9rWtLFHgcewFAAAHhWfkJgp2EjS5M93Kz4h0aYeBQfhBgAAj9pz6JhPj3sF4QYAAI+qVOpCnx73CsINAAAe1aB8lHq1rJzpsd4tK3t+UjETigEA8LAhbWuqTa0YVksBAADvaFA+KiRCzWkMSwEAAE8h3AAAAE8h3AAAAE9hzg0AADYItSMRgolwAwBAkIXikQjBxLAUAABBFKpHIgQT4QYAgCDy/JEIn38u/fyzrV0g3AAAEESePRLh2DGpXz/pmmukHj2k9HTbukK4AQAgiDx5JMKXX0r16kmvvGLaVapIqam2dYcJxQAABJlnjkT4+2/p8cell16SLEsqV06aNk264QZbu0W4AQDABq4/EmH1aqlrV2nnTtPu3l0aN06KjLS1WxLDUgAAwBfHj0uDBkktWphgU7astHixqdg4INhIVG4AAEBurV1rqjXbt5t2167S+PFSiRI2dupcVG4AAEDOTpyQHntMat7cBJvYWOmjj6QZMxwXbCQqNwAAICfr1kldukjbtpl2585mAnGUc+cLUbkBAADnSkmRhg2TrrzSBJsyZaSFC6U333R0sJGo3AAAgLOtX2/m02zdatqdOkkvvyyVLGlrt3KLyg0AAC4Tn5Co+Rv2+/88qtRU6cknTbVm61YpOlqaP1965x3XBBvJ5nBTsWJFhYWFnfPTt2/fLK+fOXPmOdcWLVo0yL0GAMA+Y5ZsU4eJqzXwg03qMHG1xizZ5p8Xjo+XLr9ceuYZKS1NuusuE3A6dPDP6weRrcNS69atU1paWkZ7y5Ytat26te64445snxMREaHtp5egSQoLCwtoHwEAcIrsThRvUysm7xsCpqZKo0ZJzz4rnTollSolTZok3X67H3psD1vDTenSpTO1x4wZoypVqqhly5bZPicsLEwxMTG5fo+UlBSlpKRktJOTk33vKAAAfhCfkJivIxdyOlE8T+Fm0yYzt2bjRtO+7TZp4kQzHOVijplzk5qaqrffflvdu3fPsRpz9OhRVahQQXFxcbr55pu19fRkp2yMHj1akZGRGT9xcXH+7joAAOflj+Ekv50ofvKkNHKk1LixCTYlS0qzZ0tz5rg+2EgOCjcLFy7U4cOH1bVr12yvqV69uqZPn64PP/xQb7/9ttLT09WsWTPt378/2+cMHTpUSUlJGT/79u0LQO8BAG4WsAm6Z7x+VsNJvr5fbk8Uz/HzbNliJgw/9ZQZhurQwcytuesuySNTPcIsy7Ls7oQktWnTRkWKFNF///vfXD/n5MmTqlmzpjp27KiRI0fm6jnJycmKjIxUUlKSIiIi8tpdAIBHjFmyLVPw6NWysoa0renX95i/Yb8GfrDpnMfH3VlPtzYs5/Pr5TS8le3nOXVKeu456emnTeUmKkp69VWpY0dXhBpf7t+O2Odm7969WrFihebPn+/T8woXLqwGDRpo165dAeoZAMDLAjJBNwt+G076n+xOFM/u89xc+LBqDu1n9q+RpBtvlF5/3Ryj4EGOGJaaMWOGoqOj1b59e5+el5aWps2bNyvWo//lAAACK6cJuv6U2+Gk/Dq73wXT09Rr7Vxd2u4aE2xKlDA7DH/4oWeDjeSAyk16erpmzJihLl26qFChzN259957dckll2j06NGSpBEjRujKK69U1apVdfjwYT3//PPau3evevbsaUfXAQAul9uKSn5XOUnSkLY11aZWTL5fJydn9rvKoX16cfF41f9th3mgfXtpyhSpbFm/v6/T2B5uVqxYoYSEBHXv3v2c3yUkJKhAgX+KS4mJibrvvvt04MABRUVFqVGjRlq9erUuu+yyYHYZAOARpysqZw7lnF1R8eecnOyGk/ylQfko9W5RQadeHK9BX76l8LSTOnFhcRV99WVz+KUL5tb4g2MmFAcLE4oBAGfLrjITn5CoDhNXn3P9gj7NAhpS8mzHDrNvzZo1kqTkltcr4u2ZUjnfJy1nxR8VrLxy3YRiAADslF1Fxe+b5gVKWpo52HLYMOnECal4cWnCBEV06+a3ak0wVpX5iyMmFAMA4ET+XuUUEDt3Si1bSgMHmmDTurXZy6Z7d78FG3/t0xMshBsAALIRrFVOeZKeLr30klSvnvT119JFF5nl3UuXSuXL+/WtgrWqzF8YlgIAIAfBWOXks59+MpWZL74w7euvl6ZNkypUCMjbuaKCdQYqNwAAnEeD8lG6tWE5+4NNerr02mtS3bom2Fx4oTnBe/nygAUbyeEVrCxQuQEAwA327DHVmlWrTPuaa6Tp06VKlYLy9o6sYGWDcAMAcA07lyLbJj3dzKUZPFg6dky64AJp7FipTx+pQHAHYAK9T4+/EG4AAK7gpqXIfrN3r9Sjh7RypWlfdZU0Y4ZUpYq9/XI45twAABzPbUuR882ypKlTpdq1TbApVsysjFq1SqpSRfEJiZq/Yb93P38+UbkBADhKVkNPrtlMzx/27ZN69pSWLTPt5s1NtaZaNUkhWsHyEeEGAJBrgZ7zkt2N221LkfPEsswE4YEDpeRkqWhR6dlnpYcflgoWlJR9BatNrRjvhbx8INwAAHIl0BWD8924z3fApavt3y/dd5/0ySemfeWV0syZUvXqmS4LqQpWPhBuAADnFYyKwflu3G5aipxrliXNmiX17y8lJUnh4dIzz0gDBmRUa84UEhUsP2BCMQDgvIKx/X5ubtyO2UzPH379VbrxRqlbNxNsmjSR4uOlQYOyDDZS7jbTY7IxlRsAQC4Eo2Lg+aGn0yxLevttqV8/6fBhqUgRacQI6ZFHpELnvy3nVMFisrFBuAEAnFcgg8eZk5Q9OfR0pgMHpAcekBYtMu1GjcywVK1aPr1MVpvpMdn4H4QbAECuBCJ4ZFd
p8NzN2LKk996THnpI+usvqXBh6emnpUcfzVW1JjeYbPwPwg0AINf8uf1+yFQaDh6UeveWFiww7YYNzUqoOnX8+jZMNv4HE4oBALYIxiRl273/vhlyWrDAVGhGjJDWrvV7sJF8n2zs5YnHVG4AALbwdKXhjz/MwZZz55p2vXpmbk29egF9W18mG5/JaxOPqdwAAM4RjH/V56bS4Epz55pqzdy5plozfLj07bcBDzanZbVcPqshwDN57ZwuKjcAgEyCuZzYU6ujDh2SHnzQDEVJZuhp5kwzx8ZmuRnq89LEYyo3AIAMdpy+7YmN+RYsMNWa9983G/A98YS0fr0jgo2Uu6E+TwwH/g/hBgCQISQm+frTn39Kd98t3Xqr9PvvJuCsXSuNHGk253OIrIYAz+SJ4cAzMCwFAMjg6Um+/rZokdmQ78ABqUAB6bHHzPya8HC7e5als4cAJXljODALhBsAQIaQOQIhPxITpYcflt56y7Rr1DAroZo0sbdfuXD2PkVe/e+VcAMAyMRTk3z97aOPpPvvl377zVRrBg2S/vMfqWhRu3uGMxBuAADn8OdOxJ5w+LDUv7+p0EhS9epmJdSVV9rYKWSHCcUAAORkyRKpdm0TbMLCzOnd8fEEGwejcgMAQFaSkkyQmTbNtKtVk2bMkJo3t7dfOC8qNwAAnG3ZMlOtmTbNVGv695c2biTYuASVGwAATktONpOEp0417SpVTLXmqqvs7Rd8QuUGAABJWrnSHJlwOtg89JC0aRPBxoWo3AAAQtvRo9Kjj0qTJpl2pUrS9OnSNdfY2i3kHZUbAEDo+uwzU605HWz69JG+/55g43JUbgAAoefYMWnIEOnVV027QgVTrbnuOnv7Bb8g3ABAgMUnJLLbr5N88YXUrZu0+39HTDzwgPT881Lx4vb2C35DuAGAABqzZFumc5p6taysIW1r2tijEPb339KwYdJLL5l2XJxZ6t26tb39gt8x5wYAAiQ+ITFTsJGkyZ/vVnxCok09CmFffSXVq/dPsOnZU9qyhWDjUYQbAAiQPYeO+fQ4AuD4cbPL8NVXS7t26Xh0jHa9Occs946IsLt3CBDCDQAESKVSF/r0OPxszRqpfn1p3DjJsvRBnVa6ouMEtdpaTGOWbLO7dwggwg0ABEiD8lHq1bJypsd6t6zMpOJAO35cGjxYatFC2rFDqWVi1fX24Xq0XX8lF71IEsODXseEYgAIoCFta6pNrRhWSwXLN99IXbtKP/5o2vfeq6XdBmvVJ3vPuXTPoWP89+FRhBsACLAG5aO4iQZaSor09NPSc89J6elSTIw0ZYp0440ql5Ao6dxww/CgdzEsBQBwt3XrpIYNpTFjTLC55x5p61bF12uh+Rv2SxLDgyGGyg0AwJ1SUqQRI6SxY6W0NCk6Wnr9demWW7LcX2hBn2YMD4YIwg0AwH02bJC6dDF71UjSv/8tvfKKVKpUtvsLtakVo1sblrOhs6HBSTtxE24AAJk46SZ1jtRU6ZlnpFGjTLWmdGlz6OVtt2VcktP+Qo77PB7htJ24CTcAgAxOu0llsnGjWQm1aZNp33mnOfiydOlMl7G/UHDlVCmzK0wyoRgAICkwx0XEJyRq/ob9+dtT5uRJ6T//kS6/3ASbkiWl9983P2cFG4n9hYLNiTtxU7kBAEjy/3COX6pA339vqjXx8aZ9663SxIlSmTI5Po39hYLHiZUyKjcAAEn+vUnluwp08qSZW9O4sQk2F18svfeeNHfueYPNaQ3KR+nWhuUINgHmxEoZlRsAIcvRE2dtcPomdWYoyetNKl9VoC1bTLXmu+9M++abpcmTzcZ8cCSnVcoINwBCkqMnztrIXzepPFWBTp2Snn/e7DScmipFRZnl3Z06SWFheeoHgsdJO3EzLAUg5ARi4qyX+GM4x+ehih9+kJo1k4YNM8HmX/+Stm6V7r6bYAOf2RpuKlasqLCwsHN++vbtm+1z5syZoxo1aqho0aKqU6eOFi9eHMQeA/ACJ67u8KIhbWtqQZ9mGndnPS3o00yPZVUZS0sz50E1bGiOUYiMlGbNkhYtkmJjg99peIKtw1Lr1q1TWlpaRnvLli1q3bq17rjjjiyvX716tTp27KjRo0frX//6l959913dcsst2rBhg2rXrh2sbgNwOV+GTAIxLyeU5vrkOFTx449St27S2rWm3a6dOezykkuC10F4UphlWZbdnTitf//++uijj7Rz506FZVGGvOuuu3Ts2DF99NFHGY9deeWVql+/viZPnpzla6akpCglJSWjnZycrLi4OCUlJSkiIsL/HwKAK5w956Z3y8rnVBYCMS+HuT4y1ZoJE6THHzfnQ0VEmHbXrgxBIVvJycmKjIzM1f3bMROKU1NT9fbbb2vgwIFZBhtJWrNmjQYOHJjpsTZt2mjhwoXZvu7o0aP1n//8x59dBeAB55s4G4hdV524k2vQ7dhhqjWrV5t2mzbS1KlSXJy9/YKnOGZC8cKFC3X48GF17do122sOHDigMmftb1CmTBkdOHAg2+cMHTpUSUlJGT/79u3zV5cBuFxOE2cDMS8npOf6pKeb6ky9eibYFC9uQs2SJXkONn7Z/Rie5JjKzbRp09S2bVuVLVvWr68bHh6u8PBwv74mAOfx9zyWQOy66sSdXINi1y6pe3fpyy9Nu1Urado0qXz5PL8kw3vIiSMqN3v37tWKFSvUs2fPHK+LiYnRwYMHMz128OBBxbCxExDSxizZpg4TV2vgB5vUYeJqjVmyLd+vGYhdV524k2tApaebfWrq1jXB5qKLzGZ8y5blK9iwlB/n44jKzYwZMxQdHa327dvneF3Tpk21cuVK9e/fP+Ox5cuXq2nTpgHuIQCnCuQ8lkDsuuq0nVwDZvduU635/HPTvu46U62pWDHfL+3vM7DgPbaHm/T0dM2YMUNdunRRoUKZu3Pvvffqkksu0ejRoyVJDz/8sFq2bKkXX3xR7du31+zZs7V+/XpNmTLFjq4DcIBA3+gCseuqk3Zy9bv0dFOdefRR6dgx6YILzK7DvXpJBfwzWBCyw3vINduHpVasWKGEhAR17979nN8lJCTot99+y2g3a9ZM7777rqZMmaJ69epp7ty5WrhwIXvcACGMG52D/Pyz1Lq11LevCTYtW0qbN0t9+vgt2EghOLwHnzlqn5tg8GWdPAB3yM2eNU7iuU38LMtsvjdokHT0qFSsmDR2rAk5fgw1Z/Pc94gc+XL/JtwA8AS33Oh8XeXj+M+1d6/Us6e0YoVpt2ghzZghVa1qb7/g/L8dH7lyEz8AyA+757Hk5kbi6+RnRy93tiwzQXjgQOnIEVOtGTVK6tcvoNUa5I6j/3aCgHADAPmU2xuJL5OfHb2b8f79plqzdKlpN2tmqjWXXmpvvzwkP1UXR//tBAnxGgDywZc9V3yZ/OyU3Ywz7QJsWSbE1Kplgk14uPTCC9IXXxBs/Ci/+zY55W/HTlRuACAffKnGnF7lc/bk56z+Ne2EVWBnVqTKHDmkd9ZNV9V1X5hfXnGFNHOmVKNG0PoTCvxRdXHC347dCDcAkA++3kiy28
Tv7GEIX4JQIGTcZC1Lt235VMNXTlFEyjGlh4erwIgR0iOPSAULBqUvocQf+zbZ/bfjBIQbAMiHvNxIzp78nN2cHTt3M95z6JhKH/1Loz95Ra1+WidJ2hhbTX+8PFmtb78uaP0INf6quoTMTtjZYCk4APhBXieAxickqsPE1ec8vqBPM/tuSJaln1+eqhJDHlGJE0eVUrCQJrS4W1Oa3Kq5D14VcjdKX+V3Cbbb9m0KFpaCA0CQ5XUpuuPOSTpwQOrVSxU//FCS9H1MVQ1q1187SlfM89CG1/ZbyYk/lmCHetXFHwg3AGAjx0z+tCxp9mzpwQelv/6SCheWnnpKaR3vV6+kVL9VIby834o/l2DbvW+T27EUHABs5Ihzkn7/Xbr9dqlTJxNs6teX1q+XnnhCDapE69aG5fJcscntMnkvYAm2c1C5AQCb5XUYwi/DPXPmmIMtDx2SChWSnnxSGjrUVG7yyXFDbgHmmCocCDcA4AS+DkPke7jnjz/MwZZz5ph23brSrFmmauMnoXazZwm2cxBuAMBl8j23Y948qXdvE3AKFpSGDZOeeEIqUsSv/QzFmz2TgZ2BcAMALpPn4Z4//zQThmfPNu3atc0uw40a+b+T/xOKN3smA9uPcAMALpOn4Z4PP5QeeEA6eNBUa4YMMfNrwsPz1Adf5vtws0ewEW4AwGV8Gu756y+pXz/pnXdM+7LLTLXm8svz/P4D3o/XgvhfM9peXt4Nd2KHYgBwqfNWT/77X+n++83GfAUKSI8+Kg0fLhUtmuf3PDvYnGbrjsoICexQDAAhINvhnsREqX9/6c03TbtGDVOtueKKfL1ffEJilsFG8u7ybrgTm/gBgJcsXmwmCr/5phQWJg0eLG3YkO9gI+W8GZ1Xl3fDnajcAIAXJCVJAwZIM2aY9qWXmv/crJnf3iK7AHNrg7JUbeAoVG4AwMXiExL11atvK7XmZSbMhIVJAwdKGzf6NdhI0tKtB8557NYGZTXurgZ+fR8gv6jcAIBLjZu3TrEjnlDH75dJkv4qW14Xv/+O1KKF398rq40DJalz04p+fy8gv6jcAIAL7Xpnvu7q1j4j2ExvdJOa/Xuc4svXCsj7cSgk3ITKDQC4yZEj0uDBqvr665KkvSVi9Gjbh/VN+TqSArdqKdTOiYK7UbkBALf49FOpTh3pf8FmZsN/6f+6vZoRbKTAhY3TGweeyevnRMG9qNwAgNMdPSo99pg0caJpV6woTZ+uAydidDyIh1KG4jlRcCd2KAYAJ/v8c6lbN2nPHtPu3Vt67jnpoosk+XbGE+Bm7FAMAG537Jg0dKj0yiumXb68NG2a1KpVpsu8diglYQ3+QLgBAKf58ktTrfnpJ9O+/37p+eclj1ebxyzZlmm5OQdyIq+YUAwATvH332aX4ZYtTbCJi5OWLjUTiD0ebLLaR2fy57sVn5BoU4/gZoQbAHCC1aul+vWlCRMky5J69JA2b5ZuuMHungUF++jAnwg3AGCn48elQYPMrsI7d0qXXGIOv3zjDSky0u7eBQ376MCfCDcAYJe1a6UGDaQXXzTVmq5dpS1bpLZt7e5Z0Nmxj058QqLmb9ifMfR1dhvuxYRiAAi2Eyek4cOlF16Q0tOl2Fhp6lSpfXu7e2arYO6jc/bk5fpxkdq4LymjzWRmdyPcAEAwffutqdBs22banTtLL70kRbHsWQrO0vasJi+fGWwkM5m5Ta0YlqO7FMNSABAMKSnSsGFS06Ym2JQpI334ofTmmwSbIMvtJGUmM7sXlRsACLT16021ZutW0+7USXr5ZalkSTats0FuJykzmdm9CDcAECipqdLIkdLo0VJamhQdLU2eLHXoIIlN6+xyevJyTnNuOBTU3Qg3ABAIGzaYas3mzaZ9113Sq69KpUpJyn7TOuZ5BEdWk5eponkH4QYA/Ck1VRo1Snr2WenUKRNmJk2Sbr8902U5bVrHjTU4zp687LVzukIZ4QYA/GXTJlOt2bjRtG+7TZo40QxHnYVN64DAYbUUgGyxqVkunTxp5tY0bmyCTcmS0uzZ0pw5WQYbyZ5N64BQQeUGQJaY7JpLmzebas2GDabdoYMZhipT5rxPDeamdUAooXID4Byc0JwLp06ZeTWNGplgExUlvfOONG9eroLNaQ3KR+nWhuUINoAfEW4AnIMTms9j61azGd8TT5ghqZtuMo916iSFhdndOyDkEW4AnIPJrtk4dUoaO1Zq2NBszFeihPTWW9LCheZ8KACOQLgBcA4mu2Zh2zapeXNpyBCz3Lt9e1OtueceqjWAwzChGECWmOz6P2lp0rhx0pNPmvOhIiOlCROkLl0INYBDEW4AZCvkNzXbvl3q1k1as8a0/+//pKlTpXLl7O0XgBwxLAUAZ0tLk8aPl+rXN8EmIkKaNk1avJhgA7gAlRsAONPOnaZa8/XXpt26tfTGG1L58vb2C0CuUbkBAElKT5deekmqV88Em4sukl5/XVq6lGADuAyVGwD46Sepe3fpiy9M+/rrzTBUhQr29gtAnlC5AeAzz5w5lZ4uvfqqVLeuCTYXXmiOTli+nGADuJjt4eaXX37RPffco5IlS6pYsWKqU6eO1q9fn+31q1atUlhY2Dk/Bw4cCGKvgdA1Zsk2dZi4WgM/2KQOE1drzJJtdncpb/bsMRWahx6S/v5buvZac05Ur14s8QZcztZhqcTERDVv3lzXXnutlixZotKlS2vnzp2Kijr/0tPt27crIiIiox2dzcm7APwnuzOn2tSKcc+S8fR0M5dm8GDp2DHpgguk556TeveWCtj+7z0AfmBruBk7dqzi4uI0Y8aMjMcqVaqUq+dGR0erRIkS570uJSVFKSkpGe3k5GSf+wnAyOnMKVeEm717pR49pJUrTfuqq6QZM6QqVeztFwC/svWfKYsWLVLjxo11xx13KDo6Wg0aNNDUqVNz9dz69esrNjZWrVu31tenl2xmYfTo0YqMjMz4iYuL81f3gZDj2jOnLEuaMkWqXdsEm2LFzMqoVasINoAH2Rpudu/erUmTJqlatWpaunSpevfurX79+mnWrFnZPic2NlaTJ0/WvHnzNG/ePMXFxemaa67Rhg0bsrx+6NChSkpKyvjZt29foD4O4HmuPHNq3z6pTRvpgQeko0fN+VCbNkn9+jEM5QPPTCJHSAizLMuy682LFCmixo0ba/Xq1RmP9evXT+vWrdOa09ud50LLli1Vvnx5vfXWW+e9Njk5WZGRkUpKSso0ZwdA7sUnJDr/zCnLkqZPlwYOlJKTpaJFpVGjTKgpWNDu3rnKmCXbMs216tWysoa0rWljjxCKfLl/2/rPltjYWF122WWZHqtZs6YSEhJ8ep0mTZpo165d/uwagBw0KB+lWxuWc26w2b9fatdO6tnTBJumTaWNG6UBAwg2PspuEjkVHDiZreGmefPm2r59e6bHduzYoQo+7i+xceNGxcbG+rNrANzIsqSZM83cmk8+kcLDpeefl778Uqpe3e7euVJOk8gBp7J1tdSAAQPUrFkzjRo1Snfeeae+/fZbTZkyRVOmTMm4ZujQofrll1/05
ptvSpImTJigSpUqqVatWjpx4oTeeOMNffrpp1q2bJldHwOAE/z6q3T//dLHH5t2kyYm6NRk+CQ/XDuJHCHN1srN5ZdfrgULFui9995T7dq1NXLkSE2YMEF33313xjW//fZbpmGq1NRUPfLII6pTp45atmypTZs2acWKFbr++uvt+AgA7GZZ0ltvSbVqmWBTpIg0Zow5H4pgk2+unESOkGfrhGI7MKEYweKKSbdud+CAWQW1aJFpN25sqjW1atnaLS/i7xl28+X+7fOwVJcuXdSjRw9dffXVee4g4HWsLgkwy5Lee88cnfDXX1LhwtLTT0uPPioV4jzgQGhQPopQA9fweVgqKSlJrVq1UrVq1TRq1Cj98ssvgegX4FqsLgmwgwel226T7r7bBJuGDaXvvpOGDSPYAJCUh3CzcOFC/fLLL+rdu7fef/99VaxYUW3bttXcuXN18uTJQPQRcBVWlwSIZUnvv2+GnBYsMEFmxAhp7VqpTh27ewfAQfI0obh06dIaOHCgNm3apG+++UZVq1ZV586dVbZsWQ0YMEA7d+70dz8B12B1SQD88Yd0553Sv/8t/fmnVL++tH699OSTZkgKAM6Qr9VSv/32m5YvX67ly5erYMGCateunTZv3qzLLrtM48eP91cfAVdhdYmfzZ1rqjVz55pqzfDh0jffSPXq2d0zAA7l82qpkydPatGiRZoxY4aWLVumunXrqmfPnurUqVPG7OUFCxaoe/fuSkx03hwDVkshWFhdcv7vIMffHzokPfigGYqSzNDTrFlSgwZB6DkApwnoaqnY2Filp6erY8eO+vbbb1W/fv1zrrn22mtVokQJX18a8JRQX11yvhVjOf5+wQKpVy/p99/NcQlDh5ohqCJFgtZ/AO7lc7gZP3687rjjDhUtWjTba0qUKKE9e/bkq2MA3Cu7FWNtasWoQfmobH/frmwR1R37pFnmLZnhqJkzzf418BnVQ4Qqn8NN586dA9EPAB6S04qxBuWjsvx9q53f6NLru0l//iEVKCA99piZXxMeHujuehJ7LSGUsSkEAL8734qxM38fceKohq94Xbdt/cw8ULOmqdY0aRLobnrW+SpngNfZerYUAG8634qx07+/bte3Wj6tj27b+pnSCxQwOwxv2ECwySf2WkKoo3IDICCGtK2pNrVisp7zcfiwhrw/Vpo3S5J0oko1FX37TenKK23qrbew1xJCHZUbAAHToHyUbm1YLnOwWbJEql3bLOsOC5MeeURFN28i2PgRey0h1FG5ARAcSUnSI49I06aZdrVq0owZUvPm9vbLo3KsnAEeR7gBEHjLlkk9ekj795tqTf/+0jPPSBdcEPSuhNLy6FDfawmhi3CDkBdKN7ugS06WBg2Spk417SpVTLXmqqts6U5+l0fztwK4A+EGIY29QAJoxQpTrUlIMO2HHpJGj5YutGdSa36XR/O3ArgHE4phm/iERM3fsF/xCfacQZbdzc6u/njGkSNS795S69Ym2FSqJH32mfTyy7YFGyl/y6P5WwHchcoNbOGEfwWfbxdd5MFnn0ndu0s//2zafftKY8ZIF11ka7ek/C2P5m8FcBcqNwg6p/wrmL1A/OjYMTPsdN11JthUqCCtXCm9+qojgo2Uv+XR/K0A7kLlBkHnlH8Fn77ZnRm02AskD774QurWTdr9v+/xgQek55+Xihe3t19ZyOvyaP5WAHch3CDonPSvYPYCyYdjx6Rhw8xcGkmKizN72LRubW+/ziOvy6P5WwHcg3CDoHPav4LZCyQPvvrKVGt27TLtnj2lF1+UIiLs7VeA8bcCuAPhBrbgX8Eudfy49Pjj0oQJkmVJ5cpJb7whtWljd88AIAPhBrbhX8Eus2aN1LWrtGOHaXfvLo0bJ0VG2totADgbq6UA5Oz4cWnwYKlFCxNsypaVPv7YzK8h2ABwICo3ALL3zTemWvPjj6Z9771mSCqKihsA56JyA+BcJ05IQ4ZIzZqZYBMTIy1aJM2alRFs7N5hGgCyQ+UGQGbr1plqzQ8/mPY990gvvSRdfHHGJU7YYRoAskPlBoCRkmJWQjVtaoJNdLS0YIH01luZgo1TdpgGgOwQbgBIGzZIjRtLo0ZJaWnSv/8tbd0q3XLLOZfm5wBKAAgGwg3gB66df5KaKj31lNSkibRli1S6tDR3rvTee1KpUlk+xUk7TANAVphzA9eKT0h0xCaArp1/snGj1KWL9P33pn3nneagy9Klc3ya03aYBoCzEW6QLaeEh6w4JVBkN/+kTa0Yx31nGU6eNMNPzzwjnTplKjQTJ0p33JHrl2CHaQBORrhBlvwRHgIVjpwUKJxywnmuff+9WQkVH2/at94qTZpkJg/7iB2mATgV4Qbn8Ed4CGRlxUmBwjXzT06elMaOlUaMMP/54oul116T7rpLCguzu3cA4FdMKMY58rsaJtBLhZ0UKE7PPzmT4+afbNlilnc/+aQJNjffbFZC/fvfBBsAnkTlBufIb3gIdGXFaRNaHTv/5NQp6fnnpaefNquioqKkV16ROnUi1ADwNMINzpHf8BCMyorTAoXj5p/88IOZW7NunWnfeKP0+utSbKyt3QKAYAizLMuyuxPBlJycrMjISCUlJSkiIsLu7jhafiYEnz3npnfLynrMDcuj3S4tTXrxRbN3TUqKObX75Zelzp2p1gBwNV/u34QbBIyTl5J70o8/St26SWvXmna7dtKUKdIll9jbL4fh7xJwJ1/u3wxLIWAcN1TjVWlp0oQJ5lyolBQpIsK0u3alWnMWp+yPBCCwWC0FuNmOHdLVV0uDBplg06aNWR3VrVtIBBtfjr3gwE8gdFC5AdwoPd3MpRk6VDpxQipeXBo3TurRIyRCjeR7FcZJ+yMBCCwqN4Db7NolXXONNGCACTatWplqTc+eIRNs8lKFcdL+SAACi3ADuEV6utmnpm5d6csvpYsukiZPlpYtk8qXt7t3QXF6GGrV9t+z/H1OG026YsNFAH7BsBTgBrt3S927S59/btrXXSdNmyZVrGhrt4Lp7GGorJyvCuO0/ZEABAbhBnCy9HRTnXn0UenYMenCC6XnnpN69ZIKhE7hNathqLPltgrDKj7A+wg3gFP9/LOZIPzpp6bdsqU0fbpUuXKOT/Oi7IabHr6+qiqUvJAqDIBMCDeA01iW2Xxv0CDp6FHpggvMid59+gStWuO0je6yG266pnq0I/oHwFkIN4CT7N1rVj2tWGHaLVpIM2ZIVasGrQtO3OjOaYelAnA2wg3gBJYlvfGG9Mgj0pEjUrFi0qhRUr9+QZ1bk90S6za1YmwPEkwGBpBbhBvAbvv2SffdJy1datrNmplqzaWXBr0rTt/ojsnAAHIjdJZbAE5jWSbE1K5tgk3RouZE7y++sCXYSGx0B8AbCDeAD3w5yyhHv/wi/etfZu+a5GTpiiuk+Hhp4ECpYEH/dDYP2OgOgBfYHm5++eUX3XPPPSpZsqSKFSumOnXqaP369Tk+Z9WqVWrYsKHCw8NVtWpVzZw5MzidRUgbs2SbOkxcrYEfbFKHias1Zsk2n18jfu9fWv+f8Tp1
2WXS4sVSeLhZCfX111KNGgHote+GtK2pBX2aadyd9bSgTzM9xqnZAFzG1jk3iYmJat68ua699lotWbJEpUuX1s6dOxUVlf2/Evfs2aP27durV69eeuedd7Ry5Ur17NlTsbGxatOmTRB7j1Dij4m2r779uWo8PVitflonSfr10joqu2C2dNll+epXICbYMrcFgJvZGm7Gjh2ruLg4zZgxI+OxSpUq5ficyZMnq1KlSnrxxRclSTVr1tRXX32l8ePHE24QMPmaaGtZ+vnlqbpnyCMqceKoUgoW0oQWd2tKk1s196JYNchjn5y4ZBsAnMDWYalFixapcePGuuOOOxQdHa0GDRpo6tSpOT5nzZo1atWqVabH2rRpozVr1mR5fUpKipKTkzP9AL7K80TbAwekDh1Usf8DKnHiqL6Pqaobu0zQpCvvUFqBgjke9JiTvJyKDQChwtZws3v3bk2aNEnVqlXT0qVL1bt3b/Xr10+zZs3K9jkHDhxQmTJlMj1WpkwZJScn6/jx4+dcP3r0aEVGRmb8xMXF+f1zwPt8nmhrWdJ770m1akkffqj0woX1/FWddes9L2hH6YoZl+V1FVJOlSQACHW2Dkulp6ercePGGjVqlCSpQYMG2rJliyZPnqwuXbr45T2GDh2qgQMHZrSTk5MJOMiTXG8i9/vvUu/e0vz5pl2/vgrMmqW0XwrrlJ922GXJNgBkz9ZwExsbq8vOmkxZs2ZNzZs3L9vnxMTE6ODBg5keO3jwoCIiIlSsWLFzrg8PD1d4eLh/OoyQd96Jth98IPXtKx06JBUqJD35pDR0qFS4sIbUld922OU4AgDInq3hpnnz5tq+fXumx3bs2KEKFSpk+5ymTZtq8eLFmR5bvny5mjZtGpA+Arnyxx8m1MyZY9p160qzZkn162e6zJ+rkDiOAACyZuucmwEDBmjt2rUaNWqUdu3apXfffVdTpkxR3759M64ZOnSo7r333ox2r169tHv3bj366KP68ccfNXHiRH3wwQcaMGCAHR8BkObNM3Nr5swxG/A99ZS0bt05wSYQGpSP0q0NyxFsAOAMtlZuLr/8ci1YsEBDhw7ViBEjVKlSJU2YMEF33313xjW//fabEhISMtqVKlXSxx9/rAEDBuill15SuXLl9MYbb7AMHEFzem+ZqgVTVHfME9Ls2eYXtWtLM2dKjRrZ2j8ACHVhlmVZdncimJKTkxUZGamkpCRFRETY3R34QaA2ssvK6b1lbtixRs8ue02ljx021ZohQ8z8mjzO7wrmZwAAN/Ll/s2p4HC1YG5kF5+QqPc+2aTxK15Xhx9WSZJ2lCwva8Z0Vb/x+jy/LpvxAYB/2X62FJBXwd7I7vj8hVo+rY86/LBKaWEFNPHK23Vj1wnaekn1PL8mm/EBgP8RbuBa2W1Y9/LKnf59o8REqUsXNRvQXdHHErXr4nK67Z7n9VzLrkopVCRfe8uwGR8A+B/DUggaf88ryS5UfLb9D8UnJPpn7srixdJ990m//iqFhWntrd3UpUJ7pRQ2c2vyu7cMm/EBgP8RbhAUgZhX0qB8lK6rUVqf/vjHOb/L1YGWOTl8WBo4UDp9qOull0ozZujKZs00248hjc34AMD/CDcIuOzmlbSpFZPvm/hD11XLMtzkq/LxySdSz57SL79IYWHSgAHSM89I/9sB258b8UlsxgcA/sacGwRcIOeV+HygZU6Sk80QVNu2JthUrSp98YX04osZwSZQ2IwPAPyHyg0CLtDzSvxS+Vi+XOrRQ9q3z7QfflgaNUq64AK/9BEAEDyEGwRcMOaV5Hmo6MgRafBg6fXXTbtyZWn6dKllS7/1zcvYfBCAExFuEBSOnFeycqWp1uzda9oPPiiNGSNdyEql3GDzQQBORbhB0Ph7Im6eHT0qPfaYNHGiaVesaKo1115ra7fcJJCTxAEgv5hQjNCyapVUt+4/waZ3b2nz5pAINvEJiZq/Yb9fdj9m80EATkblBqHh2DFp6FDplVdMu3x5U625Pu9nQrmJv4eQ2HwQgJNRuYH3ffmlVK/eP8Hm/vtNtSZEgk0gzq/y6xJ8APAzKjfwrr//lh5/XHrpJcmypLg46Y03pBtusLtnQZXTEFJ+wogjJ4kDgAg38Kqvv5a6dZN2/u8QzR49zGZ8kZF+fRs3LIUO5BCSYyaJA8AZCDfwluPHpSeflMaNM9WaSy6Rpk41uw77mVuWQnN+FYBQQ7iBd6xdK3XtKm3fbtpdu0rjx0slSvj9rdy2FJohJAChhHAD9ztxQho+XHrhBSk9XYqNNdWa9u0D9paBmscSSAwhAQgVhBu427ffmgrNtm2m3bmzmUAclbubeF7nzLAUGgCci3ADd0pJkf7zH2nsWFOtiYkx50PddFOuXyI/c2aYxwIAzkW4gfusX2+qNVu3mnanTtLLL0slS+b6JfwxZ4Z5LADgTIQbuEdqqjRypDR6tJSWJkVHS5MnSx06+PxS/pozwzwWAHAewg3cYcMGU63ZvNm077pLevVVqVSpPL0cc2YAwLs4fgHOlppqVkJdcYUJNqVKSXPmSLNn5znYSKbiUj8u84Z+9eMiqcIAgAcQbuBcmzZJTZpII0ZIp05Jt99u5tncfnu+Xzo+IVEb9yVlemzjviS/nJgNALAX4QbOc/KkmVvTuLEJOCVLSu+/byo20dF+eYuc5twAANyNOTdwls2bzdyaDRtMu0MHadIkqUwZv74Nc24AwLuo3MAZTp2Snn1WatTIBJuoKOmdd6R58/webKR/9qk5E/vUAIA3ULlBQOVqB+CtW021Zv16077pJrPEOzY2oH1jnxoA8CbCDQLmvDsAnzolvfii9NRTZlVUiRLSK69Id9+t+H2HtWfD/oCHDvapAQDvIdwgIM67A/C2baZa8+235pf/+pc5PqFs2XwdiwAAAHNuEBDZrkY6mCw9/7zUoIEJNpGR0syZ0qJFUtmy2YYilmgDAHKLyg0CIqtVR5X/3K8bev1H2rDOPPB//ydNnSqVK5dxjb+ORQAAhC4qNwiIM1cjFUhPU49vF2jpmw/rog3rpIgIado0afHiTMFGYok2ACD/qNwgYIa0rambLjim6If7qNSm/1VrbrhBeuMNKS4uy+ecDkVnDk2xRBsA4AvCDQIjPV165RVdNnSodPy4VLy4WRnVs6cUFpbjU1miDQDID8KNH+VqTxePyPGz/vST1L279MUXpn399WYYqkKFXL++l5doh9LfCQDYgXDjJ6G0fDnbz5qeLk2cKD32mPT339KFF0ovvCA98MB5qzWhIpT+TgDALkwo9oNQWr6c3Wfd+tVGU6F56CETbK691pwT1asXweZ/QunvBADsRLjxA3+cMB2fkKj5G/Y7/kZ39mcKs9J1T/xiVb+hubRqlXTBBdKrr0orVkiVKtnTSYfiJHIACA6Gpfwgv8uX3TRUceZnuiTpd41d8pJa7N1kHrj6amn6dKlKFZt652wscweA4KBy4wf5OWHabUMVDcpHqdfVldRx4ydaOr2vWuzdpJPhRaWXXpI++4xgkwNOIgeA4KBy4yd5Xb7suh15ExI0ZEJ
/aflySdLRxlfoonffkqpVs7dfLsEydwAIPMKNH+Vl+bJrhiosyww5DRggHTkiFS0qjRqli/r1kwoWtLt3ruLlZe4A4AQMS9nMFUMV+/dL7dqZDfiOHJGaNpU2bjRBh2ADAHAYKjcO4NihCsuSZs2S+veXkpKk8HDpmWcCEmrY2A4A4C+EG4dw3FDFr79K998vffyxaTdpIs2cKdX0/youN60WAwA4H8NSyMyypLfekmrVMsGmSBFpzBjp668DEmzctloMAOB8hBv847ffpJtvlu69Vzp8WGrcWNqwwRynUCgwRT42tgMA+BvhBqZa8+67plrz3/9KhQtLzz4rrVljHgsg16wWAwC4BuEm1B08KN12m3T33VJiotSwofTdd9KwYQGr1pzJFavFAACuwoRihwj6aiHLkj74QOrbV/rzTxNknnpKGjLEVG6CaEjbmqpU6kJt2ndY9eJK6K7Lywf1/QEA3kK4cYCgrxb6/XepTx9p3jzTrl/frISqVy9w75mDMz//u9/u055Dx1gtBQDIM4albBb01UJz5ph5NPPmmWrN8OHSN9/YFmxYLQUA8Ddbw83TTz+tsLCwTD81atTI9vqZM2eec33RokWD2GP/e+XTnVk+7vfVQocOSf/+t3TnneY/16kjffut9PTTZrm3TVgtBQDwN9uHpWrVqqUVK1ZktAudZxJrRESEtm/fntEOCwsLWN8CLT4hUZ/++EeWv/PraqEFC6RevcxwVMGC0tCh0pNP2hpqTmO1FADA32wPN4UKFVJMTEyurw8LC/PpeifLrjpxXY3S/plU/Oef0kMPSe+9Z9q1apm5NY0b5/+1/eT0aqkzh6ZYLQUAyA/bw83OnTtVtmxZFS1aVE2bNtXo0aNVvnz2q2WOHj2qChUqKD09XQ0bNtSoUaNUK4e9WFJSUpSSkpLRTk5O9mv/8yO76sRD11XL/4t/+KH0wANmqXeBAmYjvuHDzflQDuPYs7UAAK5k65ybK664QjNnztQnn3yiSZMmac+ePbrqqqt05MiRLK+vXr26pk+frg8//FBvv/220tPT1axZM+3fvz/b9xg9erQiIyMzfuLi4gL1cXwWkD1e/vpL6txZuuUWE2xq1jSb8Y0a5chgc1qD8lG6tWE5gg0AIN/CLMuy7O7EaYcPH1aFChU0btw49ejR47zXnzx5UjVr1lTHjh01cuTILK/JqnITFxenpKQkRURE+K3v+eG3PW4++sgcdvnbb6ZaM3iwmTDs0knXnBQOADgtOTlZkZGRubp/2z4sdaYSJUro0ksv1a5du3J1feHChdWgQYMcrw8PD1e4gysWkh9OBD98WOrfX5o1y7SrVzdza668MtNlbgoLudn7x02fBwAQPI4KN0ePHtVPP/2kzp075+r6tLQ0bd68We3atQtwzxxsyRLpvvukX36RwsKkgQOlkSOlYsUyXRb0jQLzIbu9b9rUiskIMW76PACA4LJ1zs2gQYP0+eef6+eff9bq1avVoUMHFSxYUB07dpQk3XvvvRo6dGjG9SNGjNCyZcu0e/dubdiwQffcc4/27t2rnj172vUR7JOUJPXoIbVrZ4JNtWrSl19KL7xwTrBx20Z559v7xm2fBwAQXLZWbvbv36+OHTvqzz//VOnSpdWiRQutXbtWpUuXliQlJCSoQIF/8ldiYqLuu+8+HThwQFFRUWrUqJFWr16tyy67zK6PYI9ly0yw2b/fVGv695eeeUa64IIsL88pLDhxOOd8e9+47fMAAILL1nAze/bsHH+/atWqTO3x48dr/PjxAeyRwyUnS4MGSVOnmnaVKtKMGdJVV+X4NLdtlHe+vW/c9nkAAMHlqDk3yMGKFaZak5Bg2v36meXdF57/hu7LRnlOmaSb0943WX0eSVq69QCVGwCAs5aCB4MvS8kc4cgR6dFHpcmTTbtSJWn6dOmaa3x+qfMFFzdN0o1PSFSHiavPeXxBn2YEHADwIF/u35wK7mSffSbVrftPsOnbV/r++0zBJj4hUfM37M/VZNqcNspz2yRdDtwEAGSHYSknOnpUGjJEeu01065QwVRrrrsu02X+rLS4bZIu824AANmhcuM0X3wh1av3T7Dp1UvavPmcYOPvSovbwkJAjq4AAHgClRunOHZMGjZMevll046Lk6ZNk1q3zvJyf1da3Hg6NwduAgCyQrhxgq++krp1k04fI9Gzp/Tii1IOE6YCUWlxY1jI99EVAADPYVjKTn//bY5LuPpqE2zKlZM++cTsY3OemeCBGpbhdG4AgNtRubHLmjVS167Sjh2m3b27NG6cFBmZ65dwY6UFAIBAI9wE2/Hj0lNPmSCTni6VLWsqNXk8/JNhGQAAMiPcBNM335hqzY8/mnaXLtL48VIU4QQAAH9hzk0wnDhh9q1p1swEm5gYadEiaeZMgg0AAH5G5SbQ1q0z1ZoffjDte+6RXnpJuvjicy51yrlOvvbDKf0GAEAi3AROSoo0YoQ0dqyUliaVKSO9/rp0881ZXu6Uc5187YdT+g0AwGkMSwXChg1S48bm1O60NKljR2nr1myDjVPOdfK1H07pNwAAZyLc+FNqqlkJ1aSJtGWLVLq0NG+e9O67UsmS2T7NKYdA+toPp/QbAIAzMSzlL99/L3XubP6vJN15p/TqqybgnIdTznXytR9O6TcAAGeicuMvhw+bYFOqlPTBB9L77+cq2EjOOQTS1344pd8AAJwpzLIsy+5OBFNycrIiIyOVlJSkiPMcceCzWbOktm2l6Og8Pd0pq45YLQUAcBpf7t+EGwAA4Hi+3L8ZlgIAAJ5CuAEAAJ5CuAEAAJ5CuAEAAJ7CPjc2YHURAACBQ7gJMs5iAgAgsBiWCiLOYgIAIPAIN0HEWUwAAAQe4SaIOIsJAIDAI9wEEWcxAQAQeEwoDrIhbWuqTa0YVksBABAghBsbNCgfRagBACBAGJYCAACeQrgBAACeQrgBAACewpybAOB4BQAA7EO48TOOVwAAwF4MS/kRxysAAGA/wo0fcbwCAAD2I9z4EccrAABgP8KNH3G8AgAA9mNCsZ9xvAIAAPYi3AQAxysAAGAfhqUAAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnhNzZUpZlSZKSk5Nt7gkAAMit0/ft0/fxnIRcuDly5IgkKS4uzuaeAAAAXx05ckSRkZE5XhNm5SYCeUh6erp+/fVXFS9eXGFhYX597eTkZMXFxWnfvn2KiIjw62vjH3zPwcH3HBx8z8HDdx0cgfqeLcvSkSNHVLZsWRUokPOsmpCr3BQoUEDlypUL6HtERETwP5wg4HsODr7n4OB7Dh6+6+AIxPd8vorNaUwoBgAAnkK4AQAAnkK48aPw8HANHz5c4eHhdnfF0/ieg4PvOTj4noOH7zo4nPA9h9yEYgAA4G1UbgAAgKcQbgAAgKcQbgAAgKcQbgAAgKcQbnz02muvqWLFiipatKiuuOIKffvttzleP2fOHNWoUUNFixZVnTp1tHjx4iD11N18+Z6nTp2qq6
66SlFRUYqKilKrVq3O+98LDF//nk+bPXu2wsLCdMsttwS2gx7h6/d8+PBh9e3bV7GxsQoPD9ell17K/+/IBV+/5wkTJqh69eoqVqyY4uLiNGDAAJ04cSJIvXWnL774QjfeeKPKli2rsLAwLVy48LzPWbVqlRo2bKjw8HBVrVpVM2fODHg/ZSHXZs+ebRUpUsSaPn26tXXrVuu+++6zSpQoYR08eDDL67/++murYMGC1nPPPWf98MMP1hNPPGEVLlzY2rx5c5B77i6+fs+dOnWyXnvtNSs+Pt7atm2b1bVrVysyMtLav39/kHvuLr5+z6ft2bPHuuSSS6yrrrrKuvnmm4PTWRfz9XtOSUmxGjdubLVr18766quvrD179lirVq2yNm7cGOSeu4uv3/M777xjhYeHW++88461Z88ea+nSpVZsbKw1YMCAIPfcXRYvXmw9/vjj1vz58y1J1oIFC3K8fvfu3dYFF1xgDRw40Prhhx+sV155xSpYsKD1ySefBLSfhBsfNGnSxOrbt29GOy0tzSpbtqw1evToLK+/8847rfbt22d67IorrrAeeOCBgPbT7Xz9ns926tQpq3jx4tasWbMC1UVPyMv3fOrUKatZs2bWG2+8YXXp0oVwkwu+fs+TJk2yKleubKWmpgari57g6/fct29f67rrrsv02MCBA63mzZsHtJ9ekptw8+ijj1q1atXK9Nhdd91ltWnTJoA9syyGpXIpNTVV3333nVq1apXxWIECBdSqVSutWbMmy+esWbMm0/WS1KZNm2yvR96+57P9/fffOnnypC6++OJAddP18vo9jxgxQtHR0erRo0cwuul6efmeFy1apKZNm6pv374qU6aMateurVGjRiktLS1Y3XadvHzPzZo103fffZcxdLV7924tXrxY7dq1C0qfQ4Vd98GQOzgzrw4dOqS0tDSVKVMm0+NlypTRjz/+mOVzDhw4kOX1Bw4cCFg/3S4v3/PZHnvsMZUtW/ac/0HhH3n5nr/66itNmzZNGzduDEIPvSEv3/Pu3bv16aef6u6779bixYu1a9cu9enTRydPntTw4cOD0W3Xycv33KlTJx06dEgtWrSQZVk6deqUevXqpWHDhgWjyyEju/tgcnKyjh8/rmLFigXkfancwFPGjBmj2bNna8GCBSpatKjd3fGMI0eOqHPnzpo6dapKlSpld3c8LT09XdHR0ZoyZYoaNWqku+66S48//rgmT55sd9c8ZdWqVRo1apQmTpyoDRs2aP78+fr44481cuRIu7sGP6Byk0ulSpVSwYIFdfDgwUyPHzx4UDExMVk+JyYmxqfrkbfv+bQXXnhBY8aM0YoVK1S3bt1AdtP1fP2ef/rpJ/3888+68cYbMx5LT0+XJBUqVEjbt29XlSpVAttpF8rL33NsbKwKFy6sggULZjxWs2ZNHThwQKmpqSpSpEhA++xGefmen3zySXXu3Fk9e/aUJNWpU0fHjh3T/fffr8cff1wFCvBvf3/I7j4YERERsKqNROUm14oUKaJGjRpp5cqVGY+lp6dr5cqVatq0aZbPadq0aabrJWn58uXZXo+8fc+S9Nxzz2nkyJH65JNP1Lhx42B01dV8/Z5r1KihzZs3a+PGjRk/N910k6699lpt3LhRcXFxwey+a+Tl77l58+batWtXRniUpB07dig2NpZgk428fM9///33OQHmdKC0OHLRb2y7DwZ0urLHzJ492woPD7dmzpxp/fDDD9b9999vlShRwjpw4IBlWZbVuXNna8iQIRnXf/3111ahQoWsF154wdq2bZs1fPhwloLngq/f85gxY6wiRYpYc+fOtX777beMnyNHjtj1EVzB1+/5bKyWyh1fv+eEhASrePHi1oMPPmht377d+uijj6zo6GjrmWeesesjuIKv3/Pw4cOt4sWLW++99561e/dua9myZVaVKlWsO++8066P4ApHjhyx4uPjrfj4eEuSNW7cOCs+Pt7au3evZVmWNWTIEKtz584Z159eCj548GBr27Zt1muvvcZScCd65ZVXrPLly1tFihSxmjRpYq1duzbjdy1btrS6dOmS6foPPvjAuvTSS60iRYpYtWrVsj7++OMg99idfPmeK1SoYEk652f48OHB77jL+Pr3fCbCTe75+j2vXr3auuKKK6zw8HCrcuXK1rPPPmudOnUqyL12H1++55MnT1pPP/20VaVKFato0aJWXFyc1adPHysxMTH4HXeRzz77LMv/f3v6u+3SpYvVsmXLc55Tv359q0iRIlblypWtGTNmBLyfYZZF/Q0AAHgHc24AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AAICnEG4AuN4ff/yhmJgYjRo1KuOx1atXq0iRIlq5cqWNPQNgBw7OBOAJixcv1i233KLVq1erevXqql+/vm6++WaNGzfO7q4BCDLCDQDP6Nu3r1asWKHGjRtr8+bNWrduncLDw+3uFoAgI9wA8Izjx4+rdu3a2rdvn7777jvVqVPH7i4BsAFzbgB4xk8//aRff/1V6enp+vnnn+3uDgCbULkB4Ampqalq0qSJ6tevr+rVq2vChAnavHmzoqOj7e4agCAj3ADwhMGDB2vu3LnatGmTLrroIrVs2VKRkZH66KOP7O4agCBjWAqA661atUoTJkzQW2+9pYiICBUoUEBvvfWWvvzyS02aNMnu7gEIMio3AADAU6jcAAAATyHcAAAATyHcAAAATyHcAAAATyHcAAAATyHcAAAATyHcAAAATyHcAAAATyHcAAAATyHcAAAATyHcAAAAT/l/Huc97fTEvrQAAAAASUVORK5CYII=",
142 | "text/plain": [
143 | ""
144 | ]
145 | },
146 | "metadata": {},
147 | "output_type": "display_data"
148 | }
149 | ],
150 | "source": [
151 | "import torch\n",
152 | "\n",
153 | "\n",
154 | "torch.manual_seed(0)\n",
155 | "x = torch.rand(100, 1)\n",
156 | "y = 5 + 2 * x + torch.rand(100, 1)\n",
157 | "\n",
158 | "W = torch.zeros((1, 1), requires_grad=True)\n",
159 | "b = torch.zeros(1, requires_grad=True)\n",
160 | "\n",
161 | "def predict(x):\n",
162 | " y = x @ W + b\n",
163 | " return y\n",
164 | "\n",
165 | "def mean_squared_error(x0, x1):\n",
166 | " diff = x0 - x1\n",
167 | " N = len(diff)\n",
168 | " return torch.sum(diff ** 2) / N\n",
169 | "\n",
170 | "lr = 0.1\n",
171 | "iters = 100\n",
172 | "\n",
173 | "for i in range(iters):\n",
174 | " y_hat = predict(x)\n",
175 | " loss = mean_squared_error(y, y_hat)\n",
176 | "\n",
177 | " loss.backward()\n",
178 | "\n",
179 | " W.data -= lr * W.grad.data\n",
180 | " b.data -= lr * b.grad.data\n",
181 | "\n",
182 | " W.grad.zero_()\n",
183 | " b.grad.zero_()\n",
184 | "\n",
185 | " if i % 10 == 0: # print every 10 iterations\n",
186 | " print(loss.item())\n",
187 | "\n",
188 | "print(loss.item())\n",
189 | "print('====')\n",
190 | "print('W =', W.item())\n",
191 | "print('b =', b.item())\n",
192 | "\n",
193 | "\n",
194 | "# plot\n",
195 | "import matplotlib.pyplot as plt\n",
196 | "plt.scatter(x.detach().numpy(), y.detach().numpy(), s=10)\n",
197 | "x = torch.tensor([[0.0], [1.0]])\n",
198 | "y = W.detach().numpy() * x.detach().numpy() + b.detach().numpy()\n",
199 | "plt.plot(x, y, color='red')\n",
200 | "plt.xlabel('x')\n",
201 | "plt.ylabel('y')\n",
202 | "plt.show()"
203 | ]
204 | },
205 | {
206 | "cell_type": "markdown",
207 | "metadata": {},
208 | "source": [
209 | "## neuralnet.py"
210 | ]
211 | },
212 | {
213 | "cell_type": "code",
214 | "execution_count": 4,
215 | "metadata": {},
216 | "outputs": [
217 | {
218 | "name": "stdout",
219 | "output_type": "stream",
220 | "text": [
221 | "0.7643452286720276\n",
222 | "0.23656320571899414\n",
223 | "0.23226076364517212\n",
224 | "0.22441408038139343\n",
225 | "0.21026146411895752\n",
226 | "0.17957879602909088\n",
227 | "0.11798454076051712\n",
228 | "0.08481380343437195\n",
229 | "0.08023109287023544\n",
230 | "0.0796855092048645\n",
231 | "0.07945814728736877\n"
232 | ]
233 | },
234 | {
235 | "data": {
236 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAi8AAAGdCAYAAADaPpOnAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAABJeElEQVR4nO3deVxU1fsH8M+AAqKymLIllLjkmuIalD9aKMTSNEtNU/NrmlulmArmkqlhi0tuWZqpZWqm2KK5YWbuqbikSCkoaoIbgqICwv39cQJFZoaZYe7ce2c+79drXs0Md2aOV/I+c85znkcnSZIEIiIiIo1wUnoAREREROZg8EJERESawuCFiIiINIXBCxEREWkKgxciIiLSFAYvREREpCkMXoiIiEhTGLwQERGRplRQegDWVlhYiH///RdVq1aFTqdTejhERERkAkmScP36dQQEBMDJyfjcit0FL//++y8CAwOVHgYRERFZ4OzZs6hZs6bRY+wueKlatSoA8Yf38PBQeDRERERkiuzsbAQGBhZfx42xu+ClaKnIw8ODwQsREZHGmJLywYRdIiIi0hQGL0RERKQpDF6IiIhIUxi8EBERkaYweCEiIiJNYfBCREREmsLghYiIiDSFwQsRERFpCoMXIiIi0hQGL0RERKQpdtcegNQjMS0TqZdzUKt6ZYQEeSs9HCIishMMXkgWU39NwvzfU4ofDwwPRkxUAwVHRERE9oLLRmR1iWmZJQIXAJj/ewoS0zIVG8+ag+cU+3wiIrIuzryQ1aVezjH4vK2Xj+ScAeKyGBGRMhi8kNXVql7ZrOflYmgGKLKRX7mDDS6LEREph8tGZHUhQd4YGB5c4rlB4cE2n50wNgNUHmpbFiMicjSceSFZxEQ1QGQjP0WXVeSaAVLTshgRkSPizAvJJiTIGy81r6nYBV2uGSC1LIsRETkqzryQXZNjBqgoKLp36UiJZTEiIkelkyRJUnoQ1pSdnQ1PT09kZWXBw8ND6eGQHeNuIyIi6zHn+s2ZFyILhQR5M2ghIlIAc16IiIhIUxi8EBERkaZw2YhUizklRESkD4MXUiVWsCUiIkO4bESqwwq2RERkDIMXUh25yvoTEZF9YPBCqsMKtkREZAyDF1IdtTR2JCIidWLCLqmSGho7EhGROjF4IYvJvZWZFWyJiEgfBi9kEW5lJiIipTDnhczGrcxERKQkBi9kNm5lJiIiJTF4IbNxKzMRESmJwQuZjVuZiYhISUzYJYtwKzMRESmFwQtZjFuZiYhICVw2IiIiIk3hzIsDk7vInD3huSIiUg8GLw6KReZMx3NFRKQuXDZyQCwyZzqeKyIi9WHw4oBYZM50PFdEROrD4MUBscic6XiuiIjUh8GLA2KROdPxXBERqY9OkiRJ6UFYU3Z2Njw9PZGVlQUPDw+lh6Nq3EFjOp4rIiJ5mXP9ZvBCREREijPn+s1lIyIiItIUBi9ERESkKQxeiIiISFNkDV62b9+ODh06ICAgADqdDmvXrjV6/LZt26DT6Urd0tPT5RwmERERaYiswUtOTg6aNm2KuXPnmvW65ORkXLhwofjm4+Mj0wiJiIhIa2TtbRQVFYWoqCizX+fj4wMvLy/rD4iIiIg0T5U5L82aNYO/vz+effZZ7Ny50+ixubm5yM7OLnEjIiIi+6Wq4MXf3x/z58/H6tWrsXr1agQGBuLJJ5/EwYMHDb4mLi4Onp6exbfAwEAbjphsITEtE2sOnmMzRCIiAmDDInU6nQ7x8fHo1KmTWa8LDw9HUFAQvvnmG70/z83NRW5ubvHj7OxsBAYGskidnZj6a1KJrs4Dw4MRE9VA1s9kNV0iItszp0idrDkv1tC6dWvs2LHD4M9dXV3h6upqwxGRrSSmZZYIXABg/u8pqFW9Mrq1CpLlM5UIloiIyDyqWjbS59ChQ/D391d6GKSA1Ms5ep8fvfoopv6aZPXPMxQscbmKiEhdZJ15uXHjBk6ePFn8ODU1FYcOHUK1atUQFBSE2NhYnD9/HkuXLgUAzJw5E7Vq1UKjRo1w+/ZtLFy4EFu3bsWmTZvkHCapVK3qlQ3+bP7vKYhs5GfVZR1DwVLq5RwuHxERqYisMy/79+9HSEgIQkJCAADR0dEICQnB+PHjAQAXLlxAWlpa8fF5eXkYMWIEmjRpgvDwcBw+fBhbtmzBM888I+cwSaVCgrwxMDzY4M8NBRuWMhQsGQuiiIjI9thVmlRv5Z9pGL36aKnn4weHWX1G5P6cl0HhwRjNnBciItnZVcIuaYscO3W6tQpC6uWcUkGFHEs5MVENENnIj7uNiIhUjDMvZDVy79ThFmYiIvvFmReyOUM7dayZVBsS5M2ghYiI1L9VmrTB2E4dIiIia+LMC1mFPe3U4fIUEZG6MXghqyja1myLpFqrkiTg0iXg/Hng5k0s//0Ethw8A9c7+chzrognQ+vhtfbNgWrVxM3FRekRWxUDNSLSIibsklWp+mJ49Sqwa5e4JScDp06J240bpr1epwNq1waaNLl7a90aCJKnVYHc2AqBiNTEnOs3gxeyX7duARs2ABs3Ajt2AMeO6T9OpwN8fXGjgivO3QZuV3RBrrMLXAvy4Xn7OgIKb8H1ejZQWKj/9Q0bAlFRQPv2wBNPaGJ2JjEtE53n7Sr1vBy1c4iITMHdRuS4cnKAX38FVq0C1q0Tj+/1yCPA448DTZuKWZTatYGHHwbc3PCPsQt6TU/g4kURAB05Ahw9Kv6bmAgcPy5u06YBVaoAr7wC9O8PPPaYCIxUiK0QiEjLGLyQfThyBJg7F/j2W+DmzbvPBwUBnToBTz4pZkVq1DD4FmXm7fj5idu97SquXgU2bxYB04YNQEYG8PXX4ta4sQhievUCvNUVENhTgjUROR4uG5F25ecDa9aIoOWPP+4+X6uWmP14+WWgZUuzZz8sztspLBT5NAsXAitXArdvi+crVwbefht4912R9KsSbIVARGrCnBcGL/YtLw9YvBiYMgUoauxZoQLw0kvAkCFA27aKLdcUBT51Kubj0d/XAV98Afz1l/hh1arA8OHi5uWlyPjup+oEayJyKAxeGLzYp/x84JtvgEmTgNOnxXO+vsDAgcCAAUBAgKLD07t7p1194KefgPHjxdIWIAKXcePEbEwFrtwSEQHmXb9ZYZfUT5KAH34AGjQA+vUTgYufH/DZZ+L+++8rHrgYao+QePYa8OKLIrF31SqxM+naNWDECKBVK2DfPkXGS0SkZQxeSN2OHQMiIkQOy6lTgI+P2NVz6pSYuXBzU3qEAExoj+DkJHJwjhwBFiwQCbyHDokdSUOGAFlZthssEZHGMXghdbp2DRg2TGxp3roVcHUVSy8pKUB0NODubtLbJKZlYs3Bc0hMy5R1uCbv3nF2Bt54AzhxQuxCkiRg3jwxq7Rli6xjJCKyFwxeSH1+/FFczD/7DCgoEFudk5KAiRPFzh0TTf01CZ3n7UL094fRed4uTP01SbYhF22zvpfR9gg+PsDSpSIwq1cPuHABeO45IDZW5PYQEZFBTNgl9bh6VSw
FLVsmHterB8yeLS7qZlKqgqxFu3du3RKzSfPni8ePPQYsXy6K5xEROQgm7JL2/PijSGZdtkzkh4weDRw+bFHgApiQgyKTkCBvvNS8pnkBUqVKwOefi4ReT09gzx6gWTMgPl62cRIRaRmDF1JWTo7YQdSpk6hOW7++KPQ2dWq5knE1WUH25ZdFEm9oqEjgfeklIC5O5MUQEVExBi+knGPHRFfmRYtEUblRo8SW4jZtyv3WZuegqMXDDwO//w689ZZ4PGYM8PrrQG6ukqMiIlIVBi9ke5IEfPWVqHNy/Djg7w8kJAAffSTr1mfNzF9UrAjMmiXaHjg7i8TeiAjg8mWlR0ZEpAoMXsi2cnKA114T24Vv3UJ2+NNiqeSpp6z6MQaLxsm8ZdqqBg8G1q8HPDyAHTvEjNQ//yg9KiIixTF4Ids5fRp4/HHgu+9wR+eEuCdfR9M2wzD1wBWrf5S1EnZtVSfGoOeeA3bvBoKDRY2b//u/u72SiIgcFBurkG1s2yYSUq9cwSV3LwzqHIv9NRsBEDMikY38rJqPYo2EXb29ipToutywoQhgnn1WVOgNDwc2bhQds4mIHBBnXqhcypyZkCRgzhyRs3HlCjIbNEHHPjOKA5ci1t7CXN6EXdUtO/n4AL/9JhKcr14Fnn5aLCURETkgzryQxcqcmbhzR/Tt+fJL8bhnT5wZ/zEuLEos9V5ybGGOiWqAyEZ+5heNg/FlJ0tmiCwqXne/atVEC4EOHcSOpOeeE/Vxnn3WsvcjItIozryQRcqcmbhxA+jYUQQuOh3wySfAN9+gWb0Am25htqhoHKxbJ8aqbQqqVhVJvO3aicq8HTuKJTkiIgfCmReyiNGZCZdc4PnngYMHRfXY5cuBF18sPqY8MyK2UrTsdG+AZkmQZSjIK1eOj7u7mHHp0gX45RcxE5OQIJaUiIgcAIMXsoihGYj6meeAl7oDZ84ANWoAP/+st+hcSJC3KoOWe1kjyLL28lMxFxfRTuD550Vzx3btxFJSkyaWvycRkUZw2Ygsoi8hdnKNLDR8OUoELnXrih0yVqiWqyRLl52KyNqmwM1NzMA89hiQmSlyX6xUB0bxLeJEREawqzSVS1Ei6qNJ+1BnQC/g5k1xMf35Z6B6daWHpwr3JzYPCg/GaGtuuc7MFEX+Dh8GAgOBnTvFfy2kmi3iRORQzLl+M3ih8ouPB7p3B/LyxA6YNWuAyipugKgAq+w2MubiRVHALjlZLB3t2CEq81owzs7zdpV6Pn5wmOqX+YhI28y5fnPZiMpn6VLglVdE4NKlC/DTTwxc9Cjv8lOZfHyATZsAPz/g6FHxd5Kfb/bbWKsyMRGRnBi8kOXmzgX69AEKCoC+fYEVKwBXV6VH5biCgsTuI3d3EcgMGSKKBJpB1hwdIiIrYfBClpk1Cxg6VNx/5x1g4UKgAjevKa5FCxFEOjkBCxYAH39s1svLW5mYiMgWmPNC5vvsM2DYMHE/Jgb48ENRiI7UY/Zs4O23xf2VK4GuXc16uew5OkRE92HCLoMX+dwbuIwZA0yezMBFrYYNE39fbm5iB1Lz5kqPiIjIICbsUrnprfMxc+bdwOW998wKXFg3RAHTpokidrdvA507A5cuKT0iIiKrYJIClaK3zsepBGD4cPHE2LHABx+YHLiwbojlyrV84+wMfPutaBvwzz9At24ikZe5SUSkcZx5oRL09eK5MvsL4K23xIP33jMrcCmzgSMZZJWGjl5ewNq1QJUqwG+/AaNHW3uYREQ2x+CFSri/nscLSdsxdcNs8SA6Gpg0yawcF9YNsYxVg76GDYElS8T96dOB776zwgiJiJTD4IVKuLeexzMn92LGL9PgLBXico8+wKefmp2cy7ohlrF60PfSS2LWDADeeAM4dMiy9yEiUgEGL1RCUZ2PsNOHMG/tVFQsLMCxp15A9aVfWbSriHVDLCNL0DdxIhAVBdy6JSrwXr9u+XsRESmIW6WptH37UPDU03C+mYNrkS/A65f4cid5rvwzDYfPXkPTQC90axVkpYHaN1kaOl69CjRrBpw9C7z6KrBsGbe6E5EqsM4LgxfLJScDjz8OXLkCRESIcvPlLPnP3UaWk6VY3K5dooljQYGowvvGG9Z5XyKicmCdF7LM+fOiK/SVK0DLlqI7dDkDF+42Kh9ZGjqGhYmqyIDYRfbXX9Z7byIiG2DwQkJmJtCuHZCWBtStC6xfD1StWu635W4jlXr3XZH/cvu2aB2Qw78PItIOBi8kEjg7dBDfwP39RSGzGjWs8tbcbaRSTk5i+3RAAJCUJDpQExFpBIMXR1dQAPToIXrfeHoCGzYADz9stbfnbiMVq1FD1HwpCmRWrlR6REREJmHCrqMrat7n4gJs3iwSOWWgL/GUnYtVYsIEUTXZyws4ehSoWVPpERGRA1JNwu727dvRoUMHBAQEQKfTYe3atWW+Ztu2bWjevDlcXV1Rp04dLF68WM4hOraZM0XgAgBLl8oWuAClE09NKX3PZo42MnYs0KoVcO0a0KcPUFio9IiIiIySNXjJyclB06ZNMXfuXJOOT01NxfPPP4+nnnoKhw4dwrBhw/DGG29g48aNcg7TMa1ZI8r9A8BHH4mmfTZiyg4kq/T1IdNUrCgaOLq7A1u33g1oiYhUStb2slFRUYiKijL5+Pnz56NWrVqYNm0aAKBBgwbYsWMHZsyYgcjISLmG6Xj27AF69gQkCRg4EBg50qYfb2wHUkiQt8HgJrKRH5eX5FKvnuh7NHAgEBsravw0aaL0qErgMiMRFVFVwu7u3bsRERFR4rnIyEjs3r3b4Gtyc3ORnZ1d4kZGpKYCHTuKLbLPPw/Mnm3zCqtl7UDi9mqFDBgAvPACkJsrgtvcXKVHVIwzcUR0L1UFL+np6fD19S3xnK+vL7Kzs3Hr1i29r4mLi4Onp2fxLTAw0BZD1aasLHFxunQJaN4cWLGi3GX/LVHWDiRur1aITgcsXCh2IR09Cowbp/SIALDQIRGVpqrgxRKxsbHIysoqvp09e1bpIanTnTtA9+7A8eOitsdPPwFVqig2nJioBogfHIbpXZsifnBYiZ49+oIbANh4LN2WQ3RMvr4igAGAadPEEqPCOBNHRPdTVfDi5+eHjIyMEs9lZGTAw8MDlSpV0vsaV1dXeHh4lLiRHiNGiBoulSqJwOXBB5UekdHS95GN/Eo9p6Vv25reKdWxI9Crl9h19PrrooihgjgTR0T3U1XwEhoaioSEhBLPbd68GaGhoQqNyE58/jkwa5a4/803QIsWyo7HBFr+tm0X+RkzZwJ+fqJR5/vvKzoUFjokovvJmvBw48YNnDx5svhxamoqDh06hGrVqiEoKAixsbE4f/48li5dCgAYOHAg5syZg1GjRuF///sftm7diu+//x7r1q2Tc5j2bcsW0XwPAKZMAbp0UXY8JtLqt2272SlVrRrwxRfAiy8Cn34KvPQS0KaNYsOJiWqAyEZ+3G1ERABknnnZv38/QkJCEBISAgCIjo5GSEgIxo8fDwC4cOEC0tLSio+vVasW1q1bh82bN6Np06aYNm0aFi5cyG
3Sljp5UjTdKygQywCxsUqPyGRa/bat5RmjUjp2BF577e7y0e3big5Hlg7bRKRJbA9gr7KzgcceE0332rQBtm0D3NyUHpXZtFbbIzEtE53n7Sr1fPzgME2Mv5SrV4GGDYGMDGD0aGDqVKVHRER2SjXtAUghhYXiG3NSkthZFB+vycAF0N63ba3OGBlUtHwEAJ98Avz5p7LjISICZ17s07hxwOTJgKsrsH070Lq10iNyOFqbMSpTjx7A8uVA06YigKlYUekREZGd4cyLI1u1SgQuALBgAQMXhWhtxqhMM2eKWZjDh0UbASIiBTF4sSdHjojESkDUdenVS9HhkB3x8RFF6wCxdfq/XYSarmdDRJrFZSN7cfUq0KoVkJICPPsssH69IqX/yY5JkvjdSkgAnnkGU6NnYf721OIfDwwPRsw9lZKJiMzBZSNHU1AgGumlpAAPPyxyExi4kLXpdMD8+SL5OyEBl+YuLPFjLVVAJiJtY/BiD95/X5T+d3MTO4seeEDpEZEd0LskVKdOccXdsVsX4oGcayVeo8l6NkSkOQxetO7HH0sm6DZrpuhwyD4YbXEQHY2bDRvD+/Z1jN1acvZF7RWQicg+MHjRsuTku0m5b78tarsQmcBYoq2hFgfFx1asCPclX6PQyQmdj29D2OlDADRez4aINIWJEVp144boN3P9OtC2reg/Q2SCqb8mlQhO7k+0NdbioDg4adkSToMHA3Pm4Is9i3Bq0k40q1u6EzgRkRw486JFkgQMGAAcPw74+wPff29y0TBubXVsZc6qwIymmJMnA35+qHomBc1WLtT7GiIiOTB40aJ588SOImdnEbj4mfaN12geAzkEUxpHmtziwNPzbsG6KVPEbjciIhvgspHW7N0LDB8u7n/8MfDEEya9zNA37shGfsxTcCCmzqrERDVAZCO/slscdO8OfPWVqP0yZIioL6TTWXvYREQlcOZFSy5fBl55BcjPB7p0uRvEmMCUb9yGcKnJfpjTONKkFgc6nZgJdHER2/VXr7b2kImISuHMi1YUFaI7exaoVw9YtKjUN1xjzQBNzmO4T1nJnaQepjaDNHlWxVT16gGjRwOTJgHvvANERgJVq5bvPa3E7hpkEhEAtgfQjokTRXEwd3exdNS4cYkfmxJk3H/MoPBgjDYSiCSmZaLzvF2lno8fHMYLgcooHmTeugU0aQKcOgW8+y7wySe2+2wDFD8nRGQWtgewNwkJIngBRHn2+wIXU3aQAOIbd/zgMEzv2hTxg8OMBi5A+ZaayHZM/fuXVaVKwKxZ4v7MmWInnIJUcU6ISDYMXtTuwgWgRw+xPbpfP72dos0JMkzKY/iPpUtNZFuqCTLbtwc6dgTu3AHeekv8zipENeeEiGTB4EXN7twRgcvFi2JKfvZsvYfJFWSYk9xJylFVkDlzJuDqCmzdCvzwg+0//z+qOidEZHUMXmzEoh07EycC27YBVaoAq1aJqXk95AwyzF1qIttTVZBZqxYQEyPuR0eLStAKUNU5ISKrY8KuDViUOLhpE9CunZh6/+474NVXy/wcc3ZWcBeG/VHN3+mtW0DDhsDp0yKQiYtTbCiqOSdEVCZzrt8MXmRm0Y6dCxeApk2BS5eAN98USbpWxF0YJLsffwQ6dUJhxYrYsnILarRowuCBiIzibiMVMTtxsKiey6VLIoCZOdOq4+EuDLKJjh1xqmVbOOXnw3XEcHSeu5PtKIjIahi8yMzsxMEPPwR++w2oXBlYuRJwc7PqeLgLg2wh8ew19At5DbnOFRCeehARJ/cxSCYiq2HwIjOzEge3bxeF6ABRcv2RR6w+Hu7CIFtIvZyD09UexKKWnQAAY7cuhMudfAbJRGQVbA9gAyaVY798WWyLLiwEevcWNxkUBVP3V9plPgJZU1EwPCe0K146thUPX7uAfvvXotbb4QqPjIjsARN21UCSgA4dgHXrxGzL/v1ie7SMuAuD5FaUGN75r62YsW468twqweXUSSAgQOmhEZEKmXP95syLGnz2mQhcXF1FnovMgQsgZmAYtJCcimccLzZBzr/bUTlxv9g6vXSp0kMjIo1jzovSDh4ERo0S96dNEzuMiOxESJA3XmoZhMpfzBNPfPMNsHu3soMiIs1j8KKkGzdE8bn8fODFF4HBg5UeEZE8WrUC+vYV999+W+R2ERFZiMGLkt5+G/j7b+DBB4GvvgJ0ulKHWNRWgEiN4uIADw+R07V4sdKjISINY/BiI6WCkOXLga+/FgHLsmXAAw+Ues3UX5PQed4uRH9/GJ3n7WKRL9I2X19g/Hhxf8wYIDtb2fEQkWYxeLGB+4OQzxdtAgYOFD8cOxYIL719lJVwyS699RZQpw6QkaFozyMi0jYGLzK7PwipUHAHj419W3zrDAu7+030PqyES3bJxUUkpgPA9OlASorx44mI9GDwYgZL8k/uDzbe3rkcIReSkVfFQ3SLrqB/tzor4ZLd6tABiIgA8vKAkSOVHg0RaRCDFxNZmn/yxz+Xiu+3OvsXhuxZBQA4P3U68NBDBl9nVlsBIi3R6YAZMwAnJ2DNGmDbNqVHREQawwq7JkhMy0TnebtKPR8/OMxoMHHv6zxu38D6r99CzexL2Pd/HdD6959M/mxWwiW7NGSI6OHVtClw4ADg7Kz0iIhIQeZcvznzYgJL80+Kfy5JmLJxLmpmX8JpL3+kT/7Y5M8OCfLGS81rMnAh+zNxIuDlBRw+LEoFEBGZiMGLCSzNPyn6eZe/tqLDiT9wR+eEYR3eReBDvlYfI5HmVK9+t4v62LFAVpYiw2AtJSLtYfBiAkvzT0KCvBFTrwImbpkPAJjxRE+EvhrFWRSiIoMHi2akly4BU6bY/ONZS4lIm5jzYgaz80/u3AHatgX27MGl5m1w7odfEFKrulXHRKR569YBL7wgtlEfPw7Urm2Tj7U0l42I5MGcF5mYnX8yZQqwZw/g6Ykaa1YycCHSp3174LnnxNbpoialNsBaSkTaxeBFLnv2AJMmifvz5hndFk3k0HQ6UbDOxlunWUuJSLsYvMjh+nXgtdeAggKgRw9xIyLDGjUC3nxT3I+OFv/vyIy1lIi0izkvcujXD1i0CAgKEttAvbyUGQeRlly6BNStK3YdffUV8L//2eRjWUuJSB2Y86KkNWtE4KLTAUuXMnAhMlWNGiW7Tl+/bpOPZS0lIu1h8GJNFy4AAwaI+6NG6e0WTURGDB3KrtNEVCYGL9YiSWKa+8oVICQE+OADpUdEpD0uLsCnn4r706cDp08rOhwiUicGL9by+efAhg2Amxvw7bfiH2EiMl/HjsBTTwG5uUBMjNKjISIVYvBiDcnJwLvvivsffQQ0bKjseIi0rGjrtE4HrFwJ7CpdSI6IHBuDl/LKzxfbom/dAiIixJo9EZVPs2Z3dxsNHw4UFio6HCJSF5sEL3PnzsXDDz8MNzc3tGnTBvv27TN47OLFi6HT6Urc3NzcbDFMy0yeDOzfL3YVff21KLRFROU3eTJQpQqwbx+wfLnSoyEiFZH9Srty5UpER0djwoQJOHjwIJo2bYrIyEhcvHjR4Gs8P
Dxw4cKF4tuZM2fkHqZl9uy520zu88+BmjWVHQ+RlSnacdnPD4iNFfdjYoCbN20/BiJSJdmDl+nTp6N///7o27cvGjZsiPnz58Pd3R2LFi0y+BqdTgc/P7/im6+vr9zDNF9ODtC7t6gE+uqrQPfuSo+IyKpU0XF5+HBR7PHcubu7kIjI4ckavOTl5eHAgQOIiIi4+4FOToiIiMDu3bsNvu7GjRt46KGHEBgYiBdffBHHjh0zeGxubi6ys7NL3Gxi1Cjgn3+ABx8E5s416SWKfoslMkNiWibm/55S4rn5v6fY/ne3UiWRBA+I/54/b9vPJyJVkjV4uXz5MgoKCkrNnPj6+iI9PV3vax555BEsWrQIP/74I7799lsUFhYiLCwM586d03t8XFwcPD09i2+BgYFW/3OUsnGjaLYIiDwX77Irc6riWyyRiVTVcblbNyAsTCwbvfee7T+fiFRHddmloaGh6N27N5o1a4bw8HCsWbMGNWrUwBdffKH3+NjYWGRlZRXfzp49K+8Ar169uwti6FDg2WfLfIlqvsUSmUhVHZd1OmDGDHF/yRKRIK8ynFUlsi1Zg5fq1avD2dkZGRkZJZ7PyMiAn5+fSe9RsWJFhISE4OTJk3p/7urqCg8PjxI3WQ0dCvz7L1Cv3t3p7DKo6lsskQlU13G5dWtRkgAQeTAq6ifLWVUi25M1eHFxcUGLFi2QkJBQ/FxhYSESEhIQGhpq0nsUFBTg6NGj8Pf3l2uYplu5UmzZdHYGvvkGcHc36WWq+hZLZKKYqAaIHxyG6V2bIn5wGEZHNVB2QHFxIgdmxw7ghx+UHct/OKtKauFos3+yLxtFR0djwYIFWLJkCZKSkjBo0CDk5OSgb9++AIDevXsjtmg7JIAPPvgAmzZtQkpKCg4ePIjXXnsNZ86cwRtvvCH3UI37919g0CBxf8wY8U3QRKr7FktkIlV1XK5ZUyTKA+K/t28rOx5wVpXUwRFn/yrI/QHdunXDpUuXMH78eKSnp6NZs2bYsGFDcRJvWloanO4p7JaZmYn+/fsjPT0d3t7eaNGiBXbt2oWGSpfcT0kBKlQAWrQAxo0z++UxUQ0Q2cgPqZdzUKt6ZXVcDIi0ZuRIYOFC0bBx5kzFex9xVpWUZmj2L7KRn11fZ3SSpKLFYyvIzs6Gp6cnsrKyrJ//cvEicP06ULu2dd+XiEz3zTeixlKVKqJcgYn5c3KZ+mtSiYvHoPBg5ZfYyGGsOXgO0d8fLvX89K5N8VJzbRVONef6LfvMi13x8RE3IlJOz57A7NnAn38CY8eKmRgFcVaVlOSos3+q2ypNRGSUk5NYMgKARYuAxERFhwOoLDeIHIqj5lRy2YiItKlHD7H7r21b4PffRT0YIgeVmJap+dk/c67fDF6ISJvOngUeeQS4dQv4/nvglVeUHhERlYM5128uGxGRNgUG3t06PXKkCGKIyCEweCEi7Ro1StR/OXMGmD5d6dEQkY0weCEi7XJ3v9umIy5OFJMkIrvH4IWItO3VV4HQUCAnB7inWre1OVr5dSI1Y8IuEWnfn3/ebdmxZw/Qpo1V3/7+QnQDw4MRw0J0RFbFhF0iciytWgF9+oj7b78NFBZa7a3ZfJFIfRi8EJF9iIsTLQP27QOWLrXa27L5IjkSrSyPMniRgVb+8onsir8/MH68uB8TA2RnW+VtHbX8OjkeLXWnZvBiZVr6yyeyO++8A9StC2RkAJMmWeUtHbX8OjkWrS2PsjGjFTlqa3Ii1XBxAWbMAF54AfjsM6B/f6BevXK/LZsvkr0ztjyqxt93zrxYEdfGiVTg+eeB9u2B/Hxg+HCrvS2bL5I909ryKIMXK9LaXz6R3ZoxA6hYEVi/Hli3TunREKme1pZHWefFyu6vBzEoPBijWQ+CyPZGjgQ+/RSoUwc4ehRwc1N6RESqp2R3anaVVrhInT20JifSvOxsoH594MIFYMoUYMwYpUdEREYweGGFXSICgGXLgNdeAypVAk6cAIKClB4RERnACrtERADQowfQti1w6xYQHa30aIjIShi8EJH90umAOXMAZ2dg9Wpg82abfCwLVRLJi3VeiMi+PfooMGQIMGsW8NZbwJEjoh6MTNjEkUh+nHkhIvs3cSLg4wMkJwMzZ8r2MVqrUkqkVQxeiMj+eXkBH38s7n/wAXDunCwfw0KVRLbB4IWIVEPWXJFevYCwMCAnx6qVd+/FQpVEtsHghYhUQfampk5OwLx5Inn3hx+ADRus+/7QXpVSIq1inRciUlxiWiY6z9tV6vn4wWHWv/CPGAFMnw4EBwN//SVqwFgZC1USmY91XohIU2yaK/L++8CDDwIpKUBcnPXfH2ziSCQ3Bi9EpDib5opUrQp89pm4/9FHYgcSkcY4ei0hBi9EpDhDuSIA5PkH+qWXgKgoIC8PGDwYkGH13NEvLiQf2fPDNIA5LxbgejaRPO79f2vjsXR5i72lpACNGgG3b4seSD16WO2tWaiO5GLT/DAbY86LjBjxEsmnKFcEgPzF3oKDgbFjxf3hw4FM67w3C9VRkfLMvhl6LWsJCWwPYAZD/yhFNvLTfMRLpCbG/oG26v9r774rZl2SkoCRI4GFC8v9ljYbO6laeWbfjL2WtYQEzryYgREvkW2U5x9os77turoCX34p7n/1FbBtmxmj1I8XFyrP7FtZr1VFLaHMTJEvpiAGL2bgP0pEtmHpP9AWLes+8QQwcKC4/+abIgemHFRxcSFFleeLrimvjYlqgPjBYZjetSniB4dhtC3zqY4fB1q3Bt5+23afqQeXjcxQ9I/SvVEx/1EikkdMVANENvIzOTm+XMu6U6cCP/4I/P03MHmyuNlw7GRfyvNF19TXhgR52/736qefgJ49gRs3gDt3gCtXgAcesO0Y/sOZFzMpGvESORhzir2Va1nX0xOYM0fc/+gj4OhRc4apFwvVOa7yzL6pcuZOkoApU4BOnUTg8uSTwJ9/Kha4ANwqTUR2wipbSDt3BtauBdq0AXbuFH2QiCxUnrIaqinJkZMD9O0LrFolHg8dKtprVKxo9Y8y5/rN4IWI7Mb9uzQGhQebNzt6/jzQoAFw/bqowqvwuj6Rok6fFrMthw+LYGXuXKB/f9k+jsELgxcih1Xub6yffy6q7rq7A0eOALVrW3+QRGr3++/Ayy8Dly8DPj7A6tUiuV1GLFJHRA6r3Lkmb74p1vRv3gT69QMKC606PiJVkyQxwxIRIQKXFi2A/ftlD1zMxeCFiOheTk6i5ou7u/j2OX++0iMiso3cXGDAAJHXcueO2Fn0xx9AYKDSIyuFwQsR0f2Cg8X2aQAYNQpITVV2PEQWMKtg44ULwNNPiyrTTk7AJ58A33wDVKok/0AtwDovRET6DBkidlj88QfwxhvAli2ATqf0qIhMYlZ7gr17Raf1f/8FvLyA5cuBdu1sM1ALceaFiEifouWjSpWArVvvthEgUjmz2hN8/TXwf/8nApeGDUX9FpUHLgCDFyIiw+rWFcW5ANHEMSXF+PFlKE+XYSJTmVSwMT9flAL43/9En6JOnYA9e4A6dQy+r5p+f7ls
RERkzNtvA2vWADt2AH36iOaNFhSvK0+XYSJTJaZl4swV/cFLcYuBixeBV14Btm8XjydOBMaOFbONBqjt95czL0RExjg7A0uXAlWqiADm00/NfovydBkmMlVRY9LPEk6W+llxi4H9+8X25+3bgapVRU+v8eONBi5q/P1l8EJEVJZatUTFXQAYN05UHDVDufouEZlAX4ABAO88U+duH74lS0S9lnPngEceAfbtAzp2LPO91fj7y+CFiMgUffsCL74ocgVeew24fdvklxrrFKymPAIqP1v8fer7DEOBxEMPVEaIX2XgrbeA118XtVw6dhQ7jOrXN+nzytMlWy7MeSEiMoVOJ3Yc7d4N/PWXyBEwcQmpqFPw/X2XNh5LV1UeAZWPtfNC9LW6MPQZhgKJuoU3gGdeFUueADBhQpnLRPcz9PurZMNIm/Q2mjt3Lj755BOkp6ejadOmmD17Nlq3bm3w+FWrVmHcuHE4ffo06tati48++gjt27c36bPY24iIZPXzz+Kbq04HJCQATz1l8kvvvRgBKH8XbBmoppuxxlilq/k99AUpkY38jH7G/a+Z5JONXp+OEAXoPDyAZcuAF14weyxF5P7dUFVvo5UrVyI6OhoTJkzAwYMH0bRpU0RGRuLixYt6j9+1axdeffVV9OvXD4mJiejUqRM6deqEv/76S+6hEhGVrUMH0VlXksTy0eXLJr/03r5LaswjKEr4jP7+MDrP24WpvyYpNhatsebfp6EE2W3J+q+bRZ8RE9UA8YPDMP2VR7HdIxm9YvqIwKVhQ5GoW47ABbBC3zArkj14mT59Ovr374++ffuiYcOGmD9/Ptzd3bFo0SK9x3/22Wdo164dRo4ciQYNGmDSpElo3rw55syZI/dQiYhMM2OGSHj891+RC2PBBLba8gjUuKNES6z592luwHPvZ4TUcMNLc8Yj6L0RIj/r5ZdFfkvdumaPQ81kDV7y8vJw4MABRERE3P1AJydERERg9+7del+ze/fuEscDQGRkpMHjc3NzkZ2dXeJGRCSrypWBlSsBV1fgl1+AWbPMfouiPIJ72TKP4P6kTzXOBGmJNf8+DQU8Tz7iY/wzUlOBxx8HFi8WOS0ffQR8/73Y5m9nZE3YvXz5MgoKCuDr61vieV9fX5w4cULva9LT0/Uen56ervf4uLg4TJw40ToDJiIyVdOmwLRpogPvyJFiC2qLFma9RUxUA0Q28rN5jomhfAp9lNxRojXW+vs0liAbEuSt/zM2bAB69AAyM4Hq1UVw/fTT1vhjqZLmdxvFxsYiOjq6+HF2djYCVdi+m4js0ODBomHj2rVA9+7AwYOi8JcZii5ItmJoeSiykZ/qdpRokbX+Po0FQiU+o7AQmDwZeP99sXzZujXwww+AnV8HZQ1eqlevDmdnZ2RkZJR4PiMjA35++qN8Pz8/s453dXWFq6urdQZMRJpn090yOp1o3njgAHDypAhmli5VdfdpY8tDSs0EkX5lBkJXrgC9egG//ioev/mmKKboANdEWXNeXFxc0KJFCyQkJBQ/V1hYiISEBISGhup9TWhoaInjAWDz5s0GjyciKqLIbplq1YDly0UbgW+/BRYskP8zy6GsxFI17SghI/78E2jeXAQubm4iz2X+fIcIXAAb7DaKjo7GggULsGTJEiQlJWHQoEHIyclB3759AQC9e/dGbGxs8fHvvPMONmzYgGnTpuHEiRN4//33sX//fgwdOlTuoRKRhim6W+bxx+92n37rLVF2XaWUThSmcpIkEaQ88QSQlia6QO/dK5qGOhDZc166deuGS5cuYfz48UhPT0ezZs2wYcOG4qTctLQ0ON1T6S8sLAzfffcdxo4dizFjxqBu3bpYu3YtGjduLPdQiUjDjC2H2OTCPGqUuIjEx4vtqQcOADVqyP+5FuDykEbduCGWhr77Tjzu1EnMuHh6KjkqRdikwq4tscIukWOydoVTi2RliYTJv/8GnnkG2LhRLCcRldexYyIoPnFC/E5NnQqMGKHq/CpzqarCLhGRLahiOcTTE1izBnB3F60Dxo2z3WeTUZpugLl0KdCqlQhcHnwQ+P134N137SpwMRdnXojIrqiiN8+KFcCrr4r7a9YAnTsrMw4CYHrDRFX87tzr5k2RQ1VUkf6550RSuEqXI8vLnOs3gxciIjkMHw7MnCmq8e7cKYrakc2Zupxo7Y7Q5XbsGNC1K3D8uJhhmTgRGDPGrpchuWxERKS0jz8WeS85OaKZo4Eq4SQvU9oeqKqvkySJmZZWrUTg4ud3dwnSjgMXczF4ISKSQ8WKwKpVQL16wNmzYmfI7dtKj8rhmNIwUTV9na5fB3r3Bvr1A27dEstEhw8DTz1l23FoAIMXIiK5eHuLxo3e3mIb9f/+Z1EHarKcKYncqujwvX8/EBIiclqcnYEPPxQF6Hx8bDcGDdF8byMiImuRJWGzbl1g9WrxLXr5cqBhQ2DsWOu8N5mkrLo2xhohyq6wEJg+HYiNBe7cAYKCgGXLRBE6MogJu0REsEHC5hdfAAMHivvffAO89pr13puswua7jdLTRWXcTZvE4y5dRHsJbxXsdFIAE3aJiMwgZ8JmcX2RqK5AdLR4sm9fYMOGcr83WZdN+zr9/DPw6KMicKlUCfjyS5Ej5aCBi7m4bEREDk+u1gKlZnOe/h9i0tNFefcuXYDffhMVeclx5OSIyrhffCEeP/ro3eVEMhlnXojI4cmRsKl3NueP0zg0cTrw7LOiANnzzwPJyRZ/hlYZq3ar6Uq4Zdm/X3SCLgpcRowQTTwZuJiNMy9E5PDkSNg0NJuTkp2PZqtXA08/LS5mkZHArl1AQECpY1VX8dUKjOUWqa5QnLXcuQPExQEffCDuP/ggsGSJqANEFmHwQkQE63daNjqbU7UqsG6d2FHyzz9iJua330psi7XHC7mh3KLIRn7F9/X9TNOB24kTonbLn3+Kxy+/LGZeqlVTdlwax2UjIqL/WDNhs8z6Ij4+ouv0gw+KSqpPPw1cugRAZRVfraBoKWhb8kW9P0+9nKOeQnHWUlgIfPaZqN3y55+Al5eo4fL99wxcrIAzL0REMilzNqdWLTHjEh4uetk88wywdStSL+uvxFveBGIl3D+DpI+x3CKbFoqzlpQUUSV32zbx+LnngK++AmrWVHRY9oQzL0REMipzNqduXRHA+PkBR48CERGoUyFP76Fau5Drm0G6X9FslCmVcFWvsBCYNQto0kQELu7uwLx5Yls8Axer4swLEZHSHnkE2LpV9LA5fBiPvv4yho2ei5mJV4sP0dyFHIaXfN55pg4eeqByqdkoa+cd2dTff4v2Dzt3isdPPQUsXAgEBxt/HVmEwQsRkRo0aHA3gElMxLCJ/RCxaCX+dvG2+oXcVruYDM0UPfmIj8HPLZqF0Yz8fFHe//33RePNKlWATz4BBgwAnLi4IRcGL0REatGwoVhueO45ICkJjbs+j8abNgFB1ltysOUuJkV7BtnC3r0iSDlyRDx+9llR3v+hh5QdlwNgbyMiIrVJSxMBTHIy8MADortwq1blftvEtEx0nrer1PPxg8N
kDSjsrl5Ndjbw3nvA3LmiS/gDD4jZl169AJ1O6dFpFnsbERFpWVAQ8McfQMuWwJUrYilp8+Zyv61S25Ft2jNITpIktjo3bAjMmSMe9+59t5YLAxebYfBCRKRGNWqIHJhnnhH9cKKixM6VcpCjDYLDOHFCLAt16wacPw/Uri0CyiVLgOrVlR6dw2HwQkSkVlWr4tDn3yItqjNQUAAMGQIMGiSSRC1gF9uRbe3GDSA2VjRQTEgA3NyAiROBv/4CIiKUHp3DYs4LEZFKFSfXShIG7l2N0duXQCdJwJNPAqtWWfyN3+5yUORQWAgsXQqMGQNcuCCee+EFUTWX259lYc71m8ELEZEK6UuuffrkPizYOB3ON26I6ryrV4vy82Rd27cDw4cDBw+Kx8HBwIwZQMeOyo7LzjFhl4hI4/Ql0W6t0xpbF60VF9PUVOCxx4CZM0XiKJVfcrJonBgeLgIXDw9Rs+X4cQYuKsPghYhIhQwl0VZv01w0+uvUCcjLEzMEL7wAXNTf9JBMcO4c0L8/0KiRmM1ycgIGDhQdv999F3B1VXqEdB8GL0REKmQ0ubZaNWDNGrH7yNUVWL8eaNoU2LRJodFq1JUrwKhRor/UwoUiKbpjR+DwYeDzz0Xnb1Il5rwQEalYmcm1R48C3buLpQ0AeP114NNPReE00u/iRVFUbs4csQ0dANq2BaZOBcLClB2bA2PCLoMXInIkN2+KGYR580T+S/XqIhemRw8WTrtXeroI7D7/XJwzQCQ8T54s6ujwXCmKCbtERI7E3V3MIuzcKfI2Ll8GXntNXJD//lvp0SkvKUn0IHr4YWDaNBG4tGwJ/PwzcOAA0L49AxeNYfBCRGQvQkPFLpnJk0UuzMaNopT9oEF3a5U4CkkSFYqff16cgwULgNxcsUPr11+BfftEojODFk1i8EJEZE9cXETTwCNHgA4dRBLq/PlAnTrAuHGiqaA9u3ZNzEI9+qhorbB+vQhQOnUS/aJ27QLatWPQonHMeSEismd//AGMHg3s3i0ee3mJbcBvvQUEBCg6NKuRJGDvXuDLL4EVK4Bbt8TzlSoBffsCw4aJHUV2yJ6qJTNhl8ELEdFdkgT8+KPo0XPihHiuYkXg1VeBESPELIUWJScD330nbidP3n2+cWPgzTdF3o+Xl2LDk1tx+4j/DAwPRkxUAwVHVD4MXhi8EBGVVlAgklSnTQN27Lj7fGiouNB37aruDsmSBBw7Bvzyi+jtVFS+HxBJyy+/LIKW0FC7XxbS1z4CAOIHh2l2Bsac63cFG42JiIiU5uwscj86dRIJq9OnAz/8IJaUdu8G3nlH5IN07w5ERqojkLlxQyx9rVsngpYzZ+7+rEIFMc4ePURxuSpVlBunjelrH1H0vFaDF3MweCEickStW4v8kAsXxH+XLRPbhn/5Rdx0OqB5c+DZZ4HnngPatBGzG3KSJCAjA9izRwQs27cDiYlixqiIq6tIxO3QQcy0qCHAUoCh9hGGnrc3XDYiIiIhKUkEMT//LHYr3cvJCahXD2jWTBR2a9wYCArCEakKTuY5o1aNKqZ/4795U/QTOnsWOH1aVAkuul26VPr4oCAxw9KhA/D000Blx7hAl+X+nJdB4cEYzZwXbWLwQkRkBenpwJYtol/Sli1G68TkVHRDetUHUKl6NQT4eQNubuJWsaIIVHJy7t4uXgSuXjX8uU5OQP36olx/0S0oSIY/oH3gbiM7weCFiMh0Jl/80tOBQ4fELTERt478hdunz8L79nXLPrhKFSAwUNwaNhQ7npo0EfflXp4iVWLCLhERlcmsrbZ+fiKZt107AMCvB88h+vvDcM3Phd+NK/C/fhnuebcxoHUAHguoDNy+DeTliUCkcuW7twceEAGLh4fd7wgi+TB4ISJyQIlpmSUCFwCY/3sKIhv5mbT8UJQYmlvRFWe8A3DGWxS8e6t7GKDx5QtSP7YHICJyQMa22poiJMgbA8ODSzw3KDxY0byLxLRMrDl4DolpmYqNgWyDMy9ERA7IGlttY6IaILKRnyoSRu2t2iwZx5kXIiIHZK2Zk5Agb7zUvKbiMy76lsA4A2O/OPNCROSg1DRzUh6OXm3WETF4ISJyYCFB3pq/wDt6tVlHxGUjIiLSrKI6NZ1DAko8r3TyMMmLMy9ERKRJ9yfpdg4JQNu6NTS9BEam4cwLERFpjr4k3fjEfxm4OAhZg5erV6+iZ8+e8PDwgJeXF/r164cbN24Yfc2TTz4JnU5X4jZw4EA5h0lERBpT3jo1SmNNmvKRddmoZ8+euHDhAjZv3oz8/Hz07dsXAwYMwHfffWf0df3798cHH3xQ/NidfS6IiOgeWk7SZU2a8pNt5iUpKQkbNmzAwoUL0aZNGzzxxBOYPXs2VqxYgX///dfoa93d3eHn51d8Y4NFIiK6lxor/JqCNWmsQ7bgZffu3fDy8kLLli2Ln4uIiICTkxP27t1r9LXLli1D9erV0bhxY8TGxuLmzZsGj83NzUV2dnaJGxER2b+YqAaIHxyG6V2bIn5wGEZrYPZC68tdaiHbslF6ejp8fHxKfliFCqhWrRrS09MNvq5Hjx546KGHEBAQgCNHjmD06NFITk7GmjVr9B4fFxeHiRMnWnXsRESkDVqrU6Pl5S41MXvmJSYmplRC7f23EydOWDygAQMGIDIyEk2aNEHPnj2xdOlSxMfH49SpU3qPj42NRVZWVvHt7NmzFn82ERGRnLS63KU2Zs+8jBgxAq+//rrRY4KDg+Hn54eLFy+WeP7OnTu4evUq/Pz8TP68Nm3aAABOnjyJ2rVrl/q5q6srXF1dTX4/IiIiJdlLWwYlmR281KhRAzVq1CjzuNDQUFy7dg0HDhxAixYtAABbt25FYWFhcUBiikOHDgEA/P39zR0qERGRKmltuUttZEvYbdCgAdq1a4f+/ftj37592LlzJ4YOHYru3bsjIECUcT5//jzq16+Pffv2AQBOnTqFSZMm4cCBAzh9+jR++ukn9O7dG//3f/+HRx99VK6hEhHRPRLTMjFjczJmbE7mLhhSJVnrvCxbtgxDhw7FM888AycnJ3Tp0gWzZs0q/nl+fj6Sk5OLdxO5uLhgy5YtmDlzJnJychAYGIguXbpg7Nixcg6TiIj+c38Nks8STrIOCamOTpIkSelBWFN2djY8PT2RlZXF+jBERGZITMtE53m79P4sfnCYzZc5ipouMi/EMZhz/WZjRiIiAgDM3vqPwZ+lXs6xaQDBKrRkDBszEhEREtMysfXEJYM/N1aHxNp9eliFlsrCmRciIjJa4dVYHRI5ZkiMVaHl8hEBnHkhIiIYnln5qEsTg2X35ZohYRVaKguDFyIiMlj5tVurIIOvkatPD6vQUlm4bERERADMr/wq5wwJq9CSMQxeiIiomDmVX4tmSO5dOrLmDAmr0JIhDF6IiMhijjhDwvozymPwQkRE5eJIMySsP6MOTNglIiIyAevPqAeDFyIiIhPItbuKzMfghYiIyASsP6MeDF6IiIhMwPoz6sGEXSIiIhM54u4qNWLwQkREZAZH2l2lVlw2IiIiIk1h8EJERE
SawuCFiIiINIXBCxEREWkKgxciIiLSFAYvREREpCncKk1EpFHsbkyOisELEZEGsbsxOTIuGxERaQy7G5OjY/BCRKQx7G5Mjo7BCxGRxrC7MTk6Bi9ERBrD7sbk6JiwS0SkQVrtbswdUmQNDF6IiDRKa92NuUOKrIXLRkREJDvukCJrYvBCRESy4w4psiYGL0REJDvukCJrYvBCRESy4w4psiYm7BIRkU12AWl1hxSpD4MXIiIHZ8tdQFrbIUXqxGUjIiIHxl1ApEUMXoiIHBh3AZEWcdmIiMiB2fMuIH15PKzwax8YvBARObCiXUD3Lh3Zwy4gfXk8AFjh107oJEmSlB6ENWVnZ8PT0xNZWVnw8PBQejhERJpgTzMSiWmZ6Dxvl0nHxg8O0/yf116Yc/3mzAsREdnVLiBz8nVSL+fYzZ/bkTBhl4iI7Io5+Tr2kNvjiBi8EBGRXTFUzZcVfu0Hl42IiMjuGKrmywq/9oEJu0RERKQ4c67fXDYiIiIiTWHwQkRERJrC4IWIiIg0hcELERERaQqDFyIiItIUBi9ERESkKbIFL1OmTEFYWBjc3d3h5eVl0mskScL48ePh7++PSpUqISIiAv/8849cQyQiIiINki14ycvLwyuvvIJBgwaZ/JqPP/4Ys2bNwvz587F3715UrlwZkZGRuH37tlzDJCIiIo2RvUjd4sWLMWzYMFy7ds3ocZIkISAgACNGjMC7774LAMjKyoKvry8WL16M7t27m/R5LFJHRESkPZosUpeamor09HREREQUP+fp6Yk2bdpg9+7dBl+Xm5uL7OzsEjciIiKyX6oJXtLT0wEAvr6+JZ739fUt/pk+cXFx8PT0LL4FBgbKOk4iIiJSllmNGWNiYvDRRx8ZPSYpKQn169cv16DMERsbi+jo6OLHWVlZCAoK4gwMERGRhhRdt03JZjEreBkxYgRef/11o8cEBwcb/bkhfn5+AICMjAz4+/sXP5+RkYFmzZoZfJ2rqytcXV2LHxf94TkDQ0REpD3Xr1+Hp6en0WPMCl5q1KiBGjVqlGtQhtSqVQt+fn5ISEgoDlays7Oxd+9es3YsBQQE4OzZs6hatSp0Op1Vx5idnY3AwECcPXuWycAy4nm2DZ5n2+B5th2ea9uQ6zxLkoTr168jICCgzGPNCl7MkZaWhqtXryItLQ0FBQU4dOgQAKBOnTqoUqUKAKB+/fqIi4tD586dodPpMGzYMEyePBl169ZFrVq1MG7cOAQEBKBTp04mf66TkxNq1qwpw5/oLg8PD/6PYQM8z7bB82wbPM+2w3NtG3Kc57JmXIrIFryMHz8eS5YsKX4cEhICAPjtt9/w5JNPAgCSk5ORlZVVfMyoUaOQk5ODAQMG4Nq1a3jiiSewYcMGuLm5yTVMIiIi0hjZ67zYE9aQsQ2eZ9vgebYNnmfb4bm2DTWcZ9VsldYCV1dXTJgwoUSCMFkfz7Nt8DzbBs+z7fBc24YazjNnXoiIiEhTOPNCREREmsLghYiIiDSFwQsRERFpCoMXIiIi0hQGL/eZO3cuHn74Ybi5uaFNmzbYt2+f0eNXrVqF+vXrw83NDU2aNMH69ettNFJtM+c8L1iwAG3btoW3tze8vb0RERFR5t8LCeb+PhdZsWIFdDqdWQUiHZm55/natWsYMmQI/P394erqinr16vHfDhOYe55nzpyJRx55BJUqVUJgYCCGDx+O27dv22i02rR9+3Z06NABAQEB0Ol0WLt2bZmv2bZtG5o3bw5XV1fUqVMHixcvln2ckKjYihUrJBcXF2nRokXSsWPHpP79+0teXl5SRkaG3uN37twpOTs7Sx9//LF0/PhxaezYsVLFihWlo0eP2njk2mLuee7Ro4c0d+5cKTExUUpKSpJef/11ydPTUzp37pyNR64t5p7nIqmpqdKDDz4otW3bVnrxxRdtM1gNM/c85+bmSi1btpTat28v7dixQ0pNTZW2bdsmHTp0yMYj1xZzz/OyZcskV1dXadmyZVJqaqq0ceNGyd/fXxo+fLiNR64t69evl9577z1pzZo1EgApPj7e6PEpKSmSu7u7FB0dLR0/flyaPXu25OzsLG3YsEHWcTJ4uUfr1q2lIUOGFD8uKCiQAgICpLi4OL3Hd+3aVXr++edLPNemTRvpzTfflHWcWmfueb7fnTt3pKpVq0pLliyRa4h2wZLzfOfOHSksLExauHCh1KdPHwYvJjD3PH/++edScHCwlJeXZ6sh2gVzz/OQIUOkp59+usRz0dHR0uOPPy7rOO2JKcHLqFGjpEaNGpV4rlu3blJkZKSMI5MkLhv9Jy8vDwcOHEBERETxc05OToiIiMDu3bv1vmb37t0ljgeAyMhIg8eTZef5fjdv3kR+fj6qVasm1zA1z9Lz/MEHH8DHxwf9+vWzxTA1z5Lz/NNPPyE0NBRDhgyBr68vGjdujA8//BAFBQW2GrbmWHKew8LCcODAgeKlpZSUFKxfvx7t27e3yZgdhVLXQdl6G2nN5cuXUVBQAF9f3xLP+/r64sSJE3pfk56ervf49PR02capdZac5/uNHj0aAQEBpf6HobssOc87duzAV199VdxElcpmyXlOSUnB1q1b0bNnT6xfvx4nT57E4MGDkZ+fjwkTJthi2JpjyXnu0aMHLl++jCeeeAKSJOHOnTsYOHAgxowZY4shOwxD18Hs7GzcunULlSpVkuVzOfNCmjJ16lSsWLEC8fHxbNhpRdevX0evXr2wYMECVK9eXenh2LXCwkL4+Pjgyy+/RIsWLdCtWze89957mD9/vtJDsyvbtm3Dhx9+iHnz5uHgwYNYs2YN1q1bh0mTJik9NLICzrz8p3r16nB2dkZGRkaJ5zMyMuDn56f3NX5+fmYdT5ad5yKffvoppk6dii1btuDRRx+Vc5iaZ+55PnXqFE6fPo0OHToUP1dYWAgAqFChApKTk1G7dm15B61Blvw++/v7o2LFinB2di5+rkGDBkhPT0deXh5cXFxkHbMWWXKex40bh169euGNN94AADRp0gQ5OTkYMGAA3nvvPTg58bu7NRi6Dnp4eMg26wJw5qWYi4sLWrRogYSEhOLnCgsLkZCQgNDQUL2vCQ0NLXE8AGzevNng8WTZeQaAjz/+GJMmTcKGDRvQsmVLWwxV08w9z/Xr18fRo0dx6NCh4lvHjh3x1FNP4dChQwgMDLTl8DXDkt/nxx9/HCdPniwODgHg77//hr+/PwMXAyw5zzdv3iwVoBQFjBJb+lmNYtdBWdOBNWbFihWSq6urtHjxYun48ePSgAEDJC8vLyk9PV2SJEnq1auXFBMTU3z8zp07pQoVKkiffvqplJSUJE2YMIFbpU1g7nmeOnWq5OLiIv3www/ShQsXim/Xr19X6o+gCeae5/txt5FpzD3PaWlpUtWqVaWhQ4dKycnJ0i+//CL5+PhIkydPVuqPoAnmnucJEyZIVatWlZYvXy6lpKRImzZtkmrXri117dpVqT+CJly/fl1KTEyUEhMTJQDS9OnTpcTEROnMmTOSJElSTEyM1KtXr+Lji7ZKjxw5UkpKSpLmz
p3LrdJKmD17thQUFCS5uLhIrVu3lvbs2VP8s/DwcKlPnz4ljv/++++levXqSS4uLlKjRo2kdevW2XjE2mTOeX7ooYckAKVuEyZMsP3ANcbc3+d7MXgxnbnnedeuXVKbNm0kV1dXKTg4WJoyZYp0584dG49ae8w5z/n5+dL7778v1a5dW3Jzc5MCAwOlwYMHS5mZmbYfuIb89ttvev+9LTq3ffr0kcLDw0u9plmzZpKLi4sUHBwsff3117KPUydJnD8jIiIi7WDOCxEREWkKgxciIiLSFAYvREREpCkMXoiIiEhTGLwQERGRpjB4ISIiIk1h8EJERESawuCFiIiINIXBCxEREWkKgxciIiLSFAYvREREpCkMXoiIiEhT/h+72Om7LI9kywAAAABJRU5ErkJggg==",
237 | "text/plain": [
238 | ""
239 | ]
240 | },
241 | "metadata": {},
242 | "output_type": "display_data"
243 | }
244 | ],
245 | "source": [
246 | "import torch\n",
247 | "import torch.nn as nn\n",
248 | "import torch.nn.functional as F\n",
249 | "\n",
250 | "\n",
251 | "torch.manual_seed(0)\n",
252 | "x = torch.rand(100, 1)\n",
253 | "y = torch.sin(2 * torch.pi * x) + torch.rand(100, 1)\n",
254 | "\n",
255 | "# model\n",
256 | "class Model(nn.Module):\n",
257 |     "    def __init__(self, input_size=1, hidden_size=10, output_size=1):\n",
258 | " super().__init__()\n",
259 | " self.linear1 = nn.Linear(input_size, hidden_size)\n",
260 | " self.linear2 = nn.Linear(hidden_size, output_size)\n",
261 | "\n",
262 | " def forward(self, x):\n",
263 | " y = self.linear1(x)\n",
264 | " y = F.sigmoid(y)\n",
265 | " y = self.linear2(y)\n",
266 | " return y\n",
267 | "\n",
268 | "\n",
269 | "lr = 0.2\n",
270 | "iters = 10000\n",
271 | "\n",
272 | "model = Model()\n",
273 | "optimizer = torch.optim.SGD(model.parameters(), lr=lr)\n",
274 | "\n",
275 | "for i in range(iters):\n",
276 | " y_pred = model(x)\n",
277 | " loss = F.mse_loss(y, y_pred)\n",
278 | " optimizer.zero_grad()\n",
279 | " loss.backward()\n",
280 | " optimizer.step()\n",
281 | "\n",
282 | " if i % 1000 == 0:\n",
283 | " print(loss.item())\n",
284 | "\n",
285 | "print(loss.item())\n",
286 | "\n",
287 | "# plot\n",
288 | "import matplotlib.pyplot as plt\n",
289 | "plt.scatter(x.detach().numpy(), y.detach().numpy(), s=10)\n",
290 | "x = torch.linspace(0, 1, 100).reshape(-1, 1)\n",
291 | "y = model(x).detach().numpy()\n",
292 | "plt.plot(x, y, color='red')\n",
293 | "plt.show()"
294 | ]
295 | },
296 | {
297 | "cell_type": "markdown",
298 | "metadata": {},
299 | "source": [
300 | "## vision.py"
301 | ]
302 | },
303 | {
304 | "cell_type": "code",
305 | "execution_count": 5,
306 | "metadata": {},
307 | "outputs": [
308 | {
309 | "name": "stdout",
310 | "output_type": "stream",
311 | "text": [
312 | "Downloading http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz\n",
313 | "Downloading http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz to ./data/MNIST/raw/train-images-idx3-ubyte.gz\n"
314 | ]
315 | },
316 | {
317 | "name": "stderr",
318 | "output_type": "stream",
319 | "text": [
320 | "100%|██████████| 9912422/9912422 [00:01<00:00, 6778804.78it/s]\n"
321 | ]
322 | },
323 | {
324 | "name": "stdout",
325 | "output_type": "stream",
326 | "text": [
327 | "Extracting ./data/MNIST/raw/train-images-idx3-ubyte.gz to ./data/MNIST/raw\n",
328 | "\n",
329 | "Downloading http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz\n",
330 | "Downloading http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz to ./data/MNIST/raw/train-labels-idx1-ubyte.gz\n"
331 | ]
332 | },
333 | {
334 | "name": "stderr",
335 | "output_type": "stream",
336 | "text": [
337 | "100%|██████████| 28881/28881 [00:00<00:00, 3582730.29it/s]\n"
338 | ]
339 | },
340 | {
341 | "name": "stdout",
342 | "output_type": "stream",
343 | "text": [
344 | "Extracting ./data/MNIST/raw/train-labels-idx1-ubyte.gz to ./data/MNIST/raw\n",
345 | "\n",
346 | "Downloading http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz\n",
347 | "Downloading http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz to ./data/MNIST/raw/t10k-images-idx3-ubyte.gz\n"
348 | ]
349 | },
350 | {
351 | "name": "stderr",
352 | "output_type": "stream",
353 | "text": [
354 | "100%|██████████| 1648877/1648877 [00:00<00:00, 5282915.73it/s]\n"
355 | ]
356 | },
357 | {
358 | "name": "stdout",
359 | "output_type": "stream",
360 | "text": [
361 | "Extracting ./data/MNIST/raw/t10k-images-idx3-ubyte.gz to ./data/MNIST/raw\n",
362 | "\n",
363 | "Downloading http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz\n",
364 | "Downloading http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz to ./data/MNIST/raw/t10k-labels-idx1-ubyte.gz\n"
365 | ]
366 | },
367 | {
368 | "name": "stderr",
369 | "output_type": "stream",
370 | "text": [
371 | "100%|██████████| 4542/4542 [00:00<00:00, 8873092.11it/s]\n"
372 | ]
373 | },
374 | {
375 | "name": "stdout",
376 | "output_type": "stream",
377 | "text": [
378 | "Extracting ./data/MNIST/raw/t10k-labels-idx1-ubyte.gz to ./data/MNIST/raw\n",
379 | "\n",
380 | "size: 60000\n",
381 |     "type: <class 'PIL.Image.Image'>\n",
382 | "label: 5\n"
383 | ]
384 | },
385 | {
386 | "data": {
387 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAaAAAAGdCAYAAABU0qcqAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAbe0lEQVR4nO3df2xV9f3H8dflR6+I7e1KbW8rPyygsIlgxqDrVMRRKd1G5McWdS7BzWhwrRGYuNRM0W2uDqczbEz5Y4GxCSjJgEEWNi22ZLNgQBgxbg0l3VpGWyZb7y2FFmw/3z+I98uVFjyXe/u+vTwfySeh955378fjtU9vezn1OeecAADoZ4OsNwAAuDIRIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYGKI9QY+qaenR8eOHVN6erp8Pp/1dgAAHjnn1N7ervz8fA0a1PfrnKQL0LFjxzRq1CjrbQAALlNTU5NGjhzZ5/1J9y249PR06y0AAOLgUl/PExag1atX6/rrr9dVV12lwsJCvfvuu59qjm+7AUBquNTX84QE6PXXX9eyZcu0YsUKvffee5oyZYpKSkp0/PjxRDwcAGAgcgkwffp0V1ZWFvm4u7vb5efnu8rKykvOhkIhJ4nFYrFYA3yFQqGLfr2P+yugM2fOaP/+/SouLo7cNmjQIBUXF6u2tvaC47u6uhQOh6MWACD1xT1AH374obq7u5Wbmxt1e25urlpaWi44vrKyUoFAILJ4BxwAXBnM3wVXUVGhUCgUWU1NTdZbAgD0g7j/PaDs7GwNHjxYra2tUbe3trYqGAxecLzf75ff74/3NgAASS7ur4DS0tI0depUVVVVRW7r6elRVVWVioqK4v1wAIABKiFXQli2bJkWLVqkL3zhC5o+fbpefvlldXR06Nvf/nYiHg4AMAAlJED33HOP/vOf/+jpp59WS0uLbrnlFu3cufOCNyYAAK5cPuecs97E+cLhsAKBgPU2AACXKRQKKSMjo8/7zd8FBwC4MhEgAIAJAgQAMEGAAAAmCBAAwAQBAgCYIEAAABMECABgggABAEwQIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYIIAAQBMECAAgAkCBAAwQYAAACYIEADABAECAJggQAAAEwQIAGCCAAEATBAgAIAJAgQAMEGAAAAmCBAAwAQBAgCYIEAAABMECABgggABAEwQIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYIIAAQBMECAAgAkCBAAwQYAAACYIEADABAECAJggQAAAEwQIAGCCAAEATBAgAIAJAgQAMEGAAAAmhlhvAEgmgwcP9jwTCAQSsJP4KC8vj2nu6quv9jwzYcIEzzNlZWWeZ372s595nrnvvvs8z0hSZ2en55nnn3/e88yzzz7reSYV8AoIAGCCAAEATMQ9QM8884x8Pl/UmjhxYrwfBgAwwCXkZ0A33XST3nrrrf9/kCH8qAkAEC0hZRgyZIiCwWAiPjUAIEUk5GdAhw8fVn5+vsaOHav7779fjY2NfR7b1dWlcDgctQAAqS/uASosLNS6deu0c+dOvfLKK2poaNDtt9+u9vb2Xo+vrKxUIBCIrFGjRsV7SwCAJBT3AJWWluob3/iGJk+erJKSEv3xj39UW1ub3njjjV6Pr6ioUCgUiqympqZ4bwkAkIQS/u6AzMxM3Xjjjaqvr+/1fr/fL7/fn+htAACSTML/HtDJkyd15MgR5eXlJfqhAAADSNwD9Pjjj6umpkb//Oc/9c4772j+/PkaPHhwzJfCAACkprh/C+7o0aO67777dOLECV177bW67bbbtGfPHl177bXxfigAwAAW9wBt2rQp3p8SSWr06NGeZ9LS0jzPfOlLX/I8c9ttt3mekc79zNKrhQsXxvRYqebo0aOeZ1atWuV5Zv78+Z5n+noX7qX87W9/8zxTU1MT02NdibgWHADABAECAJggQAAAEwQIAGCCAAEATBAgAIAJAgQAMEGAAAAmCBAAwAQBAgCYIEAAABMECABgwuecc9abOF84HFYgELDexhXllltuiWlu165dnmf4dzsw9PT0eJ75zne+43nm5MmTnmdi0dzcHNPc//73P88zdXV1MT1WKgqFQsrIyOjzfl4BAQBMECAAgAkCBAAwQYAAACYIEADABAECAJggQAAAEwQIAGCCAAEATBAgAIAJAgQAMEGAAAAmCBAAwMQQ6w3AXmNjY0xzJ06c8DzD1bDP2bt3r+eZtrY2zzN33nmn5xlJOnPmjOeZ3/72tzE9Fq5cvAICAJggQAAAEwQIAGCCAAEATBAgAIAJAgQAMEGAAAAmCBAAwAQBAgCYIEAAABMECABgggABAExwMVLov//9b0xzy5cv9zzzta99zfPMgQMHPM+sWrXK80ysDh486Hnmrrvu8jzT0dHheeamm27yPCNJjz32WExzgBe8AgIAmCBAAAATBAgAYIIAAQBMECAAgAkCBAAwQYAAACYIEADABAECAJggQAAAEwQIAGCCAAEATPicc856E+cLh8MKBALW20CCZGRkeJ5pb2/3PLNmzRrPM5L04IMPep751re+5Xlm48aNnmeAgSYUCl30v3leAQEATBAgAIAJzwHavXu35s6dq/z8fPl8Pm3dujXqfuecnn76aeXl5WnYsGEqLi7W4cOH47VfAECK8Bygjo4OTZkyRatXr+71/pUrV2rVqlV69dVXtXfvXg0fPlwlJSXq7Oy87M0CAFKH59+IWlpaqtLS0l7vc87p5Zdf1g9+8APdfffdkqT169crNzdXW7du1b333nt5uwUApIy4/gyooaFBLS0tKi4ujtwWCARUWFio2traXme6uroUDoejFgAg9cU1QC0tLZKk3NzcqNtzc3Mj931SZWWlAoFAZI0aNSqeWwIAJCnzd8FVVFQoFApFVlNTk/WWAAD9IK4BCgaDkqTW1tao21tbWyP3fZLf71dGRkbUAgCkvrgGqKCgQMFgUFVVVZHbwuGw9u7dq6Kiong+FABggPP8LriTJ0+qvr4+8nFDQ4MOHjyorKwsjR49WkuWLNGPf/xj3XDDDSooKNBTTz2l/Px8zZs3L577BgAMcJ4DtG/fPt15552Rj5ctWyZJWrRokdatW6cnnnhCHR0devjhh9XW1qbbbrtNO3fu1FVXXRW/XQMABjwuRoqU9MILL8Q09/H/UHlRU1Pjeeb8v6rwafX09HieASxxMVIAQFIiQAAAEwQIAGCCAAEATBAgAIAJAgQAMEGAAAAmCBAAwAQBAgCYIEAAABMECABgggABAEwQIACACa6GjZQ0fPjwmOa2b9/ueeaOO+7wPFNaWup55s9//rPnGcASV8MGACQlAgQAMEGAAAAmCBAAwAQBAgCYIEAAABMECABgggABAEwQIACACQIEADBBgAAAJggQAMAEFyMFzjNu3DjPM++9957nmba2Ns8zb7/9tueZffv2eZ6RpNWrV3ueSbIvJUgCXIwUAJCUCBAAwAQBAgCYIEAAABMECABgggABAEw
QIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATXIwUuEzz58/3PLN27VrPM+np6Z5nYvXkk096nlm/fr3nmebmZs8zGDi4GCkAICkRIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYIIAAQBMECAAgAkCBAAwQYAAACa4GClgYNKkSZ5nXnrpJc8zs2bN8jwTqzVr1nieee655zzP/Pvf//Y8AxtcjBQAkJQIEADAhOcA7d69W3PnzlV+fr58Pp+2bt0adf8DDzwgn88XtebMmROv/QIAUoTnAHV0dGjKlClavXp1n8fMmTNHzc3NkbVx48bL2iQAIPUM8TpQWlqq0tLSix7j9/sVDAZj3hQAIPUl5GdA1dXVysnJ0YQJE/TII4/oxIkTfR7b1dWlcDgctQAAqS/uAZozZ47Wr1+vqqoq/fSnP1VNTY1KS0vV3d3d6/GVlZUKBAKRNWrUqHhvCQCQhDx/C+5S7r333sifb775Zk2ePFnjxo1TdXV1r38noaKiQsuWLYt8HA6HiRAAXAES/jbssWPHKjs7W/X19b3e7/f7lZGREbUAAKkv4QE6evSoTpw4oby8vEQ/FABgAPH8LbiTJ09GvZppaGjQwYMHlZWVpaysLD377LNauHChgsGgjhw5oieeeELjx49XSUlJXDcOABjYPAdo3759uvPOOyMff/zzm0WLFumVV17RoUOH9Jvf/EZtbW3Kz8/X7Nmz9aMf/Uh+vz9+uwYADHhcjBQYIDIzMz3PzJ07N6bHWrt2recZn8/neWbXrl2eZ+666y7PM7DBxUgBAEmJAAEATBAgAIAJAgQAMEGAAAAmCBAAwAQBAgCYIEAAABMECABgggABAEwQIACACQIEADBBgAAAJrgaNoALdHV1eZ4ZMsTzb3fRRx995Hkmlt8tVl1d7XkGl4+rYQMAkhIBAgCYIEAAABMECABgggABAEwQIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYML71QMBXLbJkyd7nvn617/ueWbatGmeZ6TYLiwaiw8++MDzzO7duxOwE1jgFRAAwAQBAgCYIEAAABMECABgggABAEwQIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYIKLkQLnmTBhgueZ8vJyzzMLFizwPBMMBj3P9Kfu7m7PM83NzZ5nenp6PM8gOfEKCABgggABAEwQIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYIIAAQBMECAAgAkCBAAwwcVIkfRiuQjnfffdF9NjxXJh0euvvz6mx0pm+/bt8zzz3HPPeZ75wx/+4HkGqYNXQAAAEwQIAGDCU4AqKys1bdo0paenKycnR/PmzVNdXV3UMZ2dnSorK9OIESN0zTXXaOHChWptbY3rpgEAA5+nANXU1KisrEx79uzRm2++qbNnz2r27Nnq6OiIHLN06VJt375dmzdvVk1NjY4dOxbTL98CAKQ2T29C2LlzZ9TH69atU05Ojvbv368ZM2YoFArp17/+tTZs2KAvf/nLkqS1a9fqs5/9rPbs2aMvfvGL8ds5AGBAu6yfAYVCIUlSVlaWJGn//v06e/asiouLI8dMnDhRo0ePVm1tba+fo6urS+FwOGoBAFJfzAHq6enRkiVLdOutt2rSpEmSpJaWFqWlpSkzMzPq2NzcXLW0tPT6eSorKxUIBCJr1KhRsW4JADCAxBygsrIyvf/++9q0adNlbaCiokKhUCiympqaLuvzAQAGhpj+Imp5ebl27Nih3bt3a+TIkZHbg8Ggzpw5o7a2tqhXQa2trX3+ZUK/3y+/3x/LNgAAA5inV0DOOZWXl2vLli3atWuXCgoKou6fOnWqhg4dqqqqqshtdXV1amxsVFFRUXx2DABICZ5eAZWVlWnDhg3atm2b0tPTIz/XCQQCGjZsmAKBgB588EEtW7ZMWVlZysjI0KOPPqqioiLeAQcAiOIpQK+88ookaebMmVG3r127Vg888IAk6ec//7kGDRqkhQsXqqurSyUlJfrVr34Vl80CAFKHzznnrDdxvnA4rEAgYL0NfAq5ubmeZz73uc95nvnlL3/peWbixImeZ5Ld3r17Pc+88MILMT3Wtm3bPM/09PTE9FhIXaFQSBkZGX3ez7XgAAAmCBAAwAQBAgCYIEAAABMECABgggABAEwQIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYCKm34iK5JWVleV5Zs2aNTE91i233OJ5ZuzYsTE9VjJ75513PM+8+OKLnmf+9Kc/eZ45ffq05xmgv/AKCABgggABAEwQIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYIIAAQBMECAAgAkCBAAwwcVI+0lhYaHnmeXLl3uemT59uueZ6667zvNMsjt16lRMc6tWrfI885Of/MTzTEdHh+cZINXwCggAYIIAAQBMECAAgAkCBAAwQYAAACYIEADABAECAJggQAAAEwQIAGCCAAEATBAgAIAJAgQAMMHFSPvJ/Pnz+2WmP33wwQeeZ3bs2OF55qOPPvI88+KLL3qekaS2traY5gB4xysgAIAJAgQAMEGAAAAmCBAAwAQBAgCYIEAAABMECABgggABAEwQIACACQIEADBBgAAAJggQAMCEzznnrDdxvnA4rEAgYL0NAMBlCoVCysjI6PN+XgEBAEwQIACACU8Bqqys1LRp05Senq6cnBzNmzdPdXV1UcfMnDlTPp8vai1evDiumwYADHyeAlRTU6OysjLt2bNHb775ps6ePavZs2ero6Mj6riHHnpIzc3NkbVy5cq4bhoAMPB5+o2oO3fujPp43bp1ysnJ0f79+zVjxozI7VdffbWCwWB8dggASEmX9TOgUCgkScrKyoq6/bXXXlN2drYmTZqkiooKnTp1qs/P0dXVpXA4HLUAAFcAF6Pu7m731a9+1d16661Rt69Zs8bt3LnTHTp0yP3ud79z1113nZs/f36fn2fFihVOEovFYrFSbIVCoYt2JOYALV682I0ZM8Y1NTVd9LiqqionydXX1/d6f2dnpwuFQpHV1NRkftJYLBaLdfnrUgHy9DOgj5WXl2vHjh3avXu3Ro4cedFjCwsLJUn19fUaN27cBff7/X75/f5YtgEAGMA8Bcg5p0cffVRbtmxRdXW1CgoKLjlz8OBBSVJeXl5MGwQApCZPASorK9OGDRu0bds2paenq6WlRZIUCAQ0bNgwHTlyRBs2bNBXvvIVjRgxQocOHdLSpUs1Y8YMTZ48OSH/AACAAcrLz33Ux/f51q5d65xzrrGx0c2YMcNlZWU5v9/vxo8f75YvX37J7wOeLxQKmX/fksVisViXvy71tZ+LkQIAEoKLkQIAkhIBAgCYIEAAABMECABgggABAEwQIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYIIAAQBMECAAgAkCBAAwQYAAACYIEADABAECAJggQAAAEwQIAGCCAAEATBAgAIAJAgQAMEGAAAAmCBAAwETSBcg5Z70FAEAcXOrredIFqL
293XoLAIA4uNTXc59LspccPT09OnbsmNLT0+Xz+aLuC4fDGjVqlJqampSRkWG0Q3uch3M4D+dwHs7hPJyTDOfBOaf29nbl5+dr0KC+X+cM6cc9fSqDBg3SyJEjL3pMRkbGFf0E+xjn4RzOwzmch3M4D+dYn4dAIHDJY5LuW3AAgCsDAQIAmBhQAfL7/VqxYoX8fr/1VkxxHs7hPJzDeTiH83DOQDoPSfcmBADAlWFAvQICAKQOAgQAMEGAAAAmCBAAwMSACdDq1at1/fXX66qrrlJhYaHeffdd6y31u2eeeUY+ny9qTZw40XpbCbd7927NnTtX+fn58vl82rp1a9T9zjk9/fTTysvL07Bhw1RcXKzDhw/bbDaBLnUeHnjggQueH3PmzLHZbIJUVlZq2rRpSk9PV05OjubNm6e6urqoYzo7O1VWVqYRI0bommuu0cKFC9Xa2mq048T4NOdh5syZFzwfFi9ebLTj3g2IAL3++utatmyZVqxYoffee09TpkxRSUmJjh8/br21fnfTTTepubk5sv7yl79YbynhOjo6NGXKFK1evbrX+1euXKlVq1bp1Vdf1d69ezV8+HCVlJSos7Ozn3eaWJc6D5I0Z86cqOfHxo0b+3GHiVdTU6OysjLt2bNHb775ps6ePavZs2ero6MjcszSpUu1fft2bd68WTU1NTp27JgWLFhguOv4+zTnQZIeeuihqOfDypUrjXbcBzcATJ8+3ZWVlUU+7u7udvn5+a6ystJwV/1vxYoVbsqUKdbbMCXJbdmyJfJxT0+PCwaD7oUXXojc1tbW5vx+v9u4caPBDvvHJ8+Dc84tWrTI3X333Sb7sXL8+HEnydXU1Djnzv27Hzp0qNu8eXPkmL///e9OkqutrbXaZsJ98jw459wdd9zhHnvsMbtNfQpJ/wrozJkz2r9/v4qLiyO3DRo0SMXFxaqtrTXcmY3Dhw8rPz9fY8eO1f3336/GxkbrLZlqaGhQS0tL1PMjEAiosLDwinx+VFdXKycnRxMmTNAjjzyiEydOWG8poUKhkCQpKytLkrR//36dPXs26vkwceJEjR49OqWfD588Dx977bXXlJ2drUmTJqmiokKnTp2y2F6fku5ipJ/04Ycfqru7W7m5uVG35+bm6h//+IfRrmwUFhZq3bp1mjBhgpqbm/Xss8/q9ttv1/vvv6/09HTr7ZloaWmRpF6fHx/fd6WYM2eOFixYoIKCAh05ckRPPvmkSktLVVtbq8GDB1tvL+56enq0ZMkS3XrrrZo0aZKkc8+HtLQ0ZWZmRh2bys+H3s6DJH3zm9/UmDFjlJ+fr0OHDun73/++6urq9Pvf/95wt9GSPkD4f6WlpZE/T548WYWFhRozZozeeOMNPfjgg4Y7QzK49957I3+++eabNXnyZI0bN07V1dWaNWuW4c4So6ysTO+///4V8XPQi+nrPDz88MORP998883Ky8vTrFmzdOTIEY0bN66/t9mrpP8WXHZ2tgYPHnzBu1haW1sVDAaNdpUcMjMzdeONN6q+vt56K2Y+fg7w/LjQ2LFjlZ2dnZLPj/Lycu3YsUNvv/121K9vCQaDOnPmjNra2qKOT9XnQ1/noTeFhYWSlFTPh6QPUFpamqZOnaqqqqrIbT09PaqqqlJRUZHhzuydPHlSR44cUV5envVWzBQUFCgYDEY9P8LhsPbu3XvFPz+OHj2qEydOpNTzwzmn8vJybdmyRbt27VJBQUHU/VOnTtXQoUOjng91dXVqbGxMqefDpc5Dbw4ePChJyfV8sH4XxKexadMm5/f73bp169wHH3zgHn74YZeZmelaWlqst9avvve977nq6mrX0NDg/vrXv7ri4mKXnZ3tjh8/br21hGpvb3cHDhxwBw4ccJLcSy+95A4cOOD+9a9/Oeece/75511mZqbbtm2bO3TokLv77rtdQUGBO336tPHO4+ti56G9vd09/vjjrra21jU0NLi33nrLff7zn3c33HCD6+zstN563DzyyCMuEAi46upq19zcHFmnTp2KHLN48WI3evRot2vXLrdv3z5XVFTkioqKDHcdf5c6D/X19e6HP/yh27dvn2toaHDbtm1zY8eOdTNmzDDeebQBESDnnPvFL37hRo8e7dLS0tz06dPdnj17rLfU7+655x6Xl5fn0tLS3HXXXefuueceV19fb72thHv77bedpAvWokWLnHPn3or91FNPudzcXOf3+92sWbNcXV2d7aYT4GLn4dSpU2727Nnu2muvdUOHDnVjxoxxDz30UMr9T1pv//yS3Nq1ayPHnD592n33u991n/nMZ9zVV1/t5s+f75qbm+02nQCXOg+NjY1uxowZLisry/n9fjd+/Hi3fPlyFwqFbDf+Cfw6BgCAiaT/GRAAIDURIACACQIEADBBgAAAJggQAMAEAQIAmCBAAAATBAgAYIIAAQBMECAAgAkCBAAwQYAAACb+Dwuo74MxItlsAAAAAElFTkSuQmCC",
388 | "text/plain": [
389 | ""
390 | ]
391 | },
392 | "metadata": {},
393 | "output_type": "display_data"
394 | },
395 | {
396 | "name": "stdout",
397 | "output_type": "stream",
398 | "text": [
399 |     "type: <class 'torch.Tensor'>\n",
400 | "shape: torch.Size([1, 28, 28])\n",
401 | "x shape: torch.Size([32, 1, 28, 28])\n",
402 | "label shape: torch.Size([32])\n"
403 | ]
404 | }
405 | ],
406 | "source": [
407 | "import torch\n",
408 | "import torchvision\n",
409 | "import torchvision.transforms as transforms\n",
410 | "import matplotlib.pyplot as plt\n",
411 | "\n",
412 | "\n",
413 | "## ==== MNIST ====\n",
414 | "dataset = torchvision.datasets.MNIST(\n",
415 | " root='./data',\n",
416 | " train=True,\n",
417 | " transform=None,\n",
418 | " download=True\n",
419 | ")\n",
420 | "\n",
421 | "x, label = dataset[0]\n",
422 | "\n",
423 | "print('size:', len(dataset)) # size: 60000\n",
424 |     "print('type:', type(x)) # type: <class 'PIL.Image.Image'>\n",
425 | "print('label:', label) # label: 5\n",
426 | "\n",
427 | "plt.imshow(x, cmap='gray')\n",
428 | "plt.show()\n",
429 | "\n",
430 | "\n",
431 | "# ==== preprocess ====\n",
432 | "transform = transforms.ToTensor()\n",
433 | "\n",
434 | "dataset = torchvision.datasets.MNIST(\n",
435 | " root='./data',\n",
436 | " train=True,\n",
437 | " transform=transform,\n",
438 | " download=True\n",
439 | ")\n",
440 | "\n",
441 | "x, label = dataset[0]\n",
442 |     "print('type:', type(x)) # type: <class 'torch.Tensor'>\n",
443 | "print('shape:', x.shape) # shape: torch.Size([1, 28, 28])\n",
444 | "\n",
445 | "\n",
446 | "# ==== DataLoader ====\n",
447 | "dataloader = torch.utils.data.DataLoader(\n",
448 | " dataset,\n",
449 | " batch_size=32,\n",
450 | " shuffle=True)\n",
451 | "\n",
452 | "for x, label in dataloader:\n",
453 | " print('x shape:', x.shape) # shape: torch.Size([32, 1, 28, 28])\n",
454 | " print('label shape:', label.shape) # shape: torch.Size([32])\n",
455 | " break"
456 | ]
457 | }
458 | ],
459 | "metadata": {
460 | "kernelspec": {
461 | "display_name": "Python 3",
462 | "language": "python",
463 | "name": "python3"
464 | },
465 | "language_info": {
466 | "codemirror_mode": {
467 | "name": "ipython",
468 | "version": 3
469 | },
470 | "file_extension": ".py",
471 | "mimetype": "text/x-python",
472 | "name": "python",
473 | "nbconvert_exporter": "python",
474 | "pygments_lexer": "ipython3",
475 | "version": "3.11.0rc2"
476 | }
477 | },
478 | "nbformat": 4,
479 | "nbformat_minor": 2
480 | }
481 |
--------------------------------------------------------------------------------
/notebooks/flower.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/WegraLee/deep-learning-from-scratch-5/86be5ee971cf3cfe3795262f1ee07ce2322e9c9a/notebooks/flower.png
--------------------------------------------------------------------------------
/notebooks/old_faithful.txt:
--------------------------------------------------------------------------------
1 | 3.6 79
2 | 1.8 54
3 | 3.333 74
4 | 2.283 62
5 | 4.533 85
6 | 2.883 55
7 | 4.7 88
8 | 3.6 85
9 | 1.95 51
10 | 4.35 85
11 | 1.833 54
12 | 3.917 84
13 | 4.2 78
14 | 1.75 47
15 | 4.7 83
16 | 2.167 52
17 | 1.75 62
18 | 4.8 84
19 | 1.6 52
20 | 4.25 79
21 | 1.8 51
22 | 1.75 47
23 | 3.45 78
24 | 3.067 69
25 | 4.533 74
26 | 3.6 83
27 | 1.967 55
28 | 4.083 76
29 | 3.85 78
30 | 4.433 79
31 | 4.3 73
32 | 4.467 77
33 | 3.367 66
34 | 4.033 80
35 | 3.833 74
36 | 2.017 52
37 | 1.867 48
38 | 4.833 80
39 | 1.833 59
40 | 4.783 90
41 | 4.35 80
42 | 1.883 58
43 | 4.567 84
44 | 1.75 58
45 | 4.533 73
46 | 3.317 83
47 | 3.833 64
48 | 2.1 53
49 | 4.633 82
50 | 2 59
51 | 4.8 75
52 | 4.716 90
53 | 1.833 54
54 | 4.833 80
55 | 1.733 54
56 | 4.883 83
57 | 3.717 71
58 | 1.667 64
59 | 4.567 77
60 | 4.317 81
61 | 2.233 59
62 | 4.5 84
63 | 1.75 48
64 | 4.8 82
65 | 1.817 60
66 | 4.4 92
67 | 4.167 78
68 | 4.7 78
69 | 2.067 65
70 | 4.7 73
71 | 4.033 82
72 | 1.967 56
73 | 4.5 79
74 | 4 71
75 | 1.983 62
76 | 5.067 76
77 | 2.017 60
78 | 4.567 78
79 | 3.883 76
80 | 3.6 83
81 | 4.133 75
82 | 4.333 82
83 | 4.1 70
84 | 2.633 65
85 | 4.067 73
86 | 4.933 88
87 | 3.95 76
88 | 4.517 80
89 | 2.167 48
90 | 4 86
91 | 2.2 60
92 | 4.333 90
93 | 1.867 50
94 | 4.817 78
95 | 1.833 63
96 | 4.3 72
97 | 4.667 84
98 | 3.75 75
99 | 1.867 51
100 | 4.9 82
101 | 2.483 62
102 | 4.367 88
103 | 2.1 49
104 | 4.5 83
105 | 4.05 81
106 | 1.867 47
107 | 4.7 84
108 | 1.783 52
109 | 4.85 86
110 | 3.683 81
111 | 4.733 75
112 | 2.3 59
113 | 4.9 89
114 | 4.417 79
115 | 1.7 59
116 | 4.633 81
117 | 2.317 50
118 | 4.6 85
119 | 1.817 59
120 | 4.417 87
121 | 2.617 53
122 | 4.067 69
123 | 4.25 77
124 | 1.967 56
125 | 4.6 88
126 | 3.767 81
127 | 1.917 45
128 | 4.5 82
129 | 2.267 55
130 | 4.65 90
131 | 1.867 45
132 | 4.167 83
133 | 2.8 56
134 | 4.333 89
135 | 1.833 46
136 | 4.383 82
137 | 1.883 51
138 | 4.933 86
139 | 2.033 53
140 | 3.733 79
141 | 4.233 81
142 | 2.233 60
143 | 4.533 82
144 | 4.817 77
145 | 4.333 76
146 | 1.983 59
147 | 4.633 80
148 | 2.017 49
149 | 5.1 96
150 | 1.8 53
151 | 5.033 77
152 | 4 77
153 | 2.4 65
154 | 4.6 81
155 | 3.567 71
156 | 4 70
157 | 4.5 81
158 | 4.083 93
159 | 1.8 53
160 | 3.967 89
161 | 2.2 45
162 | 4.15 86
163 | 2 58
164 | 3.833 78
165 | 3.5 66
166 | 4.583 76
167 | 2.367 63
168 | 5 88
169 | 1.933 52
170 | 4.617 93
171 | 1.917 49
172 | 2.083 57
173 | 4.583 77
174 | 3.333 68
175 | 4.167 81
176 | 4.333 81
177 | 4.5 73
178 | 2.417 50
179 | 4 85
180 | 4.167 74
181 | 1.883 55
182 | 4.583 77
183 | 4.25 83
184 | 3.767 83
185 | 2.033 51
186 | 4.433 78
187 | 4.083 84
188 | 1.833 46
189 | 4.417 83
190 | 2.183 55
191 | 4.8 81
192 | 1.833 57
193 | 4.8 76
194 | 4.1 84
195 | 3.966 77
196 | 4.233 81
197 | 3.5 87
198 | 4.366 77
199 | 2.25 51
200 | 4.667 78
201 | 2.1 60
202 | 4.35 82
203 | 4.133 91
204 | 1.867 53
205 | 4.6 78
206 | 1.783 46
207 | 4.367 77
208 | 3.85 84
209 | 1.933 49
210 | 4.5 83
211 | 2.383 71
212 | 4.7 80
213 | 1.867 49
214 | 3.833 75
215 | 3.417 64
216 | 4.233 76
217 | 2.4 53
218 | 4.8 94
219 | 2 55
220 | 4.15 76
221 | 1.867 50
222 | 4.267 82
223 | 1.75 54
224 | 4.483 75
225 | 4 78
226 | 4.117 79
227 | 4.083 78
228 | 4.267 78
229 | 3.917 70
230 | 4.55 79
231 | 4.083 70
232 | 2.417 54
233 | 4.183 86
234 | 2.217 50
235 | 4.45 90
236 | 1.883 54
237 | 1.85 54
238 | 4.283 77
239 | 3.95 79
240 | 2.333 64
241 | 4.15 75
242 | 2.35 47
243 | 4.933 86
244 | 2.9 63
245 | 4.583 85
246 | 3.833 82
247 | 2.083 57
248 | 4.367 82
249 | 2.133 67
250 | 4.35 74
251 | 2.2 54
252 | 4.45 83
253 | 3.567 73
254 | 4.5 73
255 | 4.15 88
256 | 3.817 80
257 | 3.917 71
258 | 4.45 83
259 | 2 56
260 | 4.283 79
261 | 4.767 78
262 | 4.533 84
263 | 1.85 58
264 | 4.25 83
265 | 1.983 43
266 | 2.25 60
267 | 4.75 75
268 | 4.117 81
269 | 2.15 46
270 | 4.417 90
271 | 1.817 46
272 | 4.467 74
273 |
--------------------------------------------------------------------------------
/posters/1부. 시야를 찾아서.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/WegraLee/deep-learning-from-scratch-5/86be5ee971cf3cfe3795262f1ee07ce2322e9c9a/posters/1부. 시야를 찾아서.png
--------------------------------------------------------------------------------
/posters/2부. 상어공주.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/WegraLee/deep-learning-from-scratch-5/86be5ee971cf3cfe3795262f1ee07ce2322e9c9a/posters/2부. 상어공주.png
--------------------------------------------------------------------------------
/posters/3부. DeZero의 창조자.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/WegraLee/deep-learning-from-scratch-5/86be5ee971cf3cfe3795262f1ee07ce2322e9c9a/posters/3부. DeZero의 창조자.png
--------------------------------------------------------------------------------
/posters/4부. 제발, 가즈아!.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/WegraLee/deep-learning-from-scratch-5/86be5ee971cf3cfe3795262f1ee07ce2322e9c9a/posters/4부. 제발, 가즈아!.png
--------------------------------------------------------------------------------
/posters/5부. 피쉬카소와 천재의 초상.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/WegraLee/deep-learning-from-scratch-5/86be5ee971cf3cfe3795262f1ee07ce2322e9c9a/posters/5부. 피쉬카소와 천재의 초상.png
--------------------------------------------------------------------------------
/posters/README.md:
--------------------------------------------------------------------------------
1 | For archiving the posters
2 |
--------------------------------------------------------------------------------
/posters/바닷속 딥러닝 어드벤처.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/WegraLee/deep-learning-from-scratch-5/86be5ee971cf3cfe3795262f1ee07ce2322e9c9a/posters/바닷속 딥러닝 어드벤처.png
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | numpy==1.24.2
2 | scipy==1.10.1
3 | matplotlib==3.7.1
4 | torch==2.0.0
5 | torchvision==0.15.1
6 | tqdm==4.65.0
--------------------------------------------------------------------------------
/step01/norm_dist.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib.pyplot as plt
3 |
4 |
5 | def normal(x, mu=0, sigma=1):
6 | y = 1 / (np.sqrt(2 * np.pi) * sigma) * np.exp(-(x - mu)**2 / (2 * sigma**2))
7 | return y
8 |
9 | x = np.linspace(-5, 5, 100)
10 | y = normal(x)
11 |
12 | plt.plot(x, y)
13 | plt.xlabel('x')
14 | plt.ylabel('y')
15 | plt.show()
--------------------------------------------------------------------------------
/step01/norm_param.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib.pyplot as plt
3 |
4 | def normal(x, mu=0, sigma=1):
5 | y = 1 / (np.sqrt(2 * np.pi) * sigma) * np.exp(-(x - mu)**2 / (2 * sigma**2))
6 | return y
7 |
8 | x = np.linspace(-10, 10, 1000)
9 |
10 | # mu ====================
11 | y0 = normal(x, mu=-3)
12 | y1 = normal(x, mu=0)
13 | y2 = normal(x, mu=5)
14 |
15 | plt.plot(x, y0, label=r'$\mu$=-3')
16 | plt.plot(x, y1, label=r'$\mu$=0')
17 | plt.plot(x, y2, label=r'$\mu$=5')
18 | plt.legend()
19 | plt.xlabel('x')
20 | plt.ylabel('y')
21 | plt.show()
22 |
23 | # sigma ====================
24 | y0 = normal(x, mu=0, sigma=0.5)
25 | y1 = normal(x, mu=0, sigma=1)
26 | y2 = normal(x, mu=0, sigma=2)
27 |
28 | plt.plot(x, y0, label=r'$\sigma$=0.5')
29 | plt.plot(x, y1, label=r'$\sigma$=1')
30 | plt.plot(x, y2, label=r'$\sigma$=2')
31 | plt.legend()
32 | plt.xlabel('x')
33 | plt.ylabel('y')
34 | plt.show()
--------------------------------------------------------------------------------
/step01/sample_avg.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib.pyplot as plt
3 |
4 | x_means = []
5 | N = 1 # sample size
6 |
7 | for _ in range(10000):
8 | xs = []
9 | for i in range(N):
10 | x = np.random.rand()
11 | xs.append(x)
12 | mean = np.mean(xs)
13 | x_means.append(mean)
14 |
15 | # plot
16 | plt.hist(x_means, bins='auto', density=True)
17 | plt.title(f'N={N}')
18 | plt.xlabel('x')
19 | plt.ylabel('Probability Density')
20 | plt.xlim(-0.05, 1.05)
21 | plt.ylim(0, 5)
22 | plt.show()
--------------------------------------------------------------------------------
/step01/sample_sum.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib.pyplot as plt
3 |
4 | x_sums = []
5 | N = 5
6 |
7 | for _ in range(10000):
8 | xs = []
9 | for i in range(N):
10 | x = np.random.rand()
11 | xs.append(x)
12 |     total = np.sum(xs)
13 |     x_sums.append(total)
14 |
15 | # normal distribution
16 | def normal(x, mu=0, sigma=1):
17 | y = 1 / (np.sqrt(2 * np.pi) * sigma) * np.exp(-(x - mu)**2 / (2 * sigma**2))
18 | return y
19 | x_norm = np.linspace(-5, 5, 1000)
20 | mu = 0.5 * N
21 | sigma = np.sqrt(1 / 12 * N)
22 | y_norm = normal(x_norm, mu, sigma)
23 |
24 | # plot
25 | plt.hist(x_sums, bins='auto', density=True)
26 | plt.plot(x_norm, y_norm)
27 | plt.title(f'N={N}')
28 | plt.xlim(-1, 6)
29 | plt.show()
--------------------------------------------------------------------------------
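A side note on step01/sample_sum.py: a Uniform(0, 1) draw has mean 1/2 and variance 1/12, so the sum of N independent draws has mean N/2 and variance N/12, which is where mu = 0.5 * N and sigma = np.sqrt(1 / 12 * N) come from. A minimal vectorized sketch of the same experiment (my own illustration, not a file in the repository):

import numpy as np
import matplotlib.pyplot as plt

N = 5
sums = np.random.rand(10000, N).sum(axis=1)   # 10,000 sums of N uniform(0, 1) draws
mu, sigma = 0.5 * N, np.sqrt(N / 12)          # mean and std of the sum (CLT)
x = np.linspace(mu - 4 * sigma, mu + 4 * sigma, 500)
pdf = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

plt.hist(sums, bins='auto', density=True)     # histogram of the sums
plt.plot(x, pdf)                              # normal approximation from the CLT
plt.title(f'N={N}')
plt.show()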
/step02/fit.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 |
5 | path = os.path.join(os.path.dirname(__file__), 'height.txt')
6 | xs = np.loadtxt(path)
7 |
8 | mu = np.mean(xs)
9 | sigma = np.std(xs)
10 |
11 | # normal distribution
12 | def normal(x, mu=0, sigma=1):
13 | y = 1 / (np.sqrt(2 * np.pi) * sigma) * np.exp(-(x - mu)**2 / (2 * sigma**2))
14 | return y
15 | x = np.linspace(150, 190, 1000)
16 | y = normal(x, mu, sigma)
17 |
18 | # plot
19 | plt.hist(xs, bins='auto', density=True)
20 | plt.plot(x, y)
21 | plt.xlabel('Height(cm)')
22 | plt.ylabel('Probability Density')
23 | plt.show()
--------------------------------------------------------------------------------
/step02/generate.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 |
5 |
6 | path = os.path.join(os.path.dirname(__file__), 'height.txt')
7 | xs = np.loadtxt(path)
8 | mu = np.mean(xs)
9 | sigma = np.std(xs)
10 |
11 | samples = np.random.normal(mu, sigma, 10000)
12 |
13 | plt.hist(xs, bins='auto', density=True, alpha=0.7, label='original')
14 | plt.hist(samples, bins='auto', density=True, alpha=0.7, label='generated')
15 | plt.xlabel('Height(cm)')
16 | plt.ylabel('Probability Density')
17 | plt.legend()
18 | plt.show()
--------------------------------------------------------------------------------
/step02/hist.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 |
5 | path = os.path.join(os.path.dirname(__file__), 'height.txt')
6 | xs = np.loadtxt(path)
7 | print(xs.shape)
8 |
9 | plt.hist(xs, bins='auto', density=True)
10 | plt.xlabel('Height(cm)')
11 | plt.ylabel('Probability Density')
12 | plt.show()
--------------------------------------------------------------------------------
/step02/prob.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | from scipy.stats import norm
4 |
5 |
6 | path = os.path.join(os.path.dirname(__file__), 'height.txt')
7 | xs = np.loadtxt(path)
8 | mu = np.mean(xs)
9 | sigma = np.std(xs)
10 |
11 | p1 = norm.cdf(160, mu, sigma)
12 | print('p(x <= 160):', p1)
13 |
14 | p2 = norm.cdf(180, mu, sigma)
15 | print('p(x > 180):', 1-p2)
--------------------------------------------------------------------------------
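The two probabilities printed by step02/prob.py can be sanity-checked by sampling: norm.cdf(160, mu, sigma) should match the fraction of N(mu, sigma) samples at or below 160, and 1 - norm.cdf(180, mu, sigma) the fraction above 180. A small sketch under placeholder values for mu and sigma (the real ones are estimated from height.txt, which is not reproduced here):

import numpy as np
from scipy.stats import norm

mu, sigma = 172.0, 5.5   # placeholder values; prob.py estimates these from height.txt
samples = np.random.normal(mu, sigma, 1_000_000)

print('p(x <= 160):', norm.cdf(160, mu, sigma), '~', np.mean(samples <= 160))
print('p(x > 180): ', 1 - norm.cdf(180, mu, sigma), '~', np.mean(samples > 180))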
/step03/mle.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 |
5 | path = os.path.join(os.path.dirname(__file__), 'height_weight.txt')
6 | xs = np.loadtxt(path)
7 |
8 | # Maximum Likelihood Estimation(MLE)
9 | mu = np.mean(xs, axis=0)
10 | cov = np.cov(xs, rowvar=False)
11 |
12 | def multivariate_normal(x, mu, cov):
13 | det = np.linalg.det(cov)
14 | inv = np.linalg.inv(cov)
15 | d = len(x)
16 | z = 1 / np.sqrt((2 * np.pi) ** d * det)
17 | y = z * np.exp((x - mu).T @ inv @ (x - mu) / -2.0)
18 | return y
19 |
20 | small_xs = xs[:500]
21 | X, Y = np.meshgrid(np.arange(150, 195, 0.5),
22 | np.arange(45, 75, 0.5))
23 | Z = np.zeros_like(X)
24 |
25 | for i in range(X.shape[0]):
26 | for j in range(X.shape[1]):
27 | x = np.array([X[i, j], Y[i, j]])
28 | Z[i, j] = multivariate_normal(x, mu, cov)
29 |
30 | fig = plt.figure()
31 | ax1 = fig.add_subplot(1, 2, 1, projection='3d')
32 | ax1.set_xlabel('x')
33 | ax1.set_ylabel('y')
34 | ax1.set_zlabel('z')
35 | ax1.plot_surface(X, Y, Z, cmap='viridis')
36 |
37 | ax2 = fig.add_subplot(1, 2, 2)
38 | ax2.scatter(small_xs[:,0], small_xs[:,1])
39 | ax2.set_xlabel('x')
40 | ax2.set_ylabel('y')
41 | ax2.set_xlim(156, 189)
42 | ax2.set_ylim(36, 79)
43 | ax2.contour(X, Y, Z)
44 | plt.show()
--------------------------------------------------------------------------------
/step03/numpy_basis.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 |
3 | # array
4 | x = np.array([1, 2, 3])
5 | print(x.__class__)
6 | print(x.shape)
7 | print(x.ndim)
8 | W = np.array([[1, 2, 3],
9 | [4, 5, 6]])
10 | print(W.ndim)
11 | print(W.shape)
12 |
13 | # element-wise operation
14 | W = np.array([[1, 2, 3], [4, 5, 6]])
15 | X = np.array([[0, 1, 2], [3, 4, 5]])
16 | print(W + X)
17 | print('---')
18 | print(W * X)
19 |
20 | # inner product
21 | a = np.array([1, 2, 3])
22 | b = np.array([4, 5, 6])
23 | y = np.dot(a, b) # a @ b
24 | print(y)
25 |
26 | # matrix multiplication
27 | A = np.array([[1, 2], [3, 4]])
28 | B = np.array([[5, 6], [7, 8]])
29 | Y = np.dot(A, B) # A @ B
30 | print(Y)
--------------------------------------------------------------------------------
/step03/numpy_matrix.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 |
3 | # transpose
4 | A = np.array([[1, 2, 3], [4, 5, 6]])
5 | print(A)
6 | print('---')
7 | print(A.T)
8 |
9 | # determinant
10 | A = np.array([[3, 4], [5, 6]])
11 | d = np.linalg.det(A)
12 | print(d)
13 |
14 | # inverse matrix
15 | A = np.array([[3, 4], [5, 6]])
16 | B = np.linalg.inv(A)
17 | print(B)
18 | print('---')
19 | print(A @ B)
20 |
21 | # multivariate normal distribution
22 | def multivariate_normal(x, mu, cov):
23 | det = np.linalg.det(cov)
24 | inv = np.linalg.inv(cov)
25 | D = len(x)
26 | z = 1 / np.sqrt((2 * np.pi) ** D * det)
27 | y = z * np.exp((x - mu).T @ inv @ (x - mu) / -2.0)
28 | return y
29 |
30 | x = np.array([0, 0])
31 | mu = np.array([1, 2])
32 | cov = np.array([[1, 0],
33 | [0, 1]])
34 |
35 | y = multivariate_normal(x, mu, cov)
36 | print(y)
37 |
--------------------------------------------------------------------------------
/step03/plot_3d.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib.pyplot as plt
3 |
4 | X = np.array([[-2, -1, 0, 1, 2],
5 | [-2, -1, 0, 1, 2],
6 | [-2, -1, 0, 1, 2],
7 | [-2, -1, 0, 1, 2],
8 | [-2, -1, 0, 1, 2]])
9 | Y = np.array([[-2, -2, -2, -2, -2],
10 | [-1, -1, -1, -1, -1],
11 | [0, 0, 0, 0, 0],
12 | [1, 1, 1, 1, 1],
13 | [2, 2, 2, 2, 2]])
14 | Z = X ** 2 + Y ** 2
15 |
16 | ax = plt.axes(projection='3d')
17 | ax.plot_surface(X, Y, Z, cmap='viridis')
18 | ax.set_xlabel('x')
19 | ax.set_ylabel('y')
20 | ax.set_zlabel('z')
21 | plt.show()
22 |
23 | # ===== better resolution =====
24 | xs = np.arange(-2, 2, 0.1)
25 | ys = np.arange(-2, 2, 0.1)
26 |
27 | X, Y = np.meshgrid(xs, ys)
28 | Z = X ** 2 + Y ** 2
29 |
30 | ax = plt.axes(projection='3d')
31 | ax.plot_surface(X, Y, Z, cmap='viridis')
32 | ax.set_xlabel('x')
33 | ax.set_ylabel('y')
34 | ax.set_zlabel('z')
35 | plt.show()
36 |
37 | # ===== plot contour =====
38 | x = np.arange(-2, 2, 0.1)
39 | y = np.arange(-2, 2, 0.1)
40 |
41 | X, Y = np.meshgrid(x, y)
42 | Z = X ** 2 + Y ** 2
43 |
44 | ax = plt.axes()
45 | ax.contour(X, Y, Z)
46 | ax.set_xlabel('x')
47 | ax.set_ylabel('y')
48 | plt.show()
49 |
--------------------------------------------------------------------------------
/step03/plot_dataset.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 |
5 | path = os.path.join(os.path.dirname(__file__), 'height_weight.txt')
6 | xs = np.loadtxt(path)
7 |
8 | print(xs.shape)
9 |
10 | small_xs = xs[:500]
11 | plt.scatter(small_xs[:, 0], small_xs[:, 1])
12 | plt.xlabel('Height(cm)')
13 | plt.ylabel('Weight(kg)')
14 | plt.show()
--------------------------------------------------------------------------------
/step03/plot_norm.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib.pyplot as plt
3 |
4 | def multivariate_normal(x, mu, cov):
5 | det = np.linalg.det(cov)
6 | inv = np.linalg.inv(cov)
7 | D = len(x)
8 | z = 1 / np.sqrt((2 * np.pi) ** D * det)
9 | y = z * np.exp((x - mu).T @ inv @ (x - mu) / -2.0)
10 | return y
11 |
12 | mu = np.array([0.5, -0.2])
13 | cov = np.array([[2.0, 0.3],
14 | [0.3, 0.5]])
15 |
16 | xs = ys = np.arange(-5, 5, 0.1)
17 | X, Y = np.meshgrid(xs, ys)
18 | Z = np.zeros_like(X)
19 |
20 | for i in range(X.shape[0]):
21 | for j in range(X.shape[1]):
22 | x = np.array([X[i, j], Y[i, j]])
23 | Z[i, j] = multivariate_normal(x, mu, cov)
24 |
25 | fig = plt.figure()
26 | ax1 = fig.add_subplot(1, 2, 1, projection='3d')
27 | ax1.set_xlabel('x')
28 | ax1.set_ylabel('y')
29 | ax1.set_zlabel('z')
30 | ax1.plot_surface(X, Y, Z, cmap='viridis')
31 |
32 | ax2 = fig.add_subplot(1, 2, 2)
33 | ax2.set_xlabel('x')
34 | ax2.set_ylabel('y')
35 | ax2.contour(X, Y, Z)
36 | plt.show()
37 |
--------------------------------------------------------------------------------
/step04/gmm.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib.pyplot as plt
3 |
4 |
5 | mus = np.array([[2.0, 54.50],
6 | [4.3, 80.0]])
7 | covs = np.array([[[0.07, 0.44],
8 | [0.44, 33.7]],
9 | [[0.17, 0.94],
10 | [0.94, 36.00 ]]])
11 | phis = np.array([0.35, 0.65])
12 |
13 |
14 | def multivariate_normal(x, mu, cov):
15 | det = np.linalg.det(cov)
16 | inv = np.linalg.inv(cov)
17 | d = len(x)
18 | z = 1 / np.sqrt((2 * np.pi) ** d * det)
19 | y = z * np.exp((x - mu).T @ inv @ (x - mu) / -2.0)
20 | return y
21 |
22 | def gmm(x, phis, mus, covs):
23 | K = len(phis)
24 | y = 0
25 | for k in range(K):
26 | phi, mu, cov = phis[k], mus[k], covs[k]
27 | y += phi * multivariate_normal(x, mu, cov)
28 | return y
29 |
30 |
31 | # plot
32 | xs = np.arange(1, 6, 0.1)
33 | ys = np.arange(40, 100, 0.1)
34 | X, Y = np.meshgrid(xs, ys)
35 | Z = np.zeros_like(X)
36 |
37 | for i in range(X.shape[0]):
38 | for j in range(X.shape[1]):
39 | x = np.array([X[i, j], Y[i, j]])
40 | Z[i, j] = gmm(x, phis, mus, covs)
41 |
42 | fig = plt.figure()
43 | ax1 = fig.add_subplot(1, 2, 1, projection='3d')
44 | ax1.set_xlabel('x')
45 | ax1.set_ylabel('y')
46 | ax1.set_zlabel('z')
47 | ax1.plot_surface(X, Y, Z, cmap='viridis')
48 |
49 | ax2 = fig.add_subplot(1, 2, 2)
50 | ax2.set_xlabel('x')
51 | ax2.set_ylabel('y')
52 | ax2.contour(X, Y, Z)
53 | plt.show()
--------------------------------------------------------------------------------
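The density evaluated by gmm() above is the two-component mixture p(x) = phis[0] * N(x | mus[0], covs[0]) + phis[1] * N(x | mus[1], covs[1]): each component is the same multivariate normal used in step03, and the mixing weights phis = (0.35, 0.65) sum to 1, so p(x) is itself a valid probability density.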
/step04/gmm_sampling.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib.pyplot as plt
3 |
4 |
5 | mus = np.array([[2.0, 54.50],
6 | [4.3, 80.0]])
7 | covs = np.array([[[0.07, 0.44],
8 | [0.44, 33.7]],
9 | [[0.17, 0.94],
10 | [0.94, 36.00 ]]])
11 | phis = np.array([0.35, 0.65])
12 |
13 |
14 | def sample():
15 | k = np.random.choice(2, p=phis)
16 | mu, cov = mus[k], covs[k]
17 | x = np.random.multivariate_normal(mu, cov)
18 | return x
19 |
20 | N = 500
21 | xs = np.zeros((N, 2))
22 | for i in range(N):
23 | xs[i] = sample()
24 |
25 | plt.scatter(xs[:,0], xs[:,1], color='orange', alpha=0.7)
26 | plt.xlabel('x')
27 | plt.ylabel('y')
28 | plt.show()
--------------------------------------------------------------------------------
/step04/old_faithful.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 |
5 | path = os.path.join(os.path.dirname(__file__), 'old_faithful.txt')
6 | xs = np.loadtxt(path)
7 |
8 | print(xs.shape)
9 | print(xs[0])
10 |
11 | plt.scatter(xs[:,0], xs[:,1])
12 | plt.xlabel('Eruptions(Min)')
13 | plt.ylabel('Waiting(Min)')
14 | plt.show()
15 |
--------------------------------------------------------------------------------
/step04/old_faithful.txt:
--------------------------------------------------------------------------------
1 | 3.6 79
2 | 1.8 54
3 | 3.333 74
4 | 2.283 62
5 | 4.533 85
6 | 2.883 55
7 | 4.7 88
8 | 3.6 85
9 | 1.95 51
10 | 4.35 85
11 | 1.833 54
12 | 3.917 84
13 | 4.2 78
14 | 1.75 47
15 | 4.7 83
16 | 2.167 52
17 | 1.75 62
18 | 4.8 84
19 | 1.6 52
20 | 4.25 79
21 | 1.8 51
22 | 1.75 47
23 | 3.45 78
24 | 3.067 69
25 | 4.533 74
26 | 3.6 83
27 | 1.967 55
28 | 4.083 76
29 | 3.85 78
30 | 4.433 79
31 | 4.3 73
32 | 4.467 77
33 | 3.367 66
34 | 4.033 80
35 | 3.833 74
36 | 2.017 52
37 | 1.867 48
38 | 4.833 80
39 | 1.833 59
40 | 4.783 90
41 | 4.35 80
42 | 1.883 58
43 | 4.567 84
44 | 1.75 58
45 | 4.533 73
46 | 3.317 83
47 | 3.833 64
48 | 2.1 53
49 | 4.633 82
50 | 2 59
51 | 4.8 75
52 | 4.716 90
53 | 1.833 54
54 | 4.833 80
55 | 1.733 54
56 | 4.883 83
57 | 3.717 71
58 | 1.667 64
59 | 4.567 77
60 | 4.317 81
61 | 2.233 59
62 | 4.5 84
63 | 1.75 48
64 | 4.8 82
65 | 1.817 60
66 | 4.4 92
67 | 4.167 78
68 | 4.7 78
69 | 2.067 65
70 | 4.7 73
71 | 4.033 82
72 | 1.967 56
73 | 4.5 79
74 | 4 71
75 | 1.983 62
76 | 5.067 76
77 | 2.017 60
78 | 4.567 78
79 | 3.883 76
80 | 3.6 83
81 | 4.133 75
82 | 4.333 82
83 | 4.1 70
84 | 2.633 65
85 | 4.067 73
86 | 4.933 88
87 | 3.95 76
88 | 4.517 80
89 | 2.167 48
90 | 4 86
91 | 2.2 60
92 | 4.333 90
93 | 1.867 50
94 | 4.817 78
95 | 1.833 63
96 | 4.3 72
97 | 4.667 84
98 | 3.75 75
99 | 1.867 51
100 | 4.9 82
101 | 2.483 62
102 | 4.367 88
103 | 2.1 49
104 | 4.5 83
105 | 4.05 81
106 | 1.867 47
107 | 4.7 84
108 | 1.783 52
109 | 4.85 86
110 | 3.683 81
111 | 4.733 75
112 | 2.3 59
113 | 4.9 89
114 | 4.417 79
115 | 1.7 59
116 | 4.633 81
117 | 2.317 50
118 | 4.6 85
119 | 1.817 59
120 | 4.417 87
121 | 2.617 53
122 | 4.067 69
123 | 4.25 77
124 | 1.967 56
125 | 4.6 88
126 | 3.767 81
127 | 1.917 45
128 | 4.5 82
129 | 2.267 55
130 | 4.65 90
131 | 1.867 45
132 | 4.167 83
133 | 2.8 56
134 | 4.333 89
135 | 1.833 46
136 | 4.383 82
137 | 1.883 51
138 | 4.933 86
139 | 2.033 53
140 | 3.733 79
141 | 4.233 81
142 | 2.233 60
143 | 4.533 82
144 | 4.817 77
145 | 4.333 76
146 | 1.983 59
147 | 4.633 80
148 | 2.017 49
149 | 5.1 96
150 | 1.8 53
151 | 5.033 77
152 | 4 77
153 | 2.4 65
154 | 4.6 81
155 | 3.567 71
156 | 4 70
157 | 4.5 81
158 | 4.083 93
159 | 1.8 53
160 | 3.967 89
161 | 2.2 45
162 | 4.15 86
163 | 2 58
164 | 3.833 78
165 | 3.5 66
166 | 4.583 76
167 | 2.367 63
168 | 5 88
169 | 1.933 52
170 | 4.617 93
171 | 1.917 49
172 | 2.083 57
173 | 4.583 77
174 | 3.333 68
175 | 4.167 81
176 | 4.333 81
177 | 4.5 73
178 | 2.417 50
179 | 4 85
180 | 4.167 74
181 | 1.883 55
182 | 4.583 77
183 | 4.25 83
184 | 3.767 83
185 | 2.033 51
186 | 4.433 78
187 | 4.083 84
188 | 1.833 46
189 | 4.417 83
190 | 2.183 55
191 | 4.8 81
192 | 1.833 57
193 | 4.8 76
194 | 4.1 84
195 | 3.966 77
196 | 4.233 81
197 | 3.5 87
198 | 4.366 77
199 | 2.25 51
200 | 4.667 78
201 | 2.1 60
202 | 4.35 82
203 | 4.133 91
204 | 1.867 53
205 | 4.6 78
206 | 1.783 46
207 | 4.367 77
208 | 3.85 84
209 | 1.933 49
210 | 4.5 83
211 | 2.383 71
212 | 4.7 80
213 | 1.867 49
214 | 3.833 75
215 | 3.417 64
216 | 4.233 76
217 | 2.4 53
218 | 4.8 94
219 | 2 55
220 | 4.15 76
221 | 1.867 50
222 | 4.267 82
223 | 1.75 54
224 | 4.483 75
225 | 4 78
226 | 4.117 79
227 | 4.083 78
228 | 4.267 78
229 | 3.917 70
230 | 4.55 79
231 | 4.083 70
232 | 2.417 54
233 | 4.183 86
234 | 2.217 50
235 | 4.45 90
236 | 1.883 54
237 | 1.85 54
238 | 4.283 77
239 | 3.95 79
240 | 2.333 64
241 | 4.15 75
242 | 2.35 47
243 | 4.933 86
244 | 2.9 63
245 | 4.583 85
246 | 3.833 82
247 | 2.083 57
248 | 4.367 82
249 | 2.133 67
250 | 4.35 74
251 | 2.2 54
252 | 4.45 83
253 | 3.567 73
254 | 4.5 73
255 | 4.15 88
256 | 3.817 80
257 | 3.917 71
258 | 4.45 83
259 | 2 56
260 | 4.283 79
261 | 4.767 78
262 | 4.533 84
263 | 1.85 58
264 | 4.25 83
265 | 1.983 43
266 | 2.25 60
267 | 4.75 75
268 | 4.117 81
269 | 2.15 46
270 | 4.417 90
271 | 1.817 46
272 | 4.467 74
273 |
--------------------------------------------------------------------------------
/step05/em.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 |
5 | path = os.path.join(os.path.dirname(__file__), 'old_faithful.txt')
6 | xs = np.loadtxt(path)
7 | print(xs.shape) # (272, 2)
8 |
9 | # initialize parameters
10 | phis = np.array([0.5, 0.5])
11 | mus = np.array([[0.0, 50.0], [0.0, 100.0]])
12 | covs = np.array([np.eye(2), np.eye(2)])
13 |
14 | K = len(phis) # 2
15 | N = len(xs) # 272
16 | MAX_ITERS = 100
17 | THRESHOLD = 1e-4
18 |
19 | def multivariate_normal(x, mu, cov):
20 | det = np.linalg.det(cov)
21 | inv = np.linalg.inv(cov)
22 | d = len(x)
23 | z = 1 / np.sqrt((2 * np.pi) ** d * det)
24 | y = z * np.exp((x - mu).T @ inv @ (x - mu) / -2.0)
25 | return y
26 |
27 | def gmm(x, phis, mus, covs):
28 | K = len(phis)
29 | y = 0
30 | for k in range(K):
31 | phi, mu, cov = phis[k], mus[k], covs[k]
32 | y += phi * multivariate_normal(x, mu, cov)
33 | return y
34 |
35 | def likelihood(xs, phis, mus, covs):
36 | """ log likelihood """
37 | eps = 1e-8
38 | L = 0
39 | N = len(xs)
40 | for x in xs:
41 | y = gmm(x, phis, mus, covs)
42 | L += np.log(y + eps)
43 | return L / N
44 |
45 |
46 | current_likelihood = likelihood(xs, phis, mus, covs)
47 |
48 | for iter in range(MAX_ITERS):
49 | # E-step ====================
50 | qs = np.zeros((N, K))
51 | for n in range(N):
52 | x = xs[n]
53 | for k in range(K):
54 | phi, mu, cov = phis[k], mus[k], covs[k]
55 | qs[n, k] = phi * multivariate_normal(x, mu, cov)
56 | qs[n] /= gmm(x, phis, mus, covs)
57 |
58 | # M-step ====================
59 | qs_sum = qs.sum(axis=0)
60 | for k in range(K):
61 | # 1. phis
62 | phis[k] = qs_sum[k] / N
63 |
64 | # 2. mus
65 | c = 0
66 | for n in range(N):
67 | c += qs[n, k] * xs[n]
68 | mus[k] = c / qs_sum[k]
69 |
70 | # 3. covs
71 | c = 0
72 | for n in range(N):
73 | z = xs[n] - mus[k]
74 | z = z[:, np.newaxis] # column vector
75 | c += qs[n, k] * z @ z.T
76 | covs[k] = c / qs_sum[k]
77 |
78 |     # threshold check ====================
79 | print(f'{current_likelihood:.3f}')
80 |
81 | next_likelihood = likelihood(xs, phis, mus, covs)
82 | diff = np.abs(next_likelihood - current_likelihood)
83 | if diff < THRESHOLD:
84 | break
85 | current_likelihood = next_likelihood
86 |
87 |
88 |
89 | # visualize
90 | def plot_contour(w, mus, covs):
91 | x = np.arange(1, 6, 0.1)
92 | y = np.arange(40, 100, 1)
93 | X, Y = np.meshgrid(x, y)
94 | Z = np.zeros_like(X)
95 |
96 | for i in range(X.shape[0]):
97 | for j in range(X.shape[1]):
98 | x = np.array([X[i, j], Y[i, j]])
99 |
100 | for k in range(len(mus)):
101 | mu, cov = mus[k], covs[k]
102 | Z[i, j] += w[k] * multivariate_normal(x, mu, cov)
103 | plt.contour(X, Y, Z)
104 |
105 | plt.scatter(xs[:,0], xs[:,1])
106 | plot_contour(phis, mus, covs)
107 | plt.xlabel('Eruptions(Min)')
108 | plt.ylabel('Waiting(Min)')
109 | plt.show()
--------------------------------------------------------------------------------
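The E-step above loops over every sample and every component in Python. For reference, the same responsibilities can be computed with scipy.stats.multivariate_normal (SciPy is already pinned in requirements.txt), looping only over the K components; this is an alternative sketch of mine, not the book's listing, and its result should match the qs array built in em.py up to floating-point error:

import numpy as np
from scipy.stats import multivariate_normal

def e_step(xs, phis, mus, covs):
    # qs[n, k] = phi_k * N(x_n | mu_k, cov_k) / sum_j phi_j * N(x_n | mu_j, cov_j)
    N, K = len(xs), len(phis)
    qs = np.zeros((N, K))
    for k in range(K):
        qs[:, k] = phis[k] * multivariate_normal.pdf(xs, mean=mus[k], cov=covs[k])
    return qs / qs.sum(axis=1, keepdims=True)   # normalize each row to sum to 1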
/step05/generate.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 |
5 | path = os.path.join(os.path.dirname(__file__), 'old_faithful.txt')
6 | original_xs = np.loadtxt(path)
7 |
8 | # learned parameters
9 | mus = np.array([[2.0, 54.50],
10 | [4.3, 80.0]])
11 | covs = np.array([[[0.07, 0.44],
12 | [0.44, 33.7]],
13 | [[0.17, 0.94],
14 | [0.94, 36.00 ]]])
15 | phis = np.array([0.35, 0.65])
16 |
17 | def multivariate_normal(x, mu, cov):
18 | det = np.linalg.det(cov)
19 | inv = np.linalg.inv(cov)
20 | d = len(x)
21 | z = 1 / np.sqrt((2 * np.pi) ** d * det)
22 | y = z * np.exp((x - mu).T @ inv @ (x - mu) / -2.0)
23 | return y
24 |
25 | def gmm(x, phis, mus, covs):
26 | K = len(phis)
27 | y = 0
28 | for k in range(K):
29 | phi, mu, cov = phis[k], mus[k], covs[k]
30 | y += phi * multivariate_normal(x, mu, cov)
31 | return y
32 |
33 | # generate data
34 | N = 500
35 | new_xs = np.zeros((N, 2))
36 | for n in range(N):
37 | k = np.random.choice(2, p=phis)
38 | mu, cov = mus[k], covs[k]
39 | new_xs[n] = np.random.multivariate_normal(mu, cov)
40 |
41 | # visualize
42 | plt.scatter(original_xs[:,0], original_xs[:,1], alpha=0.7, label='original')
43 | plt.scatter(new_xs[:,0], new_xs[:,1], alpha=0.7, label='generated')
44 | plt.legend()
45 | plt.xlabel('Eruptions(Min)')
46 | plt.ylabel('Waiting(Min)')
47 | plt.show()
--------------------------------------------------------------------------------
/step05/old_faithful.txt:
--------------------------------------------------------------------------------
1 | 3.6 79
2 | 1.8 54
3 | 3.333 74
4 | 2.283 62
5 | 4.533 85
6 | 2.883 55
7 | 4.7 88
8 | 3.6 85
9 | 1.95 51
10 | 4.35 85
11 | 1.833 54
12 | 3.917 84
13 | 4.2 78
14 | 1.75 47
15 | 4.7 83
16 | 2.167 52
17 | 1.75 62
18 | 4.8 84
19 | 1.6 52
20 | 4.25 79
21 | 1.8 51
22 | 1.75 47
23 | 3.45 78
24 | 3.067 69
25 | 4.533 74
26 | 3.6 83
27 | 1.967 55
28 | 4.083 76
29 | 3.85 78
30 | 4.433 79
31 | 4.3 73
32 | 4.467 77
33 | 3.367 66
34 | 4.033 80
35 | 3.833 74
36 | 2.017 52
37 | 1.867 48
38 | 4.833 80
39 | 1.833 59
40 | 4.783 90
41 | 4.35 80
42 | 1.883 58
43 | 4.567 84
44 | 1.75 58
45 | 4.533 73
46 | 3.317 83
47 | 3.833 64
48 | 2.1 53
49 | 4.633 82
50 | 2 59
51 | 4.8 75
52 | 4.716 90
53 | 1.833 54
54 | 4.833 80
55 | 1.733 54
56 | 4.883 83
57 | 3.717 71
58 | 1.667 64
59 | 4.567 77
60 | 4.317 81
61 | 2.233 59
62 | 4.5 84
63 | 1.75 48
64 | 4.8 82
65 | 1.817 60
66 | 4.4 92
67 | 4.167 78
68 | 4.7 78
69 | 2.067 65
70 | 4.7 73
71 | 4.033 82
72 | 1.967 56
73 | 4.5 79
74 | 4 71
75 | 1.983 62
76 | 5.067 76
77 | 2.017 60
78 | 4.567 78
79 | 3.883 76
80 | 3.6 83
81 | 4.133 75
82 | 4.333 82
83 | 4.1 70
84 | 2.633 65
85 | 4.067 73
86 | 4.933 88
87 | 3.95 76
88 | 4.517 80
89 | 2.167 48
90 | 4 86
91 | 2.2 60
92 | 4.333 90
93 | 1.867 50
94 | 4.817 78
95 | 1.833 63
96 | 4.3 72
97 | 4.667 84
98 | 3.75 75
99 | 1.867 51
100 | 4.9 82
101 | 2.483 62
102 | 4.367 88
103 | 2.1 49
104 | 4.5 83
105 | 4.05 81
106 | 1.867 47
107 | 4.7 84
108 | 1.783 52
109 | 4.85 86
110 | 3.683 81
111 | 4.733 75
112 | 2.3 59
113 | 4.9 89
114 | 4.417 79
115 | 1.7 59
116 | 4.633 81
117 | 2.317 50
118 | 4.6 85
119 | 1.817 59
120 | 4.417 87
121 | 2.617 53
122 | 4.067 69
123 | 4.25 77
124 | 1.967 56
125 | 4.6 88
126 | 3.767 81
127 | 1.917 45
128 | 4.5 82
129 | 2.267 55
130 | 4.65 90
131 | 1.867 45
132 | 4.167 83
133 | 2.8 56
134 | 4.333 89
135 | 1.833 46
136 | 4.383 82
137 | 1.883 51
138 | 4.933 86
139 | 2.033 53
140 | 3.733 79
141 | 4.233 81
142 | 2.233 60
143 | 4.533 82
144 | 4.817 77
145 | 4.333 76
146 | 1.983 59
147 | 4.633 80
148 | 2.017 49
149 | 5.1 96
150 | 1.8 53
151 | 5.033 77
152 | 4 77
153 | 2.4 65
154 | 4.6 81
155 | 3.567 71
156 | 4 70
157 | 4.5 81
158 | 4.083 93
159 | 1.8 53
160 | 3.967 89
161 | 2.2 45
162 | 4.15 86
163 | 2 58
164 | 3.833 78
165 | 3.5 66
166 | 4.583 76
167 | 2.367 63
168 | 5 88
169 | 1.933 52
170 | 4.617 93
171 | 1.917 49
172 | 2.083 57
173 | 4.583 77
174 | 3.333 68
175 | 4.167 81
176 | 4.333 81
177 | 4.5 73
178 | 2.417 50
179 | 4 85
180 | 4.167 74
181 | 1.883 55
182 | 4.583 77
183 | 4.25 83
184 | 3.767 83
185 | 2.033 51
186 | 4.433 78
187 | 4.083 84
188 | 1.833 46
189 | 4.417 83
190 | 2.183 55
191 | 4.8 81
192 | 1.833 57
193 | 4.8 76
194 | 4.1 84
195 | 3.966 77
196 | 4.233 81
197 | 3.5 87
198 | 4.366 77
199 | 2.25 51
200 | 4.667 78
201 | 2.1 60
202 | 4.35 82
203 | 4.133 91
204 | 1.867 53
205 | 4.6 78
206 | 1.783 46
207 | 4.367 77
208 | 3.85 84
209 | 1.933 49
210 | 4.5 83
211 | 2.383 71
212 | 4.7 80
213 | 1.867 49
214 | 3.833 75
215 | 3.417 64
216 | 4.233 76
217 | 2.4 53
218 | 4.8 94
219 | 2 55
220 | 4.15 76
221 | 1.867 50
222 | 4.267 82
223 | 1.75 54
224 | 4.483 75
225 | 4 78
226 | 4.117 79
227 | 4.083 78
228 | 4.267 78
229 | 3.917 70
230 | 4.55 79
231 | 4.083 70
232 | 2.417 54
233 | 4.183 86
234 | 2.217 50
235 | 4.45 90
236 | 1.883 54
237 | 1.85 54
238 | 4.283 77
239 | 3.95 79
240 | 2.333 64
241 | 4.15 75
242 | 2.35 47
243 | 4.933 86
244 | 2.9 63
245 | 4.583 85
246 | 3.833 82
247 | 2.083 57
248 | 4.367 82
249 | 2.133 67
250 | 4.35 74
251 | 2.2 54
252 | 4.45 83
253 | 3.567 73
254 | 4.5 73
255 | 4.15 88
256 | 3.817 80
257 | 3.917 71
258 | 4.45 83
259 | 2 56
260 | 4.283 79
261 | 4.767 78
262 | 4.533 84
263 | 1.85 58
264 | 4.25 83
265 | 1.983 43
266 | 2.25 60
267 | 4.75 75
268 | 4.117 81
269 | 2.15 46
270 | 4.417 90
271 | 1.817 46
272 | 4.467 74
273 |
--------------------------------------------------------------------------------
/step06/gradient.py:
--------------------------------------------------------------------------------
1 | import torch
2 |
3 | def rosenbrock(x0, x1):
4 | y = 100 * (x1 - x0 ** 2) ** 2 + (x0 - 1) ** 2
5 | return y
6 |
7 | x0 = torch.tensor(0.0, requires_grad=True)
8 | x1 = torch.tensor(2.0, requires_grad=True)
9 |
10 | y = rosenbrock(x0, x1)
11 | y.backward()
12 | print(x0.grad, x1.grad)
13 |
14 | lr = 0.001 # learning rate
15 | iters = 10000 # iteration count
16 |
17 | for i in range(iters):
18 | if i % 1000 == 0:
19 | print(x0.item(), x1.item())
20 |
21 | y = rosenbrock(x0, x1)
22 |
23 | y.backward()
24 |
25 | x0.data -= lr * x0.grad.data
26 | x1.data -= lr * x1.grad.data
27 |
28 | x0.grad.zero_()
29 | x1.grad.zero_()
30 |
31 | print(x0.item(), x1.item())
--------------------------------------------------------------------------------
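step06/gradient.py updates x0.data and x1.data by hand and zeroes the gradients itself. The same loop can be written with torch.optim.SGD, which is the pattern neuralnet.py adopts later; a short sketch of that variant, my own rewrite rather than a repository file:

import torch

def rosenbrock(x0, x1):
    return 100 * (x1 - x0 ** 2) ** 2 + (x0 - 1) ** 2

x0 = torch.tensor(0.0, requires_grad=True)
x1 = torch.tensor(2.0, requires_grad=True)
optimizer = torch.optim.SGD([x0, x1], lr=0.001)

for i in range(10000):
    optimizer.zero_grad()      # reset gradients accumulated in the previous step
    y = rosenbrock(x0, x1)
    y.backward()               # populate x0.grad and x1.grad
    optimizer.step()           # x <- x - lr * grad

print(x0.item(), x1.item())    # should approach the minimum at (1.0, 1.0)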
/step06/neuralnet.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torch.nn.functional as F
4 |
5 |
6 | torch.manual_seed(0)
7 | x = torch.rand(100, 1)
8 | y = torch.sin(2 * torch.pi * x) + torch.rand(100, 1)
9 |
10 | # model
11 | class Model(nn.Module):
12 |     def __init__(self, input_size=1, hidden_size=10, output_size=1):
13 | super().__init__()
14 | self.linear1 = nn.Linear(input_size, hidden_size)
15 | self.linear2 = nn.Linear(hidden_size, output_size)
16 |
17 | def forward(self, x):
18 | y = self.linear1(x)
19 | y = F.sigmoid(y)
20 | y = self.linear2(y)
21 | return y
22 |
23 |
24 | lr = 0.2
25 | iters = 10000
26 |
27 | model = Model()
28 | optimizer = torch.optim.SGD(model.parameters(), lr=lr)
29 |
30 | for i in range(iters):
31 | y_pred = model(x)
32 | loss = F.mse_loss(y, y_pred)
33 | optimizer.zero_grad()
34 | loss.backward()
35 | optimizer.step()
36 |
37 | if i % 1000 == 0:
38 | print(loss.item())
39 |
40 | print(loss.item())
41 |
42 | # plot
43 | import matplotlib.pyplot as plt
44 | plt.scatter(x.detach().numpy(), y.detach().numpy(), s=10)
45 | x = torch.linspace(0, 1, 100).reshape(-1, 1)
46 | y = model(x).detach().numpy()
47 | plt.plot(x, y, color='red')
48 | plt.show()
--------------------------------------------------------------------------------
/step06/regression.py:
--------------------------------------------------------------------------------
1 | import torch
2 |
3 |
4 | torch.manual_seed(0)
5 | x = torch.rand(100, 1)
6 | y = 5 + 2 * x + torch.rand(100, 1)
7 |
8 | W = torch.zeros((1, 1), requires_grad=True)
9 | b = torch.zeros(1, requires_grad=True)
10 |
11 | def predict(x):
12 | y = x @ W + b
13 | return y
14 |
15 | def mean_squared_error(x0, x1):
16 | diff = x0 - x1
17 | N = len(diff)
18 | return torch.sum(diff ** 2) / N
19 |
20 | lr = 0.1
21 | iters = 100
22 |
23 | for i in range(iters):
24 | y_hat = predict(x)
25 | loss = mean_squared_error(y, y_hat)
26 |
27 | loss.backward()
28 |
29 | W.data -= lr * W.grad.data
30 | b.data -= lr * b.grad.data
31 |
32 | W.grad.zero_()
33 | b.grad.zero_()
34 |
35 | if i % 10 == 0: # print every 10 iterations
36 | print(loss.item())
37 |
38 | print(loss.item())
39 | print('====')
40 | print('W =', W.item())
41 | print('b =', b.item())
42 |
43 |
44 | # plot
45 | import matplotlib.pyplot as plt
46 | plt.scatter(x.detach().numpy(), y.detach().numpy(), s=10)
47 | x = torch.tensor([[0.0], [1.0]])
48 | y = W.detach().numpy() * x.detach().numpy() + b.detach().numpy()
49 | plt.plot(x, y, color='red')
50 | plt.xlabel('x')
51 | plt.ylabel('y')
52 | plt.show()
--------------------------------------------------------------------------------
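As a cross-check on the fit above, the same line can be solved in closed form with torch.linalg.lstsq; a sketch (not a repository file) using the same synthetic data. The estimates should land near W = 2 and b = 5.5, since the U(0, 1) noise adds about 0.5 to the intercept:

import torch

torch.manual_seed(0)
x = torch.rand(100, 1)
y = 5 + 2 * x + torch.rand(100, 1)

A = torch.cat([x, torch.ones_like(x)], dim=1)   # design matrix with columns [x, 1]
sol = torch.linalg.lstsq(A, y).solution         # shape (2, 1): [slope, intercept]
print('W =', sol[0].item())  # roughly 2
print('b =', sol[1].item())  # roughly 5.5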
/step06/tensor.py:
--------------------------------------------------------------------------------
1 | import torch
2 |
3 | x = torch.tensor(5.0, requires_grad=True)
4 | y = 3 * x ** 2
5 | print(y)
6 |
7 | y.backward()
8 | print(x.grad)
--------------------------------------------------------------------------------
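The same derivative can also be obtained without calling .backward(), through torch.autograd.grad; a minimal sketch, checking the result against the analytic derivative d(3x^2)/dx = 6x:

import torch

x = torch.tensor(5.0, requires_grad=True)
y = 3 * x ** 2
grad, = torch.autograd.grad(y, x)  # returns a tuple, one entry per input
print(grad)          # tensor(30.)
print(6 * x.item())  # 30.0, matches the analytic derivative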
/step06/vision.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torchvision
3 | import torchvision.transforms as transforms
4 | import matplotlib.pyplot as plt
5 |
6 |
7 | ## ==== MNIST ====
8 | dataset = torchvision.datasets.MNIST(
9 | root='./data',
10 | train=True,
11 | transform=None,
12 | download=True
13 | )
14 |
15 | x, label = dataset[0]
16 |
17 | print('size:', len(dataset)) # size: 60000
18 | print('type:', type(x)) # type: <class 'PIL.Image.Image'>
19 | print('label:', label) # label: 5
20 |
21 | plt.imshow(x, cmap='gray')
22 | plt.show()
23 |
24 |
25 | # ==== preprocess ====
26 | transform = transforms.ToTensor()
27 |
28 | dataset = torchvision.datasets.MNIST(
29 | root='./data',
30 | train=True,
31 | transform=transform,
32 | download=True
33 | )
34 |
35 | x, label = dataset[0]
36 | print('type:', type(x)) # type: <class 'torch.Tensor'>
37 | print('shape:', x.shape) # shape: torch.Size([1, 28, 28])
38 |
39 |
40 | # ==== DataLoader ====
41 | dataloader = torch.utils.data.DataLoader(
42 | dataset,
43 | batch_size=32,
44 | shuffle=True)
45 |
46 | for x, label in dataloader:
47 | print('x shape:', x.shape) # shape: torch.Size([32, 1, 28, 28])
48 | print('label shape:', label.shape) # shape: torch.Size([32])
49 | break
--------------------------------------------------------------------------------
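A small sketch (not part of vision.py) that displays one mini-batch from the DataLoader as a single image with torchvision.utils.make_grid, the same helper used later for generated samples:

import torch
import torchvision
import torchvision.transforms as transforms
import matplotlib.pyplot as plt

dataset = torchvision.datasets.MNIST(
    root='./data', train=True,
    transform=transforms.ToTensor(), download=True)
dataloader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

x, label = next(iter(dataloader))              # x: (32, 1, 28, 28)
grid = torchvision.utils.make_grid(x, nrow=8)  # one (3, H, W) image
plt.imshow(grid.permute(1, 2, 0))              # channels last for imshow
plt.axis('off')
plt.show()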
/step07/vae.py:
--------------------------------------------------------------------------------
1 | import matplotlib.pyplot as plt
2 | import torch
3 | import torch.nn as nn
4 | import torch.nn.functional as F
5 | import torch.optim as optim
6 | import torchvision
7 | from torchvision import datasets, transforms
8 |
9 |
10 | # hyperparameters
11 | input_dim = 784 # x dimension
12 | hidden_dim = 200 # neurons in hidden layers
13 | latent_dim = 20 # z dimension
14 | epochs = 30
15 | learning_rate = 3e-4
16 | batch_size = 32
17 |
18 |
19 | class Encoder(nn.Module):
20 | def __init__(self, input_dim, hidden_dim, latent_dim):
21 | super().__init__()
22 | self.linear = nn.Linear(input_dim, hidden_dim)
23 | self.linear_mu = nn.Linear(hidden_dim, latent_dim)
24 | self.linear_logvar = nn.Linear(hidden_dim, latent_dim)
25 |
26 | def forward(self, x):
27 | h = self.linear(x)
28 | h = F.relu(h)
29 | mu = self.linear_mu(h)
30 | logvar = self.linear_logvar(h)
31 | sigma = torch.exp(0.5 * logvar)
32 | return mu, sigma
33 |
34 |
35 | class Decoder(nn.Module):
36 | def __init__(self, latent_dim, hidden_dim, output_dim):
37 | super().__init__()
38 | self.linear1 = nn.Linear(latent_dim, hidden_dim)
39 | self.linear2 = nn.Linear(hidden_dim, output_dim)
40 |
41 | def forward(self, z):
42 | h = self.linear1(z)
43 | h = F.relu(h)
44 | h = self.linear2(h)
45 | x_hat = F.sigmoid(h)
46 | return x_hat
47 |
48 |
49 | def reparameterize(mu, sigma):
50 | eps = torch.randn_like(sigma)
51 | z = mu + eps * sigma
52 | return z
53 |
54 |
55 | class VAE(nn.Module):
56 | def __init__(self, input_dim, hidden_dim, latent_dim):
57 | super().__init__()
58 | self.encoder = Encoder(input_dim, hidden_dim, latent_dim)
59 | self.decoder = Decoder(latent_dim, hidden_dim, input_dim)
60 |
61 | def get_loss(self, x):
62 | mu, sigma = self.encoder(x)
63 | z = reparameterize(mu, sigma)
64 | x_hat = self.decoder(z)
65 |
66 | batch_size = len(x)
67 | L1 = F.mse_loss(x_hat, x, reduction='sum')
68 | L2 = - torch.sum(1 + torch.log(sigma ** 2) - mu ** 2 - sigma ** 2)
69 | return (L1 + L2) / batch_size
70 |
71 |
72 | # datasets
73 | transform = transforms.Compose([
74 | transforms.ToTensor(),
75 |     transforms.Lambda(torch.flatten) # flatten
76 | ])
77 | dataset = datasets.MNIST(root='./data', train=True, download=True, transform=transform)
78 | dataloader = torch.utils.data.DataLoader(dataset, batch_size=batch_size, shuffle=True)
79 |
80 | model = VAE(input_dim, hidden_dim, latent_dim)
81 | optimizer = optim.Adam(model.parameters(), lr=learning_rate)
82 | losses = []
83 |
84 | for epoch in range(epochs):
85 | loss_sum = 0.0
86 | cnt = 0
87 |
88 | for x, label in dataloader:
89 | optimizer.zero_grad()
90 | loss = model.get_loss(x)
91 | loss.backward()
92 | optimizer.step()
93 |
94 | loss_sum += loss.item()
95 | cnt += 1
96 |
97 | loss_avg = loss_sum / cnt
98 | print(loss_avg)
99 | losses.append(loss_avg)
100 |
101 | # plot losses
102 | epochs = list(range(1, epochs + 1))
103 | plt.plot(epochs, losses, marker='o', linestyle='-')
104 | plt.xlabel('Epoch')
105 | plt.ylabel('Loss')
106 | plt.show()
107 |
108 |
109 | # generate new images
110 | with torch.no_grad():
111 | sample_size = 64
112 | z = torch.randn(sample_size, latent_dim)
113 | x = model.decoder(z)
114 | generated_images = x.view(sample_size, 1, 28, 28)
115 |
116 | grid_img = torchvision.utils.make_grid(generated_images, nrow=8, padding=2, normalize=True)
117 | plt.imshow(grid_img.permute(1, 2, 0))
118 | plt.axis('off')
119 | plt.show()
--------------------------------------------------------------------------------
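Besides sampling from the prior, the trained encoder/decoder pair can reconstruct its inputs; a minimal sketch, assuming model, dataloader, and reparameterize from vae.py above are still in scope:

with torch.no_grad():
    x, _ = next(iter(dataloader))   # (batch_size, 784), already flattened
    mu, sigma = model.encoder(x)
    z = reparameterize(mu, sigma)
    x_hat = model.decoder(z)

originals = x.view(-1, 1, 28, 28)
reconstructions = x_hat.view(-1, 1, 28, 28)
grid = torchvision.utils.make_grid(
    torch.cat([originals[:8], reconstructions[:8]]), nrow=8)
plt.imshow(grid.permute(1, 2, 0))   # top row: inputs, bottom row: reconstructions
plt.axis('off')
plt.show()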
/step08/hvae.py:
--------------------------------------------------------------------------------
1 | import matplotlib.pyplot as plt
2 | import torch
3 | import torch.nn as nn
4 | import torch.nn.functional as F
5 | import torch.optim as optim
6 | import torchvision
7 | from torchvision import datasets, transforms
8 |
9 |
10 | # hyperparameters
11 | input_dim = 784 # mnist image 28x28
12 | hidden_dim = 100
13 | latent_dim = 20
14 | epochs = 30
15 | learning_rate = 1e-3
16 | batch_size = 32
17 |
18 |
19 | class Encoder(nn.Module):
20 | def __init__(self, input_dim, hidden_dim, latent_dim):
21 | super().__init__()
22 | self.linear = nn.Linear(input_dim, hidden_dim)
23 | self.linear_mu = nn.Linear(hidden_dim, latent_dim)
24 | self.linear_logvar = nn.Linear(hidden_dim, latent_dim)
25 |
26 | def forward(self, x):
27 | h = self.linear(x)
28 | h = F.relu(h)
29 | mu = self.linear_mu(h)
30 | logvar = self.linear_logvar(h)
31 | sigma = torch.exp(0.5 * logvar)
32 | return mu, sigma
33 |
34 |
35 | class Decoder(nn.Module):
36 | def __init__(self, latent_dim, hidden_dim, output_dim, use_sigmoid=False):
37 | super().__init__()
38 | self.linear1 = nn.Linear(latent_dim, hidden_dim)
39 | self.linear2 = nn.Linear(hidden_dim, output_dim)
40 | self.use_sigmoid = use_sigmoid
41 |
42 | def forward(self, z):
43 | h = self.linear1(z)
44 | h = F.relu(h)
45 | h = self.linear2(h)
46 | if self.use_sigmoid:
47 | h = F.sigmoid(h)
48 | return h
49 |
50 |
51 | def reparameterize(mu, sigma):
52 | eps = torch.randn_like(sigma)
53 | z = mu + eps * sigma
54 | return z
55 |
56 |
57 | class VAE(nn.Module):
58 | def __init__(self, input_dim, hidden_dim, latent_dim):
59 | super().__init__()
60 | self.encoder1 = Encoder(input_dim, hidden_dim, latent_dim)
61 | self.encoder2 = Encoder(latent_dim, hidden_dim, latent_dim)
62 | self.decoder1 = Decoder(latent_dim, hidden_dim, input_dim, use_sigmoid=True)
63 | self.decoder2 = Decoder(latent_dim, hidden_dim, latent_dim)
64 |
65 | def get_loss(self, x):
66 | mu1, sigma1 = self.encoder1(x)
67 | z1 = reparameterize(mu1, sigma1)
68 | mu2, sigma2 = self.encoder2(z1)
69 | z2 = reparameterize(mu2, sigma2)
70 |
71 | z_hat = self.decoder2(z2)
72 | x_hat = self.decoder1(z1)
73 |
74 | # loss
75 | batch_size = len(x)
76 | L1 = F.mse_loss(x_hat, x, reduction='sum')
77 | L2 = - torch.sum(1 + torch.log(sigma2 ** 2) - mu2 ** 2 - sigma2 ** 2)
78 | L3 = - torch.sum(1 + torch.log(sigma1 ** 2) - (mu1 - z_hat) ** 2 - sigma1 ** 2)
79 | return (L1 + L2 + L3) / batch_size
80 |
81 |
82 | # dataset
83 | transform = transforms.Compose([
84 | transforms.ToTensor(),
85 | transforms.Lambda(torch.flatten)
86 | ])
87 | dataset = datasets.MNIST(root='./data', train=True, download=True, transform=transform)
88 | dataloader = torch.utils.data.DataLoader(dataset, batch_size=batch_size, shuffle=True)
89 |
90 | model = VAE(input_dim, hidden_dim, latent_dim)
91 | optimizer = optim.Adam(model.parameters(), lr=learning_rate)
92 |
93 | for epoch in range(epochs):
94 | loss_sum = 0.0
95 | cnt = 0
96 |
97 | for x, label in dataloader:
98 | optimizer.zero_grad()
99 | loss = model.get_loss(x)
100 | loss.backward()
101 | optimizer.step()
102 |
103 | loss_sum += loss.item()
104 | cnt += 1
105 |
106 | loss_avg = loss_sum / cnt
107 | print(loss_avg)
108 |
109 |
110 | # visualize generated images
111 | with torch.no_grad():
112 | sample_size = 64
113 | z2 = torch.randn(sample_size, latent_dim)
114 | z1_hat = model.decoder2(z2)
115 | z1 = reparameterize(z1_hat, torch.ones_like(z1_hat))
116 | x = model.decoder1(z1)
117 | generated_images = x.view(sample_size, 1, 28, 28)
118 |
119 | grid_img = torchvision.utils.make_grid(generated_images, nrow=8, padding=2, normalize=True)
120 |
121 | plt.imshow(grid_img.permute(1, 2, 0))
122 | plt.axis('off')
123 | plt.show()
--------------------------------------------------------------------------------
/step09/diffusion_model.py:
--------------------------------------------------------------------------------
1 | import math
2 | import torch
3 | import torchvision
4 | import matplotlib.pyplot as plt
5 | from torchvision import transforms
6 | from torch.utils.data import DataLoader
7 | from torch.optim import Adam
8 | import torch.nn.functional as F
9 | from torch import nn
10 | from tqdm import tqdm
11 |
12 |
13 | img_size = 28
14 | batch_size = 128
15 | num_timesteps = 1000
16 | epochs = 10
17 | lr = 1e-3
18 | device = 'cuda' if torch.cuda.is_available() else 'cpu'
19 |
20 |
21 | def show_images(images, rows=2, cols=10):
22 | fig = plt.figure(figsize=(cols, rows))
23 | i = 0
24 | for r in range(rows):
25 | for c in range(cols):
26 | fig.add_subplot(rows, cols, i + 1)
27 | plt.imshow(images[i], cmap='gray')
28 | plt.axis('off')
29 | i += 1
30 | plt.show()
31 |
32 | def _pos_encoding(time_idx, output_dim, device='cpu'):
33 | t, D = time_idx, output_dim
34 | v = torch.zeros(D, device=device)
35 |
36 | i = torch.arange(0, D, device=device)
37 | div_term = torch.exp(i / D * math.log(10000))
38 |
39 | v[0::2] = torch.sin(t / div_term[0::2])
40 | v[1::2] = torch.cos(t / div_term[1::2])
41 | return v
42 |
43 | def pos_encoding(timesteps, output_dim, device='cpu'):
44 | batch_size = len(timesteps)
45 | device = timesteps.device
46 | v = torch.zeros(batch_size, output_dim, device=device)
47 | for i in range(batch_size):
48 | v[i] = _pos_encoding(timesteps[i], output_dim, device)
49 | return v
50 |
51 | class ConvBlock(nn.Module):
52 | def __init__(self, in_ch, out_ch, time_embed_dim):
53 | super().__init__()
54 | self.convs = nn.Sequential(
55 | nn.Conv2d(in_ch, out_ch, 3, padding=1),
56 | nn.BatchNorm2d(out_ch),
57 | nn.ReLU(),
58 | nn.Conv2d(out_ch, out_ch, 3, padding=1),
59 | nn.BatchNorm2d(out_ch),
60 | nn.ReLU()
61 | )
62 | self.mlp = nn.Sequential(
63 | nn.Linear(time_embed_dim, in_ch),
64 | nn.ReLU(),
65 | nn.Linear(in_ch, in_ch)
66 | )
67 |
68 | def forward(self, x, v):
69 | N, C, _, _ = x.shape
70 | v = self.mlp(v)
71 | v = v.view(N, C, 1, 1)
72 | y = self.convs(x + v)
73 | return y
74 |
75 | class UNet(nn.Module):
76 | def __init__(self, in_ch=1, time_embed_dim=100):
77 | super().__init__()
78 | self.time_embed_dim = time_embed_dim
79 |
80 | self.down1 = ConvBlock(in_ch, 64, time_embed_dim)
81 | self.down2 = ConvBlock(64, 128, time_embed_dim)
82 | self.bot1 = ConvBlock(128, 256, time_embed_dim)
83 | self.up2 = ConvBlock(128 + 256, 128, time_embed_dim)
84 | self.up1 = ConvBlock(128 + 64, 64, time_embed_dim)
85 | self.out = nn.Conv2d(64, in_ch, 1)
86 |
87 | self.maxpool = nn.MaxPool2d(2)
88 | self.upsample = nn.Upsample(scale_factor=2, mode='bilinear')
89 |
90 | def forward(self, x, timesteps):
91 | v = pos_encoding(timesteps, self.time_embed_dim, x.device)
92 |
93 | x1 = self.down1(x, v)
94 | x = self.maxpool(x1)
95 | x2 = self.down2(x, v)
96 | x = self.maxpool(x2)
97 |
98 | x = self.bot1(x, v)
99 |
100 | x = self.upsample(x)
101 | x = torch.cat([x, x2], dim=1)
102 | x = self.up2(x, v)
103 | x = self.upsample(x)
104 | x = torch.cat([x, x1], dim=1)
105 | x = self.up1(x, v)
106 | x = self.out(x)
107 | return x
108 |
109 |
110 | class Diffuser:
111 | def __init__(self, num_timesteps=1000, beta_start=0.0001, beta_end=0.02, device='cpu'):
112 | self.num_timesteps = num_timesteps
113 | self.device = device
114 | self.betas = torch.linspace(beta_start, beta_end, num_timesteps, device=device)
115 | self.alphas = 1 - self.betas
116 | self.alpha_bars = torch.cumprod(self.alphas, dim=0)
117 |
118 | def add_noise(self, x_0, t):
119 | T = self.num_timesteps
120 | assert (t >= 1).all() and (t <= T).all()
121 |
122 | t_idx = t - 1 # alpha_bars[0] is for t=1
123 | alpha_bar = self.alpha_bars[t_idx] # (N,)
124 | N = alpha_bar.size(0)
125 | alpha_bar = alpha_bar.view(N, 1, 1, 1) # (N, 1, 1, 1)
126 |
127 | noise = torch.randn_like(x_0, device=self.device)
128 | x_t = torch.sqrt(alpha_bar) * x_0 + torch.sqrt(1 - alpha_bar) * noise
129 | return x_t, noise
130 |
131 | def denoise(self, model, x, t):
132 | T = self.num_timesteps
133 | assert (t >= 1).all() and (t <= T).all()
134 |
135 | t_idx = t - 1 # alphas[0] is for t=1
136 | alpha = self.alphas[t_idx]
137 | alpha_bar = self.alpha_bars[t_idx]
138 | alpha_bar_prev = self.alpha_bars[t_idx-1]
139 |
140 | N = alpha.size(0)
141 | alpha = alpha.view(N, 1, 1, 1)
142 | alpha_bar = alpha_bar.view(N, 1, 1, 1)
143 | alpha_bar_prev = alpha_bar_prev.view(N, 1, 1, 1)
144 |
145 | model.eval()
146 | with torch.no_grad():
147 | eps = model(x, t)
148 | model.train()
149 |
150 | noise = torch.randn_like(x, device=self.device)
151 | noise[t == 1] = 0 # no noise at t=1
152 |
153 | mu = (x - ((1-alpha) / torch.sqrt(1-alpha_bar)) * eps) / torch.sqrt(alpha)
154 | std = torch.sqrt((1-alpha) * (1-alpha_bar_prev) / (1-alpha_bar))
155 | return mu + noise * std
156 |
157 | def reverse_to_img(self, x):
158 | x = x * 255
159 | x = x.clamp(0, 255)
160 | x = x.to(torch.uint8)
161 | x = x.cpu()
162 | to_pil = transforms.ToPILImage()
163 | return to_pil(x)
164 |
165 | def sample(self, model, x_shape=(20, 1, 28, 28)):
166 | batch_size = x_shape[0]
167 | x = torch.randn(x_shape, device=self.device)
168 |
169 | for i in tqdm(range(self.num_timesteps, 0, -1)):
170 | t = torch.tensor([i] * batch_size, device=self.device, dtype=torch.long)
171 | x = self.denoise(model, x, t)
172 |
173 | images = [self.reverse_to_img(x[i]) for i in range(batch_size)]
174 | return images
175 |
176 |
177 | preprocess = transforms.ToTensor()
178 | dataset = torchvision.datasets.MNIST(root='./data', download=True, transform=preprocess)
179 | dataloader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
180 |
181 | diffuser = Diffuser(num_timesteps, device=device)
182 | model = UNet()
183 | model.to(device)
184 | optimizer = Adam(model.parameters(), lr=lr)
185 |
186 | losses = []
187 | for epoch in range(epochs):
188 | loss_sum = 0.0
189 | cnt = 0
190 |
191 | # generate samples every epoch ===================
192 | # images = diffuser.sample(model)
193 | # show_images(images)
194 | # ================================================
195 |
196 | for images, labels in tqdm(dataloader):
197 | optimizer.zero_grad()
198 | x = images.to(device)
199 | t = torch.randint(1, num_timesteps+1, (len(x),), device=device)
200 |
201 | x_noisy, noise = diffuser.add_noise(x, t)
202 | noise_pred = model(x_noisy, t)
203 | loss = F.mse_loss(noise, noise_pred)
204 |
205 | loss.backward()
206 | optimizer.step()
207 |
208 | loss_sum += loss.item()
209 | cnt += 1
210 |
211 | loss_avg = loss_sum / cnt
212 | losses.append(loss_avg)
213 | print(f'Epoch {epoch} | Loss: {loss_avg}')
214 |
215 | # plot losses
216 | plt.plot(losses)
217 | plt.xlabel('Epoch')
218 | plt.ylabel('Loss')
219 | plt.show()
220 |
221 | # generate samples
222 | images = diffuser.sample(model)
223 | show_images(images)
--------------------------------------------------------------------------------
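A quick sanity check of the forward process (a sketch, not part of the training script): with this beta schedule alpha_bars[-1] is nearly zero, so add_noise at t = T should return something very close to pure standard Gaussian noise. Assumes the Diffuser class above is in scope:

import torch

diffuser_cpu = Diffuser(num_timesteps=1000, device='cpu')
x_0 = torch.rand(8, 1, 28, 28)                # dummy images in [0, 1]
t = torch.full((8,), 1000, dtype=torch.long)  # t = T for every sample
x_T, noise = diffuser_cpu.add_noise(x_0, t)

print(diffuser_cpu.alpha_bars[-1].item())     # about 4e-5: x_0 barely contributes
print(x_T.mean().item(), x_T.std().item())    # roughly 0 and 1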
/step09/flower.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/WegraLee/deep-learning-from-scratch-5/86be5ee971cf3cfe3795262f1ee07ce2322e9c9a/step09/flower.png
--------------------------------------------------------------------------------
/step09/gaussian_noise.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 | import torch
5 | import torchvision.transforms as transforms
6 |
7 |
8 | x = torch.randn(3, 64, 64)
9 | T = 1000
10 | betas = torch.linspace(0.0001, 0.02, T)
11 |
12 | for t in range(T):
13 | beta = betas[t]
14 | eps = torch.randn_like(x)
15 | x = torch.sqrt(1 - beta) * x + torch.sqrt(beta) * eps
16 |
17 | # load image
18 | current_dir = os.path.dirname(os.path.abspath(__file__))
19 | file_path = os.path.join(current_dir, 'flower.png')
20 | image = plt.imread(file_path)
21 | print(image.shape) # (64, 64, 3)
22 |
23 | # preprocess
24 | preprocess = transforms.ToTensor()
25 | x = preprocess(image)
26 | print(x.shape) # (3, 64, 64)
27 |
28 | original_x = x.clone() # keep original image
29 |
30 | def reverse_to_img(x):
31 | x = x * 255
32 | x = x.clamp(0, 255)
33 | x = x.to(torch.uint8)
34 | to_pil = transforms.ToPILImage()
35 | return to_pil(x)
36 |
37 | T = 1000
38 | beta_start = 0.0001
39 | beta_end = 0.02
40 | betas = torch.linspace(beta_start, beta_end, T)
41 | imgs = []
42 |
43 | for t in range(T):
44 | if t % 100 == 0:
45 | img = reverse_to_img(x)
46 | imgs.append(img)
47 |
48 | beta = betas[t]
49 | eps = torch.randn_like(x)
50 | x = torch.sqrt(1 - beta) * x + torch.sqrt(beta) * eps
51 |
52 | # show imgs
53 | plt.figure(figsize=(15, 6))
54 | for i, img in enumerate(imgs[:10]):
55 | plt.subplot(2, 5, i + 1)
56 | plt.imshow(img)
57 | plt.title(f'Noise: {i * 100}')
58 | plt.axis('off')
59 | plt.show()
60 |
61 |
62 |
63 | # ============================================
64 | # q(x_t|x_0)
65 | # ============================================
66 | def add_noise(x_0, t, betas):
67 | T = len(betas)
68 | assert t >= 1 and t <= T
69 | t_idx = t - 1 # betas[0] is for t=1
70 |
71 | alphas = 1 - betas
72 | alpha_bars = torch.cumprod(alphas, dim=0)
73 | alpha_bar = alpha_bars[t_idx]
74 |
75 | eps = torch.randn_like(x_0)
76 | x_t = torch.sqrt(alpha_bar) * x_0 + torch.sqrt(1 - alpha_bar) * eps
77 | return x_t
78 |
79 | x = original_x
80 |
81 | t = 100
82 | x_t = add_noise(x, t, betas)
83 |
84 | img = reverse_to_img(x_t)
85 | plt.imshow(img)
86 | plt.title(f'Noise: {t}')
87 | plt.axis('off')
88 | plt.show()
--------------------------------------------------------------------------------
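The point of add_noise above is that t applications of the one-step rule collapse to the closed form x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps. A minimal sketch comparing the two routes statistically; they agree in mean and variance, not sample by sample, because fresh noise is drawn at every step:

import torch

T = 1000
betas = torch.linspace(0.0001, 0.02, T)
alpha_bars = torch.cumprod(1 - betas, dim=0)

x_0 = torch.ones(10000)   # many scalar "pixels" so the statistics are stable
t = 100

# route 1: apply the one-step update t times
x = x_0.clone()
for i in range(t):
    x = torch.sqrt(1 - betas[i]) * x + torch.sqrt(betas[i]) * torch.randn_like(x)

# route 2: jump directly to step t with q(x_t | x_0)
alpha_bar = alpha_bars[t - 1]
x_direct = torch.sqrt(alpha_bar) * x_0 + torch.sqrt(1 - alpha_bar) * torch.randn_like(x_0)

print(x.mean().item(), x_direct.mean().item())  # both roughly sqrt(alpha_bar_t)
print(x.var().item(), x_direct.var().item())    # both roughly 1 - alpha_bar_t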
/step09/simple_unet.py:
--------------------------------------------------------------------------------
1 | import torch
2 | from torch import nn
3 |
4 |
5 | class ConvBlock(nn.Module):
6 | def __init__(self, in_ch, out_ch):
7 | super().__init__()
8 | self.convs = nn.Sequential(
9 | nn.Conv2d(in_ch, out_ch, 3, padding=1),
10 | nn.BatchNorm2d(out_ch),
11 | nn.ReLU(),
12 | nn.Conv2d(out_ch, out_ch, 3, padding=1),
13 | nn.BatchNorm2d(out_ch),
14 | nn.ReLU()
15 | )
16 |
17 | def forward(self, x):
18 | return self.convs(x)
19 |
20 |
21 | class UNet(nn.Module):
22 | def __init__(self, in_ch=1):
23 | super().__init__()
24 |
25 | self.down1 = ConvBlock(in_ch, 64)
26 | self.down2 = ConvBlock(64, 128)
27 | self.bot1 = ConvBlock(128, 256)
28 | self.up2 = ConvBlock(128 + 256, 128)
29 | self.up1 = ConvBlock(128 + 64, 64)
30 | self.out = nn.Conv2d(64, in_ch, 1)
31 |
32 | self.maxpool = nn.MaxPool2d(2)
33 | self.upsample = nn.Upsample(scale_factor=2, mode='bilinear')
34 |
35 | def forward(self, x):
36 | x1 = self.down1(x)
37 | x = self.maxpool(x1)
38 | x2 = self.down2(x)
39 | x = self.maxpool(x2)
40 |
41 | x = self.bot1(x)
42 |
43 | x = self.upsample(x)
44 | x = torch.cat([x, x2], dim=1)
45 | x = self.up2(x)
46 | x = self.upsample(x)
47 | x = torch.cat([x, x1], dim=1)
48 | x = self.up1(x)
49 | x = self.out(x)
50 | return x
51 |
52 |
53 | model = UNet()
54 | x = torch.randn(10, 1, 28, 28) # dummy input
55 | y = model(x)
56 | print(y.shape)
--------------------------------------------------------------------------------
/step10/classifier_free_guidance.py:
--------------------------------------------------------------------------------
1 | import math
2 | import numpy as np
3 | import torch
4 | import torchvision
5 | import matplotlib.pyplot as plt
6 | from torchvision import transforms
7 | from torch.utils.data import DataLoader
8 | from torch.optim import Adam
9 | import torch.nn.functional as F
10 | from torch import nn
11 | from tqdm import tqdm
12 |
13 |
14 | img_size = 28
15 | batch_size = 128
16 | num_timesteps = 1000
17 | epochs = 10
18 | lr = 1e-3
19 | device = 'cuda' if torch.cuda.is_available() else 'cpu'
20 |
21 |
22 | def show_images(images, labels=None, rows=2, cols=10):
23 | fig = plt.figure(figsize=(cols, rows))
24 | i = 0
25 | for r in range(rows):
26 | for c in range(cols):
27 | ax = fig.add_subplot(rows, cols, i + 1)
28 | plt.imshow(images[i], cmap='gray')
29 | if labels is not None:
30 | ax.set_xlabel(labels[i].item())
31 | ax.get_xaxis().set_ticks([])
32 | ax.get_yaxis().set_ticks([])
33 | i += 1
34 | plt.tight_layout()
35 | plt.show()
36 |
37 | def _pos_encoding(time_idx, output_dim, device='cpu'):
38 | t, D = time_idx, output_dim
39 | v = torch.zeros(D, device=device)
40 |
41 | i = torch.arange(0, D, device=device)
42 | div_term = torch.exp(i / D * math.log(10000))
43 |
44 | v[0::2] = torch.sin(t / div_term[0::2])
45 | v[1::2] = torch.cos(t / div_term[1::2])
46 | return v
47 |
48 | def pos_encoding(timesteps, output_dim, device='cpu'):
49 | batch_size = len(timesteps)
50 | device = timesteps.device
51 | v = torch.zeros(batch_size, output_dim, device=device)
52 | for i in range(batch_size):
53 | v[i] = _pos_encoding(timesteps[i], output_dim, device)
54 | return v
55 |
56 | class ConvBlock(nn.Module):
57 | def __init__(self, in_ch, out_ch, time_embed_dim):
58 | super().__init__()
59 | self.convs = nn.Sequential(
60 | nn.Conv2d(in_ch, out_ch, 3, padding=1),
61 | nn.BatchNorm2d(out_ch),
62 | nn.ReLU(),
63 | nn.Conv2d(out_ch, out_ch, 3, padding=1),
64 | nn.BatchNorm2d(out_ch),
65 | nn.ReLU()
66 | )
67 | self.mlp = nn.Sequential(
68 | nn.Linear(time_embed_dim, in_ch),
69 | nn.ReLU(),
70 | nn.Linear(in_ch, in_ch)
71 | )
72 |
73 | def forward(self, x, v):
74 | N, C, _, _ = x.shape
75 | v = self.mlp(v)
76 | v = v.view(N, C, 1, 1)
77 | y = self.convs(x + v)
78 | return y
79 |
80 | class UNetCond(nn.Module):
81 | def __init__(self, in_ch=1, time_embed_dim=100, num_labels=None):
82 | super().__init__()
83 | self.time_embed_dim = time_embed_dim
84 |
85 | self.down1 = ConvBlock(in_ch, 64, time_embed_dim)
86 | self.down2 = ConvBlock(64, 128, time_embed_dim)
87 | self.bot1 = ConvBlock(128, 256, time_embed_dim)
88 | self.up2 = ConvBlock(128 + 256, 128, time_embed_dim)
89 | self.up1 = ConvBlock(128 + 64, 64, time_embed_dim)
90 | self.out = nn.Conv2d(64, in_ch, 1)
91 |
92 | self.maxpool = nn.MaxPool2d(2)
93 | self.upsample = nn.Upsample(scale_factor=2, mode='bilinear')
94 |
95 | if num_labels is not None:
96 | self.label_emb = nn.Embedding(num_labels, time_embed_dim)
97 |
98 | def forward(self, x, timesteps, labels=None):
99 | t = pos_encoding(timesteps, self.time_embed_dim)
100 |
101 | if labels is not None:
102 | t += self.label_emb(labels)
103 |
104 | x1 = self.down1(x, t)
105 | x = self.maxpool(x1)
106 | x2 = self.down2(x, t)
107 | x = self.maxpool(x2)
108 |
109 | x = self.bot1(x, t)
110 |
111 | x = self.upsample(x)
112 | x = torch.cat([x, x2], dim=1)
113 | x = self.up2(x, t)
114 | x = self.upsample(x)
115 | x = torch.cat([x, x1], dim=1)
116 | x = self.up1(x, t)
117 | x = self.out(x)
118 | return x
119 |
120 |
121 | class Diffuser:
122 | def __init__(self, num_timesteps=1000, beta_start=0.0001, beta_end=0.02, device='cpu'):
123 | self.num_timesteps = num_timesteps
124 | self.device = device
125 | self.betas = torch.linspace(beta_start, beta_end, num_timesteps, device=device)
126 | self.alphas = 1 - self.betas
127 | self.alpha_bars = torch.cumprod(self.alphas, dim=0)
128 |
129 | def add_noise(self, x_0, t):
130 | T = self.num_timesteps
131 | assert (t >= 1).all() and (t <= T).all()
132 |
133 | t_idx = t - 1 # alpha_bars[0] is for t=1
134 | alpha_bar = self.alpha_bars[t_idx] # (N,)
135 | alpha_bar = alpha_bar.view(alpha_bar.size(0), 1, 1, 1) # (N, 1, 1, 1)
136 |
137 | noise = torch.randn_like(x_0, device=self.device)
138 | x_t = torch.sqrt(alpha_bar) * x_0 + torch.sqrt(1 - alpha_bar) * noise
139 | return x_t, noise
140 |
141 | def denoise(self, model, x, t, labels, gamma):
142 | T = self.num_timesteps
143 | assert (t >= 1).all() and (t <= T).all()
144 |
145 | t_idx = t - 1 # alphas[0] is for t=1
146 | alpha = self.alphas[t_idx]
147 | alpha_bar = self.alpha_bars[t_idx]
148 | alpha_bar_prev = self.alpha_bars[t_idx-1]
149 |
150 | N = alpha.size(0)
151 | alpha = alpha.view(N, 1, 1, 1)
152 | alpha_bar = alpha_bar.view(N, 1, 1, 1)
153 | alpha_bar_prev = alpha_bar_prev.view(N, 1, 1, 1)
154 |
155 | model.eval()
156 | with torch.no_grad():
157 | eps = model(x, t, labels)
158 | eps_uncond = model(x, t)
159 | eps = eps_uncond + gamma * (eps - eps_uncond)
160 | model.train()
161 |
162 | noise = torch.randn_like(x, device=self.device)
163 | noise[t == 1] = 0 # no noise at t=1
164 |
165 | mu = (x - ((1-alpha) / torch.sqrt(1-alpha_bar)) * eps) / torch.sqrt(alpha)
166 | std = torch.sqrt((1-alpha) * (1-alpha_bar_prev) / (1-alpha_bar))
167 | return mu + noise * std
168 |
169 | def reverse_to_img(self, x):
170 | x = x * 255
171 | x = x.clamp(0, 255)
172 | x = x.to(torch.uint8)
173 | x = x.cpu()
174 | to_pil = transforms.ToPILImage()
175 | return to_pil(x)
176 |
177 | def sample(self, model, x_shape=(20, 1, 28, 28), labels=None, gamma=3.0):
178 | batch_size = x_shape[0]
179 | x = torch.randn(x_shape, device=self.device)
180 | if labels is None:
181 | labels = torch.randint(0, 10, (len(x),), device=self.device)
182 |
183 | for i in tqdm(range(self.num_timesteps, 0, -1)):
184 | t = torch.tensor([i] * batch_size, device=self.device, dtype=torch.long)
185 | x = self.denoise(model, x, t, labels, gamma)
186 |
187 | images = [self.reverse_to_img(x[i]) for i in range(batch_size)]
188 | return images, labels
189 |
190 |
191 | preprocess = transforms.ToTensor()
192 | dataset = torchvision.datasets.MNIST(root='./data', download=True, transform=preprocess)
193 | dataloader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
194 |
195 | diffuser = Diffuser(num_timesteps, device=device)
196 | model = UNetCond(num_labels=10)
197 | model.to(device)
198 | optimizer = Adam(model.parameters(), lr=lr)
199 |
200 | losses = []
201 | for epoch in range(epochs):
202 | loss_sum = 0.0
203 | cnt = 0
204 |
205 | # generate samples every epoch ===================
206 | #images, labels = diffuser.sample(model)
207 | #show_images(images, labels)
208 | # ================================================
209 |
210 | for images, labels in tqdm(dataloader):
211 | optimizer.zero_grad()
212 | x = images.to(device)
213 | labels = labels.to(device)
214 | t = torch.randint(1, num_timesteps+1, (len(x),), device=device)
215 |
216 | if np.random.random() < 0.1:
217 | labels = None
218 |
219 | x_noisy, noise = diffuser.add_noise(x, t)
220 | noise_pred = model(x_noisy, t, labels)
221 | loss = F.mse_loss(noise, noise_pred)
222 |
223 | loss.backward()
224 | optimizer.step()
225 |
226 | loss_sum += loss.item()
227 | cnt += 1
228 |
229 | loss_avg = loss_sum / cnt
230 | losses.append(loss_avg)
231 | print(f'Epoch {epoch} | Loss: {loss_avg}')
232 |
233 | # plot losses
234 | plt.plot(losses)
235 | plt.xlabel('Epoch')
236 | plt.ylabel('Loss')
237 | plt.show()
238 |
239 | # generate samples
240 | images, labels = diffuser.sample(model)
241 | show_images(images, labels)
--------------------------------------------------------------------------------
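After training, sample accepts explicit labels and a guidance scale; a minimal usage sketch that generates twenty copies of the digit 7, assuming diffuser, model, device, and show_images from the script above are in scope. gamma = 0 reduces to the unconditional model, while larger values follow the label more strongly:

labels = torch.full((20,), 7, dtype=torch.long, device=device)  # twenty sevens

images_uncond, _ = diffuser.sample(model, labels=labels, gamma=0.0)
images_guided, _ = diffuser.sample(model, labels=labels, gamma=3.0)

show_images(images_uncond, labels)
show_images(images_guided, labels)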
/step10/conditional.py:
--------------------------------------------------------------------------------
1 | import math
2 | import torch
3 | import torchvision
4 | import matplotlib.pyplot as plt
5 | from torchvision import transforms
6 | from torch.utils.data import DataLoader
7 | from torch.optim import Adam
8 | import torch.nn.functional as F
9 | from torch import nn
10 | from tqdm import tqdm
11 |
12 |
13 | img_size = 28
14 | batch_size = 128
15 | num_timesteps = 1000
16 | epochs = 10
17 | lr = 1e-3
18 | device = 'cuda' if torch.cuda.is_available() else 'cpu'
19 |
20 |
21 | def show_images(images, labels=None, rows=2, cols=10):
22 | fig = plt.figure(figsize=(cols, rows))
23 | i = 0
24 | for r in range(rows):
25 | for c in range(cols):
26 | ax = fig.add_subplot(rows, cols, i + 1)
27 | plt.imshow(images[i], cmap='gray')
28 | if labels is not None:
29 | ax.set_xlabel(labels[i].item())
30 | ax.get_xaxis().set_ticks([])
31 | ax.get_yaxis().set_ticks([])
32 | i += 1
33 | plt.tight_layout()
34 | plt.show()
35 |
36 | def _pos_encoding(time_idx, output_dim, device='cpu'):
37 | t, D = time_idx, output_dim
38 | v = torch.zeros(D, device=device)
39 |
40 | i = torch.arange(0, D, device=device)
41 | div_term = torch.exp(i / D * math.log(10000))
42 |
43 | v[0::2] = torch.sin(t / div_term[0::2])
44 | v[1::2] = torch.cos(t / div_term[1::2])
45 | return v
46 |
47 | def pos_encoding(timesteps, output_dim, device='cpu'):
48 | batch_size = len(timesteps)
49 | device = timesteps.device
50 | v = torch.zeros(batch_size, output_dim, device=device)
51 | for i in range(batch_size):
52 | v[i] = _pos_encoding(timesteps[i], output_dim, device)
53 | return v
54 |
55 | class ConvBlock(nn.Module):
56 | def __init__(self, in_ch, out_ch, time_embed_dim):
57 | super().__init__()
58 | self.convs = nn.Sequential(
59 | nn.Conv2d(in_ch, out_ch, 3, padding=1),
60 | nn.BatchNorm2d(out_ch),
61 | nn.ReLU(),
62 | nn.Conv2d(out_ch, out_ch, 3, padding=1),
63 | nn.BatchNorm2d(out_ch),
64 | nn.ReLU()
65 | )
66 | self.mlp = nn.Sequential(
67 | nn.Linear(time_embed_dim, in_ch),
68 | nn.ReLU(),
69 | nn.Linear(in_ch, in_ch)
70 | )
71 |
72 | def forward(self, x, v):
73 | N, C, _, _ = x.shape
74 | v = self.mlp(v)
75 | v = v.view(N, C, 1, 1)
76 | y = self.convs(x + v)
77 | return y
78 |
79 | class UNetCond(nn.Module):
80 | def __init__(self, in_ch=1, time_embed_dim=100, num_labels=None):
81 | super().__init__()
82 | self.time_embed_dim = time_embed_dim
83 |
84 | self.down1 = ConvBlock(in_ch, 64, time_embed_dim)
85 | self.down2 = ConvBlock(64, 128, time_embed_dim)
86 | self.bot1 = ConvBlock(128, 256, time_embed_dim)
87 | self.up2 = ConvBlock(128 + 256, 128, time_embed_dim)
88 | self.up1 = ConvBlock(128 + 64, 64, time_embed_dim)
89 | self.out = nn.Conv2d(64, in_ch, 1)
90 |
91 | self.maxpool = nn.MaxPool2d(2)
92 | self.upsample = nn.Upsample(scale_factor=2, mode='bilinear')
93 |
94 | if num_labels is not None:
95 | self.label_emb = nn.Embedding(num_labels, time_embed_dim)
96 |
97 | def forward(self, x, timesteps, labels=None):
98 | t = pos_encoding(timesteps, self.time_embed_dim)
99 |
100 | if labels is not None:
101 | t += self.label_emb(labels)
102 |
103 | x1 = self.down1(x, t)
104 | x = self.maxpool(x1)
105 | x2 = self.down2(x, t)
106 | x = self.maxpool(x2)
107 |
108 | x = self.bot1(x, t)
109 |
110 | x = self.upsample(x)
111 | x = torch.cat([x, x2], dim=1)
112 | x = self.up2(x, t)
113 | x = self.upsample(x)
114 | x = torch.cat([x, x1], dim=1)
115 | x = self.up1(x, t)
116 | x = self.out(x)
117 | return x
118 |
119 |
120 | class Diffuser:
121 | def __init__(self, num_timesteps=1000, beta_start=0.0001, beta_end=0.02, device='cpu'):
122 | self.num_timesteps = num_timesteps
123 | self.device = device
124 | self.betas = torch.linspace(beta_start, beta_end, num_timesteps, device=device)
125 | self.alphas = 1 - self.betas
126 | self.alpha_bars = torch.cumprod(self.alphas, dim=0)
127 |
128 | def add_noise(self, x_0, t):
129 | T = self.num_timesteps
130 | assert (t >= 1).all() and (t <= T).all()
131 |
132 | t_idx = t - 1 # alpha_bars[0] is for t=1
133 | alpha_bar = self.alpha_bars[t_idx] # (N,)
134 | alpha_bar = alpha_bar.view(alpha_bar.size(0), 1, 1, 1) # (N, 1, 1, 1)
135 |
136 | noise = torch.randn_like(x_0, device=self.device)
137 | x_t = torch.sqrt(alpha_bar) * x_0 + torch.sqrt(1 - alpha_bar) * noise
138 | return x_t, noise
139 |
140 | def denoise(self, model, x, t, labels):
141 | T = self.num_timesteps
142 | assert (t >= 1).all() and (t <= T).all()
143 |
144 | t_idx = t - 1 # alphas[0] is for t=1
145 | alpha = self.alphas[t_idx]
146 | alpha_bar = self.alpha_bars[t_idx]
147 | alpha_bar_prev = self.alpha_bars[t_idx-1]
148 |
149 | N = alpha.size(0)
150 | alpha = alpha.view(N, 1, 1, 1)
151 | alpha_bar = alpha_bar.view(N, 1, 1, 1)
152 | alpha_bar_prev = alpha_bar_prev.view(N, 1, 1, 1)
153 |
154 | model.eval()
155 | with torch.no_grad():
156 |             eps = model(x, t, labels) # add label embedding
157 | model.train()
158 |
159 | noise = torch.randn_like(x, device=self.device)
160 | noise[t == 1] = 0 # no noise at t=1
161 |
162 | mu = (x - ((1-alpha) / torch.sqrt(1-alpha_bar)) * eps) / torch.sqrt(alpha)
163 | std = torch.sqrt((1-alpha) * (1-alpha_bar_prev) / (1-alpha_bar))
164 | return mu + noise * std
165 |
166 | def reverse_to_img(self, x):
167 | x = x * 255
168 | x = x.clamp(0, 255)
169 | x = x.to(torch.uint8)
170 | x = x.cpu()
171 | to_pil = transforms.ToPILImage()
172 | return to_pil(x)
173 |
174 | def sample(self, model, x_shape=(20, 1, 28, 28), labels=None):
175 | batch_size = x_shape[0]
176 | x = torch.randn(x_shape, device=self.device)
177 | if labels is None:
178 | labels = torch.randint(0, 10, (len(x),), device=self.device)
179 |
180 | for i in tqdm(range(self.num_timesteps, 0, -1)):
181 | t = torch.tensor([i] * batch_size, device=self.device, dtype=torch.long)
182 | x = self.denoise(model, x, t, labels)
183 |
184 | images = [self.reverse_to_img(x[i]) for i in range(batch_size)]
185 | return images, labels
186 |
187 |
188 | preprocess = transforms.ToTensor()
189 | dataset = torchvision.datasets.MNIST(root='./data', download=True, transform=preprocess)
190 | dataloader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
191 |
192 | diffuser = Diffuser(num_timesteps, device=device)
193 | model = UNetCond(num_labels=10)
194 | model.to(device)
195 | optimizer = Adam(model.parameters(), lr=lr)
196 |
197 | losses = []
198 | for epoch in range(epochs):
199 | loss_sum = 0.0
200 | cnt = 0
201 |
202 | # generate samples every epoch ===================
203 | #images, labels = diffuser.sample(model)
204 | #show_images(images, labels)
205 | # ================================================
206 |
207 | for images, labels in tqdm(dataloader):
208 | optimizer.zero_grad()
209 | x = images.to(device)
210 | labels = labels.to(device)
211 | t = torch.randint(1, num_timesteps+1, (len(x),), device=device)
212 |
213 | x_noisy, noise = diffuser.add_noise(x, t)
214 | noise_pred = model(x_noisy, t, labels)
215 | loss = F.mse_loss(noise, noise_pred)
216 |
217 | loss.backward()
218 | optimizer.step()
219 |
220 | loss_sum += loss.item()
221 | cnt += 1
222 |
223 | loss_avg = loss_sum / cnt
224 | losses.append(loss_avg)
225 | print(f'Epoch {epoch} | Loss: {loss_avg}')
226 |
227 | # plot losses
228 | plt.plot(losses)
229 | plt.xlabel('Epoch')
230 | plt.ylabel('Loss')
231 | plt.show()
232 |
233 | # generate samples
234 | images, labels = diffuser.sample(model)
235 | show_images(images, labels)
--------------------------------------------------------------------------------