├── GAN
├── .gitkeep
├── GAN pre trained model.ipynb
└── README.md
├── Pytorch
├── .gitkeep
├── Introduction
│   ├── .gitkeep
│   ├── Intro_to_PyTorch.ipynb
│   └── Linear Regression with Pytorch.ipynb
├── OOPS Concept
│   ├── .gitkeep
│   └── OOPS.ipynb
├── Pretrained Models
│   ├── .gitkeep
│   └── object_recognition_resnet101.py
├── Projects
│   ├── Image Classification
│   │   ├── Binary Class Classification
│   │   │   ├── Cats and Dogs
│   │   │   │   ├── Pytorch_Cats-DogsClassifier_CPU.ipynb
│   │   │   │   └── readme.md
│   │   │   ├── Dental Disease
│   │   │   │   ├── Binary Dental Disease Classification.ipynb
│   │   │   │   └── readme.md
│   │   │   └── readme.md
│   │   ├── Multi Class Classification
│   │   │   ├── English Alphabet Recognition
│   │   │   │   ├── .gitkeep
│   │   │   │   └── English-Alphabet-Recognition.ipynb
│   │   │   ├── Fashion MNIST
│   │   │   │   ├── .gitkeep
│   │   │   │   ├── FashionMNIST (2).ipynb
│   │   │   │   └── Pytorch_Fashion_MNIST.ipynb
│   │   │   ├── Flower Classification
│   │   │   │   └── .gitkeep
│   │   │   ├── MNIST
│   │   │   │   ├── .gitkeep
│   │   │   │   └── MNIST_with_Pytorch.ipynb
│   │   │   └── readme.md
│   │   └── readme.md
│   └── readme.md
└── Tensors
│   ├── .gitkeep
│   ├── Linear Regression without Sklearn.ipynb
│   └── tensors_theory.py
├── README.md
├── Resources
├── .gitkeep
└── README.md
└── static
├── .gitkeep
├── cerebrum.jpg
└── nn.jpg

/GAN/.gitkeep:
--------------------------------------------------------------------------------
1 | 
2 | 
--------------------------------------------------------------------------------
/GAN/README.md:
--------------------------------------------------------------------------------
1 | 
2 | 
--------------------------------------------------------------------------------
/Pytorch/.gitkeep:
--------------------------------------------------------------------------------
1 | 
2 | 
--------------------------------------------------------------------------------
/Pytorch/Introduction/.gitkeep:
--------------------------------------------------------------------------------
1 | 
2 | 
-------------------------------------------------------------------------------- /Pytorch/Introduction/Intro_to_PyTorch.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "colab_type": "text", 7 | "id": "j1k6Y9afpxL6" 8 | }, 9 | "source": [ 10 | "# Intro\n", 11 | "\n", 12 | "[PyTorch](https://pytorch.org/) is a very powerful machine learning framework. Central to PyTorch are [tensors](https://pytorch.org/docs/stable/tensors.html), a generalization of matrices to higher ranks. One intuitive example of a tensor is an image with three color channels: A 3-channel (red, green, blue) image which is 64 pixels wide and 64 pixels tall is a $3\\times64\\times64$ tensor. You can access the PyTorch framework by writing `import torch` near the top of your code, along with all of your other import statements.\n", 13 | "\n", 14 | "This guide will help introduce you to the functionality of PyTorch, but don't worry too much about memorizing it: the assignments will link to relevant documentation where necessary." 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": 2, 20 | "metadata": { 21 | "colab": {}, 22 | "colab_type": "code", 23 | "id": "fwp6T5ZMteDC" 24 | }, 25 | "outputs": [], 26 | "source": [ 27 | "import torch" 28 | ] 29 | }, 30 | { 31 | "cell_type": "markdown", 32 | "metadata": { 33 | "colab_type": "text", 34 | "id": "IvXp0rlPBqdQ" 35 | }, 36 | "source": [ 37 | "# Why PyTorch?\n", 38 | "\n", 39 | "One important question worth asking is, why is PyTorch being used for this course? There is a great breakdown by [the Gradient](https://thegradient.pub/state-of-ml-frameworks-2019-pytorch-dominates-research-tensorflow-dominates-industry/) looking at the state of machine learning frameworks today. 
In part, as highlighted by the article, PyTorch is generally more pythonic than alternative frameworks, easier to debug, and is the most-used framework in machine learning research by a large and growing margin. While PyTorch's primary alternative, Tensorflow, has attempted to integrate many of PyTorch's features, Tensorflow's implementations come with some inherent limitations highlighted in the article.\n", 40 | "\n", 41 | "Notably, while PyTorch's industry usage has grown, Tensorflow is still (for now) a slight favorite in industry. In practice, the features that make PyTorch attractive for research also make it attractive for education, and the general trend of machine learning research and practice toward PyTorch makes it the more forward-looking choice. " 42 | ] 43 | }, 44 | { 45 | "cell_type": "markdown", 46 | "metadata": { 47 | "colab_type": "text", 48 | "id": "MCgwdP20r1yX" 49 | }, 50 | "source": [ 51 | "# Tensor Properties\n", 52 | "One way to create tensors from a list or an array is to use `torch.Tensor`. It'll be used to set up examples in this notebook, but you'll never need to use it in the course - in fact, if you find yourself needing it, that's probably not the correct answer. 
" 53 | ] 54 | }, 55 | { 56 | "cell_type": "code", 57 | "execution_count": 3, 58 | "metadata": { 59 | "colab": {}, 60 | "colab_type": "code", 61 | "id": "B0hgYekGsxlB" 62 | }, 63 | "outputs": [], 64 | "source": [ 65 | "example_tensor = torch.Tensor(\n", 66 | " [\n", 67 | " [[1, 2], [3, 4]], \n", 68 | " [[5, 6], [7, 8]], \n", 69 | " [[9, 0], [1, 2]]\n", 70 | " ]\n", 71 | ")" 72 | ] 73 | }, 74 | { 75 | "cell_type": "markdown", 76 | "metadata": { 77 | "colab_type": "text", 78 | "id": "9dO4C2oft7zq" 79 | }, 80 | "source": [ 81 | "You can view the tensor in the notebook by simply printing it out (though some larger tensors will be cut off)" 82 | ] 83 | }, 84 | { 85 | "cell_type": "code", 86 | "execution_count": 4, 87 | "metadata": { 88 | "colab": { 89 | "base_uri": "https://localhost:8080/", 90 | "height": 153 91 | }, 92 | "colab_type": "code", 93 | "id": "U2FKEzeYuEOX", 94 | "outputId": "dfa12ff7-afd1-4737-a669-54f36b4209dd" 95 | }, 96 | "outputs": [ 97 | { 98 | "data": { 99 | "text/plain": [ 100 | "tensor([[[1., 2.],\n", 101 | " [3., 4.]],\n", 102 | "\n", 103 | " [[5., 6.],\n", 104 | " [7., 8.]],\n", 105 | "\n", 106 | " [[9., 0.],\n", 107 | " [1., 2.]]])" 108 | ] 109 | }, 110 | "execution_count": 4, 111 | "metadata": {}, 112 | "output_type": "execute_result" 113 | } 114 | ], 115 | "source": [ 116 | "example_tensor" 117 | ] 118 | }, 119 | { 120 | "cell_type": "markdown", 121 | "metadata": { 122 | "colab_type": "text", 123 | "id": "VUwlmUngw-VR" 124 | }, 125 | "source": [ 126 | "## Tensor Properties: Device\n", 127 | "\n", 128 | "One important property is the device of the tensor - throughout this notebook you'll be sticking to tensors which are on the CPU. However, throughout the course you'll also be using tensors on GPU (that is, a graphics card which will be provided for you to use for the course). To view the device of the tensor, all you need to write is `example_tensor.device`. 
To move a tensor to a new device, you can write `new_tensor = example_tensor.to(device)` where device will be either `cpu` or `cuda`." 129 | ] 130 | }, 131 | { 132 | "cell_type": "code", 133 | "execution_count": 5, 134 | "metadata": { 135 | "colab": { 136 | "base_uri": "https://localhost:8080/", 137 | "height": 34 138 | }, 139 | "colab_type": "code", 140 | "id": "R7SF44_Vw9h0", 141 | "outputId": "57f90e38-f9e1-4115-8f27-ebe651d5b2fa" 142 | }, 143 | "outputs": [ 144 | { 145 | "data": { 146 | "text/plain": [ 147 | "device(type='cpu')" 148 | ] 149 | }, 150 | "execution_count": 5, 151 | "metadata": {}, 152 | "output_type": "execute_result" 153 | } 154 | ], 155 | "source": [ 156 | "example_tensor.device" 157 | ] 158 | }, 159 | { 160 | "cell_type": "markdown", 161 | "metadata": { 162 | "colab_type": "text", 163 | "id": "FkfySyFduHQi" 164 | }, 165 | "source": [ 166 | "## Tensor Properties: Shape\n", 167 | "\n", 168 | "And you can get the number of elements in each dimension by printing out the tensor's shape, using `example_tensor.shape`, something you're likely familiar with if you've used numpy. For example, this tensor is a $3\times2\times2$ tensor, since it has 3 elements, each of which is $2\times2$. 
" 169 | ] 170 | }, 171 | { 172 | "cell_type": "code", 173 | "execution_count": 6, 174 | "metadata": { 175 | "colab": { 176 | "base_uri": "https://localhost:8080/", 177 | "height": 34 178 | }, 179 | "colab_type": "code", 180 | "id": "DKmfzpOBun0t", 181 | "outputId": "883009b6-7300-4329-f9ec-df99cc36d846" 182 | }, 183 | "outputs": [ 184 | { 185 | "data": { 186 | "text/plain": [ 187 | "torch.Size([3, 2, 2])" 188 | ] 189 | }, 190 | "execution_count": 6, 191 | "metadata": {}, 192 | "output_type": "execute_result" 193 | } 194 | ], 195 | "source": [ 196 | "example_tensor.shape" 197 | ] 198 | }, 199 | { 200 | "cell_type": "markdown", 201 | "metadata": { 202 | "colab_type": "text", 203 | "id": "aL954xmAuq4b" 204 | }, 205 | "source": [ 206 | "You can also get the size of a particular dimension $n$ using `example_tensor.shape[n]` or equivalently `example_tensor.size(n)`" 207 | ] 208 | }, 209 | { 210 | "cell_type": "code", 211 | "execution_count": 6, 212 | "metadata": { 213 | "colab": { 214 | "base_uri": "https://localhost:8080/", 215 | "height": 51 216 | }, 217 | "colab_type": "code", 218 | "id": "7IKy3BB8uqBo", 219 | "outputId": "7fac1275-132f-4d2b-bf63-73065a2aea6a" 220 | }, 221 | "outputs": [ 222 | { 223 | "name": "stdout", 224 | "output_type": "stream", 225 | "text": [ 226 | "shape[0] = 3\n", 227 | "size(1) = 2\n" 228 | ] 229 | } 230 | ], 231 | "source": [ 232 | "print(\"shape[0] =\", example_tensor.shape[0])\n", 233 | "print(\"size(1) =\", example_tensor.size(1))" 234 | ] 235 | }, 236 | { 237 | "cell_type": "markdown", 238 | "metadata": { 239 | "colab_type": "text", 240 | "id": "3pzzG8bav5rl" 241 | }, 242 | "source": [ 243 | "Finally, it is sometimes useful to get the number of dimensions (rank) or the number of elements, which you can do as follows" 244 | ] 245 | }, 246 | { 247 | "cell_type": "code", 248 | "execution_count": 7, 249 | "metadata": { 250 | "colab": { 251 | "base_uri": "https://localhost:8080/", 252 | "height": 51 253 | }, 254 | "colab_type": "code", 255 | 
"id": "l_j9qTwyv41-", 256 | "outputId": "5921cbd1-19a2-4543-9488-3f72c0cb4970" 257 | }, 258 | "outputs": [ 259 | { 260 | "name": "stdout", 261 | "output_type": "stream", 262 | "text": [ 263 | "Rank = 3\n", 264 | "Number of elements = 12\n" 265 | ] 266 | } 267 | ], 268 | "source": [ 269 | "print(\"Rank =\", len(example_tensor.shape))\n", 270 | "print(\"Number of elements =\", example_tensor.numel())" 271 | ] 272 | }, 273 | { 274 | "cell_type": "markdown", 275 | "metadata": { 276 | "colab_type": "text", 277 | "id": "gibyKQJQzLkm" 278 | }, 279 | "source": [ 280 | "# Indexing Tensors\n", 281 | "\n", 282 | "As with numpy, you can access specific elements or subsets of elements of a tensor. To access the $n$-th element, you can simply write `example_tensor[n]` - as with Python in general, these dimensions are 0-indexed. " 283 | ] 284 | }, 285 | { 286 | "cell_type": "code", 287 | "execution_count": 8, 288 | "metadata": { 289 | "colab": { 290 | "base_uri": "https://localhost:8080/", 291 | "height": 51 292 | }, 293 | "colab_type": "code", 294 | "id": "F87bFA5SzNz7", 295 | "outputId": "1b0a8381-6fd8-40b4-a5c8-88cc80029f8e" 296 | }, 297 | "outputs": [ 298 | { 299 | "data": { 300 | "text/plain": [ 301 | "tensor([[5., 6.],\n", 302 | " [7., 8.]])" 303 | ] 304 | }, 305 | "execution_count": 8, 306 | "metadata": {}, 307 | "output_type": "execute_result" 308 | } 309 | ], 310 | "source": [ 311 | "example_tensor[1]" 312 | ] 313 | }, 314 | { 315 | "cell_type": "markdown", 316 | "metadata": { 317 | "colab_type": "text", 318 | "id": "1CegGw5wzpGa" 319 | }, 320 | "source": [ 321 | "In addition, if you want to access the $j$-th dimension of the $i$-th example, you can write `example_tensor[i, j]`" 322 | ] 323 | }, 324 | { 325 | "cell_type": "code", 326 | "execution_count": 9, 327 | "metadata": { 328 | "colab": { 329 | "base_uri": "https://localhost:8080/", 330 | "height": 34 331 | }, 332 | "colab_type": "code", 333 | "id": "bl1JSZcRz0xn", 334 | "outputId": 
"7f98e47b-66cb-4927-b784-7e4bcb9eb687" 335 | }, 336 | "outputs": [ 337 | { 338 | "data": { 339 | "text/plain": [ 340 | "tensor(7.)" 341 | ] 342 | }, 343 | "execution_count": 9, 344 | "metadata": {}, 345 | "output_type": "execute_result" 346 | } 347 | ], 348 | "source": [ 349 | "example_tensor[1, 1, 0]" 350 | ] 351 | }, 352 | { 353 | "cell_type": "markdown", 354 | "metadata": { 355 | "colab_type": "text", 356 | "id": "dyQRCRIa4NaY" 357 | }, 358 | "source": [ 359 | "Note that if you'd like to get a Python scalar value from a tensor, you can use `example_scalar.item()`" 360 | ] 361 | }, 362 | { 363 | "cell_type": "code", 364 | "execution_count": 10, 365 | "metadata": { 366 | "colab": { 367 | "base_uri": "https://localhost:8080/", 368 | "height": 34 369 | }, 370 | "colab_type": "code", 371 | "id": "e56KSJOq4YOE", 372 | "outputId": "29e1fd13-32df-40c5-e558-3193fa5da629" 373 | }, 374 | "outputs": [ 375 | { 376 | "data": { 377 | "text/plain": [ 378 | "7.0" 379 | ] 380 | }, 381 | "execution_count": 10, 382 | "metadata": {}, 383 | "output_type": "execute_result" 384 | } 385 | ], 386 | "source": [ 387 | "example_scalar = example_tensor[1, 1, 0]\n", 388 | "example_scalar.item()" 389 | ] 390 | }, 391 | { 392 | "cell_type": "markdown", 393 | "metadata": { 394 | "colab_type": "text", 395 | "id": "wZdMEQfu0A7h" 396 | }, 397 | "source": [ 398 | "In addition, you can use slices to select the $i$-th element along a particular dimension for every entry, as in `x[:, i]`. 
For example, if you want the top-left element of each matrix in `example_tensor` - that is, the `0, 0` element of each - you can write:" 399 | ] 400 | }, 401 | { 402 | "cell_type": "code", 403 | "execution_count": 11, 404 | "metadata": { 405 | "colab": { 406 | "base_uri": "https://localhost:8080/", 407 | "height": 34 408 | }, 409 | "colab_type": "code", 410 | "id": "x2cFxJx50eGH", 411 | "outputId": "e66eade9-4b4b-4c7a-ea99-a83195d10541" 412 | }, 413 | "outputs": [ 414 | { 415 | "data": { 416 | "text/plain": [ 417 | "tensor([1., 5., 9.])" 418 | ] 419 | }, 420 | "execution_count": 11, 421 | "metadata": {}, 422 | "output_type": "execute_result" 423 | } 424 | ], 425 | "source": [ 426 | "example_tensor[:, 0, 0]" 427 | ] 428 | }, 429 | { 430 | "cell_type": "markdown", 431 | "metadata": { 432 | "colab_type": "text", 433 | "id": "w-rTBP-1whd2" 434 | }, 435 | "source": [ 436 | "# Initializing Tensors\n", 437 | "\n", 438 | "There are many ways to create new tensors in PyTorch, but in this course, the most important ones are: \n", 439 | "\n", 440 | "[`torch.ones_like`](https://pytorch.org/docs/master/generated/torch.ones_like.html): creates a tensor of all ones with the same shape and device as `example_tensor`."
441 | ] 442 | }, 443 | { 444 | "cell_type": "code", 445 | "execution_count": 12, 446 | "metadata": { 447 | "colab": { 448 | "base_uri": "https://localhost:8080/", 449 | "height": 153 450 | }, 451 | "colab_type": "code", 452 | "id": "g7gbs4AnwlIo", 453 | "outputId": "b0c67ed9-e33f-47d6-d95c-e53bc4f90dec" 454 | }, 455 | "outputs": [ 456 | { 457 | "data": { 458 | "text/plain": [ 459 | "tensor([[[1., 1.],\n", 460 | " [1., 1.]],\n", 461 | "\n", 462 | " [[1., 1.],\n", 463 | " [1., 1.]],\n", 464 | "\n", 465 | " [[1., 1.],\n", 466 | " [1., 1.]]])" 467 | ] 468 | }, 469 | "execution_count": 12, 470 | "metadata": {}, 471 | "output_type": "execute_result" 472 | } 473 | ], 474 | "source": [ 475 | "torch.ones_like(example_tensor)" 476 | ] 477 | }, 478 | { 479 | "cell_type": "markdown", 480 | "metadata": { 481 | "colab_type": "text", 482 | "id": "_aIbSlaJy9Z0" 483 | }, 484 | "source": [ 485 | "[`torch.zeros_like`](https://pytorch.org/docs/master/generated/torch.zeros_like.html): creates a tensor of all zeros with the same shape and device as `example_tensor`" 486 | ] 487 | }, 488 | { 489 | "cell_type": "code", 490 | "execution_count": 13, 491 | "metadata": { 492 | "colab": { 493 | "base_uri": "https://localhost:8080/", 494 | "height": 153 495 | }, 496 | "colab_type": "code", 497 | "id": "X4cWQduzzCd8", 498 | "outputId": "dbc8a5fa-8db1-4f6d-e38e-d1deb982ff36" 499 | }, 500 | "outputs": [ 501 | { 502 | "data": { 503 | "text/plain": [ 504 | "tensor([[[0., 0.],\n", 505 | " [0., 0.]],\n", 506 | "\n", 507 | " [[0., 0.],\n", 508 | " [0., 0.]],\n", 509 | "\n", 510 | " [[0., 0.],\n", 511 | " [0., 0.]]])" 512 | ] 513 | }, 514 | "execution_count": 13, 515 | "metadata": {}, 516 | "output_type": "execute_result" 517 | } 518 | ], 519 | "source": [ 520 | "torch.zeros_like(example_tensor)" 521 | ] 522 | }, 523 | { 524 | "cell_type": "markdown", 525 | "metadata": { 526 | "colab_type": "text", 527 | "id": "wsOmgS1izDS_" 528 | }, 529 | "source": [ 530 | 
"[`torch.randn_like`](https://pytorch.org/docs/stable/generated/torch.randn_like.html): creates a tensor with every element sampled from a [Normal (or Gaussian) distribution](https://en.wikipedia.org/wiki/Normal_distribution) with the same shape and device as `example_tensor`\n" 531 | ] 532 | }, 533 | { 534 | "cell_type": "code", 535 | "execution_count": 14, 536 | "metadata": { 537 | "colab": { 538 | "base_uri": "https://localhost:8080/", 539 | "height": 153 540 | }, 541 | "colab_type": "code", 542 | "id": "2hto51IazDow", 543 | "outputId": "cb62a68a-6171-4d1e-eb9b-f31784464aac" 544 | }, 545 | "outputs": [ 546 | { 547 | "data": { 548 | "text/plain": [ 549 | "tensor([[[-0.3675, 0.2242],\n", 550 | " [-0.3378, -1.0944]],\n", 551 | "\n", 552 | " [[ 1.5371, 0.7701],\n", 553 | " [-0.1490, -0.0928]],\n", 554 | "\n", 555 | " [[ 0.3270, 0.4642],\n", 556 | " [ 0.1494, 0.1283]]])" 557 | ] 558 | }, 559 | "execution_count": 14, 560 | "metadata": {}, 561 | "output_type": "execute_result" 562 | } 563 | ], 564 | "source": [ 565 | "torch.randn_like(example_tensor)" 566 | ] 567 | }, 568 | { 569 | "cell_type": "markdown", 570 | "metadata": { 571 | "colab_type": "text", 572 | "id": "HXp0i5Cf6AGj" 573 | }, 574 | "source": [ 575 | "Sometimes (though less often than you'd expect), you might need to initialize a tensor knowing only the shape and device, without a tensor for reference for `ones_like` or `randn_like`. 
In this case, you can create a $2\times2$ tensor as follows:" 576 | ] 577 | }, 578 | { 579 | "cell_type": "code", 580 | "execution_count": 15, 581 | "metadata": { 582 | "colab": { 583 | "base_uri": "https://localhost:8080/", 584 | "height": 51 585 | }, 586 | "colab_type": "code", 587 | "id": "RZRqt3-S6cUZ", 588 | "outputId": "7bef97cc-a303-4200-c0f8-ef9bf3cb4996" 589 | }, 590 | "outputs": [ 591 | { 592 | "data": { 593 | "text/plain": [ 594 | "tensor([[ 0.2235, -1.8912],\n", 595 | " [-1.2873, 0.7405]])" 596 | ] 597 | }, 598 | "execution_count": 15, 599 | "metadata": {}, 600 | "output_type": "execute_result" 601 | } 602 | ], 603 | "source": [ 604 | "torch.randn(2, 2, device='cpu') # Alternatively, for a GPU tensor, you'd use device='cuda'" 605 | ] 606 | }, 607 | { 608 | "cell_type": "markdown", 609 | "metadata": { 610 | "colab_type": "text", 611 | "id": "JTkmDwVsrM6R" 612 | }, 613 | "source": [ 614 | "# Basic Functions\n", 615 | "\n", 616 | "There are a number of basic functions that you should know to use PyTorch - if you're familiar with numpy, all commonly-used functions exist in PyTorch, usually with the same name. 
You can perform element-wise multiplication / division by a scalar $c$ by simply writing `c * example_tensor`, and element-wise addition / subtraction by a scalar by writing `example_tensor + c`\n", 617 | "\n", 618 | "Note that most operations are not in-place in PyTorch, which means that they don't change the original variable's data (However, you can reassign the same variable name to the changed data if you'd like, such as `example_tensor = example_tensor + 1`)" 619 | ] 620 | }, 621 | { 622 | "cell_type": "code", 623 | "execution_count": 16, 624 | "metadata": { 625 | "colab": { 626 | "base_uri": "https://localhost:8080/", 627 | "height": 153 628 | }, 629 | "colab_type": "code", 630 | "id": "FpfwOUdopsF_", 631 | "outputId": "32347400-2e6a-40c6-e6f1-21e6aacde795" 632 | }, 633 | "outputs": [ 634 | { 635 | "data": { 636 | "text/plain": [ 637 | "tensor([[[ -8., -6.],\n", 638 | " [ -4., -2.]],\n", 639 | "\n", 640 | " [[ 0., 2.],\n", 641 | " [ 4., 6.]],\n", 642 | "\n", 643 | " [[ 8., -10.],\n", 644 | " [ -8., -6.]]])" 645 | ] 646 | }, 647 | "execution_count": 16, 648 | "metadata": {}, 649 | "output_type": "execute_result" 650 | } 651 | ], 652 | "source": [ 653 | "(example_tensor - 5) * 2" 654 | ] 655 | }, 656 | { 657 | "cell_type": "markdown", 658 | "metadata": { 659 | "colab_type": "text", 660 | "id": "uciZnx4b3UjX" 661 | }, 662 | "source": [ 663 | "You can calculate the mean or standard deviation of a tensor using [`example_tensor.mean()`](https://pytorch.org/docs/stable/generated/torch.mean.html) or [`example_tensor.std()`](https://pytorch.org/docs/stable/generated/torch.std.html). 
" 664 | ] 665 | }, 666 | { 667 | "cell_type": "code", 668 | "execution_count": 17, 669 | "metadata": { 670 | "colab": { 671 | "base_uri": "https://localhost:8080/", 672 | "height": 51 673 | }, 674 | "colab_type": "code", 675 | "id": "0ELXUKG7329z", 676 | "outputId": "720dd190-7dd4-43f1-e53c-cba4263eb2be" 677 | }, 678 | "outputs": [ 679 | { 680 | "name": "stdout", 681 | "output_type": "stream", 682 | "text": [ 683 | "Mean: tensor(4.)\n", 684 | "Stdev: tensor(2.9848)\n" 685 | ] 686 | } 687 | ], 688 | "source": [ 689 | "print(\"Mean:\", example_tensor.mean())\n", 690 | "print(\"Stdev:\", example_tensor.std())" 691 | ] 692 | }, 693 | { 694 | "cell_type": "markdown", 695 | "metadata": { 696 | "colab_type": "text", 697 | "id": "_QsyTRym32SX" 698 | }, 699 | "source": [ 700 | "You might also want to find the mean or standard deviation along a particular dimension. To do this you can simply pass the number corresponding to that dimension to the function. For example, if you want to get the average $2\times2$ matrix of the $3\times2\times2$ `example_tensor`, you can write:" 701 | ] 702 | }, 703 | { 704 | "cell_type": "code", 705 | "execution_count": 18, 706 | "metadata": { 707 | "colab": { 708 | "base_uri": "https://localhost:8080/", 709 | "height": 51 710 | }, 711 | "colab_type": "code", 712 | "id": "eCJl3Im25B9k", 713 | "outputId": "4bd9decd-579e-462c-bde1-ee8d9d1b2061" 714 | }, 715 | "outputs": [ 716 | { 717 | "data": { 718 | "text/plain": [ 719 | "tensor([[5.0000, 2.6667],\n", 720 | " [3.6667, 4.6667]])" 721 | ] 722 | }, 723 | "execution_count": 18, 724 | "metadata": {}, 725 | "output_type": "execute_result" 726 | } 727 | ], 728 | "source": [ 729 | "example_tensor.mean(0)\n", 730 | "\n", 731 | "# Equivalently, you could also write:\n", 732 | "# example_tensor.mean(dim=0)\n", 733 | "# example_tensor.mean(axis=0)\n", 734 | "# torch.mean(example_tensor, 0)\n", 735 | "# torch.mean(example_tensor, dim=0)\n", 736 | "# torch.mean(example_tensor, axis=0)" 737 | ] 738 | }, 739 
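As a quick sanity check (an editorial addition, not a cell from the original notebook), the dimension-0 mean above is just the element-wise average of the three $2\times2$ matrices, and the `keepdim=True` argument keeps the reduced dimension around with size 1:

```python
import torch

example_tensor = torch.Tensor(
    [
        [[1, 2], [3, 4]],
        [[5, 6], [7, 8]],
        [[9, 0], [1, 2]]
    ]
)

# The mean over dim 0 averages the three 2x2 matrices element-wise
manual_mean = (example_tensor[0] + example_tensor[1] + example_tensor[2]) / 3
assert torch.allclose(example_tensor.mean(0), manual_mean)

# keepdim=True preserves the reduced dimension as size 1
print(example_tensor.mean(0, keepdim=True).shape)  # torch.Size([1, 2, 2])
```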
| { 740 | "cell_type": "markdown", 741 | "metadata": { 742 | "colab_type": "text", 743 | "id": "Vb-_5ubc8t97" 744 | }, 745 | "source": [ 746 | "PyTorch has many other powerful functions but these should be all of PyTorch functions you need for this course outside of its neural network module (`torch.nn`)." 747 | ] 748 | }, 749 | { 750 | "cell_type": "markdown", 751 | "metadata": { 752 | "colab_type": "text", 753 | "id": "RtWjExD69JEs" 754 | }, 755 | "source": [ 756 | "# PyTorch Neural Network Module (`torch.nn`)\n", 757 | "\n", 758 | "PyTorch has a lot of powerful classes in its `torch.nn` module (Usually, imported as simply `nn`). These classes allow you to create a new function which transforms a tensor in specific way, often retaining information when called multiple times." 759 | ] 760 | }, 761 | { 762 | "cell_type": "code", 763 | "execution_count": 2, 764 | "metadata": { 765 | "colab": {}, 766 | "colab_type": "code", 767 | "id": "UYrgloYo_slC" 768 | }, 769 | "outputs": [], 770 | "source": [ 771 | "import torch.nn as nn" 772 | ] 773 | }, 774 | { 775 | "cell_type": "markdown", 776 | "metadata": { 777 | "colab_type": "text", 778 | "id": "uyCPVmTD_kkl" 779 | }, 780 | "source": [ 781 | "## `nn.Linear`\n", 782 | "\n", 783 | "To create a linear layer, you need to pass it the number of input dimensions and the number of output dimensions. The linear object initialized as `nn.Linear(10, 2)` will take in a $n\\times10$ matrix and return an $n\\times2$ matrix, where all $n$ elements have had the same linear transformation performed. For example, you can initialize a linear layer which performs the operation $Ax + b$, where $A$ and $b$ are initialized randomly when you generate the [`nn.Linear()`](https://pytorch.org/docs/stable/generated/torch.nn.Linear.html) object. 
" 784 | ] 785 | }, 786 | { 787 | "cell_type": "code", 788 | "execution_count": 7, 789 | "metadata": { 790 | "colab": { 791 | "base_uri": "https://localhost:8080/", 792 | "height": 68 793 | }, 794 | "colab_type": "code", 795 | "id": "pNPaHPo89VrN", 796 | "outputId": "c14dc316-ae68-49d3-a8eb-8ad6e1464f01" 797 | }, 798 | "outputs": [ 799 | { 800 | "data": { 801 | "text/plain": [ 802 | "tensor([[ 0.2102, 0.5055],\n", 803 | " [-0.5417, 0.8288],\n", 804 | " [ 0.1755, 0.3779]], grad_fn=<AddmmBackward>)" 805 | ] 806 | }, 807 | "execution_count": 7, 808 | "metadata": {}, 809 | "output_type": "execute_result" 810 | } 811 | ], 812 | "source": [ 813 | "linear = nn.Linear(10, 2)\n", 814 | "example_input = torch.randn(3, 10)\n", 815 | "example_output = linear(example_input)\n", 816 | "example_output" 817 | ] 818 | }, 819 | { 820 | "cell_type": "markdown", 821 | "metadata": { 822 | "colab_type": "text", 823 | "id": "YGNULkJR_mzn" 824 | }, 825 | "source": [ 826 | "## `nn.ReLU`\n", 827 | "\n", 828 | "[`nn.ReLU()`](https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html) will create an object that, when receiving a tensor, will perform a ReLU activation function. This will be reviewed further in lecture, but in essence, a ReLU non-linearity sets all negative numbers in a tensor to zero. In general, the simplest neural networks are composed of a series of linear transformations, each followed by activation functions. 
" 829 | ] 830 | }, 831 | { 832 | "cell_type": "code", 833 | "execution_count": 21, 834 | "metadata": { 835 | "colab": { 836 | "base_uri": "https://localhost:8080/", 837 | "height": 68 838 | }, 839 | "colab_type": "code", 840 | "id": "nGxVFS3nBASc", 841 | "outputId": "d5f57584-1bad-4803-ba8c-b69881db4a1f" 842 | }, 843 | "outputs": [ 844 | { 845 | "data": { 846 | "text/plain": [ 847 | "tensor([[0.2900, 0.0000],\n", 848 | " [0.4298, 0.4173],\n", 849 | " [0.4861, 0.0000]], grad_fn=<ReluBackward0>)" 850 | ] 851 | }, 852 | "execution_count": 21, 853 | "metadata": {}, 854 | "output_type": "execute_result" 855 | } 856 | ], 857 | "source": [ 858 | "relu = nn.ReLU()\n", 859 | "relu_output = relu(example_output)\n", 860 | "relu_output" 861 | ] 862 | }, 863 | { 864 | "cell_type": "markdown", 865 | "metadata": { 866 | "colab_type": "text", 867 | "id": "KzfOEZ03AJzA" 868 | }, 869 | "source": [ 870 | "## `nn.BatchNorm1d`\n", 871 | "\n", 872 | "[`nn.BatchNorm1d`](https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm1d.html) is a normalization technique that will rescale a batch of $n$ inputs to have a consistent mean and standard deviation between batches. \n", 873 | "\n", 874 | "As indicated by the `1d` in its name, this is for situations where you expect a set of inputs, where each of them is a flat list of numbers. In other words, each input is a vector, not a matrix or higher-dimensional tensor. For a set of images, each of which is a higher-dimensional tensor, you'd use [`nn.BatchNorm2d`](https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm2d.html), discussed later on this page.\n", 875 | "\n", 876 | "`nn.BatchNorm1d` takes an argument of the number of input dimensions of each object in the batch (the size of each example vector)." 
877 | ] 878 | }, 879 | { 880 | "cell_type": "code", 881 | "execution_count": 22, 882 | "metadata": { 883 | "colab": { 884 | "base_uri": "https://localhost:8080/", 885 | "height": 68 886 | }, 887 | "colab_type": "code", 888 | "id": "O4tYsi9-G9vM", 889 | "outputId": "ba61d37c-a8af-4663-fcc2-1691c6d241de" 890 | }, 891 | "outputs": [ 892 | { 893 | "data": { 894 | "text/plain": [ 895 | "tensor([[-1.3570, -0.7070],\n", 896 | " [ 0.3368, 1.4140],\n", 897 | " [ 1.0202, -0.7070]], grad_fn=)" 898 | ] 899 | }, 900 | "execution_count": 22, 901 | "metadata": {}, 902 | "output_type": "execute_result" 903 | } 904 | ], 905 | "source": [ 906 | "batchnorm = nn.BatchNorm1d(2)\n", 907 | "batchnorm_output = batchnorm(relu_output)\n", 908 | "batchnorm_output" 909 | ] 910 | }, 911 | { 912 | "cell_type": "markdown", 913 | "metadata": { 914 | "colab_type": "text", 915 | "id": "EMZewDz9Idr1" 916 | }, 917 | "source": [ 918 | "## `nn.Sequential`\n", 919 | "\n", 920 | "[`nn.Sequential`](https://pytorch.org/docs/stable/generated/torch.nn.Sequential.html) creates a single operation that performs a sequence of operations. 
For example, you can write a neural network layer with a batch normalization as" 921 | ] 922 | }, 923 | { 924 | "cell_type": "code", 925 | "execution_count": 23, 926 | "metadata": { 927 | "colab": { 928 | "base_uri": "https://localhost:8080/", 929 | "height": 221 930 | }, 931 | "colab_type": "code", 932 | "id": "R3GhASjyJt3N", 933 | "outputId": "3ef779ca-a17b-42fd-f2e5-fbb5fdc60b13" 934 | }, 935 | "outputs": [ 936 | { 937 | "name": "stdout", 938 | "output_type": "stream", 939 | "text": [ 940 | "input: \n", 941 | "tensor([[ 1.7690, 0.2864, 0.7925, 2.2849, 1.5226],\n", 942 | " [ 0.1877, 0.1367, -0.2833, 2.0905, 0.0454],\n", 943 | " [ 0.7825, 2.2969, 1.2144, 0.2526, 2.5709],\n", 944 | " [-0.4878, 1.9587, 1.6849, 0.5284, 1.9027],\n", 945 | " [ 0.5384, 1.1787, 0.4961, -1.6326, 1.4192]])\n", 946 | "output: \n", 947 | "tensor([[0.0000, 1.1865],\n", 948 | " [1.5208, 0.0000],\n", 949 | " [0.0000, 1.1601],\n", 950 | " [0.0000, 0.0000],\n", 951 | " [0.7246, 0.0000]], grad_fn=)\n" 952 | ] 953 | } 954 | ], 955 | "source": [ 956 | "mlp_layer = nn.Sequential(\n", 957 | " nn.Linear(5, 2),\n", 958 | " nn.BatchNorm1d(2),\n", 959 | " nn.ReLU()\n", 960 | ")\n", 961 | "\n", 962 | "test_example = torch.randn(5,5) + 1\n", 963 | "print(\"input: \")\n", 964 | "print(test_example)\n", 965 | "print(\"output: \")\n", 966 | "print(mlp_layer(test_example))" 967 | ] 968 | }, 969 | { 970 | "cell_type": "markdown", 971 | "metadata": { 972 | "colab_type": "text", 973 | "id": "SToQiSv5K5Yb" 974 | }, 975 | "source": [ 976 | "# Optimization\n", 977 | "\n", 978 | "One of the most important aspects of essentially any machine learning framework is its automatic differentiation library. " 979 | ] 980 | }, 981 | { 982 | "cell_type": "markdown", 983 | "metadata": { 984 | "colab_type": "text", 985 | "id": "r4GZFCZ0QqI1" 986 | }, 987 | "source": [ 988 | "## Optimizers\n", 989 | "\n", 990 | "To create an optimizer in PyTorch, you'll need to use the `torch.optim` module, often imported as `optim`. 
[`optim.Adam`](https://pytorch.org/docs/stable/optim.html#torch.optim.Adam) corresponds to the Adam optimizer. To create an optimizer object, you'll need to pass it the parameters to be optimized and the learning rate, `lr`, as well as any other parameters specific to the optimizer.\n", 991 | "\n", 992 | "For all `nn` objects, you can access their parameters as a list using their `parameters()` method, as follows:" 993 | ] 994 | }, 995 | { 996 | "cell_type": "code", 997 | "execution_count": 24, 998 | "metadata": { 999 | "colab": {}, 1000 | "colab_type": "code", 1001 | "id": "AIcCbs35K4wY" 1002 | }, 1003 | "outputs": [], 1004 | "source": [ 1005 | "import torch.optim as optim\n", 1006 | "adam_opt = optim.Adam(mlp_layer.parameters(), lr=1e-1)" 1007 | ] 1008 | }, 1009 | { 1010 | "cell_type": "markdown", 1011 | "metadata": { 1012 | "colab_type": "text", 1013 | "id": "-BsPFZu2M0Xx" 1014 | }, 1015 | "source": [ 1016 | "## Training Loop\n", 1017 | "\n", 1018 | "A (basic) training step in PyTorch consists of four basic parts:\n", 1019 | "\n", 1020 | "\n", 1021 | "1. Set all of the gradients to zero using `opt.zero_grad()`\n", 1022 | "2. Calculate the loss, `loss`\n", 1023 | "3. Calculate the gradients with respect to the loss using `loss.backward()`\n", 1024 | "4. 
Update the parameters being optimized using `opt.step()`\n", 1025 | "\n", 1026 | "That might look like the following code (and you'll notice that if you run it several times, the loss goes down):\n" 1027 | ] 1028 | }, 1029 | { 1030 | "cell_type": "code", 1031 | "execution_count": 25, 1032 | "metadata": { 1033 | "colab": { 1034 | "base_uri": "https://localhost:8080/", 1035 | "height": 34 1036 | }, 1037 | "colab_type": "code", 1038 | "id": "zm6lPx4sOJht", 1039 | "outputId": "c21672bd-a306-42ab-face-9a299511a059" 1040 | }, 1041 | "outputs": [ 1042 | { 1043 | "name": "stdout", 1044 | "output_type": "stream", 1045 | "text": [ 1046 | "tensor(0.7719, grad_fn=)\n" 1047 | ] 1048 | } 1049 | ], 1050 | "source": [ 1051 | "train_example = torch.randn(100,5) + 1\n", 1052 | "adam_opt.zero_grad()\n", 1053 | "\n", 1054 | "# We'll use a simple loss function of mean distance from 1\n", 1055 | "# torch.abs takes the absolute value of a tensor\n", 1056 | "cur_loss = torch.abs(1 - mlp_layer(train_example)).mean()\n", 1057 | "\n", 1058 | "cur_loss.backward()\n", 1059 | "adam_opt.step()\n", 1060 | "print(cur_loss)" 1061 | ] 1062 | }, 1063 | { 1064 | "cell_type": "markdown", 1065 | "metadata": { 1066 | "colab_type": "text", 1067 | "id": "wDjhZBCeTc6o" 1068 | }, 1069 | "source": [ 1070 | "## `requires_grad_()`\n", 1071 | "\n", 1072 | "You can also tell PyTorch that it needs to calculate the gradient with respect to a tensor that you created by saying `example_tensor.requires_grad_()`, which will change it in-place. This means that even if PyTorch wouldn't normally store a grad for that particular tensor, it will for that specified tensor. " 1073 | ] 1074 | }, 1075 | { 1076 | "cell_type": "markdown", 1077 | "metadata": { 1078 | "colab_type": "text", 1079 | "id": "mB22ovHyUEvH" 1080 | }, 1081 | "source": [ 1082 | "## `with torch.no_grad():`\n", 1083 | "\n", 1084 | "PyTorch will usually calculate the gradients as it proceeds through a set of operations on tensors. 
This can consume unnecessary computation and memory, especially if you're only performing an evaluation. However, you can wrap a piece of code in a `with torch.no_grad():` block to prevent gradients from being calculated inside it. " 1085 | ] 1086 | }, 1087 | { 1088 | "cell_type": "markdown", 1089 | "metadata": { 1090 | "colab_type": "text", 1091 | "id": "kowb1M425CE_" 1092 | }, 1093 | "source": [ 1094 | "\n", 1095 | "## `detach():`\n", 1096 | "\n", 1097 | "Sometimes, you want to calculate and use a tensor's value without calculating its gradients. For example, if you have two models, A and B, and you want to directly optimize the parameters of A with respect to the output of B, without calculating the gradients through B, then you could feed the detached output of B to A. There are many reasons you might want to do this, including efficiency or cyclical dependencies (i.e. A depends on B depends on A)." 1098 | ] 1099 | }, 1100 | { 1101 | "cell_type": "markdown", 1102 | "metadata": { 1103 | "colab_type": "text", 1104 | "id": "-9HY2wgKLOr-" 1105 | }, 1106 | "source": [ 1107 | "# New `nn` Classes\n", 1108 | "\n", 1109 | "You can also create new classes which extend the `nn` module. For these classes, all class attributes, as in `self.layer` or `self.param`, will automatically be treated as parameters if they are themselves `nn` objects or if they are tensors wrapped in `nn.Parameter` that are initialized with the class. \n", 1110 | "\n", 1111 | "The `__init__` function defines what will happen when the object is created. The first line of the init function of a class, for example, `WellNamedClass`, needs to be `super(WellNamedClass, self).__init__()`. \n", 1112 | "\n", 1113 | "The `forward` function defines what runs if you create that object `model` and pass it a tensor `x`, as in `model(x)`. 
If you choose the function signature `(self, x)`, then each call of the forward function gets two pieces of information: `self`, which is a reference to the object with which you can access all of its parameters, and `x`, which is the current tensor for which you'd like to return `y`.\n", 1114 | "\n", 1115 | "One class might look like the following:" 1116 | ] 1117 | }, 1118 | { 1119 | "cell_type": "code", 1120 | "execution_count": 26, 1121 | "metadata": { 1122 | "colab": {}, 1123 | "colab_type": "code", 1124 | "id": "WOip473tQs-d" 1125 | }, 1126 | "outputs": [], 1127 | "source": [ 1128 | "class ExampleModule(nn.Module):\n", 1129 | " def __init__(self, input_dims, output_dims):\n", 1130 | " super(ExampleModule, self).__init__()\n", 1131 | " self.linear = nn.Linear(input_dims, output_dims)\n", 1132 | " self.exponent = nn.Parameter(torch.tensor(1.))\n", 1133 | "\n", 1134 | " def forward(self, x):\n", 1135 | " x = self.linear(x)\n", 1136 | "\n", 1137 | " # This is the notation for element-wise exponentiation, \n", 1138 | " # which matches Python in general\n", 1139 | " x = x ** self.exponent \n", 1140 | " \n", 1141 | " return x" 1142 | ] 1143 | }, 1144 | { 1145 | "cell_type": "markdown", 1146 | "metadata": { 1147 | "colab_type": "text", 1148 | "id": "x4CUFH_GS5UY" 1149 | }, 1150 | "source": [ 1151 | "And you can view its parameters as follows:" 1152 | ] 1153 | }, 1154 | { 1155 | "cell_type": "code", 1156 | "execution_count": 27, 1157 | "metadata": { 1158 | "colab": { 1159 | "base_uri": "https://localhost:8080/", 1160 | "height": 136 1161 | }, 1162 | "colab_type": "code", 1163 | "id": "YuelIiE4S3KR", 1164 | "outputId": "27a52620-ca40-4dc8-dff5-4f3a56ba0e5b" 1165 | }, 1166 | "outputs": [ 1167 | { 1168 | "data": { 1169 | "text/plain": [ 1170 | "[Parameter containing:\n", 1171 | " tensor(1., requires_grad=True),\n", 1172 | " Parameter containing:\n", 1173 | " tensor([[ 0.2789, 0.2618, -0.0678, 0.2766, 0.1436, 0.0917, -0.1669, -0.1887,\n", 1174 | " 0.0913, -0.1998],\n", 
1175 | " [-0.1757, 0.0361, 0.1140, 0.2152, -0.1200, 0.1712, 0.0944, -0.0447,\n", 1176 | " 0.1548, 0.2383]], requires_grad=True),\n", 1177 | " Parameter containing:\n", 1178 | " tensor([ 0.1881, -0.0834], requires_grad=True)]" 1179 | ] 1180 | }, 1181 | "execution_count": 27, 1182 | "metadata": {}, 1183 | "output_type": "execute_result" 1184 | } 1185 | ], 1186 | "source": [ 1187 | "example_model = ExampleModule(10, 2)\n", 1188 | "list(example_model.parameters())" 1189 | ] 1190 | }, 1191 | { 1192 | "cell_type": "markdown", 1193 | "metadata": { 1194 | "colab_type": "text", 1195 | "id": "1F7E1wKN5tez" 1196 | }, 1197 | "source": [ 1198 | "And you can print out their names too, as follows:" 1199 | ] 1200 | }, 1201 | { 1202 | "cell_type": "code", 1203 | "execution_count": 28, 1204 | "metadata": { 1205 | "colab": { 1206 | "base_uri": "https://localhost:8080/", 1207 | "height": 153 1208 | }, 1209 | "colab_type": "code", 1210 | "id": "dYTuTDsQ5pnY", 1211 | "outputId": "6635a493-7318-4688-bd18-bfba41d43e9d" 1212 | }, 1213 | "outputs": [ 1214 | { 1215 | "data": { 1216 | "text/plain": [ 1217 | "[('exponent',\n", 1218 | " Parameter containing:\n", 1219 | " tensor(1., requires_grad=True)),\n", 1220 | " ('linear.weight',\n", 1221 | " Parameter containing:\n", 1222 | " tensor([[ 0.2789, 0.2618, -0.0678, 0.2766, 0.1436, 0.0917, -0.1669, -0.1887,\n", 1223 | " 0.0913, -0.1998],\n", 1224 | " [-0.1757, 0.0361, 0.1140, 0.2152, -0.1200, 0.1712, 0.0944, -0.0447,\n", 1225 | " 0.1548, 0.2383]], requires_grad=True)),\n", 1226 | " ('linear.bias',\n", 1227 | " Parameter containing:\n", 1228 | " tensor([ 0.1881, -0.0834], requires_grad=True))]" 1229 | ] 1230 | }, 1231 | "execution_count": 28, 1232 | "metadata": {}, 1233 | "output_type": "execute_result" 1234 | } 1235 | ], 1236 | "source": [ 1237 | "list(example_model.named_parameters())" 1238 | ] 1239 | }, 1240 | { 1241 | "cell_type": "markdown", 1242 | "metadata": { 1243 | "colab_type": "text", 1244 | "id": "iWPoIqX2UsaH" 1245 | }, 1246 | 
"source": [ 1247 | "And here's an example of the class in action:" 1248 | ] 1249 | }, 1250 | { 1251 | "cell_type": "code", 1252 | "execution_count": 29, 1253 | "metadata": { 1254 | "colab": { 1255 | "base_uri": "https://localhost:8080/", 1256 | "height": 51 1257 | }, 1258 | "colab_type": "code", 1259 | "id": "7NXwbg5tUroC", 1260 | "outputId": "0836e447-7c37-464e-b196-048ae0a0cc73" 1261 | }, 1262 | "outputs": [ 1263 | { 1264 | "data": { 1265 | "text/plain": [ 1266 | "tensor([[-0.0567, 0.4562],\n", 1267 | " [ 0.3780, 0.3452]], grad_fn=)" 1268 | ] 1269 | }, 1270 | "execution_count": 29, 1271 | "metadata": {}, 1272 | "output_type": "execute_result" 1273 | } 1274 | ], 1275 | "source": [ 1276 | "input = torch.randn(2, 10)\n", 1277 | "example_model(input)" 1278 | ] 1279 | }, 1280 | { 1281 | "cell_type": "markdown", 1282 | "metadata": { 1283 | "colab_type": "text", 1284 | "id": "6Ocol8DABScy" 1285 | }, 1286 | "source": [ 1287 | "# 2D Operations\n", 1288 | "\n", 1289 | "You won't need these for the first lesson, and the theory behind each of these will be reviewed more in later lectures, but here is a quick reference: \n", 1290 | "\n", 1291 | "\n", 1292 | "* 2D convolutions: [`nn.Conv2d`](https://pytorch.org/docs/master/generated/torch.nn.Conv2d.html) requires the number of input and output channels, as well as the kernel size.\n", 1293 | "* 2D transposed convolutions (aka deconvolutions): [`nn.ConvTranspose2d`](https://pytorch.org/docs/master/generated/torch.nn.ConvTranspose2d.html) also requires the number of input and output channels, as well as the kernel size\n", 1294 | "* 2D batch normalization: [`nn.BatchNorm2d`](https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm2d.html) requires the number of input dimensions\n", 1295 | "* Resizing images: [`nn.Upsample`](https://pytorch.org/docs/master/generated/torch.nn.Upsample.html) requires the final size or a scale factor. 
Alternatively, [`nn.functional.interpolate`](https://pytorch.org/docs/stable/nn.functional.html#torch.nn.functional.interpolate) takes the same arguments. \n", 1296 | "\n", 1297 | "\n", 1298 | "\n" 1299 | ] 1300 | } 1301 | ], 1302 | "metadata": { 1303 | "colab": { 1304 | "collapsed_sections": [], 1305 | "name": "Intro to PyTorch.ipynb", 1306 | "provenance": [], 1307 | "toc_visible": true 1308 | }, 1309 | "kernelspec": { 1310 | "display_name": "Python 3", 1311 | "language": "python", 1312 | "name": "python3" 1313 | }, 1314 | "language_info": { 1315 | "codemirror_mode": { 1316 | "name": "ipython", 1317 | "version": 3 1318 | }, 1319 | "file_extension": ".py", 1320 | "mimetype": "text/x-python", 1321 | "name": "python", 1322 | "nbconvert_exporter": "python", 1323 | "pygments_lexer": "ipython3", 1324 | "version": "3.7.6" 1325 | } 1326 | }, 1327 | "nbformat": 4, 1328 | "nbformat_minor": 1 1329 | } 1330 | -------------------------------------------------------------------------------- /Pytorch/Introduction/Linear Regression with Pytorch.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "#dependencies\n", 10 | "import pandas as pd\n", 11 | "import numpy as np\n", 12 | "import torch\n", 13 | "import torch.nn as nn\n", 14 | "import torch.nn.functional as f\n", 15 | "from torch.utils.data import DataLoader,TensorDataset\n", 16 | "from torch.optim import Adam" 17 | ] 18 | }, 19 | { 20 | "cell_type": "code", 21 | "execution_count": 2, 22 | "metadata": {}, 23 | "outputs": [], 24 | "source": [ 25 | "path='D:/Data Science/Machine Learning/Machine_Learning/Machine-Learning-A-Z-New/Machine Learning A-Z New\\Part 2 - Regression/Section 5 - Multiple Linear Regression/'\n", 26 | "df=pd.read_csv(path+'50_Startups.csv')" 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": 3, 32 | "metadata": {}, 
33 | "outputs": [ 34 | { 35 | "data": { 36 | "text/html": [ 37 | "
\n", 38 | "\n", 51 | "\n", 52 | " \n", 53 | " \n", 54 | " \n", 55 | " \n", 56 | " \n", 57 | " \n", 58 | " \n", 59 | " \n", 60 | " \n", 61 | " \n", 62 | " \n", 63 | " \n", 64 | " \n", 65 | " \n", 66 | " \n", 67 | " \n", 68 | " \n", 69 | " \n", 70 | " \n", 71 | " \n", 72 | " \n", 73 | " \n", 74 | " \n", 75 | " \n", 76 | " \n", 77 | " \n", 78 | " \n", 79 | " \n", 80 | " \n", 81 | " \n", 82 | " \n", 83 | " \n", 84 | " \n", 85 | " \n", 86 | " \n", 87 | " \n", 88 | " \n", 89 | " \n", 90 | " \n", 91 | " \n", 92 | " \n", 93 | " \n", 94 | " \n", 95 | " \n", 96 | " \n", 97 | " \n", 98 | " \n", 99 | " \n", 100 | " \n", 101 | " \n", 102 | " \n", 103 | " \n", 104 | "
R&D SpendAdministrationMarketing SpendStateProfit
0165349.20136897.80471784.10New York192261.83
1162597.70151377.59443898.53California191792.06
2153441.51101145.55407934.54Florida191050.39
3144372.41118671.85383199.62New York182901.99
4142107.3491391.77366168.42Florida166187.94
\n", 105 | "
" 106 | ], 107 | "text/plain": [ 108 | " R&D Spend Administration Marketing Spend State Profit\n", 109 | "0 165349.20 136897.80 471784.10 New York 192261.83\n", 110 | "1 162597.70 151377.59 443898.53 California 191792.06\n", 111 | "2 153441.51 101145.55 407934.54 Florida 191050.39\n", 112 | "3 144372.41 118671.85 383199.62 New York 182901.99\n", 113 | "4 142107.34 91391.77 366168.42 Florida 166187.94" 114 | ] 115 | }, 116 | "execution_count": 3, 117 | "metadata": {}, 118 | "output_type": "execute_result" 119 | } 120 | ], 121 | "source": [ 122 | "df.head()" 123 | ] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "execution_count": 4, 128 | "metadata": {}, 129 | "outputs": [], 130 | "source": [ 131 | "x=df.iloc[:,:-1].values\n", 132 | "y=df.iloc[:,-1].values" 133 | ] 134 | }, 135 | { 136 | "cell_type": "code", 137 | "execution_count": 5, 138 | "metadata": {}, 139 | "outputs": [], 140 | "source": [ 141 | "from sklearn.preprocessing import LabelEncoder\n", 142 | "le=LabelEncoder()\n", 143 | "x[:,3]=le.fit_transform(x[:,3])" 144 | ] 145 | }, 146 | { 147 | "cell_type": "code", 148 | "execution_count": 6, 149 | "metadata": {}, 150 | "outputs": [ 151 | { 152 | "data": { 153 | "text/plain": [ 154 | "array([[165349.2, 136897.8, 471784.1, 2],\n", 155 | " [162597.7, 151377.59, 443898.53, 0],\n", 156 | " [153441.51, 101145.55, 407934.54, 1],\n", 157 | " [144372.41, 118671.85, 383199.62, 2]], dtype=object)" 158 | ] 159 | }, 160 | "execution_count": 6, 161 | "metadata": {}, 162 | "output_type": "execute_result" 163 | } 164 | ], 165 | "source": [ 166 | "x[:4]" 167 | ] 168 | }, 169 | { 170 | "cell_type": "code", 171 | "execution_count": 7, 172 | "metadata": {}, 173 | "outputs": [ 174 | { 175 | "data": { 176 | "text/plain": [ 177 | "(50, 4)" 178 | ] 179 | }, 180 | "execution_count": 7, 181 | "metadata": {}, 182 | "output_type": "execute_result" 183 | } 184 | ], 185 | "source": [ 186 | "x.shape" 187 | ] 188 | }, 189 | { 190 | "cell_type": "code", 191 | "execution_count": 8, 192 | 
"metadata": {}, 193 | "outputs": [ 194 | { 195 | "data": { 196 | "text/plain": [ 197 | "array([192261.83, 191792.06, 191050.39, 182901.99, 166187.94, 156991.12,\n", 198 | " 156122.51, 155752.6 , 152211.77, 149759.96, 146121.95, 144259.4 ,\n", 199 | " 141585.52, 134307.35, 132602.65, 129917.04, 126992.93, 125370.37,\n", 200 | " 124266.9 , 122776.86, 118474.03, 111313.02, 110352.25, 108733.99,\n", 201 | " 108552.04, 107404.34, 105733.54, 105008.31, 103282.38, 101004.64,\n", 202 | " 99937.59, 97483.56, 97427.84, 96778.92, 96712.8 , 96479.51,\n", 203 | " 90708.19, 89949.14, 81229.06, 81005.76, 78239.91, 77798.83,\n", 204 | " 71498.49, 69758.98, 65200.33, 64926.08, 49490.75, 42559.73,\n", 205 | " 35673.41, 14681.4 ])" 206 | ] 207 | }, 208 | "execution_count": 8, 209 | "metadata": {}, 210 | "output_type": "execute_result" 211 | } 212 | ], 213 | "source": [ 214 | "y=y.astype(dtype='float64')\n", 215 | "y" 216 | ] 217 | }, 218 | { 219 | "cell_type": "code", 220 | "execution_count": 9, 221 | "metadata": {}, 222 | "outputs": [ 223 | { 224 | "data": { 225 | "text/plain": [ 226 | "array([[1.6534920e+05, 1.3689780e+05, 4.7178410e+05, 2.0000000e+00],\n", 227 | " [1.6259770e+05, 1.5137759e+05, 4.4389853e+05, 0.0000000e+00]])" 228 | ] 229 | }, 230 | "execution_count": 9, 231 | "metadata": {}, 232 | "output_type": "execute_result" 233 | } 234 | ], 235 | "source": [ 236 | "x=x.astype(dtype='float64') #we had to convert the array to float because one col was int because it was encoded\n", 237 | "x[:2] #so we casted entire array as float array because tensors doesn't entertains heterogenous data types" 238 | ] 239 | }, 240 | { 241 | "cell_type": "code", 242 | "execution_count": 10, 243 | "metadata": {}, 244 | "outputs": [ 245 | { 246 | "name": "stdout", 247 | "output_type": "stream", 248 | "text": [ 249 | "tensor([[1.6535e+05, 1.3690e+05, 4.7178e+05, 2.0000e+00],\n", 250 | " [1.6260e+05, 1.5138e+05, 4.4390e+05, 0.0000e+00]], dtype=torch.float64)\n", 251 | "tensor([192261.8300, 
191792.0600], dtype=torch.float64)\n" 252 | ] 253 | } 254 | ], 255 | "source": [ 256 | "#two ways to convert arrays to tensors\n", 257 | "x=torch.tensor(x) #first \n", 258 | "y=torch.from_numpy(y) #second\n", 259 | "print(x[:2])\n", 260 | "print(y[:2])" 261 | ] 262 | }, 263 | { 264 | "cell_type": "code", 265 | "execution_count": 11, 266 | "metadata": {}, 267 | "outputs": [ 268 | { 269 | "name": "stdout", 270 | "output_type": "stream", 271 | "text": [ 272 | "Parameter containing:\n", 273 | "tensor([[-0.3879, -0.1967, -0.2104, 0.2840]], requires_grad=True)\n", 274 | "Parameter containing:\n", 275 | "tensor([0.3466], requires_grad=True)\n" 276 | ] 277 | } 278 | ], 279 | "source": [ 280 | "#defining the model\n", 281 | "model=nn.Linear(x.shape[1],1)\n", 282 | "print(model.weight)\n", 283 | "print(model.bias)" 284 | ] 285 | }, 286 | { 287 | "cell_type": "code", 288 | "execution_count": 12, 289 | "metadata": {}, 290 | "outputs": [ 291 | { 292 | "name": "stdout", 293 | "output_type": "stream", 294 | "text": [ 295 | "\n", 296 | "[Parameter containing:\n", 297 | "tensor([[-0.3879, -0.1967, -0.2104, 0.2840]], requires_grad=True), Parameter containing:\n", 298 | "tensor([0.3466], requires_grad=True)]\n" 299 | ] 300 | } 301 | ], 302 | "source": [ 303 | "#to view all the model parameters\n", 304 | "print(model.parameters)\n", 305 | "print(list(model.parameters()))" 306 | ] 307 | }, 308 | { 309 | "cell_type": "code", 310 | "execution_count": 13, 311 | "metadata": {}, 312 | "outputs": [ 313 | { 314 | "name": "stdout", 315 | "output_type": "stream", 316 | "text": [ 317 | "tensor([[-190310.0156],\n", 318 | " [-186225.6406],\n", 319 | " [-165226.7500],\n", 320 | " [-159953.2656],\n", 321 | " [-150125.6875],\n", 322 | " [-147118.6719],\n", 323 | " [-108036.2109],\n", 324 | " [-147298.6406],\n", 325 | " [-141562.0312],\n", 326 | " [-133373.9531],\n", 327 | " [-109491.5078],\n", 328 | " [-109641.4922],\n", 329 | " [-114009.9375],\n", 330 | " [-115486.8984],\n", 331 | " [-131278.5938],\n", 
332 | " [-123608.6484],\n", 333 | " [-109788.2656],\n", 334 | " [-124696.8984],\n", 335 | " [-120087.3203],\n", 336 | " [ -63717.2852],\n", 337 | " [-114804.6797],\n", 338 | " [-123708.4453],\n", 339 | " [-116661.1562],\n", 340 | " [-111109.2266],\n", 341 | " [ -78984.4219],\n", 342 | " [ -81556.2734],\n", 343 | " [ -85770.6797],\n", 344 | " [-127418.5234],\n", 345 | " [ -86402.7188],\n", 346 | " [ -78087.7578],\n", 347 | " [ -65964.6016],\n", 348 | " [ -72309.2500],\n", 349 | " [ -59708.4141],\n", 350 | " [ -86948.8672],\n", 351 | " [ -93372.9141],\n", 352 | " [ -77810.7812],\n", 353 | " [ -78421.8281],\n", 354 | " [ -68629.5781],\n", 355 | " [ -59792.2656],\n", 356 | " [ -68093.3047],\n", 357 | " [ -70823.1953],\n", 358 | " [ -62081.4375],\n", 359 | " [ -59226.0000],\n", 360 | " [ -38547.0000],\n", 361 | " [ -45015.6719],\n", 362 | " [ -25210.9160],\n", 363 | " [ -85795.9453],\n", 364 | " [ -26640.8066],\n", 365 | " [ -10388.2256],\n", 366 | " [ -32515.5879]], grad_fn=)\n" 367 | ] 368 | } 369 | ], 370 | "source": [ 371 | "y_pred=model(x.float())\n", 372 | "print(y_pred)" 373 | ] 374 | }, 375 | { 376 | "cell_type": "code", 377 | "execution_count": 14, 378 | "metadata": {}, 379 | "outputs": [ 380 | { 381 | "name": "stdout", 382 | "output_type": "stream", 383 | "text": [ 384 | "tensor(4.9871e+10, dtype=torch.float64, grad_fn=)\n" 385 | ] 386 | } 387 | ], 388 | "source": [ 389 | "#loss function\n", 390 | "mse=f.mse_loss\n", 391 | "loss=mse(y_pred.squeeze(1),y)\n", 392 | "print(loss.double())" 393 | ] 394 | }, 395 | { 396 | "cell_type": "code", 397 | "execution_count": 15, 398 | "metadata": {}, 399 | "outputs": [], 400 | "source": [ 401 | "opt=Adam(model.parameters(),lr=1e-2)" 402 | ] 403 | }, 404 | { 405 | "cell_type": "code", 406 | "execution_count": 16, 407 | "metadata": {}, 408 | "outputs": [ 409 | { 410 | "data": { 411 | "text/plain": [ 412 | "(tensor([[1.6535e+05, 1.3690e+05, 4.7178e+05, 2.0000e+00],\n", 413 | " [1.6260e+05, 1.5138e+05, 4.4390e+05, 0.0000e+00]], 
dtype=torch.float64),\n", 414 | " tensor([192261.8300, 191792.0600], dtype=torch.float64))" 415 | ] 416 | }, 417 | "execution_count": 16, 418 | "metadata": {}, 419 | "output_type": "execute_result" 420 | } 421 | ], 422 | "source": [ 423 | "trainset=TensorDataset(x,y)\n", 424 | "trainset[:2]" 425 | ] 426 | }, 427 | { 428 | "cell_type": "code", 429 | "execution_count": 17, 430 | "metadata": {}, 431 | "outputs": [], 432 | "source": [ 433 | "training_data=DataLoader(trainset,batch_size=5,shuffle=True)" 434 | ] 435 | }, 436 | { 437 | "cell_type": "code", 438 | "execution_count": 18, 439 | "metadata": {}, 440 | "outputs": [ 441 | { 442 | "name": "stdout", 443 | "output_type": "stream", 444 | "text": [ 445 | "tensor([[6.4665e+04, 1.3955e+05, 1.3796e+05, 0.0000e+00],\n", 446 | " [0.0000e+00, 1.3543e+05, 0.0000e+00, 0.0000e+00],\n", 447 | " [6.1994e+04, 1.1564e+05, 9.1131e+04, 1.0000e+00],\n", 448 | " [2.2178e+04, 1.5481e+05, 2.8335e+04, 0.0000e+00],\n", 449 | " [2.7893e+04, 8.4711e+04, 1.6447e+05, 1.0000e+00]], dtype=torch.float64)\n", 450 | "tensor([107404.3400, 42559.7300, 99937.5900, 65200.3300, 77798.8300],\n", 451 | " dtype=torch.float64)\n" 452 | ] 453 | } 454 | ], 455 | "source": [ 456 | "for a,b in training_data:\n", 457 | " print(a)\n", 458 | " print(b)\n", 459 | " break" 460 | ] 461 | }, 462 | { 463 | "cell_type": "code", 464 | "execution_count": null, 465 | "metadata": {}, 466 | "outputs": [], 467 | "source": [ 468 | "epochs=100000\n", 469 | "for epoch in range(epochs):\n", 470 | " for inputs,targets in training_data:\n", 471 | " pred=model(inputs.float())\n", 472 | " pred=pred.squeeze(1)\n", 473 | " #print(pred.shape)\n", 474 | " #print(inputs.dtype)\n", 475 | " #print(targets.dtype)\n", 476 | " #print(y_pred.dtype)\n", 477 | " pred=pred.double()\n", 478 | " loss=mse(pred,targets)\n", 479 | " loss.backward() #computing gradients\n", 480 | " opt.step() #updating parameters\n", 481 | " opt.zero_grad() #setting the grads back to zero\n", 482 | " if 
epoch%1000==0:\n", 483 | " #print(f'Epoch:{epoch}/{epochs} | Loss:{loss.item()}')\n", 484 | " pass" 485 | ] 486 | }, 487 | { 488 | "cell_type": "code", 489 | "execution_count": null, 490 | "metadata": {}, 491 | "outputs": [], 492 | "source": [] 493 | }, 494 | { 495 | "cell_type": "code", 496 | "execution_count": null, 497 | "metadata": {}, 498 | "outputs": [], 499 | "source": [] 500 | }, 501 | { 502 | "cell_type": "code", 503 | "execution_count": null, 504 | "metadata": {}, 505 | "outputs": [], 506 | "source": [] 507 | }, 508 | { 509 | "cell_type": "code", 510 | "execution_count": null, 511 | "metadata": {}, 512 | "outputs": [], 513 | "source": [] 514 | }, 515 | { 516 | "cell_type": "code", 517 | "execution_count": null, 518 | "metadata": {}, 519 | "outputs": [], 520 | "source": [] 521 | } 522 | ], 523 | "metadata": { 524 | "kernelspec": { 525 | "display_name": "Python 3", 526 | "language": "python", 527 | "name": "python3" 528 | }, 529 | "language_info": { 530 | "codemirror_mode": { 531 | "name": "ipython", 532 | "version": 3 533 | }, 534 | "file_extension": ".py", 535 | "mimetype": "text/x-python", 536 | "name": "python", 537 | "nbconvert_exporter": "python", 538 | "pygments_lexer": "ipython3", 539 | "version": "3.7.4" 540 | } 541 | }, 542 | "nbformat": 4, 543 | "nbformat_minor": 2 544 | } 545 | -------------------------------------------------------------------------------- /Pytorch/OOPS Concept/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/OOPS Concept/OOPS.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "nbformat": 4, 3 | "nbformat_minor": 0, 4 | "metadata": { 5 | "colab": { 6 | "name": "OOPS", 7 | "provenance": [], 8 | "collapsed_sections": [] 9 | }, 10 | "kernelspec": { 11 | "name": "python3", 12 | "display_name": "Python 3" 13 | } 14 | }, 15 | "cells": [ 16 | { 17 | 
"cell_type": "markdown", 18 | "metadata": { 19 | "id": "24MNMFjn3iPc", 20 | "colab_type": "text" 21 | }, 22 | "source": [ 23 | "**Demonstrating Concepts used in OOPS**" 24 | ] 25 | }, 26 | { 27 | "cell_type": "code", 28 | "metadata": { 29 | "id": "DUP0FeX14OMb", 30 | "colab_type": "code", 31 | "colab": {} 32 | }, 33 | "source": [ 34 | "#Classes and Objects\n", 35 | "\n", 36 | "#Object is the state or the behaviour of an entity. \n", 37 | "#Class is the collection of objects. \n", 38 | "#Object is an instance of a class" 39 | ], 40 | "execution_count": null, 41 | "outputs": [] 42 | }, 43 | { 44 | "cell_type": "code", 45 | "metadata": { 46 | "id": "c_QBWfTe1orb", 47 | "colab_type": "code", 48 | "colab": {} 49 | }, 50 | "source": [ 51 | "#can be used to keep a class empty for that specific instance of time.\n", 52 | "class a:\n", 53 | " pass \n" 54 | ], 55 | "execution_count": 1, 56 | "outputs": [] 57 | }, 58 | { 59 | "cell_type": "code", 60 | "metadata": { 61 | "id": "Dl5JwYza2VOB", 62 | "colab_type": "code", 63 | "colab": { 64 | "base_uri": "https://localhost:8080/", 65 | "height": 67 66 | }, 67 | "outputId": "01e0fef5-bc91-46ff-b74b-a5115fbf3f2e" 68 | }, 69 | "source": [ 70 | "#Why Class?\n", 71 | "class details:\n", 72 | " pass\n", 73 | "d1=details()\n", 74 | "print(d1)\n", 75 | "d1.name='Bhavishya'\n", 76 | "d1.surname='Pandit'\n", 77 | "print(d1.name)\n", 78 | "print(d1.surname)\n", 79 | "# if we do this in this a-fashion then things become very turdy" 80 | ], 81 | "execution_count": null, 82 | "outputs": [ 83 | { 84 | "output_type": "stream", 85 | "text": [ 86 | "<__main__.details object at 0x7f58919df390>\n", 87 | "Bhavishya\n", 88 | "Pandit\n" 89 | ], 90 | "name": "stdout" 91 | } 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "metadata": { 97 | "id": "WCzXKCI32VKv", 98 | "colab_type": "code", 99 | "colab": { 100 | "base_uri": "https://localhost:8080/", 101 | "height": 84 102 | }, 103 | "outputId": "a4b6a119-aadd-45e4-a798-ea011db1f3a7" 104 | }, 105 | 
"source": [ 106 | "# So will choose it to do like this\n", 107 | "class classname:\n", 108 | " def __init__(self,name,surname,pay): #you can relate it to be more like a constructor\n", 109 | " self.name=name\n", 110 | " self.surname=surname\n", 111 | " self.pay=pay\n", 112 | " print('Your name is:',name+surname)\n", 113 | "\n", 114 | " def greetings(self): #method or a member function\n", 115 | " print('Have a great day {}{} !'.format(self.name,self.surname))\n", 116 | "\n", 117 | "c=classname('Iron',' Man',50000)\n", 118 | "c1=classname('Bhavishya',' Pandit',100000)\n", 119 | "print(c1.name)\n", 120 | "c1.greetings()\n", 121 | "#here we see the length of the code fell considerably\n", 122 | "#The self parameter is a reference to the current instance of the class, and is used to access variables that belongs to the class.\n", 123 | "#self parameter can be renamed and would still work\n", 124 | "\n", 125 | "#classname.greetings(c1) is same as c1.greetings(); here in the 2nd one you will have to create the object before that." 126 | ], 127 | "execution_count": null, 128 | "outputs": [ 129 | { 130 | "output_type": "stream", 131 | "text": [ 132 | "Your name is: Iron Man\n", 133 | "Your name is: Bhavishya Pandit\n", 134 | "Bhavishya\n", 135 | "Have a great day Bhavishya Pandit !\n" 136 | ], 137 | "name": "stdout" 138 | } 139 | ] 140 | }, 141 | { 142 | "cell_type": "code", 143 | "metadata": { 144 | "id": "r8pScKMO2VH5", 145 | "colab_type": "code", 146 | "colab": {} 147 | }, 148 | "source": [ 149 | "'''\n", 150 | "Destructors are called when an object gets destroyed. 
In Python, destructors are \n", 151 | "not needed as much as in C++ because Python has a garbage collector that handles memory management automatically.\n", 152 | "The __del__() method is known as the destructor method in Python'''" 153 | ], 154 | "execution_count": null, 155 | "outputs": [] 156 | }, 157 | { 158 | "cell_type": "code", 159 | "metadata": { 160 | "id": "VjUW0vIJ2VFh", 161 | "colab_type": "code", 162 | "colab": { 163 | "base_uri": "https://localhost:8080/", 164 | "height": 34 165 | }, 166 | "outputId": "a2d6630a-ebb4-4470-82b9-a978b0ec55d8" 167 | }, 168 | "source": [ 169 | "#dynamic arguments in functions\n", 170 | "def add(*m):\n", 171 | " s=0\n", 172 | " for i in m:\n", 173 | " s = s+i\n", 174 | " print(s)\n", 175 | "\n", 176 | "add(15,45,10,20,30,44)" 177 | ], 178 | "execution_count": 10, 179 | "outputs": [ 180 | { 181 | "output_type": "stream", 182 | "text": [ 183 | "164\n" 184 | ], 185 | "name": "stdout" 186 | } 187 | ] 188 | }, 189 | { 190 | "cell_type": "code", 191 | "metadata": { 192 | "id": "9c6IhJ1B9mmT", 193 | "colab_type": "code", 194 | "colab": { 195 | "base_uri": "https://localhost:8080/", 196 | "height": 34 197 | }, 198 | "outputId": "d8ece0ef-9cc1-4987-f4a1-c7bdcb03cf82" 199 | }, 200 | "source": [ 201 | "#method overriding\n", 202 | "'''\n", 203 | "Overriding is the property of a class to change the implementation of a method provided by one of its base classes. ... 
\n", 204 | "Method overriding is thus a part of the inheritance mechanism.\n", 205 | "In Python method overriding occurs by simply defining in the child class a method with the same name of a method in the parent class\n", 206 | "'''\n", 207 | "class parent():\n", 208 | " def __init__(self):\n", 209 | " self.value=4\n", 210 | " def get_value(self):\n", 211 | " print(self.value)\n", 212 | "class child(parent):\n", 213 | " def get_value(self):\n", 214 | " print(self.value +1)\n", 215 | "\n", 216 | "c=child()\n", 217 | "c.get_value()" 218 | ], 219 | "execution_count": 12, 220 | "outputs": [ 221 | { 222 | "output_type": "stream", 223 | "text": [ 224 | "5\n" 225 | ], 226 | "name": "stdout" 227 | } 228 | ] 229 | }, 230 | { 231 | "cell_type": "code", 232 | "metadata": { 233 | "id": "pkxlNLqc9mi9", 234 | "colab_type": "code", 235 | "colab": { 236 | "base_uri": "https://localhost:8080/", 237 | "height": 34 238 | }, 239 | "outputId": "f8a2fd54-a86a-4317-c1fa-580d03f92664" 240 | }, 241 | "source": [ 242 | "#method overloading\n", 243 | "'''\n", 244 | "Method Overloading is the class having methods that are the same name with different arguments.\n", 245 | "Arguments different will be based on a number of arguments and types of arguments.\n", 246 | "It is used in a single class. It is also used to write the code clarity as well as reduce complexity.\n", 247 | "'''\n", 248 | "class mul:\n", 249 | " def two(self,a,b):\n", 250 | " p=a*b\n", 251 | " print('Product of',a,b,end=':')\n", 252 | " print(p)\n", 253 | " def two(self,a,b,c): \n", 254 | " p=a*b*c\n", 255 | " print('Product of',a,b,c,end=': ')\n", 256 | " print(p)\n", 257 | "m=mul()\n", 258 | "m.two(2,3,4)\n", 259 | "\n", 260 | "#overloading is not possible in python bcz the function gets over written. 
And thus calling the two-argument version shows an error" 261 | ], 262 | "execution_count": 13, 263 | "outputs": [ 264 | { 265 | "output_type": "stream", 266 | "text": [ 267 | "Product of 2 3 4:24\n" 268 | ], 269 | "name": "stdout" 270 | } 271 | ] 272 | }, 273 | { 274 | "cell_type": "code", 275 | "metadata": { 276 | "id": "BGBWFim8qMHk", 277 | "colab_type": "code", 278 | "colab": {} 279 | }, 280 | "source": [ 281 | "" 282 | ], 283 | "execution_count": null, 284 | "outputs": [] 285 | }, 286 | { 287 | "cell_type": "code", 288 | "metadata": { 289 | "id": "qrFvFzaqWTBI", 290 | "colab_type": "code", 291 | "colab": {} 292 | }, 293 | "source": [ 294 | "" 295 | ], 296 | "execution_count": null, 297 | "outputs": [] 298 | }, 299 | { 300 | "cell_type": "code", 301 | "metadata": { 302 | "id": "yVuuK_FcWS-H", 303 | "colab_type": "code", 304 | "colab": {} 305 | }, 306 | "source": [ 307 | "" 308 | ], 309 | "execution_count": null, 310 | "outputs": [] 311 | } 312 | ] 313 | } -------------------------------------------------------------------------------- /Pytorch/Pretrained Models/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Pretrained Models/object_recognition_resnet101.py: -------------------------------------------------------------------------------- 1 | #object recognition using resnet101, a pretrained model 2 | from torchvision import models 3 | dir(models) 4 | alexnet=models.AlexNet() 5 | resnet=models.resnet101(pretrained=True) 6 | 7 | from torchvision import transforms 8 | preprocess = transforms.Compose([ 9 | transforms.Resize(256), 10 | transforms.CenterCrop(224), 11 | transforms.ToTensor(), 12 | transforms.Normalize( 13 | mean=[0.485, 0.456, 0.406], 14 | std=[0.229, 0.224, 0.225] 15 | )]) 16 | from PIL import Image 17 | #img=Image.open('C:/Users/91884/Pictures/IMG_20190918_193240_465.jpg') 18 | img=Image.open('C:/Users/91884/Pictures/dog.jpg') 19 | img.show() 
20 | img_t=preprocess(img) 21 | 22 | import torch 23 | batch_t=torch.unsqueeze(img_t,0) 24 | resnet.eval() 25 | out=resnet(batch_t) 26 | out 27 | 28 | with open('C:/Users/91884/Desktop/ImageNet Labels/imagenet_classes.txt') as f: 29 | labels=[line.strip() for line in f.readlines()] 30 | 31 | _,index=torch.max(out,1) 32 | 33 | percentage=torch.nn.functional.softmax(out,dim=1)[0]*100 34 | print('Label:',labels[index[0]]) 35 | print('Percentage:',percentage[index[0]].item()) -------------------------------------------------------------------------------- /Pytorch/Projects/Image Classification/Binary Class Classification/Cats and Dogs/readme.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Projects/Image Classification/Binary Class Classification/Dental Disease/readme.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Projects/Image Classification/Binary Class Classification/readme.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Projects/Image Classification/Multi Class Classification/English Alphabet Recognition/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Projects/Image Classification/Multi Class Classification/Fashion MNIST/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Projects/Image Classification/Multi Class Classification/Flower Classification/.gitkeep: 
-------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Projects/Image Classification/Multi Class Classification/MNIST/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Projects/Image Classification/Multi Class Classification/readme.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Projects/Image Classification/readme.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Projects/readme.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Tensors/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Pytorch/Tensors/Linear Regression without Sklearn.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "#dependencies\n", 10 | "import numpy as np\n", 11 | "import pandas as pd\n", 12 | "import torch\n", 13 | "import matplotlib.pyplot as plt" 14 | ] 15 | }, 16 | { 17 | "cell_type": "code", 18 | "execution_count": 2, 19 | "metadata": {}, 20 | "outputs": [], 21 | "source": [ 22 | "path='D:/Data Science/Machine Learning/Machine_Learning/Machine-Learning-A-Z-New/Machine Learning A-Z New/Part 2 - Regression/Section 4 - Simple Linear 
Regression/'\n", 23 | "df=pd.read_csv(path+'Salary_Data.csv')" 24 | ] 25 | }, 26 | { 27 | "cell_type": "code", 28 | "execution_count": 3, 29 | "metadata": {}, 30 | "outputs": [ 31 | { 32 | "data": { 33 | "text/html": [ 34 | "
<div>\n", 35 | "<style scoped>\n", 36 | "    .dataframe tbody tr th:only-of-type {\n", 37 | "        vertical-align: middle;\n", 38 | "    }\n", 39 | "\n", 40 | "    .dataframe tbody tr th {\n", 41 | "        vertical-align: top;\n", 42 | "    }\n", 43 | "\n", 44 | "    .dataframe thead th {\n", 45 | "        text-align: right;\n", 46 | "    }\n", 47 | "</style>\n", 48 | "<table border=\"1\" class=\"dataframe\">\n", 49 | "  <thead>\n", 50 | "    <tr style=\"text-align: right;\">\n", 51 | "      <th></th>\n", 52 | "      <th>YearsExperience</th>\n", 53 | "      <th>Salary</th>\n", 54 | "    </tr>\n", 55 | "  </thead>\n", 56 | "  <tbody>\n", 57 | "    <tr>\n", 58 | "      <th>0</th>\n", 59 | "      <td>1.1</td>\n", 60 | "      <td>39343.0</td>\n", 61 | "    </tr>\n", 62 | "    <tr>\n", 63 | "      <th>1</th>\n", 64 | "      <td>1.3</td>\n", 65 | "      <td>46205.0</td>\n", 66 | "    </tr>\n", 67 | "    <tr>\n", 68 | "      <th>2</th>\n", 69 | "      <td>1.5</td>\n", 70 | "      <td>37731.0</td>\n", 71 | "    </tr>\n", 72 | "    <tr>\n", 73 | "      <th>3</th>\n", 74 | "      <td>2.0</td>\n", 75 | "      <td>43525.0</td>\n", 76 | "    </tr>\n", 77 | "    <tr>\n", 78 | "      <th>4</th>\n", 79 | "      <td>2.2</td>\n", 80 | "      <td>39891.0</td>\n", 81 | "    </tr>\n", 82 | "  </tbody>\n", 83 | "</table>\n", 84 | "</div>
" 85 | ], 86 | "text/plain": [ 87 | " YearsExperience Salary\n", 88 | "0 1.1 39343.0\n", 89 | "1 1.3 46205.0\n", 90 | "2 1.5 37731.0\n", 91 | "3 2.0 43525.0\n", 92 | "4 2.2 39891.0" 93 | ] 94 | }, 95 | "execution_count": 3, 96 | "metadata": {}, 97 | "output_type": "execute_result" 98 | } 99 | ], 100 | "source": [ 101 | "df.head()" 102 | ] 103 | }, 104 | { 105 | "cell_type": "code", 106 | "execution_count": 4, 107 | "metadata": {}, 108 | "outputs": [], 109 | "source": [ 110 | "x=df.iloc[:,0].values #independent variable\n", 111 | "y=df.iloc[:,1].values #dependent variable" 112 | ] 113 | }, 114 | { 115 | "cell_type": "code", 116 | "execution_count": 5, 117 | "metadata": {}, 118 | "outputs": [], 119 | "source": [ 120 | "x,y=torch.from_numpy(x),torch.from_numpy(y)" 121 | ] 122 | }, 123 | { 124 | "cell_type": "code", 125 | "execution_count": 6, 126 | "metadata": {}, 127 | "outputs": [ 128 | { 129 | "name": "stdout", 130 | "output_type": "stream", 131 | "text": [ 132 | "tensor([ 1.1000, 1.3000, 1.5000, 2.0000, 2.2000, 2.9000, 3.0000, 3.2000,\n", 133 | " 3.2000, 3.7000, 3.9000, 4.0000, 4.0000, 4.1000, 4.5000, 4.9000,\n", 134 | " 5.1000, 5.3000, 5.9000, 6.0000, 6.8000, 7.1000, 7.9000, 8.2000,\n", 135 | " 8.7000, 9.0000, 9.5000, 9.6000, 10.3000, 10.5000],\n", 136 | " dtype=torch.float64)\n" 137 | ] 138 | } 139 | ], 140 | "source": [ 141 | "print(x)" 142 | ] 143 | }, 144 | { 145 | "cell_type": "code", 146 | "execution_count": 7, 147 | "metadata": {}, 148 | "outputs": [ 149 | { 150 | "name": "stdout", 151 | "output_type": "stream", 152 | "text": [ 153 | "tensor([ 39343., 46205., 37731., 43525., 39891., 56642., 60150., 54445.,\n", 154 | " 64445., 57189., 63218., 55794., 56957., 57081., 61111., 67938.,\n", 155 | " 66029., 83088., 81363., 93940., 91738., 98273., 101302., 113812.,\n", 156 | " 109431., 105582., 116969., 112635., 122391., 121872.],\n", 157 | " dtype=torch.float64)\n" 158 | ] 159 | } 160 | ], 161 | "source": [ 162 | "print(y)" 163 | ] 164 | }, 165 | { 166 | 
"cell_type": "code", 167 | "execution_count": 8, 168 | "metadata": {}, 169 | "outputs": [ 170 | { 171 | "data": { 172 | "text/plain": [ 173 | "30" 174 | ] 175 | }, 176 | "execution_count": 8, 177 | "metadata": {}, 178 | "output_type": "execute_result" 179 | } 180 | ], 181 | "source": [ 182 | "#total samples\n", 183 | "torch.numel(x) #to find the total number of elements" 184 | ] 185 | }, 186 | { 187 | "cell_type": "code", 188 | "execution_count": 9, 189 | "metadata": {}, 190 | "outputs": [], 191 | "source": [ 192 | "train_size=0.8*30 #80 percent of the 30 samples = 24" 193 | ] 194 | }, 195 | { 196 | "cell_type": "code", 197 | "execution_count": 10, 198 | "metadata": {}, 199 | "outputs": [], 200 | "source": [ 201 | "x_train,y_train,x_test,y_test=x[:24],y[:24],x[24:],y[24:]" 202 | ] 203 | }, 204 | { 205 | "cell_type": "code", 206 | "execution_count": 11, 207 | "metadata": {}, 208 | "outputs": [ 209 | { 210 | "name": "stdout", 211 | "output_type": "stream", 212 | "text": [ 213 | "torch.Size([24])\n", 214 | "torch.Size([24])\n", 215 | "torch.Size([6])\n", 216 | "torch.Size([6])\n" 217 | ] 218 | } 219 | ], 220 | "source": [ 221 | "#printing the shape of each\n", 222 | "print(x_train.shape)\n", 223 | "print(y_train.shape)\n", 224 | "print(x_test.shape)\n", 225 | "print(y_test.shape)" 226 | ] 227 | }, 228 | { 229 | "cell_type": "code", 230 | "execution_count": 12, 231 | "metadata": {}, 232 | "outputs": [ 233 | { 234 | "data": { 235 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAZgAAAEWCAYAAABbgYH9AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy86wFpkAAAACXBIWXMAAAsTAAALEwEAmpwYAAAlC0lEQVR4nO3dfZRV9X3v8fcnQHTU4oiihUEDqZTUhxuJE2JimyYhcUiTCLXakNVEktJy6/LmoWlIYd3e5sGmYkhranu1JZqCmqpICBKtIhfyHB8yiAmiEsaSCAOVsQgSnSrg9/6xf0fPjGeGmWH22WfOfF5rnXX2+e792/t3SJzv+T3trYjAzMxssL2m6AqYmVl9coIxM7NcOMGYmVkunGDMzCwXTjBmZpYLJxgzM8uFE4zZMCTp85JuLroeVt+cYMx6Iem3Jf1Y0j5JeyT9SNKbj/CcH5X0w26xpZL+5shq+6rrLJX0oqRfpbqvlfSGAZznF5LePZh1s+HBCcasB5JGA3cC/wiMAZqALwAvFFmvSiSN7GHXlyPiOGACsBtYWrVK2bDnBGPWs98EiIhbIuJQRHRGxL0R8bPSAZL+VNJjkvZLelTSm1J8gaQnyuK/n+K/Bfwz8NbUstgraR7wR8BnU+zb6djxkr4pqUPSNkmfKLvu5yWtkHSzpGeBj/b2RSLieeDfgLMq7Zd0oaTNqT7fTfVE0k3AacC3U90+O7B/ShuOnGDMevZz4JCkZZLeK+mE8p2SLgE+D1wKjAYuBP4r7X4C+B3geLJWz82SxkXEY8CfAfdFxHER0RgRS4BvkFobEfEBSa8Bvg38lKzlNB34lKSWsirMBFYAjal8jyQdR5bENlbY95vALcCngLHAv5MllNdGxEeAJ4EPpLp9+XD/aGYlTjBmPYiIZ4HfBgL4GtAhabWkU9Ihf0KWFH4SmbaI+GUqe3tE7IyIlyLiNmArMK0fl38zMDYivhgRL0bEf6Q6zC475r6IWJWu0dnDeT4jaS/QBhxH5ZbOB4G7ImJtRBwAvgI0AG/rR33NXqWnflszA1KL46MAaYD8ZuCrwIeAU8laKq8i6VLg08DEFDoOOKkfl34dMD4lh5IRwA/KPm/vw3m+EhF/dZhjxgO/LH2IiJckbSdrOZkNmBOMWR9FxOOSlgL/M4W2A7/R/ThJryNrbUwna2UckvQwoNKpKp2+2+ftwLaImNxblfpe+17tBM4ufZAksuTZPsjXsWHGXWRmPZD0Bkl/IWlC+nwqWcvl/nTI9WRdUOcqc3pKLseS/VHuSOU+RtfB9aeACZJe2y32+rLPDwLPSvpLSQ2SRkg660inSPdgOfA+SdMljQL+gmym3I97qJtZnzjBmPVsP/AW4AFJz5EllkfI/gATEbcDXyKbnbUfWAWMiYhHgb8D7iP743w28KOy864HNgP/KenpFLsBOCPN4loVEYeADwDnANuAp8kS2vGD/SUjYgvwYbLp2E+n634gIl5Mh1wJ/FWq22cG+/pWv+QHjpmZWR7cgjEzs1w4wZiZWS6cYMzMLBdOMGZmlguvg0lOOumkmDhxYtHVMDMbUjZs2PB0RIyttM8JJpk4cSKtra1FV8PMbEiR9Mue9rmLzMzMcuEEY2ZmuXCCMTOzXDjBmJlZLpxgzMwsF55FZmY2TK3a2M7iNVvYubeT8Y0NzG+Zwqypg/cYICcYM7NhaNXGdhau3ETngUMAtO/tZOHKTQCDlmTcRWZmNgwtXrPl5eRS0nngEIvXbBm0azjBmJkNQzv3dvYrPhBOMGZmw9D4xoZ+xQfCCcbMbBia3zKFhlEjusQaRo1gfsuUQbuGB/nNzIah0kC+Z5GZmdmgmzW1aVATSnfuIjMzs1w4wZiZWS5ySzCSvi5pt6RHymKXSNos6SVJzd2OXyipTdIWSS1l8XMlbUr7rpGkFD9K0m0p/oCkiWVl5kjaml5z8vqOZmbWszxbMEuBGd1ijwAXAd8vD0o6A5gNnJnKXCupNL3hOmA
eMDm9SuecCzwTEacDVwNXpXONAT4HvAWYBnxO0gmD+cXMzOzwckswEfF9YE+32GMRUWmZ6Ezg1oh4ISK2AW3ANEnjgNERcV9EBHAjMKuszLK0vQKYnlo3LcDaiNgTEc8Aa3l1ojMzs5zVyhhME7C97POOFGtK293jXcpExEFgH3BiL+d6FUnzJLVKau3o6BiEr2FmZiW1kmBUIRa9xAdapmswYklENEdE89ixY/tUUTMz65taSTA7gFPLPk8Adqb4hArxLmUkjQSOJ+uS6+lcZmZWRbWSYFYDs9PMsElkg/kPRsQuYL+k89L4yqXAHWVlSjPELgbWp3GaNcAFkk5Ig/sXpJiZmVVRbiv5Jd0CvAM4SdIOsplde4B/BMYCd0l6OCJaImKzpOXAo8BB4PKIKN1H+jKyGWkNwN3pBXADcJOktnTe2QARsUfSFcBP0nFfjIgukw3MzCx/yn70W3Nzc7S2thZdDTOzIUXShohorrSvVrrIzMyszjjBmJlZLpxgzMwsF04wZmaWCycYMzPLhROMmZnlwgnGzMxy4QRjZma5cIIxM7NcOMGYmVkunGDMzCwXTjBmZpYLJxgzM8uFE4yZmeUit+fBmJlZ9aza2M7iNVvYubeT8Y0NzG+ZwqypTYXWyQnGzGyIW7WxnYUrN9F5IHtOY/veThau3ARQaJJxF5mZ2RC3eM2Wl5NLSeeBQyxes6WgGmVySzCSvi5pt6RHymJjJK2VtDW9n1C2b6GkNklbJLWUxc+VtCntu0aSUvwoSbel+AOSJpaVmZOusVXSnLy+o5lZLdi5t7Nf8WrJswWzFJjRLbYAWBcRk4F16TOSzgBmA2emMtdKGpHKXAfMAyanV+mcc4FnIuJ04GrgqnSuMcDngLcA04DPlScyM7N6M76xoV/xasktwUTE94E93cIzgWVpexkwqyx+a0S8EBHbgDZgmqRxwOiIuC8iArixW5nSuVYA01PrpgVYGxF7IuIZYC2vTnRmZnVjfssUGkaN6BJrGDWC+S1TCqpRptqD/KdExC6AiNgl6eQUbwLuLztuR4odSNvd46Uy29O5DkraB5xYHq9QpgtJ88haR5x22mkD/1ZmZgUqDeR7FlllqhCLXuIDLdM1GLEEWALQ3Nxc8Rgzs6Fg1tSmwhNKd9VOME9JGpdaL+OA3Sm+Azi17LgJwM4Un1AhXl5mh6SRwPFkXXI7gHd0K/Pdwf0aZmaZWlx/UiuqPU15NVCa1TUHuKMsPjvNDJtENpj/YOpO2y/pvDS+cmm3MqVzXQysT+M0a4ALJJ2QBvcvSDEzs0FVWn/SvreT4JX1J6s2thddtZqQ5zTlW4D7gCmSdkiaCywC3iNpK/Ce9JmI2AwsBx4F7gEuj4jSpO7LgOvJBv6fAO5O8RuAEyW1AZ8mzUiLiD3AFcBP0uuLKWZmNqhqdf1JrVD2o9+am5ujtbW16GqY2RAyacFdFQd4BWxb9L5qV6cQkjZERHOlfV7Jb2Y2QLW6/qRWOMGYmQ1Qra4/qRW1Mk3ZzGzIqdX1J7XCCcbM7AjU4vqTWuEuMjMzy4UTjJmZ5cIJxszMcuExGDMrnG+3Up+cYMysULX6uF87cu4iM7NC+XYr9csJxswKVauP+7Uj5wRjZoXy7VbqlxOMmRXKt1upXx7kN7NC1dLtVjybbXA5wZhZ4WrhdiuezTb43EVmZoZns+XBCcbMDM9my0MhCUbSJyU9ImmzpE+l2BhJayVtTe8nlB2/UFKbpC2SWsri50ralPZdI0kpfpSk21L8AUkTq/0dzWxo8Wy2wVf1BCPpLOBPgWnAG4H3S5oMLADWRcRkYF36jKQzgNnAmcAM4FpJpSkn1wHzgMnpNSPF5wLPRMTpwNXAVVX4amY2hHk22+ArogXzW8D9EfF8RBwEvgf8PjATWJaOWQbMStszgVsj4oWI2Aa0AdMkjQNGR8R9ERHAjd3KlM61Apheat2YmVUya2oTV150Nk2NDQhoamzgyovO9gD/ESh
iFtkjwJcknQh0Ar8HtAKnRMQugIjYJenkdHwTcH9Z+R0pdiBtd4+XymxP5zooaR9wIvB0eUUkzSNrAXHaaacN1vczsyGqFmaz1ZOqt2Ai4jGyLqu1wD3AT4GDvRSp1PKIXuK9lelelyUR0RwRzWPHju213mZm1j+FDPJHxA0R8aaIeDuwB9gKPJW6vUjvu9PhO4BTy4pPAHam+IQK8S5lJI0Ejk/XMTOzKilqFtnJ6f004CLgFmA1MCcdMge4I22vBmanmWGTyAbzH0zdafslnZfGVy7tVqZ0rouB9WmcxszMqqSolfzfTGMwB4DLI+IZSYuA5ZLmAk8ClwBExGZJy4FHybrSLo+I0mqoy4ClQANwd3oB3ADcJKmNrOUyuzpfy8zMSuQf9pnm5uZobW0tuhpmZkOKpA0R0Vxpn1fym5lZLpxgzMwsF04wZmaWCycYMzPLhROMmZnlwgnGzMxy4QRjZma5cIIxM7NcOMGYmVkuirpVjJnVoFUb21m8Zgs793YyvrGB+S1TfPt6GzAnGDMDsuSycOUmOg9kt/pr39vJwpWbAJxkbEDcRWZmACxes+Xl5FLSeeAQi9dsKahGNtQ5wZgZADv3dvYrbnY4TjBmBsD4xoZ+xc0OxwnGzACY3zKFhlEjusQaRo1gfsuUQb3Oqo3tnL9oPZMW3MX5i9azamP7oJ7faocH+c0MeGUgP89ZZJ5IMLw4wZjZy2ZNbcr1D31vEwmcYOpPIV1kkv5c0mZJj0i6RdLRksZIWitpa3o/oez4hZLaJG2R1FIWP1fSprTvGklK8aMk3ZbiD0iaWMDXNLNuPJFgeOlTgpE04vBH9Y2kJuATQHNEnAWMAGYDC4B1ETEZWJc+I+mMtP9MYAZwbVl9rgPmAZPTa0aKzwWeiYjTgauBqwar/mY2cJ5IMLz0tQXTJmlx+mM/GEYCDZJGAscAO4GZwLK0fxkwK23PBG6NiBciYhvQBkyTNA4YHRH3RUQAN3YrUzrXCmB6qXVjZsWp1kQCqw19TTD/A/g5cL2k+yXNkzR6IBeMiHbgK8CTwC5gX0TcC5wSEbvSMbuAk1ORJmB72Sl2pFhT2u4e71ImIg4C+4ATu9clfY9WSa0dHR0D+Tpm1g+zpjZx5UVn09TYgICmxgauvOhsj7/UqT4N8kfEfuBrwNckvR24Bbha0grgioho6+sF09jKTGASsBe4XdKHeytSqUq9xHsr0zUQsQRYAtDc3Pyq/WY2+PKeSGC1o89jMJIulPQt4B+AvwNeD3wb+Pd+XvPdwLaI6IiIA8BK4G3AU6nbi/S+Ox2/Azi1rPwEsi61HWm7e7xLmdQNdzywp5/1NDOzI9DXLrKtZK2OxRExNSL+PiKeiogVwD39vOaTwHmSjknjItOBx4DVwJx0zBzgjrS9GpidZoZNIhvMfzB1o+2XdF46z6XdypTOdTGwPo3TmJlZlRy2iyzN2FoaEV+stD8iPtGfC0bEA6lr7SHgILCRrJvqOGC5pLlkSeiSdPxmScuBR9Pxl0dEaSL9ZcBSoAG4O70AbgBuktRG1nKZ3Z86mpnZkVNffthL+k5EvLMK9SlMc3NztLa2Fl0NM7MhRdKGiGiutK+vK/l/LOmfgNuA50rBiHhoEOpnZmZ1qK8J5m3pvbybLIB3DW51zMysXvR1mnJdd4+Zmdng6/PNLiW9j+x2LUeXYj0N/JuZmfV1Hcw/Ax8EPk62iPES4HU51svMzIa4vq6DeVtEXEp2A8kvAG+l6+JHMzOzLvqaYEr30n5e0njgANmtXszMzCrq6xjMnZIagcVkCyQDuD6vSpmZ2dDX11lkV6TNb0q6Ezg6IvblVy0zMxvqek0wki7qZR8RsXLwq2RmZvXgcC2YD/SyL8juhGxmZvYqvSaYiPhYtSpiZmb1xQstzcwsF15oaWZmufBCSzMzy8VAF1oexAstzcysF/1daPllYEOKeaGlmZn1qNcWjKQ3S/r1iLgiIvaSPdZ4E3A7cPVALih
piqSHy17PSvqUpDGS1kramt5PKCuzUFKbpC2SWsri50ralPZdI0kpfpSk21L8AUkTB1JXMzMbuMN1kf0L8CKApLcDi1JsH7BkIBeMiC0RcU5EnAOcCzwPfAtYAKyLiMnAuvQZSWcAs8lmsM0ArpU0Ip3uOmAeMDm9ZqT4XLLxotPJEuFVA6mrmZkN3OESzIiI2JO2PwgsiYhvRsT/AU4fhOtPB56IiF8CM4FlKb4MmJW2ZwK3RsQLEbENaAOmSRoHjI6I+yIigBu7lSmdawUwvdS6MTOz6jhsgpFUGqeZDqwv29fnNTS9mA3ckrZPiYhdAOn95BRvAraXldmRYk1pu3u8S5mIOEjW4jqx+8UlzZPUKqm1o6NjEL6OmZmVHC7B3AJ8T9IdZDPJfgAg6XSyP9oDJum1wIVk4zm9HlohFr3EeyvTNRCxJCKaI6J57Nixh6mGmZn1x+FuFfMlSeuAccC9qSsKssT08SO89nuBhyLiqfT5KUnjImJX6v7aneI76LrmZgKwM8UnVIiXl9mRWmDHA3swq2OrNrazeM0Wdu7tZHxjA/NbpjBratPhC5rl5LDrYCLi/oj4VkQ8Vxb7eUQ8dITX/hCvdI8BrAbmpO05wB1l8dlpZtgkssH8B1M32n5J56XxlUu7lSmd62JgfVlyNKs7qza2s3DlJtr3dhJA+95OFq7cxKqN7UVXzYaxvi60HFSSjgHeQ9e7MS8C3iNpa9q3CCAiNgPLgUeBe4DLI+JQKnMZ2XqcNuAJ4O4UvwE4UVIb8GnSjDSzerV4zRY6DxzqEus8cIjFa7YUVCOzwRmo77eIeJ5ug+4R8V9kEwkqHf8l4EsV4q3AWRXi/012vzSzYWHn3s5+xc2qoZAWjJkNrvGNDf2Km1WDE4xZHZjfMoWGUSO6xBpGjWB+y5SXP6/a2M75i9YzacFdnL9ovcdnLHeFdJGZ2eAqzRbraRZZaRJAaZymNAmgvKzZYHOCMasTs6Y29ZgsepsE4ARjeXEXmdkw4EkAVgQnGLNhwJMArAhOMDYkeID6yPRlEoDZYPMYjNU8D1AfucNNAjDLgxOM1TwPUA+O3iYBmOXBCcZqXpED1L6BpNnAeQzGal5RA9S+gaTZkXGCsZpX1AC1byBpdmTcRWY1r6gBaq8dMTsyTjA2JBQxQD2+sYH2CsnEa0fM+sZdZGY9qFbXnNf4WL1yC8asB9XomvMaH6tnTjBmvci7a85rfKyeFfXI5EZJKyQ9LukxSW+VNEbSWklb0/sJZccvlNQmaYuklrL4uZI2pX3XSFKKHyXpthR/QNLEAr6m9dFw7iLyRAKrZ0WNwfwDcE9EvAF4I/AYsABYFxGTgXXpM5LOAGYDZwIzgGsllTrGrwPmAZPTa0aKzwWeiYjTgauBq6rxpaz/hvtaE9+E0upZ1ROMpNHA24EbACLixYjYC8wElqXDlgGz0vZM4NaIeCEitgFtwDRJ44DREXFfRARwY7cypXOtAKaXWjdWW4b7WhPfhNLqWREtmNcDHcC/Stoo6XpJxwKnRMQugPR+cjq+CdheVn5HijWl7e7xLmUi4iCwDzixe0UkzZPUKqm1o6NjsL6f9cNw7yKaNbWJKy86m6bGBgQ0NTZw5UVne/zF6kIRg/wjgTcBH4+IByT9A6k7rAeVWh7RS7y3Ml0DEUuAJQDNzc2v2m/581oT34TS6lcRLZgdwI6IeCB9XkGWcJ5K3V6k991lx59aVn4CsDPFJ1SIdykjaSRwPLBn0L+JHTF3EZnVr6onmIj4T2C7pNJfkOnAo8BqYE6KzQHuSNurgdlpZtgkssH8B1M32n5J56XxlUu7lSmd62JgfRqnsRrjLiKz+lXUOpiPA9+Q9FrgP4CPkSW75ZLmAk8ClwBExGZJy8mS0EHg8ogojQpfBiwFGoC70wuyCQQ3SWoja7nMrsaXsoEpqovIt+I3y5f8wz7T3Nwcra2tRVfDqqT7CnrIuuYO13rqnpTe+YaxfOfxDicpG7Y
kbYiI5kr7vJLfqq4WWg4DWUFf6bYuN9//5Mv7fZsXs658s0urqlpZWDmQ6dGVklJ3w2kNj9nhOMFYVdXKwsqBrKDv69qc4bKGx+xwnGCsqmplYeVApkc3HjOqT+ceTmt4zHrjBGNVVSv33hrI9Oi+zIfxGh6zV3iQ36pqfsuUirO3DvdHOY+JAf2dHr2v80CP+wSeRWbWjROMVdVAHuJVKw/l6um2Nk2NDfxowbuqVg+zocIJxqquvy2HWnko10BbX2bDlROM1bxamRhQjUcom9UTJxirebV0x2Xf+dis7zyLzGqe77hsNjS5BWM1z11TZkOTE4wNCe6aMht63EVmZma5cIIxM7NcOMGYmVkunGDMzCwXhSQYSb+QtEnSw5JaU2yMpLWStqb3E8qOXyipTdIWSS1l8XPTedokXSNJKX6UpNtS/AFJE6v+Jc3MhrkiWzDvjIhzyh61uQBYFxGTgXXpM5LOAGYDZwIzgGsllRZFXAfMAyan14wUnws8ExGnA1cDV1Xh+5iZWZla6iKbCSxL28uAWWXxWyPihYjYBrQB0ySNA0ZHxH0REcCN3cqUzrUCmF5q3ZiZWXUUlWACuFfSBknzUuyUiNgFkN5PTvEmYHtZ2R0p1pS2u8e7lImIg8A+4MTulZA0T1KrpNaOjo5B+WJmZpYpaqHl+RGxU9LJwFpJj/dybKWWR/QS761M10DEEmAJQHNzcx8eJ2VmZn1VSAsmInam993At4BpwFOp24v0vjsdvgM4taz4BGBnik+oEO9SRtJI4HhgTx7fxczMKqt6gpF0rKRfK20DFwCPAKuBOemwOcAdaXs1MDvNDJtENpj/YOpG2y/pvDS+cmm3MqVzXQysT+M0ZmZWJUV0kZ0CfCuNuY8E/i0i7pH0E2C5pLnAk8AlABGxWdJy4FHgIHB5RJSe+HQZsBRoAO5OL4AbgJsktZG1XGZX44uZmdkr5B/2mebm5mhtbS26GmZmQ4qkDWXLTbqopWnKZmZWR3y7/pyt2tju55iY2bDkBJOjVRvbWbhyE50HsiGj9r2dLFy5CcBJxszqnrvIcrR4zZaXk0tJ54FDLF6zpaAamZlVjxNMjnbu7exX3MysnjjB5Gh8Y0O/4mZm9cQJJkfzW6bQMGpEl1jDqBHMb5lSUI3MzKrHg/w5Kg3kexaZmQ1HTjA5mzW1qZCE4unRZlY0J5g65OnRZlYLPAZThzw92sxqgRNMHfL0aDOrBU4wdcjTo82sFjjB1CFPjzazWuBB/jrk6dFmVgucYOpUUdOjzcxK3EVmZma5KCzBSBohaaOkO9PnMZLWStqa3k8oO3ahpDZJWyS1lMXPlbQp7btG6TnMko6SdFuKPyBpYtW/4BCwamM75y9az6QFd3H+ovWs2thedJXMrI4U2YL5JPBY2ecFwLqImAysS5+RdAYwGzgTmAFcK6k0gn0dMA+YnF4zUnwu8ExEnA5cDVyV71cZekqLMdv3dhK8shjTScbMBkshCUbSBOB9wPVl4ZnAsrS9DJhVFr81Il6IiG1AGzBN0jhgdETcFxEB3NitTOlcK4DppdbNYBuqrQAvxjSzvBXVgvkq8FngpbLYKRGxCyC9n5ziTcD2suN2pFhT2u4e71ImIg4C+4ATu1dC0jxJrZJaOzo6+v0lhnIrwIsxzSxvVU8wkt4P7I6IDX0tUiEWvcR7K9M1ELEkIpojonns2LF9rM4rhnIrwIsxzSxvRbRgzgculPQL4FbgXZJuBp5K3V6k993p+B3AqWXlJwA7U3xChXiXMpJGAscDewb7iwzlVoAXY5pZ3qqeYCJiYURMiIiJZIP36yPiw8BqYE46bA5wR9peDcxOM8MmkQ3mP5i60fZLOi+Nr1zarUzpXBena7yqBXOkBqsVUMQ4zqypTVx50dk0NTYgoKmxgSsvOttrZ8xs0NTSQstFwHJJc4EngUsAImKzpOXAo8BB4PKIKPVLXQYsBRqAu9ML4AbgJkltZC2X2XlUeH7LlC6
3xYf+twKKvLW+F2OaWZ6Uww/7Iam5uTlaW1v7Xe5IH+x1/qL1tFfoUmtqbOBHC97V7/qYmVWTpA0R0VxpXy21YIakI20FDOVxHDOz3vhWMQXzbC4zq1dOMAXzbC4zq1fuIiuYb61vZvXKCaYGeDaXmdUjd5GZmVkunGDMzCwXTjBmZpYLJxgzM8uFE4yZmeXCt4pJJHUAv+zj4ScBT+dYncHkuubDdc2H6zr48q7n6yKi4vNOnGAGQFJrT/feqTWuaz5c13y4roOvyHq6i8zMzHLhBGNmZrlwghmYJUVXoB9c13y4rvlwXQdfYfX0GIyZmeXCLRgzM8uFE4yZmeXCCaYfJH1d0m5JjxRdl8ORdKqk70h6TNJmSZ8suk49kXS0pAcl/TTV9QtF16k3kkZI2ijpzqLr0htJv5C0SdLDkvr/PPAqktQoaYWkx9P/Z99adJ0qkTQl/XuWXs9K+lTR9eqJpD9P/009IukWSUdX9foeg+k7SW8HfgXcGBFnFV2f3kgaB4yLiIck/RqwAZgVEY8WXLVXkSTg2Ij4laRRwA+BT0bE/QVXrSJJnwaagdER8f6i69MTSb8AmiOi5hcDSloG/CAirpf0WuCYiNhbcLV6JWkE0A68JSL6uki7aiQ1kf23dEZEdEpaDvx7RCytVh3cgumHiPg+sKfoevRFROyKiIfS9n7gMaAmHzoTmV+lj6PSqyZ/+UiaALwPuL7outQLSaOBtwM3AETEi7WeXJLpwBO1mFzKjAQaJI0EjgF2VvPiTjDDgKSJwFTggYKr0qPU7fQwsBtYGxG1WtevAp8FXiq4Hn0RwL2SNkiaV3RlevF6oAP419T1eL2kY4uuVB/MBm4puhI9iYh24CvAk8AuYF9E3FvNOjjB1DlJxwHfBD4VEc8WXZ+eRMShiDgHmABMk1RzXZCS3g/sjogNRdelj86PiDcB7wUuT128tWgk8CbguoiYCjwHLCi2Sr1L3XgXArcXXZeeSDoBmAlMAsYDx0r6cDXr4ARTx9J4xjeBb0TEyqLr0xepa+S7wIxia1LR+cCFaWzjVuBdkm4utko9i4id6X038C1gWrE16tEOYEdZq3UFWcKpZe8FHoqIp4quSC/eDWyLiI6IOACsBN5WzQo4wdSpNHB+A/BYRPx90fXpjaSxkhrTdgPZfxiPF1qpCiJiYURMiIiJZN0j6yOiqr8I+0rSsWlyB6m76QKgJmc/RsR/AtslTUmh6UDNTUbp5kPUcPdY8iRwnqRj0t+D6WRjsVXjBNMPkm4B7gOmSNohaW7RderF+cBHyH5ll6ZU/l7RlerBOOA7kn4G/IRsDKampwAPAacAP5T0U+BB4K6IuKfgOvXm48A30v8HzgH+ttjq9EzSMcB7yFoENSu1CFcADwGbyP7eV/W2MZ6mbGZmuXALxszMcuEEY2ZmuXCCMTOzXDjBmJlZLpxgzMwsF04wVveU+aGk95bF/lBSIdN2Jb0hTRvfKOk3iqhDWV3+TNKlRdbB6penKduwkG49czvZPdlGAA8DMyLiiQGca0REHDqCuiwAGiLicwM9x2CQNDIiDhZZB6tvTjA2bEj6Mtl9ro5N768Dzia7F9bnI+KOdGPQm9IxAP8rIn4s6R3A58huGngO8GZgOdm900YAV0TEbd2udw7wz2R3sX0C+GPgrcDXgUPAzyPind3KXAB8ATgqlfkYcCLw/1LZPcD3gCuAnwP3kN3EdGr6fGlEPC/pXODvgeOAp4GPRsQuSd8Ffky2EHc18GvAryLiK6k19X+BscDzwJ9GxOOSlgLPkj2i4NeBz0bEilTfz5It6H0JuDsiFvR0nsP8z2P1KCL88mtYvMiSxhayVc1XAh9O8UayP87HkiWDo1N8MtCatt9BlpQmpc9/AHyt7NzHV7jez4DfTdtfBL6atj8PfKbC8ScB3yd7Ng7AXwJ/nbb/hGxV9nzgX1JsItkdk89Pn78OfIbscQc/Bsam+AeBr6ft7wLXll3z5boA64D
JafstZLfCAVhK1vp7DXAG0Jbi703XOSZ9HtPbefwafq+RfchBZnUhIp6TdBvZQ+P+EPiApM+k3UcDp5E9L+OfUuvjEPCbZad4MCK2pe1NwFckXQXcGRE/KL+WpOOBxoj4Xgot4/B33j2P7A/4j7JbR/FaslsTEdmDuC4B/oysBVWyPSJ+lLZvBj5B1qo5C1ibzjOCrOVV0qWllep7HNmNEG9PZSBrRZWsioiXgEclnZJi7wb+NSKeT3Xc04fz2DDiBGPDzUvpJeAPImJL+U5JnweeAt5I9ov9v8t2P1faiIifp26o3wOulHRvRHzxCOsmsvuwfehVO7L7X01IH48D9peq0u3QSOfZHBE9PXb4uQqx1wB7I3tkQiUvdKtn6b379Q93HhtGPIvMhqs1wMfTXWaRNDXFjwd2pV/rHyH79f8qksYDz0fEzWQPdepye/mI2Ac8I+l3UugjZGMnvbkfOF/S6ekax0gqtaCuAr4B/DXwtbIyp5U9v/5DZI/I3QKMLcUljZJ0Zm8XjuxZQdtSK6k08+6Nh6nvvcAfp+SHpDEDPI/VKScYG66uIBur+JmkR9JngGuBOZLuJ+seq/RrH7LJAQ+mp3D+b+BvKhwzB1hcdofgXls4EdEBfBS4JZW5H3iDpN8lm1RwVUR8A3hR0sdSscdSfX8GjCF7aNeLwMXAVeluyg/Tt+eA/BEwN5XZTPawqt7qew/ZRIHW9O9Q6m7s13msfnkWmdkQlWa83RkRNff0TzNwC8bMzHLiFoyZmeXCLRgzM8uFE4yZmeXCCcbMzHLhBGNmZrlwgjEzs1z8f4wIUn0YfzxzAAAAAElFTkSuQmCC\n", 236 | "text/plain": [ 237 | "
<Figure size 432x288 with 1 Axes>" 238 | ] 239 | }, 240 | "metadata": { 241 | "needs_background": "light" 242 | }, 243 | "output_type": "display_data" 244 | } 245 | ], 246 | "source": [ 247 | "plt.scatter(x_train,y_train)\n", 248 | "plt.xlabel(\"Years of experience\")\n", 249 | "plt.ylabel(\"Salary\")\n", 250 | "plt.title(\"Scatter Plot\")\n", 251 | "plt.show()" 252 | ] 253 | }, 254 | { 255 | "cell_type": "code", 256 | "execution_count": 13, 257 | "metadata": {}, 258 | "outputs": [], 259 | "source": [ 260 | "#y=mx+c\n", 261 | "#here c is the bias that we will have \n", 262 | "#m is the weight or slope\n", 263 | "#x,y are the independent and dependent variables resp." 264 | ] 265 | }, 266 | { 267 | "cell_type": "code", 268 | "execution_count": 14, 269 | "metadata": {}, 270 | "outputs": [ 271 | { 272 | "name": "stdout", 273 | "output_type": "stream", 274 | "text": [ 275 | "tensor([-0.9106, -0.8722, -1.3318, -0.4220, 0.3752, -0.4594, -0.0186, 0.8910,\n", 276 | " 0.5777, 1.2361, -0.1749, -0.0420, 1.1878, 0.2229, -0.3383, -0.1718,\n", 277 | " -1.7934, -0.5112, 1.3540, -1.5285, 1.3032, -1.3901, 0.8975, 1.5249],\n", 278 | " requires_grad=True)\n", 279 | "tensor([-0.0196, -1.3625, 0.0125, -1.1594, -0.4420, -0.9959, 0.3635, -0.6018,\n", 280 | " -1.0911, -2.0272, -0.5228, 0.6352, -1.0188, 1.5032, -0.7270, -0.4117,\n", 281 | " -1.1910, 0.5346, -0.1988, 0.0073, 0.1303, -1.0499, 0.9785, 0.3123],\n", 282 | " requires_grad=True)\n" 283 | ] 284 | } 285 | ], 286 | "source": [ 287 | "m,c=torch.randn(24,requires_grad=True),torch.randn(24,requires_grad=True)\n", 288 | "print(m)\n", 289 | "print(c)" 290 | ] 291 | }, 292 | { 293 | "cell_type": "code", 294 | "execution_count": 15, 295 | "metadata": {}, 296 | "outputs": [], 297 | "source": [ 298 | "def model(x):\n", 299 | " x=x.double()\n", 300 | " return x*m.t()+c" 301 | ] 302 | }, 303 | { 304 | "cell_type": "code", 305 | "execution_count": 16, 306 | "metadata": {}, 307 | "outputs": [], 308 | "source": [ 309 | "predictions=model(x_train) #it is just to show how it will predict" 
310 | ] 311 | }, 312 | { 313 | "cell_type": "code", 314 | "execution_count": 17, 315 | "metadata": {}, 316 | "outputs": [ 317 | { 318 | "data": { 319 | "text/plain": [ 320 | "tensor([ -1.0212, -2.4964, -1.9852, -2.0034, 0.3835, -2.3280, 0.3077,\n", 321 | " 2.2494, 0.7574, 2.5463, -1.2048, 0.4673, 3.7323, 2.4170,\n", 322 | " -2.2493, -1.2537, -10.3373, -2.1745, 7.7896, -9.1637, 8.9923,\n", 323 | " -10.9198, 8.0687, 12.8166], dtype=torch.float64,\n", 324 | " grad_fn=<AddBackward0>)" 325 | ] 326 | }, 327 | "execution_count": 17, 328 | "metadata": {}, 329 | "output_type": "execute_result" 330 | } 331 | ], 332 | "source": [ 333 | "predictions #so here we see the predictions are way off" 334 | ] 335 | }, 336 | { 337 | "cell_type": "code", 338 | "execution_count": 18, 339 | "metadata": {}, 340 | "outputs": [ 341 | { 342 | "data": { 343 | "text/plain": [ 344 | "tensor([ 39343., 46205., 37731., 43525., 39891., 56642., 60150., 54445.,\n", 345 | " 64445., 57189., 63218., 55794., 56957., 57081., 61111., 67938.,\n", 346 | " 66029., 83088., 81363., 93940., 91738., 98273., 101302., 113812.],\n", 347 | " dtype=torch.float64)" 348 | ] 349 | }, 350 | "execution_count": 18, 351 | "metadata": {}, 352 | "output_type": "execute_result" 353 | } 354 | ], 355 | "source": [ 356 | "y_train #here just to give an idea how vastly the values are varying" 357 | ] 358 | }, 359 | { 360 | "cell_type": "code", 361 | "execution_count": 19, 362 | "metadata": {}, 363 | "outputs": [], 364 | "source": [ 365 | "#now we will compare the predictions with our target variables\n", 366 | "def mse(t1,t2):\n", 367 | " diff=t1-t2\n", 368 | " return torch.sum(diff**2/diff.numel())" 369 | ] 370 | }, 371 | { 372 | "cell_type": "code", 373 | "execution_count": 20, 374 | "metadata": {}, 375 | "outputs": [], 376 | "source": [ 377 | "#here diff can be understood as targets-predicted values" 378 | ] 379 | }, 380 | { 381 | "cell_type": "code", 382 | "execution_count": 21, 383 | "metadata": {}, 384 | "outputs": 
[ 385 | { 386 | "data": { 387 | "text/plain": [ 388 | "tensor(4.8235e+09, dtype=torch.float64, grad_fn=<SumBackward0>)" 389 | ] 390 | }, 391 | "execution_count": 21, 392 | "metadata": {}, 393 | "output_type": "execute_result" 394 | } 395 | ], 396 | "source": [ 397 | "mse(y_train,predictions) #so the loss here is gigantic at first\n", 398 | "#no need to worry, we will bring it down" 399 | ] 400 | }, 401 | { 402 | "cell_type": "code", 403 | "execution_count": 22, 404 | "metadata": {}, 405 | "outputs": [], 406 | "source": [ 407 | "loss=mse(y_train,predictions)\n", 408 | "loss.backward()" 409 | ] 410 | }, 411 | { 412 | "cell_type": "code", 413 | "execution_count": 23, 414 | "metadata": {}, 415 | "outputs": [ 416 | { 417 | "data": { 418 | "text/plain": [ 419 | "tensor([ -3606.5354, -5005.8120, -4716.6230, -7254.5005, -7313.2798,\n", 420 | " -13689.0459, -15037.4229, -14518.0664, -17185.1309, -17632.4902,\n", 421 | " -20546.2422, -18597.8438, -18984.4219, -19501.8496, -22917.4688,\n", 422 | " -27741.8613, -28066.7188, -36698.1602, -39999.6445, -46974.5820,\n", 423 | " -51979.7695, -58151.3203, -66685.1719, -77762.7734])" 424 | ] 425 | }, 426 | "execution_count": 23, 427 | "metadata": {}, 428 | "output_type": "execute_result" 429 | } 430 | ], 431 | "source": [ 432 | "#checking grads\n", 433 | "m.grad" 434 | ] 435 | }, 436 | { 437 | "cell_type": "code", 438 | "execution_count": 24, 439 | "metadata": {}, 440 | "outputs": [ 441 | { 442 | "data": { 443 | "text/plain": [ 444 | "tensor([-3278.6685, -3850.6248, -3144.4155, -3627.2502, -3324.2180, -4720.3608,\n", 445 | " -5012.4741, -4536.8960, -5370.3535, -4765.5376, -5268.2671, -4649.4609,\n", 446 | " -4746.1055, -4756.5483, -5092.7710, -5661.6045, -5503.2783, -6924.1812,\n", 447 | " -6779.6011, -7829.0972, -7644.0840, -8190.3267, -8441.1611, -9483.2656])" 448 | ] 449 | }, 450 | "execution_count": 24, 451 | "metadata": {}, 452 | "output_type": "execute_result" 453 | } 454 | ], 455 | "source": [ 456 | "c.grad" 457 | ] 458 | }, 459 | { 460 | 
"cell_type": "code", 461 | "execution_count": 25, 462 | "metadata": {}, 463 | "outputs": [ 464 | { 465 | "data": { 466 | "text/plain": [ 467 | "tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])" 468 | ] 469 | }, 470 | "execution_count": 25, 471 | "metadata": {}, 472 | "output_type": "execute_result" 473 | } 474 | ], 475 | "source": [ 476 | "#resetting the grads to zero \n", 477 | "m.grad.zero_()\n", 478 | "c.grad.zero_()" 479 | ] 480 | }, 481 | { 482 | "cell_type": "code", 483 | "execution_count": 26, 484 | "metadata": {}, 485 | "outputs": [], 486 | "source": [ 487 | "#this is how you adjust the weights\n", 488 | "with torch.no_grad(): #inside this block torch will not track gradients for these updates\n", 489 | " m-=m.grad*1e-2\n", 490 | " c-=c.grad*1e-2\n", 491 | " m.grad.zero_()\n", 492 | " c.grad.zero_()" 493 | ] 494 | }, 495 | { 496 | "cell_type": "code", 497 | "execution_count": 29, 498 | "metadata": {}, 499 | "outputs": [], 500 | "source": [ 501 | "#now training the model for multiple epochs\n", 502 | "for i in range(3000):\n", 503 | " pred=model(x_train)\n", 504 | " loss=mse(y_train,pred)\n", 505 | " loss.backward()\n", 506 | " with torch.no_grad():\n", 507 | " m-=m.grad*1e-2\n", 508 | " c-=c.grad*1e-2\n", 509 | " m.grad.zero_()\n", 510 | " c.grad.zero_()" 511 | ] 512 | }, 513 | { 514 | "cell_type": "code", 515 | "execution_count": 30, 516 | "metadata": {}, 517 | "outputs": [ 518 | { 519 | "name": "stdout", 520 | "output_type": "stream", 521 | "text": [ 522 | "tensor(26.8843, dtype=torch.float64, grad_fn=<SumBackward0>)\n" 523 | ] 524 | } 525 | ], 526 | "source": [ 527 | "#to verify the loss\n", 528 | "preds=model(x_train)\n", 529 | "loss=mse(y_train,preds)\n", 530 | "print(loss)" 531 | ] 532 | }, 533 | { 534 | "cell_type": "code", 535 | "execution_count": 46, 536 | "metadata": {}, 537 | "outputs": [], 538 | "source": [ 539 | "x1=x_train.detach().numpy()\n", 540 | "m1=m.detach().numpy()\n", 541 | 
"c1=c.detach().numpy()" 542 | ] 543 | }, 544 | { 545 | "cell_type": "code", 546 | "execution_count": 52, 547 | "metadata": {}, 548 | "outputs": [ 549 | { 550 | "data": { 551 | "text/plain": [ 552 | "" 553 | ] 554 | }, 555 | "execution_count": 52, 556 | "metadata": {}, 557 | "output_type": "execute_result" 558 | }, 559 | { 560 | "data": { 561 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYoAAAD4CAYAAADy46FuAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy86wFpkAAAACXBIWXMAAAsTAAALEwEAmpwYAAAxZElEQVR4nO3deXhU5dn48e+dhSUge2QLSUAQZSlbiizqqyKCK/ysVGxUWtH0tbR16aKUvl1ejcXWavWt0Eawoo0ipS7UioqgUpAtLMomEggJkSWBhDUsWe7fH+cEJjGZLDOTmUzuz3XNdWaeOc85d7wwd57tPKKqGGOMMdWJCHYAxhhjQpslCmOMMV5ZojDGGOOVJQpjjDFeWaIwxhjjVVSwA/C3Tp06aWJiYrDDMMaYRmX9+vWHVDW2qu/CLlEkJiaSkZER7DCMMaZREZHs6r6zridjjDFeWaIwxhjjlSUKY4wxXlmiMMYY45UlCmOMMV5ZojDGmEYuPR0SEyEiwjmmp/v3+mE3PdYYY5qS9HRISYGiIudzdrbzGSA52T/3sBaFMcY0YjNmOEnizstfYcqVLwFKUZFT7i+WKIwxphHLyQGRMh677X+4c/TfATlX7i/W9WSMMY1YfDz0bbOExNhsHpn/ZIVyf7FEYYwxjVhqKsRseIFDxzvyVsZEAGJinHJ/sURhjDGNWPKtBynjbeau+DHFpc1JSHCShL8GssEShTHGNG5ZLxEhJdw38z7umx2YW9hgtjHGNFaqkDkHYq+AtpcE7DaWKIwxprHK+xhOZELv+wJ6mxoThYi8KCJ5IrLFo2ySiGwVkTIRSap0/nQRyRSRHSIyzqN8mIhsdr97TkTELW8uIq+75WtEJNGjzhQR2em+pvjlJzbGmHCRmQbR7aDHbQG9TW1aFC8B4yuVbQFuBZZ7FopIP2Ay0N+tM0tEIt2vZwMpQB/3VX7NqUChqvYGngGedK/VAfg1cBkwHPi1iLSvw89mjDHh6/Qh2PsG9LwLoloG9FY1JgpVXQ4UVCrbrqo7qjh9AjBfVc+oahaQCQwXka5AG1VdpaoKvAxM9Kgzz32/EBjjtjbGAUtUtUBVC4ElfD1hGWNM07TnFSg7G/BuJ/D/GEV3YK/H51y3rLv7vnJ5hTqqWgIcBTp6udbXiEiKiGSISEZ+fr4ffgxjjAlhqpD5AnQcAe0GBvx2/k4UUkWZeimvb52KhappqpqkqkmxsVXuDW6MMeEjfyUc294grQnwf6LIBXp4fI4D9rnlcVWUV6gjIlFAW5yuruquZYwxTduuFyDqAki4vUFu5+9EsQiY7M5k6okzaL1WVfcDx0VkhDv+cDfwtked8hlNtwHL3HGM94HrRKS9O4h9nVtmjDFN19lCyFkAid+BqFYNcssaV2aLyGvAVUAnEcnFmYlUAPwfEAv8W0Q2qeo4Vd0qIguAbUAJME1VS91L3Y8zg6olsNh9AcwFXhGRTPe6kwFUtUBEHgPWuef9r6pWGFQ3xpgmJysdSk9D75QGu6U4f7yHj6SkJM3IyAh2GMYY43+qsHgwSBRcv96vlxaR9aqaVNV3tjLbGGMai8Pr4MjnDTaIXc4ShTHGNBa7XoDIGGd8ogFZojDGmMag+DhkvwYJkyG6TYPe2hKF
[base64-encoded PNG plot output omitted]\n", 562 | "text/plain": [ 563 | "
" 564 | ] 565 | }, 566 | "metadata": { 567 | "needs_background": "light" 568 | }, 569 | "output_type": "display_data" 570 | } 571 | ], 572 | "source": [ 573 | "plt.plot(x1,m1*x1+c1,c='orange')\n", 574 | "plt.scatter(x_train,y_train,c='blue')" 575 | ] 576 | }, 577 | { 578 | "cell_type": "code", 579 | "execution_count": 56, 580 | "metadata": {}, 581 | "outputs": [ 582 | { 583 | "data": { 584 | "text/plain": [ 585 | "" 586 | ] 587 | }, 588 | "execution_count": 56, 589 | "metadata": {}, 590 | "output_type": "execute_result" 591 | }, 592 | { 593 | "data": { 594 | "image/png": "[base64-encoded PNG plot output omitted]\n",
595 | "text/plain": [ 596 | "
" 597 | ] 598 | }, 599 | "metadata": { 600 | "needs_background": "light" 601 | }, 602 | "output_type": "display_data" 603 | } 604 | ], 605 | "source": [ 606 | "m, b = np.polyfit(x1, y_train, 1)\n", 607 | "plt.plot(x1, m*x1 + b)\n", 608 | "plt.scatter(x_train,y_train,c='orange')" 609 | ] 610 | } 611 | ], 612 | "metadata": { 613 | "kernelspec": { 614 | "display_name": "Python 3", 615 | "language": "python", 616 | "name": "python3" 617 | }, 618 | "language_info": { 619 | "codemirror_mode": { 620 | "name": "ipython", 621 | "version": 3 622 | }, 623 | "file_extension": ".py", 624 | "mimetype": "text/x-python", 625 | "name": "python", 626 | "nbconvert_exporter": "python", 627 | "pygments_lexer": "ipython3", 628 | "version": "3.7.4" 629 | } 630 | }, 631 | "nbformat": 4, 632 | "nbformat_minor": 2 633 | } 634 | -------------------------------------------------------------------------------- /Pytorch/Tensors/tensors_theory.py: -------------------------------------------------------------------------------- 1 | #tensors, index starts from zero ;) 2 | import torch 3 | 4 | #1D 5 | x=torch.ones(3) #1D tensor of size 3; three 1's 6 | y=torch.zeros(3) #same, but filled with zeros 7 | y[1]=5 #tensor([0., 5., 0.]) 8 | my_tensor=torch.tensor([1,2,3]) #tensor([1, 2, 3]) 9 | 10 | #2D 11 | two_di=torch.tensor([[1,2],[3,4]]) 12 | '''tensor([[1, 2], 13 | [3, 4]])''' 14 | print(two_di[1,0]) #prints tensor(3) 15 | print(two_di[1]) #tensor([3, 4]) 16 | print(two_di.shape)#torch.Size([2, 2]); format [rows, cols] 17 | two_di[None] #adds a leading dimension of size 1 18 | '''tensor([[[1, 2], 19 | [3, 4]]])''' 20 | zeros=torch.zeros(2,3) 21 | '''tensor([[0., 0., 0.], 22 | [0., 0., 0.]])''' 23 | 24 | #indexing in tensors 25 | 26 | #1D 27 | tensor_plusplus=torch.tensor(range(4)) #tensor([0, 1, 2, 3]) 28 | tensor_plusplus[:] #tensor([0, 1, 2, 3]); a full slice keeps every element 29 | tensor_plusplus[-1] #tensor(3) 30 | tensor_plusplus[:-1] #tensor([0, 1, 2]); excludes the last element 31 | tensor_plusplus[::2] #tensor([0, 2]); printing elements with step 2.
Default value of step is 1. 32 | 33 | #2D 34 | good_boy=torch.tensor([[1,2,3],[4,5,6]]) 35 | '''tensor([[1, 2, 3], 36 | [4, 5, 6]])''' 37 | good_boy[None] #adds a leading dimension, as with two_di[None] above 38 | good_boy[:,1] #tensor([2, 5]); the column at index 1 (second column); format [row_start:row_stop, col_start:col_stop] 39 | good_boy[1,1:] #tensor([5, 6]) 40 | good_boy.reshape(1,6) #tensor([[1, 2, 3, 4, 5, 6]]) 41 | 42 | 43 | #operations 44 | x=torch.ones(4) #tensor([1., 1., 1., 1.]) 45 | x=torch.ones(4).int() #tensor([1, 1, 1, 1], dtype=torch.int32) 46 | torch.sum(x) #tensor(4) 47 | y=torch.tensor([1,2,3,4]) 48 | z=torch.tensor([[1,2],[3,4],[5,6]]) 49 | '''tensor([[1, 2], 50 | [3, 4], 51 | [5, 6]])''' 52 | z.transpose(0,1) 53 | '''tensor([[1, 3, 5], 54 | [2, 4, 6]])''' 55 | torch.add(x,y) #tensor([2, 3, 4, 5]) 56 | print(x+y) #tensor([2, 3, 4, 5]) 57 | torch.sub(x,y) #tensor([ 0, -1, -2, -3]); another way of doing this is: x-y 58 | torch.mul(x,y) #tensor([1, 2, 3, 4]); same as x*y and x.mul(y); element-wise multiplication, not matrix multiplication.
59 | 60 | 61 | 62 | 63 | #transposing in higher dimensions 64 | 65 | abc=torch.tensor([[1,2,3],[3,4,5]]) 66 | abc_t=abc.t() #transpose; .t() only works for tensors with <= 2 dimensions 67 | '''tensor([[1, 3], 68 | [2, 4], 69 | [3, 5]])''' 70 | #for higher-dimensional tensors, swap two dims with transpose(dim0, dim1) 71 | demo=torch.ones(3,4,5) 72 | demo.shape #torch.Size([3, 4, 5]) 73 | demo_t=demo.transpose(1,2) 74 | demo_t.shape #torch.Size([3, 5, 4]) 75 | 76 | #numpy - tensor interoperability 77 | 78 | tensor=torch.ones(5) #tensor([1., 1., 1., 1., 1.]) 79 | tensor_to_np=tensor.numpy() #array([1., 1., 1., 1., 1.], dtype=float32); shares memory with the tensor 80 | t=torch.from_numpy(tensor_to_np) #tensor([1., 1., 1., 1., 1.]) 81 | 82 | #saving tensors 83 | import h5py 84 | f=h5py.File('storing_tensor.hdf5','w') 85 | f.create_dataset('dataset',data=t) 86 | f.close() 87 | read=h5py.File('storing_tensor.hdf5','r') 88 | read.keys() #<KeysViewHDF5 ['dataset']> 89 | read.values() #ValuesViewHDF5(<HDF5 file "storing_tensor.hdf5" (mode r)>) 90 | print(read.get('dataset')) #<HDF5 dataset "dataset": shape (5,), type "<f4"> 91 | read.close() 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | #exercise 103 | 104 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Deep-Stuff 2 | 3 | 4 | 5 | This repo is dedicated to deep learning concepts from scratch using Python. It consists of the following: 6 | 7 | 1. Resources (theoretical) 8 | 9 | 2. Blogs (both theoretical and code-based) 10 | 11 | 3. Explainable Jupyter notebooks 12 | 13 | 4.
Projects (beginner and intermediate) 14 | 15 | 16 | ### Concepts covered 17 | 18 | * Definition 19 | 20 | * Tensors 21 | 22 | * ANN 23 | 24 | * CNN 25 | 26 | * RNN 27 | 28 | * SOM 29 | 30 | * Boltzmann Machines 31 | 32 | * Autoencoders 33 | 34 | * GANs 35 | 36 | 37 | ### Frameworks 38 | 39 | * PyTorch 40 | 41 | * TensorFlow (coming soon) 42 | 43 | ### Other Curations 44 | 45 | * [100 Day Programming Challenge - Learning Path](https://github.com/bhav09/100dayProgrammingChallenge_LearningPath) 46 | 47 | * [Free Data Science Resources](https://github.com/bhav09/FREE-Data-Science-Resources) 48 | 49 | * [Python zero to hero](https://github.com/bhav09/python_zero_to_hero) 50 | 51 | * [Basics of NLP](https://github.com/bhav09/NLP_basics) 52 | 53 | * [Basics of OpenCV](https://github.com/bhav09/OpenCV_template) 54 | -------------------------------------------------------------------------------- /Resources/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Resources/README.md: -------------------------------------------------------------------------------- 1 | 2 | ## Books on Deep Learning 3 | 4 | * [Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville](http://www.deeplearningbook.org/) 5 | 6 | * [Deep Learning Tutorial by LISA Lab, University of Montreal](http://deeplearning.net/tutorial/deeplearning.pdf) 7 | 8 | * [Deep Learning: Methods and Applications by Li Deng and Dong Yu](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/DeepLearning-NowPublishing-Vol7-SIG-039.pdf) 9 | 10 | * [Deep Learning with PyTorch](https://pytorch.org/deep-learning-with-pytorch) 11 | -------------------------------------------------------------------------------- /static/.gitkeep: -------------------------------------------------------------------------------- 1 | 2 |
-------------------------------------------------------------------------------- /static/cerebrum.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/bhav09/deep-stuff/e233df7a525d9a73fe54058135ef6d24869da75b/static/cerebrum.jpg -------------------------------------------------------------------------------- /static/nn.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/bhav09/deep-stuff/e233df7a525d9a73fe54058135ef6d24869da75b/static/nn.jpg --------------------------------------------------------------------------------
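The NumPy round-trip and h5py save/load steps at the end of `tensors_theory.py` can be sketched as one self-contained script. This is a minimal sketch, not part of the repo: the file and dataset names (`storing_tensor.hdf5`, `dataset`) are taken from the source file, and everything else is plain `torch`/`h5py` API.

```python
import h5py
import torch

# Build a tensor and write it to HDF5 (h5py stores NumPy arrays,
# so convert explicitly with .numpy())
t = torch.ones(5)
with h5py.File('storing_tensor.hdf5', 'w') as f:
    f.create_dataset('dataset', data=t.numpy())

# Read it back and rebuild a torch tensor; [:] pulls the whole
# dataset into memory as a NumPy array
with h5py.File('storing_tensor.hdf5', 'r') as f:
    restored = torch.from_numpy(f['dataset'][:])

print(torch.equal(t, restored))  # True
```

For pure-PyTorch workflows, `torch.save(t, 'tensor.pt')` / `torch.load('tensor.pt')` is the more idiomatic route; HDF5 is mainly useful for exchanging data with non-PyTorch tools.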