/Readme.md:
--------------------------------------------------------------------------------
# My notes on DSA

I'm making my detailed notes here as I'm learning DSA.
This can be useful to you even if you don't have any knowledge of DSA.

If you find something that I have written wrong or that could be explained in a better
way, please submit an issue [here](https://github.com/siddharthroy12/My-DSA-Notes/issues).

Contributions are welcome.

## Table of Contents

1. [Big O](#big-o)
    1. [How to calculate Big O](#how-to-calculate-big-o)
    2. [Space Complexity](#space-complexity)
    3. [Big O Cheat Sheet](#cheat-sheet)
2. [Data Structures](#data-structures)
    1. [Operations on Data Structures](#operations-on-data-structures)
    2. [Arrays](#array)
    3. [Hash Tables](#hash-tables)
    4. [Linked List](#linked-list)
    5. [Stacks and Queues](#stacks-and-queues)
    6. [Trees](#trees)
    7. [Graphs](#graphs)
3. [Algorithms](#algorithms)
    1. [Recursion](#recursion)
    2. [Back Tracking](#backtracking)
    3. [Two Pointers](#two-pointers)
    4. [Divide and Conquer](#divide-and-conquer)
    5. [Sorting](#sorting)
    6. [Searching](#searching)
    7. [BFS and DFS](#bfs-and-dfs)
    8. [Dynamic Programming](#dynamic-programming)
4. [Additional Important Topics](#additional-important-topics)
5. [Coding Problems](#coding-problems)

## Big O

The Big O notation is used to show how well an algorithm scales
in terms of speed and space as the input grows.

It indicates how much time and memory an algorithm
will take to solve a problem for a given input.

Because the same code can run faster or slower on different processors,
measuring wall-clock time is not a good way to judge the efficiency of a program.

That's why we use Big O notation: it shows how well code performs
as the input grows.

Big O can also be plotted on a graph:

![Big O Graph](./Big-O-Graph.png)

Note that even though O(n^2), O(2^n) and O(n!) are marked as horrible, that's not always the case.

In some cases you cannot avoid having steep time complexities;
they can be the most efficient complexities for certain problems, so keep that in mind.

### How to calculate Big O

To calculate Big O we need to count how many operations
the code/function/program will take for a given input.

For example, this code:

```js
function getFirstElement(array) {
  return array[0]; // -- O(1)
}
```

This code returns the first element of an array,
so it has only one operation, and the Big O of this function is O(1).

Let's look at another example:

```js
function sumOfFirstThreeElements(array) {
  let res = 0; // -- O(1)

  for (let i = 0; i < 3; i++) { // -- O(3)
    res += array[i];
  }

  return res; // -- O(1)
}
```

You might think the Big O of this function is O(5) because
there are five operations in this code.
This is correct, but as the input grows the number of operations
stays the same; it does not grow.

So it runs in constant time, which is denoted as O(1).

Let's look at code where the number of operations grows with the input:

```js
function getIndexOf(array, element) {
  for (let i = 0; i < array.length; i++) { // -- O(n)
    if (array[i] === element) {
      return i;
    }
  }
}
```

This code finds the index of an element in an array by looping over each element.

So in the best case, if the element is at the first index,
it will only do one operation, O(1); but in the worst case,
the number of operations will be equal to the size of the input,
which is denoted by O(n).

When we calculate Big O we always take the worst case,
so the Big O of this code is O(n).

Let's look at another similar example:

```js
function getIndexOfArrayAndSumOfFirstThree(array, element) {
  let index = 0;
  let sum = 0;

  for (let i = 0; i < array.length; i++) { // -- O(n)
    if (array[i] === element) {
      index = i;
      break;
    }
  }

  for (let i = 0; i < 3; i++) { // -- O(3)
    sum += array[i];
  }

  return { index, sum };
}
```

Here we have a function that does two things:
- Get the index of an element
- Get the sum of the first three elements

You might think that the Big O of this function is O(n) + O(3) = O(n+3).
But we don't count constant numbers like 3, because if the number of elements
is 1000 then an extra 3 operations don't matter to us,
so we remove the constant and this becomes O(n).

> In big tech we often work with huge amounts of data

But what if we have two for loops?
```js
function getSumAndMultipleOfElements(array) {
  let sum = 0;
  let multiple = 1; // start at 1; starting at 0 would make the product always 0

  for (let i = 0; i < array.length; i++) { // -- O(n)
    sum += array[i];
  }

  for (let i = 0; i < array.length; i++) { // -- O(n)
    multiple *= array[i];
  }

  return { sum, multiple };
}
```

We have two O(n) loops here, so it should become O(2n), right?

Even though the graph of this is steeper than O(n),
it's still a straight line (a linear graph),
so once again we remove the constant and it becomes O(n).

For two separate collections:

```js
function getSumOfTwoArray(array1, array2) {
  let sum1 = 0;
  let sum2 = 0;

  for (let i = 0; i < array1.length; i++) { // -- O(a)
    sum1 += array1[i];
  }

  for (let i = 0; i < array2.length; i++) { // -- O(b)
    sum2 += array2[i];
  }

  return { sum1, sum2 };
}
```

People often mistake this for O(n), but that is incorrect.
Because we have two independent inputs, we need two variables in the notation.
So this is O(a+b).

For nested loops:

```js
function getPairsOfElements(array) {
  let res = [];

  for (let i = 0; i < array.length; i++) { // -- O(n)
    for (let j = 0; j < array.length; j++) { // -- O(n)
      res.push([i, j]);
    }
  }

  return res;
}
```

For nested loops like this, we multiply the operations,
so this becomes O(n * n) = O(n^2).
And the last rule:
drop non-dominant terms.

```js
function getSumAndPairs(array) {
  let pairs = [];
  let sum = 0;

  for (let i = 0; i < array.length; i++) { // -- O(n)
    for (let j = 0; j < array.length; j++) { // -- O(n)
      pairs.push([i, j]);
    }
  }

  for (let i = 0; i < array.length; i++) { // -- O(n)
    sum += array[i];
  }

  return { pairs, sum };
}
```

This looks like it has a complexity of O(n + n^2), but
because n^2 grows much faster than n it dominates,
so we drop the non-dominant term and write O(n^2).

When we calculate Big O we are concerned with the scalability of the function,
not its exact speed.

### Space complexity

Big O is not only used to calculate time complexity;
it is also used to calculate memory usage.

And there is often a tradeoff between time and memory usage in computers.

If an algorithm uses less time, it often has to use more memory;
if an algorithm uses less memory, it often takes more time.

You usually cannot have the best of both worlds.

If you are writing software that runs on low-end systems (like microcontrollers)
you want to sacrifice speed for low memory usage.

And if you are writing software that runs on a high-end system with lots of
data to process, you want to sacrifice memory usage for speed.

Let's look at an example to learn how to calculate space complexity:

```js
function sumOfFirstThreeElements(array) {
  let res = 0; // -- O(1)

  for (let i = 0; i < 3; i++) { // -- O(1)
    res += array[i];
  }

  return res;
}
```

When we calculate space complexity we do not include the space occupied by the inputs.
Inside our function we only create two variables, and
they do not grow as the input grows, so it's constant: O(1).

Let's look at another example:

```js
function generateHelloArray(n) {
  let res = [];

  for (let i = 0; i < n; i++) {
    res.push("Hello");
  }

  return res;
}
```

Note that the input can also be an integer (like n here), not just an array.

Here we create an array inside the function, and the length of the array
is the same as n, so the space complexity of this is O(n).

### Cheat sheet

#### The most common Big Os

**O(1)** Constant - no loops

**O(log n)** Logarithmic - usually searching algorithms on sorted data (Binary Search)

**O(n)** Linear - for loops, while loops through n items

**O(n log(n))** Log Linear - usually sorting operations

**O(n^2)** Quadratic - every element in a collection needs to be compared
to every other element. Two nested loops

**O(2^n)** Exponential - recursive algorithms that solve a problem of size n
by making multiple recursive calls per step

**O(n!)** Factorial - you are adding a loop for every element

#### Rules

- Always worst case
- Remove constants
  - O(n*2) => O(n)
  - O(n+100) => O(n)
  - O(n/2) => O(n)
- Different variables for different inputs
  - O(a + b)
  - O(a * b)
  - Use + for loops in sequence
  - Use * for nested loops
- Iterating through half a collection is still O(n)
- Drop non-dominant terms
  - O(n + n^2) => O(n^2)

#### What Causes Time Complexity?

- Operations (+, -, *, /)
- Comparisons (<, >, ==)
- Looping (for, while)
- Outside function calls (function())

#### What Causes Space Complexity?
- Variables
- Data Structures
- Function Calls
- Allocations

## Data Structures

To write efficient software it's important to learn how to use these different
types of data structures and how they can improve the efficiency of your program.

### Operations on Data Structures

Data structures are simply ways to organise data on our computers, and
each data structure has its tradeoffs: some are good at certain operations
and others are good at other operations.

The operations we are talking about are:

- Insertion: Adding more data
- Deletion: Deleting data
- Traversal: Looping through all the data
- Searching: Searching for data
- Sorting: Sorting all the data
- Access: Accessing data

Here is a cheat sheet on operations for all data structures:
![Cheat Sheet on operations for all data structures](./Operations-Table.png)

### Array

The most basic and commonly used data structure.
Elements of an array are placed next to each other in memory,
indexed with numbers starting at 0.

Elements of an array are stored sequentially, meaning if we have the address
of the first element of an array, to access the second element all we need
to do is add one to the address, to access the third element add two, and so on.

![Array Diagram](./Array.png)

This makes the access time O(1) if we know the index of our data.
But if you want to search for data you need to loop over every element
one by one, which makes the searching time O(n).
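A minimal sketch of that difference (the array contents and the `indexOfValue` helper here are made up for illustration):

```js
const items = [10, 20, 30, 40];

// Access by index: a single address calculation, O(1)
const third = items[2]; // 30

// Search by value: may have to visit every element, O(n)
function indexOfValue(array, target) {
  for (let i = 0; i < array.length; i++) {
    if (array[i] === target) return i;
  }
  return -1; // not found
}

indexOfValue(items, 40); // 3
```

No matter how large `items` grows, the access stays one step, while the search in the worst case touches every element.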
To get the length of the array we don't need to loop over the entire array;
we can just store the length of the array in a variable/property
when we create the array (like `array.length` in JS) and increase/decrease this value
whenever we add or remove an element.

Most languages provide this length property,
so the access time of the last element of the array is also O(1).

But adding and removing an element is a different story.

Many languages provide two array methods, `push` and `pop`, to insert
and remove an element at the end of the array.

`push` and `pop` have O(1) time complexity.

But if you want to add or remove an element anywhere else in the array,
the elements after the index you changed will have to be shifted, so in
the worst-case scenario the time complexity of inserting
and removing an element is O(n).

Strings are also arrays of characters, so the same applies to strings too.

#### Operations On Arrays

**Inserting** O(1) at the end and O(n) at the beginning.

**Deleting** Same as inserting.

**Lookup** O(1).

**Searching** O(n).

#### Coding Problems

##### Reversing an array

Most programming languages provide a built-in method for reversing an array,
but let's look at the ways we could implement it ourselves.

First, the simplest way to reverse an array:

```js
function reverse(input) {
  const output = [];

  for (let i = input.length - 1; i >= 0; i--) {
    output.push(input[i]);
  }

  return output;
}
```

It loops through the array in reverse order and pushes the values into a new array.

The time and space complexity of this code is O(n), but we can improve this.
```js
function reverse(input) {
  let index1 = 0;
  let index2 = input.length - 1;

  while (index2 > index1) {
    let tmp = input[index1];
    input[index1] = input[index2];
    input[index2] = tmp;
    index1++;
    index2--;
  }
}
```

In this function we use two indexes that point to opposite ends
of the array and swap their values as they move towards the center.

This function has a space complexity of O(1) because
it allocates a constant amount of space and mutates the input instead.

The time complexity is still O(n), in case you were wondering.

Note that this is an impure function because it mutates the input,
which is not always an option; that's why
the previous solution is not a bad way to reverse an array.

##### Merging sorted arrays

Let's solve a more complex interview question.

Q. Given two sorted arrays, merge them into one sorted array.

For example: `[0,4,6]` and `[2,3,7]` should become `[0,2,3,4,6,7]`

Answer:

To solve this problem we need to keep two indexes.
[1] One will point to the first element of the first array and
the other will point to the first element of the second array.

[2] Then compare the two values, push the smaller value to the result array,
and advance the index pointing to it.

Keep doing this until one of the indexes hits the end,
[3] then add the remaining elements to the result array.
```js
function mergeSortedArray(array1, array2) {
  // [1]
  let index1 = 0;
  let index2 = 0;
  let result = [];

  // [2]
  while (index1 < array1.length && index2 < array2.length) {
    if (array1[index1] < array2[index2]) {
      result.push(array1[index1]);
      index1++;
    } else {
      result.push(array2[index2]);
      index2++;
    }
  }

  // [3]
  while (index1 < array1.length) {
    result.push(array1[index1]);
    index1++;
  }
  while (index2 < array2.length) {
    result.push(array2[index2]);
    index2++;
  }

  return result;
}
```

Both the time and space complexity of this algorithm are O(a+b).

### Hash Tables

Map (C++), HashMap, Dictionary (Python), Object (JavaScript), and HashTable
are some of the names for this data structure, and
different languages have slight variations of hash tables.

Hash tables are very important all across computer science.

Hash tables allow us to store data in key-value pairs, where
the key is usually a string or a number (we can use almost any type of data as a key).

![HashTable Table](./HashTable.png)

In hash tables we use the key to find the value in memory.
Unlike arrays, the data in hash tables is not stored sequentially,
but the access time of hash tables is still O(1) - it's constant.

Hash tables use a one-way hash function where the key is the input
and the output is an address in memory; the exact hash function used
differs between languages.

![Hash Function Example](./HashFunction.png)

#### Hash Function

A hash function is again something that is used all across computer science.
A hash function is simply a function that generates a fixed-length string
that looks like random gibberish for any given input. The output is called a hash.

E.g. `5d41402abc4b2a76b9719d911017c592`

You can play around with a famous hash function, MD5, on [this website](http://www.md5.cz/).

There are some key aspects of hash functions:

- The function is one-way, meaning there is no way to know
  what the input was by looking at the output.
- For the same input the output is always going to be the same.
- If the input changes even by one bit, the output changes completely.

In hash tables, the hash function returns a memory address for a given input.
This makes accessing a block of memory by a key O(1) time.

This is extremely useful because now we can map data to a string (a name, etc.)
instead of a number like in an array.

The hash function takes some time to compute its output, and that can slow down
access through a hash table. That's why most language implementations
use a very fast hash function.

#### Hash Collisions

Hash functions are great: they generate a seemingly unique string for a given input,
and two different inputs are extremely unlikely to produce the same hash.

But when we use a hash table, the range of the hash function's output becomes limited.
So sometimes two different keys result in the same address in memory.

Different implementations deal with this in different ways. One of them is
separate chaining, which uses a linked list to resolve the collision, but it slows down the
access time from O(1) to O(n), where n is the size of the linked list.
This is what it looks like:

![Diagram of Hash Collision](./HashCollision.png)

Mostly you don't have to worry about hash collisions happening,
but it's good to know that they happen.

#### Coding Problems

##### First Recurring Element

Q. For a given array, find the first recurring element.

Example: [2,5,1,2,3,5,1,2,3] => 2

Answer:

To solve this problem we need to loop over every element of the array
and store the seen elements somewhere so that we can check if an element has
been seen before.

We will use a hash table to store the seen elements.

```js
function firstRecurringElement(input) {
  let seen = {};

  for (let i = 0; i < input.length; i++) {
    // Check if the key exists
    if (seen[input[i]]) {
      return input[i];
    }

    // Otherwise store the key with any truthy value
    seen[input[i]] = true;
  }

  // If no recurring element, return false
  return false;
}
```

The time and space complexity of this is O(n).

#### Operations on Hash Tables

**Inserting** O(1).

**Deleting** O(1).

**Lookup** O(1).

**Searching** O(1) if searching by key, otherwise O(n).

### Linked List

As the name suggests, it's a list that's linked. The best way
to explain this is with a diagram.

![Linked List Diagram](./LinkedList.png)

A linked list is a collection of Nodes; each Node has two
sections: one stores the data and the other points to the next Node in the list.

So how does this differ from an array?

In an array, all the elements are stored sequentially.
This means if we have the address of the first element of the array,
we can get the second element by just adding 1 to the address.
But in a linked list the Nodes can be stored anywhere in memory.
So to access the second Node we need to follow the pointer in the
first Node that points to the second Node, and so on.
The last Node points to nothing; that's how we know it's the last Node.

Most programming languages don't come with a linked list built in,
because linked lists are rarely used directly in applications.

#### Why Linked List?

You might be thinking: if most programming languages don't come with
a linked list and hardly anyone uses one directly in their
application, then why are we learning about this?

The reason is that while a linked list alone is not very useful
in most situations, it is a foundation for other very useful data structures
like Graphs and Trees, so it's important to know how a linked list
works if you want to learn about them.

#### Linked List vs Array

**Lookup**

In an array, accessing an element by index has O(1) time complexity.

But in a linked list, because data is not stored sequentially in memory,
we need to walk from one Node to the next to reach the desired element,
which makes lookup O(n).

**Prepend**

Adding an element at the beginning of an array has O(n) time complexity.

But in a linked list it's O(1).

**Append**

Adding an element at the end of an array has O(1) time complexity. Same for a linked list.

**Inserting/Deleting an element**

In an array, the time complexity of adding/deleting an element is O(n) in the worst case.

In a linked list it's the same.

Here is a visual representation of adding an element to a linked list:

![Visual representation of adding an element in Linked List](./AddingLinkedList.png)

If we want to add a Node/element at an index, first we need to traverse to the index.
1. Then create a new Node that will point to the Node we just found.

2. Then make the previous Node point to the newly created Node.

#### What is a pointer?

If you know C/C++ you already know what pointers are, and you can skip this section.

When we create a variable and store a value in it, it occupies a space in memory, and the location of that space has an address.

And if we have the address of a location in memory, we can change the value stored there.

In lower-level languages like C/C++ and Rust, we have a special data type called a pointer that lets us store the address of a memory location in a variable.

So we can have two variables that point to the same location in memory.

In interpreted languages like Python and JavaScript, we do not have direct access to memory addresses, but we have references.

References are like pointers, except the language manages the address for us; we work with the referenced value rather than with a raw address.

In JavaScript, if you create a variable and assign it to a different variable, it will either copy the value or create a reference.

In JavaScript the primitive data types `Boolean`, `null`, `undefined`, `String`, and `Number` always get copied:

```js
let a = 1;
let b = a; // Copy the value of a into b
b++;

console.log(a); // => 1
console.log(b); // => 2
```

JavaScript has three data types that get passed by reference: `Array`, `Function`, and `Object`.

```js
let a = { value: 1 };
let b = a;
b.value = 10;

console.log(a.value); // => 10
console.log(b.value); // => 10
```

By using an object we have two variables that point to the same location in memory.

This is how we will make one Node (an object) point to another Node (an object).
#### Implementing Linked List

First, let's implement a Node using OOP:

```js
class Node {
  constructor(value) {
    this.value = value;
    this.next = null;
  }
}
```

That's it, it's that easy to implement a Node for a linked list.

Now we need a class to store our list and provide useful methods to work with it.

```js
class LinkedList {
  constructor(firstValue) {
    if (firstValue !== undefined && firstValue !== null) {
      this.firstNode = new Node(firstValue);
      this.lastNode = this.firstNode; // Storing a reference
      this.length = 1;
    } else {
      this.firstNode = null;
      this.lastNode = null;
      this.length = 0;
    }
  }

  push(value) {
    // Add a Node to the end
  }
}
```

What we are doing here is that in the constructor we have the choice to put the first element in the list when we create a linked list.

We will always keep a reference to the first Node in our list and to the last Node. If we have only one Node then it's both the first and the last Node.

We will also keep the count of the Nodes in the `length` property.

Now let's implement the push method.
```js
class LinkedList {
  constructor(firstValue) {
    if (firstValue !== undefined && firstValue !== null) {
      this.firstNode = new Node(firstValue);
      this.lastNode = this.firstNode; // Storing a reference
      this.length = 1;
    } else {
      this.firstNode = null;
      this.lastNode = null;
      this.length = 0;
    }
  }

  push(value) {
    if (this.lastNode) {
      const newNode = new Node(value);
      this.lastNode.next = newNode;
      this.lastNode = newNode;
    } else {
      this.firstNode = new Node(value);
      this.lastNode = this.firstNode; // Storing a reference
    }
    this.length++;
  }
}
```

Because we keep a reference to the last Node of the list, adding a new Node is easy. But first we need to check if the last Node exists, because the list could be empty.

If it's empty then we do the same thing we did in the constructor. If the last Node exists, then we just create a new Node, store it in the `next` of the last Node, and make the new Node the last Node.

Now we need a way to check if our linked list is working properly, so let's make a method to print our linked list.
```js
class LinkedList {
  constructor(firstValue) {
    if (firstValue !== undefined && firstValue !== null) {
      this.firstNode = new Node(firstValue);
      this.lastNode = this.firstNode; // Storing a reference
      this.length = 1;
    } else {
      this.firstNode = null;
      this.lastNode = null;
      this.length = 0;
    }
  }

  push(value) {
    if (this.lastNode) {
      const newNode = new Node(value);
      this.lastNode.next = newNode;
      this.lastNode = newNode;
    } else {
      this.firstNode = new Node(value);
      this.lastNode = this.firstNode; // Storing a reference
    }

    this.length++;
  }

  print() {
    let currentNode = this.firstNode;

    while (currentNode) {
      console.log(currentNode.value);
      currentNode = currentNode.next;
    }
  }
}
```

Running the test code below, we can see it's working as intended.

```js
let test = new LinkedList(1);
test.push(2);
test.push(5);
test.push(7);
test.print();

// Output
// 1
// 2
// 5
// 7
```

Now let's add a method to add a value to the beginning of the list.

We'll call this `unshift` because that's what the method for adding an element to the beginning of an array is called in JavaScript.

```js
unshift(value) {
  const newNode = new Node(value);
  newNode.next = this.firstNode;
  this.firstNode = newNode;

  if (!this.lastNode) {
    this.lastNode = newNode; // The list was empty
  }

  this.length++;
}
```

What we are doing here: first we make a new Node, then we make the new Node point to the first Node in our list, then we make the new Node our first Node. If the list was empty, the new Node is also the last Node.
```js
let test = new LinkedList(1);
test.push(2);
test.push(5);
test.push(7);
test.unshift(10);
test.print();

// Output
// 10
// 1
// 2
// 5
// 7
```

Up until now, all the methods that we have defined have O(1) time complexity, except the print method.

Now we are left with the `lookup`, `insert` and `delete` operations.

All of them are similar and have O(n) time complexity.

Let's first implement a `traverse` method, because those methods will depend on it.

What the `traverse` method will do is go through the nodes one by one and return the node at the given index, with the index starting at 0.

```js
traverse(index) {
  if (index >= 0 && index < this.length) {
    let i = 0;
    let currentNode = this.firstNode;

    while (i < index && currentNode) {
      currentNode = currentNode.next;
      i++;
    }
    return currentNode;
  } else {
    return null;
  }
}
```

This method looks similar to the print method; what we are doing differently is first checking that the index is within the length, then traversing through the list and returning the node at that index.

After running this test code we'll get some interesting results.
```js
let test = new LinkedList(1);
test.push(2);
test.push(5);
test.push(7);
test.unshift(10);
console.log(test.traverse(0));
console.log(test.traverse(1));
console.log(test.traverse(2));
console.log(test.traverse(10));

// Output
// Node {
//   value: 10,
//   next: Node { value: 1, next: Node { value: 2, next: [Node] } }
// }
// Node {
//   value: 1,
//   next: Node { value: 2, next: Node { value: 5, next: [Node] } }
// }
// Node {
//   value: 2,
//   next: Node { value: 5, next: Node { value: 7, next: null } }
// }
// null
```

If we print a node we can see the rest of the list after that node, and if we try to get a node that doesn't exist we get null.

Now implementing the `lookup` method will be easy.

```js
lookup(index) {
  const node = this.traverse(index);

  if (node) {
    return node.value;
  } else {
    return null;
  }
}
```

All we need to do is return the value of the node if it exists.

Now let's do the `insert` method.

```js
insert(index, value) {
  const nodeAtIndex = this.traverse(index);
  const nodeAtPreviousIndex = this.traverse(index - 1);

  if (nodeAtIndex) {
    const newNode = new Node(value);
    newNode.next = nodeAtIndex;

    if (nodeAtPreviousIndex) {
      nodeAtPreviousIndex.next = newNode;
    } else {
      this.firstNode = newNode; // Inserting at index 0
    }

    this.length++;
  } else if (index === this.length) {
    this.push(value);
  }
}
```

Let's go step by step to see what this is doing.

First, we take two references: one for the node at the index and one for the node at the previous index.

Then we check if the node at the index exists. If it does, we create a new node and make it point to the node at the index.

Then we check if the previous node exists and make the previous node point to the new node. If there is no previous node we are inserting at index 0, so the new node becomes the first node.

If the node at the index does not exist and the index is the same as the length of the list, we use the push method.

Now it's time for the last method in our list.

```js
delete(index) {
  if (index === 0 && this.firstNode) {
    this.firstNode = this.firstNode.next; // Deleting the first Node
    this.length--;
    return;
  }

  const nodeAtNextIndex = this.traverse(index + 1);
  const nodeAtPreviousIndex = this.traverse(index - 1);

  if (nodeAtPreviousIndex) {
    nodeAtPreviousIndex.next = nodeAtNextIndex;
    this.length--;
  }
}
```

To delete a node at an index, you just need to make the previous node point to the node after the node at the given index (deleting index 0 is the special case where we simply make the second node the first node).

Because JavaScript uses a garbage collector, a node that nothing holds a reference to will automatically get deleted from memory, but in other languages you may have to manually free the memory occupied by the node.
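As a quick standalone illustration of that garbage-collection point (a minimal sketch with its own tiny `Node` class, not part of the `LinkedList` above): once no variable and no `next` pointer references a node, it becomes unreachable and the engine is free to reclaim it.

```js
class Node {
  constructor(value) {
    this.value = value;
    this.next = null;
  }
}

// Build a tiny chain: a -> b -> c
const a = new Node(1);
const b = new Node(2);
const c = new Node(3);
a.next = b;
b.next = c;

// "Delete" b by linking around it
a.next = c;

// Nothing references b through the list anymore, so the garbage
// collector is free to reclaim it; the chain is now a -> c
console.log(a.next.value); // => 3
```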
Here is the full implementation of the Linked List:

```js
class Node {
  constructor(value) {
    this.value = value;
    this.next = null;
  }
}

class LinkedList {
  constructor(firstValue) {
    if (firstValue !== undefined && firstValue !== null) {
      this.firstNode = new Node(firstValue);
      this.lastNode = this.firstNode; // Storing a reference
      this.length = 1;
    } else {
      this.firstNode = null;
      this.lastNode = null;
      this.length = 0;
    }
  }

  push(value) {
    if (this.lastNode) {
      const newNode = new Node(value);
      this.lastNode.next = newNode;
      this.lastNode = newNode;
    } else {
      this.firstNode = new Node(value);
      this.lastNode = this.firstNode; // Storing a reference
    }

    this.length++;
  }

  print() {
    let currentNode = this.firstNode;

    while (currentNode) {
      console.log(currentNode.value);
      currentNode = currentNode.next;
    }
  }

  unshift(value) {
    const newNode = new Node(value);
    newNode.next = this.firstNode;
    this.firstNode = newNode;

    if (!this.lastNode) {
      this.lastNode = newNode; // The list was empty
    }

    this.length++;
  }

  traverse(index) {
    if (index >= 0 && index < this.length) {
      let i = 0;
      let currentNode = this.firstNode;

      while (i < index && currentNode) {
        currentNode = currentNode.next;
        i++;
      }
      return currentNode;
    } else {
      return null;
    }
  }

  lookup(index) {
    const node = this.traverse(index);

    if (node) {
      return node.value;
    } else {
      return null;
    }
  }

  insert(index, value) {
    const nodeAtIndex = this.traverse(index);
    const nodeAtPreviousIndex = this.traverse(index - 1);

    if (nodeAtIndex) {
      const newNode = new Node(value);
      newNode.next = nodeAtIndex;

      if (nodeAtPreviousIndex) {
        nodeAtPreviousIndex.next = newNode;
      } else {
        this.firstNode = newNode; // Inserting at index 0
      }

      this.length++;
    } else if (index === this.length) {
      this.push(value);
    }
  }

  delete(index) {
    if (index === 0 && this.firstNode) {
      this.firstNode = this.firstNode.next; // Deleting the first Node
      this.length--;
      return;
    }

    const nodeAtNextIndex = this.traverse(index + 1);
    const nodeAtPreviousIndex = this.traverse(index - 1);

    if (nodeAtPreviousIndex) {
      nodeAtPreviousIndex.next = nodeAtNextIndex;
      this.length--;
    }
  }
}

let test = new LinkedList(1);
test.push(2);
test.push(5);
test.push(7);
test.unshift(10);
test.print();
console.log("-----------");
test.insert(3, 11);
test.print();
console.log("-----------");
test.delete(3);
test.print();
```

I know this isn't the most efficient implementation of a Linked List, because in the `insert` and `delete` methods I'm calling the `traverse` method twice, which could be avoided, but I think this is good enough to show how a Linked List works.

#### Doubly Linked List

What I just showed you is called a Singly Linked List, because in it a Node only points to a single Node.

Imagine a situation where you want to get a Node at an index close to the last Node.

In a Singly Linked List, you would have to traverse through the list from the beginning to almost the end of the list.

It would be great if we could traverse from the end of the list.

This is what a Doubly Linked List is for. In a Doubly Linked List a Node points to both the next and the previous Node, which allows us to traverse from the end of the list.

![Diagram of Doubly Linked List](./DoublyLinkedList.png)

Let's convert our Singly Linked List to a Doubly Linked List.
Inside the `Node` class you only need to add the `previous` pointer.

```js
class Node {
  constructor(value) {
    this.value = value;
    this.next = null;
    this.previous = null;
  }
}
```

Inside the `LinkedList` class, there is no need to change anything in the constructor.

Inside the `push` method you need to add one line:

```js
push(value) {
  if (this.lastNode) {
    const newNode = new Node(value);
    this.lastNode.next = newNode;
    newNode.previous = this.lastNode; // Add this
    this.lastNode = newNode;
  } else {
    this.firstNode = new Node(value);
    this.lastNode = this.firstNode;
  }

  this.length++;
}
```

We also need to do the same for the `unshift` method:

```js
unshift(value) {
  const newNode = new Node(value);
  newNode.next = this.firstNode;
  this.firstNode = newNode;

  if (this.firstNode.next) {
    this.firstNode.next.previous = this.firstNode; // Add this
  } else {
    this.lastNode = this.firstNode; // The list was empty
  }

  this.length++;
}
```

Let's skip the `traverse` method for now; we'll come back to it later.

First, let's fix the `insert` and `delete` methods.

```js
insert(index, value) {
  const nodeAtIndex = this.traverse(index);

  if (nodeAtIndex) {
    const nodeAtPreviousIndex = nodeAtIndex.previous; // Change this
    const newNode = new Node(value);
    newNode.next = nodeAtIndex;
    nodeAtIndex.previous = newNode; // Add this

    if (nodeAtPreviousIndex) {
      nodeAtPreviousIndex.next = newNode;
      newNode.previous = nodeAtPreviousIndex; // Add this
    } else {
      this.firstNode = newNode; // Inserting at index 0
    }

    this.length++;
  } else if (index === this.length) {
    this.push(value);
  }
}
```

Now we also don't need to call the traverse function twice!
```js
delete(index) {
  const nodeAtIndex = this.traverse(index);

  if (!nodeAtIndex) {
    return; // Nothing to delete
  }

  const nodeAtNextIndex = nodeAtIndex.next;
  const nodeAtPreviousIndex = nodeAtIndex.previous;

  if (nodeAtPreviousIndex) {
    nodeAtPreviousIndex.next = nodeAtNextIndex;
  } else {
    this.firstNode = nodeAtNextIndex; // Deleting the first Node
  }

  if (nodeAtNextIndex) {
    nodeAtNextIndex.previous = nodeAtPreviousIndex; // Add this
  } else {
    this.lastNode = nodeAtPreviousIndex; // Deleting the last Node
  }

  this.length--;
}
```

Now to test if this works we'll add another print method that traverses the list in reverse order (from the last Node to the first Node).

```js
printReverse() {
  let currentNode = this.lastNode;

  while (currentNode) {
    console.log(currentNode.value);
    currentNode = currentNode.previous;
  }
}
```

To test your code run this:

```js
let test = new LinkedList(1);
test.push(2);
test.push(5);
test.push(7);
test.unshift(10);
test.print();
console.log("---- Reverse ----");
test.printReverse();
console.log("---- Insert 11 at 3 ------");
test.insert(3, 11);
test.print();
console.log("---- Reverse ----");
test.printReverse();
console.log("---- Delete at 3 -----");
test.delete(3);
test.print();
console.log("---- Reverse ----");
test.printReverse();
```

The output should look like this:

```
10
1
2
5
7
---- Reverse ----
7
5
2
1
10
---- Insert 11 at 3 ------
10
1
2
11
5
7
---- Reverse ----
7
5
11
2
1
10
---- Delete at 3 -----
10
1
2
5
7
---- Reverse ----
7
5
2
1
10
```

Now that everything is working correctly we can make an improvement in our `traverse` method.

We can traverse in reverse order if the index is closer to the last item than the first item, which cuts the worst-case number of steps in half.

```js
traverse(index) {
  if (index >= 0 && index < this.length) {
    // If the index is closer to the first element then traverse from the first element
    if (index < this.length / 2) {
      let i = 0;
      let currentNode = this.firstNode;

      while (i < index && currentNode) {
        currentNode = currentNode.next;
        i++;
      }

      return currentNode;
    } else { // If the index is closer to the last element then traverse from the last element
      let i = this.length - 1;
      let currentNode = this.lastNode;

      while (i > index && currentNode) {
        currentNode = currentNode.previous;
        i--;
      }

      return currentNode;
    }
  } else {
    return null;
  }
}
```

And here is the full implementation of the Doubly Linked List:

```js
class Node {
  constructor(value) {
    this.value = value;
    this.next = null;
    this.previous = null;
  }
}

class LinkedList {
  constructor(firstValue) {
    if (firstValue !== undefined && firstValue !== null) {
      this.firstNode = new Node(firstValue);
      this.lastNode = this.firstNode;
      this.length = 1;
    } else {
      this.firstNode = null;
      this.lastNode = null;
      this.length = 0;
    }
  }

  push(value) {
    if (this.lastNode) {
      const newNode = new Node(value);
      this.lastNode.next = newNode;
      newNode.previous = this.lastNode;
      this.lastNode = newNode;
    } else {
      this.firstNode = new Node(value);
      this.lastNode = this.firstNode;
    }

    this.length++;
  }

  print() {
    let currentNode = this.firstNode;

    while (currentNode) {
      console.log(currentNode.value);
      currentNode = currentNode.next;
    }
  }

  printReverse() {
    let currentNode = this.lastNode;

    while (currentNode) {
      console.log(currentNode.value);
      currentNode = currentNode.previous;
    }
  }

  unshift(value) {
    const newNode = new Node(value);
    newNode.next = this.firstNode;
    this.firstNode = newNode;

    if (this.firstNode.next) {
      this.firstNode.next.previous = this.firstNode;
    } else {
      this.lastNode = this.firstNode; // The list was empty
    }

    this.length++;
  }

  traverse(index) {
    if (index >= 0 && index < this.length) {
      // If the index is closer to the first element then traverse from the first element
      if (index < this.length / 2) {
        let i = 0;
        let currentNode = this.firstNode;

        while (i < index && currentNode) {
          currentNode = currentNode.next;
          i++;
        }

        return currentNode;
      } else { // If the index is closer to the last element then traverse from the last element
        let i = this.length - 1;
        let currentNode = this.lastNode;

        while (i > index && currentNode) {
          currentNode = currentNode.previous;
          i--;
        }

        return currentNode;
      }
    } else {
      return null;
    }
  }

  lookup(index) {
    const node = this.traverse(index);

    if (node) {
      return node.value;
    } else {
      return null;
    }
  }

  insert(index, value) {
    const nodeAtIndex = this.traverse(index);

    if (nodeAtIndex) {
      const nodeAtPreviousIndex = nodeAtIndex.previous;
      const newNode = new Node(value);
      newNode.next = nodeAtIndex;
      nodeAtIndex.previous = newNode;

      if (nodeAtPreviousIndex) {
        nodeAtPreviousIndex.next = newNode;
        newNode.previous = nodeAtPreviousIndex;
      } else {
        this.firstNode = newNode; // Inserting at index 0
      }

      this.length++;
    } else if (index === this.length) {
      this.push(value);
    }
  }

  delete(index) {
    const nodeAtIndex = this.traverse(index);

    if (!nodeAtIndex) {
      return; // Nothing to delete
    }

    const nodeAtNextIndex = nodeAtIndex.next;
    const nodeAtPreviousIndex = nodeAtIndex.previous;

    if (nodeAtPreviousIndex) {
      nodeAtPreviousIndex.next = nodeAtNextIndex;
    } else {
      this.firstNode = nodeAtNextIndex; // Deleting the first Node
    }

    if (nodeAtNextIndex) {
      nodeAtNextIndex.previous = nodeAtPreviousIndex;
    } else {
      this.lastNode = nodeAtPreviousIndex; // Deleting the last Node
    }

    this.length--;
  }
}

let test = new LinkedList(1);
test.push(2);
test.push(5);
test.push(7);
test.unshift(10);
test.print();
console.log("---- Reverse ----");
test.printReverse();
console.log("---- Insert 11 at 3 ------");
test.insert(3, 11);
test.print();
console.log("---- Reverse ----");
test.printReverse();
console.log("---- Delete at 3 -----");
test.delete(3);
test.print();
console.log("---- Reverse ----");
test.printReverse();
```

#### Reversing a Linked List

This is the most common interview question related to linked lists.

And we are going to use a Singly Linked List for this, because with a Doubly Linked List there is no need to reverse it.

Remember that we don't want to just print the Linked List in reverse order like we did with the Doubly Linked List.

We want to rearrange the given Linked List itself into reverse order.

The algorithm to reverse a Linked List looks like this:

```
If the linked list has 1 element or fewer then do nothing.

Else store references to the first and the second Node in the
`first` and `second` variables and make `first` the LinkedList's last node.

While the second Node exists:
    Store the reference to the `next` of `second` in a `temp` variable.
    Make `second` point to `first`.
    `first` is now `second`.
    `second` is now `temp`.

Now make the `next` of the LinkedList's first Node `null` because it's now the last element.
And make `first` the LinkedList's first element.
```

Now that's confusing as hell, but before I explain what is going on here let's look at the JavaScript code.

```js
reverse() {
  if (this.length <= 1) {
    // Do nothing
  } else {
    let first = this.firstNode;
    let second = first.next;
    this.lastNode = first;

    while (second) {
      const temp = second.next;
      second.next = first;
      first = second;
      second = temp;
    }

    this.firstNode.next = null;
    this.firstNode = first;
  }
}
```

This is one of the most difficult concepts to understand. Basically, we have two pointers that point to the first and second Nodes of the list, and as we loop through the list we make the second Node point back to the first Node.

It's very hard to understand this algorithm by just reading, so I recommend you watch this [Youtube Video](https://www.youtube.com/watch?v=G0_I-ZF0S38).

### Stacks and Queues

Stacks and Queues are very similar: both are linear data structures. Linear data structures let us traverse (go through elements sequentially), and only one data element can be directly reached.

#### Stacks

A stack is a type of data structure where data elements are stacked on top of each other like a stack of plates. And you can only touch the top plate (when inserting, removing, or peeking).

To get to the bottom plate you have to remove the top plates one by one.

And if you want to add a data element you can only put it on the top of the stack.
This is called LIFO (Last In First Out) because the last one to get inserted into the stack (the top plate) will be the first one to get out.

![Stack animation](./Stack.gif)

The act of inserting an element into the stack is called Push, and the act of removing an element is called Pop.

Programming languages use the stack data structure to keep track of function calls; it's known as the Call Stack.

When a function gets called it gets pushed onto the Call Stack and starts to execute. When another function inside that function gets called, the inner function gets pushed onto the stack and popped when it's done executing.

![Call Stack animation](./CallStack.gif)

The Call Stack has limited space, and it can overflow if you use a recursive function that never ends, causing the program to crash with a Stack Overflow error.

Another good example of a stack is browser history ( ͡° ͜ʖ ͡°).

When we visit a site it gets pushed to our history, and when we press the back button it gets popped.

##### Stack Operations

**Lookup** O(n)

**Push** O(1) Push adds an element on top of the last element.

**Pop** O(1) Pop removes the top element.

**Peek** O(1) Peek reads the top element without removing it.

#### Queues

The Queue data structure is like a line of people waiting to go inside a theater.

The first person in the line will go inside the theater first, and the last person in the line will be the last one to go inside.

![Queue animation](./Queue.jpg)

It's the opposite of a Stack. In a Stack the first plate to get in is the last plate to get out.

But in a Queue the first person to go in is the first one to go out, which is called FIFO (First In First Out).
Queues are useful when you are making an app where you need a waiting list.

##### Queue Operations

**Lookup** O(n) if you follow the queue's constraints (you can only reach an element by dequeuing everything in front of it).

**Enqueue** O(1) Enqueue adds an element at the back of the queue, so the new element becomes the last element.

**Dequeue** O(1) Dequeue removes the first element, and the second element becomes the first element.

**Peek** O(1) Peek reads the first element without removing it.

#### Stack Vs Queue

Stacks are used in areas like undo/redo, browser history, syntax parsing, and virtual machines.

Queues are used for CPU task scheduling, handling of interrupts, and waiting lists in an app.

![Stack vs Queue diagram](./StackVsQueue.png)

#### Implementing Stack and Queues

There is no need to create another class for a Stack, because you can just use an array: a Stack and an Array behave the same if you only use the push and pop operations and read only the last element of the array.

```js
// Stack using an array
let myStack = [1, 2, 3, 4, 5];
myStack[myStack.length - 1]; // Peek
myStack.push(6); // In some languages it's called append
myStack.pop(); // Pop
```

We can also implement a stack using a Linked List, but in most Stack-related interview questions you will be given an array.

A Queue, on the other hand, is like a Linked List if, when removing an element, you remove the first element; when adding an element, you add it to the end of the list; and you read only the first element of the list.
```js
let myQueue = new LinkedList(); // The linked list from the previous section

myQueue.lookup(0); // Peek
myQueue.push(1);   // Enqueue
myQueue.delete(0); // Dequeue
```

#### Implement Queue using Stacks

This is the most commonly asked question related to Stacks and Queues.

A queue is FIFO (first-in-first-out) but a stack is LIFO (last-in-first-out). This means the newest element must end up at the bottom of the stack. To do that we need to utilize two stacks.

```js
class Queue {
  constructor() {
    this.main = [];
    this.temp = [];
  }

  enqueue() {}
  dequeue() {}
  peek() {}
}
```

Because the first element to go in needs to stay at the top of our main stack, we pop and push all the elements from the main stack to the temp stack, then push the new element onto the temp stack. After that we pop and push all the elements from the temp stack back to the main stack, which leaves the new element at the bottom.

```js
enqueue(element) {
  while (this.main.length > 0) {
    this.temp.push(this.main.pop());
  }

  this.temp.push(element);

  while (this.temp.length > 0) {
    this.main.push(this.temp.pop());
  }
}
```

This makes the enqueue method O(n) instead of O(1), but this is a bit of a silly interview question anyway, so we don't care.
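For what it's worth, the O(n) cost can be moved around: a common variant (a sketch below with names of my own choosing, not the implementation used in this section) pushes onto an input stack and only transfers elements to an output stack when the output stack runs empty. Each element is moved at most once, so enqueue and dequeue are both amortized O(1).

```js
class AmortizedQueue {
  constructor() {
    this.input = [];  // New elements land here
    this.output = []; // Elements leave from here in FIFO order
  }

  enqueue(element) {
    this.input.push(element); // Always O(1)
  }

  // Refill the output stack only when it's empty, so each element
  // is transferred at most once over its lifetime
  refill() {
    if (this.output.length === 0) {
      while (this.input.length > 0) {
        this.output.push(this.input.pop());
      }
    }
  }

  dequeue() {
    this.refill();
    return this.output.pop();
  }

  peek() {
    this.refill();
    return this.output[this.output.length - 1];
  }
}

const q = new AmortizedQueue();
q.enqueue('a');
q.enqueue('b');
q.enqueue('c');
console.log(q.dequeue()); // => 'a'
console.log(q.peek());    // => 'b'
```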
Here is the full implementation with dequeue and peek:

```js
class Queue {
  constructor() {
    this.main = [];
    this.temp = [];
  }

  enqueue(element) {
    while (this.main.length > 0) {
      this.temp.push(this.main.pop());
    }

    this.temp.push(element);

    while (this.temp.length > 0) {
      this.main.push(this.temp.pop());
    }
  }

  dequeue() {
    return this.main.pop();
  }

  peek() {
    return this.main[this.main.length - 1];
  }
}

const myQueue = new Queue();
console.log(myQueue.peek());
myQueue.enqueue('Roy');
myQueue.enqueue('Dishant');
myQueue.enqueue('Burhan');
console.log(myQueue.peek());
myQueue.dequeue();
myQueue.dequeue();
myQueue.dequeue();
console.log(myQueue.peek());
```

### Trees

The tree data structure builds on the same idea as a Linked List (nodes connected by pointers), and it is called a "tree" because it looks like an upside-down tree.

![Tree Diagram](./Tree.webp)

Here we have a root node that every other node descends from. A node can only have one parent node but can have any number of child nodes, so we have a unidirectional parent-child relationship.

The tree data structure is important because you work with trees every day.

The files and folders on your disk are stored in a tree structure.

And if you have written HTML you may know that it also represents a tree structure of HTML elements.

**Terms in Tree:**

- **Root**: The root of a tree is the topmost node of the tree, the one that has no parent node. There is only one root node in every tree.

- **Edge**: An edge acts as a link between a parent node and a child node.
- **Leaf**: A node that has no children is known as a leaf node. It is a last node of the tree, and there can be multiple leaf nodes in a tree.

- **Depth of a Node**: The depth of a node is the distance from the root node to that particular node.

- **Height of a Node**: The height of a node is the distance from that node to the deepest node beneath it.

- **Height of a Tree**: The height of the tree is the maximum height of any node, i.e. the height of the root.

#### Why use Trees?

1. One reason to use trees might be that you want to store information that naturally forms a hierarchy. For example, the file system on a computer:

![File System diagram](./FileSystem.jpg)

2. Trees (with some ordering, e.g. a BST) provide moderate access/search (quicker than Linked Lists, slower than sorted Arrays).

3. Trees provide moderate insertion/deletion (quicker than Arrays, slower than Unordered Linked Lists).

4. Like Linked Lists and unlike Arrays, Trees don't have an upper limit on the number of nodes, as nodes are linked using pointers.

#### Main applications of Trees:

- Manipulate hierarchical data.
- Make information easy to search.
- Manipulate sorted lists of data.
- Form of multi-stage decision-making.

#### Binary Tree

A Binary Tree is a type of Tree Data Structure where a parent node can have 0, 1, or 2 children.

Since each element in a binary tree can have only 2 children, we typically name them the left and right child.
![Binary Tree Diagram](./BinaryTree.jpeg)

#### Operations on Binary Tree

**Inserting** O(n)

**Removing** O(n)

**Searching** O(n)

#### Types of Binary Tree

##### Full Binary Tree:

A full binary tree is a special type of binary tree in which every parent/internal node has either two or no children. It is also known as a proper binary tree.

![Full Binary Tree Diagram](./FullBinaryTree.png)

##### Complete Binary Tree:

A Binary Tree is a Complete Binary Tree if all the levels are completely filled except possibly the last level, and the last level has all keys as far left as possible.

A complete binary tree is like a full binary tree, with a few differences:

- Every level except possibly the last must be completely filled.
- All the leaf elements in the last level must lean towards the left.
- The last leaf element might not have a right sibling, i.e. a complete binary tree doesn't have to be a full binary tree.

![Complete Binary Tree Diagram](./CompleteBinaryTree.png)

##### Perfect Binary Tree:

A Binary Tree is a Perfect Binary Tree when all the internal nodes have two children and all leaf nodes are at the same level.

![Perfect Binary Tree Diagram](./PerfectBinaryTree.png)

##### Balanced Tree:

A binary tree is balanced if the height of the tree is O(log n) where n is the number of nodes.

![Balanced Binary Tree Diagram](./BalancedBinaryTree.png)

##### Degenerate (or Pathological) Binary Tree:

A Binary Tree is a Degenerate Binary Tree when every parent node has only one child node.

This is basically a Linked List.
![DegenerateBinaryTree](./DegenerateBinaryTree.png)

#### Binary Search Tree

Binary Search Trees are really good at searching because they are sorted.

For example, take a look at this Tree:

![Diagram of Binary Search Tree](./BinarySearchTree.jpeg)

Here the left child of each node is smaller than its parent node and the right child of each node is bigger than its parent node.

This makes searching very easy, because at each node we know which direction we need to go to find our node. And this is how we avoid checking through all the nodes.

##### Operations on Binary Search Trees

**Lookup** O(log n)

**Insert** O(log n)

**Delete** O(log n)

(All assuming the tree is balanced; more on that below.)

##### Why O(log n)?

In computer science, we use log with base 2 instead of base 10.

And if we put the number of nodes of a perfect Binary Tree, plus one, into the log2 function, we get the height of the Binary Tree.

You could look at it like this:

log2(n + 1) = h

Where n is the number of nodes and h is the height.

log2(1 + 1) = 1

log2(3 + 1) = 2

log2(7 + 1) = 3

log2(15 + 1) = 4

log2(31 + 1) = 5

And in a Binary Search Tree the number of steps to get to the desired node is at most the height of the Tree, because at each node we know which way we need to go.

So the Big O becomes O(log(n+1)), and after dropping the constant it becomes O(log n).

#### Balanced vs Unbalanced Binary Tree

Binary trees come in several flavors. A balanced binary tree is one in which no leaf nodes are 'too far' from the root.

For example, one definition of balanced could require that all leaf nodes have a depth that differs by at most 1.
An unbalanced binary tree is one that is not balanced.

Here is what a Balanced Binary Tree looks like:

![Balanced Binary Tree Diagram](./BalancedBinaryTree.png)

In the diagram below, on the left side we have Balanced Binary Trees, and on the right side we have Unbalanced Binary Trees.

The problem with an Unbalanced Binary Tree is that the Big O of access time in an Unbalanced Binary Tree is O(n), whereas in a Balanced Binary Tree the access time is O(log n).

That is because an unbalanced tree built from sorted data is effectively the same as a linked list.

![Balanced vs Unbalanced Binary Tree Diagram](./BalancedVsUnbalancedBinaryTree.png)

So keeping a Binary Tree balanced is important.

#### Implementing Binary Search Tree

After so much theory, now it's time to code our first Tree.

Just like the Linked List, we need a class for our Node.

```js
class Node {
  constructor(value) {
    this.left = null;
    this.right = null;
    this.value = value;
  }
}
```

And a class for the Tree:

```js
class BST {
  constructor() {
    this.root = null;
  }

  insert() {}
  lookup() {}
  remove() {}
}
```

BST stands for Binary Search Tree, in case you are wondering.

In this class we'll have three methods: `insert`, `lookup` and `remove`.
Let's implement `insert` first:

```js
insert(value) {
  const newNode = new Node(value);

  // If the tree is empty then put the new node at the root
  if (this.root === null) {
    this.root = newNode;
  } else {
    let currentNode = this.root;

    // Traverse through the nodes
    while(true) {
      // Left
      if (value < currentNode.value) {
        // If an empty left spot is found, put the new node there
        if (!currentNode.left) {
          currentNode.left = newNode;
          break;
        } else {
          // If a left node exists, move to the left node
          currentNode = currentNode.left;
        }
      } else { // Right
        // If an empty right spot is found, put the new node there
        if (!currentNode.right) {
          currentNode.right = newNode;
          break;
        } else {
          // If a right node exists, move to the right node
          currentNode = currentNode.right;
        }
      }
    }
  }
}
```

First we check if the root node is null; if it is, the tree is empty, so we put the new node at the root.

Otherwise we traverse the tree: at each node we check which way to go (a bigger value goes to the right and a smaller value to the left), and if the next spot is empty we put the new node there, else we move to that node.

You can test this out by running this:

```js
let tree = new BST();

tree.insert(5);
tree.insert(2);
tree.insert(18);
tree.insert(-4);
tree.insert(3);

console.log(JSON.stringify(tree.root))
```

The output will be a JSON string that you can put in any online JSON tree viewer.
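Another way to check that the tree really keeps its values sorted is an in-order traversal: visit the left subtree, then the node, then the right subtree. Here is a sketch (`inOrder` is a helper I made up, not a method the section defines; it works on any node with `value`, `left`, and `right` properties, like the `Node` class above):

```js
// In-order traversal: left subtree, then the node, then the right
// subtree. On a BST this yields the values in ascending order.
function inOrder(node, result = []) {
  if (node === null) {
    return result;
  }

  inOrder(node.left, result);
  result.push(node.value);
  inOrder(node.right, result);

  return result;
}
```

Running it on the root of the tree built above should print the inserted values in sorted order.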
The `lookup` method is very similar to `insert`:

```js
lookup(value) {
  if (this.root === null) {
    return null;
  }

  let currentNode = this.root;

  while (currentNode.value !== value) {
    if (value > currentNode.value) {
      if (currentNode.right === null) {
        return null;
      } else {
        currentNode = currentNode.right;
      }
    } else {
      if (currentNode.left === null) {
        return null;
      } else {
        currentNode = currentNode.left;
      }
    }
  }

  return currentNode;
}
```

At each step in the loop, we check if the current node is equal to the given value; if not, we check which side we need to go next. If the next node is null we return null because we couldn't find the value.

Now the last method on our list is `remove`, which is quite involved to implement, so I'm going to skip it because hardly anyone will ask you to implement a full Binary Search Tree in an interview.

I think this is enough for you to understand how a Binary Search Tree works, but if you want to know how to implement remove for a Binary Search Tree you can watch [this video](https://www.youtube.com/watch?v=wMyWHO9F1OM).

#### AVL Trees and Red Black Trees

These are more advanced, self-balancing types of Trees, but they are not that important to learn unless you are preparing for a MANGA (Meta, Apple, Netflix, Google and Amazon) interview.

So I'll put some resources here if you want to learn about them.
AVL Trees: https://medium.com/basecs/the-little-avl-tree-that-could-86a3cae410c7

Red Black Trees: https://medium.com/basecs/painting-nodes-black-with-red-black-trees-60eacb2be9a5

#### Heap and Binary Heap

A Heap is a special Tree-based data structure in which the tree is a complete binary tree. The name may sound familiar, but it is not related to Heap memory.

Generally, Heaps can be of two types:

**Max-Heap**: In a Max-Heap the key present at the root node must be the greatest among the keys present at all of its children. The same property must be recursively true for all sub-trees in that Binary Tree.

**Min-Heap**: In a Min-Heap the key present at the root node must be the minimum among the keys present at all of its children. The same property must be recursively true for all sub-trees in that Binary Tree.

A Heap is also called a Binary Heap; they are the same thing.

![Max Heap and Min Heap Diagram](./MaxHeapAndMinHeap.png)

#### Operations on Binary Heap

**Lookup** O(n), but we rarely search a Binary Heap.

For a Max-Heap, lookup of the largest number is O(1).

For a Min-Heap, lookup of the smallest number is O(1).

**Insert** O(1) on average, O(log n) in the worst case.

**Delete** (removing the top element) O(log n).

When we work with a Binary Heap we generally only look at the topmost element: reading it is O(1), and pushing or popping is O(log n).

#### Why use Binary Heap

A Binary Heap is useful if you want a collection that keeps its largest (or smallest) item instantly available as you delete or insert items. So it's useful when you want to get the data with either the highest priority (Max-Heap) or the lowest priority (Min-Heap).
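To make the costs above concrete, here is a minimal min-heap sketch backed by an array (an illustration I wrote for this section; the `MinHeap` class and its method names are my own). `push` bubbles the new value up, and `pop` moves the last item to the top and sinks it down:

```js
class MinHeap {
  constructor() {
    this.items = []; // items[0] is always the smallest element
  }

  peek() { // O(1)
    return this.items[0];
  }

  push(value) { // O(log n): bubble the new value up
    this.items.push(value);
    let i = this.items.length - 1;
    while (i > 0) {
      const parent = Math.floor((i - 1) / 2);
      if (this.items[parent] <= this.items[i]) break;
      [this.items[parent], this.items[i]] = [this.items[i], this.items[parent]];
      i = parent;
    }
  }

  pop() { // O(log n): move the last item to the top and sink it down
    const top = this.items[0];
    const last = this.items.pop();
    if (this.items.length > 0) {
      this.items[0] = last;
      let i = 0;
      while (true) {
        const left = 2 * i + 1;
        const right = 2 * i + 2;
        let smallest = i;
        if (left < this.items.length && this.items[left] < this.items[smallest]) smallest = left;
        if (right < this.items.length && this.items[right] < this.items[smallest]) smallest = right;
        if (smallest === i) break;
        [this.items[smallest], this.items[i]] = [this.items[i], this.items[smallest]];
        i = smallest;
      }
    }
    return top;
  }
}
```

Repeatedly popping from this heap yields the elements in ascending order; flipping the comparisons turns it into a max-heap.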
#### Implementing Binary Heap

Generally, when you are given a question in an interview that requires a Binary Heap, you do not have to implement the Binary Heap from scratch. Like Arrays and Hashmaps, a Binary Heap comes built into most programming languages.

But if you want to understand how a Binary Heap works by implementing it, you can read this [article](https://blog.bitsrc.io/implementing-heaps-in-javascript-c3fbf1cb2e65).

Because JavaScript doesn't have a built-in Binary Heap, you can just copy and paste an implementation of a Binary Heap when solving a Leetcode problem. And if you are in an interview, ask the interviewer whether they want you to implement the Binary Heap or whether you can use an npm library or CTRL+C, CTRL+V.

Most likely you will never have to implement a Binary Heap from scratch. You just need to know how to use one.

#### How to solve a problem with Binary Heap

Let's solve a Leetcode problem that can be solved efficiently with a Binary Heap.

Q. Kth Largest Element in a Stream

Design a class to find the kth largest element in a stream. Note that it is the kth largest element in the sorted order, not the kth distinct element.

Implement the KthLargest class:

`KthLargest(int k, int[] nums)` Initializes the object with the integer k and the stream of integers nums.

`int add(int val)` Appends the integer val to the stream and returns the element representing the kth largest element in the stream.
**Example 1**:

```
Input

["KthLargest", "add", "add", "add", "add", "add"]
[[3, [4, 5, 8, 2]], [3], [5], [10], [9], [4]]

Output

[null, 4, 5, 5, 8, 8]

Explanation

KthLargest kthLargest = new KthLargest(3, [4, 5, 8, 2]);
kthLargest.add(3); // return 4
kthLargest.add(5); // return 5
kthLargest.add(10); // return 5
kthLargest.add(9); // return 8
kthLargest.add(4); // return 8
```

**Answer**:

The most naive approach to solving this problem is to keep the array sorted: sort it with the built-in `sort` method when we first initialize the object.

Then finding the kth largest element is just an index lookup, but inserting each new element into its correct place takes `n` time, where `n` is the size of the array.

But we can solve this problem with a better time complexity using a Heap.

You may think a max-heap is the best fit, because inserting an element into a max-heap takes log n time and we will always have the biggest number at the top, so finding the kth element would take k pops.

So the overall time complexity of the `add` function would be O(k log n).

But if we use a min-heap we can make this even more time-efficient: O(log k).

This is possible because we only want the kth largest element, so we store only the k largest elements in a min-heap; that way the kth largest number is at the top. When we add a new number to the min-heap we pop the smallest number in the heap, so the kth largest element will always stay at the top.

I'm going to use Python for this because JavaScript does not come with a Heap data structure.
The code:

```python
from typing import List
import heapq # Min-heap implementation in Python

class KthLargest:
    def __init__(self, k: int, nums: List[int]): # O(n log n) in the worst case
        self.min_heap, self.k = nums, k

        heapq.heapify(self.min_heap) # O(n)

        # Remove the numbers that are not among the k largest
        while len(self.min_heap) > self.k:
            heapq.heappop(self.min_heap) # O(log n) per pop

    def add(self, val: int) -> int: # O(log k + log k) = O(log k)
        heapq.heappush(self.min_heap, val) # Add the new number, O(log k)

        # Pop the minimum number if the size of the heap is bigger than k
        if len(self.min_heap) > self.k:
            heapq.heappop(self.min_heap) # O(log k)

        return self.min_heap[0]
```

#### Priority Queue

The most common use case of the heap data structure is using it as a Priority Queue.

In a normal queue, the first one to go in is the first one to come out.

But there are situations where you want to prioritize the elements of a queue, so that the element with the highest priority comes out first.

We can use a max-heap in that case.

#### Solve a problem using Priority Queue or max-heap

Q. Last Stone Weight

You are given an array of integers `stones` where `stones[i]` is the weight of the ith stone.

We are playing a game with the stones. On each turn, we choose the **heaviest two stones** and smash them together. Suppose the heaviest two stones have weights `x` and `y` with `x <= y`.

The result of this smash is:

- If `x == y`, both stones are destroyed, and
- If `x != y`, the stone of weight `x` is destroyed, and the stone of weight `y` has new weight `y - x`.

At the end of the game, there is **at most one** stone left.

Return the weight of the last remaining stone.
If there are no stones left, return `0`.

**Example 1**:

```
Input: stones = [2,7,4,1,8,1]

Output: 1

Explanation:

We combine 7 and 8 to get 1 so the array converts to [2,4,1,1,1] then,
we combine 2 and 4 to get 2 so the array converts to [2,1,1,1] then,
we combine 2 and 1 to get 1 so the array converts to [1,1,1] then,
we combine 1 and 1 to get 0 so the array converts to [1] and that's
the value of the last stone.
```

**Example 2**:

```
Input: stones = [1]

Output: 1
```

**Answer**:

Since Python doesn't have a max-heap, we'll store all the numbers as negative and convert them back to positive when we need them. Other than that, everything we have to do is explained well in the question.

```python
import heapq
from typing import List

class Solution:
    def lastStoneWeight(self, stones: List[int]) -> int:
        self.max_heap = []

        # Store the weights as negative
        # because heapq is a min-heap
        for stone in stones:
            heapq.heappush(self.max_heap, -stone)

        while len(self.max_heap) > 1:
            first = heapq.heappop(self.max_heap)
            second = heapq.heappop(self.max_heap)

            if first != second:
                new_stone = first - second
                heapq.heappush(self.max_heap, new_stone)

        if len(self.max_heap) == 0:
            return 0

        return -self.max_heap[0]
```

The overall time complexity of this is O(n log n).

And the space complexity of this is O(n).

#### Trie

A trie (pronounced "try"), or prefix tree, is a tree data structure used to efficiently store and retrieve keys in a dataset of strings.

There are various applications of this data structure, such as autocomplete and spellchecker.
![Trie Diagram](./Trie.png)

In the above example we have the words "Apple", "Apply", "App", "Manga", "Mango", "Ray", "Rat", and "Rats".

The letters marked in blue show where a word ends.

Unlike other Trees, the root node in a Trie doesn't hold any value.

### Implementing Trie

The implementation of our Trie will have three methods:

- `insert(word)` will insert a word into the trie.
- `search(word)` will return `true` if the word exists and `false` otherwise.
- `startsWith(prefix)` will return `true` if a word exists that starts with the given prefix, otherwise `false`.

A node in a Trie does three things: storing a value, pointing to other nodes, and marking whether the node is the end of a word.

```js
class TrieNode {
  constructor(value) {
    this.value = value;
    this.nexts = {};
    this.end = false;
  }
}
```

We are using a hashmap because it makes finding the next node faster than using an array: O(n) for an array and O(1) for a hashmap.

The Trie class:

```js
class Trie {
  constructor() {
    this.root = new TrieNode(null);
  }

  insert(word) {}
  search(word) {}
  startsWith(word) {}
}
```

Let's first implement the `insert` method.
```js
insert(word) {
  let currentNode = this.root;

  for (let i = 0; i < word.length; i++) {
    const letter = word[i];

    if (currentNode.nexts[letter]) {
      // The letter already exists, move to that node
      currentNode = currentNode.nexts[letter];
    } else {
      // The letter doesn't exist yet, create a new node for it
      const newNode = new TrieNode(letter);
      currentNode.nexts[letter] = newNode;
      currentNode = newNode;
    }

    // If we are at the last letter, mark this node as the end of a word
    if (i === word.length - 1) {
      currentNode.end = true;
    }
  }
}
```

This may look complicated, but it is a very simple algorithm.

We start from the root node and loop over every letter of the word.

If the current letter exists in the current node, we move to that node and then to the next letter.

If the current letter does not exist in the current node, we create a new node, add it to the `nexts` of our current node, then move to the new node and the next letter.

If we are at the end of the word, we mark the current node as the end of a word.

This takes O(n) time, where n is the length of the word.

The `search` method:

```js
search(word) {
  let currentNode = this.root;

  for (let i = 0; i < word.length; i++) {
    const letter = word[i];

    if (!currentNode.nexts[letter]) {
      return false;
    }

    currentNode = currentNode.nexts[letter];
  }

  return currentNode.end;
}
```

We traverse the same way as we did in the `insert` method, and if a letter does not exist we return false.

After the loop we simply return the `end` property of the current node.
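To see the two methods working together, here they are condensed into one runnable snippet (the same `TrieNode` and `Trie` classes from above, put together so the example is self-contained):

```js
class TrieNode {
  constructor(value) {
    this.value = value;
    this.nexts = {};
    this.end = false;
  }
}

class Trie {
  constructor() {
    this.root = new TrieNode(null);
  }

  insert(word) {
    let currentNode = this.root;
    for (const letter of word) {
      if (!currentNode.nexts[letter]) {
        currentNode.nexts[letter] = new TrieNode(letter);
      }
      currentNode = currentNode.nexts[letter];
    }
    currentNode.end = true; // Mark the last node as the end of a word
  }

  search(word) {
    let currentNode = this.root;
    for (const letter of word) {
      if (!currentNode.nexts[letter]) return false;
      currentNode = currentNode.nexts[letter];
    }
    return currentNode.end;
  }
}

const trie = new Trie();
trie.insert('apple');
trie.insert('app');

console.log(trie.search('app'));   // true: 'app' was inserted
console.log(trie.search('appl'));  // false: a prefix, but not a whole word
console.log(trie.search('apply')); // false: never inserted
```

Note how 'appl' exists as a path in the trie but is not marked as the end of a word, which is exactly what the `end` flag is for.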
The `startsWith` method is almost the same as the `search` method except for the last line.

```js
startsWith(prefix) {
  let currentNode = this.root;

  for (let i = 0; i < prefix.length; i++) {
    const letter = prefix[i];

    if (!currentNode.nexts[letter]) {
      return false;
    }

    currentNode = currentNode.nexts[letter];
  }

  return true;
}
```

If we manage to traverse the whole prefix, then at least one inserted word starts with it, so we return `true`.

### Graphs

A Graph is a collection of interconnected nodes, kind of like the Internet.

![Graph Diagram](./Graph.png)

The lines connecting two nodes are called Edges, and the nodes are sometimes referred to as Vertices.

Graphs are useful for modeling real-world relationships. Facebook uses them to track users' relationships with others, Amazon uses them for their recommendation engine, and Google uses them in Google Maps, where each node is a city and the edges are roads.

#### Types of Graphs

##### Directed and Undirected

Pretty straightforward from looking at this picture:

![Directed vs UnDirected Graph Diagram](./DirectedVsUnDirectedGraph.png)

In a Directed Graph, we can only go one way along an edge, from one node to another.

But in an Undirected Graph, we can go forward and backward along an edge.

Facebook uses an Undirected Graph, because if `person1` is a friend of `person2`, then `person2` is also a friend of `person1`; it's a bidirectional relationship.

Twitter uses a Directed Graph, because if `person1` follows `person2` that doesn't necessarily mean `person2` follows `person1`.

Note that a Directed Graph can still express a two-way relationship by having an edge in each direction.
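In code, graphs are commonly represented as an adjacency list: a map from each node to the list of its neighbors. A small sketch of the Facebook/Twitter examples above (the `friends` and `follows` objects are my own illustration):

```js
// Undirected graph: every edge is recorded in both directions
const friends = {
  person1: ['person2'],
  person2: ['person1'],
};

// Directed graph: an edge only goes one way
const follows = {
  person1: ['person2'], // person1 follows person2
  person2: [],          // person2 does not follow anyone back
};

console.log(friends.person2.includes('person1')); // true
console.log(follows.person2.includes('person1')); // false
```

The asymmetry in `follows` is exactly what makes it a directed graph.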
##### Weighted and Unweighted

In a Weighted Graph, edges have weights on them, which can represent anything.

In Google Maps, for example, the weight on an edge is the distance between two cities, which is how Google Maps can calculate the shortest path between two locations.

![Weighted and UnWeighted Graph Diagram](./WeightedVsUnWeightedGraph.png)

##### Cyclic vs Acyclic

A Graph is called a Cyclic Graph if it contains at least one cycle.

![Cyclic And Acyclic Graph Diagram](./CyclicVsAcyclicGraph.png)

## Algorithms

A word that programmers use when they don't want to explain what they did.

There are tons of algorithms out there and it's impossible to learn all of them.

So in this section, we'll go through the most common algorithms that you need to know to solve Leetcode problems and interview questions.

It's important for programmers to know which algorithm they should use to solve a given problem in the most efficient way.

### Recursion

Recursion isn't an algorithm but an approach to solving recursive problems.

It is one of the most difficult concepts for beginners to learn, but once you learn it, it can make solving some problems easy.

#### When to use Recursion

Recursion is mostly used when you can divide a problem into smaller repetitive subproblems.

For example, say you wrote a function that prints all the files and folders in a directory, and you want to do the same for all the folders inside that directory.

You can just call the same function inside itself for each of those folders.
```js
// Pseudo code
function printFilesAndFolders(directory) {
  for (file of filesInDirectory) {
    print(file)
  }

  for (folder of foldersInDirectory) {
    print(folder)
  }

  for (folder of foldersInDirectory) {
    printFilesAndFolders(folder) // Recursion
  }
}
```

#### Stack Overflow

Recursion can make solving recursive problems straightforward, but it has one caveat.

Whenever a function gets called it gets pushed onto the call stack and takes up some space, and the stack has a fixed amount of space.

If a recursive function never stops calling itself, it will cause a Stack Overflow. In interpreted languages, you'll get an error saying you've exceeded the maximum call stack size.

In compiled languages, the program will typically crash (for example, with a segmentation fault).

Since the calls also take space in memory, the space complexity of a recursive function is O(n), where n is the depth of the recursive calls.

So your recursive function has to stop at some point.

#### Anatomy of a Recursive Function

Every recursive function needs to have something called a Base Case. The Base Case stops the recursive function from calling itself forever.

The part where the function calls itself is called the Recursive Case.

#### Factorial

The most commonly seen function when it comes to recursion.

In mathematics, the Factorial of a positive integer is the product of all positive integers less than or equal to it, denoted by that integer followed by an exclamation point.

For example: 5! = 5 * 4 * 3 * 2 * 1

n! = n * (n-1) * (n-2) * (n-3) * ...

We can write this as: n! = n * (n-1)!

There is recursion in this equation.
Let's implement this in code:

```js
function factorial(n) {
  return n * factorial(n-1);
}
```

This looks exactly like the equation. But there is one thing missing: the base case.

We want it to stop when it reaches 1, or when the given number is smaller than 1, because with factorials we only deal with positive numbers.

```js
function factorial(n) {
  if (n <= 1) { // The base case
    return 1;
  }

  return n * factorial(n-1);
}
```

The space and time complexity of this function is O(n).

#### Fibonacci

The Fibonacci sequence is another famous programming question about recursion.

In the Fibonacci sequence, each number is the sum of the previous two numbers.

`1, 1, 2, 3, 5, 8, 13, 21, ...`

Now the question is: how will you find the nth number in the Fibonacci sequence?

You can say that the nth number in the Fibonacci sequence is equal to the (n-1)th number plus the (n-2)th number.

```
fib(n) = fib(n-1) + fib(n-2)
```

Here is the code:

```js
function fib(n) {
  if (n < 2) { // Base case
    return n;
  }

  return fib(n-1) + fib(n-2);
}
```

The base case takes care of 0 and 1.

What do you think the Big O of this is?

It's actually O(2^n).

Here is why:

The function calls itself twice, so the number of calls roughly doubles at each level as we go down.

Here is a visual representation of this function:

![Fibonacci Graph](./Fibonacci.png)

As we can see in the graph, we are calculating the same values more than once. This is a really inefficient way to solve this problem.
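One common fix is memoization: cache each result the first time it is computed, so every value is only calculated once and the time drops to O(n). A sketch of that idea (the `fibMemo` name and `memo` parameter are my additions, not part of the original code):

```js
function fibMemo(n, memo = {}) {
  if (n < 2) { // Same base case as before
    return n;
  }

  // Return the cached value if we've already computed fib(n)
  if (memo[n] !== undefined) {
    return memo[n];
  }

  memo[n] = fibMemo(n - 1, memo) + fibMemo(n - 2, memo);
  return memo[n];
}
```

This still uses O(n) space for the cache and the call stack, though.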
The normal iterative approach is the best way to solve this problem, with O(n) time complexity and O(1) space complexity.

```js
function fib(n) {
  if (n < 2) {
    return n;
  }

  let num1 = 0;
  let num2 = 1;

  for (let i = 2; i <= n; i++) {
    const total = num1 + num2;
    num1 = num2;
    num2 = total;
  }

  return num2;
}
```

#### Recursive vs Iterative approach

Anything that can be implemented recursively can also be implemented iteratively (with a loop).

So when should you use recursion and when should you use iteration?

Use recursion when you don't have to worry about memory and the recursive approach is more straightforward.

Use iteration when you have to worry about memory, or when the iterative approach is more efficient than the recursive approach.

### Backtracking

Backtracking is a technique based on recursion that solves a problem step by step, and if a step doesn't lead to a solution, it goes back and tries a different step until it reaches the solution.

### Solving sudoku using Backtracking

Given a partially filled 9×9 2D array `grid[9][9]`, the goal is to assign digits (from 1 to 9) to the empty cells so that every row, column, and 3×3 subgrid contains exactly one instance of each digit from 1 to 9.

First we need a helper function that will give us the possible numbers for a cell.
```js
function getValidNumbers(input, x, y) {
  // Track numbers 0-9 as an array of booleans
  let validNumbers = new Array(10).fill(true);

  // Check row
  for (let xi = 0; xi < input.length; xi++) {
    validNumbers[input[xi][y]] = false;
  }

  // Check column
  for (let yi = 0; yi < input.length; yi++) {
    validNumbers[input[x][yi]] = false;
  }

  // Check 3x3 grid
  let gridX = Math.floor(x/3) * 3;
  let gridY = Math.floor(y/3) * 3;

  for (let xi = gridX; xi < gridX + 3; xi++) {
    for (let yi = gridY; yi < gridY + 3; yi++) {
      validNumbers[input[xi][yi]] = false;
    }
  }

  let result = [];

  for (let i = 1; i < validNumbers.length; i++) { // Start at 1 because 0 means empty
    if (validNumbers[i]) {
      result.push(i);
    }
  }

  return result;
}
```

Then our main solver function:

```js
function solveSudoku(input) {
  // Loop over every cell
  for (let x = 0; x < input.length; x++) {
    for (let y = 0; y < input.length; y++) {
      if (input[x][y] === 0) {
        // First we need to know what numbers we can put in this cell
        const validNumbers = getValidNumbers(input, x, y);

        // Try each valid number
        for (let validNumber of validNumbers) {
          input[x][y] = validNumber;

          // This will show how this algorithm works step by step
          console.log({x, y, validNumbers, validNumber});
          console.table(input);

          let res = solveSudoku(input); // This will now fill the next empty cell

          if (!res) { // If the result is false then that means we need to go back
            input[x][y] = 0;
            continue;
          } else {
            return res;
          }
        }

        // If we reached here that means we don't have any numbers to put
        // in this cell, so we return false to indicate we need to go one step back
        return false;
      }
    }
  }

  // If we reached here that means the board is finished, so return the board
  return input;
}
```

You can try running this code with the given input:

```js
const input = [
  [3, 0, 6, 5, 0, 8, 4, 0, 0],
  [5, 2, 0, 0, 0, 0, 0, 0, 0],
  [0, 8, 7, 0, 0, 0, 0, 3, 1],
  [0, 0, 3, 0, 1, 0, 0, 8, 0],
  [9, 0, 0, 8, 6, 3, 0, 0, 5],
  [0, 5, 0, 0, 9, 0, 6, 0, 0],
  [1, 3, 0, 0, 0, 0, 2, 5, 0],
  [0, 0, 0, 0, 0, 0, 0, 7, 4],
  [0, 0, 5, 2, 0, 6, 3, 0, 0]
];
```

It's hard to understand what this is doing just by reading the code, so I suggest you watch [this video](https://www.youtube.com/watch?v=G_UYXzGuqvM) to understand it better.

### Permutations

Calculating the permutations of a given collection is another great use case of the backtracking algorithm.

```js
const input = [1,2,3];

function permutations(input, results=[], progress=[], used={}) {
  for (let num of input) { // Loop over items
    if (!used[num]) { // If the item is not used
      used[num] = true; // Mark the item as used
      progress.push(num); // Push the item to our progress
      permutations(input, results, progress, used); // Then move to the next index in our progress
      used[num] = false; // After we are done with the item mark it as not used
      progress.pop(); // And remove the number from progress too so we can add the next one
    }
  }

  // Once the progress has the same length as the input
  // that's one of our results
  if (progress.length === input.length) {
    results.push([...progress]);
  }
}

let results = [];

permutations(input, results);

console.log(results);
```
Once again it's hard to explain what this
is doing in text, so I recommend you watch [this video](https://www.youtube.com/watch?v=s7AvT7cGdSo).

### Two pointers

Two pointers is a simple and effective technique that is typically used for searching for pairs in an array.

Usually you have one pointer pointing at the left-most side of the array and one pointing at the right-most side of the array.

And we move them towards each other.

For example, if you want to know whether a string is a palindrome or not, this technique is a great way to find out.

For example `"madamimadam"` is a valid palindrome.

```js
function isPalindrome(string) {
  let left = 0;
  let right = string.length - 1;

  while (left < right) {
    if (string[left] !== string[right]) {
      return false;
    }

    // Move the pointers towards each other
    left++;
    right--;
  }

  return true;
}
```

### Divide and Conquer

Divide and Conquer is a strategy for solving a large problem by recursively dividing it into sub-problems and solving them until we get to the solution.

The best example of this is searching for an item in a sorted list.

```js
const list = [1,3,6,7,9,15,20];
```

If we want to find an item in a sorted array we can start from the middle and check whether the middle element is smaller than, greater than, or equal to the target.

Because the array is sorted we know the target can only be in the left half if the middle is bigger, or the right half if the middle is smaller.

Then go to the middle of that half (either the left side or the right side).

Continue doing this until you reach the target.

![Binary Search](BinarySearch.png)

This is called binary search, and its Big O is O(log n).
### Sorting

A sorting algorithm is used to rearrange a given array or list of elements according to a comparison operator on the elements.

There are tons of sorting algorithms and all of them have their pros and cons, so it's important to learn when to use each one.

#### Basic Sorting Algorithms

Bubble Sort, Selection Sort, and Insertion Sort are very basic sorting algorithms. They are easy to implement but have poor time complexity, so they are mostly used to introduce beginners to sorting algorithms.

#### Bubble Sort

Bubble sort is a very basic sorting algorithm. It's usually used as a learning tool to teach about sorting algorithms instead of being used in real software, because it is very slow.

![Bubble Sort Animation](./BubbleSort.gif)

In Bubble sort, you check two elements at a time, swap them into sorted order, move forward, and repeat.

You need to loop over the same array multiple times until all the values are in sorted order.

This is called bubble sort because we are bubbling the higher values to the right side and the lower values to the left side.

The Time Complexity of this algorithm is O(n*n), i.e. O(n^2).

The code:

```js
function bubbleSort(list) {
  for (let i = 0; i < list.length; i++) {
    // After each pass the biggest remaining value has bubbled to the end
    for (let j = 0; j < list.length - 1 - i; j++) {
      // If the left side is bigger than the right side
      if (list[j] > list[j+1]) {
        // Swap
        let temp = list[j];
        list[j] = list[j+1];
        list[j+1] = temp;
      }
    }
  }

  return list;
}
```

#### Selection Sort

Selection sort works by looking for the smallest element in the list and swapping it with the left-most unsorted element in the list.
![Selection Sort Animation](./SelectionSort.webp)

This algorithm also takes O(n^2) time.

The code:

```js
function selectionSort(list) {
  for (let i = 0; i < list.length; i++) {
    let smallest = list[i];
    let smallestIndex = i;

    for (let j = i+1; j < list.length; j++) {
      if (list[j] < smallest) {
        smallest = list[j];
        smallestIndex = j;
      }
    }

    // Swap
    let temp = list[i];
    list[i] = smallest;
    list[smallestIndex] = temp;
  }

  return list;
}
```

#### Insertion Sort

Insertion sort is similar to Bubble sort and Selection sort: not very fast in general, but there are cases where it can outperform the other two with a time complexity of O(n).

This happens when the list is almost sorted.

And this is how it works:

![Insertion Sort Animation](./InsertionSort.gif)

The code:

```js
function insertionSort(list) {
  for (let i = 1; i < list.length; i++) {
    // Everything before index i is already sorted
    const current = list[i];
    let j = i - 1;

    // Shift every element bigger than the current one a step to the right
    while (j >= 0 && list[j] > current) {
      list[j+1] = list[j];
      j--;
    }

    // Place the current element in its correct position
    list[j+1] = current;
  }

  return list;
}
```

The best-case scenario for Insertion sort is a small list or a nearly sorted list. The time complexity for the best case is O(n).

#### Merge Sort

Merge Sort is a sorting algorithm that is actually used in real software, unlike the previous ones. The time complexity of Merge Sort is O(n log n) because it uses the Divide and Conquer method.

In the first stage we divide the list of items into halves recursively until we are left with single items. Then in the second stage we merge them back together in sorted order.

![Merge Sort Diagram](./MergeSort.png)

To implement this we need to write two functions: one will divide the list recursively and the other will merge the halves in sorted order.

```js
function mergeSort(list) {
  if (list.length <= 1) {
    return list;
  }

  const middle = Math.floor(list.length / 2);

  // Use slice (not splice) so we don't mutate the list while dividing it
  const left = list.slice(0, middle);
  const right = list.slice(middle);

  return merge(
    mergeSort(left),
    mergeSort(right)
  );
}

function merge(left, right) {
  const result = [];
  let leftIndex = 0;
  let rightIndex = 0;

  while (leftIndex < left.length && rightIndex < right.length) {
    if (left[leftIndex] < right[rightIndex]) {
      result.push(left[leftIndex]);
      leftIndex++;
    } else {
      result.push(right[rightIndex]);
      rightIndex++;
    }
  }

  // Put the remaining elements in the result
  return result.concat(left.slice(leftIndex)).concat(right.slice(rightIndex));
}
```

#### Quick Sort

Quick sort sounds like it's the quickest sorting algorithm, but it is about as fast as merge sort but
uses a different technique than merge sort.

Explaining Quick sort in text is hard, so I recommend you watch [this video](https://www.youtube.com/watch?v=XE4VP_8Y0BU).

#### Which sort to use?

You will rarely use Bubble sort and Selection sort in real software. They are mostly used for teaching sorting.

Use Insertion sort when you have a small list or an almost sorted list.

Merge Sort, whether it's the best case or the worst case, will only take O(n log n) time. But it takes O(n) extra space.

Quick Sort in the best case takes O(n log n) time and in the worst case takes O(n^2) time. But it only takes O(log n) space on average (for the recursion stack).

So use Merge Sort when you are worried about the worst case and you have plenty of memory.

And use Quick Sort when you don't have to worry about the worst case or have less memory.

#### Heap Sort

Heapsort uses the [Heap](#heap-and-binary-heap) data structure to sort an array. Building the heap from the array takes O(n) time, and the full sort takes O(n log n) time.

Heaps themselves are mostly useful when we repeatedly need the lowest or highest value in a list.

#### Radix Sort and Counting Sort

Mathematically it's impossible to beat O(n log n) time with a comparison-based sort, because we have to compare elements with each other.

Radix Sort and Counting Sort beat O(n log n) time by not comparing elements with each other.

Both of them take advantage of how the data is structured, and they only work with numbers.

#### Counting Sort

We only use this when the range of values (biggest number - smallest number + 1) is smaller than or equal to the length of the input array.

The Big O of this is O(n + k) where n is the length of the input and k is the range of numbers.
For example, for this list n is 7 and k is 7 (the values range from 1 to 7):

```js
const list = [6,4,6,2,4,7,1];
```

The algorithm of Counting sort goes like this:

Loop through the entire list and keep track of the frequencies of the numbers in a hash map or an array. Then loop over the range, check which numbers were present and how many times, and push them into a new array.

The code:

```js
function countingSort(list) {
  const seen = {};
  const res = [];
  let highest = list[0];
  let lowest = list[0];

  for (let num of list) {
    if (num > highest) {
      highest = num;
    }

    if (num < lowest) {
      lowest = num;
    }

    if (!(num in seen)) {
      seen[num] = 1;
    } else {
      seen[num]++;
    }
  }

  // This takes O(n + k) time even though there is a nested loop
  for (let i = lowest; i <= highest; i++) {
    if (i in seen) {
      for (let j = 0; j < seen[i]; j++) {
        res.push(i);
      }
    }
  }

  return res;
}
```

#### Radix Sort

Radix Sort is hard for me to explain, so I recommend you read [this article](https://www.doabledanny.com/radix-sort-in-javascript).

You can skip this one because it's not that important to know.

#### Bucket Sort

This is also not that important, so I'll just point to [another article](https://www.programiz.com/dsa/bucket-sort) for this.

### Searching

#### Linear Search

Linear Search is a method of finding a value by checking each item sequentially in a data structure.

It's the most basic way to search, and its time complexity is O(n).

One example of this is getting the index of a value in an array.
```js
function getIndex(list, value) {
  for (let i = 0; i < list.length; i++) {
    if (list[i] === value) {
      return i;
    }
  }

  return -1; // If the value is not found
}
```

Linear Search is the slowest way to search for an item, so avoid using it when you have a large amount of data.

But for a small list this is not a bad method.

### Binary Search

We can use Binary Search when the data is in sorted order; it can be an array or a [Binary Tree](#binary-tree).

For example, if we have an array `[1,3,6,7,9,15,20,21,25,26,29,30]` and we want the index of `9`, we can first check the middle value.

If the middle value is bigger than our target, go to the middle of the left half.

If that middle is smaller, go to the middle of its right half, and keep going until you get to the target.

![Binary Search](BinarySearch.png)

The code:

```js
function binarySearch(list, value) {
  let begin = 0;
  let end = list.length - 1;

  while (begin <= end) {
    let mid = Math.floor((begin + end) / 2);

    if (list[mid] > value) {
      end = mid - 1;
    } else if (list[mid] < value) {
      begin = mid + 1;
    } else {
      return mid;
    }
  }

  return -1;
}
```

The time complexity of Binary Search is O(log n).

### BFS and DFS

If we have an unsorted Tree or a Graph, we have to go over every item in the data structure to find an item.

We have two methods for searching in these two data structures:
- Breadth First Search
- Depth First Search

Both have the same time complexity in the worst case: O(n). But they have their pros and cons.
#### Breadth First Search

The way BFS works is we first check the first level (the root node), then each element of the second level, then each element of the third level, and so on.

It looks like this:

![BFS Animation](./BFS.gif)

#### Depth First Search

In DFS we explore the full depth of one child node first, then move to the next child node.

It looks like this:

![DFS Animation](./DFS.gif)

#### BFS vs DFS

**Data structure**:

BFS uses a Queue.

DFS uses a Stack (either an explicit one or the call stack via recursion).

**Favourable Condition**:

BFS is better when the target is closer to the source.

DFS is better when the target is far from the source.

**Space Complexity**:

In the case of Balanced Binary Trees BFS takes more memory than DFS, because the space usage of BFS is O(w) where w is the maximum width of the tree, while for DFS it is O(h) where h is the height of the tree.

**Pros**:

BFS is good at finding the shortest path from the source.

DFS is used when we want to know whether the target exists, and it usually takes less memory.

To learn more about their space and time complexity read [this](https://stackoverflow.com/questions/9844193/what-is-the-time-and-space-complexity-of-a-breadth-first-and-depth-first-tree-tr)

#### When to use what?

**BFS**:

If the target is not far from the source.

If the tree is deeper than it is wide.

Determining the shortest path from the source to the target.

**DFS**:

If the target is far from the source.

If the tree is wider than it is deep.

Determining if the target exists.
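Since DFS is described above as using a Stack, here is a minimal iterative sketch with an explicit stack (it assumes the same `value`/`childrens` node shape used in the BFS implementation):

```js
// Iterative DFS: an explicit stack replaces the recursion
function depthFirstSearch(root, target) {
  const stack = [root];

  while (stack.length > 0) {
    const node = stack.pop(); // Take the most recently added node (go deep first)

    if (node.value === target) {
      return node;
    }

    // Push the children; the last one pushed is explored next
    for (const child of node.childrens) {
      stack.push(child);
    }
  }

  return null; // The target does not exist in the tree
}
```

Swapping the stack for a queue (and `pop` for `shift`) turns this same loop into BFS, which is the cleanest way to see the difference between the two.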
#### Implementing BFS

Assuming the node structure looks like this:

```ts
Node {
  value: Number,
  childrens: [Node]
}
```

```js
function breadthFirstSearch(root, target) {
  let currentNodes = [root]; // Store all the nodes of the current level

  while (currentNodes.length > 0) { // While we have nodes in the current level
    let nextNodes = []; // Store the nodes of the next level

    for (let node of currentNodes) { // Look at each node in the current level
      if (node.value === target) { // If it's the target then return the node
        return node;
      } else { // If it's not the target then push its children to the next level
        for (let child of node.childrens) {
          nextNodes.push(child);
        }
      }
    }

    currentNodes = nextNodes; // Move to the next level
  }

  return null;
}
```

You can test this code on this tree:

```js
//       5
//     / | \
//    6  7  8
//   /| / \ |\
//  1 2 3 9 4 10

const testTree = {
  value: 5,
  childrens: [
    {
      value: 6,
      childrens: [
        {
          value: 1,
          childrens: []
        },
        {
          value: 2,
          childrens: []
        }
      ],
    },
    {
      value: 7,
      childrens: [
        {
          value: 3,
          childrens: []
        },
        {
          value: 9,
          childrens: []
        }
      ]
    },
    {
      value: 8,
      childrens: [
        {
          value: 4,
          childrens: []
        },
        {
          value: 10,
          childrens: []
        }
      ]
    }
  ]
}

console.log(breadthFirstSearch(testTree, 3)); // Returns the node with value 3
console.log(breadthFirstSearch(testTree, 11)); // Returns null
```

We can see how the BFS function traverses the tree
by converting it to an array.

```js
function breadthFirstTraverse(root) {
  let currentNodes = [root];
  let res = [];

  while (currentNodes.length > 0) {
    let nextNodes = [];

    for (let node of currentNodes) {
      res.push(node.value);

      for (let child of node.childrens) {
        nextNodes.push(child);
      }
    }

    currentNodes = nextNodes;
  }

  return res;
}
```

Try running this code on the test tree and you should see this:

```js
[5, 6, 7, 8, 1, 2, 3, 9, 4, 10]
```

#### Implementing DFS

There are three ways to implement DFS: PreOrder, InOrder, and PostOrder.

These are the three orders in which we can traverse the tree, and they look like this:

```js
//       5
//     / | \
//    6  7  8
//   /| / \ |\
//  1 2 3 9 4 10

InOrder = [1,6,2,5,3,7,9,4,8,10]
PreOrder = [5,6,1,2,7,3,9,8,4,10]
PostOrder = [1,2,6,3,9,7,4,10,8,5]
```

**InOrder**:

InOrder is used to get the values of the nodes in non-decreasing order in a [BST](#binary-search-tree).

InOrder is only used in Binary Trees.

So we'll use a Binary Tree Node for this:

```ts
Node {
  value: Number,
  left: Node,
  right: Node
}
```

```js
function DFSInOrder(source) {
  if (!source) {
    return [];
  }

  let res = [];

  const left = DFSInOrder(source.left);

  for (let value of left) {
    res.push(value);
  }

  res.push(source.value);

  const right = DFSInOrder(source.right);

  for (let value of right) {
    res.push(value);
  }

  return res;
}
```

Try running it with this Binary Tree.
```js
//        10
//       /  \
//      5    12
//     / \   | \
//    3   7 11  13
//   /|   |\
//  1 4   6 8

const testBinaryTree = {
  value: 10,
  left: {
    value: 5,
    left: {
      value: 3,
      left: {
        value: 1,
        left: null,
        right: null
      },
      right: {
        value: 4,
        left: null,
        right: null
      }
    },
    right: {
      value: 7,
      left: {
        value: 6,
        left: null,
        right: null,
      },
      right: {
        value: 8,
        left: null,
        right: null
      }
    }
  },
  right: {
    value: 12,
    left: {
      value: 11,
      left: null,
      right: null
    },
    right: {
      value: 13,
      left: null,
      right: null
    }
  }
}
```

It should give this result:

```js
[1, 3, 4, 5, 6, 7, 8, 10, 11, 12, 13]
```

**PreOrder**:

Used to create a copy of a tree. For example, if you want to create a replica of a tree, put the nodes in an array with a pre-order traversal.

Then perform an Insert operation on a new tree for each value in the array. You will end up with a copy of your original tree.

Since we can use any type of tree or graph, I'll use a generic node.
```ts
Node {
  value: Number,
  childrens: [Node]
}
```

```js
function DFSPreOrder(source) {
  if (!source) {
    return [];
  }

  let res = [source.value];

  for (let child of source.childrens) {
    const next = DFSPreOrder(child);

    for (let value of next) {
      res.push(value);
    }
  }

  return res;
}
```

Example Tree:

```js
//       5
//     / | \
//    6  7  8
//   /| / \ |\
//  1 2 3 9 4 10

const testTree = {
  value: 5,
  childrens: [
    {
      value: 6,
      childrens: [
        {
          value: 1,
          childrens: []
        },
        {
          value: 2,
          childrens: []
        }
      ],
    },
    {
      value: 7,
      childrens: [
        {
          value: 3,
          childrens: []
        },
        {
          value: 9,
          childrens: []
        }
      ]
    },
    {
      value: 8,
      childrens: [
        {
          value: 4,
          childrens: []
        },
        {
          value: 10,
          childrens: []
        }
      ]
    }
  ]
}
```

Result: `[5, 6, 1, 2, 7, 3, 9, 8, 4, 10]`

**PostOrder**:

Used to delete a tree from leaf to root.
```js
function DFSPostOrder(source) {
  if (!source) {
    return [];
  }

  let res = [];

  for (let child of source.childrens) {
    const next = DFSPostOrder(child);

    for (let value of next) {
      res.push(value);
    }
  }

  res.push(source.value);

  return res;
}
```

Run this code with the example tree from the previous one and you should get this result: `[1, 2, 6, 3, 9, 7, 4, 10, 8, 5]`

### Dynamic Programming

Dynamic Programming is a technique used when a problem can be divided into smaller sub-problems, and the solutions to those sub-problems are cached in case the sub-problems repeat.

#### Memoization

Caching, or Memoization, is a big part of Dynamic Programming.

Memoization is the act of caching the result of a function so that, if the function is computationally expensive and gets called multiple times with the same input, it can return the cached result immediately.

Here is a simple function that takes an input and returns the log of the input number.

```js
function getLog(n) {
  return Math.log(n);
}
```

Log is an expensive function, but not expensive enough to slow down a modern computer.

If we want to calculate the log of all the numbers in a list, and that list is big and also has lots of duplicates, we can use memoization to speed up the calculation.

```js
const cache = {};

function getLog(n) {
  if (!(n in cache)) { // `in` also handles cached results that are falsy, like 0
    cache[n] = Math.log(n);
  }

  return cache[n];
}
```

Now even if one number is used multiple times, the calculation will only run one time.
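We can sanity-check the cache by counting how many times the expensive computation actually runs (the counter here is only for demonstration):

```js
let computations = 0;
const cache = {};

function getLog(n) {
  if (!(n in cache)) {
    computations++; // Track how often we actually compute
    cache[n] = Math.log(n);
  }
  return cache[n];
}

// Six lookups, but only three distinct values
const inputs = [2, 8, 2, 2, 8, 16];
for (const n of inputs) {
  getLog(n);
}

console.log(computations); // 3: one computation per distinct input
```

Every repeated input is answered straight from the cache, which is the whole point of memoization.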
We can improve this code a little more. Since most people hate using global variables, we should move this cache variable into a scope.

```js
const getLog = (() => {
  const cache = {};

  return (n) => {
    if (!(n in cache)) {
      cache[n] = Math.log(n);
    }

    return cache[n];
  }
})();
```

Using the closure feature of JavaScript, the cache variable is now scoped, and the function behaves the same.

#### Fibonacci Sequence

The Fibonacci Sequence is an ideal example to solve using Dynamic Programming.

In a Fibonacci Sequence, every number is equal to the sum of the previous two numbers.

```
0, 1, 1, 2, 3, 5, 8, 13, 21, 34 . . .
```
Now take a function fib(n) that returns the nth number of the Fibonacci series. We can come up with a recursive formula for this:

```
fib(n) = fib(n-1) + fib(n-2)
```

As you can see, we have divided the problem into two sub-problems. Now let's write the code for it.

```js
function fib(n) {
  if (n < 2) {
    return n;
  }

  return fib(n-1) + fib(n-2);
}
```

It was easy to implement, but we have one problem here.

![Fibonacci Visual](Fib.png)

In this visualization, you can see that we are solving some of the same sub-problems more than once, so we need to add memoization to it.
```js
const cache = {};

function fib(n) {
  if (n in cache) {
    return cache[n];
  }

  let res;

  if (n < 2) {
    res = n;
  } else {
    res = fib(n-1) + fib(n-2);
  }

  cache[n] = res;

  return res;
}
```

Without the cache, we were solving the same sub-problems over and over, so the time complexity was O(2^n).

With memoization, we solve every sub-problem only once, so the time complexity becomes O(n), which is a huge improvement.

## Additional Important Topics

- Sliding Window
- Greedy Algorithms
- Intervals

## Coding Problems

Start here if you are new -> [neetcode.io](https://neetcode.io)

To prepare for interviews at top tech companies -> [Grind75](https://www.techinterviewhandbook.org/grind75)