├── LICENSE
├── README.md
└── algorithm
    ├── dynamic_programming
    │   ├── 1_maximum_sub_array_kadane_algo.py
    │   └── 2_maximum_sub_matrix.py
    ├── graph
    │   └── graph_searching
    │       └── breadth_first_search.py
    └── recursion
        ├── 206_Reverse_Linked_List.py
        ├── 344_Reverse_String.py
        ├── backtracking
        │   ├── 46_Permutations.py
        │   └── 52_N-Queens_II.py
        └── divide-and-conquer
            └── 240_Search_a_2D_Matrix_II.py
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2021 CodeXplore
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Algorithm Design and Implementation
2 |
3 | - Resource: [The Algorithms](https://the-algorithms.com/)
4 |
5 | # Table of contents
- [Table of contents](#table-of-contents)
- [Part A - Algorithm](#part-a-algorithm)
  - [1. Recursion](#1-recursion)
    - [1.1. Recursion Examples](#11-recursion-examples)
    - [1.2. Principle of Recursion](#12-principle-of-recursion)
    - [1.3. Complexity Analysis](#13-complexity-analysis)
    - [1.4. Theme of Recursion](#14-theme-of-recursion)
      - [1.4.1. Divide and Conquer](#141-divide-and-conquer)
      - [1.4.2. Backtracking](#142-backtracking)
  - [2. Sorting](#2-sorting)
    - [2.1. Insertion Sort](#21-insertion-sort)
    - [2.2. Merge Sort](#22-merge-sort)
    - [2.3. Quick Sort](#23-quick-sort)
  - [3. Analysis of Algorithm](#3-analysis-of-algorithm)
    - [3.1. Types of Analysis](#31-types-of-analysis)
    - [3.2. Asymptotic Order of Growth](#32-asymptotic-order-of-growth)
    - [3.3. Recurrence](#33-recurrence)
  - [4. BFS and DFS](#4-bfs-and-dfs)
    - [4.1. BFS](#41-bfs)
    - [4.2. DFS](#42-dfs)
- [Part B - Data Structure](#part-b-data-structure)
  - [1. What is Data Structure](#1-what-is-data-structure)
    - [1.1. Abstract Data Type](#11-abstract-data-type)
  - [2. Arrays](#2-arrays)
  - [3. Linked List](#3-linked-list)
    - [3.1. Advantage vs Disadvantage of Linked List](#31-advantage-vs-disadvantage-of-linked-list)
    - [3.2. Other Linked List Variants](#32-other-linked-list-variants)
    - [3.3. Arrays vs Linked List](#33-arrays-vs-linked-list)
  - [4. Stacks, Queues and Deques](#4-stacks-queues-and-deques)
    - [4.1. ADT Design for Stacks, Queues, and Deques](#41-adt-design-for-stacks-queues-and-deques)
    - [4.2. Stacks Queues and Deques Implementation using Linked List](#42-stacks-queues-and-deques-implementation-using-linked-list)
    - [4.3. Queue](#43-queue)
  - [5. Dynamic Arrays](#5-dynamic-arrays)
  - [6. Tree](#6-tree)
    - [6.1. Binary Tree](#61-binary-tree)
    - [6.2. Traverse A Tree](#62-traverse-a-tree)
    - [6.3. Level-order Traversal](#63-level-order-traversal)
    - [6.4. Binary Search Tree](#64-binary-search-tree)
  - [7. Heaps](#7-heaps)
    - [7.1. Heap Representation](#71-heap-representation)
    - [7.2. Heap Operations](#72-heap-operations)
      - [7.2.1. Heapify](#721-heapify)
      - [7.2.2. Insertion](#722-insertion)
      - [7.2.3. Pop](#723-pop)
    - [7.3. Heap Advantage](#73-heap-advantage)
  - [8. Graph](#8-graph)
    - [8.1. Graph Property](#81-graph-property)
    - [8.2. Graph Search Algorithms](#82-graph-search-algorithms)
54 |
55 |
56 | # Part A. Algorithm
57 | # 1. Recursion
58 | ## 1.1. Recursion Examples
59 | - General Example: Factorial, Fibonacci Number, Euclidean Algorithm, Extended Euclidean Algorithm
60 | - Reverse Example: [Reverse String](./algorithm/recursion/344_Reverse_String.py), [Reverse the Linked List](./algorithm/recursion/206_Reverse_Linked_List.py)
61 |
62 | ## 1.2. Principle of Recursion
- `Base case`: a terminating scenario that does not use recursion to produce an answer.
- `Recurrence Relation`: the relationship between the result of a problem and the results of its subproblems, which reduces all other cases towards the base case.
- **Tail Recursion**: a recursion where the recursive call is the final instruction in the recursive function, and there is only one recursive call in the function.
  - The benefit of tail recursion is that it avoids the accumulation of stack overhead during the recursive calls.
  - Since in tail recursion we know that, as soon as we return from the recursive call, we are going to immediately return as well, we can skip the entire chain of recursive calls returning and return straight to the original caller. That means we don't need a call stack at all for the recursive calls, which saves space.
  - Different from the *memoization* technique, **tail recursion could optimize the space complexity of the algorithm** by eliminating the stack overhead incurred by recursion. More importantly, **with tail recursion, one could avoid the problem of stack overflow that often comes with recursion**. (Note that this saving requires the compiler/interpreter to perform tail-call optimization; CPython does not, so in Python a tail-recursive function still consumes stack frames.)
  - Another advantage of tail recursion is that oftentimes it is easier to read and understand than non-tail recursion, because there is no post-call dependency (i.e. the recursive call is the final action in the function).
```Python
def sum_non_tail_recursion(ls):
    if len(ls) == 0:
        return 0

    # not a tail recursion, because it does some computation after the recursive call returns
    return ls[0] + sum_non_tail_recursion(ls[1:])

def sum_tail_recursion(ls, acc=0):
    if len(ls) == 0:
        return acc

    # this is a tail recursion, because the final instruction is the recursive call
    return sum_tail_recursion(ls[1:], ls[0] + acc)
```
85 |
- **Memoization**: a common technique to avoid the duplicate-calculation problem that can happen with recursion, by caching results that have already been computed.
  - The cache `memo` can be passed into the function as an input param:
```Python
def climbStairs(n, memo={1: 1, 2: 2}):
    if n not in memo:  # T(n) = T(n-1) + T(n-2)
        memo[n] = climbStairs(n - 1, memo) + climbStairs(n - 2, memo)
    return memo[n]
```
- The cache can also be initialized in the class's `__init__`:
```Python
class Solution(object):
    def __init__(self):
        self.cache = {0: 1, 1: 1}

    def climbStairs(self, n):
        if n not in self.cache:
            self.cache[n] = self.climbStairs(n - 1) + self.climbStairs(n - 2)
        return self.cache[n]
```
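- Alternatively, Python's standard library provides memoization through the `functools.lru_cache` decorator; a minimal sketch (not part of the original solutions):
```Python
from functools import lru_cache

@lru_cache(maxsize=None)  # cache the result for every distinct n
def climb_stairs(n):
    if n <= 2:
        return n
    return climb_stairs(n - 1) + climb_stairs(n - 2)

print(climb_stairs(10))  # 89
```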
104 |
105 | ## 1.3. Complexity Analysis
106 | ### 1.3.1. Time Complexity
- **Execution tree**: a tree used to denote the execution flow of a recursive function in particular.
  - Each node in the tree represents an invocation of the recursive function. Therefore, the total number of nodes in the tree corresponds to the number of recursive calls during the execution.
  - *For example*: the execution tree for the calculation of Fibonacci number f(4).
    - In a full binary tree with n levels, the total number of nodes is `2^n - 1`. So the time complexity of f(n) is `O(2^n)`.
    - With **memoization**, we save the result of the Fibonacci number for each index `n`. The recursion to calculate `f(n)` is then invoked only `n-1` times to calculate all the precedent numbers that it depends on. So the time complexity of f(n) is `O(n)`.
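- To see the gap concretely, here is a small sketch (a hypothetical `fib`/`fib_memo` pair, not from the original notes) that counts the recursive invocations:
```Python
def fib(n, counter):
    counter[0] += 1  # count this invocation
    if n < 2:
        return n
    return fib(n - 1, counter) + fib(n - 2, counter)

def fib_memo(n, counter, memo={0: 0, 1: 1}):
    counter[0] += 1  # count this invocation; memo is a persistent default-arg cache
    if n not in memo:
        memo[n] = fib_memo(n - 1, counter, memo) + fib_memo(n - 2, counter, memo)
    return memo[n]

c1, c2 = [0], [0]
print(fib(20, c1), c1[0])       # 6765, 21891 calls -> O(2^n) growth
print(fib_memo(20, c2), c2[0])  # 6765, 39 calls    -> O(n) growth
```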
112 |
113 |
114 |
115 | ### 1.3.2. Space Complexity
116 | - **Two parts of the space consumption** that one should bear in mind when calculating the space complexity of a recursive algorithm: `recursion related` and `non-recursion related space`.
117 | #### Recursion Related Space
118 | - The recursion related space refers to the memory cost that is incurred directly by the recursion, i.e. the stack to keep track of recursive function calls.
119 | - It is due to recursion-related space consumption that sometimes one might run into a situation called `stack overflow`, where the stack allocated for a program reaches its maximum space limit and the program crashes. Therefore, when designing a recursive algorithm, one should carefully check if there is a possibility of stack overflow when the input scales up.
120 |
121 | #### Non-Recursion Related Space
122 | - As suggested by the name, the non-recursion related space refers to the memory space that is not directly related to recursion, which typically includes the space (normally in heap) that is allocated for the global variables.
123 |
124 | ## 1.4. Theme of Recursion
Themes of recursion: paradigms that are often applied together with recursion to solve problems:
126 | 1. Divide and Conquer
127 | 2. Backtracking
128 |
129 | ### 1.4.1. Divide and Conquer
130 | - **Divide-and-conquer algorithm**: works by recursively breaking the problem down into two or more subproblems of the same or related type, until these subproblems become simple enough to be solved directly. Then one combines the results of subproblems to form the final solution.
131 | - *Example*: [Merge Sort](#22-merge-sort), Quick Sort, [Search a 2D Matrix II](./algorithm/recursion/divide-and-conquer/240_Search_a_2D_Matrix_II.py)
132 | 
133 |
- **Divide-and-conquer Template**: the essential part of divide and conquer is to figure out the `recurrence relationship` between the subproblems and the original problem, which subsequently defines the functions `divide()` and `combine()`.
135 |
136 |
```Python
def divide_and_conquer( S ):
    # (1). Divide the problem into a set of subproblems.
    [S1, S2, ... Sn] = divide(S)

    # (2). Solve the subproblems recursively,
    #      obtain the results of subproblems as [R1, R2, ... Rn].
    rets = [divide_and_conquer(Si) for Si in [S1, S2, ... Sn]]
    [R1, R2, ... Rn] = rets

    # (3). Combine the results from the subproblems
    #      and return the combined result.
    return combine([R1, R2, ... Rn])
```
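- As a minimal, runnable instance of this template (a toy illustration, not from the original notes): finding the maximum of a list by splitting it in half.
```Python
def dc_max(S):
    # base case: a single element is its own maximum
    if len(S) == 1:
        return S[0]
    # (1) divide: split S into two halves
    mid = len(S) // 2
    # (2) conquer: solve each half recursively
    left_max, right_max = dc_max(S[:mid]), dc_max(S[mid:])
    # (3) combine: the answer is the larger of the two sub-results
    return max(left_max, right_max)

print(dc_max([3, -1, 7, 2, 9, 4]))  # 9
```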
151 | ### 1.4.2. Backtracking
- The procedure of backtracking can be seen as a tree traversal:
  - Starting from the root node, one sets out to search for solutions that are located at the leaf nodes.
  - Each intermediate node represents a partial candidate solution that could potentially lead us to a final valid solution.
  - At each node, we fan out to move one step further towards the final solution, i.e. we iterate over the child nodes of the current node.
  - Once we determine that a certain node cannot possibly lead to a valid solution, we abandon the current node and backtrack to its parent node to explore other possibilities.
- It is due to this backtracking behaviour that backtracking algorithms are often much faster than brute-force search, since they eliminate many unnecessary explorations.
158 | - Example: [N-Queens II](./algorithm/recursion/backtracking/52_N-Queens_II.py), Robot Room Cleaner, Sudoku Solver, [Permutations](./algorithm/recursion/backtracking/46_Permutations.py)
159 | 
160 |
161 | - **Backtracking Template**
```Python
def backtrack(candidate):
    if find_solution(candidate):
        output(candidate)
        return

    # iterate all possible candidates
    for next_candidate in list_of_candidates:
        if is_valid(next_candidate):
            # try this partial candidate solution
            place(next_candidate)
            # given the candidate, explore further
            backtrack(next_candidate)
            # backtrack
            remove(next_candidate)
```
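- A minimal runnable instance of the template (a simplified sketch; the linked [Permutations](./algorithm/recursion/backtracking/46_Permutations.py) solution uses a different recursion), assuming distinct elements:
```Python
def permutations(nums):
    result, candidate = [], []

    def backtrack():
        if len(candidate) == len(nums):  # a full-length candidate is a solution
            result.append(candidate[:])
            return
        for x in nums:
            if x in candidate:           # prune: x is already placed
                continue
            candidate.append(x)          # place the candidate
            backtrack()                  # explore further
            candidate.pop()              # backtrack: undo the placement

    backtrack()
    return result

print(permutations([1, 2, 3]))  # [[1, 2, 3], [1, 3, 2], [2, 1, 3], ...]
```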
178 |
179 | # 2. Sorting
180 | ## 2.1. Insertion Sort
181 | - Time Complexity: `O(n^2)`
```Python
def insertion_sort(A):
    for i in range(1, len(A)):     # i = from the second element to the end of the array
        key = A[i]                 # select A[i] as key, and insert it from j = i-1 all the way down to 0
        j = i - 1
        while j >= 0 and key < A[j]:
            A[j+1] = A[j]          # if A[j] > key, shift A[j] to position j+1
            j -= 1
        A[j+1] = key               # place key at the right pos, which is A[j+1]
    return A
```
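- A quick usage check:
```Python
print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```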
193 | ## 2.2. Merge Sort
194 | - Time Complexity: `O(n*log(n))`
```Python
def merge_sort(A, lower, upper):
    if lower < upper:
        mid = (lower + upper) // 2
        merge_sort(A, lower, mid)
        merge_sort(A, mid + 1, upper)
        merge(A, lower, mid, upper)

array = [-4, 2, 1, 3, 5, 3, 11, 100, 7, 4, -1]
merge_sort(array, 0, len(array) - 1)  # [-4, -1, 1, 2, 3, 3, 4, 5, 7, 11, 100]
```
- The merge function creates two copies of `A[lower..mid]` and `A[mid+1..upper]` into `tmpL` and `tmpR`, then merges them back in sorted order.
- We need to add the stopper `float("inf")` to the end of both `tmpL` and `tmpR`.
```Python
def merge(A, lower, mid, upper):
    # Step 1: copy A[lower..mid] to tmpL and A[mid+1..upper] to tmpR
    tmpL = [0] * ((mid - lower + 1) + 1)   # mid - lower + 1 = len(tmpL), +1 for the stopper
    for i in range(0, (mid - lower) + 1):  # len(A[lower..mid]) = mid - lower + 1
        tmpL[i] = A[i + lower]
    tmpL[mid - lower + 1] = float('inf')   # add the stopper to the end of tmpL

    tmpR = [0] * (upper - mid + 1)         # [upper - (mid+1) + 1] + 1 = upper - mid + 1
    for j in range(0, upper - mid):        # len(A[mid+1..upper]) = upper - (mid+1) + 1 = upper - mid
        tmpR[j] = A[j + mid + 1]
    tmpR[upper - mid] = float('inf')       # add the stopper to the end of tmpR

    # Step 2: merge tmpL and tmpR back into A[lower..upper]
    i, j = 0, 0
    for k in range(lower, upper + 1):
        if tmpL[i] <= tmpR[j]:
            A[k] = tmpL[i]
            i += 1
        else:
            A[k] = tmpR[j]
            j += 1
```
231 | ## 2.3. Quick Sort
232 | - Depending on the pivot values, the time complexity of the quick sort algorithm can vary from best case `O(N*Log(N))` to worst case (already sorted input) `O(N^2)`.
```Python
def partition(A, p, r):
    pivot = A[r]
    i = p - 1                      # boundary of the region with A[j] <= pivot
    for j in range(p, r):
        if A[j] <= pivot:
            i += 1                 # grow the "<= pivot" region
            A[i], A[j] = A[j], A[i]
    # swap A[i+1] & A[r] once done, placing the pivot in its final position
    A[i+1], A[r] = A[r], A[i+1]
    return i + 1

def quicksort(A, p, r):
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q - 1)
        quicksort(A, q + 1, r)

quicksort(A, 0, len(A) - 1)  # usage, for an existing list A
```
253 | - In order to avoid the worst case `O(N^2)` in quick sort, we can use **Random sampling** technique, which is to pick a random pivot element.
254 | - On average, `randomized_quicksort` will cost `O(N*Log(N))`
```Python
import random

def randomized_partition(A, p, r):
    i = random.randint(p, r)
    # swap A[i], a random pivot, to the last pos (r), and then
    # call the standard partition
    A[i], A[r] = A[r], A[i]
    return partition(A, p, r)

def randomized_quicksort(A, p, r):
    if p < r:
        q = randomized_partition(A, p, r)
        randomized_quicksort(A, p, q - 1)
        randomized_quicksort(A, q + 1, r)
```
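- A quick usage check (assuming `partition` from above is in scope):
```Python
A = [5, 2, 9, 1, 7, 3]
randomized_quicksort(A, 0, len(A) - 1)
print(A)  # [1, 2, 3, 5, 7, 9]
```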
268 |
269 | # 3. Analysis of Algorithm
270 | ## 3.1. Types of Analysis
- **Worst Case**: an upper bound on the running time over all inputs of a given size; the most commonly used measure
- **Average Case**: difficult to compute, as we need to consider the distribution of inputs and then take the average of the run times
- **Amortized Case**: the average cost per operation over a worst-case *sequence* of operations (e.g. appending to a dynamic array)
274 |
275 | ## 3.2. Asymptotic Order of Growth
276 | - **Asymptotic Order of Growth**: know how the function behaves as the input gets larger, towards infinity (i.e: **in the limit**)
- For example: the blue algorithm scales better than the red one
278 | 
279 |
280 | ### 3.2.1. Upper Bound (Big O) Notation
- **Note 1**: Exponential > (dominates) polynomial > (dominates) logarithmic
  - For example: `n*log(n) = O(n^(1.5)) = O(n*sqrt(n))`, since `log(n)` grows more slowly than `sqrt(n)`, so `n*log(n)` is upper-bounded by `n*sqrt(n)`
  - Generalized: `n*log(n) = O(n^(1+k))` for any k > 0
- **Note 2**: `log(n) = O(n^k)` for any k > 0, i.e. any polynomial factor dominates a logarithmic one
287 |
288 | ### 3.2.2. Lower Bound (Omega) Notation
289 | ### 3.2.3. Tight Asymptotic (Theta) Bound
290 | 
291 |
292 | #### Exercise of Big O, Omega and Theta
293 | - Determine which relationship is correct and briefly explain why
294 | 
295 |
296 | - a. f(n) = log(n^2) = 2log(n), so **f(n) = Theta(g(n))**
297 | - b. g(n) = log(n^2) = 2log(n), so f(n) = Omega(g(n))
298 | - c. log(n) < n => log(log(n)) < log(n), so f(n) = O(g(n))
- d. since (log(n))^2 grows more slowly than n, **f(n) = Omega(g(n))**
300 | - e. f(n) = Omega(g(n))
301 | - f. f(n) = Theta(g(n))
302 | - g. f(n) = O(g(n))
303 | - h. f(n) = O(g(n))
304 |
305 | ## 3.3. Recurrence
306 | ### 3.3.1 Divide and Conquer
- Motivation: **Divide and Conquer** breaks a problem of size n into smaller problems such that, by solving the smaller problems, we can construct a solution to the entire problem:
308 | 1) Divide
309 | 2) Solve each of the smaller problems
310 | 3) Combine solutions
311 |
312 | ### 3.3.2. Recurrence
- A recurrence describes a function `T(n)` in terms of its values on smaller inputs `T(m)`, where m < n
- There are three ways to solve a recurrence:
315 | #### Method 1. Substitution Method
316 | #### Method 2. Recursion Tree Method
317 |
318 |
319 |
320 |
321 |
322 |
323 | #### Method 3. Master Theorem Method
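- The Master Theorem applies to recurrences of the form `T(n) = a*T(n/b) + f(n)` with `a >= 1`, `b > 1`, comparing `f(n)` against `n^(log_b(a))`:
  - **Case 1**: if `f(n) = O(n^(log_b(a) - eps))` for some `eps > 0`, then `T(n) = Theta(n^(log_b(a)))`
  - **Case 2**: if `f(n) = Theta(n^(log_b(a)))`, then `T(n) = Theta(n^(log_b(a)) * log(n))`
  - **Case 3**: if `f(n) = Omega(n^(log_b(a) + eps))` for some `eps > 0` (and `a*f(n/b) <= c*f(n)` for some `c < 1`), then `T(n) = Theta(f(n))`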
324 |
325 |
326 |
327 |
328 |
329 |
330 | - **Example of Master Theorem:**
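  - For instance, merge sort's recurrence `T(n) = 2T(n/2) + O(n)` has `a = 2`, `b = 2`, so `n^(log_2(2)) = n = Theta(f(n))` → Case 2 → `T(n) = Theta(n*log(n))`. Binary search's `T(n) = T(n/2) + O(1)` has `n^(log_2(1)) = 1 = Theta(f(n))` → Case 2 → `T(n) = Theta(log(n))`.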
331 |
332 |
333 |
334 |
335 |
336 |
337 | [(Back to top)](#table-of-contents)
338 |
339 | # 4. BFS and DFS
340 | ## 4.1. BFS
- **BFS**: uses a [queue](#43-queue) to process the nodes in a First-In-First-Out (FIFO) manner.
- **Application**: do [traversal](#63-level-order-traversal) or find the `shortest path from the root node to the target node`
  - Round 1: we process the root node
  - Round 2: we process the nodes next to the root node
  - Round 3: we process the nodes which are two steps from the root node; so on and so forth
  - Similar to a tree's level-order traversal, the nodes closer to the root node are traversed earlier
  - If a **target** node is added to the queue in the `kth` round, the *length of the shortest path* between the root node and the **target** node is exactly `k`
  - That is to say, you are already on the shortest path the first time you find the target node
349 | 
350 |
351 | ### 4.1.1. BFS Template 1 for Tree
- After each iteration of the outer while loop, we are one step farther from the root node.
- The variable `step` indicates the distance between the root node and the current node we are visiting.
- `Template 1` does not need to keep a `visited` hash set because:
  - There is no cycle, for example, in tree traversal; or
  - You do want to add the node to the queue multiple times.
```Python
from collections import deque

def BFS(root, target):
    queue = deque()  # store all nodes which are waiting to be processed
    step = 0         # number of steps needed from root to current node
    queue.append(root)

    while queue:
        step = step + 1
        # iterate over the nodes which are already in the queue
        size = len(queue)
        for _ in range(size):
            cur_node = queue.popleft()          # dequeue the first node in queue
            if cur_node == target: return step  # return step if cur_node is target
            # else: continue to add the neighbors of cur_node to the queue
            for next_node in the_neighbors_of(cur_node):  # pseudocode: iterate cur_node's neighbors
                queue.append(next_node)

    return -1  # there is no path from root to target
```
### 4.1.2. BFS Template 2 for Graph
- It is important to make sure that we **never visit a node twice**. Otherwise, we might get stuck in an infinite loop, e.g. in a graph with a cycle.
- Use a `set`, instead of a `list`, for `visited` to mark nodes as visited. For example: [Open the Lock](https://github.com/CodexploreRepo/leetcode/blob/master/solution/752_Open_the_Lock.py), [Number of Islands](https://github.com/CodexploreRepo/leetcode/blob/master/solution/200_Number_of_Islands.py).
379 |
```Python
from collections import deque

def BFS(root, target):
    queue = deque()  # store all nodes which are waiting to be processed
    visited = set()  # store all the nodes that we've visited
    step = 0         # number of steps needed from root to current node
    queue.append(root)
    visited.add(root)

    while queue:
        step = step + 1
        # iterate over the nodes which are already in the queue
        size = len(queue)
        for _ in range(size):
            cur_node = queue.popleft()          # dequeue the first node in queue
            if cur_node == target: return step  # return step if cur_node is target
            # else: continue to add the neighbors of cur_node to the queue
            for neighbour in cur_node.neighbours:
                if neighbour not in visited:    # ensure that the neighbour is not visited
                    visited.add(neighbour)
                    queue.append(neighbour)

    return -1  # there is no path from root to target
```
402 |
403 | [(Back to top)](#table-of-contents)
404 |
405 | # Part B. Data Structure
406 | - ADTs: designing abstractions
407 | - Data Structures: concrete implementations
408 |
409 | # 1. What is Data Structure
- Data Structures are ways to store and organize data to facilitate efficient
  - Access (Read)
  - Modification (Write)
413 |
414 | ## 1.1. Abstract Data Type
- An Abstract Data Type (ADT) is a **mathematical model** of a data structure (the data structure is the implementation), which defines:
  - Types of data stored
  - Operations that should be supported (e.g. design the DS so it is easy to insert and delete)
  - Parameters for the operations
419 |
420 | [(Back to top)](#table-of-contents)
421 |
422 | # 2. Arrays
An array is a contiguous chunk of memory. The computer keeps track of:
- **Starting Address**: e.g. 0x8600 (0x is the prefix for hexadecimal). Address of the nth cell = `starting addr + index*sizeOf(data_type)`
- **Data Type**: of the values stored in the cells
- **Size**: how many cells there are
427 | 
428 |
429 | [(Back to top)](#table-of-contents)
430 |
431 | # 3. Linked List
- **Node**: a data structure consisting of a value and a pointer to the next node
- **Linked List**: the data structure built as a chain of nodes
- The computer keeps track of the head of the Linked List
435 | 
436 |
437 | ## 3.1. Advantage vs Disadvantage of Linked List
- Advantage: easy to expand
- Disadvantage:
  - (1) Space: half of the space is wasted on pointers, as we need to store the address of the next value (a modern computer's address is 8 bytes)
  - (2) Slow to retrieve: we need to jump around in memory (with an array you can compute the address directly, e.g. 30th element = starting addr + 30*4 → O(1))
442 |
443 | ## 3.2. Other Linked List Variants
444 | - **Circular Linked List**: a linked list where all nodes are connected to form a circle. There is no NULL at the end.
445 | - **Doubly Linked List**: usually use both Head & Tail for reversing the list, as Tail will be another Head of the list from the other direction.
446 | 
447 |
448 | ## 3.3. Arrays vs Linked List
449 | 
450 |
- `Access(i)`: O(n) for a Linked List, as we need to travel from the Head to the ith node
- `Insert()` and `Delete()`: O(n) for an Array, as we need to shift elements when inserting or deleting a cell
453 |
454 | # 4. Stacks Queues and Deques
455 | ## 4.1. ADT Design for Stacks Queues and Deques
- Deques: since a deque has 2 ends, we can enqueue and dequeue from both the front and the back
457 |
458 | 
459 |
460 | ## 4.2. Stacks Queues and Deques Implementation using Linked List
461 | - **Stacks**: Singly Linked List with `Head`
462 | - **Queues**: Singly Linked List with `Head` & `Tail`
463 | - **Deques**: Doubly Linked List with `Head` & `Tail`
464 |
465 | ## 4.3. Queue
466 | - **Application**: [BFS](#41-bfs)
467 | ### 4.3.1. Queue Implementation
- A queue can be implemented using a **dynamic array** and an index pointing to the `head` of the queue.
  - Drawback: with the movement of the `head` pointer on dequeue, more and more space is wasted.
  - In this example, when we dequeue(5), the head pointer moves to the second position, but then the array[0] slot will never be used again.
  - 
  - Solution: a `Circular queue`. Specifically, we use a **fixed-size array** and **two pointers** to indicate the starting position and the ending position, with the goal of reusing the wasted storage mentioned previously.
  - Example: [Design Circular Queue](https://github.com/CodexploreRepo/leetcode/blob/master/solution/622_Design_Circular_Queue.py). Please refer to this link for a [Circular Queue animation](https://leetcode.com/explore/learn/card/queue-stack/228/first-in-first-out-data-structure/1396/)
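  - A minimal sketch of the idea (class and method names here are illustrative, not LeetCode's exact interface):
```Python
class CircularQueue:
    def __init__(self, k):
        self.data = [None] * k   # fixed-size array
        self.head = 0            # index of the front element
        self.size = 0            # number of stored elements

    def enqueue(self, value):
        if self.size == len(self.data):
            return False         # queue is full
        tail = (self.head + self.size) % len(self.data)
        self.data[tail] = value  # the modulo lets us reuse freed slots
        self.size += 1
        return True

    def dequeue(self):
        if self.size == 0:
            return None          # queue is empty
        value = self.data[self.head]
        self.head = (self.head + 1) % len(self.data)
        self.size -= 1
        return value
```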
474 |
475 | ### 4.3.2. Queue Usage in Python
- Queue with List: **dequeue** via `list.pop(0)` (O(n), since all remaining elements must shift) and **enqueue** via `list.append(item)`
477 | - Queue with Built-in Function:
```Python
from collections import deque
q = deque()    # initializing a queue
q.append('a')  # adding elements to a queue
# removing elements from a queue - only O(1), compared with using a List to implement a Queue
q.popleft()
```
485 | [(Back to top)](#table-of-contents)
486 |
487 | # 5. Dynamic Arrays
488 | ## 5.1. Dynamic Array Properties
- **5.1.1. Dynamic Arrays**: a linked list of arrays
  - Which level? (i.e. which array): `Array_Index = (pos + 1).bit_length() - 1 = h`
  - Which cell? (within that array): `Cell_Index = pos - (2**h - 1)`
  - For example: pos = 5 → pos + 1 = 6 = (110) → bit_length(110) = 3 → Array_Index = 3 - 1 = 2.
  - Therefore, the element at pos = 5 is in array index 2, and its cell index = 5 - (2**2 - 1) = 2.
494 |
```Python
def __translate(self, pos):
    h = (pos + 1).bit_length() - 1
    return h, pos + 1 - 2 ** h
```
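- As a quick check, the same translation as a standalone function (a hypothetical helper named `translate`):
```Python
def translate(pos):
    h = (pos + 1).bit_length() - 1  # level h; note pos + 1 - 2**h == pos - (2**h - 1)
    return h, pos + 1 - 2 ** h

print(translate(5))  # (2, 2): pos 5 lives in array index 2, cell index 2
```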
500 |
501 | 
502 |
- **5.1.2. Number of Arrays**: `# of arrays = log(capacity + 1)`
504 | - Capacity = 1 → 1 array
505 | - Capacity = 3 → 2 arrays
506 | - Capacity = 7 → 3 arrays
507 |
- **5.1.3. Dynamic Arrays in Python**: in Python, the capacity grows by ~1/8 of the current size (plus a small constant)
```Python
def grow(curr_size):
    new_allocated = (curr_size >> 3) + (3 if curr_size < 9 else 6)
    return curr_size + new_allocated
```
514 |
515 | ## 5.2. Arrays vs Linked Lists vs Dynamic Arrays
516 | 
517 |
- **Access(i) = O(log(n))**: we first need to travel through at most log(n) levels (the max number of arrays from **5.1.2. Number of Arrays**), then O(1) to access the position within that array level.
- **Append(i) = O(log(n))**: we need to traverse to the last array in log(n) steps, then O(1) to the last position.
- **Delete & Insert = O(n)**: we need to shift all subsequent elements.
521 |
522 | [(Back to top)](#table-of-contents)
523 |
524 | # 6. Tree
525 | - **Tree**: is made up of a single node r (called the `root`) connected to disjoint sets of nodes (called subtrees of r)
526 | - **Internal Node**: A node with at least one child
527 | - **Leaf Node**: A node with no children
- From the graph view, a tree can also be defined as a connected, directed *acyclic graph* which has `N` nodes and `N-1` edges.
529 |
530 | #### Level & Heights
- **Level of a Node**:
  - If Root → Level 0
  - If not Root → Level of Node = Level of Parent + 1
534 | - **Height of the Tree**: maximum level of its nodes
535 | 
536 |
537 | ## 6.1. Binary Tree
538 | - An empty tree or a node with at most 2 branches
539 |
540 |
541 | ### 6.1.1. Full Binary Tree vs Complete Binary Tree
- **Full Binary Tree**: either empty, or a node whose left and right subtrees are both full binary trees of the same height
  - **# of Nodes** of a full binary tree of height h: `2^(h+1) - 1`
544 | 
545 |
- **Complete Binary Tree**: a left-justified tree
  - A completely filled tree on all levels except possibly the lowest level, which is filled from the left.
  - A complete binary tree has at most one node with only one child.
549 | 
550 |
551 | ## 6.2. Traverse A Tree
- **Traversal Type**
  - `Pre-Order`: Root - Left - Right
  - `In-Order` : Left - Root - Right
```Python
class Solution:
    def inorderTraversal(self, root: Optional[TreeNode]) -> List[int]:
        res = []
        self.dfs_inorder(root, res)
        return res

    def dfs_inorder(self, root, res):
        if root:
            self.dfs_inorder(root.left, res)
            res.append(root.val)
            self.dfs_inorder(root.right, res)
```
  - `Post-Order` : Left - Right - Root
    - Note 1: when you delete nodes in a tree, deletion proceeds in post-order. That is to say, when you delete a node, you will delete its left child and its right child before you delete the node itself.
    - Note 2: post-order is widely used with mathematical expressions.
      - You can recover the original expression from the in-order traversal, but it is not easy for a program to handle that form, since you have to check the priorities of operations.
      - If you handle the tree in post-order, you can easily evaluate the expression using a stack. Each time you meet an operator, you pop 2 elements from the stack, calculate the result and push the result back onto the stack.
572 | 
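- A minimal sketch of that stack-based post-order (postfix) evaluation, assuming integer tokens and the operators `+`, `-`, `*` only:
```Python
def eval_postfix(tokens):
    stack = []
    for tok in tokens:
        if tok in ("+", "-", "*"):
            b, a = stack.pop(), stack.pop()  # pop 2 operands
            if tok == "+":   stack.append(a + b)
            elif tok == "-": stack.append(a - b)
            else:            stack.append(a * b)
        else:
            stack.append(int(tok))           # operand: push onto the stack
    return stack[0]

print(eval_postfix(["2", "3", "+", "4", "*"]))  # (2 + 3) * 4 = 20
```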
573 |
574 | ## 6.3. Level-order Traversal
- `Breadth-First Search` is an algorithm that traverses the tree level by level.
- We use a queue to help us do the BFS.
```Python
def levelOrder(self, root: Optional[TreeNode]) -> List[List[int]]:
    res = []
    if root:
        queue = [root]
        while len(queue) > 0:
            # to keep track of the level, we rely on the current len of the queue
            size, cur_level = len(queue), []
            # at level 0, size_level_0 = 1
            # at level 1, size_level_1 = 2
            for _ in range(size):
                cur_node = queue.pop(0)  # note: deque.popleft() would be O(1) instead of O(n)
                cur_level.append(cur_node.val)
                if cur_node.left:
                    queue.append(cur_node.left)
                if cur_node.right:
                    queue.append(cur_node.right)
            res.append(cur_level)
    return res
```
597 |
598 | ## 6.4. Binary Search Tree
599 | - A `binary search tree` (BST), a special form of a binary tree, satisfies the binary search property:
  - The value in each node must be greater than (or equal to) any values stored in its left subtree.
  - The value in each node must be less than (or equal to) any values stored in its right subtree.
602 | - **inorder traversal** in BST will be in **ascending** order
603 | - BSTs support three main operations: search, insertion and deletion.
604 |
605 | ### 6.4.1. BST - Search
```Python
class Solution:
    def searchBST(self, root, val):
        if root:
            if root.val == val:
                return root
            elif root.val > val:
                return self.searchBST(root.left, val)
            elif root.val < val:
                return self.searchBST(root.right, val)
        else:
            return None
```
619 |
620 | ### 6.4.2. BST - Insertion
```Python
class Solution:
    def insertIntoBST(self, root, val):
        if not root:
            return TreeNode(val)  # Base Case: null node, return a new node with "val"
        if root.val < val:
            # if root.right is null, TreeNode(val) is returned from the base case & assigned to root.right
            root.right = self.insertIntoBST(root.right, val)
        elif root.val > val:
            # if root.left is null, TreeNode(val) is returned from the base case & assigned to root.left
            root.left = self.insertIntoBST(root.left, val)
        return root  # return root after insertion
```
632 |
633 | ### 6.4.3. BST - Deletion
634 | 
635 |
- First, traverse the tree until `root.val == key`.
- **Case 0**: the node does not have any children, like 1, 8, 11, 14, 6 or 18: we just delete it, and there is nothing else to do.
- **Case 1**: the node has a left child but no right child (or the reverse), for example 3 or 20. In this case we can just delete this node and connect its parent directly to its child: for 3 we add the connection 5->1, and for 20 we add the connection 17->18. Note that the BST property is preserved, because the child's subtree stays on the correct side of the parent's value, and nothing changes for the other nodes.
- **Case 2**: the node has both children, like 12, 5, 7, 9 or 15. In this case we cannot just delete it.
  - Let us consider node 5. We want to find the successor of this node (the node with the next value): go one step to the right, then as far left as possible.
  - For node 5 the successor is 6: we go 5->7->6.
  - How can we delete node 5 now? We swap nodes 5 and 6 (or just copy value 6 into node 5), and then we deal with the new tree, where we need to delete the node marked in the square.
  - How? Note that this node does not have a left child, so it falls under Case 0 or Case 1, which we already know how to solve.
644 |
```Python
class Solution:
    def deleteNode(self, root, key):
        if not root: return None  # Base Case
        if root.val > key:
            root.left = self.deleteNode(root.left, key)
        elif root.val < key:
            root.right = self.deleteNode(root.right, key)
        else:  # root.val == key
            # Case 0: node has no child
            if not root.left and not root.right: return None

            # Case 1: node has 1 child
            if not root.left: return root.right  # no left child, so the right child replaces root
            if not root.right: return root.left  # no right child, so the left child replaces root

            # Case 2: node has 2 children; replace root with its successor
            if root.left and root.right:
                # root's successor = left-most child of root's right sub-tree
                temp = root.right
                while temp.left:
                    temp = temp.left
                # replace root's value with its successor's value
                root.val = temp.val
                # delete root's successor: since the successor has no left child,
                # deleting it falls under Case 0 or Case 1 only
                root.right = self.deleteNode(root.right, temp.val)

        return root
```
675 |
676 |
677 | [(Back to top)](#table-of-contents)
678 |
679 |
680 |
681 | # 7. Heaps
**Definition**: A binary heap is a **complete binary tree** with the following binary-heap property:
- (1) if the key at each node is greater than or equal to the key of its parent, we call it a min-heap.
- (2) if the key at each node is smaller than or equal to the key of its parent, we call it a max-heap.
685 |
686 |
687 | ## 7.1. Heap Representation
- A heap can be represented by a list with indices **starting from 1**.
- Example: the max-heap in the previous example can be represented by the list `[None, 6, 5, 3, 4, 2, 1]`.
690 |
691 | #### Finding Parent & Child
692 | Given a key at position `i`:
693 | - Position of the left child: `2*i`
694 | - Position of the right child: `2*i + 1`
695 | - Position of the parent: `i//2`
- Example: i = 4 (value = 17) → parent = `i//2` = 2, left child = `2*i` = 8, right child = `2*i + 1` = 9
697 |
698 | #### Height of Heap
- **Height of a heap with n nodes** = `floor(log(n)) + 1`
  - if a heap has 1 node, its height will be 1
  - if a heap has 2 to 3 nodes, its height will be 2
  - if a heap has 4 to 7 nodes, its height will be 3
  - ...
  - if a heap has 2^i to 2^(i+1) - 1 nodes, its height will be i + 1
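  - Example: n = 6 → height = `floor(log2(6)) + 1 = 2 + 1 = 3`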
705 |
706 |
#### Internal Nodes vs Leaves
Since a heap is a **complete binary tree**:
- **# of internal nodes** = `floor(n/2)` or `n//2`
- **# of leaves** = `# of internal nodes` or `# of internal nodes + 1`
711 |
712 | 
713 |
714 |
715 | ## 7.2. Heap Operations
716 | - `heapify`: convert an arbitrary list → a list that satisfies max-heap property. For ex: `[None, 1, 2, 3]` → `[None, 3, 2, 1]`
717 | - `insert`: insert an element into the list without violating the max-heap property.
718 | - `pop`: retrieve the maximum element of the list.
719 |
720 | ### 7.2.1. Heapify
721 | - Time Complexity of Heapify: `O(n)` (See Proof Below)
- Time Complexity of heapify_at(i): `O(log(n))`, since in the worst case we shift down from the root to a leaf, i.e. through all levels (h = log(n))
723 | 
724 |
- **Step 1**: Start from the last internal node
  - For a heap of size `n`, there are exactly `n//2` internal nodes, so we can check the max-heap property from the last internal node, at index `n//2`, back up to the root node (index 1).
```Python
def heapify(self):
    for i in range(self.__size // 2, 0, -1):
        self.__heapify_at(i)
```
- **Step 2**: Execute heapify_at(i) for all the internal nodes
  - What heapify_at(i) does is make sure that every sub-heap rooted at the element at position i satisfies the heap property, i.e. the key at the root is not smaller than any other key in the sub-heap.
  - try_swap function: if the child key is greater than the parent key, we swap the two keys and return True for a successful swap; otherwise it returns False. heapify_at is notified by the returned True/False and decides whether it is necessary to heapify the sub-heap rooted at the child, since the key there has become smaller.
```Python
def __heapify_at(self, i):
    if 2*i > self.__size:     # Base case: no left or right child, i.e. it is a leaf node
        return
    elif 2*i == self.__size:  # Case 1: internal node has only one child, i.e. its left child (2*i) is at the end of the list
        if self.__try_swap(i, 2*i):
            self.__heapify_at(2*i)
    elif self.__content[2*i] > self.__content[2*i+1]:  # Case 2.1: internal node has 2 children & the left child is larger
        if self.__try_swap(i, 2*i):
            self.__heapify_at(2*i)    # swap i with its left child and heapify_at(left_child)
    else:                             # Case 2.2: internal node has 2 children & the right child is larger
        if self.__try_swap(i, 2*i+1):
            self.__heapify_at(2*i+1)  # swap i with its right child and heapify_at(right_child)
```
```Python
def __try_swap(self, i, j):
    if self.__content[i] >= self.__content[j]:  # value at i >= value at j: no swap needed, parent >= child
        return False
    else:
        self.__content[i], self.__content[j] = self.__content[j], self.__content[i]  # parent < child: swap
        return True
```
757 |
758 | #### Proof Time Complexity of Heapify is O(n)
759 | 
760 |
761 | ### 7.2.2. Insertion
- Time Complexity of Insertion: `O(log(n))`, since in the worst case we shift up from a leaf to the root, i.e. through all levels (h = log(n))
- insert appends a new element at the end of the list.
  - After we insert this new element, we check if it is larger than its parent:
  - if yes, we swap it with the parent, repeating until the next parent is larger.
  - The try_swap function is used here to break the loop when the parent contains a larger key.
```Python
def insert(self, k):
    self.__size += 1
    i = self.__size
    self.__content[i] = k
    while i > 1 and self.__try_swap(i // 2, i):
        i = i // 2
```
775 |
776 | ### 7.2.3. Pop
- Time Complexity of Pop: `O(log(n))`, the same as T(heapify_at(root))
- pop returns the maximum key in the heap, which is contained in the root of the heap, at index 1.
  - However, we cannot leave the heap without its root.
  - So the last element of the heap is put at the root, and
  - this is followed by heapify_at(1) to maintain the heap property.
782 |
```Python
def pop(self):
    if self.__size == 0:
        return None
    else:
        k = self.__content[1]
        self.__content[1], self.__content[self.__size] = self.__content[self.__size], None
        self.__size -= 1
        self.__heapify_at(1)
        return k
```
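- Note that Python's standard library implements a **min**-heap over a 0-indexed list via the `heapq` module (unlike the 1-indexed max-heap sketched above); a quick sketch:
```Python
import heapq

nums = [1, 5, 3, 6, 4, 2]
heapq.heapify(nums)              # O(n): nums now satisfies the min-heap property
heapq.heappush(nums, 0)          # O(log n) insertion
print(heapq.heappop(nums))       # 0 -- O(log n) pop of the minimum

# a common trick for a max-heap is to negate the keys:
max_heap = [-x for x in [1, 5, 3]]
heapq.heapify(max_heap)
print(-heapq.heappop(max_heap))  # 5
```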
794 |
795 | ## 7.3. Heap Advantage
Question: for a priority queue, where does the saving come from when using heap extract and insert instead of merge sort?
- The priority-queue problem is to extract the maximum.
- Using a heap, we can use **heapify**, which takes `O(n)`, to maintain the max-heap, say `10 9 7 8 4 5 3` (→ this is a *partially sorted order*) → then we can extract the maximum, `10`.
- Meanwhile, using **merge sort**, which takes `O(nlogn)`, we maintain a *totally sorted order*, say `10 9 8 7 5 4 3` → then we can extract the maximum, `10`.
- Answer: it depends on the use case; for this use case we only need to maintain a *partially sorted order*, so using a Heap at `O(n)` is better than Merge Sort at `O(nlogn)`.
801 |
802 | [(Back to top)](#table-of-contents)
803 |
804 | # 8. Graph
805 | ## 8.1. Graph Property
### 8.1.1. Edge & Vertex Relationship
- **Undirected Graph**: maximum # of edges = `n(n-1)/2`
  - In an undirected graph, each edge is specified by its two endpoints and order doesn't matter. The number of edges is therefore the number of subsets of size 2 chosen from the set of vertices. Since the set of vertices has size n, the number of such subsets is given by the binomial coefficient C(n,2) (also known as "n choose 2"). Using the formula for binomial coefficients, C(n,2) = n(n-1)/2.
- **Directed Graph**: maximum # of edges = `n(n-1)`
  - Each edge is specified by its start vertex and end vertex. There are n choices for the start vertex. Since there are no self-loops, there are n-1 choices for the end vertex. Multiplying these together gives all n(n-1) possible choices.
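- *Example*: with `n = 4` vertices, an undirected simple graph has at most `4*3/2 = 6` edges, while a directed one (no self-loops) has at most `4*3 = 12`.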
### 8.1.2. Sparse and Dense Graph
812 | - **Dense Graph**: `|E| = O(|V|^2)`
813 | - **Sparse**: `|E| = O(|V|)`
814 |
815 | ## 8.2. Graph Search Algorithms
- Main difference for BFS and DFS on a graph (vs a tree): we **have to mark nodes that have already been visited**
817 | - Time Complexity:
818 | - Search on Tree: `O(n)` where n is number of nodes
819 | - Search on Graph: `O(n+e)` where n is number of nodes; e is number of edges
820 |
821 | [(Back to top)](#table-of-contents)
822 |
823 |
824 |
825 |
826 |
827 |
--------------------------------------------------------------------------------
/algorithm/dynamic_programming/1_maximum_sub_array_kadane_algo.py:
--------------------------------------------------------------------------------
1 | """
2 | Kadane's Algorithm to Maximum Sum Subarray Problem
3 | """
4 |
5 | # array = [-2,1,-3,4,-1,2,1,-5,4] #max_sum = 6, sub_array = [4, -1, 2, 1]
6 | # array = [1,-3,2,1,-1] #max_sum = 3, sub_array = [2, 1]
7 | array = [-1,3,-2,5,-6,1] #max_sum = 6, sub_array = [3, -2, 5]
8 | # array = [5,4,-1,7,8] #max_sum = 23
9 |
10 | def max_subarray(nums):
11 | """
12 | This version is to return max_sum of the sub-arrays
13 | """
14 | n, max_sum = len(nums), float("-inf")
15 | dp = [float("-inf")]*(n+1)
16 |
17 | for i in range(1, n+1):
18 | if nums[i-1] >= nums[i-1] + dp[i-1]:
19 | dp[i] = nums[i-1]
20 | else:
21 | dp[i] = nums[i-1] + dp[i-1]
22 | if dp[i] > max_sum:
23 | max_sum = dp[i]
24 | return max_sum
25 |
26 | def max_subarray(nums):
27 | """
28 | This version is to return max_sum, max_start & max_end pos
29 | """
30 | n, max_sum = len(nums), float("-inf")
31 | max_start, max_end = -1,-1
32 | dp = [float("-inf")]*(n+1)
33 |
34 | for i in range(1, n+1):
35 | if nums[i-1] >= nums[i-1] + dp[i-1]:
36 | dp[i] = nums[i-1]
37 | if dp[i] > max_sum:
38 | max_sum = dp[i]
39 | #Restart the start & end pos to the new (i-1) pos
40 | max_start = max_end = i-1
41 | else:
42 | dp[i] = nums[i-1] + dp[i-1]
43 | if dp[i] > max_sum:
44 | max_sum = dp[i]
45 | max_end = i-1
46 |
47 | return max_sum, max_start, max_end+1
48 |
49 | max_sum, max_start, max_end = max_subarray(array)
50 | print(max_sum, array[max_start: max_end])
51 |
--------------------------------------------------------------------------------
/algorithm/dynamic_programming/2_maximum_sub_matrix.py:
--------------------------------------------------------------------------------
1 | """
2 | Maximum Sub-matrix: 2-D Kadane Algorithm
3 | Explanation: https://www.youtube.com/watch?v=yCQN096CwWM
4 | """
5 |
6 | matrix = [[ 2, 1, -3, -4, 5],
7 | [ 0, 6, 3, 4, 1],
8 | [ 2, -2, -1, 4, -5],
9 | [-3, 3, 1, 0, 3]]
10 |
11 | def max_submatrix(matrix):
12 | """
13 | Time Complexity: O(col*col*row)
14 | """
15 | row, col = len(matrix), len(matrix[0])
16 | max_sum = float("-inf")
17 | max_left, max_right, max_top, max_down = -1,-1,-1,-1
18 |
19 | for left in range(col): #O(col)
20 | nums = [0]*row #nums is the accummulated col
21 | for right in range(left, col): #O(col)
22 | for r in range(row): #O(row)
23 | nums[r] += matrix[r][right]
24 | #print(nums)
25 | current_sum, max_start, max_end = max_subarray(nums) #O(row)
26 | if current_sum > max_sum:
27 | max_left, max_right, max_top, max_down = left, right, max_start, max_end
28 | max_sum = current_sum
29 |
30 | return max_sum, max_left, max_right, max_top, max_down-1
31 |
32 | max_submatrix(matrix)
33 |
--------------------------------------------------------------------------------
/algorithm/graph/graph_searching/breadth_first_search.py:
--------------------------------------------------------------------------------
1 | """
2 | Breadth-First Search (BFS)
3 | """
4 | def BFS(s):
5 | global color, res
6 | color[s] = "GRAY"
7 | queue = [s]
8 | while queue:
9 | u = queue.pop(0)
10 | res.append(u)
11 | for v in adjlist[u]:
12 | if color[v] == "WHITE":
13 | color[v] = "GRAY"
14 | queue.append(v)
15 | color[u] = 'BLACK'
16 |
17 | color = {u: "WHITE" for u in adjlist}
18 | res = []
19 |
20 | for u in adjlist:
21 | if color[u] == "WHITE":
22 | BFS(u)
23 |
24 | print(res) #[0, 1, 5, 6, 2, 7, 4, 11, 8, 10, 3, 9]
25 |
26 | """
27 | This is to generate Adj List Representation for Graph
28 | """
29 | n = 12
30 | m = 18
31 | adjlist = {i:[] for i in range(n)}
32 | for i in range(6):
33 | adjlist[i].extend(((i+1)%6, (i+5)%6))
34 | adjlist[i].append(i+6)
35 | for i in range(6, 12):
36 | adjlist[i].append(i-6)
37 | adjlist[i].extend((6+(i+2)%6, 6+(i+4)%6))
38 |
39 | """
40 | {0: [1, 5, 6],
41 | 1: [2, 0, 7],
42 | 2: [3, 1, 8],
43 | 3: [4, 2, 9],
44 | 4: [5, 3, 10],
45 | 5: [0, 4, 11],
46 | 6: [0, 8, 10],
47 | 7: [1, 9, 11],
48 | 8: [2, 10, 6],
49 | 9: [3, 11, 7],
50 | 10: [4, 6, 8],
51 | 11: [5, 7, 9]}
52 | """
53 |
--------------------------------------------------------------------------------
/algorithm/recursion/206_Reverse_Linked_List.py:
--------------------------------------------------------------------------------
class Solution(object):
    def reverseList(self, head):
        """
        :type head: ListNode
        :rtype: ListNode
        """
        # Base Case:
        if not head or not head.next:
            # not head: this is an empty linked list initially
            # not head.next: this is the last node in the linked list
            return head
        # Recursive Case:
        # rhead will be the last node in the original linked list
        rhead = self.reverseList(head.next)

        # head, in this case, will be the second-to-last node
        # head.next = last node, (head.next).next = head, i.e. point the last node back to head
        head.next.next = head
        # break the connection of the second-to-last node (head) with the last node
        # by doing head.next = None
        head.next = None

        # return rhead, which is the last node in the original linked list
        return rhead
25 |
--------------------------------------------------------------------------------
/algorithm/recursion/344_Reverse_String.py:
--------------------------------------------------------------------------------
class Solution(object):
    def reverseString(self, s):
        """
        :type s: List[str]
        :rtype: None Do not return anything, modify s in-place instead.
        """
        def helper(s, left, right):
            # base case: the two pointers have met or crossed
            if left > right: return
            s[left], s[right] = s[right], s[left]  # swap the outer pair
            return helper(s, left+1, right-1)      # recurse on the inner part
        helper(s, 0, len(s)-1)
12 |
--------------------------------------------------------------------------------
/algorithm/recursion/backtracking/46_Permutations.py:
--------------------------------------------------------------------------------
from typing import List

class Solution:
    def permute(self, nums: List[int]) -> List[List[int]]:
        result = []

        if len(nums) == 1:
            return [nums[:]]  # nums[:] = nums.copy() returns a copy of nums

        for idx in range(len(nums)):
            perms = self.permute(nums[:idx] + nums[idx+1:])  # find the perms without nums[idx]
            for perm in perms:
                perm.append(nums[idx])  # [2,3]+[1] and [3,2]+[1] individually
            result.extend(perms)        # [2,3,1],[3,2,1] all together into the result
        return result
14 |
--------------------------------------------------------------------------------
/algorithm/recursion/backtracking/52_N-Queens_II.py:
--------------------------------------------------------------------------------
class Solution(object):
    def totalNQueens(self, n, row=0, diag=set(), off_diag=set(), cols=set()):
        """
        :diag, off_diag, cols: sets => to keep track of the attacked zones
        (the mutable default sets are shared across calls, but backtracking always empties them again)
        """
        # Base Case: row == n, i.e. we found a solution, since row runs from 0 to (n-1)
        if row == n: return 1

        count = 0

        for col in range(n):
            # at the current row, iterate through the columns
            if row-col in diag or row+col in off_diag or col in cols:
                # if the current cell is in the attacked zones, skip it
                continue
            else:
                # if the current cell is NOT in the attacked zones, place a new queen there
                diag.add(row-col)
                off_diag.add(row+col)
                cols.add(col)

                # move to the next row
                count += self.totalNQueens(n, row + 1, diag, off_diag, cols)

                # backtrack, i.e. remove the queen and its attacking zone
                diag.remove(row-col)
                off_diag.remove(row+col)
                cols.remove(col)

        return count
31 |
--------------------------------------------------------------------------------
/algorithm/recursion/divide-and-conquer/240_Search_a_2D_Matrix_II.py:
--------------------------------------------------------------------------------
class Solution(object):
    def searchMatrix(self, matrix, target):
        """
        :type matrix: List[List[int]]
        :type target: int
        :rtype: bool
        """
        # Base Case:
        if not matrix or not matrix[0]: return False
        if len(matrix) == 1 and len(matrix[0]) == 1:
            return matrix[0][0] == target
        # Recursive Case: divide the search into 4 regions around the center of the matrix
        center_x, center_y = len(matrix[0])//2, len(matrix)//2

        if matrix[center_y][center_x] > target:
            # target cannot be in the bottom-right region (everything there is >= the center)
            tl = self.searchMatrix([matrix[row][:center_x] for row in range(0, center_y)], target)
            tr = self.searchMatrix([matrix[row][center_x:] for row in range(0, center_y)], target)
            bl = self.searchMatrix([matrix[row][:center_x] for row in range(center_y, len(matrix))], target)
            return tl or tr or bl  # Conquer
        elif matrix[center_y][center_x] < target:
            # target cannot be in the top-left region (everything there is <= the center)
            tr = self.searchMatrix([matrix[row][center_x:] for row in range(0, center_y)], target)
            bl = self.searchMatrix([matrix[row][:center_x] for row in range(center_y, len(matrix))], target)
            br = self.searchMatrix([matrix[row][center_x:] for row in range(center_y, len(matrix))], target)
            return tr or bl or br
        else:
            return True
27 |
--------------------------------------------------------------------------------