├── Book.pdf ├── chapter001_Introduction.md ├── chapter002_TimeComplexity.md ├── chapter003_Sorting.md ├── chapter004_DataStructures.md ├── chapter005_CompleteSearch.md ├── chapter006_GreedyAlgorithms.md ├── chapter007_DynamicProgramming.md ├── chapter008_AmortizedAnalysis.md ├── chapter009_RangeQueries.md ├── chapter010_BitManipulation.md ├── chapter011_GraphBasics.md ├── chapter012_GraphTraversal.md ├── chapter013_ShortestPaths.md ├── chapter014_TreeAlgorithms.md ├── chapter015_SpanningTrees.md └── images ├── 2d-prefix-sum.png ├── backtrack-queen.png ├── bellman-ford.png ├── complete-search-opt.png ├── count-sort.png ├── floyd-warshall-initial.png ├── huffman-coding.png ├── iterator.png ├── segment-tree.png ├── time-complexity-input.png └── union-find.png /Book.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/Book.pdf -------------------------------------------------------------------------------- /chapter001_Introduction.md: -------------------------------------------------------------------------------- 1 | Chapter 1: Introduction 2 | === 3 | * Comp programming is two parts: Design + implementation 4 | 5 | General performance tips 6 | --- 7 | * Use C++! 80% of comp programmers use it 8 | * Use `g++ -std=c++11 -O2 -Wall test.cpp -o test` 9 | * -O2 is the optimizer flag 10 | * -Wall shows warnings 11 | 12 | * Use `#include <bits/stdc++.h>` to include the entire STL 13 | 14 | * Use `ios::sync_with_stdio(0); cin.tie(0);` to make cin and cout faster! 15 | * Use `freopen("input.txt", "r", stdin); freopen("output.txt", "w", stdout);` to redirect stdin/stdout to files 16 | 17 | * Use macros to do things faster (typedefs, #defines). Kinda advanced. 18 | 19 | Introduction to mathematics 20 | --- 21 | * Arithmetic progression = Sequence of numbers where difference between 2 adjacent is the same.
22 | * Sum formula: n(a+b)/2 where n = length, a = first num, b = last num 23 | 24 | * Geometric progression = Sequence of numbers where ratio between 2 adjacent is the same (Ex: 1, 2, 4, 8) 25 | * Sum formula: (bk-a)/(k-1) where k = constant ratio, a = first num, b = last num 26 | 27 | * Harmonic sum = 1 + (1/2) + (1/3) + ... + (1/n) <= log2(n)+1, so O(log n) 28 | 29 | * \ = difference between sets (i.e. A \ B) 30 | * N = natural numbers (1-oo), Z = integers (whole nums), Q = rational nums (ratios of two integers), R = real nums (any num) 31 | 32 | * Predicate = something that is T/F based on parameter(s) ex: P(0) 33 | * Upside down A = "for all", backwards E = "there exists" 34 | 35 | * There is a closed form for the Fibonacci sequence! Called Binet's formula 36 | * A logarithm of x with base b is how many times x needs to be divided by b to get to 1 37 | 38 | -------------------------------------------------------------------------------- /chapter002_TimeComplexity.md: -------------------------------------------------------------------------------- 1 | Chapter 2: Time complexity 2 | === 3 | * Time complexity = How long program takes depending on input size 4 | 5 | Calculation 6 | --- 7 | * Only care about highest order of magnitude part of program 8 | * Recursion = time complexity of each call * # of calls 9 | 10 | Complexity classes 11 | --- 12 | * O(1) < O(log n) < O(sqrt(n)) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!) 13 | 14 | Estimation 15 | --- 16 | * Easy to estimate based on time constraint and input size 17 | * ![time-complexity-input](./images/time-complexity-input.png) 18 | * Important to note that estimation ignores constant factors, i.e. 5n, 0.5n, and n are all O(n) 19 | 20 | Maximum subarray sum 21 | --- 22 | * Easy to make O(n^3) algorithm by just iterating through all possible subarrays 23 | * O(n^2) algorithm removes a loop by adding to sum when right increments, reset sum every outer loop 24 | * O(n) solution is a DP-like algorithm that resets sum if sum+array[i] is less than array[i].
25 | 26 | 27 | -------------------------------------------------------------------------------- /chapter003_Sorting.md: -------------------------------------------------------------------------------- 1 | Chapter 3: Sorting 2 | === 3 | * Sorting is a great algorithmic design technique example due to multiple ways of implementing, time complexities, objectives, etc. 4 | * Given an array of n elements, return an array where elements are in increasing order 5 | 6 | Algorithms 7 | --- 8 | * Bubble sort consists of n rounds. On each round, the algorithm iterates through the elements of the array. Whenever two consecutive elements are found that are not in correct order, the algorithm swaps them. Bubble sort is O(n^2). 9 | * ~~~c++ 10 | for (int i = 0; i < n; i++) { 11 | for (int j = 0; j < n-1; j++) { 12 | if (array[j] > array[j+1]) { 13 | swap(array[j],array[j+1]); 14 | } 15 | } 16 | } 17 | ~~~ 18 | * Inversion = A pair of elements that are out of order 19 | * IMPORTANT: Merge sort is a divide and conquer algorithm. It splits the array recursively down to sizes of 1, then sorts each half coming back up from the recursion. Merge sort is O(n log n). 20 | * IMPORTANT: Counting sort uses a bookkeeping array that counts how many times each value appears. I actually implemented this in LC #75 (Sort Colors). Counting sort is O(n) (assuming the values fall in a small range). 21 | * ![count-sort](./images/count-sort.png) 22 | 23 | C++ Sort 24 | --- 25 | * `sort()` requires address bounds. To sort an array, use `sort(a, a+n)`. 26 | * Pairs (`pair`) are sorted based on first element with 2nd as tiebreaker. 27 | * Override the operator< in order to use sort with structs! 28 | * Use a comparison function via `sort(a, a+n, comp)`. 29 | 30 | Binary Search 31 | --- 32 | * Searching through an array is normally O(n). If array is sorted, then it can be O(log n) with binary search.
33 | * Binary search can also be done with jumps of decreasing size (afterwards, check whether `array[k] == x`): 34 | * ~~~c++ 35 | int k = 0; 36 | for (int b = n/2; b >= 1; b /= 2) { 37 | while (k+b < n && array[k+b] <= x) 38 | k += b; 39 | } 40 | ~~~ 41 | * `lower_bound`, `upper_bound`, and `equal_range` all search a sorted array and employ binary search. 42 | * Can be used to find the smallest/largest solution in our array of solutions! 43 | -------------------------------------------------------------------------------- /chapter004_DataStructures.md: -------------------------------------------------------------------------------- 1 | Chapter 4: Data Structures 2 | === 3 | * This chapter goes over C++ data structures 4 | 5 | Dynamic array 6 | --- 7 | * Vectors are used as dynamic arrays 8 | * `v.back()` gets last element 9 | * `v.pop_back()` removes the last element (note: it does not return it) 10 | * `vector<int> v(10, 5)` initializes vector of size 10 with value 5 11 | * Use `str.substr(x, y)` to get a substring at x with y length 12 | * `str.find(x)` finds first occurrence of substring x 13 | 14 | Set structures 15 | --- 16 | * Two set architectures: `unordered_set` uses hashing, `set` uses a balanced binary search tree 17 | * `insert()` inserts a value 18 | * `erase()` removes a value 19 | * Use multiset to store non-unique (repeated) elements 20 | 21 | Map structures 22 | --- 23 | * Same as sets: `map` is a balanced BST, `unordered_map` is for hashing (hashmap) 24 | * Syntax: `m["key"] = x;` for insertion, `m.count("key")` checks if key exists in hashmap 25 | * Also use `for (auto x : m)` for going through hashmap. `x.first` is key and `x.second` is value. 26 | 27 | Iterators 28 | --- 29 | * Most things have `.begin()` that starts at first element and `.end()` that points after last element. This makes the range half-open: [begin, end). 30 | * You can do cool stuff with these. `sort(v.begin(), v.end())` and `reverse(v.begin(), v.end())` 31 | * For sets, you can use `auto it = s.begin();`. Then use `*it` to get element.
32 | * ![image](./images/iterator.png) 33 | * ~~~c++ 34 | auto it = s.find(x); 35 | if (it == s.end()) { 36 | // x is not found 37 | } 38 | ~~~ 39 | 40 | Other structures 41 | --- 42 | * `bitset` is array with only values 0 or 1 43 | * `deque` is cooler version of vector where it has `push_front` and `pop_front` 44 | * `stack` only has `push()`, `pop()`, and `top()`. 45 | * `queue` has same as stack but `front()` instead of `top()`. 46 | * `priority_queue` has `push()`, `pop()`, and `top()` 47 | * Policy based data structures can be used as well. `indexed_set` is a data structure that's a set but is indexed like an array. 48 | 49 | Comparison to sorting 50 | --- 51 | * A lot of problems can be solved just by doing sorting and data structures. In fact, a sorting-based solution is sometimes faster in practice than a data-structure solution with a better time complexity. 52 | -------------------------------------------------------------------------------- /chapter005_CompleteSearch.md: -------------------------------------------------------------------------------- 1 | Chapter 5: Complete search 2 | === 3 | * Complete search can solve almost any problem. 4 | * Idea: Generate all possible solutions, select best/count all solutions 5 | * Alternatives: Backtracking, DP, or Greedy normally 6 | 7 | Generating subsets 8 | --- 9 | * Use bits to create subsets! Ex: 11001 is {0, 3, 4}. 10 | * Or just use the basic method. 11 | 12 | Generating permutations 13 | --- 14 | * Use a "chosen" array that keeps track of which numbers have been chosen, or just use `next_permutation`: 15 | * ~~~c++ 16 | vector<int> permutation; 17 | for (int i = 0; i < n; i++) { 18 | permutation.push_back(i); 19 | } 20 | do { 21 | // process permutation 22 | } while (next_permutation(permutation.begin(),permutation.end())); 23 | ~~~ 24 | 25 | Backtracking 26 | --- 27 | * One of my favorite types of algorithms! 28 | * Build the solution piece by piece, but if a partial solution is invalid, don't keep going. Just return.
29 | * ![image](./images/backtrack-queen.png) 30 | * Cool way of doing N-queens problem: Use arrays to represent diagonals, rows, and columns so don't have to use weird loop to see if board is valid. 31 | 32 | Pruning the search 33 | --- 34 | * Optimize backtracking using pruning, as in A.I. search 35 | * Ex: Find the number of paths in a 7x7 matrix that visit all squares 36 | * Optimization 1: There are always two symmetric paths, so fix the first move and search only one of them. 37 | * ![image](./images/complete-search-opt.png) 38 | * Optimization 2: If the path visits the goal early (before covering all squares), terminate 39 | * Optimization 3: If hits wall and can turn left or right, stop search. 40 | * Optimization 4: If can't go forward but can go both left and right, stop. 41 | * Using these optimizations, time goes from 500s -> .6s 42 | 43 | Meet in the middle 44 | --- 45 | * Search space is divided into two halves. Separate searches on each half, then combined. 46 | -------------------------------------------------------------------------------- /chapter006_GreedyAlgorithms.md: -------------------------------------------------------------------------------- 1 | Chapter 6: Greedy algorithms 2 | === 3 | * Greedy algorithms always take the best choice at that moment 4 | 5 | Coin problem 6 | --- 7 | * The coin problem is to form a sum of money n using coin values in an array. 8 | * Greedy Algorithm = Take largest possible coin first 9 | * With basic coin problems where values are multiples of each other, greedy is optimal. 10 | * However, general cases are non-optimal. Ex: Get 6 cents with {1,3,4} cent coins. Optimal = 3+3, greedy gives 4+1+1 11 | * DP can be used to solve general case 12 | 13 | Scheduling 14 | --- 15 | * Given n events with start and end times, find schedule that fits as many events as possible.
16 | * Algorithm 1: Select shortest events first 17 | * Algorithm 2: Select next possible event that begins as early as possible 18 | * Algorithm 3: Select next possible event that ends as early as possible 19 | * Algorithm 3 is the only optimal one of the 3. 20 | 21 | Tasks and deadlines 22 | --- 23 | * Each task has a duration and deadline. Earn deadline-finish_time points. 24 | * Always sort by duration of task. Deadline does not matter. Choose lowest duration task first 25 | 26 | Minimizing sums 27 | --- 28 | * Find a value x that minimizes the sum |a1-x|^c + |a2-x|^c + ... 29 | * For the c = 1 case, choose the median number. 30 | * For the c = 2 case, expand the squares; it turns out the average of the values is optimal. 31 | 32 | Data compression 33 | --- 34 | * Use variable length codewords for different letters. Ex: Regularly, A = 00, B = 01, C = 10, D = 11. Make it shorter by doing A = 0, B = 110, C = 10, etc. 35 | * This requires that no codeword is a prefix of another codeword. 36 | 37 | Huffman coding 38 | --- 39 | * To solve data compression problem, use Huffman coding. 40 | * Greedy algorithm that constructs optimal code for compressing a given string 41 | * Build binary tree based on letter frequency 42 | * When done, LEFT = 0, RIGHT = 1. Each leaf = a letter. 43 | * ![image](./images/huffman-coding.png) -------------------------------------------------------------------------------- /chapter007_DynamicProgramming.md: -------------------------------------------------------------------------------- 1 | Chapter 7: Dynamic programming 2 | === 3 | * Technique that combines the correctness of complete search with the efficiency of greedy 4 | * Can be applied if problem can be divided into overlapping subproblems 5 | * Used for counting solutions + optimal solution 6 | 7 | Coin problem 8 | --- 9 | * Recursively, if coins = {1, 3, 4}, use `solve(x) = min(solve(x-1)+1, solve(x-3)+1, solve(x-4)+1);` 10 | * Define a large constant `INF` (e.g. `const int INF = 1e9;`) to denote infinity; C++ has no built-in integer infinity 11 | * Use memoization to speed things up.
Memoization just remembers solutions to past sub-problems so we don't have to calculate them again. 12 | * Both the recursive memoized version AND the iterative version are DP. 13 | * Iteration is better though since it's shorter and often simpler to read. 14 | * ~~~c++ 15 | value[0] = 0; 16 | for (int x = 1; x <= n; x++) { 17 | value[x] = INF; 18 | for (auto c : coins) { 19 | if (x-c >= 0 && value[x-c]+1 < value[x]) { 20 | value[x] = value[x-c]+1; 21 | first[x] = c; 22 | } 23 | } 24 | } 25 | while (n > 0) { 26 | cout << first[n] << "\n"; 27 | n -= first[n]; 28 | } 29 | ~~~ 30 | * In order to find the solution as well as the answer, use another array to keep track of the "parent" index. `first` does this. 31 | * In order to count the number of solutions, use another array as well that stores the count. 32 | 33 | Longest increasing subsequence 34 | --- 35 | * Longest subsequence (not necessarily contiguous) whose elements strictly increase 36 | * Use DP array where length[k] = longest increasing subsequence ending at k. Search all earlier i with array[i] < array[k] for the largest length. O(n^2). 37 | * There is an O(n log n) solution though that uses binary search 38 | * ~~~c++ 39 | for (int k = 0; k < n; k++) { 40 | length[k] = 1; 41 | for (int i = 0; i < k; i++) { 42 | if (array[i] < array[k]) { 43 | length[k] = max(length[k],length[i]+1); 44 | } 45 | } 46 | } 47 | ~~~ 48 | 49 | Paths in a grid 50 | --- 51 | * Find a path from top-left to bottom-right (moving only right or down) where the sum along the path is minimized/maximized. 52 | * Use `sum(row, col) = max(sum(row,col-1), sum(row-1,col)) + value[row][col]` 53 | 54 | Knapsack 55 | --- 56 | * Set of objects given, subsets with some properties must be found. 57 | * Ex: Given a list of weights, find all sums that can be formed. 58 | * `possible(x, k) = possible(x - w[k], k-1) or possible(x, k-1)` 59 | * Can be done with a 1D array by working backwards 60 | 61 | Edit distance 62 | --- 63 | * The minimum number of operations needed to transform a string into another string. Also known as Levenshtein distance.
64 | * Operations include inserting, removing, or modifying a character. 65 | * Use `distance(a, b)` where a and b are indices of current string and goal string. 66 | * `d(a,b) = min(d(a,b-1)+1, d(a-1,b)+1, d(a-1,b-1)+cost(a,b))` where cost(a,b) = 0 if `x[a] == y[b]` else 1. 67 | * This is representative of min(insert character at end of x, remove character from end of x, modify last character of x). 68 | 69 | Counting tiles 70 | --- 71 | * Find # of distinct ways to fill `n` x `m` grid with 1x2 and 2x1 tiles. 72 | * Use characters to represent different states of each grid space. 73 | * Go row by row down. 74 | * Capital pi = like summation, but with multiplication. There is a surprisingly simple formula to calculate the answer to this problem. Too long to write out though. 75 | -------------------------------------------------------------------------------- /chapter008_AmortizedAnalysis.md: -------------------------------------------------------------------------------- 1 | Chapter 8: Amortized analysis 2 | === 3 | * Method to determine time complexity 4 | * Estimate total time for all operations, not specific ones 5 | 6 | Two pointers method 7 | --- 8 | * Subarray sum (find subarray w sum s) = Two pointers, one start, one end. Each iteration, start pointer moves forward 1, end moves forward as long as the sum stays <= s. 9 | * Ex: `[1, 3, 2, 5, 1, 1, 2, 3], s = 8` . 10 | * Starts with `1, 3, 2` since the next element makes it > 8. 11 | * Moves over 1. Now `3, 2`. Still `5` would make it exceed 8. 12 | * Moves over 1. Now `2`. Add `5`. Add `1`. Now found solution! 13 | * 2SUM (find two elements in array that sum to s) = Sort, one start, one end. Each iteration, start moves forward 1, end moves back while the pair's sum is > s; a pair summing to exactly s is a solution. 14 | 15 | Nearest smaller elements 16 | --- 17 | * Use a stack to solve this problem. Push items to stack, then pop off the stack while the top of the stack is greater than or equal to the current element. 18 | 19 | Sliding window 20 | --- 21 | * A subarray that moves from left to right. Generally used for strings.
22 | * Ex: Find the minimum sum of elements in a 4-length subarray. 23 | * Use queue to do this :D -------------------------------------------------------------------------------- /chapter009_RangeQueries.md: -------------------------------------------------------------------------------- 1 | Chapter 9: Range queries 2 | === 3 | * Discussion of DS that process range queries 4 | * Range query = calculate value based on subarray of array 5 | * Ex: Find min between index `a` and index `b`. This is known as a query. 6 | 7 | Static array queries 8 | --- 9 | * Static array = array where values are not updated between queries. 10 | * Sum queries 11 | * Use prefix sum array! prefix[i] = sum of elements from index 0 to i. 12 | * That way, you can calculate a sum in O(1) by doing `prefix[b] - prefix[a-1]`. 13 | * Can also be done in 2D. 14 | * ![image](./images/2d-prefix-sum.png) 15 | * With S(X) = 2D prefix sum from the grid's top-left corner down to point X: if point A is the rectangle's bottom right and point D its top left, the rectangle sum is `S(A) - S(B) - S(C) + S(D)` where B and C are the other corners. 16 | * Minimum queries 17 | * Idea: Precalculate all values of `min(a,b)` where `b - a + 1` is a power of 2. 18 | * Precalculations done by: `minq(a, b) = min(minq(a, a+w-1), minq(a+w,b))` where `b-a+1` is a power of 2 and `w` is half of that. 19 | * Find any query: `min(minq(a, a+k-1), minq(b-k+1,b))`. k is the largest power of 2 not exceeding b-a+1; the two `minq` ranges overlap, but their union covers [a, b]. 20 | 21 | Binary indexed tree 22 | --- 23 | * Also known as a Fenwick tree. Dynamic variant of prefix sum array. 24 | * Allows updating array values in the prefix sum array in O(log n) time instead of O(n) (recreating the prefix sum array). 25 | * `tree[k] = sumq(k - p(k) + 1, k)` where p(k) is largest power of 2 that divides k. 26 | * Sums can be easily calculated due to having non-overlapping sums of zones. 27 | * Use bit operations.
p(k) can be easily found by doing `p(k) = k&-k` 28 | 29 | Segment tree 30 | --- 31 | * Binary tree so that leaf nodes correspond to array elements. 32 | * Supports almost all range queries 33 | * Use array to create it where `tree[1]` is the root, `tree[2]` is the left child of root, `tree[3]` is right child, and `tree[n]` is the start of the leaves. 34 | * ![image](./images/segment-tree.png) 35 | * Can also be used to find minimums/maximums in O(log n) time. 36 | 37 | Additional techniques 38 | --- 39 | * If we want to increase a bunch of elements by x, use a difference array. 40 | * Difference array = d[i] = a[i] - a[i-1]. 41 | * To add x to range [a, b], just increase d[a] by x and decrease d[b+1] by x. 42 | -------------------------------------------------------------------------------- /chapter010_BitManipulation.md: -------------------------------------------------------------------------------- 1 | Chapter 10: Bit manipulation 2 | === 3 | * Computers use binary to represent data 4 | 5 | Bit representation 6 | --- 7 | * Use two's complement for negative numbers: invert all the bits, then add one 8 | * Can get two's complement really easily by converting negative `int` var to `unsigned int` var 9 | 10 | Bit operations 11 | --- 12 | * `&` = AND 13 | * `|` = OR 14 | * `^` = XOR 15 | * `~` = NOT 16 | * `<<` shifts left (filling with 0's) and `>>` shifts right 17 | * `x ^ (1 << k)` inverts the `k`th bit of `x` 18 | * `__builtin_clz(x)` gets number of zeros at beginning of `x` 19 | * `__builtin_popcount(x)` gets number of ones 20 | 21 | Representing sets 22 | --- 23 | * Use 32-bit type to represent a set {0..31} where each bit reps a num 24 | * Very useful for intersect, union, complement, difference, etc. due to just needing a single bit operation between two ints to get them. 25 | * Go through subsets of x: `b = 0; do{}while (b = (b-x)&x);`.
26 | 27 | Bit optimizations 28 | --- 29 | * Hamming distance = number of positions where two strings differ 30 | * Problem: Find minimum hamming distance between two strings in list of strings 31 | * If length of string is small and only two characters, use `__builtin_popcount(a ^ b);` - Almost 30 times faster 32 | * Problem: Given `n` x `n` grid where each square is black or white, count subgrids where all corners are black 33 | * Use `__builtin_popcount(color[a][i]&color[b][i]);` with rows packed into N 64-bit blocks. 34 | 35 | Dynamic programming 36 | --- 37 | * Bit operations are convenient for storing state representations in dynamic programming problems 38 | * Problem: Given k products over n days, have to buy all products but only one per day. 39 | * Did not understand the solution to this problem 40 | * Change iteration over permutations to iteration over subsets 41 | * Problem: Elevator with max weight `x`, `n` people want to go up. Minimum number of rides? 42 | * Classic knapsack problem.
43 | * For each subset, calculate minimum number of rides AND minimum weight of people in last group 44 | 45 | -------------------------------------------------------------------------------- /chapter011_GraphBasics.md: -------------------------------------------------------------------------------- 1 | Chapter 11: Basics of graphs 2 | === 3 | * Lots of problems can be solved via graphs 4 | 5 | Graph terminology 6 | --- 7 | * Consists of nodes and edges 8 | * Path = from node a to node b 9 | * Simple = each node only appears once 10 | * Cycle = First and last node are same 11 | * Tree = connected graph with `n` nodes and `n-1` edges 12 | * Directed = edges only go one way 13 | * Neighbor/adjacent = edge between them 14 | * Degree = number of neighbors a node has 15 | * Regular = every node has constant `d` degree 16 | * Complete = every node is connected to every other node 17 | * Indegree = number of edges that end at the node, outdegree = number of edges that start at it 18 | * Coloring = each node is assigned a color so that no adjacent nodes are of same color 19 | * Bipartite = Possible for a graph to be colored w 2 colors 20 | * Simple (graph) = No multiple edges between nodes 21 | 22 | Graph representation 23 | --- 24 | * Adjacency list = List of lists of connections, i.e. `adj[1]` gets all of the 1st node's connections 25 | * Weighted graphs can be stored using pairs instead of just ints 26 | * Adjacency matrix = Really cool matrix version of adjacency list. `adj[a][b]` is the weight of the connection between `a` and `b`.
27 | * Edge list = Just a 1D list of pairs of nodes that represent edges 28 | -------------------------------------------------------------------------------- /chapter012_GraphTraversal.md: -------------------------------------------------------------------------------- 1 | Chapter 12: Graph traversal 2 | === 3 | * DFS and BFS are super important to basic questions (LC) 4 | 5 | Depth-first search 6 | --- 7 | * O(m+n), m = edges, n = nodes 8 | * Goes as deep as possible first (depth) 9 | * Uses stack/recursion (take a look at code below for BFS, which is the queue analogue) 10 | 11 | Breadth-first search 12 | --- 13 | * Opposite of DFS - uses queue 14 | * Goes as wide as possible first (breadth) 15 | * ~~~c++ 16 | visited[x] = true; 17 | distance[x] = 0; 18 | q.push(x); 19 | while (!q.empty()) { 20 | int s = q.front(); 21 | q.pop(); 22 | // process node s 23 | for (auto u : adj[s]) { 24 | if (visited[u]) continue; 25 | visited[u] = true; 26 | distance[u] = distance[s]+1; 27 | q.push(u); 28 | } 29 | } 30 | ~~~ 31 | 32 | Applications 33 | --- 34 | * Check if a node is connected to other nodes - use BFS 35 | * Find cycles in the graph using DFS 36 | * Bipartiteness checks 37 | * Bipartite = can be colored with two colors 38 | * For k >= 3 colors, the coloring problem is NP-hard. 39 | -------------------------------------------------------------------------------- /chapter013_ShortestPaths.md: -------------------------------------------------------------------------------- 1 | Chapter 13: Shortest paths 2 | === 3 | * Finding shortest path is incredibly important problem. 4 | * In unweighted graphs, we can just use BFS to solve for shortest paths 5 | * This chapter will focus on weighted graphs 6 | 7 | Bellman-Ford algorithm 8 | --- 9 | * Finds shortest paths from starting node to all other nodes 10 | * Can detect negative cycles, but shortest paths aren't well defined if one exists 11 | * ![bellman-ford](./images/bellman-ford.png) 12 | * Implemented fairly easily using two for loops. tie(a,b,w) gets a, b, and w from e.
13 | * ~~~c++ 14 | for (int i = 1; i <= n; i++) distance[i] = INF; 15 | distance[x] = 0; 16 | for (int i = 1; i <= n-1; i++) { 17 | for (auto e : edges) { 18 | int a, b, w; 19 | tie(a, b, w) = e; 20 | distance[b] = min(distance[b], distance[a]+w); 21 | } 22 | } 23 | ~~~ 24 | * To detect negative cycle, run `n` rounds instead of `n-1`. If anything decreases, there's a negative cycle. 25 | 26 | Dijkstra's algorithm 27 | --- 28 | * Finds shortest paths from starting node to all other nodes 29 | * More efficient than Bellman-Ford, but harder to implement 30 | * Requires no negative weights. Similar to BFS in chapter 12. (`q` is a max-heap `priority_queue`, so distances are pushed negated to pop the smallest first.) 31 | * ~~~c++ 32 | for (int i = 1; i <= n; i++) distance[i] = INF; 33 | distance[x] = 0; 34 | q.push({0,x}); 35 | while (!q.empty()) { 36 | int a = q.top().second; 37 | q.pop(); 38 | if (processed[a]) continue; 39 | processed[a] = true; 40 | for (auto u : adj[a]) { 41 | int b = u.first, w = u.second; 42 | if (distance[a]+w < distance[b]) { 43 | distance[b] = distance[a]+w; 44 | q.push({-distance[b],b}); 45 | } 46 | } 47 | } 48 | ~~~ 49 | 50 | Floyd-Warshall algorithm 51 | --- 52 | * Finds all shortest paths in a single run 53 | * Maintains 2D array that keeps track of distances 54 | * Initially, filled with infinity for distances it doesn't know (pairs with no direct edge). Example: 55 | * ![image](./images/floyd-warshall-initial.png) 56 | * Consists of multiple rounds. Each round, algorithm selects new node that acts as intermediate node in paths. 57 | * After initial setup, can be done very easily in O(n^3).
58 | * ~~~c++ 59 | for (int k = 1; k <= n; k++) { 60 | for (int i = 1; i <= n; i++) { 61 | for (int j = 1; j <= n; j++) { 62 | distance[i][j] = min(distance[i][j],distance[i][k]+distance[k][j]); 63 | } 64 | } 65 | } 66 | ~~~ 67 | -------------------------------------------------------------------------------- /chapter014_TreeAlgorithms.md: -------------------------------------------------------------------------------- 1 | Chapter 14: Tree algorithms 2 | === 3 | * Tree is a connected, acyclic graph of `n` nodes and `n-1` edges. 4 | * Leaves = nodes with degree 1. 5 | * Rooted tree = all other nodes are below root node 6 | 7 | Tree traversal 8 | --- 9 | * Use DP to calculate a number of things 10 | * Ex: Calculate number of nodes in subtree 11 | * ~~~c++ 12 | void dfs(int s, int e) { 13 | count[s] = 1; 14 | for (auto u : adj[s]) { 15 | if (u == e) continue; 16 | dfs(u, s); 17 | count[s] += count[u]; 18 | } 19 | } 20 | ~~~ 21 | 22 | Diameter 23 | --- 24 | * Diameter is the maximum length of a path between two nodes 25 | * In a rooted tree, every path has a highest node 26 | * Algorithm 1: Calculate `toLeaf(x)` (max length of a path from x down to a leaf) and `maxLength(x)` (longest path with x as its highest node) = sum of the two largest `toLeaf` values of x's children 27 | * Algorithm 2: Choose random node, find farthest node `b` from it. Then, find farthest node `c` from `b`. The path from `b` to `c` is a diameter. 28 | 29 | All longest paths 30 | --- 31 | * Calculate the maximum length of a path that begins at a node for all nodes 32 | * Solve using maxLength1(x) = max length of path from x and maxLength2(x) = max length in another direction than maxLength1 33 | * If path `maxLength1(parent)` goes through x, `maxLength(x) = maxLength2(parent)+1` 34 | * Else, `maxLength(x) = maxLength1(parent)+1` 35 | 36 | Binary trees 37 | --- 38 | * BT is rooted tree with left and right subtrees.
39 | * Preorder = Process root, traverse left, traverse right 40 | * Inorder = Traverse left, process root, traverse right 41 | * Postorder = Traverse left, traverse right, process root 42 | 43 | -------------------------------------------------------------------------------- /chapter015_SpanningTrees.md: -------------------------------------------------------------------------------- 1 | Chapter 15: Spanning trees 2 | === 3 | * Convert graph into tree w all nodes, some edges 4 | * Weight = sum of all edge weights 5 | * Minimum spanning tree (MST) = weight is as small as possible 6 | 7 | Kruskal's algorithm 8 | --- 9 | * Initial: Only nodes, no edges 10 | * Sort edges by weight, add each edge to tree if it creates no cycle 11 | * An edge creates a cycle if its two endpoints are already in the same component 12 | * In order to check if the nodes are already part of the same component, use union-find structure. 13 | 14 | Union-find structure 15 | --- 16 | * Union-find struct: Collection of sets that are disjoint 17 | * Supports uniting two sets together, and finding which set an element belongs to 18 | * Use representative nodes for each set (representative = root of set tree basically) 19 | * Join two sets by connecting each representative, smaller set linked to larger set 20 | * ![image](./images/union-find.png) 21 | * Keep track of size + links using `size` and `link` arrays. `link` holds index of above element in chain 22 | * ~~~c++ 23 | int find(int x) { 24 | while (x != link[x]) x = link[x]; 25 | return x; 26 | } 27 | 28 | bool same(int a, int b) { 29 | return find(a) == find(b); 30 | } 31 | 32 | void unite(int a, int b) { 33 | a = find(a); 34 | b = find(b); 35 | if (size[a] < size[b]) swap(a,b); 36 | size[a] += size[b]; 37 | link[b] = a; 38 | } 39 | ~~~ 40 | 41 | Prim's algorithm 42 | --- 43 | * Add an arbitrary start node to tree. Then, repeatedly add the minimum weight edge that adds a new node to the tree 44 | * Modify Dijkstra's algorithm for implementation.
45 | -------------------------------------------------------------------------------- /images/2d-prefix-sum.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/2d-prefix-sum.png -------------------------------------------------------------------------------- /images/backtrack-queen.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/backtrack-queen.png -------------------------------------------------------------------------------- /images/bellman-ford.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/bellman-ford.png -------------------------------------------------------------------------------- /images/complete-search-opt.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/complete-search-opt.png -------------------------------------------------------------------------------- /images/count-sort.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/count-sort.png -------------------------------------------------------------------------------- /images/floyd-warshall-initial.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/floyd-warshall-initial.png 
-------------------------------------------------------------------------------- /images/huffman-coding.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/huffman-coding.png -------------------------------------------------------------------------------- /images/iterator.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/iterator.png -------------------------------------------------------------------------------- /images/segment-tree.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/segment-tree.png -------------------------------------------------------------------------------- /images/time-complexity-input.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/time-complexity-input.png -------------------------------------------------------------------------------- /images/union-find.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/osu-acm/comp-programmers-handbook/164f17f5db9ff2f2ee6afbfa5f5c0d333b21b85a/images/union-find.png --------------------------------------------------------------------------------