# Top 53 Binary Tree Data Structure Interview Questions in 2025
9 | 10 | #### You can also find all 53 answers here 👉 [Devinterview.io - Binary Tree Data Structure](https://devinterview.io/questions/data-structures-and-algorithms/binary-tree-data-structure-interview-questions) 11 | 12 |
13 | 14 | ## 1. What is a _Tree Data Structure_? 15 | 16 | A **tree data structure** is a hierarchical collection of nodes, typically visualized with a root at the top. Trees are typically used for representing relationships, hierarchies, and facilitating efficient data operations. 17 | 18 | ### Core Definitions 19 | 20 | - **Node**: The basic unit of a tree that contains data and may link to child nodes. 21 | - **Root**: The tree's topmost node; no nodes point to the root. 22 | - **Parent / Child**: Nodes with a direct connection; a parent points to its children. 23 | - **Leaf**: A node that has no children. 24 | - **Edge**: A link or reference from one node to another. 25 | - **Depth**: The level of a node, or its distance from the root. 26 | - **Height**: Maximum depth of any node in the tree. 27 | 28 | ### Key Characteristics 29 | 30 | 1. **Hierarchical**: Organized in parent-child relationships. 31 | 2. **Non-Sequential**: Non-linear data storage ensures flexible and efficient access patterns. 32 | 3. **Directed**: Nodes are connected unidirectionally. 33 | 4. **Acyclic**: Trees do not have loops or cycles. 34 | 5. **Diverse Node Roles**: Such as root and leaf. 35 | 36 | ### Visual Representation 37 | 38 | ![Tree Data Structure](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2FTreedatastructure%20(1).png?alt=media&token=d6b820e4-e956-4e5b-8190-2f8a38acc6af&_gl=1*3qk9u9*_ga*OTYzMjY5NTkwLjE2ODg4NDM4Njg.*_ga_CW55HF8NVT*MTY5NzI4NzY1Ny4xNTUuMS4xNjk3Mjg5NDU1LjUzLjAuMA..) 39 | 40 | ### Common Tree Variants 41 | 42 | - **Binary Tree**: Each node has a maximum of two children. 43 | - **Binary Search Tree (BST)**: A binary tree where each node's left subtree has values less than the node and the right subtree has values greater. 44 | - **AVL Tree**: A BST that self-balances to optimize searches. 45 | - **B-Tree**: Commonly used in databases to enable efficient access. 46 | - **Red-Black Tree**: A BST that maintains balance using node coloring. 47 | - **Trie**: Specifically designed for efficient string operations. 48 | 49 | ### Practical Applications 50 | 51 | - **File Systems**: Model directories and files. 52 | - **AI and Decision Making**: Decision trees help in evaluating possible actions. 53 | - **Database Systems**: Many databases use trees to index data efficiently. 54 | 55 | ### Tree Traversals 56 | 57 | #### Depth-First Search 58 | 59 | - **Preorder**: Root, Left, Right. 60 | - **Inorder**: Left, Root, Right (specific to binary trees). 61 | - **Postorder**: Left, Right, Root. 62 | 63 | #### Breadth-First Search 64 | 65 | - **Level Order**: Traverse nodes by depth, moving from left to right. 66 | 67 | ### Code Example: Binary Tree 68 | 69 | Here is the Python code: 70 | 71 | ```python 72 | class Node: 73 | def __init__(self, data): 74 | self.left = None 75 | self.right = None 76 | self.data = data 77 | 78 | # Create a tree structure 79 | root = Node(1) 80 | root.left, root.right = Node(2), Node(3) 81 | root.left.left, root.right.right = Node(4), Node(5) 82 | 83 | # Inorder traversal 84 | def inorder_traversal(node): 85 | if node: 86 | inorder_traversal(node.left) 87 | print(node.data, end=' ') 88 | inorder_traversal(node.right) 89 | 90 | # Expected Output: 4 2 1 3 5 91 | print("Inorder Traversal: ") 92 | inorder_traversal(root) 93 | ``` 94 |
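Level-order traversal is listed above under breadth-first search but not shown in code. Here is a minimal queue-based sketch, assuming the same `Node` class and the sample tree built in the code above:

```python
from collections import deque

def level_order_traversal(root):
    """Visit nodes level by level, left to right, using a FIFO queue."""
    if root is None:
        return []
    result, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        result.append(node.data)
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
    return result

# For the tree built above (root 1, children 2 and 3, grandchildren 4 and 5):
# level_order_traversal(root) -> [1, 2, 3, 4, 5]
```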
95 | 96 | ## 2. What is a _Binary Tree_? 97 | 98 | A **Binary Tree** is a hierarchical structure where each node has up to two children, termed as **left child** and **right child**. Each node holds a data element and pointers to its left and right children. 99 | 100 | ### Binary Tree Types 101 | 102 | - **Full Binary Tree**: Nodes either have two children or none. 103 | - **Complete Binary Tree**: Every level, except possibly the last, is completely filled, with nodes skewed to the left. 104 | - **Perfect Binary Tree**: All internal nodes have two children, and leaves exist on the same level. 105 | 106 | ### Visual Representation 107 | 108 | ![Binary Tree Types](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2Ftree-types.png?alt=media&token=847de252-5545-4a29-9e28-7a7e93c8e657) 109 | 110 | ### Applications 111 | 112 | - **Binary Search Trees**: Efficient in lookup, addition, and removal operations. 113 | - **Expression Trees**: Evaluate mathematical expressions. 114 | - **Heap**: Backbone of priority queues. 115 | - **Trie**: Optimized for string searches. 116 | 117 | ### Code Example: Binary Tree & In-order Traversal 118 | 119 | Here is the Python code: 120 | 121 | ```python 122 | class Node: 123 | """Binary tree node with left and right child.""" 124 | def __init__(self, data): 125 | self.left = None 126 | self.right = None 127 | self.data = data 128 | 129 | def insert(self, data): 130 | """Inserts a node into the tree.""" 131 | if data < self.data: 132 | if self.left is None: 133 | self.left = Node(data) 134 | else: 135 | self.left.insert(data) 136 | elif data > self.data: 137 | if self.right is None: 138 | self.right = Node(data) 139 | else: 140 | self.right.insert(data) 141 | 142 | def in_order_traversal(self): 143 | """Performs in-order traversal and returns a list of nodes.""" 144 | nodes = [] 145 | if self.left: 146 | nodes += self.left.in_order_traversal() 147 | nodes.append(self.data) 148 | if self.right: 149 | nodes += self.right.in_order_traversal() 150 | return nodes 151 | 152 | 153 | # Example usage: 154 | # 1. Instantiate the root of the tree 155 | root = Node(50) 156 | 157 | # 2. Insert nodes (This will implicitly form a Binary Search Tree for simplicity) 158 | values_to_insert = [30, 70, 20, 40, 60, 80] 159 | for val in values_to_insert: 160 | root.insert(val) 161 | 162 | # 3. Perform in-order traversal 163 | print(root.in_order_traversal()) # Expected Output: [20, 30, 40, 50, 60, 70, 80] 164 | ``` 165 |
## 3. What is _Binary Heap_?

A **Binary Heap** is a special binary tree that satisfies the **Heap Property**: parent nodes are ordered relative to their children.

There are two types of binary heaps:

- **Min Heap**: Parent nodes are less than or equal to their children.
- **Max Heap**: Parent nodes are greater than or equal to their children.

### Key Characteristics

1. **Shape Property**: A binary heap is a complete binary tree, which means all its levels are filled except possibly the last one, which is filled from the left.
2. **Heap Property**: Nodes follow a specific order—either min heap or max heap—relative to their children.

### Visual Representation

![Min Heap and Max Heap Example](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2Fmax-heap-min-heap%20(1).png?alt=media&token=3c2136ee-ada1-41c9-9ddb-590e4338f585)

### Array-Based Representation

Due to the **complete binary tree structure**, binary heaps are often implemented as **arrays**, offering both spatial efficiency and cache-friendly access patterns.

- **Root Element**: Stored at index 0.
- **Child-Parent Mapping** (for a node at index `i`):
  - Left child: `(2 * i) + 1`
  - Right child: `(2 * i) + 2`
  - Parent: `(i - 1) // 2` (integer division)

#### Example Array

The following array satisfies the min-heap property: every parent is less than or equal to its children.

```plaintext
Index:    0  1  2  3  4  5  6  7  8  9  10
Elements: 0  1  2  6  3  7  8  9  10 5  4
```

#### Advantages

- **Memory Efficiency**: No extra pointers needed.
- **Cache Locality**: Adjacent elements are stored closely, aiding cache efficiency.

#### Limitations

- **Array Sizing**: With a fixed-size array, the capacity must be chosen in advance.
- **Percolation**: Insertions and deletions may require element swapping, adding computational overhead.

### Code Example: Array-based Binary Heap Operations

Here is the Python code:

```python
class BinaryHeap:
    def __init__(self, array):
        self.heap = array

    def get_parent_index(self, index):
        return (index - 1) // 2

    def get_left_child_index(self, index):
        return (2 * index) + 1

    def get_right_child_index(self, index):
        return (2 * index) + 2

# Example usage
heap = BinaryHeap([0, 1, 2, 6, 3, 7, 8, 9, 10, 5, 4])
parent_index = heap.get_parent_index(4)
left_child_index = heap.get_left_child_index(1)
print(f"Parent index of node at index 4: {parent_index}")
print(f"Left child index of node at index 1: {left_child_index}")
```
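The `BinaryHeap` class above only exposes the index arithmetic. To show how the heap property is actually maintained, here is a minimal, hedged sketch of min-heap insertion (sift-up) and extract-min (sift-down) on the same array layout; the names `heap_push` and `heap_pop` are illustrative, not part of the original class.

```python
def heap_push(heap, value):
    """Insert into a min-heap: append, then sift the new element up."""
    heap.append(value)
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] <= heap[i]:
            break
        heap[parent], heap[i] = heap[i], heap[parent]
        i = parent

def heap_pop(heap):
    """Remove and return the minimum: move the last element to the root, then sift it down."""
    heap[0], heap[-1] = heap[-1], heap[0]
    smallest = heap.pop()
    i, n = 0, len(heap)
    while True:
        left, right, best = 2 * i + 1, 2 * i + 2, i
        if left < n and heap[left] < heap[best]:
            best = left
        if right < n and heap[right] < heap[best]:
            best = right
        if best == i:
            break
        heap[i], heap[best] = heap[best], heap[i]
        i = best
    return smallest

h = []
for v in [5, 3, 8, 1]:
    heap_push(h, v)
print([heap_pop(h) for _ in range(len(h))])  # [1, 3, 5, 8]
```

In practice, Python's built-in `heapq` module provides the same min-heap operations on a plain list.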
## 4. What is a _Binary Search Tree_?

A **Binary Search Tree** (BST) is a binary tree optimized for quick lookup, insertion, and deletion operations. A BST has the distinct property that each node's left subtree contains values smaller than the node, and its right subtree contains values larger.

### Key Characteristics

- **Sorted Elements**: Enables efficient searching and range queries.
- **Recursive Definition**: Each node and its subtrees also form a BST.
- **Unique Elements**: Generally, BSTs do not allow duplicates, although variations exist.

### Visual Representation

![Binary Tree vs BST](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2Fvalid-binary-search-tree-example.png?alt=media&token=5821a405-7991-4c92-976b-b187a5a25fe3)

### Formal Properties

For any node $N$ in the BST:

$$
\begin{aligned}
\forall L \in \text{Left-Subtree}(N) &: \text{Value}(L) < \text{Value}(N) \\
\forall R \in \text{Right-Subtree}(N) &: \text{Value}(R) > \text{Value}(N)
\end{aligned}
$$

### Practical Applications

- **Databases**: Used for efficient indexing.
- **File Systems**: Employed in OS for file indexing.
- **Text Editors**: Powers auto-completion and suggestions.

### Time Complexity

- **Search**: $O(\log n)$ in balanced trees; $O(n)$ in skewed trees.
- **Insertion**: Averages $O(\log n)$; worst case is $O(n)$.
- **Deletion**: Averages $O(\log n)$; worst case is $O(n)$.

### Code Example: Validating a BST

Here is the Python code:

```python
def is_bst(node, low=float('-inf'), high=float('inf')):
    """Checks that every node's value lies within the (low, high) bounds implied by its ancestors."""
    if node is None:
        return True
    if not low < node.value < high:
        return False
    return (is_bst(node.left, low, node.value) and
            is_bst(node.right, node.value, high))
```
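As a small companion sketch (assuming nodes expose the same `value`, `left`, and `right` attributes used above), an iterative lookup follows a single root-to-leaf path, which is where the $O(\log n)$ bound for balanced trees comes from:

```python
def bst_search(node, target):
    """Return the node holding `target`, or None if it is absent."""
    while node is not None:
        if target == node.value:
            return node
        # Go left for smaller targets, right for larger ones.
        node = node.left if target < node.value else node.right
    return None
```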
## 5. What is _AVL Tree_? How to _Balance_ it?

**AVL Trees**, named after their inventors Adelson-Velsky and Landis, are a special type of binary search tree (BST) that **self-balance**. This balancing keeps the time complexity of operations like search, insert, and delete at $O(\log n)$.

### Balance Criterion

Each node in an AVL Tree must satisfy the following balance criterion to maintain self-balancing:

$$
\text{BalanceFactor}(N) = \text{height}(L) - \text{height}(R) \in \{-1, 0, 1\}
$$

If a node's **Balance Factor** falls outside this range, the tree needs rebalancing.

This involves three steps:

1. Evaluate each node's balance factor.
2. Identify the type of imbalance: left-heavy, right-heavy, or one requiring a double rotation.
3. Perform the necessary rotations to restore balance.

### Visual Representation

![AVL Tree Balance](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2Favl-tree-1.png?alt=media&token=23c747ed-29f4-4b43-a1f2-b274cf4131fe)

### Types of Rotations for Rebalancing

#### Single Rotations

- **Left Rotation**: Applied when the right subtree is taller (the right-right case).
  ![Left-Left Rotation](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2FLL%20Rotation%20(1).png?alt=media&token=fe873921-147c-4639-a5d8-4ba83abb111b)

- **Right Rotation**: Applied when the left subtree is taller (the left-left case).
  ![Right-Right Rotation](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2FRR%20Rotation%20(1).png?alt=media&token=be8009dc-1c40-4096-85e9-ce65f320880f)

#### Double Rotations

- **Left-Right (LR) Rotation**: Handles a left subtree that is taller because of its right child.
  ![Left-Right Rotation](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2FLR%20Rotation%20(1).png?alt=media&token=d8db235b-f6f7-49e5-b4c4-5e4e2529aa70)

- **Right-Left (RL) Rotation**: The mirror image of LR; handles a right subtree that is taller because of its left child.
  ![Right-Left Rotation](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2FRL%20Rotation%20(1).png?alt=media&token=c18900f3-7fe9-4c7e-8ba8-f74cb6d8ecc3)

### Code Example: AVL Operations

Here is the Python code:

```python
class Node:
    def __init__(self, key):
        self.left = None
        self.right = None
        self.key = key
        self.height = 1

def get_height(node):
    """Height of a subtree; an empty subtree has height 0."""
    return node.height if node else 0

def left_rotate(z):
    """Promotes z's right child and refreshes the cached heights."""
    y = z.right
    T2 = y.left
    y.left = z
    z.right = T2
    z.height = 1 + max(get_height(z.left), get_height(z.right))
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    return y
```
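To complete the picture, here is a hedged sketch of the mirror rotation and of how the balance factor can drive the choice of rotation; it reuses the `Node`, `get_height`, and `left_rotate` definitions above, and `rebalance` is an illustrative helper name rather than a standard API.

```python
def get_balance(node):
    return get_height(node.left) - get_height(node.right) if node else 0

def right_rotate(z):
    """Mirror image of left_rotate: promotes the left child."""
    y = z.left
    T3 = y.right
    y.right = z
    z.left = T3
    z.height = 1 + max(get_height(z.left), get_height(z.right))
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    return y

def rebalance(node):
    """Return the new subtree root after fixing an AVL violation at `node`."""
    balance = get_balance(node)
    if balance > 1:                      # left-heavy
        if get_balance(node.left) < 0:   # left-right case: rotate the child left first
            node.left = left_rotate(node.left)
        return right_rotate(node)        # left-left case
    if balance < -1:                     # right-heavy
        if get_balance(node.right) > 0:  # right-left case: rotate the child right first
            node.right = right_rotate(node.right)
        return left_rotate(node)         # right-right case
    return node
```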
355 | 356 | ## 6. What is a _Red-Black Tree_? 357 | 358 | A **Red-Black Tree** is a self-balancing binary search tree that optimizes both search and insertion/deletion operations. It accomplishes this via a set of rules known as **red-black balance**, making it well-suited for practical applications. 359 | 360 | ### Key Characteristics 361 | 362 | - **Root**: Always black. 363 | - **Red Nodes**: Can only have black children. 364 | - **Black Depth**: Every path from a node to its descendant leaves contains the same count of black nodes. 365 | 366 | These rules ensure a **balanced tree**, where the **longest** path is no more than twice the length of the **shortest** one. 367 | 368 | ### Benefits 369 | 370 | - **Efficiency**: Maintains $O(\log n)$ operations even during insertions/deletions. 371 | - **Simplicity**: Easier to implement than some other self-balancing trees like AVL trees. 372 | 373 | ### Visual Representation 374 | 375 | Nodes in a **Red-Black Tree** are visually differentiated by color. Memory-efficient implementations often use a single bit for color, with '1' for red and '0' for black. 376 | 377 | ![Red-Black Tree Example](https://upload.wikimedia.org/wikipedia/commons/6/66/Red-black_tree_example.svg) 378 | 379 | ### Complexity Analysis 380 | 381 | - **Time Complexity**: 382 | - Search: $O(\log n)$ 383 | - Insert/Delete: $O(\log n)$ 384 | - **Space Complexity**: $O(n)$ 385 | 386 | ### Code Example: Red-Black Tree 387 | 388 | Here is the Python code: 389 | 390 | ```python 391 | class Node: 392 | def __init__(self, val, color): 393 | self.left = None 394 | self.right = None 395 | self.val = val 396 | self.color = color # 'R' for red, 'B' for black 397 | 398 | class RedBlackTree: 399 | def __init__(self): 400 | self.root = None 401 | 402 | def insert(self, val): 403 | new_node = Node(val, 'R') 404 | if not self.root: 405 | self.root = new_node 406 | self.root.color = 'B' # Root is always black 407 | else: 408 | self._insert_recursive(self.root, new_node) 409 | 410 | def _insert_recursive(self, root, node): 411 | if root.val < node.val: 412 | if not root.right: 413 | root.right = node 414 | else: 415 | self._insert_recursive(root.right, node) 416 | else: 417 | if not root.left: 418 | root.left = node 419 | else: 420 | self._insert_recursive(root.left, node) 421 | 422 | self._balance(node) 423 | 424 | def _balance(self, node): 425 | # Red-black balancing logic here 426 | pass 427 | ``` 428 |
429 | 430 | ## 7. How is an _AVL Tree_ different from a _B-Tree_? 431 | 432 | Balanced search trees, such as **AVL Trees** and **B-Trees** - are designed primarily for optimized and **fast search operations**. However, each tree has distinct core properties and specific applications. 433 | 434 | ### Key Distinctions 435 | 436 | #### Structural Characteristics 437 | 438 | - **AVL Trees**: These are self-adjusting Binary Search Trees with nodes that can have up to two children. Balancing is achieved through rotations. 439 | 440 | - **B-Trees**: Multi-way trees where nodes can house multiple children, balancing is maintained via key redistribution. 441 | 442 | #### Storage Optimization 443 | 444 | - **AVL Trees**: Best suited for in-memory operations, optimizing searches in RAM. Their efficiency dwindles in disk storage due to pointer overhead. 445 | 446 | - **B-Trees**: Engineered for disk-based storage, minimizing I/O operations, making them ideal for databases and extensive file systems. 447 | 448 | #### Data Housing Approach 449 | 450 | - **AVL Trees**: Utilize dynamic memory linked via pointers, which can be more memory-intensive. 451 | 452 | - **B-Trees**: Data is stored in disk blocks, optimizing access by reducing disk I/O. 453 | 454 | #### Search Efficiency 455 | 456 | - Both types ensure $O(\log n)$ search time. However, B-Trees often outpace AVL Trees in large datasets due to their multi-way branching. 457 | 458 | ### Code Example: AVL Tree 459 | 460 | Here is the Python code: 461 | 462 | ```python 463 | class Node: 464 | def __init__(self, value): 465 | self.value = value 466 | self.left = None 467 | self.right = None 468 | self.height = 1 469 | 470 | class AVLTree: 471 | def __init__(self): 472 | self.root = None 473 | # Additional methods for insertion, deletion, and balancing. 474 | ``` 475 | 476 | ### Code Example: B-Tree 477 | 478 | Here is the Python code: 479 | 480 | ```python 481 | class BTreeNode: 482 | def __init__(self, leaf=False): 483 | self.leaf = leaf 484 | self.keys = [] 485 | self.child = [] 486 | # Additional methods for operations. 487 | 488 | class BTree: 489 | def __init__(self, t): 490 | self.root = BTreeNode(True) 491 | self.t = t 492 | # Methods for traversal, search, etc. 493 | ``` 494 |
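To make the multi-way branching concrete, here is a hedged sketch of B-Tree search over the `BTreeNode` skeleton above; it assumes each node keeps `keys` sorted and that `child[i]` holds the keys falling between `keys[i-1]` and `keys[i]`.

```python
def btree_search(node, key):
    """Return (node, index) for `key`, or None if the key is absent."""
    i = 0
    # Find the first key >= key within this node (keys are kept sorted).
    while i < len(node.keys) and key > node.keys[i]:
        i += 1
    if i < len(node.keys) and node.keys[i] == key:
        return node, i
    if node.leaf:                     # nowhere left to descend
        return None
    return btree_search(node.child[i], key)
```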
## 8. How can a _Fenwick Tree (Binary Indexed Tree)_ be beneficial in algorithm design?

The **Fenwick Tree**, or Binary Indexed Tree (BIT), is an extremely efficient data structure particularly suited for **range queries and point updates** on large sequential datasets, like arrays. Its primary strength lies in its **fast update and query operations**, presenting unique advantages in specific algorithmic scenarios.

### Use Cases

- **Sum Query Efficiency**: In a plain array, obtaining the sum of the elements up to index $i$ requires $O(n)$ time. With a BIT, this task is optimized to $O(\log n)$.

- **Update Efficiency**: While updating an array element at index $i$ takes $O(1)$, keeping a precomputed prefix-sum array consistent with that change typically needs $O(n)$ time. A BIT achieves $O(\log n)$ time for both updates and prefix-sum queries.

- **Range Queries Optimization**: A BIT is helpful in scenarios where you need to frequently compute range sums over intervals $[l, r]$ of a sequence whose length doesn't change.

### Code Example: Constructing a Binary-Indexed Tree

Here is the Python code:

```python
def update(bit, idx, val):
    """Adds `val` at 1-indexed position `idx`."""
    while idx < len(bit):
        bit[idx] += val
        idx += (idx & -idx)   # move to the next index that covers this position

def get_sum(bit, idx):
    """Returns the prefix sum of the first `idx` elements (1-indexed)."""
    total = 0
    while idx > 0:
        total += bit[idx]
        idx -= (idx & -idx)   # strip the lowest set bit
    return total

def construct_bit(arr):
    bit = [0] * (len(arr) + 1)   # index 0 is unused; the BIT is 1-indexed
    for i, val in enumerate(arr):
        update(bit, i + 1, val)
    return bit
```

To compute the sum over a 1-indexed range $[l, r]$, call `get_sum(bit, r) - get_sum(bit, l - 1)`. For the range $[1, 7]$ this is `get_sum(bit, 7) - get_sum(bit, 0)`, where `get_sum(bit, 0)` simply returns $0$, so no extra element is subtracted.
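A short usage sketch of the functions above (the sample array and the printed values are illustrative):

```python
arr = [3, 2, -1, 6, 5, 4, -3, 3]          # 0-indexed input
bit = construct_bit(arr)                  # internally 1-indexed

print(get_sum(bit, 4))                    # prefix sum of the first 4 elements: 3 + 2 - 1 + 6 = 10
print(get_sum(bit, 7) - get_sum(bit, 2))  # sum of elements 3..7 (1-indexed): -1 + 6 + 5 + 4 - 3 = 11
update(bit, 3, 4)                         # add 4 to the element at 1-indexed position 3
print(get_sum(bit, 4))                    # now 14
```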
## 9. What is a _Segment Tree_, and how does it differ from a traditional binary tree in usage and efficiency?

A **Segment Tree** is a binary tree built over a fixed-size array and optimized for fast **range queries** over intervals of that array.

### Features of a Segment Tree

- **Root Node**: Covers the entire array or range.
- **Functionality**: Can efficiently handle range operations like find-sum, find-max, and find-min.
- **Internal Nodes**: Divide their range into two roughly equal halves.
- **Leaves**: Represent individual array elements.
- **Building the Tree**: Done in a top-down manner.
- **Complexity**: Answers range queries in $O(\log n)$ time, even over large inputs.
- **Operations**: Supports point updates in $O(\log n)$ time; range updates also become $O(\log n)$ with lazy propagation (covered in a later question).

### Coding Example: Range Sum Query

Here is the Python code:

```python
class SegmentTree:
    def __init__(self, arr):
        self.n = len(arr)
        self.tree = [0] * (4 * self.n)
        self.build_tree(arr, 0, self.n - 1, 0)

    def build_tree(self, arr, start, end, pos):
        if start == end:
            self.tree[pos] = arr[start]
            return

        mid = (start + end) // 2
        self.build_tree(arr, start, mid, 2*pos + 1)
        self.build_tree(arr, mid + 1, end, 2*pos + 2)
        self.tree[pos] = self.tree[2*pos + 1] + self.tree[2*pos + 2]

    def range_sum(self, q_start, q_end, start=0, end=None, pos=0):
        if end is None:
            end = self.n - 1

        if q_end < start or q_start > end:      # no overlap
            return 0
        if q_start <= start and q_end >= end:   # total overlap
            return self.tree[pos]
        mid = (start + end) // 2                # partial overlap: combine both halves
        return self.range_sum(q_start, q_end, start, mid, 2*pos + 1) + self.range_sum(q_start, q_end, mid + 1, end, 2*pos + 2)

# Example usage
arr = [1, 3, 5, 7, 9, 11]
st = SegmentTree(arr)
print(st.range_sum(1, 3))  # Output: 15 (3 + 5 + 7)
```
## 10. What is a _Splay Tree_, and how does its _splay operation_ work?

The **Splay Tree**, a form of self-adjusting binary search tree, reshapes itself to optimize performance based on recent data access patterns. It achieves this through "splaying" operations.

### Splaying Nodes

The **splay operation** moves a target node $x$ to the root position through a sequence of rotations.

The process generally involves:

- **Zig Step**: If the node is a direct child of the root, a single rotation brings it up.

- **Zig-Zig Step**: If both the node and its parent are left children (or both right children), two rotations in the same direction move them up.

- **Zig-Zag Step**: If one is a left child and the other a right child, a double rotation in opposite directions brings the node up.

Splaying preserves the binary-search-tree ordering: every node stays on the correct side of its ancestors, only the shape of the tree changes.

### Advantages and Disadvantages

- **Pros**:
  - The tree adapts to access patterns, which makes it an effective structure for search and insert workloads where some keys are accessed far more often than others.
  - It can outperform other tree structures in such skewed-access cases due to its adaptive nature.

- **Cons**:
  - The splay operation restructures the tree on every access, which adds overhead and makes the implementation more involved.
  - Splay trees do not guarantee $O(\log n)$ for an individual operation—the bound is amortized—which can be an issue in performance-critical applications that need consistent per-operation latency.

- **Amortized Time Complexity**:
  - **Search**: $O(\log{n})$ amortized.
  - **Insertion** and **Deletion**: $O(\log{n})$ amortized.

### Code Example: Splay Tree and Splaying Operation

Here is the Python code:

```python
class Node:
    def __init__(self, key):
        self.left = self.right = None
        self.key = key

class SplayTree:
    def __init__(self):
        self.root = None

    def splay(self, key):
        """Top-down splay: moves the node with `key` (or the last node on its
        search path, if the key is absent) to the root."""
        if self.root is None:
            return
        dummy = Node(None)      # dummy.right / dummy.left collect the left and right side trees
        left = right = dummy
        t = self.root
        while True:
            if key < t.key:
                if t.left is None:
                    break
                if key < t.left.key:        # zig-zig: rotate right first
                    y = t.left
                    t.left = y.right
                    y.right = t
                    t = y
                    if t.left is None:
                        break
                right.left = t              # link t into the right side tree
                right = t
                t = t.left
            elif key > t.key:
                if t.right is None:
                    break
                if key > t.right.key:       # zig-zig: rotate left first
                    y = t.right
                    t.right = y.left
                    y.left = t
                    t = y
                    if t.right is None:
                        break
                left.right = t              # link t into the left side tree
                left = t
                t = t.right
            else:
                break
        # Reassemble: hang the side trees off the new root.
        left.right = t.left
        right.left = t.right
        t.left = dummy.right
        t.right = dummy.left
        self.root = t

# Splay the (absent) key 6: the closest node on its search path becomes the root
splay_tree = SplayTree()
splay_tree.root = Node(10)
splay_tree.root.left = Node(5)
splay_tree.root.left.left = Node(3)
splay_tree.root.left.right = Node(7)
splay_tree.root.right = Node(15)
splay_tree.splay(6)
print(splay_tree.root.key)  # 7
```
## 11. Explain the concept and structure of a _Ternary Tree_.

A **Ternary Tree** is a multiway tree in which each node has at most three children. Like binary trees, ternary trees can be full or complete. While not as common as binary trees, they arise naturally in problems with three-way decisions and in string structures such as ternary search trees.

### Structure

Each **node** in a ternary tree typically has three **children**:

- Left
- Middle
- Right

This organizational layout is especially effective for representing certain types of data or solving specific problems. For instance, ternary trees are a natural fit for scenarios that have three distinct outcomes at a decision point.

### Code Example: Ternary Tree Node

Here is the Python code:

```python
class TernaryNode:
    def __init__(self, data, left=None, middle=None, right=None):
        self.data = data
        self.left = left
        self.middle = middle
        self.right = right
```
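As a small illustrative sketch (reusing the `TernaryNode` class above; the sample values are made up), a pre-order walk visits a node and then its left, middle, and right subtrees:

```python
def preorder(node, visit):
    """Visit the node, then its left, middle, and right subtrees."""
    if node is None:
        return
    visit(node.data)
    preorder(node.left, visit)
    preorder(node.middle, visit)
    preorder(node.right, visit)

# Example: a decision point with three outcomes.
root = TernaryNode("start",
                   left=TernaryNode("low"),
                   middle=TernaryNode("medium"),
                   right=TernaryNode("high"))
preorder(root, print)   # start, low, medium, high
```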
691 | 692 | ## 12. Describe a _Lazy Segment Tree_ and when it is used over a regular Segment Tree. 693 | 694 | The **Lazy Segment Tree** supplements the standard Segment Tree by allowing delayed updates, making it best suited for the Range Update and Point Query type tasks. It is **more efficient** in such scenarios, especially when dealing with a large number of updates. 695 | 696 | ### Lazy Propagation Mechanism 697 | 698 | The Lazy Segment Tree keeps track of pending updates on a range of elements using a separate array or data structure. 699 | 700 | When a range update is issued, rather than carrying out the immediate actions for all elements in that range, the tree schedules the update to be executed when required. 701 | 702 | The next time an element within that range is accessed (such as during a range query or point update), the tree first ensures that any pending updates get propagated to the concerned range. This propagation mechanism avoids redundant update operations, achieving time complexity of $O(\log n)$ for range updates, range queries, and point updates. 703 | 704 | ### Code Example: Lazy Segment Tree 705 | 706 | Here is the Python code: 707 | 708 | ```python 709 | class LazySegmentTree: 710 | def __init__(self, arr): 711 | self.size = len(arr) 712 | self.tree = [0] * (4 * self.size) 713 | self.lazy = [0] * (4 * self.size) 714 | self.construct_tree(arr, 0, self.size-1, 0) 715 | 716 | def update_range(self, start, end, value): 717 | self.update_range_util(0, 0, self.size-1, start, end, value) 718 | 719 | def range_query(self, start, end): 720 | return self.range_query_util(0, 0, self.size-1, start, end) 721 | 722 | # Implement the rest of the methods 723 | 724 | def construct_tree(self, arr, start, end, pos): 725 | if start == end: 726 | self.tree[pos] = arr[start] 727 | else: 728 | mid = (start + end) // 2 729 | self.tree[pos] = self.construct_tree(arr, start, mid, 2*pos+1) + self.construct_tree(arr, mid+1, end, 2*pos+2) 730 | return self.tree[pos] 731 | 732 | def update_range_util(self, pos, start, end, range_start, range_end, value): 733 | if self.lazy[pos] != 0: 734 | self.tree[pos] += (end - start + 1) * self.lazy[pos] 735 | if start != end: 736 | self.lazy[2*pos+1] += self.lazy[pos] 737 | self.lazy[2*pos+2] += self.lazy[pos] 738 | self.lazy[pos] = 0 739 | 740 | if start > end or start > range_end or end < range_start: 741 | return 742 | 743 | if start >= range_start and end <= range_end: 744 | self.tree[pos] += (end - start + 1) * value 745 | if start != end: 746 | self.lazy[2*pos+1] += value 747 | self.lazy[2*pos+2] += value 748 | return 749 | 750 | mid = (start+end) // 2 751 | self.update_range_util(2*pos+1, start, mid, range_start, range_end, value) 752 | self.update_range_util(2*pos+2, mid+1, end, range_start, range_end, value) 753 | self.tree[pos] = self.tree[2*pos+1] + self.tree[2*pos+2] 754 | 755 | def range_query_util(self, pos, start, end, range_start, range_end): 756 | if self.lazy[pos]: 757 | self.tree[pos] += (end - start + 1) * self.lazy[pos] 758 | if start != end: 759 | self.lazy[2*pos+1] += self.lazy[pos] 760 | self.lazy[2*pos+2] += self.lazy[pos] 761 | self.lazy[pos] = 0 762 | 763 | if start > end or start > range_end or end < range_start: 764 | return 0 765 | 766 | if start >= range_start and end <= range_end: 767 | return self.tree[pos] 768 | 769 | mid = (start + end) // 2 770 | return self.range_query_util(2*pos+1, start, mid, range_start, range_end) + self.range_query_util(2*pos+2, mid+1, end, range_start, range_end) 771 | ``` 772 |
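A brief usage sketch for the class above (values illustrative); note how a point query is just a one-element range:

```python
arr = [1, 2, 3, 4, 5]
lazy_st = LazySegmentTree(arr)

print(lazy_st.range_query(0, 4))   # 15: sum of the whole array
lazy_st.update_range(1, 3, 10)     # add 10 to the elements at indices 1..3
print(lazy_st.range_query(0, 4))   # 45: 1 + 12 + 13 + 14 + 5
print(lazy_st.range_query(2, 2))   # 13: a point query over a one-element range
```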
## 13. What is a _Treap_, and how does it combine the properties of a binary search tree and a heap?

A **Treap** (tree + heap) is a randomized **binary search tree** that maintains a dual structure, inheriting characteristics from both a **Binary Search Tree** (BST) and a **Heap**. It is essentially a Cartesian tree built over the keys and a set of randomly assigned priorities.

### Core Properties

- **BST Order**: Every node satisfies the order $\text{node.left} < \text{node} < \text{node.right}$ with respect to its **key**, just as in an ordinary BST.
- **Heap Priority**: Every node also carries a **priority**, independent of the key, and the tree is heap-ordered on priorities (for example, each parent's priority is at least as large as its children's in a max-heap treap).

### Link between Priority and Order

A node's priority is usually drawn at random when the node is created and then stays fixed. Because the priorities are random, heap-ordering the tree on them keeps the expected shape balanced: the structure is adjusted with rotations so that the BST property on keys and the heap property on priorities hold simultaneously.

### Operations

#### Insert Operation

1. The node is first inserted by key, exactly as in an ordinary BST, and is assigned a random priority.
2. It is then rotated up the tree (left or right rotations) while its priority violates the heap property with respect to its parent.

#### Delete Operation

Deletion is a two-step process:

1. Locate the node to be deleted.
2. Rotate it down the tree—always rotating with the child of higher priority so the heap property is preserved—until it becomes a leaf, then remove it.

A sketch of the node structure and insertion appears below.

### Complexity Analysis

- **Time Complexity**: All primary operations such as Insert, Delete, and Search take $\mathcal{O}(\log n)$ expected time.
- **Space Complexity**: $\mathcal{O}(n)$; each node stores two attributes, a key and a priority.
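The original answer has no code, so here is a minimal, hedged sketch of a max-heap treap with random priorities; the names (`TreapNode`, `insert`, `rotate_left`, `rotate_right`) are illustrative.

```python
import random

class TreapNode:
    def __init__(self, key):
        self.key = key
        self.priority = random.random()   # fixed at creation; drives the heap shape
        self.left = self.right = None

def rotate_right(node):
    left = node.left
    node.left, left.right = left.right, node
    return left

def rotate_left(node):
    right = node.right
    node.right, right.left = right.left, node
    return right

def insert(node, key):
    """BST insert by key, then rotate up while the heap property is violated."""
    if node is None:
        return TreapNode(key)
    if key < node.key:
        node.left = insert(node.left, key)
        if node.left.priority > node.priority:    # max-heap on priority
            node = rotate_right(node)
    else:
        node.right = insert(node.right, key)
        if node.right.priority > node.priority:
            node = rotate_left(node)
    return node

root = None
for k in [50, 30, 70, 20, 40]:
    root = insert(root, k)
# An in-order walk still yields the keys in sorted order: 20, 30, 40, 50, 70,
# while the random priorities keep the expected height logarithmic.
```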
808 | 809 | ## 14. What is a _Balanced Tree_? 810 | 811 | A **Balanced Tree** ensures that the **Balance Factor**—the height difference between left and right subtrees of any node—doesn't exceed one. This property guarantees efficient $O(\log n)$ time complexity for **search**, **insertion**, and **deletion** operations. 812 | 813 | ### Balanced Tree Criteria 814 | 815 | - **Height Difference**: Each node's subtrees differ in height by at most one. 816 | - **Recursive Balance**: Both subtrees of every node are balanced. 817 | 818 | ### Benefits 819 | 820 | - **Efficiency**: Avoids the $O(n)$ degradation seen in unbalanced trees. 821 | - **Predictability**: Provides stable performance, essential for real-time applications. 822 | 823 | ### Visual Comparison 824 | 825 | ![](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/binary%20tree%2FHeight-Balanced-Tree-2%20(1).png?alt=media&token=4751e97d-2115-4a6a-a4cc-19fa1a1e0a7d) 826 | 827 | 828 | 829 | The **balanced tree** maintains $O(\log n)$ height, while the **unbalanced tree** could degenerate into a linked list with $O(n)$ height. 830 | 831 | ### Code Example: Balance Verification 832 | 833 | Here is the Python code: 834 | 835 | ```python 836 | class Node: 837 | def __init__(self, data): 838 | self.data = data 839 | self.left = None 840 | self.right = None 841 | 842 | def is_balanced(root): 843 | if root is None: 844 | return True 845 | 846 | left_height = get_height(root.left) 847 | right_height = get_height(root.right) 848 | 849 | return abs(left_height - right_height) <= 1 and is_balanced(root.left) and is_balanced(root.right) 850 | 851 | def get_height(node): 852 | if node is None: 853 | return 0 854 | 855 | return 1 + max(get_height(node.left), get_height(node.right)) 856 | ``` 857 |
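The version above recomputes subtree heights at every node, which can cost $O(n^2)$ on a skewed tree. A common refinement (a sketch using the same `Node` class, with an illustrative function name) returns the height and the balance verdict in a single bottom-up pass:

```python
def is_balanced_fast(root):
    """Single O(n) pass: propagates (is_balanced, height) bottom-up."""
    def check(node):
        if node is None:
            return True, 0
        left_ok, left_h = check(node.left)
        right_ok, right_h = check(node.right)
        balanced = left_ok and right_ok and abs(left_h - right_h) <= 1
        return balanced, 1 + max(left_h, right_h)
    return check(root)[0]
```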
## 15. What are advantages and disadvantages of _BST_?

The **Binary Search Tree** (BST) is a versatile data structure that offers many benefits but also comes with limitations.

### Advantages of Using BSTs

1. **Quick Search Operations**: A balanced BST can perform search operations in $O(\log n)$ time, making it much faster than linear structures like arrays and linked lists.

2. **Dynamic Allocation**: Unlike arrays that require pre-defined sizes, BSTs are dynamic in nature, adapting to data as it comes in. This results in better space utilization.

3. **Space Efficiency**: With $O(n)$ space requirements, BSTs are often more memory-efficient than other structures like hash tables, especially in memory-sensitive applications.

4. **Versatile Operations**: Beyond simple insertions and deletions, BSTs excel in:
   - Range queries
   - Nearest smaller or larger element searches
   - Different types of tree traversals (in-order, pre-order, post-order)

5. **Inherent Sorting**: BSTs naturally keep their elements sorted; an in-order traversal yields them in sorted order, which is ideal for tasks that must keep data sorted as it changes.

6. **Predictable Efficiency**: Unlike hash tables, which can have unpredictable worst-case scenarios, a balanced BST maintains consistent $O(\log n)$ performance.

7. **Practical Utility**: BSTs find applications in:
   - Database indexing for quick data retrieval
   - Efficient file searching in operating systems
   - Task scheduling based on priorities

### Disadvantages of Using BSTs

1. **Limited Direct Access**: While operations like `insert`, `delete`, and `lookup` are efficient, access by positional index (the $k$-th smallest element) takes $O(n)$ time unless nodes are augmented with subtree sizes.

2. **Risk of Imbalance**: If not managed carefully, a BST can become unbalanced, degenerating toward a linked list and losing its efficiency advantages.

3. **Memory Costs**: Each node in a BST requires additional memory for two child pointers, which could be a concern in memory-constrained environments.

4. **Complex Self-Balancing Algorithms**: While self-balancing trees like AVL or Red-Black trees mitigate the risk of imbalance, they are more complex to implement.

5. **No Constant-Time Minimum or Maximum**: Unlike a heap, whose root always holds the extreme element, a BST must walk to its leftmost or rightmost node—an $O(h)$ operation—to find the minimum or maximum.
897 | 898 | 899 | 900 | #### Explore all 53 answers here 👉 [Devinterview.io - Binary Tree Data Structure](https://devinterview.io/questions/data-structures-and-algorithms/binary-tree-data-structure-interview-questions) 901 | 902 |