# 60 Important Array Data Structure Interview Questions in 2025
#### You can also find all 60 answers here 👉 [Devinterview.io - Array Data Structure](https://devinterview.io/questions/data-structures-and-algorithms/array-data-structure-interview-questions)
## 1. What is an _Array_?

An **array** is a fundamental data structure that stores a **sequence** of elements accessible via an **index**.

### Key Characteristics

- **Homogeneity**: All elements are of the same data type.
- **Contiguous Memory**: Elements are stored in adjacent memory locations for quick access.
- **Fixed Size**: Arrays are generally static in size, although dynamic arrays exist in modern languages.
- **Indexing**: Usually zero-based, though some languages use one-based indexing.

### Time Complexity of Basic Operations

- **Access**: $O(1)$
- **Search**: $O(n)$ for an unsorted array; $O(\log n)$ with binary search if sorted
- **Insertion**: $O(1)$ at the end, $O(n)$ at the beginning or middle
- **Deletion**: $O(1)$ at the end, $O(n)$ at the beginning or middle
- **Append**: $O(1)$ amortized, $O(n)$ during resizing

### Code Example: Basic Array Operations

Here is the Java code:

```java
public class ArrayExample {
    public static void main(String[] args) {
        // Declare and initialize arrays
        int[] myArray = new int[5];                // Array of size 5, default-initialized
        int[] initializedArray = {1, 2, 3, 4, 5};  // Direct initialization

        // Access elements
        System.out.println(initializedArray[0]);   // Output: 1

        // Update elements
        initializedArray[2] = 10;                  // Modify the third element

        // Check array length
        int length = initializedArray.length;      // Retrieve array length
        System.out.println(length);                // Output: 5
    }
}
```
## 2. What are _Dynamic Arrays_?

**Dynamic arrays** start with a preset capacity and **automatically resize** as needed. When full, they allocate a larger memory block—often doubling in size—and copy the existing elements over.

### Key Features

- **Adaptive Sizing**: Dynamic arrays adjust their size based on the number of elements, unlike fixed-size arrays.
- **Contiguous Memory**: Like basic arrays, dynamic arrays keep elements in adjacent memory locations for efficient indexed access.
- **Amortized Appending**: Append operations are typically constant time. An occasional resize takes $O(n)$, but averaged over many operations appending is still $O(1)$ amortized.

### Time Complexity of Basic Operations

- **Access**: $O(1)$
- **Search**: $O(n)$ by linear scan
- **Insertion**: $O(1)$ amortized at the end, $O(n)$ in the middle or during resizing
- **Deletion**: $O(1)$ at the end, $O(n)$ in the middle due to shifting
- **Append**: $O(1)$ amortized, $O(n)$ during resizing

### Code Example: Java's `ArrayList`: Simplified Implementation

Here is the Java code:

```java
public class DynamicArray<T> {
    private Object[] data;
    private int size = 0;
    private int capacity;

    public DynamicArray(int initialCapacity) {
        this.capacity = initialCapacity;
        data = new Object[initialCapacity];
    }

    @SuppressWarnings("unchecked")
    public T get(int index) {
        return (T) data[index];
    }

    public void add(T value) {
        if (size == capacity) {
            resize(2 * capacity);
        }
        data[size++] = value;
    }

    private void resize(int newCapacity) {
        Object[] newData = new Object[newCapacity];
        for (int i = 0; i < size; i++) {
            newData[i] = data[i];
        }
        data = newData;
        capacity = newCapacity;
    }

    public int size() {
        return size;
    }

    public boolean isEmpty() {
        return size == 0;
    }

    public static void main(String[] args) {
        DynamicArray<Integer> dynArray = new DynamicArray<>(2);
        dynArray.add(1);
        dynArray.add(2);
        dynArray.add(3); // This triggers a resize
        System.out.println("Size: " + dynArray.size());               // Output: 3
        System.out.println("Element at index 2: " + dynArray.get(2)); // Output: 3
    }
}
```
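Python's built-in `list` is itself a dynamic array, so the resize-on-demand behavior can be observed directly. A small sketch follows; the reported sizes are a CPython implementation detail, so treat the exact numbers as illustrative:

```python
import sys

# Append 32 elements and record each point where the list's allocated
# size jumps -- these jumps are the occasional O(n) resizes hiding
# behind the O(1) amortized append.
lst = []
resizes = []
last_size = sys.getsizeof(lst)
for i in range(32):
    lst.append(i)
    size = sys.getsizeof(lst)
    if size != last_size:
        resizes.append(len(lst))
        last_size = size

print(resizes)  # Far fewer entries than 32 appends
```

On CPython the list over-allocates on each growth, so only a handful of the 32 appends trigger an actual reallocation.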
## 3. What is an _Associative Array_ (Dictionary)?

An **associative array**, often referred to as a **map**, **hash**, or **dictionary**, is an abstract data type that enables **key-based access** to its elements and offers **dynamic resizing** and fast retrieval.

### Key Features

- **Unique Keys**: Each key is unique; adding an existing key updates its value.
- **Variable Key Types**: Keys can be of diverse types, including strings, numbers, or objects.

### Common Implementations

- **Hash Table**: Efficiency can degrade due to hash collisions.
  - Average case: $O(1)$
  - Worst case: $O(n)$
- **Self-Balancing Trees**: Consistent efficiency due to the balanced structure.
  - Average case: $O(\log n)$
  - Worst case: $O(\log n)$
- **Unbalanced Trees**: Efficiency varies with the tree's shape, making them less reliable.
  - Average case: between $O(\log n)$ and $O(n)$, depending on balance
  - Worst case: $O(n)$
- **Association Lists**: Simple structure, not ideal for large datasets.
  - Average and worst case: $O(n)$

### Code Example: Associative Arrays vs. Regular Arrays

Here is the Python code:

```python
# Regular array (list) example
my_list = ["apple", "banana", "cherry"]
print(my_list[1])  # Outputs: banana

# Accessing with a non-integer index would raise a TypeError:
# print(my_list["fruit_name"])

# Associative array (dictionary) example
my_dict = {
    "fruit_name": "apple",
    42: "banana",
    (1, 2): "cherry"
}

print(my_dict["fruit_name"])  # Outputs: apple
print(my_dict[42])            # Outputs: banana
print(my_dict[(1, 2)])        # Outputs: cherry

# Demonstrating key update
my_dict["fruit_name"] = "orange"
print(my_dict["fruit_name"])  # Outputs: orange
```
## 4. What defines the _Dimensionality_ of an array?

**Array dimensionality** is the number of indices required to select an element within the array. A classic example is the Tic-Tac-Toe board: a two-dimensional array whose elements are referenced by their row and column positions.

### Code Example: Tic-Tac-Toe Board (2D Array)

Here is the Python code:

```python
# Setting up the Tic-Tac-Toe board
tic_tac_toe_board = [
    ['X', 'O', 'X'],
    ['O', 'X', 'O'],
    ['X', 'O', 'X']
]

# Accessing the top-left corner (which contains 'X'):
element = tic_tac_toe_board[0][0]
```

### Code Example: 3D Array

Here is the Python code:

```python
arr_3d = [
    [[1, 2, 3], [4, 5, 6]],
    [[7, 8, 9], [10, 11, 12]]
]
```

A three-dimensional array can be imagined as a **cube** or a **stack** of matrices.

### Mathematical Perspective

Mathematically, an array's dimensionality aligns with the Cartesian product of sets, each set corresponding to an axis. A 3D array, for instance, is indexed by the Cartesian product of three distinct index sets.

#### Beyond 3D: N-Dimensional Arrays

Arrays can extend into $N$ dimensions, where $N$ can be any positive integer. The total count of elements in an N-dimensional array is:

$$
\text{Number of Elements} = S_1 \times S_2 \times \ldots \times S_N
$$

where $S_k$ signifies the size of the $k$-th dimension.
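The element-count formula can be checked directly in code. A quick sketch (the `shape_of` helper is illustrative, assuming a uniformly nested, non-empty list):

```python
from math import prod

def shape_of(arr):
    """Return the dimension sizes of a uniformly nested list."""
    shape = []
    while isinstance(arr, list):
        shape.append(len(arr))
        arr = arr[0]
    return shape

arr_3d = [
    [[1, 2, 3], [4, 5, 6]],
    [[7, 8, 9], [10, 11, 12]]
]

shape = shape_of(arr_3d)
print(shape, prod(shape))  # [2, 2, 3] 12 -- i.e., 2 x 2 x 3 = 12 elements
```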
## 5. Name some _Advantages_ and _Disadvantages_ of arrays.

**Arrays** have very specific **strengths** and **weaknesses**, making them better suited to some applications than others.

### Advantages

- **Speed**: Arrays provide $O(1)$ access by index and $O(1)$ amortized appends at the end.
- **Cache Performance**: With their contiguous memory layout, arrays are efficient for tasks involving sequential data access.

### Disadvantages

- **Size Limitations**: Arrays have a fixed size after allocation. Resizing means creating a new array, leading to potential memory overhead and copying costs.
- **Mid-Array Changes**: Operations like insertions or deletions are $O(n)$ due to the necessary element shifting.

### Considerations

- **When to Use**: Arrays are optimal for **known data sizes** and when rapid access or appends are critical. They're popular in numerical algorithms and cache-centric tasks.
- **When to Rethink**: Their static nature and inefficiency for **frequent mid-array changes** can make alternatives like linked lists or hash tables more suitable.
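The end-append vs. mid-array trade-off is easy to observe with Python lists (dynamic arrays under the hood). This timing sketch only demonstrates the relative gap; absolute numbers vary by machine:

```python
from timeit import timeit

n = 5_000

def append_end():
    lst = []
    for i in range(n):
        lst.append(i)      # O(1) amortized

def insert_front():
    lst = []
    for i in range(n):
        lst.insert(0, i)   # O(n): every existing element shifts right

t_append = timeit(append_end, number=3)
t_insert = timeit(insert_front, number=3)
print(f"append at end:   {t_append:.4f}s")
print(f"insert at front: {t_insert:.4f}s")  # typically far slower
```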
## 6. Explain _Sparse_ and _Dense_ arrays.

**Sparse arrays** are data structures optimized for arrays in which most values are a default (e.g., zero or null). They save memory by storing only the non-default values and their indices. In contrast, **dense arrays** allocate memory for every element, whether or not it holds a default value.

### Example

- **Sparse Array**: `[0, 0, 3, 0, 0, 0, 0, 9, 0, 0]` — only the two non-zero entries need storing
- **Dense Array**: `[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]` — every slot carries information

### Advantages of Sparse Arrays

Sparse arrays offer **optimized memory usage**.

For example, in a million-element array where 90% of the values are zeros:

- **Dense Array**: Allocates memory for every single element, even though the majority are zeros.
- **Sparse Array**: Drastically conserves memory by allocating only for the non-zero elements.

### Practical Applications

1. **Text Processing**: Efficiently represent term-document matrices in analytics, where not every word appears in every document.
2. **Computer Graphics**: Represent 3D spaces in modeling, where many cells may be empty.
3. **Scientific Computing**: Handle linear systems with sparse coefficient matrices, speeding up computations.
4. **Databases**: Store tables with numerous missing values efficiently.
5. **Networking**: Represent sparsely populated routing tables in networking equipment.
6. **Machine Learning**: Efficiently handle high-dimensional feature vectors with many zeros.
7. **Recommendation Systems**: Represent user-item interaction matrices where most users haven't interacted with most items.
### Code Example: Sparse Array

Here is the Python code:

```python
class SparseArray:
    def __init__(self):
        self.data = {}

    def set(self, index, value):
        if value != 0:  # Only store non-zero values
            self.data[index] = value
        elif index in self.data:
            del self.data[index]

    def get(self, index):
        return self.data.get(index, 0)  # Return 0 if index is not in the data

# Usage
sparse_array = SparseArray()
sparse_array.set(2, 3)
sparse_array.set(7, 9)

print(sparse_array.get(2))  # Output: 3
print(sparse_array.get(7))  # Output: 9
print(sparse_array.get(3))  # Output: 0
```
## 7. What are advantages and disadvantages of _Sorted Arrays_?

A **sorted array** is a data structure in which elements are stored in a specific, **predetermined sequence**, usually ascending or descending order.

This ordering provides various benefits, such as **optimized search operations**, at the cost of more complex insertions and deletions.

### Advantages

- **Efficient Searches**: Sorted arrays are optimized for search operations; binary search runs in $O(\log n)$ time.
- **Additional Query Types**: They support other specialized queries, like bisection to find the closest element and range queries to identify all elements within a specified range.
- **Cache Efficiency**: The contiguous memory layout improves cache utilization, which can lead to faster performance.

### Disadvantages

- **Slow Updates**: Insertions and deletions generally require shifting elements, leading to $O(n)$ time complexity for these operations.
- **Memory Overhead**: Maintaining the sorted structure can require extra memory movement, especially during updates.
- **Lack of Flexibility**: Sorted arrays are less flexible for dynamic resizing and can be problematic in parallel computing environments.

### Practical Applications

- **Search-Heavy Applications**: Suitable when rapid search operations are more common than updates, such as in financial analytics or in-memory databases.
- **Static or Semi-Static Data**: Ideal for datasets known in advance or that change infrequently.
- **Memory Constraints**: Efficient for small, known datasets that require quick search capabilities.

### Time Complexity of Basic Operations

- **Access**: $O(1)$
- **Search**: $O(\log n)$ with binary search
- **Insertion**: $O(1)$ at the end if the new value preserves order, but usually $O(n)$ because elements must shift to maintain order
- **Deletion**: $O(1)$ at the end, but usually $O(n)$ to maintain order
- **Append**: $O(1)$ when appending a value larger than the current maximum, but $O(n)$ if resizing or an ordered insert is needed
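Python's standard `bisect` module operates on any sorted list and illustrates the query types above — binary search, range queries, and order-preserving insertion:

```python
import bisect

sorted_arr = [10, 20, 30, 40, 50]

# Exact-match binary search: O(log n)
i = bisect.bisect_left(sorted_arr, 30)
found = i < len(sorted_arr) and sorted_arr[i] == 30
print(found)  # True

# Range query: all elements in [15, 45]
lo = bisect.bisect_left(sorted_arr, 15)
hi = bisect.bisect_right(sorted_arr, 45)
in_range = sorted_arr[lo:hi]
print(in_range)  # [20, 30, 40]

# Ordered insertion: the search is O(log n), but shifting makes it O(n)
bisect.insort(sorted_arr, 35)
print(sorted_arr)  # [10, 20, 30, 35, 40, 50]
```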
## 8. What are the advantages of _Heaps_ over _Sorted Arrays_?

While both **heaps** and **sorted arrays** have their strengths, heaps are often preferred for dynamic data requiring frequent insertions and deletions.

### Advantages of Heaps Over Sorted Arrays

- **Dynamic Operations**: Heaps excel in scenarios with frequent insertions and deletions, maintaining their structure in $O(\log n)$ per update.
- **Memory Allocation**: Heaps, especially binary heaps, can be managed efficiently in memory since they're typically backed by arrays. Sorted arrays, on the other hand, might require periodic resizing or waste space if over-allocated.
- **Predictable Time Complexity**: Heap operations have consistent time complexities, while sorted-array update costs vary with where the change lands.
- **No Overhead for Full Sorting**: Heaps only guarantee that each parent is smaller (or larger) than its children, which suffices for many tasks without the overhead of maintaining total order as in sorted arrays.

### Time Complexities of Key Operations

#### Heaps (min-heap)

- **find-min**: $O(1)$ — the root node always contains the minimum value.
- **delete-min**: $O(\log n)$ — removal of the root is followed by a heapify (sift-down) to restore order.
- **insert**: $O(\log n)$ — the newly inserted element may need to bubble up to its correct position.

#### Sorted Arrays (ascending)

- **find-min**: $O(1)$ — the first element is the minimum.
- **delete-min**: $O(n)$ — removing the first element requires shifting all other elements.
- **insert**: $O(n)$ — the insertion point can be found in $O(\log n)$ with binary search, but shifting elements makes it $O(n)$ in the worst case.
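The heap-side complexities above map directly onto Python's `heapq` module, a binary min-heap backed by a plain list:

```python
import heapq

data = [7, 2, 9, 4, 1]
heapq.heapify(data)             # O(n) bottom-up heap construction

print(data[0])                  # find-min in O(1): prints 1

heapq.heappush(data, 0)         # insert: O(log n); 0 bubbles up to the root
smallest = heapq.heappop(data)  # delete-min: O(log n), then sift-down
print(smallest)                 # 0
print(data[0])                  # 1 is the minimum again
```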
## 9. How does _Indexing_ work in arrays?

**Indexing** refers to accessing specific elements in an array using unique indices, which range from $0$ to $n-1$ for an array of $n$ elements.

### Key Concepts

#### Contiguous Memory and Fixed Element Size

Arrays occupy adjacent memory locations, facilitating fast random access. All elements are uniformly sized; for example, a 32-bit integer consumes 4 bytes of memory.

#### Memory Address Calculation

The memory address of the $i$-th element is computed as:

$$
\text{Memory Address}_{i} = P + (\text{Element Size}) \times i
$$

where $P$ is the pointer to the array's first element.

### Code Example: Accessing a Memory Address

Here is the Python code (a plain Python list does not expose its buffer, so a NumPy array is used to demonstrate the address arithmetic):

```python
import ctypes
import numpy as np

# Define an array of 64-bit integers
arr = np.array([10, 20, 30, 40, 50, 60], dtype=np.int64)

# Calculate the memory address of the third element:
# base pointer + index * element size
element_index = 2
base_address = arr.__array_interface__['data'][0]
element_address = base_address + element_index * arr.itemsize

# Read the value back directly from that address
element_value = ctypes.cast(element_address, ctypes.POINTER(ctypes.c_int64)).contents.value

print(f"The memory address of the third element is: {element_address}")
print(f"The value at that memory address is: {element_value}")  # 30
```
## 10. _Merge_ two _Sorted Arrays_ into one _Sorted Array_.

### Problem Statement

The task is to **merge two sorted arrays** into one combined, sorted array.

### Solution

#### Algorithm Steps

1. Initialize an empty result array **C** and counters `i = 0` for array **A** and `j = 0` for array **B**.
2. While `i` is within the bounds of array **A** and `j` is within the bounds of array **B**:
   - If `A[i]` is less than `B[j]`, append `A[i]` to `C` and increment `i`.
   - If `A[i]` is greater than `B[j]`, append `B[j]` to `C` and increment `j`.
   - If `A[i]` equals `B[j]`, append both `A[i]` and `B[j]` to `C` and increment both `i` and `j`.
3. If any elements remain in array **A**, append them to `C`.
4. If any elements remain in array **B**, append them to `C`.
5. Return the merged array `C`.

### Visual Representation

![Merging Two Sorted Arrays into One](https://firebasestorage.googleapis.com/v0/b/dev-stack-app.appspot.com/o/arrays%2Fmerge-two-sorted-array-algorithm%20(1).png?alt=media&token=580caabc-2bc4-4928-9780-ba7bb13d0cb1&_gl=1*14yao85*_ga*OTYzMjY5NTkwLjE2ODg4NDM4Njg.*_ga_CW55HF8NVT*MTY5NzM3MjYxNC4xNjAuMS4xNjk3MzcyNjQ2LjI8LjAuMA..)

#### Complexity Analysis

- **Time Complexity**: $O(n)$, where $n$ is the combined length of arrays A and B.
- **Space Complexity**: $O(n)$ for the output array.
#### Implementation

Here is the Python code:

```python
def merge_sorted_arrays(a, b):
    merged_array, i, j = [], 0, 0

    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            merged_array.append(a[i])
            i += 1
        elif a[i] > b[j]:
            merged_array.append(b[j])
            j += 1
        else:
            merged_array.extend([a[i], b[j]])
            i, j = i + 1, j + 1

    # Append whatever remains of either input
    merged_array.extend(a[i:])
    merged_array.extend(b[j:])

    return merged_array

# Sample test
array1 = [1, 3, 5, 7, 9]
array2 = [2, 4, 6, 8, 10]
print(merge_sorted_arrays(array1, array2))  # Expected output: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```
## 11. Implement three _Stacks_ with one _Array_.

### Problem Statement

The task is to implement **three stacks** using a **single array**.

### Solution

To solve the task we can **divide** the array into three fixed-size regions, one per stack, so that each stack can **grow** and **shrink** within its own region without affecting the others.

#### Algorithm Steps

1. Allocate an array of length `3 * stack_size` and keep one top pointer per stack, each initialized to `-1` (empty, relative to its region).
2. `push(stack_number, value)`: Check that the stack's pointer is below `stack_size - 1` (otherwise overflow), increment it, and write the value at the absolute index `stack_number * stack_size + pointer`.
3. `pop(stack_number)`: Check that the pointer is at least `0` (otherwise underflow), read the value at the absolute index, then decrement the pointer.

#### Complexity Analysis

- **Time Complexity**: $O(1)$ for all stack operations.
- **Space Complexity**: $O(1)$ auxiliary per operation (the backing array itself is $O(n)$).

#### Implementation

Here is the Python code:

```python
class MultiStack:
    def __init__(self, stack_size):
        self.stack_size = stack_size
        self.array = [None] * (3 * stack_size)
        self.stack_pointers = [-1, -1, -1]  # Top of each stack, relative to its region

    def _index(self, stack_number):
        # Absolute position of the given stack's top element
        return stack_number * self.stack_size + self.stack_pointers[stack_number]

    def push(self, stack_number, value):
        if self.stack_pointers[stack_number] >= self.stack_size - 1:
            print("Stack Overflow!")
            return

        self.stack_pointers[stack_number] += 1
        self.array[self._index(stack_number)] = value

    def pop(self, stack_number):
        if self.stack_pointers[stack_number] < 0:
            print("Stack Underflow!")
            return None

        value = self.array[self._index(stack_number)]
        self.stack_pointers[stack_number] -= 1
        return value

    def peek(self, stack_number):
        if self.stack_pointers[stack_number] < 0:
            print("Stack Underflow!")
            return None

        return self.array[self._index(stack_number)]
```
## 12. How do you perform _Array Rotation_ and what are its applications?

**Array rotation** moves each element of an array over by a fixed number of positions, wrapping around at the ends. This operation is useful in various scenarios, from data obfuscation to algorithmic optimizations.

### Types of Array Rotation

1. **Left Rotation**: Shifts elements toward the front; the first elements wrap around to the end.
2. **Right Rotation**: Shifts elements toward the back; the last elements wrap around to the front.

### Algorithms for Array Rotation

1. **Naive Method**: Shift every element by one position, repeated $d$ times, where $d$ is the rotation factor — $O(n \cdot d)$ time.
2. **Reversal Algorithm**: Performs three targeted **reversals** within the array to achieve the rotation in $O(n)$ time and $O(1)$ space.

### Code Example: Array Rotation using the Reversal Algorithm

Here is the Python code:

```python
def reverse(arr, start, end):
    while start < end:
        arr[start], arr[end] = arr[end], arr[start]
        start += 1
        end -= 1

def rotate_array(arr, d):
    """Left-rotate arr by d positions in place."""
    n = len(arr)
    d %= n  # Handle d larger than the array length
    reverse(arr, 0, d - 1)
    reverse(arr, d, n - 1)
    reverse(arr, 0, n - 1)

# Example
my_array = [1, 2, 3, 4, 5, 6, 7]
rotate_array(my_array, 3)
print(my_array)  # Output: [4, 5, 6, 7, 1, 2, 3]
```

### Applications of Array Rotation

1. **Obfuscation of Data**: Circular permutations of sensitive arrays provide a simple layer of data confidentiality.
2. **Cryptography**: Techniques like the Caesar cipher use rotation to encrypt and decrypt messages; modern ciphers rely on far more advanced versions of this idea.
3. **Memory Optimization**: Rotation can arrange the data in an array for more favorable access patterns, which matters with large datasets or limited memory resources.
4. **Algorithm Optimization**: Some search and sorting algorithms perform better on a particular arrangement of elements; rotation allows tailoring the array to them for enhanced performance.
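As a toy illustration of the Caesar-cipher application, rotating the alphabet array yields the substitution table. A minimal sketch handling lowercase letters only; `caesar_encrypt` is a name chosen here for illustration:

```python
import string

def caesar_encrypt(text, shift):
    alphabet = list(string.ascii_lowercase)
    # Left-rotating the alphabet by `shift` gives the substitution table:
    # 'a' maps to alphabet[shift], 'b' to alphabet[shift + 1], and so on.
    rotated = alphabet[shift:] + alphabet[:shift]
    table = str.maketrans(''.join(alphabet), ''.join(rotated))
    return text.translate(table)

print(caesar_encrypt("hello", 3))  # khoor
```

Decryption is just rotation in the opposite direction, e.g. `caesar_encrypt(ciphertext, 26 - shift)`.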
## 13. _Reverse_ an _Array_ in place.

### Problem Statement

Given an array, the objective is to **reverse the sequence of its elements** in place.

### Solution

One element is selected from each end of the array and the two are swapped. The selected positions then move toward the center, repeating the swap, until the entire array is reversed.

#### Algorithm Steps

1. Begin with two pointers: `start` at index 0 and `end` at the last index.
2. Swap the elements at the `start` and `end` positions.
3. Increment `start` and decrement `end`.
4. Repeat steps 2 and 3 until the pointers meet in the middle of the array.

This algorithm **reverses the array in place, with a space complexity of $O(1)$**.

#### Complexity Analysis

- **Time Complexity**: $O(n)$ — the loop performs $n/2$ swaps, which is still linear in the array length.
- **Space Complexity**: Constant, $O(1)$, as no additional space is required.

#### Implementation

Here is the Python code:

```python
def reverse_array(arr):
    start = 0
    end = len(arr) - 1

    while start < end:
        arr[start], arr[end] = arr[end], arr[start]
        start += 1
        end -= 1

# Example
arr = [1, 2, 3, 4, 5]
reverse_array(arr)
print("Reversed array:", arr)  # Output: [5, 4, 3, 2, 1]
```
## 14. _Remove Duplicates_ from a sorted array without using extra space.

### Problem Statement

Given a **sorted array**, the task is to **remove duplicate elements** in place (using constant extra space) and return the new length.

### Solution

A two-pointer method provides an efficient solution that removes duplicates **in place** while also producing the new length of the array.

**Algorithm steps**:

1. Initialize a slow pointer `i = 0` and a fast pointer `j = 1`.
2. Iterate `j` through the array:
   - If `array[j] == array[i]`, it's a duplicate; just advance `j`.
   - If `array[j] != array[i]`, advance `i`, copy `array[j]` into `array[i]`, and advance `j`.
3. The first `i + 1` elements now hold the de-duplicated values; return `i + 1`.

#### Complexity Analysis

- **Time Complexity**: $O(n)$, where $n$ is the array's length.
- **Space Complexity**: $O(1)$; the process requires only a few additional variables.

#### Implementation

Here is the Python code:

```python
def removeDuplicates(array):
    if not array:
        return 0

    i = 0
    for j in range(1, len(array)):
        if array[j] != array[i]:
            i += 1
            array[i] = array[j]

    return i + 1
```
## 15. Implement a _Queue_ using an array.

### Problem Statement

Implement a **Queue** data structure using a fixed-size array.

### Solution

While a **dynamic array** would remove the capacity limit, a fixed-size array is ideal for demonstrating the principles of queue operations.

- The queue is first-in, first-out (FIFO): items are removed at the `front` and added at the `rear`.
- When the rear pointer hits the array's end, it wraps around to the beginning if slots are free — a **circular (wrap-around) array**.

#### Algorithm Steps

1. Initialize the queue: set `front` and `rear` both to -1.
2. `enqueue(item)`: Check for a full queue, then:
   - If the queue is empty (`front == -1`), set `front` and `rear` to 0.
   - Otherwise, increment `rear` (wrapping if needed), then store the item there.
3. `dequeue()`: Check for an empty queue, then:
   - Read the item at `front`.
   - If `front` equals `rear`, the queue is now empty, so set both to -1; otherwise, increment `front` (wrapping if needed).

#### Complexity Analysis

- **Time Complexity**:
  - $\text{enqueue}: O(1)$
  - $\text{dequeue}: O(1)$
- **Space Complexity**: $O(n)$, where $n$ is the queue's capacity.

#### Implementation

Here is the Python code:

```python
class Queue:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.queue = [None] * capacity
        self.front = self.rear = -1

    def is_full(self) -> bool:
        return self.front == (self.rear + 1) % self.capacity

    def is_empty(self) -> bool:
        return self.front == -1

    def enqueue(self, item):
        if self.is_full():
            print("Queue is full")
            return
        if self.is_empty():
            self.front = self.rear = 0
        else:
            self.rear = (self.rear + 1) % self.capacity
        self.queue[self.rear] = item

    def dequeue(self):
        if self.is_empty():
            print("Queue is empty")
            return None
        item = self.queue[self.front]
        if self.front == self.rear:
            self.front = self.rear = -1
        else:
            self.front = (self.front + 1) % self.capacity
        return item

    def display(self):
        if self.is_empty():
            print("Queue is empty")
            return
        temp = self.front
        while temp != self.rear:
            print(self.queue[temp], end=" ")
            temp = (temp + 1) % self.capacity
        print(self.queue[self.rear])

# Usage
q = Queue(5)
for x in (1, 2, 3, 4, 5):
    q.enqueue(x)
q.display()      # 1 2 3 4 5
q.enqueue(6)     # Queue is full
q.dequeue()
q.dequeue()
q.display()      # 3 4 5
```
#### Explore all 60 answers here 👉 [Devinterview.io - Array Data Structure](https://devinterview.io/questions/data-structures-and-algorithms/array-data-structure-interview-questions)