🟣 Array Data Structure interview questions and answers to help you prepare for your next data structures and algorithms interview in 2025.
Devinterview-io/array-data-structure-interview-questions
You can also find all 60 answers here 👉 Devinterview.io - Array Data Structure
An **array** is a fundamental data structure used for storing a **sequence of elements** that can be accessed via an **index**.
- Homogeneity: All elements are of the same data type.
- Contiguous Memory: Elements are stored in adjacent memory locations for quick access.
- Fixed Size: Arrays are generally static in size, although dynamic arrays exist in modern languages.
- Indexing: Usually zero-based, though some languages use one-based indexing.
- Access: $O(1)$
- Search: $O(n)$ for an unsorted array; $O(\log n)$ with binary search if sorted
- Insertion: $O(1)$ at the end, $O(n)$ at the beginning/middle
- Deletion: $O(1)$ at the end, $O(n)$ at the beginning/middle
- Append: $O(1)$ amortized for dynamic arrays, $O(n)$ during resizing
Here is the Java code:
```java
public class ArrayExample {
    public static void main(String[] args) {
        // Declare and initialize arrays
        int[] myArray = new int[5];               // Declare an array of size 5
        int[] initializedArray = {1, 2, 3, 4, 5}; // Direct initialization

        // Access elements
        System.out.println(initializedArray[0]); // Output: 1

        // Update elements
        initializedArray[2] = 10; // Modify the third element

        // Check array length
        int length = initializedArray.length; // Retrieve array length
        System.out.println(length); // Output: 5
    }
}
```
Dynamic arrays start with a preset capacity and **automatically resize** as needed. When full, they allocate a larger memory block, often doubling in size, and copy over the existing elements.
- Adaptive Sizing: Dynamic arrays adjust their size based on the number of elements, unlike fixed-size arrays.
- Contiguous Memory: Dynamic arrays, like basic arrays, keep elements in adjacent memory locations for efficient indexed access.
- Amortized Appending: Append operations are typically constant time. Occasional resizing takes longer, but averaged over many operations the cost is still $O(1)$ amortized.
- Access: $O(1)$
- Search: $O(n)$ (linear scan for a matching value)
- Insertion: $O(n)$ at the beginning/middle due to shifting; $O(1)$ amortized at the end
- Deletion: $O(n)$ at the beginning/middle due to shifting; $O(1)$ at the end
- Append: $O(1)$ amortized, $O(n)$ during resizing
Here is the Java code:
```java
public class DynamicArray<T> {
    private Object[] data;
    private int size = 0;
    private int capacity;

    public DynamicArray(int initialCapacity) {
        this.capacity = initialCapacity;
        data = new Object[initialCapacity];
    }

    @SuppressWarnings("unchecked")
    public T get(int index) {
        return (T) data[index];
    }

    public void add(T value) {
        if (size == capacity) {
            resize(2 * capacity); // Double the capacity when full
        }
        data[size++] = value;
    }

    private void resize(int newCapacity) {
        Object[] newData = new Object[newCapacity];
        for (int i = 0; i < size; i++) {
            newData[i] = data[i];
        }
        data = newData;
        capacity = newCapacity;
    }

    public int size() {
        return size;
    }

    public boolean isEmpty() {
        return size == 0;
    }

    public static void main(String[] args) {
        DynamicArray<Integer> dynArray = new DynamicArray<>(2);
        dynArray.add(1);
        dynArray.add(2);
        dynArray.add(3); // This triggers a resize
        System.out.println("Size: " + dynArray.size());               // Output: 3
        System.out.println("Element at index 2: " + dynArray.get(2)); // Output: 3
    }
}
```
An **Associative Array**, often referred to as a **Map**, **Hash**, or **Dictionary**, is an abstract data type that enables **key-based access** to its elements and offers **dynamic resizing** and fast retrieval.

- **Unique Keys**: Each key is unique; adding an existing key updates its value.
- **Variable Key Types**: Keys can be of diverse types, including strings, numbers, or objects.
**Hash Table**: Efficiency can degrade due to hash collisions.

- Average Case: $O(1)$
- Worst Case: $O(n)$

**Self-Balancing Trees**: Consistent efficiency due to the balanced structure.

- Average Case: $O(\log n)$
- Worst Case: $O(\log n)$

**Unbalanced Trees**: Efficiency can vary, making them less reliable.

- Average Case: variable
- Worst Case: between $O(\log n)$ and $O(n)$

**Association Lists**: Simple structure, not ideal for large datasets.

- Average and Worst Case: $O(n)$
Here is the Python code:
```python
# Regular array (list) example
my_list = ["apple", "banana", "cherry"]
print(my_list[1])  # Outputs: banana

# Trying to access using a non-integer index would cause an error:
# print(my_list["fruit_name"])  # This would raise a TypeError.

# Associative array (dictionary) example
my_dict = {"fruit_name": "apple", 42: "banana", (1, 2): "cherry"}
print(my_dict["fruit_name"])  # Outputs: apple
print(my_dict[42])            # Outputs: banana
print(my_dict[(1, 2)])        # Outputs: cherry

# Demonstrating key update
my_dict["fruit_name"] = "orange"
print(my_dict["fruit_name"])  # Outputs: orange
```
Array dimensionality indicates the number of indices required to select an element within the array. A classic example is the Tic-Tac-Toe board, which is a two-dimensional array, and elements are referenced by their row and column positions.
Here is the Python code:
```python
# Setting up the Tic-Tac-Toe board
tic_tac_toe_board = [
    ['X', 'O', 'X'],
    ['O', 'X', 'O'],
    ['X', 'O', 'X']
]

# Accessing the top-left corner (which contains 'X'):
element = tic_tac_toe_board[0][0]
```
Here is the Python code:
```python
arr_3d = [
    [[1, 2, 3], [4, 5, 6]],
    [[7, 8, 9], [10, 11, 12]]
]
```
A three-dimensional array can be imagined as a **cube** or a **stack of matrices**.
Mathematically, an array's dimensionality aligns with the Cartesian product of sets, each set corresponding to an axis. A 3D array, for instance, is formed from the Cartesian product of three distinct sets.
Arrays can extend into $N$ dimensions, where $N$ is the number of indices required to pinpoint a single element.
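As a small illustration of that idea, each added dimension adds one index per element access; a sketch with plain Python lists:

```python
# 1D: one index selects an element
vector = [1, 2, 3]
print(vector[2])  # 3

# 2D: two indices (row, column)
matrix = [[1, 2, 3],
          [4, 5, 6]]
print(matrix[1][2])  # 6

# 3D: three indices (block, row, column)
cube = [[[1, 2], [3, 4]],
        [[5, 6], [7, 8]]]
print(cube[1][0][1])  # 6
```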
Arrays have very specific **strengths** and **weaknesses**, making them better suited for some applications than others.
- **Speed**: Arrays provide $O(1)$ access by index and $O(1)$ appends at the end.
- **Cache Performance**: With their contiguous memory layout, arrays are efficient for tasks involving sequential data access.
- **Size Limitations**: Arrays have a fixed size after allocation. Resizing means creating a new array, leading to potential memory overhead or data-transfer costs.
- **Mid-Array Changes**: Operations like insertions or deletions are $O(n)$ due to the necessary element shifting.
- When to Use: Arrays are optimal for **known data sizes** and when rapid access or appends are critical. They're popular in numerical algorithms and cache-centric tasks.
- When to Rethink: Their static nature and inefficiency for **frequent mid-array changes** can make alternatives like linked lists or hash tables more suitable.
**Sparse arrays** are data structures optimized for arrays where most values are default (e.g., zero or null). They save memory by storing only the non-default values and their indices. In contrast, **dense arrays** allocate memory for every element, whether or not it holds a default value.
- Sparse Array: `[0, 0, 3, 0, 0, 0, 0, 9, 0, 0]`
- Dense Array: `[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]`
Sparse arrays offer **optimized memory usage**.
For example, in a million-element array where 90% are zeros:
- Dense Array: Allocates memory for every single element, even if the majority are zeros.
- Sparse Array: Drastically conserves memory by only allocating for non-zero elements.
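A rough illustration of that saving, comparing a dense list with a dict that stores only the non-zero entries (assuming a 64-bit CPython build; exact byte counts vary):

```python
import sys

n = 1_000_000
dense = [0] * n     # allocates a pointer slot for every element
dense[123] = 7      # only a couple of entries are actually non-zero
dense[9876] = 42

sparse = {123: 7, 9876: 42}  # store only the non-default values

print(sys.getsizeof(dense))   # roughly 8 MB of pointer slots
print(sys.getsizeof(sparse))  # a few hundred bytes
```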
- **Text Processing**: Efficiently represent term-document matrices in analytics, where not every word appears in every document.
- **Computer Graphics**: Represent 3D spaces in modeling, where many cells may be empty.
- **Scientific Computing**: Handle linear systems with sparse coefficient matrices, speeding up computations.
- **Databases**: Store tables with numerous missing values efficiently.
- **Networking**: Represent sparsely populated routing tables in networking equipment.
- **Machine Learning**: Efficiently handle high-dimensional feature vectors with many zeros.
- **Recommendation Systems**: Represent user-item interaction matrices where most users haven't interacted with most items.
Here is the Python code:
```python
class SparseArray:
    def __init__(self):
        self.data = {}

    def set(self, index, value):
        if value != 0:  # Only store non-zero values
            self.data[index] = value
        elif index in self.data:
            del self.data[index]

    def get(self, index):
        return self.data.get(index, 0)  # Return 0 if index is not stored


# Usage
sparse_array = SparseArray()
sparse_array.set(2, 3)
sparse_array.set(7, 9)
print(sparse_array.get(2))  # Output: 3
print(sparse_array.get(7))  # Output: 9
print(sparse_array.get(3))  # Output: 0
```
A **sorted array** is a data structure whose elements are stored in a specific, **predetermined sequence**, usually ascending or descending.

This ordering provides various benefits, such as **optimized search operations**, at the cost of more complex insertions and deletions.
- **Efficient Searches**: Sorted arrays are optimized for search, especially with algorithms like binary search, which runs in $O(\log n)$ time.
- **Additional Query Types**: They support specialized queries, such as bisection to find the closest element and range queries to identify elements within a specified range.
- **Cache Efficiency**: The contiguous memory layout improves cache utilization, which can lead to faster performance.
- **Slow Updates**: Insertions and deletions generally require shifting elements, leading to $O(n)$ time complexity for these operations.
- **Memory Overhead**: Maintaining the sorted structure can require extra memory, especially during updates.
- **Lack of Flexibility**: Sorted arrays are less flexible for dynamic resizing and can be problematic in parallel computing environments.
- Search-Heavy Applications: Suitable when rapid search operations are more common than updates, such as in financial analytics or in-memory databases.
- Static or Semi-Static Data: Ideal for datasets known in advance or that change infrequently.
- Memory Constraints: They are efficient for small, known datasets that require quick search capabilities.
- Access: $O(1)$
- Search: $O(\log n)$ using binary search
- Insertion: $O(1)$ when the new element belongs at the end, but usually $O(n)$ to maintain order
- Deletion: $O(1)$ for the last element, but usually $O(n)$ to maintain order
- Append: $O(1)$ if the appended value is the largest, but can spike to $O(n)$ when resizing or inserting in order
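As an illustrative sketch, Python's standard `bisect` module shows both sides of the trade-off: the lookup is logarithmic, but an ordered insert still pays a linear shifting cost:

```python
import bisect

sorted_arr = [10, 20, 30, 40, 50]

# O(log n) search: locate where 30 would sit, then check it is there
idx = bisect.bisect_left(sorted_arr, 30)
found = idx < len(sorted_arr) and sorted_arr[idx] == 30
print(found)  # True

# Ordered insertion: finding the slot is O(log n),
# but shifting the tail of the list makes the whole insert O(n)
bisect.insort(sorted_arr, 35)
print(sorted_arr)  # [10, 20, 30, 35, 40, 50]
```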
While both **heaps** and **sorted arrays** have their strengths, heaps are often preferred for dynamic data requiring frequent insertions and deletions.
- Dynamic Operations: Heaps excel in scenarios with frequent insertions and deletions, maintaining their structure efficiently.
- Memory Allocation: Heaps, especially when implemented as binary heaps, can be efficiently managed in memory as they're typically backed by arrays. Sorted arrays, on the other hand, might require periodic resizing or might have wasted space if over-allocated.
- Predictable Time Complexity: Heap operations have consistent time complexities, while sorted arrays can vary based on specific data scenarios.
- No Overhead for Sorting: Heaps ensure parents are either always smaller or larger than children, which suffices for many tasks without the overhead of maintaining full order as in sorted arrays.
For a binary min-heap:

- find-min: $O(1)$ – the root node always contains the minimum value.
- delete-min: $O(\log n)$ – removing the root is followed by a heapify (sift-down) to restore order.
- insert: $O(\log n)$ – the newly inserted element may need to be bubbled up to its correct position.

For a sorted array (ascending):

- find-min: $O(1)$ – the first element is the minimum.
- delete-min: $O(n)$ – removing the first element requires shifting all other elements.
- insert: $O(n)$ – binary search finds the insertion point in $O(\log n)$, but shifting elements makes it $O(n)$ in the worst case.
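These costs can be sketched with Python's standard `heapq` module, which maintains a binary min-heap inside a plain list:

```python
import heapq

data = [5, 1, 8, 3]
heapq.heapify(data)       # build the heap in O(n)

heapq.heappush(data, 0)   # insert: O(log n)
print(data[0])            # 0 -- find-min is O(1): just peek at the root

print(heapq.heappop(data))  # 0 -- delete-min: O(log n)
print(heapq.heappop(data))  # 1
```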
Indexing refers to accessing specific elements of an array using unique indices, which range from 0 to $n - 1$, where $n$ is the number of elements.
Arrays occupy adjacent memory locations, facilitating fast random access. All elements are uniformly sized. For example, a 32-bit integer consumes 4 bytes of memory.
The memory address of the $i$-th element is computed as:

$$\text{address}(i) = \text{base\_address} + i \times \text{element\_size}$$

Here, $\text{base\_address}$ is the address of the first element and $\text{element\_size}$ is the number of bytes each element occupies.
Here is the Python code:
```python
import ctypes
import numpy as np  # a NumPy array gives a real contiguous buffer; plain lists store object pointers

# Define an array of 32-bit integers (4 bytes each)
arr = np.array([10, 20, 30, 40, 50, 60], dtype=np.int32)

# Calculate the memory address of the third element
element_index = 2
base_address = arr.__array_interface__['data'][0]
element_address = base_address + element_index * arr.itemsize

# Retrieve the value stored at that address
element_value = ctypes.cast(element_address, ctypes.POINTER(ctypes.c_int32)).contents.value

print(f"The memory address of the third element is: {element_address}")
print(f"The value at that memory address is: {element_value}")  # 30
```
The task is to **merge two sorted arrays** into one combined, sorted array.
- Initialize the result array `C`, with counters `i = 0` for array `A` and `j = 0` for array `B`.
- While `i` is within the bounds of array `A` and `j` is within the bounds of array `B`:
  - If `A[i]` is less than `B[j]`, append `A[i]` to `C` and increment `i`.
  - If `A[i]` is greater than `B[j]`, append `B[j]` to `C` and increment `j`.
  - If `A[i]` is equal to `B[j]`, append both `A[i]` and `B[j]` to `C` and increment both `i` and `j`.
- If any elements remain in array `A`, append them to `C`.
- If any elements remain in array `B`, append them to `C`.
- Return the merged array `C`.
- Time Complexity: $O(n)$, where $n$ is the combined length of arrays `A` and `B`.
- Space Complexity: $O(n)$, for the output array.
Here is the Python code:
```python
def merge_sorted_arrays(a, b):
    merged_array, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            merged_array.append(a[i])
            i += 1
        elif a[i] > b[j]:
            merged_array.append(b[j])
            j += 1
        else:
            merged_array.extend([a[i], b[j]])
            i, j = i + 1, j + 1
    merged_array.extend(a[i:])
    merged_array.extend(b[j:])
    return merged_array


# Sample test
array1 = [1, 3, 5, 7, 9]
array2 = [2, 4, 6, 8, 10]
print(merge_sorted_arrays(array1, array2))
# Expected output: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```
The task is to implement **three stacks** using a **single dynamic array**.

To solve the task, we can **divide the array into three portions**, one region per stack, allowing each of them to **grow** and **shrink** without affecting the others.
**Initialize Stack States**:

- Set `size` as the full array length divided by 3.
- Set `stackPointers` as `[-1, -1, -1]`, where each entry tracks the top of its stack relative to that stack's region.

**Implement Push Operation**: for stack `k`, check that `stackPointers[k]` is less than `size - 1` before pushing; the element is written at absolute index `k * size + stackPointers[k]`.
- Time Complexity: $O(1)$ for all stack operations.
- Space Complexity: $O(1)$ extra per operation; the backing array itself is preallocated.
Here is the Python code:
```python
class MultiStack:
    def __init__(self, stack_size):
        self.stack_size = stack_size
        self.array = [None] * (3 * stack_size)
        # Top index of each stack, relative to its own region (-1 = empty)
        self.stack_pointers = [-1, -1, -1]

    def _index(self, stack_number):
        # Absolute index of the top element of the given stack
        return stack_number * self.stack_size + self.stack_pointers[stack_number]

    def push(self, stack_number, value):
        if self.stack_pointers[stack_number] >= self.stack_size - 1:
            print("Stack Overflow!")
            return
        self.stack_pointers[stack_number] += 1
        self.array[self._index(stack_number)] = value

    def pop(self, stack_number):
        if self.stack_pointers[stack_number] < 0:
            print("Stack Underflow!")
            return None
        value = self.array[self._index(stack_number)]
        self.stack_pointers[stack_number] -= 1
        return value

    def peek(self, stack_number):
        if self.stack_pointers[stack_number] < 0:
            print("Stack Underflow!")
            return None
        return self.array[self._index(stack_number)]
```
Array rotation shifts the elements of an array by a given number of positions. This operation can be useful in various scenarios, from data obfuscation to algorithmic optimizations.
- Left Rotation: Shifts elements to the left.
- Right Rotation: Shifts elements to the right.
- **Naive Method**: directly shift every element one position at a time, repeated $d$ times, where $d$ is the rotation factor, costing $O(n \times d)$.
- **Reversal Algorithm**: performs three specific **reversals** within the array to achieve the rotation in $O(n)$.
Here is the Python code:
```python
def reverse(arr, start, end):
    while start < end:
        arr[start], arr[end] = arr[end], arr[start]
        start += 1
        end -= 1


def rotate_array(arr, d):
    # Left-rotate arr by d positions using three reversals
    n = len(arr)
    reverse(arr, 0, d - 1)
    reverse(arr, d, n - 1)
    reverse(arr, 0, n - 1)


# Example
my_array = [1, 2, 3, 4, 5, 6, 7]
rotate_array(my_array, 3)
print(my_array)  # Output: [4, 5, 6, 7, 1, 2, 3]
```
- **Data Obfuscation**: Circular permutations of sensitive arrays can help keep data confidential.
- **Cryptography**: Techniques like the Caesar cipher use rotation to encrypt and decrypt messages; modern ciphers rely on more advanced versions of the same idea.
- **Memory Optimization**: Rotation can arrange data for optimal memory access, which matters for large datasets or limited memory resources.
- **Algorithm Optimization**: Some search and sorting algorithms perform better on a particular arrangement of elements; rotation lets you tailor the array accordingly.
Given an array, the objective is to **reverse** the sequence of its elements.
Two elements are selected from each end of the array and are swapped. This process continues, with the selected elements moving towards the center, until the entire array is reversed.
- Begin with two pointers: `start` at index 0 and `end` at the last index.
- Swap the elements at the `start` and `end` positions.
- Increment `start` and decrement `end`.
- Repeat the previous two steps until the pointers meet at the center of the array.
This algorithm **reverses the array in place**, with constant space complexity.

- Time Complexity: $O(n)$; the loop performs $n/2$ swaps.
- Space Complexity: $O(1)$, as no additional space is required.
Here is the Python code:
```python
def reverse_array(arr):
    start = 0
    end = len(arr) - 1
    while start < end:
        arr[start], arr[end] = arr[end], arr[start]
        start += 1
        end -= 1


# Example
arr = [1, 2, 3, 4, 5]
reverse_array(arr)
print("Reversed array:", arr)  # Output: [5, 4, 3, 2, 1]
```
Given a **sorted array**, the task is to **remove duplicate elements** in place (using constant space) and return the new length.

A two-pointer method provides an efficient solution that removes duplicates **in place** while also recording the new length of the array.
Algorithm steps:
- Initialize `i = 0` and `j = 1`.
- Iterate through the array:
  - If `array[i] == array[j]`, move `j` to the next element.
  - If `array[i] != array[j]`, increment `i`, write `array[j]` into the new `array[i]`, and move `j` to the next element.
- Time Complexity: $O(n)$, where $n$ represents the array's length.
- Space Complexity: $O(1)$; the process requires only a few additional variables.
Here is the Python code:
```python
def removeDuplicates(array):
    if not array:
        return 0
    i = 0
    for j in range(1, len(array)):
        if array[j] != array[i]:
            i += 1
            array[i] = array[j]
    return i + 1
```
Implement a **Queue** data structure using a fixed-size array.

While a **dynamic array** would be a more flexible choice, using a fixed-size array helps demonstrate the principles of queue operations.

- The queue follows first-in, first-out (FIFO) order: elements are removed from the front and added at the rear.
- When the rear pointer hits the array's end, it may wrap around to the beginning if slots are free, a technique known as a **circular (wrap-around) array**.
- **Initialize the queue**: set `front` and `rear` both to -1.
- **enqueue(item)**: check for a full queue, then:
  - If the queue is empty (`front == -1` and `rear == -1`), set `front` to 0.
  - Increment `rear` (wrapping around if needed) and add the item.
- **dequeue()**: check for an empty queue, then:
  - Remove the item at `front`.
  - If `front` equals `rear`, the removed item was the last one, so set both to -1; otherwise, increment `front` (wrapping around if needed).
- Time Complexity: $\text{enqueue}: O(1)$, $\text{dequeue}: O(1)$
- Space Complexity: $O(n)$
Here is the Python code:
```python
class Queue:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.queue = [None] * capacity
        self.front = self.rear = -1

    def is_full(self) -> bool:
        return self.front == (self.rear + 1) % self.capacity

    def is_empty(self) -> bool:
        return self.front == -1 and self.rear == -1

    def enqueue(self, item):
        if self.is_full():
            print("Queue is full")
            return
        if self.is_empty():
            self.front = self.rear = 0
        else:
            self.rear = (self.rear + 1) % self.capacity
        self.queue[self.rear] = item

    def dequeue(self):
        if self.is_empty():
            print("Queue is empty")
            return None
        item = self.queue[self.front]
        if self.front == self.rear:
            self.front = self.rear = -1
        else:
            self.front = (self.front + 1) % self.capacity
        return item

    def display(self):
        if self.is_empty():
            print("Queue is empty")
            return
        temp = self.front
        while temp != self.rear:
            print(self.queue[temp], end=" ")
            temp = (temp + 1) % self.capacity
        print(self.queue[self.rear])


# Usage
q = Queue(5)
for value in [1, 2, 3, 4, 5]:
    q.enqueue(value)
q.display()       # 1 2 3 4 5
q.enqueue(6)      # Queue is full
q.dequeue()
q.dequeue()
q.display()       # 3 4 5
```
Explore all 60 answers here 👉 Devinterview.io - Array Data Structure