🟣 Queue Data Structure interview questions and answers to help you prepare for your next data structures and algorithms interview in 2025.
You can also find all 55 answers here 👉 Devinterview.io - Queue Data Structure
A queue is a data structure that adheres to the First-In-First-Out (FIFO) principle and is designed to hold a collection of elements.
- Enqueue: Adds an element to the end (rear) of the queue.
- Dequeue: Removes an element from the front of the queue.
- IsEmpty: Checks whether the queue is empty.
- IsFull: Checks whether the queue has reached its capacity.
- Peek: Views the front element without removing it.
All of these operations have a space complexity of $O(1)$.
- Order: Maintains the order of elements according to their arrival time.
- Size: Can be either bounded (fixed size) or unbounded (dynamic size).
- Accessibility: Typically provides access only to the elements at the front and at the rear.
- Time Complexity: Enqueue and dequeue operations usually run in $O(1)$ time.
- Ticket Counter: People form a queue, and the first person who joined the queue gets the ticket first.
- Printer Queue: Print jobs are processed in the order they were sent to the printer.
- Task Scheduling: Used by operating systems for managing processes ready to execute or awaiting specific events.
- Handling of Requests: Servers in multi-threaded environments queue multiple user requests, processing them in arrival order.
- Data Buffering: Supports asynchronous data transfers between processes, such as in IO buffers and pipes.
- Breadth-First Search: Employed in graph algorithms, like BFS, to manage nodes awaiting exploration (a BFS sketch follows the implementation code below).
- Order Processing: E-commerce platforms queue customer orders for processing.
- Call Center Systems: Incoming calls wait in a queue before connecting to the next available representative.
Here is the Python code:
```python
from collections import deque

class Queue:
    def __init__(self):
        self.queue = deque()

    def enqueue(self, item):
        self.queue.append(item)

    def dequeue(self):
        if not self.is_empty():
            return self.queue.popleft()
        raise Exception("Queue is empty.")

    def size(self):
        return len(self.queue)

    def is_empty(self):
        return len(self.queue) == 0

    def front(self):
        if not self.is_empty():
            return self.queue[0]
        raise Exception("Queue is empty.")

    def rear(self):
        if not self.is_empty():
            return self.queue[-1]
        raise Exception("Queue is empty.")

# Example Usage
q = Queue()
q.enqueue(5)
q.enqueue(6)
q.enqueue(3)
q.enqueue(2)
q.enqueue(7)

print("Queue:", list(q.queue))
print("Front:", q.front())
print("Rear:", q.rear())

q.dequeue()
print("After dequeue:", list(q.queue))
```
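As a concrete illustration of the BFS use case listed above, here is a minimal sketch of a breadth-first traversal driven by a queue; the `graph` adjacency dictionary and the `bfs` helper are illustrative examples, not part of the original answers.

```python
from collections import deque

def bfs(graph, start):
    """Visit nodes in breadth-first order using a FIFO queue."""
    visited = {start}
    order = []
    queue = deque([start])              # Holds discovered nodes awaiting exploration

    while queue:
        node = queue.popleft()          # Dequeue the oldest discovered node
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)  # Enqueue newly discovered neighbors
    return order

# Example usage on a small adjacency-list graph
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs(graph, "A"))  # ['A', 'B', 'C', 'D']
```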
The FIFO (First-In-First-Out) policy governs the way queues handle their elements: they are processed and removed in the same order in which they were added. Queues are designed around this principle, which makes them ideal for a host of real-world applications.
Elements are typically added at the rear and removed from the front. This design ensures that the earliest elements, those closest to the front, are processed and removed first.
- Enqueue (Add): New elements are positioned at the rear end.
- Dequeue (Remove): The front element is removed from the queue.
In a typical queue layout:
- Front: Pointing to the element about to be dequeued.
- Rear: Position where new elements will be enqueued.
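Since the original diagram is not reproduced here, a short example makes the front/rear roles concrete; the values are illustrative.

```python
from collections import deque

q = deque()
for x in [10, 20, 30]:
    q.append(x)    # New elements are enqueued at the rear

print(q[0])        # 10 -> front: the element about to be dequeued
print(q[-1])       # 30 -> rear: the most recently enqueued element
q.popleft()        # Dequeues the front element (10)
```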
Queues are adaptable data structures with diverse types, each optimized for specific tasks. Let's explore the different forms of queues and their functionalities.
A Simple Queue follows the basic FIFO principle: items are added at the end and removed from the beginning.
Here is the Python code:
```python
class SimpleQueue:
    def __init__(self):
        self.queue = []

    def enqueue(self, item):
        self.queue.append(item)

    def dequeue(self):
        if not self.is_empty():
            return self.queue.pop(0)

    def is_empty(self):
        return len(self.queue) == 0

    def size(self):
        return len(self.queue)
```
In a Circular Queue, the last position links back to the first, forming a circular arrangement. This structure uses a fixed-size array and wraps around upon reaching the end. It is more memory-efficient than a Simple Queue because it reuses the positions at the front that are left empty by dequeue operations.
Here is the Python code:
```python
class CircularQueue:
    def __init__(self, k):
        self.queue = [None] * k
        self.size = k
        self.front = self.rear = -1

    def enqueue(self, item):
        if self.is_full():
            return "Queue is full"
        elif self.is_empty():
            self.front = self.rear = 0
        else:
            self.rear = (self.rear + 1) % self.size
        self.queue[self.rear] = item

    def dequeue(self):
        if self.is_empty():
            return "Queue is empty"
        elif self.front == self.rear:
            temp = self.queue[self.front]
            self.front = self.rear = -1
            return temp
        else:
            temp = self.queue[self.front]
            self.front = (self.front + 1) % self.size
            return temp

    def is_empty(self):
        return self.front == -1

    def is_full(self):
        return (self.rear + 1) % self.size == self.front
```
A Priority Queue gives each item a priority. Items with higher priorities are dequeued before those with lower priorities. This is useful in scenarios like task scheduling, where some tasks need to be processed before others.
Here is the Python code:
```python
class PriorityQueue:
    def __init__(self):
        self.queue = []

    def enqueue(self, item, priority):
        self.queue.append((item, priority))
        self.queue.sort(key=lambda x: x[1], reverse=True)

    def dequeue(self):
        if not self.is_empty():
            return self.queue.pop(0)[0]

    def is_empty(self):
        return len(self.queue) == 0
```
A Double-Ended Queue (deque) allows items to be added or removed at both ends, giving it more flexibility than a simple queue.
Here is the Python code:
```python
from collections import deque

de_queue = deque()

de_queue.append(1)      # Add to rear
de_queue.appendleft(2)  # Add to front
de_queue.pop()          # Remove from rear
de_queue.popleft()      # Remove from front
```
An Input-Restricted Deque allows insertions at only one end (while removals are possible at both ends), whereas an Output-Restricted Deque allows removals from only one end (while insertions are possible at both ends). Both variants are sketched below.
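Here is a minimal sketch of both restricted variants, implemented as thin wrappers around `collections.deque`; the class and method names are illustrative, not from the original answers.

```python
from collections import deque

class InputRestrictedDeque:
    """Insertions allowed only at the rear; removals allowed at both ends."""
    def __init__(self):
        self._dq = deque()

    def insert_rear(self, item):
        self._dq.append(item)

    def remove_front(self):
        return self._dq.popleft()

    def remove_rear(self):
        return self._dq.pop()

class OutputRestrictedDeque:
    """Removals allowed only at the front; insertions allowed at both ends."""
    def __init__(self):
        self._dq = deque()

    def insert_front(self, item):
        self._dq.appendleft(item)

    def insert_rear(self, item):
        self._dq.append(item)

    def remove_front(self):
        return self._dq.popleft()
```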
Queues are data structures that follow a FIFO (First-In, First-Out) order, where elements are removed in the same sequence they were added.

Priority Queues, on the other hand, cater to elements with varying priorities. The key distinction is that while a queue processes items strictly in arrival order, a priority queue dictates the sequence based on the priority assigned to each element.
- Order: Queues ensure a consistent, predefined processing sequence, whereas priority queues handle items based on their assigned priority levels.
- Element Removal: Queues remove the oldest element, while priority queues remove the highest-priority item, so the same elements can leave the structure in very different orders.
- Queue (dequeue order): 1, 2, 3, 4, 5
- Priority Queue (assuming 4 has the highest priority): 4 is dequeued first, followed by the remaining elements according to their priorities.
- Support Functions: Since queues rely on a standard FIFO flow, they expose standard methods like `enqueue` and `dequeue`. In contrast, priority queues offer ways to set priorities and to locate or query elements based on their priority levels.
- Queues: Direct support.
- Priority Queues: Manage elements to sustain a specific order.
- Queues: Convenient for dynamic sizing and additions.
- Priority Queues: Manual management of element ordering.
- Queues: Not common, but a viable approach using structures like Heaps.
- Priority Queues: Heaps are the standard underlying structure, ensuring efficient operations based on priorities.
- Hash Tables
- Queues: Suitable for more sophisticated, fine-tuned queues.
- Priority Queues: Can be combined with other structures for varied implementations.
- Queues: Appropriate for scenarios where "first come, first served" is fundamental, such as printing tasks or handling multiple requests.
- Priority Queues: More suitable for contexts that require completing tasks in order of urgency or importance, such as real-time systems, traffic routing, or resource allocation (see the comparison sketch below).
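To make the removal-order difference concrete, here is a small comparison sketch using `collections.deque` for the FIFO queue and `heapq` for the priority queue; the task names and priorities are illustrative.

```python
from collections import deque
import heapq

# FIFO queue: elements leave in arrival order
fifo = deque()
for task in ["A", "B", "C"]:
    fifo.append(task)
print([fifo.popleft() for _ in range(len(fifo))])       # ['A', 'B', 'C']

# Priority queue: elements leave in priority order (lower number = higher priority)
pq = []
heapq.heappush(pq, (2, "B"))
heapq.heappush(pq, (1, "C"))
heapq.heappush(pq, (3, "A"))
print([heapq.heappop(pq)[1] for _ in range(len(pq))])   # ['C', 'B', 'A']
```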
Queues and Stacks provide structured ways to handle data, offering distinct advantages over more generic structures like Lists or Arrays.
- Queue: First-In-First-Out (FIFO). Ideal for ordered processing, such as print queues or BFS traversal.
- Stack: Last-In-First-Out (LIFO). Perfect for tasks that require reverse order, like undo actions or DFS traversal.
- List/Array: Random access. Suitable when you need direct access to arbitrary elements or do not require a strict processing order (see the short sketch below).
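A brief sketch contrasting the three access patterns; the values are illustrative.

```python
from collections import deque

# Queue (FIFO): the first element added is the first one out
queue = deque([1, 2, 3])
print(queue.popleft())  # 1

# Stack (LIFO): the last element added is the first one out
stack = [1, 2, 3]
print(stack.pop())      # 3

# List/Array (random access): any position can be read directly
items = [1, 2, 3]
print(items[1])         # 2
```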
Reversing a queue can be accomplished using a single stack or recursively. Both methods ensure the first element of the input queue becomes the last element of the resulting queue.
Here are the steps:
- Transfer Input to Stack: While the input queue isn't empty, dequeue elements and push them onto the stack.
- Transfer Stack to Output: Then, pop elements from the stack and enqueue them back into the queue. This reverses their order.
Here is the Python code:
```python
def reverse_queue(q):
    if not q:  # Base case: queue is empty
        return

    stack = []
    while q:
        stack.append(q.pop(0))  # Transfer queue to stack

    while stack:
        q.append(stack.pop())   # Transfer stack back to queue

    return q

# Test
q = [1, 2, 3, 4, 5]
print(f"Original queue: {q}")
reverse_queue(q)
print(f"Reversed queue: {q}")
```
- Time Complexity: $O(n)$, as it involves one pass through the queue and one pass through the stack for a queue of size $n$.
- Space Complexity: $O(n)$, since the stack temporarily stores all $n$ elements.
To reverse a queue recursively, you can follow this approach:
- Base Case: If the queue is empty, stop.
- Recurse: Call the reverse function recursively until all elements are dequeued.
- Enqueue Last Element: For each item being dequeued, enqueue it back into the queue after the recursion bottoms out, effectively reversing the order.
Here is the Python code:
```python
def reverse_queue_recursively(q):
    if not q:
        return

    front = q.pop(0)              # Dequeue the first element
    reverse_queue_recursively(q)  # Recurse for the remaining queue
    q.append(front)               # Enqueue the previously dequeued element at the end
    return q

# Test
q = [1, 2, 3, 4, 5]
print(f"Original queue: {q}")
reverse_queue_recursively(q)
print(f"Reversed queue: {q}")
```
- Time Complexity: $O(n^2)$ for the list-based queue shown here, because each `pop(0)` is an $O(n)$ operation and the recursion performs $n$ of them in sequence, giving $n + (n-1) + \ldots + 1 = \frac{n(n+1)}{2}$ operations in the worst case. With a queue that offers $O(1)$ dequeues, the recursion itself is $O(n)$.
- Space Complexity: $O(n)$, due to the recursion stack that holds one frame per element.
Static queues use a pre-defined amount of memory, typically an array, for efficient FIFO data handling.
Fixed Capacity: A static queue cannot dynamically adjust its size based on data volume or system requirements. As a result, it can become either underutilized or incapable of accommodating additional items.
Memory Fragmentation: If there's not enough contiguous memory to support queue expansion or changes, memory fragmentation occurs. This means that even if there's available memory in the system, it may not be usable by the static queue.
Memory fragmentation is more likely in long-running systems or when the queue has a high rate of enqueueing and dequeueing due to the "moving window" of occupied and freed space.
Potential for Data Loss: Enqueuing an item into a full static queue results in data loss. As there's no mechanism to signify that storage was exhausted, it's essential to maintain methods to keep track of the queue's status.
Time-Consuming Expansion: If the queue were to support expansion, each resize would take $O(n)$ time, linear in the current size of the queue. This is a significant downside compared to the amortized $O(1)$ enqueue offered by dynamic queues.

Inefficient Memory Usage: A static queue reserves a set amount of memory for its potential maximum size, which can be wasteful if the queue seldom reaches that size.
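A minimal sketch of the fixed-capacity behaviour described above, showing how a full static queue must signal overflow rather than grow; the class name and error messages are illustrative.

```python
class StaticQueue:
    """Array-backed queue with a fixed capacity; enqueueing when full is rejected."""
    def __init__(self, capacity):
        self.buffer = [None] * capacity
        self.capacity = capacity
        self.count = 0
        self.front = 0

    def enqueue(self, item):
        if self.count == self.capacity:
            raise OverflowError("Queue is full")  # Caller must handle the overflow
        self.buffer[(self.front + self.count) % self.capacity] = item
        self.count += 1

    def dequeue(self):
        if self.count == 0:
            raise IndexError("Queue is empty")
        item = self.buffer[self.front]
        self.front = (self.front + 1) % self.capacity
        self.count -= 1
        return item

# The third enqueue on a capacity-2 queue raises OverflowError
q = StaticQueue(2)
q.enqueue("a")
q.enqueue("b")
try:
    q.enqueue("c")
except OverflowError as e:
    print(e)  # Queue is full
```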
The task is to write an algorithm to perform both enqueue (add an item) and dequeue (remove an item) operations on a queue.
A Queue, often used in real-world scenarios with first-in, first-out (FIFO) logic, can be implemented using an array (for fixed-size) or linked list (for dynamic size).
- Enqueue Operation: Add an item at the rear of the queue.
- Dequeue Operation: Remove the item at the front of the queue.
Here is the Python code:
```python
class Queue:
    def __init__(self):
        self.items = []

    def enqueue(self, item):
        self.items.append(item)

    def dequeue(self):
        if not self.is_empty():
            return self.items.pop(0)
        return "Queue is empty"

    def is_empty(self):
        return self.items == []

    def size(self):
        return len(self.items)

# Example
q = Queue()
q.enqueue(2)
q.enqueue(4)
q.enqueue(6)
print("Dequeued:", q.dequeue())  # Output: Dequeued: 2
```
In this Python implementation, the enqueue operation has a time complexity of $O(1)$, while the dequeue operation has a time complexity of $O(n)$, because `pop(0)` shifts every remaining element one position forward.
One way to achieve $O(1)$ enqueue is to append new nodes at the tail of a singly linked list via a tail pointer.

However, dequeue operations on a single-ended list can be costly, potentially traversing the whole list. To keep dequeue times acceptable, you might limit how many elements can be enqueued before dequeuing is required, for example by giving the list a fixed capacity such as 100 or 1000 elements. The key is to ensure the amortized time per operation remains $O(1)$.
Here is the Python code:
```python
class Node:
    def __init__(self, data=None):
        self.data = data
        self.next = None

class LimitedQueue:
    def __init__(self, max_size):
        self.head = None
        self.tail = None
        self.max_size = max_size
        self.count = 0

    def enqueue(self, data):
        if self.count < self.max_size:
            new_node = Node(data)
            if not self.head:
                self.head = new_node
            else:
                self.tail.next = new_node
            self.tail = new_node
            self.count += 1
        else:
            print("Queue is full. Dequeue before adding more.")

    def dequeue(self):
        if self.head:
            data = self.head.data
            self.head = self.head.next
            self.count -= 1
            if self.count == 0:
                self.tail = None
            return data
        else:
            print("Queue is empty. Nothing to dequeue.")

    def display(self):
        current = self.head
        while current:
            print(current.data, end=" ")
            current = current.next
        print()

# Let's test the Queue
limited_queue = LimitedQueue(3)
limited_queue.enqueue(10)
limited_queue.enqueue(20)
limited_queue.enqueue(30)
limited_queue.display()    # Should display 10 20 30

limited_queue.enqueue(40)  # Should display 'Queue is full. Dequeue before adding more.'

limited_queue.dequeue()
limited_queue.display()    # Should display 20 30
```
While enqueue typically takes $O(1)$ time, the cost of dequeue depends on the underlying implementation.
In most traditional Queue implementations, enqueue and dequeue both operate in $O(1)$ time.
However, you can design special queues, like priority queues, where one operation is optimized at the cost of the other, for instance when using a binary heap.
The efficiency of both enqueue and dequeue is constrained by the binary heap's structure. A binary heap can be represented as a complete binary tree.

In a complete binary tree, every level except possibly the last is fully occupied, and the last level is filled from the left.
When the binary heap is visualized with the root at the top, the following rules are typically followed:
- Maximum Number of Children: Every node has at most two children, and all levels above the last are completely filled.
- Last Level Filled from the Left: If the last level is not full, its nodes are packed as far to the left as possible.
Suppose we represent such a binary heap using an array starting from index $0$: the node at index $i$ has its children at indices $2i + 1$ and $2i + 2$, and its parent at index $\lfloor (i - 1) / 2 \rfloor$.
Thus, both enqueue and dequeue rely on traversing the binary heap in a systematic manner. The following efficiencies are characteristic:

When enqueue is executed:
- The best case is $O(1)$: the newly appended element already satisfies the heap property relative to its parent, so no sifting is needed.
- The efficiency can degrade to $O(\log n)$ in the worst case, when the new element percolates all the way up to the root, comparing and potentially swapping with its ancestors at each level.
When dequeue is executed:
- The operation's efficiency spans from $O(1)$, when the root is removed and no reordering is needed, to $O(\log n)$, when the replacement node needs to 'bubble down' to its proper position (see the heapq sketch below).
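As a concrete illustration of these heap-backed costs, here is a minimal priority-queue sketch using Python's `heapq` module; both `heappush` and `heappop` run in $O(\log n)$ worst-case time. The sample values are illustrative.

```python
import heapq

heap = []

# Enqueue: each push may sift the new element up, costing up to O(log n)
for value in [7, 3, 9, 1]:
    heapq.heappush(heap, value)

# Dequeue: each pop removes the root and sifts the replacement down, up to O(log n)
while heap:
    print(heapq.heappop(heap))  # 1, 3, 7, 9 (smallest first)
```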
Using a singly linked list as a queue provides $O(1)$ enqueue and dequeue, but it offers no direct way to query the minimum or maximum element. Two common ways to track them are:
- Element Popularity Counter: Keep track of the number of times an element appears, so you can easily determine changes to the minimum and maximum when elements are added or removed.
- Auxiliary Data Structure: Alongside the queue, maintain a secondary data structure, such as a tree or stack, that helps identify the current minimum and maximum elements efficiently.
Here is the Python code:
```python
class NaiveQueue:
    def __init__(self):
        self.queue = []

    def push(self, item):
        self.queue.append(item)

    def pop(self):
        return self.queue.pop(0)

    def min(self):
        return min(self.queue)

    def max(self):
        return max(self.queue)
```
This code has `min` and `max` methods, but each call scans the entire queue, giving them $O(n)$ time complexity.
Here is the Python code:
```python
from collections import Counter

class EfficientQueue:
    def __init__(self):
        self.queue = []
        self.element_count = Counter()
        self.minimum = float('inf')
        self.maximum = float('-inf')

    def push(self, item):
        self.queue.append(item)
        self.element_count[item] += 1
        self.update_min_max(item)

    def pop(self):
        item = self.queue.pop(0)
        self.element_count[item] -= 1
        if self.element_count[item] == 0:
            del self.element_count[item]
        if item == self.minimum:
            self.minimum = min(self.element_count.elements(), default=float('inf'))
        if item == self.maximum:
            self.maximum = max(self.element_count.elements(), default=float('-inf'))
        return item

    def min(self):
        return self.minimum

    def max(self):
        return self.maximum

    def update_min_max(self, item):
        self.minimum = min(self.minimum, item)
        self.maximum = max(self.maximum, item)
```
This code has `min` and `max` methods that answer queries in $O(1)$ time, at the cost of extra bookkeeping on every push and an occasional recomputation when the current minimum or maximum is removed.
Here is the Python code:
```python
from queue import Queue
from collections import deque

class DualDataQueue:
    def __init__(self):
        self.queue = Queue()      # For standard queue operations
        self.max_queue = deque()  # To keep track of the current maximum

    def push(self, item):
        self.queue.put(item)
        while len(self.max_queue) > 0 and self.max_queue[-1] < item:
            self.max_queue.pop()
        self.max_queue.append(item)

    def pop(self):
        item = self.queue.get()
        if item == self.max_queue[0]:
            self.max_queue.popleft()
        return item

    def max(self):
        return self.max_queue[0]
```
This code has a `max` method; a `min` method can be added using the symmetric approach, as sketched below.
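For completeness, here is a minimal sketch of that symmetric min-tracking structure; the class name `DualDataMinQueue` and its layout are illustrative, not from the original answer.

```python
from queue import Queue
from collections import deque

class DualDataMinQueue:
    """Queue that also reports the current minimum in O(1) time."""
    def __init__(self):
        self.queue = Queue()      # Standard FIFO storage
        self.min_queue = deque()  # Monotonically non-decreasing deque of minimum candidates

    def push(self, item):
        self.queue.put(item)
        # Drop candidates larger than the new item; they can never be the minimum again
        while self.min_queue and self.min_queue[-1] > item:
            self.min_queue.pop()
        self.min_queue.append(item)

    def pop(self):
        item = self.queue.get()
        if item == self.min_queue[0]:
            self.min_queue.popleft()
        return item

    def min(self):
        return self.min_queue[0]
```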
Merging multiple queues is conceptually similar to merging two lists. However, direct merging is not free: every element has to be moved at least once.
- Enqueue into Aux: Until all input queues are empty, dequeue from each queue in turn and enqueue into $\text{auxQueue}$.
- Move Everything Back: Dequeue every item from $\text{auxQueue}$ and enqueue it into the destination queue.
- Return the Destination Queue: Once the transfer completes, the destination queue contains all the original elements.
- Time Complexity: The algorithm runs in $\mathcal{O}(n + m)$, where $n$ and $m$ represent the sizes of the input queues.
- Space Complexity: $\mathcal{O}(n + m)$ auxiliary space, since $\text{auxQueue}$ temporarily holds every element.
Here is the Python code:
```python
from queue import Queue

def merge_queues(q_list):
    auxQueue = Queue()

    # Step 1: Enqueue into Aux
    for q in q_list:
        while not q.empty():
            auxQueue.put(q.get())

    # Step 2: Move Everything Back into the destination queue
    # (here, the last queue in q_list is reused as the destination)
    for _ in range(auxQueue.qsize()):
        q.put(auxQueue.get())

    # Step 3: Return the destination queue
    return q
```
Here is the Python code with the test:
```python
# Creating queues
q1 = Queue()
q2 = Queue()

# Enqueueing elements
for i in range(5):
    q1.put(i)
for i in range(5, 10):
    q2.put(i)

# Merging
merged = merge_queues([q1, q2])

# Dequeuing and printing
while not merged.empty():
    print(merged.get())
```
Here is the Python code if we instead merge everything into a single new queue:
```python
def merge_queue_multi(q_list):
    merged = Queue()

    # Merging the queues
    for q in q_list:
        while not q.empty():
            merged.put(q.get())

    return merged
```
The auxiliary-queue version above is not as efficient in practice, since routing every item through $\text{auxQueue}$ makes each element traverse the structure more than once before reaching its final position.

Either way, each element is enqueued and dequeued a bounded number of times, so the overall work remains linear in the total number of elements.
Here is the Python code:
```python
def merge_queues_on_visit_multi(q_list):
    def on_visit(visit_cb):
        for q in q_list:
            while not q.empty():
                visit_cb(q.get())

    merged = Queue()
    on_visit(merged.put)
    return merged
```
Queues can be built using various underlying structures, each with its trade-offs in efficiency and complexity.
Using a simple array (a Python list) requires shifting elements when removing from the front, which makes dequeue a linear-time, $O(n)$, operation.
```python
class ArrayQueue:
    def __init__(self):
        self.queue = []

    def enqueue(self, item):
        self.queue.append(item)

    def dequeue(self):
        return self.queue.pop(0)
```
Using a singly linked list allows $O(1)$ enqueue via a tail pointer and $O(1)$ dequeue from the head.
```python
class Node:
    def __init__(self, data=None):
        self.data = data
        self.next = None

class LinkedListQueue:
    def __init__(self):
        self.head = None
        self.tail = None

    def enqueue(self, item):
        new_node = Node(item)
        if self.tail:
            self.tail.next = new_node
        else:
            self.head = new_node
        self.tail = new_node

    def dequeue(self):
        if self.head:
            data = self.head.data
            self.head = self.head.next
            if not self.head:
                self.tail = None
            return data
```
A doubly linked list enables $O(1)$ enqueue and dequeue by maintaining head and tail pointers, but it requires managing the extra `prev` references.
```python
class DNode:
    def __init__(self, data=None):
        self.data = data
        self.next = None
        self.prev = None

class DoublyLinkedListQueue:
    def __init__(self):
        self.head = None
        self.tail = None

    def enqueue(self, item):
        new_node = DNode(item)
        if not self.head:
            self.head = new_node
        else:
            self.tail.next = new_node
            new_node.prev = self.tail
        self.tail = new_node

    def dequeue(self):
        if self.head:
            data = self.head.data
            self.head = self.head.next
            if self.head:
                self.head.prev = None
            else:
                self.tail = None
            return data
```
The `collections.deque` in Python is a double-ended queue implemented as a doubly linked list of fixed-size blocks, providing $O(1)$ appends and pops at both ends.
```python
from collections import deque

class DequeQueue:
    def __init__(self):
        self.queue = deque()

    def enqueue(self, item):
        self.queue.append(item)

    def dequeue(self):
        return self.queue.popleft()
```
A binary heap, with its binary tree structure, is optimized for priority queues, achieving $O(\log n)$ enqueue and dequeue operations. This makes it useful when elements must be processed in a particular order.
```python
import heapq

class MinHeapQueue:
    def __init__(self):
        self.heap = []

    def enqueue(self, item):
        heapq.heappush(self.heap, item)

    def dequeue(self):
        return heapq.heappop(self.heap)
```
While array-based Queues are simple, they have inherent limitations.
- Structure: Uses an array to simulate a queue's First-In-First-Out (FIFO) behavior.
- Pointers: Utilizes a front and rear pointer/index.
Here is the Python code:
```python
class ArrayQueue:
    def __init__(self, size):
        self.size = size
        self.queue = [None] * size
        self.front = self.rear = -1

    def is_full(self):
        return self.rear == self.size - 1

    def is_empty(self):
        return self.front == -1 or self.front > self.rear

    def enqueue(self, element):
        if self.is_full():
            print("Queue is full")
            return
        if self.front == -1:
            self.front = 0
        self.rear += 1
        self.queue[self.rear] = element

    def dequeue(self):
        if self.is_empty():
            print("Queue is empty")
            return
        item = self.queue[self.front]
        self.front += 1
        if self.front > self.rear:
            self.front = self.rear = -1
        return item
```
- Fixed Size: Array size is predetermined, leading to potential memory waste or overflow.
- Element Front-Shifting: Deletions either necessitate shifting the remaining elements toward the front, an $O(n)$ cost, or (as in the code above) leave vacated slots that are never reused.
- Unequal Time Complexities: Operations like `enqueue` and `dequeue` can be $O(1)$ or $O(n)$ depending on the approach, making computation times less predictable.
15. What are the benefits of implementing a Queue with a Doubly Linked List versus a Singly Linked List?

Let's compare the benefits of implementing a Queue using a Doubly Linked List versus a Singly Linked List.
Singly Linked List:

- Simplicity: The implementation is straightforward and may require fewer lines of code.
- Memory Efficiency: Each node stores only a single reference to the next node, which saves memory.

Doubly Linked List:

- Bi-directional Traversal: Allows both forward and backward traversal, which is necessary for operations such as tail management or removing from the rear.
- Efficient Tail Operations: Eliminates the need to traverse the entire list to reach the node before the tail, significantly reducing the time complexity of operations that involve the rear (see the sketch below).
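To illustrate the tail-management benefit, here is a minimal sketch of a doubly linked queue with an $O(1)$ `remove_rear` operation, something a singly linked list cannot offer without an $O(n)$ traversal; the class and method names are illustrative.

```python
class DNode:
    def __init__(self, data=None):
        self.data = data
        self.next = None
        self.prev = None

class DoublyLinkedQueue:
    def __init__(self):
        self.head = None
        self.tail = None

    def enqueue(self, item):
        node = DNode(item)
        if self.tail:
            self.tail.next = node
            node.prev = self.tail
        else:
            self.head = node
        self.tail = node

    def dequeue(self):
        if not self.head:
            return None
        data = self.head.data
        self.head = self.head.next
        if self.head:
            self.head.prev = None
        else:
            self.tail = None
        return data

    def remove_rear(self):
        """O(1) removal from the rear, thanks to the prev pointer."""
        if not self.tail:
            return None
        data = self.tail.data
        self.tail = self.tail.prev
        if self.tail:
            self.tail.next = None
        else:
            self.head = None
        return data
```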
Explore all 55 answers here 👉 Devinterview.io - Queue Data Structure