LRU cache #45
Akshayy67 wants to merge 2 commits into algorithm-visualizer:master from Akshayy67:lru-cache-1
120 changes: 120 additions & 0 deletions in Lrucache_README.md
# LRU Cache (Least Recently Used Cache)

## Category
Uncategorized / Data Structures

## Description
An **LRU (Least Recently Used) Cache** is a data structure that stores a limited number of items and automatically evicts the least recently used item when the cache reaches its capacity. It is one of the most commonly used cache replacement policies in computer systems.

## How It Works
The LRU Cache maintains items in order of their usage:

1. **Most Recently Used (MRU)**: items at the front of the cache
2. **Least Recently Used (LRU)**: items at the back of the cache

### Operations

#### GET Operation
- Retrieves the value associated with a key
- If the key exists, it is moved to the front (marked as most recently used)
- Returns -1 if the key doesn't exist
- **Time Complexity**: O(1)

#### PUT Operation
- Inserts a new key-value pair or updates an existing one
- The item is placed at the front (most recently used position)
- If the cache is at capacity, the least recently used item is evicted
- **Time Complexity**: O(1)

## Implementation Details
This implementation uses two data structures:

1. **Circular Doubly-Linked List**: maintains the order of items based on usage
   - Head points to the most recently used item
   - Tail (`head.prev`) points to the least recently used item
   - Allows O(1) insertion and deletion
2. **Hash Map**: provides O(1) lookup of nodes by key
   - Maps keys to their corresponding nodes in the linked list
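The O(1) deletion claim comes from pointer rewiring: removing a node from a doubly-linked list touches only its two neighbors and needs no traversal. A minimal illustrative sketch (non-circular for brevity; the names here are hypothetical, not the PR's implementation):

```javascript
// Minimal doubly-linked list node: unlinking rewires two pointers, O(1).
class ListNode {
  constructor(value) {
    this.value = value;
    this.prev = null;
    this.next = null;
  }
}

// Splice newNode in immediately after node.
function insertAfter(node, newNode) {
  newNode.prev = node;
  newNode.next = node.next;
  if (node.next) node.next.prev = newNode;
  node.next = newNode;
}

// Detach node by connecting its neighbors to each other; no traversal needed.
function unlink(node) {
  if (node.prev) node.prev.next = node.next;
  if (node.next) node.next.prev = node.prev;
  node.prev = null;
  node.next = null;
}
```

Combined with a hash map from key to node, this is what lets both GET and PUT relocate an arbitrary item in constant time.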
## Algorithm Complexity | ||
| Operation | Time Complexity | Space Complexity| | ||
|-----------|----------------|------------------| | ||
| GET | O(1) | O(capacity) | | ||
| PUT | O(1) | O(capacity) | | ||
## Real-World Applications | ||
LRU Cache is widely used in: | ||
- **Operating Systems**: Page replacement in virtual memory | ||
- **Web Browsers**: Browser cache management | ||
- **Databases**: Query result caching | ||
- **CDNs**: Content delivery optimization | ||
- **Web Servers**: Page caching | ||
- **APIs**: Response caching to reduce load | ||
## Example Walkthrough | ||
``` | ||
Cache Capacity: 3 | ||
1. PUT(1, 100) → [1:100] | ||
2. PUT(2, 200) → [2:200, 1:100] | ||
3. PUT(3, 300) → [3:300, 2:200, 1:100] | ||
4. GET(1) → [1:100, 3:300, 2:200] (1 moved to front) | ||
5. PUT(4, 400) → [4:400, 1:100, 3:300] (2 evicted - LRU) | ||
6. GET(2) → -1 (not found - was evicted) | ||
``` | ||
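The walkthrough can be reproduced in a few lines. The sketch below is a hypothetical stand-in, not the list-based implementation in this PR; it leans on the fact that a JavaScript `Map` iterates keys in insertion order, so re-inserting a key on every access keeps the least recently used key first:

```javascript
// Simplified LRU sketch using Map's insertion order (hypothetical helper,
// not the PR's circular-list implementation).
class MapLRU {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map(); // oldest entry first, newest entry last
  }

  get(key) {
    if (!this.map.has(key)) return -1;
    const value = this.map.get(key);
    // Delete and re-insert to move the key to the "most recent" end.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  put(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // The first key in iteration order is the least recently used one.
      this.map.delete(this.map.keys().next().value);
    }
  }
}

// Replaying the walkthrough above:
const cache = new MapLRU(3);
cache.put(1, 100);
cache.put(2, 200);
cache.put(3, 300);
cache.get(1);      // returns 100; key 1 becomes most recently used
cache.put(4, 400); // capacity exceeded; key 2 (the LRU entry) is evicted
cache.get(2);      // returns -1: key 2 was evicted
```

This trades the explicit linked list for `Map` internals; the observable behavior matches the walkthrough step for step.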
## Key Observations
1. **Recency Matters**: both GET and PUT operations update an item's recency
2. **Automatic Eviction**: when capacity is reached, the LRU item is automatically removed
3. **Efficient**: all operations run in constant O(1) time
4. **Space-Time Tradeoff**: uses extra space (the hash map) to achieve O(1) time complexity

## Advantages
- **Fast Operations**: O(1) time for both get and put
- **Automatic Management**: no manual cleanup needed
- **Predictable Behavior**: clear eviction policy
- **Memory Efficient**: fixed memory footprint

## Disadvantages
- **Fixed Capacity**: cannot grow beyond the initial capacity
- **No Expiration**: items don't expire based on time (only on usage)
- **Memory Overhead**: requires additional space for the hash map and list pointers

## Variations
- **LFU (Least Frequently Used)**: evicts the items used least often
- **FIFO (First In, First Out)**: evicts the oldest items regardless of usage
- **LRU-K**: considers the K most recent accesses instead of just the last one
- **2Q**: uses two queues to better handle different access patterns
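The FIFO contrast can be made concrete with a small sketch (hypothetical, not part of this PR): here `get` does not refresh an entry, so eviction order depends only on when keys were inserted:

```javascript
// Hypothetical FIFO cache: unlike LRU, get() does not refresh an entry,
// so the first-inserted key is always evicted first.
class FIFOCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map(); // insertion order doubles as eviction order
  }

  get(key) {
    return this.map.has(key) ? this.map.get(key) : -1; // no reordering
  }

  put(key, value) {
    if (!this.map.has(key) && this.map.size >= this.capacity) {
      this.map.delete(this.map.keys().next().value); // evict oldest insert
    }
    this.map.set(key, value);
  }
}
```

With capacity 2, inserting keys 1 and 2, reading key 1, then inserting key 3 still evicts key 1: the recent read buys it nothing under FIFO, whereas LRU would have evicted key 2 instead.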
## Interview Tips
LRU Cache is a popular interview question because it tests:
- Understanding of data structures (hash map, linked list)
- Ability to combine multiple data structures
- Time/space complexity analysis
- Real-world system design knowledge

## References
- [LeetCode Problem #146: LRU Cache](https://leetcode.com/problems/lru-cache/)
- [Wikipedia: Cache Replacement Policies](https://en.wikipedia.org/wiki/Cache_replacement_policies)
- [System Design: Caching Strategies](https://aws.amazon.com/caching/)

## Author
Akshay Juluri
- GitHub: [@Akshayy67](https://github.com/Akshayy67)
- Project: [LRU Cache Visualizer](https://github.com/Akshayy67/LRU-Cache)
242 changes: 242 additions & 0 deletions in Uncategorized/LRU Cache/code.js
// Import visualization library
const {
  Array1DTracer,
  LogTracer,
  Layout,
  Tracer,
  VerticalLayout,
} = require("algorithm-visualizer");

/**
 * Node class for the circular doubly-linked list
 */
class Node {
  constructor(key, value) {
    this.key = key;
    this.value = value;
    this.next = null;
    this.prev = null;
  }
}

/**
 * LRU (Least Recently Used) Cache Implementation
 *
 * This cache maintains a fixed capacity and automatically evicts
 * the least recently used item when capacity is exceeded.
 *
 * Time Complexity: O(1) for both get and put operations
 * Space Complexity: O(capacity)
 */
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.size = 0;
    this.head = null;
    this.map = new Map();
  }

  /**
   * Get value by key. Marks the key as recently used.
   * Returns -1 if the key doesn't exist.
   */
  get(key) {
    if (!this.map.has(key)) return -1;
    this.moveToFront(key);
    return this.map.get(key).value;
  }

  /**
   * Insert or update a key-value pair.
   * If the cache is full, evicts the least recently used item.
   */
  put(key, value) {
    if (this.map.has(key)) {
      const node = this.map.get(key);
      node.value = value;
      this.moveToFront(key);
      return;
    }
    const newNode = new Node(key, value);
    this.map.set(key, newNode);
    this.insertAtFront(newNode);
    if (this.size > this.capacity) {
      this.removeLast();
    }
  }

  /**
   * Get the current cache state as an array (most recent first)
   */
  getState() {
    const result = [];
    let current = this.head;
    while (current) {
      result.push({ key: current.key, value: current.value });
      current = current.next;
      if (current === this.head) break;
    }
    return result;
  }

  insertAtFront(node) {
    if (!this.head) {
      node.next = node;
      node.prev = node;
      this.head = node;
    } else {
      const last = this.head.prev;
      node.next = this.head;
      node.prev = last;
      last.next = node;
      this.head.prev = node;
      this.head = node;
    }
    this.size++;
  }

  removeLast() {
    if (!this.head) return;
    const last = this.head.prev;
    this.map.delete(last.key);
    this.size--;
    if (this.head === last) {
      this.head = null;
      return;
    }
    last.prev.next = this.head;
    this.head.prev = last.prev;
  }

  moveToFront(key) {
    const node = this.map.get(key);
    if (node === this.head) return;
    node.prev.next = node.next;
    node.next.prev = node.prev;
    const last = this.head.prev;
    node.next = this.head;
    node.prev = last;
    last.next = node;
    this.head.prev = node;
    this.head = node;
  }
}

// Visualization Setup
const logger = new LogTracer("Operation Log");
const cacheTracer = new Array1DTracer(
  "LRU Cache State (Most Recent → Least Recent)"
);

// Helper function to update the visualization
function updateVisualization(cache, message) {
  const state = cache.getState();
  const keys = state.map((item) => `${item.key}:${item.value}`);
  if (keys.length > 0) {
    cacheTracer.set(keys);
    cacheTracer.patch(0); // Highlight the most recently used item
  }
  logger.println(message);
  Tracer.delay();
}

// Set up layout. The logger must be included in the layout tree,
// or its output is never rendered.
Layout.setRoot(new VerticalLayout([cacheTracer, logger]));
Tracer.delay();
// Demo: LRU Cache Operations
logger.println("=== LRU Cache Demonstration ===");
logger.println("Creating cache with capacity 4");
Tracer.delay();
const cache = new LRUCache(4);

// Operation 1: Insert items
logger.println("\n--- Inserting Items ---");
Tracer.delay();
cache.put(1, 100);
updateVisualization(cache, "PUT(1, 100): Added first item");
cache.put(2, 200);
updateVisualization(cache, "PUT(2, 200): Added second item");
cache.put(3, 300);
updateVisualization(cache, "PUT(3, 300): Added third item");
cache.put(4, 400);
updateVisualization(cache, "PUT(4, 400): Cache is now full");

// Operation 2: Access items (each access makes the item most recently used)
logger.println("\n--- Accessing Items ---");
Tracer.delay();
const value1 = cache.get(1);
updateVisualization(cache, `GET(1): Retrieved value ${value1}, moved to front`);
const value2 = cache.get(3);
updateVisualization(cache, `GET(3): Retrieved value ${value2}, moved to front`);

// Operation 3: Insert a new item (evicts the least recently used item)
logger.println("\n--- Eviction Demonstration ---");
Tracer.delay();
cache.put(5, 500);
updateVisualization(cache, "PUT(5, 500): Added new item, evicted key 2 (LRU)");

// Try to access the evicted item
const evicted = cache.get(2);
logger.println(`GET(2): Returns ${evicted} (not found - was evicted)`);
Tracer.delay();

// Operation 4: Update an existing item
logger.println("\n--- Updating Items ---");
Tracer.delay();
cache.put(1, 150);
updateVisualization(
  cache,
  "PUT(1, 150): Updated value of key 1, moved to front"
);

// Operation 5: More operations to show LRU behavior
logger.println("\n--- Complex Access Pattern ---");
Tracer.delay();
cache.get(4);
updateVisualization(cache, "GET(4): Accessed key 4, moved to front");
cache.put(6, 600);
updateVisualization(cache, "PUT(6, 600): Added new item, evicted key 3 (LRU)");
cache.get(1);
updateVisualization(cache, "GET(1): Accessed key 1, moved to front");
cache.put(7, 700);
updateVisualization(cache, "PUT(7, 700): Added new item, evicted key 5 (LRU)");

// Final state
logger.println("\n--- Final Cache State ---");
Tracer.delay();
const finalState = cache.getState();
logger.println("Cache contents (Most Recent → Least Recent):");
finalState.forEach((item, index) => {
  logger.println(`  ${index + 1}. Key: ${item.key}, Value: ${item.value}`);
});
Tracer.delay();

logger.println("\n=== Demonstration Complete ===");
logger.println("\nKey Observations:");
logger.println("1. Most recently accessed items stay at the front");
logger.println("2. When the cache is full, the least recently used item is evicted");
logger.println("3. Both GET and PUT operations update recency");
logger.println("4. All operations run in O(1) time");
Tracer.delay();