Lru cache #45


Open
Akshayy67 wants to merge 2 commits into algorithm-visualizer:master from Akshayy67:lru-cache-1
Lrucache_README.md (120 additions, 0 deletions)
# LRU Cache (Least Recently Used Cache)

## Category
Uncategorized / Data Structures

## Description

An **LRU (Least Recently Used) Cache** is a data structure that stores a limited number of items and automatically evicts the least recently used item when the cache reaches its capacity. It's one of the most commonly used cache replacement policies in computer systems.

## How It Works

The LRU Cache maintains items in order of their usage:

1. **Most Recently Used (MRU)**: Items at the front of the cache
2. **Least Recently Used (LRU)**: Items at the back of the cache

### Operations

#### GET Operation
- Retrieves the value associated with a key
- If the key exists, it's moved to the front (marked as most recently used)
- Returns -1 if the key doesn't exist
- **Time Complexity**: O(1)

#### PUT Operation
- Inserts a new key-value pair or updates an existing one
- The item is placed at the front (most recently used position)
- If the cache is at capacity, the least recently used item is evicted
- **Time Complexity**: O(1)

## Implementation Details

This implementation uses two data structures:

1. **Circular Doubly-Linked List**: Maintains the order of items based on usage
- Head points to the most recently used item
- Tail (head.prev) points to the least recently used item
- Allows O(1) insertion and deletion

2. **Hash Map**: Provides O(1) lookup of nodes by key
- Maps keys to their corresponding nodes in the linked list

## Algorithm Complexity

| Operation | Time Complexity | Space Complexity |
|-----------|-----------------|------------------|
| GET       | O(1)            | O(capacity)      |
| PUT       | O(1)            | O(capacity)      |

## Real-World Applications

LRU Cache is widely used in:

- **Operating Systems**: Page replacement in virtual memory
- **Web Browsers**: Browser cache management
- **Databases**: Query result caching
- **CDNs**: Content delivery optimization
- **Web Servers**: Page caching
- **APIs**: Response caching to reduce load

## Example Walkthrough

```
Cache Capacity: 3

1. PUT(1, 100) → [1:100]
2. PUT(2, 200) → [2:200, 1:100]
3. PUT(3, 300) → [3:300, 2:200, 1:100]
4. GET(1) → [1:100, 3:300, 2:200] (1 moved to front)
5. PUT(4, 400) → [4:400, 1:100, 3:300] (2 evicted - LRU)
6. GET(2) → -1 (not found - was evicted)
```

## Key Observations

1. **Recency Matters**: Both GET and PUT operations update an item's recency
2. **Automatic Eviction**: When capacity is reached, the LRU item is automatically removed
3. **Efficient**: All operations run in constant time O(1)
4. **Space-Time Tradeoff**: Uses extra space (HashMap) to achieve O(1) time complexity

## Advantages

- **Fast Operations**: O(1) time for both get and put
- **Automatic Management**: No manual cleanup needed
- **Predictable Behavior**: Clear eviction policy
- **Memory Efficient**: Fixed memory footprint

## Disadvantages

- **Fixed Capacity**: Cannot grow beyond initial capacity
- **No Expiration**: Items don't expire based on time (only on usage)
- **Memory Overhead**: Requires additional space for HashMap and pointers

## Variations

- **LFU (Least Frequently Used)**: Evicts items used least often
- **FIFO (First In First Out)**: Evicts oldest items regardless of usage
- **LRU-K**: Considers K most recent accesses instead of just the last one
- **2Q**: Uses two queues to better handle different access patterns

## Interview Tips

LRU Cache is a popular interview question because it tests:
- Understanding of data structures (HashMap, Linked List)
- Ability to combine multiple data structures
- Time/space complexity analysis
- Real-world system design knowledge

## References

- [LeetCode Problem #146: LRU Cache](https://leetcode.com/problems/lru-cache/)
- [Wikipedia: Cache Replacement Policies](https://en.wikipedia.org/wiki/Cache_replacement_policies)
- [System Design: Caching Strategies](https://aws.amazon.com/caching/)

## Author

Akshay Juluri
- GitHub: [@Akshayy67](https://github.com/Akshayy67)
- Project: [LRU Cache Visualizer](https://github.com/Akshayy67/LRU-Cache)

Uncategorized/LRU Cache/code.js (242 additions, 0 deletions)
// Import visualization library
const {
  Array1DTracer,
  LogTracer,
  Layout,
  Tracer,
} = require("algorithm-visualizer");

/**
* Node class for doubly-linked list
*/
class Node {
  constructor(key, value) {
    this.key = key;
    this.value = value;
    this.next = null;
    this.prev = null;
  }
}

/**
* LRU (Least Recently Used) Cache Implementation
*
* This cache maintains a fixed capacity and automatically evicts
* the least recently used item when capacity is exceeded.
*
* Time Complexity: O(1) for both get and put operations
* Space Complexity: O(capacity)
*/
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.size = 0;
    this.head = null; // Head of the circular doubly-linked list (MRU)
    this.map = new Map(); // key -> Node, for O(1) lookup
  }

  /**
   * Get value by key. Marks the key as recently used.
   * Returns -1 if key doesn't exist.
   */
  get(key) {
    if (!this.map.has(key)) return -1;
    this.moveToFront(key);
    return this.map.get(key).value;
  }

  /**
   * Insert or update a key-value pair.
   * If cache is full, evicts the least recently used item.
   */
  put(key, value) {
    if (this.map.has(key)) {
      const node = this.map.get(key);
      node.value = value;
      this.moveToFront(key);
      return;
    }

    const newNode = new Node(key, value);
    this.map.set(key, newNode);
    this.insertAtFront(newNode);

    if (this.size > this.capacity) {
      this.removeLast();
    }
  }

  /**
   * Get current cache state as an array, most recent first.
   */
  getState() {
    const result = [];
    let current = this.head;

    while (current) {
      result.push({ key: current.key, value: current.value });
      current = current.next;
      if (current === this.head) break; // Completed one lap of the circle
    }

    return result;
  }

  /**
   * Splice a node in at the head (MRU position) of the circular list.
   */
  insertAtFront(node) {
    if (!this.head) {
      // First node: points to itself in both directions
      node.next = node;
      node.prev = node;
      this.head = node;
    } else {
      const last = this.head.prev;
      node.next = this.head;
      node.prev = last;
      last.next = node;
      this.head.prev = node;
      this.head = node;
    }
    this.size++;
  }

  /**
   * Evict the tail (head.prev), i.e. the least recently used node.
   */
  removeLast() {
    if (!this.head) return;
    const last = this.head.prev;
    this.map.delete(last.key);
    this.size--;

    if (this.head === last) {
      // Removing the only node empties the list
      this.head = null;
      return;
    }

    last.prev.next = this.head;
    this.head.prev = last.prev;
  }

  /**
   * Unlink an existing node and re-insert it at the head (MRU position).
   */
  moveToFront(key) {
    const node = this.map.get(key);
    if (node === this.head) return; // Already most recently used

    // Unlink the node from its current position (O(1))
    node.prev.next = node.next;
    node.next.prev = node.prev;

    // Re-insert at the front
    const last = this.head.prev;
    node.next = this.head;
    node.prev = last;
    last.next = node;
    this.head.prev = node;
    this.head = node;
  }
}

// Visualization Setup
const logger = new LogTracer("Operation Log");
const cacheTracer = new Array1DTracer(
  "LRU Cache State (Most Recent → Least Recent)"
);

// Helper function to update visualization
function updateVisualization(cache, message) {
  const state = cache.getState();
  const keys = state.map((item) => `${item.key}:${item.value}`);

  if (keys.length > 0) {
    cacheTracer.set(keys);
    cacheTracer.patch(0); // Highlight most recently used
  }
  logger.println(message);
  Tracer.delay();
}

// Set up layout
Layout.setRoot(cacheTracer);
Tracer.delay();

// Demo: LRU Cache Operations
logger.println("=== LRU Cache Demonstration ===");
logger.println(`Creating cache with capacity 4`);
Tracer.delay();

const cache = new LRUCache(4);

// Operation 1: Insert items
logger.println("\n--- Inserting Items ---");
Tracer.delay();

cache.put(1, 100);
updateVisualization(cache, "PUT(1, 100): Added first item");

cache.put(2, 200);
updateVisualization(cache, "PUT(2, 200): Added second item");

cache.put(3, 300);
updateVisualization(cache, "PUT(3, 300): Added third item");

cache.put(4, 400);
updateVisualization(cache, "PUT(4, 400): Cache is now full");

// Operation 2: Access an item (makes it most recently used)
logger.println("\n--- Accessing Items ---");
Tracer.delay();

const value1 = cache.get(1);
updateVisualization(cache, `GET(1): Retrieved value ${value1}, moved to front`);

const value2 = cache.get(3);
updateVisualization(cache, `GET(3): Retrieved value ${value2}, moved to front`);

// Operation 3: Insert new item (will evict least recently used)
logger.println("\n--- Eviction Demonstration ---");
Tracer.delay();

cache.put(5, 500);
updateVisualization(cache, "PUT(5, 500): Added new item, evicted key 2 (LRU)");

// Try to access evicted item
const evicted = cache.get(2);
logger.println(`GET(2): Returns ${evicted} (not found - was evicted)`);
Tracer.delay();

// Operation 4: Update existing item
logger.println("\n--- Updating Items ---");
Tracer.delay();

cache.put(1, 150);
updateVisualization(
  cache,
  "PUT(1, 150): Updated value of key 1, moved to front"
);

// Operation 5: More operations to show LRU behavior
logger.println("\n--- Complex Access Pattern ---");
Tracer.delay();

cache.get(4);
updateVisualization(cache, "GET(4): Accessed key 4, moved to front");

cache.put(6, 600);
updateVisualization(cache, "PUT(6, 600): Added new item, evicted key 5 (LRU)");

cache.get(1);
updateVisualization(cache, "GET(1): Accessed key 1, moved to front");

cache.put(7, 700);
updateVisualization(cache, "PUT(7, 700): Added new item, evicted key 3 (LRU)");

// Final state
logger.println("\n--- Final Cache State ---");
Tracer.delay();
const finalState = cache.getState();
logger.println("Cache contents (Most Recent → Least Recent):");
finalState.forEach((item, index) => {
  logger.println(` ${index + 1}. Key: ${item.key}, Value: ${item.value}`);
});
Tracer.delay();

logger.println("\n=== Demonstration Complete ===");
logger.println("\nKey Observations:");
logger.println("1. Most recently accessed items stay at the front");
logger.println("2. When cache is full, least recently used item is evicted");
logger.println("3. Both GET and PUT operations update recency");
logger.println("4. All operations run in O(1) time complexity");
Tracer.delay();
