A high-performance, lightweight LRU cache for JavaScript with strong UPDATE performance, competitive SET/GET/DELETE, and a compact bundle size. Built for developers who need fast caching without compromising on features.
```bash
npm install tiny-lru
# or
yarn add tiny-lru
# or
pnpm add tiny-lru
```
Requirements: Node.js ≥12
```javascript
import { lru } from "tiny-lru";

// Create cache and start using immediately
const cache = lru(100); // Max 100 items
cache.set('user:123', { name: 'John', age: 30 });
const user = cache.get('user:123'); // {name: 'John', age: 30}

// With TTL (5 second expiration)
const tempCache = lru(50, 5000);
tempCache.set('session', 'abc123'); // Automatically expires after 5 seconds
```
- ✨ Features & Benefits
- 📊 Performance Deep Dive
- 📖 API Reference
- 🚀 Getting Started
- 💡 Real-World Examples
- 🔗 Interoperability
- 🛠️ Development
- 📄 License
- 🔄 Strong Cache Updates - Excellent performance in update-heavy workloads
- 📦 Compact Bundle - Just ~2.2 KiB minified for a full-featured LRU library
- ⚖️ Balanced Performance - Competitive across all operations with O(1) complexity
- ⏱️ TTL Support - Optional time-to-live with automatic expiration
- 🔄 Method Chaining - Fluent API for better developer experience
- 🎯 TypeScript Ready - Full TypeScript support with complete type definitions
- 🌐 Universal Compatibility - Works seamlessly in Node.js and browsers
- 🛡️ Production Ready - Battle-tested and reliable
| Library | SET ops/sec | GET ops/sec | UPDATE ops/sec | DELETE ops/sec |
|---|---|---|---|---|
| tiny-lru | 404,753 | 1,768,449 | 1,703,716 | 298,770 |
| lru-cache | 326,221 | 1,069,061 | 878,858 | 277,734 |
| quick-lru | 591,683 | 1,298,487 | 935,481 | 359,600 |
| mnemonist | 412,467 | 2,478,778 | 2,156,690 | 0 |
Notes:
- Mean values computed from the Performance Summary across 5 consecutive runs of `npm run benchmark:comparison`.
- mnemonist lacks a compatible delete method in this harness, so DELETE ops/sec is 0.
- Performance varies by hardware, Node.js version, and workload patterns; run the provided benchmarks locally to assess your specific use case.
- Environment: Node.js v24.5.0, macOS arm64.
✅ Perfect for:
- Frequent cache updates - Leading UPDATE performance
- Mixed read/write workloads - Balanced across all operations
- Bundle size constraints - Compact library with full features
- Production applications - Battle-tested with comprehensive testing
```bash
# Run all performance benchmarks
npm run benchmark:all

# Individual benchmark suites
npm run benchmark:modern      # Comprehensive Tinybench suite
npm run benchmark:perf        # Performance Observer measurements
npm run benchmark:comparison  # Compare against other LRU libraries
```
```bash
npm install tiny-lru
# or
yarn add tiny-lru
# or
pnpm add tiny-lru
```
```javascript
import { lru } from "tiny-lru";

// Basic cache
const cache = lru(100);
cache.set('key1', 'value1')
  .set('key2', 'value2')
  .set('key3', 'value3');

console.log(cache.get('key1')); // 'value1'
console.log(cache.size); // 3

// With TTL (time-to-live)
const cacheWithTtl = lru(50, 30000); // 30 second TTL
cacheWithTtl.set('temp-data', { important: true });
// Automatically expires after 30 seconds

const resetCache = lru(25, 10000, true);
resetCache.set('session', 'user123');
// Because resetTtl is true, TTL resets when you set() the same key again
```
```html
<!-- ES Modules -->
<script type="module">
  import { lru, LRU } from 'https://cdn.skypack.dev/tiny-lru';
  const cache = lru(100);
</script>

<!-- UMD Bundle (global: window.lru) -->
<script src="https://unpkg.com/tiny-lru/dist/tiny-lru.umd.js"></script>
<script>
  const { lru, LRU } = window.lru;
  const cache = lru(100);
  // or: const cache = new LRU(100);
</script>
```
```typescript
import { lru, LRU } from "tiny-lru";

// Type-safe cache
const cache = lru<string>(100);
// or: const cache: LRU<string> = lru<string>(100);

cache.set('user:123', 'John Doe');
const user: string | undefined = cache.get('user:123');

// Class inheritance
class MyCache extends LRU<User> {
  constructor() {
    super(1000, 60000, true); // 1000 items, 1 min TTL, reset TTL on set
  }
}
```
```javascript
import { lru } from "tiny-lru";

const cache = lru(max, ttl = 0, resetTtl = false);
```
Parameters:
- `max` {Number} - Maximum number of items (0 = unlimited, default: 1000)
- `ttl` {Number} - Time-to-live in milliseconds (0 = no expiration, default: 0)
- `resetTtl` {Boolean} - Reset TTL when updating existing items via `set()` (default: false)
```javascript
import { LRU } from "tiny-lru";

const cache = new LRU(1000, 60000, true); // 1000 items, 1 min TTL, reset TTL on set
```
```javascript
// 1. Size your cache appropriately
const cache = lru(1000); // Not too small, not too large

// 2. Use meaningful keys
cache.set(`user:${userId}:profile`, userProfile);
cache.set(`product:${productId}:details`, productDetails);

// 3. Handle cache misses gracefully
function getData(key) {
  const cached = cache.get(key);
  if (cached !== undefined) {
    return cached;
  }

  // Fallback to slower data source
  const data = expensiveOperation(key);
  cache.set(key, data);
  return data;
}

// 4. Clean up when needed
process.on('exit', () => {
  cache.clear(); // Help garbage collection
});
```
- Cache Size: Keep cache size reasonable (1000-10000 items for most use cases)
- TTL Usage: Only use TTL when necessary; it adds overhead
- Key Types: String keys perform better than object keys (see the sketch after this list)
- Memory: Call `clear()` when done to help garbage collection
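When the natural lookup value is an object, deriving a short string key from its identifying fields keeps lookups on the fast path. A minimal sketch, where the `query` shape and `searchKey` helper are illustrative, not part of the library:

```javascript
import { lru } from "tiny-lru";

const search = lru(1000); // no TTL: skips per-item expiry bookkeeping

// Derive a string key from an object instead of using the object itself
function searchKey(query) {
  // query is an illustrative shape: { term, page, perPage }
  return `search:${query.term}:${query.page}:${query.perPage}`;
}

const query = { term: "lru", page: 1, perPage: 25 };
search.set(searchKey(query), ["tiny-lru", "lru-cache"]);
search.get(searchKey(query)); // ["tiny-lru", "lru-cache"]
```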
```javascript
import { lru } from "tiny-lru";

class ApiClient {
  constructor() {
    this.cache = lru(100, 300000); // 5 minute cache
  }

  async fetchUser(userId) {
    const cacheKey = `user:${userId}`;

    // Return cached result if available
    if (this.cache.has(cacheKey)) {
      return this.cache.get(cacheKey);
    }

    // Fetch from API and cache
    const response = await fetch(`/api/users/${userId}`);
    const user = await response.json();
    this.cache.set(cacheKey, user);
    return user;
  }
}
```
```javascript
import { lru } from "tiny-lru";

function memoize(fn, maxSize = 100) {
  const cache = lru(maxSize);

  return function (...args) {
    const key = JSON.stringify(args);

    if (cache.has(key)) {
      return cache.get(key);
    }

    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  };
}

// Usage
const expensiveCalculation = memoize((n) => {
  console.log(`Computing for ${n}`);
  return n * n * n;
}, 50);

console.log(expensiveCalculation(5)); // Computing for 5 -> 125
console.log(expensiveCalculation(5)); // 125 (cached)
```
```javascript
import { lru } from "tiny-lru";

class SessionManager {
  constructor() {
    // 30 minute TTL, with resetTtl enabled for set()
    this.sessions = lru(1000, 1800000, true);
  }

  createSession(userId, data) {
    const sessionId = this.generateId();
    const session = { userId, data, createdAt: Date.now() };
    this.sessions.set(sessionId, session);
    return sessionId;
  }

  getSession(sessionId) {
    // get() does not extend TTL; to extend, set the session again when resetTtl is true
    return this.sessions.get(sessionId);
  }

  endSession(sessionId) {
    this.sessions.delete(sessionId);
  }
}
```
Compatible with Lodash's `memoize` function cache interface:
```javascript
import _ from "lodash";
import { lru } from "tiny-lru";

_.memoize.Cache = lru().constructor;

const memoized = _.memoize(myFunc);
memoized.cache.max = 10;
```
Tiny LRU maintains 100% test coverage with comprehensive unit and integration tests.
```bash
# Run all tests with coverage
npm test

# Run tests with verbose output
npm run mocha

# Lint code
npm run lint

# Full build (lint + build)
npm run build
```
Test Coverage: 100% coverage across all modules
```
----------|---------|----------|---------|---------|-------------------
File      | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
----------|---------|----------|---------|---------|-------------------
All files |     100 |      100 |     100 |     100 |
 lru.js   |     100 |      100 |     100 |     100 |
----------|---------|----------|---------|---------|-------------------
```
```bash
# Clone and setup
git clone https://github.com/avoidwork/tiny-lru.git
cd tiny-lru
npm install

# Run tests
npm test

# Run linting
npm run lint

# Run benchmarks
npm run benchmark:all

# Build distribution files
npm run build
```
- Fork the repository on GitHub
- Clone your fork locally
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Develop your changes with tests
- Test thoroughly: `npm test && npm run lint`
- Commit using conventional commits: `git commit -m "feat: add amazing feature"`
- Push to your fork: `git push origin feature/amazing-feature`
- Submit a Pull Request
- Code Quality: Follow ESLint rules and existing code style
- Testing: Maintain 100% test coverage for all changes
- Documentation: Update README.md and JSDoc for API changes
- Performance: Benchmark changes that could impact performance
- Compatibility: Ensure Node.js ≥12 compatibility
- Commit Messages: Use Conventional Commits format
Creates a new LRU cache instance using the factory function.
Parameters:
- `max` {Number} - Maximum number of items to store (default: 1000; 0 = unlimited)
- `ttl` {Number} - Time-to-live in milliseconds (default: 0; 0 = no expiration)
- `resetTtl` {Boolean} - Reset TTL when updating existing items via `set()` (default: false)
Returns: {LRU} New LRU cache instance
Throws: {TypeError} When parameters are invalid
```javascript
import { lru } from "tiny-lru";

// Basic cache
const cache = lru(100);

// With TTL
const cacheWithTtl = lru(50, 30000); // 30 second TTL

// With resetTtl enabled for set()
const resetCache = lru(25, 10000, true);

// Validation errors
lru(-1);            // TypeError: Invalid max value
lru(100, -1);       // TypeError: Invalid ttl value
lru(100, 0, "no");  // TypeError: Invalid resetTtl value
```
{Object|null} - Item in first (least recently used) position
```javascript
const cache = lru();
cache.first; // null - empty cache
```
{Object|null} - Item in last (most recently used) position
```javascript
const cache = lru();
cache.last; // null - empty cache
```
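For a non-empty cache, `first` and `last` reflect LRU order. A minimal sketch, assuming the returned item exposes its `key` and `value` in the same shape documented for `setWithEvicted()`:

```javascript
import { lru } from "tiny-lru";

const cache = lru(10);
cache.set('a', 1).set('b', 2);
cache.get('a'); // promotes 'a' to most recently used

cache.first.key; // 'b' - least recently used
cache.last.key;  // 'a' - most recently used
```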
{Number} - Maximum number of items to hold in cache
```javascript
const cache = lru(500);
cache.max; // 500
```
{Boolean} - Whether to reset TTL when updating existing items via `set()`

```javascript
const cache = lru(500, 5 * 6e4, true);
cache.resetTtl; // true
```
{Number} - Current number of items in cache
```javascript
const cache = lru();
cache.size; // 0 - empty cache
```
{Number} - TTL in milliseconds (0 = no expiration)
```javascript
const cache = lru(100, 3e4);
cache.ttl; // 30000
```
Removes all items from cache.
Returns: {Object} LRU instance

```javascript
cache.clear();
```
Removes specified item from cache.
Parameters:
- `key` {String} - Item key

Returns: {Object} LRU instance

```javascript
cache.set('key1', 'value1');
cache.delete('key1');
console.log(cache.has('key1')); // false
```
Returns array of cache items as `[key, value]` pairs.
Parameters:
- `keys` {Array} - Optional array of specific keys to retrieve (defaults to all keys)

Returns: {Array} Array of `[key, value]` pairs

```javascript
cache.set('a', 1).set('b', 2);
console.log(cache.entries());      // [['a', 1], ['b', 2]]
console.log(cache.entries(['a'])); // [['a', 1]]
```
Removes the least recently used item from cache.
Returns: {Object} LRU instance

```javascript
cache.set('old', 'value').set('new', 'value');
cache.evict(); // Removes 'old' item
```
Gets expiration timestamp for cached item.
Parameters:
- `key` {String} - Item key

Returns: {Number|undefined} Expiration time (epoch milliseconds) or undefined if key doesn't exist

```javascript
const cache = new LRU(100, 5000); // 5 second TTL
cache.set('key1', 'value1');
console.log(cache.expiresAt('key1')); // timestamp 5 seconds from now
```
Retrieves cached item and promotes it to most recently used position.
Parameters:
- `key` {String} - Item key

Returns: {*} Item value or undefined if not found/expired

Note: `get()` does not reset or extend TTL. TTL is only reset on `set()` when `resetTtl` is `true`.

```javascript
cache.set('key1', 'value1');
console.log(cache.get('key1'));        // 'value1'
console.log(cache.get('nonexistent')); // undefined
```
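Since `get()` never extends TTL, one way to keep a hot entry alive is to write the value back with `set()` on a cache created with `resetTtl` enabled. A minimal sketch; the `touch` helper is illustrative, not part of the API:

```javascript
import { lru } from "tiny-lru";

const cache = lru(100, 60000, true); // resetTtl applies on set()

// Illustrative helper: read a value and refresh its TTL by re-setting it
function touch(key) {
  const value = cache.get(key);
  if (value !== undefined) {
    cache.set(key, value); // resetTtl: true -> expiry is pushed out again
  }
  return value;
}

cache.set('session:abc', { userId: 123 });
touch('session:abc'); // returns the session and extends its TTL
```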
Checks if key exists in cache (without promoting it).
Parameters:
- `key` {String} - Item key

Returns: {Boolean} True if key exists and is not expired

```javascript
cache.set('key1', 'value1');
console.log(cache.has('key1'));        // true
console.log(cache.has('nonexistent')); // false
```
Returns array of all cache keys in LRU order (first = least recent).
Returns: {Array} Array of keys

```javascript
cache.set('a', 1).set('b', 2);
cache.get('a'); // Move 'a' to most recent
console.log(cache.keys()); // ['b', 'a']
```
Stores item in cache as most recently used.
Parameters:
- `key` {String} - Item key
- `value` {*} - Item value

Returns: {Object} LRU instance

```javascript
cache.set('key1', 'value1')
  .set('key2', 'value2')
  .set('key3', 'value3');
```
Stores item and returns evicted item if cache was full.
Parameters:
- `key` {String} - Item key
- `value` {*} - Item value

Returns: {Object|null} Evicted item `{key, value, expiry, prev, next}` or null

```javascript
const cache = new LRU(2);
cache.set('a', 1).set('b', 2);
const evicted = cache.setWithEvicted('c', 3); // evicted = {key: 'a', value: 1, ...}
if (evicted) {
  console.log(`Evicted: ${evicted.key}`, evicted.value);
}
```
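One way to use the returned item is as a simple overflow hook, handing evicted entries to a secondary store. A minimal sketch, where the `coldStore` Map stands in for whatever slower tier you use:

```javascript
import { LRU } from "tiny-lru";

const hot = new LRU(2);      // small, fast tier
const coldStore = new Map(); // illustrative slower tier

function setWithOverflow(key, value) {
  const evicted = hot.setWithEvicted(key, value);
  if (evicted) {
    coldStore.set(evicted.key, evicted.value); // keep what fell out of the hot tier
  }
}

setWithOverflow('a', 1);
setWithOverflow('b', 2);
setWithOverflow('c', 3); // 'a' is evicted and lands in coldStore
```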
Returns array of cache values.
Parameters:
- `keys` {Array} - Optional array of specific keys to retrieve (defaults to all keys)

Returns: {Array} Array of values

```javascript
cache.set('a', 1).set('b', 2);
console.log(cache.values());      // [1, 2]
console.log(cache.values(['a'])); // [1]
```
Copyright (c) 2025 Jason Mulligan
Licensed under the BSD-3 license.