A lightweight, high-performance Least Recently Used (LRU) cache implementation for JavaScript with optional TTL (time-to-live) support. Works in both Node.js and browser environments.
npm install tiny-lru
import {lru} from "tiny-lru";
const cache = lru(max, ttl = 0, resetTtl = false);
import {LRU} from "tiny-lru";
// Create a cache with 1000 items, 1 minute TTL, reset on access
const cache = new LRU(1000, 60000, true);
// Create a cache with TTL
const cache2 = new LRU(100, 5000); // 100 items, 5 second TTL
cache2.set('key1', 'value1');
// After 5 seconds, key1 will be expired
import {LRU} from "tiny-lru";
class MyCache extends LRU {
  constructor(max, ttl, resetTtl) {
    super(max, ttl, resetTtl);
  }
}
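As an illustration of why you might extend the class, the sketch below adds a hypothetical getOrSet() helper on top of the documented has(), get() and set() methods (the helper is this example's own, not part of tiny-lru):

import {LRU} from "tiny-lru";

class MyCache extends LRU {
  // Hypothetical helper: return the cached value, or compute and cache it on a miss
  getOrSet(key, factory) {
    if (this.has(key)) {
      return this.get(key);
    }
    const value = factory(key);
    this.set(key, value);
    return value;
  }
}

const cache = new MyCache(100, 60000);
cache.getOrSet("answer", () => 42); // computes and caches 42
cache.getOrSet("answer", () => 0);  // returns the cached 42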
- max {Number} - Maximum number of items to store. 0 means unlimited (default: 1000)
- ttl {Number} - Time-to-live in milliseconds, 0 disables expiration (default: 0)
- resetTtl {Boolean} - Reset TTL on each set() operation (default: false)
The factory function validates parameters and throws TypeError for invalid values:
// Invalid parameters will throw TypeError
try {
  const cache = lru(-1); // Invalid max value
} catch (error) {
  console.error(error.message); // "Invalid max value"
}

try {
  const cache = lru(100, -1); // Invalid ttl value
} catch (error) {
  console.error(error.message); // "Invalid ttl value"
}

try {
  const cache = lru(100, 0, "true"); // Invalid resetTtl value
} catch (error) {
  console.error(error.message); // "Invalid resetTtl value"
}
Compatible with Lodash's memoize function cache interface:
import _ from "lodash";
import {lru} from "tiny-lru";
_.memoize.Cache = lru().constructor;
const memoized = _.memoize(myFunc);
memoized.cache.max = 10;
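A fuller sketch of the interop, where expensiveLookup stands in for any function you want to memoize (the function itself is hypothetical):

import _ from "lodash";
import {lru} from "tiny-lru";

_.memoize.Cache = lru().constructor;

// expensiveLookup is a hypothetical costly, pure function
const expensiveLookup = id => ({id, fetchedAt: Date.now()});

const memoized = _.memoize(expensiveLookup);
memoized.cache.max = 10; // bound the number of memoized results

memoized("user-1");                         // computed and stored in the LRU cache
memoized("user-1");                         // served from the cache
console.log(memoized.cache.has("user-1")); // true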
Tiny-LRU maintains 100% code coverage:
--------------|---------|----------|---------|---------|-------------------
File | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
--------------|---------|----------|---------|---------|-------------------
All files | 100 | 96.34 | 100 | 100 |
tiny-lru.cjs | 100 | 96.34 | 100 | 100 | 190,225,245
--------------|---------|----------|---------|---------|-------------------
Tiny-LRU includes a comprehensive benchmark suite for performance analysis and comparison. The benchmark suite uses modern Node.js best practices and popular benchmarking tools.
Comprehensive benchmark suite using Tinybench
Features:
- Statistically analyzed latency and throughput values
- Standard deviation, margin of error, variance calculations
- Proper warmup phases and statistical significance
- Realistic workload scenarios
Test categories:
- SET operations: Empty cache, full cache, eviction scenarios
- GET operations: Hit/miss patterns, access patterns
- Mixed operations: Real-world 80/20 read-write scenarios
- Special operations: Delete, clear, different data types
- Memory usage analysis
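To give a concrete sense of how such tasks are defined, here is a minimal Tinybench sketch in the spirit of the suite above (the task names, cache size, and timing options are illustrative, not the suite's actual configuration):

import {Bench} from "tinybench";
import {lru} from "tiny-lru";

const bench = new Bench({time: 1000, warmupTime: 100});
const cache = lru(100);
cache.set("key-1", "value"); // ensure the GET task hits

bench
  .add("set-random-empty-cache-100", () => {
    cache.set(`key-${Math.floor(Math.random() * 1000)}`, "value");
  })
  .add("get-hit", () => {
    cache.get("key-1");
  });

await bench.run();
console.table(bench.table());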
Native Node.js performance measurement using Performance Observer
Features:
- Function-level timing using performance.timerify()
- PerformanceObserver for automatic measurement collection
- Custom high-resolution timer implementations
- Scalability testing across different cache sizes
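A minimal sketch of that approach using Node's built-in perf_hooks module (the workload below is illustrative):

import {performance, PerformanceObserver} from "node:perf_hooks";
import {lru} from "tiny-lru";

const cache = lru(1000);

// Log each "function" performance entry produced by the timerified call
const observer = new PerformanceObserver(list => {
  for (const entry of list.getEntries()) {
    console.log(`${entry.name}: ${entry.duration.toFixed(4)} ms`);
  }
});
observer.observe({entryTypes: ["function"]});

// Wrap cache.set so every call is timed with high resolution
const timedSet = performance.timerify(cache.set.bind(cache));

for (let i = 0; i < 10; i++) {
  timedSet(`key-${i}`, i);
}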
# Run all modern benchmarks
npm run benchmark:all
# Run individual benchmark suites
npm run benchmark:modern # Tinybench suite
npm run benchmark:perf # Performance Observer suite
# Or run directly
node benchmarks/modern-benchmark.js
node benchmarks/performance-observer-benchmark.js
# Run with garbage collection exposed (for memory analysis)
node --expose-gc benchmarks/modern-benchmark.js
┌─────────┬─────────────────────────────┬─────────────────┬────────────────────┬──────────┬─────────┐
│ (index) │ Task Name │ ops/sec │ Average Time (ns) │ Margin │ Samples │
├─────────┼─────────────────────────────┼─────────────────┼────────────────────┼──────────┼─────────┤
│ 0 │ 'set-random-empty-cache-100'│ '2,486,234' │ 402.21854775934 │ '±0.45%' │ 1243117 │
- ops/sec: Operations per second (higher is better)
- Average Time: Average execution time in nanoseconds
- Margin: Statistical margin of error
- Samples: Number of samples collected for statistical significance
┌─────────────┬─────────┬────────────┬────────────┬────────────┬───────────────┬─────────┬────────┐
│ Function │ Calls │ Avg (ms) │ Min (ms) │ Max (ms) │ Median (ms) │ Std Dev │Ops/sec │
├─────────────┼─────────┼────────────┼────────────┼────────────┼───────────────┼─────────┼────────┤
│ lru.set │ 1000 │ 0.0024 │ 0.0010 │ 0.0156 │ 0.0020 │ 0.0012 │ 417292 │
For accurate benchmark results:
- Close other applications to reduce system noise
- Run multiple times and compare results
- Use consistent hardware for comparisons
- Enable garbage collection with --expose-gc for memory tests
- Consider CPU frequency scaling on laptops
- ✅ Consistent ops/sec across runs
- ✅ Low margin of error (< 5%)
- ✅ GET operations faster than SET
- ✅ Cache hits faster than misses
See benchmarks/README.md for complete documentation and advanced usage.
first {Object|null} - Item in first (least recently used) position
const cache = lru();
cache.first; // null - empty cache
last {Object|null} - Item in last (most recently used) position
const cache = lru();
cache.last; // null - empty cache
max {Number} - Maximum number of items to hold in cache
const cache = lru(500);
cache.max; // 500
resetTtl {Boolean} - Whether to reset TTL on each set() operation
const cache = lru(500, 5*6e4, true);
cache.resetTtl; // true
size {Number} - Current number of items in cache
const cache = lru();
cache.size; // 0 - empty cache
ttl {Number} - TTL in milliseconds (0 = no expiration)
const cache = lru(100, 3e4);
cache.ttl; // 30000
clear() - Removes all items from cache.
Returns: {Object} - LRU instance
cache.clear();
delete() - Removes specified item from cache.
Parameters:
- key {String} - Item key
Returns: {Object} - LRU instance
cache.set('key1', 'value1');
cache.delete('key1');
console.log(cache.has('key1')); // false
entries() - Returns array of cache items as [key, value] pairs.
Parameters:
- keys {Array} - Optional array of specific keys to retrieve (defaults to all keys)
Returns: {Array} - Array of [key, value] pairs
cache.set('a', 1).set('b', 2);
console.log(cache.entries()); // [['a', 1], ['b', 2]]
console.log(cache.entries(['a'])); // [['a', 1]]
evict() - Removes the least recently used item from cache.
Returns: {Object} - LRU instance
cache.set('old', 'value').set('new', 'value');
cache.evict(); // Removes 'old' item
See also: setWithEvicted()
expiresAt() - Gets expiration timestamp for cached item.
Parameters:
- key {String} - Item key
Returns: {Number|undefined} - Expiration time (epoch milliseconds) or undefined if key doesn't exist
const cache = new LRU(100, 5000); // 5 second TTL
cache.set('key1', 'value1');
console.log(cache.expiresAt('key1')); // timestamp 5 seconds from now
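The returned timestamp can also be used to compute the remaining lifetime of an entry, for example:

const remainingMs = cache.expiresAt('key1') - Date.now();
console.log(remainingMs); // roughly 5000 right after the set()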
get() - Retrieves cached item and promotes it to most recently used position.
Parameters:
- key {String} - Item key
Returns: {*} - Item value or undefined if not found/expired
cache.set('key1', 'value1');
console.log(cache.get('key1')); // 'value1'
console.log(cache.get('nonexistent')); // undefined
has() - Checks if key exists in cache (without promoting it).
Parameters:
- key {String} - Item key
Returns: {Boolean} - True if key exists and is not expired
cache.set('key1', 'value1');
console.log(cache.has('key1')); // true
console.log(cache.has('nonexistent')); // false
keys() - Returns array of all cache keys in LRU order (first = least recent).
Returns: {Array} - Array of keys
cache.set('a', 1).set('b', 2);
cache.get('a'); // Move 'a' to most recent
console.log(cache.keys()); // ['b', 'a']
set() - Stores item in cache as most recently used.
Parameters:
- key {String} - Item key
- value {*} - Item value
Returns: {Object} - LRU instance
cache.set('key1', 'value1')
.set('key2', 'value2')
.set('key3', 'value3');
See also: get(), setWithEvicted()
setWithEvicted() - Stores item and returns evicted item if cache was full.
Parameters:
- key {String} - Item key
- value {*} - Item value
Returns: {Object|null} - Evicted item {key, value, expiry, prev, next} or null
const cache = new LRU(2);
cache.set('a', 1).set('b', 2);
const evicted = cache.setWithEvicted('c', 3); // evicted = {key: 'a', value: 1, ...}
if (evicted) {
  console.log(`Evicted: ${evicted.key}`, evicted.value);
}
values() - Returns array of cache values.
Parameters:
- keys {Array} - Optional array of specific keys to retrieve (defaults to all keys)
Returns: {Array} - Array of values
cache.set('a', 1).set('b', 2);
console.log(cache.values()); // [1, 2]
console.log(cache.values(['a'])); // [1]
import {lru} from "tiny-lru";
// Create a cache with max 100 items
const cache = lru(100);
cache.set('key1', 'value1');
console.log(cache.get('key1')); // 'value1'
// Method chaining
cache.set("user:123", {name: "John", age: 30})
.set("session:abc", {token: "xyz", expires: Date.now()});
const user = cache.get("user:123"); // Promotes to most recent
console.log(cache.size); // 2
import {LRU} from "tiny-lru";
const cache = new LRU(50, 5000); // 50 items, 5s TTL
cache.set("temp-data", {result: "computed"});
setTimeout(() => {
  console.log(cache.get("temp-data")); // undefined - expired
}, 6000);
const cache = lru(100, 10000, true); // Reset TTL on each set()
cache.set("session", {user: "admin"});
// Each subsequent set() resets the 10s TTL
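One way to observe the effect, using the expiresAt() method described above (an illustrative check, not part of the original example):

const firstExpiry = cache.expiresAt("session");

setTimeout(() => {
  cache.set("session", {user: "admin"}); // resetTtl: true pushes the expiry forward
  console.log(cache.expiresAt("session") > firstExpiry); // true
}, 1000);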
Copyright (c) 2025 Jason Mulligan
Licensed under the BSD-3 license.