Fix: Node.js JavaScript Heap Out of Memory
Quick Answer
How to fix Node.js 'JavaScript heap out of memory' — increasing heap size, finding memory leaks with heap snapshots, fixing common leak patterns, and stream-based processing for large data.
The Error
Node.js crashes with an out-of-memory error:
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
1: 0xb7c560 node::Abort() [node]
2: 0xa914f5 node::FatalError(char const*, char const*) [node]
3: 0xd886fe v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [node]
4: 0xd88a37 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [node]
Aborted (core dumped)

Or a more specific message:

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory

Or the process is killed by the OS without a message (the OOM killer on Linux).
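If you want to reproduce the crash deliberately (for example, to test monitoring or restart behavior), a minimal sketch is to retain ever-growing data. The file name leak.js and the numbers are illustrative:

```javascript
// leak.js — retains roughly 1 MB of heap per call, forever.
// Run with a small heap to hit the limit quickly:
//   node --max-old-space-size=64 leak.js
const retained = [];

function leakOnce() {
  // An array of 131,072 doubles is roughly 1 MB of heap.
  retained.push(new Array(131072).fill(Math.random()));
  return retained.length;
}

// Uncomment to crash with "JavaScript heap out of memory":
// setInterval(leakOnce, 10);
```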
Why This Happens
Node.js’s V8 JavaScript engine has a default heap size limit — historically about 1.5 GB on 64-bit systems (less on 32-bit); recent Node.js versions derive the default from available system memory, but it is always finite. When the heap exceeds this limit, Node.js crashes. Common causes:
- Memory leak — objects are allocated but never garbage collected because a reference chain keeps them alive. The heap grows without bound.
- Loading large files into memory — reading an entire CSV, JSON, or log file with fs.readFileSync() or JSON.parse() when the file is larger than available memory.
- Processing large arrays — building a result array with millions of items instead of streaming or batching.
- Event listener accumulation — adding event listeners without removing them, usually inside loops or repeated function calls.
- Caching without eviction — an in-memory cache that grows forever without an LRU or TTL policy.
- Circular references in closures — closures that reference large objects, keeping them alive even after they’re no longer needed.
- Recursive processing — deep recursion on large data structures building a large call stack and heap simultaneously.
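The first cause, a live reference chain, is easy to observe with process.memoryUsage(). The registry below is a hypothetical example of a module-level structure that silently retains per-request data:

```javascript
// A module-level Map is never garbage collected, so everything it
// references stays alive with it.
const registry = new Map();

function handleRequest(id) {
  const payload = Array.from({ length: 100000 }, (_, i) => i);
  // Reference chain: registry -> entry -> payload. Even after the
  // request is "done", the payload cannot be collected.
  registry.set(id, { payload, done: true });
}

const before = process.memoryUsage().heapUsed;
for (let id = 0; id < 20; id++) handleRequest(id);
const grewBy = process.memoryUsage().heapUsed - before;
console.log(`heapUsed grew by ~${Math.round(grewBy / 1024 / 1024)} MB`);

// The fix is to break the chain when the data is no longer needed:
// registry.delete(id);
```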
Fix 1: Increase the Heap Size (Temporary Fix)
Increase Node.js’s maximum heap size to buy time while you find the underlying issue:
# Increase to 4 GB
node --max-old-space-size=4096 server.js
# Or set via environment variable
NODE_OPTIONS="--max-old-space-size=4096" node server.js
# In package.json scripts
{
"scripts": {
"start": "node --max-old-space-size=4096 server.js",
"build": "NODE_OPTIONS=--max-old-space-size=4096 webpack"
}
}

This is a workaround, not a fix. If the process has a memory leak, it will still crash — just later. Use this to prevent crashes while you diagnose the leak.
Rule of thumb for heap size: set it to ~75% of available RAM, leaving room for the OS and other processes. On a 4 GB server: --max-old-space-size=3072.
Fix 2: Profile Memory with Heap Snapshots
Find the leak by taking heap snapshots before and after the memory grows:
Using Node.js built-in v8.writeHeapSnapshot():
const v8 = require('v8');
// Take a snapshot at startup
v8.writeHeapSnapshot();
// Run your workload...
// Take another snapshot after memory grows
v8.writeHeapSnapshot();
// Compare the two snapshots in Chrome DevTools

Open Chrome DevTools → Memory → Load Profile → load the .heapsnapshot files → Compare Snapshots.
Objects that grew between snapshot 1 and snapshot 2 are your leak candidates.
Trigger snapshot via HTTP endpoint (for production diagnosis):
const v8 = require('v8');
const path = require('path');
app.get('/debug/heap-snapshot', (req, res) => {
// Protect this endpoint — internal/admin only!
const filename = v8.writeHeapSnapshot(path.join('/tmp', `heap-${Date.now()}.heapsnapshot`));
res.json({ snapshot: filename });
});

Fix 3: Fix Event Listener Leaks
Adding event listeners inside functions that are called repeatedly without removing them causes the listener count (and referenced objects) to grow indefinitely:
// LEAK — each call to setupHandler adds a new listener
function setupHandler(emitter) {
emitter.on('data', (data) => {
// This closure captures 'data' and prevents GC
processData(data);
});
}
// Called many times → listeners pile up
setInterval(() => setupHandler(eventEmitter), 1000);

// FIX — remove the listener when done
function setupHandler(emitter) {
const handler = (data) => processData(data);
emitter.on('data', handler);
// Return cleanup function
return () => emitter.off('data', handler);
}
// Or use 'once' for single-use listeners
emitter.once('data', handler);

Detect listener leaks early:
// Node.js warns when more than 10 listeners are added to one event
// Increase the limit if you legitimately need more
emitter.setMaxListeners(20);
// Or check current listener count
console.log(emitter.listenerCount('data'));

Fix 4: Stream Large Files Instead of Loading Them
Reading large files entirely into memory is the most common cause of crashes in data processing scripts:
// CRASHES for large files — loads entire file into memory
const data = fs.readFileSync('huge-file.csv', 'utf8');
const rows = data.split('\n');
// 10 GB file → 10 GB in memory
// FIX — process line by line with streams
const readline = require('readline');
const fs = require('fs');
const rl = readline.createInterface({
input: fs.createReadStream('huge-file.csv'),
crlfDelay: Infinity,
});
rl.on('line', (line) => {
processRow(line); // Process one line at a time — minimal memory
});
rl.on('close', () => {
console.log('Done processing');
});

For JSON files too large to parse at once, use stream-json:
npm install stream-json

const fs = require('fs');
const { parser } = require('stream-json');
const { streamArray } = require('stream-json/streamers/StreamArray');
fs.createReadStream('huge.json')
.pipe(parser())
.pipe(streamArray())
.on('data', ({ key, value }) => {
processItem(value); // One item at a time
})
.on('end', () => console.log('Done'));

Fix 5: Batch Large Database Queries
Fetching millions of rows from a database at once fills the heap:
// CRASHES for large tables
const allUsers = await db.query('SELECT * FROM users');
// 1M users × 500 bytes each = 500 MB in memory
// FIX — process in batches
async function processAllUsers() {
const batchSize = 1000;
let offset = 0;
while (true) {
const batch = await db.query(
// NOTE: OFFSET scans get slower as the offset grows; for very large
// tables, keyset pagination (WHERE id > lastSeenId) is faster
'SELECT * FROM users ORDER BY id LIMIT $1 OFFSET $2',
[batchSize, offset]
);
if (batch.rows.length === 0) break;
for (const user of batch.rows) {
await processUser(user);
}
offset += batchSize;
console.log(`Processed ${offset} users`);
}
}

For PostgreSQL, use cursors for true streaming:
const Cursor = require('pg-cursor'); // npm install pg pg-cursor
const cursor = client.query(new Cursor('SELECT * FROM users ORDER BY id'));
async function processWithCursor() {
while (true) {
const rows = await cursor.read(100); // Read 100 rows at a time
if (rows.length === 0) break;
for (const row of rows) await processUser(row);
}
await cursor.close();
}

Fix 6: Implement Cache Eviction
Caches without eviction grow until the process crashes:
// LEAK — cache grows without bound
const cache = new Map();
function getCachedData(key) {
if (cache.has(key)) return cache.get(key);
const value = expensiveCompute(key);
cache.set(key, value); // Never evicted
return value;
}

Fix with a size-limited LRU cache:
npm install lru-cache

const { LRUCache } = require('lru-cache');
const cache = new LRUCache({
max: 1000, // Maximum 1000 entries
maxSize: 50_000_000, // Maximum 50 MB total
sizeCalculation: (value) => JSON.stringify(value).length,
ttl: 1000 * 60 * 60, // Entries expire after 1 hour
});
function getCachedData(key) {
if (cache.has(key)) return cache.get(key);
const value = expensiveCompute(key);
cache.set(key, value);
return value;
}

Use WeakMap for object-keyed caches — entries are automatically garbage collected when the key object is no longer referenced:
const resultCache = new WeakMap();
function getCachedResult(obj) {
if (resultCache.has(obj)) return resultCache.get(obj);
const result = compute(obj);
resultCache.set(obj, result); // Automatically GC'd when obj is GC'd
return result;
}

Fix 7: Monitor Memory in Production
Detect memory growth before it causes a crash:
// Log memory usage periodically
setInterval(() => {
const usage = process.memoryUsage();
console.log({
heapUsed: `${Math.round(usage.heapUsed / 1024 / 1024)} MB`,
heapTotal: `${Math.round(usage.heapTotal / 1024 / 1024)} MB`,
rss: `${Math.round(usage.rss / 1024 / 1024)} MB`,
external: `${Math.round(usage.external / 1024 / 1024)} MB`,
});
}, 30000); // Every 30 seconds

Auto-restart on memory threshold:
const MEMORY_LIMIT_MB = 1024; // Restart if heap exceeds 1 GB
setInterval(() => {
const heapMB = process.memoryUsage().heapUsed / 1024 / 1024;
if (heapMB > MEMORY_LIMIT_MB) {
console.error(`Memory limit exceeded (${Math.round(heapMB)} MB). Restarting...`);
process.exit(1); // PM2 or Docker will restart the process
}
}, 10000);

Use PM2 with memory restart limit:
pm2 start server.js --max-memory-restart 1G
# PM2 restarts the process if it exceeds 1 GB

Still Not Working?
Force a garbage collection to separate “memory leak” from “high but stable memory usage”:
node --expose-gc server.js

// Trigger GC manually (only works with --expose-gc flag)
global.gc();
console.log('After GC:', process.memoryUsage().heapUsed / 1024 / 1024, 'MB');

If memory drops significantly after forced GC, the issue is that GC isn’t running frequently enough (a GC tuning problem, not a leak). If memory stays high after GC, objects are being retained by references (a real leak).
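That before/after check can be turned into a periodic probe. A minimal sketch, assuming the process was started with --expose-gc (without the flag it degrades to a plain reading):

```javascript
// Returns heapUsed after forcing a full GC, when available. A post-GC
// baseline that keeps climbing over time indicates a real leak.
function heapAfterGC() {
  if (typeof global.gc === 'function') {
    global.gc(); // only defined when started with --expose-gc
  }
  return process.memoryUsage().heapUsed;
}

// Example: log the post-GC baseline every minute.
// setInterval(() => {
//   console.log(`post-GC heapUsed: ${Math.round(heapAfterGC() / 1024 / 1024)} MB`);
// }, 60000);
```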
Check --max-semi-space-size — the young-generation (semi-space) heap also has its own limit. For allocation-heavy workloads:
node --max-semi-space-size=128 --max-old-space-size=4096 server.js

Use clinic.js for production-quality profiling:
npm install -g clinic
clinic heapprofiler -- node server.js
# Generates a flame graph of heap allocations

For related Node.js issues, see Fix: Node.js Unhandled Rejection Crash and Fix: Linux OOM Killer.