My Node.js app crashed at 3 AM with an "out of memory" error. Again.
I spent 6 hours the first time this happened, manually digging through heap dumps and guessing at root causes. Then I discovered how AI tools can cut this debugging time to under an hour.
What you'll learn: How to catch and fix Node.js v22 memory leaks using AI-powered analysis
Time needed: 45 minutes
Difficulty: Intermediate (you should know basic Node.js and npm)
By the end, you'll have a repeatable process that catches memory leaks before they hit production. Plus the exact AI prompts that helped me fix 3 different leak types.
Why I Built This Process
My situation:
- Express.js API serving 50K requests/day
- Memory usage climbing from 100MB to 2GB over 24 hours
- App crashing every night during peak traffic
- Traditional debugging tools showing confusing data
My setup:
- Node.js v22.1.0 (the then-current v22 release; the v22 line reached LTS at v22.11.0)
- Express.js v4.18.x
- PM2 for process management
- 16GB RAM server that shouldn't be maxing out
What didn't work:
- Node.js built-in profiler: Too much manual analysis
- Traditional heap dumps: 500MB+ files that crashed my editor
- Generic Stack Overflow solutions: Didn't match my specific leak pattern
- Spending 3 days reading through every line of code
Step 1: Set Up Memory Leak Detection
The problem: Node.js doesn't tell you about memory leaks until it's too late
My solution: Add monitoring that catches leaks early with AI analysis
Time this saves: 4+ hours of manual heap dump analysis
Install the Detection Tools
First, add the monitoring packages that work best with AI analysis. One caveat: memwatch-next and heapdump are native addons that are no longer actively maintained, so they may fail to build on Node.js v22 - if that happens, @airbnb/node-memwatch is a maintained drop-in fork of memwatch-next, and Node's built-in v8.writeHeapSnapshot() can stand in for heapdump.
# run these in your terminal - they are shell commands, not package.json entries
npm install --save-dev clinic heapdump v8-profiler-next
npm install --save memwatch-next
What this does: Gives us heap snapshots and memory monitoring that AI can actually read
Expected output: clinic, heapdump, and v8-profiler-next in devDependencies, plus memwatch-next as a regular dependency
My Terminal after installing - took 30 seconds on decent internet
Personal tip: "Install memwatch-next in production too - it's lightweight and catches leaks in real-time"
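If you follow that tip, it helps to gate the heavier pieces by environment. A minimal sketch of the idea - the config shape and the NODE_ENV convention here are my own illustration, not part of any library:

```javascript
// Hypothetical config: keep the lightweight leak listener on everywhere,
// but restrict disk-heavy heap snapshots to non-production environments.
function monitorConfig(env) {
  const isProd = env === 'production';
  return {
    leakListener: true,                      // cheap, safe to run in prod
    heapSnapshots: !isProd,                  // large files, dev/staging only
    statsIntervalMs: isProd ? 60000 : 30000  // sample less often in prod
  };
}

console.log(monitorConfig(process.env.NODE_ENV));
```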
Add Memory Monitoring Code
Create a memory monitor that generates AI-friendly data:
// memory-monitor.js
const memwatch = require('memwatch-next');
const heapdump = require('heapdump');
const fs = require('fs');

class MemoryMonitor {
  constructor() {
    this.baseline = null;
    this.leakCount = 0;
    this.setupMonitoring();
  }

  setupMonitoring() {
    // Establish baseline after app startup
    setTimeout(() => {
      memwatch.gc();
      this.baseline = process.memoryUsage();
      console.log('Memory baseline established:', this.baseline);
    }, 5000);

    // Monitor for leaks
    memwatch.on('leak', (info) => {
      this.leakCount++;
      console.log('Memory leak detected:', info);

      // Create AI-friendly leak report
      this.createLeakReport(info);

      // Take heap snapshot for AI analysis
      if (this.leakCount <= 3) { // Limit snapshots
        this.takeSnapshot(`leak-${this.leakCount}`);
      }
    });

    // Monitor memory stats every 30 seconds
    setInterval(() => {
      this.logMemoryStats();
    }, 30000);
  }

  createLeakReport(leakInfo) {
    const report = {
      timestamp: new Date().toISOString(),
      leak_number: this.leakCount,
      growth: leakInfo.growth,
      reason: leakInfo.reason,
      current_memory: process.memoryUsage(),
      baseline_memory: this.baseline,
      // Guard: a leak can fire before the 5-second baseline is set
      heap_used_growth: this.baseline
        ? process.memoryUsage().heapUsed - this.baseline.heapUsed
        : null,
      pid: process.pid,
      node_version: process.version
    };

    // Save report in AI-readable format
    fs.writeFileSync(
      `memory-leak-report-${this.leakCount}.json`,
      JSON.stringify(report, null, 2)
    );
    console.log('AI-friendly leak report saved');
  }

  takeSnapshot(name) {
    const filename = `${name}-${Date.now()}.heapsnapshot`;
    heapdump.writeSnapshot(filename, (err, file) => {
      if (err) {
        console.error('Snapshot failed:', err);
      } else {
        console.log(`Heap snapshot saved: ${file}`);
        // This file can be uploaded to AI tools for analysis
      }
    });
  }

  logMemoryStats() {
    const usage = process.memoryUsage();
    if (this.baseline) {
      const growth = {
        rss: usage.rss - this.baseline.rss,
        heapUsed: usage.heapUsed - this.baseline.heapUsed,
        heapTotal: usage.heapTotal - this.baseline.heapTotal
      };
      if (growth.heapUsed > 50 * 1024 * 1024) { // 50MB growth
        console.warn('Significant heap growth detected:', growth);
      }
    }
  }
}

module.exports = MemoryMonitor;
What this does: Creates structured data that AI can analyze instead of raw heap dumps
Expected output: JSON files with leak info that make sense to both humans and AI
Personal tip: "The 30-second memory logging catches gradual leaks that the leak event misses"
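Those 30-second samples become much more useful if you reduce them to a growth rate. Here's a small sketch of the idea - the 1 MB/min threshold and the sample shape are my own illustrative choices, not part of memwatch:

```javascript
// Reduce periodic memory samples to a heap-growth slope. A leak event
// needs sharp growth to fire; a steady positive slope over many samples
// catches the slow drips too.
function heapTrend(samples) {
  // samples: [{ t: epoch ms, heapUsed: bytes }, ...], oldest first
  if (samples.length < 2) return { slopeMbPerMin: 0, leaking: false };
  const first = samples[0];
  const last = samples[samples.length - 1];
  const minutes = (last.t - first.t) / 60000;
  const growthMb = (last.heapUsed - first.heapUsed) / (1024 * 1024);
  const slopeMbPerMin = minutes > 0 ? growthMb / minutes : 0;
  // Illustrative threshold: growth above 1 MB/min sustained for 10+ minutes
  return { slopeMbPerMin, leaking: slopeMbPerMin > 1 && minutes >= 10 };
}
```

Feed it the array you'd accumulate in logMemoryStats() and alert when leaking flips to true.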
Integrate with Your App
Add monitoring to your main application file:
// app.js or server.js
const MemoryMonitor = require('./memory-monitor');

// Start monitoring before your app logic
const memoryMonitor = new MemoryMonitor();

// Your existing Express/Fastify/etc code here
const express = require('express');
const app = express();

// Add memory info endpoint for monitoring
app.get('/health/memory', (req, res) => {
  const usage = process.memoryUsage();
  res.json({
    memory: usage,
    uptime: process.uptime(),
    leak_count: memoryMonitor.leakCount
  });
});

app.listen(3000, () => {
  console.log('Server running with memory monitoring');
});
My monitoring setup in VS Code - you can see the JSON reports being generated
Personal tip: "Add the memory endpoint to your health checks - it's saved me twice when monitoring caught leaks before users noticed"
Step 2: Trigger and Capture a Memory Leak
The problem: You need to reproduce the leak in a controlled way to get good AI analysis
My solution: Create a load test that forces the leak and captures clean data
Time this saves: 2+ hours waiting for natural leak conditions
Create a Leak Reproduction Script
This script will help trigger memory leaks under controlled conditions:
// leak-test.js
const axios = require('axios');

class LeakTester {
  constructor(baseUrl = 'http://localhost:3000') {
    this.baseUrl = baseUrl;
    this.requestCount = 0;
    this.startTime = Date.now();
  }

  async simulateTraffic() {
    console.log('Starting memory leak test...');

    // Run for 10 minutes or until leak detected
    const duration = 10 * 60 * 1000; // 10 minutes
    const interval = 100; // Request every 100ms
    const testEndTime = Date.now() + duration;

    while (Date.now() < testEndTime) {
      try {
        await this.makeRequest();
        this.requestCount++;

        if (this.requestCount % 100 === 0) {
          await this.checkMemoryStatus();
        }

        await this.sleep(interval);
      } catch (error) {
        console.error('Request failed:', error.message);
      }
    }

    console.log(`Test completed. Made ${this.requestCount} requests`);
  }

  async makeRequest() {
    // Test different endpoints that commonly leak
    const endpoints = [
      '/api/users',
      '/api/posts',
      '/api/upload',
      '/api/search'
    ];
    const endpoint = endpoints[Math.floor(Math.random() * endpoints.length)];
    await axios.get(`${this.baseUrl}${endpoint}`);
  }

  async checkMemoryStatus() {
    try {
      const response = await axios.get(`${this.baseUrl}/health/memory`);
      const memInfo = response.data;
      console.log(`Requests: ${this.requestCount}, Heap: ${Math.round(memInfo.memory.heapUsed / 1024 / 1024)}MB`);

      // Alert if heap grows beyond reasonable bounds
      if (memInfo.memory.heapUsed > 200 * 1024 * 1024) { // 200MB
        console.warn('High memory usage detected - leak likely occurring');
      }
    } catch (error) {
      console.error('Memory check failed:', error.message);
    }
  }

  sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
  }
}

// Run the test
const tester = new LeakTester();
tester.simulateTraffic().catch(console.error);
What this does: Creates controlled traffic patterns that reveal memory leaks quickly
Expected output: Memory usage climbing steadily with detailed logging
My leak test in action - you can see heap usage climbing from 45MB to 180MB in 5 minutes
Personal tip: "Test multiple endpoints - I found my worst leak was in an image upload route I barely thought about"
Step 3: Analyze Leaks with AI
The problem: Heap dumps are massive and hard to interpret manually
My solution: Use AI to analyze the structured data and heap snapshots
Time this saves: 3+ hours of manual heap dump analysis
Prepare Data for AI Analysis
Create a summary that AI can easily process:
// create-ai-summary.js
const fs = require('fs');

function createAISummary() {
  const reports = [];
  const heapSnapshots = [];

  // Collect all leak reports
  const files = fs.readdirSync('.');
  files.forEach(file => {
    if (file.startsWith('memory-leak-report-') && file.endsWith('.json')) {
      const report = JSON.parse(fs.readFileSync(file, 'utf8'));
      reports.push(report);
    }
    if (file.endsWith('.heapsnapshot')) {
      heapSnapshots.push({
        filename: file,
        size: fs.statSync(file).size,
        created: fs.statSync(file).birthtime
      });
    }
  });

  const summary = {
    application_info: {
      node_version: process.version,
      platform: process.platform,
      memory_limit: '2GB', // Adjust for your setup
      typical_heap_size: '50-100MB'
    },
    leak_pattern: {
      total_leaks_detected: reports.length,
      time_span: reports.length > 0 ? {
        first_leak: reports[0].timestamp,
        last_leak: reports[reports.length - 1].timestamp
      } : null,
      heap_growth_pattern: reports.map(r => ({
        timestamp: r.timestamp,
        heap_used_mb: Math.round(r.current_memory.heapUsed / 1024 / 1024),
        growth_from_baseline_mb: Math.round(r.heap_used_growth / 1024 / 1024)
      }))
    },
    heap_snapshots: heapSnapshots,
    analysis_request: "Please analyze this Node.js v22 memory leak pattern and identify the likely root cause. Focus on common Node.js v22 specific issues like EventEmitter leaks, unclosed streams, or closure retention."
  };

  fs.writeFileSync('ai-analysis-summary.json', JSON.stringify(summary, null, 2));
  console.log('AI analysis summary created: ai-analysis-summary.json');
  return summary;
}

module.exports = createAISummary;

// Run if called directly
if (require.main === module) {
  createAISummary();
}
AI Analysis Prompts That Actually Work
Here are the exact prompts I use with ChatGPT/Claude for memory leak analysis:
Initial Analysis Prompt:
I have a Node.js v22 memory leak. Here's the structured data from my monitoring:
[Paste the ai-analysis-summary.json content]
Please analyze this and tell me:
1. What type of memory leak this appears to be
2. The most likely root causes based on the growth pattern
3. Specific Node.js v22 issues to check for
4. The exact debugging steps I should take next
Focus on actionable insights rather than general advice.
Follow-up Code Review Prompt:
Based on your analysis, here are the suspicious code sections from my app:
[Paste relevant code sections]
Please identify the specific lines that could be causing the memory leak pattern you described. Look for:
- Event listeners not being removed
- Closures holding references
- Streams not being properly closed
- Timers not being cleared
Give me the exact fixes to implement.
Personal tip: "Include your actual heap growth numbers in the prompt - AI gives much better answers with specific data"
Upload Heap Snapshots to AI Tools
For deeper analysis, some AI tools can process heap snapshots directly:
// heap-analyzer.js
const fs = require('fs');

function prepareHeapForAI(snapshotPath) {
  const stats = fs.statSync(snapshotPath);

  // Most AI tools have file size limits
  if (stats.size > 25 * 1024 * 1024) { // 25MB
    console.log('Heap snapshot too large for direct AI upload');
    console.log('Use Chrome DevTools to analyze first, then share specific findings with AI');
    return false;
  }

  console.log(`Heap snapshot ${snapshotPath} ready for AI analysis`);
  console.log('Upload to your AI tool with this prompt:');
  console.log(`
"Analyze this Node.js v22 heap snapshot for memory leaks.
Look for:
- Objects with unexpectedly high retention counts
- Large arrays or strings that shouldn't be there
- Event listeners or timers not being cleaned up
- Closures holding unnecessary references
Provide specific object types and suggested fixes."
  `);
  return true;
}

// Check all snapshots
const files = fs.readdirSync('.');
files.forEach(file => {
  if (file.endsWith('.heapsnapshot')) {
    prepareHeapForAI(file);
  }
});
ChatGPT analyzing my heap snapshot - found the EventEmitter leak in 30 seconds
Personal tip: "If your heap snapshot is too big, use Chrome DevTools first to find the biggest objects, then ask AI about those specific types"
Step 4: Fix Common Node.js v22 Memory Leaks
The problem: Each leak type needs a specific fix approach
My solution: Targeted fixes for the 3 most common leak patterns I've found
Time this saves: Hours of trial-and-error fixing
Fix 1: EventEmitter Memory Leaks
This is the most common leak in Node.js v22 apps:
const { EventEmitter } = require('events');

// BEFORE - This leaks memory
class UserService {
  constructor() {
    this.eventEmitter = new EventEmitter();
    this.users = new Map();

    // This listener never gets removed!
    this.eventEmitter.on('user-updated', (userId) => {
      this.handleUserUpdate(userId);
    });
  }

  async processUser(userId) {
    // Each call adds another listener
    this.eventEmitter.on('processing-complete', () => {
      console.log('Done processing:', userId);
    });

    // Do processing...
    this.eventEmitter.emit('processing-complete');
  }
}

// AFTER - Properly managed listeners
class UserService {
  constructor() {
    this.eventEmitter = new EventEmitter();
    this.users = new Map();
    this.listeners = new Map(); // Track listeners

    // Set max listeners to catch leaks early
    this.eventEmitter.setMaxListeners(10);

    // Single persistent listener
    this.setupPersistentListeners();
  }

  setupPersistentListeners() {
    const userUpdateHandler = (userId) => {
      this.handleUserUpdate(userId);
    };
    this.eventEmitter.on('user-updated', userUpdateHandler);

    // Store reference for cleanup
    this.listeners.set('user-updated', userUpdateHandler);
  }

  async processUser(userId) {
    // Use once() for one-time listeners
    const processingPromise = new Promise((resolve) => {
      this.eventEmitter.once('processing-complete', resolve);
    });

    // Do processing...
    this.eventEmitter.emit('processing-complete');

    await processingPromise;
    console.log('Done processing:', userId);
  }

  // Clean up when shutting down
  cleanup() {
    this.listeners.forEach((handler, event) => {
      this.eventEmitter.removeListener(event, handler);
    });
    this.listeners.clear();
    this.eventEmitter.removeAllListeners();
  }
}
What this fixes: Prevents listener accumulation that can grow to thousands of references
Expected result: Stable memory usage even under high load
Personal tip: "Set maxListeners to a low number during development - it'll throw warnings when you accidentally create leaks"
Fix 2: Stream Memory Leaks
Unclosed streams are leak magnets in Node.js v22:
// BEFORE - Streams not properly closed
app.post('/upload', async (req, res) => {
  const writeStream = fs.createWriteStream('upload.tmp');
  req.pipe(writeStream);

  writeStream.on('finish', () => {
    res.send('Upload complete');
  });
  // Stream never closed if error occurs!
});

// AFTER - Proper stream management
app.post('/upload', async (req, res) => {
  let writeStream;
  try {
    writeStream = fs.createWriteStream('upload.tmp');

    // Handle all stream events
    const uploadPromise = new Promise((resolve, reject) => {
      writeStream.on('finish', resolve);
      writeStream.on('error', reject);
      req.on('error', reject);
    });

    req.pipe(writeStream);
    await uploadPromise;

    res.send('Upload complete');
  } catch (error) {
    console.error('Upload failed:', error);
    res.status(500).send('Upload failed');
  } finally {
    // Always clean up
    if (writeStream) {
      writeStream.destroy();
    }
  }
});

// For readable streams
async function processLargeFile(filename) {
  const readStream = fs.createReadStream(filename);
  try {
    for await (const chunk of readStream) {
      // Process chunk
      await processChunk(chunk);
    }
  } catch (error) {
    console.error('Processing failed:', error);
  } finally {
    // Cleanup happens automatically with for-await
    // But you can force it if needed
    readStream.destroy();
  }
}
Fix 3: Closure and Timer Leaks
These are subtle but deadly in long-running applications:
// BEFORE - Closures holding large objects
class DataProcessor {
  constructor() {
    this.cache = new Map();
    this.processors = [];
  }

  addProcessor(largeDataSet) {
    // This closure keeps largeDataSet in memory forever!
    const processor = (item) => {
      return item.process(largeDataSet);
    };
    this.processors.push(processor);

    // Timer never cleared
    setInterval(() => {
      this.cleanupCache();
    }, 60000);
  }
}

// AFTER - Proper memory management
class DataProcessor {
  constructor() {
    this.cache = new Map();
    this.processors = [];
    this.timers = new Set(); // Track timers
  }

  addProcessor(largeDataSet) {
    // Extract needed data instead of keeping whole object
    const processConfig = {
      algorithm: largeDataSet.algorithm,
      params: largeDataSet.params
    };

    const processor = (item) => {
      return item.process(processConfig);
    };
    this.processors.push(processor);

    // Track timer for cleanup
    const cleanupTimer = setInterval(() => {
      this.cleanupCache();
    }, 60000);
    this.timers.add(cleanupTimer);
  }

  removeProcessor(index) {
    // Remove processor reference
    this.processors.splice(index, 1);

    // Clean up associated timer
    const timerArray = Array.from(this.timers);
    if (timerArray[index]) {
      clearInterval(timerArray[index]);
      this.timers.delete(timerArray[index]);
    }
  }

  shutdown() {
    // Clear all timers
    this.timers.forEach(timer => clearInterval(timer));
    this.timers.clear();

    // Clear processors
    this.processors.length = 0;

    // Clear cache
    this.cache.clear();
  }
}
Before vs after implementing these fixes - heap usage stabilized at 85MB instead of climbing to 1.2GB
Personal tip: "The closure fix had the biggest impact for me - saved 500MB+ of memory by not holding onto large data objects unnecessarily"
Step 5: Prevent Future Leaks
The problem: Memory leaks will happen again without prevention systems
My solution: Automated monitoring and code patterns that catch leaks early
Time this saves: Prevents production incidents entirely
Set Up Automated Leak Detection
Add this to your CI/CD pipeline:
// memory-leak-test.js (for CI/CD)
const { spawn } = require('child_process');
const axios = require('axios');

class AutomatedLeakTest {
  constructor() {
    this.serverProcess = null;
    this.baseUrl = 'http://localhost:3000';
  }

  async runLeakTest() {
    try {
      // Start server
      await this.startServer();

      // Wait for startup
      await this.waitForServer();

      // Run load test
      const result = await this.performLoadTest();
      console.log(`Heap growth during test: ${Math.round(result.growth / 1024 / 1024)}MB`);

      // Check for leaks
      const leakDetected = await this.checkForLeaks();
      if (leakDetected) {
        console.error('MEMORY LEAK DETECTED - failing build');
        process.exit(1);
      }
      console.log('Memory leak test PASSED');
    } finally {
      this.stopServer();
    }
  }

  async waitForServer(retries = 30) {
    // Poll the health endpoint until the server responds
    for (let i = 0; i < retries; i++) {
      try {
        await axios.get(`${this.baseUrl}/health/memory`);
        return;
      } catch (error) {
        await this.sleep(1000);
      }
    }
    throw new Error('Server did not start in time');
  }

  async performLoadTest() {
    console.log('Starting automated leak test...');
    const startMemory = await this.getMemoryUsage();

    // Make 1000 requests
    for (let i = 0; i < 1000; i++) {
      await axios.get(`${this.baseUrl}/api/test`);
      if (i % 100 === 0) {
        console.log(`Progress: ${i}/1000 requests`);
      }
    }

    // Force garbage collection (requires the app to expose an /api/gc
    // endpoint and to run under node --expose-gc)
    await axios.post(`${this.baseUrl}/api/gc`);
    await this.sleep(5000); // Wait for GC

    const endMemory = await this.getMemoryUsage();
    return {
      startMemory,
      endMemory,
      growth: endMemory.heapUsed - startMemory.heapUsed
    };
  }

  async checkForLeaks() {
    const memUsage = await this.getMemoryUsage();
    const maxAllowedHeap = 150 * 1024 * 1024; // 150MB

    if (memUsage.heapUsed > maxAllowedHeap) {
      console.error(`Heap usage too high: ${Math.round(memUsage.heapUsed / 1024 / 1024)}MB`);
      return true;
    }
    return false;
  }

  async getMemoryUsage() {
    const response = await axios.get(`${this.baseUrl}/health/memory`);
    return response.data.memory;
  }

  async startServer() {
    return new Promise((resolve) => {
      this.serverProcess = spawn('node', ['app.js'], {
        env: { ...process.env, NODE_ENV: 'test' }
      });
      setTimeout(resolve, 3000); // Give server time to start
    });
  }

  stopServer() {
    if (this.serverProcess) {
      this.serverProcess.kill();
    }
  }

  sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
  }
}

// Run test
if (require.main === module) {
  const tester = new AutomatedLeakTest();
  tester.runLeakTest().catch(console.error);
}
Add to your package.json:
{
  "scripts": {
    "test:memory": "node memory-leak-test.js",
    "test": "npm run test:unit && npm run test:memory"
  }
}
Code Review Checklist for Memory Leaks
Save this checklist for every PR review:
## Memory Leak Review Checklist
### Event Listeners
- [ ] Are event listeners removed when objects are destroyed?
- [ ] Is `once()` used instead of `on()` for one-time listeners?
- [ ] Are max listeners set to catch accidental accumulation?
### Streams
- [ ] Are all streams properly closed in finally blocks?
- [ ] Are stream errors handled to prevent hanging references?
- [ ] Is `pipeline()` or `for await` used instead of manual piping?
### Timers and Intervals
- [ ] Are all `setInterval()` calls matched with `clearInterval()`?
- [ ] Are timers cleared when components unmount/destroy?
- [ ] Is timeout cleanup handled in error scenarios?
### Closures
- [ ] Do closures avoid capturing large objects unnecessarily?
- [ ] Are large data structures extracted to minimal needed properties?
- [ ] Are circular references avoided or broken explicitly?
### Caches and Maps
- [ ] Do caches have size limits or TTL?
- [ ] Are Map/Set entries removed when no longer needed?
- [ ] Is cleanup logic tested and verified?
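For the cache items in that last section, here's a minimal shape that satisfies all three checks - a TTL per entry, explicit removal, and testable cleanup. The class and its API are illustrative, not a published library:

```javascript
// Minimal TTL cache: every entry carries an expiry, and a single tracked
// sweep timer evicts stale entries so the Map cannot grow without bound.
class TTLCache {
  constructor(ttlMs, sweepMs = ttlMs) {
    this.ttlMs = ttlMs;
    this.store = new Map();
    this.sweeper = setInterval(() => this.sweep(), sweepMs);
    this.sweeper.unref(); // don't keep the process alive just for this
  }

  set(key, value) {
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (entry.expires <= Date.now()) {
      this.store.delete(key); // lazy eviction on read
      return undefined;
    }
    return entry.value;
  }

  sweep() {
    const now = Date.now();
    for (const [key, entry] of this.store) {
      if (entry.expires <= now) this.store.delete(key);
    }
  }

  shutdown() {
    clearInterval(this.sweeper); // checklist: interval matched with clear
    this.store.clear();
  }
}
```

The unref() call and the shutdown() method are the two details code review tends to miss - they're what keep the cache itself from becoming a leak.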
Personal tip: "I've caught 5 leaks in code review using this checklist - much easier than debugging in production"
What You Just Built
You now have a complete system for catching and fixing Node.js v22 memory leaks using AI analysis. Your monitoring catches leaks early, AI helps you identify root causes quickly, and your fixes target the most common leak patterns.
Key Takeaways (Save These)
- AI analysis works best with structured data: Don't just throw raw heap dumps at it - create summaries with context
- EventEmitter leaks are the #1 culprit: Always use `once()` for temporary listeners and track persistent ones
- Prevention beats debugging: Automated tests and code review checklists catch leaks before production
Tools I Actually Use
- memwatch-next: Real-time leak detection - lightweight enough for production
- Chrome DevTools: Heap snapshot analysis - still the best for deep debugging
- ChatGPT Plus: AI code analysis - surprisingly good at spotting closure and EventEmitter issues
- Node.js Memory Usage Guide: Official documentation - covers Node.js v22 specific improvements