
Advanced Concurrency: Running Things in Parallel
Abhay Vachhani
Developer
In high-performance backend development, running tasks one by one is rarely enough. Whether you're fetching data from multiple microservices, processing a batch of images, or handling a multi-stage checkout, you need to manage concurrency with precision. This guide covers the advanced patterns necessary to build resilient, parallel systems in JavaScript.
1. Beyond Promise.all: Error Resilience
We all know Promise.all(), but it has a major weakness: if one promise fails, the whole thing rejects immediately (fail-fast). In many scenarios, like processing a batch of independent uploads, we want to know about every failure without stopping the entire process.
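To make the fail-fast behavior concrete, here is a small self-contained sketch (the demoFailFast helper is illustrative, not part of any API):

```javascript
// Fail-fast in action: one rejection rejects the whole batch,
// even though the sibling promises keep running in the background.
async function demoFailFast() {
  try {
    await Promise.all([
      Promise.resolve('upload 1 ok'),
      Promise.reject(new Error('upload 2 failed')),
      Promise.resolve('upload 3 ok')
    ]);
    return 'all succeeded';
  } catch (err) {
    return `batch rejected: ${err.message}`;
  }
}
```

Calling demoFailFast() resolves to "batch rejected: upload 2 failed". The results of uploads 1 and 3 are simply lost, which is exactly the gap the tools below fill.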
- Promise.allSettled(): This is your primary tool for batch processing. It waits for every promise to finish, whether it succeeded or failed, and gives you an array of results with status and value/reason.
- Promise.any(): A "first-to-succeed" pattern. If you're requesting data from three redundant mirrors, you only care about the first one that delivers successfully.
- Promise.race(): The first-to-settle pattern. It adopts the result of whichever promise settles first, fulfilled or rejected, which is why it's the classic building block for timeouts and other speed-over-outcome tasks.
```javascript
const results = await Promise.allSettled([
  fetchFromServiceA(),
  fetchFromServiceB(),
  fetchFromServiceC()
]);

// Efficiently handle partial successes
const successes = results
  .filter(r => r.status === 'fulfilled')
  .map(r => r.value);

const failures = results
  .filter(r => r.status === 'rejected')
  .map(r => r.reason);
```
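The "first-to-succeed" pattern from the list above can be sketched like this (fetchFromFastestMirror and the mirror functions are hypothetical names):

```javascript
// First successful mirror wins; individual failures are ignored unless
// ALL mirrors fail, in which case Promise.any rejects with an AggregateError.
async function fetchFromFastestMirror(mirrors) {
  try {
    return await Promise.any(mirrors.map(fn => fn()));
  } catch (err) {
    // err is an AggregateError; err.errors lists every individual failure
    throw new Error(`All mirrors failed: ${err.errors.length} errors`);
  }
}
```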
2. Implementing Concurrency Limits
Parallelism is a double-edged sword. Running 1,000 requests simultaneously can lead to socket exhaustion, "Too Many Connections" errors in your database, or getting blocked by external APIs. You must control the throughput.
You can use a throttling pattern. Instead of firing everything at once, you process them in chunks or use a pool of "workers" that pick up tasks as they become free.
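The chunked variant is the simplest to implement. A minimal sketch, assuming each task is a function that returns a promise (the inChunks name is mine):

```javascript
// Process tasks in fixed-size chunks: each chunk runs in parallel,
// but the next chunk doesn't start until the current one finishes.
async function inChunks(tasks, size) {
  const results = [];
  for (let i = 0; i < tasks.length; i += size) {
    const chunk = tasks.slice(i, i + size);
    results.push(...(await Promise.all(chunk.map(task => task()))));
  }
  return results;
}
```

The trade-off: one slow task stalls its entire chunk. A worker pool avoids that by refilling each slot as soon as it frees up.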
```javascript
// A simple yet powerful concurrency pool
async function pool(tasks, limit) {
  const results = [];   // one promise per task, in submission order
  const executing = []; // tasks currently in flight
  for (const task of tasks) {
    // Wrap the call so a synchronous throw becomes a rejection
    const p = Promise.resolve().then(() => task());
    results.push(p);
    if (tasks.length >= limit) {
      // Free the slot when the task settles, whether it succeeded or failed
      const e = p.finally(() => executing.splice(executing.indexOf(e), 1));
      executing.push(e);
      if (executing.length >= limit) {
        // Wait for a free slot; swallow the rejection here so one failed
        // task doesn't stop the scheduler (Promise.all below still reports it)
        await Promise.race(executing).catch(() => {});
      }
    }
  }
  return Promise.all(results);
}

// Usage: run the downloads with at most 5 in flight at once
// const data = await pool(urls.map(url => () => download(url)), 5);
```
3. Clean Cancellation with AbortController
Efficiency isn't just about starting tasks; it's about stopping them. If a user cancels a request or navigates away, your server should stop doing expensive work for that request. AbortController provides a unified way to signal cancellation across the entire Node.js ecosystem.
```javascript
const controller = new AbortController();
const { signal } = controller;

// Trigger cancellation after 5 seconds automatically
const timeoutId = setTimeout(() => controller.abort(), 5000);

try {
  const data = await fetch(url, { signal });
  // Process data...
} catch (err) {
  if (err.name === 'AbortError') {
    console.warn('Request was aborted due to timeout');
  }
} finally {
  clearTimeout(timeoutId);
}
```
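AbortController also pairs naturally with Promise.any: once one redundant source succeeds, you can abort the losers so they stop consuming sockets and CPU. A sketch, assuming each source is a function that accepts and honors a signal (firstWithCleanup is an illustrative name):

```javascript
// Cancel the losers: as soon as one source delivers, abort the rest.
async function firstWithCleanup(sources) {
  const controller = new AbortController();
  try {
    // Every source shares the same signal
    return await Promise.any(sources.map(src => src(controller.signal)));
  } finally {
    controller.abort(); // a no-op for the winner, cancellation for the rest
  }
}
```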
4. Handling Async Race Conditions
Race conditions in JavaScript usually occur when asynchronous operations share and modify state. Because the execution "pauses" at every await, other tasks can sneak in and change the environment your function depends on.
The Golden Rule: Always treat the state inside an async function as "dirty" after an await. Re-validate or use locks if your logic depends on a static view of the world.
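A concrete instance of the rule: a cache that checks and then awaits invites duplicate work, because a second caller can arrive while the first is still loading and see the same empty cache. One common fix is to cache the in-flight promise itself, so concurrent callers share a single load (getUser and loadFn are illustrative names):

```javascript
// Classic check-then-act race, fixed by de-duplicating in-flight loads
const cache = new Map();   // id -> resolved value
const pending = new Map(); // id -> promise for a load still in flight

async function getUser(id, loadFn) {
  if (cache.has(id)) return cache.get(id);
  // Re-use the pending promise instead of trusting a pre-await view
  // of the cache -- only the first caller triggers the expensive load.
  if (!pending.has(id)) {
    pending.set(id, loadFn(id).then(user => {
      cache.set(id, user);
      pending.delete(id);
      return user;
    }));
  }
  return pending.get(id);
}
```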
Conclusion
Mastering advanced concurrency transforms your Node.js applications from simple scripts into resilient, professional-grade systems. By moving beyond Promise.all and adopting patterns for pooling and cancellation, you ensure your backends stay responsive even under extreme load. Don't just run things in parallel—manage them.
FAQs
What is the main advantage of Promise.allSettled?
It prevents a single failed task from interrupting an entire batch operation, allowing you to collect both results and errors for complete processing.
Does Node.js have a built-in mutex?
Node.js doesn't have a built-in mutex, but you can achieve similar results using atomic database operations or libraries like async-mutex for in-memory locks.
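For reference, the core of async-mutex's runExclusive API can be approximated with a few lines of promise chaining. A hand-rolled sketch (SimpleMutex is illustrative; prefer the library in production):

```javascript
// A minimal in-memory mutex: each caller waits for the previous
// critical section to finish before entering its own.
class SimpleMutex {
  constructor() { this._last = Promise.resolve(); }
  runExclusive(fn) {
    const run = this._last.then(() => fn());
    // Keep the chain alive even if fn throws
    this._last = run.catch(() => {});
    return run;
  }
}
```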
Why utilize AbortController?
It allows you to stop long-running or redundant operations (like HTTP requests or file reads) to save CPU, memory, and bandwidth when the result is no longer needed.