I Built My First npm package to Fix Duplicate API Calls and Control Concurrency
Introduction
While working on real applications, I kept running into a pattern that didn’t feel right.
Different parts of the UI would trigger the same API call at the same time. The backend would process identical requests multiple times. Everything worked, but it was inefficient and avoidable.
At the same time, I also needed a structured way to manage concurrency:
limit how many tasks run at once
retry failures
handle timeouts properly
I found myself solving these problems repeatedly in different places. That’s when it made more sense to step back and build a reusable solution.
This is how dedupflow came into existence.
The Problem

Consider a simple scenario:
```typescript
await Promise.all([
  fetchUserData(1),
  fetchUserData(1),
  fetchUserData(1)
]);
```

Even though all three calls are identical, this results in:
multiple unnecessary API hits
increased backend load
wasted resources
possible race conditions
There is no native way to express: "if an identical request is already in flight, reuse it instead of starting a new one."
This gap becomes very noticeable in real systems.
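To see the waste concretely, here is a runnable version of the scenario with a counter standing in for the backend (`fetchUserData` here is a local stub, not a real API call):

```typescript
// Three identical concurrent calls: without deduplication, each one
// does its own round trip. The counter plays the role of the backend.
let hits = 0;

const fetchUserData = async (id: number): Promise<{ id: number }> => {
  hits++; // count how many times the "backend" is actually hit
  return { id };
};

async function demo(): Promise<number> {
  await Promise.all([fetchUserData(1), fetchUserData(1), fetchUserData(1)]);
  return hits;
}

demo().then((n) => console.log(`backend hits: ${n}`)); // three hits for one logical request
```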
Existing Alternatives (And Their Limitations)
Before building anything, I explored available approaches.
There are tools that handle:
caching
request deduplication (partially)
concurrency control
But the limitations were clear:
Most solutions focus on a single concern
Combining them leads to complexity
Customization is often limited
There isn’t a clean, minimal abstraction
What I wanted was something simple and predictable:
```typescript
const fn = createDedup(fetchData);
```

The Core Idea Behind dedupflow

The idea is straightforward but powerful.
In-Flight Deduplication
If a request is already executing, all subsequent calls should reuse the same promise instead of triggering new executions.
TTL-Based Caching
Once a result is available, it can be cached temporarily so repeated calls can be served instantly.
Controlled Cleanup
expired entries should be removed
failed executions should never be cached
This creates a balance between performance and correctness.
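The two mechanisms can be sketched together in a few lines. This is a simplified illustration of the idea, not dedupflow's actual implementation:

```typescript
// Simplified sketch: in-flight deduplication plus TTL caching.
type AsyncFn<A extends unknown[], R> = (...args: A) => Promise<R>;

function dedupSketch<A extends unknown[], R>(fn: AsyncFn<A, R>, ttl = 0): AsyncFn<A, R> {
  const inFlight = new Map<string, Promise<R>>();
  const cache = new Map<string, { value: R; expiresAt: number }>();

  return (...args: A): Promise<R> => {
    const key = JSON.stringify(args);

    // 1. Serve from the cache while the TTL has not expired.
    const hit = cache.get(key);
    if (hit && Date.now() < hit.expiresAt) return Promise.resolve(hit.value);

    // 2. Reuse an execution that is already running.
    const running = inFlight.get(key);
    if (running) return running;

    // 3. Otherwise start a new execution and track it.
    const p = fn(...args)
      .then((value) => {
        if (ttl > 0) cache.set(key, { value, expiresAt: Date.now() + ttl });
        return value;
      })
      .finally(() => inFlight.delete(key)); // failures are never cached

    inFlight.set(key, p);
    return p;
  };
}
```

The cleanup in `finally` is what enforces "failed executions are never cached": a rejected promise is removed from the in-flight map and nothing is written to the cache.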
Basic Usage
```typescript
import { createDedup } from "dedupflow";

const fetchUser = async (id: number) => {
  console.log("Fetching user...");
  const res = await fetch(`/api/user/${id}`);
  return res.json();
};

const dedupedFetch = createDedup(fetchUser, {
  ttl: 5000
});

await Promise.all([
  dedupedFetch(1),
  dedupedFetch(1),
  dedupedFetch(1)
]);
```

Output:

```
Fetching user...
```

Only one request is executed, and all callers receive the same result.
Advanced Features
Custom Cache Key
```typescript
createDedup(fetchUser, {
  key: (id) => `user:${id}`
});
```

Force Fresh Execution

```typescript
dedupedFetch.force(1);
```

Bypass Deduplication

```typescript
dedupedFetch.raw(1);
```

Clear Cache

```typescript
dedupedFetch.clearCache();
```

Concurrency Control (Pool System)

While working on deduplication, another problem became clear.
What happens when you have a large number of async tasks and no control over how many run at once?
Running everything in parallel is not always efficient or safe.
This is where the pool system comes in.
Pool Usage
```typescript
import { createPool } from "dedupflow";

const pool = createPool({
  limit: 3,
  retries: 2,
  retryDelay: 500,
  timeout: 2000
});

const tasks = Array.from({ length: 10 }, (_, i) => async () => {
  console.log("Running task", i);
  return i;
});

const results = await pool.run(tasks);
```

What the Pool Solves
limits concurrency
ensures FIFO execution
retries failed tasks
applies per-task timeouts
supports cancellation
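The concurrency-limiting part can be sketched with a handful of workers pulling tasks from a shared queue. This is illustrative only; dedupflow's actual pool also implements retries, timeouts, and cancellation:

```typescript
// Minimal sketch of a concurrency-limited pool: at most `limit` tasks
// run at once, and tasks are started in FIFO order.
async function runPool<R>(tasks: Array<() => Promise<R>>, limit: number): Promise<R[]> {
  const results: R[] = new Array(tasks.length);
  let next = 0; // index of the next task to start (FIFO)

  // Each worker repeatedly claims the next unstarted task until none remain.
  const worker = async (): Promise<void> => {
    while (next < tasks.length) {
      const i = next++;
      results[i] = await tasks[i]();
    }
  };

  // Spawn `limit` workers (or fewer, if there are fewer tasks).
  await Promise.all(Array.from({ length: Math.min(limit, tasks.length) }, worker));
  return results;
}
```

Because each worker only starts a new task after finishing its current one, no more than `limit` tasks are ever in flight, and results land at their original indices.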
Example: Retry Logic
```typescript
const unstableTask = async () => {
  if (Math.random() < 0.7) {
    throw new Error("Random failure");
  }
  return "Success";
};

const pool = createPool({
  retries: 3,
  retryDelay: 1000
});

await pool.run([unstableTask]);
```

Technical Challenges
Building this package surfaced several important implementation challenges.
Managing In-Flight State
Tracking active promises safely required careful handling to avoid leaks and ensure cleanup.
```typescript
const inFlight = new Map<string, Promise<any>>();
```

Cache Expiry (TTL)
Each cached entry needed an expiry mechanism along with periodic cleanup.
```typescript
if (now >= entry.expiresAt) {
  cache.delete(key);
}
```

FIFO Cache Eviction
To prevent uncontrolled growth, older entries must be removed first.
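This works because a JavaScript `Map` iterates its keys in insertion order, so the first key returned by `cache.keys()` is always the oldest entry. A small demonstration (`maxSize` is a hypothetical limit for this example):

```typescript
// A Map iterates keys in insertion order, so deleting
// `cache.keys().next().value` always evicts the oldest entry.
const cache = new Map<string, number>();
const maxSize = 2; // hypothetical size limit for the example

function put(key: string, value: number): void {
  if (cache.size >= maxSize) {
    cache.delete(cache.keys().next().value!); // FIFO eviction
  }
  cache.set(key, value);
}

put("a", 1);
put("b", 2);
put("c", 3); // "a" is evicted first
```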
```typescript
cache.delete(cache.keys().next().value);
```

Retry and Timeout Coordination
Each retry attempt needed:
its own timeout window
proper delay handling
awareness of overall pool state
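A simplified sketch of how retries and per-attempt timeouts can be coordinated. Names like `withTimeout` and `runWithRetry` are invented for this illustration, and dedupflow's pool additionally tracks overall pool state:

```typescript
const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

// Reject if the promise does not settle within `ms` milliseconds.
function withTimeout<R>(p: Promise<R>, ms: number): Promise<R> {
  return new Promise((resolve, reject) => {
    const t = setTimeout(() => reject(new Error("timeout")), ms);
    p.then(
      (v) => { clearTimeout(t); resolve(v); },
      (e) => { clearTimeout(t); reject(e); }
    );
  });
}

async function runWithRetry<R>(
  task: () => Promise<R>,
  retries: number,
  retryDelay: number,
  timeout: number
): Promise<R> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await withTimeout(task(), timeout); // fresh timeout window per attempt
    } catch (err) {
      if (attempt >= retries) throw err; // out of attempts: surface the failure
      await sleep(retryDelay); // proper delay before the next attempt
    }
  }
}
```

The key point is that the timeout is re-armed on every attempt, so a retry never inherits the leftover window of a previous one.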
Error Handling Strategy
Two key decisions shaped the behavior:
failures are not cached
the pool stops scheduling new tasks after the first failure
This keeps the system predictable and avoids hidden issues.
When NOT to Use dedupflow
While dedupflow helps simplify async control and improves efficiency, it is not meant for every use case. Using it incorrectly can lead to unexpected behavior.
Understanding these limitations is important for using it correctly in real systems.
Non-Idempotent Operations (Mutations)
If a function performs operations like creating, updating, or deleting data, deduplication can break expected behavior.
```typescript
const createOrder = createDedup(async (data) => {
  return db.insert(data);
});

await Promise.all([
  createOrder({ item: "A" }),
  createOrder({ item: "A" })
]);
```

In this case, both calls share the same key, so only one execution happens.
This can lead to:
missing operations
incorrect system state
dedupflow should not be used where each call must execute independently.
Functions with Side Effects
If a function:
writes to a database
sends emails
triggers external systems
deduplication can suppress necessary executions.
Even if inputs are identical, side effects may be expected to occur multiple times.
Non-Deterministic Functions
If a function produces different results for the same input, caching can return misleading results.
```typescript
const getRandom = createDedup(async () => {
  return Math.random();
});
```

With caching enabled, multiple calls may return the same value, which is not the expected behavior.
This applies to:
random values
time-based outputs
frequently changing external data
Rapidly Changing Data
If the underlying data changes frequently, TTL-based caching can serve stale results.
```typescript
createDedup(fetchLiveData, { ttl: 5000 });
```

Even a short cache duration may introduce inconsistency in such cases.
Complex or Non-Serializable Arguments
By default, dedupflow generates keys using JSON.stringify.
This can fail or behave incorrectly for:
circular objects
functions
complex nested structures
For such cases, a custom key function should be used:

```typescript
createDedup(fn, {
  key: customKeyFunction
});
```
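For example, a custom key can be derived from a stable, serializable field while ignoring non-serializable ones. `Query` and `keyFor` are hypothetical names for this illustration:

```typescript
interface Query {
  userId: number;
  onProgress?: () => void; // functions are not JSON-serializable
}

// Build the cache key from the stable part of the argument only.
const keyFor = (q: Query): string => `user:${q.userId}`;

// With createDedup, this would be passed as the `key` option:
//   createDedup(fetchData, { key: keyFor });
keyFor({ userId: 7, onProgress: () => {} }); // "user:7"
```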
Unbounded Cache Growth
If neither ttl nor maxSize is configured, the cache can grow indefinitely.
This may lead to increased memory usage over time.
Setting limits ensures predictable behavior.
Strict Execution Requirements
dedupflow optimizes execution by:
sharing in-flight promises
caching results
If your system requires every call to execute independently, this behavior is not suitable.
Pool: Early Failure Behavior
In the pool system:
execution stops scheduling new tasks after the first failure
the overall promise rejects immediately
remaining queued tasks are not executed
This may not be suitable if you need all tasks to run regardless of failures.
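If you do need every task to run regardless of individual failures, the standard building block is `Promise.allSettled`, which never rejects early. This is plain JavaScript, not a dedupflow feature:

```typescript
const tasks = [
  async () => "ok",
  async () => { throw new Error("boom"); },
  async () => "also ok",
];

// allSettled waits for every task and reports each outcome,
// instead of rejecting on the first failure.
const settle = () =>
  Promise.allSettled(tasks.map((t) => t())).then((results) =>
    results.map((r) => (r.status === "fulfilled" ? r.value : "failed"))
  );

settle().then((out) => console.log(out)); // all three tasks ran
```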
Pool: Dependent or Sequential Tasks
The pool runs tasks concurrently.
If tasks:
depend on each other
must execute in strict sequence
modify shared state
this approach can lead to incorrect results.
Summary
dedupflow works best for:
idempotent operations
data fetching
controlled async execution
It is not suitable for:
mutation-heavy workflows
side-effect-driven logic
unpredictable or rapidly changing outputs
Using it in the right context makes it effective and reliable.
What This Project Taught Me
Building this package was less about code and more about understanding systems.
It improved how I think about:
async behavior under load
shared state management
designing reusable abstractions
handling edge cases early
It also highlighted how small utilities can have a meaningful impact when used correctly.
Final Thoughts
This project started as a need to remove inefficiencies from real applications.
It turned into a deeper exploration of how asynchronous systems behave and how to control them effectively.
If you are dealing with:
duplicate API calls
uncontrolled parallel execution
repeated async patterns
a structured approach like this can make a noticeable difference.
Try It Out
```shell
npm install dedupflow
```

Closing
This is an evolving project, and there is still more to refine.
If you have suggestions, feedback, or improvements, they are always welcome.
