Creating a Decoupled Promise
When working with promises, you typically write something like this:
new Promise((resolve, reject) => {});
By default, the resolve and reject functions are only available inside the executor passed to the Promise constructor.
However, with the TC39 promise-with-resolvers proposal now at Stage 4, a static method has been added to the Promise constructor: withResolvers. It is part of Baseline and has been available since Chrome 119 and Node.js 22. This method makes it easy to create a decoupled promise, letting you hoist the resolve and reject functions out of the executor.
The Promise.withResolvers() static method returns an object containing a new Promise object and two functions to resolve or reject it, corresponding to the two parameters passed to the executor of the Promise() constructor.
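For example, a minimal usage sketch (the logging here is just illustrative):
const { promise, resolve, reject } = Promise.withResolvers();
// The promise can be handed to consumers immediately...
promise.then((value) => console.log("resolved with:", value));
// ...while resolve and reject can be called later, from anywhere
resolve("done");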

For legacy environments, you can easily polyfill this behavior with something like the following:
const noop = () => {};

const createPromiseWithResolvers = () => {
  let resolve = noop;
  let reject = noop;
  const promise = new Promise((res, rej) => {
    // The executor runs synchronously, so both are assigned before we return
    resolve = res;
    reject = rej;
  });
  return {
    promise,
    resolve,
    reject,
  };
};
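If you want a drop-in replacement rather than a separate helper, one option is to attach it to the Promise constructor only when the native static method is missing. This is a simplified sketch (it ignores the spec's subclassing semantics) that reuses the createPromiseWithResolvers helper above:
if (typeof Promise.withResolvers !== "function") {
  // Simplified polyfill: good enough for direct use on Promise itself
  Promise.withResolvers = createPromiseWithResolvers;
}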
This pattern is incredibly versatile, and I use it frequently in my projects. Below are a couple of practical ways to leverage decoupled promises.
Using Decoupled Promises for Sequential Control
A decoupled promise can be used as a lock, preventing the execution of a job until certain conditions are met. For example:
// Create a promise with resolvers
const lock = Promise.withResolvers();

// Use the promise as a lock
(async () => {
  try {
    await lock.promise;
    // Job executes once the lock is resolved
    doJob();
  } catch (e) {
    // Job is canceled
    console.log("Job canceled", e);
  }
})();

// Unlock the job once conditions are met
waitForElement('#content', lock.resolve);

// Or cancel the job if necessary
setTimeout(lock.reject, 2000);
Here, the job only executes after lock.resolve is called, giving you fine-grained control over when the task proceeds. This is especially useful in complex asynchronous workflows, because you don't have to pass the job's execution context up to the top level.
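The waitForElement helper used above isn't defined in this article; as a hypothetical illustration, a minimal MutationObserver-based version might look like this (no timeout or error handling):
const waitForElement = (selector, onFound) => {
  // If the element is already in the DOM, unlock immediately
  const existing = document.querySelector(selector);
  if (existing) {
    onFound(existing);
    return;
  }
  // Otherwise, watch the DOM until a matching element appears
  const observer = new MutationObserver(() => {
    const el = document.querySelector(selector);
    if (el) {
      observer.disconnect();
      onFound(el);
    }
  });
  observer.observe(document.body, { childList: true, subtree: true });
};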
Converting Asynchronous to Synchronous-Like Flow
Decoupled promises can also make asynchronous operations behave in a synchronous-looking way. This is similar to the technique React 18's Suspense uses behind the scenes:
// Use Suspense in React
<Suspense fallback='loading'>
  <Component />
</Suspense>

// A simplified version of Suspense-like behavior
const lock = Promise.withResolvers();
let data = null;
lock.promise.then((res) => (data = res));

const fallback = () => (document.body.innerHTML = "loading");

const render = () => {
  if (!data) {
    setTimeout(() => lock.resolve("data"), 500);
    throw lock.promise;
  } else {
    document.body.innerHTML = data;
  }
};

try {
  render();
} catch (e) {
  if (e instanceof Promise) {
    e.finally(render);
  }
  fallback();
}
You can view a live demo of this pattern: Converting Asynchronous to Synchronous-Like Flow using promise.
In this example, the promise is thrown on the first render, causing the fallback to display. Once the promise is resolved (via lock.resolve), render is triggered again, this time with the data ready.
By calling render twice, once while the promise is pending and again once the "lock" is resolved, the render function itself stays synchronous while the final render happens with the correct data.
Decoupled Promises in Class Initialization
Using decoupled promises can also help keep a class constructor synchronous, allowing you to defer method calls until after an asynchronous initialization process is finished.
class Executor {
  private ready = Promise.withResolvers();

  constructor() {
    // Start an asynchronous initialization process
    // (asyncInit is any promise-returning setup routine)
    asyncInit().then(this.ready.resolve).catch(this.ready.reject);
  }

  public doTask1 = async () => {
    await this.ready.promise;
    // Task 1 logic here
  };

  public doTask2 = async () => {
    await this.ready.promise;
    // Task 2 logic here
  };
}

// This step is synchronous
const executor = new Executor();

// Methods are only executed after initialization is complete
executor.doTask1();
executor.doTask2();
In this example, the class methods doTask1 and doTask2 wait for the asynchronous initialization to complete before executing, ensuring that tasks only run once the environment is fully ready.
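The asyncInit function above stands in for whatever asynchronous setup your class needs. As a hypothetical example, it could fetch remote configuration (the /api/config endpoint is made up):
const asyncInit = async () => {
  // Hypothetical setup work; any promise-returning routine fits here
  const response = await fetch("/api/config");
  if (!response.ok) {
    throw new Error(`Initialization failed: ${response.status}`);
  }
  return response.json();
};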
Using Promises as a Task Scheduler
Decoupled promises are valuable for managing complex asynchronous task queues with fine control over execution order.
Imagine we have three jobs with different priorities. Before executing each job, we need to perform an async operation to check whether the job can proceed. Only the highest-priority job that passes the check will be executed. An intuitive but inefficient way to handle this might look like this:
const [
  { status: job1Status },
  { status: job2Status },
  { status: job3Status },
] = await Promise.allSettled([
  p1JobCheck(),
  p2JobCheck(),
  p3JobCheck(),
]);

if (job3Status === "fulfilled") {
  return doJob3();
}
if (job2Status === "fulfilled") {
  return doJob2();
}
if (job1Status === "fulfilled") {
  return doJob1();
}
In this approach, we wait for all checks to complete before deciding which job to execute, which can delay execution unnecessarily. For example, if p3JobCheck finishes early, we could start doJob3 immediately instead of waiting for the other checks.
To optimize this process, we can use decoupled promises as follows:
const lock1 = Promise.withResolvers();
const lock2 = Promise.withResolvers();
const lock3 = Promise.withResolvers();

lock3.promise.then(doJob3).catch(() => {
  lock2.promise.then(doJob2).catch(() => {
    lock1.promise.then(doJob1).catch(() => {
      // All checks failed
    });
  });
});

p1JobCheck().then(lock1.resolve).catch(lock1.reject);
p2JobCheck().then(lock2.resolve).catch(lock2.reject);
p3JobCheck().then(lock3.resolve).catch(lock3.reject);
In this revised approach, we start all checks simultaneously. Execution still respects each job's priority and the outcome of its check, but the highest-priority job that passes can start as soon as its own check completes, which speeds up the overall process.
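To try the scheduler end to end, you could define hypothetical stand-ins for the checks and jobs before the snippet above; the delays and pass/fail outcomes below are arbitrary:
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Hypothetical checks: job 2's check fails, the other two pass
const p1JobCheck = () => delay(100);
const p2JobCheck = () => delay(50).then(() => { throw new Error("not allowed"); });
const p3JobCheck = () => delay(300);

const doJob1 = () => console.log("running job 1");
const doJob2 = () => console.log("running job 2");
const doJob3 = () => console.log("running job 3");

// With these stubs, only "running job 3" is logged, once its check passes after ~300ms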
Conclusion
Decoupled promises bring powerful new ways to manage asynchronous flows, giving you greater control over when and how tasks are executed. This pattern simplifies your code and enhances its flexibility.