# promise-pool 1.3.0
A background task processor focused on reliability and scalability.
- [x] Configurable.
- [x] Concurrency limit.
- [x] Retries. (Use `p-retry` for this.)
- [x] Zero dependencies.
- [x] Error handling.
- [x] Singleton mode: option to auto-reset when `.done()` is called. (Added `.drain()` method.)
- [x] Task scheduling & prioritization.
- [x] Optionally `return` hints. (To support APIs like GitHub which return rate limit hints in HTTP headers, e.g. the `X-RateLimit-Remaining` header.)

`PromisePool` exposes 3 methods:
- `.add(...tasks)` - add one (or more) tasks for background processing. (A task is a function that wraps a `Promise` value, e.g. `() => Promise.resolve(1)`.)
- `.drain()` - Returns a promise that resolves when all tasks have been processed, or when another thread calls `.drain()` mid-task execution. This is important to maximize performance in shared server environments: subsequent calls to `.drain()` ensure only the last caller to `.drain()` will `await` the completion of all tasks.
- `.done()` - Optionally finalize the pool. Returns a promise that resolves when all tasks are complete. No more tasks can be added after this.
```sh
# with npm
npm install @elite-libs/promise-pool

# or using yarn
yarn add @elite-libs/promise-pool
```
```ts
import PromisePool from '@elite-libs/promise-pool';

// 1/3: Either use the default instance or create a new one.
const pool = new PromisePool();

// 2/3: Add task(s) to the pool as needed.
// PromisePool will execute them in parallel as soon as possible.
pool.add(() => saveToS3(data));
pool.add(() => expensiveBackgroundWork(data));

// 3/3: REQUIRED: to ensure your tasks are executed, you must await
// either `pool.drain()` or `pool.done()` at some point in your code
// (`done` prevents additional tasks from being added).
await pool.done();
```
In this example, we'll use a singleton pattern to ensure there is only one `PromisePool` instance per process.
```ts
// `./src/services/taskPool.ts`
import PromisePool from '@elite-libs/promise-pool';

export const taskPool = new PromisePool({
  maxWorkers: 6, // Optional. Default is `4`.
});
```
Then, from inside your app (Lambda function, Express app, etc.), you can use the `taskPool` instance to add tasks to the pool.
```ts
// Example Lambda function
// `./src/handlers/user.handler.ts`
import { taskPool } from './services/taskPool';

export const handler = async (event, context) => {
  const data = getDataFromEvent(event);
  taskPool.add(() => saveToS3(data));
  taskPool.add(() => expensiveBackgroundWork(data));
  await taskPool.drain();
  return {
    statusCode: 200,
    body: JSON.stringify({
      message: 'Success',
    }),
  };
};
```
Note the `await taskPool.drain()` call. This is required to ensure your tasks are executed.
You could also utilize `Promise Pool` in `middy` middleware:
```ts
// Important: Import a global instance of `Promise Pool`
import { taskPool } from './services/taskPool';

const taskPoolMiddleware = () => ({
  before: (request) => {
    Object.assign(request.context, { taskPool });
  },
  after: async (request) => {
    await request.context.taskPool.drain();
  },
});

export default taskPoolMiddleware;
```
Now you can use `taskPool` in your Lambda function:

```ts
request.context.taskPool.add(() => saveToS3(data));
```

The `drain()` method will then be called automatically (via the middleware's `after()` hook) every time the Lambda function returns.
```ts
// `./src/middleware/taskPool.ts`
import { taskPool } from "../services/taskPool.mjs";

export const taskPoolMiddleware = (req, res, next) => {
  req.taskPool = taskPool;
  next();
};
```
Then you can use `taskPool` (via `req.taskPool`) in your Express route handlers.
See the `/examples` folder for more examples.
- **What?** Adds Smart `.drain()` behavior.
- **Why?**
  - Speeds up concurrent server environments!
  - Prevents several types of difficult (ghost) bugs:
    - stack overflows/memory access violations,
    - exceeding rarely hit OS limits (e.g. max open handle/file/socket limits),
    - exceeding limits of network hardware (e.g. connections/sec, total open socket limit, etc.),
    - uneven observed wait times.
- **How?** Only awaits the latest `.drain()` caller.
- **Who?** `Server` + `Singleton` use cases will see a major benefit from this design.
- **Errata**
  - Here are several competing (tortured) metaphors; please suggest more & let me know which makes the most sense:
    - Like a relay race's series of baton handoffs. As soon as you hand off, your race is over & the next runner must either run until the finish (wait to the end) OR hand off the job to someone new.
    - A tiny queue, with a size of one!
    - A 'take-a-penny-leave-a-penny' tray, big enough for exactly 1 penny.
    - This is similar to MongoDB's Write Concern setting, or to Distributed SQL Replication, where you can set a minimum # of confirmed 'server writes' before the data is considered safely written in the cluster.
    - Anecdote: A group of friends had a rule that whoever was last to arrive had to pay for the first friend (who arrived on time).
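The "only the latest `.drain()` caller awaits" idea can be sketched with a generation counter. This is a conceptual illustration only, not the library's actual source; `DrainGate` and its members are names invented for this example:

```typescript
// Conceptual sketch of Smart `.drain()`: earlier callers are released as soon
// as a newer caller takes over waiting. (NOT the library's actual source.)
class DrainGate {
  private generation = 0;
  private releaseSuperseded: (() => void)[] = [];

  async drain(allWorkDone: Promise<void>): Promise<"superseded" | "complete"> {
    const myGen = ++this.generation;
    // Release every earlier caller: they are no longer responsible for waiting.
    this.releaseSuperseded.forEach((release) => release());
    this.releaseSuperseded = [];

    const superseded = new Promise<"superseded">((resolve) => {
      this.releaseSuperseded.push(() => resolve("superseded"));
    });
    const complete = allWorkDone.then(() => "complete" as const);

    // Whichever happens first: a newer drain() call, or all work finishing.
    const result = await Promise.race([superseded, complete]);
    return this.generation === myGen ? result : "superseded";
  }
}

// Demo: caller A drains, then caller B drains before the work finishes.
const work = new Promise<void>((resolve) => setTimeout(resolve, 50));
const gate = new DrainGate();
const a = gate.drain(work); // superseded by B below, resolves early
const b = gate.drain(work); // last caller: awaits full completion
Promise.all([a, b]).then(([ra, rb]) => {
  console.log(ra, rb); // superseded complete
});
```

This is why, in a shared server, earlier requests don't pile up waiting on work they didn't enqueue: responsibility for waiting is handed off to whoever drained last.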
Details
- Package: promise-pool
- Publisher: elite-libs
- Published: over 2 years ago
- License: BSD-3-Clause
- 15 dependencies