{"id":13457584,"url":"https://github.com/sindresorhus/p-queue","last_synced_at":"2026-04-07T12:01:14.362Z","repository":{"id":38420115,"uuid":"72200120","full_name":"sindresorhus/p-queue","owner":"sindresorhus","description":"Promise queue with concurrency control","archived":false,"fork":false,"pushed_at":"2026-04-01T17:00:36.000Z","size":205,"stargazers_count":4159,"open_issues_count":6,"forks_count":201,"subscribers_count":19,"default_branch":"main","last_synced_at":"2026-04-02T02:03:24.991Z","etag":null,"topics":["async-functions","async-queue","node-module","npm-package","promise","promise-queue","queue","queue-data-stucture"],"latest_commit_sha":null,"homepage":null,"language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/sindresorhus.png","metadata":{"files":{"readme":"readme.md","changelog":null,"contributing":null,"funding":null,"license":"license","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":".github/security.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null},"funding":{"github":"sindresorhus","open_collective":"sindresorhus","buy_me_a_coffee":"sindresorhus","custom":"https://sindresorhus.com/donate"}},"created_at":"2016-10-28T10:57:52.000Z","updated_at":"2026-04-01T17:00:40.000Z","dependencies_parsed_at":"2026-02-05T08:00:08.502Z","dependency_job_id":null,"html_url":"https://github.com/sindresorhus/p-queue","commit_stats":{"total_commits":142,"total_committers":32,"mean_commits":4.4375,"dds":0.3943661971830986,"last_synced_commit":"cbdbbb768bf0804087201a4e99302a7dbc848198"},"previous_names":[],"tags_count":48,"template":false,"template_full_name":null,"purl":"pkg:github/sindresorhus/p-qu
eue","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sindresorhus%2Fp-queue","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sindresorhus%2Fp-queue/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sindresorhus%2Fp-queue/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sindresorhus%2Fp-queue/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/sindresorhus","download_url":"https://codeload.github.com/sindresorhus/p-queue/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sindresorhus%2Fp-queue/sbom","scorecard":{"id":826687,"data":{"date":"2025-08-11","repo":{"name":"github.com/sindresorhus/p-queue","commit":"9dff653cad2823764443bfbc410c24445bc95947"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":4.2,"checks":[{"name":"Code-Review","score":1,"reason":"Found 3/30 approved changesets -- score normalized to 1","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"Token-Permissions","score":0,"reason":"detected GitHub workflow tokens with excessive permissions","details":["Warn: no topLevel permission defined: .github/workflows/main.yml:1","Info: no jobLevel write permissions 
found"],"documentation":{"short":"Determines if the project's workflows follow the principle of least privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Maintained","score":1,"reason":"0 commit(s) and 2 issue activity found in the last 90 days -- score normalized to 1","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Dangerous-Workflow","score":10,"reason":"no dangerous workflow patterns detected","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"Security-Policy","score":10,"reason":"security policy file detected","details":["Info: security policy file detected: .github/security.md:1","Info: Found linked content: .github/security.md:1","Info: Found disclosure, vulnerability, and/or timelines in security policy: .github/security.md:1","Info: Found text in security policy: .github/security.md:1"],"documentation":{"short":"Determines if the project has published a security policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has 
an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Pinned-Dependencies","score":0,"reason":"dependency not pinned by hash detected -- score normalized to 0","details":["Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/main.yml:16: update your workflow using https://app.stepsecurity.io/secureworkflow/sindresorhus/p-queue/main.yml/main?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/main.yml:17: update your workflow using https://app.stepsecurity.io/secureworkflow/sindresorhus/p-queue/main.yml/main?enable=pin","Warn: npmCommand not pinned by hash: .github/workflows/main.yml:21","Info:   0 out of   2 GitHub-owned GitHubAction dependencies pinned","Info:   0 out of   1 npmCommand dependencies pinned"],"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"Vulnerabilities","score":10,"reason":"0 existing vulnerabilities detected","details":null,"documentation":{"short":"Determines if the project has open, known unfixed vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"License","score":10,"reason":"license file detected","details":["Info: project has a license file: license:0","Info: FSF or OSI recognized license: MIT License: license:0"],"documentation":{"short":"Determines if the project has defined a 
license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Signed-Releases","score":-1,"reason":"no releases found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":0,"reason":"branch protection not enabled on development/release branches","details":["Warn: branch protection not enabled for branch 'main'"],"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 3 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}}]},"last_synced_at":"2025-08-23T16:45:50.933Z","repository_id":38420115,"created_at":"2025-08-23T16:45:50.933Z","updated_at":"2025-08-23T16:45:50.933Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31493372,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-06T17:22:55.647Z","status":"ssl_error","status_checked_at":"2026-04-06T17:22:54.741Z","response_time":112,"last_error":"SSL_read: unexpected eof while 
reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["async-functions","async-queue","node-module","npm-package","promise","promise-queue","queue","queue-data-stucture"],"created_at":"2024-07-31T09:00:30.480Z","updated_at":"2026-04-07T12:01:14.356Z","avatar_url":"https://github.com/sindresorhus.png","language":"TypeScript","readme":"# p-queue\n\n\u003e Promise queue with concurrency control\n\nUseful for rate-limiting async (or sync) operations. For example, when interacting with a REST API or when doing CPU/memory intensive tasks.\n\nFor servers, you probably want a Redis-backed [job queue](https://github.com/sindresorhus/awesome-nodejs#job-queues) instead.\n\nNote that the project is feature complete. We are happy to review pull requests, but we don't plan any further development. 
We are also not answering email support questions.\n\n---\n\n\u003cbr\u003e\n\u003cdiv align=\"center\"\u003e\n\t\u003cp\u003e\n\t\t\u003cp\u003e\n\t\t\t\u003csup\u003e\n\t\t\t\t\u003ca href=\"https://github.com/sponsors/sindresorhus\"\u003eSindre's open source work is supported by the community\u003c/a\u003e\u003cbr\u003eSpecial thanks to:\n\t\t\t\u003c/sup\u003e\n\t\t\u003c/p\u003e\n\t\t\u003cbr\u003e\n\t\t\u003cbr\u003e\n\t\t\u003ca href=\"https://fetchfox.ai?ref=sindre\"\u003e\n\t\t\t\u003cdiv\u003e\n\t\t\t\t\u003cimg src=\"https://sindresorhus.com/assets/thanks/fetchfox-logo.svg\" height=\"200\"/\u003e\n\t\t\t\u003c/div\u003e\n\t\t\t\u003cb\u003eScrape anything with FetchFox\u003c/b\u003e\n\t\t\t\u003cdiv\u003e\n\t\t\t\t\u003csup\u003eFetchFox is an AI powered scraping tool that lets you scrape data from any website\u003c/sup\u003e\n\t\t\t\u003c/div\u003e\n\t\t\u003c/a\u003e\n\t\u003c/p\u003e\n\t\u003cbr\u003e\n\t\u003cbr\u003e\n\u003c/div\u003e\n\n---\n\n## Install\n\n```sh\nnpm install p-queue\n```\n\n**Warning:** This package is native [ESM](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules) and no longer provides a CommonJS export. If your project uses CommonJS, you'll have to [convert to ESM](https://gist.github.com/sindresorhus/a39789f98801d908bbc7ff3ecc99d99c). Please don't open issues for questions regarding CommonJS / ESM.\n\n## Usage\n\nHere we run only one promise at a time. 
For example, set `concurrency` to 4 to run four promises at the same time.\n\n```js\nimport PQueue from 'p-queue';\nimport got from 'got';\n\nconst queue = new PQueue({concurrency: 1});\n\n(async () =\u003e {\n\tawait queue.add(() =\u003e got('https://sindresorhus.com'));\n\tconsole.log('Done: sindresorhus.com');\n})();\n\n(async () =\u003e {\n\tawait queue.add(() =\u003e got('https://avajs.dev'));\n\tconsole.log('Done: avajs.dev');\n})();\n```\n\n## API\n\n### PQueue(options?)\n\nReturns a new `queue` instance, which is an [`EventEmitter3`](https://github.com/primus/eventemitter3) subclass.\n\n#### options\n\nType: `object`\n\n##### concurrency\n\nType: `number`\\\nDefault: `Infinity`\\\nMinimum: `1`\n\nConcurrency limit.\n\n##### timeout\n\nType: `number`\\\nDefault: `undefined`\n\nPer-operation timeout in milliseconds. Operations will throw a `TimeoutError` if they don't complete within the specified time.\n\nThe timeout begins when the operation is dequeued and starts execution, not while it's waiting in the queue.\n\nCan be overridden per task using the `timeout` option in `.add()`:\n\n```js\nconst queue = new PQueue({timeout: 5000});\n\n// This task uses the global 5s timeout\nawait queue.add(() =\u003e fetchData());\n\n// This task has a 10s timeout\nawait queue.add(() =\u003e slowTask(), {timeout: 10000});\n```\n\n##### autoStart\n\nType: `boolean`\\\nDefault: `true`\n\nWhether queued tasks within the concurrency limit are auto-executed as soon as they're added.\n\n##### queueClass\n\nType: `Function`\n\nClass with `enqueue` and `dequeue` methods, and a `size` getter. See the [Custom QueueClass](#custom-queueclass) section.\n\n##### intervalCap\n\nType: `number`\\\nDefault: `Infinity`\\\nMinimum: `1`\n\nThe max number of runs in the given interval of time.\n\n##### interval\n\nType: `number`\\\nDefault: `0`\\\nMinimum: `0`\n\nThe length of time in milliseconds before the interval count resets. 
Must be finite.\n\n##### carryoverIntervalCount\n\nType: `boolean`\\\nDefault: `false`\n\nIf `true`, specifies that any [pending](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise) Promises, should be carried over into the next interval and counted against the `intervalCap`. If `false`, any of those pending Promises will not count towards the next `intervalCap`.\n\n##### strict\n\nType: `boolean`\\\nDefault: `false`\n\nWhether to use strict mode for rate limiting (sliding window algorithm).\n\nWhen enabled, ensures that no more than `intervalCap` tasks execute in any rolling `interval` window, rather than resetting the count at fixed intervals. This provides more predictable and evenly distributed execution.\n\nFor example, with `intervalCap: 2` and `interval: 1000`:\n- **Default mode (fixed window)**: Tasks can burst at window boundaries. You could execute 2 tasks at 999ms and 2 more at 1000ms, resulting in 4 tasks within 1ms.\n- **Strict mode (sliding window)**: Enforces that no more than 2 tasks execute in any 1000ms rolling window, preventing bursts.\n\n\u003e [!NOTE]\n\u003e Strict mode is more resource-intensive as it tracks individual execution timestamps. Use it when you need guaranteed rate-limit compliance, such as when interacting with APIs that enforce strict rate limits.\n\n\u003e [!NOTE]\n\u003e The `carryoverIntervalCount` option has no effect when `strict` mode is enabled, as strict mode tracks actual execution timestamps rather than counting pending tasks.\n\n### queue\n\n`PQueue` instance.\n\n#### .add(fn, options?)\n\nAdds a sync or async task to the queue.\n\nReturns a promise that settles when the task completes, not when it's added to the queue. The promise resolves with the return value of `fn`.\n\n\u003e [!IMPORTANT]\n\u003e If you `await` this promise, you will wait for the task to finish running, which may defeat the purpose of using a queue for concurrency. 
See the [Usage](#usage) section for examples.\n\n\u003e [!NOTE]\n\u003e If your items can potentially throw an exception, you must handle those errors from the returned Promise or they may be reported as an unhandled Promise rejection and potentially cause your process to exit immediately.\n\n##### fn\n\nType: `Function`\n\nPromise-returning/async function. When executed, it will receive `{signal}` as the first argument.\n\n#### options\n\nType: `object`\n\n##### priority\n\nType: `number`\\\nDefault: `0`\n\nPriority of operation. Operations with greater priority will be scheduled first.\n\n##### id\n\nType: `string`\n\nUnique identifier for the promise function, used to update its priority before execution. If not specified, it is auto-assigned an incrementing BigInt starting from `1n`.\n\n##### signal\n\n[`AbortSignal`](https://developer.mozilla.org/en-US/docs/Web/API/AbortSignal) for cancellation of the operation. When aborted, it will be removed from the queue and the `queue.add()` call will reject with an [error](https://developer.mozilla.org/en-US/docs/Web/API/AbortSignal/reason). 
If the operation is already running, the signal will need to be handled by the operation itself.\n\n```js\nimport PQueue from 'p-queue';\nimport got, {CancelError} from 'got';\n\nconst queue = new PQueue();\n\nconst controller = new AbortController();\n\ntry {\n\tawait queue.add(({signal}) =\u003e {\n\t\tconst request = got('https://sindresorhus.com');\n\n\t\tsignal.addEventListener('abort', () =\u003e {\n\t\t\trequest.cancel();\n\t\t});\n\n\t\ttry {\n\t\t\treturn await request;\n\t\t} catch (error) {\n\t\t\tif (!(error instanceof CancelError)) {\n\t\t\t\tthrow error;\n\t\t\t}\n\t\t}\n\t}, {signal: controller.signal});\n} catch (error) {\n\tif (!(error instanceof DOMException)) {\n\t\tthrow error;\n\t}\n}\n```\n\n#### .addAll(fns, options?)\n\nSame as `.add()`, but accepts an array of sync or async functions and returns a promise that resolves when all functions have resolved.\n\n#### .pause()\n\nPut queue execution on hold.\n\n#### .start()\n\nStart (or resume) executing enqueued tasks within the concurrency limit. No need to call this if the queue is not paused (via `options.autoStart = false` or the `.pause()` method).\n\nReturns `this` (the instance).\n\n#### .onEmpty()\n\nReturns a promise that settles when the queue becomes empty.\n\nCan be called multiple times. Useful if, for example, you add additional items at a later time.\n\n\u003e [!NOTE]\n\u003e The promise returned by `.onEmpty()` resolves **once** when the queue becomes empty. If you want to be notified every time the queue becomes empty, use the `empty` event instead: `queue.on('empty', () =\u003e {})`.\n\n#### .onIdle()\n\nReturns a promise that settles when the queue becomes empty, and all promises have completed; `queue.size === 0 \u0026\u0026 queue.pending === 0`.\n\nThe difference with `.onEmpty` is that `.onIdle` guarantees that all work from the queue has finished. 
`.onEmpty` merely signals that the queue is empty, but it could mean that some promises haven't completed yet.\n\n\u003e [!NOTE]\n\u003e The promise returned by `.onIdle()` resolves **once** when the queue becomes idle. If you want to be notified every time the queue becomes idle, use the `idle` event instead: `queue.on('idle', () =\u003e {})`.\n\n#### .onPendingZero()\n\nReturns a promise that settles when all currently running tasks have completed; `queue.pending === 0`.\n\nThe difference with `.onIdle` is that `.onPendingZero` only waits for currently running tasks to finish, ignoring queued tasks. This is useful when you want to drain in-flight tasks before mutating shared state.\n\n```js\nqueue.pause();\nawait queue.onPendingZero();\n// All running tasks have finished, though the queue may still have items\n```\n\n#### .onRateLimit()\n\nReturns a promise that settles when the queue becomes rate-limited due to `intervalCap`. If the queue is already rate-limited, the promise resolves immediately.\n\nUseful for implementing backpressure to prevent memory issues when producers are faster than consumers.\n\n```js\nconst queue = new PQueue({intervalCap: 5, interval: 1000});\n\n// Add many tasks\nfor (let index = 0; index \u003c 10; index++) {\n\tqueue.add(() =\u003e someTask());\n}\n\nawait queue.onRateLimit();\nconsole.log('Queue is now rate-limited - time for maintenance tasks');\n```\n\n#### .onRateLimitCleared()\n\nReturns a promise that settles when the queue is no longer rate-limited. 
If the queue is not currently rate-limited, the promise resolves immediately.\n\n```js\nconst queue = new PQueue({intervalCap: 5, interval: 1000});\n\n// Wait for rate limiting to be cleared\nawait queue.onRateLimitCleared();\nconsole.log('Rate limit cleared - can add more tasks');\n```\n\n#### .onError()\n\nReturns a promise that rejects when any task in the queue errors.\n\nUse with `Promise.race([queue.onError(), queue.onIdle()])` to fail fast on the first error while still resolving normally when the queue goes idle.\n\n\u003e [!IMPORTANT]\n\u003e The promise returned by `add()` still rejects. You must handle each `add()` promise (for example, `.catch(() =\u003e {})`) to avoid unhandled rejections.\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 2});\n\nqueue.add(() =\u003e fetchData(1)).catch(() =\u003e {});\nqueue.add(() =\u003e fetchData(2)).catch(() =\u003e {});\nqueue.add(() =\u003e fetchData(3)).catch(() =\u003e {});\n\n// Stop processing on first error\ntry {\n\tawait Promise.race([\n\t\tqueue.onError(),\n\t\tqueue.onIdle()\n\t]);\n} catch (error) {\n\tqueue.pause(); // Stop processing remaining tasks\n\tconsole.error('Queue failed:', error);\n}\n```\n\n#### .onSizeLessThan(limit)\n\nReturns a promise that settles when the queue size is less than the given limit: `queue.size \u003c limit`.\n\nIf you want to avoid having the queue grow beyond a certain size you can `await queue.onSizeLessThan()` before adding a new item.\n\nNote that this only limits the number of items waiting to start. There could still be up to `concurrency` jobs already running that this call does not include in its calculation.\n\n#### .clear()\n\nClear the queue.\n\n\u003e [!WARNING]\n\u003e Any promises returned by `.add()` for tasks that were waiting in the queue (not yet running) will **never settle** after calling `.clear()`. This can cause \"unsettled top-level await\" warnings or hang your process. 
If you need the promises to settle, use `AbortSignal` for cancellation instead — aborting rejects the `.add()` promise cleanly.\n\n#### .size\n\nSize of the queue, the number of queued items waiting to run.\n\n#### .sizeBy(options)\n\nSize of the queue, filtered by the given options.\n\nFor example, this can be used to find the number of items remaining in the queue with a specific priority level.\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue();\n\nqueue.add(async () =\u003e '🦄', {priority: 1});\nqueue.add(async () =\u003e '🦄', {priority: 0});\nqueue.add(async () =\u003e '🦄', {priority: 1});\n\nconsole.log(queue.sizeBy({priority: 1}));\n//=\u003e 2\n\nconsole.log(queue.sizeBy({priority: 0}));\n//=\u003e 1\n```\n\n#### .setPriority(id, priority)\n\nUpdates the priority of a promise function by its id, affecting its execution order. Requires a defined concurrency limit to take effect.\n\nFor example, this can be used to prioritize a promise function to run earlier.\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 1});\n\nqueue.add(async () =\u003e '🦄', {priority: 1});\nqueue.add(async () =\u003e '🦀', {priority: 0, id: '🦀'});\nqueue.add(async () =\u003e '🦄', {priority: 1});\nqueue.add(async () =\u003e '🦄', {priority: 1});\n\nqueue.setPriority('🦀', 2);\n```\n\nIn this case, the promise function with `id: '🦀'` runs second.\n\nYou can also deprioritize a promise function to delay its execution:\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 1});\n\nqueue.add(async () =\u003e '🦄', {priority: 1});\nqueue.add(async () =\u003e '🦀', {priority: 1, id: '🦀'});\nqueue.add(async () =\u003e '🦄');\nqueue.add(async () =\u003e '🦄', {priority: 0});\n\nqueue.setPriority('🦀', -1);\n```\n\nHere, the promise function with `id: '🦀'` executes last.\n\n#### .pending\n\nNumber of running items (no longer in the queue).\n\n#### [.timeout](#timeout)\n\nType: `number | undefined`\n\nGet or set the default 
timeout for all tasks. Can be changed at runtime.\n\nOperations will throw a `TimeoutError` if they don't complete within the specified time.\n\nThe timeout begins when the operation is dequeued and starts execution, not while it's waiting in the queue.\n\n```js\nconst queue = new PQueue({timeout: 5000});\n\n// Change timeout for all future tasks\nqueue.timeout = 10000;\n```\n\n#### [.concurrency](#concurrency)\n\n#### .isPaused\n\nWhether the queue is currently paused.\n\n#### .isRateLimited\n\nWhether the queue is currently rate-limited due to `intervalCap`. Returns `true` when the number of tasks executed in the current interval has reached the `intervalCap` and there are still tasks waiting to be processed.\n\n#### .isSaturated\n\nWhether the queue is saturated. Returns `true` when:\n- All concurrency slots are occupied and tasks are waiting, OR\n- The queue is rate-limited and tasks are waiting\n\nUseful for detecting backpressure and potential hanging tasks.\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 2});\n\n// Backpressure handling\nif (queue.isSaturated) {\n\tconsole.log('Queue is saturated, waiting for capacity...');\n\tawait queue.onSizeLessThan(queue.concurrency);\n}\n\n// Monitoring for stuck tasks\nsetInterval(() =\u003e {\n\tif (queue.isSaturated) {\n\t\tconsole.warn(`Queue saturated: ${queue.pending} running, ${queue.size} waiting`);\n\t}\n}, 60000);\n```\n\n#### .runningTasks\n\nThe tasks currently being executed. 
Each task includes its `id`, `priority`, `startTime`, and `timeout` (if set).\n\nReturns an array of task info objects.\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 2});\n\n// Add tasks with IDs for better debugging\nqueue.add(() =\u003e fetchUser(123), {id: 'user-123'});\nqueue.add(() =\u003e fetchPosts(456), {id: 'posts-456', priority: 1});\n\n// Check what's running\nconsole.log(queue.runningTasks);\n/*\n[\n\t{\n\t\tid: 'user-123',\n\t\tpriority: 0,\n\t\tstartTime: 1759253001716,\n\t\ttimeout: undefined\n\t},\n\t{\n\t\tid: 'posts-456',\n\t\tpriority: 1,\n\t\tstartTime: 1759253001916,\n\t\ttimeout: undefined\n\t}\n]\n*/\n```\n\n## Events\n\n#### active\n\nEmitted as each item is processed in the queue for the purpose of tracking progress.\n\n```js\nimport delay from 'delay';\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 2});\n\nlet count = 0;\nqueue.on('active', () =\u003e {\n\tconsole.log(`Working on item #${++count}.  Size: ${queue.size}  Pending: ${queue.pending}`);\n});\n\nqueue.add(() =\u003e Promise.resolve());\nqueue.add(() =\u003e delay(2000));\nqueue.add(() =\u003e Promise.resolve());\nqueue.add(() =\u003e Promise.resolve());\nqueue.add(() =\u003e delay(500));\n```\n\n#### completed\n\nEmitted when an item completes without error.\n\n```js\nimport delay from 'delay';\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 2});\n\nqueue.on('completed', result =\u003e {\n\tconsole.log(result);\n});\n\nqueue.add(() =\u003e Promise.resolve('hello, world!'));\n```\n\n#### error\n\nEmitted if an item throws an error. 
The promise returned by `add()` is still rejected, so you must handle both.\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 2});\n\nqueue.on('error', error =\u003e {\n\tconsole.error(error);\n});\n\n// Handle the promise to prevent unhandled rejection\nqueue.add(() =\u003e Promise.reject(new Error('error'))).catch(() =\u003e {\n\t// Error already handled by event listener\n});\n```\n\n#### empty\n\nEmitted every time the queue becomes empty.\n\nUseful if, for example, you add additional items at a later time.\n\n#### idle\n\nEmitted whenever the queue becomes idle: both empty and with zero running tasks (`size === 0 \u0026\u0026 pending === 0`). If no tasks are ever added, it never fires.\n\nThe difference with `empty` is that `idle` guarantees that all work from the queue has finished. `empty` merely signals that the queue is empty, but it could mean that some promises haven't completed yet.\n\n#### pendingZero\n\nEmitted every time the number of running tasks becomes zero; `queue.pending === 0`.\n\nThe difference with `idle` is that `pendingZero` is emitted even when the queue still has items waiting to run, whereas `idle` requires both an empty queue and no pending tasks.\n\n```js\nimport delay from 'delay';\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue();\n\nqueue.on('idle', () =\u003e {\n\tconsole.log(`Queue is idle.  Size: ${queue.size}  Pending: ${queue.pending}`);\n});\n\nconst job1 = queue.add(() =\u003e delay(2000));\nconst job2 = queue.add(() =\u003e delay(500));\n\nawait job1;\nawait job2;\n// =\u003e 'Queue is idle.  Size: 0  Pending: 0'\n\nawait queue.add(() =\u003e delay(600));\n// =\u003e 'Queue is idle.  Size: 0  Pending: 0'\n```\n\nThe `idle` event is emitted every time the queue reaches an idle state. 
On the other hand, the promise the `onIdle()` function returns resolves once the queue becomes idle instead of every time the queue is idle.\n\n#### add\n\nEmitted every time the add method is called and the number of pending or queued tasks is increased.\n\n#### next\n\nEmitted every time a task is completed and the number of pending or queued tasks is decreased. This is emitted regardless of whether the task completed normally or with an error.\n\n```js\nimport delay from 'delay';\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue();\n\nqueue.on('add', () =\u003e {\n\tconsole.log(`Task is added.  Size: ${queue.size}  Pending: ${queue.pending}`);\n});\n\nqueue.on('next', () =\u003e {\n\tconsole.log(`Task is completed.  Size: ${queue.size}  Pending: ${queue.pending}`);\n});\n\nconst job1 = queue.add(() =\u003e delay(2000));\nconst job2 = queue.add(() =\u003e delay(500));\n\nawait job1;\nawait job2;\n//=\u003e 'Task is added.  Size: 0  Pending: 1'\n//=\u003e 'Task is added.  Size: 0  Pending: 2'\n\nawait queue.add(() =\u003e delay(600));\n//=\u003e 'Task is completed.  Size: 0  Pending: 1'\n//=\u003e 'Task is completed.  Size: 0  Pending: 0'\n```\n\n#### rateLimit\n\nEmitted when the queue becomes rate-limited due to `intervalCap`. 
This happens when the maximum number of tasks allowed per interval has been reached.\n\nUseful for implementing backpressure to prevent memory issues when producers are faster than consumers.\n\n```js\nimport delay from 'delay';\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({\n\tintervalCap: 2,\n\tinterval: 1000\n});\n\nqueue.on('rateLimit', () =\u003e {\n\tconsole.log('Queue is rate-limited - processing backlog or maintenance tasks');\n});\n\n// Add 3 tasks - third one triggers rate limiting\nqueue.add(() =\u003e delay(100));\nqueue.add(() =\u003e delay(100));\nqueue.add(() =\u003e delay(100));\n```\n\n#### rateLimitCleared\n\nEmitted when the queue is no longer rate-limited—either because the interval reset and new tasks can start, or because the backlog was drained.\n\n```js\nimport delay from 'delay';\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({\n\tintervalCap: 1,\n\tinterval: 1000\n});\n\nqueue.on('rateLimit', () =\u003e {\n\tconsole.log('Rate limited - waiting for interval to reset');\n});\n\nqueue.on('rateLimitCleared', () =\u003e {\n\tconsole.log('Rate limit cleared - can process more tasks');\n});\n\nqueue.add(() =\u003e delay(100));\nqueue.add(() =\u003e delay(100)); // This triggers rate limiting\n```\n\n## Advanced example\n\nA more advanced example to help you understand the flow.\n\n```js\nimport delay from 'delay';\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 1});\n\n(async () =\u003e {\n\tawait delay(200);\n\n\tconsole.log(`8. Pending promises: ${queue.pending}`);\n\t//=\u003e '8. Pending promises: 0'\n\n\t(async () =\u003e {\n\t\tconst result = await queue.add(async () =\u003e '🐙');\n\t\tconsole.log('11. Resolved', result);\n\t})();\n\n\tconsole.log('9. Added 🐙');\n\n\tconsole.log(`10. Pending promises: ${queue.pending}`);\n\t//=\u003e '10. Pending promises: 1'\n\n\tawait queue.onIdle();\n\tconsole.log('12. All work is done');\n})();\n\n(async () =\u003e {\n\tconst result = await queue.add(async () =\u003e '🦄');\n\tconsole.log('5. Resolved', result);\n})();\nconsole.log('1. Added 🦄');\n\n(async () =\u003e {\n\tconst result = await queue.add(async () =\u003e '🐴');\n\tconsole.log('6. Resolved', result);\n})();\nconsole.log('2. Added 🐴');\n\n(async () =\u003e {\n\tawait queue.onEmpty();\n\tconsole.log('7. Queue is empty');\n})();\n\nconsole.log(`3. Queue size: ${queue.size}`);\n//=\u003e '3. Queue size: 1'\n\nconsole.log(`4. Pending promises: ${queue.pending}`);\n//=\u003e '4. Pending promises: 1'\n```\n\n```\n$ node example.js\n1. Added 🦄\n2. Added 🐴\n3. Queue size: 1\n4. Pending promises: 1\n5. Resolved 🦄\n6. Resolved 🐴\n7. Queue is empty\n8. Pending promises: 0\n9. Added 🐙\n10. Pending promises: 1\n11. Resolved 🐙\n12. All work is done\n```\n\n## Handling timeouts\n\nYou can set a timeout for all tasks or override it per task. When a task times out, a `TimeoutError` is thrown.\n\n```js\nimport PQueue, {TimeoutError} from 'p-queue';\n\n// Set a global timeout for all tasks\nconst queue = new PQueue({\n\tconcurrency: 2,\n\ttimeout: 5000, // 5 seconds\n});\n\n(async () =\u003e {\n\t// This task will use the global timeout\n\ttry {\n\t\tawait queue.add(() =\u003e fetchData());\n\t} catch (error) {\n\t\tif (error instanceof TimeoutError) {\n\t\t\tconsole.log('Task timed out after 5 seconds');\n\t\t}\n\t}\n})();\n\n(async () =\u003e {\n\t// Override timeout for a specific task\n\ttry {\n\t\tawait queue.add(() =\u003e slowTask(), {\n\t\t\ttimeout: 10000, // 10 seconds for this task only\n\t\t});\n\t} catch (error) {\n\t\tif (error instanceof TimeoutError) {\n\t\t\tconsole.log('Slow task timed out after 10 seconds');\n\t\t}\n\t}\n})();\n\n(async () =\u003e {\n\t// No timeout for this task\n\tawait queue.add(() =\u003e verySlowTask(), {\n\t\ttimeout: undefined,\n\t});\n})();\n```\n\n## Custom QueueClass\n\nFor more complex scheduling policies, you can provide a custom queue class via the `queueClass` option:\n\n```js\nimport PQueue from 'p-queue';\n\nclass QueueClass {\n\tconstructor() {\n\t\tthis._queue = [];\n\t}\n\n\tenqueue(run, options) 
{\n\t\tthis._queue.push(run);\n\t}\n\n\tdequeue() {\n\t\treturn this._queue.shift();\n\t}\n\n\tget size() {\n\t\treturn this._queue.length;\n\t}\n\n\tfilter(options) {\n\t\treturn this._queue;\n\t}\n}\n\nconst queue = new PQueue({queueClass: QueueClass});\n```\n\n`p-queue` calls these methods to add tasks to and take tasks from your custom queue.\n\n## FAQ\n\n#### How do the `concurrency` and `intervalCap` options affect each other?\n\nThey are independent constraints that apply simultaneously. The `concurrency` option limits how many tasks can run at the same time, while the `intervalCap` option limits how many tasks may start within each interval.\n\n#### When should I use `strict` mode for rate limiting?\n\nUse `strict: true` when:\n- You're interacting with APIs that enforce strict rate limits and will throttle or block you if you exceed them, even briefly\n- You've experienced issues with the default fixed window mode (such as [#126](https://github.com/sindresorhus/p-queue/issues/126))\n- You need guaranteed compliance with rate limits for any rolling time window\n\nUse the default fixed window mode when:\n- You don't have strict rate limit requirements\n- Performance is more important than perfect rate limit distribution\n- You're rate limiting for backpressure management rather than external API constraints\n\n#### How do I implement backpressure?\n\nUse `.onSizeLessThan()` to prevent the queue from growing unbounded and causing memory issues when producers are faster than consumers:\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue();\n\n// Wait for queue to have space before adding more\nawait queue.onSizeLessThan(100);\nqueue.add(() =\u003e someTask());\n```\n\nNote: `.size` counts queued items, while `.pending` counts running items. The total is `queue.size + queue.pending`.\n\nYou can also use `.onRateLimit()` for backpressure during rate limiting. 
See the [`.onRateLimit()`](#onratelimit) docs.\n\n#### How do I cancel or remove a queued task?\n\nUse `AbortSignal` for targeted cancellation. When aborted, a queued task is removed and the `.add()` promise rejects. For bulk cancellation, share one `AbortController` across tasks. Avoid using `queue.clear()` alone for cancellation: it removes queued tasks but their `.add()` promises will never settle, causing dangling promises.\n\nNote that aborting only rejects the promise returned by `.add()` — it does not automatically stop the async work inside your function. For a running task, you must handle the signal inside the function itself (see the example below).\n\nSingle-task cancellation:\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue();\nconst controller = new AbortController();\n\nconst promise = queue.add(({signal}) =\u003e doWork({signal}), {signal: controller.signal});\n\ncontroller.abort(); // Cancels if still queued; running tasks must handle `signal` themselves\n\npromise.catch(() =\u003e {}); // Handle the rejection; the `.add()` promise rejects once aborted\n```\n\nBulk cancellation using a shared `AbortController`:\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 2});\nconst controller = new AbortController();\n\n// All tasks share the same signal\nqueue.add(({signal}) =\u003e doWork({signal}), {signal: controller.signal}).catch(() =\u003e {});\nqueue.add(({signal}) =\u003e doWork({signal}), {signal: controller.signal}).catch(() =\u003e {});\nqueue.add(({signal}) =\u003e doWork({signal}), {signal: controller.signal}).catch(() =\u003e {});\n\n// Cancel all queued (and signal running) tasks — promises reject cleanly\ncontroller.abort();\n```\n\nDirect removal methods are not provided as they would leak internals and risk dangling promises.\n\n#### How do I get results in the order they were added?\n\nThis package executes tasks in priority order, but doesn't guarantee completion order. 
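Note that `Promise.all()` returns results in input order even when the underlying promises settle out of order, which can be seen without p-queue at all (a self-contained sketch; `delay` here is a local helper, not the `delay` package):

```js
// Local helper: resolve with `value` after `ms` milliseconds
const delay = (ms, value) => new Promise(resolve => {
	setTimeout(() => resolve(value), ms);
});

const results = await Promise.all([
	delay(30, 'first'),  // Settles last
	delay(10, 'second'), // Settles first
	delay(20, 'third'),
]);

console.log(results);
//=> ['first', 'second', 'third'] (input order, not completion order)
```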
If you need results in the order they were added, use `Promise.all()`, which maintains the order of the input array:\n\n```js\nimport PQueue from 'p-queue';\n\nconst queue = new PQueue({concurrency: 4});\n\nconst tasks = [\n\t() =\u003e fetchData(1), // May finish third\n\t() =\u003e fetchData(2), // May finish first\n\t() =\u003e fetchData(3), // May finish second\n];\n\nconst results = await Promise.all(\n\ttasks.map(task =\u003e queue.add(task))\n);\n// results = [result1, result2, result3] ✅ Always in input order\n\n// Or more concisely:\nconst urls = ['url1', 'url2', 'url3'];\nconst responses = await Promise.all(\n\turls.map(url =\u003e queue.add(() =\u003e fetch(url)))\n);\n```\n\nIf you don't need `p-queue`'s advanced features, consider using [`p-map`](https://github.com/sindresorhus/p-map), which is specifically designed for this use case.\n\n#### How do I stream results as they complete in order?\n\nFor progressive results that maintain input order, use [`pMapIterable`](https://github.com/sindresorhus/p-map#pmapiterable) from `p-map`:\n\n```js\nimport {pMapIterable} from 'p-map';\n\n// Stream results in order as they complete\nfor await (const result of pMapIterable(items, fetchItem, {concurrency: 4})) {\n\tconsole.log(result); // Results arrive in input order\n}\n```\n\nYou can combine it with `p-queue` when you need priorities or a shared concurrency cap:\n\n```js\nimport PQueue from 'p-queue';\nimport {pMapIterable} from 'p-map';\n\n// Let p-queue handle concurrency\nconst queue = new PQueue({concurrency: 4});\n\nfor await (const result of pMapIterable(\n\titems,\n\titem =\u003e queue.add(() =\u003e fetchItem(item), {priority: item.priority})\n)) {\n\tconsole.log(result); // Still in input order\n}\n```\n\n#### How do I debug a queue that stops processing tasks?\n\nIf your queue stops processing tasks after extended use, it's likely that some tasks are hanging indefinitely, exhausting the concurrency limit. 
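The `timeout` option described under *Handling timeouts* exists precisely to turn such hangs into rejections. The underlying mechanic can be sketched with a hypothetical `withTimeout` helper (not a p-queue API):

```js
// Hypothetical helper: reject if `promise` takes longer than `ms` milliseconds
const withTimeout = (promise, ms, label) =>
	Promise.race([
		promise,
		new Promise((_, reject) => {
			setTimeout(() => {
				reject(new Error(`Timed out after ${ms}ms: ${label}`));
			}, ms);
		}),
	]);

try {
	// A promise that never settles would otherwise hold a concurrency slot forever
	await withTimeout(new Promise(() => {}), 100, 'hung task');
} catch (error) {
	console.log(error.message);
	//=> 'Timed out after 100ms: hung task'
}
```

A production version would also clear the timer when the task settles first; the built-in `timeout` option covers this for queued tasks.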
Use the `.runningTasks` property to identify which specific tasks are stuck.\n\nCommon causes:\n- Network requests without timeouts\n- Database queries that hang\n- File operations on unresponsive network drives\n- Unhandled promise rejections\n\nDebugging steps:\n\n```js\n// 1. Add timeouts to prevent hanging\nconst queue = new PQueue({\n\tconcurrency: 2,\n\ttimeout: 30000 // 30 seconds\n});\n\n// 2. Always add IDs to tasks for debugging\nqueue.add(() =\u003e processItem(item), {id: `item-${item.id}`});\n\n// 3. Monitor for stuck tasks using runningTasks\nsetInterval(() =\u003e {\n\tconst now = Date.now();\n\tconst stuckTasks = queue.runningTasks.filter(task =\u003e\n\t\tnow - task.startTime \u003e 30000 // Running for over 30 seconds\n\t);\n\n\tif (stuckTasks.length \u003e 0) {\n\t\tconsole.error('Stuck tasks:', stuckTasks);\n\t\t// Consider aborting or logging more details\n\t}\n\n\t// Detect saturation (potential hanging if persistent)\n\tif (queue.isSaturated) {\n\t\tconsole.warn(`Queue saturated: ${queue.pending} running, ${queue.size} waiting`);\n\t}\n}, 60000);\n\n// 4. Track task lifecycle\nqueue.on('completed', result =\u003e {\n\tconsole.log('Task completed');\n});\nqueue.on('error', error =\u003e {\n\tconsole.error('Task failed:', error);\n});\n\n// 5. 
Wrap tasks with debugging\nconst debugTask = async (fn, name) =\u003e {\n\tconst start = Date.now();\n\tconsole.log(`Starting: ${name}`);\n\ttry {\n\t\tconst result = await fn();\n\t\tconsole.log(`Completed: ${name} (${Date.now() - start}ms)`);\n\t\treturn result;\n\t} catch (error) {\n\t\tconsole.error(`Failed: ${name} (${Date.now() - start}ms)`, error);\n\t\tthrow error;\n\t}\n};\n\nqueue.add(() =\u003e debugTask(() =\u003e fetchData(), 'fetchData'), {id: 'fetchData'});\n```\n\nPrevention:\n- Use the `timeout` option, or explicit timeouts on I/O operations, to enforce task time limits\n- Ensure all async functions properly resolve or reject\n- Monitor `queue.size` and `queue.pending` in production\n\n#### How do I test code that uses `p-queue` with Jest fake timers?\n\nJest fake timers don't work well with `p-queue` because it uses `queueMicrotask` internally.\n\nWorkaround:\n\n```js\nconst flushPromises = () =\u003e new Promise(resolve =\u003e setImmediate(resolve));\n\njest.useFakeTimers();\n\n// ... 
your test code ...\n\nawait jest.runAllTimersAsync();\nawait flushPromises();\n```\n\n## Maintainers\n\n- [Sindre Sorhus](https://github.com/sindresorhus)\n- [Richie Bendall](https://github.com/Richienb)\n\n## Related\n\n- [p-limit](https://github.com/sindresorhus/p-limit) - Run multiple promise-returning \u0026 async functions with limited concurrency\n- [p-throttle](https://github.com/sindresorhus/p-throttle) - Throttle promise-returning \u0026 async functions\n- [p-debounce](https://github.com/sindresorhus/p-debounce) - Debounce promise-returning \u0026 async functions\n- [p-all](https://github.com/sindresorhus/p-all) - Run promise-returning \u0026 async functions concurrently with optional limited concurrency\n- [More…](https://github.com/sindresorhus/promise-fun)\n