{"id":32118567,"url":"https://github.com/patrickkunka/bandwidth-throttle-stream","last_synced_at":"2026-02-18T09:02:56.547Z","repository":{"id":42899075,"uuid":"252779092","full_name":"patrickkunka/bandwidth-throttle-stream","owner":"patrickkunka","description":"A Node.js and Deno transform stream for throttling bandwidth","archived":false,"fork":false,"pushed_at":"2023-03-04T10:06:11.000Z","size":381,"stargazers_count":24,"open_issues_count":6,"forks_count":3,"subscribers_count":1,"default_branch":"master","last_synced_at":"2025-10-22T19:58:59.306Z","etag":null,"topics":["bandwidth-limiting","network","simulation","stream","testing","throttling"],"latest_commit_sha":null,"homepage":"","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/patrickkunka.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2020-04-03T16:04:06.000Z","updated_at":"2024-07-04T10:32:28.000Z","dependencies_parsed_at":"2023-02-06T12:01:07.353Z","dependency_job_id":null,"html_url":"https://github.com/patrickkunka/bandwidth-throttle-stream","commit_stats":null,"previous_names":[],"tags_count":7,"template":false,"template_full_name":null,"purl":"pkg:github/patrickkunka/bandwidth-throttle-stream","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/patrickkunka%2Fbandwidth-throttle-stream","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/patrickkunka%2Fbandwidth-throttle-stream/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/patrickkunka%2Fbandwidth-throttle-stream/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/patrickk
unka%2Fbandwidth-throttle-stream/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/patrickkunka","download_url":"https://codeload.github.com/patrickkunka/bandwidth-throttle-stream/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/patrickkunka%2Fbandwidth-throttle-stream/sbom","scorecard":{"id":722359,"data":{"date":"2025-08-11","repo":{"name":"github.com/patrickkunka/bandwidth-throttle-stream","commit":"cf6209b10f2b60800174f5cef92db25b96901c8a"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":2.4,"checks":[{"name":"Dangerous-Workflow","score":10,"reason":"no dangerous workflow patterns detected","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"Maintained","score":0,"reason":"0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Token-Permissions","score":0,"reason":"detected GitHub workflow tokens with excessive permissions","details":["Warn: no topLevel permission defined: .github/workflows/ci.yml:1","Info: no jobLevel write permissions found"],"documentation":{"short":"Determines if the project's workflows follow the principle of least privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Code-Review","score":0,"reason":"Found 0/30 approved changesets -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka 
merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Pinned-Dependencies","score":4,"reason":"dependency not pinned by hash detected -- score normalized to 4","details":["Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/ci.yml:12: update your workflow using https://app.stepsecurity.io/secureworkflow/patrickkunka/bandwidth-throttle-stream/ci.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/ci.yml:15: update your workflow using https://app.stepsecurity.io/secureworkflow/patrickkunka/bandwidth-throttle-stream/ci.yml/master?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/ci.yml:39: update your workflow using 
https://app.stepsecurity.io/secureworkflow/patrickkunka/bandwidth-throttle-stream/ci.yml/master?enable=pin","Info:   0 out of   2 GitHub-owned GitHubAction dependencies pinned","Info:   0 out of   1 third-party GitHubAction dependencies pinned","Info:   1 out of   1 npmCommand dependencies pinned"],"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"License","score":0,"reason":"license file not detected","details":["Warn: project does not have a license file"],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"Signed-Releases","score":-1,"reason":"no releases found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":0,"reason":"branch protection not enabled on development/release 
branches","details":["Warn: branch protection not enabled for branch 'master'"],"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 6 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}},{"name":"Vulnerabilities","score":0,"reason":"15 existing vulnerabilities detected","details":["Warn: Project is vulnerable to: GHSA-968p-4wvh-cqc8","Warn: Project is vulnerable to: GHSA-67hx-6x53-jw92","Warn: Project is vulnerable to: GHSA-93q8-gq69-wqmw","Warn: Project is vulnerable to: GHSA-v6h2-p8h4-qcjw","Warn: Project is vulnerable to: GHSA-grv7-fg5c-xmjg","Warn: Project is vulnerable to: GHSA-3xgq-45jj-v275","Warn: Project is vulnerable to: GHSA-gxpj-cx7g-858c","Warn: Project is vulnerable to: GHSA-2j2x-2gpw-g8fm","Warn: Project is vulnerable to: GHSA-4q6p-r6v2-jvc5","Warn: Project is vulnerable to: GHSA-9c47-m6qq-7p4h","Warn: Project is vulnerable to: GHSA-952p-6rrq-rcjv","Warn: Project is vulnerable to: GHSA-f8q6-p94x-37v3","Warn: Project is vulnerable to: GHSA-xvch-5gv4-984h","Warn: Project is vulnerable to: GHSA-9wv6-86v2-598j","Warn: Project is vulnerable to: GHSA-c2qf-rxjj-qqgw"],"documentation":{"short":"Determines if the project has open, known unfixed 
vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}}]},"last_synced_at":"2025-08-22T11:45:01.976Z","repository_id":42899075,"created_at":"2025-08-22T11:45:01.976Z","updated_at":"2025-08-22T11:45:01.976Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":29574065,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-02-18T08:38:15.585Z","status":"ssl_error","status_checked_at":"2026-02-18T08:38:14.917Z","response_time":162,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["bandwidth-limiting","network","simulation","stream","testing","throttling"],"created_at":"2025-10-20T17:23:32.719Z","updated_at":"2026-02-18T09:02:56.540Z","avatar_url":"https://github.com/patrickkunka.png","language":"TypeScript","readme":"![CI](https://github.com/patrickkunka/bandwidth-throttle-stream/workflows/CI/badge.svg) [![Coverage Status](https://coveralls.io/repos/github/patrickkunka/bandwidth-throttle-stream/badge.svg?branch=master)](https://coveralls.io/github/patrickkunka/bandwidth-throttle-stream?branch=master)\n\n# Bandwidth Throttle Stream\n\nA [Node.js](https://nodejs.org/en/) and [Deno](https://deno.land/) transform stream for throttling bandwidth which distributes available bandwidth evenly between all requests in a \"group\", accurately simulating the effect of network conditions on simultaneous 
overlapping requests.\n\n#### Features\n- Idiomatic pipeable [Transform](https://nodejs.org/api/stream.html) API for use in Node.js\n- Idiomatic pipeable [TransformStream](https://developer.mozilla.org/en-US/docs/Web/API/TransformStream) API for use in Deno\n- Distributes the desired bandwidth evenly over each second\n- Distributes the desired bandwidth evenly between all active requests\n- Abortable requests ensure bandwidth is redistributed if a client aborts a request\n\n#### Contents\n- [Node.js Installation](#nodejs-installation)\n- [Deno Installation](#deno-installation)\n- [Usage](#usage)\n    - [Creating a Group](#creating-a-group)\n    - [Attaching Throttles](#attaching-throttles)\n    - [Handling Completion](#handling-completion)\n    - [Converting Between Reader Formats in Deno](#converting-between-reader-formats-in-deno)\n- [Configuration Options](#configuration-options)\n- [Dynamic Configuration](#dynamic-configuration)\n- [Aborted Requests](#aborted-requests)\n- [Repo Structure](#repo-structure)\n\n## Node.js Installation\n\nFirstly, install the package using your package manager of choice.\n\n```\nnpm install bandwidth-throttle-stream\n```\n\nYou may then import the `createBandwidthThrottleGroup()` factory function into your project.\n\n```js\nimport {createBandwidthThrottleGroup} from 'bandwidth-throttle-stream';\n```\n\n## Deno Installation\n\nIn Deno, all modules are imported from URLs as ES modules. Versioned [releases](https://github.com/patrickkunka/bandwidth-throttle-stream/releases) of `bandwidth_throttle_stream` are available from [deno.land/x](https://deno.land/x). 
Note that as per Deno convention, the package name is delineated with underscores (`_`).\n\n```js\nimport {createBandwidthThrottleGroup} from 'https://deno.land/x/bandwidth_throttle_stream/mod.ts';\n```\n\nThe above URL will return the latest release, but it is strongly advised to lock your import to a specific version using the following syntax, where the `x.y.z` semver can be any published version of the library:\n\n```js\nimport {createBandwidthThrottleGroup} from 'https://deno.land/x/bandwidth_throttle_stream@x.y.z/mod.ts';\n```\n\n## Usage\n\n### Creating a Group\n\nUsing the imported `createBandwidthThrottleGroup` factory function, we must first create a \"bandwidth throttle group\", which will be configured with a specific throughput in bytes (B) per second.\n\n```js\n// Create a group with a configured available bandwidth in bytes (B) per second.\n\nconst bandwidthThrottleGroup = createBandwidthThrottleGroup({\n    bytesPerSecond: 500000 // 500KB/s\n});\n```\n\nTypically we would create a single group only for a server running a simulation, through which all incoming network requests to be throttled are routed. However, we could also create multiple groups if we wanted to run multiple simulations with different configurations on a single server.\n\n### Attaching Throttles\n\nOnce we've created a group, we can then attach individual pipeable \"throttles\" to it, as requests come into our server.\n\nThe simplest integration would be to insert the throttle (via `.pipe`, or `.pipeThrough`) between a readable stream (e.g. a file system readout, server-side HTTP response), and the response stream of the incoming client request to be throttled.\n\n##### Node.js example: Piping between readable and writable streams\n```js\n// Attach a throttle to a group (e.g. 
in response to an incoming request)\n\nconst throttle = bandwidthThrottleGroup.createBandwidthThrottle(contentLength);\n\n// Throttle the response by piping a `stream.Readable` to a `stream.Writable`\n// via the throttle\n\nsomeReadableStream\n    .pipe(throttle)\n    .pipe(someWritableStream);\n```\n\n##### Deno example: Piping between a readable stream and a reader\n```ts\n// Attach a throttle to a group (e.g. in response to an incoming request)\n\nconst throttle = bandwidthThrottleGroup.createBandwidthThrottle(contentLength);\n\n// Throttle the response by piping a `ReadableStream` to a `ReadableStreamDefaultReader`:\n\nconst reader = someReadableStream\n    .pipeThrough(throttle)\n    .getReader();\n```\n\nNote that a numeric value for `contentLength` (in bytes) must be passed when creating an individual throttle. This should be the total size of data for the request being passed through the throttle, and is used to allocate memory upfront in a single `Uint8Array` typed array, thus preventing expensive GC calls as backpressure builds up. When throttling HTTP requests, `contentLength` can be obtained from the `'content-length'` header, once the headers of the request have arrived:\n\n##### Node.js (Express) example: Obtaining `content-length` from `req` headers\n```js\nconst contentLength = parseInt(req.get('content-length'), 10);\n```\n\n##### Deno example: Obtaining `content-length` from `fetch` headers\n\n```ts\nconst { body, headers } = await fetch(destination);\n\nconst contentLength = parseInt(headers.get(\"content-length\"), 10);\n```\n\nIf no `contentLength` value is available (e.g. the underlying server does not implement a `content-length` header), then it should be set to a value no smaller than the size of the largest expected request. 
To keep memory usage within reason, arbitrarily high values should be avoided.\n\n### Handling Completion\n\nWe may want to perform some specific logic once a request is complete, and all data has been processed through the throttle.\n\nIn Node.js, rather than piping directly to a response, we can use the `data` event to manually write data, and the `end` event to manually handle completion.\n\n##### Node.js example: Hooking into the `end` event of a writable stream\n```js\nrequest\n    .pipe(throttle)\n    .on('data', chunk =\u003e response.write(chunk))\n    .on('end', () =\u003e {\n        response.end();\n\n        // any custom completion logic here\n    });\n```\n\nIn Deno, the call to `request.respond()` returns a promise which resolves once the request is completed and all data has been pulled into the `body` reader.\n\n##### Deno example: Responding to a request with a reader and a status code\n```ts\nimport {readerToDenoReader} from 'https://deno.land/x/bandwidth_throttle_stream@x.y.z/mod.ts';\n\nconst reader = request\n    .pipeThrough(throttle)\n    .getReader();\n\nawait request.respond({\n    status: 200,\n    body: readerToDenoReader(reader, contentLength),\n});\n\n// any custom completion logic here\n```\n\n### Converting Between Reader Formats in Deno\n\nNote that in the Deno example above, a reader may be passed directly to `request.respond()`, allowing real-time streaming of the throttled output. 
However, the Deno [`std`](https://deno.land/std/http/server.ts) server expects a `Deno.Reader` as a `body` (rather than the standard `ReadableStreamDefaultReader`), meaning that conversion is needed between the two.\n\nThe `readerToDenoReader` util is exposed for this purpose, and must be provided with both a reference to the `ReadableStreamDefaultReader` (`reader`), and the `contentLength` of the request.\n\n## Configuration Options\n\nEach bandwidth throttle group accepts an optional object of configuration options:\n\n```js\nconst bandwidthThrottleGroup = createBandwidthThrottleGroup({\n    bytesPerSecond: 500000, // 500KB/s\n    ticksPerSecond: 20 // aim to write output 20x per second\n});\n```\n\nThe following options are available.\n\n```ts\ninterface IConfig {\n    /**\n     * The maximum number of bytes allowed to pass through the\n     * throttle, each second.\n     *\n     * @default Infinity\n     */\n\n    bytesPerSecond?: number;\n\n    /**\n     * Defines how frequently the processing of bytes should be\n     * distributed across each second. Each time the internal\n     * scheduler \"ticks\", data will be processed and written out.\n     *\n     * A higher value will ensure smoother throttling for requests\n     * that complete within a second, but will be more expensive\n     * computationally and will ultimately be constrained by the\n     * performance of the JavaScript runtime.\n     *\n     * @default 40\n     */\n\n    ticksPerSecond?: number;\n}\n```\n\n## Dynamic Configuration\n\nA group can be reconfigured at any point after creation via its `.configure()` method, which accepts the same configuration interface as the `createBandwidthThrottleGroup()` factory.\n\n```js\n// Create a group with no throttling\n\nconst bandwidthThrottleGroup = createBandwidthThrottleGroup();\n\n// ... 
after some configuration event:\n\nbandwidthThrottleGroup.configure({\n    bytesPerSecond: 6000000\n});\n```\n\n## Aborted Requests\n\nWhen a client aborts a request, it's important that we also abort the throttle, ensuring that the group can re-balance available bandwidth correctly, and that backpressure buffer memory is released.\n\n##### Node.js example: Handling aborted requests\n\n```js\nconst throttle = bandwidthThrottleGroup.createBandwidthThrottle(contentLength);\n\nrequest.on('aborted', () =\u003e {\n    // Client aborted request\n\n    throttle.abort();\n});\n\nrequest\n    .pipe(throttle)\n    .pipe(response);\n```\n\n##### Deno example: Handling aborted requests\n\n```ts\nconst throttle = bandwidthThrottleGroup.createBandwidthThrottle(contentLength);\n\nconst reader = request\n    .pipeThrough(throttle)\n    .getReader();\n\ntry {\n    await request.respond({\n        status: 200,\n        body: readerToDenoReader(reader, contentLength),\n    });\n} catch (err) {\n    // Request aborted or failed\n\n    throttle.abort();\n}\n```\n\n## Repo Structure\n\nThis repository contains shared source code for consumption by both Deno (TypeScript ES modules), and Node.js (JavaScript CommonJS modules).\n\nWherever a Deno or Node.js specific API is needed, a common abstraction is created that can be swapped at build time. Platform-specific implementations are denoted with either a `.deno.ts` or `.node.ts` file extension. Platform-specific entry points to these abstractions reside in the `lib/Platform/` directory.\n\nThe source code (contained in the `lib/` directory) is ready for direct consumption by Deno and is written in ESNext TypeScript, but requires some modifications to produce Node.js-compatible NPM distribution code.\n\nThe Node.js build process comprises the following steps:\n1. Copy all contents of `lib/` to `src/` (git ignored)\n1. Remove all `.ts` file extensions from modules in `src/` (see `scripts/replace.ts`)\n1. 
Replace any imports from `src/Platform/*` with a `@Platform` alias (see `scripts/replace.ts`)\n1. Run `tsc` on the contents of `src/` using the [`ts-transform-paths`](https://github.com/zerkalica/zerollup/tree/master/packages/ts-transform-paths) plugin to replace the `@Platform` alias with Node.js entry points.\n1. Output compiled CommonJS code to `dist/` (git ignored), and publish `dist/` to NPM.","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpatrickkunka%2Fbandwidth-throttle-stream","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fpatrickkunka%2Fbandwidth-throttle-stream","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpatrickkunka%2Fbandwidth-throttle-stream/lists"}