{"id":22882708,"url":"https://github.com/nimitzdev/fetch-streamer","last_synced_at":"2025-09-08T03:35:14.745Z","repository":{"id":79381527,"uuid":"255774971","full_name":"NimitzDEV/fetch-streamer","owner":"NimitzDEV","description":"Process data while the data is downloading","archived":false,"fork":false,"pushed_at":"2023-07-11T06:39:52.000Z","size":25,"stargazers_count":0,"open_issues_count":2,"forks_count":0,"subscribers_count":2,"default_branch":"master","last_synced_at":"2025-02-06T21:46:46.796Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/NimitzDEV.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-04-15T01:46:47.000Z","updated_at":"2020-04-15T01:49:02.000Z","dependencies_parsed_at":"2025-02-06T21:45:31.697Z","dependency_job_id":"f8e381b8-2216-449a-8f0e-408c535b7706","html_url":"https://github.com/NimitzDEV/fetch-streamer","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NimitzDEV%2Ffetch-streamer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NimitzDEV%2Ffetch-streamer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NimitzDEV%2Ffetch-streamer/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NimitzDEV%2Ffetch-streamer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/NimitzDEV","download_ur
l":"https://codeload.github.com/NimitzDEV/fetch-streamer/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246499374,"owners_count":20787490,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-12-13T18:19:24.045Z","updated_at":"2025-03-31T16:38:38.494Z","avatar_url":"https://github.com/NimitzDEV.png","language":"TypeScript","readme":"# FetchStreamer\r\nThis library utilize `ReadableStream` that provide in `fetch`, to let you process structured data while downloading it.\r\nThis library is most usable when you want to process data like the following:\r\n\r\nData stream contains data units with fixed length\r\n```\r\nbace138be6953bbff4c0a97057d55691c124dc949ad38ae4d9dc31b35d4cef4cfa7416d63d56b3d17b49555aab14c5c1ac466ecec1c74c6db6dbe81518cee1c1\r\n```\r\nThe above data is the combine of 4 md5 hashes, and you want to process it as soon as every 32 bytes of data became available.\r\n\r\n## Examples\r\nFixed length\r\n```javascript\r\nconst url = 'http://localhost:3000/data';\r\nconst streamer = new FetchStreamer(url, 32, 'utf-8');\r\nstreamer.onData((data) =\u003e {\r\n  // callback will be called every time 32 bytes of data became available\r\n})\r\n\r\nstreamer.start();\r\n```\r\n\r\nDynamic length\r\n```javascript\r\nconst url = 'http://localhost:3000/data';\r\n\r\n// This time, the actual data length will be presented in every chunk's first byte.\r\nconst streamer = new FetchStreamer(url, 1, 'utf-8');\r\nstreamer.onData((data, info) =\u003e {\r\n  if (info.targetSize === 1) {\r\n    
console.log(`It's the header, the next chunk's length should be`, data);\r\n    return +data;\r\n  } else {\r\n    console.log(`Consumable data`, data.length, data);\r\n    return 1;\r\n  }\r\n});\r\nstreamer.start();\r\n```\r\n\r\nCheck out the example folder for more details.\r\n\r\n## APIs\r\n#### constructor(url, initialSize, textDecoderEncoding)\r\nurl: URL to read from\r\ninitialSize: Default chunk size\r\ntextDecoderEncoding: If supplied, the data you receive in the onData callback will be decoded first. See also https://developer.mozilla.org/en-US/docs/Web/API/TextDecoder\r\n\r\n\r\n### Events\r\n#### onData((data, info) =\u003e {})\r\nThis event fires every time the buffer reaches the target size.\r\n\r\ndata: Chunk data\r\ninfo: \r\n```json\r\n{ \r\n  \"targetSize\": 0, // this chunk's target size\r\n  \"fulfilled\": true, // indicates whether this chunk matches the expected length; it will be false when the final chunk is shorter than the configured size\r\n}\r\n```\r\n\r\n#### onFinish((info) =\u003e {})\r\nThis event fires when there is no more data to read.\r\n\r\ninfo: \r\n```json\r\n{\r\n  \"bytesProcessed\": 39,\r\n  \"bytesReceived\": 39,\r\n  \"elapsed\": 1107\r\n}\r\n```\r\n\r\n### Methods\r\n#### pause()\r\nPauses processing; this does not pause fetching.\r\n\r\n#### resume()\r\nResumes paused processing.\r\n\r\n## License\r\nMIT\r\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnimitzdev%2Ffetch-streamer","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fnimitzdev%2Ffetch-streamer","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnimitzdev%2Ffetch-streamer/lists"}