{"id":15646447,"url":"https://github.com/martinheidegger/block-cache","last_synced_at":"2025-06-25T07:34:59.647Z","repository":{"id":143861180,"uuid":"123858338","full_name":"martinheidegger/block-cache","owner":"martinheidegger","description":"block-cache is a transparent(ish) cache that keeps data split in blocks in an in-memory lru-cache. This is useful if you want to process a file, reusing previously downloaded parts and improving the general performance without caching more than your given memory limit.","archived":false,"fork":false,"pushed_at":"2018-03-06T16:02:40.000Z","size":141,"stargazers_count":5,"open_issues_count":0,"forks_count":2,"subscribers_count":2,"default_branch":"master","last_synced_at":"2025-06-01T00:47:16.675Z","etag":null,"topics":["block","cache","dat","fs","nodejs","read"],"latest_commit_sha":null,"homepage":null,"language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/martinheidegger.png","metadata":{"files":{"readme":"Readme.md","changelog":"CHANGELOG.md","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-03-05T03:16:05.000Z","updated_at":"2023-06-04T08:25:07.000Z","dependencies_parsed_at":null,"dependency_job_id":"ccc09df8-82b3-4e96-9e7e-ea59d6081219","html_url":"https://github.com/martinheidegger/block-cache","commit_stats":null,"previous_names":[],"tags_count":2,"template":false,"template_full_name":null,"purl":"pkg:github/martinheidegger/block-cache","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/martinheidegger%2Fblock-cache","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mart
inheidegger%2Fblock-cache/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/martinheidegger%2Fblock-cache/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/martinheidegger%2Fblock-cache/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/martinheidegger","download_url":"https://codeload.github.com/martinheidegger/block-cache/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/martinheidegger%2Fblock-cache/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":261827574,"owners_count":23215771,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["block","cache","dat","fs","nodejs","read"],"created_at":"2024-10-03T12:12:56.552Z","updated_at":"2025-06-25T07:34:59.606Z","avatar_url":"https://github.com/martinheidegger.png","language":"JavaScript","readme":"# block-cache\n\n[![Build Status](https://travis-ci.org/martinheidegger/block-cache.svg?branch=master)](https://travis-ci.org/martinheidegger/block-cache)\n[![JavaScript Style Guide](https://img.shields.io/badge/code_style-standard-brightgreen.svg)](https://standardjs.com)\n[![Maintainability](https://api.codeclimate.com/v1/badges/16ad2e5bd41ce529ae97/maintainability)](https://codeclimate.com/github/martinheidegger/block-cache/maintainability)\n[![Test Coverage](https://api.codeclimate.com/v1/badges/16ad2e5bd41ce529ae97/test_coverage)](https://codeclimate.com/github/martinheidegger/block-cache/test_coverage)\n\n`block-cache` is a transparent(ish) cache that 
keeps data split in blocks in\nan in-memory lru-cache. This is useful if you want to process a file, reusing\npreviously downloaded parts and improving the general performance without\ncaching more than your given memory limit.\n\nThe cache does not expose the passed-in API at any point, which makes it\nsuitable as a sandbox.\n\n`npm i block-cache --save`\n\n## Usage\n\nThe API of `block-cache` is comparable to the\n[`fs`](https://nodejs.org/api/fs.html) API, but all callbacks are optional and,\nif omitted, a `Promise` is returned instead.\n\nHere is a simple example of reading a file into the local cache.\n\n```javascript\nconst fs = require('fs')\nconst {Cache, CachedFile} = require('block-cache')\n\nconst cache = new Cache(fs, {\n  blkSize: 1024,\n  cacheSize: 2 * 1024 * 1024 // 2 MB\n})\nconst fp = await cache.open('./Readme.md')\nconst data = await cache.read(fp)\n\nconsole.log(data)\n\nawait cache.close(fp)\n```\n\nThis example reads the entirety of the `./Readme.md` file into a 2-megabyte\ncache in 1-kilobyte blocks and then closes the file pointer. Even after the fp\nis closed, the blocks stay in the cache!\n\n## Use-case: file parsing\n\nThis library usually comes into play when you have to parse parts of a file\ndepending on the header. Take the beginning of this GIF parser for example:\n\n```javascript\nconst fs = require('fs')\nconst {Cache, CachedFile} = require('block-cache')\n\nconst cache = new Cache(fs, {\n  blkSize: 1024,\n  cacheSize: 2 * 1024 * 1024 // 2 MB\n})\nconst fp = await cache.open('./image.gif')\nconst signature = (await fp.read(null, 0, 6)).toString()\nif (signature === 'GIF87a' || signature === 'GIF89a') {\n  const packed = await fp.read(null, 0, 10)\n  // etc.\n}\n\nawait cache.close(fp)\n```\n\nAs you can see in this example code, only small parts of the file need to be\nread at a time. But most of those bytes are already\npresent in the cache. 
So, while the first operation needs to read 1 kB of\nthe file, the second operation can be served from the cached data.\n\n## API\n\n- [`Cache`](#Cache)\n    - [`.open`](#cache.open)\n    - [`.close`](#cache.close)\n    - [`.disconnect`](#cache.disconnect)\n    - [`.openSync`](#cache.openSync)\n    - [`.read`](#cache.read)\n    - [`.createReadStream`](#cache.createReadStream)\n    - [`DEFAULT_CACHE_SIZE`](#Cache.DEFAULT_CACHE_SIZE)\n- [`CachedFile`](#CachedFile)\n    - [`.close`](#cachedFile.close)\n    - [`.read`](#cachedFile.read)\n    - [`.createReadStream`](#cachedFile.createReadStream)\n    - [`.size`](#cachedFile.size)\n    - [`.stat`](#cachedFile.stat)\n    - [`DEFAULT_BLK_SIZE`](#CachedFile.DEFAULT_BLK_SIZE)\n\n---\n\n\u003ca name=\"Cache\"\u003e\u003c/a\u003e\n\n```javascript\nnew Cache(fs[, opts])\n```\n\n- `fs` is a [FileSystem](https://nodejs.org/api/fs.html) (`require('fs')`) or\n    [Hyperdrive](https://github.com/mafintosh/hyperdrive) archive (object).\n- `opts.cache` is a [`lru-cache`](https://github.com/isaacs/node-lru-cache)\n    instance (object, optional).\n- `opts.cacheSize` is the size of the lru-cache to be created in case\n    `opts.cache` is missing. Defaults to\n    [`Cache.DEFAULT_CACHE_SIZE`](#Cache.DEFAULT_CACHE_SIZE) (integer).\n- `opts.blkSize` is the default size in bytes of a cache-block. Defaults to\n    [`CachedFile.DEFAULT_BLK_SIZE`](#CachedFile.DEFAULT_BLK_SIZE). (integer)\n- `opts.prefix` is an optional prefix that can be added to the cached data,\n    useful if you want to reuse the same `opts.cache` for multiple\n    `Cache` instances. Defaults to `''`. (string)\n\n---\n\n\u003ca name=\"cache.open\"\u003e\u003c/a\u003e\n\n```javascript\ncache.open(path[, opts, cb])\n```\n\nCreates a cached file pointer reference for a given path. Note: it will open\nthe file reference in `r` mode.\n\n- `path` is the path to read the file from (string).\n- `opts.blkSize` is the size in bytes of a cache-block. 
Defaults to the\n    `opts.blkSize` defined in the `Cache`.\n- `cb(Error, CachedFile)` is an optional async callback handler method.\n    The method will return a `Promise` if the callback is not defined.\n\n---\n\n\u003ca name=\"cache.close\"\u003e\u003c/a\u003e\n\n```javascript\ncache.close(fp[, cb])\n```\n\nCloses a created file pointer reference. After closing, future requests\non the `CachedFile` will result in an `err.code === 'ERR_CLOSED'` error.\n\n- `fp` is a [`CachedFile`](#CachedFile) instance, created\n    with [`.open`](#cache.open) or [`.openSync`](#cache.openSync).\n- `cb(Error)` is an optional async callback handler method.\n    The method will return a `Promise` if the callback is not defined.\n\n---\n\n\u003ca name=\"cache.disconnect\"\u003e\u003c/a\u003e\n\n```javascript\ncache.disconnect()\n```\n\nDisconnects the cache from the file system instance. Any future operations on\nthe `Cache` or on `CachedFile` instances created with the `Cache` will result in\nan `err.code === 'ERR_DISCONNECTED'` error. Disconnect also closes all open\nfile pointer references on the underlying file system.\n\n---\n\n\u003ca name=\"cache.openSync\"\u003e\u003c/a\u003e\n\n```javascript\ncache.openSync(path[, opts])\n```\n\nLike `cache.open` but synchronous.\n\n---\n\n\u003ca name=\"cache.read\"\u003e\u003c/a\u003e\n\n```javascript\ncache.read(fd[, buffer, offset, length, position, cb])\n```\n\nReads the content of an opened file into a given buffer.\n\n- `fd` is a [`CachedFile`](#CachedFile) instance, created\n    with [`.open`](#cache.open) or [`.openSync`](#cache.openSync).\n- `buffer` is a [`Buffer`](https://nodejs.org/api/buffer.html) instance to\n    write into. 
Unlike the Node API, this is optional, which means that the\n    reader will create a buffer instance if `null` or `undefined` is passed in.\n- `offset` is the offset in the buffer to start writing at.\n- `length` is an integer specifying the number of bytes to read into the buffer,\n    defaults to the length of the file (integer).\n- `position` is an argument specifying where to begin reading from in the file.\n    The file descriptor will remember the end of the last read in the\n    `fd.position` property. It defaults to 0.\n- `cb(Error, Buffer)` is an optional async callback handler method. The method\n    will return a `Promise` if the callback is not defined.\n\n---\n\n\u003ca name=\"cache.createReadStream\"\u003e\u003c/a\u003e\n\n```javascript\ncache.createReadStream(path[, opts, cb])\n```\n\nCreates a cached file pointer reference for a given path and then reads it\nthrough a stream.\n\n- `path` is the path to read the file from (string).\n- `opts.blkSize` is the block size for each block to be cached. Defaults\n    to [`cache.opts.blkSize`](#Cache). (integer)\n- `opts.start` is the position in the file from which to start reading. Defaults to 0. (integer)\n- `opts.end` is the position in the file at which to stop reading. Defaults to the end of\n    the file. (integer)\n\n---\n\n\u003ca name=\"Cache.DEFAULT_CACHE_SIZE\"\u003e\u003c/a\u003e\n\n```javascript\nCache.DEFAULT_CACHE_SIZE\n```\n\nThe default size of the cache created if `opts.cache` is not passed in: 10485760\n(integer, equals 10 megabytes).\n\n---\n\n\u003ca name=\"CachedFile\"\u003e\u003c/a\u003e\n\n```javascript\nnew CachedFile(cache, path[, opts])\n```\n\nCreates a new instance for reading one file. The blocks will still be stored in\nthe passed-in `cache` object. 
While it is possible to instantiate a new\n`CachedFile`, you cannot pass in a cache directly; use\n[`.open`](#cache.open), [`.openSync`](#cache.openSync) or\n[`.createReadStream`](#cache.createReadStream) to interact with the cache.\n\n- `cacheInternal` is a subset of the `Cache` API that is not accessible from\n    outside.\n- `cacheInternal.open(path, opts, cb)` opens a file pointer to a given `path`\n    on the underlying `fs`.\n- `cacheInternal.stat(path, cb)` retrieves the `stat` information from the\n    underlying `fs`.\n- `cacheInternal.close(fp, cb)` closes a file pointer on the underlying `fs`.\n- `cacheInternal.read(fp, prefix, start, end, cb)` reads bytes from the\n    underlying `fs` into a buffer.\n- `opts.blkSize` specifies the block size for this file pointer (integer).\n    Defaults to [`CachedFile.DEFAULT_BLK_SIZE`](#CachedFile.DEFAULT_BLK_SIZE).\n\n---\n\n\u003ca name=\"cachedFile.close\"\u003e\u003c/a\u003e\n\n```javascript\ncachedFile.close([cb])\n```\n\nCloses the instance. 
After closing, future requests\non the `CachedFile` will result in an `err.code === 'ERR_CLOSED'` error.\n\n- `cb(Error)` is an optional async callback handler method.\n    The method will return a `Promise` if the callback is not defined.\n\n---\n\n\u003ca name=\"cachedFile.read\"\u003e\u003c/a\u003e\n\n```javascript\ncachedFile.read([buffer, offset, length, position, cb])\n```\n\nLike [`cache.read`](#cache.read) but without the need to pass a descriptor.\n\n---\n\n\u003ca name=\"cachedFile.createReadStream\"\u003e\u003c/a\u003e\n\n```javascript\ncachedFile.createReadStream([opts, cb])\n```\n\nLike [`cache.createReadStream`](#cache.createReadStream) but without the need\nto pass a descriptor.\n\n---\n\n\u003ca name=\"cachedFile.size\"\u003e\u003c/a\u003e\n\n```javascript\ncachedFile.size([cb])\n```\n\nThe size of the file as noted in the file descriptor.\n\n---\n\n\u003ca name=\"cachedFile.stat\"\u003e\u003c/a\u003e\n\n```javascript\ncachedFile.stat([cb])\n```\n\nRetrieves the actual\n[`Stats`](https://nodejs.org/api/fs.html#fs_class_fs_stats) of the file\nthrough [`fs.stat`](https://nodejs.org/api/fs.html#fs_class_fs_stats).\n\n---\n\n\u003ca name=\"CachedFile.DEFAULT_BLK_SIZE\"\u003e\u003c/a\u003e\n\n```javascript\nCachedFile.DEFAULT_BLK_SIZE\n```\n\nThe default `opts.blkSize` used for caching: 512 (integer, equals 512 bytes).\n\n## Acknowledgement\n\nThis project was made for and supported by [dotloom](https://github.com/dotloom).\n\n## License\n\nMIT\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmartinheidegger%2Fblock-cache","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmartinheidegger%2Fblock-cache","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmartinheidegger%2Fblock-cache/lists"}