{"id":13540926,"url":"https://github.com/beakerbrowser/pauls-dat-api","last_synced_at":"2025-04-02T08:30:53.540Z","repository":{"id":57320832,"uuid":"81680965","full_name":"beakerbrowser/pauls-dat-api","owner":"beakerbrowser","description":"Library of functions that make working with dat / hyperdrive easier.","archived":true,"fork":false,"pushed_at":"2019-07-11T20:20:38.000Z","size":217,"stargazers_count":43,"open_issues_count":5,"forks_count":13,"subscribers_count":3,"default_branch":"master","last_synced_at":"2025-03-18T11:13:10.199Z","etag":null,"topics":["dat"],"latest_commit_sha":null,"homepage":"","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/beakerbrowser.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-02-11T20:41:25.000Z","updated_at":"2025-02-09T11:08:31.000Z","dependencies_parsed_at":"2022-08-26T01:10:59.950Z","dependency_job_id":null,"html_url":"https://github.com/beakerbrowser/pauls-dat-api","commit_stats":null,"previous_names":["pfrazee/pauls-dat-api"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/beakerbrowser%2Fpauls-dat-api","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/beakerbrowser%2Fpauls-dat-api/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/beakerbrowser%2Fpauls-dat-api/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/beakerbrowser%2Fpauls-dat-api/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/beakerbrowser","download_url":"https://codeload.github.com/beakerbro
wser/pauls-dat-api/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246418658,"owners_count":20773934,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["dat"],"created_at":"2024-08-01T10:00:36.124Z","updated_at":"2025-04-02T08:30:52.856Z","avatar_url":"https://github.com/beakerbrowser.png","language":"JavaScript","readme":"# pauls-dat-api\n\nA library of functions that make working with [dat](https://github.com/datproject/dat-node) / [hyperdrive](https://github.com/mafintosh/hyperdrive) easier.\nIncludes common operations, and some sugars.\nThese functions were factored out of [beaker browser](https://github.com/beakerbrowser/beaker)'s internal APIs.\n\nAll async methods work with callbacks and promises. If no callback is provided, a promise will be returned.\n\nAny time a hyperdrive `archive` is expected, a [scoped-fs](https://github.com/pfrazee/scoped-fs) instance can be provided, unless otherwise stated.\n\n```js\nvar hyperdrive = require('hyperdrive')\nvar ScopedFS = require('scoped-fs')\n\nvar archive = hyperdrive('./my-hyperdrive')\nvar scopedfs = new ScopedFS('./my-scoped-fs')\n\nawait pda.readFile(archive, '/hello.txt') // read the published hello.txt\nawait pda.readFile(scopedfs, '/hello.txt') // read the local hello.txt\n```\n\n** NOTE: this library is written natively for node 7 and above. 
**\n\nTo use with node versions lesser than 7 use:\n```js\nvar pda = require('pauls-dat-api/es5');\n```\n\n\u003c!-- START doctoc generated TOC please keep comment here to allow auto update --\u003e\n\u003c!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE --\u003e\n\n\n- [Lookup](#lookup)\n  - [stat(archive, name[, cb])](#statarchive-name-cb)\n- [Read](#read)\n  - [readFile(archive, name[, opts, cb])](#readfilearchive-name-opts-cb)\n  - [readdir(archive, path[, opts, cb])](#readdirarchive-path-opts-cb)\n  - [readSize(archive, path[, cb])](#readsizearchive-path-cb)\n- [Write](#write)\n  - [writeFile(archive, name, data[, opts, cb])](#writefilearchive-name-data-opts-cb)\n  - [mkdir(archive, name[, cb])](#mkdirarchive-name-cb)\n  - [copy(archive, sourceName, targetName[, cb])](#copyarchive-sourcename-targetname-cb)\n  - [rename(archive, sourceName, targetName[, cb])](#renamearchive-sourcename-targetname-cb)\n- [Delete](#delete)\n  - [unlink(archive, name[, cb])](#unlinkarchive-name-cb)\n  - [rmdir(archive, name[, opts, cb])](#rmdirarchive-name-opts-cb)\n- [Network](#network)\n  - [download(archive, name[, cb])](#downloadarchive-name-cb)\n- [Activity Streams](#activity-streams)\n  - [watch(archive[, path])](#watcharchive-path)\n  - [createNetworkActivityStream(archive)](#createnetworkactivitystreamarchive)\n- [Exporters](#exporters)\n  - [exportFilesystemToArchive(opts[, cb])](#exportfilesystemtoarchiveopts-cb)\n  - [exportArchiveToFilesystem(opts[, cb])](#exportarchivetofilesystemopts-cb)\n  - [exportArchiveToArchive(opts[, cb])](#exportarchivetoarchiveopts-cb)\n- [Manifest](#manifest)\n  - [readManifest(archive[, cb])](#readmanifestarchive-cb)\n  - [writeManifest(archive, manifest[, cb])](#writemanifestarchive-manifest-cb)\n  - [updateManifest(archive, manifest[, cb])](#updatemanifestarchive-manifest-cb)\n  - [generateManifest(opts)](#generatemanifestopts)\n- [Diff/Merge](#diffmerge)\n  - [diff(srcArchive, srcPath, dstArchive, dstPath[, opts, 
cb])](#diffsrcarchive-srcpath-dstarchive-dstpath-opts-cb)\n  - [merge(srcArchive, srcPath, dstArchive, dstPath[, opts, cb])](#mergesrcarchive-srcpath-dstarchive-dstpath-opts-cb)\n- [Helpers](#helpers)\n  - [findEntryByContentBlock(archive, block)](#findentrybycontentblockarchive-block)\n\n\u003c!-- END doctoc generated TOC please keep comment here to allow auto update --\u003e\n\n```js\nconst pda = require('pauls-dat-api')\n```\n\n## Lookup\n\n### stat(archive, name[, cb])\n\n - `archive` Hyperdrive archive (object).\n - `name` Entry name (string).\n - Returns a Hyperdrive Stat entry (object).\n - Throws NotFoundError\n\n```js\n// by name:\nvar st = await pda.stat(archive, '/dat.json')\nst.isDirectory()\nst.isFile()\nconsole.log(st) /* =\u003e\nStat {\n  dev: 0,\n  nlink: 1,\n  rdev: 0,\n  blksize: 0,\n  ino: 0,\n  mode: 16877,\n  uid: 0,\n  gid: 0,\n  size: 0,\n  offset: 0,\n  blocks: 0,\n  atime: 2017-04-10T18:59:00.147Z,\n  mtime: 2017-04-10T18:59:00.147Z,\n  ctime: 2017-04-10T18:59:00.147Z,\n  linkname: undefined } */\n```\n\n## Read\n\n### readFile(archive, name[, opts, cb])\n\n - `archive` Hyperdrive archive (object).\n - `name` Entry path (string).\n - `opts`. Options (object|string). If a string, will act as `opts.encoding`.\n - `opts.encoding` Desired output encoding (string). May be 'binary', 'utf8', 'hex', or 'base64'. 
Default 'utf8'.\n - Returns the content of the file in the requested encoding.\n - Throws NotFoundError, NotAFileError.\n\n```js\nvar manifestStr = await pda.readFile(archive, '/dat.json')\nvar imageBase64 = await pda.readFile(archive, '/favicon.png', 'base64')\n```\n\n### readdir(archive, path[, opts, cb])\n\n - `archive` Hyperdrive archive (object).\n - `path` Target directory path (string).\n - `opts.recursive` Read all subfolders and their files as well?\n - Returns an array of file and folder names.\n\n```js\nvar listing = await pda.readdir(archive, '/assets')\nconsole.log(listing) // =\u003e ['profile.png', 'styles.css']\n\nvar listing = await pda.readdir(archive, '/', { recursive: true })\nconsole.log(listing) /* =\u003e [\n  'index.html',\n  'assets',\n  'assets/profile.png',\n  'assets/styles.css'\n]*/\n```\n\n### readSize(archive, path[, cb])\n\n - `archive` Hyperdrive archive (object).\n - `path` Target directory path (string).\n - Returns a number (size in bytes).\n\nThis method will recurse on folders.\n\n```js\nvar size = await pda.readSize(archive, '/assets')\nconsole.log(size) // =\u003e 123\n```\n\n## Write\n\n### writeFile(archive, name, data[, opts, cb])\n\n - `archive` Hyperdrive archive (object).\n - `name` Entry path (string).\n - `data` Data to write (string|Buffer).\n - `opts`. Options (object|string). If a string, will act as `opts.encoding`.\n - `opts.encoding` Desired file encoding (string). May be 'binary', 'utf8', 'hex', or 'base64'. 
Default 'utf8' if `data` is a string, 'binary' if `data` is a Buffer.\n - Throws ArchiveNotWritableError, InvalidPathError, EntryAlreadyExistsError, ParentFolderDoesntExistError, InvalidEncodingError.\n\n```js\nawait pda.writeFile(archive, '/hello.txt', 'world', 'utf8')\nawait pda.writeFile(archive, '/profile.png', fs.readFileSync('/tmp/dog.png'))\n```\n\n### mkdir(archive, name[, cb])\n\n - `archive` Hyperdrive archive (object).\n - `name` Directory path (string).\n - Throws ArchiveNotWritableError, InvalidPathError, EntryAlreadyExistsError, ParentFolderDoesntExistError, InvalidEncodingError.\n\n```js\nawait pda.mkdir(archive, '/stuff')\n```\n\n### copy(archive, sourceName, targetName[, cb])\n\n - `archive` Hyperdrive archive (object).\n - `sourceName` Path to file or directory to copy (string).\n - `targetName` Where to copy the file or folder to (string).\n - Throws ArchiveNotWritableError, InvalidPathError, EntryAlreadyExistsError, ParentFolderDoesntExistError, InvalidEncodingError.\n\n```js\n// copy file:\nawait pda.copy(archive, '/foo.txt', '/foo.txt.back')\n// copy folder:\nawait pda.copy(archive, '/stuff', '/stuff-copy')\n```\n\n### rename(archive, sourceName, targetName[, cb])\n\n - `archive` Hyperdrive archive (object).\n - `sourceName` Path to file or directory to rename (string).\n - `targetName` What the file or folder should be named (string).\n - Throws ArchiveNotWritableError, InvalidPathError, EntryAlreadyExistsError, ParentFolderDoesntExistError, InvalidEncodingError.\n\nThis is equivalent to moving a file/folder.\n\n```js\n// move file:\nawait pda.rename(archive, '/foo.txt', '/foo.md')\n// move folder:\nawait pda.rename(archive, '/stuff', '/things')\n```\n\n## Delete\n\n### unlink(archive, name[, cb])\n\n - `archive` Hyperdrive archive (object).\n - `name` Entry path (string).\n - Throws ArchiveNotWritableError, NotFoundError, NotAFileError\n\n```js\nawait pda.unlink(archive, '/hello.txt')\n```\n\n### rmdir(archive, name[, opts, cb])\n\n - 
`archive` Hyperdrive archive (object).\n - `name` Entry path (string).\n - `opts.recursive` Delete all subfolders and files if the directory is not empty.\n - Throws ArchiveNotWritableError, NotFoundError, NotAFolderError, DestDirectoryNotEmpty\n\n```js\nawait pda.rmdir(archive, '/stuff', {recursive: true})\n```\n\n## Network\n\n### download(archive, name[, cb])\n\n - `archive` Hyperdrive archive (object). Cannot be a scoped-fs object.\n - `name` Entry path (string). Can point to a file or folder.\n\nDownload an archive file or folder-tree.\n\n```js\n// download a specific file:\nawait pda.download(archive, '/foo.txt')\n// download a specific folder and all children:\nawait pda.download(archive, '/bar/')\n// download the entire archive:\nawait pda.download(archive, '/')\n```\n\n## Activity Streams\n\n### watch(archive[, path])\n\n - `archive` Hyperdrive archive (object).\n - `path` Entry path (string) or [anymatch](https://npm.im/anymatch) pattern (array of strings). If falsy, will watch all files.\n - Returns a Readable stream.\n\nWatches the given path or path-pattern for file events, which it emits as an [emit-stream](https://github.com/substack/emit-stream). Supported events:\n\n - `['invalidated',{path}]` - The contents of the file have changed, but may not have been downloaded yet. `path` is the path-string of the file.\n - `['changed',{path}]` - The contents of the file have changed, and the new version is ready to read. `path` is the path-string of the file.\n\nAn archive will emit \"invalidated\" first, when it receives the new metadata for the file. It will then emit \"changed\" when the content arrives. 
(A local archive will emit \"invalidated\" immediately before \"changed.\")\n\n```js\nvar es = pda.watch(archive)\nvar es = pda.watch(archive, 'foo.txt')\nvar es = pda.watch(archive, ['**/*.txt', '**/*.md'])\n\nes.on('data', ([event, args]) =\u003e {\n  if (event === 'invalidated') {\n    console.log(args.path, 'has been invalidated')\n    pda.download(archive, args.path)\n  } else if (event === 'changed') {\n    console.log(args.path, 'has changed')\n  }\n})\n\n// alternatively, via emit-stream:\n\nvar emitStream = require('emit-stream')\nvar events = emitStream(pda.watch(archive))\nevents.on('invalidated', args =\u003e {\n  console.log(args.path, 'has been invalidated')\n  pda.download(archive, args.path)\n})\nevents.on('changed', args =\u003e {\n  console.log(args.path, 'has changed')\n})\n```\n\n### createNetworkActivityStream(archive)\n\n - `archive` Hyperdrive archive (object). Cannot be a scoped-fs object.\n - Returns a Readable stream.\n\nWatches the archive for network events, which it emits as an [emit-stream](https://github.com/substack/emit-stream). Supported events:\n\n - `['network-changed',{connections}]` - The number of connections has changed. `connections` is a number.\n - `['download',{feed,block,bytes}]` - A block has been downloaded. `feed` will either be \"metadata\" or \"content\". `block` is the index of data downloaded. `bytes` is the number of bytes in the block.\n - `['upload',{feed,block,bytes}]` - A block has been uploaded. `feed` will either be \"metadata\" or \"content\". `block` is the index of data uploaded. `bytes` is the number of bytes in the block.\n - `['sync',{feed}]` - All known blocks have been downloaded. 
`feed` will either be \"metadata\" or \"content\".\n\n```js\nvar es = pda.createNetworkActivityStream(archive)\n\nes.on('data', ([event, args]) =\u003e {\n  if (event === 'network-changed') {\n    console.log('Connected to %d peers', args.connections)\n  } else if (event === 'download') {\n    console.log('Just downloaded %d bytes (block %d) of the %s feed', args.bytes, args.block, args.feed)\n  } else if (event === 'upload') {\n    console.log('Just uploaded %d bytes (block %d) of the %s feed', args.bytes, args.block, args.feed)\n  } else if (event === 'sync') {\n    console.log('Finished downloading', args.feed)\n  }\n})\n\n// alternatively, via emit-stream:\n\nvar emitStream = require('emit-stream')\nvar events = emitStream(es)\nevents.on('network-changed', args =\u003e {\n  console.log('Connected to %d peers', args.connections)\n})\nevents.on('download', args =\u003e {\n  console.log('Just downloaded %d bytes (block %d) of the %s feed', args.bytes, args.block, args.feed)\n})\nevents.on('upload', args =\u003e {\n  console.log('Just uploaded %d bytes (block %d) of the %s feed', args.bytes, args.block, args.feed)\n})\nevents.on('sync', args =\u003e {\n  console.log('Finished downloading', args.feed)\n})\n```\n\n## Exporters\n\n### exportFilesystemToArchive(opts[, cb])\n\n - `opts.srcPath` Source path in the filesystem (string). Required.\n - `opts.dstArchive` Destination archive (object). Required.\n - `opts.dstPath` Destination path within the archive. Optional, defaults to '/'.\n - `opts.ignore` Files not to copy (array of strings). Optional. Uses [anymatch](https://npm.im/anymatch).\n - `opts.inplaceImport` Should import source directory in-place? (boolean). If true and importing a directory, this will cause the directory's content to be copied directly into the `dstPath`. If false, will cause the source-directory to become a child of the `dstPath`.\n - `opts.dryRun` Don't actually make changes, just list what changes will occur. 
Optional, defaults to `false`.\n - Returns stats on the export.\n\nCopies a file-tree into an archive.\n\n```js\nvar stats = await pda.exportFilesystemToArchive({\n  srcPath: '/tmp/mystuff',\n  dstArchive: archive,\n  inplaceImport: true\n})\nconsole.log(stats) /* =\u003e {\n  addedFiles: ['fuzz.txt', 'foo/bar.txt'],\n  updatedFiles: ['something.txt'],\n  removedFiles: [],\n  addedFolders: ['foo'],\n  removedFolders: [],\n  skipCount: 3, // files skipped due to the target already existing\n  fileCount: 3,\n  totalSize: 400 // bytes\n}*/\n```\n\n### exportArchiveToFilesystem(opts[, cb])\n\n - `opts.srcArchive` Source archive (object). Required.\n - `opts.dstPath` Destination path in the filesystem (string). Required.\n - `opts.srcPath` Source path within the archive. Optional, defaults to '/'.\n - `opts.ignore` Files not to copy (array of strings). Optional. Uses [anymatch](https://npm.im/anymatch).\n - `opts.overwriteExisting` Proceed if the destination isn't empty (boolean). Default false.\n - `opts.skipUndownloadedFiles` Ignore files that haven't been downloaded yet (boolean). Default false. If false, will wait for source files to download.\n - Returns stats on the export.\n\nCopies an archive into the filesystem.\n\nNOTE\n\n - Unlike exportFilesystemToArchive, this will not compare the target for equality before copying. If `overwriteExisting` is true, it will simply copy all files again.\n\n```js\nvar stats = await pda.exportArchiveToFilesystem({\n  srcArchive: archive,\n  dstPath: '/tmp/mystuff',\n  skipUndownloadedFiles: true\n})\nconsole.log(stats) /* =\u003e {\n  addedFiles: ['fuzz.txt', 'foo/bar.txt'],\n  updatedFiles: ['something.txt'],\n  fileCount: 3,\n  totalSize: 400 // bytes\n}*/\n```\n\n### exportArchiveToArchive(opts[, cb])\n\n - `opts.srcArchive` Source archive (object). Required.\n - `opts.dstArchive` Destination archive (object). Required.\n - `opts.srcPath` Source path within the source archive (string). 
Optional, defaults to '/'.\n - `opts.dstPath` Destination path within the destination archive (string). Optional, defaults to '/'.\n - `opts.ignore` Files not to copy (array of strings). Optional. Uses [anymatch](https://npm.im/anymatch).\n - `opts.skipUndownloadedFiles` Ignore files that haven't been downloaded yet (boolean). Default false. If false, will wait for source files to download.\n\nCopies an archive into another archive.\n\nNOTE\n\n - Unlike exportFilesystemToArchive, this will not compare the target for equality before copying. It copies files indiscriminately.\n\n```js\nvar stats = await pda.exportArchiveToArchive({\n  srcArchive: archiveA,\n  dstArchive: archiveB,\n  skipUndownloadedFiles: true\n})\nconsole.log(stats) /* =\u003e {\n  addedFiles: ['fuzz.txt', 'foo/bar.txt'],\n  updatedFiles: ['something.txt'],\n  fileCount: 3,\n  totalSize: 400 // bytes\n}*/\n```\n\n## Manifest\n\n### readManifest(archive[, cb])\n\n - `archive` Hyperdrive archive (object).\n\nA sugar to get the manifest object.\n\n```js\nvar manifestObj = await pda.readManifest(archive)\n```\n\n### writeManifest(archive, manifest[, cb])\n\n - `archive` Hyperdrive archive (object).\n - `manifest` Manifest values (object).\n\nA sugar to write the manifest object.\n\n```js\nawait pda.writeManifest(archive, { title: 'My dat!' })\n```\n\n### updateManifest(archive, manifest[, cb])\n\n - `archive` Hyperdrive archive (object).\n - `manifest` Manifest values (object).\n\nA sugar to modify the manifest object.\n\n```js\nawait pda.writeManifest(archive, { title: 'My dat!', description: 'the desc' })\nawait pda.updateManifest(archive, { title: 'My new title!' }) // preserves description\n```\n\n### generateManifest(opts)\n\n - `opts` Manifest options (object).\n\nHelper to generate a manifest object. 
Opts in detail:\n\n```\n{\n  url: String, the dat's url\n  title: String\n  description: String\n  type: Array\u003cString\u003e\n  author: String | Object{name: String, url: String}\n  links: Object\n  web_root: String\n  fallback_page: String\n}\n```\n\nSee: https://github.com/datprotocol/dat.json\n\n## Diff/Merge\n\n### diff(srcArchive, srcPath, dstArchive, dstPath[, opts, cb])\n\n - `srcArchive` Source archive (object). Required.\n - `srcPath` Source path within the source archive (string). Required.\n - `dstArchive` Destination archive (object). Required.\n - `dstPath` Destination path within the destination archive (string). Required.\n - `opts.shallow` Don't descend into changed folders (bool). Optional, default false.\n - `opts.compareContent`. Compare the content of the files, rather than the mtime and size. Optional, default false.\n - `opts.paths` Whitelist of files to diff (array\u003cstring\u003e). Optional.\n - `opts.ops` Whitelist of operations to include in the diff (array\u003cstring\u003e). Optional. Valid values are `'add'`, `'mod'`, and `'del'`.\n - Returns diff data.\n\nGet a list of differences between the two archives at the given paths.\n\n```js\nawait pda.diff(archiveA, '/', archiveB, '/')\nawait pda.diff(archiveA, '/', archiveB, '/', {shallow: false, compareContent: true})\nawait pda.diff(archiveA, '/', archiveB, '/', {paths: ['/foo', '/bar']})\nawait pda.diff(archiveA, '/', archiveB, '/', {ops: ['add']}) // additions only\n```\n\nOutput looks like:\n\n```\n[\n  {change: 'mod', type: 'file', path: '/hello.txt'},\n  {change: 'add', type: 'dir',  path: '/pics'},\n  {change: 'add', type: 'file', path: '/pics/kitty.png'},\n  {change: 'del', type: 'file', path: '/backup/hello.txt'},\n  {change: 'del', type: 'dir',  path: '/backup'},\n  {change: 'del', type: 'file', path: '/hello.txt'},\n]\n```\n\n### merge(srcArchive, srcPath, dstArchive, dstPath[, opts, cb])\n\n - `srcArchive` Source archive (object). 
Required.\n - `srcPath` Source path within the source archive (string). Required.\n - `dstArchive` Destination archive (object). Required.\n - `dstPath` Destination path within the destination archive (string). Required.\n - `opts.shallow` Don't descend into changed folders (bool). Optional, default false.\n - `opts.compareContent` Compare the content of the files, rather than the mtime and size. Optional, default false.\n - `opts.paths` Whitelist of files to merge (array\u003cstring\u003e). Optional.\n - `opts.ops` Whitelist of operations to include in the merge (array\u003cstring\u003e). Optional. Valid values are `'add'`, `'mod'`, and `'del'`.\n - Returns the changes applied.\n\nMerges the source archive into the destination archive at the given paths, causing `dstArchive` content to match `srcArchive`.\n\n```js\nawait pda.merge(archiveA, '/', archiveB, '/')\nawait pda.merge(archiveA, '/', archiveB, '/', {shallow: false, compareContent: true})\nawait pda.merge(archiveA, '/', archiveB, '/', {paths: ['/foo', '/bar']})\nawait pda.merge(archiveA, '/', archiveB, '/', {ops: ['add']}) // additions only\n```\n\nOutput looks like:\n\n```\n[\n  {change: 'mod', type: 'file', path: '/hello.txt'},\n  {change: 'add', type: 'dir',  path: '/pics'},\n  {change: 'add', type: 'file', path: '/pics/kitty.png'},\n  {change: 'del', type: 'file', path: '/backup/hello.txt'},\n  {change: 'del', type: 'dir',  path: '/backup'},\n  {change: 'del', type: 'file', path: '/hello.txt'},\n]\n```\n\n## Helpers\n\n### findEntryByContentBlock(archive, block)\n\n - `archive` Hyperdrive archive (object).\n - `block` Content-block index (number).\n - Returns a Promise for `{name:, start:, end:}`\n\nRuns a binary search to find the file-entry that the given content-block index belongs to.\n\n```js\nawait pda.findEntryByContentBlock(archive, 5)\n/* =\u003e {\n  name: '/foo.txt',\n  start: 4,\n  end: 6\n}*/\n```\n","funding_links":[],"categories":["Using Dat"],"sub_categories":["High-Level 
APIs"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbeakerbrowser%2Fpauls-dat-api","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fbeakerbrowser%2Fpauls-dat-api","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbeakerbrowser%2Fpauls-dat-api/lists"}