{"id":13632341,"url":"https://github.com/appy-one/acebase","last_synced_at":"2025-10-13T22:32:01.576Z","repository":{"id":38303553,"uuid":"137869753","full_name":"appy-one/acebase","owner":"appy-one","description":"A fast, low memory, transactional, index \u0026 query enabled NoSQL database engine and server for node.js and browser with realtime data change notifications","archived":false,"fork":false,"pushed_at":"2025-09-06T10:16:42.000Z","size":6613,"stargazers_count":514,"open_issues_count":17,"forks_count":30,"subscribers_count":17,"default_branch":"master","last_synced_at":"2025-09-12T16:56:44.840Z","etag":null,"topics":["acebase","angular","browser","database","electron","firebase","indexeddb","javascript","nodejs","nosql","nosql-database","react","realtime","realtime-database","svelte","typescript"],"latest_commit_sha":null,"homepage":"","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/appy-one.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null},"funding":{"github":"appy-one"}},"created_at":"2018-06-19T09:28:07.000Z","updated_at":"2025-09-10T07:44:34.000Z","dependencies_parsed_at":"2024-01-15T00:16:54.656Z","dependency_job_id":"3e400c5a-3da4-4702-92c1-a4b65058fa1f","html_url":"https://github.com/appy-one/acebase","commit_stats":{"total_commits":960,"total_committers":8,"mean_commits":120.0,"dds":0.375,"last_synced_commit":"4b21bac27f6f409db0ffb37628c5db31b625deef"},"previous_names":[],"tags_count":42,"template":false,"template_full_name":null,"purl":"pkg:github/appy-one/acebase","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/reposi
tories/appy-one%2Facebase","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/appy-one%2Facebase/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/appy-one%2Facebase/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/appy-one%2Facebase/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/appy-one","download_url":"https://codeload.github.com/appy-one/acebase/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/appy-one%2Facebase/sbom","scorecard":{"id":204245,"data":{"date":"2025-08-11","repo":{"name":"github.com/appy-one/acebase","commit":"35147f4a68fe7ff5c35256f44cbaec2d4a4d063b"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":1.7,"checks":[{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"Code-Review","score":0,"reason":"Found 1/19 approved changesets -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Token-Permissions","score":-1,"reason":"No tokens found","details":null,"documentation":{"short":"Determines if the project's workflows follow the principle of least privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Maintained","score":0,"reason":"0 commit(s) 
and 0 issue activity found in the last 90 days -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Dangerous-Workflow","score":-1,"reason":"no workflows found","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Pinned-Dependencies","score":-1,"reason":"no dependencies found","details":null,"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security 
policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"License","score":10,"reason":"license file detected","details":["Info: project has a license file: LICENSE:0","Info: FSF or OSI recognized license: MIT License: LICENSE:0"],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"Signed-Releases","score":-1,"reason":"no releases found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":0,"reason":"branch protection not enabled on development/release branches","details":["Warn: branch protection not enabled for branch 'master'"],"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 18 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}},{"name":"Vulnerabilities","score":0,"reason":"15 existing vulnerabilities detected","details":["Warn: Project is 
vulnerable to: GHSA-v6h2-p8h4-qcjw","Warn: Project is vulnerable to: GHSA-grv7-fg5c-xmjg","Warn: Project is vulnerable to: GHSA-x9w5-v3q2-3rhw","Warn: Project is vulnerable to: GHSA-3xgq-45jj-v275","Warn: Project is vulnerable to: GHSA-434g-2637-qmqr","Warn: Project is vulnerable to: GHSA-49q7-c7j4-3p7m","Warn: Project is vulnerable to: GHSA-977x-g7h5-7qgw","Warn: Project is vulnerable to: GHSA-f7q4-pwc6-w24p","Warn: Project is vulnerable to: GHSA-fc9h-whq2-v747","Warn: Project is vulnerable to: GHSA-vjh7-7g9h-fjfh","Warn: Project is vulnerable to: GHSA-952p-6rrq-rcjv","Warn: Project is vulnerable to: GHSA-h7cp-r72f-jxh6","Warn: Project is vulnerable to: GHSA-v62p-rq8g-8h59","Warn: Project is vulnerable to: GHSA-c2qf-rxjj-qqgw","Warn: Project is vulnerable to: GHSA-j8xg-fqg3-53r7"],"documentation":{"short":"Determines if the project has open, known unfixed vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}}]},"last_synced_at":"2025-08-16T23:21:32.684Z","repository_id":38303553,"created_at":"2025-08-16T23:21:32.684Z","updated_at":"2025-08-16T23:21:32.684Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":279017160,"owners_count":26085983,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-13T02:00:06.723Z","response_time":61,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ace
base","angular","browser","database","electron","firebase","indexeddb","javascript","nodejs","nosql","nosql-database","react","realtime","realtime-database","svelte","typescript"],"created_at":"2024-08-01T22:03:00.581Z","updated_at":"2025-10-13T22:32:01.566Z","avatar_url":"https://github.com/appy-one.png","language":"TypeScript","readme":"\u003cp align=\"center\"\u003e\n    \u003cimg src=\"https://github.com/appy-one/acebase/blob/806bc6e78920c1d735ae96fc8eb818f338ceb00a/logo.png?raw=true\" alt=\"AceBase realtime database\"\u003e\n\u003c/p\u003e\n\n# AceBase realtime database\n\nA fast, low memory, transactional, index \u0026 query enabled NoSQL database engine and server for node.js and browser with realtime data change notifications. Supports storing of JSON objects, arrays, numbers, strings, booleans, dates, bigints and binary (ArrayBuffer) data.\n\nInspired by (and largely compatible with) the Firebase realtime database, with additional functionality and less data sharding/duplication. 
Capable of storing up to 2^48 (281 trillion) object nodes in a binary database file that can theoretically grow to a max filesize of 8 petabytes.\n\nAceBase is easy to set up and runs anywhere: in the cloud, NAS, local server, your PC/Mac, Raspberry Pi, the [browser](#running-acebase-in-the-browser), wherever you want.\n\n🔥 Check out the new [live data proxy](#realtime-synchronization-with-a-live-data-proxy) feature that allows your app to use and update live database values using in-memory objects and **no additional db coding**!\n\n## Table of contents\n\n* [Getting started](#getting-started)\n    * [Prerequisites](#prerequisites)\n    * [Installing](#installing)\n    * [Example usage](#example-usage)\n    * [Creating a database](#creating-a-database)\n* Loading and storing data\n    * [Loading data](#loading-data)\n    * [Storing data](#storing-data)\n    * [Updating data](#updating-data)\n    * [Transactional updating](#transactional-updating)\n    * [Removing data](#removing-data)\n    * [Generating unique keys](#generating-unique-keys)\n    * [Using arrays](#using-arrays)\n    * [Counting children](#counting-children)\n    * [Limit nested data loading](#limit-nested-data-loading)\n    * [Iterating (streaming) children](#iterating-streaming-children)\n    * [Asserting data types in TypeScript](#asserting-data-types-in-typescript)\n* Realtime monitoring\n    * [Monitoring realtime data changes](#monitoring-realtime-data-changes)\n    * [Using variables and wildcards in subscription paths](#using-variables-and-wildcards-in-subscription-paths)\n    * [Notify only events](#notify-only-events)\n    * [Wait for events to activate](#wait-for-events-to-activate)\n    * [Get triggering context of events](#get-triggering-context-of-events)\n    * [Change tracking using \"mutated\" and \"mutations\" events](#change-tracking-using-mutated-and-mutations-events)\n    * [Observe realtime value changes](#observe-realtime-value-changes)\n    * [Realtime synchronization with a 
live data proxy](#realtime-synchronization-with-a-live-data-proxy)\n    * [Using proxy methods in Typescript](#using-proxy-methods-in-typescript)\n* Queries\n    * [Querying data](#querying-data)\n    * [Limiting query result data](#limiting-query-result-data)\n    * [Removing data with a query](#removing-data-with-a-query)\n    * [Counting query results](#counting-query-results)\n    * [Checking query result existence](#checking-query-result-existence)\n    * [Streaming query results](#streaming-query-results)\n    * [Realtime queries](#realtime-queries)\n* Indexes\n    * [Indexing data](#indexing-data)\n    * [Indexing scattered data with wildcards](#indexing-scattered-data-with-wildcards)\n    * [Include additional data in indexes](#include-additional-data-in-indexes)\n    * [Other indexing options](#other-indexing-options)\n    * [Special indexes](#special-indexes)\n    * [Array indexes](#array-indexes)\n    * [Fulltext indexes](#fulltext-indexes)\n    * [Geo indexes](#geo-indexes)\n* Schemas (NEW v1.3.0)\n    * [Validating data with schemas](#schemas)\n    * [Adding schemas to enforce data rules](#adding-schemas-to-enforce-data-rules)\n    * [Schema Examples](#schema-examples)\n* Class mappings (ORM)\n    * [Mapping data to custom classes](#mapping-data-to-custom-classes)\n* Data storage options\n    * [AceBase data storage engine](#storage)\n    * [Using SQLite or MSSQL storage](#using-a-sqllite-or-mssql-backend)\n    * [AceBase in the browser](#running-acebase-in-the-browser)\n    * [Using CustomStorage](#using-a-customstorage-backend)\n* Reflect API\n    * [Introduction](#reflect-api)\n    * [Get information about a node](#get-information-about-a-node)\n    * [Get children of a node](#get-children-of-a-node)\n* Importing and Exporting data\n    * [Export API usage](#export-api)\n    * [Import API usage](#import-api)\n* Transaction Logging\n    * [Info](#transaction-logging)\n* Multi-process support\n    * [Info](#multi-process-support)\n* [CommonJS and ESM 
packages](#commonjs-and-esm-packages)\n* [Upgrade notices](#upgrade-notices)\n* [Known issues](#known-issues)\n* [Authors](#authors)\n* [Contributing](#contributing)\n* [Sponsoring](#sponsoring)\n\n## Getting Started\n\nAceBase is split up into multiple packages:\n* **acebase**: local AceBase database engine ([github](https://github.com/appy-one/acebase), [npm](https://www.npmjs.com/package/acebase))\n* **acebase-server**: AceBase server endpoint to enable remote connections. Includes built-in user authentication and authorization, supports using external OAuth providers such as Facebook and Google ([github](https://github.com/appy-one/acebase-server), [npm](https://www.npmjs.com/package/acebase-server)).\n* **acebase-client**: client to connect to an external AceBase server ([github](https://github.com/appy-one/acebase-client), [npm](https://www.npmjs.com/package/acebase-client))\n* **acebase-core**: shared functionality, dependency of the above packages ([github](https://github.com/appy-one/acebase-core), [npm](https://www.npmjs.com/package/acebase-core))\n\nAceBase uses [semver](https://semver.org/) versioning to prevent breaking changes from impacting older code.\nPlease report any errors / unexpected behaviour you encounter by creating an issue on GitHub.\n\n\n### Prerequisites\n\nAceBase is designed to run in a [Node.js](https://nodejs.org/) environment, as it (by default) requires the 'fs' filesystem to store its data and indexes. However, since v0.9.0 **it is now also possible to use AceBase databases in the browser**! To run AceBase in the browser, simply include a single script file and you're good to go! See [AceBase in the browser](#running-acebase-in-the-browser) for more info and code samples!\n\n### Installing\n\nAll AceBase repositories are available through npm. 
You only have to install one of them, depending on your needs:\n\n### Create a local database\nIf you want to use a **local AceBase database** in your project, install the [acebase](https://github.com/appy-one/acebase) package.\n\n```sh\nnpm install acebase\n```\nThen, create (open) your database:\n```js\nconst { AceBase } = require('acebase');\nconst db = new AceBase('my_db'); // nodejs\n// OR: const db = AceBase.WithIndexedDB('my_db'); // browser\ndb.ready(() =\u003e {\n    // Do stuff\n});\n```\n\n### Try AceBase in your browser\n\nIf you want to try out AceBase running in Node.js, simply open it in [RunKit](https://npm.runkit.com/acebase) and follow along with the examples. If you want to try out the browser version of AceBase, open [google.com](https://google.com) in a new tab (GitHub doesn't allow cross-site scripts to be loaded) and run the code snippet below to use it in your browser console immediately.\n\n*To try AceBase in RunKit:*\n```js\nconst { AceBase } = require('acebase');\nconst db = new AceBase('mydb');\n\nawait db.ref('test').set({ text: 'This is my first AceBase test in RunKit' });\n\nconst snap = await db.ref('test/text').get();\nconsole.log(`value of \"test/text\": ` + snap.val());\n```\n\n*To try AceBase in the browser console:*\n```js\nawait fetch('https://cdn.jsdelivr.net/npm/acebase@latest/dist/browser.min.js')\n    .then(response =\u003e response.text())\n    .then(text =\u003e eval(text));\nif (!AceBase) { throw 'AceBase not loaded!'; }\n\nvar db = AceBase.WithIndexedDB('mydb');\nawait db.ref('test').set({ text: 'This is my first AceBase test in the browser' });\n\nconst snap = await db.ref('test/text').get();\nconsole.log(`value of \"test/text\": ` + snap.val());\n```\n\n### Set up a database server\nIf you want to set up an **AceBase server**, install [acebase-server](https://github.com/appy-one/acebase-server).\n\n```sh\nnpm install acebase-server\n```\nThen, start your server (`server.js`):\n```js\nconst { AceBaseServer } = 
require('acebase-server');\nconst server = new AceBaseServer('my_server_db', { /* server config */ });\nserver.ready(() =\u003e {\n    // Server running\n});\n```\n\n### Connect to a remote database\nIf you want to connect to a remote (or local) AceBase server, install [acebase-client](https://github.com/appy-one/acebase-client).\n\n```sh\nnpm install acebase-client\n```\nThen, connect to your AceBase server:\n```js\nconst { AceBaseClient } = require('acebase-client');\nconst db = new AceBaseClient({ /* connection config */ });\ndb.ready(() =\u003e {\n    // Connected!\n});\n```\n\n## Example usage\n\nThe API is similar to that of the Firebase realtime database, with additions.\n\n### Creating a database\n\nCreating a new database is as simple as opening it. If the database file doesn't exist, it will be created automatically.\n\n```javascript\nconst { AceBase } = require('acebase');\nconst options = { logLevel: 'log', storage: { path: '.' } }; // optional settings\nconst db = new AceBase('mydb', options);  // Creates or opens a database with name \"mydb\"\n\ndb.ready(() =\u003e {\n    // database is ready to use!\n});\n```\n\nNOTE: The `logLevel` option specifies how much info should be written to the console logs. Possible values are: `'verbose'`, `'log'` (default), `'warn'` and `'error'` (only errors are logged).\n\n### Loading data\n\nRun `.get` on a reference to get the currently stored value. This is short for the Firebase syntax of `.once(\"value\")`.\n\n```javascript\nconst snapshot = await db.ref('game/config').get();\nif (snapshot.exists()) {\n    config = snapshot.val();\n}\nelse {\n    config = defaultGameConfig; // use defaults\n}\n```\n\nNote: When loading data, the currently stored value will be wrapped and returned in a `DataSnapshot` object. Use `snapshot.exists()` to determine if the node exists, and `snapshot.val()` to get the value. 
\n\n### Storing data\n\nSetting the value of a node, overwriting it if it exists:\n\n```javascript\nconst ref = await db.ref('game/config').set({\n    name: 'Name of the game',\n    max_players: 10\n});\n// stored at /game/config\n```\n\nNote: When storing data, it doesn't matter whether the target path and/or parent paths already exist. If you store data in _'chats/somechatid/messages/msgid/receipts'_, it will create any nonexistent node in that path.\n\n### Updating data\n\nUpdating the value of a node merges the stored value with the new object. If the target node doesn't exist, it will be created with the passed value.\n\n```javascript\nconst ref = await db.ref('game/config').update({\n    description: 'The coolest game in the history of mankind'\n});\n\n// config was updated, now get the value (ref points to 'game/config')\nconst snapshot = await ref.get();\nconst config = snapshot.val();\n\n// `config` now has properties \"name\", \"max_players\" and \"description\"\n```\n\n### Transactional updating\n\nIf you want to update data based upon its current value, and you want to make sure the data is not changed in between your `get` and `update`, use `transaction`. A transaction gets the current value and runs your callback with a snapshot. The value you return from the callback will be used to overwrite the node. Returning `null` will remove the entire node; returning nothing (`undefined`) will cancel the transaction.\n\n```javascript\nawait db.ref('accounts/some_account')\n.transaction(snapshot =\u003e {\n    // some_account is locked until its new value is returned by this callback\n    var account = snapshot.val();\n    if (!snapshot.exists()) {\n        // Create it\n        account = {\n            balance: 0\n        };\n    }\n    account.balance *= 1.02;    // Add 2% interest\n    return account; // accounts/some_account will be set to the return value\n});\n```\n\nNote: `transaction` loads the value of a node including ALL child objects. 
If the node you want to run a transaction on has a large value (eg many nested child objects), you might want to run the transaction on a subnode instead. If that is not possible, consider structuring your data differently.\n\n```javascript\n// Run transaction on balance only, reduces amount of data being loaded, transferred, and overwritten\ndb.ref('accounts/some_account/balance')\n.transaction(snapshot =\u003e {\n    var balance = snapshot.val();\n    if (balance === null) { // snapshot.exists() === false\n        balance = 0;\n    }\n    return balance * 1.02;    // Add 2% interest\n});\n```\n\n### Removing data\n\nYou can remove data with the `remove` method:\n\n```javascript\ndb.ref('animals/dog')\n.remove()\n.then(() =\u003e { /* removed successfully */ });\n```\n\nRemoving data can also be done by setting or updating its value to `null`. Any property that has a null value will be removed from the parent object node.\n\n```javascript\n// Remove by setting it to null\ndb.ref('animals/dog')\n.set(null)\n.then(ref =\u003e { /* dog property removed */ });\n\n// Or, update its parent with a null value for the 'dog' property\ndb.ref('animals')\n.update({ dog: null })\n.then(ref =\u003e { /* dog property removed */ });\n```\n\n### Generating unique keys\n\nFor all generic data you add, you need to create keys that are unique and won't clash with keys generated by other clients. To do this, you can have unique keys generated with `push`. Under the hood, `push` uses [cuid](https://www.npmjs.com/package/cuid) to generate keys that are guaranteed to be unique and time-sortable.\n\n```javascript\ndb.ref('users')\n.push({\n    name: 'Ewout',\n    country: 'The Netherlands'\n})\n.then(userRef =\u003e {\n    // user is saved, userRef points to something \n    // like 'users/jld2cjxh0000qzrmn831i7rn'\n});\n```\n\nThe above example generates the unique key and stores the object immediately. You can also choose to have the key generated, but store the value later. 
\n\n```javascript\nconst postRef = db.ref('posts').push();\nconsole.log(`About to add a new post with key \"${postRef.key}\"..`);\n// ... do stuff ...\npostRef.set({\n    title: 'My first post'\n})\n.then(ref =\u003e {\n    console.log(`Saved post \"${postRef.key}\"`);\n});\n```\n\n**NOTE**: This approach is recommended if you want to add multiple new objects at once, because a single update performs much faster:\n\n```javascript\nconst newMessages = {};\n// We got messages from somewhere else (eg imported from file or other db)\nmessages.forEach(message =\u003e {\n    const ref = db.ref('messages').push();\n    newMessages[ref.key] = message;\n});\nconsole.log(`About to add multiple messages in 1 update operation`);\ndb.ref('messages').update(newMessages)\n.then(ref =\u003e {\n    console.log(`Added all messages at once`);\n});\n```\n\n### Using arrays\n\nAceBase supports storage of arrays, but there are some caveats when working with them. For instance, you cannot remove or insert items that are not at the end of the array. AceBase arrays work like a stack: you can add and remove from the top, not within. It is possible, however, to edit individual entries or to overwrite the entire array. The safest way to edit arrays is with a `transaction`, which requires all data to be loaded and stored again. In many cases, it is wiser to use object collections instead.\n\nYou can safely use arrays when:\n* The number of items is small and finite, meaning you could estimate the typical average number of items in it.\n* There is no need to retrieve/edit individual items using their stored path. 
If you reorder the items in an array, their paths change (eg from `\"playlist/songs[4]\"` to `\"playlist/songs[1]\"`)\n* The entries stored are small and do not have a lot of nested data (small strings or simple objects, eg: `chat/members` with user IDs array `['ewout','john','pete']`)\n* The collection does not need to be edited frequently.\n\nUse object collections instead when:\n* The collection keeps growing (eg: user generated content)\n* The paths of items are important and should preferably not change, eg `\"playlist/songs[4]\"` might point to a different entry if the array is edited. When using an object collection, `playlist/songs/jld2cjxh0000qzrmn831i7rn` will always refer to that same item.\n* The entries stored are large (eg large strings / blobs / objects with lots of nested data)\n* You have to edit the collection frequently.\n\nHaving said that, here's how to safely work with arrays:\n```javascript\n// Store an array with 2 songs:\nawait db.ref('playlist/songs').set([\n    { id: 13535, title: 'Daughters', artist: 'John Mayer' }, \n    { id: 22345,  title: 'Crazy', artist: 'Gnarls Barkley' }\n]);\n\n// Editing an array safely:\nawait db.ref('playlist/songs').transaction(snap =\u003e {\n    const songs = snap.val();\n    // songs is instanceof Array\n    // Add a song:\n    songs.push({ id: 7855, title: 'Formidable', artist: 'Stromae' });\n    // Edit the second song:\n    songs[1].title += ' (Live)';\n    // Remove the first song:\n    songs.splice(0, 1);\n    // Store the edited array:\n    return songs;\n});\n```\n\nIf you do not change the order of the entries in an array, it's safe to use them in referenced paths:\n\n```js\n// Update a single array entry:\nawait db.ref('playlist/songs[4]/title').set('Blue on Black');\n\n// Or:\nawait db.ref('playlist/songs[4]').update({ title: 'Blue on Black' });\n\n// Or:\nawait db.ref('playlist/songs').update({\n    4: { title: 'Blue on Black', artist: 'Kenny Wayne Shepherd' }\n});\n\n// Get value of single array 
entry:\nlet snap = await db.ref('playlist/songs[2]').get();\n\n// Get selected entries with an include filter (like you'd use with object collections)\nsnap = await db.ref('playlist/songs').get({ include: [0, 5, 8] });\nlet songs = snap.val();\n// NOTE: songs is instanceof PartialArray, which is an object with properties '0', '5', '8'\n```\n\nNOTE: you CANNOT use `ref.push()` to add entries to an array! `push` can only be used on object collections because it generates unique child IDs such as `\"jpx0k53u0002ecr7s354c51l\"` (which obviously is not a valid array index).\n\nTo summarize: ONLY use arrays if using an object collection seems like overkill, and be very cautious! Adding and removing items can only be done to/from the END of an array, unless you rewrite the entire array. That means you will have to know how many entries your array has up-front to be able to add new entries, which is not really desirable in most situations. If you feel the urge to use an array because the order of the entries is important for you or your app: consider using an object collection instead, and add an 'order' property to the entries to perform a sort on.\n\n### Counting children\n\nTo quickly find out how many children a specific node has, use the `count` method on a `DataReference`:\n\n```javascript\nconst messageCount = await db.ref('chat/messages').count();\n```\n\n### Limit nested data loading\n\nIf your database structure is using nesting (eg storing posts in `'users/someuser/posts'` instead of in `'posts'`), in most cases you might want to limit the amount of data you are retrieving. 
Eg: if you want to get the details of a user, but don't want to load all nested data, you can explicitly limit the nested data retrieval by passing `exclude`, `include`, and/or `child_objects` options to `.get`:\n\n```javascript\n// Exclude specific nested data:\ndb.ref('users/someuser')\n.get({ exclude: ['posts', 'comments'] })\n.then(snap =\u003e {\n    // snapshot contains all properties of 'someuser' except \n    // 'users/someuser/posts' and 'users/someuser/comments'\n});\n\n// Include specific nested data:\ndb.ref('users/someuser/posts')\n.get({ include: ['*/title', '*/posted'] })\n.then(snap =\u003e {\n    // snapshot contains all posts of 'someuser', but each post \n    // only contains 'title' and 'posted' properties\n});\n\n// Combine include \u0026 exclude:\ndb.ref('users/someuser')\n.get({ exclude: ['comments'], include: ['posts/*/title'] })\n.then(snap =\u003e {\n    // snapshot contains all user data without the 'comments' collection, \n    // and each object in the 'posts' collection only contains a 'title' property.\n});\n```\n\n**NOTE**: This enables you to do what Firebase can't: store your data in logical places, and only get the data you are interested in, fast! On top of that, you're even able to index your nested data and query it, even faster. See [Indexing data](#indexing-data) for more info.\n\n### Iterating (streaming) children\n(NEW since v1.4.0)\n\nTo iterate through all children of an object collection without loading all data into memory at once, you can use `forEach` which streams each child and executes a callback function with a snapshot of its data. If the callback function returns `false`, iteration will stop. If the callback returns a `Promise`, iteration will wait for it to resolve before loading the next child.\n\nThe children to iterate are determined at the start of the function. Because `forEach` does not read/write lock the collection, it is possible for the data to be changed while iterating. 
Children that are added while iterating will be ignored; children that are removed will be skipped.\n\nIt is also possible to selectively load data for each child, using the same options object available for `ref.get(options)`.\n\nExamples:\n```js\n// Stream all books one at a time (loads all data for each book):\nawait db.ref('books').forEach(bookSnapshot =\u003e {\n   const book = bookSnapshot.val();\n   console.log(`Got book \"${book.title}\": \"${book.description}\"`);\n});\n\n// Now do the same but only load 'title' and 'description' of each book:\nawait db.ref('books').forEach(\n   { include: ['title', 'description'] }, \n   bookSnapshot =\u003e {\n      const book = bookSnapshot.val();\n      console.log(`Got book \"${book.title}\": \"${book.description}\"`);\n   }\n);\n```\n\nAlso see [Streaming query results](#streaming-query-results).\n\n### Asserting data types in TypeScript\nIf you are using TypeScript, you can pass a type parameter to most data retrieval methods that will assert the type of the returned value. 
Note that you are responsible for ensuring the value matches the asserted type at runtime.\n\nExamples:\n```typescript\nconst snapshot = await db.ref\u003cMyClass\u003e('users/someuser/posts').get\u003cMyClass\u003e();\n//                            ^ type parameter can go here,        ^ here,\nif (snapshot.exists()) {\n    const posts = snapshot.val\u003cMyClass\u003e();\n    //                         ^ or here\n}\n\n// A type parameter can also be used to assert the type of a callback parameter\nawait db.ref('users/someuser/posts')\n    .transaction\u003cMyClass\u003e(snapshot =\u003e {\n        const posts = snapshot.val(); // posts is of type MyClass\n        return posts;\n    })\n\n// Or when iterating over children\nawait db.ref('users').forEach\u003cUserClass\u003e(userSnapshot =\u003e {\n    const user = userSnapshot.val(); // user is of type UserClass\n})\n```\n\n## Monitoring realtime data changes\n\nYou can subscribe to data events to get realtime notifications as the monitored node is being changed. When connected to a remote AceBase server, the events will be pushed to clients through a websocket connection. 
Supported events are:  \n- `'value'`: triggered when a node's value changes (including changes to any child value)\n- `'child_added'`: triggered when a child node is added; the callback contains a snapshot of the added child node\n- `'child_changed'`: triggered when a child node's value changes; the callback contains a snapshot of the changed child node\n- `'child_removed'`: triggered when a child node is removed; the callback contains a snapshot of the removed child node\n- `'mutated'`: (NEW v0.9.51) triggered when any nested property of a node changes; the callback contains a snapshot of and a reference to the exact mutation.\n- `'mutations'`: (NEW v0.9.60) like `'mutated'`, but fires with an array of all mutations caused by a single database update.\n- `'notify_*'`: notification-only versions of the above events without data, see \"Notify only events\" below \n\n```javascript\n// Using event callback\ndb.ref('users')\n.on('child_added', userSnapshot =\u003e {\n    // fires for all current children, \n    // and for each new user from then on\n});\n```\n\n```javascript\n// To be able to unsubscribe later:\nfunction userAdded(userSnapshot) { /* ... */ }\ndb.ref('users').on('child_added', userAdded);\n// Unsubscribe later with .off:\ndb.ref('users').off('child_added', userAdded);\n```\n\nAceBase uses the same `.on` and `.off` method signatures as Firebase, but also offers another way to subscribe to the events using the returned `EventStream` you can `subscribe` to. Having a subscription makes it easier to unsubscribe from the events later. 
Additionally, `subscribe` callbacks only fire for future events by default, as opposed to the `.on` callback, which also fires for current values of events `'value'` and `'child_added'`:\n\n```javascript\n// Using .subscribe\nconst addSubscription = db.ref('users')\n.on('child_added')\n.subscribe(newUserSnapshot =\u003e {\n    // .subscribe only fires for new children from now on\n});\n\nconst removeSubscription = db.ref('users')\n.on('child_removed')\n.subscribe(removedChildSnapshot =\u003e {\n    // removedChildSnapshot contains the removed data\n    // NOTE: snapshot.exists() will return false, \n    // and snapshot.val() contains the removed child value\n});\n\nconst changesSubscription = db.ref('users')\n.on('child_changed')\n.subscribe(updatedUserSnapshot =\u003e {\n    // Got new value for an updated user object\n});\n\n// Stopping all subscriptions later:\naddSubscription.stop();\nremoveSubscription.stop();\nchangesSubscription.stop();\n```\n\nIf you want to use `.subscribe` while also getting callbacks on existing data, pass `true` as the callback argument:\n```javascript\ndb.ref('users/some_user')\n.on('value', true) // passing true triggers .subscribe callback for current value as well\n.subscribe(userSnapshot =\u003e {\n    // Got current value (1st call), or new value (2nd+ call) for some_user\n});\n```\n\nThe `EventStream` returned by `.on` can also be used to `subscribe` more than once:\n\n```javascript\nconst newPostStream = db.ref('posts').on('child_added');\nconst subscription1 = newPostStream.subscribe(childSnapshot =\u003e { /* do something */ });\nconst subscription2 = newPostStream.subscribe(childSnapshot =\u003e { /* do something else */ });\n// To stop 1's subscription:\nsubscription1.stop(); \n// or, to stop all active subscriptions:\nnewPostStream.stop();\n```\n\nIf you are using TypeScript, you can pass a type parameter to `.on` or to `.subscribe` to assert the type of the value stored in the snapshot. 
This type is not checked by TypeScript; it is your responsibility to ensure that the stored value matches your assertion.\n\n```typescript\nconst newPostStream = db.ref('posts').on\u003cMyClass\u003e('child_added');\nconst subscription1 = newPostStream.subscribe(childSnapshot =\u003e {\n    const child = childSnapshot.val(); // child is of type MyClass\n});\nconst subscription2 = newPostStream.subscribe\u003cMyOtherClass\u003e(childSnapshot =\u003e { \n    const child = childSnapshot.val(); // child is of type MyOtherClass\n    // .subscribe overrides .on's type parameter\n});\n```\n\n### Using variables and wildcards in subscription paths\n\nIt is also possible to subscribe to events using wildcards and variables in the path:\n```javascript\n// Using wildcards:\ndb.ref('users/*/posts')\n.on('child_added')\n.subscribe(snap =\u003e {\n    // This will fire for every post added by any user,\n    // so for our example .push below this will be the result:\n    // snap.ref.vars === { 0: \"ewout\" }\n    const vars = snap.ref.vars;\n    console.log(`New post added by user \"${vars[0]}\"`)\n});\ndb.ref('users/ewout/posts').push({ title: 'new post' });\n\n// Using named variables:\ndb.ref('users/$userid/posts/$postid/title')\n.on('value')\n.subscribe(snap =\u003e {\n    // This will fire for every new or changed post title,\n    // so for our example .push below this will be the result:\n    // snap.ref.vars === { 0: \"ewout\", 1: \"jpx0k53u0002ecr7s354c51l\", userid: \"ewout\", postid: (...), $userid: (...), $postid: (...) 
}\n    // The user id will be in vars[0], vars.userid and vars.$userid\n    const title = snap.val();\n    const vars = snap.ref.vars; // contains the variable values in path\n    console.log(`The title of post ${vars.postid} by user ${vars.userid} was set to: \"${title}\"`);\n});\ndb.ref('users/ewout/posts').push({ title: 'new post' });\n\n// Or a combination:\ndb.ref('users/*/posts/$postid/title')\n.on('value')\n.subscribe(snap =\u003e {\n    // snap.ref.vars === { 0: 'ewout', 1: \"jpx0k53u0002ecr7s354c51l\", postid: \"jpx0k53u0002ecr7s354c51l\", $postid: (...) }\n});\ndb.ref('users/ewout/posts').push({ title: 'new post' });\n```\n\n### Notify only events\n\nIn addition to the events mentioned above, you can also subscribe to their `notify_` counterparts, which do the same but pass a reference to the changed data instead of a snapshot. This is quite useful if you want to monitor changes but are not interested in the actual values. It also saves server-side resources and results in less data being transferred from the server. Eg: `notify_child_changed` will run your callback with a reference to the changed node:\n\n```javascript\nref.on('notify_child_changed', childRef =\u003e {\n    console.log(`child \"${childRef.key}\" changed`);\n})\n```\n\n### Wait for events to activate\n\nIn some situations, it is useful to wait for event handlers to be active before modifying data. 
For instance, if you want an event to fire for changes you are about to make, you have to make sure the subscription is active before performing the updates.\n```javascript\nconst subscription = db.ref('users')\n.on('child_added')\n.subscribe(snap =\u003e { /*...*/ });\n\n// Use activated promise\nsubscription.activated()\n.then(() =\u003e {\n    // We now know for sure the subscription is active,\n    // adding a new user will trigger the .subscribe callback\n    db.ref('users').push({ name: 'Ewout' });\n})\n.catch(err =\u003e {\n    // Access to path denied by server?\n    console.error(`Subscription canceled: ${err.message}`);\n});\n```\n\nIf you want to handle changes in the subscription state after it was activated (eg because server-side access rights have changed), provide a callback function to the `activated` call:\n```javascript\nsubscription.activated((activated, cancelReason) =\u003e {\n    if (!activated) {\n        // Access to path denied by server?\n        console.error(`Subscription canceled: ${cancelReason}`);\n    }\n});\n```\n\n### Get triggering context of events \n(NEW v0.9.51)\n\nIn some cases it is beneficial to know what (and/or who) triggered a data event to fire, so you can choose what to do with data updates. It is now possible to pass context information with all `update`, `set`, `remove` and `transaction` operations, which will be passed along to any event triggered on affected paths (on any connected client!).\n\nImagine the following situation: you have a document editor that allows multiple people to edit at the same time. When loading a document you update its `last_accessed` property:\n\n```javascript\n// Load document \u0026 subscribe to changes\ndb.ref('users/ewout/documents/some_id').on('value', snap =\u003e {\n    // Document loaded, or changed. 
Display its contents\n    const document = snap.val();\n    displayDocument(document);\n});\n\n// Set last_accessed to current time\ndb.ref('users/ewout/documents/some_id').update({ last_accessed: new Date() })\n```\n\nThis will trigger the `value` event TWICE, and cause the document to render TWICE. Additionally, if any other user opens the same document, it will be triggered again even though a redraw is not needed!\n\nTo prevent this, you can pass contextual info with the update:\n\n```javascript\n// Load document \u0026 subscribe to changes (context aware!)\ndb.ref('users/ewout/documents/some_id')\n    .on('value', snap =\u003e {\n        // Document loaded, or changed.\n        const context = snap.context();\n        if (context.redraw === false) {\n            // No need to redraw!\n            return;\n        }\n        // Display its contents\n        const document = snap.val();\n        displayDocument(document);\n    });\n\n// Set last_accessed to current time, with context\ndb.ref('users/ewout/documents/some_id')\n    .context({ redraw: false }) // prevent redraws!\n    .update({ last_accessed: new Date() })\n```\n\n### Change tracking using \"mutated\" and \"mutations\" events\n(NEW v0.9.51)\n\nThese events are mainly used by AceBase behind the scenes to automatically update in-memory values with remote mutations. See [Observe realtime value changes](#observe-realtime-value-changes) and [Realtime synchronization with a live data proxy](#realtime-synchronization-with-a-live-data-proxy). It is possible to use these events yourself, but they require some additional plumbing, and you're probably better off using the methods mentioned above.\n\nHaving said that, here's how to use them: \n\nIf you want to monitor a specific node's value, but don't want to get its entire new value every time a small mutation is made to it, subscribe to the \"mutated\" event. This event is only fired with the target data actually being changed. 
This allows you to keep a cached copy of your data in memory (or in a cache db), and replicate all changes being made to it:\n\n```javascript\nconst chatRef = db.ref('chats/chat_id');\n// Get current value\nconst chat = (await chatRef.get()).val();\n\n// Subscribe to mutated event\nchatRef.on('mutated', snap =\u003e {\n    const mutatedPath = snap.ref.path; // 'chats/chat_id/messages/message_id'\n    const propertyTrail = \n        // ['messages', 'message_id']\n        mutatedPath.slice(chatRef.path.length + 1).split('/');\n\n    // Navigate to the in-memory chat property target:\n    const targetObject = propertyTrail.slice(0, -1).reduce((target, prop) =\u003e target[prop], chat);\n    // targetObject === chat.messages\n    const targetProperty = propertyTrail.slice(-1)[0]; // The last item in the array\n    // targetProperty === 'message_id'\n\n    // Update the value of our in-memory chat:\n    const newValue = snap.val(); // { sender: 'Ewout', text: '...' }\n    if (newValue === null) {\n        // Remove it\n        delete targetObject[targetProperty]; // delete chat.messages.message_id\n    }\n    else {\n        // Set or update it\n        targetObject[targetProperty] = newValue; // chat.messages.message_id = newValue\n    }\n});\n\n// Add a new message to trigger above event handler\nchatRef.child('messages').push({\n    sender: 'Ewout',\n    text: 'Sending you a message'\n})\n```\n\nNOTE: if you are connected to a remote AceBase server and the connection was lost, it is important that you always get the latest value upon reconnecting, because you might have missed mutation events.\n\nThe `'mutations'` event does the same as `'mutated'`, but will be fired on the subscription path with an array of all mutations caused by a single database update. 
The best way to handle these mutations is by iterating them using `snapshot.forEach`:\n\n```javascript\nchatRef.on('mutations', snap =\u003e {\n    snap.forEach(mutationSnap =\u003e {\n        handleMutation(mutationSnap);\n    });\n})\n```\n\n### Observe realtime value changes \n(NEW v0.9.51)\n\nYou can now observe the realtime value of a path, and (for example) bind it to your UI. `ref.observe()` returns a RxJS Observable that can be used to observe updates to this node and its children. It does not return snapshots, so you can bind the observable straight to your UI. The value being observed is updated internally using the \"mutations\" database event. All database mutations are automatically applied to the in-memory value, and trigger the observable to emit the new value.\n\n```html\n\u003c!-- In your Angular view template: --\u003e\n\u003cng-container *ngIf=\"liveChat | async as chat\"\u003e\n   \u003ch3\u003e{{ chat.title }}\u003c/h3\u003e\n   \u003cp\u003eChat was started by {{ chat.startedBy }}\u003c/p\u003e\n   \u003cdiv class=\"messages\"\u003e\n    \u003cMessage *ngFor=\"let item of chat.messages | keyvalue\" [message]=\"item.value\"\u003e\u003c/Message\u003e\n   \u003c/div\u003e\n\u003c/ng-container\u003e\n```\n\n_Note that to use Angular's `*ngFor` on an object collection, you have to use the `keyvalue` pipe._\n\n```javascript\n// In your Angular component:\nngOnInit() {\n   this.liveChat = this.db.ref('chats/chat_id').observe();\n}\n```\n\nOr, if you want to monitor updates yourself, handle the subscribe and unsubscribe:\n```javascript\nngOnInit() {\n   this.observer = this.db.ref('chats/chat_id').observe().subscribe(chat =\u003e {\n      this.chat = chat;\n   });\n}\nngOnDestroy() {\n   // DON'T forget to unsubscribe!\n   this.observer.unsubscribe();\n}\n```\n\nNOTE: objects returned in the observable are only updated downstream - any changes made locally won't be updated in the database. If that is what you would want to do... keep reading! 
(Spoiler alert - use `proxy()`!)\n\n### Realtime synchronization with a live data proxy \n(NEW v0.9.51)\n\nYou can now create a live data proxy for a given path. The data of the referenced path will be loaded and kept in sync with live data by listening for remote 'mutated' events, and all changes you make to its value are immediately synced back. This allows you to forget about data storage, and code as if you are only handling in-memory objects. Synchronization was never this easy!\n\nCheck out the following example:\n\n```javascript\nconst proxy = await db.ref('chats/chat1').proxy();\nconst chat = proxy.value; // contains realtime chat value\n\n// Make changes in memory, AND database (yes!)\nchat.title = 'Changing the title in the database too!';\nchat.members = ['Ewout'];\nchat.members.push('John', 'Jack', 'Pete'); // Append to array\nchat.messages.push({ // Push child to a collection (generates an ID for it!)\n    from: 'Ewout', \n    message: 'I am changing the database without programming against it!' \n});\nchat.messages.push({\n    from: 'Pete', \n    message: 'Impressive dude' \n});\nif (chat.members.includes('John') \u0026\u0026 !chat.title.startsWith('Hallo')) {\n    chat.title = 'Hallo, is John May er?'; // Dutch joke\n}\n// Now that all synchronous updates above have taken place,\n// AceBase will update the database automatically\n```\n\nAll changes made above will be persisted to the database, and any changes made remotely will automatically become available in the proxy object. The above code results in 2 updates to the database, equivalent to the statements below. 
**How awesome is that?!**\n\n```javascript\n// This is what is executed behind the scenes by above example:\ndb.ref('chats/chat1').update({\n    title: 'Hallo, is John May er?', // Dutch joke\n    members: ['Ewout','John','Jack','Pete']\n});\ndb.ref('chats/chat1/messages').update({\n    kh1x3ygb000120r7ipw6biln: {\n        from: 'Ewout',\n        message: 'I am changing the database without programming against it!'\n    },\n    kh1x3ygb000220r757ybpyec: {\n        from: 'Pete',\n        message: 'Impressive dude'\n    }\n});\n```\n\nTo get a notification each time a mutation is made to the value, use `proxy.onMutation(handler)`. To get notifications about any errors that might occur, use `proxy.onError(handler)`:\n\n```javascript\nproxy.onError(err =\u003e {\n    console.error(`Proxy error: ${err.message}`, err.details);\n});\nproxy.onMutation((mutationSnapshot, isRemoteChange) =\u003e {\n    console.log(`Value of path \"${mutationSnapshot.ref.path}\" was mutated by ${isRemoteChange ? 'somebody else' : 'us' }`);\n})\n```\n\nIf you no longer need the proxy object, use `proxy.destroy()` to stop realtime updating. Don't forget this!\n\nA number of additional methods are available to all proxied object values to make it possible to monitor specific properties being changed, get the actual target values, add children etc. 
See code below for more details:\n\n```javascript\nconst proxy = await db.ref('chats/chat1').proxy();\nif (!proxy.hasValue) {\n    // If the proxied path currently does not have a value, create it now.\n    proxy.value = {};\n}\nconst chat = proxy.value;\n```\n\n**`forEach`**: iterate object collection\n```javascript\nchat.messages.forEach((message, key, index) =\u003e {\n    // Fired for all messages in collection, or until returning false\n});\n```\n\n**`for...of`**: iterate array or object collection's values, keys or entries (v1.2.0+)\n```js\nfor (let message of chat.messages) {\n    // Iterates with default .values iterator, same as:\n    // for (let message of chat.messages.values())\n}\nfor (let keys of chat.messages.keys()) {\n    // All keys in the messages object collection\n}\nfor (let [key, message] of chat.messages.entries()) {\n    // Same as above\n}\n```\n\n**`push`**: Add item to object collection with generated key\n```javascript\nconst key = chat.messages.push({ text: 'New message' });\n```\n\n**`remove`**: delete a node\n```javascript\nchat.messages[key].remove();\nchat.messages.someotherkey.remove();\n\n// Note, you can also do this:\ndelete chat.messages.somemessage;\n// Or this:\nchat.messages.somemessage = null;\n```\n\n**`toArray`**: access an object collection like an array:\n```javascript\nconst array = chat.messages.toArray();\n```\n\n**`toArray` (with sort)**: like above, sorting the results:\n```javascript\nconst sortedArray = chat.messages.toArray((a, b) =\u003e a.sent \u003c b.sent ? 
-1 : 1);\n```\n\n**`valueOf`** (or **`getTarget`**): gets the underlying value (unproxied, be careful!)\n```js\nconst message = chat.messages.message1.valueOf();\nmessage.text = 'This does NOT update the database'; // Because it is not the proxied value\nchat.messages.message1.text = 'This does'; // Just so you know\n```\n\n**`onChanged`**: registers a callback for the value that is called every time the underlying value changes:\n```javascript\nchat.messages.message1.onChanged((message, previous, isRemote, context) =\u003e {\n    if (message.read) {\n        // Show blue ticks\n    }\n    if (message.title !== previous.title \u0026\u0026 isRemote) {\n        // Somebody changed the title \n        // (remote: not through this proxy instance)\n    }\n});\n```\n\n**`getRef`**: returns a DataReference instance to the current target, in case you want or need to do things outside of the proxy's scope:\n```javascript\nconst messageRef = chat.messages.message1.getRef();\n// Eg: add an \"old fashioned\" event handler\nmessageRef.on('child_changed', snap =\u003e { /* .. */ });\n// Or, if you need to know when an update is done\nawait messageRef.update({ read: new Date() });\n```\n\n**`getObservable`**: returns a RxJS Observable that is updated each time the underlying value changes:\n```javascript\nconst observable = chat.messages.message1.getObservable();\nconst subscription = observable.subscribe(message =\u003e {\n    if (message.read) {\n        // Show blue ticks\n    }\n});\n// Later:\nsubscription.unsubscribe();\n```\n\n**`startTransaction`**: (NEW v0.9.62) Enables you to make changes to the proxied value without writing them to the database until you want to. This makes it possible to bind a proxy to an input form, and wait to save the changes until the user clicks 'Save', or roll them back when canceling. 
Meanwhile, the value will still be updated with any remote changes.\n\n```javascript\nconst proxy = await db.ref('contacts/ewout').proxy();\nconst contact = proxy.value; // NOTE: === null if node doesn't exist\nconst tx = await contact.startTransaction();\n\n// Make some changes:\ncontact.name = 'Ewout Stortenbeker'; // Was 'Ewout'\ncontact.email = 'ewout@appy.one'; // Was 'me@appy.one'\n\nasync function save() {\n    await tx.commit();\n    console.log('Contact details updated');\n}\n\nfunction rollback() {\n    tx.rollback();\n    // contact.name === 'Ewout'\n    // contact.email === 'me@appy.one'\n    console.log('All changes made were rolled back');\n}\n```\nOnce `tx.commit()` is called, all pending updates will be processed and saved to the database. When `tx.rollback()` is called, all changes made to the proxied object will be reverted and no further action is taken.\n\n### Using proxy methods in Typescript\n\nIn TypeScript some additional typecasting is needed to access proxy methods shown above. You can use the `proxyAccess` function to get help with that. 
This function typecasts and also checks if your passed value is indeed a proxy.\n```typescript\ntype ChatMessage = { from: string, text: string, sent: Date, received: Date, read: Date };\ntype MessageCollection = ObjectCollection\u003cChatMessage\u003e;\n\n// Easy \u0026 safe typecasting:\nproxyAccess\u003cMessageCollection\u003e(chat.messages)\n    .getObservable()\n    .subscribe(messages =\u003e {\n        // No need to define type of messages, TS knows it is a MessageCollection\n    });\n\n// Instead of:\n(chat.messages as any as ILiveDataProxyValue\u003cMessageCollection\u003e)\n    .getObservable()\n    .subscribe(messages =\u003e {\n        // messages: MessageCollection\n    });\n\n// Or, with unsafe typecasting (discouraged!)\n(chat.messages as any)\n    .getObservable()\n    .subscribe((messages: MessageCollection) =\u003e {\n        // messages: MessageCollection, but only because we've prevented typescript\n        // from checking if the taken route to get here was ok.\n        // If getObservable or subscribe method signatures change in the \n        // future, code will break without typescript knowing it!\n    });\n```\n\nWith Angular, `getObservable` comes in handy for UI binding and updating:\n\n```typescript\n@Component({\n  selector: 'chat-messages',\n  template: `\u003cng-container *ngIf=\"liveChat | async as chat\"\u003e\n    \u003ch1\u003e{{ chat.title }}\u003c/h1\u003e\n    \u003cMessage *ngFor=\"let item of chat.messages | keyvalue\" [message]=\"item.value\" /\u003e\n    \u003c/ng-container\u003e`\n})\nexport class ChatComponent {\n    liveChat: Observable\u003c{ \n        title: string, \n        messages: ObjectCollection\u003c{ from: string, text: string }\u003e \n    }\u003e;\n\n    constructor(private dataProvider: MyDataProvider) {}\n\n    async ngOnInit() {\n        const proxy = await this.dataProvider.db.ref('chats/chat1').proxy();\n        this.liveChat = proxyAccess(proxy.value).getObservable();\n    }\n}\n```\n\nFor completeness 
of the above example, `MyDataProvider` would look something like this:\n```typescript\nimport { AceBase } from 'acebase';\n@Injectable({\n    providedIn: 'root'\n})\nexport class MyDataProvider {\n    db: AceBase;\n    constructor() {\n        this.db = new AceBase('chats');\n    }\n}\n```\n\nI'll leave it up to your imagination what the `MessageComponent` would look like.\n\n## Querying data\n\nWhen running a query, all child nodes of the referenced path will be matched against your set criteria and returned in any requested `sort` order. Pagination of results is also supported, so you can `skip` and `take` any number of results. Queries do not require data to be indexed, although this is recommended as your data grows larger.\n\nTo filter results, multiple `filter(key, operator, compare)` statements can be added. The filtered results must match all conditions set (logical AND). Supported query operators are:\n- `'\u003c'`: value must be smaller than `compare`\n- `'\u003c='`: value must be smaller than or equal to `compare`\n- `'=='`: value must be equal to `compare`\n- `'!='`: value must not be equal to `compare`\n- `'\u003e'`: value must be greater than `compare`\n- `'\u003e='`: value must be greater than or equal to `compare`\n- `'exists'`: `key` must exist\n- `'!exists'`: `key` must not exist\n- `'between'`: value must be between the 2 values in the `compare` array (`compare[0]` \u003c= value \u003c= `compare[1]`). If `compare[0] \u003e compare[1]`, their values will be swapped\n- `'!between'`: value must not be between the 2 values in the `compare` array (value \u003c `compare[0]` or value \u003e `compare[1]`). If `compare[0] \u003e compare[1]`, their values will be swapped\n- `'like'`: value must be a string and must match the given pattern `compare`. Patterns are case-insensitive and can contain wildcards _*_ for 0 or more characters, and _?_ for 1 character. 
(pattern `\"Th?\"` matches `\"The\"`, not `\"That\"`; pattern `\"Th*\"` matches `\"the\"` and `\"That\"`)\n- `'!like'`: value must be a string and must not match the given pattern `compare`\n- `'matches'`: value must be a string and must match the regular expression `compare`\n- `'!matches'`: value must be a string and must not match the regular expression `compare`\n- `'in'`: value must be equal to one of the values in the `compare` array\n- `'!in'`: value must not be equal to any value in the `compare` array\n- `'has'`: value must be an object, and it must have property `compare`\n- `'!has'`: value must be an object, and it must not have property `compare`\n- `'contains'`: value must be an array and it must contain a value equal to `compare`, or contain all of the values in the `compare` array\n- `'!contains'`: value must be an array and it must not contain a value equal to `compare`, or not contain any of the values in the `compare` array\n\nNOTE: A query does not require any `filter` criteria; you can also use a query to paginate your data using `skip`, `take` and `sort`. If you don't specify any of these, AceBase will use `.take(100)` as a default. If you do not specify a `sort`, the order of the returned values can vary between executions.\n\n```javascript\ndb.query('songs')\n.filter('year', 'between', [1975, 2000])\n.filter('title', 'matches', /love/i)  // Songs with love in the title\n.take(50)                   // limit to 50 results\n.skip(100)                  // skip first 100 results\n.sort('rating', false)      // highest rating first\n.sort('title')              // order by title ascending\n.get(snapshots =\u003e {\n    // ...\n});\n```\n\nTo quickly convert a snapshots array to the values it encapsulates, you can call `snapshots.getValues()`. This is a convenience method that comes in handy if you are not interested in the results' paths or keys. 
You can also do it yourself with `var values = snapshots.map(snap =\u003e snap.val())`:\n```javascript\ndb.query('songs')\n.filter('year', '\u003e=', 2018)\n.get(snapshots =\u003e {\n    const songs = snapshots.getValues();\n});\n```\n\nInstead of using the callback of `.get`, you can also use the returned `Promise` which is very useful in promise chains:\n```javascript\n// ... in some promise chain\n.then(fromYear =\u003e {\n    return db.query('songs')\n    .filter('year', '\u003e=', fromYear)\n    .get();\n})\n.then(snapshots =\u003e {\n    // Got snapshots from returned promise\n})\n```\n\nThis also enables using ES6 `async` / `await`:\n```javascript\nconst snapshots = await db.query('songs')\n    .filter('year', '\u003e=', fromYear)\n    .get();\n```\n\n### Limiting query result data\n\nBy default, queries will return snapshots of the matched nodes, but you can also get references only by passing the option `{ snapshots: false }` or use the new `.find()` method.\n\n```javascript\n// ...\nconst references = await db.query('songs')\n    .filter('genre', 'contains', 'rock')\n    .get({ snapshots: false });\n\n// now we have references only, so we can decide what data to load\n\n```\nUsing the new `find()` method does the same (v1.10.0+):\n\n```javascript\nconst references = await db.query('songs')\n    .filter('genre', 'contains', 'blues')\n    .find();\n```\n\nIf you do want your query results to include some (but not all) data, you can use the `include` and `exclude` options to filter the fields in the query results returned by `get`:\n\n```javascript\nconst snapshots = await db.query('songs')\n    .filter('title', 'like', 'Love*')\n    .get({ include: ['title', 'artist'] });\n```\n\nThe snapshots in the example above will only contain each matching song's _title_ and _artist_ fields. 
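To make concrete what the `include` option does to each returned value, here is a small stand-alone sketch that mimics the top-level field filtering on a plain object. This is an illustration only, not AceBase's implementation (AceBase applies these filters while reading from storage, so excluded data is never loaded); `applyInclude` and the sample song are made up for this example:

```javascript
// Illustration only: mimic, at the top level, what `include` does to each
// value a query returns. Not AceBase's implementation; AceBase filters the
// data while reading from storage, so excluded data is never loaded at all.
function applyInclude(value, include) {
    // Keep only the properties whose keys are listed in `include`
    return Object.fromEntries(
        Object.entries(value).filter(([key]) => include.includes(key))
    );
}

const song = { title: 'Gravity', artist: 'John Mayer', year: 2006, lyrics: '...' };
const filtered = applyInclude(song, ['title', 'artist']);
console.log(filtered); // { title: 'Gravity', artist: 'John Mayer' }
```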
See [Limit nested data loading](#limit-nested-data-loading) for more info about `include` and `exclude` filters.\n\n### Removing data with a query\n\nTo remove all nodes that match a query, simply call `remove` instead of `get`:\n```javascript\ndb.query('songs')\n    .filter('year', '\u003c', 1950)\n    .remove(() =\u003e {\n        // Old junk gone\n    }); \n\n// Or, with await\nawait db.query('songs')\n    .filter('year', '\u003c', 1950)\n    .remove();\n```\n\n### Counting query results\n(NEW since v1.10.0)\n\nTo get a quick count of query results, you can use `.count()`:\n\n```javascript\nconst count = await db.query('songs')\n    .filter('artist', '==', 'John Mayer')\n    .count();\n```\n\nYou can use this in combination with `skip` and `limit` to check if there are results beyond a currently loaded dataset:\n\n```javascript\nconst nextPageSongsCount = await db.query('songs')\n    .filter('artist', '==', 'John Mayer')\n    .skip(100)\n    .take(10)\n    .count(); // 10: full page, \u003c10: last page.\n```\n\nNOTE: This method currently performs a count on results returned by `.find()` behind the scenes, this will be optimized in a future version.\n\n### Checking query result existence\n(NEW since v1.10.0)\n\nTo quickly determine if a query has any matches, you can use `.exists()`:\n\n```javascript\nconst exists = await db.query('users')\n    .filter('email', '==', 'me@appy.one')\n    .exists();\n```\n\nJust like `count()`, you can also combine this with `skip` and `limit`.\n\nNOTE: This method currently performs a check on the result returned by `.count()` behind the scenes, this will be optimized in a future version.\n\n### Streaming query results\n(NEW since v1.4.0)\n\nTo iterate through the results of a query without loading all data into memory at once, you can use `forEach` which streams each child and executes a callback function with a snapshot of its data. If the callback function returns `false`, iteration will stop. 
If the callback returns a `Promise`, iteration will wait for it to resolve before loading the next child.

The query will be executed at the start of the function, retrieving references to all matching children (not their values). After this, `forEach` will load their values one at a time. It is possible for the underlying data to be changed while iterating. Matching children that were removed while iterating will be skipped. Children that had any of the filtered properties changed after the initial results were populated might not match the query anymore; this is not checked.

It is also possible to selectively load data for each child, using the same options object available for `query.get(options)`.

Example:
```js
// Query books, streaming the results one at a time:
await db.query('books')
 .filter('category', '==', 'cooking')
 .forEach(bookSnapshot => {
    const book = bookSnapshot.val();
    console.log(`Found cooking book "${book.title}": "${book.description}"`);
 });

// Now only load book properties 'title' and 'description'
await db.query('books')
 .filter('category', '==', 'cooking')
 .forEach(
   { include: ['title', 'description'] },
   bookSnapshot => {
      const book = bookSnapshot.val();
      console.log(`Found cooking book "${book.title}": "${book.description}"`);
   }
);
```

Also see [Iterating (streaming) children](#iterating-streaming-children).

### Realtime queries
(NEW 0.9.9, alpha)

AceBase now supports realtime (live) queries and is able to send notifications when there are changes to the initial query results.

```javascript
let fiveStarBooks = {}; // maps keys to book values
function gotMatches(snaps) {
    snaps.forEach(snap => {
        fiveStarBooks[snap.key] = snap.val();
    });
}
function matchAdded(match) {
    // add book to results
    fiveStarBooks[match.snapshot.key] = match.snapshot.val();
}
function matchChanged(match) {
    // update book details
    fiveStarBooks[match.snapshot.key] = match.snapshot.val();
}
function matchRemoved(match) {
    // remove book from results
    delete fiveStarBooks[match.ref.key];
}

db.query('books')
    .filter('rating', '==', 5)
    .on('add', matchAdded)
    .on('change', matchChanged)
    .on('remove', matchRemoved)
    .get(gotMatches);
```

NOTE: Usage of `take` and `skip` is currently not taken into consideration; events might fire for results that are not in the requested range.

## Indexing data

Indexing data will dramatically improve the speed of queries on your data, especially as it increases in size. Any indexes you create will be updated automatically when the underlying data is changed, added or removed. Indexes are used to speed up filters and sorts, and to limit the number of results. NOTE: If you are connected to an external AceBase server (using `AceBaseClient`), indexes can only be created if you are signed in as the *admin* user.

```javascript
Promise.all([
    // creates indexes if they don't exist
    db.indexes.create('songs', 'year'),
    db.indexes.create('songs', 'genre')
])
.then(() => {
    return db.query('songs')
    .filter('year', '==', 2010) // uses the index on year
    .filter('genre', 'in', ['jazz','rock','blues']) // uses the index on genre
    .get();
})
.then(snapshots => {
    console.log(`Got ${snapshots.length} songs`);
});
```

### Indexing scattered data with wildcards

Because nesting data is recommended in AceBase (as opposed to Firebase, which discourages this), you are able to index and query data that is scattered across your database in a structured manner.
For example, you might want to store `posts` for each `user` in their own user node, and index (and query) all posts by any user:

```javascript
await db.indexes.create('users/*/posts', 'date'); // Index date of any post by any user

// Get all today's posts, of all users:
let now = new Date();
let today = new Date(now.getFullYear(), now.getMonth(), now.getDate());
const postSnapshots = await db.query('users/*/posts') // query with the same wildcard
    .filter('date', '>=', today)
    .get();
```

**NOTE**: Wildcard queries always require an index - they will not execute if there is no corresponding index.

You can also use named variables instead of `*`, to allow indexing of parts of the path. This is really powerful in situations where you want to be able to query nested data using a single id:

```javascript
await db.indexes.create('users/*/posts/$postId/comments', '$postId');

// Query comments on a specific postId without knowing which user it was posted by:
const commentSnapshots = await db.query('users/*/posts/$postId/comments')
    .filter('$postId', '==', 'l6uldsmt000309l207kz8ll0')
    .get();
```

Additionally, you can also use `'{key}'` to index the key of the indexed path:
```javascript
await db.indexes.create('users/*/posts/*/comments', '{key}');
// Quickly look up the comment with a given id, without the need to know which userId
// it was posted by, or the postId it belongs to:
const commentSnapshots = await db.query('users/*/posts/*/comments')
    .filter('{key}', '==', 'l6uldsmt000309l207kz8ll0')
    .get();
```

### Include additional data in indexes

If your query uses filters on multiple keys, you could create separate indexes on each key, but you can also include that data in a single index.
This will speed up queries even more in most cases:

```javascript
await db.indexes.create('songs', 'year', { include: ['genre'] });

const snapshots = await db.query('songs')
    .filter('year', '==', 2010) // uses the index on year
    .filter('genre', 'in', ['jazz', 'rock', 'blues']) // filters indexed results of year filter: FAST!
    .get();
```

If you are filtering data on one key and sorting on another key, it is highly recommended to include the `sort` key in your index on the `filter` key, because this will greatly increase sorting performance:

```javascript
await db.indexes.create('songs', 'title', { include: ['year', 'genre'] });

const snapshots = await db.query('songs')
    .filter('title', 'like', 'Love *') // queries the index
    .sort('genre')  // sorts indexed results: FAST!
    .sort('title')  // sorts indexed results: FAST!
    .get();
```

You can also have named wildcards (as described above) included with the indexed values:
```javascript
await db.indexes.create('users/$userId/posts', 'title', { include: ['$userId'] });

const snapshots = await db.query('users/$userId/posts')
    .filter('title', 'like', 'Hello *') // queries the index
    .filter('$userId', '==', 'l6uldsmt000309l207kz8ll0')  // filters results by userId in path
    .get();
```

### Other indexing options

In addition to the `include` option described above, you can specify the following options:

 * `caseSensitive`: boolean that specifies whether texts should be indexed using case sensitivity. Setting this to `true` will cause words with mixed casings (eg `"word"`, `"Word"` and `"WORD"`) to be indexed separately. Default is `false`.
 * `textLocale`: string that specifies the default locale of the indexed texts.
Should be a 2-character language code such as `"en"` for English and `"nl"` for Dutch, or an LCID string for country-specific locales such as `"en-us"` for American English, `"en-gb"` for British English, etc.
 * `textLocaleKey`: string that specifies a key in the source data that contains the locale to use instead of the default specified in `textLocale`

### Special indexes

Normal indexes are able to index `string`, `number`, `Date`, `boolean` and `undefined` (non-existent) values. To index other data, you have to create a special index. Currently supported special indexes are: **Array**, **FullText** and **Geo** indexes.

### Array indexes

Use Array indexes to dramatically improve the speed of `"contains"` filters on array values.
Consider the following data structure:

```javascript
chats: {
    chat1: {
        members: ['ewout','john','pete','jack'],
        // ...
    }
}
```

Adding an index to the `members` key will speed up queries to get all chats a specific user is in.

```javascript
db.indexes.create('chats', 'members', { type: 'array' })
.then(() => {
    return db.query('chats')
    .filter('members', 'contains', 'ewout') // also possible without an index, but now much faster
    .get();
})
.then(snapshots => {
    // Got all chats with ewout
});
```

By supplying an array to the filter, you can get all chats that have all of the supplied users:
```javascript
db.query('chats')
.filter('members', 'contains', ['ewout', 'jack'])
.get(snapshots => {
    // Got all chats with ewout AND jack
});
```
Using `!contains`, you can check which chats do not involve 1 or more users:
```javascript
db.query('chats')
.filter('members', '!contains', ['ewout', 'jack'])
.get(snapshots => {
    // Got all chats without ewout and/or jack
});
```

### Fulltext indexes

A fulltext index will index all individual words and their relative positions in string nodes.
A normal index on text nodes is only capable of searching for exact matches quickly, or proximate like/regexp matches by scanning through the index. A fulltext index makes it possible to quickly find text nodes that contain multiple words, a selection of words or parts of them, in any order in the text.

```javascript
db.indexes.create('chats/*/messages', 'text', { type: 'fulltext' })
.then(() => {
    return db.query('chats/*/messages')
    .filter('text', 'fulltext:contains', `confidential OR secret OR "don't tell"`) // not possible without a fulltext index
    .get();
})
.then(snapshots => {
    // Got all confidential messages
});
```

Fulltext indexes support *whitelisting*, *blacklisting*, manual word *stemming* and *filtering*, and using different *locales*. All indexed words are stored *"unidecoded"*: all unicode characters are translated into ascii characters so they become searchable both ways. Eg: the Japanese "AceBaseはクールです" is indexed as "acebase wa kurudesu" and will be found with queries on "クール", "kūru" and "kuru" alike.

You can define these additional settings using the `config` property in the options parameter passed to the `indexes.create` method:

#### `transform`
Callback function that transforms (and/or filters) words being indexed *and* queried.

```js
db.indexes.create('chats/*/messages', 'text', {
    type: 'fulltext',
    config: {
        transform: function(locale, word) {
            // Correct misspelled words:
            if (word === 'mispeled') { return 'misspelled'; }

            // Do not index a specific word:
            if (word === 'secret') { return null; }

            // Word stemming:
            if (['fishing','fished','fisher'].includes(word)) { return 'fish'; }

            // Consider multiple locales to allow multilingual query results:
            if (locale === 'nl') {
                // Word being indexed or queried is Dutch, index and query in
English
                // Also see localeKey setting for more info
                return dutchToEnglish(word);
            }

            // Or, keep the word as it is:
            return word;
        }
    }
});
```

#### `blacklist`
Also known as a *stoplist*. Array of words to be automatically ignored for indexing and querying.

```javascript
db.indexes.create('chats/*/messages', 'text', {
    type: 'fulltext',
    config: {
        blacklist: ['a','the','on','at'] // these words won't be indexed and will be ignored in queries
    }
});
```

#### `whitelist`
Words to be included even if they do not match the `minLength` and/or `blacklist` criteria:

```javascript
db.indexes.create('chats/*/messages', 'text', {
    type: 'fulltext',
    config: {
        minLength: 3,
        whitelist: ['ok'] // allow "ok" although it's only 2 characters
    }
});
```

#### `minLength` and `maxLength`
Only index words with a minimum and/or maximum length:
```javascript
db.indexes.create('chats/*/messages', 'text', {
    type: 'fulltext',
    config: {
        minLength: 3,   // Ignore small words
        maxLength: 20   // Ignore large words
    }
});
```
#### `localeKey`
Specify a key in the data that contains the locale of the indexed texts.
This allows multiple languages to be indexed using their own rules.

Imagine the following dataset:
```json
{
    "love": {
        "item1": {
            "text": "I love AceBase",
            "locale": "en"
        },
        "item2": {
            "text": "Amo AceBase",
            "locale": "es"
        },
        "item3": {
            "text": "J'aime AceBase",
            "locale": "fr"
        },
        "item4": {
            "text": "Ich liebe AceBase",
            "locale": "de"
        },
        "item5": {
            "text": "Ik hou van AceBase",
            "locale": "nl"
        },
        "item6": {
            "text": "Jag älskar AceBase",
            "locale": "sv"
        }
    }
}
```

You can have the texts in `text` indexed using the locale specified in `locale`. The locale found is also passed to a given `transform` function. If the source data does not have the specified locale key, the default `textLocale` specified in the options will be used.

```javascript
db.indexes.create('chats/*/messages', 'text', {
    type: 'fulltext',
    textLocale: 'en', // default locale to use
    config: {
        localeKey: 'locale' // Use the locale found in the locale property
    }
});
```

#### `useStoplist`
Boolean value that specifies whether a default stoplist for the used locale should be used to automatically blacklist words. Currently only available for locale `"en"`, which contains very frequently used words like "a", "i", "me", "it", "the", "they", "them", etc.

### Geo indexes

A geo index is able to index latitude/longitude value combinations so you can create very fast location-based queries.
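To give a feel for what a radius in meters means for a nearby filter, here is a plain-JavaScript haversine distance check between two lat/long points. This is purely illustrative: the index itself matches locations via geohashes rather than computing distances for every node.

```javascript
// Approximate great-circle (haversine) distance in meters between two { lat, long } points
function distanceMeters(a, b) {
    const R = 6371000; // mean Earth radius in meters
    const rad = deg => deg * Math.PI / 180;
    const dLat = rad(b.lat - a.lat);
    const dLong = rad(b.long - a.long);
    const h = Math.sin(dLat / 2) ** 2
        + Math.cos(rad(a.lat)) * Math.cos(rad(b.lat)) * Math.sin(dLong / 2) ** 2;
    return 2 * R * Math.asin(Math.sqrt(h));
}

// Two points in Amsterdam, roughly 220 meters apart:
const pointA = { lat: 52.359157, long: 4.884155 };
const pointB = { lat: 52.358407, long: 4.881152 };
const withinRadius = distanceMeters(pointA, pointB) <= 250; // true
```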


Consider the following dataset:
```javascript
landmarks: {
    landmark1: {
        name: 'I Amsterdam Sign',
        note: 'This is where it used to be before some crazy mayor decided it had to go',
        location: {
            lat: 52.359157,
            long: 4.884155
        }
    },
    landmark2: {
        name: 'Van Gogh Museum',
        location: {
            lat: 52.358407,
            long: 4.881152
        }
    },
    landmark3: {
        name: 'Rijksmuseum',
        location: {
            lat: 52.359818,
            long: 4.884924
        }
    },
    // ...
}
```

To query all landmarks in range of a given location, create a _geo_ index on the nodes containing `lat` and `long` keys. Then use the `geo:nearby` filter:

```javascript
db.indexes.create('landmarks', 'location', { type: 'geo' })
.then(() => {
    return db.query('landmarks')
    .filter('location', 'geo:nearby', { lat: 52.359157, long: 4.884155, radius: 100 })
    .get();
})
.then(snapshots => {
    // Got all landmarks on Museumplein in Amsterdam (within a radius of 100 meters)
});
```

Indexed locations are stored using 10-character geohashes, which have a precision of about half a square meter.

## Schemas
(NEW since v1.3.0)

In many cases it is desirable to define what data is allowed to be stored in your database, to prevent unexpected errors in your application. It can also prevent a programming error from damaging your database structure or data. By defining schemas for your database, you can prevent data that does not adhere to a schema from being written. All updates and inserts will check the passed data against your defined schemas before writing, and raise an error if validation fails. Any existing data will not be checked.

Note: Schema checking was already available in [acebase-server](https://github.com/appy-one/acebase-server), but its implementation was limited.
For this reason, it was moved closer to the storage code and improved. Additional benefit: schema checks are now available for any AceBase instance (Hello, standalone browser/node.js databases!).

### Adding schemas to enforce data rules

To define a schema, use `db.schema.set(path, schema)`. This will add a schema definition to the specified path, to be enforced for updates and inserts. Schema definitions use TypeScript formatting. For optional properties, append a question mark to the property name, eg: "birthdate?". You can specify one wildcard child property ("*" or "$varname") to check unspecified properties with.

The following types are supported:
* Types returned by `typeof`: `string`, `number`, `boolean`, `object`\*, and `undefined`\*\*
* Class names: `Date`, `Object`\*, (_v1.8.0+_:) `String`, `Number`, `Boolean`
* Interface definitions: `{ "prop1": "string", "prop2": "Date" }`
* Arrays: `string[]`, `number[]`, `Date[]`, `{ "prop": "string" }[]` etc
* Arrays (generic): `Array<Date>`, `Array<string | number>`, `Array<{ "prop1": "string" }>` etc
* Binary: `Binary`, `binary`
* Any type: `any` or `*`
* Combinations: `string | number | Date[]`
* Specific values: `1 | 2 | 3`, `"car" | "boat" | "airplane"`, `true` etc
* Regular expressions (_v1.8.0+_): `/^[A-Z]{2}$/` (NL, EN, DE, US, etc), `/^[a-z.\-_]+@(?:[a-z\-_]+\.){1,}[a-z]{2,}$/i` (email addresses), etc
* Optional values: property names suffixed with `?`

\* Types `object` and `Object` are treated the same way: they allow a given value to be *any* object, *except* `Array`, `Date` and binary values. This means that if you are using custom class mappings, you will be able to store a `Pet` object, but not an `Array`.

\*\* When using type `undefined`, the property will not be allowed to be inserted or updated. This can be useful if your data structure has changed and you want to prevent updates from using the old structure.
For example, if your contacts previously had an "age" property that you are replacing with "birthday", setting the type of "age" to `undefined` will prevent the property from being set or overwritten. Note that an existing "age" property will not be removed, unless its value is set to `null` by an update.

### Schema Examples

```js
// Set schema for users:
await db.schema.set('users/$uid', {
    name: 'string',
    email: 'string',
    "birthdate?": 'Date', // optional birthdate
    "address?": { // optional address
        street: 'string',
        nr: 'number | string',
        "building?": 'string',
        city: 'string',
        postal_code: 'string',
        country: /^[A-Z]{2}$/  // 2 uppercase character strings
    },
    "posts?": 'object', // Optional posts
});

// Set schema for user posts, using string definitions:
await db.schema.set(
    'users/$uid/posts/$postid',
    '{ title: string, text: string, posted: Date, edited?: Date, tags: string[] }'
);

// Set schema for user AND posts in 1 schema definition:
await db.schema.set('users/$uid', {
    name: 'string',
    // ...
    "posts?": {
        // use wildcard "*", or "$postid" for each child:
        "*": {
            title: 'string',
            text: 'string',
            posted: 'Date',
            "edited?": 'Date',
            tags: 'string[]',
        }
    }
});

// Get schema defined for a specific path:
const schemaInfo = await db.schema.get('users/$uid');

// Get all defined schemas
const schemas = await db.schema.all();
```

## Mapping data to custom classes

Mapping data to your own classes allows you to store and load objects to/from the database without them losing their class type. Once you have mapped a database path to a class, you won't ever have to worry about serialization or deserialization of the objects: store a `User`, get a `User`.
Store a `Chat` that has a collection of `Messages`, get a `Chat` with `Messages` back from the database. Any class-specific methods can be executed directly on the objects you get back from the db, because they will be an `instanceof` your class.

By default, AceBase runs your class constructor with a snapshot of the data to instantiate new objects, and uses all properties of your class to serialize them for storage.

```javascript
// User class implementation
class User {
    constructor(snapshot) {
        if (snapshot && snapshot instanceof DataSnapshot) {
            const obj = snapshot.val();
            this.name = obj.name;
        }
    }
}

// Bind to all children of users node
db.types.bind("users", User);
```

You can now do the following:
```javascript
// Create a user
let user = new User();
user.name = 'Ewout';

// Store the user in the database (will be serialized automatically)
const userRef = await db.ref('users').push(user);

// Load user from the db again (will be instantiated with the User constructor)
const userSnapshot = await userRef.get();
let savedUser = userSnapshot.val();
// savedUser is an instance of class User
```

If you are unable (or don't want) to change your class constructor, add a static method named `create` to deserialize stored objects:

```javascript
class Pet {
    // Constructor that takes multiple arguments
    constructor(animal, name) {
        this.animal = animal;
        this.name = name;
    }
    // Static method that instantiates a Pet object
    static create(snap) {
        let obj = snap.val();
        return new Pet(obj.animal, obj.name);
    }
}
// Bind to all pets of any user
db.types.bind("users/*/pets", Pet);
```

If you want to change how your objects are serialized for storage, add a method named `serialize` to your class.
You should do this if your class contains properties that should not be serialized (eg `get` properties).

```javascript
class Pet {
    // ...
    serialize() {
        // manually serialize
        return {
            animal: this.animal,
            name: this.name
        }
    }
}
// Bind
db.types.bind("users/*/pets", Pet);
```

If you want to use other methods for instantiation and/or serialization than the defaults explained above, you can manually specify them in the `bind` call:
```javascript
class Pet {
    // ...
    toDatabase(ref) {
        return {
            animal: this.animal,
            name: this.name
        }
    }
    static fromDatabase(snap) {
        let obj = snap.val();
        return new Pet(obj.animal, obj.name);
    }
}
// Bind using Pet.fromDatabase as object creator and Pet.prototype.toDatabase as serializer
db.types.bind("users/*/pets", Pet, { creator: Pet.fromDatabase, serializer: Pet.prototype.toDatabase });
```

If you want to store native or 3rd-party classes, or don't want to extend your classes with (de)serialization methods:
```javascript
// Storing native RegExp objects
db.types.bind(
    "regular_expressions",
    RegExp, {
        creator: (snap) => {
            let obj = snap.val();
            return new RegExp(obj.pattern, obj.flags);
        },
        serializer: (ref, regex) => {
            // NOTE the regex param: it is provided because we can't use `this` to reference the object
            return { pattern: regex.source, flags: regex.flags };
        }
    }
);
```

## Storage

By default, AceBase uses its own binary database format in Node.js environments, and IndexedDB (or LocalStorage) in the browser, to store its data. However, it is also possible to use AceBase's realtime capabilities and have the actual data stored in other databases.
Currently, AceBase has built-in adapters for MSSQL and SQLite in Node.js environments, and for IndexedDB, LocalStorage and SessionStorage in the browser. It is also possible to create your own custom storage adapters, so wherever you'd want to store your data - it's in your hands!

### Using SQLite or MSSQL storage
(NEW v0.8.0)

From v0.8.0+ it is possible to have AceBase store all data in a SQLite or SQL Server database backend! They're not as fast as the default AceBase binary database (which is about 5x faster), but if you want more control over your data, storing it in a widely used DBMS might come in handy. I developed it to make ports to the browser and/or Android/iOS HTML5 apps easier, so `AceBaseClient`s will be able to store and query data locally as well.

To use a different backend database, simply pass a typed `StorageSettings` object to the `AceBase` constructor. You can use `SQLiteStorageSettings` for a SQLite backend, `MSSQLStorageSettings` for SQL Server, etc.

Dependencies: SQLite requires the `sqlite3` package to be installed from npm (`npm i sqlite3`); MSSQL requires the `mssql` package. mssql uses the tedious driver by default, but if you're on Windows you can also use Microsoft's native SQL Server driver by adding the `msnodesqlv8` package as well, and specifying `driver: 'native'` in the `MSSQLStorageSettings`.

```javascript
// Using SQLite backend:
const db = new AceBase('mydb', new SQLiteStorageSettings({ path: '.' }));

// Or, SQL Server:
const db = new AceBase('mydb', new MSSQLStorageSettings({ server: 'localhost', port: 1433, database: 'MyDB', username: 'user', password: 'secret', (...) }));
```

## Running AceBase in the browser

AceBase is now able to run stand-alone in the browser.
It uses IndexedDB or LocalStorage to store the data, or SessionStorage if you want a temporary database.

NOTE: If you want to connect to a remote [acebase-server](https://www.npmjs.com/package/acebase-server) from the browser instead of running one locally, use [acebase-client](https://www.npmjs.com/package/acebase-client) instead.

You can also use a local database in the browser to sync with an AceBase server. To do this, create your database in the browser and pass it as the `cache` db in `AceBaseClient`'s settings.

If you are using TypeScript (eg with Angular/Ionic) or webpack, add `acebase` to your project (`npm i acebase`) and use:
```typescript
import { AceBase } from 'acebase';
```

Or, include the AceBase script in your html page:
```html
<script type="text/javascript" src="https://cdn.jsdelivr.net/npm/acebase@latest/dist/browser.min.js"></script>
```

Then, create your database and start using it!
```js
// Create an AceBase db using IndexedDB
const db = AceBase.WithIndexedDB('mydb');

await db.ready();
console.log('Database ready to use');

const ref = await db.ref('browser').set({
    test: 'AceBase runs in the browser!'
});
console.log(`"${ref.path}" was saved!`);

const snapshot = await ref.get();
console.log(`Got "${snapshot.ref.path}" value:`, snapshot.val());
```

Or, if you prefer using `Promise`s instead of `async` / `await`:
```js
db.ready(() => {
    console.log('Database ready to use');
    return db.ref('browser').set({
        test: 'AceBase runs in the browser!'
    })
    .then(ref => {
        console.log(`"${ref.path}" was saved!`);
        return ref.get();
    })
    .then(snap => {
        console.log(`Got "${snap.ref.path}" value:`, snap.val());
    });
});
```

If you want AceBase to use localStorage instead, use `AceBase.WithLocalStorage`:
```js
// Create an AceBase db using LocalStorage
const db = AceBase.WithLocalStorage('mydb',
{ temp: false }); // temp:true to use sessionStorage instead
```

### Cross-tab synchronization
(NEW in v1.5.0)

When you're using AceBase with an IndexedDB or LocalStorage backend, you might notice that if you change data in one open tab, those changes do not raise change events in other open tabs monitoring that same data. This is because `IndexedDB` and `LocalStorage` databases do not raise change events themselves, and AceBase won't be able to either if the data was not changed through AceBase itself. To overcome this issue, AceBase has to notify other AceBase instances in different browser tabs of local changes.

AceBase is now able to communicate with other tabs using the `BroadcastChannel` implemented in most browsers\*, and is able to notify others of changes made to the underlying database. This functionality is disabled by default; set `multipleTabs: true` in the options parameter to enable it:

```js
const db = AceBase.WithIndexedDB('mydb', { multipleTabs: true });
```

Once you've enabled this setting, the AceBase instances running in multiple tabs will exchange what events they are listening for, and notify each other of any changes made to the monitored data.

\* If the browser does not support `BroadcastChannel`, a polyfill will be used. [Browser support](https://caniuse.com/broadcastchannel) is currently at 93% (December 2022).

NOTE: This applies to local databases only. If you are using an `AceBaseClient` connected to an `AceBaseServer`, changing something in one browser tab will already notify other tabs, because the events are raised by the AceBase server and sent back to the clients automatically.
If you use a local `AceBase` instance as offline cache for an `AceBaseClient` and have `multipleTabs` enabled, cross-tab synchronization will only be used while offline.

## Using a CustomStorage backend

In addition to the already available binary, SQL Server, SQLite, IndexedDB and LocalStorage backends, it's also possible to roll your own custom storage backend, such as MongoDB, MySQL, WebSQL, etc. To do this, all you have to do is write a couple of methods to get, set, remove and query data within a transactional context. The only prerequisite is that the database provider you use is able to execute queries, or provides some other way to iterate through record entries without having to load them all into memory at once. (Firebase won't do, because it can't do that.)

The example below shows how to implement a `CustomStorage` class that uses the browser's `LocalStorage`, but you can use anything you'd want. It's easy to change the code below to use any other database provider like MongoDB, PostgreSQL, MySQL, etc.

NOTE: The code below is similar to the implementation of `AceBase.WithLocalStorage`.

```typescript
import { AceBase, CustomStorageSettings, CustomStorageTransaction, CustomStorageHelpers, ICustomStorageNodeMetaData, ICustomStorageNode } from 'acebase';

// Setup our CustomStorageSettings
const storageSettings = new CustomStorageSettings({
    name: 'LocalStorage',
    locking: true, // Let AceBase handle resource locking to prevent multiple simultaneous updates to the same data

    async ready() {
        // LocalStorage is always ready
    },

    async getTransaction(target: { path: string, write: boolean }) {
        // Create an instance of our transaction class
        const context = { debug: true, dbname };
        const transaction = new LocalStorageTransaction(context, target);
        return transaction;
    }
});

// Setup CustomStorageTransaction for browser's LocalStorage
class LocalStorageTransaction extends
CustomStorageTransaction {

    constructor(context: { debug: boolean; dbname: string }, target: { path: string, write: boolean }) {
        super(target);
        this.context = context;
        this._storageKeysPrefix = `${this.context.dbname}.acebase::`;
    }

    async commit() {
        // To implement REAL commit and rollback capabilities, we'd have to add pending mutations to a batch,
        // and store them upon commit, or toss them upon rollback. This is what AceBase.WithIndexedDB does, which is also much faster.
        // Here, all changes have already been committed.
    }

    async rollback(err: any) {
        // Not able to roll back changes, they were already committed.
    }

    async get(path: string) {
        // Gets value from localStorage, wrapped in Promise
        const json = localStorage.getItem(this.getStorageKeyForPath(path));
        const val = JSON.parse(json);
        return val;
    }

    async set(path: string, val: any) {
        // Sets value in localStorage, wrapped in Promise
        const json = JSON.stringify(val);
        const key = this.getStorageKeyForPath(path);
        localStorage.setItem(key, json);
    }

    async remove(path: string) {
        // Removes a value from localStorage, wrapped in Promise
        const key = this.getStorageKeyForPath(path);
        localStorage.removeItem(key);
    }

    async childrenOf(
        path: string,
        include: { metadata: boolean; value: boolean; },
        checkCallback: (path: string) => boolean,
        addCallback: (path: string, node: ICustomStorageNodeMetaData | ICustomStorageNode) => boolean,
    ) {
        // Streams all child paths
        // Cannot query localStorage, so loop through all stored keys to find children
        const pathInfo = CustomStorageHelpers.PathInfo.get(path);
        for (let i = 0; i < localStorage.length; i++) {
            const key = localStorage.key(i);
            if
(!key.startsWith(this._storageKeysPrefix)) { continue; }                \n            let otherPath = this.getPathFromStorageKey(key);\n            if (pathInfo.isParentOf(otherPath) \u0026\u0026 checkCallback(otherPath)) {\n                let node;\n                if (include.metadata || include.value) {\n                    const json = localStorage.getItem(key);\n                    node = JSON.parse(json);\n                }\n                const keepGoing = addCallback(otherPath, node);\n                if (!keepGoing) { break; }\n            }\n        }\n    }\n\n    async descendantsOf(\n        path: string, \n        include: { metadata: boolean; value: boolean; },\n        checkCallback: (path: string) =\u003e boolean,\n        addCallback: (path: string, node: ICustomStorageNodeMetaData | ICustomStorageNode) =\u003e boolean,\n    ) {\n        // Streams all descendant paths\n        // Cannot query localStorage, so loop through all stored keys to find descendants\n        const pathInfo = CustomStorageHelpers.PathInfo.get(path);\n        for (let i = 0; i \u003c localStorage.length; i++) {\n            const key = localStorage.key(i);\n            if (!key.startsWith(this._storageKeysPrefix)) { continue; }\n            let otherPath = this.getPathFromStorageKey(key);\n            if (pathInfo.isAncestorOf(otherPath) \u0026\u0026 checkCallback(otherPath)) {\n                let node;\n                if (include.metadata || include.value) {\n                    const json = localStorage.getItem(key);\n                    node = JSON.parse(json);\n                }\n                const keepGoing = addCallback(otherPath, node);\n                if (!keepGoing) { break; }\n            }\n        }\n    }\n\n    /**\n     * Helper function to get the path from a localStorage key\n     */\n    getPathFromStorageKey(key: string) {\n        return key.slice(this._storageKeysPrefix.length);\n    }\n\n    /**\n     * Helper function to get the localStorage 
key for a path\n     */\n    getStorageKeyForPath(path: string) {\n        return `${this._storageKeysPrefix}${path}`;\n    }\n}\n\n// Now, create the database with our custom storage class\nconst db = new AceBase('dbname', { storage: storageSettings });\n\n// Ready to use!\n```\n\n## Reflect API\n\nAceBase has a built-in reflection API that enables browsing the database content without retrieving any (nested) data. This API is available for local databases, and remote databases when signed in as the `admin` user or on paths the authenticated user has access to.\n\nThe reflect API is also used internally: AceBase server's webmanager uses it to allow database exploration, and the `DataReference` class uses it to deliver results for `count()` and initial `notify_child_added` event callbacks.\n\n### Get information about a node\n\nTo get information about a node and its children, use an `info` query:\n\n```javascript\n// Get info about the root node and a maximum of 200 children:\ndb.root.reflect('info', { child_limit: 200, child_skip: 0 })\n.then(info =\u003e { /* ... 
*/ });\n```\n\nThe above example will return an info object with the following structure:\n```json\n{ \n    \"key\": \"\",\n    \"exists\": true, \n    \"type\": \"object\",\n    \"children\": { \n        \"more\": false, \n        \"list\": [\n            { \"key\": \"appName\", \"type\": \"string\", \"value\": \"My social app\" },\n            { \"key\": \"appVersion\", \"type\": \"number\", \"value\": 1 },\n            { \"key\": \"posts\", \"type\": \"object\" }\n        ] \n    } \n}\n```\n\nTo get the number of children of a node (instead of enumerating them), pass `{ child_count: true }` with the `info` reflect request:\n\n```javascript\nconst info = await db.ref('chats/somechat/messages')\n    .reflect('info', { child_count: true });\n```\n\nThis will return an info object with the following structure:\n```json\n{ \n    \"key\": \"messages\",\n    \"exists\": true, \n    \"type\": \"object\",\n    \"children\": { \n        \"count\": 879\n    }\n}\n```\n\n### Get children of a node\n\nTo get information about the children of a node, use the `children` reflection query:\n```javascript\nconst children = await db.ref('chats/somechat/messages')\n    .reflect('children', { limit: 10, skip: 0 });\n```\n\nThe returned children object in the above example will have the following structure:\n```json\n{\n    \"more\": true,\n    \"list\": {\n        \"message1\": { \"type\": \"object\" },\n        \"message2\": { \"type\": \"object\" },\n        // ...\n        \"message10\": { \"type\": \"object\" }\n    }\n}\n```\n\n## Export API \n(NEW v0.9.1)\n\nTo export data from any node to JSON, you can use the export API. Simply pass a `write` function (or an object with a `write` method) to `yourRef.export`, and the entire node's value (including nested data) will be streamed in `json` format. If your `write` function returns a `Promise`, streaming will be paused until the promise resolves (local databases only). 
You can use this to back off writing if the target stream's buffer is full (eg while waiting for a file stream to \"drain\"). This API is available for local databases, and remote databases \n~~when signed in as the `admin` user~~ (from server v0.9.29+) on paths the authenticated user has access to.\n\n```javascript\nlet json = '';\nconst write = str =\u003e {\n    json += str;\n};\ndb.ref('posts').export(write)\n.then(() =\u003e {\n    console.log('All posts have been exported:');\n    console.log(json);\n})\n```\n\nTo export to a file in Node.js, you could use a filestream:\n```js\nconst stream = fs.createWriteStream('export.json', { flags: 'w+' });\nconst write = chunk =\u003e {\n    const ok = stream.write(chunk);\n    if (!ok) {\n        return new Promise(resolve =\u003e stream.once('drain', resolve));\n    }\n};\nawait db.root.export(write); // Export all data\nstream.close();\n```\n\n### Type safety\nAny data that cannot be expressed natively in JSON format (such as Dates and binary data) is exported type-safe, using an object that describes the content. This is the default behaviour since v1.13.0.\n\nFor example: a Date will be exported like `\"date\":{\".type\":\"Date\",\".val\":\"2021-12-31T11:55:14.380Z\"}`, and binary data like `\"binary\":{\".type\":\"Buffer\",\".val\":\"\u003c~@VK^gEd8d\u003c@\u003c\u003eo~\u003e\"}`.\n\nIf you do not want to use this type-safe formatting, you can disable it by setting the `type_safe` option: `ref.export(write, { type_safe: false })`.\n\n## Import API \n(NEW v1.13.0)\n\nIf you need to import large amounts of data, it is recommended to use the new import API, which efficiently streams a JSON input source into the database without acquiring long-blocking write locks. 
This leaves your database responsive for other processes and eliminates the need to load your entire source into memory.\n\nExample:\n```js\nconst fd = fs.openSync('data.json', 'r');\nconst read = length =\u003e {\n    return new Promise((resolve, reject) =\u003e {\n        const buffer = new Uint8Array(length);\n        fs.read(fd, buffer, 0, length, null, (err, bytesRead) =\u003e {\n            if (err) { reject(err); }\n            else { resolve(buffer.slice(0, bytesRead)); } // Near EOF, fewer bytes than requested may have been read\n        });\n    });\n};\nawait db.ref(path).import(read);\nfs.closeSync(fd);\n```\n\nNOTE: If you have transaction logging enabled, the import will cause many smaller updates to be logged, instead of just one.\n\n## Transaction Logging\n(NEW v1.8.0, BETA, AceBase binary databases only)\n\nAceBase now supports transaction logging to facilitate sophisticated synchronization options and custom data recovery. Using cursors that indicate certain points in time, this allows for fast and easy synchronization of data between an AceBase server and multiple clients, or other server instances. This functionality is currently in BETA stage and will be tested extensively in the coming weeks. \n\nTo enable transaction logging on your database, add the `transactions` setting to the AceBase constructor:\n```js\nconst db = new AceBase('mydb', { transactions: { log: true, maxAge: 30, noWait: false } });\n```\n\nMore documentation will follow soon; for now, see the `transaction-logs.spec.js` unit tests for more info.\n\n## Multi-process support\n\nAceBase supports running in multiple processes by using interprocess communication (IPC). If your app runs in a standard Node.js cluster, AceBase is able to communicate with each process through Node.js's built-in `cluster` functionality. If your app runs in the browser, AceBase will use `BroadcastChannel` (or a shim for Safari) to communicate with other browser tabs. 
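The cross-tab mechanism can be illustrated with a minimal sketch. This is not AceBase's actual implementation: the `createTabSync` helper, the channel name and the `{ path, value }` message shape are all made up for this example; only the underlying `BroadcastChannel` API is real (available in browsers and in Node.js 18+):

```js
// Minimal sketch of cross-tab change notifications over BroadcastChannel.
// NOTE: illustrative only - the helper, channel name and message format
// are hypothetical, not what AceBase uses internally.
function createTabSync(channelName, onRemoteChange) {
    const channel = new BroadcastChannel(channelName);
    // A channel never receives its own messages, only those posted by
    // other channels (tabs/processes) bound to the same name
    channel.onmessage = event => onRemoteChange(event.data);
    return {
        publish(path, value) {
            // Notify all other listeners that data at `path` changed
            channel.postMessage({ path, value });
        },
        close() {
            channel.close();
        },
    };
}
```

A tab would call `publish` after each local mutation, and apply (or invalidate) cached data in `onRemoteChange`. Remember to `close()` the channel when done; in Node.js an open channel keeps the process alive.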
\n\nIf you are using pm2 to run your app in a cluster, or run your app in a cloud-based cluster (eg Kubernetes, Docker Swarm), AceBase instances will need some other way to communicate with each other. This is now possible using an AceBase IPC Server, which allows fast communication using websockets. See [AceBase IPC Server](https://github.com/appy-one/acebase-ipc-server) for more info!\n\n**NEW** (v1.28.0): AceBase now supports an IPC mode that enables isolated processes on a single machine to access the same database simultaneously, without them having to set up an IPC cluster. By setting the storage `ipc` setting to `'socket'`, AceBase will launch (or connect to) a dedicated service process that communicates through very fast in-memory Unix sockets, or named pipes on Windows. The service will automatically shut down once the database is no longer being accessed by any running process. This will become the default IPC setting in the future.\n\n## CommonJS and ESM packages\nThe TypeScript sources are compiled to both CommonJS and ESM module systems. The sources loaded depend on whether you `import` or `require` acebase:\n| Statement                                  | Module system | Target |\n|--------------------------------------------|---------------|--------|\n| import { AceBase } from 'acebase'          | ESM           | ES2020 |\n| const { AceBase } = require('acebase')     | CommonJS      | ES2017 |\n\nSee https://github.com/appy-one/acebase/discussions/98 for more info.\n\n## Upgrade notices\n\n* v0.9.68 - To get the updating context used in data event handlers, read from `snap.context()` instead of `snap.ref.context()`. This is to prevent further updates on `snap.ref` from using the same context. 
If you need to reuse the event context for new updates, you will have to set it manually: `snap.ref.context(snap.context()).update(...)`\n\n* v0.7.0 - Changed the DataReference.vars object for subscription events; it now contains all values for path wildcards and variables with their index, and (for named variables) `name` and ($-)prefixed `$name`. The `wildcards` array has been removed. See *Using variables and wildcards in subscription paths* in the documentation above.\n\n* v0.6.0 - Changed the `db.types.bind` method signature. Serialization and creator functions can now also access the `DataReference` for the object being serialized/instantiated; this enables the use of path variables.\n\n* v0.4.0 - Introduced fulltext, geo and array indexes. This required changes to the index file format; you will have to delete all index files and create them again using `db.indexes.create`.\n\n## Known issues\n\n* No currently known issues. Please submit any issues you find in the relevant GitHub repository. For this repository, go to [AceBase issues](https://github.com/appy-one/acebase/issues)\n\n## Authors\n\n* **Ewout Stortenbeker** - *Initial work* - \u003cme@appy.one\u003e\n* **You?** Please contribute!\n\n## Contributing\n\nIf you would like to contribute to help move the project forward, you are welcome to do so!\nWhat can you help me with?\n\n* Bugfixes - if you find bugs, please create a new issue on GitHub. If you know how to fix one, feel free to submit a pull request or drop me an email\n* Enhancements - if you've got code to make AceBase even faster or better, you're welcome to contribute!\n* Ports - if you would like to port `AceBaseClient` to other languages (Java, Swift, C#, etc), that would be awesome!\n* Ideas - I love new ideas, share them!\n* Money - I am an independent developer and many (MANY) months were put into developing this. 
I also have a family to feed, so if you like AceBase, send me a donation or become a sponsor ♥\n\n## Sponsoring\n\nIf you use AceBase, please consider supporting its development by [sponsoring](https://github.com/sponsors/appy-one) the project, buying me a [coffee](https://www.buymeacoffee.com/appyone) or sending a [donation](https://paypal.me/theappyone). \nIf you are employed, ask your supervisor to sponsor AceBase and any other open source projects you use.\n\n[![Sponsor AceBase](img/sponsor.svg)](https://github.com/sponsors/appy-one)\n\nThanks, Ewout\n\n## Share\n\nPlease tell others about AceBase!\n\n[![Tweet about AceBase](img/tweet.svg)](https://twitter.com/intent/tweet?button=\u0026url=https://github.com/appy-one/acebase\u0026text=I%27m+using+@AcebaseRealtime+in+my+project+to+make+my+life+easier!\u0026button=)\n