{"id":31530209,"url":"https://github.com/frictionlessdata/frictionless-js","last_synced_at":"2025-10-04T01:19:31.814Z","repository":{"id":40295301,"uuid":"99835338","full_name":"frictionlessdata/frictionless-js","owner":"frictionlessdata","description":"A lightweight, standardized library accessing files and datasets, especially tabular ones (CSV, Excel).","archived":false,"fork":false,"pushed_at":"2023-03-04T03:54:27.000Z","size":19156,"stargazers_count":73,"open_issues_count":32,"forks_count":8,"subscribers_count":9,"default_branch":"master","last_synced_at":"2025-09-21T00:55:05.231Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"https://frictionlessdata.io","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/frictionlessdata.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2017-08-09T17:30:35.000Z","updated_at":"2025-02-10T19:10:17.000Z","dependencies_parsed_at":"2024-06-18T18:23:02.177Z","dependency_job_id":"167ba9cc-5f4d-4c75-887a-cfe974a0d2e9","html_url":"https://github.com/frictionlessdata/frictionless-js","commit_stats":{"total_commits":268,"total_committers":11,"mean_commits":"24.363636363636363","dds":0.6567164179104478,"last_synced_commit":"fa7b8f4f36880159dada651309a81a2068e7d0c2"},"previous_names":["datopian/data.js","datahq/data.js"],"tags_count":16,"template":false,"template_full_name":null,"purl":"pkg:github/frictionlessdata/frictionless-js","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/frictionlessdata%2Ffrictionless-js","tags_url":"https://repos.ecosyste.m
s/api/v1/hosts/GitHub/repositories/frictionlessdata%2Ffrictionless-js/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/frictionlessdata%2Ffrictionless-js/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/frictionlessdata%2Ffrictionless-js/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/frictionlessdata","download_url":"https://codeload.github.com/frictionlessdata/frictionless-js/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/frictionlessdata%2Ffrictionless-js/sbom","scorecard":{"id":325064,"data":{"date":"2025-08-11","repo":{"name":"github.com/frictionlessdata/frictionless-js","commit":"fa7b8f4f36880159dada651309a81a2068e7d0c2"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":2.4,"checks":[{"name":"Token-Permissions","score":0,"reason":"detected GitHub workflow tokens with excessive permissions","details":["Warn: no topLevel permission defined: .github/workflows/node.js.yml:1","Info: no jobLevel write permissions found"],"documentation":{"short":"Determines if the project's workflows follow the principle of least privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Dangerous-Workflow","score":10,"reason":"no dangerous workflow patterns detected","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and 
uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"Code-Review","score":-1,"reason":"Found no human activity in the last 15 changesets","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Maintained","score":0,"reason":"0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"License","score":0,"reason":"license file not detected","details":["Warn: project does not have a license file"],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Signed-Releases","score":-1,"reason":"no releases found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release 
artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":0,"reason":"branch protection not enabled on development/release branches","details":["Warn: branch protection not enabled for branch 'master'"],"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"Pinned-Dependencies","score":0,"reason":"dependency not pinned by hash detected -- score normalized to 0","details":["Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/node.js.yml:18: update your workflow using https://app.stepsecurity.io/secureworkflow/frictionlessdata/frictionless-js/node.js.yml/master?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/node.js.yml:20: update your workflow using https://app.stepsecurity.io/secureworkflow/frictionlessdata/frictionless-js/node.js.yml/master?enable=pin","Info:   0 out of   2 GitHub-owned GitHubAction dependencies pinned"],"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations 
found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 30 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}},{"name":"Vulnerabilities","score":0,"reason":"106 existing vulnerabilities detected","details":["Warn: Project is vulnerable to: GHSA-968p-4wvh-cqc8","Warn: Project is vulnerable to: GHSA-67hx-6x53-jw92","Warn: Project is vulnerable to: GHSA-93q8-gq69-wqmw","Warn: Project is vulnerable to: GHSA-cph5-m8f7-6c5x","Warn: Project is vulnerable to: GHSA-wf5p-g6vw-rhxx","Warn: Project is vulnerable to: GHSA-jr5f-v2jv-69x6","Warn: Project is vulnerable to: GHSA-qwcr-r2fm-qrc7","Warn: Project is vulnerable to: GHSA-v6h2-p8h4-qcjw","Warn: Project is vulnerable to: GHSA-cwfw-4gq5-mrqx","Warn: Project is vulnerable to: GHSA-g95f-p29q-9xw4","Warn: Project is vulnerable to: GHSA-grv7-fg5c-xmjg","Warn: Project is vulnerable to: GHSA-x9w5-v3q2-3rhw","Warn: Project is vulnerable to: GHSA-pxg6-pf52-xh8x","Warn: Project is vulnerable to: GHSA-3xgq-45jj-v275","Warn: Project is vulnerable to: GHSA-gxpj-cx7g-858c","Warn: Project is vulnerable to: GHSA-w573-4hg7-7wgq","Warn: Project is vulnerable to: GHSA-434g-2637-qmqr","Warn: Project is vulnerable to: GHSA-49q7-c7j4-3p7m","Warn: Project is vulnerable to: GHSA-977x-g7h5-7qgw","Warn: Project is vulnerable to: GHSA-f7q4-pwc6-w24p","Warn: Project is vulnerable to: GHSA-fc9h-whq2-v747","Warn: Project is vulnerable to: GHSA-vjh7-7g9h-fjfh","Warn: Project is vulnerable to: GHSA-j4f2-536g-r55m","Warn: Project is vulnerable to: GHSA-r7qp-cfhv-p84w","Warn: Project is vulnerable to: GHSA-2j2x-2gpw-g8fm","Warn: 
Project is vulnerable to: GHSA-pw2r-vq6v-hr8c","Warn: Project is vulnerable to: GHSA-jchw-25xp-jwwc","Warn: Project is vulnerable to: GHSA-cxjh-pqwp-8mfp","Warn: Project is vulnerable to: GHSA-fjxv-7rqg-78g4","Warn: Project is vulnerable to: GHSA-8r6j-v8pm-fqw3","Warn: Project is vulnerable to: MAL-2023-462","Warn: Project is vulnerable to: GHSA-4q6p-r6v2-jvc5","Warn: Project is vulnerable to: GHSA-ww39-953v-wcq6","Warn: Project is vulnerable to: GHSA-pfrx-2q88-qq97","Warn: Project is vulnerable to: GHSA-rc47-6667-2j5j","Warn: Project is vulnerable to: GHSA-7r28-3m3f-r2pr","Warn: Project is vulnerable to: GHSA-r8j5-h5cx-65gg","Warn: Project is vulnerable to: GHSA-896r-f27r-55mw","Warn: Project is vulnerable to: GHSA-9c47-m6qq-7p4h","Warn: Project is vulnerable to: GHSA-7x7c-qm48-pq9c","Warn: Project is vulnerable to: GHSA-rc3x-jf5g-xvc5","Warn: Project is vulnerable to: GHSA-6c8f-qphg-qjgp","Warn: Project is vulnerable to: GHSA-76p3-8jx3-jpfq","Warn: Project is vulnerable to: GHSA-3rfm-jhwj-7488","Warn: Project is vulnerable to: GHSA-hhq3-ff78-jv3g","Warn: Project is vulnerable to: GHSA-29mw-wpgm-hmr9","Warn: Project is vulnerable to: GHSA-35jh-r3h4-6jhm","Warn: Project is vulnerable to: GHSA-p6mc-m468-83gw","Warn: Project is vulnerable to: GHSA-82v2-mx6x-wq7q","Warn: Project is vulnerable to: GHSA-6vfc-qv3f-vr6c","Warn: Project is vulnerable to: GHSA-5v2h-r2cx-5xgj","Warn: Project is vulnerable to: GHSA-rrrm-qjm4-v8hf","Warn: Project is vulnerable to: GHSA-4r62-v4vq-hr96","Warn: Project is vulnerable to: GHSA-952p-6rrq-rcjv","Warn: Project is vulnerable to: GHSA-f8q6-p94x-37v3","Warn: Project is vulnerable to: GHSA-xvch-5gv4-984h","Warn: Project is vulnerable to: GHSA-8hfj-j24r-96c4","Warn: Project is vulnerable to: GHSA-wc69-rhjr-hc9g","Warn: Project is vulnerable to: GHSA-r683-j2x4-v87g","Warn: Project is vulnerable to: GHSA-5rrq-pxf6-6jx5","Warn: Project is vulnerable to: GHSA-8fr3-hfg3-gpgp","Warn: Project is vulnerable to: GHSA-gf8q-jrpm-jvxq","Warn: Project 
is vulnerable to: GHSA-2r2c-g63r-vccr","Warn: Project is vulnerable to: GHSA-cfm4-qjh2-4765","Warn: Project is vulnerable to: GHSA-x4jg-mjrx-434g","Warn: Project is vulnerable to: GHSA-92xj-mqp7-vmcj","Warn: Project is vulnerable to: GHSA-wxgw-qj99-44c2","Warn: Project is vulnerable to: GHSA-px4h-xg32-q955","Warn: Project is vulnerable to: GHSA-rp65-9cf3-cjxr","Warn: Project is vulnerable to: GHSA-6fx8-h7jm-663j","Warn: Project is vulnerable to: GHSA-g6ww-v8xp-vmwg","Warn: Project is vulnerable to: GHSA-h7cp-r72f-jxh6","Warn: Project is vulnerable to: GHSA-v62p-rq8g-8h59","Warn: Project is vulnerable to: GHSA-566m-qj78-rww5","Warn: Project is vulnerable to: GHSA-7fh5-64p2-3v2j","Warn: Project is vulnerable to: GHSA-hwj9-h5mp-3pm3","Warn: Project is vulnerable to: GHSA-hrpp-h998-j3pp","Warn: Project is vulnerable to: GHSA-p8p7-x288-28g6","Warn: Project is vulnerable to: GHSA-c2qf-rxjj-qqgw","Warn: Project is vulnerable to: GHSA-m6fv-jmcg-4jfg","Warn: Project is vulnerable to: GHSA-cm22-4g7w-348p","Warn: Project is vulnerable to: GHSA-4g88-fppr-53pp","Warn: Project is vulnerable to: GHSA-4jqc-8m5r-9rpr","Warn: Project is vulnerable to: GHSA-g4rg-993r-mgx7","Warn: Project is vulnerable to: GHSA-25hc-qcg6-38wj","Warn: Project is vulnerable to: GHSA-qm95-pgcg-qqfq","Warn: Project is vulnerable to: GHSA-cqmj-92xf-r6r9","Warn: Project is vulnerable to: GHSA-mxhp-79qh-mcx6","Warn: Project is vulnerable to: GHSA-f5x3-32g6-xq36","Warn: Project is vulnerable to: GHSA-4wf5-vphf-c2xc","Warn: Project is vulnerable to: GHSA-52f5-9888-hmc6","Warn: Project is vulnerable to: GHSA-72xf-g2v4-qvf3","Warn: Project is vulnerable to: GHSA-38fc-wpqx-33j7","Warn: Project is vulnerable to: GHSA-394c-5j6w-4xmx","Warn: Project is vulnerable to: GHSA-78cj-fxph-m83p","Warn: Project is vulnerable to: GHSA-fhg7-m89q-25r3","Warn: Project is vulnerable to: GHSA-cf4h-3jhx-xvhq","Warn: Project is vulnerable to: GHSA-j8xg-fqg3-53r7","Warn: Project is vulnerable to: GHSA-3h5v-q93c-6h6q","Warn: Project 
is vulnerable to: GHSA-6fc8-4gx4-v693","Warn: Project is vulnerable to: GHSA-4r6h-8v6p-xvw6","Warn: Project is vulnerable to: GHSA-5pgg-2g8v-p4x9","Warn: Project is vulnerable to: GHSA-72mh-269x-7mh5","Warn: Project is vulnerable to: GHSA-h4j5-c7cj-74xg","Warn: Project is vulnerable to: GHSA-c4w7-xm78-47vh","Warn: Project is vulnerable to: GHSA-p9pc-299p-vxgp"],"documentation":{"short":"Determines if the project has open, known unfixed vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}}]},"last_synced_at":"2025-08-18T02:13:59.916Z","repository_id":40295301,"created_at":"2025-08-18T02:13:59.916Z","updated_at":"2025-08-18T02:13:59.916Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":278252370,"owners_count":25956287,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-03T02:00:06.070Z","response_time":53,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-10-04T01:19:17.534Z","updated_at":"2025-10-04T01:19:31.803Z","avatar_url":"https://github.com/frictionlessdata.png","language":"JavaScript","readme":"`frictionless.js` is a lightweight, standardized \"stream-plus-metadata\" interface for accessing files and datasets, especially tabular ones (CSV, Excel).\n\n`frictionless.js` follows the [\"Frictionless Data Lib Pattern\"][fd-pattern].\n\n[fd-pattern]: 
http://okfnlabs.org/blog/2018/02/15/design-pattern-for-a-core-data-library.html#dataset\n\n* **Open it fast**: simple `open` method for data on disk, online and inline\n* **Data plus**: data plus metadata (size, path, etc.) in a standardized way\n* **Stream it**: raw streams and object streams\n* **Tabular**: open CSV, Excel or arrays and get a row stream\n* **Frictionless**: compatible with [Frictionless Data standards][fd]\n\n[![Build Status](https://travis-ci.org/frictionlessdata/frictionless-js.svg?branch=master)](https://travis-ci.org/frictionlessdata/frictionless-js) [![Gitter](https://img.shields.io/gitter/room/frictionlessdata/chat.svg)](https://gitter.im/datahubio/chat)\n\nA line of code is worth a thousand words ...\n\n```\nconst {open} = require('frictionless.js')\n\nvar file = open('path/to/ons-mye-population-totals.xls')\n\nfile.descriptor\n  {\n    path: '/path/to/ons-mye-population-totals.xls',\n    pathType: 'local',\n    name: 'ons-mye-population-totals',\n    format: 'xls',\n    mediatype: 'application/vnd.ms-excel',\n    encoding: 'windows-1252'\n  }\n\nfile.size\n  67584\n\nfile.rows() =\u003e stream object for rows\n  // keyed by header row by default ...\n  { 'col1': 1, 'col2': 2, ... }\n  { 'col1': 10, 'col2': 20, ... 
}\n```\n\n\n\u003c!-- START doctoc generated TOC please keep comment here to allow auto update --\u003e\n\u003c!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE --\u003e\n**Table of Contents**\n\n- [Motivation](#motivation)\n- [Features](#features)\n- [Installation](#installation)\n- [Browser](#browser)\n- [Usage](#usage)\n- [API](#api)\n  - [open](#open)\n  - [Files](#files)\n    - [load](#load)\n    - [Metadata](#metadata)\n    - [stream](#stream)\n    - [buffer](#buffer)\n    - [rows](#rows)\n  - [Datasets](#datasets)\n    - [load](#load-1)\n    - [addResource](#addresource)\n  - [Utilities](#utilities)\n    - [isDataset](#isdataset)\n    - [parseDatasetIdentifier](#parsedatasetidentifier)\n- [Developers](#developers)\n\n\u003c!-- END doctoc generated TOC please keep comment here to allow auto update --\u003e\n\n## Motivation\n\n`frictionless.js` is motivated by the following use cases:\n\n* **Data \"plus\"**: when you work with data you always find yourself needing the data itself plus a little bit more -- things like where the data came from on disk (or is going to), or how large it is. This library gives you that information in a standardized way.\n* **Convenient open**: the same simple `open` method whether you are accessing data on disk, from a URL or inline data from a string, buffer or array.\n* **Streams (or strings)**: a standardized iterator / object stream interface to data wherever it was loaded from: online, on disk or inline.\n* **Building block for data pipelines**: provides a standardized building block for more complex data processing. For example, suppose you want to load a CSV file and then write it to JSON. That's simple enough. But then suppose you want to delete the first 3 rows and delete the 2nd column. Now you have a more complex processing pipeline. 
This library provides a simple set of \"data plus metadata\" objects that you can pass along your pipeline.\n\n## Features\n\n* Easy: a single common API for local, online and inline data\n* Micro: The whole project is ~400 lines of code\n* Simple: Oriented to a single purpose\n* Explicit: No hidden behaviours, no extra magic\n* Frictionless: uses and supports (but does not require) [Frictionless Data][fd] specs such as [Data Package][dp] so you can leverage Frictionless tooling\n* Minimal glue: Use on its own or as a building block for more complex data tooling (thanks to its common minimal metadata)\n\n[fd]: https://frictionlessdata.io/\n[dp]: https://frictionlessdata.io/data-packages/\n\n## Installation\n\n`npm install frictionless.js`\n\n\n## Browser\n\nTo use it in the browser, first you need to build the bundle.\n\nRun the following command to generate the bundles for the necessary JS targets:\n\n`yarn build`\n\nThis will create two bundles in the `dist` folder. The `node` sub-folder contains the build for the Node environment, while the `browser` sub-folder contains the build for the browser. In a simple HTML file you can use it like this:\n```html\n\u003chead\u003e\n  \u003cscript src=\"./dist/browser/bundle.js\"\u003e\u003c/script\u003e\n  \u003cscript\u003e\n    // Global data lib is available here...\n    \n    const file = data.open('path/to/file')\n    ...\n  \u003c/script\u003e\n\u003c/head\u003e\n\u003cbody\u003e\u003c/body\u003e\n```\n\n## Usage\n\nWith a simple file:\n\n```javascript\nconst data = require('frictionless.js')\n\n// path can be local or remote\nconst file = data.open(path)\n\n// descriptor with metadata e.g. name, path, format, (guessed) mimetype etc\nconsole.log(file.descriptor)\n\n// returns promise with raw stream\nconst stream = await file.stream()\n\n// let's get an object stream of the rows\n// (assuming it is tabular i.e. 
csv, xls etc)\nconst rows = await file.rows()\n\n// entire file as a buffer\nconst buffer = await file.buffer\n\n// for large files you can read the buffer in chunks\nawait file.bufferInChunks((chunk, progress)=\u003e{\n  console.log(progress, chunk)\n})\n```\n\nWith a Dataset:\n\n```javascript\nconst { Dataset } = require('frictionless.js')\n\nconst path = '/path/to/directory/' // must have datapackage.json in the directory atm\n\nDataset.load(path).then(dataset =\u003e {\n  // get a data file in this dataset\n  const file = dataset.resources[0]\n\n  const data = file.stream()\n})\n```\n\n## API\n\n\n### open\n\nLoad a file from a path or descriptor.\n\n`\nopen(pathOrDescriptor, {basePath, format}={})\n`\n\nThere are three types of file source we support:\n\n* Local path\n* Remote URL\n* Inline data\n\n```javascript\nconst data = require('frictionless.js')\n\nconst file = data.open('/path/to/file.csv')\n\nconst file = data.open('https://example.com/data.xls')\n\n// loading raw data\nconst file = data.open({\n  name: 'mydata',\n  data: { // can be any javascript - an object, an array or a string or ...\n    a: 1,\n    b: 2\n  }\n})\n\n// Loading with a descriptor - this allows more fine-grained configuration\n// The descriptor should follow the Frictionless Data Resource model\n// http://specs.frictionlessdata.io/data-resource/\nconst file = data.open({\n  // file or url path\n  path: 'https://example.com/data.csv',\n  // a Table Schema - https://specs.frictionlessdata.io/table-schema/\n  schema: {\n    fields: [\n      ...\n    ]\n  },\n  // CSV dialect - https://specs.frictionlessdata.io/csv-dialect/\n  dialect: {\n    // this is tab separated CSV/DSV\n    delimiter: '\\t'\n  }\n})\n```\n\n`basePath`: use in cases where you want to create a File with a path that is relative to a base directory / path, e.g.\n\n```\nconst file = data.open('data.csv', {basePath: '/my/base/path'})\n```\n\nThis will open the file `/my/base/path/data.csv`.\n\nThis functionality is primarily useful when 
using Files as part of Datasets where it can be convenient for a File to have a path relative to the directory of the Dataset. (See also Data Package and Data Resource in the Frictionless Data specs.)\n\n\n### Files\n\nA single data file - local or remote.\n\n#### load\n\n*DEPRECATED*. Use `open` instead.\n\n#### Metadata\n\nMain metadata is available via the `descriptor`:\n\n```\nfile.descriptor\n```\n\nThis metadata is a combination of the metadata passed in at File creation (if you created the File with a descriptor object) and auto-inferred information from the File path. This is the info that is auto-inferred:\n\n```\npath: path this was instantiated with - may not be same as file.path (depending on basePath)\npathType: remote | local\nname:   file name (without extension)\nformat: the extension\nmediatype: mimetype based on file name and extension\n```\n\nIn addition to this metadata there are certain properties which are computed on demand:\n\n```javascript\n// the full path to the file (using basePath)\nconst path = file.path\n\nconst size = file.size\n\n// md5 hash of the file\nconst hash = file.hash()\n\n// sha256 hash of the file\nconst hash256 = file.hash('sha256')\n\n// file encoding\nconst encoding = file.encoding\n```\n\n**Note**: size and hash are not available for remote Files (those created from URLs).\n\n\n#### stream\n\n`stream()`\n\nGet a readable stream.\n\n@returns Promise with readable stream object on resolve\n\n#### buffer\n\n`File.buffer`\n\nGet this file as a buffer (async).\n\n@returns: promise which resolves to the buffer\n\n#### rows\n\n`rows({keyed}={})`\n\nGet the rows for this file as a Node object stream (assumes the underlying data is tabular!)\n\n@returns Promise with rows as parsed JS objects (depends on file format)\n\n* `keyed`: if `false` (default) returns rows as arrays. 
If `true` returns rows as objects.\n\nTODO: casting (does data get cast automatically for you or not ...)\n\n**What formats are supported?**\n\nThe rows functionality is currently available for CSV and Excel files. The tabular support incorporates support for Table Schema and CSV Dialect, e.g. you can do:\n\n```javascript\n// load a CSV with a non-standard dialect e.g. tab separated or semi-colon separated\nconst file = data.open({\n  path: 'mydata.tsv',\n  // Full support for http://specs.frictionlessdata.io/csv-dialect/\n  dialect: {\n    delimiter: '\\t' // for tabs or ';' for semi-colons etc\n  }\n})\n\n// open a CSV with a Table Schema\nconst file = data.open({\n  path: 'mydata.csv',\n  // Full support for Table Schema https://specs.frictionlessdata.io/table-schema/\n  schema: {\n    fields: [\n      {\n        name: 'Column 1',\n        type: 'integer'\n      },\n      ...\n    ]\n  }\n})\n```\n\n\n### Datasets\n\nA collection of data files with optional metadata.\n\nUnder the hood it heavily uses Data Package formats, natively supporting them including loading from `datapackage.json` files. However, it does not require knowledge or use of Data Packages.\n\nA Dataset has four primary properties:\n\n* `descriptor`: key metadata. The descriptor follows the Data Package spec\n* `resources`: an array of the Files contained in this Dataset\n* `identifier`: the identifier encapsulates the location (or origin) of this Dataset\n* `readme`: the README for this Dataset (if it exists). The readme content is taken from the README.md file located in the Dataset root directory, or, if that does not exist, from the `readme` property on the descriptor. 
If neither of those exists, the readme will be undefined or null.\n\nIn addition we provide the convenience attributes:\n\n* `path`: the path (remote or local) to this dataset\n* `dataPackageJsonPath`: the path to the `datapackage.json` for this Dataset (if it exists)\n\n#### load\n\nTo create a new Dataset object use `Dataset.load`. It takes a descriptor object or an identifier string:\n\n```\nasync Dataset.load(pathOrDescriptor, {owner = null} = {})\n```\n\n* `pathOrDescriptor` - can be one of:\n  * local path to a Dataset\n  * remote URL to a Dataset\n  * descriptor object\n* @returns: a fully loaded Dataset (with `datapackage.json` parsed and `README.md` loaded, if the README exists)\n\nFor example:\n\n```javascript\nconst data = require('frictionless.js')\n\nconst pathOrDescriptor = 'https://raw.githubusercontent.com/datasets/co2-ppm/master/datapackage.json'\nconst dataset = await data.Dataset.load(pathOrDescriptor)\n```\n\n#### addResource\n\nAdd a resource to the Dataset:\n\n```javascript\naddResource(resource)\n```\n\n* `resource`: may be an already instantiated File object or a resource descriptor\n* @returns: null\n\n\n### Utilities\n\n#### isDataset\n\n```javascript\n// seeks to guess whether a given path is the path to a Dataset or a File\n// (i.e. 
a directory or datapackage.json)\ndata.isDataset(path)\n```\n\n#### parseDatasetIdentifier\n\n```javascript\n// parses a dataset path and returns an identifier dictionary\n// handles local paths and remote URLs as well as DataHub- and GitHub-specific URLs\n// (e.g., https://datahub.io/core/finance-vix or https://github.com/datasets/finance-vix)\nconst identifier = data.parseDatasetIdentifier(path)\n\nconsole.log(identifier)\n```\n\nand it prints out:\n\n```\n{\n    name: \u003cname\u003e,\n    owner: \u003cowner\u003e,\n    path: \u003cpath\u003e,\n    type: \u003ctype\u003e,\n    original: \u003cpath\u003e,\n    version: \u003cversion\u003e\n}\n```\n\n## Developers\n\nRequirements:\n\n* NodeJS \u003e= v8.10.0\n* NPM \u003e= v5.2.0\n\n\n### Test\n\nWe have two types of tests: Karma-based for browser testing, and Mocha with Chai for Node. All Node tests are in the `datajs/test` folder. Since Mocha is sensitive to test naming, we keep a separate `/browser-test` folder for Karma only.\n\n- To run browser tests, first build the library so the bundle exists in the `dist/browser` folder: `yarn build:browser`. Then run `yarn test:browser` to execute the Karma tests.\n- To test in Node: `yarn test:node`\n- To run all tests, including Node and browser: `yarn test`\n- To watch Node tests: `yarn test:node:watch`\n\n### Setup\n\n1. Git clone the repo\n2. Install dependencies: `yarn`\n3. To make the browser and Node tests work, first run the build: `yarn build`\n4. Run tests: `yarn test`\n5. Do some dev work\n6. Once done, make sure tests are passing, then build the distribution version of the app: `yarn build`.\n\n   `yarn build` compiles using webpack and babel for the Node and web targets. To watch the build, run `yarn build:watch`.\n\n7. Now proceed to the \"Deployment\" stage\n\n\n### Deployment\n\n1. Update the version number in `package.json`.\n2. Git commit: `git commit -m \"some message, eg, version\"`.\n3. 
Release: `git tag -a v0.12.0 -m \"some message\"`.\n4. Push: `git push origin master --tags`\n5. Publish to NPM: `npm publish`\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ffrictionlessdata%2Ffrictionless-js","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Ffrictionlessdata%2Ffrictionless-js","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ffrictionlessdata%2Ffrictionless-js/lists"}