{"id":29354558,"url":"https://github.com/googlechromelabs/autowebperf","last_synced_at":"2025-07-09T03:15:05.438Z","repository":{"id":42442741,"uuid":"266994556","full_name":"GoogleChromeLabs/AutoWebPerf","owner":"GoogleChromeLabs","description":"AutoWebPerf provides a flexible and scalable framework for running web performance audits with arbitrary audit tools including PageSpeedInsights, WebPageTest and more.","archived":false,"fork":false,"pushed_at":"2025-03-31T10:23:27.000Z","size":2252,"stargazers_count":355,"open_issues_count":10,"forks_count":32,"subscribers_count":14,"default_branch":"stable","last_synced_at":"2025-03-31T11:26:51.515Z","etag":null,"topics":["chrome-ux-report","crux","lighthouse","pagespeed-insights","pagespeed-insights-api","performance","performance-metrics","web","webperformance"],"latest_commit_sha":null,"homepage":"","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/GoogleChromeLabs.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":"AUTHORS","dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-05-26T09:02:04.000Z","updated_at":"2025-03-31T10:22:42.000Z","dependencies_parsed_at":"2024-03-18T10:27:13.158Z","dependency_job_id":"4daa07c2-558d-4354-8c44-793a7fe24810","html_url":"https://github.com/GoogleChromeLabs/AutoWebPerf","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/GoogleChromeLabs/AutoWebPerf","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/GoogleChromeLabs%2FAutoWebPerf","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/GoogleChromeLabs%2FAutoWebPerf/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/GoogleChromeLabs%2FAutoWebPerf/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/GoogleChromeLabs%2FAutoWebPerf/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/GoogleChromeLabs","download_url":"https://codeload.github.com/GoogleChromeLabs/AutoWebPerf/tar.gz/refs/heads/stable","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/GoogleChromeLabs%2FAutoWebPerf/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":264384458,"owners_count":23599619,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["chrome-ux-report","crux","lighthouse","pagespeed-insights","pagespeed-insights-api","performance","performance-metrics","web","webperformance"],"created_at":"2025-07-09T03:15:04.799Z","updated_at":"2025-07-09T03:15:05.430Z","avatar_url":"https://github.com/GoogleChromeLabs.png","language":"JavaScript","readme":"# AutoWebPerf (AWP)\n\n\u003cp align=\"left\"\u003e\n  \u003cimg 
src=\"https://i.imgur.com/f87A9xi.png\" width=\"200\" alt=\"quicklink\"\u003e\n\u003c/p\u003e\n\n\n\u003e AutoWebPerf provides a flexible and scalable framework for running web\nperformance audits with arbitrary audit tools like WebPageTest and\nPageSpeedInsights. This library enables developers to collect metrics\nconsistently and store metrics to a preferred data store such as local JSON\nfiles, Google Sheets, BigQuery, or an in-house SQL database.\n\nCheck out https://web.dev/autowebperf for introduction.\n\n## How it works\n\nAutoWebPerf takes a list of **Tests** from an arbitrary data store platform,\nsuch as local JSONs, Google Sheets, BigQuery, or a self-hosted SQL database.\nWith the list of Tests, it executes audits based on each Test config, collects\nmetrics from individual data sources into a list of **Results**.\n\nThe process of running an audit through an measurement tool (e.g. WebPageTest)\nis defined in the individual **Gatherer**. The logic of reading and writing\nwith a data platform (e.g. local JSON) is implemented in a **Connector**.\n\n### Feature highlights\n\n- A library of web audit automation that can be plugged-in to any platforms,\nlike Google Sheets, GCP App Engine, or simply a cron job that writes to JSON\nfile.\n- Providing the ability to run recurring tests with customizable frequency\n(e.g. daily, weekly, monthly, etc), network conditions, and other audit configs,\netc.\n- Metric gatherers are designed as modules that are decoupled with the output\ndata format and automation logic.\n- Connector modules are designed to read Test list and write audit results to\nspecific data format or platforms. e.g. a connector for CSV files.\n(See ```src/connectors/csv-connector``` for details)\n\n### How does this compare to the rest of Google's speed measurement tools?\n\nAutoWebPerf serves as a performance audit aggregator that automates the process\nof performance audit and metrics collection through multiple speed measurement\ntools including WebPageTest, PageSpeedInsights, and Chrome UX Report. As each\nindividual speed measurement tool provides audit metrics, AutoWebPerf aggregates\nthe results and writes to any preferred data storage platform, such as local\nJSONs, cloud-based database, or GoogleSheets.\n\n## Quickstart\n\nFirst, clone AWP repo and run npm install:\n```\ngit clone https://github.com/GoogleChromeLabs/AutoWebPerf.git\nnpm install\n```\n\nOnce finished, check the install by running a single test with the following command:\n```\n./awp run examples/tests.json output/results.json\n```\nThis command uses the example file in ```examples/tests.json``` and returns the results to ```output/results.json```.\n\nTo start recurring tests, you'll need to include a `recurring.frequency` property in the test file and set the next trigger in the test file. To setup the next trigger time and to run a one-off test, use this command after adding the `recurring.frequency` property to your tests:\n```\n./awp recurring examples/tests-recurring.json output/results.json\n```\nIf this was successful, the trigger time will have updated base on your chosen frequency, and a result would have been written to `output/results.json`.\n\nOnce the trigger time is correctly set, you can have your tests auto-run on the next triger time with the `continue` command:\n```\n./awp continue examples/tests-recurring.json output/results.json\n```\nThis will automatically run each test at the frequency specified. 
### More Examples

**Single URL:** To test a single URL through PageSpeedInsights:
```
./awp run url:https://www.thinkwithgoogle.com/ json:output/results.json
```

**Pick Gatherer:** To test a single URL with a specific gatherer like PageSpeedInsights or WebPageTest:
```
./awp run --gatherers=psi url:https://web.dev json:output/results.json
```

**CSV file:** To run tests defined in a CSV file and write results to a JSON file:
```
./awp run csv:examples/tests.csv json:output/results.json
```

**PageSpeedInsights API:** To run PageSpeedInsights tests with an [API Key](https://developers.google.com/speed/docs/insights/v5/get-started):
```
PSI_APIKEY=SAMPLE_KEY ./awp run examples/tests.json output/results.json
```

**WebPageTest API:** To run WebPageTest tests:
```
WPT_APIKEY=SAMPLE_KEY ./awp run examples/tests-wpt.json output/results.json
```

**Override vs. Append:** To run tests and override existing results in the output file:
```
./awp run examples/tests.json output/results.json --override-results
```

### Available gatherers:

- WebPageTest - See [docs/webpagetest.md](docs/webpagetest.md) for details.
- PageSpeed Insights - See [docs/psi.md](docs/psi.md) for details.
- Chrome UX Report API - See [docs/cruxapi.md](docs/cruxapi.md) for details.
- Chrome UX Report BigQuery - See [docs/cruxbigquery.md](docs/cruxbigquery.md) for details.

### Available connectors:

- JSON connector - reads or writes to local JSON files.
This is the default connector if a connector name is not specified. For example:
```
./awp run examples/tests.json output/results.json
```

Alternatively, to explicitly use the JSON connector for the `Tests` path and the `Results` path:
```
./awp run json:examples/tests.json json:output/results.json
```

- CSV connector - reads or writes to local CSV files.
To use the CSV connector for the `Tests` path and the `Results` path:
```
./awp run csv:examples/tests.csv csv:output/results.csv
```

- URL connector - generates just one `Test` with a specific URL for audit.
To run an audit with just one `Test` for a specific URL:
```
./awp run url:https://example.com csv:output/results.csv
```

Please note that this connector only works for the `Tests` path, not the `Results` path.

- Google Sheets connector -
see [docs/sheets-connector.md](docs/sheets-connector.md) for detailed guidance.

## Using AWP with Node CLI

### Run tests

You can run the following at any time to print CLI usage:

```
./awp --help
```

To run tests, run the following CLI command with a Tests JSON file, like
`examples/tests.json`, which contains an array of tests. Check out
`examples/tests.json` for the data structure of Tests.

```
./awp run examples/tests.json output/results.json
```

This will generate the result object(s) at the given path, `output/results.json`.

By default, AWP uses JSON as the connector for both reading tests
and writing results. Alternatively, you can specify a different connector in the
format of `<connector>:<path>`.

E.g. to run tests defined in CSV and write results in JSON:
```
./awp run csv:examples/tests.csv json:output/results.json
```
### Retrieve test results

For some audit platforms like WebPageTest, each test may take a few minutes to
fetch actual results. For these types of *asynchronous* audits, each Result will
stay in "Submitted" status. You will need to explicitly retrieve results later.

Run the following to retrieve the final metrics of the results in
`results.json`:

```
./awp retrieve examples/tests.json output/results.json
```

This will fetch metrics for all audit platforms and update the Result objects
in `output/results.json`. You can check out `examples/results.json` for
details of Result objects.

### Run recurring tests

If you'd like to set up recurring tests, you can define the `recurring` object
that contains `frequency` for that Test.

```
./awp recurring examples/tests-recurring.json output/results.json
```

This will generate the Result object in the `results.json` and update the next
trigger time in the original Test object in the `tests.json`. E.g. the updated
Test object would look like the following, with the updated `nextTriggerTimestamp`:

```
{
  "label": "web.dev",
  "url": "https://web.dev",
  "recurring": {
    "frequency": "Daily",
    "nextTriggerTimestamp": 1599692305567,
    "activatedFrequency": "Daily"
  },
  "psi": {
    "settings": {
      "locale": "en-GB",
      "strategy": "mobile"
    }
  }
}
```

The `nextTriggerTimestamp` will be updated to the next day based on the previous
timestamp. This prevents repeated runs of the same Test and guarantees
that this Test is executed only once per day.
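For a `Daily` Test, the scheduling arithmetic amounts to adding one day to the previous timestamp, roughly like the sketch below (simplified illustration only; the actual frequency definitions live in `src/common/frequency.js`):

```
// Simplified illustration of the nextTriggerTimestamp update for 'Daily'.
const DAY_MS = 24 * 60 * 60 * 1000;
let nextTriggerTimestamp = 1599692305567;     // from the example above
nextTriggerTimestamp += DAY_MS;               // advance by one day
console.log(new Date(nextTriggerTimestamp));  // next eligible run time
```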
### Set up a cron job to run recurring tests

In most Unix-like operating systems, you can set up a cron job to run the AWP CLI
periodically.

For example, in macOS, you can run the following commands to set up a daily cron
job with AWP:

```
# Edit the crontab with a text editor.
EDITOR=nano crontab -e
```

Add the following line to the crontab for a daily run at 12:00 noon. Note
that this is based on the system time of the machine that runs AWP. (The `cd`
comes first so the environment variable applies to the `./awp` command itself.)

```
0 12 * * * cd ~/workspace/awp && PSI_APIKEY=SAMPLE_KEY ./awp run examples/tests.json csv:output/results-recurring.csv
```

### Run tests with extensions

An extension is a module that assists AWP in running tests with additional processing and
computation. For example, the `budgets` extension is able to add performance budgets
and compute the delta between the targets and the result metrics.

To run with extensions:
```
./awp run examples/tests.json output/results.json --extensions=budgets
```

## Tests and Results

### Define the Tests

The list of tests is simply an array of Test objects, like the sample Tests
below. Or check out `src/examples/tests.js` for a detailed example of a Tests
list.

```
[{
  "label": "Test-1",
  "url": "example1.com",
  "webpagetest": {
    ...
  }
}, {
  "label": "Test-2",
  "url": "example2.com",
  "psi": {
    ...
  }
}]
```

Each `Test` object defines which audits to run through its gatherer properties.
For example, the first `Test` has a `webpagetest` property which defines the
configuration for running a WebPageTest audit. The second `Test` has a `psi`
property that defines how to run a PageSpeedInsights audit.

### Generate the Results

After running tests, a list of `Results` is generated like below. Each `Result`
contains the metrics corresponding to its predefined gatherers, such as
WebPageTest and PageSpeedInsights. See the example below.

```
[{
  "label": "Test-1",
  "url": "example1.com",
  "webpagetest": {
    "metrics": {
      "FirstContentfulPaint": 900,
      ...
    }
  }
}, {
  "label": "Test-2",
  "url": "example2.com",
  "psi": {
    "metrics": {
      "FirstContentfulPaint": 900,
      ...
    }
  }
}]
```

### Environment Variables

Some connectors or gatherers may require one or more environment variables, such as API keys or
the path to a service account. For example, running with the CrUX API gatherer requires the CrUX API key.

To pass environment variables in the CLI, run the command with the regular usage of environment vars:
```
CRUX_APIKEY=<YOUR_KEY> ./awp run url:https://web.dev/ json:output/results.json
```
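Under the hood these are ordinary Node environment variables, so a module can read them via `process.env`. A general Node illustration (not AWP-specific code):

```
// General Node illustration: environment variables arrive on process.env.
const apiKey = process.env.CRUX_APIKEY;
if (!apiKey) {
  throw new Error('Missing CRUX_APIKEY environment variable.');
}
```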
## Gatherers

AWP supports the following audit gatherers. Please check out the corresponding
documentation for details.

#### WebPageTest

The WebPageTest gatherer runs Tests through either the public WebPageTest
endpoints or a custom private WebPageTest instance.

See [docs/webpagetest.md](docs/webpagetest.md) for more details on the usage
of the WebPageTest gatherer.

#### PageSpeed Insights

The PageSpeed Insights gatherer runs Tests through the public
[PageSpeed Insights API](https://developers.google.com/speed/docs/insights/v5/get-started).

See [docs/psi.md](docs/psi.md) for more details on the usage of the PSI gatherer.

#### Chrome UX Report API (CrUX API)

The CrUX API gatherer collects performance metrics through the [Chrome UX Report API](https://developers.google.com/web/tools/chrome-user-experience-report/api/guides/getting-started).

See [docs/cruxapi.md](docs/cruxapi.md) for more details on the usage of the CrUX API gatherer.

#### Chrome UX Report History (CrUX via BigQuery)

The CrUX BigQuery gatherer collects performance metrics through the [Chrome UX Report](https://developers.google.com/web/tools/chrome-user-experience-report) with its [public Google BigQuery project](https://bigquery.cloud.google.com/dataset/chrome-ux-report:all).
Please note that you will need to set up a Google Cloud project in order to query the public BigQuery table.

See [docs/cruxbigquery.md](docs/cruxbigquery.md) for more details on the usage of the CrUX BigQuery gatherer.

## Design

AWP is designed around modules, including modules for running audits
with WebPageTest, PageSpeedInsights, or other tools, and modules for
reading/writing data from data platforms such as JSON, Google Sheets or
a Cloud service.

At a high level, there are three types of modules:
- **Gatherer** - A Gatherer runs audits and generates metrics.
- **Connector** - A Connector reads test configs from and writes results to a data
platform, such as a local JSON file or Google Sheets.
- **Extension** - An Extension adds additional metrics or information, either
before or after running audits.

The AWP engine uses two major JavaScript object structures for running audits and collecting metrics:

- **Test** - An object that contains the audit configuration for one test task,
such as URL, audit methods, or extension config. You can refer to
`examples/tests.json` for an actual Test object.
- **Result** - An object that contains audit configuration, metrics and overall
status. You can refer to `examples/results.json` for an actual Result object.

### Audit steps

In order to deal with asynchronous audit tools like WebPageTest, AWP breaks the
audit cycle into three steps of actions:

- **Run** - This action takes a list of `Tests` and generates a list of `Results`
objects.
- **Recurring** - Similar to **Run**, this action takes a list of `Tests`,
generates a list of `Results`, and updates the `nextTriggerTimestamp` for each
recurring `Test`. This action is useful when running with periodic or timer-based tasks such as a cron job.
- **Retrieve** - This action scans the list of Results and collects metrics
for results that are not yet in `Retrieved` status.

### AWP Config

To set up modules and their configurations, an overall AWP Config is required
as a JavaScript object.

The AWP Config has the following required properties:
- `connector`: The name of the connector.
- `helper`: A helper for a specific connector, including the API handler and other
helper functions, which will be used in gatherers and extensions.
- `dataSources`: An array of audit sources, such as `webpagetest` or `psi`. Each
data source needs a corresponding Gatherer file in the
`src/gatherers` folder.
- `extensions`: An array of extensions. Each extension needs a
corresponding Extension file in `src/extensions`.

Other optional properties:
- `verbose`: Whether to print verbose messages.
- `debug`: Whether to print debug messages.

The following config example comes from `examples/awp-config.js`:

```
{
  connector: 'JSON',
  helper: 'Node',
  dataSources: ['webpagetest'],
  json: { // Config for JSON Connector.
    tests: 'tests.json',
    results: 'results.json',
  },
  extensions: [
    'budgets',
  ],
  budgets: { // Config for Budgets extension.
    dataSource: 'webpagetest',
  },
  verbose: true,
  debug: false,
}
```

With the example config above, AWP will use the `JSON` connector, which reads and
writes Tests and Results as JSON files. See `examples/tests.json` and
`examples/results.json` for examples.

In addition to the fundamental properties, there are a few additional properties
used by modules:
- the `json` property as the configuration for **JSONConnector**.
- the `budgets` property as the configuration for **BudgetsExtension**.
### Usage of AutoWebPerf core

Example of creating a new instance of AWP:
```
let awp = new AutoWebPerf({
  connector: 'JSON',
  helper: 'Node',
  dataSources: ['webpagetest'],
  extensions: extensions,
  json: { // Config for JSON connector.
    tests: argv['tests'],
    results: argv['results'],
  },
  verbose: verbose,
  debug: debug,
});
```
To submit all tests:
```
awp.run();
```

To submit specific tests using filters (this will run the test that has the
`id=1` and `selected=true` properties):
```
awp.run({
  filters: ['id="1"', 'selected'],
});
```

To retrieve all pending results, filtering with status !== "Retrieved":
```
awp.retrieve({
  filters: ['status!=="Retrieved"'],
});
```
For more advanced usage of PatternFilter, please check out
`src/utils/pattern-filter.js` for more examples.

To run recurring tests:
```
// This will run the actual audit and update the nextTriggerTimestamp.
awp.recurring();
```

To run tests with specific extensions:
```
// This will override the extension list defined in the awpConfig.
awp.run({
  extensions: ['budgets']
})
```

### Gatherer Modules

A Gatherer class extends `src/gatherers/gatherer.js` and overrides the
following methods:

- `constructor(config, apiHelper, options)`:
  - `config`: The config defined in the property with this gatherer's name in the
  AWP config. Some audit tools like WebPageTest or PageSpeedInsights require API keys. The API key for the gatherer is located in `config.apiKey`.
  - `apiHelper`: The helper used for making API calls.
  - `options`: Additional settings like `verbose` and `debug`.

- `run(test, options)`:
  - `test`: A `Test` object for this audit run. The data required for this
  gatherer (e.g. settings or metadata) will be in the property with the gatherer's
  name. E.g. the data for WebPageTest will be in `webpagetest` of this
  `Test` object.
  - `options`: Additional settings.

- `retrieve(result, options)`:
  - `result`: A `Result` object to retrieve metrics with. The data required for
  this gatherer will be in the property with the gatherer's name. E.g. the data
  and metrics will be in `webpagetest` of this `Result` object.
  - `options`: Additional settings like `verbose` and `debug`.
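To make the contract concrete, here is a minimal, hypothetical Gatherer sketch. Only the three method signatures above come from AWP; the class name, the `fake` property, the require path, and the return shapes are illustrative assumptions:

```
const Gatherer = require('./src/gatherers/gatherer');

class FakeGatherer extends Gatherer {
  constructor(config, apiHelper, options) {
    super();
    this.apiKey = config.apiKey;    // API key, if this tool requires one.
    this.apiHelper = apiHelper;     // Used for making API calls.
    this.debug = options && options.debug;
  }

  run(test, options) {
    // Read this gatherer's settings from the Test property that matches
    // the gatherer's name, then submit the audit. (Return shape assumed.)
    const settings = (test.fake || {}).settings;
    return {
      status: 'Submitted',
      metadata: {settings: settings},
    };
  }

  retrieve(result, options) {
    // Fetch the pending metrics of an asynchronous audit.
    // (Return shape assumed; standardized metric names are listed below.)
    return {
      status: 'Retrieved',
      metrics: {FirstContentfulPaint: 900},
    };
  }
}
```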
For example, JSONConnector uses the\n `tests.json` and reads additional settings from the `config` property,\nincluding API keys for each gatherers.\n\n- `getTestList(options)`: The method to return the list of `Tests` as an array.\n- `updateTestList(newTests, options)`: The method to update `Tests` list, given\n the list of new `Tests`.\n- `getResultList(options)`: The method to return the list of `Results` as an\narray.\n- `appendResultList(newResults, options)`: The method to append new `Results` to\n the end of the current `Results` list.\n- `updateResultList(newResults, options)`: The method to update existing\n `Results` in the current `Results` list.\n\n### Extension Modules\n\nA Extension class extends `src/extensions/extension.js` and overrides the\nfollowing methods:\n\n- `constructor(config)`:\n  - `config`: The config defined in a property with this extension's\nname in the AWP config.\n- `beforeRun(context)`: The method before executing **Run** step for a `Test`.\n  - `context.test`: The corresponding `Test` object.\n- `afterRun(context)`: The method after executing **Run** step for a `Test`.\n  - `context.test`: The corresponding `Test` object.\n  - `context.result`: The corresponding `Result` object.\n- `beforeAllRuns(context)`: The method before executing **Run** step.\n  - `context.tests`: All `Test` objects in this **Run**.\n- `afterAllRuns(context)`: The method after executing **Run** step.\n  - `context.tests`: All `Test` objects in this **Run**.\n  - `context.results`: All `Result` objects in this **Run**.\n- `beforeRetrieve(context)`: The method before executing **Retrieve** step for a `Result`.\n  - `context.result`: The corresponding `Result` object.\n- `afterRetrieve(context)`: The method after executing **Retrieve** step for a `Result`.\n  - `context.result`: The corresponding `Result` object.\n- `beforeAllRetrieves(context)`: The method before executing **Retrieve** step.\n  - `context.result`: The corresponding `Result` object.\n- `afterAllRetrieves(context)`: The method after executing **Retrieve** step.\n  - `context.result`: The corresponding `Result` object.\n\n### Test Object\n\nA standard `Test` object contains the following properties:\n\n(You can refer to `examples/tests.json` for an example.)\n\n- `selected` \u003cboolean\u003e: Whether to perform **Run** for this `Test`.\n- `label` \u003cstring\u003e: Name of this `Test`.\n- `url` \u003cstring\u003e: URL to audit.\n- `recurring`: Settings for recurring audit.\n  - `frequency` \u003cstring\u003e: The frequency string defined in\n   `src/common/frequency.js`. E.g. 'Daily', 'Weekly' or 'Monthly'.\n\nGatherer-specific settings will be in their own property with the Gatherer's\nname in lower case. For example, the settings for *WebPageTests* will be:\n\n- `webpagetest`\n  - `settings`: Setting object contains audit location, connection, etc.\n  - `metadata`: Metadata object contains WebPageTests's ID, JSON URL, etc.\n\n### Result Object\n\nA standard `Result` object contains the following properties:\n\n- `selected` \u003cboolean\u003e: Whether to perform **Retrieve** for this `Result`.\n- `id` \u003cstring\u003e: Auto-generated unique ID for this `Result`.\n- `type` \u003cstring\u003e: `Single` or `Recurring` audit.\n- `status` \u003cstring\u003e: `Submitted`, `Retrieved` or `Error`.\nRefer to `src/common/status.js` for details.\n- `label` \u003cstring\u003e: String label for this `Result`. 
This label inherits from its\noriginal `Test` object.\n- `url` \u003cstring\u003e: Audited URL.\n- `createdTimestamp` \u003cstring\u003e: When this `Result` is created.\n- `modifiedTimestamp` \u003cstring\u003e: When this `Result` is last modified.\n\n### Standardized Metrics\n\nAll metric names used in AWP are required to follow the names, case\nsensitive. See the full list of standardized metrics in `src/common/metrics.js`\n\n- **Timing metrics**\n  - `TimeToFirstByte`\n  - `FirstPaint`\n  - `FirstMeaningfulPaint`\n  - `FirstContentfulPaint`\n  - `VisualComplete`\n  - `SpeedIndex`\n  - `DOMContentLoaded`\n  - `LoadEvent`\n  - `TimeToInteractive`\n  - `TotalBlockingTime`\n  - `FirstCPUIdle`\n  - `FirstInputDelay`\n  - `LargestContentfulPaint`\n\n- **Resource Size**\n  - `HTML`\n  - `Javascript`\n  - `CSS`\n  - `Fonts`\n  - `Images`\n  - `Videos`\n\n- **Resource Count**\n  - `DOMElements`\n  - `Connections`\n  - `Requests`\n\n- **Resource Scores**\n  - `Performance`\n  - `ProgressiveWebApp`\n\n## Source Folder Structure\n\nAll source codes for major functions are located in `src` folder. Files are\norganized into the following subfolders:\n\n- `common`: Common classes and definitions, such as Status, Frequency, Metrics, etc.\n- `connectors`: Connector classes.\n- `extensions`: Extension classes.\n- `gatherers`: Gatherer classes.\n- `utils`: Utilities and tools.\n\n## Unit Test\n\nRun the following commands to run unit test:\n\n```\nnpm test\n```\n\nTo run individual test spec, you can install Jest NPM module to your local\nmachine:\n\n```\nnpm install -g jest\njest test/some-module.test.js\n```\n\n### Unit Test Design\n\nThe Unit Test is based on [Jest](https://jestjs.io/) unit test framework. All\nunit tests are located in the `./test` folder, and are organized into its own\ncorresponding subfolders, as the same structure as in the `src` folder.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgooglechromelabs%2Fautowebperf","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgooglechromelabs%2Fautowebperf","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgooglechromelabs%2Fautowebperf/lists"}