{"id":13517088,"url":"https://github.com/liip/TheA11yMachine","last_synced_at":"2025-03-31T07:31:08.688Z","repository":{"id":43174324,"uuid":"48109508","full_name":"liip/TheA11yMachine","owner":"liip","description":"The A11y Machine is an automated accessibility testing tool which crawls and tests pages of any web application to produce detailed reports.","archived":true,"fork":false,"pushed_at":"2019-12-17T18:46:24.000Z","size":47849,"stargazers_count":625,"open_issues_count":35,"forks_count":67,"subscribers_count":73,"default_branch":"master","last_synced_at":"2025-03-22T20:32:23.291Z","etag":null,"topics":["accessibility","crawl","test","wcag"],"latest_commit_sha":null,"homepage":"https://www.liip.ch/","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/liip.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2015-12-16T12:39:33.000Z","updated_at":"2025-03-07T11:10:04.000Z","dependencies_parsed_at":"2022-09-13T21:40:31.340Z","dependency_job_id":null,"html_url":"https://github.com/liip/TheA11yMachine","commit_stats":null,"previous_names":[],"tags_count":26,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/liip%2FTheA11yMachine","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/liip%2FTheA11yMachine/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/liip%2FTheA11yMachine/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/liip%2FTheA11yMachine/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/liip","download_url":"https:
//codeload.github.com/liip/TheA11yMachine/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246433100,"owners_count":20776529,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["accessibility","crawl","test","wcag"],"created_at":"2024-08-01T05:01:29.575Z","updated_at":"2025-03-31T07:31:03.678Z","avatar_url":"https://github.com/liip.png","language":"JavaScript","readme":"\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://liip.ch/\"\u003e\u003cimg src=\"https://github.com/liip.png\" alt=\"Liip\" width=\"150px\" /\u003e\u003c/a\u003e\u003cbr /\u003e\n  \u003cem\u003epresents\u003c/em\u003e\u003cbr /\u003e\n  The Accessibility Testing Machine\n\u003c/p\u003e\n\n\u003chr /\u003e\n\n# The A11y Machine\n\n[![Version](https://img.shields.io/npm/v/the-a11y-machine.svg)](https://github.com/liip/TheA11yMachine)\n[![Downloads](https://img.shields.io/npm/dt/the-a11y-machine.svg)](https://www.npmjs.com/package/the-a11y-machine)\n[![License](https://img.shields.io/npm/l/the-a11y-machine.svg)](#authors-and-license)\n\n\u003e Note: TheA11yMachine is no longer maintained.\n\n**The A11y Machine** (or `a11ym` for short, spelled “alym”) is an **automated\naccessibility testing tool** which **crawls** and **tests** pages of any Web\napplication to produce detailed reports. 
It validates pages against the\nfollowing specifications/laws:\n\n  * [W3C Web Content Accessibility Guidelines](http://www.w3.org/TR/WCAG20/)\n    (WCAG) 2.0, including A, AA and AAA levels ([understanding levels of\n    conformance](http://www.w3.org/TR/UNDERSTANDING-WCAG20/conformance.html#uc-levels-head)),\n  * U.S. [Section 508](http://www.section508.gov/) legislation,\n  * [W3C HTML5 Recommendation](https://www.w3.org/TR/html/).\n\n## Table of contents\n\n* [Why?](#why)\n* [Installation](#installation)\n* [Usage](#usage)\n  * [List of URLs instead of crawling](#list-of-urls-instead-of-crawling)\n  * [Possible output](#possible-output)\n  * [How does it work?](#how-does-it-work)\n  * [Write your custom rules](#write-your-custom-rules)\n  * [Watch the dashboard](#watch-the-dashboard)\n* [Roadmap and board](#roadmap-and-board)\n* [Authors and license](#authors-and-license)\n\n## Why?\n\nIf **privacy** matters to you, you're likely to choose The A11y Machine over\nany SaaS service: It runs locally, so you don't need to send your code\nanywhere, and you can test all parts of your application, including the ones which\nrequire authentication (like a checkout, a back-office, etc.)…\n\nHere are some pros and cons compared to SaaS solutions:\n\nProperties | The A11y Machine | SaaS services\n-----------|------------------|---------------\nCan run locally              | yes | no\nCan test each patch          | yes | no (except if deployed)\nReduces the test loop        | yes | no (the loop is longer)\nCan test private code        | yes | no (you must send your code)\nCan test auth-required parts | yes | no\nCan crawl all your pages     | yes | yes (but it can be pricey)\n\nAccessibility is not only a concern for disabled people. Bots can be considered\nas such, like [DuckDuckGo](https://duckduckgo.com),\n[Google](https://google.com/) or [Bing](https://bing.com/). By respecting these\nstandards, you're likely to have a better ranking. It also helps to clean your\ncode. 
Accessibility issues are often left unaddressed for budget reasons. In\nfact, most of the cost is spent looking for errors on your website. The A11y\nMachine greatly helps with this task, so you can focus on fixing your code and\nreap the benefits.\n\n## Installation\n\n[NPM](http://npmjs.org/) is required. Then, execute the following line:\n\n```sh\n$ npm install -g the-a11y-machine\n```\n\nIf you would like to validate your pages against the HTML5 recommendation, then\nyou need to [install Java](https://www.java.com/en/download/).\n\nAs an alternative, you can run a Docker image instead; building it first ensures\nthe image is available locally:\n\n```sh\n$ docker build -t liip/the-a11y-machine .\n$ docker run liip/the-a11y-machine --help\n```\n\nTo get access to a report, you will need to:\n\n  * Mount a path into the container,\n  * Specify that internal path in your `a11ym` CLI options.\n\nFor example:\n\n```sh\n$ docker run -v $PWD:/var/output liip/the-a11y-machine -o /var/output http://example.org\n```\n\n## Usage\n\nAs a prelude, see the help:\n\n```sh\n  Usage: a11ym [options] url …\n\n  Options:\n\n    -h, --help                                 output usage information\n    -b, --bootstrap \u003cpath\u003e                     Bootstrap file, i.e. the configuration file. All CLI options will overwrite options defined in the configuration file.\n    -e, --error-level \u003cerror_level\u003e            Minimum error level: In ascending order, `notice` (default), `warning`, and `error` (e.g. `warning` includes all warnings and errors).\n    -c, --filter-by-codes \u003ccodes\u003e              Filter results by comma-separated WCAG codes (e.g. `H25,H91,G18`).\n    -C, --exclude-by-codes \u003ccodes\u003e             Exclude results by comma-separated WCAG codes (e.g. 
`H25,H91,G18`).\n    -d, --maximum-depth \u003cdepth\u003e                Explore up to a maximum depth (hops).\n    -m, --maximum-urls \u003cmaximum_urls\u003e          Maximum number of URLs to compute.\n    -o, --output-directory \u003coutput_directory\u003e  Output directory.\n    -r, --report \u003creport\u003e                      Report format: `cli`, `csv`, `html` (default), `json` or `markdown`.\n    -s, --standards \u003cstandards\u003e                Standard to use: `WCAG2A`, `WCAG2AA` (default), `WCAG2AAA`, `Section508`, `HTML` or your own (see `--sniffers`). `HTML` can be combined with any other by a comma.\n    -S, --sniffers \u003csniffers\u003e                  Path to the sniffers file, e.g. `resource/sniffers.js` (default).\n    -u, --filter-by-urls \u003curls\u003e                Filter URLs to test by using a regular expression without delimiters (e.g. 'news|contact').\n    -U, --exclude-by-urls \u003curls\u003e               Exclude URLs to test by using a regular expression without delimiters (e.g. 'news|contact').\n    -w, --workers \u003cworkers\u003e                    Number of workers, i.e. number of URLs computed in parallel.\n    --http-auth-user \u003chttp_auth_user\u003e          Username to authenticate all HTTP requests.\n    --http-auth-password \u003chttp_auth_password\u003e  Password to authenticate all HTTP requests.\n    --http-tls-disable                         Disable TLS/SSL when crawling or downloading pages.\n    -V, --no-verbose                           Make the program silent.\n    --ignore-robots-txt                        Ignore robots.txt file.\n\n```\n\nThus, the simplest use is to run `a11ym` with a URL:\n\n```sh\n$ ./a11ym http://example.org/\n```\n\nAll URLs accessible from `http://example.org/` will be tested against the\nWCAG2AA standard. 
See the `--maximum-urls` option to reduce the number of\nURLs to test.\n\nThen open `a11ym_output/index.html` and browse the result!\n\n### List of URLs instead of crawling\n\nYou can test several URLs by adding them to the command line, like this:\n\n```sh\n$ ./a11ym http://example.org/A http://example.org/B http://example.org/C\n```\n\nAlternatively, it is possible to read URLs from STDIN, as follows:\n\n```sh\n$ cat URLs.lists | ./a11ym -\n```\n\nNote the `-`: It means “Read URLs from STDIN please”.\n\nWhen reading several URLs, the `--maximum-depth` option will be forced to 1.\n\n### Possible output\n\nThe index of the reports:\n\n![Index of the report](resource/screenshots/index.png)\n\nReport of a specific URL:\n\n![Report of a specific URL](resource/screenshots/report.png)\n\nThe dashboard of all reports:\n\n![Dashboard of all reports](resource/screenshots/dashboard.jpg)\n\n### Selecting standards\n\nAs mentioned, the following standards are supported:\n  * W3C WCAG,\n  * U.S. Section 508 legislation,\n  * W3C HTML5 recommendation.\n\nYou cannot combine standards with each other, except HTML5, which can be\ncombined with any other. So for instance, to run `WCAG2AAA`:\n\n```sh\n$ ./a11ym --standards WCAG2AAA http://example.org/\n```\n\nTo run `WCAG2AA` along with `HTML`:\n\n```sh\n$ ./a11ym --standards WCAG2AA,HTML http://example.org/\n```\n\n### How does it work?\n\nThe pipeline looks like this:\n\n  1. The [`node-simplecrawler`](https://github.com/cgiffard/node-simplecrawler/)\n     tool is used to crawl a Web application based on the given URLs, with **our\n     own specific exploration algorithm** to provide better results quickly, in\n     addition to supporting **parallelism**,\n  2. For each URL found, two kinds of tests are applied:\n      1. 
**Accessibility**: [PhantomJS](http://phantomjs.org/) runs and\n         [`HTML_CodeSniffer`](https://github.com/squizlabs/HTML_CodeSniffer) is\n         injected in order to check the page's conformance; this step is\n         semi-automated with the help of\n         [`pa11y`](https://github.com/nature/pa11y), which is a very thin layer\n         of code wrapping PhantomJS and `HTML_CodeSniffer`,\n      2. **HTML**: [The Nu Html Checker](http://validator.github.io/validator/)\n         (v.Nu) is run on the same URL.\n  3. Finally, results from the different tools are normalized, and enhanced,\n     easy-to-use reports are produced.\n\nPhantomJS and `HTML_CodeSniffer` are widely used, tested, and precise tools.\n`pa11y` simplifies the use of the latter two. The Nu Html Checker is the tool\nused by the W3C to validate documents online. However, in this case, we **do all\nvalidations offline**! Nothing is sent over the network. Again, privacy.\n\n### Write your custom rules\n\n`HTML_CodeSniffer` is built in a way that allows you to extend existing rules or\nwrite your own. A rule is represented as a sniffer (an alternative\nterminology). The `resource/sniffers/` directory contains an example of\na custom sniffer.\n\nThe A11y Machine comes with a default file containing all the sniffers:\n`resource/sniffers.js`. You can provide your own by using the `--sniffers`\noption. To build your own sniffers, simply copy the `resource/sniffers/` directory\nsomewhere as a basis, complete it, and then compile it with the `a11ym-sniffers`\nutility:\n\n```sh\n$ ./a11ym-sniffers --directory my/sniffers/ --output-directory my_sniffers.js\n```\n\nThen, to effectively use it:\n\n```sh\n$ ./a11ym --sniffers my_sniffers.js --standard MyStandard http://example.org/\n```\n\n### Watch the dashboard\n\nRun the `a11ym-dashboard` command to serve the dashboard. The dashboard is an\noverview of several reports generated by the `a11ym` command. 
The command can\nserve the dashboard over HTTP, or generate static files. In addition to\na root directory, it requires an address and a port in the first case, and\nnothing more than a flag in the second. For instance, if the reports are\ngenerated with the following command:\n\n```sh\n$ ./a11ym --output-directory my_reports/`date +%s`/ http://example.org/A http://example.org/B\n```\n\nThen the root directory is `my_reports/`, and thus the dashboard will be started\nover HTTP with the following command:\n\n```sh\n$ ./a11ym-dashboard --root my_reports\n```\n\nBrowse `127.0.0.1:8080` (by default) to see the dashboard!\n\nOr, to generate static files:\n\n```sh\n$ ./a11ym-dashboard --root my_reports --static-output\n```\n\nOpen `my_reports/index.html`, and do the same!\n\nBonus: Use the `--open` option to automatically open the dashboard in your\nfavorite browser.\n\n## Roadmap and board\n\nThe roadmap is public:\n  * See [the incoming\n    milestones](https://github.com/liip/TheA11yMachine/milestones),\n  * See [the in progress\n    issues](https://github.com/liip/TheA11yMachine/labels/in%20progress).\n\nThe board is publicly available at the following URL: https://waffle.io/liip/TheA11yMachine.\n\n## Authors and license\n\nThe original author is [Ivan Enderlin](http://mnt.io/), accompanied by [Gilles\nCrettenand](https://github.com/krtek4) and [David\nJeanmonod](https://github.com/jeanmonod). This software is backed by\n[Liip](https://liip.ch/).\n\n[BSD-3-Clause](http://opensource.org/licenses/BSD-3-Clause):\n\n\u003e Copyright (c), Ivan Enderlin and Liip\n\u003e All rights reserved.\n\u003e\n\u003e Redistribution and use in source and binary forms, with or without modification,\n\u003e are permitted provided that the following conditions are met:\n\u003e\n\u003e 1. Redistributions of source code must retain the above copyright notice, this\n\u003e    list of conditions and the following disclaimer.\n\u003e\n\u003e 2. 
Redistributions in binary form must reproduce the above copyright notice,\n\u003e    this list of conditions and the following disclaimer in the documentation\n\u003e    and/or other materials provided with the distribution.\n\u003e\n\u003e 3. Neither the name of the copyright holder nor the names of its contributors\n\u003e    may be used to endorse or promote products derived from this software without\n\u003e    specific prior written permission.\n\u003e\n\u003e THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\n\u003e ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\n\u003e WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n\u003e DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR\n\u003e ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES\n\u003e (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\n\u003e  LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON\n\u003e ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n\u003e (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\n\u003e SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n","funding_links":[],"categories":["JavaScript"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fliip%2FTheA11yMachine","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fliip%2FTheA11yMachine","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fliip%2FTheA11yMachine/lists"}