# ta11y

> Modern web accessibility audits. 💪

[![NPM](https://img.shields.io/npm/v/@ta11y/ta11y.svg)](https://www.npmjs.com/package/@ta11y/ta11y) [![Build Status](https://travis-ci.com/saasify-sh/ta11y.svg?branch=master)](https://travis-ci.com/saasify-sh/ta11y) [![JavaScript Style Guide](https://img.shields.io/badge/code_style-standard-brightgreen.svg)](https://standardjs.com)

## Features

- **Accessibility as a service**
  - Audit your websites with a range of test suites including WCAG 2.0/2.1 A, AA, AAA, Section 508, HTML validation, as well as our own best practices.
- **Flexible and automated**
  - Run manual tests during development and then integrate into any CI pipeline. Supports generating reports in XLS, XLSX, CSV, JSON, HTML, and more.
- **Runs in any environment**
  - Easy integration that supports localhost, firewalls, custom auth, as well as any public production environment.
- **Modern dynamic websites**
  - Ta11y treats all websites as dynamic with full JavaScript support, so you'll test pages as your users actually experience them.
- **Free to try**
  - Simple to get started for free, then [sign up](/pricing) once you're ready to remove rate limits. Have a non-profit use case? [Get in touch](mailto:[email protected]).
- **Private & secure**
  - Ta11y is built using serverless functions and never stores any of your data or audit results.

## Usage

This project is broken down into the following packages:

- [@ta11y/ta11y](./packages/ta11y) - Main CLI for running web accessibility audits with ta11y.
- [@ta11y/core](./packages/ta11y-core) - Core library for programmatically running web accessibility audits with ta11y.
- [@ta11y/extract](./packages/ta11y-extract) - Library to crawl and extract content from websites.
- [@ta11y/reporter](./packages/ta11y-reporter) - Library to convert audit results to different formats.

## CLI

The easiest way to get started is via the CLI.

```bash
npm install -g @ta11y/ta11y
```

```bash
Usage: ta11y [options]

Options:
  -V, --version         output the version number
  -o, --output          Output the results to the given file (format determined
                        by file type). Supports xls, xlsx, csv, json, html,
                        txt, etc.
  -r, --remote          Run all content extraction remotely (website must be
                        publicly accessible). Default is to run content
                        extraction locally. (default: false)
  -e, --extract-only    Only run content extraction and disable auditing.
                        (default: false)
  -s, --suites          Optional comma-separated array of test suites to run.
                        (section508, wcag2a, wcag2aa, wcag2aaa, best-practice,
                        html). Defaults to running all audit suites.
  -c, --crawl           Enable crawling additional pages. (default: false)
  -d, --max-depth       Maximum crawl depth. (default: 16)
  -v, --max-visit       Maximum number of pages to visit while crawling.
  -S, --no-same-origin  By default, we only crawl links with the same origin
                        as the root. Disables this behavior so we crawl links
                        with any origin.
  -b, --blacklist       Optional comma-separated array of URL glob patterns to
                        ignore.
  -w, --whitelist       Optional comma-separated array of URL glob patterns to
                        include.
  -u, --user-agent      Optional user-agent override.
  -e, --emulate-device  Optionally emulate a specific device type.
  -H, --no-headless     Disables headless mode for puppeteer. Useful for
                        debugging.
  -P, --no-progress     Disables progress logging.
  --api-key             Optional API key.
  --api-base-url        Optional API base URL.
  -h, --help            output usage information
```

### Notes

**The CLI defaults to running all crawling and content extraction locally via a headless Puppeteer instance**.

You can disable this and run everything remotely by passing the `--remote` option, though it's not recommended.

See [@ta11y/core](https://github.com/saasify-sh/ta11y/tree/master/packages/ta11y-core) for more detailed descriptions of how the different configuration options affect auditing behavior.
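As an illustration of the default same-origin behavior described above, here's a minimal sketch (not ta11y's actual implementation) of the kind of filter a crawler applies to discovered links, with the origin check disabled when `--no-same-origin` is passed:

```javascript
// Sketch of the default same-origin crawl filter (illustrative only;
// not ta11y's actual implementation). A link is followed only when its
// origin matches the root URL's origin, unless sameOrigin is disabled
// (the CLI's --no-same-origin flag).
function shouldCrawl (rootUrl, linkUrl, { sameOrigin = true } = {}) {
  if (!sameOrigin) return true
  return new URL(rootUrl).origin === new URL(linkUrl).origin
}
```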

### API Key

The free tier is subject to rate limits as well as a 60 second timeout, so if you're crawling a larger site, you're better off running content extraction locally.

If you're processing a non-publicly accessible website (like `localhost`), then you _must_ perform content extraction locally.

You can bypass rate limiting by [signing up](https://ta11y.saasify.sh/pricing) for an API key and passing it either via the `--api-key` flag or via the `TA11Y_API_KEY` environment variable.

Visit [ta11y](https://ta11y.saasify.sh) once you're ready to sign up for an API key.

### Output

The output format is determined by the file type if given a filename via `-o` or `--output`. If no file is given, the CLI defaults to logging the results in JSON format to `stdout`.

Ta11y supports a large number of [output formats](https://github.com/saasify-sh/ta11y/tree/master/packages/ta11y-reporter#formats) including:
- **xls**
- **xlsx**
- **csv**
- **json**
- **html**
- **txt**

Here are some example audit results so you can get a feel for the data:
- [example.com](http://example.com/) single page audit: [csv](https://github.com/saasify-sh/ta11y/blob/master/media/example.csv), [json](https://github.com/saasify-sh/ta11y/blob/master/media/example.json), [xls](https://github.com/saasify-sh/ta11y/blob/master/media/example.xls?raw=true), [xlsx](https://github.com/saasify-sh/ta11y/blob/master/media/example.xlsx?raw=true)
- [Wikipedia](http://en.wikipedia.org) small crawl (`--max-visit 16`): [csv](https://github.com/saasify-sh/ta11y/blob/master/media/wikipedia.csv), [json](https://github.com/saasify-sh/ta11y/blob/master/media/wikipedia.json), [xls](https://github.com/saasify-sh/ta11y/blob/master/media/wikipedia.xls?raw=true), [xlsx](https://github.com/saasify-sh/ta11y/blob/master/media/wikipedia.xlsx?raw=true)
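If you consume the JSON output in a script or CI step, the top-level `summary` object makes a simple pass/fail check straightforward. A minimal sketch (the `auditPassed` helper is hypothetical, but `summary.errors` follows the example audit results linked above):

```javascript
// Hypothetical helper: decide pass/fail from ta11y's JSON output.
// The summary.errors field follows the example output in this README.
function auditPassed (report) {
  const { errors = 0 } = (report && report.summary) || {}
  return errors === 0
}

// e.g. after `ta11y https://example.com -o audit.json`:
// auditPassed(require('./audit.json'))
```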

### Examples

#### Basic single page audit

This example runs all available audit test suites on the given URL.

It uses the default output behavior which logs the results in JSON format to `stdout`.

```bash
ta11y https://example.com
```

```json
{
  "summary": {
    "errors": 4,
    "warnings": 0,
    "infos": 2,
    "numPages": 1,
    "numPagesPass": 0,
    "numPagesFail": 1
  },
  "results": {
    "https://example.com": {
      "url": "https://example.com",
      "depth": 0,
      "rules": [
        {
          "id": "html",
          "description": "A document must not include both a “meta” element with an “http-equiv” attribute whose value is “content-type”, and a “meta” element with a “charset” attribute.",
          "context": "f-8\">\n \n <",
          "type": "error",
          "tags": [
            "html"
          ],
          "firstColumn": 5,
          "lastLine": 5,
          "lastColumn": 71
        },
        ...
      ],
      "summary": {
        "errors": 4,
        "warnings": 0,
        "infos": 2,
        "pass": false
      }
    }
  }
}
```

If you only want specific audit results, use the `--suites` option.

#### Basic single page audit writing results to an Excel file

This example runs all available audit test suites on the given URL and outputs the results to an Excel spreadsheet (supports any `xls`, `xlsx`, or `csv` file).

```bash
ta11y https://example.com -o audit.xls
```

#### Single page audit testing WCAG2A and WCAG2AA, writing results to a CSV file

This example runs the wcag2a and wcag2aa audit test suites on the given URL and outputs the results to a comma-separated-value file (`csv`).

```bash
ta11y https://example.com --suites wcag2a,wcag2aa -o audit.csv
```

#### Basic single page content extraction

```bash
ta11y https://example.com --extract-only
```

```json
{
  "results": {
    "https://example.com": {
      "url": "https://example.com",
      "depth": 0,
      "content": "\n Example Domain\n\n \n \n \n \n body {\n background-color: #f0f0f2;\n margin: 0;\n padding: 0;\n font-family: -apple-system, system-ui, BlinkMacSystemFont, \"Segoe UI\", \"Open Sans\", \"Helvetica Neue\", Helvetica, Arial, sans-serif;\n \n }\n div {\n width: 600px;\n margin: 5em auto;\n padding: 2em;\n background-color: #fdfdff;\n border-radius: 0.5em;\n box-shadow: 2px 3px 7px 2px rgba(0,0,0,0.02);\n }\n a:link, a:visited {\n color: #38488f;\n text-decoration: none;\n }\n @media (max-width: 700px) {\n div {\n margin: 0 auto;\n width: auto;\n }\n }\n \n\n\n\n\nExample Domain\n\nThis domain is for use in illustrative examples in documents. You may use this\n domain in literature without prior coordination or asking for permission.\n\nMore information...\n\n\n"
    }
  },
  "summary": {
    "root": "https://example.com",
    "visited": 1,
    "success": 1,
    "error": 0
  }
}
```

#### Crawl part of a site and audit each page

```bash
ta11y https://en.wikipedia.org --crawl --max-depth 1 --max-visit 8
```

This example will crawl and extract the target site locally and then perform a full remote audit of the results. You can use the `--remote` flag to force the whole process to operate remotely.

#### Crawl a localhost site and audit each page

```bash
ta11y http://localhost:3000 --crawl
```

This example will crawl all pages of a local site and then perform an audit of the results.

Note that the local site does not have to be publicly accessible as content extraction happens locally.

#### Run a WCAG2AA audit on a localhost site

```bash
ta11y http://localhost:3000 --crawl --suites wcag2aa
```

This example will crawl all pages of a local site and then perform an audit of the results, **only considering the WCAG2AA test suite**.

Note that the local site does not have to be publicly accessible as content extraction happens locally.

#### Single page audit using WCAG2A and HTML validation test suites

```bash
ta11y https://example.com --suites wcag2a,html
```

```json
{
  "summary": {
    "suites": [
      "wcag2a",
      "html"
    ],
    "errors": 2,
    "warnings": 0,
    "infos": 2,
    "numPages": 1,
    "numPagesPass": 0,
    "numPagesFail": 1
  },
  "results": {
    "https://example.com": {
      "url": "https://example.com",
      "depth": 0,
      "rules": [
        {
          "id": "html",
          "description": "A document must not include both a “meta” element with an “http-equiv” attribute whose value is “content-type”, and a “meta” element with a “charset” attribute.",
          "context": "f-8\">\n \n <",
          "type": "error",
          "tags": [
            "html"
          ],
          "firstColumn": 5,
          "lastLine": 5,
          "lastColumn": 71
        },
        {
          "id": "html",
          "description": "The “type” attribute for the “style” element is not needed and should be omitted.",
          "context": "e=1\">\n \n b",
          "type": "info",
          "tags": [
            "html"
          ],
          "firstColumn": 5,
          "lastLine": 7,
          "lastColumn": 27
        },
        {
          "id": "html",
          "description": "Consider adding a “lang” attribute to the “html” start tag to declare the language of this document.",
          "context": "TYPE html><html><head>",
          "type": "info",
          "tags": [
            "html"
          ],
          "firstColumn": 16,
          "lastLine": 1,
          "lastColumn": 21
        },
        {
          "id": "html-has-lang",
          "type": "error",
          "description": "Ensures every HTML document has a lang attribute",
          "impact": "serious",
          "tags": [
            "cat.language",
            "wcag2a",
            "wcag311"
          ],
          "help": "<html> element must have a lang attribute",
          "helpUrl": "https://dequeuniversity.com/rules/ta11y/3.4/html-has-lang?application=Ta11y%20API"
        }
      ],
      "summary": {
        "errors": 2,
        "warnings": 0,
        "infos": 2,
        "pass": false
      }
    }
  }
}
```
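The per-page `rules` array shown above is easy to post-process. As a sketch, a small hypothetical helper that groups rule IDs by severity (`error`, `warning`, `info`), following the result shape from the example output:

```javascript
// Hypothetical helper: group a page's rule violations by severity,
// using the per-page result shape from the example output above.
function groupRulesByType (pageResult) {
  return (pageResult.rules || []).reduce((acc, rule) => {
    acc[rule.type] = acc[rule.type] || []
    acc[rule.type].push(rule.id)
    return acc
  }, {})
}
```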

#### More advanced crawling with debug output

This example crawls the English Wikipedia site, visits up to 200 pages, and uses a whitelist to ensure that we only consider links on the English Wikipedia domain.

It then runs an audit against the `wcag2a` and `wcag2aa` test suites.

This example also shows how to get additional debug output during crawling and auditing, which can be really helpful for understanding what's going on under the hood.

```bash
DEBUG=ta11y:* ta11y "https://en.wikipedia.org" --crawl --max-visit 200 --whitelist "https://en.wikipedia.org/**/*" --suites wcag2a,wcag2aa -o wikipedia.xlsx
```

---

<p align="center">
<a href="https://ta11y.saasify.sh" title="ta11y">
<img src="https://storage.googleapis.com/saasify-uploads-prod/c5480c7c4e006629b4a2f7bfc5b783e2fce662ec.jpeg" alt="ta11y Logo" />
</a>
<span>Help us with our goal of building a more accessible and inclusive web. ☺️</span>
</p>

## License

MIT © [Saasify](https://saasify.sh)

Support my OSS work by <a href="https://twitter.com/transitive_bs">following me on twitter <img src="https://storage.googleapis.com/saasify-assets/twitter-logo.svg" alt="twitter" height="24px" align="center"></a>