Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Site Link Analyzer - command line broken links checker
https://github.com/dannyben/sla
- Host: GitHub
- URL: https://github.com/dannyben/sla
- Owner: DannyBen
- License: MIT
- Created: 2016-07-14T19:34:22.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2024-02-10T16:29:14.000Z (11 months ago)
- Last Synced: 2024-10-06T05:21:20.533Z (3 months ago)
- Topics: gem, link-analysis, ruby
- Language: Ruby
- Homepage:
- Size: 146 KB
- Stars: 2
- Watchers: 3
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
README
# Site Link Analyzer
[![Gem Version](https://badge.fury.io/rb/sla.svg)](https://badge.fury.io/rb/sla)
[![Build Status](https://github.com/DannyBen/sla/workflows/Test/badge.svg)](https://github.com/DannyBen/sla/actions?query=workflow%3ATest)
[![Maintainability](https://api.codeclimate.com/v1/badges/f78192aead8a74535a24/maintainability)](https://codeclimate.com/github/DannyBen/sla/maintainability)

---

SLA is a simple broken links checker with built-in caching.
![SLA Demo](demo/cast.svg "SLA Demo")
## Install
### Install using Ruby
```shell
$ gem install sla
```

### Install using Docker
```shell
alias sla='docker run --rm -it --network host -v /tmp/sla_cache:/app/cache dannyben/sla'
```

The `--network host` flag is only necessary if you intend to check links on `localhost`.
## Features
- Easy to use command line interface.
- Built-in caching, to avoid overtaxing the server.
- Shows broken links and can save them to a log file.
- Exits with a non-zero code on failure, for CI integration.

## Usage
```
$ sla --help
Site Link Analyzer

Usage:
  sla URL [options]
  sla --help | -h | --version

Options:
  --verbose, -v
    Show detailed output

  --simple, -s
    Show simple output of errors only

  --depth, -d DEPTH
    Set crawling depth [default: 5]

  --external, -x
    Also check external links

  --ignore, -i URLS
    Specify a list of space delimited patterns to skip
    URLs that contain any of the strings in this list will be skipped

  --cache, -c LIFE
    Set cache life [default: 1d]. LIFE can be in any of the following formats:
    10  = 10 seconds
    20s = 20 seconds
    10m = 10 minutes
    10h = 10 hours
    10d = 10 days

  --cache-dir DIR
    Set the cache directory

  -h --help
    Show this help

  --version
    Show version number

Parameters:
  URL
    URL to scan

Environment Variables:
  SLA_SLEEP
    Set number of seconds to sleep between calls (for debugging purposes)

Examples:
  sla example.com
  sla example.com -c10m -d10
  sla example.com --cache-dir my_cache
  sla example.com --depth 10
  sla example.com --cache 30d --external
  sla example.com --simple > out.log
  sla example.com --ignore "/admin /customer/login"
```
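The cache `LIFE` format shown above is an integer with an optional `s`/`m`/`h`/`d` suffix. As a rough sketch of how such a value maps to seconds (the `life_to_seconds` helper below is hypothetical and not part of sla):

```shell
# Sketch of the documented cache LIFE format: an integer with an
# optional s/m/h/d suffix, converted to seconds.
# life_to_seconds is a hypothetical helper, not part of sla itself.
life_to_seconds() {
  local v=$1 n unit
  case $v in
    *s) n=${v%s}; unit=1 ;;
    *m) n=${v%m}; unit=60 ;;
    *h) n=${v%h}; unit=3600 ;;
    *d) n=${v%d}; unit=86400 ;;
    *)  n=$v;    unit=1 ;;
  esac
  echo $((n * unit))
}

life_to_seconds 20s   # 20
life_to_seconds 10m   # 600
life_to_seconds 1d    # 86400
```

So `--cache 30d`, as in the examples above, keeps cached pages for 30 days before refetching.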
## Contributing / Support
If you experience any issue, have a question or a suggestion, or if you wish
to contribute, feel free to [open an issue][issues].

---
[issues]: https://github.com/DannyBen/sla/issues