Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/gromnitsky/badlinks
A Minimalistic Link Checker
- Host: GitHub
- URL: https://github.com/gromnitsky/badlinks
- Owner: gromnitsky
- Created: 2024-08-04T19:36:19.000Z (4 months ago)
- Default Branch: master
- Last Pushed: 2024-08-05T16:49:28.000Z (4 months ago)
- Last Synced: 2024-08-05T19:43:11.478Z (4 months ago)
- Language: Shell
- Size: 18.6 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
An attempt to write a fast link checker in under 150 LOC.
~~~
$ wc -l lib.bash url* badlinks
21 lib.bash
28 urlcheck
19 urlextract.rb
56 urlextract-recursive
25 badlinks
149 total
~~~

Actual LOC:
~~~
$ scc | grep ^[BR] | awk '{s+=$6} END {print s}'
119
~~~

## Usage
Run `bundle` to install the required gems. Then
~~~
$ ./badlinks -e -l 3 http://example.com/foo/bar/
~~~
`-e` means check external links too (those whose origin != `example.com`),
and `-l` sets the maximum recursion level. The timeout for each connection is 3 seconds; use `-t sec` to change it.
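For example, a run that skips external links but allows a more generous timeout could look like this (a hypothetical invocation, using only the flags described above):

~~~
# check internal links only (no -e), recurse 2 levels deep,
# and give each connection 10 seconds before giving up
$ ./badlinks -l 2 -t 10 http://example.com/foo/bar/
~~~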
To obtain a full list of URLs without checks:
~~~
$ LEVEL=3 ./urlextract-recursive http://example.com/foo/bar
~~~
Then process it with awk+urlcheck (this is what the *badlinks* wrapper does).
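As a rough sketch (not necessarily the exact commands the wrapper runs), the pipeline could be assembled by hand along these lines, assuming `urlcheck` reads URLs from stdin; the awk step here only de-duplicates the extracted URLs:

~~~
# hypothetical manual version of what the badlinks wrapper automates:
# extract URLs up to 3 levels deep, drop duplicates, then check them
$ LEVEL=3 ./urlextract-recursive http://example.com/foo/bar \
    | awk '!seen[$0]++' \
    | ./urlcheck
~~~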
## ♲ Loicense

MIT