Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/tomnomnom/fff
The Fairly Fast Fetcher. Requests a bunch of URLs provided on stdin fairly quickly.
- Host: GitHub
- URL: https://github.com/tomnomnom/fff
- Owner: tomnomnom
- Created: 2020-05-26T23:49:09.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2024-04-10T16:02:48.000Z (8 months ago)
- Last Synced: 2024-11-16T20:07:12.003Z (26 days ago)
- Language: Go
- Size: 7.81 KB
- Stars: 384
- Watchers: 9
- Forks: 60
- Open Issues: 7
Metadata Files:
- Readme: README.mkd
Awesome Lists containing this project
- WebHackersWeapons - fff
- awesome-hacking-lists - tomnomnom/fff - The Fairly Fast Fetcher. Requests a bunch of URLs provided on stdin fairly quickly. (Go)
README
# fff
The Fairly Fast Fetcher. Requests a bunch of URLs provided on stdin fairly quickly.
The main idea is to launch a new request every `n` milliseconds, without waiting
for the last request to finish first. This makes for consistently fast fetching,
but can be hard on system resources (e.g. you might run out of file descriptors).
The advantage, though, is that hitting a bunch of very slow URLs, or URLs that
result in timeouts, doesn't slow the overall progress very much.
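The pacing loop is simple to sketch. Here is a minimal Go illustration of the idea (not fff's actual source; the 100 ms delay and the output format are assumptions for the sketch): read URLs from stdin and fire each request in its own goroutine on a fixed interval, so slow responses never block the launch loop.
```
package main

import (
	"bufio"
	"fmt"
	"net/http"
	"os"
	"sync"
	"time"
)

func main() {
	// Assumed launch interval; fff exposes this via -d/--delay.
	delay := 100 * time.Millisecond

	var wg sync.WaitGroup
	sc := bufio.NewScanner(os.Stdin)

	for sc.Scan() {
		url := sc.Text()
		wg.Add(1)

		// Launch the request without waiting for earlier ones to
		// finish; a slow or timing-out URL only ties up its own
		// goroutine (and a file descriptor).
		go func(u string) {
			defer wg.Done()
			resp, err := http.Get(u)
			if err != nil {
				fmt.Fprintln(os.Stderr, err)
				return
			}
			resp.Body.Close()
			fmt.Printf("%s %d\n", u, resp.StatusCode)
		}(url)

		// Pace the launches: one new request per interval.
		time.Sleep(delay)
	}

	wg.Wait()
}
```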
## Install

```
▶ go get -u github.com/tomnomnom/fff
```
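Note that on Go 1.17 and later, `go get` no longer installs binaries; the equivalent command there is:
```
▶ go install github.com/tomnomnom/fff@latest
```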
## Usage

Basic usage:
```
▶ cat urls.txt | fff
```

Options:
```
▶ fff --help
Request URLs provided on stdin fairly frickin' fast

Options:
  -b, --body         Request body
  -d, --delay        Delay between issuing requests (ms)
  -H, --header       Add a header to the request (can be specified multiple times)
  -k, --keep-alive   Use HTTP Keep-Alive
  -m, --method       HTTP method to use (default: GET, or POST if body is specified)
  -o, --output       Directory to save responses in (will be created)
  -s, --save-status  Save responses with given status code (can be specified multiple times)
  -S, --save         Save all responses
  -x, --proxy        Use the provided HTTP proxy
```
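For example, a run that spaces requests 100 ms apart and saves only responses with a 200 status into a directory might look like this (the flag values and directory name are illustrative):
```
▶ cat urls.txt | fff -d 100 -s 200 -o out
```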
## Tuning

You might want to increase your open file descriptor limit before doing anything crazy:

```
▶ ulimit -n 16384
```

## TODO
* Create an index file in the output directory