Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/p0dalirius/robotsvalidator
A Python script to check if URLs are allowed or disallowed by a robots.txt file.
allow bugbounty bypass check disallow robots-txt web
- Host: GitHub
- URL: https://github.com/p0dalirius/robotsvalidator
- Owner: p0dalirius
- Created: 2021-11-29T15:22:09.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2022-06-26T15:29:44.000Z (over 2 years ago)
- Last Synced: 2024-12-18T18:45:38.710Z (about 1 month ago)
- Topics: allow, bugbounty, bypass, check, disallow, robots-txt, web
- Language: Python
- Homepage: https://podalirius.net/
- Size: 187 KB
- Stars: 21
- Watchers: 2
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
Awesome Lists containing this project
README
# RobotsValidator
The robotsvalidator script allows you to check if URLs are allowed or disallowed by a robots.txt file.
![](./.github/example.png)
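The same allow/disallow check can be sketched with Python's standard library (`urllib.robotparser`). This is only an illustration of the underlying robots.txt logic, not the robotsvalidator implementation itself:

```python
# Minimal sketch of a robots.txt allow/disallow check using only the
# standard library. Not the robotsvalidator code; just the core idea.
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, url: str, user_agent: str = "*") -> bool:
    """Return True if user_agent is allowed to fetch url under these rules."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

robots = """User-agent: *
Disallow: /admin/
Allow: /public/
"""
print(is_allowed(robots, "https://example.com/admin/panel"))   # False
print(is_allowed(robots, "https://example.com/public/page"))   # True
```

Note that `RobotFileParser` can also load rules directly from a URL via `set_url()` and `read()`, mirroring the script's local-file and URL modes.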
## Features
- [x] Getting robots.txt file from local file
- [x] Getting robots.txt file from a URL
- [x] Verbose mode, showing all the rules with their results

## Verbose mode
There is a verbose mode using the `--debug` option, which prints every rule with its result:
![](./.github/verbose.png)
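As a rough idea of what a rule-by-rule check involves, here is a hypothetical sketch that prints each `Allow`/`Disallow` rule with its match result. The actual `--debug` output of robotsvalidator may differ; this sketch assumes a single `User-agent: *` group and plain prefix matching (no `*` wildcards):

```python
# Hypothetical verbose checker: print every rule and whether it matches
# the URL path, then return the verdict of the first matching rule.
# Assumes one "User-agent: *" group and prefix-only matching.
from urllib.parse import urlparse

def debug_check(robots_txt: str, url: str) -> bool:
    path = urlparse(url).path or "/"
    verdict = None  # robots.txt default: allowed when nothing matches
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()       # drop comments
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field in ("allow", "disallow") and value:
            matched = path.startswith(value)
            print(f"[{field:>8}] {value!r}: {'match' if matched else 'no match'}")
            if matched and verdict is None:       # first matching rule wins
                verdict = (field == "allow")
    return verdict if verdict is not None else True

robots = """User-agent: *
Disallow: /private/
Allow: /public/
"""
print(debug_check(robots, "https://example.com/private/data"))  # False
```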
## Contributing
Pull requests are welcome. Feel free to open an issue if you want to add other features.