https://github.com/p0dalirius/robotsvalidator
A python script to check if URLs are allowed or disallowed by a robots.txt file.
- Host: GitHub
- URL: https://github.com/p0dalirius/robotsvalidator
- Owner: p0dalirius
- Created: 2021-11-29T15:22:09.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2022-06-26T15:29:44.000Z (almost 3 years ago)
- Last Synced: 2024-12-18T18:45:38.710Z (6 months ago)
- Topics: allow, bugbounty, bypass, check, disallow, robots-txt, web
- Language: Python
- Homepage: https://podalirius.net/
- Size: 187 KB
- Stars: 21
- Watchers: 2
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
README
# RobotsValidator
The robotsvalidator script allows you to check if URLs are allowed or disallowed by a robots.txt file.
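The same kind of check can be reproduced with Python's standard-library `urllib.robotparser`. The sketch below is an illustration of the idea, not the tool's actual code; the URLs and file path in it are placeholders:

```python
# Minimal sketch of an allowed/disallowed check using only the standard
# library. The real robotsvalidator CLI may behave differently; the
# robots.txt URL, file path, and test URLs here are hypothetical.
from urllib.robotparser import RobotFileParser

def load_from_url(robots_url: str) -> RobotFileParser:
    # Fetch and parse a remote robots.txt
    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()
    return parser

def load_from_file(path: str) -> RobotFileParser:
    # Parse a local robots.txt file (the script's local-file mode)
    parser = RobotFileParser()
    with open(path, "r", encoding="utf-8") as f:
        parser.parse(f.read().splitlines())
    return parser

if __name__ == "__main__":
    rp = load_from_url("https://example.com/robots.txt")
    for url in ["https://example.com/", "https://example.com/admin/"]:
        verdict = "allowed" if rp.can_fetch("*", url) else "disallowed"
        print(f"[{verdict}] {url}")
```

Running this against a site's robots.txt prints an allowed/disallowed verdict per URL, which is the same kind of result the script reports.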

## Features
- [x] Getting the robots.txt file from a local file
- [x] Getting the robots.txt file from a URL
- [x] Verbose mode, showing all the rules with their results

## Verbose mode
There is a verbose mode using the `--debug` option, which prints every rule with its result.
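The exact `--debug` output format is not shown here; as a rough illustration, a rule-by-rule evaluation loop could look like the following sketch, which assumes plain prefix rules (real robots.txt matching may also involve `*` and `$` wildcards):

```python
# Illustrative sketch of rule-by-rule evaluation, similar in spirit to
# what a --debug style verbose mode could print. Not the tool's code.
from urllib.parse import urlparse

def debug_check(rules: list[tuple[str, str]], url: str) -> bool:
    path = urlparse(url).path or "/"
    best_len, allowed = -1, True  # no matching rule means allowed
    for directive, pattern in rules:
        matched = path.startswith(pattern)
        print(f"{directive}: {pattern!r} -> {'match' if matched else 'no match'}")
        # Longest matching pattern wins, per common robots.txt semantics
        if matched and len(pattern) > best_len:
            best_len, allowed = len(pattern), (directive == "Allow")
    return allowed

rules = [("Disallow", "/admin/"), ("Allow", "/admin/public/")]
print("allowed" if debug_check(rules, "https://example.com/admin/public/x") else "disallowed")
```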

## Contributing
Pull requests are welcome. Feel free to open an issue if you want to add other features.