# robotScraper

## Description

RobotScraper is an open-source tool written in Python that scrapes and analyzes the `robots.txt` file of a specified domain. It identifies the directories and pages that `robots.txt` allows or disallows, checks the HTTP response code each of those paths returns, and can optionally save the results to a file. It is useful for web security researchers, SEO analysts, and anyone examining a website's structure and access rules.
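The core workflow (fetch `robots.txt`, extract each rule path, then request that path and record the HTTP status code) can be sketched in a few lines of standard-library Python. This is a minimal illustration under stated assumptions, not the project's actual code; the helper names are made up for this sketch:

```python
import urllib.error
import urllib.request

def parse_robots(text):
    """Extract the paths listed in Allow/Disallow rules of a robots.txt body."""
    paths = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        lower = line.lower()
        for rule in ("allow:", "disallow:"):
            if lower.startswith(rule):
                path = line[len(rule):].strip()
                if path:  # skip empty rules like "Disallow:"
                    paths.append(path)
    return paths

def check_paths(domain, paths, timeout=5):
    """Request each path on the domain and record the HTTP status code."""
    results = {}
    for path in paths:
        url = f"https://{domain}{path}"
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                results[url] = resp.status
        except urllib.error.HTTPError as exc:
            results[url] = exc.code  # e.g. 403 or 404
        except urllib.error.URLError:
            results[url] = None  # unreachable
    return results
```

Calling `check_paths("example.com", parse_robots(body))` on the fetched `robots.txt` body would then map each candidate URL to its response code.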

## Requirements

- Python 3.x
- `requests` package
- `beautifulsoup4` package

## Installation

1. Clone the repository:
```sh
git clone https://github.com/robotshell/robotScraper
cd robotScraper
```

2. Install the required Python packages:
```sh
pip install requests beautifulsoup4
```

## Usage

Run RobotScraper with the following syntax:

```sh
python robotScraper.py domain [-s output.txt]
```

Here `domain` is the target host to analyze, and the optional `-s output.txt` saves the results to the given file.

## Disclaimer

This tool is intended for educational and research purposes only. The author and contributors are not responsible for any misuse. Use this tool responsibly and only on systems for which you have explicit permission: unauthorized access to systems, networks, or data is illegal and unethical. Always obtain proper authorization before any activity that could affect other users or systems.