# 🔍 Dork Seeker



Simple automated Google dorker script written in Python


❗ This tool is for educational and testing purposes only.


# 🎯 Features

- 🔍 Perform automated Google Dorking queries
- 📄 Save results to a text file
- 📦 Simple CLI interface
- 🛠️ Easy to customize and extend

# 🚀 How to Use This Tool


Before running the script, make sure Python is installed on your system.

To check, run `python3 --version` or `python --version` in a terminal (or cmd on Windows).

🟢 Once Python is ready, install the required Python libraries.

```bash
pip install -r requirements.txt
```
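
Note: if you need to install the search dependency manually, the `search(...)` call shown in the proxy section below matches the `googlesearch-python` package, so installing it directly is a reasonable fallback (this is an assumption based on that call, not something confirmed here):

```bash
pip install googlesearch-python
```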

✅ Usage:

```bash
usage: google_url_dorker.py [-h] -d DORK [-c COUNT] [-w WRITE]
                            [-x]

Tool to extract URLs using Google Dork. python3 google_dorker.py
--dork 'inurl:"/xmlrpc.php?rsd" & ext:php' --count 50

options:
  -h, --help            show this help message and exit
  -d DORK, --dork DORK  Google dork query
  -c COUNT, --count COUNT
                        Number of URLs to fetch (default: 10)
  -w WRITE, --write WRITE
                        Save results to a text file
  -x, --examples        Show usage examples and Contacts
```

🔴 Show Usage Examples and Contacts

```bash
python3 google_url_dorker.py -d 0 --examples
```

🟡 If you run the script without specifying a count, it fetches 10 URLs by default

```bash
python3 google_url_dorker.py -d 'YOUR DORK'
```

🔵 Fetch 50 URLs using your dork query

```bash
python3 google_url_dorker.py -d 'YOUR DORK' -c 50
```

🟣 Fetch 50 URLs and save results to a file named results.txt

```bash
python3 google_url_dorker.py -d 'YOUR DORK' -c 50 --write results.txt
```

# 🖧 Bypass Google Blocking (Using a Proxy) 🔓

If Google blocks your connection while dorking, you can configure the script to use a proxy.
Go to `line 16` of the script and add your HTTPS proxy:

```python
proxy = 'https://your-proxy-address:port'
```
Then go to `line 19` and update the search call to include the proxy:

```python
res = search(query, num_results=count, proxy=proxy, timeout=22)
```
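
The two snippets above come from the script itself. As a self-contained illustration of how they fit together, here is a minimal sketch assuming the `googlesearch-python` package (whose `search()` accepts `num_results`, `proxy`, and `timeout`); the query and proxy values are placeholders:

```python
# Minimal sketch of a proxied dork search (assumes the googlesearch-python package).
from googlesearch import search

proxy = 'https://your-proxy-address:port'    # line 16: your HTTPS proxy
query = 'inurl:"/xmlrpc.php?rsd" & ext:php'  # example dork from the help text
count = 10

# line 19: pass the proxy to the search call
for url in search(query, num_results=count, proxy=proxy, timeout=22):
    print(url)
```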

⚪ 🛠️ Usage Options 📝

| Options | Description |
| ------ | ------ |
| `-h` or `--help` | Show help message and exit |
| `-d` or `--dork` | Specify the Google dork query (required) |
| `-c` or `--count` | Number of URLs to fetch (default: 10) |
| `-w` or `--write` | Save the fetched URLs to a text file |
| `-x` or `--examples` | Show usage examples and contacts |
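
Putting the options together, a typical run that fetches 25 URLs and writes them to a file might look like this (the dork string is only an illustration):

```bash
python3 google_url_dorker.py --dork 'inurl:admin ext:php' --count 25 --write admin_urls.txt
```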

## 📫 Contact

Made with ❤️ by [WhiteeRabbit](https://github.com/WhiteeRabbit)
Feel free to open an issue or contact me for suggestions!

## 📃 License
[LICENSE](https://github.com/WhiteeRabbit/Google_Dorking_Script/blob/main/LICENSE)

🙏🏻 Big thanks to everyone who gave a ⭐️ to this project or helped in any way.


Your support means a lot and keeps this kind of work going!