https://github.com/mpcodewriter21/proxyeater
A Python Proxy Scraper for gathering fresh proxies.
- Host: GitHub
- URL: https://github.com/mpcodewriter21/proxyeater
- Owner: MPCodeWriter21
- License: apache-2.0
- Created: 2022-07-16T09:09:00.000Z (about 3 years ago)
- Default Branch: master
- Last Pushed: 2023-07-12T16:28:37.000Z (about 2 years ago)
- Last Synced: 2025-04-14T17:16:19.928Z (6 months ago)
- Topics: proxy, proxylist, python, python3
- Language: Python
- Homepage:
- Size: 101 KB
- Stars: 8
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
ProxyEater[1.5.3]
===================

A Python Proxy Scraper for gathering fresh proxies.

Install ProxyEater
------------------

To install **ProxyEater**, you can simply use the `pip install ProxyEater` command:
```commandline
python -m pip install ProxyEater
```

Or you can clone [the repository](https://github.com/MPCodeWriter21/ProxyEater) and run:
```commandline
git clone https://github.com/MPCodeWriter21/ProxyEater
cd ProxyEater
```

```commandline
python setup.py install
```

Usage
-----

```
usage: ProxyEater [-h] [--source SOURCE] [--output OUTPUT] [--file-format { text, json, csv }]
                  [--format FORMAT] [--proxy-type PROXY_TYPE] [--include-status]
                  [--threads THREADS] [--timeout TIMEOUT] [--url URL] [--verbose] [--quiet]
                  [--version] [--proxy PROXY] [--useragent USERAGENT] [--include-geolocation]
                  [--no-check] [--source-format { text, json, csv }]
                  [--default-type { http, https, socks4, socks5 }]
                  mode

positional arguments:
  mode                  Modes: Scrape, Check

options:
  -h, --help            show this help message and exit
  --source SOURCE, -s SOURCE
                        The source of the proxies (default: %localappdata%\
                        Python\Python310\lib\site-packages\ProxyEater\sources.json).
  --output OUTPUT, -o OUTPUT
                        The output file.
  --file-format { text, json, csv }, -ff { text, json, csv }
                        The format of the output file (default: text).
  --format FORMAT, -f FORMAT
                        The format for saving the proxies in a text file
                        (default: "{scheme}://{ip}:{port}").
  --proxy-type PROXY_TYPE, -type PROXY_TYPE
                        The type of the proxies (default: all).
  --include-status, -is
                        Include the status of the proxies in the output file.
  --threads THREADS, -t THREADS
                        The number of threads to use for scraping (default: 25).
  --timeout TIMEOUT, -to TIMEOUT
                        The timeout of the requests (default: 15).
  --url URL, -u URL     The URL to use for checking the proxies
                        (default: http://icanhazip.com).
  --verbose, -v         Verbose output (default: False).
  --quiet, -q           Quiet mode (default: False).
  --version, -V         Show the version of the program.

Scrape:
  Scrape mode arguments

  --proxy PROXY, -p PROXY
                        The proxy to use for scraping.
  --useragent USERAGENT, -ua USERAGENT
                        The user agent of the requests (default: random).
  --include-geolocation, -ig
                        Include the geolocation info of the proxies in the output file.
  --no-check, -nc       Use this option to skip the checking of the proxies after scraping.

Check:
  Check mode arguments

  --source-format { text, json, csv }, -sf { text, json, csv }
                        The format of the source file (default: text).
  --default-type { http, https, socks4, socks5 }, -dt { http, https, socks4, socks5 }
                        The default type of the proxies - Use this if you are providing
                        proxies without scheme (default: http).
```
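The `--format` option takes a Python-style template string. A minimal sketch of how such a template expands for a single proxy record, using the default `{scheme}://{ip}:{port}` from the help text above (the dict layout here is an illustrative assumption, not ProxyEater's actual internal structure):

```python
# Illustrative only: expand a --format template for one proxy entry.
# The keys (scheme, ip, port) mirror the placeholders in the default
# template; ProxyEater's real record layout is an assumption here.
template = "{scheme}://{ip}:{port}"
proxy = {"scheme": "socks5", "ip": "127.0.0.1", "port": 1080}
line = template.format(**proxy)
print(line)  # socks5://127.0.0.1:1080
```

Any subset of these placeholders should work the same way, e.g. a template of `{ip}:{port}` would simply omit the scheme from each output line.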
About
-----
Author: CodeWriter21 (Mehrad Pooryoussof)

GitHub: [MPCodeWriter21](https://github.com/MPCodeWriter21)
Telegram Channel: [@CodeWriter21](https://t.me/CodeWriter21)
Aparat Channel: [CodeWriter21](https://www.aparat.com/CodeWriter21)
### License

[apache-2.0](http://www.apache.org/licenses/LICENSE-2.0)
### Donate
In order to support this project, you can donate some crypto of your choice 8D
[Donate Addresses](https://github.com/MPCodeWriter21/ProxyEater/blob/master/DONATE.md)
Or if you can't, give [this project](https://github.com/MPCodeWriter21/ProxyEater) a star on GitHub :)