Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/rootviii/proxy_requests
a class that uses scraped proxies to make http GET/POST requests (Python requests)
Last synced: 6 days ago
JSON representation
a class that uses scraped proxies to make http GET/POST requests (Python requests)
- Host: GitHub
- URL: https://github.com/rootviii/proxy_requests
- Owner: rootVIII
- License: mit
- Archived: true
- Created: 2018-08-04T21:23:59.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2020-12-03T15:41:38.000Z (almost 4 years ago)
- Last Synced: 2024-10-28T14:27:08.318Z (11 days ago)
- Topics: http, http-get, http-getter, http-proxy, http-proxy-middleware, proxy, proxy-list, proxy-requests, proxy-server, python, python-requests, python3, recursion, recursion-problem, requests, requests-module, webscraper, webscraper-api, webscraping
- Language: Python
- Homepage:
- Size: 750 KB
- Stars: 388
- Watchers: 19
- Forks: 43
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-network-stuff - **302** stars
README
## Python Proxy Requests | make an http GET/POST with a proxy scraped from https://www.sslproxies.org/
[![Downloads](https://pepy.tech/badge/proxy-requests)](https://pepy.tech/project/proxy-requests)
[![Downloads](https://pepy.tech/badge/proxy-requests/month)](https://pepy.tech/project/proxy-requests)
[![Downloads](https://pepy.tech/badge/proxy-requests/week)](https://pepy.tech/project/proxy-requests)
pypi.org: https://pypi.org/project/proxy-requests/
The ProxyRequests class first scrapes proxies from the web. If a request through one proxy fails, it recursively retries with another proxy until one succeeds.
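The scrape-then-retry pattern can be sketched as follows. This is a minimal illustration of the idea, not the library's internals; `fetch_proxies`, `get_with_retry`, and the example addresses are hypothetical, and the real class scrapes its proxy list from https://www.sslproxies.org/ at runtime.

```python
import random

def fetch_proxies():
    # Hypothetical stand-in for scraping https://www.sslproxies.org/;
    # the real class parses that site's proxy table at runtime.
    return [{'https': 'http://203.0.113.1:8080'},
            {'https': 'http://203.0.113.2:3128'}]

def get_with_retry(url, proxies, attempt):
    """Pop a random proxy from the queue and recurse until one attempt succeeds."""
    if not proxies:
        raise RuntimeError('all proxies exhausted')
    proxy = proxies.pop(random.randrange(len(proxies)))
    try:
        return attempt(url, proxy)
    except OSError:
        # Bad or slow proxy: recurse with the remaining queue.
        return get_with_retry(url, proxies, attempt)
```

In the real library, `attempt` would be a proxied `requests` call; here it is left as a parameter so the retry logic is visible on its own.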
Either copy the code into your project, or install it via pip:
pip install proxy-requests
(or pip3)
from proxy_requests import ProxyRequests
or if you need the Basic Auth subclass as well:
from proxy_requests import ProxyRequests, ProxyRequestsBasicAuth
If the above import statement is used, method calls will be identical to the ones shown below. Pass a fully qualified URL when initializing an instance.
System Requirements: Python 3 and the requests module.
Runs on Linux and Windows (and probably macOS). It may take a moment to run depending on the current proxy.
Each proxied request is set with a 3-second timeout; if a request takes longer, the next proxy socket in the queue is tried.
Proxies are randomly popped from the queue.
The ProxyRequestsBasicAuth subclass has the methods get(), get_with_headers(), post(), post_with_headers(), post_file(), and post_file_with_headers(), which override the parent methods.
GET:
r = ProxyRequests('https://api.ipify.org')
r.get()
GET with headers:
h = {'User-Agent': 'NCSA Mosaic/3.0 (Windows 95)'}
r = ProxyRequests('url here')
r.set_headers(h)
r.get_with_headers()
POST:
r = ProxyRequests('url here')
r.post({'key1': 'value1', 'key2': 'value2'})
POST with headers:
r = ProxyRequests('url here')
r.set_headers({'name': 'rootVIII', 'secret_message': '7Yufs9KIfj33d'})
r.post_with_headers({'key1': 'value1', 'key2': 'value2'})
POST FILE:
r = ProxyRequests('url here')
r.set_file('test.txt')
r.post_file()
POST FILE with headers:
h = {'User-Agent': 'NCSA Mosaic/3.0 (Windows 95)'}
r = ProxyRequests('url here')
r.set_headers(h)
r.set_file('test.txt')
r.post_file_with_headers()
GET with Basic Authentication:
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.get()
GET with headers & Basic Authentication:
h = {'User-Agent': 'NCSA Mosaic/3.0 (Windows 95)'}
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.set_headers(h)
r.get_with_headers()
POST with Basic Authentication:
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.post({'key1': 'value1', 'key2': 'value2'})
POST with headers & Basic Authentication:
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.set_headers({'header_key': 'header_value'})
r.post_with_headers({'key1': 'value1', 'key2': 'value2'})
POST FILE with Basic Authentication:
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.set_file('test.txt')
r.post_file()
POST FILE with headers & Basic Authentication:
h = {'User-Agent': 'NCSA Mosaic/3.0 (Windows 95)'}
r = ProxyRequestsBasicAuth('url here', 'username', 'password')
r.set_headers(h)
r.set_file('test.txt')
r.post_file_with_headers()
Response Methods
Print the response body as a string:
print(r)
Or if you want the raw content as bytes:
r.get_raw()
Get the response as JSON (if valid JSON):
r.get_json()
Get the response headers:
print(r.get_headers())
Get the status code:
print(r.get_status_code())
Get the URL that was requested:
print(r.get_url())
Get the proxy that was used to make the request:
print(r.get_proxy_used())
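The accessors above amount to a thin wrapper that stores the last response and the proxy that produced it. A minimal sketch of that idea, assuming a hypothetical `ResponseWrapper` class (this is not the library's actual code):

```python
import json

class ResponseWrapper:
    """Stores one response's fields and exposes accessors like those above."""
    def __init__(self, url, status, headers, raw, proxy):
        self._url = url
        self._status = status
        self._headers = headers
        self._raw = raw          # body as bytes
        self._proxy = proxy      # proxy used for the request

    def __str__(self):
        # Decode the raw bytes so print(r) shows the body text.
        return self._raw.decode('utf-8', errors='replace')

    def get_raw(self):
        return self._raw

    def get_json(self):
        # Raises ValueError if the body is not valid JSON.
        return json.loads(self._raw)

    def get_headers(self):
        return self._headers

    def get_status_code(self):
        return self._status

    def get_url(self):
        return self._url

    def get_proxy_used(self):
        return self._proxy
```

Each getter simply returns a field captured when the request completed, which is why they are only meaningful after get() or post() has run.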
To write raw data to a file (including an image):
url = 'https://www.restwords.com/static/ICON.png'
r = ProxyRequests(url)
r.get()
with open('out.png', 'wb') as f:
f.write(r.get_raw())
Dump the response to a file as JSON:
import json
with open('test.txt', 'w') as f:
json.dump(r.get_json(), f)
This was developed on Ubuntu 16.04.4/18.04 LTS.
Author: rootVIII 2018-2020