# Proxy Scraper and Checker

![Stable](https://img.shields.io/badge/status-stable-brightgreen)

![Discord](https://dcbadge.limes.pink/api/shield/741265873779818566?compact=true)

## Script Description

This script downloads and verifies HTTP(S) and SOCKS5 proxies from public databases and files. It offers the following key features:

- **Configurable Threading**: Adjust the number of threads to your system's capability with a `usage_level` setting from 1 to 3.
- **Scraping Proxies**: Automatically scrape HTTP(S) and SOCKS5 proxies from various online sources.
- **Checking Proxies**: Validate the scraped proxies to ensure they are operational.
- **System Monitoring**: Display the script's CPU and RAM usage in the console title for real-time performance monitoring.

### Usage

1. **Installation**:
   - Clone the repository or download the .zip file.
   - Navigate to the project directory.

2. **Running the Script**:
   - Execute the script using:
     ```bash
     start.bat
     ```
     or
     ```bash
     python main.py
     ```

3. **Configuration**:
   - The script uses a `config.json` file to manage settings.
   - Adjust the `usage_level` and specify the lists of URLs for HTTP(S) and SOCKS5 proxies.

4. **Educational & Research Purposes Only**:
   - This script is intended for educational and research purposes only. Use it responsibly and in accordance with applicable laws.

### Requirements

- Python 3.8+
- All necessary packages are installed automatically when the script is run.

### Example `config.json`

```json
{
    "usage_level": 2,
    "http_links": [
        "https://api.proxyscrape.com/?request=getproxies&proxytype=https&timeout=10000&country=all&ssl=all&anonymity=all",
        "https://api.proxyscrape.com/v2/?request=getproxies&protocol=http&timeout=10000&country=all&ssl=all&anonymity=all"
    ],
    "socks5_links": [
        "https://raw.githubusercontent.com/B4RC0DE-TM/proxy-list/main/SOCKS5.txt",
        "https://raw.githubusercontent.com/saschazesiger/Free-Proxies/master/proxies/socks5.txt"
    ]
}
```

By following this documentation, you should be able to set up, run, and understand the Proxy Scraper and Checker script with ease.

## Important Information!

For educational & research purposes only!

## Detailed Documentation

### Functions

#### `generate_random_folder_name(length=32)`

Generates a random folder name of the specified length.

#### `remove_old_folders(base_folder=".")`

Removes old folders with 32-character names from the base folder.

#### `get_time_rn()`

Returns the current time formatted as HH:MM:SS.

#### `get_usage_level_str(level)`

Converts the usage-level integer to its string representation.

#### `update_title(http_selected, socks5_selected, usage_level)`

Updates the console title with the current CPU usage, RAM usage, and validation counts.

#### `center_text(text, width)`

Centers the text within the given width.

#### `ui()`

Clears the console and displays the main UI with ASCII art.

#### `scrape_proxy_links(link, proxy_type)`

Scrapes proxies from the given link, retrying up to 3 times on failure.

#### `check_proxy_link(link)`

Checks whether a proxy link is accessible.

#### `clean_proxy_links()`

Cleans the proxy links by removing inaccessible ones.

#### `scrape_proxies(proxy_list, proxy_type, file_name)`

Scrapes proxies from the provided list of links and saves them to a file.

#### `check_proxy_http(proxy)`

Checks the validity of an HTTP(S) proxy by making a request to httpbin.org.

#### `check_proxy_socks5(proxy)`

Checks the validity of a SOCKS5 proxy by connecting to google.com.

#### `check_http_proxies(proxies)`

Checks a list of HTTP(S) proxies for validity.

#### `check_socks5_proxies(proxies)`

Checks a list of SOCKS5 proxies for validity.

#### `signal_handler(sig, frame)`

Handles the SIGINT signal (Ctrl+C) to exit gracefully.

#### `set_process_priority()`

Sets the process priority to high for better performance.

#### `loading_animation()`

Displays a loading animation while proxy links are being verified.

#### `clear_console()`

Clears the console screen.

#### `continuously_update_title()`

Continuously updates the console title with the current status.
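
#### Example: resolving `usage_level` to a thread count

A minimal sketch of how the `usage_level` setting from `config.json` could drive the number of worker threads. The mapping values and function names here are illustrative assumptions, not the script's actual implementation:

```python
import json

# Illustrative usage_level (1-3) -> worker-thread mapping;
# the real script may use different values.
THREADS_BY_LEVEL = {1: 50, 2: 150, 3: 300}

def load_config(path="config.json"):
    """Read the JSON config file and return it as a dict."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def threads_for_level(level):
    """Resolve a usage_level to a thread count, falling back to level 2."""
    return THREADS_BY_LEVEL.get(level, THREADS_BY_LEVEL[2])
```

An unknown or missing level falls back to the middle setting rather than raising, so a malformed config degrades gracefully.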
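
#### Example: cleaning scraped proxy lists

Scraped sources mix valid `host:port` lines with junk and duplicates. A sketch of the kind of filtering a scraper performs before saving results to a file (the helper name and regex are assumptions, not taken from the script):

```python
import re

# Matches "host:port" lines such as "1.2.3.4:8080" (IPv4 or hostname).
_PROXY_RE = re.compile(r"^[\w.\-]+:\d{1,5}$")

def clean_proxy_lines(text):
    """Return unique, well-formed host:port entries from scraped text,
    preserving first-seen order."""
    seen = []
    for line in text.splitlines():
        line = line.strip()
        if _PROXY_RE.match(line) and line not in seen:
            seen.append(line)
    return seen
```

For the large lists this script handles, a `set` membership check would be faster than `line not in seen`; the list form is kept here only to preserve ordering simply.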
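
#### Example: checking an HTTP(S) proxy against httpbin.org

A rough, standard-library sketch of the idea behind `check_proxy_http`: route a request to httpbin.org through the candidate proxy and treat any error or non-200 response as a dead proxy. The timeout value and the use of `urllib` are assumptions; the actual script's requests may differ:

```python
import urllib.request

def check_proxy_http(proxy, timeout=5):
    """Return True if host:port `proxy` can fetch httpbin.org/ip in time."""
    handler = urllib.request.ProxyHandler(
        {"http": f"http://{proxy}", "https": f"http://{proxy}"})
    opener = urllib.request.build_opener(handler)
    try:
        with opener.open("http://httpbin.org/ip", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # URLError, timeouts, refused connections, etc.
        return False
```

Because each check is dominated by network latency, running many of these concurrently (e.g. with `concurrent.futures.ThreadPoolExecutor`) is what makes the configurable thread count worthwhile.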