Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/juancarlospaco/faster-than-requests
Faster requests on Python 3
curl cython download-file faster-than-requests high-performance http-requests ndjson open-data python python-library python-requests python3 requests-toolbelt requests3 scrapy speed urllib urllib3 web-scraper web-scraping
Last synced: 1 day ago
- Host: GitHub
- URL: https://github.com/juancarlospaco/faster-than-requests
- Owner: juancarlospaco
- License: mit
- Created: 2018-11-13T00:23:32.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2024-10-25T15:45:50.000Z (about 2 months ago)
- Last Synced: 2024-10-29T15:23:02.768Z (about 1 month ago)
- Topics: curl, cython, download-file, faster-than-requests, high-performance, http-requests, ndjson, open-data, python, python-library, python-requests, python3, requests-toolbelt, requests3, scrapy, speed, urllib, urllib3, web-scraper, web-scraping
- Language: Nim
- Homepage: https://gist.github.com/juancarlospaco/37da34ed13a609663f55f4466c4dbc3e
- Size: 20.1 MB
- Stars: 1,104
- Watchers: 20
- Forks: 89
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- Funding: .github/FUNDING.yml
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- awesome-rainmana - juancarlospaco/faster-than-requests - Faster requests on Python 3 (Nim)
README
# Faster-than-Requests
[![screenshot](https://source.unsplash.com/eH_ftJYhaTY/800x402 "Please Star this repo on GitHub!")](https://youtu.be/QiKwnlyhKrk?t=5)
![screenshot](https://raw.githubusercontent.com/juancarlospaco/faster-than-requests/master/img/temp.png "Please Star this repo on GitHub!")
![](https://img.shields.io/github/languages/top/juancarlospaco/faster-than-requests?style=for-the-badge)
![](https://img.shields.io/github/stars/juancarlospaco/faster-than-requests?style=for-the-badge "Star faster-than-requests on GitHub!")
![](https://img.shields.io/maintenance/yes/2022?style=for-the-badge)
![](https://img.shields.io/github/languages/code-size/juancarlospaco/faster-than-requests?style=for-the-badge)
![](https://img.shields.io/github/issues-raw/juancarlospaco/faster-than-requests?style=for-the-badge "Bugs")
![](https://img.shields.io/github/issues-pr-raw/juancarlospaco/faster-than-requests?style=for-the-badge "PRs")
![](https://img.shields.io/github/last-commit/juancarlospaco/faster-than-requests?style=for-the-badge "Commits")

| Library | Speed | Files | LOC | Dependencies | Developers | WebSockets | Multi-Threaded Web Scraper Built-in |
|-------------------------------|----------|-------|------|-----------------------|------------|-------------------------------|-------------------------------------|
| PyWGET | `152.39` | 1 | 338 | Wget | >17 | :negative_squared_cross_mark: | :negative_squared_cross_mark: |
| Requests | `15.58` | >20 | 2558 | >=7 | >527 | :negative_squared_cross_mark: | :negative_squared_cross_mark: |
| Requests (cached object) | `5.50` | >20 | 2558 | >=7 | >527 | :negative_squared_cross_mark: | :negative_squared_cross_mark: |
| Urllib | `4.00` | ??? | 1200 | 0 (std lib) | ??? | :negative_squared_cross_mark: | :negative_squared_cross_mark: |
| Urllib3 | `3.55` | >40 | 5242 | 0 (No SSL), >=5 (SSL) | >188 | :negative_squared_cross_mark: | :negative_squared_cross_mark: |
| PyCurl | `0.75` | >15 | 5932 | Curl, LibCurl | >50 | :negative_squared_cross_mark: | :negative_squared_cross_mark: |
| PyCurl (no SSL) | `0.68` | >15 | 5932 | Curl, LibCurl | >50 | :negative_squared_cross_mark: | :negative_squared_cross_mark: |
| Faster_than_requests          | `0.40`   | 1     | 999  | 0                     | 1          | :heavy_check_mark:            | :heavy_check_mark: 7, [One-Liner](https://github.com/juancarlospaco/faster-than-requests/blob/master/examples/multithread_web_scraper.py#L2) |

- Lines Of Code counted using [CLOC](https://github.com/AlDanial/cloc).
- Direct dependencies of the package when ready to run.
- Benchmarks run on Docker from Dockerfile on this repo.
- Developers counted from the Contributors list of Git.
- Speed is the real (wall-clock) time to complete 10,000 local HTTP requests.
- Stats as of year 2020.
- x86_64 64Bit AMD, SSD, Arch Linux.

# Use
```python
import faster_than_requests as requests

requests.get("http://httpbin.org/get")  # GET
requests.post("http://httpbin.org/post", "Some Data Here") # POST
requests.download("http://example.com/foo.jpg", "out.jpg") # Download a file
requests.scraper(["http://foo.io", "http://bar.io"], threads=True) # Multi-Threaded Web Scraper
requests.scraper5(["http://foo.io"], sqlite_file_path="database.db") # URL-to-SQLite Web Scraper
requests.scraper6(["http://python.org"], [r"(www|http:|https:)+[^\s]+[\w]"])  # Regex-powered Web Scraper
requests.scraper7("http://python.org", "body > div.someclass a#someid")  # CSS Selector Web Scraper
requests.websocket_send("ws://echo.websocket.org", "data here") # WebSockets Binary/Text
```

# Table Of Contents
| | | | |
|:-----------------------:|:---------------------------:|:-----------------------------:|:-------------------------:|
| [**get()**](#get) | [**post()**](#post) | [**put()**](#put) | [**head()**](#head) |
| [**patch()**](#patch) | [**delete()**](#delete) | [download()](#download) | [download2()](#download2) |
| [scraper()](#scraper) | [scraper2()](#scraper2) | [scraper3()](#scraper3) | [scraper4()](#scraper4) |
| [scraper5()](#scraper5) | [scraper6()](#scraper6) | [scraper7()](#scraper7) | [get2str()](#get2str) |
| [get2str2()](#get2str2) | | [get2dict()](#get2dict) | [get2json()](#get2json) |
| [post2str()](#post2str) | [post2dict()](#post2dict) | [post2json()](#post2json) | [post2list()](#post2list) |
| [download3()](#download3) | [tuples2json()](#tuples2json) | [set_headers()](#set_headers) | [multipartdata2str()](#multipartdata2str) |
| [datauri()](#datauri) | [urlparse()](#urlparse) | [urlencode()](#urlencode) | [urldecode()](#urldecode) |
| [encodequery()](#encodequery) | [encodexml()](#encodexml) | [debugs()](#debugs) | [minifyhtml()](#minifyhtml) |
| [How to set DEBUG mode](#how-to-set-debug-mode) | [websocket_send()](#websocket_send) | [websocket_ping()](#websocket_ping) | |
| [How to Install](#install) | [How to Windows](#windows) | [FAQ](#faq) | [Get Help](https://github.com/juancarlospaco/faster-than-requests/issues/new/choose) |
| [PyPI](https://pypi.org/project/faster-than-requests) | [GitHub Actions / CI](https://github.com/juancarlospaco/faster-than-requests/actions?query=workflow%3APYTHON) | [Examples](https://github.com/juancarlospaco/faster-than-requests/tree/master/examples) | [Sponsors](#sponsors) |

# get()
**Description:**
Takes a URL string, makes an HTTP GET and returns the response.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `https://dev.to`.
- `user_agent` User Agent, string type, optional, should not be empty string.
- `max_redirects` Maximum Redirects, int type, optional, defaults to `9`, example `5`, example `1`.
- `proxy_url` Proxy URL, string type, optional, if is `""` then NO Proxy is used, defaults to `""`, example `172.15.256.1:666`.
- `proxy_auth` Proxy Auth, string type, optional, if `proxy_url` is `""` then is ignored, defaults to `""`.
- `timeout` Timeout, int type, optional, Milliseconds precision, defaults to `-1`, example `9999`, example `666`.
- `http_headers` HTTP Headers, List of Tuples type, optional, example `[("key", "value")]`, example `[("DNT", "1")]`.

Examples:
```python
import faster_than_requests as requests
requests.get("http://example.com")
```

**Returns:**
Response, `list` type, values of the list are string type,
values of the list can be empty string, the length of the list is always 7 items,
the values are like `[body, type, status, version, url, length, headers]`,
you can use `to_json()` to get JSON, `to_dict()` to get a dict, or `to_tuples()` to get a list of tuples.
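
For illustration, a minimal sketch of unpacking that 7-item response, assuming the field order and string values described above (the exact status text and body depend on the server):

```python
import faster_than_requests as requests

# Unpack the 7-item response list documented above (a sketch, not an official API shape).
body, content_type, status, version, url, length, headers = requests.get("http://httpbin.org/get")
print(status)      # e.g. "200 OK"
print(body[:80])   # first characters of the response body
```
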
**See Also:**
[get2str()](https://github.com/juancarlospaco/faster-than-requests#get2str) and [get2str2()](https://github.com/juancarlospaco/faster-than-requests#get2str2)

# post()
**Description:**
Takes a URL string, makes an HTTP POST and returns the response.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `https://dev.to`.
- `body` the Body data, string type, required, can be empty string. To Post Files use this too.
- `multipart_data` MultiPart data, optional, list of tuples type, must not be empty list, example `[("key", "value")]`. To Post Files use this too.
- `user_agent` User Agent, string type, optional, should not be empty string.
- `max_redirects` Maximum Redirects, int type, optional, defaults to `9`, example `5`, example `1`.
- `proxy_url` Proxy URL, string type, optional, if is `""` then NO Proxy is used, defaults to `""`, example `172.15.256.1:666`.
- `proxy_auth` Proxy Auth, string type, optional, if `proxy_url` is `""` then is ignored, defaults to `""`.
- `timeout` Timeout, int type, optional, Milliseconds precision, defaults to `-1`, example `9999`, example `666`.
- `http_headers` HTTP Headers, List of Tuples type, optional, example `[("key", "value")]`, example `[("DNT", "1")]`.

Examples:
```python
import faster_than_requests as requests
requests.post("http://httpbin.org/post", "Some Data Here")
```

**Returns:**
Response, `list` type, values of the list are string type,
values of the list can be empty string, the length of the list is always 7 items,
the values are like `[body, type, status, version, url, length, headers]`,
you can use `to_json()` to get JSON, `to_dict()` to get a dict, or `to_tuples()` to get a list of tuples.
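
As a follow-up, a minimal sketch of a multipart POST, assuming `multipart_data` can be passed as the documented list of tuples via a keyword argument (endpoint and values are just examples):

```python
import faster_than_requests as requests

# A sketch of a multipart POST; multipart_data is the documented list of
# (key, value) tuples, the endpoint and values are only examples.
requests.post("http://httpbin.org/post", "", multipart_data=[("key", "value")])
```
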
# put()
**Description:**
Takes a URL string, makes an HTTP PUT and returns the response.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `https://nim-lang.org`.
- `body` the Body data, string type, required, can be empty string.
- `user_agent` User Agent, string type, optional, should not be empty string.
- `max_redirects` Maximum Redirects, int type, optional, defaults to `9`, example `5`, example `1`.
- `proxy_url` Proxy URL, string type, optional, if is `""` then NO Proxy is used, defaults to `""`, example `172.15.256.1:666`.
- `proxy_auth` Proxy Auth, string type, optional, if `proxy_url` is `""` then is ignored, defaults to `""`.
- `timeout` Timeout, int type, optional, Milliseconds precision, defaults to `-1`, example `9999`, example `666`.
- `http_headers` HTTP Headers, List of Tuples type, optional, example `[("key", "value")]`, example `[("DNT", "1")]`.

Examples:
```python
import faster_than_requests as requests
requests.put("http://httpbin.org/put", "Some Data Here")
```

**Returns:**
Response, `list` type, values of the list are string type,
values of the list can be empty string, the length of the list is always 7 items,
the values are like `[body, type, status, version, url, length, headers]`,
you can use `to_json()` to get JSON, `to_dict()` to get a dict, or `to_tuples()` to get a list of tuples.

# delete()
**Description:**
Takes a URL string, makes an HTTP DELETE and returns the response.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `https://nim-lang.org`.
- `user_agent` User Agent, string type, optional, should not be empty string.
- `max_redirects` Maximum Redirects, int type, optional, defaults to `9`, example `5`, example `1`.
- `proxy_url` Proxy URL, string type, optional, if is `""` then NO Proxy is used, defaults to `""`, example `172.15.256.1:666`.
- `proxy_auth` Proxy Auth, string type, optional, if `proxy_url` is `""` then is ignored, defaults to `""`.
- `timeout` Timeout, int type, optional, Milliseconds precision, defaults to `-1`, example `9999`, example `666`.
- `http_headers` HTTP Headers, List of Tuples type, optional, example `[("key", "value")]`, example `[("DNT", "1")]`.

Examples:
```python
import faster_than_requests as requests
requests.delete("http://example.com/api/something")
```

**Returns:**
Response, `list` type, values of the list are string type,
values of the list can be empty string, the length of the list is always 7 items,
the values are like `[body, type, status, version, url, length, headers]`,
you can use `to_json()` to get JSON, `to_dict()` to get a dict, or `to_tuples()` to get a list of tuples.

# patch()
**Description:**
Takes a URL string, makes an HTTP PATCH and returns the response.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `https://archlinux.org`.
- `body` the Body data, string type, required, can be empty string.
- `user_agent` User Agent, string type, optional, should not be empty string.
- `max_redirects` Maximum Redirects, int type, optional, defaults to `9`, example `5`, example `1`.
- `proxy_url` Proxy URL, string type, optional, if is `""` then NO Proxy is used, defaults to `""`, example `172.15.256.1:666`.
- `proxy_auth` Proxy Auth, string type, optional, if `proxy_url` is `""` then is ignored, defaults to `""`.
- `timeout` Timeout, int type, optional, Milliseconds precision, defaults to `-1`, example `9999`, example `666`.
- `http_headers` HTTP Headers, List of Tuples type, optional, example `[("key", "value")]`, example `[("DNT", "1")]`.

Examples:
```python
import faster_than_requests as requests
requests.patch("http://example.com", "My Body Data Here")
```

**Returns:**
Response, `list` type, values of the list are string type,
values of the list can be empty string, the length of the list is always 7 items,
the values are like `[body, type, status, version, url, length, headers]`,
you can use `to_json()` to get JSON, `to_dict()` to get a dict, or `to_tuples()` to get a list of tuples.

# head()
**Description:**
Takes a URL string, makes an HTTP HEAD and returns the response.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `https://nim-lang.org`.
- `user_agent` User Agent, string type, optional, should not be empty string.
- `max_redirects` Maximum Redirects, int type, optional, defaults to `9`, example `5`, example `1`.
- `proxy_url` Proxy URL, string type, optional, if is `""` then NO Proxy is used, defaults to `""`, example `172.15.256.1:666`.
- `proxy_auth` Proxy Auth, string type, optional, if `proxy_url` is `""` then is ignored, defaults to `""`.
- `timeout` Timeout, int type, optional, Milliseconds precision, defaults to `-1`, example `9999`, example `666`.
- `http_headers` HTTP Headers, List of Tuples type, optional, example `[("key", "value")]`, example `[("DNT", "1")]`.

Examples:
```python
import faster_than_requests as requests
requests.head("http://example.com/api/something")
```

**Returns:**
Response, `list` type, values of the list are string type,
values of the list can be empty string, the length of the list is always 7 items,
the values are like `[body, type, status, version, url, length, headers]`,
you can use `to_json()` to get JSON, `to_dict()` to get a dict, or `to_tuples()` to get a list of tuples.

# to_dict()
**Description:** Convert the response to dict.
**Arguments:**
- `ftr_response` Response from any of the functions that return a response.

**Returns:** Response, `dict` type.
# to_json()
**Description:** Convert the response to Pretty-Printed JSON.
**Arguments:**
- `ftr_response` Response from any of the functions that return a response.

**Returns:** Response, Pretty-Printed JSON.
# to_tuples()
**Description:** Convert the response to a list of tuples.
**Arguments:**
- `ftr_response` Response from any of the functions that return a response.

**Returns:** Response, list of tuples.
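
For illustration, a minimal sketch that applies the three converters to one response (assuming they accept the raw response list as documented; exact key names depend on the library):

```python
import faster_than_requests as requests

# Fetch once, then convert the raw response with the documented helpers.
response = requests.get("http://httpbin.org/get")
print(requests.to_dict(response))    # dict
print(requests.to_json(response))    # pretty-printed JSON string
print(requests.to_tuples(response))  # list of (key, value) tuples
```
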
# Extras: Go beyond requests
## scraper()
**Description:**
Multi-Threaded Ready-Made URL-Deduplicating Web Scraper from a list of URLs.

![](misc/multithread-scraper.png)
All arguments are optional; it only needs the URL to work.
The scraper is designed as a 2-step web scraper: a first pass collects all URL links, then a second pass actually fetches those URLs.
Requests are processed asynchronously, so it does not need to wait for one request to finish before processing the next.

**Arguments:**
- `list_of_urls` List of URLs, URL must be string type, required, must not be empty list, example `["http://example.io"]`.
- `html_tag` HTML Tag to parse, string type, optional, defaults to `"a"` being Links, example `"h1"`.
- `case_insensitive` Case Insensitive, `True` for Case Insensitive, boolean type, optional, defaults to `True`, example `True`.
- `deduplicate_urls` Deduplicate `list_of_urls` removing repeated URLs, boolean type, optional, defaults to `False`, example `False`.
- `threads` Passing `threads = True` uses Multi-Threading, `threads = False` will Not use Multi-Threading, boolean type, optional, omitting it will Not use Multi-Threading.

Examples:
```python
import faster_than_requests as requests
requests.scraper(["https://nim-lang.org", "http://example.com"], threads=True)
```

**Returns:** Scraped web content.
## scraper2()
**Description:**
Multi-Tag Ready-Made URL-Deduplicating Web Scraper from a list of URLs.
All arguments are optional; it only needs the URL to work.
The scraper is designed as a 2-step web scraper: a first pass collects all URL links, then a second pass actually fetches those URLs.
Requests are processed asynchronously, so it does not need to wait for one request to finish before processing the next.
You can think of this scraper as a parallel evolution of the original scraper.

**Arguments:**
- `list_of_urls` List of URLs, URL must be string type, required, must not be empty list, example `["http://example.io"]`.
- `list_of_tags` List of HTML Tags to parse, List type, optional, defaults to `["a"]` being Links, example `["h1", "h2"]`.
- `case_insensitive` Case Insensitive, `True` for Case Insensitive, boolean type, optional, defaults to `True`, example `True`.
- `deduplicate_urls` Deduplicate `list_of_urls` removing repeated URLs, boolean type, optional, defaults to `False`, example `False`.
- `verbose` Verbose, print to terminal console the progress, bool type, optional, defaults to `True`, example `False`.
- `delay` Delay between a download and the next one, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `0`, must be a positive integer value, example `42`.
- `threads` Passing `threads = True` uses Multi-Threading, `threads = False` will Not use Multi-Threading, boolean type, optional, omitting it will Not use Multi-Threading.
- `agent` User Agent, string type, optional, must not be empty string.
- `redirects` Maximum Redirects, integer type, optional, defaults to `5`, must be positive integer.
- `timeout` Timeout, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `-1`, must be a positive integer value, example `42`.
- `header` HTTP Header, any HTTP Headers can be put here, list type, optional, example `[("key", "value")]`.
- `proxy_url` HTTPS Proxy Full URL, string type, optional, must not be empty string.
- `proxy_auth` HTTPS Proxy Authentication, string type, optional, defaults to `""`, empty string is ignored.

Examples:
```python
import faster_than_requests as requests
requests.scraper2(["https://nim-lang.org", "http://example.com"], list_of_tags=["h1", "h2"], case_insensitive=False)
```

**Returns:** Scraped web content.
## scraper3()
**Description:**
Multi-Tag Ready-Made URL-Deduplicating Web Scraper from a list of URLs.

![](misc/multitag-scraper.png)
This Scraper is designed with lots of extra options on the arguments.
All arguments are optional; it only needs the URL to work.
The scraper is designed as a 2-step web scraper: a first pass collects all URL links, then a second pass actually fetches those URLs.
You can think of this scraper as a parallel evolution of the original scraper.

**Arguments:**
- `list_of_urls` List of URLs, URL must be string type, required, must not be empty list, example `["http://example.io"]`.
- `list_of_tags` List of HTML Tags to parse, List type, optional, defaults to `["a"]` being Links, example `["h1", "h2"]`.
- `case_insensitive` Case Insensitive, `True` for Case Insensitive, boolean type, optional, defaults to `True`, example `True`.
- `deduplicate_urls` Deduplicate `list_of_urls` removing repeated URLs, boolean type, optional, defaults to `False`, example `False`.
- `start_with` Match at the start of the line, similar to `str().startswith()`, string type, optional, example `""`.
- `delay` Delay between a download and the next one, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `0`, must be a positive integer value, example `42`.
- `line_start` Slice the line at the start by this index, integer type, optional, defaults to `0` meaning no slicing since string start at index 0, example `3` cuts off 3 letters of the line at the start.
- `line_end` Slice the line at the end by this *reverse* index, integer type, optional, defaults to `1` meaning no slicing since string ends at reverse index 1, example `9` cuts off 9 letters of the line at the end.
- `pre_replacements` List of tuples of strings to replace *before* parsing, replacements are in parallel, List type, optional, example `[("old", "new"), ("red", "blue")]` will replace `"old"` with `"new"` and will replace `"red"` with `"blue"`.
- `post_replacements` List of tuples of strings to replace *after* parsing, replacements are in parallel, List type, optional, example `[("old", "new"), ("red", "blue")]` will replace `"old"` with `"new"` and will replace `"red"` with `"blue"`.
- `agent` User Agent, string type, optional, must not be empty string.
- `redirects` Maximum Redirects, integer type, optional, defaults to `5`, must be positive integer.
- `timeout` Timeout, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `-1`, must be a positive integer value, example `42`.
- `header` HTTP Header, any HTTP Headers can be put here, list type, optional, example `[("key", "value")]`.
- `proxy_url` HTTPS Proxy Full URL, string type, optional, must not be empty string.
- `proxy_auth` HTTPS Proxy Authentication, string type, optional, defaults to `""`, empty string is ignored.
- `verbose` Verbose, print to terminal console the progress, bool type, optional, defaults to `True`, example `False`.

Examples:
```python
import faster_than_requests as requests
requests.scraper3(["https://nim-lang.org", "http://example.com"], list_of_tags=["h1", "h2"], case_insensitive=False)
```

**Returns:** Scraped web content.
## scraper4()
**Description:**
Images and Photos Ready-Made Web Scraper from a list of URLs.

![](misc/photo-scraper.png)
The Images and Photos scraped from the first URL will be put into a new sub-folder named `0`,
Images and Photos scraped from the second URL will be put into a new sub-folder named `1`, and so on.
All arguments are optional; it only needs the URL to work.
You can think of this scraper as a parallel evolution of the original scraper.

**Arguments:**
- `list_of_urls` List of URLs, URL must be string type, required, must not be empty list, example `["https://unsplash.com/s/photos/cat", "https://unsplash.com/s/photos/dog"]`.
- `case_insensitive` Case Insensitive, `True` for Case Insensitive, boolean type, optional, defaults to `True`, example `True`.
- `deduplicate_urls` Deduplicate `list_of_urls` removing repeated URLs, boolean type, optional, defaults to `False`, example `False`.
- `visited_urls` Do not visit same URL twice, even if redirected into, keeps track of visited URLs, bool type, optional, defaults to `True`.
- `delay` Delay between a download and the next one, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `0`, must be a positive integer value, example `42`.
- `folder` Directory to download Images and Photos, string type, optional, defaults to current folder, must not be empty string, example `/tmp`.
- `force_extension` Force file extension to be this file extension, string type, optional, defaults to `".jpg"`, must not be empty string, example `".png"`.
- `https_only` Force to download images on Secure HTTPS only ignoring plain HTTP, sometimes HTTPS may redirect to HTTP, bool type, optional, defaults to `False`, example `True`.
- `html_output` Collect all scraped Images and Photos into 1 HTML file with all elements scraped, bool type, optional, defaults to `True`, example `False`.
- `csv_output` Collect all scraped URLs into 1 CSV file with all links scraped, bool type, optional, defaults to `True`, example `False`.
- `verbose` Verbose, print to terminal console the progress, bool type, optional, defaults to `True`, example `False`.
- `print_alt` print to terminal console the `alt` attribute of the Images and Photos, bool type, optional, defaults to `False`, example `True`.
- `picture` Scrape images from the HTML5 `<picture>` tags instead of `<img>` tags, `<picture>` provides Responsive images for several resolutions but you may also get duplicated images, bool type, optional, defaults to `False`, example `True`.
- `agent` User Agent, string type, optional, must not be empty string.
- `redirects` Maximum Redirects, integer type, optional, defaults to `5`, must be positive integer.
- `timeout` Timeout, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `-1`, must be a positive integer value, example `42`.
- `header` HTTP Header, any HTTP Headers can be put here, list type, optional, example `[("key", "value")]`.
- `proxy_url` HTTPS Proxy Full URL, string type, optional, must not be empty string.
- `proxy_auth` HTTPS Proxy Authentication, string type, optional, defaults to `""`, empty string is ignored.

Examples:
```python
import faster_than_requests as requests
requests.scraper4(["https://unsplash.com/s/photos/cat", "https://unsplash.com/s/photos/dog"])
```

**Returns:** None.
## scraper5()
**Description:**
Recursive Web Scraper to SQLite Database: you give it a URL, it gives back an SQLite database.

![](misc/sqlite-scraper.png)
The SQLite database can be inspected with any SQLite GUI, like https://sqlitebrowser.org
If the script gets interrupted, for example with CTRL+C, it will try its best to keep the data consistent.
Additionally it will create a CSV file with all the scraped URLs.
HTTP Headers are stored as Pretty-Printed JSON.
Date and Time are stored as Unix Timestamps.
All arguments are optional; it only needs the URL and SQLite file path to work.
You can think of this scraper as a parallel evolution of the original scraper.

**Arguments:**
- `list_of_urls` List of URLs, URL must be string type, required, must not be empty list, example `["https://unsplash.com/s/photos/cat", "https://unsplash.com/s/photos/dog"]`.
- `sqlite_file_path` Full file path to a new SQLite Database, must be `.db` file extension, string type, required, must not be empty string, example `"scraped_data.db"`.
- `skip_ends_with` Skip the URL if ends with this pattern, list type, optional, must not be empty list, example `[".jpg", ".pdf"]`.
- `case_insensitive` Case Insensitive, `True` for Case Insensitive, boolean type, optional, defaults to `True`, example `True`.
- `deduplicate_urls` Deduplicate `list_of_urls` removing repeated URLs, boolean type, optional, defaults to `False`, example `False`.
- `visited_urls` Do not visit same URL twice, even if redirected into, keeps track of visited URLs, bool type, optional, defaults to `True`.
- `delay` Delay between a download and the next one, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `0`, must be a positive integer value, example `42`.
- `https_only` Force to download images on Secure HTTPS only ignoring plain HTTP, sometimes HTTPS may redirect to HTTP, bool type, optional, defaults to `False`, example `True`.
- `only200` Only commit to Database the successful scraping pages, ignore all errors, bool type, optional, example `True`.
- `agent` User Agent, string type, optional, must not be empty string.
- `redirects` Maximum Redirects, integer type, optional, defaults to `5`, must be positive integer.
- `timeout` Timeout, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `-1`, must be a positive integer value, example `42`.
- `max_loops` Maximum total Loops to do while scraping, like a global guard for infinite redirections, integer type, optional, example `999`.
- `max_deep` Maximum total scraping Recursive Deep, like a global guard for infinite deep recursivity, integer type, optional, example `999`.
- `header` HTTP Header, any HTTP Headers can be put here, list type, optional, example `[("key", "value")]`.
- `proxy_url` HTTPS Proxy Full URL, string type, optional, must not be empty string.
- `proxy_auth` HTTPS Proxy Authentication, string type, optional, defaults to `""`, empty string is ignored.

Examples:
```python
import faster_than_requests as requests
requests.scraper5(["https://example.com"], "scraped_data.db")
```

**Returns:** None.
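
As a follow-up, a minimal sketch of inspecting the produced database with the standard-library `sqlite3` module; the schema is up to the library, so only the table names are listed:

```python
import sqlite3

# Open the database produced by scraper5() and list its tables
# (schema details depend on the library, so we stay schema-agnostic).
con = sqlite3.connect("scraped_data.db")
for (name,) in con.execute("SELECT name FROM sqlite_master WHERE type = 'table'"):
    print(name)
con.close()
```
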
## scraper6()
**Description:**
Regex powered Web Scraper from a list of URLs.
Scrape web content using a list of Perl Compatible Regular Expressions (PCRE standard).
You can configure the Regular Expressions to be case insensitive, multiline, or extended.
This Scraper is designed for developers who know Regular Expressions.
[Learn Regular Expressions.](https://github.com/ziishaned/learn-regex#translations)
All arguments are optional; it only needs the URL and the Regex to work.
You can think of this scraper as a parallel evolution of the original scraper.

**Regex Arguments:**
(Arguments focused on Regular Expression parsing and matching)

- `list_of_regex` List of Perl Compatible Regular Expressions (PCRE standard) to match the URL against, List type, required, example `["(www|http:|https:)+[^\s]+[\w]"]`.
- `case_insensitive` Case Insensitive Regular Expressions, do caseless matching, `True` for Case Insensitive, boolean type, optional, defaults to `False`, example `True`.
- `multiline` Multi-Line Regular Expressions, `^` and `$` match newlines within data, boolean type, optional, defaults to `False`, example `True`.
- `extended` Extended Regular Expressions, ignore all whitespaces and `#` comments, boolean type, optional, defaults to `False`, example `True`.
- `dot` Dot `.` matches anything, including new lines, boolean type, optional, defaults to `False`, example `True`.
- `start_with` Perl Compatible Regular Expression to match at the start of the line, similar to `str().startswith()` but with Regular Expressions, string type, optional.
- `ends_with` Perl Compatible Regular Expression to match at the end of the line, similar to `str().endswith()` but with Regular Expressions, string type, optional.
- `post_replacement_regex` Perl Compatible Regular Expressions (PCRE standard) to replace *after* parsing, string type, optional, this option works with `post_replacement_by`, this is like a Regex post-processing, this option is for experts on Regular Expressions.
- `post_replacement_by` string **to replace by** *after* parsing, string type, optional, this option works with `post_replacement_regex`, this is like a Regex post-processing, this option is for experts on Regular Expressions.
- `re_start` Perl Compatible Regular Expression matches start at this index, positive integer type, optional, defaults to `0`, this option is for experts on Regular Expressions.

**Arguments:**
- `list_of_urls` List of URLs, URL must be string type, required, must not be empty list, example `["http://example.io"]`.
- `deduplicate_urls` Deduplicate `list_of_urls` removing repeated URLs, boolean type, optional, defaults to `False`, example `False`.
- `delay` Delay between a download and the next one, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `0`, must be a positive integer value, example `42`.
- `agent` User Agent, string type, optional, must not be empty string.
- `redirects` Maximum Redirects, integer type, optional, defaults to `5`, must be positive integer.
- `timeout` Timeout, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `-1`, must be a positive integer value, example `42`.
- `header` HTTP Header, any HTTP Headers can be put here, list type, optional, example `[("key", "value")]`.
- `proxy_url` HTTPS Proxy Full URL, string type, optional, must not be empty string.
- `proxy_auth` HTTPS Proxy Authentication, string type, optional, defaults to `""`, empty string is ignored.
- `verbose` Verbose, print to terminal console the progress, bool type, optional, defaults to `True`, example `False`.

Examples:
```python
import faster_than_requests as requests
requests.scraper6(["http://nim-lang.org", "http://python.org"], [r"(www|http:|https:)+[^\s]+[\w]"])
```

**Returns:** Scraped web content.
## scraper7()
![](https://raw.githubusercontent.com/juancarlospaco/faster-than-requests/master/css_selectors.png)
**Description:**
CSS Selector powered Web Scraper. Scrape web content using a CSS Selector.
The CSS Syntax does NOT take Regex nor Regex-like syntax nor literal tag attribute values.
All arguments are optional; it only needs the URL and CSS Selector to work.
You can think of this scraper as a parallel evolution of the original scraper.

**Arguments:**
- `url` The URL, string type, required, must not be empty string, example `"http://python.org"`.
- `css_selector` CSS Selector, string type, required, must not be empty string, example `"body nav.class ul.menu > li > a"`.
- `agent` User Agent, string type, optional, must not be empty string.
- `redirects` Maximum Redirects, integer type, optional, defaults to `9`, must be positive integer.
- `timeout` Timeout, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `-1`, must be a positive integer value, example `42`.
- `header` HTTP Header, any HTTP Headers can be put here, list type, optional, example `[("key", "value")]`.
- `proxy_url` HTTPS Proxy Full URL, string type, optional, must not be empty string.
- `proxy_auth` HTTPS Proxy Authentication, string type, optional, defaults to `""`, empty string is ignored.

Examples:
```python
import faster_than_requests as requests
requests.scraper7("http://python.org", "body > div.class a#someid")
```

```python
import faster_than_requests as requests
requests.scraper7("https://nim-lang.org", "a.pure-menu-link")
[
'Blog',
'Features',
'Download',
'Learn',
'Documentation',
'Forum',
'Source'
]
```

More examples:
https://github.com/juancarlospaco/faster-than-requests/blob/master/examples/web_scraper_via_css_selectors.py

**Returns:** Scraped web content.
## websocket_ping()
**Description:**
WebSocket Ping.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `"ws://echo.websocket.org"`.
- `data` data to send, string type, optional, can be empty string, default is empty string, example `""`.
- `hangup` Close the Socket without sending a close packet, optional, default is `False`; not sending a close packet can be faster.

Examples:
```python
import faster_than_requests as requests
requests.websocket_ping("ws://echo.websocket.org")
```

**Returns:** Response, `string` type, can be empty string.
## websocket_send()
**Description:**
WebSocket send data, binary or text.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `"ws://echo.websocket.org"`.
- `data` data to send, string type, optional, can be empty string, default is empty string, example `""`.
- `is_text` if `True` data is sent as Text else as Binary, optional, default is `False`.
- `hangup` Close the Socket without sending a close packet, optional, default is `False`; not sending a close packet can be faster.

Examples:
```python
import faster_than_requests as requests
requests.websocket_send("ws://echo.websocket.org", "data here")
```

**Returns:** Response, `string` type.
## get2str()
**Description:**
Takes a URL string, makes an HTTP GET and returns a string with the response Body.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `https://archlinux.org`.

Examples:
```python
import faster_than_requests as requests
requests.get2str("http://example.com")
```

**Returns:** Response body, `string` type, can be empty string.
## get2str2()
**Description:**
Takes a list of URLs, makes 1 HTTP GET for each URL, and returns a list of strings with the response Body.
This makes all the `GET` requests fully parallel, in a single Thread, in a single Process.

**Arguments:**
- `list_of_urls` A list of the remote URLs, list type, required. Objects inside the list must be string type.

Examples:
```python
import faster_than_requests as requests
requests.get2str2(["http://example.com/foo", "http://example.com/bar"]) # Parallel GET
```

**Returns:**
List of response bodies, `list` type, values of the list are string type,
values of the list can be empty string, can be empty list.

## get2dict()
**Description:**
Takes a URL, makes an HTTP GET, returns a dict with the response Body.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `https://alpinelinux.org`.

Examples:
```python
import faster_than_requests as requests
requests.get2dict("http://example.com")
```

**Returns:**
Response, `dict` type, values of the dict are string type,
values of the dict can be empty string, but keys are always consistent.

## get2json()
**Description:**
Takes a URL, makes an HTTP GET, returns Minified Computer-friendly single-line JSON with the response Body.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string, example `https://alpinelinux.org`.
- `pretty_print` Pretty Printed JSON, optional, defaults to `False`.

Examples:
```python
import faster_than_requests as requests
requests.get2json("http://example.com", pretty_print=True)
```

**Returns:** Response Body, Pretty-Printed JSON.
## post2str()
**Description:**
Takes a URL, makes an HTTP POST, returns the response Body as string type.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string.
- `body` the Body data, string type, required, can be empty string.
- `multipart_data` MultiPart data, optional, list of tuples type, must not be empty list, example `[("key", "value")]`.

Examples:
```python
import faster_than_requests as requests
requests.post2str("http://example.com/api/foo", "My Body Data Here")
```

**Returns:** Response body, `string` type, can be empty string.
## post2dict()
**Description:**
Takes a URL, makes an HTTP POST on that URL, returns a dict with the response.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string.
- `body` the Body data, string type, required, can be empty string.
- `multipart_data` MultiPart data, optional, list of tuples type, must not be empty list, example `[("key", "value")]`.

Examples:
```python
import faster_than_requests as requests
requests.post2dict("http://example.com/api/foo", "My Body Data Here")
```

**Returns:**
Response, `dict` type, values of the dict are string type,
values of the dict can be empty string, but keys are always consistent.

## post2json()
**Description:**
Takes a URL, makes an HTTP POST, returns the response as JSON.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string.
- `body` the Body data, string type, required, can be empty string.
- `multipart_data` MultiPart data, optional, list of tuples type, must not be empty list, example `[("key", "value")]`.
- `pretty_print` Pretty Printed JSON, optional, defaults to `False`.

Examples:
```python
import faster_than_requests as requests
requests.post2json("http://example.com/api/foo", "My Body Data Here")
```

**Returns:** Response, string type.
## post2list()
**Description:**
Takes a list of URLs, makes 1 HTTP POST for each URL, returns a list of responses.

**Arguments:**
- `list_of_urls` the remote URLs, list type, required, the objects inside the list must be string type.
- `body` the Body data, string type, required, can be empty string.
- `multipart_data` MultiPart data, optional, list of tuples type, must not be empty list, example `[("key", "value")]`.

Examples:
```python
import faster_than_requests as requests
requests.post2list(["http://example.com/api/foo", "http://example.com/api/bar"], "My Body Data Here")
```

**Returns:**
List of response bodies, `list` type, values of the list are string type,
values of the list can be empty string, can be empty list.

## download()
**Description:**
Takes a URL and a local filename, makes an HTTP GET and saves the response body to that file.

**Arguments:**
- `url` the remote URL, string type, required, must not be empty string.
- `filename` the local filename, string type, required, must not be empty string, full path recommended, can be relative path, includes file extension.

Examples:
```python
import faster_than_requests as requests
requests.download("http://example.com/api/foo", "my_file.ext")
```

**Returns:** None.
## download2()
**Description:**
Takes a list of (URL, filename) tuples, makes 1 HTTP GET Download for each item of the list.

**Arguments:**
- `list_of_files` list of tuples, tuples must be 2 items long, first item is URL and second item is filename.
The remote URL, string type, required, must not be empty string, is the first item on the tuple.
The local filename, string type, required, must not be empty string, can be full path, can be relative path, must include file extension.
- `delay` Delay between a download and the next one, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `0`, must be a positive integer value.
- `threads` Passing `threads = True` uses Multi-Threading, `threads = False` will Not use Multi-Threading, omitting it will Not use Multi-Threading.

Examples:
```python
import faster_than_requests as requests
requests.download2([("http://example.com/cat.jpg", "kitten.jpg"), ("http://example.com/dog.jpg", "doge.jpg")])
```

**Returns:** None.
## download3()
**Description:**
Takes a list of (URL, filename) tuples, makes 1 HTTP GET Download for each item of the list.
It will Retry again and again in a loop until the file is downloaded or `tries` reaches `0`, whichever happens first.
If all retries have failed and `tries` is `0` it will error out.

**Arguments:**
- `list_of_files` list of tuples, tuples must be 2 items long, first item is URL and second item is filename.
The remote URL, string type, required, must not be empty string, is the first item on the tuple.
The local filename, string type, required, must not be empty string, can be full path, can be relative path, must include file extension.
- `delay` Delay between a download and the next one, MicroSeconds precision (1000 = 1 Second), integer type, optional, defaults to `0`, must be a positive integer value.
- `tries` how many Retries to try, positive integer type, optional, defaults to `9`, must be a positive integer value.
- `backoff` Back-Off between retries, positive integer type, optional, defaults to `2`, must be a positive integer value.
- `jitter` Jitter applied to the Back-Off between retries (Modulo math operation), positive integer type, optional, defaults to `2`, must be a positive integer value.
- `verbose` be Verbose, bool type, optional, defaults to `True`.

**Returns:** None.
Examples:
```python
import faster_than_requests as requests
requests.download3(
[("http://INVALID/cat.jpg", "kitten.jpg"), ("http://INVALID/dog.jpg", "doge.jpg")],
delay = 1, tries = 9, backoff = 2, jitter = 2, verbose = True,
)
```

Examples of Failed download output (intended):
```console
$ python3 example_fail_all_retry.py
Retry: 3 of 3
(url: "http://NONEXISTENT", filename: "a.json")
No such file or directory
Additional info: "Name or service not known"
Retrying in 64 microseconds...
Retry: 2 of 3
(url: "http://NONEXISTENT", filename: "a.json")
No such file or directory
Additional info: "Name or service not known"
Retrying in 128 microseconds (Warning: This is the last Retry!).
Retry: 1 of 3
(url: "http://NONEXISTENT", filename: "a.json")
No such file or directory
Additional info: "Name or service not known"
Retrying in 256 microseconds (Warning: This is the last Retry!).
Traceback (most recent call last):
File "example_fail_all_retry.py", line 3, in
downloader.download3()
...$
```

## set_headers()
**Description:**
Set the HTTP Headers from the arguments.
**This is for the functions that do NOT allow `http_headers` as an argument.**

**Arguments:**
- `http_headers` HTTP Headers, List of Tuples type, required, example `[("key", "value")]`, example `[("DNT", "1")]`.
List of tuples, tuples must be 2 items long, must not be empty list, must not be empty tuple,
the first item of the tuple is the key and second item of the tuple is value,
keys must not be empty string, values can be empty string, both must be stripped.

Examples:
```python
import faster_than_requests as requests
requests.set_headers(headers = [("key", "value")])
```

```python
import faster_than_requests as requests
requests.set_headers([("key0", "value0"), ("key1", "value1")])
```

```python
import faster_than_requests as requests
requests.set_headers([("content-type", "text/plain"), ("dnt", "1")])
```

**Returns:** None.
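
As a follow-up, a minimal sketch of the intended flow, assuming `set_headers()` applies globally to the header-less helpers such as `get2str()` (header values and endpoint are just examples):

```python
import faster_than_requests as requests

# Configure headers once, then call a helper that takes no http_headers argument.
requests.set_headers([("accept", "application/json"), ("dnt", "1")])
print(requests.get2str("http://httpbin.org/headers"))
```
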
## multipartdata2str()
**Description:**
Takes MultiPart Data and returns a string representation. Converts MultipartData to 1 human readable string.
The human-friendly representation is not machine-friendly, so it is not Serialization nor Stringification, just for humans.
It is faster and different than stdlib `parse_multipart`.

**Arguments:**
- `multipart_data` MultiPart data, optional, list of tuples type, must not be empty list, example `[("key", "value")]`.

Examples:
```python
import faster_than_requests as requests
requests.multipartdata2str([("key", "value")])
```

**Returns:** string.
## datauri()
**Description:**
Takes data and returns a [standard Base64 Data URI (RFC-2397).](https://tools.ietf.org/html/rfc2397)
At the time of writing the Python stdlib does not have a function in the `base64` module that returns a Data URI (RFC-2397).
This can be used as a URL in HTML/CSS/JS. It is faster and different than stdlib `base64`.

**Arguments:**
- `data` Arbitrary Data, string type, required.
- `mime` MIME Type of `data`, string type, required, example `"text/plain"`.
- `encoding` Encoding, string type, optional, defaults to `"utf-8"`, example `"utf-8"`, `"utf-8"` is recommended.

Examples:
```python
import faster_than_requests as requests
requests.datauri("Nim", "text/plain")
```

**Returns:** string.
## urlparse()
**Description:**
Parse any URL and return parsed primitive values like
`scheme`, `username`, `password`, `hostname`, `port`, `path`, `query`, `anchor`, `opaque`, etc.
It is faster and different than stdlib `urlparse`.

**Arguments:**
- `url` The URL, string type, required.

Examples:
```python
import faster_than_requests as requests
requests.urlparse("https://nim-lang.org")
```

**Returns:** `scheme`, `username`, `password`, `hostname`, `port`, `path`, `query`, `anchor`, `opaque`, etc.
## urlencode()
**Description:**
Encodes a URL according to RFC-3986, string to string.
It is faster and different than stdlib `urlencode`.

**Arguments:**
- `url` The URL, string type, required.
- `use_plus` When `use_plus` is `true`, spaces are encoded as `+` instead of `%20`.

Examples:
```python
import faster_than_requests as requests
requests.urlencode("https://nim-lang.org", use_plus = True)
```

**Returns:** string.
## urldecode()
**Description:**
Decodes a URL according to RFC-3986, string to string.
It is faster and different than stdlib `unquote`.

**Arguments:**
- `url` The URL, string type, required.
- `use_plus` When `use_plus` is `true`, `+` is decoded as a space instead of being left as `+`.

Examples:
```python
import faster_than_requests as requests
requests.urldecode(r"https%3A%2F%2Fnim-lang.org", use_plus = False)
```

**Returns:** string.
## encodequery()
**Description:**
Encode a URL according to RFC-3986, string to string.
It is faster and different than stdlib `quote_plus`.

**Arguments:**
- `query` List of Tuples, required, example `[("key", "value")]`, example `[("DNT", "1")]`.
- `omit_eq` If the value is an empty string then the `=""` is omitted, unless `omit_eq` is `false`.
- `use_plus` When `use_plus` is `true`, spaces are encoded as `+` instead of `%20`.

Examples:
```python
import faster_than_requests as requests
requests.encodequery([("key", "value")], use_plus = True, omit_eq = True)
```

**Returns:** string.
## encodexml()
**Description:**
Convert the characters `&`, `<`, `>`, `"` in a string to an HTML-safe string, output is Valid XML.
Use this if you need to display text that might contain such characters in HTML, SVG or XML.
It is faster and different than stdlib `html.escape`.

**Arguments:**
- `s` Arbitrary string, required.

Examples:
```python
import faster_than_requests as requests
requests.encodexml("<b>Hello World</b>")
```

**Returns:** string.
## minifyhtml()
**Description:**
Fast HTML and SVG Minifier. Not an Obfuscator.

**Arguments:**
- `html` HTML string, required.

Examples:
```python
import faster_than_requests as requests
requests.minifyhtml("<body>  <p>  Hello World  </p>  </body>")
```

**Returns:** string.
## gen_auth_header()
**Description:**
Helper for HTTP Authentication headers.
Returns 1 string kinda like `"Basic base64(username:password)"`,
so it can be used like `[("Authorization", gen_auth_header("username", "password"))]`.
See https://github.com/juancarlospaco/faster-than-requests/issues/168#issuecomment-858999317

**Arguments:**
- `username` Username string, must not be empty string, required.
- `password` Password string, must not be empty string, required.

**Returns:** string.
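
A minimal sketch of combining it with `set_headers()`, assuming the helper is exposed at module level as documented (credentials and endpoint are placeholders):

```python
import faster_than_requests as requests

# Build a Basic Auth header value and register it globally (a sketch).
auth = requests.gen_auth_header("username", "password")
requests.set_headers([("Authorization", auth)])
print(requests.get2str("http://httpbin.org/basic-auth/username/password"))
```
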
## debugs()
**Description:**
Debug the internal Configuration of the library, takes no arguments, returns nothing,
prints the pretty-printed human-friendly multi-line JSON Configuration to the standard output terminal.

Examples:
```python
import faster_than_requests as requests
requests.debugs()
```

**Arguments:** None.
**Returns:** None.
## optimizeGC()
**Description:**
This module uses compile-time deterministic memory management GC (kinda like Rust, but for Python).
Python at run-time makes a pause, runs a Garbage Collector, and resumes again after the pause.
`gctricks.optimizeGC` allows you to temporarily omit the Python GC pauses at run-time inside a context manager block;
this is the proper way to use this module for Benchmarks! This is optional but recommended.
We did not invent this; it is inspired by work from the Instagram Engineering team and battle tested by them:
- https://instagram-engineering.com/dismissing-python-garbage-collection-at-instagram-4dca40b29172
This is NOT a function, it is a context manager; it takes no arguments and returns nothing.
This calls `init_client()` at start and `close_client()` at end automatically.
Examples:
```python
from gctricks import optimizeGC

with optimizeGC:
    ...  # All your HTTP code here. Chill the GC. Calls init_client() and close_client() automatically.
# GC run-time pauses enabled again.
```

## init_client()
**Description:**
Instantiate the HTTP Client object, for deferred initialization; call it before the start of all HTTP operations.
`get()`, `post()`, `put()`, `patch()`, `delete()`, `head()` do NOT need this, because they auto-init;
this exists for performance reasons, to defer the initialization, and was requested by the community. This is optional but recommended.
Read `optimizeGC` documentation before using.
**Arguments:** None.
Examples:
```python
import faster_than_requests as requests
requests.init_client()
# All your HTTP code here.
```

**Returns:** None.
## close_client()
**Description:**
Tear down the HTTP Client object, for deferred de-initialization; call it after the end of all HTTP operations.
`get()`, `post()`, `put()`, `patch()`, `delete()`, `head()` do NOT need this, because they auto-init;
this exists for performance reasons, to defer the de-initialization, and was requested by the community. This is optional but recommended.
Read `optimizeGC` documentation before using.
**Arguments:** None.
Examples:
```python
import faster_than_requests as requests
# All your HTTP code here.
requests.close_client()
```

**Returns:** None.
[**For more Examples check the Examples and Tests.**](https://github.com/juancarlospaco/faster-than-requests/blob/master/examples/example.py)
Instead of having a pair of functions with a lot of arguments that you must provide to make them work,
we have tiny functions with very few arguments that do one thing and do it as fast as possible.
A lot of the functions are oriented to Data Science, Big Data, Open Data, Web Scraping, and working with HTTP REST JSON APIs.
# Install
- `pip install faster_than_requests`
# Docker
- Take it for a quick test drive on Docker!
```bash
$ ./build-docker.sh
$ ./run-docker.sh
$ ./server4benchmarks & # Inside Docker.
$ python3 benchmark.py # Inside Docker.
```

# Dependencies
- **None**
# Platforms
- ✅ Linux
- ✅ Windows
- ✅ Mac
- ✅ Android
- ✅ Raspberry Pi
- ✅ BSD

# Extras
More Faster Libraries...
- https://github.com/juancarlospaco/faster-than-csv#faster-than-csv
- https://github.com/juancarlospaco/faster-than-walk#faster-than-walk
- We want to make Open Source faster, better, stronger.

# Requisites
- Python 3.
- 64 Bit.

# Windows
- Documentation assumes experience with Git, GitHub, cmd, compiled software, and a PC with Administrator access.
- If installation fails on Windows, just use the Source Code:

![win-compile](https://user-images.githubusercontent.com/1189414/63147831-b8bf6100-bfd5-11e9-9e6e-91d61040f139.png "Git Clone and Compile on Windows 10 with only Git and Nim installed, just 2 commands!")
**The only software needed is [Git for Windows](https://github.com/git-for-windows/git/releases/latest) and [Nim](https://github.com/dom96/choosenim#windows).**
Reboot after install. Administrator required for install. Everything must be 64Bit.
If that fails too, don't waste time and go directly for [Docker for Windows](https://docs.docker.com/docker-for-windows).
For info about how to install [Git for Windows](https://github.com/git-for-windows/git/releases/latest), read [Git for Windows](https://github.com/git-for-windows/git/releases/latest) Documentation.
[For info about how to install Nim, read Nim Documentation.](https://nim-lang.org/install.html)
For info about how to install [Docker for Windows.](https://docs.docker.com/docker-for-windows), read [Docker for Windows.](https://docs.docker.com/docker-for-windows) Documentation.
[GitHub Actions Build everything from zero on each push, use it as guidance too.](https://github.com/juancarlospaco/faster-than-requests/actions?query=workflow%3APYTHON)
- Git Clone and Compile on Windows 10 in just 2 commands!
- [Alternatively you can try Docker for Windows.](https://docs.docker.com/docker-for-windows)
- [Alternatively you can try WSL for Windows.](https://docs.microsoft.com/en-us/windows/wsl/about)
- **The file extension must be `.pyd`, NOT `.dll`. Compile with `-d:ssl` to use HTTPS.**

```
nimble install nimpy
nim c -d:ssl -d:danger --app:lib --tlsEmulation:on --out:faster_than_requests.pyd faster_than_requests.nim
```

# Sponsors
- **Become a Sponsor and help improve this library with the features you want! We need Sponsors!**

Bitcoin BTC

**BEP20 Binance Smart Chain Network BSC**
```
0xb78c4cf63274bb22f83481986157d234105ac17e
```
**BTC Bitcoin Network**
```
1Pnf45MgGgY32X4KDNJbutnpx96E4FxqVi
```
Ethereum ETH, Dai DAI, Uniswap UNI, Axie Infinity AXS, Smooth Love Potion SLP

**BEP20 Binance Smart Chain Network BSC**
```
0xb78c4cf63274bb22f83481986157d234105ac17e
```
**ERC20 Ethereum Network**
```
0xb78c4cf63274bb22f83481986157d234105ac17e
```
Tether USDT

**BEP20 Binance Smart Chain Network BSC**
```
0xb78c4cf63274bb22f83481986157d234105ac17e
```
**ERC20 Ethereum Network**
```
0xb78c4cf63274bb22f83481986157d234105ac17e
```
**TRC20 Tron Network**
```
TWGft53WgWvH2mnqR8ZUXq1GD8M4gZ4Yfu
```
Solana SOL

**BEP20 Binance Smart Chain Network BSC**
```
0xb78c4cf63274bb22f83481986157d234105ac17e
```
**SOL Solana Network**
```
FKaPSd8kTUpH7Q76d77toy1jjPGpZSxR4xbhQHyCMSGq
```
Cardano ADA

**BEP20 Binance Smart Chain Network BSC**
```
0xb78c4cf63274bb22f83481986157d234105ac17e
```
**ADA Cardano Network**
```
DdzFFzCqrht9Y1r4Yx7ouqG9yJNWeXFt69xavLdaeXdu4cQi2yXgNWagzh52o9k9YRh3ussHnBnDrg7v7W2hSXWXfBhbo2ooUKRFMieM
```
Sandbox SAND, Decentraland MANA

**ERC20 Ethereum Network**
```
0xb78c4cf63274bb22f83481986157d234105ac17e
```
Algorand ALGO

**ALGO Algorand Network**
```
WM54DHVZQIQDVTHMPOH6FEZ4U2AU3OBPGAFTHSCYWMFE7ETKCUUOYAW24Q
```
Binance
https://pay.binance.com/en/checkout/e92e536210fd4f62b426ea7ee65b49c3

# FAQ
- What's the idea, inspiration, reason, etc.?
[Feel free to Fork, Clone, Download, Improve, Reimplement, Play with this Open Source. Make it 10 times faster, 10 times smaller.](http://tonsky.me/blog/disenchantment)
- Does this work with SSL?
Yes.
- Does this work without SSL?
Yes.
- Does this require Cython?
No.
- Does this run on PyPy?
No.
- Does this run on Python 2?
I dunno (not supported).
- Does this run on 32-bit?
No.
- Does this compile with Clang?
No.
- Where can I get help?
https://github.com/juancarlospaco/faster-than-requests/issues
- How do I set the URL?
`url="http://example.com"` (always the 1st argument).
- How do I set the HTTP Body?
`body="my body"`
- How do I set an HTTP Header key/value pair?
[set_headers()](https://github.com/juancarlospaco/faster-than-requests#set_headers) (see the sketch after this FAQ).
- How can it be faster than PyCurl?
I dunno.
- Why use a Tuple instead of a Dict for HTTP Headers?
For performance: a `dict` is slower, bigger, heavier and mutable compared to a `tuple`.
- Why does it need 64-bit?
It may work on 32-bit, but that is not supported: integer sizes are too small and performance can be worse.
- Why does it need Python 3?
It may work on Python 2, but that is not supported and performance can be worse; we suggest migrating to Python 3.
- Can I wrap the functions in a `try: except:` block?
Functions do not have internal `try: except:` blocks,
so you can wrap them in `try: except:` blocks if you need very resilient code (see the sketch after this FAQ).
- PIP fails to install or fails to build the wheel?
Add the following at the end of the `pip install` command:
` --isolated --disable-pip-version-check --no-cache-dir --no-binary :all: `
Not my bug.
- How do I build the project?
`build.sh` or `build.nims`
- How do I package the project?
`package.sh` or `package.nims`
- Does this require Nimble?
No.
- What's the unit of measurement for speed?
Unmodified raw output of the Python `timeit` module (see the measurement sketch after this FAQ).
Please send a Pull Request to Python to improve the output of `timeit`.
- Is the LoC count a lie because it does not count the lines of code of the compiler?
Projects that use Cython don't count all of Cython in their LoC, so we don't either.
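
To tie the usage answers above together, here is a minimal sketch (not the library's official example): the `get`/`post` function names and the tuple-based `set_headers()` call are assumptions based on the API section earlier in this README, and `httpbin.org` is only a placeholder endpoint.

```
# Minimal usage sketch; function names and signatures assumed from the API section above.
import faster_than_requests as requests

# Headers are (key, value) tuples, not a dict (smaller, immutable, faster).
requests.set_headers([("accept", "application/json")])  # exact signature assumed

response = requests.get("http://httpbin.org/get")                    # URL is always the 1st argument
response = requests.post("http://httpbin.org/post", body="my body")  # HTTP Body via body=

# The functions have no internal try/except, so wrap them yourself for resilient code.
try:
    requests.get("http://invalid.invalid")
except Exception as error:
    print("Request failed:", error)
```

And a sketch of how a `timeit`-based speed measurement might look; the actual benchmark script behind the comparison table is not reproduced here, and the localhost URL is a placeholder for whatever test server you run.

```
# Hypothetical timeit measurement sketch, not the original benchmark.
from timeit import timeit

import faster_than_requests as requests

# Time 100 GET requests against a local test server; timeit() returns elapsed seconds as a float.
print(timeit(lambda: requests.get("http://localhost:8000/"), number=100))
```
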
# Stars
![](https://starchart.cc/juancarlospaco/faster-than-requests.svg "Star faster-than-requests on GitHub!")
:star: [@juancarlospaco](https://github.com/juancarlospaco '2022-02-15')
:star: [@nikitavoloboev](https://github.com/nikitavoloboev '2022-02-15')
:star: [@5u4](https://github.com/5u4 '2022-02-16')
:star: [@CKristensen](https://github.com/CKristensen '2022-02-16')
:star: [@Lips7](https://github.com/Lips7 '2022-02-17')
:star: [@zeandrade](https://github.com/zeandrade '2022-02-17')
:star: [@SoloDevOG](https://github.com/SoloDevOG '2022-02-18')
:star: [@AM-I-Human](https://github.com/AM-I-Human '2022-02-19')
:star: [@pauldevos](https://github.com/pauldevos '2022-02-20')
:star: [@divanovGH](https://github.com/divanovGH '2022-02-21')
:star: [@ali-sajjad-rizavi](https://github.com/ali-sajjad-rizavi '2022-02-23')
:star: [@George2901](https://github.com/George2901 '2022-03-01')
:star: [@jeaps17](https://github.com/jeaps17 '2022-03-01')
:star: [@TeeWallz](https://github.com/TeeWallz '2022-03-04')
:star: [@Shinji-Mimura](https://github.com/Shinji-Mimura '2022-03-18')
:star: [@oozdemir83](https://github.com/oozdemir83 '2022-03-18')
:star: [@aaman007](https://github.com/aaman007 '2022-03-18')
:star: [@dungtq](https://github.com/dungtq '2022-03-19')
:star: [@bonginkosi0607](https://github.com/bonginkosi0607 '2022-03-20')
:star: [@4nzor](https://github.com/4nzor '2022-03-20')
:star: [@CyberLionsNFT](https://github.com/CyberLionsNFT '2022-03-24')
:star: [@Lyapsus](https://github.com/Lyapsus '2022-03-24')
:star: [@boskuv](https://github.com/boskuv '2022-03-28')
:star: [@jckli](https://github.com/jckli '2022-03-28')
:star: [@VitSimon](https://github.com/VitSimon '2022-03-29')
:star: [@zjmdp](https://github.com/zjmdp '2022-03-30')
:star: [@maxclac](https://github.com/maxclac '2022-03-31')
:star: [@krishna2206](https://github.com/krishna2206 '2022-03-31')
:star: [@KhushC-03](https://github.com/KhushC-03 '2022-03-31')
:star: [@nicksnell](https://github.com/nicksnell '2022-04-01')
:star: [@skandix](https://github.com/skandix '2022-04-02')
:star: [@gioleppe](https://github.com/gioleppe '2022-04-08')
:star: [@mvandermeulen](https://github.com/mvandermeulen '2022-04-11')
:star: [@Vexarr](https://github.com/Vexarr '2022-04-13')
:star: [@baajarmeh](https://github.com/baajarmeh '2022-04-13')
:star: [@znel2002](https://github.com/znel2002 '2022-04-13')
:star: [@matkuki](https://github.com/matkuki '2022-04-14')
:star: [@SmartManoj](https://github.com/SmartManoj '2022-04-16')
:star: [@SmartManoj](https://github.com/SmartManoj '2022-04-21')
:star: [@Zardex1337](https://github.com/Zardex1337 '2022-05-08')
:star: [@jocago](https://github.com/jocago '2022-05-09')
:star: [@gnimmel](https://github.com/gnimmel '2022-05-11')
:star: [@sting8k](https://github.com/sting8k '2022-05-13')
:star: [@c4p-n1ck](https://github.com/c4p-n1ck '2022-05-13')
:star: [@w0xi](https://github.com/w0xi '2022-05-15')
:star: [@Bing-su](https://github.com/Bing-su '2022-05-16')
:star: [@AdrianCert](https://github.com/AdrianCert '2022-05-18')
:star: [@asanoop24](https://github.com/asanoop24 '2022-05-18')
:star: [@zaphodfortytwo](https://github.com/zaphodfortytwo '2022-05-19')
:star: [@Ozerioss](https://github.com/Ozerioss '2022-05-20')
:star: [@nkot56297](https://github.com/nkot56297 '2022-05-21')
:star: [@sendyputra](https://github.com/sendyputra '2022-05-22')
:star: [@thedrow](https://github.com/thedrow '2022-05-23')
:star: [@MinhHuyDev](https://github.com/MinhHuyDev '2022-05-24')
:star: [@rapjul](https://github.com/rapjul '2022-05-27')
:star: [@Arthavruksha](https://github.com/Arthavruksha '2022-05-31')
:star: [@xdroff](https://github.com/xdroff '2022-05-31')
:star: [@ShodhArthavruksha](https://github.com/ShodhArthavruksha '2022-06-01')
:star: [@aziddddd](https://github.com/aziddddd '2022-06-01')
:star: [@breezechen](https://github.com/breezechen '2022-06-02')
:star: [@TobeTek](https://github.com/TobeTek '2022-06-07')
:star: [@antonpetrov145](https://github.com/antonpetrov145 '2022-06-09')
:star: [@MartianDominic](https://github.com/MartianDominic '2022-06-14')
:star: [@sk37025](https://github.com/sk37025 '2022-06-15')
:star: [@Alexb309](https://github.com/Alexb309 '2022-06-16')
:star: [@alimp5](https://github.com/alimp5 '2022-06-17')
:star: [@oldhroft](https://github.com/oldhroft '2022-06-25')
:star: [@davidcralph](https://github.com/davidcralph '2022-06-25')
:star: [@rainmana](https://github.com/rainmana '2022-06-25')
:star: [@amitdo](https://github.com/amitdo '2022-07-03')
:star: [@atlassion](https://github.com/atlassion '2022-07-04')
:star: [@cytsai1008](https://github.com/cytsai1008 '2022-07-05')
:star: [@bsouthern](https://github.com/bsouthern '2022-07-07')
:star: [@techrattt](https://github.com/techrattt '2022-07-09')
:star: [@vnicetn](https://github.com/vnicetn '2022-07-11')
:star: [@Perry-xD](https://github.com/Perry-xD '2022-07-12')
:star: [@israelvf](https://github.com/israelvf '2022-07-13')
:star: [@BernardoOlisan](https://github.com/BernardoOlisan '2022-07-14')
:star: [@ZenoMilk12](https://github.com/ZenoMilk12 '2022-07-19')
:star: [@rundef](https://github.com/rundef '2022-07-20')
:star: [@semihyesilyurt](https://github.com/semihyesilyurt '2022-07-20')
:star: [@dunaevv](https://github.com/dunaevv '2022-07-21')
:star: [@mlueckl](https://github.com/mlueckl '2022-07-21')
:star: [@johnrlive](https://github.com/johnrlive '2022-07-24')
:star: [@CrazyBonze](https://github.com/CrazyBonze '2022-07-28')
:star: [@v-JiangNan](https://github.com/v-JiangNan '2022-07-31')
:star: [@justfly50](https://github.com/justfly50 '2022-07-31')
:star: [@mamert](https://github.com/mamert '2022-08-03')
:star: [@ccamateur](https://github.com/ccamateur '2022-08-07')
:star: [@5l1v3r1](https://github.com/5l1v3r1 '2022-08-07')
:star: [@Wykiki](https://github.com/Wykiki '2022-08-08')
:star: [@Kladdkaka](https://github.com/Kladdkaka '2022-08-09')
:star: [@giubaru](https://github.com/giubaru '2022-08-10')
:star: [@eamigo86](https://github.com/eamigo86 '2022-08-12')
:star: [@eadwinCode](https://github.com/eadwinCode '2022-08-12')
:star: [@s4ke](https://github.com/s4ke '2022-08-13')
:star: [@DG4MES](https://github.com/DG4MES '2022-08-14')
:star: [@AlexJupiterian](https://github.com/AlexJupiterian '2022-08-16')
:star: [@baodinhaaa](https://github.com/baodinhaaa '2022-08-19')
:star: [@a-golda](https://github.com/a-golda '2022-08-23')
:star: [@Furkan-izgi](https://github.com/Furkan-izgi '2022-08-25')
:star: [@Abdulvoris101](https://github.com/Abdulvoris101 '2022-08-31')
:star: [@devlenni](https://github.com/devlenni '2022-09-12')
:star: [@kasahh](https://github.com/kasahh '2022-09-15')
:star: [@vishaltanwar96](https://github.com/vishaltanwar96 '2022-09-16')
:star: [@codehangen](https://github.com/codehangen '2022-09-16')
:star: [@svenko99](https://github.com/svenko99 '2022-09-20')
:star: [@kb1900](https://github.com/kb1900 '2022-09-27')
:star: [@lusteroak](https://github.com/lusteroak '2022-10-03')
:star: [@nitheesh-cpu](https://github.com/nitheesh-cpu '2022-10-06')
:star: [@techpixel](https://github.com/techpixel '2022-10-06')
:star: [@tk-iitd](https://github.com/tk-iitd '2022-10-07')
:star: [@smartniz](https://github.com/smartniz '2022-10-13')
:star: [@popcrocks](https://github.com/popcrocks '2022-10-13')
:star: [@senic35](https://github.com/senic35 '2022-10-14')
:star: [@NychetheAwesome](https://github.com/NychetheAwesome '2022-10-17')
:star: [@rafalohaki](https://github.com/rafalohaki '2022-10-17')
:star: [@Danipulok](https://github.com/Danipulok '2022-10-19')
:star: [@lwinkelm](https://github.com/lwinkelm '2022-10-20')
:star: [@sunlei](https://github.com/sunlei '2022-10-21')
:star: [@Minnikeswar](https://github.com/Minnikeswar '2022-10-25')
:star: [@theol-git](https://github.com/theol-git '2022-10-27')
:star: [@Mohammad-Mohsen](https://github.com/Mohammad-Mohsen '2022-10-28')
:star: [@neeksor](https://github.com/neeksor '2022-10-28')
:star: [@0xNev](https://github.com/0xNev '2022-11-01')
:star: [@imvast](https://github.com/imvast '2022-11-02')
:star: [@daweedkob](https://github.com/daweedkob '2022-11-02')
:star: [@Landcruiser87](https://github.com/Landcruiser87 '2022-11-02')
:star: [@kirillzhosul](https://github.com/kirillzhosul '2022-11-02')
:star: [@FurkanEdizkan](https://github.com/FurkanEdizkan '2022-11-03')
:star: [@sodinokibi](https://github.com/sodinokibi '2022-11-04')
:star: [@stepan-zubkov](https://github.com/stepan-zubkov '2022-11-04')
:star: [@Nexlab-One](https://github.com/Nexlab-One '2022-11-05')
:star: [@PApostol](https://github.com/PApostol '2022-11-06')
:star: [@callmeAsadUllah](https://github.com/callmeAsadUllah '2022-11-12')
:star: [@jaredv1](https://github.com/jaredv1 '2022-11-13')
:star: [@Goblin80](https://github.com/Goblin80 '2022-11-14')
:star: [@michikxd](https://github.com/michikxd '2022-11-17')
:star: [@babywyrm](https://github.com/babywyrm '2022-11-19')
:star: [@MooneDrJune](https://github.com/MooneDrJune '2022-11-21')
:star: [@grknbyk](https://github.com/grknbyk '2022-11-24')
:star: [@francomerendan](https://github.com/francomerendan '2022-11-24')
:star: [@noudin-ledger](https://github.com/noudin-ledger '2022-11-24')
:star: [@chip-felton-montage](https://github.com/chip-felton-montage '2022-11-24')
:star: [@Ruddy35](https://github.com/Ruddy35 '2022-11-25')
:star: [@xilicode](https://github.com/xilicode '2022-11-25')
:star: [@BrianTurza](https://github.com/BrianTurza '2022-11-25')
:star: [@oguh43](https://github.com/oguh43 '2022-11-26')
:star: [@oyoxo](https://github.com/oyoxo '2022-11-26')
:star: [@encoreshao](https://github.com/encoreshao '2022-11-27')
:star: [@peter279k](https://github.com/peter279k '2022-11-28')
:star: [@xalien10](https://github.com/xalien10 '2022-11-28')
:star: [@DahnJ](https://github.com/DahnJ '2022-11-29')
:star: [@ld909](https://github.com/ld909 '2022-11-30')
:star: [@lafabo](https://github.com/lafabo '2022-11-30')
:star: [@AndrewGPU](https://github.com/AndrewGPU '2022-12-04')
:star: [@jerheff](https://github.com/jerheff '2022-12-05')
:star: [@wtv-piyush](https://github.com/wtv-piyush '2022-12-05')
:star: [@themorya](https://github.com/themorya '2022-12-05')
:star: [@frank-cq](https://github.com/frank-cq '2022-12-05')
:star: [@sunilravilla](https://github.com/sunilravilla '2022-12-09')
:star: [@root-11](https://github.com/root-11 '2022-12-12')
:star: [@BrianPugh](https://github.com/BrianPugh '2022-12-12')
:star: [@kioenguyen0207](https://github.com/kioenguyen0207 '2022-12-13')
:star: [@dotpmrcunha](https://github.com/dotpmrcunha '2022-12-16')
:star: [@eldar-mustafayev](https://github.com/eldar-mustafayev '2022-12-18')
:star: [@krishna2206](https://github.com/krishna2206 '2022-12-18')
:star: [@mrkprdo](https://github.com/mrkprdo '2022-12-28')
:star: [@pythoninthegrass](https://github.com/pythoninthegrass '2023-01-02')
:star: [@jaysontran-novobi](https://github.com/jaysontran-novobi '2023-01-02')
:star: [@eleqtrizit](https://github.com/eleqtrizit '2023-01-03')
:star: [@BlackNurse](https://github.com/BlackNurse '2023-01-03')
:star: [@rruimor](https://github.com/rruimor '2023-01-03')
:star: [@gosoxharp](https://github.com/gosoxharp '2023-01-07')
:star: [@scripthon](https://github.com/scripthon '2023-01-08')
:star: [@WillianFF](https://github.com/WillianFF '2023-01-08')
:star: [@yashprakash13](https://github.com/yashprakash13 '2023-01-09')
:star: [@rhyd0n](https://github.com/rhyd0n '2023-01-09')
:star: [@meet-ai](https://github.com/meet-ai '2023-01-11')
:star: [@Cremeux](https://github.com/Cremeux '2023-01-11')
:star: [@hawk-roman-rey](https://github.com/hawk-roman-rey '2023-01-12')
:star: [@OldMidnight](https://github.com/OldMidnight '2023-01-15')
:star: [@christos-bsq](https://github.com/christos-bsq '2023-01-15')
:star: [@Xenia101](https://github.com/Xenia101 '2023-01-17')
:star: [@beholders-eye](https://github.com/beholders-eye '2023-01-17')
:star: [@lectinua](https://github.com/lectinua '2023-01-19')
:star: [@pietroppeter](https://github.com/pietroppeter '2023-01-20')
:star: [@linwaytin](https://github.com/linwaytin '2023-01-22')
:star: [@0x13A0F](https://github.com/0x13A0F '2023-01-25')
:star: [@ocervell](https://github.com/ocervell '2023-01-25')
:star: [@ZhymabekRoman](https://github.com/ZhymabekRoman '2023-01-26')
:star: [@kabartay](https://github.com/kabartay '2023-01-26')
:star: [@tkanhe](https://github.com/tkanhe '2023-01-28')
:star: [@yudelevi](https://github.com/yudelevi '2023-01-30')
:star: [@kantarcise](https://github.com/kantarcise '2023-02-01')
:star: [@fernando-aristizabal](https://github.com/fernando-aristizabal '2023-02-02')
:star: [@BrianNTang](https://github.com/BrianNTang '2023-02-06')
:star: [@asen-mitov](https://github.com/asen-mitov '2023-02-08')
:star: [@mrgick](https://github.com/mrgick '2023-02-12')
:star: [@jpmanson](https://github.com/jpmanson '2023-02-13')
:star: [@thvadora](https://github.com/thvadora '2023-02-14')
:star: [@jramosss](https://github.com/jramosss '2023-02-14')
:star: [@AleksandrUsolcev](https://github.com/AleksandrUsolcev '2023-02-15')
:star: [@eantho](https://github.com/eantho '2023-02-26')
:star: [@RodrigoTorresWeb](https://github.com/RodrigoTorresWeb '2023-03-04')
:star: [@mighty-phoenix](https://github.com/mighty-phoenix '2023-03-05')
:star: [@DSDanielPark](https://github.com/DSDanielPark '2023-03-05')
:star: [@0xHaris](https://github.com/0xHaris '2023-03-08')
:star: [@watchakorn-18k](https://github.com/watchakorn-18k '2023-03-14')
:star: [@enesalsat](https://github.com/enesalsat '2023-03-15')
:star: [@eal](https://github.com/eal '2023-03-16')
:star: [@abhie-lp](https://github.com/abhie-lp '2023-03-17')
:star: [@martinskou](https://github.com/martinskou '2023-03-17')
:star: [@SHjalilo](https://github.com/SHjalilo '2023-03-19')
:star: [@chrisgoddard](https://github.com/chrisgoddard '2023-03-19')
:star: [@OuMal](https://github.com/OuMal '2023-03-23')
:star: [@dylanosaur](https://github.com/dylanosaur '2023-03-29')
:star: [@Seirdy](https://github.com/Seirdy '2023-04-01')
:star: [@mix-protocol](https://github.com/mix-protocol '2023-04-01')
:star: [@hoangthanh168](https://github.com/hoangthanh168 '2023-04-03')
:star: [@LaffX](https://github.com/LaffX '2023-04-06')
:star: [@Kanchangk](https://github.com/Kanchangk '2023-04-06')
:star: [@epistemicjanitor](https://github.com/epistemicjanitor '2023-04-13')
:star: [@danielsalvador](https://github.com/danielsalvador '2023-04-13')
:star: [@Utone](https://github.com/Utone '2023-04-15')
:star: [@andimahathir](https://github.com/andimahathir '2023-04-20')
:star: [@alexvivanov](https://github.com/alexvivanov '2023-05-01')
:star: [@resort-io](https://github.com/resort-io '2023-05-01')
:star: [@itschasa](https://github.com/itschasa '2023-05-02')
:star: [@Kensingtonn](https://github.com/Kensingtonn '2023-05-02')
:star: [@christopherhall1815](https://github.com/christopherhall1815 '2023-05-04')
:star: [@edwardlopez0311](https://github.com/edwardlopez0311 '2023-05-08')
:star: [@XMVZ](https://github.com/XMVZ '2023-05-08')
:star: [@robd003](https://github.com/robd003 '2023-05-09')
:star: [@Tatsh](https://github.com/Tatsh '2023-05-09')
:star: [@movrsi](https://github.com/movrsi '2023-05-09')
:star: [@rachen](https://github.com/rachen '2023-05-10')
:star: [@DavideGalilei](https://github.com/DavideGalilei '2023-05-14')
:star: [@linusheinz](https://github.com/linusheinz '2023-05-14')
:star: [@levushakov](https://github.com/levushakov '2023-05-15')
:star: [@zerintg](https://github.com/zerintg '2023-05-16')
:star: [@arynyklas](https://github.com/arynyklas '2023-05-16')
:star: [@infamix](https://github.com/infamix '2023-05-16')
:star: [@aspekts](https://github.com/aspekts '2023-05-19')
:star: [@shahabejaz](https://github.com/shahabejaz '2023-05-20')
:star: [@vovavili](https://github.com/vovavili '2023-05-21')
:star: [@FingonCelebrindal](https://github.com/FingonCelebrindal '2023-05-22')
:star: [@ysnbogt](https://github.com/ysnbogt '2023-05-22')
:star: [@Athroniaeth](https://github.com/Athroniaeth '2023-05-23')
:star: [@hansalemaos](https://github.com/hansalemaos '2023-05-26')
:star: [@xTrayambak](https://github.com/xTrayambak '2023-05-28')
:star: [@extrememicro](https://github.com/extrememicro '2023-05-31')
:star: [@Pineapple217](https://github.com/Pineapple217 '2023-06-09')
:star: [@PiyushDixit96](https://github.com/PiyushDixit96 '2023-06-10')
:star: [@NateShoffner](https://github.com/NateShoffner '2023-06-11')
:star: [@noe1sanji](https://github.com/noe1sanji '2023-06-18')
:star: [@bsljth](https://github.com/bsljth '2023-06-23')
:star: [@hellogeraldblah](https://github.com/hellogeraldblah '2023-06-26')
:star: [@ande128](https://github.com/ande128 '2023-06-27')
:star: [@bsljth](https://github.com/bsljth '2023-06-30')
:star: [@Murplugg](https://github.com/Murplugg '2023-07-01')
:star: [@juguerre](https://github.com/juguerre '2023-07-02')
:star: [@ande128](https://github.com/ande128 '2023-07-03')
:star: [@atksh](https://github.com/atksh '2023-07-05')
:star: [@k-kawakami213](https://github.com/k-kawakami213 '2023-07-05')
:star: [@fredlarochelle](https://github.com/fredlarochelle '2023-07-05')
:star: [@emekeh](https://github.com/emekeh '2023-07-06')
:star: [@KNipunDananjaya](https://github.com/KNipunDananjaya '2023-07-07')
:star: [@cryoz](https://github.com/cryoz '2023-07-14')
:star: [@weztask](https://github.com/weztask '2023-07-17')
:star: [@retouching](https://github.com/retouching '2023-07-17')
:star: [@degaur](https://github.com/degaur '2023-07-19')
:star: [@testtsubscribe](https://github.com/testtsubscribe '2023-07-22')
:star: [@catsuns](https://github.com/catsuns '2023-07-22')
:star: [@Leiiib](https://github.com/Leiiib '2023-07-24')
:star: [@mqt2r](https://github.com/mqt2r '2023-07-28')
:star: [@solarsbeans](https://github.com/solarsbeans '2023-08-01')
:star: [@F1ashhimself](https://github.com/F1ashhimself '2023-08-02')
:star: [@IsakWesterlund](https://github.com/IsakWesterlund '2023-08-08')
:star: [@Mr-Jef](https://github.com/Mr-Jef '2023-08-10')
:star: [@ghbook](https://github.com/ghbook '2023-08-11')
:star: [@zyr1on](https://github.com/zyr1on '2023-08-11')
:star: [@goznauk](https://github.com/goznauk '2023-08-16')
:star: [@scmanjarrez](https://github.com/scmanjarrez '2023-08-17')
:star: [@NiYeh](https://github.com/NiYeh '2023-08-18')
:star: [@Sepehr0011](https://github.com/Sepehr0011 '2023-08-21')
:star: [@JarvanLei](https://github.com/JarvanLei '2023-08-22')
:star: [@kp6c6p6c](https://github.com/kp6c6p6c '2023-08-23')
:star: [@DrayChou](https://github.com/DrayChou '2023-08-24')
:star: [@szmyty](https://github.com/szmyty '2023-08-27')
:star: [@f1refa11](https://github.com/f1refa11 '2023-08-28')
:star: [@ddzzhen](https://github.com/ddzzhen '2023-08-29')
:star: [@flrngel](https://github.com/flrngel '2023-08-30')
:star: [@Sergimayol](https://github.com/Sergimayol '2023-09-01')
:star: [@Redrrx](https://github.com/Redrrx '2023-09-05')
:star: [@ilyazub](https://github.com/ilyazub '2023-09-19')
:star: [@AlskaPark](https://github.com/AlskaPark '2023-09-20')
:star: [@louyongjiu](https://github.com/louyongjiu '2023-09-22')
:star: [@adbforlife](https://github.com/adbforlife '2023-09-23')
:star: [@rahulmr](https://github.com/rahulmr '2023-10-03')
:star: [@soyandrey](https://github.com/soyandrey '2023-10-04')
:star: [@malaVydra](https://github.com/malaVydra '2023-10-07')
:star: [@RookCube](https://github.com/RookCube '2023-10-08')
:star: [@xjzh123](https://github.com/xjzh123 '2023-10-11')
:star: [@Ruy-Araujo](https://github.com/Ruy-Araujo '2023-10-15')
:star: [@vldkhramtsov](https://github.com/vldkhramtsov '2023-10-16')
:star: [@watsonhaw5566](https://github.com/watsonhaw5566 '2023-10-22')
:star: [@kumchick2055](https://github.com/kumchick2055 '2023-10-24')
:star: [@Stallon-niranjan](https://github.com/Stallon-niranjan '2023-10-30')
:star: [@devshjeon](https://github.com/devshjeon '2023-11-02')
:star: [@euchaliptus](https://github.com/euchaliptus '2023-11-06')
:star: [@S75XD](https://github.com/S75XD '2023-11-08')
:star: [@gster](https://github.com/gster '2023-11-17')
:star: [@QuanDeppChaii](https://github.com/QuanDeppChaii '2023-11-17')
:star: [@SevenworksDev](https://github.com/SevenworksDev '2023-11-18')
:star: [@cframe1337](https://github.com/cframe1337 '2023-11-19')
:star: [@Mdevpro78](https://github.com/Mdevpro78 '2023-11-25')
:star: [@ging-dev](https://github.com/ging-dev '2023-11-27')
:star: [@SYFH](https://github.com/SYFH '2023-11-27')
:star: [@Vchase-7047](https://github.com/Vchase-7047 '2023-11-27')
:star: [@ZhReimu](https://github.com/ZhReimu '2023-11-27')
:star: [@elvishoo](https://github.com/elvishoo '2023-11-27')
:star: [@KeepCodeing](https://github.com/KeepCodeing '2023-11-27')
:star: [@Fumeze](https://github.com/Fumeze '2023-11-27')
:star: [@JerryLiao26](https://github.com/JerryLiao26 '2023-11-27')
:star: [@ashrafnezhad-hamidreza](https://github.com/ashrafnezhad-hamidreza '2023-11-27')
:star: [@Lofimit](https://github.com/Lofimit '2023-11-28')
:star: [@shaohaiyang](https://github.com/shaohaiyang '2023-11-30')
:star: [@668](https://github.com/668 '2023-11-30')
:star: [@baby0o01999](https://github.com/baby0o01999 '2023-12-04')
:star: [@lightcax](https://github.com/lightcax '2023-12-08')
:star: [@gister9000](https://github.com/gister9000 '2023-12-09')
:star: [@BiltuDas1](https://github.com/BiltuDas1 '2023-12-18')
:star: [@ttycelery](https://github.com/ttycelery '2023-12-21')
:star: [@raihan-faza](https://github.com/raihan-faza '2023-12-21')
:star: [@hiiruki](https://github.com/hiiruki '2023-12-21')
:star: [@MhankBarBar](https://github.com/MhankBarBar '2023-12-21')
:star: [@MujurID](https://github.com/MujurID '2023-12-22')
:star: [@tommy-ca](https://github.com/tommy-ca '2023-12-26')
:star: [@nichdemos](https://github.com/nichdemos '2023-12-31')
:star: [@CheeseTurtle](https://github.com/CheeseTurtle '2023-12-31')
:star: [@sleepingcat4](https://github.com/sleepingcat4 '2024-01-04')
:star: [@bmouler](https://github.com/bmouler '2024-01-04')
:star: [@MaxMorais](https://github.com/MaxMorais '2024-01-06')
:star: [@juzt3](https://github.com/juzt3 '2024-01-12')
:star: [@dotsource](https://github.com/dotsource '2024-01-12')
:star: [@abdalrahman-saqr](https://github.com/abdalrahman-saqr '2024-01-14')
:star: [@AlexZotikov](https://github.com/AlexZotikov '2024-01-19')
:star: [@Chaunice](https://github.com/Chaunice '2024-01-21')
:star: [@hewhocannotbetamed](https://github.com/hewhocannotbetamed '2024-01-21')
:star: [@xiaojiujiuY9](https://github.com/xiaojiujiuY9 '2024-01-26')
:star: [@vvanglro](https://github.com/vvanglro '2024-01-26')
:star: [@KadeWuVungle](https://github.com/KadeWuVungle '2024-01-26')
:star: [@CC1001001](https://github.com/CC1001001 '2024-01-26')
:star: [@akeshmiri](https://github.com/akeshmiri '2024-01-27')
:star: [@visualrobots](https://github.com/visualrobots '2024-01-28')
:star: [@vypivshiy](https://github.com/vypivshiy '2024-01-29')
:star: [@ttrzcinski](https://github.com/ttrzcinski '2024-02-03')
:star: [@FastFingertips](https://github.com/FastFingertips '2024-02-09')
:star: [@christianmalek](https://github.com/christianmalek '2024-02-12')
:star: [@simms21](https://github.com/simms21 '2024-02-18')
:star: [@cenviity](https://github.com/cenviity '2024-02-19')
:star: [@iLollek](https://github.com/iLollek '2024-02-22')
:star: [@ChemicalNRG](https://github.com/ChemicalNRG '2024-02-26')
:star: [@Ehsan-U](https://github.com/Ehsan-U '2024-02-28')
:star: [@hosven](https://github.com/hosven '2024-03-07')
:star: [@JosepHyv](https://github.com/JosepHyv '2024-03-11')
:star: [@luczay](https://github.com/luczay '2024-03-22')
:star: [@Moat6](https://github.com/Moat6 '2024-03-27')
:star: [@jydxkj](https://github.com/jydxkj '2024-03-28')
:star: [@ivanrvpereira](https://github.com/ivanrvpereira '2024-04-08')
:star: [@TxQISchokEZz](https://github.com/TxQISchokEZz '2024-04-09')
:star: [@kerrycobb](https://github.com/kerrycobb '2024-04-11')
:star: [@lucasnuic](https://github.com/lucasnuic '2024-04-14')
:star: [@lyenliang](https://github.com/lyenliang '2024-04-17')
:star: [@danila-panteleev](https://github.com/danila-panteleev '2024-04-27')
:star: [@saarsil](https://github.com/saarsil '2024-04-28')
:star: [@amitness](https://github.com/amitness '2024-05-06')
:star: [@LaxmanSinghTomar](https://github.com/LaxmanSinghTomar '2024-05-06')
:star: [@pawanpaudel93](https://github.com/pawanpaudel93 '2024-05-06')
:star: [@dcyoung](https://github.com/dcyoung '2024-05-06')
:star: [@divyamani1](https://github.com/divyamani1 '2024-05-06')
:star: [@manzak](https://github.com/manzak '2024-05-13')
:star: [@dddyom](https://github.com/dddyom '2024-05-13')
:star: [@zwiebelslayer](https://github.com/zwiebelslayer '2024-05-14')
:star: [@WangWei90](https://github.com/WangWei90 '2024-05-18')
:star: [@bpenedo](https://github.com/bpenedo '2024-05-26')
:star: [@GhostYiL](https://github.com/GhostYiL '2024-05-28')
:star: [@Darkcast](https://github.com/Darkcast '2024-06-05')
:star: [@akasakaid](https://github.com/akasakaid '2024-06-07')
:star: [@rhysdg](https://github.com/rhysdg '2024-06-07')
:star: [@ahmadmfani](https://github.com/ahmadmfani '2024-06-07')
:star: [@naufaljct48](https://github.com/naufaljct48 '2024-06-07')
:star: [@amiune](https://github.com/amiune '2024-06-08')
:star: [@marylnrose](https://github.com/marylnrose '2024-06-08')
:star: [@himanshu076](https://github.com/himanshu076 '2024-06-09')
:star: [@herlangga72](https://github.com/herlangga72 '2024-06-10')
:star: [@SamDc73](https://github.com/SamDc73 '2024-06-11')
:star: [@ewnprn112](https://github.com/ewnprn112 '2024-06-20')
:star: [@rainmanzzz](https://github.com/rainmanzzz '2024-07-02')
:star: [@ranjian0](https://github.com/ranjian0 '2024-07-02')
:star: [@joelvaneenwyk](https://github.com/joelvaneenwyk '2024-07-11')
:star: [@shalahu](https://github.com/shalahu '2024-07-27')
:star: [@warezit](https://github.com/warezit '2024-08-01')
:star: [@firasuke](https://github.com/firasuke '2024-08-04')
:star: [@bit-code](https://github.com/bit-code '2024-08-14')
:star: [@beingDave](https://github.com/beingDave '2024-08-19')
:star: [@rekayno](https://github.com/rekayno '2024-08-23')
:star: [@EventGamer67](https://github.com/EventGamer67 '2024-08-24')
:star: [@arthurauffray](https://github.com/arthurauffray '2024-09-02')
:star: [@richardadonnell](https://github.com/richardadonnell '2024-09-08')
:star: [@cOborski](https://github.com/cOborski '2024-09-10')
:star: [@momvov](https://github.com/momvov '2024-09-19')
:star: [@M-logique](https://github.com/M-logique '2024-10-10')
:star: [@mohammad87115](https://github.com/mohammad87115 '2024-10-11')
:star: [@Vorrik](https://github.com/Vorrik '2024-10-25')
:star: [@aleemahmed96](https://github.com/aleemahmed96 '2024-10-30')
:star: [@rikistan45](https://github.com/rikistan45 '2024-11-14')
:star: [@KingJem](https://github.com/KingJem '2024-12-02')
:star: [@antilagg](https://github.com/antilagg '2024-12-06')
:star: [@0MeMo07](https://github.com/0MeMo07 '2024-12-11')