Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/lukeyeager/bounded_async_executor
Wrapper for concurrent.futures.{Thread,Process}PoolExecutor
- Host: GitHub
- URL: https://github.com/lukeyeager/bounded_async_executor
- Owner: lukeyeager
- License: mit
- Created: 2018-03-09T20:46:11.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2018-04-03T17:43:00.000Z (almost 7 years ago)
- Last Synced: 2024-10-29T18:45:47.373Z (3 months ago)
- Topics: python3, threading
- Language: Python
- Homepage:
- Size: 12.7 KB
- Stars: 5
- Watchers: 3
- Forks: 0
- Open Issues: 1
- Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# bounded_async_executor
[![PyPI version](https://badge.fury.io/py/bounded-async-executor.svg)](https://pypi.python.org/pypi/bounded-async-executor)
[![Build Status](https://travis-ci.org/lukeyeager/bounded_async_executor.svg?branch=master)](https://travis-ci.org/lukeyeager/bounded_async_executor)

I found myself writing too much code that looked like this:
```python
import concurrent.futures

def download_urls(urls):
    downloaded = 0
    # Use concurrent.futures to create a pool of worker threads
    with concurrent.futures.ThreadPoolExecutor() as executor:
        futures = set()
        for url in urls:
            # Create a future for each url
            futures.add(executor.submit(download_url, url))
            # Bound the results so that `futures` doesn't take up too much memory
            while len(futures) >= 1000:
                done, futures = concurrent.futures.wait(futures, return_when=concurrent.futures.FIRST_COMPLETED)
                for future in done:
                    try:
                        future.result()
                        downloaded += 1
                    except Exception as e:
                        print(e)
        # Process the remaining futures
        for future in concurrent.futures.as_completed(futures):
            try:
                future.result()
                downloaded += 1
            except Exception as e:
                print(e)
    print('Downloaded {} files successfully.'.format(downloaded))
```

So, I wrote a library to abstract away much of that complexity:
```python
import bounded_async_executor

def download_urls(urls):
    downloaded = 0

    def on_success(result):
        nonlocal downloaded
        downloaded += 1

    def on_error(error):
        print(error)

    with bounded_async_executor.Executor(download_url, on_success, on_error) as executor:
        for url in urls:
            executor.add(url)

    print('Downloaded {} files successfully.'.format(downloaded))
```
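
Both snippets assume a `download_url` callable that the README never shows. A minimal sketch using only the standard library might look like this (the function name comes from the examples above; the destination directory and filename logic are illustrative assumptions):

```python
import os
import urllib.request

def download_url(url, dest_dir='downloads'):
    # Hypothetical helper assumed by the examples above: fetch one URL
    # and save it to a local file, raising on any HTTP or I/O error so
    # the caller (or the executor's error callback) can report it.
    os.makedirs(dest_dir, exist_ok=True)
    filename = os.path.join(dest_dir, url.rstrip('/').split('/')[-1] or 'index.html')
    with urllib.request.urlopen(url) as response, open(filename, 'wb') as f:
        f.write(response.read())
    return filename
```

Given the `on_success`/`on_error` callbacks above, any callable that returns normally on success and raises on failure should slot in the same way.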