Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/testdrivenio/simple-task-queue
Asynchronous task queues using Python's multiprocessing library
- Host: GitHub
- URL: https://github.com/testdrivenio/simple-task-queue
- Owner: testdrivenio
- License: mit
- Created: 2018-03-29T22:24:16.000Z (almost 7 years ago)
- Default Branch: main
- Last Pushed: 2023-06-21T12:06:39.000Z (over 1 year ago)
- Last Synced: 2024-08-07T23:56:11.419Z (5 months ago)
- Topics: multiprocessing, python, redis, task-queue
- Language: Python
- Homepage:
- Size: 801 KB
- Stars: 65
- Watchers: 11
- Forks: 35
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Asynchronous Task Queues in Python
Several implementations of asynchronous task queues in Python using the multiprocessing library and Redis.
> Blog post: [Developing an Asynchronous Task Queue in Python](http://testdriven.io/developing-an-asynchronous-task-queue-in-python)
## Setup
1. Fork/Clone
1. Create and activate a virtual environment
1. Install the dependencies
1. Enter the Python shell and download the NLTK `stopwords` [corpus](https://www.nltk.org/data.html):
```sh
>>> import nltk
>>> nltk.download('stopwords')
[nltk_data] Downloading package stopwords to
[nltk_data] /Users/michael.herman/nltk_data...
[nltk_data] Unzipping corpora/stopwords.zip.
True
```

## Examples
Multiprocessing Pool:
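
A minimal, hypothetical sketch of the Pool-based pattern (the repository's `simple_pool.py` may be structured differently):

```python
# Illustrative sketch only -- the repo's simple_pool.py may differ.
# A multiprocessing.Pool distributes a CPU-bound function across worker processes.
import multiprocessing


def process_text(text):
    # Stand-in for real work (e.g., word counting after stopword removal).
    return len(text.split())


if __name__ == "__main__":
    samples = ["a short sample", "another slightly longer sample text"]
    with multiprocessing.Pool(processes=4) as pool:
        results = pool.map(process_text, samples)
    print(results)
```

The repository's version is run with: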
```sh
$ python simple_pool.py
```

Multiprocessing Queue:
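
A rough sketch of the queue-based pattern, again only illustrative (`simple_queue.py` and `simple_task_queue.py` may be organized differently):

```python
# Illustrative sketch only. Tasks go onto a multiprocessing.Queue; worker
# processes pull and run them until they see a sentinel value.
import multiprocessing


def worker(tasks):
    while True:
        text = tasks.get()
        if text is None:  # sentinel: no more work
            break
        print(f"processed {text!r}: {len(text.split())} words")


if __name__ == "__main__":
    tasks = multiprocessing.Queue()
    processes = [
        multiprocessing.Process(target=worker, args=(tasks,)) for _ in range(4)
    ]
    for p in processes:
        p.start()
    for text in ["one sample", "another sample text", "a third"]:
        tasks.put(text)
    for _ in processes:
        tasks.put(None)  # one sentinel per worker
    for p in processes:
        p.join()
```

Run the repository's versions with: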
```sh
$ python simple_queue.py
$ python simple_task_queue.py
```

Logging to a single file:
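
One hypothetical way to log from worker processes to a single shared file (not necessarily how `simple_task_queue_logging.py` handles it):

```python
# Hypothetical sketch: every worker writes to the same process.log file via
# the multiprocessing logger.
import logging
import multiprocessing


def create_logger():
    logger = multiprocessing.get_logger()
    logger.setLevel(logging.INFO)
    if not logger.handlers:  # avoid adding duplicate handlers per process
        handler = logging.FileHandler("process.log")
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(processName)s %(message)s")
        )
        logger.addHandler(handler)
    return logger


def process_text(text):
    create_logger().info("processing %r", text)


if __name__ == "__main__":
    with multiprocessing.Pool(processes=4) as pool:
        pool.map(process_text, ["one", "two", "three", "four"])
```

The repository's implementation is run with: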
```sh
$ python simple_task_queue_logging.py
```

Logging to separate files:
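
A hypothetical variant that gives each worker process its own log file (`simple_task_queue_logging_separate_files.py` may take a different approach):

```python
# Hypothetical sketch: one log file per worker process, keyed by process name.
import logging
import multiprocessing


def create_logger():
    name = multiprocessing.current_process().name
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    if not logger.handlers:  # one FileHandler per worker process
        logger.addHandler(logging.FileHandler(f"{name}.log"))
    return logger


def process_text(text):
    create_logger().info("processing %r", text)


if __name__ == "__main__":
    with multiprocessing.Pool(processes=4) as pool:
        pool.map(process_text, ["one", "two", "three", "four"])
```

Run the repository's version with: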
```sh
$ python simple_task_queue_logging_separate_files.py
```

Redis:
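
A very rough sketch of a Redis-backed queue using redis-py (an assumption for illustration; `redis_queue.py` may use different data structures or serialization):

```python
# Illustrative only: producers RPUSH JSON-encoded tasks onto a Redis list;
# workers BLPOP and execute them.
import json

import redis


def enqueue(conn, queue_name, payload):
    conn.rpush(queue_name, json.dumps(payload))


def run_worker(conn, queue_name):
    while True:
        _key, raw = conn.blpop(queue_name)  # blocks until a task arrives
        task = json.loads(raw)
        print(f"processing {task}")


if __name__ == "__main__":
    conn = redis.Redis(host="localhost", port=6379, db=0)
    enqueue(conn, "tasks", {"text": "a small sample"})
    # run_worker(conn, "tasks")  # typically run in a separate worker process
```

The repository's script is run with: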
```sh
$ python redis_queue.py
```