Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
qw (QueueWorker) - python library for processing a redis list as a work queue
https://github.com/brettlangdon/qw
Last synced: 3 months ago
- Host: GitHub
- URL: https://github.com/brettlangdon/qw
- Owner: brettlangdon
- License: mit
- Created: 2014-10-09T13:05:27.000Z (about 10 years ago)
- Default Branch: master
- Last Pushed: 2023-03-31T14:25:53.000Z (over 1 year ago)
- Last Synced: 2024-09-14T03:34:28.130Z (3 months ago)
- Language: Python
- Size: 17.6 KB
- Stars: 7
- Watchers: 2
- Forks: 2
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
README
qw
==

qw (or QueueWorker) is used to run worker processes which listen on a redis list for jobs to process.
## Setup
### pip

`pip install qw`
### git
```
git clone git://github.com/brettlangdon/qw.git
cd ./qw
python setup.py install
```

## Design
### Manager
The manager is simply a process manager. Its job is to start/stop worker sub-processes.

### Worker
The workers are processes which sit and listen for jobs on a few queues and then process
those jobs.

### Target
The worker/manager take a `target` which can be either a function or a string (importable function).

```python
def target(job_id, job_data):
    pass

manager = Manager(target)
# OR
manager = Manager('__main__.target')
```
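When the target is given as a string such as `'__main__.target'`, it has to be resolved back to a callable at runtime. This is not qw's actual code, just a minimal sketch of how a dotted-path string can be resolved:

```python
import importlib

def resolve_target(target):
    """Return a callable: pass functions through, import dotted strings."""
    if callable(target):
        return target
    # Split 'package.module.func' into the module path and attribute name.
    module_path, _, func_name = target.rpartition(".")
    module = importlib.import_module(module_path)
    return getattr(module, func_name)
```

For example, `resolve_target("os.path.join")` imports `os.path` and returns its `join` function, while a plain callable passes straight through.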
### Queues
There are a few different queues that are used. The job queues are just redis lists, manager/worker lists are sets, and jobs are hashes.

A worker picks up a job from either `all:jobs`, `<manager_name>:jobs`, or `<worker_name>:jobs`, pulls the corresponding `job:<job_id>` key, and processes it with the provided `target`. After processing it will then remove the `job:<job_id>` key as well as the job id from the `<worker_name>:jobs` queue.

* `all:managers` - a set of all managers
* `all:jobs` - a queue that all workers can pull jobs from, the values are just the job ids
* `job:<job_id>` - a hash of the job data
* `<manager_name>:workers` - a set of all workers belonging to a given manager
* `<manager_name>:jobs` - a queue of jobs for a specific manager, workers will try to pull from here before `all:jobs`, the values are just the job ids
* `<worker_name>:jobs` - a queue of jobs for a specific worker, meant as an in-progress queue for each worker; workers pull jobs into this queue from either `<manager_name>:jobs` or `all:jobs`, the values are just the job ids

### Results
`qw` Workers make no assumptions about the result of processing each job. They do not place finished jobs into a queue or database
or anything else. Once a job has been successfully processed by a Worker, that job is removed completely from queues and from redis.
The worker itself must properly store the results into a finished queue or database if that is what is required.

## Basic Usage
```python
from qw.manager import Manager

def job_printer(job_id, job_data):
    print(job_id)
    print(job_data)

manager = Manager(job_printer)
manager.start()
manager.join()
```

## API
### Manager(object)
* `__init__(self, target, host="localhost", port=6379, db=0, num_workers=None, name=None)`
* `start(self)`
* `stop(self)`
* `join(self)`

### Worker(multiprocess.Process)
* `__init__(self, client, target, manager_name=None, timeout=10)`
* `run(self)`
* `shutdown(self)`

### Client(redis.StrictRedis)
* `__init__(self, host="localhost", port=6379, db=0)`
* `register_manager(self, name)`
* `deregister_manager(self, name)`
* `register_worker(self, manager, name)`
* `deregister_worker(self, manager, name)`
* `queue_job(self, job_data, manager=None, worker=None)`
* `fetch_next_job(self, manager, worker, timeout=10)`
* `finish_job(self, job_id, worker_name)`
* `get_all_managers(self)`
* `get_manager_workers(self, manager_name)`
* `get_worker_pending_jobs(self, worker_name)`
* `get_manager_queued_jobs(self, manager_name)`
* `get_all_queued_jobs(self)`
* `get_all_pending_jobs(self)`

## CLI Tools
### qw-manager
The `qw-manager` tool is used to start a new manager process with the provided `target` string, which gets run
for every job processed by a worker.
```
$ qw-manager --help
Usage:
  qw-manager [--level=<level>] [--workers=<num>] [--name=<name>] [--host=<host>] [--port=<port>] [--db=<db>]
  qw-manager (--help | --version)

Options:
  --help                Show this help message
  --version             Show version information
  -l --level=<level>    Set the log level (debug,info,warn,error) [default: info]
  -w --workers=<num>    Set the number of workers to start, defaults to number of cpus
  -n --name=<name>      Set the manager name, defaults to hostname
  -h --host=<host>      Set the redis host to use [default: localhost]
  -p --port=<port>      Set the redis port to use [default: 6379]
  -d --db=<db>          Set the redis db number to use [default: 0]
```
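The two computed defaults above (`--workers` and `--name`) come from values the standard library exposes. Assuming qw uses the conventional sources, they can be looked up like this:

```python
import multiprocessing
import socket

# --workers default: one worker per CPU core.
default_workers = multiprocessing.cpu_count()

# --name default: the machine's hostname.
default_name = socket.gethostname()
```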
### qw-client
The `qw-client` command is useful to look at basic stats of running managers, workers and job queues
as well as to push json data in the form of a string or a file to the main queue or a manager specific queue.
```
$ qw-client --help
Usage:
  qw-client [--host=<host>] [--port=<port>] [--db=<db>] managers
  qw-client [--host=<host>] [--port=<port>] [--db=<db>] workers [<manager>]
  qw-client [--host=<host>] [--port=<port>] [--db=<db>] jobs [<manager>]
  qw-client [--host=<host>] [--port=<port>] [--db=<db>] queue string <data> [<manager>]
  qw-client [--host=<host>] [--port=<port>] [--db=<db>] queue file <file> [<manager>]
  qw-client (--help | --version)

Options:
  --help            Show this help message
  --version         Show version information
  -h --host=<host>  Set the redis host to use [default: localhost]
  -p --port=<port>  Set the redis port to use [default: 6379]
  -d --db=<db>      Set the redis db number to use [default: 0]
```
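Putting the design together, the queue layout described above can be exercised without a running redis server. The sketch below is not qw's implementation; it models the redis lists and hashes as plain Python structures (the helper names `queue_job` here and `work_one` are illustrative, not qw's API) to show one job's lifecycle:

```python
import uuid

# Stand-ins for the redis structures described under Design/Queues.
store = {
    "all:jobs": [],    # list: job ids any worker may take
    "job_hashes": {},  # the job:<job_id> hashes, keyed by job id
}

def queue_job(job_data):
    """Create a job:<job_id> hash and push the id onto all:jobs."""
    job_id = str(uuid.uuid4())
    store["job_hashes"][job_id] = dict(job_data)
    store["all:jobs"].append(job_id)
    return job_id

def work_one(worker_name, target):
    """Pull a job id, park it in <worker_name>:jobs, process, then clean up."""
    worker_queue = store.setdefault(worker_name + ":jobs", [])
    job_id = store["all:jobs"].pop(0)  # like LPOP all:jobs
    worker_queue.append(job_id)        # like RPUSH <worker_name>:jobs
    target(job_id, store["job_hashes"][job_id])
    # Successful processing removes both the job hash and the queued id.
    del store["job_hashes"][job_id]
    worker_queue.remove(job_id)

# One job's lifecycle: queued, claimed by a worker, processed, cleaned up.
results = []
job_id = queue_job({"path": "/tmp/input"})
work_one("worker-1", lambda jid, data: results.append((jid, data)))
```

After `work_one` returns, the job hash and every queue entry for the job are gone, matching the cleanup behavior described under Queues.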