Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/maxpoletaev/django-pgqueue
- Host: GitHub
- URL: https://github.com/maxpoletaev/django-pgqueue
- Owner: maxpoletaev
- Created: 2018-05-01T12:38:56.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2019-12-17T11:25:06.000Z (about 5 years ago)
- Last Synced: 2024-10-03T12:22:29.454Z (3 months ago)
- Language: Python
- Size: 11.7 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Django PGQueue
The project was initially forked from [django-postgres-queue][dpq] for internal use
in some of my projects. After some changes and refactoring, such as adding tests and
features like warm shutdown, I decided to put it on GitHub and PyPI as a standalone
package.

## Installation
```
pip install django-pgqueue
```

Then add `pgqueue` to `INSTALLED_APPS` and run `manage.py migrate` to create the jobs table.
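As a sketch, the settings change could look like this (assuming a standard `settings.py`; the app label here follows the package's import name, `pgqueue`):

```python
# settings.py — register the app so manage.py migrate picks up its migrations
INSTALLED_APPS = [
    # ... your other apps ...
    'pgqueue',
]
```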
## Usage
Instantiate a queue object with the defined tasks. This can live wherever you like and
be named whatever you like. For example:

```
# someapp/task_queue.py

from pgqueue.queue import Queue
def say_hello(queue, job):
    name = job.kwargs['name']
    print('Hello, {}!'.format(name))


task_queue = Queue(
    tasks={
        'say_hello': say_hello,
    },
    notify_channel='someapp_task_queue',
)
```

Now define the worker command:
```
# someapp/management/commands/pgqueue_worker.py

from pgqueue.worker import WorkerCommand
from someapp.task_queue import task_queue


class Command(WorkerCommand):
    queue = task_queue
```

And call the task this way:
```
from someapp.task_queue import task_queue

task_queue.enqueue('say_hello', {'name': 'Django'})
```

Please note that only primitives can be used in a job's arguments, as they are stored
as JSON in the database. If you try to pass a complex non-JSON-serializable object,
you will get an error saying `object is not JSON serializable`.

## Periodic tasks
There is no built-in way to run jobs periodically, like celerybeat in Celery,
but you can still use cron. For example, you can create a universal command
to execute any task. Something like this:

```
# someapp/management/commands/run_task.py

import json

from django.core.management import BaseCommand
from someapp.task_queue import task_queue


class Command(BaseCommand):
    def add_arguments(self, parser):
        parser.add_argument('task_name')
        parser.add_argument('task_kwargs')

    def handle(self, task_name, task_kwargs, **options):
        task_queue.enqueue(task_name, json.loads(task_kwargs))
```

And then put it into your crontab:
```
0 0 * * * /path/to/python manage.py run_task say_hello '{"name": "Django!"}'
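# hypothetical second entry: run a 'cleanup' task hourly
# (assumes such a task is registered in the queue)
0 * * * * /path/to/python manage.py run_task cleanup '{}'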
```

[dpq]: https://github.com/gavinwahl/django-postgres-queue
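Since job arguments must survive a round trip through JSON (as noted in the Usage section), it can be handy to fail fast at enqueue time. A minimal standalone sketch, using only the stdlib `json` module — `ensure_json_serializable` is a hypothetical helper, not part of django-pgqueue:

```python
import json


def ensure_json_serializable(kwargs):
    """Raise TypeError early if job kwargs cannot be stored as JSON."""
    # json.dumps raises TypeError for values it cannot encode (sets,
    # model instances, datetimes, ...), the same failure the queue
    # would hit when persisting the job row.
    json.dumps(kwargs)
    return kwargs


ensure_json_serializable({'name': 'Django'})  # primitives only: fine
try:
    ensure_json_serializable({'ids': {1, 2, 3}})  # a set is not JSON
except TypeError as exc:
    print(exc)  # e.g. "Object of type set is not JSON serializable"
```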