RQ (_Redis Queue_) is a simple Python library for queueing jobs and processing
them in the background with workers. It is backed by Redis and it is designed
to have a low barrier to entry. It can easily be integrated into your web
stack.

## Getting started

First, run a Redis server, of course:

```
$ redis-server
```
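If you are not sure whether the server is up, the standard Redis command line client can tell you; this is plain Redis tooling, nothing RQ-specific:

```
$ redis-cli ping
PONG
```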
To put jobs on queues, you don't have to do anything special; just define
your typically lengthy or blocking function:

```python
import requests

def count_words_at_url(url):
    resp = requests.get(url)
    return len(resp.text.split())
```

Then, create an RQ queue:

```python
from rq import *

use_redis()
q = Queue()
```
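You are not limited to a single queue: RQ queues are identified by name, so urgent work and bulk work can be kept apart. A minimal sketch, assuming the `use_redis()` setup from above; the queue names `'high'` and `'low'` (and the `my_module` module) are just illustrative names for this example:

```python
from rq import Queue, use_redis
from my_module import count_words_at_url

use_redis()

# Named queues; 'high' and 'low' are arbitrary example names.
high_q = Queue('high')
low_q = Queue('low')

# Work enqueued here will only be picked up by workers listening on 'high'.
high_q.enqueue(count_words_at_url, 'http://nvie.com')
```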
And enqueue the function call:

```python
from my_module import count_words_at_url

result = q.enqueue(count_words_at_url, 'http://nvie.com')
```
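The call to `enqueue()` returns right away; the actual work happens later, inside a worker. The `result` variable is therefore a handle to the job, not the word count itself. The sketch below shows the idea of reading the return value back after a worker has run the job; the exact attribute name has varied between RQ releases (for example `return_value` in very early versions, `result` later on), so treat it as an assumption and check the docs for your version:

```python
import time

# Right after enqueueing, no worker has processed the job yet.
print(result.result)   # => None  (attribute name is an assumption, see note above)

# Give a running worker a couple of seconds to pick up and finish the job.
time.sleep(2)

print(result.result)   # => 818, the return value of count_words_at_url()
```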
For a more complete example, refer to the [docs][d]. But this is the essence.

[d]: {{site.baseurl}}docs/

### The worker

To start executing enqueued function calls in the background, start a worker
from your project's directory:

```
$ rqworker
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
```
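Workers are not tied to the `default` queue. If you put work on additional named queues (say the `'high'` and `'low'` examples used earlier), pass the queue names to `rqworker` on the command line; the order you list them in acts as a simple priority scheme, with earlier queues served first:

```
$ rqworker high default low
```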
That's about it.

## Installation

Simply use the following command to install the latest released version:

```
pip install rq
```
If you want the cutting-edge version (which may well be broken), use this:

```
pip install -e git+git@github.com:nvie/rq.git@master#egg=rq
```
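Either way, a quick sanity check is to import the package from the Python interpreter you plan to use; if the import raises no error, the install worked. This is a plain Python check, nothing RQ-specific:

```
$ python -c "import rq"
```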
## Project history

This project has been inspired by the good parts of [Celery][1], [Resque][2]
and [this snippet][3], and has been created as a lightweight alternative to the
heaviness of Celery or other AMQP-based queueing implementations.

[1]: http://www.celeryproject.org/
[2]: https://github.com/defunkt/resque
[3]: http://flask.pocoo.org/snippets/73/