Simple job queues for Python

RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry. It can easily be integrated into your web stack.

Getting started

First, run a Redis server, of course:

$ redis-server

To put jobs on queues, you don't have to do anything special: just define your typically lengthy or blocking function:

import requests

def count_words_at_url(url):
    """Just an example function that's called async."""
    resp = requests.get(url)
    return len(resp.text.split())

You do use the excellent requests package, don't you?

Then, create an RQ queue:

from rq import Queue, use_connection
use_connection()
q = Queue()
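
By default, use_connection() connects to a Redis server running on localhost. If your Redis server lives elsewhere, you can pass in an existing connection object, and queues can be given their own names. A minimal sketch (the host below is just a placeholder):

from redis import Redis
from rq import Queue, use_connection

# Sketch: point RQ at a specific Redis server instead of the default
# localhost connection (host and port are placeholders).
use_connection(Redis(host='redis.example.com', port=6379))

# Queues can be named; Queue() without arguments uses the 'default' queue.
q = Queue('low')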

And enqueue the function call:

from my_module import count_words_at_url
result = q.enqueue(count_words_at_url, 'http://nvie.com')
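The call to enqueue() returns a job object right away, but the job's result is only available after a worker has picked it up and run it. A minimal sketch of checking for the result (assuming a worker is running; the sleep is just for illustration):

import time
from my_module import count_words_at_url

job = q.enqueue(count_words_at_url, 'http://nvie.com')
print(job.result)   # None -- no worker has finished the job yet

time.sleep(2)       # give a running worker some time to process it
print(job.result)   # the word count, e.g. 818, once the job has run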

For a more complete example, refer to the docs. But this is the essence.

The worker

To start executing enqueued function calls in the background, start a worker from your project's directory:

$ rqworker
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
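
The worker listens on the queues you name on the command line (for example, rqworker high default low drains high before default), and falls back to the default queue when none are given. A worker can also be started from Python code; a minimal sketch, assuming the same default connection as above:

from rq import Queue, Worker, use_connection

use_connection()

# Sketch: a worker that listens on two queues, in the given priority order.
worker = Worker([Queue('high'), Queue('default')])
worker.work()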

That's about it.

Installation

Simply use the following command to install the latest released version:

pip install rq

If you want the cutting edge version (that may well be broken), use this:

pip install -e git+git@github.com:nvie/rq.git@master#egg=rq

Project history

This project was inspired by the good parts of Celery, Resque and this snippet, and was created as a lightweight alternative to the heaviness of Celery or other AMQP-based queueing implementations.