# Simple job queues for Python

RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry. It can be integrated into your web stack easily.

RQ requires Redis >= 2.7.0.


Full documentation can be found here.

## Getting started

First, run a Redis server, of course:

```
$ redis-server
```

To put jobs on queues, you don't have to do anything special; just define your typically lengthy or blocking function:

```python
import requests

def count_words_at_url(url):
    """Just an example function that's called async."""
    resp = requests.get(url)
    return len(resp.text.split())
```
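Because the job is an ordinary Python function, you can sanity-check the same split-and-count logic synchronously before ever enqueueing anything. The sketch below applies it to a local string instead of a live URL (the helper name and sample text are made up for illustration):

```python
def count_words(text):
    """Count whitespace-separated words, mirroring the
    split-and-count step in count_words_at_url."""
    return len(text.split())

# Calling the logic directly -- no queue, worker, or network involved:
print(count_words("RQ is a simple Python library"))  # prints 6
```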

You do use the excellent requests package, don't you?

Then, create an RQ queue:

```python
from redis import Redis
from rq import Queue

q = Queue(connection=Redis())
```

And enqueue the function call:

```python
from my_module import count_words_at_url
result = q.enqueue(count_words_at_url, 'http://nvie.com')
```

For a more complete example, refer to the docs. But this is the essence.
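One detail worth knowing: `enqueue` returns a job object, and its `result` attribute is `None` until a worker has finished the call. A minimal sketch of polling for the result (this assumes a running Redis server and a running worker, and reuses the example function above):

```python
import time

from redis import Redis
from rq import Queue
from my_module import count_words_at_url

q = Queue(connection=Redis())
job = q.enqueue(count_words_at_url, 'http://nvie.com')

print(job.result)  # None: no worker has run the job yet

# Give a worker some time to pick up and finish the job...
time.sleep(2)
print(job.result)  # the word count, once the worker is done
```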

## The worker

To start executing enqueued function calls in the background, start a worker from your project's directory:

```
$ rq worker
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
```

That's about it.
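Conceptually, the cycle is simple: the queue serializes the function reference and its arguments, and a worker pops a job, executes it, and hands back the return value. The stdlib-only sketch below imitates that cycle in-process; it is an illustration of the idea, not RQ's actual implementation (which uses a Redis list and pickled job payloads):

```python
import pickle

queue = []  # stand-in for the Redis list RQ keeps per queue

def enqueue(func, *args):
    """Serialize the call, roughly as RQ pickles jobs onto Redis."""
    queue.append(pickle.dumps((func, args)))

def work_one():
    """Pop the oldest job, run it, and return its result,
    like a single iteration of a worker's loop."""
    func, args = pickle.loads(queue.pop(0))
    return func(*args)

def count_words(text):
    return len(text.split())

enqueue(count_words, "a tiny in-process job")
print(work_one())  # prints 4
```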

## Installation

Simply use the following command to install the latest released version:

```
pip install rq
```

If you want the cutting edge version (that may well be broken), use this:

```
pip install -e git+git@github.com:nvie/rq.git@master#egg=rq
```

## Project history

This project has been inspired by the good parts of Celery, Resque and this snippet, and has been created as a lightweight alternative to the heaviness of Celery or other AMQP-based queueing implementations.