RQ (_Redis Queue_) is a simple Python library for queueing jobs and processing
them in the background with workers. It is backed by Redis and it is designed
to have a low barrier to entry. It can be integrated into your web stack
easily.

RQ requires Redis >= 3.0.0.

![Build status](https://github.com/rq/rq/workflows/Test%20rq/badge.svg)
[![PyPI](https://img.shields.io/pypi/pyversions/rq.svg)](https://pypi.python.org/pypi/rq)
[![Coverage](https://codecov.io/gh/rq/rq/branch/master/graph/badge.svg)](https://codecov.io/gh/rq/rq)

Full documentation can be found [here][d].
## Support RQ
If you find RQ useful, please consider supporting this project via [Tidelift](https://tidelift.com/subscription/pkg/pypi-rq?utm_source=pypi-rq&utm_medium=referral&utm_campaign=readme).
## Getting started
First, run a Redis server, of course:
```console
$ redis-server
```
To put jobs on queues, you don't have to do anything special; just define
your typically lengthy or blocking function:
```python
import requests

def count_words_at_url(url):
    """Just an example function that's called async."""
    resp = requests.get(url)
    return len(resp.text.split())
```

You do use the excellent [requests][r] package, don't you?

Then, create an RQ queue:

```python
from redis import Redis
from rq import Queue

q = Queue(connection=Redis())
```
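Queues are identified by name; when no name is given, the queue is called `default`, which is also the queue a plain `rq worker` listens on. A small sketch of using explicitly named queues, assuming a local Redis with default connection settings:

```python
from redis import Redis
from rq import Queue

# Assumed connection settings; adjust host/port/db to your own Redis setup.
redis_conn = Redis(host='localhost', port=6379, db=0)

# Queue names are arbitrary strings; each named queue holds its own jobs.
high_q = Queue('high', connection=redis_conn)
low_q = Queue('low', connection=redis_conn)
```

Workers can then be told which named queues to listen on (see the worker section below).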
And enqueue the function call:
```python
from my_module import count_words_at_url

job = q.enqueue(count_words_at_url, 'http://nvie.com')
```
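The `enqueue()` call returns a job object that you can use to check on the call from the enqueueing side. A minimal sketch, assuming a worker is running and gets around to the job within a couple of seconds:

```python
import time

# Right after enqueueing, the job has usually not been processed yet.
print(job.get_status())  # e.g. 'queued' or 'started'
print(job.result)        # None until the worker has finished the job

# Give a running worker a moment to pick the job up and finish it.
time.sleep(2)
print(job.result)        # the return value of count_words_at_url, e.g. 818
```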
For a more complete example, refer to the [docs][d]. But this is the essence.
### The worker
To start executing enqueued function calls in the background, start a worker
from your project's directory:
```console
$ rq worker
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
```
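Workers can also be started from Python code rather than the command line, which can be convenient for testing. A minimal sketch, assuming Redis is reachable with default connection settings:

```python
from redis import Redis
from rq import Queue, Worker

# Assumes a local Redis instance with default settings.
redis_conn = Redis()

# Listen on the 'default' queue; work() blocks until the worker is stopped.
worker = Worker([Queue(connection=redis_conn)], connection=redis_conn)
worker.work()
```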
That's about it.
## Installation
Simply use the following command to install the latest released version:

    pip install rq

If you want the cutting edge version (that may well be broken), use this:

    pip install -e git+https://github.com/nvie/rq.git@master#egg=rq

## Related Projects
Check out the repos below, which might be useful in your RQ-based project.
- [rq-dashboard](https://github.com/Parallels/rq-dashboard)
- [rqmonitor](https://github.com/pranavgupta1234/rqmonitor)
- [django-rq](https://github.com/rq/django-rq)
- [Flask-RQ2](https://github.com/rq/Flask-RQ2)
- [rq-scheduler](https://github.com/rq/rq-scheduler)
## Project history
This project has been inspired by the good parts of [Celery][1], [Resque][2]
and [this snippet][3], and has been created as a lightweight alternative to the
heaviness of Celery or other AMQP-based queueing implementations.

[r]: http://python-requests.org
[d]: http://python-rq.org/
[m]: http://pypi.python.org/pypi/mailer
[p]: http://docs.python.org/library/pickle.html
[1]: http://www.celeryproject.org/
[2]: https://github.com/resque/resque
[3]: http://flask.pocoo.org/snippets/73/