# Benchmark

## Table of Contents

- [TL;DR](#tldr)
- [Usage](#usage)
- [Results](#results)

## TL;DR

NOTE: Benchmarked on a MacBook Pro 2019 / 2.4 GHz 8-Core Intel Core i9 / 32 GB RAM

| Server     | Throughput (request/sec) | Num Workers | Runner  |
|------------|--------------------------|-------------|---------|
| blacksheep | 46,564                   | 10          | uvicorn |
| starlette  | 44,102                   | 10          | uvicorn |
| proxy.py   | 39,232                   | 10          | -       |
| aiohttp    | 6,615                    | 1           | -       |
| tornado    | 3,301                    | 1           | -       |
- On a single core, `proxy.py` yields ~9,449 req/sec throughput.
- Try it yourself using `--num-acceptors=1`, as sketched below.
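
For reference, a minimal way to reproduce the single-core setup (a sketch, assuming `proxy.py` is installed, e.g. via `pip install proxy.py`, which provides the `proxy` entry point):

```bash
# Start proxy.py with a single acceptor process.
# --num-acceptors is the flag referenced above; 8899 is proxy.py's default port.
proxy --num-acceptors=1 --port 8899
```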

## Usage

```bash
git clone https://github.com/abhinavsingh/proxy.py.git
cd proxy.py
pip install -r benchmark/requirements.txt
./benchmark/compare.sh > /tmp/compare.log 2>&1
```
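
To exercise a single server rather than the full comparison, each `_<name>.py` module under `benchmark/` defines one server implementation. A hedged sketch of a standalone run (check the module source for the port it actually binds to):

```bash
# Hypothetical standalone invocation of one benchmark server,
# here the aiohttp variant; adjust the module name as needed.
python benchmark/_aiohttp.py
```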

## Results

```bash
cat /tmp/compare.log
```
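
The log contains the raw output of every benchmark run back to back. To skim just the throughput figures, a filter along these lines can help (a sketch; the exact line format depends on the load-generation tool that `compare.sh` invokes):

```bash
# Pull out lines that look like throughput summaries
grep -iE 'requests?( per |/)sec' /tmp/compare.log
```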