mitmproxy/test/bench
Maximilian Hils, commit c69239bb90: switch to stdlib logging
mitmproxy previously used a homegrown logging mechanism based around
`mitmproxy.ctx.log` and the `add_log` hook. This worked well for everything
we control, but it does not reach code outside the mitmproxy universe.
So far we have simply ignored logging in, for example, tornado or h2. With the
upcoming introduction of mitmproxy_wireguard, however, we gain a dependency
on Rust/PyO3 code for which we definitely want logs, but which cannot easily
be changed to use our homegrown logging (PyO3 already does the heavy lifting
to interoperate with stdlib logging). Long story short,
we want to introduce a log handler for stdlib logging.

There are two ways such a handler could operate:

 1. We could build a handler that forwards all stdlib log events
    into our homegrown mechanism.
 2. We embrace stdlib's logging as the correct way to do things,
    and get rid of our homegrown stuff.

This PR follows the second approach by removing the `add_log` hook and
rewriting the `TermLog` and `EventStore` addons to listen for stdlib log records.
This means that all `mitmproxy.ctx.log.info()` calls are now simply `logging.info()`, and so on.
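
For context, "listening for stdlib log records" boils down to registering a standard `logging.Handler` and receiving every record that propagates to the logger it is attached to. The sketch below is a minimal illustration of that pattern, not the actual `TermLog`/`EventStore` code; the class name and messages are made up.

```python
import logging


class EventListHandler(logging.Handler):
    """Illustrative handler: collects formatted records in a list."""

    def __init__(self) -> None:
        super().__init__(level=logging.INFO)
        self.events: list[str] = []

    def emit(self, record: logging.LogRecord) -> None:
        # self.format() applies the handler's formatter (or a default one).
        self.events.append(self.format(record))


# Attaching to the root logger makes records from any library visible,
# including third-party code that already uses stdlib logging.
handler = EventListHandler()
logging.getLogger().addHandler(handler)
logging.getLogger().setLevel(logging.INFO)

logging.info("hello from stdlib logging")
assert "hello from stdlib logging" in handler.events[0]
```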

One upside of this approach is that many parts of the codebase no longer depend
on the existence of `mitmproxy.ctx`, and we can use off-the-shelf tooling such as pytest's
`caplog`. We can also colorize log output and/or add timestamps more easily.
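
As a concrete example of the testing upside, a pytest test can now assert on log output with the built-in `caplog` fixture; the logger name and message below are hypothetical, not taken from the mitmproxy test suite.

```python
import logging


def test_logs_warning(caplog):
    caplog.set_level(logging.WARNING)
    # Hypothetical code under test that logs via stdlib logging.
    logging.getLogger("mitmproxy.proxy").warning("connection killed")
    assert "connection killed" in caplog.text
```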

README.md

This directory contains an addon for benchmarking and profiling mitmproxy. At the moment, this is simply to give developers a quick way to see the impact of their work. Eventually, this might grow into a performance dashboard with historical data, so we can track performance over time.

Setup

Install the following tools:

https://github.com/wg/wrk

go get github.com/cortesi/devd/cmd/devd
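
Note: on recent Go versions (1.18 and later), installing binaries via `go get` is no longer supported; the equivalent command should be `go install github.com/cortesi/devd/cmd/devd@latest`.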

You may also want to install snakeviz to make viewing profiles easier:

pip install snakeviz

Now run the benchmark by loading the addon. A typical invocation is as follows:

mitmdump -p0 -q --set benchmark_save_path=/tmp/foo -s ./benchmark.py

This will start up the backend server, run the benchmark, save the results to /tmp/foo.bench and /tmp/foo.prof, and exit.
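
If you installed snakeviz, you can then inspect the profile interactively:

snakeviz /tmp/foo.prof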