This required some minor code changes, mainly adjustments in tests
(which are now analyzed more thoroughly despite being mostly
unannotated) and some changes to the placement of type: ignore comments.
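As a hedged illustration of the kind of placement change involved (the names here are hypothetical, not from the actual diff):
```py
# mypy's "type: ignore" suppresses errors reported on that physical
# line, so when the reported line shifts, the comment must move too.

# Before: the ignore sat on the opening line of the call.
result = untyped_helper(  # type: ignore
    request,
)

# After: moved to the line the error is now reported on.
result = untyped_helper(
    request,  # type: ignore
)
```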
Reduce the tox matrix to one env per Python version, with two extra
builds for lint and docs. Delegate to tox from Travis CI.
Add 3.8 to testing. Simplify by dropping coverage reporting and the
"no-deps" test runs.
Added an extra set for handling dead links, and reporting.
One consequence of this is that the script will still "work" offline,
but will report that all the links were not fetched.
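A minimal sketch of the set-based pattern described above (the names and fetch logic are illustrative, not the script's actual code):
```py
import urllib.request

links = ["https://example.com/a", "https://example.com/b"]

dead_links = set()
for url in links:
    try:
        urllib.request.urlopen(url, timeout=10)
    except OSError:
        dead_links.add(url)

# Reporting happens after the crawl, so an offline run still "works"
# but reports every link as not fetched.
for url in sorted(dead_links):
    print("not fetched:", url)
```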
This function is called on more than just 304 responses; it’s
important to permit the Allow header on 204 responses. Also, the
relevant RFCs have changed significantly.
Fixes #2726.
Signed-off-by: Anders Kaseorg <andersk@mit.edu>
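A hedged sketch of the behavior described above (the function and header list are illustrative, not necessarily Tornado's exact internals):
```py
def clear_representation_headers(headers: dict) -> None:
    # 204 and 304 responses carry no body, so headers describing a
    # representation are removed.
    for h in ("Content-Encoding", "Content-Language", "Content-Type"):
        headers.pop(h, None)
    # Note: Allow is deliberately not cleared; it is meaningful on a
    # 204 response (e.g. replying to OPTIONS) and must be permitted.
```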
Support x-www-form-urlencoded bodies with values consisting of
encoded bytes that are not url-escaped into ASCII
(other web frameworks often seem to support this).
Add bytes qs support to escape.parse_qs_bytes;
leave str qs support for backwards compatibility.
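A hedged usage sketch (the exact return shape is an assumption based on the description above):
```py
from tornado.escape import parse_qs_bytes

# A body whose value contains raw UTF-8 bytes, not url-escaped to ASCII.
body = b"name=caf\xc3\xa9"
parsed = parse_qs_bytes(body)
# Expected something like: {'name': [b'caf\xc3\xa9']}
print(parsed)
```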
This test started failing on Windows CI with an upgrade to Python
3.7.4 (which bundles a newer version of OpenSSL). Disable TLS 1.3 for
now.
Possibly related to #2536.
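For context, TLS 1.3 can be disabled on a context in Python 3.7+ like this (a sketch, not necessarily the test's exact code):
```py
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
if hasattr(ssl, "TLSVersion"):
    # Cap negotiation at TLS 1.2 so TLS 1.3 is never selected.
    ctx.maximum_version = ssl.TLSVersion.TLSv1_2
```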
This is slightly faster than relying on re's builtin pattern cache.
With the benchmark below (Python 3.7, Linux):
before: 0.7284867879934609
after: 0.2657967659761198
```py
import re
from time import perf_counter

line = 'HTTP/1.1'
_http_version_re = re.compile(r"^HTTP/1\.[0-9]$")

# after: match against a precompiled pattern
start = perf_counter()
for i in range(1000000):
    _http_version_re.match(line)
print(perf_counter() - start)

# before: re.match looks the pattern up in re's internal cache on each call
start = perf_counter()
for i in range(1000000):
    re.match(r"^HTTP/1\.[0-9]$", line)
print(perf_counter() - start)
```
Tornado is now py3-only so @lru_cache is always available.
Performance is about the same; benchmark below (Python 3.7, Linux).
before, cached: 0.9121252089971676
before, uncached: 13.358482279989403
after, cached: 0.9175888689933345
after, uncached: 11.085199063003529
```py
from time import perf_counter

names = [f'sOMe-RanDOM-hEAdeR-{i}' for i in range(1000)]

# after: @lru_cache-based _normalize_header
from tornado.httputil import _normalize_header

start = perf_counter()
for i in range(10000):
    # _normalize_header.cache_clear()  # uncomment for the "uncached" case
    for name in names:
        _normalize_header(name)
print(perf_counter() - start)

# before: dict-based _NormalizedHeaderCache
from tornado.httputil import _NormalizedHeaderCache

start = perf_counter()
_normalized_headers = _NormalizedHeaderCache(1000)
for i in range(10000):
    # _normalized_headers = _NormalizedHeaderCache(1000)  # uncomment for the "uncached" case
    for name in names:
        _normalized_headers[name]
print(perf_counter() - start)
```
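For reference, the cached helper would look roughly like this (a sketch; the exact normalization shown is an assumption):
```py
import functools

@functools.lru_cache(1000)
def _normalize_header(name: str) -> str:
    """Convert a header name to Http-Header-Case."""
    return "-".join([w.capitalize() for w in name.split("-")])
```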