v2.4.0rc3 (#882)
* Merge pull request #840 from abhinavsingh/release-schedule-notes: add release schedule under FAQ
* Green CI (#841)
* Cleanup parser & url classes (#843): optimize parser logic, add `is_complete` property, simplify url
* pip prod(deps): bump twine from 3.6.0 to 3.7.0 (#845)
* npm: bump jasmine from 3.6.3 to 3.10.0 in /dashboard (#844)
* npm: bump chrome-devtools-frontend from 1.0.944903 to 1.0.947377 in /dashboard (#846)
* pip prod(deps): bump coverage from 6.1.2 to 6.2 (#847)
* Fix GHA check-gate to properly identify failures (#849)
* pip prod(deps): bump pylint from 2.12.1 to 2.12.2 (#851)
* npm: bump @types/js-cookie from 2.2.6 to 3.0.1 in /dashboard (#850)
* pip prod(deps): bump sphinx from 4.3.0 to 4.3.1 (#853)
* pip prod(deps): bump paramiko from 2.8.0 to 2.8.1 (#855)
* npm: bump ws from 7.4.6 to 8.3.0 in /dashboard (#854)
* pip prod(deps): bump uvicorn from 0.15.0 to 0.16.0 (#857)
* npm: bump chrome-devtools-frontend from 1.0.947377 to 1.0.949424 in /dashboard (#856)
* Process `--enable-*` flags before loading plugins (#860): also ignore RST299 and RST499 in the flake8 config
* npm: bump http-server from 0.12.3 to 14.0.0 in /dashboard (#858)
* pip prod(deps): bump furo from 2021.11.15 to 2021.11.23 (#859)
* Update web log context fields to match proxy log context fields for consistency (#861): also fix `DEFAULT_WEB_ACCESS_LOG_FORMAT`
* pip prod(deps): bump pytest-xdist from 2.4.0 to 2.5.0 (#864)
* npm: bump eslint-plugin-node from 10.0.0 to 11.1.0 in /dashboard (#863)
* Fix broken TLS interception & CacheResponsesPlugin because UID is no longer a UUID (#866): give work IDs enough context to be unique within a `proxy.py` instance, use `--port=0` by default within `proxy.TestCase`, pin buildx to v0.7.0 (see https://github.com/docker/buildx/issues/850#issuecomment-973270625), and repurpose make into a developer workflow
* Integrate showing unreleased changelog draft (#873)
* pip prod(deps): bump types-paramiko from 2.8.2 to 2.8.4 (#868)
* npm: bump @types/jasmine from 3.6.1 to 3.10.2 in /dashboard (#867)
* pip prod(deps): bump py-spy from 0.3.10 to 0.3.11 (#875)
* [GHA] Add container integration test & publish containers to GHCR (#818): handle KBI in threadless, run the container in detached mode with the web server enabled so integration tests can pass, then tag & push to GHCR after successful tests
* Publish multi-platform containers on GHCR (#877): build containers without a matrix-based strategy (helps with buildx-based manifest generation), use buildx directly in workflows, add the `PROXYPY_PKG_PATH` build arg, also add a `latest` tag for GHCR, and explain the differences between the `latest` tag on DockerHub (stable) and GHCR (develop)
* Publish multi-platform containers to DockerHub (#878)
* Use `--local-executor` flag by default for Docker containers (#880): benchmark gets packaged within the wheel if set as a package, update the Dockerfile, fix mypy issues, remove conflicting dir names

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Abhinav Singh <126065+abhinavsingh@users.noreply.github.com>
Co-authored-by: Sviatoslav Sydorenko <wk@sydorenko.org.ua>
Signed-off-by: dependabot[bot] <support@github.com>
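One of the changes above (#866) switches `proxy.TestCase` to `--port=0` by default. Binding port 0 asks the kernel for any free ephemeral port, which is what makes parallel test runs collision-free. A minimal stdlib sketch of the mechanism (variable names here are illustrative, not from `proxy.py`):

```python
import socket

# Bind to port 0: the kernel allocates any free ephemeral port,
# so concurrent test runs never fight over a fixed port number.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(('127.0.0.1', 0))
allocated_port = sock.getsockname()[1]  # read back the kernel-chosen port
sock.close()
```

The server then reports the allocated port back to the caller, the same way `proxy.TestCase` exposes it to test cases.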
parent e473eb190f
commit 009935b29b
`.flake8` (2 changes):

```diff
@@ -73,7 +73,9 @@ extend-ignore =
   Q003  # FIXME: avoid escaping in-string quotes
   RST201  # FIXME: missing trailing blank line in docstring
   RST203  # FIXME: no trailing blank line in docstring
+  RST299  # FIXME: Cannot extract compound bibliographic field "copyright"
   RST301  # FIXME: unexpected indent in docstring
+  RST499  # FIXME: Missing matching underline for section title overline
   S101  # FIXME: assertions are thrown away in optimized mode, needs audit
   S104  # FIXME: bind-all interface listen
   S105  # FIXME: hardcoded password?
```
`.github/buildkitd.toml`:

```diff
@@ -1,4 +1,4 @@
 [worker.oci]
   max-parallelism = 4
 [registry."docker.io"]
-  mirrors = ["mirror.gcr.io"]
+  mirrors = ["ghcr.io"]
```
`.github/workflows/test-library.yml` (workflow `lib`):

```diff
@@ -1,5 +1,4 @@
 ---
-# yamllint disable rule:line-length
 name: lib

 on:  # yamllint disable-line rule:truthy
```
```diff
@@ -67,6 +66,7 @@ jobs:
       git-tag: ${{ steps.git-tag.outputs.tag }}
       sdist-artifact-name: ${{ steps.artifact-name.outputs.sdist }}
       wheel-artifact-name: ${{ steps.artifact-name.outputs.wheel }}
+      container-version: v${{ steps.container.outputs.version }}
     steps:
     - name: Switch to using Python 3.9 by default
       uses: actions/setup-python@v2
```
```diff
@@ -195,6 +195,16 @@ jobs:
           && github.event.inputs.release-version
           || steps.scm-version.outputs.dist-version
         }}-py3-none-any.whl')
+    - name: Calculate container attributes
+      id: container
+      shell: bash
+      run: >-
+        VER=$(echo '${{
+          steps.request-check.outputs.release-requested == 'true'
+          && github.event.inputs.release-version
+          || steps.scm-version.outputs.dist-version
+        }}' | tr + .);
+        echo "::set-output name=version::$VER"

   build:
     name: 👷 dists ${{ needs.pre-setup.outputs.git-tag }}
```
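The `tr + .` in the step above exists because PEP 440 local versions may contain `+`, which is not a valid character in a container tag. A hedged Python sketch of the same mapping (the helper name is ours, not the workflow's):

```python
def container_version(dist_version: str) -> str:
    """Map a PEP 440 dist version to a Docker-tag-safe string,
    mirroring the workflow's `tr + .` substitution."""
    return dist_version.replace('+', '.')

# e.g. a setuptools-scm style dev version with a local segment
tag = 'v' + container_version('2.4.0.dev1+g009935b')
```

Release versions without a `+` pass through unchanged, so stable tags keep their familiar form.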
```diff
@@ -547,11 +557,6 @@ jobs:
         # a pull request then we can checkout the head.
         fetch-depth: 2

-    # If this run was triggered by a pull request event, then checkout
-    # the head of the pull request instead of the merge commit.
-    - run: git checkout HEAD^2
-      if: ${{ github.event_name == 'pull_request' }}
-
     # Initializes the CodeQL tools for scanning.
     - name: Initialize CodeQL
       uses: github/codeql-action/init@v1
```
```diff
@@ -631,45 +636,67 @@ jobs:
         npm run build
         cd ..

-  docker:
-    # TODO: To build our docker container, we must wait for check,
-    # so that we can use the same distribution available.
+  developer:
     runs-on: ${{ matrix.os }}-latest
+    name: 🧑‍💻 👩‍💻 👨‍💻 Developer setup ${{ matrix.node }} @ ${{ matrix.os }}
+    strategy:
+      matrix:
+        os: [ubuntu, macOS]
+        python: ['3.10']
+      fail-fast: false
+    steps:
+    - uses: actions/checkout@v2
+      with:
+        fetch-depth: 0
+    - name: Setup Python
+      uses: actions/setup-python@v2
+      with:
+        python-version: ${{ matrix.python }}
+    - name: Install Pip Dependencies
+      run: |
+        make lib-dep
+    - name: Run essentials
+      run: |
+        ./write-scm-version.sh
+        python3 check.py
+        make https-certificates
+        make sign-https-certificates
+        make ca-certificates
+        python3 -m proxy --version
+
+  docker:
+    runs-on: Ubuntu-latest
+    permissions:
+      packages: write
     needs:
     - build
     - pre-setup  # transitive, for accessing settings
-    name: 🐳 🐍${{ matrix.python }} @ ${{ matrix.targetplatform }}
+    name: 🐳 containerize
-    strategy:
-      matrix:
-        os:
-        - Ubuntu
-        python:
-        - '3.10'
-        targetplatform:
-        - 'linux/386'
-        - 'linux/amd64'
-        - 'linux/arm/v6'
-        - 'linux/arm/v7'
-        - 'linux/arm64/v8'
-        - 'linux/ppc64le'
-        - 'linux/s390x'
-      # max-parallel: 1
-      fail-fast: false
     steps:
     - name: Checkout
       uses: actions/checkout@v2
-    - name: Set up Docker Buildx
-      id: buildx
-      uses: docker/setup-buildx-action@v1
-      with:
-        buildkitd-flags: --debug
-        config: .github/buildkitd.toml
-        install: true
     - name: Download all the dists
       uses: actions/download-artifact@v2
       with:
         name: python-package-distributions
         path: dist/
+    - name: Login to GHCR
+      uses: docker/login-action@v1
+      with:
+        registry: ghcr.io
+        username: ${{ github.actor }}
+        password: ${{ secrets.GITHUB_TOKEN }}
+    - name: Set up Docker Buildx
+      id: buildx
+      uses: docker/setup-buildx-action@v1
+      # See https://github.com/docker/buildx/issues/850#issuecomment-996408167
+      with:
+        version: v0.7.0
+        buildkitd-flags: --debug
+        config: .github/buildkitd.toml
+        install: true
     - name: Enable Multiarch  # This slows down arm build by 4-5x
       run: |
         docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
```
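The rewritten `docker` job replaces the old per-platform matrix with a single buildx invocation that targets every platform at once. A hedged sketch of how the tags for that invocation are assembled (the `VERSION` value stands in for `needs.pre-setup.outputs.container-version`; the final `docker buildx build` is only echoed here, not run):

```shell
# Assemble the tag strings the containerize job feeds to buildx.
PLATFORMS=linux/386,linux/amd64,linux/arm/v6,linux/arm/v7,linux/arm64/v8,linux/ppc64le,linux/s390x
REGISTRY_URL="ghcr.io/abhinavsingh/proxy.py"
VERSION="2.4.0rc3"
CONTAINER_TAG="$REGISTRY_URL:$VERSION"
LATEST_TAG="$REGISTRY_URL:latest"
# One multi-platform build (with --push) replaces seven matrix jobs:
echo docker buildx build --push --platform "$PLATFORMS" \
  -t "$CONTAINER_TAG" -t "$LATEST_TAG" .
```

Because buildx emits a single multi-arch manifest list, no separate manifest-creation step is needed afterwards.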
```diff
@@ -679,17 +706,64 @@ jobs:
         docker buildx use proxypybuilder
         docker buildx inspect
         docker buildx ls
-    - name: Set PROXYPY_CONTAINER_VERSION
-      run: |
-        echo "PROXYPY_CONTAINER_VERSION=$(echo '${{ needs.pre-setup.outputs.dist-version }}' | tr + .)" > $GITHUB_ENV
-    - name: Build container
-      run: |
-        make container-buildx \
-          -e PROXYPY_PKG_PATH='dist/${{ needs.pre-setup.outputs.wheel-artifact-name }}' \
-          -e BUILDX_TARGET_PLATFORM='${{ matrix.targetplatform }}' \
-          -e PROXYPY_CONTAINER_VERSION='${{ env.PROXYPY_CONTAINER_VERSION }}'
+    - name: Build, run & test container
+      run: >-
+        CONTAINER_TAG="abhinavsingh/proxy.py:${{
+          needs.pre-setup.outputs.container-version
+        }}";
+        docker buildx build
+        --load
+        --build-arg PROXYPY_PKG_PATH='dist/${{
+          needs.pre-setup.outputs.wheel-artifact-name
+        }}'
+        -t $CONTAINER_TAG .
+        &&
+        docker run
+        -d
+        -p 8899:8899
+        $CONTAINER_TAG
+        --hostname 0.0.0.0
+        --enable-web-server
+        --local-executor && ./tests/integration/test_integration.sh 8899
+    - name: Push to GHCR
+      run: >-
+        PLATFORMS=linux/386,linux/amd64,linux/arm/v6,linux/arm/v7,linux/arm64/v8,linux/ppc64le,linux/s390x;
+        REGISTRY_URL="ghcr.io/abhinavsingh/proxy.py";
+        CONTAINER_TAG=$REGISTRY_URL:${{
+          needs.pre-setup.outputs.container-version
+        }};
+        LATEST_TAG=$REGISTRY_URL:latest;
+        docker buildx build
+        --push
+        --platform $PLATFORMS
+        --build-arg PROXYPY_PKG_PATH='dist/${{
+          needs.pre-setup.outputs.wheel-artifact-name
+        }}'
+        -t $CONTAINER_TAG
+        -t $LATEST_TAG .
+    - name: Login to DockerHub
+      uses: docker/login-action@v1
+      with:
+        username: abhinavsingh
+        password: ${{ secrets.DOCKER_ACCESS_TOKEN }}
+    - name: Push to DockerHub
+      run: >-
+        PLATFORMS=linux/386,linux/amd64,linux/arm/v6,linux/arm/v7,linux/arm64/v8,linux/ppc64le,linux/s390x;
+        REGISTRY_URL="abhinavsingh/proxy.py";
+        CONTAINER_TAG=$REGISTRY_URL:${{
+          needs.pre-setup.outputs.container-version
+        }};
+        docker buildx build
+        --push
+        --platform $PLATFORMS
+        --build-arg PROXYPY_PKG_PATH='dist/${{
+          needs.pre-setup.outputs.wheel-artifact-name
+        }}'
+        -t $CONTAINER_TAG .

   check:  # This job does nothing and is only used for the branch protection
     if: always()

     needs:
     - analyze
     - test
```
```diff
@@ -697,14 +771,15 @@ jobs:
     - docker
     - dashboard
     - brew
+    - developer

-    runs-on: ubuntu-latest
+    runs-on: Ubuntu-latest

     steps:
-    - name: Report success of the test matrix
-      run: >-
-        print("All's good")
-      shell: python
+    - name: Decide whether the needed jobs succeeded or failed
+      uses: re-actors/alls-green@release/v1
+      with:
+        jobs: ${{ toJSON(needs) }}

   publish-pypi:
     name: Publish 🐍📦 ${{ needs.pre-setup.outputs.git-tag }} to PyPI
```
```diff
@@ -729,13 +804,13 @@ jobs:
         name: python-package-distributions
         path: dist/
     - name: >-
-        Publish 🐍📦 v${{ needs.pre-setup.outputs.git-tag }} to PyPI
+        Publish 🐍📦 ${{ needs.pre-setup.outputs.git-tag }} to PyPI
       uses: pypa/gh-action-pypi-publish@release/v1
       with:
         password: ${{ secrets.PYPI_TOKEN }}

   publish-testpypi:
-    name: Publish 🐍📦 to TestPyPI
+    name: Publish 🐍📦 ${{ needs.pre-setup.outputs.git-tag }} to TestPyPI
     needs:
     - check
     - pre-setup  # transitive, for accessing settings
```
```diff
@@ -758,12 +833,41 @@ jobs:
         name: python-package-distributions
         path: dist/
     - name: >-
-        Publish 🐍📦 v${{ needs.pre-setup.outputs.git-tag }} to TestPyPI
+        Publish 🐍📦 ${{ needs.pre-setup.outputs.git-tag }} to TestPyPI
       uses: pypa/gh-action-pypi-publish@release/v1
       with:
         password: ${{ secrets.TESTPYPI_API_TOKEN }}
         repository_url: https://test.pypi.org/legacy/

+  # publish-docker:
+  #   name: Publish 🐳 📦 ${{ needs.pre-setup.outputs.git-tag }} to Docker Hub
+  #   needs:
+  #   - check
+  #   - pre-setup  # transitive, for accessing settings
+  #   if: >-
+  #     fromJSON(needs.pre-setup.outputs.release-requested)
+  #   runs-on: Ubuntu-latest
+
+  #   environment:
+  #     name: release-docker
+  #     url: >-
+  #       https://test.pypi.org/project/proxy.py/${{
+  #         needs.pre-setup.outputs.dist-version
+  #       }}
+
+  #   steps:
+  #   - name: Download all the dists
+  #     uses: actions/download-artifact@v2
+  #     with:
+  #       name: python-package-distributions
+  #       path: dist/
+  #   - name: >-
+  #       Publish 🐳 📦 ${{ needs.pre-setup.outputs.git-tag }} to Docker Hub
+  #     uses: pypa/gh-action-pypi-publish@release/v1
+  #     with:
+  #       password: ${{ secrets.TESTPYPI_API_TOKEN }}
+  #       repository_url: https://test.pypi.org/legacy/

   post-release-repo-update:
     name: >-
       Publish post-release Git tag
```
`CHANGELOG.md` (19 changes):

````diff
@@ -1,10 +1,23 @@
-<!-- markdownlint-disable no-duplicate-heading -->
+<!-- markdownlint-disable no-duplicate-heading no-multiple-blanks -->
 # Changelog

 All notable changes to this project will be documented in this file.

 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+[//]: # (DO-NOT-REMOVE-versioning-promise-START)
+
+```{note}
+The change notes follow [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
+except for the title formatting, and this project adheres to [Semantic
+Versioning](https://semver.org/spec/v2.0.0.html).
+```
+
+<!--
+Do *NOT* manually add changelog entries here!
+This changelog is managed by Towncrier and is built at release time.
+See https://proxypy.rtfd.io/en/latest/contributing/guidelines#adding-change-notes-with-your-prs
+for details. Or read
+https://github.com/ansible/ansible-language-server/tree/main/docs/changelog-fragments.d#adding-change-notes-with-your-prs
+-->

 <!-- towncrier release notes start -->
````
`Dockerfile`:

```diff
@@ -6,7 +6,8 @@ LABEL com.abhinavsingh.name="abhinavsingh/proxy.py" \
   👷 \"Work\" acceptor & executor framework" \
   com.abhinavsingh.url="https://github.com/abhinavsingh/proxy.py" \
   com.abhinavsingh.vcs-url="https://github.com/abhinavsingh/proxy.py" \
-  com.abhinavsingh.docker.cmd="docker run -it --rm -p 8899:8899 abhinavsingh/proxy.py"
+  com.abhinavsingh.docker.cmd="docker run -it --rm -p 8899:8899 abhinavsingh/proxy.py" \
+  org.opencontainers.image.source="https://github.com/abhinavsingh/proxy.py"

 ENV PYTHONUNBUFFERED 1
 ARG PROXYPY_PKG_PATH
@@ -25,4 +26,7 @@ RUN apk update && apk add openssl

 EXPOSE 8899/tcp
 ENTRYPOINT [ "proxy" ]
-CMD [ "--hostname=0.0.0.0" ]
+CMD [ \
+  "--hostname=0.0.0.0" \
+  "--local-executor" \
+]
```
`Makefile` (17 changes):

```diff
@@ -29,7 +29,7 @@ endif
 .PHONY: all https-certificates sign-https-certificates ca-certificates
 .PHONY: lib-check lib-clean lib-test lib-package lib-coverage lib-lint lib-pytest
 .PHONY: lib-release-test lib-release lib-profile lib-doc
-.PHONY: lib-dep lib-flake8 lib-mypy lib-speedscope
+.PHONY: lib-dep lib-flake8 lib-mypy lib-speedscope container-buildx-all-platforms
 .PHONY: container container-run container-release container-build container-buildx
 .PHONY: devtools dashboard dashboard-clean

@@ -126,11 +126,11 @@ lib-release: lib-package

 lib-doc:
 	python -m tox -e build-docs && \
-	$(OPEN) .tox/build-docs/docs_out/index.html
+	$(OPEN) .tox/build-docs/docs_out/index.html || true

 lib-coverage:
 	pytest --cov=proxy --cov=tests --cov-report=html tests/ && \
-	$(OPEN) htmlcov/index.html
+	$(OPEN) htmlcov/index.html || true

 lib-profile:
 	ulimit -n 65536 && \
@@ -177,6 +177,11 @@ dashboard-clean:
 container: lib-package
 	$(MAKE) container-build -e PROXYPY_PKG_PATH=$$(ls dist/*.whl)

+container-build:
+	docker build \
+		-t $(PROXYPY_CONTAINER_TAG) \
+		--build-arg PROXYPY_PKG_PATH=$(PROXYPY_PKG_PATH) .
+
 # Usage:
 #
 # make container-buildx \
@@ -185,12 +190,14 @@ container: lib-package
 # -e PROXYPY_CONTAINER_VERSION=latest
 container-buildx:
 	docker buildx build \
+		--load \
 		--platform $(BUILDX_TARGET_PLATFORM) \
 		-t $(PROXYPY_CONTAINER_TAG) \
 		--build-arg PROXYPY_PKG_PATH=$(PROXYPY_PKG_PATH) .

-container-build:
-	docker build \
+container-buildx-all-platforms:
+	docker buildx build \
+		--platform linux/386,linux/amd64,linux/arm/v6,linux/arm/v7,linux/arm64/v8,linux/ppc64le,linux/s390x \
 		-t $(PROXYPY_CONTAINER_TAG) \
 		--build-arg PROXYPY_PKG_PATH=$(PROXYPY_PKG_PATH) .
```
`README.md` (68 changes):

```diff
@@ -15,7 +15,7 @@
 [![pypi version](https://img.shields.io/pypi/v/proxy.py)](https://pypi.org/project/proxy.py/)
 [![Python 3.x](https://img.shields.io/static/v1?label=Python&message=3.6%20%7C%203.7%20%7C%203.8%20%7C%203.9%20%7C%203.10&color=blue)](https://www.python.org/)
 [![Checked with mypy](https://img.shields.io/static/v1?label=MyPy&message=checked&color=blue)](http://mypy-lang.org/)
-[![lib](https://github.com/abhinavsingh/proxy.py/actions/workflows/test-library.yml/badge.svg)](https://github.com/abhinavsingh/proxy.py/actions/workflows/test-library.yml)
+[![lib](https://github.com/abhinavsingh/proxy.py/actions/workflows/test-library.yml/badge.svg?branch=develop&event=push)](https://github.com/abhinavsingh/proxy.py/actions/workflows/test-library.yml)
 [![codecov](https://codecov.io/gh/abhinavsingh/proxy.py/branch/develop/graph/badge.svg?token=Zh9J7b4la2)](https://codecov.io/gh/abhinavsingh/proxy.py)

 [![Contributions Welcome](https://img.shields.io/static/v1?label=Contributions&message=Welcome%20%F0%9F%91%8D&color=darkgreen)](https://github.com/abhinavsingh/proxy.py/issues)
```
```diff
@@ -26,13 +26,13 @@
 - [Features](#features)
 - [Install](#install)
-  - [Stable vs Develop](#stable-vs-develop)
   - [Using PIP](#using-pip)
     - [Stable version](#stable-version-with-pip)
     - [Development version](#development-version-with-pip)
   - [Using Docker](#using-docker)
-    - [Stable version](#stable-version-from-docker-hub)
-    - [Development version](#build-development-version-locally)
+    - [Stable version from Docker Hub](#stable-version-from-docker-hub)
+    - [Development Version from GHCR](#development-version-from-ghcr)
+    - [Build container locally](#build-development-version-locally)
   - [Using HomeBrew](#using-homebrew)
     - [Stable version](#stable-version-with-homebrew)
     - [Development version](#development-version-with-homebrew)
```
```diff
@@ -92,6 +92,8 @@
   - [Inspect Traffic](#inspect-traffic)
     - [Chrome DevTools Protocol](#chrome-devtools-protocol)
 - [Frequently Asked Questions](#frequently-asked-questions)
+  - [Stable vs Develop](#stable-vs-develop)
+  - [Release Schedule](#release-schedule)
   - [Threads vs Threadless](#threads-vs-threadless)
   - [SyntaxError: invalid syntax](#syntaxerror-invalid-syntax)
   - [Unable to load plugins](#unable-to-load-plugins)
```
```diff
@@ -223,14 +225,6 @@
 # Install

-## Stable vs Develop
-
-`master` branch contains latest stable code and is available via `PyPi` repository
-
-`develop` branch contains cutting edge changes
-
-Development branch is kept stable *(most of the times)*. But if you want 100% reliability and serving users in production environment, always use stable version from `PyPi` or `Docker` container from `hub.docker.com`.
-
 ## Using PIP

 ### Stable Version with PIP
```
```diff
@@ -255,6 +249,15 @@ or from GitHub `master` branch

 ## Using Docker

+Multi-platform containers are available via:
+
+- Docker Hub
+  - `latest` tag points to last `stable` release
+  - `docker pull abhinavsingh/proxy.py:latest`
+- GitHub container registry (GHCR)
+  - `latest` tag points to last `develop` release
+  - `docker pull ghcr.io/abhinavsingh/proxy.py:latest`
+
 Stable version container releases are available for following platforms:

 - `linux/386`
```
````diff
@@ -273,12 +276,21 @@ Run `proxy.py` latest container:
 ❯ docker run -it -p 8899:8899 --rm abhinavsingh/proxy.py:latest
 ```

 Docker daemon will automatically pull the matching platform image.
+To run specific target platform container on multi-platform supported servers:
+
+```console
+❯ docker run -it -p 8899:8899 --rm --platform linux/arm64/v8 abhinavsingh/proxy.py:latest
+```
+
+### Development Version from GHCR
+
+Run `proxy.py` container from cutting edge code in the develop branch:
+
+```console
+❯ docker run -it -p 8899:8899 --rm ghcr.io/abhinavsingh/proxy.py:latest
+```

 ### Build Development Version Locally

 ```console
````
@@ -1329,10 +1341,10 @@ import proxy

if __name__ == '__main__':
    with proxy.Proxy([]) as p:
-        print(p.acceptors.flags.port)
+        print(p.flags.port)
```

-`acceptors.flags.port` will give you access to the random port allocated by the kernel.
+`flags.port` will give you access to the random port allocated by the kernel.
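When `proxy.py` is started with port `0`, the kernel assigns a free ephemeral port, which is then exposed via `flags.port`. As a rough illustration of the underlying mechanism (plain stdlib sockets here, not `proxy.py` itself), binding a socket to port `0` and reading the address back shows the same kernel behavior:

```python
import socket

# Ask the kernel for an ephemeral port by binding to port 0.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(('127.0.0.1', 0))

# The kernel has now picked a free port; read it back from the socket,
# just like proxy.py reads it back and publishes it on flags.port.
host, port = sock.getsockname()
print(port)

sock.close()
```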

## Loading Plugins
@@ -1391,7 +1403,7 @@ Note that:

1. `proxy.TestCase` overrides the `unittest.TestCase.run()` method to set up and tear down `proxy.py`.
2. `proxy.py` server will listen on a random available port on the system.
-   This random port is available as `self.PROXY.acceptors.flags.port` within your test cases.
+   This random port is available as `self.PROXY.flags.port` within your test cases.
3. Only a single acceptor and worker is started by default (`--num-workers 1 --num-acceptors 1`) for faster setup and tear down.
4. Most importantly, `proxy.TestCase` also ensures the `proxy.py` server
   is up and running before proceeding with execution of tests. By default,
@@ -1695,6 +1707,30 @@ Now point your CDT instance to `ws://localhost:8899/devtools`.

# Frequently Asked Questions

## Stable vs Develop

- `master` branch contains the latest `stable` code and is available via the `PyPi` repository and as `Docker` containers via the `docker.io` and `ghcr.io` registries.

  Issues reported against `stable` releases are treated with top priority. However, we currently don't backport fixes into older releases. For example, if you report an issue against `v2.3.1` while the current `master` branch already contains `v2.4.0rc1`, the fix will land in `v2.4.0rc2`.

- `develop` branch contains cutting-edge changes.

  The development branch is kept stable *(most of the time)*. **But**, if you want *100% reliability* and are serving users in a *production environment*, ALWAYS use the stable version.

### Release Schedule

A `vX.Y.ZrcN` pull request is created once a month which merges `develop` → `master`. Find below how code flows from a pull request to the next stable release.

1. Development release is deployed from `develop` → `test.pypi.org` after every pull request merge
2. Alpha release is deployed from `develop` → `pypi.org` **before** merging the `vX.Y.ZrcN` pull request from `develop` → `master` branch. There can be multiple alpha releases made before merging the `rc` pull request
3. Beta release is deployed from `master` → `pypi.org`. Beta releases are made in preparation for `rc` releases and can be skipped if unnecessary
4. Release candidate is deployed from `master` → `pypi.org`. Release candidates are always made available before the final stable release
5. Stable release is deployed from `master` → `pypi.org`

## Threads vs Threadless

### `v1.x`
@@ -2056,7 +2092,7 @@ usage: -m [-h] [--enable-events] [--enable-conn-pool] [--threadless]
          [--filtered-url-regex-config FILTERED_URL_REGEX_CONFIG]
          [--cloudflare-dns-mode CLOUDFLARE_DNS_MODE]

-proxy.py v2.3.2.dev190+ge60d80d.d20211124
+proxy.py v2.4.0rc3.dev33+gc341594.d20211214

options:
  -h, --help show this help message and exit
@@ -1,10 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-    proxy.py
-    ~~~~~~~~
-    ⚡⚡⚡ Fast, Lightweight, Pluggable, TLS interception capable proxy server focused on
-    Network monitoring, controls & Application development, testing, debugging.
-
-    :copyright: (c) 2013-present by Abhinav Singh and contributors.
-    :license: BSD, see LICENSE for more details.
-"""
@@ -0,0 +1,33 @@
+# -*- coding: utf-8 -*-
+"""
+    proxy.py
+    ~~~~~~~~
+    ⚡⚡⚡ Fast, Lightweight, Pluggable, TLS interception capable proxy server focused on
+    Network monitoring, controls & Application development, testing, debugging.
+
+    :copyright: (c) 2013-present by Abhinav Singh and contributors.
+    :license: BSD, see LICENSE for more details.
+"""
+import time
+import ipaddress
+
+import proxy
+
+
+if __name__ == '__main__':
+    with proxy.Proxy(
+        ['--plugin', 'proxy.plugin.WebServerPlugin'],
+        hostname=ipaddress.ip_address('127.0.0.1'),
+        port=8899,
+        backlog=65536,
+        open_file_limit=65536,
+        enable_web_server=True,
+        disable_proxy_server=False,
+        num_acceptors=10,
+        local_executor=True,
+        log_file='/dev/null',
+    ) as _:
+        while True:
+            try:
+                time.sleep(1)
+            except KeyboardInterrupt:
+                break
@@ -1,10 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-    proxy.py
-    ~~~~~~~~
-    ⚡⚡⚡ Fast, Lightweight, Pluggable, TLS interception capable proxy server focused on
-    Network monitoring, controls & Application development, testing, debugging.
-
-    :copyright: (c) 2013-present by Abhinav Singh and contributors.
-    :license: BSD, see LICENSE for more details.
-"""
@@ -1,10 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-    proxy.py
-    ~~~~~~~~
-    ⚡⚡⚡ Fast, Lightweight, Pluggable, TLS interception capable proxy server focused on
-    Network monitoring, controls & Application development, testing, debugging.
-
-    :copyright: (c) 2013-present by Abhinav Singh and contributors.
-    :license: BSD, see LICENSE for more details.
-"""
@@ -61,7 +61,7 @@ run_benchmark() {
}

benchmark_lib() {
-    python ./benchmark/$1/server.py > /dev/null 2>&1 &
+    python ./benchmark/_$1.py > /dev/null 2>&1 &
    local SERVER_PID=$!
    echo "Server (pid:$SERVER_PID) running"
    sleep 1
@@ -75,31 +75,6 @@ benchmark_lib() {
    fi
}

-benchmark_proxy_py() {
-    python -m proxy \
-        --hostname 127.0.0.1 \
-        --port $1 \
-        --backlog 65536 \
-        --open-file-limit 65536 \
-        --enable-web-server \
-        --plugin proxy.plugin.WebServerPlugin \
-        --disable-http-proxy \
-        --num-acceptors 10 \
-        --local-executor \
-        --log-file /dev/null > /dev/null 2>&1 &
-    local SERVER_PID=$!
-    echo "Server (pid:$SERVER_PID) running"
-    sleep 1
-    run_benchmark $1
-    kill -15 $SERVER_PID
-    sleep 1
-    kill -0 $SERVER_PID > /dev/null 2>&1
-    local RUNNING=$?
-    if [ "$RUNNING" == "1" ]; then
-        echo "Server gracefully shutdown"
-    fi
-}

benchmark_asgi() {
    uvicorn \
        --port $1 \
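The removed `benchmark_proxy_py` helper used the `kill -15` / `kill -0` shell idiom to verify graceful shutdown: send SIGTERM, wait a moment, then probe whether the pid is gone. A rough stdlib Python equivalent of that probe sequence (with a hypothetical `sleep` child process standing in for the benchmarked server):

```python
import signal
import subprocess
import time

# Stand-in for the benchmarked server: any long-running child process.
server = subprocess.Popen(['sleep', '60'])

# Equivalent of `kill -15 $SERVER_PID`: request graceful termination.
server.send_signal(signal.SIGTERM)
time.sleep(1)

# Equivalent of `kill -0 $SERVER_PID`: check whether the process is gone.
# poll() returns the exit status once the child has terminated.
if server.poll() is not None:
    print('Server gracefully shutdown')
```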
@@ -120,7 +95,7 @@ benchmark_asgi() {

# echo "============================="
# echo "Benchmarking Proxy.Py"
-# benchmark_proxy_py $PROXYPY_PORT
+# PYTHONPATH=. benchmark_lib proxy $PROXYPY_PORT
# echo "============================="

# echo "============================="
@@ -2,4 +2,4 @@ aiohttp==3.8.1
blacksheep==1.2.1
starlette==0.17.1
tornado==6.1
-uvicorn==0.15.0
+uvicorn==0.16.0
@@ -1,10 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-    proxy.py
-    ~~~~~~~~
-    ⚡⚡⚡ Fast, Lightweight, Pluggable, TLS interception capable proxy server focused on
-    Network monitoring, controls & Application development, testing, debugging.
-
-    :copyright: (c) 2013-present by Abhinav Singh and contributors.
-    :license: BSD, see LICENSE for more details.
-"""
@@ -1,10 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-    proxy.py
-    ~~~~~~~~
-    ⚡⚡⚡ Fast, Lightweight, Pluggable, TLS interception capable proxy server focused on
-    Network monitoring, controls & Application development, testing, debugging.
-
-    :copyright: (c) 2013-present by Abhinav Singh and contributors.
-    :license: BSD, see LICENSE for more details.
-"""
check.py

@@ -60,11 +60,12 @@ lib_help = subprocess.check_output(
with open('README.md', 'rb+') as f:
    c = f.read()
    pre_flags, post_flags = c.split(b'# Flags')
-    help_text, post_changelog = post_flags.split(b'# Changelog')
    f.seek(0)
    f.write(
-        pre_flags + b'# Flags\n\n```console\n\xe2\x9d\xaf proxy -h\n' + lib_help + b'```' +
-        b'\n\n# Changelog' + post_changelog,
+        pre_flags +
+        b'# Flags\n\n```console\n\xe2\x9d\xaf proxy -h\n' +
+        lib_help +
+        b'```\n',
    )

# Version is also hardcoded in README.md flags section
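The `check.py` hunk above rewrites the Flags section of `README.md` in place by splitting the file's bytes on a section marker and reassembling them around freshly generated help output. The same split-and-reassemble pattern can be sketched standalone (the README contents and help text below are hypothetical stand-ins):

```python
# Hypothetical stand-in for README.md contents and generated CLI help.
readme = b'intro text\n\n# Flags\n\nstale help output\n'
lib_help = b'usage: proxy [-h] ...\n'

# Split once on the section marker, keep everything before it,
# then reassemble with the fresh help output fenced as console output.
pre_flags, _post_flags = readme.split(b'# Flags')
updated = (
    pre_flags +
    b'# Flags\n\n```console\n' +
    lib_help +
    b'```\n'
)
print(updated.decode())
```

In the real script the reassembled bytes are written back over the open file with `f.seek(0)` followed by `f.write(...)`.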
@@ -9,20 +9,20 @@
    "version": "1.0.0",
    "license": "BSD-3-Clause",
    "devDependencies": {
-      "@types/jasmine": "^3.6.1",
+      "@types/jasmine": "^3.10.2",
      "@types/jquery": "^3.5.4",
-      "@types/js-cookie": "^2.2.6",
+      "@types/js-cookie": "^3.0.1",
      "@typescript-eslint/eslint-plugin": "^2.34.0",
      "@typescript-eslint/parser": "^2.34.0",
-      "chrome-devtools-frontend": "^1.0.944903",
+      "chrome-devtools-frontend": "^1.0.949424",
      "eslint": "^6.8.0",
      "eslint-config-standard": "^14.1.1",
      "eslint-plugin-import": "^2.25.3",
-      "eslint-plugin-node": "^10.0.0",
+      "eslint-plugin-node": "^11.1.0",
      "eslint-plugin-promise": "^4.2.1",
      "eslint-plugin-standard": "^5.0.0",
-      "http-server": "^0.12.3",
-      "jasmine": "^3.6.3",
+      "http-server": "^14.0.0",
+      "jasmine": "^3.10.0",
      "jasmine-ts": "^0.3.0",
      "jquery": "^3.5.1",
      "js-cookie": "^3.0.1",
@@ -34,7 +34,7 @@
      "rollup-plugin-typescript": "^1.0.1",
      "ts-node": "^7.0.1",
      "typescript": "^3.9.7",
-      "ws": "^7.4.6"
+      "ws": "^8.3.0"
    }
  },
  "node_modules/@babel/code-frame": {
@@ -136,9 +136,9 @@
    }
  },
  "node_modules/@types/jasmine": {
-    "version": "3.6.1",
-    "resolved": "https://registry.npmjs.org/@types/jasmine/-/jasmine-3.6.1.tgz",
-    "integrity": "sha512-eeSCVhBsgwHNS1FmaMu4zrLxfykCTWJMLFZv7lmyrZQjw7foUUXoPu4GukSN9v7JvUw7X+/aDH3kCaymirBSTg==",
+    "version": "3.10.2",
+    "resolved": "https://registry.npmjs.org/@types/jasmine/-/jasmine-3.10.2.tgz",
+    "integrity": "sha512-qs4xjVm4V/XjM6owGm/x6TNmhGl5iKX8dkTdsgdgl9oFnqgzxLepnS7rN9Tdo7kDmnFD/VEqKrW57cGD2odbEg==",
    "dev": true
  },
  "node_modules/@types/jquery": {
@@ -151,9 +151,9 @@
    }
  },
  "node_modules/@types/js-cookie": {
-    "version": "2.2.6",
-    "resolved": "https://registry.npmjs.org/@types/js-cookie/-/js-cookie-2.2.6.tgz",
-    "integrity": "sha512-+oY0FDTO2GYKEV0YPvSshGq9t7YozVkgvXLty7zogQNuCxBhT9/3INX9Q7H1aRZ4SUDRXAKlJuA4EA5nTt7SNw==",
+    "version": "3.0.1",
+    "resolved": "https://registry.npmjs.org/@types/js-cookie/-/js-cookie-3.0.1.tgz",
+    "integrity": "sha512-7wg/8gfHltklehP+oyJnZrz9XBuX5ZPP4zB6UsI84utdlkRYLnOm2HfpLXazTwZA+fpGn0ir8tGNgVnMEleBGQ==",
    "dev": true
  },
  "node_modules/@types/json-schema": {
@@ -613,14 +613,23 @@
    "dev": true
  },
  "node_modules/basic-auth": {
-    "version": "1.1.0",
-    "resolved": "https://registry.npmjs.org/basic-auth/-/basic-auth-1.1.0.tgz",
-    "integrity": "sha1-RSIe5Cn37h5QNb4/UVM/HN/SmIQ=",
+    "version": "2.0.1",
+    "resolved": "https://registry.npmjs.org/basic-auth/-/basic-auth-2.0.1.tgz",
+    "integrity": "sha512-NF+epuEdnUYVlGuhaxbbq+dvJttwLnGY+YixlXlME5KpQ5W3CnXA5cVTneY3SPbPDRkcjMbifrwmFYcClgOZeg==",
    "dev": true,
+    "dependencies": {
+      "safe-buffer": "5.1.2"
+    },
    "engines": {
-      "node": ">= 0.6"
+      "node": ">= 0.8"
    }
  },
+  "node_modules/basic-auth/node_modules/safe-buffer": {
+    "version": "5.1.2",
+    "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz",
+    "integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==",
+    "dev": true
+  },
  "node_modules/bcrypt-pbkdf": {
    "version": "1.0.2",
    "resolved": "https://registry.npmjs.org/bcrypt-pbkdf/-/bcrypt-pbkdf-1.0.2.tgz",
@@ -719,9 +728,9 @@
    }
  },
  "node_modules/chrome-devtools-frontend": {
-    "version": "1.0.944903",
-    "resolved": "https://registry.npmjs.org/chrome-devtools-frontend/-/chrome-devtools-frontend-1.0.944903.tgz",
-    "integrity": "sha512-0AX3fSoR7l33Kxb4+U1QFbH4SkSKv4mhawDeex0CmbsmsdtfybI7y4NvN4Fen/+w5j/g4m6t79STQ8pjI+NrQA==",
+    "version": "1.0.949424",
+    "resolved": "https://registry.npmjs.org/chrome-devtools-frontend/-/chrome-devtools-frontend-1.0.949424.tgz",
+    "integrity": "sha512-v4A+tyfJia6yOonl0I1M3lXYIV9J6idJ49+dT2TK7Z9SmlNd1ZPCcXDi3sBWBkpxdHJfpzl1We8HhUTh2VX5FA==",
    "dev": true
  },
  "node_modules/cli-cursor": {
@@ -1020,22 +1029,6 @@
      "safer-buffer": "^2.1.0"
    }
  },
-  "node_modules/ecstatic": {
-    "version": "3.3.2",
-    "resolved": "https://registry.npmjs.org/ecstatic/-/ecstatic-3.3.2.tgz",
-    "integrity": "sha512-fLf9l1hnwrHI2xn9mEDT7KIi22UDqA2jaCwyCbSUJh9a1V+LEUSL/JO/6TIz/QyuBURWUHrFL5Kg2TtO1bkkog==",
-    "deprecated": "This package is unmaintained and deprecated. See the GH Issue 259.",
-    "dev": true,
-    "dependencies": {
-      "he": "^1.1.1",
-      "mime": "^1.6.0",
-      "minimist": "^1.1.0",
-      "url-join": "^2.0.5"
-    },
-    "bin": {
-      "ecstatic": "lib/ecstatic.js"
-    }
-  },
  "node_modules/emoji-regex": {
    "version": "8.0.0",
    "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
@@ -1246,31 +1239,6 @@
      "node": ">=4"
    }
  },
-  "node_modules/eslint-plugin-es": {
-    "version": "2.0.0",
-    "resolved": "https://registry.npmjs.org/eslint-plugin-es/-/eslint-plugin-es-2.0.0.tgz",
-    "integrity": "sha512-f6fceVtg27BR02EYnBhgWLFQfK6bN4Ll0nQFrBHOlCsAyxeZkn0NHns5O0YZOPrV1B3ramd6cgFwaoFLcSkwEQ==",
-    "dev": true,
-    "dependencies": {
-      "eslint-utils": "^1.4.2",
-      "regexpp": "^3.0.0"
-    },
-    "engines": {
-      "node": ">=8.10.0"
-    },
-    "peerDependencies": {
-      "eslint": ">=4.19.1"
-    }
-  },
-  "node_modules/eslint-plugin-es/node_modules/regexpp": {
-    "version": "3.0.0",
-    "resolved": "https://registry.npmjs.org/regexpp/-/regexpp-3.0.0.tgz",
-    "integrity": "sha512-Z+hNr7RAVWxznLPuA7DIh8UNX1j9CDrUQxskw9IrBE1Dxue2lyXT+shqEIeLUjrokxIP8CMy1WkjgG3rTsd5/g==",
-    "dev": true,
-    "engines": {
-      "node": ">=8"
-    }
-  },
  "node_modules/eslint-plugin-import": {
    "version": "2.25.3",
    "resolved": "https://registry.npmjs.org/eslint-plugin-import/-/eslint-plugin-import-2.25.3.tgz",
@@ -1326,13 +1294,13 @@
    "dev": true
  },
  "node_modules/eslint-plugin-node": {
-    "version": "10.0.0",
-    "resolved": "https://registry.npmjs.org/eslint-plugin-node/-/eslint-plugin-node-10.0.0.tgz",
-    "integrity": "sha512-1CSyM/QCjs6PXaT18+zuAXsjXGIGo5Rw630rSKwokSs2jrYURQc4R5JZpoanNCqwNmepg+0eZ9L7YiRUJb8jiQ==",
+    "version": "11.1.0",
+    "resolved": "https://registry.npmjs.org/eslint-plugin-node/-/eslint-plugin-node-11.1.0.tgz",
+    "integrity": "sha512-oUwtPJ1W0SKD0Tr+wqu92c5xuCeQqB3hSCHasn/ZgjFdA9iDGNkNf2Zi9ztY7X+hNuMib23LNGRm6+uN+KLE3g==",
    "dev": true,
    "dependencies": {
-      "eslint-plugin-es": "^2.0.0",
-      "eslint-utils": "^1.4.2",
+      "eslint-plugin-es": "^3.0.0",
+      "eslint-utils": "^2.0.0",
      "ignore": "^5.1.1",
      "minimatch": "^3.0.4",
      "resolve": "^1.10.1",
@@ -1345,6 +1313,40 @@
      "eslint": ">=5.16.0"
    }
  },
+  "node_modules/eslint-plugin-node/node_modules/eslint-plugin-es": {
+    "version": "3.0.1",
+    "resolved": "https://registry.npmjs.org/eslint-plugin-es/-/eslint-plugin-es-3.0.1.tgz",
+    "integrity": "sha512-GUmAsJaN4Fc7Gbtl8uOBlayo2DqhwWvEzykMHSCZHU3XdJ+NSzzZcVhXh3VxX5icqQ+oQdIEawXX8xkR3mIFmQ==",
+    "dev": true,
+    "dependencies": {
+      "eslint-utils": "^2.0.0",
+      "regexpp": "^3.0.0"
+    },
+    "engines": {
+      "node": ">=8.10.0"
+    },
+    "funding": {
+      "url": "https://github.com/sponsors/mysticatea"
+    },
+    "peerDependencies": {
+      "eslint": ">=4.19.1"
+    }
+  },
+  "node_modules/eslint-plugin-node/node_modules/eslint-utils": {
+    "version": "2.1.0",
+    "resolved": "https://registry.npmjs.org/eslint-utils/-/eslint-utils-2.1.0.tgz",
+    "integrity": "sha512-w94dQYoauyvlDc43XnGB8lU3Zt713vNChgt4EWwhXAP2XkBvndfxF0AgIqKOOasjPIPzj9JqgwkwbCYD0/V3Zg==",
+    "dev": true,
+    "dependencies": {
+      "eslint-visitor-keys": "^1.1.0"
+    },
+    "engines": {
+      "node": ">=6"
+    },
+    "funding": {
+      "url": "https://github.com/sponsors/mysticatea"
+    }
+  },
+  "node_modules/eslint-plugin-node/node_modules/semver": {
+    "version": "6.3.0",
+    "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz",
@@ -2444,28 +2446,66 @@
    }
  },
  "node_modules/http-server": {
-    "version": "0.12.3",
-    "resolved": "https://registry.npmjs.org/http-server/-/http-server-0.12.3.tgz",
-    "integrity": "sha512-be0dKG6pni92bRjq0kvExtj/NrrAd28/8fCXkaI/4piTwQMSDSLMhWyW0NI1V+DBI3aa1HMlQu46/HjVLfmugA==",
+    "version": "14.0.0",
+    "resolved": "https://registry.npmjs.org/http-server/-/http-server-14.0.0.tgz",
+    "integrity": "sha512-XTePIXAo5x72bI8SlKFSqsg7UuSHwsOa4+RJIe56YeMUvfTvGDy7TxFkTEhfIRmM/Dnf6x29ut541ythSBZdkQ==",
    "dev": true,
    "dependencies": {
-      "basic-auth": "^1.0.3",
+      "basic-auth": "^2.0.1",
      "colors": "^1.4.0",
      "corser": "^2.0.1",
-      "ecstatic": "^3.3.2",
-      "http-proxy": "^1.18.0",
+      "he": "^1.2.0",
+      "html-encoding-sniffer": "^3.0.0",
+      "http-proxy": "^1.18.1",
      "mime": "^1.6.0",
      "minimist": "^1.2.5",
      "opener": "^1.5.1",
-      "portfinder": "^1.0.25",
+      "portfinder": "^1.0.28",
      "secure-compare": "3.0.1",
-      "union": "~0.5.0"
+      "union": "~0.5.0",
+      "url-join": "^4.0.1"
    },
    "bin": {
      "hs": "bin/http-server",
      "http-server": "bin/http-server"
    },
    "engines": {
-      "node": ">=6"
+      "node": ">=12"
    }
  },
+  "node_modules/http-server/node_modules/html-encoding-sniffer": {
+    "version": "3.0.0",
+    "resolved": "https://registry.npmjs.org/html-encoding-sniffer/-/html-encoding-sniffer-3.0.0.tgz",
+    "integrity": "sha512-oWv4T4yJ52iKrufjnyZPkrN0CH3QnrUqdB6In1g5Fe1mia8GmF36gnfNySxoZtxD5+NmYw1EElVXiBk93UeskA==",
+    "dev": true,
+    "dependencies": {
+      "whatwg-encoding": "^2.0.0"
+    },
+    "engines": {
+      "node": ">=12"
+    }
+  },
+  "node_modules/http-server/node_modules/iconv-lite": {
+    "version": "0.6.3",
+    "resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.6.3.tgz",
+    "integrity": "sha512-4fCk79wshMdzMp2rH06qWrJE4iolqLhCUH+OiuIgU++RB0+94NlDL81atO7GX55uUKueo0txHNtvEyI6D7WdMw==",
+    "dev": true,
+    "dependencies": {
+      "safer-buffer": ">= 2.1.2 < 3.0.0"
+    },
+    "engines": {
+      "node": ">=0.10.0"
+    }
+  },
+  "node_modules/http-server/node_modules/whatwg-encoding": {
+    "version": "2.0.0",
+    "resolved": "https://registry.npmjs.org/whatwg-encoding/-/whatwg-encoding-2.0.0.tgz",
+    "integrity": "sha512-p41ogyeMUrw3jWclHWTQg1k05DSVXPLcVxRTYsXUk+ZooOCZLcoYgPZ/HL/D/N+uQPOtcp1me1WhBEaX02mhWg==",
+    "dev": true,
+    "dependencies": {
+      "iconv-lite": "0.6.3"
+    },
+    "engines": {
+      "node": ">=12"
+    }
+  },
  "node_modules/http-signature": {
@@ -2869,25 +2909,22 @@
    "dev": true
  },
  "node_modules/jasmine": {
-    "version": "3.6.3",
-    "resolved": "https://registry.npmjs.org/jasmine/-/jasmine-3.6.3.tgz",
-    "integrity": "sha512-Th91zHsbsALWjDUIiU5d/W5zaYQsZFMPTdeNmi8GivZPmAaUAK8MblSG3yQI4VMGC/abF2us7ex60NH1AAIMTA==",
+    "version": "3.10.0",
+    "resolved": "https://registry.npmjs.org/jasmine/-/jasmine-3.10.0.tgz",
+    "integrity": "sha512-2Y42VsC+3CQCTzTwJezOvji4qLORmKIE0kwowWC+934Krn6ZXNQYljiwK5st9V3PVx96BSiDYXSB60VVah3IlQ==",
    "dev": true,
    "dependencies": {
      "glob": "^7.1.6",
-      "jasmine-core": "~3.6.0"
+      "jasmine-core": "~3.10.0"
    },
    "bin": {
      "jasmine": "bin/jasmine.js"
    },
    "engines": {
      "node": "^10 || ^12 || ^14"
    }
  },
  "node_modules/jasmine-core": {
-    "version": "3.6.0",
-    "resolved": "https://registry.npmjs.org/jasmine-core/-/jasmine-core-3.6.0.tgz",
-    "integrity": "sha512-8uQYa7zJN8hq9z+g8z1bqCfdC8eoDAeVnM5sfqs7KHv9/ifoJ500m018fpFc7RDaO6SWCLCXwo/wPSNcdYTgcw==",
+    "version": "3.10.1",
+    "resolved": "https://registry.npmjs.org/jasmine-core/-/jasmine-core-3.10.1.tgz",
+    "integrity": "sha512-ooZWSDVAdh79Rrj4/nnfklL3NQVra0BcuhcuWoAwwi+znLDoUeH87AFfeX8s+YeYi6xlv5nveRyaA1v7CintfA==",
    "dev": true
  },
  "node_modules/jasmine-ts": {
@@ -3347,6 +3384,27 @@
      }
    }
  },
+  "node_modules/jsdom/node_modules/ws": {
+    "version": "7.5.6",
+    "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.6.tgz",
+    "integrity": "sha512-6GLgCqo2cy2A2rjCNFlxQS6ZljG/coZfZXclldI8FB/1G3CCI36Zd8xy2HrFVACi8tfk5XrgLQEk+P0Tnz9UcA==",
+    "dev": true,
+    "engines": {
+      "node": ">=8.3.0"
+    },
+    "peerDependencies": {
+      "bufferutil": "^4.0.1",
+      "utf-8-validate": "^5.0.2"
+    },
+    "peerDependenciesMeta": {
+      "bufferutil": {
+        "optional": true
+      },
+      "utf-8-validate": {
+        "optional": true
+      }
+    }
+  },
  "node_modules/json-schema": {
    "version": "0.2.3",
    "resolved": "https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz",
@@ -4070,14 +4128,14 @@
    "dev": true
  },
  "node_modules/portfinder": {
-    "version": "1.0.26",
-    "resolved": "https://registry.npmjs.org/portfinder/-/portfinder-1.0.26.tgz",
-    "integrity": "sha512-Xi7mKxJHHMI3rIUrnm/jjUgwhbYMkp/XKEcZX3aG4BrumLpq3nmoQMX+ClYnDZnZ/New7IatC1no5RX0zo1vXQ==",
+    "version": "1.0.28",
+    "resolved": "https://registry.npmjs.org/portfinder/-/portfinder-1.0.28.tgz",
+    "integrity": "sha512-Se+2isanIcEqf2XMHjyUKskczxbPH7dQnlMjXX6+dybayyHvAf/TCgyMRlzf/B6QDhAEFOGes0pzRo3by4AbMA==",
    "dev": true,
    "dependencies": {
      "async": "^2.6.2",
      "debug": "^3.1.1",
-      "mkdirp": "^0.5.1"
+      "mkdirp": "^0.5.5"
    },
    "engines": {
      "node": ">= 0.12.0"
@@ -5082,9 +5140,9 @@
    }
  },
  "node_modules/url-join": {
-    "version": "2.0.5",
-    "resolved": "https://registry.npmjs.org/url-join/-/url-join-2.0.5.tgz",
-    "integrity": "sha1-WvIvGMBSoACkjXuCxenC4v7tpyg=",
+    "version": "4.0.1",
+    "resolved": "https://registry.npmjs.org/url-join/-/url-join-4.0.1.tgz",
+    "integrity": "sha512-jk1+QP6ZJqyOiuEI9AEWQfju/nB2Pw466kbA0LEZljHwKeMgd9WrAEgEGxjPDD2+TNbbb37rTyhEfrCXfuKXnA==",
    "dev": true
  },
  "node_modules/uuid": {
@@ -5274,12 +5332,12 @@
    }
  },
  "node_modules/ws": {
-    "version": "7.4.6",
-    "resolved": "https://registry.npmjs.org/ws/-/ws-7.4.6.tgz",
-    "integrity": "sha512-YmhHDO4MzaDLB+M9ym/mDA5z0naX8j7SIlT8f8z+I0VtzsRbekxEutHSme7NPS2qE8StCYQNUnfWdXta/Yu85A==",
+    "version": "8.3.0",
+    "resolved": "https://registry.npmjs.org/ws/-/ws-8.3.0.tgz",
+    "integrity": "sha512-Gs5EZtpqZzLvmIM59w4igITU57lrtYVFneaa434VROv4thzJyV6UjIL3D42lslWlI+D4KzLYnxSwtfuiO79sNw==",
    "dev": true,
    "engines": {
-      "node": ">=8.3.0"
+      "node": ">=10.0.0"
    },
    "peerDependencies": {
      "bufferutil": "^4.0.1",
@@ -5451,9 +5509,9 @@
    }
  },
  "@types/jasmine": {
-    "version": "3.6.1",
-    "resolved": "https://registry.npmjs.org/@types/jasmine/-/jasmine-3.6.1.tgz",
-    "integrity": "sha512-eeSCVhBsgwHNS1FmaMu4zrLxfykCTWJMLFZv7lmyrZQjw7foUUXoPu4GukSN9v7JvUw7X+/aDH3kCaymirBSTg==",
+    "version": "3.10.2",
+    "resolved": "https://registry.npmjs.org/@types/jasmine/-/jasmine-3.10.2.tgz",
+    "integrity": "sha512-qs4xjVm4V/XjM6owGm/x6TNmhGl5iKX8dkTdsgdgl9oFnqgzxLepnS7rN9Tdo7kDmnFD/VEqKrW57cGD2odbEg==",
    "dev": true
  },
  "@types/jquery": {
@@ -5466,9 +5524,9 @@
    }
  },
  "@types/js-cookie": {
-    "version": "2.2.6",
-    "resolved": "https://registry.npmjs.org/@types/js-cookie/-/js-cookie-2.2.6.tgz",
-    "integrity": "sha512-+oY0FDTO2GYKEV0YPvSshGq9t7YozVkgvXLty7zogQNuCxBhT9/3INX9Q7H1aRZ4SUDRXAKlJuA4EA5nTt7SNw==",
+    "version": "3.0.1",
+    "resolved": "https://registry.npmjs.org/@types/js-cookie/-/js-cookie-3.0.1.tgz",
+    "integrity": "sha512-7wg/8gfHltklehP+oyJnZrz9XBuX5ZPP4zB6UsI84utdlkRYLnOm2HfpLXazTwZA+fpGn0ir8tGNgVnMEleBGQ==",
    "dev": true
  },
  "@types/json-schema": {
@@ -5812,10 +5870,21 @@
    "dev": true
  },
  "basic-auth": {
-    "version": "1.1.0",
-    "resolved": "https://registry.npmjs.org/basic-auth/-/basic-auth-1.1.0.tgz",
-    "integrity": "sha1-RSIe5Cn37h5QNb4/UVM/HN/SmIQ=",
-    "dev": true
+    "version": "2.0.1",
+    "resolved": "https://registry.npmjs.org/basic-auth/-/basic-auth-2.0.1.tgz",
+    "integrity": "sha512-NF+epuEdnUYVlGuhaxbbq+dvJttwLnGY+YixlXlME5KpQ5W3CnXA5cVTneY3SPbPDRkcjMbifrwmFYcClgOZeg==",
+    "dev": true,
+    "requires": {
+      "safe-buffer": "5.1.2"
+    },
+    "dependencies": {
+      "safe-buffer": {
+        "version": "5.1.2",
+        "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz",
+        "integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==",
+        "dev": true
+      }
+    }
  },
  "bcrypt-pbkdf": {
    "version": "1.0.2",
@@ -5900,9 +5969,9 @@
    "dev": true
  },
  "chrome-devtools-frontend": {
-    "version": "1.0.944903",
-    "resolved": "https://registry.npmjs.org/chrome-devtools-frontend/-/chrome-devtools-frontend-1.0.944903.tgz",
-    "integrity": "sha512-0AX3fSoR7l33Kxb4+U1QFbH4SkSKv4mhawDeex0CmbsmsdtfybI7y4NvN4Fen/+w5j/g4m6t79STQ8pjI+NrQA==",
+    "version": "1.0.949424",
+    "resolved": "https://registry.npmjs.org/chrome-devtools-frontend/-/chrome-devtools-frontend-1.0.949424.tgz",
+    "integrity": "sha512-v4A+tyfJia6yOonl0I1M3lXYIV9J6idJ49+dT2TK7Z9SmlNd1ZPCcXDi3sBWBkpxdHJfpzl1We8HhUTh2VX5FA==",
    "dev": true
  },
  "cli-cursor": {
@@ -6157,18 +6226,6 @@
      "safer-buffer": "^2.1.0"
    }
  },
-  "ecstatic": {
-    "version": "3.3.2",
-    "resolved": "https://registry.npmjs.org/ecstatic/-/ecstatic-3.3.2.tgz",
-    "integrity": "sha512-fLf9l1hnwrHI2xn9mEDT7KIi22UDqA2jaCwyCbSUJh9a1V+LEUSL/JO/6TIz/QyuBURWUHrFL5Kg2TtO1bkkog==",
-    "dev": true,
-    "requires": {
-      "he": "^1.1.1",
-      "mime": "^1.6.0",
-      "minimist": "^1.1.0",
-      "url-join": "^2.0.5"
-    }
-  },
  "emoji-regex": {
    "version": "8.0.0",
    "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
@@ -6622,24 +6679,6 @@
      "pkg-dir": "^2.0.0"
    }
  },
-  "eslint-plugin-es": {
-    "version": "2.0.0",
-    "resolved": "https://registry.npmjs.org/eslint-plugin-es/-/eslint-plugin-es-2.0.0.tgz",
-    "integrity": "sha512-f6fceVtg27BR02EYnBhgWLFQfK6bN4Ll0nQFrBHOlCsAyxeZkn0NHns5O0YZOPrV1B3ramd6cgFwaoFLcSkwEQ==",
-    "dev": true,
-    "requires": {
-      "eslint-utils": "^1.4.2",
-      "regexpp": "^3.0.0"
-    },
-    "dependencies": {
-      "regexpp": {
-        "version": "3.0.0",
-        "resolved": "https://registry.npmjs.org/regexpp/-/regexpp-3.0.0.tgz",
-        "integrity": "sha512-Z+hNr7RAVWxznLPuA7DIh8UNX1j9CDrUQxskw9IrBE1Dxue2lyXT+shqEIeLUjrokxIP8CMy1WkjgG3rTsd5/g==",
-        "dev": true
-      }
-    }
-  },
  "eslint-plugin-import": {
    "version": "2.25.3",
    "resolved": "https://registry.npmjs.org/eslint-plugin-import/-/eslint-plugin-import-2.25.3.tgz",
@@ -6688,19 +6727,38 @@
    }
  },
  "eslint-plugin-node": {
-    "version": "10.0.0",
-    "resolved": "https://registry.npmjs.org/eslint-plugin-node/-/eslint-plugin-node-10.0.0.tgz",
-    "integrity": "sha512-1CSyM/QCjs6PXaT18+zuAXsjXGIGo5Rw630rSKwokSs2jrYURQc4R5JZpoanNCqwNmepg+0eZ9L7YiRUJb8jiQ==",
+    "version": "11.1.0",
+    "resolved": "https://registry.npmjs.org/eslint-plugin-node/-/eslint-plugin-node-11.1.0.tgz",
+    "integrity": "sha512-oUwtPJ1W0SKD0Tr+wqu92c5xuCeQqB3hSCHasn/ZgjFdA9iDGNkNf2Zi9ztY7X+hNuMib23LNGRm6+uN+KLE3g==",
    "dev": true,
    "requires": {
-      "eslint-plugin-es": "^2.0.0",
-      "eslint-utils": "^1.4.2",
+      "eslint-plugin-es": "^3.0.0",
+      "eslint-utils": "^2.0.0",
      "ignore": "^5.1.1",
      "minimatch": "^3.0.4",
      "resolve": "^1.10.1",
      "semver": "^6.1.0"
    },
+    "dependencies": {
+      "eslint-plugin-es": {
+        "version": "3.0.1",
+        "resolved": "https://registry.npmjs.org/eslint-plugin-es/-/eslint-plugin-es-3.0.1.tgz",
+        "integrity": "sha512-GUmAsJaN4Fc7Gbtl8uOBlayo2DqhwWvEzykMHSCZHU3XdJ+NSzzZcVhXh3VxX5icqQ+oQdIEawXX8xkR3mIFmQ==",
+        "dev": true,
+        "requires": {
+          "eslint-utils": "^2.0.0",
+          "regexpp": "^3.0.0"
+        }
+      },
+      "eslint-utils": {
+        "version": "2.1.0",
+        "resolved": "https://registry.npmjs.org/eslint-utils/-/eslint-utils-2.1.0.tgz",
+        "integrity": "sha512-w94dQYoauyvlDc43XnGB8lU3Zt713vNChgt4EWwhXAP2XkBvndfxF0AgIqKOOasjPIPzj9JqgwkwbCYD0/V3Zg==",
+        "dev": true,
+        "requires": {
+          "eslint-visitor-keys": "^1.1.0"
+        }
+      },
+      "semver": {
+        "version": "6.3.0",
+        "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz",
@@ -7239,21 +7297,53 @@
    }
  },
  "http-server": {
-    "version": "0.12.3",
-    "resolved": "https://registry.npmjs.org/http-server/-/http-server-0.12.3.tgz",
-    "integrity": "sha512-be0dKG6pni92bRjq0kvExtj/NrrAd28/8fCXkaI/4piTwQMSDSLMhWyW0NI1V+DBI3aa1HMlQu46/HjVLfmugA==",
+    "version": "14.0.0",
+    "resolved": "https://registry.npmjs.org/http-server/-/http-server-14.0.0.tgz",
+    "integrity": "sha512-XTePIXAo5x72bI8SlKFSqsg7UuSHwsOa4+RJIe56YeMUvfTvGDy7TxFkTEhfIRmM/Dnf6x29ut541ythSBZdkQ==",
    "dev": true,
    "requires": {
-      "basic-auth": "^1.0.3",
+      "basic-auth": "^2.0.1",
      "colors": "^1.4.0",
      "corser": "^2.0.1",
-      "ecstatic": "^3.3.2",
-      "http-proxy": "^1.18.0",
+      "he": "^1.2.0",
+      "html-encoding-sniffer": "^3.0.0",
+      "http-proxy": "^1.18.1",
      "mime": "^1.6.0",
      "minimist": "^1.2.5",
      "opener": "^1.5.1",
-      "portfinder": "^1.0.25",
+      "portfinder": "^1.0.28",
      "secure-compare": "3.0.1",
-      "union": "~0.5.0"
+      "union": "~0.5.0",
+      "url-join": "^4.0.1"
    },
+    "dependencies": {
+      "html-encoding-sniffer": {
+        "version": "3.0.0",
+        "resolved": "https://registry.npmjs.org/html-encoding-sniffer/-/html-encoding-sniffer-3.0.0.tgz",
+        "integrity": "sha512-oWv4T4yJ52iKrufjnyZPkrN0CH3QnrUqdB6In1g5Fe1mia8GmF36gnfNySxoZtxD5+NmYw1EElVXiBk93UeskA==",
+        "dev": true,
+        "requires": {
+          "whatwg-encoding": "^2.0.0"
+        }
+      },
+      "iconv-lite": {
+        "version": "0.6.3",
+        "resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.6.3.tgz",
+        "integrity": "sha512-4fCk79wshMdzMp2rH06qWrJE4iolqLhCUH+OiuIgU++RB0+94NlDL81atO7GX55uUKueo0txHNtvEyI6D7WdMw==",
+        "dev": true,
+        "requires": {
+          "safer-buffer": ">= 2.1.2 < 3.0.0"
+        }
+      },
+      "whatwg-encoding": {
+        "version": "2.0.0",
+        "resolved": "https://registry.npmjs.org/whatwg-encoding/-/whatwg-encoding-2.0.0.tgz",
+        "integrity": "sha512-p41ogyeMUrw3jWclHWTQg1k05DSVXPLcVxRTYsXUk+ZooOCZLcoYgPZ/HL/D/N+uQPOtcp1me1WhBEaX02mhWg==",
+        "dev": true,
+        "requires": {
+          "iconv-lite": "0.6.3"
+        }
+      }
+    }
  },
  "http-signature": {
@@ -7553,19 +7643,19 @@
      "dev": true
    },
    "jasmine": {
-     "version": "3.6.3",
-     "resolved": "https://registry.npmjs.org/jasmine/-/jasmine-3.6.3.tgz",
-     "integrity": "sha512-Th91zHsbsALWjDUIiU5d/W5zaYQsZFMPTdeNmi8GivZPmAaUAK8MblSG3yQI4VMGC/abF2us7ex60NH1AAIMTA==",
+     "version": "3.10.0",
+     "resolved": "https://registry.npmjs.org/jasmine/-/jasmine-3.10.0.tgz",
+     "integrity": "sha512-2Y42VsC+3CQCTzTwJezOvji4qLORmKIE0kwowWC+934Krn6ZXNQYljiwK5st9V3PVx96BSiDYXSB60VVah3IlQ==",
      "dev": true,
      "requires": {
        "glob": "^7.1.6",
-       "jasmine-core": "~3.6.0"
+       "jasmine-core": "~3.10.0"
      }
    },
    "jasmine-core": {
-     "version": "3.6.0",
-     "resolved": "https://registry.npmjs.org/jasmine-core/-/jasmine-core-3.6.0.tgz",
-     "integrity": "sha512-8uQYa7zJN8hq9z+g8z1bqCfdC8eoDAeVnM5sfqs7KHv9/ifoJ500m018fpFc7RDaO6SWCLCXwo/wPSNcdYTgcw==",
+     "version": "3.10.1",
+     "resolved": "https://registry.npmjs.org/jasmine-core/-/jasmine-core-3.10.1.tgz",
+     "integrity": "sha512-ooZWSDVAdh79Rrj4/nnfklL3NQVra0BcuhcuWoAwwi+znLDoUeH87AFfeX8s+YeYi6xlv5nveRyaA1v7CintfA==",
      "dev": true
    },
    "jasmine-ts": {
@@ -7917,6 +8007,15 @@
        "whatwg-url": "^7.0.0",
        "ws": "^7.0.0",
        "xml-name-validator": "^3.0.0"
      },
+     "dependencies": {
+       "ws": {
+         "version": "7.5.6",
+         "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.6.tgz",
+         "integrity": "sha512-6GLgCqo2cy2A2rjCNFlxQS6ZljG/coZfZXclldI8FB/1G3CCI36Zd8xy2HrFVACi8tfk5XrgLQEk+P0Tnz9UcA==",
+         "dev": true,
+         "requires": {}
+       }
+     }
    },
    "json-schema": {
@@ -8487,14 +8586,14 @@
      "dev": true
    },
    "portfinder": {
-     "version": "1.0.26",
-     "resolved": "https://registry.npmjs.org/portfinder/-/portfinder-1.0.26.tgz",
-     "integrity": "sha512-Xi7mKxJHHMI3rIUrnm/jjUgwhbYMkp/XKEcZX3aG4BrumLpq3nmoQMX+ClYnDZnZ/New7IatC1no5RX0zo1vXQ==",
+     "version": "1.0.28",
+     "resolved": "https://registry.npmjs.org/portfinder/-/portfinder-1.0.28.tgz",
+     "integrity": "sha512-Se+2isanIcEqf2XMHjyUKskczxbPH7dQnlMjXX6+dybayyHvAf/TCgyMRlzf/B6QDhAEFOGes0pzRo3by4AbMA==",
      "dev": true,
      "requires": {
        "async": "^2.6.2",
        "debug": "^3.1.1",
-       "mkdirp": "^0.5.1"
+       "mkdirp": "^0.5.5"
      }
    },
    "prelude-ls": {
@@ -9270,9 +9369,9 @@
      }
    },
    "url-join": {
-     "version": "2.0.5",
-     "resolved": "https://registry.npmjs.org/url-join/-/url-join-2.0.5.tgz",
-     "integrity": "sha1-WvIvGMBSoACkjXuCxenC4v7tpyg=",
+     "version": "4.0.1",
+     "resolved": "https://registry.npmjs.org/url-join/-/url-join-4.0.1.tgz",
+     "integrity": "sha512-jk1+QP6ZJqyOiuEI9AEWQfju/nB2Pw466kbA0LEZljHwKeMgd9WrAEgEGxjPDD2+TNbbb37rTyhEfrCXfuKXnA==",
      "dev": true
    },
    "uuid": {
@@ -9439,9 +9538,9 @@
      }
    },
    "ws": {
-     "version": "7.4.6",
-     "resolved": "https://registry.npmjs.org/ws/-/ws-7.4.6.tgz",
-     "integrity": "sha512-YmhHDO4MzaDLB+M9ym/mDA5z0naX8j7SIlT8f8z+I0VtzsRbekxEutHSme7NPS2qE8StCYQNUnfWdXta/Yu85A==",
+     "version": "8.3.0",
+     "resolved": "https://registry.npmjs.org/ws/-/ws-8.3.0.tgz",
+     "integrity": "sha512-Gs5EZtpqZzLvmIM59w4igITU57lrtYVFneaa434VROv4thzJyV6UjIL3D42lslWlI+D4KzLYnxSwtfuiO79sNw==",
      "dev": true,
      "requires": {}
    },
@@ -25,20 +25,20 @@
  },
  "homepage": "https://github.com/abhinavsingh/proxy.py#readme",
  "devDependencies": {
-   "@types/jasmine": "^3.6.1",
+   "@types/jasmine": "^3.10.2",
    "@types/jquery": "^3.5.4",
-   "@types/js-cookie": "^2.2.6",
+   "@types/js-cookie": "^3.0.1",
    "@typescript-eslint/eslint-plugin": "^2.34.0",
    "@typescript-eslint/parser": "^2.34.0",
-   "chrome-devtools-frontend": "^1.0.944903",
+   "chrome-devtools-frontend": "^1.0.949424",
    "eslint": "^6.8.0",
    "eslint-config-standard": "^14.1.1",
    "eslint-plugin-import": "^2.25.3",
-   "eslint-plugin-node": "^10.0.0",
+   "eslint-plugin-node": "^11.1.0",
    "eslint-plugin-promise": "^4.2.1",
    "eslint-plugin-standard": "^5.0.0",
-   "http-server": "^0.12.3",
-   "jasmine": "^3.6.3",
+   "http-server": "^14.0.0",
+   "jasmine": "^3.10.0",
    "jasmine-ts": "^0.3.0",
    "jquery": "^3.5.1",
    "js-cookie": "^3.0.1",
@@ -50,6 +50,6 @@
    "rollup-plugin-typescript": "^1.0.1",
    "ts-node": "^7.0.1",
    "typescript": "^3.9.7",
-   "ws": "^7.4.6"
+   "ws": "^8.3.0"
  }
}
@@ -1,6 +1,25 @@
# Changelog

+```{spelling}
+Changelog
+```
+
+```{include} ../CHANGELOG.md
+:end-before: <!-- towncrier release notes start -->
+:start-after: (DO-NOT-REMOVE-versioning-promise-START)
+
+```
+
+<!-- markdownlint-disable-next-line no-space-in-emphasis -->
+## {{ release_l }}, as of {sub-ref}`today` _{subscript}`/UNRELEASED DRAFT/`_
+
+```{important} This version is not yet released and is under active development
+```
+
+```{towncrier-draft-entries}
+```
+
```{include} ../CHANGELOG.md
:start-after: <!-- towncrier release notes start -->

```
docs/conf.py
@@ -53,6 +53,7 @@ release = get_scm_version()

rst_epilog = f"""
.. |project| replace:: {project}
+.. |release_l| replace:: ``v{release}``
"""


@@ -100,6 +101,7 @@ extensions = [
    # Third-party extensions:
    'myst_parser',  # extended markdown; https://pypi.org/project/myst-parser/
    'sphinxcontrib.apidoc',
+   'sphinxcontrib.towncrier',  # provides `towncrier-draft-entries` directive
]

# Conditional third-party extensions:
@@ -221,6 +223,13 @@ linkcheck_ignore = [
]
linkcheck_workers = 25

+# -- Options for towncrier_draft extension -----------------------------------
+
+towncrier_draft_autoversion_mode = 'draft'  # or: 'sphinx-version', 'sphinx-release'
+towncrier_draft_include_empty = True
+towncrier_draft_working_directory = PROJECT_ROOT_DIR
+# Not yet supported: towncrier_draft_config_path = 'pyproject.toml'  # relative to cwd
+
# -- Options for myst_parser extension ------------------------------------------

myst_enable_extensions = [
@@ -237,6 +246,9 @@ myst_enable_extensions = [
]
myst_substitutions = {
    'project': project,
    'release': release,
+   'release_l': f'`v{release}`',
    'version': version,
}

# -- Strict mode -------------------------------------------------------------
@@ -3,3 +3,4 @@ setuptools-scm >= 6.3.2
Sphinx >= 4.3.0
furo >= 2021.11.15
sphinxcontrib-apidoc >= 0.3.0
+sphinxcontrib-towncrier >= 0.2.0a0
@@ -28,15 +28,22 @@ charset-normalizer==2.0.7 \
    --hash=sha256:e019de665e2bcf9c2b64e2e5aa025fa991da8720daa3c1138cadd2fd1856aed0 \
    --hash=sha256:f7af805c321bfa1ce6714c51f254e0d5bb5e5834039bc17db7ebe3a4cec9492b
    # via requests
+click==8.0.3 \
+    --hash=sha256:353f466495adaeb40b6b5f592f9f91cb22372351c84caeb068132442a4518ef3 \
+    --hash=sha256:410e932b050f5eed773c4cda94de75971c89cdb3155a72a0831139a79e5ecb5b
+    # via towncrier
+click-default-group==1.2.2 \
+    --hash=sha256:d9560e8e8dfa44b3562fbc9425042a0fd6d21956fcc2db0077f63f34253ab904
+    # via towncrier
docutils==0.17.1 \
    --hash=sha256:686577d2e4c32380bb50cbb22f575ed742d58168cee37e99117a854bcd88f125 \
    --hash=sha256:cf316c8370a737a022b72b56874f6602acf974a37a9fba42ec2876387549fc61
    # via
    #   myst-parser
    #   sphinx
-furo==2021.11.15 \
-    --hash=sha256:17b9fcf4de20f661d13db1ea83f11f7bf30be13738cffc88637889bf79c0469f \
-    --hash=sha256:bdca82c3f211a24f850dcb12be3cb0e3f152cd3f2adfc0449bf9db6a07856bd3
+furo==2021.11.23 \
+    --hash=sha256:54cecac5f3b688b5c7370d72ecdf1cd91a6c53f0f42751f4a719184b562cde70 \
+    --hash=sha256:6d396451ad1aadce380c662fca9362cb10f4fd85f296d74fe3ca32006eb641d7
    # via -r docs/requirements.in
idna==3.3 \
    --hash=sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff \
@@ -46,12 +53,17 @@ imagesize==1.3.0 \
    --hash=sha256:1db2f82529e53c3e929e8926a1fa9235aa82d0bd0c580359c67ec31b2fddaa8c \
    --hash=sha256:cd1750d452385ca327479d45b64d9c7729ecf0b3969a58148298c77092261f9d
    # via sphinx
+incremental==21.3.0 \
+    --hash=sha256:02f5de5aff48f6b9f665d99d48bfc7ec03b6e3943210de7cfc88856d755d6f57 \
+    --hash=sha256:92014aebc6a20b78a8084cdd5645eeaa7f74b8933f70fa3ada2cfbd1e3b54321
+    # via towncrier
jinja2==3.0.3 \
    --hash=sha256:077ce6014f7b40d03b47d1f1ca4b0fc8328a692bd284016f806ed0eaca390ad8 \
    --hash=sha256:611bb273cd68f3b993fabdc4064fc858c5b47a973cb5aa7999ec1ba405c87cd7
    # via
    #   myst-parser
    #   sphinx
+    #   towncrier
linkify-it-py==1.0.2 \
    --hash=sha256:4f416e72a41d9a00ecf1270ffb28b033318e458ac1144eb7c326563968a5dd24 \
    --hash=sha256:6c37ef4fc3001b38bc2359ccb5dc7e54388ec5d54fe46d2dbcd9a081f90fdbe3
@@ -154,7 +166,9 @@ pbr==5.7.0 \
pygments==2.10.0 \
    --hash=sha256:b8e67fe6af78f492b3c4b3e2970c0624cbf08beb1e493b2c99b9fa1b67a20380 \
    --hash=sha256:f398865f7eb6874156579fdf36bc840a03cab64d1cde9e93d68f46a425ec52c6
-    # via sphinx
+    # via
+    #   furo
+    #   sphinx
pyparsing==2.4.7 \
    --hash=sha256:c203ec8783bf771a155b207279b9bccb8dea02d8f0c9e5f8ead507bc3246ecc1 \
    --hash=sha256:ef9d7589ef3c200abe66653d3f1ab1033c3c419ae9b9bdb1240a85b024efc88b
@@ -214,14 +228,15 @@ soupsieve==2.3.1 \
    --hash=sha256:1a3cca2617c6b38c0343ed661b1fa5de5637f257d4fe22bd9f1338010a1efefb \
    --hash=sha256:b8d49b1cd4f037c7082a9683dfa1801aa2597fb11c3a1155b7a5b94829b4f1f9
    # via beautifulsoup4
-sphinx==4.3.0 \
-    --hash=sha256:6d051ab6e0d06cba786c4656b0fe67ba259fe058410f49e95bee6e49c4052cbf \
-    --hash=sha256:7e2b30da5f39170efcd95c6270f07669d623c276521fee27ad6c380f49d2bf5b
+sphinx==4.3.1 \
+    --hash=sha256:048dac56039a5713f47a554589dc98a442b39226a2b9ed7f82797fcb2fe9253f \
+    --hash=sha256:32a5b3e9a1b176cc25ed048557d4d3d01af635e6b76c5bc7a43b0a34447fbd45
    # via
    #   -r docs/requirements.in
    #   furo
    #   myst-parser
    #   sphinxcontrib-apidoc
+    #   sphinxcontrib-towncrier
sphinxcontrib-apidoc==0.3.0 \
    --hash=sha256:6671a46b2c6c5b0dca3d8a147849d159065e50443df79614f921b42fbd15cb09 \
    --hash=sha256:729bf592cf7b7dd57c4c05794f732dc026127275d785c2a5494521fdde773fb9
@@ -250,10 +265,22 @@ sphinxcontrib-serializinghtml==1.1.5 \
    --hash=sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd \
    --hash=sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952
    # via sphinx
+sphinxcontrib-towncrier==0.2.0a0 \
+    --hash=sha256:31eed078e0a8b4c38dc30978dac8c53e2dfa7342ad8597d11816d1ea9ab0eabb \
+    --hash=sha256:3cd4295c0198e753d964e2c06ee4ecd91a73a8d103385d08af9b05487ae68dd0
+    # via -r docs/requirements.in
+toml==0.10.2 \
+    --hash=sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b \
+    --hash=sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f
+    # via towncrier
tomli==1.2.2 \
    --hash=sha256:c6ce0015eb38820eaf32b5db832dbc26deb3dd427bd5f6556cf0acac2c214fee \
    --hash=sha256:f04066f68f5554911363063a30b108d2b5a5b1a010aa8b6132af78489fe3aade
    # via setuptools-scm
+towncrier==21.3.0 \
+    --hash=sha256:6eed0bc924d72c98c000cb8a64de3bd566e5cb0d11032b73fcccf8a8f956ddfe \
+    --hash=sha256:e6ccec65418bbcb8de5c908003e130e37fe0e9d6396cb77c1338241071edc082
+    # via sphinxcontrib-towncrier
uc-micro-py==1.0.1 \
    --hash=sha256:316cfb8b6862a0f1d03540f0ae6e7b033ff1fa0ddbe60c12cbe0d4cec846a69f \
    --hash=sha256:b7cdf4ea79433043ddfe2c82210208f26f7962c0cfbe3bacb05ee879a7fdb596
@@ -15,7 +15,6 @@ from typing import Any, Optional
from proxy import Proxy
from proxy.common.utils import build_http_response
from proxy.http import httpStatusCodes
-from proxy.http.parser import httpParserStates
from proxy.core.base import BaseTcpTunnelHandler


@@ -58,7 +57,7 @@ class HttpsConnectTunnelHandler(BaseTcpTunnelHandler):

        # CONNECT requests are short and we need not worry about
        # receiving partial request bodies here.
-       assert self.request.state == httpParserStates.COMPLETE
+       assert self.request.is_complete

        # Establish connection with upstream
        self.connect_upstream()
@@ -45,6 +45,8 @@ DOT = b'.'
SLASH = b'/'
HTTP_1_0 = b'HTTP/1.0'
HTTP_1_1 = b'HTTP/1.1'
+HTTP_URL_PREFIX = b'http://'
+HTTPS_URL_PREFIX = b'https://'

PROXY_AGENT_HEADER_KEY = b'Proxy-agent'
PROXY_AGENT_HEADER_VALUE = b'proxy.py v' + \
@@ -80,7 +82,7 @@ DEFAULT_KEY_FILE = None
DEFAULT_LOG_FILE = None
DEFAULT_LOG_FORMAT = '%(asctime)s - pid:%(process)d [%(levelname)-.1s] %(module)s.%(funcName)s:%(lineno)d - %(message)s'
DEFAULT_LOG_LEVEL = 'INFO'
-DEFAULT_WEB_ACCESS_LOG_FORMAT = '{client_addr} - {request_method} {request_path} - {connection_time_ms}ms'
+DEFAULT_WEB_ACCESS_LOG_FORMAT = '{client_ip}:{client_port} - {request_method} {request_path} - {connection_time_ms}ms'
DEFAULT_HTTP_ACCESS_LOG_FORMAT = '{client_ip}:{client_port} - ' + \
    '{request_method} {server_host}:{server_port}{request_path} - ' + \
    '{response_code} {response_reason} - {response_bytes} bytes - ' + \
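The new `DEFAULT_WEB_ACCESS_LOG_FORMAT` is a plain `str.format` template keyed by context values collected per connection. A minimal sketch of how such a template expands; only the format string itself comes from the diff above, the context values below are invented for illustration:

```python
# The web access-log template from DEFAULT_WEB_ACCESS_LOG_FORMAT above.
WEB_ACCESS_LOG_FORMAT = (
    '{client_ip}:{client_port} - '
    '{request_method} {request_path} - {connection_time_ms}ms'
)

# Hypothetical per-request context; real keys are filled in by the server.
context = {
    'client_ip': '127.0.0.1',
    'client_port': 54321,
    'request_method': 'GET',
    'request_path': '/dashboard',
    'connection_time_ms': 12,
}

line = WEB_ACCESS_LOG_FORMAT.format(**context)
print(line)  # 127.0.0.1:54321 - GET /dashboard - 12ms
```

Splitting the client address into separate `client_ip` and `client_port` keys (instead of a single `client_addr`) keeps the web and proxy access-log formats consistent with each other.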
@@ -156,6 +156,30 @@ class FlagParser:
            # unless user overrides the default auth plugin.
            auth_plugins.append(auth_plugin)

+       # --enable flags must be parsed before loading plugins
+       # otherwise we will miss the plugins passed via constructor
+       args.enable_web_server = cast(
+           bool,
+           opts.get(
+               'enable_web_server',
+               args.enable_web_server,
+           ),
+       )
+       args.enable_static_server = cast(
+           bool,
+           opts.get(
+               'enable_static_server',
+               args.enable_static_server,
+           ),
+       )
+       args.enable_events = cast(
+           bool,
+           opts.get(
+               'enable_events',
+               args.enable_events,
+           ),
+       )
+
        # Load default plugins along with user provided --plugins
        default_plugins = [
            bytes_(p)
@@ -290,6 +314,7 @@ class FlagParser:
        args.num_acceptors = cast(
            int, num_acceptors if num_acceptors > 0 else multiprocessing.cpu_count(),
        )
+
        args.static_server_dir = cast(
            str,
            opts.get(
@@ -297,13 +322,6 @@ class FlagParser:
                args.static_server_dir,
            ),
        )
-       args.enable_static_server = cast(
-           bool,
-           opts.get(
-               'enable_static_server',
-               args.enable_static_server,
-           ),
-       )
        args.min_compression_limit = cast(
            bool,
            opts.get(
@@ -324,13 +342,6 @@ class FlagParser:
        args.timeout = cast(int, opts.get('timeout', args.timeout))
        args.threadless = cast(bool, opts.get('threadless', args.threadless))
        args.threaded = cast(bool, opts.get('threaded', args.threaded))
-       args.enable_events = cast(
-           bool,
-           opts.get(
-               'enable_events',
-               args.enable_events,
-           ),
-       )
        args.pid_file = cast(
            Optional[str], opts.get(
                'pid_file',
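The hunks above move `--enable-*` resolution before plugin loading, so that values passed programmatically via the `opts` dict override parsed CLI defaults in time. A minimal self-contained sketch of that `cast(bool, opts.get(...))` override pattern; the `resolve_enable_flags` helper name is made up for illustration:

```python
import argparse
from typing import Any, Dict, cast


def resolve_enable_flags(args: argparse.Namespace, opts: Dict[str, Any]) -> None:
    # Values passed via ``opts`` (e.g. through a constructor) take
    # precedence over argparse defaults, mirroring the pattern above.
    for name in ('enable_web_server', 'enable_static_server', 'enable_events'):
        setattr(args, name, cast(bool, opts.get(name, getattr(args, name))))


parser = argparse.ArgumentParser()
parser.add_argument('--enable-web-server', action='store_true')
args = parser.parse_args([])
args.enable_static_server = False
args.enable_events = False

# Programmatic override wins over the (False) CLI default.
resolve_enable_flags(args, {'enable_web_server': True})
print(args.enable_web_server)   # True
print(args.enable_static_server)  # False
```

Resolving these flags early matters because the set of default plugins to load is itself derived from which `--enable-*` features are on.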
@@ -191,6 +191,7 @@ class Acceptor(multiprocessing.Process):
        assert self.sock
        self._local_work_queue = NonBlockingQueue()
        self._local = LocalExecutor(
+           iid=self.idd,
            work_queue=self._local_work_queue,
            flags=self.flags,
            event_queue=self.event_queue,
@@ -175,6 +175,7 @@ class ThreadlessPool:
        pipe = multiprocessing.Pipe()
        self.work_queues.append(pipe[0])
        w = RemoteExecutor(
+           iid=index,
            work_queue=pipe[1],
            flags=self.flags,
            event_queue=self.event_queue,
@@ -59,11 +59,13 @@ class Threadless(ABC, Generic[T]):

    def __init__(
        self,
+       iid: str,
        work_queue: T,
        flags: argparse.Namespace,
        event_queue: Optional[EventQueue] = None,
    ) -> None:
        super().__init__()
+       self.iid = iid
        self.work_queue = work_queue
        self.flags = flags
        self.event_queue = event_queue
@@ -84,6 +86,7 @@ class Threadless(ABC, Generic[T]):
        ] = {}
        self.wait_timeout: float = DEFAULT_WAIT_FOR_TASKS_TIMEOUT
        self.cleanup_inactive_timeout: float = DEFAULT_INACTIVE_CONN_CLEANUP_TIMEOUT
+       self._total: int = 0

    @property
    @abstractmethod
@@ -122,6 +125,7 @@ class Threadless(ABC, Generic[T]):
            fileno, family=socket.AF_INET if self.flags.hostname.version == 4 else socket.AF_INET6,
            type=socket.SOCK_STREAM,
        )
+       uid = '%s-%s-%s' % (self.iid, self._total, fileno)
        self.works[fileno] = self.flags.work_klass(
            TcpClientConnection(
                conn=conn,
@@ -129,7 +133,7 @@ class Threadless(ABC, Generic[T]):
            ),
            flags=self.flags,
            event_queue=self.event_queue,
-           uid=fileno,
+           uid=uid,
        )
        self.works[fileno].publish_event(
            event_name=eventNames.WORK_STARTED,
@@ -138,6 +142,7 @@ class Threadless(ABC, Generic[T]):
        )
        try:
            self.works[fileno].initialize()
+           self._total += 1
        except Exception as e:
            logger.exception(
                'Exception occurred during initialization',
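The work uid now combines the executor `iid`, a per-executor counter, and the accepted `fileno`, so ids stay unique even when file descriptors are reused across connections. A small sketch of that scheme; the `_WorkIdAllocator` class is hypothetical, and note the diff above only increments the counter after a successful `initialize()`:

```python
class _WorkIdAllocator:
    """Illustrates the '%s-%s-%s' % (iid, total, fileno) uid scheme above."""

    def __init__(self, iid: str) -> None:
        self.iid = iid
        self._total = 0

    def next_uid(self, fileno: int) -> str:
        # Counter + fileno: a recycled fileno still yields a fresh uid.
        uid = '%s-%s-%s' % (self.iid, self._total, fileno)
        self._total += 1
        return uid


alloc = _WorkIdAllocator('1')
print(alloc.next_uid(7))   # 1-0-7
print(alloc.next_uid(9))   # 1-1-9
```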
@@ -15,7 +15,7 @@
import argparse

from abc import ABC, abstractmethod
-from uuid import uuid4, UUID
+from uuid import uuid4
from typing import Optional, Dict, Any

from ..event import eventNames, EventQueue
@@ -31,10 +31,10 @@ class Work(ABC):
        work: TcpClientConnection,
        flags: argparse.Namespace,
        event_queue: Optional[EventQueue] = None,
-       uid: Optional[UUID] = None,
+       uid: Optional[str] = None,
    ) -> None:
        # Work uuid
-       self.uid: UUID = uid if uid is not None else uuid4()
+       self.uid: str = uid if uid is not None else uuid4().hex
        self.flags = flags
        # Eventing core queue
        self.event_queue = event_queue
@@ -92,7 +92,7 @@ class Work(ABC):
            return
        assert self.event_queue
        self.event_queue.publish(
-           self.uid.hex,
+           self.uid,
            event_name,
            event_payload,
            publisher_id,
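`Work.uid` changes type from a `UUID` object to a plain string, defaulting to `uuid4().hex`, so publishers no longer need the `.hex` conversion at call sites. A sketch of the new default; `make_uid` is a hypothetical stand-in for the constructor logic above:

```python
from typing import Optional
from uuid import uuid4


def make_uid(uid: Optional[str] = None) -> str:
    # Mirrors the new default above: a 32-char lowercase hex string
    # (no hyphens) instead of a uuid.UUID instance.
    return uid if uid is not None else uuid4().hex


print(len(make_uid()))            # 32
print(make_uid('acceptor-1-42'))  # acceptor-1-42
```

Callers such as `Threadless` can now pass the deterministic `iid-total-fileno` string directly, while standalone `Work` instances still get a random hex uid.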
@@ -214,7 +214,7 @@ class HttpProtocolHandler(BaseTcpServerHandler):
            # TODO(abhinavsingh): Remove .tobytes after parser is
            # memoryview compliant
            self.request.parse(data.tobytes())
-           if self.request.state == httpParserStates.COMPLETE:
+           if self.request.is_complete:
                # Invoke plugin.on_request_complete
                for plugin in self.plugins.values():
                    upgraded_sock = plugin.on_request_complete()
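Call sites now ask the parser via the new `is_complete` property instead of comparing `state` against `httpParserStates.COMPLETE` themselves. A minimal self-contained sketch of the pattern; `ParserStates` and `MiniParser` are invented names (proxy.py's real `httpParserStates` is not an `IntEnum`):

```python
from enum import IntEnum


class ParserStates(IntEnum):
    # Hypothetical stand-in for proxy.py's httpParserStates.
    INITIALIZED = 1
    LINE_RCVD = 2
    RCVING_HEADERS = 3
    HEADERS_COMPLETE = 4
    RCVING_BODY = 5
    COMPLETE = 6


class MiniParser:
    def __init__(self) -> None:
        self.state = ParserStates.INITIALIZED

    @property
    def is_complete(self) -> bool:
        # Callers use the property instead of reaching into .state,
        # keeping the state constants an internal detail of the parser.
        return self.state == ParserStates.COMPLETE


p = MiniParser()
print(p.is_complete)  # False
p.state = ParserStates.COMPLETE
print(p.is_complete)  # True
```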
@ -17,7 +17,7 @@ from typing import TypeVar, Optional, Dict, Type, Tuple, List
|
|||
from ...common.constants import DEFAULT_DISABLE_HEADERS, COLON, DEFAULT_ENABLE_PROXY_PROTOCOL
|
||||
from ...common.constants import HTTP_1_1, SLASH, CRLF
|
||||
from ...common.constants import WHITESPACE, DEFAULT_HTTP_PORT
|
||||
from ...common.utils import build_http_request, build_http_response, find_http_line, text_
|
||||
from ...common.utils import build_http_request, build_http_response, text_
|
||||
from ...common.flag import flags
|
||||
|
||||
from ..url import Url
|
||||
|
@ -63,10 +63,12 @@ class HttpParser:
|
|||
if enable_proxy_protocol:
|
||||
assert self.type == httpParserTypes.REQUEST_PARSER
|
||||
self.protocol = ProxyProtocol()
|
||||
# Request attributes
|
||||
self.host: Optional[bytes] = None
|
||||
self.port: Optional[int] = None
|
||||
self.path: Optional[bytes] = None
|
||||
self.method: Optional[bytes] = None
|
||||
# Response attributes
|
||||
self.code: Optional[bytes] = None
|
||||
self.reason: Optional[bytes] = None
|
||||
self.version: Optional[bytes] = None
|
||||
|
@ -78,7 +80,7 @@ class HttpParser:
|
|||
# - Keys are lower case header names.
|
||||
# - Values are 2-tuple containing original
|
||||
# header and it's value as received.
|
||||
self.headers: Dict[bytes, Tuple[bytes, bytes]] = {}
|
||||
self.headers: Optional[Dict[bytes, Tuple[bytes, bytes]]] = None
|
||||
self.body: Optional[bytes] = None
|
||||
self.chunk: Optional[ChunkParser] = None
|
||||
# Internal request line as a url structure
|
||||
|
@ -109,19 +111,24 @@ class HttpParser:
|
|||
|
||||
def header(self, key: bytes) -> bytes:
|
||||
"""Convenient method to return original header value from internal data structure."""
|
||||
if key.lower() not in self.headers:
|
||||
if self.headers is None or key.lower() not in self.headers:
|
||||
raise KeyError('%s not found in headers', text_(key))
|
||||
return self.headers[key.lower()][1]
|
||||
|
||||
def has_header(self, key: bytes) -> bool:
|
||||
"""Returns true if header key was found in payload."""
|
||||
if self.headers is None:
|
||||
return False
|
||||
return key.lower() in self.headers
|
||||
|
||||
def add_header(self, key: bytes, value: bytes) -> bytes:
|
||||
"""Add/Update a header to internal data structure.
|
||||
|
||||
Returns key with which passed (key, value) tuple is available."""
|
||||
if self.headers is None:
|
||||
self.headers = {}
|
||||
k = key.lower()
|
||||
# k = key
|
||||
self.headers[k] = (key, value)
|
||||
return k
|
||||
|
||||
|
@ -132,7 +139,7 @@ class HttpParser:
|
|||
|
||||
def del_header(self, header: bytes) -> None:
|
||||
"""Delete a header from internal data structure."""
|
||||
if header.lower() in self.headers:
|
||||
if self.headers and header.lower() in self.headers:
|
||||
del self.headers[header.lower()]
|
||||
|
||||
def del_headers(self, headers: List[bytes]) -> None:
|
||||
|
@ -151,6 +158,10 @@ class HttpParser:
|
|||
NOTE: Host field WILL be None for incoming local WebServer requests."""
|
||||
return self.host is not None
|
||||
|
||||
@property
|
||||
def is_complete(self) -> bool:
|
||||
return self.state == httpParserStates.COMPLETE
|
||||
|
||||
@property
|
||||
def is_http_1_1_keep_alive(self) -> bool:
|
||||
"""Returns true for HTTP/1.1 keep-alive connections."""
|
||||
|
@ -185,30 +196,34 @@ class HttpParser:
|
|||
@property
|
||||
def body_expected(self) -> bool:
|
||||
"""Returns true if content or chunked response is expected."""
|
||||
return self.content_expected or self.is_chunked_encoded
|
||||
return self._content_expected or self._is_chunked_encoded
|
||||
|
||||
def parse(self, raw: bytes) -> None:
|
||||
"""Parses HTTP request out of raw bytes.
|
||||
|
||||
Check for `HttpParser.state` after `parse` has successfully returned."""
|
||||
self.total_size += len(raw)
|
||||
size = len(raw)
|
||||
self.total_size += size
|
||||
raw = self.buffer + raw
|
||||
self.buffer, more = b'', len(raw) > 0
|
||||
self.buffer, more = b'', size > 0
|
||||
while more and self.state != httpParserStates.COMPLETE:
|
||||
# gte with HEADERS_COMPLETE also encapsulated RCVING_BODY state
|
||||
more, raw = self._process_body(raw) \
|
||||
if self.state >= httpParserStates.HEADERS_COMPLETE else \
|
||||
self._process_line_and_headers(raw)
|
||||
if self.state >= httpParserStates.HEADERS_COMPLETE:
|
||||
more, raw = self._process_body(raw)
|
||||
elif self.state == httpParserStates.INITIALIZED:
|
||||
more, raw = self._process_line(raw)
|
||||
else:
|
||||
more, raw = self._process_headers(raw)
|
||||
# When server sends a response line without any header or body e.g.
|
||||
# HTTP/1.1 200 Connection established\r\n\r\n
|
||||
if self.state == httpParserStates.LINE_RCVD and \
|
||||
raw == CRLF and \
|
||||
self.type == httpParserTypes.RESPONSE_PARSER:
|
||||
if self.type == httpParserTypes.RESPONSE_PARSER and \
|
||||
self.state == httpParserStates.LINE_RCVD and \
|
||||
raw == CRLF:
|
||||
self.state = httpParserStates.COMPLETE
|
||||
# Mark request as complete if headers received and no incoming
|
||||
# body indication received.
|
||||
elif self.state == httpParserStates.HEADERS_COMPLETE and \
|
||||
not self.body_expected and \
|
||||
not (self._content_expected or self._is_chunked_encoded) and \
|
||||
raw == b'':
|
||||
self.state = httpParserStates.COMPLETE
|
||||
self.buffer = raw
|
||||
|
@ -229,7 +244,7 @@ class HttpParser:
|
|||
COLON +
|
||||
str(self.port).encode() +
|
||||
path
|
||||
) if not self.is_https_tunnel else (self.host + COLON + str(self.port).encode())
|
||||
) if not self._is_https_tunnel else (self.host + COLON + str(self.port).encode())
|
||||
return build_http_request(
|
||||
self.method, path, self.version,
|
||||
headers={} if not self.headers else {
|
||||
|
@ -263,7 +278,7 @@ class HttpParser:
|
|||
# the latter MUST be ignored.
|
||||
#
|
||||
# TL;DR -- Give transfer-encoding header preference over content-length.
|
||||
if self.is_chunked_encoded:
|
||||
if self._is_chunked_encoded:
|
||||
if not self.chunk:
|
||||
self.chunk = ChunkParser()
|
||||
raw = self.chunk.parse(raw)
|
||||
|
@ -271,7 +286,7 @@ class HttpParser:
|
|||
self.body = self.chunk.body
|
||||
self.state = httpParserStates.COMPLETE
|
||||
more = False
|
||||
elif self.content_expected:
|
||||
elif self._content_expected:
|
||||
self.state = httpParserStates.RCVING_BODY
|
||||
if self.body is None:
|
||||
self.body = b''
|
||||
|
@ -297,7 +312,7 @@ class HttpParser:
|
|||
more, raw = False, b''
|
||||
return more, raw
|
||||
|
||||
def _process_line_and_headers(self, raw: bytes) -> Tuple[bool, bytes]:
|
||||
def _process_headers(self, raw: bytes) -> Tuple[bool, bytes]:
|
||||
"""Returns False when no CRLF could be found in received bytes.
|
||||
|
||||
TODO: We should not return until parser reaches headers complete
|
||||
|
@ -308,60 +323,59 @@ class HttpParser:
|
|||
This will also help make the parser even more stateless.
|
||||
"""
|
||||
while True:
|
||||
line, raw = find_http_line(raw)
|
||||
if line is None:
|
||||
parts = raw.split(CRLF, 1)
|
||||
if len(parts) == 1:
|
||||
return False, raw
|
||||
|
||||
if self.state == httpParserStates.INITIALIZED:
|
||||
self._process_line(line)
|
||||
if self.state == httpParserStates.INITIALIZED:
|
||||
# return len(raw) > 0, raw
|
||||
continue
|
||||
elif self.state in (httpParserStates.LINE_RCVD, httpParserStates.RCVING_HEADERS):
|
||||
if self.state == httpParserStates.LINE_RCVD:
|
||||
self.state = httpParserStates.RCVING_HEADERS
|
||||
line, raw = parts[0], parts[1]
|
||||
if self.state in (httpParserStates.LINE_RCVD, httpParserStates.RCVING_HEADERS):
|
||||
if line == b'' or line.strip() == b'': # Blank line received.
|
||||
self.state = httpParserStates.HEADERS_COMPLETE
|
||||
else:
|
||||
self.state = httpParserStates.RCVING_HEADERS
|
||||
self._process_header(line)
|
||||
|
||||
# If raw length is now zero, bail out
|
||||
# If we have received all headers, bail out
|
||||
if raw == b'' or self.state == httpParserStates.HEADERS_COMPLETE:
|
||||
break
|
||||
return len(raw) > 0, raw
|
||||
|
||||
def _process_line(self, raw: bytes) -> None:
|
||||
if self.type == httpParserTypes.REQUEST_PARSER:
|
||||
if self.protocol is not None and self.protocol.version is None:
|
||||
# We expect to receive entire proxy protocol v1 line
|
||||
# in one network read and don't expect partial packets
|
||||
self.protocol.parse(raw)
|
||||
else:
|
||||
def _process_line(self, raw: bytes) -> Tuple[bool, bytes]:
|
||||
while True:
|
||||
parts = raw.split(CRLF, 1)
|
||||
if len(parts) == 1:
|
||||
return False, raw
|
||||
line, raw = parts[0], parts[1]
|
||||
if self.type == httpParserTypes.REQUEST_PARSER:
|
||||
if self.protocol is not None and self.protocol.version is None:
|
||||
# We expect to receive entire proxy protocol v1 line
|
||||
# in one network read and don't expect partial packets
|
||||
self.protocol.parse(line)
|
||||
continue
|
||||
# Ref: https://datatracker.ietf.org/doc/html/rfc2616#section-5.1
|
||||
line = raw.split(WHITESPACE, 2)
|
||||
if len(line) == 3:
|
||||
self.method = line[0].upper()
|
||||
parts = line.split(WHITESPACE, 2)
|
||||
if len(parts) == 3:
|
||||
self.method = parts[0]
|
||||
if self.method == httpMethods.CONNECT:
|
||||
self._is_https_tunnel = True
|
||||
self.set_url(line[1])
|
||||
self.version = line[2]
|
||||
self.set_url(parts[1])
|
||||
self.version = parts[2]
|
||||
self.state = httpParserStates.LINE_RCVD
|
||||
else:
|
||||
# To avoid a possible attack vector, we raise exception
|
||||
# if parser receives an invalid request line.
|
||||
#
|
||||
# TODO: Better to use raise HttpProtocolException,
|
||||
# but we should solve circular import problem first.
|
||||
raise ValueError('Invalid request line')
|
||||
else:
|
||||
line = raw.split(WHITESPACE, 2)
|
||||
self.version = line[0]
|
||||
self.code = line[1]
|
||||
break
|
||||
# To avoid a possible attack vector, we raise exception
|
||||
# if parser receives an invalid request line.
|
||||
#
|
||||
# TODO: Better to use raise HttpProtocolException,
|
||||
# but we should solve circular import problem first.
|
||||
raise ValueError('Invalid request line')
|
||||
parts = line.split(WHITESPACE, 2)
|
||||
self.version = parts[0]
|
||||
self.code = parts[1]
|
||||
# Our own WebServerPlugin example currently doesn't send any reason
|
||||
if len(line) == 3:
|
||||
self.reason = line[2]
|
||||
if len(parts) == 3:
|
||||
self.reason = parts[2]
|
||||
self.state = httpParserStates.LINE_RCVD
|
||||
break
|
||||
return len(raw) > 0, raw
|
||||
|
||||
def _process_header(self, raw: bytes) -> None:
|
||||
parts = raw.split(COLON, 1)
|
||||
|
@@ -380,20 +394,16 @@ class HttpParser:

     def _get_body_or_chunks(self) -> Optional[bytes]:
         return ChunkParser.to_chunks(self.body) \
-            if self.body and self.is_chunked_encoded else \
+            if self.body and self._is_chunked_encoded else \
             self.body

     def _set_line_attributes(self) -> None:
         if self.type == httpParserTypes.REQUEST_PARSER:
-            if self.is_https_tunnel and self._url:
+            assert self._url
+            if self._is_https_tunnel:
                 self.host = self._url.hostname
                 self.port = 443 if self._url.port is None else self._url.port
-            elif self._url:
+            else:
                 self.host, self.port = self._url.hostname, self._url.port \
                     if self._url.port else DEFAULT_HTTP_PORT
-            else:
-                raise KeyError(
-                    'Invalid request. Method: %r, Url: %r' %
-                    (self.method, self._url),
-                )
             self.path = self._url.remainder

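The `_get_body_or_chunks` helper above defers to `ChunkParser.to_chunks` when the message is chunked-encoded. For readers unfamiliar with that wire format, here is a minimal, self-contained sketch of HTTP/1.1 chunked transfer encoding — an illustration only, not proxy.py's actual `ChunkParser`:

```python
def to_chunks(body: bytes, chunk_size: int = 8) -> bytes:
    """Encode `body` using HTTP/1.1 chunked transfer encoding."""
    chunks = []
    for i in range(0, len(body), chunk_size):
        chunk = body[i:i + chunk_size]
        # Each chunk: hex size, CRLF, payload, CRLF
        chunks.append(b'%x\r\n%s\r\n' % (len(chunk), chunk))
    # Terminating zero-length chunk marks end of body
    chunks.append(b'0\r\n\r\n')
    return b''.join(chunks)
```

For example, `to_chunks(b'hello')` yields `b'5\r\nhello\r\n0\r\n\r\n'`.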
@@ -11,7 +11,6 @@
 import socket
 import argparse

-from uuid import UUID
 from abc import ABC, abstractmethod
 from typing import Tuple, List, Union, Optional

@@ -46,13 +45,13 @@ class HttpProtocolHandlerPlugin(ABC):

     def __init__(
             self,
-            uid: UUID,
+            uid: str,
             flags: argparse.Namespace,
             client: TcpClientConnection,
             request: HttpParser,
             event_queue: EventQueue,
     ):
-        self.uid: UUID = uid
+        self.uid: str = uid
         self.flags: argparse.Namespace = flags
         self.client: TcpClientConnection = client
         self.request: HttpParser = request

@@ -38,7 +38,7 @@ class AuthPlugin(HttpProxyBasePlugin):
     def before_upstream_connection(
             self, request: HttpParser,
     ) -> Optional[HttpParser]:
-        if self.flags.auth_code:
+        if self.flags.auth_code and request.headers:
             if b'proxy-authorization' not in request.headers:
                 raise ProxyAuthenticationFailed()
             parts = request.headers[b'proxy-authorization'][1].split()

@@ -11,7 +11,6 @@
 import argparse

 from abc import ABC
-from uuid import UUID
 from typing import Any, Dict, List, Optional, Tuple

 from ..parser import HttpParser

@@ -28,7 +27,7 @@ class HttpProxyBasePlugin(ABC):

     def __init__(
             self,
-            uid: UUID,
+            uid: str,
             flags: argparse.Namespace,
             client: TcpClientConnection,
             event_queue: EventQueue,

@@ -307,12 +307,9 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
             # parse incoming response packet
-            # only for non-https requests and when
-            # tls interception is enabled
-            if not self.request.is_https_tunnel \
-                    or self.tls_interception_enabled():
-                if self.response.state == httpParserStates.COMPLETE:
+            # See https://github.com/abhinavsingh/proxy.py/issues/127 for why
+            # currently response parsing is disabled when TLS interception is enabled.
+            #
+            # or self.tls_interception_enabled():
+            if not self.request.is_https_tunnel:
+                if self.response.is_complete:
                     self.handle_pipeline_response(raw)
                 else:
                     # TODO(abhinavsingh): Remove .tobytes after parser is

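Many hunks in this release replace `state == httpParserStates.COMPLETE` comparisons with the `is_complete` property introduced by the parser cleanup in #843. A minimal sketch of that pattern — class and enum names assumed here for illustration, not the actual proxy.py source:

```python
from enum import IntEnum


class HttpParserStates(IntEnum):
    INITIALIZED = 1
    LINE_RCVD = 2
    RCVING_BODY = 3
    COMPLETE = 4


class Parser:
    def __init__(self) -> None:
        self.state = HttpParserStates.INITIALIZED

    @property
    def is_complete(self) -> bool:
        # Call sites read `parser.is_complete` instead of comparing
        # `parser.state` against the enum constant directly
        return self.state == HttpParserStates.COMPLETE
```

The property keeps call sites short and hides the state-machine constant behind a single accessor.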
@@ -340,11 +337,9 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
             'request_method': text_(self.request.method),
             'request_path': text_(self.request.path),
             'request_bytes': self.request.total_size,
-            'request_code': self.request.code,
-            'request_reason': self.request.reason,
+            'request_ua': self.request.header(b'user-agent')
+            if self.request.has_header(b'user-agent')
+            else None,
             'request_version': self.request.version,
             # Response
             'response_bytes': self.response.total_size,

@@ -436,7 +431,7 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
         # and response objects.
         #
         # if not self.request.is_https_tunnel and \
-        #         self.response.state == httpParserStates.COMPLETE:
+        #         self.response.is_complete:
         #     self.access_log()
         return chunk

@@ -465,7 +460,7 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
         # For http proxy requests, handle pipeline case.
         # We also handle pipeline scenario for https proxy
         # requests is TLS interception is enabled.
-        if self.request.state == httpParserStates.COMPLETE and (
+        if self.request.is_complete and (
             not self.request.is_https_tunnel or
             self.tls_interception_enabled()
         ):

@@ -488,7 +483,7 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
                 # TODO(abhinavsingh): Remove .tobytes after parser is
                 # memoryview compliant
                 self.pipeline_request.parse(raw.tobytes())
-                if self.pipeline_request.state == httpParserStates.COMPLETE:
+                if self.pipeline_request.is_complete:
                     for plugin in self.plugins.values():
                         assert self.pipeline_request is not None
                         r = plugin.handle_client_request(self.pipeline_request)

@@ -592,7 +587,7 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
             # TODO(abhinavsingh): Remove .tobytes after parser is memoryview
             # compliant
             self.pipeline_response.parse(raw.tobytes())
-            if self.pipeline_response.state == httpParserStates.COMPLETE:
+            if self.pipeline_response.is_complete:
                 self.pipeline_response = None

     def connect_upstream(self) -> None:

@@ -735,7 +730,7 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
             ca_key_path = self.flags.ca_key_file
             ca_key_password = ''
             ca_crt_path = self.flags.ca_cert_file
-            serial = self.uid.int
+            serial = self.uid

             # Sign generated CSR
             if not os.path.isfile(cert_file_path):

@@ -905,14 +900,19 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
             return
         assert self.request.port
         self.event_queue.publish(
-            request_id=self.uid.hex,
+            request_id=self.uid,
             event_name=eventNames.REQUEST_COMPLETE,
             event_payload={
                 'url': text_(self.request.path)
                 if self.request.is_https_tunnel
                 else 'http://%s:%d%s' % (text_(self.request.host), self.request.port, text_(self.request.path)),
                 'method': text_(self.request.method),
-                'headers': {text_(k): text_(v[1]) for k, v in self.request.headers.items()},
+                'headers': {}
+                if not self.request.headers else
+                {
+                    text_(k): text_(v[1])
+                    for k, v in self.request.headers.items()
+                },
                 'body': text_(self.request.body)
                 if self.request.method == httpMethods.POST
                 else None,

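The event-payload changes above guard against `headers` now being `None` (lazily initialized) rather than always a dict. The guarded comprehension can be isolated as a small helper — a hypothetical illustration; proxy.py inlines the expression and uses its own `text_` utility:

```python
from typing import Dict, Optional, Tuple


def serialize_headers(
    headers: Optional[Dict[bytes, Tuple[bytes, bytes]]],
) -> Dict[str, str]:
    # `headers` may be None before any header is parsed,
    # so guard before iterating
    return {} if not headers else {
        k.decode(): v[1].decode()
        for k, v in headers.items()
    }
```

With the guard, `serialize_headers(None)` returns `{}` instead of raising `AttributeError`.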
@@ -923,7 +923,7 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
     def emit_response_events(self, chunk_size: int) -> None:
         if not self.flags.enable_events:
             return
-        if self.response.state == httpParserStates.COMPLETE:
+        if self.response.is_complete:
             self.emit_response_complete()
         elif self.response.state == httpParserStates.RCVING_BODY:
             self.emit_response_chunk_received(chunk_size)

@@ -934,10 +934,15 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
         if not self.flags.enable_events:
             return
         self.event_queue.publish(
-            request_id=self.uid.hex,
+            request_id=self.uid,
             event_name=eventNames.RESPONSE_HEADERS_COMPLETE,
             event_payload={
-                'headers': {text_(k): text_(v[1]) for k, v in self.response.headers.items()},
+                'headers': {}
+                if not self.response.headers else
+                {
+                    text_(k): text_(v[1])
+                    for k, v in self.response.headers.items()
+                },
             },
             publisher_id=self.__class__.__name__,
         )

@@ -946,7 +951,7 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
         if not self.flags.enable_events:
             return
         self.event_queue.publish(
-            request_id=self.uid.hex,
+            request_id=self.uid,
             event_name=eventNames.RESPONSE_CHUNK_RECEIVED,
             event_payload={
                 'chunk_size': chunk_size,

@@ -959,7 +964,7 @@ class HttpProxyPlugin(HttpProtocolHandlerPlugin):
         if not self.flags.enable_events:
             return
         self.event_queue.publish(
-            request_id=self.uid.hex,
+            request_id=self.uid,
             event_name=eventNames.RESPONSE_COMPLETE,
             event_payload={
                 'encoded_response_size': self.response.total_size,

@@ -10,7 +10,6 @@
 """
 import argparse

-from uuid import UUID
 from abc import ABC, abstractmethod
 from typing import Any, Dict, List, Optional, Tuple

@@ -27,7 +26,7 @@ class HttpWebServerBasePlugin(ABC):

     def __init__(
             self,
-            uid: UUID,
+            uid: str,
             flags: argparse.Namespace,
             client: TcpClientConnection,
             event_queue: EventQueue,

@@ -28,7 +28,7 @@ from ..codes import httpStatusCodes
 from ..exception import HttpProtocolException
 from ..plugin import HttpProtocolHandlerPlugin
 from ..websocket import WebsocketFrame, websocketOpcodes
-from ..parser import HttpParser, httpParserStates, httpParserTypes
+from ..parser import HttpParser, httpParserTypes

 from .plugin import HttpWebServerBasePlugin
 from .protocols import httpProtocolTypes

@@ -274,7 +274,7 @@ class HttpWebServerPlugin(HttpProtocolHandlerPlugin):
             return None
         # If 1st valid request was completed and it's a HTTP/1.1 keep-alive
         # And only if we have a route, parse pipeline requests
-        if self.request.state == httpParserStates.COMPLETE and \
+        if self.request.is_complete and \
                 self.request.is_http_1_1_keep_alive and \
                 self.route is not None:
             if self.pipeline_request is None:

@@ -284,7 +284,7 @@ class HttpWebServerPlugin(HttpProtocolHandlerPlugin):
                 # TODO(abhinavsingh): Remove .tobytes after parser is memoryview
                 # compliant
                 self.pipeline_request.parse(raw.tobytes())
-                if self.pipeline_request.state == httpParserStates.COMPLETE:
+                if self.pipeline_request.is_complete:
                     self.route.handle_request(self.pipeline_request)
                     if not self.pipeline_request.is_http_1_1_keep_alive:
                         logger.error(

@@ -301,10 +301,28 @@ class HttpWebServerPlugin(HttpProtocolHandlerPlugin):
         if self.request.has_host():
             return
         context = {
-            'client_addr': self.client.address,
+            'client_ip': None if not self.client.addr else self.client.addr[0],
+            'client_port': None if not self.client.addr else self.client.addr[1],
+            'connection_time_ms': '%.2f' % ((time.time() - self.start_time) * 1000),
+            # Request
             'request_method': text_(self.request.method),
             'request_path': text_(self.request.path),
-            'connection_time_ms': '%.2f' % ((time.time() - self.start_time) * 1000),
+            'request_bytes': self.request.total_size,
+            'request_ua': self.request.header(b'user-agent')
+            if self.request.has_header(b'user-agent')
+            else None,
+            'request_version': self.request.version,
+            # Response
+            #
+            # TODO: Track and inject web server specific response attributes
+            # Currently, plugins are allowed to queue raw bytes, because of
+            # which we'll have to reparse the queued packets to deduce
+            # several attributes required below. At least for code and
+            # reason attributes.
+            #
+            # 'response_bytes': self.response.total_size,
+            # 'response_code': text_(self.response.code),
+            # 'response_reason': text_(self.response.reason),
         }
         log_handled = False
         if self.route:

@@ -15,7 +15,7 @@
 """
 from typing import Optional, Tuple

-from ..common.constants import COLON, SLASH
+from ..common.constants import COLON, SLASH, HTTP_URL_PREFIX, HTTPS_URL_PREFIX
 from ..common.utils import text_

@@ -65,11 +65,11 @@ class Url:

         We use heuristics based approach for our URL parser.
         """
-        sraw = raw.decode('utf-8')
-        if sraw[0] == SLASH.decode('utf-8'):
+        if raw[0] == 47:    # SLASH == 47
             return cls(remainder=raw)
-        if sraw.startswith('https://') or sraw.startswith('http://'):
-            is_https = sraw.startswith('https://')
+        is_http = raw.startswith(HTTP_URL_PREFIX)
+        is_https = raw.startswith(HTTPS_URL_PREFIX)
+        if is_http or is_https:
             rest = raw[len(b'https://'):] \
                 if is_https \
                 else raw[len(b'http://'):]

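The `Url.from_bytes` change above avoids a UTF-8 decode by working on raw bytes: indexing a `bytes` object yields an `int`, so the first byte of a relative URL equals 47 (`ord('/')`). A standalone sketch of the same heuristic — the constants are mirrored here for illustration and match the names imported in the diff:

```python
# Constants mirroring the names used in the diff above
SLASH = b'/'
HTTP_URL_PREFIX = b'http://'
HTTPS_URL_PREFIX = b'https://'


def classify(raw: bytes) -> str:
    """Classify a request-target without decoding it to str."""
    # bytes indexing returns an int; ord('/') == 47
    if raw[0] == 47:
        return 'relative'
    if raw.startswith(HTTPS_URL_PREFIX):
        return 'https'
    if raw.startswith(HTTP_URL_PREFIX):
        return 'http'
    # Anything else is treated as a bare host[:port]
    return 'host'
```

Skipping the decode avoids allocating a `str` copy of every URL on the hot path.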
@@ -88,21 +88,26 @@ class Url:

     @staticmethod
     def parse_host_and_port(raw: bytes) -> Tuple[bytes, Optional[int]]:
-        parts = raw.split(COLON)
+        parts = raw.split(COLON, 2)
+        num_parts = len(parts)
         port: Optional[int] = None
-        if len(parts) == 1:
+        # No port found
+        if num_parts == 1:
             return parts[0], None
-        if len(parts) == 2:
-            host, port = COLON.join(parts[:-1]), int(parts[-1])
-        if len(parts) > 2:
-            try:
-                port = int(parts[-1])
-                host = COLON.join(parts[:-1])
-            except ValueError:
-                # If unable to convert last part into port,
-                # this is the IPv6 scenario. Treat entire
-                # data as host
-                host, port = raw, None
+        # Host and port found
+        if num_parts == 2:
+            return COLON.join(parts[:-1]), int(parts[-1])
+        # More than a single COLON i.e. IPv6 scenario
+        try:
+            # Try to resolve last part as an int port
+            last_token = parts[-1].split(COLON)
+            port = int(last_token[-1])
+            host = COLON.join(parts[:-1]) + COLON + \
+                COLON.join(last_token[:-1])
+        except ValueError:
+            # If unable to convert last part into port,
+            # treat entire data as host
+            host, port = raw, None
         # patch up invalid ipv6 scenario
         rhost = host.decode('utf-8')
         if COLON.decode('utf-8') in rhost and \

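The rewritten `parse_host_and_port` caps the split and peels the port off the last token, so IPv6 literals with embedded colons survive. A simplified standalone version of that logic — it omits the follow-up "patch up invalid ipv6 scenario" step visible above, so treat it as a sketch rather than a drop-in replacement:

```python
from typing import Optional, Tuple

COLON = b':'


def parse_host_and_port(raw: bytes) -> Tuple[bytes, Optional[int]]:
    """Split b'host:port', tolerating IPv6 hosts containing colons."""
    parts = raw.split(COLON, 2)
    if len(parts) == 1:
        # No port at all
        return parts[0], None
    if len(parts) == 2:
        # Plain host:port
        return parts[0], int(parts[1])
    # More than one colon: likely IPv6. Try to peel a trailing port.
    last_token = parts[-1].split(COLON)
    try:
        port = int(last_token[-1])
    except ValueError:
        # No numeric tail; the whole input is the host
        return raw, None
    # Reassemble the host from everything before the port
    host = COLON.join(parts[:-1]) + COLON + COLON.join(last_token[:-1])
    return host, port
```

For instance, `parse_host_and_port(b'2001:db8::10:443')` recovers host `b'2001:db8::10'` and port `443`.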
@@ -10,13 +10,13 @@
 """
 from abc import ABC, abstractmethod
 from typing import Optional
-from uuid import UUID

 from ....http.parser import HttpParser


 class CacheStore(ABC):

-    def __init__(self, uid: UUID) -> None:
+    def __init__(self, uid: str) -> None:
         self.uid = uid

     @abstractmethod

@@ -12,7 +12,6 @@ import logging
 import os
 import tempfile
 from typing import Optional, BinaryIO
-from uuid import UUID

 from ....common.flag import flags
 from ....common.utils import text_

@@ -34,7 +33,7 @@ flags.add_argument(

 class OnDiskCacheStore(CacheStore):

-    def __init__(self, uid: UUID, cache_dir: str) -> None:
+    def __init__(self, uid: str, cache_dir: str) -> None:
         super().__init__(uid)
         self.cache_dir = cache_dir
         self.cache_file_path: Optional[str] = None

@@ -43,7 +42,7 @@ class OnDiskCacheStore(CacheStore):
     def open(self, request: HttpParser) -> None:
         self.cache_file_path = os.path.join(
             self.cache_dir,
-            '%s-%s.txt' % (text_(request.host), self.uid.hex),
+            '%s-%s.txt' % (text_(request.host), self.uid),
         )
         self.cache_file = open(self.cache_file_path, "wb")

@@ -59,9 +59,8 @@ class FilterByURLRegexPlugin(HttpProxyBasePlugin):
         request_host = None
         if request.host:
             request_host = request.host
-        else:
-            if b'host' in request.headers:
-                request_host = request.header(b'host')
+        elif request.headers and b'host' in request.headers:
+            request_host = request.header(b'host')

         if not request_host:
             logger.error("Cannot determine host")

@@ -10,7 +10,7 @@
 """
 from typing import Any

-from ..http.parser import HttpParser, httpParserTypes, httpParserStates
+from ..http.parser import HttpParser, httpParserTypes
 from ..http.proxy import HttpProxyBasePlugin

@@ -34,7 +34,7 @@ class ModifyChunkResponsePlugin(HttpProxyBasePlugin):
             # Note that these chunks also include headers
             self.response.parse(chunk.tobytes())
             # If response is complete, modify and dispatch to client
-            if self.response.state == httpParserStates.COMPLETE:
+            if self.response.is_complete:
                 # Avoid setting a body for responses where a body is not expected.
                 # Otherwise, example curl will report warnings.
                 if self.response.body_expected:

@@ -23,6 +23,7 @@ class TestCase(unittest.TestCase):
     """Base TestCase class that automatically setup and tear down proxy.py."""

     DEFAULT_PROXY_PY_STARTUP_FLAGS = [
+        '--port', '0',
         '--num-workers', '1',
         '--num-acceptors', '1',
         '--threadless',

@@ -48,7 +49,7 @@ class TestCase(unittest.TestCase):

         cls.PROXY.__enter__()
         assert cls.PROXY.acceptors
-        cls.wait_for_server(cls.PROXY.acceptors.flags.port)
+        cls.wait_for_server(cls.PROXY.flags.port)

     @staticmethod
     def wait_for_server(

@@ -1,2 +1,2 @@
 setuptools-scm == 6.3.2
-twine==3.6.0
+twine==3.7.0

@@ -1,17 +1,22 @@
 wheel==0.37.0
 python-coveralls==2.9.3
-coverage==6.1.2
+coverage==6.2
 flake8==4.0.1
 pytest==6.2.5
 pytest-cov==3.0.0
-pytest-xdist == 2.4.0
+pytest-xdist == 2.5.0
 pytest-mock==3.6.1
 pytest-asyncio==0.16.0
 autopep8==1.6.0
 mypy==0.910
-py-spy==0.3.10
+py-spy==0.3.11
 codecov==2.1.12
 tox==3.24.4
 mccabe==0.6.1
-pylint==2.12.1
+pylint==2.12.2
 rope==0.22.0
+# Required by test_http2.py
+httpx==0.20.0
+h2==4.1.0
+hpack==4.0.0
+hyperframe==6.0.1

@@ -1,2 +1,2 @@
-paramiko==2.8.0
-types-paramiko==2.8.2
+paramiko==2.8.1
+types-paramiko==2.8.4

@@ -0,0 +1,38 @@
+# -*- coding: utf-8 -*-
+"""
+    proxy.py
+    ~~~~~~~~
+    ⚡⚡⚡ Fast, Lightweight, Pluggable, TLS interception capable proxy server focused on
+    Network monitoring, controls & Application development, testing, debugging.
+
+    :copyright: (c) 2013-present by Abhinav Singh and contributors.
+    :license: BSD, see LICENSE for more details.
+"""
+import pytest
+import httpx
+
+from proxy.common._compat import IS_WINDOWS  # noqa: WPS436
+from proxy import TestCase
+
+
+class TestHttp2WithProxy(TestCase):
+
+    @pytest.mark.skipif(
+        IS_WINDOWS,
+        reason='--threadless not supported on Windows',
+    )   # type: ignore[misc]
+    def test_http2_via_proxy(self) -> None:
+        assert self.PROXY
+        response = httpx.get(
+            'https://httpbin.org/get',
+            headers={'accept': 'application/json'},
+            verify=httpx.create_ssl_context(http2=True),
+            timeout=httpx.Timeout(timeout=5.0),
+            proxies={
+                'all://': 'http://localhost:%d' % self.PROXY.flags.port,
+            },
+        )
+        self.assertEqual(response.status_code, 200)
+
+    # def test_http2_streams_over_proxy_keep_alive_connection(self) -> None:
+    #     pass

@@ -193,7 +193,8 @@ class TestHttpParser(unittest.TestCase):
         self.assertTrue(self.parser.has_header(b'key'))

     def test_set_host_port_raises(self) -> None:
-        with self.assertRaises(KeyError):
+        # Assertion for url will fail
+        with self.assertRaises(AssertionError):
             self.parser._set_line_attributes()

     def test_find_line(self) -> None:

@@ -243,6 +244,7 @@ class TestHttpParser(unittest.TestCase):
         self.assertEqual(self.parser._url.port, None)
         self.assertEqual(self.parser.version, b'HTTP/1.1')
         self.assertEqual(self.parser.state, httpParserStates.COMPLETE)
+        assert self.parser.headers
         self.assertEqual(
             self.parser.headers[b'host'], (b'Host', b'example.com'),
         )

@@ -296,7 +298,7 @@ class TestHttpParser(unittest.TestCase):
             self.parser.total_size,
             len(pkt) + len(CRLF) + len(host_hdr),
         )
-        self.assertDictEqual(self.parser.headers, {})
+        assert self.parser.headers is None
         self.assertEqual(self.parser.buffer, b'Host: localhost:8080')
         self.assertEqual(self.parser.state, httpParserStates.LINE_RCVD)

@@ -305,6 +307,7 @@ class TestHttpParser(unittest.TestCase):
             self.parser.total_size, len(pkt) +
             (3 * len(CRLF)) + len(host_hdr),
         )
+        assert self.parser.headers is not None
         self.assertEqual(
             self.parser.headers[b'host'],
             (

@@ -330,6 +333,7 @@ class TestHttpParser(unittest.TestCase):
         self.assertEqual(self.parser.state, httpParserStates.LINE_RCVD)

         self.parser.parse(b'localhost:8080' + CRLF)
+        assert self.parser.headers
         self.assertEqual(
             self.parser.headers[b'host'],
             (

@@ -345,6 +349,7 @@ class TestHttpParser(unittest.TestCase):

         self.parser.parse(b'Content-Type: text/plain' + CRLF)
         self.assertEqual(self.parser.buffer, b'')
+        assert self.parser.headers
         self.assertEqual(
             self.parser.headers[b'content-type'], (
                 b'Content-Type',

@@ -373,6 +378,7 @@ class TestHttpParser(unittest.TestCase):
         self.assertEqual(self.parser._url.hostname, b'localhost')
         self.assertEqual(self.parser._url.port, None)
         self.assertEqual(self.parser.version, b'HTTP/1.1')
+        assert self.parser.headers
         self.assertEqual(
             self.parser.headers[b'content-type'],
             (b'Content-Type', b'application/x-www-form-urlencoded'),

@@ -528,6 +534,7 @@ class TestHttpParser(unittest.TestCase):
             b'<TITLE>301 Moved</TITLE></HEAD><BODY>\n<H1>301 Moved</H1>\nThe document has moved\n' +
             b'<A HREF="http://www.google.com/">here</A>.\r\n</BODY></HTML>\r\n',
         )
+        assert self.parser.headers
         self.assertEqual(
             self.parser.headers[b'content-length'],
             (b'Content-Length', b'219'),

@@ -550,6 +557,7 @@ class TestHttpParser(unittest.TestCase):
             b'X-Frame-Options: SAMEORIGIN\r\n',
         ]),
         )
+        assert self.parser.headers
         self.assertEqual(
             self.parser.headers[b'x-frame-options'],
             (b'X-Frame-Options', b'SAMEORIGIN'),

@@ -163,8 +163,8 @@ class TestHttpProxyTlsInterception(Assertions):
             ssl.Purpose.SERVER_AUTH, cafile=str(DEFAULT_CA_FILE),
         )
         # self.assertEqual(self.mock_ssl_context.return_value.options,
-        #                  ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3 | ssl.OP_NO_TLSv1 |
-        #                  ssl.OP_NO_TLSv1_1)
+        # ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3 | ssl.OP_NO_TLSv1 |
+        # ssl.OP_NO_TLSv1_1)
         self.assertEqual(plain_connection.setblocking.call_count, 2)
         self.mock_ssl_context.return_value.wrap_socket.assert_called_with(
             plain_connection, server_hostname=host,

@@ -31,18 +31,18 @@ class TestProxyPyEmbedded(TestCase):
     integration test suite for proxy.py."""

     PROXY_PY_STARTUP_FLAGS = TestCase.DEFAULT_PROXY_PY_STARTUP_FLAGS + [
-        '--enable-web-server', '--port', '0',
+        '--enable-web-server',
     ]

     def test_with_proxy(self) -> None:
         """Makes a HTTP request to in-build web server via proxy server."""
-        assert self.PROXY and self.PROXY.acceptors
+        assert self.PROXY
         with socket_connection(('localhost', self.PROXY.flags.port)) as conn:
             conn.send(
                 build_http_request(
-                    httpMethods.GET, b'http://localhost:%d/' % self.PROXY.acceptors.flags.port,
+                    httpMethods.GET, b'http://localhost:%d/' % self.PROXY.flags.port,
                     headers={
-                        b'Host': b'localhost:%d' % self.PROXY.acceptors.flags.port,
+                        b'Host': b'localhost:%d' % self.PROXY.flags.port,
                     },
                 ),
             )

tox.ini

@@ -266,6 +266,7 @@ deps =
     pytest-mock >= 3.6.1
     -r docs/requirements.in
     -r requirements-tunnel.txt
+    -r requirements-testing.txt
     -r benchmark/requirements.txt
 isolated_build = true
 skip_install = true

@@ -40,7 +40,7 @@ echo "# coding: utf-8
 # file generated by setuptools_scm
 # don't change, don't track in version control
 version = '${VERSION}'
-version_tuple = (${MAJOR}, ${MINOR}, ${PATCH}, '${DISTANCE}', '${DATE_AND_HASH}')" > \
+version_tuple = (${MAJOR}, ${MINOR}, '${PATCH}', '${DISTANCE}', '${DATE_AND_HASH}')" > \
     proxy/common/_scm_version.py

 echo $MAJOR.$MINOR.$PATCH.$DISTANCE
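The `version_tuple` fix above quotes `${PATCH}` because for a pre-release such as this one (2.4.0rc3) the patch component is `0rc3`, which is not a valid integer literal. A quick illustration of why the unquoted form breaks — a hypothetical helper, not part of the script:

```python
from typing import Union


def patch_as_tuple_item(patch: str) -> Union[int, str]:
    """Mimic how a version_tuple patch component should be emitted."""
    try:
        # Plain releases: '1' -> 1
        return int(patch)
    except ValueError:
        # Pre-releases: '0rc3' is not an int, keep it as a string
        return patch
```

Quoting the component in the generated file makes both `(2, 4, 1, ...)` and `(2, 4, '0rc3', ...)` valid Python.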