It's possible that while we were awaiting `_get_pypi_json`, some other branch
of the resolver already installed a different version of the same package. We need
to check a second time whether the constraint is satisfied, between when we
get the PyPI metadata and when we put the information into `transaction.locked`,
or else we might accidentally install the same wheel twice.
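A minimal sketch of the double-check, assuming a hypothetical method shape around it
(`transaction.locked` and `_get_pypi_json` come from the description above;
`add_wheel_from_pypi`, the specifier check, and the error message are illustrative):

```python
async def add_wheel_from_pypi(self, req: Requirement) -> None:
    # Other resolver branches can run while we await this network call.
    metadata = await _get_pypi_json(req.name)

    # Check a second time: another branch may have locked this package
    # while we were awaiting. If the locked version already satisfies the
    # constraint, stop here rather than installing the same wheel twice.
    locked = self.locked.get(req.name)
    if locked is not None:
        if req.specifier.contains(locked.version, prereleases=True):
            return
        raise ValueError(
            f"Can't install {req}: {req.name}=={locked.version} is already locked"
        )

    # ... otherwise pick a wheel from `metadata` and record it in self.locked.
```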
Closes #2557:
Users keep reporting issues about micropip not finding a pure Python wheel
(most recently in pyscript/pyscript#297), so it appears that the current message
is not explicit enough.
We should explain in more detail (an illustrative message follows this list):
* what is happening and why
* what the user can do about it, possibly pointing them to the issue tracker.
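For illustration only, the failure path could raise a message along these lines
(the wording, exception type, and issue-tracker link target are hypothetical, not the final text):

```python
raise ValueError(
    f"Can't find a pure Python 3 wheel for '{req}'.\n"
    "micropip can only install pure Python wheels from PyPI; this package\n"
    "either publishes no wheels at all or only platform-specific ones.\n"
    "You can ask the package maintainers to publish a pure Python wheel,\n"
    "or report the problem at https://github.com/pyodide/pyodide/issues."
)
```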
More micropip maintenance.
* `_parse_wheel_url` ==> static method `WheelInfo.from_url`
* `_extract_wheel` ==> `WheelInfo.extract`
* `_validate_wheel` ==> `WheelInfo.validate`
* `_install_wheel` ==> `WheelInfo.install`
* Parts of `Transaction.add_wheel` were extracted into new methods
`WheelInfo.download` and `WheelInfo.requires`.
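A skeleton of the resulting class (the field set and exact signatures are assumptions
inferred from the renames above; bodies elided):

```python
from dataclasses import dataclass

@dataclass
class WheelInfo:
    name: str
    version: str
    filename: str
    url: str

    @staticmethod
    def from_url(url: str) -> "WheelInfo":  # was _parse_wheel_url
        ...

    async def download(self, fetch_kwargs: dict) -> None:  # split out of Transaction.add_wheel
        ...

    def validate(self, expected_sha256: str | None) -> None:  # was _validate_wheel
        ...

    def extract(self, target_dir: str) -> None:  # was _extract_wheel
        ...

    def requires(self, extras: frozenset[str]) -> list[str]:  # split out of Transaction.add_wheel
        ...

    async def install(self) -> None:  # was _install_wheel
        ...
```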
Steps for `add_requirement`:
1. Convert `req` from a string to a Requirement if necessary
2. If `req.marker` is present, check whether it matches `ctx`; if not, we don't need this requirement.
3. Check if `req` is already installed
4. Check if `req` is available from `packages.json`
5. Check if `req` is available from pypi
For some reason we had step 4 occurring before steps 2 and 3; this puts them in the correct order (sketched below).
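A sketch of the resulting control flow (only the step order and `Requirement`, `marker`,
and `ctx` come from this description; the helper names are hypothetical):

```python
from packaging.requirements import Requirement

async def add_requirement(self, req: str | Requirement) -> None:
    if isinstance(req, str):
        req = Requirement(req)  # 1. parse the string if necessary

    if req.marker is not None and not req.marker.evaluate(self.ctx):
        return  # 2. marker doesn't match ctx: we don't need this requirement

    if self._check_installed(req):
        return  # 3. already installed

    if req.name in self._builtin_packages:
        await self._add_builtin(req)  # 4. available from packages.json
        return

    await self._add_from_pypi(req)  # 5. fall back to PyPI
```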
This is more reorganization of `micropip` to try to make the logic easier to follow.
I turned `INSTALLED_PACKAGES` into a module-level global variable, and turned `_install`
from a class method into a top-level function merged into `install`. The other methods
of `PackageManager` became `Transaction` methods, and I removed `PackageManager` entirely.
This is more cleanup in micropip. I moved `ctx` and `fetch_extra_kwargs` into `Transaction`
properties so that we don't have to pass them through everywhere as separate arguments.
I also switched to creating the `Transaction` in `install` so that we don't have to pass the arguments
down into `gather_requirements` separately.
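A minimal sketch of the resulting shape (the names `INSTALLED_PACKAGES`, `install`,
`Transaction`, `ctx`, `fetch_extra_kwargs`, and `gather_requirements` come from these
two changes; the field types and everything else are assumptions):

```python
from dataclasses import dataclass, field
from packaging.markers import default_environment

INSTALLED_PACKAGES: dict[str, "WheelInfo"] = {}  # module-level global, was PackageManager state

@dataclass
class Transaction:
    ctx: dict[str, str]  # marker environment, carried by the transaction itself
    fetch_extra_kwargs: dict[str, str]
    wheels: list["WheelInfo"] = field(default_factory=list)
    locked: dict[str, "WheelInfo"] = field(default_factory=dict)

    async def gather_requirements(self, requirements: list[str]) -> None:
        ...  # no longer needs ctx / fetch_extra_kwargs passed in separately

async def install(requirements: list[str], **fetch_extra_kwargs) -> None:
    # install() builds the Transaction itself, so nothing has to be threaded
    # down into gather_requirements as extra arguments.
    transaction = Transaction(
        ctx=dict(default_environment()),
        fetch_extra_kwargs=fetch_extra_kwargs,
    )
    await transaction.gather_requirements(requirements)
```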
This PR does the following to complete #2431:
* Calculate the Subresource Integrity hash from the sha256 checksum as explained here:
https://developer.mozilla.org/en-US/docs/Web/Security/Subresource_Integrity
As that document explains, it is a base64-encoded string of the "binary" digest,
not the hex string returned by the `hexdigest()` method (see the sketch after this list).
* Implement verification of the checksum when loading packages.
This skips verification in the Node.js environment, as the node-fetch module
doesn't support the integrity option. Node 17.5.0 added the fetch API as
experimental; it would be prudent to come back and fix this when we are ready
to use that version.
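A minimal, self-contained sketch of the conversion (the function name is illustrative):

```python
import base64
import hashlib

def sri_from_sha256(data: bytes) -> str:
    # Subresource Integrity wants base64 of the raw ("binary") digest,
    # not the hex string returned by hexdigest().
    digest = hashlib.sha256(data).digest()
    return "sha256-" + base64.b64encode(digest).decode("ascii")

# The resulting string is what the browser expects in fetch(url, {integrity: ...}).
assert sri_from_sha256(b"hello") == "sha256-LPJNul+wow4m6DsqxbninhsWHlwfp0JecwQzYpOLmCQ="
```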
This moves unisolation into a package key. `cross-build-env: true` means the package
is part of the cross-build environment and should be unisolated. `cross-build-files`
gives a list of files that should be copied from the built copy of the package into the host
copy of the package.
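For concreteness, a hypothetical `meta.yaml` fragment using these keys (only the two
key names come from this change; the nesting under `build` and the example file path
are assumptions):

```yaml
build:
  cross-build-env: true  # this package is part of the cross-build environment; unisolate it
  cross-build-files:     # copied from the built copy of the package into the host copy
    - numpy/core/include/numpy/arrayobject.h
```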
This will allow us to construct a cross-build environment automatically as part of building
packages. If we have these files and the Python include directory, this is sufficient for
cross-building binary packages.
It's useful to be able to hash on object identity. For instance, in Python,
if we want to store objects in a dictionary keyed on object identity, we use
`id(obj)` as the key. This adds a `js_id` attribute to `JsProxy` so that
`jsobj.js_id` can be used as a dictionary key keyed on the identity of the
underlying JavaScript object.
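A minimal usage sketch (assumes a Pyodide environment where the `js` module is
available; `document` is just a convenient source of JavaScript objects):

```python
from js import document

correspondence: dict[int, str] = {}

elt = document.createElement("div")
# Key on the identity of the underlying JavaScript object, analogous to id(obj):
correspondence[elt.js_id] = "Python state attached to this JS object"

# A different proxy wrapping the same JS object reports the same js_id:
elt2 = document.body.appendChild(elt)
assert elt2.js_id == elt.js_id
assert correspondence[elt2.js_id] == "Python state attached to this JS object"
```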