This adds a function `micropip.freeze()` which creates a `packages.json`
lock file which includes all packages that were loaded by micropip in the
current Pyodide session. This can be used in subsequent sessions to load
the same set of packages with `pyodide.loadPackage`.
For example, in our REPL:
```py
from js.console import log
import micropip
await micropip.install("sphinx-version-warning") # Installs 19 wheels from pypi
log(micropip.freeze())
```
Then, opening the browser console, we can copy the JSON and make a new
`packages.json` file out of it. (Our REPL will just say "<Long output truncated>";
the browser console also truncates the display, but it provides tools that make it
easy to copy the full value.) After reloading the page with the new `packages.json`,
`versionwarning` will autoload:
```py
import versionwarning # Automatically loads 19 wheels from PyPI
```
We detected that `versionwarning` is a top-level import provided by the `sphinx-version-warning`
package via the `top_level.txt` file. This file is part of the `egg-info` spec and isn't
mentioned anywhere in the `dist-info` spec, but wheels seem to include it nonetheless.
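A minimal sketch of that lookup using `importlib.metadata` (the function name and the fallback are illustrative, not the exact implementation):
```py
from importlib.metadata import distribution

def top_level_imports(package_name: str) -> list[str]:
    # Read top_level.txt from the installed dist-info, if the wheel shipped one.
    dist = distribution(package_name)
    text = dist.read_text("top_level.txt")
    if text:
        return text.split()
    # Fall back to guessing the import name from the project name.
    return [package_name.replace("-", "_")]
```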
This PR adds support for extras to the `mock_fetch` fixture. A simple example would be as follows:
```python
@pytest.mark.asyncio
async def test_extras(mock_fetch: mock_fetch_cls, mock_importlib):
mock_fetch.add_pkg_version("reportlab")
mock_fetch.add_pkg_version("beagle-vote", extras={"pdf" : ["reportlab"]})
await micropip.install("beagle-vote[pdf]")
pkg_list = micropip.list()
assert "reportlab" in pkg_list
```
This test does not pass on #2535, but hopefully it will pass on branch #2584.
Various improvements to `test_micropip`. The main feature is a new fixture
`mock_fetch` with an `add_pkg` method that takes a package name, a map
from version to requirements, and a choice of platform. This should
hopefully make writing more tests a lot easier (which is good because
we could use more micropip test coverage but we are limited by the
difficulty of writing good tests).
This also adds a fixture to create distinct dummy package names and
enables `@pytest.mark.asyncio` to handle the async calls rather than
using `asyncio.get_event_loop().run_until_complete`.
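For illustration, a test using `add_pkg` might look roughly like this (the dummy package names and exact argument shapes are assumptions based on the description above):
```python
@pytest.mark.asyncio
async def test_transitive_dependency(mock_fetch: mock_fetch_cls, mock_importlib):
    # A dependency with two available versions and no requirements of its own.
    mock_fetch.add_pkg("dummy-dep", {"1.0.0": [], "2.0.0": []}, "any")
    # A package whose only version depends on dummy-dep.
    mock_fetch.add_pkg("dummy-pkg", {"1.0.0": ["dummy-dep"]}, "any")
    await micropip.install("dummy-pkg")
    pkg_list = micropip.list()
    assert "dummy-pkg" in pkg_list and "dummy-dep" in pkg_list
```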
* chore: add some incomplete types
* chore: modernize pyproject.toml
Adding more incomplete types. We are about two-thirds of the way toward being
able to turn on the corresponding strictness flag.
BrowserFS can mount custom filesystems into Emscripten.
However, it requires the `PATH` and `ERRNO_CODES` exports from
Emscripten in addition to `FS`.
This exports `PATH` and `ERRNO_CODES` from `Module` into the `pyodide`
JavaScript API so they can be used with BrowserFS.
Currently the following code fails:
```py
from js import eval
eval("Object.create(null)")
```
with:
```py
Traceback (most recent call last):
File "<console>", line 1, in <module>
JsException: TypeError: Cannot read properties of undefined (reading 'name')
```
This fixes it.
This switches to using the file system to store information about installed packages
and `importlib.metadata` to retrieve it. This has two related benefits:
* We don't have to maintain our own separate state about what we've installed.
* We are guaranteed to agree with the Python stdlib about whether or not a
  package is installed and, if so, which version is installed. Since the state is
  maintained in only one place, there is no chance of it diverging.
According to the packaging specs, if the package comes from a URL we should record
that URL in a `direct_url.json` file, as described here:
packaging.python.org/en/latest/specifications/direct-url/#direct-url
Other than that, we should set `INSTALLER` to `pyodide.loadPackage` if the package
was loaded with `pyodide.loadPackage`, either directly or indirectly via micropip,
and set `INSTALLER` to `micropip` otherwise. That way we can maintain the current
behavior of `micropip.list`:
* if `direct_url.json` is present, the source is the URL in `direct_url.json`;
* otherwise, the source is `pyodide` if `INSTALLER` is `pyodide.loadPackage` and
  `micropip` if `INSTALLER` is `micropip`.
Oddly enough, the packaging specs suggest no place to record the source index, so it
seems that according to the specs, if a package was installed by name from some
custom index, this information should not be recorded.
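A sketch of how `micropip.list` could derive the source from this on-disk state (the helper name is illustrative; the real code may differ):
```py
import json
from importlib.metadata import Distribution

def package_source(dist: Distribution) -> str:
    # Prefer the URL recorded per the direct-url spec, if present.
    direct_url = dist.read_text("direct_url.json")
    if direct_url:
        return json.loads(direct_url)["url"]
    # Otherwise fall back to the INSTALLER file written at install time.
    installer = (dist.read_text("INSTALLER") or "").strip()
    if installer == "pyodide.loadPackage":
        return "pyodide"
    return installer or "unknown"
```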
This adds the pyodide_build commands `create_xbuildenv`, which creates a
cross-build environment (from a copy of Pyodide where scipy has been
built), and `install_xbuildenv`, which installs the cross-build environment
into a fresh copy of Pyodide.
I successfully installed the xbuild environment into a fresh checkout of
Pyodide then built statsmodels and scikit-learn in isolation, without
building the Python interpreter, numpy, or scipy. I dumped the generated
wheels into a copy of Pyodide downloaded from CI, and was able to import
and use them as normal.
The size of the xbuild environment is 1.5 megabytes, of which 1.2 megabytes
is Python headers.
In a subsequent PR, we can update the CI to automatically upload these
to AWS S3 and then install the environment from there.
It's possible that while we were awaiting `_get_pypi_json`, some other branch
of the resolver already installed a different version of the same wheel. We need
to check a second time whether the constraint is satisfied, between when we
get the PyPI metadata and when we put the information into `transaction.locked`,
or else we might accidentally install the same wheel twice.
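Schematically, the fix looks something like this (helper names and signatures are placeholders for the actual resolver code):
```py
async def _add_pypi_wheel(self, req):
    if self.is_satisfied(req):  # first check, before the network round trip
        return
    metadata = await _get_pypi_json(req.name)
    # While we were awaiting, another branch of the resolver may have locked a
    # wheel for this package, so check the constraint again before locking.
    if self.is_satisfied(req):
        return
    self.locked[req.name] = find_matching_wheel(metadata, req)
```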
Closes #2557:
Users keep reporting issues about micropip not finding a pure Python wheel
(the latest being pyscript/pyscript#297), so it appears the current message is
not explicit enough.
We should explain in more detail:
* what is happening and why
* what the user can do about it, possibly pointing them to the issue tracker.
More micropip maintenance.
* `_parse_wheel_url` ==> static method `WheelInfo.from_url`
* `_extract_wheel` ==> `WheelInfo.extract`
* `_validate_wheel` ==> `WheelInfo.validate`
* `_install_wheel` ==> `WheelInfo.install`
* Parts of `Transaction.add_wheel` were extracted into new methods
`WheelInfo.download` and `WheelInfo.requires`.
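The resulting shape of the class is roughly the following skeleton (fields and signatures are abbreviated, not the exact code):
```py
from dataclasses import dataclass
from packaging.version import Version

@dataclass
class WheelInfo:
    name: str
    version: Version
    url: str

    @staticmethod
    def from_url(url: str) -> "WheelInfo": ...   # was _parse_wheel_url

    async def download(self, fetch_kwargs): ...  # extracted from Transaction.add_wheel
    def requires(self, extras): ...              # extracted from Transaction.add_wheel
    def validate(self, sha256): ...              # was _validate_wheel
    def extract(self, target_dir): ...           # was _extract_wheel
    async def install(self): ...                 # was _install_wheel
```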
Steps for `add_requirement`:
1. Convert `req` from a string to a Requirement if necessary
2. If `req.marker` is present, check whether `req.marker` matches `ctx`; if it doesn't, we don't need the requirement.
3. Check if `req` is already installed
4. Check if `req` is available from `packages.json`
5. Check if `req` is available from pypi
For some reason we had step 4 occurring before steps 2 and 3. This puts them in the correct order.
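In code, the corrected ordering looks roughly like this (helper names are illustrative):
```py
from packaging.requirements import Requirement

async def add_requirement(self, req: str | Requirement) -> None:
    if isinstance(req, str):
        req = Requirement(req)                        # step 1
    if req.marker and not req.marker.evaluate(self.ctx):
        return                                        # step 2: marker doesn't match ctx
    if self.check_installed(req):
        return                                        # step 3: already installed
    if await self.add_from_packages_json(req):
        return                                        # step 4: available from packages.json
    await self.add_from_pypi(req)                     # step 5: fall back to PyPI
```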
This is more reorganization of `micropip` to try to make the logic easier to follow.
I turned `INSTALLED_PACKAGES` into a global variable, and turned `_install` from a
class method into a top-level function merged with `install`. The other methods of
`PackageManager` became `Transaction` methods, and I removed `PackageManager` entirely.
This is more cleanup in micropip. I moved `ctx` and `fetch_extra_kwargs` into `Transaction`
properties so that we don't have to pass them through everywhere as separate arguments.
I also switched to creating the `Transaction` in `install` so that we don't have to pass the arguments
down into `gather_requirements` separately.
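As a rough sketch of the resulting shape of `Transaction` (fields beyond `ctx` and `fetch_extra_kwargs` are illustrative):
```py
from dataclasses import dataclass, field
from typing import Any

@dataclass
class Transaction:
    ctx: dict[str, str]
    fetch_extra_kwargs: dict[str, Any]
    wheels: list = field(default_factory=list)
    locked: dict = field(default_factory=dict)

    async def gather_requirements(self, requirements: list[str]) -> None:
        # ctx and fetch_extra_kwargs are attributes now, so they no longer need
        # to be threaded through every call as separate arguments.
        ...
```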
This PR does the following to complete #2431:
* Calculate the Subresource Integrity hash from the sha256 checksum, as explained here:
  https://developer.mozilla.org/en-US/docs/Web/Security/Subresource_Integrity
  As that document explains, it is a base64-encoded string of the "binary" hash,
  not the data returned by the `hexdigest()` method (see the sketch below).
* Implement the verification of the checksum when loading packages.
Verification is skipped in the Node.js environment because the node-fetch module
doesn't support the integrity option. Node 17.5.0 has added the fetch API as
experimental; it would be prudent to come back and fix this when we are ready
to use that version.
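A minimal sketch of the hash conversion (the function name is illustrative):
```py
import base64

def sri_from_sha256(sha256_hex: str) -> str:
    # The SRI value is "sha256-" plus the base64 encoding of the raw digest
    # bytes, not of the hex string returned by hexdigest().
    raw = bytes.fromhex(sha256_hex)
    return "sha256-" + base64.b64encode(raw).decode()
```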
This moves unisolation into a package key. `cross-build-env: true` means the package
is part of the cross-build environment and should be unisolated. `cross-build-files`
gives a list of files that should be copied from the built copy of the package into the host
copy of the package.
This will allow us to construct a cross-build environment automatically as part of building
packages. If we have these files and the Python include directory, that is sufficient for
cross-building binary packages.