Frequently when updating Python or Emscripten, a handful of packages
stop working. Because of the way that the CI works, these prevent
other packages from being built or tested. However, packages with
many dependents are currently annoying to disable. This adds a key
`disable: true` to `meta.yaml` that turns off a package.
I also fixed the `"!package"` and `no-numpy-dependents` options to be transitive.
We use graphlib to sort the packages topologically, then we traverse
the packages once in build order to locate all the packages which
transitively depend on disabled packages. Then we traverse in reverse
build order to locate all packages that are dependencies of non-disabled
requested packages.
We use `packaging.tags.sys_tags` to get the list of supported tags,
then use `packaging.utils.parse_wheel_filename` to get the set of
tags the wheel implements, then check whether one of the wheel's
tags is a supported tag. This check is fully accurate and also
catches things like abi3 wheels that are compatible with
multiple Python versions.
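A minimal sketch of that check (the helper name is ours, not micropip's API):

```python
from packaging.tags import sys_tags
from packaging.utils import parse_wheel_filename

def wheel_is_compatible(filename: str) -> bool:
    # parse_wheel_filename returns (name, version, build, frozenset of tags).
    *_, wheel_tags = parse_wheel_filename(filename)
    # sys_tags() yields every tag the running interpreter supports, most specific first.
    return any(tag in wheel_tags for tag in sys_tags())

print(wheel_is_compatible("packaging-21.3-py3-none-any.whl"))  # True on any platform
```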
`sysconfig.py` uses the environment variable `_PYTHON_SYSCONFIGDATA_NAME`
to decide where to look for the sysconfig data file with info about the compile target.
We also need to separately ensure that our sysconfig data file is on the path. We
don't want the rest of our target stdlib on the path, so I made an extra `sysconfigdata`
folder, copied the sysconfig data into it, and put it on the path.
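A self-contained toy showing how the two pieces fit together; the data module name and its contents below are fake stand-ins for the real Emscripten sysconfig data:

```python
import os
import sys
import tempfile

# Write a fake sysconfig data module into its own folder.
datadir = tempfile.mkdtemp()
with open(os.path.join(datadir, "_sysconfigdata__demo_target.py"), "w") as f:
    f.write("build_time_vars = {'SOABI': 'cpython-310-wasm32-emscripten'}\n")

# Point sysconfig at it and put only that folder (not a whole stdlib) on the path.
os.environ["_PYTHON_SYSCONFIGDATA_NAME"] = "_sysconfigdata__demo_target"
sys.path.insert(0, datadir)

import sysconfig  # config vars are loaded lazily, on the first get_config_var() call

print(sysconfig.get_config_var("SOABI"))  # -> 'cpython-310-wasm32-emscripten'
```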
I added a new `wheel_base` fixture which creates a new temporary
directory and mocks `site.getsitepackages()` to return it. I also removed
the `WHEEL_BASE` global variable from micropip and instead calculate
it each time in `micropip.install`.
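A sketch of what such a fixture can look like (the details here are ours, not necessarily the committed code):

```python
import site

import pytest

@pytest.fixture
def wheel_base(monkeypatch, tmp_path):
    # Fresh temp dir per test, presented as the only site-packages directory.
    wheel_dir = tmp_path / "wheel_base"
    wheel_dir.mkdir()
    monkeypatch.setattr(site, "getsitepackages", lambda: [str(wheel_dir)])
    yield wheel_dir
```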
This adds a function `micropip.freeze()` which creates a `packages.json`
lock file which includes all packages that were loaded by micropip in the
current Pyodide session. This can be used in subsequent sessions to load
the same set of packages with `pyodide.loadPackage`.
For example in our repl:
```py
from js.console import log
import micropip
await micropip.install("sphinx-version-warning") # Installs 19 wheels from pypi
log(micropip.freeze())
```
Then opening the browser console, we can copy the JSON and make a new
`packages.json` file out of it. (Our repl will just say "<Long output truncated>",
but the browser console doesn't truncate it and provides tools to make it easy to copy.)
Reloading the page with the new `packages.json`, `versionwarning` will autoload:
```py
import versionwarning # Automatically loads 19 wheels from PyPI
```
We detected that `versionwarning` is an export provided by the `sphinx-version-warning`
package via the `top_level.txt` file. This file is part of the `egg-info` spec but isn't
mentioned anywhere in the `dist-info` spec. But wheels seem to include it nonetheless.
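For illustration, those names can be read via `importlib.metadata` (the helper is hypothetical and only works for an installed distribution that shipped a `top_level.txt`):

```python
from importlib.metadata import distribution

def top_level_names(dist_name: str) -> list[str]:
    # top_level.txt lists one importable top-level name per line.
    text = distribution(dist_name).read_text("top_level.txt") or ""
    return text.split()

# e.g. top_level_names("sphinx-version-warning") -> ["versionwarning"]
```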
This PR adds support for extras to the `mock_fetch` fixture. A simple example would be as follows:
```python
@pytest.mark.asyncio
async def test_extras(mock_fetch: mock_fetch_cls, mock_importlib):
    mock_fetch.add_pkg_version("reportlab")
    mock_fetch.add_pkg_version("beagle-vote", extras={"pdf": ["reportlab"]})
    await micropip.install("beagle-vote[pdf]")
    pkg_list = micropip.list()
    assert "reportlab" in pkg_list
```
This test does not pass on #2535, but hopefully it will pass on branch #2584.
Various improvements to `test_micropip`. The main feature is a new fixture
`mock_fetch` with an `add_pkg` method that takes a package name, a map
from version to requirements, and a choice of platform. This should
hopefully make writing more tests a lot easier (which is good because
we could use more micropip test coverage but we are limited by the
difficulty of writing good tests).
This also adds a fixture to create distinct dummy package names and
enables `@pytest.mark.asyncio` to handle the async calls rather than
using `asyncio.get_event_loop().run_until_complete`.
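A sketch of what the dummy-name fixture can look like (the fixture name and naming scheme are illustrative):

```python
from secrets import token_hex

import pytest

@pytest.fixture
def dummy_pkg_name():
    # A distinct, valid-looking package name for each test that requests it.
    return f"dummy-pkg-{token_hex(4)}"
```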
* chore: add some incomplete types
* chore: modernize pyproject.toml
Adding more incomplete types. We are about two-thirds of the way toward being
able to turn on the corresponding strictness flag.
This switches to using the file system to store the information about packages
and using importlib.metadata to retrieve it.
This has two related benefits:
* We don't have to maintain our own separate state about what we've installed.
* We are guaranteed to agree with the Python stdlib about whether or not a
  package is installed, and if so which version. Since the state is
  maintained in only one place, there is no chance of it diverging.
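A minimal sketch of the lookup this lets micropip delegate to the stdlib (the helper name is ours):

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(name: str) -> str | None:
    # importlib.metadata reads the dist-info directories on the file system,
    # so this agrees with whatever the rest of the stdlib believes is installed.
    try:
        return version(name)
    except PackageNotFoundError:
        return None
```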
According to the packaging specs, if the package is from a url we should put
that into a file here:
packaging.python.org/en/latest/specifications/direct-url/#direct-url
Other than that, we should set INSTALLER to `pyodide.loadPackage`
if the package was loaded with `pyodide.loadPackage`, either directly or
indirectly via micropip, and otherwise set INSTALLER to `micropip`. That
way we can maintain the current behavior of `micropip.list`:
* if `direct_url.json` is present, the source is the url in `direct_url.json`;
* otherwise, the source is `pyodide` if INSTALLER is `pyodide.loadPackage`
  and `micropip` if INSTALLER is `micropip`.
Oddly enough, the packaging specs suggest no place to put the source
index, so it seems that according to the specs if a package was installed
by name from some custom index, this info should not be recorded.
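A sketch of how `micropip.list` can derive the "source" field from that metadata (the helper name is ours; the real code may differ):

```python
import json
from importlib.metadata import distribution

def package_source(name: str) -> str:
    dist = distribution(name)
    direct_url = dist.read_text("direct_url.json")
    if direct_url is not None:
        return json.loads(direct_url)["url"]   # installed from an explicit url
    installer = (dist.read_text("INSTALLER") or "").strip()
    return "pyodide" if installer == "pyodide.loadPackage" else "micropip"
```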
It's possible that while we were awaiting `_get_pypi_json`, some other branch
of the resolver already installed a different version of the same wheel. We need
to check a second time whether the constraint is satisfied, between when we
get the `pypi` metadata and when we put the information into `transaction.locked`,
or else we might accidentally install the same wheel twice.
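A self-contained sketch of the "check again after the await" pattern (names like `fetch_metadata` and `locked` are stand-ins for micropip's internals):

```python
import asyncio

from packaging.requirements import Requirement

locked: dict[str, str] = {}  # name -> version some branch already committed to

async def fetch_metadata(name: str) -> dict:
    await asyncio.sleep(0)  # stands in for the network round-trip to PyPI
    return {"name": name, "version": "2.0"}

async def add_requirement(req_str: str) -> None:
    req = Requirement(req_str)
    metadata = await fetch_metadata(req.name)
    # Another branch of the resolver may have locked a satisfying version while
    # we were awaiting; re-check before locking a second copy of the same package.
    if req.name in locked and req.specifier.contains(locked[req.name], prereleases=True):
        return
    locked[req.name] = metadata["version"]
```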
Closes #2557:
Users keep reporting issues about micropip not finding a pure python wheel,
e.g. most recently in pyscript/pyscript#297, so it appears the current message is
not explicit enough.
We should explain in more detail:
* what is happening and why
* what the user can do about it, possibly pointing them to the issue tracker.
More micropip maintenance.
* `_parse_wheel_url` ==> static method `WheelInfo.from_url`
* `_extract_wheel` ==> `WheelInfo.extract`
* `_validate_wheel` ==> `WheelInfo.validate`
* `_install_wheel` ==> `WheelInfo.install`
* Parts of `Transaction.add_wheel` were extracted into new methods
`WheelInfo.download` and `WheelInfo.requires`.
Steps for `add_requirement`:
1. Convert `req` from a string to a Requirement if necessary
2. If `req.marker` is present, check whether `req.marker` matches `ctx`; if not, we don't need it.
3. Check if `req` is already installed
4. Check if `req` is available from `packages.json`
5. Check if `req` is available from pypi
For some reason we had step 4 occurring before steps 2 and 3. This puts them in the correct order.
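A standalone sketch of the corrected ordering (the helper functions are stubs standing in for micropip's real lookups):

```python
from packaging.requirements import Requirement

def already_installed(req: Requirement) -> bool: return False    # stub
def in_packages_json(req: Requirement) -> bool: return False     # stub
def add_from_pypi(req: Requirement) -> None: print(f"PyPI: {req.name}")

def add_requirement(req: str | Requirement, ctx: dict[str, str]) -> None:
    if isinstance(req, str):
        req = Requirement(req)                 # 1. normalize to a Requirement
    if req.marker and not req.marker.evaluate(ctx):
        return                                 # 2. environment marker rules it out
    if already_installed(req):
        return                                 # 3. already installed
    if in_packages_json(req):
        return                                 # 4. available from packages.json
    add_from_pypi(req)                         # 5. fall back to PyPI
```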
This is more reorganization of `micropip` to try to make the logic easier to follow.
I turned `INSTALLED_PACKAGES` into a global variable. I turned `_install` from
a class method into a top-level function and merged it with `install`.
The other methods of `PackageManager` became `Transaction` methods, and I
removed `PackageManager` entirely.