This PR makes building CMake-based packages easier.
- Introduce a custom toolchain file for Pyodide. It inherits most of its settings from the original Emscripten toolchain file, with some modifications. Packages built by pyodide-build will automatically use this toolchain file.
- Proxy `cmake` in pywasmcross.py, so that CMake-driven builds also go through our proxied compiler toolchains (see the sketch below).
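For illustration, here is a minimal sketch, in Python, of what such a `cmake` proxy can look like: intercept the invocation, make sure a toolchain file is passed, and forward the call to the real cmake. This is not the actual pywasmcross.py code; the environment variable name and default toolchain path are hypothetical.

```python
import os
import subprocess
import sys


def proxy_cmake(args: list[str]) -> int:
    """Forward a cmake invocation, injecting the Pyodide toolchain file
    if the caller did not supply one."""
    toolchain = os.environ.get(
        "PYODIDE_CMAKE_TOOLCHAIN_FILE",  # hypothetical variable name
        "Pyodide.cmake",                 # hypothetical default path
    )
    if not any(a.startswith("-DCMAKE_TOOLCHAIN_FILE") for a in args):
        args = [f"-DCMAKE_TOOLCHAIN_FILE={toolchain}", *args]
    # In a real setup the shim must resolve the *real* cmake executable so it
    # does not re-invoke itself; "cmake" is used here for brevity.
    return subprocess.call(["cmake", *args])


if __name__ == "__main__":
    sys.exit(proxy_cmake(sys.argv[1:]))
```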
We shouldn't export symbols marked with `__attribute__((visibility("hidden")))`. Numpy has
a test which checks that symbols with this attribute aren't exported; this change makes that test pass.
This rule is out of date: we intend to ensure that HOST_INSTALL_DIR
consists of cross-build packages which have the right libs and includes
so they can be used.
* chore: add some incomplete types
* chore: modernize pyproject.toml
Adding more incomplete type annotations. This is about 2/3 of the way toward being
able to turn on the corresponding strictness flag.
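For context, "incomplete types" here means signatures that are only partially annotated. A hedged illustration (these functions are not from the codebase, and it is an assumption that the strictness flag in question is mypy's `--disallow-incomplete-defs`):

```python
# Incomplete: the argument is annotated, but the return type and the
# ``strict`` parameter are not. --disallow-incomplete-defs rejects this.
def normalize(name: str, strict=False):
    return name.lower().replace("-", "_"), strict


# Complete: this version would pass once the strictness flag is turned on.
def normalize_typed(name: str, strict: bool = False) -> tuple[str, bool]:
    return name.lower().replace("-", "_"), strict
```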
This resolves #2189.
> build isolation would be a bit difficult to use in our case, as for instance
> when building scipy we need the patched numpy on the host and not the numpy
> version specified in pyproject.toml (which would be unpatched)
This is indeed the case: certain packages cannot be isolated. My strategy is to
make a list of packages that shouldn't be isolated and add symlinks from the
isolated build environment into the `.artifacts` directory to "unisolate" them.
Then we remove the unisolated packages' requirements from the list of packages to
install, in case their pesky version constraints aren't satisfied. In particular, packages
that expect to be used with `pypa/build` often feel free to put very specific
constraints on their build dependencies (often pinning them to == a
particular version). Specific version constraints are good for build
reproducibility and cost nothing with build isolation, but since we are breaking
isolation for these packages, we just ignore the constraints. Hopefully nothing goes wrong.
In particular, any package that does work both at build time and at runtime, and
requires the build-time and runtime environments to stay in sync, needs
to be unisolated. This includes cffi with `_cffi_backend.so`, and of course numpy
and scipy. pycparser needs to be unisolated because it is a dependency of cffi.
Currently I have also unisolated pythran and cython, though these are build-time-only
tools and do not really need to be unisolated. I unisolated Cython
specifically because numcodecs needs it but doesn't list it in its build
dependencies. I unisolated Pythran because of a problem with the scipy build
which I don't fully understand (something to do with long double feature
detection).
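A minimal sketch of the unisolation strategy described above, assuming the host-built packages live in a site-packages directory under `.artifacts`; the helper names and directory layout are illustrative, not the actual pyodide-build implementation:

```python
import re
from pathlib import Path

# Packages whose host (possibly patched) builds must stay visible inside
# the isolated build environment.
UNISOLATED = {"numpy", "scipy", "cffi", "pycparser", "pythran", "cython"}


def unisolate(host_site_packages: Path, env_site_packages: Path) -> None:
    """Symlink host-built copies of unisolated packages into the isolated
    build environment."""
    for entry in host_site_packages.iterdir():
        # Match both package directories (numpy/) and *.dist-info directories.
        name = re.split(r"[-.]", entry.name, maxsplit=1)[0].lower()
        if name in UNISOLATED:
            link = env_site_packages / entry.name
            if not link.exists():
                link.symlink_to(entry)


def filter_build_requires(requires: list[str]) -> list[str]:
    """Drop build requirements provided by unisolated packages, so their
    (possibly unsatisfiable) version pins are ignored."""
    def req_name(req: str) -> str:
        return re.split(r"[<>=!~;\s\[]", req, maxsplit=1)[0].lower()

    return [r for r in requires if req_name(r) not in UNISOLATED]
```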
Our package build process currently has a significant flaw: we first run setup.py, recording all compilation commands; then we rewrite these compilation commands to invoke emcc and replay them; and then we pray that the cross-compiled outputs ended up in the right place to go into the wheel. This is not a good strategy, because the build script is allowed to implement arbitrary logic, and if it moves, renames, etc. any of the output files, we lose track of them. This has repeatedly caused difficulty for us.
However, we also make no particularly significant use of the two-pass approach. We can just do the simpler thing: capture the compiler commands as they occur, modify them as needed, and run the fixed commands immediately.
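As an illustration of the single-pass approach, here is a minimal compiler-shim sketch: a script placed on PATH under the native compiler's name, which rewrites the command into an emcc invocation and runs it right away. The specific flag handling is made up for the example; the real rewriting logic lives in pywasmcross.py.

```python
import subprocess
import sys


def rewrite(args: list[str]) -> list[str]:
    """Turn a native compiler invocation into an emcc invocation."""
    fixed = ["emcc"]
    for arg in args:
        # Drop flags that make no sense for a wasm target (example only).
        if arg == "-march=native":
            continue
        fixed.append(arg)
    return fixed


if __name__ == "__main__":
    # Run the fixed command immediately instead of recording it for a
    # second replay pass.
    sys.exit(subprocess.call(rewrite(sys.argv[1:])))
```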
I also added a patch to fix the numpy feature detection for wasm so that we don't have to include `_npyconfig.h` and `config.h`; numpy can generate them the way it would for a native build. I opened a numpy PR that would fix the detection for us upstream:
numpy/numpy#21154
This clears the way for us to switch to using pypa/build (as @henryiii has suggested) by removing our dependence on specific setuptools behavior.
This is on top of #2238.