
[WIP] dymamic crossenv + python3*-wheels + python310-313 updates #6282

Open. Wants to merge 114 commits into base: master.

Conversation

@th0ma7 (Contributor) commented Oct 14, 2024

Description

Intent is to:

  1. Remove wheel build testing from python310/python311 base Makefiles to ease updates
  2. Add python312 and python313 packages
  3. Dissociate crossenv creation from python3*/Makefile -> Now uses spksrc.crossenv.mk
$ ls -la ~/spksrc/mk/crossenv/requirements-default.txt
$ make crossenvclean
$ make crossenv-<arch>-<version>
  4. Allow wheel-specific crossenv enablement using a mk/crossenv/ directory containing wheel crossenv definitions:
$ ls -la ~/spksrc/mk/crossenv/requirements-aiohttp-3.8.5.txt
$ make crossenvclean
$ WHEEL="aiohttp-3.8.5" make crossenv-x64-7.1
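The lookup order described above (a wheel-plus-version file first, then a name-only fallback, then the default) can be sketched in Python. This is an illustrative helper, not the framework's actual Makefile logic, and the "drop the trailing version" fallback rule is an assumption:

```python
from pathlib import Path

def pick_crossenv_requirements(crossenv_dir, wheel=None):
    """Pick a crossenv requirements file, mimicking the described fallback:
    requirements-<wheel>-<version>.txt, then requirements-<wheel>.txt,
    then requirements-default.txt (hypothetical helper)."""
    d = Path(crossenv_dir)
    candidates = []
    if wheel:  # e.g. WHEEL="aiohttp-3.8.5"
        candidates.append(d / f"requirements-{wheel}.txt")
        name = wheel.rsplit("-", 1)[0]  # assumed: strip the trailing version
        candidates.append(d / f"requirements-{name}.txt")
    candidates.append(d / "requirements-default.txt")
    for candidate in candidates:
        if candidate.is_file():
            return candidate
    return None
```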

Fixes #6284

Checklist

  • Build rule all-supported completed successfully
  • New installation of package completed successfully
  • Package upgrade completed successfully (Manually install the package again)
  • Package functionality was tested
  • Any needed documentation is updated/created

Type of change

  • Bug fix
  • New Package
  • Package update
  • Includes small framework changes
  • This change requires a documentation update (e.g. Wiki)

TODO

  • Move usage of pip cache through $HOME/.cache/pip to use $(WORK_DIR)/pip
2024-11-14T02:37:23.1044954Z WARNING: The directory '/github/home/.cache/pip' or its parent directory is not owned or is not writable by the current user. The cache has been disabled. Check the permissions and owner of that directory. If executing pip with sudo, you should use sudo's -H flag.
  • Fix handling of requirements.txt entry with trailing comment test==1234 # This is a test wheel
  • Have crossenv wheels use $(WORK_DIR) caching directory instead of default $(HOME)/.cache/pip
  • Review OPENSSL_*_DIR variables and logic usage throughout python related mk files
  • Use status cookie for crossenv creation
  • Use status cookie for wheel building -->> LEFT FOR SUBSEQUENT PR
  • Have crossenv add status info to status-build.log file
  • Have spksrc.python-wheel.mk to use status cookie to avoid always rebuilding -->> LEFT FOR SUBSEQUENT PR
  • Avoid zlib creation + remove of symlinks from spksrc.python.mk to eliminate rebuilding
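For the trailing-comment TODO item above: pip's requirements-file rule is that a comment starts at a `#` preceded by whitespace (or at the start of the line). A minimal sketch of stripping it before processing an entry (the helper name is illustrative):

```python
import re

def strip_requirement_comment(line):
    """Drop a trailing comment from a requirements.txt entry, per pip's
    rule: a comment starts at '#' preceded by whitespace or line start.
    Hypothetical helper illustrating the TODO item."""
    return re.sub(r'(^|\s)#.*$', '', line).strip()

# e.g. "test==1234 # This is a test wheel" becomes "test==1234"
```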

cross/python313/Makefile: review comment (outdated, resolved)
@SynoCommunity SynoCommunity deleted a comment from hacscred Oct 14, 2024
@th0ma7 (Contributor, Author) commented Oct 14, 2024

@hgy59 I'll let this run, but this patch in particular 9c764a7 needs to be in its own PR... interesting that I hadn't noticed it earlier. I'll also mark this as [WIP] as it is not even close to being merged; this was early work pushed to initiate an exchange with the Python GitHub project about the build issue I had.

@th0ma7 th0ma7 changed the title python311:Dissociate python optional test wheels builds + python313 test drive [WIP] dissociate python optional test wheels builds + python313 Oct 14, 2024
@th0ma7 th0ma7 self-assigned this Oct 14, 2024
@SynoCommunity SynoCommunity deleted a comment from hacscred Oct 15, 2024
@th0ma7 (Contributor, Author) commented Oct 15, 2024

@hgy59 bind compilation tries to install packages?!? I'll try to find where that comes from, but in case it rings any bell...

===>  Patching for bind
===>  Configuring for bind
===>  - Configure ARGS: --host=x86_64-pc-linux-gnu --build=i686-pc-linux --prefix=/usr/local/bind --localstatedir=/usr/local/bind/var --enable-full-report BUILD_CC=cc BUILD_CFLAGS=-I/home/spksrc/python-wheels/spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/x86_64-pc-linux-gnu/sys-root/usr/include -I/home/spksrc/python-wheels/spksrc/cross/bind/work-x64-7.1/install/usr/local/bind/include -O2 -O2 BUILD_CPPFLAGS=-I/home/spksrc/python-wheels/spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/x86_64-pc-linux-gnu/sys-root/usr/include -I/home/spksrc/python-wheels/spksrc/cross/bind/work-x64-7.1/install/usr/local/bind/include -O2 -O2 BUILD_LDFLAGS=-L/home/spksrc/python-wheels/spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/x86_64-pc-linux-gnu/sys-root/lib -L/home/spksrc/python-wheels/spksrc/cross/bind/work-x64-7.1/install/usr/local/bind/lib -Wl,--rpath-link,/home/spksrc/python-wheels/spksrc/cross/bind/work-x64-7.1/install/usr/local/bind/lib -Wl,--rpath,/usr/local/bind/lib  BUILD_LIBS=
===>  - Install prefix: /usr/local/bind
===>  Install libtool binary
sudo apt-get update
[sudo] password for spksrc: 

@hgy59 (Contributor) commented Oct 15, 2024

@hgy59 bind compilation tries to install packages?!? I'll try to find where that comes from, but in case it rings any bell...

===>  Patching for bind
===>  Configuring for bind
===>  - Configure ARGS: --host=x86_64-pc-linux-gnu --build=i686-pc-linux --prefix=/usr/local/bind --localstatedir=/usr/local/bind/var --enable-full-report BUILD_CC=cc BUILD_CFLAGS=-I/home/spksrc/python-wheels/spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/x86_64-pc-linux-gnu/sys-root/usr/include -I/home/spksrc/python-wheels/spksrc/cross/bind/work-x64-7.1/install/usr/local/bind/include -O2 -O2 BUILD_CPPFLAGS=-I/home/spksrc/python-wheels/spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/x86_64-pc-linux-gnu/sys-root/usr/include -I/home/spksrc/python-wheels/spksrc/cross/bind/work-x64-7.1/install/usr/local/bind/include -O2 -O2 BUILD_LDFLAGS=-L/home/spksrc/python-wheels/spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/x86_64-pc-linux-gnu/sys-root/lib -L/home/spksrc/python-wheels/spksrc/cross/bind/work-x64-7.1/install/usr/local/bind/lib -Wl,--rpath-link,/home/spksrc/python-wheels/spksrc/cross/bind/work-x64-7.1/install/usr/local/bind/lib -Wl,--rpath,/usr/local/bind/lib  BUILD_LIBS=
===>  - Install prefix: /usr/local/bind
===>  Install libtool binary
sudo apt-get update
[sudo] password for spksrc: 

@th0ma7 I will remove this in #6269. It has been obsolete since the spksrc image moved to Debian 12.

@th0ma7 (Contributor, Author) commented Oct 16, 2024

@hgy59, @mreid-tt and other @SynoCommunity/developers I'd appreciate having some thoughts on this...

I have always kept, within spk/python3*, examples to build from and to test cross-compiled wheels. While that served its purpose, it also complexified the overall maintenance of the default python Makefiles. Hoping to simplify this, I first thought it would make sense to move all of that under a python-wheels or similar sub-package, a sort of placeholder to test wheel building when upgrading python / pip / setuptools et al. (which is roughly where this PR is at the moment).

While doing so, I thought: why not build them all and include the other remnants dispersed across other sub-SPKs...

  • While it may look neat on paper, this becomes a burden to maintain, as individual SPKs will evolve over time and this centralized copy won't serve its purpose any longer...
  • On the other hand, I do believe we need a central location providing clear and up-to-date examples, and a place to test new versions of complex wheels while upgrading python versions.

Then I recalled someone mentioning a while back: why not have our own wheel repository?

  • This could avoid maintaining cross-compiled wheels in individual packages and reduce package size, as wheels would then be downloaded from our wheel repository at install time.
  • This centralized python-wheels could potentially meet that purpose.
  • While neat on paper, I wonder what effect upgrading existing dependent cross/* wheels would have...

Thoughts on this would be much appreciated.

Also on my TODO (for which baby steps may be best):

  1. make a python 3.13 package available (in progress)
  2. fix CMake-based wheel building (requires some sort of autodetection to pass the proper parameters at build time)
  3. fix meson-based wheel building (similar issue to the CMake type)
  4. decouple crossenv from cross/python3* Makefiles
  5. allow either on-demand OR legacy+latest crossenv (for compatibility reasons)

@mreid-tt (Contributor) commented
@th0ma7, I’m not very experienced with Python builds, so I had to familiarize myself with the basics (e.g., using Real Python's guide on wheels). Here are some initial thoughts:

  1. Moving test code examples to "build from and test cross-compiled wheels" seems like a solid idea, aligning well with the purpose of the demoservice and demowebservice packages.
  2. Setting up and maintaining our own wheel repository appears to be a significant long-term commitment.

I might be missing some key points due to my limited background, so I'd appreciate more details. For instance, you mentioned integrating other "remnants" scattered across sub SPKs. What exactly are these remnants, and what benefits would centralizing them bring? Are they also related to test code?

Regarding the internal wheel repository, is it a matter of PyPI's offerings being insufficient? Are there commonly missing platform-specific wheels for Synology hardware? Would it be feasible to advocate for the Python Package Index to include the wheels we need, perhaps by reaching out to the relevant projects and requesting support?

I might not fully grasp the situation, but I’m keen to understand the challenges better and contribute more meaningful suggestions.

@th0ma7 (Contributor, Author) commented Oct 19, 2024

I’m not very experienced with Python builds, so I had to familiarize myself with the basics (e.g., using Real Python's guide on wheels). Here are some initial thoughts:

  1. Moving test code examples to "build from and test cross-compiled wheels" seems like a solid idea, aligning well with the purpose of the demoservice and demowebservice packages.

Indeed. Package could even be renamed similarly.

  1. Setting up and maintaining our own wheel repository appears to be a significant long-term commitment.

Not really. That would mostly be just a web page listing our wheels, which can be queried through JSON to list our pre-compiled wheel packages. From there, in the requirements.txt file provided with each package, we add a URL to fetch from. Then, when wheels get installed at package installation time, pip can download them from our source, similarly to using PyPI.
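As a sketch of what the install-time side could look like: pip already supports pointing a requirements.txt at an extra PEP 503 ("simple") index, or at a flat page of wheel links, so each package's requirements file would only need an extra header line. The URLs below are purely hypothetical:

```
# requirements.txt shipped with a package (hypothetical repository URLs)
--extra-index-url https://wheels.example.org/simple/
# or, for a flat listing page of .whl files:
# --find-links https://wheels.example.org/wheelhouse/
aiohttp==3.8.5
```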

I might be missing some key points due to my limited background, so I'd appreciate more details. For instance, you mentioned integrating other "remnants" scattered across sub SPKs. What exactly are these remnants, and what benefits would centralizing them bring? Are they also related to test code?

Regarding the internal wheel repository, is it a matter of PyPI's offerings being insufficient? Are there commonly missing platform-specific wheels for Synology hardware? Would it be feasible to advocate for the Python Package Index to include the wheels we need, perhaps by reaching out to the relevant projects and requesting support?

The issue is that PyPI provides tons of pre-compiled wheels, but ppc, armv5 (and often armv7 and even aarch64) are always missing. Therefore, at installation time on our NAS, pip will only find the source of the requested wheels and try to compile them locally... and fail miserably, as our Synology NAS does not provide a pre-installed build toolchain and we do not provide the include files of the various dependencies.

Therefore, when we build our SPK package using our spksrc framework, we pull source packages from PyPI using pip, then cross-compile them and generate a resulting *.whl file. This wheel file gets packaged within the SPK, so that at package installation time it is picked up by pip instead of pip failing to find a valid pre-built wheel online.

I might not fully grasp the situation, but I’m keen to understand the challenges better and contribute more meaningful suggestions.

The challenge is that the python wheel build system is currently undergoing a lot of changes, impacting the overall wheel-building approaches. As PyPI module documentation and maintenance varies from one project to the next, this ends up breaking things all over the place.

So the question is: would it be easier to have a single location to manage all wheel cross-compiling, providing numerous versions online to ease package management? I've been thinking about this a little more and am still unsure, especially as this would probably mean statically linking wheels so they are compatible with any installation, OR using rpath so dependencies are available from that python-wheels package, for instance.

@Safihre (Contributor) commented Oct 20, 2024

It makes sense to have our own repository, although it will be quite a lot of work to implement and will significantly add to the maintenance burden.
I saw there are applications available that enable this, for example https://github.com/pypiserver/pypiserver

@th0ma7 (Contributor, Author) commented Nov 14, 2024

@hgy59 I believe I may now have feature-wise functional code ready for testing... I'll be away next week (SC24), so cycles until my return may be limited. I would still very much like your opinion on this invasive but, in theory, fully backward-compatible code change.

TL;DR:

  1. Added the ability to create multiple crossenv on demand (see description above).
  2. In brief, when faced with legacy or not-so-well-supported wheels, you can now create a mk/crossenv/requirements-<wheel>-<version>.txt; it falls back to mk/crossenv/requirements-<wheel>.txt if such a matching crossenv configuration exists.
  3. The resulting wheel-specific crossenv will only be used for that specific wheel you're trying to compile (unless using the fallback mechanism, which may work for multiple versions).

A nice addition (from my perspective): the ability to re-generate a crossenv on demand (see description above for the how-to).

Besides the non-blocking TODO items, in theory it should be ready for testing, shaking out bugs and/or adjusting the proposed strategy. I'm mostly thinking of weird corner cases, such as the ones found in homeassistant, which I hope can be addressed with this (to be tested + wheel-specific crossenv configurations to be created).

Let me know if you have a moment to test-bed this, your pair of 👀 would be much appreciated.

Lastly: I have not forgotten your requirement to join ffmpeg + wheel cross-compiling. With the previous addition of spksrc.ffmpeg.mk, along with this code refactor, it shouldn't be too complex, in theory, to connect the dots and tweak the code to make this a reality.

EDIT: Looking at the github-action output, it seems there are still a few rough edges to iron out... on my todo list.

@hgy59 (Contributor) commented Nov 14, 2024

initial analysis:

When building python311 for the first time, the build-crossenv target fails because some variables are not set, i.e. they do not contain version info (I added those variables to the log):

make[3]: Leaving directory '/spksrc/cross/pip'
===>  crossenv wheel packages: pip==24.3.1, setuptools==75.4.0, wheel==0.45.0
===>  crossenv requirements file = /spksrc/mk/crossenv/requirements-default.txt
===>  PYTHON_PKG_VERS =
===>  PYTHON_PKG_NAME = python
===>  PYTHON_PKG_VERS_MAJOR_MINOR = .
mkdir -p /spksrc/spk/python311/work-x64-7.1/Python-/build/lib.linux-x86_64-.
cp -RL /spksrc/native/python/work-native/Python-/build/lib.linux-x86_64-. /spksrc/spk/python311/work-x64-7.1/Python-/build
cp: cannot stat '/spksrc/native/python/work-native/Python-/build/lib.linux-x86_64-.': No such file or directory
make[2]: *** [../../mk/spksrc.crossenv.mk:146: build-crossenv] Error 1
make[2]: Leaving directory '/spksrc/spk/python311'
make[1]: *** [../../mk/spksrc.supported.mk:74: build-arch-x64-7.1] Error 1
make[1]: Leaving directory '/spksrc/spk/python311'

When you call make again, the build-crossenv target is not rebuilt (this looks like a missing .-response file).

But after calling make crossenvclean, the variables are set (and the build finally succeeds):

make[3]: Leaving directory '/spksrc/cross/pip'
mkdir -p /spksrc/spk/python311/work-x64-7.1/crossenv-default/build
===>  crossenv wheel packages: pip==24.3.1, setuptools==75.4.0, wheel==0.45.0
===>  crossenv requirements file = /spksrc/mk/crossenv/requirements-default.txt
===>  PYTHON_PKG_VERS = 3.11.10
===>  PYTHON_PKG_NAME = python311
===>  PYTHON_PKG_VERS_MAJOR_MINOR = 3.11
mkdir -p /spksrc/spk/python311/work-x64-7.1/Python-3.11.10/build/lib.linux-x86_64-3.11
cp -RL /spksrc/native/python311/work-native/Python-3.11.10/build/lib.linux-x86_64-3.11 /spksrc/spk/python311/work-x64-7.1/Python-3.11.10/build
/spksrc/native/python311/work-native/install/usr/local/bin/python3 -m crossenv /spksrc/spk/python311/work-x64-7.1/install/var/packages/python311/target/bin/python3.11 --cc /spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/bin/x86_64-pc-linux-gnu-gcc --cxx /spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/bin/x86_64-pc-linux-gnu-c++ --ar /spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/bin/x86_64-pc-linux-gnu-ar --sysroot /spksrc/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/x86_64-pc-linux-gnu/sys-root --env LIBRARY_PATH= --manylinux manylinux2014 /spksrc/spk/python311/work-x64-7.1/crossenv-default
===>  Setting default crossenv /spksrc/spk/python311/work-x64-7.1/crossenv-default
===>  ln -sf crossenv-default /spksrc/spk/python311/work-x64-7.1/crossenv
...

@hgy59 (Contributor) commented Nov 14, 2024

Some more details on the analysis above.

All lines from the build log starting with ===> (i.e. @$(MSG)), without dependent packages:

	Line  8633: ===>  Extracting for python311
	Line  8635: ===>  Patching for python311
	Line  8644: ===>  Configuring for python311
	Line  8645: ===>  - Configure ARGS: --host=x86_64-pc-linux-gnu --build=i686-pc-linux --prefix=/var/packages/python311/target --localstatedir=/var/packages/python311/var --enable-shared --without-static-libpython --enable-ipv6 --without-ensurepip --enable-loadable-sqlite-extensions --with-computed-gotos=yes --with-build-python --disable-test-modules --with-lto ac_cv_buggy_getaddrinfo=no ac_cv_file__dev_ptmx=no ac_cv_file__dev_ptc=no ac_cv_have_long_long_format=yes --with-ssl-default-suites=openssl --with-dbmliborder=gdbm:ndbm:bdb --with-system-expat --with-system-ffi
	Line  8646: ===>  - Install prefix: /var/packages/python311/target
	Line  9394: ===>  Compiling for python311
	Line  9943: ===>  Installing for python311
	Line 13671: ===>  Correcting pkg-config file lib/pkgconfig/python-3.11-embed.pc
	Line 13672: ===>  Correcting pkg-config file lib/pkgconfig/python-3.11.pc
	Line 13673: ===>  Correcting pkg-config file lib/pkgconfig/python3-embed.pc
	Line 13674: ===>  Correcting pkg-config file lib/pkgconfig/python3.pc
	Line 13677: ===>  Downloading files for pip
	Line 13678: ===>    File pip-23.2.1.tar.gz already downloaded
	Line 13679: ===>  Verifying files for pip
	Line 13680: ===>    Checking sha1sum of file pip-23.2.1.tar.gz
	Line 13681: ===>    Checking sha256sum of file pip-23.2.1.tar.gz
	Line 13682: ===>    Checking md5sum of file pip-23.2.1.tar.gz
	Line 13684: ===>  Processing dependencies of pip
	Line 13685: ===>  Extracting for pip
	Line 13687: ===>  Patching for pip
	Line 13688: ===>  Configuring for pip
	Line 13689: ===>  - Configure ARGS:
	Line 13690: ===>  - Install prefix: /var/packages/python311/target
	Line 13696: ===>  crossenv wheel packages: pip==24.3.1, setuptools==75.4.0, wheel==0.45.0
	Line 13697: ===>  crossenv requirements file = /spksrc/mk/crossenv/requirements-default.txt
	Line 13698: ===>  PYTHON_PKG_VERS = 3.11.10
	Line 13699: ===>  PYTHON_PKG_NAME = python311
	Line 13700: ===>  PYTHON_PKG_VERS_MAJOR_MINOR = 3.11
	Line 13704: ===>  Setting default crossenv /spksrc/spk/python311/work-x64-7.1/crossenv-default
	Line 13705: ===>  ln -sf crossenv-default /spksrc/spk/python311/work-x64-7.1/crossenv
	Line 13707: ===>  /spksrc/spk/python311/work-x64-7.1/crossenv-default/bin/build-python /spksrc/spk/python311/work-x64-7.1/crossenv-default/build/get-pip.py pip==24.3.1 --no-setuptools --no-wheel --disable-pip-version-check
	Line 13717: ===>  /spksrc/spk/python311/work-x64-7.1/crossenv-default/bin/cross-python /spksrc/spk/python311/work-x64-7.1/crossenv-default/build/get-pip.py pip==24.3.1 --no-setuptools --no-wheel --disable-pip-version-check
	Line 13727: ===>  /spksrc/spk/python311/work-x64-7.1/crossenv-default/bin/build-pip --disable-pip-version-check install setuptools==75.4.0 wheel==0.45.0
	Line 13740: ===>  /spksrc/spk/python311/work-x64-7.1/crossenv-default/bin/cross-pip --disable-pip-version-check install setuptools==75.4.0 wheel==0.45.0
	Line 13753: ===>  [/spksrc/spk/python311/work-x64-7.1/crossenv-default] Processing /spksrc/mk/crossenv/requirements-default.txt
	Line 13754: ===>  /spksrc/spk/python311/work-x64-7.1/crossenv-default/bin/build-pip --disable-pip-version-check install -r /spksrc/mk/crossenv/requirements-default.txt
	Line 13922: ===>  /spksrc/spk/python311/work-x64-7.1/crossenv-default/bin/cross-pip --disable-pip-version-check install -r /spksrc/mk/crossenv/requirements-default.txt
	Line 14084: ===>  Package list for /spksrc/spk/python311/work-x64-7.1/crossenv-default:
	Line 14085: ===>  /spksrc/spk/python311/work-x64-7.1/crossenv-default/bin/cross-pip list
	Line 14147: ===>  Compiling for pip
	Line 14148: ===>  CROSSENV: /spksrc/spk/python311/work-x64-7.1/crossenv-default/bin/activate
	Line 14150: ===>  Installing for pip
	Line 15841: ===>  crossenv wheel packages: pip==24.3.1, setuptools==75.4.0, wheel==0.45.0
	Line 15842: ===>  crossenv requirements file = /spksrc/mk/crossenv/requirements-default.txt
	Line 15843: ===>  PYTHON_PKG_VERS =
	Line 15844: ===>  PYTHON_PKG_NAME = python
	Line 15845: ===>  PYTHON_PKG_VERS_MAJOR_MINOR = .

This shows that the crossenv-build target is used twice:

  • first for Configuring for pip
  • second for Installing for pip

The first has correct variables, but the second is missing the python version.

This might give you a hint...

@hgy59 (Contributor) commented Nov 14, 2024

@th0ma7 another idea I want to share:

When wheels couldn't be built without additional wheels in the crossenv (like expandvars to build frozenlist), I was looking in the pyproject.toml file of the source to find the build dependencies.
A much more elegant solution would be to automatically populate the crossenv with all build requirements taken from pyproject.toml whenever the package source has such a file.

@th0ma7 (Contributor, Author) commented Nov 14, 2024

@th0ma7 another idea I want to share:

When wheels couldn't be built without additional wheels in the crossenv (like expandvars to build frozenlist), I was looking in the pyproject.toml file of the source to find the build dependencies.
A much more elegant solution would be to automatically populate the crossenv with all build requirements taken from pyproject.toml whenever the package source has such a file.

That would be elegant indeed, and this PR could set the stage for gaining more flexibility with wheel building.

@th0ma7 (Contributor, Author) commented Nov 15, 2024

@hgy59 now fixed and ready for testing. I also migrated the code to use status-cookie handling like other pieces of the framework, which simplifies things a lot.

If by any chance you have a moment to look at py313 cross-compiling... they moved away from setup.py and I believe there may be some work to somehow port the 004-xcompile-paths.patch patch. Help would be welcomed. Note that from testing, this switch seems to have started with py312...

@hgy59 (Contributor) commented Nov 15, 2024

Not testing yet...
The current build fails for DSM 6.2.4.

It fails to install requirements-default.txt because it installs not only the listed modules but additional dependencies as well.
See below for the list of all additional dependencies that are installed into the crossenv.

For DSM 6.2.4 it fails to install msgpack==1.1.0 because it would need additional cflags (use option -std=c99, -std=gnu99, -std=c11 or -std=gnu11 to compile your code).

We must either

  • use --no-deps to install requirements-default.txt
  • add msgpack==1.0.5 to requirements-default.txt
  • install each module in requirements-default.txt with a dedicated call of build-pip/cross-pip with the handling of additional defined flags (as we do for package wheels)
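The third option, one dedicated build-pip/cross-pip call per module with module-specific flags, could look roughly like this. The EXTRA_CFLAGS mapping and helper names are illustrative, not framework code:

```python
import os
import subprocess

# Illustrative per-module flag table (msgpack needing -std=c99 on old gcc)
EXTRA_CFLAGS = {"msgpack": "-std=c99"}

def env_for(requirement, base=None):
    """Environment for one pip call, appending module-specific CFLAGS."""
    name = requirement.split("==")[0]
    env = dict(os.environ if base is None else base)
    if name in EXTRA_CFLAGS:
        env["CFLAGS"] = (env.get("CFLAGS", "") + " " + EXTRA_CFLAGS[name]).strip()
    return env

def install_each(pip, requirements):
    # One dedicated call per module; --no-deps keeps the ordering explicit
    for req in requirements:
        subprocess.run([pip, "install", "--no-deps", req],
                       env=env_for(req), check=True)
```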

List of additional modules installed into crossenv (those are not listed in requirements-default.txt)

build==1.2.2.post1
CacheControl==0.14.1
certifi==2024.8.30
charset-normalizer==3.4.0
cleo==2.1.0
click==8.1.7
crashtest==0.4.1
distlib==0.3.9
distro==1.9.0
docutils==0.21.2
dulwich==0.21.7
fastjsonschema==2.20.0
filelock==3.16.1
flit_core==3.10.1
idna==3.10
installer==0.7.0
jaraco.classes==3.4.0
jeepney==0.8.0
keyring==24.3.1
more-itertools==10.5.0
msgpack==1.1.0
packaging==24.2
pexpect==4.9.0
pkginfo==1.11.2
platformdirs==4.3.6
poetry-core==1.9.1
poetry-plugin-export==1.8.0
ptyprocess==0.7.0
pycparser==2.22
pyproject_hooks==1.2.0
RapidFuzz==3.10.1
requests==2.32.3
requests-toolbelt==1.0.0
SecretStorage==3.3.3
semantic-version==2.10.0
shellingham==1.5.4
tomli_w==1.1.0
tomlkit==0.13.2
trove-classifiers==2024.10.21.16
urllib3==2.2.3
virtualenv==20.27.1

@th0ma7 (Contributor, Author) commented Nov 15, 2024

@hgy59 a question of managing expectations... This PR really is still a [WIP], and my main focus until now has been on keeping python 3.10-3.11 working while enhancing the crossenv functionality to have a per-wheel crossenv as needed. Python 3.13 isn't yet fully getting my attention until I can confirm the new crossenv functionality allows reproducing all python-based packages.

I was hoping that at this stage it could be tested to confirm whether this suffices to resolve the immediate build failures we have and allows getting reproducibility back for some of our packages (homeassistant is a good candidate).

Not testing yet... the current build fails for DSM 6.2.4.

It fails to install requirements-default.txt because it installs not only the listed modules but additional dependencies as well; see below for the list of all additional dependencies that are installed into the crossenv.

Interestingly, it's only python 3.13 that fails, and I'm glad it only fails on this, as I had other issues previously which now look solved, at least through github-action.

For DSM6.2.4 it fails to install msgpack==1.1.0 because it would need additional cflags (use option -std=c99, -std=gnu99, -std=c11 or -std=gnu11 to compile your code)

We must either:

  • use --no-deps to install requirements-default.txt

The crossenv requires all dependencies to be installed, as otherwise it may fail when cross-compiling wheels later on: the build environment tools cannot provide their functionality without their needed dependencies. Also note, this is the exact same behavior as currently.

  • add msgpack==1.0.5 to requirements-default.txt

Currently testing that to see if it helps... Also, on my local branch, cross/python313 currently fails with the following, but builds fine using spk/python313 up to the crossenv:

checking for --with-build-python... configure: error: invalid or missing build python binary "python3.13"

I'll have another look at it upon my return (feel free to push fixes if you happen to have cycles).

  • install each module in requirements-default.txt with a dedicated call of build-pip/cross-pip with the handling of additional defined flags (as we do for package wheels)

I tried this... and it's really tricky, as the install ordering is important. On the other hand, it may ease passing -std=c99, -std=gnu99, -std=c11 or -std=gnu11; but even then, we'd need a place to hold such compiler- or platform-specific crossenv build configurations beyond the current mk/crossenv. An idea could be to use a directory mechanism similar to the patch/ directory, something worth investigating later on.

But you're right: while downgrading to msgpack==1.0.5 may solve the immediate issue, we will need extra build options if we intend to maintain backward compatibility with our older platforms (until gcc4 no longer works).

List of additional modules installed into crossenv (those are not listed in requirements-default.txt)

Yup, due to the dependency chain, as we need to provide all dependencies for cross-compiling to actually work. I find it handy to print the list of crossenv-installed wheels to track exactly what build environment is in use when cross-compiling.

Lastly, my next step is to review the wheel-building code to use status cookies and be closer to the rest of the framework code. I want to split it as follows for a start:

  • spksrc.wheel-download.mk
  • spksrc.wheel-compile.mk

It will then become much easier to maintain and add extra functionality as needed, such as meson and cmake toolchain-file support, and potentially automating the crossenv based on wheel-internal information like you suggested.

@th0ma7 (Contributor, Author) commented Nov 16, 2024

The QorIQ failure may be related to python/cpython#125269.

@th0ma7 (Contributor, Author) commented Dec 3, 2024

@hgy59 lzma is hopefully fixed now. Somehow it wasn't able to find that library when symlinked into the actual spk/python31*. Some other packages had similar issues with zlib, which is already excluded.

@th0ma7 th0ma7 changed the title [WIP] dymamic crossenv + python3*-wheels + python312-313 [WIP] dymamic crossenv + python3*-wheels + python310-313 updates Dec 3, 2024
@hgy59 (Contributor) commented Dec 4, 2024

@th0ma7 I am trying to create wheels for newer homeassistant with python312.

To create pydantic_core==2.27.1, I created a custom crossenv with cython<3, maturin and typing_extensions.

Unfortunately, maturin does not see the installed typing_extensions:

Successfully installed Cython-0.29.37 build-1.2.2 click-8.1.7 maturin-1.7.4 packaging-24.2 pip-23.2.1 pip-tools-7.4.1 pyproject-hooks-1.2.0 typing_extensions-4.12.2
===>  Package list for /spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core:
Package           Version
----------------- -------
build             1.2.2
click             8.1.7
Cython            0.29.37
maturin           1.7.4
packaging         24.2
pip               23.2.1
pip-tools         7.4.1
pyproject_hooks   1.2.0
setuptools        75.4.0
typing_extensions 4.12.2
wheel             0.45.0
make[4]: Leaving directory '/spksrc/spk/homeassistant.2024'
make[3]: Leaving directory '/spksrc/spk/homeassistant.2024'
===>  WHEEL: activate crossenv found: /spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core/bin/activate
===>  Python crossenv found: [/spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core/bin/activate]
===>  pip crossenv found: [/spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core/cross/bin/pip]
===>  Cross-compiling [pydantic_core==2.27.1] wheel using /spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core
===>  [pydantic_core]
make[3]: Entering directory '/spksrc/spk/homeassistant.2024'
===>  _PYTHON_HOST_PLATFORM=x86_64-pc-linux-gnu MESON_CROSS_FILE=/spksrc/spk/homeassistant.2024/work-x64-7.1/tc_vars.meson /spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core/cross/bin/pip wheel --disable-pip-version-check --no-binary :all: --find-links /spksrc/spk/homeassistant.2024/../../distrib/pip --cache-dir /spksrc/spk/homeassistant.2024/work-x64-7.1/pip --no-deps --wheel-dir /spksrc/spk/homeassistant.2024/work-x64-7.1/wheelhouse --no-index --no-build-isolation pydantic_core==2.27.1
Looking in links: /spksrc/spk/homeassistant.2024/../../distrib/pip
Processing /spksrc/distrib/pip/pydantic_core-2.27.1.tar.gz
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Building wheels for collected packages: pydantic_core
  Building wheel for pydantic_core (pyproject.toml): started
  Building wheel for pydantic_core (pyproject.toml): finished with status 'error'
  error: subprocess-exited-with-error

  × Building wheel for pydantic_core (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [137 lines of output]
      Running `maturin pep517 build-wheel -i /spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core/cross/bin/python --compatibility off`
      📦 Including license file "/tmp/pip-wheel-sv_5d8el/pydantic-core_41b1092a95f04f5a8fec6f7d5a4a4faf/LICENSE"
      🍹 Building a mixed python/rust project
      🔗 Found pyo3 bindings
      🐍 Found CPython 3.12 at /spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core/cross/bin/python
      📡 Using build options features, bindings from pyproject.toml
         Compiling proc-macro2 v1.0.86
         Compiling unicode-ident v1.0.12
         Compiling target-lexicon v0.12.14
         Compiling python3-dll-a v0.2.10
         Compiling once_cell v1.19.0
         Compiling autocfg v1.3.0
         Compiling stable_deref_trait v1.2.0
         Compiling libc v0.2.155
         Compiling heck v0.5.0
         Compiling pyo3-build-config v0.22.6
         Compiling num-traits v0.2.19
         Compiling quote v1.0.36
         Compiling litemap v0.7.3
         Compiling syn v2.0.82
         Compiling version_check v0.9.5
         Compiling writeable v0.5.5
         Compiling rustversion v1.0.17
         Compiling memoffset v0.9.1
         Compiling radium v0.7.0
         Compiling pyo3-macros-backend v0.22.6
         Compiling pyo3-ffi v0.22.6
         Compiling cfg-if v1.0.0
         Compiling memchr v2.7.4
         Compiling static_assertions v1.1.0
         Compiling tinyvec_macros v0.1.1
         Compiling icu_locid_transform_data v1.5.0
         Compiling tinyvec v1.6.1
         Compiling lexical-util v0.8.5
         Compiling num-integer v0.1.46
         Compiling pyo3 v0.22.6
         Compiling ahash v0.8.11
         Compiling tap v1.0.1
         Compiling icu_properties_data v1.5.0
         Compiling smallvec v1.13.2
         Compiling serde v1.0.214
         Compiling wyz v0.5.1
         Compiling synstructure v0.13.1
         Compiling unicode-normalization v0.1.23
         Compiling lexical-parse-integer v0.8.6
         Compiling num-bigint v0.4.6
         Compiling aho-corasick v1.1.3
         Compiling zerofrom-derive v0.1.4
         Compiling yoke-derive v0.7.4
         Compiling zerofrom v0.1.4
         Compiling zerovec-derive v0.10.3
         Compiling displaydoc v0.2.5
         Compiling icu_provider_macros v1.5.0
         Compiling yoke v0.7.4
         Compiling pyo3-macros v0.22.6
         Compiling serde_derive v1.0.214
         Compiling zerovec v0.10.4
         Compiling strum_macros v0.26.4
         Compiling tinystr v0.7.6
         Compiling icu_locid v1.5.0
         Compiling icu_collections v1.5.0
         Compiling icu_provider v1.5.0
         Compiling getrandom v0.2.15
         Compiling jiter v0.7.1
         Compiling icu_locid_transform v1.5.0
         Compiling utf16_iter v1.0.5
         Compiling icu_normalizer_data v1.5.0
         Compiling unicode-bidi v0.3.15
         Compiling hashbrown v0.14.5
         Compiling icu_properties v1.5.1
         Compiling funty v2.0.0
         Compiling equivalent v1.0.1
         Compiling write16 v1.0.0
         Compiling percent-encoding v2.3.1
         Compiling serde_json v1.0.132
         Compiling zerocopy v0.7.34
         Compiling indoc v2.0.5
         Compiling utf8_iter v1.0.4
         Compiling regex-syntax v0.8.5
         Compiling unindent v0.2.3
         Compiling bitvec v1.0.1
         Compiling icu_normalizer v1.5.0
         Compiling regex-automata v0.4.8
         Compiling form_urlencoded v1.2.1
         Compiling indexmap v2.2.6
         Compiling idna v0.5.0
         Compiling strum v0.26.3
         Compiling lexical-parse-float v0.8.5
         Compiling pydantic-core v2.27.1 (/tmp/pip-wheel-sv_5d8el/pydantic-core_41b1092a95f04f5a8fec6f7d5a4a4faf)
         Compiling ryu v1.0.18
         Compiling itoa v1.0.11
      error: failed to run custom build command for `pydantic-core v2.27.1 (/tmp/pip-wheel-sv_5d8el/pydantic-core_41b1092a95f04f5a8fec6f7d5a4a4faf)`

      Caused by:
        process didn't exit successfully: `/tmp/pip-wheel-sv_5d8el/pydantic-core_41b1092a95f04f5a8fec6f7d5a4a4faf/target/release/build/pydantic-core-bea6f20c00965740/build-script-build` (exit status: 101)
        --- stdout
        cargo:rustc-check-cfg=cfg(Py_LIMITED_API)
        cargo:rustc-check-cfg=cfg(PyPy)
        cargo:rustc-check-cfg=cfg(GraalPy)
        cargo:rustc-check-cfg=cfg(py_sys_config, values("Py_DEBUG", "Py_REF_DEBUG", "Py_TRACE_REFS", "COUNT_ALLOCS"))
        cargo:rustc-check-cfg=cfg(invalid_from_utf8_lint)
        cargo:rustc-check-cfg=cfg(pyo3_disable_reference_pool)
        cargo:rustc-check-cfg=cfg(pyo3_leak_on_drop_without_reference_pool)
        cargo:rustc-check-cfg=cfg(diagnostic_namespace)
        cargo:rustc-check-cfg=cfg(c_str_lit)
        cargo:rustc-check-cfg=cfg(Py_3_7)
        cargo:rustc-check-cfg=cfg(Py_3_8)
        cargo:rustc-check-cfg=cfg(Py_3_9)
        cargo:rustc-check-cfg=cfg(Py_3_10)
        cargo:rustc-check-cfg=cfg(Py_3_11)
        cargo:rustc-check-cfg=cfg(Py_3_12)
        cargo:rustc-check-cfg=cfg(Py_3_13)
        cargo:rustc-cfg=Py_3_6
        cargo:rustc-cfg=Py_3_7
        cargo:rustc-cfg=Py_3_8
        cargo:rustc-cfg=Py_3_9
        cargo:rustc-cfg=Py_3_10
        cargo:rustc-cfg=Py_3_11
        cargo:rustc-cfg=Py_3_12
        cargo:rustc-check-cfg=cfg(has_coverage_attribute)
        cargo:rustc-check-cfg=cfg(specified_profile_use)
        cargo:rerun-if-changed=python/pydantic_core/core_schema.py
        cargo:rerun-if-changed=generate_self_schema.py

        --- stderr
        Traceback (most recent call last):
          File "/tmp/pip-wheel-sv_5d8el/pydantic-core_41b1092a95f04f5a8fec6f7d5a4a4faf/generate_self_schema.py", line 19, in <module>
            from typing_extensions import TypedDict, get_args, get_origin, is_typeddict
        ModuleNotFoundError: No module named 'typing_extensions'
        thread 'main' panicked at build.rs:29:9:
        generate_self_schema.py failed with exit status: 1
        note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
      warning: build failed, waiting for other jobs to finish...
      💥 maturin failed
        Caused by: Failed to build a native library through cargo
        Caused by: Cargo build finished with "exit status: 101": `env -u CARGO PYO3_ENVIRONMENT_SIGNATURE="cpython-3.12-64bit" PYO3_PYTHON="/spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core/cross/bin/python" PYTHON_SYS_EXECUTABLE="/spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core/cross/bin/python" "cargo" "rustc" "--features" "pyo3/extension-module" "--target" "x86_64-unknown-linux-gnu" "--message-format" "json-render-diagnostics" "--manifest-path" "/tmp/pip-wheel-sv_5d8el/pydantic-core_41b1092a95f04f5a8fec6f7d5a4a4faf/Cargo.toml" "--release" "--lib" "--crate-type" "cdylib"`
      Error: command ['maturin', 'pep517', 'build-wheel', '-i', '/spksrc/spk/homeassistant.2024/work-x64-7.1/crossenv-pydantic_core/cross/bin/python', '--compatibility', 'off'] returned non-zero exit status 1
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for pydantic_core
Failed to build pydantic_core
ERROR: Failed to build one or more wheels
make[3]: *** [../../mk/spksrc.wheel.mk:221: cross-compile-wheel-pydantic_core] Error 1
make[3]: Leaving directory '/spksrc/spk/homeassistant.2024'
make[2]: *** [../../mk/spksrc.wheel.mk:156: build_wheel_target] Error 1
make[2]: Leaving directory '/spksrc/spk/homeassistant.2024'
make[1]: *** [../../mk/spksrc.supported.mk:74: build-arch-x64-7.1] Error 1
make[1]: Leaving directory '/spksrc/spk/homeassistant.2024'

Is there something missing?

@th0ma7
Contributor Author

th0ma7 commented Dec 4, 2024

Oh, that is really tricky... It actually uses maturin from the crossenv for its dependencies when building the wheels... but maturin per se is really being run from the native Python build.

I never found a way to make maturin available as part of the crossenv, similarly to cross-pip. Admittedly it worked, so I left it as is...

@th0ma7
Contributor Author

th0ma7 commented Dec 4, 2024

@hgy59 this got me thinking... As a reminder, I'm working from prior art and don't have full knowledge of this, but I think I found the obvious culprit: something that got ported from one Python migration to the next in one form or another, and now lives only in spksrc.crossenv.mk:

# Required so native python and maturin binaries can always be found
export PATH := $(abspath $(WORK_DIR)/../../../native/$(PYTHON_PKG_NAME)/work-native/install/usr/local/bin):$(PATH)
export LD_LIBRARY_PATH := $(abspath $(WORK_DIR)/../../../native/$(PYTHON_PKG_NAME)/work-native/install/usr/local/lib):$(LD_LIBRARY_PATH)

I believe this needs to be per-crossenv specific, referring to PATH := $(WORK_DIR)/crossenv-<wheel>/build/bin:$(PATH). Maybe LD_LIBRARY_PATH needs to stay tied to native; unsure.

Because, if you look at the crossenv:

  • The build subdirectory under the crossenv contains the "target"-specific binaries, i.e. the binaries that are run on the build machine at compile time. Important: this is constructed against our native Python.
  • The cross subdirectory contains the "host" binaries, i.e. the environment the resulting compiled binaries will run on.

The build directory does contain maturin with all of its dependencies... so my theory is that the tricky part I referred to earlier should be redirected to use the build directory within said crossenv...

I doubt I'll have time to confirm my theory before the weekend, but I thought I'd let you know in case you do have a few spare cycles.
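The PATH-precedence theory above can be sketched with throwaway stubs. This is a minimal illustration only: the directory layout and the fake maturin scripts below are stand-ins, not the real spksrc tree.

```shell
#!/bin/sh
# Create two fake "maturin" binaries mimicking the native install and the
# crossenv's build/bin directory (layout is illustrative, not real spksrc).
set -eu
tmp=$(mktemp -d)
mkdir -p "$tmp/native/bin" "$tmp/crossenv/build/bin"
printf '#!/bin/sh\necho native-maturin\n' > "$tmp/native/bin/maturin"
printf '#!/bin/sh\necho crossenv-maturin\n' > "$tmp/crossenv/build/bin/maturin"
chmod +x "$tmp/native/bin/maturin" "$tmp/crossenv/build/bin/maturin"

# Current behaviour: native bin dir comes first, so the native maturin wins.
# Fresh sub-shells avoid the shell's command-hash cache between lookups.
first=$(env PATH="$tmp/native/bin:$tmp/crossenv/build/bin:$PATH" sh -c 'maturin')

# Proposed: crossenv's build/bin comes first, so its maturin (built with the
# wheel's own build dependencies) is picked up instead.
second=$(env PATH="$tmp/crossenv/build/bin:$tmp/native/bin:$PATH" sh -c 'maturin')

echo "$first"   # native-maturin
echo "$second"  # crossenv-maturin
rm -rf "$tmp"
```

Whichever directory is prepended first wins the lookup, which is why moving the crossenv's build/bin ahead of the native prefix would change which maturin gets invoked.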

@th0ma7
Contributor Author

th0ma7 commented Dec 5, 2024

@hgy59 I was too curious and had a few spare cycles to test something out. I believe this should solve your issue.

@hgy59
Contributor

hgy59 commented Dec 6, 2024

@hgy59 I was too curious and had a few spare cycles to test something out. I believe this should solve your issue.

but it breaks the build of spk/python***

checking build system type... i686-pc-linux-gnu
checking host system type... x86_64-pc-linux-gnu
checking for --with-build-python... configure: error: invalid or missing build python binary "python3.12"

So the path is required to build Python.
It might be fixed by configuring with the full path via --with-build-python=...
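A minimal sketch of that idea, reusing the native install prefix quoted earlier in the thread (the variable names NATIVE_PYTHON and CONFIGURE_ARGS are illustrative, not necessarily the framework's actual hooks):

```makefile
# Hypothetical: point configure at the native interpreter by absolute path
# instead of relying on PATH. The prefix mirrors the exports quoted above.
NATIVE_PYTHON := $(abspath $(WORK_DIR)/../../../native/$(PYTHON_PKG_NAME)/work-native/install/usr/local/bin/python$(PYTHON_VERSION))
CONFIGURE_ARGS += --with-build-python=$(NATIVE_PYTHON)
```

Whether configure accepts this for every Python version would still need testing per package.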

hgy59 added 5 commits December 6, 2024 11:30
- --with-system-ffi is not supported (python312, python313)
- --with-system-ffi is "is ignored on this platform" (python311)
- python310 and python311 require the path to build python for "generate-posix-vars"
- add wheelhouse to package
- many wheels need -std=c11 for gcc4
- llfuse wheel fails to build (temp. excluded)
- remove ARMv5 specific handling
- update rpds_py==0.20.0 and remove requirements-crossenv-rpds-py.txt (not referenced)
- add specific crossenv for pydantic_core
@hgy59
Contributor

hgy59 commented Dec 6, 2024

@th0ma7 there are still some issues (even though I fixed some)

  1. The trick of configuring with the full path for --with-build-python did not work for python310 and python311, so I reverted the previous changes and added the path back.
    This fixes the build of python310 but not python311 (in my local environment).

    Even when I add the path and LD_LIBRARY_PATH in cross/python311/Makefile, the package cannot be built.
    It fails to build the pure-python wheels, since pip does not find the ssl libraries (or is built without ssl). (Seen in previous GitHub build logs too.)

    ===>  Adding existing src/requirements-pure.txt file as pure-python (discarding any cross-compiled)
    ===>  [SKIP] Cross-compiling wheels
    ===>  Building pure-python
    WARNING: pip is configured with locations that require TLS/SSL, however the ssl module in Python is not available.
    
  2. I had to temporarily remove the llfuse wheel from spk/python312-wheels/src/requirements-crossenv.txt since it fails with:

       /spksrc/toolchain/syno-armv7-6.2.4/work/arm-unknown-linux-gnueabi/bin/arm-unknown-linux-gnueabi-gcc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O3 -Wall -I/spksrc/toolchain/syno-armv7-6.2.4/work/arm-unknown-linux-gnueabi/arm-unknown-linux-gnueabi/sysroot/usr/include -D__ARM_PCS_VFP=1 -I/spksrc/spk/python312/work-armv7-6.2.4/install/var/packages/python312/target/include -D_LARGEFILE64_SOURCE -D_FILE_OFFSET_BITS=64 -L /spksrc/spk/python312/work-armv7-6.2.4/install/var/packages/python312/target/lib -I /spksrc/spk/python312/work-armv7-6.2.4/install/var/packages/python312/target/include -I/spksrc/toolchain/syno-armv7-6.2.4/work/arm-unknown-linux-gnueabi/arm-unknown-linux-gnueabi/sysroot/usr/include -D__ARM_PCS_VFP=1 -I/spksrc/spk/python312/work-armv7-6.2.4/install/var/packages/python312/target/include -D_LARGEFILE64_SOURCE -D_FILE_OFFSET_BITS=64 -L /spksrc/spk/python312/work-armv7-6.2.4/install/var/packages/python312/target/lib -I /spksrc/spk/python312/work-armv7-6.2.4/install/var/packages/python312/target/include -I/spksrc/toolchain/syno-armv7-6.2.4/work/arm-unknown-linux-gnueabi/arm-unknown-linux-gnueabi/sysroot/usr/include -D__ARM_PCS_VFP=1 -I/spksrc/spk/python312-wheels/work-armv7-6.2.4/install/var/packages/python312-wheels/target/include -I/spksrc/spk/python312-wheels/work-armv7-6.2.4/install/var/packages/python312-wheels/target/ -std=c11 -I/spksrc/spk/python312/work-armv7-6.2.4/install/var/packages/python312/target/include -I/spksrc/toolchain/syno-armv7-6.2.4/work/arm-unknown-linux-gnueabi/arm-unknown-linux-gnueabi/sysroot/usr/include -D__ARM_PCS_VFP=1 -I/spksrc/spk/python312-wheels/work-armv7-6.2.4/install/var/packages/python312-wheels/target/include -I/spksrc/spk/python312-wheels/work-armv7-6.2.4/install/var/packages/python312-wheels/target/ -I/spksrc/spk/python312/work-armv7-6.2.4/install/var/packages/python312/target/include -fPIC -I/spksrc/spk/python312-wheels/work-armv7-6.2.4/crossenv-default/cross/include 
-I/spksrc/spk/python312/work-armv7-6.2.4/install/var/packages/python312/target/include/python3.12 -c src/lock.c -o build/temp.linux-arm-cpython-312/src/lock.o -I/spksrc/spk/python312-wheels/work-armv7-6.2.4/install/var/packages/python312-wheels/target/include/fuse -D_FILE_OFFSET_BITS=64 -DFUSE_USE_VERSION=29 -Wall -Wextra -Wconversion -Wsign-compare -DLLFUSE_VERSION=\"1.5.0\" -Wno-unused-function -Wno-implicit-fallthrough -Wno-unused-parameter
       src/lock.c: In function ‘acquire’:
       src/lock.c:63:9: warning: implicit declaration of function ‘clock_gettime’ [-Wimplicit-function-declaration]
                ret = clock_gettime(CLOCK_REALTIME, &abstime);
                ^
       src/lock.c:63:29: error: ‘CLOCK_REALTIME’ undeclared (first use in this function)
                ret = clock_gettime(CLOCK_REALTIME, &abstime);
                                    ^
       src/lock.c:63:29: note: each undeclared identifier is reported only once for each function it appears in
       src/lock.c: At top level:
       cc1: warning: unrecognized command line option "-Wno-implicit-fallthrough"
       error: command '/spksrc/toolchain/syno-armv7-6.2.4/work/arm-unknown-linux-gnueabi/bin/arm-unknown-linux-gnueabi-gcc' failed with exit code 1
       [end of output]
    
      note: This error originates from a subprocess, and is likely not a problem with pip.
      ERROR: Failed building wheel for llfuse
    

    This looks like the toolchain headers are no longer found in the crossenv.
    Defining WHEELS_LDFLAGS += [llfuse] -lrt did not help (and llfuse's setup.py already adds this flag for Linux targets), so I guess the related toolchain header files are missing.
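One possible explanation for the CLOCK_REALTIME error: under a strict -std=c11, older glibc headers only declare clock_gettime()/CLOCK_REALTIME when a POSIX feature-test macro is set. A hedged sketch, assuming a per-wheel WHEELS_CFLAGS hook analogous to the WHEELS_LDFLAGS one mentioned above actually exists:

```makefile
# Assumption: WHEELS_CFLAGS is honoured per-wheel like WHEELS_LDFLAGS.
# _POSIX_C_SOURCE >= 199309L exposes clock_gettime()/CLOCK_REALTIME in
# <time.h> even under strict -std=c11; -std=gnu11 would also avoid the issue.
WHEELS_CFLAGS += [llfuse] -D_POSIX_C_SOURCE=199309L
```

This is only a theory to test; it would not help if the toolchain sysroot headers are genuinely missing from the crossenv include paths.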

@hgy59
Contributor

hgy59 commented Dec 6, 2024

@th0ma7 some remarks on the -std=c11 flag:

I removed the code that defined -std=c99 for gcc < 4.9.
This was only required for ARMv5 with gcc 4.6.4 (ARMv5 is no longer supported for py312),
and ARMv7L with gcc 4.8.3 works fine with -std=c11.

I already had these changes prepared in the py312 work on homeassistant, and only needed to verify ARMv7L, which is not supported by ha.

hgy59 added 3 commits December 6, 2024 22:32
…rossenv

- use python version specific crossenv
- downgrade cryptography in default crossenv for python < 3.12 to v41.0.3
@hgy59
Contributor

hgy59 commented Dec 6, 2024

@th0ma7 the latest changes have beaten me...

The build of spk/python*-wheels (and probably other python-module packages too) does not work anymore.
It looks like a regression in mk/spksrc.crossenv.mk that locks it up 😞

@th0ma7
Contributor Author

th0ma7 commented Dec 6, 2024

Not surprising, as this is really tricky work. Give me a few days so I can reconvene and look into that; this week was a busy one and spare cycles have been scarce at best.

@hgy59
Contributor

hgy59 commented Dec 7, 2024

Not surprising, as this is really tricky work. Give me a few days so I can reconvene and look into that; this week was a busy one and spare cycles have been scarce at best.

some thoughts:

  • we need python-version-specific crossenv definitions (cryptography must be 41.* for python < 312)
  • my introduction of python-version-specific crossenv breaks spksrc.crossenv.mk, due to recursive rules/targets(?)
  • not even make clean works for any package (all packages are affected, not only python-related ones)
  • I have no idea why the GitHub build action for python*-wheels completed; locally it doesn't work at all
  • we must avoid global python-related variables in spksrc.crossenv.mk (they must only be evaluated when python-related targets are in use and a python version is defined; if this is fixed correctly, we won't require the global SHELL = /bin/bash anymore)
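That last point could be sketched with a guard in the Makefile, so python/crossenv machinery is only evaluated when a Python version is actually defined. The variable name PYTHON_VERSION and the include path are illustrative assumptions, not necessarily the framework's real hooks:

```makefile
# Hypothetical guard: skip all crossenv evaluation (and its bash requirement)
# for non-python packages and for targets like `make clean`.
ifneq ($(strip $(PYTHON_VERSION)),)
include ../../mk/spksrc.crossenv.mk
endif
```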

@hgy59
Contributor

hgy59 commented Dec 7, 2024

deluge fails building the libtorrent wheel; the b2 tool from Boost's bootstrap is not found:

2024-12-07T00:26:56.6500593Z DEPRECATION: --build-option and --global-option are deprecated. pip 23.3 will enforce this behaviour change. A possible replacement is to use --config-settings. Discussion can be found at https://github.com/pypa/pip/issues/11859
2024-12-07T00:26:56.6502293Z WARNING: Implying --no-binary=:all: due to the presence of --build-option / --global-option.
2024-12-07T00:26:56.6557524Z Looking in links: /github/workspace/spk/deluge/../../distrib/pip
2024-12-07T00:26:56.6571327Z Collecting libtorrent==2.0.10
2024-12-07T00:26:56.6575887Z   Cloning https://github.com/arvidn/libtorrent.git (to revision v2.0.9) to /tmp/pip-wheel-pp9cixa4/libtorrent_01250bae339a4f0db882bd68d394dc8b
2024-12-07T00:26:56.6601229Z   Running command git clone --filter=blob:none --quiet https://github.com/arvidn/libtorrent.git /tmp/pip-wheel-pp9cixa4/libtorrent_01250bae339a4f0db882bd68d394dc8b
2024-12-07T00:26:59.7963017Z   Running command git checkout -q 4d0b6c7433f8aa42cfcc54f7923adca1d0015f72
2024-12-07T00:27:00.6831833Z   Resolved https://github.com/arvidn/libtorrent.git to commit 4d0b6c7433f8aa42cfcc54f7923adca1d0015f72
2024-12-07T00:27:00.6832578Z   Running command git submodule update --init --recursive -q
2024-12-07T00:27:02.2665304Z   Preparing metadata (setup.py): started
2024-12-07T00:27:02.5696220Z   Preparing metadata (setup.py): finished with status 'error'
2024-12-07T00:27:02.5731556Z   error: subprocess-exited-with-error
2024-12-07T00:27:02.5731980Z   
2024-12-07T00:27:02.5732684Z   × python setup.py egg_info did not run successfully.
2024-12-07T00:27:02.5733390Z   │ exit code: 1
2024-12-07T00:27:02.5733853Z   ╰─> [7 lines of output]
2024-12-07T00:27:02.5734277Z       running egg_info
2024-12-07T00:27:02.5734870Z       creating /tmp/pip-pip-egg-info-jvya6kjw/libtorrent.egg-info
2024-12-07T00:27:02.5735934Z       writing /tmp/pip-pip-egg-info-jvya6kjw/libtorrent.egg-info/PKG-INFO
2024-12-07T00:27:02.5736943Z       writing dependency_links to /tmp/pip-pip-egg-info-jvya6kjw/libtorrent.egg-info/dependency_links.txt
2024-12-07T00:27:02.5737727Z       writing top-level names to /tmp/pip-pip-egg-info-jvya6kjw/libtorrent.egg-info/top_level.txt
2024-12-07T00:27:02.5738581Z       writing manifest file '/tmp/pip-pip-egg-info-jvya6kjw/libtorrent.egg-info/SOURCES.txt'
2024-12-07T00:27:02.5739140Z       error: [Errno 2] No such file or directory: 'b2'
2024-12-07T00:27:02.5739475Z       [end of output]
2024-12-07T00:27:02.5739702Z   
2024-12-07T00:27:02.5740090Z   note: This error originates from a subprocess, and is likely not a problem with pip.
2024-12-07T00:27:02.5747089Z error: metadata-generation-failed
2024-12-07T00:27:02.5747565Z 
2024-12-07T00:27:02.5747893Z × Encountered error while generating package metadata.
2024-12-07T00:27:02.5748296Z ╰─> See above for output.
2024-12-07T00:27:02.5748455Z 
2024-12-07T00:27:02.5748650Z note: This is an issue with the package mentioned above, not pip.

@th0ma7
Contributor Author

th0ma7 commented Dec 15, 2024

@hgy59 I was overwhelmed with other priorities; I should be able to reconvene this week and will keep you posted. Cheers!

Successfully merging this pull request may close these issues.

Creation of a new spk fails to trigger build