PSA: Please, do not use %{?python_enable_dependency_generator} in Factory packages
Hello,

Could you please not use the `%{?python_enable_dependency_generator}` macro? We agreed in https://github.com/openSUSE/python-rpm-macros/pull/40 that pythonXdist() symbols would be generated only for compatibility with other distributions.

Unfortunately for us, this macro expects that the upstream metadata are correctly maintained. The sad reality is that they are not. In our experience, version numbers in particular are quite often just plain wrong (the author simply wrote down the versions of the packages he happened to have on his drive), and what's even worse, upstream quite often, instead of dealing with an issue, just adds '<4.0' and considers the issue resolved. That creates hundreds of mutually exclusive packages, and it is absolutely impossible to maintain when we are talking about thousands of packages at once and not enough manpower to keep it all together (d:l:p alone has 2087 packages, and there are 3138 packages with python in the name in Factory). We would have to patch hundreds of setup.py files and deal with the upstream pull requests. It is simply much easier to write the version numbers in our own metadata.

See https://build.opensuse.org/request/show/965773 for one of many such issues.

I will remove this macro from all our .spec files; please do not add it back.

Yes, I have been reminded that https://en.opensuse.org/openSUSE:Packaging_Python#Automatic_Runtime_Requirem... needs to be removed (or rewritten as a warning against using this macro).

Thank you,

Matěj

--
https://matej.ceplovi.cz/blog/, Jabber: mcepl@ceplovi.cz
GPG Finger: 3C76 A027 CA45 AD70 98B5 BC1D 7920 5802 880B C9D8

How many Bavarian Illuminati does it take to screw in a light bulb? Three: one to screw it in, and one to confuse the issue.
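[Editor's note: for readers unfamiliar with the macro under discussion, the following is a hypothetical sketch of how it is used and what the dependency generator produces. The package name and versions are illustrative, not taken from any real spec file.]

```spec
# Spec preamble sketch (hypothetical package):
Name:           python-example
Version:        1.0
# Opt into the automatic dependency generator (the macro this thread is about):
%{?python_enable_dependency_generator}

# With the generator active, rpm derives symbols like these from the
# upstream egg-info/dist-info metadata instead of hand-written Requires:
#   Provides: python3dist(example) = 1.0
#   Requires: python3dist(requests) >= 2.0
```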
Hi all,

On 30.03.22 at 23:14, Matěj Cepl wrote:
Hello,
Could you please not use the `%{?python_enable_dependency_generator}` macro? We agreed in https://github.com/openSUSE/python-rpm-macros/pull/40 that pythonXdist() symbols would be generated only for compatibility with other distributions.
I don't see any agreement to that effect in that discussion. There is just the notion that it is not that useful for us, so we won't use it. It doesn't help with the BuildRequirements for the obs resolver at all. And when you list the BuildRequirements manually, why not also list the runtime requirements and have a proper human review for them?
Unfortunately for us, this macro expects that the upstream metadata are correctly maintained. The sad reality is that they are not. In our experience, version numbers in particular are quite often just plain wrong (the author simply wrote down the versions of the packages he happened to have on his drive), and what's even worse, upstream quite often, instead of dealing with an issue, just adds '<4.0' and considers the issue resolved.
Not using the macro will not allow you to ignore errors in the metadata. With bad metadata, you will run into pkg_resources and importlib.metadata errors soon enough.

In the specific case you linked, the problem is a pattern I have seen dozens of times in the last couple of weeks: the rpm packager for subprocess-tee forgot to BuildRequire setuptools_scm and toml/tomli. These are usually specified in setup.py or pyproject.toml as the build requirement `setuptools_scm[toml]` and determine the version written into the installed metadata. Without them, the version becomes 0.0.0, and that is what the generator found. Not upstream's fault at all.

If you stop putting catch-all lines into the %files section and instead list a proper %{python_sitelib}/mymodule-%{version}*-info, you catch such errors during packaging. I encourage everyone to do it.
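[Editor's note: a minimal sketch of the %files pattern Ben describes, assuming a hypothetical module named `mymodule`; the exact `*-info` directory name varies between egg-info and dist-info layouts, hence the glob.]

```spec
%files %{python_files}
%license LICENSE
# Explicit entries instead of a catch-all glob. The versioned *-info
# entry makes the build fail if the installed metadata carries an
# unexpected version (e.g. 0.0.0 from a missing setuptools_scm):
%{python_sitelib}/mymodule
%{python_sitelib}/mymodule-%{version}*-info
```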
However, that creates hundreds of mutually exclusive packages, and it is absolutely impossible to maintain when we are talking about thousands of packages at once and not enough manpower to keep it all together (d:l:p alone has 2087 packages, and there are 3138 packages with python in the name in Factory). We would have to patch hundreds of setup.py files and deal with the upstream pull requests. It is simply much easier to write the version numbers in our own metadata.
You have to sync the rpm metadata with the egg/dist metadata, or you will run into the aforementioned pkg_resources/importlib errors again and again. See the many packages currently failing in Staging:M due to a docutils update that is not compatible with Sphinx. So you either have to unpin the upper bounds in the setup.* or requirements files, or work them into the rpm packages.
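[Editor's note: one common way to work such pins into the package is to loosen them in %prep. This is a sketch; the `docutils>=0.14,<0.18` pin shown is hypothetical and the sed pattern must match whatever the real setup.py contains.]

```spec
%prep
%autosetup -p1
# Loosen the hypothetical upper bound so the package accepts the
# docutils version shipped in Factory; keep the lower bound intact:
sed -i 's/docutils>=0.14,<0.18/docutils>=0.14/' setup.py
```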
See https://build.opensuse.org/request/show/965773 for one of many such issues.
I will remove this macro from all our .spec files; please do not add it back.
If you properly sync the metadata, and you must, there is no need to remove the generator. Although, as already said, the generator is not very useful within the context of the Open Build Service.

- Ben
I think y'all are missing the point. The generator is certainly very useful within the context of OBS, just as much as any other generator. It's very common for people to express only the packages required to build a Python package, which is not the same set as the packages required to run it. This generator ensures that you have all the correct runtime dependencies.

Moreover, if your metadata is bad, then *you should try to fix it*. Those errors mean that the package is *unusable* at runtime, especially if they use setuptools-generated binary wrappers, which check runtime dependencies before executing code.

Finally, the generator makes life considerably easier longer term when you combine it with dynamic build dependencies (which OBS has supported for a while now). For example, Fedora's pyproject-rpm-macros[1] leverages the generator so that it doesn't have to figure out package names for build dependencies as it probes and cycles through them to build the Python package *using the metadata*.

[1]: https://src.fedoraproject.org/rpms/pyproject-rpm-macros
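[Editor's note: the dynamic build dependency mechanism Neal refers to looks roughly like this in a spec file. This is a sketch using Fedora's pyproject-rpm-macros as documented there; openSUSE would need its own port for the macro names to apply.]

```spec
# Dynamic build dependencies: rpmbuild evaluates this section first,
# installs whatever it prints as BuildRequires, and repeats the cycle
# until no new requirements appear.
%generate_buildrequires
%pyproject_buildrequires

%build
%pyproject_wheel

%install
%pyproject_install
```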
On 31.03.22 at 12:31, Neal Gompa wrote:
Moreover, if your metadata is bad, then *you should try to fix it*. Those errors mean that the package is *unusable* at runtime, especially if they use setuptools-generated binary wrappers, which check runtime dependencies before executing code.
Exactly my point.
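[Editor's note: the kind of startup check Neal mentions can be approximated with the standard library. This is a sketch of the idea, not the actual pkg_resources wrapper code; the distribution name is hypothetical.]

```python
from importlib import metadata


def installed_version(dist_name):
    """Return the installed version of a distribution, or None if absent.

    Roughly the kind of lookup a setuptools console-script wrapper
    performs (via pkg_resources) before running any package code.
    """
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None


# A dependency that rpm believes is present but whose dist-info/egg-info
# is missing or carries a bogus version fails this lookup immediately:
print(installed_version("no-such-distribution-xyz"))  # -> None
```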
Finally, the generator makes life considerably easier longer term when you combine it with dynamic build dependencies (which OBS has supported for a while now).
For example, Fedora's pyproject-rpm-macros[1] leverages the generator so that it doesn't have to figure out package names for build dependencies as it probes and cycles through them to build the Python package *using the metadata*.
[1]: https://src.fedoraproject.org/rpms/pyproject-rpm-macros
I would love to see this replace the ugly %{python_module} hack for the OBS server-side resolver in the project configs. How does one enable dynamic build dependencies in OBS? Moreover, where is the PR/SR for openSUSE's python-rpm-macros, or a port of pyproject-rpm-macros? I found https://github.com/rpm-software-management/rpm/pull/593 which references https://github.com/rpm-software-management/rpm/issues/104, but there is only the concern by @mlschroe that it will not work in OBS, and my google-fu fails me from here.

- Ben
On 31. 03. 22 at 1:12, Ben Greiner wrote:
I don't see any agreement to that effect in that discussion. There is just the notion that it is not that useful for us, so we won't use it.
It doesn't help with the BuildRequirements for the obs resolver at all. And when you list the BuildRequirements manually, why not also list the runtime requirements and have a proper human review for them?
100% agree.
Not using the macro will not allow you to ignore errors in the metadata. With bad metadata, you will run into pkg_resources and importlib.metadata errors soon enough.
You are right, and I stand corrected. We have to fix those upstream metadata anyway. Grr.
In the specific case you linked, the problem is a pattern I have seen dozens of times in the last couple of weeks: the rpm packager for subprocess-tee forgot to BuildRequire setuptools_scm and toml/tomli. These are usually specified in setup.py or pyproject.toml as the build requirement `setuptools_scm[toml]` and determine the version written into the installed metadata. Without them, the version becomes 0.0.0, and that is what the generator found. Not upstream's fault at all.
Not only setuptools_scm, but even setuptools_scm_git_archive! Crazy. However, yes, your analysis was correct, and yes, it is useful to check what the built package actually provides.
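[Editor's note: the build requirement being discussed is typically declared upstream like this; a standard setuptools_scm pyproject.toml stanza, with version bounds shown as illustrative examples.]

```toml
[build-system]
# setuptools_scm derives the package version from the git tag (or the
# archive metadata, via setuptools_scm_git_archive). If it is missing
# from the build environment, the installed metadata ends up as 0.0.0.
requires = ["setuptools>=45", "setuptools_scm[toml]>=6.2"]
build-backend = "setuptools.build_meta"
```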
If you stop putting catch-all lines into the %files section and instead list a proper %{python_sitelib}/%{module}-%{version}*-info, you catch such errors during packaging. I encourage everyone to do it.
YES! I think somebody should really allocate time to update py2pack, it is severely outdated.
You have to sync the rpm metadata with the egg/dist metadata, or you will run into the aforementioned pkg_resources/importlib errors again and again. See the many packages currently failing in Staging:M due to a docutils update that is not compatible with Sphinx. So you either have to unpin the upper bounds in the setup.* or requirements files, or work them into the rpm packages.
I stand corrected.
If you properly sync the metadata, and you must, there is no need to remove the generator. Although, as already said, the generator is not very useful within the context of the Open Build Service.
You are right; removing all those Requires: lines could make a spec file slightly shorter, so perhaps it is not pure cost. We can try to play with it.

Best,

Matěj

--
https://matej.ceplovi.cz/blog/, Jabber: mcepl@ceplovi.cz
GPG Finger: 3C76 A027 CA45 AD70 98B5 BC1D 7920 5802 880B C9D8

Quod fuimus, estis; quod sumus, vos eritis. (What we were, you are; what we are, you will be.)
participants (3)
- Ben Greiner
- Matěj Cepl
- Neal Gompa