On 11/8/20 7:50 PM, L A Walsh wrote:
On 2020/11/07 02:51, Simon Lees wrote:
On 11/7/20 2:40 PM, L A Walsh wrote:
TLDR: rpms built with rpm >= 4.15 can't be read, built, or installed by rpm < 4.15 (specifically, 4.11).
On 2020/10/29 03:55, Simon Lees wrote:
I'm not sure how much we claim to support building openSUSE rpms from source rpms on their own. openSUSE rpm spec files often depend on variables that are defined in project config files in the Open Build Service, so just using rpmbuild may result in packages not building, or building with different configurations.
But is *installing* "rpms" from tumbleweed, with "rpm" on a tumbleweed install, supported?
Well, we still support upgrading from Leap to Tumbleweed and upgrading older Tumbleweed installs, so if people update using the supported method of 'zypper dup', someone has put in the effort to make that work; I haven't looked at how they have done it.
In a supported, up-to-date system we provide all the tools required to view and install source rpms.
But not build?
Not without additional effort. There are a number of macros and definitions, for example here ( https://build.opensuse.org/projects/openSUSE:Factory/prjconf ), that you would need to teach rpm about if you were going to do a build outside the Open Build Service.
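For what it's worth, the project config can be fetched with osc and its macros section pulled out for a local build. This is only a rough sketch, not an official procedure: it assumes osc is installed, that the prjconf uses the usual "Macros:" / ":Macros" markers, and that your rpmbuild supports --load (rpm >= 4.12). The file names are placeholders.

```shell
# Sketch (unofficial): extract the rpm macros from an OBS project config
# so a local rpmbuild sees roughly the same definitions as OBS builds do.
# Assumes the usual "Macros:" ... ":Macros" markers in the prjconf.
extract_prjconf_macros() {
    # print only the lines between the markers, dropping the markers themselves
    sed -n '/^Macros:/,/^:Macros/{/^Macros:/d;/^:Macros/d;p}' "$1"
}

# Typical (untested here) usage against openSUSE:Factory:
#   osc meta prjconf openSUSE:Factory > prjconf.txt
#   extract_prjconf_macros prjconf.txt > factory.macros
#   rpmbuild --load=factory.macros -bb some-package.spec
```

That still won't cover prjconf directives that have no rpm-macro equivalent (Prefer:, Substitute:, and so on), so it narrows the gap rather than closing it.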
The point here is that my Linux server was off the internet for a couple of months. Before it went offline, I was able to install rpms from Tumbleweed or build them. After a few months offline, my system came back online, but without my local copies of the Tumbleweed repo. I downloaded a copy and found I wasn't able to install any binary packages due to the binary format changing. Simply trying to build from source required installing other binary (all 'devel') packages that were unreadable/inaccessible.

FWIW, the system was offline because the main disks became corrupted by a flakily repaired RAID card that eventually overheated due to the bad repair. I didn't know it was bad until I got a replacement and noticed that the older card probably wasn't 'new': it was missing the pressure screws that held a heat sink on a hot chip, and to keep it on, someone had used something like superglue -- not a great heat conductor, IMO. While the backup went offline as well and had some problems, I was able, with new enclosures and a new card, to restore the OS and most files from those backups.
The problem was keeping my system "up-to-date" when the primary disks were down -- obviously that was not possible. This was an up-to-date system that got out of sync as a transition was made to a new rpm format. One of the problems I am highlighting is the inability to resync a local installation after being forced offline for a few months. This is complicated by SUSE rpms that, it seems, can no longer be used to rebuild rpms locally, primarily because there is no stand-alone converter to turn the new binary rpms into a format readable by systems more than a few months old (more like 6 months, now) that only have the old rpm installed and don't have a matching set of binary rpms for what is installed. Those were on a disk that wasn't part of the backups, because I had just downloaded them and thought I'd be able to do so again.
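Part of the pain is that an old rpm fails with an unhelpful error rather than saying why a package is unreadable. As a diagnostic, the payload compressor can be identified without any rpm tooling at all, by walking past the rpm lead and the two header sections to the payload and looking at its magic bytes. This is a sketch based on my reading of the documented rpm file layout (96-byte lead, 8-byte-aligned signature header, then the main header), not a supported converter; 'pkg.rpm' is a placeholder name.

```shell
# Sketch: report the payload compressor of an rpm file using only od/awk.
# Layout assumed (per the rpm file format): 96-byte lead, then a signature
# header (padded to an 8-byte boundary), then the main header, then the
# compressed cpio payload. Each header is a 16-byte preamble + nindex*16
# bytes of index entries + hsize bytes of data.
payload_compressor() {
    f=$1 off=96
    for pad in 1 0; do     # 1 = signature header (padded), 0 = main header
        # index-entry count and data size: big-endian 32-bit at off+8, off+12
        nidx=$(od -An -tu1 -j $((off + 8))  -N4 "$f" |
               awk '{print $1*16777216 + $2*65536 + $3*256 + $4}')
        hsiz=$(od -An -tu1 -j $((off + 12)) -N4 "$f" |
               awk '{print $1*16777216 + $2*65536 + $3*256 + $4}')
        sz=$((16 + nidx * 16 + hsiz))
        [ "$pad" = 1 ] && sz=$(( (sz + 7) / 8 * 8 ))
        off=$((off + sz))
    done
    case $(od -An -tx1 -j "$off" -N4 "$f" | tr -d ' \n') in
        28b52ffd) echo zstd ;;     # needs a newer rpm to unpack
        fd377a58) echo xz ;;
        1f8b*)    echo gzip ;;     # readable by old rpm
        *)        echo unknown ;;
    esac
}
# e.g.  payload_compressor pkg.rpm   (prints zstd, xz, gzip, or unknown)
```

If it reports a compressor the local rpm doesn't understand, at least you know the package has to be repacked (or rpm upgraded) before anything else will work.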
...boils down to variations on "Reflections on Trusting Trust" (as from: http://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_ReflectionsonTrustin... ).
From our build service you can find and inspect all the build logs along with the source tarballs; beyond that, OBS has the ability to verify that upstream tarballs have been used, along with signature checking. The Reproducible Builds team has also done a lot of work in this area, so generally I'd say we are doing better than most distros.
That may be true -- I haven't examined enough others to say -- but that doesn't address the ability to build locally, or to examine which patches get applied to various rpms during the build process. Some of SUSE's rpms apply huge numbers of unsigned patches; for example, looking at openSUSE's rpm source for 4.15.1, there are 63 patches.
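The patch count is easy to check from the source rpm itself: unpack it and count the PatchN: tags in the spec. A rough sketch, assuming rpm2cpio and cpio are available and using placeholder file names:

```shell
# Sketch: count the patches a source rpm declares, by counting the
# "PatchN:" tags in its spec file. File names are placeholders.
count_spec_patches() {
    grep -Ec '^Patch[0-9]*:' "$1"
}

# Typical (untested here) usage:
#   rpm2cpio rpm-4.15.1-*.src.rpm | cpio -id   # unpacks rpm.spec + patches
#   count_spec_patches rpm.spec                # 63 claimed above for 4.15.1
```

Whether each of those patches is applied, and in what order, is still governed by the spec's %prep section, so this only tells you what is declared.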
Then there are "the spec files" that include dependencies and variables "that are [only] defined in project config files in the open build service". Some or most of those should be settable in a script as part of a non-OBS build phase. Otherwise, you get: "just using rpmbuild may result in packages not building or building with different configurations". It's the "not building" part that I'm hitting right now.
As for zypper -- I will try it again, but my last try had it trying to switch in over 200 packages just to upgrade 'rpm', and that was after I ignored some problems. Sigh.
I would expect that if rpm and its dependencies all came from the standard repo, and you had been updating your system with 'zypper dup' in the past, you should still be able to upgrade your system with 'zypper dup'; not being able to would be a bug worth reporting. Unfortunately, if you don't have a standard setup, there might not be much we can do to help.

--
Simon Lees (Simotek)                           http://simotek.net
Emergency Update Team                          keybase.io/simotek
SUSE Linux                          Adelaide Australia, UTC+10:30
GPG Fingerprint: 5B87 DB9D 88DC F606 E489 CEC5 0922 C246 02F0 014B