
On 2/17/21 9:35 PM, L A Walsh wrote:
On 2021/02/16 14:25, John Paul Adrian Glaubitz wrote:
I can take Windows programs from 10-20 years ago and they still work on a current-day Windows because the OS loads library updates by version.
That's actually not true. Microsoft fixed a lot of bugs in their ABI over time and dropped support for various older ABIs when they switched to 64-bit Windows or rewrote fundamental subsystems such as the graphics or sound stack, so a lot of older programs don't work anymore.
I have MANY programs on my machine from Win XP days and probably a few from before, and they still run. You are talking about programs that did their own thing with the underlying hardware -- many graphics programs didn't use the DirectX interfaces or OpenGL, but the ones that did still run. Problems with old programs that were tied to direct hardware access are not a software-library-versioning problem, so please don't confuse the issue.
No, I'm not. Your understanding of the inner workings of Windows is rather superficial. I recommend you talk to the WINE developers, as they will explain to you that the reality is much more complex. Try running a 16-bit application on 64-bit Windows, for example. That won't work because VM86 mode is not available in long mode on x86_64 systems.
I think there even used to be a WINE-on-Windows port to address this issue and allow older Windows applications to run on newer versions of Windows.
Win7 came with a Windows XP mode for the few programs that wouldn't work on newer versions -- mostly ones that did things using the 16-bit API that WinXP had supported since the Win95/98 days. I did say 10-20 years ago, not 25 years ago -- but those failures were because of changing hardware, not because new library versions were incompatible.
Which is essentially a virtual machine, hence you're proving my point.
Unix did the same -- but move to Linux, and vendors got lazy and stopped using the correct versions to link with. Why? And how can this be fixed?
Actually, you can run 30-year-old binaries on a current version of Linux as long as you provide the necessary shared libraries or the application is statically linked, since the Linux kernel guarantees never to break userspace code.
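As a minimal sketch of why static linking is enough (the program is purely illustrative, not taken from any particular package): a statically linked binary carries its own libc and talks to the kernel only through the syscall ABI, which is exactly the interface Linux promises to keep stable.

    /* build once: gcc -static -o hello-static hello-static.c
     * A statically linked binary depends only on the kernel's
     * syscall ABI, which Linux keeps stable across releases. */
    #define _GNU_SOURCE
    #include <unistd.h>
    #include <sys/syscall.h>

    int main(void)
    {
        const char msg[] = "still running on a modern kernel\n";
        /* write(2) through the raw syscall interface -- the boundary
         * the kernel guarantees not to break */
        syscall(SYS_write, 1, msg, sizeof msg - 1);
        return 0;
    }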
Actually, you are bullshitting us. There was no Linux 30 years ago this month, and v1.0 didn't come out until 1994. I'm pretty sure it wasn't binary compatible.
Version 1.0 wasn't the first release of Linux but the first one where X11 worked properly (if I remember correctly). See: https://en.wikipedia.org/wiki/Linux#Creation
Even if the Linux kernel doesn't break userspace code, glibc will refuse to run on older kernels. Old apps need to keep the old libs around.
Why would you want to run a recent version of glibc on an old kernel? That makes no sense. You want to run old application software in a recent Linux environment, and that will work perfectly fine because both the Linux kernel and glibc still provide the necessary ABIs for software that old.
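A small sketch of what "still provide the necessary ABIs" means in practice (this assumes an x86_64 glibc, where the oldest symbol version is GLIBC_2.2.5, and compiling with -fno-builtin so the call isn't inlined away): libc.so.6 ships multiple versioned copies of symbols that changed over the years, and the .symver directive lets a new build bind to the old one -- which is exactly how binaries built years ago still resolve against today's glibc.

    /* gcc -fno-builtin -o oldsym oldsym.c
     * Bind memcpy to the old GLIBC_2.2.5 version instead of the
     * current default (GLIBC_2.14 on x86_64).  glibc keeps both
     * versioned symbols in libc.so.6, so old binaries keep working. */
    #define _GNU_SOURCE
    #include <string.h>
    #include <stdio.h>

    __asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

    int main(void)
    {
        char dst[16];
        memcpy(dst, "old ABI", 8);   /* resolves to the 2.2.5 symbol */
        puts(dst);
        return 0;
    }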
But then you are saying what I said: users need to provide the libraries themselves -- it's not handled by the OS. That makes running older programs unviable, as package managers like rpm remove older versions of libraries when you upgrade, making it very difficult to support older programs that worked with older libs.
Microsoft doesn't provide libraries either. In fact, having been a professional developer for Windows, macOS and Linux desktop applications, I can tell you that collecting and distributing shared libraries has always been the biggest hassle on Windows. It's much easier on Linux and macOS, which have actually decent mechanisms for that through their package managers - albeit on macOS you have to install Homebrew or MacPorts first, but that's actually very easy. Microsoft has only recently started providing a package manager for Windows, and my last information was that it was still in early beta.
That said -- most Linux programs don't link to shared object libraries even using the major number. Some do, but when did you last see someone linking against glibc-1 vs. glibc-2? As for glibc compatibility, theoretically under semantic versioning glibc 2.1 should be compatible with glibc 2.33 -- so again, I ask: why are all the apps in TW being rebuilt with the new glibc instead of being replaced as those apps are updated?
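To make "linking with the major number" concrete, here is a small sketch (the choice of zlib is just an illustration and assumes libz.so.1 is installed, as it is on most Linux systems): the dynamic loader resolves the SONAME libz.so.1, so minor and patch updates of the library slot in without relinking, and only a major-version bump changes what the program asks for.

    /* gcc -o soname-demo soname-demo.c -ldl
     * The loader resolves the SONAME "libz.so.1" -- only the major
     * version is encoded, so any zlib 1.x satisfies the same request. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        void *h = dlopen("libz.so.1", RTLD_NOW);
        if (h == NULL) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }

        /* zlibVersion() reports which minor/patch release was loaded */
        const char *(*zlib_version)(void) =
            (const char *(*)(void))dlsym(h, "zlibVersion");
        if (zlib_version != NULL)
            printf("resolved libz.so.1 -> zlib %s\n", zlib_version());

        dlclose(h);
        return 0;
    }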
Because you want to make sure that all packages use the latest symbol versions. If you're not comfortable with that, then I'd actually recommend using Leap or a similar distribution with a longer release cycle. I mean, you're basically complaining that a two-seater race car is impractical for a family trip. openSUSE Tumbleweed is a rolling-release distribution; it is unstable by _definition_ (not in the sense that it crashes, but that the software changes often).

Adrian