Morning: I've seen a few messages go back and forth on this topic. Personally, I'm all in favor of static compiling. Yes, there are valid points raised about it making programs larger, etc. This is one of those "everyone is right" scenarios. In a case like this, I think it's simply the user's preference. If I have a large hard drive, lots of RAM, and I want to make things a little easier on myself, why not compile the source statically and be done with it? I can then take the binary to another Linux machine and not be bothered with what the other user has installed, because I've got a self-contained unit. By the same token, I can't say anyone who is against this belief is wrong either.

I saw a statement made by one user who said that programmers need to be more careful when writing code. This also has its merits, but I think it's a lot easier to work around the problem by statically linking what works than to retrain a horde of programmers who may write code a little differently depending on which library version they are linking against.

Just my .02 :)

- Mike

On Wed, 28 Jun 2000, Andrei Mircea wrote:
Date: Wed, 28 Jun 2000 07:44:43 +0200
From: Andrei Mircea <mircea.andrei@wanadoo.fr>
To: suse-linux-e@suse.com
Subject: Re: [SLE] back to static compiled applications ?
I succeeded in compiling the latest version of lyx (1.1.5) with static linking. Here is the size data:

dynamic linking: 1,528,800
static linking:  2,529,308

The hard disk did not explode, and I have a stable executable which I can use without having to cross my fingers at the next upgrade of my system. Is this correct?
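For anyone who wants to reproduce that kind of comparison on a small scale, here is a minimal sketch with a trivial C program (assuming gcc and glibc; the file and output names are made up, and the exact sizes will vary with the compiler and library versions on your system):

/* hello.c -- tiny test case for comparing dynamic vs. static binaries.
 *
 *   gcc -o hello-dyn hello.c           # dynamic: libc stays a shared library
 *   gcc -static -o hello-sta hello.c   # static: libc code is copied in
 *   ls -l hello-dyn hello-sta          # compare the two sizes
 *   ldd hello-dyn                      # list the shared libraries it needs
 */
#include <stdio.h>

int main(void)
{
    printf("hello, world\n");
    return 0;
}

The static binary comes out much larger because it carries its own copy of every libc routine it calls, which is exactly the effect Andrei measured with lyx.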
On Tue, Jun 27, 2000 at 07:58:47PM +0200, Philipp Thomas wrote:
* Andrei Mircea (mircea.andrei@wanadoo.fr) [20000627 01:59]:
Given the huge amounts of RAM available in today's computers, I wonder if it would not be more reasonable to go back to the old system of statically linked programs. Dynamic linking sure is a bright idea, but... what do other people on this list think?
That's a very bad idea, for a number of reasons.
- The needed disk space would explode. To see the difference, just compile your favourite app with -static added to the compiler options.
- Updating the library would force recompiling/updating *all* applications.
- Not every machine has huge amounts of RAM.
There are others but this should suffice.
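On the RAM point, the saving comes from page sharing: the code pages of a shared library are mapped into every process that uses it, so there is one copy in memory, whereas each static binary drags along its own private copy. A minimal, Linux-specific sketch (assuming /proc is mounted, as it normally is) that prints the process's own memory map so you can see the shared objects mapped in:

/* showmaps.c -- dump this process's memory map (Linux-specific).
 * When compiled dynamically, the output includes lines naming
 * libc.so: those pages are shared with every other process that
 * uses the same library. A static binary shows no such lines. */
#include <stdio.h>

int main(void)
{
    FILE *maps = fopen("/proc/self/maps", "r");
    int c;

    if (maps == NULL) {
        perror("fopen /proc/self/maps");
        return 1;
    }
    while ((c = fgetc(maps)) != EOF)    /* copy the map to stdout */
        putchar(c);
    fclose(maps);
    return 0;
}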
The reason for the problems with shared libraries is that most developers aren't cautious enough. What you need for such work is a defined environment for building binary packages. That's why our automatic package-building system uses a chroot environment in which all the packages necessary for a given distribution are installed. That way you get a defined set of tools and libraries that match the distribution for which a package is built.
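The core mechanism behind such a setup is just chroot(2): confine the build to a directory tree that contains exactly the distribution's packages and nothing else. A minimal sketch, not SuSE's actual build tool; the path /var/tmp/build-root is hypothetical, the tree must already be populated, and chroot(2) requires root privileges:

/* buildroot.c -- enter a prepared build root and run make there.
 * This sketch only does the confinement; populating the tree with
 * the distribution's tools and libraries is a separate step. */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void)
{
    const char *root = "/var/tmp/build-root";   /* hypothetical path */

    if (chroot(root) != 0) {    /* make the tree the new filesystem root */
        perror("chroot");
        return EXIT_FAILURE;
    }
    if (chdir("/") != 0) {      /* drop any handle on the old root */
        perror("chdir");
        return EXIT_FAILURE;
    }
    /* Everything exec'd from here on sees only what is inside the
     * build root -- a defined set of tools and libraries. */
    execl("/bin/sh", "sh", "-c", "make", (char *)NULL);
    perror("execl");            /* reached only if the exec failed */
    return EXIT_FAILURE;
}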
Philipp
-- Philipp Thomas <pthomas@suse.de> Development, SuSE GmbH, Schanzaecker Str. 10, D-90443 Nuremberg, Germany
#define NINODE 50 /* number of in core inodes */
#define NPROC  30 /* max number of processes */
-- Version 7 UNIX for PDP 11, /usr/include/sys/param.h
-- To unsubscribe send e-mail to suse-linux-e-unsubscribe@suse.com For additional commands send e-mail to suse-linux-e-help@suse.com Also check the FAQ at http://www.suse.com/Support/Doku/FAQ/