On Friday, 2007-03-02, at 17:24 -0800, David Brodbeck wrote:
> Carlos E. R. wrote:
>> The relation is that, with faster hardware, programmers don't have to trim their programs. They can let their programs be huge, repetitive, and non-optimized, because the hardware is faster and the disks are bigger, so the difference will hardly be noticed.
> That doesn't really correlate to stability, though. Non-optimized software will usually be slower than optimized software, but optimizing a piece of software does not make it more stable. In fact, the opposite is true: optimization often introduces subtle bugs of its own.
It depends on how the optimization is done. I'm thinking of not linking every library in sight, for instance, rather than of telling the compiler to optimize; of thinking ahead about what really needs to be done, of doing a real design job before writing a single line of code. Some programs are too big simply because a lot of unused library code is linked in. A hello world should compile to about 2K, but I have seen compilers produce 30 or 50K, because the linker is not clever enough to detect which functions are called and which aren't.
> You should never start optimizing any code that isn't stable. As Cort Dougan put it when he was working on porting Linux to PowerPC, "A fast kernel that crashes is just a kernel that crashes quickly."
:-)

- -- 
Cheers,
Carlos E. R.

-- 
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
For additional commands, e-mail: opensuse+help@opensuse.org