On Friday 15 April 2005 02:23, Colin Carter wrote:
Hi, I don't see much activity on this mailing list, so perhaps I am talking to myself. Perhaps I can stimulate a debate:
Programming standards have deteriorated significantly.
My argument: Some years ago academics developed design methods (Yourdon, Jackson, etc.) to stop young programmers writing spaghetti code (with lots of 'goto' statements). Then Wirth developed PASCAL to force young programmers to write properly structured code (the goto statement effectively banished).
Well, nobody could get a job if they hadn't done one of these 'design' courses and didn't know PASCAL.
Turns out that we old guys (especially FORTRAN scientific types) had been writing structured code for years, and we controlled our goto statements.
And PASCAL ended up keeping a goto after all (we grinned), because without it the code can become very inefficient (especially with the PUSH/POP overheads of extra procedure calls).
Now the latest is "Object Oriented" (shouldn't that actually be 'orientated'?) code with lots of overheads. I know: there's plenty of CPU power and plenty of memory now. So now we are forced to buy hundreds of MB of RAM because so many young programmers say "there's plenty of memory". Nothing runs in 4 MB any more - too much overhead.
How very odd. I just ranted on this general subject on another list. The problem with the software development world is that software developers assume their program is the beneficiary of all system resources. No matter what the technological advance, software advances as fast, or even faster, to consume and overtake it. Today, developers code for their own convenience without regard or respect for system resources. While hard drives, physical RAM and CPU caches have become faster and larger, code has bloated even faster, negating the performance increase.

From what I see of the entry-level people we hire and fire at work, schools aren't teaching programmers how computers work at even a fundamental level, so they have no real idea what "efficient" means. It is quite common to find some Object Oriented "expert" (that's someone with a degree and one job on their resume) who can't build a simple C char array. I've seen quite a few who have no clue at all that a C pointer maps directly onto how the CPU addresses memory, and is not merely a language syntax construct.
While scientists write A(1) = ... A(2) = ... A(3) = ... to cut out a nanosecond or two, the new generation is writing function calls (never mind for loops) to do the same thing.
Actually, I was reading recently in The Art of Unix Programming that unrolling loops used to be a good optimization, but with CPU caches now, it's better to keep the loop, since there is a better chance that the entire code in the loop fits in the cache.
Anyway, why use Adobe's reader - it takes five times as long to open a document as anything else I've seen, and the reader gives you no control - half the internet sites don't even let you copy from it. And they keep changing it so that everybody MUST upgrade. And anyway, although it is massive, it is only a cheap version of TeX.
So you young guys, start writing faster/smarter code and forget the fancy, 'full of options', 42,000-font code.
I still have an Amiga 3000 kicking around -- 25MHz 68030 and 16M RAM. (For that system 16M is HUGE -- especially back in 1990.) I'm amazed that in 90% of practical situations there is no appreciable difference between using the Amiga and using my 1.8GHz, 1G RAM linux system (or even "faster" Windows systems at work). The GUI is light and fast -- it even makes icewm feel bloated. The apps are a tiny fraction of the size of similar programs on the contemporary platforms and seem to load instantly. It still works as well as it does because it was designed by a bunch of uber-geeks who had to care that the 68000 address range was only 16M.