On Saturday 16 April 2005 00:44, Stefan Hundhammer wrote:
> On Friday 15 April 2005 14:43, Colin Carter wrote:
> > Yes, I have noticed how most modern programs waste time polling the myriad of open windows ...
> Urgh - folks, can we stop the urban legends at some point, please?
>
> Everybody who has a minimum clue of how any kind of GUI programming works should know that those programs spend most of their time blocked on a socket, waiting for user input - on X11 (no matter which toolkit is used - KDE, Gtk, OSF/Motif, Xt, ...) and on Win32. There is no "busy wait" in any such program I know of.
Sorry mate - no myth. While working in the UK I became very frustrated waiting for some not-so-large number-cruncher to execute. I noticed that if I closed or covered every window I could, the code ran faster. The young guy (an exceptional young programmer) revisited his code and 'disabled' every button on the screen except 'stop', and the execution time changed from over 10 minutes to less than one minute. You will also find that if your very simple window covers the desktop, the code runs faster. Well, in M$ anyway; I'm a Linux newbie.
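(For anyone following along at home, this is roughly what a bare-bones Xlib main loop looks like - just a sketch, the window size and event mask are arbitrary. XNextEvent() puts the process to sleep on the X server socket until an event arrives, so an idle program is not spinning on its own; how much work an application then does for each event is another matter. Build with something like: g++ xwait.cpp -o xwait -lX11)

/* Minimal Xlib sketch: the process sleeps inside XNextEvent() until the
 * X server delivers an event over the socket - no busy polling. */
#include <X11/Xlib.h>
#include <cstdio>

int main()
{
    Display *dpy = XOpenDisplay(NULL);              /* connect to the X server */
    if (!dpy)
        return 1;

    Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                     0, 0, 200, 100, 1,
                                     BlackPixel(dpy, DefaultScreen(dpy)),
                                     WhitePixel(dpy, DefaultScreen(dpy)));
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    for (;;)
    {
        XEvent ev;
        XNextEvent(dpy, &ev);                       /* blocks: ~0% CPU while idle */

        if (ev.type == Expose)
            std::printf("expose\n");
        else if (ev.type == KeyPress)
            break;                                  /* any key quits */
    }

    XCloseDisplay(dpy);
    return 0;
}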
> The other myths about object orientation etc. are similar: if you write non-trivial software, you simply NEED that kind of thing so you can have the level of abstraction you will want, so the software can be maintained at all - much more so over extended periods of time.

Not true: I have seen OO maintenance programmers really stuff up systems because they were unaware of how the 'objects' were relied on by other parts of the code. One such 'stuff up' cost the company months of work.
> Most of the finite element programs people were so fond of in the FORTRAN times are written by now - there are probably generic programs for that kind of thing.

I don't think so. My best friend is steering the English uni gurus in this field - they can't get the finite element code accurate enough, and he keeps exposing deficiencies in it. His code runs on high-speed multi-processor machines (he gave up on the Alpha) for one or two days. He is also quite critical of the 'slowness' of the code. No, the finite element code has a way to go yet.
> Today's software is supposed to do everything, including making coffee,
<snip>

I agree, but why? I would be happier if the code did less, but was more robust and ran faster.
> C's string handling, for instance, sucks - it is the source of most of the security holes that need to be fixed. Buffer overflows happen because C has no concept of a variable-length string - it only has character pointers. What a nightmare.

Maybe, but I think there is a far worse problem: C strings do not have well-defined lengths, which means the O.S. is always messing around allocating memory and leaving unused bits floating about.
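(To make the overflow point concrete - a sketch only, the 16-byte buffer and the over-long input are invented: strcpy() keeps copying until it finds the terminating '\0', however small the destination is, while a length-aware call such as snprintf() stays inside the buffer.)

// A C string is just a pointer to bytes ending in '\0'; the array itself
// carries no length, so nothing stops a copy from running off the end.
#include <cstring>
#include <cstdio>
#include <cstddef>

static void unsafe_copy(char *dst, const char *src)
{
    std::strcpy(dst, src);                      // writes strlen(src)+1 bytes,
}                                               // however small dst really is

static void bounded_copy(char *dst, std::size_t dst_size, const char *src)
{
    std::snprintf(dst, dst_size, "%s", src);    // never writes past dst_size;
}                                               // truncates instead

int main()
{
    const char *input = "a string much longer than sixteen bytes";
    char buf[16];

    // unsafe_copy(buf, input);                 // would smash the stack - the
                                                // classic buffer overflow
    bounded_copy(buf, sizeof(buf), input);      // stays inside the 16 bytes
    std::printf("%s\n", buf);
    return 0;
}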
> I have been programming since the mid-80s, and even though I also tend to

A beginner!

> bitch about many things, things have improved a lot since then. No more rebooting because a null pointer in C overwrote your PC's interrupt table at 0000:0000 on MS-DOS. Anybody remember what a PITA that was?

Yes, I agree. But that was a function of small machines with limited memory and a limited OS. The machine I worked on in the seventies, a 60-bit Cyber, was multi-tasking in a big way (e.g. it handled users from hundreds of kilometres away), and the only thing that crashed it was the cleaning lady unplugging it at 6 am to use the power point. (The thing restarted so efficiently that it took the operators a month to work out why things were 'funny'.)
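(And for what the interrupt-table story means in code - purely an illustration, nothing here is from either machine: the same stray write that scribbled over 0000:0000 on real-mode MS-DOS is simply trapped by a protected-memory OS, which kills the offending process and nothing else.)

// A sketch of the failure mode: the language does nothing to stop the
// write itself; what happens next depends entirely on the OS.
#include <cstdio>

int main()
{
    int *p = NULL;         // forgot to point p at anything real

    // *p = 42;            // MS-DOS (no memory protection): silently
                           //   corrupts the interrupt vectors -> reboot
                           // Linux / modern Windows: SIGSEGV, the process
                           //   dies and nothing else is harmed

    std::printf("the write is left commented out in this sketch\n");
    return 0;
}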
<snip>
> And today, no more segfaults because the C string handling is so dumb.
I have to disagree with you and agree with Synthetooz, who said: "Konqueror segfaults more than anything else I know. Is that written in C?" Except for Kaffeine Media Player, which crashes with a seg fault at every 'close'.
> Use modern tools. Use tools like C++ - or, for that matter, C#, or even Java. Use predefined (meaning: well-tested) classes for common purposes. Do not repeat everybody's (and their mothers') mistakes by writing your own because you think you can do a better job of it.

Oh no, and oh yes we can!
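(To put something concrete behind "predefined, well-tested classes" - again just a sketch with made-up strings: std::string knows its own length and manages its own storage, so the overflow from the earlier example has nowhere to happen.)

// The same kind of copying done with a standard, well-tested class.
#include <string>
#include <iostream>

int main()
{
    std::string name = "a string much longer than sixteen bytes";

    std::string greeting = "Hello, " + name;    // storage grows as needed
    greeting += "!";                            // no strcpy, no size guessing

    std::cout << greeting << " (" << greeting.size() << " chars)\n";
    return 0;
}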
> You argue this comes at a price - and the price is performance and system resources. That may be right, but I would rather sacrifice a few MB of RAM than experience random crashes because nobody can debug software of that complexity written with outdated tools any more. My time (and, for that matter, my nerves) are way more precious to me than a few MB of RAM saved.

What do you mean, "sacrifice memory"? That is sacrosanct! Any programmer who says "there's plenty of memory" would not get a job with me. A good programmer keeps a tight rein on RAM and CPU time.

> I could keep ranting a lot more like that, but other duties are calling right now. ;-)

Yeah, me too.
> Just my 2 cents (well, make that 4 - or 6) ;-)

Mine too ;-)

> Stefan Hundhammer
> Penguin by conviction.  YaST2 Development
> Programmed with MS-DOS, SunOS, Solaris, HP-UX, Win32 (just enough to hate it) and X11 / OSF/Motif, Qt - since 1984, with about a dozen programming languages.

A fair range there. But no really big machines, and no DEC/VAX. About a century ago my mate was working in Scotland and I was in Australia, and we were interacting in real time via "VAX Phone" - that is like an up-market version of chat. And we transmitted large chunks of code "at the touch of a button". The VAX was so far ahead of its time. It is a pity that the PC money giants bought it and killed it.
And I must agree with you about the M$ system. Hence my desire to try to learn X11 and Xt. Now give me a break - don't say I should jump right into Qt; I've got to learn to walk first ;-) I am still trying to clear my head of the M$ stuff.

Cheers, and thanks for the debate,
Colin