From: Stefan Hundhammer
[snip]
The other myths about object orientation etc. are similar: if you write non-trivial software, you simply NEED that kind of thing to reach the level of abstraction required for the software to be maintained at all, let alone maintained over extended periods of time.
Many would disagree and consider it a myth that OO is more maintainable. There is little evidence outside the limited scope of GUI code that sheds a positive light on the OO philosophy. [snip]
Today's software is supposed to do everything, including making coffee, cleaning your shoes, and taking the dog out. You do not want to write that kind of software with a 70s-era approach - monolithic blocks of code like people used to write in old Pascal, C, or (heaven forbid) FORTRAN.
Procedural code is no more likely to create "monoliths" than OO. In fact, surveying the memory map of many OO-developed applications would lead one to believe the purpose of OO is to produce massive, resource-hogging monoliths. The C world has an old and established concept called libraries: any well-written library means your C program doesn't have to re-invent the wheel.
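To make the libraries point concrete, here is a minimal sketch (an illustration of the idea, not code from anyone in this thread): instead of hand-rolling yet another sort, a C program leans on libc's long-established qsort().

    #include <stdio.h>
    #include <stdlib.h>

    /* Comparison callback in the form qsort() expects:
     * negative, zero, or positive, like strcmp(). */
    static int cmp_int(const void *a, const void *b)
    {
        int x = *(const int *)a;
        int y = *(const int *)b;
        return (x > y) - (x < y);
    }

    int main(void)
    {
        int v[] = { 42, 7, 19, 3, 23 };
        size_t n = sizeof v / sizeof v[0];

        qsort(v, n, sizeof v[0], cmp_int);   /* well-tested library sort */

        for (size_t i = 0; i < n; i++)
            printf("%d ", v[i]);
        printf("\n");                        /* prints: 3 7 19 23 42 */
        return 0;
    }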
Sure, you CAN do that. But it hurts - big time.
C's string handling, for instance, sucks - it is the source of most of the security holes that need fixing.
Bad, lazy programmers are the source of security holes regardless of language.
Buffer overflows happen because C has no concept of variable-length strings - it only has character pointers. What a nightmare.
Buffer overflows happen in many languages, not just C, due to bad, lazy programmers.
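To ground the disagreement, a minimal sketch of the failure mode both sides are talking about (hypothetical code, not from either poster): a fixed-size buffer, the unchecked copy that would overflow it, and the bounds-checked call that avoids the overflow.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[8];
        const char *input = "longer than eight bytes";

        /* The classic bug: strcpy() does not know how big buf is,
         * so it would write past the end - undefined behavior.
         * strcpy(buf, input);   <- left commented out on purpose */

        /* The careful version: snprintf() never writes more than
         * sizeof buf bytes, truncating the input if necessary. */
        snprintf(buf, sizeof buf, "%s", input);

        printf("%s\n", buf);     /* prints "longer " - truncated */
        return 0;
    }

Whether that reads as "the language has no safe strings" or as "the careless call was the bug" is exactly the disagreement above.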
Been there, done that, hated it from the bottom of my guts.
I have been programming since the mid-80s, and even though I tend to bitch about plenty of things myself, a lot has improved since then. No more rebooting because a stray null-pointer write in C overwrote your PC's interrupt vector table at 0000:0000 under MS-DOS. Anybody remember what a PITA that was?
No more dumbass 64K limits because certain ingenious inventors had considered that much memory "enough for everybody".
This is a matter of history and architecture. It's no more dumbass than any other physical limitation of today's hardware. At one time 64K was viewed as generous, and quite a lot of useful programs worked within it. There was a time when it was not possible - physically or economically - for a typical computer to address more than 64K (or 16K, or 4K, depending on how far back in history you'd like to go). Opteron systems do not come with 64 gigabytes of physical RAM, in spite of a theoretical 64-bit address space vastly larger than that - so is that dumbass too?
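For scale (my arithmetic, not the original poster's): a full 64-bit address space covers 2^64 bytes = 16 exbibytes, while 64 GB is 2^36 bytes, so the gap between what the architecture can address and what such a machine actually ships with is a factor of 2^28 - roughly 268 million.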
And today, no more segfaults because C's string handling is so dumb.
Konqueror segfaults more than anything else I know. Is that written in C?
Use modern tools. Use tools like C++ - or, for that matter, C#, or even Java. Use predefined (meaning: well-tested) classes for common purposes. Do not repeat everybody's (and their mothers') mistakes by writing your own because you think you can do a better job of it.
Nobody is arguing every C program must be written from scratch.
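For concreteness, here is what the "predefined classes" advice looks like with C++'s std::string (a small sketch assuming nothing beyond the standard library): the class tracks its own length and grows its own storage, so the fixed-buffer overflow from the earlier example has nothing to overflow.

    #include <iostream>
    #include <string>

    int main()
    {
        // std::string manages its own storage: it grows as needed,
        // so there is no fixed-size buffer for input to overrun.
        std::string greeting = "Hello";
        greeting += ", world";                  // no strcat(), no sizing
        greeting += " - and no manual buffer arithmetic";

        std::cout << greeting << '\n'
                  << "length: " << greeting.size() << '\n';
        return 0;
    }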
You argue this comes at a price - and the price is performance and system resources. That may be right, but I would rather sacrifice a few MB of RAM than suffer random crashes because nobody can debug software of that complexity written with outdated tools any more. My time (and, for that matter, my nerves) are far more precious to me than a few MB of saved RAM.
The OO approach has not eliminated bugs, crashes, errors, and the other ills of software. It has succeeded in making computers vastly slower -- a boon to the hardware industry, since everyone always needs to upgrade to maintain the status quo.