On Tue, 25 Mar 2003 21:57:12 -0500 "Steven T. Hatton" firstname.lastname@example.org wrote:
I'll have to confess, when my professor started talking about preprocessor directives in my C programming course, my head just started to spin. Now some of it seems to be falling into place.
It looks like Jerry Feldman has some insight into another twist on how this might affect preprocessing. Somehow, I suspect compilers are smart enough these days to know to skip the redundancies, but one never knows. And there may be still more subtle implications I have failed to comprehend in what he said.
(I teach C Programming at Northeastern :-). I am not aware of any C compiler that will try to skip redundancies.
In the old days, the C preprocessor was a separate Unix utility, and it is still available as a separate command, e.g. cpp(1).
Most modern C compilers actually combine the functions of the C preprocessor with the lexer, but functionally it works the same. As John mentioned, most C and C++ header files contain something like:

#ifndef CCXX_MISC_H_
#define CCXX_MISC_H_
...
#endif
It is very important to understand the effects of the basic C Preprocessor functions. For example:
#define MAX(x, y) ((x) > (y) ? (x) : (y))
Note that either x or y may be evaluated twice, and that x and y could be expressions and may have side effects. Before the C89 standard, the ctype "functions" (e.g. ctype.h) were routinely implemented as macros. Today, the standard mandates that they be functions. Also, some stdio functions, such as getchar(), are normally implemented as macros.
You are also correct about the translation unit. That is effectively what the compiler sees, since preprocessing behaves as if it operated on the file before the compiler proper ran.