This is very important, because #define is really just text substitution performed by the preprocessor.

Examine the following clearly wrong code:

	#include <stdio.h>
	#define HALF(X) X/2
	int main() {
		printf( "Half of 4 is %d\n", HALF(4+0) );
		return 0;
	}

Run it. It'll try to convince you that half of four is four! This is because the code is preprocessed into this:

	/* contents of stdio.h here */
	int main() {
		printf( "Half of 4 is %d\n", 4+0/2 );
		return 0;
	}

See? You need the parentheses; the proper macro is:

	#define HALF(X) ((X)/2)

This will give you the (4+0)/2 you need. Also note the surrounding parentheses, which let the macro's result be used safely inside a larger expression -- multiplied, divided, and so on -- without being torn apart by operator precedence.

Actually, you've explained one of the very good reasons why macros are BAD. Before you flame or downvote, allow me to explain.

Macros give C a great deal of power and expressiveness. They are much, much faster than function calls for small functions. For these reasons, they are a very common construct. However, they are incredibly dangerous. Imagine, instead of the trivial case in the original writeup, a more serious one. Say the macro is used to calculate the proper length of a buffer: a silently wrong expansion means a silently undersized buffer. As many of you know, buffer overflow exploits are one of the most common types of security holes.

The solution is simple: use a more robust language for applications where this kind of error can cause damage. For instance, C++* solves this problem with a construct known as an inline function. Like a macro, it is substituted directly into the code in which it appears. However, C++ does this in the compiler stage rather than the preprocessor stage. Therefore, the substitution is done with full awareness of the language's syntax, and inline functions may be written exactly as a normal function -- with none of the caution that macro use demands.

* This should not be mistaken for me calling C++ a robust language.
