|May11-08, 12:18 PM||#1|
Difference between #define and const
I wanted to know whether there is any difference between #define and "const int" (say),
apart from the fact that one is actually assigned a datatype and the other is not.
|May11-08, 12:43 PM||#2|
One of them informs the preprocessor to do a textual substitution. The other actually declares a (constant) C variable.
|May11-08, 04:24 PM||#3|
In addition to Hurkyl's answer ...
A #define is either an immediate value or a macro. A constant is a variable that doesn't change in value. You can declare a pointer to a const, but not to a #define, although a #define could expand to a pointer (for example, "#define PI1234 ((int *)0x1234)").
In C, #defines are local only, so there's no way to make a #define externally available to the linker. Instead, the #define must be included in the source code for all modules that need to access the define. A const variable can be global and accessed via other linked modules, but requires a memory access and occupies space. In assembly, global equates are possible, and linkers typically generate global equates to indicate the bounds of key points within the code, like the start and end of data and code. Global equates don't require a memory access and don't occupy any space.
Within a module, a C compiler could optimize a const as if it were a #define, provided no pointers to the constant are declared. In CPU terms, the const would become an "immediate" value. Another alternative is that a const variable could be placed in the code area rather than the data area, since it doesn't change. On some machines, declaring a pointer to a constant could cause an exception if you tried to modify the constant through the pointer (if the constant were placed in a read-only section).
|May13-08, 11:37 PM||#4|
look at here:
[29.7] Why would I use a const variable / const identifier as opposed to #define?
const identifiers are often better than #define because:
they obey the language's scoping rules
you can see them in the debugger
you can take their address if you need to
you can pass them by const-reference if you need to
they don't create new "keywords" in your program.
In short, const identifiers act like they're part of the language because they are part of the language. The preprocessor can be thought of as a language layered on top of C++. You can imagine that the preprocessor runs as a separate pass through your code, which would mean your original source code would be seen only by the preprocessor, not by the C++ compiler itself. In other words, you can imagine the preprocessor sees your original source code and replaces all #define symbols with their values, then the C++ compiler proper sees the modified source code after the original symbols got replaced by the preprocessor.
There are cases where #define is needed, but you should generally avoid it when you have the choice. You should evaluate whether to use const vs. #define based on business value: time, money, risk. In other words, one size does not fit all. Most of the time you'll use const rather than #define for constants, but sometimes you'll use #define. But please remember to wash your hands afterwards.