Author Topic: Preprocessor and Macros  (Read 22858 times)

DerSaidin

Preprocessor and Macros
« on: July 24, 2013, 02:12:44 PM »
http://stackoverflow.com/questions/14041453/why-preprocessor-macros-are-evil-and-what-is-the-real-alternative-c11
In general, I agree macros are evil.

Remove the preprocessor? Modules address the main problem

Or at least remove macro expansion to source tokens, so that #define only affects other preprocessor directives?

bas

Re: Preprocessor and Macros
« Reply #1 on: July 24, 2013, 10:47:10 PM »
Sometimes macros just save you typing. But like many other basic tools, they're heavily abused.

I thought about removing the macro preprocessor from C2, but still haven't found a good
alternative. One idea is semantic macros. You can define these like functions, but the main
difference from normal macros is that they are expanded inside the AST (Abstract Syntax Tree)
and not in the source code. This way, everything can be checked. For example:
old and very bad way ;)
#define MAX(a,b) (a>b? a : b)

With semantic macros, this would be:
public macro MAX(a, b) {
   a>b? a : b;
}

The nice thing is that the compiler understands it a lot better and that
it can actually warn if used with the wrong a and b for example.

Recursive macros are still evil, semantic or not.

DerSaidin

Re: Preprocessor and Macros
« Reply #2 on: July 25, 2013, 10:51:37 AM »
Quote from: bas
Sometimes macros just save you typing. But like many other basic tools, they're heavily abused.

Indeed, they are too easily abused and frequently abused.

Quote from: bas
I thought about removing the macro preprocessor from C2, but still haven't found a good
alternative. One idea is semantic macros. You can define these like functions, but the main
difference from normal macros is that they are expanded inside the AST (Abstract Syntax Tree)
and not in the source code. This way, everything can be checked. For example:
old and very bad way ;)
#define MAX(a,b) (a>b? a : b)

With semantic macros, this would be:
public macro MAX(a, b) {
   a>b? a : b;
}

The nice thing is that the compiler understands it a lot better and that
it can actually warn if used with the wrong a and b for example.

That use case is pretty much a function.

Expanding it in the style of a macro achieves two things there:
  • Inlining for performance
  • Code is applicable for multiple types (generics/templates in other languages)

For the inlining performance case, I would argue that 1) that is an issue for the optimizer, and 2) the inline keyword should force it to occur.
Also, inlining the MAX in this example seems reasonable, but when the uses escalate to very large amounts of code being inlined, it can become evil again.
The semantic macro in your example is an improvement.


Another use of macros I've seen, which is really an inlining issue, is for table-gen-type stuff.
Example:
http://hg.dersaidin.net/weaver/file/17e883a72fc6/base/code/game/spell_info.def
http://hg.dersaidin.net/weaver/file/17e883a72fc6/base/code/game/spell_shared.c#l352
But this could be done using a constant global to hold the data/table, and a for loop instead of a switch. I believe the reason for using macros instead is to get a switch, which should compile to something more efficient than a loop.
I think undefining and redefining macros like this would still be evil with semantic macros.

You could also argue this type of thing should be done using a tool like LLVM's tablegen.

The most common place macros are used for inlining is constants. That is replaced by C++11 constexpr (or just const globals; I think the difference is really pretty arbitrary).



Generics/templates in other languages could be described as "semantic macros", but they are more strongly typed than the MAX code in your example:
Code:
template<typename T>                                                                           
T MAX(T a, T b) { return a > b ? a : b; } // a and b are both type T.

#include <iostream>

int main(int argc, char**argv) {
    int a = 4;
    int b = 9;
    float c = 4.99;
    float d = 9.52;
    std::cout << MAX(a, b) << std::endl;
    std::cout << MAX(c, d) << std::endl;

    // error: template argument deduction fails, a and c differ in type
    // std::cout << MAX(a, c) << std::endl;

    return 0;
}


All of the reasonable macro-expansion uses I can think of are either:
  • Inlining to work around optimizer failings
  • Generics

What other uses of macro expansion aren't totally evil?
There might be some, but I can't think of any others at the moment.

kyle

Re: Preprocessor and Macros
« Reply #3 on: June 19, 2014, 05:13:37 AM »
I've used macros (in anger) to do things like define new syntax:

Code:
synchronized_block(mutex) {
  ... code that needs to be running in only one thread at once ...
}

This is defined as a macro using two nested for loops and the __LINE__ builtin macro.  It is a hack, but one that eliminated a huge source of bugs during one project.  I am working on some similar stuff for exceptions and resource handling.

The other use I do is to provide default file position arguments for debugging:

Code:
#define debug_print(m) do { fprintf(stderr, "In %s at line %d: %s\n", __func__, __LINE__, (m)); } while (0)

Assert is often done in macros.

If I am generating tables or something, I tend not to use X-Macros.  I tend to use Perl scripts or something to generate the code. 

I think generics would be really useful.  I've had to build some large macros to do this for things like linked lists.  It's a pain and very easy to do incorrectly.

I agree that things like inlining are best left to the optimizer. 

So, we've got:

* constants -> const
* inlining -> optimizer
* templates -> ?
* syntax extensions -> ?
* assert/debug macros -> ?
* things like default __LINE__ args -> ?

I definitely use the last three more than I probably should :-)

Best,
Kyle


lerno

Re: Preprocessor and Macros
« Reply #4 on: October 20, 2018, 05:07:45 PM »
I have been thinking about this a bit. I actually prefer a meta language rather than using the same language to express more complex macros. Compare Jai, Zig and some others that use compile time execution of the normal code. I find this both hard to read and problematic for static analysis. Also, it would tempt overuse quite a bit.

In a simple meta lang we would need:

  • To loop over a range or a statically declared set (often some enums)
  • Access some information about the stack
  • Parameterize code with different (a) type, (b) struct member accessed (c) function called.
  • conditionals (if/else) with simple integer values.
  • string generation (basically a static printf)

Any suggestions?

bas

Re: Preprocessor and Macros
« Reply #5 on: October 22, 2018, 12:10:50 PM »
The big problem with a meta language is that it becomes very hard for tooling. Even after almost 50 years of C, C tooling still has a very hard
time doing proper analysis. For a very large part, this is because of the preprocessor magic that can happen. Because of this, I would try to avoid a meta
language. This was also the reason why fragment-based programming (hmm, the name was different, but it eludes me for now) failed; it was just a mess
to read.

It still amazes me that a C-like language seems to need a macro system, while many other languages out there do just fine without one.
Also, if you think about it, an inline function is superior to a macro in almost every sense, except that it cannot modify variables in the calling scope.

lerno

Re: Preprocessor and Macros
« Reply #6 on: October 22, 2018, 05:08:38 PM »
I agree that it is a mess. For a meta language I'd consider something very simple that could easily be statically analysed. That's why I don't suggest going the Zig route. For both code reading and code parsing, we always want to read as few lines as possible to understand the execution context; compile-time execution makes that much harder.

bas

Re: Preprocessor and Macros
« Reply #7 on: October 24, 2018, 02:59:33 PM »
If you look at the Linux kernel code, it seems that any two lines of code that look similar get macro-ed.
In the end this becomes a big mess.

From a language point of view, C macros are used for three different purposes:

1. Defining constants:
Code:
#define BUFFER_SIZE 10
In C2 this is replaced by just using:
Code:
const u32 buffer_size = 10;

2. Feature enabling/disabling:
Code:
#ifdef HAVE_OPENGL
C2 does not have an alternative to this (but you can define all the features in the recipe.txt).

3. Code re-use:
Code:
#define max(x, y) ((x > y) ? x : y)
Also sometimes larger macros that include if-statements, etc.
This code would be replaced by semantic macros in C2 that do AST replacement.

The use of nr. 3 is quite common, but I think that's partly because it has become a
habit for C developers. Other languages don't seem to need this at all and are doing
fine.


lerno

Re: Preprocessor and Macros
« Reply #8 on: October 24, 2018, 10:01:04 PM »
You forgot a big one: generics.

Look at pretty much any hash map implementation in C.

I haven't made a [generics] proposal for C2 yet, because I'm still trying to figure how to do this the best possible way.

It's interesting to look at the draft for Go2, since it wrestles with the same issue.

My personal opinion is that too much generics makes a language very messy and hard to read. They encourage solutions that layer generics upon generics until you have a mess trying to read it.

It is not strange to get something like: std::shared_ptr<std::unordered_map<std::string, std::function<void(const std::shared_ptr<Foo> &)>>>

This almost requires you to use type inference where possible, which makes code reading harder.

Obviously for C++ we could have done something like:

Code:
using FooPtr = std::shared_ptr<Foo>;
using FooCall = std::function<void(const FooPtr &)>;
using FooCallMap = std::unordered_map<std::string, FooCall>;
using FooCallMapPtr = std::shared_ptr<FooCallMap>;

But whereas the original (long) definition was very verbose, you still knew exactly how to access things in it. Not so with our new aliases, which means you will probably have to go back and look up those definitions, which interrupts code reading.

How to create generics that remain readable and limited is the question...

bas

Re: Preprocessor and Macros
« Reply #9 on: October 26, 2018, 09:27:27 AM »
You're completely right, I missed the generics one. Thanks.
I hadn't noticed Go2 yet, nice! The Go developers also seem to be in an improvement loop, self-inspecting and such. Good for them!

I think generics can be very powerful for container data structures, so a list of X uses the same code whether X is a u32 or a pointer.
So, roughly, the original STL containers (vector, array, list, etc.).
In C++ I think they went too far, making code horribly complex to read and even worse than horrible to understand.

Generics are hard syntax-wise (you might want to avoid the C++ notation), but they also put quite a strain on the rest of the language.
Maybe the Go2 team will come up with something genius.

lerno

Re: Preprocessor and Macros
« Reply #10 on: October 26, 2018, 11:11:37 AM »
As I've written before, I think generics are only truly needed for containers. Thus a sweet spot might be a syntax that allows for generic containers, but nothing else (also consider adding C11's _Generic).

I don't have this fully thought out yet so I'm unsure of how it should look.

Regarding Go2, I have been looking at the counterproposals as well. None of them look exactly great. Again, what I want to avoid is stuff like foo<bar<baz>, bar<foobar<int, char *>>>. Generics truly open a Pandora's box of verbose and unreadable code.

It's entirely reasonable to want to write generic code that works on any array, and you also want to be able to write your own containers that can be used for any payload. But enabling that often enables much more than you want.