1. 11

  2. 3

    The article doesn’t seem to explain why this should be avoided. It claims that it has “harmful side effects ranging from subtle breakage to miscompilation”, but offers no examples to back that up. If it is indeed harmful, wouldn’t it also cause problems when software built with one set of defines is linked against libraries built with another?

    Sometimes, if my project uses POSIX in only one or two files, I’ll prefer the #define approach over global -D flags. That way, it’s clear that POSIX functions are expected to stay within those files, and it keeps them from being used accidentally in the others.

    Similarly, if a project is entirely POSIX-compliant but one file needs a single function guarded by _GNU_SOURCE (for instance, pipe2, which will be in POSIX issue 8 but is still guarded in glibc), I’ll define it only in that one file, since I don’t intend to use GNU APIs elsewhere.
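
    Roughly what I have in mind, as a sketch (the file and function names are made up; the important part is that the define comes before the first include):

    /* pipe_helper.c - the only file in the project that touches a guarded API */
    #define _GNU_SOURCE              /* must precede every libc include */
    #include <unistd.h>
    #include <fcntl.h>

    int make_pipe(int fds[2])
    {
        /* pipe2 is still guarded behind _GNU_SOURCE in current glibc */
        return pipe2(fds, O_CLOEXEC);
    }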

    The main problem with feature-test macros that I see in the wild is people thinking that they are supposed to check them to see whether those features are supported, when they are really supposed to define them to tell libc to expose those features. For example, it is fairly common to see code like

    #if _POSIX_C_SOURCE >= 200809L || _XOPEN_SOURCE >= 700
    

    intending to check whether st_mtim is available in struct stat, even though the project never defines those feature-test macros anywhere; since undefined macros evaluate to 0 in #if, the check is always false.
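
    The fix goes the other way: define the macro yourself, before any include, and then just use the field. A sketch (the helper function is hypothetical; 200809L is the POSIX.1-2008 value):

    #define _POSIX_C_SOURCE 200809L   /* we ask libc for the feature... */
    #include <sys/stat.h>
    #include <stdio.h>

    /* ...and st_mtim is now part of struct stat */
    void print_mtime(const char *path)
    {
        struct stat st;
        if (stat(path, &st) == 0)
            printf("%lld.%09ld\n", (long long)st.st_mtim.tv_sec,
                   (long)st.st_mtim.tv_nsec);
    }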

    1. 3

      I believe the problem with the pattern in the link is specific to things like this:

      #include <stdlib.h>
      #include <unistd.h>
      #define _XOPEN_SOURCE
      #include <string.h>
      

      The first two headers may pull in something that the third one also includes. By the time string.h is included, those shared headers hit their include guards and aren’t re-parsed, so anything they would have exposed differently under _XOPEN_SOURCE never appears.

      This doesn’t matter if you do the define at the top of the file (-DFoo is equivalent to adding #define Foo at the start of the file). But if you need to change the flags depending on the target, it’s much better to have that centralised in the build system than scattered in a load of different files.
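
      i.e. the earlier snippet is fine once the define is hoisted above every include (an explicit value is usually what you want anyway):

      #define _XOPEN_SOURCE 700   /* before any libc header */
      #include <stdlib.h>
      #include <unistd.h>
      #include <string.h>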

      The article also doesn’t discuss the fact that glibc does this the opposite way around to BSD libcs (including Darwin libc). In GNU libc, everything except a core set is hidden by default and must be exposed by feature test macros. In BSD libcs, everything is exposed by default and the feature test macros are used for writing portable code by restricting you to a standard subset. This means that, in code, you typically want something like:

      #ifdef __linux__
      #  define _XOPEN_SOURCE
      #endif
      

      Unfortunately, that’s really a glibc-specific thing, not a Linux-specific thing, and compilers don’t define a predefined macro for glibc, only for operating systems.
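
      To make that concrete, the check you’d actually like to write doesn’t work, because __GLIBC__ comes from glibc’s own <features.h>, not from the compiler, so it isn’t visible until after the first include. A sketch of the dead end:

      /* Never triggers in a fresh translation unit: no libc header has been
         included yet, so __GLIBC__ is not defined at this point. */
      #if defined(__GLIBC__)
      #  define _XOPEN_SOURCE 700
      #endif
      #include <string.h>

      So in practice you either key off __linux__ as above or push the define into the build system for glibc targets.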

      1. 2

        Yes, defining it in between includes is clearly wrong, but the article calls out any sort of #define _XOPEN_SOURCE as bad practice. As you said, -D is equivalent to adding a #define at the start of the file, which is why I am a bit skeptical.

        1. 1

          My rule of thumb is that it’s okay to add one or two of these in a codebase, but if more than a couple of files contain them then it belongs in the build system. “Don’t put them in files” is probably a better rule of thumb than “always put them in files”, but the article misses a lot of nuance.