While academically interesting, this is more like a look into the unsound implications of the C standard than real advice like the namesake Unicode article. But that doesn’t mean it can’t be fun.
For example, with #11, I’m doubtful that any C compiler which encourages (void*)x for creating a pointer to address 0 is going to be so standards-compliant that it will also ensure x == NULL is false. But this makes me think that it could be a really silly challenge to make some sort of IOCCC-type competition to write an adversarial standards-compliant compiler, but require competitors to write code that still worked against it.

Is it my beloathed asm.js?
There have been some, uh, “interesting times” around pointer arithmetic on NULL pointers. There’s the classic null-pointer definition of offsetof(), which used to be really common, but nowadays it’s obligatory to use a compiler builtin instead.
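For reference, the traditional definition looked roughly like this; a sketch of the well-known idiom (renamed here so it doesn’t clash with the real macro), not necessarily the exact snippet originally posted:

    #include <stddef.h>   /* size_t */

    /* Pretend a struct object lives at address 0 and read the member's
       address back as its offset. A strict reading of the standard makes
       this undefined behaviour, which is why modern headers delegate to a
       builtin such as __builtin_offsetof instead. */
    #define classic_offsetof(type, member) ((size_t)&((type *)0)->member)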
And there are all sorts of issues that often turn up in generic C++ where you do something like ptr + len to form the end of a range, which is a problem because NULL + 0 is forbidden even though it seems to be a no-op — well, it was a problem, but it is so common that NULL + 0 is becoming permitted.
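A minimal sketch of that kind of generic code, with made-up names, just to show where the NULL + 0 comes from:

    #include <cstddef>

    // Illustrative generic helper; the interesting part is the "ptr + len"
    // used as the loop bound.
    template <typename T, typename F>
    void for_each_elem(const T *ptr, std::size_t len, F f) {
        // With ptr == nullptr and len == 0 this computes NULL + 0: morally a
        // no-op, but long forbidden by a strict reading of the standard,
        // which is why code like this tends to grow a redundant null check.
        for (const T *p = ptr; p != ptr + len; ++p)
            f(*p);
    }

An empty, null-backed range is a perfectly normal caller here, e.g. for_each_elem<int>(nullptr, 0, [](int) {});.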
Well, offsetof() is defined by the C standard (even as far back as C89) so there’s no need to define it, but using it, in my opinion, feels dodgy because of all the casts that need to be done:
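(Presumably something along the lines of the container-of idiom; a sketch with illustrative names, not the original example:)

    #include <stddef.h>   /* offsetof */

    struct node {
        int value;
        struct node *next;
    };

    /* Recover the enclosing struct from a pointer to one of its members:
       cast to char * for byte arithmetic, subtract the member's offset,
       then cast back to the struct type. */
    struct node *node_from_next_field(struct node **next_field)
    {
        return (struct node *)((char *)next_field - offsetof(struct node, next));
    }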
As for your example, why is ptr NULL? Is this just an example of NULL + 0?
WRT offsetof() I was talking about how implementations used to define it (e.g. the classic macro sketched above), but can’t with modern compilers.
WRT why ptr is NULL, because that’s the value it might happen to have due to whatever is going on in some wider context. If the language forbids NULL + 0 then most data structure code has to have redundant defensive NULL checks even when the code should be a no-op because the length is zero. This is a particular problem in C++ because its iterators are start / end, not start / length. If NULL + 0 is permitted, then the compiler cannot assume that ptr != NULL when it sees ptr + len.

This is a great list of things to avoid when writing a new language specification.