@jcs, could we replace this unprintable clickbait title with a more appropriate title, like “NULL is the worst mistake of computer science”?
I think he lets the JVM and .NET languages off too easily; yes, they have a principled sum type available to deal with the null case, but because of the misdesign of their original platforms, code still needs to account for NPEs. It’s gross.
Right, and this is a frequent issue with real-world Scala code. If you come into a Scala codebase written by Java programmers who didn’t take the time to learn how to do Scala right (or who weren’t given the time, because the business set unreasonable deadlines) then you have to handle None and null. The existence of null in Scala (which you can’t really not have, because Java APIs use it) means that the type system is leaky anyway.
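To make the interop pain concrete, here’s a minimal sketch (the `LegacyUserDao` API is hypothetical, invented for illustration) of the usual discipline: convert null to `None` once, at the Java boundary, so the rest of the code only ever deals with `Option`:

```scala
// Hypothetical Java-style legacy API, assumed for illustration:
// returns null on a miss, as Java APIs commonly do.
object LegacyUserDao {
  def findUser(id: Int): String = if (id == 1) "alice" else null
}

object Boundary {
  // Option(...) maps null to None at the boundary, so downstream code
  // sees only Option -- not null *and* None.
  def findUser(id: Int): Option[String] = Option(LegacyUserDao.findUser(id))
}
```

The trouble the comment describes is exactly when this wrapping isn’t done consistently, and nulls leak past the boundary anyway.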
Scala gives you the ability to impose discipline over your own code, and that’s worth something, but maintaining someone else’s Scala code, when that someone else is a typical corporate engineer, is not appreciably less painful than maintaining that person’s Java code. Either way, it’s bad enough that I’d generally turn down a project that involved maintaining typical Scala code unless it came with a 400+ hourly rate.
Clojure’s nil isn’t, in my opinion, quite as much of a wart. It’s a Lisp, and I feel like Lisp still has something of a never-crash ideology, which is why cond returns nil instead of signaling an error when there’s no match. I’m not a fan of the never-crash approach, and Clojure certainly doesn’t take it wholesale (it has Java’s exception system), but you probably need a “non-value” value if that’s the approach to programming you’re taking. The nil overloading is conceptually attractive (false is the empty collection is the error value is Nothing) but painful in practice because of its effects on debugging: a long distance between the original error and the eventual failure is usually undesirable. Still, I don’t fault Clojure for continuing it, to a lesser degree.
This is my experience. Even in a Scala codebase written originally by excellent programmers, I continually found NPEs. Many of the Java libraries that were required emitted them; many of the less experienced programmers would resort to Java-ish idioms, which of course meant nulls. It is a big part of the reason I am so personally bearish on Scala.
I actually think that he’s too harsh on Objective-C; yes, it retains the nastiness of C’s null pointers, but the Smalltalk-ish half of the language supports very nice bottom propagation, and the language idiom embraces that fully. It falls over in liminal code, of course.
 The admittedly minor problem that sums up my dislike of the language is how appallingly bad its support for ADTs is. Yes, I understand that the case class nightmare is required by the necessity of adhering to the Java object model; but here I stand – I can do no other.
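For reference, the case-class encoding being complained about looks roughly like this (a minimal sketch; the `Shape` types are invented for illustration):

```scala
// The standard Scala ADT encoding: a sealed trait plus case classes.
sealed trait Shape
final case class Circle(radius: Double) extends Shape
final case class Rect(width: Double, height: Double) extends Shape

object Shapes {
  // Because Shape is sealed, the compiler can warn about missing cases here.
  def area(s: Shape): Double = s match {
    case Circle(r)  => math.Pi * r * r
    case Rect(w, h) => w * h
  }
}
```

It works, and it gets you exhaustiveness checking; the complaint is about the ceremony, which exists because each variant has to be a JVM class.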
The nice thing about nil in lisps is that it (typically) is a value: the empty list. In a language oriented around lists, that makes quite a bit of sense.
Indeed, nil is a list, but (in modern Lisps, anyway!) it is not a cons. Then again, the empty list isn’t a cons in any language!
It feels to me like Clojure gets a slight bye because immutability helps out the uses of nil, but nil almost invariably ends up being used as a halfway Maybe type, so, meh.
… and really, the ASCIIZ “NUL” is a different thing anyway. Related, I admit, because it is the character-that-is-not-a-character, but it isn’t really fair to blame Hoare for it.
Indeed, ASCII NUL predates not only Hoare’s null pointers, but also ASCII and Hoare himself, dating back to Baudot’s 1874 five-bit telegraphy code, or arguably Murray’s 1899–1901 redesign of it.
But Lisp’s 1959 NIL also predates Hoare, and it turns out that the IPL language systems Lisp got its ideas from, for which Allen Newell invented linked lists, also sort of terminated linked lists with null pointers; as the 1963 IPL-V manual says on p. 8:
A termination cell contains the word 00 00000 00000, and the symbol that names it is called a termination symbol. The internal symbol 0 is a termination symbol, and is used by the programmer in preference to other termination symbols. Hence, it is referred to as the termination symbol.
That’s not the only possible choice. You can terminate linked lists with a pointer back to the same node, or a pointer to the beginning of the list, or a specially-allocated list node that signals termination for any list and isn’t actually used to store anything, or a special kind of list node with internal type metadata indicating that it’s a terminator. That last one is kind of what IPL-V does: it isn’t enough to have a null next-pointer (aka CDR in Lisp or LINK in IPL-V); rather, the next-pointer must point to a physical node that is entirely filled with nothingness. And indeed, in Lisps, NIL was for a long time a real physical cons pair whose CAR and CDR pointed back to itself.
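In modern typed terms, the “distinguished terminator node” approach is essentially what an algebraic list type gives you. A minimal sketch (names invented for illustration):

```scala
// A cons list where "end of list" is a distinct case of the type rather
// than a null LINK/CDR: there is no tail to follow past the terminator,
// so "walking off the end" is unrepresentable.
sealed trait LList[+A]
case object End extends LList[Nothing]
final case class Cons[A](head: A, tail: LList[A]) extends LList[A]

object LList {
  def length[A](xs: LList[A]): Int = xs match {
    case End        => 0
    case Cons(_, t) => 1 + length(t)
  }
}
```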
These are all more or less details, though, since the problem with null pointers is not how you represent nulls or whether your program loops infinitely or dies with a segfault when it keeps following a list that has already terminated; the problem is that it’s hard to determine statically which pointers could be null and therefore need to be checked for nullness before access, and which are guaranteed to be non-null and can therefore be dereferenced safely. Sum types and monads remove this problem.
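A small sketch of that last point (the `Capitals` example is invented for illustration): with a sum type, possible absence is part of the return type, so the caller can’t dereference without first handling the empty case:

```scala
object Capitals {
  private val table = Map("France" -> "Paris", "Japan" -> "Tokyo")

  // Absence is visible in the type: there is no way to get at the String
  // without going through the Option first.
  def capitalOf(country: String): Option[String] = table.get(country)

  def describe(country: String): String = capitalOf(country) match {
    case Some(city) => city
    case None       => "unknown"
  }
}
```

Delete the `None` case and the compiler warns about a non-exhaustive match, which is exactly the static guarantee null pointers lack.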
But criticizing this article is kind of shooting fish in a barrel. Did you notice that it credits Hoare as the inventor of Algol W? I suppose the author never thought to ask what the “W” stands for.
I’m all for removing NULL, but the author is a little late to the party and I didn’t think he added anything to the conversation.
If you read the Reddit thread, there are a lot of programmers who lack even a basic amount of information on this topic. While this may be true in a global sense, don’t discount the “lucky ten thousand” effect, or whatever that XKCD called it.
That, and I don’t honestly know where I’d go for an explanation of why it’s an issue, targeted at programmers who have yet to be sold on elaborate type systems, and without trying to make the larger pitch as part of the same essay. Does anybody have a favorite link about NULL for that audience?
There is a video from the horse’s mouth: http://www.infoq.com/presentations/Null-References-The-Billion-Dollar-Mistake-Tony-Hoare
But there are literally dozens, probably a hundred at this point, of null blog posts, so one probably exists.
Despite the article’s deplorable sloppiness with the facts, I guess you’re right that this is a good article for opening novice programmers’ eyes to the possibilities of null-freedom.
For anyone looking, that would be this Reddit thread.
And for the next person who writes the same article there will be even more people who lack a basic amount of information on this topic, so I’m not sure that is a strong enough justification for it to be written. I’ve been raising my fist in anger over NULL for years, but what has actually had an effect on my colleagues was Java adding Optional<T>. Now they think null is problematic.
I don’t know what to take from that or if it means that this blog post was a waste of effort. But I’m just not sure the number of programmers who don’t know about it is sufficient motivation.
That’s an interesting perspective. I guess I’d conclude that showing works better than telling, but that’s fairly trite and it feels like both are needed. Your point is well taken that adding to the discourse may not be helping at this point.