There’s a bit of Parkinson’s law of triviality in this. When picking a language to use, whether or not it has semicolons is extremely low on the list of considerations.
Agreed. Also, I think the mention of optional semicolons on the Kotlin homepage is somewhat tacky. Are there not more important reasons I should use their language than the optional semicolons?
+1 about Kotlin
I think the more important question is: should a newline represent the end of a statement? Linewrap/wordwrap is often implemented poorly, or sometimes not at all. So you say, why not implicit semicolon insertion? Well, that’s quite problematic too. The problem isn’t so much the character “;” but rather the idea that we don’t need a way to express the end of a statement, or worse, that it should be fundamentally ambiguous whether a statement has ended or not.
Clearly many feel that wordwrap/linewrap is sufficient for long statements, but what happens if your line wraps in a place that makes one statement look like two?
Right. Every language where end-of-line is an implicit statement terminator, or worse might be one if some conditions are met, is one where I have less control over the whitespace and layout of my program. There is also a higher cognitive overhead when trying to figure out how the compiler will interpret my code, as I now have to apply rules beyond just “it ends where the semicolon is”.
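The ambiguity described above is easy to demonstrate with JavaScript’s automatic semicolon insertion (ASI). Two small made-up snippets, one where ASI fires when you don’t want it to, and one where it doesn’t fire when you might expect it to:

```javascript
// Case 1: ASI inserts a semicolon immediately after `return`,
// so the wrapped expression below it is never evaluated.
function total() {
  return
    1 + 2; // unreachable
}

// Case 2: ASI does NOT fire before `[`, so the bracket is glued
// to the previous line and indexes into `a` instead.
const a = [1, 2, 3]
const b = a
[0] // parsed as `const b = a[0]`

console.log(total()); // undefined, not 3
console.log(b);       // 1, not [1, 2, 3]
```

In both cases the code is syntactically valid, so the compiler never warns you; the statement boundary simply isn’t where the layout suggests it is.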
I’m fond of the semicolon from a familiarity point of view, but the token it represents – end-of-statement – is the critical part.
The only partially sane idea I’ve seen is that every statement ends with the line UNLESS you use a statement continuation token. Depending on your code base, however, this might be just as much work as, or more work than, using a semicolon to end a statement. People only attempt to solve this “problem” out of an insufficient understanding of why the existing design is the way it is.
I don’t like the continuation-token pattern that much. If you’re keeping your code to a fixed width (for whatever reason suits you) you now have to knock off a few extra characters (e.g., for shell scripts, a space and a \). It’s visually a bit awful, and in practice it seems to encourage people to oversimplify their code by leaving things out in order to avoid breaking across lines.
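For contrast, semicolon-optional languages typically flip the continuation token around: instead of explicitly marking that a statement continues, you must break the line at a point where the statement is still syntactically incomplete. A small JavaScript sketch (values are arbitrary):

```javascript
// Breaking AFTER the binary operator leaves the statement open,
// so no terminator can be inferred at the line break.
const subtotal = 10
const shipping = 5
const total = subtotal +
  shipping // the trailing `+` keeps the statement continuing here

console.log(total) // 15
```

This works, but it means the set of “safe” break points is defined by the grammar rather than by a single visible token, which is exactly the extra cognitive overhead mentioned above.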
To the best of my knowledge you don’t really hear authors whinging about having to use all that pesky punctuation to terminate sentences, even when the sentence is the last one before additional whitespace would otherwise break up the text. It’s not clear why people believe not punctuating their code saves them a lot of time and energy.
Here’s a list of languages he mentions and how he describes their age:
Cobol: 1959, “very old”
C: 1972, “slightly aged”
C++: 1983, “slightly aged”
ABAP: 1983, “very old”
Python: 1991, “newer”
Java: 1995, “slightly aged”
C#: 2000, “slightly aged”
It’s interesting how perceived age of programming languages works. For example, Python is older than Java, but if you asked most people not intimately familiar with the histories of those languages, they would likely guess Java is older. There are a number of possible reasons for this, but I would posit the following play the largest role:
I think you’re correct about why Java seems older than Python. Perhaps it’s expecting too much of blog authors to do a bit of research into the material they are writing about.
I would’ve agreed with this back when I used a mandatory-semicolon language all the time. But after ~4 years of pure(ish) Scala, it’s incredibly tedious to have to write a semicolon at the end of every line when I dip back into a language that needs them. If semicolons weren’t there, would we feel the need to introduce them?
Scala nailed this. I use the Eclipse IDE with “show inferred semicolons” enabled. All the semicolons are just there, without my ever having to deal with them (they are non-selectable and cannot be copied).
The semicolon is alive and strong, with an amazing superpower: it’s programmable.
The semicolon is technically programmable in C++ through the use of destructors ;-)
Regarding code quality, I also strongly believe that the semicolon …
Here is the problem with the article: it’s a rant built solely on belief. Somewhat harshly stated: why should I care what you believe? Give me actual data. Let’s science this!
Save my pinky