I never really understood the hate towards goto.
For perspective, look at the TCP/IPv4 code in FreeBSD. That code ships in Sony products and many embedded systems, and the Linux implementation is probably similar.
The most critical piece of Internet infrastructure is designed as a state machine built from gotos. Gotos express state machines rather cleanly.
What annoys me most is that indirect gotos are almost never discussed. Bare-metal programmers use that technique to achieve maintainable code. What a shame!
The goto hate comes from a different world. In the late 60s and early 70s, “structured programming” was the experimental new technique with weird ideas that looked unfamiliar and couldn’t perform well enough for the real world. You had to use gotos, everywhere. Code was not decomposed into submodules: you could, and did, jump from any one of your machine-code/assembly statements to another as needed, because the machine was so small you knew all of the state it could be in. If you haven’t read it, read The Story of Mel. Mel’s design is unusual only in the sheer number of clever tricks it pulled, not in their character.
There was a lot of debate on what program design should look like. Structured programming won. Structured programming won so big and so overwhelmingly that, looking back, it’s hard to recognize what parts of it were contentious and what motivated the people who argued bitterly against it.
Gotos have many uses in which they are a better choice than other control structures. What’s so different now is that for all the rest of the cases, we have names, syntax, and familiarity with those other control structures.
There was recently a study on goto use in C that is not even wrong, because it doesn’t understand the world Dijkstra was writing in.
For context… I remember receiving a memo from the head-office mainframe sysadmin recommending against the use of subroutines, as they burned a lot of CPU cycles.
That was the character of the realm in which the battle against the FORTRAN II computed IF() was fought.
Restricted and computed gotos are really useful for lots of things! I don’t even have to prove that: the Ruby interpreter itself uses a computed goto in its bytecode dispatch loop.
The real issue is: should programming languages disallow unrestricted modification of control flow? Even in that case, I think the programmer can handle the freedom.
What is ironic about high-level programming languages is that they also provide first-class functions. It is pretty easy to fake an unrestricted goto with first-class functions and a Hash<Label, Function>. So why bother with all the gobble gobble?
BASIC is the only programming language I know of that provides assembly-style unrestricted gotos.
This is what unrestricted gotos look like in Apple BASIC. Mel would be proud.
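A small Applesoft-style sketch of my own (invented line numbers, not the original poster’s listing), showing both a plain GOTO loop and the computed ON…GOTO form:

```basic
10 LET I = 0
20 LET I = I + 1
30 IF I < 3 THEN GOTO 20
40 REM COMPUTED JUMP: I SELECTS THE TARGET LINE
50 ON I GOTO 100, 200, 300
100 PRINT "ONE": GOTO 400
200 PRINT "TWO": GOTO 400
300 PRINT "THREE"
400 END
```

Any line number is a valid jump target from anywhere, which is exactly the assembly-style freedom the comment above describes.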
If it’s easy to achieve in the language with a standard library feature, why make it a language-level feature? (And the Hash example would still be clearer and more explicit than a true goto: it only lets you execute a complete Function, and the full map is inspectable in a debugger instead of making you hunt through all your code for labels.)
I don’t think programmers can handle the freedom. Look at how modern language design is finding exceptions to be too dangerous a feature — they make code unreadable, unmaintainable, and magic, because the control flow can be surprising — and what an exception can do is very limited compared to an unrestricted goto.
Gotos can be a lifesaver if used properly. Indeed, in C-based languages I am convinced they are the best way to do error handling, and this has been borne out in our production security software. The way we’ve done it, you don’t actually see the gotos, but they are there, significantly reducing the LOC devoted to error handling.
goto fail is probably the best error-handling technique in C.
Agreed. The goto error_<message>; pattern is something I picked up recently, and it resulted in a huge control-flow simplification in my last two C projects. Definitely something I will keep using going forwards.
I was (ironically you might think) introduced to this pattern by ancient QuickTime code.