Things like this are why I don’t trust Martin’s opinions. He didn’t say a single bad thing about Clojure, he didn’t have any nuance, he doesn’t respect the other viewpoints. He does the same thing with TDD and clean code, where it’s impossible for them to be the wrong tool. He’s been programming for five decades and still thinks in silver bullets.
For the record, his Clojure example is even shorter in J. It’s just *: i. 25.
His example is shorter even in Clojure: (map sqr (range 25)). But that misses the point. Both Clojure examples are easy to explain and understand, whereas in J it is not obvious what *: and i. stand for, or how they should be changed if we wanted to compute something different. But even that is not the point.
The point is that Uncle Bob is writing about his own experience with the language that he finds fascinating. He writes about his experience and he gets to choose how he writes it. If anyone disagrees (plenty of people do, I suppose), they are very well entitled to write about their experience themselves.
I don’t want to sound like an asshole, but what exactly is his experience besides teaching and writing books? Because we see so many people advocating for a specific language or technology without any substantial real-world experience.
As professional advocates go, he’s well known and (at least by me) well regarded. A professional advocate advocating for something is a signal too… and a lot of the things he was advocating 25 years ago are still relevant today.
http://web.archive.org/web/20000310234010/http://objectmentor.com/base.asp?id=42
A professional advocate advocating for something is a signal too
Yes, it’s called Appeal to Authority.
I’m also not convinced he’s much of an authority. I’d say he’s a zealot. His tirades against types are tired. His odes to discipline are masturbatory. His analogies… well… This is the same guy who said C++ is a “man’s language” and that you need big balls to write it.
His analogies… well… This is the same guy who said C++ is a “man’s language” and that you need big balls to write it.
This is called an ad hominem. If you’re going to be a stickler about logical fallacies I’m surprised that you can’t even make it a few sentences without contradicting yourself. Are they important or not?
A professional advocate advocating for something is a signal too
This is called inductive reasoning. Given some evidence, such as a well-regarded professional advocating for some tool, we can try to generalize that evidence, and decide the tool has a good chance of being useful. You’ve surely heard of Bayesian probability; signals exist and they’re noisy and often incorrect but minding them is necessary if you want to make any sense of the world around you.
Yes, it’s called Appeal to Authority.
Logical fallacies only really apply when you’re working in the world of what’s called deductive reasoning. Starting from some premises which are assumed to be true, and moving forward using only techniques which are known to be sound, we can reach conclusions which are definitely true (again, assuming the premises). In this context, the one of deductive reasoning, appeal to authority is distinctly unsound and yet quite common, so it’s been given a nice name and we try to avoid it.
Tying it all together, the parent is saying something like “here’s some evidence”, and you’re interjecting with “evidence isn’t proof”. Great, everybody already knew that wasn’t proof, all that we’ve really learned from your comment is that you’re kind of rude.
Fallacies can apply to inductive arguments too, but you are right in that there’s an important distinction between the two types and how they differ. I would say that the comment you’re replying to is referring to the idea of informal fallacies in the more non-academic context. The Stanford encyclopedia has a good in-depth page about the term. Also, not all fallacies are equal; appeal to authority may be seen as worse than ad hominem these days.
This thread started with, “Things like this are why I don’t trust Martin’s opinions.” Uncle Bob’s star power (or notoriety), and whether that qualifies as social proof or condemnation, is the point of the discussion, not a distraction.
The point is that Uncle Bob is writing about his own experience with the language that he finds fascinating. He writes about his experience and he gets to choose how he writes it.
I wouldn’t be complaining if he was just sharing a language he liked. The problem is he’s pushing clojure as the best language for (almost) everything. Every language has tradeoffs. We need to know those to make an informed decision. Not only is he not telling us the tradeoffs, he’s saying there aren’t any! He’s either naïve or disingenuous, so why should we trust his pitch?
The problem is he’s pushing clojure as the best language for (almost) everything.
That’s not what he said though. The closest he came to that is:
Building large systems in Clojure is just simpler and easier than in any other language I’ve used.
Note the qualification: ‘… than any other language I’ve used’. This implies there may well be languages which are easier for building large systems. He just hasn’t used them.
Not only is he not telling us the tradeoffs, he’s saying there aren’t any!
He repeated, three times for emphasis, that it doesn’t have static typing. And that it doesn’t give you C-level performance.
Note the qualification: ‘… than any other language I’ve used’. This implies there may well be languages which are easier for building large systems. He just hasn’t used them.
We need to consider the connotations and broader context here. He frames the post with
I’ve programmed systems in many different languages; from assembler to Java. I’ve written programs in binary machine language. I’ve written applications in Fortran, COBOL, PL/1, C, Pascal, C++, Java, Lua, Smalltalk, Logo, and dozens of other languages. […] Over the last 5 decades, I’ve used a LOT of different languages.
He doesn’t directly say it, but he’s really strongly implying that he’s seen enough languages to make a universal judgement. So “than any other language I’ve used” has to be seen in that context.
Nor does he allow special cases. Things like
But what about Javascript? ClojureScript compiles right down to Javascript and runs in the browser just fine.
Strongly connoting that “I’m writing frontend code for the web” is not a good enough reason to avoid Clojure, and he brushes off the lack of “C-level performance” with
But isn’t it slow? … 99.9% of the software we write nowadays has no need of nanosecond performance.
If Clojure is not the best choice for only 0.1% of software, or even 5% of software, that’s pretty darn close to “best language for (almost) everything.”
He repeated, three times for emphasis, that it doesn’t have static typing.
He repeats it as if the reader is hung up on that objection and isn’t listening to his dismissal of it. Note the increasing number of exclamation marks he uses each time. And he ends with
OK, I get it. You like static typing. Fine. You use a nice statically typed language, and I’ll use Clojure. And I’ll be in Scotland before ye.
Combined with his other posts (see “The Dark Path”), he doesn’t see static typing as a drawback. We can infer it as a drawback, but he thinks we’d be totally wrong in doing so.
You have to explain both examples for them to make sense. What does map do? How do you change sqr out for a different function? If you learn the purpose of the snippet, or the semantics of each of the individual elements, you can understand either the J or Clojure example just as well as the other (if your understanding of both languages is equal).
Also the meat of the article is trying to convince the reader to use Clojure (by explaining the syntax and semantics, comparing its syntax to two of the big 5 languages, and rebutting a bunch of strawman arguments - nothing particularly in-depth). I don’t see a balance of pros and cons that would be in a true account of an experience learning and using the language, including more than just a bullet point on the ecosystem, tooling, optimisation, community, etc.
I am sure that any programmer that has any experience in any language would guess that you change sqr out for a different function by typing the name of that other function. For example, you compute exp instead of sqr by, well, typing “exp” instead of “sqr”.
The same with map. Of course someone has to know what a particular function does to be able to use it effectively. The thing with Clojure (and other Lisps) is that it is enough to know that. You don’t need special-case syntax rules. Even expressions with fairly complex semantics are easy to write by following a few basic rules.
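A minimal sketch of that uniformity at a REPL (sqr is not part of clojure.core, so assume it has been defined):
(map sqr (range 25)) ;; squares of 0 through 24
(map inc (range 25)) ;; 1 through 25: only the function name changed
(map str (range 25)) ;; same shape again, now producing strings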
I understand the benefits of the uniformity of Lisp, but my point was just that you can’t really say that (map sqr (range 25)) is any more or less understandable than *: i. 25 if you know the purpose of the expressions and the semantics of their constituent parts. And given that knowledge, you can reasonably make substitutions like exp for sqr or ^: for *: (though I would end up consulting a manual for the exact spelling).
Further experimentation would require more knowledge of either language. For instance, why if isn’t a function in Clojure, or why lists don’t have delimiters in J. It’s all apples and oranges at this superficial level.
My version of Clojure doesn’t define sqr—is that built in?
That aside, I don’t find either version very easy to explain to someone who isn’t already experienced with functional programming. What does “map” mean? How does it make sense that it takes a function as an argument? These seem obvious once you’ve internalized them, but aren’t easy to understand from scratch at all.
If I were reviewing this code, I would suggest they write (for [x (range 25)] (* x x))
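For what it’s worth, sqr isn’t in clojure.core, so the article presumably defines it somewhere; a sketch of the usual spellings:
(defn sqr [x] (* x x))        ;; explicit helper
(map sqr (range 25))          ;; the article's form
(map #(* % %) (range 25))     ;; or inline, with a literal lambda
(for [x (range 25)] (* x x))  ;; the for version suggested above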
Of course one has to understand the semantics of what they’re doing. But in Clojure, and in Lisps generally, it is enough to understand the semantics, while in most other languages one additionally has to master many special-case syntax rules.
Clojure has quite a lot of special syntax compared to many Lisps. For example, data type literals and other reader macros like literal lambdas, def forms, let forms, if forms, and other syntax macros like -> are all built in. Each of these has its own special rules for syntax and semantics.
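A few of those built-ins side by side, for readers who haven’t seen Clojure (all stock syntax, nothing project-specific):
[1 2 3]                                ;; vector literal
{:name "Rich" :year 2007}              ;; map literal
#{:a :b :c}                            ;; set literal
#(* % %)                               ;; literal lambda (reader macro)
(def answer 42)                        ;; def form
(let [x 2] (if (even? x) :even :odd))  ;; let and if forms
(-> 5 inc (* 2))                       ;; -> macro; expands to (* (inc 5) 2), i.e. 12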
We’re on the same page I think, except that I think knowledge of semantics should be enough to understand any language. If you see a verb and a noun in close proximity, you’d be able to make a good guess as to what’s happening regardless of the glyphs representing their relationship on screen.
If you want a language that emphasizes semantics over syntax, then APL is the language for you! There are just a few things to understand about syntax, in order of importance.
Array literals
Numeric arrays are just numbers separated by spaces. Negative numbers are prefixed with ¯. Some dialects have special-case syntax for complex or rational numbers: 42 3.14 1J¯4
Character arrays are just text delimited by '' quotes. Doubling the quote inside an array escapes it: 'is' or 'isn''t'
Array indexing with [] braces: 'cafe'[3 2 1 4] ←→ 'face' (Many APLers have a disdain for this form because it has some inconsistency with the rest of the language.)
Function definitions
Inline anonymous “direct” functions delimited by {} braces.
Traditional named functions defined by the ∇ form.
Statement sequencing with ⋄ (Mainly useful for jamming more code into a single line)
From there, the grammatical rules are simple and natural in the form of verb noun or noun verb noun or verb adverb noun etc. Probably the most difficult thing to learn and remember is that there is no operator precedence and evaluation reduces from right-to-left.
When I’m programming in APL, I rarely think about the syntax. When I’m programming in Clojure, syntax is often a concern. Should I use map or for? Should I nest these function calls or use ->?
None of those are syntax. map is a function and the rest are macros. They’re all inside the existing Clojure syntax.
https://clojure.org/reference/macros
True enough. However, at least in Clojure, macros are pretty deliberately limited so as not to allow drastically changing the look-and-feel of the language. So I’m pretty sure every macro you’ll come across (except I guess reader macros) will have the same base syntax, (a b ...).
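A small sketch of that point: the choices below differ in taste, not in syntax class, and the threading macro is just a rewrite of the nesting (sqr assumed defined as before):
(map sqr (range 25))          ;; plain function call
(for [x (range 25)] (sqr x))  ;; the for macro, same (a b ...) shape
(->> (range 25) (map sqr))    ;; thread-last macro; expands back to (map sqr (range 25))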
This is surprising. A famous TDD evangelist has decided that Clojure is his favorite language. Clojure is the least TDD-friendly language I have ever used. The startup times make a traditional edit-compile-test loop impossible to use: instead you must keep a REPL running, which gradually accretes state. It’s the “works on my machine” problem but even worse. So you have to restructure your application and use something like clojure.tools.namespace.repl or one of the myriad libraries that wrap it in an attempt to clean the slate between test runs. It will mostly work, until you do a large refactor and hit an edge case.
The Clojure community - as a gross generalization - prefers interactive REPL-based development to testing. Not much effort is put into supporting a TDD workflow because it isn’t the preferred workflow. If you complain about startup times, you will be told that it doesn’t matter because you should have a REPL running all the time, with an editor integration.
I would love to know how Martin has managed to reconcile TDD with Clojure.
Clojure is great for TDD. clojure.test isn’t part of the core language (which is small), but it ships with Clojure (https://clojure.github.io/clojure/clojure.test-api.html) and is both well-maintained and a pleasure to use.
You run your tests in the REPL, there’s nothing weird about that, it’s awesome!
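A minimal sketch of that REPL workflow with clojure.test (the namespace and function names here are made up for illustration):
(ns example.core-test
  (:require [clojure.test :refer [deftest is run-tests]]))

(defn sqr [x] (* x x))  ;; code under test, inlined to keep the sketch self-contained

(deftest sqr-test
  (is (= 4 (sqr 2)))
  (is (= [0 1 4] (map sqr (range 3)))))

(run-tests 'example.core-test)  ;; evaluate this form in the REPL after each change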
You run your tests in the REPL, there’s nothing weird about that, it’s awesome!
I have found running tests in the REPL to be awful. If you approach Clojure naively, i.e. like any other language, then you will very quickly be SLIMEd. Here is a paragraph from the leiningen tutorial:
Keep in mind that while keeping a running process around is convenient, it’s easy for that process to get into a state that doesn’t reflect the files on disk—functions that are loaded and then deleted from the file will remain in memory, making it easy to miss problems arising from missing functions (often referred to as “getting slimed”). Because of this it’s advised to do a lein test run with a fresh instance periodically in any case, perhaps before you commit.
You can’t run lein test every time you make a change because it takes so long to compile and run. So you have to run your tests in a continuously running REPL, thus risking getting slimed. Not being able to trust the tests makes TDD awful.
People try to work around this by carefully structuring their applications using component or mount or similar, and relying on clojure.tools.namespace.repl to tear down their namespaces cleanly and rebuild them from scratch. Apart from the silliness of having to do all these workarounds just to run tests quickly, it doesn’t even work reliably. Certain types of refactoring can still result in being slimed.
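A sketch of the workaround being described, assuming org.clojure/tools.namespace is on the classpath and the test namespaces follow a hypothetical myapp.*-test naming convention:
(require '[clojure.tools.namespace.repl :refer [refresh]])

(refresh)                                      ;; reload every namespace that changed on disk
(clojure.test/run-all-tests #"myapp\..*-test") ;; then re-run the matching test namespaces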
I agree that clojure.test is quite nice… if only there were a quick and reliable way to actually run the tests!
I don’t find it awful, but it’s likely I have a different workflow or have already adapted my workflow in Clojure to work around this weakness. I’ll usually go 30 minutes to an hour between running ‘all the tests’ (which gets a clean lein test in a new console as a precaution).
One thing I do frequently is eval-print-last-sexp into my editor (I’ve bound this to spacebar), verify the output by reading it, and then slap deftest around the resulting tuple. I find this so ridiculously great for testing that I’m willing to put up with a lot of inconvenience for it. (and it is far and away not the greatest thing about clojure).
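Roughly what that looks like, as I read the description (frobnicate is a placeholder for whatever function is under test, and clojure.test is assumed to be referred):
(frobnicate {:id 7})
;;=> {:id 7, :status :ok}   evaluated in the editor, output read and verified by hand

(deftest frobnicate-returns-ok
  (is (= {:id 7, :status :ok}   ;; the observed value, frozen into an assertion
         (frobnicate {:id 7}))))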
I always get down-voted as “incorrect” when I recount personal experience or ask questions… it must be the way I phrase it. Oh, well.
I would love to know more about your workflow, if you have it documented anywhere. I have asked a few experienced Clojure devs about this and they usually give an answer similar to yours, i.e. they don’t suffer from sliming very much but they aren’t sure why. It seems to be something that affects some people really badly and other people not so much… I would like to work out why that is.
I run CIDER in an emacs process running as a daemon. I connect with emacsclient.
I refresh my CIDER instance pretty regularly. I initiate it directly from the .clj file I’m editing (which ensures the project is configured right and everything is fresh). I’ll do this as often as every file I edit (Sliming is no joke, I don’t disagree with you there).
I’ve done this for small projects (in the scale of projects), but part of that is that it’s possible to do so much in clj with less code (10X code size vs Java is thrown around a lot), so my “small” 3kloc Clojure projects with this workflow pack the punch of a 30kloc Java app (I’ve seen whole companies run off less).
I’ve gotten paid for these “small” projects and have run them happily in a production environment for years with this workflow (letting you know I’m not comparing apples to oranges or talking about toys… though, I have made some fun toys in clojure). I haven’t had the opportunity to work on a giant clojure project.
I honestly don’t get these people who claim to run stable repls for days or weeks at a time without restarting… I end up rebooting my emacs instance around once a day due to stability issues there (and the fact that one mistake in an interactive environment can take down the rest).
I don’t think the environments themselves are unstable… I just find that I’m prone to end up making them do too much! (Printing a larger than expected Clojure data structure can take down a REPL for minutes or hours before it runs out of heap! So I find myself frequently ‘refreshing’ my environment for that reason alone).
It could be that my style in this case (which results in accidentally thunking a 100mb string into my editor) is working in my favor too… I honestly don’t know (just that hurting myself with sliming is rare and that I find my productivity gains way outpace the risks).
Thank you for taking the time to write that! I know Clojurists who keep REPLs running for weeks and I just don’t understand how they avoid being slimed. Your workflow seems much less likely to run into those issues.
I think your extra layer of indirection - using emacs as a daemon and connecting with emacsclient - might also help. I tend to work from the shell, opening my editor to edit a file and then closing it again, rather than leaving the editor going for a long time. I found that really doesn’t work very well for REPL-based development with an editor integration, but maybe it would work if it were emacsclient that I was starting and stopping all the time and emacs itself kept going in the background.
I might have to give that a go sometime. Thank you very much for sharing.
This is basically the post I was planning to write this weekend for my own blog, just expressed better.
The one thing I’m surprised he doesn’t mention, and I think is hugely important, is how Clojure is designed as a hosted language, meaning not only Java interop is trivial, but also JS interop if you want to run CLJS in your browser, or on Electron, or on React Native (and also make use of the wealth of JS libraries, for better or worse). And if you have yet other requirements, GraalVM seems to be promising great performance, minimal overhead, and near-zero startup time (e.g. for CLI apps). Or run it on the CLR (the C# runtime), maybe even inside Unity for game development (see Arcadia).
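To make the “Java interop is trivial” point concrete, here are a few plain-JVM-Clojure forms that need no wrappers or extra libraries:
(.toUpperCase "clojure")             ;; instance method call => "CLOJURE"
(System/getProperty "java.version")  ;; static method call
(java.util.UUID/randomUUID)          ;; any JDK class is reachable by its full name
(new java.util.ArrayList [1 2 3])    ;; constructors too; a Clojure vector satisfies java.util.Collection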
In short, it’s hugely flexible, and a lot of the tools and libraries work across most of those platforms. I’m currently writing Clojure professionally, and I don’t plan to change that. It’s not perfect, but it’s very close to it.
I’m not familiar with this Robert C. Martin guy. This strong advocacy of Clojure seems odd, so I read some more of his posts to try and get an idea of where he’s coming from. It seems that a lot of his position is based on being an advocate of strict TDD. I develop mostly in dynamic languages nowadays (Ruby and Python), so I write a lot of tests. Maybe I’m not really well-practiced with TDD, but honestly, I just don’t get it.
IMO, a relatively small number of tasks and situations lend themselves to strict TDD, as far as actually writing tests first. Having really strict unit test coverage, enough to catch any bug at all, is IMO impossible in any project that does anything significant. Having enough tests to cover every branch and every possible error in every situation leads to having a ridiculously huge test suite that takes forever to run, and tends to blow up from any change at all and require much more work to sort out the tests than the actual change did. I don’t seem to hear many satisfying answers from TDD advocates on these issues.
I worked for most of my career in languages with strong type systems, mostly C# and a little C++. I did grow weary of how much boilerplate was often required around defining objects, and looked forward to working with more dynamic languages. Now, after working with dynamic languages for a few years, I feel like I have more appreciation for how many errors strong types prevent, and how inadequate even large and involved test suites are at really preventing them all. I have also observed how dynamic languages tend to grow static bits after a while (TypeScript, optional types, etc), and static languages tend to grow dynamic bits (reflection, implicit types, the C# dynamic keyword). I think now the ideal language must be somewhere in between, but I’m not sure exactly where yet.
I’m a big fan of TDD. I find that if I don’t write tests first, it’s too tempting to just not write them at all. I combine it with up-front design, property-based testing, and static analysis though.
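For instance, a property-based test with test.check (assuming org.clojure/test.check is on the classpath) slots in next to ordinary example-based tests:
(require '[clojure.test.check.clojure-test :refer [defspec]]
         '[clojure.test.check.generators :as gen]
         '[clojure.test.check.properties :as prop])

;; 100 random trials: reversing a vector twice gives back the original
(defspec reverse-twice-is-identity 100
  (prop/for-all [v (gen/vector gen/small-integer)]
    (= v (reverse (reverse v)))))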
It depends what you mean by that. I mostly have no trouble writing tests later on, when the project comes into focus enough that I can see the value they bring more clearly. I’ve seen a drag come from when the application is architected in such a way that it is difficult to set up tests though. I’ve also heard the claim that much of the benefit from TDD comes from designing applications in such a way that they’re easy to write tests for. Maybe that’s what I’m really doing - just automatically designing things with testing in mind, which is what really makes them better.
I am not a fan of “Uncle Bob” but I do think TDD is a good idea.
Having really strict unit test coverage, enough to catch any bug at all…
This is really not the point of TDD. Not at all.
Because I’m lazy, and other people have explained these things better than I probably can, I’ll just point you at one blog post and one video of a talk. I think both are very good and helped me to understand TDD better:
https://michaelfeathers.typepad.com/michael_feathers_blog/2008/06/the-flawed-theo.html
https://vimeo.com/68375232
If that last link doesn’t provide satisfying answers, I’d be happy to have a go.
That isn’t really the critical point of what I wrote though. It was a point, not an assertion that that is what TDD is all about. If I had to boil it down to a single point, it would probably be:
Now I feel like I have more appreciation for how many errors strong types prevent, and how inadequate even large and involved test suites are at really preventing them all.
I made that assertion to counteract the main article’s point about Lisp being the ultimate language for everything, despite having no compile-time type enforcement AFAIK.
That’s what I never see an answer to. Anytime I ask, people just point me at more blog posts to read, but I feel like I’ve read and watched plenty already. It feels like TDD is a religion sometimes, where nobody can give you a straight answer on anything, just point you back to the “holy books” and tell you to read them more closely.
I think the most helpful article I’ve found on testing is Eevee’s https://eev.ee/blog/2016/08/22/testing-for-people-who-hate-testing/. That’s practical advice on how to write more and better tests without getting bogged down in pseudo-religious TDD jargon or huge and messy test suites that cause more problems than they solve.
In this particular case, I feel the links I suggested would actually have helped. But I’m happy to just post my own synthesis here if you’d prefer. The problem is that you are conflating different issues. You boiled your point down to this:
Now I feel like I have more appreciation for how many errors strong types prevent, and how inadequate even large and involved test suites are at really preventing them all.
Which is a totally valid point. I agree with it: my experience has led me to the same conclusion. It also has nothing to do with TDD.
TDD is not about creating “large and involved test suites”. In fact, in my experience, TDD generally leads to fewer tests. TDD is a design methodology. It is a good methodology but not the only good methodology. It works well on its own but can also be used together with other methodologies.
If you think that TDD is about enormous test suites that cover every possible eventuality then you have been misinformed.
Rust and others have closures and other functional stuff too. However, Rust is statically typed, as fast as C/C++, memory safe, and able to infer types. I’m no expert, but they seem to have learned a lot from Lisp and other older duck-typed languages.
Here’s a Rust closure. One line:
let square = |x| (0..x).for_each(|x| println!("{}", x * x));
Here’s the Clojure example in the article: (map sqr (range 25)). To break the Rust down, built up a piece at a time:
let square = |x|
Side-note: if you wanted to do more than one arg you could do something like |x, y, z|.
let square = |x| (0..x)
let square = |x| (0..x).map(|x| x * x);
But we want to print with side-effects, so instead:
let square = |x| (0..x).for_each(|x| println!("{}", x * x));
To call our closure: square(25); or whatever you want x to be. You can run this example on the Rust playground in your browser.
You can also set up the above like a regular ol’ named function, as shown in the playground link, and closures and functions can be passed around like any other variable.
Again, Rust is statically typed, but the types are inferred in the example. That saves keystrokes in a lot of places, and you can be fully explicit if you want. Rust eliminates the need for unit tests written just to make sure you have the types correct, as you’d need in Clojure. Did I mention it’s fast like C/C++ but memory safe!?
However, some duck-typed languages have properties that Rust will never have. Like Forth. It’s in a different game from Rust or Clojure altogether.
let square = |x| (0..x).for_each(|x| println!(”{}”, x * x));
Just nitpicking, but that’s not a closure. It’s an anonymous procedure (or “lambda”). Closures are procedures which have free variables which are looked up in the surrounding lexical scope. One says they “close over” the lexical environment, hence the name.
It’s a closure. But it’s not a problem. Even I had to double check if I could call it a closure before making the original comment. I could have done this:
let y = 3;
let multiply = |x| -> () { (0..x).for_each(|x| println!("{}", x * y )) };
multiply(7);
I captured y.
On the other hand, this won’t work:
let y = 3;
fn multiply_func(x: i32) -> () {
(0..x).for_each(|x| println!("{}", x * y))
}
multiply_func(7)
Here’s the above all nice and neat on the playground.
EDIT:
I could have done the closure above without being explicit with scopes or return type.
let multiply = |x| (0..x).for_each(|x| println!("{}", x * y ));
Thanks for clarifying!
No, thank you. Others were probably scratching their heads too.
I’ve come to the same conclusion: my favourite language is also Lisp. Not Clojure like the title says, but Lisp, like the text says.
I’m almost sure I’ll never do anything beyond toys with Lisp, though. I wonder if Martin has.
Out of curiosity, what is your preferred flavor of lisp?
Racket, and I also like some aspects of Common Lisp and Chicken. My opinion on this is nearly worthless, due to lack of actual usage.
I recently came across this interesting claim that the median Clojure programmer is the second-highest paid in the US, by language. Beaten only by Scala. Would not have expected that.
https://www.techrepublic.com/article/developer-pay-heres-how-salaries-rise-with-experience-across-programming-languages/
[Comment removed by author]
| Does anyone else find this treadmill exhausting?
Then don’t ride it. No one is forcing you to try new tools. However, an acquaintance of mine once remarked that you should try toolchains like you’d try restaurants: don’t go somewhere empty, do try new places, but rely on places you and others have found to be dependable. I think that’s good advice.
On the contrary, plenty of popular things are garbage.
In fact, many wise travellers these days will look up what’s popular on TripAdvisor, and make a conscious effort to go nowhere near any of those places.
I learned more about how JavaScript works from playing with Io (which nobody uses) than any amount of building things with React. That isn’t to say React is garbage — it isn’t — but popularity is seldom a good indicator of quality.
What is “Io”?
Another language with prototype-based inheritance.
It’s this little language. I’m pleased to see they even redesigned their website in the few years since I last looked!
I say Io taught me more about JavaScript because Io is a prototypal language. JavaScript [and I know you know this, but for everyone else’s benefit] is also a prototypal language, despite how much its users try to deny it.
Oh right, thanks. I did read about it a long time ago but completely forgot about it, even Google just gave me socket.io when I tried to figure out what it means in this context.
Unfortunately, this seems to be the trend. I remember JS resources trying to convince me of prototypal inheritance, and I actually bought into it and took advantage of it. Nowadays it feels like it got swept under the carpet. I work with dozens of developers who unequivocally love JavaScript and go on and on about “vanilla JS” and TypeScript, and none of them have heard of the prototype chain.
Eventually I ended up in one extreme in which I gave up on writing mostly functional-style JS, and switched to languages like OCaml which compile to really good JS and let me program the way I want without fighting the ecosystem. I only tap into OOP features for the API surface, when I want to provide a “fluent interface” because JS still doesn’t have a good way of chaining function calls (such as a pipe operator).
JS/TS I’m asked to write professionally ended up in the other extreme, where the linter yells at me for not using ES6 classes for pretty much everything, and most of it looks like
class SomeEndpointHandlerFactory implements IDependencyInjectionFactory, IConfigsProvider, ILogger {
/* cue two thousand lines of private methods */
}
Are you arguing for Sturgeon’s law and the popularity of mediocrity there? I agree (on that other forum, I replied “a billion flies can’t be wrong” when someone was making an appeal to popularity on a different topic). The point I was trying to make is that rejecting something simply because it’s new (though specifically Clojure is not new and the Lisp heritage is old) is a bad idea just as adopting something simply because it’s popular is a bad idea. It closes you off from experiences.
When it comes to food my attitude is “it’s one meal”. So if I didn’t like it or it didn’t measure up to my expectations, my sense of investment is fairly low. For programming languages that investment is greater, but a couple of focused evenings puttering to decide if I like something seems reasonable, since I might find something I like or something that has the potential to change the way I work.
I think you have the right approach, and I absolutely agree that programmers should be making as many low-risk investments (in terms of time invested in researching and/or evaluating some technology) as they can afford.
The issue here is that — as @hwayne pointed out earlier — Robert Martin does not evaluate technology with anything near what any of us could reasonably call a balanced judgement. He’s certainly not willing to learn about types, for example, as he seems too invested (emotionally or otherwise) in railing against them. Rich Hickey is guilty of this sometimes too.