
  1. 5

    This speaks to me on a fundamental level. There are definitely programming languages that just click for me, and ones I've had to work hard to understand.

    For example:

    • Forth took a lot of work but was very rewarding. It took on a new meaning when I learned Factor and started to pay closer attention to combinators.
    • Lisp/Scheme were both great introductions to metaprogramming for me. I didn't really get what was possible with metaprogramming before I used them.
    • Standard ML/OCaml/F# were where I first grokked functional programming. They were also where I first grokked types.
    • Rust has been this way too, though slowly. I was turned off by earlier incarnations of Rust's syntax, but recently I've been really enjoying it. I think Rust might be the first language in many years that just feels natural to me.
    1. 4

      I wonder how much of a role interactivity plays here. Scheme and Smalltalk are both deeply interactive environments where you can have a conversation with the language. This makes learning the language both enjoyable and engaging.

      1. 3

        Excellent. Languages are vehicles for thought, and everything about them has a big impact on what thoughts you have. Syntax matters, semantics matters, paradigms matter. It’s never just one thing. Great languages manage to be more than the sum of their parts.

        1. 2

          The funny thing for me is that I don't even use the one that awed me most: LISP, with its unprecedented flexibility, incremental compilation per function, and live updates. One could do regular coding about as fast as they could think, with the resulting code running way faster than scripting languages. It was too weird to fully embrace, though, with the BASIC- and Pascal-style languages compiling super fast as well. So I just used their implementations where possible.

          I still plan to have another go at fully getting Lisps or FP. Probably Racket, for the DSLs.

          1. 1

            Do Lisps try to only load the functions that have changed? I wondered how they managed to be a step beyond typical REPL workflows.

            1. 3

              Not in the way I think you mean, as an incremental-compilation system. That should be possible to build, but none of the commonly used ones that I can think of are actually built that way.

              There is at least more attention to being explicit about the relationship between the currently running image and the (possibly changed) source code though. As a very simple example, there are two Common Lisp forms that introduce global variables: defvar introduces a variable and assigns a default value to it only if it was not already introduced, while defparameter is the same but initializes it unconditionally. So if you reload a file with a defvar it will just silently note the declaration of an existing variable but not clobber it, while a defparameter will deliberately clobber it.
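
              A minimal sketch of the difference (the variable names are just for illustration):

              ```lisp
              ;; defvar initializes only if the variable is not already bound,
              ;; so reloading this file leaves an existing *cache* untouched:
              (defvar *cache* (make-hash-table))

              ;; defparameter initializes unconditionally, so reloading this
              ;; file deliberately resets *log-level* back to :debug:
              (defparameter *log-level* :debug)
              ```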

              As another example, the Common Lisp equivalent of exceptions, the condition system, is entirely designed around the idea that you might want to fix an exception by reloading or redefining something, but only lazily once you’ve noticed it failed. That allows for all kinds of workflows that are odd by the standards of other languages, based around what ‘restart’ you choose from a condition (either manually or automatically). The differences are substantial, but the first difference a programmer will normally notice is telling in itself: when a Java program throws an uncaught exception, it terminates with a stacktrace, but when Common Lisp raises an uncaught condition, it pauses and displays a multiple-choice menu asking the user which restart to take.
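
              A rough sketch of that workflow (the function and the values are my own hypothetical example):

              ```lisp
              (defun parse-entry (line)
                ;; restart-case attaches named recovery strategies to any
                ;; error signaled in its body.
                (restart-case (parse-integer line)
                  (use-value (v)
                    :report "Supply a value to use instead."
                    v)))

              ;; A handler can choose a restart automatically, without
              ;; unwinding; interactively, the debugger offers the same menu.
              (handler-bind ((parse-error (lambda (c)
                                            (declare (ignore c))
                                            (invoke-restart 'use-value 0))))
                (mapcar #'parse-entry '("1" "oops" "3")))  ; => (1 0 3)
              ```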

              1. 1

                The environment I had came with a REPL to interpret functions, a full compile, and an incremental compile. The first two worked about like you'd expect, I guess. The REPL skipped the compile step but maybe ran slower; it's been a while, so I can't remember. That's usually the case for REPLs, though. The compiler took longer, but the resulting code ran really fast. The cool part was the incremental compile: I could change the definition of one function, hit a particular key combo, and it would compile just that one function and load it into the live image for immediate testing.

                The incremental compiles took a fraction of a second. The first compile might take longer, especially if you'd been REPLing or were adding optional types. From there, it was a quick compile per new module or function change that took no time at all. That meant I had no need to use the REPL for iteration, since I could get highly optimized code without a long wait breaking my flow. From then on, both incremental compiles and live images are features I'd request for any language.
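
                Modern Common Lisp environments still work roughly this way (SLIME, for instance, compiles a single defun with one key combo). As a sketch, using the standard compile function on a hypothetical function:

                ```lisp
                ;; Original definition, compiled into the running image:
                (defun score (x) (* x 2))
                (compile 'score)

                ;; Edit and recompile just this one function; every caller
                ;; sees the new definition immediately, no restart needed:
                (defun score (x) (* x 3))
                (compile 'score)
                ```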

                Being able to save the running state of one was cool, too, for when you trusted it up to a certain point with certain data/state in it but then wanted to test a lot of behaviors. It was to exploratory testing what an NES emulator with save/reload is to speedruns. ;) I bet it could be even more useful in mocking up and testing algorithms for distributed systems.
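
                SBCL, for example, exposes exactly this, dumping the running image to disk so it can be resumed later:

                ```lisp
                ;; Dump the whole running image, heap state and all:
                (sb-ext:save-lisp-and-die "snapshot.core")

                ;; Later, restart exactly where you left off:
                ;;   sbcl --core snapshot.core
                ```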