1. 28
  1. 2

    TIL Common Lisp has first and rest as aliases for car and cdr. I guess this was added as a minor change to be more approachable? Seems like kind of a pointless feature to be honest.

    1. 5

      I’ve often heard that first (second, third, …) and rest should be used by default and that c[ad]*r are just part of CL’s heritage, and are provided as legacy functions.

      1. 8

        After seeing how much my TLA+ students struggled with using /\ instead of &&, I’ve come to the opinion that any unnecessary naming differences should be ruthlessly purged.

        1. 3

          That struggle only lasts for a fraction of a generation — later generations may be confused why there’s & and && instead of and.

          1. 1

            As someone who somewhat-recently was a TLA+ student myself: the difference between those two is not unnecessary, because it avoids the classic student problem of “these symbols look similar, therefore the ideas must be similar” - /\ in TLA+ is different than && in JavaScript and C. It was very helpful for me to use the former instead of the latter.

          2. 8

            Not really. first, rest, etc. should be used when dealing with lists. Cons cells have more uses than constructing lists (deques, trees, alists, etc.). In those scenarios it is preferred to use car/cdr.
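
            A quick sketch of the distinction (the variables here are made up for the example):

            ;; A proper list reads naturally with FIRST/REST:
            (defvar *xs* (list 1 2 3))
            (first *xs*) ; => 1
            (rest *xs*)  ; => (2 3)

            ;; An alist entry is a pair, not a list, so CAR/CDR say what you mean:
            (defvar *entry* (cons :x 10))
            (car *entry*) ; => :X  (the key)
            (cdr *entry*) ; => 10  (the value)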

        2. 2

          First-class packages are the most underrated feature of lisp. AFAIK only perl offers it fully, but it uses very bad syntax (globs). Most macros merely suppress evaluation and this can be done using first-class functions. Here is my question for lispers: if you can use lex/yacc and can write a full-fledged interpreter, do you really need macros?
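
          To be clear, "first-class" here means a package is an ordinary runtime object you can create, pass around, and query; a minimal sketch:

          ;; Packages are plain objects, not compile-time-only namespaces:
          (defvar *pkg* (make-package :sandbox :use '(:cl)))
          (intern "X" *pkg*)      ; => SANDBOX::X
          (package-name *pkg*)    ; => "SANDBOX"
          (find-package :sandbox) ; => the same package object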

          1. 7

            Most macros merely suppress evaluation and this can be done using first-class functions.

            I strongly disagree with this. Macros are not there to “merely suppress evaluation.” As you point out, they’re not needed for that, and in my opinion they’re often not even the best tool for that job.

            “Good” macros extend the language in unusual or innovative ways that would be very clunky, ugly, and/or impractical to do in other ways. It’s in the same vein as asking if people really need all these control flow statements when there’s ‘if’ and ‘goto’.

            To give some idea, cl-autowrap uses macros to generate Common Lisp bindings to C and C++ libraries using (cl-autowrap:c-include "some-header.h"). Other libraries, like “iterate”, add entirely new constructs or idioms to the language that behave as if they’re built-in.
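
            As a toy illustration (not from either library; the macro is made up for the example): CL has no built-in while loop, but two lines add one that behaves as if it were.

            (defmacro while (test &body body)
              `(loop while ,test do ,@body))

            ;; Used like any built-in control-flow construct:
            (let ((n 3))
              (while (> n 0)
                (print n)
                (decf n)))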

            Here is my question for lispers: if you can use lex/yacc and can write a full-fledged interpreter, do you really need macros?

            Lex/Yacc and CL macros do very different things. Lex/Yacc generate parsers for new languages that parse their input at runtime. CL macros emit CL code at compile time which in turn gets compiled into your program.

            In some sense your question is getting DSLs backwards. The idea isn’t to create a new language for a special domain, but to extend the existing language with new capabilities and operations for the new domain.

            1. 1

              Here are examples of using lex/yacc to extend a language:

              1. Ragel, which compiles state machines to multiple languages
              2. Swig, which does something like autowrap
              3. The Babel compiler, which uses parsing to add features on top of older JavaScript, like async/await

              I am guessing all these use lex/yacc internally. Rails uses scaffolding and provides helpers to generate JS code at compile time. Something like parenscript.

              The basic property of a macro is to generate code at compile time. Granted, most of these are not built into the compiler, but nothing is stopping you from adding a new pre-compile step with the help of a makefile.

              Code walking is difficult in lisp as well. How would I know if an expression is a function or a macro? If I wanted to write a code highlighter in vim that highlights all macros differently, I would have a difficult time doing this by parsing alone, even though lisp is an easy language to parse.

              1. 5

                Code walking is difficult in lisp as well. How would I know if an expression is a function or a macro?

                CL-USER> (describe #'plus-macro)
                #<CLOSURE (:MACRO PLUS-MACRO) {1002F8AB1B}>
                  [compiled closure]

                Lambda-list: (&REST SB-IMPL::ARGS)
                Derived type: (FUNCTION (&REST T) NIL)
                Documentation:
                  T
                Source file: SYS:SRC;CODE;SIMPLE-FUN.LISP
                ; No value
                CL-USER> (describe #'plus-fn)
                #<FUNCTION PLUS-FN>
                  [compiled function]

                Lambda-list: (A B)
                Derived type: (FUNCTION (T T) (VALUES NUMBER &OPTIONAL))
                Source form:
                  (LAMBDA (A B) (BLOCK PLUS-FN (+ A B)))
                ; No value

                You underestimate the power of the dark side Common Lisp ;)
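
                More directly, the standard function MACRO-FUNCTION answers the question in one call; it returns non-NIL exactly when a symbol names a macro:

                (macro-function 'with-open-file) ; => a function object: it's a macro
                (macro-function 'mapcar)         ; => NIL: it's not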

                In other words … macros aren’t an isolated textual tool like they are in other, less powerful, languages. They’re a part of the entire dynamic, reflective, homoiconic programming environment.

                1. 2

                  I know that, but can you do the same with parsing alone, without using the lisp runtime?

                  1. 3

                    I’m not sure where you’re going with this.

                    In the Lisp case, a tool (like an editor) only has to ask the Lisp environment about a bit of syntax to check if it’s a macro, function, variable, or whatever.

                    In the non-Lisp case, there’s no single source of information, and every tool has to know about every new language extension and parser that anybody may write.

                    1. 1

                      I believe their claim is that code walkers can provide programmers with more power than Lisp macros. That’s some claim, but the possibility of it being true definitely makes reading the article they linked ( https://mkgnu.net/code-walkers ) worthwhile.

                    2. 2

                      Yes. You’d start by building a Lisp interpreter.

                      1. 1

                        … a common lisp interpreter, which you are better off writing in lex/yacc. Even if you do that, each macro defines new ways of parsing code, so you can’t write a generic highlighter for loop-like macros. If you are going to write a language interpreter and parser anyway, why not go the most generic route of lex/yacc and support any conceivable syntax?

                        1. 5

                          I really don’t understand your point, here.

                          Writing a CL implementation in lex/yacc … I can’t begin to imagine that. I’m not an expert in either, but it seems like it’d be a lot of very hard work for nothing, even if it were possible, and I’m not sure it would be.

                          So, assuming it were possible … why would you? Why not just use the existing tooling as it is intended to be used???

                          1. 2

                            That’s too small of a problem to demonstrate why code walking is difficult. How about these, then:

                            1. Count the number of s-expressions used in the program
                            2. Show the number of macros used
                            3. Show number of lines generated by each macro and measure line savings
                            4. Write a linter which enforces stylistic choices
                            5. Suggest places where macros could be used for minimising code
                            6. Measure code complexity, coupling analysis
                            7. Write a lisp minifier, obfuscator
                            8. Find all places where garbage collection can be improved and memory leaks can be detected
                            9. Insert automatic profiling code for every s-expression and list out where the bottlenecks are
                            10. Write code refactoring tools.
                            11. List most used functions in runtime to suggest which of them can be optimised for speed

                            Ironically, the above is much easier to do with assembly.

                            My point is simply this: lisp is only easy to parse superficially. Writing the above will still be challenging. Writing lexers and parsers is better at code generation, and hence at macros in the most general sense. If you are looking for power, then code walking beats macros, and that’s also doable in C.

                            1. 1

                              While intriguing, it would be nice if the article spelled out the changes made with code walkers. Hearing that a program ballooned 9x isn’t impressive by itself. Without knowing about the nature of the change it just sounds bloated. (Which isn’t to say that it wasn’t valid, it’s just hard to judge without more information.)

                              Regarding your original point, unless I’m misunderstanding the scope of code walkers, I don’t see why it needs to be an either/or situation. Macros are a language-supported feature that do localized code changes. It seems like code walkers are not language-supported in most cases (all?), but they can do stateful transformations globally across the program. It sounds like they both have their use cases. Just as lispers talk about using macros only if functions won’t cut it, maybe you only use code walkers if macros won’t cut it.

                              BTW, it looks like there is some prior art on code walkers in Common Lisp!

                              1. 1

                                Okay, I understand your argument now.

                                I’ll read that article soon.

                                1. 6

                                  “That’s two open problems: code walkers are hard to program and compilers to reprogram.”

                                  The linked article also ends with something like that. It supports your argument, given that macros are both already there in some languages and much easier to use. That there are lots of working macros out there in many languages supports it empirically.

                                  There’s also nothing stopping experts from adding code walkers on top of that. Use the easy route when it works. Take the hard route when it works better.

                                  1. 6

                                    Welcome back Nick, haven’t seen you here in a while.

                                    1. 4

                                      Thank you! I missed you all!

                                      I’m still busy (see profile). That will probably increase. I figure I can squeeze a little time in here and there to show some love for folks and share some stuff on my favorite tech site. :)

                        2. 1

                          That kind of is the point. Lisp demonstrates that there is no real boundary between the language as given and the “language” its users create by extending it and creating new functions and macros. That being said, good lisp usually follows conventions so that you may recognize whether something is a macro (eg. with-*) or not.

                      2. 1

                        Here are examples of using lex/yacc to extend a language

                        Those are making new languages: they require new tooling and don’t work with the language’s existing tooling. If someone writes Babel code, it’s not JavaScript code anymore - it can’t be parsed by a normal JavaScript compiler.

                        Meanwhile, Common Lisp macros extend the language itself - if I write a Common Lisp macro, anyone with a vanilla, unmodified Common Lisp implementation can use it, without any additional tooling.

                        Granted most of these are not built into the compiler but nothing is stopping you adding a new pre-compile step with the help of a make file.

                        …at which point you have to modify the build processes of everybody that wants to use this new language, as well as breaking a lot of tooling - for instance, if you don’t modify your debugger, then it no longer shows an accurate translation from your source file to the code under debugging.

                        If I wanted to write a code highlighter in vim that highlights all macros differently I would have a difficult time doing this by parsing alone even though lisp is an easy language to parse.

                        Similarly, if you wanted to write a code highlighter that highlights defined functions differently without querying a compiler/implementation, you couldn’t do it for any language that allows a function to be bound at runtime, like Python. This isn’t a special property of Common Lisp, it’s just a natural implication of the fact that CL allows you to create macros at runtime.

                        Meanwhile, you could capture 99.9%+ of macro definitions in CL (and function definitions in Python) using static analysis - parse code files into s-expression trees, look for defmacro followed by a name, add that to the list of macro names (modulo packages/namespacing).
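
                        A minimal sketch of that static approach (deliberately approximate: it misses macros defined at runtime or by macro-defining macros, and READ can trip on symbols from packages it doesn't know):

                        ;; Collect names defined by top-level DEFMACRO forms in a file.
                        (defun collect-defmacro-names (pathname)
                          (with-open-file (in pathname)
                            (loop for form = (read in nil in)
                                  until (eq form in)
                                  when (and (consp form) (eq (first form) 'defmacro))
                                    collect (second form))))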

                        tl;dr “I can’t determine 100% of source code properties using static analysis without querying a compiler/implementation” is not an interesting property, as all commonly used programming languages have it to some extent.

                        1. 1

                          If you can use lex/yacc and can write a full-fledged interpreter, do you really need macros?

                          I don’t know why you’d think they are comparable. The amount of effort to write a macro is way less than the amount of effort required to write a lexer + parser. The fact that macros are written in lisp itself also reduces the effort needed. But most importantly, one is an in-process mechanism for code generation and the other involves writing the generated code to a file. The first mechanism makes it easy to iterate on and modify the generated code. Given that most of the time you are maintaining, hence modifying, code, I’d say that is a pretty big difference.
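
                          For instance, the generated code is always one in-process call away (the macro here is a toy, made up for the example):

                          (defmacro with-logging ((name) &body body)
                            `(progn
                               (format t "entering ~a~%" ',name)
                               (prog1 (progn ,@body)
                                 (format t "leaving ~a~%" ',name))))

                          ;; Inspect the expansion at the REPL, tweak the macro, expand again:
                          (macroexpand-1 '(with-logging (step-1) (+ 1 2)))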

                          The babel compiler uses parsing to add features on top of older javascript like async/await.

                          Babel is an example of how awful things can be when macros happen out of process. The core of Babel is a macro system + a pluggable reader.

                          I am guessing all these use lex/yacc internally.

                          Babel certainly doesn’t. When it started it used estools, which used acorn, iirc. I think nowadays it uses its own parser.

                          Rails uses scaffolding and provides helpers to generate JS code at compile time. Something like parenscript.

                          I have no idea why you think scaffolding is like parenscript. The common use case for parenscript is to do the expansion on the fly, not to generate the initial boilerplate.

                          Code walking is difficult in lisp as well.

                          And impossible to write in portable code, which is why most (all?) implementations come with a code-walker you can use.

                          1. 1

                            If syntax is irrelevant, why even bother with Lisp? If I just stick to using arrays in the native language, I can also define functions like this and extend the array language to support new control flow structures:

                            ["begin",
                                ["define", "fib",
                                    ["lambda", ["n"],
                                        ["cond", [["eq", "n", 0], 0],
                                                 [["eq", "n", 1], 1],
                                                 ["T", ["+", ["fib", ["-", "n", 1]], ["fib", ["-", "n", 2]]]] ]]],
                                ["fib", 6]]
                            
                          2. 1

                            Well, if your question is “Would you prefer a consistent, built-in way of extending the language, or a hacked together kludge of pre-processors?” then I’ll take the macros… ;-)

                            Code walking is difficult in lisp as well. How would I know if an expression is a function or a macro ? If I wanted to write a code highlighter in vim that highlights all macros differently I would have a difficult time with doing pure code walking alone even though lisp is an easy language to parse.

                            My first question would be whether or not it makes sense to highlight macros differently. The whole idea is that they extend the language transparently, and a lot of “built-in” constructs defined in the CL standard are macros.

                            Assuming you really wanted to do this, though, I’d suggest looking at Emacs’ Slime mode. It basically lets the CL compiler do the work. It may not be ideal, but it works, and it’s better than what you’d get using Ragel, Swig, or Babel.

                            FWIW, Emacs, as far as I know (and as I have it configured), only highlights symbols defined by the CL standard and keywords (i.e. :foo, :bar), and adjusts indentation based on cues like “&body” arguments.

                            1. 1

                              Btw there is already a syntax highlighter that uses a code walker and treats macros differently. The code walker may not be easy to write, but it can hardly be said that it is hard to use.

                              https://github.com/scymtym/sbcl/blob/wip-walk-forms-new-marco-stuff/examples/code-walking-example-syntax-highlighting.lisp

                        2. 5

                          Yes, you absolutely want macros even if you use Lex/Yacc and interpreters.

                          Lex/Yacc (and parsers more generally), interpreters (and “full language compilers”), and macros all have different jobs at different stages of a language pipeline. They are complementary, orthogonal systems.

                          Lex/Yacc are for building parsers (and aren’t necessarily the best tools for that job), which turn the textual representation of a program into a data structure (a tree). Every Lisp has a parser, for historical reasons usually called a “reader”. Lisps always have s-expression parsers, of course, but often they are extensible so you can make new concrete textual notations and specify how they are turned into a tree. This is the kind of job Lex and Yacc do, though extended s-expression parsers and lex/yacc parsers generally have some different capabilities in terms of what notations they can parse, how easy it is to build the parser, and how easy it is to extend or compose any parsers you create.

                          Macros are tree transformers. Well, M4 and C-preprocessor are textual macro systems that transform text before parsing, but that’s not what we’re talking about. Lisp macros transform the tree data structure you get from parsing. While parsing is all about syntax, macros can be a lot more about semantics. This depends a lot on the macro system – some macro systems don’t allow much more introspection on the tree than just what symbols there are and the structure, while other macro systems (like Racket’s) provide rich introspection capabilities to compare binding information, allow macros to communicate by annotating parts of the tree with extra properties, or by accessing other compile-time data from bindings (see Racket’s syntax-local-value for more details), etc. Racket has the most advanced macro system, and it can be used for things like building custom DSL type systems, creating extensible pattern matching systems, etc. But importantly, macros can be written one at a time as composable micro-compilers. Rather than writing up-front an entire compiler or interpreter for a DSL, with all its complexity, you can get most of it “for free” and just write a minor extension to your general-purpose language to help with some small (maybe domain-specific) pain point. And let me reiterate – macros compose! You can write several extensions that are each oblivious to each other, but use them together! You can’t do that with stand-alone language built with lex/yacc and stand-alone interpreters. Let me emphatically express my disagreement that “most macros merely suppress evaluation”!
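
                          To make the composition point concrete in the thread's language (CL; both macros are toys made up for the example), two extensions written in complete ignorance of each other nest freely:

                          (defmacro with-timer (&body body)
                            (let ((start (gensym "START")))
                              `(let ((,start (get-internal-real-time)))
                                 (prog1 (progn ,@body)
                                   (format t "~a ticks~%" (- (get-internal-real-time) ,start))))))

                          (defmacro unless-empty ((var list) &body body)
                            `(let ((,var ,list))
                               (unless (null ,var) ,@body)))

                          ;; Neither macro knows the other exists, yet they compose:
                          (with-timer
                            (unless-empty (xs (list 1 2 3))
                              (reduce #'+ xs)))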

                          Interpreters or “full” compilers then work after any macro expansion has happened, and again do a different, complementary job. (And this post is already so verbose that I’ll skip further discussion of it…)

                          If you want to build languages with Lex/Yacc and interpreters, you clearly care about how languages allow programmers to express their programs. Macros provide a lot of power for custom languages and language extensions to be written more easily, more completely, and more compositionally than they otherwise can be. Macros are an awesome tool that programmers absolutely need! Without using macros, you have to put all kinds of complex stuff into your language compiler/interpreter or do without it. Eg. how will your language deal with name binding and scoping, how will your language order evaluation, how do errors and error handling work, what data structures does it have, how can it manipulate them, etc. Every new little language interpreter needs to make these decisions! Often a DSL author cares about only some of those decisions, and ends up making poor decisions or half-baked features for the other parts. Additionally, stand-alone interpreters don’t compose, and don’t allow their languages to compose. Eg. if you want to use 2+ independent languages together, you need to shuttle bits of code around as strings, convert data between different formats at every boundary, maybe serialize it between OS processes, etc. With DSL compilers that compile down to another language for the purpose of embedding (eg. Lex/Yacc are DSLs that output C code to integrate into a larger program), you don’t have the data shuffling problems. But you still have issues if you want to eg. write a function that mixes multiple such DSLs. In other words, stand-alone compilers that inject code into your main language are only suitable for problems that are sufficiently large and separated from other problems you might build a DSL for.

                          With macro-based embedded languages, you can sidestep all of those problems. Macro-based embedded languages can simply use the features of the host language, maybe substituting one feature that it wants to change. You mention delaying code – i.e. changing the host language’s evaluation order. This is only one aspect of the host language out of many you might change with macros. Macro extensions can be easily embedded within each other and used together. The only data wrangling at boundaries you need to do is if your embedded language uses different, custom data structures. But this is just the difference between two libraries in the same language, not like the low-level serialization data wrangling you need to do if you have separate interpreters. And macros can tackle problems as large as “I need a DSL for parsing”, like Yacc, down to “I want a convenience form so I don’t have to write this repeating pattern inside my parser”. And you can use one macro inside another with no problem. (That last sentence has a bit of ambiguity – I mean that users can nest arbitrary macro calls in their program. But also you can use one macro in the implementation of another, so… multiple interpretations of that sentence are correct.)

                          To end, I want to comment that macro systems vary a lot in expressive power and complexity – different macro systems provide different capabilities. The OP is discussing Common Lisp, which inhabits a very different place in the “expressive power vs complexity” space than the macro system I use most (Racket’s). Not to disparage the Common Lisp macro system (they both have their place!), but I would encourage anyone not to come to conclusions about what macros can be useful for or whether they are worthwhile without serious investigation of Racket’s macro system. It is more complicated, to be certain, but it provides so much expressive power.

                          1. 4

                            I mean, strictly, no - but that’s like saying “if you can write machine code, do you really need Java?”

                            (Edited to add: see also Greenspun’s tenth rule … if you were to build a macro system out of such tooling, I’d bet at least a few pints of beer that you’d basically wind up back at Common Lisp again).

                            1. 2

                              First-class packages are the most underrated feature of lisp. AFAIK only perl offers it fully

                              OCaml has first-class modules: https://ocaml.org/releases/4.11/htmlman/firstclassmodules.html

                              I’m a lot more familiar with them than I am with CL packages though, so they may not be 100% equivalent.

                              1. 2

                                I’m not claiming to speak for all lispers, but the question

                                Here is my question for lispers: if you can use lex/yacc and can write a full-fledged interpreter, do you really need macros?

                                might be misleading. Obviously you don’t need macros, and everything could be done some other way, but macros are easy to use while also being powerful, and they can be dynamically created or restricted to a lexical scope. I’ve never bothered to learn lex/yacc, so I might be missing something.
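
                                The lexically scoped variant is MACROLET; a quick sketch:

                                ;; SQUARE exists only inside the MACROLET body:
                                (macrolet ((square (x) `(* ,x ,x)))
                                  (square 5)) ; => 25
                                ;; Outside that form, (square 5) is an undefined-function error.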

                              2. 1

                                Conditions are a largely forgotten language feature that solves what I see as the fundamental problem with exceptions.

                                Fundamentally, exceptions solve the problem that the best place to handle an unusual state is often somewhere up the call stack from where that unusual state is identified. Conditions, on the other hand, solve the problem that the best place to handle that unusual state is often down the call stack from the best place to decide how it should be handled.

                                (FWIW, it’s possible to implement conditions as a library feature in any language with both first-class functions and the ability to arbitrarily unwind the stack, e.g., with exceptions. I’ve done it in Java and Python, though I’ve never managed to figure out how to make the interface remotely acceptable.)
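
                                In CL itself the split looks like this (a minimal sketch; the condition, restart, and function names are made up for the example). The low-level code offers recoveries; code higher up picks one, and the recovery runs down at the point of the error:

                                (define-condition malformed-entry (error)
                                  ((line :initarg :line :reader malformed-entry-line)))

                                ;; Low level: signals the problem and OFFERS recoveries (restarts).
                                (defun parse-entry (line)
                                  (if (find #\= line)
                                      line
                                      (restart-case (error 'malformed-entry :line line)
                                        (skip-entry () nil)           ; recover by dropping the line
                                        (use-value (value) value))))  ; recover with a replacement

                                ;; High level: DECIDES the policy without unwinding the stack first.
                                (defun parse-all (lines)
                                  (handler-bind ((malformed-entry
                                                   (lambda (c)
                                                     (declare (ignore c))
                                                     (invoke-restart 'skip-entry))))
                                    (remove nil (mapcar #'parse-entry lines))))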