What’s gained by using a new term “compile-time computing”? These techniques are well-established, and I feel like there are already too many terms.
I think metaprogramming is a good umbrella term which encompasses a variety of techniques. It’s more general than “compile-time computing”, and I think gets to the heart of the issue.
Metaprogramming is when the data a program operates on is code – whether in textual form, AST form, or “runtime form”. Not coincidentally, that’s also a good definition for a “compiler” or “interpreter”. Examples:
code generators are metaprogramming – the code is text (e.g. lex/yacc/protobuf compiler)
macros are metaprogramming – the code is in the form of ASTs
runtime reflection is metaprogramming – e.g. dynamically creating classes in Python, Go’s JSON module, Java class loading
multi-stage programming [1] is metaprogramming – this relates to whether the generated code is typesafe. OCaml has a few systems like this.
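To make the definition concrete, here's a minimal sketch in Python of the same one-line function built from each of the three "forms" named above – textual, AST, and runtime (the function itself is just a throwaway illustration):

```python
import ast

# 1. textual form: the code is a string we exec
ns1 = {}
exec("def inc(x): return x + 1", ns1)

# 2. AST form: the code is a syntax tree (parsed here; macro systems
# would build or rewrite these nodes directly)
tree = ast.parse("def inc(x): return x + 1")
ns2 = {}
exec(compile(tree, "<ast>", "exec"), ns2)

# 3. "runtime form": the code is live objects, assembled by a factory
def make_inc():
    def inc(x):
        return x + 1
    return inc

print(ns1["inc"](41), ns2["inc"](41), make_inc()(41))  # 42 42 42
```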
The problem with “compile-time computing” is that the same DSL can be implemented in a variety of ways. Whether it happens at compile time or runtime is orthogonal to what you’re doing. That you’re using metaprogramming is a more apt description than “compile-time computing”.
Examples:
lexer / state machine generators
Rust and OCaml have compile-time macros to implement regexes
lex, re2c, ragel – done at compile-time with textual code generation
PCRE / re2 – regex interpreters, as opposed to compilers
V8 apparently has a regex JIT
in Lisp, you can do these at compile time, but with macros as opposed to textual code generation
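A toy sketch of the lex-style approach (all names invented): a token table is turned into Python *source text* for a tokenizer. Whether that text is written to a file at build time or `exec`'d immediately is exactly the orthogonal choice being argued here.

```python
# Toy lex-style generator: token table -> Python source -> module.
# Emitting text is the metaprogramming; compile time vs. runtime
# is a separate decision.
TOKENS = [("NUM", r"\d+"), ("PLUS", r"\+"), ("WS", r"\s+")]

def gen_lexer_source(tokens):
    lines = ["import re", "def tokenize(s):", "    pos = 0", "    out = []",
             "    while pos < len(s):"]
    for name, pat in tokens:  # first matching pattern wins
        lines += [f"        m = re.compile({pat!r}).match(s, pos)",
                  "        if m:",
                  f"            out.append(({name!r}, m.group()))",
                  "            pos = m.end()",
                  "            continue"]
    lines += ["        raise ValueError('bad char at %d' % pos)",
              "    return out"]
    return "\n".join(lines)

ns = {}
exec(gen_lexer_source(TOKENS), ns)  # or write the source to a file
print(ns["tokenize"]("12 + 3"))
```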
parser generators
ANTLR, Bison – done at compile time, with code generation
C++ has parser generators done entirely in template metaprogramming, e.g. boost::spirit I think
there are several parser “interpreters”, e.g. many parser combinator libraries, I think. I wrote a PEG parser interpreter in Python, and PyParsing is another parser-interpreter I just looked at.
in Lisp, you can do these at compile time, but with macros as opposed to textual code generation
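A sketch of the parser-“interpreter” style (a minimal combinator library, not any particular package): the grammar is a data structure of closures that is walked at runtime, with no code generation anywhere.

```python
# Minimal parser combinators: each parser is a function
# (text, pos) -> (value, newpos), or None on failure.
def lit(s):
    def p(text, pos):
        if text.startswith(s, pos):
            return s, pos + len(s)
    return p

def seq(*ps):  # match all parsers in order
    def p(text, pos):
        vals = []
        for q in ps:
            r = q(text, pos)
            if r is None:
                return None
            v, pos = r
            vals.append(v)
        return vals, pos
    return p

def alt(*ps):  # PEG-style ordered choice: first success wins
    def p(text, pos):
        for q in ps:
            r = q(text, pos)
            if r is not None:
                return r
    return p

# grammar: greeting = ("hello" | "hi") " world"
greeting = seq(alt(lit("hello"), lit("hi")), lit(" world"))
print(greeting("hi world", 0))  # (['hi', ' world'], 8)
```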
AST schema language, like ASDL [2]
Python’s ASDL – Python uses a Python program which turns an ASDL schema into a C library.
Oil’s ASDL – but I currently use an interpreted variant of this in Oil. There is no textual code generation; it’s done with runtime reflection.
I wrote a toy protobuf compiler using runtime reflection – parsing the schema and generating Python classes dynamically.
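In the same spirit as that toy protobuf compiler, a sketch of schema-driven runtime reflection (the one-line-per-type schema syntax here is invented, nothing like real ASDL or protobuf): parse the schema, then manufacture classes with Python's `type` instead of emitting source text.

```python
# Invented schema format: "TypeName: field1 field2 ..."
SCHEMA = """
Point: x y
Line: start end
"""

def compile_schema(text):
    classes = {}
    for line in text.strip().splitlines():
        name, _, fields = line.partition(":")
        fields = fields.split()
        # default arg pins this iteration's field list into the closure
        def init(self, *args, _fields=fields):
            for f, v in zip(_fields, args):
                setattr(self, f, v)
        # runtime reflection: type() builds the class, no codegen
        classes[name.strip()] = type(name.strip(), (object,), {"__init__": init})
    return classes

classes = compile_schema(SCHEMA)
p = classes["Point"](3, 4)
print(p.x, p.y)  # 3 4
```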
web template languages – You can find more than a dozen such languages in Python, Ruby, Perl, PHP, etc.
most of these are AST interpreters
Cheetah was compiled to Python bytecode IIRC
There are multiple template languages that can be compiled to C and C++; I used one at Google and there were others too.
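A sketch of the compiled-template strategy (toy `{var}` syntax, resembling no real engine): the template is translated once into Python source for a render function, the way Cheetah compiled templates down to bytecode. An interpreter would instead re-walk the parsed parts on every render.

```python
import re

def compile_template(tmpl):
    # split into alternating literal / variable chunks
    parts = re.split(r"\{(\w+)\}", tmpl)
    body = ["def render(ctx):", "    out = []"]
    for i, part in enumerate(parts):
        if i % 2 == 0:  # literal text
            body.append(f"    out.append({part!r})")
        else:           # variable reference
            body.append(f"    out.append(str(ctx[{part!r}]))")
    body.append("    return ''.join(out)")
    ns = {}
    exec("\n".join(body), ns)  # "compile" step, done once per template
    return ns["render"]

hello = compile_template("Hello, {name}! You have {n} messages.")
print(hello({"name": "Ada", "n": 3}))  # Hello, Ada! You have 3 messages.
```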
printf format strings – These are just like web template strings; in fact printf is accidentally Turing-complete [3]
The C stdlib has interpreters for printf strings (though modern C compilers also have their own parsers for them, to give compile-time error messages)
OCaml (and maybe Rust?) have printf string compilers
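To underline that the format string is itself a little program, here's both strategies for a two-specifier mini-printf (this is not C's printf; it handles only %s and %d, with no type checking):

```python
import re

SPEC = re.compile(r"%[ds]")

def printf_interp(fmt, *args):
    # interpreter: re-scan fmt on every call, like C's stdlib
    it = iter(args)
    return SPEC.sub(lambda m: str(next(it)), fmt)

def printf_compile(fmt):
    # "compiler": scan fmt once up front, return a closure over the
    # pre-split pieces (OCaml's typed format strings, minus the types)
    parts = SPEC.split(fmt)
    def run(*args):
        out = [parts[0]]
        for part, arg in zip(parts[1:], args):
            out.append(str(arg))
            out.append(part)
        return "".join(out)
    return run

f = printf_compile("%s has %d items")
print(printf_interp("%s has %d items", "cart", 3))  # cart has 3 items
print(f("cart", 3))                                 # cart has 3 items
```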
My claim: for any example he gives of “compile-time computing”, you can find an equivalent example of metaprogramming that is not done at compile time. The fact that it’s metaprogramming is the interesting thing about the software architecture.
Compile time and runtime don’t really have solid meanings anyway – Lisp intermingles the two. There is a Cling interpreter for C++. When you’re writing compilers and build systems, build time is runtime! (That is not a vacuous or trivial statement, if you work on that type of thing it comes up)
EDIT: I feel like the Futamura projections prove my point, i.e. it’s irrelevant whether it’s a compiler or interpreter, but I probably won’t do a good job explaining that: https://en.wikipedia.org/wiki/Partial_evaluation
Forth also makes it easy to run user-written code during compilation. Want a three-way IF statement (IF<=>)? You can easily do it in Forth. It’s less a macro system and more writing an extension to the compiler. And because of this, Forth does define the terms “run time” and “compile time”.
I agree with you. It’s already called metaprogramming. Specific uses of it are often called macros. A lot of good stuff in CompSci uses multi-stage computation. Teaching people those terms will result in them finding all sorts of good stuff, especially worked examples, when they Google them.
[1] https://en.wikipedia.org/wiki/Multi-stage_programming
[2] http://www.oilshell.org/blog/2016/12/11.html
[3] https://github.com/HexHive/printbf