1. 1

    What’s the difference between property-based testing and fuzzing?

    1. 5

      To me, the name “property-based” emphasizes the assertions you’re making, while “fuzzing” emphasizes the random inputs. I think a typical fuzz test makes pretty weak assertions, like “it doesn’t segfault”, while a typical property-based test would make assertions like “forall x, first(sort(x)) == min(x)”.

      Also, you can do property testing without using randomness: SmallCheck tests all inputs up to some size bound.
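
      To make that concrete, here's a minimal sketch of such a property in Hypothesis (a library mentioned downthread; the test name is mine):

      from hypothesis import given, strategies as st

      # forall non-empty x, first(sort(x)) == min(x)
      @given(st.lists(st.integers(), min_size=1))
      def test_first_of_sorted_is_min(xs):
          assert sorted(xs)[0] == min(xs)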

      Two other points of view I found interesting:

      1. 2

        I’m very new to property-based testing (recently started using Hypothesis) but I came across this page by the primary author of Hypothesis which talks about their preferred definitions of fuzzing and property-based testing [edit: I’m too slow! this is the second link in dpercy’s comment]. They’re discussed/qualified on the page, but just to have them here:

        Fuzzing is feeding a piece of code (function, program, etc.) data from a large corpus, possibly dynamically generated, possibly dependent on the results of execution on previous data, in order to see whether it fails.

        and

        Property based testing is the construction of tests such that, when these tests are fuzzed, failures in the test reveal problems with the system under test that could not have been revealed by direct fuzzing of that system.

        I’m not sure if these definitions are widely used though (still just getting started myself!).

        1. 2

          If you squint hard enough, they are the same thing, but in practice today they work differently. There is convergence, though.

          What property-based testing emphasizes is generating random inputs of a specific type. Most interesting things are trees, and it’s relatively easy to write a generator for an arbitrary tree.

          What fuzzing emphasizes is coverage-guided search for examples. Unlike property-based testing, fuzzers typically generate just random bytes, which don’t conform to any specific grammar. However, when the fuzzer runs, it watches the coverage of the system under test, and, via this feedback and the magic of genetic programming, can synthesize inputs conforming to the grammar.

          The synthesis of the two ideas, structured inputs + coverage feedback, is to use a proptest-like approach to property-based testing. Classical property-based testing, like the one in the article, says that the input to the generator is an RNG. Proptest makes this more specific, and says that the input is a finite sequence of numbers. That is, a generator is a mapping from a sequence of unstructured bytes to an element of the type we generate. And if you use a fuzzer as the source of those random bytes to feed into the generator, you get:

          • property-based testing with fuzzing-like coverage feedback ==
          • == fuzzing with a property-based-testing-like ability to use complex grammars as input

          A great write-up is https://fitzgeraldnick.com/2020/08/24/writing-a-test-case-generator.html. It essentially makes libfuzzer generate valid wasm programs.

          As a bonus, the proptest-like approach gives you shrinking for free.
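
          To make the “unstructured bytes in, structured value out” idea concrete, here is a minimal Python sketch (my own toy code, not proptest’s or libfuzzer’s actual API):

          import random

          def gen_tree(data, pos=0):
              # Consume bytes to decide structure: a high byte means "internal
              # node", a low byte (or running out of input) means "leaf".
              # A fuzzer mutating the raw bytes thereby mutates well-formed
              # trees, and shrinking the byte string shrinks the tree.
              if pos >= len(data) or data[pos] < 128:
                  return ('leaf', data[pos] if pos < len(data) else 0), pos + 1
              left, pos = gen_tree(data, pos + 1)
              right, pos = gen_tree(data, pos)
              return ('node', left, right), pos

          # Any byte source works: an RNG here, a coverage-guided fuzzer in practice.
          tree, _ = gen_tree(random.randbytes(32))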

        1. 6

          This looks somewhat similar to Aaron Hsu’s array-based tree structures and algorithms; you may find them interesting.

          He uses two arrays in parallel: one contains, for each element, that element’s parent; and the other contains a given element’s left sibling. Terminal elements can use a sentinel value or (more likely) point to themselves. The second can be elided if order is not important.
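
          For illustration, here is a toy Python version of that encoding (my own sketch, not Hsu’s exact scheme):

          # The tree 0 -> (1, 2), 2 -> (3, 4), with nodes as ids 0..4.
          # parent[i] is i's parent; the root points to itself as the sentinel.
          # left_sib[i] is i's left sibling; leftmost children point to themselves.
          parent   = [0, 0, 0, 2, 2]
          left_sib = [0, 1, 1, 3, 3]

          def depths(parent):
              # A whole-array operation: chase parent pointers in lockstep;
              # the number of hops a node needs to reach the root is its depth.
              cur, d = list(range(len(parent))), [0] * len(parent)
              while any(parent[c] != c for c in cur):
                  for i, c in enumerate(cur):
                      if parent[c] != c:
                          cur[i], d[i] = parent[c], d[i] + 1
              return d

          assert depths(parent) == [0, 1, 1, 2, 2]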

          I believe these techniques had been described before, but were not in very common use; e.g. I don’t think anybody had ever written a compiler using such a structure before.

          1. 4

            That’s what I thought when reading the post! Two of Aaron’s papers touch on this:

            1. 3

              Thanks for the links! I was aware of this awesome work but did not realize the similarity of the approaches - I find Aaron’s approach not very accessible, though I am skimming the relevant chapter of the PhD thesis now.

            2. 2

              There are some related interesting array-based tree representations in Iverson’s A Programming Language book (section 1.23 on ordered trees, page 45 of the pdf book here). Also, Stevan Apter’s treetable paper has some interesting uses of these trees-as-arrays ideas [edits for clarity - sorry!].

              1. 3

                Also interesting and relevant. Quickly skimming over these and Hsu’s work, it seems that many of them include sorting as a primitive operation, which I’d count as medium performance on GPU. It also seems like many of these use an intermediate representation of size n_elements * max_depth. That’s considerably easier than what I’ve done, which is strictly O(n). It’s entirely possible I’ve rediscovered something that already exists, but so far I haven’t found it, and in particular I’m pretty sure my adaptation to the decoupled look-back framework (for single-pass processing) is new. In any case, thanks for these links.

                1. 2

                  include sorting as a primitive operation, which I’d count as medium performance on GPU

                  As far as I know, no one before Hsu had really focused on performant or compact representations; adjacency matrices, for instance, had been known for a long time as a simple way to represent graphs in APL, and were obviously not performant.

                  But Hsu’s algorithms all use linear space, and don’t really rely on sorting; I count only two sorts in the core co-dfns compiler.

                  I didn’t mean to imply that your work wasn’t original—it’s certainly new to me, at least—just thought this might be something interesting to look at, and maybe inspire further improvements to your own work :)

                  1. 2

                    Many apologies for my ambiguous comment! I didn’t at all mean to suggest these had your algorithm in them; I hope it’s OK that I edited my comment for clarity. Thanks for your admirably graceful reply and helpful complexity comparison, and of course for your very interesting blog post :)

                    1. 2

                      All good. I’ve updated the blog post with a bit of clarification and adding these links, which I’m hopeful puts everything in a good context.

              1. 11

                Most lists of “weird programming languages” get bogged down in brainfuck and brainfuck skins. I like that this one doesn’t!

                1. 11

                  I agree. Although I feel that APL and especially Lisp don’t really fit with the rest of the list - those are languages that (some) people really do want to program in.

                  1. 7

                    I think a listicle like this about unusual languages people actually use would be really interesting. Probably something like

                    • Forth
                    • APL/J/K
                    • Inform7
                    • Orca
                    • Golfscript (stretching it, I know)

                    Damn I’ve heard of so many bizarre languages

                    1. 10

                      PostScript.

                      1. 2

                        Any good resources on PS? I’ve heard… rumors, but never investigated myself.

                        1. 7

                          I’m dead tired and can’t find the docs before sleep, but PostScript is an awesome concatenative language and sincerely my favorite in the genre other than Factor. It’s not hard. I’ll find links to the guides in the morning. You can literally code in the GhostScript REPL meanwhile if you want to play.

                          1. 3

                            I really like what I’ve read of Bill Casselman’s Mathematical Illustrations which covers PostScript and some fun geometry.

                            1. 1

                              Back when I had to use PostScript for work, the language reference was the best document I was able to find.

                              1. 1

                                Unfortunately no, like so many of my opinions I’ve gotten it from The Internet.

                                I believe my primary memory of PostScript being used for programming is from this comment by JWZ: http://regex.info/blog/2006-09-15/247#comment-3085

                            2. 3

                              And there’s INRAC (used for at least two, possibly three, commercial products that I know of) where flow control is non-deterministic.

                              1. 2

                                I saw you mention INRAC on the alien languages ask, which to my eternal shame I didn’t notice until two weeks later. What are some resources for learning about it as an outsider? Sounds really interesting!

                                1. 4

                                   Unfortunately, there isn’t much available, and most of the references I’ve come across just mention INRAC. Aside from the original creator of INRAC (William Chamberlain), I think I’ve inadvertently become an INRAC expert:

                                  Deconstruction Racter

                                  The Psychotherapy of Racter, or The Descent Into Madness of Sean Conner

                                  The Psychotherapy of Racter, or The Further Descent Into Madness of Sean Conner

                                  INRAC, the mind bending implementation language of Racter

                                  WTF INRAC?

                                  So how do you determine undefined behavior in a language you are reverse engineering?

                              2. 2

                                Hey, if the software historian / archeologist hasn’t heard of it…

                                 For that hypothetical listicle, I’d consider adding one or two of your modelling languages - like, TLA+ looks pretty magical to people who are not you ;-). Also, I’d consider LaTeX - it’s not actually that uncommon, but it’s very different from other languages in both appearance and semantics. (Maybe TikZ, but I’m not sure that counts as a programming language.)

                                Something like Haskell is probably too common, but Prolog might make the list?

                                [Quick EDIT: also, maybe assembly for the original MIPS CPUs, where you could apparently read the old value of a register if you manage to execute the instruction before the previous instruction has actually written the new value? It doesn’t look too evil, but…]

                                … do people use Orca?

                                1. 3

                                  … do people use Orca?

                                  @rwhaling introduced me to it and was using it for his synth music, so at least one person uses it :P

                                  1. 2

                                    Re MIPS, you may be thinking of https://en.m.wikipedia.org/wiki/Delay_slots. For some reason this is still being taught in introductory computing classes at university.

                                    1. 2

                                      [Quick EDIT: also, maybe assembly for the original MIPS CPUs, where you could apparently read the old value of a register if you manage to execute the instruction before the previous instruction has actually written the new value? It doesn’t look too evil, but…]

                                      Were you thinking of the divide and multiply instructions? Some instruction sequences give unpredictable results.

                                      1. 2

                                        I was thinking of https://retrocomputing.stackexchange.com/questions/17598/did-any-cpu-ever-expose-load-delays. (kameliya’s Wikipedia page is a little less informative; note that sufficiently-embedded processors may be able to ensure that an interrupt doesn’t happen. Which would allow one to write rather mind-bending code.)

                                    2. 2

                                       SQL is based around relations (in the mathematical sense) and is the most popular goofy programming language no one thinks about.

                                       Lex/Yacc let you write half your program as a CFG and the rest in C, a language/tool chain that again no one thinks of in these lists.

                                      Wolfram is based on term rewriting and is somewhat popular and extensively used in physics.

                                       Erlang is based around a distributed model that is again something few other languages support natively.

                                       Most of the ‘esoteric’ language lists are lists of ‘languages that do the same thing as C but poorly’.

                                      1. 1

                                        MiniZinc is also worth an include on that list.

                                        1. 1

                                          Factor is a really nice forth dialect.

                                          1. 1

                                            Yes, I was also just about to suggest Inform 7. It’s fantastic.

                                            1. 1

                                              Golfscript (stretching it, I know)

                                               No you’re not. I want to write an implementation that is not Ruby.

                                              1. 1

                                                Prolog, MUMPS

                                                1. 1

                                                  Mumps, RPG…

                                                2. 1

                                                  TBH I interpreted the inclusion of CL on this list as a trolling attempt toward lispers.

                                              1. 10

                                                This is Ken Iverson’s Elementary Functions, a book that uses APL to illustrate mathematics.

                                                Ken’s heirs have placed a lot of his works under Creative Commons licenses. After that was done, I’ve made it a point of finding spare copies and having them digitally scanned so that they can be preserved. Algebra: An Algorithmic Treatment is another example.

                                                J Software is kind enough to host the PDFs.

                                                1. 3

                                                  Is there a page collecting links to Ken Iverson’s CC licensed works in PDF? I’d like to post this to an array languages group I visit.

                                                  (edit) I think I found the page that collects these documents: https://code.jsoftware.com/wiki/Books

                                                  1. 2

                                                    Yep, you found it.

                                                    Which array languages group, if I may ask?

                                                    1. 2

                                                      It’s a group internal to the Recurse Center. I posted these links and got an invite to pair on some katas in J! I may learn APL after all!

                                                  2. 3

                                                     Thank you very much for finding and having this scanned! This looks particularly interesting, since it seems to use the ‘book APL’ notation from A Programming Language rather than the APL programming language (used in Iverson’s Algebra and Analysis books). This Elementary Functions book was published just after the first APL implementation was finished, I think, so presumably it was written before or alongside the computer implementation of APL.

                                                    1. 3

                                                      What’s “book APL” ? How’s it different? I want to know more!

                                                      1. 4

                                                        I can do no better than lorddimwit’s excellent answer :) The book itself is available at softwarepreservation.org here.

                                                        There’s some more info on the history in The Design of APL in the appendix on the chronology of APL development. In particular it turns out this Elementary Functions book grew out of a high school course Iverson taught in 1964. The students got to use an experimental partial implementation of APL on an IBM 1620 - must have been quite a course! edit: Actually these historical details are in the preface of the Elementary Functions book too.

                                                        1. 2

                                                          APL started life as a “blackboard notation” for teaching and communicating mathematics. It was described in the book A Programming Language. In the title, “programming” wasn’t supposed to evoke “computer programming” so much as “designing algorithms and procedures.”

                                                          For the first several years of its existence, APL was purely a pen-and-paper/chalkboard notation.

                                                           When Adin Falkoff wrote the first APL interpreter for a computer, several changes were made to the notation due to the realities of computing hardware at the time.

                                                          For example, “book” notation for “floor” is ⌊0.4⌋, whereas in “computer” notation, it’s ⌊0.4.

                                                          The book also uses a beautiful schematic representation of algorithms for loops and jumps; the ∇ notation on the computer approximates it but they’re not the same.

                                                          Reading A Programming Language to learn computer APL would be a bit like reading Shakespeare to learn modern English. You’d get 85% of the way there and get all the core concepts, but you’d say a lot of things that don’t quite make sense.

                                                          Interestingly, “book” APL has some stuff that was never completely put into computer APL.

                                                    1. 5

                                                      Nice intro.

                                                      I may not use J in my everyday programming, but I’m glad I learned it and think it made me a better programmer.

                                                      This was my experience too, though I continue to use J almost daily even if not professionally.

                                                      Far more than any other language, including Haskell, learning J felt like dropping acid and blasting away everything I thought I knew about programming.

                                                      It’s the third paradigm, and unfortunately the least well-known:

                                                      1. Procedural
                                                      2. Functional
                                                      3. Array

                                                      It really is a whole new mindset.

                                                      1. 7

                                                        How did you learn it? What kind of projects would you recommend to learn such a language to discover its benefits?

                                                        1. 4

                                                          Personally, I stuck to number-crunching a-la Project Euler and some Advent of Code challenges.

                                                           Also tried to use J for bio-signal processing (a set of student labs built around calculating various metrics of heart-rate variability), but after looking at the solution the professor went “Nuh-uh” and I had to switch to R, which IMO wasn’t that bad of a compromise in terms of paradigm shift.


                                                           The most interesting point, which I don’t see addressed in most of the J write-ups, is how one can use it to create ad-hoc calculi of various sorts, even problem-specific DSLs. Function composition, the monadic/dyadic distinction, and tacit programming surely allow for that, yet the “how” of it isn’t widely discussed.

                                                          1. 1

                                                            I had to switch to R, which IMO wasn’t that bad of a compromise in terms of paradigm shift.

                                                             That’s a shame: I think R is basically the same as numpy - no real paradigm shift, just more/different functions. So I hope you’ll give Iverson-ish languages another try, because by focusing on the array-ness you might’ve missed something incredible.

                                                            in most of the J write-ups is how one can use it to create ad-hoc calculi of various sorts, even problem-specific DSLs

                                                            I’m very curious to understand better what you’re looking for here. What’s a kind of problem you’d like to see?

                                                            1. 2

                                                              I think R is basically the same as numpy

                                                              That’s true, and NumPy is basically an ugly cousin of APL ;) What I rather meant is that sacrificing J’s expressiveness was a bearable trade-off in the context of my task, and that in the end I didn’t really need most of it: just reusing R’s libraries and mashing matrices together sufficed. Productivity and acceptance by peers over cleverness and personal satisfaction.

                                                              I’m very curious to understand better what you’re looking for here

                                                               “Notation as a tool of thought” I think. Is J a means to an end as it is given, or does it permit/encourage building linguistic abstractions on top?

                                                               APL is often praised for its lyricality, and J specifically describes its parts of speech as nouns, verbs, adverbs, and conjunctions; but does it end there? In J, there are ways to assign obverses and adverses to a verb, to define its monadic and dyadic versions, to create adverbs with conjunctions; and then some tree-manipulating primitives, even ;: for tokenizing, and boxes that permit manipulating J code as data. Is any of that intended for meta-programming, AST building, and ultimately DSL creation?

                                                               Can an expert in their domain take J and carefully design a set of composable primitives for solving specific problems? If so, what is the process behind that, and what does the end result look like? Is it a notation that reads and writes like poetry, or is it a prosaic library? Is it even a normal practice in the APL family?

                                                               So, I guess what I am looking for is a walkthrough of the points I raised, with some interesting problem as its subject; both about the mindset and its execution. I’m too much of a puny human to grasp Aaron Hsu’s thesis on the compiler that he built with Dyalog APL, but the basic premise is roughly the same as what I have in mind. His live stream on this compiler’s design and architecture is also worth watching.

                                                              1. 3

                                                                 Is J a means to an end as it is given, or does it permit/encourage building linguistic abstractions on top?

                                                                Permit? Probably.

                                                                Encourage? I’ve not seen this personally. In fact, the ethos of J/APL seems the opposite of this. The big promise of the language is that with a single set of ~50 primitives you can elegantly solve the vast majority of practical problems. So there’s no need for bespoke DSLs for your different applications.

                                                                 As a side note, my experience with Ruby has convinced me that such DSLs are usually a mistake.

                                                                1. 3

                                                                  A case in point: in the above-mentioned talk from Aaron Hsu, he shows how he defined PEG parser combinators and used them to write parsing expressions in a seemingly declarative manner.

                                                                   Surely that doesn’t qualify as a DSL, but the ability to re-define a problem in terms of these 50+ vector-manipulating primitives is what always fascinated me in APL. There’s an inherent, almost visceral “spatiality” to this act. At times it feels that the more terse your code is, the closer it is to some sort of underlying ur-computation that the world is built of.

                                                                  Perhaps my original question is not about how to extend the language towards the problem, but how to ground the problem in the notation. Or perhaps how to design a notation with APL-ish qualities that will influence how I think with it and create in it?


                                                                   APL family posits a certain view on programming, but how to see everything thru its lens? Is it just linear algebra with mathematical know-how? Practice?

                                                                  An example from my experience: in the first semester of DSP I really struggled with the concept of convolution, I simply couldn’t grasp it on an intuitive level. At least in my native language “convolution” sounds more like “folding” or even “coagulating”, but what folds with what? What is the physical meaning? Implementing it in C-level language only obscured the question behind the semantic noise, by-the-book definitions didn’t help either.

                                                                   And then one day I just typed [:+//.*/ into the J console and it clicked. In a sense, if two signals are threads, then their convolution is a helical braid, the one that you create by coiling one thread over another: a sum +/ of the oblique /. diagonals formed by the multiplicative * intersections.
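
                                                                   Spelled out in Python (my own rendering of that reading): build the multiplication table, then sum its oblique diagonals:

                                                                   def convolve(a, b):
                                                                       # out[k] collects every product a[i]*b[j] with i + j == k,
                                                                       # i.e. the oblique diagonals of the outer-product (*/) table.
                                                                       out = [0] * (len(a) + len(b) - 1)
                                                                       for i, x in enumerate(a):
                                                                           for j, y in enumerate(b):
                                                                               out[i + j] += x * y
                                                                       return out

                                                                   assert convolve([1, 2, 1], [1, 2, 1]) == [1, 4, 6, 4, 1]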

                                                                  There’s a scene in “Matrix: Revolutions” where blinded Neo says to Smith in Bane’s body “I can see you” — that’s the kind of feeling I got from this experience.

                                                                  1. 2

                                                                     APL family posits a certain view on programming, but how to see everything thru its lens? Is it just linear algebra with mathematical know-how? Practice?

                                                                    Practical answer: doing many problems, getting it “wrong”, and then seeing a more natural way to do it. It seems like you already know this though, based on your very nice example.

                                                                    Larger answer: I’ve been meaning to write about this, but tldr… two words come to mind: geometrical and holistic. Think of the game of life in APL. In J/APL think, you don’t view the problem from the point of view of individual cells. Instead, you look at the whole plane at once, and imagine eight copies of that plane stacked on top of one another – one plane for each directional shift. Then the neighbor count is simply the +/ of those planes.

                                                                    This pattern appears a lot. You take the view from above, look at “everything at once,” and there’s a natural and simple way to express that with array computations. It doesn’t always work, though. Some problems aren’t a good fit for array thinking, but a surprising number are.
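
                                                                     For anyone who wants that spelled out, here’s a NumPy sketch of the stacked-planes view (mine; it wraps at the edges, like the J one-liner further down):

                                                                     import numpy as np

                                                                     def life_step(g):
                                                                         # One plane per directional shift of the whole board; the
                                                                         # neighbor count is just the sum of the stack of eight planes.
                                                                         planes = [np.roll(np.roll(g, dy, 0), dx, 1)
                                                                                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                                                                                   if (dy, dx) != (0, 0)]
                                                                         n = sum(planes)
                                                                         return ((n == 3) | ((g == 1) & (n == 2))).astype(int)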

                                                                    1. 3

                                                                      Yes, the local action (counting neighbors) is applied on the board as a whole, not just to a specific cell; I even have a J one-liner stashed for that somewhere.† Nice example.

                                                                      Interesting that APL solutions turn out to be fractal in their nature (speaking of ur-computation). Reminds me of Konrad Zuse’s Rechnender Raum and Plankalkül.


                                                                      † Lo and behold:

                                                                       L =: [:+.`*./ ],~ 3 4=/ [:+/^:2 (>{;~i:1)|.]   NB. one generation of Life
                                                                       g =: 5 5 $ 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 1 1 1 0   NB. a glider
                                                                      
                                                                         <"_1 ' o' {~ L^:(i.5) g
                                                                      ┌─────┬─────┬─────┬─────┬─────┐
                                                                      │     │     │     │     │     │
                                                                      │  o  │     │     │     │     │
                                                                      │   o │ o o │   o │  o  │   o │
                                                                      │ ooo │  oo │ o o │   oo│    o│
                                                                      │     │  o  │  oo │  oo │  ooo│
                                                                      └─────┴─────┴─────┴─────┴─────┘
                                                                      
                                                                      1. 1

                                                                        Nice. You might also enjoy this 26 byte version:

                                                                        (]=3+4=*)[:+/(>,{;~i:1)&|.
                                                                        

                                                                        By the way, what did you mean by “APL solutions turn out to be fractal in their nature”?

                                                                        1. 2

                                                                           Well, the Moore neighborhood is applied to the playing field as a matrix rotation, but at the same time to each cell on the field (each element of the matrix). So, in a way, the Game of Life field is a cell by itself (with a number of states dependent on the number of cells in it), and, in turn, each cell on it is a playing board. [1]

                                                                             [ c =: 4=i.3 3
                                                                          0 0 0 NB. A cell with 8 neighbors.
                                                                          0 1 0
                                                                          0 0 0
                                                                             <"_2 (>{;~i:_1)|. c
                                                                          ┌─────┬─────┬─────┐ NB. A cell with 8 neighbors?
                                                                          │1 0 0│0 1 0│0 0 1│
                                                                          │0 0 0│0 0 0│0 0 0│
                                                                          │0 0 0│0 0 0│0 0 0│
                                                                          ├─────┼─────┼─────┤
                                                                          │0 0 0│0 0 0│0 0 0│
                                                                          │1 0 0│0 1 0│0 0 1│
                                                                          │0 0 0│0 0 0│0 0 0│
                                                                          ├─────┼─────┼─────┤
                                                                          │0 0 0│0 0 0│0 0 0│
                                                                          │0 0 0│0 0 0│0 0 0│
                                                                          │1 0 0│0 1 0│0 0 1│
                                                                          └─────┴─────┴─────┘
                                                                          

                                                                          That was a nod towards your idea about geometry and holistic approach.

                                                                          1. 1

                                                                            Gotcha. Thanks.

                                                                  2. 2

                                                                    Yeah. One subtlety most people don’t appreciate about “notation as a tool of thought” is that the paper doesn’t mention “abstraction” once. Easy to miss in these modern times where “notation” has implicitly come to mean “for creating new abstractions”.

                                                                    1. 4

                                                                      That’s fascinating. I never noticed that before.

                                                                      Since I have it open now, relevant paragraph on my previous point:

                                                                      The utility of a language as a tool of thought increases with the range of topics it can treat, but decreases with the amount of vocabulary and the complexity of grammatical rules which the user must keep in mind. Economy of notation is therefore important.

                                                                      Notation as a tool of thought

                                                                      1. 1

                                                                        True, it doesn’t. But then the whole paper is a process of abstraction (deriving general patterns from concrete examples, addressing the main thesis case-by-case) that uses APL as a conceptual vehicle.

                                                                        1. 2

                                                                          The design goal is to create a notation that minimizes the need for new names.

                                                                          1. 3

                                                                            APL is like a beautiful diamond – flawless, beautifully symmetrical. But you can’t add anything to it. If you try to glue on another diamond, you don’t get a bigger diamond. Lisp is like a ball of mud.

                                                                            Extension of the base language is at odds with its core design principle, but that surely does not preclude us from defining problems in terms of it. My original questions (not well-formed TBH) were more concerned with the ways this notation can be adapted to a particular context (or vice versa).

                                                                             E.g. the above-mentioned Co-dfns compiler, as far as I understand, implements a Scheme-like subset of Dyalog APL and represents trees as vectors, but not the other way around (extending the language with a tree-like datatype and sprucing all of that up with s-expressions or something).

                                                                             Like I said, J has a set of primitives and mechanics that to me seem language-oriented, so I wondered what compiler/interpreter creation looks like from this point of view, on a small exemplified scale (hence the DSLs), and what the problem-solving mindset (heuristics?) behind it is.

                                                                             FSM state transitions can be modeled with tables, yes, and an AST, as a graph, can use adjacency matrices with eigen-shticks, but this whole approach feels like black magic and a well-guarded trade secret. Maybe because it’s a fork from the mainstream road, off into the academic weeds. Need to check out more of Aaron Hsu’s talks for sure.

                                                                            1. 1

                                                                               As a historical aside, the APL presented in Iverson’s A Programming Language book (pdf) included trees as a basic data structure (page 45 in the book). They weren’t included in APL as first implemented; APL2 later added nested arrays, which serve a similar purpose. Some of the book’s tree operations are in J.

                                                                              1. 2

                                                                                Yes, thanks for mentioning that! The chapter on microprogramming with APL was mind-blowing at the time I read it.

                                                              2. 2

                                                                I learned it mainly by doing hundreds of code golf problems, asking questions on the J IRC channel, reading (good chunks of) J for C programmers and Learning J, as well as the J dictionary and the more accessible NuVoc documentation.

                                                                I’d say it took 1.5-2 years to acquire a feeling of fluency. Much longer than other languages. One crucial practice was solving a problem and then having it edited by experts. Because unavoidably you’ll apply your current paradigms to J, or simply be unaware of J idioms, rather than solving the problem in a “J way.”

                                                                 The best place to get this kind of help these days is The APL Orchard. There’s a handful of very talented J and APL programmers there willing to help.

                                                                By the way, you could get the same mind-expansion from learning APL, if you were more inclined that way. The free book on the Dyalog APL site is also good.

                                                              3. 1

                                                                I’d classify it as functional. It just uses a multi-dimensional array where others use a linked list. Nearly all operators are side-effect free and data is immutable. There are higher order patterns like map-reduce.

                                                                1. 3

                                                                   Array programming languages aren’t “functional” or “immutable” in the way that Erlang and ML are, and I observe beginners notice this only after a few months, probably because these features are used in Array-language applications, and not in the puzzles you practice on to learn how to think in Arrays. Real applications batch and use lots of global variables with delimited names, like Tcl or Perl; J programmers call them locales; K programmers have the K-tree; in APL it’s isolates. And so on.

                                                                  I know other languages have “arrays” but it’s just an unfortunate name collision. Many of us use the term “Iverson languages” instead of “Array languages”, because we’re not trying to confuse things unnecessarily.

                                                                  1. 2

                                                                     To add to the sibling comments: the linked article and code golf expose you to only a part of J. For me it brought a very different mindset, with things like full tacit programming approaching function-level programming [1] (even point-free Haskell feels clumsy in comparison with APL and J), and constructs like hooks and forks [2] permitting composition and chaining of functions with ease.

                                                                    1. 1

                                                                      It’s true that tacit programming in J is functional, but that misses the larger difference. The way you need to think about and frame a problem to solve it idiomatically is usually quite different in J versus, say, Haskell.

                                                                  1. 1

                                                                    I just got around to reading the Softnet papers (pdfs) mentioned in the posted article - they’re short and very interesting.

                                                                    The model of sending code as messages reminds me of the spirit of IPC in q; in its simplest form you can just send q code as a string to another q process, which runs it (and can return you the result).

                                                                    1. 6

                                                                      My first ‘lockdown book’ was Douglas Hofstadter’s Gödel, Escher, Bach - I can heartily recommend it! I’m completely ignorant about formal logic and still enjoyed it tremendously.

                                                                      1. 2

                                                                        I love that book. It is mind bending, and so well-written you don’t need to be a mathematician to understand it.

                                                                        1. 1

                                                                          Would it work in audio book form? Are there a lot of graphics?

                                                                          1. 3

                                                                            Sadly I think it would suffer - it features some lovely hand-drawn diagrams and many Escher drawings.

                                                                        1. 3

                                                                          The strided representation of multidimensional arrays allows a lot of really nice properties to just sort of naturally fall out. I find the representation particularly beautiful.

                                                                          1. 4

                                                                            I do too! Phil Abrams’ 1970 thesis on An APL Machine (pdf) has a really interesting discussion of ‘beating’, a process introduced to use the strided representation to carry out various operations (transpose, rotate, take, drop, …) efficiently.
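
                                                                             NumPy makes a convenient stand-in for seeing why beating works: these operations are pure offset/stride arithmetic, so no elements move (my example, unrelated to Abrams’ actual machine):

                                                                             import numpy as np

                                                                             a = np.arange(12).reshape(3, 4).copy()  # a owns its buffer

                                                                             # Transpose just swaps the stride tuple; the buffer is untouched.
                                                                             assert a.T.strides == a.strides[::-1]

                                                                             # Dropping a row and taking every other column is likewise only
                                                                             # stride arithmetic: the result is a view on the same data.
                                                                             assert a[1:, ::2].base is a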

                                                                             It also introduces ‘drag-along’, which is a sort of lazy evaluation of APL expressions - I actually don’t know if this is used widely/at all in APL implementations. For example, it allows you to calculate 3↑2×-V by only negating and doubling the first three elements of V, rather than working on the whole vector V and then taking the first three elements at the end (example from An APL Machine). I’d like to try implementing drag-along one day.

                                                                            1. 2

                                                                              Thank you for that, I’ll look over that this weekend!

                                                                              It predates APL2/nested arrays; I wonder if it would be difficult to add those in to the design (without having read any of the paper it’s going to be either a non-issue and trivial, or impossible).

                                                                              1. 2

                                                                                Tangentially: Sometimes I imagine a parallel universe in which APL didn’t add nested arrays but instead implemented the notation for trees in the ‘A Programming Language’ book. Trees seem more natural to me than nested arrays, so since trees are in the book anyway it seems curious to me that nested arrays were chosen. Some of the tree operations from the book were added to J.

                                                                                The tree notation seems nice for some things - e.g. if you have a probability tree T then the probability to reach each leaf is ×/T and the probability to reach each node is ×\T (if / and \ are taken as acting along paths). And writing // for folding across levels, +//×\T gives a vector of ones (total probability at each level is one).
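
                                                                                 As a toy Python rendering of those path operations (my own parent-pointer encoding, not the book’s tree notation):

                                                                                 # A probability tree: parent pointers plus the probability of
                                                                                 # the edge entering each node (1.0 for the root).
                                                                                 parent = [0, 0, 0, 1, 1]            # root 0 -> (1, 2), 1 -> (3, 4)
                                                                                 p      = [1.0, 0.3, 0.7, 0.5, 0.5]

                                                                                 def path_prob(i):
                                                                                     # The ×\ idea: multiply edge probabilities along the path to i.
                                                                                     out = p[i]
                                                                                     while parent[i] != i:
                                                                                         i = parent[i]
                                                                                         out *= p[i]
                                                                                     return out

                                                                                 # The leaves are 2, 3 and 4; their path probabilities sum to one.
                                                                                 assert abs(sum(path_prob(i) for i in (2, 3, 4)) - 1.0) < 1e-9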

                                                                              2. 2

                                                                                 It also introduces ‘drag-along’, which is a sort of lazy evaluation of APL expressions - I actually don’t know if this is used widely/at all in APL implementations. For example, it allows you to calculate 3↑2×-V by only negating and doubling the first three elements of V, rather than working on the whole vector V and then taking the first three elements at the end (example from An APL Machine).

                                                                                 Interesting. I have some tentative plans to implement something like that in my apl interpreter, but I didn’t know there was any interest in it already. In that particular example, I don’t think it gains you very much; you can just rewrite it as 2×-3↑V, which is probably clearer in context. (Or even ¯2×3↑V.) I’m much more interested in it for reductions. If you have +/⍳1e9, there’s no reason that should take up 8gb of memory. So you can make ⍳ return a generator, which is basically a bit of state plus a function that can return the next number in the sequence. (Actually, you probably want to make it produce 1000 or so numbers at once, but that’s a separate refinement.) A related paper is Extending APL to Infinity, which wants to implement those ideas as part of the language itself, rather than just the implementation. Personally, I don’t think the method presented in that paper is the right way to do it, but it’s an interesting proposal.
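
                                                                                 Python’s range shows the same trick: the index vector is a recipe, not an allocation, so the reduction runs in constant memory (though CPython still iterates):

                                                                                 total = sum(range(10**9))  # the +/⍳1e9 idea: no billion-element array is built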

                                                                                1. 1

                                                                                  my apl interpreter

                                                                                  If I may ask, is your interpreter available anywhere? I’d love to see it.

                                                                                   I’m much more interested in it for reductions. If you have +/⍳1e9, there’s no reason that should take up 8gb of memory

                                                                                  So after my first quick read of the paper in question, it defines what are called “J-vectors”. Basically, a triple of (direction,start,count). I don’t know how that compares to the “Extending APL to Infinity” mechanism; thanks for the link there!

                                                                                  1. 1

                                                                                     Thanks, that looks like an interesting paper! Will read it slowly.

                                                                                    There’s a cool language FAC (A Functional APL Language) designed by H. Tu and Alan Perlis which adds infinite arrays and laziness to APL too, e.g. from the IEEE paper:

                                                                                    A FAC array is a lazy data-structure where any array element is computed by demand. It means that each index access A[I] behaves as a function call A(I), not as a memory access.

                                                                                    From my skim it seems to be similar in spirit to the infinite arrays in the Extending APL paper (representing them as functions on indices).

                                                                                    Also in one of the more recent versions of k ⍳/! really did return an unevaluated range which I think acted like a generator - but it looks like in the version I just downloaded this has been removed…

                                                                                    1. 2

                                                                                      Also in one of the more recent versions of k ⍳/! really did return an unevaluated range which I think acted like a generator - but it looks like in the version I just downloaded this has been removed…

                                                                                      It was probably k7. I don’t think you can download it anymore (and I can’t find my old copy…), but there’s a web version, where !1000000000 evaluates correctly. The capabilities of that version seem to be pretty limited, though; 1+!1000000000 works, but +/!1000000000 and 2*!1000000000 give a ws error.


                                                                                      From my skim it seems to be similar in spirit to the infinite arrays in the Extending APL paper (representing them as functions on indices).

                                                                                      Right. There’s a big problem with this. Without the ability to make semantic assertions about these functions, it’s difficult to use them effectively. They give an example which computes e; here is another for π:

                                                                                      π ←→ 4×+/1,(¯1*⍳_)÷(1+2×⍳_)  ⍝remove '1,' when ⎕io≡0
                                                                                      

                                                                                      The problem being, of course, that you can’t actually apply reductions to arbitrary unbounded sequences. First off, reduction is right-associative, so something like -/⍳_ won’t work, since you’d have to start at the end of the sequence, and there is no end. So, fine; restrict it to associative functions. Maybe even just + and ×. Well, then you need the sequence to converge; either to 0 in the case of +, or to ±1 in the case of ×. (Or, for ×, it can have a 0 anywhere, but that’s not very interesting.) An actual APL implementation doesn’t know that, though; it will just keep applying the reduction until the difference between successive elements of the scan is less than some epsilon. So what do you do in the case of a sequence where the first 6000 elements look like they’re going to converge, but then the sequence diverges wildly? It’s not a particularly likely event, but it is possible, and it will happen at some point, and then the language will have failed its users.
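
                                                                                       For a feel of how slowly that Leibniz series converges (a quick Python check, my own):

                                                                                       import math

                                                                                       s = 0.0
                                                                                       for n in range(2000):
                                                                                           s += 4 * (-1) ** n / (2 * n + 1)
                                                                                       print(s, math.pi)  # ~3.14109 vs 3.14159...: the error shrinks only like 1/n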

                                                                                      My proposed solution is a full-on computer algebra system based on APL. So you can have infinite sequences, but the language knows about them rather than just blindly querying a function for their elements. If you try to apply a reduction to one, the implementation has to prove that the scan converges, and then returns a symbolic result, rather than just one that happens to be within an epsilon of the previous result.


                                                                                      Incidentally, though, I think modeling arrays as functions that map indices to elements is a great idea. (As a part of the language, not the implementation.) I’m currently working on an informal paper showing that, with forks and another almost inconsequential extension, arrays are isomorphic to a subset of functions. (It’s even better: we get prefix agreement with scalar functions.) It also leads to a generalised version of the rank operator, and with another small extension it can replace outer product too.

                                                                                      1. 1

                                                                                        Thanks this is very interesting! For what it’s worth I would certainly use an APL-ish computer algebra system. I used to use Mathematica a bit and something capturing its symbolic capability in a small number of primitives would be really fun to use I think (could be fun choosing the primitives too).

                                                                                        Also your informal paper sounds intriguing - I’d certainly be very interested to read it when it’s done!

                                                                                2. 2

                                                                                   Yeah, I thought that was cool too. It certainly helps to explain why APL-ish language REPLs are so blazingly fast. I wonder what other tricks they utilize to minimize data movement…

                                                                                1. 3

                                                                                  Credit to @marmoset for showing me this.

                                                                                  1. 2

                                                                                    Thanks for posting this! Looking through it again really makes me want to try coding a toy version of the D and E machines - chapter IV seems very detailed with lots of examples (and pseudo-APL for the APL machine in an appendix - what a lark!). Also I just spotted that the D and E machines are described in sections D and E of chapter IV which is a nice touch :p

                                                                                  1. 2

                                                                                    Rota and Baclawski’s book on probability may also be interesting for lobste.rs - pdf made available by David Ellerman here. They introduce multisets in chapter one (1.11) and use them to look at Bose-Einstein statistics (2.21) using ‘multiset coefficients’ which have some cool analogies with binomial coefficients (2.25). I think of multisets as more of a programming thing so it’s interesting for me seeing them in maths (maybe this just reflects my inexperience in both!).

                                                                                    1. 6

                                                                                      I’ve been running a gopher server for several years now. Is anyone else here running a gopher server?

                                                                                      1. 7

                                                                                        I do, but the Gemini protocol is more appealing to me: https://gemini.circumlunar.space/

                                                                                        1. 3

                                                                                          I also run a Gemini server that I wrote. I also wrote my own gopher server.

                                                                                          1. 2

                                                                                            I mostly run a Gemini capsule, but scrape websites and serve those sites up over Gopher for myself.

                                                                                          2. 4

                                                                                            coreboot.org has some basic gopher presence because, why not?

                                                                                            1. 1

                                                                                              Maybe that gives a shorter path than http/html for bootstrapping a system (without many userland utilities installed yet) and accessing the coreboot.org page from the coreboot-ed system (just for fun, maybe).
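
                                                                                              To make the “shorter path” point concrete: a complete (if naive) gopher client really is just a socket, one request line, and a read-to-EOF. A minimal Python sketch (mine, purely illustrative):

                                                                                                import socket

                                                                                                # A complete-if-naive gopher client: open a TCP connection, send one
                                                                                                # selector line, read until the server closes. (Illustrative sketch only.)
                                                                                                def gopher_fetch(host, selector="", port=70):
                                                                                                    with socket.create_connection((host, port)) as s:
                                                                                                        s.sendall(selector.encode("ascii") + b"\r\n")
                                                                                                        chunks = []
                                                                                                        while data := s.recv(4096):
                                                                                                            chunks.append(data)
                                                                                                    return b"".join(chunks).decode("ascii", "replace")

                                                                                                print(gopher_fetch("coreboot.org"))   # fetch the root menu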

                                                                                              1. 1

                                                                                                Is it at gopher://coreboot.org?

                                                                                                1. 1

                                                                                                  Yes

                                                                                              2. 4

                                                                                                My phlog is happily serving the markdown sources of my HTML blog. I recently cleaned up the sources to be more gopher-friendly.

                                                                                                I now want to use gopher more for serving random thoughts/notes quickly without worrying about style.

                                                                                                1. 3

                                                                                                  I started this week! I just have a test page up for now, though. Also, in my first explorations I came across your gopher hole through a phlog aggregator, as it happens!

                                                                                                  1. 2

                                                                                                    Great! AFAIK, the tildeverse servers do their best to support gopher. You can also use their gopherproxy to browse a gopher hole on the web, e.g. gopher://gopher.conman.org becomes https://gopher.tildeverse.org/conman.org

                                                                                                    1. 1

                                                                                                      Yes, dual-hosting with HTML. I use https://code.z0.is/notwiki/ for that.

                                                                                                      1. 1

                                                                                                        Floodgap admin here.

                                                                                                        1. 1

                                                                                                          Not yet, but I’ve revived an old project I intended to start five years ago to build one. It only took a few days of occasional work to build a pretty complete Gopher server. I just have to get around to implementing executable gophermaps and possibly CGI support.

                                                                                                          After I’m done with that, I’m considering implementing my own Gemini server.
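
                                                                                                          For anyone curious why a few days of occasional work suffice: the wire protocol is a single selector line in and tab-separated menu lines out. A toy sketch in Python (my own illustration, not the server described above):

                                                                                                            import socketserver

                                                                                                            # Tab-separated menu lines: type, display text, selector, host, port.
                                                                                                            # ('i' info lines are a widespread de-facto extension, not in RFC 1436.)
                                                                                                            MENU = (
                                                                                                                "iWelcome to a toy gopher hole\tfake\t(NULL)\t0\r\n"
                                                                                                                "0About this server\t/about.txt\tlocalhost\t7070\r\n"
                                                                                                                ".\r\n"                     # a lone dot terminates the menu
                                                                                                            )

                                                                                                            class GopherHandler(socketserver.StreamRequestHandler):
                                                                                                                def handle(self):
                                                                                                                    # The entire request is one selector line.
                                                                                                                    selector = self.rfile.readline().strip().decode("ascii", "replace")
                                                                                                                    if selector in ("", "/"):
                                                                                                                        self.wfile.write(MENU.encode("ascii"))
                                                                                                                    elif selector == "/about.txt":
                                                                                                                        self.wfile.write(b"Just a sketch.\r\n.\r\n")
                                                                                                                    else:
                                                                                                                        self.wfile.write(b"3Not found\terror\t(NULL)\t0\r\n.\r\n")

                                                                                                            if __name__ == "__main__":
                                                                                                                # Real gopher is port 70 (needs root); 7070 is fine for testing.
                                                                                                                with socketserver.TCPServer(("", 7070), GopherHandler) as server:
                                                                                                                    server.serve_forever()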

                                                                                                        1. 1

                                                                                                          I really wanted this to be “From Ordinary Differential Equations to J” but this is also ok

                                                                                                          1. 2

                                                                                                            After reading this I couldn’t resist solving the harmonic oscillator numerically in q…
                                                                                                            (last line is just ASCII plotting)

                                                                                                            q)a:2 2#1 0 0 1+0.01*0 1 -1 0              / Euler step matrix I + h*J with h=0.01
                                                                                                            q)z:2000(a$)\1.0 0                         / scan: apply a 2000 times, keeping every state
                                                                                                            q)" *"[(3-til 7)=\:"j"$3*z[25*til 50;0]]   / index " *" with a 7x50 boolean grid: the plot
                                                                                                            "***                    *****                    **"
                                                                                                            "   **                **     **                **  "
                                                                                                            "     *              *         *              *    "
                                                                                                            "      *            *           **           *     "
                                                                                                            "       **        **              *         *      "
                                                                                                            "         **     *                 **     **       "
                                                                                                            "           *****                    *****         "
                                                                                                            
                                                                                                            1. 1

                                                                                                              No, this is ok. ;)

                                                                                                              EDIT: actually, do you have a link to From Ordinary Differential Equations to J?

                                                                                                              1. 2

                                                                                                                From Ordinary Differential Equations to J

                                                                                                                Unfortunately I haven’t found this yet. Might have to learn me some APL and write it myself.

                                                                                                                1. 4

                                                                                                                  Oh. I get it. “Ode” as an acronym. In my defense I hadn’t had any coffee yet.

                                                                                                                  If you’re interested in that sort of thing, you might like this: http://www.softwarepreservation.org/projects/apl/Books/CollegeMathematicswithAPL/view

                                                                                                            1. 3

                                                                                                              I just could not learn calculus with limits; I had to go back to the old infinitesimals books to wrap my head around it. I ended up with a book from the early 1900s where everything makes much more sense (to me).

                                                                                                              1. 2

                                                                                                                You may enjoy nonstandard analysis! I am also very fond of infinitesimals. I’ve looked at the Keisler books and also a while back I was enjoying going through ‘Lectures on the hyperreals’ by Goldblatt.
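
                                                                                                                To give a taste (my gloss, not a quote from Goldblatt): instead of a limit you take a genuine nonzero infinitesimal ε and keep only the standard part of the difference quotient:

                                                                                                                  % derivative via infinitesimals (st = standard part):
                                                                                                                  f'(x) = \operatorname{st}\!\left(\frac{f(x+\varepsilon) - f(x)}{\varepsilon}\right)
                                                                                                                  % e.g. f(x) = x^2: \frac{(x+\varepsilon)^2 - x^2}{\varepsilon} = 2x + \varepsilon,
                                                                                                                  % and \operatorname{st}(2x + \varepsilon) = 2x.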

                                                                                                              1. 2

                                                                                                                Has anyone tried running Arch Linux ARM on one of these? I’m not sure which platform (if any) would apply.

                                                                                                                https://archlinuxarm.org/

                                                                                                                1. 4

                                                                                                                  Not quite - but I’m using the Manjaro ARM Pinebook Pro image (posting this with it!). You can get it at https://manjaro.org/download under Editions→ARM→Pinebook Pro. Installation was completely smooth and it’s working great for me so far! I uninstalled XFCE and LightDM and am using dwm now. The Manjaro ARM image has a 64-bit userspace rather than the 32-bit userspace that was on the Pinebook when it arrived (I found this out since I got the Pinebook in part to play with AArch64 assembly).

                                                                                                                1. 3

                                                                                                                  Thanks very much for posting this video! Even from the first segment, I had never thought about how RPN resembles the pencil-and-paper way of doing sums - I will chip away at the rest next week.

                                                                                                                  It’s at the opposite end of the spectrum, but another old RPN calculator which may be interesting for lobste.rs is the HP-16C, the computer programmer’s calculator. As well as all sorts of bit-twiddling operations, it could do some fun stuff like freely choosing the word length and complement type. Prefer your arithmetic PDP-1 style? You can use 18-bit words and 1’s complement with

                                                                                                                  1 8 f WSIZE f 1’s

                                                                                                                  Messing around with one of these got me into Forth…
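
                                                                                                                  Those mode settings are fun to model in software too. A toy sketch (my own, nothing to do with HP’s actual firmware) of what 18-bit 1’s-complement addition looks like, end-around carry and all:

                                                                                                                    # Toy model of the 16C's "1 8 f WSIZE, f 1's" configuration: 18-bit
                                                                                                                    # words, one's-complement addition with end-around carry.
                                                                                                                    BITS = 18
                                                                                                                    MASK = (1 << BITS) - 1

                                                                                                                    def to_ones_complement(n):
                                                                                                                        """Encode a small signed int as an 18-bit one's-complement pattern."""
                                                                                                                        return n & MASK if n >= 0 else ~(-n) & MASK

                                                                                                                    def ones_complement_add(a, b):
                                                                                                                        """Add two 18-bit one's-complement values, wrapping the carry back in."""
                                                                                                                        s = a + b
                                                                                                                        if s > MASK:                # carry out of bit 17: end-around carry
                                                                                                                            s = (s & MASK) + 1
                                                                                                                        return s & MASK

                                                                                                                    # 5 + (-3) = 2
                                                                                                                    print(oct(ones_complement_add(to_ones_complement(5), to_ones_complement(-3))))  # 0o2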