1. 5

    This comes right on time; I’m attempting to introduce formal models to my team at $WORK, and have been going through so many materials (lots of them by you) these past few days. Will check this out today! Thank you!

    1. 5

      PLEASE let us know how it goes. I’ve got a couple of failures under my belt when it comes to such efforts. Would love to compare notes and learn.

      1. 3

        I have yet to have it catch on at work at all. Even lighter-weight things like property-based testing are outside of the comfort zone of a lot of people. I will say, what has piqued the most curiosity is when you show someone something relevant to what they’re working on, instead of speaking philosophically / abstractly.

        For example, today I was able to draw up a quick proof in Isabelle to show that introducing a feature flag to a code path wouldn’t break the existing behavior when the flag was disabled. Simple example, but it was received well.
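        The shape of that check can be sketched outside Isabelle too, as a plain property test (illustrative Python; `old_path`, `new_path`, and the flag are hypothetical stand-ins, not the code from the actual proof):

```python
def old_path(x):
    # Existing behavior that must not break.
    return x * 2

def new_path(x, flag_enabled=False):
    # New code path guarded by a (hypothetical) feature flag.
    if flag_enabled:
        return x * 2 + 1  # new behavior, only when the flag is on
    return old_path(x)

# Property: with the flag disabled, the new code path agrees with the
# old one on every tested input.
assert all(new_path(x) == old_path(x) for x in range(-1000, 1000))
```

        The Isabelle proof establishes this for all inputs; the test only samples them, but it makes the claim concrete.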

        1. 3

          Now that I think about it, property-based testing sounds like a much easier sell than “formal methods” proper. It’s really just a variation on unit testing, which is official industry “best practice” now (for good or ill). For programmers who already write lots of asserts, the notion of invariants should already be familiar. And once you have your team sold on property-based testing, formalizing invariants in a spec should be a little easier (I hope).

          1. 2

            Absolutely - properties and theorems are expressed the same way, theorems just must be proven. With property-based testing, you’re just evaluating the property with tons of input data and saying “even though we haven’t proven this, we’ve never observed it to be false.” That’s empiricism at its finest, and it’s something people are more open to than math proofs.

            This also offers a way to start with property-based testing and “upgrade” to proof if you choose to. The Cogent team has written about this; they’re using exactly that workflow for formally verified file system implementations. No use trying to prove something that doesn’t even pass a property-based test.

            The way I sell property-based testing is that it’s also generally way less actual work to write the test. If you have to write test data generators, they’re written one time and reused between different properties. Test setup can be soul crushing, so it’s a way to automate the boring stuff away. That’s in line with the “practical programmer” ethos.
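            A minimal sketch of that economy, in illustrative Python (a hand-rolled checker rather than a real library like Hypothesis): the generator is written once and shared by several properties.

```python
import random

def int_lists(seed=0, count=200):
    # Test data generator: written one time, reused by every property below.
    rng = random.Random(seed)
    for _ in range(count):
        yield [rng.randint(-1000, 1000) for _ in range(rng.randint(0, 20))]

def check(prop):
    # "Even though we haven't proven this, we've never observed it false."
    for xs in int_lists():
        assert prop(xs), f"property failed for {xs!r}"
    return True

# Two invariants of sorting, sharing the same generator.
assert check(lambda xs: sorted(sorted(xs)) == sorted(xs))  # idempotent
assert check(lambda xs: len(sorted(xs)) == len(xs))        # length-preserving
```

            Each new property is one line; the boring setup lives in the generator.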

        2. 1

          I keep trying to introduce this on a team-wide scale at $WORK, but so far it’s really just crazy ol’ kstatz12 tinkering in his imagination closet

          1. 1

            I can’t even get my colleagues to read my informal correctness proofs, so not sure how I’d sell them on formal methods…

            1. 1

              Tell them that folks practicing formal methods at a $dayjob today will be the ‘data scientists with exuberant pay options’ of tomorrow.

              My thinking is that the ‘breakthrough’ will be called ‘compile-time testing and verification’, and it will happen when major languages and their toolchains (including compile-time debugging) integrate formal methods into them.

              There will be ‘learn formal methods in 21 days’ books and boot camps, a million YouTube videos on how to ‘Ace an interview for compile-time testing and verification jobs’, recently graduated students coming to your place of work and asking when you are going to get off the dinosaur way of doing things.

              There will be companies going public that list ‘formally verified’ software as their ‘competitive advantage’. Manufacturing and robotics firms will be touting that their formally verified software powers the world… And so on.

              … And, perhaps, data science will be remembered as a historically relevant attempt to derive business value from log files :-)

        1. 5

          I might get into gamedev. Join some local group or something. I have no gamedev experience at all, but it sounds like a good area to do rewarding work with others and socialize a bit with fun people.

          1. 1

            It is very fun! I thoroughly recommend it

          1. 6

            Getting ready for my new position, which I start on Monday! That mostly means relaxing tbh, but I may do a small project in Elixir to refresh my knowledge.

            Also, tomorrow my parents and my partner’s parents will meet for the first time

            1. 1

              Congrats! What is your new position all about? Hopefully you’ll be writing some Elixir there!

            1. 22

              Last week I said I was very burnt out, so this week I quit my job and I’m starting at another place in a few weeks. Will be doing shakuhachi and Japanese like last weekend, plus meeting some work friends for a going-away dinner tomorrow night.

              1. 2

                Best of luck!

                1. 1

                  Thanks!

              1. 13

                My daughter was born last Friday. I was fortunate enough to arrange three weeks of parental leave where I work, and that’s naturally what I’m at this weekend too.

                Interestingly enough, I found dabbling in my pet projects between the chores and the little one’s feedings is quite doable. It feels like permanent tiredness and slight sleep deprivation allow for easier focusing, somehow? Or perhaps it’s just less coffee throughout the day.

                1. 5

                  Congrats

                  1. 2

                    Thanks!

                  2. 4

                    Congratulations :)

                    1. 2

                      Thank you!

                    2. 2

                      Congratulations! Glad you’re able to find time for tech as well.

                      1. 2

                        Thanks! Was a bit afraid that the rest of my personal life would be completely derailed, but it’s going well.

                      2. 2

                        Congrats and best of luck. 🎉 Mine are both teenagers but I still remember those early days.

                        1. 1

                          Thanks! Our older is 17 now, so our recollections are quite dim :)

                      1. 9

                        Nothing programming related; I’m VERY burnt out right now. Continue with shakuhachi and Japanese practice.

                        Hopefully play some Go with the gf; it’s been a while since I’ve been able to sit and play some relaxed Go. The last weeks have been… horrible. I’m about to crack.

                        1. 3

                          I hope things get better for you soon.

                          1. 1

                            Are you burnt out from work?

                            1. 5

                              Yes, very :/

                              The environment has become very aggressive and toxic.

                          1. 1

                            Shakuhachi practice; have been putting in 30-40 minutes a day. This instrument is HARD.

                            Building a YouTube archiver service; TubeArchivist is too heavy (it needs Elasticsearch and Redis, all to handle a couple hundred videos, which seems a bit much).

                            Continue with Japanese immersion, trying to do 1-2 hours a day.

                            1. 3

                              This has been a horrible week work wise, so I’ll be just chilling, playing Go and practicing the shakuhachi.

                              1. 2

                                Vacations are over so:

                                • Attempt to survive the week
                                1. 4

                                  The Lisp Game Jam starts on Thursday: https://itch.io/jam/spring-lisp-game-jam-2021

                                  I’m going to be making a game using the TIC-80 retro game dev platform: https://tic.computer

                                  1. 3

                                    I saw this, and am thinking of joining since my Lisping has improved a lot these past weeks.

                                    Excited to see your work this time; loved the FPS game from the last jam.

                                    1. 2

                                      OOH, been doing a lot of common lisp lately, might join!

                                    1. 7

                                      At work, just pushing through the week, trying to not get burnt

                                      Personal is where I have mostly Go things going on:

                                      • Building a goban (Go board): bought some wood last week; need to set up my plane and start shaping that wood.
                                      • Continue reading Lessons in the Fundamentals of Go (it’s pretty dense, super interesting, I reread each chapter like 4 times)
                                      • Continue playing games everyday, trying to improve my reading and strategy
                                      • Maybe sign up for the league + reviews at nordic go dojo
                                      • Maybe buy some shell & slate stones

                                      So mostly continue with my Go obsession

                                      1. 1

                                        A go board needs to be shaped? I thought they were just flat with lines burned into them.

                                        1. 2

                                          Yeah! Shaped into flatness! Hahaha, the wood is pretty rough; it needs to be cut, flattened, sanded, and treated

                                        2. 1

                                          Which kind of design do you plan for your goban: plank-style, or the more traditional big-chunk-of-wood style?

                                          1. 2

                                            A tabletop but thick (5-6cm) traditional goban. When I finish this I will probably do a floor goban with legs if I find the time!

                                        1. 2

                                          It’s been some reaaaally long weeks at work. I need to get out of the house. Also need to work a little bit on my own projects to regain sanity.

                                          1. 2

                                            $WORK: Big project, lots of work

                                            $PERSONAL: Caddy made me realize it’s super nice to be able to modify configuration remotely without files and deploys and other annoying stuff. Will finish my own “Caddy” for DNS, a little DNS server written in Common Lisp, where I can inspect and change configuration from inside emacs. After this will probably do the same for a server (Caddy is nice, but I don’t like the JSON interface to it)

                                            1. 4

                                              Pottery, pottery and lots of pottery. Couldn’t go to the studio this week because of work and I really need it

                                              1. 1

                                                I just learned about kintsugi!

                                              1. 3

                                                So I got my shakuhachi (尺八, literally ‘8 shaku’, a shaku being a Japanese unit of length; amusingly, it measures 1.8 shaku, not 8) yesterday in the mail (bought from here). This instrument has an interesting history: for a long time it was only allowed to be played by monks of a particular Buddhist sect, the Fuke sect, which used it for meditation. It was forbidden for everyone else. I’ll be practicing it every waking hour, much to the dismay of my neighbours.

                                                It is notoriously difficult to make a note with it. I spent a couple of hours yesterday and a couple of hours today, and it seems like I can finally play a (terrible-sounding) note consistently. Will keep practicing over the weekend to maybe make it sound less terrible and try to learn a simple tune by Monday. I’m starting classes next week.

                                                I’ll leave you with a couple of shakuhachi pieces I enjoy listening to:

                                                1. 2

                                                  Have an additional shakuhachi piece: https://youtu.be/jIZbVEGIHGk. A seriously impressive cover of Cory Henry’s solo on Lingus (originally by Snarky Puppy).

                                                1. 1

                                                  Deepen my understanding of the actualism method – which knocks the socks off mindfulness and meditation – by reading more about it, all the while applying it ever more consistently in life.

                                                  1. 1

                                                    Why do you say it knocks the socks off mindfulness and meditation? What are you reading about it?

                                                    1. 1

                                                      Mindfulness / meditation is really only about acceptance[^1]: waiting for the feeling to pass away on its own (sooner or later), not about getting rid of it, or never having it occur again, so that you can spend increasingly more of your time enjoying and appreciating life.

                                                      There are pages of correspondences on this topic, beginning from http://www.actualfreedom.com.au/richard/selectedcorrespondence/sc-buddhism.htm

                                                      [^1]: And in Theravada school, it is about replacing “bad” feelings with the “good” feelings (whereas in actualism both get replaced by the felicitous/innocuous feelings). In an actual freedom (ultimate goal), all feelings are gone.

                                                  1. 5

                                                    Nice intro.

                                                    I may not use J in my everyday programming, but I’m glad I learned it and think it made me a better programmer.

                                                    This was my experience too, though I continue to use J almost daily even if not professionally.

                                                    Far more than any other language, including Haskell, learning J felt like dropping acid and blasting away everything I thought I knew about programming.

                                                    It’s the third paradigm, and unfortunately the least well-known:

                                                    1. Procedural
                                                    2. Functional
                                                    3. Array

                                                    It really is a whole new mindset.

                                                    1. 7

                                                      How did you learn it? What kind of projects would you recommend to learn such a language to discover its benefits?

                                                      1. 4

                                                        Personally, I stuck to number-crunching a-la Project Euler and some Advent of Code challenges.

                                                        Also tried to use J for bio-signal processing (a set of student labs built around calculating various metrics of heart-rate variability), but after looking at the solution the professor went “Nuh-uh” and I had to switch to R, which IMO wasn’t that bad of a compromise in terms of paradigm shift.


                                                        The most interesting point that I don’t see being addressed in most of the J write-ups is how one can use it to create ad-hoc calculi of various sorts, even problem-specific DSLs. Function composition, monadic/dyadic distinction and tacit programming surely allow for that, yet the “how” of it isn’t widely discussed.

                                                        1. 1

                                                          I had to switch to R, which IMO wasn’t that bad of a compromise in terms of paradigm shift.

                                                          That’s a shame: I think R is basically the same as numpy - no real paradigm shift, just more/different functions, so I hope you’ll give Iverson-ish languages another try, because by focusing on the array-ness you might’ve missed something incredible.

                                                          in most of the J write-ups is how one can use it to create ad-hoc calculi of various sorts, even problem-specific DSLs

                                                          I’m very curious to understand better what you’re looking for here. What’s a kind of problem you’d like to see?

                                                          1. 2

                                                            I think R is basically the same as numpy

                                                            That’s true, and NumPy is basically an ugly cousin of APL ;) What I rather meant is that sacrificing J’s expressiveness was a bearable trade-off in the context of my task, and that in the end I didn’t really need most of it: just reusing R’s libraries and mashing matrices together sufficed. Productivity and acceptance by peers over cleverness and personal satisfaction.

                                                            I’m very curious to understand better what you’re looking for here

                                                            “Notation as a tool of thought” I think. Is J a means to an end as it is given, or does it permit/encourage building linguistic abstractions on top?

                                                            APL is often praised for its lyricality, and J specifically describes its parts of speech as nouns, verbs, adverbs, and conjunctions; but does it end there? In J, there are ways to assign obverses and adverses to a verb, to define its monadic and dyadic versions, to create adverbs with conjunctions; and then there are tree-manipulating primitives, even ;: for tokenizing, and boxes that permit manipulating J code as data. Is any of that intended for meta-programming, AST building, and ultimately DSL creation?

                                                            Can an expert in their domain take J and carefully design a set of composable primitives for solving specific problems? If so, what is the process behind that, and what does the end result look like? Is it a notation that reads and writes like poetry, or is it a prosaic library? Is this even a normal practice in the APL family?

                                                            So, I guess what I am looking for is a walkthrough of the points I raised, with some interesting problem as its subject; both about the mindset and its execution. I’m too much of a puny human to grasp Aaron Hsu’s thesis on the compiler that he built with Dyalog APL, but the basic premise is roughly the same as what I have in mind. His live stream on this compiler’s design and architecture is also worth watching.

                                                            1. 3

                                                              Is J a means to an end as it is given, or does it permit/encourage building linguistic abstractions on top?

                                                              Permit? Probably.

                                                              Encourage? I’ve not seen this personally. In fact, the ethos of J/APL seems the opposite of this. The big promise of the language is that with a single set of ~50 primitives you can elegantly solve the vast majority of practical problems. So there’s no need for bespoke DSLs for your different applications.

                                                              As a side note, my experience with Ruby has convinced me that such DSLs are usually a mistake.

                                                              1. 3

                                                                A case in point: in the above-mentioned talk from Aaron Hsu, he shows how he defined PEG parser combinators and used them to write parsing expressions in a seemingly declarative manner.

                                                                Surely that doesn’t qualify as a DSL, but the ability to re-define a problem in terms of these 50+ vector-manipulating primitives is what has always fascinated me in APL. There’s an inherent, almost visceral “spatiality” to this act. At times it feels that the more terse your code is, the closer it is to some sort of underlying ur-computation that the world is built of.

                                                                Perhaps my original question is not about how to extend the language towards the problem, but how to ground the problem in the notation. Or perhaps how to design a notation with APL-ish qualities that will influence how I think with it and create in it?


                                                                The APL family posits a certain view of programming, but how does one see everything through its lens? Is it just linear algebra with mathematical know-how? Practice?

                                                                An example from my experience: in the first semester of DSP I really struggled with the concept of convolution; I simply couldn’t grasp it on an intuitive level. At least in my native language “convolution” sounds more like “folding” or even “coagulating”, but what folds with what? What is the physical meaning? Implementing it in a C-level language only obscured the question behind semantic noise; by-the-book definitions didn’t help either.

                                                                And then one day I just typed [:+//.*/ into the J console and it clicked. In a sense, if two signals are threads, then their convolution is a helical braid, the one that you create by coiling one thread over another: a sum +/ of the oblique /. diagonals formed by multiplicative * intersections.

                                                                There’s a scene in “Matrix: Revolutions” where blinded Neo says to Smith in Bane’s body “I can see you” — that’s the kind of feeling I got from this experience.
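                                                                That reading of [:+//.*/ can be sketched in plain Python (an assumed translation, not the J evaluator): convolution as sums over the anti-diagonals of the two signals’ multiplication table.

```python
def convolve(a, b):
    """Discrete convolution as anti-diagonal sums of the outer product,
    mirroring the J phrase [:+//.*/ (oblique sums of a times-table)."""
    # Outer product: table[i][j] = a[i] * b[j]
    table = [[x * y for y in b] for x in a]
    # Anti-diagonal k collects every table[i][j] with i + j == k.
    out = [0] * (len(a) + len(b) - 1)
    for i, row in enumerate(table):
        for j, v in enumerate(row):
            out[i + j] += v
    return out

# (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
assert convolve([1, 2], [3, 4]) == [3, 10, 8]
```

                                                                The braid image is literal here: thread i of one signal crosses thread j of the other at table[i][j], and each output sample gathers one diagonal of crossings.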

                                                                1. 2

                                                                  The APL family posits a certain view of programming, but how does one see everything through its lens? Is it just linear algebra with mathematical know-how? Practice?

                                                                  Practical answer: doing many problems, getting it “wrong”, and then seeing a more natural way to do it. It seems like you already know this though, based on your very nice example.

                                                                  Larger answer: I’ve been meaning to write about this, but tldr… two words come to mind: geometrical and holistic. Think of the game of life in APL. In J/APL think, you don’t view the problem from the point of view of individual cells. Instead, you look at the whole plane at once, and imagine eight copies of that plane stacked on top of one another – one plane for each directional shift. Then the neighbor count is simply the +/ of those planes.

                                                                  This pattern appears a lot. You take the view from above, look at “everything at once,” and there’s a natural and simple way to express that with array computations. It doesn’t always work, though. Some problems aren’t a good fit for array thinking, but a surprising number are.
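                                                                  That stacked-planes view can be sketched with NumPy (illustrative code; life_step is a made-up name): the neighbor count is the sum of eight rolled copies of the whole board, with no per-cell loop.

```python
import numpy as np

def life_step(board):
    # Eight directional shifts of the whole plane, stacked and summed:
    # the neighbor count for every cell at once.
    shifts = [np.roll(np.roll(board, dy, axis=0), dx, axis=1)
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)
              if (dy, dx) != (0, 0)]
    neighbors = sum(shifts)
    # Birth on 3 neighbors; survival on 2 for already-live cells.
    return ((neighbors == 3) | ((neighbors == 2) & (board == 1))).astype(int)

# A 2x2 block is a still life: one step leaves it unchanged.
block = np.zeros((5, 5), dtype=int)
block[1:3, 1:3] = 1
assert np.array_equal(life_step(block), block)
```

                                                                  np.roll wraps around the edges, so this plays Life on a torus; the rule itself is two comparisons on the summed planes.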

                                                                  1. 3

                                                                    Yes, the local action (counting neighbors) is applied on the board as a whole, not just to a specific cell; I even have a J one-liner stashed for that somewhere.† Nice example.

                                                                    Interesting that APL solutions turn out to be fractal in their nature (speaking of ur-computation). Reminds me of Konrad Zuse’s Rechnender Raum and Plankalkül.


                                                                    † Lo and behold:

                                                                    L =: [:+.`*./ ],~ 3 4=/ [:+/^:2 (>{;~i:1)|.]
                                                                    g =: 5 5 $ 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 1 1 1 0
                                                                    
                                                                       <"_1 ' o' {~ L^:(i.5) g
                                                                    ┌─────┬─────┬─────┬─────┬─────┐
                                                                    │     │     │     │     │     │
                                                                    │  o  │     │     │     │     │
                                                                    │   o │ o o │   o │  o  │   o │
                                                                    │ ooo │  oo │ o o │   oo│    o│
                                                                    │     │  o  │  oo │  oo │  ooo│
                                                                    └─────┴─────┴─────┴─────┴─────┘
                                                                    
                                                                    1. 1

                                                                      Nice. You might also enjoy this 26 byte version:

                                                                      (]=3+4=*)[:+/(>,{;~i:1)&|.
                                                                      

                                                                      By the way, what did you mean by “APL solutions turn out to be fractal in their nature”?

                                                                      1. 2

                                                                        Well, the Moore neighborhood is applied to the playing field as a matrix rotation, but at the same time to each cell on the field (each element of the matrix). So, in a way, the Game of Life field is a cell by itself (with a number of states dependent on the number of cells in it), and, in turn, each cell on it is a playing board. [1]

                                                                           [ c =: 4=i.3 3
                                                                        0 0 0 NB. A cell with 8 neighbors.
                                                                        0 1 0
                                                                        0 0 0
                                                                           <"_2 (>{;~i:_1)|. c
                                                                        ┌─────┬─────┬─────┐ NB. A cell with 8 neighbors?
                                                                        │1 0 0│0 1 0│0 0 1│
                                                                        │0 0 0│0 0 0│0 0 0│
                                                                        │0 0 0│0 0 0│0 0 0│
                                                                        ├─────┼─────┼─────┤
                                                                        │0 0 0│0 0 0│0 0 0│
                                                                        │1 0 0│0 1 0│0 0 1│
                                                                        │0 0 0│0 0 0│0 0 0│
                                                                        ├─────┼─────┼─────┤
                                                                        │0 0 0│0 0 0│0 0 0│
                                                                        │0 0 0│0 0 0│0 0 0│
                                                                        │1 0 0│0 1 0│0 0 1│
                                                                        └─────┴─────┴─────┘
                                                                        

                                                                        That was a nod towards your idea about geometry and holistic approach.

                                                                        1. 1

                                                                          Gotcha. Thanks.

                                                                2. 2

                                                                  Yeah. One subtlety most people don’t appreciate about “notation as a tool of thought” is that the paper doesn’t mention “abstraction” once. Easy to miss in these modern times where “notation” has implicitly come to mean “for creating new abstractions”.

                                                                  1. 4

                                                                    That’s fascinating. I never noticed that before.

                                                                    Since I have it open now, relevant paragraph on my previous point:

                                                                    The utility of a language as a tool of thought increases with the range of topics it can treat, but decreases with the amount of vocabulary and the complexity of grammatical rules which the user must keep in mind. Economy of notation is therefore important.

                                                                    Notation as a tool of thought

                                                                    1. 1

                                                                      True, it doesn’t. But then the whole paper is a process of abstraction (deriving general patterns from concrete examples, addressing the main thesis case-by-case) that uses APL as a conceptual vehicle.

                                                                      1. 2

                                                                        The design goal is to create a notation that minimizes the need for new names.

                                                                        1. 3

                                                                          APL is like a beautiful diamond – flawless, beautifully symmetrical. But you can’t add anything to it. If you try to glue on another diamond, you don’t get a bigger diamond. Lisp is like a ball of mud.

                                                                          Extension of the base language is at odds with its core design principle, but that surely does not preclude us from defining problems in terms of it. My original questions (not well-formed TBH) were more concerned with the ways this notation can be adapted to a particular context (or vice versa).

                                                                          E.g. the above-mentioned Co-dfns compiler, as far as I understand, implements a Scheme-like subset of Dyalog APL and represents trees as vectors, but not the other way around (extending the language with a tree-like datatype and sprucing all of that up with s-expressions or something).

                                                                          Like I said, J has a set of primitives and mechanics that seem language-oriented to me, so I wondered what compiler/interpreter creation looks like from this point of view, on a small exemplified scale (hence the DSLs), and what the problem-solving mindset (heuristics?) behind it is.

                                                                          FSM state transitions can be modeled with tables, yes, and an AST, being a graph, can use adjacency matrices with eigen-shticks, but this whole approach feels like black magic and a well-guarded trade secret. Maybe because it's a fork off the mainstream road and into the academic weeds. Need to check out more of Aaron Hsu's talks for sure.
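                                                                          The "trees as vectors" idea can be sketched outside APL, too. This is a rough, hypothetical Python rendering of the representation as I understand it from Aaron Hsu's material (the names and the toy tree are mine, not Co-dfns'): the tree lives in a flat parent vector, and whole-tree queries become operations over arrays rather than pointer chasing.

```python
# Hypothetical sketch: a tree as a flat parent vector, in the spirit of
# Co-dfns. p[i] is the index of node i's parent; the root points to itself.
p = [0, 0, 0, 1, 1, 2]  # 0 is the root; 1, 2 its children; 3, 4 under 1; 5 under 2

def depths(p):
    """Depth of every node at once: chase parents until every walk hits the root."""
    d = [0] * len(p)
    cur = list(range(len(p)))          # each walk starts at its own node
    changed = True
    while changed:
        changed = False
        for i in range(len(p)):
            if p[cur[i]] != cur[i]:    # not at the root yet
                cur[i] = p[cur[i]]
                d[i] += 1
                changed = True
    return d

print(depths(p))  # → [0, 1, 1, 2, 2, 2]
```

                                                                          In an array language the inner loop collapses into a single indexing operation applied to the whole vector at once, which is what makes the representation attractive there.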

                                                                          1. 1

                                                                            As a historical aside, the APL presented in Iverson’s A Programming Language book (pdf) included trees as a basic data structure (page 45 in the book). They weren’t included in APL as first implemented, then APL2 added nested arrays which serve a similar purpose. Some of the book’s tree operations are in J.

                                                                            1. 2

                                                                              Yes, thanks for mentioning that! The chapter on microprogramming with APL was mind-blowing at the time I read it.

                                                            2. 2

                                                              I learned it mainly by doing hundreds of code golf problems, asking questions on the J IRC channel, reading (good chunks of) J for C programmers and Learning J, as well as the J dictionary and the more accessible NuVoc documentation.

                                                              I’d say it took 1.5-2 years to acquire a feeling of fluency. Much longer than other languages. One crucial practice was solving a problem and then having it edited by experts. Because unavoidably you’ll apply your current paradigms to J, or simply be unaware of J idioms, rather than solving the problem in a “J way.”

                                                              The best place to get this kind of help these days is The APL Orchard. There's a handful of very talented J and APL programmers there willing to help.

                                                              By the way, you could get the same mind-expansion from learning APL, if you were more inclined that way. The free book on the Dyalog APL site is also good.

                                                            3. 1

                                                              I’d classify it as functional. It just uses a multi-dimensional array where others use a linked list. Nearly all operators are side-effect free and data is immutable. There are higher order patterns like map-reduce.
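                                                                A hypothetical sketch of that style in Python terms (only the J spellings in the comments come from J itself): every step consumes a whole array and produces a new value; nothing is mutated in place.

```python
from functools import reduce

xs = [3, 1, 4, 1, 5]
squared = list(map(lambda x: x * x, xs))     # map over the whole array (J: *:)
total = reduce(lambda a, b: a + b, squared)  # fold / reduce (J: +/)
print(total)  # → 52
```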

                                                              1. 3

                                                                 Array programming languages aren’t “functional” or “immutable” in the way that Erlang and ML are, and I’ve observed that beginners only notice this after a few months, probably because those features show up in real array-language applications and not in the puzzles you practice on to learn how to think in arrays. Real applications batch, and use lots of global variables with delimited names, like Tcl or Perl; J programmers call them locales; K programmers have the K-tree; in APL it’s isolates. And so on.

                                                                I know other languages have “arrays” but it’s just an unfortunate name collision. Many of us use the term “Iverson languages” instead of “Array languages”, because we’re not trying to confuse things unnecessarily.

                                                                1. 2

                                                                  To add to the sibling comments: the linked article and code golf only expose you to part of J. For me, it clicks into a very different mindset with things like fully tacit programming, which approaches function-level programming [1] (even point-free Haskell feels clumsy in comparison with APL and J), and constructs like hooks and forks [2] that permit composing and chaining functions with ease.
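                                                                  For readers unfamiliar with forks: in J, (f g h) y means (f y) g (h y). A hypothetical Python rendering (the fork helper and names are mine, not J's), using the classic example of the arithmetic mean, +/ % # (sum divided by tally):

```python
def fork(f, g, h):
    """J's fork: (f g h) y  ==  g(f(y), h(y))."""
    return lambda y: g(f(y), h(y))

mean = fork(sum, lambda a, b: a / b, len)  # J: mean =: +/ % #
print(mean([1, 2, 3, 4]))  # → 2.5
```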

                                                                  1. 1

                                                                    It’s true that tacit programming in J is functional, but that misses the larger difference. The way you need to think about and frame a problem to solve it idiomatically is usually quite different in J versus, say, Haskell.

                                                                1. 19

                                                                  I’m probably not the only one with the opinion that rewrites in Rust may generally be a good idea, but Rust’s compile times are unacceptable. I know there are efforts to improve that, but they are so abysmally slow that it really affects me as a Gentoo user. Another point is that Rust is not standardized and is a one-implementation language, which also discourages me from looking deeper into Haskell and others. I’m not saying that I generally reject single-implementation languages, as that would disregard any new language, but a language implementation should be possible without too much work (say, within two man-months). Neither Haskell nor Rust satisfies this condition, and contraptions like Cargo make it even worse, because implementing Rust would also mean more or less implementing the entire Cargo ecosystem.

                                                                  Contrary to that, C compiles really fast, is an industry standard and has dozens of implementations. Another thing to note is that the original C codebase is a mature one. While Rust’s great ownership and type system may save you from general memory-handling and type errors, it won’t save you from intrinsic logic errors. However, I don’t weigh that point much, because it’s an argument that could be made against any new codebase.

                                                                  What really matters to me is the increase in the diversity of git-implementations, which is a really good thing.

                                                                  1. 22

                                                                    but a language implementation should be possible without too much work (say within two man-months)

                                                                    Why is that a requirement? I don’t understand your position. Should we not have complex, interesting or experimental languages only because one person couldn’t write an implementation by themselves in two months? Should we discard all the advances Rust and Haskell provide because they require a complex compiler?

                                                                    1. 5

                                                                      I’m not saying that we should discard those advances, because there is no mutual exclusion. I’m pretty certain one could work up a pure functional programming language based on linear type theory that provides the same benefits and is possible to implement in a reasonable amount of time.

                                                                      A good comparison is the web: 10-15 years ago, it was possible for a person to implement a basic web browser in a reasonable amount of time. Nowadays, it is impossible to follow all new web standards and you need an army of developers to keep up, which is why more and more groups give up on this endeavour (look at Opera and Microsoft as the most recent examples). We are now in a state where almost 90% of browsers are based on Webkit, which turns the web into a one-implementation-domain. I’m glad Mozilla is holding up there, but who knows for how long?

                                                                      The thing is the following: If you make the choice of a language as a developer, you “invest” into the ecosystem and if the ecosystem for some reason breaks apart/dies/changes into a direction you don’t agree with, you are forced to put additional work into it.

                                                                      This additional work can be a lot if you’re talking about proprietary ecosystems, meaning you are more or less forced to rewrite your programs. Rust satisfies the necessary condition of a qualified ecosystem, because it’s open source, but open-source systems can also shut you out when the ABI/API isn’t stable. That danger is especially present with the “loose” crate system, which may provide high flexibility but also means a lot of technical debt, since you have to continually update your code to the newest specs to keep using your dependencies. However, this is again a question of the ecosystem, and I’d prefer to refer only to the Rust compiler here.

                                                                      Anyway, I think the Rust community needs to address this and work up a standard for the Rust language. On my behalf, I won’t be investing my time into this ecosystem until this is addressed in some way. Anything else is just building a castle on sand.

                                                                      1. 5

                                                                        A good comparison is the web: 10-15 years ago, it was possible for a person to implement a basic web browser in a reasonable amount of time. Nowadays, it is impossible to follow all new web standards and you need an army of developers to keep up, which is why more and more groups give up on this endeavour (look at Opera and Microsoft as the most recent examples). We are now in a state where almost 90% of browsers are based on Webkit, which turns the web into a one-implementation-domain. I’m glad Mozilla is holding up there, but who knows for how long?

                                                                        There is a good argument by Drew DeVault that it is impossible to reimplement a web browser for the modern web.

                                                                        1. 4

                                                                          I know Blink was forked from webkit but all these years later don’t you think it’s a little reductive to treat them as the same? If I’m not mistaken Blink sends nothing upstream to webkit and by now the codebases are fairly divergent.

                                                                      2. 8

                                                                        I feel ya - on OpenBSD compile times are orders of magnitude slower than on Linux! For example ncspot takes ~2 minutes to build on Linux and 37 minutes on OpenBSD (with most features disabled)!!

                                                                        1. 5

                                                                          37 minutes on OpenBSD

                                                                          For reals? This is terrifying.

                                                                          1. 1

                                                                            Excuse my ignorance – mind pointing me to some kind of article/document explaining why this is the case?

                                                                            1. 7

                                                                              There isn’t one. People (semarie@ - who maintains the rust port on OpenBSD being one) have looked into it with things like the RUSTC_BOOTSTRAP=1 and RUSTFLAGS='-Ztime-passes -Ztime-llvm-passes' env vars. These point to most of the time being spent in LLVM. But no one has tracked down the issue fully AFAIK.

                                                                          2. 6

                                                                            Another point is that Rust is not standardized and a one-implementation-language

                                                                            This is something that gives me pause when considering Rust. If the core Rust team does something that makes it impossible for me to continue using Rust (e.g. changes licenses to something incompatible with what I’m using it for), I don’t have anywhere to go and at best am stuck on an older version.

                                                                            One of the solutions to the above problem is a fork, but without a standard, the fork and the original can vary and no one is “right” and I lose the ability to write code portable between the two versions.

                                                                            Obviously, this isn’t a problem unique to Rust - most languages aren’t standardized and having a plethora of implementations can cause its own problems too - but the fact that there are large parts of Rust that are undefined and unstandardized (the ABI, the aliasing rules, etc) gives me pause from using it in mission-critical stuff.

                                                                            (I’m still learning Rust and I’m planning on using it for my next big thing if I get good enough at it in time, though given the time constraints it’s looking like I’ll be using C because my Rust won’t be good enough yet.)

                                                                            1. 2

                                                                              The fact that the trademark is still owned by the Mozilla foundation and not the to-be-created Rust Foundation is also likely chilling any attempts at independent reimplementation.

                                                                            2. 1

                                                                              As much as I understand your point about the slowness of Rust compile times, I think it’s only a matter of time before they shrink.

                                                                              On the standard point, Haskell has a standard: Haskell 2010. GHC is the only implementation now, but it carries a lot of compiler extensions that are not in the standard. The new Haskell 2020 standard is on its way. Implementing standard Haskell (without all the GHC add-ons) is doable, but the resulting language will be much simpler and has its flaws.

                                                                              1. 2

                                                                                The thing is, as you said: you can’t compile a lot of real-world code by implementing Haskell 2010 (or 2020, for that matter) if you don’t also ship the “proprietary” extensions.

                                                                                1. 1

                                                                                  It is the same when you abuse GCC or Clang extensions in your codebase. The main difference with Haskell is that you have almost only GHC available, so the community put its effort into it and created an ecosystem of extensions.

                                                                                  As for C, you could write standard-compliant code that a hypothetical other compiler may compile. I am pretty sure that if we had had only one main C compiler for as long as Haskell has had GHC, the situation would have been similar: lots of language extensions outside the standard, existing solely in that compiler.

                                                                                  1. 3

                                                                                    But this is exactly the case: there’s lots and lots of code out there that uses GNU extensions (from gcc). For a very long time, gcc was the only real compiler around, and that led to this problem. Some extensions are so entrenched that clang had no choice but to implement them.

                                                                                    1. 1

                                                                                      But did those extensions ever reach the standard? I ask candidly, as I don’t know much about the evolution of C, its compilers and the standard.

                                                                                      1. 4

                                                                                        GNU maintains a list of these extensions. I really hate that you can’t enable a warning flag (like -Wextensions) that warns you about using GNU extensions.

                                                                                        Still, it is not as bad as bashisms (i.e. extensions in GNU bash over POSIX sh): many scripts declare a /bin/sh shebang at the top but are full of bashisms, because their authors happen to have bash as the default shell. Most bashisms are just stupid, many people don’t know they are using them, and there’s no flag to warn about them. Another bad offender is the GNU extensions to the POSIX core utilities, especially GNU make, where 99% of all makefiles are actually GNU-only and don’t work with POSIX make.
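                                                                                        As a hypothetical illustration (the script and names are made up), here is the kind of bashism that hides under a /bin/sh shebang, together with its portable spelling:

```shell
#!/bin/sh
# Bashism: "[[ $name == a* ]]" parses under bash but is a syntax error
# under a strict POSIX sh such as dash, even though the shebang says sh:
#
#   if [[ $name == a* ]]; then echo "starts with a"; fi
#
# The portable POSIX spelling uses a case pattern instead:
name="alice"
case "$name" in
    a*) echo "starts with a" ;;
    *)  echo "something else" ;;
esac
```

                                                                                        Running the script through dash or another strict sh is an easy way to catch this class of bug before it bites someone whose default shell isn’t bash.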

                                                                                        In general, this is one major reason I dislike GNU: they see themselves as the one and only choice for software (demanding that people call Linux “GNU/Linux”) while introducing tons of extensions that chain their users to their ecosystem.

                                                                                        1. 2

                                                                                          Here are some of the GNU C extensions that ended up in the C standard.

                                                                                          • // comments
                                                                                          • inline functions
                                                                                          • Variable length arrays
                                                                                          • Hex floats
                                                                                          • Variadic macros
                                                                                          • alignof
                                                                                      2. 1

                                                                                        If I remember correctly 10 years ago hugs was still working and maybe even nhc :)

                                                                                        1. 1

                                                                                            Yep :) and yhc never landed after forking nhc. UHC and JHC seem dead. My main point is that the existence of a standard does not ensure a multiplication of implementations, or compatibility across compilers/interpreters/JITs/etc. That’s a simplification, and it really depends on the community around those languages. Look at Common Lisp, with a set-in-stone standard and a lot of compilers, where you can easily pinpoint what is going to work or not. Or Scheme, with a fairly simple standard, where you quickly lose the ability to swap between interpreters once you rely on some specific features.

                                                                                            After that, everyone has their own checklist of what a programming language must or must not provide for them to learn and use it.

                                                                                1. 4

                                                                                  Moving! New flat, new cat, and a place to practice woodworking at home. Can’t wait!

                                                                                  1. 6

                                                                                    Does anyone know of any good resources to undertake the creation of a custom chip? I’ve dabbled in FPGAs and VHDL before but don’t really understand how you go from that to a custom buildable chip.

                                                                                    1. 2

                                                                                      Same here. I would like to replicate the function of some old 80s custom chips, but I’m not sure where to start other than “VHDL.” It seems like the OpenROAD project is named as a major EDA component of this initiative, but I’m also unclear on how I’d use it.

                                                                                      Lots of reading ahead!