1. 17

    If you plan on using Lisp well into the future, you should use Common Lisp: the language is never going to change, so the books written on it (from the 90s through 2005’s Practical Common Lisp) will always remain accurate, and so will the available libraries. You will be part of a continuous tradition that started with John McCarthy - many of the best Lisp programmers moved to Common Lisp during the standardization process, whereas Scheme is relatively its own thing (yes, I know who Guy Steele is).

    It is not the prettiest language, but regarding the differences between it and Scheme that people start fights over, you might find, as I have, that the decisions Common Lisp made were uniformly better across most of those dimensions, including nil vs. false (these should be the same thing!!) and even the somewhat strange #'f and funcall syntax. For all its gargantuan feature list, over time you will grow to like and appreciate the facilities available to you and realize that it is, in fact, a very practical language, designed from decades of experience people had actually using it for real software. Scheme (and which Scheme: R5RS? R6RS?) is a small, small language that the various implementations (Chicken, Racket, Guile) have to pile extensions on just to be practicable. Scheme does not have keyword arguments! Both Racket and Chicken do, but with differing syntaxes. Even such a small and useful feature is not standardized, and it is a small taste of the kind of divergence among implementations - so if you are ready to be platform-locked, go right on ahead.
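
    To illustrate the #'f / funcall and keyword-argument bits for anyone unfamiliar (a minimal sketch, all standard CL):

    ;; Functions live in their own namespace: pass them with #'
    ;; and call a function-valued variable with funcall.
    (defun twice (f x)
      (funcall f (funcall f x)))
    (twice #'1+ 3)  ; => 5

    ;; Keyword arguments are part of the standard:
    (defun greet (name &key (greeting "Hello"))
      (format nil "~a, ~a!" greeting name))
    (greet "world" :greeting "Hi")  ; => "Hi, world!"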

    Really, the difference between (any) Scheme and Common Lisp - and I say this having used both for significant projects - is that the Scheme community is satisfied with a nice, clean definition of a language, while Common Lisp is the result of satisfying the real needs of people in industry.

    Should you even use Lisp? I don’t know you and I don’t know your goals. But if you are choosing between CL and Scheme the answer is exceedingly clear imo.

    1.  

      How do you like the concurrency support for CL? I’ve read that it isn’t supported by the standard lib, but most people use Bordeaux threads - are you happy with it? Coming from other languages, it’s a little scary for something as fundamental as concurrency not to be in standard lib.

      1. 8

        Bordeaux-threads might as well be considered the de facto library for system threads. If you’re interested in M:N threading with nonblocking scheduling, there is also a plethora of implementations of that, although I haven’t used any personally. Clojure is another Lisp that does this quite well, and it happens to have it in the standard library (but! I remember when it wasn’t, and Timothy Baldridge released lightweight threading as a code-walking macro. Such is the power of macros - and totally impossible with hygienic ones, I might add).
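
        For anyone who hasn’t seen it, the Bordeaux-threads API is small; a minimal sketch (assuming the library is loaded, e.g. via (ql:quickload :bordeaux-threads)):

        (defvar *counter* 0)
        (defvar *lock* (bt:make-lock))

        (defun worker ()
          (dotimes (i 1000)
            (bt:with-lock-held (*lock*)
              (incf *counter*))))

        ;; Spawn four threads, wait for all of them, read the result.
        (let ((threads (loop repeat 4 collect (bt:make-thread #'worker))))
          (mapc #'bt:join-thread threads)
          *counter*)  ; => 4000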

        As for this standard library business… if it’s ever been needed, someone has written it and published it. I wouldn’t be “scared” that anything is or isn’t in a spec released in 1995, especially something for which the dominant paradigm has shifted so much over the past 15 years. Remember when transactional memory was the next big thing? Pepperidge Farm remembers. And even now, there is divergence between pseudo-blocking procedural style (Go, async/await), monad-style promise chaining (all the functional languages, and JavaScript before async/await landed in ES2017), and questioning whether concurrency was even appropriate in the first place (data parallelism, like Rust’s rayon or beam crates). Why should any standing committee decide to bless any one paradigm over another without literal decades of trial testing? For a frozen language like this, people coalesce over time around well-written libraries, so don’t worry about that, and ask around (IRC, Reddit, Google) whether there is a library for this or that.

        1.  

          I’ve used Bordeaux threads, and they work well enough. But I’m used to 90s-era and earlier languages which don’t come with concurrency baked in. In comparison to those, Lisp is pretty good.

          1.  

            Would also mention that C didn’t have threads in the standard library until 2011, and it wasn’t seen as a big handicap.

            1.  

              Bordeaux-threads is the de facto standard, yeah, but I rarely find myself wanting to do low-level threading (in any language) anymore. There are various libraries built on top of it that provide higher-level concurrency primitives. For example, lparallel provides task queues, promises, and parallel versions of map, reduce, sort, etc.
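
              A taste of lparallel (assuming it’s loaded, e.g. via (ql:quickload :lparallel); expensive-computation is a made-up stand-in):

              ;; Set up a kernel (worker thread pool) once:
              (setf lparallel:*kernel* (lparallel:make-kernel 4))

              ;; Parallel map, same shape as cl:map:
              (lparallel:pmap 'vector (lambda (x) (* x x)) #(1 2 3 4))  ; => #(1 4 9 16)

              ;; Futures: start work in the background, force when needed:
              (let ((f (lparallel:future (expensive-computation))))
                (lparallel:force f))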

          1. 9

            I’m glad somebody wrote this up, because I feel like everybody who works with big data learns these lessons independently (sometimes multiple times within the same organization). If you teach CS, please make your students read this before they graduate.

            Understanding these lessons is basically why the office I work at is so much more productive than the rest of the company we’re a part of: there’s been an effort to get incoming devs to understand that, in most cases, it’s faster, cheaper, & easier to use unix shell tools to process large data sets than to use fancy hypebeast toolchains like hadoop.

            There are a couple of things this essay doesn’t mention that would probably speed up processing substantially. One is using LC_ALL=C – if you force the locale to C, no locale-aware string processing happens in the tools along the pipeline, which speeds everything up a lot. Another is that if you are using GNU awk, there’s support for running external commands and piping to them from inside a script, which means that downloads can actually be done inside AWK and posts can be done there too – which allows you to open multiple input and output streams and switch between them in a single batch, avoiding some merge steps. Also, one might want to use xargs instead of GNU parallel, because xargs is a more mature tool & one that’s available on basically all unix machines out of the box.
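
            Hedged sketches of those tricks (file names and the URL are made up):

            # Locale-independent, byte-wise processing; often a large speedup:
            LC_ALL=C sort huge.tsv | LC_ALL=C grep -c 'pattern'

            # gawk can pipe to/from external commands inside a script, e.g.
            # fetch each URL in column 1 without leaving AWK:
            gawk '{ cmd = "curl -s " $1
                    while ((cmd | getline line) > 0) print line
                    close(cmd) }' urls.txt

            # xargs-based parallelism, available out of the box nearly everywhere:
            find . -name '*.gz' -print0 | xargs -0 -P 8 -n 1 gunzip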

            1. 3

              Yeah I’ve personally run into exactly this kind of slowness with R (and Python to a lesser extent), and fixed it with shell. I love R but it can be very slow.

              That’s part of the reason I’m working on Oil. Shell is still useful and relevant but a lot of people are reluctant to learn it.

              I posted this in another thread, but it is good to eyeball your computations with “numbers every programmer should know”:

              https://gist.github.com/jboner/2841832

              https://people.eecs.berkeley.edu/~rcs/research/interactive_latency.html

              In particular most “parsing” is linear time, so my rule is that you want to be within 2x-10x of the hardware’s theoretical speed. With certain tools you will be more in the 100-1000x range, and then it’s time to use a different tool, probably to cut down the data first. Hardware is cheap but waiting for data pipelines costs human time.
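
              A cheap way to eyeball it (data.tsv and pipeline.sh are stand-ins for your own input and job):

              # Baseline: how fast can this machine merely scan the bytes?
              time wc -c < data.tsv
              # Still linear, slightly more work per byte:
              time wc -l < data.tsv
              # If this is 100-1000x slower than the baselines, switch tools:
              time ./pipeline.sh < data.tsv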

              1. 4

                When I was working in my first lab I did exactly the same – moved an existing computational biology pipeline from R to AWK, lots of shell plumbing, GNU Parallel, and a Flask front-end server which submitted jobs to GridEngine. That brought runtime down from 40 minutes to about 30 seconds for one genome. R is nice but can be slow (also, it was just a prototype).

                The pivotal lesson I learned was to embrace the battle-tested technologies of the shell stack and everything Unix instead of fancy-pants modern stacks and tools on top of Hadoop, Spark, and the like. “Someone probably solved your problem in the 80s” from the author rings absolutely true.

                Others call it Taco Bell programming.

                1. 2

                  Taco Bell programming is amazing; I’m saddened that it has become quite esoteric. I wish this knowledge were more widespread in the industry.

              2. 3

                One thing I found particularly useful about this post (not evident from the title, though it constitutes the first half) is its specifics about how big data science toolchains can fail, in this case Apache Spark, even when the author tried a bunch of the obvious and less-obvious fixes.

                The biggest win here seems to be not necessarily the raw processing time due to low-level optimizations in awk, but more big-picture algorithmic wins from “manually” controlling data locality, where Spark didn’t do the right thing automatically, and couldn’t be persuaded to do the right thing less automatically.

                1. 2

                  Have them read “The Treacherous Optimization”, which is all about how GNU grep is so fast: grep is important for its own sake, of course, but the point is that these tools have had decades of work poured into them, even the relatively new GNU tools which postdate the Classic Unix codebases.

                  It’s also an interesting introduction to code optimization and engineering tradeoffs, or tradeoffs where multiple decisions are defensible because none of them are absolutely perfect.

                  1. 1

                    You must be very glad. 3 identical comments 😅

                    1. 1

                      Just a glitch. My mouse’s debounce doesn’t work properly, and lobste.rs doesn’t properly deduplicate requests, so when I click post it sometimes emits several duplicate requests, which the server turns into duplicate comments (even though they come from the same form).

                      There was a patch applied for this a year ago, but either it didn’t work or it never migrated from the git repo to the live version of the site.

                  1. 1

                    Looks like something like Prolog, but with s-expression-based syntax and some object-oriented system. At first glance, though, I can’t understand how it would be used. There’s the example with the facial-expression analyzer: some computer vision component fits a face model to a photo, and then code in CLIPS analyzes the resulting coordinates with rules? The examples are quite hard to understand; this isn’t “show social media posts depending on keywords in previously liked posts”, which is what gets called AI today. How is it different from the Prologs? Or from Datalog embedded in Lisps?

                    I can understand simple “business rules” like shopping cart discounts encoded with Prolog/Datalog, and I can understand the use cases for RDF semantic web rules encoded in the OWL language, but all these expert systems look like alien rocket science to me.

                    1. 3

                      At a basic level it’s a forward-chaining rule engine using the Rete algorithm. Toy examples do look basically like toy Prolog examples, e.g.

                      (defrule parent-rule
                        (or (father ?x ?y) (mother ?x ?y))
                        =>
                        (assert (parent ?x ?y)))
                      

                      From this slide deck which goes into a bit more detail. Beyond that, things get more complex due to: 1) programmer-level stuff to make it less tedious to encode large rule bases, 2) meta-level control of the deduction process – e.g. in addition to asserting new facts, rules can also modify the deduction process itself, partly to make it possible to do non-trivial things with large-ish knowledge bases on 1980s hardware, and 3) a tendency in this style of expert system to intermix “procedural” and “declarative” aspects. For example, the above rule looks pretty declarative, but you can stick an arbitrary C function with side effects in place of assert, which will be called when the rule fires (a bit like a fancy callback mechanism).
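
                      To make point 3 concrete, a small sketch in CLIPS syntax (rule and fact names made up here):

                      (defrule announce-parent
                        (parent ?x ?y)
                        =>
                        (assert (known-ancestor ?x ?y))              ; declarative: derive a fact
                        (printout t ?x " is a parent of " ?y crlf))  ; procedural side effect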

                      Jess is a modern descendant I’ve used a bit more. Drools also has a lot of overlap, though it positions itself more as a business logic engine than as AI.

                    1. 4

                      I’d like to learn more about how this thing manages memory. There’s no trace of garbage collection (makes sense with the bold performance claims), but in the examples memory isn’t manually freed either. Am I missing something?

                      PS: After reading the Reddit thread I realized it only uses reference counting. Given how cycles and memory leaks can annoy people, I’m not sure Floyd is a better choice than a properly garbage-collected language for any project…

                      1. 4

                        I think the reason is that, in general, you’d want to avoid GC when doing soft real-time? Also, NeXTSTEP-era and early Cocoa Objective-C used reference counting along with scoped autorelease pools, which worked pretty well even if you had cycles – you just had to be a little bit more careful, but it was still much easier than memory management in C. On the other hand, I think any new language today without a GC should have a borrow checker, or at the very minimum an automated way to ensure there are no leaks. I’m still surprised no one is using capabilities to manage memory allocation/positioning of objects. Pony shows how great capabilities can be for concurrency and reference sharing, so there’s no reason we couldn’t define a set of capabilities for allocation.

                        1. 4

                          Here’s a counterexample. Curv uses reference counting (not a tracing garbage collector). It doesn’t leak memory because, by design, the runtime system will never create an object with cycles. Users don’t need to be careful to avoid creating cycles; the language just doesn’t provide a way to create cycles in the first place. (Curv is a pure functional language where all data structures are immutable values. So you can’t deliberately create a cycle by mutating an object to point to itself, the way you can in a conventional imperative language.)

                          It looks like Floyd may be using the same strategy as Curv. With no mutable objects, reference counting is a viable strategy for memory reclamation.

                          1. 1

                            Curv is a pure functional language where all data structures are immutable values. So you can’t deliberately create a cycle by mutating an object to point to itself, the way you can in a conventional imperative language.

                            This is an interesting property, but it does require more than just being pure functional, since mutating objects isn’t the only way to create cyclic structures. For example both OCaml and Haskell support purely functional ways of doing it. It’s even semi common, because algebraic data types (ADTs) can’t represent graphs in a straightforward way if there’s no way to create a data structure with cycles, and graphs are a pretty common thing people like to write functional code to manipulate.

                            1. 5

                              That’s right, you can’t implement Haskell without a GC.

                              In Curv, I don’t support recursive data definitions, only recursive function definitions. As a result, all data structures are trees (no cycles). Haskell has recursive data definitions, which create cycles.

                              Curv is a DSL for 3D solid modelling and procedurally generated art. It is intended to be a very simple programming language that is easy to learn for 3D printer enthusiasts, designers, and artists. Immutable data, pure functions without side effects, and tree-structured data with no cycles are all features for making the language easier to learn and use. I’m not competing with Haskell or OCaml. Haskell in particular has a steep learning curve.

                              Although I’m writing a DSL, I think that tree-structured data is a good idea that can work in general-purpose languages as well. In the 1950s and 1960s, the standard programming practice was to write code using GOTO and conditional GOTO. The result was called “spaghetti code”: programs were hard to understand because the control flow looked like a plate of spaghetti. “Structured programming” was invented to solve this, but it initially met a lot of resistance. Cyclic data structures are spaghetti data structures. Hierarchical data structures are easier to understand, and they also have technical benefits (such as: you can reference count them, and they are easy to serialize). I hope we will see more high-level languages with hierarchical data in the future. It looks like Floyd is one of them.

                          2. 3

                            I think the reason is that in general you’d want to avoid GC when doing soft real-time?

                            RC doesn’t help with that as such. Calling malloc can take an indeterminate amount of time, just like a GC pass can, so the actual fix is to avoid allocations entirely…which you can also do in a full GC system. (A common optimization in languages like Java is indeed to pre-allocate in tight loops, reusing the same data structures over and over.)
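
                            The shape of that pattern, sketched in Common Lisp (a GC’d language; *scratch* and the input format are made up for illustration):

                            ;; Allocate the buffer once, outside the hot loop...
                            (defvar *scratch* (make-array 4096 :element-type 'double-float))

                            (defun process-tick (input)
                              ;; ...and reuse it on every call; nothing here conses, so this
                              ;; loop never triggers a collection by itself. input is assumed
                              ;; to be a vector of double-floats.
                              (dotimes (i (min (length input) (length *scratch*)))
                                (setf (aref *scratch* i) (* 2d0 (aref input i)))))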

                            1. 2

                              For reference, Rust’s borrow checker doesn’t necessarily prevent memory leaks. They can and do happen.

                              1. 1

                                Yes, exactly. I forgot those reference-counted-by-default languages like Swift. But both of them have some kind of borrow checker now (even Swift) and in my opinion they are more complex and operate at a much lower level than Floyd (Floyd doesn’t even have pointers/references).

                              2. 4

                                Reference counting can’t reclaim everything in an object-oriented language, where you can modify an object to point to itself and create a cycle. But in a functional language with immutable values, reference counting can be used effectively to manage memory with no memory leaks. I know this because that’s how I implemented Curv.

                                The Floyd documentation says “Floyd has no classes, no pointers / references, no tracing GC (uses copying and RC)”. That’s enough to suggest that Floyd is using the same implementation strategy as Curv. There are no pointers, so you can’t create a cycle by modifying an object to contain a pointer to itself.

                              1. 2

                                I wonder which of these beating-the-averages, brain-amplifying language implementations have been successfully used in today’s front-end environments (web and/or native mobile). Do people use Smalltalk or Common Lisp to develop front-end web or native mobile apps? Maybe ClojureScript?

                                One caveat: If the language implementation encourages the use of its own UI toolkit that draws its own widgets, that’s probably a non-starter for many apps due to accessibility.

                                1. 4

                                  It gets a bit trickier in the modern era with multiparadigm languages. I’ve seen small teams making really non-typical usage of JavaScript, for example, writing everything in a very Haskell-influenced functional style complete with lenses everywhere, from which they’ve built up in-house and/or project-specific abstractions to the extent that much of the code arguably constitutes an internal DSL. (Ok, when I say I’ve seen teams doing this, I mean that I know of one such team.) So a very speculative hypothesis is that today, to find such teams, you need to look beyond choice of language to how the language is used—people who might’ve been getting this kind of advantage in 1985 by using Lisp might today be just making very specific usage of multiparadigm languages like JS or Python.

                                  I personally do still like Common Lisp, in part out of personal interest, and in part because I’m an AI researcher with a strong interest in the history of the field, and it’s historically an important language for AI. But a lot of the claims of 10x productivity boost in earlier eras are comparing CL to languages like C and Fortran where it’s really not possible to write in anything near the high-level style of idiomatic CL. Even today there are still things CL does that are hard to replicate in other high-level languages, but I’m not sure the gap is as big.

                                  1. 1

                                    I guess Facebook Messenger written in OCaml/ReasonML proves that functional languages with good type and module systems scale much better than those without. ;)

                                  1. 8

                                    Single use, tiny languages are my catnip. I actually have to keep the temptation to learn them in check.

                                    1. 3

                                      If you check out the documentation section of this site there is a DSL for rolling different dice and visualizing the distributions. It’s the most niche DSL I think I’ve encountered.

                                      https://anydice.com/

                                      1. 3

                                        That’s pretty interesting! Some of the basic features aren’t surprising: being able to combine dice, sequences of rolls, etc., is what I’d expect from someone making a dice-rolling DSL. But this has a whole programming language pretty clearly included too, complete with looping and function calls. It’s even seemingly DIY syntax, not exactly lifted from an existing language. I wonder what the history is. Did these features accrete over time due to being needed in modeling some specific dice-roll situations? Was it designed up front as such a full-featured DSL?

                                        1. 3

                                          I don’t know what the history is, but I thought it was fascinating when I stumbled upon it. I was trying to simulate dice rolls for D&D. (What’s the distribution of 1d20s vs 2d10s? How would you do advantage/disadvantage for 2d10s? Etc.) It was very much an “everything you try to do someone else has done 100x better” situation. I was happy with my ASCII command line charts and this guy over here made an entire programming language dedicated to this one problem.
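
                                    For anyone curious, those first questions come out to just a few lines of AnyDice (based on my reading of its docs; highest is one of its built-in functions):

                                    output 1d20 named "1d20"
                                    output 2d10 named "2d10"
                                    output [highest 1 of 2d20] named "d20 with advantage"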

                                    1. 19

                                      I always find it kind of funny that Google didn’t want generics in Go, but hacked it into one of their projects: https://github.com/google/gvisor/blob/master/tools/go_generics/generics.go

                                      1. 26

                                        There are many people at Google; surely not all teams agree on this issue.

                                        1. 16

                                          There’s also a bit of a difference between an implementation that works for one project and The Solution built into the language, which is supposed to (ideally) be a choice in the design space that’s fairly good across a broad range of uses. There are a bunch of solutions that work for various definitions of work, but the ones I’ve read all still have weird warts and edge cases (but different ones). Besides lower-level questions like how they interact with other parts of Go syntax and semantics, there’s also still quite a bit of disagreement, I think, on which of the basic tradeoffs in generics design are right for Go. Some people want Go generics more along the lines of ML-style parametric polymorphism or Java-style generics, while others want something more like C++ or D-style templates that instantiate specialized versions per-type. (The implementation linked above looks like the 2nd type.) Probably only one of these mechanisms will end up in the core language, not both.

                                          Edit: This “problem overview” from August 2018 has more details on the tradeoffs and current draft proposal.

                                          1. -2

                                            And yet apparently the language was shitty enough to force them to do this?

                                        1. 8

                                          This is an interesting article, but I’m surprised they didn’t dig into the other failure modes shown in the US DoE chart (59% for non-LED electronics).

                                          I very much suspect the electrolytic capacitors on the driver inputs - 25,000 hours is a long time for a can full of electrolyte to be dealing with AC ripple and heat cycles.

                                          Also, I wonder if anyone has researched this aspect:

                                          A quick calculation says that the old bulbs paid for themselves more than three times over in electricity savings relative to the incandescents they replaced, and put that much less carbon into the atmosphere. They may well continue to burn for another 15,000 hours, but after weighing the degraded output and the cost to replace them with brighter, more efficient versions, I’m headed back to the store.

                                          … from the perspective of embodied energy (& related carbon emissions). Incandescents use more power when plugged in but have a lot less stuff in them, so I would guess they use less total energy to make. Although they’re maybe less recyclable as well…?

                                          1. 4

                                            Although they’re maybe less recyclable as well…?

                                            I believe current de facto practice is that LEDs and incandescent bulbs are both mostly treated as general waste. LEDs are tricky to recycle in a cost-effective manner, though as they become a bigger part of the waste stream and more standardized, something might be worked out there.

                                            1. 2

                                              That’s depressing. Although the “less than 2% of the waste stream” number in the linked article seems encouraging for longevity so far…

                                          1. 15

                                            This post is entitled “Why Don’t People Use Formal Methods?”, but I feel it’s maybe underselling itself. It does explain why people don’t use formal methods, in the author’s opinion. But it also explains why you might use formal methods in the first place — better than most other sources I’ve seen.

                                            1. 2

                                              GNU tar has an incremental format also, which should be long-term supported. It does things on a per-file basis though, rather than diffing files, so wouldn’t be suitable if you regularly have large binary files that undergo small changes.

                                              1. 2

                                                Yeah I looked at that. Avoiding it for two reasons:

                                                • I don’t want to keep around the “listed-incremental” file
                                                • It’s only supported by GNU tar
                                              1. 15

                                                In addition to all the resume reviews, I’ll add that your location strategy might be working against you as well – namely, living in the SF Bay Area but applying to Seattle.

                                                In Feb 2017, I was living in Portland and decided to move back to SF, so I started my job search only for SF. I was interviewing for 3 months, finishing the move in May 2017. I had a few onsites which involved flying, but the response rate always felt low for the initial application.

                                                After moving back, it felt like the floodgates opened. In 1 month I had as many onsites as in those prior 3 months.

                                                The message here is:

                                                • Recruiters might be filtering against people who aren’t local (reasoning might include interview process taking longer, no desire to pay relocation, might assume you improperly applied) so conditions might improve once you move
                                                • Applying to Seattle instead of SF Bay Area might be working against you
                                                1. 5

                                                  Some companies assume relocation assistance would need to be factored in for non-local candidates (if the position isn’t remote friendly), and thus higher acquisition costs.

                                                  My guess is that you are right and this would indeed be a strong filtering criteria.

                                                  1. 1

                                                    I’m surprised that would figure strongly, given what tech salaries are, unless people are being offered much larger relocation packages than I have been. For a non-international move I’ve been offered $5k usually, maybe negotiable up to $10k, which is basically in the salary-negotiation noise, a single-digit percentage of the first year’s pay.

                                                    1. 2

                                                      10k is single-digit %? I’d love to talk to your agent!

                                                      But yeah, otherwise this accords with my experience.

                                                      1. 1

                                                        “9” is a single digit.

                                                        1. 1

                                                          That it is. I may ought to have had more coffee before posting.

                                                  2. 2

                                                    I don’t know how much of the following applies in the US…

                                                    If the problem is that they won’t consider you because you’re not in the area, then if you have friends or family in the area, you might be able to use their address e.g. on your CV to get through the initial “not in the area” filter.

                                                    The company I work for is always a little suspicious of applicants who aren’t local, mainly because:

                                                    • There’s a good chance they’re either just doing a countrywide search, or a recruiter is doing one on their behalf. If they don’t really have any intention of moving, then it’s a waste of time interviewing them.
                                                    • Especially for young graduates, there’s a fairly high chance that they’ll spend a year or two here, then decide that the area isn’t for them and move back home / to the cities. (There’s nothing wrong with doing that, but if a company thinks you’re going to leave, they’re less likely to invest time and resources in you than in an identical candidate who’s likely to stay for longer.)

                                                    The way to get past these blocks is to tell a convincing story about why you want to move to the area. If you have family in the area, that will look promising. If you’ve lived in the area before, that’s also worth mentioning.

                                                  1. 2

                                                    Packing up my apartment to move to the USA next week (from the UK). My furniture isn’t really fancy enough to be worth shipping transatlantically so this involves a bunch of selling stuff on Gumtree and Facebook Marketplace. Mildly stressful, but not too bad so far.

                                                    1. 2

                                                      Out of the frying pan into the fire :) Good luck! Where in the US will you land?

                                                      1. 3

                                                        Washington, DC, for a job as assistant professor in CS at American University.

                                                        And thanks! Will definitely be a change of scenery, pros and cons but hopefully good overall. Right now I live in a 20,000-person seaside town that’s 5 hours from London, while in a few weeks I’ll be living right in the Imperial Capital!

                                                        1. 2

                                                          Congratulations! DC is a fun town! If you drink beer, the bar scene is rather lively (I’m a beer fan and so am rather fond of the U Street Saloon). Lots of great culture happening there.

                                                    1. 4

                                                      At what stage in the interview process do you have this in mind? If it’s late-ish, seems plausible. If it’s early-ish, seems like a lot to ask up front from a candidate to spend a day grokking a codebase when they’re still at the point where they might be summarily rejected after 10 minutes’ review. You do mention that some people might not have the time, but even for those who do, is it a good use of their time?

                                                      The academic-job version of this is a university wanting your initial application for a faculty position to come with a custom-made syllabus for one of their new courses, or a review of their degree program with suggested revisions, or something of that kind. (Distinct from asking you to send in an example syllabus of a course you might teach or have taught in the past, which isn’t custom, employer-specific work.) I usually pass on applying to those. I am happy to prep custom material if I made it to the shortlist and the potential employer shows they’re serious enough about me as a candidate to fly me out for an on-site interview, though. Then it seems fair and more likely to be time not wasted.

                                                      1. 4

                                                        I figured this would replace the in-person technical interviews. So for a company, the recruiting flow might be

                                                        1. Initial Phone Interview
                                                        2. Maybe a fast phone technical filter interview
                                                        3. Give them the project, tell them to make a change (and that they’ll be reviewing a couple of PRs for the onsite)
                                                        4. On site: culture fit, discuss their change to the codebase, have them do the code review
                                                        5. Hire/no hire decision.
                                                        1. 1

                                                          For me, this would be a red flag - 2-3 interviews (including the absolute worst kind, the phone interview) and a day-long coding test? Maybe if I was applying to be CTO of a Fortune 500 company. For a lowly developer, this is too much.

                                                          1. 4

                                                            You’ve definitely been luckier with interviews than I have. Every company I’ve ever interviewed with had at least three rounds!

                                                            1. 1

                                                              My last two interviews were just one short onsite session each. Both led to an offer. I turned down companies with more involved & time-consuming processes.

                                                      1. 3

                                                        I’m @mjn@icosahedron.website. I post some mix of tech/research stuff and miscellaneous my-daily-life sorts of stuff.

                                                        1. 7

                                                          Are FreeBSD jails remotely as usable as Docker for Linux? Last time I checked they seemed rather unusable.

                                                          1. 3

                                                            In technical terms they’re just fine, in my semi-professional experience. What they lack is the ergonomics of Docker.

                                                            1. 5

                                                              I’m not very impressed with the ergonomics of docker, and it’s definitely not obvious to me that BSD jails are an inferior solution to it.

                                                              1. 5

                                                                Ok, so I’m a big fan of BSDs, so I’d be very interested if there were a nice (not necessarily identical, but similar) way to do roughly the following things with jails:

                                                                vi Dockerfile # implement your container, based on another container
                                                                docker build -t <internal_storage>/money-maker:0.9 . # build and tag it
                                                                docker push <internal_storage>/money-maker:0.9 # push it to internal repo (image refs take no https:// scheme)
                                                                ssh test_machine
                                                                docker run <internal_storage>/money-maker:0.9 # run the container on the test machine
                                                                
                                                                1. 5

                                                                  The obvious equivalent I can think of is:

                                                                  • Create a jail
                                                                  • Set it up (whether manually or via a Dockerfile-equivalant shell script)
                                                                  • Store a tar of its filesystem to https://<internal_storage>/money-maker:0.9
                                                                  • Create a jail on the destination machine
                                                                  • Untar the stored filesystem
                                                                  • Start the jail

                                                                  These steps aren’t integrated nicely the way they are with docker, but they are made of small, otherwise-useful parts which compose easily.
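
                                                                  As a rough sketch of those steps in shell form (all names, paths, and hosts made up):

                                                                  # on the build machine: snapshot the jail's filesystem and "push" it
                                                                  tar -C /jails/money-maker -czf money-maker-0.9.tgz .
                                                                  scp money-maker-0.9.tgz storage:/exports/

                                                                  # on the test machine: fetch, untar, start
                                                                  fetch http://storage/exports/money-maker-0.9.tgz
                                                                  mkdir -p /jails/money-maker
                                                                  tar -C /jails/money-maker -xzf money-maker-0.9.tgz
                                                                  jail -c name=mm path=/jails/money-maker command=/bin/sh /etc/rc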

                                                                  1. 4

                                                                    Sure. How much work do you think needs to be done to get the benefits of Docker’s layer-based approach to containers? If your containers are based on each other, you get significant space savings that way.

                                                                    1. 0

                                                                      ZFS deduplicates stored blocks, so you would still get the space savings. You would still have to get it over the network, though.

                                                                      1. 6

                                                                        ZFS does not dedup by default, and deduping requires a lot of RAM, to the point that I’d not turn it on for performance reasons. I tried a 20TiB pool with and without: the speed was about 300 kB/s versus something closer to the underlying SSD’s performance. It was that bad, even after trying to tune the piss out of it.

                                                                        Hardlinks would be faster at that point.

                                                                        1. 3

                                                                          No no no, ZFS dedup wastes some ridiculous amount of RAM. Do you use it IRL or are you just quoting a feature list?

                                                                          1. 1

                                                                            I use it, but not on anything big, just my home NAS.

                                                                    2. 2

                                                                      One option is to use a jail-management system built on top of the raw OS functionality. They tend to take an opinionated approach to how management/launching/etc. should work, and enforce a more fleshed-out container model. As a result they’re more ergonomic if what you want to do fits with their opinions. CBSD is probably the most full-featured one, and is actively maintained, but there are a bunch of others too. Some of them (like CBSD) do additional things like providing a unified interface for launching a container as either a jail or a bhyve VM.

                                                              1. 21

                                                                I used to work in academia, and this is an argument that I had many times. “Teaching programming” is really about teaching symbolic logic and algorithmic thinking, and any number of languages can do that without the baggage and complexity of C++. I think, if I was in a similar position again, I’d probably argue for Scheme and use The Little Schemer as the class text.

                                                                1. 10

                                                                  This is called computational thinking. I’ve found the topic to be contentious in universities, where many people are exposed to programming for the first time. Idealists will want to focus on intangible, fundamental skills with languages that have a simple core, like scheme, while pragmatists will want to give students more marketable skills (e.g. python/java/matlab modeling). Students also get frustrated (understandably) at learning “some niche language” instead of the languages requested on job postings.

                                                                  Regardless, I think we can all agree C++ is indeed a terrible first language to learn.

                                                                  1. 9

                                                                    Ironically, if you’d asked me ten years ago I would’ve said Python. I suppose I’ve become more idealist over time: I think those intangible, fundamental skills are the necessary ingredients for a successful programmer. I’ve worked with a lot of people who “knew Python” but couldn’t think their way through a problem at all; I’ve had to whiteboard for someone why their contradictory boolean condition would never work. Logic and algorithms matter a lot.

                                                                    1. 9

                                                                      I think Python is a nice compromise. The syntax and semantics are simple enough that you can focus on the fundamentals, and at the same time it gives students a base from which to explore more practical aspects if they want.

                                                                    2. 7

                                                                      Students also get frustrated (understandably) at learning “some niche language” instead of the languages requested on job postings.

                                                                      Yeah, I feel like universities could do a better job at setting the stage for this stuff. They should explain why the “niche language” is being used, and help the students understand that this will give them a long term competitive advantage over people who have just been chasing the latest fads based on the whims of industry.

                                                                      Then there is also the additional problem of industry pressuring universities into becoming job training institutions, rather than places for fostering far-looking, independent thinkers, with a deep understanding of theory and history. :/

                                                                      1. 3

                                                                        I’ve been thinking about this a bit lately, because I’m teaching an intro programming languages course in Spring ‘19 (not intro to programming, but a 2nd year course that’s supposed to survey programming paradigms and fundamental concepts). I have some scope to revise the curriculum, and want to balance giving a survey of what I think of as fundamentals with picking specific languages to do assignments in that students will perceive as relevant, and ideally can even put on their resumes as something they have intro-level experience in.

                                                                        I think it might be getting easier than it has been in a while to square this circle though. For some language families at least, you can find a flavor that has some kind of modern relevance that students & employers will respect. Clojure is more mainstream than any Lisp has been in decades, for example. I may personally prefer CL or Scheme, but most of what I’d teach in those I can teach in Clojure. Or another one: I took a course that used SML in the early 2000s, and liked it, but it was very much not an “industry” language at the time. Nowadays ReasonML is from Facebook, so is hard to dismiss as purely ivory tower, and OCaml on a resume is something that increasingly gets respect. Even for things that haven’t quite been picked up in industry, there are modernish communities around some, e.g. Factor is an up-to-date take on stack languages.

                                                                        1. 3

                                                                        One way you can look at it is: understanding how to analyse the syntax and semantics of programming languages can help you a great deal when learning new languages, and even when learning new frameworks (Rails, RSpec, Ember, React, NumPy, regexes, query builders, etc. could all be seen as domain-specific PLs embedded in a host language). Often they have weird behaviours, but it really helps to have a mental framework for quickly understanding new language concepts.

                                                                          Note that I wouldn’t recommend this as a beginner programming language course - indeed I’d probably go with TypeScript, because if all else fails they’ll have learned something that can work in many places, and sets them on the path of using types early on. From the teaching languages Pyret looks good too, but you’d have to prevent it from being rejected. But as soon as possible I think it’s important to get them onto something like Coursera’s Programming Languages course (which goes from SML -> Racket -> Ruby, and shows them how to pick up new languages quickly).

                                                                      2. 7

                                                                        I started college in 1998, and our intro CS class was in Scheme. At the time, I already had done BASIC, Pascal, and C++, and was (over)confident in all of them, and I hated doing Scheme. It was different, it was impractical, I saw no use in learning it. By my sophomore year I was telling everyone who would listen that we should just do intro in Perl, because you can do useful things in it!

                                                                        Boy howdy, was I wrong, and not just about Perl. I didn’t appreciate it at the time, and I didn’t actually appreciate it until years later. It just sorta percolated up as, “Holy crap, this stuff is in my brain and it’s useful.”

                                                                        1. 3

                                                                          I hear this reasoning about teaching tangible skills, but even one, two, or three quarters of Python is not enough for a job - at least it shouldn’t be. If it is, then employers are totally OK with extremely shallow knowledge.

                                                                          1. 1

                                                                            I didn’t even realize I had read this a month ago, never mind that I had commented on it, before I wrote my own post on the topic. Subconscious motivation at its finest.

                                                                        1. 5

                                                                          These have been floating around FOR-EVER but I’m glad they keep cropping up. I see evidence of these constantly in just about every technical community I inhabit.

                                                                          They were an eye opener for me at the time. Particularly #2 (accept me as I am) and #4 (transitive).

                                                                          Grokking the fundamental falsehood of some of these deeply was definitely a step towards finally growing up in certain ways that I REALLY needed to (and had for a long time).

                                                                          I also credit having successfully jettisoned #2 with being why at age 35 I finally started dating and met my wife :)

                                                                          1. 5

                                                                            I recognize some of these patterns, but I don’t think I associate them with technical communities. Where I’ve run into them is in “cultural geek” communities, those organized around things like fandoms. This could be idiosyncratic based on which specific kinds of both communities I’ve run into though.

                                                                            1. 2

                                                                              I’ll take your word for it. In my case, the degree of overlap between technical communities and various fandoms is extremely high.

                                                                              1. 1

                                                                                That’s interesting and believable too, which is why I added the caveat that it could well be idiosyncratic. I’ve definitely read about this kind of thing in my area, artificial intelligence, e.g. the old MIT hacker culture. I just haven’t encountered it in person, and it always felt like something that existed way before my time. Out of curiosity, what kinds of technical communities have you encountered where the overlap is high?

                                                                                The AI conferences I personally go to do have a handful of geeky people, but way more business/startup/government/military/professor types. A bunch of these factors pretty clearly don’t apply as far as I can tell, for better or worse. For example, socially awkward and/or unhygienic people are pretty much jettisoned without a second thought if someone thinks they might interfere with funding.

                                                                                1. 2

                                                                                  So, I want to be sure to constrain this properly.

                                                                                  I came into the Boston technical scene in the early 1990s. At that time, the overlap with the Boston science fiction fandom community was HUGE as it was for the Polyamory and BDSM communities (of which I’ve never been a part. Vanilla and proud yo :)

                                                                                  In fact, I pretty much got my start in the Boston tech scene by showing up at a science fiction fandom oriented group house in the middle of a blizzard and passing out my resume to anyone who’d talk to me :) I ended up living in that group house for a time.

                                                                                  I’m fairly sure this isn’t representative of the here and now. Our industry has become a very VERY different place several times over since then (mostly for the better) and I suspect that younger folks are being drawn from a much more diverse background of interests.

                                                                                  1. 1

                                                                                    Hah interesting, I know some people who I think had a similar kind of experience in the SF Bay Area in the ’90s, living in group houses to get broadband internet and such. I got into tech in the late ‘90s in suburban Houston, which might have had a geek scene, but if so I didn’t know about it. The tech scene I was exposed to was much more “professional engineering” oriented, anchored by people who worked at NASA or NASA contractors (plus some people doing tech at oil companies).

                                                                              2. 1

                                                                              I’ve not found that to be the case, even here in the Lobsters community in its forum and chat forms.

                                                                              3. 2

                                                                                I’m curious how #2 motivated you to start dating. Were you just generally more receptive of criticism from friends, and if so, how does that translate to wanting to start dating?

                                                                                1. 4

                                                                                  Not so much about wanting to start dating, but being willing to make the changes necessary to be perceived as attractive.

                                                                                  “Friends accept me as I am”.

                                                                                  Who cares if I have a giant sloppy beard, dress in sweat pants and faded T-shirts all the time, and generally take PRIDE in not giving two shits about my personal appearance? My TRUE friends will accept me for who I am and see past all that.

                                                                                  Except that this is the real world. How you look DOES matter. Shave the beard, lose a few pounds, buy some decent clothing and it’s a whole different ballgame.

                                                                                  1. 1

                                                                                    I definitely agree with what you’re saying, but it reminds me of some definitions from Douglas Coupland’s novel Generation X:

                                                                                    anti-victim device (AVD) - a small fashion accessory worn on an otherwise conservative outfit which announces to the world that one still has a spark of individuality burning inside: 1940s retro ties and earrings (on men), feminist buttons, noserings (on women), and the now almost completely extinct teeny weeny “rattail” haircut (both sexes).

                                                                                    … and:

                                                                                    personality tithe - a price paid for becoming a couple; previously amusing human beings become boring: “Thanks for inviting us, but Noreen and I are going to look at flatware catalogs tonight. Afterward we’re going to watch the shopping channel.”

                                                                                    https://en.wikiquote.org/wiki/Generation_X:_Tales_for_an_Accelerated_Culture

                                                                                    Some parts of a given personality are stupid and need to be shorn so the person can have a more interesting life. It’s easy to lionize the idea that someone can be Good Enough, or, in somewhat different subcultures, Cool Enough, that you never have to compromise on those things, but even if you luck into a job where that works, it doesn’t and can never work in a healthy personal relationship.

                                                                                    1. 4

                                                                                      Sounds like I need to read that book!

                                                                                      I don’t personally see it as compromise.

                                                                                      The truth is that my self confidence was in the shitter at that time. My personal appearance was just one outward manifestation of that.

                                                                                      Recognizing that I needed to make changes if I wanted to meet someone and be less lonely was a first step towards letting go of some of that baggage.

                                                                              1. 3

                                                                                I wish the article were a bit more substantive.

                                                                                They touch on type inference. Does anyone have some refs for how to do this in Lisp?

                                                                                1. 6

                                                                                  They touch on type inference. Does anyone have some refs for how to do this in Lisp?

                                                                                  Henry Baker has a paper about it: The Nimble Type Inferencer for Common Lisp-84

                                                                                  http://home.pipeline.com/~hbaker1/TInference.html

                                                                                  1. 4

You don’t do this yourself; the compiler does it for you, inferring types from the code, with or without the aid of type declarations from the programmer.

The article has an example of declaring types, which lets the compiler generate tight, specialized code.
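For concreteness, here’s a minimal sketch of the sort of thing meant (the function name and body are mine, not the article’s): with the array element type and optimization qualities declared, SBCL can open-code unboxed double-float arithmetic instead of going through generic dispatch.

```lisp
(defun dot (xs ys)
  ;; The declarations seed the compiler's type knowledge: the element
  ;; accesses and the running sum are all known to be double-floats.
  (declare (type (simple-array double-float (*)) xs ys)
           (optimize (speed 3) (safety 1)))
  (let ((sum 0d0))
    (declare (type double-float sum))
    (dotimes (i (length xs) sum)
      (incf sum (* (aref xs i) (aref ys i))))))
```

Compiling this in SBCL and running (disassemble 'dot) should show a fairly tight loop; delete the declarations and the disassembly balloons with calls into generic arithmetic.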

                                                                                    1. 2

                                                                                      Sure, agreed. But in my case, I’m implementing the compiler. So I was hoping for some refs on how to do type inference.

                                                                                      1. 12

                                                                                        SBCL implements it as type propagation using dataflow analysis. Types in some places are known from explicit declarations, constant literals, known functions, etc., and this information is propagated in a pass named “constraint propagation”, to infer the types of other expressions when possible.

                                                                                        I’m not sure whether there’s proper documentation anywhere, but this post on some of the type propagator’s weaknesses starts with a summary of how it works [1].

                                                                                        “Gradual typing” has become a trend outside the Lisp world lately, and I believe those systems do something vaguely similar to propagate types. That might be another place to find implementation hints.

                                                                                        [1] Note when reading this post that “Python” is the name of SBCL’s compiler; the name predates Guido van Rossum’s language. The reason the compiler has a separate name in the first place is that SBCL’s predecessor, CMUCL, had an interpreter, a bytecode compiler, and a native-code compiler, and Python was the name of the native-code compiler.
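If it helps for the compiler you’re writing, here is a deliberately tiny sketch of the forward half of that idea in Common Lisp. All the names here (infer, *function-types*) are made up for illustration and are not SBCL’s internals: types are seeded by literals, declared variables, and known function signatures, then propagated bottom-up.

```lisp
;; Toy bottom-up type propagation. All names are hypothetical;
;; this is an illustration, not SBCL's actual machinery.
(defparameter *function-types*
  ;; function -> (argument-type . result-type)
  '((+           . (number   . number))
    (length      . (sequence . integer))
    (symbol-name . (symbol   . string))))

(defun infer (expr env)
  "Infer a type for EXPR. ENV is an alist of variable -> declared type;
T stands for \"unknown\"."
  (cond ((integerp expr) 'integer)                       ; literals
        ((stringp expr)  'string)
        ((symbolp expr)  (or (cdr (assoc expr env)) t))  ; declarations
        ((consp expr)                                    ; known functions
         (let ((sig (cdr (assoc (first expr) *function-types*))))
           (if sig (cdr sig) t)))
        (t t)))

;; (infer '(length s) '((s . string)))  => INTEGER
;; (infer '(+ x 1)    '((x . integer))) => NUMBER
```

A real pass would also check argument types against the signatures and, as in SBCL’s constraint propagation, narrow a variable’s type along the branches of TYPEP tests; this sketch only shows types flowing upward from the leaves.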

                                                                                        1. 2

                                                                                          SBCL implements it as type propagation using dataflow analysis. Types in some places are known from explicit declarations, constant literals, known functions, etc., and this information is propagated in a pass named “constraint propagation”, to infer the types of other expressions when possible.

Ah! Interesting! This is what I was planning to do with my toy Lisp :) It might be worth a scholarly search; I first heard of the idea when using Erlang, under the name Success Typing. I could be completely misremembering, but I think Dialyzer uses this method of program typing.

                                                                                          1. 2

                                                                                            I don’t think this is quite the same thing as Dialyzer’s Success Typing, though there might be some overlap. Dialyzer’s goals are very different in that it’s only concerned with correctness and not performance. But it might use some similar methods.

                                                                                        2. 2

This may help; it’s an overview of FB’s Flow system for JS: https://www.youtube.com/watch?v=VEaDsKyDxkY

                                                                                    1. 5

Only tangentially on topic: I hadn’t noticed there’s a Talos II in a more hobbyist-affordable price range now. Last time I looked, the complete-system prices were around $7k, and that’s still what the flagship goes for, but now they have a $2k “Entry-Level Developer System” with one 4-core/16-thread CPU. For $2.5k you can bump it to 16 GB of RAM and 8 cores/32 threads. Still not cheap for those specs compared to x86 unless you have pretty specific POWER9-optimized workloads, but it’s at least not an absurd amount of money.

                                                                                      1. 5

                                                                                        Rumour is that there’ll be a sub-$1k option along the lines of a Mac Mini at some point, too.

                                                                                      1. 9

                                                                                        This strikes me as a step-by-step rediscovery of “Checked Exceptions”, but with an even more awkward encoding.

                                                                                        1. 10

                                                                                          Checked exceptions are underrated.

                                                                                          1. 5

                                                                                            There’s a lobste.rs discussion from 6 months ago that’s somewhat relevant on that (but with Rust rather than Haskell).