1. 23

  2. 18

    My summary, in so many words: “Computers are complicated, and the myriad layers of abstraction we built on top of them are error-prone. This makes our jobs as programmers difficult. We should try and make computers simpler.”

    No mention of possible solutions beyond “We need a top-down approach”, as if that question hadn’t entered the minds of previous generations of programmers since the invention of computing. Lots of people aren’t thrilled about the less-than-ideal foundations on which we’ve built our tower of abstractions. That said, centuries of man-hours have gone into getting everything to work together in some semblance of coordination; rebuilding it from scratch is no small undertaking.

    Still, I think there’s value in taking some time to imagine what our ideal programming experience looks like, beyond the lofty “I think thoughts and it turns into code”. Whenever I think about this, my mind wanders to the reported sophistication and coherence of Lisp Machines, but everybody has a different perspective. I think the intention of the article was to get everybody talking about their dreams of an ideal environment, so that we can build our way down from the commonalities between them all.

    Any ideas?

    1. 7

      “Computers are complicated, and the myriad layers of abstraction we built on top of them are error-prone. This makes our jobs as programmers difficult. We should try and make computers simpler.”

      That was the diagnosis and the problem Alan Kay tackled in ‘STEPS Toward the Reinvention of Programming’. The idea was to build a computing system from the ground up with a 20K-lines-of-code budget. It is rife with good ideas, and code to boot. Check it out. They handle complexity by using DSLs; to compile between them, they do a new riff on Meta II*, OMeta** (a minimal sketch of the idea appears at the end of this comment). One example of a DSL is Nile, which takes care of graphics. Btw, I found out about this research from Brit Butler’s talk, Programming Archeology.

      IMHO, the general idea is interactivity and reflection, to allow introspection and intercession. Few languages have a decent interactive experience: CL, Smalltalk, Elisp. From skimming Dylan, I would guess it does as well, but I don’t know.

      * Peter Seibel’s riff on Meta: https://github.com/gigamonkey/monkeylib-parser
      ** Ohm is OMeta’s younger, npm-friendly sibling.
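
      For a flavor of what a Meta II / OMeta-style tool buys you, here is a minimal, purely illustrative sketch in C of the kind of recursive-descent matcher such a grammar compiles down to. The grammar (expr = num (‘+’ num)*) and every name in it are made up for this example; the real systems generate code like this from the grammar DSL itself, and OMeta generalizes the idea beyond character streams:

          #include <ctype.h>
          #include <stdio.h>

          static const char *in;  /* cursor into the input */

          static int match_num(void) {            /* num = digit+ */
              if (!isdigit((unsigned char)*in)) return 0;
              while (isdigit((unsigned char)*in)) in++;
              return 1;
          }

          static int match_expr(void) {           /* expr = num ('+' num)* */
              if (!match_num()) return 0;
              while (*in == '+') {
                  in++;
                  if (!match_num()) return 0;
              }
              return 1;
          }

          int main(void) {
              in = "1+20+3";
              puts(match_expr() && *in == '\0' ? "match" : "no match");
              return 0;
          }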

      1. 4

        Do Lisp machines still exist? That could be cool to try out.

        1. 6

          Check this post out! Someone managed to buy an old lisp machine secondhand: https://lobste.rs/s/cedlzc/a_post_on_lisp_machines

          1. 3

            Unfortunately, I think that Lisp Machines are in a bit of a pickle these days. They were originally dismissed due to poor performance compared to other hardware systems, and we’ve since built an incredible amount on the hardware that succeeded them. I think that if Lisp Machines were to make a resurgence, it would start in FPGAs; the Intel CPU architecture is not really amenable to Lisp Machines.

            That said, the future of embedded devices and FPGAs is very exciting!

            1. 6

              The selling point of Lisp machines was not that they ran Lisp but that the system was cohesive, introspectable, modifiable, and documented. I have not used them, but people who did speak highly of the productivity that such a system enabled. Smalltalk is a similar system; VPRI as well (with custom hardware on an FPGA). It would be wiser to focus on developer productivity before speed.

              Check out Stephen Kell’s Strange Loop talk (https://www.youtube.com/watch?v=LwicN2u6Dro) to see how many hoops one has to jump through to modify a running program, and how hostile it is to introspection.
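
              To make the hostility concrete: about the only introspection a running C program gets out of the box is a name lookup in the dynamic linker. A minimal POSIX sketch (error handling mostly elided; the object-to-function-pointer cast is a POSIX-ism, not ISO C):

                  #include <stdio.h>
                  #include <dlfcn.h>  /* POSIX dlopen/dlsym; may need -ldl when linking */

                  int main(void) {
                      /* dlopen(NULL) hands back the running program itself. */
                      void *self = dlopen(NULL, RTLD_LAZY);
                      if (!self) return 1;

                      /* Look up a function by name at runtime... */
                      int (*f)(const char *, ...) =
                          (int (*)(const char *, ...))dlsym(self, "printf");
                      if (f) f("found printf at %p\n", (void *)f);

                      /* ...and that is roughly where it ends: no types, no
                         redefinition, none of the intercession a Lisp machine
                         or a Smalltalk image offers. */
                      dlclose(self);
                      return 0;
                  }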

              edit: Added link to talk

              1. 1

                Got a link to that talk? I’d love to listen to it but Google gave me nothing.

                1. 4

                  I was curious as well! It turns out his name is Stephen Kell. I believe this is the talk PuercoPop was referring to: https://www.youtube.com/watch?v=LwicN2u6Dro

                  1. 3

                    Yes, thanks for linking it; I was on mobile, so xref-ing stuff is a pain.

          2. 4

            The lowest layer at which you can get in touch with a computer is writing assembler. Just slightly above it is C. It comes as no surprise that C is keeping its “market” share, because it’s literally one of the most immersive ways you can talk to a computer, and everything else builds on top of it.

            From my experience talking to colleagues of mine, most talk about these great new languages, toolkits, or frameworks (Rust, Node.js, …) only to find out weeks, months, or years later that these solutions create more problems than they solve. In the end, if you really know your way around C and know how to use the restrict keyword, you literally have unlimited possibilities to do whatever the fuck you want while getting the most out of your machine. The functional programming trend will die down in the future as well. I don’t mean that nobody will use functional languages by then; what I mean is that people will realize that computers are imperative machines and you get the most out of them by using imperative programming languages. Analogous problems apply to all high-level programming languages, and you can avoid them if you are careful about which libraries, frameworks, or data types you work with.
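
            Concretely, restrict is the kind of thing I mean. A sketch (illustrative only): the qualifier promises the compiler that dst and src never alias, so it can keep src in registers or vectorize the loop instead of reloading after every store:

                #include <stddef.h>

                /* Without restrict, the compiler must assume dst and src may
                   overlap and re-read src[i] after every store to dst; with
                   the qualifier, the loop is free to vectorize. */
                void scale(float *restrict dst, const float *restrict src,
                           float k, size_t n) {
                    for (size_t i = 0; i < n; i++)
                        dst[i] = k * src[i];
                }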

            1. 13

              I think that the great number of people involved in developing and using languages other than C can be considered evidence against your argument. I think C’s (much-deserved) longevity is owed to its speed and immense portability; portability is definitely what helped it spread when it was invented.

              There’s no mistaking that a lot of great (and terrible) software has been written in C, but I don’t think it’s a panacea for all software development, and treating it like one is probably a barrier to advancing the field. I think that the diversity of ideas in programming languages and paradigms is a great thing, since a diverse ecosystem helps all its constituents and causes new things (ideas, paradigms, languages, architectures) to emerge. If everybody thought that C was the be-all and end-all of programming languages, we wouldn’t bother reviewing the standard every few years and making adjustments and additions, let alone striving to write new languages on top of it. People are different and have unique preferences; every field of human interest reflects this, so why should programming be any different?

              The video I posted earlier today talks about the new models of interacting with computers that arose in the ’60s and ’70s, nearly all of them predating the rise of C. (It’s interesting to note the homogenization of programming environments since then.) It also advises against dogma in the field of programming. People were originally resistant to FORTRAN when it came around; they likely felt that writing in binary was “literally one of the most immersive ways you can talk to a computer.”

              1. 9

                if you really know your way around C […] you literally have unlimited possibilities

                Did you read the article? Giving unlimited possibilities to humans with extremely limited cognitive ability is how we got into this mess in the first place. The ability to access an array using an out-of-bounds index is responsible for a huge chunk of bugs; the fact that we still write software where this can happen should be considered a disgrace to our whole profession.
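
                To spell out the class of bug in question, here is a sketch; the bounds-checked accessor is hypothetical, and it is exactly the discipline the language never enforces for you:

                    #include <stdio.h>
                    #include <stdlib.h>

                    /* Hypothetical bounds-checked accessor. */
                    static int at(const int *a, size_t len, size_t i) {
                        if (i >= len) {
                            fprintf(stderr, "index %zu out of bounds (len %zu)\n",
                                    i, len);
                            exit(EXIT_FAILURE);
                        }
                        return a[i];
                    }

                    int main(void) {
                        int a[4] = {1, 2, 3, 4};
                        /* a[4] = 5;  <- C compiles this happily; the write past
                           the end of the array is undefined behavior. */
                        printf("%d\n", at(a, 4, 3));
                        return 0;
                    }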

                1. 4

                  computers are imperative machines

                  This is always a bit funny to me, since my understanding is that what C compilers do is often very non-imperative (e.g. dataflow analysis) in order to take better advantage of modern architectures, which are increasingly less imperative as time goes on.

                  My bet is that imperative execution dominates today at least partially out of convenience, momentum, and homage to earlier mental models around computation.

                  Not to say it’s going anywhere, just that the “everything is imperative at the core” idea is becoming a little archaic. Especially in times when FPGAs and GPUs are increasingly possible targets for codegen.

                  1. 4

                    Exactly. Consider things like SSA (https://en.wikipedia.org/wiki/Static_single_assignment_form):

                    One might expect to find SSA in a compiler for Fortran or C whereas in functional language compilers, such as those for Scheme, ML and Haskell, continuation-passing style (CPS) is generally used. SSA is formally equivalent to a well-behaved subset of CPS excluding non-local control flow, which does not occur when CPS is used as intermediate representation. So optimizations and transformations formulated in terms of one immediately apply to the other.
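
                    A tiny illustration (schematic, not any particular compiler’s output): the loop below is imperative on its face, but in SSA each variable is assigned exactly once and the loop-carried values become phi nodes, which reads a lot like a recursive functional program:

                        int sum(int n) {
                            int s = 0;
                            for (int i = 0; i < n; i++)
                                s = s + i;
                            return s;
                        }

                        /* SSA form, schematically:
                             s0 = 0; i0 = 0
                           loop:
                             s1 = phi(s0, s2); i1 = phi(i0, i2)
                             if (i1 >= n) goto done
                             s2 = s1 + i1; i2 = i1 + 1
                             goto loop
                           done:
                             return s1
                        */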

                    1. 3

                      Yeah, that’s precisely what I was thinking of :)

                  2. 3

                    I don’t agree that functional programming is a short-term trend, but that’s a valid opinion and I don’t really understand why it was downvoted.

                    I will note that “unlimited possibilities” has a decades-long history of being used in marketing various technologies to mean “the limit is only what you are capable of building, because we didn’t build any of it for you”. That is not necessarily a bad thing; in fact, it’s C’s selling point, as you say. But for many purposes, it’s better to accept possibilities that are more limited because somebody has already done some of the work. :)

                    1. 2

                      If Rust is causing obstacles to getting stuff done that you can get done in C, please let me know. We want it to be a legitimate contender in this space, so experience reports are useful.

                      1. -11

                        I think I’m not intelligent enough to write Rust and will rather stick with the simplicity of C. YMMV of course.

                        Also, Rey finds Luke Skywalker.

                        1. 4

                          Okay, well let me revise that to “If I can make Rust easier to learn, I’d love to know how,” then. :)

                          I have no idea what Luke Skywalker has to do with anything?

                          1. 3

                            I gave you a troll vote for the star wars spoiler. Whether it is a genuine spoiler or not I have no idea, but trolling is the only reason to put that comment in there. I call poor sportsmanship.

                    2. 7

                      I agree that stacking abstractions on top of abstractions usually ends in tears, but I don’t think the text example they gave was fair at all. Unicode is actually a remarkably solid abstraction when used for text; most of the problems you see around encoding stem from programming languages and protocols pretending for way too long that they could get away without implementing Unicode support. The article’s example of the way it fails is emoji, so… well, yeah; you’re using Unicode for something other than text; that’s stupid.
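
                      One small data point for that solidity (a sketch of the framing, not a full decoder): a UTF-8 lead byte alone tells you the sequence length, ASCII passes through untouched, and a stream resynchronizes after any corrupt byte:

                          /* Length of a UTF-8 sequence from its lead byte,
                             or -1 for a continuation/invalid byte. */
                          static int utf8_len(unsigned char lead) {
                              if (lead < 0x80)           return 1; /* 0xxxxxxx: ASCII */
                              if ((lead & 0xE0) == 0xC0) return 2; /* 110xxxxx */
                              if ((lead & 0xF0) == 0xE0) return 3; /* 1110xxxx */
                              if ((lead & 0xF8) == 0xF0) return 4; /* 11110xxx */
                              return -1;
                          }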

                      Another great example of an abstraction that’s fairly un-leaky is TCP; how often do you need to peel back the layers and look at packet headers and routing? It happens occasionally, but compared to how often you have problems with higher levels it’s basically a rounding error.

                      Anyway, I think it’s worth pointing out how remarkable it is that these non-leaky abstractions do exist, and contemplating what it is that makes them work well.

                      1. 3

                        I liked this because it presents the computer in a slightly different way and it was a nice read on the bus ride home. I’m sympathetic to the notions of “doing it right” but I don’t find the case compelling.

                        For one thing, I’m not convinced by the central argument that with computers we are mostly looking for problems to solve since we have a general “solution machine” at hand. A lot of computer use is task-oriented, meaning that the user has something to do and sets about finding a way to do it. Anecdotally [and hey, what are forum posts but a great way to get mini experience reports?], my wife is always using a computer for work, but that’s because she has lots of problems to deal with: entering grades, filling out report cards, student correspondence, and so forth. (And note that those problems weren’t created by having a computer, although they may be accelerated by it.) Most people I know fall into the same category.

                        Mapping various problems to the mathematical domain does reveal a lot of sloppy thinking and hidden assumptions, though.

                        I was also left wanting with respect to both defining what programming is, so that one could quantify the suckiness (*), and providing a hint of a prescription beyond the “top-down approach”, which isn’t even given a meaning, as far as I could tell.

                        I have been thinking a lot about throwing out the von Neumann architecture and trying something new. The other comments here about Lisp machines and FPGAs strike a chord. The Lisp machines seemed pretty special, but I’ve never had the chance to use one. It would be fun to get involved in a project like that.

                        (*) Maybe this is why it sucks: us obnoxious programmers!

                        1. 3

                          Many people have tried these top-down reinventions, these “ideal world” approaches. None have succeeded. Incremental improvements and simplistic interfaces and 90% cases, on the other hand, work. Agile works. Dependency managers work. Better to light a single candle than curse the darkness.