1. 3

    Great article! I’d also mention that Magit works transparently with TRAMP, so you can take advantage of Magit’s awesomeness on remote machines.

    1. 5

      You make it sound like it’s a feature that was intentionally built in, but what makes TRAMP great is that it sits transparently between buffer and file access, meaning that all halfway-decent Elisp code (e.g. also eshell, compile, …) can employ TRAMP without having to worry about it.

      1. 1

        Sadly, gdb under GUD does not work quite right over TRAMP. I can run M-x gdb just fine, but the file references do not work properly (and GDB complains about the terminal). I suspect I’ll have to fiddle with the source locations to get it to work, if it will work at all.

      2. 2

        Also Dired, which means also Sunrise Commander, which is an orthodox two-pane file manager. Now you can have both panes show any combination of remote and local directories, and manipulate files between them as if everything was local.

        It’s rather slow for big transfers and/or many files, but for smaller operations it is insanely convenient.

      1. 2

        If you are going to use TRAMP, I highly recommend something akin to this near the start of your shell rc file:

        [[ "${TERM}" == dumb ]] && PS1='$ ' && return
        

        This way TRAMP has fewer chances to run into odd interactive settings that generally don’t play well with it, and it will easily recognize the prompt (it will simply hang if these things aren’t set correctly). In my case, with zsh, I added unsetopt zle in there as well; a combined sketch is below. TRAMP is sensitive to all this because it basically logs in over a shell to access files. Using some sort of remote file access API would be nice, but *nix seems to have nothing useful for that, so relying on tenuous shell access settings is the name of the game.
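
        For reference, here is what that guard might look like in a ~/.zshrc, combining the PS1 trick above with the zle change (a sketch only; exact settings vary per setup):

        # TRAMP logs in with TERM=dumb, so bail out to a minimal,
        # predictable setup before any interactive configuration runs.
        if [[ "${TERM}" == dumb ]]; then
          unsetopt zle      # no line editor on a dumb terminal
          PS1='$ '          # a plain prompt that TRAMP can recognize
          return            # skip the rest of the interactive config
        fi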

        1. 2

          Using some sort of remote file access API would be nice, but *nix seems to have nothing useful for that.

          Not sure I follow. scp avoids all the issues with prompts. Looks like you can have emacs use it in preference to ssh. https://www.gnu.org/software/emacs/manual/html_node/tramp/External-methods.html

          1. 2

            The issue with that (when I used it more frequently a couple of years ago) is that scp will log in anew every time you save a file. That makes the saving process take quite some time. When you’re used to hitting C-x C-s every once in a while, this can easily become annoying.

            1. 1

              All I can tell you is that neither scp nor ssh worked until I did what I stated for the shell. When I typed C-x C-f /scp:host:/ Emacs just sat there, unresponsive. After I changed the remote shell rc file, all was fine.

          1. 2

            Myth: Common Lisp is a procedural, object-oriented language.

            Is that what people are saying about it now? I’ve never heard CL accused of being either of those things. It was always accused of being “functional”.

            1. 6

              New directive from the Common Lisp Central Committee: “We Have Always Been Functional”.

              <a weary sigh arises from the Emacs buffer *Ministry of Truth*>

            1. 12

              Requirements change. Every software engineering project will face this hard problem at some point. […] With this in mind, all software development processes can be seen as different responses to this essential truth. The original (and naive) waterfall process simply assumed that you could start with a firm statement of the requirements to be met.

              This isn’t something unique to software. I’ve talked with at least two civil engineers who had to move bridges because the requirements changed after they were built.

              1. 6

                Yes, the whole article is rehashing the “Agile” argument without anything of substance.

                My father has worked in construction for nearly 50 years, and I did as well for a few. Changing requirements and unexpected environmental conditions always happen. What good construction and engineering firms do is seriously account for that. The guys who lowball a bid inevitably run over cost and time by significant amounts. Sadly, clients fall for the lowball bids too often.

                1. 3

                  And they get what they pay for. Probably.

                2. 1

                  Wasn’t waterfall the “do not do this” example in the paper that coined the term?

                  1. 1

                    Are these stories published somewhere?

                    Lots of developers are under the impression that civil engineers don’t have to deal with crazy requirements changes. Examples on orange site:

                    And no civil engineer will ever have to deal with the update to Gravity 2.0 now even better at 10m/s2

                    Also a civil engineer usually understands exactly what the bridge is supposed to do and how it is going to be used.

                    there’s a mutual agreement that such change comes with significant cost, and it’s this part that is missing in the software world.

                    Everything about a bridge is planned in extreme detail before work on the ground ever starts. This level of planning is absent in software.

                    client needs are easy to transport into the mind of [a civil engineer]

                    1. 3

                      I’m working on it! This is part of a broader project where I interviewed a bunch of people who worked as both “trad” engineers and software engineers, to figure out how the two fields actually differ. Most software engineers don’t know what trad engineers actually have to deal with and are working from stereotypes.

                      And no civil engineer will ever have to deal with the update to Gravity 2.0 now even better at 10m/s2

                      Here’s a page on how individual screws can wildly vary in terms of structural strength: https://www.fastenal.com/en/76/metric-system-and-specifications.

                      And here’s one on how mixing aluminum and steel bolts can corrode the bridge! https://www.fastenal.com/en/70/corrosion

                      Also a civil engineer usually understands exactly what the bridge is supposed to do and how it is going to be used.

                      What about electrical engineering? Chemical engineering? Off-the-shelf integrated circuits? Mines?

                      there’s a mutual agreement that such change comes with significant cost, and it’s this part that is missing in the software world.

                      Not really, stuff changes all the time.

                      Everything about a bridge is planned in extreme detail before work on the ground ever starts. This level of planning is absent in software.

                      As one oil rig engineer told me: “we file three blueprints: what we originally planned, what we ended up building, and how we kludged it after we built it.”

                      client needs are easy to transport into the mind of [a civil engineer]

                      See above.

                  1. 5

                    Could @geoffwozniak give some context here? The last time I knew, reposurgeon was not the top dog in the GCC git conversion race. Did that change, or is esr an unreliable narrator?

                    1. 9

                      This was talked about for a long time on the mailing list, but once the GNU Cauldron happened in September 2019, it kind of lit a fire under the whole “let’s get this converted to Git” movement. There was already a mirror that many (including myself) were using and because the whole reposurgeon thing seemed stuck, some argued to just get on with it and use the existing mirror.

                      Some of the long-time contributors were a little more concerned about the tags and historical aspects of the repo, going back to the CVS days. As a result, enthusiasm for the mirror and an existing conversion script waned. Personally, I was fine with the mirror.

                      At any rate, the wiki page lays out the pros and cons of each. It was sometime in December 2019 that the reposurgeon route was chosen. It’s somewhere in this thread (I’m too lazy to find the exact message).

                      1. 2

                        In addition, a previous story about this with some more context on why reposurgeon was chosen: https://lobste.rs/s/ykr0ct/gcc_has_really_high_quality_git

                    1. 1

                      I’m for fewer tags, not more. I do not find specificity in tags to be a good thing. For one thing, it encourages more tags to be added, both to the set of tags available and to the number of tags attached to each submission. “Tag soup” on a submission is already an annoying problem.

                      Adding more tags just means we have to manage more tags. We have too many tags already, many of which seemed to follow a flurry of submissions at one point (systemd and illumos come to mind, as do all the programming language ones, frankly) which then inevitably tapered off, leaving the tag essentially orphaned.

                      Personally, I prefer broad topic tags (themes, really). I have not found the addition of more tags to be all that helpful in filtering or discovering content. All it’s done is add more tags to submissions. The propensity to add them has resulted in lots of tags being added that are tangential at best in the hopes of getting noticed or trying to appear relevant. It feels very spammy.

                      1. 1

                        For one thing, it encourages more tags to be added, both to the set of tags available and to the number of tags attached to each submission. “Tag soup” on a submission is already an annoying problem.

                        It’s only a problem because people don’t understand that it only hurts their post. The more tags they add, the higher the chance that one of the tags is filtered out, meaning a smaller audience for the post. Accuracy is preferable, and it’s impossible to be accurate with the generic tags we have now.

                        Adding more tags just means we have to manage more tags.

                        Why do you have to manage them? They are just there to be used on your post and people can vote to add tags if you don’t tag correctly, just like today.

                        Personally, I prefer broad topic tags (themes, really). I have not found the addition of more tags to be all that helpful in filtering or discovering content.

                        They aren’t there to discover content. They are there to filter content. Be real: if we removed all programming language tags overnight and required submitters to use the programming tag instead, then 50% of all submissions, if not more, would be put in that tag and it would be impossible to filter out content you aren’t interested in.

                        Specificity is always preferable in a tagging system, with tags that imply other tags for generality.

                        The propensity to add them has resulted in lots of tags being added that are tangential at best in the hopes of getting noticed or trying to appear relevant. It feels very spammy.

                        Again, that only hurts the submission itself. You are more likely to hit a tag that people filter out, and your post is effectively hidden from a majority of the site. People using many tags (as long as they are relevant) are actually using the tagging system as it should be used. Real spammers would want to use as few tags as possible to reach a wider audience.

                        1. 1

                          The simple answer to your retort, then, is that PowerShell just isn’t that important.

                      1. 6

                        I’ve omitted explanations of why I believe these things, mostly so that I could get this post out the door at all - each one of these could easily be its own blog post. Think about them for a bit, and possibly you’ll find them compelling :)

                        I don’t find any of them compelling precisely because there are no explanations. Most of the statements are provocative and so vague that you can read whatever you want into them. That’s not productive discussion, that’s clickbait.

                        I’d much rather see a deeper exploration of these topics as opposed to just getting a “post out the door”.

                        1. 6

                          Connectivity will be the great equalizer in the future.

                          Equalizer of what, exactly? This statement and the ensuing paragraph betray the misguided belief that technology is the prescription for social problems. Starlink feels like nothing more than a giant ego trip. If this were truly an egalitarian effort, the people behind it would have consulted the rest of the world.

                          1. 4

                            The author also seems to assume that somehow Starlink will provide a cheap and high-quality service, while complaining about the “greedy last-mile monopolists” in another paragraph. Would it not make more sense to assume that this company will be just as greedy once it has established its own monopoly, to the detriment of us all?

                            1. 3

                              once it has established its own monopoly

                              You can’t create a monopoly by adding another competitor.

                              Existing telcos are a natural monopoly because trenches, poles and wires are astonishingly expensive and it doesn’t make economic sense to build a duplicate set of them in the same location.

                              While Starlink is also astonishingly expensive (more expensive per unit bandwidth for all but the lowest-density regions), it’s not locked to a single physical location. Being able to rearrange the fleet to serve different regions at different densities is a huge deal because it means every monopoly ISP on the planet now has plausible competition.

                              Monopoly ISPs (e.g. Comcast in many US cities) will be forced to adapt and offer a reasonable level of service to fight off this competition (as they did when Google Fiber came out).

                              I don’t think Starlink will offer particularly great value for money, but a capitalist market cannot operate well without competition, and Starlink will provide that.

                          1. 10

                            This may be better handled with cdpath, which is available in both Zsh and Bash.

                            If arg does not begin with a slash, the behaviour depends on whether the current directory ‘.’ occurs in the list of directories contained in the shell parameter cdpath. If it does not, first attempt to change to the directory arg under the current directory, and if that fails but cdpath is set and contains at least one element attempt to change to the directory arg under each component of cdpath in turn until successful.

                            Combined with auto_cd in Zsh, you get the same effect as defining all these aliases.
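
                            If you want to try it, here is a sketch of that setup for a ~/.zshrc (the directory list is hypothetical):

                            setopt auto_cd                # a bare directory name cd's into it
                            cdpath=(. ~/projects ~/src)   # `cd foo` also searches these roots

                            # Now `cd myrepo` (or just `myrepo`, thanks to auto_cd) works from
                            # anywhere, provided ~/projects/myrepo or ~/src/myrepo exists.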

                            1. 5

                              cdpath looked extremely attractive to me but it appears that you need to install extra completion functions to make it work with tab completion. Drat.

                              1. 2

                                That looks like a great solution. I’m going to give this a try. Thanks for the info.

                                1. 2

                                  I used CDPATH for a while. Turned it off because it was too confusing for me.

                                  It even broke a build once because some shell script deep down changed its behavior due to CDPATH. It took quite a while to debug.
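
                                  For the curious, here is a hypothetical repro of that kind of breakage: when cd resolves a relative path through a CDPATH entry, POSIX requires it to print the resolved directory to stdout, and the script may not even end up where it expects.

                                  #!/bin/sh
                                  export CDPATH="$HOME/projects"   # exported, so scripts inherit it
                                  mkdir -p "$HOME/projects/build"

                                  # A script that innocently does `cd build` can now land in
                                  # ~/projects/build instead of ./build, and cd prints the resolved
                                  # path to stdout, corrupting any output the script captures.
                                  cd build
                                  pwd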

                                1. 1

                                  The list is neat, but it’s definitely missing some from the 80s. DeSmet C is still around. There’s also Eco-C and Eco-C88 that were inexpensive C compilers for CP/M and DOS.

                                  1. 5

                                    I don’t understand the obsession with tacky Geocities sites. To each their own, but I was on the internet back then, and that’s certainly not the “internet of old” that I miss.

                                    Also, I find it highly ironic that this is linking in scripts from Google, Facebook, and Cloudflare.

                                    1. 5

                                      Agreed. For me, the “old internet” is sites like this:

                                      http://www.mir.com.my/rb/photography/companies/nikon/nikkoresources/index.htm

                                      Densely packed with info, links are made up as you go along… basically one person’s drive to put all that info out there.

                                      1. 3

                                        I dug through my bookmarks and found a few more examples that are still up:

                                        http://paulbourke.net/geometry/

                                        https://agner.org/optimize/?e=0#0

                                        https://www.sandpile.org/

                                        https://www.adahome.com/

                                        Like your example, there’s no advertising, tracking, third-party spyware, “monetizing”, or “sign up for my newsletter”, just lots of information in an easy to read format, shared just to share.

                                        1. 2

                                          “Firefox prevented this site from opening a pop-up”

                                          Old Internet indeed!

                                        2. 2

                                          I kind of understand it. I don’t miss it, but geocities, tripod and the like let people experiment at no cost and gave them a platform they could control. That energy was special, IMO, and seeing celebrations of it like this makes me remember the energy and the excitement about the potential of this new thing. Even if I don’t care for the aesthetic, personally.

                                          It was different from some of the gopher spaces, newsgroups and BBSes that I do really miss, but still seems to reflect a part of the spirit of the time that’s worth preserving.

                                          You’ve also got some of the kooks that were suddenly published and discoverable when the web and search engines came into being. Like the timecube guy and several less benign ones.

                                          I think Project Gutenberg might be one of my favorite “internet of old” things that is still really alive and useful.

                                        1. 6

                                          CICS is one of the most successful pieces of software in the world: there are over 30 000 licences, and most of the world’s top companies use it.

                                          Ooof. Looks like the book is from 1996, so it’s a bit of an extravagant claim even at the time.

                                          Fun fact: Daniel Jackson got so fed up with Z that he invented Alloy just to fix the shortcomings!

                                          1. 4

                                            CICS was the primary reason I worked on porting Swift to z/OS. COBOL interaction was the holy grail. (I left before we got to that point and I don’t know the status of things since then.) Point being that CICS plays a large role in funding the IBM compiler efforts.

                                            It should also be noted that the book was written on OS/2, so maybe they were getting those numbers from the IBM sales team. ;)

                                            1. 1

                                              Swift-COBOL interaction reminds me of the time I got multiprotocol routing set up between IP, IPX, AppleTalk, and SNA.

                                              It worked, somehow.

                                            2. 2

                                              It’s most successful, for suitably small values of “most”.

                                              1. 2

                                                The small most hypothesis.

                                            1. 2

                                              Sometimes, the title alone is not enough to uniquely identify a book. This isn’t limited to this particular post, but wouldn’t it be great if, in submissions tagged book, authors were also named? I’m not even mentioning ISBNs.

                                              1. 2

                                                Thanks for the feedback. I added an ISBN-10 for each of the books.

                                                1. 2

                                                  It is much better to include the author’s name, year of publication or printing, and edition. The ISBN on my copy of On Writing Well doesn’t match the one you listed. Besides, no one remembers ISBNs anyway. They are not used in any academic bibliography I’ve ever seen, either.

                                                  1. 1

                                                    Thanks a lot for that! It is now much easier to search for the books :^)

                                                    I do have to echo some of @GeoffWozniak’s comment here, however. I’m not sure whether I was clear enough in my initial suggestion, but alongside a full book title I would have expected the author(s) to always be mentioned, with an ISBN as an addition to the above. Here’s an example of what I had in mind: https://www.openbsd.org/books.html

                                                    What I do not agree with, in terms of said comment, is that providing an ISBN is worth next to nothing. I frequently use ISBNs to search for books, e.g. to identify a particular edition (the newest isn’t always the best and/or what I am after).

                                                1. 43

                                                    If you want to invest heavily in GNUisms (bash, .RECIPEPREFIX, …) and make your makefile less portable, go ahead and follow the advice in this article. If you don’t, be my guest: use sh(1) instead of bash(1), read the POSIX make specification, and test your makefiles with make implementations other than GNU make.

                                                    This helped me tremendously with writing good, concise and simple makefiles (a small example follows below). We need to stop writing unportable makefiles, and especially stop relying on the bloated GNU ecosystem.
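
                                                    For illustration, a minimal sketch of a portable makefile that sticks to the POSIX standard: suffix rules instead of % patterns, no GNU functions (file names are hypothetical; recipe lines begin with a tab):

                                                    .POSIX:
                                                    .SUFFIXES: .c .o

                                                    CC     = cc
                                                    CFLAGS = -O2

                                                    OBJS = main.o util.o

                                                    prog: $(OBJS)
                                                        $(CC) $(CFLAGS) -o prog $(OBJS)

                                                    # A POSIX suffix rule covers every .c -> .o translation.
                                                    .c.o:
                                                        $(CC) $(CFLAGS) -c $<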

                                                  1. 11

                                                    Alternatively, name your makefile GNUmakefile to mark it as “only tested with GNU”.

                                                    1. 10

                                                      and especially stop relying on the bloated GNU ecosystem.

                                                        I know this site likes to hate on GNU, but its make has more useful features than any other make. Pattern rules written with % make generic build rules so much easier. The text-manipulation functions make working with variables so much nicer. These are “killer” features of GNU make for me.
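
                                                        A small sketch of both features together (the file layout is hypothetical; recipe lines begin with a tab):

                                                        # Text functions build the object list from the source tree.
                                                        SRCS := $(wildcard src/*.c)
                                                        OBJS := $(patsubst src/%.c,build/%.o,$(SRCS))

                                                        all: $(OBJS)

                                                        # One % pattern rule covers every .c -> .o translation.
                                                        build/%.o: src/%.c
                                                            @mkdir -p $(dir $@)
                                                            $(CC) $(CFLAGS) -c $< -o $@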

                                                      1. 11

                                                        I used to be pretty harsh on GNU stuff until I realized how decrepit pure POSIX implementations are. For better or for worse, GNU tools do what people actually want, instead of some religious interpretation of Unix or minimum specification compliance.

                                                        1. 5

                                                          Seconded. POSIX make is pretty sparse. GNU Make has a bunch of warts but it has some conveniences that are nice to work with. Reading what automake dumps out isn’t a good way to judge it, either.

                                                        2. 2

                                                            I agree. Writing a portable (POSIX-compatible) makefile works when you do “regular” stuff, like building .c -> .o -> binaries, but anything more complex ends up having to shell out to scripts, either inside the makefile or around it.

                                                        3. 6

                                                            I don’t agree with the article either, although for a different reason (one that might be added to yours, really).

                                                            My standpoint is that shell scripting and makefiles are poorly understood by many people in the industry. Spicing up a makefile like this is likely to result in wrong makefiles when a colleague with only shallow experience of these tools changes something.

                                                            To give an example, it is expected that individual failures in the recipe will make the whole build rule fail, but setting .ONESHELL changes these semantics dramatically, with no change in syntax! Unless you also add -e to .SHELLFLAGS, error checking will work in an often unexpected way (see the sketch below).
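
                                                            A minimal sketch of the pitfall (the rule itself is made up; recipe lines begin with a tab):

                                                            .ONESHELL:              # the whole recipe now runs in ONE shell
                                                            .SHELLFLAGS = -ec       # without -e, mid-recipe failures are ignored

                                                            out/result:
                                                                mkdir -p out
                                                                false               # with the default -c alone, this failure would be swallowed
                                                                echo done > out/result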

                                                        1. 5

                                                          I have to pass -trigraphs to a modern version of gcc before this actually works.

                                                          As soon as I read “trigraphs”, the “WTF” made perfect sense.
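
                                                          For anyone who hasn’t hit this, a hypothetical demo of what -trigraphs changes: the translation happens before tokenization, even inside string literals.

                                                          cat > trigraph.c <<'EOF'
                                                          #include <stdio.h>
                                                          int main(void) { puts("what??!"); return 0; }
                                                          EOF

                                                          gcc trigraph.c -o without && ./without        # prints: what??!
                                                          gcc -trigraphs trigraph.c -o with && ./with   # prints: what|  (??! -> |)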

                                                          1. 1

                                                            Got any bad experiences?

                                                            1. 4

                                                              It’s long been known that trigraphs are a disaster. No sane person would design that today. It’s a vestige from days long gone, kept around for compatibility reasons. The intense confusion they cause is why GCC disables them by default.

                                                              1. 5

                                                                If you don’t speak up for the trigraph users, will anyone speak up for you?

                                                                (The amusing part is that EBCDIC does support these characters, but maybe that’s only on IBM i, and z/OS continues to be a nightmare hell dimension…)

                                                          1. 1

                                                            This article is absurd.

                                                            Exhibit A:

                                                            <meta name="GENERATOR" content="Microsoft FrontPage 6.0">
                                                            

                                                             Can we really take someone’s arguments about software quality seriously when they are using Microsoft FrontPage 6.0?

                                                            Now here is an unpalatable truth, twenty years on: most open source code is poor or unusable.

                                                             He starts with the premise that most open source software is garbage, but misses the point that most of everything is garbage. For example, most books ever written, photos ever taken, and paintings ever made are absolute garbage when compared to works of high quality in the same field.

                                                             This is also true of all software, and of all software companies: most software is low quality, and most companies will fail.

                                                             Linux is of course, mostly a copy of Unix, it is deeply unoriginal, being based on ideas going back to the time of the Vietnam War. These ideas were in turn evolved within Bell Labs by its creators who were also well-paid professionals. Linus Torvalds copied an idea whose basis had been funded by university and corporation money and without that basis there would have been no Linux. Early Linuxes were dreadful. My Ubuntu version of 2005 was an absolute crock that wasted the plastic on which it was distributed. Ubuntu was itself a loss-making personal hobby of an entrepreneur who had so many millions that he could afford to run the parent company, Canonical, at a loss for years. The situation in 2019 is better than 2005, but the Linux desktop still lags behind Windows and the interface looks stuck in the 90s.

                                                            I’d like to go into a deep discussion about the differences between a kernel and everything else but I think the point would be lost on Mark.

                                                             P.S. The author’s website is served by nginx (and is very likely running on a Linux server), which is open source software.

                                                            HTTP/1.1 200 OK
                                                            Server: nginx
                                                            Date: Sun, 15 Dec 2019 02:13:08 GMT
                                                            Content-Type: text/html
                                                            Content-Length: 89626
                                                            Connection: close
                                                            Last-Modified: Sat, 26 Oct 2019 13:39:53 GMT
                                                            ETag: "15e1a-595d066cc3e2a"
                                                            Accept-Ranges: bytes
                                                            
                                                            1. 7

                                                              Can we really take someone’s arguments about software quality seriously when they are using Microsoft FrontPage 6.0?

                                                              Short answer: yes. Longer answer: what the essay was written/published with has nothing to do with the arguments put forth.

                                                              This [most software is garbage] is also true of all software, and of all software companies: most software is low quality, and most companies will fail.

                                                              His point in saying this is to explicitly refute the notion put forth in Raymond’s essay that open source will produce higher quality software. Having worked in and with software for the past 25 years, I can safely say Mr. Tarver is not wrong.

                                                              I’d like to go into a deep discussion about the differences between a kernel and everything else but I think the point would be lost on Mark.

                                                              I seriously doubt that. I’ve met Mark. While he’s not the most pleasant person to be around all the time, I’m certain he can grasp the difference between “a kernel and everything else”.

                                                              1. 5

                                                                There are plenty of issues with the linked article, from minor ones like the incorrect possessive “its”, to the misapprehension that ESR wrote “The Cathedral…” as a polemic against closed source (he was attacking the GNU/FSF style of development). Nitpicking the editor that might have been used to write the content, or ages-old point-scoring about kernel vs. userland, isn’t actually engaging with the content of the piece.

                                                                There’s a lot of stuff I don’t agree with in the article, but I do agree that “open source” has become more and more of a sharecropping field for developers, where they’re expected to put in unpaid work that makes corporations serious money. It’s a discussion worth having.

                                                                1. 3

                                                                  Fair enough, I hear you and I agree that it is a discussion worth having.

                                                              1. 3

                                                                I think I see what this is trying to do, but I can’t see it leading to anything but an unmaintainable mess. It feels like it would be the kind of code that you only throw patches and tests at until it starts passing. I will admit that such a style is nice when you’re exploring or prototyping. Not having to care about structure is nice in that situation. I’m not seeing this working with large projects.

                                                                Key difference is: there is no contact point between modules. They are always oblivious about each-other [sic].

                                                                Sure, but there are still dependencies between them. Specifically data and behavioural dependencies. It’s not like you can avoid that. For example, is the input() stuff shared across everything? Controlling access to that sounds like a real chore. Based on what I’m seeing here, it seems to be pushing all the structural complexity of a system into the data.

                                                                I will have to read the research to get a better idea. This introduction only makes me foresee maintenance and debugging nightmares.

                                                                1. 6

                                                                  Debugging, by David Agans (2002). There is no better book out there on the skill of debugging. (Andreas Zeller’s Why Programs Fail is also great, but doesn’t count as “old”.)

                                                                  1. 11

                                                                    I’d recommend “Data and Reality” by William Kent. More of a philosophy book geared toward programmers, but it is asking good questions. The book has been updated a few times since the 70s, so the stale content has been removed.

                                                                    In the non-technical variety, “Zen and the Art of Motorcycle Maintenance”, which was also published in the 70s. It’s an introspection into what quality and expertise are made of.

                                                                    Also published in the 70s, “The Timeless Way of Building”, which argues for looking at patterns in the way people build. This and its successor, “A Pattern Language”, are the inspiration for design patterns. The difference between the original thought and what has been made of it makes for an interesting read.

                                                                    1. 3

                                                                      The only part of Zen and the Art… that stuck with me was the beer can shim.

                                                                      However, most of Zen Flesh, Zen Bones is permanently wired into my head.

                                                                      1. 3

                                                                        I’d recommend “Data and Reality” by William Kent.

                                                                        Seconded. Great book. I found it went nicely with The Systems Bible, by John Gall.

                                                                        1. 1

                                                                          Ooh, Data and Reality is even available on Audible! Woot, I am there! :) Thanks for the recommendation.

                                                                          1. 1

                                                                            I’d recommend “Data and Reality” by William Kent. More of a philosophy book geared toward programmers, but it is asking good questions. The book has been updated a few times since the 70s, so the stale content has been removed.

                                                                            While I came here to second Data and Reality, be warned about the third edition. It was updated after Kent died; it removed something like half of the original text and replaced it with the editor’s (Steve Hoberman) own half-baked ideas. And while the book says it puts the new material in italics, at least on Safari it often presents that content as if it were the original.

                                                                            Try to read both the earlier editions: the content is the same but it’s a book worth reading twice.

                                                                            1. 1

                                                                              I’m about 3/4 of the way through the most recent edition now on Audible.

                                                                              I haven’t read the earlier editions (and may not get to any time soon) but I just wanted to say the sidebars by Steve Hoberman are getting on my nerves. They’re mostly breathless generalizations about the author’s original ideas and I’m finding them to be a distraction I wish I could skip.

                                                                              GREAT book though, so glad I’m reading it. Thanks @laurentbroy for the recommendation!

                                                                            2. 1

                                                                              In the non-technical variety, “Zen and the Art of Motorcycle Maintenance”, which was also published in the 70s. It’s an introspection into what quality and expertise are made of.

                                                                              When reading this one I had the feeling that the entire concept could have been put more succinctly, instead of the constant repetition of the same theme.