1. 31
    1. 36

      Then again, I’ve rarely seen anyone use their editor of choice well. I’ve lost count of how many times I’ve watched someone open a file in vim, realise it’s not the one they want, close vim, open another file, close it again… aaarrrgh.

      I do this a lot, because I prefer browsing files in the shell. I make pretty extensive use of a lot of other vim features though. When did you become the arbiter of how “well” I’m using my computer?

      1. 3

        Closing vim seems odd to me. Why wouldn’t one instead open the new file without closing vim? Maybe it’s a cultural thing? I don’t think anyone would do that in Emacs.

        1. 27

          “Why would I ever leave my editor” definitely feels like a common refrain from the Emacs crowd.

          1. 1

            I do the thing you quoted as well, but that is because vim is often my editor of convenience on a machine rather than my editor of choice, which is true for many usages I see of vim.

        2. 22

          Because the shell lets me change directories, list files with globs, run find, has better tab-completion (bash, anyway), etc, etc. I might not remember the exact name of the file, etc. Finding files in the shell is something I do all day, so I’m fast at it. Best tool for the job and all that.

          (Yes I can do all that with ! in vi/vim/whatever, but there’s a cognitive burden since that’s not how I “normally” run those commands. Rather than do it, mess it up because I forgot ! in front or whatever, do it again, etc, I can just do it how I know it’ll work the first time.)

        3. 6

          This is exactly why I struggle with editors like Emacs. My workflow is definitely oriented around the shell. The editor is just another tool among many. I want to use it just like I use all my other tools. I can’t get on with the Emacs workflow, where the editor is some special place that stays open. I open and close my editor many, many times every day. To my mind, keeping your editor open is the strange thing!

          1. 3

            It’s rather simple actually: the relationship between the editor and the shell is turned on its head – from within the editor you open a shell (e.g. eshell, ansi-term, shell, …) and use it for as long as you need it, just like one would use vi from a shell. Ninja-slices.

            You can compare this to someone who claims to log out of their X session every time they close a terminal or a shell in a multiplexer. That would seem weird too.

            1. 3

              I know you can launch a shell from within your editor. I just never really understood why you would want to do that.

              Obviously some people do like to do that. My point is just that different ways of using a computer make intuitive sense to different people. I don’t think you can justify calling one way wrong just because it seems odd to you.

              1. 6

                I know you can launch a shell from within your editor. I just never really understood why you would want to do that.

                I do it because it allows me to use my editor’s features to:

                a) edit commands,
                b) manipulate the output of commands in another buffer (and/or use shell pipelines to prep the output buffer), and
                c) not have to context switch to a different window, shut down the editor, suspend the editor, or otherwise change how I interact with the currently focused window.

                1. 1

                  That makes a lot of sense. I guess I have been misleading in using the word “shell” when I should really have said “terminal emulator”. I often fire off shell commands from within my editor, for just the same reasons as you, but I don’t run an interactive shell. I like M-! but I don’t like eshell, does that make sense?

                  Having pondered this all a bit more, I think it comes down to what you consider to be a place. I’m pretty certain I read about places versus tools here on lobsters but I can’t find it now. These are probably tendencies rather than absolutes, but I think there are at least a couple of different ways of thinking about interaction with a computer. Some people think of applications as places: you start up a number of applications, they persist for the length of your computing session, and you switch between them for different tasks (maybe a text editor, a web browser and an e-mail client, or something). Alternatively, applications are just tools that you pick up and drop as you need them. For me, a terminal, i.e. an interactive shell session, is a place. It is the only long-lived application on my desktop, everything else is ephemeral: I start it to accomplish some task then immediately kill it.

          2. 3

            It’s really simple in Emacs: just Ctrl-z to suspend it, then run fg when you’re ready to go back.

    2. 11

      I don’t really get this list. I’m sure someone believes these things but I don’t know them. It seems like just a mish-mash of things the author has heard someone say, made into some sort of generalization.

      1. 11

        I see these on Hacker News and Reddit all the time:

        1. Assumption that rewriting in C will always be fast. So, better to do that than make your HLL version faster or do a hybrid.

        2. Assumption that GC’s always have long delays or something that forces you to use C. Many still don’t know that real-time GC’s were invented, or that some real-time software in industry uses HLL’s, despite pjmpl and me continuously posting about that. Myths about memory management are so strong that we just about need some coding celebrities to do a blog post, with references, on these kinds of things to generate a baseline of awareness. Then, maybe people will build more of them into mainstream languages. :)

        3. You have to use C for the C ABI. People were just arguing this on Lobsters and HN a while back on C-related posts. I was arguing the position that you don’t need C in the majority of cases in the C ecosystem. Just include, wrap, and/or generate it whenever you can.

        4. Conflating C and C++ where one shouldn’t. I slipped on that myself a lot in the past because many people coded C++ as if it were higher-level C. There is a distinct style for C++ that eliminates many C-related problems. I’m more careful now, and I still pass the correction on to others.

        5. You can only write kernels in C. This is a little inaccurate since many know you can use C++. They consider it a C superset, though, which sort of reinforces the idea that you need some kind of C. Many do believe everything depends on C underneath, though. It shows up in damn near every thread on high-level languages in low-level or performance-critical situations. In my main comment, I gave an extensive list of usages that came before, around the same time as, and much later than C. There are even more that used C only for the convenience of saving time with pre-built components. Linking to those would defeat the purpose of showing one doesn’t need C, though. ;)

        6. C maps closely to the hardware. This has been debated multiple times on Lobsters just this year. The most widespread metaphor for it is “cross-platform assembly.” So, this is a widespread belief whether it’s a myth or not. There’s a lot of disagreement on this one.

        The others, outside of the Emacs or terminal stuff, I’ve also seen or countered plenty of times. I don’t know that they’re widespread, though. The ones I cited I’ve seen debated by many people in many places over long periods of time. They’re definitely popular beliefs even if I don’t know the specific numbers of people involved.

        1. 5

          I’m not sure what you’re trying to say. I said that I’m sure some people believe the things mentioned in this blog, but what about it? Your list is just a rehash of the contents of the blog; what are you specifically adding to the discussion? There are people in every industry who are ignorant of aspects of their own industry, that’s just how the world works. The counter-evidence to the items in this list is accessible to anyone interested.

          1. 4

            I don’t really get this list. I’m sure someone believes these things but I don’t know them.

            You originally said the quote above. In your experience, you must never see these things. In my experience, and probably the author’s, they’re so common that programmers who believe them due to widespread repetition miss opportunities in their projects. This includes veterans who just never encountered specific technologies used way outside their part of industry or FOSS. Countering the myths and broadening people’s perspectives on social media might bring attention to the alternatives, leading to more of them being fielded in FOSS or commercial applications.

            That’s the aim when I bring stuff like this up. Sometimes, I see people actually put it into practice, too.

            1. 5

              programmers who believe them due to widespread repetition miss opportunities in their projects.

              But that’s clearly not true, right? How much code is written in JavaScript, Python, Ruby, Java, etc? Even if one believes GC’d languages are slow, they are still solving problems in them.

              After some thought, I believe my strong reaction to this comes down to a few reasons:

              1. I think most programmers are just ignorant of layers they don’t actively work with. A web dev might think C is the only language to implement kernels in because they just don’t know any better. Maybe that makes it a myth and I’m splitting hairs, but it feels more like “things some programmers are ignorant of”.
              2. I think the wording of several of these is misleading. “C is magically fast”?? Even people who believe C is the fastest language to implement anything in don’t call it magic, IME. Things implemented in C are usually very fast, but there is nothing magic about it. The same goes for GC’d languages being “magically slow”. Those descriptions just do not match the nuance of the actual discussion, IME.
              3. The endianness one is also misleading, IMO. The endianness of your machine certainly matters. The blog post linked is not about endianness not mattering; it’s about how to write code that is agnostic of endianness. These are very different things.

              I think my initial response was stronger than it needed to be; however, I do not believe this blog post really helps the situation, and it might even add some myths of its own.

              1. 4

                My experience isn’t with webdevs thinking C is the only language to implement kernels with - it’s about systems programmers who think so. The same goes for C being magically fast - people literally believe that it’s impossible to beat C at execution speed, no matter what language they pick to write code in. I had to prove to a colleague, with code and benchmarks, that C++’s std::sort was faster than C’s qsort.
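
                To make that concrete, here’s a minimal sketch of the kind of comparison involved (the data size and timing harness are illustrative, not the original benchmark): qsort calls its comparator through a function pointer on every comparison, while std::sort can inline it.

                ```cpp
                #include <algorithm>
                #include <chrono>
                #include <cstdio>
                #include <cstdlib>
                #include <vector>

                // qsort can only call its comparator through a function pointer.
                static int cmp_int(const void *a, const void *b) {
                    int x = *static_cast<const int *>(a);
                    int y = *static_cast<const int *>(b);
                    return (x > y) - (x < y);
                }

                int main() {
                    std::vector<int> a(1000000);
                    for (int &v : a) v = std::rand();
                    std::vector<int> b = a;  // same data for both sorts

                    auto t0 = std::chrono::steady_clock::now();
                    std::qsort(a.data(), a.size(), sizeof(int), cmp_int);
                    auto t1 = std::chrono::steady_clock::now();
                    std::sort(b.begin(), b.end());  // comparator is inlined by the compiler
                    auto t2 = std::chrono::steady_clock::now();

                    auto ms = [](auto d) {
                        return std::chrono::duration_cast<std::chrono::milliseconds>(d).count();
                    };
                    std::printf("qsort:     %lld ms\n", static_cast<long long>(ms(t1 - t0)));
                    std::printf("std::sort: %lld ms\n", static_cast<long long>(ms(t2 - t1)));
                }
                ```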

                On endianness I think I did myself a disservice by implying that it never matters - it just doesn’t 99.9% of the time. Write endianness-agnostic code and be done with it. It’s like caring which bits of a float are the mantissa - the CPU will do the right thing.

                1. 3

                  Oh yeah, I’ve had many C programmers say that stuff to me. You were on point with the Fortran counter: I use it, too.

      2. 1

        This is true - it’s a list of things that I hear/read a lot and don’t understand why people believe them.

        1. 6

          Subtitle: the perils of the comment section.

          Corollary: disproving these myths will cause ten others to spontaneously emerge to take their place. :)

        2. 2

          Because people believe strange things and/or are often ignorant of things they don’t have close proximity to. How many production kernels are not written in C? And how many of those will your average developer have knowledge of if they aren’t actively interested in the OS layer?

          To put it another way: you probably believe things that someone with alternative experience would classify as myths and not understand why you believe them.

          1. 3

            Because people believe strange things and/or are often ignorant of things they don’t have close proximity to

            Sure, but I know people who still think these things despite having been presented with evidence to the contrary.

            you probably believe things that someone with alternative experience would classify as myths and not understand why you believe them

            With nearly 100% certainty. I love to be proven wrong, because after I am, I’m no longer wrong about that particular matter.

            This list is just things that I have observed that I find irrational, with no implications on my part that I’m any more rational on other matters. Or even these!

    3. 7

      Nice article. A few observations.

      On the GC part, it’s worth bringing up memory pools, refcounting, and low-latency/real-time GC’s. From what I read, Go and Nim are examples of languages using quick ones. C++ and Ada long supported use of things like memory pools or reference counting. If the compiler didn’t have it, the programmer could add it as a library/module.
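
      As a small sketch (not from the article), both of those techniques are plain standard-library C++ these days: a memory pool via std::pmr and reference counting via shared_ptr.

      ```cpp
      #include <memory>
      #include <memory_resource>
      #include <vector>

      int main() {
          // Memory pool: allocations are carved out of one buffer and released together.
          std::pmr::monotonic_buffer_resource pool(64 * 1024);
          std::pmr::vector<int> values(&pool);
          for (int i = 0; i < 1000; ++i) values.push_back(i);

          // Reference counting: the object is destroyed when the last owner goes away.
          auto first = std::make_shared<int>(42);
          auto second = first;  // use count is now 2
      }
      ```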

      On text editors without goto definition, they might want to look at LISP Machines for other ancient features they might want to emulate in a UNIX/C environment. Although those machines had a window system, the interface isn’t much more advanced than some stuff I had on MSDOS a long time ago. I’m sure a UNIX app or a set of interacting apps could pull it off.

      On the C ABI, I’ve been repeatedly countering this, even here. One can always add seamless integration of C code and/or generation of it to their language. It should probably be one of the first design considerations, too, since some high-level languages will clash with C’s nature. At the least, one can get better parsing/refactoring, more safety, real macros, easier portability, and support for specialized hardware. One does this by getting rid of the parts of the C language that make it hard.
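
      As a sketch of that idea (sum_abs is a made-up example), a C-ABI entry point can be exported without writing any C: extern "C" pins down the name and calling convention while the body stays in the host language.

      ```cpp
      #include <algorithm>
      #include <cstdlib>
      #include <vector>

      // Made-up example: an entry point callable from anything that speaks the
      // C ABI, implemented without writing a line of C itself.
      extern "C" int sum_abs(const int *data, int n) {
          std::vector<int> v(data, data + n);
          std::transform(v.begin(), v.end(), v.begin(),
                         [](int x) { return std::abs(x); });
          int total = 0;
          for (int x : v) total += x;
          return total;
      }
      ```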

      On concurrency, Concurrent Pascal (1975) by Per Brinch Hansen was used in an OS, with Ada Ravenscar and Eiffel SCOOP having seen a lot of commercial deployment.

      On kernels, there’s Burroughs MCP done in ALGOL (Unisys still sells it), IBM OSes in PL/S (a PL/I-like language), the Intel iAPX 432 with its OS in Ada, the Xerox Star in Mesa, Oberon in Oberon, House mostly in Haskell, and Muen in SPARK Ada. Hell, there’s even one in FreeBASIC.

      1. 0

        Thanks for the comments. I figured I needed to write a whole blog post about GCs and how it’s likely that the people who believe what they do just don’t know how they work. It’s a long post as it is.

        1. 2

          After you write it, be sure to submit it here. I’ll definitely read it. Here’s a search listing of the real-time GC’s I submitted in case any is useful.

    4. 3

      This article is mostly fluff, but I did laugh out loud at the line about “hacker news and reddit commenters are smarter than average”.

      1. 1

        I’m pretty sure that they are. Not because of some virtue that belonging to those communities inherently confers, but because of pre-selection for people who are slightly more invested than average.

        HN is sketchier, though; while I was active there, it seemed to have more business people than actual industry workers…

        1. 1

          I can see why you would think that, based only on the messages that are posted, without being able to see the ones that weren’t. But this is more like an iceberg: the smartest and most experienced take one look at these communities and nope away forever. (Yes, I already know what that says about me.) :)

    5. 0

      A list of beliefs about programming that I maintain are misconceptions.

      1. 3

        Small suggestion: use a darker, bigger font. There are likely guidelines somewhere, but I don’t think you can go wrong with #000 for text people are supposed to read for longer than a couple of seconds.

        1. 3

          Current web design seems allergic to any sort of contrast. Even hyper-minimalist web design calls for less contrast for reasons I can’t figure out. Admittedly, I’m a sucker for contrast; I find most programming colorschemes hugely distasteful for the lack of contrast.

          1. 6

            I think a lot of people find the maximum contrast ratios their screens can produce physically unpleasant to look at when reading text.

            I believe that people with dyslexia in particular find reading easier with contrast ratios lower than #000-on-#fff. Research on this is a bit of a mixed bag but offhand I think a whole bunch of people report that contrast ratios around 10:1 are more comfortable for them to read.

            As well as personal preference, I think it’s also quite situational? IME, bright screens in dark rooms make black-on-white headache inducing but charcoal-on-silver or grey-on-black really nice to look at.

            WCAG AAA asks for a contrast ratio of 7:1 or higher in body text which does leave a nice amount of leeway for producing something that doesn’t look like looking into a laser pointer in the dark every time you hit the edge of a glyph. :)

            As for the people putting, like, #777-on-#999 on the web, I assume they’re just assholes or something, I dunno.

            Lobsters is #333-on-#fefefe which is a 12.5:1 contrast ratio and IMHO quite nice with these fairly narrow glyphs.
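
            For the curious, a little sketch of the WCAG relative-luminance arithmetic that the 12.5:1 figure falls out of:

            ```cpp
            #include <algorithm>
            #include <cmath>
            #include <cstdio>

            // Linearise one 8-bit sRGB channel, per the WCAG relative-luminance formula.
            static double channel(double c) {
                c /= 255.0;
                return c <= 0.03928 ? c / 12.92 : std::pow((c + 0.055) / 1.055, 2.4);
            }

            static double luminance(int r, int g, int b) {
                return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
            }

            int main() {
                double fg = luminance(0x33, 0x33, 0x33);  // #333
                double bg = luminance(0xfe, 0xfe, 0xfe);  // #fefefe
                double ratio = (std::max(fg, bg) + 0.05) / (std::min(fg, bg) + 0.05);
                std::printf("%.1f:1\n", ratio);  // prints about 12.5:1
            }
            ```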

            (FWIW, I configure most of my software for contrast ratios around 8:1.)

            1. 2

              Very informative, thank you!

      2. 3

        I think the byte-order argument doesn’t hold, given that you mentioned ntohs and htons, which are exactly where byte order needs to be accounted for…

        1. 2

          If you read the byte stream as a byte stream and shift the bytes into position, there’s no need to check the endianness of your machine (you just need to know the endianness of the stream) - the shifts will always do the right thing. That’s the point he was trying to make there.
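
          A tiny sketch of what that looks like (read_u16_be is a hypothetical helper name): decode a 16-bit big-endian value with shifts, and the host’s byte order never enters into it.

          ```cpp
          #include <cstdint>
          #include <cstdio>

          // Hypothetical helper: decode a 16-bit big-endian value from a byte stream.
          // Only the stream's byte order matters here, never the host's.
          static std::uint16_t read_u16_be(const unsigned char *p) {
              return static_cast<std::uint16_t>((p[0] << 8) | p[1]);
          }

          int main() {
              const unsigned char stream[] = {0x12, 0x34};
              std::printf("0x%04x\n", read_u16_be(stream));  // prints 0x1234 on any host
          }
          ```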

          1. 2

            ntohs and htons do that exact thing, and you don’t need to check the endianness of your machine, so the comment about not understanding why they exist makes me feel like the author is not quite grokking it. Those functions/macros can be implemented to do the exact thing linked to in the blog post.