1. 3

    I remember learning Ada back in college (late 80s/early 90s) and the major complaint back then was the verbosity of the language. It’s funny to think that C++ today is more verbose than Ada.

    1. 1

      The verbosity is something I agree on. I think Ada’s goals can be accomplished with less verbosity today. I think I may have missed that C++ comparison entirely, though. I didn’t miss how long the code and error messages looked in some cases. Good point haha.

    1. 4
      1. Absolutely: rework code until warnings are 0, because of exactly this + issues across compilers and platforms.

      2. Did not understand why the fill bits are not 0 but the last bit.

      https://stackoverflow.com/questions/6555094/what-does-cltq-do-in-assembly

      Gives the answer: it's a sign extension.
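
      To see it concretely, here's a minimal C sketch (it assumes a 64-bit long, as on typical x86-64 Linux) showing signed widening copying the sign bit while unsigned widening zero-fills:

      #include <stdio.h>

      int main(void)
      {
          int  small = -1;                /* bit pattern 0xFFFFFFFF            */
          long wide  = small;             /* signed widening (cltq on x86-64)  */
                                          /* fills with copies of the sign bit */
          unsigned int  usmall = 0xFFFFFFFFu;
          unsigned long uwide  = usmall;  /* unsigned widening zero-fills      */

          printf("%lx\n", (unsigned long)wide);  /* ffffffffffffffff */
          printf("%lx\n", uwide);                /* ffffffff         */
          return 0;
      }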

      1. 5

        Yes! And I’d go a step further: once you get down to warning-zero, turn on -Werror to make sure no one introduces another warning.

        1. 3

          bonus points for -Wall -Wpedantic

          1. 1

            Not quite. The only warning I let slide is:

            warning: ISO C forbids assignment between function pointer and `void *'
            

            But that's because POSIX requires that behavior. Otherwise, yes, I will crank up the warnings when compiling.
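
            For the curious, here's a minimal sketch of the pattern that triggers it (a hypothetical example, not anyone's real code; on Linux, build with something like cc -Wall -Wpedantic example.c -ldl):

            #include <stdio.h>
            #include <dlfcn.h>

            int main(void)
            {
                void *lib = dlopen("libm.so.6", RTLD_NOW);
                if (lib == NULL) { fprintf(stderr, "%s\n", dlerror()); return 1; }

                double (*my_cos)(double);

                /* dlsym() returns a void *, yet POSIX guarantees it can hold a
                   function pointer--this assignment is what ISO C warns about. */
                my_cos = dlsym(lib, "cos");

                printf("cos(0) = %f\n", my_cos(0.0));
                dlclose(lib);
                return 0;
            }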

        1. 0

          Etsy has some interesting technology. I'm using a variation on their statsd for logging KPIs (Key Performance Indicators) at work. The old method of adding a KPI involved several modifications per KPI (and complicated logic in the component to log them every X minutes). With the new method, you just add them as and where needed.

          This idea is also great.
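
          For reference, part of why it's that easy is the statsd wire format: a metric is just a short string like name:value|type in a UDP datagram, so a KPI can be logged from anywhere with a tiny helper. A minimal sketch in C (the KPI name and address are made up; 8125 is statsd's usual default port):

          #include <stdio.h>
          #include <string.h>
          #include <unistd.h>
          #include <arpa/inet.h>
          #include <sys/socket.h>

          /* Fire-and-forget a statsd counter increment over UDP. */
          static void kpi_count(const char *name)
          {
              int fd = socket(AF_INET, SOCK_DGRAM, 0);
              if (fd < 0) return;                 /* KPI logging is best-effort */

              struct sockaddr_in addr;
              memset(&addr, 0, sizeof addr);
              addr.sin_family = AF_INET;
              addr.sin_port   = htons(8125);              /* default statsd port */
              inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

              char msg[256];
              int len = snprintf(msg, sizeof msg, "%s:1|c", name);
              sendto(fd, msg, len, 0, (struct sockaddr *)&addr, sizeof addr);
              close(fd);
          }

          int main(void)
          {
              kpi_count("myapp.requests");    /* hypothetical KPI name */
              return 0;
          }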

          1. 15

            Polemics like this always seem to leave out the part where, though things might not be exactly to the author's preference, they are nonetheless actually pretty impressive. We've built a lot of systems that are truly amazing in the positive impact they have, just as we have built tools that are used for distasteful purposes. In addition, a lot of the idealism goes out the window once you actually try to build a real thing which works for thousands or millions of people, rather than a lab experiment or research project.

            I’m sure we’re not at a global maximum of whatever it is we should be optimising, but the idea that everything is terrible and we’re all just lying to ourselves is such a tired one.

            1. 5

              Well, the article wasn’t about that (it’s about how history is misrepresented to present a world dominated by technological determinism), but I’m always up for discussing the subject.

              ‘Polemics like this’ are generally not making the argument that everything is terrible, but that relatively straightforward & obvious improvements are not being made (or once made are being ignored). In the case of the work of folks mentioned in this article, commercial products today are strictly worse along the axes these people care about than commercially-available systems in the late 1970s and early 1980s. In the case of both Alan Kay & Ted Nelson, they themselves released open source software that gets closer to their goals.

              I don’t think it’s unfair to get mad about a lack of progress that can’t be excused by technical difficulties. It’s absurd that popularly available software doesn’t support useful features common forty years ago. However, the tech industry is uniquely willing to reject labor-saving technology in favor of familiar techniques – it does so to a degree far greater than other industries – so while absurd, it’s not surprising: software engineering is an industry of amateurs, and one largely ignorant of its own history.

              1. 6

                I think you’re deliberately understating the current state of computing, programming, and networking.

                It’s absurd that popularly available software doesn’t support useful features common forty years ago.

                Like clipboards! And spellcheckers! And pivot tables! And multimedia embedding! And filesharing! And full-text searching! And emailing!

                Except…wait a second, those weren’t really common useful features at all. Wait a second….

                However, the tech industry is uniquely willing to reject labor-saving technology in favor of familiar techniques – it does so to a degree far greater than other industries – so while absurd, it’s not surprising

                What do you mean by this? Have you compared it to industries like, say, paper printing? Healthcare? Cooking?

                Would you consider the constant churn of, say, web frameworks promising ever-easier development to be favoring familiar techniques? What about the explosion of functional and ML languages which will magically save us from the well-documented pitfalls and solutions of procedural and OOP languages, for the mere cost of complete retraining of our developers and reinvention of the entire software stack?

                Please put some more effort into these bromides–facile dismissals of tech without digging into the real problems and factors at play are at least as shortsighted as anything you complain of in your article.

                one largely ignorant of its own history.

                Here I would have to agree with you. :)

                1. 3

                  Like clipboards!

                  That's actually a decent example. Companies operating under the philosophy of maximum interoperability that enkiv2 is going for would want everyone's clipboards to interoperate with each other, too. Instead we get walled-garden implementations. Additionally, Microsoft patented it instead of leaving it open, in case they want to use it offensively to block its adoption and/or monetize it.

                  On a technical angle, clipboards were much weaker than data-sharing and usage models that came before them. Some of the older systems could've easily been modified to do that with more uses than clipboards currently offer. There are entire product lines on Windows and Linux dedicated to letting people manipulate their data in specific ways that might have just been a tie-in (like clipboards) on top of a fundamental mechanism using the extensible designs enkiv2 and his sources prefer. Instead, we get patented, weak one-offs like clipboards added on many years after. Other things like search and hyperlinks came even later, with Microsoft's implementation in Windows once again trying to use IE to lock everyone in vs the real vision of the WWW.

                  I could probably write something similar about filesharing adoption in mainstream OS's vs distributed OS's, languages, and filesystems from decades earlier.

                  1. 3

                    The clipboard mechanism in X Windows allows for more than just text. I just highlighted some text in Firefox on Linux. When I query the selection [1], I see I have the following targets:

                    • Timestamp
                    • targets (this returns the very list I’m presenting)
                    • text/html
                    • text/_moz_htmlcontext
                    • text/_moz_htmlinfo
                    • UTF8_STRING
                    • COMPOUND_TEXT
                    • TEXT
                    • STRING
                    • text/x-moz-url-priv

                    If I select text/html I get the actual HTML code I selected in the web page. When I select text/x-moz-url-priv I get the URL of the page that I selected the text on. TEXT just returns the text (and the alt text from the image that's part of the selection). I use that feature (if I'm on Linux) when blogging—this allows me to cut a selection of a webpage to paste into an entry which grabs the URL along with the HTML.

                    Of course, it helps to know it’s available.

                    [1] When first playing around with this in X Windows, I wrote a tool that allowed me to query the X selection from the command line.
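
                    For anyone who wants to repeat the experiment, a minimal sketch of such a query tool might look like this (not the actual tool mentioned above; it assumes Xlib and builds with something like cc targets.c -lX11):

                    #include <stdio.h>
                    #include <X11/Xlib.h>
                    #include <X11/Xatom.h>

                    /* Ask the owner of the PRIMARY selection which targets
                       (formats) it can convert the selection to. */
                    int main(void)
                    {
                        Display *dpy = XOpenDisplay(NULL);
                        if (dpy == NULL) { fprintf(stderr, "cannot open display\n"); return 1; }

                        Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                                         0, 0, 1, 1, 0, 0, 0);
                        Atom targets = XInternAtom(dpy, "TARGETS", False);
                        Atom prop    = XInternAtom(dpy, "SEL_TARGETS", False);

                        XConvertSelection(dpy, XA_PRIMARY, targets, prop, win, CurrentTime);

                        XEvent ev;
                        do XNextEvent(dpy, &ev); while (ev.type != SelectionNotify);

                        if (ev.xselection.property != None) {
                            Atom type; int fmt;
                            unsigned long n, left;
                            unsigned char *data = NULL;
                            XGetWindowProperty(dpy, win, prop, 0, 1024, False, XA_ATOM,
                                               &type, &fmt, &n, &left, &data);
                            Atom *atoms = (Atom *)data;
                            for (unsigned long i = 0; i < n; i++) {
                                char *name = XGetAtomName(dpy, atoms[i]);
                                printf("%s\n", name);
                                XFree(name);
                            }
                            XFree(data);
                        }
                        XCloseDisplay(dpy);
                        return 0;
                    }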

                    1. 1

                      Didn't know about those tricks. Thanks for the tip.

                      1. 2

                        There’s a reason I used it as an example. ;)

                        Ditto spellcheckers–they used to be a real bear to implement and ship as a feature, if they got shipped at all, because of the memory and speed constraints involved.

                      2. 1

                        That’s table stakes for clipboard implementations. Microsoft Windows also allows multiple objects to be attached to the clipboard, with the expectation that they provide different representations of the same data. Android’s clipboard is built on the same Content Provider API that all of their application data interoperability uses. The WHATWG clipboard API allows the data to be keyed by mimetype, using the same formats as the drag-and-drop API. I assume macOS provides similar functionality, but I don’t know where to look for info on that.

                        It's not used for anything that fancy because (a) the clipboard can only ever hold one piece of data at a time, and (b) you have to get the applications to support the same data format, just as if you'd used a file.
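
                        For illustration, a minimal Win32 sketch of the multiple-representations idea (error handling omitted; note that a real "HTML Format" payload also needs a small descriptor header, which is skipped here):

                        #include <windows.h>
                        #include <string.h>

                        /* Put the same snippet on the clipboard twice: once as plain
                           text, once under a registered format, as alternate
                           representations of the same data. */
                        static HGLOBAL dup_to_global(const char *s)
                        {
                            size_t n = strlen(s) + 1;
                            HGLOBAL h = GlobalAlloc(GMEM_MOVEABLE, n);
                            memcpy(GlobalLock(h), s, n);
                            GlobalUnlock(h);
                            return h;
                        }

                        int main(void)
                        {
                            UINT html = RegisterClipboardFormatA("HTML Format");

                            if (OpenClipboard(NULL)) {
                                EmptyClipboard();
                                SetClipboardData(CF_TEXT, dup_to_global("hello"));
                                SetClipboardData(html, dup_to_global("<b>hello</b>"));
                                CloseClipboard();
                            }
                            return 0;
                        }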

                    2. 2

                      I think you’re deliberately understating the current state of computing, programming, and networking.

                      Never do I say that the tech we have is worthless. But, at every opportunity, I like to bring up the fact that with not much effort we could do substantially better.

                      those weren’t really common useful features at all.

                      It shouldn’t take fifty years for a useful feature to migrate from working well across a wide variety of experimental systems to working poorly in a handful of personal systems – particularly since we have a lot of developers, and every system owned by a developer is a de-facto experimental system.

                      There weren’t impossible scaling problems with these technologies. We just didn’t put the effort in.

                      Have you compared it to industries like, say, paper printing? Healthcare? Cooking?

                      I was thinking in particular of fields of engineering. CAD got adopted in mechanical engineering basically as soon as it was available.

                      But, sure: in the domain of cooking, sous vide systems got adopted in industrial contexts shortly after they became available, and are now becoming increasingly common among home cooks. Molecular gastronomy is a thing.

                      Healthcare is a bit of a special case. All sorts of problems, at least in the US, and since the stakes are substantially higher for failures, some conservatism is justified.

                      Printing as an industry has a tendency to adopt new technology quickly when it’ll increase yield or lower costs – even when it’s dangerous (as with the linotype). There are some seriously impressive large-scale color laser printers around. (And, there was a nice article going around about a year ago about the basic research being done on the dynamics of paper in order to design higher-speed non-jamming printers.) My familiarity with printing is limited, but I’m not surprised that Xerox ran PARC, because printing tech has been cutting edge since the invention of xerography.

                      Would you consider the constant churn of, say, web frameworks promising ever-easier development to be favoring familiar techniques?

                      Promising but never actually delivering hardly counts.

                      What about the explosion of functional and ML languages which will magically save us from the well-documented pitfalls and solutions of procedural and OOP languages, for the mere cost of complete retraining of our developers and reinvention of the entire software stack?

                      Functional programming is 70s tech, and the delay in adoption is exactly what I’m complaining about. We could have all been doing it thirty years ago.

                      (The other big 70s tech we could benefit a great deal from as developers but aren't using is planners. We don't write Prolog, we don't use constraint-based code construction, and we don't use provers. SQL is the rare exception where we rely upon a planner-based system at all in production code. Instead, we go the opposite route: write Java, where engineer time and engineer effort are maximized because everything is explicit.)

                      1. 1

                        We could have all been doing it thirty years ago.

                        I’m pretty sure the computers of the time weren’t really up to the task. How much RAM does GHC take up?

                        Can you write a useful functional language and interpreter that works in the hardware available at the time, and will it be faster/smaller than the equivalent, say, C compiler?

                        1. 2

                          The variety of lisps and schemes available for early-90s commodity hardware indicates that a functional style has been viable on that hardware for thirty years. We can be very conservative and call it twenty-five too: if Perl scripts running CGI are viable, so are all of the necessary features of functional programming (provided the developer of the language has been sensible & implemented the usual optimizations). Haskell is probably not the best representative of functional programming as a whole in this context: it really is heavy, in ways that other functional languages are not, and has a lot of theoretical baggage that is at best functional-adjacent. Folks can and did (but mostly didn't) run lisp on PCs in '95.

                          By the late 90s, we are already mostly not competing with C. We can compare performance to perl, python, and java.

                          The question is not “why didn’t people use haskell on their sinclair spectrums”. The question is “why didn’t developers start taking advantage of the stuff their peers on beefier machines had been using for decades as soon as it became viable on cheap hardware?”

                          1. 3

                            You’re playing fast and loose with your dates. You said 30 years ago, which would’ve been 1988–before the Intel 486. Even so, let’s set that aside.

                            The variety of lisps and schemes available for early-90s commodity hardware indicates that a functional style has been viable on that hardware for thirty years.

                            The most you could say was that the languages were supported–actual application development relies on having code that can run well on the hardware that existed. I think you’re taking a shortcut in your reasoning that history just doesn’t bear out.

                            if Perl scripts running CGI are viable, so are all of the necessary features of functional programming (provided the developer of the language has been sensible & implemented the usual optimizations).

                            I’m not sure what you mean by viable here. The Web in the 90s kinda sucked. You’re also overlooking that actual requirements for both desktops and servers at the time for web stuff were pretty low–bandwidth was small, clients were slow, and content was comparatively tiny in size when compared with what we use today (or even ten years ago).

                            The second bit about “assuming the developer of the language” is handwaving that doesn’t even hold up to today’s languages–people make really dumb language implementation decisions all the time. Ruby will never be fast or small in memory for most cases. Javascript is taking a long time to get TCO squared away properly. Erlang is crippled for numeric computation compared to the hardware it runs on.

                            By the late 90s, we are already mostly not competing with C. We can compare performance to perl, python, and java.

                            I don't believe that to be the case, especially in the dominant desktop environment of the time, Windows. Desktop software was at the time very much written in C/C++, with Visual Basic and Delphi probably near leaders.

                            ~

                            I think the problem is that you’re basing your critiques on a present based on a past that didn’t happen.

                            1. 4

                              You’re also overlooking that actual requirements for both desktops and servers at the time for web stuff were pretty low–bandwidth was small, clients were slow, and content was comparatively tiny in size when compared with what we use today (or even ten years ago).

                              That’s not a bad thing. :)

                              1. 3

                                actual application development relies on having code that can run well on the hardware that existed

                                Professional (i.e., mass-production) development is more concerned with performance than hobby, research, and semi-professional development – all of which is primarily concerned with ease of exploration.

                                Sure, big computing is important and useful. I'm focusing on small computing contexts (like home computers, non-technical users, technical users working outside of a business environment, and technical users working on prototypes rather than production software) because small computing gets no love (while big computing has big money behind it). Big computing doesn't need me to protect it, but small computing does, because small computing is almost dead.

                                So, I think your criticisms here are based on an incorrect understanding of where I’m coming from.

                                Professional development is totally out of scope for this – of course professionals should be held to high standards (substantially higher than they are now), and of course the initial learning curve of tooling doesn’t matter as much to professionals, and of course performance matters a lot more when you multiply every inefficiency by number of units shipped. I don’t need to say much of anything about professional computing, because there are people smarter than me whose full time job is to have opinions about how to make your software engineering more reliable & efficient.

                                Powerful dynamic languages (and other features like functional programming & planners) have been viable on commodity hardware for experimental & prototype purposes for a long time, and continue to become progressively more viable. (At some point, these dynamic languages got fast enough that they started being used in production & user-facing services, which in many cases was a bad idea.)

                                For 30 years, fairly unambiguously, fewer people have been using these facilities than is justified by their viability.

                                Folks have been prototyping in their target languages (and thus making awkward end products shaped more by what is easy in their target language than what the user needs), or sticking to a single language for all development (and thus being unable to imagine solutions that are easy or even idiomatic in a language they don’t know).

                                For a concrete example, consider the differences between Wolfenstein 3D and Doom. Then, consider that just prior to writing Doom, id switched to developing on NeXT machines & started writing internal tooling in Objective-C. Even though Doom itself ran on DOS & could be built on DOS, the access to better tooling in early stages of development made a substantially more innovative engine easier to imagine. It's a cut-and-dried example of the impact of tools on the exploration side of the explore/exploit divide, wherein for technical reasons the same tools are not used on the production (exploit) side.

                                people make really dumb language implementation decisions all the time

                                Sure. And, we consider them dumb, and criticize them for it. But, while today’s hardware will run clojure, a c64 lisp that doesn’t have tail recursion optimization will have a very low upper limit on complexity. The difference between ‘viable for experimentation’, ‘viable for production’, and ‘cannot run a hello world program’ is huge, and the weaker the machine the bigger those differences are (and the smaller a mistake needs to be to force something into a lower category of usability).

                                The lower the power of the target machine, the higher the amount of sensible planning necessary to make something complex work at all. So, we can expect early 90s lisp implementations for early 90s commodity hardware to have avoided all seriously dumb mistakes (even ones that we today would not notice) & performed all the usual optimization tricks, so as to be capable of running their own bootstrap.

                                There are things that can barely run their own bootstrap, and we generally know what they are. I don’t really care about them. There are other things that were functional enough to develop in. Why were they not as widely used?

                                Desktop software was at the time very much written in C/C++, with Visual Basic and Delphi probably near leaders.

                                Sure, but software was being written in scripting languages, and so writing your software in a scripting language was not a guarantee that it would be the slowest thing on the box (or even unusably slow). That makes it viable for writing things in that will never be sold – which is what I’m concerned with.

                                I think the problem is that you’re basing your critiques on a present based on a past that didn’t happen.

                                I think I just have a different sense of appropriate engineer time to cpu time tradeoffs, and don’t consider the mass production aspect of software to be as important.

                                1. 1

                                  I certainly ran Emacs Lisp on a 386 SX-16, and it ran fine. I didn’t happen to run Common Lisp on it, mainly because I wasn’t into it, or maybe there were only commercial implementations of it back then. But I would be pretty surprised if reasonable applications in CL weren’t viable on a 386 or 68020 in 1990. Above-average amounts of RAM were helpful (4 or 8 MB instead of 1 or 2).

                    1. 13

                      You might want to read up on control characters before deciding what control characters you might want to redefine. I know that DC1 and DC3 are still in active use on Unix systems (^S, DC3, will stop output in a terminal window; ^Q, DC1, will start it going again).

                      As far as the “reveal codes” feature of WordPerfect goes, an HTML editor could do the same—HTML tags are a type of “code” after all. In fact, in the 90s we had just such a program for HTML—Dreamweaver. Worked pretty much like WordPerfect except it used HTML instead of control characters.

                      1. 2

                        That is a gem. Thank you for finding it for me! I'm going to look at it a bit and see if there's some selection of control characters that could be reused without drama.

                        1. 1

                          I read up on the control characters yesterday; I paid particular attention to RFC 20, because the codes seem to have remained mostly the same since then and that document is the easiest to comprehend.

                          The transmission and device control characters seem to be the safest to use in text streams. They are used as control signals in terminal software, but otherwise they don't seem to cause anything in file output. Therefore I can probably use the following block of characters:

                          [01-06]     SOH STX ETX EOT ENQ ACK
                          [10-16] DLE DC1 DC2 DC3 DC4 NAK SYN
                          

                          Emacs, nano, vim, and gedit gladly print these characters and still encode the file as UTF-8. I also opened a Linux console and ‘cat’ these in. There were no visible effects, so I guess it's safe to reuse these codes.

                          Most transmissions seem to assume that the stream is binary and that anything goes over, so I doubt this would have too many negative effects. Except that, of course, some other software could not reuse them anymore. Maybe it doesn't hurt to provide some small file header that's easy to type with a text editor.

                          I don’t need this many, so I will probably leave the DC1-DC4 without interpretation and only reuse the transmission control characters on this range.
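
                          On the reading side, a byte-wise scan should be enough, since bytes 0x01-0x06 can never appear inside a multi-byte UTF-8 sequence (continuation bytes are always 0x80 or above). A minimal sketch (reporting marker offsets is just for illustration):

                          #include <stdio.h>

                          /* Scan a UTF-8 text stream and report the reused
                             transmission control bytes. */
                          int main(void)
                          {
                              static const char *name[] = {
                                  NULL, "SOH", "STX", "ETX", "EOT", "ENQ", "ACK"
                              };
                              int c;
                              long off = 0;
                              while ((c = getchar()) != EOF) {
                                  if (c >= 0x01 && c <= 0x06)
                                      printf("%s at byte %ld\n", name[c], off);
                                  off++;
                              }
                              return 0;
                          }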

                          And regarding Dreamweaver… I think that's a bit of a different thing. I'm not after a WYSIWYG editor, but rather something that is good for content processing yet leaves files suitable to examine with a good old text editor.

                          WYSIWYG editing requires that you create poorly structured, single-purpose content.

                          1. 1

                            WYSIWYG editing requires that you create poorly structured, single-purpose content.

                            I disagree. It might make it harder, and you have to be more disciplined to do so, but using a WYSIWYG editor does not in itself preclude well-structured content.

                        1. 4

                          The 50th anniversary of “the mother of all demos” is tomorrow, Sunday December 9th.

                          1. 1

                            It's a pity it isn't being broadcast :_(

                            1. 2

                              Get yourself a projector, the side of a building, and a lot of popcorn, and fire it up!

                              1. 2

                                Oh no, no, I was referring to the 50th anniversary on the 9th ;-)

                                1. 1

                                  Again, I’m talking about today’s homage in Silicon Valley:

                                  https://thedemoat50.org/symposium/

                            1. 13

                              but Electron is without question a scourge

                              Well, how about a big F* you?

                              Sorry for the swear words, but while I agree that in general Electron is not great, calling it a scourge is incredibly offensive towards those who chose to develop with Electron and have good reasons for it. Sometimes writing native apps is cost-prohibitive, and you're better off with an Electron app that looks a bit out of place than with no app at all. It's cool for you to be smug and elitist and complain about stuff not following the HIG on every single platform the app is developed for, but have you ever thought about the cost of doing so? Yeah, big companies may be able to shell out enough money and pay developers to create native apps and follow the platform's HIG, but not everyone's a big business. I dare say the vast majority of app developers aren't. By hating on Electron and anyone who doesn't polish their app perfectly, you're alienating a whole lot of developers.

                              Learn to see the other side of the coin already.

                              (I ranted about this on my blog recently, also explaining why Electron was chosen over developing native apps.)

                              1. 37

                                complain about stuff not following the HIG on every single platform the app is developed for

                                Do you know why human interface guidelines exist? They exist because humanity is imperfect and accessibility is really important.

                                Electron apps are not accessible. They’re often completely opaque to a screen reader, they’re often completely opaque to assistive speech recognition software, they often don’t respond properly to keyboard navigation and if they do, they don’t behave as expected. They don’t often respond properly to text scaling, they don’t understand system-wide options like increased contrast or reduce motion.

                                To whole classes of users, your Electron app is worthless and unusable. What are we supposed to do? Congratulate you on your accomplishment? You need to shrink your ego and take some time to understand real world users, rather than throwing words around like “smug” and “elitist”.

                                Also your blog post doesn’t even mention the word “accessibility” once. How disappointing.

                                By hating on Electron and anyone who doesn’t polish their app perfectly, you’re alienating a whole lot of developers.

                                At the risk of sounding controversial, what does it matter if we alienate some developers? Developers should do better. They need to do better. I’m fine with alienating developers for good reasons.

                                Electron is not the answer. It’s a shortcut at best and a bandaid at worst. This isn’t a secret, so there’s really no point in acting surprised that people don’t agree with your choices.

                                1. 10

                                  I think that we who care about accessibility need to avoid taking a condemning, vitriolic tone, and meet developers where they are. That way, we’ll be more likely to get results, rather than just alienating a large and growing group of people. It’s true that Electron apps often have accessibility problems. But I believe we can improve that situation without calling for people to throw out Electron entirely. Frankly, we need access to these apps more than these developers need us. So we have to work with what we’ve got.

                                  No, I haven’t always lived up to this ideal when communicating with mainstream developers, and I’m sorry about that.

                                  1. 14

                                    I think that we who care about accessibility need to avoid taking a condemning, vitriolic tone, and meet developers where they are.

                                    The problem is that these developers don’t want to meet anywhere else - that’s how we arrived at Electron apps in the first place. It’s the easy way out.

                                    1. 4

                                      Most trends in IT are driven by herd mentality, familiarity, leveraging ecosystems, and marketing by companies. All of these seem to contribute to Electron use just like they contributed to Java and .NET getting big. They sucked, too, compared to some prior languages. So, it’s best to identify what they’re using within what ecosystems to find an alternative that’s better while still being familiar.

                                      Then, see how many move to it or don’t for whatever reasons. Iterate from there.

                                      1. 1

                                        Developers don’t want to listen because by the time they start publishing something (based on Electron, for a number of reasons), what they meet with is pure, unconditional hate towards the technology (and not just the tech! towards developers using said tech, too!) that enabled them. It is not surprising they don’t want to listen to the haters anymore.

                                        You're alienating new developers with this kind of mentality too, who would be willing to listen to your concerns. You're alienating them because of the sins of their fathers, so to speak. I'm not surprised no one cares about accessibility, to be honest. When we're told the world would be better off without developers like us, we're not going to be interested in working towards better accessibility.

                                    2. 5

                                      Do you know why human interface guidelines exist? They exist because humanity is imperfect and accessibility is really important.

                                      I’m aware, thank you. I’m very much aware that Electron is not… great, for many reasons. That’s still not a reason to unconditionally call it a scourge.

                                      To whole classes of users, your Electron app is worthless and unusable. What are we supposed to do? Congratulate you on your accomplishment? You need to shrink your ego and take some time to understand real world users, rather than throwing words around like “smug” and “elitist”.

                                      You, like the article author, ignore circumstances and generalize. Yes, my Electron app is going to be useless for anyone using a screen reader. It will be useless for a whole lot of people. However, it will exist, and hundreds of people will be able to use it. Without Electron, it wouldn't exist. So ask yourself this: which is better, an application that is not usable by some people but makes the life of the vast majority of its intended audience easier, or an application that does not exist?

                                      Here’s the situation: there’s a keyboard with open source firmware. Right now, to change the layout, you need to edit the firmware source, compile a new one, and upload it to your keyboard. While we tried to make the process easy, it’s… not very friendly, and never going to be. So I’m building an application that lets you do this from a GUI, with no need for a compiler or anything else but the app itself. I develop on Linux, because that’s what I have most experience with. Our customers are usually on Windows or Mac, though. With Electron, I was able to create a useful application, that helps users. Without it, if I had to go native, I wouldn’t even start, because I lack the time and resources to go that route. For people that can’t use the Electron app, there are other ways to tweak their keyboard. The protocol the GUI talks can be implemented by any other app too (I have an Emacs package that talks to it, too). So people who can’t use the Electron app, have other choices.

                                      So, thank you, I do understand real world users. That is why I chose Electron. Because I did my due diligence, and concluded that despite all its shortcomings, Electron is still my best bet. Stop smugly throwing around Electron hate when you haven't considered the circumstances.

                                      At the risk of sounding controversial, what does it matter if we alienate some developers? Developers should do better. They need to do better. I’m fine with alienating developers for good reasons.

                                      Well, for one, a lot of our customers would be deeply disappointed if they weren’t able to use the GUI configurator I built on Electron. “Developers should do better”. Well, come here and do my job then. Get the same functionality into the hands of customers without using Electron. I’ll wait (they won’t).

                                      Electron is not the answer. It’s a shortcut at best and a bandaid at worst. This isn’t a secret, so there’s really no point in acting surprised that people don’t agree with your choices.

                                      I agree it is not the best, and I’m not surprised people disagree with my use of it. I can even respect that, and have no problems with it. What I have problems with, is people calling Electron a scourge, and asserting that native is always best, and that anyone who doesn’t follow the HIG of a given platform “should do better”. I have a problem with people ignoring any and all circumstances, the reason why Electron was chosen for a particular product, and unconditionally proclaiming that the developers “should do better”. I have a problem with people who assert that alienating developers because they don’t (or can’t) write apps that match their idealistic desires is acceptable.

                                      Before writing off something completely, consider the circumstances, the whys. You may be surprised. You see, life is full of compromises, and so is software development. Sometimes you have to sacrifice accessibility, or native feel, or what have you, in order to ship something to the majority of your customers. I’d love to be able to support everyone, and write lighter, better apps, but I do not have the resources. People calling the technology that enables what I do a scourge, hurts. People asserting that I should do better, hurts.

                                      Until people stop ignoring these, I will call them elitist and smug. Because they assume others that chose Electron have the privilege of being able to choose something else, the resources to “do better”. Most often, they do not.

                                      1. 1

                                        Electron apps are not accessible. They’re often completely opaque to a screen reader, they’re often completely opaque to assistive speech recognition software, they often don’t respond properly to keyboard navigation and if they do, they don’t behave as expected.

                                        I haven’t written an electron app, but I’ve done a fair share of web UI programming with accessibility in mind. You communicate with platform accessibility APIs through web APIs. It’s not technically complicated, but it does require some domain expertise.

                                        Does it work the same way on electron?

                                        1. 8

                                          Electron embeds Chromium, which doesn't connect to the native a11y APIs (MS are working to fix this on Windows).

                                          As a result, Electron apps are as inaccessible as Chrome (screen reader users tend to use IE, Safari, or Firefox).

                                          1. 2

                                            Huh, this is a real surprise if true… I tested our web UI with screen readers across macOS and Windows in Chrome, FF, and IE, and the only problems that occurred were due to bad markup or the occasional platform bug. Reaching the accessibility API was not a problem I ran into with Chrome.

                                            1. 1

                                              Chrome is definitely accessible to screen readers. I've spent more time than I'd like getting JAWS to read consistently across IE, FF, and Chrome. From memory, Chrome was generally the most well behaved.

                                        2. 7

                                          When Apple was first developing the Mac, they did a lot of research into human/computer interaction, and one of the results was to apply a consistent interface on the system as a whole. This led to every app having the same menu structure (for the most part, the system menu (the Apple logo) was first, “File” next, then “Edit”) and under these standard menus, the actions were largely the same and in the same order. This would lower training costs, and if a user found themselves in a new app, they could at least expect some consistency in actions.

                                          I've been using Linux as a desktop since the mid-90s (and Unix in general since 1989) and to say there's a lack of consistency in UI is an understatement. Some apps have a menu bar at the top of the window (Mac menu bars are always at the top of the screen, per Fitts's Law), some you have to hold Ctrl down and press the mouse button, some the right mouse button will cause a pop-up menu. I've been able to navigate these inconsistencies, but I'm still annoyed by them.

                                          Furthermore, I'm used to the CLI, and yet even there, the general UI (the command set, the options to each command) still surprises me. I wrote about the consistency of GUIs and the inconsistencies I found on the Unix CLI over the years, and while I still prefer the CLI over the GUI [1], I can see where the consistency of the Mac GUI makes for a much better experience for many.

                                          As I’ve stated, I think it’s wonderful that PHP has enabled people to create the dynamic websites they envision, but I wouldn’t want to use the resulting code personally.

                                          [1] One can program the CLI to do repetitive tasks much more easily than one can do the same for any of today's GUIs. There have been some attempts over the years to script the GUI (Rexx, AppleScript) but it still takes more deliberate action than just writing a for loop at the command prompt for a one-off type of job.

                                          1. 5

                                            I think the case where it makes sense to go Electron is not the point of Gruber's rant. The point was that many, many developers today find it easier to write an Electron app and use it on all platforms instead of putting time and effort into polished Cocoa apps.

                                            The core of this blog post is how the Mac really was different in terms of UI/UX. When I started using the Mac (10.5), it was really differentiating itself by having “different”, “better looking and feeling” apps. Electron definitely made the Mac feel less unique. The criticism was pointed towards macOS app developers. Don't get so offended by a simple blog post. Your reasons are fine, but that simply isn't the case most of the time. Most people decide to go with Electron because of the plethora of mediocre JS devs who can churn out a lot of code that does something, and then you get slow, junk-like UX. In the minds of 2000s Apple fans, that is a big no.

                                            Have a nice day, and move on.

                                            1. 7

                                              I think the case where it makes sense to go Electron is not the point of Gruber’s rant.

                                              Correct.

                                              The point was that many, many developers today find it easier to write an Electron app

                                              Incorrect.

                                              Please just read the article.

                                              His point is what he says: it is bad news for the Mac platform that un-Mac-like apps far worse than those that were roundly rejected 15 years ago are now tolerated by today’s Mac users.

                                              It happens to be the case that Electron is the technology of choice for multiple prominent sub-par apps; that’s a simple statement of fact. (It also isn’t purely coincidental, which is why I agree with his characterisation of Electron as a scourge. If someone like @algernon who builds apps for Electron is bent on interpreting those statements as a judgement of their own personal merit, well… be my guest?) But Electron is not singled out: Marzipan gets a mention in the same vein. On top of that, Gruber also points out the new Mac App Store app, which uses neither. The particular technologies or their individual merits are not his point.

                                              His point is, again, that a Mac userbase which doesn’t care about consistency spells trouble for the Mac platform.

                                              1. 2

                                                Marzipan gets a mention in the same vein.

                                                Electron is the only one that gets called a scourge, and is singled out in the very beginning. It’s even in the title. It’s even the first sentence, which then continues: “because the Mac is the platform that attracts people who care”

                                                If that's not elitist smugness, I haven't seen any.

                                                1. 6

                                                  Once upon a time, Mac users were a ridiculed minority. In those days, Microsoft-powered PCs were better in just about every way. They were much faster, they had a more technically advanced OS (even crappy Win95 was far ahead of MacOS Classic), they had more applications, they had boatloads of games… just about every reason to pick one computer over another pointed in the direction of a Microsoft PC. You had to be a special kind of kook to want a Mac regardless. It was inferior to a PC in basically every dimension. The one reason to pick a Mac over the PC was the depth of consistency and care in the UI design of its software. Only users who cared about that enough to accept the mile-long list of tradeoffs went for the Mac.

                                                  1. 1

                                                    Elitism is often a good thing. It’s how we get from the mundane to the truly excellent.

                                                2. 3

                                                  The point was that many, many developers today find it easier to write an Electron app and use it on all platforms instead of putting time and effort into polished Cocoa apps.

                                                  My beef is not with the author wishing for apps that would look more native - I share the same wish. My beef is with him calling Electron “without question a scourge”. How about if I said macOS is without question a scourge, for it jails you in its walled garden? You'd be rightly upset.

                                                  There's a big difference between wishing apps would be more polished on a particular platform and calling a technology (and by extension, developers who chose to use it) a scourge. It reeks of privileged elitism, and a failure to understand why people go with Electron.

                                                  Most people decide to go with Electron because of the plethora of mediocre JS devs who can churn out a lot of code that does something

                                                  No. Most people decide to go with Electron because it provides a much better cross-platform environment than anything else. Please don't call a whole bunch of people “mediocre JS devs” unless you have solid data to back that up. Just because it is JS and “web stuff” doesn't mean the people who develop it are any less smart than native app developers. Can we stop this “only mediocre people write JS/PHP/whatever” bullshit?

                                                  1. 13

                                                    There are more bad developers writing webshit because there are more devs writing webshit period.

                                                    Native apps tend to outperform Electron apps and use less memory, because they do the same things without bringing in a browser and a language runtime.

                                                    Elitism is, in this case, warranted. About the only really performant Electron app I've seen is VSCode, because MS really does have sharp people working in a domain they've been leaders in for decades.

                                                    1. 7

                                                      There seems to be a shift towards less attention paid, and value given, to the experience of the user. This makes me very sad.

                                                      When people talk about why they use Electron, they always phrase it in terms of “developer productivity”, and that’s where I find the most elitist bullshit to be. Developers talk about using Electron so they didn’t have to learn a new platform, or so they only had to test it in one place, or it was faster. They talk about lower development costs (which they wildly overstate, in my experience).

                                                      But here are the questions I'd like us to start asking: what are the costs of the shit user experience? What are the costs of people having to learn new tools that don't behave quite like the others? When we save money and time on development, that money and time is saved once, but when we save time for our users, it's saved repeatedly.

                                                      Maybe calling Electron shit is elitist bullshit. But I’ll take that over having contempt for one’s users.

                                                      1. 3

                                                        Contempt is a strong word. Would all these people be users in the first place if the app didn't exist for their platform? Go ahead and write a native Cocoa app for OSX, but that sure feels like contempt for Windows or Linux users. “Buy a new machine to use my stuff” vs. “deal with menus in the wrong order”?

                                                        1. 1

                                                          I never said “buy a new machine to use my stuff.”

                                                          From extensive experience: for most small applications, I can develop them natively on Mac, Windows, and Linux* faster than someone can develop the same thing with similar quality using a cross-platform thing.

                                                          (*) “native” on Linux is less of a sticky thing than on Mac and Windows.

                                                          1. 2

                                                            Here’s a challenge for you: https://github.com/keyboardio/chrysalis-bundle-keyboardio (demo here).

                                                            Go do something like that natively for Mac and Windows. It’s a small app, some 3700 lines of JS code with comments. You do that, and I promise I’ll never write an Electron app ever again. You can make the world a better place!

                                                            1. 1

                                                              Thank you for your interest in my consulting services.

                                                              1. 3

                                                                Thought so.

                                                                FWIW, the app, like many Electron apps, was originally built in my unpaid free time. Complain about Electron apps once you've built the same stuff under the same conditions.

                                                    2. 9

                                                      How about if I said macOS is without question a scourge, for it jails you in its walled garden? You'd be rightly upset.

                                                      I wouldn't. I might point out that you absolutely can bypass their walled garden on macOS, but those objections for iOS are not only valid, but honestly concerning.

                                                      Electron is a scourge. iOS is a scourge. Facebook is a scourge. The feature creep within the browser is absolutely a scourge. There are loads of scourges. This is not an exhaustive list. I pray daily for a solar flare which delivers enough of an EMP that it utterly destroys the entire technology landscape, and gives us an opportunity to rebuild it from the invention of fire onwards, because we’ve fucked up, and our technology is bad.

                                                      And I say this because I want technology to be better. The Web is an awfully complicated way to render a UI. Our systems are overly dependent on a few corporations who don’t have our best interests at heart. Our computers are slower to do less work than they did two decades ago. Pretty much the only way anybody makes money in tech anymore is by abusing their users (see: Google, Facebook, etc). Or their employees (see: Uber). Or both (see: Amazon).

                                                      1. 1

                                                        In the EMP scenario, we'd be too busy trying to get essentials back up to care about doing it right.

                                                      2. 7

                                                        You’d be rightly upset.

                                                        No, I wouldn’t be rightly upset. I am not the technologies I use, and neither is that true for you.

                                                        Electron is the technology used in multiple highly prominent applications that are written with little or no regard to platform conventions. Their success is bad for the native platforms. Those are statements of fact. If you have good reasons to use Electron, then there is no need for you to relate those facts to yourself and take them as a statement of your merits as an individual.

                                                        1. 7

                                                          Agree. The notion that your identity is somehow linked with the tools you use is toxic. It is fine to like your tools, but once you get comfy with them you should be pushing beyond them to expand your taste.

                                                          1. 5

                                                            your identity is somehow linked with the tools you use

                                                            I used QBasic, Visual Basic 6, and later FreeBASIC. If my tools define me, I feel like such a shallow person with no depth or skill. Such a sinking feeling. I think I’m going to re-install SPARK Ada and buy Matlab to feel like a mathematician. Yeah, that will be better… ;)

                                                    3. 4

                                                      Thanks for sharing the blog post.

                                                      I think your case is quite different from, say, Slack. I have worked on major cross-platform apps at a company, and it's not a huge deal when you have a few people working on it whose collective knowledge covers those platforms. All apps used the same core libraries for the non-UI parts, and each app added native UI on top. A company with tens, or even hundreds, of well-paid developers should be able to do that, if they care at all about accessibility, performance (which is a different kind of accessibility issue), resource usage, and all those things.

                                                      1. 2

                                                        It is still a big deal, even for a larger company, because there's a huge difference between employing N developers to develop a single cross-platform application (with some of them specializing in one platform or the other) and employing a set of developers to create native applications, plus a set for the core libraries. There may be overlap between them, but chances are that someone who's good at developing for Windows or OSX would not be happy developing for Linux. So you end up employing more people, paying more, and getting diverging UIs, for what? Paying customers will use the Electron app just as well, so what's the point of going native and increasing costs?

                                                        Yeah, they should be able to do that, yes, it would improve the user experience, yes, it would be better in almost every possible way. Yet the benefits for the company are minuscule, most of the time. In the case of Slack, for example, or Twitter, a uniform experience across devices is much more important than native feel. It's easier to document, easier to troubleshoot, and easier for people who hop between devices: it's the same everywhere. That's quite a big benefit, but goes very much against making the apps feel native. And if you forgo native feel, but still develop a native application that looks and behaves the same on every platform (good luck with that, by the way), then the benefits of native boil down to being less resource hungry. In the vast majority of cases, that is simply not worth the cost of the development and maintenance burden.

                                                        In the age of mobile devices, I do not feel that apps looking “native” is any benefit at all, but that’s a different topic.

                                                        1. 5

I built them myself in the past, and I've read write-ups about what it takes for others. If you design the program right, most of the code is shared between the different platforms. The differences are mostly in the front end that calls the common code. A lot of that can be automated to a degree, too, once the design is laid out. For the main three platforms, it took a maximum of three people, two of whom only worked on UI here and there and were mostly focused on shared code. That isn't the minimum, either: it can be lower if you have 1-2 developers who are experts in more than one platform. In mobile, many people probably know both iOS and Android.

                                                          The cross-platform part will be a small part of the app’s overall cost in most cases. It will mostly be in design/UI, too. That’s worth investing in anyway, though. :)

                                                          1. 5

Our experiences clearly differ, then. I worked for companies that made native apps for the major platforms (mobile included), and each team was 10+ people at a minimum, with little code shared (the common code was behind an API, so there's that, but the apps themselves had virtually no code in common). Not to mention that the UIs differed a lot, because they were made to feel native. Different UIs, different designs, different bugs, different things to document and support. A whole lot of time was wasted on bridging the gaps.

                                                            If they made the apps feel less native, and have a common look across platforms, then indeed, it could have been done with fewer people. You’d have to fight the native widgets then, though. Or use a cross-platform widget library. Or write your own. And the writer of the article would then complain loudly, and proclaim that multi-platform apps are the worst that could happen to the Mac (paraphrasing), because they don’t follow the HIG, and developers nowadays just don’t care.

                                                            1. 3

                                                              each team was 10+ people at a minimum, with little code shared (the common code was behind an API, so there’s that, but the apps itself had virtually no code in common).

                                                              “If they made the apps feel less native, and have a common look across platforms, then indeed, it could have been done with fewer people. “

I said native look on multiple platforms while minimizing cost. Your example sounds like something about the company rather than an inherent property of cross-platform. You usually need at least one person per platform, especially UI and style experts, but most of the code can be reused. The UIs just call into it. They'll have some of their own code, too, for functions specific to that platform. Mostly portable. It just sounds like the company didn't want to do it that way, didn't know how, or maybe couldn't due to constraints from legacy decisions.

                                                              1. 1

                                                                My experience mirrors yours almost exactly.

                                                        2. 3

What about sciter? Companies doing stuff like anti-virus have been using it for a long time with way, way less resource use. Its licensing scheme looks like something other vendors should copy. Here's a comparison claiming a simple editor is 2MB in sciter vs 100+ in Electron, just because it brings in less baggage.

How many people using Electron could use the free, binary version of sciter? And how many companies using Electron could afford $310 per year? I mean, that's within reach of startups and micro-businesses, yeah? I'm asking because you said it's either use Electron or no cross-platform at all, because anything else is too costly/difficult. I'm making it easy by using a comparable offering rather than, say, Lazarus w/ Free Pascal or otherwise non-mainstream languages. :)

Note: I also have a feeling lots of people just don't know about this product.

                                                          1. 4

I actually evaluated sciter when I got frustrated with developing in JS and wanted to go native. For my use, it wasn't an option, because the app I write is free software, and therefore so must its dependencies be. For closed-source use, it's probably fine. Though you'd still have to write plenty of platform-specific code. AV companies already do that, but companies that start off with a web-based service and develop an application later do not have that platform-specific code already built. What they have is plenty of JavaScript and web stuff. Putting that into Electron is considerably easier, and you end up with a very similar UX, with little effort, because you're still targeting a browser.

                                                            Oh, and:

                                                            With Sciter, changing the front end of your application involves just altering the styles (CSS), and probably a couple of scripts that do animations.

                                                            Yeaah… no. That’s not how changing the UX works. It’s a tad more involved than that. Not sure I’d be willing to trust my UI on an offering that basically tells me that changing between, say, Metro and Material UI is a matter of CSS and animations. (Hint: it’s much more involved than that.)

Since we've already decided we're not going to follow any platform HIGs (and thus have the article author proclaim we're a scourge), what's the point of going native? Less resource use? Why is that worth it for the company? People use the app as it is, otherwise they wouldn't be writing an app. Writing native, and making it look and behave the same, has non-negligible cost but little to no benefit. Faster and lighter is in many cases not a goal worth pursuing, because customers buy the thing anyway. (I'd love it if that weren't the case, but in many cases, it is.)

                                                            1. 1

                                                              I appreciate the balanced review of sciter. That makes sense.

                                                        1. 2

                                                          The worst thing to happen to the Mac is Steve Jobs dying (okay, it’s also the worst thing to happen to Apple). He was the only one with enough clout and confidence to say “we aren’t shipping this until it’s fixed.”

                                                          1. 18

                                                            Unfortunately if you count unfinished things that they shipped on his watch, you find a different story: the cracking-case G4 cube, Mac OS X 10.0, the iWork rewrite that removed features from existing documents, Apple Maps, iAd (we’re not an advertising company), the iPhone 4, and MobileMe are the ones I can think of now.

                                                            I’m not arguing that quality is better now (I think it isn’t), but I will argue that Steve Jobs is not the patron saint of problem-free technology products. Apple has, like all other technology companies, shipped things that they thought (or maybe hoped) were good enough, under all of their CEOs.

                                                            1. 1

                                                              Didn’t they charge for the updates to fix their broken software on top of it while Windows did them for free? I could be misremembering but I think Mac folks told me that.

                                                              1. 4

All I can recall is charging for a patch to enable 802.11n wireless (as opposed to draft-n), but the explanation was that Sarbanes-Oxley prohibited delivering a "new" product because they had already booked the revenue for it; the law was later clarified, and software updates are OK now.

                                                            2. 2

It's not so much about Steve Jobs, but about the people who were behind the products and the development process, who of course were brought in and held together by Steve Jobs. I would say the departure of Bob Mansfield had one of the major impacts on Apple's final products.

                                                            1. 5

                                                              As someone who feels pretty comfortable with pointers, I found this article to be:

1. longer than 5 minutes to go through,
2. fairly confusing and unfocused, and
3. pitched at too many abstraction levels at once: there's assembly, calling conventions, syntax, and types.

I may be biased, as I'm in the middle of instructing a beginner C course right now, and this would definitely go over the students' heads. I find it's much more helpful to offer different mental models and visualizations of the underlying abstract machine. One explanation will not satisfy and click with everybody.

I do like beginning with something seemingly simple. "What is a variable?" is a great question to start peeling back the layers in C. Especially defining a pointer as a variable! A common misconception I see is conflating pointers and objects on the heap. A variable is an object on the stack. A variable has a name so you can use it. Some objects don't have a name, so they're a little harder to use. A pointer can be a variable. A pointer can point anywhere, so it can help you use that object without a name! (Or any other object.) Alas, as I mentioned earlier, this would also whoosh over some heads.
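To make that concrete, a tiny sketch (the names are mine, not from the article):

    #include <stdlib.h>

    int main(void)
    {
        int x = 5;                   /* a variable: an object with a name */
        int *p = malloc(sizeof *p); /* the malloc'd object has no name... */
        if (p != NULL)
        {
            *p = 10;                 /* ...so the only way to use it is through p */
            free(p);
        }
        p = &x;  /* a pointer is itself a variable, and can point at named objects too */
        *p = 6;  /* same effect as x = 6 */
        return 0;
    }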

                                                              1. 3

But not all variables live on the stack: global and static variables (even those defined in functions) live in the data segment [1]. Probably a better high-level definition: a variable is a named location to store data, though said location can be ephemeral (in the case of non-static variables defined in a function).

                                                                [1] Or bss segment.
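A quick sketch of where things typically land (segments are an implementation detail; the standard itself only talks about storage duration):

    int g = 42;            /* initialized global: data segment */
    int h;                 /* zero-initialized global: bss segment */

    void f(void)
    {
        static int calls;  /* static storage duration: data/bss, survives calls */
        int local = g;     /* automatic storage duration: typically the stack */
        calls += local;
    }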

                                                                1. 1

                                                                  That is indeed a better definition! I try to avoid the topic of global variables with beginners to avoid nasty practices, but they should not get a faulty definition.

                                                                  1. 1

                                                                    And note that some variables may only ever exist in registers.

                                                                1. 52

                                                                  This is the collateral damage of making browsers so complex. If megacorps can’t justify the costs required for maintenance and improvement, we essentially cede the platform to those with the deepest pockets.

                                                                  1. 16

                                                                    Yeah, I think you are making a super important point so I’ll try to reiterate it:

                                                                    • Open standards succeed when many people/organizations can easily implement the standard due to its simplicity and obviousness.
                                                                    • Open standards fail when too few people/organizations can make a complete implementation; they “collapse” under their own weight.

This is not backed up by data; rather, it's my opinion, or a synthesis from bits of anecdotal evidence. In this case, the laundry list of features now required to make a web browser is pretty intractable. As others have said, it's reaching the level of complexity of an OS.

                                                                    1. 8

                                                                      This is the collateral damage of making browsers so complex.

                                                                      In this case, the laundry list of features now required to make a web browser is pretty intractable. As others have said, it’s reaching the level of complexity of an OS.

                                                                      Yup and yup.

I understand the urgency around the danger of having one megacorp controlling a set of standards, but this is a monstrous set of standards that, imo, needs to die. Something simpler and lighter must replace it, and I'm not much into putting effort, by "doing my part," into saving the current one. I refuse to believe that something simpler isn't possible.

                                                                      1. 8

                                                                        I’m starting to share this position. The web is dead, long live the Internet.

                                                                        1. 3

                                                                          There’s always gopher. It’s not that hard to write a gopher client.

                                                                          1. 2

Another solution might be to more clearly define the essential parts of the standard and the extra parts, with sane graceful degradation. The goal being to encourage web developers and companies to be less apt to require every flashy new (extra) feature, because not all browsers would choose to implement all the extras.

                                                                            1. 1

                                                                              This wouldn’t stop Google from implementing Google features that require everyone to use Chrome, and then everyone would just use Chrome. I don’t think this would be any different than the status quo.

                                                                            2. 2

                                                                              Something simpler and lighter must replace it, and I’m not much into putting effort, by “doing my part,” to saving the current one.

                                                                              Just be careful that that “simpler and lighter” thing isn’t something like AMP that causes even more lock-in than we have now.

                                                                            3. 1

                                                                              While I agree with your analysis, we do have several FOSS operating systems. Admittedly, much Linux development is funded by corporations. Nonetheless, there are multiple existence proofs that free software communities can deliver software with complexity on the order of operating systems. I’m not aware of any browsers produced that way, though. I suppose Firefox would be the nearest thing.

                                                                          1. 10

One major problem with C (and I like C) is that undefined behavior can come from just about any direction. I just recently learned (on a mailing list not normally associated with linking) that linking (you know, with the ld program) can invoke undefined behavior, even if the compilation phase did not invoke any undefined behavior. Even the C standard itself is ambiguous here, with different people interpreting the text differently (with respect to the linking issue).

                                                                            1. 9

                                                                              This is a good point. I’m new to C and while I’ve picked up the basics of the language, I feel like actually learning to write good C is impossible. For example, minefields like the str* functions. I know there are safe alternatives to those, but what other dangerous stuff is left over from that era that I might accidentally use? The advice in this article is great - avoid undefined behavior at all cost - but I have no idea how to actually follow through with it, especially when I had no idea that e.g. int promotion was even a thing. I feel like I have to be an absolute expert in obscure semantics of the language in order to write even vaguely safe C because undefined behavior can pop out of virtually any situation.

                                                                              Has anyone found a good solution to this problem?
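For what it's worth, the promotion trap usually looks something like this (a sketch; assumes the common 32-bit int):

    #include <stdint.h>

    uint32_t square(uint16_t x)
    {
        /* x is promoted to (signed) int before the multiply, so for
           x > 46340 the product overflows int: undefined behavior,
           even though every type in sight looks unsigned.
           A safe version would be: return (uint32_t)x * x; */
        return x * x;
    }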

                                                                              1. 7

                                                                                The definitive list is Annex J of the C99/C11 standard, but fair warning—the language is tortuous. For instance, the very first bullet point under Annex J, which lists all the undefined behaviors:

                                                                                | A ‘‘shall’’ or ‘‘shall not’’ requirement that appears outside of a constraint is violated (clause 4).

                                                                                where “clause 4” seems to actually reference section 4, paragraph 2:

                                                                                | If a ‘‘shall’’ or ‘‘shall not’’ requirement that appears outside of a constraint is violated, the behavior is undefined. Undefined behavior is otherwise indicated in this International Standard by the words ‘‘undefined behavior’’ or by the omission of any explicit definition of behavior. There is no difference in emphasis among these three; they all describe ‘‘behavior that is undefined’’.

                                                                                But even though English is my first (and tragically, only) language, I’m still not sure how to interpret “a ‘shall’ or ‘shall not’ requirement that appears outside of a constraint is violated.” What is that even saying? And yes, any situation not covered in the Standard becomes undefined behavior, pretty much by definition.

                                                                                And a word of warning—Annex J takes 13 pages to list undefined behavior.

Unfortunately, I don't know of any real advice. 30 years of C programming and I'm still surprised at what is and isn't technically allowed.

                                                                                1. 5

At some point I'd suggest just reading the C standard. For some reason this never seems to occur to many people. The parts describing the language aren't that many pages, and then you'd know about integer conversions and promotions, etc. I mean, I can understand not immediately memorizing every nuance, but I think a lot of "nobody ever told me that!" could be avoided by simply reading the original source.

                                                                                  I guess the other approach is just to pay more attention to what you’re doing. You write a line of code. You think it does something. What is the basis for that belief?

                                                                                  1. 4

You are quite possibly further along the road to understanding UB in C/C++ than something like 97% of people who code in those languages for a living. I personally started to realize there's a problem only after 10 years of writing C++ (5 of them professionally). (Fortunately for me, I moved to Go soon afterwards.) Sadly, there are tons of people who don't have the slightest idea about UB, and casually dismiss, ridicule, or even aggressively reject any explanations (maybe strengthened by a subconscious fear of having their life's work undermined). Hmm; I just now thought it may in some ways resemble the situation with small-particle pollution (or global warming), in that it has the same problem of visibility.

                                                                                    1. 2

                                                                                      I second tedu’s suggestion to read the standard.

                                                                                      As for the unsafe functions, I would highly recommend you get in the habit of using OpenBSD man pages. They do a decent job of pointing out caveats and antipatterns while often referring to a better solution. mdoc.su/o/ is a handy shortcut to get there if you’re not sitting on an OpenBSD shell. For example: mdoc.su/o/malloc

Honestly, C is not that big of a language, and the minefield you mention as an example is but a handful of functions.

                                                                                      If books are your thing, I’d also recommend TAoSSA ch6 which was published as a freebie. URLs change but remember the filename and you’ll find it on various hosts: dowd_ch06.pdf.

                                                                                      You might learn a fact or three from Expert C Programming, although if you’ve been reading the standard, there’s not going to be that much to pick up from it.

                                                                                  1. 4

Some of the things in the blog post, like := or ?=, don't appear in the POSIX spec for make. Are they GNU-isms?

                                                                                    1. 7

Yes, along with $(shell ...). The author should have mentioned he was using GNU Make.

                                                                                      1. 1

:= is almost mandatory for makefiles. If you have a shell expansion, it will get run every time the variable is expanded unless you use :=. Many of the extensions in GNU make are simply unreproducible in POSIX make.
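For example (GNU make; a sketch):

    # Recursively expanded: the shell command re-runs on every expansion of $(SLOW).
    SLOW = $(shell find src -name '*.c')

    # Simply expanded: the shell command runs once, when the makefile is read.
    FAST := $(shell find src -name '*.c')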

                                                                                      1. 8

                                                                                        I’m curious, how many of you are using Mutt as your daily email client at work? How do you cope with calendar invites, frequent HTML emails, …?

                                                                                        1. 3

                                                                                          I use mutt for personal email, so calendar invites is not an issue for me. I also have mutt use lynx to handle the case when the sender only sent HTML (usually, if there’s an HTML section, there’s also a plain text section). For work, I use whatever they give me—I like keeping a separation between personal and work stuff.

                                                                                          1. 1

                                                                                            Do you mean invites aren’t an issue because you don’t use them or because you solved this? If so, how?

I read in another comment that it's just HTML, and to be fair, now that I come to think of it, it's been a long time since I had to care about mutt and calendars, so maybe it was just a dumb link to click through in the terminal browser.

                                                                                            1. 2

                                                                                              I don’t use invites or calendar things via personal email, and if anyone has sent me one, I haven’t noticed.

                                                                                              I did start using mutt at a previous job where I had to chew through a ton of daily mail (basically, all email sent to root on all our various servers were eventually funneled to me) and I found mutt to be much faster than Thunderbird (which should indicate how long ago this was). It was using mutt for a few weeks that prompted me to switch away from elm (which really dates me).

                                                                                          2. 3

                                                                                            IIRC, when I used mutt regularly, I used to have it pipe html emails straight into elinks to render them inside mutt. Didn’t need calendaring at the time.

                                                                                            1. 2

I gave up my resistance to modern email quite some time ago; personally speaking, it's simply too much hassle dealing with calendaring and rich media content in email to still use a console-based MUA. That being said, I really miss the simplicity and light weight of Mutt.

                                                                                              Mutt was my go-to client for many, many years, and I feel tremendous nostalgia when I am reminded that it’s still actively maintained and indeed has a user base. Bravo. :-)

                                                                                              1. 2

How many emails do you handle a day? I do about 200; I need to read or skim them all, but I only reply to about a tenth of them… I can't imagine keeping up with that in any of the GUI clients I've used. With mutt, it feels like nothing.

                                                                                                1. 1

                                                                                                  I’m trying to do more and more with mutt, gradually using the GUI client less. Still haven’t configured a convenient way to view html or attached images but the message editing is nice. I hook it up to vim:

                                                                                                  set editor='vim + -c "set ft=mail" -c "set tw=72" -c "set wrap" -c "set spell spelllang=en"'
                                                                                                  

                                                                                                  This mostly formats things correctly, and allows me to touch paragraphs up by hand or with the “gq” command. I can also easily add mail headers such as In-Reply-To if needed. In some ways my graphical client is starting to feel like the constrained one.

                                                                                                2. 2

                                                                                                  I’ve been using Mutt for the past 15+ years for personal email and 5+ years for work - even with Exchange IMAP (special flavour) at one point.

                                                                                                  I mostly ignore HTML email - either there’s a text/plain part or HTML->text conversion is good enough - there are occasional issues with superfluous whitespace and it can look a bit ugly when plenty of in-line URLs are being used but these are not that common.

                                                                                                  For calendaring I still use web - we’re on G Suite - but am hoping to move to Calcurse at some point (still not sure how to accept invites, though). Bear in mind, calendar != email, and Mutt is an email client - once you accept it, you’ll be much happier :^)

                                                                                                  1. 1

I used it from 2015 to mid-2017 but ended up moving back to Thunderbird and even web clients. It wasn't worth the effort. If I didn't have to handle all my configs to get a decent setup (IMAP, GPG, multi-account, addresses), then I'd consider using it again. I love the idea of not having to leave my term.

                                                                                                    1. 1

                                                                                                      I use mutt daily and have my mailcap set to render html email in lynx/w3m/elinks. It’s sufficient to see if I then need to switch to a GUI mail client. For GUI, I have previously used Thunderbird with DAVmail and currently just use the Outlook client.
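For anyone curious, the setup is roughly this (a sketch; substitute your text browser of choice):

    # ~/.mailcap
    text/html; lynx -dump %s; copiousoutput

    # ~/.muttrc
    auto_view text/html
    alternative_order text/plain text/html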

                                                                                                      1. 1

                                                                                                        I use (neo)mutt as my daily personal email. HTML isn’t an issue, but forwarding attachments and dealing with calendar invites is embarrassing.

Usually I use the bounce feature into my work email (Protonmail), which causes SPF-related spam flags to get set, but generally gets the job done.

                                                                                                        I self-host my email so the pain threshold is quite high for me to start configuring RoundCube (or whatever the kids today use) or even IMAPS.

                                                                                                        PS. not using Google is a bit embarrassing as well, as the email and Nextcloud calendar are so disconnected, but it works better than mutt ;)

                                                                                                      1. 1

                                                                                                        I like the idea, but I have two concerns:

                                                                                                        • a missing environment variable causes a compilation error. Why? Why not just set the variable to NULL when it doesn’t exist? This feels like a misstep to me.

                                                                                                        • I don’t like the conditional statements. I find it very hard to follow the logic.

                                                                                                        The rest I have no complaints about. Also, were I to use this at work, we have two configuration file formats not covered by this tool—XML and Lua (yes, there are a few tools where we use Lua for configuration). Lua output seems like it would be easy to do; XML, I’m not so sure.

                                                                                                        1. 1

                                                                                                          XML is a challenge mostly because there is no single way to know if something should be an attribute or a nested tag. I’m still thinking about the best way to do that one.

                                                                                                          Regarding the environment variable missing causing an error I’m still trying to decide if that is the right call or not. On the one hand it’s nice to know that if you accidentally forget to set a variable you’ll get an error. On the other hand developing a configuration and getting compile errors all the time could get annoying.

                                                                                                          I’m curious about why you think the conditional statements are hard to follow. Sometimes when you’ve been working on something for a while you get tunnel vision so they don’t seem difficult to me, but I appreciate the feedback.

                                                                                                          1. 1

It's very indirect. A traditional conditional:

                                                                                                            if afield == 'Shawn' then
                                                                                                              afield = 'Sean'
                                                                                                            end
                                                                                                            

                                                                                                            Given that, this:

                                                                                                            let what = 'afield';
                                                                                                            select what, "is the default always required?", {
                                                                                                              Shawn = 'Sean',
                                                                                                              // how do I keep the original value not get the default?
                                                                                                            };
                                                                                                            

                                                                                                            Is the answer?

                                                                                                            let what = 'afield';
                                                                                                            select what,what, {
                                                                                                              Shawn = 'Sean',
                                                                                                            };
                                                                                                            

                                                                                                            Is this the same?

                                                                                                            select 'afield' , afield , {
                                                                                                              Shawn = 'Sean';
                                                                                                            };
                                                                                                            

                                                                                                            Is that the same as my conditional? And why do I need to give the field name instead of the field? (Oh, replace single quotes with double quotes if that’s the syntax—I’m used to Lua). And it’s weird to call it a conditional when all you can do is check for equality.

                                                                                                            1. 1

Its closest analog is probably a switch statement in traditional languages, which is also a conditional with more power than just an if statement. I tend to like languages not to have too many ways to do the same thing, so I didn't include an if statement.

                                                                                                              1. 1

                                                                                                                In C, switch only tests for equality. if can do more than just that. In Lisp, cond is similar to switch, but can do more than just equality (less than, greater or equal, etc). But even if you restrict your conditional to just equality, I still find the current syntax clumsy and error-prone.
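That is, something like this (low(), high(), and in_range() are stand-ins):

    void low(void); void high(void); void in_range(void);  /* stand-ins */

    void classify(int n)
    {
        switch (n) {            /* equality against integer constants only */
        case 3:  low();  break;
        case 10: high(); break;
        }

        if (n >= 3 && n <= 10)  /* if can test arbitrary predicates */
            in_range();
    }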

                                                                                                            2. 1

                                                                                                              For the environment variables, how about a warning if it’s missing?

                                                                                                              1. 1

How would you feel about a flag that makes the compiler warn if it's missing, with the default behavior being to error? Or vice versa. I want to avoid the case where you are deploying a configuration to prod and didn't realize that an env variable wasn't set, thus causing an outage. Warnings often don't prevent that, especially in an automated deployment environment.

                                                                                                                1. 1

As long as I get a way to choose what happens, that's fine.

                                                                                                            3. 1

                                                                                                              FYI I just implemented a flag in HEAD to disable the compilation error and issue a warning since that seems reasonable to me.

                                                                                                              I’m still thinking about the select and trying to decide if it should be improved.

                                                                                                            1. 1

I don't really see what the article is trying to say. Is it saying "all code should be readable by everyone"? Because I think to understand something you should really try to understand the framework it's within and the design philosophy behind the program and the language it's programmed in. Why does all code need to be understood by everyone? I find it unlikely that someone reading code doesn't have the ability to learn the bare minimum of syntax to understand a given segment. This is why I like LiterateProgramming; a language I use called Retro introduces markdown-style code blocks in source just so you can separate documentation from code and have both share the same source file.

But maybe I'm misunderstanding. Is it saying "all code should be domain specific"? I feel that's the philosophy of Forth. Forth is both a programming and a metaprogramming language, without any weird syntax for the metaprogramming features. I feel like more people should learn old-style Forth simply for the way it helps you think about problems. An example Chuck Moore (inventor of Forth) gives in his book Programming a Problem-Oriented Language (1970) for some arbitrary database system:

                                                                                                              List twice, by seniority, all employees holding job 17 in dept 3:

    17 JOB EQUAL 3 DEPT EQUAL SENIORITY SORT LIST LIST

It's not SQL, but you could provide some simple words to better structure your queries. Of course every problem ends up being solved in what's equivalent to a DSL; you're in a given problem domain, after all. Personally, Forth makes those solutions easier to express, without introducing weird new syntaxes (unless the programmer wants them, I suppose).

                                                                                                              Either way (or a third way, if I’m doubly misunderstanding) I disagree with the quote from Mr. Martin, even though I get his point. A programmer’s job is not to design a language, it’s to solve a problem. If the language already provides the tools to solve that problem sufficiently, then so be it. In Python quite a lot can be done with the standard library, after all (is filter(lambda x: 0<x<255, someList) a DSL?). A “domain specific language” (ignoring the difference in syntaxes) is just what you end up with after solving a big and complicated enough problem.

                                                                                                              1. 2

                                                                                                                Either way (or a third way, if I’m doubly misunderstanding) I disagree with the quote from Mr. Martin, even though I get his point. A programmer’s job is not to design a language, it’s to solve a problem. If the language already provides the tools to solve that problem sufficiently, then so be it. In Python quite a lot can be done with the standard library, after all (is filter(lambda x: 0<x<255, someList) a DSL?).

You say that a programmer's job is not to design a language, but to solve a problem. But the entire point of the quote from Martin is that those are the same thing. You build functions in logical units (called 'words' in Forth) out of the functions/words that already exist as part of the language, and then you use that new language (consisting of the original language plus the new words you have written on top of it) to compose the program.

                                                                                                                Indeed, you say this exact thing in your last sentence:

                                                                                                                A “domain specific language” (ignoring the difference in syntaxes) is just what you end up with after solving a big and complicated enough problem.

                                                                                                                1. 1

                                                                                                                  In that case, I suppose it’s just his wording that leaves me unsatisfied. If we mean the same thing, then, logically, I must agree with the quote.

                                                                                                                  1. 2

If I remember correctly, that quote comes from a chapter about DSLs. It's hard to tell just from the quote, but he's really saying the same thing as you: that you build up a DSL just by creating functions. You might not have a "domain specific syntax", but you definitely get a vocabulary.

                                                                                                                2. 1

I meant that the code, at least at a high level of abstraction, should be readable by domain experts who are not devs (not necessarily all code being readable by everyone).

Thank you for the references. I didn't read everything, but it seems that Literate Programming is exactly what I had in mind.

                                                                                                                  1. 1

At work, the program that handles the business logic is something on the order of 40,000 lines of code (mostly C, with a smattering of C++ where we couldn't avoid it), but the entire logic of it is expressed as a simple chart on my whiteboard. Where is the complexity? Dealing with a live stream of phone calls from the Monopolistic Phone Company, doing data queries to two different databases, merging the results, and replying back to the Monopolistic Phone Company within a hard deadline [1]. There's quite a bit of code to deal with potential issues with our setup.

                                                                                                                    The chart on the whiteboard, however, states quite concisely what we return when. Looking at the code is the last thing you want to do (it’s the last thing I want to do, but that’s another story … )

                                                                                                                    [1] Our product does a name lookup based upon the phone number of the caller; it also looks up the reputation of the caller (potential spammer, potential spoofer, normal caller, etc) and returns this information to the phone number being called.

                                                                                                                1. -4

                                                                                                                  The upgrade from TCP to QUIC

                                                                                                                  For a guy that says he knows protocols he certainly doesn’t know the OSI layers

                                                                                                                  1. 10

                                                                                                                    Are you sure about this? He specifically talks about moving off TCP to a layer 4+5 solution, UDP headers with QUIC inside.

                                                                                                                    1. 0

                                                                                                                      I’m very sure. He keeps conflating TCP with QUIC which are not at the same layers

                                                                                                                      1. 13

But they are. Ask yourself: what does a connection mean in a networking context? Previously it was almost always a TCP connection, as that's what TCP provides. Now it can be a non-TCP QUIC connection that does its own connection-handling logic, multiplexing, in-order delivery, etc. That's the whole point of the QUIC-as-the-transport-layer thing.

People suggested splitting QUIC-the-transport-layer from HTTP/2, and this is essentially what happened. It's a transport-layer thing with built-in TLS that can handle arbitrary application protocols on top of it, not just HTTP.

                                                                                                                        1. 5

                                                                                                                          They are at the same layer. I suppose one could imagine a QUIC connection as having two transport protocols (UDP and QUIC) but I just think of it as one most of the time. The reason UDP is there is just because it wouldn’t work over the internet any other way, but you could run QUIC on top of IP if you wanted.

                                                                                                                          1. 1

                                                                                                                            You certainly could, but it would never work on the real internet because of middleboxes that will only pass TCP and UDP. This is also what is stifling SCTP adoption.

                                                                                                                            The transport protocol is UDP not QUIC, so it would be good to end the ambiguity when discussing QUIC.

                                                                                                                            1. 2

                                                                                                                              There’s no reason why it couldn’t work one day even though it doesn’t work now. QUIC is a transport protocol. It provides all the features of a transport protocol. What do you call SCTP-over-UDP then? Just UDP?

                                                                                                                              1. 1

SCTP isn't over UDP. I'm not aware of any implementation in the wild that attempts this. SCTP has its own implementation in OS kernels (Linux, FreeBSD) beside TCP and UDP. It's not "over UDP". But middlebox firewalls / shaping devices tend to drop any traffic that is not ICMP, TCP, UDP, or IPsec, which is why SCTP has never gained traction even though it is a superior protocol for many situations, especially mobile, where seamless connection roaming between cellular and WiFi would be very much welcomed. Instead we have to live with things like MPTCP on iOS devices, which in practice is only used by Apple services like Siri, because very few servers on the internet have MPTCP support in their kernels.

                                                                                                                                edit: I’m not an expert on SCTP, but I’ve certainly never heard of it being used over UDP. Would be curious to learn more if you’ve got a source.

                                                                                                                                edit2: correct acronym for Multipath TCP is MPTCP

                                                                                                                                1. 3

                                                                                                                                  RFC 6951.

                                                                                                                                  1. 0

                                                                                                                                    Interesting. Is anyone actually using this in the wild or is it just a dead RFC?

                                                                                                                                    1. 3

                                                                                                                                      It’s implemented by the FreeBSD SCTP stack.

                                                                                                                                      1. 0

                                                                                                                                        Yeah, but is anyone actually using it? :) I know dteske was disappointed at all of the missing/broken dtrace hooks for SCTP in FreeBSD

                                                                                                                                        1. 8

                                                                                                                                          I don’t think that was the original argument. You claimed QUIC is not a transport protocol because it sits on top of UDP, but that’s just a consequence of how the internet works. I showed you how SCTP tried to work around the problems around NATs by doing exactly the same: transmitting packets over UDP.

                                                                                                                                          1. 4

                                                                                                                                            Yes. WebRTC uses SCTP over UDP for its data streams. Google Hangouts, Facebook chat, and Discord all use WebRTC. So a non-trivial portion of internet traffic actually uses it. Further, in this usage it’s implemented with a user-mode library, just like QUIC currently is.

                                                                                                                                            1. 1

                                                                                                                                              Excellent, thanks for this info!

                                                                                                                                    2. 1

                                                                                                                                      And had Google said “your web site ranking will drop if we can’t reach your site via SCTP” then you can bet all those middle boxes would be patched, updated or replaced immediately!

                                                                                                                                      1. 4

                                                                                                                                        That only fixes web sites that care about their Google ranking. It doesn’t fix the middle boxes that sit in front of web browsers on corporate intranets and public wifi hotspots, because there’s no website to penalize. It also doesn’t do anything about the deep web sites that aren’t crawled by Google anyway, because you have to log into them.

                                                                                                                                        I strongly suspect that most of the middle boxes in question are being deployed on those things.

                                                                                                                                        1. 4

                                                                                                                                          Those middle boxes are affecting the clients, not the servers. Nobody’s going to upgrade their corporate SSL proxy for QUIC if the fallback to HTTP/1.1 is still working fine.

                                                                                                                          1. 4

                                                                                                                            I’m working on a text based gopher client in Lua, just because. Parsing the index file is trivial; displaying and navigating said file is proving to be a nice challenge.

                                                                                                                            1. 5

                                                                                                                              In my experience, there are four types of errors [1]. To summarize, using connect() as the example:

                                                                                                                              • it’s a bug—EBADF should not be happening; once the bug is fixed, it should not happen again. So EFAULT, ENOTSOCK, and EISCONN all fall under this category.

                                                                                                                              • It’s fixable outside the scope of the program—EACCES, ENETUNREACH and EAGAIN are examples here. Report it and exit; not much else to do.

                                                                                                                              • Resource exhaustion, things are probably going bad quickly—EAGAIN might also be this category. ENOMEM definitely is, but that’s not a possible error for connect().

                                                                                                                              • Expected and should be handled by the program—ETIMEDOUT, ECONNREFUSED, EINTR (if using signal()), maybe ENETUNREACH could be considered here as well. Things that can normally happen, and there is some overlap with the second category here.

                                                                                                                              It’s now a bit simpler—just check for the expected conditions and handle them; everything else should be logged (and possibly terminate the program, depending upon the condition).
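
                                                                                                                              Something like this minimal sketch, where try_connect() is a hypothetical wrapper and the bucket assignments follow the list above:

                                                                                                                                  #include <errno.h>
                                                                                                                                  #include <stdio.h>
                                                                                                                                  #include <stdlib.h>
                                                                                                                                  #include <sys/socket.h>

                                                                                                                                  /* Handle the "expected" bucket; treat everything else as a bug
                                                                                                                                     or an environment failure: log it and exit. */
                                                                                                                                  int try_connect(int fd, const struct sockaddr *sa, socklen_t len)
                                                                                                                                  {
                                                                                                                                      if (connect(fd, sa, len) == 0)
                                                                                                                                          return 0;

                                                                                                                                      switch (errno) {
                                                                                                                                      case EINTR:          /* interrupted: caller may retry */
                                                                                                                                      case ETIMEDOUT:      /* expected on a flaky network */
                                                                                                                                      case ECONNREFUSED:   /* expected: nothing listening */
                                                                                                                                          return -1;       /* report upward and handle */
                                                                                                                                      default:             /* bug (EBADF) or environment (EACCES) */
                                                                                                                                          perror("connect");
                                                                                                                                          exit(EXIT_FAILURE);
                                                                                                                                      }
                                                                                                                                  }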

                                                                                                                              [1] On that page I list three ways to handle errors. Since then, my thinking has changed somewhat on that but I’ve yet to write it up.

                                                                                                                              1. 2

                                                                                                                                I like the categorization. It makes it easier to think about the errors.

                                                                                                                                One thing that comes to mind is: can we deal with some of those categories automatically? For example, I’ve never seen ENOMEM handled in a reasonable way. While in theory it looks like it can be handled, things like memory overcommitment and the OOM killer make it futile. Maybe we gave up any chance of handling OOM errors back in the 1960s, when we replaced static invocation records with the call stack. Anyway, maybe returning ENOMEM makes no sense at all. Instead the OOM killer should just kill the process. But I’ve never done embedded programming, so who am I to tell?
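
                                                                                                                                For what it’s worth, the usual C idiom is to not even try: a minimal sketch of a hypothetical xmalloc() wrapper that treats allocation failure as fatal, in line with the “just let it die” view:

                                                                                                                                    #include <stdio.h>
                                                                                                                                    #include <stdlib.h>

                                                                                                                                    /* Treat allocation failure as unrecoverable. With overcommit,
                                                                                                                                       malloc() may never return NULL anyway; the OOM killer acts first. */
                                                                                                                                    static void *xmalloc(size_t n)
                                                                                                                                    {
                                                                                                                                        void *p = malloc(n);
                                                                                                                                        if (p == NULL) {
                                                                                                                                            fprintf(stderr, "out of memory (%zu bytes)\n", n);
                                                                                                                                            abort();
                                                                                                                                        }
                                                                                                                                        return p;
                                                                                                                                    }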

                                                                                                                              1. 6

                                                                                                                                While it’s true that there are better languages for guaranteeing memory safety, I would argue it’s not a huge problem in terms of cost compared to other problems of the internet. I think the larger issue is the energy cost of the internet. In fact, most code running the web is likely NOT C/C++ and instead IS written in memory-safe languages.

                                                                                                                                The risk of climate change is so huge, and the costs so high (from 10 trillion to human extinction), that security pales in comparison to the cost of global warming. I simply cannot justify sacrificing performance for security because of this. If using C/C++ can reduce your energy and server use, then it’s a superior choice over safety.

                                                                                                                                Rust is interesting here because it doesn’t sacrifice speed for safety. My biggest concern with it is that it’s just not as pleasant to use as C++. Maybe it will get better generic programming in the future. But to me, energy use is the elephant in the room.

                                                                                                                                1. 4

                                                                                                                                  The use of slower languages also favors “cloud vendors” (people spend more on compute nodes if their code is slow). Those same vendors often promote slower languages. Conflicting priorities for sure.

                                                                                                                                  1. 3

                                                                                                                                    Would be interesting if someone actually tried to measure the impact of scripting language usage on the power consumption of servers around the world in general.

                                                                                                                                    I guess it’s not that big. It’s mostly the small sites that are written in PHP/Python/Ruby/etc. The stuff serving most of the world’s traffic is very efficient. Netflix’s CDN uses in-kernel TLS crypto. Google Search is, I think, C++. Twitter famously migrated from Ruby to Scala and other JVM stuff. And so on.

                                                                                                                                    1. 6

                                                                                                                                      The problem is that most servers are on 24/7 and mostly idle. When people use slower languages, they tend to overprovision servers to handle load spikes. So instead of thinking about “efficiency per request”, think about “machines per load”. If a faster language lets you cut the number of machines you use by 1/10th, that’s basically a 1/10th reduction in power. And it’s not ONLY a reduction in power but a huge reduction in physical waste.

                                                                                                                                      I think I read somewhere that 90% of machines on the internet are sitting idle.

                                                                                                                                      That’s why I love the idea of unikernels, since they boot in milliseconds. If you can have a machine literally turn on and serve a request when a packet comes in, versus having it running idle, it could be a huge win for power. You simply can’t do that with a VM-based language, which typically has a cost to start the VM and requires a warmup to be fast.

                                                                                                                                        1. 1

                                                                                                                                          Nice! Didn’t know the name of that, thanks.

                                                                                                                                      1. 2

                                                                                                                                        Stuff like that exists. Hard to Google on mobile because of lots of junk results. I did find a result that at a glance looks like the kind of stuff I was thinking of. How’s this?

                                                                                                                                      2. 3

                                                                                                                                        “In fact likely most code running the web is NOT c/c++ and instead IS using memory safe languages.”

                                                                                                                                        Good point. The new ones are more efficient. That will help address this problem. Hardware people keep driving watts down, too. People really concerned should probably also use energy-optimized CPUs, like the ones ARM makes for systems with embedded or laptop peripherals, since they’re low power. Such a rig won’t have a great performance-per-dollar ratio, though.

                                                                                                                                        1. 2

                                                                                                                                          The ‘energy cost’ to create and maintain a working programmer is pretty substantial, too - to what degree does that dictate ‘use the language that lets you get the most out of that expenditure’?

                                                                                                                                          1. 2

                                                                                                                                            It’s a false dichotomy. C++ is a high-level language, and modern C++ is very productive. You can write code that almost looks like Python but with huge performance gains over Python.

                                                                                                                                            Secondly, energy is the main cost of a data center, so let’s tackle that first. It hasn’t been true for a long time that programmers cost more than the cost of running their software.

                                                                                                                                            Third, we can produce less software in general. Productivity is the greatest source of waste anyway.

                                                                                                                                            1. 9

                                                                                                                                              Third, we can produce less software in general.

                                                                                                                                              Heh. There would be a lot fewer bugs if we fixed the software we have instead of churning out new bugs. But good luck getting developers to admit that.

                                                                                                                                              1. 1

                                                                                                                                                My cookie file has the following quote:

                                                                                                                                                Every program has at least one bug and can be shortened by at least one instruction—from which, by induction, one can deduce that every program can be reduced to one instruction which doesn’t work.

                                                                                                                                                And before you dismiss that as silly, there was a one-instruction program that had a bug.

                                                                                                                                        1. 9

                                                                                                                                          I like how everyone dunks on C but like, we have four or five C alternatives right now that are viable, and numerous compilers and static analyzers and such that are fully capable of catching the bulk of problems that you have in C.

                                                                                                                                          Not just that, but the problems they list are relatively simple to solve code-wise. Let’s take a look:

                                                                                                                                          “use of uninitialized memory” -> just use calloc and memset on shit before you use it, and assign all of your variables default values (see the sketch after this list). sparse or cppcheck are quite capable of catching these and will point them out for you.

                                                                                                                                          “type confusion” -> Literally any sane static analyser will catch this (and I think GCC will too a lot of the time with -pedantic -Wall), but really you shouldn’t assign between variables of different types anyway, unless you’re using a bona-fide conversion function (like lrint and friends, which will point out bugs for you). Personally speaking, I take this further and use large integer sizes and the appropriate size_t and friends wherever possible; anything else is just premature optimization TBH. Besides, the entire point of typedef is to guard against this sort of thing. Don’t use void *; use typedef XYZ * typename, and you will very rarely have this bug.

                                                                                                                                          “use after free” -> This goes under “use of uninitialized memory”. Anything that can be NULL/-1, check it. Set to NULL/-1 when you free. Clang’s scan-build catches a lot of these and sparse and cppcheck are capable of catching the rest.
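
                                                                                                                                          A minimal sketch of the first and third fixes, using a made-up struct conn purely for illustration:

                                                                                                                                              #include <stdlib.h>

                                                                                                                                              struct conn { int fd; char buf[512]; };

                                                                                                                                              struct conn *conn_new(void)
                                                                                                                                              {
                                                                                                                                                  struct conn *c = calloc(1, sizeof *c);   /* zeroed: no uninitialized bytes */
                                                                                                                                                  if (c != NULL)
                                                                                                                                                      c->fd = -1;                          /* explicit "not open" default */
                                                                                                                                                  return c;
                                                                                                                                              }

                                                                                                                                              void conn_free(struct conn **pc)
                                                                                                                                              {
                                                                                                                                                  if (pc != NULL && *pc != NULL) {
                                                                                                                                                      free(*pc);
                                                                                                                                                      *pc = NULL;   /* later use is a checkable NULL, not a dangling read */
                                                                                                                                                  }
                                                                                                                                              }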

                                                                                                                                          Also, from what I’ve read of security literature, most of the vulnerabilities come from things like not sanitizing your input, allowing larger inputs than you have space for, etc. Those are programming problems that you can have in any language, including Python. While C does give you room to fuck up, it also gives you the tools to NOT do that. Use them.

                                                                                                                                          Javascript is quite literally a bigger danger with regards to proliferation, pure language idiocy, and fuckup potential, because you actually cannot avoid the parts that are broken (some of which are generally considered to include Unicode and Arrays). People regard C as a loaded shotgun, and then go program in Javascript, which has almost an equivalent number of flaws, and which is beyond broken.

                                                                                                                                          Not just that, but C had, and continues to serve, a (somewhat debatable) purpose in the embedded world, in kernel development, in drivers, and some other places. Javascript was arguably superseded when Scheme was invented, 20 years before Javascript was born.

                                                                                                                                          1. 8

                                                                                                                                            Good points on mitigations being available. On the last paragraph, Ada has been around a long time, too, with stuff written in it having fewer defects. Same with Modula-3 at one point. Newer stuff like Rust can do embedded. There was even a C/C++ implementation done in Scheme for its benefits. D and Nim are contenders with advantages, too.

                                                                                                                                            C’s time has passed on technical grounds. Best replacement should still integrate well with its ecosystem, though, for practical reasons.

                                                                                                                                            1. 3

                                                                                                                                              On last paragraph, Ada has been around a long time, too, with stuff written in it having less defects. Same with Modula-3 at one point. […]

                                                                                                                                              Oh, indeed! However, the main benefit of C as it is, is the lack of linguistic complexity. It’s easy to pick the Right Way To Do Things; there’s very little room for debate, except perhaps architecturally, i.e. where it matters. But in addition to that, the best feature of that simplicity is that a) it’s an easy language to remember, and b) it’s an easy language to hold in your head. It doesn’t require a ridiculously huge parser and it’s ‘easy’ to port (at least, it was, heh).

                                                                                                                                              C’s time has passed on technical grounds.

                                                                                                                                              I disagree :)

                                                                                                                                              The main contender, Rust, not only has a ridiculously bloated stdlib (on par with Common Lisp’s in how lost you can get in it), but AFAIK also still produces pretty large binaries. In addition, it pushes a specific method of building on you, which really isn’t favourable to me.

                                                                                                                                              Personally I’d like to see a systems-level language with the syntax of Lisp or Scheme and the philosophy of C, just with a more robust (but hackable) type system.

                                                                                                                                              1. 3

                                                                                                                                                re linguistic complexity. Many of you say that. I think you each pick a subset you need with a coding style you can work with. That might keep it simple for you. The language itself isn’t simple, as I said in the counter to vyodaiken. The people that were modeling and analyzing other languages took something like 40 years to do the same for subsets of C. Even experts like vyodaiken argue with other experts about the language details here on threads about pretty basic stuff. That doesn’t seem simple.

                                                                                                                                                re Rust criticisms. Thanks for sharing them. I know Rust isn’t the end all. If anything, there’s still room for stuff that’s more C-like and flexible to help folks that don’t like Rust. I keep mentioning languages like Cyclone and Clay to give them ideas.

                                                                                                                                                re Lisp/Scheme. There’s two I know of in that direction: PreScheme and ZL. ZL had the most potential if combined with C tooling. Website here. If it doesn’t already, its own implementation probably could be updated to tie in better with popular forms of Scheme like Racket and HtDP.

                                                                                                                                                1. 3

                                                                                                                                                  re Lisp/Scheme. There’s two I know of in that direction: PreScheme and ZL

                                                                                                                                                  I think BitC was quite promising too (from afar; I’ve never actually played with it). I don’t know what happened to it; its website seems down.

                                                                                                                                                  1. 4

                                                                                                                                                    There was Retrospective Thoughts on BitC in 2012. Mail archive is down too, but you can use Internet Archive.

                                                                                                                                                    1. 1

                                                                                                                                                      Thanks for that! I’ll have to take out some time to read it, it’s quite long.

                                                                                                                                                    2. 3

                                                                                                                                                      He was on a roll with EROS, COYOTOS, and BitC. Then Microsoft poached him. (sighs) The best development in high-assurance on the language side in recent times is COGENT. They did two filesystems in it. Although the paper talks about general-purpose use, O’Connor showed up on Reddit and HN saying it’s only for stuff like filesystems. He wouldn’t answer follow-up questions about that. So it’s interesting, but of unknown usefulness.

                                                                                                                                                    3. 1

                                                                                                                                                      The language itself isn’t simple as I said in the counter to vyodaiken. The people that were modeling and analyzing other languages took something like 40 years to do the same for subsets of C.

                                                                                                                                                      As I said, syntactically. The entire C grammar can fit in three pages. The base C library is like 20 pages and fits into the end of K&R next to the grammar. If you want more functions there’s POSIX, 99% of which is part of the base operating system.

                                                                                                                                                      You’re right that C-the-implementation isn’t simple. But at that level there are very few simple things, anyway. There are lots of approaches to choose from implementation-wise, for threads, etc. And not to mention the reality of the machine underneath, which doesn’t give a crap about what you think about it.

                                                                                                                                                      With regards to program verification, you are indeed correct, but I’d argue the main problem with that was that C ended up being subjected to the mutagens of almost every single platform of the 1970s to 1990s, very few of which were standardised. The standards committee ended up having to backwards-support everything. That’s ignoring the fact that in certain cases they make it deliberately more difficult to standardize for the sake of improving optimization, or allowing optimizations that already exist in the wild.

                                                                                                                                                      I was mulling it over after I wrote the above, and I think it’s useful to view C as forged by two pressures: being quick to write a compiler for, and therefore relatively simple to understand how something was implemented (see the macro system and the standard library; indeed, 99% of K&R is just teaching you C by reimplementing the standard library in tiny snippets of C), and being close enough to the machine that it’s easy to make optimization choices. You can generally (although it’s gotten harder with more advanced processors) figure out the machine code produced just by looking at the C source. That’s where C’s power lies, and it’s something that other languages really do not know how to capture.

                                                                                                                                                      Like, it’s one thing to be able to say X is better than Y, but C lends itself really well to showing you why, I guess. And I don’t think we can find a replacement for C until we figure out a language that captures both of those features.

                                                                                                                                                    4. 1

                                                                                                                                                      In addition it pushes a specific method of building on you, which really isn’t favourable to me.

                                                                                                                                                      tbh this is my major objection to rust as well. For all C’s build “process” gets maligned, it is very easy to swap in different tools.

                                                                                                                                                      1. 1

                                                                                                                                                        However the main benefit to C as it is, is the lack of linguistic complexity

                                                                                                                                                        Say what now?

                                                                                                                                                        https://hackernoon.com/so-you-think-you-know-c-8d4e2cd6f6a6

                                                                                                                                                        1. 2

                                                                                                                                                          I believe you are replying to the wrong person.

                                                                                                                                                          1. 1

                                                                                                                                                            Oops.

                                                                                                                                                      2. -1

                                                                                                                                                        However the main benefit to C as it is, is the lack of linguistic complexity. It’s easy to pick the Right Way To Do Things, there’s very little room for debate, except perhaps architecturally – i.e. where it matters.

                                                                                                                                                        Excellent point, and exactly why the Crappy Pascal initiative, also known as the ISO C Standard, has been so detrimental to C.

                                                                                                                                                        1. 4

                                                                                                                                                          I don’t know if you’ve ever worked with pre-ANSI-C code, but given the choice between that and ANSI-C, ANSI-C wins if only for function prototypes.

                                                                                                                                                          What I don’t like about the standard is the tortured language so that everything from signed-magnitude to 2’s-complement, 8-bit to 66-bit, can be supported. That may have been a valid compromise for C89, but is less and less so as time goes on [1]. The major problem with C now is the compiler writers trying to exploit undefined behavior to increase speed, to the point that formerly valid C code now breaks.

                                                                                                                                                          [1] Byte-addressable, 2’s-complement won. Get over it, C Standards Committee!
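
                                                                                                                                                          For a concrete example of the kind of breakage I mean (illustrative, not from any particular codebase):

                                                                                                                                                              /* Signed overflow is undefined behavior, so a modern optimizer
                                                                                                                                                                 may assume n + 1 > n always holds for signed n and fold this
                                                                                                                                                                 function to 0 at -O2, even though older compilers compiled it
                                                                                                                                                                 as the wraparound test the author intended. */
                                                                                                                                                              int will_wrap(int n)
                                                                                                                                                              {
                                                                                                                                                                  return n + 1 < n;
                                                                                                                                                              }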

                                                                                                                                                          1. 1

                                                                                                                                                            The major problem with C now is the compiler writers trying to exploit undefined behavior to increase speed, to the point that formerly valid C code now breaks.

                                                                                                                                                            I think the major problem is people compiling with -O3 and then complaining that compilers are trying to make their broken code fast.

                                                                                                                                                            1. 0

                                                                                                                                                              The standard is full of contradictions and obscurities. Compiler writers treating the standard as if it were some shrink-wrap contract that they could exploit to evade every obligation to their users is simply wrong. The code you complain about as “broken” is not even broken according to the standard, and it’s common in things like K&R2. It’s ridiculous to claim that exploiting loopholes in a murky standard written by a committee that seems to have no idea what they are doing is somehow justifiable.

                                                                                                                                                            2. 1

                                                                                                                                                              You may get your wish soon. From https://herbsutter.com/2018/11/13/trip-report-fall-iso-c-standards-meeting-san-diego/:

                                                                                                                                                              … all known modern computers are two’s complement machines, and, no, we don’t really care about using C++ (or C) on the ones that aren’t. The C standard is likely to adopt the same change.

                                                                                                                                                              1. 0

                                                                                                                                                                I think K&R2 is basically C at its best and that is ANSI. I even like restrict, although God knows the description of it in the standard reads like the authors were typing it on their phones while working at the Motor Vehicle Bureau. But there are programs in K&R2 that are not valid programs according to the current interpretation of the current incarnation of the standard.

                                                                                                                                                        2. 3

                                                                                                                                                          Besides, the entire point of typedef is to guard against this sort of thing, though.

                                                                                                                                                          The big flaw in typedef is that there is implicit type conversion between, e.g., metric x and english y (where both metric and english are typedef’d int), which permits x = y etc. There should be a flag in C to warn on all implicit type conversions, and a strong typedef (although the struct method works well too; see the sketch below).
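
                                                                                                                                                          A minimal sketch of that struct method, with made-up type names:

                                                                                                                                                              /* Wrapping each unit in its own struct turns the mix-up into a
                                                                                                                                                                 compile error instead of a silent implicit conversion. */
                                                                                                                                                              typedef struct { int mm; } metric;
                                                                                                                                                              typedef struct { int in; } english;

                                                                                                                                                              int main(void)
                                                                                                                                                              {
                                                                                                                                                                  metric  x = { 1000 };
                                                                                                                                                                  english y = { 39 };
                                                                                                                                                                  /* x = y; */        /* error: incompatible types, exactly what we want */
                                                                                                                                                                  x.mm = y.in * 25;   /* conversion must now be explicit (approx. 25 mm/inch) */
                                                                                                                                                                  return 0;
                                                                                                                                                              }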

                                                                                                                                                          1. 3

                                                                                                                                                            To solve this thing, just do that thing!

                                                                                                                                                            If it’s so simple and yet so easy to forget… why don’t we just automate it? 😉

                                                                                                                                                            1. 0

                                                                                                                                                              why don’t we just automate it?

                                                                                                                                                              If you read what I have written, that is what I said. scan-build+sparse+cppcheck+valgrind will catch 99% of the errors mentioned in the article, and they take only about 5 seconds to run.

                                                                                                                                                              1. 7

                                                                                                                                                                Sounds to me like you’re choosing a language that requires boilerplate, then installing tools to scan for missing boilerplate.

                                                                                                                                                                For something as important as memory safety, it seems shortsighted to arrive at such a solution. But to each their own: if a certain workflow helps you produce safe code, then I won’t complain it puts the cart before the horse.

                                                                                                                                                                1. 3

                                                                                                                                                                  Valgrind requires you to exhaustively test your application. That’s not a five-second job.

                                                                                                                                                              2. 1

                                                                                                                                                                Nulling things only works if you have a single pointer to the memory, and you pass that pointer by reference when you need it.
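
                                                                                                                                                                For illustration, the usual macro form of that idiom:

                                                                                                                                                                    #include <stdlib.h>

                                                                                                                                                                    /* Frees p and nulls the caller's variable in one step. Any
                                                                                                                                                                       aliased copy of the old pointer still dangles, which is
                                                                                                                                                                       exactly the limitation noted above. */
                                                                                                                                                                    #define FREE_AND_NULL(p) do { free(p); (p) = NULL; } while (0)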