1. 4

    Hi, I’m Atila and I write mostly about systems programming and testing: https://atilanevesoncode.wordpress.com/

    1. 2

      I’d really like to write more D. In my particular case, I couldn’t have a GC in play (self-imposed memory constraints), but there’s a lot about it that’s attractive to me. I have no desire to choose Go over it - from my limited experience, D’s power as a language is considerably greater.

      That said, Go does have a big package community behind it, like Rust.

      1. 14

        Stick a @nogc on your main function and you have a compile-time guarantee that no GC allocations will happen in your program.

        1. 6

          Neat - I didn’t realize this. Too late now for the current project, but good to know for the future. I’m particularly interested in its C++ FFI story. There’s a couple of specialized C++ libraries I’d like to use without having to write flat-C style wrappers just to call them sanely from Rust.

          Thanks for that!

          1. 3

            That’s exactly the kind of tip I was hoping for in the comments. Thanks!

            1. 5

              It’s always the same arguments in D discussions:

              • I don’t like that D has a GC!
              • Just use @nogc!
              • But then some stuff from the standard library does not work anymore!
              • How much of the standard library?
              • Nobody knows, and how would you measure it anyway?
              1. 1

                It’s at least a pattern that’s solvable. Someone just has to attempt to compile the whole standard library with the no-GC option. Then, list the breakage. Then, fix it in order of priority for the kinds of apps that would want a no-GC option. Then, write this up into a web page. Then, everyone shares it in threads where the pattern shows up. Finally, the pattern dies after 10-20 years of network effects.

                1. 2

                  People are doing that. Well, except for the “write this up into a web page” part. I guess you are thinking of web pages like http://www.arewewebyet.org/

                  1. 1

                    Yeah, some way for people to know that it’s being done and at what level of progress. Good to know they’re doing it. That you’re the first to tell me illustrates how a page like that would be useful in these conversations. People in the D camp can just drop a link and be done with it.

          2. 3

            I find D has a lot of packages too. Not an explosive smörgåsbord, but sufficient for my purposes.

            https://code.dlang.org/

            The standard library by itself is fairly rich already.

            https://dlang.org/phobos/

            1. 1

              I guess the question would be whether unsafe or smart pointers are about as easy to use in D as in C or C++. If so, the GC might not be a problem. In some languages, GC is really hard to avoid.

              Maybe @JordiGH, who uses D, can tell us.

              1. 5

                I write D daily. Unsafe pointers work the same as in C or C++. I wrote a GC-less C++-like smart pointer library for D. It’s basically std::unique_ptr and std::shared_ptr, but no std::weak_ptr because 1) I haven’t needed it and 2) one can, if needed, rely on the GC to break cycles (although I don’t know how easy that would be to do in practice currently).
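
                For reference, a minimal C++ sketch of the semantics that library mirrors (this is the standard C++ API, not the D library itself):

                ```cpp
                #include <iostream>
                #include <memory>

                int main() {
                    // std::unique_ptr: sole ownership, move-only, freed deterministically.
                    auto u = std::make_unique<int>(42);
                    auto u2 = std::move(u);  // ownership transfers; u becomes null

                    // std::shared_ptr: reference-counted; freed when the count hits zero.
                    // Reference cycles leak unless broken (std::weak_ptr in C++, or a GC).
                    auto s1 = std::make_shared<int>(7);
                    auto s2 = s1;  // count is now 2

                    std::cout << (u == nullptr) << " " << s1.use_count() << "\n";  // prints "1 2"
                }
                ```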

                1. 1

                  D is a better C++, so pointers are easier to use than in C++. As I understand it, the main problem is that the standard library used to use the GC freely, making the GC hard to avoid if you used the standard library. I understand there is an ongoing effort to clear this up, but I don’t know the current status.

                  1. 3

                    It depends on which part of the standard library. These days, the parts most often used have functions that don’t allocate. In any case it’s easy to avoid by using @nogc.

              1. 2

                I love coding, so much so that it’s what I would do during the day even if I were independently wealthy and didn’t need to work. I’m lucky that I get paid to do something I enjoy.

                Having said that, a lot of “real work” programming is tedious and frustrating, especially when you work on something you don’t think is necessary, or when you’re maintaining someone else’s code.

                Still, for me, programming on a bad day beats a good day doing pretty much anything else.

                I don’t think you should feel guilty - figure out what matters to you and shape your life around that. If all coding is to you is a way to pay the bills so you get to do what’s important to you, well, there’s no shame in that.

                1. 11

                  I don’t really get this list. I’m sure someone believes these things but I don’t know them. Seems like just a mish-mash of things the author has heard someone say, with an attempt to make some sort of generalization out of them.

                  1. 11

                    I see these on Hacker News and Reddit all the time:

                    1. Assumption that rewriting in C will always be fast. So, better to do that than make your HLL version faster or do a hybrid.

                    2. Assumption that GC’s always have long delays or something that forces you to use C. Many still don’t know that real-time GC’s were invented or that some real-time software in industry uses HLL’s, despite pjmpl and me continuously posting about that. Myths about memory management are so strong that we just about need some coding celebrities to do a blog post with references on these kinds of things to generate a baseline of awareness. Then, maybe people will build more of them into mainstream languages. :)

                    3. You have to use C for the C ABI. People were just arguing this on Lobsters and HN a while back on C-related posts. I was arguing the position that you don’t need C in the majority of cases in the C ecosystem. Just include, wrap, and/or generate it whenever you can.

                    4. Conflating of C/C++ where one shouldn’t. I slipped on that myself a lot in the past because many people coded C++ like it was higher-level C. There is a distinct style for C++ that eliminates many C-related problems. I’m more careful now. I still pass on the correction to others.

                    5. You can only write kernels in C. This is a little inaccurate since many know you can use C++. They consider it a C superset, though, which sort of reinforces the concept that you need some kind of C. Many do believe everything depends on C underneath, though. It shows up in damn near every thread on high-level languages in low-level or performance-critical situations. In my main comment, I gave an extensive list of usages that came before, around the same time as, and much later than C. There’s even more that used C only for the convenience of saving time with pre-built components. Linking to those would defeat the purpose of showing one doesn’t need C, though. ;)

                    6. C maps closely to the hardware. This has been debated multiple times on Lobsters just this year. The most widespread metaphor for it is “cross-platform assembly.” So, this is a widespread belief whether it’s a myth or not. There’s a lot of disagreement on this one.

                    The others outside of Emacs or terminal stuff I’ve also seen or countered plenty of times. I don’t know that they’re widespread, though. The ones I cited I’ve seen debated by many people in many places over long periods of time. They’re definitely popular beliefs even if I don’t know specific numbers of people involved.
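
                    Point 3 is easy to illustrate in C++: extern "C" gives a function C linkage, so anything with a C FFI can call it over the C ABI with no C wrapper or C compiler involved. A hedged sketch (the add function is made up for illustration):

                    ```cpp
                    #include <iostream>

                    // A C++ function exported over the C ABI. C callers - or any
                    // language with a C FFI - can call `add` directly; the ABI is a
                    // calling convention, not the C language itself.
                    extern "C" int add(int a, int b) { return a + b; }

                    int main() {
                        std::cout << add(2, 3) << "\n";  // prints 5
                    }
                    ```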

                    1. 5

                      I’m not sure what you’re trying to say. I said that I’m sure some people believe the things mentioned in this blog, but what about it? Your list is just a rehash of the contents of the blog; what are you specifically adding to the discussion? There are people in every industry who are ignorant of aspects of their own industry; that’s just how the world works. The counter-evidence to the items in this list is accessible to anyone interested.

                      1. 4

                        I don’t really get this list. I’m sure someone believes these things but I don’t know them.

                        You originally said the quote above. In your experience, you must never see these things. In my experience and probably the author’s, they’re so common that programmers believing them due to widespread repetition miss opportunities in their projects. This includes veterans who just never encountered specific technologies used way outside their part of industry or FOSS. Countering the myths and broadening people’s perspectives on social media might bring attention to the alternatives, leading to more of them being fielded in FOSS or commercial applications.

                        That’s the aim when I bring stuff like this up. Sometimes, I see people actually put it into practice, too.

                        1. 5

                          programmers believing them due to widespread repetition miss opportunities in their projects.

                          But that’s clearly not true, right? How much code is written in JavaScript, Python, Ruby, Java, etc? Even if one believes GC’d languages are slow, they are still solving problems in them.

                          After some thought, I believe my strong reaction to this comes down to a few reasons:

                          1. I think most programmers are just ignorant of layers they don’t actively work with. A web dev might think C is the only language to implement kernels in because they just don’t know any better. Maybe that makes it a myth and I’m splitting hairs, but I feel like it’s more a case of “things some programmers are ignorant of”.
                          2. I think the wording of several of these is misleading. “C is magically fast”?? Even people who believe C is the fastest language to implement anything in don’t call it magic, IME. Things implemented in C are usually very fast, but there is nothing magic about it. Same for GC languages being magically slow. Those descriptions just do not match the nuance of the actual discussion, IME.
                          3. The endianness one is also misleading, IMO. The endianness of your machine certainly matters. The blog post linked is not about endianness not mattering; it’s about how to write code agnostic of endianness. These are very different things.

                          I think my initial response was stronger than it needed to be, however I do not believe this blog post really helps the situation and might even add some of its own myths.

                          1. 4

                            My experience isn’t with webdevs thinking C is the only language to implement kernels with - it’s about systems programmers who think so. The same goes for C being magically fast - people literally believe that it’s impossible to beat C at execution speed, no matter what language they pick to write code in. I had to prove to a colleague that C++’s std::sort was faster than C’s qsort with code and benchmarks.
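
                            A minimal sketch of why that benchmark tends to come out that way (timing omitted; it just shows both produce the same result while std::sort gets a comparator the compiler can inline):

                            ```cpp
                            #include <algorithm>
                            #include <cstdlib>
                            #include <iostream>
                            #include <vector>

                            // qsort calls its comparator through a function pointer on every
                            // comparison; std::sort receives the comparison as a template
                            // parameter the compiler can inline, which is typically why it wins.
                            static int cmp_int(const void *a, const void *b) {
                                int x = *static_cast<const int *>(a);
                                int y = *static_cast<const int *>(b);
                                return (x > y) - (x < y);
                            }

                            int main() {
                                std::vector<int> a = {5, 3, 1, 4, 2};
                                std::vector<int> b = a;

                                std::sort(a.begin(), a.end());                         // inlined comparator
                                std::qsort(b.data(), b.size(), sizeof(int), cmp_int);  // indirect calls

                                std::cout << std::boolalpha << (a == b) << "\n";  // prints "true"
                            }
                            ```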

                            On endianness, I think I did myself a disservice by implying that it never matters - it just doesn’t 99.9% of the time. Write endianness-agnostic code and be done with it. It’s like caring which bits of a float are the mantissa - the CPU will do the right thing.

                            1. 3

                              Oh yeah, I’ve had many C programmers say that stuff to me. You were on point with the Fortran counter: I use it, too.

                    2. 1

                      This is true - it’s a list of things that I hear/read a lot and don’t understand why people believe them.

                      1. 6

                        Subtitle: the perils of the comment section.

                        Corollary: disproving these myths will cause ten others to spontaneously emerge to take their place. :)

                        1. 2

                          Because people believe strange things and/or are often ignorant of things they don’t have close proximity to. How many production kernels are not written in C? And how many of those will your average developer have knowledge of if they aren’t actively interested in the OS layer?

                          To put it another way: you probably believe things that someone with alternative experience would classify as myths and not understand why you believe them.

                          1. 3

                            Because people believe strange things and/or are often ignorant of things they don’t have close proximity to

                            Sure, but I know people who still think these things despite having been presented with evidence to the contrary.

                            you probably believe things that someone with alternative experience would classify as myths and not understand why you believe them

                            With nearly 100% certainty. I love to be proven wrong, because after I am, I’m no longer wrong about that particular matter.

                            This list is just things that I have observed that I find irrational, with no implications on my part that I’m any more rational on other matters. Or even these!

                      1. 7

                        Nice article. A few observations.

                        On the GC part, it’s worth bringing up memory pools, refcounting, and low-latency/real-time GC’s. From what I read, Go and Nim are examples of languages using quick ones. C++ and Ada have long supported use of things like memory pools or reference counting. If the compiler didn’t have it, the programmer could add it as a library/module.

                        On text editors without goto-definition, they might want to look at LISP Machines for other ancient features they might want to emulate in a UNIX/C environment. Although it had a window system, the interface isn’t much more advanced than some stuff I had on MS-DOS a long time ago. I’m sure a UNIX app or interacting apps could pull it off.

                        On the C ABI, I’ve been repeatedly countering this, even here. One can always add seamless integration with C code and/or generation of it to their language. It should probably be one of the first design considerations, too, since some high-level languages will clash with C’s nature. At the least, one can get better parsing/refactoring, more safety, real macros, easier portability, and support for specialized hardware. One does this by getting rid of the parts of C-the-language that make it hard.

                        On concurrency, Concurrent Pascal (1975) by Per Brinch Hansen was used in an OS, with Ada Ravenscar and Eiffel SCOOP having a lot of commercial deployment.

                        On kernels, there’s Burroughs MCP done in ALGOL (Unisys still sells it), IBM OS’s done in PL/S, a PL/I-like language, the Intel iAPX 432 with its OS in Ada, the Xerox Star in Mesa, Oberon in Oberon, House in Haskell for most of it, and Muen in SPARK Ada. Hell, there’s even one in FreeBASIC.

                        1. 0

                          Thanks for the comments. I figured I needed to write a whole blog post about GCs and how it’s likely that the people who believe what they do just don’t know how they work. It’s a long post as it is.

                          1. 2

                            After you write it, be sure to submit it here. I’ll definitely read it. Here’s a search listing of the real-time GC’s I submitted, in case any are useful.

                        1. 36

                          Then again, I’ve rarely seen anyone use their editor of choice well. I’ve lost count of how many times I’ve watched someone open a file in vim, realise it’s not the one they want, close vim, open another file, close it again… aaarrrgh.

                          I do this a lot, because I prefer browsing files in the shell. I make pretty extensive use of a lot of other vim features though. When did you become the arbiter of how “well” I’m using my computer?

                          1. 3

                            Closing vim seems odd to me. Why wouldn’t one instead open the new file without closing vim? Maybe it’s a cultural thing? I don’t think anyone would do that in Emacs.

                            1. 26

                              “Why would I ever leave my editor” definitely feels like a common refrain from the Emacs crowd.

                              1. 1

                                I do the thing you quoted as well, but that is because vim is often my editor of convenience on a machine rather than my editor of choice, which is true for many usages I see of vim.

                              2. 21

                                Because the shell lets me change directories, list files with globs, run find, has better tab-completion (bash, anyway), etc, etc. I might not remember the exact name of the file, etc. Finding files in the shell is something I do all day, so I’m fast at it. Best tool for the job and all that.

                                (Yes I can do all that with ! in vi/vim/whatever, but there’s a cognitive burden since that’s not how I “normally” run those commands. Rather than do it, mess it up because I forgot ! in front or whatever, do it again, etc, I can just do it how I know it’ll work the first time.)

                                1. 6

                                  This is exactly why I struggle with editors like Emacs. My workflow is definitely oriented around the shell. The editor is just another tool among many. I want to use it just like I use all my other tools. I can’t get on with the Emacs workflow, where the editor is some special place that stays open. I open and close my editor many, many times every day. To my mind, keeping your editor open is the strange thing!

                                  1. 3

                                    It’s rather simple actually: the relationship between the editor and the shell is turned on its head – from within the editor you open a shell (e.g. eshell, ansi-term, shell, …) and use it for as long as you need it, just like one would use vi from a shell. Ninja-slices.

                                    You can compare this to someone who logs out of their X session every time they close a terminal or a shell in a multiplexer. That would seem weird too.

                                    1. 3

                                      I know you can launch a shell from within your editor. I just never really understood why you would want to do that.

                                      Obviously some people do like to do that. My point is just that different ways of using a computer make intuitive sense to different people. I don’t think you can justify calling one way wrong just because it seems odd to you.

                                      1. 6

                                        I know you can launch a shell from within your editor. I just never really understood why you would want to do that.

                                        I do it because it allows me to use my editor’s features to:

                                        a) edit commands
                                        b) manipulate the output of commands in another buffer (and/or use shell pipelines to prep the output buffer)
                                        c) not have to context switch to a different window, shut down the editor, suspend the editor, or otherwise change how I interact with the currently focused window.

                                        1. 1

                                          That makes a lot of sense. I guess I have been misleading in using the word “shell” when I should really have said “terminal emulator”. I often fire off shell commands from within my editor, for just the same reasons as you, but I don’t run an interactive shell. I like M-! but I don’t like eshell, does that make sense?

                                          Having pondered this all a bit more, I think it comes down to what you consider to be a place. I’m pretty certain I read about places versus tools here on lobsters but I can’t find it now. These are probably tendencies rather than absolutes, but I think there are at least a couple of different ways of thinking about interaction with a computer. Some people think of applications as places: you start up a number of applications, they persist for the length of your computing session, and you switch between them for different tasks (maybe a text editor, a web browser and an e-mail client, or something). Alternatively, applications are just tools that you pick up and drop as you need them. For me, a terminal, i.e. an interactive shell session, is a place. It is the only long-lived application on my desktop, everything else is ephemeral: I start it to accomplish some task then immediately kill it.

                                    2. 3

                                      It’s really simple in Emacs. Just hit Ctrl-z to suspend it and run fg when you’re ready to go back.

                                1. 0

                                  A list of beliefs about programming that I maintain are misconceptions.

                                  1. 3

                                    Small suggestion: use a darker, bigger font. There are likely guidelines somewhere, but I don’t think you can go wrong using #000 for text people are supposed to read for longer than a couple of seconds.

                                    1. 3

                                      Current web design seems allergic to any sort of contrast. Even hyper-minimalist web design calls for less contrast for reasons I can’t figure out. Admittedly, I’m a sucker for contrast; I find most programming colorschemes hugely distasteful for the lack of contrast.

                                      1. 6

                                        I think a lot of people find the maximum contrast ratios their screens can produce physically unpleasant to look at when reading text.

                                        I believe that people with dyslexia in particular find reading easier with contrast ratios lower than #000-on-#fff. Research on this is a bit of a mixed bag but offhand I think a whole bunch of people report that contrast ratios around 10:1 are more comfortable for them to read.

                                        As well as personal preference, I think it’s also quite situational? IME, bright screens in dark rooms make black-on-white headache inducing but charcoal-on-silver or grey-on-black really nice to look at.

                                        WCAG AAA asks for a contrast ratio of 7:1 or higher in body text which does leave a nice amount of leeway for producing something that doesn’t look like looking into a laser pointer in the dark every time you hit the edge of a glyph. :)

                                        As for the people putting, like, #777-on-#999 on the web, I assume they’re just assholes or something, I dunno.

                                        Lobsters is #333-on-#fefefe which is a 12.5:1 contrast ratio and IMHO quite nice with these fairly narrow glyphs.

                                        (FWIW, I configure most of my software for contrast ratios around 8:1.)
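
                                        That 12.5:1 figure falls out of the WCAG formula directly; a sketch that computes it:

                                        ```cpp
                                        #include <algorithm>
                                        #include <cmath>
                                        #include <cstdio>

                                        // WCAG 2.x relative luminance of one 8-bit sRGB channel.
                                        static double channel(int c8) {
                                            double c = c8 / 255.0;
                                            return c <= 0.03928 ? c / 12.92 : std::pow((c + 0.055) / 1.055, 2.4);
                                        }

                                        static double luminance(int r, int g, int b) {
                                            return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
                                        }

                                        // Contrast ratio: (lighter + 0.05) / (darker + 0.05).
                                        static double contrast(double l1, double l2) {
                                            return (std::max(l1, l2) + 0.05) / (std::min(l1, l2) + 0.05);
                                        }

                                        int main() {
                                            // Lobsters body text: #333 on #fefefe.
                                            double fg = luminance(0x33, 0x33, 0x33);
                                            double bg = luminance(0xfe, 0xfe, 0xfe);
                                            std::printf("%.1f\n", contrast(fg, bg));  // prints 12.5
                                        }
                                        ```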

                                        1. 2

                                          Very informative, thank you!

                                    2. 3

                                      I think the byte-order argument doesn’t hold, given that you mentioned ntohs and htons, which are exactly where byte order needs to be accounted for…

                                      1. 2

                                        If you read the byte stream as a byte stream and shift the bytes into position, there’s no need to check the endianness of your machine (you just need to know the endianness of the stream) - the shifts will always do the right thing. That’s the point he was trying to make there.
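
                                        A minimal C++ sketch of that byte-shifting approach (read_u32_be is a made-up helper name):

                                        ```cpp
                                        #include <cstdint>
                                        #include <iostream>

                                        // Read a big-endian 32-bit value from a byte stream. Only the
                                        // stream's byte order matters; the shifts assemble the same value
                                        // on any host, with no endianness check or #ifdef.
                                        static std::uint32_t read_u32_be(const std::uint8_t *p) {
                                            return (std::uint32_t(p[0]) << 24) | (std::uint32_t(p[1]) << 16) |
                                                   (std::uint32_t(p[2]) << 8) | std::uint32_t(p[3]);
                                        }

                                        int main() {
                                            const std::uint8_t stream[] = {0x12, 0x34, 0x56, 0x78};
                                            std::cout << std::hex << read_u32_be(stream) << "\n";  // prints 12345678
                                        }
                                        ```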

                                        1. 2

                                          ntohs and htons do that exact thing, and you don’t need to check the endianness of your machine, so the comment about not understanding why they exist makes me feel like the author is not quite grokking it. Those functions/macros can be implemented to do the exact thing linked to in the blog post.
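
                                          A sketch of how such a portable implementation could look (my_ntohs is a hypothetical stand-in, not the real macro):

                                          ```cpp
                                          #include <cstdint>
                                          #include <cstring>
                                          #include <iostream>

                                          // A value in network (big-endian) order is just two bytes in memory;
                                          // shifting them into place yields the host value with no check of the
                                          // host's own endianness.
                                          static std::uint16_t my_ntohs(std::uint16_t n) {
                                              std::uint8_t b[2];
                                              std::memcpy(b, &n, 2);  // bytes as they sit in memory (wire order)
                                              return static_cast<std::uint16_t>((b[0] << 8) | b[1]);
                                          }

                                          int main() {
                                              std::uint8_t wire[] = {0x12, 0x34};  // 0x1234 on the wire
                                              std::uint16_t n;
                                              std::memcpy(&n, wire, 2);
                                              std::cout << std::hex << my_ntohs(n) << "\n";  // prints 1234 on any host
                                          }
                                          ```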

                                    1. 2

                                      I also wrote about a similar technique in D

                                      1. 7

                                        I always laugh when people come up with convoluted defenses for C and the effort that goes into that (even writing papers). Their attachment to this language has caused billions if not trillions worth of damages to society.

                                        All of the defenses that I’ve seen, including this one, boil down to nonsense. Like others, the author calls for “improved C implementations”. Well, we have those already, and they’re called Rust, Swift, and, for the things C is not needed for, yes, even JavaScript is better than C (if you’re not doing systems programming).

                                        1. 31

                                          Their attachment to this language has caused billions if not trillions worth of damages to society.

                                          Their attachment to a language with known but manageable defects has created trillions if not more in value for society. Don’t be absurd.

                                          1. 4

                                            [citation needed] on the defects of memory unsafety being manageable. To a first approximation every large C/C++ codebase overfloweth with exploitable vulnerabilities, even after decades of attempting to resolve them (Windows, Linux, Firefox, Chrome, Edge, to take a few examples.)

                                            1. 2

                                              Compared to which widely used large codebase, in which language, for which application, that accepts and parses external data and yet has no exploitable vulnerabilities? BTW: http://cr.yp.to/qmail/guarantee.html

                                              1. 6

                                                Your counterexample is a smaller, low-featured mail server written by a math and coding genius. I could cite Dean Karnazes doing ultramarathons to show how far people can run. That doesn’t change that almost all runners would drop before 50 miles, let alone 300. Likewise with C code: citing the best of the secure coders doesn’t change what most will do or have done. I took the author’s statement “to first approximation every” to mean “almost all” but not “every one.” It’s still true.

                                                Whereas, Ada and Rust code have done a lot better on memory-safety even when non-experts are using them. Might be something to that.

                                                1. 2

                                                  I’m still asking for the non-C, widely used, large-scale system with significant parsing that has no errors.

                                                  1. 3

                                                    That’s cheating, saying “non-C” and “widely used.” Most of the no-error parsing systems I’ve seen use a formal grammar with autogeneration. They usually extract to OCaml. Some also generate C just to plug into the ecosystem, since it’s a C/C++-based ecosystem. It’s incidental in those cases: it could be any language, since the real programming is in the grammar and generator. An example of that is the parser in the Mongrel server, which was doing a solid job when I was following it. I’m not sure if they found vulnerabilities in it later.

                                                2. 5

                                                  At the bottom of the page you linked:

                                                  I’ve mostly given up on the standard C library. Many of its facilities, particularly stdio, seem designed to encourage bugs.

                                                  Not great support for your claim.

                                                  1. 2

                                                    There was an integer overflow reported in qmail in 2005. Bernstein does not consider this a vulnerability.

                                                3. 3

                                                  That’s not what I meant by attachment. Their interest in C certainly created much value.

                                                4. 9

                                                  Their attachment to this language has caused billions if not trillions worth of damages to society.

                                                  Inflammatory much? I’m highly skeptical that the damages have reached trillions, especially when you consider what wouldn’t have been built without C.

                                                  1. 12

                                                    Tony Hoare, null’s creator, regrets its invention and says that inserting just the one idea has cost billions. He mentions it in talks. It’s interesting that language creators even think the mistakes they’ve made have caused billions in damages.

                                                    “I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn’t resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.”

                                                    If the billion dollar mistake was the null pointer, the C gets function is a multi-billion dollar mistake that created the opportunity for malware and viruses to thrive.
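
                                                    The gets problem is concrete enough to sketch: it takes no buffer size, so overlong input writes past the destination. Here fgets shows the bounded alternative, with fmemopen (a POSIX function) simulating the input stream:

                                                    ```cpp
                                                    #include <cstdio>
                                                    #include <cstring>
                                                    #include <iostream>

                                                    int main() {
                                                        // gets() has no size parameter: input longer than the buffer
                                                        // writes past its end - the unbounded write behind classic
                                                        // stack-smashing exploits. fgets() takes the size and truncates.
                                                        char input[] = "this line is far longer than the destination buffer\n";
                                                        FILE *in = fmemopen(input, std::strlen(input), "r");

                                                        char buf[16];
                                                        std::fgets(buf, sizeof buf, in);  // reads at most 15 chars plus '\0'
                                                        std::fclose(in);

                                                        std::cout << std::strlen(buf) << "\n";  // prints 15
                                                    }
                                                    ```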

                                                    1. 2

                                                      He’s deluded. You want a billion dollar mistake: try CSP/Occam plus Hoare Logic. Null is a necessary byproduct of implementing total functions that approximate partial ones. See, for example, McCarthy in 1958 defining a LISP search function with a null return on failure. http://www.softwarepreservation.org/projects/LISP/MIT/AIM-001.pdf

                                                      1. 3

                                                        “ try CSP/Occam plus Hoare Logic”

                                                        I think you meant formal verification, which is arguable. They could’ve wasted a hundred million easily on the useless stuff. Two out of three are bad examples, though.

                                                        Spin has had a ton of industrial success, easily knocking out problems in protocols and hardware that are hard to find via other methods. With hardware, the defects could’ve caused recalls like the Pentium bug. Likewise, Hoare-style logic has been doing its job in Design-by-Contract, which knocks time off the debugging and maintenance phases - the most expensive ones. If anything, not using tech like this can add up to a billion-dollar mistake over time.

                                                        Occam looks like it was a large waste of money, esp in the Transputer.

                                                        1. 1

                                                          No. I meant what I wrote. I like spin.

                                                      2. 1

                                                        Note what he does not claim is that the net result of C’s continued existence is negative. Something can have massive defects and still be an improvement over the alternatives.

                                                      3. 7

                                                        “especially when you consider what wouldn’t have been built without C.”

I just countered that. The language didn’t have to be built the way it was or persist that way. We could be building new stuff in a C-compatible language with many benefits of HLLs like Smalltalk, LISP, Ada, or Rust, with the legacy C getting gradually rewritten over time. If that had started in the ’90s, we could have the equivalent of a LISP machine for C code, OS, and browser by now.

                                                        1. 1

                                                          It didn’t have to, but it was, and it was then used to create tremendous value. Although I concur with the numerous shortcomings of C, and it’s past time to move on, I also prefer the concrete over the hypothetical.

                                                          The world is a messy place, and what actually happens is more interesting (and more realistic, obviously) than what people think could have happened. There are plenty of examples of this inside and outside of engineering.

                                                          1. 3

The major problem I see with this “concrete” winners-take-all mindset is that it encourages Whig history, which can’t distinguish the merely victorious from the inevitable. In order to learn from the past, we need to understand what alternatives were present before we can hope to discern what may have caused some to succeed and others to fail.

                                                            1. 2

                                                              Imagine if someone created Car2 which crashed 10% of the time that Car did, but Car just happened to win. Sure, Car created tremendous value. Do you really think people you’re arguing with think that most systems software, which is written in C, is not extremely valuable?

                                                              It would be valuable even if C was twice as bad. Because no one is arguing about absolute value, that’s a silly thing to impute. This is about opportunity cost.

                                                              Now we can debate whether this opportunity cost is an issue. Whether C is really comparatively bad. But that’s a different discussion, one where it doesn’t matter that C created value absolutely.

                                                        2. 8

C is still much more widely used than those safer alternatives. I don’t see how laughing off a fact is better than researching its causes.

                                                          1. 10

Billions of lines of COBOL run mission-critical services of the top 500 companies in America. Better to research the causes of this than laugh it off. Are you ready to give up C for COBOL on mainframes, or do you think both languages’ popularity was caused by historical events and contexts, with inertia taking over? I’m in the latter camp.

                                                            1. 7

Are you ready to give up C for COBOL on mainframes, or do you think both languages’ popularity was caused by historical events and contexts, with inertia taking over? I’m in the latter camp.

Researching the causes of something doesn’t imply taking a stance on it; if anything, taking a stance on something should hopefully imply you’ve researched it. Even with your comment, I still don’t see how laughing off a fact is better than researching its causes.

You might be interested in laughing about all the COBOL still in use, or in research that looks into the causes of that. I’m in the latter camp.

                                                              1. 5

                                                                I think you might be confused at what I’m laughing at. If someone wrote up a paper about how we should continue to use COBOL for reasons X, Y, Z, I would laugh at that too.

                                                                1. 3

                                                                  Cobol has some interesting features(!) that make it very “safe”. Referring to the 85 standard:

                                                                  X. No runtime stack, no stack overflow vulnerabilities
                                                                  Y. No dynamic memory allocation, impossible to consume heap
                                                                  Z. All memory statically allocated (see Y); no buffer overflows
                                                                  
                                                                  1. 3

                                                                    We should use COBOL with contracts for transactions on the blockchains. The reasons are:

                                                                    X. It’s already got compilers big businesses are willing to bet their future on.

                                                                    Y. It supports decimal math instead of floating point. No real-world to fake, computer-math conversions needed.

                                                                    Z. It’s been used in transaction-processing systems that have run for decades with no major downtime or financial losses disclosed to investors.

                                                                    λ. It can be mathematically verified by some people who understand the letter on the left.

                                                                    You can laugh. You’d still be missing out on a potentially $25+ million opportunity for IBM. Your call.

                                                                    1. 1

                                                                      Your call.

                                                                      I believe you just made it your call, Nick. $25+ million opportunity, according to you. What are you waiting for?

                                                                      1. 4

                                                                        You’re right! I’ll pitch IBM’s senior executives on it the first chance I get. I’ll even put on a $600 suit so they know I have more business acumen than most coin pitchers. I’ll use phrases like vertical integration of the coin stack. Haha.

                                                                  2. 4

That makes sense. I did do the C research. I’ll be posting about that in a reply later tonight.

                                                                    1. 10

I’ll be posting about that in a reply later tonight.

                                                                      Good god man, get a blog already.

                                                                      Like, seriously, do we need to pass a hat around or something? :P

                                                                      1. 5

Haha. Someone actually built me a prototype a while back. Makes me feel guilty that I don’t have one, instead of the usual lazy or overloaded.

                                                                          1. 2

That’s cool. Setting one up isn’t the hard part. The hard part is doing a presentable design, organizing the complex activities I do, moving my write-ups into it, adding metadata, and so on. I’m still not sure how much I should worry about the design. One’s site can be considered a marketing tool for people that might offer jobs and such. I’d go into more detail but you’d tell me “that might be a better fit for Barnacles.” :P

                                                                            1. 3

Skip the presentable design. Dan Luu’s blog does pretty well even though it’s not working hard to be easy on the eyes. The rest of that stuff you can add as you go - remember, perfect is the enemy of good.

                                                                              1. 0

                                                                                This.

                                                                                Hell, Charles Bloom’s blog is basically an append-only textfile.

                                                                              2. 1

                                                                                ugh okay next Christmas I’ll add all the metadata, how does that sound

                                                                                1. 1

Making me feel guilty again. Nah, I’ll build it myself, likely on a VPS.

And damn, time has been flying. Doesn’t feel like several months have passed on my end.

                                                                        1. 1

Looking forward to reading it :)

                                                                  3. 4

                                                                    Well, we have those already, and they’re called Rust, Swift, ….

                                                                    And D maybe too. D’s “better-c” is pretty interesting, in my mind.

                                                                    1. 3

Last I checked, D’s “better-c” was a prototype.

                                                                    2. 5

                                                                      If you had actually made a serious effort at understanding the article, you might have come away with an understanding of what Rust, Swift, etc. are lacking to be a better C. By laughing at it, you learned nothing.

                                                                      1. 2

                                                                        the author calls for “improved C implementations”. Well, we have those already, and they’re called Rust, Swift

                                                                        Those (and Ada, and others) don’t translate to assembly well. And they’re harder to implement than, say, C90.

                                                                        1. 3

                                                                          Is there a reason why you believe that other languages don’t translate to assembly well?

                                                                          It’s true those other languages are harder to implement, but it seems to be a moot point to me when compilers for them already exist.

                                                                          1. 1

                                                                            Some users of C need an assembly-level understanding of what their code does. With most other languages that isn’t really achievable. It is also increasingly less possible with modern C compilers, and said users aren’t very happy about it (see various rants by Torvalds about braindamaged compilers etc.)

                                                                            1. 4

                                                                              “Some users of C need an assembly-level understanding of what their code does.”

Which C doesn’t give them, due to compiler differences and the effects of optimization. Aside from spotting errors, it’s why folks in safety-critical work are required to check the assembly against the code. The C language is certainly closer to assembly behavior, but doesn’t by itself give assembly-level understanding.

                                                                        2. 2

                                                                          So true. Every time I use the internet, the solid engineering of the Java/Jscript components just blows me away.

                                                                          1. 1

                                                                            Everyone prefers the smell of their own … software stack. I can only judge by what I can use now based on the merits I can measure. I don’t write new services in C, but the best operating systems are still written in it.

                                                                            1. 5

                                                                              “but the best operating systems are still written in it.”

That’s an incidental part of history, though. People who are writing, say, a new x86 OS with a language balancing safety, maintenance, performance, and so on might not choose C. At least three chose Rust, one Ada, one SPARK, several Java, several C#, one LISP, one Haskell, one Go, and many C++. Plenty of choices are being explored, including languages C coders might say aren’t good for OS’s.

Additionally, many choosing C or C++ say it’s for existing tooling, tutorials, talent, or libraries. Those are also incidental to its history rather than advantages of its language design. Definitely worthwhile reasons to choose a language for a project, but they shift the language argument itself, implying they had better things in mind that weren’t usable yet for that project.

                                                                              1. 4

                                                                                I think you misinterpreted what I meant. I don’t think the best operating systems are written in C because of C. I am just stating that the best current operating system I can run a website from is written in C, I’ll switch as soon as it is practical and beneficial to switch.

                                                                                1. 2

                                                                                  Oh OK. My bad. That’s a reasonable position.

                                                                                  1. 3

I worded it poorly. I won’t edit it, though, for context.

                                                                          1. 1

                                                                            “all you need to annotate are function parameters and return values” - true in C++ now too, it’s not just Rust.

                                                                            “gtest sucks” - it does, but there are far better alternatives. I agree that pytest rocks. I’m curious as to whether dependency injection and mocking are better in Rust than in C++, especially given the lack of compile-time reflection.

                                                                            1. 3

                                                                              In my experience C++ generally requires more annotation of types within a function body, so it is still fair to call out annotating only function parameters and return values as a strength of Rust in particular.

                                                                              For example in Rust:

                                                                              // Within the same function body we push a `&str` into the vector
// so the compiler understands this must be a `Vec<&str>`.
                                                                              let mut vec = Vec::new();
                                                                              vec.push("str");
                                                                              

                                                                              versus C++:

                                                                              // Vector element type cannot be inferred.
                                                                              std::vector<const char *> vec;
                                                                              vec.push_back("str");
                                                                              
                                                                              1. 1

                                                                                C++17 has constructor template argument deduction, so you can just say auto vec = vector({"str"}) now. Though Rust’s type inference is obviously more powerful.

                                                                            1. 5

                                                                              Apropos of not much related to the actual article:

I need to know the difference between int, long, uint8_t, size_t even if all I want is a goddamn integer.

Use the fixed-width integer types (uintX_t, intX_t, and friends) and avoid the fundamental integer types (int, long, etc). It’s not as easy as “a goddamn integer”, but it’s easier than the fundamental types since you know the size.

                                                                              1. 1

On the contrary, “Use int unless you’re interfacing with hardware” should be the rule of thumb. Yes, I know, the standard library makes it hard to use int because of the now-we-never-should-have-done-that prevalence of size_t, but range-based for loops have made that mostly OK.

                                                                              1. 3

Author here, AMA. I didn’t post on here before because I didn’t have an invite to lobste.rs.

                                                                                1. 4

I wonder how they plan on handling templates. The reason that D’s interoperability with C++ is incomplete is that you would basically need a full C++ compiler to handle templates, so I really am curious to learn how this works around that requirement.

                                                                                  I understand that this is actually using a full C++ compiler (llvm), but even so, how is the result going to be turned into D templates?

                                                                                  1. 3

                                                                                    I have some ideas about how to handle templates. D can link to C++ template functions and member functions, but needs a C++ source file to instantiate them and a C++ compiler to generate the binary to link to.

                                                                                    The easiest way is to leave it to the user, but that’s not very ergonomic. So I’m thinking of, at the very least, trying to figure out all instantiations that happen in a .dpp file and automatically generating a C++ source file that should be compiled that the rest of the program can link to.

                                                                                    I’m not sure yet. I won’t be until I try.

                                                                                    1. 1

I wonder how they plan on handling templates.

                                                                                      Probably not at all? Templates are not a C feature.

                                                                                      1. 1

                                                                                        Please read the article until the end. C++ support is planned.

                                                                                        1. 4

                                                                                          My apologies. That is an ambitious goal.

                                                                                          Looking around the dpp codebase (which is impressively small!) it doesn’t look as though template work has started yet, although I did find official docs that seem to indicate D already has some support for C++ templates. I’m not sure how much complexity it tolerates.

                                                                                          1. 3

                                                                                            I take the “impressively small” comment as a compliment both to the D language and to myself! :)