1. 35

  2. 9

    D has two compilers: the fully free LLVM-based D compiler (LDC) and the official reference compiler dmd, whose backend used to be proprietary, licensed by Symantec for single users only.


    1. 10

      There’s also GDC.

      1. 5
        1. 1

          wow, hadn’t seen that before. nice project!

        2. 1

          Problem with GDC is that it’s a one-man show. Take a look at the commit log and the contributors graph:

          It’s great this person is putting so much effort into it, but I’d be careful before using it for anything serious.

          1. 1


            Fun coincidence: Yesterday I discovered that you had been involved in developing rstat.us and today you are writing me a comment here.

            1. 3

              Yup! Been watching Mastodon with… feels. I hope they can accomplish what I could not.

        3. 7

          Big thanks to Symantec!

          1. 4

            This is pretty cool. Congrats to Andrei and Walter.

            Also, I would like to propose nicknaming it the *D compiler.

            Yes, it would be pronounced ‘dereference’.

            1. 1

              In “Reflections on Trusting Trust”, Ken Thompson shows that even an open-source compiler cannot be trusted completely. So I cannot even imagine a closed-source compiler. Fortunately, as you said, there are other compilers too.

              1. 6

                This is one of my favorite memes to destroy, since the problem has been solved repeatedly. Paul Karger, who invented that attack on MULTICS in the 1970s, also published ways to prevent subversion in systems in general. The first one was done before Thompson even wrote about that one problem. Here’s my summary of strategies FOSS might use for this, which high-assurance security invented in the 1970s with many implementations (even FOSS) over time:


                Karger went on to do secure OSes, VMMs, smartcards, CPUs, etc. The height of that part of the field is CompCert, the KCC compiler, CakeML, COGENT, Verisoft with the VAMP and C0 compiler, the JavaCard verifications, and DeepSpec. Any of these might be used for the job.
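                One concrete countermeasure in this family, not named above but due to David A. Wheeler, is diverse double-compiling (DDC): rebuild the suspect compiler’s source with an independent trusted compiler and check that the regenerated binaries agree. A toy Python simulation of the comparison logic only (compiler “binaries” modeled as strings; all names are made up):

```python
import hashlib

COMPILER_SOURCE = "clean compiler source"

def compile_with(binary, source):
    # Toy model: a compiler "binary" is just a string. A trojaned
    # binary re-injects its payload whenever it compiles the
    # compiler's own source (the trusting-trust trick).
    out = "bin:" + hashlib.sha256(source.encode()).hexdigest()[:12]
    if "TROJAN" in binary and source == COMPILER_SOURCE:
        out += "+TROJAN"
    return out

def ddc_check(suspect, trusted, source):
    # Stage 1: the independent trusted compiler builds the source.
    stage1 = compile_with(trusted, source)
    # Stage 2: that result rebuilds the same source.
    stage2 = compile_with(stage1, source)
    # The suspect rebuilding its own source should match stage 2,
    # assuming deterministic compilation.
    return stage2 == compile_with(suspect, source)

clean = compile_with("bootstrap", COMPILER_SOURCE)
evil = clean + "+TROJAN"
print(ddc_check(clean, "independent-compiler", COMPILER_SOURCE))  # True
print(ddc_check(evil, "independent-compiler", COMPILER_SOURCE))   # False
```

                Real DDC works on actual compiler binaries and needs a second, independently developed compiler; Wheeler demonstrated the technique on real compilers such as tcc.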

                1. 2

                  I don’t understand how you destroyed it, though. The existence of a solution does not imply it’s actually solved in the tools that we use today. Have the maintainers of open source compilers implemented your cited solutions? And if not, is it really a “meme to destroy”? And why haven’t they?

                  1. 4

                    The meme, or recurring claim, shows up constantly in anything even peripherally related to distribution security or trust discussions. That’s despite it being one of the least-likely problems you’ll ever face, and it usually comes with no references to work done in that area. It’s a social phenomenon where someone heard something, it interested/worried them, and they just repeat it in new places. Quite the opposite of security R&D or engineering. The worst part is that the biggest risk, compiler transforms/optimizations ruining security, gets very little mention compared to the one that will hit almost nobody in practice. I think there have been two examples vs. uncountable times compiler complexity combined with source code to ruin reliability or security. Worth destroying for such reasons, although I always do so with references to the better stuff to help the person promoting it get a deeper understanding of the topic. And maybe one will build the next solution. :)

                    Now to the “What” I’m destroying. First, the common explanation attributes the attack to Thompson instead of its inventor, Paul Karger. People doing follow-up research will learn a lot more about security from Karger’s papers and secure systems than from Thompson’s work like C and UNIX. Second, I point out it’s not so much an open problem as something with a pile of solutions available… some in commercial use… all across the spectrum of work vs. risk vs. immediate utility. People learning of these might get started on implementing a solution instead of wasting research effort trying to see if a solution exists just because Thompson didn’t know anything about INFOSEC at the time, or because his paper’s popularity implies for some people that INFOSEC has no solution to this day. Third, the pile of work CompSci and the safety/security-critical industry did on this problem shows that the lack of it in FOSS isn’t technological: it’s social, where it’s simply not reaching them, people who know about these projects don’t promote them enough, or the knowledge is simply not being used even when they know about it. Either excuses are made or reasonable tradeoffs within limited resources keep them from bothering with it.

                    So, I destroy it the second I see it: to give proper attribution to one of INFOSEC’s best, to prevent the illusion that it’s an unsolved problem on the methodology side, and to give ideas to potential implementors reading along. Also, I usually point out that it will be better for them to use or build certifying compilers first before worrying about the “Karger attack that Thompson popularized.” Full verification like CompCert is ideal but hard work. A lighter example would be the FLINT ML compiler, which used a safe language, straightforward modules, and type checks where possible. The compiler was safer and backdoors were easier to spot. Rust’s original compiler used OCaml. Proof-carrying code is another option.

                    Also, on the distribution side, it’s better to just do secure SCM with signed source, distributed over TLS, built with tooling local to the Linux/BSD distro, and checked by mutually-distrusting parties who sign their results. Add reproducible builds on top of that if people want the same hashes. Most people’s use-case already trusts the distro and foreign repos, and assumes anything in transport can be modified (hence signing or TLS). All the focus on binaries by paranoids is kind of funny in light of that. So, secure repos (see Wheeler) plus safer, certifying, or verified compilers = the win. Preferably push-button and quick, so more adoption.
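                    The “mutually-distrusting parties who sign their results” step amounts to a quorum check over build digests. A minimal Python sketch under assumed conventions (builder names and the artifact are made up, and signature verification on each report is elided):

```python
import hashlib

def verify_builds(reports, quorum=2):
    # reports: list of (builder_name, sha256_hex) pairs, each one
    # taken from a (signed) statement by an independent builder.
    counts = {}
    for builder, digest in reports:
        # A set per digest, so one builder cannot vote twice.
        counts.setdefault(digest, set()).add(builder)
    agreed = [d for d, who in counts.items() if len(who) >= quorum]
    if len(agreed) != 1:
        raise ValueError("no unique digest reached quorum: %r" % counts)
    return agreed[0]

# Made-up example: three independent builders all reproduce the
# same artifact bytes, so their digests agree.
artifact = b"example release tarball bytes"
d = hashlib.sha256(artifact).hexdigest()
print(verify_builds([("distro-a", d), ("distro-b", d), ("auditor-c", d)]))
```

                    Reproducible builds are what make the digests comparable in the first place; without them, each builder’s binary would hash differently even from clean source.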




                    1. 2

                      his paper’s popularity implies for some people that INFOSEC has no solution

                      People new to computer security, like me, come across such a paper and trust that the author knew of any existing solutions yet still published it.

                      I think there have been two examples vs. uncountable times compiler complexity combined with source code to ruin reliability or security.

                      Just like what Karger said:

                      2.4 Minimizing Complexity

                      Multics achieved much of its security by structuring of the system and by minimizing complexity.

                      I would appreciate seeing this criterion put on the table more often while software is built.

                  2. 2

                    Thank you for providing the context of how this paper happened. Now I understand better what I was talking about.

                  3. 2


                    someone heard something, it interested/worried them, and they just repeat it in new places

                    I laughed when I read it, because it is true.

                    1. 1

                      It happens. I call it when I see it, but I’m not judging too harshly. I do it myself on occasion. I respect that you took the time to respond, admit it, and try to learn from it.