1. 48

  2. 37

    “I am capable of writing a large codebase in a memory unsafe language without introducing enough security vulnerabilities to drive a truck through.”

    1. [Comment removed by author]

      1. 2

        There’s a huge area between memory safety and Python.

        1. [Comment removed by author]

          1. 6

            Once again, memory-safe languages don’t claim to be bug-free and perfectly secure. They just say what they are: memory safe. That solves some issues, not all of them. Put it this way: when you have a large codebase, a memory-safe language lets you focus on things other than memory issues (like preventing other bugs).

            1. 5

              that large code bases in memory-safe languages somehow don’t also end up with a lot of security vulnerabilities

              Inverse error. That’s not the implication. It’s that memory unsafe tools can and do cause security issues.

          2. 1

            I’ve done it.

            We probably have different definitions of large. But, @alex_gaynor did say as an individual…

            And I’ve written code in Python that had a huge command-injection vulnerability. Memory safety isn’t a panacea.

            And? That’s a class of vulnerabilities— shared by both C and Python— that no one was discussing.
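The command-injection class in question looks like this in Python (a sketch; the `filename` value and the use of `echo` are made up for illustration, and it assumes a POSIX shell):

```python
import subprocess

filename = "notes.txt; echo pwned"   # attacker-controlled input

# Vulnerable: shell=True hands the whole string to /bin/sh, so the
# text after ';' runs as a second command.
bad = subprocess.run("echo " + filename, shell=True,
                     capture_output=True, text=True)
print(bad.stdout)    # "notes.txt\npwned\n" -- the injected command ran

# Safer: pass an argument list, so no shell ever parses the input.
good = subprocess.run(["echo", filename],
                      capture_output=True, text=True)
print(good.stdout)   # "notes.txt; echo pwned\n" -- just data
```

Memory safety says nothing about this class either way; passing an argument list instead of a shell string is what removes it.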

            1. [Comment removed by author]

              1. 6

                The point— as I understood it— was that writing in a memory unsafe language makes it easy to make the mistakes you listed. Those mistakes “introduce enough security vulnerabilities to drive a truck through.”

                So do other things!

                But he wasn’t talking about any of those other things. He was talking about memory unsafe languages.

                Use a memory-safe language, remove a class of defects, remove a class of security vulnerabilities.

                Other vulnerabilities can and will still exist!

                Edit:

                “I am capable of making a sandwich with peanut butter without triggering a nut allergy.”

                “I’ve made sandwiches with almond butter that trigger nut allergies. Making sandwiches without peanut butter isn’t a panacea.”

            2. 1

              How did you know you did it?

              1. 1

                Change it from “writing” to “writing and maintaining over a period of several years with a team that has members come and go on a regular basis” and it’s another story.

            3. 7

              “Compiled languages are always faster.”

              This link seems to point at problems in one of the most overly-complicated languages ever designed (C++) rather than at evidence for that bullet point. Plus, the comparisons rarely give the AOT compiler a profile-guided optimizer when comparing against JITs, which are essentially that applied to bytecode. Since the AOT compiler has more time to optimize, with even more information than the JIT, it should always produce faster code in theory. The only exception I can recall is code whose runtime patterns change a lot. Even then, there could be an AOT-style solution combined with a JIT that periodically recompiles the code, or that load-balances in a heterogeneous way, matching workloads to ideal AOT-compiled processes. I don’t think I’ve seen one, though.

              1. 9

                “Compiled languages are always faster”

                But Java has a compiler.

                We’re left trying to interpret what the author meant. Does “compiled languages” mean “AOT compiles to native code” vs. “JIT compiles to native code” as you reasonably assume based on the linked Forbes article? Or is it about interpreters, as I’ve suffered this debate before? Who knows!

                Gas on the fire.

                1. 1

                  That’s true in general, but these are supposed to be memes. So the meme would be an AOT compiler for a mainstream language versus one doing JIT. The author gave C++ vs Java as a very common example. I always knock that down as apples vs oranges, since one side isn’t allowed profiling even though it’s possible for AOT.

                2. 6

                  I’ll add a misconception to the list: “The term ‘compiled language’ is a useful concept”.

                  You don’t have languages that are compiled[1]; you can have a language for which a compiled implementation exists, or a language for which no compiled implementation exists, or a language for which the primary implementation is compiled but not always, etc. And even that is not a very useful category, because the performance implications of a compiler that targets machine code can be very different from one that compiles to bytecode.

                  I think most people mean “a language which is typically AOT-compiled directly to machine code” when they say this, but I can’t be sure because the terminology is so vague.

                  1 - OK, so technically there are languages like Forth where the compiler is part of the language spec, but A) these are very unusual and B) no one who uses the term “compiled language” is actually talking about this.

                  1. 2

                    Of note, it is possible for a language to be uncompilable[0] — c.f. some of the reflective-tower languages of the 1980s (Lisp in Small Pieces has an example of an uncompilable language).

                    [0] Really, it’s not so much that they’re uncompilable as that compilation has no real utility as applied to them.

                  2. 2

                    It also depends on your definition of “faster”. Generally, higher level representations of software are more compact - source is smaller[*] than bytecode which is smaller than machine code. If your performance is constrained by transferring your program representation (for example over a slow network, from slow storage medium) or if your interpreter + bytecode can fit in cache better than a compiled representation then compiled code may be slower.

                    [*] At least once gzipped or converted to a minimal representation.

                    1. 1

                      On that note, the Juice project, which substituted Oberon for Java, had about that effect, I think. They wanted fast compilation and small transfers over dial-up more than an ultra-optimized result. Their innovation was sending apps as compressed ASTs, so the compiler still had more information to work with than it would from bytecodes.

                      I wonder about that last part, though. How often is it a problem that AOT-compiled code is worse at caching than interpreted/JIT’d bytecode?

                    2. 1

                      But it’s interpreters all the way down, and some interpreters are faster than others.

                      1. 1

                        Yeah, the fully-analog ones are still kicking everyone else’s asses. People are just too picky about accurate results. ;)

                    3. 7

                      Just for my own notes really.

                      • Compiled languages are always faster. Compilation always has the opportunity to make things at least no slower, though genuine optimizations require more information and your compiler might not be using it (global optimizations, runtime information).

                      • floating point calculations will introduce non-deterministic errors into numerical results; OK, they introduce some errors into numerical results; alright, I understand that floating point calculations are imprecise, not inaccurate, Mister Pedantic Blog Author, but I cannot know what that imprecision is. Arithmetic operations on floating-point numbers are supposed to model the real numbers, but real numbers are terrifically non-computational. If you haven’t explicitly studied numerical analysis then you are probably missing a huge set of tools for making numerical computations with more than 1–3 sig figs work.

                      • at least the outcome of integer maths is always defined; fine, it’s not defined. But whatever it was, the result of doing arithmetic on two numbers that each fit in a data register itself fits in a data register. There’s lots of interesting behavior here! Again, though, it largely falls from distinction between our theories for things and our models. What is the right behavior for overflow?

                      • the bug isn’t in the hardware; bug-free computer hardware is completely deterministic; the lines on the hardware bus/bridge are always either at the voltage that represents 0 or the voltage that represents 1. Let’s be clear: there are public stories circulating now that our CPUs run Minix! Let’s also be clear that your hardware is treated as a discrete approximation to the physical/chemical processes actually happening in your transistors.

                      • “Complete coverage”. Let’s just stop right here.

                      • “no hacker will target my system; information security is about protecting systems from hackers.” Take a look at your frontline server logs someday.

                      • “any metaprogramming expansion will resolve in reasonable time; any type annotation will resolve in reasonable time.” Metaprogramming is often Turing-complete. Type inference/checking sometimes is as well, and at best it is usually a unification problem (often exponential).

                      • “OK, well at least any regular expression will resolve in reasonable time; can you at least, please, allow that regular expressions are regular?” DFAs are O(n), sure, and regexes can sometimes compile to DFAs. Basically every regex implementation adds features, though, that make patterns no longer representable as, or compilable to, DFAs.
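The floating-point bullet is easy to see directly (Python shown; the behavior is standard IEEE 754 double arithmetic):

```python
import math

# Deterministic, but only an approximation of real arithmetic:
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# Not associative, so summation order changes the result:
print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))   # False

# One of those numerical-analysis tools: correctly-rounded summation.
print(sum([0.1] * 10))        # 0.9999999999999999
print(math.fsum([0.1] * 10))  # 1.0
```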
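And the regex bullet: a backtracking engine can take time exponential in input length on a pattern a true DFA would run in O(n) (Python’s `re` shown; the probe sizes are illustrative):

```python
import re
import time

# Nested quantifiers: the classic catastrophic-backtracking shape.
pattern = re.compile(r"(a+)+b")

def probe(n):
    text = "a" * n + "c"      # near-miss input forces full backtracking
    t0 = time.perf_counter()
    assert pattern.search(text) is None
    return time.perf_counter() - t0

fast = probe(10)
slow = probe(22)
print(fast, slow)   # the second probe is orders of magnitude slower
```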

                      1. 5

                        “There is a silver bullet.”

                        1. 4

                          It’s called a genius using Common LISP in a good repo documenting what they were doing as they went along. ;) Of course, few describing the good side of their work as silver bullets managed to adequately warn folks of all their drawbacks.

                          Management at big companies who had dealt with geniuses before knew how to prevent their rise in the new profession: ensure the stacks were C++, Java, and Microsoft, used in a process-heavy way with lots of proprietary tech whose licensing and debugging costs minimize the budget and time left for geniuses’ great ideas. It’s still working outside of niche companies.

                          1. 4

                            Or as I like to say: “there are no rules, only rules of thumb”. Knowing when to break them is essential.

                            Silver-bullet-ism leads to a lot of stupid designs when people try to solve their latest problem with their completely unsuitable silver bullet of choice.

                          2. 9

                            “Falsehoods programmers believe about X” considered harmful.

                            1. 8

                              Falsehoods programmers believe about the phrase “X considered Harmful.”

                              Falsehoods programmers believe about the phrase “X considered harmful,” considered harmful.

                              Falsehoods programmers believe about the phrase “Falsehoods programmers believe about the phrase ‘X considered harmful,’ considered harmful,” considered harmful.

                              1. 5

                                Recursion considered recursive.

                                1. 3

                                  “Recursion considered recursive” considered falsehood believed by programmers about the phrase “Recursion considered.”

                                  If you torture the English language hard enough, you get poetry.

                                  Haiku are easy.

                                  But sometimes they don’t make sense

                                  Refrigerator.

                              2. 2

                                Why? I find these sorts of posts quite illuminating, as they often skewer conventional wisdom.

                                1. 5

                                  In my opinion it’s usually because they list the falsehood without listing an example of where it breaks down. These lists work better when you can point at a real-world case why X is a falsehood.

                                  1. 2

                                    Bingo!

                                    1. 1

                                      Exactly. Who cares if they skewer conventional wisdom if there is no evidence to prove that conventional wisdom should be skewered?

                                    2. 3

                                      The notion that such lists represent conventional wisdom is perhaps itself a falsehood.

                                  2. [Comment removed by author]

                                    1. 3

                                      I think that programming is like any other skill. Some people are born with a knack for it, some people can achieve proficiency through education, and some people won’t be able to learn it or won’t want to learn it. I don’t think there’s anything magical with programming that makes it different than other skills.

                                    2. 3

                                      There is some innate affinity for computer programming which you must be born with, and cannot be taught

                                      It’s hard to say whether this is true or false, or even what people believe.

                                      On the one hand, people are not all equally good at everything.

                                      On the other, even if they were, becoming good at things takes time; and if that length of time is long enough, then it is hard to say what the practical difference is between not having an innate ability and simply not yet having the skill.

                                      1. 2

                                        That we are “software engineers”, despite not knowing or having a repeatable process or methodology that results in the successful delivery of a large-team, complex project on time, on budget, and to some specification.

                                        1. 2

                                          There are attempts at doing the “engineering” part of “software engineer”. But when talking and/or working with people (Lobsters included), the common conception is that these are only dull documents used by managers to slow down the smart developers or drive them insane.

                                          Ref: https://www.computer.org/web/swebok and many ISOs.

                                          1. 1

                                            That’s usually true with how programming is done. There are those doing engineering of software. I linked to three here:

                                            https://news.ycombinator.com/item?id=15886317

                                            Example I just found for industrial application of formal simulation and verification of a plant’s operation:

                                            http://vigir.missouri.edu/~gdesouza/Research/Conference_CDs/IFAC_ICINCO_2007/ICINCO%202007/Area%203%20-%20Signal%20Processing,%20Systems%20Modeling%20and%20Control/Short%20Papers/C3_629_Seabra.pdf

                                            EDIT: The only thing I can’t tell anyone in engineering with any confidence is time and budget. Software is too non-linear for that if the team is doing novel work. Estimates might be more accurate for work similar to past work.

                                          2. 1

                                             “the bug isn’t in the hardware.” Totally added after Intel’s Skylake and Kaby Lake bugs! (Or maybe he works with embedded systems!)

                                            1. 0

                                              at least the outcome of integer maths is always defined.

                                              Integer arithmetic is well-defined. However, machine integers are not a model of the integers, and what C calls “signed integers” are not a model of anything nice.

                                              if I have complete test coverage then I do not need a type system.

                                              You need to prove your code correct anyway. Test suites merely prevent you from wasting your time trying to prove incorrect code correct.

                                              if I have a type system then I do not need complete test coverage.

                                              You need to prove your code correct anyway. Type systems merely write somewhere between 0.01% and 0.1% of the proof for you.

                                              if the problem is SQL Injection, then the solution is to replace SQL; NoSQL Injection is impossible.

                                              The actual problem is representing queries as strings. Switching to a NoSQL database doesn’t automatically fix that.
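A quick sqlite3 sketch of that point (the table and the hostile input are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

hostile = "' OR '1'='1"   # attacker-controlled "name"

# Vulnerable: the query is a string, so the input is parsed as SQL
# and the WHERE clause matches every row.
leaked = conn.execute(
    "SELECT secret FROM users WHERE name = '" + hostile + "'"
).fetchall()
print(len(leaked))   # 1: the row leaked

# Fixed: a parameterized query keeps data out of the query language.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (hostile,)
).fetchall()
print(len(safe))     # 0: the input is treated as a literal name
```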

                                              … but all of this is basic CS.

                                              EDIT: Fixed link.