1. 10
  1.  

  2. [Comment removed by author]

    1. [Comment removed by author]

      1. 4

        I love Standard ML and use it whenever I get a chance to - that is, at home, where my employer can’t tell me what to use. Standard ML is a beautiful language from a theoretical standpoint, but, barring a black swan event, it will never be practical to use for real-world business applications.

        As for what led to this sorry state of affairs, I can only offer a guess: Standard ML was too principled for its own good. Its formal specification killed the language. Formalizing a programming language exerts pressure towards simplifying it from a mathematical standpoint. Having to prove a type safety theorem (as the people who defined Standard ML in fact did), even more so. This simplifying pressure is in direct opposition to delivering what most programmers actually want: convenient features. OCaml and Haskell have non-academic users because they have convenient features. Standard ML to this day doesn’t even have standardized Unicode support.

        1. 5

          mythryl was a great effort to marry SML and posix; i’m sad it never gained traction

        2. 3

          This was the best resource on practical application last I looked. It’s really old, though.

          http://mlton.org/References.attachments/Shipman02.pdf

          I can’t say anything about the tooling and stuff as I haven’t used it recently enough. Did keep that book in case it was useful, though. However, a quick Google gives a relatively recent Quora thread whose answers are basically the same as the ones I heard when I looked into Standard ML. That’s already a bad sign, eh?

          https://www.quora.com/Why-isnt-Standard-ML-more-popular

      2. 1

        I have argued that out of the infinite number of bugs in the world, memory safety bugs are the important bugs. They aren’t the only serious bugs in the world, but today they are overwhelmingly the ones that are being exploited to achieve complete control over a target (remote code exploitation).

        Embedded in that statement is a link, which leads to a broken page.

        I don’t like the term “bug”; it’s a mistake. An error the programmer made. And yet we don’t talk often enough about ways to change ourselves to make fewer mistakes. Why not?

        That broken link is also an error made by the page-programmer (author). Why did he hand-craft the HTML? Why didn’t he use “safe” editors like Microsoft Word?

        1. 2

          Interesting. Bug is just something catchy supposedly referring to live bugs messing up computers. Still happens to laptops (ants) and PS4s (roaches), just different insects. I agree it might have helped a bit if we changed the term to mistake, defect, or failure. Many in high assurance call them defects or failures. Psychology means some people or teams might reduce the number of “failures” since that sounds like their fault more than bugs.

          The safe editor thing is funny, too. The web site might lack QA. He probably doesn’t use an HTML editor or a tool for safe web apps like the Opa language. Might have never looked into stuff like the latter. Calls himself out even as he calls others out.

          1. 1

            When “bug” is normal, and expected, it’s no longer “my fault” for making a mistake.

            I sometimes wish other programmers took some responsibility for their mistakes.

            1. 2

              Other uses of the term include “I caught a bug” (luck), “there’s a bug on it” (luck again), and “that person is bugging me” (external responsibility). The word definitely seems more external and impersonal than “I didn’t do input validation on that incoming data or ensure the receiving memory had the right size. It was one of the ten things to always do taped to my cubicle/office cuz people keep screwing those up. I screwed up again.”
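              The mistake described above, copying incoming data without validating that it fits the receiving buffer, is the canonical memory-safety bug from earlier in the thread. A minimal C sketch of the unchecked version next to a checked one (function names and sizes are illustrative, not from any real codebase):

              ```c
              #include <assert.h>
              #include <stdio.h>
              #include <string.h>

              #define BUF_SIZE 16

              /* The mistake: copy without checking the incoming length against
                 the destination size. Overflows dst if incoming data is too big. */
              void copy_unchecked(char *dst, const char *incoming) {
                  strcpy(dst, incoming); /* no bounds check -- the classic bug */
              }

              /* The fix: validate the input length before touching memory. */
              int copy_checked(char *dst, size_t dst_size, const char *incoming) {
                  size_t len = strlen(incoming);
                  if (len >= dst_size) {
                      return -1; /* reject input that does not fit */
                  }
                  memcpy(dst, incoming, len + 1); /* +1 copies the terminator */
                  return 0;
              }

              int main(void) {
                  char buf[BUF_SIZE];
                  /* 32-byte payload: copy_unchecked(buf, payload) would smash
                     the stack here; copy_checked rejects it instead. */
                  const char *payload = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA";
                  assert(copy_checked(buf, sizeof buf, payload) == -1);
                  assert(copy_checked(buf, sizeof buf, "ok") == 0);
                  assert(strcmp(buf, "ok") == 0);
                  printf("checked copy ok\n");
                  return 0;
              }
              ```

              The checked version is exactly the “input validation on incoming data” the comment says keeps getting screwed up.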