1. 16
    1. 6

      This is a great commentary, especially on the long-term costs. I’ve been making websites for over 20 years, and the lack of standards, and of compatibility with the standards we do have, has been an enormous and largely needless cost. Over the fall I implemented a new, barely-used standard and still lost time to differing implementations and to questions that could only be resolved by running a debugger against an implementation.

      A small thing I’d add: except in the rare, security-sensitive cases where detail must be withheld, errors must give the recipient an obvious next step for addressing them. For years, Internet Explorer’s JavaScript engine would tell you which exception had occurred with no further detail (no function, line number, or filename). The bigger a standard is and the more functionality or sub-standards it includes (and JS sits atop a very tall tower of standards), the more value there is in implementations being strict, and also in their errors being explicit and useful.
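
      To make that concrete, here is a small sketch in TypeScript (the `ConfigError` class, `parsePort`, and all field names are my own invention, not from any standard or browser): the point is simply that the error names the input, the location, and the next step.

      ```typescript
      // A bare `throw new Error("Invalid value")` gives the recipient nothing to act on.
      // An actionable error names the input, the location, and what to do next.
      // `ConfigError` and its fields are hypothetical, for illustration only.
      class ConfigError extends Error {
        constructor(
          message: string,
          public readonly file: string,
          public readonly line: number,
          public readonly hint: string,
        ) {
          super(`${file}:${line}: ${message} (${hint})`);
          this.name = "ConfigError";
        }
      }

      function parsePort(raw: string, file: string, line: number): number {
        const port = Number(raw);
        if (!Number.isInteger(port) || port < 1 || port > 65535) {
          throw new ConfigError(
            `"${raw}" is not a valid port`,
            file,
            line,
            "expected an integer between 1 and 65535",
          );
        }
        return port;
      }
      ```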

      The biggest missing recommendation is that standards MUST include programmatic test suites. Those suites SHOULD be expanded and updated more often than the years-long RFC lifecycle allows, in proportion to the size and popularity of the standard. The various Acid tests were really good for CSS: a browser either conformed, or you had a list of missing features you could discuss with a standard vocabulary and shared understanding. English is complex and vague. Computer standards need to be explicit and precise, which means some part of them needs to be programmatic, not prose.
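
      As a rough illustration of “programmatic, not prose” (everything here is hypothetical: the case list, the `mustAccept` field, and `runSuite` are invented for the sketch), a conformance suite can be as simple as a machine-readable table of inputs and required outcomes that any implementation is run against:

      ```typescript
      // A minimal, hypothetical conformance suite: inputs plus the outcome
      // the standard requires, runnable against any implementation.
      type ConformanceCase = { input: string; mustAccept: boolean };

      const cases: ConformanceCase[] = [
        { input: "80", mustAccept: true },
        { input: "65535", mustAccept: true },
        { input: "0", mustAccept: false },
        { input: "eighty", mustAccept: false },
      ];

      // `accepts` wraps whatever implementation is under test.
      function runSuite(accepts: (input: string) => boolean): string[] {
        return cases
          .filter((c) => accepts(c.input) !== c.mustAccept)
          .map((c) => `FAIL: "${c.input}" should ${c.mustAccept ? "" : "not "}be accepted`);
      }
      ```

      Failures come back as a concrete, shareable list of deviations rather than an argument about what a sentence of prose meant.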

    2. 6

      According to this post (*), the common interpretation of Postel’s Law is far too broad. Postel thought it should be

      “[…] in general, only a subset of a protocol is actually used in real life. So, you should be conservative and only generate that subset. However, you should also be liberal and accept everything that the protocol permits, even if it appears that nobody will ever use it.”

      Looking at the wording from RFC 793, I sympathize with the lament:

      “TCP implementations will follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others.”

      Personally, I think it’s irresponsible to take that to mean you can accept any old crap from someone, like malformed headers or “mostly correct” packets. If anyone in a group I was part of used Postel’s Law as justification for a demonstrably incorrect implementation, I’d have some strenuous objections. But that is not the reality we live in, sadly. Postel’s Law seems like a reflection of the mentality of the software world: push out a beta thinking you’ll actually be able to fix it when 1.0 comes out.
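
      To sketch the distinction (the header format and `parseHeaders` are invented for illustration, not taken from any particular protocol): a strict parser rejects a malformed line outright and tells the sender why, instead of guessing what was meant.

      ```typescript
      // Hypothetical header parser that is strict about what it accepts.
      // A "liberal" parser might skip or guess at malformed lines; this one
      // rejects them so the sender gets a clear, fixable error.
      function parseHeaders(block: string): Map<string, string> {
        const headers = new Map<string, string>();
        for (const line of block.split("\r\n")) {
          if (line === "") continue; // blank line separates header sections
          const idx = line.indexOf(":");
          if (idx <= 0) {
            throw new Error(`malformed header line: "${line}" (expected "Name: value")`);
          }
          headers.set(line.slice(0, idx).trim().toLowerCase(), line.slice(idx + 1).trim());
        }
        return headers;
      }
      ```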

      Ultimately, I agree that the common interpretation of Postel’s principle has done harm, but perhaps I won’t blame him for it.

      (*) The forum discussion linked to in the post is no longer available. Here it is on archive.org: https://web.archive.org/web/20061208052317/http://www.webservertalk.com/archive60-2005-9-1209975.html