1. 48

  2. 21

    I wholeheartedly agree with the issue raised in the post. It has even come up several times here: I have suggested, for example, using daemontools, and invariably someone responds that the last commit is years old, without ever actually saying why there should be a new commit.

    I don’t think the suggested solutions will work, though. If marking things DONE became a cultural phenomenon, I think every project would get marked DONE and then the author would decide they needed to add something. Software that is really done reflects a specific mindset about how software should be written, and that mindset doesn’t exist in any significant way today. Maybe a DONE flag would push the culture toward that mindset, but I remain cynical.

    1. 15

      One problem with software declared DONE is that it tends to only actually work if someone is doing maintenance releases, even if they choose not to officially call them that. Though you could conceivably distinguish between the core software, which becomes feature-complete and done, and anti-bitrotting maintenance, which usually has to be ongoing.

      For example Knuth somewhat famously declared TeX done years ago, using a version-numbering scheme that asymptotically approaches Pi. But it’d be difficult to use in practice if TeXLive weren’t making regular releases to ensure that it still works on a modern Unix: builds with a recent compiler, can find system fonts, doesn’t get tripped up over path issues, etc. Though you could argue that Knuth is even wrong about the core being feature-complete: it doesn’t support Unicode, for example, arguably nowadays an expected feature for a text-processing tool.
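
      Knuth’s scheme is concrete: each release appends the next digit of pi, so the version numbers run 3, 3.1, 3.14, 3.141, and so on. A toy sketch of that rule (the digit string and function name here are mine, purely for illustration):

```python
# Toy model of TeX's pi-converging version numbers: each new
# release appends the next digit of pi to the version string.
PI_DIGITS = "3.14159265358979"  # enough digits for illustration

def next_tex_version(current: str) -> str:
    """Given a version like '3.14', return the next one ('3.141')."""
    if not PI_DIGITS.startswith(current):
        raise ValueError(f"{current!r} is not a prefix of pi")
    n = len(current) + 1
    if n <= len(PI_DIGITS) and PI_DIGITS[n - 1] == ".":
        n += 1  # skip over the decimal point after '3'
    if n > len(PI_DIGITS):
        raise ValueError("out of stored digits")
    return PI_DIGITS[:n]
```

      The point of the scheme is that each release is a strictly smaller change than the last, signalling convergence toward done.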

      I’ve used some software considered finished that has literally had no releases since the 1980s, and it is never fun to use: the best case is that you spend a day just getting it to build on a modern system. A much better case is when someone is lightly maintaining the old software. Debian is a source of a fairly large number of old packages that are still usable, because the Debian maintainer has de facto taken over maintenance. For example, Debian is still shipping mawk v1.3.3, released in 1997, but it’s shipping it with 22 patches, added at a rate of 1-2 per year. I’ve noticed that even other distros use Debian as the de facto upstream for a lot of old software that nobody else is maintaining.

      1. 2

        That’s an excellent point. I was mostly talking about library code rather than user-facing tools, but I didn’t make the distinction clear at all, and your point still stands, e.g. any time a standard library or system API has a breaking change.

        I’d really like to see better ways of specifying and providing those dependencies to reduce breaks, but I don’t have any concrete ideas.

      2. 8

        I mean, I think the thing to do is to put a short position statement on these issues, somewhere obvious - the front page of the project site, and the README.md, at the very least. Semantically it may be the same as a DONE flag, but it takes a lot more thought to write, and the audience knows that. A costly signal, as psychology would say. :)

      3. 8

        One nice way to indicate “alive” status is to have a file with the version of everything the last test run passed on, and update it whenever a dependency changes.
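
        A minimal version of that, for a Python project, could be a helper run at the end of a green test run; the file name `TESTED-WITH.txt` and the helper itself are made up for illustration:

```python
# Sketch: after a passing test run, record the exact versions of the
# dependencies the tests passed against (a "known-good" snapshot).
from importlib import metadata

def record_tested_versions(packages, path="TESTED-WITH.txt"):
    """Write one 'name==version' line per dependency to `path`."""
    lines = []
    for name in packages:
        try:
            lines.append(f"{name}=={metadata.version(name)}")
        except metadata.PackageNotFoundError:
            lines.append(f"{name}==(not installed)")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines
```

        Updating the file whenever a dependency changes then shows, at a glance, how recently the project was last known to work.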

        1. 3

          Nice! That’s definitely a productive approach.

          1. 3

            It’s specific to languages, but I’ve seen a few github projects with badges indicating dependency status from https://gemnasium.com/

          2. 6

            The Urbit project tried a variant of this, calling it Kelvin Versioning, which counted down with each change to absolute zero (0K).

            Normally, when normal people release normal software, they count by fractions, and they count up. Thus, they can keep extending and revising their systems incrementally. This is generally considered a good thing. It generally is.

            In some cases, however, specifications need to be permanently frozen. This requirement is generally found in the context of standards. Some standards are extensible or versionable, but some are not. ASCII, for instance, is perma-frozen. So is IPv4 (its relationship to IPv6 is little more than nominal - if they were really the same protocol, they’d have the same ethertype). Moreover, many standards render themselves incompatible in practice through excessive enthusiasm for extensibility. They may not be perma-frozen, but they probably should be.

            The true, Martian way to perma-freeze a system is what I call Kelvin versioning. In Kelvin versioning, releases count down by integer degrees Kelvin. At absolute zero, the system can no longer be changed. At 1K, one more modification is possible. And so on. For instance, Nock is at 9K. It might change, though it probably won’t. Nouns themselves are at 0K - it is impossible to imagine changing anything about those three sentences.
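
            The countdown rule is easy to state in code; a minimal sketch (the function is my own illustration, not anything from Urbit):

```python
# Kelvin versioning as described above: each release's integer
# "temperature" must drop strictly toward 0K, and a component
# that has reached absolute zero can never change again.
def kelvin_release_ok(current: int, proposed: int) -> bool:
    """Return True if `proposed` is a legal release after `current`."""
    if current == 0:
        return False  # frozen forever at absolute zero
    return 0 <= proposed < current
```

            So a component at 9K may move anywhere from 8K down to 0K, after which it is frozen.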

            While Urbit has locked down the public-facing pages detailing their idealistic OS/programming language/architecture/sci-fi project since it “leaked” at urbit.org last year, you can find the pages at the Internet Archive.

            1. 2

              This is a breath of fresh air. A lot of the software I use should already be “done” by now. This should be a goal. Notepad/nano should not need updates.

              I have the same feeling about plenty of textbooks. e.g. If you couldn’t explain calculus properly the first ten times, what makes you think you can on the eleventh?!