  2. 19

    I’m disappointed about the use of the term “unsound”. I expected a demonstration of some elaborate imports that end up picking the wrong version or a similar logic error.

    The argument the author makes is just a general complaint about the difficulty of adhering to Go’s philosophy.

    1. 1

      I’m disappointed about the use of the term “unsound”. I expected a demonstration of some elaborate imports that end up picking the wrong version or a similar logic error.

      I agree on this, but I have learned to adjust my expectations based on the presence or absence of the go tag.

      It may be simplistic, but it’s surprisingly accurate.

    2. 6

      This appears to be an artifact of the software ecosystem within Google. At Google, package consumers expect their dependencies to be automatically updated with e.g. security fixes, or updated to work with new e.g. infrastructural requirements, without their active intervention. Stability is so highly valued, in fact, that package authors are expected to proactively audit and PR their consumers’ code to prevent any observable change in behavior. As I understand it, even issues caused by a dependency upgrade are considered the fault of the dependency, for inadequate risk analysis, rather than the fault of the consumer, for inadequate testing before upgrading to a new version. This, predictably, motivates a culture of automatic upgrades: even if extremely infrequent, a major version bump might also be expected to come with a tool that automatically fixes users’ code.

      No, it’s actually something Google did that follows what used to be very common practice, until some time around when writing everything in JavaScript somehow became a good idea and everyone decided that getting a headache whenever you want to update a package was the right thing to do.

      I’m not saying everyone used to do this, but the libraries which didn’t usually didn’t end up getting adopted very much.

      Claiming that putting the responsibility for not breaking things on the writer of an API is something internal to Google is the most laughable statement I’ve read on this website.

      For widely-imported modules with large API surface areas

      So the complaint is also that this forces people to minimise their API surface area. God forbid we ask programmers who want to design and implement fundamental components of software systems to think about the design of said components.

      But for software that’s still exploring its domain, or modeling something that has a naturally high rate of change, being able to make relatively frequent breaking changes could be essential to keeping the project healthy.

      Then stick to version 0. Yes, it’s a big “I am an under-developed project” sticker, but if you’re an under-developed project you should stop trying to hide it.

      projects shouldn’t be prevented from using semver to express their version semantics because ecosystem tooling has substituted stricter definitions

      What does this mean? Before Go’s SIV, as I understand it, there was no good way to force people to follow semver in Go, and it was a mess which resulted in major problems. SIV is not ideal, but it’s the way Go decided to allow multiple major versions to co-exist, and it doesn’t really make things “more” strict than semver; it makes things exactly as strict as semver. This whole article seems to be complaining about the need to keep the major version number, for versions greater than 1, in the package name, and how this is confusing. I agree, but there’s really no more to it than that.
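      A minimal sketch of the rule in question, assuming a made-up module path: under Semantic Import Versioning, major versions v2 and up appear as a suffix on the import path, while v0 and v1 do not.

      ```go
      package main

      import "fmt"

      // sivImportPath returns the import path for a module at a given major
      // version under Semantic Import Versioning: v0 and v1 use the bare
      // module path, while v2 and later append a /vN suffix.
      func sivImportPath(modulePath string, major int) string {
          if major <= 1 {
              return modulePath
          }
          return fmt.Sprintf("%s/v%d", modulePath, major)
      }

      func main() {
          // "example.com/somelibrary" is a hypothetical module path.
          fmt.Println(sivImportPath("example.com/somelibrary", 1)) // example.com/somelibrary
          fmt.Println(sivImportPath("example.com/somelibrary", 2)) // example.com/somelibrary/v2
      }
      ```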

      it’s infeasible for me to maintain a major version into perpetuity

      I don’t think anything requires you to do that: you simply stop putting in new features and only backport fixes for a duration roughly proportional to the size of your userbase. Document the upgrade path (oh no, having to write documentation, what is this effort!?) and tell your users that if they want to keep using the old version after you stop backporting fixes to it, they’re on their own.

      Nobody criticises the Linux project for not maintaining LTS versions forever, or for circumstances where people running old and unmaintained versions of Linux get bitten by security bugs. It’s your responsibility, as someone who wants to create a common component for software systems, to ensure your component doesn’t break everything with every update; but it’s the responsibility of the user to keep up to date.

      (Please also note I’m not saying that Linux follows semver.)

      But API compatibility isn’t and can’t be precisely defined, nor can it even be discovered, in the P=NP sense.

      Really? I don’t think so. You can verify API compatibility, if you want to go that far, with actual formal verification (which limits you in some ways), but it’s not that difficult to comprehend a breaking change, really. It may not be easy to do with automated tooling, but you can do it with a bit of mental effort and some peer review.
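      As a toy illustration of the tooling caveat (API names here are invented): even a crude check over lists of exported identifiers catches one class of breaking change, while behavioral compatibility stays out of reach of any such tool.

      ```go
      package main

      import "fmt"

      // removedExports reports exported identifiers present in the old API
      // surface but missing from the new one -- one crude, mechanically
      // checkable notion of a breaking change. Behavioral changes are not
      // captured at all.
      func removedExports(old, new []string) []string {
          have := make(map[string]bool, len(new))
          for _, name := range new {
              have[name] = true
          }
          var removed []string
          for _, name := range old {
              if !have[name] {
                  removed = append(removed, name)
              }
          }
          return removed
      }

      func main() {
          oldAPI := []string{"Open", "Close", "Read"}
          newAPI := []string{"Open", "Close", "ReadAll"} // Read was renamed
          fmt.Println(removedExports(oldAPI, newAPI))    // [Read]
      }
      ```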

      Finally, this bias simply doesn’t reflect the reality of software development in the large.

      I agree it’s really unfortunate that software development in the large has taken a turn towards complete and utter laziness and a rejection of any responsibility, but I don’t agree that it should stay that way.

      The notion that substantial version upgrades should be trivial or even automated by tooling is unheard of.

      Okay, I think my distribution package manager disagrees with you.

      Modules and SIV represent a normative argument: that, at least to some degree, we’re all doing it wrong, and that we should change our behavior.

      Yes, I think that’s the point here.

      I don’t really use go or want to use go or like go that much but when I read about SIV earlier this year I really thought: “Damn, finally some good news in the new-language-development sphere.”

      So of course it shouldn’t surprise me that there’s pushback to any good news.

      I understand that software development gives one a mindset that tools should make every aspect of life easier, but there also seems to be a bit of a mindset that if you can’t easily produce a tool to solve a problem, you should forget about it. The current state of how versions are handled by companies (version pinning and an assumption that any version change can break things) is an example of pretending a problem doesn’t exist. Version pinning has a major impact on security; the way that developers are encouraged to entangle their dependencies in a web of pinned versions scares me. I like the idea that the latest compatible version can be installed automatically and affect all packages which depend on it.

      I also don’t mind the idea of asking people to think about how to design their libraries and APIs. Yes, it’s hard to design a good API, but that doesn’t mean we should ignore the problem.

      1. 1

        Claiming that putting the responsibility of not breaking things on the writer of an API is something internal to google is the most laughable statement I’ve read on this website.

        How much of an extreme are you willing to go to? Compatibility is a good thing, yes. API authors should bear a lot of the responsibility for it, yes. But also Google’s alleged practices are way out on the extreme end and likely go too far, and you don’t seem willing to acknowledge that there is such a thing as responsible consumption of APIs. I’ve had people literally try to argue against patching exploitable security holes in the name of “not breaking compatibility”, for example.

        I don’t really use go or want to use go or like go that much but when I read about SIV earlier this year I really thought: “Damn, finally some good news in the new-language-development sphere.”

        When I read about what Go is doing, I immediately made a joke about how Go packages are going to end up being named things like somelibrary/v2_edited_3_final_no_not_that_one_6_REALLY_FINAL_5_USE_THIS_ONE_ACTUALLY_FINAL_2. To each their own, I guess.

        1. 2

          I think I made it quite clear how “extreme” I want to go, and I think you’re over-estimating the impact of what Google is asking of people. Nobody is asking you to not fix security holes, and nobody is asking you to maintain 5-year-old deprecated APIs, but what we currently have is the extreme opposite of that. Sometimes you can’t expect things to work for more than a week before breaking, if you expect to keep up to date with updates.

          somelibrary/v2_edited_3_final_no_not_that_one_6_REALLY_FINAL_5_USE_THIS_ONE_ACTUALLY_FINAL_2

          If your package name would result in somelibrary/v2_edited_3_final_no_not_that_one_6_REALLY_FINAL_5_USE_THIS_ONE_ACTUALLY_FINAL_2, maybe the problem isn’t Go requiring API stability but rather the person writing that package not having spent enough time prototyping and designing it. I can see how people coming from the modern JavaScript-oriented world would think that this would happen, but it didn’t use to happen, and doesn’t need to happen with a bit of forethought and effort.

          1. 2

            nobody is asking you to maintain 5 year old deprecated APIs

            The very clear expectation encoded in modules’ design, and evangelized by its authors, is that you will indeed continue to maintain all major versions of your artifact into perpetuity, by default.

            maybe the problem isn’t go requiring API stability but rather the person writing that package not having spent enough time prototyping and designing it

            The correct amount of time to prototype and define an API is different from package to package. Just like what “stability” means. It’s simply not true that rapidly iterating major versions represent, in all cases, a problem, or something to avoid.

            [it] doesn’t need to happen with a bit of forethought and effort.

            I mean, there are classes of software where it literally does. Right? If you’re iterating on an API in order to dial in the right ergonomics, and need user feedback to help you judge. Or if you’re writing software that interfaces with hardware that’s getting a new firmware revision every 2 weeks. This class of software exists, and its authors and consumers should get to participate in the ecosystem, and use semantic versioning to express their versions, just like anyone else.

            1. 1

              The very clear expectation encoded in modules’ design, and evangelized by its authors, is that you will indeed continue to maintain all major versions of your artifact into perpetuity, by default.

              I don’t really think there’s any expectation encoded in the design and I don’t know what the authors say but I don’t think there’s anything saying that you MUST do this or forcing you to do this in any way.

              You can just not maintain all major versions into perpetuity and I don’t think the module police will come (or maybe they will, you tell me, I doubt it though).

              The correct amount of time to prototype and define an API is different from package to package. Just like what “stability” means. It’s simply not true that rapidly iterating major versions represent, in all cases, a problem, or something to avoid.

              It doesn’t represent a problem, it just makes the major version useless. The whole point of version 0.x is that you can keep incrementing x as far as you want. If you want to prototype for a week you can prototype for a week, if you want to prototype forever you can prototype forever, if people want to pretend like they’re above stable APIs then there’s nothing stopping them from using 0.x. It just helps me filter out libraries I don’t want to use.

              I mean, there are classes of software where it literally does. Right? If you’re iterating on an API in order to dial in the right ergonomics, and need user feedback to help you judge. Or if you’re writing software that interfaces with hardware that’s getting a new firmware revision every 2 weeks. This class of software exists, and its authors and consumers should get to participate in the ecosystem, and use semantic versioning to express their versions, just like anyone else.

              Once again, I don’t see how version 0 doesn’t solve this problem already. Also I will point out that the use-cases you described have nothing to do with publicly available package repositories and package managers which interface with these repositories. If you have a company developing a hardware product and a driver you can use whatever versioning scheme you want and I don’t think anyone will stop you. The whole point of SIV seems to try to prod people developing open source libraries in the public space into following some good practices.

              Also, I will point out that “use semantic versioning to express their versions” is something you can do with SIV, and something SIV doesn’t stop you from doing; it even makes it easier. Semantic versioning is a specific way of defining what the major, minor and patch numbers in a version number represent. If you really mean that people may want to release 1000 major versions for some reason, then they should really just use 1000 different package names. After all, if you insist on breaking the API every week, someone is going to have to go update their code anyway.
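              For reference, a semver version string encodes exactly those three components; a minimal sketch (it ignores the prerelease and build-metadata suffixes that full semver also permits):

              ```go
              package main

              import (
                  "fmt"
                  "strconv"
                  "strings"
              )

              // parseSemver splits a "MAJOR.MINOR.PATCH" string (with an
              // optional leading "v") into its three numeric components.
              func parseSemver(v string) (major, minor, patch int, err error) {
                  parts := strings.SplitN(strings.TrimPrefix(v, "v"), ".", 3)
                  if len(parts) != 3 {
                      return 0, 0, 0, fmt.Errorf("not MAJOR.MINOR.PATCH: %q", v)
                  }
                  nums := make([]int, 3)
                  for i, p := range parts {
                      if nums[i], err = strconv.Atoi(p); err != nil {
                          return 0, 0, 0, err
                      }
                  }
                  return nums[0], nums[1], nums[2], nil
              }

              func main() {
                  major, minor, patch, _ := parseSemver("v2.4.1")
                  fmt.Println(major, minor, patch) // 2 4 1
              }
              ```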

              1. 1

                I don’t really think there’s any expectation encoded in the design and I don’t know what the authors say but I don’t think there’s anything saying that you MUST do this or forcing you to do this in any way.

                You can just not maintain all major versions into perpetuity and I don’t think the module police will come (or maybe they will, you tell me, I doubt it though).

                Well, obviously modules doesn’t (and can’t) enforce a maintenance policy, but you can dig through the authors’ statements in related GitHub issues and blog posts, and their perspective emerges pretty unambiguously. For them, a major version is, by default, a contract with users that doesn’t and shouldn’t have an expiry date. They even seem to suggest (in this issue and other places) that major versions should be considered by consumers to be essentially equivalent to each other — that there’s no reason to prefer a larger major version over a smaller one.

                It doesn’t represent a problem, it just makes the major version useless.

                A major version in semver encodes the intent of API compatibility and nothing more. It doesn’t encode actual API compatibility; that’s unknowable and unenforceable. It doesn’t (and can’t) define what stability means; that’s a decision that every project gets to make for itself. And there is no rate of change of major versions which is generally appropriate or inappropriate; the costs and benefits of a major version bump are different for every project, too.

                You describe how you look at a software ecosystem as a consumer, how you evaluate the viability of packages, and how you value stability, and that’s totally fine. But your context, and your evaluation criteria, aren’t universal. Not every author needs to oblige your needs, and not every consumer shares them. And, as I write in the post, mandatory tooling for an ecosystem can’t evangelize particular evaluation criteria without artificially constraining participation.

        2. 1

          The notion that substantial version upgrades should be trivial or even automated by tooling is unheard of.

          Okay, I think my distribution package manager disagrees with you.

          This is automated for the end-user, but not for the library consumer. It requires the distro maintainers to do a lot of work to integrate the upstream versions, whether it be finding the right versions or actively patching the code. It’s very much the situation the author says is reality, and that the Google/Go model doesn’t acknowledge.

          1. 1

            It’s not the situation the author says is reality. If an API developer breaks the guarantees they have promised with regards to versioning, the distro maintainers usually contact said developer to inform them of the problem and ask them to fix it. This can be seen with the recent bash fiasco, where a distro maintainer directly reached out to the bash project to inform them that they’d made a breaking change. Yes, a distro may end up having to patch some things as workarounds, but distro maintainers don’t take kindly to having to patch codebases in major ways every release. That’s why packages which constantly break ABI just don’t get maintained, get dropped from the distro, and as a result software doesn’t end up depending on them, because nobody wants to package the library. It’s actually a bit of a self-regulating system, one which doesn’t exist in the JavaScript-centric, X-as-a-service, web-dev world, because in those systems nobody has to package your application for a distro since a centralised authority hosts the application on the internet.

            If the situation which exists in JavaScript and similar ecosystems existed for open source software developed for Linux, then distros would have become flatpak (or your choice of all-in-one-package-thing here) bootstrap environments many years ago, and distro repositories wouldn’t exist.

            Yes, updating packages is not a completely hands-off thing for distro maintainers, but it’s minimal effort compared to a world where minor-version updates are expected to break things. And yes, if you’re not using a distro and are handling dependencies yourself (which languages like Go insist on having you do for some reason, although it seems popular with pretty much any modern language), then you’re going to have to take on the role of maintainer and spend a little bit of time testing to ensure that you don’t break everything by updating. But you shouldn’t have to expect everything to break; it should not be the default.

        3. 2

          How frequently does the need for this feature (different versions of the same module) arise, in practice? I understand it is relatively common in ecosystems like Google’s… But I personally have never experienced this need, and an informal poll of my peers also doesn’t suggest it’s anything near common.

          I disagree with most of the article, and it boils down to this. The author hates SIV because he (and his peers) doesn’t need the “different versions of the same module” feature. Google wants SIV because it needs that feature. This is a classic Blub paradox.

          As for the question of how common it is, there is path dependency here. It is rare because it is not (or not well) supported! It is well supported in Rust, and looking at Rust’s experience, it becomes common when it is well supported. It is a feature worth having.

          1. 0

            Completely agree with you. This is a needed feature, regardless of whether he needs it. Not everyone can or should use the latest version of everything.

            SIV allows people to find a collection of packages, of arbitrary versions, that work. You can then say “hey, these all work together, don’t change any version of anything unless I say”.

            That’s pretty powerful. It allows you to keep software that just works, without having to be on the upgrade treadmill. You can upgrade at your pace.
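            As a concrete sketch (module paths and versions here are made up), that pinned, known-good collection is exactly what a go.mod file records:

            ```
            module example.com/myapp

            go 1.16

            require (
                example.com/otherlib v1.7.0
                example.com/somelibrary/v2 v2.3.1
            )
            ```

            Nothing in this set changes until you edit it (or run the go tooling to upgrade), which is the “upgrade at your pace” property described above.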

            1. 4

              Strictly speaking, “upgrade at your pace” is a feature of lockfiles, not multiversioning. Multiversioning just enables partial upgrades.

              1. 3

                Yeah, but why does Go make it so complicated?

                npm and Cargo have the same multi-versioning feature, but they make it work based on package versions alone. It’s not mixed up with URLs and identifiers, with a special-cased v2. You just get the version you asked for, and your deps get the version they asked for. Go makes such a big deal out of it.

                1. 1

                  “Why” is kind of easy to understand. Go works this way because it makes the implementation simpler and easier. Of course it is a tradeoff, and it makes usage harder and/or confusing, but that’s consistent with Go’s philosophy: Go’s whole philosophy is choosing implementation simplicity over user simplicity.

            2. 1

              I had some similar concerns in the beginning when go modules were finalized, but seeing how the community has adopted it has eased my fears. It has become the norm for popular libraries to have a vN suffix in the import path, and suddenly it looks much less ugly.

              Some examples from a quick sampling:

              • github.com/dgraph-io/badger/v2
              • github.com/cpuguy83/go-md2man/v2
              • github.com/gizak/termui/v3
              • github.com/gdamore/tcell/v2

              In the end, it ends up looking very similar to the old workaround we had using gopkg.in, like gopkg.in/yaml.v3 or gopkg.in/check.v1.

              Once I started seeing major versions as part of the import path all over the place, it stopped feeling scary and different. I do believe that elevating the major version creates more appropriate expectations about the implications of breaking changes. It also scales appropriately with adoption: if you have a project nobody is using, then you can do naughty things like force-push tag overrides and minor-version breaking changes, and it’s probably fine. If people actually depend on your project, then it’s a good thing that a major version bump is a big deal.

              I believe a lot of the initial concern stemmed from social constructs we weren’t used to rather than mechanical issues with the approach.