1. 4

  2. 21

    The title of the article is a heavy “editorialization” (more like a misrepresentation) of the actual contents, a.k.a. bait and switch. The tl;dr quote from the actual text:

    Q: So when will these [generics, errors, etc.] actually land?

    A: When we get it right. [… Maybe, let’s] say two years from now […]

    Not the slightest mention of “coming to Go[lang] in 2019” for those.

    1. 3

      I agree. I clicked on it to see the specifics of how they did generics and modules. Especially generics since it was a contested issue.

      1. 4

        Modules are usable today: https://github.com/golang/go/wiki/Modules

        The additional work is turning them on by default and adding things like the central proxy server.
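
        If you haven’t tried them yet, opting in looks roughly like this (the module path below is just a placeholder):

        ```
        # Inside GOPATH you have to opt in explicitly; outside GOPATH,
        # Go 1.11 switches to module mode on its own when a go.mod is present.
        export GO111MODULE=on

        # Create a go.mod for the project:
        go mod init github.com/example/hello

        # Build as usual; dependencies are resolved and recorded in go.mod and go.sum.
        go build ./...
        ```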

        For generics the proposal is here: https://go.googlesource.com/proposal/+/master/design/go2draft-generics-overview.md

        And apparently there’s a branch you can test it with:

        https://github.com/golang/go/issues/15292#issuecomment-438880159

        1. 2

          Go modules are more “unusable” than “usable” in their current state. It looks to me like they didn’t solve any of the real problems with Go dependencies. They still don’t vendor locally to the project by default, there is little or no CLI support for basic dependency management tasks (like adding new deps, removing old ones, or updating existing ones), and there’s no support for environment-specific dependencies.

          At this point they just need to scrap it completely and rewrite it, using Cargo and Yarn as examples of how to solve this problem. It’s frustrating that this is a solved problem for many languages, but not for Go.

          On the plus side, I think it speaks to the strength of the language that it has been so prolific despite such poor dependency management tooling.

          1. 3

            Completely disagree. I’ve converted several projects from dep and found modules a real pleasure. It’s fast and well thought out.

            Vendoring has major problems for large projects. It takes me 30 seconds to run any command in Docker on macOS because there are so many files in the repo I’m working on. Sure, that’s a Docker problem, but it’s a pain regardless.

            With modules you can get offline proxy-server capabilities without needing to vendor the dependencies in the repo itself, and if you really want vendoring, it’s still available. They are also actively working on a centralized, signed cache of dependencies.
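
            Roughly, the proxy support already hangs off an environment variable (the URL below is just a placeholder):

            ```
            # Tell the go command to fetch modules through a proxy instead of
            # hitting VCS hosts directly; the proxy can live inside your own network.
            export GOPROXY=https://goproxy.internal.example.com
            ```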

            Also, the go mod and go get commands can add and update dependencies. You can also change import paths this way. It’s under-documented but available (do go get some/path@someversion).
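
            For example (the path and versions are placeholders):

            ```
            # Add or pin a dependency at a specific version:
            go get github.com/pkg/errors@v0.8.1

            # Update an existing dependency to its latest version:
            go get -u github.com/pkg/errors

            # Drop requirements that are no longer imported (and add any missing ones):
            go mod tidy
            ```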

            Not sure about env-specific dependencies… That’s not a situation I’ve heard of before.

            There are a lot of things Go got right about dependencies: a compiler with module capabilities built into the language, no centralized server, import paths as URLs so they were discoverable, and code as documentation, which made godoc.org possible.

            And FWIW this isn’t a “solved” problem. Every other solution has problems too. Software dependency management is a hard problem, and any solution has trade-offs.

            1. 1

              I’m glad it’s working well for you! This gives me hope. I’m basing my feedback on the last time I messed around with Go modules, which was a few months ago, so it sounds like things have improved. Nevertheless, I think it’s still a long way off from what it should be.

              By environment-specific dependencies, I’m referring to things like test and development libraries that aren’t needed in production.