1. 17
    Deno Deploy Beta 1 javascript release web deno.com

  2. 10

    I hate how Go’s and Deno’s approach to dependencies, just pasting the URL to the web front-end of the git hosting service used by the library, seems to be taking off. I think it’s extremely useful to maintain a distinction between the logical identifier for a library and the physical host you talk to over the network to download it.

    1. 4

      I like the idea of using URL fragments for importing. There’s a beautiful simplicity and universality to it. You don’t need a separate distributed package system—any remote VCS or file system protocol can work. However, it needs to be combined with import maps, so that you can hoist the location and version info out of the code, when desired. And there should be support/tools for explicitly downloading dependencies to a local cache, and for enforcing offline running. This is the approach I plan to take for Dawn.
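
      A minimal sketch of how those pieces could fit together (file names, the module URL, and the version are illustrative, and the flags are from Deno 1.x): the import map hoists location and version out of the code, deno cache downloads dependencies explicitly, and --cached-only enforces offline running.

          import_map.json:
          {
            "imports": {
              "oak/": "https://deno.land/x/oak@v7.5.0/"
            }
          }

          app.ts (source refers only to the logical name):
          import { Application } from "oak/mod.ts";

          # fetch dependencies into the local cache, then run without touching the network
          deno cache --import-map=import_map.json app.ts
          deno run --import-map=import_map.json --cached-only app.ts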

      1. 2

        This strikes me as problematic as well. LibMan in .NET is the same way. npm audit may be flawed, but npm itself at least provides a mechanism for evaluating common dependency chains for vulnerabilities.

        Ryan Dahl and Kit Kelly drew the opposite conclusion in their work on Deno. They believe that a central authority for package identity creates a false sense of security and that washing their hands of package identity altogether is the solution. Deno does at least have a registry of sorts for third-party modules, but installation is still URL-based.

        1. 1

          Think of it like this. The URI is just a slightly longer-than-usual package name. As a handy side-effect, you can also fetch the actual code from it. There’s nothing stopping you from having your build tools fetch the same package from a different host (say, an internal package cache) using that URI as the lookup key.

          The big benefit is that instead of having to rely on a single source of truth like the npm repository, the dependency ecosystem is distributed by default. Instead of needing to jump through hoops to set up a private package repository it’s just… the git server you already have. Easy.
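
          As a sketch of that lookup-key idea (the internal host name here is made up): an import map can remap the canonical URL prefix to an internal cache, so every source file keeps the original URI while fetches actually go elsewhere.

              import_map.json:
              {
                "imports": {
                  "https://deno.land/x/oak@v7.5.0/": "https://packages.internal.example/oak@v7.5.0/"
                }
              }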

          1. 4

            The problem is that it’s precisely not just a slightly longer-than-usual package name. It’s a package name which refers to working web infrastructure. If you ever decide to move your code to another git host, every single source file has to be updated.

            I have nothing against the idea of using VCS for distribution (or, well, I do have concerns there but it’s not the main point). But there has to be a mapping from logical package name to physical package location. I want my source code to refer to a logical package, and then some file (package.toml?) to map the logical names to git URIs or whatever.

            I don’t want to have to change every single source file in a project to use a drop-in replacement library (as happened with the satori/go.uuid -> gofrs/uuid thing in the Go world), or to use a local clone of the library, or to move the library to another git host.
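
            Something like this hypothetical mapping file, say (package.toml is not a real Deno or Go artifact; the syntax is invented to show the idea):

                # package.toml - maps logical names to physical locations
                [dependencies]
                uuid = { git = "https://github.com/gofrs/uuid", tag = "v4.0.0" }

                # Source files keep importing the logical name "uuid".
                # Switching to a fork, a local clone, or another git host
                # means editing this one file, not every import statement.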

            1. 1

              > It’s a package name which refers to working web infrastructure.

              But that’s true of more classical packaging systems too, like Cargo. If crates.io goes down, all dependency specifications become a pumpkin.

              It seems to me that Deno’s scheme allows roughly the same semantics as Cargo. You don’t have to point URLs directly at repos; you can point them at some kind of immutable registry. If you want to, I think you can restrict, transitively, all the deps to go only via such a registry. So, Deno allows, but does not prescribe, a specific registry.
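
              Deno’s lock file is one existing sketch of that discipline (flags as of Deno 1.x; deps.ts and main.ts are just conventional names): it pins a hash for every URL in the module graph, transitively, so a silently changed upstream fails loudly instead of being picked up.

                  # record hashes of everything the module graph fetches
                  deno cache --lock=lock.json --lock-write deps.ts

                  # later, or in CI: refuse to run if any fetched file changed
                  deno run --lock=lock.json --cached-only main.ts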

              To me, it seems not a technical question of what is possible, but rather a social question of what such a distributed ecosystem would look like in practice.

              1. 1

                If you want to complain that Rust is too dependent on crates.io, then I agree, of course. But nothing about a Rust package name indicates anything about crates.io; you’re not importing URLs, you’re importing modules. Those modules can be on your own filesystem, or you can let Cargo download them from crates.io for you.
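
                For instance (version illustrative; both forms are standard Cargo): the same logical name can resolve to crates.io or to a local path, and the import in your source (use uuid::Uuid;) stays the same either way.

                    # Cargo.toml
                    [dependencies]
                    uuid = "0.8"                         # resolved via crates.io by default
                    # uuid = { path = "../vendor/uuid" } # or a local copy, same logical name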

                If your import statement is the URL to “some kind of immutable registry”, then your source code still contains URLs to working web infrastructure. It literally doesn’t fix anything.

              2. 0

                Well, Go has hard-coded mappings for common code hosting services, but as a package author you can map a logical module name to a repository location using HTML meta tags. Regarding forks, you can’t keep the same logical name without potentially breaking backwards compatibility.
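
                For reference, the tag looks like this (domain and repository are placeholders); go get example.com/mylib fetches the page, reads the tag, and clones the real repository:

                    <meta name="go-import"
                          content="example.com/mylib git https://github.com/someuser/mylib">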

                1. 3

                  The HTML meta tag solution is so ridiculous. It doesn’t actually fix the issue. There’s a hard dependency in the source code on actual working web infrastructure, be it a web front-end for your actual repo or an HTML page with a redirect. It solves absolutely none of the issues I have with Go’s module system.

          2. 7

            Seems like a product release page.

            Still super neat, but this makes me increasingly believe that Deno is some kind of low-key startup play.

            1. 13

              They hinted at that in the original Deno company announcement post. https://deno.com/blog/the-deno-company

              I personally think it’s good that OSS initiatives try to experiment with different sustainability models, even if there are plenty of ways it can go wrong. To me it’s a wake-up call for the tech world: we must start getting creative to make sure our projects are sustainable, because right now the predominant model of caring about OSS seems to be to stay asleep until a company buys the project, and it’s only when it’s too late that people wake up and rage-fork it (see Audacity).

              1. 1

                Isn’t that exactly what this venture is optimized for? Minimizing the time until it gets sold, while maximizing the money they get for it?

                I would honestly be surprised if users don’t get beautiful-journey-ed in short order.

                1. 2

                  Maybe, but looking at what’s happening with OSS lately, trying to stay virtuous without a real business plan ends up in the same place anyway. I think being deliberate has a chance of making things work out better in the end. That said, only time will tell for Deno.

              2. 5

                To me it’s great news that Deno finally has a product.

                I kept wondering why they were building a niche version of Node.js, but it turns out they have been building their own Cloudflare Workers. That makes a lot more sense now.

              3. 1

                This feels a lot like Erlang/OTP. I’d love to read more about how this thing is implemented.

                1. 1

                  It’ll be interesting to see where this goes. I wonder if a database is coming?

                  Strange that they don’t mention the public suffix list in the announcement - but at least they’re on it:

                  https://publicsuffix.org/list/public_suffix_list.dat

                  1. 2

                    Their docs have examples of using three existing databases, so I doubt they’d make up a new one…

                    1. 1

                      I was wondering more about actual databases, rather than REST/web services that persist data - but I guess it makes perfect sense for Deno to use/showcase services that work similarly from browsers, rather than try to integrate directly with something like PostgreSQL.