1. 48
  1. 59

    tl;dw

    • Removing Promises after they were initially added to Node. They should have been left in.

    • There’s virtually no security. V8 has a very good sandbox, which could have been maintained for server-side applications.

    • The build system (GYP): Chrome used to use it but has since moved on, leaving Node as the sole user:

      “it’s a very funky interface […] it’s very terrible”

      “it’s an awful experience for users”

      “the continued use of GYP is probably the largest failure of node core”

    • package.json led to NPM being sanctioned, and to a centralized repo for JS packages (Ryan doesn’t explain why this is a regret).

      “It gives rise to this concept of a ‘module’ as a directory of files, which wasn’t a concept before, when we just had Javascript files […] it’s not a strictly necessary abstraction”

      “package.json has all this unnecessary noise in it”

    • node_modules:

      “it massively complicates the module resolution algorithm”

      “deviates greatly from browser semantics”

      “In practice a $NODE_PATH, like $PYTHON_PATH, would have worked, you can do vendoring that way too.”

    • “require” without the extension (e.g. “.js”); see the resolution sketch after this list

      “Why? It just makes things more complicated”

    • “index.js”

      “I thought it was cute, because there was index.html”;

      “it needlessly complicates the module loading system”;

      “One thing I’ve learned as I’m aging is that whenever you’re designing a program there’s things that you think might be cute to add in. You always regret those”.
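
    For the last two points, here is a rough sketch of the candidate list an extension-less require forces Node to walk (simplified and illustrative only; the real algorithm also consults a directory’s package.json “main” field):

    ```js
    // resolve_sketch.js - illustrative only, not the full documented algorithm.
    const path = require('path');

    function candidatesFor(request, fromDir) {
      const base = path.resolve(fromDir, request);
      return [
        base,                                         // exact file
        `${base}.js`, `${base}.json`, `${base}.node`, // implied extensions
        path.join(base, 'index.js'),                  // "index.js" lets a directory act as a module
        path.join(base, 'index.json'),
        path.join(base, 'index.node'),
      ];
    }

    // require('./foo') has to consider all of these; require('./foo.js') would not.
    console.log(candidatesFor('./foo', __dirname));
    ```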

    1. 16

      Thanks for typing these up. I saw some of the slides and some hot takes fly past on Twitter at the time Ryan was giving the talk, too.

      I definitely get the impression that Ryan holds a particularly fundamentalist view of what he perceives as unnecessary software complexity, when I think in reality one of the rough parts of Node was that, for a long time (and on some level still today), there was insufficient complexity to cover the problem domain.

      For instance, the EventEmitter class is extremely simple in its implementation, and as a result it can be very difficult to produce correct and robust software with it. The abstraction provides no mechanism to enumerate the set of events a producer might emit, no way to enforce rules about the ordering of events (e.g., end and close), and no way to enforce valid sequences of events (e.g., a producer should emit at most one of end or error, never both). It is also difficult to reason about what happens when you attach more than one listener for the same event. Consequently, there have been many bugs in core and add-on modules alike, both in event producers and consumers.
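
      A minimal sketch of that hazard (the producer and event names here are made up for illustration):

      ```js
      const { EventEmitter } = require('events');

      // A toy "stream-like" producer. EventEmitter itself neither documents which
      // events can be emitted nor enforces any ordering or exclusivity rules.
      const producer = new EventEmitter();

      producer.on('end',   () => console.log('consumer A: finished cleanly'));
      producer.on('error', () => console.log('consumer A: aborted'));
      producer.on('end',   () => console.log('consumer B: also ran')); // second listener, attached silently

      // A buggy producer can do this, and every consumer has to cope:
      producer.emit('error', new Error('disk gone'));
      producer.emit('end'); // the informal contract says "at most one of end or error"
      ```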

      I think his comments on node_modules and require() are pretty similar. One of the fantastic things about node_modules as it works today is that you don’t have to futz with environment variables (e.g., $NODE_PATH). Resolution starts at the location of the source file from which the require() call is made and traverses the directory tree in a standard way. This is in stark contrast to some other environments, which require installing modules into a global directory tree and/or correctly setting an environment variable to tell the runtime or compiler where modules live.
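
      As a rough sketch, a bare require('lodash') from /app/src/lib/util.js walks upward through node_modules directories, something like this (the paths are hypothetical and the real algorithm has a few extra rules):

      ```js
      const path = require('path');

      // Simplified model of the upward node_modules walk for a bare specifier.
      function nodeModulesPaths(fromFile) {
        const dirs = [];
        let dir = path.dirname(fromFile);
        while (true) {
          dirs.push(path.join(dir, 'node_modules'));
          const parent = path.dirname(dir);
          if (parent === dir) break; // reached the filesystem root
          dir = parent;
        }
        return dirs;
      }

      console.log(nodeModulesPaths('/app/src/lib/util.js'));
      // [ '/app/src/lib/node_modules',
      //   '/app/src/node_modules',
      //   '/app/node_modules',
      //   '/node_modules' ]
      ```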

      I’ll be the first in line with criticism about NPM, or the quality of many or even most of the modules in the ecosystem, but having built production-grade software for six years using Node, I can say I would absolutely miss the current module resolution behaviour if it was gone.

      1. 4

        I did watch, but appreciate your notes anyways!

      2. 8

        Deno seems very interesting. He’s trying to rectify his mistakes and adopt typed, async JS.

        URL imports seem somewhat interesting but have one large flaw in my opinion - they drop the singular responsibility of security in exchange for a shared one.

        NPM has a lot of issues, especially security ones, but at least it’s an actively maintained, up-to-date entity. If something goes wrong, someone will notice very quickly and will attempt to rectify it just as quickly.

        With URL imports, however, I’m worried that we’ll lose the quickly part of this. Imagine a scenario in which an early adopter makes a left-pad-like module - something trivial that isn’t in the standard library but that everyone wants. A few years later they migrate away and drop out of the tech community (but still host their domain/js).

        If this domain gets compromised, every package with that import will be done. There won’t be a central authority that can sink that package or domain - it will be a shared responsibility of every developer, which is significantly more dangerous than the singular responsibility of npm.
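
        To make the scenario concrete, here is what such an import might look like (the domain, path, and module are entirely made up):

        ```js
        // Deno-style URL import; the program gets whatever the host serves at this URL.
        import { leftPad } from "https://left-pad.example.com/v1/mod.js";

        console.log(leftPad("42", 5, "0")); // "00042"

        // If left-pad.example.com is later abandoned and its DNS or hosting is
        // compromised, every program carrying this line pulls attacker-controlled
        // code. No central registry can unpublish it; each importer has to notice
        // and change the URL themselves.
        ```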

        1. 14

          I was amused to note that he cautions against using “cute” code and then goes on to call URL imports cute at around 21:14.

          1. 4

            The main thing I’d counter with is that the web has run on URL imports like that and has managed fairly well. Spreading imports out also spreads out the attack targets: NPM is a single target by comparison, versus the various CDNs, locally copied libraries, and so on that this scheme proposes.

            Overall, it’s a different tradeoff, and so far, this JS runtime is a thought experiment.

          2. 3

            Here’s a PDF of the slides - http://tinyclouds.org/jsconf2018.pdf

            1. 1

              On the window vs global thing: for the love of compatibility, please put both in, both pointing to exactly the same object.
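
              A minimal sketch of what that could look like, using the now-standard globalThis binding as the single shared object (illustrative bootstrap code, not anything Deno actually does):

              ```js
              // Expose both names as aliases of one and the same global object,
              // so code written for either browser or Node conventions keeps working.
              globalThis.window = globalThis;
              globalThis.global = globalThis;

              console.log(window === global);     // true
              console.log(window === globalThis); // true
              ```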