1. 17
    1. 14

      The tl;dr is that it doesn’t support the full TS syntax (so no skipping compilation) and it mandates that engines not enforce the types (thus burning the syntax for any actual type safety in future)

      1. 3

        The organizing principle of web APIs is, “Don’t break the web.” The runtime type safety ship sailed for JavaScript the moment ECMA-262 was adopted. The only way to maintain backward compatibility for existing codebases is to define some other language browsers are expected to support:

        <script lang="someotherlanguage">...</script>
        

        Given the large surface area of the DOM and the likely complexity and performance overhead of running two scripting language engines in parallel, that seems unlikely.

        1. 3

          Adding type syntax to JS would not break the web, but if the syntax does not require enforcement (or in this case mandates non-enforcement), then engines would not be able to enforce type safety in future because that would break content that previously worked.

          1. 3

            There are pretty easy ways around this if TC39 wanted to allow enforcing types in the future. The most obvious would be to create a "use types"; pragma similar to the existing "use strict";

            Google even experimented with something similar in strong mode a few years ago.
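
            A rough sketch of what that opt-in could look like (the "use types" directive here is purely hypothetical, not part of any actual proposal):

            "use types";

            // In a hypothetical enforcing mode the annotations below would be
            // checked by the engine instead of being treated as comments.
            function add(a: number, b: number): number {
              return a + b;
            }

            add(1, 2);      // fine
            // add(1, "2"); // an enforcing engine could throw a TypeError here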

            1. 4

              Use strict came at a phenomenal cost in spec complexity, and required a significant amount of complexity in the implementation of the basic semantics of JS.

              The only reason I accepted strict mode in JS is because of the removal of |this| parameter coercion, nothing else in strict mode warranted the semantic overhead of the mode switch.

              Adding another mode is not a thing that is ever going to happen in JS.

              The solution to opting in to types in JS is to add an optional syntax that allows type constraints, and to enforce it.

              This proposal adds some of the syntax, doesn’t really specify the details, and also prohibits engines from enforcing those constraints, which will burn the syntax.

              So you get a large increase in complexity, but no actual benefit.

              I need to be very clear here: the solution to optional/progressive typing in JS is to actually specify the constraint syntax and semantics, and require that engines enforce those constraints. I would be 100% on board with that. What this proposal does is half-assed.

              In an HN comment one of the authors said a motivating factor was pasting TypeScript-annotated code into the console, and said minifiers would strip out the annotations. The first could be trivially solved by console-level rules; the latter completely defeats the argument for adding this proposal to the language.
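
              To spell out the minifier point: a production pipeline already drops the annotations before the code reaches an engine anyway. Roughly (the exact output depends on the tool):

              // Source as written, with annotations:
              function greet(name: string): string {
                return "Hello, " + name;
              }

              // Roughly what a minifier ships - the annotations are already gone:
              //   function greet(n){return "Hello, "+n}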

      2. 2

        Right, it’s neither one nor the other. And it’s not fully accurate either:

        Things like visibility modifiers (e.g. public, private, and protected) might be in scope as well; however, enums, namespaces, and parameter properties would be out of scope for this proposal since they have observable runtime behavior.

        Is this correct? JavaScript would just ignore the private keyword, but not all TS compilers do, or?
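
        (For context on the quoted carve-out: enums and parameter properties are excluded because they compile to real runtime code rather than erasable annotations. For example:)

        // TypeScript source:
        enum Color { Red, Green }

        // Roughly what tsc emits for it - an actual runtime object that an
        // engine could not simply ignore:
        //   var Color;
        //   (function (Color) {
        //     Color[Color["Red"] = 0] = "Red";
        //     Color[Color["Green"] = 1] = "Green";
        //   })(Color || (Color = {}));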

        1. 4

          Didn’t JavaScript come up with its # notation for private fields after TypeScript had its private keyword? Private appears a little superfluous now, perhaps.

        2. 1

          Visibility modifiers (public, private, and protected) are not valid in JavaScript. When TypeScript is compiled to JS, those modifiers are removed. If it didn’t remove them, it would cause JS engines to throw syntax errors.

          JavaScript has also added a different way of having private fields (#privateProperty), but that is a separate thing.
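
          A minimal illustration of the difference (tsc erases the modifier, while the # field survives compilation and is enforced by the engine):

          class Counter {
            private count = 0; // compile-time only: tsc erases the modifier
            #secret = 0;       // real JS private field: kept and enforced at runtime
          }

          // In the emitted JavaScript the "private" is simply gone, so at runtime:
          const c = new Counter() as any;
          c.count;      // reachable - nothing enforces the modifier after compilation
          // c.#secret; // would be a SyntaxError outside the class body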

          1. 1

            No, I know about the #, but I am asking what happens if you both remove these modifiers and enforce them. E.g. tsc removes the private modifier on compilation, so I have a method visible (but the source said private) on an object. In the new proposal, the annotations are optional, but enforced. So the same source but different results. Or am I getting something wrong?

            1. 1

              The proposal is that certain slots in JavaScript syntax will be reserved for type information, and that type information can be used by external tools, but that the JavaScript runtime will ignore those slots at runtime. This allows TypeScript, Flow, etc. to use those slots to express their type systems, and to continue to improve without locking JavaScript itself into any of those competing approaches.

              So to answer your question, if this proposal is adopted, and it includes visibility modifiers like private, then TypeScript would no longer have to strip out private, because the JavaScript runtime would just ignore the modifier.

              It is worth noting that the original post says that the proposal may include visibility modifiers, but it may not.
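
              Concretely, assuming visibility modifiers did end up in the proposal (which, again, is not settled), something like this would be legal source, with the runtime simply not caring about the modifier:

              class Account {
                private balance = 0; // parsed, then ignored by the runtime
                #pin = 1234;         // existing JS private field: still enforced by the engine
              }

              const a = new Account();
              // a.balance - reachable at runtime; only external tools (tsc, your editor) would object
              // a.#pin    - still a SyntaxError outside the class, exactly as today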

      3. 1

        I think it’s important to acknowledge that TypeScript also doesn’t enforce types at runtime, so a whole set of questions about semantics and what to do in those cases would appear.

        “Enforcing types” is also a whole weird thing, where it’s like… for dictionaries that would involve comparing a bunch of keys, and you get into a lot of questions about what X or Y is… remember, TypeScript uses structural typing, not nominal typing! So “enforcement” involves a lot more work than just checking a type tag.
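
        To make the structural point concrete: there is no “Point” tag attached to values at runtime, so any enforcement has to inspect the value’s shape, e.g. something like:

        interface Point { x: number; y: number }

        // Structurally, anything with numeric x and y is a Point - including this:
        const p = { x: 3, y: 4, label: "also fine" };

        // A runtime check can't look for a nominal tag; it has to walk the properties:
        function looksLikePoint(v: unknown): v is Point {
          return typeof v === "object" && v !== null &&
                 typeof (v as { x?: unknown }).x === "number" &&
                 typeof (v as { y?: unknown }).y === "number";
        }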

        1. 2

          TypeScript does compile-time enforcement, so while regular JS presumably can use incorrect types, TS projects presumably don’t.

          In terms of checking type constraints, that is not hard for modern JS engines - they can also do it more efficiently than JS-based checks, as they can coalesce property checks into a single pointer comparison in the common cases.

          1. 1

            Well, no one has any experience using runtime type checking across a whole codebase. TypeScript’s type system is extremely expressive and complex - it allows arbitrary recursive type-level computations and type transformations. For example, here’s a type-level SQL library: https://github.com/codemix/ts-sql

            In many of my projects I use these capabilities to express a well-typed interface, but resort to any-casting under the covers because I can’t prove my internal code follows the interface constraints. Most TS code today has issues like this to some extent. If we turned on runtime type checks, how much inconsistency would there be? I expect quite a lot.
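
            The kind of thing I mean, simplified (parseRow here is just a stand-in for real parsing code):

            interface User { id: number; name: string }

            // The public signature is well typed...
            function parseRow(row: unknown): User {
              // ...but internally I can't prove the shape, so I just assert it.
              // tsc is satisfied; a runtime check on the returned value might not be.
              return row as User;
            }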

            1. 1

              If we add the syntax now but don’t enforce the type checks, then the result is exactly what you’re claiming: people will write code that fails the type checks, so in future the spec will not be able to specify type-checking rules without breaking existing content.

              As far as TS goes: I’m not convinced that their type system is decidable, and any standard for optional/progressive typing in JS will require an exact deterministic specification.

              For performance: it is possible that you could make type constraints that are hilariously complicated, but I suspect that the slow check case is uncommon.

              I also think you could start with a less complete specification that has the benefit of not burning syntax. For example, you could start off with only the ability to specify property existence rather than existence + type; in such a case the type annotation would be purely additive, so unconstrained properties would not prevent the future addition of constraints.
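
              One way to picture that incremental path, using today’s TS syntax as a stand-in for what an enforced spec could start from:

              // Step 1 (existence only): the constraint is just "p has x and y".
              function length1(p: { x: unknown; y: unknown }): number {
                return Math.hypot(p.x as number, p.y as number);
              }

              // Step 2 (a later, purely additive revision): existence plus type.
              function length2(p: { x: number; y: number }): number {
                return Math.hypot(p.x, p.y);
              }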

              I’ve said it elsewhere: the problem with this “specification” is not the addition of syntax, it’s the addition of syntax that the runtimes are expected to ignore. If they ship ignoring the syntax then the syntax is burned: it can’t be reused for actual constraints in future without breaking content.

    2. 5

      I don’t see much benefit from this, really. I would personally always use a subset of TypeScript over JavaScript with types. Tools like esbuild make it easier than ever to use TypeScript. I do wish JavaScript had proper types - but TypeScript got there first, and it’s a great ecosystem to work with.

      That being said, if the option was JavaScript with types vs JSDoc, I would always vote for JS + types. Not a huge fan of JSDoc.
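
      (For reference, the whole esbuild setup can be as small as something like this - the options are illustrative, adjust the paths for your project:)

      import { build } from "esbuild";

      // esbuild strips the TypeScript types while bundling; no separate tsc emit step needed.
      await build({
        entryPoints: ["src/index.ts"],
        bundle: true,
        outfile: "dist/index.js",
      });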

    3. 3

      This, from Microsoft. The same Microsoft who single-handedly sank ES4, which was basically JS with types, based on the existing, well-trodden, battle-tested, well-thought-out changes in AS3 developed by Adobe. The same Microsoft who re-invented 70% of that as TypeScript, 3 years later.

      If they hadn’t sunk ES4, we’d all have had working, enforced-but-optional types in-browser for about 15 years now.

      1. 3

        ES4 was not just killed by MS, and it was also not “just JS with types”. At the point ES4 was killed it was a huge spec, including multiple distinct namespacing mechanisms, with no real consideration of the interaction between the various namespacing/module systems and the web. The specification as it stood also failed to completely specify all the behaviors and interactions of those new features - remember that despite this it was a huge specification.

        At the same time it failed to do anything to actually specify the existing JS behavior.

        ES4 died for many reasons, and MS was not the “single hand” that killed it. It was decided by everyone in TC39 that it was not a viable spec, and that included people from MS, Mozilla, Google (this was pre-chrome, possibly before they had even started internally?), Apple, jQuery folk, and a few other random people who were periodically present.

        If ES4 hadn’t been killed we would likely still be stuck with multiple incompatible JS engines.

    4. 3

      I can only speculate that the purpose of this proposal is to make further TypeScript-ish inroads into existing JavaScript codebases for the benefit of tools like VS Code.

      1. 3

        One of the authors commenting on HN made it sound like the ability to paste TS into the console was at least a motivating factor, which is really not very compelling.
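
        (For anyone who hasn’t tried it: even trivially annotated code is rejected by today’s consoles, which is the friction being described:)

        const x: number = 1;
        // Current engines reject this with a SyntaxError (the exact message varies by engine),
        // so annotated snippets can't be pasted into the console as-is.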

        1. 2

          Perhaps not for you, but it is very compelling for someone like me who works in TypeScript codebases (and enjoys it!) and would also like to be able to use the JavaScript console as a REPL for interactive development.

          1. 4

            That seems like a reason for the TS folks to make a REPL, not for JS engines to take on the burden of parsing type annotations without providing any of the benefits (and, worse, burning the syntax so that they could never enforce types with that syntax in future).

            1. 1

              There is already the Deno REPL, but I think they would also like to use the browser console, which sounds useful to me.

              1. 3

                There is also ts-node, so no need to leap to Deno - though Deno’s CLI is a little faster, as AFAIK it doesn’t do type checking

    5. 2

      And then when the type syntax inevitably evolves, we’ll need a babel equivalent for type syntax. This is absurd.