1. 21
  1. 19

    Multiple of these are just the standard “I don’t understand floating point” nonsense questions :-/

    1. 5

      That doesn’t explain why a scripting language uses floating point as its default representation, let alone why that is its only numeric type.

      1. 4

        JavaScript has decimal numbers now fwiw, though I agree. Honestly I’ve been convinced that IEEE floating point is just a bad choice as a default floating point representation too. I’d prefer arbitrary size rationals.

        1. 2

          Arbitrary size rationals have pretty terrible properties: in a long chain of operations where the numerators and denominators stay relatively prime, the representation blows up in size.

      2. 5

        Indeed, same goes for the octal notation question (010 - 3 = ?)

        1. 7

          tbh the 010 octal format IS pretty awful. I don’t know what they were thinking putting that in C.

          1. 5

            well at least JS users have 0o10 - 0o5 now, if they find leading 0 octal notation to be confusing.
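
            For reference, a quick sketch of the ES2015 radix prefixes (run as a sloppy-mode script; the legacy 010-style literals are a SyntaxError in strict mode, so they’re only mentioned in comments):

            ```javascript
            // ES2015 added explicit radix prefixes, so the confusing
            // leading-zero octal form is no longer needed:
            const oct = 0o10;   // octal  -> 8
            const bin = 0b1000; // binary -> 8
            const hex = 0x8;    // hex    -> 8

            console.log(oct - 0o5);                  // 8 - 5 = 3
            console.log(oct === bin && bin === hex); // true

            // The legacy form `010` (value 8) still parses in sloppy-mode
            // scripts for backwards compatibility, but is a SyntaxError
            // in strict mode and in modules.
            ```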

            1. 3

              Thanks for note, wasn’t aware of the ES2015 notation and MDN is helpful as always.

            2. 4

              I mean if you want fun 08 is valid JS, and that’s absurd :) (it falls back to decimal, nothing could go wrong with those semantics)

              1. 3

                Amusing, I’ve seen people write PHP with leading 0s before. Newer PHP rejects octal literals that contain invalid octal digits - fun! Putting in the leading zeroes is common for people used to e.g. COBOL/RPG and SQL; business programming, where they’ve never seen C.

            3. 2

              really? only like ~5 of the 25 appeared to be floating point related: 0.1 + 0.2, x/0 behavior, 0 === -0 and NaN !== NaN. Correct me if I’m wrong. Most of them seem to be about operators and what kind of valueOf/toString behavior one gets when faced with such operators. The only two I got wrong were because I forgot +undefined is NaN, and I was a bit surprised that one can use postfix increment on NaN (and apparently undefined?).
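
              A quick sanity check of those floating point items (nothing JS-specific here beyond IEEE 754 doubles):

              ```javascript
              console.log(0.1 + 0.2); // 0.30000000000000004
              console.log(1 / 0);     // Infinity
              console.log(0 === -0);  // true (though Object.is(0, -0) is false)
              console.log(NaN !== NaN); // true: NaN never equals itself
              console.log(+undefined);  // NaN: ToNumber(undefined) is NaN

              // Postfix increment works on anything ToNumber accepts:
              let n = NaN;
              n++;
              console.log(n); // NaN

              let u = undefined;
              u++; // ToNumber(undefined) is NaN, and NaN + 1 is NaN
              console.log(u); // NaN
              ```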

              1. 2

                Any arithmetic operation can be performed on NaN, but it always yields another NaN.

                The undefined one is a bit weird but kinda makes sense, it is indeed not a number.

                I actually think what’s weirder is how javascript will sometimes give you basically an integer. x|0 for example. The behavior makes a lot of sense when you know what it is actually doing with floating point, but it is still just a little strange that it even offers these things.

                But again I actually think it is OK. I’m a weirdo in that I don’t hate javascript or even php.
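
                The x|0 trick works because bitwise operators run their operands through ToInt32; a quick sketch of the edge cases:

                ```javascript
                // Bitwise operators coerce via ToInt32: truncate toward zero,
                // then wrap modulo 2^32 into the signed 32-bit range.
                console.log(3.7 | 0);           // 3
                console.log(-3.7 | 0);          // -3 (truncation, not floor)
                console.log((2 ** 32 + 5) | 0); // 5  (wraps mod 2^32)
                console.log(NaN | 0);           // 0  (ToInt32(NaN) is 0)

                // asm.js-era code leaned on x|0 as an "this is an int" annotation.
                ```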

              2. 1

                i don’t see where is the contradiction there. JS numbers are IEEE 64-bit floating point numbers, so any weirdness/gotcha in IEEE floating point is also a weirdness/gotcha in JS too

                i know that many (most) languages also use floating point numbers by default, but that doesn’t make floating point gotchas any less weird, maybe just more familiar to already-seasoned programmers :)

              3. 14

                Some of these aren’t weird. Like some are literally what the IEEE 754 spec says to do with floating point numbers. One thing that’s only obvious in retrospect is that programming languages are also pieces of software with their own engineering trade-offs. When you decide what “+” means you really have to decide that for all the possible ways it can be used in the language. Javascript decided “+” is well-defined for strings and numbers along with everything that can be coerced into a string or number. The rest is a consequence of that decision.
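
                The consequences of that decision are easy to demonstrate; “+” prefers string concatenation whenever either operand’s ToPrimitive result is a string:

                ```javascript
                console.log(1 + 2);         // 3     (both numbers)
                console.log(1 + "2");       // "12"  (string wins, concatenation)
                console.log(1 + null);      // 1     (null coerces to 0)
                console.log(1 + undefined); // NaN   (undefined coerces to NaN)
                console.log(true + true);   // 2     (booleans coerce to numbers)
                console.log([] + []);       // ""    (arrays ToPrimitive to "")
                console.log([] + {});       // "[object Object]"
                ```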

                1. 12

                  This feels like the worst Javascript job interview ever and like beating a dead horse.

                  Everyone drags out this “Made In 10 days” line, even Eich himself, but he made a working prototype in 10 days 26 years and 12 ECMA versions ago, and somehow it’s the core of every single argument against Javascript - by this standard, you can discount the entire World Wide Web by the time it took Tim Berners-Lee to make the first prototype web browser. Heck, you can summarize the entire history of the Web as “prototype released into the wild” and nab all the problems of modern computing as “weird edge cases that still exist for historical reasons” but for some bizarre reason, people only ever bring this up when it comes to Javascript.

                  1. 5

                    JavaScript has a strong commitment to backwards compatibility, and that makes the 10 day prototype more significant IMO. Many things that get built quickly can be refactored down the line. JavaScript won’t be.

                    1. 4

                      Open up the console in your browser and try out print("Hello World!") … this would’ve printed Hello World in 1995 on Eich’s 10 day prototype “Mocha” - it doesn’t do that anymore. It’s not completely backward-compatible.

                      1. 4

                        That seems like a minor form of backwards incompatibility. I don’t think we’ll ever see a breaking change like python 2 to 3, which is a good thing, but it means that the underlying design is even more important.

                        1. 3

                          The p2 to p3 break is one of the worst things they could do, for negligible actual “improvement”.

                          My wife is now stuck perpetually having to fight p2 vs p3 software issues, and will be for all time as there are some very hefty bioinformatics packages that aren’t going to be rewritten any time soon.

                        2. 2

                          print() has (surprisingly) been bringing up a print dialog in pretty much every version of JS other than maybe the first betas, back when there was no backwards compatibility problem yet.

                          1. 1

                            So he actually made it even worse.

                        3. 3

                          Heck, you can summarize the entire history of the Web as “prototype released into the wild” and nab all the problems of modern computing as “weird edge cases that still exist for historical reasons”

                          Great, let’s do it. And go back to native software running on machines users actually control.

                        4. 4

                          As others pointed out these are mostly floating point things and octal notation, and the rest is operator coercion rules, which are a very small part of the language! Everybody likes to pick on JS because of the coercion rules, but almost none of this nonsense has ever caused me any issues.

                          1. 4

                            I mostly agree, most of this is mainly comedy material.

                            That said though, even in everyday use, Javascript’s “eager to please” fuzzy comparisons and type coercions do bite you, just like its semicolon insertion, floating point numbers and its two kinds of null or four kinds of for loops.

                            In a way, articles like this obscure the more damning parts of JS — its sloppy language design — of which “wat” problems are but a small part.

                          2. 3

                            Aside from a couple of IEEE peccadilloes, almost all of these are from implicit conversions. The reason JS has so many implicit conversions is that the initial design of the language had no way to report runtime errors. In the absence of that, every operation on every possible kind of operands had to produce some kind of value, and now here we are.

                            If there’s a lesson here for other language designers, it’s that you need a coherent strategy for handling type errors from day one or you will end up regretting it. That can be static types, exceptions, or something else, but you have to think about it and do something, or you’ll end up having to fill every single cell of your implicit conversion table with increasingly arbitrary values and your users will hate you forever.
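
                            A tiny illustration of that conversion table: with no way to signal a runtime type error, every cell has to be filled with *some* value:

                            ```javascript
                            // Each of these would plausibly throw a TypeError in a
                            // language with runtime errors; in JS they all yield values:
                            console.log("5" * "4"); // 20    (both operands coerce to numbers)
                            console.log({} - []);   // NaN   (ToNumber({}) is NaN, ToNumber([]) is 0)
                            console.log(null >= 0); // true  (relational ops coerce null to 0)
                            console.log(null == 0); // false (but == uses its own special-case table)
                            ```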

                            1. 2

                              I stopped reading at “comma operator”.

                              1. 2

                                Why? The comma operator in JavaScript appears to have the same semantics as in C, and it’s confusing there too.

                                1. 2

                                  The sheer existence of an operator that is a comma is weird in my opinion.

                                  1. 1

                                    It is weird, but it’s a property of all C-family languages: it’s a sequencing operator, and it’s needed in C because various places in the grammar require a single expression. For example, you can write i++, j++ in the increment clause of a for loop, because the comma joins them into a single expression. It’s less necessary in JavaScript, where you could just use an anonymous function / closure to contain multiple statements, but JavaScript lifted a load of C syntax directly, as did Java and C++. All of these languages have these warts.
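
                                    A minimal sketch of that sequencing behavior (identical in C and JS):

                                    ```javascript
                                    // The comma operator evaluates left to right
                                    // and yields the last value:
                                    const x = (1, 2, 3);
                                    console.log(x); // 3

                                    // Its main legitimate use: squeezing two updates into
                                    // the single expression slot of a for loop:
                                    const pairs = [];
                                    for (let i = 0, j = 3; i < j; i++, j--) {
                                      pairs.push([i, j]);
                                    }
                                    console.log(pairs); // [[0, 3], [1, 2]]
                                    ```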