1. 52

  2. 14

    I’m impressed with the results of disabling JIT. I wonder if with a suitably tuned interpreter, you could basically get most of the performance with possibly much better power efficiency and security.

    1. 6

      s7 Scheme, which is interpreted, claims performance similar to Guile (which is JITted) and to Chicken Scheme and SBCL (which are compiled).

      1. 8

        The JavaScriptCore[1] interpreter is hand-written portable macro assembly that generates around 30 KiB of machine code, small enough to fit in the L1 i-cache on most systems. Their tier-1 JIT is incredibly simple: it copies and pastes the chunks for the small opcodes inline and inserts jumps to the large ones. The vast majority of JavaScript code runs on one of these two implementations; code only gets promoted to the third and fourth tiers after being identified as a very hot path. Unless you’re running canvas-based games, the odds are that most JavaScript you see executing in Safari is in the interpreter or the baseline JIT.

        [1] The JS implementation in WebKit, which Chrome replaced with V8.

        1. 2

          Google Docs is switching to canvas-based rendering, and a working office suite is table stakes for many, many office workers.

          1. 8

            Just because it uses canvas doesn’t mean it would be slow without the JIT. The frame rate for a typical office suite can easily be 10fps without users caring, and the amount of compute required to redraw each frame is comparatively small. Unless you’re recomputing an entire large spreadsheet on every keystroke, or redoing text layout for a long document with no caching, you’re unlikely to be CPU bound (for comparison, take a look at the speed of SILE, which produces vastly nicer output than Google Docs, is fast, and is 100% interpreted Lua).

            The canvas itself is entirely AoT-compiled native code and that’s where the vast majority of the compute time will be.

            1. 1

              I don’t think they’re re-drawing the page background on every frame.

            2. 1

              Now we just need heuristics (and/or UI controls) to distinguish “random possibly attacking page” vs “the Quake 3 WebGL version I like” and hard-disallow the high tiers on the former.

            3. 1

              I’m curious to read more about it. Where is that claim made? That site doesn’t seem to say anything about performance and I cannot find other mentions or benchmarks while googling “s7 scheme performance”.

              Edit: never mind, it’s in the page you link.

              “In s7, that takes 0.09 seconds on my home machine. In tinyScheme, from whence we sprang, it takes 85 seconds. In the chicken interpreter, 5.3 seconds, and after compilation (using -O2) of the chicken compiler output, 0.75 seconds. So, s7 is comparable to chicken in speed, even though chicken is compiling to C. I think Guile 2.0.9 takes about 1 second. The equivalent in CL: clisp interpreted 9.3 seconds, compiled 0.85 seconds; sbcl 0.21 seconds. Similarly, s7 computes (fib 40) in 0.8 seconds, approximately the same as sbcl. Guile 2.2.3 takes 7 seconds.”
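              For reference, the fib in those timings is presumably the naive doubly-recursive definition (an assumption on my part; the quoted passage doesn’t show the code). In JavaScript it looks like:

```javascript
// Naive doubly-recursive Fibonacci, the classic interpreter benchmark.
// fib(40) makes on the order of 300 million calls, so it measures little
// beyond raw call/dispatch overhead.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

console.log(fib(30)); // prints 832040
```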

              1. 2

                Doing a bit more digging, S7 is about middle of the pack if you trust ecraven’s benchmarks (I see no reason not to). The ones to beat are Chez, Bigloo, Racket, Gambit, MIT, etc.


          2. 7

            This article reminded me of a personal experience.

            Many years ago, I was consulting with a Fortune 50 company, and we were creating and installing a rules-based configuration engine for their product. At the time, SQL Server and Oracle were the big dogs on the database scene, and the customer, after long deliberation, had settled on SQL Server. Overall, they felt the cool new features were much better suited to their business than Oracle’s.

            Well, rules-based engines tend to be recursive. They can do ugly things to performance, like endless loops. Our code was good, and we had tested it elsewhere, but it wasn’t performing well on their state-of-the-art servers.

            Instead of using us to work through the performance, they decided as a top-tier customer of Microsoft they would bring out the key engineers and have them tune things.

            The first thing they did was turn off all of the cool new stuff that was the reason for purchasing it in the first place.

            1. 6

              I don’t much care about what Microsoft’s browser does because I don’t use it, but I’m very much interested in spreading the idea that there are good reasons to use simple, maintainable software even if it isn’t the fastest.

              1. 3

                Interpreters are pretty simple, and in the naive case pretty equivalent to a CPU (the kind you design in a college course). That simple CPU fetches instructions one after another in hardware, performs operations, and stores the results in some memory. Any reasonable interpreter (so, ignoring tree-walking interpreters) does the exact same thing in software.
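                That fetch/decode/execute loop can be sketched in a few lines. This is a hypothetical stack-based bytecode VM for illustration, not any real engine’s instruction set:

```javascript
// Minimal bytecode VM: fetch, decode, execute — the software analogue of a
// simple CPU. Opcodes and encoding are made up for this sketch.
const OP_PUSH = 0, OP_ADD = 1, OP_MUL = 2, OP_HALT = 3;

function run(code) {
  const stack = [];
  let pc = 0;                        // program counter
  for (;;) {
    const op = code[pc++];           // fetch + decode
    switch (op) {                    // execute
      case OP_PUSH: stack.push(code[pc++]); break;
      case OP_ADD:  stack.push(stack.pop() + stack.pop()); break;
      case OP_MUL:  stack.push(stack.pop() * stack.pop()); break;
      case OP_HALT: return stack.pop();
      default: throw new Error(`bad opcode ${op}`);
    }
  }
}

// (2 + 3) * 4
console.log(run([OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PUSH, 4, OP_MUL, OP_HALT])); // prints 20
```

Every iteration pays for the dispatch (the switch) on top of the work itself, which is where the software-on-hardware overhead comes from.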

                It’s hard to fathom physically how you could get within 2x (let’s say) of the speed of hardware in software running on top of that same hardware. And the factor is probably much higher than that in practice, though I don’t have an intuition for what it is. Maybe 10x?

                The question Microsoft is trying to answer here, I assume, is whether hardware is now fast enough that 2x slower (or whatever the factor is) is still reasonable.

                But maybe I’m just building my interpreters incorrectly. I’d love to learn if so. :)

                1. 3

                  My cognitive dissonance is off the charts these days with Microsoft. Are they the goodies?

                  1. 24

                    They are a big company doing many different things. It saves a lot of heartache to ditch the goodie/baddie scale and consider each activity on its merits. And in this case I think they’re onto something good. I for one would sacrifice 10-50% JS speed to rule out an entire class of common security bugs.

                    1. 2

                      “ditch the goodie/baddie scale and consider each activity on its merits.”

                      I gradually changed to this way of thinking after I crossed 33-34; however, I struggle to communicate it to other people. Is there a name for this mental model?

                      1. 3

                        People are complex, and organizations change over time. Initial impressions matter a great deal, but people who are interested update their opinions on the basis of new information while retaining history.

                        One way to do this is Bayesian estimation. The two primary problems with Bayesian estimation are that people are no good at keeping track of their priors and no good at estimating new probabilities.

                        It is reasonable to assume that Microsoft, at any time, is doing something clever, something stupid, and a lot of things that they think will make them money in a medium-term future.

                        1. 1

                          Right tool for the job. It almost always redirects the conversation to technical merits rather than moral policing. That does not mean moral and ethical considerations are useless; it just helps better calibrate the discussion.

                    2. 17

                      They are:

                      • Giving schools free copies of their office package in order to maintain their OS and productivity software dominance on the desktop.
                      • Lobbying public sector heavily to get locked into Azure.
                      • Using their dominant position to get the whole public sector as well as kids and students onto Teams and O365 with a recurring subscription.
                      • Cross-selling OneDrive heavily on their non-enterprise Windows editions.
                      • Showing ads and installing bloatware in their non-enterprise Windows editions.

                      They want nothing less than total platform dominance, and they obviously don’t care about the little people. The question is: do you consider the total dependence of our administrations and of the less well-off on them a goodie?

                      1. 4
                        • Microsoft fights to get big Pentagon contracts. GitHub can’t even drop a small ICE contract, because Microsoft doesn’t want to offend the military-industrial complex.
                        1. 3

                          Thank you.

                          1. 6

                            No problem.

                            They also care about developers and fund some important research, so it’s not all black & white. Just to be fair.

                          2. 1

                            Google, Amazon, Apple, etc. do things too; it’s not like MS is alone in its “my platform only” perspective.

                            1. 1

                              True, but I have yet to see Google, Amazon, or Apple employees writing local laws.

                              I am pretty sure their lobbyists are real busy in Brussels and Washington, though.

                          3. 8

                            I think, very approximately, corporations which sell platforms tend to act like arseholes when they have the upper hand (in terms of network effects and lock-in) and saintly when they don’t.

                            e.g. 10-14 years ago, with Win32 being the juggernaut of “we already sunk cost into desktop apps on this platform” and the iPhone only just having been released, Microsoft were barely even bothering to disguise their efforts to extinguish open protocols. Meanwhile, Apple were pushing HTML5 hard as the big thing that would allow software to work beautifully on all platforms.

                            Whereas now Microsoft are very much less dominant, and Apple have a near monopoly on phones purchased by the subset of users who spend money on their phones. So Microsoft are promoting open everything, and Apple are doing things like strategically dragging their feet on HTML5 features, adding horrendous bugs to iOS Safari, and not fixing them for months to years.

                            1. 5

                              Nope. I think they’re doing a great job of “embrace” though.


                              1. 5

                                No such thing: these companies look out for themselves, period. But you are better off with more than one company duking it out, because to compete they will (sometimes) do things that are pro-consumer or pro-developer or whatever.

                                Microsoft on top of the world tried to keep the Web stagnant once they’d killed Netscape, and undermined the growth of Linux and GPL software every way they could. Microsoft as a bit more of an underdog likes Web standards, runs GitHub, and builds the favorite open-source dev environment of a bunch of new coders.

                                Same with, for example, AMD and Intel. AMD got competitive, and prices of high-core-count chips plummeted: great! Now, with a chip shortage and a solid follow-up product, AMD is in a position to start raising prices, and they are. Intel getting its manufacturing improvements on track would probably give us more of a price war in certain segments, not less!

                                I’m old enough to remember when AWS was exciting to smaller devs because you didn’t have to shell out upfront for servers or provision far in advance for spiky loads. 🤣 Now it’s a huge profit center for Amazon, and there are plenty of justifiable complaints about pricing (notably for egress) and about undermining open-source-centric companies by using its market position to compete with those companies’ own managed offerings.

                                Tactically, I guess I want underdogs to reach competitiveness, and for companies to see some benefit to doing good things (e.g. open sourcing things) and cost to bad ones (e.g. pushing out little companies) even when the immediate economic incentives point the wrong way. But in the long run all these companies respond to their economic situation and none of them are your friend.

                                1. 4

                                  It doesn’t really make much sense to blame the moral character of individual actors when there’s a system of incentives and punishments driving things. If a big company were all saints and angels, it would be outcompeted by another company willing to play the cutthroat game to maximum advantage. (It’s worth noting that playing the cutthroat game to maximum advantage sometimes means doing good things, and sometimes good things come out of bad actions too; but if the market and political winds shift, companies will eventually shift with them or go out of business.)

                                  They’re doing what they think will make them money without getting another nasty visit from the government regulators. So are Google, Apple, and Mozilla. None of them are good or bad per se; they’re all just navigating the same sea.

                                  1. 3

                                    I’ll add to mordae’s list that they were using legal muscle to extract billions in patent royalties from Android. I saw one article claim that has stopped; I haven’t followed it in a while. You could check whether they still patent-troll others in general. I mean, that’s clearly evil.

                                    On showing ads: the traditional logic for desktop software, games, etc. is that people who pay don’t see ads. Microsoft started putting ads in paid products, including the Xbox. After gamers protested, they expanded how many ads were on the screen instead of reducing them.

                                    One might also count the integration between single-player games and Xbox Live. I shouldn’t have to log in to an online service to play an offline game, and if I am offline, I shouldn’t lose access to my progress, items, etc. I recall having to deal with stuff like that on Xbox. The nature of online services is that they eventually get discontinued for some products; then those products stop working at all, or don’t work as well. All of this is done to maximize sales of new products using Microsoft’s and their suppliers’ platforms.

                                  2. 2

                                    This is really interesting. If I remember correctly, the deal with JavaScript JITs is also the main reason why there are currently no real versions of Firefox and Chrome on iOS, since Apple forbids apps from generating executable code, for quite obvious reasons. Browser diversity and security are probably worth losing a bit of performance.

                                    Moreover, it would be interesting to know which websites are most affected by this change. If JIT benefits only the bloated ones, then that’s one additional reason to drop it.

                                    1. 3

                                      “probably worth losing a bit of performance”

                                      I’m not sure which browser vendor is likely to risk the “browser Foo 50 times slower than Safari” headline.

                                      1. 6

                                        But it isn’t 50x worse: their own testing on a performance-oriented benchmark shows about a 50% regression, and that seems perfectly acceptable for not having to worry about a whole class of memory bugs that could potentially expose private information to attackers. Most JavaScript is used for presentation rather than heavy computation, so most users wouldn’t see a huge degradation in experience. Besides, there’s a whole market out there that doesn’t care that much about performance but cares a lot about security: corporations. It’s cheaper for a company to throw an extra grand at a faster laptop that makes up for the lost speed than to deal with a huge data leak.

                                        1. 3

                                          I know it’s OK and usable in most cases. But there will be some number-crunching micro-benchmark that is 50 times slower, even if it’s not representative. And I expect that’s the one that will make the headlines: rather than “slightly slower, more secure alternative available”, I expect “Chrome and Firefox finally available on iOS, up to 50x slower”.

                                          Maybe I’m totally wrong. But I honestly think that would stop Google from going with it.

                                    2. 1

                                      It seems to me that the need to allow JIT code generation especially weakens Electron apps, where V8 is used in both the main and renderer processes. If the JIT were disabled in both processes, how badly would it hurt performance in the best- and worst-performing Electron apps?

                                      1. 2

                                        It Depends™. Where the main-process code just does I/O syscalls and the renderer-process code just changes button labels (e.g. Balena Etcher), performance shouldn’t worsen at all.
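                                        For what it’s worth, V8 has a `--jitless` flag that disables runtime code generation, and Electron can forward V8 flags via its command-line API. A sketch of trying this (whether it fully covers both the main process and renderers in a given Electron version is an assumption worth verifying):

```javascript
// Sketch: running an Electron app with V8's JIT disabled.
// --jitless is a V8 flag; how completely it applies to the main
// process vs. renderers may vary by Electron version.
const { app, BrowserWindow } = require('electron');

// Must be set before the 'ready' event fires.
app.commandLine.appendSwitch('js-flags', '--jitless');

app.whenReady().then(() => {
  // Renderers created after this point should inherit the flag.
  new BrowserWindow({ width: 800, height: 600 });
});
```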