OK, but does this matter? Even a 50% speedup won’t matter much unless a program spends a significant portion of its time doing hash calculations on arrays, which seems unlikely (or at least unusual). I’m not too surprised the Java devs haven’t been too bothered about this.
Performance is death by a thousand cuts. Good performance, performance so good you don’t have to bother optimizing your app, is often the result of thousands of people who sweat the details and agonize over nanoseconds and stalls caused by data dependence. People like the author.
I would say that it’s not a problem - until it is. It’s good to have this kind of information available, but even the tone of the article is that one should avoid prematurely optimizing, since changes in the JVM are likely to render those optimizations moot at some point.
I didn’t take that away. There are a few tidbits that are fundamental to Hotspot and probably aren’t going to change anytime soon. One is that many small methods beat one large method: the JIT works best on smaller blocks of code and does aggressive inlining, so hot methods have no call overhead anyway. Final variables also help with things like constant propagation. Things related to GC performance are worth over-fitting for too, since it takes something like a decade for a new GC to make it into the JDK.
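A minimal sketch of both points, with names of my own invention (the inlining thresholds mentioned in the comments are HotSpot's `-XX:MaxInlineSize` / `-XX:FreqInlineSize` flags):

```java
// Illustrative example of JIT-friendly style: small methods that stay
// under HotSpot's inlining thresholds, plus a static final primitive
// that is a compile-time constant and can be folded into call sites.
public class JitFriendly {
    // A static final primitive initialized with a constant expression
    // is a constant per the JLS; javac and the JIT fold it in directly.
    static final int SCALE = 3;

    // Small, single-purpose methods are easy for HotSpot to inline
    // at hot call sites, eliminating the call overhead entirely.
    static int scale(int x) { return x * SCALE; }
    static int clamp(int x, int max) { return Math.min(x, max); }

    static int process(int x) {
        // Composed from small methods; after inlining this becomes
        // straight-line code with SCALE propagated as a constant.
        return clamp(scale(x), 100);
    }

    public static void main(String[] args) {
        System.out.println(process(10)); // 30
        System.out.println(process(50)); // 100
    }
}
```

The same logic written as one big method computes the same result; the point is that the decomposed version costs nothing at runtime once the JIT has inlined it, so there's no performance reason to avoid small methods.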
If anything, I took away from the article that there’s enough time between JDK releases that micro-optimizations you implement now have a shelf-life of years, not months.