This is quite promising, but it still only goes as far as the compiler. If we’re willing to reconsider historical assumptions, there’s a lot further we could go.

Speculative execution only exists as a way to squeeze parallel execution out of sequential code. It has allowed legacy code bases to benefit incrementally from better hardware without having to rethink fundamental assumptions about computer architecture (and “computation” in general) established in the 1950s and 60s. As such, it’s been extremely valuable, but we’re now paying the piper, so to speak.
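
To make that first point concrete, here’s a minimal C sketch (my own illustration, not anything from upthread) of the classic branchy-loop microbenchmark: the same sequential loop tends to run noticeably faster when its branch is predictable, because the CPU can speculate past it and keep the pipeline full, whereas random data makes it guess wrong and stall. The names and sizes here are arbitrary, and whether you actually see the gap depends on your CPU and compiler; at higher optimization levels the branch may be turned into a conditional move or vectorized away, so try something like `-O1`.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 20)   /* number of elements; arbitrary for this sketch */

static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* Sum only the elements above a threshold; the data-dependent branch
 * inside the loop is what the CPU must speculate past to stay busy. */
static int64_t conditional_sum(const int *data, size_t n) {
    int64_t sum = 0;
    for (size_t i = 0; i < n; i++) {
        if (data[i] >= 128)
            sum += data[i];
    }
    return sum;
}

/* Run the loop many times and return wall-clock seconds (rough timing). */
static double time_sum(const int *data, size_t n, int64_t *out) {
    clock_t start = clock();
    int64_t sum = 0;
    for (int rep = 0; rep < 100; rep++)
        sum += conditional_sum(data, n);
    *out = sum;
    return (double)(clock() - start) / CLOCKS_PER_SEC;
}

int main(void) {
    int *data = malloc(N * sizeof *data);
    if (!data) return 1;
    for (size_t i = 0; i < N; i++)
        data[i] = rand() % 256;

    int64_t s1, s2;

    /* Random order: the branch is unpredictable, so speculation often
     * guesses wrong and each misprediction flushes in-flight work. */
    double t_random = time_sum(data, N, &s1);

    /* Sorted order: the same branch becomes almost perfectly predictable,
     * so the speculative work is nearly always kept. */
    qsort(data, N, sizeof *data, cmp_int);
    double t_sorted = time_sum(data, N, &s2);

    printf("random: %.3fs  sorted: %.3fs  (checksums %lld / %lld)\n",
           t_random, t_sorted, (long long)s1, (long long)s2);
    free(data);
    return 0;
}
```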

If we programmers, along with our languages and kernels, were willing and able to genuinely rethink how our machines should work based on the current economics and physics of computer hardware design and manufacturing, we could draw on a lot of interesting ideas that have languished in the academic literature for decades.

For the more practically minded, just look at how GPU computation has taken a model originally developed specifically for 3D graphics and turned it into… all kinds of stuff, but mostly neural network processors and cryptocurrency miners, if you were to rank applications by dollar amount.