Overloaded terms like these are harder to explain, because the person I'm explaining to usually has misconceptions about what the term means.
Like, when I explain what a ‘server’ is, I always have to explain that the physical computer called a server is different from what I’m actually trying to describe. :-( When I forget to, everything gets mixed up and both of us lose track of the conversation…
Ah. This article argues that there are no truly low-level languages left for you to write code in.
I read this article as saying that computers aren’t improving as much as they could, because both hardware and software people think in the “C” way.
People optimize CPUs for the C model, optimize compilers for the CPU, optimize software for the compilers… and as a result the entire computing stack, despite having multiple processors, fast caches, and accurate branch prediction, still revolves around C.
There are some computing models for concurrency (thanks to geniuses like Alan Kay, and good timing). Maybe there could be models for caches or branch prediction too, but nobody really thinks about those…
And this is true. CPUs are high-level machines now. On x86, even the assembly/machine code is an emulated language that is dynamically compiled to μops and executed out of order on multiple complex execution units that you have no direct control over.
Totally agree, but if you put aside that one rather questionable assertion, there’s a whole lot of interesting material here :)