I do not have an SO account, but:
I'm probably forgetting a lot of things. Things are moving along quite well :) Adoption by industry is a bit harder, but we are getting there.
If you had to pick just one of these, which would it be?
I would say query-based compilers, or sugaring and desugaring, as they allow massive changes in UX.
For purely personal “cool” factor: algebraic effect handlers as capabilities, and floating-point-to-string conversion, because that's the stuff I work on.
But tbf, I work on personal projects touching nearly all of the list I posted :D
I’m not sure if replacing theoretical computer science with a fudge counts, but DLSS has pretty much solved anti-aliasing.
Good that you mentioned DLSS. It's something I've been thinking about a lot myself lately, so I indulged in musing about what inventions since 2010 were necessary for it to work.
Theoretical inventions first. Note that not all of these were crucial, but I believe them to have been at least an inspiration.
f(3x3 neighbor pixel colors, guessed color) -> clamped color
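To make that signature concrete, here's a minimal NumPy sketch of the neighborhood-clamping idea; the function name and array layout are my own assumptions, not from any actual TAA/DLSS implementation:

    import numpy as np

    def clamp_to_neighborhood(frame, x, y, guessed):
        # Gather the 3x3 neighborhood around (y, x), clipped at the image borders.
        window = frame[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
        lo = window.min(axis=(0, 1))  # per-channel neighborhood minimum
        hi = window.max(axis=(0, 1))  # per-channel neighborhood maximum
        return np.clip(guessed, lo, hi)

    frame = np.random.rand(4, 4, 3)       # toy 4x4 RGB frame
    guessed = np.array([2.0, -1.0, 0.5])  # an out-of-range guess gets pulled back in
    print(clamp_to_neighborhood(frame, 1, 1, guessed))

The point being that a guessed (history or upscaled) color is only trusted as far as the current frame's local neighborhood allows.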
My deep learning knowledge is somewhat out of date though :)
On the practical side there's of course simply the birth of the GPU-accelerated differentiable programming frameworks: PyTorch (2016) – Facebook's combination of Torch (2002) and Chainer (2015) – and Google's TensorFlow (2015). Both implement reverse-mode automatic differentiation, which has been around for decades. I suppose the combination of GPUs + Python + deep learning built-in was the usability invention here.
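To make the reverse-mode part concrete, here's about the smallest PyTorch example I can think of (just the standard autograd API, nothing exotic):

    import torch

    x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
    y = (x ** 2).sum()  # forward pass records the computation graph
    y.backward()        # reverse pass walks that graph once, backwards
    print(x.grad)       # dy/dx = 2x -> tensor([2., 4., 6.])

TensorFlow spells the same thing with tf.GradientTape.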
Fast GPUs with programmable shaders are of course a necessity, but weren't those around in 2010 already? Compare two roughly comparable Nvidia cards:
So in memory size and bandwidth we have a 4x increase, which happens to match the pixel-count jump from 1080p to 2160p. The 19x increase in floating-point ops sounds awesome, but it's still only about 5x more per pixel once you take the higher display resolution into account.
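A quick back-of-the-envelope check of that per-pixel figure, assuming the 1080p → 2160p jump means 4x the pixels:

    # How much of the 19x FLOP increase survives once the
    # display resolution quadruples?
    flops_ratio = 19
    pixel_ratio = (3840 * 2160) / (1920 * 1080)  # = 4.0
    print(flops_ratio / pixel_ratio)             # 4.75 -> roughly 5x per pixel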
To me it seems like high-resolution displays are the hardware improvement that really drove the development of DLSS and its ilk, not GPUs. Well, you could argue that Nvidia's proprietary RTX tech with its high computational load was the culprit. In any case, games ended up having too much work to do per pixel, and algorithmic & tooling improvements came to the rescue. And there really have been new discoveries since 2010!