This is a talk by Matthew Hammer introducing incremental computation, based on his doctoral work under Umut Acar (the self-adjusting computation guy), but extending it to avoid recomputation in many more situations by using partial-order constraints on the computation instead of a total order. It's helping me understand Acar's dissertation, which I've been finding somewhat tough going, even though I feel the dissertation itself is pretty well explained; the talk, meanwhile, is driving me nuts with the hesitation noises and filler.
Still, the talk really helps put Acar's dissertation in context, and it also gives an overview of what Hammer is doing, including work in languages like OCaml and Python, which I care about a lot more than Standard ML or CEAL. I think it's well worth watching, especially with
mplayer -af scaletempo, so you can speed it up to 1.6× to compensate for the, uh, hesitation noises.