Oh man, this guy. It’s like he’s trying to be the Elon Musk of automata theory.
I’ve actually read his entire NKS tome, on paper, long ago when I had that kind of free time. There’s some really interesting stuff in there, which he makes quite accessible, especially compared to the academic literature he’s plundering. It’s a shame about his overbearing personality: at this point, he’s basically alienated the entire community of discrete mathematicians and theoretical physicists and computer scientists who are best equipped to engage with and evaluate his few novel ideas, amidst all the work he’s… not quite plagiarized, but not exactly given fair credit for either. So he goes direct to the dabblers and crackpots. Who knows, maybe he’ll have some success there.
All true, except that he’s been at this longer than Musk has been at any of his endeavors.
You’ve read the whole of NKS? I’ve never managed that. I learned of NKS by reading Meta-Math.
I was much younger, much less employed, with much less formal education. Given all that it was pretty influential, to be honest. I had no idea that I would end up studying cellular automata (from a topological perspective) in school. I still remember the flinches, sour looks, and tactful hedging when I first brought up Wolfram with my professors.
Very cool. I always wanted to study CA, but people pay me to do other things. If only DevOps and CA were a match…
There’s definitely a connection there, but it’s too theoretical to get paid for; as far as I can tell it’s not even a popular (meaning well-funded) research topic at present. See Ackley’s paper or more broadly “spatial computing”. If you filter out the GIS and visualization stuff from those search results, you’ll see there are a handful of people working to reconcile the theoretical models of computation (like von Neumann’s) that underpin all current practical computer designs with the fundamental constraints imposed by physical space-time. I think the value of doing so is most apparent at scales where the speed of light really matters: i.e. computer architecture (very small) and network infrastructure (very large).
Why do you need colocation or a CDN? Because of physical space: information can only propagate so fast. How do you model spatially situated information-processing systems? CA are a reasonable starting place. Interestingly, von Neumann understood this back in the forties, but that area of his work hasn’t flourished in the same way.
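To make the locality point concrete, here’s a minimal sketch (not from the thread; rule choice and widths are arbitrary) of a 1D elementary cellular automaton. Because each cell updates from only its immediate neighbors, influence from a single seed can spread at most one cell per step: a built-in “speed of light.”

```python
# Minimal 1D elementary CA sketch. Rule 90 is an arbitrary illustrative
# choice; any elementary rule has the same locality constraint.

def step(cells, rule=90):
    """One synchronous update with wrap-around edges.
    Each cell's next state depends only on (left, center, right)."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[(i - 1) % n]
        center = cells[i]
        right = cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right
        # The rule number's bits encode the output for each of the
        # 8 possible neighborhoods (Wolfram's rule-numbering scheme).
        out.append((rule >> neighborhood) & 1)
    return out

def run(width=31, steps=10, rule=90):
    """Evolve from a single live cell in the middle; return all rows."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        history.append(cells)
    return history

if __name__ == "__main__":
    for row in run():
        print("".join("#" if c else "." for c in row))
```

After t steps, every live cell is within distance t of the original seed, regardless of the rule chosen: information propagates no faster than one cell per step, which is exactly the kind of constraint a CDN or colocation strategy is working around at planetary scale.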