This topic is very interesting to think about.
One key question, missed by the article, is: What will computers look like in 100 years? Massively parallel is a given, but beyond that, will it be higher performance to tightly couple memory to the processor and have NUMA-esque interconnects, or will we have massive processor complexes connected to tebibytes of RAM?
One interesting bit from Mill Computing is that they are designing the processor architecture and instruction set around the constraints of silicon lithography! This leads to choices like having two separate instruction encodings, so that they can have two simpler (and smaller) decode units, coupled with smaller and faster (due to their locality to the decoders) instruction caches. The video series goes into much, much more detail and is worth a watch for anyone interested in computer architecture.
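To make the “two simpler decoders” idea concrete, here is a toy software analogy; the formats below are invented for illustration and are nothing like the Mill’s actual encodings. The point is that when each stream has its own small encoding, each decoder stays trivial:

```typescript
// Toy analogy for split instruction streams: two small encodings, each
// with its own trivial decoder, instead of one decoder handling both.
// These formats are hypothetical -- not the Mill's real encodings.

// "Compute-side" ops: 8-bit opcode + two 4-bit register fields.
function decodeExu(word: number): { op: number; a: number; b: number } {
  return { op: (word >> 8) & 0xff, a: (word >> 4) & 0xf, b: word & 0xf };
}

// "Flow-side" ops: 4-bit opcode + 12-bit branch offset.
function decodeFlow(word: number): { op: number; offset: number } {
  return { op: (word >> 12) & 0xf, offset: word & 0xfff };
}

// Each decoder sees only its own stream, so each stays simple, and each
// stream can be fed from its own small, nearby instruction cache.
const exuStream = [0x0112, 0x0234];
const flowStream = [0x1005];
console.log(exuStream.map(decodeExu), flowStream.map(decodeFlow));
```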
And then we need to talk about truly 3D processor architectures, because we won’t be using silicon lithography forever.
And then we need to ask, who is writing programs 100 years from now? Mostly other computers. So restricting program keywords to English text, saving code in files, in-line comments, and such may all go by the wayside. Does it make more sense for all the code to be in some kind of database instead? Will there be any code reuse in the conventional sense, or will AGI just custom-write each separate program, making things ultra-optimized?
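As a hedged sketch of the code-in-a-database idea (the record shape and names here are hypothetical, not taken from any existing system), code units might become content-addressed structured values rather than named text files:

```typescript
// Sketch: code stored as content-addressed records in a database instead
// of text files. The schema and all names here are hypothetical.
import { createHash } from "node:crypto";

interface CodeUnit {
  ast: unknown;                     // structured code, not source text
  dependencies: string[];           // hashes of other units, instead of imports
  metadata: Record<string, string>; // docs, provenance, optimizer hints
}

const store = new Map<string, CodeUnit>(); // stand-in for a real database

// Content addressing: a unit's identity is the hash of its structure, so
// "reuse" becomes deduplication, and lookups need no file paths or names.
function put(unit: CodeUnit): string {
  const hash = createHash("sha256").update(JSON.stringify(unit)).digest("hex");
  store.set(hash, unit);
  return hash;
}

const addId = put({
  ast: { op: "add", args: ["a", "b"] },
  dependencies: [],
  metadata: { doc: "sum of two operands" },
});
console.log(addId.slice(0, 12), store.get(addId));
```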
It’s very bold to believe that computers will still exist in 100 years. They might exist in parts of the world lucky enough to have, in one place, the necessary resources, the know-how, and an economy that can make use of them, so that simple controllers can be produced locally; but it’s clear that personal computers won’t survive the ongoing collapse of industrial civilization. Maybe some will still be around in 50 years, but 100 years is way too much.
A nice paper on the subject: https://kurti.sh/pubs/unplanned_limits17.pdf
Oh, we’re certainly racing towards the cliff of un-sustainability. I’m fairly convinced, one way or the other, that human civilization won’t exist as we know it now by 2100. That might turn out well, or there might be civilizational collapse from ecological or other factors. Or a robot uprising.
LISP, another of the oldest, doesn’t care a whit about it and thrives on flexibility and ease of implementation. So there’s one in the “no” column.
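For context on the “ease of implementation” claim: a minimal Lisp core really does fit in a few dozen lines. Here is a toy sketch in TypeScript (not any particular dialect’s semantics), worth contrasting with the effort of implementing a full standard like Common Lisp’s:

```typescript
// A toy Lisp core (tokenize, parse, eval) to show how small a *minimal*
// Lisp can be. Implementing full Common Lisp is a very different project.
type Expr = number | string | Expr[];
type Env = Map<string, any>;

function tokenize(src: string): string[] {
  return src.replace(/\(/g, " ( ").replace(/\)/g, " ) ").trim().split(/\s+/);
}

function parse(tokens: string[]): Expr {
  const tok = tokens.shift();
  if (tok === "(") {
    const list: Expr[] = [];
    while (tokens[0] !== ")") list.push(parse(tokens));
    tokens.shift(); // consume ")"
    return list;
  }
  const n = Number(tok);
  return Number.isNaN(n) ? (tok as string) : n;
}

function evaluate(expr: Expr, env: Env): any {
  if (typeof expr === "number") return expr;          // literal
  if (typeof expr === "string") return env.get(expr); // variable lookup
  const [head, ...args] = expr;
  if (head === "if") return evaluate(evaluate(args[0], env) ? args[1] : args[2], env);
  if (head === "lambda") {
    const params = args[0] as string[];
    const body = args[1];
    return (...vals: any[]) => {
      const local: Env = new Map(env); // lexical scope via copied environment
      params.forEach((p, i) => local.set(p, vals[i]));
      return evaluate(body, local);
    };
  }
  const fn = evaluate(head, env); // function application
  return fn(...args.map((a) => evaluate(a, env)));
}

const env: Env = new Map<string, any>([
  ["+", (a: number, b: number) => a + b],
  ["*", (a: number, b: number) => a * b],
]);

console.log(evaluate(parse(tokenize("((lambda (x) (* x x)) (+ 3 4))")), env)); // 49
```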
This is nonsense. Common Lisp (mentioned here as a ‘bigger dialect’ with ‘momentum’) makes a number of compromises for the sake of performance, and is very difficult to implement (especially if you would like to bootstrap), both for the breadth and the depth of its features.
Common Lisp was No. 3 on the TIOBE index in the ’80s, if that’s a relevant measure of “popularity”.
On CL performance: it does care. CL can be faster than Rust, even when run from source. https://www.reddit.com/r/lisp/comments/owpedp/my_battle_to_beat_common_lisp_and_java_in_rust_on/
You might be thinking of Scheme?
Disagree! lol. Especially when you look at the ecosystem. For stability (alongside evolution) -> Common Lisp ;)
I mean, how many types of physical tools do we have in the world today that have gone essentially unchanged for 100 years? A lot of the simpler ones such as basic hand tools, sure, but not too many of the bigger and more complicated ones.
Even a hammer is pretty different: the handle is now typically metal rather than wood, it has a contoured rubber grip for easier use, and the shape of the head is completely different from even 50 years ago (the two-pronged nail remover is a surprisingly recent invention).
That said, someone familiar with a thousand-year-old hammer would easily be able to use a modern one. I wonder what the equivalent is for programming languages.
A hundred years is shortsighted, imo. Pythagoras’s theorem is still true (as are many other arguments from antiquity). I think we can solve the problem of metaprogramming sufficiently well that progress will happen within that system in perpetuity. My opinion here is obviously biased by the fact that I am attempting to find this foundation: datalisp.is.
JavaScript is an interesting case (I would be interested to hear the author’s take). Even though it’s newer than a number of the other examples, it is very widely used, and browsers still have to maintain backward compatibility with it. If we still have the same core web technologies underpinning everything, then it’s not hard to believe we’d still have JavaScript support available in some guise, for better or worse.
I see JS as a general-purpose language. It’s definitely the dominant language in client apps, widely used in web servers, and is moving into embedded systems too.
It’s evolved quite smoothly, given its humble origins. ES2015 is very comfortable to use, modulo some warts that can’t be removed (the weirdness of “this”, the behavior of “for-in”, etc.). And TypeScript was able to layer a very powerful static type system onto it without disturbing the underlying language at all.
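A small sketch of both points, using only standard ES2015/TypeScript behavior: the first half shows the “this” and “for-in” warts, the second shows TypeScript annotations that erase to plain JavaScript:

```typescript
// The "this" wart: a method loses its receiver when detached.
const counter = {
  n: 0,
  inc() { this.n++; },
};
const f = counter.inc;
// f(); // TypeError in strict mode: `this` is undefined here, not `counter`
const g = counter.inc.bind(counter); // the usual workaround
g();

// The "for-in" wart: it iterates property *keys* (as strings), including
// inherited enumerable ones -- usually you want for-of over values.
const xs = [10, 20, 30];
for (const k in xs) console.log(typeof k, k); // "string" "0", "string" "1", ...
for (const v of xs) console.log(v);           // 10, 20, 30

// TypeScript layers static types on top without changing runtime behavior:
// the annotations below are erased at compile time, leaving plain ES2015.
function total(values: number[]): number {
  return values.reduce((acc, v) => acc + v, 0);
}
console.log(total(xs)); // 60
```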
But I don’t think there will be programming languages as we know them in 100 years. If humans are still telling computers what to do*, it will be in higher-level ways that aren’t reliant on strict syntax, because the computers will be smart enough to understand natural language. We’ll still need to describe things with precision, but it might read more like legal documents.
* that is, unless the computers are telling us what to do, or they’ve exterminated us, or we’ve bombed ourselves back to pre-industrial levels and there are no computers left.