1. 31

  2. 7

    I like this because it doesn’t herald Forth so much. There is a good article from YosefK:

    Forth is not the language. Forth the language captures nothing, it’s a moving target. Chuck Moore constantly tweaks the language and largely dismisses the ANS standard as rooted in the past and bloated. Forth is the approach to engineering aiming to produce as small, simple and optimal system as possible, by shaving off as many requirements of every imaginable kind as you can.

    The article also describes why Forth is not successful. It just doesn’t scale to many people. Even Chuck could not build a web browser, which must be able to handle a range of standard formats (HTTP, SSL, HTML, CSS, JavaScript, JPG, GIF, PNG, SVG, MPEG, MP3, Unicode, …). What Chuck would do instead is described in the article:

    It’s about being brave enough to simplify our specifications of software to a manageable level of complexity.

    Well, if you can significantly reduce the requirements from your customers in a business environment, you should of course do that. Good luck trying to throw industry standards out the window, though.

    If we are talking about end users and free software, being like Chuck probably implies the suckless philosophy but most end users don’t like to recompile their window manager to change some configuration. They also use libwebkit instead of having their own simple browser.

    Being like Chuck means an extreme desire for simplicity, but at the cost of ease of use. If you believe Chuck’s stuff is easy, I dare you to get colorForth running and do something as simple as 1 dup +.
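    To make the 1 dup + example concrete: it is reverse Polish notation operating on a data stack (push 1, duplicate the top of the stack, add the top two items). Here is a minimal sketch of such an evaluator in Python, purely to illustrate the semantics; the function name and structure are mine, not anything from colorForth:

```python
# Toy Forth-style evaluator: words operate on a shared data stack.
def forth_eval(program, stack=None):
    stack = [] if stack is None else stack
    for word in program.split():
        if word == "dup":
            stack.append(stack[-1])          # duplicate top of stack
        elif word == "+":
            b, a = stack.pop(), stack.pop()  # pop two operands, push sum
            stack.append(a + b)
        else:
            stack.append(int(word))          # number literals are pushed
    return stack

print(forth_eval("1 dup +"))  # [2]
```

    In a real Forth, dup and + are primitive words and the stack is the machine’s working state rather than a Python list, but the dataflow is the same.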

    I find Forth interesting the way I find VPRI’s STEPS interesting: academically. Not relevant for real-world use.

    To end on a positive note: what we can learn from Chuck is that 90% of any piece of software is not essential to its actual goal. So if you really want to optimize something, you probably can. Caveat: you will have to sacrifice some -ilities, like maintainability, portability, extensibility, etc.

    1. 1

      The only real-world (above hardware) value I see in Forth is as an embedded language in some other tool, but there we already have Lua (speed) and TCL (executing untrustworthy code). The decision to use a Forth in this space must be weighed against the costs of using one of those preexisting languages.

      1. 1

        Why do you see TCL’s strength as executing untrustworthy code?

        I’ve been fascinated by TCL for a while now, but I have not seen that argument yet.

        1. 1

          TCL has extensions to help people who need to run code whose authors they can’t necessarily trust. Example document found on Google: https://www.astro.princeton.edu/~rhl/Tcl-Tk_docs/tcl8.0a1/safe.n.html
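          The mechanism behind that document: Tcl can create a child interpreter in which the dangerous commands (exec, open, file, …) are hidden, so untrusted code can still compute but can’t touch the system. Very loosely, the idea is a command whitelist, sketched here in Python; this is a toy of mine, not Tcl’s actual implementation:

```python
# Loose sketch of Tcl's "safe interp" idea: the child interpreter only
# dispatches whitelisted commands; everything else is an error.
SAFE_COMMANDS = {
    # toy expr: evaluates simple arithmetic with no builtins available
    "expr": lambda *args: eval(" ".join(args), {"__builtins__": {}}),
    "string": lambda sub, s: len(s) if sub == "length" else s,
}

def safe_eval(line):
    cmd, *args = line.split()
    if cmd not in SAFE_COMMANDS:
        raise PermissionError(f"command {cmd!r} is hidden in the safe interp")
    return SAFE_COMMANDS[cmd](*args)

print(safe_eval("expr 1 + 2"))  # 3
# safe_eval("exec ls") raises PermissionError: exec is not whitelisted
```

          In real Tcl this is `interp create -safe`, and the parent can selectively alias commands back into the child with mediation, which is the part that makes it practical.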

    2. 11

      Chuck Moore is the apex programmer. As far as I can tell, the man is one of, if not the, most underappreciated language designers and practical programmers. I think the problem with his ideas and Forth is that they seem too simple to people who’ve been taught complexity. His manner of thinking is so far from the norm that it seems absurd without knowing his history. The man deserves much more respect and acknowledgement than he receives. Shame colorForth and the subsequent arrayForth are a little too esoteric for most programmers. Shame they don’t have native Linux, BSD, and Windows implementations but instead run as OSes, natively or emulated. Our software ecosystem could be completely different, and considerably simpler.

      1. 6

        I totally agree with your comment about him being an underappreciated, world-class, elite engineer. I disagree with one part:

        “I think the problem with his ideas and Forth is they seem too simple”

        The number of people who pick up BASIC, Pascal, Go, Python, and maybe Lua argues against that. Moore’s flaw (or feature, depending on your view) is that he’s working against instead of with human nature. Factors that lead to success in new PLs:

        1. Familiarity. Share the syntax or semantics of stuff people already know. He’s using a stack machine, 18-bit words, macros, his own environment… mind-shattering for the majority of new programmers. Most people who learned a similarly-powerful language, Scheme, did so cuz schools forced it.

        2. Ecosystem. People want lots of libraries, tooling, etc. Forth probably isn’t where Java or .NET is on that. Might not even match Go.

        3. Herd behavior. People go where the masses are going. Forth mostly doesn’t, except in open firmware. There’s plenty of Forth written for that as a result.

        4. Big company with big ecosystem or lots of employees makes it their language. Java, .NET, Go, and Rust are all examples.

        Just lacking any one of these can hurt adoption. Lacking all four is a middle finger to the world. Depending on his goal, that might be really bad or really admirable. I think he just likes to live life his way offering others a chance to get on the ride but not take the wheel. ;)

        1. 8

          While I agree that each of these factors hurts adoption, I think the reason isn’t “human nature”. Moore philosophically approaches this problem from a hardware perspective. PL has largely moved to abstract areas for both ergonomic and mathematical reasons. The PL is seen as a tool to lower the barrier to entry and increase safety. Moore sees programming as adding minimal abstractions to the hardware and making the most of what he is given. The flawed analogy I like to use is the bicycle and the car. One (the car) uses complexity to give the operator maximum comfort and reliability in transportation. The other (the bicycle) works with the underlying physical platform to offer a minimal experience of transportation. One uses engineering to offer its guarantees, while the other relies on the underlying physical platform to offer its.

          As to why the DRY philosophy won over the Forth philosophy, I think that requires a very complicated look into the educational experiences of most programmers, the expectations of the profession, and the economics of producing software.

          1. 3

            I don’t see any reason why the DRY philosophy and the Forth philosophy should be seen as orthogonal.

          2. 4

            I would have expected you, as our local chief security evangelist (;), to be against the Forth philosophy. Forth, or rather Chuck, is roll-your-own, not-invented-here to the extreme. Forth is about not having an ecosystem. I would not want Chuck to build crypto code.

            1. 4

              I’m very against Forth for security-critical code. I’m against Forth in general. I do see why people, esp hobbyists or embedded folks, like or love it. I also respect Chuck Moore. So, I’m really just trying to be nice and helpful in a thread by a fan of Chuck and his work intended to celebrate both. Instead of doing my thing, I gave a constructive critique aimed at helping them do their thing with more uptake. :)

            2. 1
              1. Back in the 70s there were COBOL, ALGOL, and FORTRAN syntaxes. The only surviving one of those is ALGOL syntax, so I doubt Forth died due to syntax familiarity. Stack machines are much simpler to understand than current register-machine architectures, and I think it would be hard to argue against that. The points on 18-bit words etc. are quite irrelevant because they’re about arrayForth, for the F18A CPU GreenArrays designed - nothing like a usual Forth system.

              2. This I agree with. People love the library-based programming paradigm, because it’s how bad programmers are taught to code. And I agree libraries are useful, but it was never really part of the Forth philosophy to use libraries not included with your distribution - a shame. There have been attempts at Forth packaging systems, and there is a good attempt with the ForthNet, but nothing major. Forth doesn’t really have any “tooling” in the community - you write the code, you run the code.

              3. This is also accurate, and to be lamented - the masses have chosen the JVM and dotNET as the base platforms for the world of tomorrow. Forth, being held back by its esotericism and a community largely dedicated to meta-wanking (writing endless Forth implementations and system code, as opposed to application code), suffers on the community front too. #Forth on freenode has picked up lately though. But also, Forth is best for programmers who can write in a way that isn’t C-style - a skillset which is severely lacking, given that C-style is how the majority of programmers code. Look up “Dispelling the User Illusion” (Moore, 1999) or “Introduction to Thoughtful Programming and the Forth Philosophy” (Misamore, 2012) for thoughts on that.

              4. Again, accurate. Forth isn’t picked up by big companies for various reasons and so adoption is hurt.

              1. 3

                Ok, so we agree on most of the sad state of how things work. Thanks for the references in No. 3. I’m going to focus on No. 1, which is in dispute:

                “Back in the 70s there was COBOL, ALGOL, and FORTRAN syntax.”

                Back then, there were different types of computers which ran different types of languages - often whatever the hardware could handle and/or users could afford or build. The PDP generation had some lightweight languages. The fact that UNIX happened on that generation, in the C language, made C spread like wildfire. It got adopted by the Windows and Mac families. This created a whole fork of the ALGOL family… ALGOL -> CPL -> BCPL -> B -> proto-C’s -> C (with structs)… into simpler, low-level languages and high-level languages that were C-like or C-based, Java syntax being C-like and C++ being C-based. Once these went mainstream, with No. 3 showing marketing as a major driver, they became the status quo for industrial languages in many people’s minds. Such people may have never seen COBOL, ALGOL, or FORTRAN. Others did, but go where the bandwagons (and jobs and libraries) are.

                One might say something similar about scripting languages. I haven’t traced the history of that much. However, PHP, Python, and Ruby in particular got pretty famous. So, some new languages are copying their syntax/style, improving on their language features, and getting better adoption than really different languages with the same features. So, add them to the mix with the existing system and application languages in the C family. Forth is so unlike any of this that it can’t feel familiar. For me, the lesson was to start with a subset of a traditional, imperative 3GL with metaprogramming support, plus an interpreter for at least development.

                “Stack machines are much simpler to understand than current register machine architectures and I think it would be hard to argue against that.”

                I need data for that. All I have is anecdotes myself. I’ve watched many people learn about basic variables and stacks. The stacks confused the hell out of them. I found them weird, too. Whereas register/heap machines were intuitive for new people. Here’s the simplistic way I taught them: “Registers are the hardware version of variables. Your heap or RAM is like a hardware version of an array, list, or collection. Each is built with circuits with their own strengths and weaknesses. Registers are ultra-fast and ultra-expensive, leaving you just a few to work with. Heap/RAM is many times slower but cheap and large enough to store billions of objects. Your compiler moves your variables back and forth between the two. Optimizations try to put the stuff you use the most in the registers most of the time while minimizing the number of times they’re moved to/from RAM.” They always get this quickly.
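                To make the comparison concrete, here is a toy Python sketch of the same computation, (2 + 3) * 4, done both ways. The register names and the instruction list are invented for illustration, not taken from any real ISA:

```python
# Register style: named storage, explicit operands in every instruction.
def register_machine():
    r0, r1, r2 = 2, 3, 4     # load immediates into "registers"
    r0 = r0 + r1             # add r0, r1
    r0 = r0 * r2             # mul r0, r2
    return r0

# Stack style: operands are implicit; everything flows through one stack.
def stack_machine():
    stack = []
    for op in [("push", 2), ("push", 3), ("add",), ("push", 4), ("mul",)]:
        if op[0] == "push":
            stack.append(op[1])
        elif op[0] == "add":
            stack.append(stack.pop() + stack.pop())
        elif op[0] == "mul":
            stack.append(stack.pop() * stack.pop())
    return stack.pop()

assert register_machine() == stack_machine() == 20
```

                Whether the implicit-operand style reads as “simpler” or “confusing” seems to be exactly where the two of us part ways.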

                “The points on 18-bit words etc are quite irrelevant because they’re arrayForth, for the F18A CPU GreenArrays designed - nothing like a usual Forth system.”

                Maybe you have me there. I didn’t learn Forth. I just read that in a few write-ups by other people who used it. My favorite was Yosefk’s report.

                EDIT: Reskimming Yosefk’s report, I found another section that illustrates just how badass Chuck Moore is on top of how he thinks of hardware, software, and simplicity all together in a harmonious way.

                “In his chip design tools, Chuck Moore naturally did not use the standard equations:

                Chuck showed me the equations he was using for transistor models in OKAD and compared them to the SPICE equations that required solving several differential equations. He also showed how he scaled the values to simplify the calculation. It is pretty obvious that he has sped up the inner loop a hundred times by simplifying the calculation. He adds that his calculation is not only faster but more accurate than the standard SPICE equation. … He said, “I originally chose mV for internal units. But using 6400 mV = 4096 units replaces a divide with a shift and requires only 2 multiplies per transistor. … Even the multiplies are optimized to only step through as many bits of precision as needed.”
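                The “6400 mV = 4096 units” remark is a fixed-point idiom: pick the full-scale value to be a power of two, and division by full scale becomes a right shift. A sketch of the general idea follows; the function names are mine, and how OKAD actually maps its numbers isn’t in the quote:

```python
# Fixed-point scaling trick: 4096 = 2**12, so "divide by full scale"
# is a 12-bit right shift instead of an integer divide.
FULL_SCALE_MV = 6400
FULL_SCALE_UNITS = 4096          # 2**12, chosen deliberately

def mv_to_units(mv):
    # one multiply and one divide, paid once at the input boundary
    return (mv * FULL_SCALE_UNITS) // FULL_SCALE_MV

def fraction_of_full_scale(units, x):
    # (units / full_scale) * x, computed with a shift in the inner loop
    return (units * x) >> 12

v = mv_to_units(3200)                   # half of full scale -> 2048 units
print(fraction_of_full_scale(v, 1000))  # 500
```

                The divide survives only in the one-time conversion; every per-transistor calculation in the inner loop gets the shift, which is the speedup he is describing.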

          3. 1

            I’m still trying to find a primary reference for the quote about the Forth VLSI compiler being too limited to design a “real” microprocessor - but that was okay, because it was enough to design a CPU capable of running Forth… any hints?

            1. 1

              I showed Chuck Moore’s description of that to an ASIC hacker. He said it was very simplistic – no way you could do with that what people are doing with standard tools.

              So we would have to ask Yossi Kreinin who that ASIC hacker is.

              1. 1

                Although I can’t even write a counter in Verilog, I do know a lot about EDA processes, since I collected and skimmed tons of papers on hardware to have something ready for open-source attempts at EDA. One site described it as a whole series of NP-hard problems you have to solve with multi-variable optimization. That looks correct based on what I found. The design rules to enforce and problems to solve go up with each process shrink. I think it was a few hundred on the nodes Moore uses. It’s about 2,500 on 28nm. It gets to the point that they have to do things like image recognition on the circuits to find patterns that do weird things, then try to rewire each into equivalent circuits that lack those patterns. That’s just one or a few rules.

                I’m not even sure someone could hold in their head a design of significant size on recent nodes. All those rules, along with their interactions, must add huge complexity even on a stack processor. Abstraction is critical on those nodes. Most designers just synthesize the RTL from high-level descriptions to manage complexity. I think that’s the right thing to do, too, since we shouldn’t have to understand a pile of booleans. They’re effectively meaningless to a human, whereas the high-level ASMs or FSMs do have meaning. So, they do high-level stuff with multiple stages of synthesis, optimization, verification, testing, and inspection.