1. 3

    Does anyone here use / love D? I’ve not taken the time to learn it, but from what I’ve seen it seems like I might enjoy it more than Go but still less than Rust.

    1. 2

      I like it. Imho it is a decent language with nice features.

      1. 2

        I really love it. I blogged about my attempt to really learn it in earnest:


      1. 3

        Support for loops in my toy programming language tinySelf.

        I am kind of stuck on this one, because it is a “bytecode and stack” programming language, and loops are just messages sent to a block (an anonymous “lambda” closure object), repeated as long as the block evaluates to true (or false):

        [a = b] whileTrue: [do something]

        “Primitive” methods (implemented in “native code”) can’t get the result of a block evaluation, and there is no support for jumps in the bytecode (which I also consider quite inelegant). An implementation using recursion would eat up the whole stack without tail call optimization, which is something I want to avoid right now. So I am not sure how to implement it. I am inclined to use a Forth-like instruction stack, which would take precedence over the bytecode-crunching loop, so a primitive could put instructions there like “get the result of the evaluation of this block, evaluate the block with the body, and then call this primitive again”. But I have not yet decided whether this is what I want.
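        For what it’s worth, the instruction-stack idea can be sketched as a trampoline. A minimal sketch in plain Python, where all the names (`run`, `while_true`, `step`) are my invention and only stand in for tinySelf’s actual machinery:

```python
# Hypothetical sketch (all names are mine, not tinySelf's internals) of the
# "Forth-like instruction stack": the interpreter's outer loop pops pending
# instructions, which take precedence over normal bytecode crunching.
# Evaluating a block is modelled as calling a Python function; in the real
# VM it would schedule the block's bytecode instead.

def run(instruction_stack):
    """Pop and execute pending instructions until none are left."""
    while instruction_stack:
        instruction = instruction_stack.pop()
        instruction(instruction_stack)

def while_true(condition_block, body_block):
    """Primitive for `cond whileTrue: body`, with no recursion or jumps."""
    def step(stack):
        if condition_block():   # "get result of the evaluation of this block"
            body_block()        # "evaluate block with body"
            stack.append(step)  # "then call again this primitive"
    return step

# Count state["i"] from 0 up to 5; the native stack stays flat no matter
# how many iterations run, so no tail call optimization is needed.
state = {"i": 0}
run([while_true(lambda: state["i"] < 5,
                lambda: state.__setitem__("i", state["i"] + 1))])
```

        The point is that the pending work lives on an explicit stack the primitive can push to, so neither the host language’s call stack nor bytecode jumps are involved.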

        So, my weekend plans are to think about this, maybe draw a few diagrams, or try a few approaches and decide what suits me best.

        Other than that, I would also like to do some writing; I have some stories and blog posts which I would like to finish.

        1. 1

          Oh, interesting puzzle. Would it be hard to implement TCO? It seems like the right answer.

          1. 3

            Continue to crunch issues in my pet language tinySelf. I have to implement exceptions and exception handling.

              1. 1


                +1 for CherryTree

              1. 7

                VS Code has way too much telemetry built in for my liking. Also, there’s pretty much no way to turn it off completely.

                1. 8

                  You should seriously look at the napoleon / Google docs: https://sphinxcontrib-napoleon.readthedocs.io/en/latest/

                  This is an already implemented and supported standard.

                  1. 3

                    Yes indeed. For those unfamiliar, here are (from the link) examples of the two docstring styles that Napoleon (a Sphinx extension) parses and renders. PyCharm, too, parses Numpy and Google docstrings and uses the information for tooltips, static analysis, etc.

                    Google style:

                        """Summary line.
                        Extended description of function.
                            arg1 (int): Description of arg1
                            arg2 (str): Description of arg2
                            bool: Description of return value
                        return True

                    NumPy style:

                        """Summary line.
                        Extended description of function.
                        arg1 : int
                            Description of arg1
                        arg2 : str
                            Description of arg2
                            Description of return value
                        return True
                    1. 2

                      Hmm, Google’s style + the napoleon extension does seem quite good. I wonder if I should update my style guide. I suggested there that you should just bite the bullet and use Sphinx style due to the doc auto-gen benefits, but it seems like this is the best of both worlds.

                      1. 2

                        Nice style guide! Changing it to recommend NumPy and/or Google style over Sphinx style gets a big +1 from me; it’s what I teach students myself. We’re all data analysts/statisticians (‘data scientists’), so in our case we use NumPy style, which is also used by SciPy, Pandas, and scikit-learn (and certainly others, too).

                        1. 1

                          I have been using Google’s style for quite a few years and I have to say that I haven’t yet seen anything better. So a yes from me.

                    1. 3
                      • Implement a project for the second part of the SpaceKnow interview process.
                      • Continue putting together my own programming language (tinySelf).
                      1. 4

                        I dog-sat my brother’s 7-month-old Australian Shepherd. Never again. Now that that’s done, kids’ swimming classes in half an hour, then we’re receiving friends this afternoon. I’d like to get some coding done tonight, maybe. That or more LinkedIn Learning.

                        Tomorrow, I can’t remember what’s up. Probably gonna read me some more Lovecraft. Maybe something more technical. Definitely some reading.

                        1. 1

                          I dog-sat my brother’s 7-month-old Australian Shepherd. Never again.

                          Hah. I walked this pretty boy yesterday. What problems did you have?

                          1. 1

                            I have two older rescue dogs, one of which has severe arthrosis of the elbows, preventing her from even fleeing a situation. The other one is much smaller and couldn’t just make the guest dog stop. So I had 48 hours of dog bickering to manage, because the guest dog is much younger and more playful, and did not get the cue that the other dogs wanted nothing to do with that.

                            Full disclosure, I probably suck at dogs.

                          1. 17

                            Considering harmful considered harmful.

                              1. 2

                                Yes lol this was 100% tongue-in-cheek

                              1. 12

                                He didn’t really answer the question though :(

                                I think they’re considered in opposition as a historical thing. While objects entered heavy use in the ’80s, the paradigm of “everything is an object” started with Java in the mid-’90s. Java rapidly became the most popular language, and functional languages started representing themselves as “not OOP”. Note that before the Java era we had CLOS and OCaml, both of which are functional languages with objects.

                                1. 5

                                  You are right, he didn’t answer it! He answered “Is FP in opposition to OO”. I think your answer is pretty accurate. People confused C++ and Java as OOP (instead of recognizing them for what they were, Class Based Programming). And because these languages mutated state, FP is in opposition to them, and therefore OOP.

                                  I think more importantly, the pop culture has no idea what OOP is and therefore people are confused when they think FP is in opposition to OOP.

                                  1. 5

                                    I think more importantly, the pop culture has no idea what OOP is and therefore people are confused when they think FP is in opposition to OOP.

                                    I don’t think it’s fair to say that the “pop culture” doesn’t know what “OOP is”, because there really isn’t a definition of OOP. A lot of people equate it with Smalltalk, but you could also say OOP is Eiffel, or Ada, or Simula…

                                    1. 3

                                      People confused C++ and Java as OOP (instead of recognizing them for what they were, Class Based Programming).

                                      I don’t really think that classes are the problem*. They were not just Class Based Programming, but imperative Class Based Programming inspired by C. If you look at Smalltalk (which is also class based), the missing component is late binding, which allows you to do all kinds of neat stuff and a cleaner style of programming (imho).
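                                      As a generic illustration of what late binding buys you (plain Python, not tied to any particular language discussed here): the method that runs is looked up by name at send time, and any object that understands the message will do, with no common base class required.

```python
class Logger:
    def handle(self, event):
        return "logged: " + event

class Alerter:
    def handle(self, event):
        return "ALERT: " + event

def send(receiver, selector, *args):
    # The lookup happens here, at run time; nothing about `receiver`'s
    # class is known (or needs to be known) before this call.
    return getattr(receiver, selector)(*args)

print(send(Logger(), "handle", "disk full"))   # logged: disk full
print(send(Alerter(), "handle", "disk full"))  # ALERT: disk full
```

                                      Since nothing is resolved until the send, the receiver can even be swapped at run time, which is the kind of flexibility Smalltalk-style systems lean on.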

                                      *Although I really like Self, which is basically a prototype-based Smalltalk-like system.

                                      1. 3

                                        Unfortunately, to most people, class-based programming and OOP are the same.

                                        1. 5

                                          I don’t know if “most” people do, but there is certainly a decent collection of people out there who think this. Consider this document (“Object-Oriented Programming in C”, revised December 2017), which starts out with this:

                                          Object-oriented programming (OOP) is not the use of a particular language or a tool. It is rather a way of design based on the three fundamental design meta-patterns:

                                          • Encapsulation – the ability to package data and functions together into classes
                                          • Inheritance – the ability to define new classes based on existing classes in order to obtain reuse and code organization
                                          • Polymorphism – the ability to substitute objects of matching interfaces for one another at run-time
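                                          For the curious, all three meta-patterns fit in a few lines of Python (a generic illustration of the definitions above, not taken from the linked document, which demonstrates the same ideas in C):

```python
class Shape:                      # encapsulation: data and functions packaged together
    def __init__(self, name):
        self._name = name         # "private" by convention

    def describe(self):
        return self._name + ": area " + str(self.area())

class Square(Shape):              # inheritance: reuses Shape's describe()
    def __init__(self, side):
        super().__init__("square")
        self.side = side

    def area(self):
        return self.side * self.side

class Circle(Shape):
    def __init__(self, radius):
        super().__init__("circle")
        self.radius = radius

    def area(self):
        return round(3.14159 * self.radius ** 2, 2)

# Polymorphism: objects with matching interfaces substitute for one
# another at run time.
for shape in (Square(2), Circle(1)):
    print(shape.describe())       # square: area 4, then circle: area 3.14
```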
                                          1. 1

                                            most people I have met while programming professionally in New Zealand.

                                            1. 1

                                              Inheritance – the ability to define new classes based on existing classes in order to obtain reuse and code organization

                                              I think this is universally accepted as an anti-pattern, by OO and FP programmers alike.

                                            2. 2

                                              I think the way most C++, Java, and .NET programmers code supports your position. At least, that’s how most code I’ve seen works, looking at code from… everywhere. Whatever my sample bias is, it’s not dependent on any one location. The bad thinking clearly spread along with the languages and tools themselves.

                                        1. 2

                                          The slug in the URL here has somehow become notes-of-cpython-lists, when it should actually be notes-on-cpython-lists; note the of vs. on.

                                            1. 1

                                              Actually decent article.

                                            1. 3

                                              Really good books:

                                              More serious kind:

                                              Kinda good:

                                              + more, but not worth recommending.

                                                1. 1

                                                  What? What is it? What potential? How can I use it, and why should I? This article provides no answers.

                                                  1. 1

                                                    9 meme gifs in one article.

                                                    1. 2

                                                      I think I will stay with Python.

                                                      1. 29

                                                        I think one of the biggest “secrets” is the debugging technique or method. It is kind of similar to the scientific method (hypothesis, test, evaluation of results), but it is never explicitly described; it is just something you have to pick up as you go. That is really what separates those who can program anything from those who can’t.

                                                        1. 13

                                                          I would agree with this, specifically the part about generating hypotheses about what might be wrong, testing and evaluating them, and, when they’re wrong, rejecting them and coming up with new ones. Without experience, it’s hard to generate hypotheses when there are no hints and nobody who knows more than you about the problem. It also seems to be hard to be objective about your hypothesis, to seek to prove whether it’s right or wrong, and to reject it if it’s wrong. These are the things you can’t learn just by reading about them.

                                                          1. 8

                                                            It is frustrating.

                                                            We have an entire Internet or two of “programming tutorials” that frequently leave out all of the problems and mistakes the authors made while writing them, perhaps believing that including them would make them seem like less of an expert. I’d like to see more things like this gem (see the mistakes at the bottom).

                                                            We also have a computer science curriculum which still seems to pretend (at least at the beginning) that all instructions are equal and memory is fast, and which “teaches” binary trees and probed hash tables as “data structures”. But whatever. How do you know this hasn’t degenerated into a linked list? Debugging, meaning single-stepping, mental simulation of an algorithm, and printf, seems remarkably absent from any CS curriculum.

                                                            However, I don’t think the scientific method is quite right. Peirce believed that too much rigour (or, as he put it, “stumbling ratiocination”) was inferior to sentiment, and that the scientific method was best suited to theoretical research. For more on this subject, see the pragmatic theory of truth, but a little background for my argument should be enough: Peirce outlined four methods of settling an opinion:

                                                            1. Tenacity: sticking to one’s initial belief brings comfort and decisiveness, but ignores contrary information.
                                                            2. Authority, which overcomes disagreements, but sometimes brutally.
                                                            3. The a priori method, which promotes conformity less brutally, but fosters opinions as something like tastes. It depends on fashion in paradigms, and while it is more intellectual and respectable, it sustains accidental and capricious beliefs.
                                                            4. The scientific method, which obviously excels the others by being deliberately designed to arrive, eventually, at the most secure beliefs, upon which the most successful practices can be based.

                                                            Now, we still see a lot of “programming wisdom and lore” which people follow because they always have, or because some blog said so. I’d argue syntax highlighting and oh-my-zsh are fashions, and I’d laugh at anyone who believed the scientific method could demonstrate these tools are ideal.

                                                            So what then? Well, it means we have pseudo-science in our programming.

                                                            It’s for this reason that I maintain that we (as a society) don’t know how to program computers, let alone teach anyone how to program (and therefore debug). I predict this will mature over the next couple hundred years or so, but I don’t expect anyone in my lifetime to be able to teach programming itself the way, for example, we can teach bridge-building.

                                                            1. 1

                                                              “I don’t expect anyone in my lifetime to be able to teach programming itself, the way, for example, we can teach bridge-building.”

                                                              We’ve been doing it for a while if you keep the structuring simple. The first is an iterative method for doing that at low cost that combines things like Lego blocks. Even students get a low defect rate on quite-maintainable code. The second adds formal specifications and verifiable code to drive predictability up and defects further down. The third, combined with error-handling techniques and automated testing, is pretty good at dealing with stuff too complex for the rest. So, I’d say we can do quite a bit of what you describe; it’s mostly just not applied.




                                                              1. 2

                                                                I’m not sure I understand what you’re saying. Maybe you don’t understand what I’m saying.

                                                                Clean room doesn’t help a programmer understand what’s wrong with:

                                                                memcpy(a, b, c*d);

                                                                All of these examples advise not introducing bugs in the first place – sensible advice for the novice, for sure, but how exactly does that teach us how to debug programs?

                                                                What constitutes a bug in the first place? Eiffel sounds great not having any bugs in it, but what exactly does that mean?

                                                                Is “DISK 0K” a bug? There’s a wonderful story of tech support getting a report along the lines of “I’m getting an error message about my disk being full, but the computer says it’s OK”.

                                                                If you’re saving a big (multi-gigabyte) CAD drawing to disk and run out of space ten minutes in, should the system generate an error telling you to quit, delete some files, and try again later? We use multitasking systems, so why not pause and give the user the option to retry or fail? They can delete some files if they want to…

                                                                What about an email server? What if it runs out of disk space? Should it reject a message? Or could we pause and let the client time out while we page the sysadmin/operator?

                                                                Writing software is in part being able to say what you mean (implement), and in part being able to mean what the business means (specify), but it’s also clearly (still) a matter of taste, because we don’t have good science to point to that gives us the answers to these questions.

                                                                In contrast, have you seen bridge engineering handbooks? Pretty much every consideration you might have when you need to build a bridge is documented and well researched, in a way that makes software development professionals look like Lego builders.

                                                            2. 2

                                                              Two great talks on the topic of debugging:

                                                              Stu Halloway on Debugging with the Scientific Method

                                                              Bryan Cantrill on Debugging Under Fire: Keep your Head when Systems have Lost their Mind

                                                              What I find interesting is that, despite their different backgrounds and presentation styles, their advice has a lot of similarity.