1. 3
    • Implement a project for the second part of the SpaceKnow interview process.
    • Continue putting together my own programming language (tinySelf).
    1. 4

      I dog-sat my brother’s 7-month-old Australian Shepherd. Never again. Now that that’s done, kids’ swimming classes in half an hour, then we’re receiving friends this afternoon. I’d like to get some coding done tonight, maybe. That, or more LinkedIn Learning.

      Tomorrow, I can’t remember what’s up. Probably gonna read me some more Lovecraft. Maybe something more technical. Definitely some reading.

      1. 1

        I dog-sat my brother’s 7-month-old Australian Shepherd. Never again.

        Hah. I walked this pretty boy yesterday. What problems did you have?

        1. 1

          I have two older rescue dogs, one of which has severe arthrosis of the elbows, preventing her from even fleeing a situation. The other one is much smaller and couldn’t just make the guest dog stop. So I had 48 hours of dog bickering to manage, because the guest dog is much younger and more playful, and did not get the cue that the other dogs wanted nothing to do with that.

          Full disclosure, I probably suck at dogs.

        1. 17

          Considering harmful considered harmful.

            1. 2

              Yes lol this was 100% tongue-in-cheek

            1. 12

              He didn’t really answer the question though :(

              I think they’re CONSIDERED in opposition as a historical thing. While objects entered heavy use in the ’80s, the paradigm of “everything is an object” started with Java in the mid-’90s. Java rapidly became the most popular language, and functional languages started representing themselves as “not OOP”. Note that before the Java Era we had CLOS and OCaml, both of which are functional languages with objects.

              1. 5

                You are right, he didn’t answer it! He answered “Is FP in opposition to OO”. I think your answer is pretty accurate. People confused C++ and Java with OOP (instead of recognizing them for what they were: Class Based Programming). And because these languages mutate state, FP is in opposition to them, and therefore to OOP.

                I think more importantly, the pop culture has no idea what OOP is and therefore people are confused when they think FP is in opposition to OOP.

                1. 5

                  I think more importantly, the pop culture has no idea what OOP is and therefore people are confused when they think FP is in opposition to OOP.

                  I don’t think it’s fair to say that the “pop culture” doesn’t know what “OOP is”, because there really isn’t a definition of OOP. A lot of people equate it with Smalltalk, but you could also say OOP is Eiffel, or Ada, or Simula…

                  1. 3

                    People confused C++ and Java as OOP (instead of recognizing them for what they were, Class Based Programming).

                    I don’t really think that classes are the problem*. They were not just Class Based Programming, but imperative Class Based Programming inspired by C. If you look at Smalltalk (which is also Class Based), the missing component is late binding, which allows you to do all kinds of neat stuff and a cleaner style of programming (imho).

                    *Although I really like Self, which is basically a prototype-based, Smalltalk-like system.

                    1. 3

                      Unfortunately, to most people, class-based programming and OOP are the same thing.

                      1. 5

                        I don’t know if “most” people do, but there is certainly a decent collection of people out there who think this. Consider this document (“Object-Oriented Programming in C”, revised December 2017), which starts out with this:

                        Object-oriented programming (OOP) is not the use of a particular language or a tool. It is rather a way of design based on the three fundamental design meta-patterns:

                        • Encapsulation – the ability to package data and functions together into classes
                        • Inheritance – the ability to define new classes based on existing classes in order to obtain reuse and code organization
                        • Polymorphism – the ability to substitute objects of matching interfaces for one another at run-time
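
                        A minimal sketch of those three meta-patterns in plain C (all names here are illustrative, not taken from the linked document):

                        ```c
                        #include <assert.h>

                        /* Encapsulation: state and behaviour packaged together in a struct. */
                        typedef struct Shape Shape;
                        struct Shape {
                            int (*area)(const Shape *self);   /* "virtual" method slot */
                        };

                        /* Inheritance: Rect extends Shape by embedding it as its first member. */
                        typedef struct {
                            Shape base;
                            int w, h;
                        } Rect;

                        static int rect_area(const Shape *self) {
                            const Rect *r = (const Rect *)self; /* valid: Shape is the first member */
                            return r->w * r->h;
                        }

                        static Rect rect_make(int w, int h) {
                            Rect r = { { rect_area }, w, h };
                            return r;
                        }

                        /* Polymorphism: this call site knows only about Shape. */
                        static int shape_area(const Shape *s) {
                            return s->area(s);
                        }
                        ```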
                        1. 1

                          most people I have met while programming professionally in New Zealand.

                          1. 1

                            Inheritance – the ability to define new classes based on existing classes in order to obtain reuse and code organization

                            I think this is universally accepted as an anti-pattern, by both OO and FP programmers.

                          2. 2

                            I think how most C++, Java, and .NET programmers code supports your position. At least, it matches most of the code I’ve seen, looking at code from… everywhere. Whatever my sample bias is, it’s not dependent on any one location. The bad thinking clearly spread along with the languages and tools themselves.

                      1. 2

                        The slug in the URL here has somehow become notes-of-cpython-lists, when it should actually be notes-on-cpython-lists; note the of vs. on.

                          1. 1

                            Actually decent article.

                          1. 3

                            Really good books:

                            More serious kind:

                            Kinda good:

                            + more, but not worth recommending.

                              1. 1

                                What? What is it? What potential? How can I use it, and why should I? This article provides no answers.

                                1. 1

                                  9 meme gifs in one article.

                                  1. 2

                                    I think I will stay with python.

                                    1. 29

                                      I think one of the biggest “secrets” is the debugging technique or method. It is kind of similar to the scientific method (hypothesis, test, evaluation of the result), but it is never explicitly described; it is just something you have to pick up as you go. That really separates those who can program anything from those who can’t.

                                      1. 13

                                        I would agree with this, and specifically the part about generating hypotheses about what might be wrong, how to test and evaluate them, and, if they’re wrong, rejecting them and coming up with new ones. It’s hard without experience to generate hypotheses when there are no hints and nobody who knows more than you about the problem. It also seems to be hard to be objective about your hypothesis, seek to prove whether it’s right or wrong, and reject it if it’s wrong. These are the things you can’t learn by reading about them.
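
                                        One mechanical instance of that hypothesis-test loop is bisecting a history of changes (the idea behind git bisect). A toy sketch, with a made-up is_broken predicate standing in for the real test:

                                        ```c
                                        #include <assert.h>

                                        /* Hypothesis-test debugging as bisection: given that revision `lo`
                                           is good and `hi` is bad, each probe is a hypothesis ("the bug
                                           appeared by revision mid") that the test confirms or rejects,
                                           halving the suspect range either way. */

                                        typedef int (*broken_fn)(int revision);

                                        static int first_bad_revision(int lo, int hi, broken_fn is_broken) {
                                            /* invariant: is_broken(lo) == 0 and is_broken(hi) == 1 */
                                            while (hi - lo > 1) {
                                                int mid = lo + (hi - lo) / 2;
                                                if (is_broken(mid))
                                                    hi = mid;  /* confirmed: bug is at or before mid */
                                                else
                                                    lo = mid;  /* rejected: bug came after mid */
                                            }
                                            return hi;
                                        }

                                        /* hypothetical predicate: the bug was introduced in revision 42 */
                                        static int demo_is_broken(int revision) { return revision >= 42; }
                                        ```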

                                        1. 8

                                          It is frustrating.

                                          We have an entire Internet or two of “programming tutorials” that frequently leave out all of the problems and mistakes that the author made while trying to write it – believing perhaps this makes them seem like less of an expert. I’d like to see more things like this gem (see the mistakes at the bottom).

                                          We also have a computer science curriculum which still seems to pretend (at least at the beginning) that all instructions are equal and memory is fast, and which “teaches” binary trees and probed hash tables as “data structures”. But whatever. How do you know this hasn’t degenerated into a linked list? Debugging seems remarkably absent from any CS curriculum, beyond single-stepping, mental simulation of an algorithm, and printf.
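
                                          That linked-list question can be made concrete: feed a naive, unbalanced BST keys that are already sorted and its depth grows linearly instead of logarithmically. A small sketch:

                                          ```c
                                          #include <assert.h>
                                          #include <stdlib.h>

                                          /* A textbook (unbalanced) binary search tree. Inserting sorted
                                             keys makes every node hang off the right child: the "tree"
                                             degenerates into a linked list of depth n. */

                                          typedef struct Node {
                                              int key;
                                              struct Node *left, *right;
                                          } Node;

                                          static Node *insert(Node *root, int key) {
                                              if (!root) {
                                                  Node *n = calloc(1, sizeof *n);
                                                  n->key = key;
                                                  return n;
                                              }
                                              if (key < root->key) root->left = insert(root->left, key);
                                              else                 root->right = insert(root->right, key);
                                              return root;
                                          }

                                          static int depth(const Node *root) {
                                              if (!root) return 0;
                                              int l = depth(root->left), r = depth(root->right);
                                              return 1 + (l > r ? l : r);
                                          }
                                          ```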

                                          However I don’t think the scientific method is quite right. Peirce believed that too much rigour (or as he put it “stumbling ratiocination”) was inferior to sentiment, and that the scientific method was best suited to theoretical research. For more on this subject, see the Pragmatic theory of Truth, but a little background for my argument should be enough: Peirce outlined four methods of settling an opinion:

                                          1. Tenacity: sticking to one’s initial belief brings comfort and decisiveness, but ignores contrary information.
                                          2. Authority, which overcomes disagreements, but sometimes brutally.
                                          3. The a priori method, which promotes conformity less brutally but fosters opinions as something like tastes. It depends on fashion in paradigms, and while it is more intellectual and respectable, it sustains accidental and capricious beliefs.
                                          4. The scientific method, which obviously excels the others by being deliberately designed to arrive – eventually – at the most secure beliefs, upon which the most successful practices can be based.

                                          Now we still see a lot of “programming wisdom and lore” which people do because they always have, or because some blog said so. I’d argue syntax-highlighting and oh-my-zsh are fashionable, and laugh at anyone who believed that the scientific method could demonstrate these tools are ideal.

                                          So what then? Well, it means we have pseudo-science in our programming.

                                          It’s for this reason that I maintain that we (as a society) don’t know how to program computers – let alone teach anyone how to program (and therefore debug). I predict this will mature over the next couple hundred years or so, but I don’t expect anyone in my lifetime to be able to teach programming itself, the way, for example, we can teach bridge-building.

                                          1. 1

                                            “I don’t expect anyone in my lifetime to be able to teach programming itself, the way, for example, we can teach bridge-building.”

                                            We’ve been doing it a while if you keep the structuring simple. The first is an iterative method for doing that with low cost that combines things like lego blocks. Even students get low defect rate on quite-maintainable code. The second adds formal specifications and verifiable code to drive predictability up and defects further down. The third combined with error-handling techniques and automated testing is pretty good at dealing with stuff too complex for the rest. So, I’d say we can do quite a bit of what you describe but it’s mostly just not applied.




                                            1. 2

                                              I’m not sure I understand what you’re saying. Maybe you don’t understand what I’m saying.

                                              Clean room doesn’t help a programmer understand what’s wrong with:

                                              memcpy(a, b, c*d);

                                              All of these examples advise not putting bugs in the first place – sensible advice to the novice, for sure, but how exactly does that teach us how to debug programs?
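
                                              For what it’s worth, one common reading of that memcpy is that if c and d are plain ints, the product can overflow before it is ever converted to size_t (and overlapping buffers would need memmove instead). A sketch of a guarded version; the function name is made up:

                                              ```c
                                              #include <assert.h>
                                              #include <stddef.h>
                                              #include <string.h>

                                              /* Copy `count` elements of `elem_size` bytes each, refusing
                                                 if count * elem_size would wrap around size_t. */
                                              static int checked_copy(void *dst, const void *src,
                                                                      size_t count, size_t elem_size) {
                                                  if (elem_size != 0 && count > (size_t)-1 / elem_size)
                                                      return -1;   /* product would overflow */
                                                  memcpy(dst, src, count * elem_size);
                                                  return 0;
                                              }
                                              ```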

                                              What constitutes a bug in the first place? Eiffel sounds great not having any bugs in it, but what exactly does that mean?

                                              Is “DISK 0K” a bug? There’s a wonderful story of tech support getting the report: “I’m getting an error message about my disk being full, but the computer says it’s 0K.”

                                              If you’re saving a big (multi-gigabyte) CAD drawing to disk and run out of space ten minutes in, should the system generate an error, telling you to quit and delete some files and try again later? We use multitasking systems, so why not pause and give the user the option to retry or fail? They can delete some files if they want to…

                                              What about an email server? What if it runs out of disk space? Should it reject the message? Could we still pause and let the client time out while we page the sysadmin/operator?
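
                                              That pause-and-retry idea can be sketched as a bounded retry loop around the write, with the waiting/paging step stubbed out. Everything here is illustrative, not a real mail server:

                                              ```c
                                              #include <assert.h>
                                              #include <errno.h>

                                              typedef int (*write_fn)(const char *data); /* 0 on success, else -1 with errno set */

                                              /* Retry only on ENOSPC, pausing between attempts so a human
                                                 (or a cleanup job) gets a chance to free some space. */
                                              static int write_with_retry(const char *data, write_fn try_write,
                                                                          void (*wait_for_space)(void), int max_tries) {
                                                  for (int i = 0; i < max_tries; i++) {
                                                      if (try_write(data) == 0)
                                                          return 0;
                                                      if (errno != ENOSPC)
                                                          return -1;     /* non-disk-full errors are not retryable */
                                                      wait_for_space();  /* e.g. sleep, page the operator */
                                                  }
                                                  return -1;
                                              }

                                              /* Fake backend for demonstration: disk is "full" for two attempts. */
                                              static int fake_calls = 0;
                                              static int fake_write(const char *data) {
                                                  (void)data;
                                                  if (fake_calls++ < 2) { errno = ENOSPC; return -1; }
                                                  return 0;
                                              }
                                              static void fake_wait(void) { /* no-op in the demo */ }
                                              ```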

                                              Writing software is part being able to say what you mean (implement) and part being able to mean what the business means (specify), but it’s also clearly (still) a matter of taste, because we don’t have good science to point to that gives us the answers to these questions.

                                              In contrast, have you seen bridge engineering handbooks? Pretty much every consideration you might have when you need to build a bridge is documented and well researched in a way that makes software development professionals look like lego builders.

                                          2. 2

                                            Two great talks on the topic of debugging:

                                            Stu Halloway on Debugging with the Scientific Method

                                            Brian Cantrill on Debugging Under Fire: Keep your Head when Systems have Lost their Mind

                                            What I find interesting is that, despite their different backgrounds and presentation styles, their advice has a lot of similarity.

                                          1. 3

                                            This is pretty standard in most of the OO languages I’ve encountered. Also mentioned in good books about python.

                                            1. 2

                                              The game wants to be played!

                                              1. 9

                                                Nice. I have been thinking about this for some time now and I really think that there should be a law to make things repairable. The amount of shit that is thrown away each year is really embarrassing. It would make much more ecological sense than the ban on light bulbs.

                                                1. 4

                                                  Is it the throwing out itself that you think is a problem, or some underlying environmental or economic issue? I’m not sure things being repairable for the sake of being repairable is a good idea. But I think if things are not repairable, then they need to prove some minimal environmental impact. My laptop, for example, is a highly coupled piece of technology. I’m not sure making it repairable is something that I, as a consumer, want.

                                                  1. 9

                                                    I do. My Macbook Pro is going to be 9 years old in a couple of months, and I can still use it fine (at least for web-development-type stuff). The reason it’s lasted that long, besides 2008 Apple building awesome laptops, is that I was able to upgrade it (RAM upgrades and an HD->SSD replacement) and “repair” it (replacing the battery a couple of times, of course, but also replacing a couple of internal cables that started failing). Without this, I could maybe have brought it to an Apple Store a few times, but the cost would have forced me to throw it away much sooner.

                                                    Now if I was to look for a new laptop, I would definitely look for one that has the same level of “repairability”, which means leaving the Apple nest.

                                                    1. 2

                                                      In my opinion, post-2008, a lot if not most hardware (laptops in particular) and other kinds of consumer devices have been stripped of a lot of great things. I have a Toshiba from that year that packs everything: 5 GHz wifi, Bluetooth, FireWire, 4× USB, HDMI, VGA, serial, PCMCIA, DVD-RW, a physical rfkill switch, webcam, eSATA, multimedia keys, Harman Kardon speakers (the best ever), a great-quality (but low-res) screen, and a case made of high-quality plastic.

                                                      Nowadays it’s nothing like back then. You get a fraction of the features for twice the price, in thin plastic. Sure, more memory, a better CPU and more storage, but that’s about it. I suspect the 2008 financial crisis set things in motion – to me, that’s the year when quality dropped.

                                                      1. 2

                                                        A major reason Apple is able to make their new laptops so lightweight and portable while maintaining long battery life is the space-saving afforded by integrated components. Repairability takes space (removable panels vs structural panels, sockets vs soldering, shielded vs unshielded batteries, etc.). Most people care more about portability than repairability. Even though I place a heavier emphasis on repairability, it would be wrong to force an inferior laptop experience on everyone for the sake of a few power users.

                                                        1. 1

                                                          I’m ok with that, as long as the costs of responsible disposal are factored in, i.e. price in how much it’ll cost to disassemble and properly recycle a glued-together single-piece laptop. There is a movement in that direction, but very inconsistent at the moment, in the U.S. varying state-to-state (e.g. California and New York have stricter e-waste laws than most states, requiring up-front payment of lifecycle costs). Implementation varies even more, currently with a lot of fraud where stuff doesn’t necessarily actually get recycled, or is exported with questionable post-export controls.

                                                        2. 1

                                                          I want that, but I also like how thin/light my MBP is. Similarly, I wouldn’t expect to be able to swap out the CPU of my Game Boy very easily. I imagine having a way to do that would add quite a bit of weight if you look at all the components.

                                                          Maybe I’ve just been brainwashed, but my impression is that this is mostly an either/or situation, moreso than a smooth gradient.

                                                      2. 1

                                                        I agree, but laws are hard to get right. In e.g. video panels, planned obsolescence helps fund the development. Regulating that out may give us robust black-and-white CRTs.

                                                        Of course most crap is along the lines of kitchen appliances that are cheaper to replace than repair, in which case recycling sounds better than throwing out.

                                                        The last case is e.g. not having built-in batteries – here at the fringe, you can put your money into a Fairphone instead of an iPhone.

                                                      1. 8

                                                        Good article, which I would suggest to everyone who wants to start using a message broker.

                                                        For short-lived tasks, publish-subscribe is a convenient way to build a system quickly, but you inevitably end up implementing a new protocol atop. You have publish-subscribe, but you really want request-response. If you want something computed, you’ll probably want to know the result.

                                                        Starting with publish-subscribe makes work assignment easy: jobs get added to the queue, workers take turns to remove them. Unfortunately, it makes finding out what happened quite hard, and you’ll need to add another queue to send a result back.

                                                        Once you can handle success, it is time to handle the errors. The first step is often adding code to retry the request a few times. After you DDoS your system, you put a call to sleep(). After you slowly DDoS your system, each retry waits twice as long as the previous.

                                                        This happened to me in my last job. One thing I learned was that I don’t really want to reinvent an RPC framework and all the fluff around it myself.
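
                                                        The retry progression the article jokes about (each wait twice the previous) is just capped exponential backoff; a sketch, with illustrative base and cap values:

                                                        ```c
                                                        #include <assert.h>

                                                        /* Delay before retry number `attempt` (0-based): the base delay
                                                           doubled per attempt, clamped to a cap so waits don't grow
                                                           without bound. */
                                                        static unsigned backoff_ms(unsigned attempt, unsigned base_ms,
                                                                                   unsigned cap_ms) {
                                                            unsigned delay = base_ms;
                                                            for (unsigned i = 0; i < attempt && delay < cap_ms; i++)
                                                                delay *= 2;
                                                            return delay < cap_ms ? delay : cap_ms;
                                                        }
                                                        ```

                                                        In a real system you would usually also add random jitter, so that many clients retrying at once don’t hammer the server in lockstep.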

                                                          1. 1

                                                            Awesome link, thanks for sharing!

                                                          1. 5

                                                            Homestation http://i.imgur.com/aug4Kx0.jpg

                                                            The desk’s height is adjustable (Ikea Skarsta) and the primary 27” 2K LCD is held by a positionable arm (like this https://i.czc.cz/e310g0bsv0jhc8qpkhf0rekotc_1/obrazek). The PC is a custom-made i5-4590 with 16GB RAM and an ASUS GTX950-M-2GD5. The irony is that I built it two years back to play games, but every game I’ve tried has failed to engage me. Now I can’t be bothered to switch from Linux Mint to Windows to even try to play. I actively use 16 virtual desktops and Sublime Text as an editor.

                                                            Work https://i.imgur.com/EIkMRtq.jpg

                                                            An X220 with 8G RAM and a 120G Intel SSD. Also an i5, but older and much slower. It is a much shittier environment compared to my homestation. The ThinkPad is mine, because I didn’t want their shitty Dell.

                                                            I find it sad that you can have an excellent home environment for like $1k, but as a programmer, you’ll get a shitty desk, shitty display, shitty chair and shitty computer.

                                                            1. 2

                                                              I am currently using an X220 at work. Easily the best notebook I’ve ever used. I’ve used a MacBook Air, an Asus and, in my last job, a Dell XPS13.

                                                              The X220 has one killer feature which makes it better than everything else, and that is the old, „traditional“ keyboard. In my current job they offered me an external keyboard, and I took it, but then decided to use the X220’s own keyboard, because it’s better.

                                                              I would definitely buy the Retro ThinkPad, if they keep the old keyboard.

                                                              1. 2

                                                                Yes! I did finally upgrade my X220 last year, but I miss that keyboard. The only laptop keyboard I liked better was the X60’s (that thing was a slimline tank).