1. 23

    I suggested a rant tag since this feels like a super vague, long-form subtweet that likely has a specific story/example behind it. Without knowing that context, I don’t understand what dhh is actually complaining about, or whether it’s genuine.

    1. 10

      Pretty sure he’s railing against the /r/programmerhumor style “software development is just copy-and-pasting from stack overflow right guiz!” meme. I’m sympathetic to his frustration because this joke (which was never that funny in the first place) has leaked into non-technical circles. I’ve had non-techies say to me semi-seriously, “programming, that’s just copying code from the internet, right?” and it galls a bit. Obviously we all copied code when we were starting out, but it’s not something that proficient developers do often, and to assert otherwise is a little demeaning.

      1. 6

        Obviously we all copied code when we were starting out

        Well no, I copied examples from a book. Manually, line by line.


          The meme is starting to get a bit more condescending now, though. I frequently come across tweets saying things like “lol, none of us has any idea what we’re doing, we just copy paste stuff”. The copy-pasting part is kinda true in a way (although a bit more complicated: even as a senior dev I copy paste snippets, but I know how to adapt them to my use case and test them), but the incompetence part is not. It sadly starts to feel like there are tons of incompetent OR self-deprecating people in the field. That’s pretty bad.

          This blog post resonates with me, it really pinpoints something.


            It’s cool if that’s what he wanted to say, but the inclusion of impostor syndrome and gatekeeping made me think otherwise.


              That was probably just him hedging against expected criticism


              I have 20 years experience and I regularly copy paste code rather than memorize apis or painstakingly figure out the api from its docs. I do the latter too, but if I can copy paste some code as a start all the better.


                Why am I paying this exorbitant salary, to attract people like you with a fancy degree and years of experience when all you ever do is a four-second copy-and-paste job?

                You pay it because I spent a long time earning my degree and accumulating years of experience to be able to judge which code to copy and paste where, and why, and in only four seconds at that.

                No matter the context, these reductions always boil down to the easy-to-perform operation, never the understanding behind the operation.


                It absolutely feels like a subtweet, but I have no idea what the context was. Did someone at Basecamp just post the no idea dog one time too often?

              1. 4

                As it stands, Nim is well on its way to becoming as complex as Ada or C++. I guess the price for “one programming language for everything” is that you then have to satisfy everyone, which is only possible if the language becomes continually more extensive.

                1. 3

                  That’s a fair criticism, but I came to Nim from C++ because I thought most of the C++ complexity was ad hoc and unjustified.

                  1. 2

                    It’s worth taking a look at how C++ came to be. Stroustrup added concepts to the existing C language that he knew from Simula and found useful. In this sense, “C with Classes” was minimal and complete. Ada, on the other hand, started with the claim of supporting all applications of the DoD at that time with only one language (i.e. “one programming language for everything”); accordingly, already the first version of Ada was large and complex. In the meantime, C++ has also reached an almost incomprehensible size and complexity. With C++11, a lot of “syntactic sugar” was introduced, i.e. things that could already be done, but perhaps somewhat less elegantly; this trend continued, and the result is what we see in C++17 and C++20; the price is an ever larger language scope. How much is enough (i.e., optimal) is a difficult question. At the moment I am trying to find an answer to this with http://oberon-lang.ch.

                    1. 1

                      I believe that Stroustrup planned for C++ to be multi-paradigm from the beginning; at least that’s the take I got from his book. C++ just happened to luck into the OO craze, and I guess that paradigm became dominant.

                      1. 2

                        “multi-paradigm” is not the same as “one programming language for everything”. C++ was “multi-paradigm” by construction in that OO features were added to a procedural language without removing anything. But the new features were not just useful for OO, but also e.g. for better modularization and resource management.

                      2. 1

                        Oberon+ looks neat (and reminds me of Ada, but presumably a lot less complex). But I can’t find any resources for learning it, or any information about its standard libraries. Do the standard libraries use camelCase (as shown in the examples)? If so, that would be a blocker for me.

                        1. 2

                          Do the standard libraries use camelCase (as shown in the examples)? If so, that would be a blocker for me.

                          honest question, just curious: why is casing so important for you that you basically ignore all other technical merits of a language?

                          1. 1

                            I find it hard and aesthetically displeasing to read. (Also worth noting that languages which like to rely on camel case often pick the wrong side of XMLHttpRequest.) I find it hard to type on a keyboard. Given the choice of learning a language or not dealing with camelCase I simply pick not to learn the language. There are in fact so many languages out there that it is difficult to “miss out” on much by making such an arbitrary choice. There is likely at least one more language out there with mostly overlapping technical merits which does not force camelCase upon me.

                            That being said, I have recently thought about investigating using something like treesitter to basically place a thin layer over the top of a language which can translate snake_case to camelCase using a variety of rules (or even by also communicating with an LSP server) so that I can learn a language like Haskell comfortably.
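
                            A minimal sketch of just the name-translation half of that idea, in Go (no tree-sitter involved; snakeToCamel and camelToSnake are hypothetical helper names):

                            ```go
                            package main

                            import (
                            	"fmt"
                            	"strings"
                            )

                            // snakeToCamel converts a snake_case identifier to camelCase.
                            func snakeToCamel(s string) string {
                            	parts := strings.Split(s, "_")
                            	out := parts[0]
                            	for _, p := range parts[1:] {
                            		if p == "" {
                            			continue
                            		}
                            		out += strings.ToUpper(p[:1]) + p[1:]
                            	}
                            	return out
                            }

                            // camelToSnake does the reverse, inserting an underscore
                            // before each ASCII upper-case letter.
                            func camelToSnake(s string) string {
                            	var b strings.Builder
                            	for i, r := range s {
                            		if r >= 'A' && r <= 'Z' {
                            			if i > 0 {
                            				b.WriteByte('_')
                            			}
                            			b.WriteRune(r + ('a' - 'A'))
                            		} else {
                            			b.WriteRune(r)
                            		}
                            	}
                            	return b.String()
                            }

                            func main() {
                            	fmt.Println(snakeToCamel("read_file_sync")) // readFileSync
                            	fmt.Println(camelToSnake("readFileSync"))   // read_file_sync
                            }
                            ```

                            A real layer would also need to decide which identifiers to touch, which is where tree-sitter or an LSP server would come in.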

                          2. 1

                            Oberon+ is a union and extension of the existing Oberon 90, Oberon-2 and Oberon-07 dialects (see https://en.wikipedia.org/wiki/Oberon_(programming_language)). Historically there is no “standard library” for Oberon, but there is the Oberon System, which is a full operating system with a lot of modules also available for custom applications. There was an initiative to define Oberon System-independent standard libraries (see http://www.edm2.com/index.php/The_Oakwood_Guidelines_for_Oberon-2_Compiler_Developers), which my compiler supports. I will eventually implement my own standard libraries; until then you can use any C library via the foreign function interface built into Oberon+. At http://oberon-lang.ch there is a language specification and some other articles; see also https://github.com/rochus-keller/Oberon.

                      3. 2

                        From what I can tell, the feeping creaturism is a hell of a lot more integrated than in other languages I’ve seen, and all of the language grammar in the above article is pretty well thought-out.

                      1. 4

                        Not only is Habitat awesome, and an awesome piece of history, but you can go to neohabitat.org and actually play the game in your browser! They reimplemented the backend, and it runs in a Commodore 64 emulator in the browser (but you can even play it on a real, actual C64!)

                        1. 4

                          TLS error and then this:

                          Placeholder page The owner of this web site has not put up any web pages yet. Please come back later.


                          edit: seems to be http://exhibit-demo.spi.ne/

                        1. 2

                          Not trying to critique this package, I think this is fun and helped me understand generics a little bit more.

                          But on that topic: is this meaningfully using generics? I thought any was an alias for interface{}. Does this do type checking beyond what the non-generic version would do?

                          1. 5

                            Is this meaningfully using generics? I thought any was an alias for interface{}. Does this do type checking beyond what the non-generic version would do?

                            I tried to write a non-generic version of this package first, but you can’t reflect on an interface type. When you do reflect.ValueOf(x), you lose the fact that x was, e.g., an error, because reflect.ValueOf only takes interface{}. You’d end up reporting whether the underlying concrete type was nil or not, which is typically not what you want. To work around that, you could require that everything is passed as a pointer, e.g. reflect.ValueOf(&err), but that kind of sucks as an API. If you look at what truthy.Value does, it accepts a value of type T, then passes a *T to reflect.ValueOf and calls Elem() to get the correct type. So yes, on a technical level, you couldn’t quite make this API work without generics, although you could get close.

                            Then there’s truthy.First. To be honest, truthy.First is the only part of the package that I consider actually useful, and even then, I mostly expect it to be used for picking a string or a default. Anyhow, that requires generics to avoid the cast back from the interface type to the concrete type. However, if you wanted to, you could write truthy.SetDefault with Go 1.17 reflection alone, since that takes a pointer. truthy.Filter is also pretty easy to do with pre-generics Go, but the code would be a lot uglier.
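
                            The trick described above can be sketched in a few lines. This is not the actual truthy package, just a hypothetical Truthy function showing why reflecting on &v preserves the static type T:

                            ```go
                            package main

                            import (
                            	"errors"
                            	"fmt"
                            	"reflect"
                            )

                            // Truthy accepts a value of any type T and reflects on &v,
                            // so Elem() sees the static type T (e.g. the error interface
                            // itself) rather than the concrete type stored inside it.
                            func Truthy[T any](v T) bool {
                            	rv := reflect.ValueOf(&v).Elem()
                            	return !rv.IsZero()
                            }

                            func main() {
                            	var err error                           // nil interface value
                            	fmt.Println(Truthy(err))                // false
                            	fmt.Println(Truthy(0))                  // false
                            	fmt.Println(Truthy("hi"))               // true
                            	fmt.Println(Truthy(errors.New("boom"))) // true
                            }
                            ```

                            Without generics, reflect.ValueOf(err) would only see whatever concrete value is stored inside the interface, which is exactly the problem described above.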

                            1. 2

                              Ah interesting. Thank you for the detailed response.

                            2. 1

                              afaics not, no.

                            1. 4

                              Server side go is the old fashioned way now?

                              I feel old.

                              1. 5

                                Want to impress your boss with a super snappy, low-maintenance, quickly built, trivial-to-install backend system which Just Works? Use Go templates and old-school forms. It’s unbeatable in many respects :)
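
                                For anyone who hasn’t done it this way: a minimal, self-contained sketch of the approach (the handler and field names are made up), exercised in-process with httptest rather than a real server:

                                ```go
                                package main

                                import (
                                	"fmt"
                                	"html/template"
                                	"net/http"
                                	"net/http/httptest"
                                	"strings"
                                )

                                // One template, one handler, a plain HTML form. No JS build step.
                                var page = template.Must(template.New("page").Parse(`<form method="POST">
                                <input name="q" value="{{.Query}}"><button>Search</button></form>
                                {{if .Query}}<p>You searched for {{.Query}}</p>{{end}}`))

                                func handler(w http.ResponseWriter, r *http.Request) {
                                	if err := r.ParseForm(); err != nil {
                                		http.Error(w, err.Error(), http.StatusBadRequest)
                                		return
                                	}
                                	// html/template escapes .Query on output, so echoing it is XSS-safe.
                                	page.Execute(w, struct{ Query string }{r.FormValue("q")})
                                }

                                func main() {
                                	req := httptest.NewRequest("POST", "/", strings.NewReader("q=lobsters"))
                                	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")
                                	rec := httptest.NewRecorder()
                                	handler(rec, req)
                                	fmt.Println(strings.Contains(rec.Body.String(), "You searched for lobsters"))
                                }
                                ```

                                In a real program main would just be http.HandleFunc("/", handler) plus http.ListenAndServe.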

                              1. 2

                                Random tip: if you use Go templates, I highly recommend you have a look at github.com/jba/templatecheck. It removes most of the edit-reload cycle when making HTML forms with templates.
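
                                templatecheck checks a template against a data type without executing it. A rough stdlib-only approximation of the same payoff (checkTemplate and the form type here are made up for illustration) is to execute templates against a representative value in a test:

                                ```go
                                package main

                                import (
                                	"fmt"
                                	"html/template"
                                	"io"
                                )

                                type form struct {
                                	Name  string
                                	Email string
                                }

                                // checkTemplate parses the template and executes it against a
                                // representative value, surfacing bad field references at test
                                // time instead of during a browser edit-reload cycle.
                                func checkTemplate(text string, data interface{}) error {
                                	t, err := template.New("t").Parse(text)
                                	if err != nil {
                                		return err
                                	}
                                	return t.Execute(io.Discard, data)
                                }

                                func main() {
                                	// {{.Emial}} is a deliberate typo; Execute reports it immediately.
                                	err := checkTemplate(`<input value="{{.Emial}}">`, form{})
                                	fmt.Println(err != nil) // true
                                }
                                ```

                                The real templatecheck does this statically, so it also catches mistakes in branches that a single test execution would never reach.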

                                1. 16

                                  There’s a lot of good stuff in here that we all think everyone knows and we say to each other in the pub but we don’t really say out loud to the people that need to hear it.

                                  The main one that comes to mind is about mobility. They said something like “if I get fired I’ll have a new job in two weeks.” The tech folks that don’t know this is true need to learn it. More importantly: the people who manage tech people need to learn it.

                                  1. 22

                                    if I get fired I’ll have a new job in two weeks.

                                    This has never been true for me. Job hunting has always been a relentless slog.

                                    1. 12

                                      Imma guess it depends on where you are. Silicon Valley, Seattle, NYC, London, you can basically put your desk stuff in a box, and throw it out a window and have it land in another tech company’s lobby.

                                      Other places, not so much.

                                      1. 9

                                        I agree living in a tech hub makes finding a job way easier, but I’ll jump in to temper the hyperbole just a bit. I know that I personally felt a lot of self-hatred when I tried to change jobs and it took months of applications and references and interviews to actually get one, even living in a tech hub.

                                        1. 6

                                          Technology stacks don’t really matter because there are like 15 basic patterns of software engineering in my field that apply. I work in data so it’s not going to be the same as webdev or embedded.

                                          It depends on what you do. The author is a database specialist, so of course they’re going to claim that SQL is the ultimate language and that jobs are plentiful. I’m an SRE, so my career path requires me to pick specific backend-ready languages to learn. I have several great memories of failed interviews because I didn’t have precisely the right tools under my belt:

                                          • I worked on a Free Software library in Python along with other folks. They invited me to interview at their employer. Their employer offered me a position writing Lua for production backends. To this day, I still think that this was a bait-and-switch.
                                          • I interviewed at a local startup that was personally significant in my life. I had known that it wasn’t a good fit. Their champion had just quit and left behind a frontend written with the trendiest JS libraries, locking their main product into a rigid unmaintainable monolith. I didn’t know the exact combination of five libraries that they had used.
                                          • I interviewed at a multinational group for a position handling Kubernetes. I gathered that they had their own in-house monitoring instead of Prometheus, in-house authentication, etc. They also had a clothing line, and I’m still not sure whether I was turned down because I didn’t already know their in-house tools or because I wasn’t wearing their clothes.
                                          1. 3

                                            They also had a clothing line, and I’m still not sure whether I was turned down because I didn’t already know their in-house tools or because I wasn’t wearing their clothes.

                                            Seems like a blessing in disguise if it was the clothes.

                                          2. 3

                                            I have this problem and I’m in a tech hub. Most of my coworkers and technical friends are in different countries I can’t legally work in, so I rarely get interviews through networking. Interviewing is also not smooth sailing afterwards.

                                          3. 5

                                            This has never been true for me. Job hunting has always been a relentless slog.

                                            Same here. I also live in a city with many startups, but companies I actually want to work for, which do things I think are worthwhile, are very rare.

                                          4. 7

                                            There’s a lot of good stuff in here that we all think everyone knows and we say to each other in the pub but we don’t really say out loud to the people that need to hear it.

                                            Interesting that you say that in the context of modern IT. It has been so with many things since ancient times.


                                            Perhaps the traditional after-work Friday beer plays a more important role in one’s career than most people think. Wisdom is valuable and not available in courses you can sign up for.

                                            1. 1

                                              Wisdom is valuable and not available in courses you can sign up for.

                                              Which is ironic given wisdom is often what they’re being sold as providing.

                                            2. 5

                                              The main one that comes to mind is about mobility. They said something like “if I get fired I’ll have a new job in two weeks.” The tech folks that don’t know this is true need to learn it. More importantly: the people who manage tech people need to learn it.

                                              Retention is a big problem. It can take up to a year to ramp up even a senior person to be fully productive on a complicated legacy code base. Take care of your employees: make sure they are paid a fair wage and aren’t under the pressure cooker of bad management that thinks yelling fixes problems.

                                              1. 2

                                                That’s probably why the OP says their salary went up 50% while their responsibilities were reduced by 50%. Onboarding.

                                            1. 3

                                              Fun(*) way to practice spotting smaller design issues.

                                              On your way to reach master status: https://xkcd.com/1015/

                                              *) and very annoying at the same time.

                                              1. 1

                                                HAProxy is great, never gave me any trouble, did its tasks perfectly.

                                                1. 9

                                                  Since I’m sure I’m not the only one who missed the memo on Gemini: https://en.wikipedia.org/wiki/Gemini_(protocol)

                                                  1. 3

                                                    Just curious, how did you like using Svelte for this, Elton/1ntEgr8?

                                                    1. 8

                                                      It was my first time using Svelte, and I really enjoyed it. Can’t say that my code is idiomatic, but I thought Svelte was a refreshing take on frontend-dev

                                                      1. 1

                                                        Thanks. I started to port a small project over. There are some limitations, but it’s definitely interesting (I also use esbuild. I feel so modern).

                                                    1. 5

                                                      Best bonus situation I’ve ever seen: management handed the head of the engineering dept a pile of money for bonuses and said “figure out how to divide this money up among your department based on merit”, and the head of engineering said “everyone here works their ass off” and split it evenly among everyone.

                                                      Think that’s a pretty good example to set.

                                                      1. 3

                                                        In my experience, people can be divided into two categories:

                                                        1. People who do work.
                                                        2. People who do very little or no work.

                                                        In many smaller companies it’s 100% the first category, but the larger the company gets, the more of category 2 seems to seep in. It’s just easier to hide that you’re a fuckup, I suppose.

                                                        Barring a few rare exceptions, I find it very difficult to really rank the merit of people in category 1.

                                                        1. 2

                                                          People who do very little or no work.

                                                          I don’t mind people who do little work. Not everyone has the same background and experience and work speed. I do, however, really have zero patience for people who do a negative amount of work. They slowly drain a company.

                                                      1. 39

                                                        This article is full of misinformation. I posted details on HN: https://news.ycombinator.com/item?id=26834128.

                                                        1. 10

                                                          This really shouldn’t be needed; even someone without any exposure to Go can see this is just bunk with a minimal application of critical thinking. It’s sad to see this so highly upvoted on HN.

                                                          When I was in high school, one of my classmates ended up with a 17A doorbell in some calculations. I think he used the wrong formula or swapped some numbers; a simple mistake we’ve all made. The teacher, quite rightfully, berated him for not actually looking at the result of his calculation and judging whether it was roughly in the right ballpark. 17A is a ludicrous amount of current for a doorbell, and anyone can see that’s just spectacularly wrong. The highest-rated domestic fuses we have are 16A.

                                                          If this story had ended up with 0.7%, sure, I could believe that. 7%? Very unlikely and I’d be skeptical, but still possible I suppose. 70%? Yeah nah, that’s just as silly as a 17A doorbell. The author should have seen this, and so should anyone reading this, with or without exposure to Go. This is just basic critical thinking 101.

                                                          Besides, does the author think the Go authors are stupid blubbering idiots who somehow missed this huge elephant-sized low-hanging fruit? Binary sizes have been a point of attention for years, and somehow missing 70% of wasted space in “dark bytes” would be staggeringly incompetent. If Go had been written by a single author then I suppose it would have been possible (though still unlikely), but an entire team missing this for years?

                                                          Everything about this story is just stupid. I actually read it twice because surely someone can’t make such a ludicrous claim with such confidence, on the cockroachdb website no less? I must be misunderstanding it? But yup, it’s really right there. In bold even.

                                                          1. 6

                                                            I think this is really interesting from a project management and public perception point of view. This is slightly different from your high school classmate, because they might not have been aware of the ridiculousness of their claim. Of course, this situation could be the same, but I think it is more interesting if we assume the author did see this number, thought it was ridiculous, and still wrote the article anyway.

                                                            Someone doesn’t write a post like this without feeling some sort of distrust of the tool they are using. For some reason, once that trust is lost, people will start making outlandish claims without giving any benefit of the doubt. I feel like this is similar to the Python drama which ousted the BDFL and to Rust’s actix-web drama which ousted the founding developer. Once trust is lost in whoever is making the decisions, logic and reason seem to just go out the window. Unfortunately this can lead to snowballing and people acting very nasty for no real reason.

                                                            I don’t have much knowledge of the Go community or its drama, and in some sense this is at least much more nicely put than some of Rust’s actix-web drama (which really threw good intent out the window), but I’d be curious to know what happened to lose the trust here. It might be as simple as being upfront about the steps being taken to reduce binary size, even if they are not impactful; that might win back trust in this area.

                                                            1. 3

                                                              It’s my impression that the Python and actix-web conflicts were quite different; with Python, Guido just quit because he got tired of all the bickering, and actix-web was more or less similar (AFAIK neither was “ousted”, though; both quit on their own?). I only followed those things at a distance, but that’s the impression I had anyway.

                                                              But I think you may be correct with lack of trust – especially when taking the author’s comments on the HN story in to account – but it’s hard to say for sure though as I don’t know the author.

                                                              1. 2

                                                                Perhaps I am over-generalizing, but I think they are all the same thing. With Rust’s actix-web, it essentially boiled down to some people having a mental model of Rust which involves no unsafe code (which differed from the primary developer’s mental model). At some point, this went from “let’s minimize unsafe” to “any unsafe is horrible and makes the project and developer a failure”, regardless of the validity of the unsafe statements. Unfortunately it devolved to the point where the main developer left.

                                                                In the Go situation it seems very similar. Some people have a mental model that any binary bloat is unacceptable, while the core devs see the situation differently (obviously balancing many different requirements). It seems like this article is that disagreement boiling over to the point where any unaccounted-for bits in a binary are completely unacceptable, leading to outlandish claims like 70% of the binary is wasted space. Hopefully no Go core developers take this personally enough to leave, but it seems like a very similar situation where different mental models and lack of trust lead to logic and benefit of the doubt getting thrown out the window.

                                                                It is hard to say what is going on for sure, and in many ways I’m just being an armchair psychologist with no degree, but I think it is interesting how this is a “common” trend. At some point, projects that are doing a balancing act get lashed out at for perceived imbalances being misconstrued as malicious intent.

                                                                1. 1

                                                                  I don’t think you’re correctly characterizing the actix situation. I think the mental model was “no unnecessary unsafe”. There were some spots where the use of unsafe was correct but unnecessary, and others where it was incorrect and dangerous. I think there was poor behavior on both sides of that situation. The maintainer consistently minimized the dangerous uses and closed issues, meanwhile a bunch of people on the periphery acted like a mob of children and just kept piling on the issues. I personally think someone should have forked it and moved on with their lives instead of trying to convince the maintainer to the point of harassment.

                                                            2. 2

                                                              on the cockroachdb website no less

                                                              Cockroachdb is on my list to play with on a rainy afternoon, but this article did knock it down the list quite a few notches.

                                                              1. 2

                                                                We use it as our main database at work and it’s pretty solid. The docs for it are pretty good as well. But I definitely agree, this is a pretty disappointing article.

                                                          1. 3

                                                            That was fun. Great idea, great execution!

                                                            1. 3

                                                              It’s very rare, but I would actually like to see a video explaining this better.

                                                              (Good job with the minimal JS dependencies! uMatrix is very happy with this)

                                                              1. 1

                                                                 Prometheus with Grafana (or similar)?

                                                                1. 2

                                                                  just FYI, if you run postgres CI tests, you can set this in postgres.conf:

                                                                  fsync = off
                                                                  1. 1

                                                                    I believe many other DBMSes support this. For example, Tarantool allows a similar speedup too (see https://www.tarantool.io/en/doc/latest/reference/configuration/#confval-wal_mode)

                                                                  1. 2

                                                                    Oh, the nostalgia. To think that I’m so old that I’ve experienced a big chunk of computer history is mind-blowing. I started out with a Commodore 64 and a VIC-20. I used an Intel 8086 and an Intel 8088; I remember my 486 66MHz fondly, as I remember my Pentium from Digital (what a beast it was). After that point it went fast, and from then on I cannot really remember any particular computer as very special, up until my first Mac with OS X.

                                                                    1. 1


                                                                      … which also turned 20 a week ago.

                                                                      1. 3

                                                                        I’ve never been a Mac user, but I wonder if the upgrade path/user experience feels much different over these 20 years of OS X compared to Windows (either 3.11 up to 5 years ago, or Win98/2000 up till Win 10)…

                                                                        Because despite having used all these Windows systems (3.11, NT 4, 95 A, B, C, 98, 98 SE, Me, 2000, XP, 7, 10, and not Vista or 8/8.1) - while some people might say the GUI is kinda samey or had a clear evolution, my /experience/ is so vastly different.

                                                                        3.11 was basic but worked.

                                                                        95A was a complete shitshow and crashed daily and I had to reinstall once a month, at least

                                                                        95 B and C were tolerable

                                                                        98 was somehow fresher but less stable again

                                                                        98 SE was pretty good

                                                                        Me I don’t really remember

                                                                        2000 was awesome (after the first few months with driver problems for some games)

                                                                        XP was ok

                                                                        7 was solid

                                                                        10 is a step back in my opinion but it’s close to 7 in quality

                                                                        1. 2

                                                                          The 1984 original Mac was “the first [UI] worth criticizing”, to misquote Alan Kay. Once you upgraded the RAM it was very capable, and quickly launched desktop publishing once PageMaker was released.

                                                                          The later 80s brought color and bigger screen support, some limited multitasking, networking, and a huge filesystem improvement.

                                                                          System 7 in 1991 was a big step with a fully-color GUI, multitasking, IAC, and tons of usability improvements. But under the hood it was still quite primitive with no memory protection or pre-emptive scheduling.

                                                                          The rest of the 90s saw only incremental improvements since Apple kept working on a series of failed attempts to build a better OS from scratch and/or port to x86 (Pink/Taligent, Star Trek, Maxwell/Copland).

                                                                          Finally in 2001 came Mac OS X, which was a NeXT-derived OS using the Mach microkernel, BSD Unix, the “AppKit” evolution of OpenStep, the “Carbon” porting layer for the old Mac APIs, and the “blue box” classic OS emulator to run unported apps. 10.0 was buggy and incomplete, but by 10.2 in 2002 it was solid.

                                                                          1. 1

                                                                            When I started working we had a lot of OS 9 macs, I used to only use them to test web pages in Internet Explorer. They crashed often and to a casual Windows/Linux user they weren’t great, but usable.

                                                                            When a coworker showed me OS X (must have been 10.0) it was kinda amazing, but I didn’t use it a lot, so I can’t really comment. But I’ve always felt that Mac users have sometimes lamented good and bad releases, yet there were hardly any deal breakers prompting a switch away at a particular release; more of a “been sick of it for a while”…

                                                                          2. 1

                                                                            3.11 was basic but worked.

                                                                            I worked in the helpdesk in a university library back then. I can’t remember how many people lost their complete dissertations to crashing Windows 3.11 machines (combined with having no idea that you needed to keep multiple backups on those slow and unreliable floppy disks). Whatever came after might have been bad, but all of it was better than 3.11.

                                                                            1. 1

                                                                              Interesting. I mean, we only had it for about two years (on one PC) and it was mostly used for Word and Excel, but I can’t remember any crashes at all; that’s why I was so surprised that 95A was so bad…

                                                                            2. 1

                                                                              I have been using Windows since 3.11 and used only Windows (and DOS) up until around Mac OS X. I never used a Mac before that point.

                                                                              But to me it has seemed like Windows has been more incremental while OS X releases have been more continuous. I mean, if I think back to my original OS X, I kind of remember it being just the same as what I am using today (Big Sur), which it obviously wasn’t. Windows releases, however, have been more distinct from their predecessors, in my mind.

                                                                              I also used OS/2 (was that what it was called?) alongside Windows 3.11. But to be frank, back in those days I was mostly using DOS. Windows 3.11, to me as a gamer at the time, didn’t really add anything for my needs.

                                                                        1. 8

                                                                          Brilliant, thanks for posting!

                                                                          Apart from the main point, promoting interest in Alexander’s book A Pattern Language (which I just ordered), I was surprised to finally find the original raison d’être of Design Patterns: to paste in bits of code that overcame the lack of convenience mechanisms in earlier versions of C++ and Java.

                                                                          I realize now that I had never placed the GoF book in its original context, especially its time. Back then, this made a lot of sense. As much sense as the magazines with printed program code before the internet. It might even have made sense to call these bits of code ‘patterns’.

                                                                          But this also points out very clearly that clinging to these patterns today is absurd. Believing they hold some sort of fundamental truth about programming is a big fallacy. In fact, claiming that any aspect of structuring a program relies on recurring patterns makes no sense. Common terminology can be very useful, yes, but recurring design structures, no. Every programming scenario and thus every design is different almost by definition: if a problem occurred before, it will already be solved so you don’t need to do so again.

                                                                          In other words: it’s finally time to decisively ditch the religious status of the GOF book and develop new ideas about structuring code that fits our current times.

                                                                          1. 15

                                                                            I think it was a great disservice to the industry as a whole, and to their students, when instructors started teaching design patterns as if they were first principles of programming. If you read the GoF book it is pretty clear that even in the context of working around C++ shortcomings, the patterns are primarily descriptive, not prescriptive: “We see that a common thing people already do when confronted with a problem shaped like X is to use a solution shaped like Y. Here’s a name for Y so we can have conversations about it and not have to explain what we’re talking about every single time.” I never got the sense from the GoF book that the authors intended it to be a rulebook.

                                                                            1. 11

                                                                              I was super frustrated when I was asked to cite and explain GoF design patterns while interviewing for a (senior) Go SE position. I mean, OK, a few of them kinda maybe still carry over, and thus could come up when mentoring juniors, but the majority are not relevant at all in Go! Basically, just use a closure, most of the time… Fortunately, I’ve recently learnt that, for other reasons as well, it’s probably good for me that they didn’t want to hire me in the end.

                                                                              1. 4

                                                                                I always feel that I learn more about the company from the questions they ask than they learn about me from my answers. That’s good for me, though :)

                                                                            2. 4

                                                                              I really don’t think Design Patterns were “to paste in bits of code that overcame the lack of convenience mechanisms in earlier versions of C++ and Java”. First, they’re not language specific — when the book came out, I recognized a lot of them from Smalltalk-80. Second, they’re not as simple as copy/paste. Third, good frameworks incorporate the appropriate patterns for you so you don’t have to re-implement them.

                                                                              The author really seems to misunderstand some of this stuff. His statement that (paraphrasing) “the Iterator pattern is for sucky languages” misses the point that the reason iterating is so easy in higher level languages is because their collections already have iteration baked in. If you implemented a custom collection like, say, a b-tree in Perl, you’d need to do some work to be able to iterate it simply. And that work would probably look just like the Iterator pattern.
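                                                                              To make that concrete, here is a minimal Python sketch of the situation (a toy binary search tree standing in for the Perl b-tree, with made-up names): iteration only looks effortless in high-level languages because the collection author did the Iterator-pattern bookkeeping once, behind `__iter__`.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

class ToyTree:
    """Toy binary search tree standing in for a custom collection."""
    def __init__(self):
        self.root = None

    def insert(self, key):
        def _ins(node, key):
            if node is None:
                return Node(key)
            if key < node.key:
                node.left = _ins(node.left, key)
            else:
                node.right = _ins(node.right, key)
            return node
        self.root = _ins(self.root, key)

    def __iter__(self):
        # The "iterator work": an explicit in-order traversal with a stack,
        # which is exactly the bookkeeping the Iterator pattern names.
        stack, node = [], self.root
        while stack or node:
            while node:
                stack.append(node)
                node = node.left
            node = stack.pop()
            yield node.key
            node = node.right

t = ToyTree()
for k in [5, 2, 8, 1]:
    t.insert(k)
print(list(t))  # in-order: [1, 2, 5, 8]
```

                                                                              Once `__iter__` exists, callers just write `for k in t:` and never see the stack, which is the whole point of the pattern.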

                                                                              1. 2

                                                                                Maybe you want to read it again. Especially the Postscript although I’ll warn there are spoilers in there.

                                                                                You’re arguing with something the author isn’t saying. I agree that a book about designing a town is more useful (and less harmful) to programmers than the book about iterators.

                                                                                If you implemented a custom collection like, say, a b-tree in Perl, you’d need to do some work to be able to iterate it simply.

                                                                                So don’t do that? One important part of Alexander’s patterns is (paraphrasing) that the implementation serves needs instead of the other way around. Most problems don’t need a b-tree (and most people who use one could choose something better), and even those that do have a b-tree don’t need to (or even shouldn’t) iterate over it.

                                                                                1. 2

                                                                                  Most software design needs good data structures, they’re critical for efficiency and they also frame how you think about the problem. A language that makes designing a new data structure difficult is a problem.

                                                                                  That said, although I like the core idea overall, I think the author picked a terrible example in iterators. As he says, Alexander’s design patterns were about delegating aspects of design[1]. Iterators do exactly that: they exist to allow someone who is an expert in designing an efficient data structure build something that can then be used by someone who is focusing on high-level application design. That’s exactly the same idea that Alexander has in, for example, his nook or niche patterns: allowing someone with fine-grained local knowledge to design a space, without needing the person laying out the floor to understand that.

                                                                                  There’s a lot more to Alexander’s ideas, particularly in how to build cohesive overall structures, but I don’t think it’s fair to say that GoF-style patterns are something completely different, they’re just a subset of Alexander’s vision. And, to be fair to the software engineering community, they’ve done better than architects by embracing even part of Alexander’s vision. I’ve never worked in a building that wouldn’t have been improved by first hitting the architect repeatedly with The Timeless Way of Building and then making them read A Pattern Language.

                                                                                  [1] I’d thoroughly recommend reading everything he wrote, but if you don’t have time and want to learn something directly relevant to your day-to-day job, the chapter in Peopleware on office design cites Alexander and gives one of the best one-page summaries I’ve ever read of his core ideas.

                                                                                  1. 2

                                                                                    Most software design needs good data structures, they’re critical for efficiency …

                                                                                    I have heard this before, but I think it’s a myth; it’s repeated so often that people believe it to be true without ever evaluating it for themselves, and that’s a shame. Spend a few years in an Iverson language and you will be convinced otherwise: all you need are arrays and good tools for operating on arrays.

                                                                                    • Instead of a b-tree, use an array and binary search

                                                                                    • Instead of a hash table, use two arrays of the same length, and order them by the hash position of the value in the first.

                                                                                    • Instead of an ordered hash table, keep the hash positions as a permutation index in a third array.

                                                                                    • Instead of a bloom filter, use one array and the same hash-position function you used for the hash table.

                                                                                    • Instead of an LSM tree, use an array-of-arrays and an array-of-lengths.

                                                                                    And so on.
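                                                                                    The first substitution above can be sketched in Python for concreteness (the function names here are invented for illustration; an APL-family language would express this far more tersely):

```python
import bisect

# Sorted array of keys plus a parallel array of values: lookups go
# through binary search, with no tree nodes or pointers anywhere.
keys = [3, 7, 11, 19]
vals = ["c", "g", "k", "s"]

def lookup(k):
    i = bisect.bisect_left(keys, k)
    if i < len(keys) and keys[i] == k:
        return vals[i]
    return None

def insert(k, v):
    # Keep both arrays ordered by key.
    i = bisect.bisect_left(keys, k)
    keys.insert(i, k)
    vals.insert(i, v)

insert(10, "j")
print(lookup(10))  # "j"
print(lookup(4))   # None
```

                                                                                    The same two-parallel-arrays shape, ordered by hash position instead of by key, gives the hash-table substitution.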

                                                                                    Arrays work better than anything else for performance. Operating systems have excellent support for mapping arrays to permanent storage. CPUs have special logic and operators for dealing with arrays. The highest-performing fastest databases in the world don’t use anything more exotic than arrays.

                                                                                    … and they also frame how you think about the problem.

                                                                                    I think you should take a look at notation as a tool of thought. I believe so strongly that this is a better way to think about the problem than “data structures” that I would prefer them even if they weren’t faster.

                                                                                    Thankfully I don’t have to choose.

                                                                                    There’s a lot more to Alexander’s ideas, … I don’t think it’s fair to say that GoF-style patterns are something completely different

                                                                                    I think we must’ve read different books. I enjoy a humanist and organic design philosophy, and I can’t agree that “design patterns” is it. I’m also not sure Christopher Alexander would agree with you:

                                                                                    When I look at the object-oriented work on patterns that I’ve seen, I see the format of a pattern (context, problem, solution, and so forth). It is a nice and useful format. It allows you to write down good ideas about software design in a way that can be discussed, shared, modified, and so forth. So, it is a really useful vehicle of communication. And, I think that insofar as patterns have become useful tools in the design of software, it helps the task of programming in that way. It is a nice, neat format and that is fine.

                                                                                    However, that is not all that pattern languages are supposed to do. The pattern language that we began creating in the 1970s had other essential features. First, it has a moral component. Second, it has the aim of creating coherence, morphological coherence in the things which are made with it. And third, it is generative: it allows people to create coherence, morally sound objects, and encourages and enables this process because of its emphasis on the coherence of the created whole.

                                                                                    1. 2

                                                                                      Arrays work better than anything else for performance.

                                                                                      But the ways APL uses arrays are terrible for performance. A lot of common idioms do huge amounts of work on large intermediate arrays to produce a particular transformation, kind of like those Rubik’s Cube macros where you flip six or seven edges to rotate one pair of corners.

                                                                                      My college compilers class was taught by Jim Kajiya, who had worked on an APL compiler. During the discussion of APL he told us how getting good performance from the language was terribly difficult — it relied on a lot of knowledge of those idioms used in the language, using pattern recognition to optimize those into a more efficient direct implementation of their effect. And such a compiler is only as good as its library of idioms, so I imagine that if you use one it doesn’t know, or use one with slightly different syntax, your performance plummets.

                                                                                      1. 2

                                                                                        Spend a few years in an Iverson language and you will be convinced otherwise: All you need are arrays and good tools for operating on arrays.

                                                                                        I think you’re disagreeing with people because you use words to mean different things. All of the things you say you can use instead of data structures are implementations of those data structures.

                                                                                        1. 1

                                                                                          All of the things you say you can use instead of data structures are implementations of those data structures.

                                                                                          I’m not sure I’ve ever heard anyone suggest that a b-tree is the same as a sorted array. The performance difference is striking.

                                                                                          In any event, they’re definitely different in the critical way: They all use the same “iterator” which should prove that an iterator isn’t a pattern, but a single operator.

                                                                                          I think you’re disagreeing with people because you use words to mean different things.

                                                                                          I’m agreeing with the author of the linked post. I’m disagreeing with you. I also think abuse is a poor form of debate.

                                                                                      2. 1

                                                                                        There is also Alexander’s OOPSLA keynote. Here is a quote which shows that he aspired to a lot more than GoF:

                                                                                        I understand that the software patterns, insofar as they refer to objects and programs and so on, can make a program better. That isn’t the same thing, because in that sentence “better” could mean merely technically efficient, not actually “good.” Again, if I’m translating from my experience, I would ask that the use of pattern language in software has the tendency to make the program or the thing that is being created is morally profound—actually has the capacity to play a more significant role in human life. A deeper role in human life. Will it actually make human life better as a result of its injection into a software system?

                                                                                    2. 1

                                                                                      Which GoF patterns can’t be reasonably implemented as a library of abstract classes (or language equivalent), distributed with a language’s package manager? After reading these slides I went back over the GoF patterns and realized that pretty much all of them could. Distributing them in a book — it even came with “sample” implementations in C++ and Smalltalk! — and calling them “patterns” made sense in the 90s, before any reasonably-modern package managers existed (and when the Internet barely existed), but it doesn’t feel like it still makes much sense today.

                                                                                      Plenty of the “patterns” are fairly… dated, as well. Memento for undo, for example, rather than a lens-like approach of collapsing an immutable series of state updates into the current state (and undo is simply popping off the last state update). GoF feels more like a 1994-era utility library that got some things right — and thus those parts got ported to other languages — and plenty of things wrong, too.
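                                                                                      A rough Python sketch of that lens-like alternative (a toy editor invented for illustration, not anyone’s real API): each edit appends a new immutable snapshot, and undo is simply popping the last one, with no Memento objects in sight.

```python
class Editor:
    """Toy editor: undo as a stack of immutable state snapshots."""
    def __init__(self):
        self.history = [""]  # immutable snapshots; the last one is current

    @property
    def text(self):
        return self.history[-1]

    def type(self, s):
        # Each edit produces a brand-new state rather than mutating in place.
        self.history.append(self.text + s)

    def undo(self):
        # Undo is just discarding the most recent snapshot.
        if len(self.history) > 1:
            self.history.pop()

ed = Editor()
ed.type("hello")
ed.type(" world")
ed.undo()
print(ed.text)  # "hello"
```

                                                                                      With strings (or any immutable state), snapshots are safe to share, which is what makes the pop-to-undo approach so much simpler than externalizing state into memento objects.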

                                                                                      1. 1

                                                                                        So one implementation of a pattern in one language makes the book unnecessary? That makes no sense to me. The next language implementor has to just copy the existing implementation and translate it to their language, even if the languages are wholly different? This is kind of like saying we don’t need texts on, say, B-trees because you can just go get a B-tree class from a library.

                                                                                        Maybe the patterns in the book seem so obvious to you that descriptions and explanations are unnecessary? I’d say that’s more a result of the book (and other usage of those patterns) having done its job well.