Threads for schmudde

  1. 21

    Python is an anti-social language. It focuses primarily on the needs of developers and not as much on the user of the software.

    I think that’s close, but not quite it. It focuses on the needs of some developers to the exclusion of others. It does not enforce any sort of balance.

    Say someone on your team has a Jupyter notebook they use to calculate something important to the company. Great! Okay, now try to run it on the machine of the developer who sits across from them at the office. Bzzt, nope, not going to work. You’ll need to start pip freezing and hope that there aren’t any important C build tools needed to get it working.

    Have a web team with separate frontend and backend devs? Enjoy explaining to the frontend devs what a virtual env is and why they should care. :-) Python’s “it works on my machine” factor is so bad that it propelled Docker into prominence. But your coworker who is a good designer and HTML editor might not know what a PATH is, let alone Docker.

    There’s the classic XKCD about its author’s Python installs being a “superfund site”. Here’s the thing though: Randall Munroe is a developer.

    This article is being poorly received here—just pip freeze or use Poetry or follow some other conflicting advice!—because for these developers there are patterns in place that work. But for me as someone who tries to bridge between different people at shops that aren’t all Python experts, it’s a nightmare.

    1. 12

      Python’s “it works on my machine” factor is so bad that it propelled Docker into prominence. But your coworker who is a good designer and HTML editor might not know what a PATH is, let alone Docker.


      1. 11

        Not to drag out the Go vs Python flamewar more than necessary, but I do think this is a good way to get a handle on the culture gap between the two languages. Lots of things in Go are inconvenient or ugly for the individual but good for the group or others.

        So, e.g., it’s a pain for me that I can’t just version-tag my projects however I want; I have to use the ugly /v2 semantic import versioning system. But for other users, it’s good that they have clear and simple rules about versions and importing that work across all packages. Lots of times, when you see someone having a problem with Go, it’s because they want to do something non-standard, and then the pro-Go people end up sounding like cultists because we say “oh no, it’s for your own good not to have unused imports” or whatever. But the core is the culture clash: am I working for Go, or is Go working for me? If you work for Go, it benefits the ecosystem, but maybe you would rather just do your own thing.

        1. 16

          To paraphrase a Jedi master: When as old as Python your favorite language is, look as good it will not.

          Or to just directly quote Stroustrup: There are only two kinds of languages: the ones people complain about and the ones nobody uses.

          Basically everything you complain about from Python is directly or indirectly a consequence of its age and widespread adoption. Check back in a few decades to see if Go gets to a similar spot.


          This article is being poorly received here—just pip freeze or use Poetry or follow some other conflicting advice!—because for these developers there are patterns in place that work.

          I see one pretty bombastic chain going after the author’s tone. Yours is the only comment that’s mentioned pip freeze so far.

          It may well be that people who aren’t you don’t have the same experiences you do with certain tools and come to different conclusions. You’re free to do what works for you, but you seem to spend an inordinate amount of time on this site, from what I recall, trying to put down other people’s preferences. Maybe do less of that?

          1. 11

            Basically everything you complain about from Python is directly or indirectly a consequence of its age

            Another phrasing: many of the things that older languages do poorly and newer languages do well are a direct consequence of people having learned from the older languages’ missteps.

            1. 3

              Basically everything you complain about from Python is directly or indirectly a consequence of its age and widespread adoption. Check back in a few decades to see if Go gets to a similar spot.

              I’m certain that languages like Go will accumulate warts over time, but I can’t help but think that newer languages will fare better structurally. There have been so many huge advances in basic programming practices in the past 30 years, and older languages were not built with them in mind. Programming languages built 30 years ago didn’t have to think about package managers, formatters, fuzzing, testing, documentation sites, and probably a lot more.

              I’m sure that there will be new things that today’s languages haven’t considered, but at least they have examples of how to accommodate those major features from the last 30 years to draw from.

              1. 9

                I’m sure that there will be new things that today’s languages haven’t considered, but at least they have examples of how to accommodate those major features from the last 30 years to draw from.

                Again, I think time is not going to be as kind as Go’s advocates want it to be. I’ll pick on one specific issue and one general issue to illustrate why:

                A specific issue: Go, for all that people promote it as better than Python on “packaging”, initially shipped with… “just list some Git repos as strings at the top of your file” as the “package manager”. And a lot of details of Go’s “proper” added-much-later package manager are still influenced heavily by that early design choice. The infamous /v2 thing, for example, is a consequence of how bad Go’s initial ability to specify and target versions of dependencies was. And the whole interface of Go modules is kind of clunky and shows signs of having been retrofitted after the language was already in pretty wide use, rather than something that was thoughtfully designed to go with the language. And with Go’s compatibility policies being what they are (at least in theory), I’m not sure it’s possible to improve significantly on this in the future.

                And a general issue: Go is, well, sort of infamous for actively refusing to learn from “the last 30 years” (and really more than just 30 years) of programming language design and evolution. It’s one of the most common criticisms of Go! The error handling, the weird quirks and limitations of the type system, the difficulty-bordering-on-outright-lack of interoperability, the massive hostility to generics… all of this and more is important stuff from the past few decades of learning about how to make programming languages that Go is proud of not making use of.

                So I’m not convinced at all that Go is going to look significantly better at its 30th birthday than Python did.

                1. 2

                  Go, for all that people are promoting it as better than Python on “packaging”, initially shipped with… “just list some Git repos as strings at the top of your file” as the “package manager”.

                  TBH, I had forgotten about that. I can’t imagine that it will age well. Just thinking about the Heroku and GitLab free-tier changes that have happened, I can only imagine the sort of bit rot that might occur with direct references to GitHub repositories.

            2. 7

              I think Python developers don’t really understand how bad it is because they know how to work around it. They have already picked their favorite Python environment manager for their own projects; they know exactly what to do with other people’s code when it has a requirements.txt or a venv file or a pipenv file or a pyenv file or a virtualenv file or a poetry file or whatever else. And for system stuff, they make conscious and informed decisions about whether to install a package using their system package manager or using pip, and when using pip, they know whether they want the package locally or globally, and they know how to resolve problems where they have already installed a package globally with pip and their system package manager wants to overwrite the same files.

              It’s not like it’s the only community with that problem. C++ people may be a bit too quick to dismiss concerns about how hard memory management is, because we know how to choose between unique_ptr and shared_ptr and raw pointers and references and stack allocation, and we may have found strategies to read the multi-megabyte template or overload resolution errors and find the nugget of useful information in the sea of useless logs. But in general, C++ people probably don’t expect non-C++ developers to deal with C++ code as much as Python developers expect non-Python developers to deal with Python code.

              1. 4

                I agree that it’s easy to forget how complicated Python tooling can be, but trying to set up a moderately complex project on someone else’s machine usually serves as a good reminder.

                1. 4

                  I once dealt with a legacy project where I had to recreate its dependencies by guessing from the imports and testing whether it worked. I had to go so far as checking commit timestamps and correlating them to release versions on PyPI.

              2. 3

                Have a web team with separate frontend and backend devs. Enjoy explaining to the frontend devs what a virtual env is and why they should care.

                By default, pip is like using npm install —global. Virtual envs are the equivalent to node_modules.
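The analogy can be made concrete with a minimal sketch, assuming a POSIX shell and a python3 with the venv module (--without-pip just keeps the sketch offline-friendly):

```shell
python3 -m venv --without-pip .venv   # project-local environment: roughly a
                                      # "node_modules for this project"
. .venv/bin/activate                  # unlike Node, you must activate it; this
                                      # just prepends .venv/bin to your PATH
command -v python                     # now resolves to .venv/bin/python
```

Deactivating (or opening a new shell) puts the global python back on PATH, which is exactly the step node_modules never needs.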

                1. 1

                  Markdown changed your “dash dash global” to “emdash global”, which made me misread your comment.

                  Virtual envs can be used like node_modules, but a) they aren’t the default, b) they need to be activated (or you need to install a tool that activates them), and c) they also copy the Python libraries themselves, but not enough to keep them from breaking when I upgrade Homebrew Python.

                  Python has all the pieces for dependency management, but they’re just pieces, and you have to have someone on your team write a blog post about why at your company we’re all going to do X. No one writes a blog post about how they use node_modules. :-)

                  1. 3

                    I didn’t say it’s perfect (or even good), but it’s pretty much the same situation as node, just with bad defaults. If people understand node they should be able to understand the Python situation. (And at this point I mostly assume that frontend devs know node, which might be a poor assumption.)

                    1. 1

                      No, you don’t have to “activate” a node_modules folder, and frankly, it’s very embarrassing and hacky that virtual envs require it. (Again, my otherwise competent coworker did not know what a PATH was. How can I explain sourcing a Bash script vs running it?) I first saw activation with Ruby Gems, and I remember thinking as an arrogant Python user, “oh, those monkey-patching Ruby people! They’re even monkey patching my shell variables, how gross!” Little did I know that Python would copy it soon after. It really should have been made the default many years ago, and to TFA’s point, it’s very anti-social that it has not been.
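For what it’s worth, the sourcing-vs-running distinction that activation depends on fits in a tiny demo (the file name here is made up):

```shell
echo 'export MARKER=set-by-demo' > demo.sh
sh demo.sh                  # run in a child shell: its exports die with it
echo "${MARKER:-unset}"     # prints "unset"
. ./demo.sh                 # sourced: runs in the *current* shell instead
echo "$MARKER"              # prints "set-by-demo"
```

Activate scripts work the same way: run normally, their PATH edits would vanish with the child shell, which is why they must be sourced.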

                      1. 3

                        My dude, the question you posed was “how do I explain this to someone?” I proposed explanation by analogy. Analogy does not imply a 1 to 1 correspondence. It does not imply that the things are equal in value. The idea here is that this new thing serves a similar function to another thing that they might know. It serves as a great jumping off point to discuss the differences. (Nested dependencies vs one common dependency, how PATH works, etc.)

                        Do I love virtual envs? No, I don’t, but that wasn’t your question. I get the sense that you never really wanted an answer to the question. You wanted to complain. I don’t really want to listen to you complain so I’m going to ignore this thread.

                        1. 1

                          It’s a rhetorical question. I haven’t worked with that colleague for three years.

                          Any analogy is going to have points of commonality and points outside the scope of the analogy, but “venv is like node_modules” does very little work as an analogy. The only thing they have in common is keeping project dependencies somewhere. Everything else is different. I guess it’s better than having no point of reference, but it leaves a big gap.

                  2. 1

                    My understanding is that packages installed with pip do not ‘own’ their dependencies - as in, if you install package A that requires version 1 of package B, package B will be in the global scope and could conflict with user-installed package C that needs version 2 of package B. Is that correct?

                    1. 1

                      Yeah, that is a major difference between node_modules and virtual envs.
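The flatness is easy to see from the interpreter itself; this sketch just prints the one directory pip targets (the exact path varies by system):

```shell
# Print the single, flat directory this interpreter's packages live in;
# every package and every dependency shares it, which is why only one
# version of package B can exist per environment.
python3 -c 'import sysconfig; print(sysconfig.get_path("purelib"))'
```

npm instead gives each package its own nested node_modules, so two incompatible versions of the same dependency can coexist.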

                  3. 2

                    Hard agree about the Docker thing—before Docker I wouldn’t even have attempted to set up a local environment for someone who isn’t a Python developer.

                    1. 1

                      Oh yeah, I’m down with the multi-account containers. Great for work/personal/shopping.

                    1. 11

                      Which is why Mozilla Firefox is such a breath of fresh air, as it uses much less of your CPU while still delivering a fast browsing experience. So feel free to have as many tabs open as you want in Firefox, your device will barely feel the effects.

                      I use Firefox everyday but let’s be real here.

                      I am rather tab-phobic, but a few contemporary websites and one modern app like Figma put my Firefox into a swapping tailspin after a couple hours of use. This may be better than Chrome, but it feels like the bad old days of thrashing your hard drive cache.

                      To remedy this, it seems the developers decided to unload tabs from memory. It has made Firefox more stable, but page refreshes happen surprisingly often.

                      1. 16

                        I am rather tab-phobic, but a few contemporary websites and one modern app like Figma put my Firefox into a swapping tailspin after a couple hours of use.

                        I don’t entirely disagree, but maybe some of the fault here lies with the engineers who decided that a vector graphics editor should be written as a web page – it would use several orders of magnitude fewer resources if it were simply a native executable using native graphics APIs.

                        There’s only so much that browsers can do to save engineers from their own bad decisions.

                        1. 6

                          Figma being in browser is a product decision much more than an engineering decision. And being in browser is their competitive advantage.

                          It means people can be up and running without installing a client. Seamless collaboration and sharing. These are huge differentiators compared to most things out there that require a native client.

                          Yeah, I hate resource waste as much as the next person. But using web tech gives tools like Figma a huge advantage and head start on collaboration vs native apps, at some cost in client (browser) resources and waste.

                          1. 1

                            Figma being available in the browser is a competitive advantage, yes, in that it facilitates easy initial on-boarding and reduces friction for getting a stakeholder to look at a diagram.

                            But there’s zero competitive advantage for Figma only being available as a browser app – once customers are in the ecosystem there’s every reason for the heavy Figma users to want a better performing native client, even while the crappier web app remains available for the new user or occasional-use stakeholder.

                            Figma sort-of recognizes this – they do make a desktop version available for the heavy user, but it’s just the same old repackaged webapp garbage. And limiting themselves to just repackaging the web app is not a “competitive advantage” decision so much as an engineering decision to favour never having to learn anything new ever (once you’ve got the JavaScript hammer, everything’s a nail) over maybe having to learn some new languages and acquire some new skills, which used to be considered a norm in this industry instead of something for engineers to fear and avoid at all costs.

                            1. 3

                              I’m friends with an early engineer at Figma who architected a lot of the system and knows all the history.

                              They say that if they had done a cross-platform native app, it would have been nearly impossible to get the rendering engine to produce exactly the same result across platforms. Even for the web, they had to write their own font renderer.

                              Yes, a native app could be faster, but it’s a major tradeoff: collaboration, networking and distribution features, and the security sandbox are largely given to you by the browser. With a native app you have to build all of that yourself.

                              They started with a native app and ended up switching to web only. They also write Ruby, Go and Rust.

                              And the Figma app is written in C++ compiled to WebAssembly.

                      1. 3

                        Company: Yorba

                        Company site:

                        Position(s): Clojure Developer

                        Location: Fully Remote

                        Description: Yorba offers simple, secure identity management. The service aggregates a person’s online presence and provides insights, guidance, and a set of organizing principles to manage personal information. Identity is actionable through open standards, available APIs, and partner interoperation.

                        We’re committed to building a more ethical internet. These aren’t just words. Yorba is a Public Benefit Corporation, which means that we legally cannot prioritize profits at the expense of our core values, our members, or our promises. Furthermore, we’re building on open standards; our product must win on the merits of our user experience and the trust we build with our members. We believe that people are tired of the status quo and Yorba offers a way to make meaningful change.

                        Tech stack: Clojure, Google Cloud, Node, Literate Programming

                        Compensation: market rates, flexible hours, equity possible


                        1. 5

                          I’ve been using this workflow for a while and I absolutely love it. I tangle (generate) multiple files that have interdependent documentation. And don’t even get me started on configuration files. Generating all of it from one well-documented source is a real revelation.

                          1. 3

                            He kind of speculates on a “what could have been” but I’m actually not sure what differentiates “Groupware” from what we have today. Is it less Unix-y?

                            The first time I encountered this idea was Microsoft Windows 3.11, aka “Windows For Workgroups”. If we take Microsoft at their word, it means that non-collaborative personal computing only really existed from 1977-1994 at the latest. Pretty short time in computing.

                            AppleTalk already existed and people were networking home computers before 1994. If we count Xerox’s office solutions, the window shrinks even more.

                            1. 2

                              Let’s be clear on the premise of this argument: the code has a liberal license but the output of the code should not.

                              I’m not sure this is a novel debate. Lingdong wants to restrict the code’s use in a commercial context. I’ve talked to other devs who longed to limit their code’s use in the context of state violence.

                              As the article points out, there seems to be no solution here from the Open Source Initiative. But the OSI’s position is almost certainly the only rational one. Property-rights restrictions will continue to seem absurd absent a functioning digital commons.

                              1. 1

                                I agree that as long as FLOSS licenses are basically a clever hack on copyright, you can’t realistically get rid of Freedom Zero. But what this situation points to is what I think is a growing distaste for the fundamentally exploitative nature of some open source use - a creator releases something to the world, and it is immediately harnessed to produce profit.

                                The emergence of NFT enthusiasts[1] has merely closed the loop from months to days. Generative art is catnip to these people, as it enables them to produce any number of unique items to “mint” without having to cut any profits to artists. The fact that they quickly backpedalled in the face of massive criticism is a testament to the power of moral suasion, and to the fact that the thin fig leaf covering the naked greed of NFT purveyors is the fiction that they’re helping artists. One does not kill the golden goose.

                                [1] scientific name: nftardius parasiticus

                                1. 1

                                  The emergence of NFT enthusiasts[1] has merely closed the loop from months to days.

                                  It’s an interesting idea that NFTs are making this exploitation more immediate. But it seems that the sort of contracts that could fix this could only come in the form of code.

                                  For example, many NFT contracts provide a mechanism to pay artists upon resale. This has a rich history throughout the 20th century, but it was really hard to make it ubiquitous and permanent. And now here we are.

                                  But the only way to prevent cut ’n paste jobs is to distribute a binary or couch it in a more complex (and hidden) system. This seems like its own nightmare.

                              1. 6

                                I truly hate the fact that nil stands for “the empty list”, “false”, “undefined result”, and “error” in Common Lisp. It’s the worst.

                                1. 5

                                  It really works out well in my experience. Also, as alexandria points out, errors in Lisp are typically represented with conditions, not NILs.

                                  FWIW, I truly hate that Scheme separates #f, NIL and ’().

                                  1. 4

                                    I truly hate that Scheme separates #f, NIL and ’()

                                    I don’t like it either. But:

                                    1. Scheme has no NIL. (I mean, you could have a symbol named ‘nil’, but it doesn’t mean anything in particular.)

                                    2. The bigger wart (imo) is that () is not self-evaluating.

                                    1. 1

                                      In my Scheme programs I always denote the empty list as (list).

                                    2. 2

                                      If you look at this from a non-Lisp perspective, it makes a lot of sense that an empty list is not the same thing as a boolean false value, just like an empty string is not the same thing as zero.

                                      nil being the empty list and the universal falsy value is entirely a Lisp thing.

                                      1. 2

                                        Not even a Lisp thing but a MacLisp thing (or maybe even a Lisp 1.5 thing?) which got inherited by MacLisp descendants Common Lisp, InterLisp, Emacs Lisp, etc.

                                      2. 1

                                        (cadr nil) is an error by almost any conceivable standard[1], but it results in nil.

                                        I admit that in practice it typically doesn’t cause trouble, but it drives me crazy that this is somehow an allowed thing you can say in common lisp:

                                        (caddr (= 3 10))

                                        [1] Ok, I’m being a little salty here.

                                        I know it’s a silly thing to get caught up on, but it just feels wrong to me.

                                        1. 2

                                          Ok, but False is NIL, because (= 3 10) is NIL. The empty list is also NIL. Taking the CAR of the CDR of the CDR of an empty list returns an empty list: CDR returns an empty list (NIL) when given an empty list (because the CDR of an empty list is empty), and CAR gives NIL when given an empty list (because the value of the CAR is NIL), so it’s just a byproduct of the chain of evaluation. None of these are error conditions in their own right, so the result is not an error condition either.

                                          I know you understand this and still feel that this should be an error (which I can understand, except for the fact that it would make the LISP interpreter more complex to implement). I wrote this for the people who are reading this discussion.

                                          1. 3

                                            it would make the LISP interpreter more complex to implement

                                            (disassemble (lambda (x) (car x)))
                                            ; disassembly for (LAMBDA (X))
                                            ; Size: 31 bytes. Origin: #x53571660                          ; (LAMBDA (X))
                                            ; 60:       498B5D10         MOV RBX, [R13+16]                ; thread.binding-stack-pointer
                                            ; 64:       48895DF8         MOV [RBP-8], RBX
                                            ; 68:       8D50F9           LEA EDX, [RAX-7]
                                            ; 6B:       F6C20F           TEST DL, 15
                                            ; 6E:       7403             JEQ L0
                                            ; 70:       CC49             INT3 73                          ; OBJECT-NOT-LIST-ERROR
                                            ; 72:       00               BYTE #X00                        ; RAX
                                            ; 73: L0:   488B50F9         MOV RDX, [RAX-7]
                                            ; 77:       488BE5           MOV RSP, RBP
                                            ; 7A:       F8               CLC
                                            ; 7B:       5D               POP RBP
                                            ; 7C:       C3               RET
                                            ; 7D:       CC10             INT3 16                          ; Invalid argument count trap
                                            1. Compiler, not interpreter

                                            2. It already requires a type check; it would complicate nothing to make (car nil) an error

                                        1. 1

                                          Yes, I programmed in CL for a few years at a startup. Quite familiar with the language. See my response to rau.

                                          1. 2

                                            I did, and I disagree that that represents an error. It’s correct behaviour of the system. If False was represented by a separate value, or Lisp was more strict with types, then sure it would be an error. But since any false expression returns an empty list, and since the interpreter is not literally doing pointer lookups, it is not in any way an error.

                                            1. 2

                                              My issue is that there are things which you can express as valid programs which run and even produce values but which don’t really make sense at a type level. I know it’s impossible, for mathematical reasons, for a programming language to prevent you from denoting nonsense statically (and I’m not even really sold on the idea that such static guarantees are that useful or ergonomic), but I do like it when a language at least stops when something silly has happened.

                                              If one sees the expression (cadr x), it’s pretty reasonable to expect that x denotes a list. It also seems reasonable that x is a non-empty list. But CL doesn’t guarantee that either of those things is true. It doesn’t necessarily bother me too much that cadr returns a value instead of throwing an error when it receives a list which doesn’t have a cadr, but nil seems to me to be the wrong value. Why?


                                              (cadr nil)
                                              (cadr (list 'a nil))

                                              return the same value. In other words, if you have a list with empty sublists (which doesn’t even seem like a particularly unusual situation), then you can’t count on cadr to tell you about the cadr of the list! It would be better if cadr were called maybe-cadr and it returned a user-provided sentinel value (default nil, perhaps) in the event of a list without a cadr.

                                              All this goes back to the idea that weird circumstances should halt programs as quickly as possible. Given that many of these list-destructuring functions pass nil through, it’s possible (and I have even encountered situations where this happened) that an unusual condition doesn’t raise a real error until long after the offending list is off the stack and beyond the debugger.

                                        2. 1

                                          nil punning ftw… until it’s a null pointer error.

                                        1. 23

                                          My practice routine used to consist of idling in ##c on freenode, looking at questions people were having with regard to C and solving any code problems they came up with. I usually made sure that if I were to send the person code, it was either instructive (in the case of outright rewriting what they wrote) or didn’t completely solve the problem (in the case of questions on how to do something). This meant I could solve problems I would not normally spend my time solving, keep my understanding of C very sharp, and provide a high quality of help to people asking in the channel.

                                          1. 12

                                            This is both brilliant and obvious. Obvious in the sense that helping others helps yourself; it’s a tried and tested method. Brilliant in that skill sharing is not something ingrained in the culture.

                                            Don’t get me wrong - there are a lot of places to get help on the internet and that’s a great thing. But you’ll know it’s part of the culture when “productivity” is measured - in part - by the amount that you help other people.

                                            1. 2

                                              Ex #linux (IRCNet) and #csharp (Freenode) hanger-outer here, learning in the same way. Free time? See an interesting question? Try to solve it. If it seems like it’ll help, post it. The original requestor or others provide more info and you end up with a full picture. A fantastic way to learn.

                                              1. 2

                                                Did you do this on your own time or as part of your job? For the discussion of industry culture, that would make a big difference.

                                                1. 2

                                                  I did this entirely on my own time.

                                              1. 3

                                                However, since iOS was never a tool to be used as one wishes, but rather in the limited ways Apple foresaw, iCloud never reached its potential as a replacement for Google Mail or Google Drive. Never open, never as powerful, no API.

                                                It’s kind of impressive how bad Apple is at the ‘i’ part of the iMac, iPhone, etc….

                                                But if Apple had their way, the internet would look more like a minimalist AOL and less like the organic, human-made, wonderful mess that it currently embodies.

                                                1. 2

                                                  Go is so far away from Smalltalk. And … wow … our community has even forgotten the Pascal vs. C wars.

                                                  1. 3

                                                    our community has even forgotten the Pascal vs. C wars.

                                                    We really need comp sci students to be taught the ‘modern’ history of the industry.

                                                    1. 2

                                                      Go is so far away from Smalltalk

                                                      Yeah a friend even commented that he mentions the superiority of interactive programming (Smalltalk, Interlisp) at the beginning and then ends up at Go. Not the most interactive contemporary choice.

                                                    1. 2

                                                      The oldest thing I know of offhand is a site I made in ’97 and kept updating for a couple of years before abandoning it, which you can see in the text. Anyway, here you go, a time when webrings and frames were still ok: still on tripod!

                                                      1. 1

                                                        I love the design on the image selection.

                                                      1. 2

                                                        At this point it’s probably my YouTube channel. I used to have older ones, but in terms of account age that’s one of the oldest web presences that I still have. That YouTube channel was made back a whole gender ago, when I was in middle school, and has always just avoided being a part of the partner program. I sometimes wonder what would have happened if I had focused on it as a main career prospect.

                                                        1. 1

                                                          The most awesome thing ever from 2008 is still pretty awesome.

                                                        1. 2

                                                          My 18 year old personal web site.

                                                          It used to be hosted by an old dial-up ISP that I worked at in the ’90s, which hadn’t shut down their site since ’98. But I linked to it in my newsletter once, and the influx of traffic must have been odd, and they finally killed the spot.

                                                          1. 1

                                                            Boooooo. Did Internet Archive manage to capture it before it went down?

                                                            1. 1

                                                              I’m certain it did, but it’s down right now. Now I feel like I need to capture that history somehow.

                                                          1. 13

                                                            Nuclear take: I think it’s interesting so many “computer engineering/enthusiast” types (for lack of a better term) tended to gravitate towards DEC systems when their design is full of bonkers mistakes no EE should repeat: PDP-10’s recursive indirect addressing, PDP-11’s segmentation and PC in memory (ok, DSPs do this, but that’s an acceptable optimization for a DSP, not a general-purpose CPU), the absurd CISCiness of VAX, etc. (Alpha was pretty reasonable.) I say this as someone who likes VMS.

                                                            I think 360/370 is much better designed, and the influence in modern CPUs design is more obvious (lots of GPRs, clean instruction formats, pipelining, virtualization, etc.). Plus they had the also influential ACS/Stretch to draw from. I can’t say the same for many DEC designs. It’s amusing Unix types are so obsessed with VAX when Unix would feel far more at home on 370.

                                                            1. 5

                                                              I suspect a variety of factors are to blame:

                                                              IBM in the ’70s and ’80s had the reputation that Microsoft had in the ’90s and 2000s and that Google, Amazon, and Facebook are competing for now: the evil-empire monopolist that the rest of the industry stands against. There’s a story from the founding of Sun that they got a visit a few months in from the IBM legal department, inviting them to sign a patent cross-licensing agreement and showing six patents that Sun might be infringing. Scott McNealy sat them down and demonstrated prior art for some and that Sun wasn’t infringing any of the ones that might be valid. The IBM lawyers weren’t fazed by this and said ‘you might not be infringing these, would you like us to find some that you are?’ Sun signed the cross-licensing agreement. This kind of thing is why IBM’s legal department was referred to as the Nazgul. To add to this, IBM was famously business-facing: they required programmers to wear suits and ties. The hacker ‘uniform’ of jeans and t-shirts was a push-back against companies like IBM, and hacker culture in general was part of a counter-culture rebellion in which IBM was the archetype of the mainstream being rebelled against.

                                                              The DEC machines were so closely linked to the development of UNIX. IBM’s biggest contribution with the 360 was the idea that software written for one computer could run on another. This meant that their customers were able to build up a large amount of legacy software by the ’80s so IBM had no incentive to encourage people to write new systems software for their machines: quite the reverse, they wanted you locked in. DEC encouraged this kind of experimentation. Universities may have had an IBM mainframe for the admin department to use but the computer science departments and research groups bought DEC (and other small-vendor) machines to tinker with.

                                                              Multics was developed for the GE-645, which had all manner of interesting features (including a segmentation model that allowed a single-level store and no distinction between shared libraries and processes), Unics was written for the tiny PDP in the corner and it grew with that line.

                                                              There were a lot of other big-iron systems that suffered from the rise of UNIX. I’m particularly sad about the Burroughs Large Systems architecture. The B5000 was released at almost the same time as the 360 and had an OS written in a high-level language (Algol-60), with hardware-assisted garbage collection, and provided a fully memory-safe (and mostly type-safe) environment with hardware enforcement. Most modern language VMs (JVM, CLR, and so on) are attempts to emulate something close to the B5000 on a computer that exposes an abstract machine that is basically a virtualised PDP-11. I wish CPU vendors would get the hint: if the first thing people do when they get your CPU is use it to emulate one with a completely different abstract machine, you’ve done something wrong.

                                                              Oh, and before you criticise the VAX for being too CISCy (and, yes, evaluate polynomial probably doesn’t need to be a single instruction), remember that the descendants of the 360 have instructions for converting strings between EBCDIC and Unicode.

                                                              1. 2

                                                                I think you exaggerate about IBM. There is a general 1:1 table-based translate which can do EBCDIC to ASCII or Unicode, and there are different instructions for converting between the different Unicode flavours. It can’t do it in one instruction, that I know of.

                                                                But anyway, those and VAX POLY aren’t the problem. You can happily use microcode or just trap and emulate and no one will care.

                                                                The problem with the VAX is that the extremely common ADDL3 instruction (to name just one) can vary in length from 4 to 19 bytes and cause half a dozen memory references / cache misses / page faults.

                                                                x86, for all its ugliness, never uses more than one memory address per instruction for common instructions e.g. code generated from C. Same for S/360. Both have string instructions, but those are not a big deal, and relatively uncommon.

                                                              2. 3

                                                                That’s an interesting observation.

                                                                I think there would be a lot to learn from comparing the two engineering cultures. I would specifically include the management style and the kind of money each company was dealing with. When IBM was developing ground-breaking products like the Stretch and the Selectric typewriters, half of the company’s income came from incredibly lucrative military contracts.

                                                                The kinds of pressures on an engineering team and the corner/cost-cutting they may take is dramatically different when they are awash with money.

                                                                1. 3

                                                                  To elaborate, I feel DEC influenced product segments more than engineering. The PDP-8 and then the PDP-11 redefined minicomputers, but the PDP-8’s influence was short-lived and the PDP-11’s influence… would rather not have been felt (i.e. x86).

                                                              1. 5

                                                                I may need the 16 port FW800 hub at the end of the piece. Not sure why… but I may need it.

                                                                1. 2

                                                                  For access rights, I would strongly suggest converting everything over to SAML. If you have GSuite, you already have a SAML IDP included in your purchase.

                                                                    For everything else, I would set up a shared spreadsheet with finance/accounts receivable. It’s also worth starting a shared drive between ops, legal and finance where you keep all of the executed contracts. When you need them, it’s really important that they can be located quickly.

                                                                  1. 1

                                                                    Great advice - thanks!

                                                                  1. 2

                                                                    Interesting history. Based on a conversation with @mjn, it appears that Weizenbaum’s SLIP was the first programming language to use reference counting. Finally I get to see a SLIP program.

                                                                    More context: SLIP was a list processing language, created a few years after LISP. LISP used a tracing garbage collector (a first, published 1960), and SLIP used reference counting (seems to be the first programming language to use this technique, published 1963).
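
                                                                    The timing contrast between the two schemes can be sketched in a few lines of Python (a toy illustration only; `Cell`, `incref`, and `decref` are invented names, not SLIP’s actual API, and real SLIP cells also carried backward links):

                                                                    ```python
                                                                    class Cell:
                                                                        """A list cell with an explicit reference count, SLIP-style."""
                                                                        def __init__(self, datum, next_cell=None):
                                                                            self.datum = datum
                                                                            self.next = next_cell  # ownership of next_cell transfers here
                                                                            self.refcount = 1      # the creator holds the first reference
                                                                            self.freed = False

                                                                        def incref(self):
                                                                            self.refcount += 1

                                                                        def decref(self):
                                                                            self.refcount -= 1
                                                                            if self.refcount == 0:
                                                                                # Reclaimed the moment the last reference drops;
                                                                                # no separate tracing pass over live data, unlike
                                                                                # LISP's collector.
                                                                                self.freed = True  # stand-in for a free list
                                                                                if self.next is not None:
                                                                                    self.next.decref()

                                                                    tail = Cell("tail")
                                                                    head = Cell("head", tail)  # head now owns the only reference to tail
                                                                    head.decref()              # frees head and, transitively, tail
                                                                    ```

                                                                    The trade-off is the familiar one: counting reclaims cells immediately and predictably, but pays a cost on every pointer copy and cannot reclaim cycles, which SLIP’s doubly-linked cells make easy to create.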

                                                                    1. 1

                                                                      Does the divergence in garbage collection methods have anything to do with SLIP being a Fortran extension? I have to imagine the garbage collector was still implemented in machine code.

                                                                      1. 2

                                                                        Here’s the abstract for Weizenbaum’s paper:

                                                                        Symmetric list processor; J Weizenbaum - Communications of the ACM, 1963; A list processing system in which each list cell contains both a forward and a backward link as well as a datum is described. This system is intended for imbedding in higher level languages capable of calling functions and subroutines coded in machine language. The presentation is in the form of FORTRAN programs depending on only a limited set of “primitive” machine language subroutines which are also defined. Finally, a set of field, particularly character, manipulation primitives are given to round out the system.

                                                                        Based on this, it sounds like SLIP is implemented in machine language. This would have been normal and expected in 1963. I don’t have access to the full paper. Also note that in the ELIZAGEN article, this version of SLIP is embedded in MAD, not FORTRAN.

                                                                    1. 3

                                                                      I stopped keeping paper notebooks when I realized I never referenced them again. I recently switched to an e-ink device that recognizes handwriting. Now I have an append-only daily journal that can also be searched via grep.

                                                                      As much as I like the idea of a paper notebook, I just have to accept it’s not how I work.